commit 42975de5e87048cdd5a7e7c18a5797a09790dc02 Author: spike Date: Tue Mar 28 17:02:22 2017 +0200 commit diff --git a/.gitattributes b/.gitattributes new file mode 100644 index 0000000..e6a2d87 --- /dev/null +++ b/.gitattributes @@ -0,0 +1,8 @@ +**/data_batch_* filter=lfs diff=lfs merge=lfs -text +**/preprocess*.p filter=lfs diff=lfs merge=lfs -text +**/*.zip filter=lfs diff=lfs merge=lfs -text +**/*.pickle filter=lfs diff=lfs merge=lfs -text +**/*.gz filter=lfs diff=lfs merge=lfs -text +**/*.txt filter=lfs diff=lfs merge=lfs -text +**/*data*-of* filter=lfs diff=lfs merge=lfs -text +**/logs filter=lfs diff=lfs merge=lfs -text diff --git a/embeddings/.ipynb_checkpoints/Skip-Gram word2vec-checkpoint.ipynb b/embeddings/.ipynb_checkpoints/Skip-Gram word2vec-checkpoint.ipynb new file mode 100644 index 0000000..a55a361 --- /dev/null +++ b/embeddings/.ipynb_checkpoints/Skip-Gram word2vec-checkpoint.ipynb @@ -0,0 +1,600 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# Skip-gram word2vec\n", + "\n", + "In this notebook, I'll lead you through using TensorFlow to implement the word2vec algorithm using the skip-gram architecture. By implementing this, you'll learn about embedding words for use in natural language processing. This will come in handy when dealing with things like translations.\n", + "\n", + "## Readings\n", + "\n", + "Here are the resources I used to build this notebook. 
I suggest reading these either beforehand or while you're working on this material.\n", + "\n", + "* A really good [conceptual overview](http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/) of word2vec from Chris McCormick \n", + "* [First word2vec paper](https://arxiv.org/pdf/1301.3781.pdf) from Mikolov et al.\n", + "* [NIPS paper](http://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf) with improvements for word2vec also from Mikolov et al.\n", + "* An [implementation of word2vec](http://www.thushv.com/natural_language_processing/word2vec-part-1-nlp-with-deep-learning-with-tensorflow-skip-gram/) from Thushan Ganegedara\n", + "* TensorFlow [word2vec tutorial](https://www.tensorflow.org/tutorials/word2vec)\n", + "\n", + "## Word embeddings\n", + "\n", + "When you're dealing with language and words, you end up with tens of thousands of classes to predict, one for each word. Trying to one-hot encode these words is massively inefficient, you'll have one element set to 1 and the other 50,000 set to 0. The word2vec algorithm finds much more efficient representations by finding vectors that represent the words. These vectors also contain semantic information about the words. Words that show up in similar contexts, such as \"black\", \"white\", and \"red\" will have vectors near each other. There are two architectures for implementing word2vec, CBOW (Continuous Bag-Of-Words) and Skip-gram.\n", + "\n", + "\n", + "\n", + "In this implementation, we'll be using the skip-gram architecture because it performs better than CBOW. Here, we pass in a word and try to predict the words surrounding it in the text. In this way, we can train the network to learn representations for words that show up in similar contexts.\n", + "\n", + "First up, importing packages." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "import time\n", + "\n", + "import numpy as np\n", + "import tensorflow as tf\n", + "\n", + "import utils" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Load the [text8 dataset](http://mattmahoney.net/dc/textdata.html), a file of cleaned up Wikipedia articles from Matt Mahoney. The next cell will download the data set to the `data` folder. Then you can extract it and delete the archive file to save storage space." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from urllib.request import urlretrieve\n", + "from os.path import isfile, isdir\n", + "from tqdm import tqdm\n", + "import zipfile\n", + "\n", + "dataset_folder_path = 'data'\n", + "dataset_filename = 'text8.zip'\n", + "dataset_name = 'Text8 Dataset'\n", + "\n", + "class DLProgress(tqdm):\n", + " last_block = 0\n", + "\n", + " def hook(self, block_num=1, block_size=1, total_size=None):\n", + " self.total = total_size\n", + " self.update((block_num - self.last_block) * block_size)\n", + " self.last_block = block_num\n", + "\n", + "if not isfile(dataset_filename):\n", + " with DLProgress(unit='B', unit_scale=True, miniters=1, desc=dataset_name) as pbar:\n", + " urlretrieve(\n", + " 'http://mattmahoney.net/dc/text8.zip',\n", + " dataset_filename,\n", + " pbar.hook)\n", + "\n", + "if not isdir(dataset_folder_path):\n", + " with zipfile.ZipFile(dataset_filename) as zip_ref:\n", + " zip_ref.extractall(dataset_folder_path)\n", + " \n", + "with open('data/text8') as f:\n", + " text = f.read()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Preprocessing\n", + "\n", + "Here I'm fixing up the text to make training easier. This comes from the `utils` module I wrote. 
The `preprocess` function converts any punctuation into tokens, so a period is changed to ` <PERIOD> `. In this data set, there aren't any periods, but it will help in other NLP problems. I'm also removing all words that show up five or fewer times in the dataset. This will greatly reduce issues due to noise in the data and improve the quality of the vector representations. If you want to write your own functions for this stuff, go for it." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "words = utils.preprocess(text)\n", + "print(words[:30])" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "print(\"Total words: {}\".format(len(words)))\n", + "print(\"Unique words: {}\".format(len(set(words))))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And here I'm creating dictionaries to convert words to integers and back again, integers to words. The integers are assigned in descending frequency order, so the most frequent word (\"the\") is given the integer 0 and the next most frequent is 1 and so on. The words are converted to integers and stored in the list `int_words`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "vocab_to_int, int_to_vocab = utils.create_lookup_tables(words)\n", + "int_words = [vocab_to_int[word] for word in words]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Subsampling\n", + "\n", + "Words that show up often such as \"the\", \"of\", and \"for\" don't provide much context to the nearby words. If we discard some of them, we can remove some of the noise from our data and in return get faster training and better representations. 
This process is called subsampling by Mikolov. For each word $w_i$ in the training set, we'll discard it with probability given by \n", + "\n", + "$$ P(w_i) = 1 - \\sqrt{\\frac{t}{f(w_i)}} $$\n", + "\n", + "where $t$ is a threshold parameter and $f(w_i)$ is the frequency of word $w_i$ in the total dataset.\n", + "\n", + "I'm going to leave this up to you as an exercise. This is more of a programming challenge than one about deep learning specifically. But being able to prepare your data for your network is an important skill to have. Check out my solution to see how I did it.\n", + "\n", + "> **Exercise:** Implement subsampling for the words in `int_words`. That is, go through `int_words` and discard each word given the probability $P(w_i)$ shown above. Note that $P(w_i)$ is the probability that a word is discarded. Assign the subsampled data to `train_words`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "## Your code here\n", + "train_words = # The final subsampled word list" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Making batches" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "Now that our data is in good shape, we need to get it into the proper form to pass it into our network. With the skip-gram architecture, for each word in the text, we want to grab all the words in a window around that word, with size $C$. \n", + "\n", + "From [Mikolov et al.](https://arxiv.org/pdf/1301.3781.pdf): \n", + "\n", + "\"Since the more distant words are usually less related to the current word than those close to it, we give less weight to the distant words by sampling less from those words in our training examples... 
If we choose $C = 5$, for each training word we will select randomly a number $R$ in range $< 1; C >$, and then use $R$ words from history and $R$ words from the future of the current word as correct labels.\"\n", + "\n", + "> **Exercise:** Implement a function `get_target` that receives a list of words, an index, and a window size, then returns a list of words in the window around the index. Make sure to use the algorithm described above, where you choose a random number of words from the window." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_target(words, idx, window_size=5):\n", + " ''' Get a list of words in a window around an index. '''\n", + " \n", + " # Your code here\n", + " \n", + " return" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here's a function that returns batches for our network. The idea is that it grabs `batch_size` words from a list of words. Then for each of those words, it gets the target words in the window. I haven't found a way to pass in a random number of target words and get it to work with the architecture, so I make one row per input-target pair. By the way, this is a generator function, which helps save memory."
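One possible sketch of the `get_target` exercise above (my own take on the random-$R$ scheme from the quote, not necessarily the official solution):

```python
import random

def get_target(words, idx, window_size=5):
    ''' Get a list of words in a window around an index.

        Picks a random reach R in [1, window_size], then returns up to R
        words of history and R words of future around position idx. '''
    R = random.randint(1, window_size)
    start = max(idx - R, 0)          # clip at the start of the list
    stop = idx + R                   # slicing silently clips at the end
    # every word in the window except the center word itself
    return words[start:idx] + words[idx + 1:stop + 1]
```

Because $R$ is drawn fresh for every call, nearer words end up in more training pairs than distant ones, which is exactly the weighting described in the paper.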
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_batches(words, batch_size, window_size=5):\n", + " ''' Create a generator of word batches as a tuple (inputs, targets) '''\n", + " \n", + " n_batches = len(words)//batch_size\n", + " \n", + " # only full batches\n", + " words = words[:n_batches*batch_size]\n", + " \n", + " for idx in range(0, len(words), batch_size):\n", + " x, y = [], []\n", + " batch = words[idx:idx+batch_size]\n", + " for ii in range(len(batch)):\n", + " batch_x = batch[ii]\n", + " batch_y = get_target(batch, ii, window_size)\n", + " y.extend(batch_y)\n", + " x.extend([batch_x]*len(batch_y))\n", + " yield x, y\n", + " " + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": false + }, + "source": [ + "## Building the graph\n", + "\n", + "From Chris McCormick's blog, we can see the general structure of our network.\n", + "![embedding_network](./assets/skip_gram_net_arch.png)\n", + "\n", + "The input words are passed in as one-hot encoded vectors. This will go into a hidden layer of linear units, then into a softmax layer. We'll use the softmax layer to make a prediction like normal.\n", + "\n", + "The idea here is to train the hidden layer weight matrix to find efficient representations for our words. This weight matrix is usually called the embedding matrix or embedding look-up table. We can discard the softmax layer because we don't really care about making predictions with this network. We just want the embedding matrix so we can use it in other networks we build from the dataset.\n", + "\n", + "I'm going to have you build the graph in stages now. First off, creating the `inputs` and `labels` placeholders like normal.\n", + "\n", + "> **Exercise:** Assign `inputs` and `labels` using `tf.placeholder`. We're going to be passing in integers, so set the data types to `tf.int32`. 
The batches we're passing in will have varying sizes, so set the batch sizes to [`None`]. To make things work later, you'll need to set the second dimension of `labels` to `None` or `1`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "train_graph = tf.Graph()\n", + "with train_graph.as_default():\n", + " inputs = \n", + " labels = " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Embedding\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "The embedding matrix has a size of the number of words by the number of neurons in the hidden layer. So, if you have 10,000 words and 300 hidden units, the matrix will have size $10,000 \\times 300$. Remember that we're using one-hot encoded vectors for our inputs. When you do the matrix multiplication of the one-hot vector with the embedding matrix, you end up selecting only one row out of the entire matrix:\n", + "\n", + "![one-hot matrix multiplication](assets/matrix_mult_w_one_hot.png)\n", + "\n", + "You don't actually need to do the matrix multiplication, you just need to select the row in the embedding matrix that corresponds to the input word. Then, the embedding matrix becomes a lookup table, you're looking up a vector the size of the hidden layer that represents the input word.\n", + "\n", + "\n", + "\n", + "\n", + "> **Exercise:** Tensorflow provides a convenient function [`tf.nn.embedding_lookup`](https://www.tensorflow.org/api_docs/python/tf/nn/embedding_lookup) that does this lookup for us. You pass in the embedding matrix and a tensor of integers, then it returns rows in the matrix corresponding to those integers. 
Below, set the number of embedding features you'll use (200 is a good start), create the embedding matrix variable, and use [`tf.nn.embedding_lookup`](https://www.tensorflow.org/api_docs/python/tf/nn/embedding_lookup) to get the embedding tensors. For the embedding matrix, I suggest you initialize it with uniform random numbers between -1 and 1 using [tf.random_uniform](https://www.tensorflow.org/api_docs/python/tf/random_uniform). This [TensorFlow tutorial](https://www.tensorflow.org/tutorials/word2vec) will help if you get stuck." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "n_vocab = len(int_to_vocab)\n", + "n_embedding = # Number of embedding features \n", + "with train_graph.as_default():\n", + " embedding = # create embedding weight matrix here\n", + " embed = # use tf.nn.embedding_lookup to get the hidden layer output" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Negative sampling\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For every example we give the network, we train it using the output from the softmax layer. That means for each input, we're making very small changes to millions of weights even though we only have one true example. This makes training the network very inefficient. We can approximate the loss from the softmax layer by only updating a small subset of all the weights at once. We'll update the weights for the correct label, but only for a small number of incorrect labels. This is called [\"negative sampling\"](http://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf). TensorFlow has a convenient function to do this, [`tf.nn.sampled_softmax_loss`](https://www.tensorflow.org/api_docs/python/tf/nn/sampled_softmax_loss).\n", + "\n", + "> **Exercise:** Below, create weights and biases for the softmax layer. 
Then, use [`tf.nn.sampled_softmax_loss`](https://www.tensorflow.org/api_docs/python/tf/nn/sampled_softmax_loss) to calculate the loss. Be sure to read the documentation to figure out how it works." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "# Number of negative labels to sample\n", + "n_sampled = 100\n", + "with train_graph.as_default():\n", + " softmax_w = # create softmax weight matrix here\n", + " softmax_b = # create softmax biases here\n", + " \n", + " # Calculate the loss using negative sampling\n", + " loss = # use tf.nn.sampled_softmax_loss here\n", + " \n", + " cost = tf.reduce_mean(loss)\n", + " optimizer = tf.train.AdamOptimizer().minimize(cost)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Validation\n", + "\n", + "This code is from Thushan Ganegedara's implementation. Here we're going to choose a few common words and a few uncommon words. Then, we'll print out the closest words to them. It's a nice way to check that our embedding table is grouping together words with similar semantic meanings." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import random\n", + "\n", + "with train_graph.as_default():\n", + " ## From Thushan Ganegedara's implementation\n", + " valid_size = 16 # Random set of words to evaluate similarity on.\n", + " valid_window = 100\n", + " # pick 8 samples from the (0,100) and (1000,1100) ranges each. 
lower id implies more frequent \n", + " valid_examples = np.array(random.sample(range(valid_window), valid_size//2))\n", + " valid_examples = np.append(valid_examples, \n", + " random.sample(range(1000,1000+valid_window), valid_size//2))\n", + "\n", + " valid_dataset = tf.constant(valid_examples, dtype=tf.int32)\n", + " \n", + " # We use the cosine distance:\n", + " norm = tf.sqrt(tf.reduce_sum(tf.square(embedding), 1, keep_dims=True))\n", + " normalized_embedding = embedding / norm\n", + " valid_embedding = tf.nn.embedding_lookup(normalized_embedding, valid_dataset)\n", + " similarity = tf.matmul(valid_embedding, tf.transpose(normalized_embedding))" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "# If the checkpoints directory doesn't exist:\n", + "!mkdir checkpoints" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training\n", + "\n", + "Below is the code to train the network. Every 100 batches it reports the training loss. Every 1000 batches, it'll print out the validation words." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "epochs = 10\n", + "batch_size = 1000\n", + "window_size = 10\n", + "\n", + "with train_graph.as_default():\n", + " saver = tf.train.Saver()\n", + "\n", + "with tf.Session(graph=train_graph) as sess:\n", + " iteration = 1\n", + " loss = 0\n", + " sess.run(tf.global_variables_initializer())\n", + "\n", + " for e in range(1, epochs+1):\n", + " batches = get_batches(train_words, batch_size, window_size)\n", + " start = time.time()\n", + " for x, y in batches:\n", + " \n", + " feed = {inputs: x,\n", + " labels: np.array(y)[:, None]}\n", + " train_loss, _ = sess.run([cost, optimizer], feed_dict=feed)\n", + " \n", + " loss += train_loss\n", + " \n", + " if iteration % 100 == 0: \n", + " end = time.time()\n", + " print(\"Epoch {}/{}\".format(e, epochs),\n", + " \"Iteration: {}\".format(iteration),\n", + " \"Avg. Training loss: {:.4f}\".format(loss/100),\n", + " \"{:.4f} sec/batch\".format((end-start)/100))\n", + " loss = 0\n", + " start = time.time()\n", + " \n", + " if iteration % 1000 == 0:\n", + " ## From Thushan Ganegedara's implementation\n", + " # note that this is expensive (~20% slowdown if computed every 500 steps)\n", + " sim = similarity.eval()\n", + " for i in range(valid_size):\n", + " valid_word = int_to_vocab[valid_examples[i]]\n", + " top_k = 8 # number of nearest neighbors\n", + " nearest = (-sim[i, :]).argsort()[1:top_k+1]\n", + " log = 'Nearest to %s:' % valid_word\n", + " for k in range(top_k):\n", + " close_word = int_to_vocab[nearest[k]]\n", + " log = '%s %s,' % (log, close_word)\n", + " print(log)\n", + " \n", + " iteration += 1\n", + " save_path = saver.save(sess, \"checkpoints/text8.ckpt\")\n", + " embed_mat = sess.run(normalized_embedding)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Restore the trained network if you need to:" + ] + }, + { 
+ "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "with train_graph.as_default():\n", + " saver = tf.train.Saver()\n", + "\n", + "with tf.Session(graph=train_graph) as sess:\n", + " saver.restore(sess, tf.train.latest_checkpoint('checkpoints'))\n", + " embed_mat = sess.run(embedding)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Visualizing the word vectors\n", + "\n", + "Below we'll use T-SNE to visualize how our high-dimensional word vectors cluster together. T-SNE is used to project these vectors into two dimensions while preserving local structure. Check out [this post from Christopher Olah](http://colah.github.io/posts/2014-10-Visualizing-MNIST/) to learn more about T-SNE and other ways to visualize high-dimensional data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "%config InlineBackend.figure_format = 'retina'\n", + "\n", + "import matplotlib.pyplot as plt\n", + "from sklearn.manifold import TSNE" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "viz_words = 500\n", + "tsne = TSNE()\n", + "embed_tsne = tsne.fit_transform(embed_mat[:viz_words, :])" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "fig, ax = plt.subplots(figsize=(14, 14))\n", + "for idx in range(viz_words):\n", + " plt.scatter(*embed_tsne[idx, :], color='steelblue')\n", + " plt.annotate(int_to_vocab[idx], (embed_tsne[idx, 0], embed_tsne[idx, 1]), alpha=0.7)" + ] + } + ], + "metadata": { + "hide_input": false, + "kernelspec": { + "display_name": "Python 3", + "language": 
"python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.1" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/embeddings/Skip-Gram word2vec.ipynb b/embeddings/Skip-Gram word2vec.ipynb new file mode 100644 index 0000000..ab28179 --- /dev/null +++ b/embeddings/Skip-Gram word2vec.ipynb @@ -0,0 +1,599 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# Skip-gram word2vec\n", + "\n", + "In this notebook, I'll lead you through using TensorFlow to implement the word2vec algorithm using the skip-gram architecture. By implementing this, you'll learn about embedding words for use in natural language processing. This will come in handy when dealing with things like translations.\n", + "\n", + "## Readings\n", + "\n", + "Here are the resources I used to build this notebook. 
I suggest reading these either beforehand or while you're working on this material.\n", + "\n", + "* A really good [conceptual overview](http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/) of word2vec from Chris McCormick \n", + "* [First word2vec paper](https://arxiv.org/pdf/1301.3781.pdf) from Mikolov et al.\n", + "* [NIPS paper](http://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf) with improvements for word2vec also from Mikolov et al.\n", + "* An [implementation of word2vec](http://www.thushv.com/natural_language_processing/word2vec-part-1-nlp-with-deep-learning-with-tensorflow-skip-gram/) from Thushan Ganegedara\n", + "* TensorFlow [word2vec tutorial](https://www.tensorflow.org/tutorials/word2vec)\n", + "\n", + "## Word embeddings\n", + "\n", + "When you're dealing with language and words, you end up with tens of thousands of classes to predict, one for each word. Trying to one-hot encode these words is massively inefficient, you'll have one element set to 1 and the other 50,000 set to 0. The word2vec algorithm finds much more efficient representations by finding vectors that represent the words. These vectors also contain semantic information about the words. Words that show up in similar contexts, such as \"black\", \"white\", and \"red\" will have vectors near each other. There are two architectures for implementing word2vec, CBOW (Continuous Bag-Of-Words) and Skip-gram.\n", + "\n", + "\n", + "\n", + "In this implementation, we'll be using the skip-gram architecture because it performs better than CBOW. Here, we pass in a word and try to predict the words surrounding it in the text. In this way, we can train the network to learn representations for words that show up in similar contexts.\n", + "\n", + "First up, importing packages." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "import time\n", + "\n", + "import numpy as np\n", + "import tensorflow as tf\n", + "\n", + "import utils" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Load the [text8 dataset](http://mattmahoney.net/dc/textdata.html), a file of cleaned up Wikipedia articles from Matt Mahoney. The next cell will download the data set to the `data` folder. Then you can extract it and delete the archive file to save storage space." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from urllib.request import urlretrieve\n", + "from os.path import isfile, isdir\n", + "from tqdm import tqdm\n", + "import zipfile\n", + "\n", + "dataset_folder_path = 'data'\n", + "dataset_filename = 'text8.zip'\n", + "dataset_name = 'Text8 Dataset'\n", + "\n", + "class DLProgress(tqdm):\n", + " last_block = 0\n", + "\n", + " def hook(self, block_num=1, block_size=1, total_size=None):\n", + " self.total = total_size\n", + " self.update((block_num - self.last_block) * block_size)\n", + " self.last_block = block_num\n", + "\n", + "if not isfile(dataset_filename):\n", + " with DLProgress(unit='B', unit_scale=True, miniters=1, desc=dataset_name) as pbar:\n", + " urlretrieve(\n", + " 'http://mattmahoney.net/dc/text8.zip',\n", + " dataset_filename,\n", + " pbar.hook)\n", + "\n", + "if not isdir(dataset_folder_path):\n", + " with zipfile.ZipFile(dataset_filename) as zip_ref:\n", + " zip_ref.extractall(dataset_folder_path)\n", + " \n", + "with open('data/text8') as f:\n", + " text = f.read()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Preprocessing\n", + "\n", + "Here I'm fixing up the text to make training easier. This comes from the `utils` module I wrote. 
The `preprocess` function converts any punctuation into tokens, so a period is changed to ` <PERIOD> `. In this data set, there aren't any periods, but it will help in other NLP problems. I'm also removing all words that show up five or fewer times in the dataset. This will greatly reduce issues due to noise in the data and improve the quality of the vector representations. If you want to write your own functions for this stuff, go for it." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "words = utils.preprocess(text)\n", + "print(words[:30])" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "print(\"Total words: {}\".format(len(words)))\n", + "print(\"Unique words: {}\".format(len(set(words))))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And here I'm creating dictionaries to convert words to integers and back again, integers to words. The integers are assigned in descending frequency order, so the most frequent word (\"the\") is given the integer 0 and the next most frequent is 1 and so on. The words are converted to integers and stored in the list `int_words`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "vocab_to_int, int_to_vocab = utils.create_lookup_tables(words)\n", + "int_words = [vocab_to_int[word] for word in words]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Subsampling\n", + "\n", + "Words that show up often such as \"the\", \"of\", and \"for\" don't provide much context to the nearby words. If we discard some of them, we can remove some of the noise from our data and in return get faster training and better representations. 
This process is called subsampling by Mikolov. For each word $w_i$ in the training set, we'll discard it with probability given by \n", + "\n", + "$$ P(w_i) = 1 - \\sqrt{\\frac{t}{f(w_i)}} $$\n", + "\n", + "where $t$ is a threshold parameter and $f(w_i)$ is the frequency of word $w_i$ in the total dataset.\n", + "\n", + "I'm going to leave this up to you as an exercise. This is more of a programming challenge than one about deep learning specifically. But being able to prepare your data for your network is an important skill to have. Check out my solution to see how I did it.\n", + "\n", + "> **Exercise:** Implement subsampling for the words in `int_words`. That is, go through `int_words` and discard each word given the probability $P(w_i)$ shown above. Note that $P(w_i)$ is the probability that a word is discarded. Assign the subsampled data to `train_words`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "## Your code here\n", + "train_words = # The final subsampled word list" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Making batches" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "Now that our data is in good shape, we need to get it into the proper form to pass it into our network. With the skip-gram architecture, for each word in the text, we want to grab all the words in a window around that word, with size $C$. \n", + "\n", + "From [Mikolov et al.](https://arxiv.org/pdf/1301.3781.pdf): \n", + "\n", + "\"Since the more distant words are usually less related to the current word than those close to it, we give less weight to the distant words by sampling less from those words in our training examples... 
If we choose $C = 5$, for each training word we will select randomly a number $R$ in range $< 1; C >$, and then use $R$ words from history and $R$ words from the future of the current word as correct labels.\"\n", + "\n", + "> **Exercise:** Implement a function `get_target` that receives a list of words, an index, and a window size, then returns a list of words in the window around the index. Make sure to use the algorithm described above, where you choose a random number of words from the window." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_target(words, idx, window_size=5):\n", + " ''' Get a list of words in a window around an index. '''\n", + " \n", + " # Your code here\n", + " \n", + " return" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here's a function that returns batches for our network. The idea is that it grabs `batch_size` words from a list of words. Then for each of those words, it gets the target words in the window. I haven't found a way to pass in a random number of target words and get it to work with the architecture, so I make one row per input-target pair. By the way, this is a generator function, which helps save memory."
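For the subsampling exercise earlier in this notebook, here's a minimal sketch of how `train_words` could be produced (my own sketch, assuming the paper's threshold $t = 10^{-5}$; the toy `int_words` list here stands in for the notebook's real one):

```python
import random
from collections import Counter

t = 1e-5  # threshold from Mikolov et al.

# toy stand-in for the notebook's int_words
int_words = [0] * 500 + [1] * 50 + [2] * 5

word_counts = Counter(int_words)
total_count = len(int_words)
freqs = {word: count / total_count for word, count in word_counts.items()}
# P(w) = 1 - sqrt(t / f(w)) is the probability of *discarding* word w
p_drop = {word: 1 - (t / freqs[word]) ** 0.5 for word in word_counts}

# keep a word only when a uniform draw clears its drop probability
train_words = [word for word in int_words if random.random() > p_drop[word]]
```

Note that the most frequent word gets the highest drop probability, so common words like "the" are thinned out the most.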
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_batches(words, batch_size, window_size=5):\n", + " ''' Create a generator of word batches as a tuple (inputs, targets) '''\n", + " \n", + " n_batches = len(words)//batch_size\n", + " \n", + " # only full batches\n", + " words = words[:n_batches*batch_size]\n", + " \n", + " for idx in range(0, len(words), batch_size):\n", + " x, y = [], []\n", + " batch = words[idx:idx+batch_size]\n", + " for ii in range(len(batch)):\n", + " batch_x = batch[ii]\n", + " batch_y = get_target(batch, ii, window_size)\n", + " y.extend(batch_y)\n", + " x.extend([batch_x]*len(batch_y))\n", + " yield x, y\n", + " " + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": false + }, + "source": [ + "## Building the graph\n", + "\n", + "From Chris McCormick's blog, we can see the general structure of our network.\n", + "![embedding_network](./assets/skip_gram_net_arch.png)\n", + "\n", + "The input words are passed in as one-hot encoded vectors. This will go into a hidden layer of linear units, then into a softmax layer. We'll use the softmax layer to make a prediction like normal.\n", + "\n", + "The idea here is to train the hidden layer weight matrix to find efficient representations for our words. This weight matrix is usually called the embedding matrix or embedding look-up table. We can discard the softmax layer because we don't really care about making predictions with this network. We just want the embedding matrix so we can use it in other networks we build from the dataset.\n", + "\n", + "I'm going to have you build the graph in stages now. First off, creating the `inputs` and `labels` placeholders like normal.\n", + "\n", + "> **Exercise:** Assign `inputs` and `labels` using `tf.placeholder`. We're going to be passing in integers, so set the data types to `tf.int32`.
The batches we're passing in will have varying sizes, so set the batch sizes to [`None`]. To make things work later, you'll need to set the second dimension of `labels` to `None` or `1`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "train_graph = tf.Graph()\n", + "with train_graph.as_default():\n", + " inputs = \n", + " labels = " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Embedding\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "The embedding matrix has a size of the number of words by the number of neurons in the hidden layer. So, if you have 10,000 words and 300 hidden units, the matrix will have size $10,000 \\times 300$. Remember that we're using one-hot encoded vectors for our inputs. When you do the matrix multiplication of the one-hot vector with the embedding matrix, you end up selecting only one row out of the entire matrix:\n", + "\n", + "![one-hot matrix multiplication](assets/matrix_mult_w_one_hot.png)\n", + "\n", + "You don't actually need to do the matrix multiplication, you just need to select the row in the embedding matrix that corresponds to the input word. Then, the embedding matrix becomes a lookup table, you're looking up a vector the size of the hidden layer that represents the input word.\n", + "\n", + "\n", + "\n", + "\n", + "> **Exercise:** Tensorflow provides a convenient function [`tf.nn.embedding_lookup`](https://www.tensorflow.org/api_docs/python/tf/nn/embedding_lookup) that does this lookup for us. You pass in the embedding matrix and a tensor of integers, then it returns rows in the matrix corresponding to those integers. 
Below, set the number of embedding features you'll use (200 is a good start), create the embedding matrix variable, and use [`tf.nn.embedding_lookup`](https://www.tensorflow.org/api_docs/python/tf/nn/embedding_lookup) to get the embedding tensors. For the embedding matrix, I suggest you initialize it with uniform random numbers between -1 and 1 using [tf.random_uniform](https://www.tensorflow.org/api_docs/python/tf/random_uniform). This [TensorFlow tutorial](https://www.tensorflow.org/tutorials/word2vec) will help if you get stuck." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "n_vocab = len(int_to_vocab)\n", + "n_embedding = # Number of embedding features \n", + "with train_graph.as_default():\n", + " embedding = # create embedding weight matrix here\n", + " embed = # use tf.nn.embedding_lookup to get the hidden layer output" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Negative sampling\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For every example we give the network, we train it using the output from the softmax layer. That means for each input, we're making very small changes to millions of weights even though we only have one true example. This makes training the network very inefficient. We can approximate the loss from the softmax layer by only updating a small subset of all the weights at once. We'll update the weights for the correct label, but only a small number of incorrect labels. This is called [\"negative sampling\"](http://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf). Tensorflow has a convenient function to do this, [`tf.nn.sampled_softmax_loss`](https://www.tensorflow.org/api_docs/python/tf/nn/sampled_softmax_loss).\n", + "\n", + "> **Exercise:** Below, create weights and biases for the softmax layer.
Then, use [`tf.nn.sampled_softmax_loss`](https://www.tensorflow.org/api_docs/python/tf/nn/sampled_softmax_loss) to calculate the loss. Be sure to read the documentation to figure out how it works." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "# Number of negative labels to sample\n", + "n_sampled = 100\n", + "with train_graph.as_default():\n", + " softmax_w = # create softmax weight matrix here\n", + " softmax_b = # create softmax biases here\n", + " \n", + " # Calculate the loss using negative sampling\n", + " loss = tf.nn.sampled_softmax_loss \n", + " \n", + " cost = tf.reduce_mean(loss)\n", + " optimizer = tf.train.AdamOptimizer().minimize(cost)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Validation\n", + "\n", + "This code is from Thushan Ganegedara's implementation. Here we're going to choose a few common words and a few uncommon words. Then, we'll print out the closest words to them. It's a nice way to check that our embedding table is grouping together words with similar semantic meanings." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "with train_graph.as_default():\n", + " ## From Thushan Ganegedara's implementation\n", + " valid_size = 16 # Random set of words to evaluate similarity on.\n", + " valid_window = 100\n", + " # pick 8 samples each from the ranges (0,100) and (1000,1100).
lower id implies more frequent \n", + " valid_examples = np.array(random.sample(range(valid_window), valid_size//2))\n", + " valid_examples = np.append(valid_examples, \n", + " random.sample(range(1000,1000+valid_window), valid_size//2))\n", + "\n", + " valid_dataset = tf.constant(valid_examples, dtype=tf.int32)\n", + " \n", + " # We use the cosine distance:\n", + " norm = tf.sqrt(tf.reduce_sum(tf.square(embedding), 1, keep_dims=True))\n", + " normalized_embedding = embedding / norm\n", + " valid_embedding = tf.nn.embedding_lookup(normalized_embedding, valid_dataset)\n", + " similarity = tf.matmul(valid_embedding, tf.transpose(normalized_embedding))" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "# If the checkpoints directory doesn't exist:\n", + "!mkdir checkpoints" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training\n", + "\n", + "Below is the code to train the network. Every 100 batches it reports the training loss. Every 1000 batches, it'll print out the validation words." 
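Before training, here is the validation cell's cosine-similarity idea sketched in plain NumPy on a tiny made-up embedding matrix (the sizes are arbitrary and just for illustration):

```python
import numpy as np

# Normalize each row to unit length, then a matrix product gives cosine
# similarities between every pair of "word" vectors (5 words, 4 features).
rng = np.random.RandomState(0)
embedding = rng.uniform(-1, 1, (5, 4))
norm = np.sqrt(np.sum(embedding**2, axis=1, keepdims=True))
normalized = embedding / norm
similarity = normalized @ normalized.T      # shape (5, 5), diagonal is 1.0

# Nearest neighbors of word 0; the word itself sorts first, so skip index 0
nearest = (-similarity[0]).argsort()[1:4]
print(nearest)
```

This is exactly what the `similarity` tensor above computes for the sampled validation words, just with TensorFlow ops instead of NumPy.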
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "epochs = 10\n", + "batch_size = 1000\n", + "window_size = 10\n", + "\n", + "with train_graph.as_default():\n", + " saver = tf.train.Saver()\n", + "\n", + "with tf.Session(graph=train_graph) as sess:\n", + " iteration = 1\n", + " loss = 0\n", + " sess.run(tf.global_variables_initializer())\n", + "\n", + " for e in range(1, epochs+1):\n", + " batches = get_batches(train_words, batch_size, window_size)\n", + " start = time.time()\n", + " for x, y in batches:\n", + " \n", + " feed = {inputs: x,\n", + " labels: np.array(y)[:, None]}\n", + " train_loss, _ = sess.run([cost, optimizer], feed_dict=feed)\n", + " \n", + " loss += train_loss\n", + " \n", + " if iteration % 100 == 0: \n", + " end = time.time()\n", + " print(\"Epoch {}/{}\".format(e, epochs),\n", + " \"Iteration: {}\".format(iteration),\n", + " \"Avg. Training loss: {:.4f}\".format(loss/100),\n", + " \"{:.4f} sec/batch\".format((end-start)/100))\n", + " loss = 0\n", + " start = time.time()\n", + " \n", + " if iteration % 1000 == 0:\n", + " ## From Thushan Ganegedara's implementation\n", + " # note that this is expensive (~20% slowdown if computed every 500 steps)\n", + " sim = similarity.eval()\n", + " for i in range(valid_size):\n", + " valid_word = int_to_vocab[valid_examples[i]]\n", + " top_k = 8 # number of nearest neighbors\n", + " nearest = (-sim[i, :]).argsort()[1:top_k+1]\n", + " log = 'Nearest to %s:' % valid_word\n", + " for k in range(top_k):\n", + " close_word = int_to_vocab[nearest[k]]\n", + " log = '%s %s,' % (log, close_word)\n", + " print(log)\n", + " \n", + " iteration += 1\n", + " save_path = saver.save(sess, \"checkpoints/text8.ckpt\")\n", + " embed_mat = sess.run(normalized_embedding)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Restore the trained network if you need to:" + ] + }, + { 
+ "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "with train_graph.as_default():\n", + " saver = tf.train.Saver()\n", + "\n", + "with tf.Session(graph=train_graph) as sess:\n", + " saver.restore(sess, tf.train.latest_checkpoint('checkpoints'))\n", + " embed_mat = sess.run(embedding)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Visualizing the word vectors\n", + "\n", + "Below we'll use T-SNE to visualize how our high-dimensional word vectors cluster together. T-SNE is used to project these vectors into two dimensions while preserving local structure. Check out [this post from Christopher Olah](http://colah.github.io/posts/2014-10-Visualizing-MNIST/) to learn more about T-SNE and other ways to visualize high-dimensional data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "%config InlineBackend.figure_format = 'retina'\n", + "\n", + "import matplotlib.pyplot as plt\n", + "from sklearn.manifold import TSNE" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "viz_words = 500\n", + "tsne = TSNE()\n", + "embed_tsne = tsne.fit_transform(embed_mat[:viz_words, :])" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "fig, ax = plt.subplots(figsize=(14, 14))\n", + "for idx in range(viz_words):\n", + " plt.scatter(*embed_tsne[idx, :], color='steelblue')\n", + " plt.annotate(int_to_vocab[idx], (embed_tsne[idx, 0], embed_tsne[idx, 1]), alpha=0.7)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name":
"python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.0" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/embeddings/Skip-Grams-Solution.ipynb b/embeddings/Skip-Grams-Solution.ipynb new file mode 100644 index 0000000..26ba004 --- /dev/null +++ b/embeddings/Skip-Grams-Solution.ipynb @@ -0,0 +1,1848 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# Skip-gram word2vec\n", + "\n", + "In this notebook, I'll lead you through using TensorFlow to implement the word2vec algorithm using the skip-gram architecture. By implementing this, you'll learn about embedding words for use in natural language processing. This will come in handy when dealing with things like translations.\n", + "\n", + "## Readings\n", + "\n", + "Here are the resources I used to build this notebook. 
I suggest reading these either beforehand or while you're working on this material.\n", + "\n", + "* A really good [conceptual overview](http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/) of word2vec from Chris McCormick \n", + "* [First word2vec paper](https://arxiv.org/pdf/1301.3781.pdf) from Mikolov et al.\n", + "* [NIPS paper](http://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf) with improvements for word2vec also from Mikolov et al.\n", + "* An [implementation of word2vec](http://www.thushv.com/natural_language_processing/word2vec-part-1-nlp-with-deep-learning-with-tensorflow-skip-gram/) from Thushan Ganegedara\n", + "* TensorFlow [word2vec tutorial](https://www.tensorflow.org/tutorials/word2vec)\n", + "\n", + "## Word embeddings\n", + "\n", + "When you're dealing with language and words, you end up with tens of thousands of classes to predict, one for each word. Trying to one-hot encode these words is massively inefficient, you'll have one element set to 1 and the other 50,000 set to 0. The word2vec algorithm finds much more efficient representations by finding vectors that represent the words. These vectors also contain semantic information about the words. Words that show up in similar contexts, such as \"black\", \"white\", and \"red\" will have vectors near each other. There are two architectures for implementing word2vec, CBOW (Continuous Bag-Of-Words) and Skip-gram.\n", + "\n", + "\n", + "\n", + "In this implementation, we'll be using the skip-gram architecture because it performs better than CBOW. Here, we pass in a word and try to predict the words surrounding it in the text. In this way, we can train the network to learn representations for words that show up in similar contexts.\n", + "\n", + "First up, importing packages." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "import time\n", + "\n", + "import numpy as np\n", + "import tensorflow as tf\n", + "\n", + "import utils" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Load the [text8 dataset](http://mattmahoney.net/dc/textdata.html), a file of cleaned up Wikipedia articles from Matt Mahoney. The next cell will download the data set to the `data` folder. Then you can extract it and delete the archive file to save storage space." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Text8 Dataset: 31.4MB [00:16, 1.88MB/s] \n" + ] + } + ], + "source": [ + "from urllib.request import urlretrieve\n", + "from os.path import isfile, isdir\n", + "from tqdm import tqdm\n", + "import zipfile\n", + "\n", + "dataset_folder_path = 'data'\n", + "dataset_filename = 'text8.zip'\n", + "dataset_name = 'Text8 Dataset'\n", + "\n", + "class DLProgress(tqdm):\n", + " last_block = 0\n", + "\n", + " def hook(self, block_num=1, block_size=1, total_size=None):\n", + " self.total = total_size\n", + " self.update((block_num - self.last_block) * block_size)\n", + " self.last_block = block_num\n", + "\n", + "if not isfile(dataset_filename):\n", + " with DLProgress(unit='B', unit_scale=True, miniters=1, desc=dataset_name) as pbar:\n", + " urlretrieve(\n", + " 'http://mattmahoney.net/dc/text8.zip',\n", + " dataset_filename,\n", + " pbar.hook)\n", + "\n", + "if not isdir(dataset_folder_path):\n", + " with zipfile.ZipFile(dataset_filename) as zip_ref:\n", + " zip_ref.extractall(dataset_folder_path)\n", + " \n", + "with open('data/text8') as f:\n", + " text = f.read()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Preprocessing\n", + "\n", + "Here I'm 
fixing up the text to make training easier. This comes from the `utils` module I wrote. The `preprocess` function converts any punctuation into tokens, so a period is changed to ` <PERIOD> `. In this data set, there aren't any periods, but it will help in other NLP problems. I'm also removing all words that show up five or fewer times in the dataset. This will greatly reduce issues due to noise in the data and improve the quality of the vector representations. If you want to write your own functions for this stuff, go for it." + ] + }, + { + "cell_type": "code", + "execution_count": 57, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "['anarchism', 'originated', 'as', 'a', 'term', 'of', 'abuse', 'first', 'used', 'against', 'early', 'working', 'class', 'radicals', 'including', 'the', 'diggers', 'of', 'the', 'english', 'revolution', 'and', 'the', 'sans', 'culottes', 'of', 'the', 'french', 'revolution', 'whilst']\n" + ] + } + ], + "source": [ + "words = utils.preprocess(text)\n", + "print(words[:30])" + ] + }, + { + "cell_type": "code", + "execution_count": 58, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Total words: 16680599\n", + "Unique words: 63641\n" + ] + } + ], + "source": [ + "print(\"Total words: {}\".format(len(words)))\n", + "print(\"Unique words: {}\".format(len(set(words))))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And here I'm creating dictionaries to convert words to integers and back again, integers to words. The integers are assigned in descending frequency order, so the most frequent word (\"the\") is given the integer 0, the next most frequent is 1, and so on. The words are converted to integers and stored in the list `int_words`."
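The `utils` module itself isn't shown in this notebook. A minimal version of `create_lookup_tables`, reconstructed from the description above (so treat it as a sketch, not necessarily the exact implementation), could look like:

```python
from collections import Counter

def create_lookup_tables(words):
    # Sort the vocabulary by descending frequency so the most common word gets id 0
    word_counts = Counter(words)
    sorted_vocab = sorted(word_counts, key=word_counts.get, reverse=True)
    int_to_vocab = {ii: word for ii, word in enumerate(sorted_vocab)}
    vocab_to_int = {word: ii for ii, word in int_to_vocab.items()}
    return vocab_to_int, int_to_vocab

# Tiny made-up example: 'the' is most frequent, so it maps to 0
words = ['the', 'cat', 'sat', 'on', 'the', 'mat', 'the']
vocab_to_int, int_to_vocab = create_lookup_tables(words)
print(vocab_to_int['the'])   # 0
```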
+ ] + }, + { + "cell_type": "code", + "execution_count": 59, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "vocab_to_int, int_to_vocab = utils.create_lookup_tables(words)\n", + "int_words = [vocab_to_int[word] for word in words]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Subsampling\n", + "\n", + "Words that show up often such as \"the\", \"of\", and \"for\" don't provide much context to the nearby words. If we discard some of them, we can remove some of the noise from our data and in return get faster training and better representations. This process is called subsampling by Mikolov. For each word $w_i$ in the training set, we'll discard it with probability given by \n", + "\n", + "$$ P(w_i) = 1 - \\sqrt{\\frac{t}{f(w_i)}} $$\n", + "\n", + "where $t$ is a threshold parameter and $f(w_i)$ is the frequency of word $w_i$ in the total dataset.\n", + "\n", + "I'm going to leave this up to you as an exercise. Check out my solution to see how I did it.\n", + "\n", + "> **Exercise:** Implement subsampling for the words in `int_words`. That is, go through `int_words` and discard each word given the probability $P(w_i)$ shown above. Note that $P(w_i)$ is the probability that a word is discarded. Assign the subsampled data to `train_words`."
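As a quick numeric sanity check of the discard probability, here is the formula evaluated with $t = 10^{-5}$ at two made-up frequencies:

```python
import numpy as np

# P(w_i) = 1 - sqrt(t / f(w_i)) for two illustrative (made-up) frequencies
threshold = 1e-5
f_common = 0.05       # a very common word, ~5% of all tokens
f_rare = 1e-5         # a word sitting exactly at the threshold frequency
p_drop_common = 1 - np.sqrt(threshold / f_common)
p_drop_rare = 1 - np.sqrt(threshold / f_rare)
print(round(float(p_drop_common), 3))   # 0.986: common words are usually dropped
print(round(float(p_drop_rare), 3))     # 0.0: threshold-frequency words are kept
```

So frequent words like "the" are discarded most of the time, while words at or below the threshold frequency are never discarded.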
+ ] + }, + { + "cell_type": "code", + "execution_count": 60, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from collections import Counter\n", + "import random\n", + "\n", + "threshold = 1e-5\n", + "word_counts = Counter(int_words)\n", + "total_count = len(int_words)\n", + "freqs = {word: count/total_count for word, count in word_counts.items()}\n", + "p_drop = {word: 1 - np.sqrt(threshold/freqs[word]) for word in word_counts}\n", + "train_words = [word for word in int_words if p_drop[word] < random.random()]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Making batches" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "Now that our data is in good shape, we need to get it into the proper form to pass it into our network. With the skip-gram architecture, for each word in the text, we want to grab all the words in a window around that word, with size $C$. \n", + "\n", + "From [Mikolov et al.](https://arxiv.org/pdf/1301.3781.pdf): \n", + "\n", + "\"Since the more distant words are usually less related to the current word than those close to it, we give less weight to the distant words by sampling less from those words in our training examples... If we choose $C = 5$, for each training word we will select randomly a number $R$ in range $< 1; C >$, and then use $R$ words from history and $R$ words from the future of the current word as correct labels.\"\n", + "\n", + "> **Exercise:** Implement a function `get_target` that receives a list of words, an index, and a window size, then returns a list of words in the window around the index. Make sure to use the algorithm described above, where you choose a random number of words from the window."
+ ] + }, + { + "cell_type": "code", + "execution_count": 61, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_target(words, idx, window_size=5):\n", + " ''' Get a list of words in a window around an index. '''\n", + " \n", + " R = np.random.randint(1, window_size+1)\n", + " start = idx - R if (idx - R) > 0 else 0\n", + " stop = idx + R\n", + " target_words = set(words[start:idx] + words[idx+1:stop+1])\n", + " \n", + " return list(target_words)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here's a function that returns batches for our network. The idea is that it grabs `batch_size` words from the word list. Then for each of those words, it gets the target words in the window. I haven't found a way to pass in a random number of target words and get it to work with the architecture, so I make one row per input-target pair. This is a generator function, by the way, which helps save memory." + ] + }, + { + "cell_type": "code", + "execution_count": 62, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_batches(words, batch_size, window_size=5):\n", + " ''' Create a generator of word batches as a tuple (inputs, targets) '''\n", + " \n", + " n_batches = len(words)//batch_size\n", + " \n", + " # only full batches\n", + " words = words[:n_batches*batch_size]\n", + " \n", + " for idx in range(0, len(words), batch_size):\n", + " x, y = [], []\n", + " batch = words[idx:idx+batch_size]\n", + " for ii in range(len(batch)):\n", + " batch_x = batch[ii]\n", + " batch_y = get_target(batch, ii, window_size)\n", + " y.extend(batch_y)\n", + " x.extend([batch_x]*len(batch_y))\n", + " yield x, y\n", + " " + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": false + }, + "source": [ + "## Building the graph\n", + "\n", + "From Chris McCormick's blog, we can see the general structure of our network.\n",
+ "![embedding_network](./assets/skip_gram_net_arch.png)\n", + "\n", + "The input words are passed in as one-hot encoded vectors. This will go into a hidden layer of linear units, then into a softmax layer. We'll use the softmax layer to make a prediction like normal.\n", + "\n", + "The idea here is to train the hidden layer weight matrix to find efficient representations for our words. This weight matrix is usually called the embedding matrix or embedding look-up table. We can discard the softmax layer because we don't really care about making predictions with this network. We just want the embedding matrix so we can use it in other networks we build from the dataset.\n", + "\n", + "I'm going to have you build the graph in stages now. First off, creating the `inputs` and `labels` placeholders like normal.\n", + "\n", + "> **Exercise:** Assign `inputs` and `labels` using `tf.placeholder`. We're going to be passing in integers, so set the data types to `tf.int32`. The batches we're passing in will have varying sizes, so set the batch sizes to [`None`]. To make things work later, you'll need to set the second dimension of `labels` to `None` or `1`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "train_graph = tf.Graph()\n", + "with train_graph.as_default():\n", + " inputs = tf.placeholder(tf.int32, [None], name='inputs')\n", + " labels = tf.placeholder(tf.int32, [None, None], name='labels')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Embedding\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "The embedding matrix has a size of the number of words by the number of neurons in the hidden layer. So, if you have 10,000 words and 300 hidden units, the matrix will have size $10,000 \\times 300$. Remember that we're using one-hot encoded vectors for our inputs.
When you do the matrix multiplication of the one-hot vector with the embedding matrix, you end up selecting only one row out of the entire matrix:\n", + "\n", + "![one-hot matrix multiplication](assets/matrix_mult_w_one_hot.png)\n", + "\n", + "You don't actually need to do the matrix multiplication, you just need to select the row in the embedding matrix that corresponds to the input word. Then, the embedding matrix becomes a lookup table, you're looking up a vector the size of the hidden layer that represents the input word.\n", + "\n", + "\n", + "\n", + "\n", + "> **Exercise:** Tensorflow provides a convenient function [`tf.nn.embedding_lookup`](https://www.tensorflow.org/api_docs/python/tf/nn/embedding_lookup) that does this lookup for us. You pass in the embedding matrix and a tensor of integers, then it returns rows in the matrix corresponding to those integers. Below, set the number of embedding features you'll use (200 is a good start), create the embedding matrix variable, and use `tf.nn.embedding_lookup` to get the embedding tensors. For the embedding matrix, I suggest you initialize it with uniform random numbers between -1 and 1 using [tf.random_uniform](https://www.tensorflow.org/api_docs/python/tf/random_uniform)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "n_vocab = len(int_to_vocab)\n", + "n_embedding = 200 # Number of embedding features \n", + "with train_graph.as_default():\n", + " embedding = tf.Variable(tf.random_uniform((n_vocab, n_embedding), -1, 1))\n", + " embed = tf.nn.embedding_lookup(embedding, inputs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Negative sampling\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For every example we give the network, we train it using the output from the softmax layer.
That means for each input, we're making very small changes to millions of weights even though we only have one true example. This makes training the network very inefficient. We can approximate the loss from the softmax layer by only updating a small subset of all the weights at once. We'll update the weights for the correct label, but only a small number of incorrect labels. This is called [\"negative sampling\"](http://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf). Tensorflow has a convenient function to do this, [`tf.nn.sampled_softmax_loss`](https://www.tensorflow.org/api_docs/python/tf/nn/sampled_softmax_loss).\n", + "\n", + "> **Exercise:** Below, create weights and biases for the softmax layer. Then, use [`tf.nn.sampled_softmax_loss`](https://www.tensorflow.org/api_docs/python/tf/nn/sampled_softmax_loss) to calculate the loss. Be sure to read the documentation to figure out how it works." + ] + }, + { + "cell_type": "code", + "execution_count": 66, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "# Number of negative labels to sample\n", + "n_sampled = 100\n", + "with train_graph.as_default():\n", + " softmax_w = tf.Variable(tf.truncated_normal((n_vocab, n_embedding), stddev=0.1))\n", + " softmax_b = tf.Variable(tf.zeros(n_vocab))\n", + " \n", + " # Calculate the loss using negative sampling\n", + " loss = tf.nn.sampled_softmax_loss(softmax_w, softmax_b, \n", + " labels, embed,\n", + " n_sampled, n_vocab)\n", + " \n", + " cost = tf.reduce_mean(loss)\n", + " optimizer = tf.train.AdamOptimizer().minimize(cost)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Validation\n", + "\n", + "This code is from Thushan Ganegedara's implementation. Here we're going to choose a few common words and a few uncommon words. Then, we'll print out the closest words to them.
It's a nice way to check that our embedding table is grouping together words with similar semantic meanings." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "with train_graph.as_default():\n", + " ## From Thushan Ganegedara's implementation\n", + " valid_size = 16 # Random set of words to evaluate similarity on.\n", + " valid_window = 100\n", + " # pick 8 samples each from the ranges (0,100) and (1000,1100). lower id implies more frequent \n", + " valid_examples = np.array(random.sample(range(valid_window), valid_size//2))\n", + " valid_examples = np.append(valid_examples, \n", + " random.sample(range(1000,1000+valid_window), valid_size//2))\n", + "\n", + " valid_dataset = tf.constant(valid_examples, dtype=tf.int32)\n", + " \n", + " # We use the cosine distance:\n", + " norm = tf.sqrt(tf.reduce_sum(tf.square(embedding), 1, keep_dims=True))\n", + " normalized_embedding = embedding / norm\n", + " valid_embedding = tf.nn.embedding_lookup(normalized_embedding, valid_dataset)\n", + " similarity = tf.matmul(valid_embedding, tf.transpose(normalized_embedding))" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "# If the checkpoints directory doesn't exist:\n", + "!mkdir checkpoints" + ] + }, + { + "cell_type": "code", + "execution_count": 67, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true, + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Epoch 1/10 Iteration: 100 Avg. Training loss: 5.6559 0.1018 sec/batch\n", + "Epoch 1/10 Iteration: 200 Avg. Training loss: 5.6093 0.1028 sec/batch\n", + "Epoch 1/10 Iteration: 300 Avg. Training loss: 5.5315 0.1023 sec/batch\n", + "Epoch 1/10 Iteration: 400 Avg. Training loss: 5.5730 0.1030 sec/batch\n", + "Epoch 1/10 Iteration: 500 Avg.
Training loss: 5.5062 0.1014 sec/batch\n", + "Epoch 1/10 Iteration: 600 Avg. Training loss: 5.5396 0.1025 sec/batch\n", + "Epoch 1/10 Iteration: 700 Avg. Training loss: 5.5646 0.1033 sec/batch\n", + "Epoch 1/10 Iteration: 800 Avg. Training loss: 5.5273 0.1035 sec/batch\n", + "Epoch 1/10 Iteration: 900 Avg. Training loss: 5.5067 0.1030 sec/batch\n", + "Epoch 1/10 Iteration: 1000 Avg. Training loss: 5.4201 0.0999 sec/batch\n", + "Nearest to for: hoffman, rogue, jehoiakim, montinari, aldington, silos, explains, ilayaraja,\n", + "Nearest to would: louisiane, lampoon, albertina, bottle, olin, allahabad, disobey, tcl,\n", + "Nearest to known: homicide, intervening, tori, satrapies, mated, rtgs, lodbrok, assistants,\n", + "Nearest to used: contributing, brazil, institutionalization, ceilings, breed, gilchrist, superstitious, hawat,\n", + "Nearest to at: squaresoft, taya, buffalo, ferraris, poststructuralism, osiris, bathory, fina,\n", + "Nearest to such: expellees, wanderer, monopolistic, seldom, nanda, imperii, portnoy, heseltine,\n", + "Nearest to called: ramp, philology, lacklustre, stoner, purification, nuisances, implementing, vegetative,\n", + "Nearest to when: benguela, edinburgh, sul, tze, konkani, fo, gigue, iranic,\n", + "Nearest to taking: leopards, arlene, disembodied, maharishi, offal, krulak, sidgwick, rational,\n", + "Nearest to consists: lippe, karaca, anthropic, gramophone, squids, cbd, buildup, detox,\n", + "Nearest to scale: exposed, shrek, allude, chappell, foretells, childe, sheltered, escola,\n", + "Nearest to units: experimenter, lawn, fortieth, jagdish, mileposts, summit, danse, decorations,\n", + "Nearest to ice: pediment, witnessing, staining, plasmodium, habibie, riggs, detection, reconstruction,\n", + "Nearest to instance: caesarean, healthy, wong, resize, corals, movers, attitudes, buena,\n", + "Nearest to channel: creditors, tritium, bouchard, mastercard, gli, dray, stringy, frees,\n", + "Nearest to report: conscious, hellas, candlestick, 
midwinter, presidents, girls, bathyscaphe, haryana,\n", + "Epoch 1/10 Iteration: 1100 Avg. Training loss: 5.4772 0.1044 sec/batch\n", + "Epoch 1/10 Iteration: 1200 Avg. Training loss: 5.4192 0.1002 sec/batch\n", + "Epoch 1/10 Iteration: 1300 Avg. Training loss: 5.3636 0.1020 sec/batch\n", + "Epoch 1/10 Iteration: 1400 Avg. Training loss: 5.2318 0.1000 sec/batch\n", + "Epoch 1/10 Iteration: 1500 Avg. Training loss: 5.1699 0.0994 sec/batch\n", + "Epoch 1/10 Iteration: 1600 Avg. Training loss: 5.1744 0.0986 sec/batch\n", + "Epoch 1/10 Iteration: 1700 Avg. Training loss: 5.1248 0.1007 sec/batch\n", + "Epoch 1/10 Iteration: 1800 Avg. Training loss: 5.0379 0.1045 sec/batch\n", + "Epoch 1/10 Iteration: 1900 Avg. Training loss: 4.9862 0.0994 sec/batch\n", + "Epoch 1/10 Iteration: 2000 Avg. Training loss: 4.9961 0.0995 sec/batch\n", + "Nearest to for: hoffman, rogue, explains, cited, dod, listed, census, oxford,\n", + "Nearest to would: louisiane, still, bottle, nyquist, lampoon, introduced, disobey, feet,\n", + "Nearest to known: homicide, intervening, tori, assistants, lodbrok, mated, millions, justified,\n", + "Nearest to used: contributing, ceilings, institutionalization, brazil, pre, question, superstitious, incorporates,\n", + "Nearest to at: squaresoft, help, taya, good, degree, their, melody, ferraris,\n", + "Nearest to such: school, seldom, noise, distances, desired, wanderer, heseltine, next,\n", + "Nearest to called: purification, implementing, industry, ramp, stoner, philology, cost, vegetative,\n", + "Nearest to when: edinburgh, tze, preservation, sul, five, order, benguela, fo,\n", + "Nearest to taking: rational, death, disembodied, countless, krulak, quaternions, carpal, audited,\n", + "Nearest to consists: gramophone, karaca, whigs, squids, brighton, anthropic, heterosexuals, increase,\n", + "Nearest to scale: exposed, formation, shrek, full, childe, sheltered, aggregated, speciation,\n", + "Nearest to units: summit, begins, independent, dod, asserted, 
appoint, lawn, experimenter,\n", + "Nearest to ice: pediment, witnessing, reconstruction, habibie, aiding, riggs, inflammable, detection,\n", + "Nearest to instance: healthy, wong, census, attitudes, believed, buena, corals, husband,\n", + "Nearest to channel: creditors, tritium, mastercard, bouchard, frees, stringy, bypassing, nietzsche,\n", + "Nearest to report: conscious, presidents, hellas, but, girls, cooper, lineage, publishing,\n", + "Epoch 1/10 Iteration: 2100 Avg. Training loss: 4.9267 0.0995 sec/batch\n", + "Epoch 1/10 Iteration: 2200 Avg. Training loss: 4.9097 0.1014 sec/batch\n", + "Epoch 1/10 Iteration: 2300 Avg. Training loss: 4.8684 0.1004 sec/batch\n", + "Epoch 1/10 Iteration: 2400 Avg. Training loss: 4.8427 0.1060 sec/batch\n", + "Epoch 1/10 Iteration: 2500 Avg. Training loss: 4.8111 0.1087 sec/batch\n", + "Epoch 1/10 Iteration: 2600 Avg. Training loss: 4.8307 0.1029 sec/batch\n", + "Epoch 1/10 Iteration: 2700 Avg. Training loss: 4.7947 0.1068 sec/batch\n", + "Epoch 1/10 Iteration: 2800 Avg. Training loss: 4.8068 0.1025 sec/batch\n", + "Epoch 1/10 Iteration: 2900 Avg. Training loss: 4.7837 0.1026 sec/batch\n", + "Epoch 1/10 Iteration: 3000 Avg. 
Training loss: 4.7842 0.1076 sec/batch\n", + "Nearest to for: hoffman, rogue, searchable, housed, cited, explains, dod, silos,\n", + "Nearest to would: louisiane, still, concentrate, lampoon, disobey, nyquist, bottle, kaiju,\n", + "Nearest to known: homicide, intervening, tori, millions, justified, mated, lodbrok, satrapies,\n", + "Nearest to used: contributing, ceilings, brazil, institutionalization, breed, superstitious, incorporates, tends,\n", + "Nearest to at: squaresoft, melody, ferraris, buffalo, competed, emi, taya, kids,\n", + "Nearest to such: seldom, desired, school, noise, distances, wanderer, rays, unions,\n", + "Nearest to called: ramp, philology, implementing, purification, industry, lacklustre, stoner, strategic,\n", + "Nearest to when: edinburgh, attractive, preservation, fo, sul, itv, tze, scotland,\n", + "Nearest to taking: rational, disembodied, india, death, arlene, exercised, quaternions, countless,\n", + "Nearest to consists: gramophone, karaca, anthropic, brighton, buildup, whigs, squids, fascist,\n", + "Nearest to scale: exposed, formation, coral, curved, childe, chappell, unusable, shrek,\n", + "Nearest to units: lawn, summit, appoint, begins, dod, laid, independent, experimenter,\n", + "Nearest to ice: witnessing, reconstruction, detection, pediment, aiding, inflammable, drugs, habibie,\n", + "Nearest to instance: healthy, wong, buena, census, attitudes, implementations, caesarean, corals,\n", + "Nearest to channel: creditors, tritium, mastercard, bouchard, frees, bypassing, nietzsche, dray,\n", + "Nearest to report: conscious, presidents, hellas, cooper, ts, girls, isomorphism, credibility,\n", + "Epoch 1/10 Iteration: 3100 Avg. Training loss: 4.7704 0.1056 sec/batch\n", + "Epoch 1/10 Iteration: 3200 Avg. Training loss: 4.7655 0.1045 sec/batch\n", + "Epoch 1/10 Iteration: 3300 Avg. Training loss: 4.7184 0.1032 sec/batch\n", + "Epoch 1/10 Iteration: 3400 Avg. Training loss: 4.7202 0.1049 sec/batch\n", + "Epoch 1/10 Iteration: 3500 Avg. 
Training loss: 4.7368 0.1028 sec/batch\n", + "Epoch 1/10 Iteration: 3600 Avg. Training loss: 4.7046 0.1022 sec/batch\n", + "Epoch 1/10 Iteration: 3700 Avg. Training loss: 4.6942 0.1021 sec/batch\n", + "Epoch 1/10 Iteration: 3800 Avg. Training loss: 4.7397 0.1023 sec/batch\n", + "Epoch 1/10 Iteration: 3900 Avg. Training loss: 4.7120 0.1021 sec/batch\n", + "Epoch 1/10 Iteration: 4000 Avg. Training loss: 4.6501 0.1022 sec/batch\n", + "Nearest to for: hoffman, rogue, searchable, housed, silos, cited, dod, jehoiakim,\n", + "Nearest to would: louisiane, lampoon, concentrate, disobey, nyquist, still, albertina, bottle,\n", + "Nearest to known: homicide, mated, tori, intervening, justified, satrapies, millions, lodbrok,\n", + "Nearest to used: ceilings, contributing, institutionalization, brazil, breed, gilchrist, hawat, superstitious,\n", + "Nearest to at: squaresoft, emi, buffalo, melody, worded, polls, competed, lander,\n", + "Nearest to such: desired, seldom, distances, wanderer, noise, license, expellees, heseltine,\n", + "Nearest to called: ramp, philology, implementing, purification, lacklustre, vegetative, industry, intimidated,\n", + "Nearest to when: edinburgh, sul, preservation, fo, attractive, tze, launchers, benguela,\n", + "Nearest to taking: leopards, maharishi, india, rational, forge, concordat, arlene, disembodied,\n", + "Nearest to consists: gramophone, buildup, karaca, coronets, brighton, terminals, efficiencies, anthropic,\n", + "Nearest to scale: exposed, chappell, childe, formation, allude, sheltered, embroiled, unusable,\n", + "Nearest to units: lawn, experimenter, summit, typewriter, fortieth, torsion, independent, jagdish,\n", + "Nearest to ice: witnessing, reconstruction, detection, pediment, habibie, aiding, pyotr, inflammable,\n", + "Nearest to instance: healthy, wong, attitudes, resize, buena, implementations, synapses, census,\n", + "Nearest to channel: creditors, tritium, mastercard, odor, frees, bouchard, dray, speculators,\n", + "Nearest to 
report: conscious, candlestick, hellas, presidents, haight, credibility, cooper, isomorphism,\n", + "Epoch 1/10 Iteration: 4100 Avg. Training loss: 4.6614 0.1032 sec/batch\n", + "Epoch 1/10 Iteration: 4200 Avg. Training loss: 4.6734 0.1022 sec/batch\n", + "Epoch 1/10 Iteration: 4300 Avg. Training loss: 4.6329 0.1024 sec/batch\n", + "Epoch 1/10 Iteration: 4400 Avg. Training loss: 4.6284 0.1037 sec/batch\n", + "Epoch 1/10 Iteration: 4500 Avg. Training loss: 4.6296 0.1047 sec/batch\n", + "Epoch 1/10 Iteration: 4600 Avg. Training loss: 4.6149 0.1042 sec/batch\n", + "Epoch 2/10 Iteration: 4700 Avg. Training loss: 4.5956 0.0812 sec/batch\n", + "Epoch 2/10 Iteration: 4800 Avg. Training loss: 4.5381 0.1114 sec/batch\n", + "Epoch 2/10 Iteration: 4900 Avg. Training loss: 4.5008 0.1046 sec/batch\n", + "Epoch 2/10 Iteration: 5000 Avg. Training loss: 4.5004 0.1017 sec/batch\n", + "Nearest to for: hoffman, rogue, searchable, housed, cited, explains, appropriately, silos,\n", + "Nearest to would: lampoon, concentrate, disobey, nyquist, louisiane, albertina, still, bottle,\n", + "Nearest to known: homicide, mated, assistants, satrapies, justified, tori, uppercase, rtgs,\n", + "Nearest to used: ceilings, contributing, institutionalization, gilchrist, mollusks, breed, hawat, tends,\n", + "Nearest to at: squaresoft, taya, emi, melody, buffalo, lander, awarding, polls,\n", + "Nearest to such: desired, noise, distances, seldom, license, heseltine, expellees, plosives,\n", + "Nearest to called: ramp, philology, lacklustre, purification, implementing, vegetative, bakunin, intimidated,\n", + "Nearest to when: edinburgh, attractive, preservation, fo, sul, tze, launchers, ragga,\n", + "Nearest to taking: leopards, arlene, rational, sidgwick, concordat, india, maharishi, representational,\n", + "Nearest to consists: gramophone, efficiencies, karaca, buildup, coronets, coasts, terminals, anthropic,\n", + "Nearest to scale: exposed, chappell, allude, formation, childe, fuse, aggregated, 
curved,\n", + "Nearest to units: torsion, lawn, fortieth, experimenter, typewriter, overlordship, jagdish, latest,\n", + "Nearest to ice: reconstruction, witnessing, detection, plasmodium, pinstripes, habibie, pediment, pyotr,\n", + "Nearest to instance: healthy, resize, synapses, attitudes, lenses, wong, implementations, corals,\n", + "Nearest to channel: tritium, creditors, mastercard, speculators, gli, dray, bouchard, frees,\n", + "Nearest to report: candlestick, conscious, haight, hellas, presidents, leaped, credibility, cooper,\n", + "Epoch 2/10 Iteration: 5100 Avg. Training loss: 4.5328 0.1027 sec/batch\n", + "Epoch 2/10 Iteration: 5200 Avg. Training loss: 4.4976 0.1024 sec/batch\n", + "Epoch 2/10 Iteration: 5300 Avg. Training loss: 4.4784 0.1023 sec/batch\n", + "Epoch 2/10 Iteration: 5400 Avg. Training loss: 4.5429 0.1024 sec/batch\n", + "Epoch 2/10 Iteration: 5500 Avg. Training loss: 4.5072 0.1021 sec/batch\n", + "Epoch 2/10 Iteration: 5600 Avg. Training loss: 4.4743 0.1062 sec/batch\n", + "Epoch 2/10 Iteration: 5700 Avg. Training loss: 4.4699 0.1040 sec/batch\n", + "Epoch 2/10 Iteration: 5800 Avg. Training loss: 4.3911 0.1088 sec/batch\n", + "Epoch 2/10 Iteration: 5900 Avg. Training loss: 4.4513 0.1101 sec/batch\n", + "Epoch 2/10 Iteration: 6000 Avg. 
Training loss: 4.4301 0.1096 sec/batch\n", + "Nearest to for: rogue, hoffman, searchable, appropriately, cited, meats, silos, housed,\n", + "Nearest to would: disobey, nyquist, concentrate, lampoon, louisiane, whyte, still, albertina,\n", + "Nearest to known: homicide, mated, satrapies, rtgs, justified, tori, ctor, millions,\n", + "Nearest to used: ceilings, contributing, mollusks, institutionalization, hawat, user, breed, weight,\n", + "Nearest to at: squaresoft, taya, emi, awarding, buffalo, melody, lander, polls,\n", + "Nearest to such: desired, license, seldom, distances, noise, heseltine, plosives, consumers,\n", + "Nearest to called: ramp, vegetative, lacklustre, philology, implementing, bakunin, supersessionism, purification,\n", + "Nearest to when: edinburgh, fo, attractive, ragga, preservation, tze, be, benguela,\n", + "Nearest to taking: leopards, arlene, rational, sidgwick, concordat, bhagavan, vicar, applause,\n", + "Nearest to consists: efficiencies, gramophone, karaca, buildup, coasts, coronets, cbd, terminals,\n", + "Nearest to scale: exposed, chappell, formation, allude, childe, curved, fuse, coral,\n", + "Nearest to units: torsion, typewriter, fortieth, lawn, latest, experimenter, torrens, arched,\n", + "Nearest to ice: reconstruction, detection, plasmodium, witnessing, staining, soils, pediment, habibie,\n", + "Nearest to instance: healthy, resize, synapses, implementations, lenses, attitudes, spreads, what,\n", + "Nearest to channel: tritium, speculators, creditors, dray, restructured, mastercard, gli, frees,\n", + "Nearest to report: candlestick, haight, conscious, leaped, credibility, presidents, hellas, standish,\n", + "Epoch 2/10 Iteration: 6100 Avg. Training loss: 4.4451 0.1131 sec/batch\n", + "Epoch 2/10 Iteration: 6200 Avg. Training loss: 4.4053 0.1095 sec/batch\n", + "Epoch 2/10 Iteration: 6300 Avg. Training loss: 4.4466 0.1095 sec/batch\n", + "Epoch 2/10 Iteration: 6400 Avg. 
Training loss: 4.4000 0.1088 sec/batch\n", + "Epoch 2/10 Iteration: 6500 Avg. Training loss: 4.4273 0.1082 sec/batch\n", + "Epoch 2/10 Iteration: 6600 Avg. Training loss: 4.4487 0.1098 sec/batch\n", + "Epoch 2/10 Iteration: 6700 Avg. Training loss: 4.3700 0.1094 sec/batch\n", + "Epoch 2/10 Iteration: 6800 Avg. Training loss: 4.3856 0.1091 sec/batch\n", + "Epoch 2/10 Iteration: 6900 Avg. Training loss: 4.4200 0.1091 sec/batch\n", + "Epoch 2/10 Iteration: 7000 Avg. Training loss: 4.3654 0.1083 sec/batch\n", + "Nearest to for: hoffman, rogue, searchable, cited, appropriately, silos, caller, jehoiakim,\n", + "Nearest to would: disobey, nyquist, lampoon, concentrate, louisiane, whyte, still, olin,\n", + "Nearest to known: mated, homicide, satrapies, tori, rtgs, assistants, grady, oak,\n", + "Nearest to used: ceilings, mollusks, institutionalization, contributing, user, breed, gilchrist, negating,\n", + "Nearest to at: squaresoft, taya, emi, awarding, room, bathory, berke, melody,\n", + "Nearest to such: desired, license, noise, seldom, plosives, distances, itself, techniques,\n", + "Nearest to called: ramp, vegetative, bakunin, lacklustre, philology, supersessionism, intimidated, sealand,\n", + "Nearest to when: edinburgh, ragga, attractive, benguela, be, fo, preservation, launchers,\n", + "Nearest to taking: leopards, rational, arlene, concordat, sidgwick, bhagavan, vicar, tents,\n", + "Nearest to consists: karaca, gramophone, coasts, efficiencies, cbd, buildup, anthropic, eee,\n", + "Nearest to scale: exposed, chappell, formation, childe, speciation, allude, curved, coral,\n", + "Nearest to units: torsion, typewriter, fortieth, force, experimenter, arched, latest, teletype,\n", + "Nearest to ice: reconstruction, detection, plasmodium, staining, soils, witnessing, pediment, robotics,\n", + "Nearest to instance: synapses, resize, healthy, implementations, lenses, attitudes, spreads, krugerrand,\n", + "Nearest to channel: tritium, speculators, creditors, curler, 
mastercard, restructured, dray, almohades,\n", + "Nearest to report: candlestick, presidents, haight, leaped, conscious, standish, credibility, tillman,\n", + "Epoch 2/10 Iteration: 7100 Avg. Training loss: 4.3969 0.1102 sec/batch\n", + "Epoch 2/10 Iteration: 7200 Avg. Training loss: 4.3768 0.1086 sec/batch\n", + "Epoch 2/10 Iteration: 7300 Avg. Training loss: 4.3602 0.1087 sec/batch\n", + "Epoch 2/10 Iteration: 7400 Avg. Training loss: 4.3689 0.1125 sec/batch\n", + "Epoch 2/10 Iteration: 7500 Avg. Training loss: 4.4073 0.1099 sec/batch\n", + "Epoch 2/10 Iteration: 7600 Avg. Training loss: 4.3354 0.1114 sec/batch\n", + "Epoch 2/10 Iteration: 7700 Avg. Training loss: 4.3640 0.1068 sec/batch\n", + "Epoch 2/10 Iteration: 7800 Avg. Training loss: 4.3759 0.1094 sec/batch\n", + "Epoch 2/10 Iteration: 7900 Avg. Training loss: 4.3205 0.1064 sec/batch\n", + "Epoch 2/10 Iteration: 8000 Avg. Training loss: 4.3363 0.1084 sec/batch\n", + "Nearest to for: hoffman, rogue, silos, searchable, housed, entities, appropriately, jehoiakim,\n", + "Nearest to would: disobey, nyquist, lampoon, louisiane, zubaydah, habilis, concentrate, despaired,\n", + "Nearest to known: satrapies, mated, oak, homicide, demographically, justified, conglomerates, uppercase,\n", + "Nearest to used: ceilings, mollusks, institutionalization, gilchrist, bp, negating, nazca, contributing,\n", + "Nearest to at: emi, awarding, taya, bathory, squaresoft, sharps, motivates, room,\n", + "Nearest to such: desired, license, seldom, plosives, noise, assumes, techniques, furtherance,\n", + "Nearest to called: ramp, vegetative, bakunin, lacklustre, reintroduce, philology, purification, supersessionism,\n", + "Nearest to when: edinburgh, ragga, refuse, attractive, be, benguela, tze, fo,\n", + "Nearest to taking: leopards, rational, concordat, sidgwick, arlene, anoxic, bhagavan, vicar,\n", + "Nearest to consists: karaca, cbd, coasts, gramophone, brighton, eee, circumcising, efficiencies,\n", + "Nearest to scale: exposed, 
chappell, formation, speciation, curved, allude, childe, coral,\n", + "Nearest to units: torsion, fortieth, typewriter, force, arched, experimenter, latest, torrens,\n", + "Nearest to ice: soils, plasmodium, reconstruction, staining, detection, golem, hartsfield, witnessing,\n", + "Nearest to instance: synapses, resize, healthy, lenses, implementations, illogical, krugerrand, attitudes,\n", + "Nearest to channel: speculators, tritium, curler, creditors, mastercard, restructured, almohades, odor,\n", + "Nearest to report: haight, candlestick, presidents, leaped, corte, conscious, tillman, standish,\n", + "Epoch 2/10 Iteration: 8100 Avg. Training loss: 4.3422 0.1105 sec/batch\n", + "Epoch 2/10 Iteration: 8200 Avg. Training loss: 4.2877 0.1093 sec/batch\n", + "Epoch 2/10 Iteration: 8300 Avg. Training loss: 4.3619 0.1113 sec/batch\n", + "Epoch 2/10 Iteration: 8400 Avg. Training loss: 4.3875 0.1123 sec/batch\n", + "Epoch 2/10 Iteration: 8500 Avg. Training loss: 4.3750 0.1136 sec/batch\n", + "Epoch 2/10 Iteration: 8600 Avg. Training loss: 4.2679 0.1082 sec/batch\n", + "Epoch 2/10 Iteration: 8700 Avg. Training loss: 4.3009 0.1120 sec/batch\n", + "Epoch 2/10 Iteration: 8800 Avg. Training loss: 4.3798 0.1139 sec/batch\n", + "Epoch 2/10 Iteration: 8900 Avg. Training loss: 4.2172 0.1133 sec/batch\n", + "Epoch 2/10 Iteration: 9000 Avg. 
Training loss: 4.2966 0.1099 sec/batch\n", + "Nearest to for: hoffman, rogue, searchable, silos, serrated, appropriately, emeryville, jehoiakim,\n", + "Nearest to would: disobey, nyquist, habilis, whyte, zubaydah, despaired, replied, concentrate,\n", + "Nearest to known: mated, satrapies, rtgs, uppercase, oak, homicide, demographically, very,\n", + "Nearest to used: ceilings, mollusks, bp, comprehensible, institutionalization, gilchrist, nazca, negating,\n", + "Nearest to at: emi, taya, bathory, squaresoft, awarding, motivates, room, summer,\n", + "Nearest to such: desired, license, heseltine, furtherance, seldom, techniques, monopolistic, plosives,\n", + "Nearest to called: ramp, vegetative, lacklustre, bakunin, philology, purification, supersessionism, reintroduce,\n", + "Nearest to when: edinburgh, ragga, be, refuse, benguela, attractive, tze, bursa,\n", + "Nearest to taking: leopards, rational, concordat, sidgwick, bhagavan, go, arlene, garis,\n", + "Nearest to consists: eee, karaca, cbd, efficiencies, coasts, brighton, coronets, circumcising,\n", + "Nearest to scale: exposed, chappell, formation, allude, curved, speciation, fuse, coral,\n", + "Nearest to units: torsion, fortieth, typewriter, force, torrens, arched, teletype, experimenter,\n", + "Nearest to ice: soils, plasmodium, reconstruction, staining, golem, detection, hartsfield, pyotr,\n", + "Nearest to instance: synapses, resize, healthy, lenses, krugerrand, illogical, implementations, spreads,\n", + "Nearest to channel: tritium, speculators, curler, mastercard, restructured, creditors, almohades, dray,\n", + "Nearest to report: haight, leaped, candlestick, presidents, standish, corte, conscious, credibility,\n", + "Epoch 2/10 Iteration: 9100 Avg. Training loss: 4.3073 0.1099 sec/batch\n", + "Epoch 2/10 Iteration: 9200 Avg. Training loss: 4.3067 0.1088 sec/batch\n", + "Epoch 3/10 Iteration: 9300 Avg. Training loss: 4.3305 0.0503 sec/batch\n", + "Epoch 3/10 Iteration: 9400 Avg. 
Training loss: 4.2538 0.1096 sec/batch\n", + "Epoch 3/10 Iteration: 9500 Avg. Training loss: 4.2195 0.1093 sec/batch\n", + "Epoch 3/10 Iteration: 9600 Avg. Training loss: 4.2297 0.1091 sec/batch\n", + "Epoch 3/10 Iteration: 9700 Avg. Training loss: 4.2225 0.1116 sec/batch\n", + "Epoch 3/10 Iteration: 9800 Avg. Training loss: 4.2412 0.1091 sec/batch\n", + "Epoch 3/10 Iteration: 9900 Avg. Training loss: 4.2439 0.1091 sec/batch\n", + "Epoch 3/10 Iteration: 10000 Avg. Training loss: 4.1912 0.1096 sec/batch\n", + "Nearest to for: rogue, hoffman, searchable, silos, caller, converged, appropriately, pokey,\n", + "Nearest to would: disobey, nyquist, whyte, habilis, zubaydah, concentrate, lampoon, weaponry,\n", + "Nearest to known: mated, rtgs, conglomerates, demographically, oak, uppercase, satrapies, assistants,\n", + "Nearest to used: ceilings, mollusks, bp, negating, comprehensible, institutionalization, cages, bleaches,\n", + "Nearest to at: emi, taya, bathory, awarding, room, summer, squaresoft, sharps,\n", + "Nearest to such: license, desired, heseltine, plosives, afips, furtherance, expellees, techniques,\n", + "Nearest to called: ramp, bakunin, philology, vegetative, lacklustre, supersessionism, purification, reintroduce,\n", + "Nearest to when: edinburgh, ragga, refuse, benguela, attractive, remove, be, falklands,\n", + "Nearest to taking: leopards, rational, concordat, go, sidgwick, garis, bhagavan, applause,\n", + "Nearest to consists: eee, cbd, coasts, efficiencies, karaca, brighton, coronets, located,\n", + "Nearest to scale: exposed, chappell, coral, allude, curved, formation, fuse, speciation,\n", + "Nearest to units: torsion, fortieth, force, typewriter, teletype, torrens, pucker, arched,\n", + "Nearest to ice: soils, plasmodium, staining, reconstruction, detection, golem, pyotr, pinstripes,\n", + "Nearest to instance: resize, synapses, healthy, lenses, krugerrand, illogical, attitudes, caesarean,\n", + "Nearest to channel: speculators, tritium, curler, 
mastercard, restructured, creditors, bypassing, almohades,\n", + "Nearest to report: candlestick, standish, credibility, haight, leaped, presidents, conscious, corte,\n", + "Epoch 3/10 Iteration: 10100 Avg. Training loss: 4.2465 0.1103 sec/batch\n", + "Epoch 3/10 Iteration: 10200 Avg. Training loss: 4.2411 0.1091 sec/batch\n", + "Epoch 3/10 Iteration: 10300 Avg. Training loss: 4.2232 0.1098 sec/batch\n", + "Epoch 3/10 Iteration: 10400 Avg. Training loss: 4.1565 0.1094 sec/batch\n", + "Epoch 3/10 Iteration: 10500 Avg. Training loss: 4.1659 0.1097 sec/batch\n", + "Epoch 3/10 Iteration: 10600 Avg. Training loss: 4.1560 0.1100 sec/batch\n", + "Epoch 3/10 Iteration: 10700 Avg. Training loss: 4.1616 0.1101 sec/batch\n", + "Epoch 3/10 Iteration: 10800 Avg. Training loss: 4.1829 0.1101 sec/batch\n", + "Epoch 3/10 Iteration: 10900 Avg. Training loss: 4.1989 0.1096 sec/batch\n", + "Epoch 3/10 Iteration: 11000 Avg. Training loss: 4.1676 0.1097 sec/batch\n", + "Nearest to for: hoffman, rogue, searchable, caller, silos, appropriately, typeface, converged,\n", + "Nearest to would: disobey, nyquist, whyte, weaponry, habilis, zubaydah, concentrate, despaired,\n", + "Nearest to known: rtgs, demographically, mated, satrapies, very, conical, usability, uppercase,\n", + "Nearest to used: ceilings, mollusks, negating, bp, institutionalization, grams, cages, painstaking,\n", + "Nearest to at: emi, taya, awarding, room, squaresoft, sharps, bathory, italia,\n", + "Nearest to such: license, desired, plosives, techniques, heseltine, undercurrent, imperii, procedure,\n", + "Nearest to called: vegetative, ramp, supersessionism, bakunin, sealand, philology, purification, reintroduce,\n", + "Nearest to when: ragga, edinburgh, attractive, refuse, be, benguela, remove, falklands,\n", + "Nearest to taking: leopards, rational, go, concordat, garis, sidgwick, carpal, anoxic,\n", + "Nearest to consists: eee, cbd, coasts, located, condorcet, circumcising, gramophone, brighton,\n", + "Nearest to scale: 
exposed, chappell, fuse, childe, curved, allude, formation, speciation,\n", + "Nearest to units: torsion, force, fortieth, typewriter, teletype, latest, unit, prefixes,\n", + "Nearest to ice: soils, plasmodium, staining, detection, reconstruction, pinstripes, fracture, golem,\n", + "Nearest to instance: resize, synapses, lenses, implementations, healthy, illogical, oscillators, krugerrand,\n", + "Nearest to channel: curler, speculators, tritium, restructured, creditors, bypassing, mastercard, dray,\n", + "Nearest to report: credibility, presidents, candlestick, standish, leaped, haight, corte, conscious,\n", + "Epoch 3/10 Iteration: 11100 Avg. Training loss: 4.1830 0.1103 sec/batch\n", + "Epoch 3/10 Iteration: 11200 Avg. Training loss: 4.2133 0.1089 sec/batch\n", + "Epoch 3/10 Iteration: 11300 Avg. Training loss: 4.1865 0.1096 sec/batch\n", + "Epoch 3/10 Iteration: 11400 Avg. Training loss: 4.1479 0.1090 sec/batch\n", + "Epoch 3/10 Iteration: 11500 Avg. Training loss: 4.2011 0.1093 sec/batch\n", + "Epoch 3/10 Iteration: 11600 Avg. Training loss: 4.1720 0.1095 sec/batch\n", + "Epoch 3/10 Iteration: 11700 Avg. Training loss: 4.2111 0.1095 sec/batch\n", + "Epoch 3/10 Iteration: 11800 Avg. Training loss: 4.1659 0.1095 sec/batch\n", + "Epoch 3/10 Iteration: 11900 Avg. Training loss: 4.1315 0.1091 sec/batch\n", + "Epoch 3/10 Iteration: 12000 Avg. 
Training loss: 4.1508 0.1092 sec/batch\n", + "Nearest to for: hoffman, rogue, given, searchable, silos, census, converged, caller,\n", + "Nearest to would: disobey, habilis, nyquist, zubaydah, whyte, despaired, weaponry, preeminence,\n", + "Nearest to known: rtgs, mated, satrapies, uppercase, usability, conical, very, oak,\n", + "Nearest to used: ceilings, mollusks, bp, negating, institutionalization, decorator, supplementation, cirth,\n", + "Nearest to at: emi, taya, awarding, habr, squaresoft, sharps, coronets, dini,\n", + "Nearest to such: desired, techniques, plosives, license, pollutant, procedure, unfair, lysenkoism,\n", + "Nearest to called: ramp, vegetative, supersessionism, bakunin, philology, sealand, reintroduce, denunciations,\n", + "Nearest to when: ragga, edinburgh, attractive, be, refuse, benguela, bush, remove,\n", + "Nearest to taking: leopards, rational, concordat, sidgwick, arlene, garis, carpal, anoxic,\n", + "Nearest to consists: eee, cbd, coasts, gramophone, located, morisot, condorcet, brighton,\n", + "Nearest to scale: exposed, chappell, curved, allude, formation, fuse, speciation, childe,\n", + "Nearest to units: force, torsion, fortieth, typewriter, teletype, unit, prefixes, pucker,\n", + "Nearest to ice: soils, staining, plasmodium, fracture, pinstripes, reconstruction, pyotr, louth,\n", + "Nearest to instance: resize, lenses, synapses, implementations, illogical, healthy, krugerrand, oscillators,\n", + "Nearest to channel: curler, tritium, speculators, restructured, mastercard, creditors, bypassing, almohades,\n", + "Nearest to report: credibility, presidents, standish, candlestick, leaped, annotated, haight, serviced,\n", + "Epoch 3/10 Iteration: 12100 Avg. Training loss: 4.1912 0.1103 sec/batch\n", + "Epoch 3/10 Iteration: 12200 Avg. Training loss: 4.1658 0.1091 sec/batch\n", + "Epoch 3/10 Iteration: 12300 Avg. Training loss: 4.1775 0.1089 sec/batch\n", + "Epoch 3/10 Iteration: 12400 Avg. 
Training loss: 4.1726 0.1093 sec/batch\n", + "Epoch 3/10 Iteration: 12500 Avg. Training loss: 4.1599 0.1099 sec/batch\n", + "Epoch 3/10 Iteration: 12600 Avg. Training loss: 4.1498 0.1099 sec/batch\n", + "Epoch 3/10 Iteration: 12700 Avg. Training loss: 4.1615 0.1097 sec/batch\n", + "Epoch 3/10 Iteration: 12800 Avg. Training loss: 4.1188 0.1095 sec/batch\n", + "Epoch 3/10 Iteration: 12900 Avg. Training loss: 4.1679 0.1098 sec/batch\n", + "Epoch 3/10 Iteration: 13000 Avg. Training loss: 4.2005 0.1100 sec/batch\n", + "Nearest to for: hoffman, rogue, emeryville, census, given, scriptwriter, searchable, converged,\n", + "Nearest to would: disobey, habilis, despaired, zubaydah, amontillado, preeminence, whyte, replied,\n", + "Nearest to known: satrapies, mated, rtgs, oak, grady, tori, demographically, usability,\n", + "Nearest to used: ceilings, bp, negating, cirth, decorator, supplementation, comprehensible, hyphen,\n", + "Nearest to at: emi, taya, italia, habr, bathory, dini, nde, awarding,\n", + "Nearest to such: desired, unfair, expellees, eudicots, actus, nanda, plosives, license,\n", + "Nearest to called: supersessionism, bakunin, reintroduce, excommunicating, faithless, denunciations, ramp, vegetative,\n", + "Nearest to when: edinburgh, ragga, refuse, attractive, bush, be, benguela, convinced,\n", + "Nearest to taking: leopards, rational, sidgwick, concordat, go, garis, anoxic, arlene,\n", + "Nearest to consists: eee, cbd, condorcet, located, coasts, brighton, morisot, circumcising,\n", + "Nearest to scale: exposed, chappell, allude, curved, fuse, speciation, hashes, sheltered,\n", + "Nearest to units: force, torsion, fortieth, typewriter, teletype, unit, pucker, prefixes,\n", + "Nearest to ice: staining, plasmodium, soils, pinstripes, pyotr, fracture, louth, golem,\n", + "Nearest to instance: resize, synapses, lenses, illogical, implementations, unappreciated, healthy, krugerrand,\n", + "Nearest to channel: curler, tritium, restructured, speculators, creditors, 
mastercard, bypassing, dray,\n",
+ "Nearest to report: presidents, credibility, leaped, standish, candlestick, focusing, haight, corte,\n",
+ "Epoch 3/10 Iteration: 13100 Avg. Training loss: 4.2402 0.1103 sec/batch\n",
+ "Epoch 3/10 Iteration: 13800 Avg. Training loss: 4.1375 0.1094 sec/batch\n",
+ "Epoch 4/10 Iteration: 14000 Avg. Training loss: 4.1256 0.1089 sec/batch\n",
+ "Nearest to for: hoffman, rogue, given, converged, searchable, scriptwriter, typeface, emeryville,\n",
+ "Nearest to would: disobey, habilis, nyquist, whyte, zubaydah, busting, amontillado, gimme,\n",
+ "Nearest to known: rtgs, very, perihelion, uppercase, satrapies, usability, fervour, conglomerates,\n",
+ "Nearest to used: ceilings, bp, bleaches, cirth, negating, supplementation, institutionalization, stds,\n",
+ "Nearest to at: emi, taya, travelling, seated, bathory, coronets, breach, awarding,\n",
+ "Nearest to such: license, pollutant, techniques, desired, conceals, actus, procedure, unfair,\n",
+ "Nearest to called: ramp, vegetative, supersessionism, reintroduce, faithless, ripples, sealand, joliot,\n",
+ "Nearest to when: edinburgh, ragga, attractive, bush, refuse, be, benguela, bursa,\n",
+ "Nearest to taking: leopards, rational, sidgwick, garis, anoxic, go, concordat, carpal,\n",
+ "Nearest to consists: eee, cbd, located, brighton, condorcet, chamber, appoints, coasts,\n",
+ "Nearest to scale: exposed, allude, curved, fuse, chappell, mellin, capricornus, gears,\n",
+ "Nearest to units: force, torsion, fortieth, unit, prefixes, typewriter, teletype, pucker,\n",
+ "Nearest to ice: staining, plasmodium, soils, pinstripes, pyotr, louth, hawk, golem,\n",
+ "Nearest to instance: resize, synapses, illogical, lenses, krugerrand, healthy, unappreciated, oscillators,\n",
+ "Nearest to channel: curler, creditors, tritium, dray, restructured, bypassing, mastercard, speculators,\n",
+ "Nearest to report: credibility, presidents, leaped, standish, candlestick, annotated, haight, targeted,\n",
+ "[... per-100-iteration loss lines and per-1000-iteration nearest-word reports for iterations 14100-28900 condensed; Avg. training loss declines from ~4.12 to ~3.88; milestones below ...]\n",
+ "Epoch 4/10 Iteration: 18500 Avg. Training loss: 4.0817 0.1018 sec/batch\n",
+ "Epoch 5/10 Iteration: 19000 Avg. Training loss: 3.9676 0.1011 sec/batch\n",
+ "Epoch 5/10 Iteration: 23100 Avg. Training loss: 3.9697 0.1115 sec/batch\n",
+ "Epoch 6/10 Iteration: 23600 Avg. Training loss: 3.8895 0.1215 sec/batch\n",
+ "Epoch 6/10 Iteration: 27700 Avg. Training loss: 3.9078 0.1216 sec/batch\n",
+ "Epoch 7/10 Iteration: 29000 Avg. Training loss: 3.9068 0.1244 sec/batch\n",
+ "Nearest to for: to, given, the, first, have, from, and, scriptwriter,\n",
+ "Nearest to would: relegated, coastlands, disobey, that, whyte, in, habilis, lege,\n",
+ "Nearest to known: with, hoosiers, pisin, banach, which, oak, named, rtgs,\n",
+ "Nearest to used: use, cirth, commonly, is, grams, machining, epoxy, invented,\n",
+ "Nearest to at: the, travelling, dominants, emi, of, degree, two, meeting,\n",
+ "Nearest to such: multinationals, unfair, lysenkoism, group, pashtuns, many, actus, hinges,\n",
+ "Nearest to called: the, supersessionism, bother, anakkale, core, denunciations, systematized, keno,\n",
+ "Nearest to when: attractive, remove, refuse, retrospect, edinburgh, be, painda, itv,\n",
+ "Nearest to taking: go, rational, salim, nba, chinguetti, anoxic, garis, levees,\n",
+ "Nearest to consists: chamber, consist, eee, located, leblanc, calderon, sint, cbd,\n",
+ "Nearest to scale: diatonic, mellin, capricornus, townes, suggests, motherhood, accede, effects,\n",
+ "Nearest to units: unit, prefixes, fortieth, si, force, typewriter, dera, sumo,\n",
+ "Nearest to ice: rink, louth, pyotr, plasmodium, joaquin, pinstripes, zubr, cools,\n",
+ "Nearest to instance: placed, lenses, bookstore, resize, unappreciated, contacts, illogical, envisage,\n",
+ "Nearest to channel: channels, curler, creditors, wb, dray, bandwidth, mbit, restructured,\n",
+ "Nearest to report: reports, credibility, annotated, spirituality, presidents, comprehensive, focusing, html,\n",
+ "Epoch 7/10 Iteration: 29100 Avg. Training loss: 3.8945 0.1254 sec/batch\n",
+ "Epoch 7/10 Iteration: 29200 Avg. Training loss: 3.8284 0.1224 sec/batch\n",
+ "Epoch 7/10 Iteration: 29300 Avg. Training loss: 3.8781 0.1231 sec/batch\n",
+ "Epoch 7/10 Iteration: 29400 Avg. Training loss: 3.9094 0.1229 sec/batch\n",
+ "Epoch 7/10 Iteration: 29500 Avg. Training loss: 3.8962 0.1207 sec/batch\n",
+ "Epoch 7/10 Iteration: 29600 Avg.
Training loss: 3.8959 0.1095 sec/batch\n", + "Epoch 7/10 Iteration: 29700 Avg. Training loss: 3.9419 0.1060 sec/batch\n", + "Epoch 7/10 Iteration: 29800 Avg. Training loss: 3.9093 0.1057 sec/batch\n", + "Epoch 7/10 Iteration: 29900 Avg. Training loss: 3.8714 0.1004 sec/batch\n", + "Epoch 7/10 Iteration: 30000 Avg. Training loss: 3.8931 0.1013 sec/batch\n", + "Nearest to for: given, first, scriptwriter, to, the, have, census, from,\n", + "Nearest to would: relegated, that, disobey, lege, whyte, coastlands, in, nyquist,\n", + "Nearest to known: banach, with, pisin, which, hoosiers, rtgs, nbi, first,\n", + "Nearest to used: is, use, commonly, cirth, netbios, invented, grams, common,\n", + "Nearest to at: the, travelling, dominants, emi, of, degree, surrounding, aviators,\n", + "Nearest to such: lysenkoism, unfair, cc, other, actus, hinges, desired, group,\n", + "Nearest to called: the, supersessionism, bother, core, systematized, denunciations, rearranged, eusocial,\n", + "Nearest to when: be, attractive, remove, edinburgh, refuse, trouble, itv, retrospect,\n", + "Nearest to taking: go, rational, salim, xo, anoxic, garis, chinguetti, nba,\n", + "Nearest to consists: chamber, consist, eee, leblanc, calderon, conscience, hydrohalic, located,\n", + "Nearest to scale: diatonic, mellin, capricornus, suggests, townes, correlations, accede, motherhood,\n", + "Nearest to units: unit, prefixes, fortieth, si, force, typewriter, dera, hubei,\n", + "Nearest to ice: rink, louth, pyotr, plasmodium, joaquin, pinstripes, zubr, gory,\n", + "Nearest to instance: placed, lenses, bookstore, contacts, envisage, geometrically, consented, illogical,\n", + "Nearest to channel: creditors, curler, wb, hearsay, channels, transmitters, dts, mbit,\n", + "Nearest to report: reports, credibility, annotated, spirituality, santer, presidents, comprehensive, lebanon,\n", + "Epoch 7/10 Iteration: 30100 Avg. Training loss: 3.9198 0.1057 sec/batch\n", + "Epoch 7/10 Iteration: 30200 Avg. 
Training loss: 3.9272 0.1015 sec/batch\n", + "Epoch 7/10 Iteration: 30300 Avg. Training loss: 3.9112 0.1014 sec/batch\n", + "Epoch 7/10 Iteration: 30400 Avg. Training loss: 3.8940 0.1035 sec/batch\n", + "Epoch 7/10 Iteration: 30500 Avg. Training loss: 3.9486 0.1055 sec/batch\n", + "Epoch 7/10 Iteration: 30600 Avg. Training loss: 3.9379 0.1060 sec/batch\n", + "Epoch 7/10 Iteration: 30700 Avg. Training loss: 3.8933 0.1067 sec/batch\n", + "Epoch 7/10 Iteration: 30800 Avg. Training loss: 3.8929 0.1102 sec/batch\n", + "Epoch 7/10 Iteration: 30900 Avg. Training loss: 3.9001 0.1094 sec/batch\n", + "Epoch 7/10 Iteration: 31000 Avg. Training loss: 3.8601 0.1133 sec/batch\n", + "Nearest to for: given, the, to, first, scriptwriter, by, in, of,\n", + "Nearest to would: relegated, that, disobey, coastlands, lege, whyte, in, maecenas,\n", + "Nearest to known: with, which, first, banach, hoosiers, pisin, aalborg, millions,\n", + "Nearest to used: use, cirth, commonly, common, invented, is, netbios, grams,\n", + "Nearest to at: the, travelling, of, dominants, degree, emi, as, to,\n", + "Nearest to such: lysenkoism, unfair, cc, hinges, group, plosives, other, baa,\n", + "Nearest to called: the, bother, supersessionism, denunciations, anakkale, keno, distinctive, eusocial,\n", + "Nearest to when: attractive, be, edinburgh, remove, scotland, trouble, refuse, painda,\n", + "Nearest to taking: go, rational, anoxic, salim, xo, sidgwick, boosts, regrettable,\n", + "Nearest to consists: chamber, consist, leblanc, eee, calderon, morisot, conscience, sint,\n", + "Nearest to scale: diatonic, mellin, effects, capricornus, suggests, correlations, agglomeration, motherhood,\n", + "Nearest to units: unit, prefixes, fortieth, si, force, typewriter, dera, hubei,\n", + "Nearest to ice: rink, louth, joaquin, pyotr, plasmodium, zubr, sweden, soils,\n", + "Nearest to instance: placed, bookstore, husband, lenses, contacts, pasts, wong, envisage,\n", + "Nearest to channel: creditors, curler, hearsay, 
channels, dray, restructured, wb, mbit,\n", + "Nearest to report: reports, credibility, santer, annotated, standish, presidents, spirituality, comprehensive,\n", + "Epoch 7/10 Iteration: 31100 Avg. Training loss: 3.9213 0.1056 sec/batch\n", + "Epoch 7/10 Iteration: 31200 Avg. Training loss: 3.8905 0.1058 sec/batch\n", + "Epoch 7/10 Iteration: 31300 Avg. Training loss: 3.8990 0.1132 sec/batch\n", + "Epoch 7/10 Iteration: 31400 Avg. Training loss: 3.9640 0.1252 sec/batch\n", + "Epoch 7/10 Iteration: 31500 Avg. Training loss: 3.9684 0.1159 sec/batch\n", + "Epoch 7/10 Iteration: 31600 Avg. Training loss: 3.9861 0.1196 sec/batch\n", + "Epoch 7/10 Iteration: 31700 Avg. Training loss: 3.9020 0.1109 sec/batch\n", + "Epoch 7/10 Iteration: 31800 Avg. Training loss: 3.8697 0.1079 sec/batch\n", + "Epoch 7/10 Iteration: 31900 Avg. Training loss: 3.9195 0.1062 sec/batch\n", + "Epoch 7/10 Iteration: 32000 Avg. Training loss: 3.7972 0.1137 sec/batch\n", + "Nearest to for: given, to, the, first, scriptwriter, by, and, have,\n", + "Nearest to would: that, relegated, coastlands, disobey, to, lege, in, busting,\n", + "Nearest to known: with, which, hoosiers, pisin, first, banach, millions, aalborg,\n", + "Nearest to used: use, commonly, common, cirth, netbios, is, bunyan, invented,\n", + "Nearest to at: the, travelling, emi, of, degree, dominants, to, s,\n", + "Nearest to such: unfair, cc, other, lysenkoism, group, pashtuns, hinges, multinationals,\n", + "Nearest to called: the, supersessionism, bother, denunciations, anakkale, is, keno, instituted,\n", + "Nearest to when: be, remove, attractive, edinburgh, trouble, refuse, painda, scotland,\n", + "Nearest to taking: go, rational, salim, boosts, xo, anoxic, sidgwick, regrettable,\n", + "Nearest to consists: chamber, consist, eee, appoints, leblanc, calderon, conscience, couturat,\n", + "Nearest to scale: diatonic, mellin, effects, motherhood, suggests, capricornus, correlations, townes,\n", + "Nearest to units: unit, prefixes, 
fortieth, si, force, typewriter, dera, kilogram,\n", + "Nearest to ice: rink, louth, pyotr, joaquin, plasmodium, sweden, indoor, zubr,\n", + "Nearest to instance: placed, lenses, bookstore, contacts, philos, illogical, envisage, kruskal,\n", + "Nearest to channel: creditors, hearsay, curler, wb, channels, dray, mbit, bandwidth,\n", + "Nearest to report: reports, credibility, annotated, santer, presidents, spirituality, haight, focusing,\n", + "Epoch 7/10 Iteration: 32100 Avg. Training loss: 3.9153 0.1189 sec/batch\n", + "Epoch 7/10 Iteration: 32200 Avg. Training loss: 3.9433 0.1161 sec/batch\n", + "Epoch 7/10 Iteration: 32300 Avg. Training loss: 3.9029 0.1209 sec/batch\n", + "Epoch 8/10 Iteration: 32400 Avg. Training loss: 3.9170 0.0138 sec/batch\n", + "Epoch 8/10 Iteration: 32500 Avg. Training loss: 3.8952 0.1250 sec/batch\n", + "Epoch 8/10 Iteration: 32600 Avg. Training loss: 3.8827 0.1306 sec/batch\n", + "Epoch 8/10 Iteration: 32700 Avg. Training loss: 3.8966 0.1219 sec/batch\n", + "Epoch 8/10 Iteration: 32800 Avg. Training loss: 3.9122 0.1221 sec/batch\n", + "Epoch 8/10 Iteration: 32900 Avg. Training loss: 3.8753 0.1216 sec/batch\n", + "Epoch 8/10 Iteration: 33000 Avg. 
Training loss: 3.8522 0.1206 sec/batch\n", + "Nearest to for: to, given, the, and, first, by, in, have,\n", + "Nearest to would: that, in, relegated, coastlands, to, disobey, whyte, lege,\n", + "Nearest to known: which, first, with, hoosiers, most, millions, pisin, many,\n", + "Nearest to used: use, commonly, common, is, netbios, cirth, other, for,\n", + "Nearest to at: the, travelling, of, to, dominants, later, as, s,\n", + "Nearest to such: other, group, lysenkoism, multinationals, unfair, hinges, cc, actus,\n", + "Nearest to called: bother, the, supersessionism, is, denunciations, instituted, keno, ripples,\n", + "Nearest to when: remove, be, attractive, edinburgh, refuse, painda, trouble, retrospect,\n", + "Nearest to taking: go, salim, levees, boosts, xo, nba, anoxic, nsaids,\n", + "Nearest to consists: chamber, consist, eee, conscience, sint, couturat, leblanc, calderon,\n", + "Nearest to scale: diatonic, mellin, capricornus, motherhood, gears, suggests, agglomeration, tuning,\n", + "Nearest to units: unit, prefixes, fortieth, si, typewriter, hubei, force, dera,\n", + "Nearest to ice: rink, louth, pyotr, joaquin, plasmodium, sweden, gory, zubr,\n", + "Nearest to instance: placed, bookstore, husband, lenses, illogical, attitudes, pasts, herders,\n", + "Nearest to channel: creditors, wb, mbit, curler, channels, bandwidth, hearsay, transmitters,\n", + "Nearest to report: reports, credibility, annotated, standish, spirituality, presidents, santer, focusing,\n", + "Epoch 8/10 Iteration: 33100 Avg. Training loss: 3.8330 0.1218 sec/batch\n", + "Epoch 8/10 Iteration: 33200 Avg. Training loss: 3.8716 0.1212 sec/batch\n", + "Epoch 8/10 Iteration: 33300 Avg. Training loss: 3.8915 0.1208 sec/batch\n", + "Epoch 8/10 Iteration: 33400 Avg. Training loss: 3.9107 0.1212 sec/batch\n", + "Epoch 8/10 Iteration: 33500 Avg. Training loss: 3.8661 0.1210 sec/batch\n", + "Epoch 8/10 Iteration: 33600 Avg. Training loss: 3.8355 0.1189 sec/batch\n", + "Epoch 8/10 Iteration: 33700 Avg. 
Training loss: 3.8342 0.1208 sec/batch\n", + "Epoch 8/10 Iteration: 33800 Avg. Training loss: 3.7842 0.1212 sec/batch\n", + "Epoch 8/10 Iteration: 33900 Avg. Training loss: 3.8311 0.1226 sec/batch\n", + "Epoch 8/10 Iteration: 34000 Avg. Training loss: 3.8845 0.1218 sec/batch\n", + "Nearest to for: to, the, given, and, in, have, first, by,\n", + "Nearest to would: that, relegated, to, in, with, coastlands, yet, accelerations,\n", + "Nearest to known: with, which, first, hoosiers, most, many, millions, banach,\n", + "Nearest to used: is, commonly, use, common, grams, for, other, cirth,\n", + "Nearest to at: the, of, travelling, dominants, to, as, degree, two,\n", + "Nearest to such: other, and, as, group, can, cc, exotic, actus,\n", + "Nearest to called: the, is, supersessionism, bother, of, denunciations, a, rearranged,\n", + "Nearest to when: be, remove, attractive, refuse, tire, initial, painda, headers,\n", + "Nearest to taking: go, rational, levees, xo, nsaids, salim, boosts, nba,\n", + "Nearest to consists: consist, chamber, calderon, eee, conscience, located, couturat, leblanc,\n", + "Nearest to scale: diatonic, mellin, suggests, capricornus, motherhood, gears, townes, effects,\n", + "Nearest to units: unit, prefixes, fortieth, si, typewriter, force, hubei, dera,\n", + "Nearest to ice: rink, louth, pyotr, plasmodium, joaquin, sweden, detection, ussr,\n", + "Nearest to instance: placed, bookstore, lenses, oscillators, resize, xa, philos, barcodes,\n", + "Nearest to channel: creditors, channels, mbit, wb, curler, dts, restructured, dray,\n", + "Nearest to report: reports, credibility, annotated, santer, presidents, standish, spirituality, focusing,\n", + "Epoch 8/10 Iteration: 34100 Avg. Training loss: 3.8751 0.1228 sec/batch\n", + "Epoch 8/10 Iteration: 34200 Avg. Training loss: 3.8528 0.1223 sec/batch\n", + "Epoch 8/10 Iteration: 34300 Avg. Training loss: 3.9067 0.1178 sec/batch\n", + "Epoch 8/10 Iteration: 34400 Avg. 
Training loss: 3.8909 0.1161 sec/batch\n", + "Epoch 8/10 Iteration: 34500 Avg. Training loss: 3.8444 0.1158 sec/batch\n", + "Epoch 8/10 Iteration: 34600 Avg. Training loss: 3.8552 0.1208 sec/batch\n", + "Epoch 8/10 Iteration: 34700 Avg. Training loss: 3.8861 0.1260 sec/batch\n", + "Epoch 8/10 Iteration: 34800 Avg. Training loss: 3.8621 0.1159 sec/batch\n", + "Epoch 8/10 Iteration: 34900 Avg. Training loss: 3.8820 0.1110 sec/batch\n", + "Epoch 8/10 Iteration: 35000 Avg. Training loss: 3.9116 0.1115 sec/batch\n", + "Nearest to for: to, given, the, and, by, have, in, first,\n", + "Nearest to would: that, to, relegated, in, accelerations, yet, than, it,\n", + "Nearest to known: which, with, first, pisin, most, hoosiers, banach, millions,\n", + "Nearest to used: is, use, common, commonly, cirth, occasionally, for, invented,\n", + "Nearest to at: the, travelling, of, dominants, to, as, degree, s,\n", + "Nearest to such: other, as, and, can, group, lysenkoism, cc, hinges,\n", + "Nearest to called: the, bother, supersessionism, is, denunciations, rearranged, anakkale, timbres,\n", + "Nearest to when: be, remove, attractive, painda, refuse, trouble, edinburgh, initial,\n", + "Nearest to taking: go, rational, salim, levees, nsaids, xo, pia, regrettable,\n", + "Nearest to consists: consist, chamber, calderon, conscience, leblanc, couturat, eee, sint,\n", + "Nearest to scale: diatonic, mellin, suggests, capricornus, motherhood, trillions, correlations, effects,\n", + "Nearest to units: unit, prefixes, fortieth, si, force, typewriter, hubei, dera,\n", + "Nearest to ice: rink, louth, pyotr, joaquin, plasmodium, sweden, ussr, pontine,\n", + "Nearest to instance: placed, bookstore, lenses, contacts, geometrically, pasts, oscillators, robby,\n", + "Nearest to channel: creditors, curler, mbit, wb, restructured, dts, dray, channels,\n", + "Nearest to report: reports, credibility, santer, annotated, focusing, html, standish, comprehensive,\n", + "Epoch 8/10 Iteration: 35100 Avg. 
Training loss: 3.8544 0.1112 sec/batch\n", + "Epoch 8/10 Iteration: 35200 Avg. Training loss: 3.8741 0.1111 sec/batch\n", + "Epoch 8/10 Iteration: 35300 Avg. Training loss: 3.8893 0.1121 sec/batch\n", + "Epoch 8/10 Iteration: 35400 Avg. Training loss: 3.8901 0.1112 sec/batch\n", + "Epoch 8/10 Iteration: 35500 Avg. Training loss: 3.8736 0.1117 sec/batch\n", + "Epoch 8/10 Iteration: 35600 Avg. Training loss: 3.8698 0.1114 sec/batch\n", + "Epoch 8/10 Iteration: 35700 Avg. Training loss: 3.8237 0.1114 sec/batch\n", + "Epoch 8/10 Iteration: 35800 Avg. Training loss: 3.8605 0.1120 sec/batch\n", + "Epoch 8/10 Iteration: 35900 Avg. Training loss: 3.9338 0.1116 sec/batch\n", + "Epoch 8/10 Iteration: 36000 Avg. Training loss: 3.8586 0.1116 sec/batch\n", + "Nearest to for: given, the, to, and, in, first, scriptwriter, by,\n", + "Nearest to would: that, to, in, relegated, coastlands, yet, lege, with,\n", + "Nearest to known: which, with, first, hoosiers, millions, seventeenth, banach, pisin,\n", + "Nearest to used: is, common, commonly, use, cirth, netbios, often, invented,\n", + "Nearest to at: the, of, travelling, as, s, to, later, in,\n", + "Nearest to such: other, as, lysenkoism, actus, cc, group, hinges, types,\n", + "Nearest to called: bother, the, supersessionism, denunciations, keno, is, timbres, anakkale,\n", + "Nearest to when: be, the, painda, edinburgh, remove, scotland, refuse, trouble,\n", + "Nearest to taking: go, salim, pia, nsaids, xo, rational, levees, diva,\n", + "Nearest to consists: consist, chamber, calderon, eee, sint, conscience, couturat, leblanc,\n", + "Nearest to scale: diatonic, motherhood, capricornus, mellin, suggests, effects, correlations, trillions,\n", + "Nearest to units: unit, prefixes, fortieth, si, typewriter, force, dera, hubei,\n", + "Nearest to ice: rink, joaquin, louth, pyotr, plasmodium, sweden, ussr, hockey,\n", + "Nearest to instance: placed, geometrically, bookstore, philos, oscillators, kruskal, pasts, lenses,\n", + "Nearest to 
channel: creditors, mbit, channels, curler, wb, bandwidth, restructured, hearsay,\n", + "Nearest to report: reports, credibility, santer, focusing, annotated, comprehensive, standish, html,\n", + "Epoch 8/10 Iteration: 36100 Avg. Training loss: 3.9513 0.1133 sec/batch\n", + "Epoch 8/10 Iteration: 36200 Avg. Training loss: 3.9537 0.1111 sec/batch\n", + "Epoch 8/10 Iteration: 36300 Avg. Training loss: 3.8965 0.1114 sec/batch\n", + "Epoch 8/10 Iteration: 36400 Avg. Training loss: 3.8243 0.1119 sec/batch\n", + "Epoch 8/10 Iteration: 36500 Avg. Training loss: 3.8824 0.1117 sec/batch\n", + "Epoch 8/10 Iteration: 36600 Avg. Training loss: 3.8074 0.1114 sec/batch\n", + "Epoch 8/10 Iteration: 36700 Avg. Training loss: 3.8481 0.1124 sec/batch\n", + "Epoch 8/10 Iteration: 36800 Avg. Training loss: 3.8889 0.1118 sec/batch\n", + "Epoch 8/10 Iteration: 36900 Avg. Training loss: 3.8722 0.1119 sec/batch\n", + "Epoch 8/10 Iteration: 37000 Avg. Training loss: 3.8919 0.1121 sec/batch\n", + "Nearest to for: to, given, the, and, by, in, scriptwriter, have,\n", + "Nearest to would: that, to, with, relegated, coastlands, lege, yet, maecenas,\n", + "Nearest to known: which, with, most, hoosiers, many, the, first, pisin,\n", + "Nearest to used: commonly, use, is, netbios, common, other, cirth, for,\n", + "Nearest to at: the, travelling, to, as, dominants, s, of, emi,\n", + "Nearest to such: as, other, many, group, and, exotic, pashtuns, cc,\n", + "Nearest to called: the, bother, supersessionism, of, denunciations, keno, philology, systematized,\n", + "Nearest to when: be, remove, attractive, was, painda, marysville, edinburgh, the,\n", + "Nearest to taking: go, levees, xo, nsaids, nba, boosts, salim, pia,\n", + "Nearest to consists: chamber, calderon, consist, conscience, couturat, eee, appoints, leblanc,\n", + "Nearest to scale: diatonic, mellin, accidentals, motherhood, capricornus, suggests, gears, scales,\n", + "Nearest to units: unit, prefixes, fortieth, si, force, typewriter, dera, 
kilogram,\n", + "Nearest to ice: rink, joaquin, pyotr, louth, sweden, hockey, plasmodium, ussr,\n", + "Nearest to instance: placed, bookstore, pasts, geometrically, oscillators, philos, kruskal, husband,\n", + "Nearest to channel: creditors, mbit, curler, channels, wb, hearsay, bandwidth, dts,\n", + "Nearest to report: reports, credibility, annotated, santer, focusing, standish, html, comprehensive,\n", + "Epoch 9/10 Iteration: 37100 Avg. Training loss: 3.8941 0.0937 sec/batch\n", + "Epoch 9/10 Iteration: 37200 Avg. Training loss: 3.8418 0.1114 sec/batch\n", + "Epoch 9/10 Iteration: 37300 Avg. Training loss: 3.8491 0.1207 sec/batch\n", + "Epoch 9/10 Iteration: 37400 Avg. Training loss: 3.8795 0.1237 sec/batch\n", + "Epoch 9/10 Iteration: 37500 Avg. Training loss: 3.8064 0.1177 sec/batch\n", + "Epoch 9/10 Iteration: 37600 Avg. Training loss: 3.8517 0.1224 sec/batch\n", + "Epoch 9/10 Iteration: 37700 Avg. Training loss: 3.8122 0.1167 sec/batch\n", + "Epoch 9/10 Iteration: 37800 Avg. Training loss: 3.8771 0.1231 sec/batch\n", + "Epoch 9/10 Iteration: 37900 Avg. Training loss: 3.8810 0.1157 sec/batch\n", + "Epoch 9/10 Iteration: 38000 Avg. 
Training loss: 3.8750 0.1181 sec/batch\n", + "Nearest to for: the, to, and, in, given, by, first, a,\n", + "Nearest to would: that, to, with, relegated, in, than, coastlands, asians,\n", + "Nearest to known: which, most, with, hoosiers, first, and, many, name,\n", + "Nearest to used: commonly, use, is, common, netbios, cirth, as, other,\n", + "Nearest to at: the, of, two, as, and, travelling, to, s,\n", + "Nearest to such: other, as, can, group, lysenkoism, exotic, many, american,\n", + "Nearest to called: the, bother, supersessionism, hardin, is, of, anakkale, eusocial,\n", + "Nearest to when: be, was, painda, attractive, initial, trouble, remove, but,\n", + "Nearest to taking: go, pia, salim, xo, levees, nba, boosts, fugees,\n", + "Nearest to consists: chamber, calderon, consist, conscience, couturat, eee, sint, appoints,\n", + "Nearest to scale: diatonic, motherhood, capricornus, correlations, mellin, chords, gears, trillions,\n", + "Nearest to units: unit, prefixes, fortieth, si, force, typewriter, hubei, dera,\n", + "Nearest to ice: rink, joaquin, pyotr, louth, hockey, sweden, ussr, plasmodium,\n", + "Nearest to instance: placed, bookstore, pasts, philos, accepts, geometrically, oscillators, kruskal,\n", + "Nearest to channel: creditors, curler, wb, restructured, channels, mbit, dts, bandwidth,\n", + "Nearest to report: reports, credibility, annotated, focusing, santer, standish, html, spirituality,\n", + "Epoch 9/10 Iteration: 38100 Avg. Training loss: 3.8705 0.1189 sec/batch\n", + "Epoch 9/10 Iteration: 38200 Avg. Training loss: 3.7634 0.1132 sec/batch\n", + "Epoch 9/10 Iteration: 38300 Avg. Training loss: 3.8207 0.1136 sec/batch\n", + "Epoch 9/10 Iteration: 38400 Avg. Training loss: 3.7974 0.1140 sec/batch\n", + "Epoch 9/10 Iteration: 38500 Avg. Training loss: 3.8033 0.1138 sec/batch\n", + "Epoch 9/10 Iteration: 38600 Avg. Training loss: 3.8553 0.1134 sec/batch\n", + "Epoch 9/10 Iteration: 38700 Avg. 
Training loss: 3.8482 0.1135 sec/batch\n", + "Epoch 9/10 Iteration: 38800 Avg. Training loss: 3.8287 0.1131 sec/batch\n", + "Epoch 9/10 Iteration: 38900 Avg. Training loss: 3.9033 0.1122 sec/batch\n", + "Epoch 9/10 Iteration: 39000 Avg. Training loss: 3.8907 0.1133 sec/batch\n", + "Nearest to for: the, to, and, in, given, have, by, a,\n", + "Nearest to would: to, that, relegated, with, than, coastlands, in, it,\n", + "Nearest to known: which, most, with, hoosiers, first, banach, the, in,\n", + "Nearest to used: commonly, is, use, common, for, occasionally, as, invented,\n", + "Nearest to at: the, of, to, two, travelling, as, dominants, and,\n", + "Nearest to such: as, other, and, can, many, exotic, lysenkoism, types,\n", + "Nearest to called: the, is, bother, supersessionism, eusocial, of, rearranged, a,\n", + "Nearest to when: be, was, attractive, remove, initial, edinburgh, painda, time,\n", + "Nearest to taking: go, levees, pia, xo, nba, fugees, nsaids, boosts,\n", + "Nearest to consists: consist, chamber, calderon, conscience, couturat, located, leblanc, eee,\n", + "Nearest to scale: diatonic, suggests, trillions, motherhood, mellin, correlations, capricornus, effects,\n", + "Nearest to units: unit, prefixes, fortieth, si, typewriter, force, hubei, dera,\n", + "Nearest to ice: rink, pyotr, joaquin, louth, sweden, hockey, plasmodium, frozen,\n", + "Nearest to instance: placed, geometrically, philos, bookstore, pasts, accepts, oscillators, contacts,\n", + "Nearest to channel: curler, creditors, wb, restructured, channels, mbit, bandwidth, hearsay,\n", + "Nearest to report: reports, credibility, focusing, annotated, santer, standish, binge, html,\n", + "Epoch 9/10 Iteration: 39100 Avg. Training loss: 3.8177 0.1132 sec/batch\n", + "Epoch 9/10 Iteration: 39200 Avg. Training loss: 3.8758 0.1144 sec/batch\n", + "Epoch 9/10 Iteration: 39300 Avg. Training loss: 3.8498 0.1183 sec/batch\n", + "Epoch 9/10 Iteration: 39400 Avg. 
Training loss: 3.8540 0.1166 sec/batch\n", + "Epoch 9/10 Iteration: 39500 Avg. Training loss: 3.8741 0.1142 sec/batch\n", + "Epoch 9/10 Iteration: 39600 Avg. Training loss: 3.8607 0.1127 sec/batch\n", + "Epoch 9/10 Iteration: 39700 Avg. Training loss: 3.8709 0.1122 sec/batch\n", + "Epoch 9/10 Iteration: 39800 Avg. Training loss: 3.8405 0.1132 sec/batch\n", + "Epoch 9/10 Iteration: 39900 Avg. Training loss: 3.8565 0.1126 sec/batch\n", + "Epoch 9/10 Iteration: 40000 Avg. Training loss: 3.8557 0.1125 sec/batch\n", + "Nearest to for: given, the, to, in, by, and, of, have,\n", + "Nearest to would: that, to, than, with, manorialism, coastlands, relegated, lege,\n", + "Nearest to known: which, with, most, first, name, this, by, pisin,\n", + "Nearest to used: is, use, commonly, common, other, for, as, occasionally,\n", + "Nearest to at: the, of, travelling, dominants, to, two, as, in,\n", + "Nearest to such: as, other, types, and, lysenkoism, exotic, many, american,\n", + "Nearest to called: the, is, bother, of, supersessionism, rearranged, a, eusocial,\n", + "Nearest to when: be, initial, the, attractive, painda, time, was, scotland,\n", + "Nearest to taking: pia, go, levees, novels, xo, fugees, salim, neustria,\n", + "Nearest to consists: consist, chamber, calderon, leblanc, conscience, located, couturat, composed,\n", + "Nearest to scale: diatonic, suggests, correlations, capricornus, motherhood, trillions, mellin, effects,\n", + "Nearest to units: unit, prefixes, fortieth, si, typewriter, dera, force, hubei,\n", + "Nearest to ice: rink, pyotr, joaquin, louth, plasmodium, ussr, sweden, hockey,\n", + "Nearest to instance: placed, geometrically, philos, accepts, kruskal, pasts, bookstore, barcodes,\n", + "Nearest to channel: creditors, curler, mbit, bandwidth, wb, restructured, channels, broadcasts,\n", + "Nearest to report: reports, credibility, santer, annotated, focusing, zangger, html, standish,\n", + "Epoch 9/10 Iteration: 40100 Avg. 
Training loss: 3.8686 0.1133 sec/batch\n", + "Epoch 9/10 Iteration: 40200 Avg. Training loss: 3.8666 0.1148 sec/batch\n", + "Epoch 9/10 Iteration: 40300 Avg. Training loss: 3.8254 0.1171 sec/batch\n", + "Epoch 9/10 Iteration: 40400 Avg. Training loss: 3.8455 0.1171 sec/batch\n", + "Epoch 9/10 Iteration: 40500 Avg. Training loss: 3.8998 0.1156 sec/batch\n", + "Epoch 9/10 Iteration: 40600 Avg. Training loss: 3.8319 0.1151 sec/batch\n", + "Epoch 9/10 Iteration: 40700 Avg. Training loss: 3.9923 0.1180 sec/batch\n", + "Epoch 9/10 Iteration: 40800 Avg. Training loss: 3.8747 0.1179 sec/batch\n", + "Epoch 9/10 Iteration: 40900 Avg. Training loss: 3.8889 0.1259 sec/batch\n", + "Epoch 9/10 Iteration: 41000 Avg. Training loss: 3.8198 0.1099 sec/batch\n", + "Nearest to for: the, given, to, in, of, have, and, by,\n", + "Nearest to would: that, to, coastlands, with, manorialism, relegated, yet, asians,\n", + "Nearest to known: with, most, which, this, name, first, by, hoosiers,\n", + "Nearest to used: commonly, is, use, common, for, invented, netbios, or,\n", + "Nearest to at: the, of, travelling, as, and, where, dominants, to,\n", + "Nearest to such: as, other, many, types, can, american, lysenkoism, dodging,\n", + "Nearest to called: the, is, bother, of, supersessionism, hardin, a, eusocial,\n", + "Nearest to when: be, painda, was, initial, remove, refuse, edinburgh, scotland,\n", + "Nearest to taking: go, pia, levees, xo, fugees, novels, reestablishing, boosts,\n", + "Nearest to consists: chamber, calderon, consist, conscience, leblanc, judicial, couturat, mayors,\n", + "Nearest to scale: diatonic, suggests, mellin, correlations, capricornus, motherhood, trillions, accidentals,\n", + "Nearest to units: unit, prefixes, fortieth, si, dera, force, typewriter, kilogram,\n", + "Nearest to ice: rink, pyotr, hockey, joaquin, ussr, plasmodium, louth, sweden,\n", + "Nearest to instance: placed, geometrically, philos, pasts, accepts, bookstore, kruskal, oscillators,\n", + "Nearest to 
channel: creditors, curler, channels, restructured, mbit, hearsay, wb, bandwidth,\n", + "Nearest to report: reports, credibility, santer, commission, annotated, zangger, focusing, binge,\n", + "Epoch 9/10 Iteration: 41100 Avg. Training loss: 3.7843 0.1144 sec/batch\n", + "Epoch 9/10 Iteration: 41200 Avg. Training loss: 3.8725 0.1137 sec/batch\n", + "Epoch 9/10 Iteration: 41300 Avg. Training loss: 3.8033 0.1140 sec/batch\n", + "Epoch 9/10 Iteration: 41400 Avg. Training loss: 3.8783 0.1153 sec/batch\n", + "Epoch 9/10 Iteration: 41500 Avg. Training loss: 3.8427 0.1154 sec/batch\n", + "Epoch 9/10 Iteration: 41600 Avg. Training loss: 3.8499 0.1160 sec/batch\n", + "Epoch 10/10 Iteration: 41700 Avg. Training loss: 3.8824 0.0667 sec/batch\n", + "Epoch 10/10 Iteration: 41800 Avg. Training loss: 3.8163 0.1239 sec/batch\n", + "Epoch 10/10 Iteration: 41900 Avg. Training loss: 3.8315 0.1177 sec/batch\n", + "Epoch 10/10 Iteration: 42000 Avg. Training loss: 3.8348 0.1208 sec/batch\n", + "Nearest to for: the, to, given, and, in, a, by, as,\n", + "Nearest to would: that, to, coastlands, with, relegated, than, lege, in,\n", + "Nearest to known: most, which, with, the, by, first, name, in,\n", + "Nearest to used: commonly, use, is, common, or, as, invented, cirth,\n", + "Nearest to at: the, of, as, travelling, to, in, where, and,\n", + "Nearest to such: as, other, types, can, any, and, lysenkoism, musical,\n", + "Nearest to called: the, is, bother, of, a, supersessionism, systematized, hardin,\n", + "Nearest to when: was, be, initial, the, painda, then, in, remove,\n", + "Nearest to taking: levees, boosts, go, fugees, xo, pia, ukrainians, salim,\n", + "Nearest to consists: chamber, consist, calderon, conscience, leblanc, couturat, sint, judicial,\n", + "Nearest to scale: diatonic, capricornus, suggests, accidentals, mellin, motherhood, specifying, scales,\n", + "Nearest to units: unit, prefixes, fortieth, si, measurement, kilogram, dera, force,\n", + "Nearest to ice: rink, pyotr, 
joaquin, ussr, louth, hockey, plasmodium, sweden,\n", + "Nearest to instance: placed, pasts, geometrically, bookstore, philos, herders, kruskal, oscillators,\n", + "Nearest to channel: creditors, curler, channels, mbit, wb, hearsay, bandwidth, restructured,\n", + "Nearest to report: reports, credibility, annotated, commission, focusing, santer, binge, zangger,\n", + "Epoch 10/10 Iteration: 42100 Avg. Training loss: 3.8185 0.1217 sec/batch\n", + "Epoch 10/10 Iteration: 42200 Avg. Training loss: 3.8360 0.1214 sec/batch\n", + "Epoch 10/10 Iteration: 42300 Avg. Training loss: 3.8103 0.1212 sec/batch\n", + "Epoch 10/10 Iteration: 42400 Avg. Training loss: 3.8191 0.1210 sec/batch\n", + "Epoch 10/10 Iteration: 42500 Avg. Training loss: 3.8747 0.1212 sec/batch\n", + "Epoch 10/10 Iteration: 42600 Avg. Training loss: 3.8540 0.1210 sec/batch\n", + "Epoch 10/10 Iteration: 42700 Avg. Training loss: 3.8766 0.1211 sec/batch\n", + "Epoch 10/10 Iteration: 42800 Avg. Training loss: 3.7192 0.1214 sec/batch\n", + "Epoch 10/10 Iteration: 42900 Avg. Training loss: 3.8094 0.1219 sec/batch\n", + "Epoch 10/10 Iteration: 43000 Avg. 
Training loss: 3.7974 0.1225 sec/batch\n", + "Nearest to for: the, to, and, given, a, in, of, by,\n", + "Nearest to would: that, to, relegated, than, coastlands, in, because, with,\n", + "Nearest to known: most, which, with, by, in, first, the, this,\n", + "Nearest to used: is, commonly, use, common, as, for, or, occasionally,\n", + "Nearest to at: the, and, two, as, of, degree, in, s,\n", + "Nearest to such: as, other, can, and, types, many, any, american,\n", + "Nearest to called: is, the, a, of, bother, and, supersessionism, systematized,\n", + "Nearest to when: be, initial, was, then, remove, time, the, before,\n", + "Nearest to taking: go, pia, fugees, levees, nsaids, boosts, xo, ukrainians,\n", + "Nearest to consists: consist, chamber, conscience, calderon, couturat, composed, leblanc, the,\n", + "Nearest to scale: diatonic, suggests, motherhood, capricornus, mellin, accidentals, specifying, trillions,\n", + "Nearest to units: unit, prefixes, fortieth, si, measurement, hubei, dera, kilogram,\n", + "Nearest to ice: rink, pyotr, joaquin, ussr, plasmodium, detection, jabir, louth,\n", + "Nearest to instance: placed, philos, geometrically, kruskal, pasts, accepts, xa, oscillators,\n", + "Nearest to channel: creditors, wb, channels, hearsay, curler, mbit, restructured, carnivores,\n", + "Nearest to report: reports, credibility, annotated, santer, focusing, commission, binge, html,\n", + "Epoch 10/10 Iteration: 43100 Avg. Training loss: 3.7622 0.1223 sec/batch\n", + "Epoch 10/10 Iteration: 43200 Avg. Training loss: 3.8084 0.1211 sec/batch\n", + "Epoch 10/10 Iteration: 43300 Avg. Training loss: 3.8268 0.1220 sec/batch\n", + "Epoch 10/10 Iteration: 43400 Avg. Training loss: 3.8140 0.1209 sec/batch\n", + "Epoch 10/10 Iteration: 43500 Avg. Training loss: 3.8296 0.1220 sec/batch\n", + "Epoch 10/10 Iteration: 43600 Avg. Training loss: 3.8960 0.1191 sec/batch\n", + "Epoch 10/10 Iteration: 43700 Avg. 
Training loss: 3.8529 0.1213 sec/batch\n", + "Epoch 10/10 Iteration: 43800 Avg. Training loss: 3.8322 0.1238 sec/batch\n", + "Epoch 10/10 Iteration: 43900 Avg. Training loss: 3.8167 0.1228 sec/batch\n", + "Epoch 10/10 Iteration: 44000 Avg. Training loss: 3.8544 0.1259 sec/batch\n", + "Nearest to for: the, to, and, given, in, a, of, by,\n", + "Nearest to would: that, to, than, relegated, in, coastlands, asians, it,\n", + "Nearest to known: most, which, with, this, in, first, the, by,\n", + "Nearest to used: is, commonly, use, common, occasionally, other, often, for,\n", + "Nearest to at: the, of, as, degree, and, travelling, in, dominants,\n", + "Nearest to such: as, other, can, and, any, types, the, american,\n", + "Nearest to called: is, the, bother, a, of, systematized, rearranged, supersessionism,\n", + "Nearest to when: be, initial, attractive, was, painda, time, tire, somehow,\n", + "Nearest to taking: pia, go, fugees, levees, nsaids, reestablishing, boosts, nba,\n", + "Nearest to consists: consist, chamber, conscience, calderon, leblanc, couturat, composed, hydrohalic,\n", + "Nearest to scale: diatonic, suggests, capricornus, correlations, mellin, motherhood, trillions, townes,\n", + "Nearest to units: unit, prefixes, measurement, fortieth, si, force, moller, remembrance,\n", + "Nearest to ice: rink, pyotr, joaquin, ussr, plasmodium, sweden, jabir, frozen,\n", + "Nearest to instance: placed, pasts, geometrically, accepts, kruskal, philos, barcodes, bookstore,\n", + "Nearest to channel: creditors, wb, curler, channels, mbit, hearsay, bandwidth, broadcasts,\n", + "Nearest to report: reports, credibility, santer, annotated, zangger, commission, binge, focusing,\n", + "Epoch 10/10 Iteration: 44100 Avg. Training loss: 3.8485 0.1220 sec/batch\n", + "Epoch 10/10 Iteration: 44200 Avg. Training loss: 3.8296 0.1186 sec/batch\n", + "Epoch 10/10 Iteration: 44300 Avg. Training loss: 3.8256 0.1181 sec/batch\n", + "Epoch 10/10 Iteration: 44400 Avg. 
Training loss: 3.8264 0.1154 sec/batch\n", + "Epoch 10/10 Iteration: 44500 Avg. Training loss: 3.8798 0.1159 sec/batch\n", + "Epoch 10/10 Iteration: 44600 Avg. Training loss: 3.8181 0.1083 sec/batch\n", + "Epoch 10/10 Iteration: 44700 Avg. Training loss: 3.8231 0.1113 sec/batch\n", + "Epoch 10/10 Iteration: 44800 Avg. Training loss: 3.8373 0.1067 sec/batch\n", + "Epoch 10/10 Iteration: 44900 Avg. Training loss: 3.7952 0.1103 sec/batch\n", + "Epoch 10/10 Iteration: 45000 Avg. Training loss: 3.8190 0.1097 sec/batch\n", + "Nearest to for: the, to, in, given, of, by, and, a,\n", + "Nearest to would: that, to, than, with, in, it, relegated, coastlands,\n", + "Nearest to known: most, with, which, first, in, by, the, this,\n", + "Nearest to used: is, use, common, commonly, other, often, for, to,\n", + "Nearest to at: the, of, in, as, two, three, degree, and,\n", + "Nearest to such: as, other, and, types, any, can, many, american,\n", + "Nearest to called: is, the, bother, a, of, eusocial, identical, rearranged,\n", + "Nearest to when: be, initial, the, attractive, remove, time, before, was,\n", + "Nearest to taking: pia, go, nsaids, fugees, boosts, neustria, reestablishing, xo,\n", + "Nearest to consists: consist, chamber, calderon, leblanc, conscience, composed, couturat, located,\n", + "Nearest to scale: diatonic, suggests, capricornus, motherhood, correlations, mellin, trillions, accede,\n", + "Nearest to units: unit, prefixes, fortieth, measurement, remembrance, force, si, dera,\n", + "Nearest to ice: rink, pyotr, ussr, joaquin, sweden, hockey, plasmodium, louth,\n", + "Nearest to instance: placed, pasts, geometrically, kruskal, philos, accepts, barcodes, xa,\n", + "Nearest to channel: creditors, channels, curler, mbit, wb, bandwidth, hearsay, restructured,\n", + "Nearest to report: reports, credibility, santer, annotated, zangger, commission, focusing, lists,\n", + "Epoch 10/10 Iteration: 45100 Avg. 
Training loss: 3.8512 0.1079 sec/batch\n", + "Epoch 10/10 Iteration: 45200 Avg. Training loss: 3.8194 0.1076 sec/batch\n", + "Epoch 10/10 Iteration: 45300 Avg. Training loss: 3.9229 0.1111 sec/batch\n", + "Epoch 10/10 Iteration: 45400 Avg. Training loss: 3.9125 0.1113 sec/batch\n", + "Epoch 10/10 Iteration: 45500 Avg. Training loss: 3.8759 0.1216 sec/batch\n", + "Epoch 10/10 Iteration: 45600 Avg. Training loss: 3.8293 0.1217 sec/batch\n", + "Epoch 10/10 Iteration: 45700 Avg. Training loss: 3.8020 0.1224 sec/batch\n", + "Epoch 10/10 Iteration: 45800 Avg. Training loss: 3.8479 0.1217 sec/batch\n", + "Epoch 10/10 Iteration: 45900 Avg. Training loss: 3.7367 0.1218 sec/batch\n", + "Epoch 10/10 Iteration: 46000 Avg. Training loss: 3.8804 0.1215 sec/batch\n", + "Nearest to for: the, and, to, a, given, of, in, from,\n", + "Nearest to would: that, to, than, coastlands, asians, relegated, with, because,\n", + "Nearest to known: most, with, which, the, first, by, in, this,\n", + "Nearest to used: commonly, is, use, common, or, often, other, as,\n", + "Nearest to at: the, in, of, as, degree, s, to, two,\n", + "Nearest to such: as, other, and, types, many, can, any, exotic,\n", + "Nearest to called: the, is, of, a, bother, identical, rearranged, hardin,\n", + "Nearest to when: be, the, was, initial, remove, laga, then, painda,\n", + "Nearest to taking: pia, go, fugees, ukrainians, reestablishing, xo, malm, boosts,\n", + "Nearest to consists: chamber, consist, calderon, leblanc, conscience, judicial, composed, couturat,\n", + "Nearest to scale: diatonic, suggests, capricornus, accidentals, mellin, motherhood, trillions, accede,\n", + "Nearest to units: unit, prefixes, measurement, fortieth, si, remembrance, force, dera,\n", + "Nearest to ice: rink, pyotr, ussr, joaquin, hockey, sweden, louth, plasmodium,\n", + "Nearest to instance: placed, pasts, geometrically, kruskal, philos, lenses, barcodes, oscillators,\n", + "Nearest to channel: creditors, channels, curler, hearsay, mbit, 
wb, carnivores, bandwidth,\n", + "Nearest to report: reports, credibility, annotated, commission, zangger, santer, focusing, lists,\n", + "Epoch 10/10 Iteration: 46100 Avg. Training loss: 3.8255 0.1184 sec/batch\n", + "Epoch 10/10 Iteration: 46200 Avg. Training loss: 3.8518 0.1119 sec/batch\n" + ] + } + ], + "source": [ + "epochs = 10\n", + "batch_size = 1000\n", + "window_size = 10\n", + "\n", + "with train_graph.as_default():\n", + " saver = tf.train.Saver()\n", + "\n", + "with tf.Session(graph=train_graph) as sess:\n", + " iteration = 1\n", + " loss = 0\n", + " sess.run(tf.global_variables_initializer())\n", + "\n", + " for e in range(1, epochs+1):\n", + " batches = get_batches(train_words, batch_size, window_size)\n", + " start = time.time()\n", + " for x, y in batches:\n", + " \n", + " feed = {inputs: x,\n", + " labels: np.array(y)[:, None]}\n", + " train_loss, _ = sess.run([cost, optimizer], feed_dict=feed)\n", + " \n", + " loss += train_loss\n", + " \n", + " if iteration % 100 == 0: \n", + " end = time.time()\n", + " print(\"Epoch {}/{}\".format(e, epochs),\n", + " \"Iteration: {}\".format(iteration),\n", + " \"Avg. 
Training loss: {:.4f}\".format(loss/100),\n", + "{:.4f} sec/batch\".format((end-start)/100))\n", + " loss = 0\n", + " start = time.time()\n", + " \n", + " if iteration % 1000 == 0:\n", + " # note that this is expensive (~20% slowdown if computed every 500 steps)\n", + " sim = similarity.eval()\n", + " for i in range(valid_size):\n", + " valid_word = int_to_vocab[valid_examples[i]]\n", + " top_k = 8 # number of nearest neighbors\n", + " nearest = (-sim[i, :]).argsort()[1:top_k+1]\n", + " log = 'Nearest to %s:' % valid_word\n", + " for k in range(top_k):\n", + " close_word = int_to_vocab[nearest[k]]\n", + " log = '%s %s,' % (log, close_word)\n", + " print(log)\n", + " \n", + " iteration += 1\n", + " save_path = saver.save(sess, \"checkpoints/text8.ckpt\")\n", + " embed_mat = sess.run(normalized_embedding)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Restore the trained network if you need to:" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "with train_graph.as_default():\n", + " saver = tf.train.Saver()\n", + "\n", + "with tf.Session(graph=train_graph) as sess:\n", + " saver.restore(sess, tf.train.latest_checkpoint('checkpoints'))\n", + " # use the normalized embedding so embed_mat matches what the training cell produced\n", + " embed_mat = sess.run(normalized_embedding)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Visualizing the word vectors\n", + "\n", + "Below we'll use T-SNE to visualize how our high-dimensional word vectors cluster together. T-SNE is used to project these vectors into two dimensions while preserving local structure. Check out [this post from Christopher Olah](http://colah.github.io/posts/2014-10-Visualizing-MNIST/) to learn more about T-SNE and other ways to visualize high-dimensional data."
+ ] + }, + { + "cell_type": "code", + "execution_count": 115, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "%config InlineBackend.figure_format = 'retina'\n", + "\n", + "import matplotlib.pyplot as plt\n", + "from sklearn.manifold import TSNE" + ] + }, + { + "cell_type": "code", + "execution_count": 138, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "viz_words = 500\n", + "tsne = TSNE()\n", + "embed_tsne = tsne.fit_transform(embed_mat[:viz_words, :])" + ] + }, + { + "cell_type": "code", + "execution_count": 139, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAABoEAAAYzCAYAAAA7x5RXAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAWJQAAFiUBSVIk8AAAIABJREFUeJzs3XlYVnXex/HPYZWb/WZTERUFFHelxJxc0hqVIm1KS5vU\nyamZsqto0nnGmnImW56emtJmzKaGKcu0xRrTNFyoRiUXBHFXQFnc2GVHELifP5Q7EVBUXML367rm\n4nDO7/zO9z7gdU1+/P5+hsViEQAAAAAAAAAAAFoXm2tdAAAAAAAAAAAAAFoeIRAAAAAAAAAAAEAr\nRAgEAAAAAAAAAADQChECAQAAAAAAAAAAtEKEQAAAAAAAAAAAAK0QIRAAAAAAAAAAAEArRAgEAAAA\nAAAAAADQChECAQAAAAAAAAAAtEKEQAAAAAAAAAAAAK0QIRAAAAAAAAAAAEArRAgEAAAAAAAAAADQ\nChECAQAAAAAAAAAAtEKEQAAAAAAAAAAAAK0QIRAAAAAAAAAAAEArRAgEAAAAAAAAAADQChECAQAA\nAAAAAAAAtEJ217qA65VhGGmS3CSlX+NSAAAAAAAAAADAz09nScUWiyXwWhVACNQ0NycnJ3NoaKj5\nWhcCAAAAAAAAAAB+Xvbt26eKioprWgMhUNPSQ0NDzQkJCde6DgAAAAAAAAAA8DMTFhamxMTE9GtZ\nA3sCAQAAAAAAAAAAtEKEQAAAAAAAAAAAAK0QIRAAAAAAAAAAAEArRAgEAAAAAAAAAADQChECAQAA\nAAAAAAAAtEKEQAAAAAAAAAAAAK0QIRAAAAAAAAAAAEArRAgEAAAAAAAAAADQChECAQAAAAAAAAAA\ntEKEQAAAAAAAAAAAAK0QIRAAAAAAAAAAAEArRAgEAAAAAAAAAADQChECAQAAAAAAAAAAtEKEQAAA\nAAAAAAAAAK0QIRAAAAAAAAAAAEArRAgEAAAAAAAAAADQChECAQAAAAAAAAAAtEKEQAAAAAAAAAAA\nAK3QdRMCGYbxmmEYsYZhHDYMo8IwjALDMLYbhjHbMAyvJu4ZbBjGqjNjKwzD2GkYRpRhGLZXu34A\nAAAAAAAAAIDryXUTAkl6WpKzpLWS5kn6RFK1pL9I2mkYRsDZgw3DGCtpvaShkv4j6R+SHCS9JenT\nq1Y1AAAAAAAAAAD
AdcjuWhdwFjeLxXLy3JOGYbws6VlJsyQ9fuacm6T3JdVIGm6xWLadOf+8pO8k\n3WcYxgMWi4UwCAAAAAAAAAAA3JCum06gxgKgMz4/8zX4rHP3SfKR9GldAHTWHH8+8+1jLV4kAAAA\nAAAAAADAz8R1EwKdR+SZrzvPOjfizNeYRsavl1QuabBhGI5XsjAAAAAAAAAAAIDr1fW0HJwkyTCM\nGZJcJLlLuknSrTodAP3vWcO6nfmafO79Foul2jCMNEk9JXWRtO8Cz0to4lL3i6scAAAAAAAAAADg\n+nHdhUCSZkjyO+v7GElTLRZL7lnn3M98LWpijrrzHi1cGwAAAAAAAAAAwM/CdRcCWSyWtpJkGIaf\npME63QG03TCMuywWS+IVeF5YY+fPdAgNaOnnAQAAAAAAAAAAXA3X7Z5AFosl22Kx/EfSLyV5Sfro\nrMt1nT7uDW6sf77wCpUHAAAAAAAAAABwXbtuQ6A6FoslQ9JeST0Nw/A+c/rAma8h5443DMNOUqCk\nakmHrkqRAAAAAAAAAAAA15nrPgQ6o/2ZrzVnvn535uvoRsYOlWSS9KPFYqm80oUBAAAAAAAAAABc\nj66LEMgwjBDDMBos7WYYho1hGC9L8tXpUOfEmUtLJeVJesAwjJvOGt9G0ktnvl1whcsGAAAAAAAA\nAAC4btld6wLOiJD0qmEYGyWlScqX5CdpmKQukrIkPVI32GKxFBuG8YhOh0E/GIbxqaQCSXdL6nbm\n/GdX9RMAAAAAAAAAAABcR66XEGidpCBJt0rqL8lDUpmkZEkfS3rbYrEUnH2DxWJZZhjGMEnPSbpX\nUhtJqZL+cGa85eqVDwAAAAAAAAAAcH25LkIgi8WyW9ITl3BfnE53EQEAAAAAAAAAAOAs18WeQAAA\nAAAAAAAAAGhZhEAAAAAAAAAAAACtECEQAAAAAAAAAABAK0QIBAAAAAAAAAAA0AoRAgEAAAAAAAAA\nALRChEAAAAAAAAAAAACtECEQAAAAAAAAAABAK0QIBAAAAAAAAAAA0AoRAgEAAAAAAAAAALRChEAA\nAAAAAAAAAACtECEQAAAAAAAAAABAK0QIBAAAAAAAAAAA0AoRAgEAAAAAAAAAALRChEAAAAAAAAAA\nAACtECEQAAAAAAAAAABAK0QIBAAAAAAAAAAA0AoRAgEAAAAAAAAAALRChEAAAAAAAAAAAACtECEQ\nAAAAAAAAAABAK0QIBAAAAAAAAAAA0AoRAgEAAAAAAAAAALRChEAAAAAAAAAAAACtECEQAAAAAAAA\nAABAK0QIBAAAAAAAAAAA0AoRAgEAAAAAAAAAALRChEAAAAAAAAAAAACtECEQAAAAAAAAAABAK0QI\nBAAAAAAAAAAA0AoRAgEAAAAAAAAAALRCdte6AAAAAAAAAIvFohUrVigmJkZZWVkqLCxUTU2N3N3d\ntW/fPvn6+mrNmjV68skndezYMdnZ2SkqKkrJycn629/+pg4dOqi2tla/+MUvdOjQIZWXl+ull17S\nV199peTkZGVnZ6uoqEj29vby8fFR+/btNXjwYN13331ydnbWtGnTJEnR0dH1jiVp8eLFWrJkiV55\n5RXt2rXLevzss8+qV69emjlzpoYMGSLDMBQYGKiioiI5OjrKzc1NAQEBuueeezRs2DBZLBZ99913\niomJ0bFjx1RRUSF3d3cFBATojjvu0JAhQ67Z+wcAAK0TIRAAAAAAALjm3n33Xa1atUpms1lOTk46\nePCgKioqZG9vL19fX+Xn5+v5559XdXW19Z64uDjFxMTI1tZWnp6eOnDggJycnDRmzBht2LBBzz//\nvBwdHeXl5aX9+/errKxMDg4OcnZ2Vps2bbR06VJt2bJFr7/++iXXXVpaqpkzZ6q8vFxdu3ZVXl6e\n8vPzZTKZ1LlzZ2VlZemNN95Qfn6+SktL9cUXX8jPz0+33nqrnJ2dVVBQoJSUFG3cu
JEQCAAAtDiW\ngwMAAAAAANfUnj17tGrVKvn7++uJJ55QcXGxbrnlFsXHx+umm26S2WzWLbfcIgcHBxUUFFjv27Zt\nmx566CGFhISopqZGH3/8sV544QVFRESooKBAbdq00XPPPafs7GwFBwfru+++0/Tp02UymRQcHKyI\niAgdPnxYH3zwwSXXnp6erpCQEPXs2VNOTk4aN26c1q9fr9DQUOXl5em5556Ti4uLPv74Yy1btkxe\nXl6aP3++HnvsMU2ePFlRUVGaP3++pk+f3hKvEgAAoB5CIAAAAAAAcE3FxsZKkiZMmKBNmzZZj93d\n3TVlyhRJko2NjfW4Tnh4uEJCQqzHYWFhkqQffvhB1dXVuuuuu5ScnGw97tChgx566CE5OTnp+++/\n1wMPPGA9rq2tvaTabWxsNHXqVBmGYT1u27atIiMjVV1drd27d1uPs7OzZWtrKxubhn8d4+bmdknP\nBwAAOB9CIAAAAAAAcNWl55Ro2dY0zY/ZrSWrNyk9p0QZlS5K2r1fktSjRw9JUrdu3WRra9vgWJI1\nADr3+ODBg5KkPn361DuWJBcXF3Xt2lVVVVUqLCy0HpeVlV3S5/Dx8ZGfn1+D4969e1trqTv29vZW\nTk6OHn/8cS1cuFAJCQmX/FwAAIDmYE8gAAAAAABw1WxPy9Mn61O0K/OnZd0OZ59QZUm5Ptl0VPsT\nD8mxulTphdVq2/Z0p42rq6ukn44rKyslSZ6entY5zj6uC1bMZnO943PHlpWVWY/P3mvoYnh4eDR6\nXDdveXm59bhHjx4aNGiQ1q1bp6VLl2rp0qWytbXVTTfdpGnTpqldu3aXVAMAAEBTCIEAAAAAAMBV\nEbM9U3NX7pLFUv+8rb2DJKn6ZJls7R1UXFKlWR/8oD9NHKY7+virpKREXl5eqq2tVUlJifU+wzAa\nPXZ2dpYknThxot5xx44drceSZDKZrMf29vaqqamxznV2KHR2t865nTuFhYWNHjf2DGdnZ40dO1Zj\nx45VUVGR9uzZow0bNmjjxo3KzMzU/PnzZW9vf4G3CAAA0HwsBwcAAAAAAK647Wl5jQZAkuRkPt0B\nU5qbaT0uycnUW9/s1H9iN1vDmQMHDliPz6dLly6SpF27dtU7lk6HOIcOHZKDg4PMZrP12M/PT4WF\nhaqurpaLi4v1WJJSUlKsc599LEm5ubnKyclpcFz3vK5du9Y7ruPu7q7Bgwfrf/7nf9SnTx8dP35c\nGRkZF/xsAAAAF4MQCAAAAAAAXHGfrE9pNACSJHPg6f16sndvkHuHbtbjyooy/d+8dyVJtbW1+uij\nj5r1rNtuu012dnb65ptv1KNHD+vx8ePHtWjRIpWXl2v48OH67LPPrMfdu3dXTU2N1q1bp5CQEOtx\nbGys9u3bJ0navHmz9bhObW2tPvjgA1ksFutxVlaWVqxYIVtbW/Xs2VMrVqyQYRjy9/dvUGt1dbVK\nS0slSY6Ojs36fAAAAM3FcnAAAAAAAOCKSs8pqbcH0Llc/TrLOzhMeSkJOrJ1lewcnVR4eL+2/vMP\nauPurWpzG6WmpmrMmDEym83Kzc097/N8fX31yCOPaMGCBZozZ478/Py0efNmDR8+XPb29vL19VVq\naqoOHTqkDh06aOrUqSosLNS6dev0zjvvqGfPnjp69KiefPJJOTo6qmvXrtqzZ49OnjypIUOGKD4+\n3vqszp07Kzk5WXv27FHXrl21bNkyRUdHy2Qy6eabb9Yrr7yisrIyTZw4Ua+//roWLVqkoKAg+fr6\nqqqqSklJSTp8+LDCw8MVEBDQYu8cAABAohMIAAAAAFpMbGysIiMjFRsbW+/8tGnTNG3atMuef+7c\nuYqMjLQuNwX8XCSl511wTMDAO9UhbJRs7OxVc6pK9k6ucnB2l629gw4fOaK2bdtqzpw5Ki8vb1bH\nTEREhF588UV169ZNhYWFMpvNcnZ2lslkkrOzs
yoqKvSrX/1Kb7zxhlxdXRUQEKCXXnpJPXr0UHJy\nsnx9feXr6yt/f39lZWXJxsZGzzzzjIKCguo9x8XFRa+//rqcnJyUlZUlLy8v+fn5ycfHR/n5+fLz\n89OMGTM0YcIETZ06Ve3bt9e+ffu0fPly/fe//5XJZNLjjz+uP/3pT5f8fgEAAJpCJxAAAAAA4LLs\n2rVLzz77rCZOnKhJkyZd63JwHSqvrL7gGMMw5Bs6SL6hg+qdP1mcr5IfP1RExBgVFRXp5MmTGjp0\nqGbOnGkdM3LkyEbn7N+/v/r379/sOnv06KH//d//veC4c3/PzWazunbtql69eunVV19t8r57771X\n9957b7PrAQAAuFx0AgEAAADAFfbSSy/ppZdeuux5Jk+erAULFshsNrdAVY3LyclRZGSk5s6d26Lz\nRkZGatasWS06Z51du3YpMjJSixcvviLz4/KZHC/8b1BPVZTKcmbToLrj2upTOpqwWrY2hsLCwvT+\n++9Lkm655ZYrWi8AAEBrQScQAAAAAFxh7dq1a5F5zGbzFQ2ALlVISIgWLFggNze3KzJ/Tk6Opk2b\nppEjRyoqKuqKPANXVr/O3hcck7N/i06k75KrX2eV5mSqJDtdkkW1NdXy795FH330kfLz8xUWFqZf\n/OIXV7xmAACA1oAQCAAAAMAN6+xwYfz48Vq0aJF27dql4uJivfzyy+rdu7dKSkr01VdfafPmzcrJ\nyZGdnZ2CgoJ03333NXuZqbr9gKKjo+udLysr0+LFixUXF6fi4mL5+vpq9OjRGjRokB555JEGocfc\nuXMVGxur6Oho+fr61ptr48aN+uabb5SWlqbq6mq1a9dOw4YN07hx42Rvb19vbGRkZJPLVv3zn//U\n1q1bNXDgwHrnt2zZouXLl+vw4cMqKSmRm5ub2rdvryFDhigiIkIdOnQ47ztYsGBBs/ZxuRRXOoTC\n5evs66reHc3alVnQ5Bi3doGqOJGl4uMHdbI4XzWVFbLIIjdXN3l5usnd3V1333237r77bhmGcRWr\nBwAA+PkiBAIAAACuEroZrl/Hjx/XM888I39/fw0fPlyVlZUymUzKycnRrFmzlJOTo549eyosLEwn\nT55UfHy8Zs+erenTp2vUqFGX9Myqqio999xzOnjwoLp06aLhw4errKxMn3/+ufbs2XNRc3300Uf6\n4osv5ObmpmHDhqlNmzZKSEjQRx99pMTERM2ZM0d2ds3/z7/q6motWrRINjY2eu655xQTE6P58+fL\n09NTAwcOlJubm9577z3FxcWppKREAQEBTe4J1NwQLTExUbNnz9aECRP00EMPWc/v3LlTM2bM0I4d\nOxQeHl5v7tdee00bN27U+++/f8EQCtfeg0ODNeuTLTqz4lsDrm27yLVtl3rnDEN69cFw9Q+8cCfR\ntbBixYpGjwEAAK4XhEAAAAAAbnh79+7V+PHjNXny5HrnZ82apdzcXM2cOVNDhw61ni8rK9OsWbP0\n3nvvKTw8XB4eHhf9zK+++koHDx7U0KFDNWPGDGtnw/3336+nnnqq2fPs379fX3zxhby9vfXmm2/K\n09NTkjRlyhS9/PLLio+P11dffaUJEyZIkpKTk5Wamqr09HTt379frq6u6tSpk0aNGqVbb71VkmRn\nZyc3NzfFxcVpzpw5WrRokUpKSjRu3DiFh4fL3d1dX3zxhe68805Nnz5dn332mbZu3arevXsrISFB\nS5cu1aFDh3TixAkFBAQoJydH+/btU0hIiCIiIqwh2m9/+1uVl5drw4YNysrKUmJiojIyMnTs2DHd\ne++9CgoK0oIFC7Rjxw5J0sqVK7VlyxZJksVi0cmTJ9WpUyfl5ubqkUceaRBCpaam6rvvvtOuXbuU\nl5enyspKeXt7Kzw8XPfff79cXFzqvcvY2FjNnTtXUVFR8vHx0ZIlS5SamirDMNSzZ089/PDDCggI\nuOifNU7rH
+itqDt7a+7KXU0GQWczDOnpu/pctwEQAADAz4HNtS4AAAAAuFGYzWYtWLCgQdCAqyc9\np0TLtqZp8YYULduapszcUkmSh4eHJk6cWG9sWlqadu/ercGDB9cLgCTJ2dlZDz74oKqqqvTjjz9e\nUi3fffedDMPQlClT6i1t5e3trbFjxzZ7nrVr10o6HR7VBUCSZGtrq2nTpskwDK1Zs0aStHr1as2c\nOVMnTpyQl5eX7rnnHt10000qKirSypUr683r4+OjnTt3KjExUQEBAfLx8dGRI0c0Z84cffTRR5Kk\nkSNH1luCLTk5WS+++KKcnJw0ZswYnTp1yhqihYaGqm/fvnriiSf0zjvvqFOnTpoxY4Y+/PBDmUwm\njRkzRn379lVNTY2SkpK0f/9+SVJ5ebl69eolOzs72dnZaeLEiZo4caJuv/12WSwW9e3bt8l3s3r1\naq1fv14dOnTQ7bffroiICJnNZi1btkx//OMfVVFR0eh9W7du1QsvvGCtq2fPntq2bZtmzZql4uLi\nZv9s0NDo/h316oPh6tPp/Htb9elk1qsPhmtUP0I3AACAy0EnEAAAAHCV2NnZsWTVNbI9LU+frE9p\nsB9JZWmhDh8+oZGB3Rrsm1MXQtTt23OuoqIiSdLhw4cvup7y8nIdP35c3t7eDfb2kaQePXo0e66D\nBw9KUqNhiL+/v7y9vZWdna0DBw5owYIFMplM6tWrl26++eZ6gWReXl6DGv38/BQYGKihQ4cqOjpa\ntbW1yszM1J49e3TTTTcpLCys3j2HDh3S+++/r7CwMKWlpenLL7+0hmivv/66dZyzs7NGjBihzz//\nXL6+vtZrzs7OWrx4sZ566in16tVLFRUVKioq0tixY3Xs2DGdPHnS2umzbNkymUym84ZA48eP12OP\nPSYbm/r//nHt2rV6++23tXLlSt13330N7tu8ebNefPHFenMvXLhQS5cu1dq1a3Xvvfc2+UxcWP9A\nb/UP9FZ6TomS0vN0tKBM+SUn5eXaRv5mZ/Xr7K3Ovq7XukwAAIBWgRAIAAAAuEoa2xPo6NGjWrdu\nnZKSkpSTk6Py8nJ5enpqwIABeuCBB+TtXX8ZpF27dln3Xhk0aJA+/vhj7du3T6dOnVJISIgmT56s\n0NDQevfMnTtXsbGxio6ObhA4nD1fSy2j5eHhYV0OrLy8XEuWLNGUKVNkNpv13nvvNbqh+4svvqj4\n+Hi9+eabCg4Ovqz3fK6Y7ZnnXX6quKJK61OLtDrpcL2ug5KSEklSUlKSkpKSmpy/qW6S8ykvL5ek\nep07Z7uY5eUuNJfZbFZubq6WL1+umpoaPfDAA/rXv/7VYFxpraM2bk3Tzox8VZ6qkbvZR7feeqt2\n7Nihp59+Wm5ublq1apXi4+NVVFQkFxcXPf/88/rNb35jnSMoKMgaDJ0boh09elSGYVgDtUOHDlmv\n1+nTp48WL16snTt3atCgQYqPj1dNTY169uwpV1dXlZWV6fDhwwoICNDOnTut9xw5cqTRz95YwCZJ\nt99+u/71r39p+/btjYZAQ4cObRAujR49WkuXLlVycnKjc+LidfZ1JewBAAC4wgiBAAAAgGto06ZN\n+vbbb9W7d2+FhobKzs5OmZmZWrNmjbZu3aq33npLXl5eDe5LTU3Vl19+qe7du+uXv/ylcnNzFRcX\npz//+c96++235e/vf1l1rV69Wps2bVLv3r3Vr18/WSwWpaamatmyZUpISNDf/vY3OTk5NbgvLi5O\nCQkJCgsL05gxY5STkyMXFxcNHTpU69at044dO9SvX7969+Tl5SkhIUFBQUEtHgBtT8tr3v4jFkNv\nfbNTvu5O1v1HTCaTJOnRRx9VZGRki9ZVN/eJEycavV5YWHhJc7Vr167etfScEm0/kKETBWX6Pn6X\nKiurFRYWpujoaNXU1Ehq2CWVfOCoyquqtTNXalPTXkXl8YqNjdXUqVM1YsQ
IxcXFKTk5Wffcc48S\nExM1e/ZshYeHS1K9558boh09elTFxcVasmSJpNN7+phMJu3fv19//OMfFR4erpCQEDk6Olr3ANqx\nY4fs7OwUEhIiNzc3lZWVaceOHWrfvr12796tgIAAeXp6NhkCVVdXKyYmRuvXr9fhw4dVVlYmy1m/\nDPn5+Y3eFxQU1OBcXSBbWlp6vh8HAAAAcF0hBAIAAACuodtuu01jx45tsBTZ9u3bNXv2bH322Wd6\n/PHHG9wXHx+vqKgojRw50nouJiZG8+fP1/Lly/XYY49dVl2XuozWtm3bNHv27AbLhEVERGjdunX6\n9ttvG4RAa9asUW1trUaPHn1ZNTfmk/UpzdqAXpIsFmnxhhRrCNStWzdJ0p49e65ICNS2bVtlZ2cr\nJyenQcfK3r17mz1Xly5ddPDgQe3evdsawtQFO9v2pGpvSqYcnD2UcyhHlSUFev3bVJ2SvfLy8hp0\nSVlqa3WyKFeSZGvfRgUO7ZWSU6FPvvxGkydPVklJifLz8+Xm5qa//OUvevvtt7V27VodP35c0unl\n3M7+jNJPIVpkZKR69eqlV1991TqmtLRUn376qeLi4vThhx9KOt0hlJycrOPHj2vHjh3q3r27HB0d\n5eTkJBcXFyUlJalr166qqKg471JwkvR///d/2rRpk9q2bavw8HB5enpa/6wtX75cp06davS+czvd\npNN7LElSbW3teZ8JAAAAXE9sLjwEAAAAwKVIzynRsq1pWrwhRcu2pikzt2EHgZeXV4MASJL69++v\nTp06KTExsdG5Q0ND6wVA0uklrmxtbVtkuSpfX98GAVDdM0wmk7Zv397ofeHh4Q0CIEkKDg5WcHCw\ntmzZUq/7pba2VmvXrpWTk5OGDRt22XWfLT2npMEeQBeyM6NA6Tkl1pp79uypH3/8UWvXrm38Genp\n1r2BLtaIESNksVi0cOHCet0peXl5+vrrr5s9zx133CFJ+vTTT1VUVKSY7Zma9ckW7UzP09GENbJY\nLPIK6i87hzaSpKQDGUopdVTC3kN68f1l9UKyrN0bVFVebP3exs5eHh17KGH3AS355nv98MMPqq2t\ntXbF1HUs2dk1/PeFZ4doTXFxcdFvf/tbffDBB3rvvff05JNPqkuXLsrOztZzzz2njIyMekFPu3bt\ntHv3buvyfOcLgVJSUrRp0yb169dP7777rqKiojRlyhRNmjRJEydObDIAAgAAAFoTOoEAAACAFnbu\n8lp1KksLdfjwCXXL/ykMslgs+uGHHxQbG6u0tDSVlpbW6zRo7C/XJTW6bJqdnZ08PDxaZLmqS11G\nKyQkpMk5IyIiNG/ePK1du1YTJkyQdLpzKC8vTxEREWrTps1l1322pPS8S76vbp+SGTNm6LnnntPb\nb7+tFStWqFu3bnJ2dlZeXp7S09OVkZGhN954Q+7u7hf9nHvvvVebN2/W+vXrdeTIEQ0YMEBlZWXa\nuHGjevbsqc2bNze6f9K5QkNDde+99+rLL7/UxCm/VXqtr2xs7VV8LFUVhTly8e0o39DBOlVRqrL8\nYyo+lirf0FuUGrtIxn8/k2fHnrJ1bKOy3COqKj0hZ58OKsk6ZJ3f3LWf0jZ8oWeeekJd2nmqpKRE\nbdq00R/+8AelpKQoKChIAQEBDepqbojm6ekpd3d3tWvXTu3atZO/v79uv/12xcbGqk+fPurbt681\nkGzbtq2OHj2qlStXyjAM9e7du8n3UtedNHDgQGsXT53k5GRVVVVd8N0CAAAAP3eEQAAAAEALOnd5\nrXMVV1Tpm4RM3ZF0WKP6BSg6Olpff/21zGazBgwYIC8vLzk4OEiSYmNjlZOT0+g8Zy+7dTZbW9sW\nWa7qUpfR8vT0bHLOoUOHKjo6WqtXr9b48eNlGIZiYmIk6YosBVdeWX3Z93l7e2vu3LlasWKFfvzx\nR2snjIeHhzp27Ki77rpLnTp1uqTnODg
46JVXXtEnn3yiuLg4LVu2TH5+fho/frw1BKpbUu1Cpk6d\nqi5duuiZ/31PBZk7ZKmtlYOLp9r3GyHf7oNkY2srn5CblJeSoKzd6xXyy98ocOj9ytq9XicydsvG\nzkEmr3Z9nV0dAAAgAElEQVQKGf1bHdkWU29uF58Aufh1VmlRrtIzSlVTU6MTJ06opqZGU6dOVURE\nhBYtWtRoXWeHaLt371ZlZaU+/PBD5eXlae/evTpy5Ij+/ve/1wvR/Pz8ZGNjo5qaGjk5OSkkJESn\nTp2SYRhydHSUJBUVFSk4OLjJPwd180jS7t276y3nV1RUpAULFjTrvQIAAAA/d4RAAAAAQAvZnpZ3\n3gDIyiK99c1OORmntHz5cnXq1Emvv/66nJyc6g1bv359i9RV101SU1PT4FpZWVmDc2cvo/WXv/yl\nXheFxWLRl19+ecFnNcbBwUEjR47U119/rcTERHXq1EkJCQnq1q2bAgMDL+YjNYvJ8cL/uePo4qEB\nv5593vucnJw0YcIEa/fS+YwcObLBMn2SFB0d3eh4Z2dnPfroo3r00UfrnV+9erUkNeiwiYqKUlRU\nVKNzdezeX+ZbJsp8S+O1tXH3UcDNY3R460rtX/VPuXfoLrd2XWUyt1N5/jFZamrk6OKhDjeNVtGR\nA7Jz/On30b//7Tq243v5+7qqe+f2Kiws1Lx585p8D3XODtH+9Kc/KTMzUytWrJCHh4dsbW1VXFys\nd955R4GBgTKbzSoqKtKWLVvk6uoqFxcX9ezZU7a2trK1tVVISIiSk5OVl5enyspKdejQQenp6erc\nuXOjzw4ODlZoaKh+/PFHzZw5Uz169FBhYaESEhLk7+8vs9l8wfoBAACAnztCIAAAAKCFfLI+5cIB\n0BkWi7QwZqssFov69+/fIADKy8tTVlZWi9RVt8l9bm6u2rVrV+9aSkpKg/FXchmtiIgILV++XDEx\nMQoMDFRtbe0V6QKSpH6dva/qfZeioKCgQRiRm5urTz/9VLa2tho4cGCz52rO8nfewWFy8vBV9r5N\nKs1OV9GR/bJ1NMnJw09eXfs3eV/b3kPVtvdQTRkeol2r/m3dC6hOly5dNHDgwEYDsLoQ7eOPP1av\nXr306quvSjr9O75q1Srt3r1bCQkJKi0tlbu7u4KCgvT000832FvqmWee0fvvv6/9+/ertLRUR44c\n0cGDB5sMgWxsbPT8889r0aJF2rZtm1asWCEvLy/98pe/1P3336/HH3/8gu8LAAAA+LkjBAIAAABa\nQHpOSYM9gC4krciQUVmtvXv3qra21rrvycmTJ/WPf/yj0c6dS1G3T8/q1avVp0+fn2pOT9fy5csb\njL+Sy2i1b99effv2VXx8vPbv3y9nZ2cNHTr0suZsSmdfV/XuaL6on0ufTmbrfkBXwyuvvKKamhoF\nBQXJ2dlZ2dnZio+PV2VlpaZMmXJR3SrNXf7O2SdAXXwa7uFTp7HuqDomRztriHO2pjqgzrZixYp6\n33t7e2vy5MnNqPi0du3a6YUXXmj0Wu/evRvML0murq567LHHGr2nse6sC32Oxp4BAAAAXM8IgQAA\nAIAW0JwujHPZO7mobbd+Sk7erSeffFL9+/dXWVmZkpKS5ODgoC5duujQoUOXXVt4eLjat2+v9evX\nKz8/XyEhIcrNzdWWLVsUHh6ujRs31ht/pZfRioiIUFJSkgoLCxUZGWndA+lKeHBosGZ9sqVZHVqG\nIU0aEnzFamnMiBEj9N133ykuLk7l5eVq06aNunXrpjvvvFODBw++qLmas/zd5bqaXVIAAAAALh8h\nEAAAANACmtuFca4R434tm2OJ2rBhg1auXCl3d3cNHDhQv/71r/XKK6+0SG0ODg56+eWXFR0draSk\nJKWkpKhTp06aMWOGXF1dG4RAV3oZrfDwcLm5uam4uPiKLQVXp3+gt6Lu7H3BvZoMQ3r6rj7qH3h1\nQ46
IiAhFRES0yFxXOqC52l1SAAAAAC6fYWnuouU3GMMwEgYMGDAgISHhWpcCAACAn4FlW9O0YPXe\n8445WZSnvSvmyzs4TB3D75IkPTaqh8YNDLwaJV43srKy9Oijjyo0NFSvvfbaVXnm9rQ8Ld6Qop0Z\nDZeG69PJrElDgq96AHQlzFi46aKXJWwOw5BefTC8VbwjAAAA4GoJCwtTYmJiosViCbvw6CuDTiAA\nAACgBTSnC+Nkcb4kyd70UzfFjbi81n/+8x9ZLBbdddddV+2Z/QO91T/QW+k5JUpKz1N5ZbVMjnbq\n19m7VXW3XOzyd78KD9RXW9Kuyy4pAAAAAJePEAgAAABoAZ19XdW7o7nRLoyKE9kqSN+lE2m7ZBiG\nPAJCJd1Yy2vl5ubqv//9r44dO6Z169YpMDBQt95661Wvo7Ova6t+5xe7/N2ofgG6Ocj3huiSAgAA\nAG5EhEAAAABAC2mqC6O84LhyD2xVGzcvBYTfKScPXxmGNGlI8LUp9DLk5ORo2rRpGjlypKKiopp9\nX1ZWlhYuXChHR0f169dPqamp+u1vf6vo6OgrWO2NaXT/jvLzMDU72LlRuqQAAACAGxEhEAAAANBC\nmurC8OraT15d+1m/vxGX1+rdu7dWrFhh/X7atGnXsJrW71KCndbeJQUAAADciAiBAAAAgBbUVBfG\nnmXzJEkTn/rrz3p5LbPZrAULFshkMp13XGRkpHr16qVXX321wbVZs2Zp7dq1uuOOO65UmTiDYAcA\nAAC4sRECAQAAAC2ssS6Mf8d7yMuljV6ffMu1Lu+y2NnZqUOHDte6DAAAAABAM9hc6wIAAACA1qqz\nr6vGDQzUpCHB6uTjKhcn+2td0mXLyclRZGSk5s6daz03a9YsRUZGNjo+NjZWkZGRio2NPe+8MTEx\nioyM1JIlSxq9fuLECY0bN05PPPHEpRcPAAAAADcYQiAAAAAA19zw4cNlMpm0Zs0a1dbWNri+du1a\n1dTUaPTo0degOgAAAAD4eWI5OAAAAKCFWCwWrVy5UqtWrVJWVpZcXV11yy236KGHHmrynvXr1ysm\nJkaHDh1SVVWV/Pz8NHz4cP3qV7+Svf1PnUM5OTmaNm2awsLCZGtrq6+//lrZ2dmysbGRv7+/Hn30\nUU2dOlVFRUX6+OOPtXXrVuXm5qqiokLu7u6ysbGRs7Oz+vbtqzFjxmj37t1KTEzU8ePHVVpaqiNH\njqisrEyLFi3SsWPHrJ/B09NTgwYN0tdff63bb79d7u7uiouL08aNG/XGG2/oN7/5jWpqaiRJJSUl\nWrZsmTZv3qxt27YpNTVVO3bs0I4dO/TSSy/pnXfeUUBAgHJzc63v6/PPP9e6deuUm5urrKwsFRQU\n6NFHH1V5ebkqKirk7e2tQYMGacOGDXJ0dNRtt91mfSfTpk2TJP3973/X4sWLtWnTJuXn52vChAma\nNGnSlfgRAwAAAMDPCiEQAAAA0ELef/99rVixQmazWaNHj5atra22bNmi5ORkVVdXy86u/v/9njdv\nntatWydvb28NHjxYzs7OOnDggBYtWqQdO3Zozpw5srW1tY4vLCzUkiVLVFZWps6dO+uOO+5Qfn6+\n9uzZozlz5uiXv/ylZs+eLZPJpI4dOyopKUm5ubkym8363e9+p8rKSm3atEnffvut7O3tNXjwYA0e\nPFhOTk764osvlJCQoClTpigwMFAjRoxQ//79tWXLFn3xxRfKy8vThg0btGnTJhmGoa5du8pkMiku\nLk729vayWCyKiopSTk6OgoKC5OvrKy8vL6WkpKiwsFA9evRQjx49tGHDBm3btk3l5eXatWuXqqur\nFRYWJpPJpA8//FDp6ekqKSnRzJkz5e7urvT0dC1cuFCZmZmaPn26nJ2d673D6upqPffccyopKVH/\n/v1lMpnk5+d3VX7eAAAAAHC9IwQCAAAAWsC+ffu0YsUKtWvXTn/72
9/k6uoqSXrooYf07LPPqqCg\nQL6+vtbxsbGxWrdunW655RbNmDFDDg4O1muLFy/WkiVLtHLlSt19992STnfZHDx4ULW1tZoxY4Zm\nzJhhHf/pp5/qgw8+0DPPPKNbb71VkydP1iOPPKJevXpp7NixWrRokUwmk5588kllZGToqaeekr+/\nv2bPnm2do7CwUCdPntShQ4fUvXt3RUVFSZJ+cXukHn54itIyjijjyDFN+c00pezdqZEjR2r69Ol6\n6qmntG7dOp06dUqOjo6aPHmyxo8fb90jaPjw4Tp8+LAiIyM1evRo3XfffbrttttUVlam4uJizZ8/\nX66urtq5c6eWLVsmd3d31dTUKDIyUkFBQZJOd/wcOHBAlZWVDd57QUGBAgIC9Oqrr6pNmzYt9eME\nAAAAgFaBPYEAAMDPWmOb1J9PczepB5ojPadEy7amafGGFL0Z/ZnKK6s1YcIEawAkSQ4ODpoyZUqD\ne5cvXy5bW1s99dRT9QIgSXrggQfk6uqqH374wXpuw4YNqqmpUdeuXfWHP/yh3viRI0fKwcFBp06d\n0sMPP6zvv/9eZWVlevDBBzV+/HjZ2trq0KFDkqROnTrprrvuUmZmpg4fPlxvHpPJpKFDhyo1NVXx\nKVmasXCTnl6UoDwbX5VVVqnWyayE2hDtPXxCR/JLZW9vryFDhqiyslLFxcXq0qWL7rvvPut8NjY2\nGjVqlOzt7a1L2/n5+VnDHVdXV+u7WrFihWxsbPTAAw9IkpYsWSJJOnHihPLz89WuXTvt37+/0Z/D\ntGnTCIAAAAAAoBF0AgEAAFyEug6NV155Rb17977W5eAa2Z6Wp0/Wp2hXZoH13P647SovyNeXe07K\nq2ue+gd6W6/16NFDNjY//furyspKpaWlyc3NTV9//XWjz7C3t9f+1DQt25qm8spqrVy/TTW1FvXt\n27feXJJkNpslSf7+/nJycrKGJWlpafr0009VUFCgzZs3a/HixZKko0ePqrCwUH/9619VVVWl4uJi\npaSkKC8vT7169VLpKUP/88EPsnc6HdDYO7lIkkxeHWTY2Km4okrfJGTqjqTD8vLy0qlTp1RbW6t+\n/frJMAxrXT4+PvX2NTr7vCTrXkKStH//ftnZ2alNmzY6deqUli5dqsDAQG3dulWZmZny9/dXUVGR\nSkpKGoRsnTt3bupHBQAAAAA3NEIgAABwQxk0aJAWLFggT0/Pa10KfqZitmdq7spdsljqn685dXqp\nspT8U5r1yRY9fVcfjeoXIEmytbWVm5ubdWxpaaksFouKioqsHS9nKyqv0tH8MhVXVKlm9V5J0v79\nR1RcUaXEIye1Pa1+yFS3b5DJZJJ0euk4SVq9erUkKT09XdJP3TXZ2dnKyMiQnZ2dRo8eLR8fH33/\n/ffas2eP3LzbadeOZPU4K6CRcTp0cjD9FL7IIr31zU6N61xpDX7O3r9Ikjw8PJSamtrg89V17dTW\n1lrPlZSUqKamRhs2bFB1dbUyMzP17rvvKjMzU9XV1Wrbtq0kqaKiol4I5O7uXi94AgAAAAD8hBAI\nAADcUJydnRtsLA801/a0vEYDIEmytXeUJFWfLJOtvYPe+manfN2d1D/QWzU1NSouLpa39+ngpu53\nsEuXLpo3b169eepCpiBL4/Nn5Z1oEDKdqy4M+vvf/67OnTtr2rRpkqTo6GjV1NTowQcfVPfu3TV3\n7lxrF1FxcbEKCwt1uKim0c93Wv2wxWKRvtt1VE5OTiovL9fGjRs1efJk6/WDBw8qOTm5wSwnT56U\nJNnZ/fSfIyaTSRaLRVFRUXrttddUWlqqoKAgeXp6avTo0Zo+fXrjFREAAQAAAECTCIEAAECrkZOT\now8//FBJSUk6efKkOnXqpEmTJunmm2+2jomNjdXcuXMVFRWlkSNHWs+np6friy++0P79+1VQUCCT\nySRvb2/16tVLv/nNb2RnZ6dp0
6YpJydHkvTss8/We/aKFSusxwUFBfrss8+0bds261w9e/bUhAkT\nrHuhNFaPh4eHli5dqkOHDqm8vFxLlizRlClTZDab9d577zX6l90vvvii4uPj9eabbyo4OLhF3iOa\n9sn6lCYDEpO5ncoLjqs0J0OOrp6yWKTFG1LUP9Bbe/furdf10qZNG3Xs2FGZmZn1ljc7X8hkMreT\nJJ0szpPlTBdOXch0ru7du+vHH3/Unj17GiyVVlxcrLKyMvXt29caANUprahS5vGMi3gj0qHsYpm9\n/VRYWKi1a9fqL3/5izp27KiUlBQVFRXp4Ycf1s6dO+vdk5ubK0n1uqO6d++u+Ph4ZWdny9HRUf7+\n/srLy5MkjR49+qJqAgAAAACcZnPhIQAAANe/nJwc/eEPf1BOTo5GjBihIUOGKCMjQ3PmzGnwF9Dn\nSk9P1zPPPKPNmzerW7duGjdunG699Va5u7tr1apVqq6uliTdfffd6tWrlyRp5MiRmjhxovV/dbKz\ns/X0009r1apVatu2rcaNG6cBAwYoPj5eM2fOVHx8fKM1xMXF6cUXX5STk5PGjBmjIUOGyMXFRUOH\nDlVWVpZ27NjR4J68vDwlJCQoKCiIAOgqSM8pqbcH0LnMXftJkrJ2b1B1ZbkkaWdGgZKP5GvhwoUN\nxo8bN07V1dWaN2+eysrKJNUPmaorK1RecNw63qNTTxmGjcpyD6skO90aMtWpqqqyHt9+++1ydnbW\nkiVLGnTieHh4yMHBQVu3brV25Ein9+c5mJammpMVzX0lVhb7Nho0aJDs7e21evVqffvtt6qurlZI\nSIhycnJUXV2tU6dOSTr9Z6Ruibi6Jd4kaezYsZKkZcuWqaqqSmFhYZKk4OBgde3aVSdPntSBAwcu\nujYAAAAAuJHRCQQAAFqFXbt2adKkSfUCmWHDhmn27Nn66quv1KdPnybvjY2NVVVVlf785z8rPDy8\n3rXS0lI5Op5ehmvs2LEqKyvT7t27NXLkSPXu3bvBXPPnz1dBQYEeeughTZgwwXo+IiJCf/rTn/TW\nW2/p3//+t3VPlDrbtm3T7NmzrX/xffZ969at07fffqt+/frVu7ZmzRrV1tbSJXGVJKXnnfe6i0+A\nfLuHK2f/Fu1b+a48O/aQDBs9kbBIvbq0a9B1c8cddyg1NVWrVq3SI488ok7BPRSzq0A1VeWqKi1U\naU6GzF36qWP4XZIkO0eTnMxtVVNZodR1H8mtfbCOefrKI3uLSvKztG/fPg0YMECS5OrqqlmzZunl\nl1/WjBkzlJmZKRcXF/3rX/9Sbm6usrOzdfDgQU2fPl2DBg1SdXW1vvrqK5UWF8s1KLRe+HQuS83p\nUNQ4a/+fmlqLpk6dqlWrVik3N1cdOnRQSkqK7OzslJiYqKKiIiUnJ+vgwYPasGGDqqqq1LNnz3r7\nIfXt21dTpkzRm2++qbS0NFVVVenEiRMKDAzUX//6V+3evVs9evTQX//614v7wQEAAADADYwQCAAA\ntAq+vr66//77650bMGCAfHx8Gt2TpDEODg4Nzrm4uDS7hry8PG3fvl0+Pj761a9+Ve9aaGiohg0b\npu+//14//vijRowYUe96eHh4gwBIOt0FERwcrC1btujEiRPy9PSUJNXW1mrt2rVycnLSsGHDml0j\nLl15ZfUFx/iHjZKjq1m5yfHKS9kmW0eTwkcM1ZwXZ+jJJ59sMP6xxx7TTTfdpG+//Vaxm7YpNyNb\ntg5OcjC5yTd0sMyB9cNL+zYu8grsJ7s2JpVkp6kk66DWlB/S4H6hateuXb2xffv21T/+8Q999dVX\neuutt1RYWKg1a9bIbDZr3LhxqqmpUUZGhmJiYmQymeTj46OyahuVOrlJajoEOlmcf7oWk6v1nK2N\nIU9PT82bN09ffvmlNm/erOzsbFVVVenWW29VaWmpNm3apMrKSgUEBCg8PFzFxcUN5r7vvvtUXFy
j4+NR54rIhlZeLly4sN4VnMHBwfW+KDd15WfP\nnj11NZaEEOJhV3s3z7Rp0/j6669JTEykvLycRx55hLlz59K1a1cKCgr49ttviY2NpaioCHd3d4Ma\ndA3V5YmKimLbtm1kZWVhaWlJ//79eemll+odV0VFBaGhoURGRpKbm4ujoyOjR49m+vTpzX7GS5cu\nERoaSnx8PPn5+VhbW+Pr60twcLBBilAhxIOln3vL6uA1p11ERARr167F1NSUgIAAnJ2duXLlCvv2\n7SM2NpaVK1fqdjju27ePo0eP4uPjQ79+/dBqtaSlpfHTTz9x8uRJPv30U73v56tWrSIqKoquXbvy\n2GOPYW5uzvXr1zlz5gynTp2iX79+eHh4EBQUREhICEqlUm9nj9Q4E0K0dhIEEkIIIYTg3rzc3k2t\nffxCCCEgJyeHt99+m86dOxMYGIhKpeLo0aMsWbKElStXsmzZMqysrBgxYgRqtZro6GhWrlxJeXl5\no33v2LGDL7/8Emtrax577DGsra05deoUixcvxsrKyuB6rVbLhx9+SExMDB07dmTy5MlUVFRw4MAB\nLly40KznOnnyJMuXL6eyspJBgwbRsWNHcnNzOXr0KCdOnGD58uWN1r4TQtw/7kobfLo4Nlh/8lZ9\nuzo2OaXq5cuXWb9+PS4uLqxYsQInJyfdufj4eN599102btzIO++8A8Bzzz3H/PnzMTIy0usnIiKC\nNWvWsHv3bp599lkANBoN0dHRdO/enU8//dSgjVpdXR/T09MTT09PXRBIdv4IIR4mEgQSQgghhODu\nv9zeba19/EIIIapTEL344ot6O0m///57Nm/ezNtvv83w4cNZsGABCoUCAD8/P1asWEFOTo5BXwUF\nBWzYsIETJ06QnZ1NYmIijo6ObNy4kYCAAABmzZrFhx9+SFhYGOnp6URGRtK+fXtCQkI4evQoKSkp\nuLu7s3r1al2QJjg4mLfeeguonjxdsWIF8fHxVFRUUFJSQkFBAceOHWPp0qUsXHUfdXMAACAASURB\nVLiQgIAAPvnkE8zNzfnoo4/o3PmPeiEXLlxg0aJFrFmzhs8+++yufa5CiNs3Y6QXSzbHoNU2fq1C\nQZ21gGrLVKmJy8yluLSCX8O3UqgpYenSeXoBIABfX18CAgKIjY3l5s2bWFpaolQq6+xzzJgxfPnl\nl5w+fVoXBFIoFGi1WkxNTXW/O2uzsZHvwkKIh58EgYQQQgghfnenX27vtdY+fiGEaOuUSqVu4rJG\nYGAgmzdvpry8nJdffllvEnPUqFF8/PHHFBcX67UpLS3l448/RqvV0rdvXywsLMjMzMTU1JQPP/yQ\npUuXMnDgQBQKBbNnz9alBI2NjSUmJgZ/f39sbGywsbHB1NSUZcuWsX79emxtbbGxsWH69Ol8+OGH\nHDx4kG7dujFw4EDc3d3Zu3cvJ06cID4+XjeWgwcPotFo+NOf/qQXAALo2rUrjz/+ODt27CArK8vg\nvBDiweHn4czCST6s3p3Y4HdNhQLenNwXP4+6d5ufzshlc1Sq3sKlc4dj0eRe591//sTIX08ZLFIq\nKCigqqqKy5cv0717dyoqKggPDycqKoqsrCw0Gg3aWoO6fv267v+trKwYNGgQsbGxvP766wwbNoze\nvXvTs2dPzM3NW/hpCCFE6yJBICGEEEKI392pl9v7pbWPXwgh2oraK+CtzE1ws67+pe3p6WmQqqim\nGLqrq6tBDTojIyPs7OwoKyvT7z8zk44dO/Laa6/x/PPPs2LFCrp168acOXP46quvWLVqFf/+97+x\nsLCgQ4cO2NnZAXDs2DHef/99Xa2enj178tRTT7F9+3YiIiJ45plngOr6GJmZmWi1WubPn8/EiRMB\nMDEx4fLlyyQmJmJrawvA2bNnAcjIyGDLli0Gn8Xly5cBJAgkRCsw3q8LLvZWbIlOJeGC4e7zvl0d\nCR7hVe93zPD
TF+v8nlpRWh3IPhkdwalfwNPFlva2lgbtS0pKAPj44485evQoHTp0ICAgAAcHB0xN\nTQEICwszSJH55z//mdDQUH7++Wc2b94MgJmZGcOGDePll1/G3t6+eR+EEEK0MhIEEkIIIYSo5XZf\nbu+31j5+IYR4mNW1Ah6gtCifrKw8evYzjOAbGxsD1Fm3B6oDQbVXwKvVagoKCujduzdPP/00UF0T\nA6B///6kp6dz6NAhjhw5wmOPPQb8kQ5p5MiR+Pr66trY2NgwadIktm/fTkpKiu4eFRUVFBYW0rFj\nRyZMmKA3Hjs7Ozp16sSVK1d044HqQu4NuXnzZoPnhRAPBj8PZ/w8nA2C2f3cnRtMM3w6I7fehUrG\nZhYA+D7/Z4zNLFAo4P0ZAXV+X01NTeXo0aP069ePv/71r7rfkVBdy2zr1q0GbczMzAgODiY4OJjc\n3FySkpKIjIzk0KFD5OTk8NFHH7XgkxBCiNZDgkBCCCGEELdo6cvtg6K1j18IIR5G9a2Ar1F4s4xd\nJy8yNi6Lx/vVvyPm1t/tmpIKvfMqlQqA7t27Y2JS/cpvbW0NQH5+Pn379uXQoUOkp6frgkA1gZru\n3bvr+rG2tkatVutWyBcVFenOJSQkAODk5FRnjQ1PT09dEKgmePX555/j7u5e73MJIVoXd6VNs75X\nbo5Krff3n7WzK8XXr1B07SJ2rj3QamFLdGqdQaDs7GwABg0apBcAAkhJSTHYGXkrZ2dnRo8ezahR\no3j11Vc5c+YMarVaFwxXKBRUVVU1+bmEEKI1kCCQEEIIIUQ9mvty+6Bp7eMXQoiHRUMr4PVoYdWu\nBJR2lgaTnzn5xSz65qjBLqKENBUaTSmXrlcHaUpLSwF06dgAunXrxpEjR0hMTOSRRx4B/gjqXL16\nlYKCAgDatWun1yYuLo5z584B6E2KJicnA2BhYVHnY9QutP7II49w5MgRkpOTJQgkRBuVqVIb/O6q\nrX2PQVxPO8Xlk/sxt3HEwtaZhAs3yFSpcVfaUFFRwblz5+jTpw8uLi4AJCUlMWXKFF0fBQUFbNiw\nwaDvgoIC8vLyDH7/lJSUUFJSgrGxsS5gDtW/O3Nzc2/ziYUQ4sEiQSAhhBBCCCHqoFKpmDNnDoGB\ngSxcuPCe33/OnDkAbNq06Z7fWwhxZzW0Av5Wda2AVxXcJLngCl6d6p5ELauo0u0iqil0XrO7B2D0\n6NGEhISwa9cuzMzMgOqdPlqtlq+++kovnVyNMWPGEBcXx7fffqsXAFKr1URHRwN/1Oe4Ve17jxkz\nhh9++IGQkBC8vLzo0aPHLc+rJSkpCR8fnwY/FyFE6xWX2XBQxcLOmS4BT3AxJozfdn2BbcdumNs6\n8fGqeDpZV3HmzBlsbW354osv8PLyolevXhw5coTFixfTu3dv8vPzOXnyJK6urro6ajWuX7/OG2+8\ngbu7O+7u7jg7O1NcXMzx48fJy8tjypQpevXWfH19iYqK4v3336dbt26YmJjQp08fvL2978pnI4QQ\n94IEgYQQQgghhBBCiLuksRXwdam9Av50Ri4ZqkLaKR0abvT7LiJ/o+rJzLS0NCorKzE2NkapVDJr\n1iw2bdrE8uXLqaioIDU1lTfeeAONRoOLiwvp6el63Y0cOZLo6GhiYmJISkpCq9WyceNGfv31V3r1\n6sXRo0e5fv06Wq3WICVc7b5sbGxYsmQJf//731m0aBG+vr506dIFhULBtWvXOHv2LGq1mm3btjXr\nMxJCtB7FpRWNXuPo2RdLBxdUvx1DnZOB+up54oqcUPTsyrBhwxgxYgRQXQft3Xff5bvvvuPEiRPs\n3LkTJycnxo0bx7Rp01iwYIFevy4uLsyYMYPExEQSEhIoLCzExsYGV1dXXnrpJV2/NV555RUA4uPj\nOXHiBFqtlqCgIAkCCSFaNQkCCSGEEEIIIYQQd0ljK+AbaueutGn2LqL4yxrs7
Oy4ceMGYWFhTJ06\nFYCnnnqKoqIi3nvvPcrLy7l48SIDBw5k9uzZvPbaawZ9KRQK/vKXvxAaGkp8fDxpaWnExMQwZswY\npk+fTmhoKEVFRezdu5eJEyfq2hUUFFBYWKiXjs7X15e1a9eybds2Tp06RXJyMiYmJjg6OuLr68vQ\noUNb9BkJIVoHK/OmTT9aOrjQdeiTuj/Pf7w3Tw3yMLjOxsaG+fPn19nHrTuora2tmT59OtOnT2/S\nGOzs7Fi8eHGTrhVCiNZCgkBCCCGEEEIIIcRd0pQV8Obt7On/wjKDdjW7iG49V1vPCfNI/ukz3Z/L\nOw1g87YZfP7x+/z73//m1KlTeHl5kZubyy+//IKvry9/+ctfCAgI0LWZO3eurkZQbSYmJkyfPp3N\nmzfj7e3NihUrdOcOHjzI4sWL2bBhAydOnMDDw4OrV6/i5OSEv78/MTExejuElEolf/rTnxr9LIQQ\nD59+7s6NX3QH2wkhhNAnQSAhhBBCCCEacenSJb7++muSk5MpLy/H09OToKAg/Pz89K4rLy9nx44d\nHD58mOzsbIyNjfHw8GDKlCkMHz7coF+tVsvu3bvZs2cPV69excbGhiFDhvDiiy8aXBseHs66desI\nDg4mKCjI4HxeXh6zZ8/Gzc2NtWvX3rmHF0LclqaugK+rXUt3EV0uNmbVqlX88MMPnDhxgqSkJCwt\nLenfvz/Tpk3Dy8urRf3W1rlzZ1auXMl//vMfEhISSEhIwN3dnaVLl3Lp0iViYmKwsrK67fsIIVo/\nd6UNPl0cm5Uas29XR9yVNndxVEII0XZIEEgIIYQQQogG5OTksGjRItzd3Rk/fjx5eXlER0ezbNky\nFi9erMslX1FRwXvvvUdSUhJubm5MmjSJ0tJSfv31Vz766CPS09OZOXOmXt//+te/2LlzJ46Ojowf\nPx5jY2NiYmJISUmhoqICE5M/vq6PHj2ar776iv379zNt2jSMjIz0+oqIiKCyspLx48ff/Q9FCNFk\nt7MC/si5q41eV98uIicnJ4PaGPUJDAwkMDCw3vM7d+6s87ibmxtLly41OP7zzz8D1YEiIYQAmDHS\niyWbY5qU3lKhgOARtx+sFkIIUc2o8UuEEEIIIYRou5KSkhg3bhwffvghs2bNYuHChXz44YcYGRmx\nbt06iouLAdi+fTtJSUn4+/uzdu1aXn75ZebPn8+6detQKpX8+OOP/Pbbb7p+f/vtN3bu3EnHjh1Z\nu3Ytr7zyCnPmzGHt2rUYGRlx44b+alkLCwseffRRcnNzOXnypN45rVbL/v37MTc359FHH737H4oQ\noslqVsA3R80K+NvZRXS3abVa8vLyDI7Hx8cTHR1N586dcXV1vevjEEK0Dn4eziyc5EOtLJF1Uijg\nzcl98fOQVHBCCHGnSBBICCGEEEIIIFOl5qfYDLZEp/JTbAYXr1XXx7C2tjZIv+bl5cXo0aPRaDQc\nPXoUqN6Jo1AomDt3LsbGxrpr7ezsdMWI9+/frzt+4MABAJ5//nlsbP5Id2JmZsasWbPqHGNN8fW9\ne/fqHT99+jQ5OTmMGDECa2vrFj2/EOLumTHSq9GJzxq1V8A/yHU0ysvLmT17Nu+++y4bN27kyy+/\n5L333uPdd9/F2Ni43qLtQoi2a7xfF1bMCKBv17oD4327OrJiRgCP97v3uwinTJnCkiVL7vl9hRDi\nXpB0cEIIIYRotbZs2UJISAjLly/Hx8enRX1MmTLFoNi1aFtOZ+SyOSrVIE99aVE+WVl5jB7WHUtL\nS4N2Pj4+REZGkp6eztChQ8nOzsbJyQk3NzeDa/v27QtAenq67tj58+cB8Pb2Nri+d+/eBuneALp0\n6YK3tzcnT54kNzcXZ+fqid59+/YBMGHChKY+thDiHqpZAb96d2KDqZBuXQH/INfRMDExYcKECcTH\nx5OSkkJpaSm2trYMGzaM5557Dk9Pz7s+B
iFE6+Pn4YyfhzOZKjVxmbkUl1ZgZW5CP3dnqQEkhBB3\niQSBhBBCCPHAioyMZPXq1SxcuLDBWgVCtFT46YsNTsoW3izjl/MF7IvLMliVam9vD4BGo0Gj0QDg\n6Fj3ylYHBwcAioqKdMdq0sjV9FObsbExtra2dfY1ceJEkpKS2LdvHzNmzCAvL4+YmBg8PT3p0aNH\nA08rhLifxvt1wcXeii3RqSRcMAzq9O3qSPAIL4MUSA9qHQ0jIyNeffXVe3Kvh9mSJUtISkqqt+5S\nXeT7kXgYuCttJOgjhBD3iASBhBBCCNFqTZ48mZEjR9K+ffv7PRTRCp3OyG10VT5A+U0Nq3YloLSz\n1Juczc/PB6rTxdWkYKurPkbt47VTtVlZWen66dChg971lZWVFBYW6nb61DZkyBDs7e2JiIggKCiI\niIgIKisrGT9+fCNPLIS431qyAr6lu4hE65WYmMjSpUsJCgoiODj4fg9HiDuipKSEoKAgvLy8+Pjj\nj3XHy8rKmD59OuXl5bz11lt6tQ337NnDhg0beP311xk7diwAarWabdu2cezYMVQqFSYmJnTv3p1n\nn30WPz8/vXtWVFSwd+9eDhw4QE5ODuXl5djb2+Ph4cHkyZPp16+fLqgK1XUgp0yZomt/68/guXPn\n2LZtG2fOnKGoqAh7e3sGDBhAUFCQwUKgmgDv9u3bCQ0N5fDhw+Tk5DBq1CgWLlyoF8xt3749ISEh\npKWloVAo6NOnDy+//DKdO9/7tHhCiIeTBIGEEEII0WrZ2trWu1tCiMZsjkpt0sr6mzeyqSgrZUt0\nqt7kamJiIgCenp5YWlrSsWNHrl69ypUrV+jUqZNeHwkJCQB069ZNd6xbt26cP3+epKQkgyDQmTNn\nqKqqqnM8JiYmjBs3jv/+97/Exsayf/9+LCwsGD16dFMeWwjxAGjuCviW7iISD7633nqL0tLSZrUZ\nPHgwGzZs0O0yFaI1sLCwwMvLi5SUFG7evKlLtXvmzBnKy8sBiI+P1wsCxcfHA+Dr6wuASqViyZIl\nqFQq+vTpg7+/PyUlJRw/fpxly5bx2muv8fjjj+var1q1iqioKLp27cpjjz2Gubk5169f58yZM5w6\ndYp+/frh4eFBUFAQISEhKJVKvd11tdNNR0REsHbtWkxNTQkICMDZ2ZkrV66wb98+YmNjWblyZZ0L\n05YvX05qair+/v4MHjwYOzs7vfOxsbHExMTg7+/PhAkTyMrK4sSJE6SmprJ+/Xp51xFC3BESBBJC\nCCHEHRUZGUlsbCznz58nLy8PY2Nj3N3dmTBhgt5LHTS8Qi4nJ4ekpCQAVq9erVuhB7Bp0yaUSmWD\nNYEuXbrE1q1bSUhI4MaNG1hbW+Pq6sqoUaOYOHFio89RWVnJvn37OHjwIBcvXqSyshI3NzfGjh3L\npEmTUDS1wrd4IGWq1E2usVFRVsLVxJ9JMB1HpkqNu9KG1NRUDh8+jLW1NUOGDAFgzJgxfPvtt/z7\n3/9m6dKlupo+hYWFfP/99wC6Vaw11+/fv5///ve/BAQEYGNTPSFcVlbGN9980+CYxo8fT2hoKF98\n8QXXr19n/PjxddYtEkI8PKSOxsOpJbuZa+9AFaI18fX15bfffiMpKYmBAwcC1YEeIyMjvL29dUEf\nAK1WS2JiIh06dECpVALVQZ1r166xePFiRo4cqbtWo9GwZMkSNm7cSEBAAPb29mg0GqKjo+nevTuf\nfvqpQa1FtVoNVC/m8fT01AWB6tp9d/nyZdavX4+LiwsrVqzAyclJdy4+Pp53332XjRs38s477xi0\nvXbtGuvWras3mHPs2DHef/99XaAL4JtvviE0NJSIiAieeeaZRj9XIYRojASBhBBCCHFHrV+/Xle8\n3sHBAbVazYkTJ/jHP/7B5cuXeeGFFwza1LVCzsfHB2tra2JiYggICNArMN3YxMfx48f58MMPKS8v\nx9/fn
5EjR6LRaMjIyGDr1q2NBoEqKir44IMPOHXqlC5wZGZmRkJCAv/85z9JSUnhrbfeatkHJB4I\ncZm5Tb7WxqUr19NOo8m9wj8qfsPTwYTo6Giqqqp47bXXdGndnn76aU6ePElMTAz/+7//y4ABAygt\nLeWXX36hoKCAZ555ht69e+v67dWrF1OmTGHnzp38z//8D8OGDcPY2JiYmBjatWtXb30hqJ40HDhw\nIDExMQCSCk6INkTqaNw+lUrFnDlzCAwM5Nlnn+Xrr78mOTmZ8vJyPD09CQoKMkgrVV5ezo4dOzh8\n+DDZ2dkYGxvj4eHBlClTGD58uME9YmJiCAsLIysrC7Vaja2tLZ06dWLEiBF630NurQm0evVqIiMj\nAQgJCSEkJER3bc2il4ZqAqWlpfHjjz+SnJyMRqPBwcGBgQMHMm3aNIN/V2rutWnTJk6dOsWuXbu4\ncuUKVlZWDB48mNmzZ0uwSdyWW4PWzm7dgerASe0gUPfu3Rk6dChffPEFly9fxtXVlfT0dNRqNUOH\nDgUgIyODpKQkhg0bphcAgup3gxkzZvC3v/2NI0eOMHHiRBQKBVqtFlNT0zoXb9UsvmmKvXv3UlFR\nwbx58/QCQFAd2AoICCA2NlZvh1ONF154ocHdPCNHjtQLAMEfi31SUlKaPEYhhGiIBIGEEEIIcUet\nXbuWjh076h2rqKhg2bJlhIaGMmHCBIOXp4ZWyMXExDBkyJAmFz4uLCxk5cqVVFVVsXz5cry9vfXO\n5+Y2Pvn/3//+l1OnTjF58mTmzZunWzlYVVXF2rVriYiIYNiwYQQEBDRpTOLBU1xa0eRrzawd6Dxo\nEldOR3L8l0NctregW7duTJ8+nf79++uuMzEx4YMPPuCnn37i559/ZteuXRgZGeHh4cErr7xiMGEB\nMG/ePDp16sTu3bvZu3cvtra2DB48mJkzZ/L66683OK6xY8cSExODl5eXXpo5IYQQTZOTk8OiRYtw\nd3dn/Pjx5OXlER0dzbJly1i8eDEjRowAqr/HvPfeeyQlJeHm5sakSZMoLS3l119/5aOPPiI9PZ2Z\nM2fq+g0PD2fdunU4ODgwaNAgbG1tyc/PJzMzkwMHDjS4GGXw4MFA9c5qb29vvZ3OLi4uDT7P8ePH\nWb58OQBDhw5FqVSSlpbGnj17OHbsGB9//HGdfXz11VecOnWKQYMG4efnR0JCAvv27SM7O5u///3v\nTf9Ahfjd6YxcNkelGuy6rqqs5EJ2EQeijzF37lw0Gg3nz5/nmWeeoW/fvkB1UMjV1VWXSrfm+Nmz\nZ4HqXT9btmwxuGdBQQEAWVlZQHXtxUGDBhEbG8vrr7/OsGHD6N27Nz179sTc3LxZz1Nz76SkJFJT\nU+u8d1VVFZcvX6Z79+5657y8vBrs+9brAV1NyKKiomaNUwgh6iNBICGEEELclrrS0tzKxMSESZMm\nkZCQQHx8PI899pje+cZWyDVHZGQkxcXFTJkyxSAABH+8VNVHq9Wya9cuHBwcmDt3rl7qCCMjI+bM\nmcOBAwc4fPiwBIFaMSvzxr8Gm7ezp/8Ly3R/9hw9nfmP9+apQR71tjEzM+P555/n+eefb9I4FAoF\nkydPZvLkyQbnNm3a1GDb8+fPAzBhwoQm3UsIIYS+pKQkpk6dyssvv6w7NmnSJBYvXsy6devw9/fH\nysqK7du3k5SUhL+/P++++y7GxsYABAcH89Zbb/Hjjz8ycOBAevXqBVQHgUxMTPj8888N6n8UFhY2\nOKbBgwdjbW1NZGQkPj4+daamqktJSQmrVq2isrKSFStW0KdPH9250NBQvvnmG9auXcsHH3xg0Pbs\n2bOsXbtWl5qusrKSd955h4SEBFJSUujRo0eTxiAEQPjpi6zenVhn3UUjY2Mq27lwMCaRbdHJuJoV\nUVVVha+vL507d8bR0ZH4+HgmTpxIfHw8CoVCt0umJn1bXFwccXFx9d7
/5s2buv//85//TGhoKD//\n/DObN28Gqr+rDRs2jJdffhl7e/smPVPNz+22bdsavK6kpMTgWGO1u9q1a2dwrOZ3TH31IYUQorkk\nCCSEEEKIFqlvhV+ZpgDj7NPYl19DW6qmrKxM7/z169cN+mpshVxznDt3DgB/f/8Wtb98+TJqtZpO\nnTrxww8/1HmNmZmZbpWhaJ3qClbezXZ32s2bN9m7dy82NjZ17jASQgjROGtra4KCgvSOeXl5MXr0\naCIjIzl69CiBgYFERESgUCiYO3eubnIWwM7OjunTp7NmzRr279+vCwJB9SRu7Wtr3K0i78eOHUOt\nVjNy5Ei9ABDA1KlT2bt3L3FxcVy7ds2gDlFQUJDeMWNjY8aMGUNycrIEgUSznM7IrTcAVKNdBw8K\ns9P58JtdjPM0xczMTPez07dvX06ePEl5eTnJycl06dJFF0itSb/7yiuvMGXKlCaNx8zMjODgYIKD\ng8nNzSUpKYnIyEgOHTpETk4OH330UZP6qUmL+MMPP+jG0VRSR1QI8SCQIJAQQgghmq2+FX6l6jzO\nhX9JZdlN2im7MGXUQAb2dMPIyAiVSkVkZCTl5eUG/TW2Qq45NBoNgEHKuaaqWWV45coVvTz8t6q9\nylC0Pu5KG3y6OBoEMRvSt6vjfa/Dcfz4cc6fP09sbCz5+fm8/PLLzU5pIoQQbVHtnctlmnyKSyvo\n27ebQf0OQFd3Jz09naFDh5KdnY2TkxNubm4G19akqkpPT9cdGz16NJs2bWLBggWMHDkSb29vevXq\nZbAr6E6q2R16a20RqA7qeHt7c/DgQdLT0w2CQJKOStwpm6NSGwwAAdh0qN5Rrc7OYHfmNSYGPIKZ\nmRlQ/ff38OHD7Nmzh5KSEr2/zz179gQgOTm5yUGg2pydnRk9ejSjRo3i1Vdf5cyZM6jVal1tIIVC\nUe/Om549e5KWlkZycrKulpEQQrQmEgQSQgghRLM0tMJPdfYoFaXFdB3yJE7d+nFOAS8NC8DPw5mo\nqChdoeNb3ckVcjUr9a5fv467u3uz29es7hsyZAhLly69Y+MSD54ZI71Ysjmm0ckKAIUCgkfcuR1r\nLfXrr78SGRmJvb09zz33HE899dT9HpIQQjzQ6tq5XFqUT/KF61Q4FHA6Ixc/D/1dnjUpojQajW5x\niaOjY5391yxkqR0seeqpp7C1tWXPnj2EhYWxY8cOFAoF3t7ezJ49+47ugK5RM876FtbUjL+uoI6k\noxJ3QqZK3aTFNVYOHTExs6Dg0jlySzR0nPaE7lxNUPXHH3/U+zNU79Lr06cPR44cISIigrFjxxqO\nITMTBwcH7OzsKCgoIC8vz+B9oKSkhJL/z96dx1Vd5Y8ff132fRdEZBVEZBNFUXBBKbegMpdQK502\nf/O1SSutUStbTKuxdGzKcnKanERr0ExNLUQRExVQdjcUUFyv7AiCIPf3B3HH6wVBE0V9Px+PeZSf\nz/mcz/nc6eOF8z7n/a6pQVdXFz29/02LWlhYtFg7NDIykl9++YWvv/6aLl264OTkpHG+vr6eo0eP\nau3CE0KIjkKCQEIIIYS4KTda4VdbWQqAlUtjSgeVCmJ25xLkbkdWVtZN36upHs/NTEB4e3uzZ88e\nDhw4cEsp4bp27YqpqSlHjx6lvr5e45dDcX8Jcrdj5iP+raYtUSjglcgArUnCu2HmzJnMnDnzbg9D\nCCHuCTeqTQJQeP4ic1bv55XIAEb0clYfLysrAxoXljQtLiktLW22j6bjTe2aDBs2jGHDhlFVVcXh\nw4fZu3cvcXFxzJ8/n+XLl9/2XUFN928a+/VKSkqaHacQt0t6QfMBlOspdHQws3el7HRjCmeF1f92\n2Nnb2+Po6Mi5c+fQ0dHRqu85a9Ys5s2bx7Jly9i0aRPe3t6YmppSVFREQUEBJ0+eZPHixVhaWlJc\nXMyMGTNwc3PDzc0NOzs7qqurSUl
JobS0lKioKI2dgIGBgSQmJvLee+/RrVs39PT08PX1xc/Pj65d\nu/Lyyy+zbNkypk+fTu/evXFycuLq1asolUoOHTqEhYUFX3755W34JIUQ4vaTWQ0hhBBCtFlrK/wM\nTBsnNC5dKMCya2PKhsyTJWzevptff/31pu/XlJ5BqVS2+ZqIiAjWrl3L1q1bCQ0N1frlsaioSJ3i\npDm6urpERUWxdu1aVqxYwfPPP69OUdGkpKSEqqoqnJ2dW+hF3CtGBrngY2br4gAAIABJREFUYGVC\nzO5cMk9q/7cd4GrDpEFeHSIAJIQQou3aUpvkcsk56q/UsmRzJvaWxuq/65sWrnh4eGBsbIyjoyPn\nz5/n7NmzdOnSRaOPzMxMALp169bsPUxNTQkODiY4OBiVSkVcXBw5OTmEhoa2OK5bWQTj4eGhHvv1\nOySuXr1KTk7ODccpxB9VXVvf5rZmnd0pO30UXQMjLO010ywGBgZy7tw5PD09tYKWdnZ2LF26lE2b\nNpGUlERCQgINDQ1YWVnh4uJCZGQkrq6uADg4ODB58mSysrLIzMykoqICc3NznJycmDp1KoMGDdLo\n+8UXXwQgIyOD1NRUVCoVEydOVP8uMXToUNzd3dmwYQOZmZmkpaVhZGSEjY0NYWFhWv0JIURHIkEg\nIYQQQrRZayv8OnXvS0leOvm7Y7Fy8UHf2JzLZUrei1MyLnI4u3fvvqn79ejRA0NDQzZu3EhlZaU6\nxUlkZGSLK1ktLCyYNWsWH374IXPnziU4OBg3Nzeqq6spKCjg4sWLrFy58ob3ffLJJ8nPz2fr1q0k\nJycTEBCAra0t5eXlnD17lkOHDvHMM89IEOg+EeRuR5C7nUa9CBNDPXq52d31GkBCCCHaRqlU8txz\nzxEREcHMmTPbVJuk/koN57N24dR7uHrncm5uLgkJCZiamjJgwAAAHnroIf7zn//wr3/9i7lz56qD\nNBUVFaxduxZAI/CSmZnJ3Llz8ff3Z9GiRerjTbt0WqvlZmFhAcDFixfb/PwDBgzA3NycXbt28cgj\nj6jrpwBs3LiRCxcu0KtXL616QELcLiaGbZ9itO8Rgn2PEADMjDUXW02fPp3p06e3eK2xsTETJkxg\nwoQJN7yHqakp0dHRREdHt2lMlpaWzJ49+4Zt3Nzc2rwj+9p3vzkRERFERES0eH7Tpk1tuo8QQrSF\nBIGEEEII0WatrfAztnbA86EpnMvYScWZXFSqBoytHHjkqWmMCutx00EgMzMz5syZw5o1a4iPj6em\npgZoXIl3o3Qmffv2ZcmSJcTGxpKRkUFaWhqmpqY4Ozszfvz4Vu+rp6fHvHnzSEhIYPv27aSkpFBT\nU4OFhQUODg489dRThIeH39SziI7Pzd5cgj5CCHEfaGttEnMHV4qPp1FVdJYznZwxPrmLnPQUGhoa\nmD59urpO4BNPPMGBAwfYv38/f/nLXwgODqa2tpbffvuN8vJyxo4dS8+ePdX9Lly4kIyMDC5dusS/\n/vUvVCoVOTk55Obm4unpqVHsvjlOTk7Y2tqSmJiIrq4u9vb2KBQKhg4dir29fbPXGBkZMWPGDD78\n8EP++te/MnDgQDp16sTx48dJS0vD2tr6hhPrQvxRvdxubdf0rV4nhBCi7SQIJIQQQog2a8sKP7NO\nzng99IzGscDePfH3d9da0dbaCjmAPn36tFjbZ9KkSUyaNKnZcy4uLrz66qut9t/SKrumyZahQ4e2\n2ocQQgghOo621iYxMLXGud8jnE2Lpzg3lbhSEwb1DSA6OprevXur2+np6fH++++zYcMGdu3axebN\nm9HR0cHd3Z0XX3yRwYMHa/Q7ZcoU0tPTKS0t5eeff8bAwAB7e3umTp3K6NGjW603qKOjw7x58/j3\nv//Nnj17uHz5MiqVip49e7YYBAIICQnh448/5ocffuDgwYNUV1djZWXFqFGjiI6OxsbGpk2fixC3\
nws3eHH8XmzYFYJsEuNrIAhwhhLgDFKrW9kc/oBQKxYHevXv3PnDgwN0eihBCCNFhFCgrmfZV4k1f\n99W0wfILnhBCCCHazbXp4Oz7PMK3CcdabFt7qYycDX/H1qMXrqGPqY9PCe/OpEFet2U8UVFR+Pn5\ntWnBixD3i7T8Iuas3t9qKkYAhQIWTQ6RuotCiPtenz59OHjw4EGVStX86tY7QHYCCSGEEKLNZIWf\nEEIIITq66rKL5CWs5dLFUzRcrcfEujOdA4Zg4dhN3UalaqDyQj6521dRW1FMfW0VNXvsOREWzPjx\n4+nRo0ezfZ8+fZp169aRmZlJSUkJpqamODk5MWTIEEaPHt3q2NavX8+///1vevTowVtvvYW5ufyM\nJO4fQe52zHzEn6U/Z90wEKRQwCuRARIAEkKIO0Tnbg9ACCGEEPeWyYO9UCja1lah4LatqBVCCCGE\naM2FCxf4aeUn1F+pwdazD9YuvlSXnufEjtWUFmSr2zXUXaHibC6gwMLJi049BhAa0pfMzEz++te/\n0lxWkJSUFGbMmEF8fDwuLi48/vjjhIaG0tDQwLp16244LpVKxYoVK/jmm28YMGAACxYskACQuC+N\nDHJh0eQQAlybTz8Y4GrDoskhjOjlfIdHJoQQDy7ZCSSEEEKImyIr/IQQQgjRUWVnZzNmzBi8dX3U\nO5ftvIM59ss3FCb/jEUXTwB09A1w9AzHffB44PeJ6WcGUFRUxGuvvcbXX3+tUZOwoqKCxYsX09DQ\nwMKFC/Hz89O4b1FRy3WIrly5wieffEJSUhKRkZG8+OKLKNq6okaIe1CQux1B7nYUKCtJLyiiurYe\nE0M9ernZSYYAIYS4CyQIJIQQQtwhmzZtYuvWrVy4cIErV67w/PPP89hjj7V+YQc0MsgFBysTYnbn\nknlSOzVcgKsNkwZ5SQBICCGEEHeUqakpEydO5Mj5KnVtElNbJ2zc/CnOS6es8Ai23XrR5+l31ddc\nu3PZzs6OsLAwNm3axMWLF+nUqRMA8fHxVFdXq2v9XM/OrvmfeSorK3n//fc5cuQIU6dOZezYse3w\n1EJ0TG725hL0EUKIDkCCQEJ0APfTxLAQonmJiYmsWLECDw8PHn30UfT19VvMNX+vkBV+QgghhLhb\nrv/5o6tp4/bkbt26YWxsTJC7scbOZTMHV4rz0rlcel7dx6WLhVw8sh834yreO/A19fX1GvcoLi5W\nB4GOHj0KoLE7qDVlZWW8/vrrnD9/ntdee40hQ4b80ccWQgghhLhpEgQS4i67HyeGhRDaUlJSAJg/\nfz42Ns3nx75XyQo/IYQQQtwpaflFrE7MVad6a1J7qYzCwlK6+emrj127c/m3s2YAXL1SC0DZqcMU\np/6EW2crBvcNwdHRESMjIxQKBVlZWWRnZ1NXV6fuq6qqCgBbW9s2j7W0tJTq6mrs7Ozo2bPnLT+z\nEEIIIcQfIUEgIe6y+3liWAjxPyUljRMV8p4LIYQQQtyabWmnbliTsOLyFTYlHWZUeqG66HzTzuXV\nJkUszTYn0N+FyBE9WfflBqo97Fm6dCnOzpoF6j///HOys7M1jpmamgKNu4Pc3NzaNF53d3eGDx/O\n0qVL+etf/8oHH3xA586db+6hhRBCCCH+IAkCCXGXycSwEPe3mJgY1qxZo/5zVFSU+t83bdoEQEZG\nBuvXr+fYsWPU1NRgb29PaGgo48aNU084NJkzZw7Z2dn8+OOPxMbGkpCQwIULFxgyZAgzZ84kPj6e\npUuXMnPmTGxtbVmzZg15eXkYGBjQt29fXnjhBUxNTcnLy+O7777j0KFDXL16lYCAAKZNm4a9vb3G\n/c6fP09sbCyZmZkUFxdjYGCAra0tPj4+PPPMM5ibyw4gIYQQQrS/tPyiGwaAmlSXnGPxjynYWxpr\n1Ca8ePoEna1MmDhiABH93Fn1YTEuLi5aASCVSkVOTo5Wv97e3
uzZs4cDBw7cVEq4oUOHYmBgwOLF\ni9WBICcnpzZfL4QQQgjxR0kQSIi75F6fGBZCtI2/vz/QWExYqVQyceJEjfPbtm3jiy++wNDQkIED\nB2JlZUVWVhaxsbHs37+fv/3tb1rvO8DChQvJzc2lT58+9O/fH0tLS43z+/fvJyUlhb59+zJq1CgO\nHz6sHsOUKVOYN28evr6+DB8+nIKCApKTkzl//jz/+Mc/UCgUQGOQ+tVXX6W6uprg4GBCQ0O5cuUK\nFy5cYOfOnURGRkoQSAghhBB3xOrE3FYDQAD1V2o4l7mLmN2O6iBQbm4uCQkJmJqaMmDAAADs7e05\ne/YsJSUl6gV5KpWKmJgYCgsLtfqNiIhg7dq1bN26ldDQUPz8/DTOFxUVYWdnp3UdQFhYGHp6enz0\n0UfMmTOHBQsW4OLicjOPL4QQQghxyyQIJMRdci9PDAsh2s7f3x9/f3+ysrJQKpVMmjRJfU6pVPLV\nV19hZGTEp59+SteuXdXnli9fzpYtW/jmm2946aWXtPq9ePEin3/+ORYWFs3ed//+/XzwwQfqCQqV\nSsXbb79Neno677zzDi+99BLh4eHq9suWLSMuLo7k5GRCQkIA2LNnD5WVlbzwwgs8+uijGv3X1NSg\no6Nzy5+LEEIIIURbFSgrtWoAtcTcwZXi42nE/vMsncsj0L1aw+7du2loaGD69OmYmJgA8Pjjj/P5\n55/z8ssvExYWhq6uLocPH+bUqVP069eP5ORkjX4tLCyYNWsWH374IXPnziU4OBg3Nzeqq6spKCjg\n4sWLrFy5ssVxhYSE8Oabb/LBBx+oA0Hu7u63/qEIIYQQQrSRBIGEuEvu5YlhIcTtkZCQQH19PWPG\njNF4zwGefvppdu7cyc6dO5k2bRr6+voa55966qkW33OAIUOGaKxQVSgUDB06lPT0dFxdXTXec4Bh\nw4YRFxdHXl6e1rtuYGCg1b+RkVFbH1MIIYQQ4g9JLyhqc1sDU2uc+z3C2bR4Nmz8GXsLA7p160Z0\ndDS9e/dWtxs5ciT6+vr89NNPxMfHY2BggK+vLzNmzCApKUkrCATQt29flixZQmxsLBkZGaSlpWFq\naoqzszPjx49vdWy9e/fmnXfe4b333mPu3Lm89957eHl5tfnZhBBCCCFuhQSBhLiDCpSVpBcUUV1b\nj4mhHr3cmk8XcK9MDAshWnb9+15edUWrzYkTJwAICAjQOmdmZka3bt3Izs7m9OnTWitFW5sw8PT0\n1DrWlOqkuXO2trZAYyqTJiEhIaxatYovv/yStLQ0goKC6NmzJ87OzrIzUAghhBB3THVtfattDM2s\n6P3UfPWfPcKjmRLenUmDWv6ZKSIigoiICK3jbm5uGov0ruXi4sKrr77a6niaUnxfz9/fn//+97+t\nXi+EuDc899xzADfcCSiEEHebBIGEuAPS8otYnZjbbAqDsqwzGF7WnBzu6BPDQoiWtfS+56afQlFR\nSlp+kTo/fVVVFfC/d/B61tbWGu2aO9eS5tJF6urqAqjToDR37urVq+pj9vb2fPrpp8TExHDw4EGS\nkpIAsLOz44knntCoZSaEEEII0V5MDG9t6uJWrxNCiCZN9ZdbCuwKIcS9QH4iEqKdbUs7xdKfs1os\nYnqx4jKXlKX8kl7IiF7OQMefGBZCNK+1973i8hXmrN7PK5EBjOjlrH4fS0tLmy0OXFpaCjT/bt6p\nnTjOzs688cYbXL16lfz8fNLT09m8eTMrVqzAyMiIhx9++I6MQwghhBAPrpYyKLTXdUIIIYQQ9xOp\n6CxEO0rLL7rhhHATlQqWbM4kLb9xt821E8PN6QgTw0IITbfyvnt4eACQlZWl1a6qqoq8vDwMDAxw\ndnZujyHfFF1dXTw9PRk3bhyzZ88GYO/evXd5VEIIIYR4ELjZm+Pv0vwCuZYEuNrgZm/eTiMSQggh\nhLh3yE4gIdrR6sTcVieEm
6hUELM7lyB3Ozw8PEhKSiIrK4vAwECNdh1tYlgI0ehW3vfZI4eydu1a\nNm/eTEREBI6Ojuo23333HdXV1QwfPlyr9tedcvz4cRwdHbV2EJaVlQFgaGh4N4YlhBBCiAfQ5MFe\nzFm9v00/bykU3LAWkBDij4uKisLPz49Fixa1S//79+9n48aNFBYWUllZiYWFBV26dGHQoEGMHj1a\n3e7s2bOsXbuWjIwMKioqsLCwIDAwkOjoaLp06aLR59KlS4mPj2flypXY29trnMvKymLu3LlMnDiR\nSZMmoVQq1fV+mp63SXPPXVNTQ0xMDLt376asrIxOnToxfPhwxo4dK4t1hRB3nQSBhGgnBcrKZmsA\n3UjmyRIKlJUMHdqxJ4aFEJpu9X2vxo8XXniB5cuXM2PGDAYOHIilpSXZ2dkcOXKErl27MnXq1PYZ\ndBvs3LmTbdu20bNnTzp37oyZmRnnz58nOTkZfX19Hnvssbs2NiGEuN2OHj3K+vXrOXToEJcuXcLK\nyorg4GAmTpyoTtH7//7f/+PChQt8++23WFhYaPURGxvLt99+y7Rp04iMjFQfLyoqIjY2ltTUVIqL\nizE2NsbHx4fo6Giteo4xMTGsWbOGhQsXUlJSwsaNGzl16hQWFha8++67/PnPf8bf35+FCxc2+xwv\nvfQSp0+f5l//+leLqYWFuBcFudsx8xH/VndeKxTwSmSAugajEOLes23bNj7//HOsra3p168fFhYW\nlJWVUVBQwPbt29VBoNzcXN58800uX75Mv379cHFx4fTp0yQkJLB//34WLFjQat3klpiamjJx4kTi\n4+NRKpVMnDhRfc7BwUGjbX19PW+//TYlJSUEBwejo6PDvn37+Pbbb6mrq9O4Vggh7gYJAgnRTtIL\nim75usf7uXfoiWEhhKY/9L6PHo2joyPr168nKSmJ2tpaOnXqxBNPPMGECROareN1pwwePJi6ujoO\nHz7M8ePHuXLlCra2tgwaNIgxY8bg6up618YmhBC3U1xcHP/4xz/Q19cnJCQEOzs7zp49yy+//EJy\ncjKLFy+mU6dOREREsGrVKnbt2qWxIrjJjh070NPTY8iQIepjJ06c4K233uLSpUv07t2b0NBQKioq\n2LdvH6+//jrz5s0jODhYq68ff/yR9PR0+vXrR0BAAFVVVXTt2pWAgAAyMzM5c+YMTk5OGtccPnyY\nkydPEhoaKgEgcV8aGeSCg5UJMbtzyTypvQAnwNWGSYO8JAAkxD1u27Zt6Onp8dlnn2FpaalxrqKi\nAgCVSsWnn35KdXU1r732GuHh4eo2u3fv5uOPP+aTTz5h+fLlt7QTx9TUlEmTJpGVlYVSqWTSpEkt\nti0pKcHd3Z0FCxZgYGAAwKRJk5g2bRo//fQT48ePR09PpmCFEHeP/A0kRDuprq3/Q9eN7sATw0II\nTW15370entridUFBQQQFBbXpXq2lW4iIiCAiIqLZc/7+/mzatKnZc/b29lrnvL298fb2btO4hBDi\nXnXmzBm++OILHBwcWLRoEba2tupzGRkZvPXWW6xYsYJ58+YxdOhQ/vOf/7Bjxw6tIFBubi6FhYWE\nhoZibt5Yh+Tq1at89NFH1NTUsHDhQvz8/NTtS0pKeOWVV1i2bBkrV67U2uGdmZnJ4sWL1fXjmowe\nPZrMzEx++eUXnn32WY1zv/zyCwCjRo364x+MEB1UkLsdQe52FCgrSS8oorq2HhNDPXq52UkNIHFf\nqampYeLEiXh5efHxxx+rj1+5coXo6Gjq6up49dVXGTp0qPrcli1bWL58OS+//DIPP/wwAJWVlaxf\nv559+/ahVCrR09NT1/q8/neQ+vp6tm7dyvbt27lw4QJ1dXVYWVnh7u5OZGQkvXr1Ij4+nqVLlwKQ\nnZ2t8X3YlErtdtDV1UVXV1freNNO3CNHjnD69Gl69OihEQACGDRoEJs3b+bQoUPk5ORofP+
2l2nT\npqkDQACWlpaEhISwY8cOzpw5IwvohBB3lQSBhGgnJoZte72unxi+9rqOOjEshNDU1vf9dl0nhBDi\nj7l28njPtnVUVNUwd+4LGgEggMDAQEJCQkhOTuby5cvY2dkRGBhIeno6p06dwsXFRd02Pj4egGHD\nhqmPpaamcu7cOcaMGaM1AWVjY8PYsWP55z//SUZGhtZuoJEjR2oFgAD69++PjY0N27dv5+mnn1YH\nj6qqqti9ezeOjo5aNSWFuB+52ZtL0Efc14yMjPDy8uLYsWNcvnwZY2NjAA4dOkRdXR3QuFjh2iBQ\nRkYGgPp7QKlUMmfOHJRKJb6+vvTp04eamhpSUlKYP38+06dPZ8SIEerrlyxZQmJiIq6urgwbNgxD\nQ0OKi4s5dOgQBw8epFevXri7uzNx4kTWrFmDvb29xjyDv7//LT/vtd/Nxk4+lB46yv/93/8xePBg\n/Pz88PHx0dgVdPz4cQACAgKa7S8gIIBDhw6Rl5fX7kEgU1NTjTT+TezsGnclXrp0qV3vL4QQrZHZ\nJyHaSS+3W0tBcKvXCSHuHnnfhRDi3pCWX8TqxFyNOm5HE5KpKirmra82MHjPQa1J5fLychoaGjhz\n5gyenp489NBDpKenEx8fz5/+9CegceV0YmIilpaWGsGcI0eOAHDx4kViYmK0xnP27FkACgsLtYJA\n3bt3b/YZdHV1GT58OGvXriUpKUmdem7Hjh1cuXKFESNGSAFqIYS4TwQGBnL48GGys7Pp27cv0Bjo\n0dHRwc/PTx30gcb0aFlZWXTu3Bl7e3ugMahz8eJFZs+ezeDBg9Vtq6qqmDNnDitWrCAkJAQrKyv1\nYgJPT08++eQTdHR0NMZSWVkJgIeHBx4eHuog0B/d+dPcdzN0pdxpEOXnszm5NhYL459QKBT4+fnx\npz/9CS8vL6qrqwFaTH/adLyqquoPja8tWsrU0rSTqaGhod3HIIQQNyJBICHaiZu9Of4uNjdVLD7A\n1UZWswlxD5L3XQghOr5taaeaLShfX9s4iXRgdxwHfwMPBws6WRhrXV9TUwPAgAEDMDExISEhgSlT\npqCjo0NycjKVlZU89thjGqlrmuoW/PbbbzccW1Pf17Kysmqx/ciRI/nhhx/Ytm2bOgj0yy+/oKen\nx0MPPXTDewkhhLh3BAYGsnbtWjIyMjSCQJ6enoSGhvLll1+qa8Tl5eVRWVlJaGgoAPn5+WRnZxMW\nFqYRAILGoMXkyZNZsGABSUlJjB49GoVCgUqlQl9fv9nFBE2pTm+nlr6bAWw9AsEjkKt1NQz3NoSS\nfOLi4pg/fz7Lly/HxMQEgNLS0mb7Lilp/N2sqR2gfq6rV69qtb8TwSIhhLhbJAgkRDuaPNiLOav3\nN/sDzfUUCpg0yKv9ByWEaBfyvgshRMeVll/U4iSTroERAIET3kDXwAiFAt6bHNJiYXkDAwMGDhzI\nr7/+SlpaGn369GHHjh2AZio4+N/K4DfffJOQkJCbGvONdvPY2toSEhLC3r17OX36NJWVlZw8eZJB\ngwZpFdAWQghx77i+1pVfVycMDAzUO36qqqo4ceIEY8eOVadBy8jIwMnJiczMTOB/6dGadqNWVVU1\nuxu1vLwcaNyNCo3Bkn79+pGcnMzLL79MWFgYPXv2xNvbG0NDw9v+rDf6br6Wrr4RP+fDoskTUalU\nxMXFkZOTQ7du3QDIyspq9rqm403tAMzMzIDGHbrXp2/Lzc1ttp+mHVENDQ1au6OEEOJeIUEgIdpR\nkLsdMx/xb/UHG4UCXokMaHGyQQjR8cn7LoQQHdfqxNwW/242tXOiuvgsly6ewtKpOyoVxOzOveHf\n0w899BC//vorO3bswNPTkwMHDuDm5qZVw8fb2xuAnJycmw4CtWb06NHs3buXbdu2qWsNjBw58rbe\nQ4hboVQqee6554iIiGDcuHH8+9//Jicnh7q6Ojw8PJg
4caJG3dOmIvMzZ87EysqK2NhY8vLyqK6u\n1qhNmpGRwfr16zl27Bg1NTXY29sTGhrKuHHjmk3FVFlZyYYNG9i3bx/nz59HT08Pe3t7goODefLJ\nJzEyMtJou379evbt24dSqURPTw9PT0/GjRunVaO1vr6erVu3sn37di5cuEBdXR1WVla4u7sTGRlJ\nr1691G1zcnJYt24deXl5lJeXY2ZmhoODA3369GHixIm382MX97jmU6I1qqizoOhwLuXl5Rw5coSG\nhgYCAwNxdnbGxsaGjIwMRo8eTUZGBgqFQl0PqCl9W3p6Ounp6S3e+/Lly+p/f+ONN4iNjWXXrl2s\nXr0aaFz8EBYWxrPPPnvDXao360bfzZXn8zFzcFMviGj6bjYvKwPA0NAQHx8fnJycOHToEHv27CEs\nLEx9/Z49e8jJycHJyQlfX1/18aZUq7/88otGLaGCggI2btzY7FgsLCyAxsCRg4PDrT+wEELcRR0i\nCKRQKGyBMcAjgD/gBFwBsoBvgG9UKpVWAk2FQhEKvAn0B4yBXOBfwGcqlUp7b6cQd8HIIBccrEyI\n2Z1L5kntH+gCXG2YNMhLJoSFuA/I+y6EEB1PgbLyhuk6O3XvR/Hxg5w58CuG5jYYWdiRebKEAmUl\nbvbm1NfXc/ToUY1JJB8fH7p06cK+fftwdnamvr6+2TRsISEhODo68vPPPxMQEKBV9wcaV2q7u7vf\n9CrrwMBAnJyciI+P58qVKzg5ObVYHFuIu+HChQvMmjULNzc3Ro4cSWlpKbt372b+/PnMnj2bQYMG\nabTfs2cPBw4coE+fPowaNQqlUqk+t23bNr744gsMDQ0ZOHAgVlZWZGVlERsby/79+/nb3/6mEQi6\ncOECc+fORalU4unpyejRo1GpVJw5c4YNGzYwatQodRBIqVQyZ84clEolvr6+9OnTh5qaGlJSUpg/\nfz7Tp09nxIgR6r6XLFlCYmIirq6uDBs2DENDQ4qLizl06BAHDx5UB4EOHDjAu+++i4mJCSEhIdja\n2lJZWcnp06f5+eefJQgk1G6UEg2g2qQzJ47l8M/YOCyulmBgYICPjw/QuOvnwIED1NXVkZOTg4uL\ni3pHaFMatBdffJGoqKg2jcXAwIBJkyYxadIkioqKyM7OJj4+np07d3LhwgU++uijP/7AtP7dnJ/4\nAzp6BpjYOWFoZoVKBUe3nqSbWS0Bvj0IDAxEoVDwyiuv8NZbb/HRRx/Rv39/unbtypkzZ9i7dy/G\nxsa88sorGjtrQ0JC6NKlC4mJiRQXF9O9e3cuXrzI/v37CQkJaTZ9a2BgIL/99hsLFy4kODgYAwMD\n7O3tGTp06G35LIQQ4k7oEEEgYDywHDgH7AROAQ7AE8DXwCiFQjFepfrfV6JCoXgMWAfUAN8DJUAU\nsAQI+71PITqEIHc7gtzttLZ293Kzk5ogQtxn5H0XQoiOJb2g6Ia/RzKmAAAgAElEQVTnjSztcAl5\nlFP7N3J485dYOHbD0MKWj5dk0MW0gUOHDmFhYcGXX36pcd2wYcP47rvv+P7779HV1SU8PFyrbz09\nPebOncvbb7/Nu+++i4+PjzrgU1RURG5uLufPn2fVqlU3HQRSKBSMGjWKr7/+GpBdQKLjyc7OZsyY\nMTz77LPqY4888gizZ8/m888/p0+fPhq1OlJTU5k/fz59+vTR6EepVPLVV19hZGTEp59+SteuXdXn\nli9fzpYtW/jmm2946aWX1McXL16MUqnkmWeeYfx4zamBiooKjV1AS5Ys4eLFi8yePVujbkpVVRVz\n5sxhxYoVhISEYGVlRVVVFbt378bT05NPPvlEKzVU084LgF9//RWVSsWiRYtwd3fXGoMQ0LaUaOad\n3TmrgpU/bsff+go9evTAwMAAaAxQJCQksGXLFmpqatS7gEBzN2pbg0DXsrOzIzw8nCFDhjBt2jQO\nHTpEZWWlujaQQqG
goUFrvXabtPbd7NgrgspzJ7hccp6Ks8fR0dXDwNSS4GFRvDPjT+jpNU5nent7\ns2TJEr7//nvS09NJTk7GwsKCIUOGEB0djZOTk0a/BgYGfPDBB6xcuZL09HRyc3NxdXVl1qxZmJub\nNxsEGj58OEqlksTERNatW8fVq1fx8/OTIJAQ4p7SUYJAx4BHgZ+v3fGjUCjmAsnAWBoDQut+P24B\n/BO4CoSrVKrU34+/BewAxikUimiVSrX2jj6FEK1wszeXSWAhHhDyvgshRMdQXVvfahsbjwCMrR1Q\nHt5H5YV8Ks+fIP2SLQpvV8LCwrR2LEBjEGj16tXU19fTt2/fFmvxuLm58dlnn7FhwwaSk5PZvn07\nOjo6WFtb4+HhwaRJk9SpZm5WREQEK1euRF9fn4iIiFvqQ4j2YmpqqrXbxcvLi/DwcOLj49m7d6/G\nf7chISFaASCAhIQE6uvrGTNmjEYACODpp59m586d7Ny5k2nTpqGvr8/x48c5cuQIHh4ejBs3Tqu/\na9+3/Px8srOzCQsL0wgANY1/8uTJLFiwgKSkJEaPHo1CoUClUqGvr99s3a6myfFrNU3WtzQG8WC7\nUUq0JibWjugZGFFeeJQDZ+oYF/W/oH/TDtD//ve/Gn+GxvfN19eXpKQk4uLiePjhh7X6LigowNra\nGktLS8rLyyktLcXNzU2jTU1NDTU1Nejq6qqDL9D433FR0Y2DOS1p7bu5U/dgOnXX3j0bGNYdY2Nj\njWNOTk68+uqrbb63nZ0db7zxRrPnrk1B2URHR4dnnnmGZ555ptlrVq5c2eK9mnZVCSHE3dYhgkAq\nlWpHC8fPKxSKL4EPgHB+DwIB44BOwKqmANDv7WsUCsWbQDzwZ0CCQEIIIYQQQjzATAzb9iuPsbUD\nrqGPqf/85xE9ebyfe4vtO3Xq1GL9gOtZWloyZcoUpkyZ0mrbm5kwys/PR6VSERYW1uzks+g4oqKi\n8PPzY9GiRXd7KLfd9bufu5o2zmh369ZNa7IWwN/fn/j4ePLy8jSCQE21Oq534sQJgGbTHZqZmdGt\nWzeys7M5ffo07u7uHD16FIDevXs3G6i51pEjR4DGXT8xMTFa58vLywEoLCwEGtNr9evXj+TkZF5+\n+WXCwsLo2bMn3t7eWrv5hgwZQlJSEq+99hqDBg0iICAAHx8f7OwkLbBo1FpKtCYKHR3M7F0pO32U\nOsC2q6f6nL29PY6Ojpw7dw4dHR38/Pw0rp01axbz5s1j2bJlbNq0CW9vb0xNTSkqKqKgoICTJ0+y\nePFiLC0tKS4uZsaMGbi5ueHm5oadnR3V1dWkpKRQWlpKVFSUxjsdGBhIYmIi7733Ht26dUNPTw9f\nX1+tMTSnrd/Nt+s6IYR40N0Lf3vW/f7Pa5cJDPv9n9uaaZ8IVAOhCoXCUKVS1bbn4IQQQgghhBAd\nVy+3W5twvdXr7qR16xrXyD3yyCN3eSTiQdRSIfvaS2UUFpbSzU+/2euaCstXVVVpHLe2tm62fVM7\nGxubZs83XdfUrrX212pK35aenk56enqL7S5fvqz+9zfeeIPY2Fh27drF6tWrgcbdPmFhYTz77LPq\n5wsNDeXtt99mw4YNbN++nW3bGqcvPD09mTJlirp2kHhwtZYS7Vpmnd0pO30UXQMjynU0d54GBgZy\n7tw5PD09NWpjQeOul6VLl7Jp0yaSkpJISEigoaEBKysrXFxciIyMxNXVFQAHBwcmT55MVlYWmZmZ\nVFRUYG5ujpOTE1OnTtXaFfviiy8CkJGRQWpqKiqViokTJ7YpCHQ/fzcLIURH1KGDQAqFQg9o2m95\nbcDH+/d/Hrv+GpVKVa9QKPIBX8ADONzKPQ60cKrHzY1WCCGEEEII0dG42Zvj72LTptXWTQJcbTps\nSs+CggJSUlI4fvw4Bw4coG/fvuq6D6LjWr58+U3XferIWitkX3H5CpuSDjMqvZARv
Zw1zpWVlQFo\nTVa3tGunqV1paSkuLi5a50tLSwHU9YWa2peUtP7ON13z4osvtrlmioGBgXrHXlFREdnZ2cTHx7Nz\n504uXLjARx99pG7bt29f+vbtS01NDceOHSM5OZmtW7fy7rvvsmzZMpydnW9wJ3G/a0u60ib2PUKw\n7xECQE2dZh2e6dOnM3369BavNTY2ZsKECUyYMOGG9zA1NSU6Opro6Og2jcnS0pLZs2e3qe317rfv\nZiGE6Oh0Wm9yV30I+AFbVCrVL9ccb1r2UN7CdU3HrdprYEIIIYQQQoh7w+TBXrSSFUpNoYBJg7za\nd0B/wIkTJ1i1ahXp6ekMHDiQmTNn3u0hiTbo2rUrnTp1utvDuC3aUsgeoLrkHIt/TCEtX3O3Q1ZW\nFgAeHh5tul9Tu6brrlVVVUVeXh4GBgbqgEpTUPTgwYOoWhlkU9ucnJw2jeV6dnZ2hIeH89577+Ho\n6MihQ4fUu4uuZWRkREBAAM8//zzjx4+nvr6e1NTUZnoUD5IHPSXa/fTdLIQQHV2H/eZQKBQvA68B\nR4Cn2+s+KpVKu/Ik6h1CvdvrvkIIIYQQQog7I8jdjpmP+Lc6ca1QwCuRAQS5d9x0MxERERp1VMTN\n2b9/Pxs3bqSwsJDKykosLCzo0qULgwYNYvTo0ep2lZWVrF+/nn379qFUKtHT08PT05Nx48YRFBSk\n0Wd8fDxLly5l5syZWFlZERsbS15eHtXV1eoi4y3VBLp69Sq//PILO3bs4NSpU1y9epWuXbvy8MMP\n88gjj2jtjmnr+NtTWwrZA9RfqeFc5i5idjuq36nc3FwSEhIwNTVlwIABbbrf0KFDWbt2LZs3byYi\nIgJHR0f1ue+++47q6mqGDx+Ovn5j+jlPT098fHw4fPgwsbGxjB8/XqO/yspKDA0NMTAwwMvLC19f\nX5KSkoiLi+Phhx/Wun9BQQHW1tZYWlpSXl5OaWkpbm5uGm1qamqoqalBV1cXPb3GaZbs7Gx8fHzQ\n1dXVaNu0E+p+2hkmbs2DnhLtfvpuFkKIjq5DBoEUCsVLwN+BQ0CESqW6fn9o004fS5rXdLysHYYn\nhBBCCCGEuMeMDHLBwcqEmN25ZJ7UTj8T4GrDpEFeMsl0H9u2bRuff/451tbW9OvXDwsLC8rKyigo\nKGD79u3qIIpSqWTOnDkolUp8fX3p06cPNTU1pKSkMH/+fKZPn86IESO0+t+zZw8HDhygT58+jBo1\nCqVSecPx1NfX8/7773Pw4EGcnJwYMmQIBgYGZGZm8tVXX3Hs2DFeffXVmx5/e2prIXsAcwdXio+n\nEfvPs3Quj0D3ag27d++moaGB6dOnq1Oxtcbe3p4XXniB5cuXM2PGDAYOHIilpSXZ2dkcOXKErl27\nMnXqVI1rXnvtNebMmcOqVatISkrC398flUrF2bNnSUtL48svv8Te3h6AWbNmMW/ePJYtW8amTZvw\n9vbG1NSUoqIiCgoKOHnyJIsXL8bS0pLi4mJmzJiBm5sbbm5u2NnZUV1dTUpKCqWlpURFRWFsbAzA\nihUrKC4uxsfHBwcHB/T09Dh+/DiZmZnY29szePDgtn/w4r4kKdHku1kIIe6UDhcEUigUM4ElQDaN\nAaDmfnI+CgQD3QGNmj6/1xFyB+qBvPYdrRBCCCGEEOJeEeRuR5C7HQXKStILiqiurcfEUI9ebnb3\n1aSaaN62bdvQ09Pjs88+w9JScz1hRUWF+t+XLFnCxYsXmT17tsZEfVVVFXPmzGHFihWEhIRgZaWZ\nfTw1NZX58+fTp0+zySa0/PDDDxw8eJDIyEheeOEFdHQas7U3NDTwj3/8g7i4OMLCwggJCbmp8ben\nmylkb2BqjXO/RzibFs+GjT9jb2FAt27diI6Opnfvm0u6MXr0aBwdHVm/fj1JSUnU1tbSqVMnnnji\nCSZMmKBVX8jBwYG///3vrFu3jn379rF582YMD
[base64-encoded PNG image data omitted — embedded figure output from the notebook diff]
DKTTnBm13fYNPYGhR7vR27E18NJ6/PF8OHDWb58OZMmTSIwMBB9fX3OnDnDlStX6NChAxEREVpt\ntGrViiNHjvDPf/6Tpk2bYmBggI+PD76+vgwaNIj9+/czf/58AgMDsbW15fLly0RFRfHSSy89sZnS\nT0PWyMNkYT+Pry9bW1tWrFiBqanpk+6KEE+lx/HdRAghxOMlQSAhhHgK5OSUz96UANCzqS4yEz76\n6CPi4+NlppkQ1fA8l0l6nCqawLBixQqMjY21jndzsHhmB6Ufh+qUFnVu1w9jC1uyzp0kO/kU+sam\nBPTsypx/TtMKZvbv3x9DQ0N++eUXwsLCMDIywsfHh8mTJ3Ps2DGdQaB33nkHgJiYGE6dOoVSqSQo\nKAhfX1/c3NyYO3cuGzdu5OTJk5SVleHu7s7MmTMxMzOTcjmP2PP0+jIwMMDFxeVJd0OIp1ZdZ00L\nIYR48iQIJIQQ9VhISAibNm1S//+QIUPUP69Zs6bSmau6ggJxcXHMnDmToKAg2rdvz6ZNm0hKSqKg\noIApU6awZMkSnW3paiMvL4/169cTERFBfn4+Tk5OjBw5kt69e+u8l6ioKEJDQzl37hx37tzBzs6O\nTp068corr2gtqK7KWvn6668JCQnhjz/+4MaNG4wZM6bCkgBPO1nAWYjHS8okPXoVTWCQwdlHozrl\nexQKBfbNOmDf7M9ybEP7eWNmZqYzU7SiEjlubm46n8dWVlZMnz69wvZbtGjBl19+qXOfTGIQdUVX\ndteSJUsICwtj9erVnDx5kl9//ZXr169jY2NDv379GD16NAqFgqNHj7Jt2zauXLmCiYkJL730Em++\n+SZGRkYabRw/fpzff/+dc+fOcePGDaD8va1Xr14MHjwYhUKh1a/U1FTWr19PTEwMpaWluLu7M2bM\nGPLy8liyZAlTpkzRer1lZ2er1yO5ceMGDRo0oEWLFrz66qt4espnQfHoyHcTIYR4tkgQSAgh6jE/\nPz8AwsLCyMzMJCgoqE6um5SUxJYtW/D29qZPnz7k5eXRqFEjgoKCCA0NBWDo0KHq4z08PDTOLyws\nZMaMGRgYGBAYGEhJSQlHjx5l6dKlKBQKrS+wmzZtIiQkBAsLC/z9/bGysuLSpUv8/PPPnDp1ikWL\nFmmV7CgtLeXjjz8mPz+fNm3aYGpqiqOjY53cf30kmQlCPH7PY5mkx6GyCQw7duzQKm21fPly9uzZ\nwyeffEJAQIDW9c6ePcu0adPo3LkzH330kXp7UVERoaGhhIeHk5aWhkKhoEmTJgwdOpSuXbs+wjus\nn6R8jxBV+/7779XrUrVp04YTJ06wYcMGSktLsbCwYO3atXTs2BEfHx9Onz7Nrl27uHfvHu+++67G\nddauXYuenh7NmjWjYcOGFBYWEhsby8qVK0lOTmbq1Kkax1+7do3p06dTUFCAv78/bm5uXL9+nblz\n59KuXTudfb1w4QKzZs2ioKCAtm3b0rlzZ/Ly8jh+/DgzZszg448/pn379o/sdyWeb/LdRAghni0S\nBBJCiHrMz88PPz8/4uLiyMzM1Jh1m5mZWevrRkdH895779G/f3+N7S1atCAsLAyg0oybixcv0qdP\nH95//3309PQAGDZsGO+//z5bt27VCALFxsYSEhJC8+bN+eyzzzSyfsLCwliyZAkhISG89dZbGm3k\n5OTg6urKvHnzMDExqfW9PiphYWFERERw4cIFbt68ib6+Pm5ubgwYMIAePXpoHKvKytq+fTtbt25l\n//79ZGVlqReUf/311zEwMNDKTMi/nkJG4h/cvpHKvdJiXnB0ZNSg3rzk2V19bdVsV5X7B1t1rR9w\n9+5dQkJCCA8PJzc3F3t7e/r27cuoUaN0zlo9e/Ys27ZtIzExkYKCAqytrWnfvj1BQUFas/tV9/nz\nzz/z008/cejQITIyMujWrVu9XWdBCHi+yiQ9DjWdwNCrVy/27Nn
DgQMHdAaBDhw4AKCRaVpYWMjM\nmTNJSUmhadOm9OnTh3v37hEdHc3ChQu5fPkyY8eOrcO7qv+kfE/dysnJ4ccff+TUqVPk5ORgamqK\nj48PY8aM4cUXX9R5Tnh4OHv27CElJYWioiJsbGxo3rw5w4cPV2dtFBYWsnfvXiIjI0lNTeXWrVuY\nmprSvHlzRo8erXPdLFF3zp8/z9dff03Dhg2B8s+7b7/9Ntu2bcPY2JglS5bg6uoKQElJCZMnT+a3\n337jtddew8rKSn2d2bNn4+TkpHFtpVLJkiVLOHDgAIMGDaJZs2bqfStWrKCgoICJEycycOBA9fbI\nyEg+++wzrX6WlZWxYMEC7t69y9y5c/H19VXvy8nJ4YMPPmDZsmWsWbMGQ0PDOvndCPEgyZoWQohn\nhwSBhBDiOeTh4aEVAKoJY2Nj3nrrLXUACMDV1RVvb2/i4+O5e/euOnCjKu/y97//XavsW69evQgN\nDeXQoUNaQSAoLwtXHwNAAN9++y2NGzfG19cXGxsb8vPzOXXqFP/+979JTU3l9ddf1zpn0aJFJCQk\n0K5dO0xNTTl16hRbt24lNzdXHSRRZSas27yN5b/9gp2+Ib69u+Ht3oj0y+c5eXgv01MSWbhwIWZm\nZpiZmREUFKRzsPXBzKnS0lI+/fRTcnJyaN++PXp6ehw/fpx169ZRUlKiNVD722+/8c0332BoaEhA\nQAB2dnakpaWxd+9eIiIiWLRoEfb29lr3OXfuXJKTk2nXrh0dO3bUGDQRQjz7KpvAoEvz5s1xdnZW\nlxe1sPgzKFFSUsKRI0ewsrKibdu26u2rVq0iJSWFcePGMWrUKPX24uJivvzyS7Zs2UJgYKBWJuuz\nTsr31I2MjAxmzJhBTk4OLVu2pGvXrmRnZ3P06FFOnjzJzJkz8ff3Vx+vVCpZunQpYWFhWFpa0qlT\nJ6ysrLhx4waxsbE4Ozurg0DXrl1jw4YN+Pj44O/vj7m5OZmZmURERBAZGcmsWbMqzAwRD+/VV19V\nB4AAzMzMCAgIYP/+/YwYMUIdAAIwNDSkS5cuhISEcPXqVY3PMw8GgKC83OLQoUM5cOAA0dHR6iBQ\ndnY2sbGxODk5MWDAAI1z2rVrR+vWrTl9+rTG9lOnTpGens6IESM0AkBQXmJz1KhRrFq1ipiYGMkG\nEo+UZE0LIcSzQYJAQghRDz34IftWYXGdXt/Ly+uhzm/UqJFW+TYAO7vyWWAFBQXq4E1SUhIGBgYc\nPXpU57VKSkq4deuW1sCfkZERbm5uD9XPR+mbb77RGgAoLS1l9uzZ/PTTTwwYMEBjkAEgPT2d5cuX\nq+9z7NixTJo0iQMHDvDGG29gY2MDlGf3bP9xA24v2PLvf/9bY/2MFStW8Ouvv/Lf//6X999/HzMz\nM4KDg6s12JqTk4O7uztffPGFurZ9cHAwEyZM4JdffmH06NEYGJR/NEhNTeXbb7/F0dGRefPmadxL\nTEwMs2bNYuXKlXz88cda7WRlZbF8+XIsLS1r8isVQjzlHubZ1bNnTzZs2MCRI0cYNGiQentERAQF\nBQUMGzYMfX19APLz8zl48CCenp4aASAof3aMGzeOqKgoDh8+/NwFgaR8T91Yvnw5OTk5jB07ljFj\nxqi3Dxw4kH/84x8sXryY77//Xv1ZZ+/evYSFheHp6cmcOXM0Jr3cu3eP3Nxc9f+7uLiwbt06rWdk\ndnY2H374IatXr5YgUA09+N7jYlbxH7+uLC5VZrOufarPP9nZ2Rrb8/Pz2bZtG6dOneL69evcvXtX\nY79qnSCAlJQUoDzgrSvr2tvbWysIlJSUBJR/pgoJCdE6Jy0tDYCrV69KEEg8FpI1LYQQTzcJAgkh\nRD0SfTGbH44ka5VyST59BUXeTaIvZtfJgI21tfVDnf9gRo+KaoDu3r176m35+fmUlZVprA+hy507\ndzSCQFZWVjq/KNcXumaAGhg
YMGjQIGJjY4mJiaFnz54a+8eNG6dxjyYmJnTr1o3Nmzdz/vx59azi\nQ4cOUVpayogRI7QWUB87diwHDx7k4MGDTJgwocYlQCZMmKCxuLGVlRUBAQEcOHCA1NRUmjRpAsDu\n3bspLS3l7bff1gpmtWrVioCAACIiIrhz5w4NGjTQ2P/6669LAEiI50hdPLt69uzJxo0bCQsL0wgC\nqUqU3l8K7ty5c+rnjK7B0bKyMqB8cPR5JOV7KldVwCA7O5vo6Gjs7e0ZOXKkxr4WLVrQrVs3Dh48\nyLFjx9TP+Z07dwKoJ2fcT09PT6N8akWfoezs7AgMDGTHjh1kZWXpzLQVmip67ykqyOXq1Zs0u1Gg\ndY6u37/q86uuCU6qfar3FSgv6ffBB6vrvU0AACAASURBVB+QkZGBl5cXPXv2xNzcHH19fQoLCwkN\nDaWkpETjeKj487eu7Xl5eQAVTqJSeTD4JIQQQgihiwSBhBCintgTfaXSmbt5d4r56IcTfDC4Jf1a\nu6oDJPd/Kb2f6gunLo8zuGJqaopSqawyCPSg+hYAenDQyNUcIg7vISYmhqysLIqLNWe83z8DVEVV\nCuZ+qkGegoI/ByouXLgAQMuWLbWONzc3p2nTpsTHx3Pt2jXc3d2rfQ9mZmY6g1f3Z3CpqGagxsfH\nk5ycrHXOrVu3uHfvHqmpqVozZ3XdpxDi2VTTZ1dF7OzsaNWqFadPn+bq1au4urpy69YtoqKi8PDw\n0MgMzc/PByA5OVnn+5PK8zw4KuV7tFU3YKDK2vDx8VFnx96vZcuWHDx4kJSUFHr27Mndu3e5fPky\n1tbW1c48O3PmDKGhoSQlJZGbm0tpaanG/hs3bkgQqArVee/ZGXmFPqevVvreUxv79u0jIyODoKAg\nrQzspKQkQkNDNbapgkv3Z4TdT9d2VbDqk08+0blWmhBCCCFETUgQSAgh6oHoi9lVlm4BUCph8c5Y\nHKwa0PwFc0C7PAXA7du3SU1NrVVf9PT0tAYjHkbz5s05efIkV65coXHjxnV23cdF16BRUf5Nzu5Z\njal+Gd06tqVfv36Ympqip6dHZmYmYWFhGjNAVSqbfXp/9pQqgHf/zOH7qcrGVRbo06UmGVyqGajb\ntm2r9Jq6BllV/RNCPNtq8+yqLPukV69enD59Wl0i89ChQ5SVlWllVarey4YNG6ZzPTnxJynfU64m\nAQOj/3u2VvQsU21XTZxQPYsfzJqtyB9//MG8efMwMjKidevWODk5YWJigkKhIC4ujvj4eJ2fIcSf\nqvvew33vPXVJVYqtc+fOWvvi4+O1tqmCg0lJSSiVSq2JTomJiVrnqNYTSkhIkCCQEEIIIR6aBIGE\nEKIe+OFIcrUWcYbywbSQ8GQW/qUTLi4uJCYmqmdNQ/lA/urVq7UyU6rLwsKCS5cuUVxcrFE2rLaG\nDRvGyZMn+frrr/noo4+0AhuqGbSqL7v1SUWDRplJf1BadBubTsNIc25Nkw5/znA/cuSIunxRbakG\nOG/evKkzcHbz5k1Ad9mSuqLqw48//ljjdupbFpcQ4tGozbOrsiBQ586dWbFiBQcPHuQvf/kLYWFh\n6Ovr0717d43jvLy8UCgUOgdOhXhQTQMGr/mWl1mtKGtD9QxWPSdV/9WVAazLxo0bMTQ0ZPHixerP\nbirLly/XGUQQmmrz3uNch+07OjoCEBcXp5GlmJKSwpYtW7SOt7e3x8/Pj7i4OHbv3s3AgQPV+yIj\nI7XWAwIICAjAycmJXbt20bJlS53r/iQlJeHu7o6xsXEd3JUQQgghnmUSBBJCiCfsUma+VmmSqsRe\nzuFSZj4jR45k2bJlTJ8+nZdeegkjIyNiY2MpLS3F3d2dixcv1rg/rVq1Ijk5mdmzZ+Pj44OhoSHu\n7u506NChxtdSXe+NN95g/fr1vPPOO7Rv3x5HR0fu3r1LZmYm8fHxeHt78/nnn9fq+o9KZYNGR
fnl\nA0DWjVtozXCPi4t76LY9PDw4duwYcXFxtGrVSmNfYWEhKSkpGBkZaQwe6enpAeVBQNXPD6NZs2ac\nP3+ehIQE9VpFQgih8jDProoYGRnx0ksvsW/fPrZv387FixcJCAjAyspK4zgrKyu6d+/OwYMH2bx5\nM2PGjNF630tPT0dPT089WCueXzUNGBxPLS+zm5CQQFlZmTpbViU2NhaApk2bAuXr+zVp0oTLly+T\nkpJSZUm49PR0GjdurBUAUiqVJCQkVK+jz7Havvc0oO7KQ/bs2ZNt27axatUq4uLiaNSoEWlpaZw8\neZJOnToRHh6udc7EiROZPn06K1as4NSpU7i7u3P9+nWOHTtGQEAAJ06c0JhEY2BgwMyZM/n000/5\n/PPPadGihTrgk52dTXJyMtevX2f9+vUSBBJCCCFElR5+lEgIIcRDOX1Ju5xbdc/r06cPkyZNwtbW\nlrCwMMLDw2nRogULFy6ssPRXVV555RUGDBhAeno6W7ZsYePGjRw7dqxW11J5+eWXmT9/Pv7+/uo6\n+EePHuXGjRv069eP119//aGu/yhUNmhkZFY+IFmQcQn4c5ZpVFQU+/bte+i2e/TogYGBATt37iQ9\nPV1j38aNG7l9+zbdu3fH0NBQvd3S0hKArKysh24fYPDgwRgYGLB69WqdpQVLS0tlsEqI59jDPLsq\n06tXLwDWr18PoFUKTuVvf/sbzZo144cffmDixIksXbqUdevWsXjxYqZOnco777zD2bNna9VH8eyo\nTcDgfM493DxbkJmZqbW2y9mzZzl8+DDm5uZ06tRJvX3IkCEAfPPNN1qlWpVKJTk5f/bBwcGBtLQ0\njW1KpZKQkBCuXr1ao74+j2r73pOaU7MSupWxtbVlwYIF+Pv7k5iYyM6dO8nMzGTixImMGzdO5zmu\nrq4sWrSITp06kZiYyC+//EJGRgYzZ87Ex8cH0M7wdnNz4+uvv+bll1+msLCQ/fv3s3v3bs6fP4+H\nhwdTp05Vf/4TQgghhKiMZAIJIUQllixZQlhYGGvWrMHBweGRtHG7qOr1dzz7jKvwvD59+tCnTx+t\n/fPmzdPa5ufnx44dOypty8TEhHfffZd3331X5/7Kzp8yZQpTpkzRuc/b2xtvb+9K21ZZs2ZNtY57\nVKoaNLL38icn5TQXw3/CunELDBtYcP5AJtFGufTt1V3nDNCacHBw4O2332bFihVMnjyZl156CSsr\nK+Lj40lKSsLFxUVrkKFVq1YcPXqUuXPn0r59e4yMjHBwcKBHjx616oOLiwuTJk1i2bJlvPfee7Rt\n2xZnZ2fKysrIzMwkMTERS0tLvvvuu4e6VyHE06k6z67anOft7Y2TkxPp6elYWFhUmIVqamrK/Pnz\n2bNnD4cPH+bYsWMUFxdjbW1No0aNeOutt2jTpk2t+iieHbUNGLTt8zK3sr/h+++/JyoqCk9PT7Kz\nszl69Ch6enpMmTKFBg3+XGemb9++JCQkcPDgQSZMmKDOYMvJySEmJoY+ffoQHBwMwPDhw1m+fDmT\nJk0iMDAQfX19zpw5w5UrV+jQoQMRERF1cu/Pquq89xibW9P29dka23qN/AvBXeboPD44OFj97/Og\nXr16qYPT93N1dWXWrFk6z6nos7KLiwszZ87U2n748GH1NR9kZWXFG2+8wRtvvKHzmkIIIYQQ1SFB\nICGEeMJMjWv3Vlzb80TVqho0amDjyIu93yA95iB5qckolfdoYO1In9ffYkAHz4cOAgEMHDgQJycn\ntm3bxrFjxygqKsLe3p6RI0cyZswYrUyvvn37kpmZyZEjR9i6dStlZWX4+vrWOggE5RlJ7u7ubN++\nndjYWKKjozExMcHW1pbAwEC6dOnysLcphHhKVecZpGsCg6mxQZWTEVauXFmtPhgYGDB48GAGDx5c\nrePF86e2wUpjcxsWL17Mjz/+yKlTp4iPj6dBgwa0bduWV
155BU9PT43jFQoFU6dOpW3btuzdu5ej\nR49SUlKCjY0NPj4+BAQEqI/t378/hoaG/PLLL4SFhWFkZISPjw+TJ0/m2LFjEgSqwtP6uVmpVJKb\nm4uNjY3G9piYGMLDw3F1dcXZuS5XLhJCCCGE+JNCWd0Cyc8ZhUIR2bZt27aRkZFPuitCiCfocWQC\nXcrMZ8J/jtT4vP9M6Iqbg8Uj6JEICU9m3aFzNT7vje5eBHfxrPpAIYR4ysmzSzwNtkdcZMXexBqf\nN7GfN8M7uD+CHomH9bS+9xQXFzNmzBj8/PxwdXVFT0+PK1eucPr0aQwMDPj888/x8/N7Yv0TQggh\nxKPTrl07oqKiopRKZbsn1QeZRi6EEE+Ym4MFfo1ta1SzvmUTWxlEe4Se1lmmQgjxuMizSzwNWrvZ\nPdbzxKP3tL73GBgYMGDAAGJiYjh37hxFRUVYWloSGBjI6NGj8fDweKL9E0IIIcSzTUarhBD11rVr\n15g4cSJ+fn7MnTtX5zHvv/8+165d4/vvv8fW1halUsmePXv47bffuHr1KkqlksaNG9O7d28GDBiA\nQqHQOH/IkCH4+voyY8YMNmzYQGRkJDdv3mTy5Mk663+rXLx4kc8++4w7d+4wc+ZMWrdu/VD3+lpX\nTz764QTVSc5UKHgi2SaZmZmMHz+eXr16Vbjuz7NCBo2EEKJqT8OzSzzfntaAgajc0/jeo6enx4QJ\nE550N4QQQgjxnNJ70h0QQoiKuLi40LJlS+Li4khNTdXaf+bMGS5fvkxAQAC2trYA/Otf/+Lbb7/l\n5s2b9O3bl/79+3Pr1i1WrFjBv/71L53tFBQUMG3aNM6ePUvnzp0ZPHgw1tbWFfYrJiaGf/zjHwDM\nnz//oQNAAG3c7ZgyyI8HYlRaFAr4YHBL2rg/vcGG8ePHM378+CfdjUqpBo1qoq4GjZYsWcKQIUPI\nzMx86GsJIcSj9Dw9u8TT67WunlX+jarUl4CBqJy89wghhBBC1IxkAgkh6rWBAwcSGxvL3r17efPN\nNzX27d27F4ABAwYAcOTIEQ4fPoyHhwcLFizAxMQEgNdff52PPvqIw4cP4+/vT7du3TSuc+nSJXr0\n6MHkyZPR19evtD8HDx5k2bJlODk58dlnn9XpOkH92zTG0dqUkPBkYi9rz1ht2cSW4C6eT+yLrK2t\nLStWrMDU1PSJtP+4vdbVk6n/2Uv8z0tp6NGaJp2HVXisDBoJIZ5X9f3ZJYQqYLBkV1ylmSMSMHi6\nyHuPEEIIIUT1SRBICFGvdezYEVtbW/bv38/YsWMxNDQEoLCwkPDwcJycnGjVqhUAv/32GwDjxo1T\nB4AATExMGDduHJ988gn79u3TCgIZGBgwfvz4KgNAP/30E+vXr6dFixbMmjULc3PzurxVoHygoo27\nHZcy8zl9KZvbRaWYGhvQ2s3uiZcmMTAwwMXF5Yn24XFq427HhD4tmLS98uNk0EgI8byrz88uIUAC\nBs8qee8RQgghhKgeCQIJIeqdB7/ItQnoQtjuXzh27Jg6gHPgwAGKi4vp16+fep2fCxcuoFAo8PPz\n07qmr68venp6XLhwQWufo6MjVlZWlfZp1apVHD9+nM6dO/Phhx9iZGRUB3daMTcHi3r35VXXmkBL\nliwhLCyMNWvWEBUVxc6dO0lLS8PU1JSOHTvy17/+FTMzMwDi4uKYOXOm+npDhgxR//zgOkMxMTFs\n27aNc+fOcffuXRwcHOjcuTMvv/yy+nqPQ08/F5o722Bgqzv7SQaNhBDiT/Xx2SWEigQMnl3y3iOE\nEEIIUTkJAgkh6o3oi9n8cCRZa/He4ttmXE3NZd3mbeog0N69ezEwMKB3797q4woLC7GwsMDAQPut\nTV9fH0tLS27duqW1z8bGpsq+JSQkANChQ4dHHgB6Gv33v/8lKiqKDh060KZNG3UJv/T0dL788kug\nPNgWFBREaGgoAEOHD
lWf7+Hhof55z549fPvttxQXF2NoaMjNmzeJj49n165dfP3118ybN48+ffqo\njz969Cg7d+7k4sWLlJaW4uTkRLdu3Rg+fLg6c0xFtRbR8uXLCQkJITw8nNzcXOzt7enbty+jRo1S\nBxVDQkLYtGkTVqZGkHeB2+FnybtTTNk9JcOD3+Qvo4eRn3GJmZP+SlBQEO3bt2fTpk0kJSVRUFDA\nmjVr1OUCz58/z5YtW0hISKCwsBAbGxv8/f155ZVX1OtZCSGEEOLRk4CBEEIIIYR43kgQSAhRL+yJ\nvlJhrXYjU0sUth7sPPgHG/dG0K6xBZcvX6ZLly4aGTxmZmbk5+dTWlqqFQgqKysjLy+v1uvZfPzx\nxyxdupSlS5dSWlpKv379anWdZ1VSUhLffPMN9vb2QPnv++OPPyY2NpZz587h5eWFg4MDwcHBhIWF\nARAcHKx1nczMTP7zn/+Qm5uLiYkJBgYGDB48mEaNGrFz505OnjzJwoUL1UGg9evXs2XLFiwtLenW\nrRsmJiZERkayfv16oqKimDNnjtbfQmlpKZ9++ik5OTm0b98ePT09jh8/zrp16ygpKSEoKAgAPz8/\nCgsLCQ0Nxd3dnY4dO6qv0bFjR9wcLIjL+PP+t2zZgre3N3369CEvL0/d7smTJ5k7dy4AnTt3xsHB\ngfPnz/Prr79y/PhxvvrqKxwdHevwX0MIIYQQQgghhBBCiHISBBJCPHHRF7OrXKzXzqs9uVfPMH/F\nRvr7lWdX9O/fX+MYDw8PYmJiSEhIUK8TpJKQkMC9e/do2rRprfpob2/P/Pnz+eSTT1i+fDmlpaUM\nGjSoVtd6FgUFBakDQFCeedW7d28SEhLUQaDqOHToEPn5+RQWFuLs7MyCBQto3LgxACNHjuTNN9/k\n9u3blJSUcOHCBbZs2YKdnR3//ve/1Rldb7zxBl9++SUnT55k27ZtjBkzRqONnJwc3N3d+eKLL9RZ\nXcHBwUyYMIFffvmF0aNHY2BggJ+fH46OjoSGhuLh4aEzaKUSHR3Ne++9p/U3effuXRYvXkxZWRnz\n5s3Dx8dHve+nn35i3bp1fPPNN8yZM6davx8hhKivxo8fT2ZmZoX77y/7WVRURGhoKOHh4aSlpaFQ\nKGjSpAlDhw6la9euGuepSolKxqUQQgghhBBC1I4EgYQQT9wPR5IrDQABWLzgjollQ26kxBCaqkfP\nds1o2bKlxjF9+vQhJiaGdevWMW/ePIyNjYHywaa1a9eqj6ktW1tb5s2bxyeffMJ3331HcXExI0aM\nqPX16rsHa+a7mFX8j/Tiiy9qbbOzK18np6CgoNptbQ07wflL13C0b8irr76qDgABmJub07RpU+Lj\n47l27Rq//fYbAK+88opGST99fX3Gjx/PqVOn2Ldvn1YQCGDChAkaZf2srKwICAjgwIEDpKam0qRJ\nkyr7fD8PDw+tABDA8ePHyc/Pp2vXrhoBIIARI0awe/duTp8+TVZWlkYQTQghnjZDhw6lsLBQa3tE\nRAQXLlxQP5MLCwuZOXMmKSkpNG3alD59+nDv3j2io6NZuHAhly9fZuzYsVrXkYxLIYQQQgghhKgd\nCQIJIZ6oS5n5WmsA6aJQKLDzbM+1yL3cLIK2HbtpHdOtWzeOHz/O0aNHeffdd+nUqRNQPhCfkZFB\nly5d6N69+0P118rKirlz5zJ79my+//57SkpKdAYZnmYVrc1UVJDL1as3aXZDO6hjbm6utU1fXx+A\ne/fuVbut5KRUMm7kkq804dcUaHwxmzbudurjVcGewsJCLly4AKCV9QXg7OyMnZ0dGRkZFBYWYmZm\npt5nZmaGk5OT1jk1CVo9qKJMp8r6qK+vj6+vLwcOHCAlJUWCQEKIp9qwYcO0tp0+fZr//e9/ODk5\n8dprrwGwatUqUlJSGDduHKNGjVIfW1xczJdffsmWLVsIDAzUWCsOJONSCCGEEEIIIWp
L70l3QAjx\nfDt9Kbvax9p6tEKhUKBnYIhFE1+dx8yYMYOJEydiaWnJ7t272b17N+bm5vztb39j+vTpddJnCwsL\nvvjiC1q0aMGGDRvYuHFjnVy3PtgTfYWPfjhRYWAu704xOyOvsPf01UfSlr6RMUrlPZRlpVy4WcZH\nP5zQaOvmzZsAmJqacvv2bQCNLKD7qcr/PDgz/f6A0P2qE7SqiLW1tc7tqrar6mNtAk9CPCs++ugj\nhgwZ8ljbDAkJYciQIcTFxT3Wdp8llzLz2R5xkZDwZLZHXORSZr7G/suXLzNv3jxMTU357LPPsLS0\nJD8/n4MHD+Lp6akRAAIwMjJi3LhxKJVKDh8+rNVeVRmXXbp00Zlx6eDgoM64FEIIIYQQQojnkWQC\nCSGeqNtFpdU+9k5uBkqlEhvXFigNTHQeo1AoGDhwIAMHDqzWNXfs2FHp/ilTpqjXMLifqakpX331\nVbXaeFpUZ20mAJSweGcsDlYNatWOnp4eGTcLdLZlavMCCoUepUV3KLmdj76hsbotL4cGpKSkYGRk\nhKurK6ampkB5YEhXZk9OTnlwqaKgT11SKBQ6t6vazs3N1bn/cfZRiCdlyZIlhIWFaazfIp5eFWWL\nAvg1tuW1rp40sdLj888/p6SkhNmzZ9OoUSMAzp07pw60h4SEaJ1fVlYGwNWr2hMNJONSCCGEEEII\nIWpHgkBCiCfK1Lj6b0MZCb8DYN/Mv0bnieqpztpMKkolhIQn41yLdiwsLDgUnYybTwl6BoYa+2zc\nW2JoYkpxYS43LkTj3LaPuq0X78Ry+/Zt+vbti6GhIR4eHly4cIH4+HitIFB6ejrZ2dk4Ojo+VIBF\nT688YbY22UGAupxRXFyc1npUZWVlJCQkANC0adNa91GIp93UqVMpKip60t0Q1bAn+kqlkwXiruQw\nY2045ud3UpafzbRp0/D29lbvz88vzxZKTk4mOTm5wnbu3r2rtU0yLoUQQgghhBCidmQUVQjxRLV2\ns6t0/52bGdxKTeZ2Thp5aeexcvbCzM6lyvNEzVR3bab7xV7OoQHaA3VVcXb3InfXUS4c/AEzhybo\n6enTwMYRK5dmGJtb07jTMJL3reX8gR+4cyuLBtaOnN17maamt/Fq6qYuG9WnTx9+++03Nm/eTIcO\nHbCysgLKAzZr1qxBqVTSt2/fGvfvfubm5igUilqXEerUqRMWFhYcPnyYQYMG0axZM/W+0NBQMjIy\naN26tcxOF881+ft/OlQnW1R57x4Xw7eSl3aOD99/h65du2rsVwXlhw0bxltvvVWj9iXjUgghhBBC\nCCFqR4JAQognys3BAr/GthUGIG7npJN2Ogx9IxNsmvjg6j+Qlk1scXOweMw9fbbVZG2m+6XmFFZ9\n0ANcW3fHziuWvGvnKMi6ivLePRp6tMbKpTxA4tymN2XFd7lyPJTUU3sxMDHFyMya0uYe6OnpsXLl\nSubOnUuLFi0YNWoUW7du5b333iMwMBATExMiIyO5fPky3t7ejBw5slb3pWJiYoKXlxcJCQksWrQI\nZ2dn9PT0CAgIwM3NrVrnT548mfnz5/OPf/yDl156CXt7e86fP090dDQ2Nja89957D9VHISqTmZnJ\n+PHj6dWrFy+//DJr164lISGBkpISPDw8CAoKok2bNurjCwsL2bt3L5GRkaSmpnLr1i1MTU1p3rw5\no0ePpnnz5lptDBkyBF9fX2bMmMGGDRuIjIzk5s2bTJ48mSVLlqiPGz9+vPpnBwcH1qxZA5SvCRQf\nH6+zPGd0dDQ7duzg3LlzFBYWYm1tTdOmTRk8eDCtW7cGICwsjCVLljBlyhR69epVYf/mzZtX5e/r\n+PHj/P7775w7d44bN24A4OLiQq9evRg8eLBWIEJV6m7VqlWcPHmSffv2kZaWhpeXV7Xae5yq+j1V\npTrZotci93Ir9RwNX2xDtqWP1n4vLy8UCgWJiYk
1br8iknEphBBCCCGEEJWTIJAQ4ol7rasnH/1w\nQufgUsOmrWnYtLX6/xUKCO7i+Rh793yoztpMxubWtH19tsa2XiP/QnCXOTqP9/Pz0zmoW4o+jTsM\ngg6DKmyrccBgGnq0IuPMHxRmXqGs5C5KpQI7OzuN7J5x48bh4eHBzp07OXDgAGVlZbzwwguMHTuW\n4cOHY2Dw8I+5Dz/8kFWrVhEVFcWRI0dQKpXY2dlVKwgEEBAQwFdffcX//vc/oqKiuH37NtbW1gwY\nMIBXX31VXapIiEcpIyODadOm4ebmRv/+/bl58ybh4eHMnj2b6dOn06VLFwCuXbvGhg0b8PHxwd/f\nH3NzczIzM4mIiCAyMpJZs2bRrl07resXFBQwbdo0TExM6Ny5MwqFAmtra4KCgjh+/DgXL15k6NCh\n6myM6mRl/PDDD2zevBkTExM6deqEnZ0dOTk5nDlzhkOHDqmDQHVp7dq16Onp0axZMxo2bEhhYSGx\nsbGsXLmS5ORkpk6dqvO8lStXkpiYSPv27Wnfvr26lGRduz+op2u9ukelOtmimWeOk3U2AksnD1z9\nBxF7OYdLmfkakzasrKzo3r07Bw8eZPPmzYwZM0brd5Weno6enh6Ojo7V6ptkXAohhBBCCCFE5SQI\nJIR44tq42zFlkF+VZWYUCvhgcEvauEspuLpW2zWWanNedc8xs3fFw95V/f8T+3kzvIO71nFdu3bV\nKjlUEVXmgS7BwcEEBwdrbXdycuLTTz/VeU5Fga4HeXp68vHHH1erj1OmTHmsg7uPS0hICJs2bWLu\n3Ln4+fk96e6os1Iq+5uoS7UZvH/YzI37xcfHM2LECN588031tkGDBjF9+nSWL19Ou3btMDU1xcXF\nhXXr1mFpaalxfnZ2Nh9++CGrV6/WGQS6dOkSPXr0YPLkyejr66u3t2vXjszMTC5evMiwYcNwcHCo\nVn+jo6PZvHkzjo6OLFiwgIYNG2r151GYPXu21hpjSqWSJUuWcODAAa0gg8qFCxdYunRptQMXT0LH\njh3561//yq5du1i5ciWlpaU4OTnRrVs3hg8fjqHhn2u0qV4fy5cvJyQkhB+2/Up8SiqGplY0fLEN\njt6BGllRJXcKSI3aR1lJEfmZVzj134+4V1rMoG02+LfypnPnzrRs2ZKOHTvyt7/9jbS0NH744QcO\nHjyIt7c31tbW5OTkcPXqVZKTk5k+fXq1f5eScSmEEEIIIYQQlZMgkBCiXujfpjGO1qaEhCcTe1l7\ntnHLJrYEd/GUANAjUts1lmpz3uNsS4jnzaXMfE5fyuZ2USmmxga4mJVH1s3MzAgKCtI41tPTk+7d\nuxMWFsYff/xBr169KszQsbOzIzAwkB07dpCVlaWVVWFgYMD48eM1AkAPQxVcHT9+vFYASNWfR+HB\nABCUr0UzdOhQDhw4QHR0tM4g0KhRo+p1AAhg69atbNmyBUtLS7p166Yun7l+/XqioqKYM2eORvZk\naWkpn376KTk5Obh6+pCucCT3WhJp0WEoy8pwatlNfey9slKKC29x52YGKBQYmpij0Dfg5o1s9uzZ\nQ1hYGBMmTKBjx46Ympoyf/58E8ICxAAAIABJREFU9uzZw+HDhzl27BjFxcVYW1vTqFEj3nrrLY0S\nhdUhGZdCCFH/3D8BJjg4mLVr13L69Gnu3r1LkyZNCA4Oxt/fX+u8I0eOsGfPHlJSUiguLsbR0ZHu\n3bszcuRIjQkL8GyXcRVCCCHqkgSBhBD1Rht3O9q422kNYrZ2s5M1gB6xqtZm0qW2azM9zraEqC9s\nbW1ZsWIFpqamj+T60Rez+eFIstbrqqggl6tXb9I98EUaNGigdZ6fnx9hYWGkpKSos43OnDlDaGgo\nSUlJ5ObmUlqqWS7yxo0bWkEgR0dHrKys6ux+zp49i0Kh0Jl19Cjl5+ezbds2Tp06xfXr17l7967G\nftUA04O8vLw
eed9U2XRQniX266+/EhMTg52dHbNmzeKf//wnxsbGuLu7qwfXGjduzFtvvUVBQQE5\nOTnk5OTw9ddfM3z4cNauXcvly5e5cOECMTEx+Pj4UFxczK5duwgLC6OkpARra2t69+5NAyMDXvDt\nwgstu3Em9Buyko5j7uhG1tkTFGZdpbjwFkX5OTSwdcJ35AeYNWwElGdwuhvnMWvWLI2BOAMDAwYP\nHszgwYOrvO9HkXEphHj2PKlymaJqmZmZTJ06lRdeeIGePXuSn59PeHg4c+bM4YsvvqBly5bqY5cu\nXcr+/fuxs7Ojc+fOmJmZcfbsWTZu3EhMTAxz5szRmHBS38u4CiGEEPWFBIGEEPWOm4OFDPg/AZWt\nzfSgh12b6XG2JUR9YGBggIuLyyO59p7oK5WW08y7U8zRC7fYe/oq/Vq7auyztrYGoLCwEIA//viD\nefPmYWRkROvWrXFycsLExASFQkFcXBzx8fGUlJRotWFjY1On91RYWIi5uTlGRkZ1et2q2vzggw/I\nyMjAy8uLnj17Ym5ujr6+PoWFhYSGhuq8d6j7+9fFz89P3Q93d3datGhBdnY2rq6ubNq0CT09PUxM\nTOjSpYt6cG3gwIEAZGVlUVRURKNGjbCwKH++xsTEYGxsjK+vL8ePH+err77C3d2dS5cuoaenh6Gh\nIdbW1uTl5XEt/RhJWYfw6vtXrFyakR53hLN7VmFgbIqVSzMKM69w52YGCj09Ug5vplm/8RiZWf3f\nJA53AgICiIiI4M6dOzqDkUIIIZ5dcXFxBAcHa2Qkd+vWjdmzZ7Nt2zZ1ECgsLIz9+/fTqVMnpk2b\npvEZQDURYteuXQwdOlS9/Vku4yqEEELUJQkCCSGEAB7v2kyyDlT9d/+M2tGjR7Nx40bi4uLIy8vj\nyy+/xM/Pj7S0NDZv3kxMTAx5eXlYWlrSqlUrXn31VRo1alTttq5du8ZPP/1ETEwMubm5mJmZ0apV\nK4KDg3F2dq71PSiVSnbt2sWvv/7K9evXsbCwoFOnTowdO1br2MLCQvbu3Utk5P9n784Doi73xY+/\ncdj3HRFkc2UTQRTXRKCslDJPGaCZ5bF70ltueM7F6njuqfSYZupx6VaWmrn8NMotVxTBJVD2RRQE\nXAAZEJRNkWV+f3BmZJwBxjXR5/WXftdnRuE783yez+eTRFFRETdu3MDQ0JC+ffvyxhtv0LdvX8Wx\n165d45133sHV1ZUVK1aovfc//vEPkpKSWLVqFc7Ozu2uUC4pKWHDhg2kpqbS2NiIq6srEyZM0Og1\nphSUd/hzBNBws5av9qRja2ag9PN0/fp1AEUZuE2bNqGjo8NXX31F9+7KAaPVq1eTmZmp0bgelJGR\nEdXV1dy+fbvDQJA8w6SpqUllnzy4pYmDBw9SWlpKeHi4Sn+wnJwcdu3a1eEYHiVvb2/s7OzYtWsX\nbm5u/OlPf+K3336jubmZiIgISkpKOHr0KH/6058YOXIk8/7nI77+YRNGppbU3m6mrroGFxcXAGpq\narhw4QLe3t5cvXoVLS0t6urq0NHRwdnZmR49egAo3vv3wsKYv+jfXE7Yg46hKbeuS7Fw9sT9lRno\nGppybv86DK26YebQC2lOAmf3rMV/RAgnD9/gJHDjxg2am5spKiqiZ8+ej/y9UicqKorMzEyNsooE\nQRCEh8fW1pY333xTaZufnx82NjacP39esW3Xrl1IJBJmzpyp8uwPCwtjz549xMbGKgWBnuYyroIg\nCILwMIkgkCAIgqDwOHsziT5QnUNJSQlz587FwcGBwMBA6uvrMTQ0JDc3l48//pibN28yaNAgnJyc\nuHLlCrGxsSQkJPDZZ5/Rq1fHGVxJSUksXLiQpqYmBg0ahL29PeXl5Zw6dYozZ86wcOFCxYT0vfr2\n22/ZvXs3lpaWvPjii0gkEhISEjh//jyNjY1K/U+uXLnCjz/+iKenJwMHDsTY2
BipVEpiYiJJSUl8\n8sknitJkVlZW9O/fn5SUFAoLCxUT63IVFRWkpKTQs2dPnJ2d2x1jcXExkZGRVFdXM2DAANzc3Cgp\nKeHzzz/XqBTaT3G5GmXU3awoofF2PZvjc5V+pjIyMgBwc3MDWv69nZycVAJAMpmMrKysjm+khrzE\nirogTVv69OnD6dOnSUpKYsiQIe0ea2xsDLRku9wtNzdX43sWFxcDMHToUJV9jyv4dT/kk2tHjx7l\n6NGjRB+M54qOGznljVQU5NJ94Mtcv51PzY0acq9Wk1tyA4P0dGQyGT4+Ply9ehUdHR3MzMyoqqri\n+eefJycnBwAvLy+OHDnCwIED8e7tzMn0XCR6RsiQYesxFF1DUwAa6+sAuFGUS9PtW1RezKTSWo8t\nV84ojfXu8nqCIAjC06OtvoSurq5qy61ZW1srnjf19fUUFBRgamrKzp071V5fR0eHy5cvK217ksu4\nCoIgCMKTRASBBEEQBCWPszeT6AP15MvOzuaNN95g8uTJim0ymYzp06dTV1fH3LlzCQwMVOyLj4/n\niy++4Msvv2Tt2rXtZkjU1NSwZMkS9PT0WLx4sVLg4eLFi0RGRrJy5co2s23ac/bsWXbv3o29vT1f\nfvmlogTWW2+9xfz586moqMDW1lZxvKOjIxs2bMDU1FTpOuXl5cydO5fvvvtOKSgTEhJCSkoKR44c\n4d1331U6JzY2lubmZoKCgjoc59q1a6murmbatGlKK1vlgbT2FEqrNe6t1Xj7FlczjpGu8wKF0mpc\nbE3Izc0lNjYWIyMjRaDF1taW4uJiKioqsLS0BFr+vTdv3qwy8aIp+XtfVlamdsWuOqGhoZw+fZp1\n69bRu3dvrKyslPZfu3ZNsa1nz55oaWlx7NgxXn/9dfT09ICWiaEffvhB43HKVwRnZGQoBfby8/PZ\nvn27xtd5mFr/brx1u4mq6+UUVdSScfEal8pqgDuTaz4+Pkhv3GT11v24jQyjSxdtmhtuY9LVleqS\nfGpKC6m8Xsm6mLMMOdsy6ebj48OBAweQyWSKwKiPj49iUs7auiVgePPmTUYO8afkainnCosAqKso\npiQ9FoBbN8qpr6nEeXAoteWXsetygw3fff2HZf2oM2fOHOrr6//oYQiC8BhIpVLWr19Pamoqt27d\nUvRJGzhwoOIYeXmxhQsX4u3trXK+uuzd5cuXExMTw3fffcfp06cVmcYWFhaMHj2aN954Ay0tLY4f\nP050dDSXLl1CX1+f4cOH8+6776pkt/z++++cOHGC8+fPK4IVjo6OBAcHM3bsWJXPUPL7r1u3juTk\nZPbs2UNxcTGGhoYMHjyYd955R5HZ+zh11JewT3/150kkEmT/WclSU1ODTCbjxo0biv53HXnSy7gK\ngiAIwpNEBIEEQRAEtR5nbybRB+rJZW5urlTDHVpKY125coW+ffsqBYAARowYwZ49e8jOziYrKwsv\nL682r33kyBFqa2v5y1/+opJ54uzszOjRo9m5cyeXL19W2a9O6wnzozu3UVffyIQJExRBCGgpb/X2\n228zf/58pXPbmjSxtrZm2LBh7N69m7KyMmxsbAAYPHgwRkZGxMbGMmXKFKUVrjExMWhrazNy5Mh2\nx1teXk5qaip2dnaMHTtWaV9AQABeXl7tZqCkFpa3e/3WTOycuZaXQm15Mcsaz+JmoU18fDzNzc3M\nmDEDQ0NDAMaNG8fq1av58MMPGTZsGBKJhLNnz3Lp0iUGDRpEYmKixveU8/HxITo6mlWrVjF06FAM\nDAwwMjJSec2t+fr68uabb7Jt2zbef/99Bg8ejI2NDZWVlWRnZ9O3b1/FxJylpSWBgYEcPXqUDz/8\nkIEDB1JXV8eZM2fw9PQkPz9fo3EGBQURHR3Nt99+S0ZGBt26daO4uJjTp08zZMgQ4uPj7/m13692\nJ9TKa6jLKeXCxlNKk2uXq6H4pg6NlReRN
Tdz+2YVMmSYdHXFxKEn5XnJNNRWIZPBr4eO09dWl169\nelFXV0djYyOGhoZIJBKliTF58+3m5mYsLS2xNTPgurE2V250oSI/jRtXWsr41FeXU19znevZR3Fz\n7IqZocETl/Uj/9kVBOHpJpVKmTNnDl27diUoKIjq6mpFn7TPPvtM0X/mQXz//fdkZGQwaNAgfH19\nSUhI4Mcff6SxsRETExPWr1/P4MGD8fT0JDU1lb1799Lc3Mz06dOVrrN+/Xq6dOlCnz59sLKyora2\nlvT0dL755htyc3OZM2eO2vv/8MMPJCcnK+6fnp7OgQMHFJnEj5MmfQn3JF3ieTV9CVuTfw5zc3PT\nePHPk17GVRAEQRCeJCIIJAiCIAhCuyU8dHR0lI7Ny8sDaHMipV+/fmRnZ5Ofn99uEEiebVBQUMDm\nzZtV9hcVtWQcdBQEUjdhnnMihbqKa/ycdQurHuVKJdA8PDzUliU5e/Ysu3btIicnh+vXr9PY2Ki0\n/9q1a4qJZF1dXYYPH86BAwdITk7G398faHlvLl26xJAhQ1Syiu4mD060NR5vb+92g0B19Y1t7rub\nrpEF3QeNoTglhtPHj1Jkrk+PHj0ICwvDz89PcdyLL76Ijo4OO3fuJCYmBl1dXTw9PZk5cyYnT568\nryCQn58fU6dO5cCBA+zcuZPGxkZsbW3bDQIBTJo0ib59+7J7925Onz7NrVu3MDc3p2fPnipZVh98\n8AHm5ubExcWxd+9ebGxsCA0NZfz48Rw/flyjcVpaWrJ48WLWr19PdnY2ycnJODo68v7779O/f//H\nFgTqaEKttdaTa4fSrmBs50J5bhJ1FcXcrr1OF4kOeiaW2PQeyMUTv3C79jq3qiu5eaOcWkcfAEU/\nBh8fH4qKihR9ou5WUdHy82VpZsz1azr839p/g6UzdfWN1F0vY/vaRbh078bf//53lV5ejY2NnDt3\nDk9Pzwd4Z9qWkJDArl27uHz5MtXV1ZiamtKtWzdGjBjByy+/DKj2BGpsbOSvf/2rorRlQECA0jWX\nLVvG0aNHmThxImFhYY9k3IIgPHwZGRlEREQoLWAZOXIkCxYsIDo6+qEEgfLy8vj3v/+tyEiNiIhg\n2rRpREdHo6enx/LlyxWfWxoaGpg5cyaHDh1i4sSJmJmZKa6zYMEClQxZmUzG8uXLOXLkCGPGjFHb\nzyYnJ4dVq1YpPpM0NTXx0UcfkZ6ezvnz5x9bqTNN+xIiQ21fwtb09fVxcnLi0qVLVFdXKy3gaUtn\nLeMqCIIgCH8EEQQSBEEQhGdYRyU8evvoqpxTV9fS/0NeLuxu8u21tbXt3ru6uhqAAwcOtHvczZs3\n29zX1oR5U0NL2afcaw1E/ZTA7LH9FCtQJRKJSoDm1KlTLFq0CF1dXfr374+9vT36+vpoaWmRkZFB\nZmamSkmR4OBgDhw4QExMjCIIdOTIEcW+jsjfH3Nzc7X7OypVYqh3bx/j9M1scAsM4/3RHowb5Nrm\nccHBwWrH7+LiorLSFlBMqrdn3LhxjBs3Tu2+RYsWtXmev78//v7+ign8tsq76ejo8O6776qU5mtr\nfBEREWpfS/fu3fnkk0/U3kPddWbNmqVUKuhBdTShJl+5LJM139n4n8k1mQxMurpSnptEVfEFGmqr\n0NZryfAytumOiX0PKvJTuXj8Z0DG5eKrTJn2F8rKyjA3N+f1119nxYoVij5RrTU1NSl6Qrm6upKf\nn8/NyhLCXwz8zxG96GE8h5UrVzJjxgz8/PxwcHCgqakJqVRKdnY2pqamfP311w/pnbpj//79rF69\nGgsLCwYNGoSpqSnXr1+nsLCQw4cPK4JAd9PW1uavf/0rM2fOZMWKFaxcuVJR/u7w4cMcPXoUHx8f\nlWbmgiA82eR90lrz8/PDxsZGEfR+UGFhYUplSo2MjAgICODw4cO89tprSgtXdHR0GDFihKKsausg\nkLoSq
VpaWrzyyiscOXKElJQUtUGg8PBwpexGiURCSEgIWVlZjzUIpGlfQgCZDJW+hHcbN26cogzw\n7NmzVbK0a2pqKC0tVfSKfBLLuAqCIAjCk0oEgQRBEAThGaVJCY+9yZd44a4SHvLSYZWVlWrPk2cM\nyI9ri3z/v//9b6Uv75pqb8JcotPSF6bxVi0SHV2lFahNTU1UVVUpJnwBNm3ahI6ODl999ZVK1tHq\n1avVrih1d3enW7duJCYmUltbi56eHseOHcPU1FSpf1Bb5JMbbWVetPX+yvV3aXsi5UHOy8jIYP78\n+WrLqwjKQkND8fLyajeQdS86mlCT6BqgpaVFQ90Npe3yc0zsXNHS0qL8/Gmam5vQNbwzgWbnPoQb\nl8/SWF9Lc1MjdeVFlN0wo2fPnjg7OzN8+HC+//57jh07ho6OjtJEZWxsLKWlpfTv3x9bW1uOHDlC\ndHQ0zz33nCLrZ9SoUbi6uvLzzz9z/PhxUlJS0NfXx9LSkmHDhjFixIiH8h7dbf/+/Whra/Pvf/9b\nacwAVVVV7Z7btWtXPvjgAxYvXsySJUtYtGgRRUVFfP3115iZmTF37lxRMkgQnlDtZTCry661trZW\nZCA/KHX9zuQLYNTtkweMysuVy7hWV1cTHR3NmTNnuHr1qkoZTXmfILk9e/aQkZGh9h7yzzQ1NTUd\njr+9fkiaupe+hHLpFysUfQnVef7558nLy+O3335j2rRp+Pr6YmtrS3V1NaWlpWRmZhISEsKMGTOA\nJ6uMqyAIgiA86UQQSBAEQRCeQQ9SwkO+AlNdxkDr7fLj2tK3b19OnjxJVlbWfQWB2pswN7S0p66i\nhBrpRfRMLJRWoGZnZ9Pc3Kx0fElJCU5OTioBIJlMpsiAUCc4OJgff/yR+Ph4zM3NqaqqIjQ0FG3t\njj9iubm5ASjGc/ekVVvvr5yLrQneTpb3NAnTz9lS9N96QmkyoSbR0cXQyoEa6SUuJ+6lvuoaVSUX\nuFlZioGFHdr6hhiY21FXebXleL07gVh9M2u6aOuiZ2KJoaU9Xn+ay1uj+hCz/l8t+/X1mTlzJv/6\n179oamrCw8OD+Ph4zp07R1VVFc7OzsyYMYOuXbvi5eXVbtaPnZ3dI8n6afN9kUgU/Yta66gkI8Dw\n4cNJS0tj//79rF+/nuTkZG7fvs1HH30kGocLwhOoowxmeZ+0u0kkEmSapq10QF0fQfnvIHULYOT7\nmpqaFNtqa2uZPXs2paWl9O7dm6CgIIyNjZFIJNTW1rJr1y6VDGQ5Y2PjNu9x9+ebh+Xukpr30pew\ntdTC8nY/h7z//vv4+/uzb98+0tLSqK2txdjYGBsbG8aPH8+oUaMUxz4pZVwFQRAEoTMQQSBBEARB\neAY9SAkPd3d3HBwcyM7O5sSJEwwbNkxx7IkTJ8jKysLBwaHD/h8hISFs27aNLVu20KtXL5XyJTKZ\njMzMTLWrVDuaMLfs0Z/yvGSuZsZj5tgbbT1D0i9WcP7KNTZs2KByvK2tLcXFxVRUVChW88pkMkX5\nlrYEBQWxadMmjhw5oijrFhIS0u7rlrO2tqZ///6kpqayZ88eXnnlFcW+hIQEjerZT3yuF1E/JWj0\nb6mlBREjemk0NuHx03RCzWXYa1w5c4Dqq/nUV1e0ZPVUlGBg0VIWx7irK3WVV9HWM6CL5M5Hfa0u\nEiS6Bi3H2LmgpaWlUlIwICCAL774gv/3//4fycnJnD9/nps3bzJ8+HDmzp2r+NmQZ/38+uuvpKen\nP7asH7nWGQAGDu5UZp9j+vTpPPfcc3h5eeHu7q6SFdSeadOmkZOTwy+//ALAG2+8ga+v76MaviAI\n90mTDGZ5n7TWGczqyBdetA7MyGmSTfOgDh48SGlpqdqs25ycHHbt2qVyTmBg4CMfl6Y06UuoZ2yO\n36QFbZ7XVhbtwIEDGThwoEbj+KPLuAqCIAhCZyGCQIIgCILwjHnQEh5
aWlrMnj2bTz75hMWLFzN4\n8GAcHR0pKiri1KlTGBgYMHv27A7LKJmYmBAVFcXnn39OZGQkPj4+ODk5oaWlRVlZGTk5OYpSKXfr\naMLc2KY7tn0DkOYkcHbv11g4eYBWF/47aRNebvYq/YzGjRvH6tWr+fDDDxk2bBgSiYSzZ89y6dIl\nBg0aRGJiotr7WFtb069fP9LS0pBIJLi4uCgyfDTx/vvvExkZybfffktKSgqurq6UlJRw6tSpdu8r\n5+tqzawx3m1OisknYLS0YPbYfu3W4n/cYmJiSExM5MKFC1RWVirev5deeklppW9rDQ0NbN26ldjY\nWCoqKrC2tiYoKIg33nhDbfZVWloa0dHRnD9/nlu3bmFra8vQoUN5/fXXVVZyT506FYB169apXOfu\n0jkxMTEsX74caGk+HRoaqjj2fsvoaTKhBqBnYkmPUeFt7ncc8AKOA15Q2W7Voz8jZn2jtK2/izXj\n7nq9vXr14qOPPupwHC4uLo99Ik19BoAjNxxGcONqJhe37sDUYCdaWlp4eXnxzjvv0KtXx4FPXV1d\n/P39KSwsRCKRMGbMmEf3IgRBuC8PksGsjvwZcHeJNoC8vLwHGapGiouLARg6dKjKvrYWgRgbG2Ng\nYPBIx6Wpe+1L+KDnCYIgCILwYMQTWBAEQRCeMQ+jhEefPn346quv2LZtG6mpqSQmJmJqasrIkSMJ\nCwtT9AnpiI+PD6tWrSI6Oprk5GSysrLQ1tbG0tISHx8ftZMjoNmEucOA0eiZWFJ2/jTluWeQ6BkS\nEPQcn/4zkg8//FDp2BdffBEdHR127txJTEwMurq6eHp6MnPmTE6ePNluMCY4OJi0tDSampoICgrS\n6HXLdevWjS+//JL169eTlpamaG780UcfUVVVpXLf48ePs2fPHgoKCmhsbMTe3p6RI0fyzwnD2f57\nIekXWybHs35dAUDfl/8LSdEZ9GsusfbMt1ybMEERoLh+/TobN24kMTGRmzdv4uDgwKuvvoqtrW2b\n45UH5X7//XekUina2tr07NmT119/XSVzQh4omTVrFubm5uzYsYP8/Hzq6urYvXs3a9aswcnJCS8v\nLywsLKiurubMmTMsW7aMoqIiJk2apHL/xYsXk5ubqwjUJSQksHnzZnJzc/nkk0+UAo/79+9nzZo1\n6OnpMXz4cMzNzcnIyGDHjh0kJCSwZMkStSV9NOHq6kp4eDhbtmzB1taW4OBgxb777a/wuCfGOltp\nwPYyAKzcfMDNh6aGW7zQRw8qCjh06BALFixg7dq1HWYFZWdnEx0djampKVVVVaxYsYL//d//Ff2A\nBOEJ8iAZzOrIs48PHz7MqFGjFOXUysvL2bJlywOPtyN2di3Zm/LnPoBUKiU8PJyKigqsrKyIiYnh\nzJkzVFVV8fnnnyt6At2toaGBvXv3kpaWxtWrV4mJiSEwMJCwsDDGjx/fbu+6EydO8PPPP3Px4kV0\ndXXx9fVl6tSpij5GUqlUsUgCUCx6qKtvpOCWMb2enwLAzcpSrmYdp678Cg03q+mio4euoSnGts50\n832eLv95f++3n6EgCIIgCA9GBIEEQRAE4RnzMEp4ADg4ODBnzhyN7hkREdFmdoStrS1/+ctfNLqO\nnCYT5lpaWtj0GYRNn0GKba+M9sDIyEhttkdwcLDSZL6ci4tLu5kdo0aNajNzRc7W1lZtWRIAe3t7\noqKi1O5rPZ6NGzeyfft2RbBNX1+fpKQkNm7ciJdXMos+/ZQrFTdJLSxn+e9maNGMs/QIWl3q6f/c\nEAwNDRWTTlVVVcybN4+rV6/i4eGBh4cHlZWVrFmzps0yWFKplKioKKRSKZ6engwYMIBbt25x+vRp\nFixYwIwZMxg9erTKeSdOnCApKYkBAwbw0ksvIZVKAVi1ahX29vZKxzY2NrJgwQJ27NjBSy+9pJiE\nkrt8+TKrV69W9EN46623mD9/Pqd
PnyY2Nlbx7yCVSvm///s/9PX1WbZsGY6OjoprrF27lt9++40f\nfviB//7v/1b7Wjvi5uaGm5ubIgh0P5k/d3ucE2OdrTSgphkAEh199hbAoonhyGQyDh06RFZWVpvB\nZGgJbC5ZsgRtbW0+//xzfv75Z2JjY/n55595/fXXH/IrEQThfjxoBrM6ffr0wcvLi8zMTObMmYOP\njw/Xr18nMTERX19fjh8//jCG3qagoCCio6P59ttvycjIoFu3bpw/f56srCz69u3L2bNn6d27N6++\n+ir19fVqew1BS+naRYsWERMTg5aWFr6+vvj7+xMTE8OlS5faHcNvv/1GQkICAQEBeHl5cf78eeLj\n4ykoKGDlypXo6OhgZGREeHg4MTExiiCV3C9p16iiJQB07kDL5yozh97oGlvQ3FBPfXUF5blnsPcJ\nAomk0y0+EARBEISniQgCCYIgCMIz5mko4XG/E+addQVqTk4O27dvx9rammXLlika1r/99tt8/vnn\nnD59mujoaCZMmICLrQm7u5oilUqxt7Hg448/Rl9fX+l6Gzdu5OrVq7z66qv8+c9/VmwfM2YM8+bN\nUzuGr776irKyMubNm8dzzz2n2F5bW0tUVBTffPMNAQEBit5IcmfOnGHBggUMGDBAafvdASAAbW1t\nxowZQ3p6OmlpaSqZVWFhYUoNsXV1dXn77beZP38+hw4dUgSBYmNjaWxs5LXXXlMKAEFL4Ojo0aMc\nPXqU//qv/0JHR0ft633cXGxN8HayvOeJTrl+zpYEezt0GCx5EksDdqS9DIDqqwWKHkdwJwPA5Pp1\nAPT09Nq99vLlyykvL2dyGm6gAAAgAElEQVT69Om4uLgwffp0zp07x6ZNm/D09MTd3f2hvhbhySLP\ncggODhZ9Qp5gDyODWZ2PP/6Y77//noSEBHbv3k23bt2YMmUKfn5+jzwIZGlpyeLFi1m/fj3Z2dkk\nJydjaWmJi4sLurq62Nraqjyj1YmNjeX06dP07NkTY2NjAgMDiYiIYOLEicydO7fdc5OSkli2bJki\nEwlgyZIlxMXFkZCQwPDhwzEyMiIiIoKMjAykUqnSogf3IeVE/ZTAtfw0mhsbcBsZhnn3Pkr3aKy/\nSRdtnU63+EAQBEEQnjZPzmyOIAiCIAiPxdMQQLmfCfPOvAL10KFDALz55puKABCARCJh6tSpnDlz\nhoMHDzJhwgSl86ZOnaoSAGpsbCQ2NhYDAwOlFb3Q0g8mMDCQmJgYpe0FBQVkZmYybNgwpQAQtPRV\nmDhxIp999hknT57k5ZdfVtofEBDAgAEDKJRWk1pYTl19I4Z62nQ3hsRj+0lLS6OsrIzbt28rnXft\n2jWV98HLy0tlm4eHB126dCE/P1+x7cKFCwD069dP5XhjY2N69OhBZmYmV65cwdXVVeWYP8rE53oR\n9VOCxiWP5OSTa76u1tiZG7I5PldRGrC1fs6WiuM6i44yAAri/h9dtHUxtHZAz9gcmQzO7btID+N6\n+nn2xcfHp81zd+7cSWJiIkOHDuWll14CwMDAgL/97W9ERkayZMkSVq5cqRR4FITOrnWpTnXZr0+i\nh5HBrK4cmpGRER988AEffPCByj512buzZs1qM1jYXsZzW5nGTXrmeI+eRI/Alueio5GM//2fmZib\nm7N3716VRQoDBw7EwMBAqWyr/Hk9e/ZspWekkZERYWFhfPnll2rHBC2l3VoHgABGjx5NXFwc58+f\nZ/jw4W2eC3f6EkYmHQSgi5refNp6Bp1y8YEgCIIgPG1EEEgQBEEQnjFPSwDlXibMO+MK1NZBk4Mn\nkqmrb1Q7oe3g4IC1tTWlpaXU1tYq+tzo6uqqTO4AXLlyhfr6ejw9PdX2xPH29lYJAuXk5AAtWT+b\nN29WOefGjRtAS7m2u+mYdyVywyml/2/11ZWc2/8dhpImRg72Y/To0RgaGtKlSxekUikxMTE0NDSo\
sjEjMKLZ/Se38DQEOWdCgzNLDXLDE0tMDS1\nBEMjDAwMuHDhAqtWrdI5v7GxMUuWLOHy5csUFxfTt29fAgIC6Nq1K3V1dU0mcnTv3p133nmn5ZtD\nw9/g119/nddff11nnb4EnyVLluj06wPw8vLiww8/1Fl+Va5g4btfcLtaiVmugpL0S7z22muMGjUK\nb29vnn32Wa3SeocOHQJg0qRJeHl5sW3bNq3jlZWV0bVrV55++mleeeUVoGE21ueff65zbicnp2aT\nlD766KMm1wmCIAiCIAiPJxEEEgRBEB5Zh89f11v6SmpkAkBiRg4Xb1ay9GlfnuzfHQCVSkV5eblW\naTX1gJJKpdI5R+NZOfpIJJJ7eQktUpewKS0txdXVVWd9SUnJAz2/8OgSQX1B6Hj0JUYAVCtKuBmz\nFScLA4YNGsDAgQMxNzfHwMAAuVyOTCbTKbEG+hMdKioqqKio4Gr2dYqKT1Ffp0JiIEVqaIKRhQ1G\nZpbUqZQoCrLIOXsYgDplLaf+8xp1qlrsbazJzs7G0NAQiUSCubk5dnZ2hIeHc+TIEb755hucnJxY\ns2YN27ZtIzY2loSEBAoKCsjLy2P27NmsWbNGZzbQw0D//e9GmctIygpSubYjHGuzfUgkEry9vXnh\nhRfw9PREoWgonZeYmEhiYmKTx6+qqnrAr+DhIZPJiIuL48qVK5SUlCCVSnFzc2Py5MmMHTtWZ3uF\nQsHevXs5c+YMBQUFGBoa4uTkREBAAM899xzl5eUsXLhQs/3UqVM1X3t7e4vgmCAIgiAIj7WH78lZ\nEARBEFrhfHZRk71PzO27cLs4nwr5NUys7PgsIhknGzMGuDuQnp6u00NHPSunqKhI51iZmZkP5Ppb\ny8PDg9OnT5Oeno6vr6/WOrlcrveaBUEQhI6nqcQIAPnF0xQVl2LZ51nGPBeqSYwAiI6ORiaT6T2m\nvkSH3377jfT0dExMzXAd/DQm1vbU3qmk6lY+/5+98wyI6lrb9jV0GHoVEQUsKEixgWIssYvdE000\niZoYT2LMSYwxvtGTnORLMT2WYzTFJLYYEzs2ELGAgtKUplIEAanSB5AyMN8PzmwZZyh2jfv6k7jL\n2mvvGWavte7neW5tPX26j55HbWUZSfvWCOdUFV9DT2qKnpEZ7i52PDdjuvDuLS0tJTAwEDc3N7Ky\nsggKCuLFF1/E2tqaN998E4VCQXZ2Nv/85z8pKioiOTmZHTt28MILL9zlE7u3tPb8rVy8wMWLhvoa\nxrjqQ0kGwcHBfPjhh2zYsEEI+PjnP/+pIk7cCSkpKezdu5eLFy9SUVGBiYkJXbp0YezYsTz11FPA\n7Qss+fn57Nq1i/j4eIqLi9HT08PKyopevXoxZ84cFY9DaPpOBQYGkp6eTl1dHXZ2dgwfPpzp06ej\nq6ur1r4m1q9fT+fOnenduzcWFhbIZDKio6P57rvvyMnJUfn8CwoKWLFiBYWFhXTr1g1/f38UCgU5\nOTns27eP8ePHI5VKmTVrFiEhIRQWFjJr1izh/DvxdhQREREREREReZwQRSARERGR/3G7E2JNKBQK\nAgMDCQ4OJjs7G4VCQefOnRk1ahTjx49XW0yZNGkSvXv3Zvny5WzZsoXIyEhkMhn29vZMnz6dUaNG\nqV2jvr6enTt3cvz4cYqLi7G0tGT48OE899xzTJ8+/YmJZvw9NFXjQguAZVdvitJiyU8Mw6xTD3T0\nmzxS3B1M2bx5s9rxSl+fY8eO8fTTTwtl2YqKivjjjz/u2z20h2HDhrFjxw4OHDjAqFGjhAwmhULB\n5s2b1QQtEREREZEnj9YCI6ApEwjAzLGXSmAEQEJCQruvk52dzS+//IKJiQldu3alx9QXSL5eK+yv\nqyrXeJ68php9Eyu69eyNUeN1pkyZgq2tLQC7du0CwNnZmdLSUoKDg5k9e7bwLpZIJJSXl6Orq8u8\nefOIjo7m7Nmzj
5QI1NbzV6Kta8ChDPj8+VkoFAqCg4NJSkrC1dUVgKSkpHaLQMos5ubjgKCgINav\nX4+Wlha+vr507NiRsrIy0tLSOHTokCAC3Y7AUlJSwpIlS6iurqZ///74+flRV1dHQUEBJ06cYOLE\niSoi0Jo1azh27BjW1tb4+fkhlUo5f/48b775Jhs3biQwMFD4bFtj3bp1aj5WcrmcDz/8kF27djF+\n/HisrKwA+OabbygsLGTOnDnMmDFD5ZyKigoMDAzQ09Nj9uzZJCQkUFhYyOzZs9v1nEVERERERERE\n/g6IIpCIiIjI/7idCXFLfPvtt5w6dQpra2vGjBmDRCIhIiKCDRs2cPHiRZYuXap2TlVVFcuWLUNH\nR4fBgwdTX1/P6dOnWbNmDRKJhJEjRwrHKhQKPv/8c6KiooTa8A0NDYSEhJCVlXVPn8ejzNVCmVqp\nm+YY2zhi29OXwsvnuHToByw6u3EtRotrwT9hb2OBpaWlyvGurq707t2bxMRElixZgpeXF2VlZURG\nRtKnT592m0TfD+zt7Xn++efZsmUL//rXvxgyZIiwoCKTyXB2dubq1asPrX8iIiIiIg+f1gIjAPSk\nTf4zlQVXMevkyvawVPo4WxMbG8vRo0dbbbu5v1DYob+QVdfywgsvEB0dTUNqCA3S/mjrGqhcp6Gu\nhoa6GnT0Den7wofkJ50m70IIHQzqkFffbDs9PZ2dO3cCoKOjw6hRo9i7dy/79+9nxIgRgh9eYGBT\nWbn+/fsTHR2Nvn5T2de2/G8eFK09f1l+BsZ2TkIgkEIB28NSMSkrA0BfX5/u3bvj7u5OeHg4wcHB\njB49Wq2dq1evYmFhIXgJKYUXpS9gdna2kFX05ZdfqpWQbZ45fDsCy5kzZ5DJZCxYsIDJkyernFNT\nU6Pi0RQSEsKxY8cYNGgQS5cuRU9PT+hjeHg4ubm5HDp0SK0dTdzaP2j6jkyYMIH4+Hji4uIYMWIE\naWlpXL58GRcXF5555hm1c+6lD6SIiIiIiIiIyOOKKAKJiIiI/I/bmRBrIjQ0lFOnTuHi4sKXX36J\ngUHTgsgLL7zA8uXLOXXqFAMGDGDYsGEq52VkZDB69GjeeOMNYSI9ZcoU3njjDXbv3q0iAp08eZKo\nqCjc3d359NNPhXr4zz//PO+88849eQ6PAxeutl0CzaHfWPRNLLmeEkVRajTa+kaYjRnOJ/9Zwptv\nvql2/Pvvv8+vv/7KuXPnOHDgAB07dmTevHn07dv3oYpAADNmzMDa2pp9+/Zx7NgxDA0N6du3Ly+9\n9BIffPCBUEZGREREROTJo63ACACbHgMoSb9ARtguzDv3IifWhPqEANKTk3jqqacICwtTO6e8uo6l\nmyNU2k4OjaKquJhu473x8DEiITKUBvlFshVW6BqZ01BXTV1lGeW5adRV38wKsnLxwrL8EqlJF5DL\n5ezYsYPKykqioqIYNGiQcH1/f3/27dvHjh072LJlCz179sTS0pK//voLfX19tmzZgkQiYfr06ffo\n6d09bT3/jNC/0NLRw8jaAX1jcxQKSD6SSVfjWjzde+Ll5QXA0qVL+fe//83atWs5cOAArq6uSKVS\nioqKuHr1KpmZmXzzzTeCCNSzZ0/09fUJCAhAJpMRExNDdnY2y5Yt0+gh2NwLsb0CS3OUgk5zlGNd\nJQEBAWhra/PWW2+pHd+xY0fy8vI4efKkRhGoudhopK+DozFEngokLi6O69evU1dXp3J8cXExAMnJ\nyQD07dv3vvs0ioiIiIiIiIg8rogikIiIiMj/uJMJcXOCg4MBmDdvnsqk2MDAgHnz5vH+++9z9OhR\nNRFIX1+fV155RSWS0tHRETc3NxITE6mpqRHaU9bsf+GFF1QMkaVSKc899xzffvvtHdz540d1rbzN\nYyQSCTauPti4+gjbhg7vgVQq5ZdfflE7XiqV8q9//Yt//etfavs0RRkvXryYxYs
Xt3h9Dw8Pjee1\nVqpv5MiRKqJfc55++mm1soTV1dXk5+fj7OzcYpsiIiIiIn9v2hMYYWhhR7dRc8mLO0FFTioKRSOZ\nxu6sWLECqVSqJgKlF1RwOacUw1vEDXldDQBXShvI1HPD/7lelF2J5WxsAqmZV6hq0EbPyBSbHgMo\nuBgBgGcXS2YP8cVaZzCvvvoq8fHxBAcH07VrVxYuXIi3t7dw/Q4dOtC3b1/OnDnD2LFjycnJ4eDB\ng1y7dg0PDw+8vb2ZOnUqvXr1uheP7p7Q1vO39x6JLO8KN0ryqchNQ0tbBz2pGf1HTOKjt14SxnPW\n1tasXr2aAwcOEB4ezsmTJ2lsbMTc3JzOnTszceJEunTpIrRrbGzM3Fff5JdNW/ntz/1kpiQhr6sR\nSsu1xvXr19m1a1ebAguAr68vW7Zs4YcffuD8+fP06dMHNzc3HB0dVUSX2tpaMjIyMDU1Zf/+/Srt\nlZeXk5ubi7a2NtnZ2Sr7zmcU8XtoqoqQVisrJTlwI0baDQwb2JexY8diZGSElpYWhYWFhISEUF9f\nDzRl1ANqWd4iIiIiIiIiIiI3EUUgERGRJ5Y7jThsiStXriCRSPDw8FDb17t3b7S0tLhy5Yravo4d\nO2rM5FBGbFZWVgoiUHp6OhKJROPih5ubW6v9+zthpH9nr687Pe9hU15ejlQqVRH+Ghoa+OWXX6ir\nq2PQoEEPsXePB4WFhcyfP5+RI0e2Kt61RkhICKtXr2bx4sUtinUiIiIiD5r2BEZAU6nU7qPmCP+e\nMbwHAwd2B1SDHc5nFFHu4k8fZ3+1NnT0DKgF6qtlaOvqczhDwucvvs5//tM0ZlGOrfLyC/j16xTG\n+7nw8Zyb76gxY8agra3NL7/8IngC3Xr98ePHExMTg42NDUuXLuW1117DxsaGzZs3Y2xs3L6H8gBp\n6/nb9OiPTY/+atu9BvfA0NBQZZuhoSEzZ85k5syZrbZ5UziRQc+pmPeEHNl/aZCV8FtUGfPMigTP\np1vJz89nyZIlVFZW4u7uTt++fVsUWKCp5N53333H9u3biY2NJTw8HGgap06fPl3wMKqsrEShUFBe\nXq7mp1hbW0tOTg7W1tbU1DQJiTk5OXz3y5/sDgyltqqMxvpadAyMMe3YlYb6OuS11VgMmkKugzdS\nJ1t2/Pdjhg0bho+PjxAUVVBQwDfffENmZiaDBg1S8VPatGkTu3fv5rPPPsPT07PV5ykiIiIiIiIi\n8nfn8VwNExEREbkL7jbisCWqqqowMTFRWahXoq2tjampKeXl6obJUqlUY3tK09zmhr/Ka2gy1FXW\nzX8S8HbSvLBxv8572ISHh/P777/j5eWFjY0NMpmMpKQkcnJycHFxabeJtMjfg4SEBFasWMGsWbNE\nY+vHFFFQFLmX3OvAiNb8bYysO1FVnEtFbhoGZtaCv41ScHCyNcHJ1oTCQimHzY2wNFEtF6bMem5o\naGixXz4+PtjY2BAcHIynpyc5OTmMGDHikRSA4MEHpgSez2L1oQS1z0gp0F1IzmR5QRVvT/RkrLej\n2vn79u1DJpNp/P0JDQ0VBJbmODo68n//9380NDSQkZHBhQsXOHjwID/99BMGBgaMHj1aGM+6uLiw\nZs0alfM1BWJs3xfE9p37MLZzQmrjiERLixtl1ylOO8+NskL0jS0w79wLhQK2Rhaipy8lPj5eRTiL\ni4sTvhehoaEsX75cyE6Ki4tDT0+Pnj17qvRF+R1sbGxUycIXEREREREREfk7I4pAIiIiTxQtTZwL\nL0eoRBx28bk5cW5pQnwrUqkUmUyGXC5XE4IaGhqoqKi4a+8WIyMjZDIZDQ0NakJQ2f8Mhp8EnGxN\n8Ohs2aYHQnM8u1jiZGtyH3t1/3B1dcXNzY2kpCRkMhkAdnZ2zJw5k2eeeUZjnX4RVSwtLQXD7MeB\ne5G5JCIi8mRwLwMj2vK3senRn6LUGPITQzH
t2BUDMxviM0u4WijDydaEoqIiFe+ZWzExaXoPX79+\nXWMZXmgq5zpu3Di2bt0qiAnjx4+/nVt7oDzIwJTzGUUax7GgLtCtOhiPrZmhWkZQXl4eAH5+fmpt\nJCQktHp9bW1tunXrRrdu3ejVqxfvvfceERERjB49GgMDAzp37kxWVhYymUz4rFviKh3p/Y930NJW\nHTNX5F0hcfd31MhKqCy4ilknVxQKkOnZUnb1Avn5+ejr6wNNQo+9vT3l5eWkpaWxa9cuZsyYQWVl\nJVeuXMHDw4Pa2lrgpqeRqakp0PQdtLOza7WPIiJtoVAoOHDgAIGBgeTn52NiYsKgQYN48cUXBQ/S\nW8tQh4aGEhgYSHp6OnV1ddjZ2TF8+HCmT5+Orq6uyrGTJk2id+/eLFu2jK1btxITE0NpaSlvvfUW\nI0eOZPXq1YSEhLBx40aioqI4fPgw+fn5WFhYMHbsWGbMmIFEIuH06dPs2bOHrKwsDAwMeOqpp3j5\n5ZfV5hBnz57lzJkzpKSkCFUwOnXqxMiRI5k4caKa75by+r/88guxsbEcPHiQ3NxcjIyMGDhwIC+9\n9JIgEDc2NjJ//nyqqqrYsmWLmqcYwI8//sjBgwd57733GDx48N19OCIiIiIiKogikIiIyBNDaxPn\nWlkpgBBx2Hzi3NaEWImLiwtxcXEkJSUJJr9KkpKSaGxspGvXrnd1Dy4uLsTHx3Pp0iV69+6tsu/i\nxYt31fbjxvNDu7P893MtRis3RyKB2UO63/9O3SdcXFxYsWLFw+7GY42Ojg6dOnV62N24J/To0YMN\nGzYIC1kiIiJPNvcyMKItfxsDMxscB4wnO/IQlw//iFmnnuibWLLy62iM6ksxMjJi5cqVLZ7v5eXF\nnj17WLduHX5+fhgaGiKVSpk4caLKcWPGjOGPP/6guLgYJycntWyOR4kHGZjSWpaWJoGueZaWUqBT\nluFLSEjAx+emb2JsbCxHjx5VazctLQ17e3u1zHVl8JFSkAGYOnUqa9euZc2aNbz99ttq59TW1nLl\nyhW0TWxJK2lQE4AATO27YtKxK6UZiWSE7cK8cy90DU0oyUikPucyEyb4C4vTcXFxeHp60r9/f1av\nXs2mTZsIDw/H0NCQzMxMDA0NmTt3Lj/88INw315eXpw+fZqVK1fSv39/9PT0sLW1VfNdFBFpDz/8\n8AOHDx/G0tKScePGoaOjw7lz50hJSdEYGLhmzRqOHTuGtbU1fn5+SKVSkpOT2bZtG3FxcXzyySdq\ngX6VlZUsXboUAwMD/Pz8kEgkahUgfv31V+Fvuk+fPpw7d46tW7cil8sxMTFh06ZNDBw4EHd3dy5c\nuMChQ4dobGzk9ddfV2ln06ZNaGlp4erqipWVFVVVVcTHx/PTTz+RmprKkiVLND6H3377jdjYWOH6\n8fHxBAUFkZeXx2effQY0ZeGNHTuW33//nVOnTjF27FiVNurq6jhx4gQWFhb4+vre0echIiIiItIy\noggkIiLyxNDaxFlPagagEnG4PSwVRWmWxgmxJkaPHk1cXBybN2/m888/FybFtbW1bNq0STjmbhgx\nYgTx8fFs27aNTz/9VJhYVFVVsWPHjrtq+3Gjj7M1iyd4tCjsKZFI4O2Jni3Wxhd5Mmgps6akpIQ/\n//yT6OhoSkpKMDIywt3dnZkzZ9KtW7cW24uPj+ePP/4gLS0NiUSCu7s7L7/8Mo6OqqV3bidCspcz\nihAAACAASURBVL3o6+v/bQQtERGRe8O9Coxoj7+Qdfd+GJrbUnApgsqCq5Rfu8zlqg487evJmDFj\nWj23b9++zJ8/n6CgIPbv349cLsfW1lZNBDI3N6d///6cPXuWcePGtX1TD5kHEZjSVpaWJoEu94Il\nJrnhlORnCwLdhAkTOHbsGF988QWDBw/G0tKSzMxMYmNjeeqppwgLC1Np98SJEwQGBuLm5kaHDh0w\nNjYmPz+
fyMhIdHV1mTJlinDs6NGjmzJy9gZwNCySTl17YW1jg4G8kuTkZDIzMzE2NsZhgD8KhYLS\njASK0+O4UZZPQ10NimYlkI2s7JHaOFKRk4pC0Yie1Byzjp0xNjamuLiYoqIiysvLhVK5R48eZcCA\nAeTn53Py5EmKiorQ09Nj4sSJmJmZCe2OGTOGwsJCQkND2b17Nw0NDfTu3VsUgURum6SkJA4fPoyD\ngwPffvutMJ6bM2cO77//PiUlJSreZyEhIRw7doxBgwaxdOlSlSyc7du388cff3Do0CEmT56scp2r\nV6/y9NNP89Zbb2ksCQ5NYu1///tfrKysAJg9ezYLFixgz5496Ovrs3r1amGMWl9fz1tvvUVwcDDP\nP/+8yt/Hhx9+qJalqVAoWL16NcePH2fChAm4urqqXf/y5cusW7cOGxsboKkKxr///W/i4+NJSUmh\nR48eQNPf344dOwgMDFQTgcLCwqiqqmLChAkay6uLiIiIiNwd4i+riIjIE0Hb5U0GUJJ+QSXiMO14\nIef1yhgzcrjahFgTw4YN4+zZs5w+fZrXX3+dQYOajJDPnj1LQUEBQ4YMYfjw4Xd1HyNGjCAsLIyY\nmBgWLVqEr68vcrmc8PBwunfvTk5OzhNV33xcn87YmRuxPSyV+Ez1z9eziyWzh3QXBSARjRQUFLBs\n2TJKSkrw9PRk6NChFBUVcfr0aaKiolixYgUDBgxQOy8yMpJz587Rr18/xo8fT3Z2NtHR0aSmprJ+\n/XqNGTrtiZBUolwIgKYFg+blKBcvXoytra1GT6Dly5eTmJjI3r172bVrFyEhIRQXF2Nra8u0adOE\nyfaRI0c4dOgQeXl5mJiYMHr0aGbPnq1W4gMgOTmZPXv2cPHiRSorK4WF2VmzZmFpaXlnD/4h0957\nSktL4/jx4yQkJFBUVERtbS3W1tb4+vry7LPPtuhPEhYWJpR5qa2txcLCgp49ezJ16lS6d1df+G2v\noCgi0hr3KjCivT41UhtHXGxufkcXjnVjqo+z8G9bW1sOHDig8dypU6cyderUVttXKBRkZGSgr6//\nWCzOP4jAlLaytECzQHeipiND+/cWBDonJydWrlzJtm3biIqKoqGhAWdnZ1asWIFUKlUb8w4dOpT6\n+nouXbpEWloadXV1WFlZMWTIEKZNm0aXLl2EY89nFHHFyJsqlxqKUmJIOxNJQ30NSLSpLa3Cz8+T\nKVOmEJpxg5zYoxReOouukQmm9l3RNTIVMoNK0uOorSyj+6g5Kn2RR22lqKiI/fv3c/DgQaAps8fC\nwkIoR/fRRx+xcOFCSkpK2Lx5s9q4WEtLizlz5jBnjmrbIiK3i3J8NnPmTJWAHh0dHebOncuyZctU\njg8ICEBbW5u33npLrQzbc889x8GDBzl58qSaCKSjo8P8+fNbFICU5ysFIGgqU+7r68uxY8eYNm2a\nyphCV1eXIUOGsH37drKzs1VEIE1lOiUSCZMnT+b48eOcP39eowg0a9YsQQCCptKRo0aNIikpSUUE\nsrS0ZODAgZw5c4a0tDSVgKsjR44gkUjUxCERERERkXuDKAKJiIg8EbQ1cTa0sKPbqLnkxZ0QIg4N\nze0Y/cIrjPfp3i4RCGDZsmV4eHgQHBzMkSNHgCYz3WnTpuHv73/X9yGRSFixYgU7d+7k+PHjHDhw\nAEtLS0aOHIm/vz9nz55VMcx9EujjbE0fZ2uuFsq4cLWI6lo5Rvo6eDtZt7vUSkJCgsZFdZG/N99/\n/z0lJSW8+OKLzJw5U9ju7+/Pe++9x6pVq/j111/VapafPXuWjz/+WKXs4+bNm9m1axfBwcH84x//\nULtWeyMkATw8PKiqqiIgIABnZ2cGDhwo7HN2dqaqqqrV+/r6669JTk6mf//+aGtrc+bMGdatW4eO\njg4ZGRkcP36cAQMG4OXlxblz59ixYwf6+vo888wzKu0EBwezbt06dHV18
fX1xdramtzcXIKCgoiM\njOSbb75RmfA/DtzOPQUFBREREYGHhwfe3t4oFArS0tLYt28fMTExfPvttyq/twqFgjVr1hASEoKp\nqSmDBg3CzMyM4uJi4uPjcXBwUBOB7kRQFBFpiXsRGPEg/W1a48yZMxQUFDB+/PjHxsvtfgemtCdL\nC9QFurnDe6hlHvXq1UstAEHJreKdq6urxkXfW2nuu2nm0AMzh5vvtdrKMpL2rSG93pKkIgWK+mqu\nXz6HobktPca+jLauvkpbpVcTNV6jq2sv8i43/UbGxcVha2srLFp3796dCxcuUFJSwrVr1xgwYMAT\nFRgl8mBoPt8IDj9Pda0cNzc3teNcXV1VRJva2loyMjIwNTVl//79GtvW1dUlOztbbbudnZ2KUKMJ\nTdnrysAWTfuUglFRkeocWSaTsWfPHqKjo8nPz6empkZlv7IUY3uur/SHq6ysVNnu7+/PmTNnCAwM\n5I033gCasp2Sk5Pp16+fSvaUiIiIiMi9QxSBREREngjaM3E2tnFUizh07NEDD4/uahPizz//XGMb\nEokEf3//dgs+LUXJQlPEvyZDeD09PZ5//nmef/55le0XLlxo6vMTFj2emprKli1buHLlCjKZDGdn\nZ9auXXtP2lZmVrT2OYk8nhQVFXH+/HlsbGyYPn26yr5evXoxbNgwTpw4QXh4OCNGjFDZP3ToUDXf\nr3HjxrFr1y5SUlI0Xq+9EZLQJALZ2dkREBCAi4uLmjDZlk/Z9evX+f7774Wo1GnTprFw4UJ+/vln\npFKpxnIhe/fuZdq0acKCRU5ODuvXr8fOzo7PP/9cJbo0Li6ODz74gJ9++ol///vfrfblUeJ272nG\njBksXLhQbRExODiYtWvXcujQIRXhLCgoiJCQELp3784nn3yiEhXc2Ngo+Gc0504ERRGR1rjbwIgH\n6W+jiV27diGTyQgKCsLAwIAZM2bck3YfFPciMKUl2pulda/Oux1a891U4X++m8+4G6FQKDCx76om\nANVVlVNbqf57CTB66CC2XI4mNjaWpKQk/Pz8hH1eXl78+eefQuDWre9pEZG74XxGEb+Hpqr8Nial\n5VErK+GLA5eZO0pHReDV0tLCxOTm33xlZSUKhYLy8nIh27u9WFhYtHmMptLCyjGdJiFdua+hoUHY\nVlVVxdtvv01BQQE9evRgxIgRGBsbo62tLQQn1dfXa7y+puxo5TUam5V5BPD09MTR0ZFTp04xf/58\nDA0NCQoKAmD8+PFt3quIiIiIyJ0hikAiIiJPBI/yxPl2KSkpUSvDJJPJBN8hZRm6J4Hq6mr+3//7\nf9TX1/P0009jamoqTJSae7GIEWVPHrcuwHWSqq5MpaenA+Du7q6x7rinpycnTpwgPT1dTQS6nWjH\n9p7TVn9vh7lz56osBnTo0AE3Nzfi4+OZP3++WrkQHx8fldJx0FSSQy6Xs2DBApXjoWlhzdfXl8jI\nSG7cuPFIZx82f65nAndTUVXDihXtu6eWfjdGjRrFxo0bOX/+vIoIpCxN9MYbb6gtxmhpaWksn3e7\ngmJL3lb3k/nz5wPwyy+/3LdrhISEsHr1ahYvXszIkSMf6LX/rjjZmtyx6PAg/G1aYvPmzejo6ODo\n6MjLL7/82GUbKrmb598Sj0qWliZa8928FYUC4vLrMDXUo+p6ForGRiT/E9sb6uvIOncQRWOD2nme\nXSwZO8ydrT+v49ChQ1RVVan8fnp5ebFjxw527twp/PvvxsN4B/xduJus/+ZZbs3R1m0q6XYh9RqX\nC6p4e6InY72bgvEaGxuRyWTCeEM5LnBxcWHNmjWtXq/5O/FBcvToUQoKCjQ+o8uXLxMQEHDPrjV+\n/Hh++uknTp48yciRIzlx4gRWVlYayzCLiIiIiNwbHr3VTREREZH7wKM8cb5dNm7cSEZGBr169cLM\nzIyioiJiYmKQyWSMGzdOJaPg705KS
grl5eVq5bxEnlw0RWpCUyma7OxSXIubRBplSbWWoiuV2zWJ\nOrcT7djWOeXVdWw9mcxvqaqLhbf293a405IgzUWgy5cvA5CYmEhqaqraOeXl5TQ2NpKTk6OxzYeN\npu9B8slIqoqK+eDHfQw9E6u2QHvrPcnlcgIDAwkNDSU7O5uqqioUzVaAmpdEqampITMzE3Nzc1xc\nXNrdzzsRFEVE7jcPwt+mJcTM25Z52FlaLdGW76YmUorkDB8+lAOBx7h8+EdM7LvSUF+DLC8dLW0d\njCw7UF2SLxyvFBtNTU1xcnIiIyMDaArYUNKzZ0/09fUpLy/HzMxMxatIRJW/o8B+vwSy1rLcDC3t\nqS7Jp/J6FvomFqw6GI+tmSF9nK1JTk5WybJRelZlZWUhk8lUsoRuRaFQUFBQwJo1a4iOjubq1at0\n6dKFF198kTfffBNQ/+wiIiI4d+4c6enp1NXVYWdnh46OjsZx6aRJkzA2Nqa+vp49e/awZcsWSktL\ncXJyAprGgJMmTWLjxo1ERUVx+PBhYmJiyMvLw9LSEoVCgUQi4fTp0+zbt4/k5GRef/11Ro0axcsv\nv4yenh7JyclCwExaWhqrVq1CX18fS0tL9PT0sLOzo6SkhHfeeYcuXbqgo6PDpEmT0NLSEgL5/P39\n2bRpEy4uLtTW1mJkZMTAgQM5fPgwXl5eLFu2jK1btxITE0NpaSlvvfWWEEhSUlLCn3/+SXR0NCUl\nJRgZGeHu7s7MmTPVxl7NRTdTU1P++usvMjIy0NHRwcvLi7lz59KxY0e151hbW0tAQABhYWHk5uYi\nkUjo0qULkydPZujQoS1+viIiIiIPC1EEEhEReSJ4VCfOd4Kfnx9lZWVERkZSVVWFrq4unTt3ZsyY\nMYwePfphd0+g+WRsxowZbNu2jYSEBCoqKvjss8/w8PAQ6k6fPXuWwsJCdHR06NatG8888wx9+vRR\naU8ul3PkyBGOHTtGQUEB9fX1VFdXk5GRQWlpqdp1NWV3QPtKvCnbUDJp0iTh/3v37t1iOUCRh0tL\nkZpKKm7UcTAmi9EXsrH4X0SmphJdgPCd0lRe415x+lIel3NK6WBZib1D6/1VRpa2h9ZKgrS2Ty6/\nWTazoqICgD179rR6rVtrxT8KtPQ9kNdWAxATFkzsaXCxM8XGVD2LSXlPX331FREREXTo0AFfX18s\nLCzQ1dUFUCuJohQVb80waovbFRQtLS3ZsGHDY+ORcrd8+umnD7sLTyz3299G5M54mFlaLdGW72ZL\njHtmDpZWtuwICKQoJQodAylmDj2w9xpORuhfwnG3io1eXl5kZGTg6OioEsiho6ODm5sb58+fx8PD\nA4lEcnc39gjypL0D7iU9evRgw4YNt+2111qWm6WzJ8Vp5ylIDMOskys6egZsD0vFw9GcLVu2qB0/\ndepU1q5dy5o1a3j77bfVxmSVlZUUFBRw4MABIbDExsYGR0dHzp8/T0pKCnK5XG2Ok5GRwffff4+D\ngwN+fn5IpVKSk5M5duwYMplMRYxScuPGDS5evIi+vj7+/v5IJBJKS0vJyMggNzcXgF9//ZWEhARc\nXFyEMUl0dDR//PEHJiYmbNq0CVNTU2xtbTEzM+PQoUM0NjbSvXt31q1bR3V1NaampigUCrS0tISx\n9aRJk8jIyKC8vJy8vDzq6+txc3Nj7NixKn1UBuEMGTKEXr16ER8fT1BQEKmpqXTt2pWlS5diYGCA\nn58fEokEc3NzAAoKCli2bBklJSV4enoydOhQioqKOH36NFFRUaxYsUJjxlF4eDgxMTEMGjQIDw8P\n0tPTCQ8PJyEhga+//hoHh5uD9aqqKlasWEF6ejpdu3Zl9OjRNDY2cv78eb7++msyMzN58cUXNX9x\nRERERB4SoggkIiLyxPAoTpzvhKeeeoqnnnrqYXej3eTl5fHOO+/g4ODA8OHDhUiuwsJCli9fTmFh\nI
e7u7vTr14+amhoCAgLYs2cPXbp0QSqVoq2tjZOTE+Xl5eTm5tKlSxf69+/Ptm3bBCFo/fr1Qimm\nuXPnEhkZibW1NS4uLipijq2trUqJJ6UIdejQIaKjo7l+/TpJSUlMmDCBWbNmERISQmFhIbNmzSIx\nMZHAwECGDh1KTEwMu3btIj09nerqajFy+RHgdv0Ilo1vigJMSkqioaFBxbwXID4+HoCuXbvej+5y\nPqOIbe0pn/O//iojSx8UyoWJP//887FabGrte6CtZwCA18z/Q1vPAIkEPn7eV+NzTU1NJSIiAm9v\nbz766COV74dCoWD37t0qxyufV0uGyfcKHR0dOnXqdF+v8SihNHwXeTjcT38bkTvjYWZptUR7fDf1\njc3p+8KHKtvqFdp8/cFiZr/wgprY2H30PECz2Dh//nyVsV1zPv744zu4g8eHJ+0dcC/R19e/7WfX\nVpabiZ0T1t37UZQaw+WDGzDv3IucWC1yQzZiZ2WGpaWlihg5evRo0tLSOHz4MAsWLKBPnz7Y2toi\nk8koKCggMTERd3d3IiMjMTAw4M033+THH3+kd+/efPLJJ7z//vuUlJSozGVSUlK4fv06kydP5qOP\nPkJPT0/Yt2jRInbt2kVoaCje3t4qfS8oKMDExIRFixYxZswYoCl7Ji4ujrNnzyKXyykvL2fo0KEk\nJSUxY8YMTp48SX5+Pnv27EFfX5/Vq1eze/duQkJC+Pjjj/nss88ICAhAR0eHjh078uKLL/LVV18x\nbtw4Fi1aJPgvyuVyNmzYwAcffMC3335LZWUl/fv3FzKhleTm5uLh4cHcuXPx8PCgoaGBf//730RG\nRpKUlMTEiRN566231Mbw33//PSUlJWqVIvz9/XnvvfdYtWoVv/76KwYGBirnRUZG8p///EdFIAoI\nCODnn39m/fr1fPbZZ8L2n3/+mfT0dObNm6fi31hXV8dnn33Gzp07GTx48G1lh4uIiIjcb0QRSERE\n5InhUZw4PwlcvHiRGTNmMGfOHJXty5cv5/r167z77rsqKfPBwcEYGBiQl5fHq6++SmNjIxEREYSE\nhODt7c3atWu5ceMGHTt2JD09nXPnzuHt7U2vXr0A6Ny5Mw4ODkKE/eTJk4UFWqlUytmzZwFURCil\nONSzZ0+uXbvGF198waJFi7C1taWwsJDZs2cTEhJCQkICN27c4OOPP6Zfv36MHz+ewsLCVu//wIED\nHDlyhIKCAurq6njllVfYuHGjmFF0j7ldP4IjSSV4e3tz4cIFAgICmDZtmrA/OTmZU6dOYWxsfN88\ntn4PTeXW7tZWlpG0bw1WLt7Yew3/X18bUShge1jqA/1NcnV1JS0tjaSkpMeqPntr3wOptQPVxblU\nXs/CzKFHq881Ly8PAB8fH7XFhZSUFOrq6lS2GRgY0KVLFzIzM0lPT79vk35N5W6a+5/FxsZy8OBB\ncnNzhbIpL730ksYMsKKiIvbs2UN0dDTFxcXo6elhb2+Pj48Pzz33XKv92L59O3/88QcrV67Ew8Oj\nzT4qycvLY/PmzVy4cAG5XI6zs3OrpTw1lSxqXrbFxsaGP/74g7S0NCQSCe7u7rz88ss4OqpnzuXk\n5LBlyxbi4uJUrl1RUaHRj0jkJvfD30bkznnUsrTu1ndTFBvbz92+A5S+OEqaZ7rf+pt97do1du3a\nRVxcHGVlZUilUry8vJg9e7ZKRkTzPvz8889ERUVx9OhRcnNz6dGjB59//rmKH8/AgQPZunUrly5d\nor6+nh49ejBnzhxhHK+kpKSEo0ePEhsbS15eHpWVlZiamtK7d2+ee+45ld955TsJmt4RISEhwj7l\nb3trnkC5ubns2LGDuLg4KioqMDU1xcvLC3NXP7XPIC/+JHnxp+g+ei7ymmqqi3OpvyGjquga5Tmp\nGNt1wcR/NJ/8Zwnz5s3D3t6etLQ0jh8/TkJCAkVFRVRWVpKXl0d
6ejo2NjZCxs/06dOFUocdO3ZU\n8VzU0dFh7ty5LFu2TKU/iYmJSCQSFixYoCIAQZNP7L59+4iOjla7D21tbRwdHVXGOJaWlnz55Ze8\n+uqrxMfHY2NjQ2VlJQsXLsTb25uwsDC6dOlCbW0t06ZNU/kMdHV1GTJkCGfOnMHCwoIFCxYIIosy\nY6+5/2JNTQ0vvfQSa9asob6+nnHjxqn1cdCgQSQlJan0edSoUfz222/cuHGD+fPnq43RioqKOH/+\nvPA8m9OrVy+GDRvGiRMnCA8PV/P89PT0VBvzTpw4kYMHDxIfHy/MGWUyGSdOnKB79+4qAhCAnp4e\n8+bNIzY2llOnTokikIiIyCOFKAKJiIg8UTxqE+e/Ey0Z25ubmzNr1iyVYzMyMkhMTGTw4MFqNZN/\n+OEHsrKy+PTTT3FycsLf3585c+bQr18/rly5QklJCdbW1oIwc+7cOYYPHy4s3hUWFuLg4ICOjg5y\nuZwpU6aoRMwpRaBVq1YJIpSFhQUrVqxgzJgxTJkyheXLl/PTTz/RuXNntfuMjo7mww8/pF+/fm0+\nk9DQUH766SdcXFyYPHkyurq69OzZ8/Ye7B3QnpJ3fyfuxI8gPrOET5+dS2ZmJr/++iuxsbF0795d\nKBehpaXF4sWLVSbgD7K/2nqGSCQS6qvLhf5eLZQ9sAWxiRMnEhQUxMaNG+nYsaPago9cLic5ORl3\nd/cH0p/20NZztenhQ3FaLDkxR9E3scTA1FrluTa/Jzs7O6BpcaX5Ill5eTkbNmzQ2P6kSZNYt24d\n69at45NPPlERXhQKBaWlpYI30/3gt99+IzY2Fh8fH/r06SOUTcnLy1OJXoWmTKcPP/wQmUxG7969\n8fPzo7a2lqysLLZv396mCHQn5ObmsnTpUmQyGf369cPFxUXoW3t+T28lMjKSc+fOCYJ8dnY20dHR\npKamsn79epWSP9euXePdd9+lsrKSAQMG4OTkRH5+PitXrryja4uIPGweJeHkXvluimLj3dGed4Cd\nnR2zZs0iICAAaAqUUtJ8sTomJoaVK1fS0NCAj48P9vb2FBUVERERQXR0NCtXrtSYKf3TTz9x8eJF\n+vfvT//+/dHS0lLZn5aWxu7du+nZsydjxozh+vXrnDlzhvfff5+1a9eqjDUSExPZuXMnnp6e+Pn5\nYWhoSG5uLuHh4URGRvLVV1/h7OwMgIeHB1VVVQQEBODs7MzAgQOFdpTHtERqairvv/8+N27cwMfH\nh86dO3Pt2jVOnjxJ4cFj6HhORWqlXrO3KCWa8mvJmHVypVO/MVQV5VJZmImuoTF+4/5BeXk5NTU1\nODo6EhQUREREBB4eHnh7e6NQKIRAG0dHR978v4+4XFBFda2cqEPHqalrwMaqqbRZ87G8q6uriuhR\nW1uLubk5o0aNIiIigoiICJU+amlpMWrUKKqrq9X637t3b3744Qe17Y6OjowZMwZtbW3Wrl2r8vwO\nHDjA1q1b+euvvwRfncWLFzN19nzCrxYRl1tDQXEpOnoGJCYmoq+vz6xZs7hx4wavvfYa6enpZGZm\nUlZWxoQJEzAwMKChoQEtLS369++v1pcOHTqoiEBw0zdRKpViZmamdk56ejoA7u7uGkuDe3p6cuLE\nCdLT09VEoFuDWpTP0M3NTRDtbG1tSUlJEcrjbd++Xe0cZfm97OxstX0iIiIiDxNRBBIREXnieJQm\nzn8HNBmww01j+5HOroKPhhKl6XxVVRVrNvxCZpGM2vpG9HW16GJtgh5NUfbKwbOpqSnDhw8nICCA\nBQsW8Oyzz+Lm5qYWjd9eqqurVUSohIQEYZ9UKuX555/n008/JScnR+1cX1/fdi8aRkVFAfDhhx+q\nLP5u2LABfX39O+q7iDp36keQU63NqlWrBOPYxMREDA0N6du3L88++yzdu9+fkpDt6a+2rh5GVg5U\nFmZx9fQe9E2tWPdzOm88P6n
Nc+8FnTp14s0332Tt2rUsWrSIvn374uDgQENDA4WFhVy8eBFTU1ON\nCwgPi7aeq4GZNZ19J5N1LoBLB3/A1L4r+qZWfLUqjo7SRpV76t69O7169SI8PJx3330XNzc3ysrK\niImJwcHBQaOYM2bMGJKSkjhx4gSvvvoqvr6+mJmZCeVVRo8erRZ9fC+5fPky69atw8bGBkAomxIf\nH09KSgo9evQAmgS8L774AplMxtKlSxk2bJhKO0VFd/b31BYbNmxAJpOxYMEClYXHc+fO3ZH3z9mz\nZ/n444/x8vIStm3evJldu3YRHBysEp27YcMGIZrZ399f2B4TE8NHH310ZzckIvII8CgIJ38n383H\nmfa8A2xtbYUgKkDjO6myspKvv/4afX19vvzyS5Vsj8zMTJYuXSr42tzKlStXWLNmjRBIcStRUVFq\nWZeBgYF8//33BAQEsHDhQmG7l5cX27ZtUwvGycjIYNmyZWzevFn4/fbw8MDOzo6AgABcXFza/a5V\nKBR89913VFdX88477zB8+HBhX1hYGEtWfETOmb30mrRIzWOqIjcN13GvoGMgRcdAikQiIeP0bkqv\nJnL14nl+Dk0DmrJZevTowcKFC9VEsQ1bdvHNd6s583+r6ODeVOo7KS0PWWEF5XIdUvPKaZ6fqqWl\nhYnJzb+byspKFAoF5eXlQiZUe2nup9USrflIGhkZqc0Bi6/kUlZZQ01+OZ+u+QUHKynG+tokJiZS\nW1uLsbExRkZGdOzYkbFjx5KRkYFCocDMzEyjh5emuVLz62tC6dHY0v0pt1dWVqrtU3oKtXSOsm2Z\nTAY0CYipqakaz4FH0zdTRETkyUYUgURERJ5YHoWJ8+NOSwbsSipu1BGaVk7QLcb2MpmM8uo6th84\nQcUNVSFH0SBHUluOTmMdWVlZgtdPY2MjDg4OVFZW8vvvvwNQVlZGSUmJxoF8a1RWVmJkQ4oIowAA\nIABJREFUZERVVRXbt28nKyuLnJwczpw5AzRF+wNUVFSoCVjKhdT2UFLSNCm6dcFYrOd+b2mPH4Gi\noekYSbMIyupaOVZWVrz++uvtus7IkSNbLRelKfNq8eLFaiWxlP01sXNS80hojtPgaVyLDqIi7woN\nmYkcz5UyfqCbSmbb/eTpp5/G2dmZffv2ER8fz/nz5zEwMMDS0pLBgwczZMiQB9KP9tKe74GliyeG\nFnYUXjqLrCADWf4VLlRaIXHtonJPWlpafPDBB2zbto3o6GgOHDiAlZUVY8aM4dlnn9X4nZFIJCxZ\nsoS+ffsSFBTE6dOnqa+vx8LCAnd3d3x9fW/7nlrKsNTErFmzhMU/uFk2JSkpSUUEioyMpLCwEF9f\nXzUBCFCryX8vKCoq4sKFC9jZ2TFx4kSVfb6+vvTu3ZvExMTbanPo0KEqAhDAuHHj2LVrFykpKSrX\njo+Px97envHjx6sc369fP6EspIiIyJ3zd/HdfJxp7zugLY4fP05VVRWvvfaaWmnNLl26MHbsWPbv\n3092drba/n/84x8tCkDQVI7r1nHUqFGj+OGHH1R+twGNWR7QlNnj6enJ+fPnkcvlGrM92svly5e5\ndu0aPXv2VBGAAIYMGUI/b092Hz1DZWEmJnZOKvttXH0wtLAj53wIpVcTMLFzolFez43SfAK2rMPe\n1pp+/foxePBgjQJH4Pks9mUaUNWghSIvXRCBtHWbSrqVV93gl5BLOPe+OYdqbGxEJpNhZWUF3BRp\nXFxcNIpy95PTl/LYn5Kl9jcvkTQJXc6T3kZH34D+Rrloa2sLZfiqqqo4cuQIxcXFXLlyBS0trRY/\na00ZTG2hfCZlZWUa95eWlqoc15z2nqP875QpU3jllVduu48iIiIiDwtRBBIRERERuSNaM2BXQSFR\nM7a/lFfF5ZxSHPqNo1vPmwujtbJSkgM30qCji6FtZ4YP68cA105oaWlRWFhISEgIs2bNYsyYM
SQm\nJrJx40ZSUlL4448/mDJlCoAw0VKm6d9KVVUVcnnTYvGFCxe4cOECFRUV5OTkEBERQVZWlnCsXC5X\nE4HaEznXvDY5qNZcP3DgAJMmTVLzBGrusVFSUkJAQABZWVmYmpoKfhjnzp0jICCA7OxsZDIZpqam\ndOzYkSFDhuDv7y/Uitd03cfBg6it+1Mik8nYs2cPZ8+epbCwkIKKWnLrpNi5D8bUXr08CcD15Ciq\niq5Rf0NGSXocuobGBBX2Y0CHl1Uyfurr69m/fz8nT54kLy8PbW1tnJ2dmTRpEk899ZRKm81r88+e\nPZtNmzZx4cIFampq6NKlC7Nnz9bop6OtkHMtJoiyzIvIa6vRk5ph3b0fZp1USwXqm1jS9embpRQX\njnVjpE9TWQ5NglNrn68mMUrJ7NmzW4yadXJyavG8R432+lIYWtjRxW+K8O+FY92Y6qNeLsbExEQl\nKrk5zT1qbi2/OHz4cLXFpNvl47W/8XtoKq/+GCpsywzfz/XUaEz0dXD1Vhe+laVZmqMUdJoL5cpM\nzAdZBk1ZnsXNzU0tEhqaorhvVwRq7/0qr92zZ0+NC3Fubm6iCCQicpeIvpv3h9sJBGjvb2JbKN8R\nGRkZGktdKbPkNYlAbQlNmjKsdXR0MDc319jHqKgojhw5QlpaGhUVFUKZLSUVFRV3VWY1La0pW8fT\n01Pj/iED+xMcFsWN0gI1EcjIqiMApvbO3CjNpyLvCnVVFVBXjdRAj5deeonJkycjkUiQy+UEBgYS\nGhpKdnY2uddLuXStRPhbqa+uENo1tLSn7FoKDbU3UChQmUMlJyerPAMDAwM6d+5MVlYWMplMJUvo\nflJeXce20FSMb3kmANp6TT5ASv/FXSdisa6uw8+vyV+pqqqKzZs3o6uri5GREZaWltTW1moU9JT+\nSLeDsqxhUlISDQ0Nap5B8fHxABrLGSYkJKiVw21sbMoUb952jx49kEgkwnYRERGRxwVRBBIRERER\nuSNaM2C/leYG7Oczijh2tR6FAqoKs6CZCFR4OQJ5bTVdBk3Bqqs3yRKYN9iXPs7WhIaGCuUrrK2t\nGT58OHK5nLCwMDIzM4XJj7GxMdA0ybi1hER1dTU5OTnChOCf//wnkyZNatEsVllGo7mgpGkR8VaU\nNaVDQkIoLCxU80Rqjb1793LhwgV8fHzw9PQUSg8oy2VYWFjg4+ODqakpZWVlXL16lWPHjuHv749U\nKmXWrFkar9taZOajQHvuD5qEl+XLl1NYWIi7uzv9+vUj53oZP/51hCvHf8fRZwLW3W8ubleX5JMW\nsoXSzIugUGDVtQ+GFh2or66g6noWUVFRwqKEXC7nP//5D4mJiXTq1IkJEyZQW1vLmTNn+PLLL0lP\nT2fOnDlqfS8sLGTJkiV06NCBESNGIJPJCAsL45NPPuHTTz9VWVyor68nePv3FF6KxMiiAxbOHjTU\n1ZCfEEplQWarz+hOfReeFO6VL8W9oDUD6rZoK8NSVlPPwZgsRt+SYan87WuO8reu+W+Y8jdFGUn8\nIFBes61SK7fD7d5vS9duabuIiMjtIfpu3jvaKrXsWqwumLT3N7EtlKWugoKCWj3uxo0batva+i3X\nlH0BTf28tY8BAQH8/PPPGBsb4+3tjY2NDfr6+kgkEs6ePUtGRoYQ1HWnKDNNWhKSLC0tcbCSUl6v\nXtZLKXaYdHDBpEOTOFBXVUZD1BamTRrP9OnThWO/+uorIiIi6NChA76+vgQlFdPBoqnN65fPoWi8\nKexYOnuSF3+KWlkJDfI6YQ7l4WjOli1b1PoxdepUoTzf22+/rfaMKysrKSgo0Ch63Ck5xVW0NKvQ\nk5qjkNcJ/ou6UnNyrlWRkJCAk5MTlpaWfPHFFxgaGrJ8+XLMzc1pbGzk2LFjjBs3TminqKiI2tra\n2y6fbW1tLWT4BgQEMG3aNGFfcnIyp06dwtjYmEGDBqmdG
x8fT1RUlEoA18GDB8nLy8PT01PIxDcz\nM2P48OGcOHGCHTt2MHPmTLUAl7y8PLS0tNqcf6WkpLB3714uXrxIRUUFJiYmQsZd8+Cz06dPc/Dg\nQeF7b29vz7Bhw5g6dapawKIyIPD7779n27ZtnDlzhoqKChwcHJg9ezYDBw6koaGB3bt3c+zYMYqK\nirCysmLKlClq2drNx7N9+/Zl27ZtpKam0tjYSK9evXjxxRfVxN2SkhKOHj1KbGwseXl5VFZWYmpq\nSu/evXnuuefUxOPbDapTzhlnz56tcY5dWlrKSy+9RKdOnVi3bl2rz19E5ElDFIFERERERG6b9hjb\n34rSgP330FSMLB0wtu1CWfYlitPOY9WtD9CUCQRg3rkXN0oL0DE0ZntYKi6Wupw8eVKtzbq6OsFQ\nVBk9ZmhoSKdOnTh9+jR2dnZCxFxjYyMbN26krq5OmCgnJSWpZMvcitJY/Pr167d1rx4eHnh4eJCQ\nkEBhYeFtLQLHx8fzzTffqBj0QtOAV0dHh//+979qZRMqKpoiCKVSKbNnz76j6z5s2nN/AKtWreL6\n9eu8++67DB06VNheYtWXfb+u5lpMEGadXNE1bPqM8xNDKclIQN/YAlf/BVg6NQl0nl0s+fIFX5XS\nD3v37iUxMZF+/frxwQcfCIsns2fPZsmSJezcuZMBAwbQq1cvlf4lJCSoTUSGDRvGhx9+yJ49e1RE\noL1791KQk0lPz34YekwQREU798EkH/m5xecj+ie0zcPypViyZAm1tbV31YaS1jIsO3qPwMK5N6nH\ntkKz6ODbRblAVFxcfMf9VC523BqVDZojzttbnuV+oPQNaOnaLW0XERG5fUTfzbunPaWWNQUC3CuU\nv5n//e9/cXJyuq1z2xMo1R4aGhrYvn07FhYWrF69Wk2kUWYr3S3Ke23pHVRSUoKZkR5PD+rBuSra\nzHJ7bYwb25L0VLanpqYSERGBt7c3H330EdnF1Rz5MRT7zk2eRAUXw2meq2Ji54SZQ3fyE8PIPnuA\nOlkJObFa5IZsxM7KDEtLS5XnPHr0aNLS0jh8+DALFiygT58+2NraIpPJKCgoIDExkVGjRrFo0aI7\nfk7NuV5+g4obdS2KQNq6elh270dZ9kUuHfwBqXUnSq5fZ9l7y1m7di0ymUwo8T1o0CCKi4spLCxk\n/fr1xMXFYWNjw+HDh7l69Spz5sy5o2ybRYsWsWzZMn799VdiY2Pp3r07RUVFnD59Gi0tLRYvXqwW\nKAjg4+PDZ599xqBBg7C3tyc9PZ2YmBiNmeGvvfYaubm5/P7775w4cQI3NzfMzc0pKSkhOzub1NRU\n3n333VZFoKCgINavX4+Wlha+vr507NiRsrIy0tLSOHTokCACbdmyhZ07d2JqasqwYcMwMDAgJiaG\nLVu2EBsbyyeffKKWRSWXy3n//feprKzE19cXuVzOqVOnWLlyJZ988gmHDx8mOTmZfv36oaury+nT\np/nxxx8xMzPTWO45JSWFnTt34u3tzYQJE8jLyyM8PJykpCQ+/vhj3N3dhWMTExPZuXMnnp6e+Pn5\nYWhoSG5uLuHh4URGRvLVV1/h7Kyegd/eoLrhw4fz22+/cfToUZ599lk1AS44OJiGhgYVUVFERKQJ\nUQQSEREREblt2mNsr4njCTnCAq3T4GmkhWwl82wA15MjMbJ2oLokj+qSPBJ3f0eDvA7XcfOJz5Sy\nMyCQ9evXY2hoyKFDh6ivr6e6upr9+/dTX1+Pj4+PymB++vTpREREcOnSJdauXYuzszPx8fHI5XKc\nnZ3JyMjA3d2d8PBwgoOD6dChg1pfr169Srdu3Th9+jQrV64UBrAXLlxo1Rfmbhk3bpyaAKREW1tb\nrawB3BSrHnfaur+MjAwSExMZPHiwigAEMG+0J+dihnPl5A7Ksi9h06MpYqy2ohjTjt3o6f8qRpZN\nn7PSj0BLS0tlYSE4O
BiJRMIrr7yi0g8zMzOee+451q5dy9GjR9VEIFtbW5599lmVbX379sXGxkat\nxv2xY8eQSCR8sHQR3wVnCgsK+sYW2Lj6kBd/Su3+Rf+E9vMwfCmaezDcLa1lWOoamdDY2CAs/iij\ngx1u8xo9ezaVHYyJiVHzyGkvSlGnqEj9XaAsr9Mc5W/axYsXaWxsVJuwJyQk3FE/2oPy2pcvX0ah\nUKgtUorlXERE7j2i7+ad0f5Sy3ceCABNQn5LWTQ9e/YUFndvVwS6V1RUVFBVVYWXl5eaAFRTU8OV\nK1fUzlG+V24n60mZHdPSO0i5fcbogUyVdmB7WCp58erHKbPcHKSNbLtlX15eHtAkMGhra6vMoaqL\nc2iU16u1Z9NzIOXZyUi0dShKjUZb3wiTMcP55D9LmDdvHvb29irHL1y4kP79+3PkyBHi4uKoqqrC\n2NgYGxsbpk+fztNPP93eR9ImmUWyNo8xsXfBzn2w4L+oa2hMdW0VmZmZSKVS3NzcmDt3Lt7e3oSF\nhQnZ/pGRkWhra6OlpYWbmxtOTk539I7u0KEDq1at4s8//yQ6OprExEQMDQ3p27cvzz77rMayhAB+\nfn6MGzeOP//8k6ioKHR0dPDz82POnDk4OKiOtoyMjPjiiy8IDAzk1KlThIeHU1dXh7m5OR07duSV\nV16hT58+LfYxOzubDRs2YGRkxJdffknnzp1V9ivHV5cvX2bnzp1YW1vz3XffCdl2c+fO5bPPPiMq\nKoo9e/Ywc+ZMlfNLSkro2rUrn3/+uZAp9PTTT/Pee+/xxRdfYG9vz/fffy+M56ZOncrChQvZtWuX\nRhEoJiaGV199VSVT6Ny5c3z66aesWbOGH3/8URhfeXl5sW3bNjWhLSMjg2XLlrF582Y++ugjtWu0\nN6jOwOD/s3fmAVVVa///HGaZQQYBRUGZDyKiOOU8j9ms5VVvWt2y4VZ0b2Zl7630Vt6bdi27+trP\nzMQKLWdFMQMnQIbDJMgskwwyHY7M8PuD9xw5nsOolsP6/AV7r73XWpvD2Xvt53m+XyMmT57M4cOH\niYmJUasSam1tJTQ0FENDw9v6uRcI7hdEEEggEAgEPaY7BuzayC65UdFhYGKBx+znKE2LovLKJSpy\nEmmqr6WlsZ6aklwsnb0pz06kKOE3tjQUMWXKFKKjo8nLy+OXX37BzMwMKysrBg8ezNy5c9X6mT59\nOs8//zyffvopwcHBODo64unpyaxZs4iLiwMgKCiINWvW8MUXX2BmZkZOTg6//fYbhYWF5OTkkJub\ny6effsoTTzxBeHg4Fy9eJD8/n4sXL3Y4v5szX6sUDT2+Rh3pqU+aNInt27fz0ksvMWHCBKRSKV5e\nXh2aqd4LtL9efZy8qEhJ63R+ysxPhUKhVad+RN9asiRQV9W2cGpubKC2sgT9PqZqASBtfgS1tbUU\nFRXRt29f+vfvr3Fu5cJD6S/SHhcXF60+JzY2NmrZqso+bGxsmDFKSouBudrLnjZtdfUgkPBP6Bk9\n9aXwcjDlkUcewc3NjU8//VS1v6GhgUWLFtHY2Mgbb7yhtpA8cuQIW7Zs4dVXX2X69OkankAbN25U\nSVcGBwer+YOtW7dOJRepJCEhgeDgYGTJqcRmlWFq54zT8OkYWagHl5SeQO0nFp2cxf5f/ou+ng6l\npaUaEhqBgYEacw8MDMTOzo7IyEjCw8M1AqplZWUqH4mOUH5PnTx5ksmTJ6uCpmVlZWrzVdJenuXQ\noUMsWLBAtS8yMrLbfkBhYWHs3r0bmUzGBx98wH//+18GDRrE7NmztS7209PT2blzJxkZGcTExLB4\n8WI++OADYmNjCQ4OZsmSJRp+QPn5+YSEhCCTyaisrMTExAQ/Pz+efvppjZdAAoFAcDvpjdRyb76V\nlM+9DQ0NGBioV65MmzaNH374geDgYNzc3DSeS1tbW0lKStK4l91OLC0tMTQ0JCMjg7q
6OoyM2qTX\nmpqa2Lp1q1qFuBJTU1MkEkmPqve9vLxwcnIiJSWFs2fPMm7cONW+s2fPkpycjJOTEz4+PkgkEvxd\nbOivSGHX1VgWjBiIp7e3WpVbSUmJRh/KSpCkpCTmz5+vWkM11inIizqidVwSwMDUUiWPDTBhkjtV\nVVXU1dVpyGkBjBw5UqsPpTa0+Uq2pzMfyRGT5pKM9iBK38HDVOMF1PwXl01y15p409VYbq7A8fX1\npbi4uNNjoE3y9qWXXuqy3c305Drq6ekxb948DQm17nDkyBGam5tZtGiRRgAIbvh5nThxAoCnnnpK\nTW5RV1eXFStWcPHiRUJDQzWCQADPPfecmlScj48P9vb2FBcXs3z5cjXpwH79+uHl5dVhso6Dg4PG\nenvUqFFIpVKSkpJITk5GKpUCdLg2dXFxYejQocTFxWn1gOpJUt2cOXM4fPgwR48eVft7xcXFUVxc\nzLRp0zqUnxQIHmREEEggEAgEPaY7BuyGppYMX7K20za6+ob0k46nn/RGxlFNaR5Fsl+pLS+iPCue\nPpb2LPjTSzw6zlPDYyMsLIyNGzdqlZ946aWXcHR05Pjx41y9epXS0lKOHz+uZui+ceNGDh48yLlz\n5/D09KSsrIympiacnZ2ZN28eLi4ueHl5sXTpUlVfK1eu1OirI+329PgrSKoriMsu6/ZL/I78KRYu\nXIi5uTlHjhzhwIED7N+/H4lEglQq5c9//nOHWW13I9qvV3+qnMZTdTWJ3D0hmPfRnJ9Spz4+Pr5D\nI3dPJyv0zNo+n83/p+Gu36dtcd6ZH4HSN6QjXXjlwkub1JU2HX5oW6C1tnubo+xDea6b/RP0jdTP\nI/wTekdPfSnc3Ny4fPkytbW1qqzFlJQUGhvbsnNlMplagEEmkwFtmY7aGD16NND2/SSVStVelN0s\nCxIVFUVkZCQBAQG4Dh3N5ZoEqgrSuX6tEK95L6FnZNzlfOsam2ltaWLt2rUMGjRITUIjLi5O7TMI\nbS8t3n77bd5//30+++wzjh49iqenJw0NDeTl5SGTydi/f3+nfXp4eKgW/m+88QZ+fn5UVlYSFRWF\nv78/Z86c0TjmxRdfJCgoiG3bthEXF4eLiwtFRUWcP3+ewMBAoqKiupzrV199BbS9wBw7diwDBgzg\n4sWL/Pvf/6agoIAlS5ao2iYlJfH+++/T0tLCwoULOXXqFHFxcTz66KN4eHiQmZnJf//7X6ZMmUJk\nZCQSiYSYmBjWrVtHc3MzgYGBODg4UFZWxvnz57l48SLr1q27rb4KAoFAoKS3Ust90PSr6Qo/Pz/S\n09NZu3YtPj4+6Ovr4+LiQmBgIGZmZqxevZqPP/6YoKAg/Pz8cHZ2VgVYUlNTkcvl7Nu3r8f9dheJ\nRML8+fMJCQlh1apVjB49mqamJhISEpDL5QwdOpSEBPWSHCMjI9zd3UlOTmbDhg04OTmpZLY6qmiS\nSCS8/vrrvPfee3zyySeMHj2a/v37U1BQwPnz5+nTpw+vv/662jrD1qIP/SyNmRswEF9fTUmrm3Fz\nc8PLy4tz587x1ltvUWdkS+7FdKoLMzA074u+sWbFXFODpt+SvqSZbdvaZIO1+dn8XnRnDXg7j7uf\naJ/8dvi3KK7XNxEQENDpMcqqN23PnE5OTtjY2FBcXIxCoVALepiYmGhUjEHbOqcjj6i+ffvS3NxM\nRUWFhm+kMhB6M76+viQlJZGZmakKAgFER0dz9OhRMjIyqK6u1pAPrq6u1lhzdTepDsDZ2RmpVEpM\nTIxa8pLSy6y3le4Cwf2O+CYWCAQCQY/prZG6i70ZURmdZ+iZ2g7AbdpStW1+w73x9XXRyBabOnVq\np9JsCxcuZOHChR3u79OnD08++aTW7Kmb6aiv7mi3r/4+ktfnDe2WdntneupTpkxhypQpKBQKLl26\nxPnz5zlx4gRr165ly5Yt90RVUGfXq6+rH7j60dx
X4uPjqa6uJjk5GU9PT+zs7ABIT08nJiaGbt268Y9//EMKfkB98GLnzp33dJ3an62+ffuybNmy\ne9q3CZU+GnX7JBsPq/oXytnZmSlTpuhs26NHD9q2bUta2q1hgoMHDyKTyZg9e7bOa2BnZ0doaCif\nfvopv/zyi04QaOvWrSiVSp0gkLYE3+eff05OTg7u7u5SCb5+/frd9/09KZpzIt+1a9d488038fLy\nYtSoURQVFXH8+HFWrFjBkiVLGDBgAFAfKPr+++/p0qULvXv3xtramry8POLi4oiPj2f58uX07NkT\nqM9emzp1Kj/88APOzs46QZ2goCCgPqgUERGBt7c3Tz/9tLTe29v7jtebl5fH0qVLycvLo0uXLvTs\n2ZOKigpOnjzJihUrmD9/PiNHjmzy/QuCIAhCU7XGIJDd7/8vaWS9drn9I7gWQRAeMVGCSxAEQRD+\neE3t12Bm44jvkFuz9eeNDGBon/pBMplMxujRow0GfBozZMgQnXJMAOXl5Vy9erXRwbeAgAACAgKa\ndHxzc3NeffVVXn31VYPr9+zZ0+i+ixYt0msQfmsCC1g7eza6b8MJLIYamQuPnzOZBXed2GTj6s0V\nDWz48RBBDlX4+/tLZQoVCgVRUVHs27ePiooKnSyg3NxcAPr06aMT/ABIS0ujqqpK71zaPj+GSit6\neHhgZWXFuXPnqKmpwdi4eYcRGsuOqiwr5tKlIjyfCjTY78jJyYnU1FSgPispNzeXNm3a4OHhobet\nNlsoIyNDWqZ9fe+3BJ9gWHNO5FOpVIwbN44///nP0rLnnnuOJUuWsG7dOnr27ImlpSUeHh5s3LgR\nW1tbnf0LCgp44403+Pe//y0FgXx8fPDx8ZGCQGFhYXrndXFxISIiAh8fH4PrG7N69Wry8/NZsmQJ\nAwcOlJar1WqWLl3Kl19+SXBwMPb2YjhLEARBeLj0f3N6wmg0mp6G/gNS/+hrEwTBsO7eTnw0vS9f\nvDKQeSMD9BpKiwCQIAiCIDSvP6LPQ0lJCTU1uhlItbW1bNiwgaqqKqmkXEszqrsn770UTNcOjgbX\nd+3gyHsvBTOyW/tHfGXCH2nzsfS7ZkZYOrhhbGpOyaVzxKvSdAI92uDD9u3bdb6G+gFqqB8gb6ik\npITw8HCD57Kxqc9ozsvL01snl8sZO3YshYWFfPnllwaDSIWFhQZ7At2r/WcusnRzbKPlw27crCLy\n7HUOJOifSy6XS32P1Go1gF4vMC1tmbqysjJpmbGxMQEBAWRnZ1NSUoJKpTJYgg8wWIJPaFxzvQ9a\nWVnplAUE8PPzY/DgwajVamJiYqTtbg8AQX3gMCQkhMuXL5Ofn39P575XmZmZqFQq+vXrpxMA0l7f\nSy+9RFVVFdHR0c16HYIgCMKTqTVmAmkzfewaWa9dXtzIekEQHhOiBJcgCIIg/DEeRZ+H20VHR7N5\n82YUCgVt27altLSU5ORkcnJy8PHx0enb0NKIHnJCQ1l5pU362ZEZGWHt3IHiy+eoBtp4dJTWOTs7\n4+bmRm5urtSzRsvPz4/OnTsTHR3NkiVLCAgIoLi4mPj4eNzd3Q0GRvz9/TEzMyMiIoLS0lIpSDJm\nzBisrKyYMmUKmZmZ/Pzzz8TFxdG1a1fatGlDSUkJV65cISUlhenTp9O+/f0HM5uSHQWABlbvTcTZ\nzqLRyV9WVlYAFBUVGVyvXa7dTkuhUJCQkIBSqSQ1NbXJJfiEu3uQ98HGSgP6+vpiYWGht31QUBCR\nkZFkZGRI5dzOnj1LREQEqampFBcX600quH79Om3btn1Id6tPm6WmVqvZsmWL3vqSkvqhrocRTBUE\nQRCE27XGINA5oBfwFKDTz0cmkxkD3kANkKG/qyAIgiAIgiAID0Nz9nkwpFOnTgQEBJCcnCw1qndx\ncWHy5MlMnDhRKpPVkokJLAJAQlZBk7e1dvWm+PI55KbmlBjpBhwUC
gW5ubl07NhRJ5hhZGTE8uXL\n2bRpE6dOnWLPnj20adOGESNGMGXKFIPlDq2trVm6dCk//PADkZGRVFRUAPUlGK2srDA2Nuatt94i\nKiqKQ4cOcfLkSSoqKrC1tcXFxYWXX36ZwYMH398L8rumZEdpaTSw5Xh6o0EgCwsL3NzcuHr1Kleu\nXKFdu3Y667Xl3Hx9fXWWazN7tEGgppbgE5ruXt4H71Ya0DfQxOB+2nJq2oywmJgY3nvvPUxNTenW\nrRtubm6Ym5sjk8lISkpCpVJRXV39AHd1d9rPrYSEBBISEhrd7ubNm816HYIgCMKTqTUGgQ4DLwGj\ngB9uWzcQsASOaTSaykd9YYIgCIIgCILwpGjOPg+G+Pj43HNTekFoicora+6+0e+c/YNx9g8GoKJa\nt1/P/PnzmT9/vsH9bGxsmDdvnsF1jfWd6tmzp9QXxRCZTGawL5fB63Z2vmMPrds1NTuqocTsQrLy\nShsNKAwbNozvv/+er7/+mmXLlkl9hG7cuMHWrVsBGD58uM4+vr6+WFlZERsbS0lJCYMGDZLW3akE\nn/Dw7T9z8Y6fLzduVrEn+izPJlzSKyNXXFxfGEYbHN20aRMmJiasXr1aL1tt3bp1eqUTm4OlpSUA\nc+fObdGZq4IgCMLjqTUGgXYAHwChMpnsM41GcwpAJpOZA//6fRvDhY4FQRAEoYFff/2VvXv3kpmZ\nSU1NDW5ubgwaNIgXX3wRE5NbMwtnzZoF1P+RuGXLFo4fP05xcTFt27ZlxIgRTJgwAZlM9kfdhiAI\nwh9mVHdPXOwt2XI8ncRs/QHcrh0cCRvg16r79S1duhSVSnVPA9qCcCeWZvf3Z/j97tca3Et21O37\nNRYEGj9+PPHx8cTGxvKXv/yFXr16UVlZya+//kpJSQkTJkwgICBAZx9tab3Y2FgAnWyfO5XgEx6u\nppYGLC/MZeWPJ/VKAyYlJQH1kwcAcnNz8fT01AsAaTQakpOTDR5bJpNRV1dncJ02oNjYekM6deoE\nQHJysggCCYIgCI9ci/gtUiaTvQi8+PuXrr//v69MJvv2938XaDSaNwE0Gs0NmUw2h/pgUJRMJtsK\nFALPA51+X77tUV27IAhCa5GWlsaPP/5ISkoKN27cwMbGhg4dOjBy5Ej69+8PQGRkJHFxcVy4cIGi\noiLkcjleXl48++yzBmd9agfGfvzxR3bs2EFkZCTXr1/H2dmZcePGMXLkSAB+/vlnfvrpJ3Jzc7Gx\nsWH48OGEhYUZDJycO3eOXbt2kZKSQllZGfb29vTq1YupU6c22tz3fnz33Xds374dW1tbBg0ahLm5\nOfHx8Xz33XecPn2at99+G2PjWx+TNTU1/P3vf6ewsJBevXphZGTEb7/9xsaNG6murtZrSisIgvCk\nEP1uBOHedPO6v6Do/e7XGtxLdlRT9zM2Nubtt99m9+7dHD16lL1792JkZIS3tzdz585l4MCBBvdT\nKBTExsZiaWmJn5+f3jpDJfiEh6uppQFrqirITTzKluNuUhAoPT2dqKgorKys6Nu3L1AfwLty5QqF\nhYXS3xMajYYtW7Y02oPH1taWggLDwUlra2tkMhn5+flNvic/Pz+6dOlCdHQ0Bw8e1MtCA8jKysLB\nwUH0mhIEQRAeuhYRBAK6ATNuW+bz+38A2cCb2hUajWa3TCYbBLwFTADMgfPAYuBTjaaplYQFQRCe\nDAcOHGD9+vUYGRkRHBxMu3btKC4u5vz58/z0009SEGj9+vV4enoSGBiIg4MDpaWlnDp1ilWrVpGT\nk8PLL79s8PgfffQR586do1evXsjlck6cOMHatWsxNjYmMzOTw4cP07t3b+mP6q1bt2JmZsbEiRN1\njnPw4EHWrl2LiYkJwcHBODk5ceXKFQ4cOEBcXBwrV658KA1bU1NT2b59O05OTqxatUpqfjxjxgze\neecdTp48ya5du5g8ebK0T2FhI
d7e3vzrX/+SasOHhYXxyiuv8N///pdJkybpBI0EQRCeNKLfjSA0\njZezDUGejvdU/qxrB8fH+uerKVlOZtb29Hh5RaP7vffee3r7mJqaMnnyZJ3f6e5m7NixjWZq3KkE\nn/Bw3EtpQBuXDlw/f4YdX13BtWQo8toKjh8/Tl1dHfPnz5dKsL344ousW7eOhQsXEhISglwu5+zZ\ns1y8eJE+ffoQFxend2yFQsGxY8f4v//7P3x9fTE2NqZLly4EBgZibm7OU089RXJyMitXrsTd3V36\nO8vLy6vR633zzTd56623+PTTT9mzZw+dOnXCysqKgoICsrKyyM7OZuXKlSIIJAiCIDx0LWK0SqPR\n/AP4xz3ucwIY3RzXIwiC8Di5dOkS4eHhWFpa8sEHH+Dp6amzvuEMt7Vr1+Lm5qazvqamhhUrVrBj\nxw6effZZ2rRpo3eO/Px81q1bJ82IHDduHPPmzeOrr77CysqKzz77TNovLCyMOXPm8OOPPzJu3Djk\ncjkAOTk5rF+/HhcXF9577z2d8yiVSpYvX86XX37JW2+9dV+vQ8MZ6kf+u5XyyhqmTJkiBYAA5HI5\ns2bN4tSpU/zyyy96AwavvPKKTuNxOzs7goODOXz4MDk5OXTo0OG+rk0QBEEQhCfLSwP9WLo5tknZ\nDjIZhA3wu/uGrZjIjhK07qU0oKmVA+37PMeVM5HsjvgJZ1tTfH19CQ0NpUePHtJ2o0aNwsTEhP/+\n979ERkZiampKly5deO2114iOjjYYBJo7dy5Q/3fIqVOn0Gg0TJ06VSoD+MYbb/DVV19x+vRpjh07\nhkajwcnJ6Y5BICcnJ9asWcOePXuIjo4mKiqKuro67O3t8fT0ZMyYMeLvCUEQBKFZtIggkCAIgtB8\n9u3bR21tLaGhoXoBIKj/Y0Tr9gAQ1JfSeO6550hMTESpVPLMM8/obTNjxgydkhiurq4EBASQmJjI\nrFmzdAI6VlZW9OnTR6d0HNSXjKupqWHOnDl6gSaFQkFwcDBxcXHcvHkTCwuLJt//mcwCNh9L15lR\nmHriDOWF19l9rhqXTgU6NcTd3d1xcnLi2rVrqNVq6b6srKwMvj7a16+srKzJ1yQIgiA8GhUVFUyd\nOhU/Pz8+/PBDaXlVVRWhoaFUV1ezePFinZKn+/btIzw8nIULF+qU66mtrWXnzp0cOnSI/Px87O3t\nGTRoEC+//LLBTFClUsmuXbtIS0ujoqICZ2dn+vXrx8SJE0UZKYHu3k4sei7orn1PZDJ4fUzXVt1X\nqylEdpSg1ZTSgLdnhfkMDmXG4KfuGCwdOnQoQ4cO1Vvu5eVFWFiY3nI7OzuWLFnS6PHc3Nz4+9//\nbnBdUFBQo33kLCws7jk7TRAEQRAelAgCCYIgPIYaZr38dDSO8soaevbsedf98vPz2bFjB0qlkvz8\nfKqqqnTWX79+3eB+HTt21FumrbdtaJ02yNMwCJSamgqASqUiPT1db5+SkhLq6urIyckxeExD9p+5\naHBwpba6EoDz12tYujmW18d0ZWS3W41iHR0dyc/P1wsCGaLNZLqXxrCCIAjCo2Fubo6fnx9paWk6\nkwhSUlKorq4G6oM1DYNASqUS0G0ID7By5UqSk5Pp2bMnlpaWnDp1ip07d1JcXMyiRYt0tt2/fz/r\n16/HzMyM/v37Y29vT1JSEjt27CA2NpaPPvpIBIIERnX3xMXeki3H00nM1g9+dO3gSNgAv8c+AKQl\nsqMEaFppwIe5nyAIgiA8CcSnpCAIwmPEUNZLcloOlaWFfPTzeWYOM290IOHq1assXryYsrIyunTp\nQo8ePbC0tMTIyIi8vDwiIyOlAbPbGRrI0gZH7rSupubWTL8bN24AsGvXrjveY0VFxR3Xa53JLGh0\ndq3cxKz+/BVlyE0cWb03EWc7C+m1KSwsbPTaBUEQhNZFoVBw9uxZVCoVvXv3BuoDPUZGRgQGBkp
B\nH6hvFJ6UlISrq6s0SUErNzeXdevWYWNTn3kwbdo0Fi5cyOHDh5kxY4ZUXjQvL48vvvgCc3NzVq1a\nhYeHh3SM8PBw9u3bxzfffMOCBQua+9aFVqC7txPdvZ10JvBYmhnTzcvpictyEdlRAojSgIIgCILQ\nHEQQSBAE4THRWNaLsak5lUDCuWyWXlPrZb1o7d69m9LSUhYtWqRXKuHYsWNERkY249XfCrhs27ZN\nauL6IDYfS290AMHC0ZXywlzKrmVjZuOIRgNbjqfT3duJ3NxcCgoKcHFxEUEgQWiCpUuXolKpdMqe\nJCUlsWzZMqZOnWqwxIogNKfbB9OdPOqzR5VKpU4QqGPHjvTr14/PP/+cnJwc3N3dycjIoLS0lH79\n+ukdd+bMmVIACOqzjAYNGsTWrVs5f/68dOyoqChqamoYN26cTgAI6gNHR44c4ciRI7zyyiuYmJg0\n18sgtDJezjZPXNDHEJEdJYjSgIIgCILw8IkgkCAIwmPgTlkvlk4eqK9f4caV85jbOellvWjl5uYC\nGBz4SkpKapbrbqhTp06cP3+e5ORkaSDtfmXlld7xD8c2vt25fv4MV1XHsPV4ChNzKxKzC8m4WsKW\nDRvQaDSMGDHiga5BEARBeLQMZcMC1NXWkp1bxqHjvzF79mzUajUXLlxgwoQJdO3aFagPCrm7u5OY\nmAggLW/Iz0+/9FTbtm0B3b5wFy5caPQY1tbW+Pr6olKpuHz5Mt7e3vd5t4Lw+BLZUYIoDSgIgiAI\nD5fRH30BgiAIwoO7U9ZL26d6ITOSc1V1jIqSfCnrRaugoABAKntze8Dn9OnT/PLLL81z4Q2MGTMG\nY2Nj/v3vf5OTk6O3vqamhuTk5CYdKyGr4I7rrdu2x6VLCJVlxaTuDedS3D5yTh9kwV/+QmxsLAEB\nAYwfP/6+7kMQBEF49PafucjSzbEGJwAYyeXUWrtwODaJXceTUalU1NXVoVAoaN++PY6OjlJJOKVS\niUwm0+sHBHcub9qwL5xarQZu9ca7nbZsnHY7QRAM83K24cU+3oQN8OPFPt4iAPQE0ZYGlMnuvJ0o\nDSgIgiAITSMygQRBEFq5u2W9mNu1pX3vZ7kU9xOp+77AzsOfKwmO2FyJpvDqJSwtLXn33Xd57rnn\nOHToEO+//z4hISE4OjqSnZ3N6dOn6d+/P8ePH2/W+/Dw8GDhwoV8+umnzJ8/nx49euDu7k5tbS15\neXmkpKRga2vL559/ftdjlVfW3HUb9+7DsHBwpeBcHIWZSjR1dbh18mbmtGm8+OKLGBuLj0hBaO0i\nIyNZs2aNwTKXwuPjTtmwWtau3tzIzeD9jXsZ4WOCqakpnTt3BuozduLj46muriY5ORlPT0/s7Ozu\n+3q0waKioiI8PT311hcVFQE8lNKnQuuSl5fHrFmzGDp0KIsWLfqjL0cQWjRRGlAQBEEQHh4xwiUI\ngtDK3S3rBcDJrycW9s5cOxtD2bUsSi6ncqSiHQN7BUplz7y8vHj33XfZtGkTJ0+epLa2Fm9vb5Yt\nW4aVlVWzBoEqKiqYOnUqfn5+rF69mt27d5OYmEh8fDynT59GLpfzwgsvMGvWLGmfffv2ER4ezsKF\nCxk+fDjnz5/n8OHDJCUlkXAui7TL1zGxtMXOoxOugQMwNrPQOWddbS01FWrqaquRyYzQyDRoaqtJ\nS0sjJSWFbt26Sdtu2LCh0WsPCwsTPU+Ex05kZCRxcXFcuHCBoqIi5HI5Xl5ePPvsswwZMuSPvjxB\n0HGnbFgtG9f6smuluZn8lJXP6GB/TE1NAVAoFERFRbFv3z4qKioMZgHdCx8fH6Kjo0lKStI7llqt\nJiMjA1NTU9q31+/PJwiC0NoYCm6uWbOGyMhINmzYIFUbuB+iNKAgCIIgPBwiCCQIgtDKNSXrBcCq\nbXt82t4acJox+Cm9+tmdO3fmnXfeMbh/w6bvWu+9916j51u
0aFGjs1xvD5yYm5vj5+dHWloaLi4u\n0n4JCQksX74cAG9vb53+CtrSPdoBtgMHDhATE0NQUBAePv58d/Qc5ddzyTsbw40r5+k0ahZyEzNp\n/+yY3RRlqbCwd8bRR4FMbsLT3dqSlXWB06dP6wSBBOFJs379ejw9PQkMDMTBwYHS0lJOnTrFqlWr\nyMnJ4eWXX/6jL7FJnn76acLDw6XyW4K+pKQkli1bxtSpU5stoN2c2Q93y4bVsnRww9jUnJLL5yio\nUOM25XlpnfazZfv27Tpf368hQ4awdetW9u7dy9ChQ3Fzc5PWbdq0ifLyckaMGIGJickDnUcQBOFJ\n4eVsI4I+giAIgvAARBBIEAShlbM0u7+38vvdr7koFArOnj2LSqWid+/eQH2gx8jIiMDAQCnoA6DR\naEhKSsLV1VWaXThp0iTmzZuHkVF9u7t8xxiSLhZy/fwZsn+LID/tJK5d+gNQU1VBcXYylm3a0Wnk\nLGRGRnTt4Mi/pvcFoLS09FHeuiC0OGvXrtUZuIb6vlwrVqxgx44dPPvss7Rp0+YPurqms7KyMtjH\n5UnzOJegako2LIDMyAhr5w4UXz5X/7W9h7TO2dkZNzc3cnNzpc+cB+Hs7MycOXMIDw/ntddeo3//\n/tjZ2aFSqUhNTcXDw4OZM2c+0DkEQRBaCkdHR8LDw3VKXE6fPp2JEyc22htNEBo6d+7/s3fmcVGW\n6/9/D7vDvggiiICAuACSCooLFppbZKaZWqbH9Nvi+R4tzV8upX0ts/LkkmbqsVwSNY1zRFNccENF\nVmEARUFUVkVkG1B2fn9wZmKcAcalNL3fr1ev9Fnu555nnPt57utzX5/rEqGhoVy4cIHy8nIsLCzo\n1asXEyZMUPk3pHCF0OTSEBISwo4dO1i6dCleXl7K7cHBwXTv3p25c+eybds24uPjKS4uZubMmUqr\n4KKiInbt2kVcXBxFRUVIpVK6devGuHHjcHNzU7lOU6thMzMzfvnlF65evYqenh4+Pj5MnjyZ9u3b\nq/WvqqqKsLAwIiMjycvLQyKR0LFjR15++WUGDhyocmxtbS3h4eHExcWRlZVFcXExRkZGdOrUidGj\nR9OzZ0+19hX3Zu3atYSEhBAZGUlJSQlt27blxRdfZMyYMUhaK7IlEAieap6sCKBAIBAI7psezg/m\ng/2g5/1R+Pj4sHPnTpKSklREIDc3NwICAvjhhx/Izc3FwcGBzMxM5HI5AQEByvPvtZp4Y6A787ZH\nY9WpBzkJh5HnZypFIAmNQpKOji5IJEgkqGRFmZqKlYaCZwtNNiv3oqenx8iRI5HJZCQlJfHCCy88\nhp42/nb37dtHeHg4N27cwNTUlL59+zJp0iT+8Y9/AL8HB+6tCVRdXc1bb72Fnp4eW7ZsQVdXV639\n77//noMHD/Lpp58qxyKAnJwc9uzZQ1JSEiUlJRgbG+Pj48PEiRNxcHBQaaOpDU5CQgL79+8nLy8P\nqVRKnz59+Nvf/vZMiVOaAoSPCm2zYaGxLlBJziV0DYwwt3VU2efj40N+fj5ubm6P5LsZMWIE9vb2\nhIaGcvbsWaqqqmjbti2vvvoq48aNe6a+f0HrNDQ0sHHjRvbt20ffvn2ZM2cOe/bsUQY0i4uLCQ0N\nJTs7GxMTEwYMGMDkyZPR19dHJpOxY8cOrly5go6ODn5+fkyfPl28ywj+NPT09HB0VB1TrayshAAk\n0IojR46wZs0a9PX18ff3x8bGhry8PA4dOkRMTAzLly+nbdu2D3WN8vJy5syZg5GREQEBAUgkEiws\nLAC4efMmc+fOpaioCG9vbwYOHEhhYSGnT58mNjaW+fPnq7wPKjh79izx8fH07dsXLy8vMjMzlVaw\n33zzjcq7YUVFBfPnzyczM5NOnToxZMgQ6uvrOX/+PN988w3Xr19n0qRJyuPlcjkbNmygS5cu9OjR\nA3Nzc4qLi4mJiWHx4sX
87//+r9LSvSm1tbV8+umnFBUV0atXL3R0dDh37hxbtmyhpqaGCRMmPNR9\nFAgEf22ECCQQCAR/cZxtTfFystLKDkeBd0erx26pcG/QubujAwYGBsqMn4qKCq5cucKYMWOU1jxJ\nSUk4ODggk8kAVcsexYqpU6dOkZ2dTUVFBcUld7haUEZDA9TcKVMeq2tghLmjB6U5l7l0YD0TXxmG\nrrwDVVWmGBoaIhA8K5y/Wsj2U+lq40d1RSm6+eexqLlFQ5Wc6upqlf23b9/+M7upwg8//MCBAwew\nsrJi2LBh6OnpER0dzeXLl6mtrUVPr/nXWwMDAwYMGEB4eDjx8fH4+fmp7K+pqSEyMhILCwuee+45\n5fb4+HiWLl1KXV0dfn5+2NvbU1hYSFRUFHFxcSxdupROnTqpXe+nn34iISEBPz8/fH19kclkHDp0\niPz8/GatN59GNAUIHxX3k9Vq6+mPrac/ACZtDFT2zZgxgxkzZmg8ryXr06CgIOVK4nvx9fXF19dX\n6/4Jnk2qq6v55z//ydmzZxk5ciTvvPOOymrt/fv3ExcXR58+ffDy8uL8+fPs3buX8vJy/P39+frr\nr+nduzfDhg3j4sWLHD9+nLKyMhYvXvz4PpTgmeKPrAkkeLrJzc3l+++/x87Oji+//FIlyzwpKYlP\nPvmEDRs2sGDBgoe6zrVr13j++eeZOXOm2gKgtWvXUlRUxKRJkxg3bpxy+4gRI/j4449ZsWIFP/74\nI0ZGRirnxcTEqC0YCgsLY+PGjXz//fcq73kbN24kMzOTKVOmMGbMGOX26upqvvjiC3bv3k2/fv1w\ndXUFwMTEhB9//BEbG9VFWRUVFcydO5effvqJQYMGKWsbKigqKsLFxYXPP/9cuW/ixIm888477N27\nl9dee63F92SBQPB0I379AoFA8BSgyHpprTA2oJb18mfTXNAZoKzGjMKL6ZSWlpKWlkZ9fT0+Pj50\n6NABKysrkpKSGDFiBElJSUgkEpWC219//TVRUVG0a9cOf39/LC0t0dfX51qBnK07d1NeU6dyLZf+\nYzEsSMKwNJOUM+EsOBOOgYEB/fr1Y+rUqcrVYQLB001plNEAACAASURBVEr4+SxW/pasNm5UyYu5\nFP4v6qrvYmLrRHBgb3p3dkRHR4eCggIiIiKoqal5LH1OTU3lwIEDODg48M9//lOZTfHWW2+xcOFC\nioqKWg02BQUFER4eTkREhJoIFB0dTXl5Oa+88ooySFBeXs4333yDoaEhX331FR06/F5b7fr168yZ\nM4fVq1ezatUqtWulpaWxZs0a5QrWuro6FixYgEwm4/Lly3h4eDzU/dAGhT0KNGZGRUREKPfNmjVL\n5X5lZmaybds2Ll68SE1NDR4eHrz11lt06dJFrd26ujoOHTrEsWPHyMrKoq6uDkdHR4YMGcLIkSNV\ngtjN2dEpgoQbN24kNjaWw4cPk5eXh4eHR4vCS1OelmxYwbOJXC5nyZIlpKWlMXnyZMaOHat2TGJi\nIitXrlSOPTU1NcycOZNjx44RExPDkiVLlBaGDQ0NfPrpp8THx5OZmakMKAoEAsGTyMGDB6mtrWX6\n9OlqNsM+Pj74+/sTExPD3bt3adOmzQNfR09Pj7fffltNACosLOT8+fPKTN2mdOnShcDAQI4fP87Z\ns2fVMuC9vb3VMoReeukl9u/fj0wmo6CgAFtbW+RyOcePH8fd3V1FAILGxUlTpkwhISGBkydPKsds\nfX19NQEIGm2OhwwZwqZNm7h8+bJG+9p33nlHRRwyNzfH39+fY8eOkZubS8eOHbW4YwKB4GlEiEAC\ngUDwFODrYsOskV4aA7pNkUjgg5e88XV5PMGv5oLOCu5I23Hlciob9xzBrK4IAwMDZfDR29ub+Ph4\nampqSE1NxcnJCXNzcwDS09OJioqiR48eLF68WOUFv6GhgfjIw+gZGjNhaFcVuytn21eAx
glASkoK\nERERHD9+nJs3b/LVV1/9sTdDIHiMnL9a2OxvsSAtitqqO3TsOwrrTj24JIEp/fzxdbHh1KlTKiLC\nn0HTrMFj//mFO1W1anZaenp6TJ48mblz57banqenJw4ODsTExCCXy1Usk44dOwagktlx7NgxKioq\nePfdd1UEIICOHTsydOhQ9u7dS3Z2ttr+CRMmqFiY6OrqMnjwYFJTU/80EcjLy4uKigrCwsJwcXGh\nT58+yn0uLi5UVFQAkJGRwa+//oqnpycvvvgit27d4syZMyxcuJDVq1er2JrU1tayZMkSEhIScHBw\nIDAwEAMDA2QyGevXr+fy5ct8+OGHWvdxw4YNXLhwgV69eintS7Tlr5oNK3g2uDfr2dH490G3oKCA\nRYsWcePGDT788EMGDRqksY3g4GCVsUVfX5+BAweyfft2evXqpRIElEgkDBo0iMTERK5evSpEIIFA\n8MTRdFzcfzyaO1W1pKSkkJ6ernZsaWkp9fX15ObmqtXmuR/s7OyU88amZGZmAtCtWzeNGTLe3t4c\nP36czMxMNRGoad0hBTo6OnTt2pX8/HwyMzOxtbXl8uXL1NfXA40Lc+6lrq5xoWJ2drbK9qysLEJD\nQ0lJSaG4uFgtK7+oSP29x9jYWK2mJ6AUlMrLy9X2CQSCZwchAgkEAsFTwjBfJ+wspIREpiO7rv5S\n6N3RiokD3B+bANRS0FmBaTsX8hpg07+P4mVZjaenp3Ilk4+PDydOnODAgQNUVlaqZAHl5+cD4Ofn\np7bC6/Lly1RXV2NhYcErfi4ar2tjY8OgQYMIDAzknXfe4cKFC2rBYYHgaWL7qfRmf4tV8mIALJwa\nBdiGBgiJTMfXxYbk5OQ/q4saswbTziZyp+g2u5PvYOlSqDKede7cWWONH0288MILbNu2jcjISEaM\nGAFASUkJCQkJuLq64uzs/Ps109IAuHr1qsbJe25uLoBGEUhTwOLPnoh7eXlhZ2dHWFgYrq6uTJw4\nUWW/4juNjY1V1k5SEB4eztq1awkLC+O9995Tbv/ll19ISEjgpZdeYvr06UrRpr6+njVr1nDkyBH6\n9euHv7+/Vn28cuUKq1atws7O7oE+418pG1bwbNBc1nNVeQnZ2cVYpqZz/qOPqKysZPHixSrvNPfi\n7q7+71VRa0XTGKNYTf84bTsFTzctiZsCQXNoGhdTL2VTJS/i81WbcLA2xlxqoPHcysrKh7q2paWl\nxu2KhTDN7Vds1/TO1pxrhOIcRdtyuRxoXLSoSehS0PQzXrp0ifnz5ytdMfz9/ZFKpUgkEjIzM4mO\njtaYld9cvUHF+7FCjBIIBM8mQgQSCASCpwhfFxt8XWw0Fnl/3KueWwo6K5Ba2qNnYERp9iXic2sY\nGzxMuU9R/2f37t0qfweUgcOUlBSCg4OV20tLS1m3bp3adUpLSykuLlYJ9ELjy3dlZSW6urrCL1nw\n1HKtQN5i1oSBceNKyfKb1zB37AyA7HoR+49Gcvjw4T+lj81lDdbVVAGQfruGeduj+eAlb4b2aBRe\ndHR0tBZuX3jhBX7++WciIiKUItCJEyeoq6tTq++imLwfOnSoxTbv3r2rts3ExERt25M6Ee/SpYva\nZx88eDA//PADly9fVm5raGhg//79WFpaMm3aNJWsHR0dHd5++22OHj3KiRMntBaBxowZ88ACEPx1\nsmEFzwatZT2X3a3mSHQqHS318O/RTWM9saZIpVK1bYpxRFPAT7Gvtrb2Pnv+eGnONvJBOH/+PCEh\nIcoakf7+/ixcuPAR9fSvQ3JyMvPnz2fChAlqCwAehNbEzc63RZaBQDPNjYu6Bo11dlyCP0DP0Ii/\nN3mv04REIml2bFOILveDYgwtKSnRuL+4uFjluKZoe47i/6NGjWLatGla9WvXrl1UV1ezdOlStYyj\n3bt3Ex0drVU7AoFA0BQR4RIIBI+defPmkZKSwr59+
x53V/4ytHbPnG1NH7vo05TWgs4KJDo6mNh2\npCTnEjWAtePvK1xtbW2xt7cnPz8fHR0dFfsTd3d3unTpwtmzZ/noo4/o2rUrJSUlxMfH4+DgoFwx\nq+D27dvMnDkTZ2dnnJ2dsbGx4c6dO8TGxlJcXExwcPBD+U4LBE8yidcKW9zf1qM3RZmJXI3cg4VT\nFyQ6umTH/MbFzcb8/Z1pREZG/qH9aylrUFe/cYVobWUFuvoGrNgvw9a8Db4uNtTX1yOXy9U85TVh\nY2ODj48PiYmJ5OTk4OjoSEREBHp6egQGBqocqwjAfvfdd2rC8ZPKg6zS1pRtoKenh4WFhcoK2Nzc\nXORyOe3bt2fXrl0a2zIwMFCzNWmJR2GL96RnwwqeDbTJegYwd/Cg0tya8ykJLFiwgM8//1xkHz8i\nCgoK+PzzzzE2Nmbw4MFIpVIcHR0fd7f+EB6lcNYa2oib++OzGJKY3WIQX/Ds0dK4aGzjwJ3beZTf\nysLcwUPlvU4TJiYmXLt2jdraWrUFey1l2TSHwjIzNTWVuro6tYxymUwGoFGsT05OZvz48Srb6uvr\nuXDhgkrbHh4eSCQS5XZtyMvLw9TUVKPlXEpKitbtCAQCQVOECCQQCAQPwJ856XoaaC3o3BSTdi6U\n5FxC18CIUh1V72YfHx/y8/Nxc3NTWZGlo6PDJ598ws8//0xcXBz79u3D2tqaF198kddff533339f\npR07OzveeOMNkpOTkclklJWVYWpqioODA1OmTGHAgAEP94EFgieYO1Utrw5vY2mH2+DJ5Ccdpyw3\nnbqaShrq6+n6XADDhw//w0WglrIG21jZc6foBuW3sjA0tVSxqrt06ZLSV10bgoKCSExMJCIiggED\nBnDt2jX8/f3VPOM9PT05e/YsqampT7wI9DCrtFuyEGmataTIjMrLy2PHjh3NtqcpM6o5mrNhuV+e\n5GxYwbOBNlnPCuy69Ud624LMK6eZN28en3/+ebP2Qs8CVlZWrFu3TmPm0/2QmJhIdXU1//jHP9RE\n/WcNDw8P1q1bh5mZ2UO1o624SQPKIL5AoKClcbGthx+3MxLIjT+MoakVRmY2yvc6aMxovHTpEt26\ndQMa/01fuXKFo0ePMmzY744RERERXLx48b77ZmNjQ48ePUhMTCQsLIzRo0cr9126dImTJ09iYmJC\n37591c6VyWTExsbSu3dv5bb9+/eTn5+Pt7c3tra2AJibmzNo0CCOHz/Ozp07GTdunFrtQ8UiR0VW\ntJ2dHbm5uVy7dk3l3fPIkSMkJCTc9+cUCAQCECKQQCAQCP4EWgs6N8XW0x9bz0YLocoaVbukGTNm\nMGPGDI3nmZqaqtSsaMqmTZtU/m5sbMz48ePVVm8JBM8CUsPWX/9M2nbAffBbQKOAkPqfVTg4dsDL\ny0stA/HLL79UO1/TcdrQWtaglYs3tzPOczMlEnPHzugZGCG7XkRGXjFbt269r2sFBASwbt06Tpw4\noSy2e68dGjRaou3atYsdO3bg7u6ulrXS0NBASkqKxtWafyZ/1iptRYC2b9++zJ8//4HbaYpEInkk\n7Sh40rJhBc8G2mY9N+WOdXcm+LsRumMLH3/8MUuXLlXLXn5W0NPTeyQZO4pi6c/qfWyKoaHhI7mn\n9yNuKhZnODz0VQVPA62Ni0bmNjj5v0xWdBgX9/+AmX0ncsyssSyIpb6yjAsXLmBmZsYPP/wAQHBw\nMEePHuX7778nKSmJtm3bkpmZSVpaGr179yY2Nva++zhjxgzmzp3Ljz/+SEJCAu7u7hQWFnL69Gl0\ndHSYNWuWRocIPz8/vvjiC/r27Yu9vT2ZmZnEx8drnJO+++675OXlsX37do4fP07Xrl2xsLCgqKiI\n7Oxs0tPT+eijj5Qi0Msvv0xCQgJz586lf//+GBsbk5GRQWpqKv369ePMmTP3/TkFAoFAiEACgUAg\n+MPRJuj8KM8TC
B6Gppl+Y8eOZfPmzaSmplJTU4OrqysTJkzA19dXeXxERAQrV65k1qxZWFhYsGfP\nHjIzM7lz546KEJKTk8OePXtISkqipKQEY2NjfHx8mDhxIg4OquGSkpISQkNDiYmJobCwUGnL5enp\nyfjx42nXrp3K8QkJCYSFhXH58mXu3r2LjY0Nffv25fXXX1fL8NixchGpl27gOfI9biSfoPj6BWor\ny9GXmmPt5otd137KoHy+7AT5spMAXEuNU6m5NWvWLI2iycPQWtagqZ0zNu49KUyPJ23/uv/a1ekw\n4/x2ujm3w8rKSmtBwcDAgH79+nHkyBEOHDiAqampcjVnU8tNU1NT5s2bxxdffMGcOXPw8fHByckJ\niUTCrVu3SEtLQy6XExoa+tCf/0FpbZW24p401Ne3arXSGo6OjhgbG3Pp0iWNdiwCwbPK/WQ9N8Wi\n03PMnGnBqlWr+Pjjj/niiy9o27btI+7dk4+mLPuVK1cSERHBpk2bSEhIYP/+/eTl5SGVSunTpw9/\n+9vflM84Rf0bBU3/3LSuRl5eHjt37iQpKYmysjLMzMzw8fFh/PjxtG/fXqVPISEh7Nixg6VLl1JU\nVERYWBhZWVmYmZmxadMmlT6//vrrbN68meTkZGpqavD09GTatGl07NiR0tJStm3bRkxMDOXl5Tg7\nOzNlyhSV+pbQKGAdPnyYhIQE8vPzKS8vx8zMjO7duzN+/Hg6dOig1jdofA+JiIhQ7lM8n1uqCaTt\nfbhWIOfw/lDyZSdxHzKZ2so7FFw4w93SW+jo6mHazpW2nqr132TXi2hDJQKBNuOilas3bSztKLh4\nDvnNq8hvXOHgnUy83Z3o16+fikNDhw4d+Pzzz9m6dSsxMTHo6urSrVs3li9fztmzZx9IBGrXrh0r\nVqxg165dxMXFkZKSQps2bXjuued4/fXXNdrlQuNiomHDhrFr1y5iY2PR09MjICCAt956S+29XiqV\nsmzZMsLDwzl58iRnz56luroaCwsL2rdvz7Rp01TmFj179uTTTz9l165dREZGoquri7u7O0uXLuXm\nzZtCBBIIBA+EmLUJBIKHorKykgkTJuDu7s7XX3+t3F5dXc348eOpqanhww8/5Pnnn1fuO3DgAOvW\nreMf//gHQ4YMUW6vq6vj119/5ejRo9y6dQsLCwsCAwN58803NQaZ7iegqphEbty4kdjYWA4fPkxe\nXh4eHh4qq9i1CaRqM+lqaGggPDycI0eOkJ2dTUNDA05OTgwePJjhw4drDFImJSURGhrK5cuXqays\nxNbWloCAAMaOHdusTc+9yGQyvvjiC4yMjFi0aJHSi/hx08P5wYKND3qeQPAouHnzJnPmzMHZ2Zlh\nw4ZRXFxMZGQkixYt4qOPPlKzDTxz5gzx8fH07NmT4cOHU1BQoNwXHx/P0qVLqaurw8/PD3t7ewoL\nC4mKiiIuLo6lS5cq/carqqqYO3cu+fn59OjRAz8/PxoaGigoKODcuXP069dPRQTasWMHISEhShHD\n3Nyca9eu8e9//5u4uDiWL1+uYq9j0kYf8zZ6XDn2MzV3yzFr74ZEokNJThp55yNoqKvD3rvRQsfE\nzhlbz0oqs87TzdOdPn36KNtxcXF55Pdcm6zBDn4jMTKzoTA9jsL0OHQNpfi9MJAl/zeHKVOmYG9v\nr/X1Bg8ezJEjR6itrSUwMLBZQcPHx4c1a9YQGhpKQkICqamp6OnpYWVlhY+PDwEBAVpf84+gtVXa\nugZtkEgk1NwpVbHQexB0dXUJDg5m586dbNiwgWnTpmFgYKByTFFRERUVFSoBS4Hgaed+sp7vPe+V\noCD09fX59ttvlUKQ4Hd++uknEhIS8PPzw9fXF5lMxqFDh8jPz1feKzs7OyZMmEBycjIpKSkEBQUp\n7ZgUq+vT09NZuHAhd+/exc/PDycnJ3Jycjhx4gTR0dF8/vnnGgO+//73v0lMTMT
Pzw9vb2+1AvQ3\nb95k9uzZdOjQgaCgIAoKCoiKimLevHksX76cRYsWIZVKGTBgAHK5nMjISBYvXsz69etVBL+UlBR2\n796Nt7c3AQEBtGnThry8PM6ePUtMTAxff/218tnr5eVFRUUFYWFhuLi43Nfz+X7uQ9MgfuHlOEpz\nLmHu2BkTu45UFOZRfD2V8lvZNDSoZu/nFqneI8GzibbjYhtLOzoGjFL+ffIgDyYO0Cy+dO3alWXL\nlqltd3Z2VhM7Aa0y062trdXsw7Whd+/eKnZwLaGnp8dLL73ESy+99FBtd+/eXeMirHudL5oyceJE\njfdGIBA8WwgRSCAQPBRGRka4u7srRRNFqvSFCxeoqakBGsWNpiJQUlIS0BhUa8ry5ctJTU2lZ8+e\nSKVS4uLi+PXXXykpKVGru3M/AdWmbNiwgQsXLtCrVy969eql4serbSBVm0nXP//5T06ePImNjQ0v\nvvgiEomEqKgo1q1bx4ULF5gzZ45Kv8LDw/n+++8xNDSkf//+WFhYkJyczJ49e4iOjuabb75pVQg6\nceIEq1atol27dnz22WfKie+TgLOtKV5OVvdlk+Ld0UrY+QgeKykpKYwePZqpU6cqt40cOZKPPvqI\ntWvXKscqBXFxcSxatIiePXuqtFNeXs4333yDoaEhX331lUpg/Pr168yZM4fVq1ezatUqoHGMzM/P\nZ9SoUUybNk2lrdraWuXYCo3Cb0hICJ6enixevFhlnFBkKIWEhKi1Y21YR1WDIW5Bk9DR0wegnXcg\nF8PWcCvtHHbd+qOjq4upnTOGJhbUlV/G1dX1D59AapP9J5FIsO3SB9suv4+9Lw/tSmlpKZWVlSr3\nNygoqMVspa5du2ptW2dra8u7776r1bGzZs1qtl7cg1rlNYc2FlS6+gZIrR0oL8ji2ulQ8mXWdKy8\nxEsvDnqga77++utcvXqVgwcPEhMTg7e3N9bW1pSWlpKXl8eFCxd46623hAgkeKbQZvwyNLHguTcX\naTxv4MCBDBw4ULm9paBdS2Pbox5jngTS0tJYs2aNUjCpq6tjwYIFyGQyLl++jIeHB7a2tkycOJGQ\nkBClCNTUprOhoYFvv/2WO3fuMHv2bAYNGqTcFxkZyddff80///lP1q1bp7ZYSyaTsXz58mYXV6Wk\npDBp0iTGjRun3LZz5062b9/O7Nmz6d+/P++//76yXV9fX7799lv27t2r8nz28fHh559/VrOeunr1\nKnPnzmXLli0sXrwYaPye7ezsCAsLu6/n8/3eh6ZB/LK8DDoPm0YbS7vf+3b6V25fSaSuWrUOXHWt\nqigkeDYRbhACgUDw5KDT+iECgUDQMj4+PtTV1ZGSkqLclpSUhI6ODt7e3krRBxonHsnJybRr105N\npMjPz2ft2rXMnDmT6dOns2rVKuzt7Tl27BjFxcXK45oGVL/77jvmz5/P3/72Nz766CNWrFhBfX09\nq1ev1tjXK1eusGrVKubMmcPkyZOZNGkSoBpI3bhxIx988AFTp07l//7v/5g1axbZ2dmEhIQAjZOu\nUaMaVyopJl2K/1xdXTl16hQnT57E1dWVdevWMX36dKZNm8batWtxc3Pj5MmTnDx5UtmngoIC1q9f\nj5GREStWrGDmzJlMnjyZ5cuXM2LECLKzs/npp59a/A727NnDt99+i4eHB19//fUTJQApeGOgO9qW\nfZBIaHb1l0DwZ2FsbMyECRNUtrm7uzNo0CAqKiqIiopS2efv768mAAEcO3aMiooK3njjDbWgeMeO\nHRk6dCiZmZlkZ2er7Ls3uwIaVxE2DQ4pAn3/+7//qyYUBwUF4erqyokTJ9TaMZca8H8ff4Cuvr5y\nm76RMeaOnamtrqRKfhto/C2++2JXzKUGyOVygoODWblypUpbK1euJDg4WCXz6UHRJvuv5m45Dfek\nvXSxN2Xjxo0AGov3Ps1oa0Hl3G80Zu3dKcu
/wo3kk2zZto0rV6480DX19PRYsGABH374IQ4ODsTG\nxvKf//yH+Ph46uvrefPNN1UCiwLBs4DIer4/rhXI+U/MVUIi0/lPzFWybpU3e+yECRNUMmZ0dXUZ\nPHgwAJcvX9bqemlpaeTk5ODp6ak2Pg0YMICuXbuSm5tLamqq2rnDhg1rMbve1taWsWPHqmxTiHQ1\nNTVMnTpVRVgKDAxEV1eXzMxMlXPMzc011h5xcXHB29sbmUxGbe2DZZwpuN/70DQY37azn4oABGDj\n9hw6uno49nxRJZNjzKRp7Nu374mckwj+PMS4KBAIBE8OQl4XCAQPjY+Pj9JTWpGynJSUhJubGwEB\nAfzwww/k5ubi4OBAZmYmcrlco3XOlClTMDX9PfPDyMiIwMBAdu7cSUZGhrJtRUD13XffbTagunfv\nXrKzs9X2jxkzRmkJ0ZTWAqk/79zDtl/3I+0ciNRQD0fj5n13jhw5ovw8RkZGKp9nypQpLFy4kMOH\nDxMY2Gi3dOLECWpraxk9erRa8dZJkyZx/Phxjh8/zjvvvIN+k4AtNIpq69ev57fffiMgIIDZs2dr\nDBw/Cfi62DBrpFeLdSugMej8wUveD2xVJBDcL9cK5CReK+ROVa3K77tTp04agzFeXl5ERESQmZmp\nshLbw8NDY/tpaWlA40pehZjclNzcXADlmNW9e3esra3Zs2cPV65coVevXnTp0gVXV1eV7EVF23p6\nepw+fVrjtWtqaigtLUUul6uMr8bGxkx8sTdd3AsJiUxHdr0xk0RfagZAXfVdvDtaMXGAOw7G9fys\n+dY9crTJGixIi6b4WjKmds7otTHFtk09yz7dQ2FhIT179qRfv34az4uIiCAmJoYrV65QXFyMrq4u\nzs7ODB8+XCVbtTkaGho4duwY4eHh5OXlcffuXczNzenQoQNDhgxRswfMyMhg9+7dpKamUlFRgaWl\nJb179+b1119/pAXLtbVaMTS1otPzv4uakwd5EPRfsb2lrIHm7EUkEgnPP/+8VvfO1tZW4zVaypgS\nCP5qiKxn7Th/tZDtp9LV7lNVeQnZ2cV0vq0uBrm5ualts7FpfE8sL29ePGpKRkYGgFodHgXe3t5c\nuHCBzMxMunfvrrKvuee7Ak3PZ8U47+DgoPYuoaOjg4WFBYWF6iJ+bGwsBw8eJCMjg7KyMurq6lT2\nl5WVPdQz5H7vQ9NgvNS6vdrxBsbmANRWq9YAEkF8AYhxUSAQCJ4khAgkEAgeiKZBU0NdQ2obdJQZ\nPxUVFVy5coUxY8YoJxhJSUk4ODggk8kAzRMPTR7cilV/TSd49xtQbUpLQVpNgdRrBXLOpN1AlpRF\nZektNh1KRM9Q2uJE9cqVK0gkEhULCgXdu3dHR0dHZfW14s+a7omJiQmdOnUiJSWFnJwcNY/vpUuX\ncu7cOYKDg5k+fbrWBdEfF8N8nbCzkKoEnZuiCDoLAUjwZ9BaIKpTd32N51lYWACo1QSwtLTUeLxc\nLgfg0KFDLfbn7t1GKxWpVMry5csJCQkhOjqahIQEAMzMzBgxYgSvv/66snaNXC6nrq5OWaespbbv\nFYGgUZz1dbFRjulHai4RU2DKJ2OeY9igRru1R5Hhcz+8MdCdedujmxWLzexduFt8g7L8K9RV36Vd\nRxvMPFwJDg7m5ZdfbnYc/P7773FycqJ79+5YWloil8uJi4vj22+/JTc3lzfffLPFfm3bto3du3dj\nZ2dH//79MTY2pqioiPT0dE6fPq0iAsXGxrJ06VKgsXCwra0tGRkZHDhwgHPnzvH1119rXJDwIDyN\nVistFTMXCJ5kWhu/mvIsZj2Hn89qcTFQ2d1q9sdnMSQxm6E9fn+HNzExUTtWV1cXgPp67WzH7ty5\nA9CsgKLYfu+zHX5/7jeHJstmRf+a2sbeu/9egScsLIyNGzdiYmJCjx49aNu2LYaGhkgkEs6dO8fV\nq1cfOhP
ofu+Ds60pHaxNyAd0DYzUT5D8V/yq//1LFUF8QVOetnGxNathgUAgeFJ5cmd/AoHgiaS5\noGlGeRsun07ktaRMDCtvUV9fj4+PDx06dMDKyoqkpCRGjBhBUlISEolErR4QtDyBajrBu9+AalNa\nCtLeG0gtKL3L1YIylRfWuppq9AwbJ3PNTVQrKiowNTXVWGBcV1cXMzMzSktLVY6H5idjij5rmpSm\npqaiq6uLn5/fEy8AKbg36KzIvujhbCMmjII/DW0CUfvOXmT4Pb9vgJKSEkB9zGruN6gIAH333Xc4\nOztr1T8bGxv+8Y9/0NDQQHZ2NklJSfz222/sLzv7WgAAIABJREFU3LmThoYGpWAhlUppaGhoVQRq\nDWdbU5xtTbmT0Y6s81IcrFuuQfZH0lrWoGk7V0zbuSqzBu/9fppjzZo12Nvbq2yrra1l0aJF7Nmz\nh+HDh2Ntbd3s+eHh4VhbW7N27VoMDQ1V9pWVlSn/XFlZyYoVK6irq+PLL7+kW7duyn179uxhy5Yt\nrFmzhiVLlmjV73u5VyD5q1qtFBQU8PbbbxMUFCSygQRPDSLruXnOXy1s9b4A0AAr9suwNVfPxH0Y\nFM/iphbTTSkqKlI5ril/xjt2XV0dISEhWFpasnLlSrV5gWIR3MPyIPehn2c7Yk9o1/5fIYgv+HMR\n46JAIBA8GQgRSCAQaE1LQVOTdi7k5Wcy57vd9LVvwMDAgC5dugCNGS7x8fHU1NSQmpqKk5MT5ubm\nD9yPBwmoKmgpSNs0kHr+aiHztkdjcR8TVcULq7GxMXK5nNraWjUhqK6ujrKyMpWJlSKQXFxcjJOT\nk9olFJM0TZPSpUuXsnDhQpYsWcK8efPo1atXKx1+clAEnQWCPxttA1FF11J4e/p0PKz06NDOmr59\n+zJp0iQWLFhAXl6eWuBaJpMpbeKqq6uxs7Nj0KBBuLm5cfbsWVJTU5VjVnBwMN27d2fevHls3bqV\nmJgY5HI59vb2vPrqq8paBxKJBCcnJ5ycnOjbty+vvvoqq1ev5sCBA9y9e5esrCzq6upIS0vD09NT\npT9vv/020DhWhoSEEBUVxe3btykpKaFTp04UFRVx+PBhEhISyM/Pp7y8nMLCQm7fvs2NGzeU2YwK\ni5t76/A0R05ODu+99x5eXl7KbJh7+fvf/05OTg4//vijRgH8YbMGNYrM9whA0FjbZuTIkchkMpKS\nknjhhRda/Gy6urpqlj/QmKWl4Ny5c8jlcgYOHKgiAAGMHj2agwcPkpiYyK1bt1RqXDTlfgQSYbUi\nEDxZiKxnzWw/la5VJgBAQwOERKbj8Aiv36lTJ6BRSNeEYrviuD+bsrIyKioq8PHxUXsuVlZWaqzh\npngeaZsNBQ92H5xtTXGxNaM1KUwE8QXNIcZFgUAgePwIEUggEGhFa0FT03aNNmXy/Kv8OzmHIb3c\nlbVpfHx8OHHiBAcOHKCyslJjFtD94OnpqRZQfVg8PT2JjY0lKysLJyenVieqCjGpoaFeOVFVvLS6\nurqSlJREamqq2mdNTU2lvr5eZWLl6urK2bNnSU5OVju+oqKCzMxMDAwM1KztAJydnfnyyy9ZuHAh\nX3zxBf/v//0/+vTp86C3QSB4InnUGQPaBKIqSwqoqiihoaEeve4vEzigM9HR0URHR5OXl4eenh59\n+/ZVHn/16lV++eUXPD09CQgIYNOmTeTl5ZGTk4OHhwdSqZQdO3bg7u6utKWsqKhg7ty5yto0tra2\nnD59mmXLllFRUcGoUaNU+rR161YuXbqkrC1jbm7OuXPn2Lt3L+PHj2f//v1qdcWqqqp477330NfX\nx9fXF6lUyu7duwFISUlh9+7deHt7ExAQQJs2bTh48CDp6el88803uLm54eLigomJCRKJRGPtAk04\nOjoqC1gr6sE15eLFi1y/fp2AgIAW6xo8SNZgc9mqAJ0sJFiUpFKUe4Vbt
25RXV2tsv/27dstfq5B\ngwaxb98+3n//ffr370/37t3x9PRUywhTBOo0Pet0dXXp3r07x44dIzMzs1kR6H552qxWBIK/OiLr\nWZVrBfL7EqoBZNeLaENl6wdqSZcuXXBwcODChQucOXNGpX7cmTNnSE1NxcHBQU28/7OwsLDA0NCQ\njIwMKisrlXVFa2tr2bBhg0rGqQLF8/nWrVtaX+dB74OteRsmjfQi9pa+xiB+eyspX77hL4L4gmYR\n46JAIBA8XoQIJBAItKK1oKnU0h49AyNKcy5RU1lBAb8LEYpaN4rAY3OFSLVl8ODB7Nq1Sy2gqqCh\noYGUlBSNNXmaY9SoUcTGxvLdd9/xxrS/q01U62qqqSwtwNimMcCqa9AGiURCzZ1GWzfZ9SKuFchx\ntjVlyJAhJCUlsWXLFr788kulbVBVVRWbN28GYMiQIcq2n3/+eXbu3Mn+/fsJCgpSsSv6+eefuXPn\nDi+++CL6+prrk3To0IFly5Yxf/58li1bxuzZs9UKlAsEgka0CURVFOZQVVGCkakVhqZWpGdkUvyc\nCx4eHmzYsIHq6mq8vLyU2XkJCQncunULf39/1q1bh4GBAUeOHKF79+54eXmxY8cOBg8ezJkzZ5gz\nZw4+Pj5cv35dKTpbWlqSn59PaGgoo0aNYsyYMcyaNYuoqCjat2+PhYUFMpmMkJAQTE1NWbNmjTJT\naOrUqdjb27N+/XpGjRrFa6+9hp2dHZWVlZw/f54bN27QqVMnjhw5ogwonTlzBmgUKX7++WeVgtUG\nBgZKsWfLli0sXrwYIyMjPDw8SE5OJjs7G0tLS3bt2oW/v3+zQvyIESOQyWQcOnSIqVOnquxTWHkO\nHz5cq+9M26zBlrJVq+TF/Hv3v6irvsvzAb0YOnQoUqkUHR0dCgoKiIiIoKampsX2p02bhp2dHUeP\nHmXPnj3s2bMHXV1devXqxdtvv60cuxXWnc3ZjyqEL22LmWvDX81qJSQkRJl5GxERQUREhHLfrFmz\nsLW1Vf49MzOTbdu2cfHiRWpqavDw8OCtt95SZhs3pa6ujkOHDnHs2DFllpyjoyNDhgxh5MiRKtnA\nTcXl119/nc2bN5OcnExNTQ2enp5MmzaNjh07UlpayrZt24iJiaG8vBxnZ2emTJny0O8ygmcDkfXc\nSOI17RYR3EtukboV8oMikUj44IMP+OSTT/jqq6/o06cPjo6O5ObmEhUVRZs2bfjggw8em72yRCIh\nODiYPXv2MGPGDPr06UNtbS0ymQy5XK5cXNEUxfM5NTWV5cuX4+DggI6OTovP54e5D10cLRk33Esl\niF9dUcK289YE93Z+7M8WwV8DMS4KBALB40GIQAKBoFW0CZpKdHQwse1ISc4lAIr12ipFEVtbW+zt\n7cnPz0dHR4fu3bs/VH9MTU2ZN28eX3zxhTKg6uTkpFwJl5aWhlwuJzQ0VOs2fXx8mDx5Mlu3buX9\n996lUNcOAxML6mtrqK4oobzgOsZtnXB74Q0AdPUNkFo7UF6QxbXToRiaWbNmYyZ/fyOYwMBAzp07\nx+nTp3n//feV2QLnzp3j5s2bDBgwgEGDBimvbWtry/Tp01m3bh0zZ86kf//+mJubk5KSQlpaGo6O\njkyZMqXF/tvb2/PVV1+xYMECli9fTk1NTau2RgLBs4g2gaji66kAmHfoguvA18g7H8F/wn7D1syA\n/v37k56eTrt27ZTHR0VFIZFIGD16tDIDUoEiQ+f69eusWbOG0NBQpWikq6tL79696dq1KwEBAUCj\nqNu7d29Onz5NRUUF0dHR3Llzh6ysLMzNzVmzZo1aMdrPPvuMK1eukJqaysWLF4mOjkYqlVJVVUXb\ntm357LPPlAJQU5qz5ZRKpXTs2BGZTKa0tZw9ezYrV64kJSWFhIQEiouLsbGxaTbI1KdPH6ysrDh6\n9CiTJk1SitgVFRVERkZib2//0FmhT
WktW7UgLYraqjt07DuKUtce9B7y+2rlU6dOqYgQzaGjo8Oo\nUaMYNWoUpaWlpKamEhkZyenTp8nKymLt2rXo6+srM4MUtaPuRVFvQVMdPHg4gaS9rSN07ENujZla\nu907mONcn82Rn1exaWnzAsmjsPNrDS8vLyoqKggLC8PFxUUlg9XFxUUppGVkZPDrr7/i6enJiy++\nyK1btzhz5gwLFy5k9erVKllmtbW1LFmyhISEBBwcHAgMDMTAwACZTMb69eu5fPkyH374oVpfbt68\nyezZs+nQoQNBQUEUFBQQFRXFvHnzWL58OYsWLUIqlTJgwADkcjmRkZEsXryY9evXP7JMLoHgaedO\nVe0DnVddq73NmTZ07tyZFStWsGvXLhITE4mJicHMzIzAwEDGjx+vlrn6Z/Pmm29ibm7O4cOHCQ8P\nRyqV4uvry5tvvklISIjGc2bPns3GjRtJSEjg1KlTNDQ0tPh8hoe/D02D+AUFBfxqKMJKAoFAIBA8\n6YintUAgaBVtV++ZtHOhJOcSugZGSK3ak3itUDlB8PHxIT8/Hzc3t2YDX/eDj4+PSkA1NTUVPT09\nrKys8PHxUQZU74exY8fStWtXlqz+iWvRCdTlXkJH3xCDNmZYuz2HpbOqeOXcbzQ5cYcoy79C3fUU\njuUZM7xPV5ydnZk7dy5eXl4cOXKEgwcPAo3B3dGjRzNixAi1a48YMQJ7e3tCQ0M5e/asMnj76quv\nMm7cuFbvmWJFc58+fdDV1WXlypXU1NQwdOjQ+74PAsHTjDaBqMqSAgAMTSwxMm+L66DxTB7kwcQB\n7tTX1/Pqq68qj62qqqK2tpbBgwdTVlamDNLk5uYikUjYuXMn+vr6ZGdnY2try7vvvgs01gRycXFh\n9erVatd3dXUlKyuLRYsWYWPTKFRMmjSJ8vJybt68qTEQZGZmhqOjI2vWrMHUtHHcffvttykpKeH5\n559XOXbTpk3KP8fGxnLw4EEyMjIoKyujrq4OgOvXrwONNQqsrKywt7dn9uzZXLhwQStbPl1dXV58\n8UV27tzJ2bNnCQwMBODYsWNUV1czdOjQR7raurVs1Sp5Y201C6cuahaezdVFaAlzc3MCAgIICAig\nrKwMmUzG9evXcXNzw9XVVdlu06xPaMxUSU1tFBmbqzvxsAKJfkEOn326lBtVBkqrle6OFmxbv5LD\nWggkj8rOryW8vLyws7MjLCwMV1dXJk6cqLJf8Z3ExsYya9YsFeEzPDyctWvXEhYWxnvvvafc/ssv\nv5CQkMBLL73E9OnTVWplrFmzhiNHjtCvXz/8/f1VrpWSksKkSZMYN26cctvOnTvZvn07s2fPpn//\n/rz//vvKf6++vr58++237N27l2nTpj3Q5xcInjWkWogEhiYWPPfmIpVtYyZN4xU/F43He3l5sW/f\nPrXtEydOVBtTmuLg4KBRENZEa23Z2tpq7IOClvY1fRYr0NXV5ZVXXuGVV15R2zdr1iyNz157e3s+\n/fRTjddo7h6B5vuwcuVK3n33XTZt2qSy4KCl+9DaPbiXlStXEhERoXYNgUAgEAgEfyxCBBIIBK2i\n7eo9W09/bD1/D640PW/GjBnMmDFD43lffvlls20GBQWprXpXXq9JQLU1mps43UvXrl15fer7FDlc\naPVYQ1MrOj0/Qfn394Z2Jei/E1WJRMKIESM0Cj7N4evri6+vr1bHNnfPjI2N+eGHH7S+pkDwVyMn\nJ4fNmzeTmppKTU0Nrq6uTJgwQeW3U1FRwaFDh4iPjyc3N5fS0lKkUimenp5Ydu6rsd2Enz/D1K4j\nLgPGUZZ/hdqqO9xIjaSqvBi7rgFIh3YFGrNBFCJLbW0tW7duJTExkerqak6dOoW1tTXt27cnNzeX\nsrIyZUaHJpoTd3V1dQHVQs9yuZy6uroW2wO4e/eusn/QKFY0J7aEhYWxceNGTExM6NGjB23btsXQ\n0
[… line-wrapped base64-encoded binary payload embedded in the notebook JSON elided; the encoded data is not human-readable …]
UKsnIyODcuXN888032NratqiPtra2vPTS\nS6xfv5633nqLAQMGYG5uTnx8PBcvXsTR0ZEZM2ZIx7e0mLQgCI++pmYS3komg8kD3e59p4S/JTg4\nmE2bNkk/BwUFSf/evXs3QUFBdO/enXfeeYdffvmF6OhoCgoKeOutt6Tisvn5+WzevJmoqCjy8/Mx\nMjKiW7dujB8/no4dO6pd79bitFZWVmzatElacern58dLL72EsbExqamp/PrrryQmJlJbW4unpycv\nv/xyi+91DxsRVBUEQRAE4X4Z5t0eOwsjgk9cIvaq5ncPTydLJg90+0cHgA4cOMC6devQ0dGhd+/e\ntG3blsLCQlJSUvjjjz+kINC6deto37493bt3p3Xr1hQXFxMVFcXKlSu5ceMGU6ZM0dr+ihUrSEhI\nwMfHByMjI6Kioti2bRuFhYXMnj1b7dgNGzago6ND586dsbKyorS0lNjYWL777jsuXbqkkco3IyOD\n+fPnU1xcjI+PD66urmRmZvLxxx/j4+OjtT/btm3j+vXrdOnSBV9fX6qrq0lMTCQ4OJi4uDiWLl2q\ndeBfuPsaKx8BcLO8irDEXA6cT+fpHu2k7aqa3w2VYpDL5dTW1gL1k8eh/nfl1me/25WXl2tsMzEx\nabB9ZXMGBgThESOCQEKLNFX4uaFC6r5PBPHhWy9IM0Ls7Oz46quv2LZtG2fOnGHPnj3o6+tja2vL\ns88+i7m5+d/q54gRI7C3t2f79u2cOnWKyspKbGxseO655xg/frxUN0ilJcWkBUF49ImZhI8fDw8P\noD44o1Ao1HI9q5SUlDB//nwMDQ3p168fMpkMCwsLoL5I6TvvvEN+fj6enp4MGjSI3Nxc/vzzTyIj\nI1m4cCF+fn4abYaHhxMZGYmfnx/Dhw/nwoULUh+mT5/OokWL6NatG0OHDuXKlStERESQlZXFmjVr\nHptJBiKoKgiCIAjC/eLtYo23i7VGuqkeztb/+Ekm6enprF+/HiMjIz777DONWsmq1RQAa9asUcue\nAlBTU8PixYvZunUrw4cPx8rKSuMamZmZrF27FlPT+vd66tSpvPnmmxw+fJjp06fTunVr6djFixdr\nXEOpVLJq1SoOHz7MyJEjpbIBAOvXr6e4uJiXXnqJUaNGSdvDw8NZunSp1tf8yiuvYGdnp/G9+tdf\nf2Xz5s2cPHmSgQMHaj1XuHuaWz4CJXy5JxZb81YtesZWBYr69u3LwoULW9BTQfjnEEEgoUWaKvzc\nUCF1r/6dNJZzmpqaMmPGDLVVOdo0Vlg5ICBAmrl9O29vb7y9vRtt+1Z3Uky6sYLVgiA8WsRMwkdX\nXFwcCxcuZNKkSdJnsoeHBx4eHsTFxaFQKLR+Vl+5coUhQ4bw1ltvIZfLCQsL48MPP2T27NkcO3aM\n/Px8pk6dyoEDB8jKyuKHH35gxIgRvPvuu3z55Zf8+OOPGBoaqrUZHh7Oxx9/TPfu3YH6B9sPPviA\n8+fP8+GHH/L6668zePBg6fjVq1cTGhpKRESERp7qR5UIqgqCIAiCcL8525r+44M+t9u7dy+1tbVM\nnDhRIwAE9WmvVG4PzgDo6uoycuRIYmNjiYmJ4YknntA4ZsaMGVIACMDQ0BB/f39CQkJISUlRmzSl\n7RoymYxRo0Zx+PBhzp07JwWBcnNzOX/+PHZ2dgQGBqqd07t3b7p37058fLxGe23atNH2VjB69Gg2\nb97M2bNnRRDoPmisfMTtVOUjWvJM4OjoiLGxMUlJSdTU1GikIBQE4S/ir0NoEVH4WRCEx5GYSfjP\noqury8yZM6WUAypFRUWcO3dOWjl64MABaZ+7uzv+/v4cOXKEU6dOaTwM+/v7SwEgqH+wHTJkCOfP\nn8fJyUktAATwxBNPEBoaSmpq6mMTBAIRVBUEQRAEQXgQbn2O+eNYBGWVNQ2mTrtVTk4OW7duJSYm\nhpycHKqqqtT2q2r
43M7NTXNFt42NDVC/6v5WxcXFbN++naioKLKysqioqGjwGqmpqQB07dpVa/o2\nDw8PrUGgiooKdu3axZkzZ7hx4wbl5eVqqb0aeh3C3dNU+QhtYq/mc0VRfMfP3HK5nKCgIEJCQvju\nu++YNWuWRt2n/Px8SktLadeuXQOtCMI/gxiRF1pEFH4WBOFxJmYSPlo6derE+vXrMTMz0wjgFZVW\nNXienZ2d1rSjmZmZAHTr1k3rbDJPT0+OHDlCamqqRhDo9lpB8FdxU237VGk1bk3H8bgQQVVBEARB\nEIT741xaLhuPX1IbfE9IvkFlcT7L96Uw40nDBiffZGVlMXfuXEpKSujWrRs9e/bEyMgIHR0dFAoF\nYWFhVFdXaz339hT78FdNl7q6OmlbaWkpc+bMITs7m06dOvHEE09gYmKCXC6ntLSUXbt2qV2jtLQU\nQErVfLtb08yp1NTUsGjRIpKTk3FycmLgwIGYm5tL/dm0aVODr0O4e5oqH9HYeS15RpgwYQJpaWns\n27ePiIgIPD09sbKyoqioiIyMDBITE5k2bZoIAgn/eCIIJLSIKPwsCIIgPCwMDAzIqTZk1Y4EjfvS\npfPXkN0s4FxarsaDr7aHR0CaldjQftX222c3QuMPwtqKm6r2qYqbPo5EUFUQBEEQBOHe2X/umtY0\nvLr6hlQC55OusiC7lDmBnjzdQ3MgfOfOnRQXFzN79myNNPvHjx8nLCzsb/fx4MGDZGdnq6VvVrl4\n8SK7du1S26b6Tl1YWKi1vYKCAo1t4eHhJCcnExAQwOzZs9X25efns2nTpr/zEoRmaqp8xN0+T1dX\nl0WLFnH06FEOHTpEZGQkFRUVmJmZYWdnx5QpUzSyMQjCP5EIAgktJgo/C4IgPBqSkpLYvn07iYmJ\nlJSUYGFhga+vL5MmTZJWqUD9LMCtW7cSGxtLXl4e+vr6WFlZ4e7uzrRp06R832FhYaxatYrZs2dj\nZmbGb7/9RlpaGrq6unh5eTF9+nTatm2r0Y/Kykp27drFiRMnyMjIQCaT4eTkxKhRoxg0aJDWvp87\nd47du3eTnJxMaWkpFhYWdOjQgcDAQHr06AHA+t8OsviD92jj4Y+952Dp3LK8DG5mplKae50Ro8fg\namNEtw7t6N27NzU1DT9kqOr8NPTQefDgQSIiIhrMOV5QUMALL7yAo6Mja9asafA6giAILaFQKJg5\ncyYBAQFMmDCBDRtVayo8AAAgAElEQVQ2EBcXR3V1NV26dGHWrFk4OTlRVFTEL7/8QkREBCUlJTg7\nOzNjxgw8PT3V2qutreXAgQMcPnyYa9euUVtbi6OjI0899RQjR47UKK4dFhZGREQEly9fpqCgALlc\njrOzM8OHD2fIkCEa/V2wYAHx8fHs3LmTbdu2cejQIXJycrCwsMDf358pU6ZorLpMSEhg27ZtpKam\nUlRUhImJCXZ2dvj4+DBp0qS7/6YKgiA8os6l5TZYh9HI2pHSvAxuZqRgaG7Nl3tisTVvpTExSrUK\nvl+/fhptxMXF3ZV+ZmRkNHgNbWndXF1dAUhMTKSurk4jJZy2fjX2OrRdQ7g3mlMGwsDEgp5TFjd4\n3u7duxs894cfftDYpkrBre17yO1sbW0bbb+xeuSC8CgTQSChxUThZ+FxoxqkaOwLweN4beHxFhoa\nypo1a9DT06N3795YW1uTkZHBgQMHiIiIwM3NjfDwcFasWMFHH31EWVkZvr6+9OvXj6qqKrKzszly\n5AiBgYFqRV8BTp06RXR0NH379sXDw4PU1FROnTpFXFwcy5cvx8HBQTq2tLSUhQsXkpqaSocOHXjq\nqaeoq6vj3LlzLF++nKtXrzJ16lS19jdu3EhISAiGhob07dsXa2tr8vPzuXDhAkePHqVHjx6cS8vl\n1wYKj+amnKWiKBcdXX0sXXtQhJJqnfrZjhcuXKBbt25a3zNV0dqEhAStK3R0dHSQy
+Wkp6drfSgN\nDQ2ltraWYcOGNev/kSAIQktkZ2czb9482rVrR0BAAAqFgtOnT7NgwQJWrFjB4sWLMTIyYuDAgRQX\nF3PixAk+/PBDvv32W6leQ01NDUuWLOHs2bM4ODjg7++Pvr4+sbGxfPvttyQnJzN37ly1665bt472\n7dvTvXt3WrduTXFxMVFRUaxcuZIbN24wZcoUrf1dsWIFCQkJ+Pj4YGRkRFRUFNu2baOwsFBtxnZ0\ndDQfffQRRkZG9O7dGysrK4qLi7l+/Tp//PGHCAIJgiDcYmMD34MBbDr5knspmqz445i17YChuQ3B\nJy5JYzO5ublYW1tja2sL1AdWevXqJf37tddeo7S0VOvkrr1796qtxqmpqWHfvn0cOnSI2NhYLl68\nyIoVKzh9+jSBgYHY2dlJ7To7O0vnpaamsmXLFo32ra2t6dGjB+fPn2fPnj2MGjVK2hceHq41qKPt\ndUD9RLcNGzZof5OEu06UjxCEh5MIAgl/iyj8LAiC8PC6ceMG69atw87Ojk8++USqPwMQExPD+++/\nz+nTp9HR0SEiIoLi4mJeeukltYcsqE+Ppq0ga0REBB988AF+fn7Stl27dvH999+zbt06Pv74Y2n7\n999/T2pqKjNmzGDMmDHS9qqqKj7++GO2bNlC//79pVl/586dIyQkBDs7Oz777DO1vsNfNXQ2Hr9E\nQ/MQ7LoNoKIohxLFNRx9hgJg7WTJZPsSpkyZwuXLl7WeZ25uLj103p6aIikpiVOnTuHg4ICOjg7R\n0dFqr1+pVHLw4EEMDAyaNRNNEB6kP//8kz179pCWlkZNTQ329vb4+/vzzDPPoKenJx03c+ZMAL7+\n+muCg4M5ffo0eXl5jB8/XiOli3D/xMfHM3XqVMaPHy9tCwkJYePGjcybN48BAwbw6quvSit5vL29\nWblyJb///juzZs0C4LfffuPs2bMEBgby0ksvSZ/1dXV1rFmzhtDQUPr370/v3r2la6xZs0YKlqvU\n1NSwePFitm7dyvDhwzU+s6F+hvbatWulCQVTp07lzTff5PDhw0yfPl1KtXnw4EGUSiWffPIJLi4u\nam3cvHnz775tgiAIj40riuJGU/QbmtvQzm846RF/cHHvt5g7diHjvCWmGafIz0rHyMiIZcuWMXLk\nSA4dOsSnn35K//79sbS0JCIiguTkZIYMGaI1BfLtvvzyS44fP46TkxM9evSgoKAAJycnrly5wtmz\nZ3nmmWfYvn0733//PXFxcbRt25aMjAwiIyPp27cvJ06c0GjzlVdeYf78+Xz//fecO3cOFxcXMjMz\nOX36NL169SIiIkLt+F69emFvb8/OnTu5cuUKHTp0ICcnh4iICPz8/MjJybnzN1m4Y6J8hCA8nDRH\ndAThDnm7WLN8Wl++fXkQrzzdlemDO/HK01359uVBLJ/WVwSAhEfG3LlzWb9+/YPuhiD8LVcUxeyM\nSCP4xCWWrfuFm6UVvPTSSxoDcl5eXvTu3VtK/aOir6+v0aahoaHW7Z6enmoBEIDAwEDs7e2JjY1F\noVAAUFxczJEjR3Bzc1MLAKmuN2PGDJRKJceOHZO2q1bFzZw5U+tgorW1dZMPvgYmFhppjGKv5tPR\nszdyuZzs7OwGz33ttddo3bo1P/74I9HR0aSkpLBy5UoWLFiAjo4O7733HnK5nH379qmdl5KSQnZ2\nNgMHDtRaH0gQHhY///wzn332Genp6fj7+zNy5EiUSiU///wzH3zwgUbKRFWx5TNnzuDt7c2oUaOk\nWb3Cg2Fra8vYsWPVtqlqOVRXV/Piiy+qfQb6+/sjl8tJTU0F6oPWe/bsoXXr1syaNUst2K+jo8PM\nmTORyWQcPXpU7Rq3B4CgPh//yJEjqa2tJSYmRmt/Z8yYobai1NDQEH9/f5RKJSkpKRrHa7vvmJmZ\naW1bEAThn+j8ldwmj7F286HT0Bcwc+hESfYVF
BdOceTEKczNzRk5ciQAzs7OLFu2DHd3dyIjI9m7\ndy8VFRV07NgRLy+vJq9RWlrKiRMn6NixI6tXryYwMJB27doxbtw4/vvf/zJu3DgsLS357LPP8PPz\nIzExkT179qBQKHjllVeYMWOG1nbbtm3LF198Qb9+/bhw4QK7du0iJyeHRYsWaU35ZmhoyLJly/D3\n9+fatWvs3r2btLQ0Jk6cyLx585p8HcLd8/wgN257DGuQKB8hCPeHWAkk3DWi8LPwqFOlRhGER9G5\ntFw2Hr+kFhRJOhpBaW4e73+7k0Enz2p8RhcVFVFXV0dFRQU9e/Zk165dfPPNN5w7dw5vb2+6du1K\nu3btNAIpKh4eHhrbdHR06Nq1K5mZmaSmpmJra0tycjJ1dXUABAcHa5yjCkKlp6f/1fekJGQyGT4+\nPg2+5qYefOtqaynLy6A0J53YLZ9TW1WBUqlk/EFTamtrKS8vb/DcNm3a8OWXX7J582ZWrVpFQUEB\nUVFR9OzZkwkTJuDm5saZM2eIjo6WViUBREZGAjB8+PBG+yYID9LFixfZsmUL1tbWrFy5UlqBMX36\ndD7++GMiIyPZvn272gqT/Px82rVrxyeffCLVzRLujyuKYs5fyaWssgYjA10cjevXP7q6umqs0lTV\neXNwcKBVq1Zq+3R0dLCwsJA+s27cuEFxcTFt27Zl8+bNWq+tr6+v9tkMkJOTw9atW4mJiSEnJ4eq\nqiq1/Xl5eVrbcnPTHOBRffe6dZa5v78/p06dYt68eQwcOBBPT0/c3d2xthYTywRBEG5VVtlwjctb\nGdu0w9WmnfTz9MGdNAbd3d3d1Vbxx8XFsXDhQtq3b8+7776r0aavr6/0b5lMhlKpRE9PD5lMRkBA\ngDQpAZAmALRr1473339fax8bSotub2/PggULtO679Roq1tbWzJ8//46uIdx9onyEIDx8RBBIEIR/\nhPDwcHbt2kV6ejrFxcWYmZnRtm1bBg4cyIgRIwDtdXlUX34nTZpEnz59+OWXX7hw4QLV1dV06tSJ\nadOm4e7urnG9/Px8fv75Z6KioigvL8fBwYHRo0dja2srtdfcFDpnz55l165dJCcnU15ejrW1NX37\n9mXChAlipcEDdGth7rFjx7JhwwYSEhKorq7G1dWVSZMm4e3trXZOdXU1v//+O0ePHiUzMxO5XI6L\niwtBQUEMGDCgxe3vP3eNVX/EkRFzlMzYY7g9NR1TO2dqKssAiD4RStTRGmoL0mnbxlZKuaZSW1uL\njY0NK1euJDg4WPqdKywspLa2FmtraxwdHaXC3yoWFhbSv1V/Pzt27CApKYnY2Fjmz5/P2LFj6dGj\nBwCXLl3i0qVLDb6nFRUV0r9LS0sxMTHROhNcpakH3yt/bqO2phpLV09MbJzQbWWCjlxOL3d7OjjY\nNPn3Y2Vlxauvvkp0dDSgWYR0xIgRxMfHc+DAAZ5//nl69uzJCy+8gKurK506dZKO8/DwaPChs6nC\npIJwt9waSDjyewhllTVMmDBBCgAByOVyZs6cSVRUFAcPHlQLAkH9yjwRALp/tAX3ASpLCklPL6Bz\nD81RFblcDoCRkZHWNuVyuRR4Ly4uBuqLdW/atKnBftwaMM/KymLu3LmUlJTQrVs3evbsiZGRETo6\nOigUCsLCwqiurtbajrbPXFV/VRMFoL6g9wcffMDOnTs5dOgQ+/fvB6Bjx45Mnz5duqcIgiD80xkZ\ntGxIT9t5t084MCwrbX57RkZSerY333yT/v3707VrVzp37oyBgUGL+ig8+kT5CEF4uIggkCAIj739\n+/ezdu1aWrduTa9evTAzM6OwsJArV65w6NAhKQjUmJSUFLZt20aXLl0YOnQoOTk5nDx5kvfee4/V\nq1fj4OAgHVtUVMTbb7+NQqGge/fudOnShYKCAtavX68RFGjKpk2bCA4OxtTUFD8/P8zNzbly5Qo7\nduwgKiqKF
StWNDjQI9wf2dnZzJ8/H2dnZ4YNG0ZBQQEnTpxg8eLFvP322wwcOBCoT6X0wQcfEB8f\nj6OjIyNHjqSyspKTJ0/y2WefkZqayrRp0+64fRNH9wZnWMn16wdrvcb/m5qqChJ//4rhz41ixZL3\npGNWrVpFWFgYUD8779///je1tbWMGDECKysrFAoFZWVl2NraolAoWLlyJd27dwegsLBQ45rLli0j\nMjISExMThgwZgpOTkzTwN3r0aKkORVOMjY0pLi6mqqqqwUBQYw++pXk3KEy/gJm9Kx2GTEamI5f2\nPTnUnV8TTjarH43p27cvFhYWhIaGMmnSJEJDQ6mtrWXYsGF/u21BuFu0BRIunjxHWX4eO5Oqseuc\nq/bw7eDggLW1NdnZ2ZSWlkp/v/r6+mqFnIV7SxXcb2j27M3yKvZEX+Op8+k83aOd9oOaoPr+0Ldv\nXxYuXNisc3bu3ElxcTGzZ8/WmIF9/Phx6X7yd/n5+eHn50dFRQXJyclERESwb98+PvroI1avXk27\ndi17zYIgCI+THs4tGzy/9byGJhwUZ18hO72AK4riZrX573//m61bt3Ls2DE2btwI1H936N+/Py++\n+KLa5DHhn8PbxRpvF2uNIGMPZ2uRSUgQ7jMRBBIE4bG3f/9+dHV1+frrrzE3N1fb19wCw5GRkRoD\nHqrg0q5du3jllVek7T/99BMKhYIxY8ao5TcePXo0c+fObXa/Y2NjCQ4OpkuXLnz44YdqM2jDwsJY\ntWoVwcHBGBkZsWnTJpYtW6Y1PZdwb8XHx/Pss8/y4osvSttGjhzJ22+/zdq1a/Hx8cHIyIgdO3YQ\nHx+Pj48P77//vjT7efLkycydO5ctW7bg5+ensbKsqfbtnni5wUFCY2sHyvIyKMm5hqG5LUolnE1t\nOne4XC5nw4YN2Nvbk5CQwLvvvoutrS2rVq1i8eLFHD16FBMTE+Li4pg4caLauargZ15eHm+88Qa2\ntrYUFRUhk8lITExs7ttK586diYyMJDo6mr59+2o9prEH38riAgDMHDqpBYAAzGoLNdIXtYSuri5D\nhw7lt99+IyIigoMHD2JoaMjgwYP/dtuCcDc0FEiora4EICWvhgUbw5kT6KkWSLC0tCQnJ0ctCGRu\nbt5gakjh7jqXlttk+hQAlPDlnlhszVu1aBato6MjxsbGJCUlUVNTg65u04+GmZmZAFprMcTFxd1x\nH5piaGiIp6cnnp6emJiYsHHjRqKiokQQSBAEgfqU/B7tLRutkXk7TydLafC9ORMOQv5Mxmew5oSD\nW78jQH3AZ/LkyUyePJnc3Fzi4+MJCwvjyJEjZGdn89lnn935CxQeG6J8hCA8eDpNHyIIgvDok8vl\n0qD7rZpbYNjd3V1jxuuTTz6JXC4nOTlZ2lZTU8OxY8cwNjZmwoQJase7uLjwxBNPNLvPqhRRb7zx\nhkYKlYCAAFxdXTWKNQv3n7GxMZMmTVLb5ubmxuDBgyktLeX06dMAhIaGIpPJmDVrltrvorm5uRRI\nOXjw4B21n5NfxMlTpxrsm02nXujI5dyIPkhlcf3DYUZ+mTSjr6amRhrQS0tLo7T0r7QPqsLfqtU+\nBgYGUuHvuro6bt68SWxsrFQDR8XJyYm8vDw8PT2xtbWVXuPgwYO5dOkSISEhaml/VDIzM8nOzpZ+\nDgoKAupTsGmrL5GXlyc9+GpjYFI/27BEcVVte2cbfXb/9rPWc1pi2LBh6Ojo8M0335Cdnc3gwYM1\n6nAIwoPQWCBBrlefmqWmogTl/wIJ59L+ChDn59d/Xtx67xEBoPtn4/FLTQeA/kephOATDafZbIxc\nLicoKIj8/Hy+++47rcHx/Px8tZpAqs/12wM+Z8+e1XoPa4n4+HgpZd2tbr0fCYIgCPWeH+RGc2/R\nMhlSLaCmJhzo6td/n60qvanxPSEzM1PtueF21tbWDB48mP/85z/Y29uTmJg
opSAVBEEQHgyxEkgQ\nhMfSrcuNWzm4U5CYxKuvvsqgQYPo3r077u7uGquCGqOtmLGuri4WFhZqxYyvX79OVVUVbm5uWgeC\nu3bt2uxBkosXL6Krq8uff/6pdX91dTVFRUWNFrcX7p6GCnN36NBB6/9rDw8PwsLCSE1NpV+/fmRm\nZmJlZYWjo6PGsZ6engCkpqZq7Gus/Y3bdlNekNVgnw3NrWnfexTXwndxKfQnygsykesb8vmXX9PW\nuI7ExESuXLmClZUVf/75J59//jldu3alTZs21NXVcfjwYWJiYqisrKSkpETtd7GqqopevXrx8ccf\n07dvX+Li4khOTqa6uhpLS0u11XEA/+///T8yMjLYuHEjR44coWvXrlhYWEgDjJcuXeLtt9/Gzs4O\nAG9vbyZMmMDmzZt55ZVX6NOnDzY2NhQUFJCYmEiXLl2YPXs2zw9y43TkWY3XbmTZFhObdhReu0DS\ngR8xsWlHTWUpcv1CPDq7SsXT/y4bGxv8/PwIDw8HEKnghIdGY4GEVpZtKMvPpCT7KgamllIgwdvF\nmszMTHJzc7GzsxN15x6AK4riO5rRDRB7NZ8riuIWzbCdMGECaWlp7Nu3j4iICDw9PbGysqKoqIiM\njAwSExOZNm2atPJm5MiRHDp0iE8//ZT+/ftjaWnJ1atXOXv2LAMGDODEiRN33Ifbfffdd+Tl5eHu\n7o6dnR26urqkpKQQGxuLra0tgwYN+tvXEARBeFx4u1gze6RHkytIZTKYE+gprRxtasKBgZk1cn1D\niq4nUVVeKn1PqKqq4ttvv1U7tqioiIKCAo20sRUVFVRUVCCXy5u12lQQBEG4d8SnsCAIj4Xk5GR2\n7NjBnxFnuXgli9JaOa0sbLHq2JPWTt3IMnIjev82Is+ep6OLEzKZjO7du/PCCy/g5uYmFcJu3769\n1GZYWBj/+c9/qKqqQqFQsGDBAlJTUykrK5NW6Rw+fJjWrVtTUFDAL7/8QlhYGJGRkdJANkBlZSW7\ndu3ixIkTJCQkcOHCBerq6nB0dNQYyIiLi2PhwoVMmjSJrKwsrl27xttvv01dXR3GxsY4OjpiavrX\nII9qgN7AwEAjn78oNn93NFWYu0N3Pa3nqfJel5aWSjPlGgo8qAqz3xpQvL0dbdtr65TUVlU22n9L\nV09atbYjI+YIJYqrlCqucT7yFLLOTvTv3x9HR0cuXLhA3759adWqFRcuXCA+Pp6oqCh0dHRwc3Mj\nICCAdu3aSYW/N2/ejFKppF+/fgwbNozNmzeTkpJCaWkpgwYNYvr06Wp1sqC+9sSnn37K/v37OXbs\nGKdOnaKqqgoLCwvatm3LrFmzNGpmTZkyhS5durB7924iIyOpqKjAwsKCjh07SqvqvF2smTLIjcWH\n1F+3TEcH18ETyYw5ys2MS+QkRdCzixMTnwliwoQJvPrqq42+b3fiqaeeIjw8HDc3Nzp06HDX2hXu\nrls/XydPnvygu3NPNRVIsOrgTV7KObLij2Pm2Ak9Q2Nir+aTmlVE8A8/oFQqGTp06H3ssaBy/krT\nKTsbOq8lQSBdXV0WLVrE0aNHOXTokPRZa2Zmhp2dHVOmTFFLcens7MyyZcv49ddfiYyMpLa2FhcX\nFxYuXIixsfFdCQKNHz+e06dPc+nSJWJiYpDJZNjY2DB+/HhGjRqFiYnJ376GINwrqpTN2upmCcK9\nMsy7PXYWRgSfuETsVc37v6eTJZMHukkBoOZMONCRy7Ht3IvMuONc3PstWXFdMM04Rfrli1haWqo9\n1+Tl5fHWW2/h7OyMs7Mz1tbWlJWVERkZSUFBAUFBQWKlvCAIwgMmgkCCIDzyDhw4wLp168gpriRX\ntw2Gzj7IK0opz8skNzmS1k7dMHfsjKK1A3K3fgwd+yTkpxEaGsrixYtZv359o+3n5+ezfft2Ro8e\nzfDhw1EoFGr7q6urmT9/PoaGhvTu3Zu
0tDRqamqA+gDAwoULSU1NpUOHDnh6eqJQKCgvL2f58uVc\nvXqVqVOnalwzJSWFixcvYmZmxpIlS8jJyeHkyZPo6emxevVqaYD9999/58yZM8THxxMQECClaXlY\nLViwgPj4eLUA1cM8MNucPNm7T11guJbC3Kq0NcbGxtJs+oKCAq3tqLZrm3WvakfbdrmODLn+LWlx\n/pcLQnlburVWre2w9xxM0fUkrFx78M5HC3imlwsAq1at4sKFC3Ts2FGq8fDNN99QU1PTYOHvzZs3\nSz+rinfr6ekRHx/faHFxXV1dAgMDCQwMbPCY2/n6+uLr69voMa+MH0ofv54aD766Bka06zVC48EX\n6tPM3S4gIEDrgI22Y291+fJlAIYPH97occK9p1AomDlzJgEBAcyePftBd+eBaSqQYGLTDrtu/clO\nOMnFPeuxaN8VHV09Xn9jM/KKArp27cpzzz13n3or3KqssqbJYwxMLOg5ZXGD5zU2CUTb55lMJmPI\nkCEMGTKkWX10d3fn448/1rpP27U/+eSTBtvS9rk7YMAABgwY0Ky+CIIgCPW8XazxdrHWyF7Qw9la\nY5JAcycctPEcjExXj7yUs+SlnOVYXTZTxoxk8uTJahOq7OzseP7554mLiyM2NpabN29iamqKg4MD\nM2bMYODAgXf1tQqCIAh3TgSBBEF4pKWnp7N+/Xoq6+To9ZyAi7l6EKSqtEjtZx1dPf5Ik/HJ85NQ\nKpWEhoaSkJDQ6DWKioqYNm0aixYt0rq/pKSEbt268dZbb6FUKomOjqaiooLy8nK+//57UlNTmTFj\nBmPGjOHrr7/G2dmZcePGcfnyZbZs2UL//v012oyMjCQgIIC8vDyefPJJ2rdvz/79+1m7di27du2S\nUm2NHj2a0tJSKQjk4eFxJ2/fXbdq1SrCwsL44YcfHvqAVFOaW5i7LD+TFTsiNQpzq+oluLq60qpV\nK+zt7cnKyiIjI4O2bduqtREbGwugdRXJ5cuXKS8v15g9FxcXh1krfZSt20jb5HqGAFSX3ZS2VZYU\nkrDzKwzN/upbD+fGC4i3tPB3ZWUlQUFBD2QA/k4efO+m8vJy9u3bh6mpqUhRJDw0mhNIcPB+klat\n25CbFEF+WgzKujrsO7swY+pUnnnmGZG25QExMmjZ+97S8wRBuLv69OnD+vXrpVXewv2RlJTE9u3b\nSUxMpKSkBAsLC3x9fZk0aZLaihXVhLSdO3eybds2Dh06RE5ODhYWFvj7+zNlyhSt97/r16+zdetW\nYmJiKCwsxNjYGC8vLyZPnqyx+l31PPT9998TGRnJwYMHycjIoFOnTlJQvLq6mi1btnD48GHy8vKw\ntLRk8ODBTJw4keeee47u3btLx/70009s3bq1wdVlKSkpzJkzBz8/Pz744AOcbU2b/O7bnO8JUD9J\noE23AbTpVh+Ynz64k1RT6NZJBcbGxkycOFGqcyoIgiA8fHQedAcEQRD+jr1791JbW4uha28MzTWD\nDvrG5hRnpaG8ZSRfVfuguQWGLSwscHFxaXC/TCZj5syZUq7jgQMHUlpayoYNGzhy5Ahubm6MGTOG\ntLQ0Dh8+DNSviJgxYwZKpZJjx45ptOnu7s6cOXMA+Prrr8nPz+fJJ59ELpeTnJwM1OdYTkpKauId\nerjMnTu3yZVXD4vmFuauqaogM/aYWmHuS5cucfToUYyNjenbty8ATz75JEqlkh9//JG6W1bq3Lx5\nk5CQEKA+rdjtSktL2bRpk9o2Vfs2lub0vyVQY2xd/xCad/k8yrq/imrX1VZzM7N+tUpbS6MmHwzv\nR+Hve8XZ1pRnerkweaAbz/Ry0fpaFQoFQUFBrFq1qsXXiYyMJCQkhEWLFlFYWMi4ceNEsXLhodHc\ngIClc3c6Pf0iXhMW0GPSIl5bsJTx48ejr6+vdtwPP/zQ5Io44e5oKkh/t88TBOHuUqVvfthrqoWF\nhRE
UFERYWNiD7sodCQoKYsGCBWrbQkNDeeedd4iOjsbT05NRo0bRsWNHDhw4wJw5c8jJydFoZ8WK\nFezZs4du3boxYsQI9PX12bZtG2vWrNE4Njo6mrfeeoujR4/i5ubGqFGj8PLy4vTp08ydO1daEX67\n7777jo0bN+Lk5MSoUaPo2rUrAEqlkk8++YRNmzYhl8sJDAykd+/ehIWF8fnnn2u0M3z4cGQyGQcO\nHNB6nf3790vHNZeYcCAIgvDPIz7BBUF4pNw+yz/yXBxllTXU6LXBsIFz0o7/Rm11FWX5meSnxVFb\nXUXSvqt0MKnEs1sXvLy8Gr1mUw9xrVq1wtzcXPp5xowZxMbGEhwcTGZmJpWVlUydOpWkpCScnZ25\nevUqp0+fpra2fpA+PT1do003Nze8vLyYPn06P//8M//617/w9fUlNzeX3NxcPvroI+Lj4+natSud\nO3du/E17iMeyDjsAACAASURBVNjY2DzoLjTLnRTmNrVzIi/lHFu/z6BNUQDy2gpOnDhBXV0dr732\nGkZGRgA899xzREdHEx4ezhtvvIGvry+VlZX8+eefFBUVMWbMGOnh8Fbdu3fn4MGDJCcn4+7uTkFB\ngVr7Jo7uLNgYjlIJxtaOmNo5UZx9laT9/8W0jQvlRbmUKK5hZt8BmQx6ujY9UNhU4e8dO3ZoPU9f\nX5/169dLr/lxdvLkScLCwrCwsGDcuHE888wzavuDgoLUZnEK915wcLAUMA0LC1Mb2Jo9e7ba6sTU\n1FR++eUXLly4QHV1NZ06dWLatGm4u7urtdnY6saGUllmZWWxdetWYmNjycvLQ19fHysrK9zd3Zk2\nbZpaXbd7RQQSHl3OtqZ4tLds9j0I6ms93MsVj4LwMLjTlR47duxg69athIWFkZeXh62tLc8++yxP\nP/00APv27eOPP/4gMzMTU1NTnnrqKSZPnozsf6l1QT3F6NixY9mwYQMJCQlUV1fj6urKpEmTNOoZ\nNlQTaObMmUD95K7g4GBOnz5NXl4e48ePl+4htbW1HDhwgMOHD3Pt2jVqa2txdHTkqaeeYuTIkWp9\ne1io7r3Lli17IBkJbty4wbp167Czs+OTTz7ByspK2hcTE8P777/Pd999p5HRITMzk7Vr10r35KlT\np/Lmm29y+PBhpk+frlavc/ny5RgYGPDZZ5/Rrt1f6Z+vXr3K/PnzWb16NV999ZVG3y5fvsxXX32l\nVisW4OjRo0RGRtKtWzeWLl0qrTx6/vnnmTdvnkY7tra2+Pr6EhkZydWrV3FycpL2lZeXc+zYMayt\nrfHx8Wn2+ya+JwiCIPzziCCQIAiPhHNpuWw8fkljUCQhKgVZZRGd+zQ8+GHfI4DcS1EUZ16mOCuV\n2qpy9I3N8X0iiA/feqHJlDd6enqN7r999r+FhQXLly/ngw8+IDU1lZiYGAwNDWnTpg03btzgxo0b\nnDlzhitXrgD1K3pupwo8jR07lq5du7J7924SExNJT09HV1eXvLw8nn76afz9/YmMjGy0fypN1d5R\nPZyqZnvf+hBrY2PDpk2bSElJQSaT0a1bN1588UW1B6GgoCCNtqD+wUXVpraaQA+jOynMrW/cmna9\nRpJxLoydu/7A1kyfDh06MHHiRHr27Ckdp6ury5IlS9i5cyfHjh1jz5496Ojo4OLiwr/+9a8GU4nZ\n2dnx6quv8tNPP7Fv3z6qq6s12p890kNKXefiP5GMs6EUXU8iJykCXUMTDM2sMXfsjHVNJo5WTRfU\nbk7h70mTJmmkpJDJZDg6Ojb7vbvfVL+XdyMwM3v27H90zZmHkYeHB6WlpezatQsXFxf69Okj7XNx\ncaG0tBSoT5uybds2unTpwtChQ6Waa++9955azbWWyM/PZ+7cuZSVleHr60u/fv2oqqoiOzubI0eO\nEBgYeF+CQCKQ8Gh7fpCbFNxvikyGlJpHEB5XoaGhrFmzBj09PXr37
o21tTUZGRkcOHCAiIgIVqxY\noTHRaPny5SQlJeHr64tcLufkyZOsWbMGXV1daXW+n58fXl5ehIeHExISgoGBAWPHjtW4fnZ2NvPn\nz8fZ2Zlhw4ZJE3IWL17M22+/3ex6JzU1NSxatIji4mK8vb0xMjKSAgQ1NTUsWbKEs2fP4uDggL+/\nP/r6+sTGxv5/9s48IKp6/f+vYdgEWWRHQAVEBVnc90RFTc2l0kywzFzqlq2mldr3Wvdm3Xura5rd\nXG5dl8Q1TXFDRREDWZRdREBBEZBVYRhkGeD3B785McwAA5qKnddferb5nAHO+Xye9/O8HzZt2kRa\nWhpLly69/y/zCeP48eMoFAoWL16sIgAB+Pj4MHToUKKjo9WslefPn6/yPjY0NMTX15fdu3eTkZHB\n4MGDAThz5gxyuZy//OUvKusegO7du/P0009z6NAhsrOz1fbPnDlTTQAChCSVptZzSku1b775Ru2c\nyZMnExMTw4kTJ3j99deF7efOnaOyspKZM2eio6O90Y84TxARERH58yGKQCIiIo89J+JuNtubRVff\nELmshJoKGVIzzVZM1r0GYWhmxb07+dh5PkXXfuMA8BnZS1gMyOVyBg0apGZ3Y2pq2qz/MjQsLjw9\nPdW2W1paMn/+fG7evMmMGTNYtGgRADt27GDv3r189tlnKgKBEk39Vjw8PIQKEWUQe/369cJ+bUWg\n9hIdHU1UVBQDBw5k8uTJZGdnc/HiRdLT0/nPf/6DqakpAP7+/kRGRpKZmcn06dMFIetxt8PQhLY+\n2UoMzaxxGTNHxSdbE/r6+syePZvZs2e36fpOTk588sknze6f1L8btuZGBJ5PJ/FGCd2GTQMaRLmq\n8rtkn/yBZ0f05r3Xv2Tr1q34+/urZLEqxQxl36mAgAD8/f3VGn/fuXOHV199le7du6sJiV9++aVg\ns9a4J1DjSorY2FiOHDlCbm4uRkZGDBs2jFdffbVD/o6IPH54eXlha2vL4cOHcXFxUfsdVT5fY2Ji\n1J7rmnqutYfw8HBkMhmLFy9m+vTpKvsqKyvbFKC5X0QhoePS39lKRdxvDokE3p/qrdKPTkTkSaO9\nlR6FhYV8//33whzjueee44033mDLli0YGxvz3XffCdcKCAhg8eLFHDx4kOeeew6pVKpyreTkZJ57\n7jkWLFggbHvmmWdYvnw533//PQMHDtSqCrqkpAQnJye+/PJLDA1VPQz27t1LbGwsU6dOZfHixcL7\noq6ujg0bNnDq1ClGjhzJ0KFD2/DtPZmUyqv5NTqTiioFR85GUVGlIDk5mfT0dPVjS0upq6sjJyeH\nnj17Ctvd3NTfeUohsby8XNiWmpoKQGZmJoGBgWrn5OTkAGgUgXr16qVx/NevX0cikahVHwMaXQEA\nBg0ahK2tLWfPnmX+/PlCEuKJEyeQSqVMnDhR43ktIc4TRERERP5ciCKQiIjIY01cZlGLQRAjK0fk\nxbmU5WZgaNZ8EERXv0Hsqako+/3c/+9pnJeXh1wuf6CB6JKSEnr16oVEIiElJQWArKwsDh8+jImJ\niUbhqL00XiT+EURGRvK3v/1NxTZP2aD01KlTzJw5E2hYQBcUFJCZmcmMGTPUrJM6Eh3RJ7u/sxX9\nna3ULBMdjev5LGU3egp5q1msY8aM4X//+x8nT57kxRdfVAtYnzp1itraWiZNmtTm8f3vf/8jNjaW\nIUOG0L9/fxITEwkODiYvL09NbHqY3Lp1Syt7F4CwsDBOnDjB9evXqa6uxtbWljFjxvD8888LFYPK\nCjpoCBo1rpDz9/fn+eefx9/fHzc3NxXf9+rqaubMmUNNTQ1Lly5l7Nixwr5jx47xww8/8M4776j0\njZLJZBw4cIDIyEgKCgrQ1dWlZ8+ezJo1S+P4tb0HJUpLuxUrVrB9+3aio6ORyWTY29vz/PPPM378\n+HZ8448ed3d3NWF//PjxbNy4U
ei5dr807akDqAX8/mhEIaFj01Tcb4p3dwsCnnITf24iTySN5zLh\nJ36hTF7JypVtq/R45ZVXVOb2dnZ2eHh4kJiYyMKFC1WuZWxszJAhQ1Ss4xpjbGyMv7+/yjY3NzfG\njBlDSEgIFy5caDZhrCkLFy5Uex/U19dz5MgRunTpwqJFi1TmXzo6OixcuJCjR48yf/58Xn/9dV54\n4QV+/vlnkpKSKCsrY82aNXh5ebVrXtCUxMREwsLCSElJoaioiNraWuzs7Bg1ahQzZ85Ueb8tXLiQ\ngoICAFauXKlyncZV/1VVVRw+fJjz58+Tm5uLRCIR+uRoqoRXKBSClV9RUREWFhaMGTOG3kPHk5J9\nh8ulN7lm3bC+unw1mypZCZ+v+xEHS2PMjNTfv6DuvKBp3acU/xqvqWQyGUCz/XiU3Lt3T22b0lKu\nKXK5HBMTEzWxERocJTQhkUiYNGkS27Zt4/z584wfP56MjAyuXbvGsGHDVCwRtUWcJ4iIiIj8uRBF\nIBERkceanWHpLU5KrXsNoij9EreTwzDt6oqhmaoVRLW8FH1jMwxMrZDqG1J66yo1lXL0DI3p18OK\n6upqNm3a9MDH/f7772Nvb4++vj5nzpxh9uzZVFVVUV9fz1tvvSUsoPLy8tDR0dFoFaAtykqcpk1P\nm4oBhhXydl1/9OjRan2TJk2axP79+x9YwPRxoyP7ZPewMVGxalAuzrXNYh07dixHjx7l0qVLghUG\nNAQoTp48iYGBgYpAoS2pqals2LBByLKsra1l1apVJCYmkpaW1my25B9JW+xd1q1bx+nTp7GysmLE\niBEYGxtz9epVfv75ZxISEvj73/+OVCrF2dkZf39/du3ahY2NjUpQyMvLC0NDQ9zc3EhLS1MJWKWk\npFBTUwM0ZDY3/o4TEhIAVP4OCwoKWLFiBQUFBfTt25eBAwdSWVlJTEwMq1evZsmSJULfg7beQ2Pk\ncjkffvghurq6jBw5kpqaGn777TfWrVuHRCLROuj1R9L4WVctv9tqJZ+m7F9dXV3Mzc1Vsn/bw9Ch\nQ9m+fTsbN24kLi6O/v374+HhgZOT0yPp5SAKCR2b5sT9fj2sREsekScSTfbPV0OjkRcV83+bfmV0\neKza735zlR6N/61EGSjXtE8pCmkSgVxdXVUEJiVeXl6EhIRw/fp1rd6H+vr69OjRQ217Tk4OMpmM\nrl27smfPHo3n6unpUVlZSV5eHh988AEODg6MGTOGqqoqjIyM2jUv0MQvv/zCrVu36NOnD4MGDaKm\npoaUlBQCAwNJSkri888/F0Sq6dOnExkZSXJyMn5+fhoTwORyOStXruT69eu4uroyYcIE6urqiIuL\n46uvvuLGjRu8/PLLwvH19fX84x//ICoqCnt7e6ZOnYpCoWDb3kPc2HaSsnvVmJj+fn2pfoOg5jzt\nfXQNDHlrqjdP93NqOox2o6zw+u677zT+7Fqiufe+kZERMpmM2tpatXnX3bt3m73ehAkTCAwM5MSJ\nE4wfP54TJ04AtCs5S4k4TxARERH58yCKQCIiIo8tWQWyVn2KDc2scRo8mezoo6Qe24SZYx8MTCxQ\nVFVQUZyLVM8AtwmvoCOVYtN7CHlJYaQe20Rfn4Ec3ZdFfHw8FhYW7cqeaolJkyYRGRkJNFgKhIeH\nY29vj5+fH1lZWcTGxpKdnU16ejrLly+/LxHIy8sLiUTCtm3buHHjBoUVdfx25TY19qp2c7L8LPKz\n75BVIGvT9TUtlK2srIR7exJ5En2ytc1inTJlCkePHuX48eMqIlBcXBz5+fmMHz9eyJ7UVHVUVVXF\nli1bAHjxxRc5ffo0cXFxuLq6sn79ehYtWkT37t0pLy+nvLycuLg45s+fz7Bhw5g/fz7e3t7CZ5aU\nlHDy5EliY2PJy8ujvLwcU1NTPD09mTNnjprtBjQED44ePcqxY8e4ffs2JiYmDB8+XCXAoKSxMBY
W\nFsbVq1eprKwkLi6OV199lb/+9a/4+/sTFhbG6dOnGT58OMuWLVPJglU2RD569CjTp0/HxcUFFxcX\nQQTS1H/Lx8eHK1eukJycLHzHCQkJ6Ojo4OnpKYg+yvtJSkrCzs5OJbiydu1aCgsLWb58uUoWrVwu\nZ8WKFWzevJmhQ4cKGaUhISFa30NjMjMzmTBhAm+99ZYQ9JkxYwZvvfUWv/zyyyMVgTQFC6vK73L5\nRjF10Vn4ZhZpDFo0V/UplUrvu6LSxsaGf//73wQGBhIbG0tERATQ8Mx8/vnnVSrDHhaikNDxaSru\ni4g8iTRn/6yoqgDg0vlTxP4GLramWJuqCzJtqfRoaZ9CoZ5I0Fx1hnK7sudca5iZmWkUBpTVJrm5\nuezatUvjudXV1dTW1pKSksILL7zAvHnzVPavWLGiTfOC5njjjTewtbVVG+fPP//Mnj17CA8PF5Jk\nZsyYgVwuF0QgLy8vtett2bKF69evM3/+fME9QHk/a9asYd++fYwcORIXFxegoWI5KiqK3r1788UX\nX6Cvr09cZhH7bllw7/h/1a5vbOVARXEu5YU3MXPoxdojidiYdXpgokWfPn2IiIjg8uXLbRaBmsPF\nxYXExESuXLmi5g6hdJDQhJmZGSNHjiQ0NJQrV65w7tw5bG1tNVqMtwVxniAiIiLy50AUgURERB5b\n4rOKtDrOym0gncxtyL9ygfL8LEpvpSI1MKKTuS2Wrr9bH9h5j0Giq0dxRiy1eZe5eLGQ0aNHExAQ\nwJtvvvlAx+7v7y8E3BUKBSdOnODcuXNkZWWRlpaGubk5Xbt2ZdGiRVrbMzSHk5MT77//PgcPHuR/\ngftJyymmvh4GvKS+ICi7V83u8AwGjsnWOkuuc+fOats02SU8abTmk23Q2ZwBL60G/hifbBsbGxUr\nDW3RJMyA9lms3bp1w9PTk0uXLlFUVCQIfkobjMmTJ2sMvkNDAD4r5y7Smlry8/P54IMPuHfvHlZW\nVgwdOpSEhARWrFjB119/zerVq5HL5VhYWODq6kpmZiaffvopmzZtEqqFkpOT2bdvH97e3owYMYJO\nnTqRm5tLREQE0dHR/Otf/8LZ2VllDFu2bCEoKAgLCwsmTZqEVColKiqKtLQ0FAqFWgNef39/lQqZ\np59+GhMTE2JiYvjhhx9ITU1FJpMhlUp599131Wy+5syZw5EjRwgNDVUTUJrDx8eH3bt3k5CQoCIC\n9ezZkxEjRrBx40ZycnJwcHDg+vXryGQyRowYIZyfmZlJcnIyI0eOVLNRMTY2Zu7cuXz++edEREQw\nZcoUAA4fPtyuezAwMFCzpnFycsLDw4Pk5GQqKysfus0ZtNwrDiDvTgUrdkbx/n1mBCuDYLW1tWr7\nmgv6OTk58dFHH1FbW0tmZibx8fEcOXKEzZs3Y2hoqGLp9zARhYQ/hhUrVpCcnNym57XSNrKlnoMi\nIn8mWrJ/VlZ6+Mz+CKm+IRIJ/G3u0IdamdBcdYZyu7aW0i1VhgAMHz5czVZNSUFBAQsXLsTc3Fwt\nqac984LmsLOz07h9xowZ7Nmzh9jYWJVK6ZaQyWScPXsWNzc3FQEIGqqi5s+fT2xsLOfOnRNEoNOn\nTwMwb948Yb6yMywdqb4Rdp6juXHhkMp1rHsNoTgjlpxLJzEwscDQ1IrA8+nC74dCoeDq1av07dtX\nqzE3Zfz48ezZs4ddu3bh5uamVrleX19PcnKyRgGsOcaNG0diYiI///wzn3/+uTA3lcvl7N69u8Vz\np0yZQmhoKP/85z+prKxk9uzZD6zSWJwniIiIiDzZiCKQSIejcZPxjthzRDmBb9w4XUQzrVn6NMbY\n2gkX65YDfRKJBHvPUfzr4zfVgoI//vij2vF+fn6tBme0Cfro6uoydepUpk6d2uqxXl5eLV5T0zgB\nxo4di3kPL9J2RtG/leae9XV1GrPkHnRfpI5OR/PJbkmYyc6
+g6unnsbzNGWxTpkyheTkZIKDg5k7\ndy537twhKioKFxcXrssN+fZo8+KYrLKGKtk9zkbEsHTJa+Tm5hISEsLy5cs5c+YMO3fu5IMPPmDU\nqFE89dRTrFq1ismTJ2Nvb8+///1vDh06xKJFi4AGseTnn39WE68yMzP58MMP2bZtG59++qmw/cqV\nKwQFBWFvb88333yDiUnDQvapp5/lw48+5mZWNhaWVtwsbKhgc3V1JSIiQq1Cpm/fvsjlcmxtbUlI\nSKCwsJBevXpx6JBq4EGJnp4e2dnZzfxk1OnTpw/6+vpCxY9cLufatWvMnDlTqIRKSEjAwcGBxMRE\nAJUKKWWTYrlcrrFJcWlpKYAwpqqqKjIzMzE1NW3zPXTt2lVjs+vG1YAPWwRqKVioDITU19dRX899\nZwQrRfDCwkLs7e1V9mlqQN0YqVRKz5496dmzJ+7u7nz88cdcuHDhkYlAIg+PpKQkVq5cib+/v8Zq\nwMeVjj7HFum4tGT/3LTSo74elSD/w+DatWtqPYeg4W8dEASM9uLo6ChYtCoTVppL6nF2dlbr4dfW\neUFLVFZWcvjwYSIjI8nJyeHevXvUN/rhFBcXa31faWlpQsKYpnEpEywaj+vatWtIJBI8PDwAVWeI\nzrY91K5haGZFt6HTuRl1mCtHNmJq78otU0u6FMRQV1lGSkoKpqambNy4UetxN8bExIQVK1awZs0a\nli1bho+PD926dUMikVBYWCgkCx04cEDra44bN47z589z6dIllixZwtChQ1EoFERERODm5kZOTo5a\nX04l7u7uODs7k5mZia6urjinEBERERHRGlEEEhF5TBAX3uoYGbTvEdW1ixG5dyrUtj/pnsat9U/S\n1W9YuNZUlKktoPPy8h6ICKRcsGjKmu+IdBSf7NaqIsruVRMUcYXJ8eoVYJqyWIcPH465uTmnTp3C\n39+fU6dOUVtbi1u/4a2KYkpulevgOnAMubm/L/r9/PzYuXMnNTU1LFiwgIyMDGGfr68v69at4/r1\n68I2MzMzjdd2dnbG29ubuLg4leoeZfbo7NmzMTExURHGZF36kX0xgYJKXZZtvyAIY5oqZJTCmIeH\nB3fu3KGoqAhbW9tmLVraiq6uLh4eHiQkJFBaWkpqaip1dXX4+Pjg5OSEhYUFCQkJTJkyhYSEBCQS\niUo/IKVtTHx8PPHx8c1+jrJJcXl5OfX19ZSWlrb5HlqyToNHUw3Y0rNOqt8JiURCTUVDwOt+g4XK\njN/g4GAVIS4rK4vDhw+rHZ+RkYG9vb3a96b8OzMwMGjXOEQeX5YuXUpVVdWjHobIA+Z+krb+iISv\nJzmJrDX7Z02VHok3SsgqkNHDxuS+Kz20QS6Xs2vXLpW+iunp6YSGhmJsbMzw4cPv6/pSqZRp06ax\ne/du/u+Lf1PpMIyUXFX7ZnnhLbKv36aXj77a+W2dFzSHQqFg1apVpKWl0b17d5566inMzMyEd/6u\nXbuE/oXaoBxXenp6i4kTja385HI5JiYmwtyusTOEnqHmOYmFizeduthScCUSWX4mstvXOF5xHW+3\nbowcOVLryqXm8PHxYcOGDRw4cIDY2FguX76Mrq4uFhYW+Pj4qFRra4NEImHlypXs27ePM2fOCBXs\nSlvmyMhIjdX7SsaPH8+WLVu0svcTERERERFRIopAIh2OefPmMWvWrAfew+VhYWFhwQ8//KAxs1pE\nlX492he0Wz17EMCfytNYm/5JBqZWSPUNKb11lZpKOYk3Gs7ram7Apk2bHsg4lJUXmrLmOyqPu092\nS1URjakoyePrgzFqVRGaslh1dXWZOHEie/fuJTo6mpMnT2JoaEi6wo76eu187zt1sWN3+DUcGm1T\nPrcdHBzUFrc6OjqYm5tTVKRqAxkTE8Px48fJyMigrKxMTWAsKysTrnvt2jUAPD091YSxztZOSBpl\nVZbdq+bwb8nY6JTS08l
WpUImOTmZnJwcUlJS6NSpE5WVlbi4uLBu3Tqt7l0bfHx8iI+PJyEhgdTU\nVPT19XF3dwcaqn4uXbpETU0Nly9fplu3biqCmPL98dprr2nVY0YpSDzoe3gUtPask+rpY2TpQHnB\nTbJ+O4CBqSW3kyTM8DDBrB36y9ChQ+natSthYWEUFxfTq1cvCgsLiYqKYujQofz2228qx589e5YT\nJ07g4eGBnZ0dnTt35vbt20RHR6Onp8eMGTPaPgiRxxqlfaWIiEj7aM3+WVOlh4GpJf9am0BX47r7\nrvTQBk9PT06ePElaWhru7u7cuXOH8+fPU1dXx5IlSx7Iuu7FF1/kZEQcPwbuR6/TSTrb9kDPyBRF\nlZyqshJkedeovlfG0dibTGyS1NPWeUFzKO1zNYmNJSUl7U4kmTFjhlDlrc05MplMSPJp7AxRU9n8\nHLRTF1u6j/j9HfvKmF4arZq//PLLZq/RkguEjY0Nf/nLX7S5Bd57771WxVp9fX3mzp3L3LlzVbYr\nRTxNfS+VKBOmJk+erNV4REREREREQBSBRDogFhYWHVYAgobgqqOj46MeRoegh40JXt0sWhU3GuPd\n3UIIzD8OAfqHhTb9k3SkUmx6DyEvKYzUY5swd+rDF2UXqb9764H9Xfn4+HDgwAE2bNgg9HAxNjbW\nygrvcedx9clurQJMiaK6krzEcwSetxdEoJayWCdNmsT+/fvZuHEjxcXFDB45hvDb2glAAFI9QxJv\nlNCJ37M7lZmkzQVLpFKpishz+PBhtmzZQufOnenXrx/W1tYYGBggkUiIjIwkMzNTpYFzRUVDBWDW\nXYWaMCbRkaJroPq58qIcrtXWoK+roxLYuH79OkVFRRgYGGBlZYW+vj43b95EJpMJQmdrSCSSFqtk\nlJU9ShFIaRGn3BcaGsqxY8eorKxUqQIC6N27NwCXL1/WKthjaGhIt27d2nwPjyPaPOt6jHyOWxeD\nKcu7Ru2NZOrr6zkT5cVzo31aPbcp+vr6rFmzhh9//JH4+HjS09Pp3r07y5Ytw8TERE0EGj16NDU1\nNVy5coWMjAyqq6uxtLTkqaee4rnnnqN79+5tHoPIH0dlZSX+/v64ubnxr3/9S9heXV3NnDlzqKmp\nYenSpYwdO1bYd+zYMX744QfeeecdJkyYoNYTSFnZDQ1Z842fLV988YVa34jExER27dpFRkYGEomE\nvn37smDBAo0BwJKSEvbs2cPFixcpKSnByMiIvn37Mnv2bHr27KlybGBgILt27dL4mZqqSho/SxYu\nXCj828bGplkrWpGHx8NMImuPnWFLv2+t0Zz9c/G1eG5cOET34TOwdO2nVukRX26JpHf3B1Lp0Rq2\ntra8+eabbNu2jePHj1NTU4Orqytz5sxhwAD1HpztISn7LrftxtB9uCXF1+Mpy02nTlGN1MAIA2Nz\nbPuOJP9yOGiwOm3rvKA58vLyADRWtiQnJ2s8R+kCoGnO06tXLyQSCSkpKVqPwdXVlfj4eFJSUvD2\n9lZxhijPz9L6Ou11lHhYlJSUqK29ZDIZW7duBWi2uqyoqIiwsDCcnJxUKpRFRERERERa4/F+M4p0\nOFpbNCgXlcrFZOPGuNbW1lotgpvapl29epVly5YxbNgwVq1apXFcb7zxBrdv32b79u0qwa/Y2FgO\nHz5MJv8ulgAAIABJREFUWlqa0Lx8+PDhvPjii2pWLsqxf/fddwQGBnLhwgWKi4uZPXs2AQEB3Lt3\nj0OHDnH+/HkKCwupr6/H3Nycnj17MnPmTGFx3p6F97Jly0hLS+O///2vRqu4gwcP8tNPP7FgwQKe\ne+65Zn46HZO5o91YsbP5/iONkUjQmPH1Z0Db/kl23mOQ6OpRnBFLcUYsVxVdmf/CVAICAnjzzTfv\nexwDBgxg4cKFBAcHc+jQIRQKBTY2NgwZMoSFCxdq9FMXaT/aVIApMbHtTnFGHPu35GJX6
oe0trLF\nLFZra2sGDx5MVFRUw/97DYTL2otASi7f0l7EbUxtbS2BgYF06dKFb7/9Vm2hrPS/b4zyHrYFx1Nf\nr9okt76uFkVVBfpGpr9/hqKGKlkxcqkLYUF7gQZhbPny5Xh5efHTTz9hZGTEqVOnWL9+PevWreP9\n999Xez+Ul5eTn5+Pq6ursM3U1FStqqkxrq6uGBsbExUVRWlpKb6+vsI+5aJ+3759Kv9X4ubmRt++\nfYmIiODUqVMa/eCzsrLo0qWLUEH07LPPtvkeHke0edYZmFjgOla1aXZP7154ebm1q+ealZUVH330\nkcZ9Ta/Xu3dvIRgn8vhjaGiIm5ubMA9Uvp9SUlIEy6OEhAQVEUjZy6upOKtk2LBhQMMc19PTUyUg\nbmtrq3JsdHQ0UVFRDBw4kMmTJ5Odnc3FixdJT0/nP//5D6amvz+v8vPz+fDDDykpKcHb25vRo0dT\nVFTEb7/9RkxMDCtXrmTw4MHt/i78/f0FcX369OnCM0LsFfh48KCTyB4nezltg/VNKz3eeNqDZ4c4\nqx3XUqVHSxUaAQEBLYpeTk5OfPLJJ62Os7lqktbE1J1h6YAECxdvLFzUg/tV5XcpSG2YkzW1Om3P\nvEATynVmUlISQ4YMEbbfvn1bECeaonxOFRYWqu0zMzNjzJgxnD17lt27dzN79my1Xjd5eXno6OgI\nz8fx48cTHx/Pjh07WLNmjeAMoaiq4Hby+WbH3pT2Oko8LP773/+SmZmJu7s7ZmZmFBUVcenSJWQy\nGZMmTRLsaJWcO3eOnJwcwsLCqKmp4aWXXhL6IIqIiIiIiGiDKAKJPBa0ZRHclN69e+Pg4MDFixc1\nZjinpaVx69YtRowYobJv165dBAYGYmJiwuDBgzEzMyMrK4uDBw9y8eJFvv76a7WgqNInWSaT0b9/\nf4yMjLC1taW+vp7Vq1dz5coV+vTpw8SJE5FKpRQVFZGUlETfvn3VMjQb09rCe8qUKVy9epXg4GBe\nfvlltfODg4PR09Nrtny9I9Pf2Yr3nvFq1e5KIoH3p3o/8t4sjwptF9ASiQS7vqOw6zsKUF1AN12c\ntmSJAOrBTyXPPvsszz77rMq2goICAKZOnaq2+Pby8moxMCvSPNpURSjRN+6C05BnyI0L4dfDR7Ex\n1W81i3XChAlERUXh5uaGmbUDkNbmMd4slGPX5rMabN7kcjk+Pj5qAlBlZaVg/dYYV1dXklKuEhOX\ngKVrP5V95YXZ1DfJUjW1dya/JJfUS+Gs+epbjPXQKIxNmDCBjIwMjh07xuLFi+nfvz82NjbIZDLy\n8/NJTk5m/PjxLFmyRLi2j48PYWFh/O1vf8PV1RVdXV369u2Lp6cn0JA56+npKYhsjQPKNjY22Nvb\nC4ER5TmNWbZsGatWrWL9+vUEBQXRu3dvjI2NKSoqIisrixs3bvD1118LwZ723MPjSHszex/3jGCR\nR4ePjw9XrlwhOTlZEFESEhKEvz2l6ANQX19PUlISdnZ2zfZvHDZsGMbGxoSEhODl5dViUDkyMpK/\n/e1vKn//27ZtY//+/Zw6dYqZM2cK27///ntKSkp4+eWXmT17trB9ypQpfPzxx6xdu5affvoJQ0PD\ndn0PAQEBFBQUkJmZyYwZM8T+lI0oKChg69atxMfHU1lZSffu3QkICFAR3eRyOcHBwVy6dIlr164R\nExNDbm4ut2/fRqFQkJubK5w7ZcoUvvvuO5ydnQWLzpqaGg4dOkRoaCh5eXmkpqZSVVXFP/7xD2bN\nmqUi2gQEBAj/LisrY9euXbi7u1NYWEhlZSWWlpZYWVnh4+ODjo4OCQkJlJeX06NHD+bPn4+3tzeJ\niYmkpKRw8+ZNoqKicHd3p0ePHvzvf/8jOTm5obdaTQ2mpqY888wzrQabp06dyujRo9tlj9jeYP3j\nHuRvC21J6lHSuC8StH1eoIkhQ4Zgb2/Pr7/+SlZWF
q6urhQWFhIdHc3gwYM1Cj1eXl5IJBK2bdvG\njRs36Ny5M9Bgbwfwl7/8hdzcXHbu3MnZs2fx8PDA3NyckpISsrOzheQbpQg0evRozp8/T1RUFG+9\n9RZDhw6l/noaV+IvYmzRlSpZ699TY2eIx5URI0Zw9+5doqOjkcvl6Onp0a1bNyZOnKhRxDtx4gSX\nL1/GysqKRYsWtbkP0eNIUFAQx48fJz8/n+rqahYtWiTa5oqIiIj8gYgrYpHHgrYsgjXh5+fH9u3b\nOXfunJrtlNKSo3EwOzExkcDAQPr06cOnn36qkuWorE4KDAxU8y4uKSnBycmJL7/8UmWRnZWVxZUr\nVzRWI9XX1yOXt5w939rCe9SoUfz3v//l1KlTBAQECJZK0JCplZOTg6+vb4tiWUdmUv9u2JobEXg+\nncQb6hN/7+4WBDzl9qcVgEBcQP9Z0aoqorM5A15aLfzfZcycZn3Sm6IUWiZPnoxciyC6QWdzvGYt\n4/Kvv/edseg/he9eH42NhsV4SwKgubk5BgYGZGRkUFlZKTxzFQoFmzdvpqysTO2c8ePHs33vr9xO\nPo+ZYy/B/q1OUUNuXIja8frGXXAdO5drobvYvHEj/TwbqjgaC2PKCpk33niDQYMGcfz4cRISEpDL\n5XTu3Blra2uef/55lUoBaPDlh4Zg8sWLF6mvr8ff319F0PHx8SEqKgojIyPc3FR/Hj4+PuTl5dGz\nZ0+NmfhWVlZ8++23BAUFERERQWhoKHV1dZibm9OtWzemTp2qZj3W1nt4HBGfdSIPGh8fH3bv3k1C\nQoKKCNSzZ09GjBjBxo0bycnJwcHBgevXryOTyR5Y8G306NFqFUVKK860tN9F96KiIuLi4oS/1ca4\nu7vj6+vL2bNniYiIYNy4cQ9kbCINFBQUsHTpUuzs7Bg3bhwymYzz58/z97//nc8//1yo1Lx16xY7\nduygb9++9OvXj9TUVMzMzNi1axd6enr4+/tjYWHB+fPn+e6777C3t+f69etkZWXh6OjIX//6V5KT\nk3F0dMTX15eMjAzq6urYtm0bFRUVTJo0SW1s+fn5BAcHU1NTg46ODqWlpdTW1qKjo4OjoyPfffcd\ntra2vPbaa8K4P/30U+bPn8/3339PRUUF3t7eTJgwgZ9++onMzExsbGwwNzdHX1+furo6Nm3aRFpa\nGkuXLm3xezI1NW33OuR+7Z+fBNqS1NP0POX30J55QVMMDQ354osv2Lp1K0lJSaSkpGBra8ucOXN4\n9tlnOX9evRLHycmJ999/n4MHD3Ls2DGqq6uB30UgIyMj/vGPf3DixAnOnTtHREQE1dXVmJub07Vr\nVxYtWkT//v2F60kkEj7++GP279/P6dOnOXLkCEZ6Rli59sPWczTxu9a0eA8dxRli1KhRjBo1Suvj\nW6pw64iEhYWxefNmXFxcmD59Onp6evTp0+dRD0tERETkiUYUgUQeC7RdBDfH2LFj2bFjB2fOnFER\ngRQKBefPn8fMzIyBAwcK25VBx7ffflstuObn58fhw4cJDQ3V2MBy4cKFzWZZKvs5NEYikQgZUe1F\nX1+f8ePHc/DgQaKiolSCDydOnADQuDh8kujvbEV/ZyuyCmTEZxVRUaXAyECXfj2snqhFYHsRF9B/\nTv7Iqoh79+5x/PhxTExMGD16NHml1e36LFANUmiLRCJh2rRp7N+/nyVLljBs2DAUCgWJiYnIZDIh\nk7kx7u7u+AwbQ9bhIK4c3UiXbh4g0aH01lV09Q3RM2oYQ1NhTFFVgWHJFYyNjbG2tiYpKYmIiAi1\nCpnBgwdrbbdkZmbG8uXLWzxm2rRpzXr3L1mypNWqnE6dOjF79myVqoDWaMs9tFShp03T4z8C8Vkn\ncr80nUd4Ojqgr68vVPzI5XKuXbvGzJkzhQB/QkICDg4OwjPnQfVh0FQlbmXVIFiWl5cL25RNwPv2\n7Yuurvrz29vbm
7Nnz3L9+nVRBHrAJCUlERAQgL//7xaTvr6+rF69mgMHDgi/C46Ojmzbtg1TU1MK\nCgoIDw8HYOXKlURERFBUVMTf//534VxloPzMmTOYmJiQnJzMwIED+b//+z8OHTpEt27dmDt3LqdO\nnWLfvn24uLiojS05ORkvLy9KSkowMTEhJCSEkJAQdu7ciUwmo0+fPhgYGODr60uvXr3o378/X331\nFe+//z5yuZy+ffuiUCjYsGEDOTk56OvrY2BggKmpKQYGBsybN4+0tDQ2bdrE8ePHMTU1pVevXsyb\nNw93d3eVsTTXE2jatGl4enry4YcfsmPHDi5dusSdO3d49913hQS9vLw8qi8fI/H4OerqaunUxQ47\nz+aD4x0lyN8W2pPUo+m8tswLmqv4t7KyYtmyZRrPaW5eMHbs2BYTSXR1dZk6darWPUJ1dXWZM2cO\nc+bMEbadiLvJt0eT1L6DxvzZnSE6EjExMQCsXr26Q/d7FhEREelIiCKQyH3RdCFtWNH2fhGg/SK4\nOZSWB/Hx8WRnZwt9hKKjo5HJZMyYMUOleiY1NRVdXV21hs5KampqKC0tVbOX09fXp0ePHmrHd+vW\nDRcXF8LCwigsLGTo0KF4eHjg5uamcbHeHqZMmcKvv/7K8ePHBRGorKyMCxcu4OTkpNEu6Emkh42J\nGMxrho7SP0kbWxVQt0aRSqU4Ozszbdo0lcy5B9HYW4lMJuPAgQNERkZSUFCArq4uPXv2ZNasWSpZ\nivv372fbtm0sXryY6dOnq91jSUkJr776Ki4uLqxdu1bYXltbS3BwMGfOnOHmzZvU1tbi6OjIhAkT\n1OxWGlu/vPDCC/z8888kJSVRVlbGmjVr8PLy+kOqImJiYrh27RrR0dHcvXuXBQsWYGBgQA8bA2zN\nOpFfeq/Nn6dtz6qmvPTSS5iZmXHy5ElOnDiBkZER/fv356WXXiIwMFDjOc/MmktsnoLCtBiK0i8i\nNTDC3LEP9v3GcfXYJo3nOA2ZwninCdy7mdhhK2T+THSUZ53I40VcZhE7w9I1CohlNaYUXUmntLSU\n1NRU6urq8PHxwcnJCQsLCxISEpgyZQoJCQlIJJJm+wG1FU1JQsr5auMm68qK8i5dumi8jnK7NnNm\nkbZhY2MjVDQoGTBgANbW1iqJapoqNm1sbFi8eDESiYSgoCAKCwuFc+VyOcbGxoSGhmJgYIBEImHR\nokVIpVJCQkLQ1dVlypQpWFpasn79ekJDQzVe39PTkzNnzuDv74+1tTV+fn7s3LmT2tpa3n33XTZu\n3EhaWhq9evXC19eXTz75hJqaGsaMGYNcLqdHjx4UFxfTq1cvXn31VQ4cOIBMJgMaxMeEhATq6uow\nMzNj8ODBhIeH88knn7B+/XocHBy0+g7Ly8tZtmwZhoaGjBgxAolEgrm5OQC5ubksW7YMmUyG34iB\nJBRJqJLd4fq5vZh2Ve9R97CD/DY2Ng/Fsli0Om0d0RniyaKkpOFnKApAIiIiIg+PP8+sQeSB0txC\nWpafRX72HbIKZG26nraL4JZQNpEMCQlh/vz5gGYrOGgI9NbW1rJr164Wr3nv3j0VEcjMzEyjJ7aO\njg5r1qxh9+7dhIeHC40zO3XqhJ+fH6+88kq7PdqV2NnZMWDAAGJjY8nLy8Pe3p6QkBBqamqe+Cog\nEe3oCP2TtLVVUSgUKtYozzzzDFVVVYSHh/PPf/6T69evM2/ePODBNfYuKChgxYoVFBQU0LdvXwYO\nHEhlZSUxMTGsXr2aJUuW8PTTTwMNGY/bt2/nzJkzGkWgs2fPUldXp/LsUSgU/P3vfyc2NhYHBwd8\nfX3R19cnMTGxRbuVvLw8PvjgAxwcHBgzZgxVVVVCr5o/oioiPDyckJAQzM3NeeGFF1T6Ow1xsyHo\n4o0Wr68pU7VxkKKlYErTvlRSqVRjjylovhKlv7M11r2HYN17iNq+vs++2+xnvzD
Fjx426p8j8vjR\nEZ51Io8Xygzy5n5fKozsuJZ2mS37T2FaW4K+vr5Q6eDt7c2lS5eoqanh8uXLdOvWrcWeGn8ESoHh\n7t27GvffuXNH5ThAaL5eW1urdrwoFqnTNLHN0bjhl8XZ2VmtkT00JKClpqaqnJt+NZWkqFDu5DT0\nBDI3N1fpb1FcXIy1tTVWVlYUFRUxceJEjh07hkwmw9XVFUdHRzIyMrh58ybDhw/H1NRUmBfduKH+\n7nVxcRHWJcqEOmVA1cHBQRBplD9v5X1IJBJmzpzJ9u3bsbS0xNLSkq5du2JoaIhUKhWSYH799Vcm\nTZqEnp4ednZ2fPTRR5w4cYLvv/+ew4cP88Ybb2j33WZlMXbsWN59912VpDyAH374AZlMJiTVxGUW\nEXg+nbDfLnD93G6VY5/kIL9odaodojNEx0dZNaikcUV8UFBQq9WDOTk5nD59mvj4eAoKCqioqKBL\nly4MGDCAOXPmCInESpKSkli5ciX+/v4MGzaMHTt2cOXKFWpqapqtbISGGFRwcDBnz57lxo0bKBQK\nLC0t8fT0ZNasWXTt2lU4ti1JfiIiIiKPElEEEmkzrS2ky+5Vs/u3NAaOyebpfk4q+5RZb38Ew4cP\nx8jIiLNnzzJv3jxkMhmXLl3C2dkZZ2dnlWONjIyor69vVQRqSksv8M6dO7No0SIWLVpEXl4eycnJ\nHD9+nCNHjiCXy1v10taGyZMnc+nSJU6ePMkrr7xCcHAw+vr6ovWHiMDjniWnra3KwYMHVaxRlEGD\ngIAAli5dyr59+xg8eLAwaW+psbdEIuGbb77Bz88PLy+vZht7r127lsLCQpYvX87o0aOF7XK5nBUr\nVvDVV1+xdu1a5s2bR0BAAP369SMuLo4bN26oeawrs3h9fX2FbXv37iU2NpapU6eyePFiIRhTV1fH\nhg0bOHXqFCNHjmTo0KEq10pJScHU1JS0tDQ++ugjtZ5hD7oqoiWbr6kDu7cqAmniYQYpRLuwPweP\n+7NO5PEhLrOoVcHQxM6Z3Hr48eBpvLpU06dPH8Hi18fHh9DQUI4dO0ZlZaVWVUCNn+8PAqUV2OXL\nl6mtrVULpCtt6lxdf6+cUM63i4rUe41kZGRo/JyWhKMnleYS26rK75KdfYfe/TSfJ5VKuSuvYtm2\nCyTdLOHuzStknt+HRKqLkUVXavVMcO3jhf/s50hKSiI5OVlITJFKpdTX1+Pn50dQUBBFRUXC3OXM\nmTPA7wlsyiovTf1FG6+plAl1yt8NIyMjjQl1tbW11NfXC/17qqqqgIaKnF27dpGRkUFBQQE6OjqY\nmpqSlJQENCTGQUPSnbK6SFt0dXVZuHCh2u9tUVER8fHx2NraCjZhQpB/kicfLE8nM+Mq0wd3Z94L\no5/o97Q4d2kbojNEx0VpFxkSEkJBQYHKmlBJS9WDFy5c4Pjx43h5eeHu7o6uri43b97k5MmTREdH\ns3btWiwtLdWumZGRwS+//EKfPn2YOHEihYWFzVY2KhQKPvvsM+Lj47GyssLX1xcjIyPy8/OJjIyk\nb9++ggjU3iQ/ERERkUeBKAKJtInWFtK6+g0Z+NXyMtYeScTGrJMQgMnLy/tDRSB9fX1GjRrFyZMn\nBVu42tpajV7Hffr0ISYmhps3b9KtW7cHPhZ7e3vs7e3x9fVl7ty5REZGtnqONgvvIUOGYG1tzalT\np/D29iYnJ4dx48bdd88hkSeLxzlLTltblVOnTqlYoygxMzNjzpw5rF+/npMnT6qIQM019tbR0eHC\nhQvk5+fj5eWlsbF3ZmYmycnJjBw5UkUAgoYgy9y5c/nwww8FixRoCNDExcUREhLCggULhO3p6elk\nZ2czfPhwoZKwvr6eI0eO0KVLFxYtWqSSVayjo8PChQs5ffo0oaGhaiKQubk5/fr102gFAw+3KqKj\nBClEu7A/B4/zs07k8WFnWHqrzwKjLvbo6ht
Smn2VSzk1zJr2e4W1Mjlh3759Kv9vCWWAvbCwsJ2j\nVsXKyop+/foRHx/P4cOHee6554R9V69e5dy5c3Tu3Jnhw4cL23v16gXA6dOnGTt2rPAuLSoqajYJ\nSvnOKiwsxN7e/oGM/XFGm8S2I5duMiFePbHten4ZqTl36PT/34d5iaFIpLr0mbwYiVSPywU3uFbd\nBUuPp3C8c4fk5GS167u7u+Po6EhiYiIFBQUoFArOnTuHqamp0MtUWeVlZGSkMgdpK8rn5J0KBfeq\nFWTczAdAT08PaEimW7lyJStXriQ8PBwDAwNeeOEFFi9erHIdXV1dzM3N21RNZmtrq7F6TtnrysPD\nQ63aqoeNCTOffopdJbcY2cf+T/FMF+cuIn8GvLy88PLyIikpiYKCAgICAtSOaal6cOzYscyYMUN4\ndimJi4tj9erV7NmzhzfffFPtmjExMbz33nsqsaHmKhsDAwOJj49nyJAhfPzxxyqfVVNTQ0VFhfD/\n9ib5iYiIiDwKRBFIpE20tpA2MLVCqm9I6a2rVN+TE3g+nf7OVlRXV7Npk+ZeDA+S8ePHc/LkSc6c\nOUN2djZSqZQxY8aoHTdjxgxiYmL47rvvWLFihZoXbWVlJTdu3KB3795afW5+fj719fXY2dmpbC8v\nL0ehUGglfGmz8JZIJEyaNIkdO3awbt06oKE6SEREE48yS+5+bFXu3btHXl4elpaWODo6qh2rDMAp\ngweAkLWtqbH3oEGDiIuLE7JoNTX2Vn62XC7X2GumtLQUY2NjJk+eLGSrDh8+HGNjY86dO8f8+fOF\n+2qaxQuQk5ODTCaja9eu7NmzR+N3pq+vT3Z2ttp2Z2dntQVQUx5mVURHCFKIdmF/LsSM4I5PY7sW\nTQGhhQsXAr9bRoaEhPDtt9/y3nvvYWpqyt69e8nMzERXVxcfHx9eeeUVunbtSlaBTCvRWqKjQ2eb\n7ty9dZUawNLx916VNjY22Nvbk5eXh46OjlY9GB0cHLC0tCQsLAypVIqNjQ0SiYSxY8eqVXNqy5Il\nS/jwww/56aefiI2Nxc3NjaKiIn777Td0dHR47733BDtUgN69e+Pp6UlycjJLly7Fx8eHu3fvEh0d\nTf/+/TX2xfTx8eHAgQNs2LCBESNG0KlTJ4yNjbVu5t6R0KZCDIB61BLb4jKLiMkoUDm3SlaCoZk1\nhmbWVJXfFc79d1ACZlcvNXv5p59+mtOnT5ORkcHx48cpKytj2rRpQk9R5ZylR48e5Ofnt/k+Syuq\n2RmWzrZrYQCUVEL5vRrW7j5FffEd3Lz1MDY25urVq5SXl6vMrZpbv0il0jZVuTXXy0o5L1Nm+Gt7\n3pOKOHcREWmguepBQGOVD0D//v3p3r07sbGxGve7u7urJQdrqmysq6vj2LFj6Ovrs2TJEjWxSU9P\nTxC17yfJT0RERORRIIpAIlqjzUJaRyrFpvcQ8pLCSD22idtJfTDJjSD7WioWFhZ/eOM/d3d37O3t\nCQ8PR6FQMGTIEI2ZZ8oAwfbt23nttdcYNGgQtra2VFZWUlBQQHJyMh4eHnz22WdafW5mZiZffPEF\nbm5uQhPh0tJSoqKiUCgUzJo1q9VraLvwnjhxIrt27aK4uJgePXrQp08f7b4cEZGHwP3YqtT//xWv\nMijQ3PNCUwNsXV1dPDw8SEhIUGvs7eHhgYODAykpKcyYMUNjY29ldm18fDzx8fHNjtHQ0FDI8FZW\nHwYHBxMXF8fAgQOFLF4zMzMhi7fx9ZV2K82htFvRdL+t8bCqIjpKkEK0CxMR+Z2CggIWLlyIn58f\nL774Ilu3biUpKYmamhr69OnDokWL6N69O6WlpezYsYPo6GjKy8vp0aMH8+fPVxHNS0pKOHnypNCj\nsLy8HFNTUzw9PZkzZw5OTr9XTNy6dYs33ngDLy8vvvjiC41je+utt0hJSRHs19pCREQEly5dYvjw\n4UKlZ0R
EBElJSXz11VfE51Rrfa3Ods7cvXW1IZlJR3Xu6OPjQ15eHj179tQqsUdHR4dVq1axdetW\nwsPDuXfvHvX19Xh4eLRbBLKzs2Pt2rXs2bOHixcvkpycTKdOnRgwYAAvvvgibm7qgvsnn3zCTz/9\nRFRUFEFBQXTt2pX58+czYMAAjSLQgAEDWLhwIcHBwRw6dAiFQoGNjc0TKQJpUyGmpL4eIbFNeW5T\n9I3NqJKVUFMha3RePXkJoWRcS8PDUfO7fNy4cVhbW1NYWMj69esxMDBg/PjxAJSVlbF7d0NfHF9f\nX6Kiotpyi/x2JY/UnDvYWZRj//+djvSMzZBIpdzNvkKdooZjcdnMGDWClAunePPNNzVWG5WUlCCX\ny1X+th8E2va6+jMhzl1EnkQ0rU1aornqQWh4roaGhhISEkJmZibl5eUqorRSQG+KpnekpsrGW7du\nIZfL6d27d6uxq/tJ8hMRERF5FIgikIjWxGepe4prws57DBJdPYozYinOiOVcXT4vzXyGgIAAjaW5\nDxo/Pz9+/vln4d/NMWvWLDw8PAgKCiIlJYWoqCiMjIywtLTk6aefVunl0Ro9e/Zk1qxZJCcnc+nS\nJcrLyzEzM6Nnz55MmzZNJRjcHNouvM3NzRk0aBCRkZFMmjSpmauJiDx87sdWpTHKoEBqaioff/wx\nmZmZKBQKwWJRaXejPE6ZIT5u3DgOHz5MQEAAOTk5QmPvwMBAYmJiKCwsZNmyZSqNvUNDQzl48KBg\nD/n000/z1Vdf8dVXX5GcnExQUJAwLmWmuo2NjZCp7ufnx7fffsvLL79MUlIS//jHPwgPD8fCwoLF\nixfj6+vLSy+9hJGREfC73UpkZCTh4eGkpaVRXFwMgKOjI35+ftTX16v0H2trM9GHURXRUYIUol1q\n3bkMAAAgAElEQVSYiIgq+fn5fPDBBzg5OeHn50dBQQEXLlxgxYoVfP3116xevRojIyOeeuopZDIZ\n58+f59NPP2XTpk1YW1sDkJyczL59+/D29haSVnJzc4mIiCA6Opp//etfQi9GR0dHvL29SUxMJCcn\nR8VzH+DKlSvcuHFDEHDaSnR0NH/9618FG1CAw4cPs2XLFv7zn//Qd9J8ra9l02coNn0aMnUra1Sr\nHJYsWcKSJUs0nvfll19q3O7m5saaNWs07vPz82txjtr43dMYS0vLNs2ljY2Nefvtt3n77be1/oxn\nn32WZ599VuvP6IhoWyHWmMQbJWQVNAgkms61cR/OzagjpB7bRGfbHlTeLaAwNRI9Y1NM7JypqCrW\neF0rKysmTpzI3r17SUpKwsbGhnPnznHy5El+++03SktLmTlzptbuBEry71YQpkHo0tGRYmzlSH19\nPfLCbEr0O3HUwAj9/CJiY2OxsLCgoqKCTp06ceLECRITE0lJSWHevHkPXARS9rpKSUmhrq5OrVJc\n2Y/oz4Y4dxF5UmguORDgblIOBvc0J2q0lAD3448/cujQISwsLBgwYACWlpZCEomyz5AmtK1sVCYj\nNldx1Jj7SfITEREReRSIIpCI1lRUKbQ6TiKRYNd3FHZ9RwHwyphegh2Q0sZDSXsWwS01LQd48cUX\n1XqONIeHhwceHh5aHdt07I2xsrJi3rx5Wl3Hxsbmvhbe9fX1ZGZmYmBgwNixY7X6TBGRP5r7sVVp\nSqdOnZDJZKSmpmJoaMj48eMxNDTk0qVLbN++nV9++YW6ujqVBtgKhYLTp09z584devXqhYWFBXZ2\ndsKiwMzMDLlcrtLY+5dffmHr1q107tyZCRMmcOjQIVJTU1m+fLnWvcvc3d0xMTHh9u3bfPHFF/zy\nyy907tyZF154gRs3bvDLL79w9+5d3n77bcFuRaFQsHXrVnR0dOjduzeWlpbI5XISExPZvHkz6enp\nHaKBaEcKUoh2YSIiDSQnJ/Pyyy8ze/ZsYdvu3bvZuXMnH3zwAaNGjeLNN
98UxOf+/fvz73//m0OH\nDrFo0SKgoSrm559/VrEeg4aq6A8//JBt27bx6aefCtunTJlCYmIiwcHBKv3TAIKDgwEYNWpUu0Qg\nb29vFQEIYOrUqRw5coTExETcntJcYdAaRgbiEulJRtvEtracZ+U2EImOlMLUKO7cuEzNPRmGZtb0\nfnohd7OvUHY7r9lzJ0yYQEJCArdv38bGxoYjR46go6ODs7Mzr732GqNHj242sNkcl7PvIO2heZ+B\niQVdB0wk+ZdvkBfdojgjFtdeXqxd9j4bNmwgPz+f8vJyMjMzsbOz46WXXtJor32/NO51deTIEaZP\nny7si4qK0thH6c+EOHcR6ci0lhxYWHaP8oI7BLeSHNiY0tJSDh8+TPfu3fnqq6/U5iFhYWH3O2xh\nDahM0muJpkl+IiIiIo874gpHRGvauyAWF9IPlvDwcPLz85k8ebIw8RARedTcj61KU1JTU5HJZOjp\n6eHu7s5f/vIXdHR0eOWVV/jrX//Kjh07sLa2ZsKECcI5JSUleHt7M3ToUAwMDDA2NlYRVE1MTKiq\nqhIaezs4OLB582ZMTU1Zt24dVlZWVFVVcfnyZXR0dMjMzFQbV05ODjU1NWrbu3fvTm5uLpGRkTg6\nOuLs7MzKlSuprKzknXfe4cyZM7zyyitMmzaN3bt3s3nzZlasWEH37t1VrlNcXMz69es5e/Yszzzz\nTIfxwheDFCIiHQcbGxs1i1o/Pz927txJTU0NCxYsUKk+9PX1Zd26dSoCTXMWLc7Oznh7exMXF4dC\noRAsWYYNG4aFhQWnT5/m5ZdfFvz1UzJvs+fQCTqZmJFT3VnrZKPGeHl5qW3T0dHBw8ODvLw8zOrK\n2nxNoFWrGpGOjTa/awadzRnw0upmz3ObMF/tHEvXfli6qvvedupiyyvvvIaXV0NSXNPqsbFjx7aa\n2NU0iazxv5smx333YyCvb2oIhja9h77Pvvv7Nd2HU1Gcg2lXN8p0zLldUo6dnZ3wN95cj64HyRtv\nvMGyZcvYsmULcXFxODs7k5eXx4ULFxgyZAjR0dF/6OeLiIg8eLRNDqxvkhwYFBREUlISaWlpJCcn\ns2jRImbMmCEcf/v2berr6+nfv7+aAFRUVMTt27fve+yOjo4YGxuTmZlJSUlJi5ZwymOVSX66uroE\nBgaya9cuvvjiC41zFBEREZFHiRidF9Ga9i6IxYX0g2H//v3IZDKCg4MxNDTkhRdeeNRDEhEB7s9W\nRSkelMqruX23gsDz6Zw9tBszC2tcXFxITEzk7bffZtCgQVRVVXHlyhWqq6sxNjZWq+JbtGgRO3fu\nFDzzG/f8MTAwwNTUlNLSUnR0dCgsLKS2tpZp06ZhZdXwjFq2bBmrVq0iMzOTjIwMjI2N2bp1K0VF\nRWRlZZGUlKRmVQINIlBkZCQVFRV07txZqG40NDTE19eX3bt3k5GRwYsvvkhmZibHjx8nOjoab29v\nLC0tKS0tJTc3l5SUFCZOnAhAXFwc48aNa9N3KiIiIqKkaYWeo3FDJMbFxUXtOaYMcDg4OKgFVXR0\ndDA3N6eoSLUCIiYmhuPHj5ORkUFZWRm1tbUq+8vKyoTrSqVSJk6cyO7du4mIiMC0W192hqUTcvI4\nt3JLcOg/gL0XrpN+o5idYem4Dy/S2kqytYbypgbg1c2iTe8o7+4WorD9hPMoEtseZlJca5VOiupK\ndHSk9Bj5HLcuBlOWdw1FVhLfJh3Ftospfn5+JCQkPJSxdu3alW+++YatW7eSkJBAUlISPXr0YNWq\nVZSVlYkikIhIB6Q9yYGy7BQ2b96MRCLBzc0Nf39/td7Hyp56TS0kKysr2bBhg9pcpD3o6OjwzDPP\nsHfvXr7//ns+/vhjIXkFGtwn5HI5ZmZmSKVSlSQ/ZcV0Y/6onmoiIiIi7UEUgUS0poeNibiQfoRs\n27YNXV1dnJycWLBggeDNLyLyqLkfW
5U78ip2hqVzIv4msgIZ20LTSA2Po6KkhOcmvY6rQTE3r8QJ\n1ig9e/aksrISqVSKXC4XSvb19fXp0aMHPj4+Qn+vpg1Ae/XqRVJSEj179iQ3NxdARUiysrLi22+/\nJSgoiM8++4zi4mKCgoIwNzcnLS2Nu3fv0q1bN7X7MDIywsTEBENDQ6RSqYplivLvtLy8HF1dXVat\nWkVoaChHjx5l7969FBQUUFdXh66uLmZmZgQFBaGvr6+VBYHI732aHka28qMmJCSEb7/9lvfee69F\nG1WRPzfN+e9Xld8lO/sOvfupR2WkUilAs9XFUqlUJbCi7LnTuXNn+vXrh7W1NQYGBkgkEiIjI4U+\nbo2ZNGkSe/fu5T/b9lLqMoX6eijOiEVHKsXCtR+VpYUA3CwsY8XOKN6f6q1iD9P4ed+Y1hrKGxsb\nM3d0T1bsjNIqICWRIFgYizy5PIrEtoeZFNdapVNF0S0yf/sFUzsXDM0s0TM2paLoFuY65fTp1Z3l\ny5c3W/EHmi2yAwICNL6Hm7PAboy9vT0rVqzQuE9832lmxYoVar0r7wexekHkQdHe5ED9zAbhuXfv\n3nh5eWl8nnTp0oXRo0cTFhbGO++8Q//+/ZHL5cTHx6Ovr4+Li0u7rGWb4u/vz9WrV4mOjub1119n\n8ODBGBkZUVhYSFxcHAsWLBCeTU2T/BQKBdnZ2ezcuRPgD+upJiIiItIeRBFIpE38P/bOPC7Kcv3/\n72GGfUcWEUVBEJHNBUExl0zNzDUrwUo9Wcdv1q+yk56TWZ1OZtmqtmiZvY6WWy4pbigCKi6AG7sI\nyCL7Isgy7DC/PzjzyDjDomKCPu9/sueZee57Fua57+vzua7rhTHO4kb6AdFZi3wRkc7mbsuqxGTc\nYMOxRBQK1bIqjfW1AKSVwXUdW5a8+qRKMPDdd9/l6tWrKkFBU1NTJBIJ06ZNY9q0aRrnEBAQwKpV\nqwBYsWIFoO4i19fX5/nnnycqKoqrV6+yZ88eoHmzffbsWY2ZQAADBw7U+DeqDK4qG45KJBJ8fHzY\nunUrVlZWjBo1CicnJ4yMjARhKzAwkPr6epXSL2vWrNE4LjRnDW3bto2srCzkcjm+vr7C61Mybdo0\n3N3dW21g3pUpLCxk4cKFPPHEE232gxMRud/cyXcxMjKSwMBAsrKyqKiowMTEhF69ejF69GimTJkC\nQGpqKqGhocTFxVFcXExtbS2Wlpb4+voyZ84cjIyMVK7ZUgjs0aMH27dvJy0tDR0dHYYPH04/n8ls\nCL2G/EY+eTFhyIuuo1A0YWTTD2tXP8qr6zh48ToTW9Tfr6ioYO/evUL5lbS0NJycnHj22WcZMmSI\n2utqbGxk27ZtmJubs2bNGrUyKUlJSRrfjx49etDH2Z0/DgbjauFDQ1011TcLMe/rhraeIQ3VlQDU\nV5WrlYfJy8trVQSKi4vD399f5VhTUxOJiYlAc+aTtbUlbz/t0W5pGokElkz17HAWkkj35V6NbV3d\nFNde1pGuSQ9MezkjL86iLDcFFE1oG5gwYuIEVv1zcZsCkIiIiEhb3K058GpGDoBK1o0m3nzzTXr2\n7El4eDiHDh3C1NQUHx8fXnzxRWGfd6/IZDI+/vhjjhw5QmhoKKGhoSgUCiwsLBg5cqSKibClye/4\n8eOEhYWRn59PYmIiAwcOvG891URERETuBlEEErkjhjg8WhvphQsXAqqON9GNLSKiyt2WODl3tQBN\nPyNSbV0AGmoqkWpbqAQDoTmtHlAJCLbsYdGhOf/P8d5ado/SRX4/OHbsGAUFBRqzV5KSkggMDOzw\ntQoLC1m5ciWGhoZMmDABAwMDevfu3dlTFhERuQOCgoL44YcfMDc3x8fHBxMTE27evElGRgbHjx8X\nRKCjR49y7tw5PDw8GDx4MAqFgtTUVPbt28fFixf5+uuv1cqzQbPAdP78eYYPH85TTz3FlStX2Hvg\nC
Nl/nqXX4CdIOb4FI+u+9HAaQnVpIWXZyVSXFqBQKKCFwGJn2MR7771HYWEhMpmM/v37M3r0aM6f\nP89HH33E66+/zpNPPqkydnl5OXK5HC8vLzUBqKamhmvXrrX6vpSbDEChCKY49SKNtTUAWDoPA0DX\nxBKpjh5l2Vepr5GjrWfItvAU3OxM+Omnn1q9ZmxsrPBeKDl48CB5eXl4enoKpWMmD7HHxsyAbeEp\nxGaqB+89+1owd7Rzt1+3inScezG2dXVTXHtZR7pG5vR77Bm148sWjcHMTKzgICIicvd0tL+f0gCY\nF3uCvNiT9LE0ws6ieW8XHx/PtGnTOHDggGBkW7ZsGb/99hsXL16ktLSUt956S4jF1NbWEhgYiFwu\nR1dXl+eee46+ffsyffp0xowZo2LUa1lFYMSIEdjb23PlyhVmz57NgAEDmDdvHq6urkilUqZOncrU\nqVOBZoPJ0aNHCQsL4+2336ahoYEePXrg7u7Os88+K/R2U2bVffrpp5SXl7Nnzx527tyJjo4OQ4YM\nYeHChfTo0aMT33ERERGRjiOKQCJ3jLiRFhERacndljhpLXaib9GTqpI8Kgsy0TW2EGpFKx3hxcXF\n2NjYaHSFdxRHR0fOnTtHYmIinp6eKucKCwvV+l90JspSdH5+fmrn4uPj7+ha0dHR1NXV8eabbzJ2\n7NhOmZ+IiMi9ERQUhEwm47vvvlNz1JeXlwv/fu6553jttdfUMgyDg4NZt24dhw4d4tlnn1W7fmRk\nJJ9++inu7u4AKBQK/Ga8TMXVBK6FbcPedyoWDrd+1zIjAim6ep7Gupr/Pb75N1V2JZCioiKWLl3K\nl19+ibu7O2+88QZyuZz33nuPn3/+GV9fX5WxzczM0NXVJTU1lZqaGvT09IDmGvk///yzyutrSUZh\nBfn0QM+kByVpMTQ1NqBn0gPjng4AaEmlWLv4kBd3iqTDP2HWZyDXI5vIOvYTfe1sWm3M7OPjw6ef\nfsrIkSOxtbUlLS2NixcvYmxszGuvvaby2CEOlgxxsFTrlzS4n6VYuvgR5F6MbV3dFCeW8O7etJVJ\n6u3tLZgUAZXs95YZ37GxsZw6dYrExESKi4tpbGykZ8+ePPbYY8yePRsdHR3heQsXLqSwsBCA5cuX\nq8ylZfBcGWgPDw8nNzcXiUSiEmgXEYE7Nwca2fTD1hMsajNBUU1AQIDaYyorK3n33XfR09PDz88P\niUQiVHOQy+UsX76ctLQ0+vfvz8SJE2lqauLy5ct8+eWXZGZm8tJLL6ldMzU1lT179jBw4EAmTZpE\nUVERZ86cYcWKFaxbtw47OzvhsQ0NDXz88cdER0djaWnJ2LFjMTAwoKCggIiICNzc3OjVq5fK9Q8f\nPkxkZCS+vr64u7uTnJxMeHg46enprFu3rt2MJxEREZH7gSgCidwV4kZa5EEgloXqmtxNsKEtevQf\nwo3Uy+THn8Kk9wC09QyJzSwhLb+MbZs2oVAomDRp0j2NMXbsWL777js+//xzDh48SHV1NVKplL59\n+1JeXi6Ub2sLpZNMIpFQWVnJRx99RFJSEhKJBC8vL1599VWg2R2/c+dOfv31V2pqalAoFFRVVQnN\nj5WkpaXx+++/k5GRwY4dOzh58iQGBga4ubkhl8vVxm9oaCA4OJj4+Hg+//xz1q1bh5mZGQ4ODkyd\nOpXBgwcLmYtwy1WnpDv00VG66aA5CzMkJEQ49/bbbwsuf2h+/3777TeuXLlCfX29iptP0zVXrVpF\nSUkJgYGBXL9+HRMTE5Wsz9OnT3Pw4EGhv4mtrS1jx45l5syZahu3tsrtrVmzhpCQEDZt2qQyX4VC\nwYEDBwgKCiI/Px9jY2NGjhzJSy+9xJtvvglo7rsAzcGd7du3k5qaikQiwc3NjZdfflmsN96FkEql\nQjnIlpiYmAj/bvl9aMmECRP45ZdfuHz5skYRaOzYsYIABJBZVEl
9D2cgAT1TaxUBCMDCwZOiq+dp\n+l+pTYCI6ER04qOZOH4sY8aM4csvvxTOGRoa8sILL7By5UrOnj2rci1l2c3du3fz+uuvM2LECBoa\nGoiNjaWiogJPT09iY2PV5hydUYxEIsHS2Zvsi0eBW1lASnp6jkMi0+ZG6iVupF5CpmeE1ZCJ/GfF\nWyxevFjje+Xn58fkyZPZuXMn58+fRyaT4efnx7x581QCOC3pZ20srlVFgHsztnV1U1xXz1YS0Ux7\nmaRjx44lICCAkJAQCgsLVQLmNjY2wr/37NlDdnY2AwcOxNvbm/r6ehITE9m2bRtxcXGsXLlSMCBM\nnz6diIgI4uPjeeKJJzTem+420C7y6HGn5kBjm34Y2/Sjf1EoWenJGvcmGRkZPP7447z11ltqa6uN\nGzeSlpbGggULmD17tnC8rq6OTz/9lF27djFq1CgcHR1Vnnf+/Hm1yi7Kv7/AwEAVI8m2bduIjo7G\nx8eHf/3rXyr7gPr6eqqqqtTmfPHiRb755huVvd6XX37JqVOniIyM5LHHHuv4m/QAqampISAgAGdn\nZ7744gvheF1dHf7+/tTX1/POO+/w+OOPC+cOHz7M+vXrefPNN5k4cSLQbILcsWMHMTExlJeXY2Ji\ngpeXF/7+/moCWsu9WmlpKXv37iUrKwsjIyNGjx7N/Pnz0dbWFvZD165dQ0tLCx8fH1599VWMjdXX\neMXFxezevZsLFy5w48YN9PX1cXV1xd/fX62PcMvxldlcmZmZYjaXyEOBKAKJ3BPiRlqkK9BakFXk\nr+NOgg3tYWTVBxu3URQknCHp4HrM7AehJdPmjf+3E2lNKYMGDeKZZ9TLmNwJtra2NDQ0UF5eTmJi\nIu7u7igUCg4dOkRlZSUDBw4UHO7tUVpaSmpqKmPGjOHJJ58kIyODs2fPkpmZybhx40hMTMTCwoJJ\nkyZRWFjIqVOnuHbtGhs2bCAuLo5evXqRm5tLeHg4hYWFFBYWYm9vT11dHenp6YSFhVFbW4uVlRX1\n9fVAswA1Z84cYRGbk5ODlpYW9fX1lJaW0qtXLwYPHoyDgwMBAQFs374da2trlY1Od2j86+HhIfRJ\ncnBwYMSIEcI5BwcHQRy7Ezefkj///FPY0Hl6eqoIbVu2bGHXrl2YmJgwduxY9PT0uHjxIlu2bOHS\npUt88sknyGT3toTasGEDhw8fxsLCgsmTJyOTyYiMjCQ5OZmGhoZWrx8VFUVkZCTDhg3jqaeeIisr\niwsXLpCSksKPP/6oIjKI/DXU1tbyy287ORIcSmFBPuUlRVSVlzB79myee+453N3dcXV1xdTUFIVC\nQWhoKEFBQWRnZ5OZmUlFRQUKhQJTU1OVjJeMjAy+/PJLkpKSKCkpoby8nLS0NPr27avyHYnOKEZb\nv3ktZtDDVm1+OgbN34mmxlslWuRF2dRU1yGXy9m2bRs5OTlIJBK2bdsGQFlZGQBZWVlq13vxxRcx\nNTXl2LFjBAUFYWBgwJAhQ3jxxReF59+OsjyMhaMXOZeOIZHKsHD0UnmMRCKhp9tj9HS7FRgZOW4A\nurq6rQqiAMOHD1cpBycicifci7GtK5viunq2kohm2sskNTQ0ZO7cucTFxVFYWNiqmee1117DxsZG\nrVzy77//zs6dOzlz5gyjR48GYMaMGcjlckEE0rQ+vNtAu8ijx91mIsqqdFBfcUBMTAwSiYTff/9d\nTQCqqKggLCwMZ2dnle8lgI6ODgsWLODSpUucPHlS7bvp6uqqVtp/woQJbNiwgeTkZOFYU1MThw8f\nRkdHh9dff13NCKatra2xj9q0adNUBCCAJ598klOnTpGcnNxtRCA9PT2cnZ1JTk6murpaKFOcmJgo\n7EtjYmJURKCYmBgAvLya13kpKSmsWLGC6upqfHx8sLe3Jzs7mxMnThAZGcnKlSvVhBhoLu974cIF\nRowYgYeHB5cvX2b//v1UVlb
i6+vLF198wfDhw5k8eTJXrlwhLCyM8vJy/v3vf6tc59q1a3zwwQdU\nVlYydOhQ/Pz8KC8vJyIigmXLlvH+++/j7e2tNr6YzSXyMCKKQCIPDcnJyfz5558kJiZSXl6OsbEx\nffv25cknn1S5yd6Ju/tOuROHATT3NtmyZQsXLlyguroaOzs7ZsyYgbW1tVCr9vbFvbKJc0REhFDH\nv60mzg8TFhYWrF+/XujnItJ1uJNgw8gBNpy9WtDm9eyGTEDfvCfFV6MoSY9B0dSErYsDC156iZkz\nZ95zAB5g3759JCUlsW/fPrKystDX12fBggXk5+ezb98+YeHaHnl5efTr14+PP/5YOLZu3TqCg4P5\n+eef6dmzJ3//+9+FjYaDgwO//PILZmZmJCYmcunSJXr37o2FhQWNjY3cuHGDqqoqzMzMePrppykp\nKeHbb78lIyODTz/9lHXr1mFkZIS+vj729vbY2NgwYcIEQQCdMGGCsEB3dHTE0dFREIG6eubP7Xh4\neGBjY0NgYCCOjo5q84+LiwPuzM2nJDY2lq+++kptU5iUlMSuXbuwtLTkm2++wdzcHID58+fz6aef\ncv78efbu3cvzzz9/168rISGBw4cPY2dnx9dffy2UNpw3bx4rVqygpKSkVUE7IiKC//znPyrfz82b\nN7N7926Cg4PVNsEi95ezCZm8sWQpWZkZGFjYYmjlhJZxf6qTo4i8HE9mdh7Ojn2RSCS4u7tjZmZG\neHg4NjY2VFVVUV1djZGREVKpFFtbW2bNmgXA9u3bOXPmDDU1Nfj6+mJjY8OlS5fIysoiOjpaRQSq\nqm1A8j9Ht7KnmgoSLbSkMnoPe5K+fjMAaKyrhiYF0dHRREdHY2dnh0KhEDLvlFRXV6sJMFKplJkz\nZzJz5ky1od5++22NmbrK8jDVN5t7E5n3cUWm2/69/G57zomI3Cn3Ymzrqqa4rp6tJIKagFheVdeh\nTNL26Nmzp8bjM2bMYOfOnVy6dEkQgdrjXgLtIo8md5SJCPS1MiLxQl2rj9HX19cotCQnJwvVGzSZ\nUBobGwHNhhZNsRmZTIaZmRmVlZXCsezsbORyOS4uLq2WptWEputbWVkBqFy/O+Dl5cWVK1eIj48X\nTDcxMTFoaWnh7u4uiD7QXOkgLi6Onj17Ym1tjUKh4JtvvqGqqop//OMfjBs3TnhseHg4X3zxBV9/\n/TXr169XE62jo6NZs2aNUOmgvr6et956i9DQUKKiovjkk09USiN/+OGHXLx4kbS0NOG3qLGxkdWr\nV1NTU8OqVatUMulLSkpYsmQJ69atY9OmTWqxwIclm0tEpCXizkrkoeDo0aP8+OOPaGlp4evrS69e\nvbh58yapqakcOnRI+IG+n+7uO3UYlJWVsXTpUgoLC3F3d2fgwIGUlpayfv36VsWcwsJCoYmzm5sb\nw4YNo6amps0mzg8TMplMbHrfhelosCGzqKJdEQjAop87Fv1uLdRee3IQM30c1B7XlkMcYO7cuRrF\nD1tbW2xtbVWcSwChoaFs375dbSFoYmKiUhtdyezZs1m9erXKsfHjxxMcHIyDgwMbN25U6fkxfvx4\ntm7dyvDhw4VgaXFxMX/7298wNjbGzc0Na2trFQFCT0+PjRs3cvnyZfbu3cvUqVPp1atXc6N3aNW5\n+ajQUTdfSyZPnqwxWBEcHAzAnDlzhPcfmgPfCxcu5MKFCxw7duyeRCBlWbvnn39epbeVTCZj/vz5\nLFu2rNXnjhkzRk2gnDx5Mrt37271tYrcH4IuX+ft9z/lRmYGdkMmYOM2SjhnN3Qiaaf+4EZOCnPG\nz8BSVk1wcDCxsbH4+fmxZMkS/vWvfzF16lT+/e9/I5VKhfIYCoWCr7/+mqamJpYtWyb05enduzdp\naWksWrQIXd1bYs/dCCVSbV2kWhL+/ve/q5SJvF8oy8MUJJwBwMqlY5k7d9tzTkREpJmunK30K
HM5\nvZitp1LUsiUKKy0oT43Ff95CZk6ZqJJJeifU1NQQGBhIREQEOTk5VFdXC2tGgBs3bnT4WvcSaBd5\nNOmoORCae8QeuHCdlOjrSMpLuZxerCJMu7i4MHDgQI3PraioAJozTVJSUlodo6amRu1Ya1hR6D0A\nACAASURBVL1lpVKpSllwZaWAOy3/pen6SnG3I2XHHyS33y8sezsBzcJPSxHIyckJPz8/NmzYQE5O\nDnZ2dqSlpVFRUSH0vk1KShJKU7YUgABGjx7NwYMHSUxMJCEhQUWggeZsqpalrrW1tRkzZgxbt27F\n29tb5fESiYRx48YRHR1Nenq6sMe7cOECeXl5zJo1S+36FhYWzJ49m40bNxITE6OWDfSwZHOJiLRE\nFIFEuj1ZWVlCdsjq1auxt7dXOa9s8H4/3d2aHAbKzCRodhHMmTOHOXPmMGXKFB577DE2b95MYWEh\nHh4eQppxQ0MDhoaG7N+/X3CKtGTixImUlpayadMmMjMzOXPmDOXl5VhbW1NdXc3PP/+Mt7c3ISEh\nHD9+nOLiYnr06MGMGTOYOnWqyrWU/UwCAgIYPnw4v//+u1o/E0tLS/Lz89myZQsxMTHU1NTg4uLC\nq6++ioODajD+vffeIz4+XmOQXNmX5HanvrKx6Q8//MC2bdsIDw/n5s2bWFlZMWnSJGbPnq3iCNHU\nE6hl8Kplo1Rra2s2bdrEu+++S3JyMr/88otGZ/2ff/7Jr7/+yssvvyw4sEXuno4EG8wNNTjVO8C9\nBgNvn5OZopz48+HEx8dTVFREXV0dCoWCjIwMmpqaOtxfRZPTS7lRcHR0VGv6rjyn3IBnFFbwx6HT\n5JTI0dauxUhHqiZADB48GHt7ewoLCwUBwsfHhz///JPS0lKOHDmCQqHAxcVFJTjcXWn5WdXJbwql\npFqjo26+lgwYMEDj8WvXrgFozASzs7PD0tKSgoIC5HJ5qxvI9khLSwNg0KBBaudcXFw0OoCVODk5\nqR2ztGz+2+huzsLuzOX0Yr7aG0lJeiyGPXqpCEAAWjJteg15gvLcVPaGx7Pl6xUoFAouXrxIRUUF\nBQXNQriPj4/weStd3sqSgIBK824l+vr6KvfGu/ltNLC0Q7dQh4SEhPsuAmVkZHD+/HmqYkMpz03F\n1G4AhpbtGzrERvUiIp1HV81WehQJuny91eC4tetIpLoGJKRcoPi3nViZ7BcySf/2t79pXO/cTkND\nA++//z7Jycn07duX0aNHY2pqKtxrtm/fLpRx6gj3EmgXeXRpzxyoifLqOt7bGsmSqZ48Obh5H6an\np4eRkZHGxyvX4TNmzOCVV17pnIm3MsadCKfdldbE6abGRjLzKjkeHsErr7yCXC7n2rVrzJ49G0/P\n5j6UMTEx2NnZCX0hlcdTU1NV/v92PD09SUxMJC0tTU2k0fR7p8zG0rQfun2PDc0xQICioiKNInZu\nbi7QHFO8XQR6mLK5RESUiCKQSLfn8OHDNDY24u/vryYAwa3g2P10d9/uMLg9M6lnz54cP36ctLQ0\nDh06xIgRIzh58iSFhYVER0djbm6ukplUVVXF1atXBWcVQHp6OqWlpVhaWnL06FGhFmpDQwMnT54k\nPz8fLS0t/vGPfwAwbNgwtLW1OX36ND/99BOmpqYa0/5TUlLYs2cP7u7uav1MVqxYwbJly+jduzfj\nx4+nsLCQc+fO8cEHH/DLL790uGdKWzQ0NPDhhx9SUlKCt7c3WlpaREREsHnzZurr61UanmoiICCA\niIgI0tPTmT59urBQU/53ypQpXL16laNHj2psWHr06FG0tbXVsghE7o22gg13Wyv6boMXmha0tRWl\nxO5aTX1ZAQNdBuDm4kRTUxM5OTkoFAoMDAw6XA5OU3lC5Ua7LRdYTnE5724+R9z1EkrSrpBVXElT\nQz1aMm32Xa3HxuWWE87c3Bw9PT309PQEAeKf//wn+fn5H
Dx4kEOHDhEeHo6Ojg6jRo3i5ZdfxszM\n7I7fqweNxs+q8iYJmTdoispg7G3uQCUddfO1pLX3R9ncteV9oiUWFhYUFRXdkwikHEPTHLS0tDQ2\nNFWiaSPcXZyF3ZHWBMmtp1KQF+ei+N97nhd7QnhO9c0i9EwtUUb4qm8Wsy08BeObN+nRowdlZWVs\n2LCBrKwsjh49yvjx44XvUllZGevXr8fCwoKbN2+ycuVKRo0axeDBg1sNQPSzNsa5pymth+bUGTnU\nA6leGmfPniU4OFho3Kvy2jMyMDc3v2MH+u1cu3aNLVu2oK+QYt7XjT7Dp7T7nPYa1T/xxBPifbsL\no8m0I/ZvFBFpXue0lx3Rw9GLHo5eNNbX4D/MlJLrVwgODuajjz5i/fr17f4mK/sLtvz7U1JSUqJW\n9rM9/opA+6PA1atX2bt3L4mJiVRWVmJmZoa3tzcBAQFqZcYqKirYt28fERER5OfnI5PJsLa2xtvb\nmzlz5qjswXNzc9mxYwcxMTFCRrGXlxf+/v706tVL5bp32/D+bseoKC2l/tJetFLTKamBSqO+9Br8\nBFpSGRX56eTHnaSqJB+JREKdvAykMhQK+PZgLNam+gxxsCQmJobr16/z2Wefqc2ruLiYpKQkUlJS\nOHz4MObm5gwcOJCZM2d2SDDtCL1798bQ0JD09HRKSkruqCRcd6ItcVpLKqXRyIbQyDj2hidgp1NJ\nU1MTXl5e9OnTBwsLC2JiYpgyZYrQw0m5j1bueVp735THW/ZmVXK3e2ylkQqae6lBc0uItuhotpi4\n5xLp7ogikEi3pGVQ5tDJKKpqGxg2bFibz7mf7u6WDoPvvvuOzZs3o6Ojg7+/P5aWlshkMhwcHJgz\nZw6jR48mOzubkpISbty4waBBg9QykxYuXMiRI0c4f/68IFwox6isrCQjI4Pp06cLpeuGDx/Ob7/9\nRklJCdra2uzfv194DTNnzuS1115j9+7dGkWgCxcuqNVnVfYzWbp0KbNmzVIRxnbs2MHWrVs5duwY\n06dPv6P3SRMlJSU4ODiwcuVKwfE8d+5cFi1axP79+3nuuefaLNE3d+5cCgsLSU9PF/opteSxxx7j\nl19+ITg4mLlz56o47OPi4sjJyWHs2LFiM/W/mDuqFd1OMLAtWlvQFiadA4kWJv0Gk1vbhG5WITZm\n+ri4uDBixAguX77cKX2HWqOwrJqEslycezWLHVKd5uydhrpqdGTapN5oUHHClZaWAggbf7lcjrW1\nNaNGjeL69essXbqUpqYmQkJCCAsLo6CgQK1EXVenrc0HQF5plZo78F64ve60EuWGo7S0FFtbW7Xz\nJSXNn1nL+4REIlER7VuiySmm7Nl08+ZNtbr9TU1NVFRU3HHZCZHOpS1Bsi4iDQM3Zxpqmze28hu5\nyG/kCo+ryLsGEi2kOnpoSWUoFAp2bVhNf6Naxo8fz5gxYwgNDeXSpUvs2rWLo0eP4uHhgZeXFykp\nKdjZ2WFvb4+FhQVeXl6cOXOGsLAwiouLyc3NJTY2Vk0AeXqYPUd+79hrU/6m9pnyLu+//z7r1q3j\nwIEDuLi4YGhoSHFxMRkZGWRmZvLVV1/dswjUUrBp7+9cOT+xUb2IiMjDyNZTKR1a+wJItfWILjPk\ny//3/1AoFAQHB5OQkICfn5+QZd7U1KSWcZ6XlwcglGNqSXx8vMaxWl7vdgYMGIBEIiExMbFjExdR\nIzg4mO+//x5tbW18fX2xtLQkNzeXo0ePEhUVxVdffSVkGBQUFLB8+XIKCwtxcnJiypQpKBQKcnJy\n2LdvH0899ZQgAqWkpLBixQqqq6vx8fHB3t6e7OxsTpw4QWRkJCtXrtQohtxJw/u7HePgwYNcuHCB\nESNG4OHhwffbj1B4JYLG2hpMew8g48weTHoNwNJ5KJVF2ZTlptLcIajZQ7MtPKXVdYBCoWDt2rWE\nhIRgbGxMXV0dNjY2O
Dk5ERcXh52dnTCnvLw8tLS0sLGxuavPTktLi6effpo//viDH374gX/9618q\n709DQwNyufye10oPko6I00Y9HSjPS+PzzQeZ5KiNjo4Orq6uQHM2z8WLF6mvrychIQF7e3vh/Wi5\nr9KEcl91v3o+K/drK1asEMori4g8yogikEi3QlNQJiE5h9qKEr48ksqCCXqtLhbup7u7pcMgMzOT\ngoIC7O3thewjJTU1NVhaWlJYWEhRURHa2toaM5PmzJlDUFCQ0PQcbqXil5WVUVZWxq5du1SuXVZW\nRkNDA0OHDlWZf8+ePXF1dSUxMVHjRmHQoEFq9VmV/UwMDAx49tln1c5t3bpVKGfUGSxatEil5I2p\nqSm+vr6EhoaSk5ND37597/raOjo6TJgwgT///JPIyEiVDVFQUBDQ3E9D5K+lo7Wi7yUY2NaCtrai\nFKm2Lq5T/w+pti4SCax4wZchDpb88MMPQir7/eByejHpheUYWd/6u9c3bxYCGmurUegb01BTiVTb\nQnDCKeej/Pu9/TfK3NwcDw8Pxo4dy6JFi0hMTKSiokLIKJFIJF3asdTWZ6UUaxSKJjV34P3A0dGR\na9euER8fryYC5eXlUVxcjI2NjcpnYGRkJJQebUlTUxPp6elqx/v3709aWhqJiYlqItDtWaAifz3t\nCRVFZTX0BaQ6zUEYa9cR9B52qx9fUfIFKvKuUV1aQH1N5f+EoCa8x0/j32/9DX19fZ555hmys7NZ\ns2YNJ0+e5Ny5cyQmJrJs2TJeeOEFFi9ejJmZGR9++CH19fWkpqby22+/sXnzZv744w8ef/xxBg8e\nLIzp2tscB2sTqtt5bbf/pq5Zs4YDBw5w9uxZTpw4QVNTE2ZmZtjb2zN16tR7uv9qQmxU/+gyb948\nnn322YfWRS0i0h4ZhRXtZsFX5KdjZNNPWPvEZpaQUVjBzZs3AYSSv0rzWlFRkVpwW2mIi4uLw8fH\nRzien5/Pf//7X43jtrze7ZiamjJu3DjCwsLYsWMHzz//vEbh6V4C7Q8zOTk5/Pjjj9jY2PDZZ5+p\nmHxiYmL44IMP+Pnnn3n//fcB+OqrrygsLGTevHk899xzKtcqLy8XBCCFQsE333xDVVWVmqEzPDyc\nL774gq+//pr169erGZ862vD+XsaIjo5mzZo19OnTh4zCCnZkWVJ8+GdK0mMoy0mm//gXMbbpJ4xT\ndj0ReUkeVSX5GFj0FL77mjh69CghISE4Ozvz66+/snr1aq5evSpkp5SWlvLtt9+SlZVFSkoKS5cu\nvafvZkBAAFevXiUqKopFixYxfPhwDAwMKCoq4vLly7z88svdOju5I+K0cc/mVgAVeekcyihiiu9A\nIX7j5eXFiRMnOHz4MDU1NSqm6/79+wOoxLVaojyufFxn4+LiAkBCQoIoAomIIIpAIt2I1oIyMh09\naoHoq5m8VyBv1SV+N+7ujlLdKCX/ZhXTXnwNyckjWPXMZfOvG+ndW3PNewMDA6qqqtDS0tKYmaSj\no4OOjg5lZWWCKKWcv4uLi5q4BLBs2TKuXLnCRx99pHauR48eNDY2UlpaquYuv9d+JveKoaGhxs+j\nM3tcTJkyhX379nHkyBFBBCovL+fcuXP06dNHrf6syF/D/Q4GtrWg1TFsdidVFmRg2ttFcJwpSq9z\n7NixuxrvXualY2iKia0j1TcLqKu8SWVBJrrGFigU8P0fxyk+e1IobWZjY0NDQwMZGRlq166pqaGm\npgapVKqSyWRiYqJRpOgqtPVZSXWa+5/UV5UB7bsD75WJEycSHBzMjh078PHxEZxsTU1NbNq0CYVC\nwaRJk1SeM2DAAC5evMjly5cZMmSIcHznzp0UFhaqjaEU2v/44w98fX2F+05DQwNbtmy5L69LpGN0\nxA2pxKCHHRKJBHnhdZXjVgO8sRrgrfZ4r1EDhCwwaC4x8tVXXwHw/vvvExsby5gxY9D
V1WXTpk3C\n47S1tXF1dWXVqlVMnDiRb775hsjISBURyMPDg8hTx7mcXqz2m6prZMbQFz/S+Juqr6/P888/f1el\ncO8WsVH9o4mFhYUoAIk80kRntL8OSz/1B1oynea+bUZmKBTw5tt7UVQ0Z4Uo941eXl6cPn2aVatW\n4e3tjY6ODtbW1jz++OP4+Phga2vLvn37yMjIoH///hQVFREVFcXw4cM1Cj0eHh5IJBI2b95MZmam\nUHp2zpw5APzf//0fubm5bN26lbCwMAYNGoSZmRklJSWdFmh/WDly5AgNDQ28+uqravtwLy8vfH19\niYqKorq6mpycHJKSknB0dFQzYgIqlSuSkpLIzs5m4MCBaobO0aNHc/DgQRITE0lISFDb63a04f29\njqHsrxqdUYyWVIZ5PzfyYk5g0stJEICg2fClZ2qFvCSP6tJmEUj5PE0cPHgQgDfeeAMrKys+//xz\ngoKCOHnyJBEREdTV1WFmZkavXr145ZVXVNbmd4NMJuPjjz/myJEjhIaGEhoaikKhwMLCgpEjR2rs\n8dld6Ig4DWBgbotMR4+y7KsU18ixnXOrIoyy34/SpNyy/4+rqyt2dnYkJiZy5swZRo261UPzzJkz\nJCQkYGdnh5ubW2e9JBV8fX2xtbXl0KFDeHp6qvX9gebvuYODw0PRV1dEpD1EEUikW9BWUMbAsjfy\nG7mU56aiZ2rZqkv8btzdHZnX1lMpnIqXNzd3DwrnZlb7mUm9e/dGoVBQXV2tsa9OYmKikGasFIGU\nLobqas0+347UR9XkLu+MWqv3Qlu9PKBz6q327NmToUOHcunSJfLy8rC1tSUkJIT6+noxC+gBc7+C\nge0taK0GDKckLZr08N2Y2buirW9Mamghl3VuMumJcYSHh9/12Hc7rz4+U6nIz+BmdhLJx36ld+lT\nNNbVEn09Ea++FjjY25Gbm8ukSZO4ceMGb731FnV1dWRlZbF//35Onz7N+fPnKS0tZdq0aSrBZi8v\nL06dOsV//vMf+vfvj0wmw83NrUsIoO19VlJtHQx62FFZeJ2M03vRNelBfpyEGYOMMb0Pa3VXV1dm\nz57Nnj17eP311xk1apTQry0zM5NBgwbxzDPPqDxn1qxZXLp0iZUrVzJ69GiMjIxISkoiPz8fDw8P\nNfebu7s7kydPJigoiNdffx0/Pz9kMhlRUVEYGBhgYWHRark6kfvLnZTq0dYzxLyfByXpseTFnaSn\n22gktxknaitKQCJB18gcHamCK1euCOUzlDQ0NAiGB+UG9MqVK/Tv318lSxZQc4PfTncSWMRG9Y8W\nmnoCtewdNHfuXP773/8SHR1NTU0Nffv2Ze7cuQwfPlzj9U6dOkVQUBBpaWlCKaBx48bxzDPPqJTq\nERHpKih7yrWF7eAnmjNJS/Ipz01FSyqjz4B+vLpgAVOmTBEMPpMmTaKwsJBTp06xZ88eGhsbcXd3\n5/HHH0dPT49Vq1bx3//+l7i4OBITE7GxscHf35+ZM2dqXOP26dOHJUuW8Oeff3L48GHq6uqAWyKQ\ngYGBSqD97NmznR5of5hoeQ8+GBZJVW0D8fHxpKSod+8rKysT+pJevXoVgKFDh7a7DkxNTQVUA+4t\n8fT0JDExkbS0NLX1fkcb3nfWGMrvvrZ+8z3foIdqHyGAfqOfo6GuhvrqCpXn3d6zsKamhszMTMzM\nzHB0dASaRZqpU6cydepUjfNsiYeHBwcOHGj1fEsTTkukUmmHxpg7dy5z587VeM7a2rrNsR8EHRGn\nASRaWhhZ9+VmdvN3VGJ2y+xsbW2Nra2tkBHY8rsgkUhYsmQJH3zwAatXr2bEiBH07t2bnJwczp07\nh76+PkuWLLlv+x6ZTMby5cv58MMP+fjjj3F1dRUEn+LiYlJSUsjPz2fLli2iCCTySCCKQCLdgraC\nMlYDvClOuUh+/ClMevVHz9RKxSVeXFyMpaXlXbm
72yIqpYCT55p7mpj2dkHX2IKi5AtoSZs3nrdn\nJrV0GMhkMuzt7UlKSmLz5s0qTTvT09MJDQ2lvr4euCWSODs7Y2ZmRkFBQatNnKuqqigrK3sgNWmV\nGUONjY0qfXegc7J57pWnnnqKixcvcuzYMebPn8/Ro0fR0dFh/PjxD3pqInR+MLC9Ba2+uQ1OE+aT\nFxNGeU4KCkUT+mY2THzxFZ7ycb5vIlBb89I1Nsdt1ltcPbKRkvQ40k7+ga6ROXqmltQqZOTm5goC\nRG1tLS+88AI7duygoqKCsLAw7OzssLOzY8GCBWr9v/7+978DzWUnLly4gEKhICAgoEuIQB3ZfPQb\nNYvsC0cpz7tGY2Y8CoWC0EgPZo1Rz6TsDBYsWICjoyMHDx4kNDSUxsZGevbsyUsvvcTMmTPV+kV5\neXnx/vvvs2PHDk6dOoWenh6DBw9m2bJlbNu2TeMYixcvpnfv3hw5coQjR45gYmLCiBEjmDdvHgsW\nLNCYISlyf+moG7IlfYY/RW1FCXkxJyhNj8PQqg8yPSMaqiuoKStCfiOXfo/NRtfInEG2Jix781Vs\nbW1xcnLC2tqauro6oqOjycrKwtfXV3DN7tmzh9jYWNzc3LCxsUFfX5/MzEwuXryIkZERTz75ZJvz\nEgUWke5EYWEh77zzDj179mT8+PFUVFQQHh7OJ598wsqVK9UCkGvXruX48eNYWlri5+eHoaEhV69e\n5ffffycmJoZPPvlEbS0qIvKgMdBtP/SiKZP0/54cxEwfB5VjWlpazJs3j3nz5mm8jqWlJe+++67G\nc60FoR9//HEef/zxVud2J4H2RxWNpeuvZlFbUcLKtZuw62GIqYGOxufW1NQgl8sBOpQ1qSx139pj\nlceV12xJRxve38sYLY2myu++RNIcL5BqqwfblSYaRVOj2vNaohxL7J3ZOXREnFZi1NOBm9lXkero\nYWqtWvHGy8uLvLw8nJyc1L5fLi4ufPvtt+zcuZPo6GiioqIwMTFh7Nix+Pv7Y2dn1ymvpTX69evH\nd999x759+4iKiuL48eNoaWlhbm6Oo6Mjc+fOFftDizwyiCKQSJenvaCMnqkVfYY/RVbUIZIO/4Rp\n74HkRltgnHuWkvwsDAwMWLVq1V25u1ujrKqOvZHpWDg2l2LRkkpxHPM8qaG/U56XRlN9LWknd2La\n25k3w3fjZt5AvfymisNgypQppKens2PHDgoKCnB1daWkpITTp0/j4uLCuXPnMDMzQy6X4+/vzxNP\nPIGHhwcXL17U2MT52LFjZGdnk5+ff19EoIULFwKtu2OUZQOUGVUtSU1NJSoqil9++eW+1MttKUC1\nho+PD1ZWVgQHB+Pp6UlOTg7jx48X5i3ycNGRBa2RVR+cJ6hunvsMGICHh7PaBvmzzz5Te35bTrLW\nnF7KeQ19Ub1sI4COgQkes/9BSUY8xVejqL5ZgKKpCR19Q14KmCEIEDKZDH9/f5qammhsbGTVqlV4\neHi0+lpNTU1ZunRpq+cfJB35rHSNLej/eIDKMSdPzZ9VSzT9XrXl0GvJmDFjGDNmTLuPU+Lr66ux\n1vTbb7+tIvQrkUgkzJgxgxkzZqgcz83NpaamRhADlDzxxBNt/n52NWdhd6SjbsiWSHX0cJ64gBup\nFynJiOdmVhKKxnpkekboGlvQe9iTmNg64tnXggF9LFmwYAFxcXFcuXKFiIgI9PX1sbW1ZfHixSrm\njqeffhojIyOSk5NJTEyksbERS0tLnn76aWbOnClkUoiIPAzExcUxd+5cAgJu/c6PHTuWjz76iL17\n96qIQCEhIRw/fpyRI0fy7rvvqmTLbdu2je3bt3Po0CGmT5+OiEhXYnC/uytje7fPE/lraa10vbJ/\noMO0Jch09XijldL1AJmZmcCtMvVt0bLUvSaU19BU9aOjdNYY9/Ldv311qxQYOqs0/aNOR8RpJdYD\nfbEe2LzXMdJ
XFTNff/11Xn/99Vafa2dnxzvvvNOhcdraq7W1H2prb25qasr8+fOZP3/+PY3fFbO5\nRETuBFEEEunydCQoY+k8DH0zawqunKOyIIOy7CTCanoxxttdJbvnTt3drZFzQ475bWs3fXMbBj79\nf+RcDCYr8gBFyeepKS9C17gHxSYOfPzOy0JfDktLS6ZPn05oaCg3b94kNTWV5ORk7OzsWLRoEX/8\n8QcKhYJhw4apjKGnp4evry9PP/20WhNnAwMD+vXr1+lNnDuKs7MzZ8+e5ejRoyqutJiYGE6ePHlf\nxzY2bnY7FxUVteqel0gkTJ48md9++421a9cCzdlBIg8nd7Kg7Yzndfb1Lfq5Y9HvVpbOaxpcoNBx\nQaMr01U/q/tNaWkpZmZmKuUPamtr2bhxIwAjR458UFPrNOLi4li+fDkBAQHd4nvaniCp7K1zO1pS\nKVYuPli5+Gh4FkgkMHe0MzKZjNmzZzN79ux25zJkyBCxtI5It+H28oO9DTtYU/F/WFtbC2WnlAwd\nOhQrKyuSk5NVjgcGBiKVSnnrrbfUyiX6+/tz8OBBTpw4IYpAIl2OftbGeNhb3FHGqWdfCzGrsxvQ\nVul6Q0s7qm7kUll0HVO7Aa2WrodbTewvXbrEvHnz2iyR1b9/fwC1ksNKlMeVj7sbOmsM5Xf/xLWO\nj93ad19PT4++ffuSmZlJWlqaUBKuM2nP/PowIYrTIiKPFt07giLySNDRFFVDqz44Wt1SZuaPG8Dc\n0er1bu/E3a3pxt/fwwen2e9pfLy2niH9Rs3EyLoPWVGHkGhpYWBhS2mTAeGRl9i/f79KZlJAQAB7\n9uxBV1dXyEzav38/Fy5cwNjYmOeeew4LCwvWr1+PgYEBS5cuRSaTaWzi/N577xEfH6+xx1BnsHLl\nyjbPT5w4kb1797Jr1y7S09Pp06cPubm5XLx4kZEjRxIZGXlf5gXN6cd79+7l+++/x8/PD319fQwN\nDdXKFUyaNInt27dz48YN+vXrx8CBA+/bnEQeLF11QdtV5/UgeVTfk8DAQE6ePImHhwcWFhaUlpYS\nExNDcXExw4YNU2mc2pVp2dNDU8ZTd+J+CIsSCSyZ6qkx2CMi0t3RVPoIoLbyJllZpbjc6Fg5YAcH\nByGruyWWlpYkJSXdum5tLenp6ZiYmLB//36N19LW1iYrK+sOXoWIyF/HC2OceW9rZId6zykNBCJd\nn7ZL1/twI/USORePoWtsgZ6JpUrp+oaGBq5evYqbmxtOTk64urpy5coVdu/ezXPPPadyrYqKCnR1\nddHR0cHV1RU7OzsSExM5c+aMyrrxzJkzJCQkYGdnh5ub212/rs4c44Uxzpw8EdqhX57GUAAAIABJ\nREFUcdv77k+bNo3vv/+e77//nk8++USl/JhCoaC0tLRDJfVERHFaRORRQxSBRLo8Xc0l3pmZSdOn\nT1fLTDI0NERXV5cBAwYwePBgZDIZvXv3bmO0v4b2+lOYmpry+eef8+uvvxIfH098fDxOTk588skn\nFBQU3Ne5DR06lIULF3L06FH2799PQ0MD1tbWaiKQmZkZ3t7eREREMHny5Ps6J5EHS1dd0HbVeT1I\n7vU9UQrgd5KaHxISwpo1a3j77bfvS4nKjjB48GDS09O5fPkyFRUVSKVS7OzsmDZtGtOnT79vDVJF\nWuduhcUBtiYk55WrHffsa8Hc0c6iACTyUNJa6SMl5dV1HLx4nYnRWa2WPlLSWmleqVSKosUAlZWV\nKBQKysrK2L59+13PXUTkQTHEwZK3n/Zo828HRANBd6L90vWW2PtO53pkIFcObsDEtj/ZJj0wLzxP\nU005iYmJmJiYsGHDBgD+8Y9/8N5777FlyxbOnj2Lh4cHCoWC3NxcLl++zIYNG7C2tkYikbBkyRI+\n+OADVq9ezYgRI+jduzc5OTmcO3cOfX19lixZck/ryc4cY4iDJc/4OvDtufbHb
Z4JY4YPwsLCgp2/hEv32tbV\nm8KbqST/ugXLDt3Q1TdCcSud4nw55g4uVFXBqfgb+Ng3uFuttGzZkhkzZrBx40bmzJlDeno6enp6\nTJw4EXNzc9q2bUtAQAC3b9/m4MGD6OrqMnPmTFauXMmGDRuwt7cnMzOTN954AwcHB/r06cOFCxco\nLS1l586dQPVndNq0aXTt2hU7OzuysrK4e/curVq1wtbWVkMUVtU/u3PnDvb26heoqlX0YN2vgoIC\nNm7c+HA3o4FjEAgaQmPqpwkengft9poro0ePxt3dnU8//fRJD0UgqJWn5fMkEAgEAsFfGSECCQQC\nQTOnrkVzm87PcDc1mhsXf8XQzAojc2sCz6ZIIlB5eTlXr17Fzc2tSX0PGjSIXbt2ERwczLBhw6Ro\nuqqqKrZs2aJWPF0bWi3skm9QosjliyOpBAw1qlew0majBaBQKACIiYkhJiam1vPv378v/Ts8PJxP\nP/0UAwMDevTogb29PUZGRshkMuLj40lISKi1/oc2VHZe+/fvr/M4lR3fvXv3gMZZhv2VyJArNGoA\n1Ufc9VyMKa7/wBo888wz2NraEhERQWhoqIYokJOTo5Fx9iCdO3cG4MSJEzz33HNSlkhOTo5Ua6Mm\n1tbW9OjRg5iYGIKDgxkzZoy0LyIiQqMeEIC3tzf29vYcOnSI7t2707t3b2nfvZJqcVJ5Jwvjlq3R\n0dMUQmvSytmTm7GnqaisYsiQIRr32rxNJzo+58+t+LPkX09EJtPBwMwKE2sHjCxsKC64Q9rtQlxb\n6NbRS8MYNWoU9vb27N+/n99++w0jIyPu378vfe5++OEHzp49i1Kp5NVXX6V379588skn/PDDD6Sk\npFBVVUV+fj4dOnQgKyuLzMxMVqxYQVlZGb6+vvj4+BAeHk5KSgqxsbHI5XLs7e2ZNGkSY8aM0bAN\n9PT0ZP/+/Xz99df4+PhgbGyMqakpfn5+uLi40LVrV8LCwli4cCHdunUjPz+fixcv4uDggJWV1UPf\nj/rGIBDURVPqpwkEAsGTYtq0aUyYMOGR/X4KBAKBQCB4NAgRSCAQCJox9S2aG1lY0957DJkRQVwO\n/gZz+478bt6KlvILVBYXkpSUhLm5Od98802T+re1tWX69Ol8//33zJ49G19fX0xNTYmOjkapVOLo\n6EhGRobWc2uzsNMzMKIEiLl6nUW3lbxTj4VdzcL0NVHVBXnjjTfUIvjrYseOHejr6/PVV19p1AxZ\nv3691oX6ulBli+zZs0etTkltqI5pjGXYX4nGZLfU5EauslHH6+np8d5777F06VK++OILjhw5gqur\nK6WlpWRlZREbG8uBAwfqbKNLly64u7uTkJDAvHnz8PT0JD8/n8jISLy8vLTW8XnzzTdZsGAB3377\nLZcuXcLJyYns7GzCw8N55plniIyM1Bjn4sWLWbp0KR9++CFdu3bFyckJQ0NDQi4lkxh+iRJFHh7j\n59crArX2GEBrjwG8OaIbPs848UtkusYxFg6dsXDoLP2/pCifxF/WYtrKgW6jZwJwI/e8tP/777+v\ns88pU6ZIllMnT55U2+fl5YWXlxcxMTG4u7uzcOFCtmzZwtmzZ7l37x7t2rVj3LhxDBw4EIBu3brx\n2WefAdUi85EjRzh+/DiZmZl06tSJdu3aMWzYMEaNGoVMJuPZZ5+V+lJFwk+dOlXrOHv27Mlrr73G\nsWPHOHDgAOXl5dja2uLn54eOjg5Llixhx44dREVFcfDgQVq1asXw4cN56aWXmDlzZp33oKHUNQaB\noDaaWj9NIBAInhRWVlZCABIIBAKBoBkiRCCBQCBoxjRk0dzKuTvGLe2QXz6P4nY6ilvXOHIvje4u\n7enfvz++vr4PNYYXXngBKysr9u3bx8mTJzE2NqZnz568+uqrfPHFF1rPqcvCzsS6Lcq7Nym8mYqR\nhXWTLexqWmk1VATKzs6mffv2G
gJQVVUViYmJWs+RyWRUVlbWOobU1FQSExPp06dPvf136tQJQKvY\nVFlZSVJSUr1tPM2oslsaS2m59vtfFy4uLqxbt469e/cSFRXFlStXMDY2xt7enpdffrlBbXzwwQf8\n97//JSIigoMHD9KmTRsCAgLo2bOnVhGoTZs2fPnll2zdupXY2Fji4+NxdHTk/fffp7CwUEMEAnB0\ndOQ///kPv/zyC5GRkZw4cQIdHR30jFpg3LI19h6D0DNseF2kHo7Vn6OG3GvDFpb0fGWZ2rYh46Yx\nxfdjrcfXFH0eZMiQIVINIG1YWVkxf/584A9bmEuXLkkiUE1kMhmjRo2SannVx8GDB+s95oUXXuCF\nF17Qus/MzIw333xT6z5tQlh911rbeOoag0DwIE2pnyYmdgKBoC7i4+NZvHgx/v7+Wn/PX3vtNeCP\n376TJ0+yZs0a5s6di42NDbt27SI1NRWZTIabmxv//Oc/Nd6pH6zHExgYKGVQnzx5Ui1oZO7cuWq/\np9HR0QQFBZGcnMz9+/extramX79+vPTSS1rresbExLBr1y6uXbuGvr4+bm5uBAQEPPR9EggEAoHg\nr4iYKwgEAkEzpqGL5sYt7ejgM1b6//RBnZni66JxXF2e8XUtbA4YMEDDUquu9uq2sOtNTspFbiWE\nYt6mI0YWNmoWdg2x6oLqRX43NzfCwsI4fvw4w4YN0zgmIyODli1bYmFhAVRnNt28eZPc3FwpSrGq\nqorAwECN2kEqzM3NtdaFAfDz8+PYsWN89913tGnTBgcH9eo1D9rxubq64uDgQEJCAhEREWp1YIKD\ng//y9YBMDOt/7dAmTIyf+jovPOOk9XgPD49aF91tbGxqXdyvSW0ZL6amprz99tu8/fbbGvtq69Pe\n3p5FixZp3Vfb58vCwoLp06czffp0te0LtoU3yj6vewcrHG2ra8805F5ro6nnCQSCR0tT6qdN82pR\n/8GCh6KkpISgoCDOnj3LzZs3kclkdOjQgTFjxqi9J4WGhvLFF18wduxYXn/9dY12ysrKmDp1KgYG\nBmzZskWyHFWde/ToUdLS0igtLcXOzo5BgwYxbtw4rTUSBYLHTWRkJBEREfTq1Yvnn3+erKwsoqKi\nSElJYcOGDZibm9d6roeHB0qlkqCgIJycnOjbt6+0z8npj3e7Xbt2ERgYiJmZGX369MHCwoKMjAx+\n/vlnoqKiWL16tVrW/W+//caqVavQ19fH19eXli1bkpSUxIIFC9TaFQgEjedBQbg+aqv79aAoLBAI\nnixipi8QCATNmKdxIbd+Czsb2vV5nqzIQ1w5vAmLtq7cjLHC7GYYubeyMDExYeXKlQ3qa8GCBbz/\n/vusW7eOgwcP0qVLF0xNTcnJySEjI4Pr16+zevVqSQR64YUXWL9+PbNnz6Z///7o6upy+fJlMjMz\ntdp1QXUtj9DQUD766CM6duyInp4ebm5uuLu707ZtW2bPns26det466236NmzJw4ODlRUVCCXyzXs\n+GQyGXPmzOGDDz5g5cqV+Pj4YG9vT1paGrGxsfTq1YuLFy824a4/HaiyVP6s8552Xh7gwqKdEQ1a\nCJbJUBN+xb0WCJ5emlo/7UZ77fapgkeDUqlk8eLFpKWl0bFjR4YNG0ZlZSWXLl3iiy++4Pr165It\nZd++fTE1NeXMmTO8+uqraiIPVNeKUyqVDB8+XG3f2rVrOXHiBNbW1vj4+GBqasrVq1fZsWMHsbGx\nfPzxxxptCQSPm/Pnz/PRRx/h6ekpbdu2bRt79+7l+PHjjB8/vtZzPTw8sLOzIygoCGdnZ60ZSHFx\ncQQGBuLq6sry5cvVsn5U2UiBgYGSoFpcXMz69evR0dHhs88+w8Xlj/ef7777rl7LX4FAIBAI/o4I\nEUggEAiaMU/jQm5DLOysXXphbGnL7cvhFN3OoOD3K5wubsOA3u4MHz68wX1ZW1uzZs0aDh48SFh
Y\nGGfOnKGyshJLS0vat2+Pn58fHTp0kI4fOXIk+vr6HDhwgJMnT2JgYICbmxtz5swhLCxMqwj0xhtv\nABAbG0tUVBRVVVX4+/vj7u4OwHPPPYeTkxO//PILcXFxXLp0CSMjI6ysrLTa8XXt2pVVq1axfft2\noqKigGpbuU8//ZTo6Oi/tAjkaGuGR3urJme3/N3wcrJm7j886rWEksngHb/uapaKzfVeN8QWpqqq\niqNHj3L8+HGysrKoqqqiffv2DB06lOeff77WOmECwV+FptZPu3rzr11X7knz7bffkpaWRkBAgNqi\nd2lpKStWrOCnn36if//+ODs7Y2BggK+vL0ePHiU6OlrDMlb13Td48GC1bSdOnKBfv34sWLAAAwMD\naZ/qu/PQoUOMGTPmMV+pQKDOgAED1AQgqH6n3rt3L8nJyQ/dviq7+u2339awfRsyZAhBQUGcOXNG\nEoHOnz+PQqFg8ODBagIQgL+/PydOnECpbFw9SYFA8OiZNm0aEyZMEHXCBIJmghCBBAKBoBnTXBdy\n66KhFnamNu1wtvnDR/xBCztbW9sG1fowNjZm0qRJTJo0qUH91mZ75+joqDU60cLCgoULF9bZpqOj\no1rqe3106tSJDz/8UGO7q6trrTVX/io8THbL35GRXu2xszQh8GwKcdc1vwe6d7Biiq+L1ppazeFe\nP/gZbogtzJdffklISAjW1tYMHz4cmUxGeHg4GzdulKxeBIK/Mk2tn3a/tOIRj+TvQ4ZcQUxGDvdK\nyjEx1KOtqfoXp0Kh4PTp07i4uGhkPRgYGBAQEEB0dDQhISE4OzsD1QLP0aNHOXnypJoIlJeXR3R0\nNM7Ozjg6Okrbg4KC0NXVZc6cOWoCEMDkyZMJDg7mzJkzQgQSNIqaz/adrN+b9P2iqmlZE5V1c1FR\n0UOP8cqVK+jp6WmttwjV9okFBQUoFArMzMy4du0agBSQVRNTU1OcnJy01t8UCAR/LlZWVkIAEgia\nEUIEEggEgmZOc1jIbQxPo4Wd4M/jYbJb/q54OVnj5WStsUjZw9G6TsG3Od7r+mxhQkNDpUXUVatW\nYWRkBMArr7zCokWLCAkJoU+fPgwcOPCxj1UgeFI0tX6aazc33guoP3hC8AeX0nPYGZqiEWxTUpRP\nVlYeXe5WL3AnJydTWVkJVGflPEhFRbUAV7O+YNeuXXFwcCAyMpKioiJatKiu2aTKWh46dOgf/ZWU\nkJ6ejrm5ea1WVvr6+rXWLxQIHkTbs624nUHK9bts/Pkc33y/jf/32nStwUeqeiCqfapntyYqW0LV\n5+JhUCgUVFRUSJnCtXH//n3MzMykLB9LS0utx7Vs2fKhxyRoGAcPHuTIkSPcvn2b0tJSXn/9dcaO\nHVv/iY+Z0aNH4+7uXmc93D+DmrVyJk6cyI4dO4iPj6ewsJAVK1bg4eGBQqFg//79nD9/Hrlcjp6e\nHp06dWLChAl4eXmptaeyR5w7dy7m5ub8+OOPpKeno6enh6enJ9OnT6dNmzZq5yxatIiEhAStwZU1\n29MWJKlUKtm+fTvh4eEoFApat27N888/j5+fX4My8+uqCZScnMzPP/9MUlIShYWFmJmZ0aFDB0aM\nGMGzzz7bkNsrEAgaiVhxEwgEgmZOc1zIrYun0cLur0BtBTmbIw9mt5QU5ZP4y1paOfegg8/YOrNb\n/s442po1OsvvYTKJHhX1RdjX5Pjx4wAEBARIAhCAkZERAQEBfPDBB/z6669CBBL8pRG/o38ORy9l\n1vluVXi/lOCLmQyLycJQoQAgJSWFlJSUWtssLi5W+//gwYPZvn07oaGhjBo1CoBTp06hp6en9j1W\nVFREVVUVBQUF9S6ECwT1Ud+zfSuviIIbecRdv0tNCUipVGrYsf0ZmJiYUFVV1eBnXzXG/Px8rfvz\n8pqvNWZT3tfrW6h/UoSGhrJ582acnZ0ZM2YM+vr6uLq6Pul
hNUuys7OZP38+Dg4ODBo0iJKSEkxM\nTJDL5SxatAi5XI6bmxu9evWiuLiYCxcusGzZMt566y1GjBih0V5YWBgXL16kX79+eHh4kJaWRlhY\nGPHx8XzxxRc4ODg89JjLy8tZsmQJRUVFDBgwgPLycsLCwti8eTO///47b775ZpPbPnbsGBs2bEBH\nRwdvb2/atGlDfn4+qampHDp0SIhAAsFjQohAAoFA8BTQHBZyG8rTaGH3JGiuE7o/i5rZLWeir7Du\nXAu8XO1Y8q8Bf7tn4XHT1Eyih6WhEfY1uXbtGjKZDA8PD4197u7u6OjoSDYw8Ee08vfff/+IRy8Q\nPD7qe27F7+jj51J6Tr3BNQBUwVfBcbziXm3PNnbsWKkuSUMYPHgwO3bs4NSpU4waNYq0tDQyMjLw\n9vbG3NxcOk61qO3s7MzatWsbfT0CgYq6nm09A2MAyouVVFXB4ehMXkrPwcvJmuzs7McmAuno6AC1\nZw25urpy4cIFMjMzad++fb3tdezYEYCEhASGDRumtk+pVJKenv6QIxY0hAsXLgCwbNmyZmf5tXHj\nRgwNDZ/0MCSSkpKYOHEi06ZNU9u+aNEi7ty5w8KFCxkwYIC0XalUsmjRIjZv3oy3t7dG1ltkZCRL\nly5VsxoNCgri22+/ZcOGDaxYseKhx5ybm4udnR3r169HX18fqM4OnDdvHocPH8bX11erJWN9ZGVl\nsXHjRkxMTFi1apXGZz4np2l1EQUCQf0IEUggEAieEp7UQm5TeNos7P4KWFlZSS/UTxOOtma8MtSL\noe4/YGJigpVV83qW/0o0JZOoqTQmwn5Ejz9qgymVSszMzNDT03xF1dXVxdzcnIKCgsc1bIFAK08i\n01L8jj5edoamNOjeAlRVQeQtkMlkJCUlNaofa2trPD09iYmJ4caNG5w8eRJAI/jDyMiI9u3bk5mZ\nKdU9EQiaQl3PtqG5NboGRihuZ0jbAs+m4OZgzqZNmx7bmFq0aIFMJuPOnTta948dO5YLFy7wn//8\nh0WLFmkICsXFxVy/fp0uXboA0LdvX1q0aEFISAh+fn64uPzx/bdr1y7JLu6vQt++fdm4cWOzs7nL\nza0OVGhuAhBA27Ztn0i/tWW/W1pa4u/vr3Zseno6CQkJ9O/fX00AgurAgJdffplPPvmEsLAwKZNU\nRffu3dUEIAA/Pz+Cg4OJi4tDLpdr2K81henTp0sCEICZmRmTJ09mzZo1nDhxokki0OHDh6moqGDy\n5MlaRV9VvTGBQPDoESKQQCAQPGX8mQu59REfH8/ixYvx9/dX8xR/2izs/gro6ek9sQnPw/I0j12g\nSWMj7G0tjKXvAFNTUxQKBeXl5RpCUEVFBYWFhU+d0CkQNAXxO/r4yJArGpVlBXD1Til9evcj9kIY\nu3fvZtKkSVJ2g4rs7Gx0dHSws7NT2z5kyBBiYmL49ddfCQkJwdzcXGPxDuCFF15g3bp1rF27lnfe\neUcjI6OoqIjbt29LWRACwYPU92zr6Opi2+UZsqKOcj/vFnevxRD8YxFZv26ig4PdY1vMNzIyonPn\nziQmJrJ69WocHBwkGyhHR0eplskPP/zAG2+8Qe/evbGzs6O4uBi5XE5CQgLdunXjww8/lNqbNWsW\nq1at4r333sPX15eWLVuSlJTE9evXcXd3JyEh4bFcy5PA1NT0idj01UZgYKCadd/o0aOlf6vqzsTG\nxrJ//36Sk5MpLi7G1tYWHx8fJkyYoHEtqpo1P//8M3v37uXMmTPcvn2bgQMHqgVehIaGcvToUdLS\n0igtLcXOzo5BgwYxbtw4NaFCNSZtNYFyc3P54YcfiIqK4v79+zg4ODB27FhsbW21zmlVY/vll1/Y\nt28fJ06c4M6dO1haWjJw4EBeeeUV9PT06s1+H+LURWOMV65cAaoDoLTVmlMFPWmrBactY15HR4du\n3bqRnZ1NWlraQ4tAurq
6dO3atda+09LSmtTu1atXAejVq1fTBycQCJqEEIEEAoFA8Fh4mizsGlqY\n8urVq+zfv5+kpCSKioqwtLSkd+/e+Pv7a0yc65rQ3L59W5qcrlmzhjVr1kjnqQpn5ubm8uuvvxId\nHU12djZFRUWYm5vj7u7O5MmTadeunVp/tUWq1yzIGR0dTXBwMDdv3sTExIS+ffvy6quvYmpqqnb+\nSy+9xNatW4mPj6esrAxXV1def/11OnToQEFBAdu3b5cKXTs6OhIQEED37t2lPh/V2FVt7dmzh6io\nKHJzczExMcHNzY1JkybRqVMntWNrWuxZWlqyd+9e0tLSuHfvntZiqILHQ30R9qpCslVVlVRVVUch\nq74HnJ2diY2NJTExEU9PT7XzEhMTqaysFAuggr8NT9Pv6NNETEbTrGY8Bo2luPAuO3fu5PTp03Tr\n1g1LS0tyc3PJysoiJSWFhQsXaohA/fr1w8TEhKCgIMrLyxk9erTWbMdhw4aRmprK4cOHmTFjBl5e\nXtja2qJQKKT3hqFDh/LWW281afyCvz4NebZbdx9EcVEeWeeDKPpfRpCN1zA++mAOM2fOfGxjmz9/\nPt9++y3R0dGEhoZSVVWFtbU1jo6OAEyYMIFu3bpx8OBBkpKSiIiIwMTEhFatWjFixAiNWoD9+/fn\no48+IjAwkLNnz6Kvr4+7uzurV69m7969T0wEakyxe7lcztatW4mJiaG4uJgOHTowZcoUDZG4Ngtp\nlbXo+vXrpfuQn5+PjY0Nw4cPZ/z48dI7V00aM5+5desWe/fuJS4ujrt372JgUG2N2apVKyorK8nL\ny1PLcAkNDWXDhg2cPXsWQBL5DA0N2bt3LxEREXzxxRdaRa2VK1eSkpJCr1696Nu3LxYWFtK+tWvX\ncuLECaytrfHx8cHU1JSrV6+yY8cOYmNj+fjjj9HV1a3zb1NQUMDChQuRy+W4u7vj6upKXl4eGzdu\nxMvLq85zV69eTWJiIr169cLExISoqCj27dtHfn4+rgPH1Zv9HppawLEHst8V/6s1FxMTQ0xMTK19\n379/X2Pbg/ZwKlTZYnVlw6lEvJUrV9Z6DIC5ublGsEPNvrOyshg9ejT+/v4MHTq0zrZqUlRUbQfd\nqlWrBp8jEAgeDUIEEggEAsFj42mwsGtoYcrjx4/z9ddfo6+vj7e3N9bW1ty8eZNjx44RGRnJ6tWr\nsbGx0Whf24TGw8MDU1NTIiIi8Pb2xtnZWTpeNSlKSEjgp59+onv37vj4+GBsbMzNmzcJCwsjMjKS\nzz//HCcnpwZf55YtW4iOjuaZZ57By8uLuLg4jh07RnZ2tppv9O3bt5k/fz7t2rVjyJAhyOVywsPD\nWbRoEatXr2bZsmWYmJjg6+uLQqHg7NmzLF++nE2bNknX/6jGfvv2bd59911yc3Pp3r07AwYMICcn\nh3PnznHhwgUWL16sNZr6t99+4+LFi/Tq1Yvnn38euVze4PskeDgaEmGva2CMTCaj7F51hGPc9Vwy\n5Aocbc0YNmwYsbGxbNu2jU8//RRDQ0Oqqqr45ZdfWL58OXK5nNLSUr755humTp2qtf2ysjIOHDjA\nmTNnyM7ORldXFycnJ0aPHq22AFNcXIy/vz8uLi58/vnn0vbS0lImT55MWVkZ8+bN47nnnpP2HT58\nmI0bNzJ79mypDkFDo0QFTyc1I55Pnjwp2XkBzJ07l4EDB3L06FGioqLIzMwkLy8PIyMjOnbsyIsv\nvtioSNeQkBDWrFlD69at+fDDD7G1tZV+R4NPnGXHnr1kZqRRWVaKY1t7ujoMoLNt9/obFqhxr6S8\nSedVyPT57LPPOHr0KCEhIYSFhVFaWoqlpSVt2rTh9ddf17qQaGhoSP/+/Tl+/DhQXSeoNt588016\n9+7NkSNHiI2NRalU0qJFC2xsbBg3bpza95FA8CANebZlMhnWnXqSlx5Pa3df2vQYTL9Bn
TE0NJTq\nlE2ePBlTU1OGDBlSZ91KbQE2c+fO1WqbaW9vz9KlS+scW7du3ejWrVu916CiR48e9OjRo8FjeNw0\npti9XC5n3rx5tG7dmsGDB0vv1B9//DGffPKJWnBVXZSXl7N06VJyc3Pp3bs3Ojo6nD9/nm3btlFW\nVqZhQdaY+Uxubi7z5s3j3r179O7dGx8fH0pLS7l9+zaxsbHY2tqSl5cnZc6sXbuWrVu3cvXqVWxt\nbXn55ZeRy+VcvnwZDw8PRo4cydGjR9myZQuzZs3SuJYF+hCOAAAgAElEQVQ7d+6wfv16tXppUP3b\ne+LECfr168eCBQskIQr++I0+dOgQY8aMqfNebdu2Dblczvjx4wkICJC2jx07lnnz5tV5bnZ2NuvX\nr5esOqdOncrs2bP5Ofgoenfs0TNqUef5VMk0st9Vme1vvPGGWjZVQ8jPz9e6PS8vD0BNZFMJORUV\nFRpCmUqQ0UZhYSGVlZUaQpCqb2NjY0nIagwtWlTfqwMHDrBz586/bX1cgeBJIGakAoHgqeJRZitA\ndZTM3r17CQ8PRy6XY2BgQOfOnRk3bpzapCI0NJQvvvii1oLAZWVlTJ06FQMDA7Zs2aL2gtWU1PX/\n+7//Y9u2bVy4cIHi4mKcnJwICAjAzc2N4uJiAgMDOXfuHHl5edjb2zNlyhSN6LKH6X/RokX88MMP\nREZGolAosLe3Z9y4cWpRPqoME6j2365pC7By5Uq1NPXmZGFXk4YWprxx4wYbNmzAzs6OTz/9VC1y\nKTY2liVLlrB582bef/99jT5qm9AARERE0K9fP60vvp6enuzYsQNjY2O17enp6bz77rts27aN5cuX\nN/har1y5wtdffy1N7CoqKnj//feJi4sjOTlZiupKSEhg6tSpTJo0STp39+7d7Ny5k/nz5/Pss88y\nc+ZMKbLQy8uLf//73xw4cED6bDyqsa9fv57c3FyN8YwaNYr33nuPr776iv/+978YGRmpnRcVFcWy\nZcuEzcAToCFRyLr6Bpi0cqBInknGuf0Ymrfi62/TmPXyaAYOHMj58+c5d+4cM2fOpF+/foSGhnLm\nzBmqqqrw8fFh1KhRREREkJycrGEbp1oMSUhIoG3btvzjH/+gpKSE3377jVWrVpGWliYV5TUyMsLF\nxYXk5GTu378vPa9JSUmUlZUB1Z/vmouusbGxABpZSlB3lOiTWIwSPBo8PDxQKpUEBQXh5ORE3759\npX1OTk4oFAo2b95M165d6dGjBxYWFuTl5REZGcny5ct5++23GT58eL397Nu3j23btuHq6sqSJUvU\nasLs2rWLwMBAzMzMmDhqMBYWFmRkZPDzzz8TFRXF6tWrhU1iIzAxrH8KbNjCkp6vLNM4T09PDz8/\nP/z8/BrV5+zZs5k9e3aDju3Tp4/WAAdtiCxXQU0a8mwD6BlU/96V3SsEILeoWNqXnZ2NUqlsVvZj\nTwONLXYfHx/PlClT1ESagQMHsmzZMvbv399gESg3NxcnJyc++eQTSRyZMmUK//rXvzhw4AATJ06U\n3pMaO5/57bffUCgUzJgxg+59n5MC+mx76DFp+v/j+6+/kM5XCTWtWrXC3d2dl156SXrfUgk1np6e\nGBsbc/r0af71r39pzENfeeUVrfOloKAgdHV1mTNnjpoABNWCZXBwMGfOnKlTBCovLyckJARTU1Ne\neukltX1OTk4MHjyYX3/9tdbzAwIC1H6XjYyMGDhwICcjN2Jz9yYWDp1rPVfFg9nvqhpXiYmJjRaB\n4uPjmTx5stq2yspKqW5dzQBDleiSk5ODnZ0dfn5+DBgwABsbmzqvuaKigsuXL+Pm5qbRN0DPnj2Z\nPHky5ubmFBcXa2tCK126dCElJYXk5OQGnyMQCB4NQgQSCARPJY8iW0GpVLJw4UKysrJwcXFh7Nix\nFBQUcO7cOZYuXcrMmTMZOXIkUF2Q09TUlDNnzvDqq
69qRNFERESgVCoZPny42r6mpK4rlUreffdd\njI2NGThwoDT+pUuXsnr1atavX49CoaBPnz5UVFQQEhLC559/jo2NjfQy+Sj619PTo3///pSVlXHu\n3DnWrl2LTCaTBAvVQtjJkydxd3dXE30etEJpTtTMSjp76EcU90p49dVX6yxMeeTIEcrLy5kxY4ZG\n6rqnpyfe3t5ERkaqLSSrqG1CUx81LRBq4uTkRPfu3bl06ZLWuim14e/vr5appKury9ChQ0lMTCQ5\nOZlnnnkGAFtbWyZMmKB27pAhQ9i5cydlZWX885//VLOWGDhwIGvXrlXzhX4UY8/JyeHSpUtS9HNN\nunbtysCBAzl9+jRhYWEaUdXe3t5CAHpCNDTC3rH/i/wedYzC7GtUXE/g1E1Tnu/bDUdHR9599108\nPDw4fvw4e/bsISEhAWtra5YuXcqECROQyWRMnTqVxYsXk5ubq+Z5/vPPP5OQkECvXr1YsmSJ9P02\nZcoU5s2bx08//USfPn0kj3NPT08uX75MQkKCtOgaGxuLjo4O7u7ukugDUFVVRXx8PK1bt9bqs15b\nlOipU6eYPn16syvoLGgYHh4e2NnZERQUhLOzs1qtAKgOAvnvf/+rUchY9Vu6ZcsWBg0apLFwpaKq\nqorNmzcTHByMj48P8+fPVzs2Li6OwMBAXF1dWb58udrCrMoiKDAwUGuAikA7PRybZp/X1PMEgj+L\nhj6jhubW6BoYUfD7VcqKlVy9UZ2ZW1payqZNmx7nEP9SNGVOocLW1lZDjOjZsyc2NjaNXhz/17/+\npfa7YWFhgbe3N6dOneLGjRt06NABaNp8puBeKdtCrlEUq2m1lp90C8P7pcAfQo2rqytRUVFqIpZK\nqImIiKBjx44kJCTw+++/a7gCuLi4aPRRUlJCeno65ubmHDhwQOv16+vra62bU5Pff/+d0tJSXFxc\nNOZqUJ2FVpcgom1sFfqmFN4vxaqk4QJIzex3FxcX3NzcCAsL4/jx41KGeU0yMjJo2bKlxtwqLi6O\nCxcuqAUMBAcHk52dTffu3dXeU11cXAgLC+PYsWNMmzYNc3NzzM3NiY2NJSQkpM7xbtu2jRUrVkiC\nnUKhYM+ePQCMHDlSqunaGBFo1KhRHDlyhNOnT2u18MvJydH4vAgEgkeDEIEEAsFTyaPIVti6dStZ\nWVmMHDlS7dgJEybwzjvvsGnTJnr27ImtrS0GBgb4+vpy9OhRoqOjtXo1g7rFR1NT19PT0zXGpBr/\n4sWL6dq1KytXrpTae+6553jvvffYu3evWibKw/Q/bNgwZs2aJaV/jx07llmzZrFv3z41EcjU1JST\nJ0/i4eGhsTDW3NBWsPNq6AWUd+9yOA3ap+fUWldBVbgzISGBlJQUjf0FBQVUVlZy48YNjTo12iYN\nDeXChQscOXKE1NRUCgsLqaioUNtfWFjY4CK+D44rQ64g/lYJN3KVnI5Jo7VTtf2Gs7OzRtq/qg8H\nBweNiZOOjg6WlpYaEY4PO3aVqOTm5qZVLOrevTunT58mLS1NQwTq3Ln+aLwngcq7XWW38lekoVHI\nhmZWdHzuj+jXN0d0Y8gz1YsBMpmMUaNGMWrUKP7zn/9gZGTEnDlz1DIRDQwMmD59OosXL1Zr9/jx\n48hkMl5//XW1iaWFhQWTJ09m3bp1/Prrr2oi0O7du4mNjVUTgTp16oSPjw/ffPMNN27cwMHBgbS0\nNBQKBT4+PlqvqbYo0d27d5OamtrgyH7B04W+vr7WxQpTU1OGDRvG999/T3JyMu7u7hrHlJaWsnr1\nasLDwxk9ejQzZszQqN+gyvR4++23NSLzhwwZQlBQEGfOnBEiUCNwtDXDo71VvdaVNenewapZZjQL\nBDVxtDXDpbU5KbcK6zxOR1cX2y7PkB0fypXDm7jVzpWPciLIunYFKyurBr9b/l15mDmFCicnJ631\nVqytraV5R0MwN
TXF3t5eazugbvfV2PmM0sSB5NtKrhz+EYu2nTG374ipTTuMLGyQyWTcKbxPkTyP\ng5GpklBz6dIlbty4QWhoqJSVAn8INap3IW01a7QFyxQVFVFVVUVBQYGa60RjuXfvHlB7LZ3atqvQ\nlhmXcaf63lZVVTZqLDEZOdLvyYIFC3j//fdZt24dBw8epEuXLpiampKTk0NGRgbXr19n9erVlJSU\n8Nprr9GuXTtJcHnxxRexsrLCyckJJycnrl27hqGhIcbGxkybNk1yQxk/fjxmZmb89NNPpKenS3+f\nNm3aMGzYMMLCwgB1R5CkpCTJyi84OJjnn3+eLl268Ntvv5Gbm8uoUaOoqqrSWhOoZh2piIgIcnJy\nWLRoET179mTatGm0a9cOQ0NDLl++DMCsWbMwMjKivLwcpVLJ6NGjWbt2LVCdjXTs2DFOnTpFZmYm\nFRUVtG3blmHDhvGPf/xD7Z2ppmPMxIkT2bFjB/Hx8RQWFrJixQq1gFWB4O+KEIEEAsFTycNmK5SX\nl3P69GmMjIyYNm2a2rFt2rRh9OjR7Nmzh1OnTkmp1oMHD+bo0aOcPHlSbTEvLy+P6OhonJ2dpeKm\n0PTUdUNDw1rHX1RUxBtvvKHWnpubG7a2tmqZGA/b/+uvv642MWnXrh3dunUjISGB4uJiDfut5s7R\nS5laC3aWl1a/RF/Lq2DRzgje8euuVrBTRWFh9WR6//79dfajLQqqqdH/QUFBfPvtt7Ro0YIePXpg\nY2ODoaEhMpmM8+fPk56eTnl5w+saqKwAak5cFbczyMop4nhsFhcV4WRl5dGlh2ZVU9Viem12Q7q6\numoiz6MYu2pyWNv9U23X5mUtMi4aTs0J06OwLHsUEfY1I2t//S2aeyXlWhfQu3XrpvY9df/+fbKz\ns2nVqpUUmVgTVVRqze9KV1dXDAwMpIwfpVLJtWvXGD9+vHR8bGwsDg4OxMXFqbWjQqlUEhkZybFj\nxzRqE6iy7+ryXBc0Px6sY9fWtJZqz/8jMzOT/fv3k5CQQF5eHqWlpWr7c3M1xYaSkhI++OADrly5\nQkBAAOPHj9fa9pUrV9DT0+PcuXNa95eVlVFQUIBCoVATIQV18/IAFxbtjKi1kHdNZDKY4tv0gA6B\n4M/EtW3LekUggNbdByHT0+duajR3U6MJqbzNK+P/wZQpU5g5c+afMNKnk4edU6hQvZc/iK6uLlUN\n+WL6H9rECblczldffUVJSQmVlX8IFHXNZ3JyckhLS8PZ2Zni4mIupeewNTybLiNfJzsuhMLsa+Rn\nVi/aG5haYNu1H1BtcfbVLxcoV5ZQVVVAamoqubm57Nu3T6sTgqpmjbY5xYNBEDWvz9nZWRIGmoKq\nv9pq6dS2vS6KyyrqP0gLNbPmra2tWbNmDQcPHiQsLIwzZ85QWVmJpaUl7du3x8/Pjw4dOkh/u7y8\nPJKSkvD19SUgIICQkBAuXLhATEwMU6ZM4fr169y5c0fNDeXLL79kyZIlHDhwgISEBDIzMykvL2fW\nrFlYWFhIIhD8kcVcUFBA69atGTduHLt27eLHH3+kQ4cOeHh4MGHCBPz8/EhISNC4NqVSqVZHqqio\niISEBGxsbDh9+jR+fn6YmZnxxhtv0Lp1a3799VcMDAzQ1dWlRYsWuLi4SNZ45eXlfPzxx0RHR+Pg\n4MDAgQMxMDAgLi6OTZs2kZycrLWWU3Z2NvPnz8fBwYFBgwZRUlIiLHMFgv8hRCCBQNCsqW0h5mGz\nFX7//XdKSkro2rWr1kWT7t27s2fPHq5duyZt69q1Kw4ODlKdIdXLu+plrWYEzMOkrtc1/uLiYlq3\nbq1xTqtWrdSsAx6m/zZt2mh9UaoZUfY0iUCX0nO0TtYA9AyMKAHK7inQ1TfUKNipQjUB2bNnT6Nf\nIrVNaOqjoqKCwMBAWrZsyZo1azQiMhsTIViT2iauKgrvlxJ8MZNhMVl1Tlzr4lG
NXXXPG1P4VEVT\n7rng0fAwEfbaImsTU7MpUeTy2cErTB+qp/bZ1NXVVVtgUAmHtUUwaxMO9fT06NatG7GxsRQUFHDl\nyhUqKyvx9PSkXbt2WFlZERsby6hRo4iNjUUmk2mtBwTVAvqDqATUmoswTUFVg+3777/XakUneDRo\newYBSoryq0Xyu5pi3tWrV1m8eLH03Hh7e2NiYoJMJiMtLY2IiAipxlRN7t+/z7Vr1zAxMaFnz561\njkmhUFBRUVFvBPT9+/eFCNQIvJysmfsPjzp/E6FaAHrHr3u9Uf2CP5/4+HgWL16Mv79/s89G/zOx\naqH5W6QNmUxGa7dnae1WXVN0+qDOktj5V85YfhgexZziSVLXfEZlLzp37lzc3d1ZsC2cqiowsrDB\nyXcCVZUV3M+7TeGtNHKuXuD3qKPoGVTPB3X0jLiRq8TL3ZVXXnmFHTt28NJLL/HKK6+o9aFUKvnn\nP/+JgYEB7do1bJ5hZGRE+/btyczMfKhgh7Zt22JgYEBGRoZW++6aWUsNxUhf08rsQWqrL1cTY2Nj\nJk2apOZw8iAqESgjI4PWrVszY8YMyZ1D5YaSkZFRqxvKxYsXpXqsKkcQFxcXPDw8pHbWrFkjOYJs\n2LBBWmuZNGkSs2bNwsHBgQ0bNmgdn62tLQcPHuTgwYOcP3+eGTNmaASaFhcXS22q+kxLS2Pu3Lla\n6+P++OOPREdH4+fnx4wZM6RzKysr+frrrzl+/Dj9+/fH29tb7bykpCQmTpwo1aQSCAR/IEQggUDQ\nLKl3IeYhsxVUKeG1LRZaWVlRUlLC1q1bMTU1lSLkBw8ezPbt2wkNDWXUqFEAnDp1Cj09PQYOHCid\n/zCp63WNv7YirQ9mYjxM/3X1AQ+/mPlnszM0pdYFHhPrtijv3qTwZipGFtYaBTtVdOnShdTUVBIT\nEx+ZpVPNF9kHKSwsRKlU4unpqfGMFhcXq4mTDSX++l3WHEurP+q5ioeauD6qsasKmiYmJlJRUaHh\nGa3KyujYsaOUzaKySAgMDOT777+nrKwMV1dXXn/9dTp06EBBQQHbt2+XhFxHR0cCAgK0Znbs3buX\n8PBw5HI5BgYGdO7cmXHjxmlkelRVVXHq1CmOHj3KzZs3uX//PhYWFrRr145hw4bh6+srLVSpqFn8\n9VFl4DQVKysrqZDxo6IpEfa1CZS6+tWLWTEpv3PltlItsraiooLCwkJJoFZ9d6kEwgepTTj09PQk\nJiaG2NhYrly5goGBgWQX1717dy5evEhZWRmJiYm0b9++1ppXgqebporke/bsobS0lJUrV2pYjfz0\n009ERERobc/S0pLZs2fz8ccfs3jxYj766COt9qEmJiZUVVU9lA2OQDsjvdpjZ2lC4NkU4q5rCtfd\nO1gxxdelWS3i/t141Nmqfwcaasv6qM77O/Eo5hRPkrrmM3379mXjxo20bNmy2jL6gTm4TEcXk1Zt\nMGnVhhY27Uj+dSvFhXfRMzJFV9+AMgNLrqSkMWfOHHbv3k1wcDBDhgxRs6rbsWMH9+7dY/jw4VKN\nmYbwwgsvsG7dOtauXcs777yj8R5XVFTE7du36dixY61t6Onp4evry8mTJ9mzZw8BAQHSvvT0dE6d\nOtXg8ajoaNe098GHqS9naWmpMSdqbO3WunhUjiDa6iA2Joi0qqqK4OBgWrZsqTEeHR0dXnvtNU6c\nOMGZM2c0RCBLS0v8/f0fbFIgECBEIIFA0Az5M7IVVAuetS0WqqxbHnzJSk5O5sKFC7Rr145Ro0aR\nlpZGRkYG3t7eahHpjyp1vak86f6bC9omMTWx6dybnJSL3EoIxbxNR4wsbNQKdqoKU/r5+XHs2DG+\n++472rRpg4ODg1o75eXlXL16FTc3twaPTRXJJpfLNfZZWlpiaGhIamqq2st2eXk5mzdvlqLBGsO+\n8w0QgP7Hw0xcH9XYra2t6dGjBzExMQQFBfH
iiy9K+65evUpISAgtWrSgX79+KBQK4A+LhFatWjFs\n2DDkcjnh4eEsWrSI1atXs2zZMkxMTNQsEpYvX86mTZsk2y6lUsnChQvJysrCxcWFsWPHUlBQwLlz\n51i6dCkzZ85k5MiR0li2b9/OTz/9hJ2dHc8++yympqbk5uaSkpLCuXPn8PX1xc7ODn9/f4KCggDU\nIuNUYteTQk9PT6t12sPQ2Ah7oNZjTazsuZebTZH8OoZmLdUEyqSkJDUR1djYGHt7e27dusXNmzdp\n06aNWls1hcOaqDJ7VCKQyiJOte/MmTMcPnyY4uLiWrOA/qo09wXYmuObMmUKW7duJSYmhuLiYjp0\n6MCUKVM0FrrKyso4cOAAZ86cITs7G11dXUxatuZyZVss22v/DlctqFRVVmqI5Ddv3sTMzEyr17w2\nq5SaeHp68uGHH/Lhhx+yZMkSli9fjqurq9oxrq6uXLhwgczMTK3FxgUPh5eTNV5O1hqZ5z0crUUN\nIMFTyaOwZRVo8qjmFE+SuuYzpqamGBoacvXqVVKU1fPke3dvYmBmJWX8qCi7X515LauxMG/r2pfc\ntNMEBgYydepUtmzZwpw5c3j22WexsLAgOjqa+Ph4unTpoibANIRhw4aRmprK4cOHmTFjBl5eXtja\n2qJQKLh9+zYJCQkMHTqUt956q852AgICiIuLY9++fVy9epWuXbuSm5vLuXPn6N27N+fPn9dap6k2\nWrc0wdxYU+yoi4bWl6vNDcXe3p47d+6oHduU2q218bCOIN7e3vzwww988803XLp0CS8vL7p160a7\ndu0a5dRw48YNFAoFbdq0Yc+ePVqPMTAw0Opq4uTk1CiRUSD4OyFEIIFA0KyoK81ejYfMVmjbti2G\nhoakp6ejVCo1Iori4+MxMDDg7bffVoskadGiBebm5qSmpnLjxg1OnjwJoJHC/KhS15vKn9V/Xdks\nzYGYjLpfeI0sbGjX53myIg9x5fAmLNq6YmhmxcovojApy8PExISVK1fStm1bZs+ezbp163jrrbfo\n2bMnDg4OVFRUIJfLSUpKwtzcnG+++abBY3N1dcXQ0JCgoCAUCoVkVeXn54epqSmjR49m7969vPXW\nW/Tt25fy8nLi4uJQKBR0795dWtBuCPdKykn6PR/DFnUXPa1JzYlrY5DJZI9s7G+99Rbvvvsu//3v\nf4mOjsbFxYWcnBzOnTuHjo4Oc+fOxdjYWBKB6rJImD9/vmSRsHjxYhISEpg3bx7//ve/OXDggFRU\nfevWrWRlZTFy5Eg1O4UJEybwzjvvsGnTJnr27ClZch09epRWrVqxfv16DTswleBla2vLlClTpO+L\n5mRbo22RPz8/n/379xMZGUlOTg56enpYWlri6urK5MmTtVpSPkhjIuxVliPasOrYg5zUaG4lnMWi\nbef/z96bB0RV7///j2GGHdkX2WQRURR3lFxR0dx3ryZfU0vqlvfWTbvZ1UxvWVaf/Hltv1l2LdMs\nwb0CARdwAwFlcQNkcWGTRUCQfX5/0EwMM+yoqO/HP9k573POe4YzZ3k9X6/nC5lu3T772Bvz/fff\nq40fP348O3bs4LvvvmPNmjXKa1RxcTG7d+8G6oIJ9enevTuGhoZERkZSVFSkUtWpqBLbs2ePyv83\nxs2bN9m+fTsXL16kqqoKqVRKUVGRyhiFDYemyhFNf4/6lWPLli1T/tva2vqhWPZ0Rmu63NxcVq5c\nSdeuXRk3bpxS5N2wYQPvvfee8u9WXV3NunXrSExMxMHBgalTp1JRUcFnO/ZzO/8sXT2zsRugbkki\n1dFHIpFQVVakJpLb2Nhw69Yt0tPTVfoChoSEEBsb2+zc+/Tpw4YNG1i/fj1vv/0269evV+mBNXPm\nTM6dO8dnn33G6tWrNVZYZmRk0LNnz7Z8dYI/cLbuIkQfwWNBe2xZBY3TUe8UHUFSUhL79u0jPDyc\nyspKFi9
ejJOTExMnTmTkyJEqY/Pz8/m///s/ZYKEVCrl6tWrau8zZ86c4ciRIwwcOBC/N/8DQEFa\nPKnhv6CtZ4jtAF/u5mRQkpNGRUkBWlIZBuZdlf2LLNwGYm9ZQ2RkJEePHqWiooKbN28SEhJCdXU1\nWlpaDBkyhE2bNjXqOtEUL7/8Ml5eXvz+++/ExcVRWlqKkZERVlZWzJkzh7Fjxza7D1NTUz7++GN+\n+OEHoqOjSUpKwt7enpdffhk9PT3Onj2rJqI0h72FIfdaqG20pL9cc24og0ys1LZpbe/WpmivI4i1\ntTWbN29m165dxMbGKvsNWVpaMmfOHJVn2qZQvNtlZmY2WQl97949tWWiN6xA0DhCBBIIBJ2Kpsrs\nG9KeagWZTMaYMWMIDg7mxx9/5K9//atyXVZWFocOHUJbW5t58+apBVwsLCwAOHLkCCdOnMDY2Fij\nRVhHlK63hwdxfEX1U8OMpM5C/cabjWHZYzD6ptbkXD7D3Zx0im5e4UppV8Z69+Ppp59Wjhs7diwu\nLi7s37+f+Ph4zp8/j56eHubm5owYMYJRo0a1am5GRkasXr2an376ibCwMMrLy5XHMTQ0ZNGiRZiY\nmHDkyBGCgoIwMDBg4MCBLFq0iF27drXqWMX3Kmn961bdC29bggIdNfeuXbvyn//8h59//pno6GgS\nExPR19dn0KBBLFiwQM06qb0WCdXV1Rw7dgw9PT0WL16sMtbOzo7p06fz888/c/ToUZ555hnlOqlU\nqjFzUFND3M5ORUUFq1atIisriwEDBjB06FDkcjm5ubmcPXuWESNGtEgEgpZl2DeXWWtk5Yh1L29y\nr0Ry+df/YtatNzdjtLgZshVbKzO16/OcOXOIiYkhMjKSV155BS8vLyoqKjh58iRFRUXMnTuX3r17\nq2yjpaWFp6en0rarfrWPtbU1tra2ZGVlKcc1Rk5ODv/85z9xdnZm0qRJFBYWEhgYSHJyMgkJCRr9\nzlvCwoULOXv2LGlpacyYMUN5LW9LEKW13A+7wPtBQkICfn5+KkkbPj4+rF+/nr179ypFoH379pGY\nmMjgwYN5++23kUqlpOeWsOemBcVB35KdeBJje3eMrFSrjKXaOhhY2HM39zrpJ/eSFW+BU/lVpj09\nhhkzZhAbG8uqVauU1YAKu50RI0Zw6tSpZuffs2dPNm7cyNq1a/n3v//N2rVrldaT/fv3Z8mSJfzw\nww+8+OKLeHl5YWNjQ3l5Obm5uSQmJtK7d2/eeeedDvxGBYKHj0Iwh7p+JYpECoDXXntNRYROTU1l\nx44dXL58maqqKtzd3Vm8eLHS2rM+NTU1BAcHc/ToUa5fv05NTQ0ODg5MmDCBqVOnasxWP3nyJIcP\nHyYtLY3q6mpsbW3x8fFh1qxZahnnCrH+s88+Y9euXZw5c4b8/Hzmz59PVVUVAQEBjfbASElJYcWK\nFQwZMoR169a17YujbbasgqbpyHeK9hAcHKzs1zvSDSkAACAASURBVGJiYoKBgQFeXl6kpKTw66+/\nqohAlZWVbNq0iV69eqkkSMhkMtzd3UlPT1e+z5SVlWFubs7MmTOV1oBmzp7ong+l6t5dbkUHUVNd\niW4XC0wdPQA5tdVV2PYbozzejAWLsZdMYcmSJcjlcoyNjbG3t0dPT4+qqiokEgn79u1T6xX0wQcf\ntOizDxkypMW23IcOHdK43MLCghUrVqgt37FjB4Bar6Km5ubr64uvr2+zLibQsv5yLXFDSSiQ8cnG\nL/FtoxvKg8DR0ZE333yTmpoa0tLSuHDhAocPH2br1q3o6empJWNpQvHcOWzYMBVL7ZYgesMKBI0j\nRCCBQNBpaC4YqIm2VisALFmyhIsXL3L48GGSk5Pp27cvxcXFnDx5knv37vHMM8/g7++vzMhWZK6Y\nmZlx4cIFVq9ejVwux83NDZms7nKanZ1NQEAA8fHx5Ofnk5WVxbZt29i/f
z+zZs3C0dGx1aXrbaWj\nSuebwt7eHgsLC8LDw5FKpVhbWyORSBg7dmynyBBvqb+5oZUjrvWCfi9P7M2soS5q45ydnVtsidSS\nF5rBgwczePBgjeukUimzZs1i1qxZautee+01tXkoGnJqGms9eCrfH09SW9fFxlmlWWnDxqX1X3gb\ne5kC9QbCrZ17ZWUlgMbSfQsLC5YvX66yTCEqnItIVrFIGDlyJG+99ZbK2NZYJNy8eZOKigo8PDw0\nVs/169ePn3/+WaWv0ZgxYzh06BDLly9n5MiReHp60qtXrwcSoL8fxMXFkZWVxcyZM5XVUQqqq6s1\nNrhvjqYy7JvLrAWwHzwR3S7m3E46R15yNFJdA0yeHsOGdSt59dVXVcbKZDI2bNjA/v37OXHiBIcP\nH0ZLSwsXFxdefPFFRo8erfEY/fv3JzIyEgMDAzVxsX///mRlZeHm5tbk3zUxMZHZs2fz/PPPK5eZ\nmZnx5ptvcuDAAV588cU2iSl+fn7k5uaSlpbGzJkzH+i19X7YBd4PrK2tWbBggcqyQYMGYWVlRVLS\nn9e+kJAQJBIJ/v7+SsH4Qnoe2nqGdPUcTcbZg+SnxKqJQADOI2ZzMzqY4qxr1GQk8n12DB7dHfH1\n9WXdunX8/PPPREREIJVK6dGjBxs3biQnJ6dFIhDUWUN+8MEHrF27lnfffZfVq1crg13z5s2jd+/e\nHDp0iEuXLinPVQsLCyZOnKhSvSYQPC707duX0tJSDh48iIuLC0899ZRynYuLC6WldZZUKSkpBAYG\n0qtXL55++mlu377NqVOnWLt2LZ9++qmK5VV1dTUbNmwgNjYWe3t7fHx80NHRIT4+nq+//pqkpCRW\nrlypMo8ffviBPXv2YGxsjI+PD3p6esTExPDDDz8QGxvLhg0blO8B9Y/z1ltvUVJSwsCBAzEwMMDG\nxoa+ffsSGBhIcHCwRhEoKCgIgMmTJ7fru2utLWtn6lvTWemod4rGntcVaHp/UIgNN27c4JVXXsHA\nwICPPvpIzSK0vuWX4rxvLEHCwMCA7777Trk8LCyMLVu24OrqSvc/rAENLR0wsnKk4u4djO3ccB09\nHy1Z3bN6VXkplw9+zu0rZ7HpMxItqfSPJB8XwsLCVHoBQd1vYv369QQEBDB58mRlUuWDpqCgQC2B\nKD09nYMHD9KlS5cmk30aoyP6yz0oN5QHiVQqxc3NDTc3Nzw8PPjXv/7FmTNnlCJQU44iDg4OGBoa\ncvXqVaqrq9WusQKBoG2IX5JAIOg0tCQY2Nh2bRGBunTpwqZNm9izZw+nT59m//796OrqKhvA29vb\nq5Qf18/GHj16tDKw9OyzzwJ1D5UrV66krKwMLy8vhg8fTmVlJbGxsURERBATE0NMTEyrS9fbQ0eU\nzjeFlpYWb731Ftu3b+fUqVPcu3cPuVxO7969O4UIJHzR6+jsTYJv3boF0OwLYXMWCT0HqL85tcYi\noaysDEDt5VCBYrki8ATg7++PjY0NoaGhBAQEEBAQgFQqxcvLi2XLlqm9BD9sGvMYb4imhq4ymazD\nX8JaklkrkUiw6jkUq55DlctGj3HH0NBQox2ajo4O8+fPZ/78+S2ex/Tp0xu1qPjb3/7WpFi+du1a\n0tLSMDQ0VGtEu2TJEgoLCwkLC+PMmTNtrgZ6WDS0p3vY1nSNnb8uLi4aq/EsLS25cuUKUGcZkpWV\nhYWFhYqwpTgHjbo6140rzNZ4bN0u5nQf++ffd8kYd3z/yJ5vLDvZ09NT49+8se/KyclJmY3ckN69\ne6tVsQnuD4pgaGPVGoIHQ9++fbGxseHgwYO4urqqWakmJCQAcO7cObW/VVBQEF988QUHDx7k5Zdf\nVi7/5ZdfiI2NZdq0abzwwgsqQcjPP/+ckJAQRowYoWw0fuXKFfbs2YOlpSWbN29W2gwtWbKE999/\nn3PnzrF37161+01BQQGOjo588MEHa
v0zvLy8OHfuHBkZGTg5OSmX37t3jxMnTmBpadloglBr6IjA\ntOBPHtY7Rf37XsSvv1BSVsFzzz2nsUdcw55DLU2QaIgmS0HHIZOVAhCAtp4hJg49yU+No6IkH+/+\nvZTv45qefWUyGVOnTiU+Pp64uDjGjRvXsi+gg1mxYgW2trY4OTmhq6tLZmYm0dHR1NbW8ve//13j\n829LaG9/uQflhnK/SUlJwdbWVi1p6s6dOwAq1tlN9ceVSqVMnz6d3bt3s3XrVvz9/dX+NgUFBZSW\nlqpVbwkEgsYRIpBAIOg0tCQYqGtk2mHVClBnp7N06VKNDSobPpDUz8b+9NNP1USOU6dOUVJSwgsv\nvKDS+B3qPPu1tLSafbBs7fwVNFV10hGl86C5ggOgR48evP/++y3a/4NG+KLX0RnFsLCwMIKDgzl7\n9iwZGRlUVVVhZ2eHjY2Nmji5evVqjp46h+n4V8m+eJr8axeoKitCpmeEmbMnFt0HUnyvksMx15lw\n4QYT/7BICA8PZ+/evURHR5OamoqVlVWTzWgVQlFhYaHG9QUFBSrjoE4InTlzJjNnzqSoqIiLFy8S\nERHByZMnuX79Ol988UWnaE7arICWfxeoC1pbWFgQEBDAtWvX8PLywsPDA1dX11Y1y20pnV2g1ERj\nQkT37t01esn37duXsLAwUlNTH/mA8sOypmteANa8nVQqVfYrUIi3DUVexbmkrWcEQE1leYvm9DDP\nQYFA8CceHh5q19bx48fz3//+VyXQLZfLOXz4MGZmZvj7+6vc07S0tFi2bBmhoaEcP35cKQKFhIQA\nsGDBApU+E1KplGXLlhEdHc2RI0c0Jh0sW7ZMYwP1yZMnc+7cOYKCglTsqE+cOEF5eTlz587tsPtt\newPTgj950O8Umu57V8PPUZqfz2+p0C0tr1kRoCUJEo2hsBQEkOnoodtFPUFK26DO9ri28p6KpeDt\n27cJCAggLi6O27dvKyv+FeTn5zd57PvJpEmTOHv2LCdOnODevXsYGhoyaNAgZs+erdansS20pb/c\ng3ZDuZ8cO3aMoKAgevfuTdeuXTEyMiI7O5uoqCi0tbWZOXOmcmxz/XEXLFhAWloav//+O1FRUfTr\n1w8LCwuKiorIzMzk0qVLLF68WIhAAkErEG8vAoGg0/Cwg4EtzZBvDk1Cj6aXQMGDQfiid04x7Msv\nvwSgqKiInj170r9/f27fvs3mzZu5deuWil94zp0y0nKLMT25j7u5GRjbuSHV1qU4M4Wci6cov/NH\nFWE9i4Tr8af49ttvMTQ0xNLSkm7dupGens4bb7zRaFWQg4MDurq6pKWlUVpaqhbYVmQdu7m5adze\nxMSE4cOHM3z4cIqLi4mPjycjI0M5XktLi+rq5sXujqYlHuP1BbRNmzaxa9cuIiMjlY3tjY2NmTJl\nCgsWLOjQaqDOKFA2RnNCRHdPzWKfqakpoFpB9qjyMKzpWnv+Nobi99xQ5FWcS1XldUKoVLtl9+vH\nrWJUIOgMtOVZvKGFJ9RVHZiamnL37l3lslu3blFSUoKdnR0///yzxn3p6Ohw48YN5f8r7F/r94pT\nYG9vj6WlJTk5OWrPDDo6Ojg7O2s8hqKv17Fjx1i6dKkyMz4oKAipVNphvWPq05bAtECdB/VO0dh9\nr/qPJIVrhTWs3hnJimn9mrzvGRkZaVxeP0GiMRSWgv77QKqj+b4o0dJCIoFFo/+sKMvOzmblypXc\nvXuXPn36MGjQIAwMDNDS0iI3N5ewsLA2WQt3FAsXLlSr2n7YPGg3lPvJ6NGjqaqq4vLly6SkpFBZ\nWYmFhQWjRo1i9uzZKtWPzfXHlclkvPXWWxw/fpzQ0FDOnTtHeXk5xsbG2NjYsGjRIsaMGfOQPqlA\n8GgiRCCBQNBpeFjBwJZmyDeHt7c3P/zwA//97385f/48AwcOpHfv3jg6OooGhQ8R4YteR2cTwz7/\n/
PMW+4VfvFGIXA4VJQV4THsZmW6diFNTVcmV377mzvWLdZOmziLhm18jyQrbjpGREZ988gnLli3D\n09OTjRs38uGHH3L69GmNc5LJZIwZM4bg4GB+/PFHlQzdrKwsDh06hEwmU1YqVVVVkZKSotZ4urq6\nWhl4amh7kJ6eTmVlZZvtJlpLWz3GX331VeRyOTdu3CAuLo5ff/2V3bt3I5fL1Rr6tofOKFBqoiVC\nxKHTl5msQYhQWGAoAoSKrFyFDWF96gcsHwYdlQzRUXSkR76+vj62trZkZ2eTmZmJnZ0d8Oc5ePzo\n+bpx5l2bndfjWDH6IKhvL7hgwQK2b99OQkICVVVV9OrVC39/f5ycnCgqKmLHjh1ERUVx9+5dnJ2d\nWbp0Kf369VPua8uWLYSFhbFt2zY1MTIhIYE1a9awcOFCFQuxhn0bdXR0sLCwwMPDg8WLF9OlSxdW\nr15NYmKi8hhbtmxRbq/pWIKOoT3P4o1VIkqlUpVeEyUlJQBkZmaq2D035N69e8p/K2xi61cB1cfc\n3Jzbt2+riUAmJiaNPvtLJBImTZrE999/T0REBOPHjyclJYVr167x1FNPNWpJK3j4PIh3iqbuezId\nPSqAqrISpNq69703zKSB3fDqbsW1nGKN67tZGmFqb8aIXn/eN/fv309JSYlGO83w8HDCwsLuy1wf\nZR60G4qfn5+avWZz+9DkCNK3b1+1bXr27EnPnj0b3U9DmuqPCyh7DbfEwr65flsCgUCIQAKBoBPx\nMIKBHZVhDHUPHps3b2bXrl3ExsYqA82WlpbMmTOn0X4TgvuP8EXvfGJYS/3C03NLuF1cF5CxGzhe\nKQABSLV1MHf25Nb5MGqrKpTLT0VEYFlazrx581QCdhKJhOeee44zZ840mgG5ZMkSLl68yOHDh0lO\nTqZv374UFxdz8uRJ7t27x0svvYSNjQ0AlZWVrFq1CltbW9zc3LC2tqayspILFy5w48YNvL29VSwK\n+vfvT3JyMuvXr6dPnz5oa2vj4uLC0KFDNc6lI2iPx7hEIqFbt25069aNYcOG8dxzz3H27NkOFYGg\n8wmUDWmpEFFWkMWmfefUAjKKCjJXV1fgz4Bl/QbOClJSUjTuuynhqCNoLABbkpNBXPgp7lbUaLQD\nvd90tEf++PHj2bFjB9999x1r1qxRfq8zB3Zl96fhAFh0H9jkcR7XitEHSU5ODq+//jqOjo74+vqS\nm5vLmTNnWL16NZs2bVI2LR81ahQlJSVERETw73//m6+//horK6s2HbOxvo05OTkcO3aMadOm0aVL\nF8aPH4+hoSGRkZF4e3srf7dw/20Pn1Q68lm8KRRVwMOGDWPNmjWt2qawsFDjc4vCJrbhuVFfANIk\nWE6YMIFdu3YRFBTE+PHjCQoKAuqsqgSdm/v9TtHUfc/A0oHS/EyKM1PQM7F8IL1hLLroYdFFj7f/\nOlrNUvB0aBE/3YxWGZ+VlQXA8OHD1faleB4SqPKw3VAEAsGTg7hqCASCTsWDDAa2JcO4ORwdHXnz\nzTepqakhLS2NCxcucPjwYbZu3Yqenh4TJkxo83wF7UP4oj9cMazh9+5oBFEngpr1C69vkWBgYae2\nX21DEwDk9bJ9ywqyKC6v1Ojt3bVrV6ysrDQ2IYW6ap1NmzaxZ88eTp8+zf79+9HV1cXd3Z05c+Yw\ncOCfAWJdXV2WLl1KQkICly9f5uzZs8pqg+XLl6v93hcsWEBpaSlRUVFcunSJ2tpafH1975sI1BaP\n8ci4K8QnO9Cvh2qgTWGhVb+yqaPobAJlQ1oqRFRXlpMVf4JdEbbKOSYnJ3P8+HEMDQ0ZNmwYAO7u\n7gCEhoYyduxYpFIpUCcKNZadrmiee/v2bY2ByPbQXAC2qqaWizcKCW5nALa13A+P/Dlz5hATE0Nk\nZCSvvPIKXl5eVFRUcPLkSewNaylzGoGRtXqzbQWPe8XogyIxMZF
nn31WpY/K7t272blzJ6+//joj\nR45k+fLlykD6wIED2bx5MwcOHMDf379Nx2xJ30ZAmbkeGRnJsGHDHvk+Xp2d5p7FFeeAvLZWY9VD\nVFQUEolEY1Z7QxwcHDA0NOTq1atUV1e3yNrU1dWVa9eukZiYqHbtzcrKIi8vDxsbm1YLhCYmJowY\nMYLjx49z+fJlTpw4gY2NDYMGDWrVfgQPh/v1TtHcfc/K3Yu85BiyE8MxtuuOnomVyn0vLy8PS8v7\nc3/SZCmoqa5eIXQmJCSoPN/GxsZy5MiR+zK3R51HyRpZIBA82ggRSCAQdCoeZDCwLRnGji3MxpZK\npbi5ueHm5oaHhwf/+te/OHPmjBCBOgFPui/6gxbDNFUYVJQUcjXoWwykNfg8NYiJEyc26hde3+pA\npsGTXCLRQkumjcuoeVh0r+sOX1NZQU2tXNmLpaE1gJmZmVIEamiRAHUZvUuXLmXp0qVNfjaZTMbc\nuXOZO3duC76Jut5gy5cvZ/ny5S0a317a4jFekpXKC/7LGDtsEHZ2dpiampKXl0dkZCQSiYQ5c+bc\nh5l23mq91ggRXWycyE85T8A3mXQt8kVaU05ERAS1tbX87W9/U2aU9+zZE09PTxITE1m5ciX9+/fn\nzp07REVFMXDgQE6ePKm27/79+7N3714+//xzhg8fjr6+PoaGhkybNq1dn6/FyRC0PBmio7gfHvky\nmYwNGzawf/9+Tpw4weHDh9HS0sLFxYUXX3yRLo69O905+CjTmL2gtbU18+bNUxnr6+vLzp07qaqq\n4vnnn1eppPDx8eGTTz4hNTW13XMSfRs7F809i0t19JFIJFSVFbW76kEqlTJ9+nR2797N1q1b8ff3\nVzsfCgoKKC0tVVbxTpgwgZCQEHbv3s3QoUMxMalLPKmtrWXbtm3I5fI29/CZMmUKx48f56OPPqK8\nvJz58+cL++hHjI5+p2juvqdnYoXjkMnciPqVK799jYlDL3S7mLPx42gMqgoxMDBg48aNHTaftjB1\n6lRCQ0P58MMPGTFiBObm5mRkZBAbG8vIkSOJiIh4qPPrjDwq1sgCgeDRR4hAAoGg0/EggoFtzTC2\nNq5r/K0pGzslJQVbW1u1bEBFP4j7kUEvELSVByGGNVZhkHvlDNUVZZgNm0mm/QCchv7Z2LahX3hb\nrA6kOrpIayXcuXOHbt3Us/obNoZ/XGmJx3hDjO264+6oR0VFnfBTVlaGubk5AwYMYNasWWr9jzqS\nzlit1xohQsfQDMehU8k8H8b+g79ibaxD9+7deeaZZ9Syu9euXct3331HZGQkhw4dws7OjqVLlzJo\n0CCNItCgQYNYtmwZwcHBHDhwgOrqaqytrRk6dKiyx4qfnx/bt2/nwoULlJeX4+TkhJ+fH0OGDFHZ\nV1VVFQcOHOD48eMcOXuR4vJq9M1ssOo5FDOnPspxWfHHuXW+7rdYlp9JzI53mBeog37VHbp06XLf\nrOkUdIRH/gcffKC2jY6ODvPnz1epQqlPZzsHH0Wa6+/SrWdfZeWNAkUfFHt7e/T1VcVGLS0tpSDd\nVkTfxs5HS57Fpdo6GFjYczf3Oukn95IVb4FT+VWmPT2mTcdcsGABaWlp/P7770RFRdGvXz8sLCwo\nKioiMzOTS5cusXjxYqUI5OHhwdy5cwkMDORvf/sbI0aMQE9Pj5iYGDIyMujdu3ebkyM8PDxwcXEh\nLS0NmUwmEsUELbrvWfYYjL6pNTmXz3A3J52im1e4UtqVsd792ixIdiTOzs5s3LiRH3/8kXPnzlFT\nU4OLiwtr1qzB0NBQiECN0NmtkQUCweOBEIEEAkGn5H4HA9uaYaxl5gCgMRv72LFjBAUF0bt3b7p2\n7YqRkRHZ2dlERUWhra3NzJkz2z1vgeBRoakKg4qSOhHGtJsH8gYN3Rv6hbfF6sDA3BbjwlwSEhJU\nGolDXWPw27dvt3qfjyItEdA
uJwcnJiwoQJ\nKBQKoqKiuHDhQqdfTyBQcaNWV/KiPMwdXTUCcWculNJY2VxdbGRkpPO4v4704PUtca2KSzZufhSl\nHqXkXDx2ffwJHfFHe9off/yR6upqxo8fj4GBgXq7p6cnOTk5REVF8cgjj6i3R0dHc/bs2Xbdr1Qq\nZfTo0fz666+Eh4er28xC899/TExMu84j+PMgAtZ/PtrysRTcOu2pFpVlxFF2PgULR1ekJhYoair5\nv/RqaivLGTx4MA8++OAdmKlAIBB0TYQIJBAIBHeZK1eusGTJEnLlBnTrOxhFTSVlF9PIObQF1wen\nYOOq2bqs8sp5sg9tQW+AP88+8SgymQxoDlbLZDIOHjxIr1698Pb2pqysDBMTEyIiIoiLi2P16tVq\nIcTd3Z0DBw7w9ttv4vD1bwAAIABJREFU06NHD4KDg3FyciI3N5dXX32VvLw8DTPYvn37As3VN9fT\n1NREenq61nZVT/AjR44QExODsbExw4YNw87OjqKiIvbu3Ut9fT3Q3PYLmlt5NTY2EhAQQGlpKQMG\nDGi1F3n//v35/fff2b17N//9738BsLe3JzU1FTMzM/bt28fJkyf56KOPcHR0pE+fZlFN1bbu22+/\n5fTp0wwdOhRvb29SU1MpLy9nzZo1GmKCoHPZtGkTu3fvxtbWlkceeQR9fX3i4uLIzMzUmandEkdH\nR2bOnElKSgqpqamEhISohRg3NzdmzpxJbm4ucXFxBAUF4e7urt4HzZniS5cuJTc3lz59+jBu3Dia\nmppITExk9erVXLhwQacYlZGRwbZt2/D29mbcuHFUVFSo55mTk8Obb75JZWUlgwYNIjg4mIqKCk6e\nPMlrr73GG2+8QWBgoNY59+3bp56nj48PmZmZxMbGkpeXx9q1azEwMFB7gEVHRyOTyTT8wNrq0S/o\nGP7+/pw9e5bU1FSGDBkCNAs9enp6+Pj4qEUfaBZeUlJS6N69Ow4ODiiVSj799FOqq6t55ZVXGD16\ntHpsbGwsH330EZ988gkbNmzokL9Dezh69ChxcXH069ePVatWqT3TQkNDWbx4cadeSyBoyY1aXeUd\n/T/0pIaY2vXEyNwapRKqZBco1ZczPNBPLaBeT4CbHYse9+Wzvbora4zMrek5+GEu/76PxsSfiN1d\nxhkrK1JTU8nIyKBXr17MmzdP45iJEycSFRXFF198QXJyMvb29uTm5pKRkcGQIUP4/fff23XPc+bM\nITk5mV9++YWsrCz1Ois2NpbAwEDi4uLadR7BnwMRsBYIOpf2VItaOrlRU1ZERWEOjfU1WJkZ08PT\nj1F/mcITTzzR6eswgUAguJcQIpBAIOhydKRKQMXRo0eJjIwkNzeX+vp6HB0dGT16NFOmTFFng169\nepWnn34aNzc31qxZo/M8K1asICEhgfXr1+Pi4qLefu7cOXbs2EF6ejqVlZVYW1sTGBjIzJkzW22n\npuu+IiMjOXjwIJcuXaKmpob09HRkMhlTZsyhrNIVVTMyu36B/P71v0jftZ7Bc9+lKDUW2dkTKGqr\naayvwcKpD+YBTzL6sZG4OliQlZVFREQEycnJ+Pr6snDhQr766itkMhk2NjZYWlqSk5PDt99+y9Sp\nU9m8eTMnT54kPj4ePT09FixYoFFt8MILLxAfH4+dnR2HDx/mu+++Iz8/n3PnzlFSUsKxY8c0KoH2\n7NnDuXPnKCwsxMDAgNzcXH744QfOnj1LYmIiZWVljBo1ii+//JJu3brR1NREcHAwEokEY2NjdTVH\nRUUFxcXFuLq6UlBQgL29PYMHD9Zol6QSjQAmTZpEXFwcr732Gi4uLnzyyScUFRVRWFjI008/zZUr\nV9i0aRPr16+noKAAhUKBubk5qamp2NnZkZGRwfr167G3t8ff35+mpiYsLCxIT08nMzOzVT8lwc1z\n9uxZdu/ejZOTE5988gkWFs3tbmbPns3SpUspL
S3Vao/VEgcHB0JDQwkLC1OLQKpWgtAsbkZHRxMX\nF8ewYcO0MjE3bdpEbm4u8+bN46mnnlJvr6+v57333mPbtm08+OCDavFIRWJiIi+++P/svXlAVPX+\n//+AGfZ932QVBAQFF8Rdr2uZlmmZ2nrL8qr3l9XVb9mtvPdmmTfbLNOP3co2l6tp4S6iCEqCIAyb\nbAKKMOwCwz4svz+4c2SYYdHUVM7jLz3L+5wzc+Zw3q/n6/V8rVDLIoeOKrcNGzbQ2NjI+++/r9Zv\nqrKykldffZVNmzbx9ddfq2WmAyQkJPDxxx/j4eEhLPvwww+FwP748eMxMTERquJKS0v7ZA8mcuME\nBQUJVm6dRSBvb2/Gjh3L1q1bKSwsxMXFhdzcXBQKhWBllZGRwdWrV/Hz81MTgAAmTJjAwYMHSU9P\nJy0t7ZY3tD1x4gRwvTG3CjMzMxYuXCiK2f2MO9k3sTerK6fgqSjkl2ioLKamKAddiRR9EwvGzXiU\n91e90KPY/8AwNxwsjVm/NRdt9WxTp89k+BMTSY89RUxMDE1NTdjZ2TFv3jwWLFigYdXm6urKunXr\n+P7774mLi0MikRAQEMDGjRuJiYnpswhkbm7Ov//9b2GcnJwcXFxcWL58Ofb29qIIJKKGGLDuf3TX\nx1Lk1tFbtaiZoxdmjh3v8Do6sP7JUIZ59i7IioiIiPQHRBFIRETkruNGqwQ+++wzTpw4ga2tLWPH\njsXExITMzEx+/PFHZDIZ7777LhKJBBsbG4KDg0lMTCQ/P18t8AodAdvExES8vb3VBKDw8HC++OIL\n9PT0CA0NxdbWlqKiIo4dO0ZcXBwbN27Ezs6u1+v66KOPOH36NLa2tsyYMQOFQkFycjJXr14lJTsf\nnK6fj4mNC3rGZrQqm0j7dRMmNi6YOnigKM5DV6pHRW4SiTvf4+OWi3hZSYmOjhaqfgYMGMCnn36K\nsbEx9vb2DBgwgPLycnJycjh8+DBnzpzBzc2NYcOGcf78eSorK/nXv/5FVVUVtra2pKamcuXKFays\nrKitrSUsLIxhw4YxY8YMPDw82LJlC0ePHuWJJ57goYceIjc3F5lMxuDBg4mPj6ehoYHVq1fj5+fH\njBkzyMrKoqysjPj4eLZs2YK9vT3JyclYWlri6upKQUGBUM1x5coVFAoFo0ePJicnh9TUVIyMjNDT\n0+P48ePEx8ejUCj48ssvMTc3JyQkhFGjRpGQkIBEImH9+vWUl5dTWlrK8uXLaWhoEOycnJycMDIy\n4qWXXmLJkiX89ttvlJeXc/jwYdLT0ykvL8fNzU34Dv7xj3+gUChuiyDYn+ja3DrpxEEAFixYIAhA\n0NGY9dlnn1Wzd7vVKBQKTp06hY+Pj5oApDr+c889x4ULFzh9+rSGCOTl5aUhAAHEx8cjl8t59NFH\nNQL81tbWzJ8/n6+++gqZTKZRDTRnzhyN59DMmTOJiooiKyuL8ePH/46rFemJrvdl4AAX9PX1hYqf\nuro6Ll26xPz58xk6dCjQIQq5uLiQnJwMICzPyclR+39Xhg4dSnp6Orm5ubdcBLp06RI6OjpaGw53\nFkdFRG41vVld2Q0aid0gzQrIyTMHC5aL0H3D6GGetvx3w/9H/t+eU/utBnvYXu+VM3tKn8938ODB\nfPDBBxrLPTw8tIrr3fU9sbKyYuXKlVrXdbePSP9FDFj3L7rrYyly6+hcLdpTTzodHXh19lDx9yQi\nIiLSCVEEEhERuau40SqBiIgITpw4wZgxY1i1apVaJrSq78ahQ4d4+OGHAZg2bRqJiYmcPHmS559/\nXu3YkZGRtLW1MWXK9aBCYWEhX375JQ4ODqxfv16t0bxMJuPtt99m27Zt/P3vf+/xuqKiojhyPAIj\nK0cmPfk3LM1NGGbSTkREBGVlZaTLEpA02mDteT1oJzEwprm+BomeIb6zXqIk7Swl6Wex9hhCW0sz\n9eUFxJw8S
qG9BQMHDkRfX5/y8nIqKir417/+RWtrK59++inPPfccaWlpZGdnI5PJWLNmDX/96185\ncOAAwcHBGBsbk5iYyC+//IKtrS12dnbMnz+fffv2ERcXxyOPPCLYCpWWlpKQkEBBQQHR0dEoFAqG\nDx/O+vXr+eWXX4COQPvy5cuFCoyIiAgaGhqor69n7969DBkyhNGjR7N+/XqefPJJQQQaMmQIKSkp\nnD17lvHjxzN37lz++te/kpWVRUJCAhYWFnh6eiKXyykpKeHdd99l3bp1DBgwAD8/P2xtbYmKisLI\nyAhzc3Pc3Nyw+J9FzMWLF2ltbcXHx4dx48Yxe/Zsvv76a0pKSti7dy+hoaH4+/sDoFQqUSqVZGVl\nMWHChNsiCPYHEvPK+SkqWyMDNuPwGaQN12g3d9LYZ/Dgwejq6t62c8rKyqKtrQ3oeD50pbW1FYCC\nggKNdd1VhWVkZABQVlamdcyioiJhzK4ikLYeMar7p7a2ttvrELl5ursvAWqU5pRfzKa6upqMjAza\n2toICgrC1dUVa2trZDIZs2bNQiaToaOjI9hZ1dfXA3QrAquW345eY3V1dZiZmWmtqrC0tLzlxxMR\nUdEXq6tbsZ+Hvdl10UdE5B5DDFj3TkREBHFxcVy6dIlr164hkUjw8PDgwQcfVOvFB7BmzRpSU1PZ\nv38/e/fuJSIigoqKCuzt7Xn00UeZOXMmAEeOHOHQoUPI5XLMzMyYPn06ixcv1lpZdSPJXV2PHxkZ\nSUlJCZMmTeKVV17psSdQeXk5+/btE3pH6uvr4+TkxKhRo1i4cKGwXXJyMlFRUUKCWmtrK46Ojowf\nP5758+erzXXhxvpM3i+oqkV3RGeTfFnzfW6ouzWLJ/j0y9+TiIiISE+IIpCIiMhdhcrapq9VAmFh\nYUgkElauXKnxUrxw4UIOHjxIZGSkIAKNHj0aExMTIiMjee6559QCzhEREUilUiZNmiQsO3LkCC0t\nLbz44osaL9BBQUGEhoYSFxen1ky8K4l55by+8VtyLlfg7f0gO3/rMDdpqq2ioLCaMSOGk5mVTcWl\nRDURSFdXgg46mDl5qjeMNTbHyn0w8uTTTJoxh4/e7shIfemllwAICAgQLIlUExAHBwe2bNlCc3Oz\n0PheFbh8+OGHqa+vZ+rUqYLFm0KhYNOmTZiYmDB9+nS16zEzM2Px4sUkJSUxb948/vznPwPw0EMP\n8Z///EftuNARpAwODqahoQFXV1euXr3K5cuXBbui9vZ2lEol8+bNo6KiQrB7Cw4OxtfXl5KSEgwN\nDdmwYQO6urosXbqU6upqLCws+Ne//oVcLufSpUs8/vjjpKamcv78ebUJ2Jtvvsknn3xCZmYmFhYW\nzJkzh9zcXKAjo9fX15fY2Fjh+1MqlSQmJtLe3q5m7VRYWMiGDRuoqanBw8ODmJgYYaK4atUqNm7c\nqCYIqiaKv/zyCz///DMnTpygrKwMS0tLJk2axFNPPdWjHc69zNHEK90GPFqVTdQ3NPPvQ1m06Zsz\nM9hVWCeRSDA3N79t56XqZ5WdnU12dna32zU2Nmos6y6gXlNTA8CZM2d6PLa2MbVli0okEgBBrBK5\ndfR0XwLUGztyKSuNr/aGY95aib6+viAODx06lISEBJRKJWlpaYLIDGBsbAzAtWvXtI5bWVmpth0g\nPNNVwmNnbkQsMjExQaFQaK2Sraqq6vM4IiI3Sl+srroy1N1aFHRE+h1iwLpnvvzyS9zc3AgMDMTK\nygqFQkF8fDwff/wxhYWFPPXUUxr7fPjhh2RmZjJy5EgkEglnz57liy++QCqVkpeXx8mTJwkJCSEo\nKIjY2Fh27dqFgYEBjz32mNo4N5vc9f7775Odnc2IESMYPXq08D7QHdnZ2axduxaFQkFgYCBjx46l\nqamJK1eusGPHDjUR6OeffxYsZkeOHIlSqSQ9PZ0dO3aQkpLCunXrtCZM9aX
P5P3EME9bhnnaalR2\nq1WLioiIiIiocX9Gn0RERO4pOr+8HT97gfqmFq2WOV2rBJqamsjLy8Pc3Jxff/1V69h6enpqWf36\n+vqMHz+eY8eOceHCBSEzPycnhytXrjBmzBi1ILQqyz81NVVr0Li6upq2tjYKCwvx9vbWWK8KOubn\n56Gjo4OpvYfa+pqGZmKyK7Bq16GhUq62rq2tFV2JFANTTV9pUwcP4DTUXW/MrBIxtPVTsbGxQalU\nYmxsjKmpKXA9IKkKNldUVAjbZ2VlCWLMoUOHSEnpaNBcU1NDYWGhINB1rZhobW3VCOKrgpQWFhY0\nNjbi4+NDVlYWDQ0NwjaXLl1CqVTS2NyKvKSCsxnFNFtfpr6phba2NpydnQkKChLOw9TUFLlczrVr\n1wgNDaWurg5ra2utAoKhoSF2dnY0NDQglUp5/PHHOXLkCKmpqZiZmXHp0iWeeOIJdu/ejb29PePG\njSM9PR0zMzM1y7CPPvqI1NRUAgICCAkJ0ZgoBgYGahUEN27cSFpaGiNGjMDY2Jj4+Hh+/vlnqqqq\n1Poq3S8k5pX3GGiX6BkAoGyo45ODydhbGAmBj9bWVmpqarC1vT2BEJXo8sgjj7BkyZIb2rc7X37V\nmG+99RahoaG/7wRFbhu93ZcAZo6eFLXD1/tPMMSqGT8/PyG5ICgoiMjISA4fPkxjY6NaU/uBAwcC\nCM+nrqiWq7YDhOdwWVkZTk7qVXE9CZRdGThwIElJSaSnp2vY0XV3PiL3NjfaN1GpVPLrr78SGRmJ\nXC5HIpHg6enJnDlzurWc7Gtm/JMTfXht61GK086gKM5H2aBAVyJFz8gMEztXnIOnIDXoeNfQ0YHF\nEzSrH0VE+gNiwLp7vvjiC42/gy0tLaxdu5a9e/fy4IMPaiTilZWVsXnzZuEd7NFHH2XZsmV89dVX\nmJiY8Pnnnwv7LF68mBdffJH9+/fz6KOPCsk2v8ftQXX8viQutbS08MEHH6BQKFi1apVasiF0VAh1\nZtmyZTg4OGi8d/7444/s3r2bs2fPMmHCBI3j9KXP5P2IWC0qIiIi0ndEEUhEROQPQ5stT1qOnCZF\nJR8cyODZaVK1rLiuVQK1tbW0t7dTXV3Nzp07+3zcqVOncuzYMSIiIgQR6OTJk8K6zqiy/Pft29fj\nmNqy/DsHHVuVjUgMjND938SjMw3XSmjQ0cFAVz0jvLWpHh2JFCMrR4199AxNMTfWV9tnwIABQMfE\nRNv51dfXY2dnh6trR+WFSrRKT08HOiYpKqqrq6mtraWxsZEjR44In3trayuFhYXU1NTg7++vdt0N\nDQ00NjZqNEP19fXl/PnzVFVVYW9vT1BQEBcvXiQ1NVXYJvJcIhcLq2jUNaW8IpOo9CLSmrJIzS9H\np6EKN29/reKWn58fS5cu5eOPP8bCwoKLFy+ya9cukpKSWLt2LZ988gnJyck0NTWhr6+Prq4uixcv\nprS0lKqqKsaOHUtMTAx+fn5Ah4A2ZMgQdHV1CQkJYcyYMUDHRPHAgQMYGBgwd+5cYaLo6emJrq4u\ne/fuJTc3F0tLSw1BUC6Xs3nzZjVrw5dffpmTJ0/y7LPP3nfNY3+Kyu4x0G5s7UR9pZza0ssYmFmx\nIzpb+J2np6ff1gqYQYMGoaOjI9zztwJfX18A0tLSbqsIpBLA29rabqtl3v1Kb/clgLGVE1J9Q6oL\nMkkoVPLYnOs9oFQCy549e9T+D+Dv74+Liwvp6emcPXuWcePGCevOnj1LWloaLi4uBAQECMtV9oLH\njh1TGys/P5+wsLA+X9e0adNISkrihx9+4L333hNEK4VCwe7du/s8jsi9w430TWxpaeGdd94hNTWV\nAQMG8NBDD9HU1MTZs2fZsGEDubm5PPPMM2rj30hmvLuFLiTvoTK/FHNnbyzd/GlvbaGp9hqVecnY\n+Y5CamDcr62uREQ6IwasNekqAAFIpVI
eeughkpOTkclkalbdAM8++6xaNbWjoyODBw8mOTlZw/7M\nxMSEUaNGqVnHwe9ze3jqqaf6XLkeFxdHaWkpoaGhGgIQoJH45OioOe+DjgSm3bt3c+HCBa0ikNhn\n8uYoLS3lhRdeUHPEuJvpyXJQREREpDdEEUhEROQPoTtbHlWVQFL2VTJK6nh19lDBLqprlYDq5d/L\ny4vPPvusz8f29/fH2dmZuLg46urqMDAw4PTp05ibmzNixAi1bVXH2L17t5qVT1/oHHSU6BnS2tRA\nW2urhhDU0txIY005JgM80dGB9naoqyhEWa9Aom+IpaufxtgtjbW4WJuonVNISAg6OjokJiYil8vV\nJlU///wzra2tDBw4ULAD8PPzw8XFhdTUVA0bo8TERBobG3F0dOSLL75QazC+bNkyioqK+OKLLwRB\nqa2tjX379mkN4M+ZM4fz58+TlZWFubk5QUFB7Nq1C5lMhrGxMYXlNSQUnEPfxBZLew9KM85RX1GE\nzcBhtCmbaG5ScrHWhGNJBWp/tIYMGUJlZSVjxozBzMyM/fv3c/DgQVpaWjAxMWH06NG0tLRQVFSE\nkZERBQUFtHe54SZMmEBMTAwRERHCsri4OAC1gO2RI0dobm7Gzc2N48ePa1xjXV0dhYWFDB8+XEMQ\nfO6559SsDQ0NDZk0aRK7du0iJyeHkJAQjfHuVfJLFb1aA1kPDKY85wLFqdFYDBhE8uWO/ZwtDfju\nu+9u6/lZWFgwefJkTp06xa5du1iwYIGGoCKXy9HV1cXBwUFteWpqKuvXr9fwrJ8xYwZOTk4cOnSI\noUOHMnLkSA3P+P3799PQ0MCUKVPUJpjx8fHs3LmT3NxcmpubcXBwYNiwYVp/R6pgQ1lZmca5ifRM\nX+5LAB1dXUzt3am6mokSsBlwXcy1t7fHyclJuD86V6vq6Ojw6quv8vbbb7NhwwZGjx7NgAEDKCws\n5LfffsPIyIhXX31VLas3NDQUZ2dnoqKiqKioYNCgQZSVlQlWLr3ZC6qYOHEi0dHRxMbG8te//pXQ\n0FBaW1s5e/YsPj4+yOXy3gcRuWe40b6J+/fvJzU1lREjRvD2228LGfCLFy/mtddeY8+ePYSEhAi2\nhzeaGX/27FlMpW2seW0FhQYD1ayuWpXN6Ojo9HurKxEREXW6VkO5mkLc6aPIZDLKysoEJwIVnZ0K\nVGhzX1BVKWpbp3qWdRaBfo/bg7aejt2hOk7XOWZ3NDY2EhYWxrlz5ygsLKShoUFt/qLt8+junMQ+\nkyIiIiIinRFFIBERkTtOT7Y8XasEOttFda0SMDQ0xM3NjStXrqBQKNQC7b0xdepUfvjhB6Kjo7G0\ntKSmpoY5c+Zo9FTw9fUlJyeHtLS0GwrW1ze1qAUdjawcURTnUld2GTNHL7Vt9U0sqZFforykiKn1\nqZxOusS1y2kd68ysaGluRKJviJ6RKRYug3AMGMsElzbSLuur2QvZ2Njg5uZGU1MTK1euZPz48VhY\nWJCamopMJsPIyIhRo0YJ2+vo6LBy5UreeustcnJy0NPT4/vvvyc3N5f4+HgsLS2FHiqdmTdvHps2\nbWL16tWMHz8efX19kpOTKS0t1SqUWbl4M3DkFGLiEjhxKgp3b3+Ki4v5/vvvaWrVISsnF31TS2wG\nBmPlEYiuRErFpUR0JFIaqkppb23BwNyOTw4mM9ej6fq4VlZUVlZiaGjIypUrWbRoEY2NjUyZMgWF\nQkFJSQlVVVUEBgbi7+/Pli1bNM5t0KBBgiDY2tpKW1sb8fHxSKVSPD09he0yMjKQSCQoFAqef/55\nDAwM1MZJSkri4sWLfPLJJ32aKN6vk7Kk/PJetzG1c8XeL5TSjFguHtqKldtgNjal0Fyai6mpqUYT\n3lvNX/7yF4qKivjpp584deoUgwcPxtLSksrKSgoKCsjOzmb16tUaQsuJEycYN26chmf9Z599xpQp\nU0h
MTOSf//wn/v7+yGQyiouLmTVrFnl5eejr67N8+XKcnZ2F8fLy8ti+fTteXl6MHTsWExMTMjMz\n2bt3L3l5eRrZfUFBQZw5c4b333+fkSNHoq+vj729vUbTZBFN+nJfqjB19KTqaiYSfUOqddU9/oOC\ngpDL5Xh7e2v0c/L19eWTTz5h9+7dJCUlERcXh7m5OZMmTWLhwoW4uLioba+vr897773H119/TVJS\nEtnZ2bi7u7Nq1SrMzMz6LALp6OjwxhtvsHfvXk6cOMHBgwextrZm2rRpLFy4kHnz5vX52kXuProG\nS5NOHAT63jcxPDwcHR0dlixZIghA0CGIL1y4kE2bNnH8+HFBBLrZzHg/V1teeWCMaHUlIiLSLdoc\nIJoU18g8+h+MJa1MGj2cmTNnYmxsjK6uLqWlpURERKBUKjXG6qmnYk/rOrse/B63hxup4lf1+ev6\nTNVGS0sLf//738nKysLd3Z0JEyZgYWEhnP/OnTu1fh4g9pkUEREREekdUQQSERG54/Rky9O1SkBq\n0NFINcDFXGuVwNy5c9m0aROfffYZr776qsYLcG1tLSUlJWpiCcCUKVP48ccfOXnypNDwfdq0aRrj\nz549m2PHjvGf//wHZ2dnjUBeS0sLmZmZalUj0NHrp3Mo22ZgMIriXIoSI/CZ7oqu9H/NOdvbabgm\nx8R2AEaW9qSfP4OLpB1zD0+aa6uQGppQlHgCj/EdvWnMjfV5fKQTiWfCkUgkTJ48We24Dg4OjBkz\nhoaGBmJiYmhqasLOzo6HHnoIiUSCoaGh2vb+/v5s2LCBWbNmIZfLOXDgAL6+vmzcuJF169YRHh7O\nkSNHCAgIECompk+fDsBPP/3E4cOHsbGxYfTo0Tz11FNqzVvVJnu6g5BYudJQXc63+8NpKitFX6cF\nS9fB6Er1aVU2Y+boiZGlA+YuPrQ01FFzNYvmuip0dHTQN7GgvR1OphQK43e+ltDQULy9vamtrUUu\nlyOXy3FwcODhhx/miSee4O2339b4blWoBMGKigrkcjm1tbXY2NioBcxqamqEPkTffPONcM90RdtE\nsT9NyuqbWnrfCHAZMRMDM2vKss5Tnh1PctNVFj0yg2eeeYaXX375tp6jsbExH3zwAUePHuX06dPE\nxMTQ3NyMpaUlzs7OLFmyhGHDhmns9+yzz7JixQq1ZSrP+tOnT/PJJ58QFRVFXFwc+fn5KBQKWlpa\nWLx4MZMnT2bixInC956amkpZWRlTpkxh48aNgoUXwJYtW/jHP/6hYVk3Y8YMSktLiYqKEir7AgMD\nRRGoD/T1vgSw9wvF3q/D1q9Rqf77XLFihcY90BkXFxdee+21Ph/L1taW119/Xeu6AwcOaCxbv369\n1m2lUikLFy5Uayzd0zgidz/agqUAGYfPIG24Rru5pn1S176JDQ0NyOVybGxsBLvYzqhsCHNzc6+P\nf4OZ8aGhoXz//fds3bqVxMREhg0bxvDBg3F19ei2j9r9SGNjI4sWLcLHx4d///vfwvLm5mYWLlyI\nUqnktddeU3teHz58mC1btvDyyy8L71UKhYJ9+/Zx7tw5SktLkUqleHt789hjj2n9uyQicq/QnQNE\nacZvtDTVYzXAA5ezAAAgAElEQVTmEYpcgnEfdd0BIioqSq1S/1bze9webuT5pjpOdxU8nVFZe2qz\nJqusrLwh+3MREREREZGuiCKQiIjIHaU3Wx5tVQJXE3S5Gr4NJzsrjSqB6dOnk5OTw+HDh3nxxRcZ\nNmwY9vb2QiVIamoq06ZN0wjc2draMnToUGQymWDr5OWlXqEDHX12Xn75ZTZt2sSKFSsYPnw4Li4u\ntLa2UlpaSnp6Oubm5mzdulVtv9Y29VmOtecQqq9mcu1yGhcPfomFqx8tTfXUluQjMTDCOXgqnhMe\n49nJg4TGyVNnPIihpT11dbXop/3MwnGhGIUsIjr6OHV1dfz5z3/W6
qPt4eHB4sWL1ZaVlpZy4sQJ\nrZ+5t7c3vr6+BAYGqgUZt23bxjvvvEN0dDSXLl3SqJioqKjgjTfeYOLEiUBHE3J/f38WLVqkdbKn\nb2KBvokFAXNXUpx2BnlSBAYeI2m/nENz7TWqCjKpK7uKVM8QidQAnxl/Jvb/XsXY2gkrt44s5WpT\nL2bMeoSCvCyNCZiXlxeVlZUsWrSIb775hldeeYUpU6awY8cOCgoK1LZ95ZVXhMmVShCsqKjg8uXL\nmJub8+2336rdDyYmJjg4OODu7o6zszPvvPNOt4Jgf8bYoG+vFTo6Otj5jsLOt6MybdnMwcwd1VF5\n9fXXX6ttO3XqVK2e14sXL9a4z3vbR4VUKmX27NnMnj2713MdMmRIt4H0zp71eXl5PPvsszz77LOC\nHdxbb72ltU9QXV0dY8aMYf369WoCEMDSpUuJjo7W8JrX1dXlmWee0ejfcTuIjY0lLCyMgoICFAoF\n5ubmODs7M2HCBGbNmgUgXOO+ffvYtWsXkZGRVFZWYmtry5QpU3j88cc1KivPnTvH2bNnycrKEoIh\nAwYMYOrUqcyePVtrUKWpqYkDBw5w9uxZrl69CnQ8v4cNG8aCBQvUBNmmpibCwsKIjo6mqKgIHR0d\n3N3dsfIJAUxv+HPo6/0sInIr6S5YCtCqbKK+oZl/H8qiTd9cCJaCZt9EVfZ5d9WVqkz2zhWpN5oZ\nb29vz8cff8yOHTu4cOECMTExQMdvdN68ecyZM6e3y70vMDQ0xMfHh6ysLLUqqfT0dCFrXyaTqYlA\nMpkM6Kiwgo73tDVr1lBaWkpAQAAjRoygsbGR8+fPs3btWlasWMHMmTPv8JWJiPx+enKAaFJ02FFb\nuvnT3o6aA0RKSsptPa+bdXu4UVR9RxMSEnjwwQd73FZl4Tp27FiNdZ17qYrcHkpLS9m+fTtJSUk0\nNjbi7u7O4sWLtd4fUVFRHD16VM3SefLkycybN0+wXldxM++/crmc7777jqSkJFpaWvD09GTBggW3\n58JFRET6DeLsVkRE5I7SF1uerlUCEgNjLGZM5t13XtNaJbBs2TJGjhzJkSNHkMlk1NXVYWpqip2d\nHfPmzes2U37q1KnIZDJaW1s1Go525k9/+hOenp788ssvJCcnk5iYiKGhIdbW1owbN05rc06JrubL\nnMf4+Zg6uFNxKYny7ATaWppBV4Kl22Ch0qdz0NHYQErgQGdWr17Nt99+S2LcGerr63F1dWXevHla\nm4veSm62YiK/VMH3udoneyrMHD0paofm2msYWTmgW1NBxaVE2lqUKOtr0DM2pzJXRnt7GwZm6vYJ\nJdX1WsecO3cumzdvZvPmzZSUlHDw4EEOHjzIlStXGDVqFDKZTGvljUoQjIuLo6ioiKFDh2oIgqqJ\n4qxZszh+/PgNCYI3S0pKCm+++SaLFi3qVuzozIEDBzhy5AglJSU0NzezZMkSHnnkkVtyLn0l2OPm\nej7c7H63k1vhWa/NCrCpqYm8vDzMzc359ddftR5bT09PQ7i8Uxw9epTNmzdjZWXFqFGjMDc3p6qq\nivz8fE6cOCGIQCo2bNhAdnY248aNExrU79ixg+zsbN5++221ie327dvR1dXF19cXGxsb6urqSE5O\nZtu2bWRnZ2tU0dTW1vLmm2+Sl5eHi4sL06dPRyqVUlxcTHh4OGPGjBFEoLq6Ot58801yc3MZOHAg\n06dPp62tjcTERBJ/3k6p2WCcg7t/zmvjbrwvRe5vegqWwvW+icqGOrVgKXTfN7Frzz8VquWdq1Vv\nJjPe1dWV119/ndbWVvLy8khKSuLgwYNs27YNQ0NDocrlficoKIiLFy+SmpoqBAxlMpnQQ0wl+gC0\nt7eTkpKCo6Oj0J/kk08+oaysjNWrVwvJNdDxbFuzZg3btm0jNDS020pkEZG7lZ4cIPRNOmxXa0vy\nsRjgS3s77IjOpv3aFa09OG8lN
+v2cKOMGjUKe3t7YmNjiYqKUvt9A5SXlwvPbdXzICUlRc3Cu7i4\nmO3bt/+u8xDpmdLSUl577TUcHR0Fa/Ho6Gjeffdd1q1bJ1TPAnz22WecOHECW1tbNUvnH3/8EZlM\nxrvvvqvmKHGj779FRUWsWrUKhULBiBEj8PLyQi6X89577/W5t5SIiIiINkQRSERE5I7SF1uerlUC\nABMnD8LExESjSkBFSEjIDWdx/elPf+qzlZKHh4dGWX53rF+/nqWlCpb+X5Tach0dHewGhWA3qOM8\nm2qrSPvlM0zt3IRAqbago7W1NX/72996PW5PFQv29vY92gL1VOlwoxUTq777jXaFZrVXwNyVwr+N\nrZyQ6htSfTWT9rY2PMbPxzGwQ0xTfS5lmbEAGJipZzErW7VbqD3wwAPo6enxxRdfkJaWhkwmY+rU\nqaxcuZKYmBikUim1tbU0NzdrVF9MnTqV7du309TUpFUQVE0Uk5KSeP311zl37pyaIGhpaYmXl9cN\nZ2hdu3aNOXPmaLV9uBGioqLYtm0bXl5ePPzww+jp6QmZh3cSD3szhrhZ91jt15Wh7tZ3Vc+IW+lZ\nr80zvra2lvb2dqqrq+9KW4+jR48ilUr5/PPPsbBQ74mjqhLoTEFBAZs3b8bUtKPSRtWg/vz580RG\nRqo9Y9euXatRvdje3s6nn37KyZMneeihh/D19RXWbdmyhby8PB588EGWLVumJig1NjbS2toq/P+r\nr74iNzeX5557jvnz5wvLm5ubee+99/gx7CT1boMxtnbs0+dwt92XIv2DnoKloNk3cUd0tiACde2b\naGRkhJOTE8XFxRQVFan1JANITk4GULPL/T2Z8RKJBG9vb7y9vfH39+eNN97gt99+61ci0K5du5DJ\nZGoikLe3N2PHjmXr1q0UFhbi4uJCbm4uCoVCyPbPy8sjNTWVcePGaQSITUxMePLJJ1m3bh0xMTEa\nQryIyN1Mbw4QdoNCqMxNIi96L5Zu/ugZmZFzspRE/SpmTJ1MdHT0bTu3m3V7uFGkUilvvPEG77zz\nDh9++CFHjhzBz8+P5uZmCgoKkMlkQlLQqFGjcHJy4pdffiE/P5+BAwdSVlZGXFwcISEhlJWV3YpL\nF9FCSkoKixcvZtGiRcKySZMmsXbtWvbt2yeIQBEREZw4cYIxY8awatUqtTnljh072LlzJ4cOHeLh\nhx8Wlt/M+69CoeDFF19UGyc2NpZ169bd8msXERHpP4gikIiIyB3lZu117jVbnvshGH4z9DbZU6Gj\nq4upvTtVVzvs08wcPYV1BqaWGJhZ06SoxNLVD+9pT6vtqyfRpRnt/TFUFmAtLS288sorwv89PDxQ\nKpXs3buXtWvXEhAQgJ6eHp6enowaNYo//elPrFy5kqioKFJSUqivr0cqlRIQEEBgYKDaRPGDDz5g\n+PDhjBs3Tm2i2N7erpYldic5f/480DHJ6M76507x5EQf1vwU22MgU4WODoL94d3Arfas12bvoMq0\n9/Ly4rPPPru1F3CLkEgkahmMKrpa1AEsXLhQEIBAvUF9eHi4mgikzb5SR0eHhx9+mJMnT5KYmChM\ngqurq4mOjsba2prnn39e47Ps3BNMoVBw6tQpfHx81AQg1fk899xznDpzjmuXU/okAt1t96VI/6Av\nfz+79k1Mvtyxn7Olgda+idOmTeOHH37gm2++4c033xR6BtXU1LBr1y4ANZHmRjPjc3JycHJy0uh9\nV1VVBYCBgcENfgr3Fp0rRg0kBrS06woVP3V1dVy6dIn58+cL7wYymQwXFxdBgFMtV/ViqqurY8eO\nHRrHqa6uBvjDKkRFRG6W3hwgjKwc8J72LHLZKWoKs2lvb8PI0oHpTy3hwVE+t1UEgptze7gZfHx8\n2LRpE3v37iU+Pp6MjAxBqH/yySeF7QwNDXn//ffZvn07KSkppKen4+DgwMKFC5k7d+5t/zz6A10
r\n/QeYdLz029vb88QTT6htO3z4cOzs7MjKyhKWhYWFIZFIWLlypUZS4cKFCzl48CCRkZFq4s2NvP+W\nl5eTlJSEg4ODRhJmaGgogYGBojWgiIjITXNvRVVFRETuee4nu6je6C0YbmBqyfCn1gL3T9CxL3Z/\nKkwdPam6molE3xBja/UMZTNHT5oUlRhbd1QMdcbBwpiCvmtrAk888QR1dXXExcUJGdNTp04V7BZe\neukloCNIEx8fT3t7O4sWLSIwMBC4cxPFm6GysuMD+aMFIIBhnra88tCQHi2NoOOef3X2UCGL/Y/m\nTnnWGxoa4ubmxpUrV1AoFJiZ/fHCb+cJsZGLP9fSM1m+fDkTJ04kMDAQf39/jaogFarfR2dUDeo7\nN5yH603P4+PjKS4uFvqKqOhsqZeVlUV7ezsBAQFqgo82srKyhAoIbQHU1tZWLIz1cXeQUKbDPXVf\nitybNDY2smjRInx8fPj3v/8tLG9ubmbhwoUolUpee+01NZH065/2cOHHrbiPfhgb72HUVxRRmZeM\noiQfZX0NbS1K9IzN0TMyo7G6XOibuLEphUsJp8nMzGTAgAGCnRDAvHnzSEhI4MyZMwwcOBAvLy8e\nf/xxzpw5Q3V1NfPnz2fw4MHC9jeaGX/q1CmOHj3K4MGDcXR0xNTUlOLiYuLi4tDT07vjlqR3Cm0V\nowA5tUZknUnicVkuBo1ltLW1ERQUhKurK9bW1shkMmbNmoVMJkNHR0foB6RQKABISkoiKSmp2+M2\nNDTcvosSEbkN9MUBwtTOFZ9p6v0OXQcNYsgQHw2nAm0JYCo69/vsSk99JG/U7aEneupJaWdnx7Jl\ny3o9hq2tLatWrdK6TptzQ0/X1psTRH+iu+d2U20VBQXXcBsUKCRKdMbW1lYQ6m/W0vlG3n9V786q\nd+muDBkyRBSBREREbhpRBBIREbmj9KcKmXs1GP576MtkT4W9Xyj2fqFa17mFzsYtVNOCbqi7NR++\n/VGP43Y3ATM0NGT58uUsX75c634WFhasXr26x7Fv1URx6tSplJSUCJnbERERatUkr7zyilogLzc3\nlx9++IGLFy+iVCoZNGgQzzzzDImJiWqWYnPmzKG9vZ2ysjImTJggCA11dXXo6+tjZmaGiYkJQUFB\nLF68GBcXF2JjYwkLCyMyMpJLly4REhJCYGAgEyZMYNasWfz000/s2rULR0dHPv74Y/bt28e5c+co\nLS0lISEBCwsLtm/frtYf6oFhbjhYGrMjOpvky5q/9aHu1iye4HNX3fN30rN+7ty5bNq0ic8++4xX\nX31VI4u+traWkpISNZum24H2CfEAql0mUF2cyuVdezE3+hUdHR0CAwP585//rNHnSFt/ClWDelX2\nOnRkuL/66quUlJQwaNAgpkyZgqmpKRKJhLq6OsLCwtQs9VRN7W1sbDTG74oqgJqdnU12dna327la\nGfDak6H31H0pcm9iaGiIj48PWVlZNDQ0YGRkBHRYtqnuc5lMpiYCZWWkAR0JEgDlOReoLsjA1MED\nM0cvoJ36CjmKknzQAV2JlPLseJKbrvLYrAcxNjYmJSUFb29vYUypVMq7777L66+/TkZGBg0NDURE\nRODp6clLL72kYT0GN5bwMHHiRJRKJRcvXiQnJ4fm5mZsbGyYMGECjz76KO7u7rf6o/3D6a5iFDq+\nuyJ5Lqs+38MYp3b09fXx9/cHOqp+EhISUCqVpKWl4ebmJojrqv5LL730EnPmzLlj1yIicrvpLw4Q\nInc3PT23AWoamom4WMGxpAKh0l+FRCKh/X879tXSuampSbD7njVrFosXL0Yul2NiYoK7uzsTJ05k\n4MCBau+/SqWSX3/9lR9//FEQi0pKSpgzZw7jx48XxjYxMeH8+fNs27ZNbb7bU5LJ4cOH2bJlCy+/\n/LJa9a9KnFLN66RSKd7e3jz22GMafX8jIiL49NNPeeWVV7C
0tGTv3r3k5uZSX19/zwiNpaWlvPDC\nCzdlw36jvXrvFJ2/l+4EaBGRzoh/XUVERO4497Jd1I3ye4Lh98oLVWdu56TtXr8XujJkyBDh5d/T\n05PRo0cL6zw9PYUgeE5ODj///DN+fn7MmDGDsrIyzp49y1tvvcWyZctYtGgRERERlJaWsmDBAvbv\n349SqaS2thZPT0/Cw8OpqqpCX1+f4OBggoKC+O2334iPj2fmzJns378fKysrJkyYQH19PRYWFjQ1\nNXHixAkhYxng8uXLLF26FIVCQUBAAB4eHmRlZaGvr8/atWtZsWIFM2fOFK5hmKctwzxtNWwXgj1s\n7zpR90571k+fPp2cnBwOHz7Miy++yLBhw7C3t0ehUFBSUkJqairTpk1jxYoVv/fSuqWnCbGNVxB4\nBdGqbGSGrwFU5hEeHs7atWvZsmWLWlVQVVUVdnZ2avurGtR3bix//PhxSkpKtE6eMjIyCAsLU1um\nEsY6Z0d2h2rbRx55hCVLlvS6/b1yX4rc2wQFBXHx4kVSU1PVesTo6uoSGBgoPFuhozeAPD8bAzMr\nDEw7hFWHgPG4hsxCp0smcEVOIpfPhWHjPRzHgPEsmzmYuaM80dfXp7m5WSPTXE9PD4VCwciRI/nu\nu+80RGdt9DXhwdfXV62Pwf1OTxWjcN3aViHPY3/KVaaP9BHsgoKCgoiMjOTw4cM0NjYKVUCA8Bmm\npaWJIpDIfUV/coAQuTvp7bkt0KXSXxt9tXRWiQ0lJSU8//zzVFVVMWfOHPz8/IiOjub8+fNMnjwZ\nOzs7wsLCaG1t5Z133iE1NRU9PT0cHBxwd3ensLCQDRs2kJubyzPPdFTL1dXVYWpqytWrV/ucZKJ6\n3+j8d6e0tJQ1a9ZQWlpKQEAAI0aMoLGxkfPnz2ud16k4e/YsCQkJjBgxggcffJDS0tJePlgREZG7\nCVEEEhERueP0twqZeykY/nu5XZO2++Ve6MyQIUNwcHAgLCwMLy8vjcC4ymbs/PnzGtk9R48eZfPm\nzWRnZ7Ns2TJSUlIoLS1FIpGgVCpZunQpixYt4qWXXsLPz4/169fz66+/Eh4ezsSJE1mwYAGrVq3i\n888/x8PDg88//xwjIyOysrKwt7fn448/pqamhsbGRrKzswkODmbnzp0oFAo2bNjAxIkTCQsLw8PD\ng2XLlnH06FG2bdtGaGioRmWIh73ZXX+f/xGe9cuWLWPkyJEcOXIEmUwmTOrs7OyYN2+e2uTtVtPX\nCbFEz5BDebD+yUW0t7cTHh5OWlqa0MwcIDU1VeNcVXaLXl5ewrKioiIAtX07j9GVQYMGoaOjQ1pa\nGo2NjT1awqm2TU9P7/mCOnEv3Jci1zlz5gwHDx4kLy+PlpYWnJycmDRpEnPnzkVPT0/Y7oUXXgBg\n8+bN7Nixg+joaEGonDFjBvPnz9faqysrK4v9+/eTnp5OTU0NZmZmuLu7M3PmTLUMXIDMzEz27dtH\neno6tbW1WFpaMnLkSBYtWqRmyRkUFMSuXbuQyWRqIpC3tzdjx45l69atFBYW4uLiQm5uLnrtSo3+\neNqwHhjM1QvHUchzcQwYL/zdnTVrFocOHeLIkSPC8QASExMpKSlh2rRpfRKARLqnp4pRAGOrDgvb\n6quZKBvrKOV6coeq/8+ePXvU/g8d/UICAgKIiYkhPDxcLVNbRX5+PlZWVt1ac4qI3I30JwcIkbuT\n3p7bnVFV+nc337xRS+fU1FQ8PDywsrLijTfewMPDg4ceeojVq1ezefNmQfRPSUmhoqKCESNGsGzZ\nMpYsWYKlpSUffvghq1atYs+ePYSEhODv709KSgrm5ua0tbX1OckkJSUFR0dHNZeJTz75hLKyMlav\nXq1WFVxXV8eaNWu6ndfFx8ezdu1aRowY0bcP9S7C2tqaLVu2qCWpiYj0N0QRSERE5A/hXrSL+r30\nh6DjzU72Fk/w6Vf3wo3
g7++vUd49bdo0tm7dqtaoFODgwYNYWVmxZMkSDh06RF1dHX/5y19wd3fn\nhRde4MSJE0RGRvL6668zc+ZMEhISaGpqQiKRoK+vj5+fH6mpqdTW1mJubk58fDwtLS2MHDmSb7/9\nFisrK2GioJpgjB49GhsbG9atW0dMTAyzZs26Mx/MLeROetZ3JiQkRC1Ye6foaUKsKM7D1MFDCJSr\nJsRm3TR637VrFyEhIZiamgIddhQqm8Np06YJ2zk4OAAdE10PDw9heW5urhAU7YyFhQUTJ07k9OnT\nfPPNNyxbtkwteN/Y2EhraysmJiZYWFgwefJkTp06xa5du1iwYIGGj7pcLkdXV1c4D5F7h++//549\ne/Zgbm7OpEmTMDQ0JCEhge+//54LFy7w7rvvIpVen9K0tLTwzjvvUFlZyciRI9HV1eXcuXN89913\nKJVKFi1apDb+sWPH+PLLL9HV1SU0NBRnZ2eqqqrIycnh0KFDaiJQeHg4X3zxBXp6eoSGhmJra0tR\nURHHjh3jZNRZHn7+NfSMLTA2kBI4wAV9fX3hWVlXV8elS5eYP3++IADIZDJcXFxITk7G2EBK0JCh\nXPvfsdpaW6nISeBafiqNNeW0NjcKljQAyvoatWCpm5sbgYGBJCQkUF5ejq2trXB9AA8++OCt/WL6\nGb1VjALo6Opiau9O1dVMAK5J7cgvVeBhb4a9vT1OTk7Cs6hrP7VVq1bx97//nU2bNnHgwAF8fX0x\nMTGhvLyc/Px8Ll++zMaNG0URSOSeoz85QIjcXfTlud2V5MuVwnNbG32xdM7Pzwc6KodmzZrFzp07\nhfdfHx8fJk+eTFhYGF999RVmZmZkZmZiZ2fHkiVLcHBwIDg4mKSkJKKjo1m4cCGbNm3i+PHj1NTU\nkJqairm5OU1NTX1OMlEoFGpJWHl5eaSmpjJu3DgNW1gTExOefPLJbud1oaGh96QABB0WuQMGDPij\nT0NE5A9FFIFERET+MPpThUx/4mYme/3lXuh6fQNMev+QuvZgAbha2UBNi5SknEJ+icujuq5ZaDLq\n7OzM7t27OXDgAIWFhezZs4fDhw8DUFxcTEREBK6urhQWFmJjY0N1dTXLly9n4sSJGBkZ0dzcTEpK\nCmPGjCE5OVkIrpqZmVFQUMCOHTtoa2vj8OHDGBsbc/ToUaH3S9dGqPcK/cmzvrcJcV7Uf9GV6mNs\n64KBqSXt7ZB55DIDTZsYGuCnZiUB4OrqyooVKxg3bhwSiYTY2FjkcjkhISFqFUJTpkxh3759fPXV\nV6SkpODs7ExRURHnz59nzJgxWqup/vKXv3D58mWOHDlCSkoKw4cPRyqVUlJSwoULF3j77bcZMmSI\nsG1RURE//fQTp06dYvDgwVhaWlJZWUlBQQHZ2dmsXr1aFIHuMTIyMtizZw+2trZ8/PHHWFlZAfDs\ns8/y3nvvcf78efbt28eCBQuEfSorK/H09GTdunWCFdfixYtZunQpv/76K48//rjwXCsoKBCyQjds\n2ICbm5va8cvLr1cJFhYW8uWXX+Lg4MD69euFflWJeeVktrly9KfNJH+4Ca9JTwj71CjNKb+YTXV1\nNRkZGbS1tREUFISrqyvW1tbIZDLBdlNHR4flTzzA+wcu0t4O+Wd+pqrgIgZmVli4+CI1MkVXIgGg\nLCOW9rZWjWDprFmzSE1N5dixYzz55JNcu3aN2NhYvLy8GDRo0K36WvolvVWMqjB19KTqaiYSfUOM\nrZ1Jyi8X3mOCgoKQy+V4e3trBA5tbW359NNPOXDgADExMURGRtLW1oalpSVubm7Mnj37vuyxJHL/\n098cIETuHvr63Na2X3fzT22WzrqGZmReKaGstBT55Rwmju8QXAYOHMgDDzzAwYMH1d5/U1JSSE9P\nZ9y4cTQ0NFBTU4Ovr68gUCxbtoxVq1bx1Vdf4efnx9WrV/nvf//LqVOnGDVqFLGxsbS2t
vY5yQTU\nq08zMjKE/Xbs2KFxjT3N6+7ldwltPYGqqqrYt28fcXFxlJeXI5VKsbS0xM/Pj4ULF+Lo6NjjmDk5\nOZw8eZKUlBTKy8tpamrC1taW0NBQnnjiCSFJTkXnHj52dnbs3LmTnJwcdHR0CAgI4Pnnn8fV1VXj\nOHK5nO+++46kpCRaWlrw9PRUe/ftSn5+Pnv27CEjI4PKykqMjY2xtbUVerx2Tp4S6V+I37yIiMgf\nTn+okOlP/J7J3v16LyTmlfNTVLZG8L2ptoqCgmv4VtR2u2/nQFHncbLkNQBsOZZOdtIVlGUVeDrb\nQVERO3fuJDMzk+rqagoLC9XGKy4uFpqZOjo6MmnSJIqLiwkLC6OmpoaMjAz++c9/8vnnnyOTyRg0\naBBNTU2Ym5uTn5/P119/TVtbG7m5udjb26s1Rm1oaPjdn9UfQX/yrO9tQuwUPBWF/BINlcXUFOWg\nK5Gib2LByClz+MdKzUnD66+/zq5du4iMjKSyshIbGxsWL17MY489pla5Y21tzYYNG9i+fTvp6elc\nuHCBAQMGsGzZMoKDg7WKQKampnz44YeEhYURHR3N0aNH0dXVxc7OjunTp6sF7I2Njfnggw84evQo\np0+fJiYmhubmZiwtLXF2dmbJkiUaTW5F7k46i+Wnft1FfVMLTzzxhCAAQUej5hdeeIH4+HiOHz+u\nMRFeunSpIABBR2VZaGgoJ0+epLCwUAimHz58mNbWVhYuXKghAAFCNQ3AkSNHaGlp4cUXXxQEIKG3\nFjZYDPClujCLVmUTEr2Oirl6Y0cuZaXx1d5wzFsr0dfXx9/fH+gIyCQkJKBUKklLS8PNzY2JQV7U\nt0l576bgPj0AACAASURBVPujVBVcxNzJi4F/WoyOrkQ4j/b2dkrSYxjqaq0RLB0zZgyWlpaEh4ez\naNEiwsPDaW1t5YEHHrip70LkOn2pGAWw9wvF3i9U634rVqzosdebkZERCxYs6DGwIyJyL9IfHSBE\n/nj6+ty+0f1Uls7f7PiZb/efoKKqBom+EfrG5pg5BRJdaUVpwTUGBuppff/V19fHw8OD4OBgzpw5\nA6BmJ+vs7MxHH33E9u3bSUxMpLi4WEgUqKmpIS4uDnd3dy5fvtznJJPOSVwKhQKApKQkkpKSur1O\nbfO6zu9i9zpNTU38v//3/5DL5QQHBzNq1Cja29spLS3l3LlzjBs3rlcR6NixY/z2228MGTKE4OBg\n2tvbycnJ4ZdffiEhIYGPPvpI6NvUmbi4OGJjY4XeSgUFBcTHx5Odnc2XX36Jubm5sG1RURGrVq1C\noVAwYsQIvLy8kMvlvPfee1qrsvLz8/nb3/4GdFRuOTg4UF9fj1wu5/Dhwzz99NOiCNSPEb95ERER\nEZFbjjjZu44QJOxGEKtpaOZgwhWmJxUwM1gz86ev49Q3t5FReI2Hpv+Jbz/fwPr164mJiRH6/vRG\nXV0dqamp/OUvfyEjI4M333yT2tpann76aYyNjTE3N8fd3Z2XX36ZpqYmfvjhB9asWaO1x8u9Rn/y\nrO9tYms3aCR2g0ZqLA8aN0jrJEZPT4+nn36ap59+utdju7q68vbbb2td19VST4WhoWGfg6JSqZTZ\ns2cze/bsXrcVufvQJpZnnE2kvrKCXzKVOPiWq/3NcHFxwdbWlpKSEurq6gTB3MTEBCcnJ43xVYJO\nbe110T0zs8O2qy/WJqrM2dTUVLKzs8kvVbDrbI6wvqWxjva2NppqKjC2cQbAzNGTonb4ev8Jhlg1\n4+fnJ4hTQUFBREZGcvjwYRobG4UAzQPD3Lia5cj6SH3MXQapCUAArga1NDqaMsBGs7+PVCplxowZ\n/Pe//yUuLo7jx49jaGjI5MmTe70+kZ7pTxWjIiK3g/5S9S9y99CX56+BqSXDn1rb7X7dWTxXSB3I\ns5mA64MT6Dp7a6qtoqahmQMxF3nwf/O7zu+/qmoQR
0dHfvnlFxYsWMC1a9fUxnBycmLNmjWUlJSw\nZMkSPD09Beu3qVOnsnfvXr777jtkMhkZGRm9Jpl0thJV9cR56aWXhL5EfUVbX8V7FZlMhlwu55FH\nHmHJkiVq61paWlAqlb2O8fjjj7Ns2TING+rw8HA2bdrEoUOHeOyxxzT2O3fuHP/617/UxLnvvvuO\nvXv3Eh4ezvz584XlW7ZsQaFQ8OKLL/Lwww8Ly2NjY1m3bp3G2BERETQ3N/PWW28RGhqqtq62tlbD\n2lukfyG+lYqIiIiI3BbEyV5HULMn4Ubou9LWxicHk7G3MNIqjPU2DoCuVB+JniHHz8RzPrsYPz8/\nYmJiSEtL65MIZGJiQmhoKHPnzmXPnj3k5+djYGBAUFAQ+vr6GBoacu3aNWQyGc3Nzejo6AhWXPcD\n/cWzXgxkityNdCdytyqbAMipaGHNT7G8OnuomlhubW1NWVmZhgikDcn/rNTa2tqEZSpBSFXZ0xM1\nNR3Vl/v27QMgveAaNQ3NGtu1tlxfZmzlhFTfkOqCTBIKlTw253pFjsqaRdUTq7NVy4RgX34dYIW/\nhy5jZw4W/n56WevxzRcbsTC+XuXUlQceeIC9e/eydetWKioqeOCBB7QKuCI3Rn+qGBXpG2vWrCE1\nNbXbJAYR7dyvVf8idx+367ndl3kZQH2lnI37z2vM71JSUgDw8vLCyMgIJycniouLKSoqwtnZWW0M\nlZ3bwIED1ZarxAOVCNSXJBMVvr6+AKSlpd2wCHQ/0rlyXIVUKu1TtYy9vb3W5dOmTeM///kPiYmJ\nWkWgiRMnanwvqve3zj1/y8vLSUpKwsHBQSPJLTQ0lMDAQFJTU/t8XV3t6UT6H+KMXkREROQe5oUX\nXgDg66+//oPPpHv682Tvp6jsHicIEn0jdHR0UNZX094OO6KztYpAvY2jws53FPKUKNas28juj9ew\ne/dudu7ciY+PD4MGDaKyspK6ujpcXV1pb29n3759zJs3Ty2ra+jQoezYsYOqqir8/Pzw9fVFKpUS\nEBBAXl4eERER2Nra4unpiZnZ9e81Pz8fKyure7ZpdX/xrBcDmSJ3Gz0FU1S2ai2NtUj0rDXE8srK\njqqh7oSf3lBNhisqKnptFqw6xu7duymtbWXp/0X1Or6Ori6m9u5UXc1ECdgM8BbW2dvb4+TkhFwu\nR1dXl8DAQGGdj48P/v7+XExOoL25jsGDB1NSVcXehARcXFzUbGO6YmdnR0hICLGxsQCiFdwtoj9V\njIqIiIjcD9yu53Zf52UtzY3Ik0+zI9pJeG/Jzs4mMjISExMTxowZA3QIBj/88APffPMNb775plBV\nUlNTw65du4COXkSdGThwICYmJsTGxlJdXc2kSZOuX0MPSSbQ8Y4REBBATEwM4eHhGmPDvT+vU9E5\nGbW5rkrNESEwMBAbGxv27t3LpUuXGDlyJP7+/nh5eWlU9nRHS0sLR48eJSoqioKCAurq6mjvdHNU\nVFRo3c/b21tjmbaK9dzcXAAGDx6s9ZyGDBmiIQJNmDCBsLAw1q1bx7hx4wgODsbf319rlbxI/0MU\ngURERERERG4D+aWKXicdEj19jG1cqC29Qv6ZfciTbXBvzGT2jMnCNmXVDaSU923y4hA4kYZrJcjO\nRbF0hRyPAQOIiopi/vz5mJmZUVdXx9ixY3FzcyMjI4OIiAgOHDiAr68vDg4OtLe3c/bsWaqqqjAx\nMWHs2LFCFtSqVatIS0sjLi4OY2NjjIyM2L59O+Xl5eTn53P58mU2btx4T08W+oONoRjIFLnb6CmY\nYmTtSH2lnNqSyxiYWauJ5XK5nPLychwcHG5aBPL19SU7O5uEhIReRSBfX19y/n/2zjygqmrt/5/D\nPE8yiCiToKDiwQlTcyjnqUxzIksbbmWWWWL3kpX9Xoty6Dqkr92Gm94M7YreFNQc8KokCAJymCRF\nUBFRQBkOIMhwf
n/wnp3HcxhFRV2fv2Dttdda+3DYe6/neb7Pk5lJWloauarm3wMsOnpQfPkP9I1M\nKNHTvD/K5XLy8vLw8vLSuAY9PT0+/vhjtm7dSnx8POHh4XTo0IExY8Ywc+ZM3nrrrUbnHD16NLGx\nsXh7e2tFDwtaz+OiGBUIBIJHhba+bzdnf6fG0smN65mnCfvuCh1LRqJfW0lUVBR1dXUsWLBASss2\ndepUEhISiI2N5Z133qF///5UVVXx+++/U1JSwrRp0+jRo4fG2OrgEXXAx+2qksaCTNQEBQWxdOlS\n1q9fL+0Fzc3NH5l9na4Uw1VlxaRdvE5d3AWGZ9enGF69ejWhoaHExsaSmJgIgJWVFRMmTGDmzJlN\nqoFWrlxJTEwMHTt2ZODAgdja2mJoaAjAnj17Gkwpp0uRo0uxXl5eDoCNjY3OcXTVaOrWrRsrVqzg\n3//+NydOnOC///0vUJ9GOTAwkGHDhjV6TYJHG+EEEggEAoHgHpB0obBZ/dyHPMfl+AOU5p2n9mIq\nW64m4Nu1iyQvv1ioBBovSqlGT18fj+EzKcpOAS6Tm5uLvb29pAAyNjbmypUrqFQq5HI5/v7+VFRU\ncP78eeLj4zEyMsLBwYFu3bphaWmJv7+/NLa9vT3ffPMNzzzzDEVFRVy9epXw8HBsbGxwdXVl0qRJ\nUrH1h5nHIY1hW2yIG8qRLmiclJQUPvzwQ2bPnk1gYOCDXs4DpyljSoeufbieeZqrqcex6twNQxNz\nki/eIOtqCaE//IBKpWLMmDGtnn/ChAns37+f7du307dvX7p00czsX1hYKEVmTpo0iQMHDvD999/j\nN3aO1lh1tbVUXL+MhaPmfdDRZyCOPvU52Sur6zSOLViwgAULFuhcm6WlJfPnz9d5rCn17/nz5wEY\nP358o/0ELeNxUYwKBALBo0Jb37ebu78DMDK3pUvARK6cjuTXPXtxtDKia9euzJo1i759+0r9DAwM\nWL58Ob/++ivHjh0jIiICPT09PDw8eP311xs02svlcmJjYzEzM8Pb21vrmK4gEzX29vasXbuW8PBw\noqOjOXr0KHV1dY/Evk5XiuGqsmJSwlZTpbxBXlGFRorhhQsXolKpyMnJQaFQsHfvXrZv345KpWLO\nnDmkpKTw9ttvo1QqNeY5d+4cMTEx+Pv78+mnn0pOHACVSsXOnTvv+lrUf7vi4mKtY5GRkSxdulRn\njR8fHx8++eQTqquryczMJDExkfDwcFatWoWVlZXGHl/weCGcQAKBQCAQ3ANul5s3hrGlHV2fmi39\nPndEN0b+n9E9PDyc0KhzpB09q3VezynvSj97j54n/SyTybDz7M3UEc+3eRSyi4sLCQkJbTpme+VR\nTmMoDJn3lvz8fF599VVGjhzJokWLHrv5W0JTxhQLhy449RzCtbQTZERswsa1B3oGhrz9zi/oVxbR\no0cPpk6d2ur5u3Tpwvz589m4cSMLFy7kiSeeoFOnTpSWlnLu3DnMzMwICQkBoHPnzixcuJD169fz\n07rlXNd3xNiqA6jquFVWTFlBDgbGpvR45u0G57sftbVu3rzJ/v37sbS0FNGe94DHQTHaXG6/1wQG\nBrJ582aSkpKorKzEzc2NwMBAqZA51Ec0HzhwgISEBHJzcykpKcHMzAwfHx+mT5+Oj4+P1hyTJ0+m\nV69e/PWvf2XLli2cOnWKyspKPDw8mDdvHj179qSyspLQ0FB+//13ioqKcHZ2JjAwkCeffFLnuo8f\nP85vv/1GVlYWt27dwsnJiREjRjB16lQpgvvO/rt27SInJwdTU1P69u3LvHnz2uxzFAgE95a2vG83\nd3+nxsTaAc8Rs5g7oluj+zIjIyNmzJjBjBkzmj325MmTG6zp01iQiRpTU9Nmzzly5EhGjhzZ7LU9\nKJpbr0mlQiPFsEwmw9XVFVdXVwYNGsTLL7/MyZMnmTNHO+hHTV5eHgABAQEaDiC
9TlatwZWiq3Sy74GRtSNdR06lpbmJmpJ8KvMyKEqNIufUbrT1\nDDG1c8PN/yXid32JW88+nDwSwsyZM3FwcGDt2rUq+z99+jSff/65mp+SJnx9ffH19ZVav7766qvY\n2Njw7rvvShnRyucAExOTdivp7OzsWLRokcZ57VXm3S3j+zlha2bItsh04q+oj1MvZwuChnd/aASg\npKQkQkNDsbe356uvvpLuu+nTp/Phhx9SUlKi8sy2d+9eEhMTGTBgAEuWLJEqsIKCgpg3bx47d+5k\n4MCBuLu7A633386dO7GysuLrr7+WApczZszgk08+4cKFC+zZs4cXX3wRaK0uXLduHVpaWnz++ecq\nVYA//vhju5V0HUFTqyPBo8G9jqvCwkLmzZtH586dGTVqFJWVlURGRvLxxx+zfPly6dnx2rVr/PTT\nT/Tu3ZuBAwdibGxMYWEhUVFRXLx4kSVLlmj8LamurmbhwoUYGBjg7+8vbX/p0qWsXLmSdevWUVlZ\nycCBA2lububEiRN88cUXWFtbq7XpXL16NeHh4VhZWeHn54eRkRGpqals3bqVuLg4Pv74Y43+eYKH\nk4MHD9LU1MTs2bPVkjq8vb3x9fUlKipKpfU1wMyZMyUBCFrbZPv7+7Njxw4yMjIk8fLEiRM0Nzcz\nefJklYoXmUzGjBkzOHXq1D0LIXK5XKO3Tnut1t544w2VhI5OnTrh6+vL0aNHuX79Os7Ozne1/9sl\nPll07UtxRgw3EiPp5NADbb3W74ne9qZs3rz5rvYjEAgEjyNCBBIIBHfNoUOHWLduHebm5gwaNAhT\nU1PKysrIyckhPDycCRMmMGHCBFJTUzl8+DDTpk1T28bhw4fR0dGRAjE5OTnMnz8faA0A2draUlNT\nQ35+PqGhoUybNg1tbW2mTJnCuXPnSExMZPTo0RoFpurqahYvXkxWVhZdu3Zl7NixtLS0EBsby5df\nfsmVK1c0HlN6ejq7d+/G09OTcePGkZOTw5kzZ7hy5QoffvghCxcuxMHBgVGjRlFYWMjZs2dZsmQJ\nP/74o2QcXl9fz8KFC8nPz6dv374MGjQIhUJBYWEh586dY+jQoY+9CKSpx7OSikZTii+nU15eTkpK\nCi0tLXh7e+Po6IiFhQVxcXFMmDCBuLg4SaSD1tZI0NpS5tKlS+3uu7a29sGclOCh4W48BXSNWj1o\nHmZPgTt5HClxGfoM16IPU5GfSfOVRDbfuIh7V0eN35F34vnnn2fLli0UFRVha2uLra0t/fr14+WX\nX2bBggUcPnyYZcuWMWbMGObMmUNRURHHjx/HycmJq1ev8vl3m5G5tmY858WG01Rfg66RGT3Gv4ZM\nJqPyRjY27n6kHd7ItQsH2XBgAN+9PVESUrS0tIiNjeXJJ5/kypUrODo6Mn78eKC13cc777xDWVkZ\nvXr1Ys6cOZI33NNPP01AQABbt27lhRdeoF+/firnVVxc3G6bkDtVWympLcmnqaFerdpK2abMzc0N\nfX196bOorKxUCey0R8+ePTl9+jQXL168o4Cl9M9RVkK2pT2/G2NjYwA2hkarnWdNyf/58Tj0pK68\nkOsXj6BnYoG+aev5acm1MbZ2xNCiCy1NTRRcPkP5tVRM7dywcO2Dtq4BeVcy2b9/P01NTYwZM0Zt\n/76+vtjZ2XHgwAG8vLzw8fFRWyYlJQVXV1e17OOHnX6uVvRztVJLbOjrYoUhtcya9QqjR4/mpZde\nYtOmTSQkJNDY2EivXr147bXXcHZ2lqrqoqKiqKqqwsXFhZkzZ0rBamjNuD5y5AgxMTHk5+dTVVWF\nqakpnp6eBAYG4uioWp2mqSril19+IS8vDz09PZKTk6XAora2NjNmzGDhwoUq2wgLC0Mmk/Haa6+p\nBKI7depEYGAga9as4ciRI5IIFBYWBsBLL70kCUDQGlScNWsW0dHRHDlyRBKBzp07J1UQ3toGcurU\
nqYSHh2v0fXpYWLVqFRERESpVioL7w+3G1Z1+rxMSEggKClJpReXv78+yZcvYs2ePNK4cHBzYvHmz\nWnC7uLiY+fPn8+OPP2r8Ps7Ozmb8+PHMmTNHasfar18/vv76axYvXoy7uzuffvqpFBgfOXIk//M/\n/8OuXbtUKkAiIiIIDw9nyJAhLFiwQCWQrhTBDxw4oOJrJnj4aHuP7j92npr6JhITE0lPT1dbtry8\nnJaWFq5fv063bt2k6Zra4CpbxlVVVUnTsrKyAPDw8FBb3sbGBisrKxX/vLslICCAkJAQ5syZw7Bh\nw/D09KRXr17ttpkzMjLS6FGofNZqe+wd4U6JT8bWjtj08qUw5TyXD3yPuZMH1y5qcS3sB+yszdXa\nGAsEAoFAFSECCQSCu+bQoUNoa2uzdu1aNSNvZan4sGHD+PHHHwkLCyMoKEgleJCQkMD169fx9/eX\nXrwiIiJoaGjgww8/xNfXV2WbVVVVUlDoqaeeorq6WhKBlMG/tmzYsIGsrCxmzpzJc889J01vaGjg\nk08+YefOnQwdOhQ3NzeV9aKjo5k/fz4BAQHStDVr1hAWFsb777/PM888IwUuAHbs2MHPP//MkSNH\npBe0uLg48vPzeeqpp3jttddUtt/U1HTbFj2PA3dqt1Rj2JnMtCQ27ArDtLkEXV1dKbjk5eXFxYsX\naWxsJCkpCScnJ+n+Uxq7v/7660yePPkPOReBOmlpaezdu5fk5GQqKiowMTHB2dmZcePGqbRBuRu/\niFmzZgGwbt26DhvPeplWsDd8C7XlRTQ31CLXM0TfxILa8iIMOv3Whz09bBNVhVf4/GCoyvpNTU3s\n2rWLiIgIqR1VQEAAgYGB7Z57c3Mzhw8f5ujRo1y9epXm5mYcHBwYO3YsEydOVDnGtsHZF154ga1b\nt5KQkEBFRQWffPKJ9L1WWVnJoV//S3L4CRqqy5FpyTG0sMO291BM7bqq7F/PxAKXYc+SH3+csivJ\nINNi586djB8/ni1btjB79myNx93W+0OJXC6nc+fOeHl5sWHDBpWM0K+++orS0lLkcjmvvvoqMpkM\nGxsbQkJCaGlpYfzEKWRkZNLddRgtzc2UZMdj2qUrvZ+eK30G2Sf/i5a2LigU1JTcIHjzWhoSD1Jw\n/QpOTk6UlJTg5ubGggULmDt3Lt999x1xcXFYW1sTGRlJTEwMvr6+NDc3qxy3i4sLf//731m5ciVz\n585lzJgxuLi4UFNTQ05ODkVFRRrPtyPVVkqaGuq4kXCCeJ0nyCmsxMXGhPT0dI4fP46RkRFDhgwB\nWgWpNWvWsHr1at577z21AEpVVRUFBQV07dp6HUePHs2OHTs4ePAgfn5+eHp6qizfVsCytbUFkFqc\nKcnKymLnzp0aj7t79+6EHzvJqRMRdOn7WxVM5Y0sSnNahSNdo044+U7h6vlgEnd/g5mTO/pmNqBo\noaGqjKqiXBprKpDJtdHSbh2nMi05Vt0HUJMVxZdffomTk5PKbyi0ihfK5IylS5fyz3/+E3d3d0nw\nKS4uJj09Xaq2fdREICUuNiZqwenCwtbEg4KCAubPn4+joyOjR4+WEkkWLVrEypUrWbZsGYaGhgwf\nPlyqKvjoo4/417/+JQUCExMT2blzJ15eXvj5+WFgYEBeXh5nzpwhKiqKL774QmP7tPTsa7ww43WM\nO1nSJNfHxNSMmpoataqInj17qjyr1dbWkp+fj6WlJQ4ODmrbVa6nDEoCkmeYyYzokQAAIABJREFU\nMkGjLfb29lhZWVFQUEB1dTVGRkbS8rfe79AaXHR1dW1X2BQ8HmgaV0puFYgcjFofLm1sbHjppZdU\nlu3fvz/W1takpaVJ09oLbFtZWTF06FBCQkIoKipS82/R09OTfv+U+Pv7s3r1aqqqqnj99ddVBJ3e\nvXtjY2OjMlYAgoODkcvlzJ07V601ZmBgIPv37+f48eNCBHpI0
ZTUlpSaS31lCctXb8Te0ohOhuot\nTwE1Hz9N96Im7zulKN6ev6m5ufk9iUCvvfYatra2hIeHs2vXLnbt2oVcLsfHx4dZs2apCT7tjaE7\n+fa1R0cSn+wHjEPPxIKitAsUp0cj1zOk0xMBfLx0nuRrJxAIBALNCBFIIBD8LuRyucb2BEpRR1dX\nlzFjxrB3717Onz+vkrF86NAhACmzuy2a/AGUGcwdobKykmPHjtG9e3cVAUi57ZkzZxITE8OJEyfU\nRCAPDw+14NWoUaMICwvD0NCQ559/Xm3ezz//rPZS1955aGtr33Vf5L8SHWm3ZNLZlTwFbNwbTh/z\nBnr16iV9lt7e3hw/fpzQ0FDq6upUgkzK9hpJSUl/aREoNTWVPXv2kJycTFVVFWZmZvj4+DB16lSV\n7LeMjAyOHj1KQkICxcXF1NfXY2Vlha+vLy+99JLamIqIiGDVqlW8++67mJmZsWvXLrKysqipqWm3\nlc7mzZvZtWsX7777LqNHj+bw4cN89913aGlp4evri1wuZ/PmzZLPhFIEulu/CLiz8ayjo6MkKl2/\nfp1r165hYmmLThcvdAxMqCrM5UZiJABach1itv4TgOria1ibGatUdCgUCj766CN+/fVX6uvrMTEx\noaSkhKtXr3LlyhWNn0VUVBTvv/8+GRkZ6Orq0qVLF7p160Z9fT3/+te/SEtLY968eUCrqKX0TTl0\n6BBr1qxBS0uLkSNHMmTIELS0tNixYwdHjhzh2LFjlFdW06htjKGlHUZW9tSWFZJ59GccB03Eqvtv\nGcotzU1khG+h+mYehuadGRIwnJ62huzYseOOQdT2Amlubm5qLUGU95m9vb1KKxNoreBpkuvTUNOa\nDFBfUUxLUyPG1o5o6xlKy9n1HU1lfiYVeRk0VJdTnpvKDTc7Xpk5EycnJ/73f/+XHj164OjoyPLl\ny9myZQtRUVHI5XIaGxvx8PDAwMCA1NRUDhw4IFXhKK9f7969cXR0JDMzk9jYWIyMjHB0dOSFF17Q\neP4drbYCMLF15mZGLNXFeXzddBk3c20iIyNpaWnhrbfekgTpsWPHkpGRQWhoKLNnz6Zfv37Y2NhQ\nWVlJQUEBiYmJjBkzhrfeegto/e1csGABn3/+OYsXL8bHx6ddAWvUqFHs2bOHDRs2kJCQQJcuXcjL\ny+PChQsMGTKEyMhIteMeO3YsazZs4UbiKWpLC9DvZEV9RQkV+Rl0cuxF2dXLrdfXzQsDc1su7/+O\nvLijyLV1kevqI9czQq6tQ0tzE3qGJlh16y9te2LQGxxbn0F+fj4ymYyNGzdiaWlJeXk5eXl5JCcn\nM336dJ5//nnWrl3Lr7/+SlRUFOHh4WhpaWFubo6bmxtBQUHttpt51ElMTGTatGkaE0nmz5/PsGHD\nNFYV7Nu3T0oo8fb2ZuvWrWrjLjs7m4ULF7J582Y++ugjaXrClZsk55ZyLu0kdl4B6Pf252bGWupp\nxKz/M1w7u0ulKkJLS0ulak0ZbGwvs1pZ6dM207umpkZl3q1YWFhQVFQkiUAdCWgKBLfSXkV5fVUZ\nubmlOPXw1NjOysrKSmofrOTy5csEBweTkpJCWVkZTU1NKvNv3rypJgK19/tnZmZGXV2dxop/S0tL\nFQGqvr6e7OxsTE1N2217qKOjI9oYP6S0l9Qm123tDOE6+T209fR5e5KXmofgvaB8xigrK5P8GdtS\nWlp6T9vX0tLiqaee4qmnnqK8vJykpCQiIyM5deoUV69eZd26dQ/Up6qmvumOy8hkMqx7DsK65yBp\n2oiAHhgZGWlM9BEIBALBbzy+0UiBQHBXtA0SGti7U5qcypw5cxgxYgSenp64u7urVQVNmDCBX3/9\nVcpshtZKobNnz+Lo6KiS+Tl8+HCCg4NZvnw5Q4cOpW/fvri7u2ssMb8daWlpUtaRJm8YZea4ppcq\nTaX4yn7OmoKhynk3b96Up
nl6emJpacmuXbvIzMzEx8cHd3d3jes/bnSk3ZKhuR3auvqU56Zy8Xoj\nz0/+TShUBqqUme5t2+R0796d3r17c+bMGcLCwhg7dqzatnNycjA3N1e7Tx8VwsLC+Pbbb9HR0cHX\n1xcrKyvy8vI4fPgwUVFRrFy5UgpUHD58mLNnz9KnTx/69u2LQqEgIyODX3/9lYsXL/LVV1+pBTAA\nlXZUTz755G2zCZ988kl2797N4cOH6dGjB+vXr8fQ0JAVK1bg5OTEt99+i6urK0uXLpWy0+/WL0LJ\n7YxnV61ahbW1NZ06dcLf35+9e/eipaWFvbU5XZwNMR/0DBdTrmLS2YWCpDNoyeVY9/LFycqYyvSz\nNNeptqpIT08nOjoaCwsL3nzzTRQKBadOnaK8vJwzZ86oid/bt2/nyy+/pKCgAD8/P8aNG8fVq1eJ\njY3FwcEBf39/jh07xtChQ6Uqx+bmZlJSUmhqasLf35+RI0fi7OzMqFGj+Mc//sHly5cpLCzE1NSU\n8ZOfISK1jKqCHDrZ98Bl2PNkhG3m2sXDdHLoiY5Bq6BXePks1TfzMHNyx3X4C8z7f/642Jjw/PPP\nSz4qt3KnQFrPvuoDVnn+ykDErbQoZKBo/Q5ubqwHQMdANYvbuocP1j18qCsvJjlkHSa2LgS9vZjn\nhneXBB1lUNjDw4PPP/9cWnft2rUcOXKEqqoq7O3tOX36NKdPn1bZvoGBAU888YRKKyBNKAXObZHq\nLVuc/Z7C2e8ptem6RuY4DppIXmwEF04d47qZPl27diUwMJD+/furLPvmm2/i4+PDwYMHiYuLo7q6\nGmNjY6ytrXn22WcZOXKkyvIDBw7km2++YdeuXcTFxbUrYFlYWLBixQo2bdpEcnIyMTExODg48Oab\nb9K3b1+NIlCnTp14fvZ8Vq/7F1WFV6gqvIKhhR1dR71MQ1WZJAIBGJjb0m3MdEqzE6m+eZ3G2koU\nLc3oGppi2qUb1r0Go2fcen1kMnh9yhC+e+cyx48fJzw8XDJKNzU1xdbWlpdffllKsOjUqRMzZsxg\nxowZt7020DrGb/U8UqKsPntUsLGxUUskGT16ND///DONjY3tVhW0TTJp77fL1dUVLy8vYmNjaWpq\nQltbm0OxV1mxO4aK2gb0jM3o7DkcaPUPA8hrMCCrrAX5+d/ap7a0tFBZWSk92ygzvNsLKiqnt80E\nV34vlJaWanx+KykpUVlH+a+yFaSSuro6pk6dSn5+vkoAvqGhgcDAQBobG5k3b57KGAoNDWX9+vW8\n8847Ks8Azc3N7N69m/DwcIqKijAzM8Pf35+XX35ZLeHg3LlznD59mrS0NOnZzsHBgdGjRzNp0iSV\na9Q24URZsQqt11oEIh8cd6oor6htIOLyTQ5fylULvsvlchRtVjx79iyfffYZurq69O3bFzs7O/T1\n9ZHJZCQkJJCYmKixgr+93z+5XH7byoi21atVVVUoFArKy8vZvn37nU5b8BBxu6Q2Iyt7am7mSd56\nmjwE7wU3NzfOnj1LcnKyynsQtFaZFxd3PKnlTnTq1Ak/Pz/8/PyoqKggPj6eK1euqLSxu98Y6v2+\n8OTvXU8gEAgeN8S3pUAguC2ag4QOlNsPp/xGIld27MLUYB8ymQxPT09eeeUVSUzp3Lkz/fv3l/rX\n29nZERERQWNjo1oVUI8ePVixYgX//e9/OX36NMeOHQNas+2CgoIYMWJEh45X6Q2Tnp6usRezklvL\n8EHzS50y4Hm7Mv22WYOGhoasXLmSbdu2cf78eWJiYoDWLO8JEybw0ksvPZbVQB1ttyTT0sLYxpmy\na6k0ApYOv71o2NjYYGdnR35+PlpaWmrtYxYsWMAHH3zAmjVrCAkJoWfPnhgZGVFcXExOTg5Xrlxh\n5cqVj6QIdP36db777jtsbW357LPPVMxm4+LiWLJkCT/88IPUa/6FF17gzTffVBMew8LCWLN
mDQcO\nHFALSEJrS8Rly5bd0ZMEoAYDjDp3Jfx0NHlVa6isqeeVV17BycmJ2tpaTpw4gZWVFQMGDJCO4279\nItqiyXjWxcWFY8eOSaKTubk5aWlpmJqa4u7uTlxcHONH5vP3+VO4lOPHqn/moC2X8eO6pbjYmEjG\n4W1JSEjAxMSE//znP/Tt2xdobUP59ttvo1AoVAJI8fHxbNu2jdraWkaMGMHWrVul7wVldZWnpycy\nmYzjx49LIpCyndrgwYP597//LWVV5uTkcPnyZamybejQofzP//wPCzafJf7KTZob6tDW1aezVwBZ\nJ3ZQlnsZ6x6tnh43My8hk8mw7zcGbxdLqX2Ora0tkydPVgsydSSQtv/iVcZqCKTdDm35b0FSuU5r\nW6/GOs094RtrK6Xlbn2Bv7XFnxLld7Gm1qG/l44ED/SMzej/8jLpb7eAQN4c58HTg9Tbb7Vl4MCB\nku9KR3BycpKqxm6Ho6MjS5Ys0TivPXHEycmJbqP+pj7DFiy79lWZZGrXVa3l4K3IZPDeJC8puDVy\n5Eg1Yetx435W1ZmZmakF9C5cuMDBgwfJyMigoqJCrSViRUUFV8pbVMa2gZktsv/bt4GFHTUlN6gq\nuoqOgSlJ2deIzS6mn6sVqampKtszMDDAzs6OGzdukJeXR5cuXVT2FR8fDyC1NFSeZ2ZmJomJiWoi\nUH5+PsXFxdja2krjWLluYmKiinCjr6+Ps7MzFy5cUKlESk5OloLycXFxKvdbXFwcoN6KbuXKlSQl\nJTFgwAAMDQ2Jjo5m9+7dlJWVqQnkmzZtQktLi549e2JpaUl1dTXx8fH88MMPpKenq4zNqVOncu7c\nObKzs5kyZYqasCW4/3SkohwABR0Kvm/duhUdHR2++eYbNU+tdevWPdBWhMr7xM3NjdWrVz+w/Qju\nP7dLarPuMYibGTEq3nptPQSbmppITU2ld+/ev2vf/v7+7Nixg5CQEMaMGSO1iVUoFGzevPmu26+1\npbGxkYyMDKkNt5Kmpiap4vNBt2vt6/L7xLLfu55AIBA8bjx+kUiBQNBhbhcktHTzBjdvmhvreKKn\nHpRkExYWxrJly1i/fr0UaH/yySe5ePEiR44cYcaMGRw+fBhdXV1GjRqlts1evXqxdOlS6SE0JiaG\nkJAQvvzyS0xNTaWA7O1QvlRp8uT5o7CysuKdd95BoVCQm5tLXFwcBw4cYMeOHSgUCl5++eU/5bj+\nTO6m3ZJxZ1fKrqUi19WnXEtVsPH29iY/P59u3bqpBVqsrKxYtWoVISEhnDlzhuPHj9PS0oKZmRlO\nTk5MmjQJZ2fn+3I+fzQHDx6kqamJ2bNnqwhA0PqZ+Pr6EhUVRW1tLQYGBu0aVI8ZM4Yff/yR2NhY\njSKQr6/vHQWgtsJwuZYTucWnuXI4DC1tXUKzwCm7mILUaOrq6njuuedUAp936xehpD3j2evXr0vn\npRSVAgIC2LhxIykpKeTm5rJ161bGjRvH04NcCenc2mrqdqbSdXV1dOvWTUVkdHR0xMPDg4sXL6qI\nuCEhIdTV1WFvb4+pqSm//PKLyrYaGxvZvn07nTt3Vqs+dHJyolu3bhrbaigz7Kurq9m2bRsWpZXc\nSMiQ5jfVtbZcqitvHVfNjfXUV5aga9QJfVMLgoarVjX26dNHRQS634G0tlga60v/1zO1Qktbh9rS\nApr+T8BqS1VBDgAGlnYdfoFv2/rxfolAj0vQ4X4er5dz6312v7KbH3UeRFXdrZUDwcHBbNiwAWNj\nY/r27Yu1tTV6enrIZDJJjGhqauLnk5kqY1uu+5vAZOHqxc2M/8/efUdFdeaPH38PHYYOgkoRUCxI\nVRS7WKOxl9iSKIma/RqzxiSaXc0mmmZi4mY1ZU1ZN8Yo6i+aGGwYQVHWAhZEmgIigtIEFIahw/z+\nYGeWcYYqGojP65ycE+/cOsCduc+nPDHkxkeia2CEQqE
gODIFLydLduzYoXEOY8eO5ccff+Tf//43\na9euVd3Pi4uL2bNnD4Ba8GbcuHEcP36cPXv2MHDgQNV3wdraWrZt24ZCoWD8+PGq9QcNGoSpqSmn\nTp1i8uTJahXZJSUlVFVVqZJ7oC7Qo0wCUQZ9oG4ANC4ujs6dO2t8/mVnZ/PVV1+pWt09//zzrFix\nghMnTrBo0SK1hIR169ZpfNYoFAo2b97MiRMnmDRpkuoetGDBAvLy8rh58ybTpk1r8HNXaDvNqShX\nUihQG3zXJjs7G2dnZ40AkEKhICEh4WFOtUlGRkY4OzuTkZGBTCZTa8UotF9NJbUZWdiq5tZLOvQ1\n5l26c9vcBqu8C9SWF5OYmIi5uTlff/11q47fpUsXnn32WXbs2MGf//xnhg8fjlQqJSYmBplMhqur\nK+np6a3ad2VlJW+++SZdunShR48e2NnZUVlZyZUrV8jMzCQgIEDjb6WtudiZ4eVs3ex5GqHu+0hj\n3+sFQRCE/xFBIEEQtGruIKGuvhGHb8JHz85HoVBw/PhxEhISVO3fBg4cSKdOnTh+/Dje3t7cuXOH\n0aNHNzrPj76+Pn369KFPnz507dqVzz77jKioKFUQSDkIoS3bqWfPnkgkEhITE1t55W1HIpHg7OyM\ns7MzgwcP5oUXXuD8+fNPZBCoOT2elex6B2DXu25wt7xK/We8fPly1Rwa2hgbGzNnzhytlSQPGjNm\nDGPGjNFY3l7auNTPKD90MorSimri4+O1VrgVFRVRW1vLnTt36NGjB9XV1YSGhnL69GkyMzORy+Vq\nFSz1WxjW17Nnz0bPKS23mDW7olT3BfOu7hiaWlF48ypSW0du3Kthza4ojJN+RldXV22wD1o+X4RS\n/f+v/74kpGZQU6tQGzicPn065ubmHDlyhOjoaLKzs5k/fz6+vr4UFxc3OeeInp4eVlZWGhV7tra2\n6Ovrq1X+Xbt2DYVCQWFhIYWFhVy4cEFtm8LCQsrKyjAzM6OsrEy1XFnR9OD74OzsjJubG//5z3+4\nf/8+2dnZREZGIpVKMZZVcDOvWO2eXFtVCUBNZV1lo76xVK0yQ+nB47T1QFp9psb6WEnrMkV1dHWx\ndvUiP+Uy2bEncRowUbVehayQu9ej0dHVZdjwkc1+gA8ICKBLly4cPnwYb29v/P39Nda5du0arq6u\nzc5YfVIGHVp7ncsneKpVt/i62Ha4a3+UHlVVXX01NTUEBwdjZWXF5s2bNebpUc5zknG3pNGfr5m9\nC7bu/clPuUT5/bp2n0d/2UNW+L+wt7HA2tparQpv5syZXLp0iaioKP785z/j7+9PRUWFqkXmrFmz\n8PDwUK3fp08fZs2axf79+1m+fDlDhw5Vzft269YtPDw8mDlzpmp9IyMjXnnlFTZu3Mhf//pXhg8f\njpWVFYmJidy5cwczMzOKi4tV68fGxtKjRw+GDBnC119/zZ07d3BwcCAtLQ2ZTKY2/6RSUFCQ2gC7\nkZGRKps+NTVVrVJPW7KBRCJh6tSpnDhxgpiYGFUQSHi8mltRXt/VW4Wk58kavF/Z2dmRlZVFYWGh\n6m9KoVAQHBz8WObimT59Op9//jlbtmzhtdde00huKikpITc3V63aTvh9NSepTTm3Xl7SeWS5N5Hl\n3OBoaRre7s4MHTqU4cOHP9Q5PPPMM9ja2nLgwAHCwsIwNjamX79+vPDCC7z99tsNJhY0xdDQkKCg\nIOLi4khKSuL8+fOqitCXX35Za6vtR+HZEe5qzxqNkUjQSHwSBEEQGiaCQIIgaNXYIKEs5yam9i6q\ngQLlIKHZf3u61x94k0gkTJgwgR9//FHV7mDixIka+0xKSqJ79+5q7Z7gf33i6+9TOYh79+5djf1Y\nWFgQGBjIyZMn2bNnD3PmzNFowaJsJ2Zvb9/oe9AaGRkZmJuba0xyrMzsf9Rl9O2V6PHcfNoyyhOu\nZ1IhK+SDLdtwsJF
iYWKgdVtlm8NPPvmEc+fO0blzZwICArCyslJVnISEhGjtcQ+NT8JdVFpJQmoe\n7vW6X0kkEmzc+1GQFktVmYyqUhkVxQVcvhTHM5PHagxUtnS+iPq0vS/pufcpKavk36fSsXLNVwUq\nRo8ezejRo6mqquLy5csMGzaM8+fPqwWoG2JkZIRMJlPNraGkq6ur8b4pM9Tv3LmDlZWVxrxiDg4O\nQF1gsX6WuJmZGSUlJRotz3R0dPjwww9Zs2YNBw4cUE3UbmxszNSpU/EZOYn90ZlcvaU+EKZrYIS5\niQF9uxhrHWSuP6fHoxhIe1D3zuaUS+o+G7r6jqEkL4O716MpLcjC1N6FmopS7mUkUltVgdPAiSyd\n3PyKHj09PdauXcs777zDu+++S58+fVQBn/z8fFJSUsjJyWHHjh0tut8+KYMOrblOFzszEfRpwKOs\nqquvuLgYuVyOj4+Pxn21vLxcVWUZn6k9wF+f08BJGJnbcuNkMBWyQgrT4zEbH8j777xOUFCQ2r1Z\nT0+P999/nwMHDnDq1CkOHTqEjo4Orq6uvPTSS1pb9QYFBeHm5sahQ4c4ceIENTU1dO7cmeeff57p\n06drBNiHDh3Ke++9x9Z/bWfvr6Ggo4ube2/Wvv8pryx+VhUEksvl3Lhxg1mzZqnmwoiNjcXBwUHV\nmu7BOTJA+3yPynmGlC2OlGQyGT///DMXL14kJydHo3VwQwkUwqPXkoryB7dr6P41ffp0vvrqK1as\nWMHQoUPR1dUlKSmJjIwMBg4cSHR09MOccpPGjRtHamoqR44cYenSpfj5+WFnZ4dMJiM3N5f4+HjG\njh3baPKT8Hg1N6nN2MpebV7BRYE9tX5v+OijjxrcR0PJaqC9/WppaSk5OTmqeTiVvLy8tLaJffDY\nenp6zJo1i1mzZjV4TvU1ljTX2Jx+TfFztWXlJK8mP1sfbEkrCIIgNO3JG10TBKFJTQ0S3jz9/9DR\nM8DE1gFDU0sUCrh+9BbdTSvw7ttbo9XT+PHj2b17NwUFBbi4uNC7d2+Nfe7fv5+rV6/St29f7O3t\nMTY25tatW1y6dAlTU1Oeeuop1bpeXl5IJBJ++OEHbt26paoqmjt3LgD/93//R1ZWFrt27eLkyZN4\neHhgaWlJYWEhmZmZpKSksHr16kcSBIqJieH777+nd+/edO3aVdXTPyoqColEopYB+yR5UtotPayG\nMsp1/9tGy3XKa+gZGvHKZO8GM8pTUlI4d+4cvr6+rF+/XtVyCOoyXPfv39/g8RuahwXgToEctFTR\n2HT3Q9/o/1EpL6I4K5Xy4rsoFFBspvmw29L5IpRuF5RoHbjW0a37GnMtI4c1u6J47YH3RSaTYWlp\nyauvvoqpqSlXrlzRmID8QVZWVigUCq2T7j7YssXExAQjIyNGjRqFoaEh27Zta9acX429z6ampqxY\nsYL09HS8vLwYNWoUR48e5dChQ8jlcj59/XWNeUd8XWzZkBNKTk6Oav61+uLi4lT//ygG0h5kY2bE\nM/99gNczNKHnU4vJTfgP9zOSuHvtHDq6+khtumLvMYR1L81o8QO8i4sLX3zxBQcOHCA6OpqwsDB0\ndHSwsrLCzc2NBQsWNFnx9aAnZdDhSbnOx+VRVtXVZ2lpiaGhIampqZSXl2NkVPeZUF1dzbfffqsK\nlJRV1jS2G6Du/mPXZxBFt68hy72F18zXGRHYk6KiIsrLyzXa/RgYGDS7wlZpxIgRzZ7LMeZmPrti\ny0i3H4e5fV2meT7wwdFbmHTrh2VpNoaGhsTHx1NbW4uPjw9OTk5YW1sTGxvL008/TWxsLBKJRGur\n0cbmdKxfUS6Xy3nttdfIzc2lZ8+eqqp1XV1d5HJ5owkUD8rLy2Px4sWMGTNGY94hoXVaUlHe3O0m\nTJiAvr4+v/76K+Hh4RgYGNC3b19effVVzp49+8iDQADLli3D39+fo0ePEhsbi1wux
9TUlE6dOjFz\n5swnfp619qY9JLUVFRUhlUrVvm/W1NSwbds2KisrGTx4cJsd6/cywc8Ze0sTgiNTNBKfQLSkFQRB\naC0RBBIEQUNTg4RdfMcgy75BWWEOxVmp6OjqYSC1wH/0FNa/+oLGIKilpSX+/v6cP3+eCRMmaN3n\npEmTMDU1JTk5mcTERGpqarC1tWXSpElMnz5dLYveycmJ1157jV9++YUjR45QWVnXEkkZBDIxMeHj\njz8mNDSUU6dOcfbsWSorK7G0tKRr164sWbIEPz+/h3mLGtSvXz/u3r1LQkICUVFRlJaWYm1tja+v\nL9OnT9eYbPNJ8aS0W3oYjWWUS20dKC3IouRuBhYOPRvNKM/OzgbqWjHWDwABJCcnq/5eWiI9T0Zx\nWSVmWsbV9Y2k2Pb0586l37h9MRQkYGhqTa6kk6p6JD8/H1tb2xbPFwFQICsnIfMefbVMVWQgrau4\nKy3MQuHmwz8OXeV+VhpzJo4gJydHLaikDP40FoCBugBDcXExP/74Ix9++KGqOrG8vJysrCy1VkC9\ne/fmwoULjB07lrCwML799luWLFmiUdFYWFiIXC5vdi91d3d3+vbtS3x8PKNGjeLjjz/m2Wef5fz5\n83XnaGcGpQVYWdmp3kPl3B3bt2/nr3/9q+o6c3Nz1TJAmzOQZmhqSb/n1qktq7+dtoxSpfqZofUf\n4B38xuLgN1b1WkMP8A1lrD7IwsKCRYsWsWjRoibXba4nZdDhSbnOR+1xVNUpSSQSpkyZwr59+1i+\nfDmDBg2iurqaq1evIpPJ8Pb25urVqxgb6Da5r6qyEvSM1AMj+pIavvvuO4DHOoDYVCu9UpPO3EhO\n4Lt9xzGvKcTAwED1Pcrb25tLly5RVVVFQkICzs7Oqvtha/z222/k5uYyf/58jQz2a9euERIS0up9\nCw+vOYPo2j676m+nreqioWoLFxcXrZUMzf38e1BjFR8DBgxQa0sotF85bkAlAAAgAElEQVTtIant\n7Nmz7Nq1Cx8fHzp16oRMJiMhIYE7d+7g5ubGlClT2uxYvyc/V1v8XG21Jj49Sc+HgiAIbUkEgQRB\n0NDUIGGnnv506qk5D4PP0J4YGxtrLFcoFNy8eRNDQ8MGM9r8/PxaFJjRVgZfn56eHpMnT2by5MlN\n7quxQUc7O7tGH/gefM3JyYklS5Y0ecwn0ZPSbqm1Gsso79RzIAWpl7lz6TcMzawxMrdVyyivrq7m\n+vXrqko6gPj4eLUHwaKiIrZu3dqqc2syMOwzioIbV5DlpqOoqcbazZfs2JNs+PQSJlX3MDExYcOG\nDS2eLwLgRk5xA0cFU/tu5KdeIj/lMp09R6BvJGXde+9zZK8DWVlZ3L17l65du/L666+TkpKCra0t\nurq6VFZWagRqlJycnDA1NSUqKopXXnmFgIAAampq2L9/v8b9bdq0aVy4cIGMjAy8vLw4evQo0dHR\neHt7Y2NjQ35+PteuXSM3N5eFCxc2GQTKzc1FoVDQuXNnVq1axVtvvcXnn3/OTz/9RHJyMiYmJmza\ntIn09HRu3brFpk2bVIOeM2bM4Pz585w9e5ZXX32Vfv36IZfLiYyMxNPTk6ioKODxZrF2xAf4jnjO\nrfGkXOej9Diq6up77rnnsLCw4LfffiM0NBQTExP8/Px47rnnCA4OBsDTyQYu5DW6n7xrUdxLj6O8\nKI9KuYxbZ3/l/yWWUl5SRP/+/Rk6dGirrqulmtNKz6yzK1kK2PZLGF5WlfTu3Vt17/bx8SEiIoIj\nR45QXl6utQqoJbKysgC0tgyNj4/Xuo2y3XBNjXoFlrW1NVu3bm313ByCpvYw+C4I7SGprVevXnh4\neJCQkKBqS2xvb8+cOXOYPXt2g99vOyrRklYQBKHtiCCQIAga2nqQ8MyZM+Tm5jJx4kTxQPwEE22I\nGtZURrmRhS3OAVPJiAoh6dDXmHfpzm1zG6zyL
lBbXkxiYiLm5uZ8/fXXuLu706dPH86ePcvq1avx\n8PDg/v37XLp0CQcHB435JJqjqcCwaScnLB17UZx7kyp5EYraavKSznJN3plRAd5q1T0tmS8iPU/G\nPXlFw++LuQ2GptZUl8m4dmgrls4elOmYcjU+kYK7uRgbG3Pv3j3VZLeFhYWEhISwbt06+vbtS1JS\nktp8OVCXcf/Xv/6Vffv2ERYWxqFDh7C2tqZnz57cv39frR2Qj48PixYtYseOHRgYGNC5c2eys7MJ\nDg5GLpdTVlZG165dWblyJYGBgU2+zzdv3mTDhg24u7vj5OTEgAEDOHPmDKdPn6a4uJhu3bqRlJSE\ns7MzkydPplu3bqpt9fX1+eCDDwgODiYyMpKQkBDs7OyYO3cugwcPVgWBfo+BtI74AN8Rz7k1npTr\nfBQeZ1Ud1LUwmz59OtOnT9dYd+XKlaq2Y14Xc4nLKNR6bADzLq6U3ctBUVuDnqEJtfkpdO3pzchn\nZjJ16tQmqyXbSnNa6ZlYdUHPwIiizOtculPF7Cn/qyZXtuv86aef1P7dWsoEiri4OFxcXFTL09LS\nVMd4kLI96N27dzXmUnJ0dHyo8xHUtYfBd0GA3z+pzc3NjbVr17bpPgVBEIQngwgCCYKgoa0GCfft\n24dMJuPYsWMYGRnxzDPPtMXpCR2YaEOkXXMyyq3dvDG2sicv6Tyy3JvIcm5wtDQNb3dnhg4dyvDh\nw4G6zOS3336bnTt3cvHiRQ4ePIiNjQ3jx49n7ty5vPzyyy0+v+YEhq27+1J6L4dOPfrjOqLub33Z\nUx5MH+iqsW5z54u4kp5P3+mvNvh6F+9AungHUpgeT/71aApvxqKoraWzRy/W/GU106dPV8uILC8v\np6qqiujoaBITE6mtrWX27Nka+9XT02PevHnMmzdPtWzz5s2Eh4ezc+dOtfaUs2fPxsPDg4MHD5KY\nmIi+vj69evXCxsYGb29vRo4cqTExubGxsSpzv74ePXowe/Zs4uPjuXTpEiUlJVhYWDBv3jymTJlC\n//5aeuLVY2JiwpIlS7RWI9YfbBYDaYLw8NrD3BDaNDVAadbZDbPObkDdAOVHzwY89s/c5rbSk+jo\nYGrXjfu3r1MF2Dj2UL1mZ2dHly5dyM7ORkdHB09Pz4c6p9GjR/Pzzz/z3XffERcXR9euXcnKyuLC\nhQsMHjyYyMhIjW18fHz4+eef+fLLLxkyZAjGxsZIpVIGDhwo5gR6BH7vwXdBAJHUJgiCIHRcIggk\nCIKGtsq2++GHH9DT08PJyYkXX3yRTp06tfWpCh2QaEOkqbkTHhtb2dNtyDTVvxcF9tQ6yGFmZsay\nZcu07kNbz/qGeuIr+brYas0or6+sMAcA257/C1Q8bBuW5r4v1i6eWLv8bwDw+cCezNHyvhgZGfHy\nyy83GAhrLCu/fqb9gzw8PPDw8GjWuTY2Z4CtrS0LFy5s1n4ehhhIE4SH117bU3WEAcqWtNIz7ezK\n/dvX0TUwokhHfc4fHx8fsrOz6dGjB1KptIE9NI+1tTUbN25k+/btJCYmcvnyZRwdHVm2bBm+vr5a\ng0D9+vVj8eLFHDt2jF9//ZXq6mrs7OwYOHDgQ52LoF1H+N0WngwiqU0QBEHoiEQQSBAErdpikLA5\nk3sLTy7Rhuh/2mtGuVJTgeFKeRH3bsVjZNEJU/u6yp+2qB5p7+9LRyUG0gTh4bXn9lTtfYCyuQF+\nALveAdj1DgCgvKpW7bXly5ezfPlyrdt99NFHDe6zocQHJycn3n77ba3bNPSdVluLvry8xudlElqv\nvf9uC08OkdQmCIIgdDRilEQQBK3EIKEgPD7tNaO8Pm2B4cKbcVTICriXHk9tTTVdfUYhkUjarHqk\nI7wvHZUYSBOEh9eeq+ra8wClCPALD6M9/24LT57HndS2Zs0a4uPj1QLTcXFxrF27lvnz57NgwYJW\nrSsIgiD88
Ylv0oIgNEgMEgrC49GeM8qVtAWGC1IvUZKXgb6JOY79n8LSuU+bBoY7wvvSkYmBNEF4\nOB0hYaY9Vt3+0QL8D95DHaXNiAoKD609/m4LgiAIgiC0VyIIJAhCo8QgoSA8Hu05o1zpwcCw+7gg\ntdcfRWC4I7wvHZ0YSBOE1hMJMy33Rwnwx9zMZ9fpFI3rqCi5T2bmPXoVlPxOZyYIwh/V66+/TkVF\nRau379mzJ1u3bsXc3LwNz0oQBEHoCEQQSBCEZhGDhILwaHWEjHJ4/IHhjvK+CILw5BIJMy3X0QP8\noTEZjX4uFZdVcuhSBuOuZPKUr9PjPTlBEP6wOnXq9FDbGxoa4ujo2EZnIwiCIHQkIggkCIIgCO1E\nR8oof5yB4Y70vgiC8OQSCTPN15ED/DE385s8bwAU8I9DV7GzMG5X5y8Iwu+jvLyc+fPn4+7uzief\nfKJaXllZybx586iqquL1119n1KhRqteOHDnC1q1bWbFiBePGjdM6z09LNDQnUGpqKidOnCAuLo78\n/HwqKiqwtbUlICCAuXPnYmpqqraf8PBwNm/ezMqVK7GxsWH37t2kpaVhYGDAgAEDWLp0KVKplLS0\nNHbu3EliYiI1NTV4e3vzpz/9CTs7u1advyAIgtB6IggkCIIgCO2IyCjXTrwvgiAIfywdNcC/63RK\nsyqYABQKCI5MaXfXIAjC42dkZIS7uzvJycmUlZVhbGwMQGJiIlVVVQDExsaqBYFiY2MB8PHxeaTn\nduzYMc6dO4eXlxe+vr4oFApSU1M5cOAAly5d4u9//7vqfOuLioriwoULDBgwgIkTJ5KUlER4eDh5\neXksWrSIt956i759+zJ+/HjS09OJjo4mJyeHL7/8EolE8kivSRAEQVAngkCCIAiC0A6JjHLtxPsi\nCILwx9HRAvzpebIWzWUEcPVWIel5snZ5PYIgPF4+Pj4kJSURHx/PgAEDgLpAj46ODp6enqqgD4BC\noSAuLo7OnTs/8sqZZ555hmXLlqGjo6O2/Pjx43z++eccPnyY2bNna2wXFRXFhx9+iKenp+qc33nn\nHa5cucL69et55ZVXCAwMVK3/+eefc/z4caKjowkICHik1yQIgiCo02l6FUEQBEEQBEEQlNasWcOU\nKVN+79MQhD8MFzszpg90ZcFwd6YPdG23AZMr6fmPdTtBEP5YlBU99YM9sbGx9OjRgyFDhpCfn8+d\nO3cASEtLQyaTPfIqIAA7OzuNABDA2LFjMTExISYmRut2I0eOVAWAACQSiaqSqVu3bmoBIIDRo0cD\nddcmCIIgPF6iEkgQBEHocBYvXgzAtm3bfuczadjD9uxuLzZv3kx4eDjbtm0T/bsFQRCEJ1ppRXWT\n6xiaWtLvuXUt3k4QhD+eB6scPR0dMDAwUAWB5HI5N27cYNasWXh7ewN1QSEHBweuXr0KoFr+KFVX\nVxMaGsrp06fJzMxELpejqNf3sqCgQOt2PXr00FhmbW3d4Gs2NjYA5OeLwLggCMLjJoJAgiAIgiAI\ngiAIgtAEE8PWPT63drvG5OXlsXjxYsaMGcPKlSvbfP+CILRezM18dp1O0do+srjKnPykFIqKirh2\n7Rq1tbX4+Pjg5OSEtbU1sbGxPP3008TGxiKRSB5LJdAnn3zCuXPn6Ny5MwEBAVhZWaGvrw9ASEiI\nas6iB0mlUo1lurq6AJiYmDT4Wk1NTVuduiAIgtBMIggkCIIgCIIgCIIgCE3wdbF9rNsJgtDxhMZk\nsPlwHPUKadSUmnTmRnIC3+07jnlNIQYGBvTp0weoq/q5dOkSVVVVJCQk4OzsjIWFxSM935SUFM6d\nO4evry/r169XBWqgbo6f/fv3P9LjC4IgCI+HCAIJgiAIgiAIv7v6We1z585l+/btxMXFUVVVRe/e\nvVmyZAndunWjqKiIH3/8kejoaEpKSnBxcSEoKEitXUpjbQzj4uJYu3Yt8+f
PZ8GCBWqvyWQyDhw4\nwPnz58nJyUFPTw87Ozv8/f2ZO3cuRkZGauvX1NSwf/9+wsLCuHv3LpaWlowcOZLnnnsOPT3xNVsQ\n/mhc7MzwcrbWmt3fEO9u1u12jiNBaE8UCgUHDx4kNDSUnJwczMzMGDx4MM8//zwrVqwANFtBnz59\nmtDQUNLS0qisrMTe3p7AwEBmzpypqmSpLzY2lp9//pnk5GTKy8uxs7NjyJAhzJ49W6OqRdna+Zdf\nfmHfvn1ERESQm5vLyJEjVdV3crmc4OBgzpw5Q3FxMTpG5tzAEQvH3iT8+jk2br50GzJNbb9SW0fK\niwvZ8MG7mNTKMDMxZu3atUydOhUfHx8iIiI4cuQI5eXlj6UKKDs7G4CBAweqBYAAkpOTqaysfOTn\nIAiCIDx64ulUEARBaJcUCgWHDx/myJEjGg+CDWnOg2BBQQEvvPACrq6ubNmyRet+1q9fz6VLl/jy\nyy/p1q2bavn169f5+eefSUxMpKSkBEtLS/z9/Zk/f76q/3Vzris0NJTjx4+TmZmJQqHA2dmZsWPH\nMnHiRCQSidr6U6ZMwdPTk9WrV7N9+3YuX75MWVkZTk5OzJgxg5EjR2o9zuXLlwkJCSE5OZmysjJs\nbW0ZPHgwc+fO1dq64cqVK+zevZsbN26gr69P3759CQoKatY1CUJbys3N5Y033sDJyYkxY8aQl5fH\nuXPnWLNmDZs2bWLdunWYmJgwfPhwZDIZkZGRrF+/nm+++YZOnTo91HHXrl1LXl4ePXr04Omnn0ah\nUHDnzh0OHDjAxIkTNYJAmzZtIiEhgf79+2NiYsLFixfZv38/9+/fF+2ZBOEP6tkR7qzZFdVgln99\nEgksGO7+6E9KEP4Avv76a44cOYK1tTUTJkxAT0+PqKgokpOTqa6u1kiu2LJlC2FhYdja2jJkyBCk\nUinXr19n586dxMbG8v7776sFNUJDQ/nnP/+JoaEhw4YNw9LSkri4OPbt20dUVBSffvqp1u/IGzZs\nICUlhf79+zNo0CBVZU5lZSVvvfUWN27cwM3NjcDAQH4Mjycn9j+U5GVovcbqynJuXzxGZck9aqrK\n0TUyYsLQoRQXF/Ppp58yYcIEAH766Sfg8cwHZG9vD0B8fDxTpkxRLS8qKmLr1q2P/PgtJVphCoIg\ntI4IAgmCIAjt0nfffcfBgwdVD4K6urpt8iBoY2ODr68vMTExpKen4+LiorafwsJCYmJi6NGjh1oA\n6Pjx43z55Zfo6+sTEBCAra0tWVlZHDt2jOjoaDZt2tSsAei///3vnDp1CltbW8aPH49EIuHcuXNs\n3bqVxMREVq1apbFNSUkJq1evRiqVMnbsWORyOZGRkWzatImCggJmzpyptv7u3bsJDg7GzMyMAQMG\nYGFhQXp6Or/88gsXL15k06ZNan26z5w5w8aNG9HX12f48OFYWVmpzsXV1bU5Py5BaDPx8fE8//zz\nzJkzR7Vsz5497Nq1izfeeINhw4bx8ssvqwKmfn5+fPbZZ/z6668sWbKk1cfdtGkTeXl5LFy4kGee\neUbtteLiYo0AENRlz3711VeYmdVl+SuzlU+cOMGiRYuwsrJq9fkIgtA++bnasnKSV6PtnqAuAPTa\nZG/8XB9vK7iKigpCQkKIjIwkKysLiURCt27dmDp1KiNGjFBbVzkZ/MWLF8nIyODevXsYGRnRvXt3\nZsyYQf/+/bUe4/Lly+zZs4e0tDS1xJF9+/ZpVGE2Vn0JsHjxYkCzwgNaXuUhdFwJCQkcOXIEBwcH\n/v73v6uCMQsXLuRvf/sbhYWFapW94eHhhIWFMXjwYFatWoWBgYHqteDgYHbv3s3hw4eZOnUqUBc4\n+OabbzAyMuKzzz7D0dFRtf7WrVs5cuQI33//Pa+88orGud29e5evvvoKc3NzteU///wzN27cYMSI\nEaxatYpbd0vYndmJ3g79uHb0W63Xeef
iMcru5WDVzZPqyjIA5r/4MsP9Pfnwww85duwYUqmUoqIi\ndHR08PT0bOU72nzu7u706dOHs2fPsnr1ajw8PLh//z6XLl3CwcGh2YlugiAIQvum83ufgCAIgiA8\nKCkpiYMHD9KlSxe+/PJLXnrpJRYvXsyXX36Jjo4OhYXqbVjqPwh+8803rFixgsWLF/PJJ58wf/58\n4uLiOHz4sGr9sWPHAnDixAmNY0dERFBbW8vo0aNVy+7cucM///lP7O3t+eabb1i9ejUvvPACb731\nFu+//z737t3j22+1P+zVd/r0aU6dOoWbmxtbt25l6dKlLFmyhK+++ooePXpw6tQpTp06pbFdeno6\nPXv2ZMuWLQQFBbF8+XK2bNmCqakpP/74Izk5Oap1r169SnBwML179+a7777jtdde48UXX+S9995j\n5cqVZGZmEhwcrFq/vLycr776Ch0dHT7++GNWrlzJokWL2LhxI2PHjiU+Pr7J6xKEtmRnZ8fs2bPV\nlo0ZMwaAqqoqXnzxRbWKuZEjR6Krq0taWlqrj5mamsq1a9dwc3PTODaAubm52gCTUlBQkCoABGBk\nZMTIkSNRKBSkpqa2+nwEQWjfJvg589GzAXh30z446t3Nmo+eDeApX6fHel5yuZw333yTHTt2oKOj\nw7hx4xg9erSqyuDHH39UW18mk/Htt99SVlaGr68v06dPJyAggLS0NNavX89vv/2mcYzTp0+zfv16\nbty4wbBhw5gwYQJyuZxVq1aRm5vbZteyZcsWPv30U7KzsxkyZAiTJk3CzMyMnTt3sm7dOjGx/B9A\nep6MA9E3CY5M4R///n+UVlQzZ84ctWocPT09Fi1apLFtSEgIurq6vPrqqxqfz/PmzcPMzIyIiAjV\nsoiICKqrq5k8ebJaAAjqEjiMjY05efIkVVVVGsd67rnnNAJAUPccIZFIWLRoERKJhCvp+QAYSC2w\n6z1IY/3qilIK068itelKZ++6Sn5dAyOKdCwwMDAgKCgIhUKhSnTr0aOH1sqktqajo8Pbb7/N008/\nTWFhIQcPHiQxMZHx48fz3nvvifa2giAIfxDibi4IgiC0O2FhYQDMmTNHbYDVwMCARYsWsXbtWrX1\nm3oQPHToEBEREapswEGDBiGVSomIiCAoKAgdnf/lRISHh6Onp6fWZu3o0aNUV1ezdOlSbGxs1Pbv\n4+NDQEAA0dHRlJWVYWxs3OB1HT9+HKgbOK5fVWBkZERQUBB/+9vf+O233zRavOno6BAUFKQ28G1v\nb8+UKVPYvXs3J0+eZP78+QAcPHgQgD//+c8aD45jxowhJCSEiIgIVcXE+fPnkclkjB49Gnd39ZY1\n8+fPJywsDLlc3uA1CcLDSM+TcSU9n9KKairl9ymtqMbNzU3tbxJQZaE6ODho/I3p6OhgaWlJfn5+\nq8/j+vXrAPTr10+jJWNjHvybAVQVgSUlJa0+H0EQ2j8/V1v8XG3V7mMmhnr4utj+bnMAfffdd6Sl\npREUFMSsWbNUyysrK/nwww/56aefGDp0KG5ubgCYmpry73//G1tb9WolZTDp+++/JzAwUPXdqqys\njH/+85/o6uqyadMmtWrhH374gX379rXJdbS0ykPoWGJu5rPrdIra3FrXzl6htLCAn+JKsXLNV6ug\n69Wrl1pbt4qKCm7evIm5uTm//vqr1mPo6+uTmZmp+veNGzcA7e3VTE1N6d69O/Hx8dy+fVujCl7b\nZ31paSnZ2dnY2tqqKpRKK6pVr0s7aQaASwuyUNTWAlBTWUaX/waCwo/8SnlaZ1Vgs2fPnrzzzjta\nr+ujjz7SWObl5aX6/t/adc3MzFi2bJnWY2qr0hszZowqQae5x4C6RJ+GXhMEQRAeLREEEgRBENqF\n+oMov525TGlFtdYWCB4eHmoDxK15EDQwMGDYsGEcO3aMy5cv4+/vD9RVA2RkZDB48GC1jL9r164B\ndW2
qUlJSNPZfVFREbW0td+7coUePHg1e440bN5BIJHh5eWm85unpiY6Ojuohtb5OnTqp+nXX5+Xl\npZrHp/656unp8Z///EfrOVRVVVFUVIRMJsPMzEy1rbb3WiqV4urqKqqBhDanbQCoouQ+CbcKqE24\ny9M31QeAlIM/9dsY1qerq/tQWeHKQGdLW55oy9BVnmvtfwd6BEH4Y3OxM/vdgj71yWQyTp48ibu7\nu1oACFBVGVy+fFlVkQx1348eDABB3b1t3LhxbNu2jeTkZNV3hPPnzyOXyxk7dqzGQPncuXM5evRo\nmySOtDS5R+g4QmMytLZSrKmqACCloIo1u6J4bbK3qpJOR0dHLSmspKQEhUJBUVERu3fvbtZxm/qc\nV7Zv1fb7q621a2lpqcZrJob/G17TNzLV2Ka6om4beUEW8oIs1fLoHDMyYv73/aa8vLzhCxHU3L59\nm+3bt5OQkEBVVRVubm7Mnz8fPz+/3/vUBEEQ2h0RBBIEQRB+V9oGgxNSs6mQFfLxwWssGqunMRhc\nP0DTmgdBqMtgO3bsGOHh4aogkLI93IOZbcXFxUBd7+/GNPXQJpfLMTMz09pWQXldRUVFGq9ZWlpq\n3Z/ywVP5IAp1g0A1NTVNvhdlZWWYmZmpHnabOoYgtJWGBoCUsu+VagwAtZSymkdbYEjbAI8ymPNg\nq0lBEIT24sGKI0ep+k00OTlZFXyu3/ZVSXk/rJ8UA5CRkcHPP/9MfHw89+7do7KyUu31+vdFZdtN\nDw8Pjf0bGRnh5uZGXFxcK67uf1qT3CN0DDE38zU+/ytK7pNwYAvVFaXoGZpQXS5HV9+Afxy6ip2F\nMX6uttTW1iKTyVTV+MrPbDc3N7Zs2dKsYyu3uXfvHs7Ozhqv37t3D9CebPJghXBcXBx/+ctfyMnJ\nUQui+rr87/+ryjWrgXUN6roA2PUZhGP/p1TLv/nTiHYRSO5ocnNzWbVqFS4uLkyYMIF79+4RGRnJ\nunXrWL16NcOHD/+9T1EQBKFdEUEgQRAE4XfT0GCwrr4hAFdSbnMtV642GFxTU0NxcbHqoas1D4IA\nffr0oWvXrkRHRyOXyzE0NOTUqVOYm5trTISsPMbevXsbrERoDqlUikwmo7q6WiMQpLwubfu/f/++\n1v1pe2A1MTFBoVA0OyCmvLamjiEIbUHbAJA2CgVqA0AtZWpal4F79+5dunTpovaatmq+Xr16AXWT\nnS9cuLDBlnDXr19n1apVFBcXa50fAGDz5s1cuHBBLTh7+fJlQkJCSE5OpqysDFtbWwYPHszcuXM1\nqomUk6R/8cUXBAcHc+7cOQoKCpgzZw5VVVXs27ePlStXam3DkpqaymuvvcaAAQMabCUjCELHoi1Z\nBuoGzzMz79GroG6wWSaTAXX3OG33OaX6CSvXr19n7dq11NbWqtrbmpiYIJFISEtLIyoqSm2OlKYS\nRxpa3hKtTe4R2r9dp1Ma/Pw3MDantraakrsZGJpZoVBAcGQKfq62XL9+XS2pw8jICGdnZzIyMlSV\n7U1xc3Pj7NmzxMXF4ePjo/aaXC4nLS0NAwMDnJyal3yiq6uLhYUFBQUF5OXlYWdnh4udGV7O1sRl\nFCK/qxmkNLFxQCKRIM/LUC3z7mYtAkCtFB8fz4wZM3jxxRdVyyZNmsTq1av56quv6N+//0M9twmC\nIPzR6DS9iiAIgiC0vQcHg7OvRnB557vIctMxsa4btC3Ju6UaDI65WTffR2JiolqbpQcfBFtizJgx\nVFZWEhkZycWLFykuLiYuLo63335bbT3lAHFCQkJrLxeoewBVKBRa95OQkEBtbS3du3fXeO3u3bvk\n5eVpLFdm29bfpnfv3pSUlJCRkaGxvjbKbbW1fJPL5dy8ebNZ+xGE5mhsAOhBygGg1ujZsycAx44d\nU1uenp5OSEiIxvo9evSgT58+pKWlaZ3TQiaTUVlZSa9evXBwcCA7O
5vq6mqN9ZKTk8nPz8fKyko1\n8LB7927WrVtHcnIyAwYMYMqUKXTp0oVffvmF1atXqwWLlKqrq3nrrbc4f/48fn5+TJ06FXt7eyZO\nnIhEItG4LqXQ0FAAJk6c2MQ7JAhCRxAak8GaXVEaASCl4rJKDl3K4NiVTFVAedq0aRw8eLDB/zZs\n2KDafu/evVRWVvLee++xfv16li5dyrPPPsuCBQtU333qU97XGnaAzWoAACAASURBVEoc0ba8scpM\n0KzOrJ/c09h1iHlFOpb0PFmDv8cAJjZdAciNj6S6si5QefVWIalZ99ixY4fG+tOnT6e6upotW7Zo\nrfAtKSlRa5c8atQo9PT0OHToENnZ2Wrr7ty5k9LSUgIDA9HX12/2NfXt2xeFQsEPP/yA4r9fbp4d\n4U5VaRF5185rrK9vJMXKxQt5QRbZcadAUcuC4erzDWVnZ5Obm9vsc3iSSaVS1ZyoSu7u7gQGBiKX\nyzl37tzvdGaCIAjtk6gEEgRBeEKsWbOG+Pj4dvPQ3NhgsHV3X/JTL5MTH4mFY0/0DE0Ijkyhr4M5\nP/zwg8b606dP5/PPP2fLli289tprGpn1JSUl5ObmagRYRo8ezc6dOzlx4oQqe1U5qXt9kydP5tix\nY/zrX/+ia9euODg4qL1eXV3N9evX6du3b6PXPG7cOGJjY/nhhx/46KOPMDSsq3iqqKhg+/btqnUe\nVFtby/fff8+bb76pGkzJzc3l4MGD6OrqEhgYqFp32rRpXLhwgS+++II1a9Zo9D4vLy/n1q1bqsGd\nQYMGYWpqyqlTp5g8ebLa5Le7d+9uk97+ggBNDwBpc/VWIel5shZnyQYEBNC1a1dOnz5NQUEBPXv2\n5O7du0RFRREQEKB1zqw33niDNWvWsGPHDs6ePYuXlxcKhYKsrCxiYmL4+uuvsbOzY8yYMYSFhVFQ\nUKCxj/DwcABVpeLVq1cJDg6md+/erF+/Xu3eFB4ezubNmwkODmbJkiVq+yksLMTJyYmPPvoIIyMj\ntdf8/f25cOECt27dolu3bqrlZWVlnDp1CltbW41qRkEQOp7mVk7y32SZtVP6IJFISExMbPYxsrKy\nMDMz0zpXobbkEOX3qMTERI3vK+Xl5ap2cfUpKzPz8/M1XsvOzkYul6vdG1tT5SG0f1fSNX/+9Rma\nWWNk2Yn8lEtcO7QVS+c+SHR0WB6zi74unbG2tlar0h03bhypqakcOXKEpUuX4ufnh52dHTKZjNzc\nXOLj4xk7dizLly8HwM7OjqVLl7J161ZeffVVhg0bhoWFBfHx8Vy7dg1HR0eCgoJadE0DBw5EV1eX\n06dPc/v2bfr164dcLkdx5RimnZy5n3kNHigsdhowkQpZITlXI3CsvcPpkDSuWlpSWFhIZmYmKSkp\nrF69WutcoE+qhlphdu/eHWNjY431vby8CA8PJy0tTWvVtCAIwpNKBIEEQRCEx66pwWDTTk7Y9Q4g\n71oUSYe/xsrZg9uXdLh9/Fu6dLLSCGy09EFQydbWFm9vb2JjY9HV1cXFxYX09HSN83F0dGTFihV8\n/vnnLF++nH79+uHg4EBNTQ15eXkkJiZibm7O119/3eh1jxw5kvPnz/Of//yHl19+mcGDBwN1Ey3n\n5uYyfPhwtYCOkouLC8nJyaxcuRI/Pz/kcjmRkZHI5XJeeOEFtXZXPj4+LFq0iB07dvDSSy/h7++P\nvb095eXl5OXlER8fj4eHB++++y5QN9jyyiuvsHHjRv76178yfPhwrKysSExM5NatW3h6emodCBKE\nlmpqAKix7VoaBDIwMODDDz9k27ZtXLlyhZSUFLp168aqVaswMzPTGgSyt7dny5Yt7N+/n/Pnz3Po\n0CEMDAyws7NjxowZWFhYAHXZxBKJRGNAs7q6msjISKRSqSpwowy6//nPf9YITo8ZM4aQkBAiIiI0\ngkBQ1xbuwQAQ1FX5XLhwgdDQU
P70pz+plp86dYry8nJmzZqFjo4o9heEjq6llZMHY3MJDAzk5MmT\n7Nmzhzlz5mjcC7Kzs9HR0VENMNvb23Pnzh3S09NxcXFRrXf8+HEuX76scZyAgACkUikRERFMnToV\nV1dX1Wt79+7Vmjji6OiIiYkJUVFRFBUVqe6llZWVfPPNN1qvp7XJPUL7k5yczC+//MLhk+e5npGD\nroExxpZ22PToh1U39eQpO4+hFN1O5l5GIvmpl9E3NsV13ATef/99goKCVN93g4OD2b17Nxs2bMDf\n35+jR48SGxuLXC5HT0+PS5cuMXjwYKZNm6ba9+bNmwkPD2fFihX861//4ssvv6SkpAQ7OzteffVV\n5syZg1QqJSYmhoMHD5KcnEx0dDRlZWV88MEHTJ48GV9fX7Xz1dPTY+nSpaxZs4aQkBB++ukn7O3t\nefHFF3HyGMgbq95QtbhW0jUwYtaS13CuyeT29RjOnj1LZWUllpaWdO3alSVLluDn5/eIfhodS1Ot\nMLt7aq/aUib2iUQ2QRAEdSIIJAiCIDx2zRkMduj/FIZm1txNvkB+ykV0DU2wGB/I+++8zooVKzTW\nX7ZsmcaDoKmpKZ06dWLmzJmMGjVK63HGjBlDbGwsNTU1jB49mn//+99a1xs1ahSurq4cOHCAq1ev\nEhMTg5GREdbW1gwdOrTZk4+++eabeHl5cfz4cY4ePQqAk5MTM2bM4Omnn9a6jampKe+++y7ff/89\nYWFhlJaW4uTkxMyZMxk5cqRqvby8PBYvXsyYMWP4+OOPOXjwIImJiURFRWFiYoKNjQ1PPfWU2jYA\nQ4cO5b333iM4OJjIyEj09fXx9PRk06ZN7Nu3TwSBhDZRWqHZPq0+Q1NL+j23rsHtGqti3LZtm8Yy\nW1tb/vKXv2hdv6F9mZmZERQUpJYNrMxA3R+dgYmhHr4utsybN48rV66QmZmpmj8gOjoamUzGvHnz\nVEGd7du3o6enpzXoBFBVVUVRUZFGtruBgYHagGx9ysDuyZMnCQoKUlUUhoaGoqury/jx47VuJ/w+\nlBVfD87jpJz7Sdvvrjb17+8rV658JOcqtB+trZx88flnycrKYteuXZw8eRIPDw8sG6kymDp1Kpcv\nX+bNN99k2LBhSKVSUlNTSUhIYOjQoZw5c0btGCYmJvzf//0fn332GatXr2bYsGFYW1uTlJTEzZs3\nVYkj9Ss29PT0mDp1Knv27GHFihUMHjyYmpoarly5grW1tUZiD7Q+uUdoX44dO8Y///lPdHR06Nbd\nnXtSV6rK5ZQVZJOffEEtCFQpv0/KsW0YmFrhOnw2NRVl3LuVQG5mGufOnaO8vFzrfD0DBgxgwIAB\nqn8r75UDBgzA0dFRY/1z584hkUh46aWX6NSpEzo6Ojz//PMA7Nq1iz179mBkZMTgwYMZP348hYWF\nJCUlERERoREESk1NZf/+/Xh7ezNjxgzu3r3LmTNniIiIYKa1NR6OVsyePwoLNw9VFYuvi229xJYF\nbfAu/zE1NG+sUnFZJQfPJjHxSqZq3lglZVvKB4PHgiAITzoRBBIEQfgDiIqKIiQkhMzMTGQyGebm\n5nTt2pXhw4fj7++vGmwCmDJliur/PT09+eijj4C6tkWnT58mMTGR/Px8ampq6Ny5M8OGDWPWrFkY\nGBioHbN+Fl5xcTH79+/n1q1bGBgY4Ofnx+LFi7GxsdE419TUVHZs3UzshSuABBMbB7r6BGqsJ5FI\n6NRrIPom5tzPSKS0IIvLUWdYtCgWR0dHxowZg0KhUBtoGDBgAGfOnKGoqIjvvvuOCxcu8Ntvv3Hg\nwAESExNV11pdXc2+ffsIDw8nPz8fOzs7AgMDmTx5coNBIKiryGnu4JvyWNqu6+mnn24w4NMQa2tr\n3njjjWav7+HhgYeHR7PX9/X1xdfXl7i4ONauXYubmxuOjo6sXLlSDDgKbcLEsHVfO1u73cNqKAM
V\nwLK6C0Wl0YSHh6sCRspWcPUH+mUyGTU1NU1Obl5WVqYWBLKwsFC7t9UnkUiYMGECP/zwA5GRkYwd\nO5bU1FRu3LjBoEGDtA6oCoLQsbS2cvJ6Xikff/wxoaGhnDp1qskqg/79+/POO++wd+9eIiMj0dXV\nxd3dnQ0bNpCbm6sRBAIIDAzEzMyMPXv2aCSOKL9DPTgZ+4IFCzA0NOTYsWMcO3YMS0tLRowYwYIF\nC3j55Ze1Xktrk3uE9iEzM5OtW7diYmLCxo0bqTWy4k/fnFa9XikvUltflpuOXe9BOPQfr/r8s3Lx\nxCjlCB988AFmZmaqCvqHcePGDbZs2aLRbi0mJoY9e/Zgb2/Pxo0bNZ5htLUzvHDhAi+++CIzZsxQ\nLQsNDeWzzz7jyy+/xN7eninjA8Xncgs1txVmaWE2m365gJ2FMX6utqrlyjlT3dzcHuVpCoIgdDgi\nCCQIgtDBhYaG8tVXX2FlZcXAgQMxNzfn/v37pKenExYWxsiRI5k/fz7h4eHk5eWpTaBZ/wFo//79\n3L59m969e+Pv709VVRWJiYkEBwcTFxfHBx98oLXF0JEjR1TzbHh6epKcnExkZCQ3b97k888/V5tg\nNSkpib/97W/cvluMeVd3DM2sKC3MISXsB0ztXTX2DZAVEwYSHUxsHBgU0Jve9iZcvXqVb7/9lpSU\nFF5//XWt23377bckJibi7++Pv7+/6twVCgUff/wxUVFRdOnShcmTJ1NdXU1YWBi3bt1q1c9AEISm\n+brYNr1SG273MJrKQC006EpKXhm79h9i4cKFyGQyLl26hKurq1p7JBMTExQKRZNBoAc1FABSGjdu\nHMHBwYSGhjJ27FhCQ0MBmDBhQouOIzx6gwYNYuvWrVhZWf3epyJ0IE1VToL26snSimr09PSYPHky\nkydPbtaxHqykUPL09GxwPo3+/ftrzD1WW1tLeno6VlZWGhn4EomE2bNnM3v2bI19NVYN19C5Ce3f\nkSNHqKmpYd68eTg7OwPg5WytSqwwkFqorW9oaolER4eEA1sws3dBz9gMO+NaMm+lUVpaytKlSxk6\ndOhDn9esWbO0zrejrBBuKIlNOddffX369OHMmTOcPn2aHj16IJVKyc7OJj4+HiMjI9544w0RAGqF\n5rbCrK4sJ/vqKYIju6iCQCkpKURERCCVStskaCgIgvBHIoJAgiAIHVxoaCh6enp88cUXqj7rSsXF\nxUilUhYsWEBcXBx5eXksWKC99cCyZcuwt7fXGHzcuXMne/fu5cyZM1pbnl26dInPPvtMrXXRp59+\nyunTp4mKimLYsGFAXfBly5YtVFZWsmbNGr6+WKZaP+9aFLcvhmo9r+6jFmBoVvcA9fqfRuBiZ4ZC\noWDz5s2cOHGCSZMm0atXL43tGsr0U55Xr1692LBhg6rCacGCBQ0GlBqjrJyZP39+g++tIAjgYmem\nNgDUHN7drFs8H9DDak4Gqo6ePpbOHsSmXGbP4ZNIa0uoqanRGDDt3bs3Fy5cICMjQzUI1hYsLCwY\nOnQoERERJCUlcerUKezt7enXr1+bHUNoG1KpVLSkEVqsPVdOKuddUbaihLrveHv37uXu3bstrnQW\n/hiUrVOVbc8uxNRVY9QPFj47wp01u6K0fr4aW9pj3rU7ZffzKM6+QU1lGZ272WJubo6lpSV/+9vf\nmkyQaI6ePXtqXX79+nUkEolGcLMx7u7uODg4cOLECc6cOUNpaSlGRkbY2tri4uKiNegpNK4lrTDN\n7LtRkBrDvu+y6Fw0Bt2aciIjI6mtrWX58uUaFYmCIAhPOhEEEgRB6IDqP2jdyCmiulqBrq6uxnrm\n5ubN3mfnzp21Lp82bRp79+7l8uXLWoNAU6ZM0Zi74qmnnuL06dMkJyergkDXrl3jzp07eHp6MmPi\naM7knVN9ye/UcwB3r0dTIdP80q8MANUfDJZIJEydOpUTJ04
QExOjNQjUUKZfWFgYAAsXLlRrcWdm\nZsa8efPYvHmzxjYdeS6GO3fuEBYWxpUrV8jLy6O0tBQrKyv69evHvHnz1DIblZPmAuzevVutemHD\nhg14eXmp/n369GlCQ0NJS0ujsrISe3t7AgMDmTlzplr1lyDU19gA0IMkElgw3P3Rn9QDmpuBat3d\nl/zUy3y76wA+9jro6uoSGBiots60adO4cOECX3zxBWvWrNHICC4vL+fWrVta72FNefrpp4mIiGDj\nxo2Ul5czZ86cNhkga8/q34tnz57N9u3bSUhIoKqqCjc3N+bPn68xoXZVVRW//vorERERZGdno6ur\ni6urK1OmTFF9PtXXWHvV+oPbOTk57Nu3j6tXr1JQUICBgQE2Njb06dOHhQsXqtr7NTQnkJJcLufH\nH3/k3LlzyGQyOnfuzMSJE5k8eXKzf54VFRWEhIQQGRlJVlYWEomEbt26MXXqVEaMGNGSt1hoJ9pz\n5eS1a9f45JNPVPP0lJeXc/36ddLS0rC1tRUJMU+YhlqnJlxMxbC6hEyZAuXMPH6utqyc5KU10ULX\nwBizzm6YdXZDIoHXJnvzlK8Ta9asIT4+Hj29thm6aqgqU9lu8MH2142RSqVaWzzXb8MttExLWmEa\nSK1wGjiJrJhwDoQcxs7cgO7duzNv3jyRFNNCbfWsq/x7rT/3pkiYFIT2QwSBBEEQOhBtD1p5Og7c\nTk4g4KlnmDVlPE8HDqZPnz4aVUFNKS8vJyQkhPPnz3Pnzh3KyspQ1HtCKygo0Lqdu7vmIG2nTp0A\nKCkpUS1LTU0F6tqLgPpgsERHB9NOzlqDQNUVpeQlncMqtZhnDn5GeXm52usNnVdDmX43btxAIpFo\nnS+nfpCjvWhoAvvmOnfuHEePHsXLy4s+ffqgp6dHRkYGv/32G9HR0fzjH/9Qtb0YNGgQUDdo6enp\nqfZ+1A+obdmyhbCwMGxtbRkyZAhSqZTr16+zc+dOYmNjef/997UGJQWhsQGg+pQDQPV7vD8OLclA\nNe3khKGZNUlXL6LraEng8CEa910fHx8WLVrEjh07eOmll/D398fe3p7y8nLy8vKIj4/Hw8ODd999\nt8Xn2qdPH1xdXbl58yZ6enqMGzeuxfvoqHJzc1m1ahUuLi5MmDCBe/fuERkZybp161i9erUqYaG6\nupp33nmH+Ph4HB0dmTRpEhUVFZw5c4aNGzeSlpbGwoULVfttqr2qcqCvsLCQ119/ndLSUvz9/Rky\nZAiVlZXk5uZy8uRJJk+erDbHU0Oqq6t5++23KSkpYcSIEVRXV3P27Fm+/fZbbt++zbJly5rch1wu\nZ+3ataSlpdG9e3fGjRtHbW0tMTExfPrpp9y6dUs16bnQcbTnyklHR0cGDBhAUlISFy9epKamBltb\nW6ZMmcKcOXNa/P1T6Lgaa52qZ2BEsayQNf8+wZpnR/OUrxMAE/ycsbc0ITgyhau3NH+/vbtZs2C4\ne6Of/8oWzzU1NRqv1X/20Kah4LpUKkUmk1FZWdmiQJDQtlrTCtMtcB6LAnv+LolDgiAIHYkIAgmC\nIHQQDT1o2fUZjK6hCfnJF/l6+x5+O3oYOwsTPD09eeGFF7QGaR5UXV3NW2+9RXJyMt26dWP48OFY\nWFioBvJ3795NVVWV1m21tblRbldbW6taVlpaCoClpSWgORisZ2yqeV6V5VwP/ReO0hocXHzo0aM/\npqam6OrqIpfLCQkJafC8Gsv0MzMz05pRqDy3P5JRo0Yxbdo0jeqcmJgY1q1bx969e1WTMg8aNAip\nVEp4eDheXl5as7XCw8MJCwtj8ODBrFq1Su1BOTg4mN27d3P48GGmTp36aC9M6LDaYgDoUWnpZOw2\nbj5kxZ6kuKyywbkzZs+ejYeHBwcPHiQxMZGoqChMTEywsbHhqaeeYuTIka0+37Fjx/L/2TvzgKqq\n9e9/mOcZmRUEQVBGJ5z
FWVPTShO9DtzU917z3rLMfmWDvW9pWd2befV60+xazqmlOGGCA6gICKIM\nGiAIKCAi0+HILO8f/M6JwznAgTRB1+efcu299l57b/Y+e6/neb7frVu3EhgY+FQ+v1oiOTmZF154\ngVdeeUXeNmXKFFauXMmmTZvo378/hoaG/PzzzyQnJ9O/f38++OAD+W+TTP5z//79DBw4EC8vL6Bt\neVUZFy5cQCKRsGTJEqVnXVVVlUr/PFUUFxdja2vLpk2b5M9o2diOHz/OiBEj5IkTLbF161YyMzMJ\nCQnhpZdekrfX1NSwZs0a9u/fz7Bhw4RBdheks1ZO2tra8tZbb/0h+xJ0XtqSTjW0dkJ6P4+yOxl8\nddQaGzMD+e96QE9rAnpac6tQwtmEG2w4b0yApy0f/K/sc1vIvj2KipR/s2VJZ+2ld+/exMXFER8f\nL7xkniCdWQrzacbS0pLNmzf/bgm9N998k+rq6kc0KoFA8KgRT0qBQCDoArT1oWXl6oeVqx91NVU8\nKMrFy7aS5IRoVq9ezebNm9vMyoyJiSEtLU1lCXhxcXG7Tc1VIXupLC0tlbc1nQzOvqicuWdSlo6r\nGfx10StKAYkbN24QGhra4v7ayvSrq6tTCgQ1HZsMWWADGgMgMrk0gOXLl2NjYyP/d2ZmJjt27OD6\n9evU1tbi4eHBggUL5BOMTamvr+fkyZOcPn2anJwc6uvrcXJyYvz48UyZMkVh/E1L9GfNmsXOnTtJ\nSkqivLycNWvWyCt2KioqyM3NZf/+/URFRaGtrU2vXr2YOXOmkkRSQEAAzs7OJCQktHgOVREaGoqW\nlhavv/66UqZkcHAwR48e5ezZsyIIJGiVphNATT0E/F2s/3APoKaok4HaFDufkdj5jGRhkAdDh7Y8\nAdunTx+V1YeqaM0kvTmZmZkATJ48We0+XYnmfx9ORo0/gkZGRsyZM0dhXXd3d4KCgoiIiCA6Opqx\nY8dy6tQpNDQ0WLx4sUJ1opmZGcHBwWzYsIFffvlF4RmtpaWltryqqmxxfX39dh3jwoULFYL0TaVJ\nw8PDWw0CSSQSzpw5g7u7u0IASDa2kJAQEhISOHfuXJcJAnVl+dVHTWevnBQ827QlndrNYwBF6fEU\nJEdi6uDG7qh0hb/RoqIiXGysea6fMwctjfBxtlL7919W7R8eHs7o0aPlz+yioqIOf7NMmzaNuLg4\ntm3bhoeHh7xKXsb9+/eV2gSPns4shfk0o62tjZOTU9srtoFMDUQgEHRORBBIIBAIugDqelRo6+pj\n6uDOQ2dLxlkacerUKVJSUhg6dKg8M/nhw4dKWcr5+fkADB06VGmbycnJv/8AgF69eqncXkBPa/yc\nLbl9agsZFSY8P8AZzz598Hex5tj+W4Rl6T7Scbm5uZGYmEhqaiq+vr4Ky5KSkpTW9/HxkVcd9ezZ\nUy6ZBtCzZ0+kUinQmHl48OBBPD09mTBhAvfu3ePChQu8//77bNiwAUdHR3m/uro6Pv74YxISEnB0\ndGTUqFHo6upy7do1vvnmG9LS0njzzTeVxpKfn8+KFStwdHQkKCiIvKIyzqffJ6k0nRppKT9v+Zz8\n/Hy8vb2ZPHkyVVVVxMbGsmzZMpydndHU1KSiokKhQqs9GuvV1dVkZWVhamrK4cOHVa6jo6NDbm6u\n2tsUPNu42Jg80aBPc7pSBmpRURGRkZF0795d6VnW1WnJY6K6opTc3BKChvXCwMBAqZ+Pjw8RERFk\nZmYydOhQ8vPzsbKyUjmxITtnskBaYWGh3Nj71VdfZeTIkXh7e6uUVw0MDOSHH37gP//5D1euXCEg\nIIA+ffrQvXv3dvkyaWlpqUwSkAX2ZWNribS0NPnzfPfu3UrLZVJJ4pncdenMlZOCZxd1pFP1zbrR\nfeBkcmOPceP4N+Rf88RekoQetaSnp2NoaMjatWs7tP/evXvj7e1NcnIyb775Jn5+fpSWl
hIbG0tA\nQADnz59v9zYDAgKYPXs2+/btY+nSpQwePJhu3bpRUlJCamoqnp6ez3xg+o+gM0thdmXS0tL4+eef\nSU1Npby8HBMTE5ydnZk4cSLDhw9XmYCxevVqEhIS2LBhAz179lTaZlRUFJ9//rlCZbYqTyCBQNB5\nEEEggUAg6OS09aElKcjC2NZFYeLpWnYx9RV3AdDT0wN+y2K+d++egr8LIK9mSUpKYtCgQfL2goIC\ntm/f/kiOw9PTE0dHR5KTk4mJiSEwMFC+7OjRo1SUFmFnbsiU/s74+DS+aMrGmZSUhIuLi3z9zMxM\n9u/f36FxjBs3jsTERHbs2MGaNWvkmdwSiYR9+/Ypre/j44OtrS2hoaG4uroqVSTJAkdxcXFKxt8y\nf4nQ0FAFb4cff/yRhIQEpk6dypIlSxQCdBs3buTUqVMMGzZM4RwBpKamMmvWLHxGPNc4QVpbDClS\nII30U9spz7uJnqkNAUNHs3jxYgA2bdpEVFQUd+7cYcmSJTg5OcmPOSIigsLCQrXPXUVFBQ0NDZSV\nlT2S6jCBoLPRFTJQz507x507d4iMjKS2tpZ58+a1K/DQ2WnNYwKgvLKG8zfLOJmYK/eYkCGTxJNK\npfIAvaWlpcrtyCRDm/pH2NnZ0aNHDwwMDAgNDeXw4cNoaGgoyava2Njwz3/+k927d5OQkMDFixcB\nsLa25sUXX2TatGlqHaupqalK6bimx9EaEokEgPT0dNLT01tcr7mXnqBr0VkrJwXPLupKp1q798fA\n3Ia716OpuHuLH/fn0dvZDhcXFyZMmPC7xvD+++/z3XffERMTw5EjR3BwcCAkJIR+/fp1KAgEMG/e\nPDw9PTly5AhxcXFUVVVhbm5Or169GDNmzO8ar0B9OqsUZlfl5MmT/Pvf/0ZTU5PAwEAcHBwoLS0l\nIyODY8eOMXz4cJX9xo4dS0JCAqdPn2bRokVKy2XqGC3JIQsEgs6HCAIJBAJBJ6etD62syB/R1NbF\n0NoRPWNzGhpAWphNsZaE4QN88fPzAxoNys+fP8/atWsZMGAAurq62NjYMHr0aAYNGoS9vT2HDh3i\n1q1buLm5ce/ePWJjYxk4cCD37t373cehoaHB66+/zvvvv8/atWsZOnQo9vb2ZGZmcvXqVfr37098\nfLxCnzFjxvDTTz+xdetWkpKScHBwIC8vj7i4OIYMGUJUVFS7xzFy5EiioqKIiYnhb3/7G4GBgdTX\n13PhwgXc3d3Jz8+nTFrDodgsJQmi1vDy8lJ6CR43bhz/+c9/SEtLk7c1NDRw9OhRLCwsWLx4scIE\noKamJosWLSI8PJyzZ88qBYHMzc2x9Bym9GH0oKQAyd1sTB3dkRRkcTQ+h/GJuQzuacrJkycZMGAA\nUqmUfv36yY3NASIjI9t17mQa7K6urnz99dft6isQdAW61nwcoAAAIABJREFUQgZqWFgYKSkpWFtb\ns3jxYpWVkl2VtqRPZdRWSvnq6DUFjwn4TdLTyMhI/rwqKSlRuQ1Ze3NfO3d3d5YvX45UKuX69etE\nR0dz6tQpJXnV7t278z//8z/U19eTlZVFYmIiR48eZcuWLejr6zN+/Pg2j7e8vFxldW7T42gN2fLp\n06fLA/9dmbbkV5/1iabOVjkpeHZpj3SqUbfuuHZrDNgvDPJQmrC3sbFptWrg008/Vb1dIyP+/ve/\n8/e//11pmartLV++XK1KngEDBjBgwIBW1/Hx8Wl1zO2RdRUoI6QwHx25ublyr59169bRo0cPheWq\nfLVkyLxiz549S0hIiIJUbklJCVeuXMHNzQ1nZ+fHNn6BQPBoEUEggUAg6OS09aFl7z8WSf5NKosL\nKM/LQFNLG10jM4ZNeIG1by2Sy31NmDCBwsJCIiMjOXjwIPX19Xh7ezN69Gj09fVZu3Yt27dvJykp\nidTUVGxtbQkODmbGjBkdCraowsvLi3Xr1rFjxw4uX
74MNEo6fPrppyQkJCgFgSwtLVm3bh3bt28n\nNTWVhIQEnJycWLp0Kf7+/h0al4aGBu+88w4HDhwgPDyco0ePYmlpybhx4+gdOI7/HggjpSyHm91S\n5X1kEkS97yv7FsmQZYg3RVtbG3Nzc4VM8zt37iCRSHBwcFBZeQSNXg6q5HsMLGzZ+MsNpQ8i6b3b\nADysraZacp/yvAxWrtnAWHdjbt++TZ8+faiqqlLYZlFREQUFBUr7aFqV1Bx9fX169OhBTk4OEokE\nExMxGSV4+niSGaiq5DjWr19PREQE27Ztw8bGpsUJsfYQERHB+vXrO93EurrSp5XF+dTVVCt5TMgq\nM11dXTEwMMDe3p6CggLy8vJwcHBQ2Ma1a9eARolQVRgZGcknAxsaGhTkVZuipaVFr1696NWrF15e\nXrzzzjtER0erFQSqr6/n+vXr9O3bV6G96XG0hoeHBxoaGqSmpra6XlehLflVgUDQOehK0qmCromQ\nwnw0HD9+nPr6eoKDg5UCQNBYwdwSurq6DB8+nJMnT5KQkMDAgQPly86ePcvDhw871TukQCBoG/Er\nLBAIBJ2ctj6YunkMoJuHcsZa0MQ+Cp4JmpqaLFiwgAULFqjcjrW1NW+99ZbKZaqy3ebOnaskjSaj\ntay+Xr168X//7/9Vavf09FS5ve7du/PBBx+oPS51Mv20tbUJDg4mODhY3hZ2JYcPf0yg10vvquxT\nXlkjr7BpLkEELWdsa2lpKQRUZPI9eXl5rUqqVVZWKrWlF9Wiq8Jvs76mcV3J3Wyqyu/zsK6W2soK\nfr6hSWXhHcrKyvDy8pJvs6qqio0bN8r9IprSVDZQFTNmzGDDhg18/fXXvPHGG0rHXVFRwd27d1uc\nWBUIOjtdIQNVVbCoKUlJSaxatYo5c+a0+JzubNwqlHA5NZOUQ19j5eqPnc8I7lwJp+JuNg0P6zCy\ndqKb1xAAaqukpBzeQKqGJkn7jOnTuxejR4/m7NmzGBkZMWTIEIqLizEwMCA1NZXnnnsOFxcXzMzM\n8Pb2ZsqUKezduxdAIVhTXl5OQ7OL3tDQQHh4OLGxsezatYsBAwaQk5ODvb09+vr6nDx5ktOnT5OT\nk0NhYSHZ2dlYWVnR0NCglkzf999/z5o1a9DR0QEUpUnHjRvXal8zMzOCgoI4c+YMe/fu5eWXX1bp\n+aepqakkA9sZaUt+VSAQdA66gnSqoOsjpDA7RtPzdexcLA+q6+jfv3+HtjV27Fj5e07TIFBERATa\n2tqMGjXqUQ1bIBD8AYggkEAgEHRyxIfW40ddCSIaUClB1B4MDQ0BGDJkCKtWrVK734PqOvLrKlFV\ncK+l0+j7ZO8/Bs3kKKxc/XEeOh2AfnVXSE6IwdnZGRMTEzZs2EBiYiK6urq4uroqGY87OjpiZWVF\nZGQkWlpa2NjYoKGhwejRo7GxsWH8+PFkZGRw/PhxlixZQkBAADY2NkgkEu7evUtycjLjxo1j2bJl\nHTo/AkFnoDNloC5YsICZM2e26G3ztNBU+rRGWsKvYdvQN7PG0tWPGmkpZbk3kBTm8LCulpoHZVSW\nFGBgYYe01ohTp06xb98+fHx8WLFiBYaGhly+fJmcnBzs7OwoKytDIpGgpaXFrl272LhxI7169WL+\n/Pn06dNHvt+MjAz27NlDVVUVtra21NbWsmPHDtLT0/H29mb9+vXo6Ohw5swZTpw4QVFRESUlJVhb\nW2NmZkZ5eTmamppkZ2fz1Vdf8eabb7Z6zJaWltTV1bFs2TIFadLi4mKee+45vL292zxvf/3rX8nL\ny2PXrl2cOXOGPn36YG5uTnFxMbm5uaSnp7Ny5couEQQSCARdg64gnSp4ehBSmOpxJauo0TO2yX2Z\nknaHakkxX5zIIGScfrvfW728vHB0dCQmJoaKigqMjY25efMm2dnZDB48WJ48KBAIugYiCCQQCASd\nHPGh9fhpS4JIl
s3d0PCQhgaUJIjag5OTE0ZGRvz666/U1dXJ5fraoryyBoxVLzO0dgTgQdFtpWX9\nxs+kTy9noqKiOHbsGGZmZgwaNIh58+axdu1apfU1NTV577332L59OxcuXKCyspKGhgb69OmDjY0N\nAEuXLmXAgAGcOHGCq1evIpVKMTY2plu3brz44ouMHj1azbMhEHReOksGqqWl5VMfAAJF6VPJ3Wwc\n/Mdg5z1C3pafdI478aeQ3svBrLsnvSctJj/xNHX1xVhYWPDgwQMGDhzIiBGNffz8/Ni9ezdaWloc\nOnSIc+fOUVBQgIODA5WVlfTo0YOQkBCFMTg5OWFtbc3Nmze5dOkSaWlpVFdXM3/+fD7++GN5tc7I\nkSO5ePEiV65cwdraGgsLC6ytrQkKCmL69OkcOXKEU6dOMWzYsFaPWVtbm48//pgffviByMhIysvL\nsbOzY+bMmUydOlWt82ZoaMhnn31GWFgY586d4+LFi9TU1GBubo6DgwOLFy8mICBArW09CZrfX+p4\n8AkEgifPk5ROFQgEioRdyVGZ0Kitq081kPhrNu/elfLGVF+VihatMWbMGHbs2EFUVBSTJ0+W+/UJ\nKTiBoOshgkACgUDQBRAfWo+PW4WSNgNsWroGaGhoUPugDIBr2cXcKpR0aCJYS0uLadOmsXfvXrZs\n2cLixYvR1dVVWKe4uBipVEr37r+9pNc/bPniG1k5YmzjTMXdWzgPfh6rXr9N+NU2aDF//nxGjBiB\nhYWF3NQcWjbbdXd3Z82aNa0ex8CBAxVkAQSCtiTKuipPOgO1uSfQ7t275VKSERER8o9xaJTDTEpK\nkrft2bNHQXZy7dq1+Pj4tLq/oqIiDhw4wOXLl7l//z4GBgZ4eXkRHBys0vvs99A0AJBRUCZv1zM2\nx7aPYgDFytWfO/GnaGhowMypNwbmNrgGBbN0Yh+eH+DMiy++qCBx2fRZ9/LLL/Pyyy/L//3xxx9z\n5coVpUC8jY0NY8eOZe7cuaxevRpNTU1ef/11goKCFMbi4eFBdXU1I0aM4L///a+CWTLAokWLCA8P\n5+zZs/zP//yPyomSpsbhS5cuZenSpa2eq9ZkVrW1tZk6daragaPOgKqMZVDPg08gEDx5uoJ0qkDw\nLNCaooWhtRPS+3mU52Wgb2bdIUWLMWPGsHPnTiIiIhg/fjyRkZGYmpoyYICyHL1AIOjciCCQQCAQ\ndAHEh5Yij3KyuakEUUto6ehiaOVIRWEOt87/hJ6pFRu3ZvK3P03r0D5nz55NVlYWJ06cIDY2Fl9f\nX6ysrCgrKyMvL4/U1FQWLFigEATS0mzdW8Jl2AtkROwg+1Io936NxdDaES1dfX65d4GL+8rJzs7m\nyy+/VJgYFQgEXQ8fHx+kUimhoaH07NmTwYMHy5f17NlT7tMVERGBt7e3QtCnLUmwmzdv8sEHH1BR\nUUG/fv0YOnQo5eXlXLp0ibfffpv33nvvkXz0txQAkGFgYYdGM28bHYPGUkhNbV00NX/7hPF3sUZT\nUxNzc3OKihSf53FxcZw4cYKMjAzKy8uVfNDKy8uVqqxu377NypUrqaqq4qOPPsLPz09pfHfu3EEi\nkeDg4CD372mOrq4uubm5LZyBZ5uWMpZltOXBJxAIOgedSTpVIHhWaU3RopvHAIrS4ylIjsTUwQ19\ns24KihZFRUVYW7d+f1pbW+Pn50diYiJHjhyhrKyMadOmqa1mIRAIOg/irhUIBIIuQlf80Gqewd4Z\naSpB1Bouw17g9uWTlOffpD47mdN5Rkwe3KdDx6Wtrc17773H2bNnCQ8PJy4ujqqqKkxNTbG1tWXe\nvHlKmeemBrpIWtmmrpEZvScv4d6vsZTmXKfkVhINDQ1U6rvRs7cbU6dOxdlZlaOQQCDoSvj4+GBr\na0toaCiurq7MnTtXYbmrqytGRkZERETg4+OjtLwl6uvrWbduHVVVVaxdu1bBj6a
4uJg33niDDRs2\nsG3bNrksWkdoKwAAoKWjr9Smoan1v//9LTjUVPpUS0tLIcgTGhrK1q1bMTY2xt/fn27duqGnp4eG\nhgaXLl0iKyuLujrl539eXh4SiQRXV1fc3NxUjk8ikcjXbVpp1ZzKysqWD7KLExERwfr161m+fLna\nkjDr16/npyMn0Bq4AF0jc5XryOVXHz783R58AoHg8dNZpFMFgmeRthQt9M260X3gZHJjj3Hj+DeY\nOXmSl2iJSd5FigtyMTQ0VCkP3pwxY8aQmJjIDz/8AAgpOIGgqyKCQAKBQNCFEB9ajx5DPfV+CvVM\nLHEbPUf+76UT+zB2UE+AFiV6QFHypykaGhqMHj1aLf8cGxsbIn45wVvfR7f6oq+lo4ed9wi5j4av\nsyVfLBjS5vY7G9OmTcPb27tFuTqBQNBxjhw5wtatW7l8+TLvv/8+FRUVTJ8+ncuXL5Ofn88LL7yg\nEACCRl+il156ia1bt3L16tUOVwO1JlnSXlqTPq2vr2f37t1YWFiwfv16pWqfGzdutLjdQYMG4ejo\nyA8//MB7773HJ598gomJ4u+roaEhAEOGDGHVqlW/80ieLe7cl9K9tQBgE/nV3+vBJxAI/jietHSq\nQPAsoo6ihbV7fwzMbbh7PZqKu7cou32DM1UOjBzgzYQJE9Taz9ChQ/nPf/7DgwcPcHZ2bjFJRiAQ\ndG5EEEggEAi6IM/yh1ZrfhjLli1j06ZNLcrE1dbWsnDhQgC+//57dHR0kOYkk7DzY5yHTEdb35CC\n5PNUlhSgqamFsV1PHPzHom9qpbQtL3sT9u/fT1RUFHl5eWhoaODs7Mzzzz/PyJEjH8uxC28owZMk\nLS2Nn3/+mdTUVMrLyzExMcHZ2ZmJEycyfPjwFvvduXOH8PBwEhMTKSws5MGDB1hYWNCvXz+Cg4OV\nZCgaGho4ffo0YWFh5OXlUVlZiZmZGd27d2f8+PGMGDFCvu6tW7fYv38/N27coLi4GENDQ6ytrfH2\n9ubPf/5zl5CqaBrUr5GWql2d2BEiIyPZsmULOjo62NraMnr0aDw9PYHfAiP37t1j9+7dSn3z8vIA\nyM3N7XAQqDXJEnXQ1NbBru8wXIZNb1X6tLy8HKlUip+fn1IAqKqqips3b7a6n1mzZqGrq8u3337L\nu+++yyeffIK5+W+VK05OThgZGfHrr78q+Qo9KwwePJjNmzdjYWGhdp9iSRXllTWtrtNcfjX/mhXO\nVb8ydUIQLi4uv3PUAoFA8Ph5Wj0SBZ0Pdd8Zjbp1x7Xbb/KqC4M8FL4TW/MdBNDT02tR/rYpqhL4\nfHx8Wt22QCD443j2vlgEAoHgGaHpB8jMmTPZvn07KSkp1NbW4urqypw5cwgICJCvL5VKOXnyJPHx\n8dy5c4eysjIMDQ3x9PRk1qxZ8onCpsgqNt5++2127NhBfHw8JSUlvP7666xfv16+3qJFi+T/b2Nj\n02J1jDq05ofh4eGBvb0958+fZ8mSJXJvDBkXL15EIpHwwgsvyOWM7CwMMTXQpTT3OuV5NzHv7omJ\nrTMPigsozblOxd1sPCb+GX3T3yYbPe0M+PeXH5OZmYmbmxvjx4/n4cOHXLlyhS+++ILs7Gzmz5/f\n4WNsCeENJXhSnDx5kn//+99oamoSGBiIg4MDpaWlZGRkcOzYsVaDQNHR0Zw4cQIfHx+8vLzQ1tYm\nJyeHX375hdjYWL766iusrH4LtO7YsYP9+/dja2vL8OHDMTIyori4mPT0dM6fPy8PAt26dYsVK1YA\nEBgYiK2tLQ8ePCA/P5/jx48zf/78Tj05r8oXp7qilJTs+zyMvcWorKJHfg/HxcUBMH/+fL777jvG\njh1L7969gcbACcD58+db3UZVVVWH9t2WZIm6dDM14NM/BbZ6bszNzdHT0yMjI4Oqqir09Rvl5erq\n6tiyZYv8WFtj+vTp6OrqsnnzZt555x3Wrl0
rDyhpaWkxbdo09u7dy5YtW1i8eDG6uroK/YuLi5FK\npQrebk8TRkZGSr+xbXGnWKrWes3lV78viMfLrbsIAgkEAoFA0AR1FS0eVT+BQNC1EXe+QCAQPOXc\nvXuXt956CxcXFyZNmkRJSQlRUVGsXr2alStXyidUb9++zY4dO+jbty8DBw7E2NiYwsJCYmNjiY+P\n54MPPqB///5K26+oqOCtt95CX1+foUOHoqGhgbm5OXPmzJH7Ljz//PPyyaL2Tho1py0/jMmTJ/Pd\nd99x5swZpk6dqrAsLCwMgIkTJyq0O1oZceNOGq5BczBz9JC3F96I4fblMHJjj+M+bgHQGGDRybnI\nr5mZhISE8NJLL8nXr6mpYc2aNezfv59hw4bh6ur6u45VFV3RG0rQtcnNzWXz5s0YGhqybt06evTo\nobC8qKh1KYrRo0czffp0JR+ZK1eusHr1avbt28err74qbw8LC8PKyopNmzahp6en0Kfp5H1ERAQ1\nNTW8//77BAYGKqxXUVGh1Lcz0ZYvTn7JA97dFcMbU30f6X6LixufGaampkrLZM9mVefzUaCOZIkq\nhnna0svODEM9bf4VZcUgH8c2n28aGhpMmzaNAwcOsGzZMgYPHkxdXR3Xrl1DIpHg6+vLtWvX2tz3\n5MmT0dXV5euvv+add95hzZo1dOvWDYDZs2eTlZXFiRMniI2NxdfXFysrK8rKysjLyyM1NZUFCxZ0\niiBQTEwMoaGh5ObmIpFIMDU1xcHBgREjRvDcc88BkJGRwenTp0lKSqKoqIjq6mqsra0JDAxk9uzZ\nGBsbK2yzNU+gxMRE9uzZw82bN9HR0aFv376EhIRQU/dQrfE2l19dGOTBWFHZKhAIugiWlpby9yaB\n4HHi79Kx772O9hMIBF0bEQQSCASCp5zk5GReeOEFXnnlFXnblClTWLlyJZs2baJ///4YGhri5OTE\n999/rzQ5WFRUxIoVK/j2229VBoFu3brF6NGjef3119HS0pK39+/fn8LCQrKyspg+fTo2NjaP7yCb\nMG7cOHbu3ElYWJhCEOjOnTskJyfj6+uLo6OjQh8zQ13GDBtEqZOHwqRsN4+B3Ps1FklBFtUVpeib\nmPOX0a5s/3wL7u7uCgEgAF1dXUJCQkhISODcuXOPJQgEj8Ybqmml2OzZs9m+fTtJSUnU1tbi6enJ\n4sWLcXZ2pqysjB07dhAbG0tFRQUuLi6EhITg66s4OS2VSjlw4ADR0dEUFhaiq6uLh4cHL774Iv7+\n/kr7r6ur48CBA0RERFBUVISlpSVBQUEEBwe3OOb6+npOnjzJ6dOnycnJob6+HicnJ8aPH8+UKVPk\nhuLNj2/WrFns3LmTpKQkysvLWbNmDT4+Prz77rskJydz6NAhDh48SHh4OPfu3cPc3JxRo0Yxb968\nTl1J8kdx/Phx6uvrCQ4OVgoAAUpybs1pWuXTlICAAJydnUlISFBapqWlhaamplK7quBF8woMQGnC\nujOhri9OQwN8dfQa3tUVSstk5+bhQ9WT6s2XN5XRhMZAT2ZmJu+//z5jx47l0qVLJCQkcO3aNZYt\nW4aLiwtOTk6MHTuWqVOnKtxbAOvXryciIoKtW7cSFxfHL7/8Ql5eHh4eHi16eXVU5q6XnZlcsmRb\nOzJX582bh5mZGb/88gthYWEYGhoSEBDAvHnzVMrdtcTYsWPR0dHhn//8pzwQZGdnh7a2Nu+99x5n\nz54lPDycuLg4qqqqMDU1xdbWlnnz5hEUFNTew33khIWFsWnTJiwsLBg0aBCmpqaUlpZy69YtwsPD\n5UGgkydPEh0djY+PD/7+/jQ0NJCRkcGhQ4eIj4/nH//4BwYGBm3u78KFC6xbtw4dHR1GjBiBhYUF\nqampvPXWW9TqqS8d1xSRsSwQCLoS2traODk5PelhCJ4BXGxM8Olh2a5Ka19ny2dWVl4geNYRb9QC\ngUDwlGN
kZMScOXMU2tzd3QkKCiIiIoLo6GjGjh3bYoWOtbU1w4YN48iRI9y7d0+eBS1DW1ubRYsW\nKQSAHgfq+maYmJgwfPhwTp8+zfXr1/Hy8gJ+qwKaPHmyyn4zxg+nd2CgQoWNhqYmxt16UC0pxlHv\nAW/9aSIPi7MVJlabU19fDzRWTzxuHoU31N27d1mxYgXdu3dn7NixFBYWEh0dzbvvvsuXX37J6tWr\nMTQ0ZMSIEUgkEqKiovjoo4/45ptv5H8LUqmUlStXkpubi7u7O9OnT6esrIzz58/z4Ycf8uqrrzJp\n0iT5PhsaGvjss8+IiYnB3t6eqVOnUldXR3h4ONnZ2SrHWVdXx8cff0xCQgKOjo6MGjUKXV1drl27\nxjfffENaWhpvvvmmUr/8/HxWrFiBo6MjQUFBVFdXK2Vmfvnll6SkpMgDopcvX+bgwYOUlpY+s1ru\nTe+3Y+dieVBdpzIIrA4NDQ2cPXuWiIgIsrKyqKioUAheNA+0BQUFceTIEV599VWGDx+Ot7c3np6e\nSs+oESNGEBoayieffMKwYcPw9/fHy8sLe3v7Do3zj6I9vjgNDZCQWUTzp7OxsTEaGhrcu3dPZT9Z\nsEy23MfHB2is3igsLGT06NFUV1czevRoALZv3w6Avb09dXV1uLm5UVZWxpYtW0hPT5ffWzdu3KBn\nz57y/WzZsoXU1FQGDBjAgAEDVAbuZKgzka9nbE6/eatb7NeapnxzmVEtLS1mzJjBjBkzlNZdvny5\n0r3dmh7+yJEjVXq9aWhoMHr0aPl57IyEhYWhra3Nv/71L8zMzBSWNa2smzVrFkuXLlW6hqdOnWLD\nhg0cO3aMmTNntrqvqqoqNm3ahKamJp999hnu7r9V73z77bfs+fFgh45BZCwLBIKuhCpPoPb6IyYl\nJbFq1SrmzJnDwIED2blzJzdu3EBDQwM/Pz+WLFmCtbU1BQUF/PDDD1y9epWqqip69+7NkiVLFH6r\nZVRXVxMaGqqWn2l7/BkFTxbhGSsQCNRFBIEEAoHgKaF5VYiTUeOboJubm8rsXR8fHyIiIsjMzJRL\nuVy/fp3Q0FBu3LhBaWkpdXWKgZb79+8rBYFsbW2VJpYeJR3xzXjuuefkHy5eXl7U1tYSERGBmZmZ\ngodQU8zNzVVW2Fyq8yClPpdXRrkS0NOas9nJAKSnp5Oent7iuDvqm/FHk5yczPz583n55ZflbXv3\n7mXXrl2sWLGC4cOH8+qrr8orAQICAvjnP//J4cOHWbx4MdA4gZybm8ukSZMU1p05cyZvvPEG33zz\nDf369ZNXg0VGRhITE0Pv3r1Zu3atvJJj7ty5KgM5AD/++CMJCQlMnTqVJUuWKFQ6bNy4kVOnTjFs\n2DAlGavU1FRmzZrFggULWjwH+fn5bNq0CROTxoDa/Pnzee211zh9+jQLFy5sl/F5V0fV/ZaSdodq\nSTFfnMggZJx+u6UGt23bxuHDh7G0tKRfv35YWVnJr7ksKNGUxYsXY2trS3h4OAcOHODAgQNoaWkx\nYMAAFi1aJA/yeHh4sG7dOn788UcuXLjAmTNnAHB0dGTu3LkqJ+2fNB3xxckrfoBjveKzWF9fHw8P\nD1JSUvjyyy9xdHSU+zW5uLjg6OiIlZUVkZGRaGlpYWNjg5aWljyQNnbsWJKSkuTP/tWrV2Nvb8+t\nW7f48MMPSU1NxdPTEz09PX744Qfy8/MpLS2VTzbJuHnzJl9//TW2trZtHoeQLHlyaGlpqUzUaFpZ\n11K17rhx4/j222+5cuVKm0GgS5cuIZFIGDNmjEIACGDOnDmEh4djWihp19hFxrJAIHgaaK8/ooz0\n9HQOHjyIt7c3EydO5NatW1y8eJHs7Gzef/993n77bZycnBgzZow8keuDDz7g22+/lfvhQWPC1qpV\nq9T2M1XXn1Hw5BGesQKBQF1EEEggEAi6OKombaExUJKbW4Kbt47Kfubm5
kDjRwE0fpx8+umn6Orq\n4u/vj729Pfr6+mhoaJCUlERycjK1tbVK23mcE+Tt8c2Y6P+b70Lv3r1xdXXl/PnzLFmyhPj4eCQS\nCTNnzmxR3qu0tFT+/00rbArjtcnS01byNJo+fbo8CNKVsbGxUZrYGzt2LLt27aK2tpZXXnlFQQpq\n1KhRfP3112RmZgKNFTpnzpxBX1+fBQsWKKzr4ODAtGnT2LdvH6dPn5ZLvYWHhwOwYMECBSkvExMT\ngoODWb9+vcJ4GhoaOHr0KBYWFixevFghU11TU5NFixYRHh7O2bNnlYJAMn+q1ggJCZEHgKBxgn3U\nqFHs3buXjIwMBg4c2Gr/p4WW7jdtXX2qgcRfs3n3rlTpfmuNsrIyQkNDcXZ25osvvlAKSEdGRir1\n0dTUZPr06fKKspSUFKKiojh//jw5OTls2rRJ7i/k6enJhx9+SG1tLRkZGSQkJHDkyBG++OILTE1N\nVUoRPkk66otTXlmj1LZixQq2bt1KQkICkZGRNDQ0YG1tjYuLC5qamiz86+ts+uZbdh8K42FdDSb6\nOvTs4ahi68gDay4uLvzrX//i0KFDxMbGIpFIKCwsJCEhgXHjxjF37lyFwMFLL72kVgAIhGTJH0Xz\nhJA+/oHcvHmTV199lZEjR+Lt7Y2Xl5dS8kZdXR2aJs3dAAAgAElEQVRhYWFERkaSm5uLVCqlocnD\n4P79+23u++bNmwB4e3srLTMyMqJnz57k3Suhmbpgi4iMZYFA8LTQXn9EGZcvX2bFihUK8qIbNmzg\n1KlTrFy5khdeeEFlItcvv/zC888/L2/funUrme3wM1XXn1HQORCesQKBQB1EEEggEAi6MG0FScor\nazhy8TqTE3OVJm1lQQ9ZUGPnzp3o6Ojw1VdfKRlZb9q0ieTk5Ed/AK3Qmm+GLNDQ0PBQ7pthY2ag\n8GI7ZcoU/vWvf3H69Gmio6PR0NBg4sSJLe4vKSlJyY/m4cOHpKamAsg/ijw8PNDQ0JC3dxVaqhRz\ndXVVkv+xtLQEGisqmk/aa2pqYm5uTlFR42T27du3qa6uxsvLSyGQIsPX15d9+/bJJwehcaJQQ0OD\nPn36KK0vk61qyp07d5BIJDg4OLBv3z6Vx6erq6tSgq9nz55KH9zNaZ6xDsgr3ioqlP1YnkZau98M\nrZ2Q3s+jPC8DfTNrlfdbSxQUFNDQ0EBAQIDS31JRUREFBQWt9jczM2Po0KEMHTqU8vJyrl27RnZ2\nNr169VJYT0dHBy8vL7y8vHBwcOCf//wnMTExnS4I1JYvjio5NOeh01kY5KFUqWFvb8+HH36otA2F\nxADX5zBrYk2WFLMPvcoaxo4dK68CApBIJPz0009cvnyZgoICeSWjrq4u/fv3Z9KkSSxbtkxpXx4e\nHm0ec1OEZMnjo6WEEDDFrM8EKL5BaGgohw8fRkNDA29vb/785z/Ln3+ff/450dHR2NnZERgYiIWF\nhfzZGRoaqjIJpDmypBJZkklzLCwsMDPU5U/jvdgenS8ylgUCwTNDR/wRAfr06aPkLzdmzBhOnTqF\noaGhUiLXmDFj2LVrlzxZCxp/48+cOdNuP9P2+DMKnjyPwjNWIBA83YggkEAgEHRR1DUXf1Ccz5c/\nxylN2iYlJQG/BTfy8/Pp0aOHUgCooaGBlJSUDo1R9uEg88lpD635ZmjpGqChoUHtg7L/HSPsjkpX\nOL5Ro0bx3XffcfDgQYqLiwkICMDOzq7F/V27do24uDiFqo+jR4+Sn5+Pr6+vfALWzMyMoKAgzpw5\nw969e3n55ZeVPpDy8/PR1NRUO0P+cdJWpVhvf+WTLJMNau6d03S57Jo+ePAA+C1w1BxZu2xyUPb/\nJiYmKquyVE0eSiSN8kF5eXkK5vbNqaysVGpTp1JNlR+W7Bw09a55mmntfuvmMYCi9HgKkiMxdXBD\n36ybwv1WVFSkpGUvQ3bfpKam8vDhQ
/m9UlVVxcaNG5WeDbKKHpmXl4y6ujp5QE6WkXr9+nXc3NwU\nqsngtwB388zVzkBHDe7V7ddWYsC98koqCks42SQxQCqV8sYbb3D37l08PDwYM2YMxsbGaGlpIZVK\nWw0AtLcSVEiWPB7auu5lxq6Um7iy9JVeOOpIiI6O5tSpU6xevZrNmzfLJYT8/f356KOPFKTjGhoa\nOHhQPS8f2bO0aWVtU0pKSgAY4+NEb9ceImNZIBA8FbSUaNWU9vojylCVqCQLKKlK5JIta1q9mZaW\n1m4/U3X9GQWdj0fhGSsQCJ5ORBBIIBAIuijqmovX1VSRf+0cu6Ps5RMq6enpnD17FiMjI4YMGQI0\nTtbm5eVRXFwsn7hvaGhg9+7dKiss1EFWGXLv3r12mbW35ZuhpaOLoZUjFYU53Dr/E3qmVhQkaTC9\njwnDBzTK0Ojp6TFmzBi50fekSZNa3eegQYNYs2YNQ4YMwd7enszMTOLj4zExMWHp0qUK6/71r38l\nLy+PXbt2cebMGfr06YO5uTnFxcXk5uaSnp7OypUrn3gQSJ1KsaPxOYxXUSmmLrJAkWxyrznFxcUK\n60HjRKFEIqGurk7po1fV5KGs75AhQ1i1alW7xqehru7QM8ru3bvZtn0HlR5TMbF1UbmOvlk3ug+c\nTG7sMW4c/wYzJ0/yEi0xybtIcUEuhoaGrF27VmVfCwsLRo4cSWRkJK+99hoBAQFIpVISExPR1dXF\n1dVVIVu1pqaGt99+G3t7e3r16oWNjQ01NTUkJiaSm5tLYGCgPFB98OBBrl27Rt++fbG1tcXAwIDs\n7Gzi4+MxNjZutfLvSfE4fXHUTQxoXj35yy+/cPfuXebMmcPcuXMV1r1xo7F6pCU6cn8JyZJHS3uu\n++aIDD79UyB///sAGhoaOHXqFCkpKXL/v0GDBil5B6WlpVFToyxHqAo3Nzeg0Wtu/PjxCsukUilZ\nWVnyf4uMZYFA0NVpM9Hq/m/V5O31R5ShKiFL9pxuLYmpqa+rLJmqPX6m6vozCgQCgaDrIIJAAoFA\n0AVpj7m4ia0z9zOucGBrHnZlY9GqryIqKoqHDx+ybNky+cfFjBkz2LRpE6+99hrDhg1DS0uL69ev\nk5OTw6BBg4iNjW33OP38/Pjpp5/YuHEjQ4cOxcDAACMjI6ZOndpqP3V8M1yGvcDtyycpz79JfXYy\nDQ0NnI7xkQeBAMaPH8+RI0ewtLRU8oppztChQ5k0aRL79u0jLi4ObW1thg4dyoIFC3B0VPTRMDQ0\n5LPPPiMsLIxz585x8eJFampqMDc3x8HBgcWLFxMQENDmMTxO1J0YpAU5PXVxcnJCT0+PrKwspFKp\n0geprOKsqXyXm5sbiYmJpKam4uvrq3L95vswMjLi119/VRk4ErRMREQE69evZ/ny5QryX00pr6yh\ndcE8sHbvj4G5DXevR1Nx9xZlt29wpsqBkQO8mTBhQqt9X3vtNezs7IiKiuLYsWOYmZkxaNAg5s2b\npxQ80tPTIyQkhKSkJK5fv86lS5cwMDDA3t6eV199VWFiecqUKRgbG5OWlkZqair19fVYW1szZcoU\nZsyY0aLRvSoKCwtZtGgRY8eOZfny5Wr3ay+P0xdH3cQAUKyezMvLAxqfgc15XDKgIgDw6GjruksK\nsjC2dUFDQ0PhujetmJNljicnJzNt2jR537KyMjZv3qz2WAYPHoyxsTHnzp1j6tSpChnse/bsUagI\nlSEylgUCQVekPYlWg3uattsf8VHSET/T9vgzPiuo806tLn/Ue6dAIBA0RcyiCAQCQRekPebiukYW\ndB80hbwrERwKPYaNqS5ubm4EBwfTr18/+XqTJk1CR0eHw4cPExERga6uLn379uX111/n4sWLHQoC\n9evXj0WLFnHy5EkOHz5MXV0dNjY2bQaB2vLNANAzscRt9ByFtl6+iv4UsgqD8ePHK2U3q2LgwIEK\nc
nCtoa2tzdSpU9s8lidFRyeE24u2tjZBQUGcPHmSnTt38pe//EW+LD8/nyNHjqCtrc3o0aPl7ePG\njSMxMZEdO3awZs0aeSakRCJR6fmjpaXFtGnT2Lt3L1u2bGHx4sVKEmDFxcVIpVIlOUNB29Q/bGgz\nCARg1K07rt1+O78LgzwUPFtsbGzklXdN0dPTY/78+cyfP19p2aeffqrwb21tbV566SUlzXpVBAQE\nPPFga0d4HL447UkMkHEtu5hbhRJ5xWJSUhIuLi7y5ZmZmezfv79d22wvIgDw+1DnumdF/oimti6G\n1o7oGZtzOx5KLu7i7p1Gby0/Pz80NTXx8vLi4sWLrFy5kj59+lBaWkp8fDyOjo4tyn02R19fn7/9\n7W+sW7eOd955hxEjRmBhYUFqairZ2dl4e3v/4f6CAoFA8Khpb6LVX4ZY/S5/xN/L7/UzVdefUSAQ\nCASdGxEEEggEgi6IOkGSpuibdcM1KFhp0rY5zc3CZbi4uCjJBAEqJ3ybM2PGDGbMmNGu8T4K34z6\n+noOHTqElpZWm1JwTxu/Z0K4IxOyCxcuJCUlhaNHj5Keno6Pjw/l5eWcP3+eyspK/vrXvypI440c\nOZKoqChiYmL429/+RmBgIPX19Vy4cAF3d3fy8/OV9jF79myysrI4ceIEsbGx+Pr6YmVlRVlZGXl5\neaSmprJgwQIRBOoAWpodk8zr6H36rPM4fHHakxjQvN+YMWP46aef2Lp1K0lJSTg4OJCXl0dcXBxD\nhgwhKiqqQ9sWPH7Uue72/mOR5N+ksriA8rwMNLW0KdBz5c8hITz33HPyysoPPviAnTt3cvnyZY4c\nOYKVlRUTJkxg9uzZvPrqq2qPadiwYfy///f/2L17N1FRUejo6ODt7c2XX37JgQMHRBBIIBB0edqb\naHU6rdHDVF1/xEdNe/1M2+PPKBAIBIKug/h6FwgEgi7I4zYXf9L8Ht+M1NRUkpOTSUpK4tatW0yd\nOrVF0/qnld8zIdyRIJCJiQlffvkl+/fv5+LFixw6dAg9PT08PDx48cUXlao1NDQ0eOeddzhw4ADh\n4eEcPXoUS0tLxo0bR3BwMC+++KLSPrS1tXnvvfc4e/Ys4eHhxMXFUVVVhampKba2tsybN4+goKAO\nHXdnoqk8xKxZs9i5cydJSUmUl5ezZs0aDAwMOH36NElJSRQVFVFdXY21tTWBgYHMnj0bY2Nj+bbe\nffdd+YTr+vXrWb9+vXzZtm3b5HJppga6VAIl2akUpl6gsuwemlramNi54th/ArqGpkrjrKt+wM2Y\nkyz971UKCwvR1tamV69ezJw5U+l6N5XPMDc358CBA2RmZvLgwQO1AslPI4/aF6e9iQFN+1laWrJu\n3Tq2b99OamoqCQkJODk5sXTpUvz9/UUQqBOjznXv5jGAbh4DFNrmBnnwUrOEEFX+dzK2bdum1NZS\n0giAv78//v7+Su3Lly8XsjcCgaBL05FEq7SiOvr1CyQ5IUYtf8THQXv8TNvjzyhoP7t372bPnj1A\n4ztyRESEfFlTqbmEhARCQ0NJS0ujsrISa2trhgwZwuzZs5Xkt69du0ZkZCSpqakUFRVRX1+PnZ0d\nw4cP56WXXlJSUJCNYe3atZSUlPDTTz+Rm5uLsbExI0aMYOHChejo6HDt2jX27NnDzZs30dTUZNCg\nQSxZskTu+ysQCLoWXWM2UCAQCAQKPE5z8c7A7/HN2B2eyJ49ezAxMWHixIn8+c9/fowj7ZyoJadn\nbE6/eatb7Nfa5LyqCUEjIyNCQkIICQlRa4za2toEBwcTHBystKylfWtoaDB69GgFabmWaEmarCnN\npcia0toE5x9Bfn4+K1aswNHRkaCgIKqrqzE0NCQsLIzo6Gh8fHzw9/enoaGBjIwMDh06RHx8PP/4\nxz/kUiPjxo3DyMiImJgYAgMDcXV1lW+/6cejoZ42tYUp3ExOxMy
pN8a2zkiL8ijJTqGy9C6ez/0F\nTa3fXhmrK0q5f2kvZ02hb9++9O/fn6qqKuLi4li9ejXLli1j4sSJSsd04cIF4uPj6d+/P5MnT27R\nBLkzUVxczL59+7h8+TLFxcUYGhrSt29fXn75ZQUZlISEBFavXs3LL7+sIHt37do13nvvPQD++9//\nKgSkf/lxGzfOn+eTT7/izgOt3+WLo26A3318iMp+3bt354MPPlDZR9V9JCbzOwdPe0KIQPBHsmjR\nIkD1O45AIKOjiVb9xs+kTy9ntfwRHwft8TNtjz9jV6Zp4tXMmTPZvn07KSkp1NbW4urqypw5c9SS\nHW5PAOb777/nu+++o3///qSnp9OzZ08GDx4sX66hocG0adPQ19enqqoKExMTBg4ciKGhIadOneLj\njz9mzZo1+Pr64ubmxvPPP8/IkSM5ePAgt2/fxtPTEzs7Ow4cOIBUKuX69ev8+9//xtHREalUqpAE\nBnD06FEuX77M4MGD8fHx4cqVKxw+fJiKigoCAwP5/PPPGThwIJMmTeL69eucOXOG8vJyPvroo0d6\nLQQCwR+D+AIQCASCLsjjNBfvLHTUN2Pu3Lkqpeta4klP9j8OxMRg1yc1NZVZs2axYMEChfZZs2ax\ndOlSJRmPU6dOsWHDBo4dO8bMmTMB5H/XMTExDBkypNW/c33pHTwnL0bf/DfZvqzzBym5lUzZ7V+x\ncO4rb8+JPkR33VpWrlzFyJEj5e1SqZR3332XLVu2EBgYiLm5ucI+Ll++zOrVq+nfv387z8aT4e7d\nu7z99tsUFxfj6+vLyJEjKSoq4vz588TFxbFq1Sq5h1jfvn3R1tbm6tWrCkGgq1evKvy/7Bo0NDSQ\nlJSEjY0NA717oZ4TWcs87YkBAtWI6y4QCAR/LOokWjXUN66j0cSPtLZBS21/RAAfH58Wk5naSnRq\naZm6fqbt8Wd8Grh79y5vvfUWLi4uTJo0iZKSEqKioli9ejUrV65kxIgRrfZvGoAZMGAAtbW1pKam\nsnv3bpKSkvjkk0/k7+2TJ0/m4MGDVFdXA+Dq6qrw3bpx40bKy8spLS1l8ODB8mDLqlWrqK6uJiAg\ngIyMDIyMjCgvL+eLL74gOzubpUuXYmtri4aGBklJScTFxREQEMDx48cpKytj2LBh2NnZySVgZSQm\nJrJ+/Xp5ZVdtbS2vv/46p0+fJjY2lo8//hhvb2+g8d31ww8/JD4+nszMTIXkMoFA0DXQbHsVgUAg\nEHRG/jTSHY1WrDxklR7OQ6erbS7emZD5ZrR2jNA+34xnBTExqD7vvvsu06ZNe6z7WL9+PdOmTVOo\nfCksLGTatGkKEm1NMTc3Z86cOUrtNjY2SgEgaKz6MTQ05MqVKx0a4/zgmbw7b5zC/Wbdqx8AD+7f\nkbdVlhZgo1HC5HFBCgEgaKwu+tOf/kRNTQ0XL15U2kdgYGCXCQABbNq0ieLiYubPn8+aNWtYuHAh\nK1asYO3atTx8+JCvvvqKqqoqoDFrtnfv3qSnpyOVSuXbuHr1Kq6urpiYmCgEhG7dukVZWRl+fn6P\nZKyyxID20NUSAwTKiOsuEAgEfyzqJExVld8HQMfwt2etSLTqvCQnJzNhwgQ+++wzFi5cyPLly/ns\ns8/Q1NRk06ZNPHjwoNX+S5cu5dtvv2XlypW88sor/OUvf+Hrr79m9uzZJCUlceHCBaBRSvDiLSm6\n3Xpy4fI1isskCtuprKzk3LlzSCQSzMzM+Pvf/46RkRFbt24lMzOTkJAQfvrpJ7kiwsaNG+nXrx/7\n9+/nwYMHaDT7aL5y5QqrVq3C09MTe3t7li9fjqWl4jvDtGnTFKT9dHR0GDlyJA0NDQwYMEAeAILG\nKiWZ9HZWVlb7TrJAIOgUiF8igUAg6KI8DnPxzsaj9s14VngWKsWeFoolVRyKzZJLgTkZNd7MPXv2\nREdHR2n9uro6wsLCiIyMJDc
3F6lUSkOTB8D9+/c7NA53d3cGN7vfdI3MGvdZ0xjo8HW2xNlJwsk0\nA6RSKbt371baTllZo/lxbm6u0jIPD48Oje1JUFRUxJUrV+jWrZuSR5WXlxejRo3izJkzXLx4kTFj\nxgDg5+dHSkoKycnJBAYGUllZSUZGBjNmzKCgoECpKkjW51HR0epJQddGXHeBQCD442gtYaqy5C7F\nt5IoyUpCQ0MD8+5eavUT/DHcKpSQeKtI6Z3byMhIKfHK3d2doKAgIiIiiI6ObrWa3s7OTmX79OnT\n2bdvH6GnIjmcqS3/LivTcSG3KJIaaSm6sbcYlVVEQE9rzp07R1VVFUZGRujo6HD+/HkqKyvZsWMH\ntra2VFdXs3v3bmpraykrK6O6upqQkBASEhIIDw/H3NycS5cukZqaSmJiIgYGBvz3v/8FWv4+cHdX\nfieQBYqayh7LsLKyanV7AoGgcyOCQAKBQNCFeRaCJAE9rQnoaa304t4R34xnCTExqB5vvvmmXJLh\ncbFgwQJmzpypkH2XlH2f1NwS0usyiSFV3l5dUUpubgkefrqqNsXnn39OdHQ0dnZ2BAYGYmFhIQ8W\nhYaGUltb26ExyjyCmt5vZxNusOGCMQEeNnzwl5G42Jjw44+NwZ3ExEQSExNb3F5lZaVSm4WFRYfG\n9jhoaSJAhsygWSbz1hxfX1/OnDlDZmamPAjk6+vL7t27uXr1KoGBgSQnJ1NfX4+fnx82NjZcuHCB\n3NxcunfvzrVr1+R9HhXPQmKAQBlx3QWdkaqqKubMmYO7uzuff/65vL2mpobg4GBqa2t58803FTz+\njh8/zubNm3nttdfkniN5eXns3buXq1evUl5ejqmpKX5+fgQHB+Pg4KCwz6ZG58XFxYSGhpKTk4Op\nqanc56ehoYFjx45x/PhxCgoKMDExYciQISolugQCVbSWaPWgOJ97v8aib2pF98ApGJg3eq+IRKsn\ny5WsInZFpitdM9k7d9CwXnI/zab4+PgQERFBZmZmq0GgqqoqQkNDuXTpEnfu3KGyslKeoFVYVklq\nVDK9dPzl65s6uKNrZIb0Xg559yW8uyuGN6b6EhYWhpaWFiYmJtTV1bFnzx7Kysq4ffs2JSUlfPnl\nlwr73blzJ8bGxjx8+JDt27djZWWFs7Mz/fr14+7du/j5+TFhwgT27NnT4veBoaGhUpvW/8oYNvUP\nbb6srq5tWUSBQND5EEEggUAg6OI8K0ESFxuTp+p4HjdiYlA9unXr9tj3YWlpqRAACruSw7qDCZRX\n1mClYv3yyhqOJeQwITGXif6/STSkp6cTHR2Nv78/H330kfxDDBontg4ePPjIxuxiY8Jz/Zw5aGmE\nj7OV/N6TfSz+n//zf9oto9dcpuJJ0NZEQO/7FQBySbeWAley9oqKCnlb79690dfXl1f5XL16FW1t\nbfr06SM34b169SoODg4kJyfTvXv3Rx4YexYSAwTKiOsu6Gzo6+vj7u5OWloalZWV8gnW1NRU+WTk\n1atXFYJAzSsk09PTef/996msrGTQoEH06NGD27dvc/bsWWJiYvjkk09UZrH//PPPJCYmMmjQIHx9\nfRUkOrdu3cqRI0ewtLRk0qRJaGlpERMTQ1paGnV1dSqD/gJBc1pKtLJy88fKzV+h7VlOtOoMhF3J\nafVbqLyyhvM3yzjZ7J0bkHtbNn2GNKeuro733nuPtLQ0nJ2dGTFiBGZmZmhpaXGrUML6/2zD2Ebx\nXU9DQwNLV1/uZybyoKSAhgb45PswNJKuM3HMSFJSUmhoaGDPnj2cPXuWf/zjHyr3ffz4cQBKS0u5\nd+8eL7/8MsuXLycpKYmkpCTGjx/PpEmT2LNnj7qnSyAQPOWItxyBQCB4ShBBEkFznvWJwZiYGEJD\nQ8nNzUUikWBqaoqDgwMjRozgueeeAxo9gZKTkxVMdJOSkli1ahVz5sxh4MCB7Ny5kxs3bqCho
YGf\nnx9LlizB2tqagoICfvjhB65evUpVVRW9e/dmyZIl9OzZU2Ec69evJyIigm3btnFHqtnqx2i1pJiq\n8iLuXo/mzwvn42lnhIujLf369aNHjx4ADBo0SB4Ako111KhRlJSUkJWVxZw5c6ioqGDbtm1y/6CH\nDx8+knPau3dvAFJSUh67l9KjRp2JgKPxOYxPzMXif7MfS0tLVa5bUlICKGZJygI+CQkJlJSUcPXq\nVTw9PdHT08PR0RFra2sSExNxc3OjsrLykUrBNeX/s3ffYVGeWePHv0PvTQQRBIRYQEARKzZiiRpr\nYmLUJKu76ibGbNQE3VdNYvLT1TXRWBI1puyrxrpRo1iCChaIoggiHelI72VAkTa/P3hnwjBD0Wgi\n5v5c114bnv7M4PDMfe5zzp9lYoCgTLzvwtOmb9++xMfHExMTw8CBA4HGQI+GhgZubm5KJTJlMhnR\n0dF06dIFKysrZDIZX3zxBffu3eODDz5Q9KEACA4O5rPPPmPz5s3s2rVLZYJBVFQUmzZtUmlaHh8f\nz6lTp7CxsWHz5s0YGzf+u3jzzTdZtWoVJSUlioC9ILRGTLTqGCLSitp8jwBq71ex5XQUVqb6Su+V\n/BlQXUaMnDyIPGbMGJYuXaq0bvGOcy2e27y7BxKJBlWFdwEoSgqnpqSKCRMmUF9fz82bN7l7967i\n3NOmTWPBggVqj3X06FH27t2Lt7e30nKJREJMTEzrNy8Iwp+KCAIJgiAIwjPszzow6O/vz44dOzA3\nN2fQoEGYmJhQVlZGeno6AQEBiiBQa5KSkjh27Bhubm6MHz+e9PR0rl27RkZGBh9++CErVqzAzs6O\n0aNHU1BQQEhICB999BHfffcdenp6ao95ICip1S+jFTlJ1FaVo2fcCXNHd3Q6m2Bvp8358+eRSCTU\n1NQQExOjFICpra3lxx9/JCMjgy5dujBu3DgqKirQ0tJSDHIVFBQ83AvYgh49etCnTx+uXbvGhQsX\nFCV7mkpPT8fc3BxTU9PHcs7Hob0DAchgy+koVkxsrIMeGxtLfX29UtYVoCjn5uzsrLS8b9++3Lp1\ni6CgIDIyMpgzZ45inYeHBzdu3FDs86SCQHJiYsCfk3jfhT9K8+cMS7vGz9HIyEilINBzzz2Ht7c3\nX3/9NdnZ2dja2pKamopUKlUMYiYkJJCVlUXv3r2VAkAAI0aM4PTp08TFxREbG6vUuBxgwoQJKgEg\ngICAAABmzpyp+NsIoKOjw9y5c1m1atVjey2EZ9+ffaJVR9DWM7fc/ZJc6moecDA4Sen9io6OBlD7\neSKXm5sLoBKASS+QEnqr5bLJesad0NY3orq8iMrCTErTY9DSNcTCrgfTpk3j5s2bfPnllyxevBiJ\nREJc3K+lo6urq8nIyFBMzJIHr6Ojoxk0aJBiu7KyMsXnniAIAoggkCAIgiD8KfzZBgb9/f3R0tLi\nyy+/VAlGVFRUtOsYYWFhKjOQt2/fzoULF1i+fDkvvfQSM2fOVKw7fPgwBw4c4Pz580ydOlXleHcL\nK9XWkG/KzN4VYxsnOjl5YjdgAjJg3lsjmZadzMcff0xDQwPXrl1j+fLluLq6KmZZ6+np4ebmhoOD\nA3/7298Ux9PR0UFXVxc/Pz+kUqmi/NjkyZNbndnYGl9fX1avXs327ds5deoUvXr1wtDQkKKiItLT\n08nIyGDTpk1PVRCovQMBADIZ/BxbQr9+/bh9+zZ+fn689NJLivV37tzhypUrGBkZMXToUKV95T1+\nfvzxR2QymVKgx8PDg4sXL3LmzBkkEgnu7u6//cYEQRD+YC2V2Wyorycjt5KA4OssWLCAqqoqUlJS\nmDFjhuKzMjIyEltbW5U+acnJyUo/N+fh4XJ0lhYAACAASURBVEFcXBypqakqQaCePXuq3SclJQVA\nZXsAV1dXReasILTXn3WiVUeQXiBt85lbrq6mmrzoK0Rpv
[... remainder of diff omitted: base64-encoded binary payload (embedded notebook attachment/output data), not human-readable ...]
374gcTERGpra3FycmLGjBltdr9Cx6Jp83m4Gwzu72SlmBySnJyMo6OjYpusrCx+/vln\nxf8bl7XS0esEQLX0JvomDe/vdv38qCi4RGXpNW6XXqOuuhKZrB6fMf78delr6Og0DOU4ODjg4eFB\nz549SU9PJzY2lk6dOmFnZ4e1tbXa1xmZTEZGRobIBBI6nOZ6cbXFfg+ipKSEkydP4uDgQN++fR/Z\neQVBEFoigkCC8JQoLCwkKCgIR0dHJkyYQFlZGVFRUaxZs4YVK1YwcuTIZvffs2cPV65coXfv3vj4\n+FBTU0NaWho7duwgOTmZjz/+WKmZYW1tLWvXriUxMRErKytGjx6NoaEhhYWFREdH4+Hh0WwQKDs7\nm7/+9a9UVlayZs0a+vXr91D3LwbEhPt1v2XFsrOzSUlJYfjw4UoBIGgoU/Pqq6/y8ccfc+bMGUXm\nUGhoKNra2ixdulQlkBoQEMD+/fs5fvy4ShBIX1+fBQsWKP3NOTg44O7uTkpKClVVVRgYGLTK96G9\na2mmbmdnbzo7e1NbXcXtkjz62FaSEn9W0S/CzMwMiUSi1FS9MXVBFjn5bPN7yfuAqKs1ruk+D9NT\n5Gmf8SuaugsPo6V+A40HZsvzL6KlrYOekZnKwKybmxtffvklu3btIjExkdjYWExNTRk9ejQBAQF0\n7dq1Te/D0dGRb775hl9++YXY2FgiIiLQ0tLCwsICZ2dnZs6cqVQW08HBgeXLl/O///2PgwcPUl3d\n0OekLYNAFhYWvPzyy0RGRlJUVMTMmTORyWRcuHBBUdItIyODzp07Y2try6xZsxgxYgR/+9vfiI+P\nx9nZGW1tbcrKysjOzubChQs4OzuzbNkynn/+ecXPQp38/HyCgoKoqKhg4MCBODs7U1BQwLp16xg4\ncGCb3bPQMdxP83m5pNxScooqGDNmDHv37mXr1q0kJydjb29Pfn4+cXFxDB06VFFGqnFZK5MuThSm\nneFyzH7Mu/dBW0cPbT0DnEc3/P1lHgmmojCX3s+/ifdwVzp16qR0bgMDA15++WWl97zz58+zbt06\ngoKC8Pb2pnv37kgkEoqLi+ncuTPbt2/npZdeUjqOvr4+27ZtU+k1KAjthaONCV7dLe/r77NvD8tH\nMg5w4sQJrl69ysmTJ6mpqWHWrFlP/TO5IAjtiwgCCcJTIiUlhRdffJHXX39dsWzSpEmsWLGC7777\njoEDBzbbHHHRokXY2tqqPMj8+OOP7Nq1i9OnTysFknbs2EFiYiKDBw/m/fffV+prUlNTo3aGvFxi\nYiIbNmzAwMCAjRs34uSkPnVbENrCg5YVS09PBxqCBjt27FA57s2bNwHIy8sD4M6dO2RnZ2Nqasq+\nffvUXouurq5i+8bs7e3V/r1aWTXMcrt169ZTEwTSdKaujp4BpvYu1PewZKylkaJfxLBhwzA2NiYn\nJ4fa2lqVQUN5I2V1iouLKSoqUhksaa7fR1paGjKZTOW19N59unXrhr6+PtnZ2UilUkWj93u3v99+\nF1paWk0GvARBaLnfgLWrD9auquXV1A3Mdu3alXfeeUej886cOVOlf4hcU6VqoCGDp6kSMGZmZsyd\nO5e5c+dqdA3PPvtsk1nXy5YtY9myZUrLbGxsmi0/09y6bdu2Kb5OTk5WBIEaS0pKwtPTkw0bNiiW\n7dixg/j4eCZPnswbb7yh1Mvo22+/5ciRI3Tr1k3lZ3GvzZs3U1FRwRtvvKE00SImJoaPP/642X2F\nJ9/9NJ+/d78XBjuxceNGgoODSUtLIz4+nm7durFo0SL69eunCAI1Lmtlat+LbgOfo+RiPMXp0dTX\n1aFvbI6122CVc2haDsvb25tvv/2WvXv3Eh8fT2pqKjo6OlhaWuLt7a22XJ0gdASvjnJh5fYYjZ7/\nJRKYOfLhMv81FR4eTmpqKlZWVixYsED8jQmC0O6IIJAgPGHuLXvWzajh6cj
IyEilQbmLiwu+vr5E\nRkZy9uzZZmdMd+nSRe3yqVOnsmvXLuLj4xVBoPr6eg4ePIienh5LlixRaWyvq6vbZE+OY8eO8fXX\nX2NnZ8fatWuxtrbW+N4F4WE9TFmxiooKoCGImZiY2OQ5KisrgYZAjUwm4+bNm4SEhNzXdd4bEJCT\nZyo9jsbej0NLM3Xv7d0BDTN1624VAnf7Rbi6unLp0iUiIiKYMGGCYtvIyEguXLjQ5PHr6+v597//\nzXvvvac4R2FhIWFhYWhra+Pr66uyT35+PgcOHFCq/R0TE0NKSgp2dnaKHiE6Ojr4+vpy+PBhfvzx\nR958803F9gUFBYSFhaGjo3PfZTJNTEzIycmhurpabRlPQXja3W+/gYfd72l077PqTWm1RvvJZDL2\n79+PhYWFSjaslpYW8+fPJyIiguPHjzNkyJAmj1NSUkJiYiK2trYqfRiGDBmCp6enUu8W4emjafN5\nl3Hz1O7n4ODABx98oHYfeXA0p6hCablNn6HY9Bna4nnuLWvVXJDYxsaGhQsXNnn9jakL9ApCe9Tf\nyYplk7yarASgb2zOgFlrkEhg+eS+j6z/b+MJC4IgCO2R+LQiCE8IddkLAHdu3SAvrwzf4b3Uzor0\n8vIiMjKSrKysZoNAVVVVhIaGEh0dzdWrV6msrETW6Knr+vXriq+vXLmCVCrFzc2tyX4W6oSGhhIT\nE0OfPn344IMPFPXoBeFReNiyYvLMnD/96U9KPQ6aIg/kODs789VXX7XafTxNWpqpq653h7Qol1Lt\nCkb49MXb2xsAf39/IiIi+L//+z/Onz+PtbU1WVlZpKenM2jQIOLi4tQe39HRkYyMDJYtW0b//v2R\nSqVERUUhlUp57bXXsLOzU9ln4MCBbNu2jd9++w0nJycKCgo4c+YMenp6LF26VClgNXfuXFJTU9m/\nfz+ZmZl4eXlRXl7OqVOnqKysZOHChYreA5ry9vYmMzOTNWvW4OHhga6uLk5OTgwerDrbWBCeRh2h\n30BH1dSzambiZSTlZSRklzQ7WHf16lUqKiqwt7dn165darfR09NTm0HbmLxht7u7u1IgSc7Ly0sE\ngZ5yjyIY3J7LWglCezehf3dszQ3ZEZVJUq7q31DfHpbMHOnyyAJAgiAIHYEIAgnCE6Cp7AW58spq\nTl26yeHEPMb3c1BaJ28g3lzfi9raWlavXk1GRgY9evRg5MiRmJmZKbIOQkJCqKmpUWwvP1bnzp3v\n6z5SU1ORyWR4e3uLAJDwyD1sWTE3Nzeg4fdYkyCQgYEB3bt35/Lly1RUVGBiIj7U36+WZuo21btj\n+HMvsj5ovlJT5Y8//pj//Oc/xMbGoq2tjYeHB5999hlnzpxpMghkbGzM2rVr+fe//01ERAS3b9/G\nwcGBadOmMXr0aLX7uLq6EhAQwI8//sj+/fuRyWT07duXOXPm4OKiXK7CxMSEzz77jJ9//pkzZ87w\nyy+/oK+vj6urK9OmTaN///73/T175ZVXkEqlxMbGkpaWRn19PX5+fhoFgZKTk1m1ahWBgYFqy1bN\nnz8fuFvmqba2lkOHDhEREUFhYSE1NTWYm5vj5OTE5MmTVXq9Xblyhd27d3P+/Hlu3LiBkZER3t7e\nzJw5U23/FNHUXWgLYmC2bWjyrLpyewzLJ/dVeVaVk2fc5ufnN5tBK8+4bYr8OVX+DHwvCwuLZvcX\nnnyPKhjcXstaCUJH0N/Jiv5OVirZpf0crcR7siAIghoiCCQIHVxL2QtyNZVSvtyfhI1ZJ6UZMfIm\n5U2Vl4KGUkUZGRn4+fmplAkoLS1V+SAuP1bj7CBNvP322+zevZuQkBBkMhmvvvrqfe0vCA+qNcqK\nubi44OHhwZkzZzhy5Ajjxo1TPU9ODhYWFopyiC+88AJff/01X331FcuXL1f5O7x16xaFhYVqe8sI\nLc+4bap3h+94d5XMSHd3dz755BOVbR0
dHZvs0wFgaWnJu+++q+EVN+jdu7fGPSeMjIyYN28e8+bN\na3Hb5nqDyBkYGLB48WIWL16s0fkfxpdffsnJkyfp0aMHY8aMQV9fn+vXryt6JDQOAv3222+sX7+e\nuro6Bg8ejJ2dHSUlJZw9e5Zz586xfv16pb8D0dRdaEtiYLZ1afqsKpOh9llVTp5xO3ToUFatWvXA\n1yN/r5U/A9+rrKzsgY8tPBkeVTC4pbJWco+6rJUgdCSONiYi6CMIgqABEQQShA5O0+yFytICaqvv\nsCMqU+kDhLy5uLOzc5P7FhQUAKhtbqiuXEa3bt0wMjIiOzub0tJSjUvCGRkZ8dFHH7F27Vp27txJ\ndXU1r732mkb7CsLDaK2yYkFBQaxevZqvv/6asLAw3NzcMDIyoqSkhJycHHJzc/nss88UQaBx48Zx\n8eJFDh48yBtvvEH//v2xsbGhoqKCwsJCUlJSGDt2LEuWLGnz70FHJMo2tV/y0ni9evXi888/Vym5\nJJ/RDw3Bzk8//RR9fX02btyIg8PdLIDc3FyCgoIUwVK5J62p+71ZVJGRkWzatIlly5Y1W6q1sU2b\nNhEZGcm2bduwsbFps2t9GoiB2dal6bMqNASC7n1WlZM/X/7+++/U1tYqsjnvl/yZV54Nee/rk/zZ\nWHi6PapgsChrJQiCIAjCoyCCQILQgbWUvdBYbXUV15JPkKT7HDlFFTjamJCZmcnx48cxMjJi6FD1\njUgBxWBScnKyUsmga9euERwcrLK9lpYWkyZN4qeffuK7777j/fffR1dX9+611NYilUoVA+GNderU\nibVr1/LRRx+xd+9eampq+NOf/qTRPQpPjrCwMA4dOkRhYSHV1dUsWLCAqVOnttn5WqusmJWVFZs2\nbSIsLIwzZ85w/Phx6uvrMTc3p3v37kyePJkePXooHXvRokX4+Phw6NAhzp8/j1QqxdjYGGtra6ZN\nm8azzz7bZvfd0YmyTW2vcYmN4rwrGjfLlkgkyGQydHV1lTLo5BqXPzx69ChSqZSFCxcqBYAAevTo\nwfjx49m3bx95eXk4ODiIpu7CIyEGZlvH/TyryiXllpJTVKGyXFtbG39/f3bu3MmWLVtYsGABenp6\nStuUlpYilUpVXksas7Kyol+/fiQmJrJ//36VQLJ4/RDg0QaDRVkrQRAEQRDa2mMPAkkkEl1gMdAP\n6A+4A7rAGzKZ7J8t7DsXWPLHPnVAAvCZTCbb36YXLQjtREvZC42Z2Pbg+sUEpCX5fFF7AWcLHaKi\noqivr2fJkiWKEhvqyEvz/PLLL+Tk5NCzZ0+Ki4uJjY1l0KBBFBcXq+wTGBjI77//TmxsLG+++SaD\nBg3C0NCQ4uJiEhISeP3115uc3ayvr8+HH37Ihg0bCAsLo6amhsWLF6sdSBSePCdPnmTLli04Ozsz\nZcoUdHV16d27d5ueM3Lvf4g/eBiPF5aib6zaI+B+yop16tSJGTNm3FdvkkGDBjFo0CCNtm2u3Ney\nZctUSjY+6UTZprahroF7RWEOmbnX2X4ykz5Dm2/gbmhoyODBg4mNjeXtt99m+PDhuLu74+bmhr6+\nvtK26enpAGRnZ7Njxw6VY129ehVAEQR6Gpq6P/PMM2zevFn0JnnMxMDsw7ufZ1VN9nvllVfIzs7m\n0KFDxMbG0rdvXzp37szNmzfJz88nLS2NOXPmNBsEgoYJGEFBQWzdupWEhAScnJwoKCjg7Nmzitcu\nQXjUwWBR1koQBEEQhLby2INAgBGw6Y+vC4FrQPNP7YBEIvkMeBe4AmwF9IAAIEwikfxZJpN92zaX\nKwjth6YzsgH0jCxwGDyJ/IRI4k4d46q5AT179iQgIIABAwY0u6+BgQHr168nODiY5ORk0tLSsLW1\nJSAggBdeeIGoqCiVfXR0dFi7di2HDh3i6NGjHD16FJlMhqWlJUOHDsXd3b3569XTY/Xq1fz9738n\nPDy
cmpoali5dKgJBT4G4uDgA1qxZo3EpwYfV1bLpnljNEWXFHr/HVbappd4799KkX0970VID97zr\nt1ps4A7w//7f/2P37t2cOHGC7du3Aw2v7cOHD+f1119XNGWXl4Y7fPhws9clb/b+NDR1NzIyarZX\nn/BoiYHZB3c/z6qa7Kejo8Pq1as5fvw4ERERxMXFUVVVhampKba2tsyaNQtfX98Wj29vb8/nn39O\ncHAw58+fJzk5GUdHR1avXk15ebkIAgkKIhgsCIIgCMKToD0EgW4DzwOJMpmsQCKR/BVY09wOEolk\nGA0BoEvAIJlMVvbH8k+B34DPJBLJfplMltOWFy4Ij1tLTdEB9I3NGTDr7p+Us28Ai8a788JgJ7Xb\n+/n5qc3QsbKyIigoSO0+TQ1samtrM3nyZJVyPfeaOXOm2sbrOjo6D9X4V+iYSksbZlo+qgAQgKWJ\nAaad9FresBFRVqz9EGWbWo8mDdxl9fVqG7hLpVKlwIWenp7i9b2kpISUlBQiIyM5duwYhYWFbNy4\nEbjb7P2bb77B0dGxxWvsqE3dZTIZBw4c4ODBg1y7dg0TExOGDh3K7NmzVbZtridQYmIiISEhXLp0\nCV1dXTw8PJg3b94jugtBuD+aPKu6jJundr+mni8lEgnPPvusRuVSbWxsmjyOnZ0dK1euVLtO015c\nwtNDBIMFQRAEQejIHnsQSCaTVQOH7nO3hX/8u04eAPrjWDkSieQ74APgNVoIJglCRyeaogtPkh07\ndhASEqL4v7+/v+LrsLAwoqOjOX36NBkZGVy/fh1oaBLt5+fH5MmT1WaJ3blzh7CwME6fPs2VK1eA\nhoBm//79mTFjBubm5orzdO1sRNq+rxSD3/rG5ni8sFTttYqyYu2PmKnbOppr4K6j11D6sOZ2OaDc\nwL2goEAlCNSYlZUVvr6+jB49mjfffJO0tDQqKiowMTGhd+/enDlzhtTUVI2CQB21qfvWrVsJCwvD\n0tKSCRMmoK2tTUxMDBkZGRo3uT99+jQbN25EV1eXkSNHYmFhQVpaGkFBQTg5qZ/cIQiPk3hWFQRB\nEARBEITH77EHgR7QmD/+DVez7hANQaAxiCCQ8IQTTdGFJ4mXlxfQMAO+qKiIwMBApfXBwcFoaWnh\n5uZG586dkUqlJCUlsWXLFjIzM3nnnXeUtr916xarVq0iOzubrl27Mm7cOHR0dLh27RpHjhxh6NCh\nmJubExgYSHR0NNnZ2QS+/BLH0hv6EGjrGqi9ztYuKya0LjFT98G11MBd39QKbT0Dbl75nZoqKboG\nRiTllpJx5To7/vm90rY3b96krKxMJahTVVVFVVUV2traiqDH2LFj2bVrFyEhIbi4uODq6qq0j0wm\nIyUlRfEa0RGbul+4cIGwsDDs7Oz4/PPPMTFp+B2dPXs2q1atorS0FBsbm2aPUVVVxXfffYeWlhaf\nfPIJLi53A9H//Oc/2bdvX5vegyA8CPGsKgiCIAiCIAiPX4cLAkkkEiOgK3BLJpMVqNkk849/XdWs\nU3e835pY1bZdyAWhlbR2U/SioiLmz5+Pn5/fU9dgXni8vLy88PLyIjk5maKiIpUSgWvWrMHOzk5p\nmUwmY9OmTRw9epRJkybh5uamWLd582ays7OZOHEiixYtUsoUqqqqoq6uDmgoR1hUVER2djar//wa\n86RaoqyY8FRqqYG7lrY2Nm6DKUg+SfrB7zF36I2svp4lv/2XAW49lEo4Xr9+naVLl+Lo6IijoyNW\nVlbcvn2buLg4ysrK8Pf3p1OnhswiExMTVq5cybp16wgKCsLb25vu3bsjkUgoLi4mPT2diooK9u7d\nqzh+R2vqHhERAcCMGTMUASBoKJk3d+5cjUqfRkdHU1FRwZgxY5QCQACBgYFEREQo+iUJQnvS2s+q\ngiAIgiAIgiDcnw4XBALM/vj3ZhPr5cvVdwsWhCfM42qKLgitQV3pr
qbcGwCChr4AU6ZM4ejRoyQk\nJCiCQDdv3iQqKgpLS0tef/11lVJxBgbqs3xAlBUTnl6aNHDv0tcXiY4u1y/Gc/1iPDoGxjiP9+Nv\nf13O4sWLFdvZ2try6quvkpycTFJSEuXl5ZiYmNC1a1fmzZvHyJEjlY7r7e3Nt99+y969e4mPjyc1\nNRUdHR0sLS3x9vZm2LBhStt3hKbujV9Dfj0dz+07tXh6eqps5+7urlLSTp1Lly4BqD2GkZERTk5O\n7TILShDEs6ogCIIgCIIgPF6tEgSSSCQ5QI/72GW7TCab1RrnflgymWyguuV/ZAgNeMSXIwgPRDRF\nFzqahOwStp/MVFse5kbyVfQrq1WWyzMBzp07x7Vr16iqqlJaL+8TBJCRkYFMJsPDw6PZgE9zRFkx\n4WmjSQN3iURCF48RdPEYoVg2cbw7+vr6bNu2TbHMyMiIgIAAAgICND6/jY0NCxcubHnDP7TXpu7q\nXt9SLxZwp6KUT8LSmTtWR+n9WFtbG1NT0xaPK8/yMTdXP8/JwsLiIa9cENqOeFYVBEEQBEEQhMen\ntTKBLgFVLW51V/5DnEue6WPWxHr58hsPcQ5B6HBE9oLQUYQnXG52NnBxeSW3iso4nJjH+H4OQMPg\n5/LlyyksLMTV1ZUxY8ZgbGyMtrY2UqmU0NBQampqFMeQD5Z27ty5ze9HEJ4UooH7w2vq9U1bVx+A\nxMwrpBdKWT65r+L1ra6ujvLycqysmv8+GhkZAXDjhvpH3LKysoe8ekFoW+JZVRAEQRAEQRAej1YJ\nAslkskc23VIm1cmByQAAIABJREFUk0klEslVoKtEIrFT0xdIXkQ641FdkyC0tcZ9el5++WV+/PFH\nkpOTKS8vZ926dXh5eSmyJKKjoykqKkJHR4devXrRY/p0HG36qxyzsrKS7du3c+rUKcrLy7GxsWHC\nhAk888wzD3SNK1euJCUlhbCwMI338ff3x9PTkw0bNjzQOYWOJyG7pMVyMAAyGXy5Pwkbs070d7Li\n119/pbCwkMDAQJVeQenp6YSGhiotkw+WNs4OEgSheaKB+8Np7vXN0NKO26UF3CrKRd/EQun1LS0t\njfr6+haP37NnTwBSUlIYN26c0jqpVEp2dnar3IcgtDWRaSsIgiAIgiAIj1bLBcjbp6N//DtBzbqJ\n92wjCE+MgoIC3n33XYqKivD19WX8+PEYGhpSVFTEsmXL2L17N2ZmZkycOJGRI0dy5coV1qxZw+HD\nh5WOU1NTw+rVq9m3bx+mpqZMmTIFLy8vdu7cyT//+c/HdHcNduzYgb+/P8nJyY/1OoS2sf1kpkaN\noaEhELQjKhOA/PyGBNJ7+4IAantguLq6IpFISE1NVSkbp468H0ddXZ1mFycIT6hXR7lwTwutJokG\n7sqae32z7NkPgGspUdTeua14fauuruaHH37Q6PjPPPMMxsbGnDhxgszMTKV1ISEhigxIQRAEQRAE\nQRAEQWistcrBPWr/AGYDqyUSyS8ymawMQCKROAJLgDvAvx/b1QlCG0lLS+Pll19mzpw5SstXrlxJ\ncXExK1asYNSoUYrlUqmUlStXsmXLFoYMGaLoI/C///2PzMxMhg0bxvvvv4/kjxG/6dOns2zZskd2\nP5s3b0ZfX/+RnU94vHKKKu4rwwAgKbeUnKIKbG1tARTN3+WysrL4+eefVfYzMzNj1KhRnDhxgn/9\n618sWrRI8XsOUFVVRV1dnSJjyMSkYUZycXExdnZ293trgvDEEA3cH0xLr2/G1g7Y9B5CUXoMFw78\nA4vu7lz5TYsrR7ZgZ22BpaVli+cwMDDgrbfeYuPGjbz//vuMHDkSCwsL0tLSyM3NxdPTU21QXBAE\nQRAEQRAEQXi6tYsgkEQieR/o/cd/+/3x72sSiUTedfiUTCZTpCfIZLIzEonkC+AdIEkikewG9IBX\nAEvgzzKZLOeRXLwgtIF7a6V3M
2oYiTM3NycwMFBp2+zsbFJSUhg+fLhSAAgaSmK9+uqrfPzxx5w5\nc4bnn38egIiICCQSCfPmzVMaGLe1tcXf35+QkJA2vsMG3bp1eyTnEdqHxJySB95vzJgx7N27l61b\nt5KcnIy9vT35+fnExcUxdOhQoqKiVPZbuHAhubm5HDp0iOTkZAYMGICOjg6F/5+9Ow+LsmofOP4d\n9kUWYRhEQQHFBUHEfUfF3Le0TCmVN2zTFrO0bLO3UuvVSiuzLE0tsfdnaoK5hDuIoojIpgKyiIrs\nyDCyDczvD96ZGGdAxF3P57q60mc9zyM8z8y5z7nvnBxiYmL48MMP8fb2BsDHx4dt27bx3Xff0a9f\nP8zNzbG0tGTs2LG3dc2C8DASBdxvXWOeb626j8DUyo685JPkp0RjaGqBzfDBfPrRPF5//fVGnad/\n//588sknBAcHEx4ejrGxMV5eXixfvpw//vhDBIEEQRAEQRAEQRAEHQ9EEIjatG5+Nyzr97//1LRy\nVKlUqrckEkk8tTN/XgRqgBhgmUql2nkX2yoId83p9Hw2HUnRGU1cUVpMVlYR/m4dMDY21lp37tw5\noHbWT3BwsM4xr127BkBWVhZQWwsoOzsbqVSqmfFQXl7OtGnT8PDwYPr06ZogUGVlJVOnTqWqqop5\n8+YxZMgQzXF37drF6tWref3117VqE1RXV7N161b27dtHXl4etra2+Pn58dxzz2FkpP3IubEmUFBQ\nELm5uQC89957WtvWrTVUUVFBSEgI4eHhXLlyBYlEQps2bRg/frxOIEx4cFyvUDZ5Pzs7O7744gvW\nr19PUlISMTExODs788orr9C1a1e9QaBmzZqxbNkyzc/Knj17MDAwwMHBgSeeeILWrVtrtu3WrRtB\nQUHs3buXHTt2oFQqkclkIggkPLZEAfdb05jnm0QiwaFDLxw69NIsGzS4PZaWlqxdu1ZrW39/f/z9\n9Zfc7Nq1K127dtVZPnfu3Hs6m1cQBEEQBEEQBEF4ODwQQSCVSjW4ifutB9bfybYIwv2y5/TFBtPv\nlJRVciT1GntjsxjR1UWzXC6XAxAbG0tsbGy9xy8rKwPQ1Axo3ry5Zp2ZmRkeHh4kJydjZmamWZ6U\nlERVVRUAZ86c0QoCnTlzBqidQVHX8uXLSUxMpHv37lhYWBAdHc3WrVspLi6+aefU+PHjOX78OAkJ\nCfj7+yOTyXS2USgUvPfee6SlpdG2bVueeOIJampqOH36NMuWLSMzM5Pp06c3eB7h/rAwbdwrx+OJ\nQL37ubi48OGHH2qtUwcS6wYJ6zIzM2PKlClMmTLlpuedOHEiEydObFQbBeFxIQq4N05jn293aj9B\nEARBEARBEARBaCzxzVMQHgCn0/NvWn8BAJWEr3fGIbMx16ThsbCwAODFF19k3LhxNz2XugZKUVGR\n1nIfHx/Onj1LdHS0ZtmZM2cwMDDAy8tLE/QBUKlUxMfH06JFC51ATXZ2NqtWrdLUWJk+fTqvv/46\nBw4cYObMmVrBpxtNmDABhUKhCQKpU3XV9dNPP5GWlkZgYCCTJ0/WLK+srGTx4sVs2bKF/v374+7u\nftN7IdxbXV2bljqqqfsJgiDcK+L5JgiCIAiCIAiCIDyoDO53AwRBgE1HUm4eAPoflQqCw1M0f+/Q\noQMAiYmJjdrf3NwcJycnsq7ksG5XFMHhKfx5Ih2pczsADhw4oNn2zJkztGvXjn79+pGfn8/ly5cB\nSEtLQy6X68wCAggMDNQEgKB2Joafnx8qlYrU1NTGXWQ95HI5Bw8exMPDQysABGBiYkJgYCAqlYrD\nhw/f1nmEu8NVZoV365sXP6+rSxs7MQtBEIQHnni+CYIgCIIgCIIgCA8qMRNIEO6zjFy5Tg2gm4nL\nLCQjV46rzAoPDw86d+5MZGQkYWFhWvV5NOfIyKB58+bY2NhwOj2fbOPWnMmIJfPr73Eb+DQSiYS
a\n6mouXCwkLiGJLl6eVFZWcuHCBSZPnkyXLl2A2qBQq1atiIuLA9Asr8vDw0NnmYODAwClpaW3dJ03\nSk5OpqamBkBv/aPq6mrgn/pHwoPn2UEeLNwU1aigp0QCAQN1f54EQRAeROL5JgiCIAiCIAiCIDyI\nRBBIEO6z2Iz8Ju+nHkH89ttv8/777/PNN98QGhpKhw4dsLS0JD8/n4yMDDIzM1m+fDnH0q6x4q94\nqh28sbSPpvjiWc7vWoNVy7ZUV5aTm52FSlVDdqGc7Oxsampq8PHxwcXFBTs7O86cOcPo0aM5c+YM\nEolE70wgdbq5ugwNDQE0AZymUtc/SklJISUlpd7tysvLb+s8j6vc3FyCgoLw9/cnICCA9evXExsb\nS3l5OW3atCEgIICePXtqtg8ODmbz5s0sWbJEJ3Vf3WPVrQV1eMdvFP29E4t+gVy7lEJecjSVpUUY\nmzfDvl03HDsPQCKRUHwxkVZlKSye/wtmZmYMGDCA559/HhMTE71tLywsZP369cTExFBWVoaLiwtP\nPvkkfn5+erePiYkhJCSE5ORkysrKkEql9O3bl2eeeUbnZzgoKAiAb7/9luDgYI4dO0ZBQQFTpkwh\nICCAsrIyduzYQXh4OHl5eahUKmxtbWnXrh2TJ0+mXbt2Tfr3EATh4eLrJmXuGO+bpneVSODNsV00\naV0FQRAEQRAEQRAE4W4SQSBBuM+uVyhvez+pVMqKFSsIDQ0lMjKSQ4cOUVNTg62tLa1bt2bs2LEU\nqSxZ8dcZVCowMDSinf90suMPU5SZSN65KEwsbZC274E8O53UrBzsbNOxMjOhU6dOQO2sn1OnTlFV\nVUViYiKtW7fGxsbmjtyDxlJ3zk+YMIFZs2bd03M/TnJzc5k3bx4tWrRg6NChyOVywsPD+fTTT/ns\ns8/0zgC7FTIbc1yrz7Hvwgks7V2xdnLn2qVkrsQeQFVTTTtnGcqsI/T198POzo7Y2Fj++usvampq\nmD17ts7xSktLmT9/PpaWlgwbNgyFQkF4eDjLly+noKCASZMmaW2/efNmgoODsbKyomfPntjY2JCR\nkcH27duJjo5m+fLlmlpbakqlkvfffx+5XI6vry8WFhY4OjqiUqlYtGgRZ8+epWPHjgwfPhxDQ0Py\n8/OJj4+nc+fOIggkCI+Rkb6tcbS1IDg8hbhM3Vm+XdrYETDQQwSAbsPChQtJSEggNDT0fjdFEARB\nEARBEAThoSCCQIJwn1mY3vzX0LSZLd2eW9Tgfubm5kyZMoUpU6boPcbbG45pjUw2NDHDufsInLuP\n0CxTFFzm/O6fkXXsTV5FHj27dtTMvPDx8eHQoUPs2rWL8vJyvbOA7gQDg9pSZfpmDbVv3x6JREJS\nUtJdObdQKz4+noCAAKZNm6ZZ5ufnx6JFi9i2bdttB4EArhdmE7FzM/JqE2Iz8iksLmH9lx9iWZ6C\nee4VVv60GhcXFwCqqqp44403CAsL49lnn9UJPmZkZDBgwAAWLFiARCIB4KmnnmLu3Ln8+uuv9OvX\njxYtWgAQFxdHcHAwHTt25OOPP9aa9bN//35WrFhBcHCwTpCxsLAQFxcXli5dipmZmda5z549S58+\nfXj//fe19lGpVCgUitu+V4IgPFx83aT4uknJyJUTm5HP9QolFqZGdHWVPrI1gOqb/SkIgiAIgiAI\ngiDcfwb3uwGC8Ljr6tq00cC3sl9j6w5ZNHfCyMSMa5fOcykrCyfX9pp16o7/LVu2aP39TrO2tgYg\nLy9PZ52NjQ2DBw8mJSWF33//XW+gKDs7m5ycnLvStkdNRq6cP0+kExyewp8n0rmYV1uzSSaT8cwz\nz2ht261bNxwcHEhOTr4j5546dSr29va4yqyY2MuN54f7MHm0P0ZUM3r0aE0ACMDY2JiBAweiVCr1\n1nsyMDAgMDBQEwACcHR0ZNy4cSiVSg4ePKhZrh45/tprr+m
kffP398fd3Z1Dhw7pbXNQUJBWAKgu\nfWnqJBIJzZo1q/8mCILwSFM/3wIGejCxl9sjGwASBEEQBEEQBEEQHmxiJpAg3GeuMiu8W9sRnZRG\n4p8rsXfvSpt+EwDIjNxBQVosnSe+gWkzW80+XdrY3VJn0sYtO4j57Rva9J2Afduu9W4nMTCgmawN\nxZfO1/7d1lmzTiaT4eTkRHZ2NgYGBnh5ed3qpTaKt7c3EomEDRs2kJmZqelEVwclXn75Za5cucKm\nTZs4ePAgnp6e2NraUlhYSFZWFikpKcyfPx9HR8e70r5Hwen0fDYdSdEJDFaUFpOVVUTr9l6aGVl1\nSaVSzp07d0faoC9Fmp2dXb3r7O3tAcjP162h5eDgoPff29vbm82bN3PhwgXNsnPnzmFkZERERITe\ndlVVVXHt2jXkcjlWVv/8jpmYmODq6qqzfevWrXF3d+fIkSPk5eXRu3dvPD098fDwwMhIvGIFQRAE\nQRAEQRAEQRCE+0v0UAnCA+DZQR6cOpvWqG0lEggY6HFLxy+vqm70ts1auFF86TwVJQX8/PVnPO3n\njUwmA2pTwmVnZ9OuXTudWRR3iouLC2+++Sbbt29n165dVFZWAv8EgSwsLPj888/Zs2cPhw8fJjIy\nksrKSmxtbWnZsiWzZs3C19f3rrTtUbDn9MUGi5aXlFWy/2wBe2OzGNHVRWudoaEhqoaqnd8CfT8/\nhoaGADr1eOquq67W/Vm2tbXVWQbQvHlzAK5fv65ZJpfLqa6uZvPmzQ22r6ysTCsIZGNjozXTSM3A\nwIDFixfz+++/c/ToUdavXw/Upmf09/dn5syZ9c4eEoRHWXJyMtu3bycpKYmSkhKsrKxo06YNI0aM\nYMCAAZrtIiIi2LlzJ+np6SiVSpycnPDz82PixIkYGxtrHTMoKAiAVatW8dtvv3H06FFKSkpo1aoV\nAQEB9OnTh+rqarZu3cq+ffvIz8/H3t6eCRMmMHbsWK1jxcfH89577zFt2jR69uzJb7/9xrlz55BI\nJPj4+PDCCy8glUq5evUqGzdu5MyZM5SXl9OhQwdeeOEF3NzcdK65sLCQ//73v0RHR5Oens758+cZ\nMGAAixcv1glujxo1isTERH755RccHBxYunQphw8fxsPDgw4dOmBgYEB+fj5GRkb4+Pgwc+ZMWrZs\n2eA5CwsLsbCwoHPnzkyZMkXnnOqUl3PnzsXBwYHNmzeTmpqKRCKhc+fOPP/881qzMBsrODhY80zd\nv38/+/fv16ybO3cu/v7+qFQq9uzZQ1hYGFlZWahUKlq3bs2wYcMYNWqU3ufrkSNH2LZtG1lZWZib\nm9OtWzcCAwP1tkGpVLJnzx6io6O5ePEiRUVFmJmZ0bZtW5588km6d++u2bampoagoCAUCgUbN27U\n+4z+8ccf2blzJ++++y79+/e/5XsiCIIgCIIgCILwIBFBIEF4APi6SXnpiU68/qf28pZdh+LYuT/G\n5rWd0RIJvDm2yy0XlDYzNmz0trKOvZF17E1m5A6MSi9orZszZw5z5szRu9/SpUvrPaa/vz/+/v46\ny+sr6jxkyBCGDBlS7/GMjIwYO3asTqee0LDT6fkNBoA0VPD1zjhkNuYN/qypZwvpC8yUlpbeTlNv\nSXFxsd7lRUVFgHZQycLCApVKddMg0I30dVCqNWvWjFmzZjFr1iyys7NJSEhg9+7d7Ny5E4VCwbx5\n827pXILwsNu7dy/ff/89BgYG9O7dm5YtW1JcXExqaip//fWXJgi0ceNGtmzZgrW1NX5+fpiZmXHq\n1Ck2btxITEwMn376qc6MOqVSyQcffEBpaSm9e/dGqVRy+PBhlixZwqeffsquXbs4f/483bt3x9jY\nmIiICH788UdsbGwYOHCgTltTUlLYunUrXl5ejBgxgoyMDCIjI8nMzOSDDz5gwYIFODs7M3ToUHJz\nczl27BgffvghP//8s1b
wICcnhwULFlBYWEiXLl1wc3Pj6tWrXLhwgfnz5/Pee+/Rs2dPnfOfOHGC\nqKgobG1tkclkGBoaEhISgpOTEy+//DLZ2dlERkYSHx/PsmXLaNWqVb3nHDRoEPn5+URERHDy5Mmb\nnrN79+6MGjWKrKwsoqOjSUlJ4fvvv9ekZW0sb29vFAoFISEhuLm50adPH806dbDsyy+/5PDhw0il\nUoYPH45EIuHYsWOsXr2apKQk3n77ba1j7tixg59//hlLS0uGDh2KpaUlMTExzJ8/X+9AAblczpo1\na+jUqRNdu3bFxsaGoqIiTpw4wccff8xrr73G8OHDgdp314gRI9i0aROHDx9mxIgRWseqrKzk4MGD\nNG/enN69e9/SvRAEQRAEQRAEQXgQiSCQIDwghno707FVc4zs/uncMLawwpjaAFCXNnYEDPS45QAQ\nQFtHmya1ydpct86J8PDadCTl5gGg/1GpIDg8pcGfN/VsHn0p2lJTU5vUxqbIy8sjNzdXM2NNLT4+\nHoC2bdtqlnXs2JGTJ09y8eJFWrdufcfb4uTkpJnJ8Oyzz3L8+PE7fg5BeJBlZWWxevVqLCws+OKL\nL3R+z9TPi3PnzrFlyxakUilfffWVZubezJkzWbx4MSdPnmTbtm1MmTJFa//CwkLatm3L0qVLNTOF\nhgwZwrvvvsvnn3+Ok5MTq1at0jyfJk6cyCuvvMIff/yhNwgUHR3NW2+9xeDBgzXLvvnmG8LCwpg/\nfz5PPvmkVht+//13Nm3axN9//8348eM1y1etWkVhYSHTp09nypQpxMfHExUVxaBBg4iIiODrr79m\n3bp1OrNOjh8/zieffEJ+fj4FBQUAvPTSS8TExCCVSnnppZcICQnhp59+4vvvv2fx4sX1nlNt9OjR\nvPvuuzc9p4+Pj2bZhg0b+OOPPwgLC2Py5Mk696kh3t7eODo6EhISgru7OwEBAVrrjxw5wuHDh3F3\nd+eLL77QtOe5555j4cKFHD58mJ49e+Ln5wdAbm4u69evp1mzZqxcuVLzbJ85cyaff/45kZGROm1o\n1qwZ69atQyrVfmcpFAoWLFjAL7/8wuDBgzX124YPH87vv//Onj17dIJA4eHhKBQKxowZI9J6CoIg\nCIIgCILwSNAt+iAIwn1jY2HC+J6u/PjSIF4Z4YksN5LCv79i8aROLJvRV9Mhr1KpCAkJYfbs2Uya\nNImZM2fyww8/oFAoCAoK0qTMUWvR3EIT0JFfTSclbD1n/ruUM//9nAsHgym/lqe1fcxv/6bqahIW\npkYEBQUxbtw4xo0bp3Nc4eGRkSvXqQF0M3GZhWTkyutd3759ewD27dunNRsoPz//lmfa3I6amhp+\n+eUXrVR1OTk5hIaGYmhoqNW5O2FCbb2tb7/9lsJC3ftRXl7O+fPnG33unJwcrl69qrO8tLQUpVKp\n6XAUhEdZRq6cP0+kExyewpJVG5Ffr2Dq1Kl6A63qTvqwsDCgNtWnOgAEtakfg4KCkEgk/P3333rP\n98ILL2iliuvcuTOOjo6UlpYSGBiolW6yRYsWdOrUiczMTGpqanSO5enpqfWMABg6dChQO3Pwqaee\n0rsuLe2fFK75+fkcPX6S65hR4eDNnyfSuVygAKBVq1b4+fkhl8v1Bi8GDRqkFYzp0qULs2fPBmpT\n6gGMHTsWJycn4uLiyM3N1Zzz9OnTODg4MGnSJK1jdurU6ZbOCTBy5Eitc95J6n/rwMBArYCUmZmZ\nJr1b3X/rQ4cOoVQqGTt2rFZwXyKR8K9//UvvzExjY2OdABDUDlZ44oknKC0t1bo2Ozs7+vTpQ2pq\nqs6ghd27dyORSHSCQ4IgCIIgCIIgCA8rMbxNEB5ArjIrXGVWZEQ6UJxhQWuHZlrrf/jhB3bt2oWd\nnR0jR47EyMiIqKgokpOTUSqVekeutrK35OrlZIovJWPdsi1Sj+6UX8vn2uUUrhdcodPY2
RiZ1c5C\ncuriR2fLYkoLrjJ+/HhNh9rdqgMk3H2xGbqzdRq7n6vMSu+6Dh064OXlRUJCAvPmzcPHx4fi4mJO\nnDiBr68vERERt9PkRnN1dSU5OZm5c+fi6+uLQqHQjOT+17/+hZOTk2ZbdW2NjRs38uKLL9KjRw8c\nHR0pLy8nNzeXhIQEPD09+fe//92oc6enp7NkyRI8PDxwcXHBzs6Oa9euERUVhVKp1OlAFoRHyen0\nfDYdSdEKMJ8/chJFQQG70qB1en69swkvXKhNN3pjMAJqAydSqZScnBwUCoXWu8fS0lLrd1rNzs6O\nnJwcrZl/avb29lRXV1NUVIS9vb3WOg8P3Rp76m3c3d01aS9vXKeetXM6PZ/l63cQl1mAnVtLfouo\nDQ7JczLIySoiI1dOry5dOHjwIGlpaZogktqNdXu8vb01wQx1Wk0DAwM8PT3Jzs4mLS0NmUymCUJ1\n7txZ7zu/yy2cE9A5581k5MqJzcjneoUSC1MjnC3rn2Z64cIFJBIJ3t7eOuu8vLwwMDDQ/Dyotwf0\nbt+iRQscHBw0wbC6Ll68yLZt20hISKCoqEhTU1DtxsD/6NGjOXr0KHv27OHVV1+tva6MDE06wRtn\nlwqCIAiCIAiCIDysRBBIEB4yiYmJ7Nq1i1atWvHll19qOsdmzJjBBx98QGFhod6OCxsLE6rk2UiH\nBmDVwl2z/PLp/eQkRlBw4TSOnfsjkcCy91/n7KGt7N9/lQkTJoiOkEfA9QrlXdnvgw8+YN26dURF\nRREaGkrLli0JDAykW7du9ywI1KxZM/7973/zyy+/sG/fPq5fv46LiwuTJk3SpBeq66mnnsLT05PQ\n0FCSkpKIiorCwsICe3t7RowYoXef+rRr146nnnqKhIQETp06RWlpKTY2NrRr145x48ZpFSMXhEfJ\nntMX9dYYU1aWA3ChqJqFm6J4c2wXRnR10dn/+vXrAFqzgOqys7MjLy9PbxBIH0NDw3rXq9fpq1+m\nr75MY46lVCo196DgYm1Awthce8BGSVklvx9NxVrmDOgPsDRrpr2Pra2t5hx1Zy6p75NCodD6f333\nT728Meese136ZkvVpS/wB1BRWkxWVhEdCnTPp1AosLKy0husMjQ0xNrammvXrmltD7X3Qp/mzZvr\nBIHOnz/Pe++9R01NDT4+PvTu3RsLCwskEglpaWlERUVRVVWltU+XLl1wcXHh8OHDBAUFYW5uzt69\newEYNWpUg/dBEIQHX25uLkFBQfj7+zN37tz73RxBEARBEIT7SgSBBOE+uZVRtHXt378fgClTpmh1\nUBkZGTFz5kwWLFhQ775PjRvB0EnTCA5PIS6ztgNH6tGNnMQIFAWXteoOnT3U9GsTHjwWpjd/3Js2\ns6Xbc4vq3W/p0qU6+1haWvLaa6/x2muv6awLDQ3VWTZ37tx6v4gHBATo1JJQ8/f3x9/fv8FzvPXW\nW3r31cfT0xNPT89Gbbt27dp610mlUmbMmNHo8wrCo+B0er7eABCAkYkZFUDVdTmGxqZ8vTMOmY25\nzowgdfClqKhI78we9ayNB3UGak7xdc09MDQxBaCqTKGznaqmhg1hsZhfr9S6lvLycr3HLS4u1ru8\nqKgIQGdmbmO3vxPqC/yplZRVsvPURZ6IzdIK/FlaWiKXy/XOVK6urqakpEQrGFf32vSlFFRfW13/\n/e9/qaysZMmSJToziLZs2UJUVJTeNo8aNYo1a9Zw6NAh/P39OXjwIPb29vTs2VP/RQqCIAiCIAiC\nIDyERBBIEO6xpoyirUudAkZfB3aHDh00o3n1adeuHb5uUnzdpJogVOn1ClZGWtGrs4xlM/o24YqE\nh0FXV/0pme7WfoIgNOxhHqG86UhKvYEAC6kzioIrlFxJxcxGikoFweEpOkEgd3d3Lly4QEJCgk4Q\nKDs7m/z8fBwdHR/YIFBiVhG2LWv/bN68BQCKvIuoa
qqRGBhiZGIOQNX1EioV1ygsVGhS1WVnZ9cb\nBIqPj2ctlNqvAAAgAElEQVTq1Klay2pqakhKSgJq71vd/ycmJlJdXa3z7o+LiwPQmx6vKRoK/AGa\nOj2qmhqdwJ+7uztnzpwhMTFRJ/1fYmIiNTU1Wu1s27YtkZGRxMfH06VLF63tr169Sl6edh1DgCtX\nrmBlZaU3hVxCQkK91zV06FA2bNjAnj17MDExQaFQMG7cOJ00gIIgCIIgCIIgCA8z8Q1HEO6hPacv\nsnBTlE4ASE09inZvbFa9x1Cn0NGXJsXAwAArK/31W0A7BYyrzIqJvdx4bnBHWthaYG0uYsKPMleZ\nFd6t7W5pny5t7OqtByQIwuMpI1de7zsMwKF9DyQGhlxNOEL5tdrO+rjMQjJy5QDk59fWJ3viiScA\n+P3337VSgdXU1LB27VpUKhXDhw+/W5dxW65XKMkrKdP83cTSBmsndypKi8k9VzvjxNRaiqGJGQUX\nYilIi+V6tSEt23pRWVnJjz/+WO+x4+LiOHnypNaynTt3kp2dTZcuXTTpWaVSKV27diU3N5eQkBCt\n7c+fP8/hw4dp1qwZffvemcEdDQX+AAxNzJFIJFRdv6YJ/Kmp/603bNhARUWFZnlFRQXr16/X2gZg\n8ODBGBkZsXPnTq20byqVil9++QWVnoY4Ojoil8vJyMjQWh4WFkZMTEy97ba0tMTPz4+0tDR+/fVX\nDAwMGDFiRP0XKgiCIAiCIAiC8BASvb6CcI/cbBSthgrNKFp9zM1rlxcXF9OiRQutdTU1Ncjlcp3C\n14IA8OwgDxZuirr5zyAgkUDAQN2C6YIgPN5iM/IbXG9m44BLz1FknfiLc7t+xMa5I6ZWdixZFo1F\nVREWFhYsWbKETp06MXnyZLZu3cqcOXPo378/ZmZmnDp1iszMTDw9PZk0adI9uqpbU1JWCRLtZS69\nxpL89zoux4Qhz76AhX1LJBIDSrIvYGBoSAsvP776bjWq4kvY2dnVO2CjV69eLF68mNTUVFQqFR9/\n/DGnTp3CysqKV155RWvbOXPmsGDBAtatW0dMTAweHh7k5+cTERGBgYEBc+fO1XxmuB03C/wBGBqb\nYGHfitLci2REbCM7zp425ecZO3wwfn5+HD9+nIiICGbPnq0JTB0/fpycnBwGDhzI4MGDNceSyWTM\nnDmTtWvX8vrrrzNw4EAsLS2JiYlBoVDg6uqqE+wZP348MTExLFiwgAEDBmBpaUlqaiqJiYn079+f\no0eP1tv2MWPG8Pfff1NQUECvXr2QSsUMWEF41Fy6dIn169eTmJhIVVUV7u7uTJs2DV9fX63tqqqq\n2LFjB4cOHSI7OxtDQ0Pc3NwYN24cAwYM0Nq27ozep59+mt9++434+HhKSkpYvHgx3t7eLFy4kISE\nBP7880+2bt3Kvn37yMvLw9bWFj8/P5577jm99dIEQRAEQRDuNDETSBDukZuNoq3rxlG0dalTpqhT\nw9R1/vx5vYWvm0KdCuVOHU+4/3zdpMwd441E0vB2Egm8ObaLTvomQRD0y83NZdy4caxYseKunWPF\nihWMGzdOa2bEvTjvja5XKG+6jdSjO+2H/wvrVu0pzckg92wk5+JjsbGxYcyYMZrtAgMDmT9/Pi1b\ntuTAgQOEhoZSU1PD9OnT+fTTTx/YjrHqGt2XualVczqMegFp+x6UlxSQe/YYKpUKO1cvbJw7cr3w\nMucT4+jXrx+ffPJJvalb+/Xrx/vvv09lZSWpqamcO3eOfv36sWzZMpydnbW2bdGiBV9//TWjRo3i\n8uXLbN++nejoaLp168Z//vMfevfufUeu92aBPzXX/k9i3dKDkuwLXI0/zIZff+XChQsALFiwgFde\neQVra2t2797N7t27adasGS+//DLz58/XOdbEiROZP38+jo6O7N+/n7CwMNq0acOyZcu0ZjWrde/e\nnY8++ojWrVsTH
h5OWFgYRkZGLFmy5Kb1fdzd3TXp9UaOHNmoaxWEW1X3eX3p0iU+++wzpk2bxlNP\nPcWCBQs4ffr0/W7iIysnJ4e3336b0tJSRo4cyYABA7hw4QKLFi0iPDxcs51SqeSjjz5iw4YNVFdX\nM2bMGIYMGcLly5f54osv2Lhxo97jZ2dn89Zbb5Gbm8vgwYMZMWKEVp0zgOXLl7Nz5046d+7M6NGj\nMTExYevWrXz33Xd39doFQRAEQRDUHsxv14LwiGnMKNobxWUWYo5uzYChQ4cSFhbG//3f/9G7d29N\nvQSlUlnvl5OmUI9SzsvL01u0W3g4jfRtjaOtBcHhKcRl6v5MdmljR8BADxEAEu4olUpFaGgoe/bs\n4erVq1hZWdG3b1+mT5/O66+/DsDatWu19jly5Ah79uwhLS2NyspKHB0dGTx4MJMmTcLY2Fhr23Hj\nxuHl5cXChQvZuHEjJ06cQC6X4+TkxKRJkxg2bJjedsXExBASEkJycjJlZWVIpVL69u3LM888o1OL\nJigoCIBvv/2W4OBgjh07RkFBAVOmTGHYsGFUVVVpZiJkZ2dTWlqKtbU1Xl5eTJ06FRcXlzt1O29q\nxYoV7N+/n7Vr12rSh90pFqaN++ho6eCCu8M/1/zKCE8m9nLT2W7QoEEMGjSoUce88WekrqVLl9a7\nbu7cuTp1l7y9vQkNDdW7vUwmq3cdwEdfr2X1Xt2BGCYW1rTuNUbPHrXq3oPdu3fXu13Pnj01wZOb\nsbe3Z/bs2Y3a1t/fH39//3rX13fNjQn8AZha2dF2yDTN32cObo///2aUSiQSRo8ezejRoxt1LKj/\nZ6O+f+uePXvqDfh4eXk1eN1lZWVcuXIFBwcHevTo0ej2CUJTqAMSrq6ujBw5kqKiIsLDw1m0aBHz\n589n4MCB97uJD7Sm1NM7dOgQZWVlvPDCC5pnwZgxYxg2bBgzZ84kISEBCwsLtm/fTkJCAt27d+fD\nDz/UBOsDAgKYN28ev/32G7/88gsTJkzQOndSUhJPP/00M2bMqLcN2dnZrFq1SvP9Sv3558CBA8yc\nOZPmzZs39ZYIgiAIgiA0iggCCcI90NhRtDe6XKjQWebl5cXIkSPZs2cPc+bMoV+/fhgZGXHixAks\nLCyws7PTFGi+HT4+Pmzbto3vvvuOfv36YW5ujqWlJWPHjr3tYwv3l6+bFF83KRm5cmIz8rleocTC\n1IiurlJRA0i4K3744Qd27dqFnZ0dI0eOxMjIiKioKJKTk1EqlTozPlauXMm+ffuQSqX069cPS0tL\nzp8/z2+//caZM2f49NNPdWZSKBQKFixYgJGREf3796eqqoqIiAhWrlyJRCLR6QTevHkzwcHBWFlZ\n0bNnT2xsbMjIyNDMpli+fLnOSF6lUsn777+PXC7H19cXCwsLHB0dASgpKSE7O5u2bdtqnplXrlwh\nMjKSEydO8J///Ac3N90gSGPNmDGDp556Cju7W6vtdad1dW1agLip+z2IHrd70NjA353a717btWsX\n5eXlPPPMM3fk85MgNCQhIYEnn3yS559/XrNszJgxzJ8/n1WrVtG9e3edd49we8zMzHSCLB4eHjg5\nOXHlyhWOHTuGv78/YWFhSCQSZs2apfUZw8bGhqlTp7Js2TLy8vJ0jm9ra8u0adN0ltcVGBiolQbU\nzMwMPz8/fv/9d1JTU286Y1EQBEEQBOF2PRzfzgThIdfYUbQ3qlTW6F0+e/ZsnJ2dNSlVrK2t6dOn\nDzNmzCAwMPCOzNzp1q0bQUFB7N27lx07dqBUKpHJZCII9AhxlVmJoI9w1yUmJrJr1y5atWrFl19+\nqZlhM2PGDD744AMKCwu1Zqvs37+fffv20bdvX95++21MTEw064KDg9m8eTN//fUX48eP1zpPeno6\nTzzxBK+++qomneWECRN49dVX2bp1q1YQKC4ujuDgYDp27MjHH3+sNetn//79rFi
xguDgYGbNmqV1\njsLCQlxcXFi6dClmZmaa5bm5uVhbW/Pkk0/qpLZKT09nwYIFbNiwgY8//riJdxHs7OzuewAIap8b\n3q3tbml2a5c2do/Us+ZxuwePYtBLoVCwe/duCgoK2Lt3L3Z2dlqpCu8V9fNm7ty5Dc5WEh4dlpaW\nOgEDDw8PBg8ezP79+zUBCeHW3Ti4ydmyNnVnjx49eOONN3QCQc2bN+fKlSukpaXRr18/srOzsbe3\n10m9CdClSxeg9tlxIzc3N50Zyjfy8NCts+ng4ABAaWlp4y5QEIS76lYyFygUCvbu3cupU6e4fPky\n165dw8LCgo4dO/L000/TsWNHneOrMxe88847bNiwgZMnT1JeXo6bmxuBgYF07tyZ8vJygoODiYiI\noKioCCcnJwICAnRqkqndSuYEQRAEEQQShHugMaNhTZvZ0u25RVrLJk+fpTd9jkQiYcKECUyYMEFr\n+ZUrVygvL9dJO9TUFDATJ05k4sSJN227IAhCXXU7Yg78+X9cr1AyZcoUrWCLkZERM2fOZMGCBVr7\nhoSEYGhoyBtvvKEVAAKYOnUqO3fu5NChQzpBIFNTU2bNmqUJAAG4uLjg6elJQkIC5eXlmsCN+pn3\n2muv6aR98/f3JyQkhEOHDukEgaA2LVzdAJCasbExxsbGeotPOzg4EBcXp5n1FBwczPr166moqNA5\nTn2pbhqb4m3cuHFabVWTyWQNplO7Fc8O8mDhpqhG1bmTSCBgoG7n18PucboHj2LQS6FQsGHDBoyN\njWnXrh0vvfQS5ubm97tZwiOkvoBE27Zt9f6seXt7s3//ftLS0kQQ6BadTs9n05EUnWdURWkxWVlF\ntPWy1BvYUX/GUCgUmuBOfYMt1AEkfbVSG5PK7cbPGoBmtlFNjf5Bf4Ig3Fu3krng0qVL/Prrr3Tu\n3JmePXvSrFkzcnNzOXHiBKdOneLDDz+ke/fuOudQZy4wNzfHz88PuVxOeHg4H330EcuXL2fVqlXI\n5XJ69uxJdXU1hw8f5j//+Q8ODg506NBB61hNyZwgCMLjTQSBBOEeuNOjaIuKirC1tdVKW1JRUcFP\nP/0EQN++fZt0PkEQhNuhryPmXGQs1wsL2BJ/neZu+Vr1pjp06KD15aSiooL09HSsra3ZsWOH3nMY\nGxuTlZWls7xly5Z6U+hIpbXnKy0t1QRvzp07h5GREREREXrPUVVVxbVr15DL5VrpW0xMTHB1da33\n+mNjY1m7di0qlQoTExMqKio4evQoKpUKd3d3SkpK7vpsnmnTpnH8+HHS09MZP368puNJXwdUU/m6\nSZk7xpsVf8U3GASRSODNsV0eyRpjd/oe3Gywxv32qAW9blb3SZ+7MWunT58+rF69WtQDeYTcPCCh\nf2S2ra0toH+myeMkOTmZ7du3k5SURElJCVZWVrRp04YRI0bojITPzc1l4eIV7DoYSY2yEjNbGU5d\n/LBp1V6zTUlZJf/31yGORw5j8UcLtX53Kysrgdr3o/odWVRURFlZGZs2bSIiIoKSkhJkMhl9+vRB\npVLp7VA9cuQIBw4c4KeffuLkyZP8/fffXLlyhfbt22ttd2MdwoqKCi5fvkxZWZnOMdWDOFatWkVw\ncDDh4eEUFxfj4ODA8OHDmTx58n1PXxkaGsru3bvJycmhsrKSWbNm6QxQvNfUMy3q1o5TzyJfsmQJ\n3t7e97F1jdeU2lfC7bnVzAXOzs5s2LABa2trrePk5+fz1ltv8fPPP+sNAqWnpzNy5Ehmz56t+R32\n9fXlq6++4r333qNTp04sWbJEE6QeMmQI7777Ln/88Qfvv/++5jhNzZwgCMLjTQSBBOEeuNOjaENC\nQjh8+DDe3t7Y2dlRVFTEmTNnyM/Pp3v37vTv3/9ONV0QBKFR9py+qLdDvLqqdrZLSkEVCzdF8ebY\nLozoWjtb0cDAQCvIUlpaikql4tq1a2zevPm
Wzl9fkEPfSFu5XE51dfVNz1FWVqbVPhsbm3o7XXJy\ncjhx4gSurq5MmjQJBwcHTE1NycnJ4bvvviMzM/OeBIECAgLIzc0lPT2dCRMmNDhr6HaM9G2No60F\nweEpxGXqvtu6tLEjYKDHIxkAUnuc7oEI/N0ddTufhYdffe9BtZKySkIjzzIqNkvzHlQrLi4G7mzA\n/mGzd+9evv/+ewwMDOjduzctW7akuLiY1NRU/vrrL60gUG5uLoEvzuFsfjV27l2oriijKDORtEO/\n085/OlYt/smkUC4v4EK5KSnZ16gbvi0qKgLA3d0dc3NzTY2gN954g+zsbNzc3Bg8eDAKhYL169dz\n9erVBv991qxZQ1JSEj169KBHjx4YGBiQlJQE6K9DePDgQU6dOsWaNWsYOnSo3jqEH330EYWFhZrj\nHT9+nA0bNlBVVXXTOkR305EjR1izZg3u7u6MHz8eY2NjvemvBOFhsX//foBGZy6o71kglUrp378/\noaGh5OXladI+qpmamvL8889rfZ/w8/Nj5cqVlJaW8uKLL2oFdDp37oxMJiMtLU3rOE3NnCAIwuNN\nBIEE4R65k6Nou3btSnp6OqdPn0Yul2NoaEirVq0YN24c48ePv+8jwwRBeLycTs+vt+PL0Lj2i4my\nXIGhsQlf74xDZmOOr5uUmpoa5HI59vb2wD9fqNzd3Vm5cuVda6+FhQUqleqWA031PVurq6u5fPky\n5ubmbNu2jVatWmmt37t3L7GxsURHRzc4k+hh4+smxddNqpP2qKur9IFOBXYnPU734GEKetUdRf3M\nM8+wfv164uPjqaqqomPHjsyaNYs2bdpw7do1fv31V06cOEFpaSmurq4EBgZq6n/AP6kYZ86cqXOe\n+Ph43nvvPaZNm0ZAQIBm+dWrV/njjz+Ii4ujoKAAExMT7O3t6dSpEzNmzNAElxuaXZSfn8+2bduI\njo7WHMPJyYlevXoxderUu3TnhKZq6D1Y1/XCbJZvP6l5D6rFx8cDte+/x1FWVharV6/GwsKCL774\ngtatW2utz8/P1/p7fHw8EucetO/eQ7OsuasXqQc2kZMUqRUEqlZWUVVWyoH4y7z8v2UpKSlkZ2dj\nZGSkyaAwbNgwvvjiC1JTU5kxYwYLFy5EIpFQUlLC0aNHSUlJ0VvbR+3ChQusXLkSR0dHzbKFCxdS\nUlKitw5hmzZtSE1NJS8vr946hG5ubnz22WeaTt6AgABeeuklduzYwdNPP62VnupeOnnyJACLFi16\nIOoVqq1evRpTU9P73QzhIVH3s1tY5GmuVyjx9PTU2e7GzAVqZ8+eJSQkhHPnzlFcXIxSqV0HuqCg\nQCcI1KpVK52UoAYGBtja2lJeXk6LFi10zmNvb09ycrLm77eTOUEQhMebCAIJwj1yJ0fR+vj44OPj\ncxdaKQiCcOs2HUmp97lmbufE9cKrlOZdxNSqOSoVBIen4Osm5fz581r59c3MzGjdujUXL17UScV2\nJ3Xs2JGTJ09y8eJFnY6mxrixw9+mugilUkn79u11AkDl5eWaL4WZmZl3pP0PGleZ1SMX8LhVj8s9\nuBNBr8ame4qIiGDnzp2kp6ejVCpxcnLCz8+PiRMn6hQ71pcCCGpn6I0ZM4aSkhLmzJlDeXk5x44d\n480330Qul1NaWoqXlxcKhYKMjAyOHz9OSEgI33//PcOHD9cc5+zZs6xZswZTU1NWrFjBihUrACgp\nKdF0OKrTr7zzzjt8/PHHZGRkYGZmhp2dHWPGjGHdunXs27ePsWPH6n22vfrqq1y6dIl169ZRUFDA\nokWLkMvleHl50a9fPyoqKrh48SLBwcEiCPQAaug9WJeyspzsuMMEhztpPuunpKRw6NAhLC0tH6uU\nznWfI+F//R/y6xX861//0vteVqd2VbOwak5Ri27UHZph3bIdJpY2XC+4or2trYyizESO7NrCl87G\nGFaXEx4
ejkqlolOnTpoZOJMmTWLx4sUUFxeTnJysqd0XERGBXC5n7NixXLmifey6Jk+erBUAUsvJ\nycHa2lpvHUKpVIqxsXG9dQhfeuklrVH+NjY29O7dmwMHDnD58mXatGlTb3vupsLC2oEAD1IACNBb\n+0kQbqQvbWdiajYV8kI+Dz3HzGFGWn0xN2YuADh27BhLly7FxMSErl274uTkhJmZGRKJhPj4eBIS\nEqiqqtI5t77U1VCbuaChrAZ1vy/dTuYEQRAebyIIJAj30MM0ilYQBKExMnLlDaa6tHPrQkHqaXIS\nwrFx7oCRiRlxmYWkXili48aNOttPnDiRb775hpUrV/Lmm2/qfCEqLS0lJyeHtm3bNrnNEyZM4OTJ\nk3z77bcsXLhQpxOjvLyczMxMnQKsBfJy3t5wTLfWg7wIRUU1uYXFlJeXa2oPKZVK1qxZo6k7cP36\n9Sa3WRAeJE0NejU23dPGjRvZsmUL1tbW+Pn5YWZmxqlTp9i4cSMxMTF8+umnjRoBn5CQgLe3N4WF\nhTz77LPIZDJ+//131q1bR1JSEl5eXpSVleHm5saoUaOIiYlh+/btLFiwgBYtWmhmBDk4ONCtWzcS\nExPp3bu3ZrbGxYsXOXz4sNY5V69eTWJiIk888QT+/v4oFApmz55NSUkJsbGxeovAZ2ZmkpmZSb9+\n/bC2tmb+/PnI5XLefvtt/Pz8tLa9cUaEcP/d7D1Yl5VjGwpST/PHT1docc1fE5Coqalhzpw59XYQ\nPkr0dcCeP3ISRUEBu9KgdXr+Tb8LGVk7IDEw0FluYmGNIv+S9rbmVlg6uGBoZMKfIX8hszahbdu2\nVFdXa2YiQ209QGdnZ5o3b46lpSU7d+7EwMAANzc3XnzxRZo3b857771Xb5turAGkVlpaqrcOYUJC\nApcvX8bZ2VlvHUJLS0ucnJx0jle31uG9pg52q40bN07z59DQUI4fP87Ro0dJTk6moKAAqA3M+Pv7\nM3bsWJ0Z1eqZlj///DMnT55k165dXL16lebNmzNixAiefvppJBIJERERbNu2jYsXL2JmZsaAAQN4\n/vnnddJg1TcgoK7S0lJmzpyJnZ0da9as0TvL+5NPPuHkyZN89dVXDc7+uhdyc3NZv349sbGxlJeX\n06ZNGwICAujZs6fWdlVVVezYsYNDhw6RnZ2NoaEhbm5ujBs3TqeeVn2zWNXUNanWrl2rWaZUKtm9\nezf79u0jJyeHqqoqbG1tcXNzY+zYsXTt2lXrGJcuXeKPP/7gzJkzFBcXY2lpiY+PDwEBAToDpu6l\n+tJ2qjMXxKZc4lyOQit99Y2ZCwB+++03jI2N+frrr3Fx0U7vuWrVKhISEu7aNdyrzAmCIDx6RBBI\nEO6xxyl1jCAIj77YjIY7JK0cXZF6dCc/5RTndq7GtnUnJAYGzDm9ic6uLbCzs9P6Av7EE0+QmprK\nrl27eOGFF/D19UUmkyGXy8nJySEhIYFhw4YxZ86cJrfZx8eHmTNnsnHjRl588UV69OiBo6Mj5eXl\n5ObmkpCQgKenJ//+9781+1wqKCUxq4gyfR19EgkSc2syLuUwYWogT43xR6lUEhcXh1wux8XFheTk\nZE3nnsH/Oq5UeoaN349OHUG4Fxqb7uncuXNs2bIFqVTKV199RfPmzQGYOXMmixcv5uTJk2zbto0p\nU6Zo9r1eoST5SjHB4SlYmBrhbFn7uyWTyfDy8uLAgQOabf39/Vm3bh01NTUYGhoyZswYTW2Nmpoa\nkpKSuHz5Mtu2bdMEgaRSKT169CAxMZG+fftqUrfFx8dz9OhRrevIyMigU6dOBAQEMHLkSM3y0aNH\nExcXx8GDB3Fzc9PaJzo6GoBRo0Zx4sQJcnNz6d27t04ASN0W4cFys/dgXSaWzXHpNYYrp/drBSSm\nTp1Kt27d7mIrHwz1dcAqK8sBuFBUrVM/UB8DYzO9yyUGBpp3q2kzW7o9t
4iCC7FkHtuBk88QXgma\nokm5re7oVlMoFBgYGNCtWze++uornWNfuqQdXJLJZISGhmoCGepnVV1Lly7l7NmzKJVKvSP2W7Vq\npWnvjXUIb6XW4b3i7e0N1KayzM3N1alLtH79egwMDOjQoQP29vYoFAri4uJYs2YNKSkpzJs3T+9x\n161bR3x8PL169cLX15eoqCh+/fVXlEolVlZWrF+/nj59+tC5c2diY2P566+/qKmpYfbs2bd8Dc2a\nNWPQoEHs27ePM2fO6AQv8vPzOXXqFO3atXsgAkDz5s2jRYsWDB06FLlcTnh4OJ9++imfffaZ5h2l\nrh+VkJCAs7MzY8aMoaKigqNHj/LFF1+QlpbGjBkzbqstX3/9NUeOHKFNmzYMHToUU1NTCgoKSEpK\nIiYmRus+njp1iiVLllBdXU2vXr1wcnIiPz+fY8eOER0dzZIlS25rMFdTNZS288bMBXXTV9+YuQAg\nOzub1q1b6wSAVCoViYmJd/My7lnmBEEQHj0iCCQI98njkjpGEIRH2/UK5U23cek1BjNrKfkp0eSn\nRGNoakGvoYP49JO3CQwM1Bnp+sorr9CjRw92797NmTNnUCgUNGvWDAcHByZNmsSQIUNuu91PPfUU\nnp6ehIaGkpSURFRUFBYWFtjb2zNixAitztfT6fkkZhU1eDwzaynKcgWJl4tRbt1BS4fm+Pr68txz\nz/HSSy8BaNK2qDt21DOE6kpNTb3ta1MHmW78wioI91pT0j2FhYUB8Mwzz2h1qhoaGhIUFER0dDR/\n//03U6ZM0cwoiMsswKq8GaWHanPmV5QWk5VVROsO3jqjvNUz/8zMzHBycuKZZ57RrDMwMMDV1ZXs\n7Gyt/Pu3YvLkyRw7dowffviB06dP4+vri6enJ71798bOzo59+/Yxffp0TUo7pVJJfHw8nTp1wsfH\nh19++QWA7t27N+n8wr3XmPdgXWY2DrgPnsrMwe0brAH6qGmoA9bIxIwKoOq6HENjU60OWH1MjHRn\nATWGhWn93R/qd3NRkf73fX3L1eqrG9jUOoQPIm9vb7y9vYmPjyc3N1dnFsmiRYt0PtOpVCpWrFjB\ngQMHGDNmjM4sa6j97PPtt99qZloEBATwwgsvsG3bNk0aTnVne1VVFW+88QZhYWE8++yz2NjY3PJ1\njB49mn379rF7926dINDff/9NTU2NVhD/fomPjycgIEAr2Obn58eiRYu0Bips376dhIQEunfvzocf\nfvTAjSgAACAASURBVKgJFAYEBDBv3jy2bNlCz5496dSpU5PaoVAoCA8Pp127dnz55Zeaz5lqcrlc\n8+fS0lKWLVuGqakpX3zxhVaQJDMzk7ffflsz4/9eayhtp77MBcHhKXi72OrNXCCTybhy5QqFhYWa\nzxUqlYrg4OB7UovnXmROEATh0SOCQIIgCIIgNFlDHSpqEokEWac+yDr10SwbP8KTa9euUV5erjOK\nDqBnz546qS7qExoaWu+6uXPnMnfuXL3rPD099RaAvdGmIyl0nvjGTbczMrPE1rkD3Z6cyrIZtXUd\nUlJSUKlUDBkyhNGjRwO1KWNMTU0ZPHgwr732mmb//Pz8O9JJpB4RmJeXpzeVjCDcbbeT7unChQsA\nemsftmrVCqlUSk5ODn9GnuOHAxfq7dApKatkf1I+1RLt+kHqzjF1qpwbO7MMDQ0xNTVt8qy8Hj16\n8PTTTxMcHExMTAyRkZFAbZBLKpWSnJxMZGSkJtBcUFCAoaEhI0aMQCKRoFAoALTSzggPtsa8B+/k\nfg+rhjpgLaTOKAquUHIlFTMbqVb9QH1a2VlySe+ahnV1rX8mnbm5OU5OTly9epXs7Gyd92d8fHwT\nznj7dQgfJvo+c0gkEsaPH8+BAwc4ffq03iDQ1KlTtZ55lpaW9O7dm3379vHkk09qfU40NjZm4MCB\nms72pgSBPDw88PDwICoqiqKiIs2Ag
5qaGsLCwjA3N9c7E/Nek8lkWgMVALp164aDg4PWQIWwsDAk\nEgmzZs3SvOOgtobU1KlT+eabb/j777+bHASSSCSoVCqMjY31BjvrzkQ5cOAACoWCl19+Wefzvbr+\n344dO8jKytL7+f9uuVnaTn2ZCy7HGHBl/8842tvoZC6YOHEiq1at4vXXX6d///4YGhpy9uxZLl68\nSK9evThx4sRdvZ57kTlBEIRHz+P1yVMQBEEQhDuqoQ4VtaqyUozMLLW+PHVysuKnn1YBPNCFsO9G\nrYcOHTrg5eVFQkIC8+bNw8fHh+LiYk6cOIGvr69O3YBb5ePjw7Zt2/juu+/o168f5ubmWFpaMnbs\n2Ns6riA0xu2me1LXztKXWglqZ/KkZl5mZcgpTCxtG26MCg4lZiOr0J11B7VpgfRRd3ip/wz6Uy+p\nAzZ12dra4uLiwjvvvEN1dTXp6enExsayc+dOEhISkMvl7NmzR9PBmJubi7OzM8OGDQP+mY2grqch\nPPga8x68k/s9jG72LnVo34P8lFNcTTiCdcu2mNk4EJdZSEauHFeZFfn5+VqpEO2szPBubtfo9zOA\nu6P1TbMwDBs2jF9//ZX169fz7rvvan7/c3JyGhxw0pCm1iF8UOhLYV4fuVzOtm3biI6O5urVq5SX\nl2utr++51q5dO51l6vukb506YHQ7NdJGjx7NypUrCQsL06QXjY6OJj8/n9GjR2vqO94LN95jdUpT\nfQMVoHZQwblz54DaNILZ2dnY29vj7Oyss616tlBaWlqT22dhYaEJbKiDHp6ennTo0AFTU1OtbdXt\nSk9PJzg4WOdYly9fBrjnQaDGpO3Ul7nAavhgPv1onk7mgpEjR2JsbMyOHTvYv38/JiYmdO7cmTfe\neIPIyMi7HgSCe5M5QRCER4sIAgmCIAiC0GSuMiu8WzfcEZN7LoqijHisHF0xMrdCZl7D5x/9QX5+\nPt27d6d///73sMW35m7Vevjggw9Yt24dUVFRhIaG0rJlSwIDA+nWrdttB4G6detGUFAQe/fuZceO\nHSiVSmQymQgCCXfdnUj3pA6WFhUV6R1VXlhYyOUCBa51aoJIJBJU9dTHqK4o43KhbrCmsdSBomvX\nrumsS0lJ0VlWN9htaGhIu3btaNeuHZ06deLdd9/FwsKChIQELl26RGZmJmVlZXh6empGs3fs2BGo\nrakwatSoJrdbuHca8x68UZc2do9VWuibvUvNbBxw6TmKrBN/cW7Xj9g4d8TUyo4ly6KxqCrCwsKC\nJUuWaO3z7CAPFm6Kqnd2UV0SCQz1vnkx+ieffJLjx48TGRnJG2+8Qbdu3TSpsLy8vIiKirr5yW7Q\nlDqEDwJ9MzrViuMvY1qmHVxXKBS8+eab5OTk0L59e4YOHUqzZs0wNDREoVAQEhJCVVWV3nPpq3+k\nntGififoW3c7aW8HDRrE2rVr2bt3L08//TQSiYQ9e/YA3LNUcPXdY3VK0w5d9e9naGioGaigHoxw\nY3BRTT2g4nZrTr7zzjv88ccfHD58mE2bNgFgYmJC//79ef7557G1rR2UoU4Nt3fv3gaPV1ZWdlvt\nuVWNSdupL3PBoMHt681c4O/vr6kRWJerq6tOqkRoOHPB2rVr6123dOnSetfdSuYEQRAEEQQSBEEQ\nBOG23KwjxtrJjbKiq5RkX6C6sowWbaRYt3dn3LhxjB8/vt48+g+CxnxpVBefVmtMrQdLS0tee+01\nrXRwavq+JOpLa6cuSq3PxIkTmThx4k3bLgh30p1I9+Tu7s6FCxdISEjQCQJlZ2dz8fJVKgwtMTL5\nJwhkaGJO5fUSnXOqVCrKinOouF7JxbxSZDLZLV9T+/btgdoR4vDPjKCMjAxCQkJ0tr948SLu7u46\nnZrFxcUAdO7cmbS0NPbs2UNMTAyAVgdOr169kMlkREVFceTIEQYNGqR1nBtnRAgPhpu9B+u+JyQS\nH
qtaQNC4d6nUozvmtv/P3p3HRV2uj/9/sSPIKruyKimGIm6IomhoWoZrx4RMTT1Z+ak8HvWbS1mZ\n9uuc6phlm1EuBXo0NTUXFDOxEpB9EWVTUZBFtmFnYH5/cGZinAEGRUW9n4/HeTzyvd7vOcDM3Nd9\nXZcNBRf+pLLgMuXX0kmvsmOcz0CefPJJleO9Xa1YOnlAq4FnOS2t5iwgd/v2y4bp6enx/vvvExoa\nSmRkJAcPHlSU5PL19b2tIBB0rA9hV9BaRqdcUUUNlYWlHE/IVWR0hoeHU1BQQFBQkMoEeHp6utq/\nl/eTvr4+AQEB/Pzzz8TFxeHs7ExsbCx9+/bF1dX1rt+/vde4oqaew7FXmdDiNVZH015WLd+T5J+7\nWwuiVVVVqbyH6evrExwcTHBwMMXFxaSkpBAREcGvv/5KQUEBH374IfBX0O6zzz7DxcWl1XHfa5qU\n31RXuUBPq5GtW7cCXbtygSAIgiZEEEgQBEEQhDvS3kSMiZ0bJnZuaGnRagmorkr0ehAEzXRWuacJ\nEyZw4sQJdu3axfDhwxUZMk1NTYSEhFBeXUePPsqrXo16OFCRl0lFfham9n81QZbkZ9FQ27z6OSX3\nJkMfd+vwc/n4+ODg4EBaWhr5+fns2bOHuLg4oqKi8PHxUcnci46O5rvvvqN///7Y2dnRvXt3bty4\nQXR0NHp6erz88st8+umnREREcO3aNQwNDXFz+2tcurq6vPnmm7z99tv8+9//5ujRo/Tr14/6+npy\nc3NJTEzk559/7vBzCHdXRwIS/3hmYKu9bh5Wmr4nGls74mb912eEVyb2Z9rwvybkb138MMnbCVtz\nI0IjM0i6UoL7hPlK1xvobEnwnMV4u65RuVdrK++NjIxYtGgRixYtUtmn6SINdTTtQ9jW2ADFRPzd\n0lZGZ0syGUoZnXl5eQCMHDlS5diUlJS7MdQ79vTTT3Pw4EGOHTuGq6srTU1N9yQLSNPXmFteY3Va\n9rLKy8vDwcFBaX9SUhIAvXv/9d4oz3BVV04vPz9fbRCoJSsrK8aOHYu/vz+LFy8mLS0NiUSCiYkJ\n/fr1448//iA1NbVLBYE0Kb95a+UCaU0l/02rprayvMtXLhAEQdCEmKEQBEEQBOGO3ToRc6uBzpYE\nj3Z/4Ca+RK8HQdBMZ5V78vDwYObMmfz0008sWbKEUaNGYWhoSGxsLFeuXMHOsTe1jylPMtr290WS\nn0X2b7uxcHocmayJqqKr6BqaYNXHG0nBZWrqb69skL6+Phs2bOCLL75g69atHDlyhL59++Lv74+2\ntjZSqXKGw5AhQ7C1teXChQtkZmZSX19Pjx49GD16NNOnT8fZ2ZmnnnqKb7/9VlGq8Vbu7u5s3ryZ\nvXv3cv78edLT0xUTfc8///xtPYdw9z2s74Od4W6+l3q7WuHtaqW2d82jVHKvs7SV0Xmrlhmdtra2\nACQnJytN/mdnZ7Nnz567MNI75+DggJeXFzExMaSnp2NsbKySfXk33O5r3Bp5L6vvvvuO1atXK/oI\nVVRUsGvXLgAmTJigOL5Xr14YGRkRFRVFeXm5YrFFfX09X3/9tcr1y8vLKS0tVQnq1NbWUltbi46O\nDrq6uoqx7N69m7CwMNzd3RXZtH89j4yUlBQGDBig2QvQSTQp23lr5QIzY0McHhuI/99mdPnKBYIg\nCJoQQSBBEARBEDrFwzgRI3o9CIJmOrPc0/z583Fzc+Pw4cOcOnWKxsZG7OzseOGFF9Dq6cW3pzKV\nrmti54brmOe4kXKG0ispaOvq02voUzh4j+dG0m8AdNPXURz/ww8/sHDhQrVjDAkJYdWqVUor162s\nrHj77beZPHkyYWFhXLlyhZiYGAC2b9+OjY2NogG2q6srU6ZMafN1CAgIICQkBAcHB7Zt24aJierf\nC2tra1555ZU2ryN0PQ/j+2BnuBfvpS42Jo/0a9wZ2svoVEee0fn
EE0+wb98+tm7dSnJyMg4ODuTl\n5RETE4Ovry+RkZF3adR35umnnyYhIYGysjICAwPR19e/q/e7k9e4tZ/vGTNmEBsbS1RUFK+99hpD\nhw6lrq6Os2fPUl5ezsyZM5Wy0HR1dZkyZQq7du3i9ddfx9fXl8bGRhISErC0tFTpL3Tz5k3eeOMN\nXFxccHFxwcrKiurqamJiYigtLSUwMJBu3boBYGJiwqpVq9iwYQPLly/Hy8sLJycntLS0KCoqIj09\nHYlEwr59+zr4yt259sp2yisXQHPW5gfP+zySQXtBEB5eIggkCIIgCEKnetgmYjrafPp+93oIDQ0l\nLCyMjRs3Kq20DAwMxNPTs80Gs+0pLCxk4cKFBAQEaFT+BiAiIoJNmzaxdOlStQ10hYdDZ5V7khsz\nZozaFdmXCyUqQSAAc8e+mDv2VdnuPHIqziOnMnZwP8W2tvppQetNmIcMGcKQIUPU7utIiaacnBxk\nMhmjRo1SGwASHnwP2/tgZ3jQ3ksfRe1ldLZ13rThrnz44Yds27aNtLQ04uLi6NWrF6+88gqDBg3q\nskEgHx8fTE1NqaiouCel4O7kNW7tb4quri7r16/nwIED/Pbbbxw+fBhtbW1cXV156aWX1L6XBgcH\nY2BgwPHjxzl+/Djm5uaMGTOG4OBgXn31VaVjbW1tef7550lOTiYpKYmKigpMTEzo2bMn8+fPZ/To\n0UrHe3l58fnnn7Nv3z7i4uJITU1FV1cXS0tLvLy81JYMvBdE2U5BEB51IggkCIIgCILQBvGlURDa\nd69KJz4M2Xk//fQTAJMnT77PIxGEe0e8l3Z9mmR0Aiq9l+TnOTo68tZbb6k9p6P9lNoKrAcEBKhd\nVKLuHu0F6AsLC5FIJPTv3x8nJ6dWj+ssmrzGBt3NGTxnXavnqVuooK+vz6xZs5g1a5ZG49DS0uLZ\nZ5/l2WefVdl3a08qY2NjZs+ezezZszW6NjQvtnj55Zc1Pv5eEWU7BUF4lIkgkCAIgiAIQjsepC+N\nzzzzDGPGjMHa2vp+D0V4hNzL4MyDmFFw+fJlYmJiyMzMJDY2lmHDhtG3r2rmkvDwuXjxIsuXL2fE\niBGsWbNG7TGvvPIKN27cYMeOHQ91dtiD9F76KNI0o7OzzusK9u/fj0wm45lnnrkn93sUX+OuRpTt\nFAThUSXeSQRBEARBEDTwoHxpNDU1xdTU9H4PQ3gE3avgzIOYUZCVlcWOHTswMjLCz89P9Pt5hPTt\n25eePXty/vx5JBKJSpDn0qVLXLt2jZEjRz7UASC5B+W99FF0rzI677eioiJ+++038vLyOHnyJK6u\nrvj5+d2Tez8qr/GDQJTtFAThUSOCQIIgCIIgCB2g7ktjREQE0dHRZGVlUVpaio6ODi4uLjz11FOM\nGzdOcdzLL79MQUEB27dvVxuo2bt3L9u3b2fx4sWKValJSUmcOXOGtLQ0iouLaWxsxM7ODj8/P2bO\nnKnSxLi1nkDqlJSUEB4eTlxcHPn5+VRWVmJqaoqnpyezZ8/G0dGx1XOvXbvGtm3bSE1NpaGhATc3\nN4KCgvD29m73NZQrLi5m7969nD9/nps3b9KtWzc8PDyYPXs27u73P3tD6Jh7GZx50DIKWitfJDwa\nAgIC2LFjB7/99ptKxkFERITimEeJmIDteh6GcpuauHHjBtu3b8fAwIBBgwbx6quvoqWldU/u/ai8\nxoIgCELXI4JAgiAIgiAId+iLL77AyckJT09PLCwskEgknD9/nk8++YTr168zZ84cQHkiMDAwUOU6\np06dQldXF39/f8W2n376iWvXrtGvXz+GDh1KQ0MDaWlphIaGkpyczPvvv4+2tvZtjTslJYU9e/Yw\ncOBARo4cSbdu3cjLy+OPP/4gOjqaf/3rX7i6uqqcV1BQwPLly3FxcWHSpEmUlpYSGRnJunXrWLFi\nhUqTYHWysrJ46623qKysZPD
gwYwcOZKKigrOnTvHypUrWbNmDUOHDr2t5xLun3sZnBEZBUJX1vLn\nUmrmRk19I6dOnVIKAkmlUiIjIzEzM2PIkCH3cbSC0OxBLLfZUQMGDFDbP+heeRReY0EQBKHrEUEg\nQRAEQXiAxMfHExoaSm5uLlVVVfj4+LB27VoAMjIy2LFjB1lZWUgkElxdXdm8efN9HvGj4fPPP8fe\n3l5pm1QqZd26dezdu5ennnqKHj16MG7cOHbu3MmpU6dUgkAZGRnk5uaqlAR65ZVXsLW1VVml+sMP\nP7B7925+//13jYIu6nh5efHDDz/QrVs3pe05OTmsXLmS7du3884776icl5KSwvTp01mwYIFi2+TJ\nk1mxYgVbtmxhyJAhGBkZtXrfxsZGPvzwQ2pra9m4cSOenp6KfSUlJfzjH/9g8+bNhISEoKend1vP\nJtw/9zo4IzIKhK4kPqeYH89kqKz0z6kzJedUNBN+T+SpUV4AREdHI5FImDp1Kjo6OvdjuIKg5EEs\nt/mgEa+xIAiCcD+IIJAgCIIgPCAKCwt5//33MTY2Zvz48RgZGdGrVy8Aqqureffdd2loaGDcuHGY\nmppiYWFxn0f8cFI7sX1LAAhAV1eXyZMnk5SURGJiIk888QRWVlZ4eXmRkJDA1atXcXJyUhwvLwn0\nxBNPKF3Hzs5O7TimTp3K7t27iYuLu+0gkJmZmdrtrq6uDBw4kPj4eKRSKbq6yh8ZjY2NCQoKUtrm\n7u7O2LFjiYiI4M8//2yztNH58+fJz89n+vTpSgEgAEtLS2bOnMnWrVtJTEwU2UAPMBGcER41x+Kv\ntjqxa+k2iMu/Z7Py421oGy9j4iDHR7YUnNC1PWjlNh9E4jUWBEEQ7jURBBIEQRCEB0RCQgL19fW8\n/vrrSuXCoLmxdHl5OS+88AKzZs26TyN8uLW2uhugt7kW5mWplFzPoqioiPr6eqX9N2/eVPz3+PHj\nSUhIICIighdffBFozho6c+YMZmZmKkGP2tpaDh48yLlz57h+/To1NTXIWswwtrz27YiJieHo0aNk\nZmZSUVFBY2Oj0v6KigosLS2Vn7d3b5XsIWgusRIREUF2dnabk5rp6elAc3Pm0NBQlf15eXkA5Obm\niiCQIAgPhPic4jZX9ps79kNH35CbOUl8cjCBbloNxMbG4urqqrbspiDcT6Lc5t0nXmNBEAThXhJB\nIEEQBEF4QJSUNAcfbp2Qb7mvR48e93RMj4q2VnfXSUrZv+dbGutrGDdyKBMnTsTIyAhtbW0KCwuJ\niIigoaFBcbyvry9GRkacPn2aefPmoa2t3WpJIKlUypo1a7h06RLOzs6MHj0aMzMzxTFhYWFK1+6o\ngwcPsnXrVrp3786gQYOwtrbGwMAALS0tzp07R05ODlKpVOU8c3NztdeTb6+qqmrzvhUVFQCcPXu2\nzeNqa2s1eQxBEIT77sczGW2WdtLW1cPCqT/FmXFU5Gfzn+/TaGxsFFlAQpcmMjrvPvEaC4IgCPeC\nCAIJgiAIwn129uxZDh8+rJhwt7e3x9/fn2nTpqGnp0dycjKrV69WHN/yv5cuXcqmTZsU/960aZPi\n30uXLhWTS52gvdXdhel/Iq2rxtl3KuVugxg2wUdRvuPMmTOKcj9y+vr6+Pn5ER4eTnx8PEOGDOHU\nqVOAaim4qKgoLl26REBAAEuXLlXaV1JSQlhY2G0/V2NjI6GhoVhYWLBp0yaV4KI8W0edsrKyNrcb\nGxu3eW/5/rVr1+Lj49ORYQuCIHQ5lwslarNEb2XZexDFmXGUZCeRV1FEX3MZY8eOvfsDFARBEARB\nEB5pIggkCIIgPDACAwPx9PTkgw8+uN9D6TQ7duxgz549mJqa4u/vj6GhIbGxsezYsYO4uDjWr1+P\nra0tQUFBJCcnk5KSQkBAADY2NkBz75agoCCys7OJiorCx8cHNzc3xT7hzrW3urtOUgqAuZMHM
hmE\nRmYogkDJyclqzxk/fjzh4eGcOnWKPn36EBsbi4uLi+L/O7n8/HwARo4cqXKNlJSU23kchYqKCqqq\nqvDy8lIJANXW1pKVldXquVlZWdTU1KiUhJM/763Pcau+ffsCkJqaKoJAgiA88BIuF2t0XHdrRwxM\nLCnLTaOpsRGrgX6t9mYTBEEQBEEQhM4igkCCIAiCcJ+kp6ezZ88erKys+OSTT7CwsABg3rx5bNiw\ngZiYGPbt28esWbMIDg4mNDRUEQQaMGCA4jpubm5EREQQFRWFr6+vyP7pRJqs7tY3bp7Aqyy4jFmv\nviRdKeFyoYSSaxmEh4erPcfDwwMHBwfOnTuHo6MjUqmU8ePHqxwnD/YlJyczfPhwxfYbN26wbdu2\n23yqZubm5hgYGJCZmUltbS2GhoZAcwm6b775RlGyTZ2qqirCwsJYsGCBYltGRganT5/G2NgYX1/f\nNu/t4+ODvb09v/zyCwMHDlTb9yc9PR1XV1cMDAxu8wkFQRDujeo61bKZrenh5kVe4q8APOYlguCC\nIAiCIAjC3SeCQIIgCIJwn5w4cQKA5557ThEAAtDR0WHhwoWcP3+e8PBwZs2adb+G+MjTZHW39WPD\nKMlOICdyL+ZOHuh1M+HNNeFUF+Tg5+dHZGSk2vOeeOIJfvjhB3bv3o2Ojo7akkDDhw/H3t6eAwcO\ncPnyZXr37k1RURHR0dEMGzaMoqKi2342LS0tAgMD2bt3L0uWLGHEiBFIpVKSkpKQSCQMHDiQpKQk\nted6enoSHh7OpUuX8PDwoLS0lMjISJqamliyZAlGRkZt3ltXV5fVq1fz9ttv8+677+Lh4aEI+BQX\nF5ORkcGNGzfYsWPHfQkCPYxZh4Ig3D1GBpp/rbYbMAa7AWMAGDS0/90akiAIgiAIgiAoiCCQIAiC\nINxDlwslJFwuprpOSvjvcVTXSfHy8lI5rmfPnlhZWVFQUEBVVVW7PVaEu0OT1d3dLGzpM34e+Ym/\nUnE9A5msicpuHqxdvRpjY+M2g0A//vgjUqmUYcOGqS0JZGhoyMaNG9m2bRvJycmkpaVha2vL7Nmz\nmTZtWqvX1tScOXMwMzMjPDycY8eOYWRkhLe3N3PmzCE0NLTV82xtbXn11VfZvn07R48epaGhgd69\nezN79mwGDx6s0b1dXFz47LPPOHDgANHR0Zw8eRJtbW0sLCxwc3MjODgYU1NTja61atUqUlJSOHTo\nkEbHC4IgdKZBLlb39DxBEARBEARB6AgRBBIEQXhEFRYWsnDhQgICAggODmbbtm0kJCRQW1uLs7Mz\nwcHBDBs2THF8aGgoYWFhbNy4UakU2a3Xatm8ftOmTURERPDtt98SExPDkSNHuHHjBhYWFkycOJG/\n/e1vaGlpcfbsWfbt28fVq1cxNDTEz8+PBQsWoK+vr3bsJSUlbNu2jbi4OGpqanB0dGT69On4+/ur\nPT4uLo6DBw9y6dIlampqsLKywtfXl+eee04luLJw4UIAPvvsM0JDQ/nzzz+5efOmoiTb7YrPKebH\nMxlKpcVSM/Opk5Tw4eGLzBuvp+gjI2dpaUlRUZEIAt1Hmq7u7m7tiPv4uYp/L5rYnxHDm3sytRaY\nsLa25uDBg+1e28rKiuXLl6vdp+7awcHBan9W1R2ro6PDtGnTmDZtmsq+pUuXKv0+Q3N5upbXWbt2\nbbvjDwgIaLVEoZmZGfPmzWPevHntXkcQBKGrcrExYYCTZbvlQ1sa6GyJi43JXRyVIAiCIAiCIDTT\nvt8DEARBEO6vwsJCli1bRmFhIU888QSjR4/mypUrrF+/vtVSUB313XffERoaymOPPcZTTz2FlpYW\nO3fuJCwsjEOHDvGf//wHe3t7nnrqKSwsLPjll1/49ttv1V6rsrKSFStWcPnyZcaPH88TTzzBjRs3\n+Oijj9i3b5/K8WFhYaxbt45Lly4xbNgwAgMDsbe3Z//+/
axYsYLq6mqVc6RSKWvWrOHcuXN4e3sz\nZcoUbG1tb/v5j8VfZdWPUSqTQzp6zWWuEjJyWfVjFMcTcpX2l5Q0Hy8CQPePWN0tCEJXUFhYSGBg\nIJs2berwucnJyQQGBraZ3SfcuefHuKOlpdmxWloQPNr97g5IEARBEARBEP5HZAIJgiA84pKTkwkO\nDiYoKEixzd/fn3Xr1rFv3z4GDhx4x/fIzMzks88+o0ePHkBzpsLf//539u3bh4GBAZs2bcLR0RGA\nhoYG3njjDU6cOMHzzz+vUiLr8uXL+Pn5sXLlSrT+N9vy7LPPsnTpUnbu3MnIkSOxs7MDICkpidDQ\nUPr168c777yjFEyJiIhg06ZNhIaGsmjRIqV7lJSU4OjoyAcffIChoeEdPXt8TjGbfklGJlPd183S\njuqSfCoLrmBgYsl/DidhY9YNb1cr8vPzKS4uxtbWVgSB7iOxuvv+i4qK4uDBg+Tm5iKRSDA1ZFnr\nzgAAIABJREFUNcXBwYHRo0czdOhQRfYeNPfykfP09GTDhg0sXLiQqqoqduzYofb3+euvv+bw4cO8\n+eabjBo1qs2xNDY2cvz4cU6dOsXVq1dpbGykV69eTJgwgcmTJyv+JglCV9cyg/fZZ59l27ZtpKam\n0tDQgJubG0FBQXh7eyuOl79nLl26FHNzc/bu3Ut2djbV1dVK2YHXrl1j7969JCYmUlZWhrGxMV5e\nXgQHB9OzZ0+lMZSVlbFv3z6io6MpLi5GV1cXc3Nz+vXrx+zZsxXv5TKZjFOnTnHs2DHy8vKoqanB\nzMwMR0dHJkyYwOjRo+/Ni9YOb1crlk4e0Op7vpyWFvzjmYEq2b+CIAhC1yaTyTh06BDHjh3jxo0b\nmJiY4OvrywsvvMDrr78OQEhICABVVVUcP36c2NhYrl+/Tnl5OUZGRvTr14+//e1v9OvXT+X68p6U\n/+///T+2b99OTEwMtbW1uLq6Mn/+fB5//HFqa2sJDQ3l7NmzlJaWYm9vT3BwMH5+fmrHfObMGY4d\nO0Z2djb19fXY2toyduxYZsyYgZ6entKxqamp/PTTT2RnZ1NeXk737t2xtbVlyJAhSnMFgiA8mEQm\nkCAIwiPicqGEA9E5hEZmcCA6h6tFlUBzeafnnntO6djBgwdjbW3NpUuXOuXes2fPVgSAoDmzxcfH\nh7q6Op5++mlFAAhAT0+P0aNHI5VKyc3NVbmWtrY28+fPV5pstbW1JTAwEKlUyq+//qrYLp+Yeu21\n11QCKQEBAbi5uXH69Gm1Y164cOEdB4AAfjyT0epkUI/ezRNsN1LO0FBbhUwGoZEZNDU1ERISgkwm\n48knn7zjMQh3Rqzuvn+OHTvG+++/T25uLsOHD2f69OkMGTKEuro6Tp48ibGxMUFBQdjY2AAQFBSk\n+N/48ePR1tZm4sSJ1NTU8Ntvv6lcv76+nl9//RULCwt8fHzaHItUKuW9997jyy+/pLKyEn9/fyZN\nmkRTUxNff/01//nPf9p9nsDAQFatWnV7L4bwSLO0tOTLL79k7ty57R/cAQUFBSxfvpzKykomTZqE\nn58fWVlZrFu3Tm3Psd9//5333nuPbt268dRTTykFYGJjY3njjTc4ffo07u7uTJkyBS8vL/7880+W\nLVtGVlaW4ti6ujpWrlzJ/v37sba25umnn2bChAk4Oztz7tw5pff/nTt3smnTJkpLS/Hz82PatGl4\neXlx8+ZNzp4926mvx52a5O3EB8/7MNDZUu3+gc6WfPC8DxMHOardLwiCIHRdX331FVu3bqWqqopJ\nkybh7+9PfHw8b731FlKpch/Ra9eusXPnTrS0tBg2bBjOzs7k5OTw9ddfM3bsWD7++GO196iqqmLl\nypVkZ2fj7+/PyJEjyczM5O233yYnJ4e1a9cSFRXFsGHDCAgIoKioiH/9619cvHhR5Vqffvop//73\nv8nPz2fkyJFMnjwZE
xMTfvjhB9atW0djY6Pi2NjYWFatWkVaWhpeXl5Mnz6dESNGoKenxy+//AI0\nl3oPDAyksLCwE19VQRDuFZEJJAiC8JBT14sGoK6yjNzcUpwe80RbW3VNgJWVFenp6Z0yhj59+qhs\ns7S0bHWfPGBUXFysss/a2lptabYBAwYQFhamNMmUnp6Orq5uq5NEDQ0NlJeXI5FIMDH5K3NDX18f\nFxeXth9KA5cLJW1mkHS3dsT28VEUpP5O+uEvMXfqz/U4PQp++57Swnz69+/PjBkz7ngcwp0Rq7vv\nn2PHjqGrq8tnn32mkhVYUVGBsbExwcHBJCcnU1hYqLYX0pNPPsmuXbs4duwYEydOVNoXGRlJVVUV\nkydPRle37Y/F//3vf4mLi+OZZ57h73//u+LvZlNTE59//jknTpxg1KhR7QaTBOF26Orq0qtXr06/\nbkpKCtOnT2fBggWKbZMnT2bFihVs2bKFIUOGYGRkpNh3/vx51q1bx5AhQ5Suk5OTw5QpU7Czs+PA\ngQNKizuuXLnC8uXL2bx5M59++ikAiYmJ5OfnM3XqVJVsXKlUSkNDg+Lfx44do0ePHmzZsgUDAwOl\nYysqKu78Rehk3q5WeLtacblQQsLlYqrrpBgZ6DLIxUpkiQqCIDygUlNTOXLkCD179uTjjz9WLDCc\nO3cua9eupaSkRLEoCaBXr15s374dU1NTRTbOsGHDCA4OZvfu3Zw/f17tfXJycpg0aRKvvvqqYtGj\nt7c3n3zyCatXr8bDw4ONGzcqeueOGzeON998k71797JmzRrFdSIiIjh58iS+vr4sX75cqdeuvNfv\nL7/8wpQpUwAIDw/n2rVrWFhYMH78eKUewF3xvVYQhI4TQSBBEISH2LH4q21OXFfU1BNx4SbHE3JV\nVqXq6Ogga2vGuwPUlTPT0dEBUJpcunVfy9VJcubm5mrvYWFhAaDU40cikdDY2EhYWFib46upqVEK\nApmZmXVKWaeEy6pBrFv19B5PNws7ii9GU5KTiKypiaL+brz4wgtMmzat3Ylp4d6Y5O2ErbkRoZEZ\nJF1RDewNdLYkeLS7CAB1gpYTp1k3ypFKZYq/CS2ZmppqdD1LS0tGjBjB77//TmZmplLg+ejRo2hp\naakEh24lk8k4fPgwFhYWLFq0SClwrq2tzcKFCzl58iSnT58WQSDhrmhZvm3p0qWA5uXUAKrrpJzP\nKiTvh2NEnTpMWV4OqUnxWFhYMGzYMKV7ubu7Y2JiQnh4OD/++CP9+/fnyy+/5Pz581hbWxMREYGL\ni4tShm9kZCRSqZTBgwcrBYAAnJ2dmThxIj///DO5ublK+1tOSsnp6uqqvPfp6OioXbCi6d+B+8HF\nxkQEfQRBEB4SERERAMyaNUvpu62uri7z5s1j5cqVSse3PCYmJgaAdevWYWlpib6+PocOHaKoqAhr\na2ul8wwMDFiwYIHSd1F/f38+/fRTKisreemll5TeOx9//HFsbGzIzs5Wus7BgwfR0dHhjTfeUHmv\nnT17NocPH+b06dOKIJBcW++1c+fO5dlnn1Us5hQE4cEiZpYEQRAeUm31olEiQ6kXTWvkHwjVBWYq\nKyvvZKgdUlZWpnZ7aWkpoBxUMjIyQiaTtRsEulVn9fWorpO2fxBg6eKJpYun4t8vjH2MWWpKigUH\nB6vNdIDm8nYBAQG3N1BBI2J1992lLmuxULsn1y6l4jPxb8wMfJKnx/ri4eGhkhXUnv79+7N9+3ae\nf/55HBwcMDExwczMjLi4OCZMmKBYuXnx4kX27dtHfHw8Fy9e5MaNGwwdOpQxY8YgkUhwcHBg9+7d\nAOzatYtr166xbNkyoqOjSUlJISUlhfT0dPz9/ZkzZ45iIlveTwWaMy9a9i4KCgpS+r2WjyEtLY3K\nykrMzc0ZOnQoQUFBKl+6V61aRUpKCvv372fv3r2cPn2agoIC/P39FYEC4eEkL6eWn5/
PoEGDGD58\nODKZjMLCQs6dO8eoUaOws7Nr/izwcyJJV25yuSGayoLdmNi5odfNkTpSqayu5Z133uG9997j8ccf\nV1xfHqg5duwYR48epUePHtja2uLi4kJkZCQ5OTls3rxZ0U8gLy+PgQMH0qdPH0JDQ1XGe/36dQBF\nEMjT05MePXqwd+9esrKyGDp0KB4eHri5ualMQI0dO5ZDhw7x6quv4ufnh6enJ/369btn/fLUBeAE\nQRCEh9utn/cTUporZPTv31/l2L59+6pdsHThwgUOHjxIaGgoBQUFzJs3T2n/zZs3VYJAPXv2pFu3\nbkrbtLW1MTc3p7a2VmmBh1yPHj2UyrjX1dWRk5ODqakpP//8s9rn09PTUyq96u/vz549e0hLSyMs\nLIyysjI8PDywsvprfsDS0lIEgAThASaCQIIgCA+ptnrR3Erei6atIJB8skVdibbMzMzbGuPtKCoq\norCwUCndHiA5ORmA3r17K7b169ePmJgYrl69ipOT0z0bo5yRwe29zd7uecK9IVZ3d77WshZtPHzR\nMTCi+NJ5vtq2i/Cjv2BjZoSnpycvvvgi7u7t9186fvw4ISEh1NXVUV1dzeTJk6murubAgQMUFBTw\n1FNPAXDixAk+//xz9PT0MDU1xdXVlT59+nD8+HHCw8Opr68nLy9PEVS+cOECEomEDRs2IJFIMDc3\nx9zcHH19fX766SfKysoUE8aurq4EBQURFhaGjY2NUsC2ZbmNlmPw8fHBysqKvLw8jh8/TnR0NB99\n9JHKZAHAxo0bycjIYMiQIYwYMaLDQTLhwaNJOTX571XFjXIAKvIycRz2FNZ9h1NXWUZxZhwyC1uu\nF5fz6aef8vXXXysWQcjf87Oysjhw4ABZWVncuHGD//u//yM+Pp4zZ84QFRWlaERdVVWFoaFhuz16\nampqgOZFGh999BGhoaFERUURFxcHNK82fvrpp3nuuecUQdRFixZha2vLyZMn2bt3L3v37kVHR4eh\nQ4eycOFC7O3tO+lVFQRBEB51rZVST43LxkBayeUyKbfGYbS1tZWqSgD8+eefLFmyhPz8fMzMzLC1\ntVUsiKioqMDNzY3Fixczffp0pQUG8gWN8oU+8h63Ojo61NXVERgYSFBQECNGjGDnzp1cuHCB+Ph4\nGhsbuXDhAh4eHlRWViKTySgvLycsLAyZTEZRURHFxcXU1NQgk8nQ19fHxMSEvLw8HBwcCAkJwdjY\nGCMjI7777jtCQkKA5s8D+/btY9CgQWzatImIiAhCQkJUvoufPXuWw4cPk5OTg1Qqxd7eHn9/f6ZN\nm6ZYMCK3cOFCALZs2UJoaCiRkZGUlZVhbW3Nk08+ycyZMzttUaYgCH8Rs0yCIAgPofZ60aiTdKWE\ny4WSVie3H3vsMQBOnjzJuHHjFKudiouLO5xpcyeampr4/vvvWblypeLDYUFBAYcOHUJHR4exY8cq\njp06dSoxMTF89tlnrFq1SmXlUm1tLVeuXKFv3753ZayDXG6vNNjtnicID6L2shZ7uHnRw80LaX0t\n1cW5eNjWkBL3J+vWrePLL79sM+CRm5vLl19+iZGREWvXrmX//v307NmTgIAA/vjjD2xsbBg2bBjX\nr1/niy++wNbWlg8++ID58+fj6enJmjVrSExM5J///CfXrl1j7ty5rF69Gvjry3nv3r1Zv3694st/\nbW0tr7/+OqdOnWLevHlYWFjg5uaGm5ubIgikLqPv1jG0LLWVmJjIW2+9xTfffKNU712uqKiILVu2\ndOnSWMLd0Vo5teTcMpXfKwMTS6weUy79pmtgRKHMkPSsK6SmpuLp2ZyVWlVVBcDgwYNxcXFR9NuT\nl088c+YMly5dUgSBmpqaiI6OZvHixbz33nsajd3KyorXX38dmUxGbm4uiYmJ/PLLL+zatQuZTMac\nOXOA5sm1qVOnMnXqVMrLy0lNTSUyMpKzZ89y9ep
VtmzZojLBJAiCIAgd1VYpdR09fSok9az6/jRv\nBvkrlVJvampCIpEofXb74YcfsLS05MUXXyQhIYHCwkKCgoKA5v47eXl5tz3OzMxMfvrpJ/r168eT\nTz5JQUEBFy5cYO3atWzevFkxDjc3Nz7++GPeffddEhIS6N+/P8OGDcPIyIiCggISExO5cOECDg4O\nTJkyhXPnzpGSksKYMWNobGwkKyuLxMRE3n33XTZv3tzqeHbs2MGePXswNTXF398fQ0NDYmNj2bFj\nB3Fxcaxfv16lzKtUKuXtt9+mpKSEoUOHoq2tzblz59i+fTsNDQ2K10oQhM4jgkCCIAgPIU160bR2\nXmtBoL59++Lp6UlKSgrLli3Dy8uLsrIyoqOj8fb2bnf1b2dxcXHh0qVLLF26FG9vb6qqqhTN3V98\n8UWlFcFeXl7MmzePHTt28NJLLzF06FBsbW2pra2lsLCQlJQU+vfvz7vvvnt3xmpjwgAnyw4F5AY6\nW97TLBNR5ka43zTNWtTVN8TUwZ0mZ0vGWxpz4sQJUlNTGTlypKJ8VFNTE1eLqxTlOyJ/+S+S6jpe\nfPFFxo8fz5EjRzh27Bj6+vpUVVUxe/ZstLW1OXr0KFKplL///e9KX+Ch+e+Iv78/X331FWlpaUil\nUqUvsvPnz1da/WloaIi/vz+7du0iMzNTpd9Ka9obg4+PD9HR0dTU1KiUCZkzZ44IAD1kbi1D08tY\n+ZekvXJq6n6vuts4qaysrSnJp4f7YK7n5JOVlaUIAslLxHh7e6uMTZ6N1rIUrDwrLz8/v8PPqqWl\nhZOTE05OTvj6+vLiiy9y7tw5RRCoJTMzM0aOHMnIkSOpqKggKSmJK1euKPX6EgRBEISOam9RUjdL\ne6pLbiApvKpSSv3ixYsqJdPz8/Px9PTktddeY9WqVRQWFhIcHIxMJrvj780xMTEsXbpUkVmekZFB\ndXU19fX1HDx4kFdeeQUnJyeuXr1KSEgICQkJDB8+nDfffFNp0URDQ4Oin+7UqVOpqqoiJSWFSZMm\nKTLVd+3axY8//sj58+fVjiU9PZ09e/ZgZWXFJ598oujTO2/ePDZs2EBMTAz79u1j1qxZSueVlJTg\n6urK+++/r1jQEhwczOLFi/n555/529/+JnrjCkInE79RgiAIDyFNe9F09Ly1a9fy3XffERUVxaFD\nh3BwcGD+/PkMHjz4ngWBunfvzrvvvsv333/PyZMnqa6uxtHRkRkzZuDv769y/LPPPkv//v05dOgQ\naWlpREVFYWRkRI8ePZg4caLaczrT82PcWfVjlEaT3FpaEKymF9CDQp7aLy8fIAjtaS9rUXIjh+62\nLkoT10lXSmisLACam+dCcwmp8up6lnx+lOzyv86/eCaGqps3OZINToU1+Pv7Ex4ezs6dO9HW1mbi\nxIlA8xdYaO7Xk5GRwfXr19HS0lL0NpFIJNjY2JCXl8c333yjVH5LPvldUlJCVVUVjo6OaifJ23Pr\nGG5VXl5OU1MT169fV5nw1qQsnvBgaK0MTV1lGbm5pfS92fwz1VY5tWGjxpFYaIP2Lf0JdA27q9xP\nWl+LJD+H6up6ruTfBJonky5cuICuri4+Pj4q58gzgZuamhTbxowZg66uLvHx8Vy6dEmRPSwnk8lI\nSUlRTCpdvXoVU1NTzM3NlY6T9/eT/243NDSQmZmJh4eH8rilUsXvl/zYe00mk7F161YOHTqEr68v\ny5cvV5uZJQiCIHR97S1KsnQdyM3MeApSIjHr1VdRSl0qlbJjxw6V4+WfG0tK/no/l8lkhIaGKvXi\nuR0eHh4qvWCtrKzQ0dFR9AaaNm0an376KZ9//jlubm4sWbJEKQBUWVlJQUGBopR6SkqK0vu6nLwf\nb2vvtSdOnADgueeeUwSAoPmzwsKFCzl//jzh4eEqQSCAxYsXK71vmpmZ4ePjw6lTp7h+/TrOzs4a\nvR6CIGhGBIE
EQRAeQpr0lDHobs7gOetaPe+DDz5QOcfY2JjXXnuN1157TWWfvF5xS0uXLm01uyQ4\nOFhtSSSAgIAAlQ+2t97jn//8p9pz1enfv7/aJp7qdHYAw9vViqWTB7S5sgyaA0D/eGZgm32Z7gZL\nS0tFuSxBuNfay1rMOfNftHX1MbLqiUF3c2QyqCq8QomOBL+hA/Hy8gKgwdie9OulXAn7GlMHd7R1\ndNE3NkdaXwtAVmkjq36MYvYgbyCcmzdvMnz4cEWz24qKCgD27dsHNJdmq6ioUCp16eDggJubG0eP\nHiU6Oprc3Fxu3rxJSEgIeXl5pKWlMXfuXBwdHdVOkrfn1jG0pra2VmVbyy/dwoOrrTI0ABU19RyO\nvcqEhFwmDnJstZzaj2Fh1Np64+A1Tul8aa1qUNLE1pmSnGTqq8oJ/zWSptoKIiMjkclkODs7q2Sd\ntaZ79+706dOH8vJyli9fjpeXF05OzZlHRUVFpKenI5FIFD/f8fHxfP/99/Tr1w8HBwfMzc0pLi4m\nKioKLS0tZsyYAUB9fT0rV67E3t6ePn36YGNjQ319PQkJCeTm5uLj44Ojo2NbQ7sr6uvr+fjjj/nj\njz+YPHkyixcvFv0LBEEQHlCalFI3sXXByn0IxRmxpB/+khtOHliXxJOdnoSRkRGWlpaK94HLhRJs\nPEYQ/d8dTH9+AbJaCWU3i1i2bBlXr15l+PDhREZG3vZ41S3+0dbWxtzcXLFAYsKECURHRxMTE4NU\nKlX08ZFIJBQUFJCSksL48eNZsmQJAN988w2xsbHcvHmTvXv3cv78eTIzM0lKSsLGxoYxY8YoAkwt\nycvFyj+Tt9SzZ0+srKwoKCigqqpK0W8QmucV1PX0k38278hCKkEQNCOCQIIgCA8h0Yuma5nk7YSt\nuRGhkRkkXVH9gjHQ2ZLg0e73PAAEzb0jevXqdc/vKwjQfvah/aAAJPlZ1JTcoCIv83/BHTNGPTmd\njcsXNmce5BQTWWKB7eN+lF5OpSDtD2RNjZjYOqOrb0gd0FAtQUfPgF0J5Vha2SMpzmfSpEmK+8i/\nlO7evRsjIyMCAwPx9PRUCYbLZDJOnz7NyZMniY6OpqSkhNjYWGxtbZkzZ45ST7KOunUMHSEmnx98\n7ZWhUZChUobm1nJqT06dRXluukoQqKooF5lMpvTzom9sgalDb25mJZJ7OYtIaQW9e/dm8ODBnDt3\nrs2hlEhqORCdQ3WdlPqqMnQNjJg5czzW1tbExcWRmpqKrq4ulpaWeHl5MXLkSMW5gwcPpqioiNTU\nVKKioqiursbS0pJBgwYxbdo0ReaPgYEB8+fPJzk5mQsXLnDu3Dm6deuGvb09r776KhMmTOjAq9w5\nJBIJ69evJz09nXnz5vHss8/e8zF0FbW1tQQFBeHu7s6//vUvxfb6+npmz55NQ0MDy5YtY9y4v34W\njxw5wpdffsnrr79+X/7/EwRBuJWmpdQdh0/G0NSK4ozzFGec56g0j1nPjGfu3LnMnz8fXWNzlm//\n838BJWu03QO4nB5FceZFtBrrGK3TTbGA4E6CQC2DKS3p6OgoLUCaMWMGx44dw8DAgMTERKqqquje\nvTvW1tbMmDFD6W/zrFmzKC4u5tq1a/z5559cuHABa2trZs2axZQpU+jeXTWbGFCUk2ttQZKlpSVF\nRUVqg0CtPQN0bCGVIAiaEUEgQRCEh9CD0IvmUePtaoW3q5VKn4dBLlZtvu4te/Y8++yzbNu2jdTU\nVBoaGnBzcyMoKEilZ0NDQwM///wzp0+fJj8/Hx0dHVxdXQkMDFQ08VZ3/ZZZW5s2bSIiIoKQkBDi\n4uI4fPgweXl5GBkZMWLECF588UXFh/fk5GRWr16tODcwMFDx36LXkNCW9rIWrR8bivVjQ1W2j53Y\nX5Gh8OOZDNDSxmFQAA6DlDMIc2OOUnUzj4q8TAzNrJDW1xGbmskoTxeGDv3ru
n379iUzM5PU1FSG\nDRumNrMRmifbx40bx7hx42hqaiIlJYXt27dr/LxaWlqtfqm9dQxC13K3+6dp2hsLQCaDrw5E8uGL\n49SWU9PR1kJbV0/lvNqKmxRfisG673DFtpqyQmrKirBw8eSdzV8w3ccNgNDQUKUgUMsM3YiYC6Tl\nlpIhzSaKNKC5XF3qlZs0Wd3kkzmv8PLLL7f5DI6OjkplFVujq6vLzJkzmTlzZrvHdpa2+jEVFhay\nbt06bty4wbJly+4o8PswMDQ0xN3dnUuXLin1K0tLS6OhoQGAxMREpYnGxMREQP2qcUEQhPtB01Lq\nWlpa2HiMwMZjBADzxj5G8Gh38vLyuFpQSpmRCSUtvn/36D2IHr0HkXFiG5KCK2SZj+JimQ7BwcFM\nmDCBBQsWKPUSavn5s6qqSuneISEhKt+55OSLluSlueWMjY0xNzenb9++fPTRR20+m5+fH1evXkUi\nkbBx40ZF+db2yBculZaWqs3skZfDay3oIwjCvSOCQIIgCA+pR6kXzYPExcbktoJtBQUFLF++HBcX\nFyZNmkRpaSmRkZGsW7eOFStWMHr0aKC5T8Lbb79NSkoKvXr1YvLkydTV1fH777/z4Ycfkp2dzdy5\nczW+7/fff09cXBzDhw/H29ubpKQkjh8/Tn5+Phs2bADA1taWoKAgDh48CMCUKVMU57u5uXX4WYVH\nx51mLbZXvsP6saEUZ8RyI+UMpg69Kb92iZLySoaOGoeWlhbFxcVYWVnxzDPPcPz4cb799lscHBzo\n2bOn0nWkUikXL17k8ccfv63xypmamlJcrH616b0aw6MqIiKCTZs2KTVS7io0KUNzq+iYOGaFhzDY\ny1OlnJqZkQFajiNVzjF16MP1uHAq8jLR62ZCTUk+NWUFGFv1wtl3Ct6u1u3e91j8VT78KY6Kmnp6\nqNmfX1rNqh+j+MczA5k46N6XabsT7fVjskjNIH7FCmpra3nnnXdEEON/vLy8uHDhAikpKYoAdmJi\nItra2nh6eiqCPtCcTZmcnIydnR02Njb3a8iCIAhKNCmlDtBQU4muobEio9bIQJe6ujre+9en5BRW\n4OLXr83zZS2yefvZNWfWqPtcWF1dzfXr1zv4FKp69eqFsbExOTk5lJSUYGlp2ebx2traQMeycNzc\n3MjKyiIlJUUlCJSfn09xcTG2trYiCCQIXYAIAgmCIDykunovGqFjUlJSmD59OgsWLFBsmzx5MitW\nrGDLli0MGTIEIyMj9u/fT0pKCkOGDOGtt95SpNQHBwezbNky9uzZw7Bhw1SabLcmPT2dzz//XNHo\nvrGxkTVr1pCUlKRo/m1jY0NwcDARERGKewmCJu40a7G98h2GZtY4DAog58xu4n98j6amRvQMu3M+\n+SJLly7FyMiIjRs30qtXL15//XU2b97MkiVLGDx4MD179qSxsZHCwkLS0tIwNTXlq6++uqPn9fLy\n4syZM7z33nv07t0bXV1dHn/8cTw9Pe/ZGITbczf7p2lahqYlU4feuLsaU1ddoLacWkh0mcrvlbFV\nT+wHjCEv8TQ3s+JpqK3CxNYZ9wnz8B08oN0FCpqWrJOpKVnX1WnSj+lEVCrOFrr4DHpc0Uj7UXRr\nppRVrz5Ac+CnZRCoT58+jBw5kq+++orr16/Ts2dPsrOzkUgkSqUBBUEQ7jdNFyUVpkekCfe/AAAg\nAElEQVRRejkZE1sXdLuZEEcyP395kTPxGZjY98Hcqf0etDIZhEZm8O+5vvTq1Yu0tDRyc3MV/e2a\nmpr49ttvqa+vv6NnguagzuTJk/nvf//Lli1bePPNN9HT+ytTWCqVUlVVhZmZGdC8WAmgqKhI43tM\nmDCBEydOsGvXLoYPH664VlNTEyEhIchkMp588sk7fhZBEO6cCAIJgiA8xLpyLxpBvdbK0BgbGxMU\nFKR0rLu7O2PHjiUiIoI///yTgIAATpw4g
ZaWFosWLVIEgADMzMyYPXs2mzdvJjw8XOMgUFBQkCIA\nBM11msePH09qaqoiCCQId+JOshY1Kd9h7uSBjn436qvKARnaevpkpqcyzmeg0pfScePG4erqyoED\nB0hKSiI+Ph5DQ0MsLS0ZNWqUItvuTrz00ktA8wTp+fPnkclkBAUF4enpec/GINyeu9k/TZOfY4Pu\n5gyes07xb0Mza0aNHdVqFu/zhsWK3ysTWxelc93Hv9Bcvu3Ap/RwG0R3q54q1wkODlYJ6MtL1t06\nFnXkk1wPwucLTYNbZj0fo9asB/EpcaxZs4b3338fE5NHp4xua5lSTY2NXMmv5GTkORYtWkRVVRVZ\nWVnMnDmTgQMHAs1/83r27ElSUhKAYrsgCEJXoOmiJFN7V2pKb1CRn4WxTiNp2taYWFhj4uGPdV8f\njXs0Jl0p4XKhhBkzZrB582ZWrFiBn58f+vr6JCUlIZVKcXV1JScn546fLSgoiIsXLxIdHc3ixYsZ\nNmwYRkZGFBUVER8fz4IFCxQZ0gMGDEBLS4vt27dz5coVRR+g5557rtXre3h4MHPmTH766SeWLFnC\nqFGjMDQ0JDY2litXrtC/f39mzJhxx88hCMKdE0EgQRCEh9zt9qIR7q32ytCMHdVHUWu/pQEDBhAR\nEUF2djYjR44kPz+fHj16qJ2slE+6ZGdnazyuPn36qGyzsmqe1KusrNT4OoLQmjvJWtSkfIdBd3OG\nLfhAadsrE/szbbiryrEuLi4a93uR119Xp2X/lJbMzMxYsWJFm9ftrDHcLZcuXWL//v2kpaVRUVGB\niYkJzs7OTJw4Uann2NmzZzl8+DA5OTlIpVLs7e3x9/dn2rRpSqtQobmPmKenp9rnadmfTF4+qmV/\nnuDgYLZt20ZCQgK1tbU4OzsTHBys1Fdp1apVpKSkKK63adMmxT75dUNDQwkLC2Pjxo2UlJRw8OBB\nrl69iqmpKSEhIW32BKqrq+PgwYNERkaSl5eHlpYWzs7OTJkyhTFjxigdK5PJOHXqFMeOHSMvL4+a\nmhrK67W5XKlHj96DsHDx1Pj/i7Z+/tv7vZIHcjTNBr6dknXySa6u/lmjI/2YbB/3w+imOdlZZ1m1\nahXvv/++Sl+mh1FbmVLaOjo0drflVFQy+yJT6alfSVNTE15eXjg6OmJpaUliYiJPP/00iYmJaGlp\niVJ6giB0OZosSjKxc8PEzg0tLfjgeR+8Xa04EJ1D3vG0Dt8v4XIx0yZMAGD//v1ERETQvXt3RowY\nwdy5c9m4cePtPooSXV1d3n33XY4ePcqpU6c4deoUMpkMS0tLfH196d//r+wlR0dH/vGPf7B//36O\nHDmiyEZqKwgEMH/+fNzc3Dh8+DCnTp2isbEROzs7XnjhBaZNm4aurph6FoSuQPwmCoIgPCJutxeN\ncPdpUobmbFY5xxNyVXosyCefqqqqFA1EW6v3bGFhAXQseCNfAdaSPMOoI/WiBaEtt5u1eKc9hYSO\nOX78OF988QXa2tr4+Pjg4OBAWVkZmZmZ/PLLL4og0I4dO9izZw+mpqb4+/srVoTu2LGDuLg41q9f\n3ykTAoWFhSxbtgw7OzueeOIJJBIJkZGRrF+/nvfff18R+B4/fjzGxsZERUXh4+Oj1Kvs1hr1+/fv\nJyEhgeHDhzNw4ECVxsy3qqqqYvXq1WRnZ9O7d28mTJhAU1MT8fHx/Pvf/+bKlSu88MILiuN37tzJ\nnj17sLW1xc/PD2NjYzKv5JFx+CylV9M6FARq7+e4M7OBb6dknfy8rvzZ43aCW9U9PAny6cO+sO28\n+eabbNy4sd0+Cw8yTTKlutu5UpGfzf+3/TBPuumhr6+vyDgeOHAgsbGxNDQ0kJqaipOTk6JckCAI\nQldxu4uSNMnmdZ8wX2Wb/LwJEyYw4X/BoJbULYwZMGAAhw4davU+ISEharfr6OjwzDPP8Mwzz7Q7\n1nHjx
jFu3Di1+5YuXdrqQqUxY8aoLHzp6DhBfSayIAidQwSBBEEQBOE+0rQMTUNNldoeC2VlZUDz\nRKZ8MrO0tFTtNeTbRWNOoSu6nazFO+0pJGguNzdX0RPnww8/xMnJSWm/vLFxeno6e/bswcrKik8+\n+UQRfJ43bx4bNmwgJiaGffv2MWvWrDseU3JyMsHBwUqlMv39/Vm3bh379u1TBIHkWVlRUVH4+vqq\nzdKSS0pK4qOPPlIKFLVl69atZGdnM3/+fGbOnKnYXl9fz4YNG9izZw+jRo1SXO/YsWP06NGDLVu2\nYGBgoDi+2ulP4i9d0/jZNf057qxs4PYmuWSNzfu1WpQh1eS8++12g1vmvQfzxhvmfPrpp7z55pts\n2LBBqXTqw0STTCkTu+bMSkl+Dr9cLuJpn37o6+sDzb3QTp8+zZEjR6itrRVZQIIgdFm3s3hCk6x0\ndW73PEEQhNsl/uoIgiAIwn2kaRmampJ8pPV1Kj0WkpOTAXBzc6Nbt27Y29tz48YN8vLycHBwULqG\nvBb/3Wpora2tjVTatSf8hK6vo1mLd9JTSNDckSNHaGxsZPbs2SoBIPirTOSJEyeA5tIh8gAQNK9C\nXbhwIefPnyc8PLxTgkA2NjYqJUoGDx6MtbU1ly5duq1rTpo0SeMAkEQi4ddff8Xd3V0pAASgr6/P\n/PnziYuL47ffflO6po6ODtra2krHPz/GnZTckrv2c3yn2cDtTVbVVtwEQM9I+R5dfZLrdoNU1XVS\npgUEoKenxyeffKIIBNnZ2XXyCO8vTTOljCzs0dU3pPzaRYprq7B/bopinzwYu2fPHqV/C4IgdEUd\nXTwhstIFQXhQdO1P5YIgCILwEOtIGRppfS03kn8jSe9JRY+FjIwMTp8+jbGxMb6+vkBz2aOdO3fy\n3XffsXr1asVEY0VFBbt27QJQW3KgM5iYmHD58mXq6+sVK4AF4W67k55CQttaToD88ls01XVShgwZ\n0uY5WVlZAGpX+/fs2RMrKysKCgqoqqq646xEV1dXlWAKNAek0tPTb+uajz32mMbHXrp0SVEWMzQ0\nVGV/Y2Mj0JxFJTd27FgOHTrEq6++ip+fH56envTr16/L/xy3NllVU1pAyeVkSnOS0dLSwtzRQ6Pz\nugpN+4oNnrNO7XkdKX/zINI0U0pLW5vuNs6UXbvY/G/zv/oS2tjYYG9vT35+Ptra2nh6al7yUHjw\nHTp0iKNHj1JQUEB9fT2LFi1i6tSp93tYgtAuTRdPiKx0QRAeFCIIJAiCIAj3SUfK0JjYOnMzM56q\n4jw+kV7AzUKXyMhImpqaWLJkCUZGRgDMmDGD2NhYoqKieO211xg6dCh1dXWcPXuW8vJyZs6cqdQA\ntDN5eXmRkZHBunXrePzxx9HT08PV1ZXhw4fflfsJglxn9j4RmstU/ngmQ2lCI/XSdeokJfz7aCbz\nxxu2+lpWV1cDKGUBtWRpaUlRUVGnBIHU9SyD5kwbmSYpNWrI+6xpQiKRAJCRkUFGRkarx9XW1ir+\ne9GiRdja2nLy5En27t3L3r170dHRYejQoSxcuJAPnvfpkj/HrU1yVZfkU3QxGkPTHjj6TKabuY1i\n34MwySVWcLetI5lS3e1cKbt2ER19Q8xseint8/LyIj8/nz59+oiStI+QM2fO8M033+Dm5saUKVPQ\n09OjX79+93tYgtDpRFa6IAgPAhEEEgRBEIT7pCOTK/rGFjgOn0xefAQxZ3/lurkhvXv3Zvbs2Qwe\nPFhxnK6uLuvXr+fAgQP89ttvHD58GG1tbVxdXXnppZfu6orl5557jqqqKqKjo0lLS6Pp/2fvzuOi\nLtfH/7/Y90WEQURWdwUEEckdJdfcMjXFUj+ZedKOWmq/L9U51qmj53yy1MoW045Wop1j5o6mlOkJ\nA0UQBkQxQAGXEREYQJBlfn/4YXIcdhcQr+c/2ft93+/7nmEeI97Xfd1
XVRWhoaESBBIPxf2qffK4\nOxB/scZsFGNTc8qAhLMXCL9azKtj/Rjp76bXvzogfePGDVxcXPTu5+XdDiLcuRBsYGCgzZq5W1FR\nURNfSdMYGBg0uG31a5gwYQIvvvhig/oYGhoyYcIEJkyYQEFBAcnJyRw7doz//ve/XLx4kXXr1vH+\nzH4t8nNc0yJX247+tO3or9f2UVnkkh3cdWvMcX6KbsEougUDYG2hmw28YMECFixYcF/nJlq+EydO\nALB8+XIcHByaeTZCPDgtPZtXCCFAgkBCCCFEs2lsrQRzOye8Q6bx8sgeTOzrVWs7U1NTpk6d2qCa\nGwqFgj179uhdX7x4MYsXL66xj6+vb419zM3NmT9/PvPnz693XCEelHutffI4i8/IrXUBw9KxA8XX\nL1F46Tzmdo6s3puIws5CbyHD29ub33//HaVSqRcEunz5Mrm5uTg7O+sEgaytrcnN1c+MrKqqIiMj\n4768tupj46qPb7sfunTpgoGBASkpKU3qb2dnR//+/enfvz+FhYUkJiZy4cIFOnXq1CI/x611kUt2\ncNdOMqXEvagO+ksASDwOJCtdCNHSSRBICCGEaCayuCKEaEm2HE2rdSHcqUsfctPiuKI8im37jpjb\nORFxLE27mJGbm4ujoyPDhw/n0KFDbNu2jb59+2JnZwfcDr5s3LgRjUbDiBEjdJ7dpUsX4uLiiI+P\nJyAgQHv9u+++Q6VS3ZfXZmNzO6Byv54Ht4M4ISEh/Pzzz2zbto2pU6fq1SiqroPi7OxMeXk558+f\np3t33bo5FRUV2ownMzOz+za/B6E1LnK11uDW/SCZUqIpIiIi2Lp1q/b/x40bp/1z9Sai06dPs2PH\nDs6dO0dpaSkKhYL+/fszefJkvSMDw8PDUSqV/PDDD2zfvp0jR45w9epVhgwZwuLFi4mKimLNmjUs\nXryYtm3bsnXrVtLT0zE1NSUoKIi5c+diZWVFeno63377LSkpKVRWVuLn58e8efNQKBTcTa1Ws2PH\nDn777TdUKhXGxsZ06tSJyZMn6/w9BeiMb29vz/bt20lPT6ekpKTGTVMNee9WrFiBr69vo/qK5idZ\n6UKIlkyCQEIIIUQzkcUVIURLkalS1/ldZG7nhFvQaLJi95G6/wvsOnTjUoIDNpeiybuShaWlJStW\nrKB79+4888wzfP/99yxYsIABAwZgbm5OXFwcFy5coEePHkyaNEnn2U8//TSnTp3ivffeY9CgQVhb\nW5OamsqVK1fw9fUlKSnpnl9ft27dMDMzY/fu3ajVam3NorFjx95TjZI//elPXLp0iS1btvDzzz/T\no0cP7O3tycvLIysri7S0NJYtW4azszO3bt3i9ddfx8XFhU6dOqFQKLh16xYJCQlkZWURHByMm5v+\nEXstTWtc5GqNwa37RTKlRGNVBy+ioqJQqVRMnz5d5/6BAwf49NNPMTMzY+DAgdjb25OUlMT27duJ\niYnh/fffr/F7ecWKFaSlpREYGMgTTzyh3WRQLSYmhhMnThAUFMTo0aM5c+aMdg6zZs3izTffpGfP\nnowYMYLMzExiY2O5cuUKn3zyic5RoCqVivDwcFQqFT179iQwMJDS0lJOnDjB8uXLWbBgASNHjtSb\n36+//kpcXByBgYGMHj36vm46EI+WlpjNK4QQEgQSQgghmpEsrgghWoKETP3j2O7m2DkQC3sFV88c\np+hqJgXZqfxc2p7BfXx0sntmz56Nt7c3e/fu5aeffqKyspJ27drx/PPPM3HiRIyNdf8J0qtXL958\n8022bdvG0aNHMTc3x9/fn9dff52IiIj78vqsra0JDw9n69atREVFUVpaCsDQoUPvKQhkaWnJP/7x\nDw4cOMAvv/xCdHQ0t27dwt7envbt2/Piiy9qd42bmZkxe/ZskpKSOHPmDL/99hsWFha4uLgwf/58\nhg8ffl9e68PS2ha5WmNw636QTCn
[base64-encoded binary data omitted — embedded payload from the notebook JSON, not human-readable]
89URUUF\n27ZtIzIyktzcXBwcHJgwYQIhISFt84YIIYQQdzEJgYQQQghxx2jNRvZCiNbRUNN5AGffoTj7Dq21\nfcK0ftjY2Bhts7GxYcGCBSbdvA8ODja5z8TEiROZOHFig2M0Gg0RERFG21xcXACwtrY2mtfRo0dZ\nvXo1AwcOrFWZpFQqmTVrFrNmzSIxMZFXX30Vb29v1q1bZ9JchbG4tFw2RSUbLWWWdCSOkvw8vjt/\nk+5+uUbhv5ubG05OTmRnZ1NcXIxare6IaXdKjT1McStLlR0KB292HfyZr/bFMqSXLRkZGQQFBTW5\nGk90nFt/XiovLjAK7S9cuEBVVRUAWVlZHDx4kG3bttG1a1dcXV1RKBQkJCTwzTff8OCDDzJgwAAA\nSkpKSElJYdSoUQwbNowuXbqg1WqJjo7m3LlzDB48uM65FBcX8+c//xmlUsmYMWO4efMmhw8fZt26\ndSgUCqPvc71ez5o1a4iJicHFxYVZs2ZRUVHB/v37ycjIaMN37O6xefNmtmzZwttvv22oVm2Oml5Q\nsoSpEELc2SQEEkIIIcQdpzUa2QshWkdTm8639Li2dHvA7N4FYn/ay6lTp8jJyaG8vNxofF5enknn\n9fX1bYvp3vX2xl2ss3Kl8mYZACl5FSzfFMPzswYwbZC7Yb+DgwM5OTkSAtWh5mGK709eZN3uRBoq\nCnLyHUpB5jnWrP+K6QHVN3+nT5/ePhMVLVJXeFpWVMCZjDyqYtMZn5aLTqcDIDU1laysLLKysnBz\nc6Nbt25cvXoVAB8fH65cucL27dsNIZBKpWLcuHGsXLnS6JqzZs1i6tSpHDt2rM45paWlMWXKFJ55\n5hnMzMwAmDt3Ls888wzffvutUQgUFRVFTEwMfn5+vP3221haWgLVS9C98MILrfQuCSGEEPcOCYGE\nEEIIIYQQzdbUpvNQvXxjZwpy67xhqrvG+b3/RGVeyfiRg5k2bRoqlQozMzO0Wi2RkZHcvHnTpPPb\n29u31dTvWnFpufUuXWZuUb10VUVpEeYWDny4KwFNVxtDRVB+fvXvowRA9YtMzGowAAKw7eGFtZ0j\neamn2JllxqQhfoYgQHRe9YWnNa5cK2H5phju96oeMH36dKysrNBoNGzYsMEQ0NR44oknuHDhguHX\nFhYWdZ7X0dGRbt26UVBQQE5ODs7Ozkb7raysePLJJ43O7+7uTr9+/Th9+jSlpaVYW1sDsH//fgAe\nffRRQwAEYGtrS0hICGvXrjXx3bh3zZo1i3HjxtX6fRBCCHFvkhBICCGEEEII0SJNajqvgNAgn7af\nlInqu2GqTfqZirISuo2ay2W3QXgM/63aJCoqytAc3RQKhaI1p3xP2BSVXO+fJxuHHpTkX6EoOwMr\nWwf0etgcnUyglxNXrlwhOjqasrKyJoVAkZGRrF27lrCwMJOXGbxTpWt1JoW2CoUCJ5+hXDqxj2tl\nMHjk+HaYnWiJhsLTW+n18N3ZEnJyijhx6jQAXl5etQIgqO7XlpSUZLStoKCAd955h6SkJAoKCqio\nqKC8vJzs7GycnJzIy8urFT64urqiUqnqPD9AUVGRIQT69ddfUSgU9OvXr9b4lixtdi+xs7PDzs6u\no6chhBCik5AQSAghhBB3Pa1Wy9KlSwkODiYsLKzR8ffSzUAhWoOpTecVCnh+1gCjHi4dqaEbpmW6\nawDY9+qLXo9RtUliYmI7z/Te0lhI4dg7kLyUOK6ejsKupy8W1moSMvJJvXqdzRs3otfra92ArunP\ntHDhQkJDQ9v6JXRq8em5Jo918B5I1skfUJgrsfXwb8NZidbQUHh6OwtrNeX2vdl7+ATKyhJ8Bw6v\nNebKlSuUlpaiv+WkWq2WU6dOoVAoGDRoEC4uLlhbW3Pz5k1SU1MpKyurs0qyvlDW3NwcwNCfCKr7\nB9na2qJU1r5ldadUVsbExLBz504yMzPR6XTY2dnh6
upKUFAQ999/v2Hc5cuX+frrrzl16hSFhYXY\n2dkxcOBAQkJCcHV1rXXeqqoq9u3bx8GDB8nIyKCiogJHR0f8/f158MEHDcfU1xPo2LFjHDlyhAsX\nLhiWNO3ZsyfBwcHMmjVLHloQQoi7lIRAQgghhBBCiBZrrOn8AA8HQoN8Ok0ABA3fMLVUdwWgKDud\nrj39DNUm+msX+eGHH9pxlveexkKKLs7udO8/huwzR0jatR77Xv0wU1rwzJ/+g3npNWbMmMGLL77Y\nTrO985SUVZg89kZBNnq9nm7ufdErrdtwVqKlTK3wupX7sBmU5F0h+9wRNn6xmZLySkb29yI/P5/M\nzEySk5Pp2rWr0TEpKSmYmZnx4Ycf4u7ubrTvyy+/5MyZM4SHhzN06FDMzMwYMWJEk1+LWq1Gp9NR\nUVFRKwgqKCho8vna2969e/n000/p1q0bw4cPx87OjoKCAtLT09m/f78hBEpOTub111/nxo0bDB8+\nnF69enHp0iUOHTpETEwMK1euxMfnt8rZiooK/vrXvxIfH4+TkxPjx49HpVKRnZ3NsWPH6N+/f53B\n0a3Cw8MxMzPDz88PR0dHiouLSUhI4LPPPiM5OVl6LgkhxGDpN/AAACAASURBVF1KQiAhhBBCiNuM\nHDmS9evX061bt46eihB3lJqm8+laHfHpuZSUVaCyUjLI06lT9QCCxm+YOvsOIz81nrTobdj36ouF\njS0pB7TEWRYwNXgC0dHR7Tjbe4spIYVb4GRsuvUg93ws+Wmn0FdV4eLnxZLFi5k3b55RHxFhTGVl\n+m2A7DNHAHD2G9ak40T7a0qFVw1zS2u8JzxM4ZVkzJSWfPf9Ac6fssXDVYOrqytPPvkkUVFRXL9+\n3XDMjRs3UKvVtQIgvV6Pu7s7ly5d4ty5cyQnJ6PX6w3LvTVF7969iY+P5+zZs7X6UN0JlZh79+5F\nqVTy8ccf1wrRCgsLger364MPPqCkpIQXX3yRCRMmGMZER0fz7rvv8v7777N+/XpDdc7mzZuJj49n\n+PDhvPLKK0b9mW7evElJSUmjc1uxYgUuLi5G2/R6PWvXruXAgQPMnDkTPz+/5r50IYQQnZT8FCeE\nEEIIcRu1Wi0NxYVoAU+NbacLfW7X2A1Tm27duW/yY1w5dZDCrGT0+ips7Lsz5ZEnmTHcR0KgNlRf\n2FCce4nss0cpzsmksvwGSusu2Lneh1fQQ1iobHl6Wj/mDfdi+fLlnD59moiICADWrl1r6OG0ZcsW\ntmzZYjjn7UslASQkJLBlyxZSUlJQKBT079+fJ554otZNb4CysjJ27txJdHQ0ly9fRqFQ4OHhwZw5\ncxg3bpzR2FuXpBs6dChbtmwhKSmJoqIiNm7ciEajadH7ZqpBng3flL9xLZvrWcmU5F+m8HIKXd18\nUTv1bPQ40bEaC0+tutgz+JEVtbYrzMyxVNvj6D0Ij9FzGeDhwHuPjjLsP3bsmNH46dOnk5+fT35+\nPg4ODkB1iLB582by8vLw9fWt9blau3Ztk17L5MmTiY+P58svv2TVqlWGUFen0/Gf//ynSefqKObm\n5oal7m5V06cnKSmJS5cu0adPH6MACCAoKIhdu3Zx9uxZzpw5g7+/P1VVVezZswdLS0uWLVtmFAAB\nWFhY1Aqc6nJ7AATV/b/mzJnDgQMHiIuLkxBICCHuQhICCSGEEOKeotVqCQ8PJz4+ntLSUjw8PAgN\nDWXYsGGGMfX1BEpPT2fr1q0kJSWRn5+PSqXCyckJf39/Hn/88TrXrhdCdE6mVJt0cXbHZ/KjRtvc\nfX0JCPAxBAw1Vq9eXev4gICAWuNE4+oKG/JS4rgYuwuFmTlde/phqbKjTJdPXspJrmddwG/a0npD\nipEjRwLV3+3+/v5GN6e7d+9uNDY2NpaYmBiGDBnCjBkzyMzM5Pjx4yQnJ/P3v//dqNF6cXExr776\nKqmpqfTu3ZspU
6ZQVVVFXFwc7733HhkZGSxevLjWfJKSkti6dSv9+vVjypQpFBYWtuvfH54aWwJ6\nOdRbCVeSf4XL8ZGYW1rTzaM/7sPuZ4CHQ6cPdu91rVWplZCRT7pWV+/v97x58/j000959tlnGTNm\nDObm5pw7d46LFy8yfPhwYmNjWzyHcePGER0dTUxMDM888wwjRoygsrKSI0eO4OPjw5UrV1p8jdZ2\nawWsjVtfrp09z//8z/8wbtw4/P396du3r1FIk5KSAlCr0qnGgAEDOHv2LKmpqfj7+3Pp0iWKi4vx\n8/MzhG/NodPp2L59O8ePH+fq1auUlpYa7a/pEySEEOLuIncqhBBCCHHP0Gq1vPDCC/To0YNJkyah\n0+mIjo7mrbfeYuXKlfX+QxyqA6CaHhMjRoyge/fulJSUcOXKFfbs2cPixYslBBLiDtLcG6ayJFbb\nuz2kKC3MJfOX3Viq7fGZ8hiWqt+CGN3VVFIiv6Ii5Sc8NaF1nm/kyJGo1WoiIyMJCAggNLTucVBd\n9fDmm28ycOBAw7bPP/+cbdu28eOPP/K73/3OsH3Dhg2kpqayZMkSo+3l5eWsWrWKrVu3MmbMGLy9\nvY2uERcXx7Jly5g+fXrT3phWtGicD8s3xdTZE8ux9yAcew8y/FqhgNAgn9oDRafSmpVa8em59YZA\n06dPx8LCgh07dhAZGYmlpSX9+/fnueee4+jRo60SAikUCl555RW2bdvG/v372bVrFw4ODkyePJmQ\nkBDmz5/f4mu0lri0XDZFJd8WqvbkulsQ16+eJuPrbdjZ7EChUBgeGvLx8TEs3VZfoFOzvbi42Oi/\njo6OzZ5rcXExzz//PNnZ2fj6+jJp0iS6dOmCubk5xcXF7Ny5k5s3bzb7/EIIITov+ReMEEIIIe4Z\niYmJhIaGsnDhQsO28ePHs2LFCrZv395gCBQZGUl5eTmvv/56rSbHRUVFWFlZtdm8hRCtr7k3TGVJ\nrPYRHODG6Yv56IHc5BNUVVbSc+g0owAIwLaHN117+mF+/SI3btzAxsamRdcdN26cUQAE1Te9t23b\nxoULFwzbdDodBw8exMfHxygAArC0tGTJkiWcPHmSn376qVYI5O3t3aEBEFT37wqbGcDa3Yl1BkE1\nFAp4ftYAAr3kz31n11iFV33qWibu1krJuqocg4ODjSqlDXPw9KwzZG2oIjIsLIywsLBa25VKJSEh\nIYSEhDTpfO1pb9zFej9Djt4DwXsglTdLmepnBflp/Pjjj6xYsYL169ejUqkAuHbtWp3nzs+v/n2s\nGVezTHFLKnV++OEHsrOzWbhwYa3fp6SkJHbu3NnscwshhOjcJAQSQgghxD1Do9Hw8MMPG20bPHgw\nzs7ORjf3GlJXs/EuXbq0yvyEEO2nOTdMZUmstlfXU/XFOZcAKMrOoCTvcq1jBvVUUZqnJCsri/vu\nu69F16/r+JrG9kVFRYZtFy5coKqqCqhu1n67yspKADIzM2vt8/X1bdEcW8v0wF50t1exOTqZhIza\nn4MBHg6EBvlIAHQHaajCqymk4rFxcWm5jYaoAOYW1uxOg9WLFqLX6/nxxx85c+YMvXv3BqofUKpL\nzfaacT179kStVpOWlmbUj6kpLl+u/v4cPXp0rX2nT59u8vmEEELcOeRvdiGEEELcdW5dl11lpaSn\nuvpf6F5eXpiZmdUa7+TkRFJSUoPnDAoKYufOnaxcuZIxY8YwaNAg+vbtW2eDXSHEnaEpN0xlSay2\nV99T9RVl1csmZZ89arTdTmWJm4Oa0tLqcP723hbNUVeoX9PcvSb0gepKIIDk5GSSk5PrPV9dc7K3\nt2/pNFtNoJcTgV5Otf7eHOTpJIHnHcjUCq/GSMVj4zZFJdf7HuuuptGluycKhQIAvR42RydjW1AA\ngJWVFX379sXNzY2zZ89y5MgRxowZYzj+yJEjnDlzBjc3N/r37w+AmZkZM2fO5Jt
vvuHTTz/llVde\nwcLCwnBMRUUFxcXFRn2HblfTAy0xMRFPT0/D9tTUVLZu3dqs90EIIcSdQUIgIYQQQtw16l6XHcqK\nCsjMvIbfoLqPMzc3R9/I3RJfX1/eeecdvvnmG44cOcLBgwcBcHNzIzQ0lHHjxrXKaxBCtB9ZEqvz\naOipenNLawAGLvj/DP///KwApgf2as8pGqlZmmnu3Lk8+eSTTTq25sZwZ+KpsZXQ5y7RWIVXY6Ti\nsXHpWl2DVaRpUd9gprRE5eSGVRd79Ho4/30GvbuUMaB/HwYOHIhCoeD555/nL3/5C++88w4jR46k\nZ8+eZGVl8fPPP2NjY8Pzzz9v9H2xcOFCzp8/T2xsLH/4wx8YNmwYKpWKnJwc4uLieOKJJ+pcpq/G\npEmT2L59Oxs2bCAxMRFXV1cuX77ML7/8wqhRo4iOjm7V90kIIUTnISGQEEIIIe4KDa3LDlB4o5xd\nJy4yJT6TaYPcm3WNPn368MYbb3Dz5k1SUlI4efIkERERvPfee9jZ2TFoUD0pkxCi05IlsTqHhp6q\nVzu5UZJ3maKci3R1q15KLTIxq0khUE0V6K3VPC3h6+uLQqHg7NmzrXI+IVrTrRVeu06ks+v4RUwp\nDJKKR9PEp+c2uN9lUDC6K79yI/8qhZdTMDNXYqnuytBJs/nf5x5Hqay+Fefn58eHH37If/7zH+Lj\n44mNjcXOzo7x48cTEhKCm5ub0XmVSiV//etf+f777zlw4AAHDhxAr9fj4ODAqFGj6NevX4PzcnBw\n4J133iE8PJyzZ89y8uRJevbsydNPP82gQYMkBBJCiLuYhEBCCCGEuOOZui47evhwVwKarjYtuqFr\nYWFB37596du3L66urnzwwQfExMRICCTEHUqWxOpYjT1V7+w7nLyUk2Sd+AErWwes7ZxIyMgnXavD\nU2NLRUUF58+fNyybVBc7OzsAcnJyWmXOXbt2ZcKECRw8eJCvv/6aBQsW1Fpu9MqVK5iZmRmWYBKi\nvXlqbHlmRgD39egqFY+tqKSsosH9zr5DcfYdWmv7wDG+2NjYGG1zc3PjhRdeMPna5ubmzJo1i1mz\nZjU4LjQ0lNDQ0Frb3d3d+ctf/lLnMREREbW2hYWFERYWZvL8hBBCdE4SAgkhhBDijtfQE+S3q1mX\nvak3Oc6dO0fv3r2xtLQ02l5wy/ruQog7myyJ1TEae6reuqsTvUbM4WLMTs7t+gd2Lr2xsnPk3Q9P\n4aqu4uzZs9jZ2fGPf/yj3nO4ubnh6OhIVFQU5ubmaDQaFAoFEydORKPRNGvef/zjH7l8+TKbNm3i\n4MGD9OvXD3t7e/Lz88nMzCQ5OZmXX35ZQiDR4aTisXWprJp3K625xwkhhBAtJX8DCSGEEOKO1tgT\n5HW59QlyU3377bckJCTQv39/unfvjo2NDRkZGZw4cYIuXbowbdq0pk5dCCEEjT9VD+DgPQCbbt3R\nnjuGLjsN3dVfiS9yROHnwZgxYwgKCmrweDMzM1577TXCw8M5cuQIN27cQK/X069fv2aHQCqVijVr\n1rB3715++uknjh49Snl5Ofb29ri6uvLkk08SGBjYrHML0dqk4rH1DPJsXljW3OOEEEKIlpIQSAgh\nhBB3tMaeIG/ouKbc9Jg5cyZdunThwoULnD17lsrKSpycnJg5cybz5s1r9k1EIYS415n6dLxNt+54\njJ5r+PXT0/oxb7hXrXGrV6+u83gfHx9WrVpV577g4OAGG6rXtUwSVPfoMGVpJoCAgP+fvTsPqLrK\n/z/+vOwCsolXkUUWUVEWEVcGE8XdXLMS0rLUnFbX+mVNUWNDUzmOlkrNtyYrt5lwR8XlJoEb7gi4\nISDuXFCUTfb7+4O5N6/3omjuvh//KJ9zPudz7hUEPq/PeZ+AescR4n6RFY9/nKeyMQEeTrf1EFJg\nSyd534UQQjwwEgIJIYQQ4pHWkCfILW0d6Dg
mut7zbrxhaOxmYHBwsDzRLYQQ94A8VS+EeNS88JQv\nM5ekNKgcsUIBUT187/2khBBCiHqY3LqLEEIIIcTDS+qyCyHEo037VP3tkKfqhRAPUrCXM1MGB6BQ\n3LyfQgFTnw6U/ZaEEEI8UBICCSGEEOKRJk+QCyHEo++Fp3xveTNVS56qF0I8DAYEe/DZC10JbGk8\nxA5s6cRnL3Slfwf3+zwzIYQQQp88AiuEEEKIR5rUZRdCiEef9qn6uevTblpeSZ6qF0I8TIK9nAn2\ncuaUuphDpwooq6jG2tKMDp7O8rOmEEKIh4aEQEIIIYR45ElddiGEePQNCPagmYM1S5MzOZxrGOwH\ntnQiqoevBEBCiIeOp7KxhD5CCCEeWhICCSGEEOKRJ0+QCyHE40GeqhdCCCGEEOLukhBICCGEEI8F\neYJcCCEeH/JUvRBCCCGEEHeHhEBCCCGEeGzIE+RCiLth3bp1bNy4kby8PCorK5kwYQLDhg170NMS\nQgghhBBCiNsmIZAQQgghHjvyBLkQ4k4lJSXxr3/9C29vb4YOHYq5uTlt27Z90NMSQgghhBBCiDsi\nIZAQQgghhBBC/M/evXsBiI6OxsnJ6QHPRgghhBBCCCH+GJMHPQEhhBBCCCGEeFhcvly3p5gEQEII\nIR52arWaIUOGMHfuXL3jc+fOZciQIajV6gaPlZaWxpAhQ1i6dOndnma9VCoVQ4YMQaVS3bdrCiHE\nk0hWAgkhhBBCCCGeeEuXLmXZsmW6j4cMGaL7+7p16wBITU1l5cqVnDhxgvLycpRKJaGhoYwaNQob\nGxu98WbOnEl6ejqrVq0iLi6OxMRE8vLy6NmzJ1OmTNH1S05OJiEhgezsbCoqKnB0dKRt27YMHz4c\nX19fvTGTkpJ0fSsrK2nWrBnh4eGMHDkSc3Pze/G2CCGEeEyo1WrGjx9PRESE3vchIYQQjz8JgYQQ\nQgghhBBPvICAAKDuqWS1Wk1kZKRee0JCAgsXLsTS0pKwsDAcHBxIS0sjLi6OlJQUvvzyS4MgCCAm\nJobMzExCQkLo1q0b9vb2AGg0GubNm4dKpcLOzo7u3btjb2/PpUuXOHz4MK6urnoh0Lx589i6dSvO\nzs6EhoZiY2PD8ePHWbx4MampqcyaNQtTU9N7+A4JIYR4VLz44ouMGjXqtla1tm7dmtjYWOzs7O7h\nzIQQQjwIEgIJIYQQQgghnngBAQEEBASQlpaGWq0mKipK16ZWq/n222+xsrJizpw5uLm56dpiY2PZ\nsGEDP/zwA2+++abBuPn5+SxYsMDgptqmTZtQqVT4+voya9YsvQCptraWK1eu6D5WqVRs3bqV7t27\nM2PGDCwsLHRt2hVM69evZ+jQoXflvRBCCPFoc3Jyuu2yppaWlnrf34QQQjw+JAQSQgghhBBCiJtI\nTEykurqaESNGGNwgGzt2LNu2bWPbtm1MmjTJoCzbmDFjjD5VHR8fD8Cbb75psILIxMRE7+bd2rVr\nMTU1ZfLkyXoBEMDo0aOJj48nMTFRQiDxSFKpVMydO5cpU6YQERHxoKcjxF11fQm2UaNGsWjRIjIy\nMqiqqsLb25vIyEiCg4P1zqmqqmLNmjUkJiZy4cIFTE1N8fLyYsiQIYSFhTXounPnzkWlUvH999+j\nVCr1Sp6qVCq9PXi0X3tpaWm8//77REZG6j0IAVBcXMzq1avZvXs3Fy9exMzMDKVSSadOnXj++eex\nsrIC4OTJk/z666+kpaVRUFBARUUFzs7OdO3aleeffx5bW9s/8nYKIYS4QxICCSGEEEIIIZ5Yp9TF\nHDpVQFlFNdaWZlwtrTTok5WVBUBgYKBBm62tLT4+PqSnp3P27Fm8vLz02m/c1wegvLyc3NxcHBwc\n8Pb2vun8KioqyMnJwc7OjjVr1hjtY25uzpkzZ246jhBCiAcnLy+PGTNm4OnpyYABAygsLCQ5OZno\n6Gjeeec
devToAUB1dTUfffQR6enpuLm5MXjwYCoqKtixYweff/452dnZvPjii7d9/YCAAEpLS1m7\ndi1eXl5069ZN13bj9y1jc3///fdRq9W0atWKQYMGodFoOHfuHKtXr2bgwIG6EGjTpk3s2rWLgIAA\nOnTogEaj4eTJk6xevZr9+/fzj3/8g0aNGt32/IUQQvwxEgIJIYQQQgghnjgHcwpYkpRJ2unLescz\nD51GUVTIwZwCgr2cASgtLQWot7SOo6OjXj9jbdfT9mvSpMkt51lSUoJGo+Hq1au6p7iFEEI8WtLT\n0xkxYgSvvPKK7tjgwYN55513WLBgASEhIVhbW7Nq1SrS09MJCQnhww8/1O31FhUVxbRp0/jll1/o\n3Lkzfn5+t3X9gIAAmjVrxtq1a/H29jZY6XMzs2fPRq1W8+KLL/Lss8/qtRUVFekCIIBnn32W1157\nDRMTE71+W7Zs4auvvmL9+vWMGjXqtuYuhBDij5MQSAghhBBCCPFESTh4mrnr09BojLcXXatk5pIU\npj4dSP8O7rpybYWFhXh4eBj0LywsBMDa2tqgTaFQGBzTjnfp0qVbzlXb19vbm3nz5t2yvxD3g0ql\nYs+ePWRlZVFYWIipqSmenp4MHDiQXr166fWdOXMm6enprFq1iri4OBITE8nLy6Nnz57k5eWRnp4O\n1JWvmjt3ru48bRkrIR4HNjY2REZG6h3z9fUlPDwclUrFrl27iIiIYMuWLSgUCiZMmKALgADs7e0Z\nPXo0X331FZs3b77tEOhOnTx5kmPHjuHt7W00vLmx3Gl9X7N9+vThu+++4+DBgxICCSHEAyAhkBBC\nCCGEEOKJcTCn4KYBkJZGA/+MP4zSvhHe3t7s3LmTtLQ0goKC9PqVlpaSnZ2NhYUF7u7uDZqDlZUV\nLVu2JDc3l+zs7JuWhLOyssLDw4PTp09TXFxM48aNG3QNIe6lhQsX4uHhgb+/P46OjhQXF7Nv3z7m\nzJnDuXPnGDNmjME5MTExZGZmEhISQrdu3bC3tycgIAAbGxtSUlLo2rWr3tfCjXtlCfEouLHEqJtN\n3TcbHx8fo2XQAgICUKlUZGdnExoayoULF2jSpInB/nPwe0nS7Ozse/sirnP8+HEAOnbsaPShhhtV\nV1eTkJBAUlISZ86cobS0FM1133Ab8vCDEEKIu09CICGEEEIIIcQTY0lS5i0DIC2NBpYmZ/LOgF4s\nX76c+Ph4IiIicHFx0fVZvHgxZWVl9OvXD3Nz8wbPY8iQIcyfP5/58+cza9YsvRveGo2GwsJCXfm5\n4cOH89VXXzFv3jymTp1qcHO8pKSEvLw8fHx8Gnx9If6I+fPn630dQN3N3+joaOLi4hg4cKBBucP8\n/HwWLFhgsHIAICUlhe7duxMREXFP5y3EvVJfidGKkiucOVOIj7/x7w8ODg5A3QMFDS09WlJScrem\nfUu3mtONvvjiC3bt2kXz5s3p2rUrjo6Ouu+Na9eupaqq6p7NVQghRP0kBBJCCCGEEEI8EU6piw1u\n0N3K4dzLlOHPxIkTiY2NZfLkyYSFhWFvb096ejrHjh3Dzc2NcePG3da4/fr1IyMjg23btjFp0iS6\ndu2Kvb09ly9fJjU1lb59++r2bOjbty8nT55kw4YNTJw4keDgYJRKJcXFxbpyWn369OGNN964rTkI\ncaduDIAAzMzMGDx4MIcPHyY1NZXevXvrtY8ZM8ZoACTEo64hJUbX7TzKwENn6N9Bf8XolStXgLqV\nb9eXHjVGe/x+rpLTXuvy5Vt/78zMzGTXrl106NCBjz/+WK+cnUajYcWKFfdsnkIIIW7ugYdACoXC\nFxgJ9Ad8gWZAIbAbmKvRaLbd5NyXgDeAdkANcBCYrdFo4u/1vIUQQgjxO7Vazfjx44mIiGDKlCm3\n7K9SqZg7dy5Tpkx5JJ/6Xbp0KcuWLSMmJoaAgAC9tqSkJFasWMH58+cpL
y9n6NChTJw48QHNVAhx\nvUOnCu74vOGDBuHi4sLKlSvZuXMnFRUVNG3alJEjR/Lcc8/d9k05hULBtGnT6NixI5s2bWL79u1U\nVVXh6OhI+/bt6dq1q17/1157jU6dOrFx40ZSU1MpLS3F1tZWN4cb92ER4m66scSVuy3s+S2B1NRU\n8vPzqays1OtvrOSTr6/v/ZquEPdNQ0uMll2+wOxVe1HaNyLYy1l3PC0tDajb961Ro0a4uLhw8eJF\nzp8/T4sWLfTGOHz4MMAdr/o0MTEBoLa2tsHntGnTBoADBw7w4osv3rQk3IULFwDo0qWLXgAEcOLE\nCYP/J4QQQtw/DzwEAmYBzwNHgA3AZaANMBQYqlAoJms0mq9uPEmhUMwGpgNngf8DLIDRwDqFQvGW\nRqOZf5/mL4QQQggBwLFjx5g9ezbNmzdn0KBBWFpa6n55FkI8eGUV1bfs49t3XL3nBQcHExwc3KBr\nffbZZw3qFx4eTnh4eIP6du7cmc6dOzeorxB3g7ESVxXFhRxP+A5r0xp6dutI//79sba2xsTEBLVa\njUqlMlrySVvKSojHSUNLjFZXlnPh8G8sTXbRhUCZmZkkJiZiY2ND9+7dAejTpw8///wz//73v3n/\n/fd1wU1RURHLly8H6laH3glbW1sUCgX5+fkNPqdVq1b4+flx9OhR4uLiePbZZ/Xai4uLsbS0xMLC\ngmbNmgGQnp7OkCFDdH2uXr1KbGzsHc1ZCCHE3fEwhEAJwOcajebg9QcVCkVPYAvwpUKh+EWj0Vy4\nri2UugAoC+is0WgK/3f8S2A/MFuhUMRrNJpT9+k1CCGEEOI2dOvWjdjY2MfuhtDevXvRaDRMnToV\nPz+/Bz0dIcQNrC3v7NefOz1PiEdZfSWu1Md2UV1RhmP3YZx37UDLLoG6EldJSUmoVCqj4zVkU3kh\nHiW3U2K0cbOWXDp5kLj/O0/zqxGY1pSTnJxMbW0tb7zxBtbW1gCMHDmS/fv3k5KSwltvvUWnTp2o\nqKhg+/btXL16lWeeeYZ27drd0XytrKxo3bo1GRkZzJ49G1dXV0xMTOjatSuenp71njd9+nRmzpzJ\nTz/9xM6dOwkICECj0XD+/HkOHjzIN998g1KpxNfXFz8/P3bu3Mk777xDu3btuHLlCvv378fV1bXB\n+woJIYS4+x74bzMajWZRPcd/UygUiUBfIBS4vnjon//359+0AdD/zjmlUCgWAB8CLwPR92LOQggh\nhPhjrq97/jjR1ku/cTNsIcTDoYOn86073cXzhHhU3azEVUVx3a/gDh5+aDTwz/jDuhJX2tJWt+NO\nSlQJ8TC4nRKjFjaOuHcZzPmDKlavXY/SzgIfHx9Gjx5Nx44ddf3MzMyYNWsWq1ev5rfffiM+Ph4T\nExO8vLx49dVXeeqpp/7QnKdPn87//d//ceDAAZKSktBoNDg7O980BGrWrBnz5s1jxYoV7N69m/j4\neCwsLFAqlYwYMQJ7e3ug7mv5ww8/ZPHixezbt49169bRpEkT+vXrx/PPP8/rr7/+h+YuhBDizj3w\nEOgWtGvIb6zboN1hMsHIORupC4F6IyGQEEIIcd+p1WoWLVrEoUOHKC8vp2XLlkRFRemVMKpvT6Dx\n48cDsGDBAhYvXsyOHTsoKirC1dWVqKgounXrRk1NDStWrGDr1q0UFBTQpEkThg0bxtNPP603D41G\nw6+//kpCQgLnz5/n2rVr2Nvb4+7uTt++fenRo4de/4KCAuLi4ti3bx+XLl2iUaNG+Pn5MXr06Fvu\nY6B9PTe+DoDvv/8epVJ5+2+kEOKu81Q2JsDDqcFP1jo8dwAAIABJREFUbgMEtnTCU9n4Hs5KiIfP\nzUpcWdjU3fAtyTuFvVsbNBpYmpyJpvA0mzdvvu1rNW5c9/WlVqvveL5CPAgNKTF6PSv7pniHj+al\n8NZE9aj/Z0sLCwuee+45nnvuuVuOq
VQqWbduncHxKVOmGN2n08XFhY8++sjoWAEBAUbHgrqv03Hj\nxjFu3Libzqdx48a89tprRtu+//57g2MRERGP5P6gQgjxqHloQyCFQtESiADKgKTrjtsArkDJ9SXi\nrpP5vz9bN/A6++tpatvw2QohhBAC6m7gTJs2jebNm9O7d2+Ki4tJTk5m1qxZfPrppwQGBt5yjOrq\nav7yl79QUlJC165dqa6u5rfffiMmJoZZs2axYcMGjh8/TkhICObm5mzfvp1vv/0We3t7vWDn559/\n5pdffqFZs2aEhYVhY2PD5cuXyczMZPv27Xp9s7Ky+PDDDykpKaFjx46EhoZSVFTE7t27effdd/ng\ngw/o1KlTvXP28vIiMjKS3bt3k5OTw9ChQ3UrnR7HFU9CPMpeeMqXmUtSGrSHg0LBTW/UCfE4ulWJ\nq6atO3M5+xA5yXE4ePhh3qgxJ39Vc9DiCv0iwklOTr6t67Vt2xZLS0vWrl1LcXGxrlTs008/Ld9D\nxUNNSowKIYR4VDyU33kUCoUlsASwBN69vuQbYP+/P6/Wc7r2uMM9mp4QQggh6pGWlkZUVBSRkZG6\nYz179iQ6OpqVK1c2KAS6fPkyPj4+fPbZZ5ibmwPQq1cv3nvvPf7+97/j4uLCggULdDeGhg8fzmuv\nvUZcXJxesJOQkECTJk1YsGABlpaWetcoKirS/b2mpobPP/+c8vJyYmJi8Pf315vL1KlT+fzzzykr\nK6Nv376MGjWKVatWceDAAaZNm0ZwcDCRkZFERUWhVqvJycnB0dGRH3/8kSlTpnDs2DHi4uLIzs6m\nrKxM7wnL1NRUVq5cyYkTJygvL0epVBIaGsqoUaOM3vjKzMzkp59+4tixYygUClq3bs2YMWM4cOAA\ny5YtIyYmhoCAAF3/IUOG4O/vz7vvvsvPP//M/v37KSwsZPLkyURERHDu3Dm2bt3KoUOHUKvVlJWV\n4ejoSMeOHRk9ejTOzvolsNLS0nj//feJjIykc+fOLF68WDeXoKAgJk6ciLOzMxcvXuSnn34iNTWV\n8vJy2rRpw8SJE/Hy8rrlv78Q91qwlzNTBgfUW+pKS6GAqU8H6jbwFuJJcasSV40cm9Gqz0tcSN1G\n0blMNJpaGjk0o++YCQzs4nvbIZCtrS0zZ85k2bJlqFQqysvLgbrv/RICiYeZlBgVQgjxqLgrIZBC\noTgFtLyNU5ZoNJox9YxlCvwM/An4DzD7D0/wJjQaTUg989gPdDTWJoQQQjzpTqmLOXSqgLKKaqwt\nzXCzqbuTqlQqef755/X6duzYkaZNm3LixIkGjz9x4kRdAATQvn17mjVrRl5eHuPGjdO7KdS8eXP8\n/Pw4cuQItbW1ur0FAExNTfU+1rKzs9P9fd++fVy4cIERI0boBUAATk5OPPPMM8yfP59r166Rl5fH\njBkzqKiooGnTpgQHB5OVlUV0dDTvvPOOwXV27NjB/v37CQkJYeDAgXqlbhISEli4cCGWlpaEhYXh\n4OBAWloacXFxpKSk8OWXX+q9zvT0dD766CNqa2vp3r07Li4unDp1ivfff/+m4VpJSQkzZszAysqK\n0NBQFAoFDg51z8rs2rWLjRs3EhAQgJ+fH2ZmZpw+XVfOZ8+ePfzzn/80ur9RZmYmK1aswN/fn/79\n+3Pq1Cl27txJbm4uf/nLX3j33Xdxc3Ojd+/eqNVqdu3axYcffsh3332HlZVVvXN9FCxdutRo4CYe\nLQOCPWjmYM3S5EwO5xqueAhs6URUD18JgMQTqSElrmybuuPb50W9Y+6tWxMQ4GtQTuqzzz675Xgh\nISGEhBj91VyIh5aUGBVCCPGouFsrgbKA8tvof97Ywf8FQIuBZ4H/AmM0GoPn87QrfewxTnv8ym3M\nRwghhBANcDCngCVJmQa/7FaUXOHMmUI8WvsbDV2cnZ05duxYg65hY2ODi4uLwXEnJyfy8vLw8fEx\na
GvSpAk1NTUUFhbqQovw8HDWrVvH66+/TlhYGP7+/rRt29bgqWLtvPLz81m6dKnB2OfP1/3YUl5e\nTnp6OiNGjMDKyoply5YxduxYrKyseOedd1iwYIHexr5QFzBFR0cb3NhSq9V8++23WFlZMWfOHNzc\n3HRtsbGxbNiwgR9++IE333wTqNvf6KuvvqKqqoqPP/5Yb7yNGzeycOHCet/PU6dO0atXLyZPnoyp\nqaleW69evRg2bJhe4AZw8OBBoqOj+c9//mN0E999+/Yxffp0wsPDdce++uortmzZwjvvvMOIESP0\n6tgvX76cJUuWsHnzZoYOHVrvXJ8kc+fORaVSyZ5RD1CwlzPBXs4GoXYHT2e5QSeeaFLiSoiGu1WJ\nUUtbBzqOqduuWkqMCiGEeFDuyk9pGo3mD+/iplAozKkrAfcssBR4UaPR1Bi5VqlCoTgHuCoUChcj\n+wJpv6M2/HFjIYQQQhi1bt06Nm7cSF5eHmfzr1Ll3p2mbbsZ7Vt0rRLV0UtsOnSG/h3c9dpMTU0x\nfK7DuPpKv2gDDGPt2raamt9/dJgwYQLNmjVj69atxMXFERcXh6mpKZ06dWL8+PG6oElbGm779u26\nc8sqqim6VklNrQZTEwVWphpqamqwsbEhMjKSVatW6fr6+voSHh6OSqXi1KlTevPq2rWr0SebExMT\nqa6uZsSIEXoBEMDYsWPZtm0b27ZtY9KkSZibm3P06FEuXLhAYGCgwXgDBgxgzZo1nDt3zuj7ZmZm\nxvjx4w0CIMDoKh+A4OBgWrZsyYEDB4y2t2vXTi8AAujduzdbtmzB2tqaUaNGGbQtWbKE7Oxso+M9\nSp5++mmeeuopmjZt+qCnIu4ST2VjCX2EuI6UuBKi4aTEqBBCiEfBQ/GojkKhsKBu5c8w4CfgZY1G\nU3uTU34FxgIDgB9uaBt4XR8hhBBC3KGkpCT+9a9/4e3tTVD3XmSm5GLn7HbzkzTwz/jDKO0bPfBf\nck1MTBg2bBjDhg3j6tWrZGRkkJyczPbt2zl9+jQLFizA3NxcFyr95S9/wULpo1vp5HTdWBUlV7i6\nKRabJi40atTI4FoBAQGoVCouXbqkd7x169ZG55aVlQVgtIybra0tPj4+pKenc/bsWby8vHT927Vr\nZ9BfoVDQtm3bekOgZs2aYW9vfAG1RqMhMTERlUpFTk4OJSUl1Nb+/iOYmZnxHxV9fQ2fYtUGSt7e\n3garwbRtN74/jyI7Ozu9coJCCPG4kRJXQtweKTEqhBDiYffAQyCFQmEJrAQGAd8Dr94iAAL4hroQ\n6AOFQrFao9EU/m8sT+ANoALDcEgIIYQQt2Hv3r0AREdHE7PuOC6BXg06T6OBpcmZD9Uvuvb29oSG\nhhIaGkpRURGHDx8mNzeXVq1a0aZNGwCWb0jiqHlBvU9xFl2rZHvWVTYdOmPQpt1jp7KyUu+4o6Oj\n0bFKS0uBuhJ3xmjP0/YrKyvTu059/W+37fvvv2fNmjU4OTnRsWNHmjRpgoWFBQAqlUpvD6PrWVtb\nGxxryEqt6upb7zNxKykpKaxdu5YzZ85QXFyMnZ0dLVq0oEePHgwaNEjX7/z58yxfvpzU1FSKioqw\ns7MjKCiI0aNH06JFC4Nxa2tr2bRpE9u2bSM3N5fq6mqaNGmCv78/o0aN0p1zsz2Bzp49S1xcHKmp\nqVy5cgUbGxuCgoKIiorC1dVV12/IkCG6v48fP173d6VSyffff8+MGTM4ceIE3333ndFScatWreLf\n//43r7zyCiNGjLjzN/MJVN+/35AhQ/D399fbu0T2fxJPsluVuLqelLgSQkqMCiGEeLg98BCIukBn\nEFAAnAM+UigUN/ZJ1Gg0idoPNBrNToVCMQeYBhxWKBRxgAXwPOAEvKXRaE7d+6kLIYQQj6/Ll+ue\nZCyqNr+tp4EBDude5pS6+IH90ltVVcXJkyfx8/PTO15dXU1JSQk
AlpaWQF3JNnNbR5bGrcarhzn2\nroY3ssounQONhqprpfwz/jBP2RbrtV+5UrcVoYWFhV7QYeRnGuD3oKSwsBAPDw+D9sLCQuD3sEX7\np/Y69fW/HVevXmXt2rW0bNmSL7/80mCFU1JS0m2Pea8lJCSwYMECHB0d6dKlC3Z2dly5coVTp06x\ndetWXQiUmZnJX/7yF65du0aXLl3w8PDg7NmzJCYmkpKSwqeffqq3mqm6uppPPvmEQ4cO4ezsTM+e\nPbG2tiYvL4/du3fTvn17o8HR9fbv309MTAw1NTV06dIFFxcXCgoK2LVrF/v27SMmJka3n1VkZCS7\nd+8mJyeHoUOH6j4ftH8OGjSI48ePs2nTJsaOHWtwrU2bNmFubk5ExB+uyCzugLHASIjHjZS4EuLO\nSIlRIYQQD6OHIQTSPlbsDHx0k36J13+g0WimKxSKNOpW/rwK1AIHgC81Gk38PZinEEII8UTQPv2u\n9ewzwzmlrgs9Oo6J5sqZY1w5fYSyS+epLKvbT8e8UWMqSq7o7ftz6FQBnsrGzJ07l19++QVvb2/i\n4+PZsGEDaWlp5ObmkpiYSO/evVEoFGzfvp2UlBRKS0sZM2YMYWFhvPLKK7qVKddLTU1l5cqVnDhx\ngvLycvLy8igrK9OtnKmsrOTdd9/FxcWFgwcP0qhRI8aMGcOhQ4c4c+YMXbt2xd3dXfdaLVr3xvTo\nWbK2LcW2qTsmZhYU5+VQfrWAqmvF1FZVYWJuTtW1EqorK9h5/CLXxztpaWlAXdkz7aqdm/H29mbn\nzp2kpaURFBSk11ZaWkp2djYWFha4u7vr+gMcOXLEYCyNRsOxY8duec0bXbx4EY1GQ3BwsEEAVFBQ\nwMWLF297zHstISEBMzMzvv76a4MSd9q9nTQaDXPmzKGsrIzp06fr7V2UnJzMF198wT/+8Q9iY2N1\nId3SpUs5dOgQXbp04b333sPc3Fx3TlVV1S3/TUtKSvjyyy+xtLTk888/1/27AeTm5jJjxgy++uor\n5s2bB0BUVBRqtZqcnByGDRtmsNonLCyM7777ji1bthAVFaW3n1NaWhrnzp2jZ8+eUpbuDtzOnk6y\n/5N40kmJKyGEEEKIx8MDD4E0Gk34Hzh3EbDobs1FCCGEEOjKHmnLgXXpNZiKoxd07ecPbgWFCdZN\nXLF396OmqpyrZ45TflVN4ak0PP80HICyCv3SX2fOnGHp0qV06dIFKysrcnNz2bp1K35+fjRu3JhF\nixZhbW2No6Mjjo6OrF+/ntraWl5//XW9cRISEli4cCGWlpaEhYXh4ODAokWLyMrK4q9//Svz58/H\n0tKScePGkZaWRmJiIvn5+fz222+4uLjw+uuv07dvX914ZRXVXKqwoO3gP6M+upuCE3spzM1AYWKK\nlV0T7Fr4Ym5lw+Wcw5Rfzedi2m+YmFnQ9H+vLzMzk8TERGxsbPD09OTMGcNycTfq1asXy5cvJz4+\nnoiICFxcXHRtixcvpqysjH79+unCiHbt2uHi4sLhw4fZv38/ISEheu9HffsB3Yw2eDhy5Ai1tbW6\nfXzKy8uZP38+NTU1tz3mvXB9WZWsi1eprtbohSJa2kDk2LFjnD17lrZt2+oFQAA9evQgPj6eI0eO\nkJGRgb+/P7W1tWzYsAELCwveeOMNvQAIwNzcvN49lbR+/fVXSktL+fOf/6wXAAG0bNmS/v37s2bN\nGs6cOWPQboyFhQV9+vRh1apVpKSkEBoaqmtLSEgAYMCAAbccRxi6nT2dZP8nIaTElRBCCCHE4+CB\nh0BCCCGEeLgEBAQQEBBAWloaarWavk+P4KT57ytQfHpFYdlYfy8bTedBnN61hkvZqZQWnMXG2Q1r\ny99/zPDz80OpVPLFF1/QpEkTAGbNmsXEiRNZuXIllpaWzJ07V3eDvKqqismTJ7NlyxZeeOEF7O3t\n+eyzz1Cr1UyaNAkrKyvmzJm
Dm5sbAC+99BKxsbFs2LCBH374gTfffJNnnnmGZ555RhfKfP/990Zf\nb9G1SswBcysbXIMjqCi+RE1VBW0HT8LasTkAFSVXyFg9D5smrlw6eRArh2aE945ApVKRnJxMbW0t\nb7zxBj169ODdd99FpVLd9D1WKpVMnDiR2NhYJk+eTFhYGPb29qSnp3Ps2DHc3NwYN26crr9CoeCt\nt94iOjqaWbNmERoaiouLCzk5ORw6dIiQkBD2799fb/k5YxwdHXnqqadISkri7bffJjg4mNLSUg4d\nOoSFhQXe3t5kZ2c3eLy77WBOAUuSMvVKEapNXDl7IoOu/Z/lmSH9GBTeHT8/P72Q5uTJkwAEBgYa\nHTcwMJAjR46QnZ2Nv78/Z8+epbS0lDZt2tS7R9OtaFdi5eTksHTpUoN2bUjX0BAI6krCrV69mo0b\nN+pCoKKiInbt2oW7uzv+/v53NNdHkVqtZvz48URERPD888+zaNEi0tLSqKqqom3btkyYMIGWLVty\n9epVfv75Z/bs2UNJSQmenp6MGzdO73Phdvb5ubGvSqVi7ty5AKSnp+vt7RQZGUlUVBRQF6Dv2bOH\nrKwsCgsLMTU1xdPTk4EDB9KrVy+D68ycOZP09HRWrVpFXFwciYmJ5OXl0bNnT9q2bcuCBQuIiooi\nMjLS4NzCwkJefvll3NzcmD9//h29v0I0hJS4EkIIIYR4dEkIJIQQQggAg6d8r5ZWAtDBU7/My40B\nENSFFE3bduVSdipFF7KwcXYzOG/06NG6AAjq9j/p2rUrW7duZcSIEXo3x83NzenRowdLly7lzJkz\nupv8iYmJVFdXM2LECF0ApDV27Fi2bdvGtm3bmDRpksGKjvrU1Gow1tPE1PDHJCv7prQMHcb5gyr2\nbt/GOQcrfHx8GD16NB07dmzQ9bQGDRqEi4sLK1euZOfOnVRUVNC0aVNGjhzJc889p9sfRisgIIDP\nPvuMxYsXs3fvXgDatGlDTEwMiYmJwO97BzXU22+/TfPmzUlOTmb9+vXY29vTpUsXxowZQ0xMzG2N\ndTclHDxtdB8KpV93TC2tKTixj28WLWfzxvUo7a3x9/fn5ZdfxtfXV1e6rb5AR3tcWzpQ++f1n5u3\nq7i4rlzipk2bbtrv2rVrDR6zefPmdOzYkQMHDnDhwgVcXFxQqVRUVVU9sauA8vLymD59Ou7u7kRE\nRKBWq9m1axczZ85k9uzZREdHY21tTY8ePSguLiY5OZmPP/6Yb7/99q6UdPPy8iIyMpJly5ahVCr1\n9mS6PlBauHAhHh4e+Pv74+joSHFxMfv27WPOnDmcO3eOMWPGGB0/JiaGzMxMQkJC6NatG/b29oSH\nh/PDDz+wefNmnn/+ed2KPa0tW7ZQU1PzxH5OCCGEEEIIIW5NQiAhhBDiCWdsxQVA5qHTKIoKKSyt\nIMDDSddeXVFG3pFdFJ3PpLKkkJqqSr3zqsqKCWzpZPDEcKtWrQyurb0hb6xNe1O+oKBAdywrKwsw\nvsrD1tYWHx8f0tPTOXv2LF5eXgZ9jDE10V894+QZwJXTRzmR8D0OLdvTuJkn5ta/l4Sysm+Kd/ho\nXuvfjuFdjF8jIiJC7wZxfYKDgwkODm7QPKEu9Jk1a5bB8X//+9+YmJjQokULvePr1q276XiWlpaM\nHTuWsWPHGrQZ2/Q+ICCg3jGVSuVNr3eruWgdzCm46UbkTbyDaOIdRHVlOWUFZ/Brdo30A7uIjo4m\nNjZWF4QVFhYaPf/y5brPY20/bdh26dKlBs3PGO1YX3/9NZ6ennc8zo0GDhzI/v372bx5My+99BKb\nNm3CwsKC3r1737VrPErS09MZO3Yszz33nO7Y8uXLWbJkCdOnTycsLIzXX39dtyIuODiYOXPmsGbN\nGiZMmPCHr+/t7Y23t7cuBNKu/LnR/Pnz9Uo8AlRXVxMdHU1cXBwDBw40Gjrm5+ezYMECgxJ0v
Xr1\nYv369ezfv5/OnTvrjms0GjZv3oylpaXRFUZCCCGEEEIIAWBy6y5CCCGEeFwlHDzNzCUpBgGQVtG1\nSmYuSaF1C3sUCqiuLOf4xu/Iy9iOiakZTl5BNPfvgUtgT5RtuwKgqa0hqoevwVg3rmwBdHu7GFvB\nom27fm8a7aqN+lZ5ODo66vVrCLtGFnofO3j44dMrkkZOLlzOPkTO9hUc2/AvSvJPU170eyB140qn\ne62iosLo61KpVBw9epTg4GCsrKzu65zuhSVJmfUGQNczs7DCroUvtd7h9OnTh+LiYjIyMvDx8QEg\nLS3N6Hna49p+bm5u2NjYkJOTowuIblfbtm0ByMjIaPA52hUdN9t7qUuXLjRt2pQtW7Zw8OBBzp07\nR1hYGLa2tnc0z0edUqlk1KhRese0YWtVVRWvvPKKXknEnj17Ympqet/LGt4YAAGYmZkxePBgampq\nSE1NNXremDFjjO5BNGjQIAA2btyod/zgwYPk5eXRo0cPo/+/CiGEEEIIIQTISiAhhBDiiXWrFRda\nGg2sTMlhZFcvYn9YQkVJIS6BPXEJDNfrV5p/hvzjKYS3dyHY694EJNobnYWFhXh4eBi0a1d/XB8q\nKRQKqqurjY5XWlqKtaUZbs3tuXjd+2Dv2hp719bUVFVSdukcl7IPcWr7Si5lHaD86lC6BLa573sj\n5OfnM3nyZDp06ICLiwu1tbVkZWVx5MgRbGxsGD9+/H2dz71wSl1cbyAJUHwxB9tmnno3+g/nXqam\nJA+oW9nk5+eHq6srR44cYceOHfzpT3/S9d2xYwcZGRm4urrSvn17oC6MGTx4MP/9739ZsGAB7733\nnl4pwerqakpLS/X2HbpRnz59+M9//sOyZcvw9fWldevWeu0ajYb09HS9kmGNG9d9/uTn5xsNDaDu\nc3fAgAH8/PPPzJs3D6hbHfQkuL48ZWXpFcoqqvH29jYoh6YNhF1dXWnUqJFem4mJCQ4ODnqrCe+H\n/Px84uLiSE1NJT8/n8pK/dWS9a068/U1DM8BXWm5/fv3U1BQgLNz3f+v2vKDT8rnhBBCCCGEEOLO\nSAgkhBBCPKEauuIC6oKgzAtX6d/Gjl+OWuDg7mfQx0lzibaujrR1dbzLM/2dt7c3O3fuJC0tjaCg\nIL220tJSsrOzsbCw0NtfyNbWllOnTlFdXY2Zmf6PPpmZmQAMDvHg3/uLDd4PU3MLGjf3wsLWkQup\nidRWV1J0PpOoN56+Ny/wJhwcHOjZsyfp6ekcPnyY6upqHBwc6NOnD88991y9QcKj5NCpm9+sz0n6\nLyZmFlg7u2Jp64BGA6XqXC6bFhPWKZCgoCAUCgVTp07lww8/5PPPP6dbt264ublx7tw5du3aRaNG\njZg6dapekBQZGcnx48fZs2cPkyZNonPnzlhbW5Ofn8/Bgwd55ZVXblrer3HjxsycOZO//e1vzJgx\ng6CgIDw8PFAoFOTn53Ps2DGKi4tZuXKl7pygoCBWrlzJ/PnzCQ0NpVGjRtjY2PD00/qfW/369WPZ\nsmVcunQJT09P3aqjx5Wx8pQVJVfIyL1EbUY+g3IK9ELmm60m1LbfbLXV3Xbx4kWmTZtGSUkJ7du3\np2PHjlhbW2NiYoJardbt62SMdiWjMYMGDSI9PZ1NmzbxwgsvUFhYSEpKCt7e3gahoxBCCCGEEEJc\nT0IgIYQQ4gl0qxUXxhzOvcwwtxa0c3NkWLAtzm3aUVZRjbWlGU4Us+AfP6Gxtrj1QH9Ar169WL58\nOfHx8UREROgFH4sXL6asrIx+/frpreRo3bo1WVlZbN26VW/zdG0ZNQA/N0emNPdk7vo0ii6ewrap\nOwoTU11fS1sH3Dr1p+DEXkZ2b33PVjrdjK2tLW+//fZ9v+79VFZhfMWWlkuHCIovZHHt8kWKzp/E\nxNQMCxt7/tRvBDEzxutCvjZt2vDPf/6T//znPxw6dIg9e
/ZgZ2dHz549GT16NK6urnrjmpmZ8ckn\nn7Bx40Z+/fVXfv31VzQaDU5OTnTv3p127drdcu5BQUHMnz+flStXcuDAATIyMjAzM8PJyYmgoCBC\nQ0P1+nfs2JHx48ezadMm1qxZQ3V1NUql0iAEcnBwoFOnTuzevVvv8/dxlHDw9E1XJ14oLGPmkhSm\nPh1I/w7uxjs9YKtXr6a4uJgpU6YYBIdJSUmoVKp6z70+mLxR9+7dcXBwYMuWLURGRrJlyxZqamoe\n+88JIYQQQgghxB8nIZAQQgjxBLrViov6NPbwp3Hjjaz9ZTHdup2kRYsWnDp/nr1799K9e3eSk5Pv\n8kz1KZVKJk6cSGxsLJMnTyYsLAx7e3vS09M5duwYbm5ujBs3Tu+cIUOGsHXrVhYuXEhqaipNmzYl\nOzubY8eO0blzZ/bu3QvAgGAPmjlYM37SInKS87F1dsfC1gGFiSllly+gKDpHaKAvU19+5p6+xieZ\ntaX+j6YVJVfIWD2PJt4daBk6jKatO9G0dSeD88L7tzMoBebq6sq0adMafG1TU1OefvppgxDmRlFR\nUURFRRltUyqV/PnPf27wNYcPH87w4cNv2kej0ZCTk4OlpSW9evVq8NiPmtspT/nP+MMo7Rs9kDAW\n6sKa2tpao20XLlwAMAj9oP59qhrCzMyMfv368d///pc9e/awefNmrKysCA8Pv+MxhRBCCCGEEE8G\nk1t3EUIIIcTj5lYrLupjamXL559/TufOnTly5Ajx8fGo1Wpee+01g/DlXhk0aBB//etfadOmDTt3\n7mT16tVcvXqVkSNHMnv2bN1eK1ru7u58+umntGvXjj179pCQkIC5uTmzZ8+mVatWen2DvZz56qPJ\njBveB6/G1dgWncS+OJM/+djx8bRJ/Ph/C7G1tb0vr/NJ1MHzzm7q3+l5j4IdO3aQl5dH79696y15\n9ji43fKUS5Mz79lc1q1bx+uvv87cuXPZs2cP27Zt02u3s7Ord58hpVIJGAY+Bw4cYPPmzX9oXgMG\nDMDExIRvvvmGvLw8wsPDDcJPIYQQQgghhLgSLEV1AAAgAElEQVSRrAQSQgghnkA3rrgwxrfvOKPn\nubu78+GHHxo9Z926dQbHpkyZwpQpU4z2v9mqioiIiHr3YQkODiY4OLiemRtq164df//73w2Oe3p6\nGlw/LCyMsLCwBo8t7h5PZWMCPJxuq1RhYEsnPJWNb93xERMXF0dxcTGbNm3CysqKZ5999kFP6Z65\n0/KUp9TFd/3fPikpiX/96194e3vTsWNHqqur8fT01OsTFBREUlISf/3rX/Hx8cHMzIz27dvj7+/P\n4MGD2bp1K3//+9/505/+hJOTE7m5uRw4cICwsLA/tFqyadOmdO7cmZSUFAApBSeEEEIIIYRoEAmB\nhBBCiCeQrLgQD6sXnvJl5pKUBq0KUSggqofvvZ/UA/Djjz9iZlYXur7yyis0bdr0QU/pnrnT8pSH\nThXc9RBIWx4yOjqahIQEzp07h5eXl16fV199FYDU1FT27duHRqMhMjISf39/PD09iYmJYfHixezd\nu5eamhq8vLx4//33sbGx+cMlM/v27UtKSgq+vr74+Pj8obGEEEIIIYQQTwaFpqF1F54wCoVif8eO\nHTvu37//QU9FCCGEuCdm/LjrtldcfPli93s4I/EglZeXExkZia+vL1988YXueGVlJaNHj6aqqopp\n06bp7UuzYcMGYmNjefvtt+nbty8A58+fZ/ny5aSmplJUVISdnR1BQUGMHj2aFi1a6F1z6dKlLFu2\njJiYGC5fvszatWs5ffo0RVUmKDq9SHmx/p5AWhqNhnP7N9H4ylGG9u/NjBkzsLCw4Nq1a6xZs4bk\n5GTy8/PRaDQ4ODjQqlUrnnnmGYPyf+LhsDQ5kx8TT9z2eS+Ft77rIeAHH3zA4cOHja5qfBhov2au\n/5oTQgghhBBCPLxCQ
kI4cODAAY1GE/Kg5iArgYQQQognlKy4ENezsrLC19eXEydOcO3aNd1eI0eO\nHKGqqgqoW/lwfQiUmpoK1JXHAsjMzGTGjBmkpKTQsWNHRo8ezdmzZ0lMTCQlJYVPP/0UX1/Dz6NV\nq1Zx6NAhunTpQmBgIKWlpXQf2JX/W59Cxg19a6urqDiyiSYl2URFjmLSpEkoFAo0Gg3R0dEcPXqU\ntm3b0q9fP0xNTSkoKCAtLY327dtLCPSQakh5yrt5njHacEVryJAhur9rA6HU1FRWrlzJiRMnKC8v\nR6lUEhoayqhRo7CxsdEbb+bMmaSnp7Nq1Sri4uJITEwkLy+Pnj176pXHTE5OJiEhgezsbCoqKnB0\ndKRt27YMHz7c4Gtly5YtxMTEUFxczIIFC1i5ciXh4eGMHDkSc3Nzvb4ZGRmsWLGC7Oxsrl69iq2t\nLc2aNSMkJITIyMi79r4JIYQQQgghHn4SAgkhhBBPqGAvZ6YMDmDu+rSbBkEKBUx9OpBgLykF97gL\nCgri6NGjpKen07lzZ6DuxreJiQn+/v660AfqVuOkpaXRvHlzlEolGo2GOXPmcO3aNby8vOjXrx8v\nvvgiUHej+4svvuAf//gHsbGxKBQKvesePnyY2bNn4+3trXf84+c6c3xtEzzbNiMivDWK6nISV3zP\nhYpzvPjaREaNGqXrm5uby9GjR+nWrRsffPCB3jgajYbS0tK7+l6Ju+dhKE8ZEBAAgEqlQq1WGwQl\nCQkJLFy4EEtLS8LCwnBwcCAtLY24uDhSUlL48ssvDYIggJiYGDIzMwkJCaFbt27Y29sDdZ+T8+bN\nQ6VSYWdnR/fu3bG3t+fSpUscPnwYV1dXXQi0d+9e5s+fT3JyMlVVVfTv35/u3btz/PhxFi9eTGpq\nKrNmzcLU1BSA/fv388knn2BtbU3Xrl1p0qQJxcXFnD17lvXr10sIJIQQQgghxBNGQiAhhBDiCTYg\n2INmDtYsTc7kcK5habjAlk5E9fCVAOgxdUpdzKFTBZRVVGNtaYazW91KmdTUVL0QqFWrVoSGhvLN\nN99w7tw5XF1dyc7Opri4mNDQUACOHTvG2bNn8fX15fTp03rX6dGjB/Hx8Rw5coSMjAz8/f312gcM\nGGAQAGlZW5oR0LIJfdrYEx09B/XFi0ybNo3w8HCj/S0sLAyOKRQKbG1tb+u9EfePp7IxAR5Ot12e\n8m7uBxQQEEBAQABpaWmo1WqioqJ0bWq1mm+//RYrKyvmzJmDm5ubri02NpYNGzbwww8/8OabbxqM\nm5+fz4IFC7Czs9M7vmnTJlQqFb6+vsyaNUsvQKqtreXKlSu6j3/44QcSEhJo0aIFb731Fi+//LIu\nSNWuYFq/fj1Dhw4FYPPmzWg0Gj777DOD/YyKior+wLskhBBCCCGEeBRJCCSEEEI84YK9nAn2cjYI\nBDp4Ot/1TdfFw+FgTgFLkjINbrrX1tSQe6GErcm7mTBhAqWlpWRlZfHMM88QGBgI1IVCrq6uHD58\nGEB3/OTJkwC0a9fOIATS9jty5AjZ2dkGIVDr1q1vOt+zZ8/yzjvvUF5ezscff6wrP3c9Dw8PvL29\nSUpKIj8/n65du9KuXTt8fX0xM5MfeR92D6I8pbH/84xJTEykurqaESNG6AVAAGPHjmXbtm1s27aN\nSZMmGZRlGzNmjEEABBAfHw/Am2++abCCyMTEBCcnJ93H5ubmdO/enSVLlhj0HT16NPHx8SQmJupC\nIC1jgaixuQghhBBCCCEeb/IbsRBCCCGAuqfxJfR5tJSXlxMZGYmvry9ffPGF7nhlZSWjR4+mqqqK\nadOm6e3j89evFzF/wUI8ug6lSavgunGKLnExLYnivByunsnkdGY64950YtSAp6itrSUoKAh3d3ec\nnJz48ccfiY2Nxc3NjcuXL/Pf//6Xr7/+moKCAmxtbXFwcDA6V0dHR3Jzc/n4449JT09
[... base64-encoded PNG image data elided: the cell's image/png output, an 832x793 scatter plot of the t-SNE-projected word-embedding vectors ...]\n",
+      "text/plain": [
+       ""
+      ]
+     },
+     "metadata": {
+      "image/png": {
+       "height": 793,
+       "width": 832
+      }
+     },
"output_type": "display_data" + } + ], + "source": [ + "fig, ax = plt.subplots(figsize=(14, 14))\n", + "for idx in range(viz_words):\n", + " plt.scatter(*embed_tsne[idx, :], color='steelblue')\n", + " plt.annotate(int_to_vocab[idx], (embed_tsne[idx, 0], embed_tsne[idx, 1]), alpha=0.7)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.0" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/embeddings/assets/matrix_mult_w_one_hot.png b/embeddings/assets/matrix_mult_w_one_hot.png new file mode 100644 index 0000000..e2c56f0 Binary files /dev/null and b/embeddings/assets/matrix_mult_w_one_hot.png differ diff --git a/embeddings/assets/skip_gram_net_arch.png b/embeddings/assets/skip_gram_net_arch.png new file mode 100644 index 0000000..c9cb8ab Binary files /dev/null and b/embeddings/assets/skip_gram_net_arch.png differ diff --git a/embeddings/assets/word2vec_architectures.png b/embeddings/assets/word2vec_architectures.png new file mode 100644 index 0000000..52e45ce Binary files /dev/null and b/embeddings/assets/word2vec_architectures.png differ diff --git a/embeddings/assets/word2vec_weight_matrix_lookup_table.png b/embeddings/assets/word2vec_weight_matrix_lookup_table.png new file mode 100644 index 0000000..7065edb Binary files /dev/null and b/embeddings/assets/word2vec_weight_matrix_lookup_table.png differ diff --git a/embeddings/utils.py b/embeddings/utils.py new file mode 100644 index 0000000..cc21ae4 --- /dev/null +++ b/embeddings/utils.py @@ -0,0 +1,59 @@ +import re +from collections import Counter + +def preprocess(text): + + # Replace punctuation with tokens so we can use them in our model + text = text.lower() + text = 
text.replace('.', ' ') + text = text.replace(',', ' ') + text = text.replace('"', ' ') + text = text.replace(';', ' ') + text = text.replace('!', ' ') + text = text.replace('?', ' ') + text = text.replace('(', ' ') + text = text.replace(')', ' ') + text = text.replace('--', ' ') + text = text.replace('?', ' ') + # text = text.replace('\n', ' ') + text = text.replace(':', ' ') + words = text.split() + + # Remove all words with 5 or fewer occurences + word_counts = Counter(words) + trimmed_words = [word for word in words if word_counts[word] > 5] + + return trimmed_words + +def get_batches(int_text, batch_size, seq_length): + """ + Return batches of input and target + :param int_text: Text with the words replaced by their ids + :param batch_size: The size of batch + :param seq_length: The length of sequence + :return: A list where each item is a tuple of (batch of input, batch of target). + """ + n_batches = int(len(int_text) / (batch_size * seq_length)) + + # Drop the last few characters to make only full batches + xdata = np.array(int_text[: n_batches * batch_size * seq_length]) + ydata = np.array(int_text[1: n_batches * batch_size * seq_length + 1]) + + x_batches = np.split(xdata.reshape(batch_size, -1), n_batches, 1) + y_batches = np.split(ydata.reshape(batch_size, -1), n_batches, 1) + + return list(zip(x_batches, y_batches)) + + +def create_lookup_tables(words): + """ + Create lookup tables for vocabulary + :param words: Input list of words + :return: A tuple of dicts. The first dict.... 
+ """ + word_counts = Counter(words) + sorted_vocab = sorted(word_counts, key=word_counts.get, reverse=True) + int_to_vocab = {ii: word for ii, word in enumerate(sorted_vocab)} + vocab_to_int = {word: ii for ii, word in int_to_vocab.items()} + + return vocab_to_int, int_to_vocab \ No newline at end of file diff --git a/first-neural-network/DLND-your-first-network/.DS_Store b/first-neural-network/DLND-your-first-network/.DS_Store new file mode 100644 index 0000000..81296b1 Binary files /dev/null and b/first-neural-network/DLND-your-first-network/.DS_Store differ diff --git a/first-neural-network/DLND-your-first-network/.ipynb_checkpoints/dlnd-your-first-neural-network-checkpoint.ipynb b/first-neural-network/DLND-your-first-network/.ipynb_checkpoints/dlnd-your-first-neural-network-checkpoint.ipynb new file mode 100644 index 0000000..3efbfc7 --- /dev/null +++ b/first-neural-network/DLND-your-first-network/.ipynb_checkpoints/dlnd-your-first-neural-network-checkpoint.ipynb @@ -0,0 +1,1002 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Your first neural network\n", + "\n", + "In this project, you'll build your first neural network and use it to predict daily bike rental ridership. We've provided some of the code, but left the implementation of the neural network up to you (for the most part). After you've submitted this project, feel free to explore the data and the model more.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "%config InlineBackend.figure_format = 'retina'\n", + "\n", + "import numpy as np\n", + "import pandas as pd\n", + "import matplotlib.pyplot as plt" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Load and prepare the data\n", + "\n", + "A critical step in working with neural networks is preparing the data correctly. 
Variables on different scales make it difficult for the network to efficiently learn the correct weights. Below, we've written the code to load and prepare the data. You'll learn more about this soon!" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "data_path = 'Bike-Sharing-Dataset/hour.csv'\n", + "\n", + "rides = pd.read_csv(data_path)" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
instantdtedayseasonyrmnthhrholidayweekdayworkingdayweathersittempatemphumwindspeedcasualregisteredcnt
012011-01-01101006010.240.28790.810.031316
122011-01-01101106010.220.27270.800.083240
232011-01-01101206010.220.27270.800.052732
342011-01-01101306010.240.28790.750.031013
452011-01-01101406010.240.28790.750.0011
\n", + "
" + ], + "text/plain": [ + " instant dteday season yr mnth hr holiday weekday workingday \\\n", + "0 1 2011-01-01 1 0 1 0 0 6 0 \n", + "1 2 2011-01-01 1 0 1 1 0 6 0 \n", + "2 3 2011-01-01 1 0 1 2 0 6 0 \n", + "3 4 2011-01-01 1 0 1 3 0 6 0 \n", + "4 5 2011-01-01 1 0 1 4 0 6 0 \n", + "\n", + " weathersit temp atemp hum windspeed casual registered cnt \n", + "0 1 0.24 0.2879 0.81 0.0 3 13 16 \n", + "1 1 0.22 0.2727 0.80 0.0 8 32 40 \n", + "2 1 0.22 0.2727 0.80 0.0 5 27 32 \n", + "3 1 0.24 0.2879 0.75 0.0 3 10 13 \n", + "4 1 0.24 0.2879 0.75 0.0 0 1 1 " + ] + }, + "execution_count": 16, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "rides.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Checking out the data\n", + "\n", + "This dataset has the number of riders for each hour of each day from January 1 2011 to December 31 2012. The number of riders is split between casual and registered, summed up in the `cnt` column. You can see the first few rows of the data above.\n", + "\n", + "Below is a plot showing the number of bike riders over the first 10 days in the data set. You can see the hourly rentals here. This data is pretty complicated! The weekends have lower over all ridership and there are spikes when people are biking to and from work during the week. Looking at the data above, we also have information about temperature, humidity, and windspeed, all of these likely affecting the number of riders. You'll be trying to capture all this with your model." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 17, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAABBoAAALzCAYAAAC/R2QvAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAewgAAHsIBbtB1PgAAIABJREFUeJzs3Xu0ZHV55//P95zTzaGhuQittN1KUEFhElcQlBjNj4iY\n6BgZNTomJBnFqMREXIyuhESjXMyKK4NrVMJEFC/gTBxiBk1iJJhBRMcLChIXysUIotJtg402Td+b\nPvX9/bFrd33P7r2rdlXty/Pd9X6tddap06dOnV3Vdfbls5/n2c57LwAAAAAAgCrMtb0AAAAAAACg\nOwgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQga\nAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAA\nAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABA\nZcwHDc65U5xz73DOfc45d79zbrdzbptz7rvOuY86555T4jFe7Zzrlfz4LyUe72Dn3J84577hnPup\nc267c+4u59x7nHNPrOaZAwAAAAAQn4W2F2AY59yXJD23/6UPvrVC0lMkHS/pNc65/ynpdd77R0c8\npB/x/TLL9BRJ1/V/f/h4J0h6qqTXOed+x3v/2Wl/FwAAAAAAsTEdNEhaq+Rg/seS/l7S/5P0I0nz\nkp4t6a2S1kn6vf6//W6Jx/w1SZuGfH9D0Tecc4dK+qwGIcOHJP2dpF2SnifpzyQdJuka59xzvPe3\nl1geAAAAAAA6w3k/9Un+2jjn/knS1ZI+5XMW1Dn3GElfVVJN4CWd7r3/cs79Xi3pY/37HOe9/9GE\ny3OJpD/vP84fe+//e+b7vyTpS0pCjy9678+Y5PcAAAAAABAr0zMavPdnee+vzQsZ+t//mZKqhtQr\n6loW59yCpPOUhAx3ZUOG/vLcLOkjkpyk051zp9S1PAAAAAAAWGQ6aCjppuD2k2v8Pc+TdHj/9tVD\n7ndVcPtltS0NAAAAAAAGdSFoWBncXqrx9zw3uP3FIfe7VdKO/u2RV8QAAAAAAKBLuhA0/Gpw+64S\n97/KObfRObfHObfZOfc159y7nHOPH/FzJwW37y66k/d+SdK9StonTiyxPAAAAAAAdEbUQYNzzkm6\nIPinT5b4sdMlHaPkihuPkfQsSW+XdI9z7g1Dfm59//MO7/0jI37H/f3Pa5xzK0osEwAAAAAAnWD9\n8pajvEVJUOAlXeu9/7ch971X0rWSbtYgCHiSpN9UMkRyUdIHnHM97/2Hc35+df/z9hLLtSO4faik\nLSV+BgAAAACA6Jm+vOUwzrnTJf1fJWHJA5Ke7r1/qOC+q73324Y81n+U9On+Y+2U9GTv/U8y97lH\nSTDxI+/9z41Ytqsl/Z6SAOQJ3vsfl31eAAAAAADELMrWCefcf5D0KSXBwC5JrywKGSRpWMjQ//51\nki5RMldhlaTfz7nb7v7nlTnfyzoouL2rxP0BAAAAAOiE6FonnHPHSfqcpCMl7ZP0Ku/9Vyp46A8p\nCRukZI7DuzPfT8OKQ0s81iHB7TKtFvs553YoCSq8pJ+V+JElSb1xfgcAAAAAYCbMSZovcb/HKDnx\nvsd7
f8ioO48SVdDQvzLEDZIer+Tg+hzv/T9X8dje+83OuZ9KOkrSupy7bJB0mqRDnHOHjRgI+YT+\n583e+0fHXJSDNHgjPHbMnwUAAAAAYFIHjb7LaNEEDc65o5TMZDhOydn+N3nv/7biXzNsYMWdSgZH\nStLTJH0j707OuXlJT+4/VpnLbRYuw5o1a0beeX5+XvPzZQIqIB579+7V5s2btWbNGq1cWaZbCegm\n/hYA/g6AFH8LmMTS0pKWlpZG3u+hhx5Sf35jJdXyUQQNzrnDJP2rpBOVHIhf4L2/ouLfcbSko/tf\n5g1v/HJw+3QVBA2STlXSOuElTdLS8TNJj12zZo1+8pOfjLwz0EW33XabTjnlFF1//fV6xjOe0fbi\nAK3hbwHg7wBI8beAOq1fv14bN26UpEoOQs0Pg3TOHSzpOkknKzl4/wvv/Xtq+FXnKulJkaQv5nz/\nJklb+7dfPeRxzgluf3r6xQIAAAAAIB6mgwbn3ApJ/yDpl5WEDO/z3l845mMc65z7xRH3+Q1J7+h/\nuVvSx7L36c9auExJGHGic+6tOY/zbEmv7S/rTd77b46zrAAAAAAAxM5668Q1kl6g5MD9Rkkf7V/a\nsshe7/33Mv/2c5K+4Jz7mqTPSPqWknIQJ+lJkl6pZPaC6/+et3rvNxU8/qWSXiXpBEmXOueO7y/j\nLklnSPozJa/pTknnj/VMAQAAAADoAOtBw8v6n52k50v69oj7/0BJeJDlJf2SpGcX/JyXtEPS+d77\njxQ9uPd+u3PuxZI+K+l4SW/of4SPs1XS2d77UcsKAAAAAEDnWA8ahl0Fouz9vynpd5WEDKdKWqtk\n6OOCpC2S7pD0eUkf9t4/NPIXeH+vc+5kSX+kpBriKZJWSrpfSQBxmff+/jGXGwAAAACATjAdNHjv\np75uo/d+u6T/3f+ohPd+l6T39D8AAAAAAECf6WGQAAAAAAAgLgQNAAAAAACgMgQN9ixJ0vz81F0j\nQLTWrl2rCy+8UGvXrm17UYBW8bcA8HcApPhbQEyc9+POW0SdnHMbJK1bt26dNmzY0PbiAAAAAAA6\nbv369dq4caMkbfTer5/28ahoAAAAAAAAlSFoAAAAAAAAlSFoAAAAAAAAlVloewEAAAAAwJJTTz1V\nDzzwQNuLAYzlmGOO0a233tr2YkgiaAAAAACAZR544IF0MB6ACRA0AAAAAECOubk5LicJ8zZt2qRe\nr9f2YixD0AAAAAAAOdauXcsl52FecGlKMxgGCQAAAAAAKkPQAAAAAAAAKkPQAAAAAAAAKkPQAAAA\nAAAAKkPQAAAAAAAAKkPQAAAAAAAAKkPQAAAAAAAAKkPQAAAAAAAAKkPQAAAAAAAAKkPQAAAAAAAA\nKkPQAAAAAAAAKkPQAAAAAAAAKkPQAAAAAABAja6++mrNzc1pbm5Or33ta9tenNoRNAAAAAAA0ADn\nXNuL0AiCBgAAAAAAUBmCBgAAAAAAUBmCBgAAAAAAauS9b3sRGkXQAAAAAADonG3btunyyy/XWWed\npeOOO06rV6/W4uKi1q1bpzPPPFOXXHKJ7rzzzgN+7pxzztk/uPHjH/+4JGnnzp36m7/5G/3Kr/yK\njjnmGC0uLuqJT3yizj77bH31q18tXIbXvOY1ywZAeu911VVX7X/88OOMM86o54VowULbCwAAAAAA\nQJWuuOIKvf3tb9eWLVskLR/C+MADD2jTpk268cYbddFFF+n666/Xr/3arx3wGOnP3H333Xr5y1+u\nu+++e9njbNiwQddcc42uueYaXXjhhbrwwgtzHyP9mbSqYRYGQhI0AAAAAAA6481vfrMuv/zy/Qf5\n8/PzeuYzn6njjz9ei4uL2rx5s771rW/pBz/4gSRp9+7dhY+1ceNGnXnmmdq0aZOOPPLI/RUNDz30\nkG688UZt3bpVknTJJZfopJNO0itf+cplP/+CF7xAq1ev1t13360bbr
hBzjk97WlP0/Of//wDftfx\nxx9f3YvQMoIGAAAAAEAnXHHFFftDBkl61atepUsvvVTr1q074L533nmnrrzySq1atarw8S655BLt\n3btXF1xwgd75zndqcXFx//cefvhhveIVr9CNN94o55ze9ra3HRA0nH322Tr77LN19dVX64YbbpAk\nnXbaabrsssuqeLpmMaMBAAAAABC9hx9+WBdccMH+kOGNb3yjPvGJT+SGDJJ00kkn6b3vfa/OPPPM\n3O9777V371697W1v01/+5V8uCxkk6YgjjtAnPvEJHXLIIfLe6/vf/75uueWWap9UpAgaAAAAAADR\n+9CHPqRt27bJe69jjz1W733ve6d+zDVr1ugd73hH4fcf+9jH6sUvfvH+r7/xjW9M/Tu7gNYJAAAA\nAGjIqadKDzzQ9lJM55hjpFtvbXspDnT99ddLSoYtvv71r9eKFSumejznnF7ykpdo5cqVQ+938skn\n65Of/KQk6Yc//OFUv7MrCBoAAAAAoCEPPCBt3Nj2UnTT17/+9f23n/e851XymL/wC78w8j5HHXXU\n/tvpcMhZR9AAAAAAAA055pi2l2B6Fp/Dtm3btGvXrv1fP+lJT6rkcQ8//PCR9wkrJx599NFKfm/s\nCBoAAAAAoCEWWw66YNu2bcu+PvTQQyt53HSwJMbDMEgAAAAAQNRWr1697Ovt27e3tCSQCBoAAAAA\nAJFbvXq1Dj744P1f33fffS0uDQgaAAAAAADRO+200/bfvvHGG1tckgPNWgsGQQMAAAAAIHovetGL\n9t++8sorTQ1mXFxc3H/b0nLVhaABAAAAABC917/+9Tr00EPlvdcPf/hDnX/++W0v0n7hJTA3zsD1\nTQkaAAAAAADRO+KII/RXf/VXkiTvvT7wgQ/ot37rtwoP7O+44w6df/75uuGGG2pftp//+Z/ff/vr\nX/+6NmzYUPvvbBOXtwQAAAAAdMIb3/hG3XHHHfrABz4g770++clP6tprr9Uzn/lMnXDCCVpcXNTm\nzZv1b//2b/rBD34g55zOOOOM2pfrcY97nH75l39ZX/3qV7Vr1y49/elP1wtf+EKtXbtWc3PJ+f8n\nP/nJ+oM/+IPal6UJBA0AAAAAgM64/PLL9dSnPlXvfOc79cgjj6jX6+nmm2/WzTffvP8+zrn9H6tW\nrWpkud7//vfr+c9/vrZt26atW7fqmmuuWfb9X/3VX+1M0EDrBAAAAACgU8477zx9//vf13ve8x69\n4AUv0Pr167W4uKjFxUWtX79eZ555pt71rnfpu9/9rs4888wDfj4NIcpK7zvsZ0455RTdfvvtestb\n3qKTTz5ZRxxxhBYWFpaFHl3hvPdtLwMCzrkNktatW7eu8307AAAAgEXr16/Xxo0bxT45YlDF+zV9\nDEkbvffrp10mKhoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoA\nAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAA\nAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBl\nFtpeAAAAgKbcd5/0uc9Jr3iFdPTRbS8NAOs2bdqk9evXt70YwFCbNm1qexEOQNAAAABmxktfKt1+\nu/SlL0mf+ETbSwPAul6vp40bN7a9GEB0CBoAAMDMuOOO5POdd7a7HABsO+aYY9peBGBslt63BA0A\nAGBmLC0ln/fta3c5ANh26623tr0IQNQYBgkAAGZCrze4/eij7S0HAABdR9AAAABmQljFQEUDAACJ\nD39Y2rq12sckaAAAADMhbZuQCBoAAEj94z9K27dX+5gEDQAAYCYQNAAAcKA6tokEDQAAYCYQNAAA\ncKBw+1gVggYAADATmNEAAMCBqG
gAAACYEBUNAAAciIoGAACACRE0AABwIIIGAACACYU7Uo8+2t5y\nAABgCa0TAAAAE2JGAwAAB6KiAQAAYELhjpT3Uq/X3rIAAGAFFQ0AAAATyp6xoaoBAAAqGgAAACZG\n0AAAwIEIGgAAkLR1a9tLgBhlgwWCBgAAaJ0AAEB/+IfSkUdK735320uC2FDRAADAgahoAADMvP/1\nv5JBfh//eNtLgthkd6S4xCUAAFQ0AACgvXuTz7RPYFxUNAAAcCAqGgAAMy/dGG7b1u5yID4EDQAA\nHIigAQAw89KN4fbtUq/X7rIgLgyDbM7DD0u7d7e9FACAMmidAADMNO+Tj9T27e0tC+JDRUMzvvUt\n6fGPl449lhYnAIgBFQ0AgJmW3RDSPoFxEDQ047rrpF27pJ/8RPrSl9peGgDAKFQ0AABmWvZA8ZFH\n2lkOxImgoRnh1Tz27GlvOQAA5VDRAACYadkDQyoaMI7s+4fLW9Yj3GHlNQYA+6hoAADMNFonMA0q\nGppB0AAAcaGiAQAw02idwDQIGpoRvq5797a3HACAcggaAAAzjYoGTIOgoRlUNABAPHq95Vf0qgpB\nAwAgGgQNmEY2WCBoqAdBAwDEo45qBomgAQAQEVonMA0qGpoRvs60TgCAbXVtCwkaAADRoKIB0yBo\naAYVDQAQDyoaAAAzj4oGTCP7/uEguB5hgMNrDAC2ETQAAGZe9gw0FQ0YBzMamkFFAwDEg9YJAMDM\no3UC06B1ohnMaACAeMxkRYNz7hTn3Ducc59zzt3vnNvtnNvmnPuuc+6jzrnnjPl4L3LOfSp4rPv7\nX79wjMeYd879gXPuS865nzjndjrn7nHOXeGcO2n8ZwkAKIvWCUyDoKEZVDQAQDzq2hYu1POw03PO\nfUnSc/tfhlf2XCHpKZKOl/Qa59z/lPQ6733hpsw55yRdKem1mcd7vKSXSnqpc+5K7/25I5bpKEn/\nIunUzDIdJ+kNkl7tnHuT9/4jJZ4iAGBMVDRgGgQNzSBoAIB4zGJFw1olB/MbJb1f0iskPUvSsyW9\nRdKG/vd/T9LHRjzWXyoJGbykb0r67f5j/bak2/r//jrn3F8UPYBzbk7SP2gQMlwr6UWSTpP0ZkkP\nSjpI0hXOuV8f+9kCAEaiogHTYEZDM8LXldYJALCtrqDBbEWDpLsk/amkT3nvfeZ73+hXMnxV0gmS\nfts5d4X3/svZB3HOHS/prUrCgVskne6939P/9jedc5+R9EUlAcIfO+c+5r2/N2d5XiPpOf3H+R/e\n+zcH37vVOXe9khBjtaTLnHMneu97Ez1zAEAuKhowDSoamkFFAwDEY+aGQXrvz/LeX5sTMqTf/5mS\nACH1ioKH+q8aBCrnBSFD+ji7JJ3X/3JB0vkFj5P+ri2S/iRnee6V9G5JTklrx8sKHgcAMCGCBkyD\ny1s2g6ABAOIxi60TZdwU3H5ywX3OUlKFcLf3/pa8O3jvvy7pu0pCgv+U/X6/KuLE/uP8nfd+d8Hv\nuiq4TdAAABXLCxry42jgQFQ0NIOgAQDiMXMVDSWtDG4fkMU4545TMvBRStojhkm/v845d2zme8/N\nud8BvPcPSvp3JYHFWFfEAACMlj1Q9F7asaOdZUF8mNHQDC5vCQDxoKIh368Gt+/K+X54ucm7RzxW\n+P0TK3icJzjnDh5xXwDAGPIODGmfQFlUNDSDigYAiAdBQ0b/kpUXBP/0yZy7rQ9ubxjxkPcHt59Q\nweO4zM8BAKaUtzHkyhMoi6ChGeHrStAAALbROnGgtyi5RKWXdK33/t9y7rM6uL19xOOFxbeH1vQ4\nAIAp5AUNVDSgLIKGZtA6AQDxoKIh4Jw7XckVHiTpQUl/WHDXxeD2qE1deDWKbMvD/sfx3k/zOACA\nKVDRgGkQNDSD1gkAiAcVDX3Ouf8g6VNKLkW5S9IrvfcPFdw9vDrEyoL7pA4Kbu8qehzn3DSPAwCY
\nAhUNmEZ2Z4qD4HoQNABAPOqqaFio52Hr0b+KxOckHSlpn6RXee+/MuRHwt3PUW0MhwS3s+0R2cf5\n2YSPU9revXt12223jbzf2rVrtXbt2kl/DQBEhaAB06CioRkEDQBgx6ZNm7Rp06bC79+1/5IK1fa6\nRRM0OOceL+kGJZer7Ek6x3v/zyN+LBzcOGowYzgA8v7M97KPMyxoSB/Ha/TgyEKbN2/WKaecMvJ+\nF154oS666KJJfw0ARIXWCUyDoKEZ4evKjAYAaNcHP/hBXXzxxY3/3iiCBufcUZL+r6TjlBzAv8l7\n/7clfvTO4PbTRtw3/H72UpnZx7m9xOPc772fuHVizZo1uv7660fej2oGALOEigZMg6ChGVQ0AIAd\n5557rs4666zC73/lK9Kb3yxJL5S0ubLfaz5ocM4dJulfJZ2oJGS4wHt/RZmf9d7f55z7saS1kk4f\ncff/r/95o/f+h5nvfTm4fbryL6Up59zjJJ3QX85hLR0jrVy5Us94xjOmeQgA6BwqGjCNbLBA0FAP\nggYAsGNUq/3GjemtUaMIx2N6GKRz7mBJ10k6WcnB+194798z5sP8oyQn6WnOuWcV/J5fUlKJ4CX9\nQ/b73vvvKalycJL+s3NuMXufvnOC258eczkBACPkHRhS0YCyqGhoBpe3BIB4zNzlLZ1zK5Qc9P+y\nkgDgfd77Cyd4qPcpGRwpSX+dDQn6X1/W/3KfpPcXPE4acDxG0n/LWd4nS/rT/pf3iKABACpH6wSm\nQdDQDCoaACAes3jViWskvUBJyHCjpI/2L21ZZG+/8mAZ7/33nHPvURICPFPSV5xzfyXpXklPlnSB\nBhUT/817f2/B418t6bWSniPpTc65tZKulLRF0mmS/lzSYZKWJJ3nve+N+XwBACPQOoFpZN8/HATX\nIwxweI0BwLa6QnfLQcPL+p+dpOdL+vaI+/9A0pMKvvd2SWuUBAW/qCTESPn+x4e99+8oenDvfc85\n91JJn1USWPxm/yN8nD2S/sh7/68jlhUAMAEqGjANZjQ0g4oGAIjHzLVOaBAAjPOR/0CJ10t6sZKZ\nDRuVhAIb+1+/yHt/7sgF8v6nSlo5/lDS/5P0kKRdSqojPiTpGd77j07wXAEAJRA0YBq0TjSDGQ0A\nEI+Zq2jw3s/X8JjXSxp9zcjhj9GT9MH+BwCgQbROYBoEDc2gogEA4jGLFQ0AACxDRQOmQdDQjPB1\n7vXq24kFAEyPoAEAMPOoaMA0mNHQDIZuAkA86toWEjQAAKJRVNHgC6f0xOP++6WnP1064wxpz562\nl6abqGhoRvZ1JWgAALuoaAAAzLy8A8NeT9q1q/llqdonPyl9+9vSF76QfKB6nGlvBq8zAMSDigYA\nwMwLD2Dmgi1YF9onduwY3N6+vb3l6DIqGppB0AAA8aCiAQAw88KN4eGHD253YSBkeNDLgVk9mNHQ\njOxOK5e4BAC7CBoAADMv3BgeeeTgdheCBi4JWD8qGppBRQMAxIPWCQDAzCsKGrrQOkFFQ/0IGprB\nMEgAiAcVDQCAmRduDI84YnC7CxUNBA31I2ioX6934L/ROgEAdlHRAACYeVQ0YBoEDfXLOzPG+xkA\n7KKiAQAw87pc0cCMhvoxDLJ+BA0AEBcqGgAAM6/LwyCpaKgfQwrrR9AAAHGhogEAMPPCg/GwooHW\nCZRB60T98l5TZjQAgF0EDQCAmUdFA6ZB0FA/KhoAIC60TgAAZl6Xh0FamtHw6KPS3/yN9Ld/2+5y\nVI0ZDfUjaACAuNRV0bBQz8MCAFC9Lg+DtFTR8E//JP3RHyW3jz9eetaz2l2eqlDRUL+8HVZaJwDA\nLioaAAAzj9aJZtx77+D2d77T3nJUjaChflQ0AEBcmNEAAJh54cbw8MMHt7vQOmEpaAiX5ac/bW85\nqkbQUD+CBgCIC0EDAGDmhRvDlSulVauS212oaLA0oyE8AH/o
ofaWo2rZYMH7+nawZlVeeNP2+xkA\nUIzWCQDAzAsPCufnpdWrk9tdCBqoaKhfXqhAVUO1mNEAoG4f/ah06aXSnj1tL0k3MAwSADDzwoPC\n+XnpsMOkBx+kdaJqsxY0HHRQ88vSVbROAKjTbbdJv//7ye3HP176nd9pd3m6gIoGAMDMG1bR4H07\ny1QVgob6UdFQP4tBw86d7S8DgGr84Af5tzE5ZjQAAGZeuDFcWBgEDfv2Sbt3t7NMVbE6o6FLQUNe\nqEDQUC1rrRPf/a60bp30cz8nPfxwe8sBoBrhOpv1dzUIGgAAMy9b0XDYYYOvY5/TQEVD/ahoqJ+1\nYZCf+UwSMPz4x9LnP9/ecgCoBkFD9WidAADMvKLWCYmgoUrZoCH2tpQUQUP9rLVOhNUUsVc9AbBV\n/dcVVDQAAGbesIqG2AdCWg0a9u2LP8SRpF4v/9/bfq27xlrQEL6XufoFED8qGqpHRQMAYOZ1uaLB\n0lma7E5HF9oninak2FGtlrUZDQQNQLcQNFSPigYAwMwbFjRQ0VCd7M7bQw+1sxxVKtqRYke1WpYr\nGvbsaW85AFTDUijfFQQNAICZxzDIZnSxooGgoRnWhkFS0QB0CxUN1aN1AgAw88KN4dxct1onCBrq\nRdDQDFonANQpXMew/q4GFQ0AgJmXbgzn5iTnujUM0lI5KEEDJmW5dYKgAYgfFQ3Vo6IBADDz0oOY\nhYXkMxUN9ehi0MAwyGZYDhqY0QDEz9K2siuoaAAAzLx0Yzg/n3wmaKhHF4OGoh2ptl/rrrEcNFDR\nAMSP1onqETQAAGZeNmjoUusEQUO9aJ1oBjMaANSJ1onqpa+jc9U+LkEDACAaXa5oYEZDvQgammH5\nqhO0TgDxo6KhelQ0AABm3rCggYqG6mR33h56qJ3lqBIzGppB6wSAOlnaVnYFFQ0AgJmXDRoWFwff\ni/0gwtLOExUNmBStEwDqROtE9ahoAADMvGzQsHLl4HuxH0QQNNSLoKEZVDQAqBOtE9UjaAAAzLx0\npyINGtLPUvsH59MKN/R790ret7cs2Z23HTvi728naGiG5aAh9vcwAFuhfFfQOgEAmHnZigbnBlUN\nMZ+t7PWSj1BdZxjKyDv4jr2qoShQYEe1WpaHQca8jgCQoHWielQ0AABmXroxXFgY/FsXggbLZ4FT\nsQcN4WscnrVhR7VazGgAUCdaJ6pX1+tI0AAAGGnPHumWWw486960bEWD1I2gwfJZ4FSXgoaDDhrc\nZke1WpZDs5jXEQASVDRUL11v0zoBAGjcWWdJz3qW9Ja3tLsceUHDihXJ55hL4C0fnKUIGlCG5fcy\nMxqA+IXrmJi3+5bQOgEAaM2NNyafv/CFdpeDioZmdDFoCJ9TeFlUgoZqWQ4aYl5HAEhQ0VA9hkEC\nAFrh/WAj1PbZA4KGZuQtz0MPNb8cVaKioRl5ryczGgBUxVLQ0OtJ550nvfKV0s9+1u6yTKOuioaF\n0XcBAMwyS2WKBA3N6GJFA0FDMyxXNNA6AcTP0jDIr31Nuvzy5Pazn91+e+mkGAYJAGhFeJBgMWhg\nRkP1ZikQVCQ+AAAgAElEQVRoiPl9Y5Hl93LMYSSARPg33fb6O6xiuP/+9pZjGt4nHxKtEwCAhoUb\n8jZ31L0fXPWCioZ6pctz1FGDf4s9aGBGQzO4vCWAOllqnQh//5Yt7S3HNOpqm5AIGgAAI1ipaAg3\nhnlBw9JSvRvMOlkNGh772MG/xR40hO8Ngob65P0N7ts3OGPWNIIGoFsstU6Evz/WGQ11voYEDQCA\noayUKYY7FwvBhKE0aJDaL6OclNWg4eCDpcMOS253KWhgRkN9isK+tl5nZjQA3WJln0TqXkUDrRMA\ngEZZr2hIZzRI7e90TMpqX/vCwqB9gqABZYSv51ywl9nW+zlb0dBWZQWAalDRUC0qGgAArbEeNIQV\nDbGWRluqaOj1BgdjYdCw
ZUu8rSkSMxqaUtSi0tbfZvj/633c72EAzGioGjMaAACtyZYptnVGkKCh\nGeGyhEFDryc9/HA7y1QFKhqaEb7OBx88uG3h/SzFu44AkMgGDW1WKYXru1grGmidAAC0JnuA0NYZ\nwTJBQ6ytE5aDhqOPHnwdc/sEl7dshvWggTkNQNyy+yBtVillZ8Ds2tXeskyK1gkAQGuyBwhtHTCU\nmdEQ69lKy0FDVy5xyVUnmmG5daLN5QBQjezfdJvr8OzvjrF9gooGAEBrrAQN4Qa9a60TloZBZq/u\nQdCAcRTNwrCw3pDiXUcASGS3l5aChhjbJ6hoAAC0JrsRsnAA3LWggYqG+oXPixkN9aF1AkCdqGio\nFsMgAQCtsVLRkD3TnmJGQ7W6GjQwDLIZRZUjFt7PUrxhJICElZMfUvcqGmidAAA0ymLQwIyG+hA0\nYBpFFQ1t/G2Gl2ptczkAVMdy6wQVDcsRNAAAhsoe8La1o97l1glLMxoIGppz2WXS8cdLf//37S1D\n1SxVNOT938a6jgCQsNw6EWNFA0EDAKA1VsoUuxw0UNFQP2tDCiXpkkuke+6R/vzP21uGqoWvc9sz\nGvL+rpjRAMTNyj6JdOBBeowVDbROAABaE1PrBDMappcNGo4+evD1Qw81vzxVsXjViUceST5/73vS\njh3tLUeVLF3ekooGoHsst05Q0bAcQQMAYCjrQQMVDdXKBg2rVg1aDWKuaLDWOuH94P/Ye+muu9pZ\njqpZuuoEQQPQPZZbJ6hoWI6gAQAwlJUyxS4HDZZnNDg3aJ8gaKhOr7f8629/u53lqJr1oIHWCSBu\nVDRUi4oGAEBrrFQ0hBv0rgUNlisapOVBQ3aKfyyKZjS0tZOa/f/tYtDQ9iwMKhqA7rFy8kPqRkUD\nQQMAoDVWggZmNDQjL9BJg4Y9e6SdO5tfpipYq2jI/v9+5zvtLEfVmNEAoE60TlSL1gkAQGssBg3p\nmXaJioaqDatokOJtnwjfPytWDHao2tpJzf7erlQ0WLq6B0ED0D20TlSLigYAQGuslCkyo6EZsxA0\nzM8PnpuVioYHHoj7qh4pZjQAqJOlioa8y1tm5+9YV+frR9AAABjKYkVD14IG6xUNhx8++Lf0koyx\nyT6v9Lm19Trn/d4utE/QOgGgTtmDe0szGno9adu2dpZlUuHrSesEAKBR1oMGZjRUKy9oOOSQwb91\nYUbD/PzgfWOldULqRvuE9YoGggYgXt7bbp2Q4pvTQOsEAKA12QOEtnbUqWhoxqigYceOZpenKtZb\nJ6RuVzQQNACYVt5BsbWgIbY5DQyDBAC0hhkN9bM+o2HVqsG/ETRUI+//twsVDenr6dzyv00rQQMz\nGoB4WdpWSlQ0jELQAAAYynrrRBeCBioa6lc0o8FS68R3vpOUBscs/TsN21MkZjQAmF7e3zQVDdNh\nGCQAoDVWgoZwY8iMhvp0NWiIoaJh2zbpRz9qflmqlL7OCws2KxoIGoB4xRA0xFzRQOsEAKBRtE7U\nj6ChfjEEDVL87RNFFQ1tvJ/zfietE0C8mNFQPVonAACtsVLREG4M04NEqRtBg6W+01kLGiy8zmvX\nDm53NWigdQLAtCyF8lL+tju2igZaJwAArbEYNFDRUJ9ZuLzlwkL7l7cM/39PPnlwO/YrT6Sv5/w8\nrRMAqhVDRUNsQQOtEwCA1lgPGtouz66CpQOirlY0ZGd8WGqd+PmfHyxPVysaCBoATCuGGQ2xtU5Q\n0QAAaA0zGupnvaKBy1tWL/y9hxwiPfWpye277443MJOWD4O0GDQwowGIVwxBQ8wVDVUjaAAADGW9\noqELQQMzGuo3LGho45KS4f/vwkJS1ZD++7//e/PLUxVmNACoi6VtpTRYx4TBamwVDbROAABaYz1o\naPtgpgrWKxq6FjQsLCwfKNrrNb884f/vihXSL/zC4OuY2yfCoIEZDQCqZLWiYWFBOvLI5H
ZsFQ20\nTgAAWmM9aGj7YKYK1oOGgw6S5vp7DLEGDUUzGrLfa2N5skFDzAMhmdEAoC6Wg4bHPCa5TUXDAEED\nAGCo7EbcwpBCKhrqkxc0ODeoaog1aChqnZDaea2LWick6c47m1+eqoRXnWj7b5MZDUC3WGudCGfS\npBUN27bFddKDigYAQGusVzTMzw++jjVosLTzlBc0SIOgoQuXt7RQ0ZBtnXjc4wZfb93a/PJUJdzx\nbrvaiIoGoFtiqGiQpIcfbmd5JsEwSABAaywGDeFBojQ4cxrTWYSQ9YoGqXsVDeHZdgutEwcdNPh6\n9+7ml6cqtE4AqEveQbGVoCGtaJDimtNA6wQAoDXWL28pDc6cxnoQEb7Gi4vJZ4KGamWfl6WKhnR5\n0mWKubzfetAQ82sLzLpYKhpiChponQAAtMZiRQNBQzPLEh6Mr1qVfN69u95Sy7qEyzw3ZytoSA/I\n0//7rlQ0hH+nVmY0xLqOAGCr+k8qrmiIaSAkrRMAgNYQNNQvfG4HH5x8thY0hJe4jHFOQ/oaz80l\n5aFtBw3Z1gmpG0FDOAzSucHfppWKhljXEQDstk7Mz3ejooHWCQBAo2IIGro0oyGGoCHG9onwTLvU\nftCQbZ2Q4g8avE8+pMHr3ObfJkED0C2WWyeoaDgQQQMAYChmNNSvKGhID9raWpYuBQ3hDmH4WWr/\n8pbpwXg6EDLWoCFvYGv63Ky0TjCjAYiX1YoGhkHmI2gAAAwVQ0VDl4KG9Ky21M4shK4GDdYqGrrY\nOpH3N0rrBICqWJvREF7ON2ydiKmigWGQAIDWWAkawo1h14KGvBkNUvsHZ12c0ZAt6ZdonahKXtBg\nsXWijUohANOLpXUi1oqGqhE0AACGiqF1IjyYifEgoqiioe2DMyoa6jPsqhNLS+3uPE8qLwy0EjSE\nJcGxznIBZp2l1gnviy9vGWtFA60TAIBGWaloyOv/TqUVDd7HeenFcHJ1eKa97YOzvMtbSnEGDcNm\nNFhrnZDinCUwrKKh7RkN4fs31sonYNZZap3o9Qa3qWjIR9AAABgquxFvaye9zIwGKc6DiPAgOHwu\nloIGKhqqNax1QoqzfSIvDLQyo4GgAYifpdaJ7LZy5crBdjKmigaCBgBAayxWNHQtaAgHSlmtaCBo\nqNaw1gkp/qDBWusEQQMQP0utE3mtYmlVQ0wVDXW2TiyMvgsAoIydO6U//VPp9tsH/+ac9Bu/Ib31\nre0t17RimtEgxdl/HVY0tP1cZjFoaPt1noWgwVLrRIxtKQBsVzRIyZyGDRviChrqrGggaACAivyf\n/yP99V8f+O833ZSEDU99auOLVAkqGuoXw4yG2IMGazMa8lonDjpo8G9dCRqy81Oyf7t1Cv9fw/dv\njOsIAPkHxRb2SdJ1eFrRsHu3tGvX8qtIWcUwSACIwKZNxd/78Y+bW44qeU/Q0ARLFQ1FQze5vGW1\nutg6MeyqE1Lz72daJ4BuiaGiIRVLVQMzGgAgAuFO9Kc+Jb3jHYOvYzxokGydPcg7iEmFBzMxHkQw\no6F+1mY0dL11In19CRoAVMV60BBeeSKWgZB1vn4EDQBQkXAnetUq6dBDB1/v2tX88lTB0qWkylY0\nMKOhmmVJlycV++UtrQUNXb/qRF5FQ9MH+EWtE8xoAOJkdRhkXtAQY0UDrRMAYFS2FDr2gwYp/0DX\netAQ49lKizManJPmgr2ErlQ0WJzRkFfREOPB8LAZDRIVDQCmY+nkx6jWiVgqGma6dcI5t8Y592Ln\n3MXOueucc5udc73+x0dLPsarg58Z9fFfSjzewc65P3HOfcM591Pn3Hbn3F3Oufc45544/bMGEKPs\ngUM4BCjWigarQcNCZpRxV4IGSxUN2dc49qAhDHOk9oOGrrdOMKMBQNWstk5kL28pxVPRUOfrF8NV\nJx7MfO37H5OY9Of2c849RdJ1kp6SebwTJD1V0uucc7
/jvf/stL8LQDn33JO0KRxzTLvL0cWKhrwN\nUK+XfMw1HFWXvbxljAcRFmc0ZF/j2IMGa5e37GLrxKhhkG22TnB5SyB+1lsnwhNMsaxn6mydiCFo\nkAYH9PdLukvSr2vy0ODXJA2ZDa8NRd9wzh0q6bMahAwfkvR3knZJep6kP5N0mKRrnHPP8d7fPuEy\nAijpK1+Rnvvc5LJw990nrV3b3rLMSkVD+u/hpfiawIyG5pclFHPQ0OsNblupaOjiVScst05weUsg\nftZbJ8L1XSxBw6xXNFws6RZJt3jvNzvnjpV03xSP9z3v/Y8m/Nk/kXS8kpDhj733/z343tedczdJ\n+pKkVZLeJ+mMKZYTQAk33ZR83rNH+trXpJe/vL1lGRY0xHjQIMUZNMR4EGFxRkM2aJifT/7P9+yJ\n7/KWo0r6226d6EpFA1edAFAnSxUNeeu7cL8olqAhfR5VVzNIEcxo8N5f7L2/znu/uc3lcM4tSDpP\nSchwVyZkkCR572+W9BFJTtLpzrlTml1KYPaEG5i2z2QPa53oWkVDGzvqsxA0WK5okAZnhWOraMg7\nqLdS0TA/P9jBC3dSYw8amNEAoGpWZzTkBQ2xrGeybYVVMh80GPI8SYf3b1895H5XBbdfVtvSAJC0\nfMe17ZV6FysaijbgbRwAd3VGg/eD0n7rQUN6sBZb0JD33rESNIT/312qaGBGA4CqxRQ0xLKeGbbN\nnxZBQ3nPDW5/ccj9bpWU7oI9p77FASAtX9G3vVKfpYqGNg+ApW7NaMiWX1oOGmKtaLAYNKS/s0tB\nQ97fqIUZDWnbTyq2MBJAIq91ghkN06GioVpXOec2Ouf29C+V+TXn3Lucc48f8XMnBbfvLrqT935J\n0r1K2idOrGB5AQwRS0UDQcP0uto6kT04iyVo8FNfx6k5FoOG9P82XI4waIhlJzVktXViYSHudQSA\nBBUN1aOioVqnSzpGySDMx0h6lqS3S7rHOfeGIT+3vv95h/f+kRG/4/7+5zXOuRVD7wlgKuGKvu2d\nxy5e3tJq0JDdIMZ8EJHdWYkhaFhaiut1HjWjoc3LW3apomHUMMi2WieyQUMsBwAAlrM0DDKvgivG\nyqk6KxpiuOpEVe6VdK2kmzUIAp4k6TclvULSoqQPOOd63vsP5/z86v7n7SV+V1hUeqikLRMtMYCR\nLAcNXahosDqjYS4Tk8c8oyHG1gkpqWpo+sojk7JY0dDF1gmrl7dcWIjzAADActnQeN8+G0EDrRP5\nZiVo+JT3Pm+A4zcl/b1z7j9K+rSS1+O9zrl/8t7/JHPfdPNfZvMUvrUOFkEDUBvLrROhGA8aJJsV\nDXNzB16GKeYZDTFWNEjJJS4f85hmlmlaFi9vOap1IsZ1Bq0TAOoUrqsXF6Xt25N/876eyzMO05XL\nW9I6MSXv/bYR379O0iVK5iqskvT7OXdLN/krc76XFZ7jifQ8JhAHKhrqZTFoyEvdYz6IiHFGgxTX\nQEiLFQ1db50gaABQtXAdE64v81oq6sblLUeblYqGMj6kJGyQkjkO7858Pw0rDi3xWMGuWKlWiwPs\n3btXt91228j7rV27VmvXrp3kVwCdYL2iwbkkaY/xoEGy2TrR5aDBekVDeInAmIKGUTMarLROhDup\nMa4zRl11ghkNAKYRrmPC9eW+ffWckS+7LNZbJzZt2qRNmzblfi89Eba0JO3bV+1KmqChz3u/2Tn3\nU0lHSVqXc5cNkk6TdIhz7rARAyGf0P+82Xs/0W7i5s2bdcopp4y834UXXqiLLrpokl8BdILFiob5\n+UEJ3+JishKnomF6w4IGZjRUw/vB8lDRUK+ut07kDYNkRgOAaWRbJ1KPPrr866aXxXrrxAc/+EFd\nfPHFQ+9TkENMhaBhuWEX67pTyeBISXqapG/k3ck5Ny/pyf3HumvSBVmzZo2uv/76kfejmgGzzmJF\nQ7YUeteuOA8apO
Wv7+Li4HlYCxqY0VCNYVf2kAgaqkTrRP1onQC6pah1os2qNMl+68S5556rs846\nK/d7p5+ezLp44hOlXbteqM2bN1f2ewka+pxzR0s6uv/lj3Pu8uXg9ukqCBoknaqkdcJL+sqky7Ny\n5Uo94xnPmPTHgZlhsaIh3LE++GBpy5ZuVDSsWtVu0JD+X3e5dSI7o6GtUnNptoKGNipH8lonrJbd\nljUqaLDSOhHbOgJAoqiioe2gIa9VzNI6vEyr/aGHSktLZUYRljcTwyBLOlfJMEhJ+mLO92+StLV/\n+9VDHuec4Panp18sAMNYr2hIB0LGeHZSWr4hDfvz23itmdHQ/LJkxRo0WJvRUFQ54txg5znGdUYs\nl7e0dAAAoDzrFQ3z84N1XyzrmTqHQXY+aHDOHeuc+8UR9/kNSe/of7lb0sey9+nPWrhMSRhxonPu\nrTmP82xJr1VSzXCT9/6bUy4+gBHCFX3bK/VhpdBdqGgIr6LRZutE3gFw2wMUp2FpRkNXgwZrl7cc\ndincmIOGvDN8Ft7PVDQA3VA0DLLNfRJp+fYyXa6290nLqvPyluZbJ5xzz5H0lOCfjg5uP8U5t6y6\nwHt/deYhfk7SF5xzX5P0GUnfkvQTJYHBkyS9UsnsBackIHir975oHMalkl4l6QRJlzrnjpd0jZJL\nWJ4h6c+UvKY7JZ0/1hMFMBFLFQ15K+v04HzXrnau8zytbOtE3r83hYqG5pclKwwadu6sf3mqYm1G\nQ/j7uhQ0WB4GGfM6AkDCautENmjYuTOe9cysX97ydcpvVXCSntv/SHlJ2aAh/fdfkvTsgt/hJe2Q\ndL73/iNFC+K93+6ce7Gkz0o6XtIb+h/h42yVdLb3/ttFjwOgOtZnNKQbwl4vWdbsQYV1BA31Gzaj\nwVrQEOvlLa0FDeH/a/Z17krQ0PblLcM5GFzeEuiGcB0TVllaChrSdU0M6xnvk/1TaUYrGvqGXQ1i\n1P2+Kel3lYQMp0paq6QqYkHSFkl3SPq8pA977x8a+Qu8v9c5d7KkP1JSDfEUSSsl3a8kgLjMe39/\nyeUFMCVLFQ3DZjRISVVDbEFD0YwGa0FDzJe3jLWiIdagwcKMhq62Tli66kS685wug9Vp8ADKK2qd\nsBQ0xNQ6kbfOrpL5oMF7f46WD1gc9+e3S/rf/Y+qlmmXpPf0PwC0KJaKBik5cDjssGaXaVoWZzR0\n7fKWzGioX97sAKutE+lOKkHDdIYFeG1vKwBMpqh1os0rYUn5QUMM65m6g4bOD4ME0G2xVTTEhtaJ\n+lHRUD9rl7cs0zqxZ09S1hqTUcMgm/zbzL6XnRssS2zrCAAJq1edCPdLYmqdGLXNnxZBA4CoxVbR\nEJtYWie6EjRYn9HQ1aDBYuuEFN972dLlLfPeyzGVNAM4UPp3PTfX7pWDsr+T1ol8BA0AomYlaOj1\nBj3BVDTUI/2/ZkZDc8uSRdBQjbJBQ2zhpKWrTuS9l9PQI7Z1BIBEeJnrNqvSpNFBw759y2fFWFR0\nic6qEDQAiJqV1omiA4eYDxokW0FD2WGQsc1oiDVoiOnylnnPy8qMhqLWCSm+dYblGQ0SQQMQu/CE\nQ5vrcKn4ID2mCsui9o+qEDQAiJqVioaioIGKhuqEZzKyYu6/jmkY5OJi8lpL8Vc0zM0lHxIVDVWx\ndHlLggage8JL1lpvnZDst0/QOgEAQ4Q77G2u0Lta0WBlRkN4reeijWGsBxExzWhwbvA+iD1okAbP\nkaChGtYrGmLqnQZwoKLWCYKGyTAMEgCGoKKhXkWXt2z6tQ77HIuChlgrGvKm46fP0VrQIA3aJwga\nJtfV1olRV51oO2iINYwEkChqnbA0oyGm1gkqGgBgCOszGsKD89gOGiQ7rRNlNobpxj32GQ3S4D1E\n0FCNoudlsaIhPBsW2zpj1DBIWicATIOKhmoxDBIAhrBe0RCenYyxosFK68Q4QUNs
BxF5G/q2QpOu\nBg2jKhqafp1nqXWirctbhr8r7+/K+jR4AAeyOqMh3K7EFDQwDBIAhoipoiHGoKGodYKgoTrDys0t\nBw27dsVzsBZr64T1ndSsWGY0NL0sAKph6aoTZSoarO+P0DoBAEOEK/qlpeUrzSZ1dRhkTK0TXZnR\nIMURNEjxhGdF75/0dbbUOhHzOsN60BBT7zSAAxW1TrR5Jax0eVLhesZ6WMwwSAAo4P2BwUJbZ6lm\noaKhzaChTHkfMxrqWZas8H0QS/tETDMaYg4a8v5OnRu8zlZmNDS9LACqEVY0WGqd6MKMBioaACCQ\nt2Fpa+exqxUNzGio37ABehaDhrCiIZagIdbWidjWGUVn+Np4P49qnbB+AADgQOGMBlonpscwSAAo\nkLfTai1ooKKhGmU2hmnQsG9fPLMDpPgqGggaptfVioZRLSptBw1UNABxs9Q6UebyltYDTYZBAkAB\nKhrqZzFoGDWjQYqrfSLWYZASQcOkZjVooHUCwDRiGwZpPWigdQIACsQQNFDRUI1xWiekeIMGKhrq\nMWpGg6XXOdxJ7UrQ0Mb8lFFBg/UDAAAHCisaLM1o6MLlLWmdAIBA3k5rWyv1rlY0hBuhWC5vKcV1\ntnLUjAbvm1uWMmWUMQYNVDQ0o+j9Y6V1IqbeaQDL9XqDtkjLFQ0x7YtQ0QAABahoqF/6vJxr9xr0\n47ZOWN+4h4ZVNEjNXrK1zCyMMGjYubPe5anKqJL+paVmA52yQYP1s2FZzGgAUJfs9snqjIaYKhoY\nBgkABWIbBhnb2Ulp8LxWrFi+EWr6de5yRcOwGQ1S+wdnWTFe3nJURUP2PnWbtatOtHFFGIIGoFuG\nBQ1tVDQUre9iChoYBgkABaxWNIQbnPCgIcaKhvQ1XrEiqWqw0NM+SzMaJHtBQ+ytE3kzGqRmd1S7\n2jphvaIhpgMAAMtl9wMszWigdSIfQQOAaMVQ0bByZXKALsUZNIQVDeFni60TMW3cQ8NmNEjtH5xl\nxRg0FAVVBA3Vsh40xLqOAGCvoiH8nXPBEXVMgSbDIAGggKWKhnBZwgMH5wYHDrEdNEiDA4O2r4Yw\nyzMa2j44y4oxaCjTOtHkjuostE7kBQ379jU3C4OgAeiW7N+0lRkNCwuDE0pSXEEDFQ0AUCCGigZp\nMKeBiobJlRlYFOtBRGwzGroaNDT5One1oqGocqSNtiaCBqBbrLZODLtEsfX1DBUNAFDAUkVDmQOH\n2A4apOUzGsLPFisamNFQz7JkxR40MKOhPkWvcxvvZ2Y0AN1itXUiu08S7otYX89Q0QAABahoqF+2\noiHdgFoPGqyfRQgxo6F+RWfa2zojNux1Dg+GYw4ail5nKhoATCK7HrfUOhGKKdAkaACAAlQ01I8Z\nDfWLraIhvLzlzp31Lk9VrM1oGLa+iGknNavodW7jAJ+gAeiWYTMa2ry8ZcxBA60TAFAgxoqGpgah\nVcXijIauVTQwo6F+MQUN4Q50bOFkTBUN1g8AACyXrf6zOqMhpn0RKhoAoEDehqWtnccyFQ29Xjsb\nw2kwo6F+sVU0EDRMb9TrHGsVVNHl3qwEDTENaQOw3LDWCUtBAxUNAwQNAKIVS+tEWtEgxTenwUrr\nRFGPfSimswih2GY0rFgxWL5Ygoai52WxokGKN2goKiVuo62J1gmgW4YNg2RGw2SoaACAArG0TsQ6\nRX5padDqka1o2Lev2TYQZjS0tyx50qqGWIKGmC5vKcUfNAybwm6ldSKmdQQAezMaaJ0YjaABQLSo\naKhX+Ppmgwap2QOzWWmdiGFGg9TNoIHWiekVBQ0WWyesn2kEsFx2W2llRkN2fRfTeobWCQAoEEtF\nQ6xBQ95zshA0FG0MYzqLEKKioX5lhhTSOjE9KhoA1GVY64SlioaYggYqGgCgQCwVDbG2ToTPqe0D\n4C5fdSK2GQ3S4BKXsVzeMrYZDemOakzrC6n4
DB8zGgBMa9gwSEszGubnJeeS29bXM1Q0AEABKhrq\nZbWiYdZmNDT5XMataHj00TjaVGJtnVhaiutKNUXDINs4wCdoALol+zfddutE0frOuUFYTEUDAESK\nioZ6MaOhGTHPaJDiaJ+wFjSUbZ2Q7O+ohqy3TsRU0gxguVhaJ6Q4gwYqGgAgQEVDvWidaEbMMxok\ngoZJjBM0xBROlhkGSUUDgElkQ/m54Ci26ZMLvd7gylsxBw1lLh0+DYIGANGioqFeec+praqBLgcN\nec/NetCQzmiQ4gjPysxoaOt1zns/x7rOKFPRQNAAYBLZs+/ODf62m65oGLWtTNc11tcztE4AQAEq\nGuplaUZDmdQ99hkN4QApC6/zsKAhfE/HMBDSakVDurOc1eWggdYJAJMY1mZoLWiIsaKB1gkACORt\nWNpaqXexooEZDc3Im9QfU9AQQ3hm9fKWeW0TUvzrDFonAFQt72/aQkVD3j5JLEEDFQ0AUICKhnpZ\nndFQdAAc60FE3kAp60FDbK0T1ioahg0Rk+INGrjqBIC65O0HpJ+bPrnQxdYJKhoAIMCMhnpZap3o\n8uUt8w7OrAcNsbVOlJnRYKmiISzxj2mdYb11Ivz/jmkdASC/iqCtioZRB+jpOnzv3sHQSIsYBgkA\nBWIJGmKtaIi5dSKmgwirFQ3Ddjpie0+H759wUrnVoKFrl7e00joR0/XtASyXd3BvfUaDZHt/hNYJ\nAChgtXUiu7IODxpiOChLjapoaPK1ZkZD88uSN6QwFWvrxNzc8udF60S1rF91IlwWyzv/AA5kdUbD\nsOzHKiMAACAASURBVNYJyfa6hmGQAFDAYkVD3hT58OxvTAcNVmc0UNHQ7LLkia11ougAuK3LW87a\nMEgrrRPhssS0jgAwvHXC2oyGWK5wQ0UDABSwWNGQd+AQW5l5ihkNzbA4o2GcoCGG93TRkMIYWidi\nCRq8l3q95Hb2dbbSOiHROgHEKtbWCcvrGoZBAkABSxUN6bLkHTjEeNAgMaOhKTFWNMTWOlF0pp3W\nieqkIYNE6wSA6lkaBln28paS7aCBYZAAUICKhnpZap0oszHs0oyGtp7LrLVOhO9nKhqmMywMpHUC\nwLSGzWiw1joRy4kPWicAoIClioZhBw4xHjRI+a0TbR0AU9HQ3rLkiS08KzOjgaBhOsP+RttunQiv\nNELQAMQpr8zfQkVDzK0TDIMEgAJUNNTLautE0cYwvFJCTAcRMc5oiK11wlLQ4H3xzIhUjEHDsKqj\nNlsnsgN6mdEAxClvHZNuK5eWknVrU0btk8QSNFDRAAAFwo1OeuDT1gp9VioaLAQNwzaGMZ6tjL2i\nIYbWiaLn1UbQkBfgZcWykxoatuPdZutE0bIsLS1fZgC2DatoyH6/bl1snaCiAQAC4Q5rGjRYrGhY\nuXJwRi2Gs78pSzMaxg0aYp/REFPQEMN72tLlLfMCvKwYw0mrrRMWQg8A0xs2DFJqb25UzBUNDIME\ngALhCvKQQ5LPFoMG5wYHDrEcNEhUNDShqIzeetBA68TkyvTEdi1oaLt1ou1lATC9YcMgs99vY1lC\nsQQNtE4AQIG8ioa2ymFHDXdLzwDHcFCWsjqjYdjGMF2+WA4gwksCxhQ0xNY6YSlomMWKBkutE7Ec\nAABYLq/Mv60rB42qBIgl0GQYJAAUyKtokJovh/W+/BT5mIKGmFsnLG/YQ0Ub+ZiChhje00XPq42d\n1FkMGqy2TsSyngAwunWCiobxUdEAAAXyKhqk5ncewxX1qIqGWA4apNGtE02+zl2d0VB0VqTtoGHU\nDseKFYP7xBA0WKpo6GrrhOWrTrS9LACmN6p1ghkN42MYJAAUKKpoaHrnscwZyhhbJyzNaCg7sIiK\nhsl5X76iwbnBe5rWifHMQkWDhQGMtE4A3TLqqhNNVjSMOkAP13mW1zMMgwSAAuEKMtwxtxg0hMMg\nm7zW8zSY
0VC/op2VNs7SFM2LKBJTeEbQUD9aJwDUKe+g2MKMhlEVDZbXM+F6e66GVICgAUC00h32\nhYV2V+rjVDT0evGU9Vud0TDsILgrFQ3ODb5u+gxwdlmKpO1KMQQNRQedbQQ6ZV7nGM+6Dwsa5ucH\nl/i1FDTE8toCGF3RQOvE+MpWMU6KoAFAtNIV5IoV7Z6l6uoZSkutE+POaAgvG2nZsLLF9LW2dAAc\nonViMmXWF+H6LJb1xbC/Ueean59StAMdzvOJ4b0LIMEwyOoVbRurQtAAIFphRYP1oCG2Kf1SfutE\nG73W0vitE1IcVQ3DdlasBw2xVDSELSGxBA3OLW+3isGoXt+m25qKgobDDx/c3rq1mWUBML28bZTV\n1olYWrTSfSsqGgAgo6iioen0eBYqGiy1TpSpaJBsb9xTw1pCmg4axp0+nYZn+/bZbgcqOzvAWqAT\nW9BQdjha20HDYYcNbj/ySDPLAmB6loZBjgpWY6loKHulqUkRNACIVowzGiT7Z4BTMbdOSLYPflMx\nVzTE8p4u8xpLzb3OZdYXUtxBw7CKhiZe57B1iqAB6IZRrRPMaBgfrRMAUIAZDfWKPWiIoaIh5hkN\nYa+75aDBWkXDrAYNTVY0DKuuoHUCiJPVioaYWycYBgkABZjRUC9Ll7cse61nZjTUsyx5wve05aF6\n1oKGWWidaDtoGPYaU9EAxMnSjIZRrWJUNCQIGgBEK9aKhliCBmY01M/SjIautk5YCxq6WtFQdhhk\nE68zQQPQPVx1onpUNABAAYsVDUUr6/CgLJYDh9hbJ5jRUN2y5ImxdSL7vJwbvJ+sBg179iQzB6yz\nNAxy2HuZ1gkgTqNaJyzNaGhzQPk4qGgAgAJUNNRrVNDQ5Otc9ooIsVU0xDyjIZbWiVFn2tu6GoI0\n/HVuc8DtJMq2TjTxfh4W/lLRAMQp1ooGy+tvggYAKGCxoqHMjIZYKhoszWgoW9HQ1RkNTZzRnsXW\nCan5QGfcigYpjnVG2dd53z6p16t3WWidALrH0oyGrl3ektYJAAh4H1dFQywHZSFmNNSvzIyG7P3q\nMgutEwQN9Slb0SDV/1oPey8vLg5ed1ongHjQOlE9KhoAIEd4RiyGiobYDhqk/Oc1N5d8ZL9ft1me\n0SC1P0AvTyytE6Pabqy2qMS2zoglaHBuUNVARQMQD1onqkdFAwDkyB4Etxk05LUYZMVY0VD0vJrs\ntU6lBzHOJR9FYqtoKDOjQbIfNFh+T5e9GkJT75euVjSMev802dY0alkIGoD45IXGVi9vGS4XFQ0A\nEJnsjmSbZWqzUNEwF2wtmj4DLJXfGHZxRoNkM2joSutE08FZV4OGcSoa2g4a0itPbN0axxU9ACz/\nu073SaxWNDg3qGqIIWigogEAAtn5AdZbJ2I5+xtKn9eKFcurCNoIGtKN+qigIeaKhmFBQ9uXBMwT\nY+uEhRkNtE602zohDSoaHn3U9kEAgIFwPyDdJ7E6o0Fq/opG4/KeigYAyJUt62+zH67rFQ3Z52S5\noiG2GQ1lh0FarGiIJTyzFjRMUtEQw8Fw2ddZar+igStPAPHJO/tutaJBsl/REM46I2gAgAAVDfUL\nr+oRiiVosHoWIUTrRP3Kzg6wFjSE4WkM4WSMrRMSV54AYpFX2Wjh8paxBg3jbvMnQdAAIErZigbr\nQUN4dtLyQVkofV5FB8BtBA2jNoYxz2iIeRhkzK0T6bpj375m+vW72joxauhmW60TeetkKhqA+Fit\naCg6AZKu86wGDWWv5jUNggYAUYq5oiGGgwYp/taJ2IKG2CoaYqnSGaekv4nXeRaGQcZy1QmJoAGI\nRd6lGC3PaEgrGqzui4xaZ1eBoAFAlKhoqF/sQUPsMxqafi5dbZ0gaGgGrRMA6pTXOtFWRUOZg/SY\nWieoaACAABUN9cs7eyANnmOTr/MsVjQ0/Vy62jpRdkaDZCvQ6XLQYOWqEx
in0NAWR4N+byMjfhTGXsyZYzo6cHI0cHF8+CBUR0No6J0dQFInfOAz\nLautY6zRu6c+hYamOkea5mioW2gA2iM0cGltSfmDPwCWLlXHN90E3HdfvefTD5fFIOfONe0+62Rg\nwNyTdQgN4mhoOQ89pD5u3Ag89VStp4KXXlJiA1Cf0GAr6Irj5AKJA1zqD9DdBk6OhjYJDXU5GtpW\nQG/mTDOX+RQa2hCcpeEzLatt13I3Oqjx1d5y+nT17GgiTXA0cBUaduyofxPNJVxaW1IOOQT4xCfM\n5x//eH3nkkVZoaFf6oTeMOVQn0GjXbq+NpZ06sTQkJ95W4QGxtDF2Lp19Z0HAGzbZo7rSp2wsWga\nGwPOPBNYvBj4yU+q/z5b6AmGw8NI7zyKo6EeQiwGGSo6fWLz5mTeokv0OA8PNzc4S0McDf6gjgaX\ngRwVGprK8uWmpsDjj9d7LmXhKjRMTvrbxa0DjkIDALzvfcap82//Bvzwh7WeTk/K1mjo5WjodMy6\nlkN9Bo1e0/p2NPja6BChgTHUVrZ2bX3nAdTT2hKwH3T99KdqLHfvBr7xjeq/zxZcHA2A2Xl86SW1\nEOBAm4SGENtbhopOn+h0lNjgAz3OTQ7O0vDpaGjjtUzRQsPkpNt6GG0QGqZNA449Vh2vW8enZlYR\nOBWDBNrTeYKmLnFJnQBUgPmpT5nPr7qKp7PEdo2GXbuMM5uT0KAdDb6LQfpK0xahgSlxnLzx63Y0\n1CU02A66aA93+p7qZHKSV0cFOgHv3FnfeVDoBNz0YpDiaPBHHQUh9Ti3ZYw1PgvNtrnrBOCvxWUb\nhAZA5bVrOFvNe8GpGCTQHqGBq6MBAC65BDjpJHV8553A979f7/mkUTZ1YnDQuAXps4C2tuQoNOzb\nZ4QQl4ijQQAw9WJbt65exZEG5YsW+ft/bQddGzaYYy41COhuAwehgWPniTY5GjgUg2xLcFaH0KCD\ns7YJDXRhJ44Gt/hqcdkWoeGKK1RrQAD43veUMzIkuKZOAM0WGrg6GgAVjH/60+bza6+t71x6UVZo\niCIz71NHA13PcqzRACTvVVeIo0EAMFVo2LbNn7U3jbocDTNmmPxIGnTt3FlOeKGOBi5BNBVQOATR\nPgOCvLRJaJD2lv6o09HQ9OCsG0md8Ic4GuwyMgJcfbX5nKvVvBehCg27dk3tGhASnB0NgOpop9cb\nHAudlq3RAJjNEvosoGt+To6GxYvN8ZYt7v8/cTQIANLtM3WmT9QlNESRmQj1pP2JT6hF62mnAXff\nXez3UUcDtVHVCbcgWhwN9cLB0dCW4ExSJ/zhsxikdJ0wx66Ehk7HjHPThQYAuPRS4Ljj1PGPfgT8\n4Af1nk8RQhQa7rkHWLIEOOYYP8GXC7gLDVEEHHaYOt66td5zSaNsjQbABNG9HA2chAbtlgKAp592\n///pMRFHQ8tJExrqLAhZl9AAmMByzx5VwPHTn1a7CfffD5xzDvCBD+Tb+e10klWjOQbRHB5G4mio\nF3E0+ENSJ/whjgZ/UKHB1RxCx7gNQsPQEPDf/pv5/GMfC8fVEGIxyK9/XQWazzyTHPeQ4Jw6odFC\nw44d/NwjZVMngPTUCa41GnwLDTp1QhwNLSftAdZGRwNggu+tW4HLLkt+L45VbtlJJ2W36Hn22eTE\ntWcPj4mVWxDN0dHQpmKQ9P2Jo8EtCxaY8fYhNExOmk4ubQjOKHW1t2xLvRGKD0cDtTW35Vp+29uA\nU09Vxz/7GXDzzfWeT164FYOka4xeQsMDD5jjr3wlzNai3B0NQNK2T9vYc8Bl6gSnGg1HHGGOn3nG\n/f8nqRMCAN6pE76VQB18j42ZB+Z73gN87nNm8nnmGeBNbwIefrj376H1GTQcduy5CQ3iaKiXgQET\n/IqjwS1RZFwNmza536Fs4xhr5s41lcDF0eAWERrcMDAA/Nmfmc//9m/rO5cihJY6EcdJoWFi
Arjm\nGuenZR36/ObqaKBCA7f0iSqpE1nFINvqaIhjKQYpHIQKDVp1evZZ4Lnn6jkfbTmaN09ZCH3SrQSf\nfjrwd38HfOhDwIMPAuefr76+Zw/w1rf2Ds5ofQYNhzoN3IpBcnQ0tEloAMw171No0K6RgYF2BWda\naBgddb/QosFZm8YYUNeVDjBczyv6Wh4cFEeDCA12+fVfN88gHzZnG4QmNDz//NS12de/Dvzyl27P\nyzb03uPqaNCpEwBvoaGso0GEhiQTE+ZYHA0thwoNZ59tjn/+c//nApgbtI6bk07Qhx4K3HijWdis\nXAn83/8LnHKK+nz9euB3fzd9ZzLN0cAhkOYWRIujoX5oXRJfaFFj9myz89wGli83x67TJ9q+065F\nTNfzSluvZY2P9pZtFRqiyKSPctioyENoQgN1M+ixjmNVCDwkQkudeP75+s4jDRtCA33mcq3RsGiR\ncRe4Tp3QbgZAHA2thwoNF1xgjusoCNnpmIVhHTfnSSepjwMDwDe/mSzgBiib+U03KWsuoISIz31u\n6u9JczRwExo4PIx8Fm3LS9uEBupo8FVwTI9xG8aXQvMjXVc3b1sBvW7082PHjvT0QFu09VrW+HY0\ncCgw6BOd3/3ii26vY1tooSGKeAicWUIDdS5ccw2wbJk6vvlm4N57nZ6aVUIoBsk5daLKHKOv807H\n1EWi6326zq2bgQGzDnHtaKAOD3E0tJxeQkMddRp27jTnU4fQ8JGPAJ/9rGop9drXpr/m2GOB6683\nn//JnwC33558jTga8sExdaJNxSABIzRMTPgrWEp3gduETwdPm1MnADO3dDpu04Laei1rfHSdqJI/\nHTpaaOh01PqIO/pvNWMGD4fPzJkqrQnIdjScdVbSyXDVVW7PzSb63hsa4jvfh5A6MThYPGWbBtFa\n4Nfr2blz/aeAZ6HTJ3bsSK53bSNCg/Af0F3MM84wAWgdQkPdlVrnzweuvBI499z+r3vjG4FPflId\ndzrARz9qvtfppAsNHKyP3ISGGTPMwpGboyGK2rF75rvFZRybMW5bcEZ316RIoVt8iJhtvpY1vh0N\nbbuW6TqIwxoiC+1o4JA2AajnuJ530+Zc7WiIIuVofe97VZosAPzgB+F0oNDP7jlzeAg8aYSQOlFm\nzUeDaB1c15kCnoWvzhOSOiH8B9pBMHu2umFWr1afb9zo/8HGtYBKGp/8JLBqlTq+917zENuyxUxa\n9GHLYceem9AAmL8zh/EBzBjNnKlsZk2HBkg+6jTs32/mHC7XoC/ytFqzhaROmGNXos6+fUaob9u1\nrPEhNISQf+4KERqqo4WG7jm301FFvgHg6KPVM394GPjt3zavefRRP+dYFX3vcb4/OKdOVBEaqPg5\nNqauK85Cg6+CkOJoEP4DvejXi+A1a8z3fBeEDEloGBwEXv96dRzHwA9/qI5pfYYzzjDHHAJpbl0n\nAH9F2/LStpxr346GNgcNPmuSiKPBHLsa6zZfy5oZM4w13ZXQEEL+uSuo0EBbf3OFu9BAHbxPPGHO\nVxf5BpToQF8TAiGkcM2fb+YKbkKDdk1VdTSMjqr5SsdVdTizs/AlNIijQfgP+gkNvgtCUsWe4w3a\nDa1pceut6iNNm3j5y80xB6GBs6Nh375kcFQXbRMafDsaOF6DvvApNNDq721IAeqGCtWu5t42X8ua\nKDJziA9HQ5uFhpAcDZzmHC00dDrJe5YWgqRCg06dAIAnn3R7bjaYmDA78pzvj4EB42rgmjpRxv3X\n7WjgvmHqK3VCHA3CFPRkTIUG33UauN+g3bz61Uah1UIDdTRQoYHDIoFb1wmAX+cJXRynLYGDOBr8\nkVUB3SZtHmdAHA0+0cGNOBrso1suAjzWEP0YH1dBL8DT0QAk511aCPLkk81xaI6GkOYhLTRs3eqv\ny1UebNVoGB3lH8fUkTohjgYBgFmYnXiiudnuv9/vOXC/
QbuZMwc480x1vH49sHlz0tGwerWpOMvN\n0cClowKnzhPj48ZVwWV8XEMXJj6EhjbvAvsU1docnAHiaPCJT6GBeyBlm5AcDbQ7SGhCA3U0HHGE\n2UAKwdEQotAwPs6ni8rkpAmKbRSDpPcpxzimjtQJcTQIAMwieGgIOPxwdfzcc37PITShAUimT9x2\nm3E0TJsGLF/Oq9ihXhiPjPhTGLPIU7Rt0yY/iyza6qctgQN9nz5SJ0JaFNlmeNgswF0LDW0eZ0Ac\nDT7RQsP+/WZH2yZtFs1CEhpoulYIQoNOnRgeVm3LNUNDwFFHqeMnnuC1855GSPcHbXHJJX2iauHk\nfqkTHFPAFy405yzFIAWv0IXZokXq40svJS8W14RWowFICg0/+IFxNKxcqR5Y+n1wWCRwrD+Q5Wj4\n8Y/VQ3/FCmDbNrfn0sYdSnE0+MVX8dOQFp8uEEeDP+j15WIOafO1LEJDddKEhtFR01Fi1aqpGy+6\nTsOuXTxSOvsRkuDJsfMEdeJUdTTccENyg5bjhmkUmToN0t5S8AqdjLXQAPitdByio+Hss40K+t3v\nmknruOPUR/0+9uzxK9qkoR9InBbFWY6GG25QH/fsAe64w+25tDFw8F0MMqRFkQt6tVqzTdvHWRwN\n/nDd4lKEBgV3oaFqwOaKNKFh/XplmQeSaRMaWqeBe/pESPcHR6GBOlnLCGQrVpjjz38e+MhHzOdc\n4xidPrFzp7sNJnE0CFOgCzM6GbjeRaZQoYE+HDgzfTpw7rnqmN6w2orno597XkJ0NNA8StfpJ21P\nnfBdDLItY0zR1/v+/W67rLQ9CPZR+6WNwmQaIjS4Y9YssxvIXWgIydHQqxCkhnae4F4QMqS5nmPq\nBJ1f5s4t/vNXXAF8+MOmrgfdyecuNADu0ifE0SBMIS11AqhHaJg71xRRDAGaPqHpdjQA9dZpmJw0\nCwFODyM6Pt0Pnjj2KzTQB3Ybi0H6bm/J6Tr0ha+CkG0OzgB1/+rFjTga3ELfuwuhQY/zwACvANYH\nUWRcDT7dpWXgKjTQzh3f+Ibaae1VCFITkqMhpHmIo6OBFqUsIzQMDQF/8RfAffclu/YByffLCR8t\nLsXRIEyBg9CgFftQ6jNo+gkNXKyPdBHAafftxBPN8b//e/J7W7YkgwTXQkMbgzNpb+kXX0JD28c5\nisxY+3A0tHGMNb4cDXPmqL9r29CBsjgaynHeeaowNwDcfTfwoQ+ZQpBA+I6GkNYtHIUGW+O3ejVw\nzz0qfeLII4F3vhM45pjq5+cCH44GaW8pTKFXjQZfQkOnYxbeXO1GvVizZmqqR1rqRJ2OBq4236OO\nMgHB2rXJCs901wFwP35Vle0QqdPRwOk69EWvCui2aXNLQI2ee304Gtp4LWt8FYNs63WsNysOHEgG\n89zgKjTMmAHceKOptH/ttapwN6Ce8zTo0oijwQ3cUyeqCjVDQ8B//a+qU9r11/MVRn2nToijQQBQ\nv6Nh1y4lNgDhCQ2Dg8D555vPdWtLQISGLKIIOP10dbx1q3IxaOiuA+B+R6eNQoM4Gvzi29EwOMir\nMJtP9Fjv3p1c9Nii7deyxqejoY1wcUVmwbUYJACccQbwpS+Zz/Vu68knpweDhx5qno3cHQ10HuJ+\nj9DYgoujoY3rPkmdEGqhl9DgazIIseMEhaZP6NaWAB+hgfPuG81rW7fOOVusywAAIABJREFUHIuj\nwT2+hQaugpcvfAsNs2fz3VVxDZ17XbhH2n4ta1wKDRMTZqecexDlilCEBq6OBs373gdcfnnya2lp\nE4CaM7WrYeNG06GCIyG516ZPN/cxF6EhpNQTW0gxSKEW6k6doA/Q0Go0AEmh4YQTzDGXRQLnfGIq\nNKxda45FaHDP8LBRm6W9pXt8F4Nsy8IpDddj3fZrWeNSaAhpt9YVXNYQWXAXGgDgb/4GOPNM8/mp\np/Z+ra7TMD4ObN7s
9ryqENo8pNMnmpg6EQoLFhjXkY8aDeJoEDB9uvqnqUNoCN3RsGoV8MEPKpHh\nyivN17k4GjjvvqU5GiYngYceSr5OUifcoBcnvh0NbensQfFVo4E6GtqK67m37deyxqXQ0MYgoBsR\nGuwxMqLqNZx/PvCrvwq86129XxtKnYbQ7hFdEHLnTrctnvPSxnVfFJn0CVepE+JoEBLQnR9AiQ56\ngSpCQ36++EVg/XrgVa8yXxOhIZtjjjEPSC00PP64Kn5FEUeDG/S97tPRcMghpu90m/DhaODaytY3\nvhwNM2eq1ottxWV7S3E0iNBgmyOPBG6/Hbjttv5roVA6T4TmaKCdJ3x2tetFaEKNLXT6xO7dybWv\nLcTRICToFhoA42oQoaEaIjRkMzCgWgMBSl3dunVqIUhAFZuiBads01ahQV8PPotBcrsGfeFDaJDg\nTEHH2qWjoa3XskYcDW6hQsP27fWdRxaci0GWgQoNnB0NoRX+pZ0nONRpaOsc47pOA12H+HL8idDA\nmO7WjIARGl58URVkck3oNRp6MXu2KQxZ524E52KQQDJ94uc/T9ZnoGqoj7x2IIydAVvo97p/v/ui\nVzo4a9P4UnwLDW0dZyAp8rp0NLR5jAG37S3bOidTFi40x+Jo8AdNneDsaKDtX0Mo/EsdDRzqNLR1\ng4l2nnAhNNQR04nQwJh+jgbAz8OtqY6GKDLvRxwNvekuCEmFhrPOMscur0X9wDnkEH85ZRyg14Pr\n9Im2Oxp81Gho6w5NN+Jo8IPL1Am5liV1oi5WrDDHITgaQhHiqNDAydEQRe2qtUMdDS7qNIjQICTI\nEhp8pE80VWgA+AkNHB9Ip59ujtetM6kT06cnq0S7HEMtNLRJ1QaS14NLoWFszOTtcbwGfUBFLHE0\nuMWlo0GuZcPQkAksRWiwTz+hYcMG4NZbgU7H7zml0TSh4ZBDgCVL1DF1NOzZA3z3u27dlUUIrcMQ\nV6Fh9ux21dpxnTohQoOQIEto8DEZtEFo2L07WSDFJ9wdDccfbxYnd92lFlAAcNJJyQeTCA32odeD\nyzoN3MUuH0SRmW9FaHCLS0eDjHESHeSI0GCf+fONJZ4u3rduVbWNXvMa4Gtfq+fcKE0TGgBTp+G5\n51RqYRwDb3sb8Na3AhddVO+5AWEW/qU1GjilTrRt3ec6dULXk5k+3d98IEIDY9JqNPiuDEsfoGnC\nR8hQNa8uFZy70DA4CJx2mjp+9lmzQ3PyyX4Kak5OmuChbQ8cukDxJTRwvAZ9oec3SZ1wi0tHg1zL\nSURocMfgoFmj0XXSnXcCe/eq4+99z/95ddO0YpBAsk7DU08BP/gBcMst6vN77pnaGcs3dB4K5f7g\n6mgIZfxs4St1wmfNPREaGMMpdWLuXFM8sSlw6DwRwsKY1mnQnHJKcvxc5ajSALttQoOvGg2yC6zQ\nQcOuXW6Kb8o4K1wW3pQxTqLHYNcutetrC+mgotCLdfr8W7/eHKd1afINdTRMn17fediEdp54/HHg\nYx9Lft9FgFaEEIulchIaxsfNddu2dd+8eaYmhW1HQxyL0CB0wUloaFraBMBDaODedQJI1mnQ+HI0\ntLXyMCCOBt/Q+daFqyHExacLhofNQsr2vCHXchItAnQ6dlsQi6NBoRfrL71kuoA9/LD5/oYNbls/\n50EHbDNmNCfXnToavvhF4Gc/S35/40a/59NNiILn/PlmM7Hu1Ik2C5lRBCxfro6ffNKuO2fPHiXi\nAMmuOa5pyLTTTOoWGjqd9ggNdVWNDmFh3MvRQBVRERrs46sYZIiLIhe4bnHZ5sVTN67qYci1nIRe\nZzbTJ0Q0U9DFun4GUkdDp5MUHupACw1Nqc8AJB0Nt9029ft1Cw0hCnFRZFwNdTsaQhw/m5x9tvo4\nNqZSsWxRRyFIQIQG1qTVaPApNOzaZXLyfV6UvvARKGehA8iREb6tG1etUuenmT8fWL
bMj1DTZqHB\nVzHIEFw1PnDtaJAg2OCq408Iwq1P6CLd5jXd9kBA0915Io6TQgNQf/oEdTQ0Bepo0NAWiHULDaHO\n9VRosJlqVZQ2r/sA4IILzPGtt9r7vSI0CFOo29HQ5I4TAI/UiRB6vk+bBvzKr5jPTz5Zqd8+xq/N\nC1pfjgbpOqGgwq4LR0Obr+Vu9LNtdNSutTzUBb4rVqwwx9328iqIo0HRLTQ899xU58gDD/g9p270\n/dUkR8Phh0/dmPnMZ8wxJ6EhpLled56YmKi3TWjbn5W/9mvmWIQGwSlpQsMhh5gHhmt7kwgN7glB\naACSdRpOOUV9POQQ43SQ1An7iKPBLz5TJ9ocnAHuWlyKoyHJr/6qOba5YNWBwIwZfJ14PugWGrrd\nDAAfR0OThIbBQeCoo8znr30t8O53m8/rFhpCFeK4FIRsu9CwZAnwspep4/vuS66DqyBCgzCFtNQJ\nwLgaXDsa6EXZdKGh7hoN3BfFZ51ljlevVh+pq0GEBvuIo8EvIjT4w1WLSxnjJGefbToN3HqrPTu0\nHuc2BgGUPEJDnY6GiQmV5w00S2gAgBNPNMd/9mdqfaDXCHULDaHOQ1yEhjav+zTa1dDpAD/6kZ3f\nuX27ORahQQDQO/jUk8ELL7hpw6ahwaPUaLBPp2P6bXMXGi65BHjXu4Df/m3gne80X9cBg9RosE8d\njoaQFkW28dl1ou0Bmg9HQ5uvZc306cC556rjp58GHnvMzu9ta4/7bvoJDbomwubN9dnQaVpS04SG\nq69WwdgXvgCceab6mnY5PP2027VxFqGnTgD1dp6QZ6WbOg3iaBASDAyoHeM0tKMhjt0GyJI64RYt\nMgD8F8UjI8B11wHf+EayqJQew/373bTxarPQUIejgbvg5RLXNRr04nNgoHmL/qL4cDS0+Vqm2F6w\nxrEIDZp+QsMb3mCO60qf0GkTQLOKQQLAGWeo6/mP/sh8TQsNExPAli31nBcgqRNVEaEBOO88045W\nhAbBCf36HfsqCNl0oWH2bJXrB9QjNDQhwKOTlYvgrM1Cgzga/OI6dUIvnmbP7i0itwVXYy3X8lRs\nCw3795vd4raPMX3+bd9uWlnOn58s6FZX+kSTHQ1p0LoNdaZPhOpo4CI0tHndp5k3T4lpAPDgg6rQ\nbFWo0EBb87pGhAam9FuI+hIaml6jgdYYqKNGQxOEBteukDY/cOgiXopBusdXjYa2B2eAu3mjCXOq\nbdasMW6d2283LavLIruNBrpY37RJWfYBVT9AF00GeDga2iY0bNpU33mE6miQ1AleUJH4ttuq/z5x\nNAgJuDkamlijATDvqw5HA/0////27jxMiur6G/j3zDDsICoqCiKoILgrIhoUcDduqIlJ3MA1MXmN\nScxuNO7GmGiMif5i3BONuC+RxC2Iu4gLYhB3xAUUUSHsAzP3/eNUeW/39FJVfavX7+d5+pnqmerq\nWz19u6tOnXturX6Ypl1Qs5EDDd26AZ0767KPaHY+HNeu0q7RwECDxYyG8mluBsaN0+XPPgNeeaW0\n7fEkwHKPi6ZNs8vDhuk00KFKZTS8845dboTvz2rMaKilz6GNNrLLlZwthZ8xync2WniM3tRU3s8D\nBhqqVLUFGuoxowGw+7VkCbB6dXmfe84cu+x+QdaStDMaGvkLR8ROcfTmm5k1PXxiRoNyhzT4zmho\nb7cBnUZ7H+fCjIby8nnA2sifydm6drWZAu7rMmyYZpEMGKD3//tffzN+xHHffXbZneq0XlVjoKGW\n+ki/fsDw4br8zDOVy2po5AtMrq98xU4h72PWoDDQsM46hc8xfWOgoUpFDTSkOY7KPQB0r0DVk7SK\nkkXhBhoGDy7vc/uS9swd4RdOly72A7eR7Lij/mxvL/1KZD7hyVlLS2O+xqGmJpti7vuzgFkjmdzv\nE5+ZUOEBfufONhuI/AYaavUkKi25sj3Dk7Vw+M
SiRTr7RDm1tQH336/L3bsD++xT3uevhGoJNITH\nLSK1N2TlsMP0pzHAP/9ZmTYwmKm6dQNGj9bluXOBd98tbXvhd225M9QZaKhShQINbsGWctRo6N0b\n6NQpveeppLRT/wtxPzQ23bS8z+1LuWo0NGpUe8QIu/zSS+k8R3jiwCvA9gTY99CJWh2zm5b+/W0h\n3tde87fdMKDD93KmYcOADTfU5SeeAFpbk2+LJwGZch20DxumPys5fOKZZ+zx4f7719+sE7msv75m\nmQCVCzQYY6eRHTCg9gr/HnqoXb733sq0IfyMaW6uvUCNb76CxK2t9liPgQYCUB3FIMMTx3qtzwCk\nf0W+kHrIaChXjYZGDTSEGQ1AeoGG8OSMJ8CZgQafqc68Cpypa1c7LOi11/xNjcs6GLmJ2APW5csz\n6wnExUBDpuzjo5YW+31eyYKQ7kmie/JYz0SAgQN1ee7cygxX+fBDe9zi/v9rxYgRGggGgEceSbcQ\ndT7h69e7d+0FanzzFWioVCFIgIGGqlXpGg3t7fbEu17rMwDpX5EvJMxoWGut2h2akubr587X3qiB\nhm23tVd+mdGQvnDoRFub3wOsWi0OlqYwiNbW5u9qLzMa8nOnWyzlgJXZOZmyD9qHDLEZoO6JZjkz\nGoyxgYbmZuCgg8r33JUWBhqWLatMkW/3/1yLgYamJhuYam0FHnyw/G0IP2MYyNTAT/g6TJmSfNYg\nBhqog0KBhh49bHpYWoGGJUvsG7pRAg3/93/Aj3+st1tvTfd516yx0y8NHly7Uds0M0KWLrXvwUYN\nNHTrZsf7zpoFrFzpd/vt7bbIJE8a0psNgVeBO/I9LIjv5cJ8XRnjezlT9nz04bCJcDkMFJczo+HV\nV+2FjHHjavdCRhKVrtPg/p/doTO1pNLDJxr9ApOrUyc7a9DChckDlm6gIfszK20MNFSpQoEGEZvV\nkFagoZLRr3JyAw3//jdw2WV6O+oonXM8LR9+qFfygNqtzwCkO3SClYdVeOV3zRr/V8XcmSx4cpbe\nFJfMaOjIHRb04oulb48FNwsbOBDYfHNdfu655MNVGGjIlH18FAaGAb0gNGSILr/2mn6Gl0MjDpsI\nVTrQUOsZDQAwdqzN7ps8ubSaLnG1ttoLKvx8UT6CxMxooA6KTT0SBhoWLkyeSlNII0xtCWg6ab4P\nsylT0ntetxBkrdZnALRQT1jd3XdGAwMNKs2CkJzaMlN4cAX4zWhgoKGj7bazmVw+3tec2rK4XXfV\nn2vW6JS5STDQkCn7oN3NaADsVe1Vq/R9/vnnekvjuC3kBhrGj0/veapRtQQamps7vhdqRUuLHW6z\neDEwdWr5npufLx0x0ECpiBpoaGvzXyEdaJxAw/rra3bBU0/p7Z577N/SGhMPZBaCrOWMBhH7ocVA\nQzrSLAjJq8CZOHSifHr0sAfir75a+lUzBnOKc6+2v/56sm2wsGmmYoEG96r2qFG6/rrr6nppTKk9\ndy7w8su6vNNOwMYb+3+OauYGGsLhqeWyZg0we7YuDx1a29NFV2r4hPtd2cjHfa4ttwQ22ECXn3gC\nWL06/jYYaKAOio3Zd6e4XLDA//M3SqAB0IPS0aP1Nn68PdlIM9BQLxkNgH1/+A408AtH+b7y62JG\nQ6a0Ag08Cc4tzNZZvVprkJSCGQ3FuSfBSQMNDJplyj5o32KLzPujRuV+3FtvAXfe6b897knhYYf5\n3361q2RGw1tv2YBprQ6bCO23nw2U3Hdfuhk4LvcCEz9flIgt5rt0KfD88/G3wUADdRA1owFIp05D\no9RoyCZiryB//DEwb146z1MvGQ2ADTQsX+63WCG/cFSvXnp1BABmzkwWzc6HGQ2Z0qrRwJOz3HzW\naWAwpzgGGvxzj4/69+/43tt3X+D884EDDtDbbrvZv6UxE0Uj12cA9H8QHj+XO9BQD4UgQz17Avvs\no8vz5gHTp5
fnefn5kps7fCLJsO6FC+0yAw0USdqBhkbKaMiW5pj4kJvR4Ebga1FaU1xy6IQVnpC1\ntmpRMV+Y0ZCJNRrKy+ewIGY0FLfZZnYWhFIDDc3NOitOo3MruLtDU0IiwJlnalG9yZMzh2f6DjR8\n8YWmVgNahDJXe+pdS4sGG4DyBxrqoRCky82IeeCB8jwnM1lzK7VOAzMaKDYGGtKT5pj4UJjR0L+/\nnaq0VqU1xSUDDZYb/PJRoT/EjIZMHDpRXttvb5dL/azla1xc584abACAN95Ilg4dngj06lW70zL7\nNHiwHR5x7LHF1+/bF+jXT5dffRUwxl9bpk61/9ODDmrc/0948WbhwsyZldJWb4GGMF0fAGbMKM9z\nMpM1t0GDbPbzs89qBnEcDDRQbOUcOsFAg19Ll9q6GrVenwFgRkM5pPWeZEZDJhaDLK+11rJTLr7y\nSmnT/zGjIZrwKveKFcmK5YXvZb6PlQjw9NPAJ58AEyZEe0x4EvrZZ/o4X9wrne4V0EZTqYKQ4dCJ\n7t3r49huk01s0NYdFpImflfmF/bp1lYtXh9HeE7Xq5edKa5cGGioUWkHGty0yrDaaaPYbDP74ZpG\noOG99+xyrddnADIDDW6AqlQMNFg77GCXfb4nmdGQKa0aDbzanl+YrbNyZfJ0foCvcVSl1mlgoKGj\n5ubMAt3FuOP3fZ7AhYGGTp2AMWP8bbfWDBxol8s1fGLZMuCdd3R5q62K11mrBSL2vfree5mfsWnh\n0In8Shk+ER6bV6LmXh10hcaUZqBh0SKbnr311pkH342gqcme2H3wgf/Xt55mnACY0VAOffrYoNSM\nGTqtrQ/MaMjkvs/Symjo0cPfduuBr4KQfC9HU0qgYc0azYQAGGgohZtW76tOw0cf2f/nzjs3drCt\nEjNPzJ5th8HUeiFIl/teLUdWA4dO5OcOZYkTaGhvt8fmDDRQZG703GfqHZA5zq9R0+/SLAjpzjhR\nD4EG1mgoj/CEbMUKHV/tA68CZ2ppsSepadRo6NWrPq50+eRrWBCzc6IpJdDgfl7wJCC5NAINbiX6\nRj1uC1Ui0FBv9RlCaWXf5MOhE/mttx6w7ba6/NJL0Y+3Fy+253QMNFBkvXrZgynfY9A4zi/dOg1u\nRkO9DZ3wGWhgCl2mNApC8uSsozCDK61AA2XyNSyIQbNoSgk08CTAjy23tIUafZ288bjNqkSNhnoN\nNKQRFCuEx32FhX3bGL0oHEUlC0ECDDTULBF7kjp3rr9UasB+YTU3A2PH+ttuLUkz0FBvGQ1p12ho\naan9mTl88JVi7mK6eUdhoMFnjQaOa89v3XXticHLLyebCQFgMcio+vSxsx4w0FAZ3bvb2T9mzUr+\nng8ZY4/bunUDdtmltO3VOrdGwzPPAG+/nf5zugGjeho64e5LOQINHDpRWJI6DQsX2mUGGiiW8CR1\n9Wodn+fDvHk61gwARo5s3I6+xRZ6MAD4nU4QsBkNnTsDG23kd9uVkHaNht69G3eaLpcbaJg+3c82\nmdHQUZ8++nPVKjsevRTG2NeZr3FuYbbOsmXAm28m2wYzGqILsxo++SRe5o4baOBrXJrwBG758sws\nxyTeegv48ENd3n13oEuX0rZX63r0sMHLOXP0qvzFF+uxclrCk/C+feurgHqa07HmwoyGwsaMscfD\n06ZFewwzGigx92p4qV9UIY7zU83Ndo73OXP8pVEbYzMaBg2qj/Haaddo4JeN6tvXTgX4wgt6Ilwq\nZjR05HuKy2XL7MEZT85y85FBxoyG6JIOn2BGgz8+i+xx2ERHN99sgw0rVwK//KVePEtjKMXChcDH\nH+vy1lvX34WRtKZjzcXNZG30gFkuvXrZbKjXXouWze4GGvr2TaddhdTBaU7jcsf3u+n4peAXluUe\n/L78sp9tLligVzCA+qjPAGjmRzgvr6+hE8Yw0JDLrrvqz9ZWP+/J8ORMxGbw
NDo3Q2fevNK3x5Oz\n4tzP2hdeSLaNMGjG93JxDDRUns+x7+5xm1uZvpHttpsGcH70I3tB55VX9L5vbqConuozhMo5fMId\nZlhvARtfwvfYihXRLjIzo4ES853R4I7z69rVntQ0qjTqNNRbfQZAvwzCkzNfGQ0rVuhUagADDa6v\nfMUuP/ts6dsLT8569KiP7Bof3OKE//536dtjSn9xI0fa5eeeS7aNMGjWsycPUItJGmjgrBP++Krm\n394OPPaYLvfpk/n51eh69gQuu0w/U8JMtf/8p/SaGNnqtRBkqJxTXIaBBh735Rf3/8FAAyXmO6Ph\n7beBDz7Q5d12YwG+tAMN9ZLRANgPL1+BBk5tmZsb/HvmmdK3x9oBHY0fb5fvvbf07fEqcHF9+wJD\nhujyiy8mGxbEmT2iGz7cLjOjoTKGDLGp4aVcJZ4xw37v7rGHDvukTCNH2sLmixfHL4JajPv/q6dC\nkKFyzTzhZrLy8yW/uBkmDDRQYoMG2WUfGQ1Mv8u05Zb2QMBXQUj3/1QvGQ2AzWhYvlzHQ5aKgYbc\ntt7ajj9/5pnSCzOFJ2cc024NHGiDjC+9VPqYXmY0RBNm67S2JgvsuhkNVNiAAXZ4CQMNldGpkw34\nvPlm8po7HO4aje8gfWj1auD++3W5U6f6DDS407GmGWhYtcoW7OTnS35xAz8MNFBi3boBG26oyz4y\nGlgIMlNLC7Dttrr85puZB1lJ1WtGgzuu/dNPS98eAw25NTcDo0bp8rx5NgMpCWN4FTifww6zy6Vm\nNTDQEI17IhB3WBDfy/E0NenMSgDwzjsa3ImCgQa/wpPStrbkV9kZaIjG97DD0OTJtkDi+PH1+fnj\nezrWfDjjRDSbb24vgkYZOsHpLakk4VXxjz+2RQaTaG+3gYa11rLTjTU6d/jEv/5V+vbqNaPBrWS7\n117A1KmlbY9fOPmVckLmWrXKViyux4OjUhx6qF0uNdDAk7NoSrniuHIl38txhXUa2tp02GQUnN7S\nr1JT0levBp58Upc32sgGj6ijESM04wDwG2i45hq7fNJJ/rZbbcKgWNQChEm4F5j4XZmfmw311lvF\ns4jDjIaWlspk/DHQUOPcq+LvvZd8OzNn2jfjuHEc5xc6/HC7fM45tkBhUmFGw9pra+GmevGtb+mH\nGKAffHvsoV+6SacHZEZDfr6uzHBqy/y22spOJfrEE6XNpsKMhmi22sq+Ps8+G29YEKe2jC9JQUgG\nzfwqtSDkzJn2AtOYMSyCWki3brZQ5uzZfupJffAB8OCDujxwILDPPqVvs1qVo04DP1+iC/8fUbKh\nwuOXddetzGcEAw01ztfME0y/y22ffYDdd9flN97QuZmTWr3ajveup2wGANh7b51u0b0qed11+tol\nGXvKyHZ+u+xil0sZa+qenPEEOJOIzWpoawMeeCD5tngVOJpShgUxmBNfkoKQPBHwq9STNzfQ7Aag\nKTf3NZo2rfTt3XCDHUZwwgn1fYGuHDNPMJM1ujgFIcNAg5t5XE4MNNQ4XzNP/POfdnnvvZNvp96I\nABdeaO+fc07yok3u2Lawwno92Wor4KmngCuvtAf7s2YBf/1r/G0xoyG/tde2VyNffllTGZNgRkNh\nvoZPcErA6NwTgThBNGY0xJcko4EBHb8GDLDfb0kCDW4fafTpyKPwWRCyrU0vpgB6nHjCCaVtr9rF\nnekgCV5gii5qkNIt0F6J+gwAAw01z0dGw8KFdpzfkCGZByCkV+W/+lVdnjs3c0xeHO4Xm3tVup40\nNQHf+56d1xsALrgAWLYs3nYYaCgsPCFbsyb5jCjMaChsl12ADTbQ5YceSl4Dhydn0SWtP8LXOL4h\nQ2wabdyMhu7d7Xh3Sk7EnsB98EHm914UYR/p1g3Ybju/batHPgtCPvqozVDdf39g441L21618zUd\nayHMmIouaoZJpWecABhoqHk+MhoeeMBe
aT/sMI7zy+WCCzKX4544A5lfbPV+9WHECOCII3R5wQLg\niiviPZ6BhsJ8XJlhRkNhzc3AIYfo8ooVwMMPJ9sOD56iSzosiBkN8XXtai9UvPaanVaukPC9zPex\nP+4Jw8svR3/c/Pm2LtfIkbZGEuW38cZA//66PG2aLSCbxLXX2uWTTy6tXbUgbgHCJDh0Irr+/aNl\nQzHQQCXbaCP7BZM0o+Gee+yymy5M1o47Al//ui5/8gnwpz/F30YYaOjSxRYlqmfnnacZDgBwySXx\nCkMy0FCYjyszvApcnDvN5XXXAY88orcnnoh+oMXXObo+fXTOdgCYMSN6Fglf42TCmhjLlgHTpxdf\nn4EG/9ygsZsJWEwjXbjwKXytli5NXmtgwQLgvvt0eYMNgIMO8tO2audjOtZCOHQiOhEbpPzww/zH\n1ww0UMmam4FBg3R5zpx4lboBPcAIr9T162cPPKgj98T5t78FFi2K/tgFC3S+cgDYaSegc2f/7as2\nw4YBEyfq8qJFwO9/H/2xDDQUNmyYnbXkmWfi93uAV4Gj2HNP+9o88ACw7756GzsWOOCAaK87T4Lj\nCU8E1qwBXngh2mP4Xk7GLfzsFoTOxRgbaOD72J84/wMXC0Em4yNI//e/2wyg445rnGwSN/vmoov8\nzNzhYkZDPO7/Y9as3Ot8/LFdZqCBEgvTH5cu1XoLcTz8sL0yN368PZGmjoYPByZM0OVFi4BLL43+\n2Ea9+nD22fZL+PLLNRskCgYaCmtqskHBBQuSDZviCXBxXbrYIUDZHnss3lVggCfBUSQ5EeB7OZk4\nJ7nLl9shlrza6E///sAWW+jytGmZ7+VCGqHmUxp8DDu88067XO9FIF3ua3fHHXpMfPvtyS505MJh\nhvEUK9BpDHD11fZ+pYrQ87SyDrgFIeOecLjV1Dlsojj3xPkPf9AjiCRPAAAgAElEQVSTvCjcL7RG\nuvqwySbAKafo8vLlGgWPIvzCaW4GevRIp221rtQrM7wKHM0f/qBBsrPO0tuRR9q/ueN08wlPHHr0\nqO/pz3xJUhCS9UaSGTTI1nl69tnCQ1U4e0p6woDPmjU6LMs1c6YO4frHP+zvVq2yRYA33xxYf/3y\ntLMe7LCDzShN8r05bx7w3HO6vM02wNCh/tpW7XbfXU9cw2DuggXAN7+pwfgoNV6K4dCJeIoVhHz4\nYVvof+hQ4MADy9OubAw0JCQiA0XkUhGZLSJLReQzEXleRH4iIt3K2Ra3IGScOg2rV9tpLXv1AvbY\nw2+76tGgQcB3vqPLy5YBv/lNtMc1akYDAJxxhlYpB4C//MVWai4k/MLp3ZvFSfNx30eTJsV/PK8C\nR7PWWsAPfqBDp847T6drDV+vW2/NDNjkwnTzeLbYQqdwBaIPC+IMKsmFJ7mtrTo9cT5vv22Xw2Fb\n5Ee+zBJjgKOP1gtCxx5rx8W//LKdZrvRjidK1aWLDl8F9D396afxHn///Xa5ES/OffvbWjx2/Hj7\nu7vuAq6/vvRtc+hEPIUyGozRY+/Q+edXbqYgBhoSEJGDAcwE8CMAQwF0A9AHwAgAlwB4WUQ2K1d7\nkmY0PPmkLSBy4IF26hoq7Fe/0umkAOCqq4qfOK9ebVOsBw/WWhiNpF8/PVED9GD2vPOKP8YNNFBu\nY8boPOyA1g+Ie3WGJ2fJ9OxpsxqWLgVuu63w+mFAh+/laJqabCr4p59GC54zaJZc1OETN95ol8eO\nTa05DWncOBtQd/8H06bZK5Xt7cCvf63LjXzhwoek0+gCmVnAbrHgRjJggBaRv+UW+7so2X3FuIEG\nfo4Xt/badhaVV1/NDMrffTfw0ku6vP32tph9JTDQEJOI7ABgEoBeAJYAOAPAVwDsBeAaAAbAEAAP\niEiSpO8mAGiLMe9O0owGDptIJvvE+fzzC68/Y4atg9GoBwU//amNUN94I/DGG4XXX7RoPoBz0L37\n/LSb
VrO6dLEHnoBGr+OMlXQDZEw3j8edzuyaa/KvZ4w9CU564DR//nycc845mD+/cfqCOyzIvYKY\nj1uUjO/lePbc0y7nCzQsWWKzpnr3Br7xjfTbla2e+8E66+jMVoAOlQiHZGafvN1xh548sBBkadzX\nbOrU6I9bvBiYMkWXBw7UE7hKqIa+IAIcdZROZQ5o4d4ZM0rbZniBqUsXXviMKhw+sWiRDusBdFaQ\nM8+061x4YWXr7zHQEN/l0AyGNQD2Mcb81hgzzRgz1RhzCoCfARBopsOPE2y/GYgXaEiS0WCMDTR0\n7gx89auRn46QeeJ8ww3Am2/mX5cHBRp5/elPdbmtTWtd5LNqFdDaOh/Auejatf4OKn067jgdowvo\nAVPUquW33GKn5+reXQ+aKLoRI4DtttPladPyz2Pto4De/Pnzce6559blCVY+++xjl3/5SzsePZeP\nP878LguzfCia9dYDtt1Wl196KXcl+UmTdKggoCcXlaibU+/9wM0seeyxzOCO68wzbc2nnj0z06cp\nmt12s2nk114bvYj6v/5laxEcemjlhnVWU1846SS7XGpWQ5jRwGET0bl1GsLjkJtvtsOsRo+u/Pkd\nAw0xiMhIALtDsxauNcY8n2O1ywDMhgYbfiAiqZf/Wntt2zGjZjQ8+ijwwQe6vNdeTOuNa511gJ/8\nRJfb2oDTT89fSMstBNmoGQ2AZoGst54u33Zb/ui3WxCIVycLa2kBzj3X3o+S1TBzZuYV+T/8ga9z\nXCLRDrDcWVaYChrdqFHAd7+ry6tWAV/7WuZ84K6LLrKfvaecwtc5ifAk15jcV3jd97f7vid/soew\nuMGd44+3weB//xv46CNdHjWKBWaTWH994MQTdXnJEp2uPApmAXd01FG2BtfNN2ceB8+YAey9t2Ze\n5jouue02DdpvvbXewnMSno9E5wYaTzxR73//+/Z3F11U+TpnDDTE43603JhrBWOMAfC34G4fAKmX\nWBSxWQ3vv6+Vi/NZtkxPkPff3/7OLepC0bknzpMna2Tx0Uc7rhdmNHTvbq8cNaKePbW+RchN7XJd\nd51dZtGx4r71LRvVnj7dZirksmgRcPjhwIoVev+EEzKDDhTd0UcDXbvq8t//bodHAXpQNWlSZmAx\nLHBI0Vx+ua3VMHeu1sXITvSbO1cLzAL6+eoWv6LoCg2fmDkTeD64pLL99jbFn/waPdrOaPWf/2QG\nd049FTjnnI6PaeQLF6U66yybnv/nP9vgTT6rVmlGA6AXmnbfPd321Qp3KNXixVoYEtBMswMO0Pfy\n+ed3nA7+6aeBY47RLKpZs/QWfr737Vu+9te6MLMS0KETs2bZ4Zr77ae1vCqNgYZ4dgt+LgNQIJkT\njzvLo9NrjhXWaWhrs1HBbA8/rCckl15q03l33hmYMKEcLaw/vXrpQW74ZfXuu5ryO3Givfr20Ud2\nLPzOO1eu6mu1+M53gI031uXJk/XLxvXII5kBiIMOKl/balVTE3DBBfb+mWd2PCEDtM8feyzwzjt6\nf8QI4MorKx/trlVrr20LLH3xhZ4I3HqrTkN38MF6YhyOte7dmwGduDp31jHp4dR9jzySWZME0Gye\nMJX5hz8ENtigvG2sF2PG2Cvj2YEG94T35JP5eZGWHj1s4ODdd21wZ4cdNLhz7LE6I4uLgYbk+vfX\nAA6gQWL3OzSXKVNsAeWDD+axnCu7ZtHq1TrtpTuy4+c/1yFBgAYhjjjCXhTt1k0vRPXsqZk7v/hF\n+dpe67bdVjMZeve2r2HPnsDw4cAVV1S6dYqBhniGQ4dNvG2MaS+w3utZj0ldoToNCxdqMGG//ezf\nunTRAiFPPWVnUKD4Dj9cr/i4Vbj/9jdg2DAdB8/q0Jm6ds2sz3DEEVq9GLBXLdudnhVOQ0WFHXyw\nptECGtEePz6z2ONbb2kK4wMP6P111gHuvNNekadk3DTy3/5W00iPPl
qDaKHDDtPpwEaXJeRcXwYM\n0PTa8CT4ootsIPf114GbbtLf9+ljh7JRfL17ayAc0EK94dXdFSs0WwfQ44SjjqpM+xqFO3wiFH7G\ndOrUsfB0mPFDyfziF3bY4LXX2iB8LuFxCsBhE9l23VVPbAGdze7II4EnntD7nTvrz/Z2DT7MmaMZ\nEGEQYtw4rc2wZIne5s5llnUcIvreXbzYvoZLlugxx9ChlW6dYqAhIhHpAiBM6Pmw0LrGmEXQrAcA\n2DjNdoVyzTxhjJ7sDh9uDxYA7dgzZ2qaaZiqR8kNHarR7r/+1dbKWLhQ08K+/W27XqMWgsw2cSKw\n1Va6PH++BmsOPzxzHDbTEuMR0RPd8Grj5Mn6Gl9xBfCb32gmU3g1oalJr7wPGlSx5taNMWPyVx7f\ncEOdYuruu+0UVBTfuHHAJZfY+2Egd8IEG5T82c84NKVUuaa5vOceHW4FaFCYQ9nSlR1oyA7ufO1r\n9jhir700YEzJ9e0L/Dgo2b5mTe7hKYBmCIZDErt1A/bdtyzNqxnZNYvC4RMtLXrcsd9+ev/TTzXV\n/8kn9X7//hpIZnZIfWOgITq3xNTSvGtZYaChLGXW3IyGn/1MAw8bb6wnu2FF3T59NPI1ZUr1RLrq\nRVOTpo/Nnp05X+0XX9hlXn1QnToBDz2UWQn3nntsZfnNNis+ZSh1NHasvo79+un9pUu1jsgZZ+j4\nUkCDCw8+yAMlX0S0ONvVVwN//KO93XijfhY06jzrvv3oR/rd5QZyp0/X5fXXB047rXJtqxfuSe5p\np+kxxCmn2N+xCGT6dt45szBvdnCnqUmHwN51V+4ZKSi+00+3AZtbbtH3ffZt8GA7DG6//WzxQ7Im\nTOh44fKKKzQwdsst9sJGWD+gpUWzKsOhcVS/GGiIzk0ybo2w/irozBNlGZjgBg6++ELTk9ziNkcc\noQe+J57IMZZp2nBDHVd8332ZVzG32IIFblz9++tV93/8wxbUBPRqwd13s3J8UuPHaz93M2kAPUA9\n/XTgv//NnDqQStevn77ep51mbxMncooun0T0uys7kAtoTZJKTLdYb3bd1Z5ALV6sxxDhScEWW+iU\ngJSulpbMYZi5gjs9emgGII8n/OjdW6fQBTQLeM6cjje37hmHTeTWt29mYP2447QmFwCsu64Gx9yh\nmldcwYtvjYIJK9E5NcXROcL6XaD1HFak05xMm22mY1RvuSWzENyAATom/pBDytEKCh1yiKb8nn22\nzkTBK/QdiehYvn331ZOFadO0INO222olYkqmTx+9wn700ZrN0KWLDqtgvQuqdWEg9/77tRDkkCEd\ng2qUTJcuOtXtRRfZWWkA/Tz50594gaJczjlHLxKNHcvgTrmceqoef4R1BfIZPVpneaLcLr5YgzKb\nbgpcdVXmZ8aOO+qQzV/9SocAhUEIqn9iik26TgC+rNGwAho8mGyMKXjqLiJLAHQH8JwxJnIZMBFZ\njSAAtJ57qTeP5uZmNHMiZaozra2t+PTTT7Heeuuhc+cocT2i+sS+QMR+QBRiX6Ak2tra0JZrSrIs\nCxcuRBAbWG2MKfkNxkBDDCKyAMC6AF4xxuSdTVpE+gD4HBqUuMMYEzkGKiJrADByQEREREREROXW\nZowpeeQDh07EMxvA7gA2F5GmAlNcDst6TByrYIddfB5h/TYAhabaJCIiIiIiosbUhGgXsteB1hhc\n5eNJGWiI5ylooKEHgBEApudZzynng6fjPIExhmWtiIiIiIiIqGZx1ol47nWWj8+1gogIgAnB3UUA\nHku7UURERERERETVgoGGGIwx0wE8CU0pOVFERuVY7ScAhkOHPlxujCleeYOIiIiIiIioTrAYZEwi\nsj10OEQ3AEsBXATNWugG4EgAJwervg5gpDFmWSXaSURERERERFQJDDQkICIHArgZQG9odoPLAHgD\nwIHGmDnlbhsRERERERFRJTHQkJ
CIbAzgBwAOBDAAQCuAtwHcDuBKY8zKCjaPiIiIiIiIqCIYaCAi\nIiIiIiIib1gMkoiIiIiIiIi8YaCBiIiIiIiIiLxhoIGIiIiIiIiIvEk10CAiI0TkLBF5SEQ+EJGV\nIrJERN4QketFZHTM7X1VRO52tvVBcH//CI/tLCKjRORUEfmbiLwuIm0i0i4ibTHasJ6IHCgi54rI\nv0Tk02Ab7SJyfZz9iaMa9j1BmweKyKUiMltElorIZyLyvIj8RES6RXj8ZiLyLRG5TESeEpFlzms9\nIa12+8Z+4E817HuCNifuB8FjTxKRv4rINBF5L+gHy0XkfRG5T0SOEZFOabXfJ/YFf6ph3xO0uZS+\nMNZ5XYvdfp3WPvjCvuBPNex7gjaX0hei9oP2tPejVOwH/lTDvidoc0nnCcE2NhCRC0XkRRFZJHp8\n9I6IXCsiI9JqO0VkjEnlBuAJAO3BrS3HLfzbTQBaimxLAFybZ3vh764uso0bnHXbsx7fFmO/2vNs\npw3A9Sm8jlWz7zHbfTCARXn+/+0AXgewWYHHjynyWk9I673LfsB+UEX94PwC7x13318BMKjS73f2\nBfaFFPvC2CJ9wb39utLvd/YF9oUU+0KUPuDeXqv0e579gP3Adz8ItnEIgMUFtrEawJmVfq838i3N\njIYNARgAHwH4I4CvA9gZwK4ATgfwYfD3Y6Fv8EIuAnBCsP6LAI4MtnUkgJeC358kIhcU2Y4Jbv8D\n8DiAj+PuVNZ23gfwMLSTp6Xa9r0oEdkBwCQAvQAsAXAGgK8A2AvANUE7hgB4QER65NuM0+42AP8F\n8DzSfa3TwH7gR7Xte1Ge+kE7gBkArgRwMoCDAIwEsGdw/+lgO9sAeCTqFYAKYV/wo9r2vShPfcF1\nAvQ9n+92ledd8I19wY9q2/eiPPWFQu/98HYp7Gt/Uxr74gH7gR/Vtu9F+egHIrI7gDsA9ASwEsDv\nAOwBYCcAxwB4AZq5f66IfCetfaEi0opgALgfwNcQTKGZ4+/rQKNVYRRqtzzrDQHQGqzzHIAuWX/v\nBj0BbQewCnmiXwCOADABwHDnd48hfqTybAAHAFgvuL+Jsw9eI5XVtu8x2v2406adc/z9x85rlvPK\nE4DNoV80uwPoHvxuovO4WsloYD8o/TWsqn2P0W4f/aApwvP8wdnOqZV8vxdpJ/tC6a9hVe17jHb7\n6AtjnXXGVPr9XOLrwb5Q+mtYVfseo90l94WIz/NcsJ01AAZU6r1epI3sB6W/hlW17zHa7eM74dVg\nnVYA43L8vROAh4J1FgFYt9Lv+Ua8VfbJgQOdN9Lleda5yllnZJ51Rjnr/CnG85fciVL+AKnqfc+z\n3ZFOe67Ms44AmBWs9xmA5ojbrrlAQ8T9Yj8ovO2q3vc8202tH+TYzgbOc92W5ns17Rv7QtFtV/W+\n59mul76AOgo0RHzd2BcKb7uq9z3PdsvyvQBgqPM8j6b9Xk3zxn5QdNtVve95tltyPwAwwtnGTQWe\na3NnPQ6hqMCt0rNOTHWWN8uzziHQFJrXjTHTc61gjJkG4A3oG3O8zwZWWC3u+6HO8o25VjDa+/8W\n3O0DTXVqZFOdZfaDjmpx38vZD5Y4y10TbqNaTHWW2Rc6qsV953dCMlOdZfaFjmpx38vVFyYWe54a\nMtVZZj/oqBb33Uc/2MlZfjDfExlj3gbwTnD367FaSV5UOtDQ2Vluy/6jiAwGsFFw9/Ei2wr/3l9E\nNvHQtoqq4X3fLfi5DDpWLB93n2JVFa5D7Ad51PC+l7MfHOksv55wG9WCfSGPGt53fickw76QRw3v\ne7n6wlHO89yd4PHVhP0gjxredx/9YF1n+ZMiz/cJNMiyjYj0itRC8qbSgYZxzvLsHH/f0lkudgDt\n/n140gZVkVrd9+HQ6Orbxpj2AutVU5srbZyzzH6QqVb3PdV+ICJ9RGR7EbkMtvBdK4D/i93S6jLO
\nWWZfyFSr+55GX7hIdKrXlSLyuYi8JDoN8pCSW1s9xjnL7AuZanXfUz8+EpFx0FR9A+AuY8zyuI2s\nMuOcZfaDTLW67z76wVJnea0iz+f+fcu8a1EqKhZoEBEB8HPnV7fnWG2As/xhkU1+4CxvnLRdVaTm\n9l1EugDoG9wt2GZjzCJoNBOoj/9XIuwHRdXcvqfVD0TkxnAubgCfQ6tJ/xBa8GgpgG8ZY94roekV\nxb5QVM3te4rfCbsG67RADyK3g/aF2SJyduIGVwn2haJqbt/LeHw0wVn+e8zHVhX2g6Jqbt899gM3\n6DS2wPOtB2AYNLCRazuUskpmNJwOnX4ljLq+nGMdN8VlaY6/u5Y5yz1LbFs1qMV9j9NmwLa7Hv5f\nSbEfFFaL+55WPzB5bpOglaLvi9nOasO+UFgt7rvvvjAPwJ+hw4VGQQuCHQbgemhGjwA4O8I0btWO\nfaGwWtz31I+PgumNvwZ933xojJkSvXlVif2gsFrcd1/94EnoBRcBcLyI5KvfcQGAZtjpRTl0oswq\nEmgQkbEAfhPc/QTA9/Ks6hY2ay2y2VXOcjXPJR9VLe57nDYD2m5Bffy/YmM/iKQW9z2tfnAG7Bzp\nowF8F5rVcCSAW0Vk8/hNrQ7sC5HU4r777AvPA9jEGPMDY8ztxpgXjDEzjDH3G2NOho77/V+w7i9E\nZNuSWl4h7AuR1OK+l+P46FDYE6laz2ZgPyiuFvfdSz8wxqyEBhEAfc8/ISLHiMg6ItIiItuIyM0A\nTkb17HtD6lTuJxSRraDFaToBWAHgCGPMwjyrr3SWO+dZJ9TFWV6RvIXpCvY/nznOeLqq2XcR2QjA\n2nn+/IUxZl6wHKfNgLbboIr/X2lhP2A/cETqB8aY+QDmO796TkSuAXAlgFOC+3sYY16N8JxVg32B\nfcGRty8YY4r1jxdE5FToCZYA+H8AvhPhOasG+wL7giPJ8VFdDJtgP2A/cBT6TrhcRIZCP+f7wc5S\n4VoAzYI7L7i/JMc6lKKyBhqCCqkPQd+MawB80xjzdIGHuG+IYqk+PZzlKOk4lVLoJGAcgCeC5Wra\n94uQ+QXmuhHACcFynDYDtt3V/P/yjv0AAPuBK3E/MMYYEfkBdK7xAdBikLsVflT1YF8AwL7gKvU7\nYRI08NYLBcbtViP2BQDsC65YfUFE+gHYG3pSNt0Y80aUx1Ub9gMA7Aeugv3AGPM9EXkQOsxmV9jz\n2mXQmh5nIHNmri8iPCd5VLahE0G061HoVCztAI43xjxQ5GFuoZABeddSboGPD/KuVXn5xllnV16t\npn3P1+bwpisZswpAGHUu2GYR6QP7AVLN/y+v2A++xH4AP/3AGLMaOo+0ANhVRDZMsp1yY1/4EvsC\nvPWFNgBvQvtC/yTbqAT2hS+xLyBxXzgGOhYdAG6K+Jiqwn7wJfYDRO8HwfC5cQB6A9gUwCAAaxtj\nTjLGLADgDqObFXmPyIuyZDSIyLoAHgEwGPqGO9UYc0uEh77mLA8rsq7791xT4FQFY0xz8bUAVNG+\nG2OOB3B8xNVnA9gdwOYi0lRg6pqa+H/5xH5gsR98yVebP3WWByJziEXVYV+w2Be+5KvNpvgq1YN9\nwWJf+FKSNh8T/FwNzeypKewHFvvBl2K1OQhizM3xp9HBz8+NMXMitpE8ST2jQUR6A3gYdt7Unxtj\n/hLlscEbIhzXUywNckzw8yNjTK43Wk2p4X1/KvjZA1oRPB93nwqlxdUF9oNkanjfy90P3Ku31ZwS\nyr6QUA3ve9n6gog0ARgKfV/NK7J6xbEvJFPD+55KXxCR7aBXbQ2AycaYmkoPZz9Ipob3vZzfCTvB\nfifclmQbVJpUAw3BVDv/ArAD9J98gTHm9zE3cx80DXKYiOyc53l2gZ0n9d7kLa46tbjvbhtyRjeD\nuZHDsVyLADyWdqMqif2gZLW472XrByLSHcBXg7srALydZDvl
wL5Qslrc93J+JxwJYK1g+fGE2ygL\n9oWS1eK+p9UXJjrLNTVsgv2gZLW47+X8Trgw+GkAXJ1wG1QKY0wqNwAt0IIu7QDaAFyacDtDoFOg\ntAGYBqBr1t+7Qqe9aodOYbJZjG0/FravhP3cxNnH6z2/hlW97wW2/bjTplE5/v5T5zU7K8Z2JzqP\nm5DWe9fza8F+UPprWNX7XmDbJfUDAOsCOLzIc3SBRunD7dzgez88vh7sC6W/hlW97wW2XWpf6ANg\nbJHn2Bk6r3o7tIjcjr73w+Prwb5Q+mtY1fteYNtej4+gFwznBY9ZAKA57fevx9eC/aD017Cq973A\ntkvuB9Civ2sVeI6LnW38MY33MG/Fb2nWaJgEYB9oFGkKgOuLTNnSaox5K/uXxpi3ROT3AH4BYCSA\np0XktwDeAbAZgJ/DRkIvMca8k2vjIrIBgP2zft3P+fvErL89aYx5N8d2RgNw56vv6yxvnr0dY0zi\n6HK17XsMP4CmOXUD8IiIXAT9wOoGveJ0crDeGwAuy7cREfkaMqvSutX0dwsinqGPjTEPldDmtLAf\ngP0AyfpBTwB3isjbAO6CHix8BP1i7gs9sToRWvwI0MJQvyihvWljXwD7ApL1hbUAPCYiM6FXw16E\n1iFpg9YkORg6Rr0zdN9/Z4x5qYT2po19AewLKOH4yLFf0F4D4B9GC6LWCvYDsB8geT8YCuAJEZkE\nLYj9LrT24FYAvg09RjLB81TzsVF9SyuCAY0ixbm9W2BbAuAa6EFFW9bjwt9dXaQ9Y2O2J+cVcwA3\nxNhGyVHAatr3mO0+EDqNTHabw3a/BmBwkW28F6PNU9J6L7MfsB9Uoh9Ar4Lk2t9c+/4kgEGVfr+z\nL7AvpNgX2vM81t1GK4AzK/1eZ19gX0irL+TY1iTncSMq/d5mP2A/KFc/gNZ3yPe9EO73JAA9K/1e\nb+RbmhkNxtf6Rt9RJ4vIXdAo1UhohHAhgOkA/mKMedhjm4qt52s7xTdQffsebSPGTBaRbaFRywOh\n09i0QseP3w7gSmPMyiKbaY/RHi/tTgH7AftB0n7wPoBRAPaAHgQMBrABNF1wafD3FwDcEXHfK419\ngX0haV+YB+Dr0HnSd4YWP+0LTQteDL3qNRXAtcaY9320N2XsC+wLpR4fQUR6QbN5DIDZxpgXfbSv\njNgP2A9K6QdvADgVwF4AtoEeHzVBs92eAnCTMaaqa/U0AtH3JxERERERERFR6VKf3pKIiIiIiIiI\nGgcDDURERERERETkDQMNREREREREROQNAw1ERERERERE5A0DDURERERERETkDQMNREREREREROQN\nAw1ERERERERE5A0DDURERERERETkDQMNREREREREROQNAw1ERERERERE5A0DDURERERERETkDQMN\nREREREREROQNAw1ERERERERE5A0DDURERERERETkDQMNREREREREROQNAw1ERERERERE5A0DDURE\nRERERETkDQMNREREREREROQNAw1ERERERERE5A0DDURERJSYiEwUkXYRaRORgZVuTxQiMjVo85RK\nt4WIiKgeMdBAREREjcYENyIiIkoBAw1ERESUiirPHJBKN4CIiKheMdBAREREaWHmABERUQNioIGI\niIjSxMwBIiKiBsNAAxERERERERF5w0ADERER5SUifUTkYhGZLSLLReQTEXlERL5e4DE3ikg7gLHB\nr8YFtRrc25w8j+0mIj8UkSki8rGIrAqe8yEROU5Eih67iMguInK7iMwXkRUi8q6IXC0iQyPucz8R\n+a6I3CEib4rIUhFZKSIfisi9IvINEcmZqSEidwX795mIdC7yPM3BPraLyANR2kZERFQLOlW6AURE\nRFSdRGQ4gEcBbAhba6ELgD0B7CUiNwB4IsdD3doMgtx1GtpzPN9IAPcA2CjrMX0B7A1gHwCniMgh\nxpgFedr8IwC/g15MCbex
CYCTARwlIt/IubP28U0APsrT7g0BHBLcThSRw4wxy7PWuRbAYQD6ADgU\nwO0Fnu4AAOsHz3NdoXYRERHVEjGGNZqIiIgok4j0AjALQP/gV5MA/A3AAgBDAZwOYCcALwAYCT1Z\nHmyMeV9ENgSwNoAbg3WmAzg+6ylajTFvO8+3DYBnAHQH8DzsGfkAAAd6SURBVD8Afw4e9wGAdaEn\n998B0ALgOQC7G2Pastp8GIC7grYsBnAxgMeDP+8J4GfB8gIAQwBMNcbsmbWNZgArATwG4EEArwL4\nFEAvAJtCAxa7Bqv/zRhzfNbjBcDc4HV72BjzVeQhIndDgxELAWxkjFmTb10iIqJawkADERERdSAi\nvwPwY+hJ+y+NMZdk/b0ZwGQA+wa/+jLQ4KzzGHT4RIcT+hzP9wqArQG8BGBfY8wXOdbZL3hOAfBt\nY8x1zt9aAMyBZh0sBrCLMebNrMdvBeBpAL2D9j6eq10isqkx5t0CbT0bwNnQrIwtjDHvZP39XABn\nAWgDMMgY81GObawH4ENodunlxpgf53s+IiKiWsMaDURERJQhOGk/HnoyPjM7yAAAQTbBiQBWe3i+\nAwFsE9ydmCvIEDznQwDuhAYajsv683jokAsAOC87yBA8fhaAC4u1p1CQIXA+NAtBoJkW2a6HvnZN\nACbk2cax0OwMALihWJuIiIhqCQMNRERElG0EgHWC5ZvyrRRcqX/Yw/OND36+YYx5rci6YU2IkVmF\nIfcOmwUd4pHPDchdMyInURuKyFAR2SrIitgSmo0AANtlP8YYMxda2yJXQCQU/v5FY8x/o7aHiIio\nFrAYJBEREWXbxlmeXmTd5wEcWOLz7RT8HBbMVhFFCzQYsjC4H7Z5jjHm83wPMsYsFJH3AAwqtHER\nOQbACQBGAeiWb3PQQpW5XAstXrm5iIw2xjztbHsEdJgIi0ASEVFdYkYDERERZVvHWc45u4PjEw/P\nF868EPfWPavNJkJ7wzbnm56yi4j8C5oVMRZA1wLPD+QPQtwLGwTJLoR5YvBzJYBbI7SXiIiopjCj\ngYiIiLK5J+HFhhnkPGGPqTn4+TSAU2I8zi2yGLYjyrCIQm0+E8D+wXamArgKWqDyY2PMii83IPI4\ngN3zbcsYs1pE/g7gRwCOEJHTjDHLRaQLgG8G27/bGPO/CO0lIiKqKQw0EBERUTZ36MEGAN7OtyI0\nG6FUnwXPs16EGg35fA496d8gwrphBkUuJwZ/e9IYs1eBbYQZFIVcCw009ATwdWiWxGHQqT8NWASS\niIjqFIdOEBERUbZXneWRRdYt9PeoRRdfDn4OFZGBER+TLWzzYBFZO99KItIXeeoziMg6APoFd+8o\nsI0eALYo1iBjzGwAz0IDIOHwiROCn3ONMVOKbYOIiKgWMdBARERE2V4EEE4xeWy+lUSkP4B9C2xn\nZfCzS5Hnu99Z/nnR1uX2aNgs5J9SEtAT/nxDJ9xMzx4FtnEyomeFXhv8HCMi4wDsCWYzEBFRnWOg\ngYiIiDIYY1qhJ8ICYHsR+Un2OiLSDOAa6OwP+cwPfm5a5CnvAjA7eL5TROSkQisH00welPXre4Pn\nEwBnicjQHI/bEsAZyJ9p8SmARcHykSLSYd9EZCSA8wpsI9ttAJYEy/+AHnsZFJg2lIiIqNYx0EBE\nRES5nAfgQ+iJ+yUicouI7CciO4jIN6FDAvYD8EKBbTwT/FxfRC4TkR1FZLPg9uUQCWNMO7RAYnhC\n/lcReUhEjhWRnYPn3E9EfiEiT0OHSYxxn8gYsxrA94O76wB4TkR+LiKjRGQXEfkltNikgdac6JDV\nYIwxAG4J/rYdgKdF5FsiMkJE9hSRSwE8DmAFgDdzbSPHNpcDmARbP8IAmGKMeb/YY4mIiGqV6Hcq\nERERUaYgA+ARaN2C7JNqA+BGAE8CuD64P9g9gQ5qGbwCYHCOx79njMnIdBCRrQHcCWBI+K
sczQoP\nXH5tjLkwR5tPB/Bb6MWU7McvhQY0fgadunKqMWbPrMf3BvAYgO3zPP9CAIcDOD/fNnK0aSSAaU7b\njzLG3FboMURERLWMGQ1ERESUUzADxFYALoFewV8JHV4wBcCRxphwhobwlv34ZQB2BfBHAK8BWFZk\n/f8C2BLAROhQiPeh2QOrAMyDBgAuADAiV5Ah2MZl0Gkn7wbwSdDm96C1EnYyxvw7XDVPG/4HYDSA\nswDMDJ5/SdD+SwBsb4x5qtA2cmxzOmwGxCIA9xR7DBERUS1jRgMRERFRikSkF4CPAXQFcJUx5vtF\nHkJERFTTmNFARERElK6jAHQLlq+vZEOIiIjKgRkNRERERCkJZueYBWAogGnGmF0r3CQiIqLURZ0D\nmoiIiIgiEJG1oTNfrAvgJ9AggwHwm0q2i4iIqFwYaCAiIiLy6zQAZzv3DYB/GmPur1B7iIiIyoqB\nBiIiIiL/DIA1AOYC+AeAiyvbHCIiovJhjQYiIiIiIiIi8oazThARERERERGRNww0EBEREREREZE3\nDDQQERERERERkTcMNBARERERERGRNww0EBEREREREZE3DDQQERERERERkTcMNBARERERERGRNww0\nEBEREREREZE3DDQQERERERERkTcMNBARERERERGRNww0EBEREREREZE3DDQQERERERERkTcMNBAR\nERERERGRNww0EBEREREREZE3DDQQERERERERkTcMNBARERERERGRNww0EBEREREREZE3DDQQERER\nERERkTcMNBARERERERGRN/8fLeuuH1J9kooAAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "metadata": { + "image/png": { + "height": 377, + "width": 525 + } + }, + "output_type": "display_data" + } + ], + "source": [ + "rides[:24*10].plot(x='dteday', y='cnt')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Dummy variables\n", + "Here we have some categorical variables like season, weather, month. To include these in our model, we'll need to make binary dummy variables. This is simple to do with Pandas thanks to `get_dummies()`." + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
yrholidaytemphumwindspeedcasualregisteredcntseason_1season_2...hr_21hr_22hr_23weekday_0weekday_1weekday_2weekday_3weekday_4weekday_5weekday_6
0000.240.810.03131610...0000000001
1000.220.800.08324010...0000000001
2000.220.800.05273210...0000000001
3000.240.750.03101310...0000000001
4000.240.750.001110...0000000001
\n", + "

5 rows × 59 columns

\n", + "
" + ], + "text/plain": [ + " yr holiday temp hum windspeed casual registered cnt season_1 \\\n", + "0 0 0 0.24 0.81 0.0 3 13 16 1 \n", + "1 0 0 0.22 0.80 0.0 8 32 40 1 \n", + "2 0 0 0.22 0.80 0.0 5 27 32 1 \n", + "3 0 0 0.24 0.75 0.0 3 10 13 1 \n", + "4 0 0 0.24 0.75 0.0 0 1 1 1 \n", + "\n", + " season_2 ... hr_21 hr_22 hr_23 weekday_0 weekday_1 weekday_2 \\\n", + "0 0 ... 0 0 0 0 0 0 \n", + "1 0 ... 0 0 0 0 0 0 \n", + "2 0 ... 0 0 0 0 0 0 \n", + "3 0 ... 0 0 0 0 0 0 \n", + "4 0 ... 0 0 0 0 0 0 \n", + "\n", + " weekday_3 weekday_4 weekday_5 weekday_6 \n", + "0 0 0 0 1 \n", + "1 0 0 0 1 \n", + "2 0 0 0 1 \n", + "3 0 0 0 1 \n", + "4 0 0 0 1 \n", + "\n", + "[5 rows x 59 columns]" + ] + }, + "execution_count": 18, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "dummy_fields = ['season', 'weathersit', 'mnth', 'hr', 'weekday']\n", + "for each in dummy_fields:\n", + " dummies = pd.get_dummies(rides[each], prefix=each, drop_first=False)\n", + " rides = pd.concat([rides, dummies], axis=1)\n", + "\n", + "fields_to_drop = ['instant', 'dteday', 'season', 'weathersit', \n", + " 'weekday', 'atemp', 'mnth', 'workingday', 'hr']\n", + "data = rides.drop(fields_to_drop, axis=1)\n", + "data.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Scaling target variables\n", + "To make training the network easier, we'll standardize each of the continuous variables. That is, we'll shift and scale the variables such that they have zero mean and a standard deviation of 1.\n", + "\n", + "The scaling factors are saved so we can go backwards when we use the network for predictions." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "quant_features = ['casual', 'registered', 'cnt', 'temp', 'hum', 'windspeed']\n", + "# Store scalings in a dictionary so we can convert back later\n", + "scaled_features = {}\n", + "for each in quant_features:\n", + " mean, std = data[each].mean(), data[each].std()\n", + " scaled_features[each] = [mean, std]\n", + " data.loc[:, each] = (data[each] - mean)/std" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Splitting the data into training, testing, and validation sets\n", + "\n", + "We'll save the last 21 days of the data to use as a test set after we've trained the network. We'll use this set to make predictions and compare them with the actual number of riders." + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Save the last 21 days \n", + "test_data = data[-21*24:]\n", + "data = data[:-21*24]\n", + "\n", + "# Separate the data into features and targets\n", + "target_fields = ['cnt', 'casual', 'registered']\n", + "features, targets = data.drop(target_fields, axis=1), data[target_fields]\n", + "test_features, test_targets = test_data.drop(target_fields, axis=1), test_data[target_fields]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We'll split the data into two sets, one for training and one for validating as the network is being trained. Since this is time series data, we'll train on historical data, then try to predict on future data (the validation set)." 
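The chronological split described here (no shuffling; validation rows come strictly after training rows) can be sketched on toy data. The 24-hours-per-day and 60-day hold-out numbers match the notebook; the integer "rows" are just stand-ins for feature rows in time order.

```python
import numpy as np

# Illustrative only: a chronological hold-out like the notebook's,
# on a fake range standing in for hourly rows (24 rows per day).
n_days, holdout_days = 90, 60
rows = np.arange(n_days * 24)        # stand-in for feature rows in time order

train = rows[:-holdout_days * 24]    # everything before the hold-out window
val = rows[-holdout_days * 24:]      # the final 60 "days"
```

Because the data is sliced rather than sampled, the first validation row immediately follows the last training row, which is what makes this a fair test for time series.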
+ ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Hold out the last 60 days of the remaining data as a validation set\n", + "train_features, train_targets = features[:-60*24], targets[:-60*24]\n", + "val_features, val_targets = features[-60*24:], targets[-60*24:]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Time to build the network\n", + "\n", + "Below you'll build your network. We've built out the structure and the backwards pass. You'll implement the forward pass through the network. You'll also set the hyperparameters: the learning rate, the number of hidden units, and the number of training passes.\n", + "\n", + "The network has two layers, a hidden layer and an output layer. The hidden layer will use the sigmoid function for activations. The output layer has only one node and is used for the regression; its output is the same as its input. That is, the activation function is $f(x)=x$. A function that takes the input signal and generates an output signal, taking the threshold into account, is called an activation function. We work through each layer of our network, calculating the outputs for each neuron. All of the outputs from one layer become inputs to the neurons on the next layer. This process is called *forward propagation*.\n", + "\n", + "We use the weights to propagate signals forward from the input to the output layers in a neural network. We also use the weights to propagate error backwards from the output back into the network to update our weights. This is called *backpropagation*.\n", + "\n", + "> **Hint:** You'll need the derivative of the output activation function ($f(x) = x$) for the backpropagation implementation. If you aren't familiar with calculus, this function is equivalent to the equation $y = x$. What is the slope of that equation? 
That is the derivative of $f(x)$.\n", + "\n", + "Below, you have these tasks:\n", + "1. Implement the sigmoid function to use as the activation function. Set `self.activation_function` in `__init__` to your sigmoid function.\n", + "2. Implement the forward pass in the `train` method.\n", + "3. Implement the backpropagation algorithm in the `train` method, including calculating the output error.\n", + "4. Implement the forward pass in the `run` method.\n", + " " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Shape of weight matrices\n", + "\n", + " Node j1 -> | w1|w2|w3|....wn\n", + " ---------------|--|-------\n", + " Node j2 -> | w1|w2|w3|....wn\n", + " ---------------|--|-------\n", + " Node j3 -> | w1|w2|w3|....wn\n", + " . ...\n", + " . ...\n", + " Node jm ...\n", + " \n" + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "class NeuralNetwork(object):\n", + " def __init__(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + " \n", + " #print(\"DEBUG: hidden units \",self.hidden_nodes)\n", + " \n", + "\n", + " # Initialize weights\n", + " self.weights_input_to_hidden = np.random.normal(0.0, self.hidden_nodes**-0.5, \n", + " (self.hidden_nodes, self.input_nodes)) # shape hidden_n x n_features\n", + "\n", + " self.weights_hidden_to_output = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.output_nodes, self.hidden_nodes)) # shape 1 x hidden_n\n", + " self.lr = learning_rate\n", + " \n", + " #### Set this to your implemented sigmoid function ####\n", + " # Activation function is the sigmoid function\n", + " self.activation_function = self._sigmoid\n", + " self.hidden_gradient_function = self._gradient_sigmoid\n", + " 
self.output_activation = lambda x: x\n", + " \n", + " def _sigmoid(self, x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " def _gradient_sigmoid(self, output):\n", + " return output * (1-output)\n", + " \n", + " \n", + " def train(self, inputs_list, targets_list):\n", + " # Convert inputs list to 2d array\n", + " inputs = np.array(inputs_list, ndmin=2).T\n", + " targets = np.array(targets_list, ndmin=2).T\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + " # TODO: Hidden layer\n", + " \n", + " # signal into hidden layer\n", + " hidden_inputs = np.dot(self.weights_input_to_hidden, inputs) # shape: hidden_n x n . nx1 -> hidden_n x 1\n", + " # signals from hidden layer\n", + " hidden_outputs = self.activation_function(hidden_inputs) # shape: hidden_n x 1\n", + " \n", + " # TODO: Output layer\n", + " \n", + " # signals into final output layer\n", + " final_inputs = np.dot(self.weights_hidden_to_output, hidden_outputs) # 1 x hidden_n . hidden_n x 1 -> 1x1\n", + " # signals from final output layer - same, h(f(x)) = f(x)\n", + " final_outputs = self.output_activation(final_inputs) # shape 1x1\n", + " \n", + " #### Implement the backward pass here ####\n", + " ### Backward pass ###\n", + " \n", + " # TODO: Output error\n", + " \n", + " # Output layer error is the difference between desired target and actual output.\n", + " output_errors = targets - final_outputs\n", + " \n", + " # since the gradient of f(x) = x is 1, the error and output error are the same \n", + " \n", + " # TODO: Backpropagated error\n", + " \n", + " # errors propagated to the hidden layer\n", + " # shape: 1xhidden_n . 
1x1\n", + " hidden_errors = np.multiply(self.weights_hidden_to_output, output_errors) # shape 1 x hidden_n\n", + " \n", + " # hidden layer gradients\n", + " # Calculate using sigmoid gradient of the hidden output\n", + " #shape hidden_nx1 \n", + " hidden_grad = self.hidden_gradient_function(hidden_outputs)\n", + " \n", + " # TODO: Update the weights\n", + " \n", + " # update hidden-to-output weights with gradient descent step \n", + " self.weights_hidden_to_output += np.dot(self.lr, output_errors).dot(hidden_outputs.T) #shape hidden_nx1\n", + "\n", + " \n", + " # update input-to-hidden weights with gradient descent step\n", + " # shape hidden_n x input_n += (hidden_nx1 * hidden_nx1) . (1x1 . inputs_nx1.T) \n", + " self.weights_input_to_hidden += np.multiply(hidden_grad, hidden_errors.T).dot(self.lr).dot(inputs.T) \n", + " \n", + " def run(self, inputs_list):\n", + " # Run a forward pass through the network\n", + " inputs = np.array(inputs_list, ndmin=2).T\n", + " \n", + " #### Implement the forward pass here ####\n", + " # TODO: Hidden layer\n", + " hidden_inputs = np.dot(self.weights_input_to_hidden, inputs)# signals into hidden layer\n", + " hidden_outputs = self.activation_function(hidden_inputs)# signals from hidden layer\n", + " \n", + " # TODO: Output layer\n", + " final_inputs = np.dot(self.weights_hidden_to_output, hidden_outputs)# signals into final output layer\n", + " final_outputs = self.output_activation(final_inputs)# signals from final output layer\n", + " return final_outputs" + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def MSE(y, Y):\n", + " return np.mean((y-Y)**2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training the network\n", + "\n", + "Here you'll set the hyperparameters for the network. 
The strategy here is to find hyperparameters such that the error on the training set is low, but you're not overfitting to the data. If you train the network too long or have too many hidden nodes, it can become overly specific to the training set and will fail to generalize to the validation set. That is, the loss on the validation set will start increasing as the training set loss drops.\n", + "\n", + "You'll also be using a method known as Stochastic Gradient Descent (SGD) to train the network. The idea is that for each training pass, you grab a random sample of the data instead of using the whole data set. You use many more training passes than with normal gradient descent, but each pass is much faster. This ends up training the network more efficiently. You'll learn more about SGD later.\n", + "\n", + "### Choose the number of epochs\n", + "This is the number of times the dataset will pass through the network, each time updating the weights. As the number of epochs increases, the network becomes better and better at predicting the targets in the training set. You'll need to choose enough epochs to train the network well but not too many or you'll be overfitting.\n", + "\n", + "### Choose the learning rate\n", + "This scales the size of weight updates. If this is too big, the weights tend to explode and the network fails to fit the data. A good choice to start at is 0.1. If the network has problems fitting the data, try reducing the learning rate. Note that the lower the learning rate, the smaller the steps are in the weight updates and the longer it takes for the neural network to converge.\n", + "\n", + "### Choose the number of hidden nodes\n", + "More hidden nodes give the model more capacity to fit the training data, but past a point the extra capacity hurts generalization. Try a few different numbers and see how it affects the performance. You can look at the losses dictionary for a metric of the network performance. 
If the number of hidden units is too low, then the model won't have enough space to learn, and if it is too high there are too many options for the direction that the learning can take. The trick here is to find the right balance in the number of hidden units you choose." + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress: 99.9% ... Training loss: 0.067 ... Validation loss: 0.162" + ] + } + ], + "source": [ + "import sys\n", + "\n", + "### Set the hyperparameters here ###\n", + "epochs = 1000\n", + "#learning_rate = 0.03\n", + "learning_rate = 0.07\n", + "N_i = train_features.shape[1]\n", + "hidden_nodes = int(N_i / 1.2)\n", + "output_nodes = 1\n", + "\n", + "\n", + "network = NeuralNetwork(N_i, hidden_nodes, output_nodes, learning_rate)\n", + "\n", + "losses = {'train':[], 'validation':[]}\n", + "for e in range(epochs):\n", + " # Go through a random batch of 128 records from the training data set\n", + " batch = np.random.choice(train_features.index, size=128)\n", + " for record, target in zip(train_features.loc[batch].values, \n", + " train_targets.loc[batch]['cnt']):\n", + " network.train(record, target)\n", + " \n", + " # Printing out the training progress\n", + " train_loss = MSE(network.run(train_features), train_targets['cnt'].values)\n", + " val_loss = MSE(network.run(val_features), val_targets['cnt'].values)\n", + " sys.stdout.write(\"\\rProgress: \" + str(100 * e/float(epochs))[:4] \\\n", + " + \"% ... Training loss: \" + str(train_loss)[:5] \\\n", + " + \" ... 
Validation loss: \" + str(val_loss)[:5])\n", + " \n", + " losses['train'].append(train_loss)\n", + " losses['validation'].append(val_loss)" + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "(0.0, 0.8)" + ] + }, + "execution_count": 25, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAABBwAAALJCAYAAAAXuDCjAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAewgAAHsIBbtB1PgAAIABJREFUeJzs3Xd4VNX2N/DvnjQCSUhIqKEIFgzSO9JRFOki8JMiSFGQ\nC/IiXEVQUAQULqiIqFxREUVsFxAQUZCiNDEgAgZEpSYhQAKBBFJn9vvHZE6mz5mTKWHy/TxPnpyZ\n2bPPPkn04axZa20hpQQRERERERERkSfp/L0AIiIiIiIiIgo8DDgQERERERERkccx4EBERERERERE\nHseAAxERERERERF5HAMORERERERERORxDDgQERERERERkccx4EBEREREREREHseAAxERERERERF5\nHAMORERERERERORxDDgQERERERERkccx4EBEREREREREHseAAxERERERERF5HAMORERERERERORx\nDDgQERERERERkccx4EBEREREREREHseAAxERERERERF5HAMORERERERERORxDDgQERERERERkccx\n4EBEREREREREHseAAxERERERERF5nM8CDkKI2kKIxUKI40KIbCFEhhDigBBimhAi3EPnqCOEeE0I\nkSiEuCqEyC86zx4hxItCiMqeOA8REREREREROSeklN4/iRB9AHwCIAqA9QkFgJMAekkp/ynBOR4D\n8B6AcDvnMJ3nCoBHpZTbtJ6HiIiIiIiIiFzzesBBCNEMwG4A5QBkA5gPYCeMgYFHATxRNPQkgJZS\nyhsaztEewC4YgwoGACsBbACQCqA2gJEA+hS9fhNAQynlGY2XREREREREREQu+CLgsAtARwAFADpK\nKQ9YvT4VwH9gzEp4WUo5R8M5NgLoVTTHBCnlcjtjFgF4pmjMMinl0+6eh4iIiIiIiIjU8WrAQQjR\nCsAvMN7kvyel/JedMQLAMQAJAK4CqCKl1Lt5ngwAMQDSpZRVHIyJApBZtJZDUspW7pyDiIiIiIiI\niNTzdtPI/mbHK+0NkMaIx6qih9EAumo4TyiMgYTTjgZIKa8DSDcbT0RERERERERe4u2AQ4ei7zcA\nHHQybpfZcXsN5/kTxv4MdR0NEEJEAogzG09EREREREREXuLtgEMCjJkHf0spDU7GnbB6j7veK/oe\nK4QY52DMLLPjdzWcg4iIiIiIiIhUCvbWxEKIMBgzCiSAZGdjpZSZQogbAMoDqKXhdB/CmBkxAsDb\nQogWMO5ScQHGXSqGA3i4aC1zpZQ7NJyDiIiIiIiIiFTyWsABQKTZcbaK8aaAQ4S7JyrKnhhVtFvF\nTABji77MbQcwX0q53d35iYiIiIiIiMg93iypKGd2nK9ifB6MfRjCtZxMCJEAYCSARjBmMlh/3Qtg\nrBCihpb5iYiIiIiIiEg9bwYccs2O1ewKEQZjYCDH3RMJIToC2AugD4zlG8MBVCs6by0A/wJwE8Cj\nAA4UBSeIiIiIiIiIyEu8WVKRZXaspkyiQtF3NeUXCiFEKIA1AKJg7NnQRkp52WxIKoD3hBA/AUgE\nUB3AxwBau3
meGygOilxR8RY9AGeNMomIiIiIiKhs0gEIUjGuEoyVAHlSygquBpc2Xgs4SCnzhBDp\nAGIB1HQ2VggRDWPAQQI47+apegCoUfTepVbBBvP1JAkhPoWxt0MLIUQjKeVRN84ThuI/iCpurpGI\niIiIiIhIqzB/L0ALb2Y4AMBxAB0B3CGE0DnZGvNuq/e4w7w84pCLsQdR3EzybgDuBBxk8WFlREcD\nISFARgZgUK7KAJTPAACUCy2H2AqxDidLTQVk0YxhYUBcnBsrIfKS/Px8XL58GZUrV0ZoqJpKKKJb\nD//OqSzg3zmVBfw7p1uZXq+HXq93Oe7yZeXzdOlsXGnl7YDDbhgDDhUAtADwq4Nxnc2O97h5jkKz\nY1fXE+LgfWpcAVAFqAzgEjZvBtq1A+rVA06fLhpR4SLw72oAgAfqP4BvHv3G4WTR0cC1a8bjLl2A\nLVvcXA2RFxw6dAgtWrTAli1b0Lx5c38vh8gr+HdOZQH/zqks4N85lQVVqlQxBR3UlPWXOt5sGgkA\n682OR9kbIIQQAEYUPcwEsMPNc5w2O+7oYqx5YOO0w1EqmIJR0jzOZCiOZxQa1Mcz5C0ZqyIiIiIi\nIiJyzKsBBynlrwB+hrHJxRghRBs7w6bBWBYhAbwppbTIKxFCdBZCGIq+PrTz/h9h3IFCAHhKCNHQ\n3lqEEA8BeLjoYYqU8rCmiypiKqOwDDgUJ1i4CjiYv48BByIiIiIiIgo03i6pAIDJMJZJhAPYKoSY\nD2MWQziAIQCeKBr3J4DXncxj97ZcSnlNCPEagDkw7lSxVwixFMBWAFcBVAXQH8beDbqieZ4r4TWZ\n9W0wf7L4x1mgL1A9FwMOREREREREFGi8HnCQUh4WQgwG8CmMAYH51kNgDDb0klLe0HiOuUKIGBiD\nGxUAPF/0ZX2efADPSynXaDmPObsZDnqWVBAREREREREB3u/hAACQUn4LoDGAN2AMLtyAMfvgVwDP\nAmgupXTWU0Fafbd3jqkAWgF4D8bdJ67D2BgyE0AijNkTDaWUb5ToYorY7+FgluFgYIYDERERERER\nlV2+KKkAAEgpz8PYr2Gam+/bBSBI5djfAPzL/dW5z25JBQRgCAJ0evZwICIiIiIiojLNJxkOgchu\nSQWgZDmwpIKIiIiIiIjKMgYcNLJbUgEoAQdXTSOFABCZCoRnOMiWICIiIiIiIrp1+aykIgAUhRiM\n1R0OgwRFjSNdZTjoKx8GhrUE9KHI3f83gBoeWiaRdtWrV8fs2bNRvXp1fy+FyGv4d05lAf/OqSzg\n3zmVBUFBSncBvT/XoZWQzOdXRQiRDCAeiAeQjHXrgP79gfh4IDXVbOC/KwMV0lEvph7+efofh/MF\nTboHhrgkAEDVlDFI++8Kr66fiIiIiIiIbi01a9ZESkoKAKRIKWv6ez3uYkmFRo57OKjLcJDh6cqx\nXpfjyaURERERERER+R0DDhrpHSW0qGwaKXXFPR6EZGULERERERERBRYGHDSyznAQwvSCuqaRCDIL\nOBRlRRAREREREREFCn60rpF1wEGnK8p6UNk0EsLsdQN/DUREROSeli1bIi0tzd/LICIiFapVq4bE\nxER/L8PneKerkXVJhU2Gg4EZDkREROQ9aWlppkZiREREpRIDDho5LqlQmeGgM4tYsIcDERERaaTT\n6bgtIBFRKXXhwgUYTDePZRDvdDWyV1JhfEFd00hzzHAgIiIirapXr47k5GR/L4OIiOww29ayTGLT\nSI2sg1TWJRWFhkJImz0zHU3GuA8REREREREFFgYcNDL1cLApqdAXZyusProazZY3w2dHP3M+GTMc\niIiIiIiIKMAw4KCRq5IKAJi5fSYOpx3GC9tfcDoXSyqIiIiIiIgo0DDgoJHjkori4MHF7IsAgEs3\nLrmYjCUVREREREREFFgYcNDIYUmFWfAgT58HALhRcAN6g9U+mmZUdnogIiIi
IiIiumUw4KCRmpIK\nc9n52c5m89i6SqtCQyEOph50GnghIiIiIiKiwMGAg0YOSyr09vsxZOVnOZ4LgX8T/ujXj6Ll+y3x\n1LdP+XspRERERERE5AMMOGhkneFgr6TC3PW86w7nkiLwAw7bT2+3+E5ERERERESBjQEHjRz3cHCQ\n4ZBXnOEgbZo2BH5JhV4af2AGGfjXSkRERERERAw4aOZ4lwrXGQ7W7y0LJRWmQAMDDkRERERERGUD\nAw4auds00ryHg3WGgywDAQdTs0hTpgMRERERBZYlS5ZAp9NBp9PhmWee8dl5r127ppy3UqVKPjtv\nadC0aVPl2o8cOeLv5RDZYMBBI4c9HBw0jbTMcLCMOJSFHg4sqSAiIqKy7OzZs8qNoae+5syZ4+/L\nskso/zAuG+f1J9M1l8Vrp1uD/Y/jySW9VYzAnZKKfH2B1auBfxNuCjRwW0wiIiIqy3hj6D3StlEa\nEfkZAw4aOS6pcN00Mic/z+K1MpHhYGCGAxEREZVdUVFRmDhxotMxBw4cwIEDByCEQI0aNfDwww87\nHd+6dWtPLtEj/J3dwIAOUenCgINGJdkWM7fQKuAQ4D0cpJSQMP6gGHAgIiKisigmJgZvvfWW0zEv\nv/wyDhw4AAC48847XY4vbSZPnozJkyf7/LwVK1aE3jr9mIhKBfZw0MjdkgrzppF51gEHEdg34eZB\nBjaNJCIiIiIiKhsYcNCoJE0jy1qGg3mQgRkOREREREREZQMDDhqVZFvMnIJci9cCvYeDeaNINo0k\nIiIi8qz+/fsrO1ds2LABAJCRkYFFixahffv2iI+PR0hICIKCgmzem5qaihUrVuCxxx5D06ZNERMT\ng9DQUFSqVAkNGzbEk08+iV27dqlah5ptMb/55htlzIABA5Tnt2zZgoEDB6Ju3boIDw9HlSpVcN99\n9+GDDz6AweD8Ayu122JGR0cr465fN34YeObMGTz77LNo1KgRKlasiKioKDRs2BDTpk1DWlqaqus2\n2bhxIx555BHUqlUL4eHhqFmzJrp164YVK1YgL8/4geOUKVOUNfijZCYtLQ1z585Fhw4dUL16dYSF\nhaFKlSpo3bo1Zs6ciVOnTqme6+jRo3jmmWfQunVrxMXFKX83d911F9q3b4/Jkydj8+bNuHnzpsM5\nrl+/jqVLl6JHjx6oVasWypcvj/Lly6N27dpo0aIFBg8ejPfffx/nzp3zxOWTH7CHg0bW/98rLqlw\nP8MBAR5wMM9qYIYDERERkWdZN0zctm0bhg8fjkuXLjltojhv3jzMnj1buaE3H3vt2jVkZmYiKSkJ\nK1asQO/evbF69WpERkaqXo+aMTk5ORg7dizWrFlj8XxGRgZ27NiBHTt2YNWqVfj2228RERFRovMK\nISzGrF69GuPHj8eNGzcsnk9KSlKue8OGDejUqZPTeXNycjBkyBAl2GM614ULF5CamoqdO3di+fLl\n+N///qd6rd7w1ltvYebMmbhx44bFGjIyMpCeno7ExEQsWrQIzz33nMstV6dOnYolS5bY/O1cu3YN\n165dw99//419+/Zh6dKlmDhxot3givnfqfkcAJCSkoLk5GT89ttv+PrrrxEREaEEiejWwoCDRqYe\nDlqaRtr0cAjwbTFZUkFERETkG7///jtee+015ObmIiYmBp06dULVqlWRkZFhk6mQnJwMKSWEELjz\nzjtx9913K59UZ2Zm4vDhw/jzzz8BAJs2bULv3r2xc+dOj90sSykxfPhwrFu3DiEhIWjXrh3uvPNO\nFBQUYM+ePcqn7bt378a4ceOwevVql/OpOacQAmvXrsWYMWMAAHfccQdat26NChUq4O+//8ZPP/0E\ng8GA69ev4+GHH8bx48dRpUoVu/MZDAb07dsXP/74o/JzqVKlCjp37oyKFSvizJkz2LVrFw4dOoQ+\nffrg3nvvdedH5DEvvPAC5s+frwRdwsPD
0aVLF8THxyM9PR07duzAtWvXUFhYiLlz5yIlJQUffPCB\n3bleeuklvPHGG8pcVatWRZs2bVC1alUAwJUrV5CUlIQTJ044/J38+eef6Nu3L/Ly8iCEQFhYGNq0\naYN69eqhXLlyyMrKwunTp3HkyBFkZ2d77edC3seAg0Zul1TkOWsaGdgZDhYlFWwaSUREROQ1r7zy\nCvR6PaZPn47Zs2cjNDRUeS0/P99ibKNGjbB8+XL07dvX4Q31wYMHMXr0aBw9ehS7d+/Gu+++iwkT\nJnhkrVu2bEFeXh66du2Kjz76CLVr17Z4fd68eXjxxRcBAJ9//jmef/55NGzYsETnNAUFnnrqKVSs\nWBEfffQR+vXrZzHm0KFD6NGjB9LT05GZmYmFCxdi0aJFdud74403LIINL7/8MmbMmAGdrrhyPTU1\nFcOHD8fOnTtx8uTJEq1fi++//x6vvvqqssZBgwbhvffeQ3R0tDImNzcXU6dOxbvvvgshBFauXImO\nHTvi8ccft5grJycHixYtUoINb7/9NsaNG2c3CJWeno5169ahoKDA5jVTUEwIgYceeggff/wxYmNj\nbcbp9Xrs2rULn376aQl/CuQv7OGgkcOSChVNI3MKrXo4BHjTSJZUEBEREXmflBJ6vR7//ve/MW/e\nPItgAwCbxxMmTMDYsWMdBhsAoEWLFti2bRsqVqwIAFi6dKnH1puXl4fmzZtjy5YtNsEGAJg5cybu\nv/9+5bGp7KKkTD+n7777zibYAADNmzfHm2++qYx1dN7c3FzMmzdPudmePn06XnjhBYtgAwDUqFED\nmzZtQkJCgk3QxxeeffZZ5bhHjx5Ys2aNRbABAMqVK4dly5ZhxIgRxi3tpcSMGTNQWFhoMe7QoUNK\nT4aePXti/PjxDjNe4uLi8MQTT9gNUO3evRuAMQDkKNgAAEFBQejWrRs+/PBD9RdMpQoDDhq5W1Jh\n3jSyrPVwsC6pUJPuRkRERETui46Oxssvv+zROStXroyePXtCSomTJ08iJSWlxHOaShsWL16MkBD7\nH9gBwOjRo5XjAwcOlPi8gPEmd9iwYWjTpo3DMYMGDVJ6RqSlpdm95rVr1yIzMxNSSsTFxWHWrFkO\n5ytfvjxee+015bp95ZdffsHRo0chpYROp8OyZcucnv+NN95AhQoVAAAXL17EunXrLF4376NQuXJl\nzesyzRMcHIyYmBjN81Dpx5IKjdwtqbhZcBOFhkIE64LtlFQE9qf+1jtTSEgI+L5RDhEREanXsiXg\nZpP+UqVaNSAx0d+r8C0hBPr27YuwsDC333vhwgX88ssvOHHiBDIzM3Hz5k2LD4n++OMP5fjw4cOI\nj48v8XpjY2PRuXNnp2OaNWumHJ89e7bE5zTd8A8cONDpuJCQENxzzz345ZdflHNbX/POnTsBGH/u\nAwYMcPlz79mzJypVqoQrV674LOiwfft2AMY1durUCbfddpvT8TExMRgwYAA++eQTAMCOHTswaNAg\n5fVatWopx5s3b8b58+ctnlOrVq1auHz5MgoKCrBixQo8+eSTbs9BtwYGHDRyd5cKAMjOz0Z0uegy\n18PBuozCIA3QCSbXEBERlWZpaYAHPsgmH2vRooVb43/77Tc899xz2L59u8vtJ03S09O1LM2CEEJV\nPwbzVPtr166V+LwmjRo1KvG5Dx8+rBw7y5YwCQoKQvPmzbFt2zaVqyy53377TTlW27Cyffv2SsDh\n0KFDFq81bNgQCQkJOH78OC5evIgmTZpg5MiR6NevH9q1a6c62DV48GAcOnQIUkqMHz8eGzZswJAh\nQ3DfffehWrVqKq+ObgUMOGhkneHgqqQCMPZxiC4XjVyrHg4I8B4O1o0i9QY9gnX80yMiIirNbvV/\n89/q69fKnTT3r7/+GkOHDkVhYaHNlpH2mDIesrKynI5Ty9QXwhlTuYWU0m7zQV+cG4Ddc1++fFk5\nVvsp
f82aNVWN8xTzNdapU0fVe8yzIOwFl1atWoUHH3wQV69eRWZmJpYsWYIlS5YgNDQUzZs3R6dO\nndCjRw907tzZ4d/UlClT8OOPPyrBl82bN2Pz5s0AgHr16qFjx47o2rUr+vXrp+p3RaUX7/o0su7h\noJRUOGgaCRQ3jizrJRVsHElERFT6lbVyhEARHh6uatz58+cxcuRI6PV6ZVvMcePGoX379qhbty4q\nVqxo0WRyypQpWLJkCQCozoRwxZe9DLxxbvPtGsuXL6/qPaa+EL5ivkZTbwZXTOOklHaDSy1atMCR\nI0cwZ84crFmzRjlHQUEB9u/fj/3792PhwoW47bbbMG/ePAwZMsRmjpCQEHz33XdYvnw5lixZgr/+\n+kt57dSpUzh16hQ+/vhjhIWF4YknnsD8+fN9/rMjz2Beu0aOSyocx3BMW2Pm6ctW00jrAAO3xiQi\nIiLyr6VLlyInJwcA0KFDB/z++++YMmUKWrdujcqVK9vsaOGprIZAYn4DbNq5wZUbN254azl2ma9R\n7blN44QQiIyMtDumRo0aeO+993Dp0iVs27YNs2fPRvfu3REREaFky5w5cwbDhg3DSy+9ZHcOnU6H\np556CidOnMCxY8fwzjvvYPjw4ahTp44yR35+Pt5++220b9/eInhCtw4GHDTSWlIB2AYcAr2Hg3WA\ngRkORERERP5laiYIAC+99JLL2ntPNGwMNHFxccpxcnKyqveoHecp5iU2586dU/WeM2fOKMfm12hP\nWFgYunbtilmzZmHLli3IyMjAN998g5YtWypZJPPmzbPIYLAnISEB48aNw8cff4zTp0/j6NGjmDBh\nghJ4OHbsGBYuXKhq/VS6MOCgkcOSCidNI01bY1qXVAR6hgNLKoiIiIhKl9TUVOXYVfPGvLw8HDhw\nwK8lEKVR06ZNlWPTbhbOGAwGmyaM3ma+y8fevXtVvcd8XPPmzd06X0hICHr37o3t27crvSAMBgO+\n/fZbt+Zp0KABli5dimeeeUbpHbJhwwa35qDSgQEHjbSUVJgyHKybRkoE9g24TUmFIbADLERERESl\nnU5XfBvgqhzg008/RVZWlsU2mQR06dIFgLHXwdq1a5Gfn+90/ObNm5GRkeHTwE23bt0AGNe4a9cu\nl5kqmZmZWLt2rc373VWhQgV07dpVeXzx4kVN8/Tt2xeAcf1a5yD/YsBBI4clFU6aRpp6OOSXsR4O\nLKkgIiIiKl3q1aunHDv75Dg5ORnPP/88sxvseOSRR5QdFC5fvoxXXnnF4dicnBw8//zzAODTwE2b\nNm3QuHFjAMZMg4kTJzod/8wzzyi9EqpWrYr+/ftbvH716lXV5z5//rxyXKVKFeVYr9fj+vXrquYw\nlYEIISzmoFsHAw4alSTDocz1cLDKaGDTSCIiIiL/6tOnj3L8wgsvYP369TZj9u7di06dOiEjI0P1\nDgdlSbly5TBz5kwAxiDC/PnzMW/ePOj1lv/WTU1NRa9evfDHH3+gXLlyPl/nwoULIYSAlBKbN2/G\n0KFDkZmZaTEmJycHEydOxMqVKwEYb/BfffVVBAdb3tusWrUK99xzD5YuXYqUlBS758vNzcVrr72G\nrVu3Ks/16NFDOc7OzkbNmjUxefJk7Nu3z+G6d+/ejeeee0553LNnT9XXTKUHt8XUyOr/I24FHGwz\nHAL7E3/rjAZmOBARERH51/jx47F06VIkJycjKysLAwYMQKNGjdCoUSMEBwfjyJEjOHz4MIQQuPfe\ne9G0aVMsW7bMb+tVk2HhjyyMZ555Bt999x127twJKSVefPFFvP322+jcuTOioqJw9uxZ7Nq1CwUF\nBWjcuDHatWuH5cuXA7Asa/GmBx54ADNmzMD8+fMhpcTnn3+ODRs2oGvXrqhRowYyMjKwfft2JQgh\nhMCoUaMwcuRIu/MdP34ckydPxuTJk1G3bl00atQIlStXhl6vR1paGv
bt24dr164pc02cOBEJCQkW\nc2RnZ2Pp0qVYunQpoqOj0axZM9SsWRPh4eG4fPkykpKS8Oeffyrja9eubRF8oFsHAw4aGQzF5RSA\ne00jbXo4BHqGA0sqiIiIiFTzRcp9ZGQkNm3ahN69eyufVB89ehRHjx4FAGV3gAceeACffvop5s6d\n69Hzu3uNasb7o8eETqfDpk2bMGTIEGzcuBGAsV/Bl19+qYwRQqB58+ZYu3YtFixYoDwfFRXls3W+\n8sorqFKlCmbMmIGbN28iJyfHopGj6fcdEhKCZ599FnPmzLE7j2nbS5MzZ87g9OnTFmNMcwUHB+OZ\nZ57Bq6++avF6UFAQKlSooPQOuXbtGnbs2GFzLtN52rZti88//xzR0dHaLp78igEHjawDDo4yHEJ0\nISgwFAAwK6kwlLEeDtYlFWwaSURERGSX6SbL3U/rTTd57mjcuDGOHDmCt956C9988w3+/vtv6PV6\nVKtWDU2bNsWwYcPw8MMP26xNzfrVjFG7XjXj1Y5x9+ekZnx4eDjWr1+PjRs3YuXKlThw4ADS09MR\nGxuLu+66C8OGDcNjjz2G0NBQXLlyRXmfp26g1V7TpEmTMGjQIKxYsQJbtmzBP//8g6tXryIqKgp1\n6tTBAw88gDFjxlj097A2ZswY9OzZE99//z327t2Lo0eP4vTp08jMzIQQAtHR0ahfvz46d+6MESNG\n4Pbbb7eZIyIiAhkZGdixYwd+/vlnJCYm4q+//sKlS5eQl5eHChUqoHbt2mjZsiUGDRpkUY5Btx7B\nbrPqCCGSAcQD8QCS0asXsGEDEBRkfL11a+DAAQBxx4GJDZT3xUfGIyXLGDXufVdvbByyEd1W9MGO\nlE3KGN3FptC/85vvLsbH9pzbgw4fdVAen3r6FOrG1PXjioiIiG59NWvWREpKCuLj45GcnOzv5RCR\nCg0bNkRSUhKEEDhx4gTuvPNOfy+JvKyk/682vR9AipSypscX6GVsGqmRXq8uw6FyhcrKseNdKgK7\nxMC6pIJNI4mIiIiorDl27BiSkpIAABUrVmSwgcoEBhw0ctzDwTLgEBVWXJuVrzfuzZunL2M9HAzs\n4UBEREREZZfBYMDTTz8NwFgCMWTIED+viMg3GHDQyPEuFZZNI8uHlIdOGH/Mpl4OthkOgR1w4C4V\nRERERBSopk2bhuXLlys7M1g7efIkevTogZ07dwIAwsLCMHnyZB+ukMh/2DRSo/x8dSUV4cHhCNGF\nIE+fhwK9MeBg2zQysG/AbUoq2DSSiIiIiALEP//8g9dffx1PP/00mjRpgvr16yMyMhJZWVn4448/\ncOTIERgMxn/vCyGwYMEC3HXXXX5eNZFvMOCgkXXAQSmp0FtmOJQLLoeQoKKAQ1GGQ4FVhgNLKoiI\niIiIbl1CCBQWFiIxMRGJiYk2rwkhEBERgcWLF2Ps2LF+WiWR7zHgoFGedZKCgwyHcsHlEKIzBiGU\nDAfrkgpdYAccWFJBRERERIHqgw8+wPr167Fjxw6cOHECly9fRnp6OqSUqFSpEho0aID7778fY8aM\nQaVKlfy9XCKfYsDBTUIYMxvy8iwzHBT2Ag5BRQEHUw8Hg2XTyEDv4cBdKoiIiIgoUFWqVAmjR4/G\n6NGj/b0UolKHTSPdZMpkcNzDwU5JhVWGQ5nbFpMlFURERERERGWOzwIOQojaQojFQojjQohsIUSG\nEOKAEGL/NI8VAAAgAElEQVSaECK8BPPWEUIY3Pw6VdLrsVdSIQRUZjiU7V0q2DSSiIiIiIgo8Pmk\npEII0QfAJwCiAJjyAsIBtADQEsBYIUQvKeU/Gk9hr7jBmRMaz+Mww6F4JTrjV1HWQnhwOIJ1xh+z\nKcOhwDrgEOA9HKxLKJjhQEREREREFPi8HnAQQjQD8DmAcgCyAMwHsBPGgMOjAJ4AcCeATUKIllLK\nG26eIgVAIxXjZgAYCmNw4mM3z6
EwBRyseziYMhykBIQhGDIoH4BVSYWhAHqDHoWy0GrSwL4BZ0kF\nERERERFR2eOLDIc3YQwuFADoLqU8YPbaTiHEXwD+A+AuAFMBzHFncillIYAkZ2OEEDoAXYoeZgH4\nxp1zWM5l/O6wpAKAkMGQMAs4BBX3cLDZoQIAhN4YqBC2LwUCm5IKNo0kIiIiIiIKeF7t4SCEaAWg\nI4xZBSusgg0mrwM4DkAAmCyECPLCUu4HUKNoHV9JKXNdjHfJYUkFACGLG0daZzjkFdoJOOj0DucK\nBCypICIiIiIiKnu83TSyv9nxSnsDpJQSwKqih9EAunphHSPMjlc5HKWCKQvBYAAKCy2fL96pojhx\nxDzDwSANyCnMsTNpgAccrEoq2DSSiIiIiIgo8Hk74NCh6PsNAAedjNtldtzekwsQQkTAGPiQAM5K\nKX8u2XzFx7m5ls8Xl1TYz3AAgOz8bDuTGgI64GCd0cAMByIiIiIiosDn7YBDAow3+n9L6fQu03zX\niAQPr2EggPJFx5qbRdpj3cfBRMjiDIfwkHAlwwFwEHBgSQUREREREREFGK8FHIQQYQDiih4mOxsr\npcyEMQsCAGp5eCnm5RSflHQy8wwH84CDRYaDdUmFywyHAA84WJdUsGkkERERERFRwPNmhkOk2bGd\nu2wbpoBDhKcWIISoBaAzjFkWe6WUp0o+Z/Gx4wwHq5IKlxkOBhgMgRtxYEkFERERERFR2ePNgEM5\ns+N8FePzYNypItyDa3isaE7AQ+UUano42DSNNMtwyMrLsjuvIYBTHKwzGtg0koiIiIiIKPAFux6i\nmfnWk6EqxofBmIlgZxsHzYYXfc8D8KUnJjQY8gEcAgAcOVL8fFZW8TaZMq3AGOaIVJnhAKBQr4f3\nW2r4h3WAgRkORERERERUll24cAEXLlxwOS4/X81n96WXNwMO5h/lqymTqFD0XU35hUtCiFYA7oYx\niLFBSnndE/Pm5l4G0AIAMGZM8fP79hUf538GYyFHV5U9HAAU6PUAQuy+dqtj00giIiIiIqJiy5cv\nx8svv+zvZXid1wIOUso8IUQ6gFgANZ2NFUJEwxhwkADOe2gJ5s0iV3loToSHV0ZOzhYAwJIlwOTJ\nxufbtQMOHTL2dQh7dATy4v8wjg9WsUsFAL0hcG/CrQMMbBpJRERERERl2bhx49C3b1+X43r06IHL\nly/7YEXe4c0MBwA4DqAjgDuEEDonW2PebfWeEhFCBAP4v6KHlwBsKemcJkFBoQCaAwBqme2nER0N\nBAUVjYmLVlpmqs1wKAzgvgYsqSAiIiIiIipWvXp1VK9e3eW40FA13QlKL283Ddhd9L0CTHUI9nU2\nO97jgfP2gnFLTglgtZNAh9tU7VJhCFOOw0PCLQIONwpu2HtLUQ+HwMSmkURERES3piFDhkCn00Gn\n0+HLL+23RFu+fLkyZsKECR45b15enjJn+fLlPTKntzz//PPKWhcuXOjv5fjErfT7If/ydsBhvdnx\nKHsDhBACxeUPmQB2eOC8XimnANTtUlHx3FAEiSAMvmcwyoeUtyipuJFvP+AQyDfh3BaTiIiIyrpp\n06YpN2j16tXTPM/Vq1cRFhamzPXJJ594cJWOCfN/BJdgjDfOW1rcSmv1lLJ4zeQerwYcpJS/AvgZ\nxj0bxggh2tgZNg1AAozZCG9KaflxuBCisxDCUPT1oatzCiFiYMxwkACOSimPuHiLWxxlOJg/H3N6\nDK4+dxVfDPwCACxLKgrKXg8HllQQERFRWff4448DMN6gnT17Fj/99JOmedasWYOCggIIIRAREYFH\nHnnEg6ssfaSPt44vi9kKJeHr3w/denyxD+NkGLe6DAGwVQgxXQjRRgjRRQixHMCConF/AnjdyTxq\n/5ofRfE2nCs1rFc165IKU9BBSiAyLFJ5Xk2GQ0D3cLAuqWDTSCIiIipjGjZsiGbNmik3aKtWaUvC
PwJX2lIBOBMO5WY4FIvA8LD5eu5c72Nvukn7\nfsstwLPP+l9XFUdLxUwqOERtqSgnm8oilUihUCrULOGgtlRkU1nL94nCBKSUEH5LsdQp+8oUXKmC\niIiIiCg8JhymWdSCQ9CWimzWf+nNQkEbNqnrcq6w6PDcc+HaK+JoqZhJMxw6mzuRFNZfbtCEgx8h\nBNozWkRlOoZG6itn6OkLCVl29kFJlvDywMthP2rVMeFARERERFQ5FhymmV/Bwa+lQk04qG0Q9oJD\nMum8rqpQCJ5wUKnn6O69F7j7bud+tlRYJUTCsTSm17KYYbVnpwoOcbRUhFwW095SEeQab/vvt2HZ\njcvwhd99IcxHrTp7gYEJByIiIiKi8FhwmGZxJxwGBpwFB/tymyp7wcGecEh4/BcyaEvsP/YYcMYZ\nwNlnA9/8pvMeqtk+NBJwznGII+EAmEtnDk4Mhl7KsdKWCnvCAfCfA5Ev5nHXn+4CANz53J2hPmu1\nMeFARERERFQ5FhymWbVbKlKp8gkHtaWiw/bcqyYpVOo5APDUU+b2Bz4A7Nljvo6ScJjJMxwA5xyH\nuAoOektFvpQPvUpEpS0V+uwGNeHgdw31vSgtINXEGQ5ERERERJVjwWGaqQUHexLB3lKhFhaCDo1M\nJMIlHFpbrdcOWnDI257HPvQhc8nPOBIOM2mGA+AsOMQxNBIwWyqAcA/xUkpHwiFIS4Ul4TDVUqEX\nHspdw1JwiNACUk32hEO5WRREREREROTEgsM081sW055wUOcreC2LOThoPuDr54dJONgLDl7FCnvB\nYdT6rIq77gL+8AdtmzMcnKqVcNBbKgAYK1V89fGv4i3ffwue6n3K6zRMFCdQkiXLviAJB8sMh5At\nFXWdcLDPcGBLBRERERFRaCw4TLMwLRVeBQevGQ76+WESDnPmREs42AsOAPDy1OIDUVapmOktFeoM\nh1w6h3Qy7XN0cHpLBaClBh7a/RA+vPHD2LhtIz734OeM9/YN78Pp3zkdf3XHXyFfzDsGRgLxtFQE\nTThMFid9ixO15pjhwJYKIiIiIqLQWHCYZmFWqVi1ytxeuNDc9mqpiCPhUEnBYWzMvIf9nuXMpoRD\nXOkGwFZwmBjAP9zzD8brnzz/E2N7w10b8OCuB3Hnc3fiFy/+wtFOAUQYGum2SkXAhIP+eeuFY4YD\nEw5ERERERKH5LLxItRCmpeLd7wYWLNCOO/NMc7/XsphBEg7FYvUSDvq+KAkHFhyiUVsqvrPlO3hs\nz2PGaz1VUSgV8LMXfmbs3/TKJhw/73jHtYIkDsq2VARMOABaIsPeajJdmHAgIiIiIqocCw51pFzC\noa0N+OpXnee1tprbURIO1So4jI1pgyNL1tEAkRIOM3loZGdzPAMjAevQyO9v/b7lvcNjh1GSJdy/\n837L/t7h3ugJh/wMTThwhgMRERERUcXYUlFHyiUcvJIKTU3mueoMB71gUYuWihHnCACMjrqnGYpF\ncwULLzN+hkPOnOFQrYSDXVEWcWj0EH649YeW/S8cfCH6DIeCc4aDukpF0GUxgfpaqYIJByIiIiKi\nyrHgUEeiFhwAc46D2yoVlQyNjLpKhb7Pq33Cnnqwm+ktFSs7V+LEhScCAC487sLYrqvOcHCza2AX\n7nzuTsu+5w8+j5G8s+AQelnMSlsq6inhYCswcFlMIiIiIqLw2FJRR8q1VPgVDtrbgQMHtILDnDna\nvihDI3O5eFsqvNonikVnQUU10wsOCZHApis3Ye/QXhzVcVRs11VbKnRtmTZjicwfbv2h48H+yPgR\n7Ozf6Tgv9LKYlbZU1HPCgS0VREREREShMeFQR8olHLwe/gFrwiHqspi5HJBIWI/3uufgoPV12IRD\nuTkO9paKcjMcJieBjRuBgwf9j6sn6WQ61mID4N5Scdaqs4zte3fea2yrrRxP9j7pOC/yspgREw56\nUaQeOGY4sKWC
iIiIiCg0Fhym2S23aN9bWoBLLrG+F6Wlolg00wdhh0bqyYg4Ew5RCw5hEw6f+hTw\nlrcAp59efj7ETObWUnHmSnNJk6f3P21sn3fsecZ25IKDW0vFDBgayYQD0czyrae+hc8/+HlLkZSI\niIiqjy0V0+zyy4Fly4CVK4EO2+zAsC0VOnvBoVzCQT9eX+0izmUx/Voq/IQtODz6qPb92We1AZZ6\n8WS2sbdULGtfhuO6jjNel6Q5POPco8/F7U/fDsC94DBRmICUEkIIz/u5rVKhDo0MuyxmvbAnGphw\nIGpcW/dvxRU/uwIA0NPag8tPunyaPxEREdHswYLDNEsmgbPP9n5PFSThoAqySkWx6J9wCDs0MpEw\nB0LWMuGgtlxMTMzegkNrU6vl9ZoFa7BwzkLHcbl0DqcddZrxWsIZC5GQyJfyaEp69/LoRYOkSCKd\nTAOwtlSEWqWCCQciqoLdA7uN7V39u6bxkxAREc0+bKmoY5UWHIIkHEZHzYf5OBIOXV3WfV4Fh3IJ\nh7AzHOwFh9kqmbD+R7NmvnvBYdXcVVjcuhi5dM7xXiph1iHLtVXoLRVqqmEmtFRwhgPRzKEWELni\nDBERUW2x4FDHwrRU+BUc7AkH9XV/v7kdxwyHefPMfX6rVFQz4TDTVrSoxJoFa9CWabMUBABtWU4h\nBI6fd7zjnLnNc41tv4IBYLZU6PMbgAqWxayjlgr7QwkfUogaV1EWXbeJiIio+lhwqGP2hIO9AKFq\nd84K9Ew4qAWHI0fM7bAFB3044+SkWUCYaz6rVpRwCFtwUN+fzQkHuzUL1kAIgQW5BZb9qzpXAQBW\nd692nKMWHMolHPT39fkNwAxJONhnOLClgqhhFUvm/+GweEhERFRbLDjUMb8Cg10cCYcwLRWFgvlg\nrw6MzOWA5qlnz0pmONiLBkw4BPe6Ja8zto/pOgYAHG0VKztXAgAuOcG2NAqArmazLyZoS0UcCYd6\nWhbTMcOBLRVEDcuScCgx4UBERFRLLDjUMXvCwU+YGQ5hWir82jj0tgp7waGlxdxfq1UqOMPB9M3z\nv4krT7oSG/96ozGPwavg8NZj3oq1PWst74VJOOgtFWrLhmWVijAJhwAtFfuH90PWYN1TxwwHJhyI\nGhZnOBAREU0fFhzqWJiCg19LhV/CQW2pCJNwANwLDi0t8SQc7AWGMEMjwyYctmwBrr4a2LQp3Hn1\n6oT5J+DWC27F2Ueby5/YCw56S4UQAp/+i09b3rPMcPBJKJRkyXjfq6VivOizSkUxXEvFp+79FBZ+\naSE23LXB97g4MOFANHOoqQbOcCAiIqotFhzqWKUtFfr59pSC+jrq0EjAu+AQR8KhkpaKsAmHq68G\nbroJuPLKcOc1EnWGQ0IkcFTHUcbrC467wHi/q7krcMJBTS94tlSESDiM5kctD/b94/345G8/ie/9\n4XsAgB898yPL92riDAeimUMtMjDhQEREVFshHmmp1uJqqajGKhVAbRMOfgWHUgnIK8+DYQsOu6aW\nZd+5M9x5jURNOCxtW4qmpPmLFULg0SsexZce/RIuOv4i3LfzPuM9v4KDPr8B8FkWM8QMB0Cb49DV\nos2Q+Mbvv4EvPPQFCAi8YdkbMDw5DEArTFQbEw5EM4cl4cAZDkRERDXFhEMdq9YMB/X1uPLM59ZS\nkUwCCY//SganZvx5JRwmJrwLBX4JBynDFRzCznuwG5t6bh4dNVfemGnUgoM+v0G1onMFvnbu13DG\nyjMCrzKhz28AbC0VERMOgLWt4sVDLwIAJCR29u80Cg6FUqHqf6W0Jxr4V1GixmWZ4SD5v2UiIqJa\nYsGhjoVpqYg6w0HllnBIpbwLDuUSDgAwPOx+rl/Cwe09v9SC/b2wCQe96FIsWpMSM8mi1kXG9tFz\nj/Y9Vk0rBE04WFoqKkg4qIMj1eLD0OQQRiZHzHsrxY5qcCQc2FJB1LC4SgUREd
H0YcGhjgkR/Fg9\nnaAKskqF2zXsCQevpEW5GQ6Ad8HBL+HgVjAoFLTWiSDHh0k4SGlNeYxWP60/LdYvWo8zV56JBbkF\n+OC6D/oeG7TgoL4XV8JBXRpTLTgcHD0ICTN+ohY7qsGxSgVbKogallpkYFqJiIiotjjDoY6VG6yo\nSqeBlSuB7dvNfV4JB6+lLt0SDkEKDiPmH54dCQf9GDu/hINXwSCfd//slSQc7IWM0VGgoyP4+Y0i\nmUji7vfcjZIsISH864yBEw559xkOCZFAOpFGvpQPXLDQqUWG/nFzwMi+4X2W46o9x4EJB6KZw5Jw\n4CoVRERENcWEQx0rN1jR7qqrrK+jJhyOMhcwwJIllSUc4iw4eO2vpOAwbnvmnakJB125YgNgSyj4\ntERYWiqUhIN6jYpaKpTt/cP7rfeucksFV6kgmjk4NJKIiGj6sOBQx8IkHADgAx+wvtaHOoad4XDW\nWcD11wP/5/8A55wTruCQywVLOIRtqQizP0xLxZjtuXWmFxyCiNRSkbYVHKbmOAQdOqlTEw7q9r4R\na8KBLRVEFJRlaCRbKoiIiGqKLRV1LGzCoaMDuPBC4Kc/1V5v3qx9D5pw0AsOySTwiU+Y++NKOKTT\n5lDGek042AsQs1GlLRVAPAkHtaWi1gkHtlQQzRxsqSAiIpo+TDjUsfnzzW2vIoHd9deb2x/8oPu5\n5Voq7NRVKtRBlkFWqVALDupsCL+EQ6UFhzAJB7+Wiq9+FfjkJ60zKmaDwMti+rRU6AWIqMtijhfG\nMVk0f5H2GQ5VTzgUuSwm0UzBoZFERETThwmHOnbSScCVVwIPPgjcfnuwc1avBn79a+Cxx4C/+Rtt\nnz3h4DZ4MZn0HiapJhzmzgUOHdK2wyYcMhnz4b1eEw76z/LUU8CHP6xtH3UUsGFD8Gs2Oq+Ew8HR\ng3j7HW9HZ7YTd7zjDkvKwLOlwiPhIKU0rp1KpIyHAD3hoCYdAJeCQ60TDmypIGpYXBaTiIho+rDg\nUOduvTX8OWedpX3p0mktmSCnVhV0Kyy0tnovw6kWHObPL19w8Eo4qPeNMsOhlgWH3bvNfer2bKAW\nHP50+E94aPdDeP3S1+PrT3wdD+56EACwcdtGz2UxAbOlwqslI1/KG8tcLsgtwCtDrwAA+ie0Ngp1\nfoPb65rPcGBLBVHD4gwHIiKi6cOWillACOvDvltLhT6/wY1eqAC0hIPeYhEk4TA8bG6rnyFKwqEa\nQyO9Cg7qNcMUMGYCteDw0+d/itO+fRq+8ftvYNPeTcb+Xf27LA/9jhkOUwmHyeIkpPof0BS1ELG4\nbbGxfWhUq2ap8xvcMOFAREFZVqngDAciIqKaYsFhllCLDE1NzjSDX8EhrzxrNTcD7e3adv/UM2Hc\nCYdatlR4rVIxmwsO6rKYujuevQOb9242Xu8d2uvfUqFcQ53FoFMLDvNz85FKaGGrw2OHAThbKuxq\nPcOBCQeixqUWGZhwICIiqi0WHGYJ9WE/ldK+VAsXep+rPuhns0BXl7att1aEmeGg80s4hG2psO/3\nSzjs3g3sU8YBeCUc1P2zreBgTysAwEO7H0LvcK/xeu/wXv+WCnXwpMscB/u5c5vnAgAOjWn/Udlb\nKOyYcCCioCwJB85wICIiqikWHGYJNeHgVnBYs8b7XPUBvrnZLDj092uFA3UVh7hXqVDPqTThsHkz\nsHy59rVnj7aPLRVObgUH+wP43qG91lUq0u6rVADuK1WoBYdsKouuZu0/Kr2lolzCYTQ/6vt+pTjD\ngWjm4AwHIiKi6cOCwyyhpguSSWfB4TWv8T7XXnCYN898ffiw+ZCeyWjXrjThoN5PbfUIOsPB67hL\nL9XmUfw/9s47PI7qXv/v0WrVZRVbluVubAw2xuBCtyEYQscQLoQWcjElkEsIISHkl3aBEJLcEIjJ\nJbEJNQSC6cYmgLkYsMEY9yJ3W5ZlWZbVe1
lJu/P7YzQzZ86cabuzTTqf59Gj2dnZ2dl+zjvv9/0G\nAsDvfy+vS1bBYccO4P33rYWbcOEJDixsSYUhwyHVucMhIzVDdTh09HYg0Bewz3CIckkFOykRkxSB\nIHnRdakQGQ4CgUAgEMQUITgMEliHA915ArB2ONCTWrqkApDLKpRJuiI0eJnhQAsOTh0OZtvR3SaU\n2ySj4FBfD8yeDVx5JfCPf3i//9y0XIzKlYMczxh1Bncbg8PBoqSC16nC4HDI0t5UjV2NcS+pMGQ4\niJIKgSBpocsohHgoEAgEAkFsEYLDIMEqw4EQ4KSTnO2HdTjU1xsFB9rhQE/+w8lwyM3l78tse95l\nBTprIjtb/p+MoZFLlmjHdPvt3u/fl+LDqltX4bkrn8PK767EiBxjwEdzd7Ma8AhwQiPpDAcXJRWA\nnOMQ79BIQ4aDKKkQCJIWncNBZDgIBAKBQBBThOAwSGAdDnSXiokTtQm4HXSGA2DvcKAJp6QiHMHB\nSVtMxTmRjKGRZs+vl0wsnIjbZ96O7LRsnDvuXO42ZU1l6rJVScWt792Kn6z4iW6gbyk4dDbYOxyi\n3aWCzXAQDgeBIGmhBURRUiEQCAQCQWyJmeBACBlLCHmCELKbENJOCGkghKwnhE97dhoAACAASURB\nVDxACIloCkUIOZEQcg8h5CVCyCZCSCUhpKv/fsoIIUsIIfO9eizJCOtwqKvTLluVU7BkZLhzOJgd\nQzglFZFmONAoJSXJWFJh9vxGi/PGnacuE2hKVVmjJjhYlVRsPLoRT379JFaUrVDXmWU4AHJJhW2G\nQ6y7VAiHg0CQtIi2mAKBQCAQxI+YCA6EkCsBbAdwP4DJADIB5AOYBeCPALYQQiZGcBe/BPC/AG4B\ncCqAkQDS+u9nPIBvA1hKCPmMEFJotpOBDOtwoLEKjGRhHQ51ddqE3I3DoaUFOHSIv120SipoeE4G\nen0iCw4Z9pmOnnLDtBswsWAiirKK8L1Z31PXKy6DFJKC7DS9RYYXPLm3fq+6bJXh0NCVAA4HkeEg\nEAwYRFtMgUAgEAjiR9QFB0LIDABLAOQCaAPwCwBnA7gAwLMAJADHA3ifEOLQ2G+gD8DXAJ4EsADA\npQBmA/gmgHsBlPbfz3kAloX7WJIZtksFjRuHQ1qa3uFQWaktu3E4/Pa3wIQJwNKlxu2iERoZCukv\nJ7PgwD6WaFOYWYh99+5D1Y+rdG4HhWnDpyHNl6ZbR5dUKBxu0VI7bUsq7DIchMNBIBA4RDgcBAKB\nQCCIH6n2m0TMQshOg14A35QkaT113eeEkP0AHofsfPgJgN+EcR+3S5JkNg37lBCyCMCbAK4BcBYh\n5ApJkt4P436SFq8cDmlpeocD3flBERrS0+WMCEnS3zbdOAfF0qXA1Vfr10Wa4cATCJqa9JcVYYEN\njVQuJ3KGQ28c5r4pJAUpvhSMzB1puO6s0WcZ1tElFQqHWy0EhwRyOEiSZBAYxCRFIEheRIaDQCAQ\nCATxI6oOB0LIaQDmQnYXPMeIDQpPAtgNgAC4jxDi42xjiYXYoFwvQRY1FOa6vY9kh81woEMijzvO\n+X6cCA6E8MsqeILDkSPGdZGWVPC2ozMrAHcOB3abeOMkFDNaOBYcXDocEinDIcT5OhElFQJB8iLa\nYgoEAoFAED+iXVJBn7t+ibdBvxjwcv/FfADnR+lY2qjlGFfBxx/W4bBsGTBvHvDaa8YSCyusBAda\nxHAqONAlGQrRCI2srdVfTuaSCtbhEMvjK8ktMaw7e8zZhnVch0PLYXx95Guc/fzZeOyLx9T1idYW\nk1c+EZSCkFjLjkCQwEiShB999CNc8a8rcLTtaLwPJ66ItpgCgUAgEMSPaAsOc/r/dwDYZLHdKmr5\nnCgdy43U8p4o3UfCQosB6emy2LByJXDDDe72k5Ym/ynOgxZqXkhnN/ByHNLSjOuOHDGWXkQjwyES\nh0OiCw
4t1nNzT8lJy0FummY7GZY1DJMKJxm244VG1nbU4qf/91OsPbIWtR21um3pkorKlkpb23Nn\nb2c4h+8IszOgIsdBkExsPbYVT617Cv/e/2+8sOWFeB9OXBEOB4FAIBAI4ke0BYcpkMspDtiUPdAC\nwBSv7pwQMpQQciYh5HnIYZUAUAfgVa/uI1m48UY57PH004HTTgt/P4poQAdHKhQVacs8wYHncOjs\nNOYrRCPDYSA5HNjnIZaCA6Cf7J85+kwQQgzb8EoqAGDN4TWGdRmpGchIzUCWX37THGw6aHsM0Syp\nMCufEGUVgmSioatBXa7rqLPYcuCjcziIDAeBQCAQCGJK1EIjCSHpAIZBFhw4lfoakiQ1E0I6AGQB\nGBPh/X4O4Fze3QCoBfAtSZJaI7mPZGTGDODoUbmcgjM/dIwiOAwdCpSX66+jwyedllQAcllFIdWs\nlJ7gh+NwiFRwkKTkCo1sto478Bx6wD512FTuNlWtVdz1EoxlCYobojCzEJ29nWjq1hSo/Ix8bp5D\nNEsqhMNBMBCgBTI6M2UwQn+mhcNBIBAIBILYEk2HA3VuGu0Otu/o/59juZU9ksnfXwBMlSRpbYT7\nT1r8/sjEBmUfAN/hMG2atuzU4QAYgyPNHA5OMxx6e41lGmYlFWyXCkmS90fvMxQC+hJojMoKL7EW\nHOgSitkjZ3O3mVLk3KikCA50joPCqNxR3NtE1eFgIiwIh4MgmaDfx93BwS040CUVIsNBIBAIBILY\nEk3BgS7idpKrH4DcqYJzbtwVtwI4GcB0yE6HHwPYD+BeAC8RQoZHuP9BDe1woMnM1He7cOtwoIm0\npIK3rVOHg3KdE9cEzVdfAQsXxqa8Id4Oh8WXL0a2PxtnjzkbV594NXeby4+/HAtOXYD5J8zHP67+\nh+X+VMEhiyM4DOELDr2h3qhNHMzOgIozo4JkQjgcNGhXlgSJ24lGIBAIBAJBdIhaSQUAeoTDiQs0\nkA7ZiRDRqUtJkiqYVWsIIYsAvAngSgDrCSFnS5I0uGO7w0QpfWAdDlOn6rtduHE4HD4MvPKKvM9L\nLnFfUsFb39Ojvz+noZHKdTzBgQ7epGlvl4+7rU2+n8ce42/nFfF2OFxw3AVoeLABab40bn4DAPhS\nfHjhKjmobuuxrZb7y0yV1Smew2F07mjT23X1dSEnLTJDVKAvgBSSAr/Pr64zzXAQJRWCJIJ+v0bT\nEZQMsOJkMBREii/aEVYCgUAgEAiA6AoOdBtKJ7MCZTrnpPzCFZIk9RBCbgNQATkj4o8AvhPOvnp6\nerB582bb7UpKSlBSYmwhmIx88IHczeKCC4BTTpHXsQ4HupwCcOdw+P3vteUdOyLvUqGso90R0XQ4\nVFXJYgMA7IlB/5N4OxwA81BIHmOGWMey0BkOLGYOB0CeRGWkZiA1JbyvsSOtRzDr77MQkkLYcOcG\njM8fD8Aiw0GUVAiSCOFw0GA/00EpCD/8JlsLBAKBQBAbqqurUV1dbbtdj9kkKEmImuAgSVKAEFIP\nYCgA89OUAAgh+ZAFBwlApdW2ERxPAyFkDYBvAriKEOKTJPdx1XV1dZg1a5btdg899BAefvhh9wea\ngFx6KdDQIAdOKtgJDsXFxv2YCQ40y5bpJ/fhZDjw1rEOh95e+c9McGDXWwkOnVSHRjYTIhokguDg\nhsLMQmT5s0xbWVplOJw1+izT/V70ykUoayzDG9e9gUsmXeL6uN7c+abanvP5zc/j0XmPArDIcBAO\nB0ESoctwSALBoTXQinRfuisx0ylsZ4pwy6O212xHcXYxinM4P3ACgUAgELjkmWeewSOPPBLvw4g6\n0XQ4AMBuAHMBTCKEpFi0xjyRuU20UKadWZA7aNS43UFRURE++ugj2+0GirtBIZV5p7AlFXSHCgCY\nNAkG0hwU1qSmAvX18nJ2tl5w6Ojg38YuwyEY1PZJ09nJFwjcOhxiLTjE
uy2mWwghGJs3FnvqZfsH\nAdF1qzDLcJg3YR6+Mf4bpvtVSjWe2fRMWILDzrqd6vKKshWq4JAoDodgKIjeUK/6/AgEbkgmh8OO\n2h0487kzkZOWg9337EZBZoGn++eVVLjlvT3v4erXr0ZuWi4q769EXkaeV4cnEAgEgkHKXXfdhfnz\n59tud8kll6COPXuaRERbcPgSsuCQDWAWgA0m251HLa+J4vHQ/uywSjfS0tIwc+ZMjw4nebFzOPAE\nB9bhkJZmnDzX1MjtOwFg5EhZdCgsBBobjW04FewcDg0Nxq4VAN/JoKx3IzjQQohwOPChBYc5Y+fg\ni8NfqNcpZzTz0vUD+IfPexgZqRlIISlqyFteeh5aAnqF5VDzobCOaVfdLnV549GNqO+sx7CsYQmR\n4dDS3YJTFp+C9p52rL19LY4fenzM7lswMNBlOESxjawXLHhvATp6O9DR24HfrPoN/nzJnz3dvxcO\nh1UVqwAAbT1t2HpsK84bf57NLQQCgUAgsMZpCX6ak7O2CUy0U5OWUssLeBsQOXXuu/0XmwF8Fo0D\nIYSMAnAW5LKNCkmSTM6XC5zAOhxGjtRfdiI48HSbsjKgtVVeVj5/x/fPtSor9W4CBTuHg5kg2NrK\nb3fZ2iq7IuzuQyHeDodkEBzG541Xl2+YdoPuuhQifw2NyBmhrkvzpWHuuLkghOiCIYuyiwz7rmhm\nc2LtkSRJ53CQIOGTg58ASAyHw4qyFahoqUBDVwOW7llqfwOBgCGZHA5HWrXeyNXt9rWsbuFlOESy\nD1FeJRAIBAKBc6IqOEiStAHAF5DbXd5OCDmDs9kDAKZAFgIWsrkKhJDzCCGh/r8X2BsTQo4nhJxv\ndRyEkCEAXoPWLcO6T5/AlgLG8co2Kxg/Xt+1AgAyGGf4SScZ97tli7asiBjHUyd3y8qMt7FzOLCB\nkQpNTfz1vAm8U8GB55jwmmR0ONw9+25MyJ+Aa6Zcg1tPvZW7zfkTzsepI07FyNyR2HjnRnW9TnDI\nMgoOTd1NaAu0GdZbcbTtKFoDrbp1K8pWADCfTMSyLWZTl/bm7OgV2qjAPcmU4eBPobrERGEyz5ZQ\nhPNZpgWcnmByh3cJBAKBQBBLol1SAQD3QS6TyATwf4SQ30F2MWQCuBHAnf3b7QXwpMV+OKZ4AMBI\nACsJIdsgOyo2ATgGoA/ACADnALi9f1kCUAq5S4UgAsaMAb75TeDTT4GXXjJen5YGjBsHHDyorfMz\noeC8NpMV1MlqxeEwebK2bv9+fV5EKMR3KdACAX0MNI2N/PU8IUI4HCJjRskMHLzP5IXoJ8ufhc3f\n22xotWnncACAipYKTBs+jXsdD9rdoPBx2ceQJMnc4RDLkgqqbGSwtzQUhEcyORzSfJpVNBqTedbR\nEE6GA/35F4KDQCAQCATOibrgIEnSVkLItwG8AmAIgN+xm0AWGy6PoMxBAjAdwCkW10sA3gdwmyRJ\nYgQfIYQAK1bI7SCHDOFvM2mSfrJPbzd8uLEsg4XncNi/X7+NmRBAT8q3bdOWTz8dWL9eXm5o4N82\n0QWHZHQ4sKz87kos/Hoh7p59t249KzYAesFhWCb/TVPR7FJwqNUEh9SUVPSF+nC07Sh21e0yz3CI\nYUlFSzclOCR4/b0gMUkmh0PUBQcPHA66kgrRIlcgEAgEAsdEO8MBACBJ0r8hCwJ/hiwudABoghwi\n+SCAmZIkmUQCyrtg/tN8CTl08hEAnwDYB6AFQC+ABgAbAfwVwBxJkq6SJMlkmilwCyHmYgOgFwoA\nedt//hP49reBzz8H7rxTDp/MygLyOIHfbIYDAOzbp9/GSatMWnA4i+qySDscMjO1ZbeCQ6xDI5PR\n4cAyb8I8LLtxGS47/jLbbZ06HKyQJEltgQnoAyMvmniRunyo+ZBuYkFPhNw4HBZvXIxLX71U7aTh\nFjOHQ1ugDS9tfQkHGg+EtV/B4IGe
FCe6S8bvo0oqojCZ9yLDYbA7HAJ9AdelawKBQCAQALEpqQAA\nSJJUCTmv4QGXt1sFwGdxfRCy6PBlRAco8Bw2ONLnA77zHflPoaJCFi5uuQV45x399jzBwa3DQZI0\nwWHUKGDsWG0bWnAoLASqquTlSB0OSkcMzsl6T2AdDh0dclkJ27p0oDA8ezgAwEd8KMnhJ/nadar4\n7tLv4pXtr+BXc3+FR+c9qiupmDt2Lj7Y/wEAoKGrAUPSNRUtMzVTnVw4nQg1dDbg+//+PgBgT/0e\nlN9npaXyae7WVCTa4fCzT36GRRsXYfSQ0Si/rxypKQP0RRdEDD1BDkpB9IX6Evb9EuuSikgdDoNN\ncGjpbsGUv05Ba6AVa29fi5OLT7a/kUAgEAgE/cTE4SAYnPAEB5bsbNnhMGKE8TqlpGLIEKC4WF52\nKjgo6w8fBlr6Txafcop8Xwqs4MBbb3c/gF5wCIWA6mpg+nRgxozouA9YwQHQHuNA5MGzH8ScsXPw\nhwv/gOKcYu42Vg4HSZLwyvZXAAC//eK36A32qg6HsXljMT5/vLptY1ejTljI9GvWF6cOh8Mth9Xl\ncFt26hwOlOCwaOMiAHKqf7j7FgwOWIEskcsqYh0aGVaGA/V8DrYuFasqVqG6vRodvR1Yvm95vA9H\nIBAIBEmGEBwEUYMtqbA6A89rQUuvU/Z17JicG6FgJzjQ5RROBYdIHA4A8OtfAzt2AFu3Aj/7mfnt\nwoUtqQCSs6zCKaeNOg1fLPgCD5z9ADJTM7nbWLXGZLtRvLnrTXVCP7VoKgoztRe/obNBdyaTvj+n\nDof6znpH21mhy3AwscNXtlRGfD+CgQs7KU5owSHKJRXC4RAZ9OMN9Fn8GAoEAoFAwEEIDoKoMX68\n/jLP4aDAOhyysvT5EGynCgW7kgorwYEOjfRScNi0SVteudL8duHCczgMZMGBhnYc0Fg5HBq79JaV\nX376S3V5WtE0neDQ2NWom6hl+bU3jNNJytG2o462s8LM4UBzsMm664dgcMNOihNZcIh2SYXIcIgM\n0RJUIBAIBJEgBAdB1EhP1192IziUlOgzEMxyHGghgG6z6cTh4EZw6LYYq3cwvVVyc7Xl2lp4zmBz\nONCYORyOtR8znVA1detfULoU4dsnfRtDM4eqlxu6GIdDGCUVVW1VjrazwonDQQgOAivY92siB0cm\nW0nFYJt0D2axRSAQCASRIwQHQVS5u7/r4axZ1iGKrOCg5DcoOBEcaEcE63DIzJT3EYuSCvo+2qIQ\n6i0cDnzMSgxYh4PC3LFzcdqo04wOhyDf4cCzetd31uNXn/4K7+97X10XM4dDsxAcBOYkU4ZDsoVG\nDra2mINZbBEIBAJB5AjBQRBVnnwS+PBD4JNPrLdjMxzYy7TgUFamLdNCAO0sCASA9nZt22nTZIeF\nE8EhyDn55UZwiHaA42B2ONACAAAcX6i9MczKKpq6OAoSgJ+c9RMAwJD0IfAR2X5jcDikWjscHl31\nKB774jFc8/o1qGqVnQ2s4GB3NrWuow5LdixRhZGQFDJ1ONBngsub3He/EAwekinDgSYqggPrcBAl\nFa4YzI9dIBAIBJEjBAdBVMnMBC65BMjPt95u+HD9ZdbhMGaMtlxFOdbpyTftcAgEZLFBaVE5bZr8\nnxYc6Ek6LTjwcCM41DOZgV4LEDyHQ3u7t/eRqLAlFWeNOUtdNguO5DkcJhVOwpUnXAkAIISoLger\nDAfeWc2vjnwlXxfqxcpyObCDFRzae6xfnOvfuh43vn0jbnn3FnV7CZJ6veJwkCQJISmkrhclFQIr\nksnhQH/mRGhk4qFzOIQG12MXCMzYUbsD/yr9V0J/twoEiYIQHAQJgd8PDBumXWYdDvn5WkZDJeWc\nN3M49PQAdXXaZaVkI0t/glwlO1ufAcESieBQ7vGJaJ7DoStxy7M9hS2pOGu0JjiUN/OfaDrDYVjW\n
MIzNG4u/X/F3pBDt629olpzj0NjV6DjDISSF1PaaAPBp+acAjBkObT3mdTWdvZ1YVbEKALC6YjUk\nSdK5GwDN4dAb6tVNnOo669AWiELNjmBAkEwOB125gscZDqxQB4i2mG4RDgeBQE9HTwfmvDAHN79z\nMxZ+vTDehyMQJDxCcBAkDHSOA+twIAQYPVperqzUnAtmGQ6BgH7ir4gZZoJDZiZQUGB+bFaCAxsa\nyToavBYceA4Hq1DLgQTrcDhnzDnq8oHGA9zb0A6HN697ExU/qsD5E87XbaM4HFoDrboSBqu2mBXN\nFejs1dSmT8s/RTAURHVbtW47K4fDrrpd6mSovacdLYEWXX4DoDkceKF/ZiKLQMC+X82yQBKBaGYE\n8MonhMPBHSLDQSDQU9laqf5Wb6vZZrO1QCAQgoMgYaBdDazDAdAEh44OoLVVXrYKjXQjOGRkhC84\nsA4HloMeO995gsNgdTicMOwE1amwv3E/7ya6DAc6IJKGXl/bobUWoQUHdpKys26n7nJlayW+PvK1\nYYJj5UIorSnV76OlkutwkCSJO2EUOQ4CM5LJ4RDNkgqem0FkOLhjMD92gYAHPR4I9FkMEAUCAQAh\nOAgSiLFjteVx44zXK4IDABw5Iv+3Co3kCQ6ZJk0Ooik4iJIK76BDEwE53X5cnvxmOdB4AJIkGW7T\n2K05HAoy+C8y3RqzpqNGXdZlODATuB21Owz7+ef2fxrWWZVUbK/Zrrt8uOWwweEgQUJPsIfrcBA5\nDgIzkinDgS2pYEsgIsErh8NgPss/mB+7QMCD/kwk8nerQJAoCMFBkDD86EfAmWcCP/kJMGmS8Xov\nBAe/X/5jCVdwkKT4ORzoxzFYBAdCCJ686EmcOOxEvHfDewCA44fKnSpaA62o66wz3IYuqXDicKAF\nB12GAzOBYx0OAF9wsCqpKK1lHA6tlWjuNrYc6err0pVvKAjBQWBGUjkcoiiO8MSFcDIcopkzkegI\nh4NAoIf+Pkjk71avOdp2FL9d/VtsOrop3ociSDKE4CBIGKZNA9auBf70J/71dKcKJTjSKjSSFhyK\nirRlXllFuIJDTw8QsjkZ56XDQZKAvv7fObqEZLBkOADA/Wfdj9337Mb8E+YD0LfG3N9gLKtQSir8\nKX5DW00FncOh3ZnDYWetLDj4iA+5afKbjycKWJZUMILD4ZbDhpIKQC6r4JZUiAwHgQnJ5HBgP1u8\nz1G48MSFsBwOg3jSrQvMjEIXEYEg2aC/DxL5u9Vr7lx+J3792a8x+9nZYQm3gsGLEBwESYOdw4EN\njaS7VNAdMHiCQ7ihkWxgJI/ycntRwil0fkNenrY8WBwOPHSCAyfHQXE4FGYWghDC3Yepw8EkNDIY\nCmJ3/W4AcovNiyZeZHp8Zg6HmvYaXV4EoA+iounq6+KWVJQ1lZner2Bww07iee+fRIEVADwVHDgl\nFeFkOAzq0MhBLLYIBDwGq8Phg/0fqMu8luMCgRlCcBAkDW4EB9rhkJYG5ORo1/HaX4brcLArp1Bu\ne+yY/XZOoPMb6Mc7qAWHoTYOh/62mAWZ5i+w0hYTgK6kwczhcLDpoDrIOGn4Sbhtxm2m+zbLcGDd\nDYB7h8Oxdo/eWIIBR1I5HIJJ4HAYxDkGg/mxCwQ8RIYDDCdMBAIrhOAgSBrCzXAYNkxuq6ngZUmF\nE8EBAA4dcradHbTDQQgOMlYOh95gr+owMMtvsLrOLMOBzm84qegkXDzxYoweMho8zEoq2MBIoL9L\nhYnDgTcJa+5uDmvyJBj4JFWGQxRLKqKS4TDIygqEw0Eg0KPrUhEcnF0qaDeoQGCHEBwESUNhodZl\ngpfhwJZU0IIDTTwEhxbjHDIszASHwZThwDI+fzx8xAfAKDgo7gbAvEMFYCE40G0xJW2AQXeoOKno\nJPhSfLjtVL7LwaykgnY4pPnSAABHWo9wbYpdvfySCkDf9lMgUE
gmhwMrCnT0OKhVc4hnXSoG8aR7\nMD92gYDHYM1woBEOB4EbhOAgSBoI0VwOdg6Hhgat/MBOcBg9GsjPj67g0GaeG+gKuqQiPV0uFwEG\nt8PB7/NjQsEEAHJJBd0a00mHCkAfGklj5nDYVrNNXZ5ePB0ATMsqzEoq9jXsU5fnjp0r30eoV7de\noauPX1IBAA1dDdz1gsFNUjkcYlxSEU6Gw2AuKxjMj10g4DFYMxxo6IBtgcAOITgIkgpFcGhtlf+2\nbNGuy8kBUlPl5aoqbT3doQIwCg533imLGdEKjVSO1wtoh4Pfrzk+BrPgAGhlFR29HbpcA/rsfzgl\nFWYZDtuOyYJDZmomJg+dDAAYlz8OPz7zx/ARH74383vqtmYOB+XsQH5GvroPQO+eUGAdDsOyNBWt\nvrPesL1AwE7izQSrRCCqXSp4oZEuSyokSdLtR7TFFAgGNyLDQTgcBO4QgoMgqaBzHH7/e+Crr+Tl\niROBE0/Uzvg3US5z1uGgiBIKd9wh/+cJDoprIhyHQyE1h42GwyEtTQgOCmY5DrTDwaqkIictB/4U\nv2E9LTgE+uQ3QVugDQcaDwAApg2fBl+KT93miYufQNcvu/DovEfVdWYOh7oOuY1KUVYRxuaNVdfz\nJjNshsOYIVqP2IZO4XAQGEkmh0M0u1TwyifcllSw2w+2SfdAdDiUNZbhs/LPEJI8aiElGFSwDgfa\nWTlQYR+jyHAQuEEIDoKkYow2z8If/qAtP/004PPJZQYsrOCwdq22PHUqMHKkvMwTHJSchHAEh5IS\nbdnO4dDbC9x/P/CjH+lFBd52Cn6/nD0BDO4MB0DfqaKsUWsV6bSkghDCvb44uxjpPvlNpZQ6lNaW\nQoL8w3vqiFMNt/H7/MhN0+p7eKGRPcEeNRyyKLtIJyDwYLtU0AGVoqRCwCOZMhwSvaRi0AsOA8zh\n0NjViJMXnYx5L8/Da6WvxftwBEkI/ZkISaFBEd7M/oYIwUHgBiE4CJKK0ZxGANdeC1xyibysOBxo\nWMFhwQJt+VHtRLTnDocRI7RlO8Hh/feBhQuBp54Cli0z3044HPjQDoEjrUfUZV1opEVbTEDfGlMh\n05+JacOnAZAFh/aedrWcAuALDgCQkZqhBlnySiroMgjW4cCjq09fUqETHITDQcAhmRwOsS6pcDs5\nYI9vIEy63TDQHA47a3eqAu66qnVxPhpBMsJ+hwyGThVsWZ4oqRC4IdV+E4EgcZg2TX95yhR5kq4w\ndChQw4iurOBwzz1ym8rp04FvfUtbn5Ym5zsoIgKdkRCpw8GupELpusEus4gMBz70BPxI6xE0dDZg\n8cbFWFm+Ul1v5XAwuz41JRUzRszApupNkCBh27Ft2Hpsq3r9KcWncPdFCEFOWg5aAi3ckgqlnAKQ\nBYcxee4cDrqSCuFwEHBIlgwHSZKiWlLBdTi4zHBgj2+wtcWkH39PsAeSJIHQvaaTDFpAMuv+IxBY\nwXOQ5aTlxOloYgP7WRGhkQI3CMFBkFTMmQMsXix3qZg3DzjnHL2r4ayzgF279LdhQyNHjwZeM3FR\nFhRoIkJ6ulai0dsLhEJACuMJsgqNdFNSQe+nnZ8xCMDocFBKKvr65D82n2KwMCp3lLpc1VaFX336\nKyzetFi3jVWGA8DvVOFP8WNmyUygP5x0y7Et2FqjCQ5Khwoeuem5suDAKamgHQ7DsoZh9JDRKMoq\nQl1nnWFbgJPhQAkUIjQy/iTaBCwYCqplPwqJ6nDgORA6er1ri+lFhgM7uRgIZ/ndQE/QJcgBmqkk\neX9sdIF/wcT8XAgSG/Y7JFG/X72E53BItN8+QeIiSioESQUhwF13yaUQGLldTAAAIABJREFU559v\nLKGYO9d4G9bhYAVdVkELDgDf5eCVw4EWGay2NXM4AIM7x6Eou0gNfTzSesQgNgD2Dofzxp1nWJea\nkooZJTPUyxuPbkRpTSkAYF
LhJOSm5xpuo6Cc7eCVVNDCQlF2EVJTUvHUJU8ZtlMQGQ6JSaAvgHNf\nPBcn/e0kHG45HO/DUeEFjybqgJjnFoh6lwqXGQ6ipKLX8nKyQb9+ifq5ECQ2yVSy5hXsY+zq6zLt\nwiUQsAjBQTCgOPdc47pwBQdJikxwcJPhQAsOVg4HWnCgMxyAwV1WkUJSMGqI7HI40npEzU+gsctw\n+N6s72F49nDdOr/Pj+nF05FC5K/KN3a+oU78zcopFJTgyPaedkO6M1tSAQA3nnwjLj/+cnU97Z5w\nm+FQ2VIp6itjwKfln+KLw19gd/1uvLHzjXgfjgpvQpioA2Ke2yDaJRWiS4U7BprgIkoqBJGSTKG8\nXsH7rIjgSIFThOAgGFCMHw+MGqVfN9TolDeFFhy6uiITHAoKtNt75XCgSypYh8NgFhwAbRLe0NXA\nPYNpV1KRnZaNX8z5hW5dCklBlj8LJww9AYDeUmgWGKmguB8kSIYJFOtwUHj92tdxxeQrMDZvLH54\n+g/V9azDoSCjQG3ZyTocNlRtwLiF4zBu4Tgcaj5keYyCyKDP7vBKZ+JFojkcqlqrTNsP8o416g4H\nlxkOhjP8nGMeyAy0khJdScUgmCgOVA42HcRPVvwEqw6tivl9i5IKGXFiQ+AUITgIBhSEGF0OSs6B\nEyIRHHzMSfXsbK2tZjQcDnRbTOV4BzP0WX+WUbmj4Pf5bfdx1+y71OVhWZo1hi6rAGQhYv4J8y33\nRQdIscGRPIcDIIsey29cjkP3HcLZY85W17MZDpn+TDVzgnU4vLP7HUiQ0N3XjWV7LVqeCCKGTiZP\npEkYz+EQrzO5f/rqTxj959G4aslV3OujXlIRBYdDX6jPVEAZiAxkh8NgmCgOVH6+8ud48usncf1b\n18f888h+JgJ9g6BLBc/hkATBkasrVuPxNY+jubs53ocyqBGCg2DAwSurcArbGtNOcKDDHguZiICc\nHK2tphvBwanDgS2pGMwZDgAwOtcoOPzHlP/ATSffhBeuesHRPjJSM7D29rW4YvIVWHT5InX9jBF6\nweGBsx6wDIwEtJIKwHj2m3Y40MKGAiEEmX7txWVLKjJSM9TbNXQ16Eo2dtfvVpfpFp4f7P8A5//j\nfLyz+x3L4xY4h554JVJbtERyOLy9+20AwPv73ucOyqNdUsHbf6QZDkDy5xi4YaA5HOjjT9TuLQJ7\nypvKAci2fi+/M5wgHA4yie5waA204rJXL8ODnzyI333xu3gfzqAmeWOGBQIT5swJ/7b5+frLbhwO\nQ4cCdVSTAdrhYFdS4bRLhVVopHA4GAWH66Zeh+unXe9qP2eOPhPLb1yuWzerZJa6nO3PxiPnP2K7\nH1pwYIOV6M4SdEkFTWYqJThQJRUZqRlIISkYmiU7HPpCfWgNtCIvIw8AIzjUaILD/Svux76GfShv\nKsc1U66xPX6BPfQEOpHOcCVShkNHj/blFggGkJ6arrs+HiUVkTocAPm405HO2XrgMeAcDqKkYkBA\nv3advZ0xbUspMhxkttVsw7/3/RsXT7oYqSmJN6WsaK5Qux5tqt4U56MZ3AiHg2DAMXUqMHOmvPz/\n/p+72+Ywv1duBQea7GzN4dDTw7+9ghcOh8EuOCihkTQTCyd6su/zxp+HW0+9FaePOh1f3/E1MlLt\n63TowU9LoEU3IVUcDln+LDWLgcXM4aAIEXQbTyXHoSfYg7LGMnX9jtod6Av1oSfYgwONBwDIoZqD\nyQ4eTeiJVyJNwrhn5EO9rrMLvIBucckblPPEES/bYvIec6QZDkBivd7RZqA5HERJxcCAdpXRwmYs\nEA4HmUUbF+GK167Ag//3YByOyJ7GrkZ1uaq1Ko5HIhCCg2DAkZICfP458PXXwGOPubstPYEH3AkO\nbDkG7XAArIUEkeEQOTyHw3EFx3my7xSSghevehHr7liHacOnOboN3TLz/H+cj1FPjlIn/UqG
A53f\nwMI6HJSzvooQoRMc+nMc9jfs153RDQQD2Fu/F4eaD6kiQ1AKoqW7xfLYJUnCoeZDg8o2Hg70gDeh\nSipMXrd4HCPtVuANyqPepcKDtpi8Y0z2SbcbBprDQbTFHBiwDodY4kVbTEmS8Fn5Z1hdsdqrw4oq\nVjlAidSliaapu0ldPtJ6xNAxTBA7hOAgGJDk5gJnnCGLD27IYk42OxUcsrKMt2UFB6scB6cOB6u2\nmIM+w4ERHPIz8lGYWWiydfShSyoA2YXwyvZXEJJCqiPBrJwCkFtyKu09u/q0kgrV4ZBldDjQ5RQK\n22q2qUIHu70ZT617ChOemoC5L84VP9AWxDvDobKlEvd+cC/e3f2ubr3ZhDAewZF2gkO0Syp4YoHb\nkgreMSbDpHvlwZW47s3rIk7xH3AOB+rxiLaYyUs8BQcvHA5fVX6FeS/Pw3kvnYeNRzd6dWhRwyrv\nxJdibEWeCNAOh47eDrQGbALVBFFDCA4CAQUrGtAOgmXLgD179Ncr2QvZ2frJv98v/+VSc06ngkN7\nO1BWBtx0E/Dcc/K6vXuBxx8HDh7U34coqdAYkTMCKUT7SvPK3RAuvHrS7TXb0djVqLoNeIGRNIqb\noatXK6lQSjB4DodddbsM+9h2bJuuzILe3oz7V9wPAFhXtQ7H2o9ZbjuYiXeGw03v3ISnNzyNa964\nRpfAbda2MdZncyVJ0mc4xCE0MlolFcng/rnwnxfirV1v4Rv/+EZE+xloDgdeSUVdR13MJ62CyKC/\nT7wsw3KCFxkOm6s3q8ubjiZ+voDVY4znyR0rmrqadJePtB6J05EIhOAgEFCMomIA/H69w2HhQjkb\ngg6GpB0OtDihZEE4KamQJH1oZG8v8MgjwGuvAXffDdTUANdeCzz4oHwM9PEJwUEjNSUVJTkl6uWJ\nBd7kN4QLXVKhsL1mu2lLTB6Km6Gzt1NzOPjdORy21mxFWZNecKBDK1n2N+zXXRatpMyJd4bDl4e/\nVJdpYchsMhxrwaE31KsrX3Ca4RD10Egp8tDIRJ90e5nXMdAcDmxJxdrKtRj55EiMXzhenAFNIhLJ\n4RCOw40+/lgLJuFAu4HyM/QJ64kUmkxDOxwAITjEEyE4CAQUZ58NXH21HAD58cd6wQGQJ/Xr12uX\nacGBnvxnZ8v/nZRU9PQAfcx4trRU/h8MAlu2ADt2GG+XliYyHFjosopEdDiUNZWhvLlcvWwrOPSL\nC/SkXxEhaHeEIiDsrpMFBx/xoSBDDhXZdmybQXCwKqn48MCHust0DWQ8qO+sx5nPnYlTF5+acD2/\nEynDgR4AJ4rDgQ1yc1NS4VUpjycOhyQsqaBb70bKgHM4UAKKBAnv7H4HfaE+1HXW4avKr+J4ZAKn\nSJKk+85NxgwHneAQ49DLcKBLKpZevxRf3/418tLl7liJmoXCjl+q2kRwZLwQgoNAQEEI8O67QG0t\n8I1vABMmGLcp758v9vWZl1QoggNdUmHmcOCFRFZUaMurTMpvWYfDYM9wAPSdKuLtcDDrZPFZ+Wfq\nslWGA6CJCy0BLeTRLDQyGApib8NeAMCkwkmYWSK3aqnpqMGaw2t0+7Uqqfhg/we6y/F2ONyx7A6s\nq1qHbTXbsGTHkrgeC4suw8HlGZ7eYC8eXfUoHl/zuCeTa3rAmigOB3YS4DQ0MiSFPJvUepHhkIwO\nh6NtR3WXw+1ME5JChtuaCVrJAnv89V2a4ytRJ04CPeznLxm7VNAT+GRzOAxJH4IzRp+hOjnjLbib\nIRwOiYMQHAQCDkrY5BVXyGUM11yjXXfokPy/vBwI9Y/Dxo0L3+HAExyaKFH2s8+M1wPRa4tZW+vN\nfuLBuLxx6vKkwklxPBJzwWNl+Up12anDgUbNcGBKKipaKtRBz5SiKZhVMku9nlX5zUoqOns78fmh\nz3Xr2BrIWNLc3Yz39r6nXt5ybEvcjoUHLTK4nYAu3bMU
//35f+PBTx7UvSecwk7m6QGr2YTQKvQr\nGrCDaKclFbzbhosXXSq4GQ4JPumubqvWXQ43HJH32NsCbfhX6b+Sou6cB/uYaAE2Ua3hAj3sd4lw\nOEQf+vdDGZuk+2QbcKIKdQaHg4etMTcd3YQb3roB7+9737N9DmSE4CAQWOD3A/fdBzzxhLZOcTjs\n26etO+EEe4eDG8GBZqNJeHE0Mhx++EOguFjOi0hG7ph5ByYWTMRlx1+Gc8edG9djmVAwAYsuX4Qf\nnPYDLLthmbqenjTbhkamGgUHtUsF5XDYU79H50yYMmwKrph8hel+zUoqPiv/zHCmIp4Oh8UbF+su\nJ9pZ5Z5Q+F0qDjZpCbBsFxEnsOUlZg4HOkg13g4H3nNkNnH3agLBK58YDA6H6na94BCugMN7fZ7e\n8DRufudmzH1xrm0AbSxYW7kWr+943fHryr52tACbqGdqBXriLTh44XBIugwHWnDoH4coTs5EFeoM\noZFt3jkcfvbJz/D6ztdx5/I7PdvnQCY13gcgECQDo0cDPp+cqaAIDnv3atdPnqwvmeA5HNyUVNAE\nTU7GpaXJJSAKXpRUvPaa/H/JEuCPf4x8f7FmatFUHPih+8lbtLh79t0AzCfttiUVHIeD8kM/JH0I\nTio6CTvrdmJbzTb8eMWP1W2uPvFqzCqZhaKsIm4tt5ngwLobAPMMh0BfAPd9dB98xIeFlyyE3+e3\nfCxuCfQF8NS6p3TrEs0OGUmXivYe7YMfjoukpoMRHEwcDrlpuWpJTiJmOJhNEj0THHgOh0GQ4cA6\nHDp6OoBs9/vhORw2VG0AIE9ADjQe0LmtYs2R1iM496Vz0Rfqw4t9L+LWU2+1vY2hpIIWHBJ04iTQ\nwwpDydilgr4N/XuQqNAuKUVoSE9NbIdDNEsqlH0daz+GkBTSifsCI+LZEQgckJoKjB0rLztxOPC6\nVJg5HDrC/J2MhsNBET86RXcwT8nPyMfYvLG6dQQEE/I5ISEUXIdDvwhBCMGTFz+prlcG0TdOuxGn\njzodvhQfrjrhKu5+lQF2MBTEgvcW4OJXLkZNew0qWioM25qJJe/sfgfPbHoGf9v4Nyzft9zycYTD\nhqMbDC05K1srPb+fSNBlOLg8M6oTHMII5mSfGzOHA90tJd4OB1clFR5ZjL3IcEjGtpjRdDhI0DJH\n4j3RKK0pVV/Pbce2OboN+5hoATZeDodEF7ASjURzOITzvkm6DAdOSYUiPASloOvv1VgQzZIK+jWL\n9/dgMiAEB4HAIePHy/+bm+U/1uHgZWikE7zOcAgGNZeEEBy8Z8qwKbrLPzj9ByjJLTHZWsYqwwEA\nLpp4Ea6deq22fWom/nDhH9TL10y5BjwUG/TK8pV4aetL+LjsYzy/5XlD0BxgfvadFicONR+yfBxO\n2VK9BT//5OfY17AP5U3lhuurWqvCDr+LBvQg0+2EgRYcwilbMZRUWDgcFBJScKCO1Ud86rJnGQ68\nLhUuMxySsaSC/SyHK+DYCSuxzgVhoScUTo+Ffe3o77h4OBweW/0Ycn6Xg9+s+k3M7ztZibfgMCgz\nHHqNJRVKhgOQeO6gkBQyjF8auhrCzrNhoX/DvdrnQEYIDgKBQ+iOFeXlmsNh2DCgsNDb0Egn+P3e\ntsWknRZdXYBHXekE/Zw+6nR1efbI2Tp3ghlWGQ4Kf774z2r45EPnPaRzUsybME+3rWL5UxwOe+s1\n1exA4wFuy6jmAH8yTFsVeXXcr5W+hgtevkDXlcOOK1+7En9Y8wdc9uplXLdFb6gXdR3etfuLlEi6\nVLT3RuZwMJRUOHA4xHpQ7iQ0kp7M073dvbIY88QF1w6HZCypiKLDgSbeZ/Zosc7p+5sVUWjHRjwc\nDos2LkJvqBd/2/C3mN93ssJ+38Z6wu51SUUyORx8xKeWUNLduBIt/6Q10Kr7bCvwTqyEA/2ei7fw\nmgwIwUEgcAgtOJSW
Akf7v7NOOEH+H63QSDO8bovJHodos+kt95x2D84Zcw6uOuEqfPydj5GaYh+h\nY1VSoTB6yGhsu3sb1t+xHj+b8zPddemp6bhr1l3qdicOOxGArPJLkqQLLqxoqVB/iOkOG2YOB3o9\nLxPi3g/vxafln+IXn/7C8jEqSJKkCh5lTWU618TskbPV5UTKcdBlOLgcbNGDlXAyHAwlFSYOh4KM\nAu59xgJDaCRHlKEH7gWZ3h8r1+HgMsMhGR0O3AyHMLATZ+J9Zo/+7Dgd9FuJKPE4S6sIjsr3ssAe\ng8OhL/lCI+nPTjI4HJTHSI9BlAwHIPEcDmx+g0K4Y4ieYA+e3fQsVhxYgd5gr+57JN7fg8mAEBwE\nAofQgsOKFdry5Mny/1mzNMfBnDny/2QqqWCzJLq6gOXLgd/+FmhpiWzfAqA4pxhf3vYllt6wVDex\nssIqNJKmJLcEp406jbuPJy56Aq9e8ypW37oaI3JGAJB/ODt6O3CwWRMctlRvUSdRxw89Hv4U+QyG\nmd2/sZtyOHQ1oLuvG+/sfgeHmg8h0BdQRYiKZqNTgQc7WVhXtU5dPmfMOepyIgkO9KQz5iUVDh0O\nukl8jM+iOQmNpAdteel52m09OlbehHmgt8WUJMlSkHKDXUlFvB0OtDsoXIcDTazP0vYGe9Xj7gv1\nJUV4YCIw4EoqksHh0D+ppscgtMMh3t8FLLQYSaAlrIc7hvjntn/ie+9/D5f96zLsqN2hu044HOwR\nXSoEAoeYCQ6Kw6GoSM51qK2VxQdADpvMypIzEaLtcIhUcGCPo7wcuOYaoK9P7obxy19Gtn+Be3ji\nAp3h4ITstGzcdPJNAPStNOs761HWWKZepgfuo3JHIT8jH3WddaZ2f7ak4ndf/A6Prn4UJTklWH/n\nevW6us46RwnOrQH9B0T5Qc9Jy8H04unq+kQSHNgMB0mSQOjWMRZEGhrpNMOBdjjEejLjJMMhGUoq\nksHhUNNegyx/FnLTc9HQ1WCYEIWd4ZBEJRVOzzJavXaxPkvb1qM/E9HY1agrgxLwYYWheIdGDooM\nh/5Jtc7hQGc4JFhJBT1GmVQ4Cfsb9wMIfwyxq24XADkbYsPRDbrrhMPBHuFwEAgcQgsODZSDXBEc\nALmTxezZ+naVisvBzOEQbpeKtDRZ0PD156x57XDYvVsWGwBg//7I9i0ID67DgbPOKazgQJdU0IzK\nHaWeGTc7+86WVHx5+EsAcu04rf73hfocncFvC/A/IOPyxmHMkDHq5UQSHNiJi5tJaKRtMS1LKoKJ\nITg4yXBIhpKKRM9wWHdkHUY9OQpjF45FQ2eDoZwCiJ7DId5n9sJyOFiVVMR40tTSrbcPmtnABXrY\n75J4ZziEI1TRn51kcLYkncOB+m6YPHQyd70b6NeL/f2NteCVjAjBQSBwyIgR+pBGhcmTjetolODI\naDgcAM3l4HWGQx2VzReuKCKIjJG5Iw3reK4HpwzLGqYu76rbZTpZGDVklHq2uaW7hdsZgnU40D/A\nbNcKJ0GPrMNBYVz+OIweMlq9fKQtcQQHdpDpZrJCDzBbAvzn2ArLkooQfxIfb4cD7/mJdkmFJ6GR\nnEl3IgkO7+55F0EpiObuZqyuWM0NRRsUDgenGQ5WJRUxdji0BKIrOLQGWl0LbMlAvEsqvHY4BIKB\nhH+dbB0OCZzhQI8hwn2vWAkO8RZekwEhOAgEDiFEa42pkJICTJzI3VxFERza2vidH6wEB5/P/Lq0\nNPm/Ijh47XCorze/ThAbrp16LXLScnTrInI4ZGkOh/VV6023G5U7Sj0zLkEynIUD9GcJGrqsBYfa\njlrbY2OtxQrj8hjBYQA6HEJSyNThwaOjp8MgHiSkw8FBhkPUSyqi1BbT7sw/j/rOeizeuJjb8jUS\n6M9ES6DF0KECGMAZDnRopBclFQPI4bDq0CoU/6kY0xZNSyiBzAvYyW2yZzgAiZ3jEA
wF1feQlcNh\nc/VmrK5YHfPj40F/N4zKHaUuh1v+QL9eBsFBlFTYIgQHgcAFrJth3jwgPZ2/rYJSUtHXx3chWAkO\nM2bI/4cM0ZdpEKKJEV4JDuxx0GUjQnCID/kZ+bh9xu26dW4zHGjokgq2BpFmZO5I3eSPLYnoCfbo\nJoTdfd06ASIcwcHU4ZA3DrnpuRiSLit3iSQ4sJMTN2d42Am1m+BI1t0AmDscojGJdwqbHG9XUkEf\nq1cWaZ5YEK+2mPd+eC++/+/v46olV7m+rRX0Z6I10MovqYiSwyHeA+1kL6lgv/e8FBze2vUWuvu6\nsad+DzYe3ejZfhOBeE/Wve5SASR2jgP9+My6VGyv2Y5Zf5+F8146D8v3Lo/p8fGgP0ujhmiCQ7gd\nTejXSzgc3CMEB4HABb/6ldyB4tprgRdeAJYts79NvjaGxt69xuvNBIf8fOCJJ4ALLwQWLQLyNLex\nWk4BaGUeXgsOdElFpyhPixv3nXGf7jJ9RsEttMNhQ5W54DBqyCjdmXG25tEuc6C8WX8GNyLBIX8c\nAM0SeaT1SMK0jmMnnU4nKz3BHsOkx01dKRsYCVg4HOLYpcJJaKSZOBLNkop4tcUsrSmV/9eWmpbQ\nHGo+hOmLpuOyVy9zLIwo7WQB+Yz5YHI4iJIKc2jXmBsHVTIQ75IK9j000B0O9GeLHoPQy2/uelNd\nnr9kfmwOzAL6NzXqJRXC4WCLEBwEAhecdhrwxRfAm28CCxbou0SYceGF2vJf/2q8nnYPKOUXADB0\nKHDuucD//R9w001AAdVJUSmnALzLcBAlFYnJhIIJmFUyS73My3VwCp3hIIE/afen+DEsa5ilw8Fu\nchxWSYVFaCSgDRi6+7oTJliNnZw4nYTynAZugiPZwQ5g7nDI8mchzZdmer/RJFFLKrzIcAinLSY9\noTAb9L66/VWU1pbiwwMf4ouKL2z3KUlSdEsqEjjDISSFdCUJynO6pXoLqlqrzG5m+TmN9eOJZkkF\n/RlKhlBCN8RbcIjU4RAMBT3rJBML6Ak1XVJBZzjQbgcg/mIk23lLIVxxgL4d+x0rHA72CMFBIIgy\n3/2uJiS88op+Ig9ozoL0dL2oMHSofjv6OtrhoAgOgQAQcpc7xz0OBREamTh8ePOHuO3U2/D0pU/r\nlHq30CUVNHSP6pG5I5FCUnSTv6auJp3oYDcoZgWGSBwO4/PHAwBG5yZWjoMkSUaHg8Ozo7zBf8Ql\nFSYOB3+KX80BScjQSLOSimg6HDzIcAjH4UA/H2aTC3qQ7OQ90djVqBvYtwZabQUpN3jVpaK0phQT\n/zIR81+b7zog1YzWQKtOOO3u68ayvcsw8+8zcfz/Ho/6znru7RKppCKaDoeBLDiwr1NPsMe1kBgJ\n7HsoEAy4ct7x3meJ7HAwK6mgHQ7s99XKgyujf2AWmJZUhClO0c8BK6YIh4M9QnAQCKJMbi5we38Z\nfnc3cMUVwKWXAg89BGzerE30c3K0vAfAKDjQpRk91FiXdllE4nKwcjiIkor4UpRdhOeveh73nH5P\nRPuhSyoUhmcP17kmlB9m2op/7ZvXouB/CvD4mscBuB8U13aGFxqZ5ktDcU6x7riAxBAc+kJ9BpeI\n08kK1+EQaUmFicPB74ue4PDClhfwrde/pWuDqjsmJ20xzUoqBmCGAz3QNXst6G2cTObZz0JLoAUN\nnQ2G7eLtcPju0u/iYNNBLN+3HG/vejusY2HhuYI+OvARAPm523R0E/d2iVRSYchw6PZOcKBf84Em\nOPDed7F0OUQqQvImqAntcOgzcThQrgZ2XPDe3veif2AWKN8P6b505KXnwUfk4DMvSircXCeQEYKD\nQBADfvADLfRx3Trgo4+A3/wGmDULKCuT12dny6KDgpXDgRYH6FadkeQ4iNDIgU9eeh6Ksop0644r\nOA5j88aqlxXrIT35U1i8aTEAd/Z/IPy2mGPzxi
I/phQOrRey/a8K0KZDKIQDtVQE6r6iFA6tA+H2gjNomsYVqOHIcAgGVOHgdTS9JRIO9DzSoLkW\neiKp46Zw8Gt+W3ehuUPmum5LoK4zAH88Rfmj8NFZH+GE7ifYlmutGQ4A8O+J/zanz+hzRqPXx/ZT\nYnSiGUq5Zt8afLXtK3OZd9dbrYmbwlIB8LaKQKGRXvMbAPlx0zO9p5kHBAAdkxXhoKCgoNDicP31\nhoVi2TIji4ESDswKAQBffWXcxDNLhFeFw7Bh1vTSpfLCv7raCoPcto0vWLwqHCicQiNlaE2Ew+9R\n4dAUkO1/ZatoOogKBSdLBYWrpUKAlzR/pXA48ojyRwVeKESw0fIof1Szj2hzhEOICofB2YMdlmw+\npMelm9vVPrk9eqb3dFxWtK3Qx6IdIik6CfW61Ufap/nwf73/z3VboiOiOeJIlhFBFQ4AEBsZi/+c\n9B/ERsRy81uzwuGyYZdh/qT5eHH6i0G1ivWCfu36ATA6hLy3/j1zPm1D2lTBpXnJeea0SDCIIaxi\nFxg39GrbCxp4qWLP9J7okWb1hVcKBwUFBYUWCL8fGDTIUD8AhgIiXXL9X7mS75rgVeGQl2fZHZwU\nDoBFJmwQQsy3brVGQL0SDsFYKmhoZGcykBoKQSCisZYKpXAIDUrh0LzwpHAg0t2shCyucJMRDsek\nnQMAGNl+JDd65QSlcDjyEAvBcOLG0TfiqIyj8I/x/2hSYkOGUBQOsZH8d9FUgZrBQNM03DT6JrSN\na4sbR9+I4bnD0SmlEzRoOKvPWdyyosIhOiIab5z6Bs7tdy6+nPElF7iXHJOM/RUWcz+t5zTb55eB\nBkdO6TbF9jzNcGBIj0vHrAGzuHmtOcMh0h+JecPn4cw+Z4Z93f0y+5nTNMyTwa/5m6w160k9jaCw\n1NhUDMsZxj3XJbWLqaygCg8v6NSmE24cfSM3r3tqdwxoNwCAodwRye0/ClSGg4KCQquCphnBkbRd\nJmAEPm7caD32qnDQNEPl8P77BmGxapV8ufJyI4iS5jcARjGyY4ehPqCFZFKSpUIIl6WCEg4tUeGg\nCAdvkBFOSuHQdBAJBlkwbFpsGvpl9sNP+37CcV2P456TEQ4Xt38C8yadhhHtR3jaBqVwOPLIb5uP\n3m17Y23h2rB30JjUZRLWXLwmrOv0io4pHeHX/KjT6zzLtbuldkNmfCb2le/DXRPuauIt9I6rR16N\nq0ZcZQZfrpu7DiXVJfjt4G94cc2L5nKyYM5Tep2CU3oZYY0Z8RlmuGNSdBLO6nMWrv3sWsRExOBf\nE//laVvaJbTD2sK1AAxi8eNNH6OoyhrVcGpZ+Nfhf8UjP1hdL1qzwqEpEUgx0RQtMRmmdp+KjZdu\nRHpcuk0VFOGLwMoLVmLjwY0hqTpuG38b9lfsx+MrDJ/viPYj0DezL8pqyzCmwxhkJ2aH5TO0NijC\nQUFBodWhTx874QAAn39uTXtVOAAW4SCuIzraKk6cFA6AkePQqRNfSGZmti7Coaws+OBCZakIDcpS\n0bzwonDQNA2fnPMJlmxfgoldJnLPyQgHX10s1zI1EJTC4cjDp/mw7Pxl2HRwE/pk9An8glaCdgnt\n8MTUJ7BkxxJcNeIqT6+JjojGygtX4ueCn7muAC0BtMtGdEQ0MiIyUFdfxy0TKCcjIz4Dm4s2AzAU\nBpcPuxztEtohPz3f1g3DCTSXo3tad8waMAv3fmcRVdSmQdG5TWec3fdsvLD6BbSNa+spSPCPCKpw\nkKEpAiMp3PIhEqMTMTBrYEjr1TQND095GENyhuBw/WGc2ONEaJqGV//8aqib+ruAIhwUFBRaHfo4\n3Cs2hnBgWG0FJCMjw1AvAFZxLiocACPH4dhj7YQDU1yIhXioXSoaQzhUVgJLlgAjRgBxcfbt0HXj\nMyZ4bP1cVwfs2cPPUwoHb1CWiuaDrnvLcACMImVa/jTbfBnhEChI
UoRSOLQMxEXGoW9m3yO9GWHH\n7IGzMXvg7KBek52Y3WpGWzMTMhHljzJbMwZqPUo7DyTHJCM6Ihrn9js3qPc8ttOxeGH1C6bsfkKn\nCYiJiMEdi+8AwOe8iHhi6hMY3WE0RnUY1ewWm9aCDskdkBKTguKqYunzXds0TWBkc8Dv89usNX90\nKMJBQUGh1YESDsnJQEmJMb1kiTXfq6UCAIYMkc/PzLQIh0AKB7oMey1DQQGvHgglNDIhwSBAGIIl\nHGbOBF59FZgyBVi40JgnEgSlpd4Jh8JCK0iTQSkcvEFZKpoPMmLAiXAIZh3BEg4iwaD2t4KCd/g0\nHzomd8TGgwaLH6j1aEac9WMpC3z0gnP7nYtebXuhfXJ7U3b/j2P+gUldJmHRlkWYM3CO42vjIuNw\nwaALQnrfPwo0TUPfzL74etvXAIyMFZ/mM/McmlrhoNC8UKGRCgoKrQ4DBgAdG6yqN98MxEvyroJR\nOKSkAD162OfTAr+83PhjBEQ2GRhauBC4/HLgww/lr73zTqBrV6PDBWAVGxGE8g0UGpmebuRCMARL\nOCxaZPz/+mtrnoxw8ArRTiFbn4IcylLRfJCRC8ESDjKFQ6DOFSKUpUJBoXGg6f5BKRxCDG3UNA1D\ncobYul6M6TgGt467FblJuSGtV8ECtVUMzBqIoTlDzcdN1aFC4chAEQ4KCgqtDtHRRrjjDz8A8+YZ\nBISIYBQOADBmjH0eVSlUVPAtMI85BohqUEquWwc8+CDwrtU+GvmC2nLzZmD+fGNklBUrtNuGjHCo\nqwMOHrSWTSStxoMhB2prLeKirMy5q0ZjCYeKCqBebmtVIFCEQ/OhqQgHZalQUGhe0FaGARUOYSAc\nFJoelHAYnD0YJ/UwukfERca1iFatCuGDIhwUFBRaJVJSgMGDAZ8PGCjJ9qEKAy+44gp7YCIlHMrL\ngZ9+sh7n57uTGr16AU89ZVgYGDHxyit8UU+38eWXDRXEp59a84qLrQI+Pd1QRMQ2dPMKRuFQUMA/\nZhaUxigcxPwGwLCNNKZw/uILw/qxYkXo62gNkFkqVAHaNJCRBUrhoKDQ+tCrbS9zOlA3DhrUqLpE\ntFyc0OMEpMelI9IXiXP6noO5Q+fizdPexPezv0fb+LZHevMUwghFOCgoKLR6jBvHP54+3SAjgkGv\nXsBZfKtvm8LhpZesxyNHur9HXBwwe7ZhtzjhBGNeQYHVDQMA2gq/p7/9Btxwg/WYBkayZZmtIhjC\nYe9e/nFxQ0ZTYwiHoiL5/MbYKmbPBv77X4P8CRd++83++Y80lMKh+SAjF2RtMd2gFA4KCkceswfO\nxqz+s3D9qOsxqsMo12X/1PVPyEvJQ3pcutkqU6HlISM+A9uv2I59V+3DoOxBiPBFYHr+dPTJ/P10\nkVEwoEIjFRQUWj1OOgl48kmj2J02zcp3CBa33AK88IL1OC3Nmt640VIf5OUBo0cDu3ZZAYwiWCcI\nwCAy3nzTmF6wwJqfKunstXy5UfgnJvKEA7NfJCUB+/YFRziIaoSiIkON0BhLBSUcMjIsFUVZWfDq\nEsBQcmzdakyzEM7G4ssvgfHjDVXI5s1Au3buy3/4oUHGnHYa4Je3WA8LFOHQfAiHpUJGLiiFg4JC\n8yIpOglPn/S0p2VTYlKw6dJNOFx/GNER0U28ZQqNQWxkLGIjY4/0Zig0MZTCQUFBodXD5wPOP98Y\nGQ+VbAAMS8P11xvT//d/fBjlE09Y9oZzzzXe88wzDdJh6VL7umLJ7+eUKVaI5eLF1nxZ2CUAfPut\n8Z/lLgA84QAYhIOuG9MVFUZB7QSZwqG62p63EAzhUEw6WeWS7KxQFQ6lpdbnOXjQmm4MmLKkshJ4\n9ln3ZZctM/bTmWcC773X+Pd2g2qL2XxQGQ4KCn9M+H1+RTYoKLQQKMJBQUFBgeCOO4xAxJde4lUK\nLLwRMAgHhuxso02nmP9AXxsd
DZx6qv29YmOB3r3t81knCSeFA2AESlZWGoVL795Aly685YNCVDgU\nF8uJgVAJh5wcazrU1pgsVwIwyBBZzkGwoNsSEUDPd/PN1vR11zX+vd2g2mI2H5oqw6GxbTEV4aCg\noKCg8EeBIhwUFBQUCDQNyMoyFAwyBcKoUUZxTxEbC3TuzM+jhAMg74IRGws8/TRwwQX8qDojHD7+\n2JrH7AC0U8WhQ8DKlZYV4Z135J9JpnCQFb1HUuFACQeAJ3hCQV2d+2MR+/ZZ08F2OAkWylLRfGgp\nbTGVwkFBQUFB4Y8KleGgoKCg4ACRNACAiy6SL9urF589IL5WbJPJlhk2zPgDDEvHpk2GvH/RIuCN\nN4z5GRlGG07AUjgABkFA31PWOQKQEw5OCodVq4z3y86Wr4uug4EuGy7C4cABoH370NYFGJkbFE4h\nlwyUcKBhoU0BZaloPrQUS4VSOCgoKCgo/FGhFA4KCgoKDhAVDv36AWecIV+2Vy/+cbRgHe3Rw/6a\nWCEniakgamqA00+35t92G5CQYExTwuHQIW+Egyw0UkYMPPEEMGAAcNRRlmrCCYxwSEzktykclgqg\n8QqHH3/kHwdDOMiIplBQVQV8/rldTaIsFc2HlhIaqRQOCgoKCgp/VCjCQUFBQcEBYuH5r38ZVgsZ\nRMJBzHRISLCP2DsRDoAVGJmfb7SLZBAJh02brMd79sjDFr1aKti8oiI+p0IGRji0aWORIUDLsVQE\nQzjU1vLfWzDWEjfMnAkce6zR9YJCWSqaDzJi4Ei0xRQJhpoae2hrc6G2NjwZKQoKCgoKCl6gCAcF\nBQUFB7RtaxXTI0cCEyc6LysSDjL07Mk/FgmNiRP5eT4f8MADfOChm8KhvNxeLOu6d0sFxeLFRotO\nJzDCISWFV4KES+Fw4EBo62FYtYp/7EY4iGqOYFqOuuHLL43/LJODQREOzYeWmuEABE98hANFRUZb\n3+xsYP365n9/BQUFBYU/HhThoKCgoOCA2Fjg3XeBW24x/rtBJBO8LCMqHLKzjSL1vvuA114DNmyw\nkxxuhANgt08cOmQvdrwQDgBw9dXy+VVV1ohtSkrLUzjoenAKBzHvoTEKh337jIJW1633LC3lQytl\no8tKYt80aCzhUF9vKRGioqz5jVU4OM1ranzyidGFp6QE+N//mv/9FRQUFBT+eFChkQoKCgouGD/e\n+AuEhARg6FAj8HHqVPkyYnCkSDgAwJAhxp8TaJeKXbss6wXDnj18XoSobgCMQtiLpPrLL4GCAiNE\nkoIGRooKh1AJB1FV0BiFw65dfEtRwJ3AEAmHUBUOH3wAnHCCMYK8YgU/gl1aanxXgEUARUdby7QU\nhcNLLxkj31deyZNbrRWNJRyokiE21nptMISDrrecoFBKvIXLOqSgoKCgoOAGRTgoKCgohAlvvWWM\nIJ5wgvz5QJYKL6BFoGgbAIzRSwpZkKRXhQMAbNsWmHCgCoeWEBr500/2eW4KB5qDAYReiL31ljEa\nvnkz8Nln/HMlJXbCITXV2j8tgXDYvBk4+2yjQE5IcFa4tCbIrA+NIRzYcRqMpULMCGE4EoQDPc9C\nPVcVFBQUFBSCgbJUKCgoKIQJOTlGUGB6uvx5LwqHQKCEg2gbAOwEg0zh4BQayUADL3fskL+eIVwK\nh3BmOOzaZZ9XXOwc0hcuhQPdZnGd9PMxcoEREEDLsFRs3GgVxhs2HNltCRcaq3CgSgZ6vgajcHAi\nk47EPqfnrlI4KCgoKCg0BxThoKCgoNBMyMzkHzeWcJAVhV4JBzdiYOxYa3r7dvnrGVqiwkFGVui6\nM5EgkgNVVcGHAgL8NjsRDrpukT1xcUBMjDHdEhQOVAUSqI1oa4GMXAgmrJEeB1SRFMzx4UQsHGnC\nQSkcFBRCx333AdOn23OUFBQU7FCEg4KCgkIzQWyVGQoC+erdLBWspWdlpXtBTwkHpnCgBVZLVz
jQ\n/IasLGtaVkTX1Ni7VADG6O/+/cGNZLspHNh3VltrKS3i4izSSREOTYNwZzgwtFaFg7JUKCg0Hnv2\nGDk3b78NzJ9/pLdGQaHlo9kIB03TOmiadq+maes0TSvTNO2ApmnLNE27StO0EMb5uHVrmqbla5o2\nQ9O0RxrWW6VpWn3D35jAa1FQUFBoeowYYU3n5gb/+kCEg5vCoVMna5oSE23a8K8RCYe//MUIq3z0\nUWOeG+HQVAqHgweBc84BbrhB7oenoIRDt27WtKyI3rZNbrV46SWgXTugb1/vhSHdZlF9wj4fLT5j\nYxXh0NRoKsLh96BwUJYKBYXQsGeP9Tsksx0qKCjwaBbCQdO0EwCsBjAPQHcAsQBSAAwCcA+AHzVN\n69KItzgHwFoAzwC4GMBgAJEA9IY/BQUFhRaBV181gvkefhho3z7418sIB2prEAkHaomgGRI056Bd\nO/41w4ZZ07/8YhAN1dXAPfcY82jR0qaN0S4woiGCOJxtMSmx8OSTwAsvAHfdZXTPcANVGlDCQabq\nkOU9AMDTTxutLNetA775xv39AGNb6fsWFPDPs89HszNiYy1LRUvIcKAkA93HrRnhDo1kCEbh0JII\nB6VwUFBoPChZ93u5ViooNCWanHDQNG0AgFcAJAIoBXADgBEAJgBYAIMQ6AbgfU3T4p3WE+htYJEL\nNQBWAljTMF9BQUGhxSA3F3j+eWDu3NBeHxtrWSMYunSxrAO7dxthkuvWGSP3K1ca87OyeIUDLbTF\nbIm4OIuEWLvWKvy3bTNGc0SFg6ZZKodwEQ61tXxBtHatNf3DD+7rogqHLoTKlo3ai8QAA/XlBno/\nwCAS3ApZmcKhKS0Vug5cfDEwcaIzqSJCKRzscAqNDEbh4LRvj4SqRSkcFBQaD/rbpAgHhcbif/8D\n7rzz931Nbg6Fw/0wFA2HAUzUdf1uXdeX6rr+pa7rFwG4BgYx0B3AlSG+xy8ALgcwHECSruuDAbzd\n+E1XUFBQaFnQNLsFYPZsi3AoLQUGDgQGDDB+xFhQ4pAhvHWCWioiI63p5GTjv5P64ptv+GKUdVpg\nCotQRk2dAh2pImHzZmta1g6UghEObdoAbdta82VF9L591nRenjVNf/iXLXN/PyBw5oQXS0Ugq0gw\nWLoUePxxoz3nY495ew39fkpKnLt6tCaE01JBQyNDVTjQdRxpS4VSOCgohAalcFAIF3bsAE45Bbjx\nRsu2+ntEkxIOmqYNATAahvLgKV3XZbdt8wGsg0E6XK5pmj/Y99F1/Qdd1x/WdX2ZrutB3EooKCgo\ntD706WNN33MPcNFFQHY2v0x1NXD11dbjIUP4Noy06KIyf9bSs0MH+Xt/841d4QA0TuFQViYvbmkR\nv2WLNf3TT+7rY69LT+dJlkAKh65d5etzUjjoOvDJJ8Drr/OqChkCWSrq64MrYgNh2zZretMmb6+h\n34+u21UnrREtLTSSHo/KUqGg0DrREgiH3wMhrGAETLN9SZWcvzc0tcLhZDL9X9kCuq7rAJ5reJgC\nYHwTb5OCgoJCq8aDDwJ//rOhYLj6akP1QLsxMFBVgEg4UNDCgykCnBQOixfLCYfGKBycClumcKio\n4LMpfv3VWY5eW2ttX1paYMKBKhycCIddu+zdP0pKgNNPB447DjjtNCNjwg2BLBXic40F/Vyy1qYy\niN/P78FWIbM+hNoWMxyhkfQcbG7CobaWJwSdiD4FBQV3UMKhtDS8ZLEXrF1rDAqMGhUcgarQ8kCV\nnI3pztXS0dSEw6iG/+UAVrgs9xWZHtl0m6OgoKDQ+jFunDGqftJJ1jwZ4UAxeLCccIiLA/6/vfMO\nl6I6//j3cOECIkhHBUsoCho0ithQitgIsStGjQViLz8b1hiJRI0aCxg1KooGO4pRgsYKUqw0sQAq\nKghXUUBAQbjc8v7+ePfknJmd2Z29dy+3fT/PM89OOXPm7O
7Z2Xm/533fc9VVbtt6RcR5OHzyifM2\nMMYlsbQeDqWl+gD0xRfA6NGubFlZvIEVJzjYP9/wtJXl5fEjAf6fdxIPhySCAxD0cti0CRgwABg/\n3u17+eX4cwH3Hv3QET+kAkguOJSVAf/5T2ZPD/99Jc2iHv58so3cffihil/5FibWrdMEob5gVlHy\n6eFgvVGA2unhEPU78z1uahqbNumUg/9mgCypYYRj7aNCAquSJ55QIfztt4HJkzfvtUl+oeCQH3pA\nwykWiUgmHX1h6BxCCCE5EJ7a0qdzZx3tjxMcTjgBuOMOjR885hjdH+fhIKJiAqBig01gaT0cAH0Y\nGzQIuOQSYJddgJNO0uu3bas5BcL4hpDfRvtHHGV4xuVx8P+wcw2p6Nw5uk4gmMdh6lRNzOmTzai3\n79Gf8WLnnYNGbFIDdNw44MgjdTaRsOeFxRccvv02mYGci4dDcbF6d1x8MXDdddnrzoUrr9QpUA86\nqPIjh/lMGtmokct3UlEPh5omONTksIprrwUuvxw49thgEldCqpuw4LC5wypWrHDrfrghqX1QcKgk\nxpjGAFLRwFiWqayIrIF6QQBABSaKI4SQ+s0hhzhjaPjw4LG99tLXKFGiWTOd0vKyy3RWA5Oa2yfs\n4bDzzunn+uKA72ExdqwzEDZuBJ5+Wo2d9euBYcPSjRzfEPKNfvvnG/VAFSc4+LkU2rQBWrd221HT\nYlrDvHVrLR+HLzhMmRJfLg77Hl9/3e07+OCKeThMTfkEFhcDs2ZFl/EFh/LyeGHCUlaWPkqXSXAo\nKnJizexM/osV4J139HXJEuDjjytXVz49HBo2dNO/1hQPh7lz3eeVjSijqKZmRV++XEVQy0cfVV9b\nLKtX6+w/pHr56ivg6KPdNM3VQXULDv71wh6ApHZBwaHyNPfWk2joVnDYMmMpQgghaeywg+Y2mDcP\n+Nvf3GwTgOZvAOI9HKIIezhccIETIyx+fYce6tZvvDG+nUuXAiNHBvf5goM/dWcmD4e4cAJfcGjb\nFmjeHChIpSLOFFLRoYOWjWPWLBfv/tZbbn/4M4lj7Vptm/WM2H13oH37igkO/gPmshg53xccgOwe\nGFEPzJkEh+XL3bqfXyMf+O/pvfcqV5cvGNiwn4oKDvnwcMhnDof583VGmj59gn0yjqjvOBcPh9JS\nDXF4+OHk51SUsDEZN33t5mLdOhVdd9klGEpFNj933gm8+KKGAlbX6H5YcNjc+W7861FwqN34gsOa\nNSr+10WqUnDwHEWR5O+9GDpTRdNsBQkhhKTTuTOw2246CnvIIW7/3nvra5TgYI2wMB06BKfLPOQQ\ndXH38UdrDz3UhVf4I+WTJmm86fvvA40b674771RxxBLn4RAlOFiBZN68YMK7sjL9sw6HVBjj3nf4\noXD9ehfD3r69y0cRxZo1OtvDunUun0P37u6zzcbatcCbb7qpL+3340/b6eelmDEDuPTSaFdyfwaK\nOCEhLDhkSxwZ9cCcVHBYvjx/U3pu2BD8DisrOPjigg37iRMcRNLFk7DgUJM8HHzPhmnTspevbEjF\nM89oiMOZZ2pfriqWL0+fyjXfolauzJrl3NhfeaV621Lf8e9/n31WPW2obg8HCg51B19wEKkbyZqj\nqErBwf8rLUxQvjE030Me83QTQkj95Npr1Xg/9ljNZA2oQR0WGOIEhwYN1GsC0FH4rl2B008PlvEF\njFatgP33Dx7v1g0YPBg4+WQ1zK++WveXlanxYvENIT9xo80VYQWHRo2c6PHzz8Crr+r6xo2axLFV\nK+CWW9z5NkTCGnnhP3LfKI/zcLDhKIAKDTNmOGNzwID45Jph/PYCTnAYONDts4bM+vXAEUcAo0bp\ncV/AKS0NigxRHg4iuXs4RD3kZHqI9gWHTZuiw1UqQlFRcPvddytXX5TgUFISPTvDiSfq9LLnnusE\nlHwIDkk8HIqL9Xdy3H
HJRQBfmPG/D74+MPoAACAASURBVItNrmrfQ2VDKvzEqX5fzjeTJqWLMVHv\nb3MSFthI9eF7sVWXsV2TBAfmcKjdhP8762pYRVUKDv7PMUmYhH3srcEpjIBNmzZhzpw5WZfvqluO\nJ4TUa/bYQ0fHJ0xwngcNGgAXXhgs1zSDT9mf/qQG2MiRamgde2zwuPVYsAweHNw+7LDg9tChbt3P\ngxD2cOjeXdffe08fLu0D1Y47An/4gyt72WVqTI0YodN1AkFviLapLEJWcFi7Nmho+m7acR4O/nv6\n4IOg6/qAAfHJNaN4/nl9bdwYOPBAXd9vPyd0vPqqtm/CBPcAu2SJejpYioqCLpdRQsJPP6VP/ViV\nHg6AjkDbGUoqQ1hA+eKLyj2ARQkOQHpIREmJzvwCAA884Fz68xFSkcTD4emngaee0j7y6KPJ6vU/\nl/AjR3Gx5gj59a+Bs8/WfZX1cFi0yK37yU/zTZRBX92PVFUZQlQTeecd7Tv+DEY1hZogOIR/N9WZ\nw2HFiuB0t6R28f333wGY87/lnXei7cpNtXz+0yoTHESkGIC9LXTKVNYY0xJOcEg4gVf1sGLFCvTq\n1Svr8sADD1R3UwkhJI3rrw+GSmQa+T7jDDVwbRLKZs2CySPDD3u//W1w+/DDg9s77OByNLz7rjPE\nfENoq610dB9Q4/tf/3IPU507A0OGOE+KhQtVgLj99uj2hwUHEaBfP+ddEfZwaNzYjWBHvaeZM4NC\nSb9+yT0c/Pd5wAFO6CksdF4OK1dqAsawwTl2LPDf/+p6+DOP8nAIezcAFfNwyCQ4hI2uTz7R76dj\nx8qNuEW9n/ffr3h9UTkcgHRhxM/6DgDXXKMJPsOzVFSVh4P/Hv0EpZnwDS/fIBYBzj/f9dVHH9Xf\nUGVzOPiCw6xZyXOO5EqUwFTdXgV+f6/utmwObr9dvWNuuy36flKd+L/V+ujhUF6efj0/zGRzMn26\nzlIU9kwjySkqegBAr/8tw4ZF25Urwn9StYyqnhZzATQvQ1djTKZrdQ+dU2Np164dZs+enXU555xz\nqruphBCSxhZbAM8957b9XA9J8LXUM84IHuvZE+iUkpcLC4H+/dPPtyERmzY5d/k4wQEA7r7brXfu\nrDkZRo1y+8aPj3aPB1xIhT9TxYwZwCmnaPLGsOBgTLqXQ/fuLsxj9mw3K8Suu6pXRC6Cg2XffYPb\ngwa59fvvd4ai733y0EP6GiU4hPMnbA7BIWx03XefXmPlSuDxxzNfKxNRgkNl8jjEeThkExxE9D1t\nLg8Hf6aPOXOS1Rvn4TB2rC4+b79duZCK0tKg91BJSTDEIp9ECQ7V7VXg9/cffqi7id0svmiYbYab\nzUlJSfD/Iqng8PHHmqj3zDPzk2+mOgWHn35Kfw/VIbxs2qTTaN90k4ZwkopRUnIOgNn/W0aMiLYr\n2/kJn2ohDbMXqRQzABwI9V7oBSDu76mft/52FbepUhQWFmLPPfes7mYQQkiFOfJI9Rz44APgyitz\nO7dfPxUsli5NFxysGDBihE6xGZUfYsAAl+X+wgvVqPUf5rfaSoWF1q01ttEPBbAJJXv31hHc++7L\n3FYrNBxzjLqsW8rKgLPOAn73O7evfXt9bd7cxVQWFur23nvr6K5vpB53nL6GQyq22SZoHNn34bPr\nrsFt3xPENxSvuUYf5oqLdUYCIP3BsrhYjXz/WSRKcKjqkAp/hN5PCJorVSU4GBOckSUsOETNgvDx\nx8FEqflIGhnl4VBSEpx1ZcEC9UiIy69iCedwENH3GRYbABWxKhNSsXRpusgyYwbQt2+y83PB99zY\ndls1eP33l4myMp3BoGNHYJ998tcmv7+Xl2sbO3TIX/01Df93WJM8HMJiVFJvqtGjdWrVjz7S/6Ze\nvSrXjuoUHKLuy9UhOHz/vfs+7P8TyY0NG4CNG7cB4OYVb9FCZx8KU1iYJB1izaWqPRxe
8NaHRhUw\nxhgAp6U21wCowAznhBBCcuG004B77lEDOVeOOw645BKgSZPoY598otNoRjFggFtfsCDdcN1qKzXq\nwuEZTZsGc0j84x/qbfDcc8C4ccAbbwTLt2rljMMhQ9Rg+ugjZ+zPng3ccIMrb40H38OhfXs1cOy0\nopaGDQHrxBb2cOjWLbi9445IIyw4bL+9TrfnY4wKOjaEZdEiNfiiHizDRnqUgbBqlZuRI4rKJI0E\nggb8gkr4KfrvxeYISTriH4VtV2FhMOdINg8HQEf0fSGla1fXp3LxcPBHiDt5AaZWcJg/P5hzo7xc\n+2o2fMO8uNh9X1Hi0uTJlQup8MMpLFWVx8EaMca430VJSbLEpI8/rvegAw+MnuGlooT7e10Oq9i4\nMdi3qntKUh+/XYC2LdN9zWITEANuauKKUl5evTkcoq7l/y+E8/dUFX6/CH8vdYVnnwVuvTVZH6sI\nUf+7TBpZAURkJoDp0LCKPxpjovTm4QB6QGeoGCUiAUc1Y0w/Y0x5aonQ7QkhhNQWtt022ggHdPTX\nihh+WAUAvPBCcMrMBg10lOq444BTT1UDwzcobTiFpVMnDfkYMyZ6lNT3cAjvC099efzx+j4A9Szw\nr7vNNkHRwuassBQUBPNgWC67LNiuiy5S74kePXS7tFQNqCjBwYZL3Hqr5sm46y53zHfhzxRWkYuH\nQ3l55lHPhQvjw1yyYWOBGzRwQs+qVdGj80mwwkCjRio6WDJ5ONjvU0T7HaDfze67u5CKXDwcrOHb\nvr1+HzaJqxUc/HAKS9S+MOEH0+XLdYTfetjssYcTt2bNihYikoZU+Aab5Z13Kv49Z8K+r5Yt1VPB\nEmfk//ija9/UqfpaUqI5OPJFOKSjLgsO4RCKmuThEGXYJslf4IcDffhh5doQlaCxpng4DB+uoWN3\n3FH17ajrgsPMmTpYcfXVwIMPVs01okRUCg4V52LoVJeNALxujLnaGLOPMaa/MeYBALemyn0G4M4M\n9WSMujLGnO4vAH7jHR4UOt6nEu+HEEJIJfBzFuy/v3paDBgQzNdw5JEqIrRrB0ycCBx6aOY6Cws1\nq7qloCC63H77ASedlL4/ysPBhin85jfB+i66yK0bEwyraN06KHaExZVu3dJn9wCAP/5RR9mXLNGH\nkNGjdb8VHAD1HIjzcNi4Efjzn9Wo9Ed2/Wk9kwoO9n2vXh0d77x6deYR/g0bsodw+IjoCPysWc7D\nYZttgsJMRUerfQ8HX3AoLlaDec891dPFf3j2wyiscdGtm4pRuYZUbNjgDLguXbS/WFHNGvtR4kI2\nr47y8vSH1e++C+YX6NjRvZfy8uiR3Yp4OFghbu1aTSyYb+wDd9u2QQ8sa/Q/8gjQpw/w2mtatmtX\nYKeddDpNv51Jk29mo6Qk3aDKh+BQXq4eNP60t1XFzz8nF4fCHlNV6eGwbJnOojJuXLLyUYZttnCC\njRuDSQ398KWKECXSZQo/yzdxgoMIcO+9em+6//6qb4ffL9at23yeFZsL3wPyppuq5hoUHPKIiHwI\nYAiAtdBcDjcDeBfAZABnQYWEhQAGi0hlJnZ5JLQcldpvAFwVOvbHSlyHEEJIJbjiCmC33TRs4qWX\nNPxi8mT1VLA0aQJMm6YP9mFvhzhsckcg80PoJZcEt7fYwsXLR3k4bLGFS67Zt6+KFj5+WEXr1m52\nDCDdw8EXRcK0aaN1+V4J3b2Uyh9/HC0aLF2qo3ZRIoAvOGSKd/YfYm2by8qiDdIkxlYueRwmTdKZ\nOnr3dnV36qQGuiXKpT8JcYLDpk36EDl3rj5Y+mEgvuBgsTG11sNBJFniQH9k1b4fG3azcKG+X19w\nsN4P2Twc1qxJNyCXLw8aVr7gEEdSDwf/8//97916ZcJdoigtdaPFbdoAW2/tji1frsb5eeepWDR8\nuIZS2b77/PNBYaoys5v4rFiRLrzlQ3C47jpNINu3
b9V4iohoUt2+fVVIPfzwZAkTk4RoZeK994Cj\nj3bTAGfi5pvV62zYsGQzHVREcFiyJPi+P/qocokjo34zNcHDYc0a5zW1ObxSwkJUXTKURYL3tqh8\njc88o4MNlREjKDjkGRF5CcBuAO6CejKsB7AamkTySgB7ikim1C8Seo0rk8tCCCGkGvjVr3SU6aWX\ngkn0omiQw7+Ub6BmGm3p3Ts4U4QfnxklOAD6cPHii7qEQzJ8waFNG+eV0LJlUAQB0vM3ZMP3cHjz\nTWfk+kLEsmXxo7m+wWln2IjCPjAXFAQ9NqIebpMYW7nkcXj55fR9YcEh3x4OmzYBn32m6yJuxhQg\nmGfEssce+upPm5rEy8E31O378QW0f//bjbh27er6x6efBmexCBM3k0NYcOjXL/035D88hwWlxx9X\nUSw84mxDFpo0CSZbTRL6kQv+A3ibNukeDpMmud/2p58GE4rOnBkMB1iwIN17YMECN01oUqJmyMiH\n4DBpkr7Om1c1niIPPgiceKJOXQhoiEmS8IOw4Z/Nw2HZMp3q9w9/UOFk+HC9T555ZnYhxeYqKStL\nJl5VRHDwRT9A+0RlkixWt+AQda0VK4L3mp9/rrppay3hflGXwirmzw/+7qOE95tu0r5/ww3pIXpJ\noeBQBYjIUhEZLiI9RKS5iLQRkX1E5A4Rif1bFZGpIlKQWmI9E7wySRZ6OBBCSB3jvPOcu7oNSYgj\nLqllVEiF3X/kkdECSTik4qabdPaP555LHxnJVXDYaSdnME6b5vYfcIBbX7o0eorC1q3VG8Oen2m2\nBysstGoV9LDYbTfg5JODxnUSY2vaNJ2F5LHHspeNMjQ6dgyKNdkEh/fe0/e6yy6ac8OOrlqvj7Dg\n8MsvQePLHxHcbbf079l6OPiCQ5LEkX67reBw5JFu3y23OGGhVy93nbKyzIkjox5KozwcWrUCjjoq\nWM5PXBl+kL7qKjV+/dlrysqc0dalSzDDf749HPz3FeXh4E/pW14OPPWU2/7kk2BdIkFB5JdfVIAZ\nOlS9rJIS1d8rKziUlwcNRP+37VNWpiFn2UIOZs1STwHfyJw8Ob1cVC6OMLl6ONxyi067+sQTmkPD\nzliwenV2scIXAz7+OHvb8iE4AJULq4gSHNavzy2RbGXwRWD/9xEWnavayyFOcFi3TmeBuvrq/ExB\nWh38+9/B7aKioKgg4u7tJSUVF8QpOBBCCCG1jE6dgLfeUiP33HMzlx0yxHkP+HOIx3k4ZMJOa1lQ\noHknOnXSBI4DB+qsGz6ZQiqiaNIkPSwD0ASYtu5ly6IFhx9/1FCR3XbT7U8+iXehtw+xLVsGBYe1\na9Wos6OxQNDYipup64UXNJ74tNMyZ4UvLY02rHMJqRDR7/u993QEe+ZMnUWkvNw9JDZqFMydsWhR\ntIdC69ZaNvw9WQ8HG1Jh256NKMGhVy83cu/nuhg0KLkxH2V4RXk4AMD//V+wXJs26XkkAH3QtR4C\n33/vZu5Ytsx9jl276mdk++TcuclCS5Liv69wDodFi4BXXgmWz2ZU+UbYu++69/TYY8kzz1eF4FBU\nFBQH4gSH8eM1Z8zpp8eXWb0a6N9fcyH4cedRYU1JQpNyyeEgEvRQmjkzaBBnEgM2bAiOIocFoyii\nZpPJNjVmlODgJ44UUQP5179OllAyLu9JRRPbxrFhgwo54dFz//O193Yg/T8g029jwYLk+VviiAup\n+Ne/gIce0v/AfCZuzRfffBM9/bJPWHAoLw+eE54dxXrL5QoFB0IIIaQWss8+6tqbbcrqwkJ9mHvn\nHeCvf3X7/USFSb0R9t9fH5a/+CI4kwaQLjiEQyyS4IdPWA44wI1Uf/ll5geefVLzQ5WXR4dVlJe7\nh+Wwh4PFumUDQWOrZ89guaZN08+9997g9saNOsq//fbAk09Ghw506qSfnU3AmWkEadas9BHLlSvV\n4IoLqYgL+bAe
Kb7gsP32rh358HBo0CA9L0mvXtpv/fnXM4UrJA2pAHRU3/fY+OILzWQPBI2O8Gdi\n3fz9/Tb/hG3nL78An38e385cyeThMHFi7m7ifh4H32Bftw74z3+S1VEVIRVhT4Np06JHg/1Qn7fe\niq5rzhwXInLrrS6/SNT3ksTDISqkIi40YtGioMH/5pvB45lCOMJiRK4eDlYczubhECVI+PeL+fPV\nQP70U+DOiNT1a9dqEkb7e4gTbfMdVnH44XqfP++84H5fcPDvv2HBIU4oevBB9QTbbbfKJXqM83Dw\n7xdJRKTNybx5KpbuuGP8f0BRUbTY6/ezcJ/KJWeRjy84WO/KjRuDYsbf/qahUZsjuWxVQsGBEEJI\nvaRVq2DIAQCccALw978DY8emT4eZiV13jfZECAsO/gh5Uvw8DoA+fOy5ZzB3hOWYY9z6hRfqq5+v\nIhxWsWqVeiFYY6dVq/Q2AyrOLFsGjBihhp/FjvwDasT6SSotTz4ZfEi+5x419pYu1VHZKOzIthVo\nli2LNzbHjHHrvmDkjw4mFRysV4svOPjv0SYXBaIN0TBWcGjWLOgx44dVACrKFBTojCg2R0gmD4ek\nIRWA1nf55W7/Hns4Y80XHKwrfHj7pZfcPis0+J4Y+czjEBYcmjcPfua54ns4+KIZoPkqklAVHg5h\nw3/5cud98PTTmgdh1aqgIeOPvj/zjBqbjzySLsbZKVD9UJ2460YRHv31E3mG+e9/g9vhzziT4BD2\nPPAFwjisUVtQ4PriihWZZ8Wx12nY0PUlX3DwP78oA/nii9XoHzBADUFfcPAF1nwKDmvXOoFs7Nig\nt4J/Hd/DIXxPi/NwsKP3X3+d2fssG3GCg/+dVzTZb1UxYYKKZ2VlQa89n9dec+t+AuiqFhysmAsE\n74Ovv66eTkkT/NZUKDgQQgghKQoL9WF/6ND81NeokRr+W2yhxkFF8I1oQEcxAeD449PLHnWUZu4f\nMQK4/nrd5wsO/oiviBq+Tzzh9g0cqEZvmDlz9HojRwZHTn1jfKednJHrs2GDe+8rVwI33uiO+SNs\nBx+sry1auHr9sAr7kPf448CoUW4WDRvHv+WWwalVp093I7NhwSFsXFush4P/mfn5Mvz9b76pD6Fj\nxkQbG6Wl7iHVTolpGTjQvbeLLnJeKM2aOY+Wjz/Wz+6TT9JHIuNCKmxIRNOmQa+G4cP1fbRoAVx6\nqfNw8B9iw5/Jp5/q52fzYRQWAoMH67rviZHPPA7+g7Z92Pe9HAD9jqLEPZ/dd9fXoiL9TDZtCnoL\nABqekSTRnS8uWJHPnxGgIkQZ/tOnq7F2yinAHXfo78Q3ZHzj8IortF9cfnm6J8MLLwTPGzjQhdBk\nExxKS6OFtDjjNRziEhYFM3kfhAWH0tLsrun2+2rb1t0vgHhvFRF3nR13dB4BX3/tRoz9Ni5cGPTm\n+OUX4NlndX3FCr23+r8ZP39PPgWH8G/RF8d88db3wgt7yMR9Z34fqKgHgki84OCLPxXNbVBV+KJd\n3H/Aq6+69XPOcevVJTjY30QuCbRrIrW8+YQQQkjN5h//0BGrM86o2Pm+wXv99cAOO+j60KHAYYcF\ny/burQbGX/7ijOeddnLG53vvuQfT997TkBJAjz/8sBoy+++vhsSkScAfUymWS0qipxn0wz26dQNO\nOknXCwuDD8n33acGxV//Gh/r/NhjGg8+d67zsgjPVPHqqzp96qWX6nt8/HE3Sn/yyToFoBUWpkxx\n5xYWurAIIHp6UcB5IfTqBTzwgOb3OP98d9xOjwro6O7Ageql0adP+gjU0qUuz4P/PgA1AGfMUDf5\ncIJTOyJdUqJ5HXr21D7gu9n6D6R21Hb1amdcdewYFDiaNFGDdtUq/Yys4LBxo2tjeIR0/nz9zq2I\nccghzu23qgQHXwCw35c1li1jxkR70lhatgwao/PmqReGFQjsg3tpqTMmi4ri+6
UvOFghA0iWlO8/\n/9Gkiv7sGUB0uMO0adonrMH78svBfmqnPvz+e7d/9er0OPmw4LDLLs5T6KuvMufcWL48Onwiyj1/\nw4b4MA9LLh4OQPawCl9w8L2EfK8rn1Wr3O+yc+fg92dzx/ht3LAhaDC/+mrwd/fii5tHcAjPWjJ2\nrLtvW8Fhq63cf0EUUf2zpCRoOFd0dpSffkr3RrH3JP/zrGmCgy/aRQkOZWXu99SiheZ6smQSHD77\nrGIJMq3g0LChCmIW+1n+/LO7d/jhfLURCg6EEEJIFVOZh4Xu3dX4GDtWPRcsxug+awS2bZvuDQGo\ngWVH0L//3j0sPfSQKzN6NDBsmDNSDztMR7P79IlvV+fOahxbceC3v1Uj4N13dSTplFOc4ffll8A1\n16jwAKTn2Nh2Wx3JHjQomAcjPFPFAw+47TvuAK67zm2fdZYap9YY9V3DGzXSXAbZ8MMezj5bZxzZ\nYgu3b9ddXbjH6687o2n+fBWU/IfOqPwNPltvrW0KT7PqG/NTp+rrrFnAZZe5/b7gsMsubt0a1VGe\nJoDrh35yVBv/HxVSMWGC2z7uOLferp0ztubMSRYLvmgR8Kc/qffK//1ftItwOKQCCH5vjzyiXjy+\nZ02DBsHkkl26BI9/+GHQ1d8ftXzlFZ3NYbvttI477khPBmpH/Fu3Dhp4mUJqysr0+zrySO33XbsC\nf/6z+37sKHPjxk5QeeutYAhIlCjx4YfpbvDhpKvz5wcN8O7d3ehpSYl6XHXvrsazZcMGba8VDIFg\nv4wSHKZNy55TI1fBwY64r1+f/v5/+cUZ/23bqhBnv48pU5zHwtdf6+/G924A1CvGFxzsaHfYC8MX\n3vz+D6iA5AtT2aYQrihRv0Wbo8FP8OsngA0TJTh8/XVQcMpVcBBR4zrqe125Uj8b//NZsmTzzd6R\njZUrg/8J8+eniwRz5zoRwPdCAzILDmvWZJ+RJQp7rdatg4K4vQ/6vwEKDoQQQgipUgYNUo+GsFvl\nttuqa/9JJ+lof0FB9Pm+cPDII2rsPfOMbrdoER2eET4PUAPp3HP1AenKK1Vs+PRT9ZQ49VQts+++\nLu+EDesAgNtvd8bcn/4UjD/2jWwf/4HvnXeCrtMbNrgHsyFDnNAQJZIUFqpx4rutRhGexjSMMcHR\nc5/nn1evC0BH//zcB1GCQxx+zL3PAw84A8j3BIia+WTbbTNfw3o4ANoXfvopenYCK/A0bJg+veb+\n+7vzw4lBwzz3nPaJm29WQ+8f/9DQnXAS0yjB4frrVVh69VXnJeQLCt26BftP165Bw3LevGDCyIsu\ncnVPnaoimIj2p+HDVWizRpmI83DYeuv0aTqjmDoVOOgg4K673L4NGzREok8fFRusIdy1q/sclyxJ\nN3DDRAkOUfieBzvvHOz399+vRqM/O8+112p7Z8wInmeJMl7936Iv+PgsXhw/8hvn4VBaqrNu7Lxz\ncNaNcLiNMc7LoaRE+8fKlZp7p39//S361+jcORguZvM4hI1nKzgUF6eHaqxYEYzx9wWHfM4uECUE\njBunn6X1pGjVSj8Df5pbnygDOJxTIVfB4corVazq3Tv92MqV6Z5jpaWZ82tsTsIzkKxbl37P88Mp\nDjtMvcfsf0ImwQGoWFhFnOBgvzs/xIiCAyGEEEKqjb320sSM4fAKnzPOcA8s992nnhF2ZPvkk4Oj\n+D7dugWN8OOPB/75T324tCPFHTtq8s3wSD2g04SG27XTTvrgesopbp9vQPr4Hg7PPRc9FWXLlsGw\nBGvA+ViPCj8kIookU6GG62jXzr33kSPVM6JbN80zYdlpp+z1WqJyaFguv1xd3q1x06RJ9CwmcR4O\nFt/DYd26+CSaNlxlwAB9KPa56ir3vv/613SDq7hYR6S//Va9T8Lf3VdfqZDmnxclOHTooPUfeqg7\nttde7jvdb7+gl0eXLmqs2mlQZ850hnS7dv
p59e+v22vXpk+B99prbqrHb791o+pbbx00rMePVy+B\nvn01xKe0VH8T/fs7gaOgQBPR2mSxc+ZoX7Du6DvtFOxP4RkiwsydGy849OqVLkhuvbX+PqJmx5k/\nX0fLZ88O5j6x+CJO2HgVcV4UhYXAmWdGt2n9emdUlZS4/uR7H3Tr5vrjxx/r92WFqBtvdKO84SlT\ngeBsLy++qN+dLTdyZHAGom7d1CvC9lkrOIQ9HKx3wZtvOq8J3xj0fyu+aPrcc/q9PvJIevLMXLFC\nQEGBa++772pftB4DdjYhX/TwiRKJwjk8vv02eSjIvHnqAQREJ/dcuTLa8yEfYRXvvqsidabpLKdM\n0d/2pZdGi1xRU56GPUl8Mcneb2yoQ1GRvu+ysmgRJVfBoaTEeXm1bh0U+GxyZ19wqEjC6ZoEBQdC\nCCGkjrPddiosAGoAXHqpO2bzNERhTDCHxFlnuf1Juemm4Pb996uhfOaZ+sC+3XYazhFF+/ZuNM1/\niLTtANRzwh95zkVwiEpImI2wh8OIEe5BHNAcA/4D6VFHqfCSFBsiYzn+eHfNJUv0wdoa5m3bqndH\nePQrm+DgezisWxd88I4Ky/HzWFj22AM4/XRdX7NGDTzLDz+oR8OWW+qIszVqjj5aH6btDDArVwJX\nX+3Os8bills6wSCKtm01WeiFF6pRakOGbLsaNXIJ9b76yrl5H3KI9t0BA1x5m7PA9w6ynh3jxrl9\nBxygngv2wf+pp/T9TJ+u4tLee+uUg5bOnXXEdPx4NaCjvGu6dcsugvlk8nA4+GAVN3ysGBXn2fPu\nuyqQReVt8Eexw8brvHluNHvAgMwi2eLFarjvvrsKC/36qcBhhZwuXZxHypIlOouNpbRUxUkgKDjY\n32m/fk6seOml9Gk5bb/u3FmFzy23dN5Gn3yi/SIslFlB4YUX3L7bbov2Huvf3wkzs2er+DVsmH4X\nUR4cSVi71glP++4bTCLrCz9WcIjzcEgiOADJvBxE1PsnU56CVauiDfEvv8w++0gmXn5Zv+ebb9Yw\nh7jZGm68UQ30UaPS+wEQ/bvx73s//uhyGnXp4hLTWsGhvFwFj6IiJ576U0hnS3gaxu8fHTqoiGqF\n/ylTXPiKpbZ7OEBEuCRYACwDIB07dhRCCCGktvHRRyL6GOOWgQNFyssznzd7tsiee4pcfHH2snFc\neqle78or049lq3PpUpHu3V2br3zuVQAAIABJREFUDzhApKxMZMwYkaeeij7/tNOC7/Pcc3X/mjUi\nBQVu/ymnBMt98kmy9/Ob32j55s1F1q7VfcOHB+saPFjknXeS1Zep/W+/LTJ+fLDNjRrp+u67a/mz\nzw5ee/z4zPVfdZUrO3lysO1/+lOwrsMPj/+Oli0T2WILLdekicjKlbp/5Mj0vrb11u54UZF+dn4b\nRETatdPtHXbI7fMqLRW54QaRESN0XURk2LD0NjzzjB779NP0Y+PGiWy/va4bI7J4sUjXru74V1/p\nuc8+K9KgQfr5/nLPPSIlJcE2rl0rsvPOwXJjxmhfbtMmvq5GjYL9P2556CGRWbOC+846y31PUef0\n6uXWf/1rkVdeEenSReS3vxX55ht37OijtZ5Jk/QzPussd+zee0U+/DC+XRMmiDz8cPzx888Xue22\nzO/trbdEnnzSbY8a5T7XIUPc/saN089t3FjvYZbjjnPHnnsuvXyrVvqdbLON69fr14sMGJBeduNG\n7VNRbR49Orc+bHnnHVfH2WcH71Hjxrn1YcO0/LXXxn9umzYF6z700PQyDzyQvU3//W/2/geIXHJJ\n+r4WLfT3dMIJ2e/1K1aIXHCByIMP6vaUKenf6e9/L/LddyLFxe680tJgmT593LXmzNH3uPXW6W07\n80xXxyOPuP2XXeb2X3GF2//mm9oX7faJJ7r1QYOyf44+/vVuvln3+d/PZ5+5/5mCApGOHTsKAAGw\nTKT67e
Jcl2pvQG1ZKDgQQgip7Qwa5B5oTjhB5OefN9+1ww+/ufDDDyqOtG0rMm1a9vLl5Wqo/+EP\nIgcdpAamZf/93WcwdqwEHkB/+CFZe2bPVlHgtdfcvrIykTvu0Adm38CpCIsWiRx7rMidd+r2xo1q\nCIUfmA86SI8vWRLc//bbmeu/8UZX9pprRHr0cNtz5wbr+vzzzHVdfLEre/vt+jn86lfBOho0EJk4\nMXje3Xe74w0b6kO+3e7Vq2Kfm8/o0cE2NGrkxKHycpEOHYLtW7UqKJT4BubAgcG6H3tMDdEttxQ5\n55zgda66Kr5NM2cGy06dqvt9oxkQadrUrffoIXLqqenffXixvwvfQBs50r3fbOf/5z+urIgadPbY\nLruIDB0afd4334j89FNw3y67uPU771TxLe66t9+u4k54f2GhWz/++OD3+fjj7jN9/PHoeq0oZ41X\ni/8dh4VJu7z8slv/3e/0vPvuS69fRIWlzp3T6zjiiOR91eehh1wdo0fr52O3zzjDrV9+uZb/5z/j\nP9uiomDd4d8lIPJ//5e9TQMHuvLGBM+3wgwg0rdv5j722Wfx1ygvFzn4YFd27tx0gc5fWrVSoUtE\nheLw8ddeU4HTFzaBoIi4//7u+n4f9YVi/3s/9tjgf8bo0SItW+p6ixYqTInofeaQQ/TzCH8HFl+0\nmzJF991yi9v3z386MbdrVwoO9Wah4EAIIaS2s2yZjuo8+GDFvRWqk3y0edQoffpp3Fjk22/VaLQP\n0nZ0vCZywQWS9lA9ZIg7ftFFum+LLUR+/DFzXe+9l14XoCNq5eXaR5o0STb6uXChO79LF5E33nDb\nAwfq9qxZ6eeVlsYbKIcemttnE8XUqZnr9EcnDzxQ9xUVBT1g7PLEE+n1r1gh8ssvun777XrekCHp\nng1h7rpL+9qOO7rzx4xx12rQQOT00932MceoB0i4TXvuGdxevlzrmjZNBZzCwqCBZz/rjh1F2rcP\nntu2bbQgGCVyhdtgsd4pgHot2PWhQ5140LFj0KAGRJ5/Xs/fd9/g/ksu0XYBKsD4o79vveWuu2pV\n+nd21VUqMkb1u4kTXbmttope79/frd9/v563fHnQs6V1a1fngw+mfzbNm+tn+vnnwdH4bFhvMEDk\n9deD3731wAFUNBRRoci/ri/UzJnj6i0udu3fbrvgbzSKjRtVkF682IkMXbumi0cHHeTW7b0UcIKP\nv9x9twqStt/7PPZYsOwf/uDW99pLvdnC9Z16qp4bFo4B9XJ49NH0/UOHuvffsqXe79asCfbRsjLX\nrqIiFRPs+Z06ufWJE4NC3Lhxes6tt7p9xx4b/fn++td6vKBAZN063ff++8H22/XBgyk41JuFggMh\nhBBS+yktFXn6aWeMWONi112rt13ZCLvLAyoyWDZtUtf1mTOT1XfddcG6mjcXWbDAHc9FfPFHJv0R\nxKefznzehg0if/lLutv0SSclv3Yca9YE67z33uBx30i89Va3Pxxm07JltIEUJhcxbNEiZ2SIBEf4\ne/YMGlDXXKNlwiE7vodI8+bB63/9tYYi+Xz+uYbLfPihhkj4dV1wQXQ7d901WK5Zs6Dnxy23uLK+\nYOCHAPnLhRdq2aeeUiOuc2fnZWWFQLu88Ua69wigI+phUccXCAD1UIgj7A1kl+OPj97vf47hsApL\nebkat08+qS7/9rj1KNhuOz3uG7JLl+rvZtAgkdWr3X7fm+Dbb/VYVLvuuUfLh8NZrBs+oKEQv/yi\nfccXbE480QlETZuqYOb38a++0uNbbRUUGG68Ub24wt9puG3t2qnhHt4/eLAKaIB6B5SUiFx9tX6u\nvlEfXv7+d23XuHHa9mbNdH+LFiqMnHtu9Hk9e6bve+EFDROz259+GvSS8e+plpdfjg6j+vhjkRkz\n3Hbfvlq+d+9guddfd3WVlem9yYo4e+zhjpWUpHtkAOr9RcGhniwUHAgh
hJC6x+efq/E9b151tyQz\n5eX6YF1YqA+rnTtXrs2lpcEQm3DIQy48/3z6Q3KrViooJOG114Ln+rHVlWHHHV2dYQN8wwb1SDj6\naA0J8Pc//LDu79nT5X2oas46Sw2pxx5Tg6RbNx3htyJQcbHI3nvre+nQQT0s7Mh+7965XcsfgQXi\nc42MGaNG3W67qSH2xRfaD8ePV8PTH7m3Xg1NmmjbGjZM7xPWdVxE3c99UauoyBlhW2yhhuSbb6bX\nce216e288053vEEDFzoTRXl5tOfGpEnpRuVvfhM8Nxy+EEXUaLtdrLDz889BYcC+p5kz3WfQoYMT\nkaJCIWzY1sqVwf1+2MWjj0Yb49ddp6Ei/r599nGfm+9hYxdj9Dc0Z05w/9//nl52r72icyb44RiN\nG0d7bUUtfkicSDDEaOJE5+1jjMhNN6Wf366dyJdfah4jERU57LHWrYPeOTbMKYwv8AEqxpWU6Hfk\n51gJ38sADYtav17kr3/VPmY9d/w+YYkKP3rgAQoO9Wah4EAIIYSQusT69Wqsvfpq5eopKUlPbHjF\nFbnV4Rsu4bj7inLXXWqU50vAqGr8EfDy8vQwh7Vr1fCZO1e3//Y3kZ12cvkXkuKHm3TunJ9QpeXL\nNZRh0iTd9kf6ATWysoWb2ESftu+UlKSHf3z5Zfp5ixa540nyf0SNym/cqIZd2DD3+f57d8wPJ/GJ\n86CwBvHMmSJHHRXc36qVihD77ef22fwtIsFEl4DIYYe5Y+XlwZwfd93l1n23fH959FENrzvxxKAI\n0KePiphRYUX2mmVlbl+TJtFhC8ce63JlNGqUHi4Tt/TsmZ6vZMcd0/unHxZzwglO3Np1Vw2xCXtM\nDR2a/j126ZJ+/U6dMnt2ffSReitMn679xeLn2fAXX3SLEo2AYD4SEfVcCZd56y0KDvVmoeBACCGE\nEBLNihWa62D0aPUQ8B/Ik1BerjNAjBqV+7mZSOplUZ/YuNGFvljX/HxTUqKjzTaW/9JLs59TXu5m\nMrGcd54zvOLyDYhoEsXtt08mvqxeHUx2CLhj99yjRniTJtFJDu+5R4UBO7NKFNtu6+odOjSYDNCG\nA4SXAw906z16BMWmG24Ilg17pHTr5o69/np0/TYBY0GBCjSWWbOCs6T4YoPfVt/TZ8IEDR+YODE9\nh4T9rtev14SLU6em52cILxdf7N7vlCnBY+efn/75btwYHYJhZ+4Ii10vvBDdB444Qv4nDBxzjBPy\ncuWHH7S/hNvz3HOZQ0WAdAGtuFi/b+t10aqVemBRcKgnCwUHQgghhBBSF1i/XsM1qjp57BdfqOFV\nUeFn9mwXRvTf/+avXc8+64y+Pn2CxxYtSg/ByQWbj6JnT/VK2bgxfYS7UaP0fAh2CXsc+dNSduuW\nfr0//1mPHXOMfqfh+o46SkfuJ04U+eCD9PPnzHGzLdileXMVXM48Uz09fO8bn3ffTb/emDHBMsuX\nB4/74Rw77BCcLWnTpmAeA+sxEyZq5hab4NMPa2jSJJgvxae8XEWGpLMTZcKfthXQmVpEtP/6go6f\n78YPmwmzYYP2g8WLdZuCQz1ZKDgQQgghhBCyefnoo+TJUJNSXq4j8dtvL/Lii/mv+6uvgiLL0087\nQ7Ow0HliHHKI229MehiHiHqL/O53GjIwf370NRcvVlFg3bqgZ8Kpp2bOaWH54ougEHD99cne6/r1\nzlukQwdNcBiVZNUmatxnHxUVTj5ZvS6i8ofYPBRt27qpJsNMny4BA98Yl++krEzzjgCaj2Jz8eij\nLkTlrrvc/oULg7NDjRyp4RsPP5y87touOBhRY5pkwRizDEDHjh07YtmyZdXdHEIIIYQQQkgtQAS4\n6CJg+nTgttuAww7T/fPmAUccAXTqBNx1F7DPPpW/1ptv6nLSSUDPnrmdO306sHixntuwYbJzfvwR\n+PZboEcPoKAgvszUqcDBBwPNm2eu
b80a4IkngL59M7d/wQJg8mRg4ULggAOAE090x1atAmbNAvr1\nA5o0SfY+8sEHHwBffw2ccALQoEH+6u3UqROKiooAoEhEOuWv5s0DBYeEUHAghBBCCCGEELI5qe2C\nQx61F0IIIYQQQgghhBCFggMhhBBCCCGEEELyDgUHQgghhBBCCCGE5B0KDoQQQgghhBBCCMk7FBwI\nIYQQQgghhBCSdyg4EEIIIYQQQgghJO9QcCCEEEIIIYQQQkjeoeBACCGEEEIIIYSQvEPBgRBCCCGE\nEEIIIXmHggMhhBBCCCGEEELyDgUHQgghhBBCCCGE5B0KDoQQQgghhBBCCMk7FBwIIYQQQgghhBCS\ndyg4EEIIIYQQQgghJO9QcCCEEEIIIYQQQkjeoeBACCGEEEIIIYSQvEPBgRBCCCGEEEIIIXmHggMh\nhBBCCCGEEELyDgUHQgghhBBCCCGE5B0KDoQQQgghhBBCCMk7FBwIIYQQQgghhBCSdyg4EEIIIYQQ\nQgghJO9QcCCEEEIIIYQQQkjeoeBACCGEEEIIIYSQvEPBgRBCCCGEEEIIIXmHggMhhBBCCCGEEELy\nDgUHQgghhBBCCCGE5B0KDoQQQgghhBBCCMk7FBwIIYQQQgghhBCSdyg4EEIIIYQQQgghJO9QcCCE\nEEIIIYQQQkjeoeBACCGEEEIIIYSQvEPBgRBCCCGEEEIIIXmHggMhhBBCCCGEEELyDgUHQgghhBBC\nCCGE5B0KDoQQQgghhBBCCMk7m01wMMZsb4y5wxizwBizzhizyhjzgTFmuDGmaR6vM8gY87wxZqkx\nZmPq9XljzOH5ugYhhBBCCCGEEEIy03BzXMQYcwSAxwC0ACCp3U0B9AKwF4AzjTGDReTLSlzDABgD\nYFhql73OtgCOBnC0MWaMiJxT0WsQQgghhBBCCCEkGVXu4WCM2QPA0wCaA/gZwLUA9gcwECoQCIBu\nACYZY5pV4lI3Q8UGATAbwEkA9k69zkntP9MYc2MF628AAGVlZZVoIiE1m++++w5/+ctf8N1331V3\nUwipMtjPSX2A/ZzUB9jPSX3Asz9rZTqEzdHoUVBvhlIAh4jIrSLyvoi8JSLnArgSgAGwE4DLK3IB\nY0y31LkCYCaAA0RkvIjMFpHxAA6EihAGwBXGmC4VuEwBQMGB1G2+++473HDDDfzjJnUa9nNSH2A/\nJ/UB9nNSH/Dsz4LqbEdFqVLBwRjTG2rsC4CHROSDiGJ3AlgAFQMuNsZU5IO8FC485CIRKfYPisgG\nABelNhsCuKQC1yCEEEIIIYQQQkhCqtrD4Whv/dGoAiIiAMalNlsCGFCB6xwJFTUWisjMmOu8D+Az\nqLBxVAWuQQghhBBCCCGEkIRUteBwQOp1PTSkIY6p3nqfXC5gjPkVNDFkuJ5M1+lojNkhl+sQQggh\nhBBCCCEkOVUtOPSAeh4sEpHyDOUWhs7JhV1i6sn3dQghhBBCCCGEEJKQKhMcjDGNAbRNbS7LVFZE\n1kC9IABguxwv1clbz3gdAEu99VyvQwghhBBCCCGEkIRUpYdDc299XYLyVnDYsgqvs95bz/U6hBBC\nCCGEEEIISUhVCg5NvPVNCcoXQxM6Nq3C6/izV+R6HUIIIYQQQgghhCSkKgWHjd56YYLyjaH5HjZU\n4XUae+u5XocQQgghhBBCCCEJaViFdf/srScJX2iWek0SflHR6zTz1nO9TmsAWLFiBdq3b5+1cEFB\nAQoKCnK8BCHVy6ZN6iR0+OGHo7AwiU5ISO2D/ZzUB9jPSX2A/ZzUZsrKylBWVpa13IoVK+xq6ypt\nUBVRZYKDiBQbY1YCaINgYsc0jDEtoWKAIJjYMQl+osiM10EwUWSu1zF2xfvSCamTsI+T+gD7OakP\nsJ+T+gD7OaknmOxFah5V6eEAAAsAHAigqzGmQYapMbuHzsmF+TH15Ps6xXBhHz8mKF8GINNUoIQQ\n
[... base64 PNG data for the training/validation loss plot elided ...]\n",
+ "text/plain": [
+  ""
+ ]
+ },
+ "metadata":
{
+  "image/png": {
+   "height": 356,
+   "width": 526
+  }
+ },
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+  "plt.plot(losses['train'], label='Training loss')\n",
+  "plt.plot(losses['validation'], label='Validation loss')\n",
+  "plt.legend()\n",
+  "plt.ylim(ymax=0.8)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+  "## Check out your predictions\n",
+  "\n",
+  "Here, use the test data to view how well your network is modeling the data. If something is completely wrong here, make sure each step in your network is implemented correctly."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 30,
+ "metadata": {
+  "collapsed": false
+ },
+ "outputs": [
+  {
+   "data": {
+    "image/png": "[... base64 PNG data for the prediction-check plot elided ...]
tJk5O9P6ms53fEgEDY0hLcqEO5IGcENYYaxD4dozFhgZQTDMNsI\nB8JMr8qIa45fU925d95dQGpsndlURmifKnxoCHepjJheBM7cDQC446E7vG0Im573jZath3IAwuk4\nlREAe4R9onZSw2JDuOkiCZcnOUwNYVuBsOlEmetAuMkh3IUyQgmEw9gjhzArIxhmKLAyghkLfHKD\nYZhthANhxuhaBYC0A2XENcdIILx73quhcl0OKsKT8oaws/WN0a0yIqwSo8vzy942hLtURkQ7rIxg\nuqfWmO2hIew0EDaEv7ZO3pkcwum0CoRd7Ls6HwA6hIYwKyMYZjBwiMaMBW4IMwyzjXAgzCCOJSDq\nU4O6aAg/8dgTqzv33AXCJnVAl15KnLkbmF7yqyEcZPlz35I0BRAdLv59Ob7sbUPY1lC5sTeEORD2\nl9q2F9hpjwLNBz+Hh+av90GXDWHTybt46oEyosurObwNhOlQOXvKCG4IM0y36O8/rkoADNMWPrnB\nMMw2woEwg8PY/I6XWmjM6gegp3ZPVXdGB+7CQmPIYFMZYfhEfM3HHDuETZ5oS4Gw1hD2IRDusiGc\nGhrCwWQ8DWE9AOZA2B9q2/DYG8Iw7KsNX9sEk94njjxQRhiu5uhCGbEX7VV3Og+Eu1FGcEOYYbqF\nlRHMWOCGMMMw2wgHwkxjE9jGUDn9QO/E5ER1ZzRz6BA2KDK6VEYAwJM+4s8Bd/l1Wy3wsKoQXppf\n8kIZsRh2R+jUITwZx1C5LGOHsM+YGsLsEN5w2YZ99dxxILxtDWEpi+2uB2UEN4QZxj7cqmTGAm/L\nDMNsIxwIM4hjcwpgLSykgfCUBMITdw3h3pURAHCt40BYDxlgUQtCG8KeKCNMrWhbz7GxITwSZcT+\nfj0Y5IawP/TlEJ5Oq9suA2FT+Gvjag4pzQ7hWehBILxFDuHFevFQOYYZJByiMWOBt2WGYbYRDoQZ\nzJOGhrAtZQQJIpVAOJq5CwudKCP+3kNlhKXnmDiEkyxBJqoVddsQ7kYZkcHUEB6HMsIU/nIg7A99\nOYT3iE3AlUM4y2A8kWWjIbx4e9P21YeBB4FwrSFsJyDVFRk+BMKLbZeus0WHsK6MKK9e4QN9hrED\nKyOYscDKCIZhthEOhJnGQDhu+Po66OHoyZ2T1Z0OlRFdOoQblRG7+941hO0pI9Qnci4vL2571RC2\npYxAfaXEZBwNYZMeggNhf+jLIUwDYd9OZNlwCOfrKmvLPyCBsIuTWWaHcNaJQ9irQFhRRkgrfnuA\nlREM0zXcqmTGAm/LDMNsIxwIM43KiC6Gyh2fHK/udNgQ7m2onBSIZHHQHR56FY4CNpURaoXwIL0M\nIcjvdkC+7enKCDsPJjMEwkE03kCYHcL+UGvHdjSAy4dA2NTyB+wEwk1XcsTi0mJ/5k1DeMTKCGMg\nDHsn71gZwTDdwiEaMxa4IcwwzDbCgTDTGAp2EQirygiHDmHDwWYXDuEAESKxk98ROQ6EjQ1hS8+x\n1hC+PK88wq7WeW440WHNIWxQRogoX9GhD5VjZYTf1LY9iw1husv3IRBuCm1tKCNM/uAFx/IzIN44\nhEc8VM7oEIa9k3e6MqIMhLMs90gzDNMOVkYwY0E/uWFp1jjDMIzXcCDMNDuELXyqW6WM8KkhLDtQ\nRgSIEKKYzhQeehesmILxddEdwgBwaX5p4Wr0anCgNYewQRlRNIQXw6oGCisj/MbUEB6rQ9jU8gcA\nCfsnKxWO5doIbxrCws5z7GMgXDWEu9H7NDWEAT7YZxgb6PsmV1eFMUxbuCHMMMw2woEw0+gKtjZU\nrrEh7NAh3NHkekANX0MRYVI2hMO5f8oIC55oY0M4rhrCrg4O4qS7hrAU9ZUKourJHbI2whQIHxzk\nfxj31JqTY3cIG65skEhbtzv1AWsKDgNho0N4C5URNk5WAs0NYf0+hmE2g5URzFjgbZlhmG2EA2H
G\nGJwBdho6esNLCYQnfikjbFyGDGyrMkKtEHqhjDCE3dYuNTcESWVDGBh2IEzbwNddZ/46446aP3fs\nDmHDiSwEaet257AawhJJ0t5voK/zXkSeZM8C4S6UEXpDmA/2GaY9rIxgxgI3hBlmHFy+DLznPXzF\nylHhQJjptCGsqwr0hrBPyghbDWFdGTEJyoawu0A4TiQQ1NfPmjLC0BAulRGudsamdUsshAx5g8+w\nnHAcgTBtCD/jGdVtDoT9IHPgEHaljGg6kWVjnZc7hB+rfn/P5Otc31fPG07crkMtEJ64D4TzwYEZ\nINTA26rTv8C3QPj8eeB3fxd44AG3j4Nh2sCtSmYs8LbMMOPgJS8BXvxi4LWvdf1IhgEHwkyjQzi1\n1R6lDuGp6hD2ShlhwUsJLFFGRO4cwo0tcFvPceRfQ9h0osNGQziOYQ6SyNeGPFiOBsJPf3p1mwNh\n92QZ6gFpDw5hHxvCbQ/Ulp6s9E0ZASDpIBD2RhlhCP1tOYR9Vkb8wA8A/+pfAd/yLW4fB8O0gRvC\nzFjgQJhhho+UwHvfm98u/2aWw4Ew0zg8zp4yonqHPTY5Vt3psCFsCga7UkZMFw3hGIdzN1Ns4tSc\nGNloCCcJ/GwIG7Zra4GwyT06sIbwfA689a3AJz+pfr0MfoMAeOpT619n3JG3KesDx8aqjGhuCGd2\nGsLkdfzkE0+u7nQZCCey1pYFgNjCk6y3or0JhA1XXGyDMuLDH1b/ZpghwiEaMxZYGcEww4de7civ\n4aPBgTCDuKElmlpWRgQI1UtUI3cO4b6UEaGIMA13FvfNYjfJSqMWxFrIoDaEL80vOW8Izw3rZuMy\n5KYAQ5CgZQiB8L/7d8C3fRtwyy3A/n719bIhfOYM8MQn1r/OuMPovd1Gh7CFEFx/HZ89cba602kg\nbF6xpit51mFVQ9iZIsMYCI9fGVE+NtePg2HawIEwMxZ4W2aY4UNfx+wQPhocCDNLHMJ2D0ADROoQ\nG5cOYcPBpuxIGbFoCAM4mLuRcTYFvzY80UaH8HzkDWGDMkIGw2oIf/CD+d+PPgrcdlv19TL4vfrq\nPBQu4Yawe4wBaZBsn0PYQgiuh6NPPEbOfuxcAODmddwU/I5VGdE03G8blBHleyMfsDBDRn8d8fbM\nDBVuCDPM8KHvQfwaPhocCDONl6KmFhqzeiAcBRECUWx23jmE7Ssj9IbwQewmWWl8jm2F/rpDOHbv\nEE4MzffaQK4NaBoqN7RA+PLl6vYf/VH+dxwDF/IsDGfO5KFwCQfC7jG2KcV4HcL5+vYzVE7RGRUq\nCSeBcGxeMRvKCH378SEQbnyOgwSybs7YbPkFvjaEpVRPxjDMkOBWJTMWeFtmmOHDDeH14UCYUYIz\nIatNwsbAMdrwChBBCFEdhLp0CBuOvrpTRkwX980SzxrC1pQR9Yaw+0C4G2VEo0OYfG0IQ+UuXapu\nv+Md+XqdP199jRvC/mF0CI9dGdGRQ1gPR5VAuNifufgg2RT8mk5wrYuPDeEmZYQtN/YQGsKA+8fC\nMJvCIRozFrghzDDDhz9brQ8HwowSnAWYLG7bHioXitwhsDgInbhzCJvawNJSIKw3hHdIQ/gwceQQ\nbmwI21JG1BvCpTIiy9y0n0zrbG+oXP780hMoQ24I7+8Df/3Xqif4zBk1EGaHsHuMl9d3NFTuGMlH\nvWsIW1NGNDSEQ3cNYZcOYa8CYUsnOnweKsctFmYM6K8j168rhtkUPrnBMMOHP1utDwfCjBKchTQQ\ntqyMKAPhhUd4S5QROxEdKudIGZGZ94j2JtdrDWGijADc7JCNgwMtPMfUIRzKqkaZDSwQpg1hINdG\n0BYwN4T9wxiQWmwI++YQbm4I23cI70a7EBD5P4r9mVcO4ZE2hJscwrbc2ENQRui3GWZIcIjGjAVu\nCDPM8OGG8PoMNhAWQmRH/PPOIyzrpUKItwkhzgkhZsXfbxN
CvGSNxxMKIV4lhHiPEOIRIcQVIcRd\nQog3CSGe3W5tu4U2hENhtyFMw9EA+ZGYF8qIDofKxUkGBPmyokANhA89U0Z05RC+NL+0aAgD/lx6\nbaMhnG/TRSCMCSZB/pqRotqYhxgI//Efqy3gq6/OW6I7xebLgbB7jAFpRw7hyQQIik8IY2wIJwkU\nh/AkmGASFu9/Dh3CzcoI+w1hZcirbw1hVkYwzCDgQJgZC7wtM8zw4ZPt6zPYQLhAHvGPEZHzmwD+\nFMDLAVwHYFL8/XIA7xBCvHnVgxBCXA3gbwD8KoBbAFwNYAfA0wB8H4DbhRDfs9kqdg9tHoWofLfW\ndAJlIKwrIyJ3yghTQ9heIEwD9gi7ExIIp74Fwh0pI+ZqQ9hJ0GBYZynsDpULUDmiM1GlZkNwCFNl\nBADcey/wrndV/z5zBhCiaglzIOweY5uyI4dwGALT4u3AP4ew/YZwFESLkzsuG8LJNiojTM/xsOon\nWQAAIABJREFUFigj6O/ngxZmqLAyghkLHAgzzPDhQHh9otXf4j2/hjyIbeLykvv+PYDvRh4a3wHg\nPwD4FICbAPwYgC8G8K+FEI9KKX/StAAhRADg7QC+pFjOWwH8JoDPAXghgJ8EcC2ANwkhPiOl/POj\nr1o/0EZSJCaLCN3GAC6jMmJClBGxBMrLdHtEGtQBtpQR80QNGXapMsJVQ7ihCdz09XVoUkacctwQ\nNq2bNYdw0SwMMMEkTIAYyAbUEE5TYDarf/0tb6luX311/veZM8CDD7JD2Afy/Wm9IdxlIDybedgQ\ntjZUrnqhRkF+cudyfNmtQ7jp5J01ZUS1UvTqFe8awh0oI7ghzDD24RCNGQusjGCY4UNfx/waPhpj\nCIQfkVJ+bN0fEkI8HcBrkcef7wfwYillmdbdLoT4EwDvRh70/qgQ4rellJ8yLOq7kLeCJYBfkVL+\nG3LfB4QQtwG4HcBJAG8UQtwsbU0vs0RNGVEEwpmF9qju0wVIKynIMI8TgHiL+6JTZUSqBsJ7pCE8\n960hbCv0n/rXEDYFKNYcwmSbNjWEfQ+EaTv4Wc8C7r47D/3On6++TgNhADg4yP9QtyzTL8bwrCOH\ncBhWuhDvHMLWhspV/5eTkCgjihNcTlQ3DU1ga753ss7TcIpABMhk5p9DuANlhG8NYW6xMGNA33Z5\nW2aGCp/cYJjhw5+t1mfoyog2/DCqQPw1JAwGAEgpDwC8pvhnBOCHGpbz2uLv88hbxQpFiPxzyGuw\nXwDgW9o9bPtQbUAUdDFULn9HrQXCAObSUFPsAVMwaCunny8LhDNXDWHzHtGeMqLeEHbtEDY1hE3N\n8HXJA+HKIVwGwimGEwhTf/DNNwO/93vAiRPq95RBMB0sRwNjpn/6dAgHgXtlRHND2FIgHNYbwgA8\ndQjbHyoXBRGioNhRe9cQthMI+zpUTkpWRjDjgJURzFjQ3wMtHCJ5j5S5Lu4DH3D9SBjGDnz11fps\ncyD8MuRd2I9LKd9v+gYp5d8C+EfkYe436/cXLeObi+X8Vykb0823kNveBcK0PToJ7A6VowegUamM\nIINsYnnQ+ndsgrEhbGqibUBNGTH1oCHc8Fxae459dAiblBEWAmFlUKKovKMZhqOMoA3h48eBV7wC\nuPNO4IUvzL929dXAM59Z3S5hbYRb2CFcYGuoHG0IBxPiEHYXCDfqfTpoCPsTCJueYzvKCF+HynGI\nxowFblUyY2Ebt+V3vhP46q8GvvRLgX/8R9ePhmHaww3h9dnKQFgI8TTkg+OAXAuxjPL+64UQN2r3\nvcjwfTWklA8D+ATyYPmWNR5qL9ADUNoQziw0ZhVlRGBoCGceNYQtKSPmido625tUg/pcNYSbgt+u\nGsKX5pcQTap5jqMbKlcERpEwN4R9HypHG8JlM/imm4D3vhf4q78CPvQh4Nix/OtPelL1vQ891N9j\nZOr06RB+PPsMHvnalwJ
f92/H2xA2OIQBOB0q16iMsHyCFvApEN4+ZQRfZs+MhW0M0Zhxso0O4dtv\nr27feae7x8EwtuCG8PqMIRB+hRDi74UQl4UQF4QQnxBCvEUI8VVLfubZ5PbHVyyf3n+zheU8RQjh\nlYWzsSFsfahcfiRGA+HYJ2WEpUCY6hkmwQR7pCEcZ26SlcaGsK3nOFKD7lSmCKJqXV0c7JrWzZ5D\nuFBGiMo7mg6oIWwKhIG8Qfc1XwNcf331tbNnq9sPPtj9Y2Oa6dohTJfzs/d+PS5fdxvwFf8RByfW\n1vRboTksbD9UbqlD2KEyoqkJnCR2T9ACngfCI1dGcCDMjAVuuzNjQNf4ANuxLdP3fX4fYsYA3Y6l\n3A71S1vGEAjfDOBZAHYBHAdwE4DvBPBOIcTbhBBXGX7mBnL7MyuWf47cfoqF5Qjt55yTkFfKJKza\nrDYawnGSASJvipocwjHcKCOMLlmRWtlpKEPlwgjHlEDYt4awLWWEIeieVl4CX4IVew7h+lC5VA7H\nIawrI5Zx3XXV7Qce6ObxMEeja4fwYv8nUpybVSFwvOOmGt65MqLJIeywIdyojBhpQ7jPoXI+KSO4\nVcmMBd6WmTFg2m63YVvmNiUzNviE+/pEq7/FWy4D+CMA70Tevr0E4BoALwbwKgBXA3g5gLcLIb5W\nSqUueJLcJl25xt9Too1dsrYcp9BQcKo4hC0EwmmyOO1QHnhSh3CCGaQEhGj9q9bC2BQtWmdBy9Mk\nukP42E4VCCdyrEPl6uslJ5cB5BPJfGkI2wiED+cZEOT/b7kyIj/hkSJBrhMX3gfCTQ1hE9wQ9gej\nMqKLhvAN/0v5epYGSFM1TOuD7pURTQ7h/LWcJD2/MWHJ1RxjHipnDP3tOISXNYRdHiTwAQszFjgQ\nZsaA6f1vG7ZlbggzY4Pfk9ZnyIHw9VLKC4av/5UQ4pcA3Abgi5EHxK8G8Mvke3bJ7VXX8NOkS1c9\nLJYjpWyzHKfQA9BJRB3CFtqUhkCYNoQRzpCmeXOnT8wN4TwQpsPQNkFRRugNYUeBcJMawpoywtQQ\nnrhtCJsCFBuDAw9jNVCZ0BMIYQykU+8DYW4IDxPj5fVdOISf+cfqHUGM+RzY6/mdq/uhcmpDeKGM\nAIAwRhxP6z/YMU0O4T4awlmWt8TbnhRdh66VEdwQZphu2cbL7JnxYQpDt2Fb5gFczNjgE+7rM9hA\nuCEMLu97VAjxbcibwxGA10ANhKm4dtUR3w65rfsNFssRQkxXhMLLlnNk5vM57rjjjpXfd/bsWZyl\n1b4lKA3h0O5QuThNgbJ0ZVBGIJohjj0JhC0dgFJlxCSIsDdx3xCmz7GAgETearUR+pscwgAgoyp1\ndLEzNgYoRejRJvA4VIYGTjANSYswnAPpdJBD5ZrghrA/GAPSLhrCeiAczp0Ews0N4W4cwlOiTELg\nJhBu0vjEFp5kus6BCBCIQAmEgfzk3c5OwwI6gIfKmf/NMEOBt2VmDGxrQ5gDYWZs5K/lB4s/wB13\nACdPmr937vsBe08MNhBehZTyHiHEXwL4BgBfIIR4spSyFCFeJN+6St9A+3O6FkJfzuc2XM6RefTR\nR/H85z9/5ffdeuuteN3rXnekZSakSbkT2R0qF1N9QlgoIyYkVZgcII77DxqWKSPaEmfqOu9EJBCG\n+0B4J9rBLJkVX7eljKjvULPIcUPYtP0WIUOrQHhOnl8RYUoTBofu0XVYpyG8uws84QnA+fPcEHaN\n0bdq2yF89SeAa7QZqUUg3DeNDWFbygjNIUyHqiKcI45XvDg6oFEZYW1fna9zGQS7DoQbHcI9KCM4\nEGaY9nDbnRkD29oQZmUEMzby7fjNAH4aAPBVX+XwwQyE0QbCBR9DHggDwPUAykCYDoBbNeCNDpI7\np92nL2dZIFwuR2L1ALpGrrnmGtx2220rv++o7WCg44ZwpoZngLkh3DcShnWzFAinmjJiJ
/SgIUzC\n0Ukwwawot9tTRtTXK4uq8x5OHMLGhnAenrXRgsxitSE8oQlDcQm674HwOg1hIG8Jnz+fN4RdOL+Z\nnF4cws/4k/odYewkEG5sCFtTRqgOYaUhHMbeDMNc9vV1oOHrskC4T7p8jgH1gD4I/A2EtyF4YMYJ\nKyOYMcANYQ6EmXGQb8ffD+BlAIA//3PgiU80f+9LXvISPProo309NG8ZeyAsG77+MXL7WSuWQe//\nhxXL+fARlnNOSrmxMmI6neJ5z3vepj9uJCGh4HRiNxBOiD4hCvIjMT0Q7jtokBKNrTMbb4ZUGTHV\nGsKp6D8QllJVQ9DQw5oywtQQDv1tCLeBKiPyy8zJbnSEDWEg9wh/7GPAbAbs7wOnT3f32JhmenEI\n67oIAAjnOHRwLqvzhvAyh3DgKBDuuiHsZSDcvTIiDPMTWb4GwnwgzgwVPrnBjAFuCG/H+jLjJ38t\nny3+AM95DnD99ebvnU77V8P5SI+jQ5zwbHJ7cbGzlPIe8u8Xr1jGVxZ/3y+lvE+7773kduNyhBDX\nAngG8oD6fSt+X+9k5EBzN7IbFs5pIByaGsIHvR+AGlt2gLWGcKIpI2gAm66cYWgf/ZJcNRC2EfpL\nIKqvV+o6EG5oCLd9jue1oXLqICrA/0B4k4ZwCWsj3NG1Q/ggvQR83nvrdwTjawivdAiHc3+ubID9\noXLeB8JBYuX/v1xGOaeAA2GGsQsHwswY4IYwvw8x44A/X63PaANhIcTTAHwt8hD2bimlPg7pjwAI\nAM8SQrygYRlfhrzZKwG8Xb9fSvlJ5K1hAeAVQohd/XsKXklu//d11qMP6AHoru2GcKaGZwCwFxFh\nsANlRJKgoXVmPxCeasoIFw1hPVRRAmoLoX+SmZ/ANHA7VM54QsNCyDBPq/WdGkIkAIMaKnfUhnAJ\nD5Zzh/FklsWG8EzuA4Fhv++dQ7j9vjpJsMIh7KYh3LRPbgqK11q2h4Fws0PYfkOY/k3vcwGHaMxY\nYGUEMwa4IczBGTMO+PPV+gwyEBZCfJMQIlxy/7UA3gqgTGp+2fBtvwig3GR+SQ9zi3+/sfhnAuAN\nDb/uF4q/zwB4veGx3ATgx4t/3gUfA2FFGVFd/m4cvLYmNBydBGaHcN9BQ9fOQmWdI00Z4WConB6A\nK4GwhZChaVBe4rwhbPJEW1BGaA1hUyDse0OYKiO4ITwcjG3KwN5Qubjh5I6rQLhxX21NGbHMITz3\nzCHcsTJCSEBkHp2gtfR+XDzFvgfCfCDODBU++GbGwLYGwtwQZsYGf75an6E6hH8ZQCSEeCuAvwFw\nL4ADAE8E8NUAvq+4LQH8NYBf1RcgpfykEOIXkIe1XwrgfUKInwfwKQA3Afi/AHxxsYzXSyk/1fBY\nfgfAdwO4BcAPCiHOAvgNAOcBvBDATwK4CkAK4DVSWqjdWkZtCFcHxDYeakzeTc3KiP4bwrXWmRSL\ng+GuG8JZ4FdD2EYLPJVVUhSKcHGCIRFuG8LGpp0NZUQSL041TaOJ2ioc6VA5bgj7gbExa7EhTAPh\nabCDeVbsr8LYiUO4MSy0NlRuSUPYkUOY7rciESGR+c6z84Yw8vviuF+f2jJlhM2GsO/KiG0IHphx\nwtsyMwa2VRnBDWFmbPB70voMNRCWyE3Rryn+mO6XAP4bgO+VUjYd1v0EgGuQB7pfBOC/GJbxm1LK\nn2p8IFJmQoiXA/hT5MHytxZ/6HIOAfwfUsq/WL1q/ZOSUHAadqeMmBSB8N6EKCMm/TuE9YA0QIQM\nsbVAOJXNDeHMlTKChCp0cJKVFjhpCJ/ZO4NHr+TTOhNRpY6ug5UFFgYHzhPaKhx+Q/jYsdXfzw1h\nPzBeXm/RIUz1L3vhcRIIO1RG9NUQDifqUDlHDWEa/E6C6WIwq42hcjR89T8Q3i5lBB+IM0NFfx3x\ntswMEW4I82uXGQf8+Wp9BqmMAPCdAG4F8GcA/hHAZ
wHEyFu5HwbwZgBfIaX8DinlhaaFyJzvBfCN\nyJ3C9yMPb+8v/v1SKeX3r3owUsrPAvgKAD+AvJH8GPLG8qcA/DqA50kp/9Nmq9o99AB0ajksTI+g\njHB6iaoMIGRxhCgyO0NsaEM4UhvC0llDuGmonI1AuEqKzuydWdyOHTeEzQ5hCw1h6hCO9MvMh9UQ\nPnYMCI7wLsANYT/o2iGckHOnuxE5U+ByqFxHDWE9EK7rX9w3hCfk8dgeKle2oeuBcOtfs/FjUuhI\nGRGR1XV5sM8hGjMWuI3FjAHTe5+F87Deww1hZmxwILw+g2wISyn/Gnnwamt5twG4reUyMuRB9Jut\nPKgeocHvxHJDOE7JAXdxROaTQ1jIEKI8L2LrAFQPhElDWIaHkBIQov3vOfLj6VgZkcgq5H7C3hMW\nt2M4dgg3DJVr+xzTbXpqaBUC/g+VKxvCRxkoB3BD2BcaHcKWtjelIUwDYe8awvaHyk0CTf8Szt2o\nbpSG8MT49Y2XnWKhyWhuCLf+NWvR7PRnZQTDDAHelpkxwA3h7VhfZvzon2N5u17NUBvCjEXopaj0\nANT6UDmjQ9iRMqJonQUIIVA1hG0rI6ahPrn+sPf1jWM0DpWz0gI/QkPYRSBsbAjbUEaQhvAkGqYy\nomwIH8UfDAB7e8Dp0/ltbgi7o2uHMN1f74VqIOyVQ7gDZUStIezAISylfoKWDAC14Xs/kkO49a9Z\nC1ZGmP/NMENBfx3xwTczRLbVIczKCGZs8Oer9eFAmFGCM9p2tDFUjoaj5VC5vYg4hF0NlSsbwqAN\n4W4cwkIIBFnREo76D4Q7HyqnOYRL5nL8yoidgQ6VW7chDFQt4QceyIMrpn+MjVmbDmGijDg2IRtH\n6EYZoa9vWF7UZE0ZoQ2VU9r+/QfCeuDfTUN4IIGwZWWE7w1hPmBhhgo3hJkxsK0NYVZGMGOD35PW\nhwNhRnUWKg1hC8qITG3LAp4oI4QhELYUrCSGdRZZEcI6uPRab9nZdgjThvATditlxKF03BA2tZ8t\nhAwxHSoXDq8hnKbAlSv57aM2hIHKI3xwAFxoNLMzXWL0rVptCDcFwm6UEfq+KxTFa83COutB5CTU\nfeD9D5WLYzSevLMyVC6VQOhXINzoELasjPC9IcwHLMxQ4YYwMwa4IcyBMDMO+IT7+nAgzCihID0A\nlVaGytH2sUdD5YhDOIDdoXIZ1MuQASCURUPYgTJCbwjb1oKkomoIH58cX2xDh/KS+hh6xniJddBe\nGRGT0GxnUg+RAL8D4TIMBjZrCAPsEXaFUaFgYZsuSWlD2EOHcFQGwrYawqHWENba/nHcbxt+2ck7\noxN9TWio7Esg3LUyQh8q52sgzAcszFDhkxvMGOCGML8PMeOA35PWhwNhRtEGKMoICw1hRZ9QKiMm\nRBkx8cEhbFcZkch6IBxI18oI2lq2q4zISEN4Gk5xvGgWum4Iyz6GykX1y8wBv4fKXa6elo0awgB7\nhF3ReUO4SRkRxF44hENRhJe2hsrRhnBgPrnT5wfJut6HKCOsBML19yYvAuGOPNEAD5VjmK7hbZkZ\nA9saCHNDmBkbfMJ9fTgQZpSWKD04tD1UbhpVB6CLVq5XDmFLB6CGQDiE44Yw9VIqob+FkCGokqKd\naAfHp0UgnDl2CHekjKCX1e9EaogUTv1vCF+qitvcEB4YfTqEy9cxAMcN4XznEQp6NUeKtgaFoziE\ngX73Xfm+ulox28oI03tTJDwIhFkZwQcszGDRX0e8LTNDZFuVEXS9t2F9mfHDn6/WhwNhplkZYXmo\n3IQciU1EoY1w5hDO1y0QIQJRKSNsvBlmpoNuVA1hJ+tLW2cBaQhbCISbGsKz1F1DOMtgbp1ZuLye\nnuTY1ZQRwcT/oXLcEB4uxvDMYkOYKiOOT6gyws1QObrvUvbV1obKqQ5hRRnhQP8Sx9CUEUTvY2Oo\nnOm9yVuHsEVlx
Ol7cN8Lvx2/+L9+kQNhhrEMN4SZMcANYX4fYsYBvyetDwfCjBII0wNi68qIqDrw\nrALh8SkjfG8I70Q7i9s2Qv+MNoTDqiE8yy4DkNVj6BE9BF9gY6gcaQhPJ6p3NJxwQ5jpjjRFtw5h\n4j8/NnHvEKb7rlCECIS9qzmSBDWHsKKMCPo/uaPvt3Yiu3ofk87IdSDc2BC2qYz4sjfg8ev+G37k\nz38E++mD6n2O4AMWZizwtsyMAW4IcyDMjAM+4b4+HAgzjQ7hzNSwXBNFGRFWB57ToPAIux4qRwNh\nS40k01C5qiE8x3ze45QiLB8qh6D9pdeZUBvCJ6Z57TSVqbMha8bgDLDUEK5WptYQLpQR7BBmusCo\njOioIXxsqgbCLhzCdH1DESK0eDUHbaYGCBCIQFNG9L/vqg2ViywPlUO1MuW6+hEId+N7z7JiKODx\nRwAAEhIXskcW9zsPhM/eAbz8u4Cn/RUfsDCDRX8dbUOIxowPbghzcMaMA/1z7Da8jtsSrf4WZuxQ\nbYDthrDSPqaBsENlBA0LAwQQorhDZFbeDGkgvDjoFlUr98rhHMCO/mOdoV+Sq7TgijAp2PDUkJQA\nQs0hTIdRTS8BBzv+hAxWHMLNQ+XCASgjaEN4nUD4mmuq2489Zu/xMEfHeHm9RYcwDYRP7KhD5Vw3\nhG0rI5IEwDRf37AIRdWhcv2/luMYjSfvrDiEfR0qN+1GGbH4eeKKTnClfr8DkgTA1/4o8PnvBG58\nN5LkHncPhmE2REpuCDPjYFsbwhwIM2ODG8Lrww3hLUdKdbCY9YFjhktUAWAaVoGwy7AwECGCHpQR\nEQmAL/dctdPbsmog3G6d0xSLJl25bGUY1TSvo/a9M87X2RCgWGid0dBsGuoO4fEqI2h4TJfB9Ife\nIAXQWUP4pDdD5YqGcEAbwnYdwmXwql494UgZ0bCvtuF7X+0QjkeljFj8PHmPmvsUCJ8sLrU4+cBW\nBA/M+DCdp+JtmRki29oQZmUEMzb4JOX6cCC85eTDt8xTzW07hOmB506pjJgcuFEokNaZsOilBMzK\niAkZ5HbQc7JSGyqneDLbrXMeCGsOYdoQnuSBsGtv8gIbyghtm6b/nyLyvyG8qTIiCKoA+eJFu4+J\nORrmhnCGOLGzD6VKAVUZETtRRugO4cW+2lZ7tGgBh8LUEJ5Xj6EnanqfkDaEeajcuiyeO+KKjn0K\nhMv3qCDhA3FmkJheQ3zwzQyRbQ2EuSHMjA1uCK8PB8JbzjK/bKeBcNkQFhKHSb9HoEpDGCEC2PNS\nAkAm1EFFADBRlBE+NYRtBMJaQ3jiviHcrTJC9XDS18yYG8JAFSBzIOwGo0MYQJq231cDaiC8O5lW\njdwRNoRpM7UMXlWHsGcN4ZZD5aSEErz6Egg3n7yzcDWHqSEsPQqEy+cjyJBYeg0zTJ+YPtvxwTcz\nRLZVGUHXexvWlxk/HAivDwfCW45+MGZbGUHbsmEQLm7vRruL21fms9a/Zx0Uh7Ayud5SIGxsCJNA\neO4gEG5qCNtQRkSaQ3jqviHcPFQuQ9KyTUlPckyCidYQHu9QOQA4eTL/m5URbmi6vD6x0B4FgIwE\nwtMoqq5scBQI6w1h+0Pl8vWNljSEfXEIt1VG6E3ccp29CIS7VkYQh3DsVSBcPYA5H7EwA2RbW5XM\n+NjWbZkbwszYYGXE+nAgvOUsbQibHKxrsrIhDGCW9BsI03UOaSBsQScA+BcI6wfcSguu5TqbGsLH\nJuRS88lB9Rh6RN+uBcTidpy0266pZzUK1KFywQCUEW0awmUgfPFi0ThkeqXpRIcNnQCg7q93InKy\nw9FQOXoyaxJGVSBsSxmhN4Q9cwgrgXDLhnAtEPapIdyjMuLQq0C4Wu95z1dKMYwNWBnBjAVuCHMg\nzIwDbgivDwfCW44+fEsJCy03hOmB5160t7h9kBy0/j3r0Di53lJDWIr6Ok9JIDy
LfVJG2BgqpzqE\nafu7vM91QzhEtc5tm1j0svqJNlSuDMezzDxsxQfaNITL708Sv1vQY6VJGRFbOmqhuptJQHQo4dwL\nh/DiKhNbyogiKFycuFOUEfkG3ud6L/O9Z9JyQ9iTQLjZIbxFyggAccaBMDM8trVVyYwP07bs6+d4\nW0ipvl45OGPGADeE14cD4S1nWSMJImvdAqQHsfTAk4aGs7jfhnAcSyDI3+Vzh3D3yohpWAXCBy4C\n4UZlhH2HsBIIR/lz67ohHIlqnduGZ6n2/JqUEYC/LWEbDWGAPcIuaArPbDWEs6aTHaH7hnAYqFdz\ndNEQVk/u5P8Xsx7fnuIYjQqnsQbCnfrey9Ulyoh55lEgTJ7rOOUjcWZ4cCDMjIVtbAhzk5IZI7xd\nrw8HwlvOsqnmVjyNTQ3hCQmE056VEWR4SxiQyfUWDkCzDICpIUwDYRfKiKaGcMtgxeQQ3iHrWgbC\nbhrC5HlWGsLtnuRMNrQoAXWavaeBsA2HMMCBsAuaBnDZcggr7fdggmnkj0M4UobKSSRpu7OV+Xuf\n1hAO6g3hgx4vYFnaEG455NXvQNisjGj7Id7UED70JBDWn4956ukbBsMsgZURzFjYxpMbHJwxY4S3\n6/XhQHjLyQ+4SXAmQkAWvlULLaxGZcSkUkYcpv0qI2ggqDqEbQ0qqq/zbuhYGdHlULkjNIRdT66n\nDeGkw4awcmmyp0qFNg1hGiDzYLn+SdLq6gZKdw3hShnhLBAmDWE6mJSe2NsEuq+emhrCQf8N4WUn\n7+w0hNXnF3AfCPejjCAOYU8CYT38T7ghzAyQbQzRmHHCDeHxry+zHeivZd6uV8OB8JajH4CGQQgh\n7Tl1aSC8aHdBbQgfZv02hPVAmD6uJGnXOmsKhGlD+DDpN1lZ7hC2oYxY4hB22RDuSBnReFk9ADGA\nhrAeCGcyw6+9/9fw6v/xajx48cGlP8sNYbc0ncyw1RCWmkOYDpVz4RCm+656IGwhIF3qEM7vc9kQ\n3oahco2BcEfKCL8CYXYIM8OGA2FmLGzjtqy/33OTkhkD3BBen2j1tzBjRj8AzduyFhUKDQ3hY1MS\nCCfuAuFAEIfw4r7NXxb6QV65zjtRFRrOEgfKCPKYlMuiLSsj9IZwMJ0hQ7+DmYDlDeG2Q+X0bZqG\nSDLw3yFcKiN2doBZdgn/8q3/Em//+NsB5M/fG176hsaf5UDYLUlm3natNYRJIKxs2z40hOnVHGgf\ngseJXCzb2BB2pYxoOHknWw55PWog3PcHZ/2EZfVYLHmiAeXKjZlPgTBZ76bXNsP4jOk1xAffzBDZ\nxkCYgzNmjHDzfX04EN5y9IOxUIQQCCABKw1h2RAIH9+plBHznhvCMW0IByEySUKGlpchNwfCRBnR\ncyDcpzJiJ1IbwpNjMxxC9db2gb7Ok8COMiLLoF52HagNYRn6HwiXDeG9a+/HV/zWS/GRRz6yuI/e\nNsHKCLfEDf7rtKVOoISe7NCHyh3OJQBh5fccFb0hHEl7DWEawEWhwSHsQBlR22/RoXItA+Gm9ybX\nDeFmh3A3yohZ6lEgTBvC7BBmBogpQMoyQEpA9Pt2wTCtoO9902mufRt7kMQNYWaM8ImO9WFlxJZT\ncwjbVkYYBqwBakN4Lvt1CCtD5XRlhEUvJUAcwpMqEJ6nHg2Vs6yM0BvC0W6eply5ov8YR0S1AAAg\nAElEQVRktyxvCG++wnEMJVzQlREyOFS/10PKcD655XW1APi+/fuW/iw3hN3S1IpNpZ1PO1mTMkJI\nzOP+j4yoMzkUqjIizVruq7P6ftp1Q3iZ3oeVEeuzOAigQ+U8CYTzhnr1nLIyghkiTQfaLXfPDNM7\ndFveKQ7Zxh4Ic3DGjBFuCK8PB8JbjkkZIcrNwsKUbzoIR2kI00DYcUM4CGhDuJsW1i5pCB/2HAgv\nbQi3fI5rDWHNIVwGwu4bwtX/f5vnOI5Re34
DESyeZxoI+z5UTl51bvG1Jx1/EgDg3P65pcETN4Td\n0nRZedv9VokSCIcTpTE7i/vfoGnoa9shHJP/y3I9XTuEl+2ru1JGKOvsOBBWw2mLyghyVccBCYRd\nHvzqbX9WRjBDpOk1ygfgzNCg733bEghzQ5gZI3yiY304EN5yakPlhN2GsGxoCB/fqULDWLocKheo\nXsqOlBF7EzpUzkEg3NgQtqCMIA7hKIiUQDjccdkQrp5LGmy1D4TVFiWQB+EAkAm/G8JZRp6L3f3F\n17/kui8BkLfUlg2W44awW5oGItpSRjQOlQMwd3BJO3UjhyJERAPhlg7hJFN9yeXvqH5hHoD3qYzQ\n34+VoXLopyHct++dhuCTYFI9BxaUEVVD2D9lhP5a5oYwM0SaDrTHHqQx44MbwuNfX2Y74EB4fTgQ\n3nL0hnAYhFVDuOUlm1I2B8IndiuHcIJ+lRF6Q5iGAHHSgzIiczxUjjbCrCgj8uAklDsQQqiB8DRP\nU2azfi8h1ENw6hCep5u/M8xmWOqIzjxXRhwc5K9LAJDTxwEAJ6cn8fmnP3/xPaU24s6H7sRv3P4b\nuDSvqsAcCLulcahcF4FwOFH2FfO0/4YwXd98X21vqBxddrmeQogqBA/8aghDpIvX7ubLXh0I97m+\ngPr+FAYhgvL92IIyIn8fyJT/04PEk0BYO0KxpX1hmD7hQJgZC6aGMDBu/Qk3hJkxwic61oeHym05\niwOmgnKoHIDW7dF8AJd6QF+yN3HXEE6ydDEbKQxChLAYMjQ0hI9N3TmEVykj2jzHSYKFQziQ+XJp\nICym1XN75YqqHOgS/USHraFyFy6g5hAGaEPY76FyVPOQFoHw6d3TuPH0jYuv3/v4vXjuk56Lr/md\nr8H52Xnc9bm78PNf+/MAWBnhmrTRIdyBMkJrCLsYepVoDWGbyohU1hvCQL7e83TuhUPYdPIu2vBT\nm6+BsK6MSESKGLCnjAjU7dabQFg7MZlwQ5gZIKyMYMYCDZF2q8MYpCkQjLQ+x01KZozwdr0+I93F\nMUdFPwDNHcJ2lBFNB6CAGhq6dAhHQdipMqJsH1NlRNxzQ7hzZUTZEEa+jvS5DSZqINwXyxrCbUL/\nCxewtCGceq6MoC7nJMyVEad3T+PGU1UgfN/j9+HOh+7E+dl5AMDfPfB3i/u4IeyWxoZwyxNZJfoV\nHXRfcZi4cAirV3NEob2hcglpZFI1w2KdixM/fSojljeEMwu+97ruxrdAOBTR4rFYUUaE6nbrTyDM\nDmFm+HBDmBkLJmUEMO5tmYMzZozox99jfg3bghvCW45+AKooI1o2dJYFwntRpYzouyEcpylQHP+H\nQYhQ0taZPWWEkCGEyKvIe7QhnPUbrOhtWf0yZFsO4RD1hjAcBcL1dbYzVC4PhJsdwimqQLhvF+dR\nWLR6wznSIE9+Tu2ewlNPP3XxPfft34dTj5xa/Pv+C/cvbtNAmBvC/dO07VpTRgTNQ+VkMEeaAiST\n7ZyaQ1iExvs2weQQBkgr15UyounkXUfvx94FwuVjtKWMCAfSEJYenkFkmBU0BUgcLDFDg4ZIekN4\nrJiCMymB4tCVYQYJn+hYHw6EtxyzMsJOQ7hJnwCooWEs3TmEoyBEkHWjjAjIy+vYDrn0WnrUELYR\nMhQNrMjQEEZUBcK0ndo1tXW2pIzY34dZGWFoCPsYCC+eg51qoJyujLhv/z6lNf+ZC5+BlBJCCEUZ\nwQ3h/mkKfrMehsohiDGfA3t7hh/sCLq+YaAGwm331dTZStUMVUPYkTJCUd1QZUQ3V+x4EQgX++pQ\nEKe/LWWE1hCeJbP8M48MnB7o6+9DKQfCzABhZQQzFrghnNNGTcUwPsAO4fVhZcSWo081z5URdobK\n6aFcUyCcoN+GMG2WRUGIkMihkpZD5WggLCRpRJOGcOJCGbHkMuT2gXC+PmUgXIajQHUf4LohTJUR\nm58qbFR
GKA3hfPKTj4Hw4jnYfXzxtVM7p3DNsWsWrf37Hr8PH3nkI4v7D5KDhT6CA2G3NG27tgLh\nRftdCoRBqO4rwjnmPVsj9IYw3Ve3bQgXploAQCRUh3D+C/tXRujvx9av5vAsEJZSnTWQKyPKQNiS\nMiIwBK1RvpJOG8Laa5mVEcwQYWUEMxaahsqNeVs2qe24TckMHW4Irw8HwluOHhYGIkBgaajcUR3C\niXCgjCgfUxgiooOKWnop6TrThvBx8umi74aw3tRWW2f2GsKhyMOLQASLICMLXTqEq+eSNgB1d+M6\nUGWEQLBo0pYhuIRc/F/3GSQdlUVDmATCp3dPQwiBzzv1eQDyoXIffeSjys+V2ojJpPqgzMqI/mlq\nxdpzCBfbdpa/XpR9RRj3fpKj1hAmvoqkZQieZgNoCNOhckHa3iF8hEC49/00oATCi8fTkTICADBx\nHwjXGsLghjAzPDgQZsZCuS1HkarGGvO23NQQZpghw4Hw+nAgvOXQFq9AACFEL8qIvUl13XEWHKBl\nDrsWSiDc4VA5VRlBGsKOlRFKANByUFGSyIVDuGwIA1XgLwM3yohlTTtbQ+VC0ipUmnzF/4ePDeEq\nEFaVEQAWHuGD5ACPzx5Xfu4zFz6zuF16hLkh3D9pU4swSK3sQ0uHsJB5EOm8ISzV/RY9edd2qJzS\nEB6CQ7gnZcRsljd3+8AUCIeBPWWEaagcAGByRf39DtAbwo2vbYbxGFZGMGOhbMvqgXCfx6d9ww1h\nZoywMmJ9OBDecmiTMiiCYCG6GSoXEv+j7pnt9bJcslJhYPcyZBoIh6jaXSd2SSAMd8qI/LJr8kmn\n5XMcpykg8vQgElV4sVAoCD+GylGNRZvn+MIFLBpntD25E9Y1GUNpCJ/ayQfI3XjqRsNP5Nx/sT5Y\njgPh/mkcHmehTQmgar8XuhuloeogEKYqjFCoDeHWyghpvmqiUkbkK9vn63i5Q7jdc9x0glYPhIH+\n1rkKhIv3pyC0qozI/z/rR7zRnvtAmBvCzBjghjAzFrgh3Pw1hhkKWVY/icPb9Go4EN5y6AFo6Q4O\nLDWEj6qMQDTrtYVFG6JRqAakbRvCR1FG9B0I07ZsKCKlEd02ZJjF1bpEot4QpoGwy6FyNNhqM1Su\nqSGseJM9bgibHMJlQ5gOltOhDeHSI8zKiP6hIaaChTallCD+c0NDOOhfGUH31WEQIujKIRwY2v5h\nAkB61RDuQxkB9NeKXqxPp8qI+lmMYMeDQFhvCDe9thnGY7ghzIyFsi07mWxPIMwNYWZsmF6vY34N\n24ID4S2HHoCWQXBAhsq10wnAz0A4pUGh3hC2qIwggeFx0hBOHTaEg6J9JSx5og+T6mA7QhVelM8v\nHRjoqiEsIDAh217S4sB7fx+Lxtk0bGoI998sPCqLUH6nUkac2j1CQ/hCvSE8m/EHx75p1J203FcD\nxRn1UHUIu1ZG1BrC5CqTxrb0UZcNs0NY9fb2O2Rt2TBM21fs+BAI96OMqB/xehEIc0OYGQHcEGbG\nAjeEm7/GMEOBt+nN4EB4y6HKCKErIzpsCEdBtLgsGdFBv8qIWkO4ehm0GTgG6MqIan2pMiIVLgLh\nstVahv52DrpnSbUuk6DeEI6ly6Fy1YmOSUQCYUsNYRocKQ3h0N+GcNNQOWBFQ/hi3SEMcEu4b7Km\nENRCeJZlWJzsCKRpqFy/gXCWQWnL6rqb1g5huaIhDADhvH9lRMOVDVYGgAb1dfY3EO5OGeFDIKw3\ngukJCoYZCk0H2nwAzgyNcpvtsiH8jncAb3oTej+53gQ3hJmxYdqmx3xSxxYcCG85anu0C2WEeskv\npQwdEMY9N4SrICEKQruDiqgygjSE93aqA/sU/X4SUJURWiDcMmSYp6QhLOoN4TwQzh3DvQ+VI63o\nSWDHPao4hEnIrDSEoyE4hJuHypU84+pnLMIx2hAulREAe4T7pvGycmuX1
y8bKhf3ehBjev+gHvo2\nr2MpASlWOISB3t+b9HXuUhlRhs2mQLivk3eLE9KFhz4UYXU1h4VtummonJi6D4T1K1XoCQqGGQr0\nNUTMaHwAzgyOpqFytrble+4BvumbgFe/GnjLW+wssy2mzxQ2X7sf+hDwHd8BvPWt9pbJMMvghvBm\ncCC85RiVEZaGytG2rJCB6q4FGboWzvs96FYawkFnygjqmA2DAEjz9c2cNISroT2APWWE0hA2OIQl\n5CJk6r8hXA1LVIZRtbjUPG8Ir1JG+NsQNjmEy6FyZ0+cVcKhf3LtP8H1J68HoDqEaUOYA+F+oduu\nss1ZaAjTNmVQ7Jv1oXJ9btN6W1ZvCDe2pY8AbUMDyxvC83l/4Ya+zmpYO1JlhLa+NpUR9CQHRfjQ\nENZ+ecbKCGaA0ANtDoSZIdN1Q/iuu4pZDQA++EE7y2xL1+HZT/4k8Id/CLzylebmJsPYhgPhzeBA\neMvRm5RAN0PlBNEnlIRlozTouSG8RBnRdlBRkzIiX3j+adlJIKw3hIWdg26lIRzUG8L5HXlVtneH\ncLHOQgRKIGxLGUGDFNNQOb8bwnVlRBiEeMpVT1l8/blPei5uuOoGAMD52XlcifMnkJUR7kjJICpl\nm7PgEKbhWdAwVM51Q5ieVGxzYkf32zc7hPP/j76CcOXknQi1AaDdKJycD5XTHtPiip0Oh8r52BBm\nZQQzRDgQZsZC1w1hGog+9JCdZbala2XEgw/mf1+86M86M+Om69b7WOFAeMtRmpRCcwhbbCQtQmZC\n1RDuWxmhDpWjyojEhjKiCFVoQxgARBkIB/0GwjT0D4uDf1vKiMMVDmEAi0C4T2WEHqxMlIZwy6Fy\npTKCXFqutgr9bQgvGyoHqB7h5z7pubj+qusX/y61EayMcAcNQW0OHAPU1mzpd3c5VM7YELY0VE5v\nji5rCANuGrOBtr5tQ38fA2HTY4rC4vEEGeJEtlp+/t5naAhP3AfC+snnVHB9ihke9DU0nZq/zjBD\noOuGMA1fH37YzjLb0nWbkn6WuP/+5u9jGFtwQ3gzOBDeclRlROEQJq2kpMUBmaqMqDeEI1EEakHs\ndqhcSFpnabtAOI4lCSL9CIRNDWE6OLDNjpI2hCcGhzAA5w3hACEmYfVcbBokSbmkITw4h3DeEN4J\nd5Tn6jnXPAcAICDwvLPPww0nb1jcV2ojuCHsDtoiVP2yNnyrEgjLE3juh8rRE1lA4RCmyogWJ+9q\nDeElDmGg58ZscYJWV2SMsSHcpSd6sXyTMsKDhrB+YlJyQ5gZINwQZsYCbQiTC0e5IdwCehzEgTDT\nB9wQ3ox6SsdsFepQOU0ZASBOM8DQ7j36ssuAwRAIB1Mgg1OH8ERTRiQtD0DnCW20aYFwln9alk4a\nwmWIqT3HLZuFh2m1LlNPG8KBCDGJaAt8sxU+OFA9q/TScuXyfY8bwrpDuNRFlPz4i34cl+PL+PIb\nvhw3nr5RbQhf5Iawa7p0CNN9VxkI60PlnDuEbTaEGxzCeggO9HdyJ00BTKv9lqKMGKtDWHtMYaDv\nqzf/mNqkjIAHDWH9fYgdwswQ4UCYGQvlttyVMoK+Vh56KC+ZCGFn2ZvSdZuSfnb6zGeav49hbMEN\n4c3gQHjL0S9Rzf8OgKIYHCcpWgXCi5ZmfVObBJMiEO55krtUA2GqjGg7VC5Oq71OGPgRCJuGyi2C\nlZbNwpg2hEM/G8J1ZcRmK3zhAvL2XpBvIzQ4Gl5DOFdGUF0EANxw1Q347W/+beXfJaaGMAfC/ZI1\nKSMsOIRnpCoSmhzCLpQRHTmEm8JRoO5NBtwoI/QAvO2+2sdA2OwQJldzZAmAnfoPrrV8Q9DqQSCs\nN4TZIcwMEfoa4kCYGSpZVmiz0I8y4uAgv8KOfp52QdcNYVZGMH3DgfBmsDJiyzE5hANyEJq0UCjQ\ngz1zQ7hSRjhrCEeqMqLNZcgAcBiTQ
FhrCAdlIBw6VEYEdWVEq4ZwdvSGcJ+BcK0hHNJLzVsEwg0h\nktoQzlMzHxvCly8jf73vXABQbwjrXH+y7hBmZYQ7aGikbHMWGsKH5MhgoYzQBqw5dwhTZUTboXJE\nJdA4VK5nZYS+39KHyrX5UGsKXwGooXNxf1/7amNDWLS/mkNZvqcNYf2EhgzixQR6hhkKTQ1hPgBn\nhgTdXvsYKgf4oY3o+vJ6VkYwfcPKiM3gQHjLUQ5ADQ7huMWriB7s6eEoAEyDoonlUBkRhSEiqoxo\nETIAwDwxB4YAEMji03J0CNnjkZ8yVE5oDeGWQZLSEA7MDeHJbp6M9qmMUFycQYiIKiM2HCqXB8Lm\nEElpCIeeN4SnFwGRb3+ndk4t/X6lIXwxbwizMsIdyxrC7ZUR9UDYfUOYXnGhNmbbBMJHbgi7UEYs\nrtgJelFGCCHI+4EDZcSS0L+tQ1gP/hd4EAhn+vtQEKPl+WiG6R168M1D5ZihQrfjPhrCgB+D5bps\nCEvJgTDTP9wQ3gwOhLcc08CxkAbCyeZHKKscwotALchw+Up/R0K0mRMFISKlPWpPGRFpIXgoq0/L\nSdbf3ok+x+VwNQE7DeE5bQiH5obw7glHDWEyLHFKhsptGiTt7+NIDeFg6rlDuPAHA6sbwk8+8eRF\nKDWGoXK3/s9b8Q3/+Rtw9/m78elPo9eA0wYpbQhbdggfkkA4bBgq17tDOFgSFsq2703k5E7TUDkX\nyghy8k4IASHt7Kv1AWt0/7W47YFDWFFGbHjyTlm+oSEsI/eBcC3sDhI+aGEGBysjmDFAg9E+HMKA\nvw1hW+9D8zmUq17YIcz0gekkB78frWaUgbAQ4vVCiIz8+coj/MxLhRBvE0KcE0LMir/fJoR4yRq/\nNxRCvEoI8R4hxCNCiCtCiLuEEG8SQjy73Vp1g96kBDpSRoi6h3hKGpZXDvsbqEID4VBoDeEW6wuo\nDWHdIRwSFyIdxtY1SlO7Q4fwtMEhPD3uyCFMvMnTyJJDODSHSDSci3Y9bwivEQhPwgmuPX4tALMy\nYkgN4bvP342fec/P4M/u+jO8+rd/GTfeCDz/+cM6c9ylQ1hVRkT13xE6VkZYbAgfHmKthnCvTt1G\nvU+751gPX+n+y6dAWFFGtLxiRw/+S6TjhnCWAVJoT2YY80ELMzh4qBwzBra1IdxlIKwfA91/P1iL\nxHQON4Q3Y3SBsBDifwPwQ8jHopV/ln2/EEL8JoA/BfByANcBmBR/vxzAO4QQbz7C770awN8A+FUA\ntwC4Gvk0lKcB+D4AtwshvmfD1eoM2kgqDzzpZaqJJWWEqSE8jaoD78sH/SUNtJkTBqpDuM2gIuCI\nyggAV3qs2tFwNNJD/5bNQtoQpqEovT3dyz8V9KmM0Jvvka2hclH1CYeG3rQhHO742RCO4+IDaTFQ\nDlitjAAqbcRDlx5CnMaDVUY8duWxxe0P35uH2x/9KPDxj7t6ROuRZVDCM3X4mQ1lBPWf+6KMUE/e\n0femDJuv8GwGNRz10CFcBqMB7Oyrl2kyXAXCJq9xH8oIGboNhPXnAgA3hJlBwsoIZgwsawjbUvn4\n6BDuUhmhB8KzGXD+vJ1lM0wT7BDejFEFwkIIAeA3AIQAHgEgjvBj/x7AdyMPjm8H8M8BvKD4+47i\n6/9aCPH/LPm9AYC3A/iS4vvfCuClAF4I4N8AeBh5OPwmIcTXb7JuXWFWRthpCK9yCO9E1YH35Vl/\nDeGs1hDuSBmhBcIRaQhfmvXdEFZbZ4GloXJxdoSG8DH3DeHIwjCqPBCukpK9yd7iNl33aOpnQ3gR\nyK/REAaA66/KB8tJSDx06aHBKiNmSfWEXJxXofh997l4NOujN2aVoXI2HMKpQRnhcKgcbcsCxck7\nS0PlDg7QqE/wxSG8aAhb0vv4GAibQv9tUEbQ96cFATeEmeHByghmDHBDePnXNsH0OYI9wkzXcEN4\nM
0YVCAP4P5GHsh8H8FurvlkI8XQAr0Ue4r4fwIuklH8opbxdSvmHAP535CGxAPCjQoibGhb1Xchb\nwRLAr0gpXyGl/Asp5QeklL8C4EUALiD//36jEMKb/3djIymwM1ROCeUMDWEaCF/pMRBWlBGB2jpr\n46UElgfCoaNAmIb6tdC/5WXIsTQ3hGkgHBUN4UVDtQf0Ex1qkNRiqNyk+oSzG5KGMFn3wNOG8CKQ\nJ4Hwqd3VDeGzJ84ubj98+eHBKiNoIHwlG2ggTEIk2w1hk0NYD0ddOoR1nUDWYl99cIBGfYJPDmGA\nNIQtKyOWBcJ9nbxbpYxo2xBuVEYUgbCUbi5h1ZvRALghzAwSVkYwY6APh/BQGsK21td0Ip09wkzX\ncCC8Gd4Ek20RQtwA4GeQh7KvBnCU6OmHgUVS+RoppXK4K6U8APCa4p8RchWFidcWf58H8GP6nVLK\nTwH4OeTB8hcA+JYjPLZeyIOzwiEstPYoOm4IT6qw4cphj8oIrSGsXIbcoTKCNoR7VUakaqgCVO2z\ntkHSURrCk10SxPUUNNQbwrR11mKoHFFG0IawoozwvSG8U4WhR2kIH5scW9w+iA8wneYfmIFhBcKH\nSfWak9MqFB9WINwwVM6yQ7hURuhD5Zw6hPUTO62VEeaGsKqM6NchTNc5WjSEy0DYbkOYvu955RC2\n1AIHSmVEfaPNwuqNyEVwpbffAQBhzActzOBoCoR5W2aGBN1e+xoqN/aGsOkYiBvCTNewMmIzRhMI\nA/g1AMcBvEVK+Z4j/szLkAfIH5dSvt/0DVLKvwXwj8jD3G/W7y9axjcXy/mvUsqmGOgt5LY3gXCS\nSkDkFZmyGWxLGZEkchHKBYZAeHfiZqgc1ULoDeG2Q+VoQ3iiB8Ki+rR8uc9AOKOD7iwrI2hDODI3\nhMOd/gNh5USHPoxqwyCppoyISCBMG8ITPxvCmyoj6HoeJAcQohosN1RlBPUoDyoQJiGSdYdwWg+E\nnQ+VozqBwN7Ju1pDmITA+joDPSsjmvQ+Fh3CISLkhq0cnxzC9pUR9c8WXgTCtYYwKyOY4cHKCGYM\nuFBG+NoQZmUEM2S4IbwZowiEhRCvAPCNAD4HQ0O34WeehnxwHAC8e8W3l/dfL4S4UbvvRYbvqyGl\nfBjAJ5AHy7cc5TH2AW2PmpQRbYbKzRPSTF0RCM/6cglADQQDEVi7DBlQw1e9ITwRbhrCeiOa/t3W\nPZqQhvBOZG4IByQQ7muwHG1ghSJQBxW1cQhTZUTDULlgkv+fxLG9YRQ2qJQR6w2Vo03ogzhf/3Kw\n3JAawkogvDO8QFgPkfSGcPuhclQZke+7nA+V0xvCZF8NkW58uf9shkaHsEtlhLLfCrpTRuhX7PjS\nENZP3rUd8pq/ZnwNhHWHMCsjmOHByghmDLhSRrhQFlH6bgizMoLpGm4Ib8bgA2EhxCkAb0De0P0x\nKeVnj/ijzya3V82Zp/ffbGE5TxFC7C39zp5Q/LKB3aFytC0b0AP5gr2dKmw48EQZ0fYAdE4dwmFz\nQ7hPRQb1QC+eY1vKCE8bwnEigcDcEJathso1KCNIOCei6v/Ep5awrYYwMMyG8GFKnozplUVQNJRA\nuGuHMN1fL5QR2lA5lw5hfagcgs0D0mUOYdNQORcO4YUywtLVHN4GwiT0rzmEW74f60Plyn1ZGrgN\nhPUgHAArI5hBQrfZKdl18gE4MyRcNITn80JF55AuG8KsjGBcwA3hzRh8IAzgPwC4FsD7pJS/vcbP\n3UBurzpndY7cfoqF5Qjt55xBG8BlMBrShnCLoS7Up2t0CEeOGsJLhsq1bQgryggtEJ4ExJk877Eh\nnKkBOECVETIPTzdEDYTNDWExcaGMUE902HCPLlVGkDBcTKr/E588wqZA+ChD5UwNYRoI+9SCXobS\nEAYWTekHH/QruG9CD5GU4NKCQ5g2hCOjMsK3hnC28ZDKozuEHSg
jmhrCVpQR+fp4FQgvUUZsOgBU\nWT5pgpf7OxnEi/8LbxzCQcIhGjM4WBnBjAEXDWHAvTaiy/CMlRGMC7oclDhmBh0ICyFeBOB7kA+Q\ne9WaP36S3F7Vc6MXup/oaDlOoO3RSHcWQg3W1mXZgDVAbWUdzB0FwvqgorbKiCUOYaqMOOgzENYC\ncKB6rgFgHrfwRJNAeI+EwE2BcF/KCH27VryU2OzTzlJlBL18P/S8IbzmUDlTQ/gE2Xv19Zy2pR4I\nV8H4uXPwHj0gpSchrDuEA/eBsH5Zfa0hLNKNH8/RHcLuG8KLq2taNoSpciTU/fblv8P8/l5d70uG\nyqUtBgcC9aFyiiKn2Jf75BDmFgszNFgZwYyBPhrCpv2768FyXV5ezw1hxgVN27RrPYvvDDYQFkJM\nAPx68c//V0r5sTUXsUturzqspLGOrnpYLEdK2WY5TlAcwoawMGlR/1NUBYaGMD3wPoz7SxpoQ9Tm\noCJAdQhPSQMaAKaBG4dwooSj+fMQKoHw5uscZ9U7/vEdcyBMNQt9BQ2JFgjbUEbs76NZGUHDucjP\nhnDlEM6D0EAEODFdfV5qWUMYGI424jDRXnNLPMJSSlye+5V06wGp3hBur4wgjVnk+y6lkRs6UEZo\nDWG6r0bQMhD20SFMhrzWB4DacwiXDfASX4bK6S3wtMUVSoDaiga0E2CTK9X39EyTQ5hDNGZosDKC\nGQPb2hDuWxnx2c/29/mC2U6atl9+T1pOPaUbDj8B4FkA7gXwMxv8PN1VTRu/K4ekPdB3ZYvlCCGm\nK0LhZcs5EvP5HHfcccfK7zt79izOnj278vto4Fs7AAUQJy3CwnRFQzj0QBmhO4RbXv++TBkxJS3S\ng7jPhrB6wA2oWpA4adECz6r1OHnc7BBG6MAhrJ3osKaMOGtWRtBwTvreEC5UCczHRucAACAASURB\nVFftXKUGbA0scwgD+WC5I+xqnNOkjADUQFhKiW/4/W/AO+95J37/n/0+vvXZ39rTI1yO3qZUWumW\nG8JREYoKITANp5incyCcu3cICzsN4VwZcXSHcG/KCJPex6oyogiEmxrCQQJA4uBAbP6L1n5M6glL\n5bGJPCSlB+drL39CGsJUkeM8EGaHMDN86Otnd9f8dYbxHbrv7TMQ9rEh3GUgDAAPPADcdJOd38Ew\nOtX2+2DxJ+cDH1BPWpbM+7z00WMGGQgLIZ4J4MeRD5J7jZRyk3D1Irm9qiZ3nNzW+3D6cj634XKO\nxKOPPornP//5K7/v1ltvxete97qV36c0hItwKAppQ8eOMkK/RBVQD8JpGNE1ekOYhgytlRGkIawP\nlaMN4X4DYUMLnDzHh/MWnmjSEL7qmDkQlmH/ygi63Uah9hxvEAhLuZkywqeGsO4QPoouAjA3hKky\nYigNYT0QDo8/vtgSaCB8/8X7cdtdtwEAfv+jngXCpDFr3SGc1h3CQL6fnqdzN0PldIewNlRu08eT\nKyOO7hDuq9FiGgAa2BwqF60YKlf8noODDRPYTR7TEmUEghRxvHkgnCQAdpc3hF2EsKyMYMZCkzKC\nt2VmSNCwdjIBSGeGG8IbQj83PeUplZrt/vs5EGa6o9p+3wzgpxdf//Ivd/FohsMgA2EAP4y81fsp\nACeEEN9h+J4vJLf/qRCi7LD9cREg0wFwqwa80UFyum1SX86yQLhcjsTqAXRGrrnmGtx2220rv+8o\n7WBAHRpXqiJC0hpMWrwTUp9utEoZkThSRogOlRFLGsKzHgNhNfQ3OITbtMCJQ/j0cbMyInPeEA6U\nkEFuEAjPZsWbTIMyojyxkMoUWeBzQ1guAmHFp7mEVQ7hixf1n/CTw1R9Mp528z7u+mh+mwbC5w/O\nL27XWsUO0UMk2w7hWHEIV/uuaTjF5fiyhw3hzF5DOFQD8OqX9uwQNjSEF+tsUxnR1BBG/j2zWQgp\nAdFxUdgYCBta4LR5uP7yyVC
z6NRjkYwqIN\nOnhQdipbLTWpcuQ60ja10ieEW0vyfWWc11naY/F76/xgIJkhPA1F5egMLYo8yiJqCLfXn4G9e1VD\n2FlstiFM07JZDOH9+6X5MI+GMFAPNiLKRLOGsJQwhNtt4Id+SD5uDeFmKQoZQQ3hSdrAWTOEv/Y1\n9W8RPpgXZMQjj8hrGx2QnJWE8GAAfPOb3vK113p9h7U1uX3UEG7atcyEMbIJ4XSyhvAcK0tROXrh\npqDxgwc9U1h9s0RGpJEJGVF1QjiqijvVyZPAnXfKv8XN3s4OlGm5mZARvqFRxo3zcIjMDOEoQ/jY\nY8ewctTHRmRERtDfY8gqYAinREb8/tv6uMzzNQsrKAeoBrgwVepICCs3JOT4TFMIMEl0v18Yy+Oh\nKR3AjQ0oSfas/GAgXHQRqJkhLBLCLCIh3DIjIy67zDOFdUP4/vv9P9r5kBHNSAgLZARplAQyAgyH\nlr2BAGEIi3QwkD0hzBZlo1QK0ihjQvjCBa9NAmQ6WIh2xh5/Qz5kRGux3MG7PAO0AhcByGmaRkN4\nMQYZUXNRuX4foaJydOAyL0OYGsKXjG8CY6ohjI40hJvYWc1rCAOyU/7II9OFKZjEEKa/Ef3tqlKU\nyWUNYSnRR7j0UtVEobN3mq55MISjkBFFFUxOYgjXMbiTZAjHtYlf/7r6tzieTQlhzjnuWr8raNdm\nwRA2FZQDZqeo3B13yAGQZz0r/Py0ISNsQjidrCE8x6Id0HaGonLrJIx16JA2NYSNAH96etOQEek7\noD1jQ/XpT6t/i5u9bhdAO2dROT9tV16HO1vqLMoQBoDu0b/xFjIiI6jxQg3h0orpIZ2x0hvJu9tz\nXWIgLGbsmWmix7274G1vWanCONEbkoEroXV7F/dO/NlX7LkiWD7Rl3dHTekAbmwg88CFriYhI2hR\nOZdFG2eAd5Pd7UpOsLh5o4bDmTPAAw/4f7Sbj4zQB7P0AndGZASAS/zr05mdMxiMBtkNYZIQZh35\nO5XSPlGGcALep99XcRGioJwQ3ddHn5APGeEsyveVNmBZQkJYLyq3fykGGVEXQzimqFyehPBoPMI3\nT/qRnnNHcPl+b/SHGsJ9bAWd8aYnhPfvj36dSTSlJabyToNsQnh2NRzKvtKllwJXyFumxqXq4jRv\nhvA8J4QdBxBlfyZNCAtf4C2ffwtu+Isb8FPv/SkAs2EImwrKAbODjKAF5Z75zPDz04aMsAnhdLKG\n8BxrFJMQjisqpyeEWy3SoBFjtGnIiFBilnS6ozqgVLohfNoPlXqGcAZkRJUJYSelyeAMjAnhA0vS\n7e/t/5a3MAEyYojyGcJpkRH9kTyoz/eKM0xpItpZkMdF1cklekPShXenttJemahgntAzL5d3Cd9Y\n/4L8ngZ0AMdjv2gLOU6nHRmhMIRJQtg0uNTve0k5IWEIt1rSRFANYXJg9lciDWEdGVHm8ZxYVC5I\nK/cBcDMyAsBla95+5+BY31mfKCFMjfNSBvBSDmSJ9snEDxaig7RXXZe+bVIM4U65hvCkCeEoQ/h7\n30MsMkIZXKiLIawVlaP3X3kYwveevRfbA//4PPGMoIAVNYS3+lvBlM8mdlbzMoSB6eUIT3NCOKpQ\nVqsl/27C/UBdWl+XrPlLL1VTddYQbpbKRkZkKSpXlSEcZYKL5ShDeGMjfI01JYTF9ewD3/0AAOC2\n47dhu789E4ZwVEJ4VpARlB9sSghfcok8jqfBELYJ4XSyhvAcK5YhHIOMoAlh0RkLsBEdeRVcWyDD\nZTGqChkRlzpLMoQ5Bz71KfUxJSGcARnRclpwmH/qlZ0QZukTwrKoHA8KER3dfzTYHvfiB3DTTcBF\nRzyjbcFdCBXsMYn+Hv1RN7g4l8YQRjpkhGII7xJDeGEyQ1gxVRbKR2REid6QdLnX29bNkbx6zlXP\nCZa/cqpZhvDWll/wgiIjchSVM81cqBcZkT4hT
DucNJkksBGRhvBgOZIhHJiw7hBg41JveLPxzwfA\n8hn59463kY4DHF5TC8tNkhCmhnAZ7dOYR1+rdaRRvy/5wUDYEBa8yr17gSuOpEdG0Gs1a5AhbEoI\nC2TEvn3yhv/4ca/gDZaikRFNKyrnMhdLC/Ja08sxlUQvKHeJTz2aJkO4CGQEMD+GcN0J4bhp8CIl\n3IT7gbpEC8rphvBXvgLccgvw7Gerg1xN1DwYwlHICJp2ryohXCcyApDbH2WcfeMb4ceikBE7gx0c\ne0xO2TjfOz9zhnBUQngWDOF2G7jxxvDzjiP7FU0b3EpiCFtDOFrWEJ5jUUO4o93R6UXl6I0ATQgL\nIzhI6SzInsaeBTJcFiP9u4DyGcKMu2BibgzCxZn0huruuyWvUSgvMgIgHe92eYzZpGnXkQzhlrzz\nWems4Mi+IwCAk90H8bWvcTh7vB/i0tVLld8wSvT36A67WPE9lrIaTMUQriMh3DabKlUnhOXvy7E1\n8nqe+vTpvLpk9RJcf7E3T/0bp74WmGVNKCoXGAsTIiMYY/L4aURROZEQzmYI046oMITPnyfT3tIy\nhLVrZJ2GsLouPYWPip53V764qCbDQ4awmy0hPG6VmxAeZ2AIJyEj3vxm4M/+zB/EbBFkRMJgpWIY\n12gIu8wFg9+2+NvLuTSEWy0E/HfHkYnoe+7xP9dPCDOw0H2IyhCuCxmhzlRaXZHbvtOb3BBOSgg3\nsbNalCFskRHVKCphCFhDGAgbwnv3Aqv+6XjPPcA//ANw++3Au99dz/qlFb3nmVYDL0m2qJxcTkoI\n67gIQBrCIizmul5a9u71u4MivgCwsbsxE4awqL+xuAhcfrl8nCaEmzjomkbnz8uwwVOfqp4DVAIb\nce5cs+4nkpARVaMbp0nWEJ5jxSEjcieEF7MnLU1JvLINUgcxHe5WL3TR0NPBgERGZC0qBxCzfMG7\nkpaHjEifEN7c9M1+DX8hDOHd4S4e2XwEZ3a8NF7aafg0idcddoPO3vnzEW+YQOMxgkEFoP6EMD0u\nqu4cBTck7R0MuLetRSWEAeC5Vz0XgMe8bB+9HUAzOoCBsZCRdW2SXrysroTMaIRg9kSLICNcx5UD\nPaSoHC1aYzKEAZIw7aRjCOuc9TJveLMlhPsyvTtcALi3AYmGcIqE8IK7EBiTY1ce3GVM0VYYwrGD\ndz0FGdFuA9dco37Wnj3Aq18N3Hyzl9IRyoKMoO1A1YYwY0w59zj3Xi/SdFddpSa5KDYCQGAI71/a\nL2fjGL5HfH+Vg3U0IczA4DAH+1bkAEe3l72X/LUTpJd+8umRhrBIMPX7zUv72YRwtvc2FRkBSEN4\nmgr8FS3dEGZMbYuFvvOd6tYpj+Y5IVxVUbm1NQQzs+pOCIv1izLOqCEs1vnhh72+lwgZXHON99wd\nj96hvHdWDGFx3T54EMqMullICH/96xJ1Y+IHC1GOcJNSwiZD2HLt08kawnOsIS0q10pfVO7b35bL\nIqUTjJLlSAhXhYygKbtYQ9iAjKD8YNEAqAnh9J1ugBrC3u9VSUI4oVBRMB2XGEQr7RVcs0+6DV95\nWMKF0k7Dpwb57nC3VEM4NXsU1TCE6XFRW0I4Zvr0JHru1c8Nlp1rbgPQjMZWGsLZWNcm6YZwIxLC\nUeZohoQw4A+eAFpCOJohrGALWr16E8LaugTXrL5M9C4sqIbwyc2TmQ1hxph8HRncobNkitIoBhmh\nX6t3d2Xn67rrzJ1Moe4wPTKCPs9b5RrCcUXlgPBxvb4uz22BixAKGcKL0YgcxSD2kUpVpnkoQ1hs\n8/498ljcGWSLog1GA3z7lH9TdvY6YHd/YkIYaF6CSRghq6vxx7NJe/bIUMK8GMJ1J4QtMiJeIjAC\nIDgff+mXvP+vuUb+Rt/9brXrlVXzZgjXwRB2HHk+180QFjM4o/orX/cno6yuSjTVyZNeala854Yb\nvP+/+6h6c
J/rnpsJQ1jch+gDl4uLckBhWg3hj39cLj/72dGvayoT3WQIL5Pb3nluk5JkDeE5FmUW\npi0qt74O3OZ5P7j2WuDoUW/5Va8CbroJ+MmfzoGMqKioXFxC2MRoFBoMgM9/3ls+dAh40pO85VOn\nvJG0PMiIwHRcuACAl5fAokX0EpARAcssIiEMAF9++MvBctpp+AoyYiATwru7xSMGvH2cAxlRUkJ4\n7JZrqsRJGsKkwNJiOYYwv8rjCDehsdWRER23k4p1bVJTEsLUPKMJYYDgE1IwhGmxsUBpGcIapqFM\nQ0mf3aCnPHV8hUgIs6E0hItICAPyfKYJ4TIM4XGGonLHjsljUecH66IJ4aS2STGEyfbWOmDpH9cU\nBSAKygkphjAbB4NgUed98Pv6319l540mhEWbfGCP2kZm0fcf+748rk/eBACpDOGmdVijOtlpJVLC\nJ096HPlpEDV/sm43NZApu7MqpUkIN+F+oC7pCWEAeOMbvePz+9+XbM7jx5t3LlLNgyFcNzICkOdz\n3ciIOM78o4/KImJPf7ocmB1fdDf+8hMSWyT6yLOYEO715HVNv2YzFv/7NV3DIfABrwYgOh3ghS+M\nfi1NCDelsBznZoawTQinkzWE51hKUTknXVG5j35UJste8hLvAggAT3yiN3L4in8zIUPY7/zVmxBW\nkRHf/77sYPzETwCHfR+01/NSrlmLygHkt2Ec6GyV0uFOncACchnCaafhO8wJvosiI4DiU8LDIYKU\nOZAeGbHRk3Mu9y3m7JH6oqZL2aZKnIIbEmIIF8UQBoAr914ZpMcHl9wOtHYb0djqyIi0rGuTmmII\n04GO6CSlREYIQ7jdRlBgClATwoHaeZAR1TGEXbRC+0+/XouEcGscYwhv5zOExfk8hDyBafqrKFFD\nOGk2By0oR6fLm5QXGVH2YJY6QOuETH/93KPJz1hDeOG8164iekaENIRrSgiLwR2REF6TxyI9RtPo\n5CYpbrBxBAAii8rRKa1N67BOagjTY+LEiYlXpxIJ82fv3uypaHotpxi3qpQmITwazS+z0WQIi2XX\nBZ78ZPnYXXdVt15ZNQ+GcN1F5QCZED5/vhqmfZIhvLlJZpH5+jpB1d98sx82uOgY8JtPxh+evRk4\n8jkAxBA+PXuGMO2zmtoq0cY2eZAnSp/5jLy3feEL49tiGjShiLo6Rc+bKGTEPGOMkmQN4TnWKKaI\njV7oTVy4P/IR+fBLXhL+zAu95iIjlA4oS4+MoBe7a69Vb+5On/YvMO30KSxA+20WLpSYwFKL11Dp\nRnzQgGmG8DX7JTLiGydkidkshbqE+dIddLGXBHCLZt9lSQj3hvLuVkkIF4iMGDlNSAiXg4wAZEqY\nuz3gsq81p6icMwBWvF5ynoJyQk1BRgyGHHCFiaQlhN1wQlhcsy6/XGWcmQ3hlEXlNExDVQxhl4Wd\nkhAywje121CREZesSDc8b0JYnM8DyBO4aEPYY5+nH7yjiQxa1MQkmjjNYggPS752KfvYCe9j/dwL\niiAibAgrx3WK610wW6a2hLCPjPBnZq0syGOxN8p2EV3fIW7gtsdNiDKEm4qMoKmr/TnHLOl9xbR0\nxoUhnBUXAXjbKzq9Z84Ut05plaaoHDC/iSxqCNNBWSFqCDcZGzFvhnDdCWHOy8HppV0nOmioz7S4\ng/i7T3uabwpe9UXA8Z3jazxD+IYbgLM7Z3FyS63EPguGMO2z7jV0FafZEH7f++TyL/9y/GsPk24V\nvdbVqahj2iIj0skawnMsiozQcQKmhPC5c94IEuBNFbnppvBnUmOtacgIOg3ZTWAI04ZK53HSm7tT\np1RkhIt2qDNvkm4IV120BzAUZhKKSQj3RvKOMAuXVZgqlCEMFG8I500IKwzhCZERruMGhtXIqS8h\nLG5IFvcTZETBhvCPX/3j8o8jt6HblQUJ6tLGBoBV6djl5QcDzUkID4aELxs
1mOUXldvYkEaDXsQm\n1hAedoBxK50h7FbHENYH7wDt2rVwIeiQLDA1IbzUXgrO50mREb1RecgIHe+TxRA+nHB4K8iIhNkr\n1BAeseoYwibTPy4hHMsQpjMikpARdTGENWQE3S/9cUZDeJsawgextiY5kNNiCE9SUE6oqdsWpfF4\nMkOYMXk9ryMhHIeMsB1wOY2ano9U1hBujqKQEQvklqeoonJf3vwA3v6ltyvtMqBeA6rARiQlhIHw\ndZTOTHriE31DmBRvxtoJMOZhrHR+MDB7hrCprRK/3/Z2NUnvorS9Dfzt33rL+/YBP/Mz8a+fJkPY\nJoTTyRrCc6xRDLPQVFTuYx+TN4E///MSF0FFE8JpjTWls+ubGpUzhDUmZVRC+Ior1IRwYAj7yIgF\nJzkdDAB7OuUnhIdDZGIIB9IM4QNLB7DSDt/RpkVGADI1XTYyIosJbmIIO8xROtF5JbZ3gPoSwuJm\nrrM32SDJK8oRxpX/hPG4/iminiEs71AuXUl/nOpqSkK4P0xxTPvreN998jk6rQuIMIRFQbaB14uP\nYgjr14vGJIQXZSJ00VUNYUBep3InhP1zeXe0i3bHM56LTggnFtHT2ifaXiQawsP0yIiO2wGD17CX\nfe2i26wPcoh1AZAZGeGsJA+ABbNl6koIa0Xl6LE44NE/NufAW98KvPzlHosU0BLCOweV+5NpYQjP\noyFMp2TnMYQBedyvr1c/EJsGGQHMpyG8tQU88IC3/MQnml8jCm8B0hAej8vp+0yieTCEo5ARjoPA\nwCykqNyB7+GPjr8Mv/uZ38VffeuvlNdUXSQyKhUddx29+265/IQnmA3ho0e9AaF5NYTjEtZN1sc+\nJtf3JS9RB0NMOnhQekBNMYSjjmk7QJlO1hCeY43TVjX3kRF/8zfyIYGL+MrDX8GffPlPsLHrXSXz\nICPo61qrnjFXOkM4zhxt9WITwjoygiaEF9yUhvAUJYQZY0pKWCjLVHyRgKJF5YCSEsJ5isr5CeE9\nC3ty82aphPEyZPUnhNtr5SWEj+4/Kgca/BvDuhvcjQ0oXNy01yGTVFOK15cQHkWn3vWictQQzpQQ\nHnhmamRCuNUgZARdF5IIpQNX4oZWGMJb/S2c2ZFzq7MiIwDg4GGvV1hKQpgWWEs7eAe1PTIpCzKC\nMRa8pkpDOAsyotUKYzKoIXzk+mRkhF5UrraEsG9M02NxiN0Qu1Hoc58D3vxm4P3vB971Lu+xR7fJ\nwbgdbwg3lSE8j4YwTQFOagj3+9Ub/GmKygH13w/UoTvvlMtPeYr5NYcOyfb4zju9cMSTnuQ9/sUv\nlr+OaaUX2I66Nk2zohLCgBxYLgQZcZGsjHrv2XuV1zQ9Icy5NISvvtpLvZsM4aiCcoBXq8V1pZE4\ni4bwtLVDQllwEYB3noj25+TJ+NdWpTQJ4Xlsj9LKGsJzrLgiNjoy4swZ4JOf9P687DLgWc/yOhi3\nvO8W/IdP/ge89ba3AgAu9LMbwrTTtrjPawnLuJDGmQxxDGFqCF9xRRQywus8L7rxHW4hhVNbE0M4\nymTorIQTZZQjDAAMDIdWDqVel4AhPCzXEB6NkA8Z4SeEJ8VFCAkTqc/rSQiPx3K011ktjyHMGJOF\n6ha9nVl3g7uxgWCmAZDe+DMpOF4YB5xRbYZwf5RiYKfVA8Bx//3yuWyGsHeuNw0ZEZseBRRm7MpC\ndEIYAI5vHJfPZ0RGAMDFl3oH9/p6sR3jTIN3LfUgzISMSMG3F9tLr13ltU8p9nFLHYy57rpwp/1x\nj5PJ9uueklxEUzKE60JGRCeE0dqNHBB///vlskhMxqVt8wAAIABJREFU6wxhen+y1FoKEt/TgozI\nyxBu6rZFqUhDGKgeG2ETwtGiCAiKhqBiTD538iTwjncA997rXWs/9KHy1zGNRqPwtPdpNPGSFJUQ\nBuSxXIghTIIKIkQl1HRD+NQp+ff113v
/X345MhnC57rnwJhMXU/jsZQlIdykWThJuv127/9Dh4Dn\nPCfde8Tg86lT9aMCAYuMmFTWEJ5jjWISwnpRuYcflhfvW2/1Ol/3nrk3aNS+8+h3AORLCB9YOhAs\nL/z/7L15lGxXXfb/7Jq7q6fbfbtv3yk3N7lDkhsgM7lJEBFlfhGQQV1LNAoiKD8Vlq++iiJLXln6\n4qvI0iWoIC+CShAUBQIyCBhiEkImMg830527b4/V1TXu3x+79tnffeqMVfsM3V3PWlmp2119qk7V\nOXt49rM/323zAKLpEDVmoR0ZkfVHRpRKwNRUNzLi1ClYyIggE27A9tmUlmJBRgRNCI9NdRvC54+f\nr/3t9uHtXWarl+Tn0mw3MTKmRl9xJ4RpstApIdxvQTkpy1RpR2uquKlSUR00G/I3SPrRRKkzKups\n3U96AigMYTV619KkIeW0UyIJNciMxSv1jkwT8/Pqn3ZDeGJCR0IcOoTAhnCcyAg/vqzWXpPre8zH\nEH5y6Un1+5DICACYnBWfVatlto+yL94FbatHRsR/XqKGsF9CmD6nTtAFkSMjvBLCgGaWy8ko1d69\nwKc/DfzWbwHX/LD/jgh7Qjj+onI6Q7iQLQC8E5vKVx3vrUYD+Nzn1L+PHxf/1xjCNmQEY8xKCacZ\nGUG3SA8SwsGVpCEctKhcVOOeO+8E/u7vkh9vOIkawm4JYUA3i//kT9TjNCbupDaiiecnr2vZaEK4\n4G4Ix42MCGsIU1yExKAMDQGZcWIIl8/i8CV1cM4tZMTesb3IMDHolOc8MITTJVrI0F6I2ksyjNBo\nxHPN+mlQVK4/DQzhLaw2d9+iqieE9Vb7hhvE/48tHrN+Nr8mXIheispNDStDODcqjrOyYp5FGjwh\n7IyM2LNHrOrbkRH/fRu3kBET5fQiI4JuQx6ZdDCEbciIsIW66Lbr4TE1skpDUbn15rr12FhCOC8T\nwusAE1HCODsiOhDhpeiQEQDhEpeWANbua+BsQouLALKGE8IAkK0nh4xo02vaYzHLlh61M4SzWd18\neN7lLfU3PgxhOzIiroSwE07ADRkxPuyOjACAJxeftB73goyY2K5uYpMc4V7xPn7pYEDsypAKcs6q\niF6MReX8DGFyzocPOx/vTW8CPvABoMpDICOSKipnYwgzxpDlne8mt+74fr72Nd1ElAvVVkK4NgI0\nS10IETdDOE2m6VZERtCFu41uCMeNjDh3DnjRi4AbbwQ+9CHzx+9X95FwpFtCGNA5wnTOkRYmp5Nh\ntxk5wkGQEf1cx9a9UlBQWVrIGkg2IRyEIWwvKGdpRL9YZy44hWeWn7HCYc/d8VwrNLIVDOGN1g8B\nYrFDXqNjwWwbALoXkoZFLLeFnUFCOJgGhvAWVjtwUTndmT16VPz/iQW1N3m+Kka3shMYyg0FTpDS\nSRsrq1Gy6U7RK3XmhoxYXlbmmkzbTU6qFN2DDwKPPt6w0l3Dhd4M4TiQEd4mg/qOh8f8kRFh+MGA\nnrIrjaiRVSTIiJBF5egihumEsHgTwiGNsyOiZl2rIAySLMtitDDq8he9y0odMw4UlxJfge1KCGcN\nJYQ7xTWTUIMgI7wWOeyLd/aEMKBjIy69jFyUdR+GsA0ZsboaHU8wDDLiBT+mDMBtIz4J4R4MYXov\nj29Xn5dJjjDtm4DgeB8/fjCgEsKlXMlK6nhJnu96xIZwqIQwOWenhDDVuap/Ec3Ei8rZGMIAkIO3\nIfxP/6T/+8QJcf9ZDOE14Q56GcIbgSG8FZERU1Puz/MSbcu3EjLirrvUPfuD7tpViUu+p+lpHTFn\nl5tZPDCE45UXMsJoQjhFyAi3BZ0gCWHZB6811tDO641tefYE7j+jINqXzly6pQzhjZgQpt9zr4Zw\nGtqsQUK4Pw0M4S2sNtz5stlMVk0eyRb8HTuA/R1v8NiCSgjLSZg0hMMUcqLICJr2Mt0p9sIQtvOD\nAZG
gk4O8Y8dg4SKAYFtygXgSwn4mg/Ydkwn3kIMhbE8IU5MliKj5UiSG8NKS07N7V5iicrWWGNnS\nlXoLf9CnaKpQXh9xGsK0g2/kxI20bWibkYJ5dmmfWWkx8Q43EoYwkGxCmBSVs5tnWlqWssALeoJM\nipoI511ILko/hrDD60RVRdk3IUzM6fK06ij2zJStgiXSDKeLVxQT00tCeHRyAyaEO0XlgvZNVkFM\n3rTa0sQNYXI/uyWEpShTl+4+orKSudl0MIQBoMA615mDIVyrAZ//fPdxTp1pKgO8Ipj+g4Rwus7N\nTaaREXNz7s+LQkkmhCU/G0ifQXn2rOobvHARgJ4QpkqDuQJsHUNYJoQZ694hJQ3hWq13TqoTMoIG\nUYB0IiOooemEjDi92j0IOtc4gUfmH7H+fdH2izRDmHM+MIRTJvo+wxjCdPyZhjbL7ZrO59V9nfT8\nNM0aGMJbWF4JYYCYaSQ9evSoqhD6xKJKCK/WV1Fv1XsyhIfzw9YEv1mILiFcb7RFihHdJoPd7KjX\nRecvt2UCetpOm3TlVQujGYEe0tAEUSaEPRjCQHcldwDIl9WgpZwXibsuZETYhDD5XArD0SEjeikq\nR1fqTSEjNPOlw2hNChlRz4gbKQpcBGBL4Q0tJNrhttudRYYoGMIpSQgXPBPCarYmETd2UUN4ejep\nXhWGIdwx6KIyXsIkhBeqava0d0cZf/u3wLvfDbzjHeJnbnibXhLCIxPq4jaZEPZrq/sxhGVCOKgh\nrHHwI2y76HdsX7gDwiMjpI4vC7hulmUxPeywIgJaVC4dDGEAyGdkQribIXzzzc732gPHCHegIs7V\nnkiUhnCj3UCprD7HNE1WBwzh3o6xVYvKPfaYepw2gzJIQTmpbds6hblsqlSiW2wNo61iCMvFDaex\nD72Wez33NCaEwzKEJTJiakq1O6dWu13AEyu6IXxo6pBlCLd4C5VGZVMbwhutHwL09zkaYhNp2pAR\nbtc0YyolPEBGuGtgCG9h0YSwoyEsjYeMbghL0YQwIDjCvRjCjDEryVPPRoiMIKAoz4Rwx+xotZwT\nwoBt0kUSwj0VlStGU1TOjyEMOBvCuaHuhPC20jbtPffDEM6WokNGhEkIOyIjIjGE408IW5P9TBM1\nJu7JWAzhUrKGsIUxIMaoSWREUpOhJndnCLsZZ064CAB47WvF/48cAXafTxPCYvHHlSFsQ0YA0ZlK\nfulRavJTREC5UMaNNwIf/KCaYF06cyledP6LtL9nYNg+vB1BRNv04XH1eUWZELa31faip1JhDOGg\ni5Va21UQfxtFYonuYAmKjJie9jfPTqycACD6KPuuGClVVC6phHD3eRez7siIz35WPb76avX44Wf1\ngnKAe0IYgMawTNNkdZAQ7u0YaWEI2w3hqLfo0oRw0jUL7KL8YL+EMKDmVCMjwEtfqn6ehsTdVjOE\n7dcxoBLCQO/XmkoIk/a3tow2V8wtmhBOmyG8vKyKmFJkk5sh/Og5dYMenDyo7SJcXF/cNIbwuMN0\nMa1YJi9tFmREkEKng4SwuwaG8BYWh7dZqBLCqtWWg5dWu6XxGAHgmeVn0OJighXGEAaUWVWFMoRp\n0Q0TqpHWwp46c5p8Nhq6IeyeEFYmQdBJd3zICHcsCOBsCGeK3YYwY0xLCYdFRlBTpYkqyh3UZxoS\nwhQZYYohrF0HCSSErQ6+pD7gyAzhIWoILyY6QbOuJ5IQ3hzICMoQ9igqF8AQfvObgaefFhzGOu8R\nGRFjQtgPGaEZwvly13MzLIOvv/nruO/t9+FPXvIneMMlb8CHX/5hTJed06N20Xu5NLoBkRHN3pAR\nADA1K64PORk0KS0hnA1mCPvxgxuthsXU3T3qEL2Th5N9IROvX6vFNzmt1bnVL9M+2csQ/va3xf/L\nZeBnf1b9/NGTJKbeSQjPzOh/Sw3hamvV6nvTNFk1wRAuldQkME3n5
qbNZAgPkBFKYRLCAPCnfwq8\n5z1iF8All6ifp8Fg2SqGsMwK+RnCvV7LqqicSghzcKzU1Ip6sagWUuJARriZZ06GsFtBOb+E8ERp\nAtuHt2uhkY1uCEvM4ciI8/VCg2P0c0uzejWENwoyAlB90iAh7C6Hy3mgrSK/hLAyC8VdlssBV10l\nfnRi5QQabb3YHC0yF9YQlhzhBtZF4rY5ZHyVlJoqdnM0wzLIZXJotpvW5LNe15ERtKF3Q0YEnXSP\nFskyYqRF5byxIE6GMDW46fmcP3E+7j19L4DwyAhqylUbVUxMiG1x0SSE+ygqF0VCOAGGsJXcJExu\ntwJL/UpjCCeMjFCGMEkIm0JG5GrJISNIW1vIeSAjyHm7GcL0d5V6j8iITnsRlfHSbHHCWfVYyAKw\nsK5mT+VCtyEMiAWtS2cuxaUzAWboNtF7mfLPjSMjAhcAVd+xX1G5Nm9jvSkWRwIbwjn1vB27q5h/\nVGwHbDadJ0C9qpeicn64iJOrJ8EhsFC7x9wNYfl6nPQVKyu9F/cKKs6BVkulwuh5D+eHgBqAbBML\nS03I4fmpU2IBBxDjr/2kvuvTc2cBeXtUZjA1pQr2SNF7YqW2grEx0femyTSV7TZj4basUjEmJrPn\nzqXr3NxEx7e9muCTk+K8Od/cyAhx34jXabWAxx9Xv0tzQtiNEUy1Zw/wB38gHt9yi/p5GgyWrWII\neyEjjCaECTICEOYoDaFMTop5QtoSwmEM4ScWnsDTS6LDOjh5EIwxbY6wUF2w+qhGQ9zbEZQ2iUyy\nr3LbyXLkiOiD63Xge9+L7331IxMJ4TQjIwC12DJICLtrkBDewmozd4MU6EZGXHaZGuhR81eKJobD\nJi214i/DIhpsnCHcpIloL3NUjHi8EsKuyIiACeFcJqcm6FEmhBnFZARLgbezzobwDXtvsH52ZCbA\nSJeIfi5rjTWrMzVdVE6YDCGREVEkhEkiOlNKkCFcUmbZVkBGRJsQriU2GWq13RPCbulRuoDlJokT\nAOCfEI4RGdFoei9WUpOfbrt0Sgj3K3ovZ4tr1uTFZEI4zEJWmISwLCgHBMcZ6QlhMYFtt82eLwA0\nW4rpbyohLPnBgE9CmPaFTFw/cZiI9l079Hseyqt2amFFtV+3367+/upr2vjrM78AvOVaYPIxnFjS\nkRFOCwT0nlhrrFkTvjQyhMfH3ZE1QSTPbSMZwiMj3SZ+UGWzahEjTcgIk4ZwqwXccIM4z+9+V4zJ\nqVGZJoOSc5UQPv/88IsbaduCvdUM4TiREYA+9wDUwlAShvD82jxW66uOyANaUM4PGXH78dutRdmD\nUwcBwBUZYX8fG0F+hnCxCDz3ueLxww+nq591U6+G8OioauvT0F4FSQgPDGF3DQzhLSwesqicxg9e\nPNb1fMoUHiuEREaUiFk1JAxh08gImhAOkpalCeGhIbF622g18NE7P4qlse+qP6TIiICTboCkqFOU\nEB4eBupt9WaomfbO578Tn3jNJ/Ctn/tWaHORIiaOrxy32EuVitkBQbMJT2QENbXiSggXy+LzjDMh\nbHXwJCEcCzIiLQlhQwxh7W8TTAg3iSFczLlf0/S8vRLCUpohXBfGURqQEfR8fdOjRG4J4X5EF7Pq\n7aplwJhPCLsv0PZsCDfD716hn+G2HWoCS3fLmFCTpN5NJYQlPxgAdo3ucn2e9noxFpaj/GBAN6aH\nC6qvXVxVzsNtt6m/HznyLfzLUx8D9twGPP9DOFshLmBluqugHKB/72uNNWvCv7wsDKw0yG+SHVRp\nNLvdJE2fXnERUhIbkWRC2AsZ0e+45557hBG8vCzwChQXAaTLoHzqKVUMLgg/2K6BIZyMvJAR9Fru\n3xDuTghTybagVoveuKL37yOL92H3/92N3f93Nxbqp1HsDPWcDGEtIVxRF6lceKy11AVyaPIQAG9D\neCNdT/W6as+8+iq5k5pzgWZLu
3o1hBlTY9A0tFdBuPaNhv68gZQGhvAWFkVGOKZHbQnh665Tv3NK\nCFOTODQyIoaEcKMZzhylCeE9e0Tj99Z/eyve9u9vwwee/TFgrDNDzodPCAPEfCxFWVQuHEN41y5g\nramq0jOyl6eUK+HNz3szrtp1Vej3sn+b2uf6xMITWmdqMiXslxCm33tcDOFCWVwfiSSEY0BG6Anh\nxXQYwoYSwpoJmiRDuE0XOforKkdVaXQjI9wSek6vE1lC2Gfxzs3kjyIhbDfUJKP19Glzhprf4l02\nk0WGdb6Yzmefz6sJ5NL6ErjDm6GGf2CcUUFFhMam1Rds2hD24mIDvRnCx1eCJYS1z5fFV1iu0YC2\na4e+j5GS6jPcDOGVye+of8zeg4U6ZQjPhEoIcy4WZJMW56rd7hWdICXN7mo13ckzzs0Zwts7tTEr\nlXjHGHElhOn48D//s9sQThMyIiw/2K60bcHeKoZwksgIKjovMo3Tc31PAL7x9M2otWpYri3jq49/\ntWunhURGlErAvn3q706uiIuUgeE5O7pXQNwSwhux8Bqgt0VehvCVV6rHGwEbQcfxYQxhQLVZ584l\n3zYEXaQcpISdNTCEt6g4h1VQBfBLCNeRyQDXX69+55gQ7scQHiKGcMfEMm4I+6TOLJOhM/mcm1Or\n/Xv3At968lv4xD2fAACst9eAg18EAMzuCZ/CAmwJ4So3ntbx24YMdBvCO3cqEyHMufjpgm0XWI+P\nLR6LbODjlxBmjFnnLFeyNUM4goRwbjj+hLAyhKNHRmgM4dJCSorKmWEI29O3tZpoO++4w7xB5iXP\nhHAuXFE5qn6REUklhN2+00gSwmTXR7VZtVKY1ao5Q63VgqtRKGVvq3fsEOb9x+/6OCb/eBKv+PQr\nukzhhSrhKwc0yynffnibSgibLizX5OEM4Xxe5+c6SUNGeDCEtcXRJBPC5H2MEOdhqSLGFO22aGsA\nsVj7gyWyM2nmB1hp68iIAwe6X5P2RZVGRZvwpSFJS81bUwlhIB3n5ia6M8pUQhiINyUcV1G5VbLL\nfm4O+Nd/1X+ftAlBRdvICy8M//cbISG8EQuB+SloUblex7ZOReUAfXciAKvgJxD9Yh29fxtcndh8\ndb7LEJbX9b59emBAIiOmy9M4b/y8rtc4NOWcEKYLf1Eb3yZF32uQhDAA3HlndO/HlHpNCAN6m2Ua\nKxZWce1a2awaGMJbVO02PIvYAGpClsk38Nd/rRsMTgnhpxafsh73kxBm5YQTwh0z6Qlyirv2NPCO\nL71De37+kpsBAEcuC89pBMhnxDh4rmJ8oGU3GZxS4HrhQC4SwhEYwueNnwcGkTa2J4RNDgjCYDKc\nkBGaudmH6HWQH4o/IbzlkRFRJIQ7yIjPfQ645hqRVjSNtXFTi9Oich4J4U7bVSoFMxp0Qzg8MiIq\n06XpwUwGPJARUTCESdq/2qhaCWHA3AC4F7yP3Kr3iXs+gTZv4+bHbraKuUjdc/oe6/HhKZ94bUc0\nITw0Fl1CuBUGC5Kt48AB/6J2PSWEMzEnhF3GXaPEeVheE+3Xww+r93X1NS3c+uyt6mBDC2hOqu8X\nlWlce233a9JFEpoQBtKRzlpQaxZGDeE0nJub6NjWpCE8N9ffscIorqJydmPsq1/V/50mQ5hihGg/\nEVRTU6r/TashnKbP25SCJoR7vZatz9HGELYnhOM0hOn9W2+rL3V+TTeEV1fVogxFVHHOLUN4dmQW\nu0a6EU0HJ50TwrSdp+1/2kXnquMe2aEjR2BhNzZCQpj2lWG55/SaSLrNClJUDhgkhN00MIS3qPyY\nhYBKV7ZZAzfeqCePJC+YTmjotuawW++pWTU0FRFDmE5AvUyGzoSbGsIn9v0ZHjj7gP78w1/Hn/9F\nHS/7H4QhHAIZoZnmERSW86tcD9gn3Y3IEsKFbAF7x8WKwrGFY1pnahIZIVLR7sgI+V6AaJERWkK
4\npBLCcTEbHZERQ9EgI7RUdVqKykXBEO4c88tfFv9cW4uPD6alKb0M4U7btXevQNw0Wg186dEvaclJ\nqjAJYafXiS4h7FNUzg0ZEUFC2I6MoJxWUxzhrv7Yc/FON4QpN/ehuYe0v7nrpLpAL995eaD3QhPC\n+RFlCEeZEA5iCPsVlAN0Q9iLIax9vilkCK90omgUF3H+1Q9guWa74aYeE/+vl4HmEJ7//O7X1BLC\n9UrqtusGTV0F0VY3hJNKCLvxGgGzCWEnpQkZQT9/+r0EVTarjOSkzRVg6xnCUSWErb7FhoywF5VL\nwhDO54FaS53Y3Nqc1Y42m8CTT6q/oWOfhfUFa84/OzLb1d/OlGes+RTFytkTwhvVEPbqq/J54HnP\nE48fecR88XTTMpUQThpzExfGaLNqYAhvUQXZokrNtBZXz602qji5Ku78S2ecQVn9ICNKExEhI0jl\n+rzXNmS7IZxfwy259wEAMiyDK3cKQFCluYLnvuJW8GyfyAggksJydmSEJ0MYALL1yBLCALB/Quz3\nna/OozSueiDjCWEPZATgnhDOsqyxhCFdGMiW1PbfuLiG1gC0FD0yIpvJYjQvedjJMoStwWVECWEA\neJqEMOMyHZo0IexRKFGa1nI3x/u//X688tOvxHUfuw61ZvdMrlIPzhB2ep00JYRzmZxrcrgfuSEj\nAHMJ4SB4HzvSaOdOkdKhhvCDcw9qf3PXKWIIzwYzhEcKI9bjzFB0CeFm0CKvAJCt48gR/2PKz2K0\nMKoZ23Y5FZVLmiFMr7OVTiN6++3qb9m+W9wPXJnB4cPO5qJ9QSNtpint//tlCKft3Ny0GQzhuHiN\nfsZYmgzKfg1hQBksp093dnImqK1iCEddVE4Ygty3qFxihjAZG1JkBCAMTSkNabKqViycDGGZDgb0\nhPDC+sKmN4QBHRuR9sJypgzhpBexgiaEB8gIZw0M4S2qIIkkaqY1WupOe3LxSevxJdOXOBqH/SAj\nsqMiGryyYtZAC1y5PlcDwJUhPHEMNS5659dd/Dr82rW/Zv3NzY/drFVyD4OM0JKVRfOF5UJxKQEg\nW8f0bMP6nEwbwpQj3BhRvGnjDOGQCWE5KBsrjmlF9PoR/ewyRfXFxtURyQ4+OxI9MgIAxgqd0VFq\nkBFmGML6/SGOKQtNAvGZDm2PNKVbQhiAtcX86aWnu8xCICRD2MEcjywh7FNwzOk7pUamSWnIiKaO\njDCbEA7YVnc++507gZX6ilYY8MGz6jtu87ZlCO8Z24PpcjCHgiIjanzFmvgYN4RDJIQPXVzHO97R\n9RRNnHMrCe/FDwZsi6OdPjJphjBduKrU9IQwY8DJnJchPI2jR51/5VZUDkgHZ3eAjOjvWGlMCEfF\nEHZSq5WeqvH9IiMAZbC0WvEhqdy0VQzhqIvKLS1BFB9n+hbBJA1hmoqW9VQAPSEM6EUcXQ3hcrch\nLPnBwNZjCAO6IZx2bITsKxnTr8Eg2ijIiEFC2F8DQ3iLKgizkJpp0jwD9OJx+yf2OxpN/SSEM2U1\nCjK5eqhXrvdJy2aayhAuqf0ee0b34CUXvsT6982P36yZKulDRgTchgwAmQYmd6hRSFQJYQBYL0Vj\nCPeUEO5s2zKFiwBshnBBfbFxdURyop8hhjDdtmVaE/LYpQWsVWPiYjjIiSFsDBmRZEKYJmZtixxO\nhvCePeKfdEvi/Wfu7zquZgjXvRnCTuZ4Uglhp+80Cn4w4I2MiIoh7Lmbo/Mdz87quAhATwgfWzhm\nIQaCpoMBHRmxUluxrqXjx80ib1ohDOF3/8+6NvFw0nJt2TLHvfjBXa8Xd0LYZdxFDeG1+jrW1oB7\n7xX/vuQS4I7ToqBcMVsEsw/d15z5wUB3Ubm0ISNMpCqlBoZwf8cKI6+icrmcMomjTggD6TEp5edf\nLAIjPa5Ppilxt9UM4SiQEa1WZ5yU776Q04KMWG86F5UDzCS
Eh/PDVl+3WRjCfobwlVeqxxvFEB4b\nE6ZwGKUJGTEoKtefBobwFpU9keSHE6B8YFpQ7oJtF2hmrlRYQ5jyTdslZQibXCEPtQ05W1fspJLq\nBSZKE5gpz1jYiLtP3Y2/v/fvrd+nDxkRLiE8Ma3ehGmDhSaEl7PqGjKfEBbfMwNDhnU3cdJMqrfq\n4JxbyAgtsd2ntKR4If6EsDTq2JAYbY0URhzNcVOy7t9sU8cQxCx7QjiXyTm2bUGlJVE7Jij9DuMy\nHVpQ7a/9PnZK7sqEMC2YaGegA9DSpb4JYc0cj5ghTM3CgMiIKPjBgA0Z0YgyIRy+qJzdEKYM4V5w\nEYCeEF5trGJ3x1ut1cz2x2GKytEFaTcF5QcDdoZw3Alh50KvdDG5nani1luVUfGco6escdc1u6/B\nnuED+oG9EsIpLypH76GBIRxeaUBGeG2173fM45cQBtJjUsrPf3o6vLEilabE3VYxhL2QEf0WlbP6\nlUL3uDg1yAiPhHCvhjBNCDPGrJTwVmAIA2IRV147G8kQDqs0LWDFxbXfrBoYwltUQZiFbsgIWVAO\n6BjCw/0bwoVswZqItvJqtGySI6wlhP1SZ7maxe8a26F6AZkifdmBl1k/kxPRfeP7XJnKToo9Iexj\n+k/O1DE1q0bvxhPC21RC+BxX15BJ4L44Z3GtZuFsgNKE8Hpz3VrsiCohjLz6TOPoiDhXHXy7KG6g\nKHERADA1rEZ4K83k9oDJAVu2KBIP/fCDAeeEMFVshrC2mOWfELYMYZoQPuuTEPZjCBPjOVuIFhmh\nna/fQlZHUSWEqVEXaUI4KN7HwxA+u3YW82vCtaUF5a7YeUXg9+KWEAbMYiPCJISDGML0s9joCWHk\n1jV+8NCh71qPr9t7HS6dfo523HxjxpWxbC8qlzZkhIlt9lJb0RDevl09TgsyAlCGcFQJYfq5paGw\nXLsNzM2Jx/0sbKTJYNkKhjDnyhCOAhlhzW+cEsLrKUwIr81jdFRtBaLICDr2sRvCo8VRDdt1cEol\nhAFsOUM4lwOe+1zx+PHH033fyHFAL4bwzIy8MBZvAAAgAElEQVRa/Eq6vRogI/rTwBDeogpbVI4m\nhE+sqonXnrE9RhLCgOII17IqhmTSEA6bEJaa2as6bdmpUUMYAK7dcy1u/YVbQ6XUok4I279jP2TE\n332yjlYmOkOYJoTPNiJMCHeQEVnmbwhTw8xoQpiiQ3Kq94kjIVytymIkHK1CPIbw5LAaHVWayY3w\n5LWUyYvRVz+4CMCeEO6eHcVmCDOPhHC2+z06JYT9DWFvZEQuk7MS99IQjgwZQczCvIPbkM1ku9qz\nqBLC2UzWajPsDOE4i8pZbXWmBbCWoyEMKGzE90993/rZ5TuDJ4TppG6lvmIlhAGBjTClMAzhQAnh\nZfXm/BjC2ut1+siki8rZDeE771T/XBxT/ODr916PK/fqC8/7tk+73rcaQ7iZ7oTwwBAOrzQYwvbJ\nN2DOEKYJ4cs7zVgmox4D6TBbFhfVZ9LPdTwwhOMVLdwXRVE5yxAudEfd05AQzuX0onK1Vg2lMfXi\ndIzjlRAG9IXYA5P6LhaJrFuqLWF8Qn3om5UhDOgLQ2ntj1otda31Ygjn86oPSjMywquoXLUxcIiB\ngSG8ZdVPUbmFqjJ8poamusymUq7UU7V3eZwqzgEQK5RJIiOkJmZ1ZAQgDGCZuvq5y34O3/zZb2Ln\nqA/k0Ca7IWzaXAllMgA47/y6ZhCZNoR3lHdYRumJ9QgZwplghjAgtkdJRZUQ5rl4E8LWdZRfA+98\nFlHygwFgkiBfKu1kDOF2mwy+cxEkhLPJJYRpUTlvhrBCRjRaDa3g5RMLT3QNfHRkhLg33Ywl+los\nHy0ygiaECw5tNdBdWC6qhDCgFniqjSqGh1VqyNSCZVje++hEAzt2uBjCncJyMiE8OTSJvWN7A7+X\nQrZgvdZGSghTZIRfQlh
HRoj3kUhROeZcVA65dW2b6SncbT2+ds+1uGynnhC+6Dz3SKKdgZ1mhvDA\nEA6vYlGd99yc93NNKigywmRC+MMfBl7zGvH/vaRJS0NC2BQLO01Mzq1gCNOFjSgSwtb8JmXICHne\ndmQEALDh7ok3Y+K6/reH/w2/983fswoWA8oQftORNwEAfuLin+iaP8q5c5u3kRtW5vhmTQgDG6M/\nouMeOjYII9lmnTplts5EWPWaELazvLeqnGdaA216Balq7lZU7lxVjGYZGMZL410J4V7SwYAqLNdG\nSxRyW59IRUJ4ZGoJ6PypTJHmMjnc9pbbcGr1FPaM7UEv0gzI0pLxKu5BONH27zhKQ5gxhv3b9uOB\nsw/gmZVjEKY/iy4h7IOMAIAzFRVPmij2WeKciHJH25l4E8LWwGNI3TxRJ4Sp4VxFMiO8lRUyGOng\nHeymYVg58XmpYksIc/eEML2ey+N1/NK7xWB1bk0f5LR5Gw/PP4zLZi+zfibvd9YqgnPRPngZwsVs\nEevNdbCc4im3Wt5/04taPgxh+V5oexVVQhgQbeFSbcl6vW3bxGTdVNsViiEM4CN/U0c+X3JNCJ9c\nOYnTFRHtuXz2crCQQMvRwijmq/NYqa9gD9n5aTIhbNwQXg7OEKavly8KQnd8yAjncZfGnc9X8dRT\nnefkgCpfsJ6/fXh7F5rqyovcnVQvZEQaJqkyIZzLBZtkeylt5+Ymk4YwIIya5eXNnxC+4grg858X\nj9/+dvXzNJiUURjCg4Rw9PJDn0SKjKgtgXNu9c9pQEYAAB+aA7BP+9n0NHDHyVvx6n98tfbzQrZg\nmb3ve9H78NYr3+q4ICufA4j+rFQaw/r6xjWExwPkhzZCf0TfVy8JYQCY6lhA9bpo74fNWgeB5XUv\nexWVa/EWBhokhLeswvJlKTJCGsITpQlkWKaLIdyzIUyP0zGzzBrC6qb3rVxP0oDF8e6EMCAmZ72a\nwUB3QlhOAE0prMkQtSEMAPsnBEd4vbmO4pQY7UZVVC7LnI0kes5nK2oUbzIhTNNe7Zya0cSR+LBW\nfOM0hElCuIZk9oDR64hnNllCGO73MTWtb3xrDR/8oHhsT58A3YXl5P2ea6uZiBtDWHstYo5HkazU\nkBEO7RbQzRGONCHcMetk4loy8ExNZoIs0NLv+cdeJmbqTobwQ3MPaQXlwvCDpSRHeKWmIyNMLlq2\nTTOECcrKDxlBxzvDowkmhDPuCWGpCy8Eluuq8CljDAcmDyDbVs+/4fLgCeG0MoT7KcQltREm4IDa\n+VYq6RPVXiUNyIUFPSUVpfwSwtIUaDb1iXpYSWMsk9HNuSLpmtNgUm4VQ9jpZxtZLeIF9VpUrtUC\n3vUu4Bd+obtN9UJGyDomUtQQjjo8ohWVa+o3UKvYnRDesQP43onu6mivOvQqbcF5z9gexwVoOnem\nHOGNiIwol50XweyipnFa+yMThjA9T5M1gcIqKDKC3seNVkNuSN/yGhjCW1RBzEKNIUyREetiFixN\nJrvZ1G9CGAAwJDokk4aw3zZkt4RwZtjZEO5XURvCzSY0XqHkf1LFbQhTjvDwHsERjgoZkQuAjDi7\nRgxhgwzhDMtY10ojp1wjWqAhKlkD0qLq6U2em5PofVHLJLPkT6+jdiYChnCSCWGo9terqFyDGGf2\ngiUAcP8ZnSNsGcJQ93oQZARtH6MwlWhb7dQ3AckhIwCVZKxUzBgwYRfv5AROGsKTQ5NWUdYH5x7E\n908SfvBscH6wlDzWan01MmSEKYZwmwseoUwIZ1jG2sLqJvp65ZGYi8oFZAhLHTyo7mW5aJnNZHHR\n9kvUc3a5O1DZTNZqB+2GcNKTVM6VIdwvLgLYOIawyXMGdAPSJGbNS0GLygH9pYSlIVwu6wsG1BBO\nAzLCFAt7ZESZF2k0hNNgvpuUCWTEv/wL8Kd/CnzsY8BnPqP/ThnCzpFful09qYSwHRlRy
3azZ2Zn\n9R2Vf/gjf4jv3PgdfOb1n+l6rpPcDOGNlBCW32XQnSy0P0rSKPWSCUOYfh5JGvy9ICNW6ilYFU+J\nNqwhzBi7kjH2u4yxrzDGnmGMrTPGVhhjDzPGPsYYuz7k8V7OGPscOdYznX+/zP+vrWNkGWO/xBj7\nNmPsDGNsjTH2GGPsrxhjl/gfIT4FKTimMYQ7CeE2b1sMYWkE25ERvZpPmrHcYRiZHNw2ekBGZLNA\nK08KjxlMkcaTEBbfcZC0bJwJYQAozgqOsMmOkiIjcpkAhnBECWEA2D4sSPsVri7iRx4x+hKOsjr4\nfLTfJRVFRtSzCRvCrI12pwhbvwlhJz4vVRoSwm7GmRMXy15YrlIXsw55fRSL0BKhdkljiWfUZxHF\nZ+CHEwAcEsIRIyMAYahxzrUq2SYGwGF37NRbdXDOLUN49+huXDx9MQDgqcWn8PG7P249N0xBOSmZ\nEK42qxgdb1qDaZPICBMJ4f93z//D5B9N4uWfejmOLYr+ZEd5h+s1I0XHO8Mjoo9MmiHsVoj04CFu\npf3p2OpNz30NAFG8x2+nkrw3Ko0KhobULoCkTdPlZTWJM2GOjqh6iImfm5taLcX6NWUI79ihHsdl\nIgZFRgD9GcISGUG/W0A36tJgUppKCDOmMzmT1FYzhHstKved76jH9j7SCRlB+za6kysuQ7jdVni1\nXK4bGUELu0vNzuoBmpdc+BLccN4NjmMVJ9kNYdOL6nFIjvV6MYTT2h9tFUPYrajcci2lX0wC2pCG\nMGPs2wDuAPA+AD8KYBeAPIBhAAcA/ByA7zDGPsGYS0xQHYsxxv4GwBcBvIYca1fn319ijH0kwHua\nAnArgL8EcD2AKQBFAPsB/CKAOxljvxD6ZCMSnYAynnHc4uGUEF5aXwLv5OvlNnFjyIihaJERLa1y\nfXcn5pQG3L8fWK6rFq7Xc3OSTGGJF1/Gk08aOzQA/Tt2MvyBBAzhbcoQzm4XCeHlZX3bVj8KmxCm\nK96mU7Tyel6uLyJXEN9DrAnhvJqFaXzKCESREY3sQiKFBayBCMW99MsQzqYjIUwNYXtROfoeadLD\nKSHshozYPT2MD34Q+OIXvZmW8vNsZ6JFRgQxhO3p7ziQERwc9Vbd+ACYom4YWKDdHAvrC9b3vWt0\nFy7afpH1Hp9YEG3r0T1HcXjqcOj3M1JQ7kulsWotEhgtKuexyAH4G8J3nrgTb/nCW7BUW8LNj91s\nFQj1w0XYX2+oLN5HvR692VGvwxUN4pYQ3ndgzWLc0UXL337Bb+O/bvwv3PHWO3wn5XRBgzE18Usa\nGUFTlf2YaFKZjCqMk9YJ+LlzwpABzBnCuwgy2+SijZeCFpUDzCWEqTYrMgJQhvDCQrLpZyejLg2f\ntUmFQUa4fRe3364e29sdp4QwZdwnYQjbjTM7MmINzglhagjPlMM1XjQ0QhPCwMZICTca6jvZTIYw\nHQNsdGREUIYw7Y8GhrDShjSEAeyEoH4cB/AhAK8HcA2AowDeBeDZzu9/BsDHXY4h9YcAfr7z/DsB\n/FTnWD8F4Pudn7+FMfZ+twMwxjIA/gXAVZ3n/zOAlwN4PoD/D8BpCHP4rxhjLw19thFITEBFT8hc\nagvShLCckElcBOCeEDbBEGZl88iIXorKHTqkjJWRwohv8iiM8tm8Ml2LS1haiiAty9KVEKbICD5+\nzHpsqrMMkhCmRpKGjDCcEKbX876LxIX82GNqMhiVlCGcTEIYpcVEVvwtY46YKX0zhHP+DOE4zG+Z\neAb6Swg/vvC4lQZptBrWzo/RUhnvfjfw4hd7vw/5Wi1EnBD2MQsBB2REhAlhmt6sNqvGJzN0x07G\npT8uZPTvmfKDd43uwsXbL9aeP1Ycw9+/7u9DF5QD9MXKldqKhY1YXjZnInql3gFvQ3iltoKf/Oef\n1GobSPkVlLO/3nBZOQJRG6S9MIR3nq/uY5qyymayu
P686wNhrGT7L3cEyIlf0pNUU9vsqdJybm6K\n4pzpro4T3VjxSBQmIUwTWc0mcOedwY1Ft4Rw2pARJg1hmviOs1CgXVstIdwLMqJeB76vCE2BGMK0\n6BpduKf3TJyGsD0hXGk7M4RpgGa6HO4id0NGABuDI0zn5kEN4a3CEN4ICWG3/mhgCCttVEP4QQBv\nBHAe5/xdnPPPc87v5Jzfzjn/EIDLAMgs3k8xxm5wOghj7CCAd0OYuHcAuIFz/pnOsT4D4AUQJjED\n8BuMsQtd3s/PQaSCOYC/4Jy/kXP+Vc759zjnfwHgBgDLEJ/3n3cM5ERF06MZ+KdH5cRLFpQDgMmS\nWYYwPc7QZAQMYZI6KzgsBTsZwocPqxVck/xgKeuz6vBeTWIjUpkQJsiIRvkJ67GpTkQ/5wBF5SJi\nCAP6Qsnew+J6rlajn7BZHTzZdqxtR45A2r1RWkhkgqYMYZIQ7pch7JMQbjTimSBpCWEbQ5gao3UX\nhrA0m9q8jYfnHgaAnu51+XkIg1o44VEjI4K0XUC0CWF7YS7TA+Cw7VYQQ/ijr/qotgAXRhIZAQjG\nGuUIm0og9mMI//KXfhmPnXsMAHDVrqtwdM9R63cHJw/6vrZmxA6r9xH1pM1uCAdJCG/fQ5BVPfZR\n8t6Q93xaUrTU7BoYwr1rIyWE3/Uu4KqrgFe8IthrSFPSnhBOGzLC5PdK/54eN25tBUPYLyFMFx6c\nku733qt/JvZ2xxofEGQE3cVCE8KZjNraHmtC2MYQXmrMdRUXnp1ViL2RwkjosIWXIbwREsJ0nDdg\nCOvaCIawW1G5gSGslLgx2Ys456/mnP8z587ZLM75OQijV+r1Lof6dcCK47yTc661ipzzKoB3dv6Z\nA/BrLseRr7UA4H86vJ/HAXwAwlg+AOC1LseJTTSRxLhLQtgBGUENYblNnG4XB8wgIwoT5hnCWlG5\ngIbwoUMb3RDuJIQDcDjrrbqVIAKiMYRHi6PWOdcLp62fm+pEGk1uJYTtW+ul4mII0+t5x/nxcYST\nRkZgaKGvLaK9KvqEsDPHNA7jgQdMCNOtfzQhfPWuq63HkiPckyHs8HlEkar048sCDsiIKBPC5P6p\nNiJKCIdYoHUyhK/dc61l/L3l8rfgTZe+qef3Y08I0wSiKcNJQzg5tNVuhvD9Z+7HJ+/9pPU+/+n1\n/4Rv/uw38Xs/9Ht405E34deudRumKenICOUIxGIIu9Ru0NroTttdKgGl8f4NYXl/N9oNNFoNa+K3\ntqYn5OJWlAnhlZXod+P0otNq2KMlQftR0glhu3kEOBvClQrwN38jHn/jG/7XHjXFNkpCuFBQCy69\nit4Lg4RwtPJLCOdyyih2us4oLgIIhozQEsK2nVxy4SMuQziX60ZGzFfnu8xBioyYHg4fgae+wHJt\nWTMRt4IhnNYFSvq+em230oKM6KWo3MAQVtqQhnBA/Sd57JbsfTVEzOkhzvkdTk/gnN8G4GEIM/fH\n7b/vpIwv7hznnzjnbkOTvyOP02EI+0xAnYrKyYJygEr05jI5zSw1gYzIjQrjeWXFHHC+6ZMQ1gyG\nzvbw/QfqqDZF62E6QQrYDWFulCNMuZS9JISjStzJz7GZVQ2xqU6k2aJ8Rn9D+NllBcXUsAcGRK/n\nbbuUIRw1R9gy6GJMCBeyBWTbHVOxtJisIWyQIZzL5BTP1QEZAUQ/0Gu34ZoqBDyQESQhfN3e66zH\n958RhjCdiFBmrJe0RbNOYjopZEScCWF6/9gTwsYN4R4TwjtGduA7N34HN73hJnzkf/iWPfAUNYRX\n66vGt3dyDoD1lhD+3onvWY9/64bfwgXbLkAxV8T7XvQ+/OPr/9G3wBqg94elIfU+4kZG+CWEDx4E\nVur9F7WliyVrjTVtorq66vAHMck0QxhAas7NTZslISyNtHxeFEKzy2kC/uUv65Nxv7aTmmJpTwhL\n43Z62vnzCKNBQ
jg++RWVA9S17GQI33ab/u+wyAiaEAbiMYS1c863u9BLc2tzXYbw9pmmFQgLyw8G\n9D6o0qhsuYTwRjCEN3pC2OteHhSV89dmNoTpbLGrZBVjbD9E4TgA+JbPseTvdzPG9tl+d4PD87rE\nOT8N4BEIY/l6n9eLXLoh3FtCmCIe6ONejVOaqGTDykAz1Vn0goxw4/eZkmUIZ9pAfs18QthiCKcD\nGQGoc25kVENs6juuNdXAJkhCeKUuRm/jxXHMjsyaeRMd0et5ZCY+Q9jq4GNkCANAsd0Z4ZVSlBDO\n9pcQBshCkQMyAoh+oCcWdtyva9eicsTwff7u51uPH1sQW+2fXnra+lkQE83+WlEmhAMVlYuRIUzv\nHztD2BwyIhzv3W4IA8DlOy/H6y95vWNRujCiCwQr9RXjCRA6/gDCGcIy4Q4AV+y8oqfX14zY4ZgT\nwiGKyh08qN/H/SaEgW5DOMmJapQJYSCdk/Aoznl6Wk2A40oIy6CGn4kGKBP4s5/Vn+OHhKOGvldC\nOGmTknPdEO5X9BhpMYSlAZ/0Z21aTsiIf7jvH3DDx27Alx/9MgB17kEMYbeEcLbkj4wA4kdGZArd\nX+j8WndCODemCs2F5QcD+oK93RDeCAxhOkcdD9gNp70vAjaXIeyVEC4W1ULdICHsrM1sCP8wefyg\nw+8vIY8f8jkW/f3Ftt/1cpy9jLFoI3s+ohNQV0PYoaichowgiUpqfvWaEB4vjVsT2XZRGWimsBGt\ntur5gxSVK5eBoW0xGcIAUFoyaghrCeEAKbtGu4G1ZvQmokw51VGx3p8pVrRcuACCJYSlLp25tKfi\nS16iCWGJQAE2JzICAIq8c38kjowwlxDWjpFQQpjex0C3eZbNZK12kxpndKLx3B3PBYO4vo8tiGKO\nTy2qxmbfuH2d01lORfaiOH8/vizggIyIKSEcBTKCfscZl8U7+tnXWjVHQ9iUNIZwLb2G8JHpIz29\nPmUIFxNMCNP3kc/klZHf2d1Bi9oCvSeEaV9eaVS0raFJbvGMkiEMpHMSHoUhnMkAO3eKx0kkhJ1k\nZzZWq8AXv6g/x29s75UQThMyYnFRfR4mvtM0JoRlm7HZDGEnZMRv/Mdv4JZnbsHvfvN3AbgbwouL\nwMMP6z9zNYSHXJAR687IiLW16AoWU+Ms62AI2xPCuRzQLKjGuhdkhJYQrm+8hPD9atih1VTw0lYp\nKrcRkBGMqft4kBB21qY0hJlwdn6T/OgzDk+jt/SzDr+neoY83mvgOMz2d7FL36IavKjcwno3MgLQ\nza9eDeEMy1gJmFZetSqmzEJqMjgiI3J6AalDh4ClmjJVokBGaMcsLkfHEE5hQli8ATEDN8VJq1ND\n2MH0B9wNYdPaPrzdetzMz1ud0WZERgDAEOuM8PJVLFXinzVEwRAGnBPC1FCJeqBnN8/sReUA9R41\nZARJFk6Xp61UypOLTwIAnloihvBEMEPYCRmRFEO4CxkRU0I4bUXlGBh2lA3BSDvSGMIRJIT9FjkA\nPQlPr+sHzj5gvcegyXa76OtRQziWhDBF2pBFDcaYaq8MJ4TpYslaYy01E9WokRFpnIRHYQgDiiN8\n9qzzVn/TCpsQ/upXuxEe/SSE04SMoONXE9dxGhnC8vNP+rM2Ladt5nNrIg17avUUAHdD+A4H0KQb\nMiLTSQgzMG034mLNGRnBuXMROxOixhkrdK+mVJtVlCfUXHDHDmB+XV2IPSEjSB+0Wl/dcAzhW29V\nj48edX8eVamkrqm0FpWj1+tmTggDapFykBB21qY0hAG8C8A1EFzff+ac3+XwHIrP9iON0c0bdtii\nqePEKooTMIGMMJEQBlSSsZ1Vd6wpQ9hvG7I9IUwLygExJISLy0YZwtRkCGKqJGIIl0QvaWrQ22x5\nFyoCnA3h58w8x8wbIKL3xLnqPA4cEI+feELfpmZaVqom5oRwOaOW/M8sxT8qkAO
[base64-encoded image payload omitted]
BVKkhTBGI9++/3/I3Qa+dICqX5fnOJ1tO5OXSEYpBC6Ito3IV\nxM4TOyM3hFMpa0LYvkr2X/+/f8VnrvkMvvK6r4T3ZjpUHWsIAxgD8BMAHwNwLYBFAIz6/57FTH0V\nwH8AeCOAlQDS9X/fCOCnjLE7PWxnFMCvAXwZwOUARgF0A9gA4L0AdjDG3t3Oe3NVv9mJD5QQrhuk\nDEybFCEVNbLy5XygonINDOG0fgxhwJrcKXfJNW/+kRH6GUlu4qmJciILfgm3e9zzBWkydKX0N4Qt\ng9/+46hU/BlKXMKI6zZ7Gv1d/WCM+d+gQjEGLDnwXvHz///4HUq2S9vAOSM6ZMT+6f347Qlb7GrN\nw8jl1LLBTENYnte6FJUDYDGElSIjOqCtpgnhbFU2VEETwgmW0OaatYunN2eLsxgYkV+yr/uTBkx/\nN0ONc4RPnLAOnoOqVKMFfOJLCDMm2Xm0qBxgLkvn6aGoDGGaoqLLbdsVT50dnj3suAKFG2pHj4a7\ndNUuN4YwL6iWSqR8TdzSonLZUqPrGYUh7CUh/Ny4PJG2Ld0m7htAOMgIXQzhQzOH8OXHvgzAXKn1\n0Ss+CgC4YPkF4jVPHX/K8/6jMoR375aPw0BG7NrVfPtcNCFMv+OwEsL9/c0NYcA5JWwYBu7YIfuz\nf3TxH+Enb/8JJj8yift//37cetmtTcdflC9sR/gAwNq1ctXKSy+1PqZWsiSEYZ7UcTDtuYl6ePaw\nWKHDxduuQgHYu9f+l8GlgiFM2cfUEL5i3RXiO/3n5/7ZsmIlzoQwvVaFIZyThnCQhDAVTwgDJkdY\n1erBVqJc+3zF3BEDaxgzbRrdhA+8/ANYNbjKvonTXp1sCAPSAD4I4G4AfkZStwP4g/p2dgB4O4BL\n6/8+Uf/9exhjf+22AcZYAsAPAVxcf/33AVwH4DIAfwzTvO4GcAdj7LU+3qNFqSEFCeF6irAL+phG\nVNSkzlfygZARncIQpibWVGlcDB5UMIR1RkYAcnBeQwVImQcwMdHsLxo1l9fTOHOTFRlhTvIESZiK\n773b7OXSAaMOuqz/rUDBnCL/znPfwVQ+OHCXmo9zteiQET/f/fPGF615CIDawnK6JoSBcAzhfB7i\n3gToWQAUsBrC04VJMWALyhDW6fu1i15TrF8mS30lhOl3HJPpT40paqjxhLBhqGWvlqp6ICMAee8Z\nnx9HzZDr6xmTKeFDh9QXI3MyhOnSarrctl1xbESxWsRErrHzwJfc12rhMljtasUQHu4Z9tUHb5UQ\n3rBBLodWbQhzU67Zkn+uZ8dlQblzl56LoR6SEFaAjFi61CwkBJiG8FFbiYK4DOG/+dXfCLzUrZfe\nKgoaXbjiQvGadjjCcSAjaEEzrqCmFjWEzzrL/XVbt8p9vfWt8vdRIyNaGcIPHHhAhAOuWncVvnT9\nl/C6za/z3P+midI9k41V45JJmdR+6aX2i23bpQMyApAmaqVWwbE56yxOmJNZgPqEMGUI93f14/0X\nvx+AeWyf/83nxXNBJ1OCiCIJuSHMVygkWdJXYVMnUTzKb0/8NhZkBL+OM+mMlv6WrupkQ/ivANwA\nYLlhGOsBvK/dDTDGNgH4EEwT9zEArzQM43uGYewwDON7AK6AaRIzAB9mjDksoAEA/D7MVLAB4EuG\nYbzFMIy7DcN43DCMLwF4JYBZmJ/3F+oGsm8lB9UZwt2J/uYvjknUEM6Vc4GKypkJ4c5BRgDAxPyE\nWJY7OenyBy4ShkxSRhV1NhoAm3lZPzfbTUHkSEJYl+JbzWQtKmeuTw5iJkpkhHmBUMNDB124rRd4\n6iYA5vLpbz/97cDbpObjbNU0AZIs2bSIhx/ZE8I/2/0z8bO4bocPAoOHlRaWy+VgYQjrdF7z82u+\nPI90t5x8CswQJngfanroJLq8brIwKYwR/wlh8/PTuZ2mE1jVTFBDWH7HcZn+tH2kxlRYheVoQri3\
nKz5kBCDbrEqt0mByUGyE6sG4kyFMC+9sGPFvCK8elKBgJ2xEVIaaXa2QEX5wEYB1UspuqAAmh3Nr\nHVS3Z0+wiTqqalW28f0emmdLQnjZNnQnu8WEPU9JB1EyCayo16zVKSHM+wiZVAYfufwj4vc0IdwO\nR3j5cpkUjQIZsXSpNdHOFXTZu1dDeHAQeOop4O67gT/9U/n7KIrK0QDAcHdzQ/gOstrtfRe3bUNg\noHtA3Ft3T+52fA3nCBeLwRno1BCuIB5kBGA1Ue3YiLDRRSoYwm7ICAC49bJbBWbuzh13ipUQcSaE\n6biErwLiCeHR3lFlxiktnBkHMoIyhOOY6OhkdawhbBjGXxmG8VPDMNrMEVr0AQB8BHarYRhF+qRh\nGHkAt9Z/TAH4UzjrQ/V/pwB8xP6kYRh7APwNTGP5TAA3BnjPIp0TDBlhmm49TM8Btx0ZoTIhrGvq\njBrC4/PjgqE1OdkeW1Z0XOsmeG+6F4lgcxChixZI4YbmnsbJclcZBpAryRFPnFXcvcqCjOgLnhA2\nB3yGuLYtn6kG2r4dwI6bxc937LgjcHE5OsidqZi3gtHeUeXnOz2f5spzuHffvQDMZVbvvoCQgNY8\nfNolhAGgmpK928DICA3So61EzZiTuZMiZeInIUy5ybrygwFre1XqCmgIkwnauL5j2j5SQ3g5madT\nyREu1QjXrivmhDD5LqMsLNcsIdyV7MLKgZW+t92qsBw1hMM01Ozix5xKQaykMAxDmEt+k1nrhtaJ\n4AQ1Xal40q5WU1dYjrZxXgxhe0KYMSZSwioSwoBECoyPAwet3lIsReXK1bIwjM5ecrZlAnHrkq3C\nLGrHEE4k5Dl84ECwehNuyuVkwtoN56AqIbx8udVcdtKGDcBrXmM1pnVKCI/NjeHfdv4bAHOF541b\n/A3tz1xkftjH5o5hvtTYiaCF5YJiIygGqWzUDeEYkRGAc2E5Lm0TwrPuhvDy/uW4absZfsmWsvjK\nDpNXq4shLBLCebPzRmtiBNVA9wBWDZgN8s4TO9GTkQ1V1AzhBUO4PentEoWv18NM9f7WMIzHnF5g\nGMYjAHbBNHPfYH++njI+u76d7xqG4TYc/iZ5HMgQNgIWlSsUDGG6ZZJ6GsIUBG5HRvhLCOuPjLAb\nwjwhXK22t7xEdFzrxoqupgqVxbzsbt8Qnp+HxUjSDZfgpJGeEYm2UJAQFmzOhLmmTLfP4LzzAExs\nBfZfCcBcTvTQoYcCbVO2gQamima7qJofDFg7zPftu08s0X3txtfilWtfKV+45iE84X0VaEvZDWGd\nDEO69FeVIWxfzaFrQrgn1SPuI5N5mRD2jYyo3590nawErAnhefg3hOkKJSC+7zjqhHBZE4YwYP0u\n7RzhKA1hwzBEQnj98PpAE3m0cj0vYkYVd0J4YMBEcgAm4oGjOvwmhJOJJLYuMSPAuyd3Wyqrc4Vh\nrNBARitD2DAMYVavHFgpJtL4taeCIQxYGbP2+28cCeFDs4fE92vHoKSTaZy71LzIdp3Y5Yj7cBM/\nh7NZKF2JxEV5rU4F5YBgy96npkzTHmieDrYrrJQhPTd6+yuiKBw1kuhqM7sh/I2nvoFyzXSi/uCC\nP/BtrHJDGAD2TDUOfFQWlhMJ4WQJlTpDOI7VhNREpTxewDxeXrAy7IRwj0/PkJvYqUTKutqzrg+9\n/ENgdYrp5x/5PGpGLdaicnRsOTJirnTkbY8qfjAX5whP5idhZCR3S9UqFSc5MYR1rI+ls05bQ5gx\ntgFm4TgAeKDFy/nzqxhj62zPvdLhdQ0yDGMMwIswjeXL23irDaplghnCuWIJSJpXT29SL9OIy8IQ\nDlhULpeDZYmqro2EmyEMtDfolugA85h1NVWoqHm5aEX7hvDsLLQwGdoRY0wmteqp/yCd/EIBoqAc\noF9C+Iwz6szBJ/9A/O5nL/3M/Q88iJ7rxXoCTzU/GLAmhOny3
OvOvA4vX/1y+cI1D+OTn1RXWG5+\nHiI9mkBKKx7WYJccRJQTCg3hDmAIAzIlfDIvE8LtppcMw4pQ0HnyjqZKT+THxLLDBWSEN5UNagjr\ngYwAgLF594TwCy+o3S8fBDNm3gvG58eRK5sXDWVp+hFNCOuEjOD9VdqHpcaSX0MYgDAWa0YNO0/s\nbHg+bkP42NwxTOZN5hl/r4CVPx90lRBgNYSffdb6XByGMOViO53XFy43OcIGDDwz9ozn9xB2YTna\n53YzhIOYWl5xEXZ1dZkJaSC8hHB3r3P77JYQninM4POPSD7sey+SRZPbFTWEnbARNCGszBDulm6+\nKn5sO2qWEO7qkse8a5f6IqDcO8lk5HnVrriJvXpwtWONnrMWn4XrNl0HADiSPYInjj2hTUJ4eFim\ngwErAk2FKEd4rkfel6JmCMfdz+o0nbaGMICt5HGrxVT0+bNtz/nZzhrGmG9XstI9DsBAoeAPMJ8l\nM9KZlJ7GGU0I58o5ZQnhNOvWtsDakj6ZbBzP+TeE7QnhTjBHqXm5fK35BY+NeU/cdaIhDJCkVu8E\nwKrBkRHd8uLQLSGcSADbtgHYc434HUcv+JUwH3slOYheR6rklPw4Z8k5eMOWN2AkMyKSWljxJA4c\nnce3vqVmv5QhnGL68IMBq6FWZrJ3q5IhrLNByg3hyfwkevtMU6NUsi7JbKVyub70twPaanuqlCON\nOhUZYSkqV2wsKgeoNoT1QRrR79KOjBgdlcXC7AW6goqmZRMJGz84QEE5QN+EMDdQaR9WlSG8bek2\n8fjZsWcbno/bEKbvib5XbkCVa2VReC2IqCFsb3/jMIT3TTU/ry9YITnCOhWW2yfftqeEsJupVSgA\n99zT+LxfQ5gxmRJWaSrRc7mrV7bPXgzh2+67DcfnzJV9N265MdCEVjuGcFBkhDBXe6QhHKQN8iuL\nITx7sOF5PjFZLgO7ndHKvsUnFfziIrLFLKYKpsNKWch2vX7z68Xju3bfFWtROXtCmPODAWBxRm1C\nmHKEp9PRGsKptGEpKrcg7zqdDeHV5HGrmsO0d7nG9pyf7TDb37UlI1kQxo+fNNZckSSwNDWEKdbB\nXlQuCEM4k9LXYKBL3WlROcCHIcyqQNpsfXVO2XFRI2TxKtmDp0vYmimbRUcawmKpUaIG9J4Ijozo\nkp8dTXDqou3bAcytAMZNA/Wxo48FWjYqOhh9ctlzGMiI4Z5hYaL0pfvw8as+jl+/+9fiPLt8TX3R\nR6IKrHoMt9+uJtVgtl2medaT0Os6poZakSlkCHcAMgKQ3LVStYSefhlbaifBZBaUqwAp023Qua22\np0r5/Wl62sQaeZU9IawFMqIUPkO4QhLCcTAbqeyrkezin4HK4wekQWTnBwPBE8KWonIOhvDwsNxv\nVIZwrSYntcMwhGnq1okjvHatNPfjMIQtBeWIIUyvPRXYCGoI2xUHQ5hOdDid1+cvP188djLy3RR2\nQpimCJe4dKNamVp79wIXXQRccw3wutdZn6OG8JYtaEt+V+E0E50sSPW0NoSn8uYH9OSxJ/HFx74I\nwFxx+pnXfibQ+2hlCC9fLq81ZQnhHtkGxZEQXt6/XCDz7AlhwDqZpRobwc8hvwXl6P3Fzg+meu2Z\nrxWP7YZw1Alhur+hIbP2BVeYCeHJRPiGsGHIPmgyXRa4noWEcHs6nQ1hGp9rZTHSnKK9C6RqO+0p\nAEd4vkyL9ug54B7pkdymyfyksqJymaSe/GDAHCTyGzNNYAFmYTmvEizZunQ2VbgsyIhl8gv2io2w\nJ4R1ThZS0aQW+sdO6YQwUOcIA8C+VwEwl7o+cKAVscddwnzsIwnhEAzhVCKF+3//ftxx/R146daX\n8D+v/p+Wz/dlq18mX7zqEezfD3z728H3a+Ju6gVAk3qd03RQX4AcGZ5uyAgASA3KznU7HOFOMsAt\nhciIIWwY7aFu7AzhuL5je
v1GgYwoVfVBRti/y4bn609PTVmNsKCyG8KtkpTtqDfdK65Jp6JyjMmE\n5aFD/lbXtatcThb/oqGGmaKadN62ZSQhPN5oLCYSwNb64pW9e9WYaW0lhG0F5bgof15FYblmhnDc\nyIgNI43nNf0snL43N4VtCNPvdsCl+9jTYy7pBxpNrQcfBC69VKJmHnzQahr7TQgDehjC04Vp1Iwa\n3v8f7xem021X3Yb1w+sDvY+NIzKO/dJkYwSYMckR3rcvGJJMJOi7400IJ1hCrOpwMoQpukh1Ybmg\nCWH6fpsZwuuH14u07K8P/xq1btlRitMQHhy0JYRDYggDwIQhF86HZQjT8E2ymxTv1RQPqqtOZ0OY\n9shbNa+0S2w/w8R2DMMIsp32FMAQnivLu2CfpgNQ+4AlODLCHHRnNC0ox8WTO4EZwh1mjlJkRP+o\n/ILbMoQ7xFihshQj6D8e3BAmCWHdGMJAPSEMAHtfJX53717/2AgnZEQYDGHAXAZ188U3Y8XAiobn\nLIbw6kcAAN/4RvB9Uv55r2bXMR3UF2oqGcLx4wS8iBrCyX45Y9d2QrhD2ur+rn6xcmdsbizY/ane\nVidYIjZ8ghtDeGhIGh4qDWG+jBGIHxnhNSEMyCJQQVWtSsPJKSHsZJy1K84RPjJ7RBg2VNwQLpXU\np5+d5GaeqkoIr+hfIcITTglhQCbtDAPY2YgZblt+DGEGJrFKsK5eoua4XzUzhAuF9jA+bvKTEGZg\njkvKB7sHxe+fG3/OM0c5bGQEHVs1+2759UtNpv37zVSw/V5AOeTcEO7qsprbXhQGMoIeb9KLIVyc\nxl2778IjR8w+3pbFW/DBl38w8PsYyYyIFUdOCWFAYiOqVSvao105JoR7ok8IA9JMnS5MN0wMhZkQ\n5udQ2IYwAFy78VoAZvjliSk51onbELYwhDNqE8LL+paJ6+Z4JfyEMO1zd/fLncQ98d5pOp0NYTps\n7WrxWnr7t5/SYjuMsSDbaU995kjFV0K4Int0gxqmCIHGAUtvL6nQHCAhrPOAG5DHPVOcweCI7IG2\njYzQYEluO6Jprd6R0wcZQSc+0BcsIVwswlpUTsNrexsPNB24CjDM208QjjA3H5ODBBkRAkO4lbYs\n3iIMpsTa3wAwcLgVQMiD5nOGaLt6NcPdUENtvjIr2ucgnT5qkHYne7TlvQPWTjTrlYZw2wnhDmqr\n+YoGmhAG/ExYyiJ6cRVKpJ81ZQgzJhOyqgxhwwAKFX2QEXQVRbOEMKDuM6D9NpEQbrG0vl3xxFm5\nVm5gIwPRc4TDNoQZYyIlfCR7RCxpp1LNEfZqCBcrRYFDOGvxWRaeY5QJYUBNSthPQnjV4CrXa51/\nb9lS1jEh6aSVK4FUynwcdkLYiyFM078//KHsj43IBZ7CEK5WJf/2zDOBZJu39jATwj09QMVwNoTp\natXpwjSePv60+PmjV3wUXclWw35v4tiIw7OHkS83dqJ4QhgIxhEW/ZOeeIvKAc0Ly23cKK8zlQnh\nalVey34NYV5QDmjOEAaAa8+8Vjz+xaG7xHkftyEcZkKYMSaS0ScqB8UYJixDmLbvmQHn63hBrXU6\nG8K0i9BqFEZH4nY7UtV2vKkK4CgA9jiAJ/DEE87/Hzt2zHUT+Yr+xpndEGZMdlDa7dzN5ytAygxv\n93XrnRCmZhbrl6nH9g3hzkidcdE0a9eAz4RwJxrCFmTE8baWXtvVgIzQMCE8OAhs2ACgMIzEsYsB\nAM9PPC+Kc7Qr3sFIDoaLjGilBEvgkpWXAABqfceAwcOYmGjxRx6UzecBZqaH+rv1uo4tRblKs+ip\n970CJ4Tr6VHdDHC7aEIYvbKBDpIQ1r3d4hNYk/lJjIzKdXrtnOv0mOM83gRLiDbSbkrxhOzEhLp0\nYY3pg4xIJ9Pi/HUyTsPgKNsHpIA0hId7hpUsW+YJYaB1YbkwDDW7aF+VLsGnhnBQM4ayeZ1
SwnEZ\nws+NP4dyzWwjLl55seU5t3S+X/X1mYxoN0VpCGeLWWG2NJvkOHdJ+9iIZNLkQgN6GMKzsxKJ8gSp\njffnfy4f83Nu/36JOmiXHwzIhHCloqY+AyDPi4EB6woO2j7TyYvpwrRlVQMtnhVUlCNM98FFC8sF\n4QiLdjhmZAQArB10N4STSeDsOnngpZfUoYuoKemXIUyL4LVKCF+57kpxPv18910YGDQvmKiLyjUk\nhENkCAO2tm/I/LzCMoTpsWUG5E6MrOHqkdH/S0EYLKeQTmdDmOa3WhV4o4Xk7L1MP9sx0LoAnbNy\nAL4CYOftAC7C2952ES66qPH/O++803UT+SpJCPfoOQB1SrDwDnW7CeG5IuXp6m0IL+2VRrjR688Q\n7iQuJRd9j9VkVnRGT3VD2IqMUMAQpkXlyKBLJ3GOcG2PxEb8Yt8vfG2Lm48JMnkSR0IYAC5bdZn8\nYfUjyOWCp1lm8/I6HujW65y2D+p551oVQ1j3a5gawrXuAAnhdGcgMgDrBNbQSpnKbydtaTLu6wnh\nmBnR/By2m1I8IWsYwIkT9r9qX3NzAFL6ICMA+V06ISNoQjgsQ7hcLQsTQEU6GLAZwjPNDeFTISEM\ntObRqmZxejUNHz/6uHh88Qp3Q1hFUTmgMSU8RHx2FWk8r4YwTb0342JT/rMb7sNJ/ByemUGg/qKT\nvDCEAfnZlsvyc3nySfPfVAp4y1vka/k5F4QfDFjTnKqMJS+GcCqREn2R6cK08lUNXK0Kyyk3hDVC\nRgDOHGHOP69WvY8FW4n2yVUgI/iqFDdl0hlcvf5qAOYqjsxa84KIOiHMDehk0jTCT+TDSwgDtuT0\nkHmzDRJ4aiY64dfTL6/jvffsdfTH7P9PqEjvnAI6nQ1hQjZCq2k++rydwuVnO4cMw/B3S+sF8F4A\nr3gLgB24884d2LGj8f+bb77ZdRMWQzij56CbJlj4gIV3Pts1hOdLxBDWPCFMk9HlLjlQa7uonAZF\ne9oRxRvMlbI4o97P2r/fW0KroahcBxwzEAYyQu+icgDhCO/7HfE7vxxhYT72h88QbqXLVhNDeJXJ\nmAvaz8gWqSGs1zltN4RVJIQtOAHNr2Gaqqh1qWEI626C02urf7lMlrbDNLQjI+IUbyPdDGFADTIh\nmwWQ0gcZAch7z3x5HvMl6ywGTQirQkbYDeFDs4cE5zdoQTmu1YMyk+FUWO5UNIRbJYRXr5b7jpIh\nTA3hS1ZdYnmOpqJVJISBRkOYmo5RJoRpocSmCWEFheVUn8P8c0okIO7nThokWYOZGdOg5efW1q3m\ndcaxEaoMYZrmVGUI83O5mSEMyGt0Kj8lDOHB7kELTiKo2jGEf/vbhqc9SyRTNUBGrBuWDfJjRx5r\neJ6e64ca5/d8KaghXK1VBQpnSe8ST302zhEGAGz8OYD4kBGDgyYWy5IQVswQBmyG8LDZUD38sPLd\nALB+lt198jq+8IYLHf0x+/9LlsQTItJNqbjfQFwyDGMfY+wogBUArmrx8ivr/x4xDMN+C/4VeXwV\ngO85bYAxtgzAZpjp4Ifaf8d1JQGsBDBlALgQa9YAF17Y3iaKhuzRDWmaEAbMwedkfrLBEM5mzeSO\nV/Sg1RDW22SgA+45YxyJhFkN+5RnCHdRQ3gOGzcCzzxjmsGHD7cuQEEZwl2JbqQSndG0WRLCm36K\n/ZuWYssXF+H+37/f+lwLVSp141zzonKAOUAFABy8HCl0o4Iifrr7pyhXy0gn021ti5uPRsZsIxIs\nYV3KH6HsCWHALMi0rjlirKnmSnTyTq+2iw4iZoozYgAZZLCWy9cEb0y3RLRd9DwrpWQD3T5DuHMm\nsmhCuGdUTlh6Zb0DQL5YEeZo3PcmPqmRLWVRM2pIMDMjodoQzWYBJPVBRgCNWK4NXdKUjSIhTJdF\nK0sIDzVHRoRppjkpCkP4nKWSCeFkLDJmLtF//HFz4ia
f979UGvBuCD921DR4EiyB85efb3nOkhBW\nUFQOaDSEN28GHn3UfBylIWwplNhkomPL4i1IJVKo1CptJYTt57CYYFcg/t329zcfX1FDeHYWOHjQ\nTHEC5niUMRNV8qtfAUeOmCakyoSwCo4wTTf397c2hA/PHsZkfhKVeTOhcsbIGUr5960M4ZERs10e\nGws2saMTMuKC5RegK9mFUrWErz/1dbx565tx3abrxPNrSPhWlSFM+6d+DOGnjj+FqYIZdb1i3RWe\n/oa+rjZqfnmFgolQ6VKDoG4paggDkiGcYIlQvn9q9i/dfADjO8xVBFNTVsa4CtH2vas3D9QJEKNL\nR3GhB4OsK6ovQXOdzglhAPgRAAZgC2PsUqcXMMZeBjPZawD4of15wzBegpkaZgDewhhz6+m/izz+\nQZA3DQDoMwdjfgbfBUNePcO9+g66+YBlrjSHXDknljBRKLwX5atyhN6b6pyE8IncOBbVPYd2kob2\nhHDcg24vomnWbCmLjRvlc16WCtGEcG9K/+PlGuoekkUpeidh9E5g18ld+P4L329rO+J60LyoHEAG\nkZUMzk6Znb/jc8fx7y/+e9vb4u1fLWNeIKOZUWHqRK1l/cuwfni9+cOKHUCiEjghnCvLtmuwRy+z\nMIyE8HyJMJM1N0dpqqKUDJIQ7pzJO7qiodI9JgYz7RjCBXI/jtsAtxRGJClZ1YaozsgIoLGwXBQJ\nYZqkVJUQbsUQXrpUJh+jMIS9MISDDsaHe4bFcT879iwMDnUl4kuvDSPYcnPAmyGcL+eFyXnOknPQ\nm7b2u1UXlQMaDWFahCvShDBFRoy4n9ddyS6cNWo6ozsndqJc9QbGDZODTROzzWTHcVB+8AUXmP9S\ndvULL5gBDy4dEsL2a9NLQrhYLaJqmM63qjaLy2IITzUawoBk6o6Pt7dqlIq3wywTPzJiSd8SfPLV\nnxQ/3/TDm3AsK+sfrSV43oPe6i62VNCEMC2C/aoNr2rySikxLgBQG5AHcuRI+/v3K7shfDJvhhhG\nekZCKd5Mj3lk/X4A5v3nwQeV78rSt0j3yus4kwow83ka6nQ3hD8HgC9I/we7mVv/+Qv1HysAPu+y\nnU/V/10E4O/sTzLGNgLgmP3dCGAIixnJAIYwTQiP9OppGgHWAcv4/Lil89lOBy9flncAe8dUN1H+\n6URuQnT+sw1DOwAAIABJREFUDh/2brQ0pM4051ICjRXfzyCBIe+G8HzDtnQXYwwXrbio4fdOCYFm\nEueG5kXlAOtg44La+8TjO3bc0dZ2DIMP0gxUu03nNS5+MJdICXflgKXPBTeEK9Qs1Os67uvqA4N5\nP1JlCFMDXPfrmCaEi4kADOEOaqvpPXkiPybSavv2yeJCrVSo6cNMditudTogI+wJYaqoE8LNjLN2\nRJERTgxhxqTJcOCA93PWr1olhBMsoaSd4zzameIMjmaPNjzPjSQgODbCiyH89NjTwjizF5QDwmcI\nDw8DdBVwXAnhVsl3jo0o18p4afIlT++DJoRVG8L8c2qW/AYakRGcHww4G8J33w385jfm482b/SUE\nVSeEmxnC9gk7JzSESn4wYE4w81VXbv1/WozPLzaCt8PJvviREQDwJ5f9CV63+XUAzDHvu34ks3Nh\nG8J+Vkr4MYRHekZEX6dKDOFnvZNiAqlUkv1ye0I4DH4wYOVDJxbJ2df77lO/L3otpzPSFNNhJVYn\nqWMNYcbY5Yyxm/j/AN5Mnj6TPld/vkH1dO+nYKZ7LwHwEGPsLYyxixhjb4GJdrgYZjr47wzDcLOm\nvlV/LQNwC2PsXxhj1zDGLmGM3VJ/bhBAFcCthlEHp/mQSL/1maMUP4ZwGbJHN9yn76DbPmChRlI7\nHOFCrXMMYfsx8w5ArWZWWvUiWrQH0N9YAczzmt8w/SSEKTJiQGMMipN+8NYf4KIj/wj8s1yA4JYQ\ncJMYrHRAUTk62Fg
+/xqRtLh7z93YM+m9coQ85nnUkmZDGBc/mMuCjVj1SGBDOK9RmtKuBEtYGKy0\nCrgX7reTKN9et+O1ixrCOQRARnRQW00TwmNzY9hQ9/Hm572tYjEMoGjoc07TSbPQDWHNkBGWhPCc\n9SAzGTlwDC0hHEJxpu5Ut7gHOCWEAZmwnJ/3n7LzKjfz9Pic6bIv6V2iZNn5GcPy83M67rAMYbck\nqaWgnIMhbGEIl9QnhFesaMQaBBU1hJutMObndXeyuyX2i/KfOZe0lWjymRqxQWUYVmREM9k/W/o+\nzq/TQXgqHQA++1k5+fK2t/l7f1Eawm4JYSrVCWHGmEgJH5w5iGKlcRmsiuuYM4QTveaDVCIV67iY\nMYZvvOEbWDmwEgDw8z0/x5FZMzobBjIiSEK4WCniwQNmxHXVwCpsHt3c4i9MMcaEQTqXOAjTUorO\nEKbnOi/oyvs7tBaGSvWme7Gk15yVm2EHBILm/vvV74u278lu9+t4Qc3VsYYwgPcA+Ab5/+/rv2cA\nXml77utNtvNRAF+DeYWeD+A7AB6r/3t+/fdfNQzjY24bqBu8bwTwaP31bwJwF4BHYCaMlwAoArjZ\nMIy72z9UKRHt750EEmVfN8YSowlhfQegdnPUb0K4UJUfUtwD0FayH7OfDkAnFpUDpBmSLbZvCM/M\n1kRCeFBz9qhdy/qX4UK8B3jxBqBsumpKEsK6IyMAzGUTuPkiWQDzKzu+4nk7YjKsVzpRvAMSlyyF\n5VYHN4Qty+s1TI/ygT1NCAP+U8J5zY+XqjvVLd7jfO30KCpnxwzQlRxesBHlMoA0Od60HgxhwGoI\nh8IQ1gwZQfsadmQEIE3xsBLC3DhjYNYCNAHF8QlHs0dRqTXOTNGEZVB8Qis5ISNqRk183twECapV\ng9IN5WYKVViGcJ9LE93KEHa77oLIbghTs1plQriry52vaxiGQKFsGNnQEl/Fk92Ac0FAJ61cCTER\n9+tfB1uRQ1UsSg5wK0OYIiMOHZI4iE2b5OdOE8IzJAT+9rf7e386ICOoVCeEAYmNqBk1HJxpjMSq\nuI4lQ9hcpTDUPaSUhexHi3sX481ny0wfvzcMDcnJB1UJ4SAM4d8c/g3yFXMDrzrjVW19btwQLhl5\noNdM50ZlCNvvvRwXAYSXEAYkR/j4/FGcd4EJ9n36afUTsfRaTnTLLziTXkBGtKNONoQB03z1+r/z\nBkz9IYDrYTKFj8A0b4/Uf77OMIyb3f6ebOckgFcA+CMADwI4ASAPYA+ArwC40DCMZsa0J1k6GL0n\nfN0YK8QQHtQ4TWlPsPhJCNdqQBmEIax5Qng0MyqWYdsNYa9LhDqxqBwgDcxsKYu1a4Fkfe7DkyFM\nXJhOOV6qkREARgKYNJ3wvVN7Ua1VPf+9GBTUE8I9qR5tC+vZr+N3XfAupBNmMbmvP/V1x2SEk8Qx\n9+ljCF+w/AIkWf3EXfZMcEO4pvd1zAf2M4UZJYZwodY55iggU8Jz1SBF5fRJzLaSJSFsM4T37XP4\nA5t0O15qTGVLclQRDkNYL2QE/S7tyAhAmuKzs2oMmIZBab3K+UhmROnnwbERNaMmkrhUtMbMAw8o\n262jnBLCJ3InhFG9YmCFkv2sGiCGcLbRED7jDJlqfeGFYPvix9TV5Z6U5YZwKpHCecvOa3g+DGTE\nmWfKz/iCC8IzhJvhIsbmx4Rh5CVBypERgHNBQDddfbV8T4884vnPmsqNd+0kavZ+7GPyfs9xEYDZ\nhi5aZP2788+3Yg/aUdgJ4cm8dKnsBrBjQlgR5sayTXLOUPQIV1BDuFg08QEAUOsyr7u4+MF20SJk\n+6f3i8ccG3HokBrET5CEsB9cBBed9EyNmu52bIZwTvZXaS0M1eIc4ZpRw4VXHwZgfoe//KXa/dDj\nS6QXEsJ+1bGGsGEY7zIMI+nx/5bOiGEYdxmG8V8Nw1hjGEam/u9/bSfRaxhGzTCMO
w3DuMowjKWG\nYfQZhrHJMIz3G4YRcF7elMUQ7hv3ZwgnOmPQ3Swh7NUQNpfkdg4yIplIiiUcfhPCncal5OLLd+dK\nc0ilDJxZr7HwzDOtlwvNFjrjnHbTMO9vTpoHXaqWcHj2sOe/txeV05UfDKDhOl7atxRv2vomAOZg\n+We7f+ZpO8J0zEQz2+1FmXQGw931UVDmZGBDuKTR8non8YH9fHke3T1yAsOvIWw53g5ot7ghPFue\nBJ93PpUTwrQI5vj8uEiqAd4SwnZERtzfsVtScWhIph9VLFXVHhnRJCEMqElJ2welnKOrml9pKSzn\nwBF+FRnH/+d/Kt11g5wMYcr4XdGvxhCmSWOnhHAqJVEDL77oH+kDtObMzpXmsPOE2VndtnSb47me\nTqZFwR9VCeHBQeCnPwU++Ungox+NxxBuhx8MmIYJbwO9JoQBaQgD6pZge2FDc115JfDqV5uPafqX\nGsKMWY1jwH86GFCfELajTyZysrNm70c6GcK0aJYqUZOZInW4Vq2S340fhrBsgw1U02b7G7SopSpR\nw/TAtGTOcmxEsdhecXU3BWEIBzGEKVN35VbTEN61y4qiCUv2ey/nBwMhJ4TJd3rGhfI7VY2NsLTv\naZIQXigq15Y61hA+XSXSZwDQP+bPEE7Wrx6DaW2QqkBG5HKwGMJxD0C9iB/3RG4CGzfKpGxbyIgO\n4lJy8YRwpVZBsVoUrLFaDfja15r/bbbYeYgMKrshDLSHjbAjI3TFRQDOA7UbNt8gfueUjHCSOOYe\nOSLRoXO7qLf+HjJTgTqw5TJQS+k9sUMNtVSfbJT9GMLVaudMVnLxybuKURbJ1yAMYR2/YyrGmBhA\nnMidaBsZYccZxf0duxnCjEnWrIriYw1F5TRDRjRLCAPqDeGBAQMzxXASamuGiCHswNM980yZOnvo\nIXVL7p3kZLIdyx4TvwsDGXF0rrGoHCDTheWyt2vVTa04s08dfwq1epkUJ1wEl1hdUlSTEAaAK64A\nPvIRc8VVHIYwXebvxTBMsATOWWq6pnun9lqwBc101VXycRyGMGPAP/wDkE5bf0/T90CjIeyXHwyE\nmxDu77eaZPbixPZ+5cqBlaFM6tFJBKd+MGMyYb1vX/vGuDDv0zkYzJzAj7OgHBVNCB+Ykeah6sJy\nfhPC2WIWjx55FABw1uhZljbXi6ghvHijeSDVqv/igO2oGTIizIQwNYQXbZAcYdWF5SyM+NRCQtiv\nFgzhDpM1IezPEK4mzTs/K/fHzg5qJvvyVD/IiHwelgG3zgY4Fx+o5co5lDEvkrK7dpnmaCt1KkOY\nplqzxSze8x4gUT/dv/pV91RLpWJbah4zl9KPlBnCdWSErgXlACt3kF/HFPVAlzM1k2j7eqbF73RY\n/jaSqVek7p7F+ITv+qH1yazOSAgDQCIje2V+7ksN5qiGx2sXLSzHk+qnckIYgMUQ3rBBOqWeE8Ia\nISPcisoBkjVbKAQ3RE1kBGEIa4CM6OvqExMQ9qJygNUQVoHNoIO2dG9eYBNUT+K1SggzJlPChQLw\n8MNKd2+R0zL8Y3PSEFaVELYgIxwSwoC1yFcQjnArQ3jH0R3i8UUrLnLdDr9Xq0oI2xVWUblmhjDt\nu3gtcMuNYwOGZbKgmdatU88R9lIskGrLFuADH7D+jiaEAashfPnlVnOvXYXNEKYJYTt6zN5GhcEP\nBqzICKeEMCAndgyjfQa6uA40C1EA1gkUagirLizn1xB+6NBD4p7VbjoYsBrCvSuksx0FNiK2hDAx\n+SdKB0T78MwzwPS0yx/5EL2Wy0z+EHf/stO0YAh3mCyGcGbS142Rp84SFb0Hn/YEC+VReR2g2BPC\nnWQIA1ZsRKFgppVayT7o7gSTAbC+z2wpi9WrgeuvN38+csRcDuikbBYdmYimGql7iH4N4ZkZAMkS\nkDIBYTojI5JJ2bnnN3Ja6ZbOXjeTTEXLzq0Oa
YeRnvqXyQyMz/hPP+VysJpnGqZH6efNemSv088A\ntcEs1PB47VrUQw1hk0HYPkO4sybv+ACiVC0hmZkTbZdnhrBGiWgLQ7hojRHS4mNe7rvNRJERqUS6\nZbGpqMT7Gk4J4TCREXy5MhACMqJFQhiIDhvhVIDNgoxQxBAe6B4Q93wnhjCgpiBVrSbbNzdD+Mnj\nT4rHF6640PlFkNfebHEWhgo4qE1xJIQph1b0A1poZb9MidNzo5VUc4TtiVkv+tjHZDG/rVuBJbYS\nDjTJ/J73BHt/YTOEJ+ZNQ3iga6Bhws7+XXrhQ/vR2qG14t7gtlIuyHUsDWG9QhSAmVTl43OKjFCd\nEPZbVI6Oxy5ZdUnb+6WGcG1QHl8chrCFIdwbTUJ4/8x+vOxl8rmnnlK3H358ySQwVYrG7D4VpUev\ndEGepcIQNtKdYQgPdA2IpZV2XqGXwSfAE8IdZgj3Wg1hWoTBSwegISHcAcYK0JgQBoCbSTnHO+90\n/rtsFh2XsrPLMSE85d0QHhuDSAcDeiMjADlY4wNmumypbUNYs7QDfQ/ztSnf6Z1OSwhTY97PMefz\n6Ljr2NKZ7vWbEO6sFSw0PTWRmxDYiEOHZMEaN+mWEHZDRgASGQEA+/cH2w9FRuiAi+Diq7BO5k+i\nXC1bngsrIdzbC8xXyCSeYkOCDrzdTBVqCN97r+NLlIjf37q75fL6MJARdFtHZo84GqwqDOF8XuJT\nWhnCSZbEtmXbXLfFr72aUcN8uY1ZNI/q64NYohyVITxVmBKPLatHmoieA34MYUANNqIdZAR93S9/\nCfzFXwDf+U7j89u2AffcA3z3u8BNNwV7f6EbwvWEsB0XAUSXEE4n02KFw74p50EuHQ+2ixsQxqBm\nIQrAxFFxA/HAzAHRhumCjKDXJl2R4VWrB1eLgvHzydMvIXxg+oBlBcGTTzr8gU/xa9k8Nn2KjHea\nFgzhDpNKQzhZ1XvAzRgTA5bx+fG2eYWAQ8pOQ1PFLtohsReWa9UBqNVMRhw3VnpSPUgmks3/SBNR\nE5NXfL/2Wtkh+NnPnJNas7PoOCPJLmEIz65Gwvh/7J13eBzVvf7f2SLtSlp1q1nuvXdcMNhUY1qA\nBAKmhF5yCQQC5Jeb5AZCAim0JNTQMZAQ4JqOMc3GNu4Fd2y5yJZsyep1tf33x9lTZnZmd7bImtHV\n+zw8jLTFu9rZM+e85/1+vqRpUzwJ4ePHwRrKAcZOCAN8wcEMYTEhrBMZwQ1hY6UdZGkSR+IcYeXY\nZcTzWm4IJ5cQVuJ9zDBWi5NpKYt80IkmhJ22TMMkR6NJfM8iRzgYjL1gMwtDGEhtQlhERhjJEBar\nkcSSaUCeEE6lIZydLefG5qandhOvPLucbYLvqtulep+SEl7OvnFjastXRdGFqphW7Q5kBMA5wh2+\nDjZ/EjVyJDdId6n/WWIqlmnY5e/CzuM7AQBj+42NynAUjajuwEZIEn+NyRrCoRDf7NJrCDN0VAwl\naginmiOciCEMAEOHAn/8IzF/1XTmmcBll/FzL1F1JzLCmeln6W41E0lpCHdXQhjgZnNTVxNrvCkq\nmY0dVrBmsBAFFTUQu/xdrGqlO5ER8TSVE7+biWzk2a129rga92HkhIe/HkkInyCGcK4jl43zlS3d\nZwjT9xerOWSfosv4q48+ySQ3hJvivjB6/B7ASpIgtqDxDAalxAZruXkBNrHWawibMiGsgYwAYk8A\nWEIrbKz09II7HokmZrs3vGlh5aVmoRDw3nuRj+tVhnDIigwPmRDub9zPmrPEUm0tWEM5wPiGMP0e\n00l5pj0TaVZihOtNCLOxz2BpB9lC0NGcsCHc0QFDlderSTTUgmkpYAib7HtcnMlds6xiUldfrV6x\nrSoRoZBhM97nqyalIRxP5Y7RkBHiJmSrt5sTwuFzO9tA1Rvi+avERnRXU7nsbMiMjlRv4lkkC8b2\nI8DcA00H0
OlTjxPSlHAwCKxYkdKXwKTG2xWNBbFPRrKKxRF2Ojl3ds+exBolxjINdxzfgUCINKua\nUjol8g6CxGtHS1fqGsuJUs4zEpVY+ZByZESChnCqOcJqvGsjqTsTwsF0/rmpmUgnKiEMKDjCKinh\nYcMAm40cpwQZYYA5M9XgnMHsmHKE+/fnmwmGSQjH2VCOilav1HbUYuxE8oWtqgKamqI9Knn1VEIY\n4Cb/kZYjGDM2wM7d7koIU/RLriMXdqs9yqP6pFSfIWwyJZsQbnHzGZ0taMCrvkLUHA2GgmjqasSw\nYeT3lZXaTcZEmZEhLA7QDe4GjBrFb4s1AeDNxcjn3NML7ngkSwgLPMeFC/l91C4ivQoZASCtnWAj\n3H637iYjSmSEkZvKAXwx6fGQRLskSWynOv6EsLHSDrKFoDPZhLCx+bIyQ9iWgoSwyRjCJVncNcvp\nT2KUNTX6mxiJJrhZmmHKkBEddXFV7pgJGSEmhFNiCIcT9NkO44zNoiGsbCxXJPTESjYhHAopEsJd\n3TtmU0M4hBC+r/9e9T5nnsmPP/885S8BgLohTBPC/TL6sU3QVEhmCMfgCLe3J5a2i2UIbznGJ2hT\nSqIbwt2dEAZ4Y7lkm8pRXAQQIyHsTjIh3K7fEAZ4StjjATZtin7fWEo0IXyilOqEsPh+PTahzFwF\nGaGcTw/J6/6EMKDeWM5uB0aMIMd79wKBgP7nVkVGGKCqjkqJGADI941uTvYkQ5gawk6bM2ETXcQZ\nDZ5YxY537Ejo6XQrWkJY7ziVqCgGxBf0odF3jDU33b07Nd9jv5+b/DL0Sx8uIm71GcImU7KGcFMH\nvwraQwa86iukTLDQxaffT3bWYkmZEDaDyaBMYGVn88YNsZARbOKaZu6EsFjyOH48SQoD6oawMiHc\n0yZDInI4yH8AYGmKv7EcQUaYhyEsLjjoxJye9/Wd9bqazBgVGSEzOFKIjDDiZpb49/anxBA218aO\nmPBzFnJDrUIn7UU0wc0ybmkhIwC9hrBxPuNoTeWKiviYnHRTuQ4fYHdH/Js9LWU1kqj0dN7sNNmE\ncGcnSeICkciI7kiojes3jh3vrNupep958zjX98035QZRKhQI8IUqvd6FQiG2yZuqhnJUorGolhAG\ngIkT+bFWk95oimkI1+g3hGUJYU/3JoTb2xNLRFPpNoTDyAinzRkVlyEq0YQwAMyZw4/XrInroREy\nuiHcnQlhtxSdO2q1WNn5mmZNSyn7WynRbNZioFOOsMcTH3LA8MgIsQlZ8yF2TLERNTWx+xTEUrIJ\n4TJXGaQE+Sfi+yscfuIay2klhPMcebBZbN36bw/OHcyOK5srMXkyOQ4EUmOEy8atHC/bXOzDRcSv\nPkPYhGITaEdT3BfGRsEQToMBr/oKiQuW2o5a2eJz//7Yj1c2ZjKiqaKUcsEN8GRHQwOiGkweDwAp\naDqTAZCbA+Li3OHg73/XLvnkHOgdyAiAp4QD9fEbwhFN5UyCjAAiOcKegEezzFeUEhnhtDlTmrhK\nVKlCRohjlx1OQ/JlxUW935qCpnImYwiLCWEpm7tm+/bpe7y7K8CMQle6OcYtMUEVLzLC44GhkBHp\n1nTYLcQVVKYUJYljIw4dSs5QavUYs3pD3NCo7Yh0fSlHONmEsHJB2p3ICIAnhAEwpq1S2dnAFVeQ\n46Ym4KWXUvsaRJY4vd41uBvgCxJkWyr5wYC8jFkrIXzZZfz4lVfi/zdimYabj21mx5NLJkd9LvFz\n766EMP27h0Lxsd2V0msIU2REPKk7V7qLzVnjNYRnz+bHqTSEjYiM6E6GcHsodiOqicVkN2Vm/5nd\nOheTJYQ1Gsuddho/fv55/c9tdGSELCHcwg1T2kcmFIoPyaWmRBjCbp+bbfYksxkgJoQzS3nced26\nhJ9SlyISwuEqTFlT5G6SaIIrOcJbtyb//OJ7S8vlKAy1pH+fost4K8w+xRT
rXptAQri5UzCETZAQ\nViZY4m0sp0RGOO1xUOR7SNEMYSA6NsLjAWDjJ4WZzFG1pnJUdFfR7wd2KtZ3xBA2dvMtPaKGsOdo\ngoaw2FTOhAlhsbmBHo6wMiFshHQwkGpkBDmv0yVjmqOiueWzpIIhbByzUI/ynfmwSqR8wZfGXbO9\ne/U9Xtz4yEo3/vsF5Nenus46DBwIWMIzye/Vq/OZjIaMkCSJncNqphTFRnR2AvX1ETfrVruPP7eR\nDGHZhnt7pCFMS3U7OpJL0CoXpN2NjBhXxBPCu+q1O6jdcw8/fvxxfRgyvVIzT0UEVKpThiIyQstY\nnDwZmDSJHK9bFz+DNJohHAgGsK12GwBgWN6wmNfjE8kQBpLjCIuGcFqUPWeKjGBrNJ2i50K8hvCY\nMfw9rlmT3KaV+Pf5v5QQdjqBxq7oyAgAeP3i1/HY2Y9h8cWLk//Ho0hkCB9oVl/kXn01kBm+dL72\nmn4kiuGRETnRDWEg+cZyiRjCYiPQVBnCtoLD7Hv20UepvfYoJTNNnV5mbovX/+6SEgOS6sZy4rhl\ny4m9sdMnbfUZwiYUN4Sb0OmObwYgGsLpkgGv+golawiLyAi75DBkyk4p0RijhjAtEQKAb77Rfqyy\nJNcMpgqVDBmhKN+NtqvYGxjCADeEO6u4IfzZhtiGsNcb7pJuoqZy4oKDXtBlhrAOjrCSIWyUpENK\nkRHhNKXDasxzWlzUe6X/e8gIi2RhKcsOxJ8Q7vTz92uWhLBywzItjXeY3749egWL0ZrKAXzzTM0Q\nFhvLJYqN8HrlmyXZacYxhGVIrs7jkbcLPc+SwUZEGMLdjIwYmDOQnVtaCWGAnLfnnEOODx0CXngB\neOQR4Kqrki/jVTWEBWOhJxLCAHDddfz45ZfJ/48e1WeYRjOEv2/4Hm4/2QmM1VAOiM7vTpVEQzgZ\njrCehHCXv4u9f70N5ajoudDqaWUNlfXIagVmziTHx44lZ5YZHRkhmnepNIRdLn1NtgblDsJds++S\nGVzdoaLMIlbJqpUQzs0lpjBAPrfXXtP33GoJYSMhI0pdpaxihzKEAY6MAJLnCNNzJz2dowhjSdyo\nSZUhfLTjMOuP09gYfV2frOjn7nIB9W5+nT8RhrCIjNjftJ+Fu4DUGMLiuG7J6jOEk5Hx3bE+RYiV\nI1kCcAfi2/Zu7hRK8S3GNo2AyKYnCSWEw4kkh6XnF596ZLfa2SKJJiXPOYd3Wn3qKW3DxeOB6UwV\nKjHVqpwUR9tV7G3ICLQMAgKE67S9OrYhfJxe39PMwxCOhowA4kgISwFmhBtlYisrF3U28c8nTolj\nl9NqzLFLXNR7kAJDOGwWWmA1BP5Dj+g1qsFTS3A90J8QdgfMN24pE8IAsGAB+TkUit6gS9ywtElp\nhugCTc9hZVUKkJrGcu3tkFVvGCkhLCJP1latRSAo71BUwm9OChtxopERFsmCMf1IWdWBpgNw+7RL\nFu69lx/fdhv5+Y03gNtvT+41iAYrvd6JxkKqGcIlWSWQQCaJWgxhALjySs5OXryYvM/+/YGhQ2On\n4KOZhmJDuaklU2O+XvF6red6n4iyha9aqhLCWoZwIg3lqESTSW8jYapUYSOMbgjb7YAtjDtNJTLC\n5SLNUal62kiSJImlhA82H0QwFFS933/9Fz9+6il96XA1hrBRghQAGbepaVrZUsl6iYgJ4WQNYXru\nJMIPBlJnCFe2VOKSS/htS5Yk/LQxRa+/OTnyXgGiv9JdGpE/gh3va9yHnBwwH2fbtviaIqpJNq5n\nxk7690lbfYawCSWWI3WGGuN6bGuXkBC2GPCqr5AyITxoEDdG400IO23G5wdTiQ22ADKA/vCH5Laa\nGuD119UfRwxhc+ITtJrKAbzMEYhMCPcWQ5g28EHQBrSSLXF/ViV8vuiPY8mtdGNyKtWUioSw2w2Z\n0WKU0jdZOigJhnBbR1AYu4xpCIuLia6
QHBnxn//E17hINAsdlqyEG3ecaNGEcCAUQNkwcj3WzRAO\nGistq0dp1jQ2vtDr09ln89s/+0z7seL1ySgbtPS9dPm74A3IO9aIhnCiCeG2NhjWEM5z5uGUgacA\nAPY27MXr2+QTC3EDPhmzKVpCuLs28mhjuRBC2FOv3Y33tNOAqSr+5bffJmcixkJGpDohbLPY2FgU\nLSFcWAicfz45rqkhZhJAzOCPP47+b0QzDUV+sJ6EsFgerheNFa+6AxmhaQh3cUM4UWQEED82YtYs\nfpzMd1RtA8NooiZeKhLC9Fx2ufjGJmAMI4lyhL0Br+YGwfjxpDEmQJqNf/VV7Oel47DkNCYyAuCI\ngVZPK9s4TKUhTM+dnjCEcx25bJ17uOUwzj2XI2jee4+Y+q++Cvz0p8BbbwkGfpKin3t2thwNdSIM\n4RxxmUAMAAAgAElEQVRHDvNx9jWQiTENeHV26g9PaEmcWwTSjbOxY0b1GcImVL6DTzY8lsa4uFGi\nIew0aBmyKJkh3HkcaWm8fCRehrDDaj5DuMndBH+QwIXuu4/f/sgjvGu3KFMjI6IwhPPzefnu1q3y\n965ERpjpPYvKFdfFHeHz3tmEozXR4VLMEO4lTeWAOBLCBkw65DhyWFIrGWREa6cbkMjgbtRzOjMt\nk71Xd4h/Fm+8Afz4x8B55wEbNuh7LoKMCDOTDWIW6pGYsiwfTb6MjY2kAWgseYLm3Miik21qCM+d\nyxdYy5Zpp5VEZITDIKl30aBVoopEZESiCWEjG8IA8IfT/8CO719xv8wUpzgFAPjww8T/jWgM4e4a\nt2WN5eq0sRGSROZTDgc5h0eOJL/3+4FVqxL/92MhI1LNEAY4R7imvYbNG9UkYiNErV4d/fmjNR7b\nUsMTwlNKYhvCw/KHMf769w0x4OMJKlWGsFfYJ9IyhGlDOSB+ZESqDOG1a+N6qEz0s7XZonOSe1IU\nG5FsQtjr5Z9phCFsACNJxhFu0l7oKlPCsUTHYUsGMVqN0ohZlBpHWNyYTBblQw1hvfxgIHWGsCRJ\nLCV8uOUwslxBnHEGua2qCrjxRuDaa4FnngEuv5xs3iVbqRII8O92dra8eazYVLY7RVPCx9qPoc3T\nllKOsDiu+9Nio1/6pK0+Q9iEknexb5LtXseSzBA2QUJY3K2lO1v04tDYGGanRpFYhky5TGYQHcxC\nCLFStBkzgPnzye3ff6++QGtogIzRaCaTQXytyoU5wBvLtbcD+/fz34sJYQmSKRoHqklmCHfyi9m+\nquhVAAxJ0IuayolMNy11dUHWHMMoyAiLZOGmTxJN5Vrdxmm+pSWLZGHnmjvAz7+qKn6f5cv1PZfI\nEM4wwWYllZiyKBrK6+r1pIQ9IfMlhAF+fWp0N8If9CM9nV+bamq0F23ihmWGzRifcTSWaW9HRgDA\nqYNOxYJhhPlxqPkQXtj8Artt5EhukK5eTeZciUgLGeGwOZBu03DYkhRNCAPArjrtxnIASQkfP07m\nTw89xH//5ZeJ//tqTbq6ExkBcI5wMBSUlQYrdc45vFHxtGm8FD+WAR4tIby7nnSoK84s1mU0pFnT\nMCSPGF97G/ZqlsYnoxOaEBaRESfQEM7PB0aNIsebNxPD6+uv469ooJ9tVhavwjSaUpUQVn43KTIi\n3ZpuiDUTTQgDBBuhpYsuAkrDw8gHHxCOdDRxhnC474bB0sGAnDlLOcL5+cDwcGuVTZvkGzTxqicT\nwgAwooCYo56ABy9veRkXX8xve+kl+X39fmL06604U5M4ZvdEQhgARhaMZMcVjRUyjvC6dck9tzi3\n8NiMlfQ3m/oMYRNKVo7kbIxrt1Rks2bae/7CF0s2i40ZRXRna9gwfvtB7WslAKC90wdYSVLCqKaK\nmpSNe6hE3t1f/hL5uOXLIU/Lmug9ywxhFZ6jVmM50RDOTMs0ReNANckTwvxiVnE0ujmqhowwekJY\nFRm
RER8ygiSEBRalQRLCgLBp52hGSwuZwIomqR61dYkbO8b9HlODqyPQqnr7jh36nkdMjxoVkaEm\nMSGcUxZfYzkvzJkQFq9PNBlHOcKANjbC3RViKfAMg3zG4lipNIRLSnhaLlXICCNu1v3x9D+y4we/\neVDG3L3gAvL/QAD49NPEnl8LGdGdY7behDCVy0VSwnRjA9BXhq0ltTStmBAWx41UiSaEgegcYbud\nIDFWryapUjq32r07uumvZQj7g35mNIiczFgaVUCczE5fZ9TXm6hOZFO5nkJGADwl7PMRvNrppwMT\nJ8a3gSMydY2qVCWElXgMusYqzCg0BKqKbpQA0RPCdjtw/fXkOBDgTSLVFApxBEEozViNmEWpJYQB\nfo57PIQ9m4iCQd7fIlFDOFnUzx0n3cGO7/viPsw9uz5iA+a224Azz+Q/79EmHsWU8trbkwlhgGz+\nzZ7NOfb//jdiYhGjSfwuuyVjJf3NJnM6J//HlSpD2CgJnVgqzy4HAFS1ViEQDMjKR8SkqJraPXwr\nOTPNPAlhrbTkwoWEHQXwCb2oL76AaRnCFsnCUnJqCWGtMpPDh8GMJDO9X6XEju4Z4IbLwdroEVNV\nZIQBTQdRasgI0WTSg4xwuyFHRhgo7cASQo4mACFMmUJQN/fco/852jz8e5ztMO55zQxhn/qKW2+J\nX7vbyzbvMkyUlhVTFukFPCEci40WCgE+GD8FriYxfUGvT3oM4c4uL2AhXUSMkoiWISMUG5EWC+cX\nHjqkr3GPUkZHRgDAtLJpuHg0iSrVtNdg1WEeFaWGMJA4NkIrIdydY/ag3EGsKmzLsS34+7q/44Hl\nD8ga2qmpoIBXI23dqg/9oqZoDOF8Zz4cNkdiTxxForEYjSMMkA3oOXNIOvjkk/nvv/1W+zFahnBt\ney1CCEW8hliihjDQPdiIE9lUToaMSKKp3NH2+A1hsbFcRRjH3NoaH1NYTAgbVWJCOJGxmEqWEHaF\n2DXMKKnCYXk89RTre3HjjTzR/fzz6ihBgJzDPh8AKYCgnQzIRqmqE0UZwgDw4d4P2ebkzJn8Pomi\nUcRmx4kYwq40V9Jrq9OGnIZFExYBIGPGY9/9CnPn8ttvuIGkgq+5hv8umYRwNENYRHJ2p8SE8L7G\nfcjPB37wA/Lz8eOx2fXRJL6/jlBfQjgZ9RnCJpSsHMnZFKchzK+EWXZjm0ZU9ALhD/pxrP2YzBCO\nxRFuEwzhrHTzGMJaCWFJ0k4JV1aGLxwm5unSi624cUEllpnQhPDx42FDOPyezWwI/+AHwBlnkAZN\nC+fzi9mRxugJYY6MIN9tq2SF02ZsbEbMpnJ6GcIGREYAwmux+oG0DuwKVywvXqz/OTq8giHsNO73\nmKZMOvztgBTZMnjXLn2dhEWcUZYJqleoxKSfxaU/Iez1wrTNMAud/PpEy21HjuTM3ZUrgY6OyMe1\n+4xXvRINGQFwbERbG9DUFHFzTBkdGUF13ojz2DEt/weIWUgbni5dmliaR1y0ZbmCbMO3O8dsi2Rh\nKeEjrUdw59I7cf+K+3H9+9fHfOzpp5P/h0LAihWJ/ftK8zQUCrGEcKobylHpTQgrJRrC0TjCWoZw\noiXVowoFQ7g+9YawWZAR4vmQSEJYNIRF6d2MDYXMYQjThHAwmLpUYVp2C3xB8mRGSRWOLBjJ2L5b\na7ZGve/gwXwz9tAh4PPP1e/HxmChktBIIQqqicUT2VzoiwNf4JSXT0F1a3VKWNkiaiQRhnCquO+P\nnPUImwe8sOUFXPWrbzFhAmFCP/ssWedTRAaQYkO4B5ARFJMBkIQwQIxvKiUqIx6J3+XWAJmLZtgz\nTIUINYr6DGETKpmEcIfPfAtQWQlJc2VchnCHT0zZGWMBqkdahjBAYPP9w/P+Dz4gZX6AwLszKUMY\n4OW7asiIAQMISwogHKlQCNi4MXxjLzCEc3NJwvuzz4BR5fzzr2mNL
yHsSncZouwtmtQSwrmOXNag\nrNcgI4BwSpiIcir1qF0whHMMbAjLDK70yO+tx8NTS9HU7jFnWlYsu+uy1cASnlXFSggrG4CaaexS\nuz5JEl+Yer3EFFZK3ORwpRvj/cYyhMXGcolgI8yQEAaAMf3GsOPdddwQttlIZRJASo7VPtdYEhel\nVmcbS5N295g9sWhixO+W7FmCPfXRa3Bpox8gcWyEsiy9uasZXX4SUesOfjDAGcKAekLYG/DiuY3P\n4auD8jclGsLROMIpN4S7OSF8Qg3hJJARmWmZ7LuQiCE8YQLhyWZmAosW8d/rxTW53Txxa2RDWEx1\nJsMRFs8FS5bxUoV2qx0TiiYAIBsl4nVTTTffzI//+U/1+3B+MJ8zGylEQZXvzMcbl7zBwkybjm3C\ngtcXYOx4P/vuJcqdFc8ZvQnhNk8bW4+myhAudZXiD6fxZq5PVd6GzVv9ePJJznMfwT3UlBrClC2f\nbk0/YXOR4fnc3d7XSN7MWWcB5aT4G598QuZWjzxCGp7Gg9cT31+LL5z0N8jGjtnUZwibUEpDOJ4L\nY6efz+iMsiCLJSVTKB5DuL5ZSAg7zLNjFM0QTksD7rqL//zoo+T/bGfYpAxhgJsibZ42hBQ1YZLE\ny4aOHydcpY0bAVi9gM0re7zZNbhITODpZAg7yJXR6PxgQL2pnNViZUaqnoRwZyeMj4wAAEczHEJ1\n8O7dkfdXU6dfMIQzjPs9lhvCfHZWKDT51bMwFdOjRkZkKCUmhOvctSxRum9f9LJWkZkMmKuaQw0Z\nAQCnnMLvo5ZOE8/prHRjvN9oDGEg+cZyZjGERxeOZsd7GuSGabLYCHHRFkwXNvG6ecy+e/bdmDNg\nDs4edjYuGXMJ+/2j3z4a9XGnnAJYreQ4UUNYaZ6K/OBUGQtKiQnhzcc2R9z+t7V/w60f34oFry9A\nVStfdZeW8mbNGzZAs1E1u1Zb5cZoShLCJjeEk0FGAPzvdrTtaMTcN5YkCViyhGzYvPwyN5X0GsLK\nzQujKlWGsPjdRIYxuaNTSggjL4QQvqv9Lup9zz+f8O4B4P33hapBQZQfLFbVGSlEIerCURdizQ1r\nGI98Z91OVLTswrRp5PaKCqA+dt/pCCViCHfXuH3bjNvYZ7ytdhueXP+k7PaCAt5XpjuQEcVZxScs\nOJRhz2Doz30N5M1YrcC115LbAwHCPr/3XuCVV4AHHtD/3GzskoJo9pJ1o+if9Em/+gxhE0qZPosn\nIdwZEBJJJjGElV1HCwr4pCXaQOn1ApVHBYawiUoIohnCAHDTTUBO+Fq+eDHBJtCEsD3LnKkzgCMj\nAqEAS9OIoqWcAFmobdgA05oq0TS8lE9Mmzz6kBFSOk8IG11qyAiAYyP0JIQbGmBYZIRoCN/3uyb8\n6U/8Nr2GsDsgbt4Z97zWMoR/+Uv+az2lq/L0qHHfr1J5jjzYLaRDRm1HLUaGcWnt7UBNjfbjektC\nuK6TL6hHc08R36v4O50GrNiRMYRV2PWiIZxIQtgsyIh8Zz5jCooJYQA45xxuMr3/fvz8TpkhbBfG\n7PTuHbPHFY3D6utX47OrPsPLP3iZ/e1f2/Yaatq1v5wuF3DSSeR4927gaPyhzQhDOJWNibQ0smAk\nKwP+bP9n+HSfvAvgJxWfACD4te9q5CYTTQl7PMDmSC8ZAL9WZ2VB1ggpUUO4OLOYfSbdjYxIpqmc\naCQ5NNDPYkI4XmQEwP9unb5O1Y0pPbJaSWCEXoN279aHVtBKfhtNYpl/Mo3lxDln0MGvX0YykqaU\n8qYpW45tiXJP0qDrqqvIcSCgXsXBE8LGN4QBYELxBFw76Vr2c1VrlYwjvH59/M8pnjN6DeFEx7ZY\nsllseOa8Z1hV5G+//q0M8yNJPCV85IicfxyP5LimAPMTThQugopyhBvcDWzz7Lrr+O0t/LTE8uX6\nn5e9P2cjgiEC0DZK0t9s6jOET
[base64-encoded binary data omitted (embedded image payload from the notebook JSON)]
O3IdwqSg2\nNMb0r6LY/+Gcyl9Ya9+p5jhTgS+D4+yxNpUWQgghhBBCCCGEEEKIhuC7Q3h08PdLa+3PNZR7NbI+\nKrrDGLM+buK4zHI1HaeHMWa9OtdSCCGEEEIIIYQQQgghsoC3DmFjTCugF0Fkby3Fo/sHZ+zbsJpy\n9T2OEEIIIYQQQgghhBBC5BRvHcJAz8j6nFrKzo6s98rRcYQQQgghhBBCCCGEECKn+OwQbhNZX1JL\n2aWR9dY5Oo4QQgghhBBCCCGEEELkFJ8dws0j66tqKbsyst4iR8cRQgghhBBCCCGEEEKInJJTh7Ax\npiILyyE5qt6KyHrTWso2i6wvz9FxhBBCCCGEEEIIIYQQIqcYa23uDm5MeRYO8ztr7d11/H/fAusB\nM621fWspOwj4HDep3M3W2hNqKNsJmBuUfcBae1Bk3yTglmDfPtbaR2s4zkTg4aDsH6y1t9dFV/Dd\nVUATYwydO3eutXxxcTHFxcV1PXyNrFq1irlz59KlSxeaNq3N5518fNML/mn2TS/4p9k3veCfZt/0\ngn+afdML/mn2TS/4p9k3veCfZt/0gn+afdML/mkuFL3l5eWUl9fuhpw7d264WmatbZLTSuUxuXYI\nD8zCYUqttYvr+P/q4xBuBSzGOWf/Y63dq4aymwLvB2WvttaeFdm3K/BEsO9ka+0NNRznJODaoOyu\n1tpn66Ir+G4ZkB0PrxBCCCGEEEIIIYQQ/lJurS2JuxJxkVPh1tqvcnn8tcFau9QYMxvoBWxQS/Ho\n/s8z9k2rplx9j1MbK3EpJywwrw7ly4GKev4PIYQQQgghhBBCCCGSRhF1C6TsCBgqz/PlHd56wgPe\nAA4ABhljulprf66m3LaR9SnRHdbab40xPwDdMspVxTbB3++ttbPqU1Frbav6lBdCCCGEEEIIIYQQ\nQohMcjqpXAJ4LLJ+WFUFjDEtgH1xkbnTrLXfVFHsP7i3CxsYY7ao5jhb4iKEbcb/FUIIIYQQQggh\nhBBCiEbBd4fwv4HpOGfu2caY9asocw3QIVi/qprjXAeUBes3GmOaR3cGn8PcwmXA9WtTaSGEEEII\nIYQQQgghhGgIiU0ZYYzpB4zO2NwaF4Hb2hhzaMa+ZzJTQlhry4wxJ+AmhWsHvGmMuRT4H84JfBSw\nV3DM14F7q6qLtfZrY8w1wFnACGCKMeZPOGdzP+BMYLPgOFdZa6c3TLUQQgghhBBCCCGEEEI0HGOt\njbsODSJw+P69Hl8Za619rZpjHQHcBDTFRQtHscBUYDdrbbWTuRljDDAZODzclHEMgL9ZayfVo85C\nCCGEEEIIIYQQQgiRNZKeMsLWcamo8SDW3gEMB27HRfUuB37BRQX/ARhdkzM4OIa11h4J7IrLKfw9\nbsbC74PPE+QMFkIIIYQQQgghhBBCxEliI4SFEEIIIYQQQgghhBBC1I+kRwgLIYQQQgghhBBCCCGE\nqCNyCAshhBBCCCGEEEIIIYQnyCEshBBCCCGEEEIIIYQQniCHsBBCCCGEEEIIIYQQQniCHMJCCCHW\nGmOMibsOjY1vmo0xHeKugxBCCOE7vtkf4J9m2VxCiMZADmEhRE4xxug+4wHWWutLWxtj2htjTKC5\nOO76NAbGmKuAc4wxbeOuS2MQbVdfzmshRPLR/coPZHMVNrK5RKHh2wudJKELTohGoLqboA83R2tt\nRfRzIT/ojTEHGmM2jbsejYkx5g5jzAWwZlsXIsaYPYErgcnGmBJrbXncdco1xpi/AqcBewIdgm2F\nfu9qGq4U+nnt8/MpSiE/m3zD53NaNldhI5tLNleBIpurwNvYWmujnwv52ZQ0TEbbCNFoGGOKCv2m\nH2KMKQH6Ay2BNsCnwLzwDX8h/g7GmIHAxsAoYDXwM3CLtXZZrBXLEcaY24AjgcnAjdbaz2KuUs4x\nxlwHnBB83LjQNRtjzsMZ6W2ACuAwa+298dYqtxh
jbgKOAcqBYuAfwBGFeM8CMMbsDGwB7AcsAOYB\n1wIfW2vnxVm3XGGMaQesh3s+tQA+A+ZmGu+FgjGmLzAI6I67jmcAb1lrV8VasRxTqLZGVcjmks1V\niMjmks1VaMjm8sLm2gjYAGd3rQS+B54CVhSq3ZU0O6Mk7goI/zDGXAI8Y619MxwCFHedcokx5mBg\nR2CfYFNz4F3gHWPMSdba1caY4kJ6622MORk4ANg8Y9dvjTHHWWunFFLbG2NuwXVMwOnGGHOTtfbT\n+GqVW4wxNwDHAWXA8cAX8dYotxhjrgVOAixwN/CUtfbheGuVW4I2Pib4GHZONsYZdtOSZvDUhjHm\nQuAoYN2MXUOBvxljJltrf270iuUQY8xxwK64Z1QZ0Az4BJgZ/B6zCqlTZow5C9gXiEYVLgW+MMZM\nBl6x1n5dYM8n2VyyuWRzJRzZXLK5ZHMlHw9trnOA3+KcwVG+AP5jjHnIWvtBoTyfjDE3A/+21r6Q\nqOvVWqtFS6MtwC24t7zTgc2DbSbueuVQ7xW4h3p5oHtZ8DdcXgKaFNLvAPw5om8V8BawOFgqgI+B\nzoWiGfdi7ZFA29Lg76LgXN8o7vrlSPMNkTb+PVAcd51yrPesiN7TgZ6RfUVx168R2njvQHf4+ay4\n65cDvX+J6PsBeBGYDcwPts0E9i6kNgeuzngefR38XRH8/Qa4HheJFnt9s6D3mojWZbio0eh9+2fg\ndWDbuOuaRc2yuWRzyeZK+CKbSzZX3PXLgV7ZXH7ZXL8CU4N79oJg2+KgzXeLu65Z0ntzoKsMGBNs\nS8S5G3sFtPizAJdGLpSC76DghryEN8KHcMOejgAuCm6Gq4J99xeKcYfL8xVqvhYYH2zfGrgpMNor\ngL/GXdcs674q0PVk8MCrABYWYgfFw47JVoGRVh5cw80i+6rUnhQDoI5tfBQuSmUj4L1g27fAJnHX\nM4t6L4zoPRMYGmwfEJzjpcG+KXHXNYuaL4tovhQYB7QHdgo6LT9G7mNTQ+M2qUvQrqHes3HD6psD\nY3Ed71mR/SuBveKucxY0y+aSzSWbK+GLbC7ZXLK5kr94aHOdEtF7KrBZsH0TYC9chHDUOX4s0Dzu\neq+F3ouo7NxfTYKcwrFXQIsfS3Dx/xA84BdEbgAF2UEJjJhQ49FAp4z9u+HefoZvQXdM+m8Q3PCj\nN/b2Gfs3w+UNqgAeiru+WdJcFPz9v0jnZCzwPgXYQaGGjkltD7ykntuBcV4OTAEGRraXRLUBfXDD\n3pom4eHfwDa+k3Q01oF1afd8X3BD934OdJ0FtMnY3w44DxdpOJMCiLQL7nZ7T0kAACAASURBVFcL\nI/fqFhn7OwIHke6gVAT37h3jrnsD9fbDpQwoD9qyRRVleuIilBZFNB8ad93XQrNsrsr7ZXPlQZ2z\noFk2V8ZvUcN3E3luI5tLNpdsrsTaXMG12R14LbiOL83UG5TrDPwd97IjFQmf+RxLwgLsHrEvfo60\ndWKcwrFXQEvhL0AvXP6nctzwgDNJv9EPh0gUTAcF2B74MtB2AdAqsi9q0Bwe+Q3Oi7vea6l5F+C7\niOY2kX1R4+aFoMzLBMM2C2EJDNOFwIdAF2C7YL3KDkoSz3Pg1sj5egDQNLIvel63wkXf7QLsD4xM\nYlsHRk0J8Hyg+epqyl2EmxwhnMTnE1yk1si4NTRA8/VU0TEh3QnvFbm3fUFgqCd5AS4O9HwFbBht\n/8j66KDMPGCduOucBc1XBHpeBXpEthdF1tvijPOlwbldgRvKOTbu+jdA7w6khyduk9nGkfO8I27I\n3+zIdXBQ5vmQ7wuyuWRzpddlc9lknufI5pLNJZtLNlcyba4tAw2rgF0y2zhynrfB2ScfR66D06L3\nunxfcC+l/oYbibUy0PMg6RfxiXAKx14BLYW/AIdFLvSbg23dgHci2wsmagW4JLgBTAU2rWJ/eEPs\nFPkNHg+MocR
pB7oC9+KGSdwPDKhKMzAY1xFdjZtNdkNgPPA7XLL5DnFraaD+YtxMsR8FbblVsH0s\n8AHVd1CaA23jrn8dNR4buVY/APoH24uIdDxwE508RuVhyvOBt3HDozrFUf+10N0UF1lYAZwYbgv+\n9ohoDfNVroj8XUKQ/ywJC3BdxHipclgqzmC9J9KuR4bnQdz1b4DeIlwkyueBnsdqKDsMWI4bpjwC\nmACcGFzjfYMyeX/vDjS3J5237oZaym9OOp9fmKvza2CLuLXUU/ehpCND16uqvUh3wNvihuFHo1Z2\nj1tDPfUehmyu6H7ZXLK5QDZX3i/I5sosI5tLNlcSba7dgrr/Agyrqr1I21wtcaN6wlEeFcDvE9TG\n+0fqfWuwrRPwLxLkFI69AloKewkMsCeCC+K/RIYCAL2B/1FAHRRgSMRAObcO5Z8m/ea3Ta7rlyPN\n65CevGTXjH3RN74nRwy5r0jn86vARWT9nSDHUBIX0vkLfxt8bkrVHZRBuA7NgbhIh+Fx170O2gbg\nIjKWBe32BNAno8ztkfYMh7hFDffpwB+BbnHrqaf253EdrVMytj8Z0XkHLgLg+kBn9Hc4OCift/c0\nYD1cJNkqYBI15CgExkS0/TuyPW/11aI9fAZ9BXTN1IKbAfqU4DyeSXqoYxi98imwc1J+g+C+NC2o\n/x3htowyUf2v4jooYYTpcpwTqmdj1TkLmo8i3cGqtmNFuoPSBpfTb06knbeKW0cdtcrmqrm8bC7Z\nXLK58nhBNldmWdlcsrmSZnPtRfpFzbgayoUva1sAx5Ge6LcC2CluHXXQ2QS4MajvK+H5HOzrBjxM\nQpzCsVdAS+EvgWFzKy5CIexwhcMFMjsoiR7KiBsm8RruTVeP6nREfodwBs4vgY5x138tdI8Brox8\nrhR5g5sxN2zjabhcjdfiDLxweO7CwEBKVO63SFseH+h4OPKQK2HNDsrfcJMYhcPAriQy/C/floi+\n9YP2WhUsTxF0NIC7SDtc/h38FnvgJvR5lLRB9x0u72HruHXVQ/8DQd3fAboE28LhXwuAERnlOwC3\nUXmSqvFx66iDzi2Cc7Wmjkl4Xt9CutN5aNx1b6BeA7QG3gh0fA8cSSSCDNcx2Z50RMs8XCfmfVx+\n1mgntFqjN1+WQHPbQEMF8Gpk3xoGKi4iazbuubwF6WF9cwiiZknAcxoXKRkOr7+1pvsPlSOF7wju\ndauD+1qPxqhvFvTK5qq+XWVzyeaSzZXHC7K5omVkc8nmSqLN1ZW0E/wZakh1EjnHWwLnk55Q8ANg\nUNxa6qC1Z1DvfVkz5UtinMKxV0BLYS+Ri6I9ayZRL9QOyv8BlxPJY1dD2XBY2GzcG+Mk6i3K+Fyc\n8Tk6nOIfwDYEM4nicp9tSvoN/3zgL7i3hYn6LXATF80H3sjYXoLLbxcOh1kG/Bqs/wD0irvudW1j\n1uygPA78KdCyGJejsXvGd7vjhjCHxtzXwJBgX962cUTzMcFDfBbwG5zB+gquIxYaaCXB33BoY5vg\nd/kp0Pw+0DtuTdXorHcbkB6CX46LUirJ57asRctupCMMP8JNZrJ+cF86lvRMyNNwuWh74KIwewXn\nf2jozSDPZwEnbXhfQXp48eWR/U2Cv+Gzefvgun4z+LwFaWP9dfLYqZKhuyvpDub7BGkFqGXGetyw\nv9A2mUPQASXPDPlq6i6bq+qysrlkc8nmysMF2Vw1fUc2l2yuJNlcbYDnSEeC71HT+Rr5ndoBjwTf\n+wX4Q/T3ybclUu82BM/YyL5EOYVjr4AWvxeq76BUO5QRl5Mn7x6EVI7OaFFV3av4znGB3p9JgJHa\ngN+kNemOx18J3vaHv03kZtqXdATXLBISiZWhpTtpAzzMaRc9J3YhPQtpOOwtnOk8Lx92GRqr6qCs\nCB5uy3AdkOhkNtFJT1rjckSFubEejVtPPXT3w0UpVOAidPrhnAmlBPlIq/md2
pEe4jg3NAAKZcFF\nJVUE50EihtNXo6MDblKm5RE9FbjJIcJOyzQiE5tEnludcB2UMBLtuGB73j2fMjTvHjmnS4ELqigz\nhPSM19cE21rhhuiuCJ5Zm8etpQ5aw2fMb4L7VAXwRGZbVvG9sI03jNzXn4lbTxZ+D9lcsrlANpds\nrjxdkM1V3e8im0s2V97bXBE9oyNt/DLBiyuqcYJS2YEajmKZGreOtfwN6uQUznhulRDDpKBFCBEj\n1tpyY0yxtfY7YB/cZALgDKAHjTGbW2utMaYYwBjTGpdcfqQxpkk8ta6aoJ4mWF8ebquqbFgO98Yb\nwELV16MxprcxpkeWq9soWGuX4HIJXQScb62dG9lnw7a11s7Aze6+HPcWeAxU+p3ymkDKD7icjeBm\nHU1hjGmBm8E+ej4YYIIxZlNrbXnj1LThWGsrjDFF1tpvccMTnwt2FeOGVj8e1WGtLYusL8ENWf05\n2NTHGNO8cWrecAK904FLccbqBNzwU4OLRJmb+Z3I77QQOANntHbCGYSJOaerwxgT3qcewRm4JcAJ\nxpi28dWq4Vhr5+OGY/4e50j5Eddmt+M60/OBo6y1P4XPoeC5VWSt/RUX0TIPFyGwR7C/yvt+vmCt\nfQL4c/BxHeACY8zjxphjjTFHGGMuxRnwXYO/fwy+txR3j2sKdAY2avTK15NIW7yDM8jLgF2NMX8P\n9peH7ZrxvfLgXP8Wl7+zDFjfGNO5cWqeG2RzyeaSzSWbK1+RzbUmsrlkc5FnNldt11SwfyouTdEq\nYFvc8ya8XquyuSqCZ1NpUHYF0N8Ys2G2619fGnoPidybSoETcPnRF+Hu4S8ZY8ZYaysIbJHA3toF\n2N0Y0yk7ta8bcgiL2Il0UGbh8p5ldlBGBGXa4h7wN+IMnVHx1Lh66vpQipSbh+toNgfaZd50jDED\ncPnAPjPG9M1mXRsLa+1HwMXW2l+q2R8atN/ihi2C+03y/iFfBd8Hf/c0xpig89UaF512Fm6I6s+4\n4cutcPmzJhljhsRS23pSRQflZdywnqettfNq+e5XuNxhAEPJ6MDlI8GDGlyexZdww/W2xUUm9cHd\no6r8njGmBNd5WRhsXhXsS9o5XYnIb/JfXNsDbIYzVqOdl8Rgrf3ZWvtP3PC8EcDGwH24CSNKgS+D\n6zna+a4IHGS/4q57i+vA5jVh+1hrLwMuJh09txsut+btwJm49nwRN6v5quB8Jtj2cbCeN85RY0wT\nY0xxFc/Q0GH4PfAg6Tyihxpjbgz2VecUrggcjVNwnfCBuMjavKA6zbWRVJurKr2FbnM1pI2TbHM1\nQG/iba6aNBeizVWb3mC1oGyuht6rIZk2Vw3P49D+KDibqwbNJVBYNldwn13fGNPXGDM0ui+4X1lr\n7WrgP7h7VhmwhzHmweD75VWdt5H2/hj3jO5AHty3atNby3drcwpvk2Fv/Rn4Fy5wodGu7ZLaiwhR\nM8aY9YHv1uaNezRqxRizN+5i2Bz38L/fGHM4bgjUGTijwOKGFDQ62dAbPRzuOgzzQaUMl6Bj8ldg\nWLApNqNmbTXXZJAFN8pwcoyQ5Q35P9mivnrDjgjuwXcmblibNS4iY3fgXNITG22Hm2TgHzgjfR+g\ntTHmcmvt59lXUzfqqjnaQTHGnAhsZ619ppZjlwTRK6tx5/FPwJLI79bo1KeNrbUfG2PuwjmE+uE6\nKW1xURpnV9Uxs9aWGWMqcNc4OO2xkq17V9D+c40xF+KM2YHA2ThDtqLGLzci9dEbnIvzI59H4iLM\npgdaq+q8rjbGtMPlazVUEb3U2NSmORKFUW6tvdAYMxN3j9oNl6sRXF6/qcBJgVGfij6z1q40xoTX\nbOyRV8aYrXGdyQk4h88sY8xrwIPW2pXBfbgocO4+bYzpjuuEdQSODX6LY2w6Aqkicuzw8xxcB84E\nf2OlNs11OUbCbK611hs9HMmwudZKcwJtr
nrpLRCbq06aC8jmqlVv5F5dKDZXVu5dCbK5anseV4Tn\nYAHZXLVpLjPGNLHWri4Qm2s/3GSIB+DsoXbGmH/iRiz8K3K/qrDW/s8YcxsuKnpjYJ+gXQ+oykaN\n3J++x0UIxz6qoY56a7yvRp3CxpgTgBuAnXD3tBeNMbvh2vYM3P2uDJdHuvGubZsHOTa0JHcB7sZN\nWLAlWUiOTTpP0HpUzm/3E+lE83OBwUnWS+W8huW4vDKDI/sH4N6Qh3o3KJQ2ruZ/tMEZ6xW4aKxG\nz5+TDb24vHzzcVFIo4EDgc8CXTMJchbiHnLb4YYxV+A6Ld2SpDmzXHhO11C+A+lJXv4Tl9b66o3q\nws3gXUp6tuevcJO6tAl/k8i1bXDD2RYE7TuyLr9TPmiuxzE3Aj4JfosPCWarj0tjNvRG2u/kQNcn\nQLtgW+bkTUXAfriIpO+B3ZLSxtH9OMffBsDI4L4VzTuaqXk90nkpj4y5jc8mPXt35nILweRxVeg9\nAecMC6/jewgmJ4q0q4msnxqUew9omRTNdTxevttcWdFLsmyurLZxNf8jn2yuBusluTZXvTWTbJur\noffqJNtcWb+OyW+bq0F6SbbNVZ/zOprvO6k216U42ynUuDyy/j/gkMx2DdYPwo3SWB2UfRz3UiOc\nVC/T5joqKPcZ0DkJeut4vGhO4X+RzilcRnr0Wiz2VmwnlZbkL8BNkQvjDbLvFO6G64CFyeYrcMNl\n4uqYZF0vzkBdEegLjZeBpDsmsenNZRtX8X/GBjfDlcD5xDSD7troxRmjTYPvVeCG2X7Amh2TcHbk\nEmAc8BoxzpTbiG28G24CmLnAoeFvlgS9VO6gHI4zAEPD5hOcg6lrpEwJznB/hfTEKB0KsY1x0Vnh\nsU+OS2O29eKiPcJj3B7ZHj6fDC4C7bWgzLNEjPokaCbSka5mf6YDohjXQV+GG9LXfW3rvRZ6/xJo\nLcd1/l/CRdmURn6HO8PrMmiv6HV8HC53YTjz97O4Gc3bR/XjJnp5MyhzG9AsSZrreNx8tbmyrpf8\nt7ly0sZV/J+x5IfN1WC9JNfmaqw2zheba23v1Um0uXLWxuSnzbXWekmezdWQ8zrJNte1EV3/xT03\nTgo0htv/Q+SlecZ1vD/uhVw4ieCUQNu6kTIlOJsrvKffQ0wv4Ruit47HjU6Q+BjO4Rze234lLnsr\nrhNLS7IX4BjSbzXCNyZZdybhHvrhG5R5wIaFpBf3ZjC8OW5LfnVMGquNB5M24t6K64GXLb3ANcF3\nw3adSbpjkvnmtziuh10MbRwacS8RmT04KXqpbNjsHRgI4YzIc3DDvY4EDgWuxuXlrMBFMfQvtDYm\nbdiuB7wdHPdbgoiVpOvFRW+8S9ph+DfSwxTb4yKRXg32zS7ENq7i/0Sdo/8AWsek91LSBvmZwLBg\newtc9NBjkf2nZnw3eh0fGpy7ofPzK1y+voOD45wXObenA+vH2MYN1lzP/5MvNldO9JLfNldjtXG+\n2FxZ0UuybK7GbON8sLmyda9Oks2Vq3tXvtpc2bqOk2RzNdZ1nC8213kZevtG9jWhckDCdhnfjV7H\nu+IiY0M7tRSYhkuVcAxwXeQ6nhn9P0nRW8//cwzuhV2s9pa1cghracCCSwL/CpXfimW1E4obTnFQ\n5Mbwa1wXSi714jony3EdsANxs5nnQ8ckl5pDo6YNLkolNFrnAAOSqjfcD2yDmw061FRlxyTupZHa\nuHXwe4RG3HfEZMRls42D9c1xb4yXkDYMyiPrFbhIlkGF2MbR3wQXORlqPrhQ9OKMtcWRdv0SN0xs\nOu6ZFF7jcQ4vb4zruB3uXv1q5HeIxTmKswuWBfU4noyI3eB8nIAbTluBS6HRM7NMZH0sLtInPGYF\n6WiNcPkq5ut4rTXX4X/kk82VM73kr82VS835aHNl7TomOTZXY7RxPtlc2b5XJ8Hmaox7dT7ZXFnV\nSzJsr
sa4jvPJ5jqM9EvzU4DmkX3h6Iu9SKe12CmqJVouWN8Ilx5kUeQcLqPydfx5XNdxlvTWls4n\nb+ytVJ3i/OdakrkAl5B+E38m7g3WHWSvE9oCOIR07qu4DfWc6cXNFLswOM7cfNDbSG28EfAn0jkK\nv47r5p9tvbgIlAnAA0CfcFuc7RlTGw8CLiKdz+9b4jXisqI380EPbIozzl/DveFdgIsyvAToXeBt\nHBqv65A27gYmXS+VDbujcREM4XHD5UdcZGFsUSq5bmNcZM6WwKOk8+TNjOteDfQM6lKOy5dcbQ5Q\n4O9BfRcBQ2tp41bAKNzwv/dJO4TfBm4muI8nXXMN38sbmyvXeslDm6uR2jhvbK5s6yUBNlcjtXHe\n2Fy5ulcHn/PS5mqkNs4bmytXbUwe21y5bmPyz+baGBdwUIaL3q1SL/D7oK4rCaJpgRYZZZpUcezr\nguP/invR8ypwFW6S0KTrrfIZhMtnfxh5YG9VqlfcFdCSrAXYMHKTvi+yfYvg5peNTmgv4CHSBnuc\nhnpO9QY3n2Wk34bGlj+mMTQDXXG5G0NDZikusiPOobhZ14vroDQL1ktyUe980xw5RgfcEL5fSL/5\nfR3oV2h6SUcohRMj9CRPopMa414d/ga4yLMrifelTlb1UjkyaVvcxGIf4jqh9+KGsHbNhZZ8aWNc\nFMN9wTEW4zpjcV7H+5COCtuvpnYDxpB+zhwVbKsub1/YyW6Jm/V7CM6BlrqPF5rmjO/nk82VU73k\np82VM83kp82Vdb3kv82VyzbOR5srV/fqfLa5cn6vDo9BfthcWdVLMmyuXD+f8sbmCs6zw4P7yUcE\nOfcj+6OTwIUpFJ7BpdN4BufI/g9uBFZq8riMv2E+3Z4ETmBiun/nSG9Vz6qOwF3kgb1VqV5xV0BL\nshagPzAZFzkzOtgWXiTDyU4ntDnwu+AhMKSQ9eLe6Icze88n5iEDjaR5PC5K5RXccIx1s60hn/Tm\n49IIbTwB94b/XeAKoEeB6w2NGxNdL2TNVfy/poWmN3M/GREOcS+N0cZAP+BC3OQfseShDH970hE3\n/4psr85xsAHp2aHPq+P/WGPIX5zXcWNoDr6XFzZXI7VxXtlcjaQ5b2yuxjqn82lppDbOG5urkfTm\nlc0Vx3lNjDZXrvSSxzZXY7UxeWJzBXXZC5eu4sSM7VFb6VAqR3Ivx0XuR7e9Fmoh8uImD6/jnOqN\naB0DPE3M+b8r1SvuCmhJ3gL0wb29axF8jl4o2eqEtgDaxK21MfTicia9Tx44gxtR82ZAl3x52DfG\nOZ1vSyO08QhcNFaruLWqjf3Q3AjndDSCJVaHf2NoJm2gNyEPhmDjOkoPAhdltkdmvYPny9eB7gvj\nrnu+ayZPbK7G0Eue2VyNpDlvbC5dxzlr47yxudTGha+5kc7pvLK5cq2ZPLO5groMq6o9gs97knaC\n/hs4CRft2x/3kvk90vmWXyIPXrTHobea86N5rjQ0SHfcFdCSzCXzoiYLndA8v0HkQm80yXq7uDU2\ntuZ8W3w7p9XGauNC1OybXp804yIrBgAd6lC2BGecVwC31qYpH/X6qDnHeqORSXljc+VYc949j307\np9XGauNC1JxLvfm6+NTGrOkMzfy8HWnn6G244ITMPMHbAS+TTm12fty6pLfqpQghGoANzvzoX2OM\nCdbfw+VXeQCXcHtr4BpgC2NMEUBY1hjTwhgzIHKMvDwnc6S3zBjTJFhf2LiKaieHmn1q47w9p0Ft\nrDYuPM2+6QV/NFvH19ba+WGda6AYl+cPoGl1hYwxrcJj1+GYjY5vmnOst9wYUxKs543NlWPNZZ61\ncd6d09Aobax7dcz4pjmXeoP1vNILfrWxtbaips+4KNdSnHP0j9bamdba1ZBuO2vty8CNOOc4uHkv\n8hLf9GaSVw8QkWzq0QktCcq2xuVrucoYc27wvcwLMG/Jkt7V8dS+Yai
NC1sv+KfZN73gn2bf9ELh\na7bWOb+rInCOlONyxIIbwrfGd4wxA4FLjDGH1nbMfMA3zTnSW5ab2mYHtXGaQtQLOdOse3Ue4Ztm\n3/SCn5qjWGtfxEXEnmWt/TVjX9T+fBSXvx5gW2NM+3x7gVUXCl6vzYMwZS2FtVDzcNUpwbZ2wH64\nmRwrgLeowxCMfFx80+ujZt/0+qjZN70+avZNr6+aI3qfCvTcW8VvMQCX460CeBxoHXd9pVl6pVl6\nfdTsm14fNfumt5A1R3XUUq44+PssaduyqK7fz5fFB71hSLMQWcNa96bEOt4zxtwU7Nof2AoXqfQM\nLhppY2ABcIS1dn7VR8xvfNML/mn2TS/4p9k3veCfZt/0gp+aIRWhUxx8bAfp6Bvj0mLcAowFFuIi\nPpbEUM2s4ptm3/SCf5p90wv+afZNL/in2Te9UNiaQx11KFdujFmXdOqEz20ej2aoDh/0yiEsckIV\nndCbccMnDgJGAhvgbpDzgDHW2s9jrO5a45te8E+zb3rBP82+6QX/NPumF7zVXGGMWRB8XBFuj3TE\ntsfpHW2t/SKGKmYd3zT7phf80+ybXvBPs296wT/NvukFPzVnYlzO/vE4+/I74J/BdlNXJ2uSSLLe\n/M9pIRJL2AkN1t8F/gFMxeUwbIfLrVMQnU/wTy/4p9k3veCfZt/0gn+afdMLfmomyNtHYOsal68v\n2hEbU4AdMd80+6YX/NPsm17wT7NvesE/zb7pBT81RxkMTALaAO8CH0KyciXXk8TqlUNY5JTwIjDG\ntAS6AusAzXDDUrcpsM6nd3rBP82+6QX/NPumF/zT7Jte8FJz8+BvR2PMRsBfqdwRKzS94J9m3/SC\nf5p90wv+afZNL/in2Te94JnmMOjAGNPEGLMZcD1uFNoM4HSbMRlb0ikkvUoZIXKOcbOX7wacCQzC\n3Qi3sdZOi7ViOcI3veCfZt/0gn+afdML/mn2TS94pzmMzmkP3AyMoUA7YhF80+ybXvBPs296wT/N\nvukF/zT7phc80xyMROsK7Aocg5u0uBTYzVr7bayVywGFpFcOYZFTgrcnuwKXAetTwDdC8E8v+KfZ\nN73gn2bf9IJ/mn3TC15qnhf8HYKzdwtdL/in2Te94J9m3/SCf5p90wv+afZNL3ikOXCM/gY4Emdf\ndsSlTTjIWvt1nHXLBYWmVykjBJCaDTNcN8aYFsH6Wr00CIaofo67WCwueXrsN0Lf9IJ/mn3TC/5p\n9k0v+KfZN73gn+Zc6SUd9BB2xPImLYZvmn3TC/5p9k0v+KfZN73gn2bf9IJ/mnOktzWwCy5Kdh5w\nB7B3PjhHfdPbEOQQFhhjiq21FcH6BOAvwGfGmA2stWVreWxjrf0Yl1NliM2D5Om+6QX/NPumF/zT\n7Jte8E+zb3rBP8251As8CHwGlAHb5ktaDN80+6YX/NPsm17wT7NvesE/zb7pBf8050qvtXYGcH6w\nHAWcaq39Lht1Xht809tgrLVaPF6A4sj6icAvuFnHK4C/A0VZ+B9rfQzplWbplWZf9fqo2Te9PmrO\ntV6gKbAD0Cdurb5q9k2vj5p90+ujZt/0+qjZN70+as613nxbfNO7Vr9V3BXQEmPjRy4E4ILgAqkA\n7gL2AErirqP0SrP0SrPPen3U7JteHzXnWi9g4tbou2bf9Pqo2Te9Pmr2Ta+Pmn3T66PmXOvNt8U3\nvWv9e8VdAS3xL7iZEcML5Qyga2RfXt3QpFeapVeafdTro2bf9Pqo2Te9Pmr2Ta+Pmn3T66Nm3/T6\nqNk3vT5qlt7C1tvQZW2TZYuEY4zZDJf7BOBK4GZr7dJgn7HBFVMo+KYX/NPsm17wT7NvesE/zb7p\nBf80+6YX/NPsm17wT7NvesE/zb7pBf80+6YX/NMsvYWtd23QpHJiELAB8APwVHihQGpG8kLDN73g\nn2bf9IJ/mn3TC/5p9k0v+KfZN73
gn2bf9IJ/mn3TC/5p9k0v+KfZN73gn2bpDShQvQ1GDmGPMcYU\nAXvhkp5/aq2dEnOVcopvesE/zb7pBf80+6YX/NPsm17wT7NvesE/zb7pBf80+6YX/NPsm17wT7Nv\nesE/zdJb2HrXFjmE/cYC7YL18oYcwBhTnL3q5Bzf9IJ/mn3TC/5p9k0v+KfZN73gn2bf9IJ/mn3T\nC/5p/v/27i1UtvMg4Pj/2ydtkwabmFuNpqXEJG29YUWsFoxYqlWE4oM+KlbUlljSB9GCFu9Cn+oF\naSnqmxA1D0JqQR8kabVeiBVKqARNTURSsUkvgtIU2718WGt7prtnJ+fk7LNnZv3/f/jYc2atmfP9\n9pr18q3FbJsXfGabF3xmmxd85ryX2J55L6sWhMUtt8v/J/NJc+0Y48Vj6aTXHG0bY7xxjHH7NE1f\nXK7C7Hw2L/jMNi/4zDYv+Mw2L/jMNi/4zDYv+Mw2L/jMNi/4zDYv+Mx51+293BTIetb+BRjA3cAb\npqWTdp6maRpj3AS8F3hsjHHLNE2HZzTX08jmBZ/Z5gWf2eYFn9nmw+Wg/QAADEVJREFUBZ/Z5gWf\n2eYFn9nmBZ/Z5gWf2eYFnznvur3PuxaEJR2/wrFxheQDwCPL418ZY3zrc7zPOeB7gRcDjzGfaDuX\nzQs+s80LPrPNCz6zzQs+s80LPrPNCz6zzQs+s80LPrPNCz5z3nV7r0QtCK+0jZMBgONXODaukHwM\n+Nvl8R3Az48xXnP0Hkub36HyauBe4Bbg/cBnr8D0LzmbF3xmmxd8ZpsXfGabF3xmmxd8ZpsXfGab\nF3xmmxd8ZpsXfOa86/aeSdM0NVY2gHMbj18NvAn4beA3gDcDdx3b/yXA3wGHwH8BDwKvv8A+dy/b\nDoGHgZdt22r0Gs02r9Fs8xrNNq/RbPMazTav0WzzGs02r9Fs8xrNedftPbPf67Yn0DjlA/qlJ8rb\ngI8y/3XFw43xj8C7gbGx71cCHzm23/uAXwd+CniA+btYDoH/AO7cttXoNZptXqPZ5jWabV6j2eY1\nmm1eo9nmNZptXqPZ5jWa867be6a/221PoHGKBxMONh7/0saH/p+WD/t9wKeBzy3Pf+DYa64D7gee\nOHbSHJ1s/wP8PceuvuTNnDdz3sx5vWab12i2eY1mm9dotnmNZpvXaM67bu+Z/363PYHGFTio8/ef\nHH3QfwH4xo1tdwJ/DDyzbH/P8vy55ec1wA8Bvwd8fDlB/g34c+DtwFdv22f3Gs02r9Fs8xrNNq/R\nbPMazTav0WzzGs02r9Fs8xrNedftPbPf67Yn0DjlAwrfsXzID5lvhb+W+a8kjmX7bcBTy/a/Ar5h\n47UHx97reuBW4MZtu/J6zTav0WzzGs02r9Fs8xrNNq/RbPMazTav0WzzGs151+0909/ttifQOOUD\nCj8GfB74MPD1y3NHV0buZP5ulEPgg8CrTniPcyc8P057vnkz581s8xrNNq/RbPMazTav0WzzGs02\nr9Fs8xrNedftPdPf7bYn0DjFgwlXMX9nyiHwm8tzR1dN7tg4UR660IkC3LLxeOdPDJvXaLZ5jWab\n12i2eY1mm9dotnmNZpvXaLZ5jWab12jOu27vWY8Dak1NzLfPAzwKME3TNMa4g/nW+ZcCHwLeOk3T\no0cvGnMvBH5wjPHdR68705k/v2xe8JltXvCZbV7wmW1e8JltXvCZbV7wmW1e8JltXvCZbV7wmfOu\n23u2XcnV5sbpDOCA81dB/v/xCfvez3yF5B3LvzdvoX+Ik2+h/xbmv7T4N8Ar8mbOmzlv5rxus81r\nNNu8RrPNazTbvEazzWs05123d1dHdwjvcGOMATBN0+G0fKI3H5/QY8vPN48x3sT8PSovZb568iVX\nTTb+n+uBtwAD+G/mv7h45tm8y1xUZpt3mYvKbPMuc1GZbd5lLiqzzbvMRWW2eZe5qMw27zIXldnm\
nXeaiMtu8y1xU5rzr9u56V217AnVy0zRNY4xvAt4AfDvwAuAc8BfAI9M0feho3zHGVdM0fQF4P/DD\nwCuAPwKuBh4E3nbCiTKA1wKvBz4D/OHy/47nOClPPZsXfGabF3xmmxd8ZpsXfGabF3xmmxd8ZpsX\nfGabF3xmmxd85rzr9u580w7cptz48gHcDbwLeAb4AvPt8Efj88vPdwGvPfa6a4H7lu3/CzwJfP+y\n7YD5Csm5jf2/jvk7Vw6BPwNuzps5b+a8mfM6zTav0WzzGs02r9Fs8xrNNq/RnHfd3n0YW59A4wIH\nBe4FHmH+vpND4Ang48AngKeOnTgPMd8mv/n6m4F/XrZ/CvhT4HXH9rkO+J7l9YfA48DteTPnzZw3\nc16n2eY1mm1eo9nmNZptXqPZ5jWa867buy9j6xNoHDsg8GsbJ8JfAz8D3MB8G/31wG3AbwEPb+z3\nKPDOY+/zcubvWjlcTrovAu9jvuJyL/Mt+Y8v258EXpk3c97MeTPndZptXqPZ5jWabV6j2eY1mm1e\noznvur37NLY+gcbGwZhPgqMT4FeBbz62/arl5wHwncAfbOz/78DPHdv/VuZb5Z9e9jm6Lf/oqsxn\nmK+e3Jk3c97MeTPndZptXqPZ5jWabV6j2eY1mm1eoznvur37NrY+gcZyIOB3Nj74Pw1ct7Ht4ITH\nNwPv3njdI8CPLtvG8vM64B7gfuarJM8wXzV5APgJ4KvyZs6bOW/mvE6zzWs027xGs81rNNu8RrPN\nazTnXbd3H8fWJ9CYAH534wP/k2x8IfZFvPbqYyfaAyzfkwK84Ni+NwFfA9yQN3PezHkz53WbbV6j\n2eY1mm1eo9nmNZptXqM577q9+zq2PgH7AH5/44N+z/LcuMT3uI3zf3XxEPjZY9sPTnjdJf0/eTPn\nzWz0Gs02r9Fs8xrNNq/RbPMazTav0WzzGs151+3d53FAba0xxg8AP77881+BwzHG1dM0TWOMSzk2\nTwJ/srwHwNvHGHcdbZym6fBCL5qWM+assnnBZ7Z5wWe2ecFntnnBZ7Z5wWe2ecFntnnBZ7Z5wWe2\necFnzrtu777XgvB2e5j5i7UBbme+lf5HxhjXTNN0OMYYF/Mmy4f+AeCjy1NfwfzdK7uWzQs+s80L\nPrPNCz6zzQs+s80LPrPNCz6zzQs+s80LPrPNCz5z3nV797tpB25TNg/gRuAXOX8r/EeYT5qrl+3P\necs7y+3ywBuBz7FxS/3FvD5v5ryZ82Y2e41mm9dotnmNZpvXaLZ5jWab12jOu27vPo/uEN5y0zR9\nCngP8MvLU68B3sp8FeXo1vpnvYoynb9d/pMbTx8u23bqlnmbF3xmmxd8ZpsXfGabF3xmmxd8ZpsX\nfGabF3xmmxd8ZpsXfOa86/bucy0I70DTND3NZZ4wSy8BXrQ8fvzUJ3pK2bzgM9u84DPbvOAz27zg\nM9u84DPbvOAz27zgM9u84DPbvOAz5123d19rQXhHupwTZuP5o+9U+SzwiSs43cvO5gWf2eYFn9nm\nBZ/Z5gWf2eYFn9nmBZ/Z5gWf2eYFn9nmBZ8577q9e9m0A99b0Tg/gJt4Ht+3AtwKfHh5zX3bduTN\nbPUazTav0WzzGs02r9Fs8xrNNq/RbPMazTav0Zx33d59GlufQOMCB+USTxjghcv2TwOPAd+3PH+w\nbUvezEav0WzzGs02r9Fs8xrNNq/RbPMazTav0WzzGs151+3dl7H1CTROODDPfcIcbOz7bcA/LPu9\nF7hu2/PPm9nuNZptXqPZ5jWabV6j2eY1mm1eo9nmNZptXqM577q9+zC2PoHGsxyck0+Yazb2uQv4\n4LL9YeBl25533sx5vWab12i2eY1mm9dotnmNZpvXaLZ5jWab12jOu27vro+tT6DxHAfo5BPmRcDL\ngb9cnn8C+Nptzzdv5ryZbV6j2eY1mm1eo9nmNZptXqPZ5jWab
V6jOe+6vbs8tj6BxkUcpAufMO8E\nHlz+/RTwym3PM2/mvJmtXqPZ5jWabV6j2eY1mm1eo9nmNZptXqM577q9uzq2PoHGRR6oLz9hPrlx\norxq2/PLmzlvZrvXaLZ5jWab12i2eY1mm9dotnmNZpvXaM67bu8ujrEciNqDxhg3AfcwnzQHzCfK\nd03T9OhWJ3aFsnnBZ7Z5wWe2ecFntnnBZ7Z5wWe2ecFntnnBZ7Z5wWe2ecFnzrtu767VgvCeNca4\nEXgH8BbgddM0fWzLU7qi2bzgM9u84DPbvOAz27zgM9u84DPbvOAz27zgM9u84DPbvOAz5123d5dq\nQXgPG2PcABxM0/T0tudyFtm84DPbvOAz27zgM9u84DPbvOAz27zgM9u84DPbvOAz27zgM+ets6gF\n4aqqqqqqqqqqqipJB9ueQFVVVVVVVVVVVVWdTS0IV1VVVVVVVVVVVUlqQbiqqqqqqqqqqqpKUgvC\nVVVVVVVVVVVVVZJaEK6qqqqqqqqqqqqS1IJwVVVVVVVVVVVVlaQWhKuqqqqqqqqqqqoktSBcVVVV\nVVVVVVVVJakF4aqqqqqqqqqqqipJLQhXVVVVVVVVVVVVSWpBuKqqqqqqqqqqqkpSC8JVVVVVVVVV\nVVVVkloQrqqqqqqqqqqqqpLUgnBVVVVVVVVVVVWVpBaEq6qqqqqqqqqqqiS1IFxVVVVVVVVVVVUl\nqQXhqqqqqqqqqqqqKkktCFdVVVVVVVVVVVVJakG4qqqqqqqqqqqqSlILwlVVVVVVVVVVVVWSWhCu\nqqqqqqqqqqqqktSCcFVVVVVVVVVVVZWkFoSrqqqqqqqqqqqqJLUgXFVVVVVVVVVVVSXp/wDBv2CL\n9YhmagAAAABJRU5ErkJggg==\n", + "text/plain": [ + "" + ] + }, + "metadata": { + "image/png": { + "height": 387, + "width": 706 + } + }, + "output_type": "display_data" + } + ], + "source": [ + "fig, ax = plt.subplots(figsize=(8,4))\n", + "\n", + "mean, std = scaled_features['cnt']\n", + "predictions = network.run(test_features)*std + mean\n", + "ax.plot(predictions[0], label='Prediction')\n", + "ax.plot((test_targets['cnt']*std + mean).values, label='Data')\n", + "ax.set_xlim(right=len(predictions))\n", + "ax.legend()\n", + "\n", + "dates = pd.to_datetime(rides.ix[test_data.index]['dteday'])\n", + "dates = dates.apply(lambda d: d.strftime('%b %d'))\n", + "ax.set_xticks(np.arange(len(dates))[12::24])\n", + "_ = ax.set_xticklabels(dates[12::24], rotation=45)\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Thinking about your results\n", + " \n", + "Answer these questions about your results. How well does the model predict the data? Where does it fail? 
Why does it fail where it does?\n", + "\n", + "> **Note:** You can edit the text in this cell by double clicking on it. When you want to render the text, press control + enter\n", + "\n", + "#### Your answer below\n", + "- The model predicts the data well enough.\n", + "\n", + "- The model fails for the period of the Christmas holidays. \n", + "\n", + "- **Reason:** We can expect users to change behavior around Christmas time (family visits, holidays, etc.). We also have much less data for the Christmas holidays, which occur only once a year, than for the rest of the year. So the model is overfitted for normal days vs. Christmas holidays. We would need data over several years to fit the model for these seasonal changes." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Unit tests\n", + "\n", + "Run these unit tests to check the correctness of your network implementation. These tests must all be successful to pass the project." + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + ".....\n", + "----------------------------------------------------------------------\n", + "Ran 5 tests in 0.005s\n", + "\n", + "OK\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 31, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "import unittest\n", + "\n", + "inputs = [0.5, -0.2, 0.1]\n", + "targets = [0.4]\n", + "test_w_i_h = np.array([[0.1, 0.4, -0.3], \n", + " [-0.2, 0.5, 0.2]])\n", + "test_w_h_o = np.array([[0.3, -0.1]])\n", + "\n", + "class TestMethods(unittest.TestCase):\n", + " \n", + " ##########\n", + " # Unit tests for data loading\n", + " ##########\n", + " \n", + " def test_data_path(self):\n", + " # Test that file path to dataset has been unaltered\n", + " self.assertTrue(data_path.lower() == 'bike-sharing-dataset/hour.csv')\n", + " \n", + " def test_data_loaded(self):\n", 
+ " # Test that data frame loaded\n", + " self.assertTrue(isinstance(rides, pd.DataFrame))\n", + " \n", + " ##########\n", + " # Unit tests for network functionality\n", + " ##########\n", + "\n", + " def test_activation(self):\n", + " network = NeuralNetwork(3, 2, 1, 0.5)\n", + " # Test that the activation function is a sigmoid\n", + " self.assertTrue(np.all(network.activation_function(0.5) == 1/(1+np.exp(-0.5))))\n", + "\n", + " def test_train(self):\n", + " # Test that weights are updated correctly on training\n", + " network = NeuralNetwork(3, 2, 1, 0.5)\n", + " network.weights_input_to_hidden = test_w_i_h.copy()\n", + " network.weights_hidden_to_output = test_w_h_o.copy()\n", + " \n", + " network.train(inputs, targets)\n", + " self.assertTrue(np.allclose(network.weights_hidden_to_output, \n", + " np.array([[ 0.37275328, -0.03172939]])))\n", + " self.assertTrue(np.allclose(network.weights_input_to_hidden,\n", + " np.array([[ 0.10562014, 0.39775194, -0.29887597],\n", + " [-0.20185996, 0.50074398, 0.19962801]])))\n", + "\n", + " def test_run(self):\n", + " # Test correctness of run method\n", + " network = NeuralNetwork(3, 2, 1, 0.5)\n", + " network.weights_input_to_hidden = test_w_i_h.copy()\n", + " network.weights_hidden_to_output = test_w_h_o.copy()\n", + "\n", + " self.assertTrue(np.allclose(network.run(inputs), 0.09998924))\n", + "\n", + "suite = unittest.TestLoader().loadTestsFromModule(TestMethods())\n", + "unittest.TextTestRunner().run(suite)" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "hide_input": false, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.1" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git 
a/first-neural-network/DLND-your-first-network/Bike-Sharing-Dataset/.ipynb_checkpoints/Untitled-checkpoint.ipynb b/first-neural-network/DLND-your-first-network/Bike-Sharing-Dataset/.ipynb_checkpoints/Untitled-checkpoint.ipynb new file mode 100644 index 0000000..2fd6442 --- /dev/null +++ b/first-neural-network/DLND-your-first-network/Bike-Sharing-Dataset/.ipynb_checkpoints/Untitled-checkpoint.ipynb @@ -0,0 +1,6 @@ +{ + "cells": [], + "metadata": {}, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/first-neural-network/DLND-your-first-network/Bike-Sharing-Dataset/Readme.txt b/first-neural-network/DLND-your-first-network/Bike-Sharing-Dataset/Readme.txt new file mode 100755 index 0000000..b1f2568 --- /dev/null +++ b/first-neural-network/DLND-your-first-network/Bike-Sharing-Dataset/Readme.txt @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:b92c8628622948bf0828c43a1b315d1517c3d7fd63654730d0dea379b8e88175 +size 5607 diff --git a/first-neural-network/DLND-your-first-network/Bike-Sharing-Dataset/Untitled.ipynb b/first-neural-network/DLND-your-first-network/Bike-Sharing-Dataset/Untitled.ipynb new file mode 100644 index 0000000..5b8d752 --- /dev/null +++ b/first-neural-network/DLND-your-first-network/Bike-Sharing-Dataset/Untitled.ipynb @@ -0,0 +1,238 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## This is a title\n", + "dsjksd\n", + "sd\n", + "gsf;sfag\n" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/html": [ + "
[... HTML rendering of data.head() elided (tags stripped during extraction); the same table follows as text/plain ...]
" + ], + "text/plain": [ + " instant dteday season yr mnth holiday weekday workingday \\\n", + "0 1 2011-01-01 1 0 1 0 6 0 \n", + "1 2 2011-01-02 1 0 1 0 0 0 \n", + "2 3 2011-01-03 1 0 1 0 1 1 \n", + "3 4 2011-01-04 1 0 1 0 2 1 \n", + "4 5 2011-01-05 1 0 1 0 3 1 \n", + "\n", + " weathersit temp atemp hum windspeed casual registered \\\n", + "0 2 0.344167 0.363625 0.805833 0.160446 331 654 \n", + "1 2 0.363478 0.353739 0.696087 0.248539 131 670 \n", + "2 1 0.196364 0.189405 0.437273 0.248309 120 1229 \n", + "3 1 0.200000 0.212122 0.590435 0.160296 108 1454 \n", + "4 1 0.226957 0.229270 0.436957 0.186900 82 1518 \n", + "\n", + " cnt \n", + "0 985 \n", + "1 801 \n", + "2 1349 \n", + "3 1562 \n", + "4 1600 " + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "%matplotlib inline\n", + "import pandas as pd\n", + "\n", + "data = pd.read_csv('day.csv')\n", + "\n", + "data.head()" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 4, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": 
"iVBORw0KGgo... [remainder of base64-encoded PNG image data elided]
jz8MLBhQ3RZh+LCkQaHPPz617/G\n73//+8FuhoODwxBEGtLQnZvu0IU1Bh+ONDjk4Ve/+pUjDQ4ODkWBkIYkyYiFNPT2FqdNDvHhSIOD\ng4ODw4BBkvImIQCONJQOHGkYQli/fj2++tWvYpdddkFtbS122203fOMb30Bvby9uueUWVFRU4Omn\nn8a3v/1tTJgwAY2NjfjsZz/rm49jxowZeP311/HYY4+hoqICFRUV+OQnPzmIR+Xg4DCUIKTBFmpY\nuhRoaclf39XFS0caBh/DLiPkUMX777+Pgw46CK2trTjzzDOx1157Yd26dbjrrrvQ3t4OIgIAnHPO\nORgzZgwuueQSrFy5EldffTXOOeccLFiwAADwy1/+EmeffTZGjhyJiy++GEopTJw4cTAPzSEFOjqA\nzk5g9OjBbomDgwelwknD0UcD550HnH++f73zNJQOHGkYIrjgggvwwQcfYPHixTjggAM+XH/JJZf4\nyo0fPx4PPvjgh9/7+vpw7bXXYvv27Rg5ciROOukkXHTRRRg/frybfraMceihwKuvJosbOzgUG11d\n3j1pEoDmZp7MatOm/O1ceKJ04EhDANrbgeXLi7uPWbOA+vrC61FK4e6778ZJJ53kIwwmiAhf//rX\nfes+8YlP4JprrsGqVavwkY98pPDGOJQEXn11sFvg4JAPfZJhkzSsWsXL1tb87RxpKB040hCA5csB\nnlq8eHjxRSCLubM2bdqE1tZW7LPPPpFlp06d6vs+Oqdfb9u2rfCGODgUGQ8+CPz5z8BNNw12SxzS\nIA5psHkaHGkoHTjSEIBZs7hTL/Y+soBKoEFXVlYWXIeDw2DhySeBu+92pKFckZY0iBHSeRoGH440\nBKC+PhsVYCAwYcIENDU14bXXXsukPjFNOjiUGtrbvbdOh9JGZyfwm98A55wDVOTG6bnwRPnDDbkc\nAiAifPrTn8Y999yDJUuWFFxfQ0MDmpubM2iZg0O2aGtzpKFccN11wLe+BTzyiLfOhSfKH440DBH8\n+Mc/xoQJE3DEEUfg29/+Nm688Ub86Ec/wr777ovWHHUPCkGY6+fOnYtXX30V//mf/4k//vGPePTR\nR4vefgeHOBDSMFyjaVddVfywaVaQDn7HDm9d1qShv9+RyIFGItJARBVEdBkRvUtE7UT0NhFdbCl3\nKRGtz5V5mIh2N34fTUR/IKIWItpGRL8logajzH5E9AQRdRDRKiI6L90hDg9MnjwZzz33HD7/+c/j\njjvuwLnnnovbb78dn/zkJ1GfG6IRFHYw1//gBz/ACSecgJ/+9Kc49dRTcdlllxW9/Q4OcdDezoSh\nr2+wWzI4+MlPgHvuGexWxMOIEbwUPwIQTho2bOCwcJLwxOc+5+3HYWCQ1NNwAYAzAfwLgDcAHAjg\nFiJqVkpdBwBEdD6AswGcAeA9AJcDWEhEs5VSwgnvADARwDEAagDcAuA3AE7P1TESwEIAD+X2ty+A\nm4lom1Lqt+kOdehjypQpuPnmm62/nXHGGTjjjDPy1h955JHoM57AEyZMwP/+7/8WpY0ODoWgrY2X\n3d1AVcyn17p1wP77Ay+/DEyZUry2DQR27CgfM2BNDS91JUAnDaZC0NwMTJvGI9f6+gDdsx2U3Olv\nf+OlUoCzYg0MkoYnDgNwt1LqQaXUaqXUX8Ed+8FamXMBXKaUukcp9RqYYEwG8GkAIKLZAI4H8FWl\n1AtKqacBnAPgi0Q0KVfH6QCqc2WWKaX+BOC/AHw73WE6ODgMBeikIS4WLeKkQXffXZw2DRS6u7nT\nLBfSkERp6O0Ftm8Hdt2Vv2/f7q8rKI30yJG8TDpl9vLlwGc+M3wVq0KQlDQ8DeAYItoDAIhofwCH\nA7g/930GgEkAFskGSqlWAM+BCQcAHApgm1LqJa3eRwAoAIdoZZ5QSum3yEIAexHRqIRtdnBwGCJI\nQxqkYzE7onJDmmMfTCQhD
RKSmDaNl6avISg8IcrRW28la9uSJcD//I/fb+EQD0lJw08A/BHAciLq\nBvAigGuUUnfmfp8E7vw3GtttzP0mZT7Qf1RK9QHYapSx1QGtjIODwzCDdDrDkTRIB1cuSoOEJ+KQ\nBiEJojSYvoYg0rDzzrxMShr6+/31OsRHUk/DKQBOBfBFsKfhowB+SUTrlVK3hWxHYDIRhqgyErEa\npr5pBweHNG/bdXW8LHfSUG5Kg4xw0UnDjh18PTo6/KRBRnhHKQ0mYZJ9JCUNEpYol3NZSkhKGq4C\n8GOl1J9z318noukALgRwG4AN4M59IvxKwQQAEo7YkPv+IYioEsDo3G9SxpxaUbYxFQgf5s+fj1Gj\n/BGMefPmYa+99grbzMHBoQyQpuOUjqXcSUO5KQ3SMeukYdUqYMYM4I030pEGU2no7ORlWtJQLucy\nayxYsODDmY0FLbaxrhYkJQ31yH/T70cuzKGUeo+INoBHRbwKAETUBPYqXJ8r/wyAnYjoAM3XcAyY\nbCzWylxORJW50AUAHAdghVIq9MiuvvpqzLGkcswi6ZGDg8PgIg1pECnakYaBhY00vPsup88PIg1B\n4YkgI2RHBy83hr5K5mO4hyfmzZuXN4vxkiVLMDfGhEtJPQ33ALiIiE4gol2J6DMA5gP4q1bmGgAX\nE9GJRLQvgFsBrAVwNwAopZaDTY03EtFBRHQ4gGsBLFBKidJwB4BuAL8jor2J6BQA3wTw84TtdXBw\nGCLo6fE6jeGoNJRbeMIWAhDSANg9DZMn81DLuEqDkIakyb6Gu9JQCJIqDWcDuAysGkwAsB7Ar3Pr\nAABKqauIqB6cd2EnAE8C+JSWowFgX8R14FET/QDuAg/VlDpaiej4XJkXAGwGcIlSyk1T4+AwTCGd\nJuCUhmKgv9+bIyILSAff2cnXa8UKYP16YM89eb2pNNTXs3myqSl5eCIpaRjuSkMhSEQalFJt4FwJ\nofkSlFKXALgk5Pdm5BI5hZRZCuDIJO2Lg2XLlmVd5bCGO5+Dj6wf9qUKnTTokncUpIMo9+F1xSQN\nK1ey1+Dhh4Fjj82mTnmb7+gAfv1rnocCAGbOBKqr80nDTjvx5zDSYB67KA1yjZO2zXYu77uPE0Wd\ncEKyOocLhs0sl+PGjUN9fT1OPz2UqzikQH19PcaNGzfYzRiWWLoUOOwwYM0aYPTowW5NcTHclYZi\nhieE+z/1VPakob0deOABb/1uu4WThlGj4g+5FKUhLWmwnct//mdeDtf5TaIwbEjDtGnTsGzZMmze\nvHmwmzLkMG7cOEwT27PDgGL9eu5Mtm0b+qQhLAVxGOThb5vToJxQTKVBOt/a2uzqlI552zYmI4JJ\nk5g06NewpcVPGkylIcoI6cITA4dhQxoAJg6uc3MYSpCH33CYMni4Kw1CGorR0UmnrJOGVauA228H\n/uM/0s3rIKRh0SLu1B97jD0LFRXRSkMcT0NfH9cxYkS24QmHcAyDSKiDw9CFPCyHQw79QkmD2UG8\n9BLwq18V3q6BQjGVBhtpuPde4OKLgU2b0tUp96RSTBI+8QkOpQHRnobWVr5u11/PCpPN0yDqSH19\n+tETTmlIDkcaHBzKGE5piIb+Fqp3LrfcAlx4YfB2//M/wNq18fdTbMjxFzM8oU8zvWULL9980192\n5Urgs59lc2MYdCI7ZYrfrGuShk2bvPCaKA2vvw6cfTYrFTalQScN+jXu64tWHkwi2d3NStRwIN+F\nwpEGB4cyho00DFUDV6GeBsCLgQNsHm1tzZfCAeB//5dnQbzyyuTtLBaKGZ6weRq2buXlihX+suef\nz1NS33knQqF3wFOn+n/TSUN7O886ud9+/F1Iw9tv8/dNm+ykQa6lSRpqa4ETT4zXNqn3iiuAY46x\n3wsOfjjS4OBQxpCHn/6AHqpx2rY2jq2bJroo6B2KboZcs8a/1HH++bxsbEzezmJhIIyQMsk
UEKw0\niJc86hroHbxJGmpqvONYsoTv30NycxybpGHDBjs5DgpP9PYC998f3jbT07B0Ke9PiJJDMBxpcHAo\nY9gepkM1VNHayjNWjhiRLWlYvdpfXik2AQJ+ZWKwUazwRFubfRSCkAZTaZB2RA1Ei6s0PPccd/z7\n7MPfxdMg80noIaI4SoP5uw3m6IlVq3iUh/g3mpqCtx3ucKTBwaGMYTNCDlWlYd06YJdd+C01LWmQ\nDq+ry5uvwFQaduzwOhw9JDLYCAtPKOXlWkiCri72G/w1NxGAfh/JW7epNMg5FFIRhLikYfFiYO5c\noCo3lm/UKCYHS5fy9/Xrve2SGCFfeCG6bTppADx1Q0yZDvlwpMHBoYxhUxqGKmlYuzYdadA7FOnw\n1q3z1plKw4YN3udSJA226/vUU/ymbh5LFNat45ELy5fzd51gbdnC51o6VEF7OzB+PG8XpmrppGHS\nJP9vOmlYv56zUQpkkuIXX/TaKJD9/fznwMEH82dTaRBD5TPPRLetp4ePRxQGUVWMiZIdNDjS4OBQ\nxhhupGHKlGyUBlEXxo/P72hFgdh119ILT9TX2499zRomR7bRHg8+CNx8s71O6ZDlntHP1dat3Nmb\n91NbG09hrRRL+kHQScPYsf7fdNLQ38+TVAmkw+7pYVOjtLG62rvPr7vOK28qDXIMOtnp7gbOOQf4\n4AN/me5ufzkhDQ0Nwcc13OFIg4NDGcMWnhiqnoZ167InDR/7WH54QkjDjBnFVxp27ABuvTXeiJcd\nO/gt2kYKJVQgnaKO//5v4Gc/s9dpkgz9ftq2DZg4kfdnqjWSI88WoujpAS64wD8SYd99/WVM0qAP\nx9T9BMccA7z/Pn+ur/fu7cmTvTINDf5rLPXqitE77zDROOAATpMtbevp8ZMGCcUM1RFIWcCRBgeH\nMsZAKg3d3ckmisoSfX0sY2flaVi/nuPWu+3mkQTBhg0cX588ufik4dOfBs44wy/BByGMNIgp0UYa\n1q2zjxCR33TIuWpu5o5z4kT/eqX8pMFmhnznHR6q+txzHDJRChgzxl9GHwHT1+cnDXpo4Oijvc/1\n9d6x77KLf72NNAjZkHYDfN0XLeJcE4CnNEiWSlEakmaYHE5wpMHBoYwxkKRh113zDW3Fgvmmt3Ej\ndy5plAabp2HbNu7I6uo8Q52+r4kTebiljTQ88ggP/Vy7ljumqHwFQfjgA+7AgOj8AN3dfI1Hj+al\neX7ClIZ16zhxkW3ujSDSICZI8SLIPdXVxfsOUxrk2nR2+sMOOsKUBiENdXXAhAne+oYGu9KghyeU\n4nrHjfMrDab6Jvvu6eFzMGkS70tIsUvyFAxHGhwcyhgDGZ7YsCF9SuEkkDe/Rx/l7y+/7L1ZFqI0\n1NR4pEHSFtfW+knDT38KXHYZk4b6erunQdq1fDnw7W8D8+YFn5ft24Pb+ve/+8sF4bXXgFdf5c9i\n8jOJYRBp6OvzOk+b2hAUnpD6RGmQ/cn5E/IYFJ4A0pMGCU/89KfsORHopEH3HFRX55PnadOY0AmZ\nkH19//u8lGvS3c0EaexY/76c0hAMRxocHMoYQ9EIKR3k00/zcskS77dCPA0jR0aThu99j5d1dfxn\nUxoka2Jnp9eJP/aYfd+f+hTwk5/Yf9PzH4SRhn33BQ46iD8HkYag8IQoNIDdJBmlNAhpkPtLzl9T\nE58/W3hCrk1XVzBp0JM7maShspLX/fu/+5WGPff02qGT5IoKjxzIvqdN4+sj6opsN3Omv1x3tzdD\nrKg9kyc70hAGRxocHMoYQzFPg3Sgko1R5kP45CdZds6SNIwY4fdpiOy+YQMrDTbSUFfHy85Obzz/\n//2ffd/r1+d7JgQrVgB77MGf407bLaTBPP4gpUEnBXGUBrmPpD7ptPWUzwC/6Y8dWxylAfBm1fzo\nR3lSsQ8+8EIzso1e1pxLQq6j+BpkOyF8+gRYW7dyqOq
993jdCSc40hAGRxocHMoYUUpDMVzgxTZD\nSj6CkSN5KR3Z/fdzB1FTE68Nzc1cXsbrhykNcp4mTODO6U9/CiYN0vF0dHj1ScjChD5Do4kVKzip\nERB/2u4opcEkKEIaamvzCYJSTI70t3k9PFFX510DMzzR0MBtsQ25jKM0RJEGQUUFcNZZHDqoqvK2\nMZUGM8NjEGkQAmpTGu6/H7j2WiarjjQEw5EGB4cyRlQa6WI8/Mz4/ZtvZqtuCGmQN3rpIKQDiqs0\nvP46L//0J142NtpJg5jnAD5fp5wCzJkT7GnQwxNCKoJmw+zoCG7rm28CH/kI15eUNNiUhrFj7UpD\ndTWPYjDb2NXFxy0dLOAPT4wdy9sC+eGJhobgdN5xlIaGBq+uMNKgo6oqX2m44AJ/eMJUGsTPIett\nSoOQhk99imfV1EmIQz4caXBwKGNEhSeKoTTopKGvjyXku+7Krn7pQM34dVLSIPXU17PioHdUOmkA\nPF9DX5+3n7o6boNJiCTdcWenV19bm504BSkNmzdzx7zXXvw2X4jSIO2YPZvJg04a163jGP2UKf50\nzIBHzqZM8dbpSoNOGkylob7e70vQIeuU8s6VCZmUSvYZhzToyZ36+9nrccUV9vDEmDF8veWYg8IT\nYoTUh4Q60hAORxocHMoYUeGJYjz89LfZ9nZ+m85yVIU5x0JvLz/IJc4dlzRIPfX1vH1S0lBfz0sz\nRCEkRg9PAPm+hJ4ebrutrZKFcvp0Jg1xPQ3SuenXWHwFH/0od9QyfwLgzddRW5sf0hGiog+j1ZWG\nMWO8Tl/uL93TEHQd9HVBSsOoUXwNZJ9JlQY9t4PNCFldDey+u5esKU54QuBIQzgcaXBwKGMMpKdB\nhrnppKEYEztJZ6LHr/XOJ+4sl0IaGhr8pKGz0zMxpiEN+lu3zMOgt1sg58bWVtlfXR2PRIirNEgO\nA71O8TN89rPc5rvv9n4T0qB7CARyfvRESXGVhjDSoO8njDS0tvL+kpAGW+ppm9JQU8PKi0ziFaQ0\nbN3K19yRhvhwpMHBoYwRlachS9IgD3ZdVZAOVX/jLhQy3C+INCQNTzQ0+MMTIovrpEFP6mOSBtPX\nIOd3xw6uTxINmQmawkiD7K+2Nn54or7ee1PWO+aXX+bj228/jsvLjJVAOGmQfdrCE6bSoJOGqiq+\nBnpWRx1xlQal+BwWS2nQSYO031QaxDjqSEN8ONLg4FDGGMjwhDxozfAEUBzSoKcZTkMa5E26osKv\nNIgiIEMugWBPA5CvNMi53r6dfxPSYCoNsl0YaRgxIn54orGRjx3wX+PHH+f4/tixPFxw8WJv33GU\nhjieBj08IWSqEKVBhqq2tKQ3QuqkwVQaqquBWbNYhdm8OVhpENLgPA3xkYg0ENF7RNRv+bs29/sI\nIrqeiDYT0XYiuouIJhh1TCWi+4iojYg2ENFVRFRhlDmKiF4kok4iepOIzij8UB0chh4GygiplNfR\n6UpDMcITMowvTGmIM+RSN1TaSMPo0YWFJ7JQGoQ0xFEaGhq8Tlyv87HHgKOO4s9CADZv5jq3b2fS\noHe4AtnnHnvwWzngz9MwZow9PCFhKhtpaG2NrzQAyUiDaYS0hSdk3xKeADhzZxBpED+IUxriI8Db\nGogDAei3wb4AHgKQG9SEawB8CsDJAFoBXA/gLwA+AQA5cnA/gPUADgUwGcBtALoBXJwrMx3AvQB+\nBeBUAMcC+C0RrVdKPZywvQ4OQxoDpTTodeod3GCEJ6qq4s0NIG/SQhrq6/OVBml3EtKgKw1tbcDO\nO/P3JEqD7G/ECPY0vPtu8HFILonGxvxOfP16Tkp05JH8fdw4Xm7a5O0/SmkYNw544w3O/tjfz+W2\nb2elwTRCmqRBr7Olhc/p3nt766JIQ3NztuEJXWmYPp0JxYoVXllRlkw40hAfiUiDUsqX/4uITgTw\njlLqSSJqAvAVAF9
USj2e+/3LAJYR0cFKqcUAjgcwC8DRSqnNAJYS0fcB/ISILlFK9QI4C8C7Sqlc\nQlesIKKPA5gPwJEGBwcNUXkaslIa9Dd7vRMdjPBERUVy0qB7GkTJ2Gknrx6dNEgHE8fT0N7OHWBD\nQ2FKQ1h4YuRIjzRIeELqlORFM2bwUkjD5s0eiQnzNNTUeHVKZynnP8gIGaQ0yIiQN97w1mWpNCQ1\nQtbWMiHbssULh8ixmpDfgfj313BFak8DEVUDOA3ATblVB4JJyCIpo5RaAWA1gMNyqw4FsDRHGAQL\nAYwCsI9W5hFjdwu1OhwcHHIYqPCE3snpnaiQhqzCE93dHgEJUhpkboIo2MITnZ08t0VdHX83wxN6\nZxTkaZB2idLQ0OAfQiiI42moqYkOTwh50cMT0gYhGzLJk04aJBvk5Mn28MSOHV7GRyCfNAQZIaU9\nphFSn1VSEJanAUjvaYhjhJT9tLTwdtXV9vbstJN/fdz7a7iiECPkZ8Cd/e9z3ycC6FZKmZx5I4Dc\nBKuYlPtu/o4YZZqIKEBccnAYnhio8IR0cjvt5CcN8jkrpUHveLNUGoQ0AMCVVwJnnslvqHFGTwSF\nJ7Zu5c/19f5kRYIopWHECG5D3CGXNiOkSRrq65nsCGmQ6b+DlAaZ3wPwSIPE+W1GyLfe8vI6mEqD\nZJyU8wYEKw319dxRF+JpkG2ClAbATxqqquztMad7d+GJcCT1NOj4CoAHlFIWfukDAYjzvhNWhmKU\nAQDMnz8fo4TG5jBv3jzMmzcvRhMcHMoLAx2eMElD1kqDThoKVRr0GQ4rKjzDYmMjpx8GwkdP1NTw\ndkGkQZz3DQ18XpJ6GmTfI0eGDz0UgqR7GqROkzQArDbIiAGR3YM8DTalQcI3o0f7lY3ubuCFF4DP\nf57XmaRBJsTSQ1lBpIHI69CTKA2S10FXhGxzT9iUhqoqu9Kgp9E26xuqWLBgARYsWOBb12Ky3gCk\nIg1ENA1sUPy0tnoDgBoiajLUhgnwlIMNAA4yqpuo/SbLiUaZCQBalVKRA62uvvpqzJkzJ/ogHByG\nAKLCE1k9/ORhvNNO/kmKsvY0yHNL7+TSKg1Sl3gajjySjYP6qIkw0iBKhD51NmCf8TGt0gBwx60U\n16e/+QvkGk6caA9P1NT4DX5CGhobvU5Sf0sXBCkNchyjRnn76ekBXnqJ231YLlAcpDTo1yaINEj9\nSY2QAB9HHCOk7KOlhddXVdn3MxxJg+1FesmSJZgrM6iFIG144itgInC/tu5FAL0AjpEVRLQngGkA\nns6tegbZKsV+AAAgAElEQVTAvkQ0TtvuOAAtAJZpZY6BH8fl1js4OGgYqIyQUUpD1uGJ8eODwxNp\nlQaARzoIYQCYGIwYYScNsi+ToJidb309nxeTNER5GqQdpq/CRF8fT6R0ySX5RsjWVr/KAHikoafH\n6zx1E6EgSGloaeHtamv94YlnnuF1H/0orzNHT9im3o4iDUmVBmlL1JDLIE+DXo9g113934cDaSgE\niUkDERGALwG4RSn14anNqQs3AfhFLs/CXAA3A3hKKfV8rthDAN4AcBsR7UdExwO4DMB1Sim5/W4A\nMJOIriSivYjoGwA+B+AX6Q7RwWHoolQ8DWnCExs35kv60vGOG+dXGvQHfRqlIaxT0tUE2/BOkySY\n34OMkHJubDkldKVBlkG5J/r7gUmTWBWQtn31q8BDDzFp0Dt+wB+e0JUG/b5YvZrPv6k09PXxeRs1\nijtj3Qi5YgVPsCXEJUhp0FEs0hCmNFRUePs1wxO2Ng1HpaEQpFEajgUwFUwITMwH51i4C8Bj4HwM\nJ8uPOZLxzwD6wOrDrQBuAfBDrcxKAP+U28/LuTq/qpQyR1Q4OAx7ZD16orsbeOqp/PXFUBpOOw24\n6CL/Oul4x44NVxqiSEN/f/7oiSBEkQZzX+Ybe1B4Iq6nIUpp0N+qdTzwAB+jqTSMH
5+vNJikYddd\ngSee8BMOUXB09ULvqHt6/CqNOXoiqdIg6kxc0jB2LC/Xrg03QurDKgeSNPT0ePOADGUkJg1KqYeV\nUpVKqbctv3Uppc5RSo1TSo1USn1eKfWBUWaNUuqflVKNSqmJSqnzdcUiV+ZxpdRcpVSdUmoPpdRt\nyQ/NwWHow6Y06J16UtJw3nnAxz+enzdASMPo0dwZSr3SMXZ1JR/bvnFj/jC9lhbuyGprvU6utzd5\neEJvYxzSYBs9AXhKw1tv+c/1AQd4ZWROCJMcyHXo68s/NzalYc0aJgIm9LdqgCemAvhchYUn9I5S\njkMp/z0RFJ4QL7moDTJjp5AQwK80iEJhImjIJeCd97ik4ZBD+No88US+EVIfcqm30fQ02NqUFWn4\n7/8GDjIde0MQbu4JB4cyhk1p0N/6kz78Xnklvz7ArzQo5XUWNtUhLnbsyB9qKFkF9Xh5GiOkLvWL\nETIIptKgd2CVlUxs9twTuOoqrz69o5b8CaYCoZ8P8zedNMjb+2WX8dwRTz7pL2sqDX/5C/CFLwAr\nV9pJw5gxPBzUVBrk+PR2bdHS9dlIA+AnDXqHW1PD5fv6glWSMKVBrmNc0tDYyJ3yY4/lhycAvi9t\nSkNrq/9cmG2SjJ56u9KQhrffZqPtUIcjDQ4OZQyb0qCThiilQSmeFVFPjWzWB/hJA2D3MiQNUezY\nka9oNDfzg16XvqOMkMuW5adh1t/6o5QGMULKuTKVBjmuxYu9+qqqgN135+9BpEEnVKYKYTNCSqf2\nne/4y9o61enTg0mDKAD627U+6kKfO0T3YQSRBhl5odcn+5FjM7NmCsJIg1zHuKQB4Dk2Hn88f5QL\nwPXYlAZRQUyl4d/+zZu1U0da0vDBB7z/OJOplTMcaXBwKGMEkQZ5cEY9/J5/Hjj5ZOC//ou/6yMO\ndISRBknmk4Y0BCkNSYZcHnEEMHOmf/ZNU2mI42mQOk3SIOdQ6pTO88EHOd+DjDIIUxrMjkT3NMhS\n5P2VK/1lTfUDYNKwZg0rCjbSIPswRwz09nqk4fvfZ0ldkEZpkPMhSoPueQCyVRoAnrlywwbeX5DS\nYJIGgBUV09NQWelPRKW3Ky1pALJNqV6KcKTBwaGMERSeEFd8lNIg20tHFUQa9DwNgD+p0/jx/nVx\n0NvLD/4gpSEsPGEqDdJpXn55fntl+0JIg3SIspTY/syZwBVX+OP+Ojo6vLdgm9Jghifkrd8sazNC\nzpjB7Vi+PJ80SKfZ3p4fntCVhjPP9GdDlM6ytdWuNASRBl1p0OdwALJXGmT/3d1+IyTgKQ1meAII\nJg02iEciqR9ISINkIh2qcKTBwaGMEaQ0xCUN8sCVB11UeEI6qOuuY5Wio8Ob7yDJG5aUtSkNUeEJ\n6dzk2IS0vP66V8bseKM8DbqR0yQoQhbkHJidJxCsNEinFUYaZClJs8yhl0FKg5QNIw1h4Ylx4/zb\nhYUnbEZIPTulnCN9tkggntKgVHzSIPX19PiNkEC40rB5c77qEkYagORqg2QIjZMSvJzhSIODQxlD\nOjqTNIgrPurBJw97edDJG6ONNIwY4U3idM01wMEHc8cknU8SpUEnKTqxiROekM9mFsB33vG3V0eU\n0vDnPwO//a2/fsCuNJixfcB7G9ePpaMjnDSYnoYkSsP06d52Zp6GOErDyJH500TreRp0IiIqShxP\nQ1KlQa5dGtJgUxpsRkjArjQE7VN+T0Ia+vs9MuaUBgcHh5KFLTyhpyOOUhrkYW++HQW5/YU0APxd\nD08kURrkwdrf7ycbengiTGmQ9dLWujpOWCRkwex4wzol6XTOPZeXJmmQOnWlQX+bBfIndgL4nI4Z\nY2+PzdMg9ff3++uxyfe1tcDRR/PnOEqD7mnYvNm7ZjoqK/n3NOEJIVRJSENFhXechYQndGXAZoQE\nkocnpL642LrVK+9Ig4ODQ8kiKDyRVGkwH3RBS
oNuHBs7ljsm6RiDHPQ26PvTfQ1JlAb92GfNYoK0\neDG3NQlpuPdeXtpkaz08oXsabEoD4Cdbzc08XwQQHp6wzYnQ3OzNyxDU/mOO8bdLIJ15R0ew0mAj\nDRUVfC2USmaE1JUGMzwRlqehEKWhuzs4PKErDTKzqZ5VtBjhiY3avMxZk4a//hW46aZs6ywEjjQ4\nOJQxCjVC6kqDXjYqPAEwWejoiL8vHfqDVVQOpbx4elSeBlkPcLlZs/jzEUcA//Ef3lu7qAhhnoZf\n5BLUi09A78D08IQ5ekJHEtLwt7/xMQtpkPkvZH8AcOGFPKpFrq+tgzv1VF5/+OH2toR5GoJIg4RI\ngpQGM7mT1Fmo0hBWzlZfmBFSb6M+s6WZpyGIqKQhDfrInaxJw8knA1/7WrZ1FgJHGhwcyhim0qBU\nMtKgexr0h10c0jB2LJMGWZfkIWtTGiQ7oOQ90MMTeictD32dNOiTDq1f720rcf+wN9lvfhP413/1\nSEGQpyFOeEJIgxgKJ0zg79KeNWs4o+PKlX5PgbRTOuvVq3looRyjrf0778xt2W8/e1t0pcEccmma\nIGUfYsYMMkIOtqdBnwsjjhES8NSxYioNxSQNpQZHGhwcyhgmaeju5o4mbnhCHvbbtnFcVmAbcmkj\nDXoHWqjSIPusrk4enqip4ayNAMf4pZOW9kZ1Srp3IQ5piFIaROY3SYOueOg5DYRASGfd2spKQZjS\nEIQopWHDBp4Ay4SuNOjmyiAjpG30RLE9DXGUBj08AXj3QDE9DTI6o7HRkQYHB4cShhmeEDNiUqWh\nudn/thRkhNQf7k1NvN9CSYOZG6KyMnl4orqaM0MedRSTEOng4ygNQDBpqKz01keNnpDfAK/zNcMT\nehjJpjRIp9vSwoQuzNMQBD2xl+lp6OoC1q3z52cQVFR4x6p3vHHzNFRV+WfNBIo3esJmhAxSGoJI\nQ5bhCcnMOXKkIw0ODg4lDFNpENKQ1AgJAN/6lvc5KDxh7lsPHSQlDfLwNpWGqqroNNKyf8DrKCoq\n+Li3b0+uNFRXZxueENJgKg36edXPp6k0tLSwUhAWngg7FvOzLNet4zZOmZK/XUWF106dHIRNWCXH\n1tnJxCdpRsi0SoMtPBGlNBQzT8P27XzvZa00JE0wNRBwpMHBoYwRRBqSGiEB4OmngX/4B399gq6u\n/IexHk7Q2xKFNWuAV19lI2VVlUcapIMU0hA0y6WpNOhvwDppIPLaHGaElH1GkQZBnPCEeAPikgab\n0qCThiThCf06mXH8997jZZDSYCMNcT0NdXXeMUlHXSxPg54a3MzTEFdpKAfSUIqqhSMNDg5ljKDw\nRBKlYffdgd/9jj0Nt97K6+MoDdLJJlUavvUt3t/Ikfxnhieqqrw8DUrFVxoAjzQIyYmSogVVVcET\nVpmJotKEJ/SptwVhnoa2Nv/skVkpDTKxl400VFZ6pEE/B1ETVsnoCV1pMM2HNkheCCC50qB/Dhty\naWuLLLMMT+ikIcuMkPospKUCRxocHMoYWSgN9fXAl7/MY+z1NzkdNtIgHUxST4PMc1FdzXHgoPAE\nwB1nXE8D4Fcaamry495BsI3OMD8DXuKluKRBRinEVRr0UQuA96aZxggJ5Bsh332X9zV2bP52ccIT\nUUqDSRqiwhNpPQ36NmFDLoHyVhp00lAqoQpHGhwcyhhZGCH1N944pOHSS3lpdjBxH2pr1/Ly7be5\nU12zxr9PMUIC3KmEKQ2iRNjCE7pxMy1pMMlBe3s8T8O2bXwNamq4Dhtp0GEqDQK5noUqDXp4YupU\ne7hGnz3UpjREzT1hUxqiwhNpM0Lq2+hKg3k/A/mehrikQVeFojAQpCHo3hloONLg4FBm6O8Hfvxj\njnsXaoTU8ywA/nHwOnTS8P3vAyeckK80BO1rxw4e2SD1yCiNvj72UNx3n9cpSRv0DilMaTB9FWZ4\nwnwbDYKtM
zLXy7GEhSekPc3Nnj/BTIktEN8DkO9pEIgKk5Y0mEqDkAYbgo47SGmorOTzWojSYPsc\nBht51JWGMNIQNzyRZu6JgSANprdmsOBIg4NDmWH5cuCii4Dvfa/w8IT5kLXNoQB4b+4CPf4dpTR8\n4QvA3nvz5/XrvfW1tZzoaMsW4Mkn842QQLTSIORGJw1tbXxcSTwNekcbpjS0tcUPT0hKZZ006OfV\nNvzSJA1pwhPmcEl92dcH7LKLfTv9HJlKg400iNFUVxrkOCR9c5TSYNt3GGzb6OEEfSIwgelpKNfw\nhCMNDg4OqSATPL3wgj08UV3tdRxplYYoT4OevyCKNLz0kvdZQhMPPcTx9QMP5BEGjz6ab4QEopUG\n6aSlDUKWtm3LJjxhdiw7dsQPT9iUBjnGc88Fvv51b/soT0OhSoN+fOb8EIIgpSHICAl4x2YqDXFG\nT6RRGmzXyQxPmN6bgfQ0NDQ40uDg4FBiaGnh5Wuv5SsNO3bwg0sk26COvKkJ+MMf8pUGeWDGIQ26\n054of1+vvMJhDP0tV0jDoYdyGmQiJg3NzfbwRJjSEBSeAPhhm4URshClQUjDqFE8nFWfvfLss+15\nGoKUhqxGTwD5s2IKkoYnAC8JV1pPg23fYYgyQsbxNBRz9IQQrKzgSIODg0PBENLQ3e35A+RB9dZb\nPA+D/vZloreXH3LnnccPIl1pIOKHalSeBhtpkIdsWxt7Ll54AXjgAa8NnZ1MGiRznkB8CGFGSFun\nHhSeADitr640JPE0hJGG1tZ4Qy6lEwGAn/0MuP9+YMECfwhGh3R0jY3+zixNeKKiwqtDN//JOdDP\nvbmdII4REshXGuReihOeKNTTYDNC2sITxVYalPKud2VlMgNlFHSi4EiDg4NDKghpANjfAPADva2N\nO+oDD/S/fZmQhE7vv8+fzYesvFnqSKI0PPkkey5WrPB+BzissnZtvhHPJA1JjJBmeKLYSsOGDbyM\nCk+0t3sd50kn8SiR1av9x6hDrkFtrZ+cpTFC6u0xFQMgWGnQjYWm6tDdzfeSjSzpnoaddwZuvBE4\n/vj8/QftD8jGCCl5LaI8DVlnhOzq4us6ciTXnSVp6Onx2l+2pIGIJhPRbUS0mYjaiegVIppjlLmU\niNbnfn+YiHY3fh9NRH8gohYi2kZEvyWiBqPMfkT0BBF1ENEqIjov3SE6OAwtSA4AwOu4ZYjf0qVM\nGoKUhqef9rICAvaHbJDSEJc0SEcnpkx5OHd0sDIiWRIFYaQhyggZFZ5IktzJrF//3NDA53fduvzy\n+v510iAPe/ldJH5zH4A/k6J+ntMoDYBHPHRyI22LCk+Y+6qu9oimLTzR1eUpDUQ8jbPso5ieBpMQ\niscmrqchq/CE3O+iNGQZnujt9Xw6ZUkaiGgnAE8B6AJwPIDZAL4DYJtW5nwAZwM4E8DBANoALCQi\nPU/XHbltjwHwTwCOAPAbrY6RABYCeA/AHADnAbiEiL6W7PAcHIYedKVBOm5BXx8wd26w0nDKKcAv\nfuF9N42QgD0uG0Yaqqr4QSukQTo6Mx7f3u4fVSAQ0qBL90mNkFHhiUJHT1RVcXbHQklDVHiirs6v\nNKTxNOjtsb3pR5EG27EFkQa5dkH5PorpaTA7fzEID3R4wiQNWSoNomAApUMaQsQjKy4AsFoppXfe\nq4wy5wK4TCl1DwAQ0b8A2Ajg0wD+RESzwYRjrlLqpVyZcwDcR0TfVUptAHA6gGoAX1VK9QJYRkQH\nAPg2gN8mbLODw5BCSwu/+cqbvI7aWmDffb2ESabSsHkzsHGj933VqnhKg23IpT5Xg+5pkIeo+Zbc\n3s6KyOTJ/rptSoOuJgQpDatX53d08oCVjJBxJ3yKytNgkoYk4Qn5vbs7ODwRpTSkJQ1mO4FoT4PZ\nmVZVeaTBrG/8eGDTpnzyaSZSCtuf+TkMYUZIaWMQaTAJWzmQhp6eMlcaAJw
I4AUi+hMRbSSiJfrb\nPxHNADAJwCJZp5RqBfAcgMNyqw4FsE0IQw6PAFAADtHKPJEjDIKFAPYiImNQkoPD8EJLiz0N8FFH\nAc8+yw9Nm9LQ2cl/Zj57U2nIKjwhSz08EaY06NK9/uAOUhouvpilcMDrpOrqvN/TJncKUhomTQpW\nGoj8EzBFhSfClIYswhMDpTRMmMAhp6B8H1krDWHhCWmjGZ6Q6yDXppjhiaw9DaWoNCQlDbsBOAvA\nCgDHAbgBwH8R0em53yeBO/+NxnYbc79JmQ/0H5VSfQC2GmVsdUAr4+AwLBFEGmbOBPbfnz/bhlxK\nWGPzZv92UUpDby8/QMNIQ9zwhJ6/QGBTGsysjzalYft2nmQL8DopIq9TzDJPQ1R4QtrQ08Pnoa3N\nTxpkaGKQp6GhgduYtRHSpjQk9TSIbwHIP25RGtra7Pk+BjIjZJTSYBK2YikNWXoaSlFpSBqeqACw\nWCn1/dz3V4hoHzCRuD1kOwKTiTBElZF3hRKZtsPBYXAQRBomaXTaZoQUA6UoDfLAtykN+ugJm8FM\nl2GDlAZbeELPXyAIIw1hSkNXl/dg1zuziRN5P1kYIc3wRNDoCVnX0+ONNEjiafjc5zhTY0WFd56J\nClca0pAGs202MiAYP55DXB0d/rBTsZSGsCGXQaRBH76r15GENCxeDMyebQ/tyDVyngY73gewzFi3\nDMBnc583gDv3ifArBRMAvKSV8fmniagSwOjcb1JmorEf2cZUIHyYP38+Rhlp1ebNm4d58+aFbebg\nUDZobubprAXSYeukwRaeENIgU1EfeyznDjDlXNMIKaTBzNMgqKryexrkIWqOntiyhTtUW3iip8cz\nssX1NOgmUL1znDSJh3smGXIZ1whprjfr0I8jSXhi1CjgH/+RP8t5Hj06vadB6kgSntBVFR02r4Jg\nwgSvs951V2+9vB03NCAQWWWElHs9yAipD9/Vt4sKT0jnrxRwyCHAMccAjzySX16uUUMD1y0TqUWF\nxOJAZnCtrc2WNCxYsAALFizwrWvRHdYhSEoangKwl7FuL+TMkEqp94hoA3hUxKsAQERNYK/C9bny\nzwDYiYgO0HwNx4DJxmKtzOVEVJkLXQAcDlmhlAo9squvvhpz5swJK+LgUNYwlQZ5w49SGvQJkgCP\nNIhpUmCGJ4KUBv1zmNIgbZF5J2xKg96+JEqDQO/MpHPPMrlT1qQhrJOU8zxmTHGMkCZJFASFJ6KU\nBsH06d7nj3yEh/fKnCM2ZK00yHk3j89UGqLCEzphBTxStGiRvXxbmzebqdTd359cHbJBsnDW1WVL\nGmwv0kuWLMHcuXMjt03qabgawKFEdCERzSSiUwF8DcB1WplrAFxMRCcS0b4AbgWwFsDdAKCUWg42\nNd5IRAcR0eEArgWwIDdyAuAhmd0AfkdEexPRKQC+CeDnCdvr4DDkEBSe0Du1MKVB8IUvAGeeCZxx\nhn+9kIYLLuB64pCGME+DPKzjkoYoI6R81h+iZngCyCa5k+lpsJUXRJEGGT4alvAI8M6zrjRkaYQM\nQtrwBMDn2lS6DjsMoUijNNi2iQpPmJOwJQ1PmP83JvSRMlJnVr6GYikNhSARaVBKvQDgMwDmAVgK\n4CIA5yql7tTKXAUmAb8Bj5qoA/AppZQ+ovxUAMvBoybuBfAEOK+D1NEKHpY5HcALAH4K4BKl1E3J\nDs/BYeghzAgpsBkh9YdfRQU/6G64IX8IpJCGK6/k73GVBnPIpXSesr2QBlt4AoivNMhn/djM8ASQ\nbUbI6mpgzz3t+9PX6aTBHHIZNH+DCXkzHjOmOEbIIAQpDbb8CwJJ1DVtWvI2plEa9O3iGiGlzWlH\nT+gK3aZN+eX1kTJSd1a+BrlfSok0JA1PQCl1P4D7I8pcAuCSkN+bwbkYwupYCuDIpO1zcBjK6Ozk\nN9YxY7x1Z50FnH++v/MPM0ICXvY+G0w
jpMSC44YnzFn+TNIQpDRI++IOudRhC0/oczBkkadh3Dj2\nHrS0FBaeSKI0mL6QuBhopUH3MyTdn/k5CjKsMe6Qy1mzgDlzeGZR2R6IrzTopGHJEi9FtqCtLV9p\nyJI0lLXS4ODgMLgQE+OoUf4OzXxoR4UnzBETOpJ6GsQIaXoaBEI6opSG5maPgOhmtKAJq8w2C0Rp\nkPqA9J4G0xx4yileu0wIMZCOPmjIZRQBqKnhuvTzndYImYXSEGaEHDmS96X7GeJioJSGujrgxRd5\n9INtexNhpGHlyvzyNqUhy/BE2SsNDg4OgwczkUxvr/2BG2WEDCMNQaMnknoazO3Xr+cOxnyo6+EJ\n8y2wr4/rTaM0NDdnO3oC4BkrJ00CPvpRex1RSkNcT0Ntrb9NWaaRDkIapYEI2G8/4KCDkrUPKJw0\nxFUaTMgxxA1PCNnWM2Pq0JUGqXsoKw2ONDg4lBGENDQ2hsdms1QahAToD2PzzdzmaRAIaVCKQxPm\nW78Mz9u2Lf+YzDi0+Vlg8zRs2+aFceKGJ4j87TNJw8iRwI9+ZK8jq/CEEKsgIhMHNk/D177mb5OJ\nNKQBAJ5/PlnbzP2Zn6Ngdvr66IkRI6JVpTRKQ20tX3u5tjqK6WlwSoODg0NBMJUGIJw0mJ4GCSNE\nkQbd0yD7DMrToHsaenr8QyEB/3czNCH7q6vzKw1hpCFKaRBz3i67ePuOSxps8y7oyzCYpMGU9WXu\niSgCkKXSoNdx443h2wR1plGkIS2yDk8IaYi7fRIj5E47cd020qBn/3SeBgcHh5JCXNIQZISUDjWK\nNOippsVHEcfTYIYmAH8Sphkz7PtsaPAbDJMoDboHAmBy8+yzwDXXJM/TYNYdlPDIBt3TMGKEv64k\nSkNjI1/fLJSGLMITtjklskBapSEsPGGGvmxIY4QcPZqJQZDSEDTk8qWXeLr6tBCloaHB+z8cbDjS\n4OBQRkiqNOjhiZYWT7qPIg3vv+99jyINuqfBRhp0BCX7kbe4NEqDrSM75BB+yCdNI12o0vCHPwDf\n/GZ+JsQknoZvfQu4/fbClIZiGSFLQWkwr5VuhIxDGtLkaTBJw6pVPHJk/Xp/eML0NMyZw54Pc7bZ\nMDQ3A088wZ9FaRg3Ln+iucGCIw0ODmUE3dOQVGnYscMzCUaRBhnpoO8zTp4G089gIow0yFuV3n4z\n9a/5GQjvGJMOuSyUNAjM85tEaZg0CTjggPIwQhaCwVYakoQnTNLw5pusxq1ZE2/I5TPP8HLt2ugw\nw803A5/6FH+W/4mxYx1pcHBwSIEdO7wZEZMqDTppCDPEVVf7FYMkngYZbhj0QA4jDXq9SUhDWEeW\ndPREVqTBzCKYZMilrb4sjJBRCFIaqquDCUUhyGr0RFpPQ5TSIB2/eBp00iDXt60t2AiplHf+Fyzg\n/8WpU4Gzzw5v34YNXGdnp0cyx43Ln51WR38/cOed2c6wGQRHGhwcygjbt3ujDZIoDRI6iOtp0NHa\nyp2e7gsIIg3y0AoiJTJW3oQ87M03/rThCbNsXKXBLJfE06BvK+RJb2Pc8IS+ja3uJNtmoTQQefdL\nKXkaTPKQVXjCnHtCVxpkyKUMX25rsxshe3u5k5f7d+tW4K23+POrr4a3T7JOtrZ6aaTHjmUi0dHB\nc3qsXu3f5tZbgXnzgPvuC687CzjS4OBQRti+3ctrEFdpuOce4IEHuMNKSxrMNzjTCCmeBnk7C6rf\nzAYpMElDEiNkHNKQ1gipp5GOQtiboD56YqBIA1EyhSJIaQC861kKSkPQkMtihSdkOvcwpcGWp2Ht\nWv48ejSrBjI0VZ+h1ga5j1pa/EoDwCGKww/PzxNy553+75s2sRE4iZciLhxpcHAoIyQlDUpxQqIf\n/5i/pyEN27eHk4aKCs/TEEQavvSl/Nk0dSQhDSYBCOvIBtIIuW5d8G+6p2EgwhMy62IShIUgikEa\nslI
aihWeENLQ3c316qQhSmno6/Puh5kzeejvCy+E71cgSkNLC9cjSgPg+Rr0RG2dnd6U3RJW/O53\ngfnz/YbmrOBIQ5Gxdi3f1M89N9gtcRgK0ElD2MNPD0/s2OE9wJKQBkmMFKY06A/uMKWhqQmYMiV4\nn0lIg/m9VIyQYh69/Xbgrrv8vyUxQurbCNIoDUlDCWGhGHmDT0pe4uwPGFgjZFLSoBSvsykNW7bw\n76YR8o03gD/9ib/vuqufNLS0hLdPlAYhCLrSYKpZl14KLFzo/d+Zs6KGqV9p4ZI7FRn33MPL557j\nYWAODoUgTXhi+3b/vA/V1dFppAF28W/d6nkadCQlDXESGpn1AR5pMDuyigpvX1mQhqCOJImnQRJJ\nnXZa/m+FehrSGCGTkoao8ERFRXLyEmd/5ucomPe93CuiCCTdPqhdQhr6+4NJg3TKptJwem46xtpa\n/m3jRh6mCUTnW5A6ZakrDaJCVFezwvHDHwKHHuptK6ZlSaK2cWP4vtLAKQ1Fhgy1ydJA5DB8EZc0\nmHm4WTgAACAASURBVEqDjEJobGQHd9hbv9QrOR3CwhP6W18YaYgz34JZrqLCPnrC/B5n9ESUp4GI\n6ylEaTjrLG/WRxMDrTTss0/y+SCiwhNZP8OyyghpJvZKur0JG2kgsocnpBM38zSIWbmzk+/tri7P\nRBmmNPT0+FUMqbOpiZdCPOrrvX0vXszkZPJkT2kQ0rBhQ/C+0sKRhph44QVg33392e2ioJQXa9q6\ntTjtchhe2LEjndIgaGwEXnkFOOOM4H1IvTI8Myw8oc/Z0N/vjZ5IqzSYc1oEhSf0Y85CaZB9F0Ia\nfvUr4IMP7L8VOuQyKWn49KeBhx9Otk2U0pCln0Hfn/k5CkHhCf23MEgoISiUkSQ8IdfbDE8IafjL\nX3g/XV1MIGprw5UGPReDfBZT67hx3iybdXUeaejv51lGR470/tfl/995GgYRb74JvPaad6HiYMsW\n76I50uCQBvfey528IO6QS5006MP/Ghv9k13ZYCoNLS3BJCAoPGEOuUyrNAyUp0H2XQhpCIO0sbMz\nudJgTqJVLEQpDVmThqxGT9gmGAvD0UcDf/+75xMwUYjSIMfU0cEJmj77Wb63Ozt53cSJ4UqD3r/o\nSgPAIQpRGnTSAHB69pEjPaVBQmVOacgIW7b4M97FgW1u9SjoZR1pcEiD888HbrjB+540PLFjh3/Y\nlZne2AZTaeju9vYpSBqeSKM0hJEG/ZizGD0h9RSSpyEMQgA6OpKThix9BGEoN6XBdm3jKA2VlTxs\nMapdNqWht9cfQghSGjo7ves3YgT/H/b38/9Ua2vwUEjdxyCf5bzrSoMengCYNDQ2ekqDZJ10pCEj\nXHyxZ1SJC7mBknT+QhqmTHGkwSEdmpu9WCiQ3AipS6F6kp4wSH1jxngPvijSEKU0JDVCSjuyUhri\nvKkPhNKQhDRETayUNcKUBnPWzSyQdUZIIBtiY2ZTFaVB/nfa2qI9DV1d3vmqrfVIxsSJfE8HpZKW\n+mbM8IcnAFYa9PCEHgoLUhpceCIjtLZ6Mk9cyMMwSecvN8puuznS4JAOOmkQU2Na0tDYGK/zlHDG\n6NGesUxCIgIbaQjL0+DCE7zs7EzuaRhopWGgwhPFUBqyIg1yPwP+0RMAd+ZyX0q4Qv4/9Gsr/zv6\nlNq6T8iGRx5hQ+PUqfnhiXHjvGeBhCemTOHU7Icf7lcaXHgiY/T2BhuWglCI0jBzZrKwhoMDwA8m\nyUEPeBLnqFH8PUp6J8onDXGgD9uSzjxIadCNkMUITwSNnojbUSQhDdXVpaU0DDRpCBtVUC6ehqxU\nmYoKe3gCyM95UlPjjVawkVndRBxGGjZvBm67DfjGNzgDpU1pEAhpmDoVeP114LDDnNJQVPT28sm1\nzY0ehLSehooKTu7hlAaHpJAHi7xdyD0kD484482LTRqK7WmIGnI5e
zYPLwxCoUpDKXgaSiE8MXly\n8HDSQvcHJDN6ho2eyIrY6HlAdCMkALz7Li933ZWXkyZ57beRBn2UhpAGmxny8cf5BeHLX+YXA5vS\nIBDSoF8Tm6ehpSX7VNJDjjQIwwqDDAtLMhIiKDyxdi3wzjv2bSRn+dixvJ1+8R56CJg1qzi5wR2G\nBuTBIqRBHiKSqTGKNGShNIjEmtTTkEV4Is6Qy1/8Avj5z4PrTWqELGTuiTDIeezoKP3whK193/mO\nN3w8KyS5Nrbt0hoh4+4jSGl49lm+3yUPxs47e9vp929SpUE6+jFjmDRImFDq0UmDUqyW66TBpjQA\n3v9QVhhypOFnP4suI6QhSbasoPDE1KnBE5DI7GijR/uTewDAlVcCK1YUJ2OXw9CAkAZ5mMi9l5Y0\nxBk5AfgTxMRVGkxPgzkGPuvRE3EVgHI2QpaS0lBTwwmGskRa0lDokMs40MMTptLw7LMccpbwxOTJ\n3nZxwxM2pUFUtepqLwQJ+IdcCvr6+KVX2gDYPQ16vVlhyJGGJUuiywhpSOJrSONpEKVBHvL6tkI0\nli+PX5/D8IKpNCQlDXp4YsyY+ErDccfxsqkpnaehoiI/GVRcpSHu6ImwDs5WrpzDE6WgNBRzf1kq\nDcUgDabS8MorwH77ed91pSEqPCGdvE1p6Onh/VRW+v/fbEpDXx8/D+RZAHhKg1JMGuRcDCppIKIf\nElG/8feG9vsIIrqeiDYT0XYiuouIJhh1TCWi+4iojYg2ENFVRFRhlDmKiF4kok4iepOIQvLX+bFm\nDbBokXfBbUhDGtKMnhClwUYapk3jpSMNDkGwkYbKSu+NL67S0NDA92Fc0nD++fzwqaiIH57QPQ2V\nlcFzVQRByhdLaUibpyFrpUF/mMfdphRGTxQDhYYnBsoIKUqDpHIGgP3395S7KNKgE+imJl4vap6O\n7m77EOcgpaGnx/9/1tjIbe3oYHVSnhOloDS8BmAigEm5v49rv10D4J8AnAzgCACTAfxFfsyRg/vB\nE2UdCuAMAF8CcKlWZjqAewEsArA/gF8C+C0R/UOcxvX1AcceCzz5ZHCZQsITSY2QEp4wt5Ub3ZEG\nhyCY4YktW5iAyr0TV2kYOZK3iysvE3kPxCThCRtpiCuxRxkhbRNW2dabKLXRE0ByT8NAv/mXutJg\n3vfFVBrEc1ZRwf8TP/kJf58717vHdQXA5mnQlYbaWr9fAuDPDz7IJMBGGoKUBnPyM9lmxw4mp/K9\nFEhDr1Jqk1Lqg9zfVgAgoiYAXwEwXyn1uFLqJQBfBnA4ER2c2/Z4ALMAnKaUWqqUWgjg+wD+nYjk\n8M8C8K5S6ntKqRVKqesB3AVgfpJGhqXqzCo8EWVilPCEfjEF8gblSMPwgFLAr3/tT+kcBcnzoSsN\nuhwZV2lobOR5ES68MHm703gadNIghshi5WmIMijq7YuCLTxRUwPcdBNw4onR24dBb6dTGhilbIS0\nkQaADaFvvgmccIL3v6z7D/S22JSGujpuoyjXAPCHP3DK6YUL7cqeXI9Ro7zj6+vLn8dElMTt2/2k\nIc7ggCRIczvuQUTriOgdIrqdiKbm1s8FKwiLpKBSagWA1QAOy606FMBSpZQ+y/dCAKMA7KOVMX26\nC7U6YiEsjJBVeCJqyKYoDfrFNNvgSMPwwPPP8/jrOEZdgS08oZOGOHkaJAX0gQcGG3bDkCS5k01p\nMHPyByHt6IliexoA4CtfCZ6nIC7KgTRk5d+Ii0I9DQNhhJSXRX0fe+zB3+X/UycNRF67bKTBpjRI\nuuj33w8PTxB5IYreXq6jHJSGZ8HhhOMB/BuAGQCeIKIGcKiiWyllWjw25n5DbmkGBTZqv4WVaSKi\nyNnSn32WT1bWpEEuckuLt/3mzcHlAX5THD3ak3p10iAPQzd6YnhAMrMleajp4Qml+J7W45pxwhNA\nfC+DDXGNkKanYfp0bqtp2ozaT
9I00gNBGrKAHnt24Qn/fgpVGorpaTCVBh3/9m/AKafwBFi29pnh\niZoarkfPAaGX7+iwKw064ZT/f1vYTv6HZEbNYnkaEvGyXDhB8BoRLQawCsAXAARk0wYBiJONIKwM\nxSgDAPje9+ajt3cUbrgBeOwxXjdv3jzMmzfvwzKFeBoAJgPjxvmnMbWVl/BERQU/uPXwhLShs5Mv\napx54B3KF0IwbW+sd94JXHGFfzZLwB9i6+pi0rDbbt66OOEJIP5QSxvShicOOoiP+bDD/OWi9pM0\nuVO5kIZyUBoGOjyRladhoJUGwYQJ/L9ra5/uT5B7W8iDGZ6QNre32//f9GP67neByy+3/1/I/np6\n+HkRRhoWLFiABQsW+Na1hMX09eOLVSoASqkWInoTwO7gkEINETUZasMEeMrBBgAHGdVM1H6T5USj\nzAQArUqpSM509dVX41//dQ4OPhi49loegnnwwf4yhYQnAA47jBvndQTmAxVgQtLXB+yyi1fGpjQA\n3DlknW3NobQg94rtrX/VKi/LnA79f7ijg0nqgQd66+IqDYUQ0qDRE+ZkShKe6O3NDzHoyyCkneUy\nyzwNH/tY8ch7IaTBKQ3R20knX0wjZNL2maRB/D36yAzAa3NHh/d/FqQ0fOUrwOLFnD1S3xbw7t3u\n7ujwhPkiDQBLlizB3LlzI4+vIA5LRI0AZgJYD+BFAL0AjtF+3xPANABP51Y9A2BfItLft44D0AJg\nmVbmGPhxXG59LIwZw29lP/oRcMgh+eSgt5dZ2KZN4UMzdejlJPQhSoOMjtDx3nu8nD6dl3riDWmD\nICbBcyhjyL2ik09Bf7//fhC0tHgPmo6OdEZIvVwajBjBD6OgIZRBngazjWmNkAOpNPzwh+nMonHg\nlIbg/RUantDryDo8Ic/9QkiDKAxBSoMengjzNOjlxdxoIw2iNJSEEZKIfkpERxDRrkT0MQB/AxOF\nO3Pqwk0AfpHLszAXwM0AnlJKPZ+r4iEAbwC4jYj2I6LjAVwG4DqllLx73wBgJhFdSUR7EdE3AHwO\nwC/itlNIw8sv83fzQd3bywpAf394iEGHzHQG5JMG23TDMoWp5Ce3KQ1yszjSMPQhKcttKV2DSMOO\nHZ4C1dnphbsEcUlDISmQR4ywK2lReRqCyoXtB8heaUj7Nps1ChlyOVSTO8lskoWGJ2y/FQrxHYSF\nJ4IQpTSYRkhdaZBt9D7F/P+trAwPT3R3l1aehikA7gCwHMCdADYBOFQpJV3vfHCOhbsAPAZWIE6W\njZVS/QD+GUAfWH24FcAtAH6olVkJzvVwLICXc3V+VSkVO/O5kAYZEmOqCb29XurPuCGKvj7PhCKk\nQSRnm1qxcqV/bLyeF1zaIPU50jD0If4ZGzkQ0mAO4e3s9EhCWxu/MchoBCB+eKKQB2lNTThp0J3d\nuqfBbGM5GCGLiTRKAxEfZymkkS7mPrNQGuQZPJBGyCCY85WYngbTCGlTC3SSYu47Smlob+d2F2v0\nRFIj5LyI37sAnJP7CyqzBkwcwup5HDyEMxXGjGEVQDeG6DBJQ9gseYL+fg4xtLbmKw2mkvHkkxx3\nktAEYFcaxo3j/TvSMPQhoydsSoM+m57+0Ovq8oyTkhhMfwMZiPDEZz7jv48FccMThSgN8kYlb6Q6\niuFpKCYqK71zlOR6VFcP3fAEwOclC0+DkIas2i5qQLGUBlt4Qt8mqv4wpUH6mZIYPVEuEKVBmJb5\ndpdEaVAKeOYZ74E+Zoz3AA9SMo44gpcnn+yta2z0j9ZwSsPwgsxrH6Q0yG/6Q6Cz0xsDbiMNUdK7\nOV48DY48kv9MJPU0pA1PmMZKc//lojQQcer4VauSvQ3bslQWC3EJXpZIozQMVHgiKyMkEasAutKg\n9xn6+Y5jxI1SGoQ0lEqehrLA2LF84iSjnvl219PD5sXaWuDRRzl3+A03MAkwCcDNNwOHHw489xx
f\n7NGjPaVB0vvqrNFmdAPsSkN9PbfBkYahjb4+j5wGeRqknI6uLi88MVhKQxCSehrSGiH1OnTEJUSl\nQhoAYO+9eemUBg+FKA22+6JYRshClAaA7+8gpUEPS8ZVGmykwVQaSsIIWS4Qh/natby0KQ3V1XxS\nf/Mblo6feoonIbnpJn/ZN3LTcckEPqJiAN7FMHM4CP7xH73PNk+DTIH6xz8C11yT7lgdSh87dngP\niSilQUdXlzcyR+45G2kIelAWsxOIytMQt42CMNJQiNJQKkZIoHxIQ6krDWHXtBSUBtPTAPDLoT56\nQu8zdAIRlzTYPldU8HfpZ+rr/aGMrFAC/0rZQx+WBtg9DVVVnqN99Gh+GK9fDyxd6i8rrK2hwQtP\nhCkN4nNYtAj42te89TaloaqK3ySfeQaYn2hmDYdygs70wzwNJmnQjZBhpKGYoyeCYDNCFuJpqKry\njH+COEpDuYQnAGDWLF7a7oEgDGR4olyUhrD7vhSHXAJ+pcE0QuoEIm54QmBeq5oar5+R4dLO0xAD\n4lcQ2JSGqirOrNXbCzzyCJ/Y7m6eWluHsDaZU33MGJ6wBLArDUIazGRNtjwNDQ3+vOU23HADpynd\na6/wcg6lC500JFUazPCEPmNeOYQn4o6eIAIuugg4RsvQEkYaxFgY9TAvFSMkAOy5Jy/feSf+Nk5p\nyEcY0RjIjJBBCCINQXkaClEazOOtri4+aSgB/p09JAujIIg0XHQRJ3SpqeFhKn19+aRBLkB3d354\nIkxp0OcIALzwhMhdojTYhrPpOO884K67wss4lDailAYbaVCK77nGRr7v0igNAx2eKERpAIDLLgNm\nz463bWVlvOMqJaVBRmntvHP8bZzSkGybUhhyaSMNTU3eC6JphNT7j6RKg3m8utJQW8vEwSkNMTBi\nBDBxojdaQX9Qy41guk5FUQgiDZKMafRojxgkJQ39/ZzAo77e8zS0mtN7aejr43bJLIcO5Yk0SoNs\nU1vLRKFUwxNxPQ1pOqKo8ES5kYaxY4G33/YSvsWBUxryUVfnV9x0lILSYPM03HGHFzY3lQadQJgp\no3/3u/z6ndJQJEyd6n3WH8by2ZwdTEjDBx/4H/I6aaioYLlYRjsEhScaGvzToQKeoqDXV1XlDcWz\nQco60lDe6NSmcourNMi9JbHQNKMnBkJpyMrTYENUeCLOcZWSERIAZs4sXSNkXHNp1vtMenynngo8\n9JD9t1IwQtqUhlmzeIIrvW5BkNJw0035Cd/0+oF4ngY3eiImdNKgP6htpEE/0YA36gLID080NXkT\ngnR28o1hKg2mygB4ExVJfRIikaQ/NslSVAhHGgYWPT0cf3700Wzqk3/a6mq70mAzQuqkobY2XZ6G\nYnoazA49ytOQNWlIqjSUgqchDVx4Ih9NTd7sqbb6skDWngbz9zhKQ1T95mfZ3ikNKZFEaTBJw5o1\nwI03ct4GMzwhWba2b/fS+ppKg400yPTEkhBKpk598EH+bmOUjjQMDu67D3jrLeC227KpTwhAQ0N8\npUHUidpaJpwy0ieNEbIY4QmA/x/iehqyDk+Uo6chDQZSabDJ6sVGmvBEGIoVnihUaTB/L5YRsqbG\n6zfq6hxpSIRClIZ77wW+/nXgiivywxNCGlpb+cFeX++/6Js3e6l/dcj+9LfKqip2i19+uT0plCMN\nwfjyl/kaFQM33sjLPfbIpj4hDY2NyT0NI0awgaqlhT/rD6/BDE8A+aShkDwNNgw1T0MaDKTSsPfe\nnMxuzpyB2R+QTmmIqi8LyLDIrMITZt3FNELqpKEYRsgy/VeKRlpPAwA8n5uTk8hbbyMNSZQGubhy\ng4jSIL+FkQY9Jl4ILr0UePXVbOoabNxyi9e5Z42ncxO5Z0XW4ioN+j2gkwYZdmnOpjqYQy4Bu9Jg\npn0uptIQ561sKJCGgTRCfulLpW+EDEMpzD0hbQgiAFmGJ2x
GSElEWF3tlIZEOPpo4Itf5M9xlIb2\ndv5MBLzyCn/WO38JT4ihMUhpiEsa9IdrFGnIqvP64Q+BY4/Npq6hDCFpWZG1KKXB5mnQwxNpSUPc\nVMtpoYcIiuFpCDNRDhdPQ01N+RKeOMhaach6auxSVxps4QmAnxUy50XWRsghOeQSYKfqLbcAd94Z\nz9MgmDoVWL2aP+sX3VQaWlqYwTU08I2lFF+k7dvtuRdkf7L/JEpDFqRBbn4x1DnYoZTXYWetNBQS\nngCSkwazXNYI8jTYlAY3eiIdvvOd8iU8cZC10pCVSiL380AYIbNKI61vL94nlxEyIeQEJiENu+/u\nkQYxnwH+0RP6b/X1vJRZMDs78x/uQHKlobfXG9qZReclHaGt03LwoP+DZU0a6uvTDbkUpcEcmx5F\nGrKeLtiEzdOglH+4scvTUBg++cnBbkFxUcpKg7wMyve4GMjRE2FKg3x34YkEqKjgh1mc8IRg5kzv\nswyHBLzwRH0912sjDQB3zrbEI0k8DUrxb9/9Ln/PovMaSmbKYhIfPSRhhidefZXTg+um2Tjo6uJ7\nrKbG3/Zt24A//zl69ESQ0iD3XlCiG7mnSmH0RCFKQ9CEVcOBNAx1lLLSkHb0RNQolGKGJ2Sf8qxw\nRsgUMMfGBxkhBbvv7n2WjJKynZCQpqZ80iAXPog0BI2eAPJJgxgxBVl0+OLZGArIYirxZ57Jn9EU\n8BMF87z/5jc8OkbmHomLri6+x6qqPAKrFDBjBvCFL3hm2yilwSQNc+YADz/szWlgQu6pYikNX/+6\n55EZ6DwNM2fGG93iSENpo9SVhkLCE0FtKWZ4YiCUhiEdngD8D2rAezDrFydIaZDUvQK5uZuaOHMk\n4CcNSvHDPkxpkP1LGmn5TW6c558HLrjAv23WSoP4L8oVWfgyTj8dePdd4KST/JOLCWlobMw/73pq\n5ygceigwbx5w7rkeadAJ7FNPeeRH/qmTkgaicGNrsZWGyy/3t2Ug8zR873vJ6ijn+30oo1RHTxSa\nEVJmbQ2rW5D1LJeAnzS4jJAJEUdpiEsa5GKNHGkPT/T08E0WNzxhvoX19wNnnsky+CGHeNtmTRqy\neFMfTDQ3F17H3Lm8vOUW/3ohDaNH54cn5HvUW3NfH/Dcc9505zalwQx9AcnDE1EottKgoxh5GgoJ\nbZh1OKWhNFGqeRrM8ERSpSGMqA+EEdJ5GgpAkNIQRhr2248/m2+0UUqD/qA3YTNC6koDwNL3yy8D\nP/sZ51QQZE0awua7KAdkoTRIR2zOICrXcKedgpWGKE/Fu+/yUqYztykNtvkoTKVBzH5BSkMUBpo0\nhCkNaTqGLCZQcuGJ0kYpKw1pjZBVVelJQxbJnYDijp4Y8v9KVVXJPA0NDZyn4WMfCw9PiNIg6aH7\n+8NJgz7k0hyaJhf+0Uf5t6OO8nsrurrsaaaTQO8A9bfcckQWpEH+Ud96y79eVxpM0iC/2UZA6Hj9\ndV4KaejszFca4pAGuS8LJQ0DkRY4zNNQWZkuPOBIw9DHcFQawsITSZQGG+FyRsgMkCQ8UVXlXYQR\nI/I7DblYhSoNpq9Cflu0CJg+nf+mTfNvb0s01N0NPPFE/voLLgBOOcW/bigpDRKeiMPKgyDXYNs2\nPznUlQbznMdVGl57jZeS5MumNOhxxqDwhNxH5RKesJGGQw8FTjstXZ1ZkgbnaShNDEWlYaDCE7b/\nCzfkMgMkCU/onZAu7wh0pUEfew9EKw06aZD2mErDe+/xFKry27Rp/NAF7CGK224DjjzSIzCCRx8F\n/vY3/9BAffRE0iGDpQZRGsRDkga9vd4b/Ntve+vN8ERzs7ePKKXhmmuA667zlAZ5GNg8DTalwUwj\nnZXSMJiehgMPBH7/+3R1OqVh6KNUlYZCJ6xKojSkDU/Y/q9tnoaSMkIS0YVE1E9Ev9DWjSCi64lo\nMxFtJ6K7iGiCsd1UIrq
[... base64-encoded PNG image data elided: inline plot output of the code cell below ...]",
+      "text/plain": [
+       ""
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    }
+   ],
+   "source": [
+    "data[24*10:].plot(x='dteday', y='cnt')"
+   ]
+  }
+ ],
+ "metadata": {
+  "hide_input": false,
+  "kernelspec": {
+   "display_name": "Python 3",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.5.1"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/first-neural-network/DLND-your-first-network/Bike-Sharing-Dataset/day.csv b/first-neural-network/DLND-your-first-network/Bike-Sharing-Dataset/day.csv
new file mode 100755
index 0000000..d39017f
--- /dev/null
+++ b/first-neural-network/DLND-your-first-network/Bike-Sharing-Dataset/day.csv
@@ -0,0 +1,732 @@
+instant,dteday,season,yr,mnth,holiday,weekday,workingday,weathersit,temp,atemp,hum,windspeed,casual,registered,cnt
+1,2011-01-01,1,0,1,0,6,0,2,0.344167,0.363625,0.805833,0.160446,331,654,985
+2,2011-01-02,1,0,1,0,0,0,2,0.363478,0.353739,0.696087,0.248539,131,670,801
+3,2011-01-03,1,0,1,0,1,1,1,0.196364,0.189405,0.437273,0.248309,120,1229,1349
+4,2011-01-04,1,0,1,0,2,1,1,0.2,0.212122,0.590435,0.160296,108,1454,1562
+5,2011-01-05,1,0,1,0,3,1,1,0.226957,0.22927,0.436957,0.1869,82,1518,1600
+6,2011-01-06,1,0,1,0,4,1,1,0.204348,0.233209,0.518261,0.0895652,88,1518,1606
+7,2011-01-07,1,0,1,0,5,1,2,0.196522,0.208839,0.498696,0.168726,148,1362,1510
+8,2011-01-08,1,0,1,0,6,0,2,0.165,0.162254,0.535833,0.266804,68,891,959
+9,2011-01-09,1,0,1,0,0,0,1,0.138333,0.116175,0.434167,0.36195,54,768,822
+10,2011-01-10,1,0,1,0,1,1,1,0.150833,0.150888,0.482917,0.223267,41,1280,1321
+11,2011-01-11,1,0,1,0,2,1,2,0.169091,0.191464,0.686364,0.122132,43,1220,1263
+12,2011-01-12,1,0,1,0,3,1,1,0.172727,0.160473,0.599545,0.304627,25,1137,1162 +13,2011-01-13,1,0,1,0,4,1,1,0.165,0.150883,0.470417,0.301,38,1368,1406 +14,2011-01-14,1,0,1,0,5,1,1,0.16087,0.188413,0.537826,0.126548,54,1367,1421 +15,2011-01-15,1,0,1,0,6,0,2,0.233333,0.248112,0.49875,0.157963,222,1026,1248 +16,2011-01-16,1,0,1,0,0,0,1,0.231667,0.234217,0.48375,0.188433,251,953,1204 +17,2011-01-17,1,0,1,1,1,0,2,0.175833,0.176771,0.5375,0.194017,117,883,1000 +18,2011-01-18,1,0,1,0,2,1,2,0.216667,0.232333,0.861667,0.146775,9,674,683 +19,2011-01-19,1,0,1,0,3,1,2,0.292174,0.298422,0.741739,0.208317,78,1572,1650 +20,2011-01-20,1,0,1,0,4,1,2,0.261667,0.25505,0.538333,0.195904,83,1844,1927 +21,2011-01-21,1,0,1,0,5,1,1,0.1775,0.157833,0.457083,0.353242,75,1468,1543 +22,2011-01-22,1,0,1,0,6,0,1,0.0591304,0.0790696,0.4,0.17197,93,888,981 +23,2011-01-23,1,0,1,0,0,0,1,0.0965217,0.0988391,0.436522,0.2466,150,836,986 +24,2011-01-24,1,0,1,0,1,1,1,0.0973913,0.11793,0.491739,0.15833,86,1330,1416 +25,2011-01-25,1,0,1,0,2,1,2,0.223478,0.234526,0.616957,0.129796,186,1799,1985 +26,2011-01-26,1,0,1,0,3,1,3,0.2175,0.2036,0.8625,0.29385,34,472,506 +27,2011-01-27,1,0,1,0,4,1,1,0.195,0.2197,0.6875,0.113837,15,416,431 +28,2011-01-28,1,0,1,0,5,1,2,0.203478,0.223317,0.793043,0.1233,38,1129,1167 +29,2011-01-29,1,0,1,0,6,0,1,0.196522,0.212126,0.651739,0.145365,123,975,1098 +30,2011-01-30,1,0,1,0,0,0,1,0.216522,0.250322,0.722174,0.0739826,140,956,1096 +31,2011-01-31,1,0,1,0,1,1,2,0.180833,0.18625,0.60375,0.187192,42,1459,1501 +32,2011-02-01,1,0,2,0,2,1,2,0.192174,0.23453,0.829565,0.053213,47,1313,1360 +33,2011-02-02,1,0,2,0,3,1,2,0.26,0.254417,0.775417,0.264308,72,1454,1526 +34,2011-02-03,1,0,2,0,4,1,1,0.186957,0.177878,0.437826,0.277752,61,1489,1550 +35,2011-02-04,1,0,2,0,5,1,2,0.211304,0.228587,0.585217,0.127839,88,1620,1708 +36,2011-02-05,1,0,2,0,6,0,2,0.233333,0.243058,0.929167,0.161079,100,905,1005 +37,2011-02-06,1,0,2,0,0,0,1,0.285833,0.291671,0.568333,0.1418,354,1269,1623 
+38,2011-02-07,1,0,2,0,1,1,1,0.271667,0.303658,0.738333,0.0454083,120,1592,1712 +39,2011-02-08,1,0,2,0,2,1,1,0.220833,0.198246,0.537917,0.36195,64,1466,1530 +40,2011-02-09,1,0,2,0,3,1,2,0.134783,0.144283,0.494783,0.188839,53,1552,1605 +41,2011-02-10,1,0,2,0,4,1,1,0.144348,0.149548,0.437391,0.221935,47,1491,1538 +42,2011-02-11,1,0,2,0,5,1,1,0.189091,0.213509,0.506364,0.10855,149,1597,1746 +43,2011-02-12,1,0,2,0,6,0,1,0.2225,0.232954,0.544167,0.203367,288,1184,1472 +44,2011-02-13,1,0,2,0,0,0,1,0.316522,0.324113,0.457391,0.260883,397,1192,1589 +45,2011-02-14,1,0,2,0,1,1,1,0.415,0.39835,0.375833,0.417908,208,1705,1913 +46,2011-02-15,1,0,2,0,2,1,1,0.266087,0.254274,0.314348,0.291374,140,1675,1815 +47,2011-02-16,1,0,2,0,3,1,1,0.318261,0.3162,0.423478,0.251791,218,1897,2115 +48,2011-02-17,1,0,2,0,4,1,1,0.435833,0.428658,0.505,0.230104,259,2216,2475 +49,2011-02-18,1,0,2,0,5,1,1,0.521667,0.511983,0.516667,0.264925,579,2348,2927 +50,2011-02-19,1,0,2,0,6,0,1,0.399167,0.391404,0.187917,0.507463,532,1103,1635 +51,2011-02-20,1,0,2,0,0,0,1,0.285217,0.27733,0.407826,0.223235,639,1173,1812 +52,2011-02-21,1,0,2,1,1,0,2,0.303333,0.284075,0.605,0.307846,195,912,1107 +53,2011-02-22,1,0,2,0,2,1,1,0.182222,0.186033,0.577778,0.195683,74,1376,1450 +54,2011-02-23,1,0,2,0,3,1,1,0.221739,0.245717,0.423043,0.094113,139,1778,1917 +55,2011-02-24,1,0,2,0,4,1,2,0.295652,0.289191,0.697391,0.250496,100,1707,1807 +56,2011-02-25,1,0,2,0,5,1,2,0.364348,0.350461,0.712174,0.346539,120,1341,1461 +57,2011-02-26,1,0,2,0,6,0,1,0.2825,0.282192,0.537917,0.186571,424,1545,1969 +58,2011-02-27,1,0,2,0,0,0,1,0.343478,0.351109,0.68,0.125248,694,1708,2402 +59,2011-02-28,1,0,2,0,1,1,2,0.407273,0.400118,0.876364,0.289686,81,1365,1446 +60,2011-03-01,1,0,3,0,2,1,1,0.266667,0.263879,0.535,0.216425,137,1714,1851 +61,2011-03-02,1,0,3,0,3,1,1,0.335,0.320071,0.449583,0.307833,231,1903,2134 +62,2011-03-03,1,0,3,0,4,1,1,0.198333,0.200133,0.318333,0.225754,123,1562,1685 
+63,2011-03-04,1,0,3,0,5,1,2,0.261667,0.255679,0.610417,0.203346,214,1730,1944 +64,2011-03-05,1,0,3,0,6,0,2,0.384167,0.378779,0.789167,0.251871,640,1437,2077 +65,2011-03-06,1,0,3,0,0,0,2,0.376522,0.366252,0.948261,0.343287,114,491,605 +66,2011-03-07,1,0,3,0,1,1,1,0.261739,0.238461,0.551304,0.341352,244,1628,1872 +67,2011-03-08,1,0,3,0,2,1,1,0.2925,0.3024,0.420833,0.12065,316,1817,2133 +68,2011-03-09,1,0,3,0,3,1,2,0.295833,0.286608,0.775417,0.22015,191,1700,1891 +69,2011-03-10,1,0,3,0,4,1,3,0.389091,0.385668,0,0.261877,46,577,623 +70,2011-03-11,1,0,3,0,5,1,2,0.316522,0.305,0.649565,0.23297,247,1730,1977 +71,2011-03-12,1,0,3,0,6,0,1,0.329167,0.32575,0.594583,0.220775,724,1408,2132 +72,2011-03-13,1,0,3,0,0,0,1,0.384348,0.380091,0.527391,0.270604,982,1435,2417 +73,2011-03-14,1,0,3,0,1,1,1,0.325217,0.332,0.496957,0.136926,359,1687,2046 +74,2011-03-15,1,0,3,0,2,1,2,0.317391,0.318178,0.655652,0.184309,289,1767,2056 +75,2011-03-16,1,0,3,0,3,1,2,0.365217,0.36693,0.776522,0.203117,321,1871,2192 +76,2011-03-17,1,0,3,0,4,1,1,0.415,0.410333,0.602917,0.209579,424,2320,2744 +77,2011-03-18,1,0,3,0,5,1,1,0.54,0.527009,0.525217,0.231017,884,2355,3239 +78,2011-03-19,1,0,3,0,6,0,1,0.4725,0.466525,0.379167,0.368167,1424,1693,3117 +79,2011-03-20,1,0,3,0,0,0,1,0.3325,0.32575,0.47375,0.207721,1047,1424,2471 +80,2011-03-21,2,0,3,0,1,1,2,0.430435,0.409735,0.737391,0.288783,401,1676,2077 +81,2011-03-22,2,0,3,0,2,1,1,0.441667,0.440642,0.624583,0.22575,460,2243,2703 +82,2011-03-23,2,0,3,0,3,1,2,0.346957,0.337939,0.839565,0.234261,203,1918,2121 +83,2011-03-24,2,0,3,0,4,1,2,0.285,0.270833,0.805833,0.243787,166,1699,1865 +84,2011-03-25,2,0,3,0,5,1,1,0.264167,0.256312,0.495,0.230725,300,1910,2210 +85,2011-03-26,2,0,3,0,6,0,1,0.265833,0.257571,0.394167,0.209571,981,1515,2496 +86,2011-03-27,2,0,3,0,0,0,2,0.253043,0.250339,0.493913,0.1843,472,1221,1693 +87,2011-03-28,2,0,3,0,1,1,1,0.264348,0.257574,0.302174,0.212204,222,1806,2028 
+88,2011-03-29,2,0,3,0,2,1,1,0.3025,0.292908,0.314167,0.226996,317,2108,2425 +89,2011-03-30,2,0,3,0,3,1,2,0.3,0.29735,0.646667,0.172888,168,1368,1536 +90,2011-03-31,2,0,3,0,4,1,3,0.268333,0.257575,0.918333,0.217646,179,1506,1685 +91,2011-04-01,2,0,4,0,5,1,2,0.3,0.283454,0.68625,0.258708,307,1920,2227 +92,2011-04-02,2,0,4,0,6,0,2,0.315,0.315637,0.65375,0.197146,898,1354,2252 +93,2011-04-03,2,0,4,0,0,0,1,0.378333,0.378767,0.48,0.182213,1651,1598,3249 +94,2011-04-04,2,0,4,0,1,1,1,0.573333,0.542929,0.42625,0.385571,734,2381,3115 +95,2011-04-05,2,0,4,0,2,1,2,0.414167,0.39835,0.642083,0.388067,167,1628,1795 +96,2011-04-06,2,0,4,0,3,1,1,0.390833,0.387608,0.470833,0.263063,413,2395,2808 +97,2011-04-07,2,0,4,0,4,1,1,0.4375,0.433696,0.602917,0.162312,571,2570,3141 +98,2011-04-08,2,0,4,0,5,1,2,0.335833,0.324479,0.83625,0.226992,172,1299,1471 +99,2011-04-09,2,0,4,0,6,0,2,0.3425,0.341529,0.8775,0.133083,879,1576,2455 +100,2011-04-10,2,0,4,0,0,0,2,0.426667,0.426737,0.8575,0.146767,1188,1707,2895 +101,2011-04-11,2,0,4,0,1,1,2,0.595652,0.565217,0.716956,0.324474,855,2493,3348 +102,2011-04-12,2,0,4,0,2,1,2,0.5025,0.493054,0.739167,0.274879,257,1777,2034 +103,2011-04-13,2,0,4,0,3,1,2,0.4125,0.417283,0.819167,0.250617,209,1953,2162 +104,2011-04-14,2,0,4,0,4,1,1,0.4675,0.462742,0.540417,0.1107,529,2738,3267 +105,2011-04-15,2,0,4,1,5,0,1,0.446667,0.441913,0.67125,0.226375,642,2484,3126 +106,2011-04-16,2,0,4,0,6,0,3,0.430833,0.425492,0.888333,0.340808,121,674,795 +107,2011-04-17,2,0,4,0,0,0,1,0.456667,0.445696,0.479583,0.303496,1558,2186,3744 +108,2011-04-18,2,0,4,0,1,1,1,0.5125,0.503146,0.5425,0.163567,669,2760,3429 +109,2011-04-19,2,0,4,0,2,1,2,0.505833,0.489258,0.665833,0.157971,409,2795,3204 +110,2011-04-20,2,0,4,0,3,1,1,0.595,0.564392,0.614167,0.241925,613,3331,3944 +111,2011-04-21,2,0,4,0,4,1,1,0.459167,0.453892,0.407083,0.325258,745,3444,4189 +112,2011-04-22,2,0,4,0,5,1,2,0.336667,0.321954,0.729583,0.219521,177,1506,1683 
+113,2011-04-23,2,0,4,0,6,0,2,0.46,0.450121,0.887917,0.230725,1462,2574,4036 +114,2011-04-24,2,0,4,0,0,0,2,0.581667,0.551763,0.810833,0.192175,1710,2481,4191 +115,2011-04-25,2,0,4,0,1,1,1,0.606667,0.5745,0.776667,0.185333,773,3300,4073 +116,2011-04-26,2,0,4,0,2,1,1,0.631667,0.594083,0.729167,0.3265,678,3722,4400 +117,2011-04-27,2,0,4,0,3,1,2,0.62,0.575142,0.835417,0.3122,547,3325,3872 +118,2011-04-28,2,0,4,0,4,1,2,0.6175,0.578929,0.700833,0.320908,569,3489,4058 +119,2011-04-29,2,0,4,0,5,1,1,0.51,0.497463,0.457083,0.240063,878,3717,4595 +120,2011-04-30,2,0,4,0,6,0,1,0.4725,0.464021,0.503333,0.235075,1965,3347,5312 +121,2011-05-01,2,0,5,0,0,0,2,0.451667,0.448204,0.762083,0.106354,1138,2213,3351 +122,2011-05-02,2,0,5,0,1,1,2,0.549167,0.532833,0.73,0.183454,847,3554,4401 +123,2011-05-03,2,0,5,0,2,1,2,0.616667,0.582079,0.697083,0.342667,603,3848,4451 +124,2011-05-04,2,0,5,0,3,1,2,0.414167,0.40465,0.737083,0.328996,255,2378,2633 +125,2011-05-05,2,0,5,0,4,1,1,0.459167,0.441917,0.444167,0.295392,614,3819,4433 +126,2011-05-06,2,0,5,0,5,1,1,0.479167,0.474117,0.59,0.228246,894,3714,4608 +127,2011-05-07,2,0,5,0,6,0,1,0.52,0.512621,0.54125,0.16045,1612,3102,4714 +128,2011-05-08,2,0,5,0,0,0,1,0.528333,0.518933,0.631667,0.0746375,1401,2932,4333 +129,2011-05-09,2,0,5,0,1,1,1,0.5325,0.525246,0.58875,0.176,664,3698,4362 +130,2011-05-10,2,0,5,0,2,1,1,0.5325,0.522721,0.489167,0.115671,694,4109,4803 +131,2011-05-11,2,0,5,0,3,1,1,0.5425,0.5284,0.632917,0.120642,550,3632,4182 +132,2011-05-12,2,0,5,0,4,1,1,0.535,0.523363,0.7475,0.189667,695,4169,4864 +133,2011-05-13,2,0,5,0,5,1,2,0.5125,0.4943,0.863333,0.179725,692,3413,4105 +134,2011-05-14,2,0,5,0,6,0,2,0.520833,0.500629,0.9225,0.13495,902,2507,3409 +135,2011-05-15,2,0,5,0,0,0,2,0.5625,0.536,0.867083,0.152979,1582,2971,4553 +136,2011-05-16,2,0,5,0,1,1,1,0.5775,0.550512,0.787917,0.126871,773,3185,3958 +137,2011-05-17,2,0,5,0,2,1,2,0.561667,0.538529,0.837917,0.277354,678,3445,4123 
+138,2011-05-18,2,0,5,0,3,1,2,0.55,0.527158,0.87,0.201492,536,3319,3855 +139,2011-05-19,2,0,5,0,4,1,2,0.530833,0.510742,0.829583,0.108213,735,3840,4575 +140,2011-05-20,2,0,5,0,5,1,1,0.536667,0.529042,0.719583,0.125013,909,4008,4917 +141,2011-05-21,2,0,5,0,6,0,1,0.6025,0.571975,0.626667,0.12065,2258,3547,5805 +142,2011-05-22,2,0,5,0,0,0,1,0.604167,0.5745,0.749583,0.148008,1576,3084,4660 +143,2011-05-23,2,0,5,0,1,1,2,0.631667,0.590296,0.81,0.233842,836,3438,4274 +144,2011-05-24,2,0,5,0,2,1,2,0.66,0.604813,0.740833,0.207092,659,3833,4492 +145,2011-05-25,2,0,5,0,3,1,1,0.660833,0.615542,0.69625,0.154233,740,4238,4978 +146,2011-05-26,2,0,5,0,4,1,1,0.708333,0.654688,0.6775,0.199642,758,3919,4677 +147,2011-05-27,2,0,5,0,5,1,1,0.681667,0.637008,0.65375,0.240679,871,3808,4679 +148,2011-05-28,2,0,5,0,6,0,1,0.655833,0.612379,0.729583,0.230092,2001,2757,4758 +149,2011-05-29,2,0,5,0,0,0,1,0.6675,0.61555,0.81875,0.213938,2355,2433,4788 +150,2011-05-30,2,0,5,1,1,0,1,0.733333,0.671092,0.685,0.131225,1549,2549,4098 +151,2011-05-31,2,0,5,0,2,1,1,0.775,0.725383,0.636667,0.111329,673,3309,3982 +152,2011-06-01,2,0,6,0,3,1,2,0.764167,0.720967,0.677083,0.207092,513,3461,3974 +153,2011-06-02,2,0,6,0,4,1,1,0.715,0.643942,0.305,0.292287,736,4232,4968 +154,2011-06-03,2,0,6,0,5,1,1,0.62,0.587133,0.354167,0.253121,898,4414,5312 +155,2011-06-04,2,0,6,0,6,0,1,0.635,0.594696,0.45625,0.123142,1869,3473,5342 +156,2011-06-05,2,0,6,0,0,0,2,0.648333,0.616804,0.6525,0.138692,1685,3221,4906 +157,2011-06-06,2,0,6,0,1,1,1,0.678333,0.621858,0.6,0.121896,673,3875,4548 +158,2011-06-07,2,0,6,0,2,1,1,0.7075,0.65595,0.597917,0.187808,763,4070,4833 +159,2011-06-08,2,0,6,0,3,1,1,0.775833,0.727279,0.622083,0.136817,676,3725,4401 +160,2011-06-09,2,0,6,0,4,1,2,0.808333,0.757579,0.568333,0.149883,563,3352,3915 +161,2011-06-10,2,0,6,0,5,1,1,0.755,0.703292,0.605,0.140554,815,3771,4586 +162,2011-06-11,2,0,6,0,6,0,1,0.725,0.678038,0.654583,0.15485,1729,3237,4966 
+163,2011-06-12,2,0,6,0,0,0,1,0.6925,0.643325,0.747917,0.163567,1467,2993,4460 +164,2011-06-13,2,0,6,0,1,1,1,0.635,0.601654,0.494583,0.30535,863,4157,5020 +165,2011-06-14,2,0,6,0,2,1,1,0.604167,0.591546,0.507083,0.269283,727,4164,4891 +166,2011-06-15,2,0,6,0,3,1,1,0.626667,0.587754,0.471667,0.167912,769,4411,5180 +167,2011-06-16,2,0,6,0,4,1,2,0.628333,0.595346,0.688333,0.206471,545,3222,3767 +168,2011-06-17,2,0,6,0,5,1,1,0.649167,0.600383,0.735833,0.143029,863,3981,4844 +169,2011-06-18,2,0,6,0,6,0,1,0.696667,0.643954,0.670417,0.119408,1807,3312,5119 +170,2011-06-19,2,0,6,0,0,0,2,0.699167,0.645846,0.666667,0.102,1639,3105,4744 +171,2011-06-20,2,0,6,0,1,1,2,0.635,0.595346,0.74625,0.155475,699,3311,4010 +172,2011-06-21,3,0,6,0,2,1,2,0.680833,0.637646,0.770417,0.171025,774,4061,4835 +173,2011-06-22,3,0,6,0,3,1,1,0.733333,0.693829,0.7075,0.172262,661,3846,4507 +174,2011-06-23,3,0,6,0,4,1,2,0.728333,0.693833,0.703333,0.238804,746,4044,4790 +175,2011-06-24,3,0,6,0,5,1,1,0.724167,0.656583,0.573333,0.222025,969,4022,4991 +176,2011-06-25,3,0,6,0,6,0,1,0.695,0.643313,0.483333,0.209571,1782,3420,5202 +177,2011-06-26,3,0,6,0,0,0,1,0.68,0.637629,0.513333,0.0945333,1920,3385,5305 +178,2011-06-27,3,0,6,0,1,1,2,0.6825,0.637004,0.658333,0.107588,854,3854,4708 +179,2011-06-28,3,0,6,0,2,1,1,0.744167,0.692558,0.634167,0.144283,732,3916,4648 +180,2011-06-29,3,0,6,0,3,1,1,0.728333,0.654688,0.497917,0.261821,848,4377,5225 +181,2011-06-30,3,0,6,0,4,1,1,0.696667,0.637008,0.434167,0.185312,1027,4488,5515 +182,2011-07-01,3,0,7,0,5,1,1,0.7225,0.652162,0.39625,0.102608,1246,4116,5362 +183,2011-07-02,3,0,7,0,6,0,1,0.738333,0.667308,0.444583,0.115062,2204,2915,5119 +184,2011-07-03,3,0,7,0,0,0,2,0.716667,0.668575,0.6825,0.228858,2282,2367,4649 +185,2011-07-04,3,0,7,1,1,0,2,0.726667,0.665417,0.637917,0.0814792,3065,2978,6043 +186,2011-07-05,3,0,7,0,2,1,1,0.746667,0.696338,0.590417,0.126258,1031,3634,4665 +187,2011-07-06,3,0,7,0,3,1,1,0.72,0.685633,0.743333,0.149883,784,3845,4629 
+188,2011-07-07,3,0,7,0,4,1,1,0.75,0.686871,0.65125,0.1592,754,3838,4592 +189,2011-07-08,3,0,7,0,5,1,2,0.709167,0.670483,0.757917,0.225129,692,3348,4040 +190,2011-07-09,3,0,7,0,6,0,1,0.733333,0.664158,0.609167,0.167912,1988,3348,5336 +191,2011-07-10,3,0,7,0,0,0,1,0.7475,0.690025,0.578333,0.183471,1743,3138,4881 +192,2011-07-11,3,0,7,0,1,1,1,0.7625,0.729804,0.635833,0.282337,723,3363,4086 +193,2011-07-12,3,0,7,0,2,1,1,0.794167,0.739275,0.559167,0.200254,662,3596,4258 +194,2011-07-13,3,0,7,0,3,1,1,0.746667,0.689404,0.631667,0.146133,748,3594,4342 +195,2011-07-14,3,0,7,0,4,1,1,0.680833,0.635104,0.47625,0.240667,888,4196,5084 +196,2011-07-15,3,0,7,0,5,1,1,0.663333,0.624371,0.59125,0.182833,1318,4220,5538 +197,2011-07-16,3,0,7,0,6,0,1,0.686667,0.638263,0.585,0.208342,2418,3505,5923 +198,2011-07-17,3,0,7,0,0,0,1,0.719167,0.669833,0.604167,0.245033,2006,3296,5302 +199,2011-07-18,3,0,7,0,1,1,1,0.746667,0.703925,0.65125,0.215804,841,3617,4458 +200,2011-07-19,3,0,7,0,2,1,1,0.776667,0.747479,0.650417,0.1306,752,3789,4541 +201,2011-07-20,3,0,7,0,3,1,1,0.768333,0.74685,0.707083,0.113817,644,3688,4332 +202,2011-07-21,3,0,7,0,4,1,2,0.815,0.826371,0.69125,0.222021,632,3152,3784 +203,2011-07-22,3,0,7,0,5,1,1,0.848333,0.840896,0.580417,0.1331,562,2825,3387 +204,2011-07-23,3,0,7,0,6,0,1,0.849167,0.804287,0.5,0.131221,987,2298,3285 +205,2011-07-24,3,0,7,0,0,0,1,0.83,0.794829,0.550833,0.169171,1050,2556,3606 +206,2011-07-25,3,0,7,0,1,1,1,0.743333,0.720958,0.757083,0.0908083,568,3272,3840 +207,2011-07-26,3,0,7,0,2,1,1,0.771667,0.696979,0.540833,0.200258,750,3840,4590 +208,2011-07-27,3,0,7,0,3,1,1,0.775,0.690667,0.402917,0.183463,755,3901,4656 +209,2011-07-28,3,0,7,0,4,1,1,0.779167,0.7399,0.583333,0.178479,606,3784,4390 +210,2011-07-29,3,0,7,0,5,1,1,0.838333,0.785967,0.5425,0.174138,670,3176,3846 +211,2011-07-30,3,0,7,0,6,0,1,0.804167,0.728537,0.465833,0.168537,1559,2916,4475 +212,2011-07-31,3,0,7,0,0,0,1,0.805833,0.729796,0.480833,0.164813,1524,2778,4302 
+213,2011-08-01,3,0,8,0,1,1,1,0.771667,0.703292,0.550833,0.156717,729,3537,4266 +214,2011-08-02,3,0,8,0,2,1,1,0.783333,0.707071,0.49125,0.20585,801,4044,4845 +215,2011-08-03,3,0,8,0,3,1,2,0.731667,0.679937,0.6575,0.135583,467,3107,3574 +216,2011-08-04,3,0,8,0,4,1,2,0.71,0.664788,0.7575,0.19715,799,3777,4576 +217,2011-08-05,3,0,8,0,5,1,1,0.710833,0.656567,0.630833,0.184696,1023,3843,4866 +218,2011-08-06,3,0,8,0,6,0,2,0.716667,0.676154,0.755,0.22825,1521,2773,4294 +219,2011-08-07,3,0,8,0,0,0,1,0.7425,0.715292,0.752917,0.201487,1298,2487,3785 +220,2011-08-08,3,0,8,0,1,1,1,0.765,0.703283,0.592083,0.192175,846,3480,4326 +221,2011-08-09,3,0,8,0,2,1,1,0.775,0.724121,0.570417,0.151121,907,3695,4602 +222,2011-08-10,3,0,8,0,3,1,1,0.766667,0.684983,0.424167,0.200258,884,3896,4780 +223,2011-08-11,3,0,8,0,4,1,1,0.7175,0.651521,0.42375,0.164796,812,3980,4792 +224,2011-08-12,3,0,8,0,5,1,1,0.708333,0.654042,0.415,0.125621,1051,3854,4905 +225,2011-08-13,3,0,8,0,6,0,2,0.685833,0.645858,0.729583,0.211454,1504,2646,4150 +226,2011-08-14,3,0,8,0,0,0,2,0.676667,0.624388,0.8175,0.222633,1338,2482,3820 +227,2011-08-15,3,0,8,0,1,1,1,0.665833,0.616167,0.712083,0.208954,775,3563,4338 +228,2011-08-16,3,0,8,0,2,1,1,0.700833,0.645837,0.578333,0.236329,721,4004,4725 +229,2011-08-17,3,0,8,0,3,1,1,0.723333,0.666671,0.575417,0.143667,668,4026,4694 +230,2011-08-18,3,0,8,0,4,1,1,0.711667,0.662258,0.654583,0.233208,639,3166,3805 +231,2011-08-19,3,0,8,0,5,1,2,0.685,0.633221,0.722917,0.139308,797,3356,4153 +232,2011-08-20,3,0,8,0,6,0,1,0.6975,0.648996,0.674167,0.104467,1914,3277,5191 +233,2011-08-21,3,0,8,0,0,0,1,0.710833,0.675525,0.77,0.248754,1249,2624,3873 +234,2011-08-22,3,0,8,0,1,1,1,0.691667,0.638254,0.47,0.27675,833,3925,4758 +235,2011-08-23,3,0,8,0,2,1,1,0.640833,0.606067,0.455417,0.146763,1281,4614,5895 +236,2011-08-24,3,0,8,0,3,1,1,0.673333,0.630692,0.605,0.253108,949,4181,5130 +237,2011-08-25,3,0,8,0,4,1,2,0.684167,0.645854,0.771667,0.210833,435,3107,3542 
+238,2011-08-26,3,0,8,0,5,1,1,0.7,0.659733,0.76125,0.0839625,768,3893,4661 +239,2011-08-27,3,0,8,0,6,0,2,0.68,0.635556,0.85,0.375617,226,889,1115 +240,2011-08-28,3,0,8,0,0,0,1,0.707059,0.647959,0.561765,0.304659,1415,2919,4334 +241,2011-08-29,3,0,8,0,1,1,1,0.636667,0.607958,0.554583,0.159825,729,3905,4634 +242,2011-08-30,3,0,8,0,2,1,1,0.639167,0.594704,0.548333,0.125008,775,4429,5204 +243,2011-08-31,3,0,8,0,3,1,1,0.656667,0.611121,0.597917,0.0833333,688,4370,5058 +244,2011-09-01,3,0,9,0,4,1,1,0.655,0.614921,0.639167,0.141796,783,4332,5115 +245,2011-09-02,3,0,9,0,5,1,2,0.643333,0.604808,0.727083,0.139929,875,3852,4727 +246,2011-09-03,3,0,9,0,6,0,1,0.669167,0.633213,0.716667,0.185325,1935,2549,4484 +247,2011-09-04,3,0,9,0,0,0,1,0.709167,0.665429,0.742083,0.206467,2521,2419,4940 +248,2011-09-05,3,0,9,1,1,0,2,0.673333,0.625646,0.790417,0.212696,1236,2115,3351 +249,2011-09-06,3,0,9,0,2,1,3,0.54,0.5152,0.886957,0.343943,204,2506,2710 +250,2011-09-07,3,0,9,0,3,1,3,0.599167,0.544229,0.917083,0.0970208,118,1878,1996 +251,2011-09-08,3,0,9,0,4,1,3,0.633913,0.555361,0.939565,0.192748,153,1689,1842 +252,2011-09-09,3,0,9,0,5,1,2,0.65,0.578946,0.897917,0.124379,417,3127,3544 +253,2011-09-10,3,0,9,0,6,0,1,0.66,0.607962,0.75375,0.153608,1750,3595,5345 +254,2011-09-11,3,0,9,0,0,0,1,0.653333,0.609229,0.71375,0.115054,1633,3413,5046 +255,2011-09-12,3,0,9,0,1,1,1,0.644348,0.60213,0.692174,0.088913,690,4023,4713 +256,2011-09-13,3,0,9,0,2,1,1,0.650833,0.603554,0.7125,0.141804,701,4062,4763 +257,2011-09-14,3,0,9,0,3,1,1,0.673333,0.6269,0.697083,0.1673,647,4138,4785 +258,2011-09-15,3,0,9,0,4,1,2,0.5775,0.553671,0.709167,0.271146,428,3231,3659 +259,2011-09-16,3,0,9,0,5,1,2,0.469167,0.461475,0.590417,0.164183,742,4018,4760 +260,2011-09-17,3,0,9,0,6,0,2,0.491667,0.478512,0.718333,0.189675,1434,3077,4511 +261,2011-09-18,3,0,9,0,0,0,1,0.5075,0.490537,0.695,0.178483,1353,2921,4274 +262,2011-09-19,3,0,9,0,1,1,2,0.549167,0.529675,0.69,0.151742,691,3848,4539 
+263,2011-09-20,3,0,9,0,2,1,2,0.561667,0.532217,0.88125,0.134954,438,3203,3641 +264,2011-09-21,3,0,9,0,3,1,2,0.595,0.550533,0.9,0.0964042,539,3813,4352 +265,2011-09-22,3,0,9,0,4,1,2,0.628333,0.554963,0.902083,0.128125,555,4240,4795 +266,2011-09-23,4,0,9,0,5,1,2,0.609167,0.522125,0.9725,0.0783667,258,2137,2395 +267,2011-09-24,4,0,9,0,6,0,2,0.606667,0.564412,0.8625,0.0783833,1776,3647,5423 +268,2011-09-25,4,0,9,0,0,0,2,0.634167,0.572637,0.845,0.0503792,1544,3466,5010 +269,2011-09-26,4,0,9,0,1,1,2,0.649167,0.589042,0.848333,0.1107,684,3946,4630 +270,2011-09-27,4,0,9,0,2,1,2,0.636667,0.574525,0.885417,0.118171,477,3643,4120 +271,2011-09-28,4,0,9,0,3,1,2,0.635,0.575158,0.84875,0.148629,480,3427,3907 +272,2011-09-29,4,0,9,0,4,1,1,0.616667,0.574512,0.699167,0.172883,653,4186,4839 +273,2011-09-30,4,0,9,0,5,1,1,0.564167,0.544829,0.6475,0.206475,830,4372,5202 +274,2011-10-01,4,0,10,0,6,0,2,0.41,0.412863,0.75375,0.292296,480,1949,2429 +275,2011-10-02,4,0,10,0,0,0,2,0.356667,0.345317,0.791667,0.222013,616,2302,2918 +276,2011-10-03,4,0,10,0,1,1,2,0.384167,0.392046,0.760833,0.0833458,330,3240,3570 +277,2011-10-04,4,0,10,0,2,1,1,0.484167,0.472858,0.71,0.205854,486,3970,4456 +278,2011-10-05,4,0,10,0,3,1,1,0.538333,0.527138,0.647917,0.17725,559,4267,4826 +279,2011-10-06,4,0,10,0,4,1,1,0.494167,0.480425,0.620833,0.134954,639,4126,4765 +280,2011-10-07,4,0,10,0,5,1,1,0.510833,0.504404,0.684167,0.0223917,949,4036,4985 +281,2011-10-08,4,0,10,0,6,0,1,0.521667,0.513242,0.70125,0.0454042,2235,3174,5409 +282,2011-10-09,4,0,10,0,0,0,1,0.540833,0.523983,0.7275,0.06345,2397,3114,5511 +283,2011-10-10,4,0,10,1,1,0,1,0.570833,0.542925,0.73375,0.0423042,1514,3603,5117 +284,2011-10-11,4,0,10,0,2,1,2,0.566667,0.546096,0.80875,0.143042,667,3896,4563 +285,2011-10-12,4,0,10,0,3,1,3,0.543333,0.517717,0.90625,0.24815,217,2199,2416 +286,2011-10-13,4,0,10,0,4,1,2,0.589167,0.551804,0.896667,0.141787,290,2623,2913 +287,2011-10-14,4,0,10,0,5,1,2,0.550833,0.529675,0.71625,0.223883,529,3115,3644 
+288,2011-10-15,4,0,10,0,6,0,1,0.506667,0.498725,0.483333,0.258083,1899,3318,5217 +289,2011-10-16,4,0,10,0,0,0,1,0.511667,0.503154,0.486667,0.281717,1748,3293,5041 +290,2011-10-17,4,0,10,0,1,1,1,0.534167,0.510725,0.579583,0.175379,713,3857,4570 +291,2011-10-18,4,0,10,0,2,1,2,0.5325,0.522721,0.701667,0.110087,637,4111,4748 +292,2011-10-19,4,0,10,0,3,1,3,0.541739,0.513848,0.895217,0.243339,254,2170,2424 +293,2011-10-20,4,0,10,0,4,1,1,0.475833,0.466525,0.63625,0.422275,471,3724,4195 +294,2011-10-21,4,0,10,0,5,1,1,0.4275,0.423596,0.574167,0.221396,676,3628,4304 +295,2011-10-22,4,0,10,0,6,0,1,0.4225,0.425492,0.629167,0.0926667,1499,2809,4308 +296,2011-10-23,4,0,10,0,0,0,1,0.421667,0.422333,0.74125,0.0995125,1619,2762,4381 +297,2011-10-24,4,0,10,0,1,1,1,0.463333,0.457067,0.772083,0.118792,699,3488,4187 +298,2011-10-25,4,0,10,0,2,1,1,0.471667,0.463375,0.622917,0.166658,695,3992,4687 +299,2011-10-26,4,0,10,0,3,1,2,0.484167,0.472846,0.720417,0.148642,404,3490,3894 +300,2011-10-27,4,0,10,0,4,1,2,0.47,0.457046,0.812917,0.197763,240,2419,2659 +301,2011-10-28,4,0,10,0,5,1,2,0.330833,0.318812,0.585833,0.229479,456,3291,3747 +302,2011-10-29,4,0,10,0,6,0,3,0.254167,0.227913,0.8825,0.351371,57,570,627 +303,2011-10-30,4,0,10,0,0,0,1,0.319167,0.321329,0.62375,0.176617,885,2446,3331 +304,2011-10-31,4,0,10,0,1,1,1,0.34,0.356063,0.703333,0.10635,362,3307,3669 +305,2011-11-01,4,0,11,0,2,1,1,0.400833,0.397088,0.68375,0.135571,410,3658,4068 +306,2011-11-02,4,0,11,0,3,1,1,0.3775,0.390133,0.71875,0.0820917,370,3816,4186 +307,2011-11-03,4,0,11,0,4,1,1,0.408333,0.405921,0.702083,0.136817,318,3656,3974 +308,2011-11-04,4,0,11,0,5,1,2,0.403333,0.403392,0.6225,0.271779,470,3576,4046 +309,2011-11-05,4,0,11,0,6,0,1,0.326667,0.323854,0.519167,0.189062,1156,2770,3926 +310,2011-11-06,4,0,11,0,0,0,1,0.348333,0.362358,0.734583,0.0920542,952,2697,3649 +311,2011-11-07,4,0,11,0,1,1,1,0.395,0.400871,0.75875,0.057225,373,3662,4035 
+312,2011-11-08,4,0,11,0,2,1,1,0.408333,0.412246,0.721667,0.0690375,376,3829,4205 +313,2011-11-09,4,0,11,0,3,1,1,0.4,0.409079,0.758333,0.0621958,305,3804,4109 +314,2011-11-10,4,0,11,0,4,1,2,0.38,0.373721,0.813333,0.189067,190,2743,2933 +315,2011-11-11,4,0,11,1,5,0,1,0.324167,0.306817,0.44625,0.314675,440,2928,3368 +316,2011-11-12,4,0,11,0,6,0,1,0.356667,0.357942,0.552917,0.212062,1275,2792,4067 +317,2011-11-13,4,0,11,0,0,0,1,0.440833,0.43055,0.458333,0.281721,1004,2713,3717 +318,2011-11-14,4,0,11,0,1,1,1,0.53,0.524612,0.587083,0.306596,595,3891,4486 +319,2011-11-15,4,0,11,0,2,1,2,0.53,0.507579,0.68875,0.199633,449,3746,4195 +320,2011-11-16,4,0,11,0,3,1,3,0.456667,0.451988,0.93,0.136829,145,1672,1817 +321,2011-11-17,4,0,11,0,4,1,2,0.341667,0.323221,0.575833,0.305362,139,2914,3053 +322,2011-11-18,4,0,11,0,5,1,1,0.274167,0.272721,0.41,0.168533,245,3147,3392 +323,2011-11-19,4,0,11,0,6,0,1,0.329167,0.324483,0.502083,0.224496,943,2720,3663 +324,2011-11-20,4,0,11,0,0,0,2,0.463333,0.457058,0.684583,0.18595,787,2733,3520 +325,2011-11-21,4,0,11,0,1,1,3,0.4475,0.445062,0.91,0.138054,220,2545,2765 +326,2011-11-22,4,0,11,0,2,1,3,0.416667,0.421696,0.9625,0.118792,69,1538,1607 +327,2011-11-23,4,0,11,0,3,1,2,0.440833,0.430537,0.757917,0.335825,112,2454,2566 +328,2011-11-24,4,0,11,1,4,0,1,0.373333,0.372471,0.549167,0.167304,560,935,1495 +329,2011-11-25,4,0,11,0,5,1,1,0.375,0.380671,0.64375,0.0988958,1095,1697,2792 +330,2011-11-26,4,0,11,0,6,0,1,0.375833,0.385087,0.681667,0.0684208,1249,1819,3068 +331,2011-11-27,4,0,11,0,0,0,1,0.459167,0.4558,0.698333,0.208954,810,2261,3071 +332,2011-11-28,4,0,11,0,1,1,1,0.503478,0.490122,0.743043,0.142122,253,3614,3867 +333,2011-11-29,4,0,11,0,2,1,2,0.458333,0.451375,0.830833,0.258092,96,2818,2914 +334,2011-11-30,4,0,11,0,3,1,1,0.325,0.311221,0.613333,0.271158,188,3425,3613 +335,2011-12-01,4,0,12,0,4,1,1,0.3125,0.305554,0.524583,0.220158,182,3545,3727 +336,2011-12-02,4,0,12,0,5,1,1,0.314167,0.331433,0.625833,0.100754,268,3672,3940 
+337,2011-12-03,4,0,12,0,6,0,1,0.299167,0.310604,0.612917,0.0957833,706,2908,3614 +338,2011-12-04,4,0,12,0,0,0,1,0.330833,0.3491,0.775833,0.0839583,634,2851,3485 +339,2011-12-05,4,0,12,0,1,1,2,0.385833,0.393925,0.827083,0.0622083,233,3578,3811 +340,2011-12-06,4,0,12,0,2,1,3,0.4625,0.4564,0.949583,0.232583,126,2468,2594 +341,2011-12-07,4,0,12,0,3,1,3,0.41,0.400246,0.970417,0.266175,50,655,705 +342,2011-12-08,4,0,12,0,4,1,1,0.265833,0.256938,0.58,0.240058,150,3172,3322 +343,2011-12-09,4,0,12,0,5,1,1,0.290833,0.317542,0.695833,0.0827167,261,3359,3620 +344,2011-12-10,4,0,12,0,6,0,1,0.275,0.266412,0.5075,0.233221,502,2688,3190 +345,2011-12-11,4,0,12,0,0,0,1,0.220833,0.253154,0.49,0.0665417,377,2366,2743 +346,2011-12-12,4,0,12,0,1,1,1,0.238333,0.270196,0.670833,0.06345,143,3167,3310 +347,2011-12-13,4,0,12,0,2,1,1,0.2825,0.301138,0.59,0.14055,155,3368,3523 +348,2011-12-14,4,0,12,0,3,1,2,0.3175,0.338362,0.66375,0.0609583,178,3562,3740 +349,2011-12-15,4,0,12,0,4,1,2,0.4225,0.412237,0.634167,0.268042,181,3528,3709 +350,2011-12-16,4,0,12,0,5,1,2,0.375,0.359825,0.500417,0.260575,178,3399,3577 +351,2011-12-17,4,0,12,0,6,0,2,0.258333,0.249371,0.560833,0.243167,275,2464,2739 +352,2011-12-18,4,0,12,0,0,0,1,0.238333,0.245579,0.58625,0.169779,220,2211,2431 +353,2011-12-19,4,0,12,0,1,1,1,0.276667,0.280933,0.6375,0.172896,260,3143,3403 +354,2011-12-20,4,0,12,0,2,1,2,0.385833,0.396454,0.595417,0.0615708,216,3534,3750 +355,2011-12-21,1,0,12,0,3,1,2,0.428333,0.428017,0.858333,0.2214,107,2553,2660 +356,2011-12-22,1,0,12,0,4,1,2,0.423333,0.426121,0.7575,0.047275,227,2841,3068 +357,2011-12-23,1,0,12,0,5,1,1,0.373333,0.377513,0.68625,0.274246,163,2046,2209 +358,2011-12-24,1,0,12,0,6,0,1,0.3025,0.299242,0.5425,0.190304,155,856,1011 +359,2011-12-25,1,0,12,0,0,0,1,0.274783,0.279961,0.681304,0.155091,303,451,754 +360,2011-12-26,1,0,12,1,1,0,1,0.321739,0.315535,0.506957,0.239465,430,887,1317 +361,2011-12-27,1,0,12,0,2,1,2,0.325,0.327633,0.7625,0.18845,103,1059,1162 
+362,2011-12-28,1,0,12,0,3,1,1,0.29913,0.279974,0.503913,0.293961,255,2047,2302 +363,2011-12-29,1,0,12,0,4,1,1,0.248333,0.263892,0.574167,0.119412,254,2169,2423 +364,2011-12-30,1,0,12,0,5,1,1,0.311667,0.318812,0.636667,0.134337,491,2508,2999 +365,2011-12-31,1,0,12,0,6,0,1,0.41,0.414121,0.615833,0.220154,665,1820,2485 +366,2012-01-01,1,1,1,0,0,0,1,0.37,0.375621,0.6925,0.192167,686,1608,2294 +367,2012-01-02,1,1,1,1,1,0,1,0.273043,0.252304,0.381304,0.329665,244,1707,1951 +368,2012-01-03,1,1,1,0,2,1,1,0.15,0.126275,0.44125,0.365671,89,2147,2236 +369,2012-01-04,1,1,1,0,3,1,2,0.1075,0.119337,0.414583,0.1847,95,2273,2368 +370,2012-01-05,1,1,1,0,4,1,1,0.265833,0.278412,0.524167,0.129987,140,3132,3272 +371,2012-01-06,1,1,1,0,5,1,1,0.334167,0.340267,0.542083,0.167908,307,3791,4098 +372,2012-01-07,1,1,1,0,6,0,1,0.393333,0.390779,0.531667,0.174758,1070,3451,4521 +373,2012-01-08,1,1,1,0,0,0,1,0.3375,0.340258,0.465,0.191542,599,2826,3425 +374,2012-01-09,1,1,1,0,1,1,2,0.224167,0.247479,0.701667,0.0989,106,2270,2376 +375,2012-01-10,1,1,1,0,2,1,1,0.308696,0.318826,0.646522,0.187552,173,3425,3598 +376,2012-01-11,1,1,1,0,3,1,2,0.274167,0.282821,0.8475,0.131221,92,2085,2177 +377,2012-01-12,1,1,1,0,4,1,2,0.3825,0.381938,0.802917,0.180967,269,3828,4097 +378,2012-01-13,1,1,1,0,5,1,1,0.274167,0.249362,0.5075,0.378108,174,3040,3214 +379,2012-01-14,1,1,1,0,6,0,1,0.18,0.183087,0.4575,0.187183,333,2160,2493 +380,2012-01-15,1,1,1,0,0,0,1,0.166667,0.161625,0.419167,0.251258,284,2027,2311 +381,2012-01-16,1,1,1,1,1,0,1,0.19,0.190663,0.5225,0.231358,217,2081,2298 +382,2012-01-17,1,1,1,0,2,1,2,0.373043,0.364278,0.716087,0.34913,127,2808,2935 +383,2012-01-18,1,1,1,0,3,1,1,0.303333,0.275254,0.443333,0.415429,109,3267,3376 +384,2012-01-19,1,1,1,0,4,1,1,0.19,0.190038,0.4975,0.220158,130,3162,3292 +385,2012-01-20,1,1,1,0,5,1,2,0.2175,0.220958,0.45,0.20275,115,3048,3163 +386,2012-01-21,1,1,1,0,6,0,2,0.173333,0.174875,0.83125,0.222642,67,1234,1301 
+387,2012-01-22,1,1,1,0,0,0,2,0.1625,0.16225,0.79625,0.199638,196,1781,1977 +388,2012-01-23,1,1,1,0,1,1,2,0.218333,0.243058,0.91125,0.110708,145,2287,2432 +389,2012-01-24,1,1,1,0,2,1,1,0.3425,0.349108,0.835833,0.123767,439,3900,4339 +390,2012-01-25,1,1,1,0,3,1,1,0.294167,0.294821,0.64375,0.161071,467,3803,4270 +391,2012-01-26,1,1,1,0,4,1,2,0.341667,0.35605,0.769583,0.0733958,244,3831,4075 +392,2012-01-27,1,1,1,0,5,1,2,0.425,0.415383,0.74125,0.342667,269,3187,3456 +393,2012-01-28,1,1,1,0,6,0,1,0.315833,0.326379,0.543333,0.210829,775,3248,4023 +394,2012-01-29,1,1,1,0,0,0,1,0.2825,0.272721,0.31125,0.24005,558,2685,3243 +395,2012-01-30,1,1,1,0,1,1,1,0.269167,0.262625,0.400833,0.215792,126,3498,3624 +396,2012-01-31,1,1,1,0,2,1,1,0.39,0.381317,0.416667,0.261817,324,4185,4509 +397,2012-02-01,1,1,2,0,3,1,1,0.469167,0.466538,0.507917,0.189067,304,4275,4579 +398,2012-02-02,1,1,2,0,4,1,2,0.399167,0.398971,0.672917,0.187187,190,3571,3761 +399,2012-02-03,1,1,2,0,5,1,1,0.313333,0.309346,0.526667,0.178496,310,3841,4151 +400,2012-02-04,1,1,2,0,6,0,2,0.264167,0.272725,0.779583,0.121896,384,2448,2832 +401,2012-02-05,1,1,2,0,0,0,2,0.265833,0.264521,0.687917,0.175996,318,2629,2947 +402,2012-02-06,1,1,2,0,1,1,1,0.282609,0.296426,0.622174,0.1538,206,3578,3784 +403,2012-02-07,1,1,2,0,2,1,1,0.354167,0.361104,0.49625,0.147379,199,4176,4375 +404,2012-02-08,1,1,2,0,3,1,2,0.256667,0.266421,0.722917,0.133721,109,2693,2802 +405,2012-02-09,1,1,2,0,4,1,1,0.265,0.261988,0.562083,0.194037,163,3667,3830 +406,2012-02-10,1,1,2,0,5,1,2,0.280833,0.293558,0.54,0.116929,227,3604,3831 +407,2012-02-11,1,1,2,0,6,0,3,0.224167,0.210867,0.73125,0.289796,192,1977,2169 +408,2012-02-12,1,1,2,0,0,0,1,0.1275,0.101658,0.464583,0.409212,73,1456,1529 +409,2012-02-13,1,1,2,0,1,1,1,0.2225,0.227913,0.41125,0.167283,94,3328,3422 +410,2012-02-14,1,1,2,0,2,1,2,0.319167,0.333946,0.50875,0.141179,135,3787,3922 +411,2012-02-15,1,1,2,0,3,1,1,0.348333,0.351629,0.53125,0.1816,141,4028,4169 
+412,2012-02-16,1,1,2,0,4,1,2,0.316667,0.330162,0.752917,0.091425,74,2931,3005 +413,2012-02-17,1,1,2,0,5,1,1,0.343333,0.351629,0.634583,0.205846,349,3805,4154 +414,2012-02-18,1,1,2,0,6,0,1,0.346667,0.355425,0.534583,0.190929,1435,2883,4318 +415,2012-02-19,1,1,2,0,0,0,2,0.28,0.265788,0.515833,0.253112,618,2071,2689 +416,2012-02-20,1,1,2,1,1,0,1,0.28,0.273391,0.507826,0.229083,502,2627,3129 +417,2012-02-21,1,1,2,0,2,1,1,0.287826,0.295113,0.594348,0.205717,163,3614,3777 +418,2012-02-22,1,1,2,0,3,1,1,0.395833,0.392667,0.567917,0.234471,394,4379,4773 +419,2012-02-23,1,1,2,0,4,1,1,0.454167,0.444446,0.554583,0.190913,516,4546,5062 +420,2012-02-24,1,1,2,0,5,1,2,0.4075,0.410971,0.7375,0.237567,246,3241,3487 +421,2012-02-25,1,1,2,0,6,0,1,0.290833,0.255675,0.395833,0.421642,317,2415,2732 +422,2012-02-26,1,1,2,0,0,0,1,0.279167,0.268308,0.41,0.205229,515,2874,3389 +423,2012-02-27,1,1,2,0,1,1,1,0.366667,0.357954,0.490833,0.268033,253,4069,4322 +424,2012-02-28,1,1,2,0,2,1,1,0.359167,0.353525,0.395833,0.193417,229,4134,4363 +425,2012-02-29,1,1,2,0,3,1,2,0.344348,0.34847,0.804783,0.179117,65,1769,1834 +426,2012-03-01,1,1,3,0,4,1,1,0.485833,0.475371,0.615417,0.226987,325,4665,4990 +427,2012-03-02,1,1,3,0,5,1,2,0.353333,0.359842,0.657083,0.144904,246,2948,3194 +428,2012-03-03,1,1,3,0,6,0,2,0.414167,0.413492,0.62125,0.161079,956,3110,4066 +429,2012-03-04,1,1,3,0,0,0,1,0.325833,0.303021,0.403333,0.334571,710,2713,3423 +430,2012-03-05,1,1,3,0,1,1,1,0.243333,0.241171,0.50625,0.228858,203,3130,3333 +431,2012-03-06,1,1,3,0,2,1,1,0.258333,0.255042,0.456667,0.200875,221,3735,3956 +432,2012-03-07,1,1,3,0,3,1,1,0.404167,0.3851,0.513333,0.345779,432,4484,4916 +433,2012-03-08,1,1,3,0,4,1,1,0.5275,0.524604,0.5675,0.441563,486,4896,5382 +434,2012-03-09,1,1,3,0,5,1,2,0.410833,0.397083,0.407083,0.4148,447,4122,4569 +435,2012-03-10,1,1,3,0,6,0,1,0.2875,0.277767,0.350417,0.22575,968,3150,4118 +436,2012-03-11,1,1,3,0,0,0,1,0.361739,0.35967,0.476957,0.222587,1658,3253,4911 
+437,2012-03-12,1,1,3,0,1,1,1,0.466667,0.459592,0.489167,0.207713,838,4460,5298 +438,2012-03-13,1,1,3,0,2,1,1,0.565,0.542929,0.6175,0.23695,762,5085,5847 +439,2012-03-14,1,1,3,0,3,1,1,0.5725,0.548617,0.507083,0.115062,997,5315,6312 +440,2012-03-15,1,1,3,0,4,1,1,0.5575,0.532825,0.579583,0.149883,1005,5187,6192 +441,2012-03-16,1,1,3,0,5,1,2,0.435833,0.436229,0.842083,0.113192,548,3830,4378 +442,2012-03-17,1,1,3,0,6,0,2,0.514167,0.505046,0.755833,0.110704,3155,4681,7836 +443,2012-03-18,1,1,3,0,0,0,2,0.4725,0.464,0.81,0.126883,2207,3685,5892 +444,2012-03-19,1,1,3,0,1,1,1,0.545,0.532821,0.72875,0.162317,982,5171,6153 +445,2012-03-20,1,1,3,0,2,1,1,0.560833,0.538533,0.807917,0.121271,1051,5042,6093 +446,2012-03-21,2,1,3,0,3,1,2,0.531667,0.513258,0.82125,0.0895583,1122,5108,6230 +447,2012-03-22,2,1,3,0,4,1,1,0.554167,0.531567,0.83125,0.117562,1334,5537,6871 +448,2012-03-23,2,1,3,0,5,1,2,0.601667,0.570067,0.694167,0.1163,2469,5893,8362 +449,2012-03-24,2,1,3,0,6,0,2,0.5025,0.486733,0.885417,0.192783,1033,2339,3372 +450,2012-03-25,2,1,3,0,0,0,2,0.4375,0.437488,0.880833,0.220775,1532,3464,4996 +451,2012-03-26,2,1,3,0,1,1,1,0.445833,0.43875,0.477917,0.386821,795,4763,5558 +452,2012-03-27,2,1,3,0,2,1,1,0.323333,0.315654,0.29,0.187192,531,4571,5102 +453,2012-03-28,2,1,3,0,3,1,1,0.484167,0.47095,0.48125,0.291671,674,5024,5698 +454,2012-03-29,2,1,3,0,4,1,1,0.494167,0.482304,0.439167,0.31965,834,5299,6133 +455,2012-03-30,2,1,3,0,5,1,2,0.37,0.375621,0.580833,0.138067,796,4663,5459 +456,2012-03-31,2,1,3,0,6,0,2,0.424167,0.421708,0.738333,0.250617,2301,3934,6235 +457,2012-04-01,2,1,4,0,0,0,2,0.425833,0.417287,0.67625,0.172267,2347,3694,6041 +458,2012-04-02,2,1,4,0,1,1,1,0.433913,0.427513,0.504348,0.312139,1208,4728,5936 +459,2012-04-03,2,1,4,0,2,1,1,0.466667,0.461483,0.396667,0.100133,1348,5424,6772 +460,2012-04-04,2,1,4,0,3,1,1,0.541667,0.53345,0.469583,0.180975,1058,5378,6436 +461,2012-04-05,2,1,4,0,4,1,1,0.435,0.431163,0.374167,0.219529,1192,5265,6457 
+462,2012-04-06,2,1,4,0,5,1,1,0.403333,0.390767,0.377083,0.300388,1807,4653,6460 +463,2012-04-07,2,1,4,0,6,0,1,0.4375,0.426129,0.254167,0.274871,3252,3605,6857 +464,2012-04-08,2,1,4,0,0,0,1,0.5,0.492425,0.275833,0.232596,2230,2939,5169 +465,2012-04-09,2,1,4,0,1,1,1,0.489167,0.476638,0.3175,0.358196,905,4680,5585 +466,2012-04-10,2,1,4,0,2,1,1,0.446667,0.436233,0.435,0.249375,819,5099,5918 +467,2012-04-11,2,1,4,0,3,1,1,0.348696,0.337274,0.469565,0.295274,482,4380,4862 +468,2012-04-12,2,1,4,0,4,1,1,0.3975,0.387604,0.46625,0.290429,663,4746,5409 +469,2012-04-13,2,1,4,0,5,1,1,0.4425,0.431808,0.408333,0.155471,1252,5146,6398 +470,2012-04-14,2,1,4,0,6,0,1,0.495,0.487996,0.502917,0.190917,2795,4665,7460 +471,2012-04-15,2,1,4,0,0,0,1,0.606667,0.573875,0.507917,0.225129,2846,4286,7132 +472,2012-04-16,2,1,4,1,1,0,1,0.664167,0.614925,0.561667,0.284829,1198,5172,6370 +473,2012-04-17,2,1,4,0,2,1,1,0.608333,0.598487,0.390417,0.273629,989,5702,6691 +474,2012-04-18,2,1,4,0,3,1,2,0.463333,0.457038,0.569167,0.167912,347,4020,4367 +475,2012-04-19,2,1,4,0,4,1,1,0.498333,0.493046,0.6125,0.0659292,846,5719,6565 +476,2012-04-20,2,1,4,0,5,1,1,0.526667,0.515775,0.694583,0.149871,1340,5950,7290 +477,2012-04-21,2,1,4,0,6,0,1,0.57,0.542921,0.682917,0.283587,2541,4083,6624 +478,2012-04-22,2,1,4,0,0,0,3,0.396667,0.389504,0.835417,0.344546,120,907,1027 +479,2012-04-23,2,1,4,0,1,1,2,0.321667,0.301125,0.766667,0.303496,195,3019,3214 +480,2012-04-24,2,1,4,0,2,1,1,0.413333,0.405283,0.454167,0.249383,518,5115,5633 +481,2012-04-25,2,1,4,0,3,1,1,0.476667,0.470317,0.427917,0.118792,655,5541,6196 +482,2012-04-26,2,1,4,0,4,1,2,0.498333,0.483583,0.756667,0.176625,475,4551,5026 +483,2012-04-27,2,1,4,0,5,1,1,0.4575,0.452637,0.400833,0.347633,1014,5219,6233 +484,2012-04-28,2,1,4,0,6,0,2,0.376667,0.377504,0.489583,0.129975,1120,3100,4220 +485,2012-04-29,2,1,4,0,0,0,1,0.458333,0.450121,0.587083,0.116908,2229,4075,6304 +486,2012-04-30,2,1,4,0,1,1,2,0.464167,0.457696,0.57,0.171638,665,4907,5572 
+487,2012-05-01,2,1,5,0,2,1,2,0.613333,0.577021,0.659583,0.156096,653,5087,5740 +488,2012-05-02,2,1,5,0,3,1,1,0.564167,0.537896,0.797083,0.138058,667,5502,6169 +489,2012-05-03,2,1,5,0,4,1,2,0.56,0.537242,0.768333,0.133696,764,5657,6421 +490,2012-05-04,2,1,5,0,5,1,1,0.6275,0.590917,0.735417,0.162938,1069,5227,6296 +491,2012-05-05,2,1,5,0,6,0,2,0.621667,0.584608,0.756667,0.152992,2496,4387,6883 +492,2012-05-06,2,1,5,0,0,0,2,0.5625,0.546737,0.74,0.149879,2135,4224,6359 +493,2012-05-07,2,1,5,0,1,1,2,0.5375,0.527142,0.664167,0.230721,1008,5265,6273 +494,2012-05-08,2,1,5,0,2,1,2,0.581667,0.557471,0.685833,0.296029,738,4990,5728 +495,2012-05-09,2,1,5,0,3,1,2,0.575,0.553025,0.744167,0.216412,620,4097,4717 +496,2012-05-10,2,1,5,0,4,1,1,0.505833,0.491783,0.552083,0.314063,1026,5546,6572 +497,2012-05-11,2,1,5,0,5,1,1,0.533333,0.520833,0.360417,0.236937,1319,5711,7030 +498,2012-05-12,2,1,5,0,6,0,1,0.564167,0.544817,0.480417,0.123133,2622,4807,7429 +499,2012-05-13,2,1,5,0,0,0,1,0.6125,0.585238,0.57625,0.225117,2172,3946,6118 +500,2012-05-14,2,1,5,0,1,1,2,0.573333,0.5499,0.789583,0.212692,342,2501,2843 +501,2012-05-15,2,1,5,0,2,1,2,0.611667,0.576404,0.794583,0.147392,625,4490,5115 +502,2012-05-16,2,1,5,0,3,1,1,0.636667,0.595975,0.697917,0.122512,991,6433,7424 +503,2012-05-17,2,1,5,0,4,1,1,0.593333,0.572613,0.52,0.229475,1242,6142,7384 +504,2012-05-18,2,1,5,0,5,1,1,0.564167,0.551121,0.523333,0.136817,1521,6118,7639 +505,2012-05-19,2,1,5,0,6,0,1,0.6,0.566908,0.45625,0.083975,3410,4884,8294 +506,2012-05-20,2,1,5,0,0,0,1,0.620833,0.583967,0.530417,0.254367,2704,4425,7129 +507,2012-05-21,2,1,5,0,1,1,2,0.598333,0.565667,0.81125,0.233204,630,3729,4359 +508,2012-05-22,2,1,5,0,2,1,2,0.615,0.580825,0.765833,0.118167,819,5254,6073 +509,2012-05-23,2,1,5,0,3,1,2,0.621667,0.584612,0.774583,0.102,766,4494,5260 +510,2012-05-24,2,1,5,0,4,1,1,0.655,0.6067,0.716667,0.172896,1059,5711,6770 +511,2012-05-25,2,1,5,0,5,1,1,0.68,0.627529,0.747083,0.14055,1417,5317,6734 
+512,2012-05-26,2,1,5,0,6,0,1,0.6925,0.642696,0.7325,0.198992,2855,3681,6536 +513,2012-05-27,2,1,5,0,0,0,1,0.69,0.641425,0.697083,0.215171,3283,3308,6591 +514,2012-05-28,2,1,5,1,1,0,1,0.7125,0.6793,0.67625,0.196521,2557,3486,6043 +515,2012-05-29,2,1,5,0,2,1,1,0.7225,0.672992,0.684583,0.2954,880,4863,5743 +516,2012-05-30,2,1,5,0,3,1,2,0.656667,0.611129,0.67,0.134329,745,6110,6855 +517,2012-05-31,2,1,5,0,4,1,1,0.68,0.631329,0.492917,0.195279,1100,6238,7338 +518,2012-06-01,2,1,6,0,5,1,2,0.654167,0.607962,0.755417,0.237563,533,3594,4127 +519,2012-06-02,2,1,6,0,6,0,1,0.583333,0.566288,0.549167,0.186562,2795,5325,8120 +520,2012-06-03,2,1,6,0,0,0,1,0.6025,0.575133,0.493333,0.184087,2494,5147,7641 +521,2012-06-04,2,1,6,0,1,1,1,0.5975,0.578283,0.487083,0.284833,1071,5927,6998 +522,2012-06-05,2,1,6,0,2,1,2,0.540833,0.525892,0.613333,0.209575,968,6033,7001 +523,2012-06-06,2,1,6,0,3,1,1,0.554167,0.542292,0.61125,0.077125,1027,6028,7055 +524,2012-06-07,2,1,6,0,4,1,1,0.6025,0.569442,0.567083,0.15735,1038,6456,7494 +525,2012-06-08,2,1,6,0,5,1,1,0.649167,0.597862,0.467917,0.175383,1488,6248,7736 +526,2012-06-09,2,1,6,0,6,0,1,0.710833,0.648367,0.437083,0.144287,2708,4790,7498 +527,2012-06-10,2,1,6,0,0,0,1,0.726667,0.663517,0.538333,0.133721,2224,4374,6598 +528,2012-06-11,2,1,6,0,1,1,2,0.720833,0.659721,0.587917,0.207713,1017,5647,6664 +529,2012-06-12,2,1,6,0,2,1,2,0.653333,0.597875,0.833333,0.214546,477,4495,4972 +530,2012-06-13,2,1,6,0,3,1,1,0.655833,0.611117,0.582083,0.343279,1173,6248,7421 +531,2012-06-14,2,1,6,0,4,1,1,0.648333,0.624383,0.569583,0.253733,1180,6183,7363 +532,2012-06-15,2,1,6,0,5,1,1,0.639167,0.599754,0.589583,0.176617,1563,6102,7665 +533,2012-06-16,2,1,6,0,6,0,1,0.631667,0.594708,0.504167,0.166667,2963,4739,7702 +534,2012-06-17,2,1,6,0,0,0,1,0.5925,0.571975,0.59875,0.144904,2634,4344,6978 +535,2012-06-18,2,1,6,0,1,1,2,0.568333,0.544842,0.777917,0.174746,653,4446,5099 +536,2012-06-19,2,1,6,0,2,1,1,0.688333,0.654692,0.69,0.148017,968,5857,6825 
+537,2012-06-20,2,1,6,0,3,1,1,0.7825,0.720975,0.592083,0.113812,872,5339,6211 +538,2012-06-21,3,1,6,0,4,1,1,0.805833,0.752542,0.567917,0.118787,778,5127,5905 +539,2012-06-22,3,1,6,0,5,1,1,0.7775,0.724121,0.57375,0.182842,964,4859,5823 +540,2012-06-23,3,1,6,0,6,0,1,0.731667,0.652792,0.534583,0.179721,2657,4801,7458 +541,2012-06-24,3,1,6,0,0,0,1,0.743333,0.674254,0.479167,0.145525,2551,4340,6891 +542,2012-06-25,3,1,6,0,1,1,1,0.715833,0.654042,0.504167,0.300383,1139,5640,6779 +543,2012-06-26,3,1,6,0,2,1,1,0.630833,0.594704,0.373333,0.347642,1077,6365,7442 +544,2012-06-27,3,1,6,0,3,1,1,0.6975,0.640792,0.36,0.271775,1077,6258,7335 +545,2012-06-28,3,1,6,0,4,1,1,0.749167,0.675512,0.4225,0.17165,921,5958,6879 +546,2012-06-29,3,1,6,0,5,1,1,0.834167,0.786613,0.48875,0.165417,829,4634,5463 +547,2012-06-30,3,1,6,0,6,0,1,0.765,0.687508,0.60125,0.161071,1455,4232,5687 +548,2012-07-01,3,1,7,0,0,0,1,0.815833,0.750629,0.51875,0.168529,1421,4110,5531 +549,2012-07-02,3,1,7,0,1,1,1,0.781667,0.702038,0.447083,0.195267,904,5323,6227 +550,2012-07-03,3,1,7,0,2,1,1,0.780833,0.70265,0.492083,0.126237,1052,5608,6660 +551,2012-07-04,3,1,7,1,3,0,1,0.789167,0.732337,0.53875,0.13495,2562,4841,7403 +552,2012-07-05,3,1,7,0,4,1,1,0.8275,0.761367,0.457917,0.194029,1405,4836,6241 +553,2012-07-06,3,1,7,0,5,1,1,0.828333,0.752533,0.450833,0.146142,1366,4841,6207 +554,2012-07-07,3,1,7,0,6,0,1,0.861667,0.804913,0.492083,0.163554,1448,3392,4840 +555,2012-07-08,3,1,7,0,0,0,1,0.8225,0.790396,0.57375,0.125629,1203,3469,4672 +556,2012-07-09,3,1,7,0,1,1,2,0.710833,0.654054,0.683333,0.180975,998,5571,6569 +557,2012-07-10,3,1,7,0,2,1,2,0.720833,0.664796,0.6675,0.151737,954,5336,6290 +558,2012-07-11,3,1,7,0,3,1,1,0.716667,0.650271,0.633333,0.151733,975,6289,7264 +559,2012-07-12,3,1,7,0,4,1,1,0.715833,0.654683,0.529583,0.146775,1032,6414,7446 +560,2012-07-13,3,1,7,0,5,1,2,0.731667,0.667933,0.485833,0.08085,1511,5988,7499 +561,2012-07-14,3,1,7,0,6,0,2,0.703333,0.666042,0.699167,0.143679,2355,4614,6969 
+562,2012-07-15,3,1,7,0,0,0,1,0.745833,0.705196,0.717917,0.166667,1920,4111,6031 +563,2012-07-16,3,1,7,0,1,1,1,0.763333,0.724125,0.645,0.164187,1088,5742,6830 +564,2012-07-17,3,1,7,0,2,1,1,0.818333,0.755683,0.505833,0.114429,921,5865,6786 +565,2012-07-18,3,1,7,0,3,1,1,0.793333,0.745583,0.577083,0.137442,799,4914,5713 +566,2012-07-19,3,1,7,0,4,1,1,0.77,0.714642,0.600417,0.165429,888,5703,6591 +567,2012-07-20,3,1,7,0,5,1,2,0.665833,0.613025,0.844167,0.208967,747,5123,5870 +568,2012-07-21,3,1,7,0,6,0,3,0.595833,0.549912,0.865417,0.2133,1264,3195,4459 +569,2012-07-22,3,1,7,0,0,0,2,0.6675,0.623125,0.7625,0.0939208,2544,4866,7410 +570,2012-07-23,3,1,7,0,1,1,1,0.741667,0.690017,0.694167,0.138683,1135,5831,6966 +571,2012-07-24,3,1,7,0,2,1,1,0.750833,0.70645,0.655,0.211454,1140,6452,7592 +572,2012-07-25,3,1,7,0,3,1,1,0.724167,0.654054,0.45,0.1648,1383,6790,8173 +573,2012-07-26,3,1,7,0,4,1,1,0.776667,0.739263,0.596667,0.284813,1036,5825,6861 +574,2012-07-27,3,1,7,0,5,1,1,0.781667,0.734217,0.594583,0.152992,1259,5645,6904 +575,2012-07-28,3,1,7,0,6,0,1,0.755833,0.697604,0.613333,0.15735,2234,4451,6685 +576,2012-07-29,3,1,7,0,0,0,1,0.721667,0.667933,0.62375,0.170396,2153,4444,6597 +577,2012-07-30,3,1,7,0,1,1,1,0.730833,0.684987,0.66875,0.153617,1040,6065,7105 +578,2012-07-31,3,1,7,0,2,1,1,0.713333,0.662896,0.704167,0.165425,968,6248,7216 +579,2012-08-01,3,1,8,0,3,1,1,0.7175,0.667308,0.6775,0.141179,1074,6506,7580 +580,2012-08-02,3,1,8,0,4,1,1,0.7525,0.707088,0.659583,0.129354,983,6278,7261 +581,2012-08-03,3,1,8,0,5,1,2,0.765833,0.722867,0.6425,0.215792,1328,5847,7175 +582,2012-08-04,3,1,8,0,6,0,1,0.793333,0.751267,0.613333,0.257458,2345,4479,6824 +583,2012-08-05,3,1,8,0,0,0,1,0.769167,0.731079,0.6525,0.290421,1707,3757,5464 +584,2012-08-06,3,1,8,0,1,1,2,0.7525,0.710246,0.654167,0.129354,1233,5780,7013 +585,2012-08-07,3,1,8,0,2,1,2,0.735833,0.697621,0.70375,0.116908,1278,5995,7273 +586,2012-08-08,3,1,8,0,3,1,2,0.75,0.707717,0.672917,0.1107,1263,6271,7534 
+587,2012-08-09,3,1,8,0,4,1,1,0.755833,0.699508,0.620417,0.1561,1196,6090,7286 +588,2012-08-10,3,1,8,0,5,1,2,0.715833,0.667942,0.715833,0.238813,1065,4721,5786 +589,2012-08-11,3,1,8,0,6,0,2,0.6925,0.638267,0.732917,0.206479,2247,4052,6299 +590,2012-08-12,3,1,8,0,0,0,1,0.700833,0.644579,0.530417,0.122512,2182,4362,6544 +591,2012-08-13,3,1,8,0,1,1,1,0.720833,0.662254,0.545417,0.136212,1207,5676,6883 +592,2012-08-14,3,1,8,0,2,1,1,0.726667,0.676779,0.686667,0.169158,1128,5656,6784 +593,2012-08-15,3,1,8,0,3,1,1,0.706667,0.654037,0.619583,0.169771,1198,6149,7347 +594,2012-08-16,3,1,8,0,4,1,1,0.719167,0.654688,0.519167,0.141796,1338,6267,7605 +595,2012-08-17,3,1,8,0,5,1,1,0.723333,0.2424,0.570833,0.231354,1483,5665,7148 +596,2012-08-18,3,1,8,0,6,0,1,0.678333,0.618071,0.603333,0.177867,2827,5038,7865 +597,2012-08-19,3,1,8,0,0,0,2,0.635833,0.603554,0.711667,0.08645,1208,3341,4549 +598,2012-08-20,3,1,8,0,1,1,2,0.635833,0.595967,0.734167,0.129979,1026,5504,6530 +599,2012-08-21,3,1,8,0,2,1,1,0.649167,0.601025,0.67375,0.0727708,1081,5925,7006 +600,2012-08-22,3,1,8,0,3,1,1,0.6675,0.621854,0.677083,0.0702833,1094,6281,7375 +601,2012-08-23,3,1,8,0,4,1,1,0.695833,0.637008,0.635833,0.0845958,1363,6402,7765 +602,2012-08-24,3,1,8,0,5,1,2,0.7025,0.6471,0.615,0.0721458,1325,6257,7582 +603,2012-08-25,3,1,8,0,6,0,2,0.661667,0.618696,0.712917,0.244408,1829,4224,6053 +604,2012-08-26,3,1,8,0,0,0,2,0.653333,0.595996,0.845833,0.228858,1483,3772,5255 +605,2012-08-27,3,1,8,0,1,1,1,0.703333,0.654688,0.730417,0.128733,989,5928,6917 +606,2012-08-28,3,1,8,0,2,1,1,0.728333,0.66605,0.62,0.190925,935,6105,7040 +607,2012-08-29,3,1,8,0,3,1,1,0.685,0.635733,0.552083,0.112562,1177,6520,7697 +608,2012-08-30,3,1,8,0,4,1,1,0.706667,0.652779,0.590417,0.0771167,1172,6541,7713 +609,2012-08-31,3,1,8,0,5,1,1,0.764167,0.6894,0.5875,0.168533,1433,5917,7350 +610,2012-09-01,3,1,9,0,6,0,2,0.753333,0.702654,0.638333,0.113187,2352,3788,6140 +611,2012-09-02,3,1,9,0,0,0,2,0.696667,0.649,0.815,0.0640708,2613,3197,5810 
+612,2012-09-03,3,1,9,1,1,0,1,0.7075,0.661629,0.790833,0.151121,1965,4069,6034 +613,2012-09-04,3,1,9,0,2,1,1,0.725833,0.686888,0.755,0.236321,867,5997,6864 +614,2012-09-05,3,1,9,0,3,1,1,0.736667,0.708983,0.74125,0.187808,832,6280,7112 +615,2012-09-06,3,1,9,0,4,1,2,0.696667,0.655329,0.810417,0.142421,611,5592,6203 +616,2012-09-07,3,1,9,0,5,1,1,0.703333,0.657204,0.73625,0.171646,1045,6459,7504 +617,2012-09-08,3,1,9,0,6,0,2,0.659167,0.611121,0.799167,0.281104,1557,4419,5976 +618,2012-09-09,3,1,9,0,0,0,1,0.61,0.578925,0.5475,0.224496,2570,5657,8227 +619,2012-09-10,3,1,9,0,1,1,1,0.583333,0.565654,0.50375,0.258713,1118,6407,7525 +620,2012-09-11,3,1,9,0,2,1,1,0.5775,0.554292,0.52,0.0920542,1070,6697,7767 +621,2012-09-12,3,1,9,0,3,1,1,0.599167,0.570075,0.577083,0.131846,1050,6820,7870 +622,2012-09-13,3,1,9,0,4,1,1,0.6125,0.579558,0.637083,0.0827208,1054,6750,7804 +623,2012-09-14,3,1,9,0,5,1,1,0.633333,0.594083,0.6725,0.103863,1379,6630,8009 +624,2012-09-15,3,1,9,0,6,0,1,0.608333,0.585867,0.501667,0.247521,3160,5554,8714 +625,2012-09-16,3,1,9,0,0,0,1,0.58,0.563125,0.57,0.0901833,2166,5167,7333 +626,2012-09-17,3,1,9,0,1,1,2,0.580833,0.55305,0.734583,0.151742,1022,5847,6869 +627,2012-09-18,3,1,9,0,2,1,2,0.623333,0.565067,0.8725,0.357587,371,3702,4073 +628,2012-09-19,3,1,9,0,3,1,1,0.5525,0.540404,0.536667,0.215175,788,6803,7591 +629,2012-09-20,3,1,9,0,4,1,1,0.546667,0.532192,0.618333,0.118167,939,6781,7720 +630,2012-09-21,3,1,9,0,5,1,1,0.599167,0.571971,0.66875,0.154229,1250,6917,8167 +631,2012-09-22,3,1,9,0,6,0,1,0.65,0.610488,0.646667,0.283583,2512,5883,8395 +632,2012-09-23,4,1,9,0,0,0,1,0.529167,0.518933,0.467083,0.223258,2454,5453,7907 +633,2012-09-24,4,1,9,0,1,1,1,0.514167,0.502513,0.492917,0.142404,1001,6435,7436 +634,2012-09-25,4,1,9,0,2,1,1,0.55,0.544179,0.57,0.236321,845,6693,7538 +635,2012-09-26,4,1,9,0,3,1,1,0.635,0.596613,0.630833,0.2444,787,6946,7733 +636,2012-09-27,4,1,9,0,4,1,2,0.65,0.607975,0.690833,0.134342,751,6642,7393 
+637,2012-09-28,4,1,9,0,5,1,2,0.619167,0.585863,0.69,0.164179,1045,6370,7415 +638,2012-09-29,4,1,9,0,6,0,1,0.5425,0.530296,0.542917,0.227604,2589,5966,8555 +639,2012-09-30,4,1,9,0,0,0,1,0.526667,0.517663,0.583333,0.134958,2015,4874,6889 +640,2012-10-01,4,1,10,0,1,1,2,0.520833,0.512,0.649167,0.0908042,763,6015,6778 +641,2012-10-02,4,1,10,0,2,1,3,0.590833,0.542333,0.871667,0.104475,315,4324,4639 +642,2012-10-03,4,1,10,0,3,1,2,0.6575,0.599133,0.79375,0.0665458,728,6844,7572 +643,2012-10-04,4,1,10,0,4,1,2,0.6575,0.607975,0.722917,0.117546,891,6437,7328 +644,2012-10-05,4,1,10,0,5,1,1,0.615,0.580187,0.6275,0.10635,1516,6640,8156 +645,2012-10-06,4,1,10,0,6,0,1,0.554167,0.538521,0.664167,0.268025,3031,4934,7965 +646,2012-10-07,4,1,10,0,0,0,2,0.415833,0.419813,0.708333,0.141162,781,2729,3510 +647,2012-10-08,4,1,10,1,1,0,2,0.383333,0.387608,0.709583,0.189679,874,4604,5478 +648,2012-10-09,4,1,10,0,2,1,2,0.446667,0.438112,0.761667,0.1903,601,5791,6392 +649,2012-10-10,4,1,10,0,3,1,1,0.514167,0.503142,0.630833,0.187821,780,6911,7691 +650,2012-10-11,4,1,10,0,4,1,1,0.435,0.431167,0.463333,0.181596,834,6736,7570 +651,2012-10-12,4,1,10,0,5,1,1,0.4375,0.433071,0.539167,0.235092,1060,6222,7282 +652,2012-10-13,4,1,10,0,6,0,1,0.393333,0.391396,0.494583,0.146142,2252,4857,7109 +653,2012-10-14,4,1,10,0,0,0,1,0.521667,0.508204,0.640417,0.278612,2080,4559,6639 +654,2012-10-15,4,1,10,0,1,1,2,0.561667,0.53915,0.7075,0.296037,760,5115,5875 +655,2012-10-16,4,1,10,0,2,1,1,0.468333,0.460846,0.558333,0.182221,922,6612,7534 +656,2012-10-17,4,1,10,0,3,1,1,0.455833,0.450108,0.692917,0.101371,979,6482,7461 +657,2012-10-18,4,1,10,0,4,1,2,0.5225,0.512625,0.728333,0.236937,1008,6501,7509 +658,2012-10-19,4,1,10,0,5,1,2,0.563333,0.537896,0.815,0.134954,753,4671,5424 +659,2012-10-20,4,1,10,0,6,0,1,0.484167,0.472842,0.572917,0.117537,2806,5284,8090 +660,2012-10-21,4,1,10,0,0,0,1,0.464167,0.456429,0.51,0.166054,2132,4692,6824 +661,2012-10-22,4,1,10,0,1,1,1,0.4875,0.482942,0.568333,0.0814833,830,6228,7058 
+662,2012-10-23,4,1,10,0,2,1,1,0.544167,0.530304,0.641667,0.0945458,841,6625,7466 +663,2012-10-24,4,1,10,0,3,1,1,0.5875,0.558721,0.63625,0.0727792,795,6898,7693 +664,2012-10-25,4,1,10,0,4,1,2,0.55,0.529688,0.800417,0.124375,875,6484,7359 +665,2012-10-26,4,1,10,0,5,1,2,0.545833,0.52275,0.807083,0.132467,1182,6262,7444 +666,2012-10-27,4,1,10,0,6,0,2,0.53,0.515133,0.72,0.235692,2643,5209,7852 +667,2012-10-28,4,1,10,0,0,0,2,0.4775,0.467771,0.694583,0.398008,998,3461,4459 +668,2012-10-29,4,1,10,0,1,1,3,0.44,0.4394,0.88,0.3582,2,20,22 +669,2012-10-30,4,1,10,0,2,1,2,0.318182,0.309909,0.825455,0.213009,87,1009,1096 +670,2012-10-31,4,1,10,0,3,1,2,0.3575,0.3611,0.666667,0.166667,419,5147,5566 +671,2012-11-01,4,1,11,0,4,1,2,0.365833,0.369942,0.581667,0.157346,466,5520,5986 +672,2012-11-02,4,1,11,0,5,1,1,0.355,0.356042,0.522083,0.266175,618,5229,5847 +673,2012-11-03,4,1,11,0,6,0,2,0.343333,0.323846,0.49125,0.270529,1029,4109,5138 +674,2012-11-04,4,1,11,0,0,0,1,0.325833,0.329538,0.532917,0.179108,1201,3906,5107 +675,2012-11-05,4,1,11,0,1,1,1,0.319167,0.308075,0.494167,0.236325,378,4881,5259 +676,2012-11-06,4,1,11,0,2,1,1,0.280833,0.281567,0.567083,0.173513,466,5220,5686 +677,2012-11-07,4,1,11,0,3,1,2,0.295833,0.274621,0.5475,0.304108,326,4709,5035 +678,2012-11-08,4,1,11,0,4,1,1,0.352174,0.341891,0.333478,0.347835,340,4975,5315 +679,2012-11-09,4,1,11,0,5,1,1,0.361667,0.355413,0.540833,0.214558,709,5283,5992 +680,2012-11-10,4,1,11,0,6,0,1,0.389167,0.393937,0.645417,0.0578458,2090,4446,6536 +681,2012-11-11,4,1,11,0,0,0,1,0.420833,0.421713,0.659167,0.1275,2290,4562,6852 +682,2012-11-12,4,1,11,1,1,0,1,0.485,0.475383,0.741667,0.173517,1097,5172,6269 +683,2012-11-13,4,1,11,0,2,1,2,0.343333,0.323225,0.662917,0.342046,327,3767,4094 +684,2012-11-14,4,1,11,0,3,1,1,0.289167,0.281563,0.552083,0.199625,373,5122,5495 +685,2012-11-15,4,1,11,0,4,1,2,0.321667,0.324492,0.620417,0.152987,320,5125,5445 +686,2012-11-16,4,1,11,0,5,1,1,0.345,0.347204,0.524583,0.171025,484,5214,5698 
+687,2012-11-17,4,1,11,0,6,0,1,0.325,0.326383,0.545417,0.179729,1313,4316,5629 +688,2012-11-18,4,1,11,0,0,0,1,0.3425,0.337746,0.692917,0.227612,922,3747,4669 +689,2012-11-19,4,1,11,0,1,1,2,0.380833,0.375621,0.623333,0.235067,449,5050,5499 +690,2012-11-20,4,1,11,0,2,1,2,0.374167,0.380667,0.685,0.082725,534,5100,5634 +691,2012-11-21,4,1,11,0,3,1,1,0.353333,0.364892,0.61375,0.103246,615,4531,5146 +692,2012-11-22,4,1,11,1,4,0,1,0.34,0.350371,0.580417,0.0528708,955,1470,2425 +693,2012-11-23,4,1,11,0,5,1,1,0.368333,0.378779,0.56875,0.148021,1603,2307,3910 +694,2012-11-24,4,1,11,0,6,0,1,0.278333,0.248742,0.404583,0.376871,532,1745,2277 +695,2012-11-25,4,1,11,0,0,0,1,0.245833,0.257583,0.468333,0.1505,309,2115,2424 +696,2012-11-26,4,1,11,0,1,1,1,0.313333,0.339004,0.535417,0.04665,337,4750,5087 +697,2012-11-27,4,1,11,0,2,1,2,0.291667,0.281558,0.786667,0.237562,123,3836,3959 +698,2012-11-28,4,1,11,0,3,1,1,0.296667,0.289762,0.50625,0.210821,198,5062,5260 +699,2012-11-29,4,1,11,0,4,1,1,0.28087,0.298422,0.555652,0.115522,243,5080,5323 +700,2012-11-30,4,1,11,0,5,1,1,0.298333,0.323867,0.649583,0.0584708,362,5306,5668 +701,2012-12-01,4,1,12,0,6,0,2,0.298333,0.316904,0.806667,0.0597042,951,4240,5191 +702,2012-12-02,4,1,12,0,0,0,2,0.3475,0.359208,0.823333,0.124379,892,3757,4649 +703,2012-12-03,4,1,12,0,1,1,1,0.4525,0.455796,0.7675,0.0827208,555,5679,6234 +704,2012-12-04,4,1,12,0,2,1,1,0.475833,0.469054,0.73375,0.174129,551,6055,6606 +705,2012-12-05,4,1,12,0,3,1,1,0.438333,0.428012,0.485,0.324021,331,5398,5729 +706,2012-12-06,4,1,12,0,4,1,1,0.255833,0.258204,0.50875,0.174754,340,5035,5375 +707,2012-12-07,4,1,12,0,5,1,2,0.320833,0.321958,0.764167,0.1306,349,4659,5008 +708,2012-12-08,4,1,12,0,6,0,2,0.381667,0.389508,0.91125,0.101379,1153,4429,5582 +709,2012-12-09,4,1,12,0,0,0,2,0.384167,0.390146,0.905417,0.157975,441,2787,3228 +710,2012-12-10,4,1,12,0,1,1,2,0.435833,0.435575,0.925,0.190308,329,4841,5170 +711,2012-12-11,4,1,12,0,2,1,2,0.353333,0.338363,0.596667,0.296037,282,5219,5501 
+712,2012-12-12,4,1,12,0,3,1,2,0.2975,0.297338,0.538333,0.162937,310,5009,5319 +713,2012-12-13,4,1,12,0,4,1,1,0.295833,0.294188,0.485833,0.174129,425,5107,5532 +714,2012-12-14,4,1,12,0,5,1,1,0.281667,0.294192,0.642917,0.131229,429,5182,5611 +715,2012-12-15,4,1,12,0,6,0,1,0.324167,0.338383,0.650417,0.10635,767,4280,5047 +716,2012-12-16,4,1,12,0,0,0,2,0.3625,0.369938,0.83875,0.100742,538,3248,3786 +717,2012-12-17,4,1,12,0,1,1,2,0.393333,0.4015,0.907083,0.0982583,212,4373,4585 +718,2012-12-18,4,1,12,0,2,1,1,0.410833,0.409708,0.66625,0.221404,433,5124,5557 +719,2012-12-19,4,1,12,0,3,1,1,0.3325,0.342162,0.625417,0.184092,333,4934,5267 +720,2012-12-20,4,1,12,0,4,1,2,0.33,0.335217,0.667917,0.132463,314,3814,4128 +721,2012-12-21,1,1,12,0,5,1,2,0.326667,0.301767,0.556667,0.374383,221,3402,3623 +722,2012-12-22,1,1,12,0,6,0,1,0.265833,0.236113,0.44125,0.407346,205,1544,1749 +723,2012-12-23,1,1,12,0,0,0,1,0.245833,0.259471,0.515417,0.133083,408,1379,1787 +724,2012-12-24,1,1,12,0,1,1,2,0.231304,0.2589,0.791304,0.0772304,174,746,920 +725,2012-12-25,1,1,12,1,2,0,2,0.291304,0.294465,0.734783,0.168726,440,573,1013 +726,2012-12-26,1,1,12,0,3,1,3,0.243333,0.220333,0.823333,0.316546,9,432,441 +727,2012-12-27,1,1,12,0,4,1,2,0.254167,0.226642,0.652917,0.350133,247,1867,2114 +728,2012-12-28,1,1,12,0,5,1,2,0.253333,0.255046,0.59,0.155471,644,2451,3095 +729,2012-12-29,1,1,12,0,6,0,2,0.253333,0.2424,0.752917,0.124383,159,1182,1341 +730,2012-12-30,1,1,12,0,0,0,1,0.255833,0.2317,0.483333,0.350754,364,1432,1796 +731,2012-12-31,1,1,12,0,1,1,2,0.215833,0.223487,0.5775,0.154846,439,2290,2729 diff --git a/first-neural-network/DLND-your-first-network/Bike-Sharing-Dataset/hour.csv b/first-neural-network/DLND-your-first-network/Bike-Sharing-Dataset/hour.csv new file mode 100755 index 0000000..1592203 --- /dev/null +++ b/first-neural-network/DLND-your-first-network/Bike-Sharing-Dataset/hour.csv @@ -0,0 +1,17380 @@ 
+instant,dteday,season,yr,mnth,hr,holiday,weekday,workingday,weathersit,temp,atemp,hum,windspeed,casual,registered,cnt +1,2011-01-01,1,0,1,0,0,6,0,1,0.24,0.2879,0.81,0,3,13,16 +2,2011-01-01,1,0,1,1,0,6,0,1,0.22,0.2727,0.8,0,8,32,40 +3,2011-01-01,1,0,1,2,0,6,0,1,0.22,0.2727,0.8,0,5,27,32 +4,2011-01-01,1,0,1,3,0,6,0,1,0.24,0.2879,0.75,0,3,10,13 +5,2011-01-01,1,0,1,4,0,6,0,1,0.24,0.2879,0.75,0,0,1,1 +6,2011-01-01,1,0,1,5,0,6,0,2,0.24,0.2576,0.75,0.0896,0,1,1 +7,2011-01-01,1,0,1,6,0,6,0,1,0.22,0.2727,0.8,0,2,0,2 +8,2011-01-01,1,0,1,7,0,6,0,1,0.2,0.2576,0.86,0,1,2,3 +9,2011-01-01,1,0,1,8,0,6,0,1,0.24,0.2879,0.75,0,1,7,8 +10,2011-01-01,1,0,1,9,0,6,0,1,0.32,0.3485,0.76,0,8,6,14 +11,2011-01-01,1,0,1,10,0,6,0,1,0.38,0.3939,0.76,0.2537,12,24,36 +12,2011-01-01,1,0,1,11,0,6,0,1,0.36,0.3333,0.81,0.2836,26,30,56 +13,2011-01-01,1,0,1,12,0,6,0,1,0.42,0.4242,0.77,0.2836,29,55,84 +14,2011-01-01,1,0,1,13,0,6,0,2,0.46,0.4545,0.72,0.2985,47,47,94 +15,2011-01-01,1,0,1,14,0,6,0,2,0.46,0.4545,0.72,0.2836,35,71,106 +16,2011-01-01,1,0,1,15,0,6,0,2,0.44,0.4394,0.77,0.2985,40,70,110 +17,2011-01-01,1,0,1,16,0,6,0,2,0.42,0.4242,0.82,0.2985,41,52,93 +18,2011-01-01,1,0,1,17,0,6,0,2,0.44,0.4394,0.82,0.2836,15,52,67 +19,2011-01-01,1,0,1,18,0,6,0,3,0.42,0.4242,0.88,0.2537,9,26,35 +20,2011-01-01,1,0,1,19,0,6,0,3,0.42,0.4242,0.88,0.2537,6,31,37 +21,2011-01-01,1,0,1,20,0,6,0,2,0.4,0.4091,0.87,0.2537,11,25,36 +22,2011-01-01,1,0,1,21,0,6,0,2,0.4,0.4091,0.87,0.194,3,31,34 +23,2011-01-01,1,0,1,22,0,6,0,2,0.4,0.4091,0.94,0.2239,11,17,28 +24,2011-01-01,1,0,1,23,0,6,0,2,0.46,0.4545,0.88,0.2985,15,24,39 +25,2011-01-02,1,0,1,0,0,0,0,2,0.46,0.4545,0.88,0.2985,4,13,17 +26,2011-01-02,1,0,1,1,0,0,0,2,0.44,0.4394,0.94,0.2537,1,16,17 +27,2011-01-02,1,0,1,2,0,0,0,2,0.42,0.4242,1,0.2836,1,8,9 +28,2011-01-02,1,0,1,3,0,0,0,2,0.46,0.4545,0.94,0.194,2,4,6 +29,2011-01-02,1,0,1,4,0,0,0,2,0.46,0.4545,0.94,0.194,2,1,3 +30,2011-01-02,1,0,1,6,0,0,0,3,0.42,0.4242,0.77,0.2985,0,2,2 
+31,2011-01-02,1,0,1,7,0,0,0,2,0.4,0.4091,0.76,0.194,0,1,1 +32,2011-01-02,1,0,1,8,0,0,0,3,0.4,0.4091,0.71,0.2239,0,8,8 +33,2011-01-02,1,0,1,9,0,0,0,2,0.38,0.3939,0.76,0.2239,1,19,20 +34,2011-01-02,1,0,1,10,0,0,0,2,0.36,0.3485,0.81,0.2239,7,46,53 +35,2011-01-02,1,0,1,11,0,0,0,2,0.36,0.3333,0.71,0.2537,16,54,70 +36,2011-01-02,1,0,1,12,0,0,0,2,0.36,0.3333,0.66,0.2985,20,73,93 +37,2011-01-02,1,0,1,13,0,0,0,2,0.36,0.3485,0.66,0.1343,11,64,75 +38,2011-01-02,1,0,1,14,0,0,0,3,0.36,0.3485,0.76,0.194,4,55,59 +39,2011-01-02,1,0,1,15,0,0,0,3,0.34,0.3333,0.81,0.1642,19,55,74 +40,2011-01-02,1,0,1,16,0,0,0,3,0.34,0.3333,0.71,0.1642,9,67,76 +41,2011-01-02,1,0,1,17,0,0,0,1,0.34,0.3333,0.57,0.194,7,58,65 +42,2011-01-02,1,0,1,18,0,0,0,2,0.36,0.3333,0.46,0.3284,10,43,53 +43,2011-01-02,1,0,1,19,0,0,0,1,0.32,0.2879,0.42,0.4478,1,29,30 +44,2011-01-02,1,0,1,20,0,0,0,1,0.3,0.2727,0.39,0.3582,5,17,22 +45,2011-01-02,1,0,1,21,0,0,0,1,0.26,0.2273,0.44,0.3284,11,20,31 +46,2011-01-02,1,0,1,22,0,0,0,1,0.24,0.2121,0.44,0.2985,0,9,9 +47,2011-01-02,1,0,1,23,0,0,0,1,0.22,0.2273,0.47,0.1642,0,8,8 +48,2011-01-03,1,0,1,0,0,1,1,1,0.22,0.197,0.44,0.3582,0,5,5 +49,2011-01-03,1,0,1,1,0,1,1,1,0.2,0.1667,0.44,0.4179,0,2,2 +50,2011-01-03,1,0,1,4,0,1,1,1,0.16,0.1364,0.47,0.3881,0,1,1 +51,2011-01-03,1,0,1,5,0,1,1,1,0.16,0.1364,0.47,0.2836,0,3,3 +52,2011-01-03,1,0,1,6,0,1,1,1,0.14,0.1061,0.5,0.3881,0,30,30 +53,2011-01-03,1,0,1,7,0,1,1,1,0.14,0.1364,0.5,0.194,1,63,64 +54,2011-01-03,1,0,1,8,0,1,1,1,0.14,0.1212,0.5,0.2836,1,153,154 +55,2011-01-03,1,0,1,9,0,1,1,1,0.16,0.1364,0.43,0.3881,7,81,88 +56,2011-01-03,1,0,1,10,0,1,1,1,0.18,0.1667,0.43,0.2537,11,33,44 +57,2011-01-03,1,0,1,11,0,1,1,1,0.2,0.1818,0.4,0.3284,10,41,51 +58,2011-01-03,1,0,1,12,0,1,1,1,0.22,0.2121,0.35,0.2985,13,48,61 +59,2011-01-03,1,0,1,13,0,1,1,1,0.24,0.2121,0.35,0.2836,8,53,61 +60,2011-01-03,1,0,1,14,0,1,1,1,0.26,0.2424,0.3,0.2836,11,66,77 +61,2011-01-03,1,0,1,15,0,1,1,1,0.26,0.2424,0.3,0.2537,14,58,72 
+62,2011-01-03,1,0,1,16,0,1,1,1,0.26,0.2424,0.3,0.2537,9,67,76 +63,2011-01-03,1,0,1,17,0,1,1,1,0.24,0.2273,0.3,0.2239,11,146,157 +64,2011-01-03,1,0,1,18,0,1,1,1,0.24,0.2576,0.32,0.1045,9,148,157 +65,2011-01-03,1,0,1,19,0,1,1,1,0.2,0.2576,0.47,0,8,102,110 +66,2011-01-03,1,0,1,20,0,1,1,1,0.2,0.2273,0.47,0.1045,3,49,52 +67,2011-01-03,1,0,1,21,0,1,1,1,0.18,0.197,0.64,0.1343,3,49,52 +68,2011-01-03,1,0,1,22,0,1,1,1,0.14,0.1515,0.69,0.1343,0,20,20 +69,2011-01-03,1,0,1,23,0,1,1,1,0.18,0.2121,0.55,0.1045,1,11,12 +70,2011-01-04,1,0,1,0,0,2,1,1,0.16,0.1818,0.55,0.1045,0,5,5 +71,2011-01-04,1,0,1,1,0,2,1,1,0.16,0.1818,0.59,0.1045,0,2,2 +72,2011-01-04,1,0,1,2,0,2,1,1,0.14,0.1515,0.63,0.1343,0,1,1 +73,2011-01-04,1,0,1,4,0,2,1,1,0.14,0.1818,0.63,0.0896,0,2,2 +74,2011-01-04,1,0,1,5,0,2,1,1,0.12,0.1515,0.68,0.1045,0,4,4 +75,2011-01-04,1,0,1,6,0,2,1,1,0.12,0.1515,0.74,0.1045,0,36,36 +76,2011-01-04,1,0,1,7,0,2,1,1,0.12,0.1515,0.74,0.1343,2,92,94 +77,2011-01-04,1,0,1,8,0,2,1,1,0.14,0.1515,0.69,0.1642,2,177,179 +78,2011-01-04,1,0,1,9,0,2,1,1,0.16,0.1515,0.64,0.2239,2,98,100 +79,2011-01-04,1,0,1,10,0,2,1,2,0.16,0.1364,0.69,0.3284,5,37,42 +80,2011-01-04,1,0,1,11,0,2,1,1,0.22,0.2121,0.51,0.2985,7,50,57 +81,2011-01-04,1,0,1,12,0,2,1,1,0.22,0.2273,0.51,0.1642,12,66,78 +82,2011-01-04,1,0,1,13,0,2,1,1,0.24,0.2273,0.56,0.194,18,79,97 +83,2011-01-04,1,0,1,14,0,2,1,1,0.26,0.2576,0.52,0.2239,9,54,63 +84,2011-01-04,1,0,1,15,0,2,1,1,0.28,0.2727,0.52,0.2537,17,48,65 +85,2011-01-04,1,0,1,16,0,2,1,1,0.3,0.2879,0.49,0.2537,15,68,83 +86,2011-01-04,1,0,1,17,0,2,1,1,0.28,0.2727,0.48,0.2239,10,202,212 +87,2011-01-04,1,0,1,18,0,2,1,1,0.26,0.2576,0.48,0.194,3,179,182 +88,2011-01-04,1,0,1,19,0,2,1,1,0.24,0.2576,0.48,0.1045,2,110,112 +89,2011-01-04,1,0,1,20,0,2,1,1,0.24,0.2576,0.48,0.1045,1,53,54 +90,2011-01-04,1,0,1,21,0,2,1,1,0.22,0.2727,0.64,0,0,48,48 +91,2011-01-04,1,0,1,22,0,2,1,1,0.22,0.2576,0.64,0.0896,1,34,35 +92,2011-01-04,1,0,1,23,0,2,1,1,0.2,0.2273,0.69,0.0896,2,9,11 
+93,2011-01-05,1,0,1,0,0,3,1,1,0.2,0.2576,0.64,0,0,6,6 +94,2011-01-05,1,0,1,1,0,3,1,1,0.16,0.197,0.74,0.0896,0,6,6 +95,2011-01-05,1,0,1,2,0,3,1,1,0.16,0.197,0.74,0.0896,0,2,2 +96,2011-01-05,1,0,1,4,0,3,1,1,0.24,0.2273,0.48,0.2239,0,2,2 +97,2011-01-05,1,0,1,5,0,3,1,1,0.22,0.2273,0.47,0.1642,0,3,3 +98,2011-01-05,1,0,1,6,0,3,1,1,0.2,0.197,0.47,0.2239,0,33,33 +99,2011-01-05,1,0,1,7,0,3,1,1,0.18,0.1818,0.43,0.194,1,87,88 +100,2011-01-05,1,0,1,8,0,3,1,1,0.2,0.1818,0.4,0.2985,3,192,195 +101,2011-01-05,1,0,1,9,0,3,1,1,0.22,0.197,0.37,0.3284,6,109,115 +102,2011-01-05,1,0,1,10,0,3,1,1,0.22,0.197,0.37,0.3284,4,53,57 +103,2011-01-05,1,0,1,11,0,3,1,1,0.26,0.2273,0.33,0.3284,12,34,46 +104,2011-01-05,1,0,1,12,0,3,1,1,0.26,0.2273,0.33,0.3284,5,74,79 +105,2011-01-05,1,0,1,13,0,3,1,1,0.28,0.2576,0.3,0.2985,6,65,71 +106,2011-01-05,1,0,1,14,0,3,1,1,0.3,0.2879,0.28,0.194,10,52,62 +107,2011-01-05,1,0,1,15,0,3,1,1,0.3,0.2879,0.28,0.194,7,55,62 +108,2011-01-05,1,0,1,16,0,3,1,1,0.3,0.3182,0.28,0.0896,4,85,89 +109,2011-01-05,1,0,1,17,0,3,1,1,0.24,0.2273,0.38,0.194,4,186,190 +110,2011-01-05,1,0,1,18,0,3,1,1,0.24,0.2424,0.38,0.1343,3,166,169 +111,2011-01-05,1,0,1,19,0,3,1,1,0.24,0.2576,0.38,0.1045,5,127,132 +112,2011-01-05,1,0,1,20,0,3,1,1,0.22,0.2273,0.47,0.1642,7,82,89 +113,2011-01-05,1,0,1,21,0,3,1,1,0.2,0.197,0.51,0.194,3,40,43 +114,2011-01-05,1,0,1,22,0,3,1,1,0.18,0.197,0.55,0.1343,1,41,42 +115,2011-01-05,1,0,1,23,0,3,1,1,0.2,0.2576,0.47,0,1,18,19 +116,2011-01-06,1,0,1,0,0,4,1,1,0.18,0.2424,0.55,0,0,11,11 +117,2011-01-06,1,0,1,1,0,4,1,1,0.16,0.2273,0.64,0,0,4,4 +118,2011-01-06,1,0,1,2,0,4,1,1,0.16,0.2273,0.64,0,0,2,2 +119,2011-01-06,1,0,1,4,0,4,1,2,0.16,0.197,0.64,0.0896,0,1,1 +120,2011-01-06,1,0,1,5,0,4,1,2,0.14,0.1818,0.69,0.0896,0,4,4 +121,2011-01-06,1,0,1,6,0,4,1,2,0.14,0.1667,0.63,0.1045,0,36,36 +122,2011-01-06,1,0,1,7,0,4,1,2,0.16,0.2273,0.59,0,0,95,95 +123,2011-01-06,1,0,1,8,0,4,1,1,0.16,0.2273,0.59,0,3,216,219 +124,2011-01-06,1,0,1,9,0,4,1,2,0.18,0.2424,0.51,0,6,116,122 
+125,2011-01-06,1,0,1,10,0,4,1,1,0.2,0.2576,0.47,0,3,42,45 +126,2011-01-06,1,0,1,11,0,4,1,1,0.22,0.2576,0.44,0.0896,2,57,59 +127,2011-01-06,1,0,1,12,0,4,1,1,0.26,0.2879,0.35,0,6,78,84 +128,2011-01-06,1,0,1,13,0,4,1,1,0.26,0.2727,0.35,0.1045,12,55,67 +129,2011-01-06,1,0,1,14,0,4,1,1,0.28,0.2727,0.36,0.1642,11,59,70 +130,2011-01-06,1,0,1,15,0,4,1,1,0.28,0.2727,0.36,0,8,54,62 +131,2011-01-06,1,0,1,16,0,4,1,1,0.26,0.2576,0.38,0.1642,12,74,86 +132,2011-01-06,1,0,1,17,0,4,1,1,0.22,0.2273,0.51,0.1642,9,163,172 +133,2011-01-06,1,0,1,18,0,4,1,1,0.22,0.2273,0.51,0.1343,5,158,163 +134,2011-01-06,1,0,1,19,0,4,1,1,0.22,0.2576,0.55,0.0896,3,109,112 +135,2011-01-06,1,0,1,20,0,4,1,1,0.2,0.2121,0.51,0.1642,3,66,69 +136,2011-01-06,1,0,1,21,0,4,1,2,0.22,0.2121,0.55,0.2239,0,48,48 +137,2011-01-06,1,0,1,22,0,4,1,2,0.22,0.2121,0.51,0.2836,1,51,52 +138,2011-01-06,1,0,1,23,0,4,1,2,0.2,0.197,0.59,0.194,4,19,23 +139,2011-01-07,1,0,1,0,0,5,1,2,0.2,0.197,0.64,0.194,4,13,17 +140,2011-01-07,1,0,1,1,0,5,1,2,0.2,0.197,0.69,0.2239,2,5,7 +141,2011-01-07,1,0,1,2,0,5,1,2,0.2,0.197,0.69,0.2239,0,1,1 +142,2011-01-07,1,0,1,4,0,5,1,2,0.2,0.2121,0.69,0.1343,0,1,1 +143,2011-01-07,1,0,1,5,0,5,1,3,0.22,0.2727,0.55,0,0,5,5 +144,2011-01-07,1,0,1,6,0,5,1,2,0.2,0.2576,0.69,0,8,26,34 +145,2011-01-07,1,0,1,7,0,5,1,1,0.2,0.2121,0.69,0.1343,8,76,84 +146,2011-01-07,1,0,1,8,0,5,1,1,0.2,0.197,0.51,0.2537,20,190,210 +147,2011-01-07,1,0,1,9,0,5,1,1,0.2,0.1818,0.47,0.2985,9,125,134 +148,2011-01-07,1,0,1,10,0,5,1,1,0.22,0.197,0.37,0.3284,16,47,63 +149,2011-01-07,1,0,1,11,0,5,1,2,0.2,0.197,0.4,0.2239,19,48,67 +150,2011-01-07,1,0,1,12,0,5,1,2,0.2,0.197,0.37,0.2537,9,50,59 +151,2011-01-07,1,0,1,13,0,5,1,2,0.2,0.1818,0.37,0.2836,9,64,73 +152,2011-01-07,1,0,1,14,0,5,1,2,0.2,0.197,0.4,0.2537,7,43,50 +153,2011-01-07,1,0,1,15,0,5,1,2,0.2,0.2121,0.37,0.1642,9,63,72 +154,2011-01-07,1,0,1,16,0,5,1,2,0.2,0.2121,0.37,0.1642,5,82,87 +155,2011-01-07,1,0,1,17,0,5,1,2,0.2,0.2576,0.37,0,9,178,187 
+156,2011-01-07,1,0,1,18,0,5,1,1,0.2,0.2273,0.4,0.0896,7,116,123 +157,2011-01-07,1,0,1,19,0,5,1,1,0.16,0.197,0.55,0.0896,3,92,95 +158,2011-01-07,1,0,1,20,0,5,1,1,0.18,0.2121,0.47,0.1045,1,50,51 +159,2011-01-07,1,0,1,21,0,5,1,1,0.18,0.197,0.47,0.1343,0,39,39 +160,2011-01-07,1,0,1,22,0,5,1,2,0.18,0.197,0.43,0.1642,2,34,36 +161,2011-01-07,1,0,1,23,0,5,1,2,0.18,0.197,0.51,0.1642,1,14,15 +162,2011-01-08,1,0,1,0,0,6,0,2,0.18,0.197,0.51,0.1642,1,24,25 +163,2011-01-08,1,0,1,1,0,6,0,2,0.18,0.2121,0.55,0.0896,1,15,16 +164,2011-01-08,1,0,1,2,0,6,0,2,0.18,0.2424,0.55,0,3,13,16 +165,2011-01-08,1,0,1,3,0,6,0,3,0.18,0.197,0.55,0.1642,0,7,7 +166,2011-01-08,1,0,1,4,0,6,0,3,0.18,0.197,0.55,0.1642,0,1,1 +167,2011-01-08,1,0,1,5,0,6,0,2,0.16,0.1667,0.74,0.1642,0,5,5 +168,2011-01-08,1,0,1,6,0,6,0,2,0.16,0.1667,0.74,0.1642,0,2,2 +169,2011-01-08,1,0,1,7,0,6,0,2,0.16,0.1818,0.74,0.1045,1,8,9 +170,2011-01-08,1,0,1,8,0,6,0,3,0.16,0.1818,0.93,0.1045,0,15,15 +171,2011-01-08,1,0,1,9,0,6,0,3,0.16,0.1818,0.93,0.1045,0,20,20 +172,2011-01-08,1,0,1,10,0,6,0,2,0.18,0.197,0.8,0.1642,5,56,61 +173,2011-01-08,1,0,1,11,0,6,0,2,0.2,0.1818,0.69,0.3881,2,60,62 +174,2011-01-08,1,0,1,12,0,6,0,2,0.2,0.1818,0.59,0.3582,8,90,98 +175,2011-01-08,1,0,1,13,0,6,0,1,0.2,0.1818,0.44,0.3284,7,95,102 +176,2011-01-08,1,0,1,14,0,6,0,1,0.2,0.1667,0.32,0.4925,12,83,95 +177,2011-01-08,1,0,1,15,0,6,0,1,0.2,0.1667,0.32,0.4478,5,69,74 +178,2011-01-08,1,0,1,16,0,6,0,1,0.18,0.1364,0.29,0.4478,8,68,76 +179,2011-01-08,1,0,1,17,0,6,0,1,0.16,0.1212,0.37,0.5522,5,64,69 +180,2011-01-08,1,0,1,18,0,6,0,1,0.14,0.1212,0.39,0.2985,3,52,55 +181,2011-01-08,1,0,1,19,0,6,0,1,0.14,0.1212,0.36,0.2537,4,26,30 +182,2011-01-08,1,0,1,20,0,6,0,1,0.12,0.1212,0.36,0.2537,0,28,28 +183,2011-01-08,1,0,1,21,0,6,0,1,0.12,0.1061,0.39,0.3582,2,35,37 +184,2011-01-08,1,0,1,22,0,6,0,1,0.12,0.1061,0.36,0.3881,1,33,34 +185,2011-01-08,1,0,1,23,0,6,0,1,0.1,0.0606,0.39,0.4478,0,22,22 +186,2011-01-09,1,0,1,0,0,0,0,1,0.1,0.0758,0.42,0.3881,1,24,25 
+187,2011-01-09,1,0,1,1,0,0,0,1,0.1,0.0606,0.42,0.4627,0,12,12 +188,2011-01-09,1,0,1,2,0,0,0,1,0.1,0.0606,0.46,0.4627,0,11,11 +189,2011-01-09,1,0,1,3,0,0,0,1,0.1,0.0758,0.46,0.4179,0,4,4 +190,2011-01-09,1,0,1,4,0,0,0,1,0.08,0.0909,0.53,0.194,0,1,1 +191,2011-01-09,1,0,1,5,0,0,0,1,0.08,0.0909,0.53,0.194,0,1,1 +192,2011-01-09,1,0,1,6,0,0,0,1,0.1,0.0909,0.49,0.2836,0,1,1 +193,2011-01-09,1,0,1,7,0,0,0,1,0.08,0.0909,0.53,0.194,1,5,6 +194,2011-01-09,1,0,1,8,0,0,0,1,0.1,0.0909,0.49,0.2836,0,10,10 +195,2011-01-09,1,0,1,9,0,0,0,1,0.12,0.0758,0.46,0.5224,0,19,19 +196,2011-01-09,1,0,1,10,0,0,0,1,0.14,0.1061,0.43,0.3881,0,49,49 +197,2011-01-09,1,0,1,11,0,0,0,1,0.16,0.1212,0.4,0.5224,2,47,49 +198,2011-01-09,1,0,1,12,0,0,0,1,0.18,0.1364,0.37,0.4478,4,79,83 +199,2011-01-09,1,0,1,13,0,0,0,1,0.2,0.1667,0.34,0.4478,6,69,75 +200,2011-01-09,1,0,1,14,0,0,0,1,0.22,0.1818,0.32,0.4627,8,64,72 +201,2011-01-09,1,0,1,15,0,0,0,1,0.22,0.197,0.35,0.3582,5,77,82 +202,2011-01-09,1,0,1,16,0,0,0,1,0.2,0.1667,0.34,0.4478,13,79,92 +203,2011-01-09,1,0,1,17,0,0,0,1,0.18,0.1515,0.37,0.3881,3,59,62 +204,2011-01-09,1,0,1,18,0,0,0,1,0.16,0.1364,0.4,0.3284,4,44,48 +205,2011-01-09,1,0,1,19,0,0,0,1,0.16,0.1364,0.43,0.3284,1,40,41 +206,2011-01-09,1,0,1,20,0,0,0,1,0.14,0.1212,0.46,0.2537,0,38,38 +207,2011-01-09,1,0,1,21,0,0,0,1,0.14,0.1061,0.46,0.4179,1,19,20 +208,2011-01-09,1,0,1,22,0,0,0,1,0.14,0.1212,0.46,0.2985,5,10,15 +209,2011-01-09,1,0,1,23,0,0,0,1,0.12,0.1364,0.5,0.194,0,6,6 +210,2011-01-10,1,0,1,0,0,1,1,1,0.12,0.1212,0.5,0.2836,2,3,5 +211,2011-01-10,1,0,1,1,0,1,1,1,0.12,0.1212,0.5,0.2836,1,0,1 +212,2011-01-10,1,0,1,2,0,1,1,1,0.12,0.1212,0.5,0.2239,0,3,3 +213,2011-01-10,1,0,1,3,0,1,1,1,0.12,0.1212,0.5,0.2239,0,1,1 +214,2011-01-10,1,0,1,4,0,1,1,1,0.1,0.1212,0.54,0.1343,1,2,3 +215,2011-01-10,1,0,1,5,0,1,1,1,0.1,0.1061,0.54,0.2537,0,3,3 +216,2011-01-10,1,0,1,6,0,1,1,1,0.12,0.1212,0.5,0.2836,0,31,31 +217,2011-01-10,1,0,1,7,0,1,1,1,0.12,0.1212,0.5,0.2239,2,75,77 
+218,2011-01-10,1,0,1,8,0,1,1,2,0.12,0.1212,0.5,0.2836,4,184,188 +219,2011-01-10,1,0,1,9,0,1,1,2,0.14,0.1212,0.5,0.2537,2,92,94 +220,2011-01-10,1,0,1,10,0,1,1,2,0.14,0.1212,0.5,0.2985,0,31,31 +221,2011-01-10,1,0,1,11,0,1,1,2,0.16,0.1364,0.47,0.2836,2,28,30 +222,2011-01-10,1,0,1,12,0,1,1,2,0.2,0.1818,0.4,0.2836,5,47,52 +223,2011-01-10,1,0,1,13,0,1,1,2,0.2,0.1818,0.4,0.2836,4,50,54 +224,2011-01-10,1,0,1,14,0,1,1,2,0.2,0.197,0.4,0.2239,0,47,47 +225,2011-01-10,1,0,1,15,0,1,1,2,0.2,0.197,0.4,0.2239,2,43,45 +226,2011-01-10,1,0,1,16,0,1,1,1,0.2,0.2121,0.4,0.1343,4,70,74 +227,2011-01-10,1,0,1,17,0,1,1,1,0.2,0.2273,0.4,0.1045,4,174,178 +228,2011-01-10,1,0,1,18,0,1,1,1,0.2,0.197,0.4,0.2239,1,154,155 +229,2011-01-10,1,0,1,19,0,1,1,1,0.16,0.1667,0.47,0.1642,3,92,95 +230,2011-01-10,1,0,1,20,0,1,1,1,0.16,0.1667,0.5,0.1642,1,73,74 +231,2011-01-10,1,0,1,21,0,1,1,1,0.14,0.1364,0.59,0.194,1,37,38 +232,2011-01-10,1,0,1,22,0,1,1,1,0.14,0.1515,0.59,0.1642,2,22,24 +233,2011-01-10,1,0,1,23,0,1,1,1,0.14,0.1515,0.59,0.1642,0,18,18 +234,2011-01-11,1,0,1,0,0,2,1,1,0.14,0.1667,0.59,0.1045,2,10,12 +235,2011-01-11,1,0,1,1,0,2,1,1,0.14,0.1515,0.59,0.1642,0,3,3 +236,2011-01-11,1,0,1,2,0,2,1,2,0.16,0.1515,0.55,0.194,0,3,3 +237,2011-01-11,1,0,1,5,0,2,1,2,0.16,0.1818,0.55,0.1343,0,6,6 +238,2011-01-11,1,0,1,6,0,2,1,2,0.16,0.1818,0.55,0.1343,0,27,27 +239,2011-01-11,1,0,1,7,0,2,1,2,0.16,0.2273,0.55,0,2,97,99 +240,2011-01-11,1,0,1,8,0,2,1,2,0.18,0.2121,0.51,0.0896,3,214,217 +241,2011-01-11,1,0,1,9,0,2,1,2,0.18,0.197,0.51,0.1642,3,127,130 +242,2011-01-11,1,0,1,10,0,2,1,2,0.2,0.2121,0.51,0.1642,3,51,54 +243,2011-01-11,1,0,1,11,0,2,1,2,0.2,0.2121,0.47,0.1343,4,31,35 +244,2011-01-11,1,0,1,12,0,2,1,2,0.2,0.2273,0.51,0.1045,2,55,57 +245,2011-01-11,1,0,1,13,0,2,1,2,0.2,0.2273,0.59,0.0896,6,46,52 +246,2011-01-11,1,0,1,14,0,2,1,2,0.2,0.2273,0.59,0.0896,3,60,63 +247,2011-01-11,1,0,1,15,0,2,1,2,0.16,0.197,0.8,0.0896,2,45,47 +248,2011-01-11,1,0,1,16,0,2,1,2,0.16,0.1515,0.86,0.2239,4,72,76 
+249,2011-01-11,1,0,1,17,0,2,1,2,0.16,0.1515,0.86,0.2239,6,130,136 +250,2011-01-11,1,0,1,18,0,2,1,3,0.16,0.1818,0.93,0.1045,1,94,95 +251,2011-01-11,1,0,1,19,0,2,1,3,0.16,0.2273,0.93,0,0,51,51 +252,2011-01-11,1,0,1,20,0,2,1,3,0.16,0.1515,0.93,0.194,0,32,32 +253,2011-01-11,1,0,1,21,0,2,1,3,0.16,0.197,0.86,0.0896,0,20,20 +254,2011-01-11,1,0,1,22,0,2,1,3,0.16,0.1818,0.93,0.1045,1,28,29 +255,2011-01-11,1,0,1,23,0,2,1,3,0.16,0.197,0.93,0.0896,1,18,19 +256,2011-01-12,1,0,1,0,0,3,1,2,0.16,0.197,0.86,0.0896,0,7,7 +257,2011-01-12,1,0,1,1,0,3,1,2,0.16,0.1818,0.86,0.1045,0,6,6 +258,2011-01-12,1,0,1,2,0,3,1,1,0.14,0.1515,0.86,0.1343,0,1,1 +259,2011-01-12,1,0,1,5,0,3,1,1,0.14,0.1515,0.86,0.1642,0,5,5 +260,2011-01-12,1,0,1,6,0,3,1,1,0.12,0.1515,0.93,0.1343,0,16,16 +261,2011-01-12,1,0,1,7,0,3,1,1,0.14,0.1515,0.69,0.1343,0,54,54 +262,2011-01-12,1,0,1,8,0,3,1,1,0.16,0.1667,0.59,0.1642,3,125,128 +263,2011-01-12,1,0,1,9,0,3,1,1,0.16,0.1364,0.59,0.3284,3,78,81 +264,2011-01-12,1,0,1,10,0,3,1,1,0.18,0.1818,0.55,0.2239,0,39,39 +265,2011-01-12,1,0,1,11,0,3,1,1,0.2,0.1818,0.51,0.3881,3,32,35 +266,2011-01-12,1,0,1,12,0,3,1,1,0.2,0.1515,0.47,0.5821,3,52,55 +267,2011-01-12,1,0,1,13,0,3,1,1,0.22,0.197,0.44,0.3582,0,49,49 +268,2011-01-12,1,0,1,14,0,3,1,1,0.2,0.1818,0.47,0.3284,0,44,44 +269,2011-01-12,1,0,1,15,0,3,1,1,0.2,0.1667,0.47,0.4179,1,48,49 +270,2011-01-12,1,0,1,16,0,3,1,1,0.22,0.197,0.44,0.3284,5,63,68 +271,2011-01-12,1,0,1,17,0,3,1,1,0.2,0.1818,0.47,0.3582,0,139,139 +272,2011-01-12,1,0,1,18,0,3,1,1,0.2,0.1515,0.47,0.5224,2,135,137 +273,2011-01-12,1,0,1,19,0,3,1,1,0.18,0.1515,0.47,0.4179,1,82,83 +274,2011-01-12,1,0,1,20,0,3,1,1,0.16,0.1364,0.5,0.3284,2,54,56 +275,2011-01-12,1,0,1,21,0,3,1,1,0.16,0.1364,0.55,0.3284,0,57,57 +276,2011-01-12,1,0,1,22,0,3,1,1,0.16,0.1212,0.55,0.4478,1,32,33 +277,2011-01-12,1,0,1,23,0,3,1,1,0.14,0.1061,0.59,0.4179,1,19,20 +278,2011-01-13,1,0,1,0,0,4,1,1,0.14,0.1212,0.59,0.2836,1,6,7 +279,2011-01-13,1,0,1,1,0,4,1,1,0.14,0.1212,0.5,0.2836,0,2,2 
+280,2011-01-13,1,0,1,2,0,4,1,1,0.14,0.1212,0.5,0.3582,0,2,2 +281,2011-01-13,1,0,1,3,0,4,1,1,0.14,0.1212,0.5,0.3284,0,3,3 +282,2011-01-13,1,0,1,4,0,4,1,1,0.14,0.1212,0.5,0.2537,0,4,4 +283,2011-01-13,1,0,1,5,0,4,1,1,0.14,0.1212,0.5,0.2985,0,3,3 +284,2011-01-13,1,0,1,6,0,4,1,1,0.12,0.1515,0.54,0.1343,0,28,28 +285,2011-01-13,1,0,1,7,0,4,1,1,0.12,0.1515,0.54,0.1343,0,72,72 +286,2011-01-13,1,0,1,8,0,4,1,1,0.14,0.1364,0.5,0.194,5,197,202 +287,2011-01-13,1,0,1,9,0,4,1,1,0.14,0.1212,0.5,0.3284,2,137,139 +288,2011-01-13,1,0,1,10,0,4,1,2,0.16,0.1364,0.5,0.3582,2,36,38 +289,2011-01-13,1,0,1,11,0,4,1,2,0.2,0.1667,0.44,0.4478,4,33,37 +290,2011-01-13,1,0,1,12,0,4,1,1,0.2,0.1667,0.44,0.4179,3,49,52 +291,2011-01-13,1,0,1,13,0,4,1,1,0.22,0.197,0.41,0.4478,2,81,83 +292,2011-01-13,1,0,1,14,0,4,1,1,0.22,0.197,0.41,0.3881,3,39,42 +293,2011-01-13,1,0,1,15,0,4,1,1,0.24,0.2121,0.38,0.2985,5,55,60 +294,2011-01-13,1,0,1,16,0,4,1,1,0.24,0.2121,0.38,0.3582,2,76,78 +295,2011-01-13,1,0,1,17,0,4,1,1,0.2,0.1818,0.4,0.2836,4,158,162 +296,2011-01-13,1,0,1,18,0,4,1,1,0.2,0.1818,0.4,0.3284,3,141,144 +297,2011-01-13,1,0,1,19,0,4,1,1,0.16,0.1515,0.47,0.2537,1,98,99 +298,2011-01-13,1,0,1,20,0,4,1,1,0.16,0.1515,0.47,0.2239,0,64,64 +299,2011-01-13,1,0,1,21,0,4,1,1,0.14,0.1212,0.46,0.2985,0,40,40 +300,2011-01-13,1,0,1,22,0,4,1,1,0.14,0.1212,0.46,0.3284,0,30,30 +301,2011-01-13,1,0,1,23,0,4,1,1,0.12,0.1364,0.5,0.194,1,14,15 +302,2011-01-14,1,0,1,0,0,5,1,1,0.12,0.1364,0.5,0.194,0,14,14 +303,2011-01-14,1,0,1,1,0,5,1,1,0.1,0.1212,0.54,0.1642,0,5,5 +304,2011-01-14,1,0,1,2,0,5,1,1,0.1,0.1212,0.54,0.1343,0,1,1 +305,2011-01-14,1,0,1,3,0,5,1,1,0.1,0.1364,0.54,0.1045,0,1,1 +306,2011-01-14,1,0,1,5,0,5,1,1,0.1,0.1364,0.54,0.0896,0,8,8 +307,2011-01-14,1,0,1,6,0,5,1,1,0.1,0.1818,0.54,0,0,17,17 +308,2011-01-14,1,0,1,7,0,5,1,1,0.1,0.1212,0.74,0.1642,0,70,70 +309,2011-01-14,1,0,1,8,0,5,1,1,0.12,0.1667,0.68,0,2,156,158 +310,2011-01-14,1,0,1,9,0,5,1,1,0.14,0.1515,0.69,0.1343,0,117,117 
+311,2011-01-14,1,0,1,10,0,5,1,1,0.18,0.1818,0.55,0.194,4,40,44 +312,2011-01-14,1,0,1,11,0,5,1,1,0.18,0.1667,0.51,0.2836,6,47,53 +313,2011-01-14,1,0,1,12,0,5,1,1,0.2,0.197,0.44,0.2537,2,59,61 +314,2011-01-14,1,0,1,13,0,5,1,1,0.22,0.197,0.37,0.3881,4,73,77 +315,2011-01-14,1,0,1,14,0,5,1,1,0.22,0.2121,0.41,0.2836,5,59,64 +316,2011-01-14,1,0,1,15,0,5,1,1,0.24,0.2424,0.38,0.1642,9,59,68 +317,2011-01-14,1,0,1,16,0,5,1,1,0.22,0.2424,0.41,0.1045,3,87,90 +318,2011-01-14,1,0,1,17,0,5,1,1,0.22,0.2273,0.41,0.1642,4,155,159 +319,2011-01-14,1,0,1,18,0,5,1,1,0.2,0.2576,0.47,0,5,134,139 +320,2011-01-14,1,0,1,19,0,5,1,1,0.16,0.197,0.59,0.0896,3,89,92 +321,2011-01-14,1,0,1,20,0,5,1,1,0.18,0.2424,0.59,0,0,68,68 +322,2011-01-14,1,0,1,21,0,5,1,1,0.16,0.2273,0.69,0,4,48,52 +323,2011-01-14,1,0,1,22,0,5,1,2,0.16,0.2273,0.69,0,2,34,36 +324,2011-01-14,1,0,1,23,0,5,1,2,0.18,0.2424,0.55,0,1,26,27 +325,2011-01-15,1,0,1,0,0,6,0,1,0.18,0.2424,0.55,0,3,25,28 +326,2011-01-15,1,0,1,1,0,6,0,2,0.16,0.197,0.59,0.0896,2,18,20 +327,2011-01-15,1,0,1,2,0,6,0,2,0.16,0.197,0.59,0.0896,0,12,12 +328,2011-01-15,1,0,1,3,0,6,0,2,0.16,0.2273,0.59,0,1,7,8 +329,2011-01-15,1,0,1,4,0,6,0,2,0.16,0.2273,0.59,0,0,5,5 +330,2011-01-15,1,0,1,5,0,6,0,1,0.16,0.2273,0.59,0,0,1,1 +331,2011-01-15,1,0,1,6,0,6,0,1,0.14,0.1667,0.63,0.1045,1,2,3 +332,2011-01-15,1,0,1,7,0,6,0,1,0.14,0.2121,0.63,0,1,9,10 +333,2011-01-15,1,0,1,8,0,6,0,1,0.14,0.1515,0.63,0.1343,1,22,23 +334,2011-01-15,1,0,1,9,0,6,0,1,0.16,0.1818,0.64,0.1343,2,31,33 +335,2011-01-15,1,0,1,10,0,6,0,1,0.18,0.197,0.59,0.1642,2,57,59 +336,2011-01-15,1,0,1,11,0,6,0,1,0.2,0.197,0.55,0.2239,18,54,72 +337,2011-01-15,1,0,1,12,0,6,0,1,0.24,0.2273,0.48,0.2239,15,74,89 +338,2011-01-15,1,0,1,13,0,6,0,1,0.28,0.2576,0.38,0.2985,21,80,101 +339,2011-01-15,1,0,1,14,0,6,0,1,0.3,0.2879,0.39,0.2836,26,92,118 +340,2011-01-15,1,0,1,15,0,6,0,2,0.32,0.3182,0.36,0.194,21,108,129 +341,2011-01-15,1,0,1,16,0,6,0,2,0.34,0.3333,0.34,0.194,33,95,128 
+342,2011-01-15,1,0,1,17,0,6,0,2,0.32,0.303,0.36,0.2836,29,54,83 +343,2011-01-15,1,0,1,18,0,6,0,2,0.3,0.2879,0.45,0.2537,15,69,84 +344,2011-01-15,1,0,1,19,0,6,0,2,0.32,0.303,0.39,0.2537,14,60,74 +345,2011-01-15,1,0,1,20,0,6,0,2,0.32,0.303,0.39,0.2537,6,35,41 +346,2011-01-15,1,0,1,21,0,6,0,2,0.32,0.303,0.39,0.2239,6,51,57 +347,2011-01-15,1,0,1,22,0,6,0,2,0.3,0.3182,0.42,0.1045,0,26,26 +348,2011-01-15,1,0,1,23,0,6,0,1,0.3,0.2879,0.45,0.2836,5,39,44 +349,2011-01-16,1,0,1,0,0,0,0,1,0.26,0.303,0.56,0,6,33,39 +350,2011-01-16,1,0,1,1,0,0,0,1,0.26,0.2727,0.56,0.1343,4,19,23 +351,2011-01-16,1,0,1,2,0,0,0,1,0.26,0.2879,0.56,0.0896,3,13,16 +352,2011-01-16,1,0,1,3,0,0,0,1,0.22,0.2727,0.69,0,9,6,15 +353,2011-01-16,1,0,1,4,0,0,0,1,0.26,0.2576,0.56,0.1642,0,1,1 +354,2011-01-16,1,0,1,5,0,0,0,2,0.26,0.2576,0.56,0.1642,1,1,2 +355,2011-01-16,1,0,1,6,0,0,0,2,0.26,0.2576,0.56,0.1642,0,1,1 +356,2011-01-16,1,0,1,7,0,0,0,2,0.24,0.2121,0.56,0.2985,0,3,3 +357,2011-01-16,1,0,1,8,0,0,0,1,0.22,0.2121,0.55,0.2836,0,18,18 +358,2011-01-16,1,0,1,9,0,0,0,1,0.22,0.2121,0.51,0.2537,3,29,32 +359,2011-01-16,1,0,1,10,0,0,0,1,0.22,0.2121,0.51,0.2836,8,71,79 +360,2011-01-16,1,0,1,11,0,0,0,1,0.24,0.2273,0.44,0.2537,23,70,93 +361,2011-01-16,1,0,1,12,0,0,0,1,0.24,0.2121,0.41,0.2836,29,75,104 +362,2011-01-16,1,0,1,13,0,0,0,1,0.26,0.2273,0.35,0.2985,23,95,118 +363,2011-01-16,1,0,1,14,0,0,0,1,0.28,0.2727,0.36,0.2537,22,69,91 +364,2011-01-16,1,0,1,15,0,0,0,1,0.26,0.2424,0.38,0.2537,35,78,113 +365,2011-01-16,1,0,1,16,0,0,0,1,0.24,0.2273,0.38,0.2239,22,77,99 +366,2011-01-16,1,0,1,17,0,0,0,1,0.22,0.2121,0.37,0.2537,23,82,105 +367,2011-01-16,1,0,1,18,0,0,0,1,0.2,0.2121,0.4,0.1642,11,56,67 +368,2011-01-16,1,0,1,19,0,0,0,1,0.18,0.197,0.47,0.1343,14,47,61 +369,2011-01-16,1,0,1,20,0,0,0,1,0.18,0.197,0.47,0.1642,7,50,57 +370,2011-01-16,1,0,1,21,0,0,0,1,0.18,0.197,0.51,0.1642,6,22,28 +371,2011-01-16,1,0,1,22,0,0,0,2,0.2,0.2121,0.49,0.1343,2,19,21 +372,2011-01-16,1,0,1,23,0,0,0,2,0.2,0.2273,0.4,0.1045,0,18,18 
+373,2011-01-17,1,0,1,0,1,1,0,2,0.2,0.197,0.47,0.2239,1,16,17 +374,2011-01-17,1,0,1,1,1,1,0,2,0.2,0.197,0.44,0.194,1,15,16 +375,2011-01-17,1,0,1,2,1,1,0,2,0.18,0.1667,0.43,0.2537,0,8,8 +376,2011-01-17,1,0,1,3,1,1,0,2,0.18,0.1818,0.43,0.194,0,2,2 +377,2011-01-17,1,0,1,4,1,1,0,2,0.18,0.197,0.43,0.1343,1,2,3 +378,2011-01-17,1,0,1,5,1,1,0,2,0.18,0.197,0.43,0.1642,0,1,1 +379,2011-01-17,1,0,1,6,1,1,0,2,0.18,0.1818,0.43,0.194,0,5,5 +380,2011-01-17,1,0,1,7,1,1,0,2,0.16,0.1818,0.5,0.1343,4,9,13 +381,2011-01-17,1,0,1,8,1,1,0,2,0.16,0.1515,0.47,0.2239,3,30,33 +382,2011-01-17,1,0,1,9,1,1,0,2,0.16,0.1515,0.47,0.2239,8,39,47 +383,2011-01-17,1,0,1,10,1,1,0,2,0.16,0.1515,0.5,0.2537,7,50,57 +384,2011-01-17,1,0,1,11,1,1,0,2,0.16,0.1515,0.55,0.194,9,55,64 +385,2011-01-17,1,0,1,12,1,1,0,2,0.18,0.197,0.47,0.1343,10,70,80 +386,2011-01-17,1,0,1,13,1,1,0,2,0.18,0.197,0.47,0.1343,13,80,93 +387,2011-01-17,1,0,1,14,1,1,0,2,0.18,0.2121,0.43,0.1045,12,74,86 +388,2011-01-17,1,0,1,15,1,1,0,2,0.2,0.2121,0.47,0.1642,21,72,93 +389,2011-01-17,1,0,1,16,1,1,0,2,0.2,0.2121,0.47,0.1642,6,76,82 +390,2011-01-17,1,0,1,17,1,1,0,1,0.2,0.197,0.51,0.194,4,67,71 +391,2011-01-17,1,0,1,18,1,1,0,2,0.18,0.1667,0.55,0.2537,7,85,92 +392,2011-01-17,1,0,1,19,1,1,0,3,0.18,0.1818,0.59,0.194,2,58,60 +393,2011-01-17,1,0,1,20,1,1,0,3,0.16,0.1515,0.8,0.194,4,29,33 +394,2011-01-17,1,0,1,21,1,1,0,3,0.16,0.1515,0.8,0.194,3,24,27 +395,2011-01-17,1,0,1,22,1,1,0,3,0.14,0.1212,0.93,0.2537,0,13,13 +396,2011-01-17,1,0,1,23,1,1,0,3,0.16,0.1364,0.86,0.2836,1,3,4 +397,2011-01-18,1,0,1,12,0,2,1,2,0.2,0.1818,0.86,0.3284,0,3,3 +398,2011-01-18,1,0,1,13,0,2,1,2,0.2,0.197,0.86,0.2239,0,22,22 +399,2011-01-18,1,0,1,14,0,2,1,2,0.22,0.2273,0.8,0.1642,2,26,28 +400,2011-01-18,1,0,1,15,0,2,1,2,0.22,0.2273,0.87,0.1642,3,32,35 +401,2011-01-18,1,0,1,16,0,2,1,2,0.22,0.2273,0.87,0.194,0,61,61 +402,2011-01-18,1,0,1,17,0,2,1,2,0.22,0.2273,0.82,0.194,1,124,125 +403,2011-01-18,1,0,1,18,0,2,1,2,0.22,0.2273,0.8,0.1642,1,132,133 
+404,2011-01-18,1,0,1,19,0,2,1,2,0.22,0.2273,0.8,0.1343,1,98,99 +405,2011-01-18,1,0,1,20,0,2,1,2,0.22,0.2727,0.87,0,0,83,83 +406,2011-01-18,1,0,1,21,0,2,1,2,0.22,0.2424,0.93,0.1045,0,41,41 +407,2011-01-18,1,0,1,22,0,2,1,2,0.22,0.2576,0.93,0.0896,0,33,33 +408,2011-01-18,1,0,1,23,0,2,1,2,0.22,0.2727,0.93,0,1,19,20 +409,2011-01-19,1,0,1,0,0,3,1,2,0.22,0.2727,0.93,0,0,3,3 +410,2011-01-19,1,0,1,1,0,3,1,3,0.22,0.2273,0.93,0.1343,1,6,7 +411,2011-01-19,1,0,1,2,0,3,1,3,0.22,0.2273,0.93,0.1343,0,3,3 +412,2011-01-19,1,0,1,4,0,3,1,3,0.22,0.2273,0.93,0.1343,1,1,2 +413,2011-01-19,1,0,1,5,0,3,1,2,0.22,0.2576,0.93,0.0896,0,7,7 +414,2011-01-19,1,0,1,6,0,3,1,2,0.22,0.2576,0.93,0.0896,0,32,32 +415,2011-01-19,1,0,1,7,0,3,1,2,0.24,0.2576,0.92,0.1045,1,89,90 +416,2011-01-19,1,0,1,8,0,3,1,2,0.24,0.2576,0.93,0.1045,1,196,197 +417,2011-01-19,1,0,1,9,0,3,1,2,0.24,0.2576,0.93,0.1045,2,107,109 +418,2011-01-19,1,0,1,10,0,3,1,2,0.26,0.2727,0.93,0.1343,1,46,47 +419,2011-01-19,1,0,1,11,0,3,1,2,0.28,0.303,0.87,0.0896,5,47,52 +420,2011-01-19,1,0,1,12,0,3,1,2,0.3,0.3182,0.81,0.0896,5,65,70 +421,2011-01-19,1,0,1,13,0,3,1,1,0.4,0.4091,0.62,0.2836,11,67,78 +422,2011-01-19,1,0,1,14,0,3,1,1,0.4,0.4091,0.58,0.2537,7,68,75 +423,2011-01-19,1,0,1,15,0,3,1,1,0.4,0.4091,0.54,0.2836,4,78,82 +424,2011-01-19,1,0,1,16,0,3,1,1,0.38,0.3939,0.58,0.3881,10,94,104 +425,2011-01-19,1,0,1,17,0,3,1,1,0.36,0.3333,0.57,0.3284,7,190,197 +426,2011-01-19,1,0,1,18,0,3,1,1,0.34,0.3182,0.61,0.2836,5,156,161 +427,2011-01-19,1,0,1,19,0,3,1,1,0.32,0.2879,0.57,0.4179,4,108,112 +428,2011-01-19,1,0,1,20,0,3,1,1,0.32,0.303,0.49,0.2985,2,74,76 +429,2011-01-19,1,0,1,21,0,3,1,1,0.32,0.2879,0.49,0.4179,4,55,59 +430,2011-01-19,1,0,1,22,0,3,1,1,0.3,0.303,0.52,0.1642,6,53,59 +431,2011-01-19,1,0,1,23,0,3,1,1,0.3,0.2727,0.52,0.4627,1,27,28 +432,2011-01-20,1,0,1,0,0,4,1,1,0.26,0.2273,0.56,0.3881,5,8,13 +433,2011-01-20,1,0,1,1,0,4,1,1,0.26,0.2727,0.56,0,2,3,5 +434,2011-01-20,1,0,1,2,0,4,1,1,0.26,0.2727,0.56,0,0,2,2 
+435,2011-01-20,1,0,1,3,0,4,1,1,0.26,0.2576,0.56,0.1642,0,1,1 +436,2011-01-20,1,0,1,4,0,4,1,1,0.26,0.2576,0.56,0.1642,0,1,1 +437,2011-01-20,1,0,1,5,0,4,1,1,0.24,0.2273,0.6,0.2239,0,6,6 +438,2011-01-20,1,0,1,6,0,4,1,1,0.22,0.2121,0.6,0.2239,0,35,35 +439,2011-01-20,1,0,1,7,0,4,1,1,0.22,0.2121,0.55,0.2239,1,100,101 +440,2011-01-20,1,0,1,8,0,4,1,1,0.22,0.2121,0.55,0.2836,2,247,249 +441,2011-01-20,1,0,1,9,0,4,1,2,0.24,0.2273,0.52,0.2239,3,140,143 +442,2011-01-20,1,0,1,10,0,4,1,1,0.26,0.2273,0.48,0.2985,1,56,57 +443,2011-01-20,1,0,1,11,0,4,1,2,0.28,0.2727,0.45,0.1642,5,63,68 +444,2011-01-20,1,0,1,12,0,4,1,2,0.3,0.3333,0.42,0,7,77,84 +445,2011-01-20,1,0,1,13,0,4,1,2,0.28,0.2879,0.45,0.1045,12,86,98 +446,2011-01-20,1,0,1,14,0,4,1,2,0.3,0.303,0.45,0.1343,6,75,81 +447,2011-01-20,1,0,1,15,0,4,1,2,0.32,0.3182,0.45,0.194,8,62,70 +448,2011-01-20,1,0,1,16,0,4,1,2,0.3,0.303,0.49,0.1343,8,83,91 +449,2011-01-20,1,0,1,17,0,4,1,2,0.3,0.3182,0.49,0.1045,8,207,215 +450,2011-01-20,1,0,1,18,0,4,1,2,0.26,0.2576,0.56,0.194,1,184,185 +451,2011-01-20,1,0,1,19,0,4,1,1,0.26,0.2273,0.56,0.3284,6,146,152 +452,2011-01-20,1,0,1,20,0,4,1,2,0.26,0.2424,0.6,0.2836,2,124,126 +453,2011-01-20,1,0,1,21,0,4,1,2,0.24,0.2273,0.6,0.2537,3,54,57 +454,2011-01-20,1,0,1,22,0,4,1,2,0.24,0.2121,0.65,0.2836,0,56,56 +455,2011-01-20,1,0,1,23,0,4,1,2,0.24,0.2121,0.65,0.3284,3,28,31 +456,2011-01-21,1,0,1,0,0,5,1,2,0.24,0.2273,0.7,0.2537,1,20,21 +457,2011-01-21,1,0,1,1,0,5,1,2,0.24,0.2273,0.7,0.2537,0,6,6 +458,2011-01-21,1,0,1,2,0,5,1,3,0.24,0.2424,0.75,0.1642,0,2,2 +459,2011-01-21,1,0,1,3,0,5,1,3,0.22,0.2121,0.8,0.2985,0,1,1 +460,2011-01-21,1,0,1,4,0,5,1,2,0.22,0.2576,0.87,0.0896,0,1,1 +461,2011-01-21,1,0,1,5,0,5,1,1,0.24,0.197,0.6,0.4179,1,4,5 +462,2011-01-21,1,0,1,6,0,5,1,1,0.22,0.2121,0.55,0.2537,0,27,27 +463,2011-01-21,1,0,1,7,0,5,1,1,0.2,0.1818,0.51,0.2836,2,66,68 +464,2011-01-21,1,0,1,8,0,5,1,1,0.2,0.1818,0.47,0.3284,7,210,217 +465,2011-01-21,1,0,1,9,0,5,1,1,0.2,0.1818,0.51,0.3582,7,159,166 
+466,2011-01-21,1,0,1,10,0,5,1,1,0.2,0.1667,0.47,0.4627,6,57,63 +467,2011-01-21,1,0,1,11,0,5,1,1,0.22,0.1818,0.41,0.4627,6,53,59 +468,2011-01-21,1,0,1,12,0,5,1,1,0.22,0.1818,0.27,0.5821,11,67,78 +469,2011-01-21,1,0,1,13,0,5,1,1,0.2,0.1515,0.21,0.5821,8,65,73 +470,2011-01-21,1,0,1,14,0,5,1,1,0.2,0.1515,0.25,0.5224,6,56,62 +471,2011-01-21,1,0,1,15,0,5,1,1,0.16,0.1212,0.26,0.4478,4,61,65 +472,2011-01-21,1,0,1,16,0,5,1,1,0.16,0.1364,0.26,0.3582,0,97,97 +473,2011-01-21,1,0,1,17,0,5,1,1,0.14,0.1212,0.28,0.3582,10,151,161 +474,2011-01-21,1,0,1,18,0,5,1,1,0.12,0.1212,0.3,0.2537,1,119,120 +475,2011-01-21,1,0,1,19,0,5,1,1,0.12,0.1061,0.3,0.3284,3,93,96 +476,2011-01-21,1,0,1,20,0,5,1,1,0.1,0.0758,0.33,0.4179,1,52,53 +477,2011-01-21,1,0,1,21,0,5,1,1,0.08,0.0758,0.38,0.2836,0,41,41 +478,2011-01-21,1,0,1,22,0,5,1,1,0.06,0.0303,0.41,0.3881,1,33,34 +479,2011-01-21,1,0,1,23,0,5,1,1,0.06,0.0455,0.38,0.3284,0,27,27 +480,2011-01-22,1,0,1,0,0,6,0,1,0.04,0.0303,0.45,0.2537,0,13,13 +481,2011-01-22,1,0,1,1,0,6,0,2,0.04,0,0.41,0.3881,3,9,12 +482,2011-01-22,1,0,1,2,0,6,0,2,0.04,0.0303,0.41,0.2537,0,11,11 +483,2011-01-22,1,0,1,3,0,6,0,2,0.04,0.0303,0.41,0.2836,1,6,7 +484,2011-01-22,1,0,1,4,0,6,0,2,0.02,0.0152,0.48,0.2985,0,3,3 +485,2011-01-22,1,0,1,6,0,6,0,2,0.02,0.0303,0.44,0.2239,0,2,2 +486,2011-01-22,1,0,1,7,0,6,0,1,0.02,0.0152,0.44,0.2836,0,8,8 +487,2011-01-22,1,0,1,8,0,6,0,1,0.02,0,0.44,0.3284,1,26,27 +488,2011-01-22,1,0,1,9,0,6,0,1,0.04,0.0303,0.41,0.2537,3,37,40 +489,2011-01-22,1,0,1,10,0,6,0,2,0.04,0.0606,0.41,0.1642,3,50,53 +490,2011-01-22,1,0,1,11,0,6,0,2,0.06,0.0758,0.38,0.1343,4,59,63 +491,2011-01-22,1,0,1,12,0,6,0,2,0.06,0.1061,0.38,0.1045,10,60,70 +492,2011-01-22,1,0,1,13,0,6,0,1,0.08,0.1667,0.35,0,12,72,84 +493,2011-01-22,1,0,1,14,0,6,0,1,0.1,0.1364,0.33,0.1045,11,64,75 +494,2011-01-22,1,0,1,15,0,6,0,1,0.12,0.1515,0.28,0,10,93,103 +495,2011-01-22,1,0,1,16,0,6,0,1,0.12,0.1364,0.28,0.194,11,72,83 +496,2011-01-22,1,0,1,17,0,6,0,1,0.12,0.197,0.28,0,8,59,67 
+497,2011-01-22,1,0,1,18,0,6,0,1,0.08,0.0909,0.35,0.194,0,54,54 +498,2011-01-22,1,0,1,19,0,6,0,1,0.08,0.1061,0.35,0.1343,6,53,59 +499,2011-01-22,1,0,1,20,0,6,0,1,0.06,0.0758,0.45,0.1642,1,44,45 +500,2011-01-22,1,0,1,21,0,6,0,1,0.06,0.1061,0.41,0.0896,0,39,39 +501,2011-01-22,1,0,1,22,0,6,0,1,0.06,0.1515,0.49,0,7,23,30 +502,2011-01-22,1,0,1,23,0,6,0,1,0.04,0.0758,0.57,0.1045,2,31,33 +503,2011-01-23,1,0,1,0,0,0,0,1,0.04,0.0758,0.57,0.1045,2,20,22 +504,2011-01-23,1,0,1,1,0,0,0,1,0.04,0.0758,0.57,0.1045,1,12,13 +505,2011-01-23,1,0,1,2,0,0,0,1,0.02,0.0606,0.62,0.1343,3,15,18 +506,2011-01-23,1,0,1,3,0,0,0,1,0.02,0.0606,0.62,0.1343,1,4,5 +507,2011-01-23,1,0,1,5,0,0,0,2,0.04,0.0758,0.57,0.1045,0,3,3 +508,2011-01-23,1,0,1,6,0,0,0,2,0.04,0.0758,0.57,0.1045,0,1,1 +509,2011-01-23,1,0,1,7,0,0,0,1,0.08,0.1061,0.58,0.1642,1,1,2 +510,2011-01-23,1,0,1,8,0,0,0,1,0.06,0.0758,0.62,0.1642,2,17,19 +511,2011-01-23,1,0,1,9,0,0,0,1,0.1,0.0758,0.54,0.3582,3,25,28 +512,2011-01-23,1,0,1,10,0,0,0,1,0.14,0.1061,0.46,0.3881,7,51,58 +513,2011-01-23,1,0,1,11,0,0,0,1,0.14,0.1364,0.43,0.2239,22,77,99 +514,2011-01-23,1,0,1,12,0,0,0,1,0.16,0.1212,0.37,0.4627,24,92,116 +515,2011-01-23,1,0,1,13,0,0,0,1,0.14,0.1061,0.33,0.3881,12,75,87 +516,2011-01-23,1,0,1,14,0,0,0,1,0.16,0.1364,0.28,0.3582,17,93,110 +517,2011-01-23,1,0,1,15,0,0,0,1,0.16,0.1364,0.28,0.3582,13,64,77 +518,2011-01-23,1,0,1,16,0,0,0,1,0.16,0.1364,0.26,0.3284,9,56,65 +519,2011-01-23,1,0,1,17,0,0,0,1,0.14,0.1061,0.26,0.3881,5,50,55 +520,2011-01-23,1,0,1,18,0,0,0,1,0.12,0.1212,0.3,0.2537,5,44,49 +521,2011-01-23,1,0,1,19,0,0,0,1,0.12,0.1212,0.3,0.2836,5,45,50 +522,2011-01-23,1,0,1,20,0,0,0,1,0.1,0.1061,0.36,0.2537,4,31,35 +523,2011-01-23,1,0,1,21,0,0,0,1,0.1,0.1061,0.36,0.194,5,20,25 +524,2011-01-23,1,0,1,22,0,0,0,1,0.08,0.0909,0.38,0.194,5,23,28 +525,2011-01-23,1,0,1,23,0,0,0,1,0.06,0.0606,0.41,0.2239,4,17,21 +526,2011-01-24,1,0,1,0,0,1,1,1,0.06,0.0606,0.41,0.194,0,7,7 +527,2011-01-24,1,0,1,1,0,1,1,1,0.04,0.0455,0.45,0.194,0,1,1 
+528,2011-01-24,1,0,1,3,0,1,1,1,0.04,0.0303,0.45,0.2537,0,1,1 +529,2011-01-24,1,0,1,4,0,1,1,1,0.02,0.0606,0.48,0.1343,0,1,1 +530,2011-01-24,1,0,1,5,0,1,1,1,0.02,0.0606,0.48,0.1343,0,5,5 +531,2011-01-24,1,0,1,6,0,1,1,1,0.02,0.0758,0.48,0.0896,0,15,15 +532,2011-01-24,1,0,1,7,0,1,1,1,0.02,0.1212,0.48,0,5,79,84 +533,2011-01-24,1,0,1,8,0,1,1,1,0.04,0.1364,0.49,0,6,171,177 +534,2011-01-24,1,0,1,9,0,1,1,1,0.06,0.1515,0.41,0,4,98,102 +535,2011-01-24,1,0,1,10,0,1,1,1,0.1,0.1364,0.42,0,6,34,40 +536,2011-01-24,1,0,1,11,0,1,1,1,0.1,0.1212,0.46,0.1343,3,43,46 +537,2011-01-24,1,0,1,12,0,1,1,2,0.12,0.1364,0.42,0.194,11,52,63 +538,2011-01-24,1,0,1,13,0,1,1,2,0.14,0.1364,0.43,0.2239,6,54,60 +539,2011-01-24,1,0,1,14,0,1,1,2,0.14,0.1364,0.46,0.2239,2,43,45 +540,2011-01-24,1,0,1,15,0,1,1,1,0.16,0.1667,0.4,0.1642,7,50,57 +541,2011-01-24,1,0,1,16,0,1,1,1,0.16,0.1515,0.47,0.2537,4,66,70 +542,2011-01-24,1,0,1,17,0,1,1,1,0.14,0.1212,0.5,0.2537,6,178,184 +543,2011-01-24,1,0,1,18,0,1,1,1,0.14,0.1364,0.59,0.194,8,145,153 +544,2011-01-24,1,0,1,19,0,1,1,1,0.14,0.1515,0.54,0.1642,5,101,106 +545,2011-01-24,1,0,1,20,0,1,1,1,0.14,0.1364,0.59,0.194,1,80,81 +546,2011-01-24,1,0,1,21,0,1,1,1,0.14,0.1515,0.63,0.1642,6,53,59 +547,2011-01-24,1,0,1,22,0,1,1,2,0.14,0.1364,0.63,0.2239,3,32,35 +548,2011-01-24,1,0,1,23,0,1,1,2,0.16,0.1515,0.64,0.2537,3,21,24 +549,2011-01-25,1,0,1,0,0,2,1,2,0.16,0.1364,0.69,0.2836,3,6,9 +550,2011-01-25,1,0,1,1,0,2,1,2,0.16,0.1667,0.69,0.1642,0,5,5 +551,2011-01-25,1,0,1,2,0,2,1,1,0.16,0.1515,0.69,0.2239,0,2,2 +552,2011-01-25,1,0,1,4,0,2,1,1,0.14,0.1667,0.74,0.1045,0,1,1 +553,2011-01-25,1,0,1,5,0,2,1,1,0.14,0.1364,0.74,0.2239,0,9,9 +554,2011-01-25,1,0,1,6,0,2,1,1,0.16,0.1818,0.74,0.1045,1,35,36 +555,2011-01-25,1,0,1,7,0,2,1,1,0.16,0.1515,0.74,0.2239,5,103,108 +556,2011-01-25,1,0,1,8,0,2,1,2,0.16,0.1818,0.74,0.1343,5,233,238 +557,2011-01-25,1,0,1,9,0,2,1,2,0.2,0.2273,0.64,0.0896,10,134,144 +558,2011-01-25,1,0,1,10,0,2,1,2,0.22,0.2424,0.6,0.1045,6,49,55 
+559,2011-01-25,1,0,1,11,0,2,1,2,0.24,0.2424,0.6,0.1343,6,55,61 +560,2011-01-25,1,0,1,12,0,2,1,2,0.26,0.2879,0.56,0.0896,21,85,106 +561,2011-01-25,1,0,1,13,0,2,1,2,0.26,0.2727,0.56,0.1343,21,72,93 +562,2011-01-25,1,0,1,14,0,2,1,2,0.3,0.3333,0.45,0,11,57,68 +563,2011-01-25,1,0,1,15,0,2,1,2,0.32,0.3485,0.42,0,21,63,84 +564,2011-01-25,1,0,1,16,0,2,1,2,0.32,0.3485,0.42,0,14,102,116 +565,2011-01-25,1,0,1,17,0,2,1,1,0.3,0.3333,0.45,0,14,208,222 +566,2011-01-25,1,0,1,18,0,2,1,2,0.3,0.3182,0.49,0.0896,7,218,225 +567,2011-01-25,1,0,1,19,0,2,1,2,0.26,0.2576,0.65,0.1642,13,133,146 +568,2011-01-25,1,0,1,20,0,2,1,1,0.24,0.2273,0.65,0.194,16,103,119 +569,2011-01-25,1,0,1,21,0,2,1,1,0.24,0.2273,0.65,0.194,5,40,45 +570,2011-01-25,1,0,1,22,0,2,1,1,0.22,0.2273,0.64,0.1642,4,49,53 +571,2011-01-25,1,0,1,23,0,2,1,2,0.22,0.2273,0.64,0.1642,3,37,40 +572,2011-01-26,1,0,1,0,0,3,1,2,0.22,0.2273,0.69,0.1343,3,14,17 +573,2011-01-26,1,0,1,1,0,3,1,2,0.24,0.2424,0.65,0.1343,0,5,5 +574,2011-01-26,1,0,1,2,0,3,1,3,0.22,0.2273,0.69,0.194,3,7,10 +575,2011-01-26,1,0,1,5,0,3,1,3,0.2,0.1818,0.86,0.2836,0,1,1 +576,2011-01-26,1,0,1,6,0,3,1,3,0.2,0.1818,0.86,0.2836,0,8,8 +577,2011-01-26,1,0,1,7,0,3,1,3,0.22,0.2121,0.87,0.2985,1,29,30 +578,2011-01-26,1,0,1,8,0,3,1,3,0.22,0.2121,0.87,0.2985,3,69,72 +579,2011-01-26,1,0,1,9,0,3,1,3,0.22,0.2121,0.87,0.2985,3,55,58 +580,2011-01-26,1,0,1,10,0,3,1,3,0.22,0.2121,0.93,0.2836,2,26,28 +581,2011-01-26,1,0,1,11,0,3,1,3,0.22,0.197,0.93,0.3284,6,35,41 +582,2011-01-26,1,0,1,12,0,3,1,3,0.22,0.197,0.93,0.3284,7,41,48 +583,2011-01-26,1,0,1,13,0,3,1,3,0.22,0.197,0.93,0.3284,4,43,47 +584,2011-01-26,1,0,1,14,0,3,1,3,0.22,0.197,0.93,0.3582,0,36,36 +585,2011-01-26,1,0,1,15,0,3,1,3,0.22,0.1818,0.93,0.4627,1,42,43 +586,2011-01-26,1,0,1,16,0,3,1,4,0.22,0.197,0.93,0.3284,1,35,36 +587,2011-01-26,1,0,1,17,0,3,1,3,0.2,0.1818,0.93,0.3582,0,26,26 +588,2011-01-27,1,0,1,16,0,4,1,1,0.22,0.2273,0.55,0.194,1,23,24 +589,2011-01-27,1,0,1,17,0,4,1,1,0.22,0.2424,0.55,0.1045,2,82,84 
+590,2011-01-27,1,0,1,18,0,4,1,1,0.2,0.2273,0.69,0.0896,3,101,104 +591,2011-01-27,1,0,1,19,0,4,1,1,0.2,0.2273,0.69,0.0896,3,76,79 +592,2011-01-27,1,0,1,20,0,4,1,1,0.18,0.2121,0.74,0.0896,4,55,59 +593,2011-01-27,1,0,1,21,0,4,1,1,0.18,0.2121,0.74,0.0896,2,36,38 +594,2011-01-27,1,0,1,22,0,4,1,1,0.18,0.2121,0.74,0.0896,0,27,27 +595,2011-01-27,1,0,1,23,0,4,1,1,0.18,0.197,0.8,0.1642,0,16,16 +596,2011-01-28,1,0,1,0,0,5,1,2,0.2,0.2121,0.75,0.1343,0,9,9 +597,2011-01-28,1,0,1,1,0,5,1,2,0.2,0.2121,0.75,0.1343,1,2,3 +598,2011-01-28,1,0,1,2,0,5,1,2,0.2,0.2121,0.75,0.1642,0,2,2 +599,2011-01-28,1,0,1,3,0,5,1,2,0.2,0.2273,0.75,0.1045,1,0,1 +600,2011-01-28,1,0,1,5,0,5,1,2,0.18,0.2121,0.8,0.1045,0,4,4 +601,2011-01-28,1,0,1,6,0,5,1,2,0.18,0.197,0.8,0.1343,0,16,16 +602,2011-01-28,1,0,1,7,0,5,1,2,0.16,0.197,0.86,0.0896,2,58,60 +603,2011-01-28,1,0,1,8,0,5,1,2,0.16,0.197,0.86,0.0896,2,155,157 +604,2011-01-28,1,0,1,9,0,5,1,3,0.18,0.2121,0.86,0.0896,6,95,101 +605,2011-01-28,1,0,1,10,0,5,1,3,0.18,0.2121,0.86,0.1045,0,49,49 +606,2011-01-28,1,0,1,11,0,5,1,3,0.18,0.2121,0.93,0.1045,0,30,30 +607,2011-01-28,1,0,1,12,0,5,1,3,0.18,0.2121,0.93,0.1045,1,28,29 +608,2011-01-28,1,0,1,13,0,5,1,3,0.18,0.2121,0.93,0.1045,0,31,31 +609,2011-01-28,1,0,1,14,0,5,1,3,0.22,0.2727,0.8,0,2,36,38 +610,2011-01-28,1,0,1,15,0,5,1,2,0.2,0.2576,0.86,0,1,40,41 +611,2011-01-28,1,0,1,16,0,5,1,1,0.22,0.2727,0.8,0,10,70,80 +612,2011-01-28,1,0,1,17,0,5,1,1,0.24,0.2424,0.75,0.1343,2,147,149 +613,2011-01-28,1,0,1,18,0,5,1,1,0.24,0.2273,0.75,0.194,2,107,109 +614,2011-01-28,1,0,1,19,0,5,1,2,0.24,0.2424,0.75,0.1343,5,84,89 +615,2011-01-28,1,0,1,20,0,5,1,2,0.24,0.2273,0.7,0.194,1,61,62 +616,2011-01-28,1,0,1,21,0,5,1,2,0.22,0.2273,0.75,0.1343,1,57,58 +617,2011-01-28,1,0,1,22,0,5,1,1,0.24,0.2121,0.65,0.3582,0,26,26 +618,2011-01-28,1,0,1,23,0,5,1,1,0.24,0.2273,0.6,0.2239,1,22,23 +619,2011-01-29,1,0,1,0,0,6,0,1,0.22,0.197,0.64,0.3582,2,26,28 +620,2011-01-29,1,0,1,1,0,6,0,1,0.22,0.2273,0.64,0.194,0,20,20 
+621,2011-01-29,1,0,1,2,0,6,0,1,0.22,0.2273,0.64,0.1642,0,15,15 +622,2011-01-29,1,0,1,3,0,6,0,1,0.2,0.2121,0.64,0.1343,3,5,8 +623,2011-01-29,1,0,1,4,0,6,0,1,0.16,0.1818,0.69,0.1045,1,2,3 +624,2011-01-29,1,0,1,6,0,6,0,1,0.16,0.1818,0.64,0.1343,0,2,2 +625,2011-01-29,1,0,1,7,0,6,0,1,0.16,0.1818,0.59,0.1045,1,4,5 +626,2011-01-29,1,0,1,8,0,6,0,1,0.18,0.197,0.55,0.1642,3,31,34 +627,2011-01-29,1,0,1,9,0,6,0,1,0.18,0.2121,0.59,0.0896,0,34,34 +628,2011-01-29,1,0,1,10,0,6,0,2,0.18,0.2121,0.64,0.1045,4,51,55 +629,2011-01-29,1,0,1,11,0,6,0,2,0.18,0.197,0.64,0.1343,4,60,64 +630,2011-01-29,1,0,1,12,0,6,0,2,0.2,0.197,0.59,0.194,12,66,78 +631,2011-01-29,1,0,1,13,0,6,0,2,0.22,0.2273,0.55,0.1642,9,56,65 +632,2011-01-29,1,0,1,14,0,6,0,2,0.22,0.2273,0.6,0.1343,10,89,99 +633,2011-01-29,1,0,1,15,0,6,0,1,0.22,0.2121,0.69,0.2537,22,98,120 +634,2011-01-29,1,0,1,16,0,6,0,1,0.24,0.2424,0.6,0.1642,19,88,107 +635,2011-01-29,1,0,1,17,0,6,0,1,0.24,0.2879,0.6,0,9,82,91 +636,2011-01-29,1,0,1,18,0,6,0,1,0.22,0.2273,0.69,0.1343,9,59,68 +637,2011-01-29,1,0,1,19,0,6,0,2,0.22,0.2121,0.69,0.2537,6,52,58 +638,2011-01-29,1,0,1,20,0,6,0,1,0.18,0.2121,0.74,0.0896,1,42,43 +639,2011-01-29,1,0,1,21,0,6,0,1,0.18,0.2121,0.74,0.0896,1,35,36 +640,2011-01-29,1,0,1,22,0,6,0,1,0.16,0.197,0.8,0.0896,4,28,32 +641,2011-01-29,1,0,1,23,0,6,0,1,0.16,0.197,0.8,0.0896,3,30,33 +642,2011-01-30,1,0,1,0,0,0,0,1,0.16,0.1818,0.8,0.1045,0,33,33 +643,2011-01-30,1,0,1,1,0,0,0,1,0.14,0.2121,0.8,0,7,22,29 +644,2011-01-30,1,0,1,2,0,0,0,1,0.16,0.2273,0.8,0,1,10,11 +645,2011-01-30,1,0,1,3,0,0,0,1,0.14,0.2121,0.93,0,1,7,8 +646,2011-01-30,1,0,1,4,0,0,0,1,0.14,0.2121,0.93,0,0,1,1 +647,2011-01-30,1,0,1,5,0,0,0,1,0.14,0.2121,0.86,0,0,3,3 +648,2011-01-30,1,0,1,7,0,0,0,1,0.14,0.2121,0.86,0,0,3,3 +649,2011-01-30,1,0,1,8,0,0,0,2,0.14,0.2121,0.86,0,1,11,12 +650,2011-01-30,1,0,1,9,0,0,0,2,0.16,0.2273,0.8,0,4,34,38 +651,2011-01-30,1,0,1,10,0,0,0,2,0.18,0.2424,0.8,0,7,57,64 +652,2011-01-30,1,0,1,11,0,0,0,1,0.22,0.2727,0.75,0,9,50,59 
+653,2011-01-30,1,0,1,12,0,0,0,1,0.3,0.3182,0.52,0.1045,10,87,97 +654,2011-01-30,1,0,1,13,0,0,0,1,0.28,0.2879,0.61,0.1045,13,71,84 +655,2011-01-30,1,0,1,14,0,0,0,1,0.28,0.303,0.61,0.0896,18,104,122 +656,2011-01-30,1,0,1,15,0,0,0,1,0.3,0.3333,0.56,0,14,95,109 +657,2011-01-30,1,0,1,16,0,0,0,1,0.3,0.3333,0.56,0,19,104,123 +658,2011-01-30,1,0,1,17,0,0,0,1,0.3,0.2879,0.56,0.194,6,71,77 +659,2011-01-30,1,0,1,18,0,0,0,1,0.26,0.2576,0.65,0.1642,8,57,65 +660,2011-01-30,1,0,1,19,0,0,0,1,0.26,0.2576,0.65,0.194,9,46,55 +661,2011-01-30,1,0,1,20,0,0,0,2,0.26,0.2727,0.65,0.1045,3,30,33 +662,2011-01-30,1,0,1,21,0,0,0,2,0.24,0.2424,0.7,0.1642,3,25,28 +663,2011-01-30,1,0,1,22,0,0,0,2,0.24,0.2273,0.7,0.194,2,19,21 +664,2011-01-30,1,0,1,23,0,0,0,2,0.24,0.2121,0.65,0.2836,5,16,21 +665,2011-01-31,1,0,1,0,0,1,1,2,0.24,0.2273,0.65,0.2239,1,6,7 +666,2011-01-31,1,0,1,1,0,1,1,1,0.22,0.2121,0.64,0.2537,2,5,7 +667,2011-01-31,1,0,1,2,0,1,1,1,0.22,0.2273,0.64,0.194,0,1,1 +668,2011-01-31,1,0,1,3,0,1,1,1,0.22,0.2273,0.64,0.194,0,2,2 +669,2011-01-31,1,0,1,4,0,1,1,1,0.2,0.197,0.59,0.2239,0,2,2 +670,2011-01-31,1,0,1,5,0,1,1,1,0.18,0.1667,0.64,0.2836,0,8,8 +671,2011-01-31,1,0,1,6,0,1,1,1,0.16,0.1364,0.69,0.3284,0,37,37 +672,2011-01-31,1,0,1,7,0,1,1,2,0.16,0.1364,0.64,0.2836,1,71,72 +673,2011-01-31,1,0,1,8,0,1,1,2,0.16,0.1364,0.59,0.2836,3,182,185 +674,2011-01-31,1,0,1,9,0,1,1,2,0.16,0.1364,0.59,0.2985,0,112,112 +675,2011-01-31,1,0,1,10,0,1,1,2,0.16,0.1515,0.59,0.194,1,68,69 +676,2011-01-31,1,0,1,11,0,1,1,2,0.16,0.1515,0.59,0.194,2,46,48 +677,2011-01-31,1,0,1,12,0,1,1,2,0.18,0.2121,0.55,0.1045,6,62,68 +678,2011-01-31,1,0,1,13,0,1,1,2,0.16,0.2273,0.59,0,2,52,54 +679,2011-01-31,1,0,1,14,0,1,1,2,0.18,0.197,0.55,0.1343,1,85,86 +680,2011-01-31,1,0,1,15,0,1,1,2,0.16,0.1818,0.59,0.1343,3,41,44 +681,2011-01-31,1,0,1,16,0,1,1,2,0.16,0.1818,0.56,0.194,3,83,86 +682,2011-01-31,1,0,1,17,0,1,1,2,0.16,0.1515,0.59,0.194,6,155,161 +683,2011-01-31,1,0,1,18,0,1,1,2,0.16,0.1515,0.55,0.2239,3,153,156 
+684,2011-01-31,1,0,1,19,0,1,1,1,0.3,0.3182,0.61,0.1045,3,108,111 +685,2011-01-31,1,0,1,20,0,1,1,3,0.16,0.1667,0.59,0.1642,0,78,78 +686,2011-01-31,1,0,1,21,0,1,1,3,0.16,0.197,0.59,0.0896,3,53,56 +687,2011-01-31,1,0,1,22,0,1,1,2,0.16,0.1818,0.59,0.1045,0,34,34 +688,2011-01-31,1,0,1,23,0,1,1,2,0.16,0.197,0.64,0.0896,2,15,17 +689,2011-02-01,1,0,2,0,0,2,1,2,0.16,0.1818,0.64,0.1045,2,6,8 +690,2011-02-01,1,0,2,1,0,2,1,2,0.16,0.1818,0.69,0.1045,0,3,3 +691,2011-02-01,1,0,2,2,0,2,1,2,0.16,0.2273,0.69,0,0,2,2 +692,2011-02-01,1,0,2,3,0,2,1,2,0.16,0.2273,0.69,0,0,2,2 +693,2011-02-01,1,0,2,5,0,2,1,3,0.14,0.2121,0.93,0,0,3,3 +694,2011-02-01,1,0,2,6,0,2,1,3,0.14,0.2121,0.93,0,0,22,22 +695,2011-02-01,1,0,2,7,0,2,1,3,0.16,0.2273,0.93,0,0,52,52 +696,2011-02-01,1,0,2,8,0,2,1,3,0.16,0.2273,0.93,0,3,132,135 +697,2011-02-01,1,0,2,9,0,2,1,2,0.16,0.2273,0.93,0,2,114,116 +698,2011-02-01,1,0,2,10,0,2,1,2,0.16,0.2273,0.93,0,0,47,47 +699,2011-02-01,1,0,2,11,0,2,1,2,0.18,0.2424,0.86,0,2,49,51 +700,2011-02-01,1,0,2,12,0,2,1,2,0.2,0.2576,0.86,0,2,53,55 +701,2011-02-01,1,0,2,13,0,2,1,2,0.2,0.2576,0.86,0,3,49,52 +702,2011-02-01,1,0,2,14,0,2,1,2,0.22,0.2576,0.8,0.0896,5,49,54 +703,2011-02-01,1,0,2,15,0,2,1,2,0.24,0.2879,0.75,0,7,45,52 +704,2011-02-01,1,0,2,16,0,2,1,2,0.24,0.2424,0.75,0.1343,3,61,64 +705,2011-02-01,1,0,2,17,0,2,1,2,0.24,0.2879,0.75,0,4,172,176 +706,2011-02-01,1,0,2,18,0,2,1,2,0.24,0.2576,0.81,0.1045,3,165,168 +707,2011-02-01,1,0,2,19,0,2,1,2,0.24,0.2424,0.81,0.1343,3,105,108 +708,2011-02-01,1,0,2,20,0,2,1,2,0.22,0.2273,0.87,0.1343,5,69,74 +709,2011-02-01,1,0,2,21,0,2,1,2,0.22,0.2273,0.87,0.1343,0,64,64 +710,2011-02-01,1,0,2,22,0,2,1,2,0.22,0.2576,0.87,0.0896,2,34,36 +711,2011-02-01,1,0,2,23,0,2,1,3,0.2,0.197,0.93,0.194,1,15,16 +712,2011-02-02,1,0,2,0,0,3,1,3,0.22,0.2424,0.93,0.1045,0,2,2 +713,2011-02-02,1,0,2,1,0,3,1,3,0.22,0.2273,0.93,0.194,0,3,3 +714,2011-02-02,1,0,2,2,0,3,1,3,0.22,0.2273,0.93,0.1343,4,0,4 +715,2011-02-02,1,0,2,3,0,3,1,3,0.22,0.2273,0.93,0.1343,0,1,1 
+716,2011-02-02,1,0,2,4,0,3,1,3,0.22,0.2121,0.93,0.2836,0,1,1 +717,2011-02-02,1,0,2,5,0,3,1,3,0.22,0.2424,0.93,0.1045,0,3,3 +718,2011-02-02,1,0,2,6,0,3,1,3,0.22,0.2424,0.93,0.1045,1,17,18 +719,2011-02-02,1,0,2,7,0,3,1,3,0.22,0.2121,0.93,0.2239,1,48,49 +720,2011-02-02,1,0,2,8,0,3,1,3,0.22,0.2121,0.93,0.2239,1,154,155 +721,2011-02-02,1,0,2,9,0,3,1,2,0.24,0.2576,0.93,0.0896,4,119,123 +722,2011-02-02,1,0,2,10,0,3,1,2,0.22,0.2727,1,0,2,59,61 +723,2011-02-02,1,0,2,11,0,3,1,2,0.24,0.2273,0.93,0.194,5,47,52 +724,2011-02-02,1,0,2,12,0,3,1,2,0.24,0.2273,0.93,0.2239,3,61,64 +725,2011-02-02,1,0,2,13,0,3,1,1,0.34,0.3333,0.93,0.1642,1,74,75 +726,2011-02-02,1,0,2,14,0,3,1,1,0.38,0.3939,0.82,0.3881,2,61,63 +727,2011-02-02,1,0,2,15,0,3,1,1,0.38,0.3939,0.76,0.3284,10,66,76 +728,2011-02-02,1,0,2,16,0,3,1,1,0.36,0.3333,0.71,0.2985,8,95,103 +729,2011-02-02,1,0,2,17,0,3,1,1,0.36,0.3182,0.53,0.5224,7,183,190 +730,2011-02-02,1,0,2,18,0,3,1,1,0.34,0.2879,0.42,0.5522,7,175,182 +731,2011-02-02,1,0,2,19,0,3,1,1,0.28,0.2424,0.45,0.4925,3,88,91 +732,2011-02-02,1,0,2,20,0,3,1,1,0.24,0.197,0.48,0.5522,4,71,75 +733,2011-02-02,1,0,2,21,0,3,1,1,0.22,0.197,0.47,0.3284,1,62,63 +734,2011-02-02,1,0,2,22,0,3,1,1,0.22,0.2121,0.44,0.2537,5,35,40 +735,2011-02-02,1,0,2,23,0,3,1,1,0.2,0.1667,0.44,0.4478,3,29,32 +736,2011-02-03,1,0,2,0,0,4,1,1,0.2,0.1667,0.4,0.4478,1,11,12 +737,2011-02-03,1,0,2,1,0,4,1,1,0.2,0.1515,0.44,0.5224,0,5,5 +738,2011-02-03,1,0,2,2,0,4,1,1,0.18,0.1667,0.43,0.2537,0,2,2 +739,2011-02-03,1,0,2,3,0,4,1,1,0.18,0.1667,0.43,0.2537,0,1,1 +740,2011-02-03,1,0,2,5,0,4,1,1,0.16,0.1364,0.5,0.2985,0,2,2 +741,2011-02-03,1,0,2,6,0,4,1,1,0.16,0.1364,0.43,0.3582,0,39,39 +742,2011-02-03,1,0,2,7,0,4,1,1,0.14,0.1212,0.5,0.3284,1,86,87 +743,2011-02-03,1,0,2,8,0,4,1,1,0.14,0.1212,0.5,0.3582,4,184,188 +744,2011-02-03,1,0,2,9,0,4,1,1,0.16,0.1364,0.47,0.2985,6,127,133 +745,2011-02-03,1,0,2,10,0,4,1,1,0.18,0.1515,0.43,0.3284,2,50,52 +746,2011-02-03,1,0,2,11,0,4,1,1,0.18,0.1364,0.43,0.4478,9,55,64 
+747,2011-02-03,1,0,2,12,0,4,1,1,0.2,0.1818,0.4,0.3582,2,67,69 +748,2011-02-03,1,0,2,13,0,4,1,1,0.2,0.1667,0.4,0.4179,4,47,51 +749,2011-02-03,1,0,2,14,0,4,1,1,0.22,0.197,0.37,0.3881,4,43,47 +750,2011-02-03,1,0,2,15,0,4,1,1,0.22,0.197,0.37,0.3284,4,56,60 +751,2011-02-03,1,0,2,16,0,4,1,1,0.22,0.2121,0.37,0.2537,5,73,78 +752,2011-02-03,1,0,2,17,0,4,1,1,0.2,0.197,0.4,0.194,5,170,175 +753,2011-02-03,1,0,2,18,0,4,1,1,0.2,0.2121,0.4,0.1642,2,145,147 +754,2011-02-03,1,0,2,19,0,4,1,1,0.2,0.2576,0.4,0,4,92,96 +755,2011-02-03,1,0,2,20,0,4,1,1,0.2,0.2273,0.47,0.0896,1,108,109 +756,2011-02-03,1,0,2,21,0,4,1,1,0.18,0.2121,0.55,0.1045,1,53,54 +757,2011-02-03,1,0,2,22,0,4,1,1,0.18,0.2121,0.51,0.0896,2,39,41 +758,2011-02-03,1,0,2,23,0,4,1,1,0.2,0.2273,0.47,0.1045,4,34,38 +759,2011-02-04,1,0,2,0,0,5,1,2,0.2,0.2576,0.44,0,3,10,13 +760,2011-02-04,1,0,2,1,0,5,1,2,0.16,0.2273,0.59,0,0,7,7 +761,2011-02-04,1,0,2,2,0,5,1,2,0.14,0.1667,0.63,0.1045,0,1,1 +762,2011-02-04,1,0,2,3,0,5,1,2,0.14,0.1667,0.63,0.1045,0,1,1 +763,2011-02-04,1,0,2,5,0,5,1,2,0.14,0.1515,0.63,0.1343,0,7,7 +764,2011-02-04,1,0,2,6,0,5,1,2,0.16,0.2273,0.55,0,2,26,28 +765,2011-02-04,1,0,2,7,0,5,1,1,0.14,0.2121,0.59,0,0,87,87 +766,2011-02-04,1,0,2,8,0,5,1,1,0.14,0.1515,0.74,0.1343,3,217,220 +767,2011-02-04,1,0,2,9,0,5,1,2,0.16,0.1818,0.8,0.1343,3,124,127 +768,2011-02-04,1,0,2,10,0,5,1,2,0.2,0.2121,0.51,0.1343,5,46,51 +769,2011-02-04,1,0,2,11,0,5,1,1,0.22,0.2273,0.51,0.1642,3,61,64 +770,2011-02-04,1,0,2,12,0,5,1,2,0.24,0.2424,0.48,0.1642,8,78,86 +771,2011-02-04,1,0,2,13,0,5,1,2,0.26,0.2576,0.5,0.2239,9,73,82 +772,2011-02-04,1,0,2,14,0,5,1,2,0.28,0.2727,0.45,0.1642,15,76,91 +773,2011-02-04,1,0,2,15,0,5,1,2,0.28,0.2727,0.48,0.2537,9,81,90 +774,2011-02-04,1,0,2,16,0,5,1,2,0.3,0.2879,0.42,0.2239,8,91,99 +775,2011-02-04,1,0,2,17,0,5,1,2,0.26,0.2727,0.56,0.1343,10,195,205 +776,2011-02-04,1,0,2,18,0,5,1,2,0.24,0.2576,0.6,0.1045,3,152,155 +777,2011-02-04,1,0,2,19,0,5,1,2,0.24,0.2424,0.65,0.1343,1,102,103 
+778,2011-02-04,1,0,2,20,0,5,1,2,0.24,0.2424,0.65,0.1642,2,69,71 +779,2011-02-04,1,0,2,21,0,5,1,2,0.24,0.2424,0.7,0.1642,2,41,43 +780,2011-02-04,1,0,2,22,0,5,1,2,0.24,0.2424,0.65,0.1642,1,45,46 +781,2011-02-04,1,0,2,23,0,5,1,2,0.24,0.2424,0.7,0.1343,1,30,31 +782,2011-02-05,1,0,2,0,0,6,0,2,0.24,0.2424,0.7,0.1642,3,36,39 +783,2011-02-05,1,0,2,1,0,6,0,2,0.24,0.2424,0.65,0.1642,1,17,18 +784,2011-02-05,1,0,2,2,0,6,0,2,0.24,0.2424,0.75,0.1642,5,12,17 +785,2011-02-05,1,0,2,3,0,6,0,2,0.24,0.2424,0.75,0.1642,1,10,11 +786,2011-02-05,1,0,2,4,0,6,0,3,0.22,0.2273,0.93,0.1343,0,8,8 +787,2011-02-05,1,0,2,5,0,6,0,3,0.2,0.2273,1,0.0896,0,9,9 +788,2011-02-05,1,0,2,6,0,6,0,3,0.2,0.2576,1,0,0,4,4 +789,2011-02-05,1,0,2,7,0,6,0,3,0.22,0.2576,0.93,0.0896,0,4,4 +790,2011-02-05,1,0,2,8,0,6,0,3,0.2,0.2273,1,0.0896,0,10,10 +791,2011-02-05,1,0,2,9,0,6,0,3,0.2,0.2273,1,0.0896,3,17,20 +792,2011-02-05,1,0,2,10,0,6,0,3,0.2,0.2121,1,0.1343,3,31,34 +793,2011-02-05,1,0,2,11,0,6,0,3,0.22,0.2273,1,0.1343,1,46,47 +794,2011-02-05,1,0,2,12,0,6,0,3,0.22,0.2273,1,0.1642,10,42,52 +795,2011-02-05,1,0,2,13,0,6,0,3,0.22,0.2273,1,0.1642,10,62,72 +796,2011-02-05,1,0,2,14,0,6,0,3,0.22,0.2727,1,0,5,50,55 +797,2011-02-05,1,0,2,15,0,6,0,3,0.22,0.2727,1,0,11,49,60 +798,2011-02-05,1,0,2,16,0,6,0,3,0.22,0.2273,1,0.1343,8,63,71 +799,2011-02-05,1,0,2,17,0,6,0,2,0.24,0.2121,1,0.2836,14,64,78 +800,2011-02-05,1,0,2,18,0,6,0,2,0.28,0.2424,0.93,0.4478,2,81,83 +801,2011-02-05,1,0,2,19,0,6,0,2,0.28,0.2424,0.93,0.4478,6,78,84 +802,2011-02-05,1,0,2,20,0,6,0,1,0.3,0.2879,0.87,0.2537,5,64,69 +803,2011-02-05,1,0,2,21,0,6,0,1,0.26,0.2576,1,0.194,3,53,56 +804,2011-02-05,1,0,2,22,0,6,0,1,0.26,0.2727,0.93,0.1343,2,43,45 +805,2011-02-05,1,0,2,23,0,6,0,1,0.26,0.2576,0.93,0.2239,7,52,59 +806,2011-02-06,1,0,2,0,0,0,0,1,0.26,0.2576,0.7,0.194,2,37,39 +807,2011-02-06,1,0,2,1,0,0,0,1,0.26,0.2273,0.65,0.4179,4,40,44 +808,2011-02-06,1,0,2,2,0,0,0,1,0.26,0.2273,0.6,0.3284,0,20,20 +809,2011-02-06,1,0,2,3,0,0,0,1,0.26,0.2879,0.6,0.0896,3,10,13 
+810,2011-02-06,1,0,2,4,0,0,0,1,0.26,0.2273,0.6,0.3582,0,2,2 +811,2011-02-06,1,0,2,5,0,0,0,1,0.26,0.2576,0.6,0.2239,0,1,1 +812,2011-02-06,1,0,2,6,0,0,0,1,0.26,0.2576,0.6,0.2239,0,1,1 +813,2011-02-06,1,0,2,7,0,0,0,1,0.24,0.2424,0.65,0.1642,0,8,8 +814,2011-02-06,1,0,2,8,0,0,0,1,0.24,0.2576,0.65,0.1045,2,21,23 +815,2011-02-06,1,0,2,9,0,0,0,1,0.28,0.2879,0.56,0.1045,7,38,45 +816,2011-02-06,1,0,2,10,0,0,0,1,0.3,0.2879,0.52,0.2537,15,74,89 +817,2011-02-06,1,0,2,11,0,0,0,1,0.32,0.303,0.49,0.2537,28,89,117 +818,2011-02-06,1,0,2,12,0,0,0,1,0.34,0.3333,0.46,0,48,126,174 +819,2011-02-06,1,0,2,13,0,0,0,1,0.34,0.3636,0.46,0,47,135,182 +820,2011-02-06,1,0,2,14,0,0,0,1,0.34,0.3485,0.46,0.0896,47,114,161 +821,2011-02-06,1,0,2,15,0,0,0,1,0.34,0.3485,0.46,0.0896,52,130,182 +822,2011-02-06,1,0,2,16,0,0,0,1,0.34,0.3485,0.49,0.1045,42,115,157 +823,2011-02-06,1,0,2,17,0,0,0,1,0.34,0.3636,0.46,0,24,97,121 +824,2011-02-06,1,0,2,18,0,0,0,1,0.3,0.303,0.56,0.1642,13,65,78 +825,2011-02-06,1,0,2,19,0,0,0,1,0.28,0.2879,0.61,0.1343,1,20,21 +826,2011-02-06,1,0,2,20,0,0,0,1,0.28,0.2879,0.61,0.1045,5,21,26 +827,2011-02-06,1,0,2,21,0,0,0,1,0.26,0.303,0.6,0,5,22,27 +828,2011-02-06,1,0,2,22,0,0,0,1,0.26,0.303,0.6,0,5,57,62 +829,2011-02-06,1,0,2,23,0,0,0,1,0.24,0.2879,0.65,0,4,26,30 +830,2011-02-07,1,0,2,0,0,1,1,1,0.24,0.2879,0.65,0,1,14,15 +831,2011-02-07,1,0,2,1,0,1,1,1,0.22,0.2727,0.75,0,1,4,5 +832,2011-02-07,1,0,2,2,0,1,1,1,0.2,0.2576,0.8,0,0,3,3 +833,2011-02-07,1,0,2,3,0,1,1,1,0.2,0.2576,0.86,0,0,1,1 +834,2011-02-07,1,0,2,4,0,1,1,1,0.2,0.2576,0.86,0,1,1,2 +835,2011-02-07,1,0,2,5,0,1,1,1,0.2,0.2576,0.86,0,1,9,10 +836,2011-02-07,1,0,2,6,0,1,1,1,0.18,0.2424,0.93,0,1,29,30 +837,2011-02-07,1,0,2,7,0,1,1,1,0.18,0.2424,0.86,0,6,89,95 +838,2011-02-07,1,0,2,8,0,1,1,2,0.16,0.2273,1,0,7,223,230 +839,2011-02-07,1,0,2,9,0,1,1,1,0.22,0.2727,0.8,0,3,115,118 +840,2011-02-07,1,0,2,10,0,1,1,1,0.24,0.2576,0.75,0.1045,6,49,55 +841,2011-02-07,1,0,2,11,0,1,1,1,0.3,0.3182,0.65,0.0896,11,36,47 
+842,2011-02-07,1,0,2,12,0,1,1,2,0.32,0.3485,0.62,0,7,59,66 +843,2011-02-07,1,0,2,13,0,1,1,2,0.36,0.3636,0.57,0.0896,10,54,64 +844,2011-02-07,1,0,2,14,0,1,1,2,0.36,0.3636,0.57,0.0896,8,52,60 +845,2011-02-07,1,0,2,15,0,1,1,2,0.38,0.3939,0.54,0.0896,4,46,50 +846,2011-02-07,1,0,2,16,0,1,1,2,0.36,0.3485,0.57,0.1343,16,98,114 +847,2011-02-07,1,0,2,17,0,1,1,2,0.32,0.3182,0.7,0.1642,9,207,216 +848,2011-02-07,1,0,2,18,0,1,1,2,0.34,0.3333,0.66,0.1343,5,170,175 +849,2011-02-07,1,0,2,19,0,1,1,2,0.32,0.3485,0.7,0,5,123,128 +850,2011-02-07,1,0,2,20,0,1,1,2,0.32,0.3333,0.7,0.1045,6,82,88 +851,2011-02-07,1,0,2,21,0,1,1,1,0.32,0.3485,0.7,0,3,75,78 +852,2011-02-07,1,0,2,22,0,1,1,1,0.28,0.303,0.81,0.0896,3,34,37 +853,2011-02-07,1,0,2,23,0,1,1,2,0.3,0.3333,0.81,0,6,19,25 +854,2011-02-08,1,0,2,0,0,2,1,2,0.28,0.3182,0.87,0,4,6,10 +855,2011-02-08,1,0,2,1,0,2,1,2,0.28,0.3182,0.87,0,0,4,4 +856,2011-02-08,1,0,2,2,0,2,1,2,0.26,0.2727,0.93,0.1045,1,1,2 +857,2011-02-08,1,0,2,3,0,2,1,3,0.28,0.2727,0.93,0.1642,0,1,1 +858,2011-02-08,1,0,2,4,0,2,1,1,0.26,0.2576,0.93,0.1642,0,3,3 +859,2011-02-08,1,0,2,5,0,2,1,1,0.26,0.2273,0.81,0.3284,0,2,2 +860,2011-02-08,1,0,2,6,0,2,1,1,0.26,0.2273,0.7,0.3284,0,39,39 +861,2011-02-08,1,0,2,7,0,2,1,1,0.24,0.197,0.65,0.4179,3,97,100 +862,2011-02-08,1,0,2,8,0,2,1,1,0.24,0.197,0.56,0.4925,7,236,243 +863,2011-02-08,1,0,2,9,0,2,1,1,0.24,0.197,0.52,0.4925,7,128,135 +864,2011-02-08,1,0,2,10,0,2,1,1,0.22,0.1818,0.47,0.5522,4,44,48 +865,2011-02-08,1,0,2,11,0,2,1,1,0.22,0.1818,0.47,0.4627,1,49,50 +866,2011-02-08,1,0,2,12,0,2,1,1,0.24,0.197,0.38,0.4925,2,63,65 +867,2011-02-08,1,0,2,13,0,2,1,2,0.24,0.197,0.32,0.4478,2,48,50 +868,2011-02-08,1,0,2,14,0,2,1,1,0.22,0.197,0.37,0.4179,3,61,64 +869,2011-02-08,1,0,2,15,0,2,1,1,0.22,0.197,0.35,0.3881,6,45,51 +870,2011-02-08,1,0,2,16,0,2,1,1,0.22,0.1818,0.35,0.5224,4,79,83 +871,2011-02-08,1,0,2,17,0,2,1,1,0.22,0.1818,0.32,0.5821,4,172,176 +872,2011-02-08,1,0,2,18,0,2,1,1,0.2,0.1818,0.32,0.3881,1,151,152 
+873,2011-02-08,1,0,2,19,0,2,1,1,0.16,0.1212,0.4,0.4627,1,100,101 +874,2011-02-08,1,0,2,20,0,2,1,1,0.16,0.1364,0.4,0.3284,3,53,56 +875,2011-02-08,1,0,2,21,0,2,1,1,0.14,0.1061,0.33,0.4627,8,46,54 +876,2011-02-08,1,0,2,22,0,2,1,1,0.12,0.1061,0.33,0.3582,0,29,29 +877,2011-02-08,1,0,2,23,0,2,1,1,0.12,0.1061,0.33,0.3284,3,9,12 +878,2011-02-09,1,0,2,0,0,3,1,1,0.1,0.0758,0.36,0.3582,0,17,17 +879,2011-02-09,1,0,2,1,0,3,1,1,0.1,0.1061,0.36,0.2239,0,7,7 +880,2011-02-09,1,0,2,2,0,3,1,1,0.08,0.0758,0.38,0.2836,1,2,3 +881,2011-02-09,1,0,2,3,0,3,1,1,0.06,0.0758,0.45,0.1343,0,2,2 +882,2011-02-09,1,0,2,5,0,3,1,1,0.06,0.1061,0.45,0.1045,0,7,7 +883,2011-02-09,1,0,2,6,0,3,1,1,0.06,0.1515,0.45,0,0,43,43 +884,2011-02-09,1,0,2,7,0,3,1,1,0.06,0.1061,0.49,0.1045,4,95,99 +885,2011-02-09,1,0,2,8,0,3,1,1,0.1,0.1364,0.42,0,1,198,199 +886,2011-02-09,1,0,2,9,0,3,1,1,0.12,0.1364,0.39,0.1642,4,119,123 +887,2011-02-09,1,0,2,10,0,3,1,1,0.14,0.1818,0.36,0,8,51,59 +888,2011-02-09,1,0,2,11,0,3,1,2,0.14,0.1515,0.43,0.1642,1,40,41 +889,2011-02-09,1,0,2,12,0,3,1,2,0.18,0.1818,0.4,0.2239,4,57,61 +890,2011-02-09,1,0,2,13,0,3,1,1,0.18,0.1667,0.4,0.2537,2,67,69 +891,2011-02-09,1,0,2,14,0,3,1,1,0.2,0.1818,0.34,0.2985,2,56,58 +892,2011-02-09,1,0,2,15,0,3,1,2,0.2,0.1818,0.34,0.2836,3,61,64 +893,2011-02-09,1,0,2,16,0,3,1,2,0.2,0.197,0.37,0.2537,7,72,79 +894,2011-02-09,1,0,2,17,0,3,1,2,0.2,0.197,0.34,0.2537,9,157,166 +895,2011-02-09,1,0,2,18,0,3,1,2,0.18,0.1667,0.47,0.2985,2,168,170 +896,2011-02-09,1,0,2,19,0,3,1,3,0.14,0.1212,0.86,0.2537,1,87,88 +897,2011-02-09,1,0,2,20,0,3,1,3,0.14,0.1515,0.86,0.1642,0,84,84 +898,2011-02-09,1,0,2,21,0,3,1,2,0.14,0.1515,0.86,0.1642,0,83,83 +899,2011-02-09,1,0,2,22,0,3,1,3,0.16,0.1667,0.8,0.1642,4,42,46 +900,2011-02-09,1,0,2,23,0,3,1,3,0.16,0.1515,0.8,0.194,0,37,37 +901,2011-02-10,1,0,2,0,0,4,1,3,0.14,0.1364,0.86,0.194,0,16,16 +902,2011-02-10,1,0,2,1,0,4,1,3,0.14,0.1515,0.8,0.1343,0,7,7 +903,2011-02-10,1,0,2,2,0,4,1,3,0.14,0.1515,0.8,0.1343,0,3,3 
+904,2011-02-10,1,0,2,4,0,4,1,2,0.14,0.1364,0.59,0.2239,0,1,1 +905,2011-02-10,1,0,2,5,0,4,1,2,0.12,0.1212,0.5,0.2239,0,6,6 +906,2011-02-10,1,0,2,6,0,4,1,2,0.12,0.1212,0.54,0.2836,0,26,26 +907,2011-02-10,1,0,2,7,0,4,1,1,0.1,0.0758,0.5,0.4179,0,99,99 +908,2011-02-10,1,0,2,8,0,4,1,1,0.1,0.0758,0.49,0.3284,5,173,178 +909,2011-02-10,1,0,2,9,0,4,1,1,0.12,0.1061,0.42,0.3582,1,121,122 +910,2011-02-10,1,0,2,10,0,4,1,1,0.12,0.1061,0.42,0.2985,1,34,35 +911,2011-02-10,1,0,2,11,0,4,1,1,0.14,0.1212,0.39,0.3582,1,44,45 +912,2011-02-10,1,0,2,12,0,4,1,1,0.16,0.1364,0.34,0.3881,4,65,69 +913,2011-02-10,1,0,2,13,0,4,1,1,0.18,0.1667,0.29,0.2985,3,59,62 +914,2011-02-10,1,0,2,14,0,4,1,1,0.2,0.1818,0.27,0.2836,6,42,48 +915,2011-02-10,1,0,2,15,0,4,1,1,0.2,0.197,0.25,0.2537,0,50,50 +916,2011-02-10,1,0,2,16,0,4,1,1,0.2,0.1818,0.27,0.2985,4,76,80 +917,2011-02-10,1,0,2,17,0,4,1,1,0.18,0.1818,0.26,0.194,6,159,165 +918,2011-02-10,1,0,2,18,0,4,1,1,0.16,0.1818,0.28,0.1343,3,157,160 +919,2011-02-10,1,0,2,19,0,4,1,1,0.14,0.1667,0.28,0.1045,2,110,112 +920,2011-02-10,1,0,2,20,0,4,1,1,0.14,0.1818,0.31,0.0896,4,93,97 +921,2011-02-10,1,0,2,21,0,4,1,1,0.14,0.2121,0.39,0,2,70,72 +922,2011-02-10,1,0,2,22,0,4,1,1,0.12,0.197,0.39,0,4,47,51 +923,2011-02-10,1,0,2,23,0,4,1,1,0.12,0.1515,0.42,0.1045,1,33,34 +924,2011-02-11,1,0,2,0,0,5,1,1,0.1,0.1364,0.49,0.1045,2,12,14 +925,2011-02-11,1,0,2,1,0,5,1,1,0.1,0.1364,0.54,0.0896,1,6,7 +926,2011-02-11,1,0,2,2,0,5,1,1,0.1,0.1364,0.54,0.0896,0,3,3 +927,2011-02-11,1,0,2,5,0,5,1,1,0.08,0.1212,0.63,0.0896,0,4,4 +928,2011-02-11,1,0,2,6,0,5,1,1,0.1,0.1818,0.68,0,1,23,24 +929,2011-02-11,1,0,2,7,0,5,1,1,0.08,0.1667,0.73,0,1,73,74 +930,2011-02-11,1,0,2,8,0,5,1,1,0.1,0.1212,0.74,0.1642,4,212,216 +931,2011-02-11,1,0,2,9,0,5,1,1,0.12,0.1212,0.74,0.2239,8,132,140 +932,2011-02-11,1,0,2,10,0,5,1,1,0.14,0.1364,0.69,0.194,5,39,44 +933,2011-02-11,1,0,2,11,0,5,1,1,0.22,0.2273,0.47,0.1343,12,52,64 +934,2011-02-11,1,0,2,12,0,5,1,1,0.22,0.2273,0.47,0.1343,7,64,71 
+935,2011-02-11,1,0,2,13,0,5,1,1,0.24,0.2273,0.35,0.194,21,89,110 +936,2011-02-11,1,0,2,14,0,5,1,1,0.3,0.2879,0.26,0.2537,17,67,84 +937,2011-02-11,1,0,2,15,0,5,1,1,0.32,0.3182,0.21,0.1642,12,62,74 +938,2011-02-11,1,0,2,16,0,5,1,1,0.3,0.2879,0.28,0.194,14,111,125 +939,2011-02-11,1,0,2,17,0,5,1,1,0.3,0.3333,0.24,0,18,193,211 +940,2011-02-11,1,0,2,18,0,5,1,1,0.28,0.3182,0.28,0,9,165,174 +941,2011-02-11,1,0,2,19,0,5,1,1,0.26,0.303,0.33,0,7,94,101 +942,2011-02-11,1,0,2,20,0,5,1,1,0.22,0.2273,0.55,0.1343,2,61,63 +943,2011-02-11,1,0,2,21,0,5,1,1,0.2,0.2121,0.59,0.1343,1,46,47 +944,2011-02-11,1,0,2,22,0,5,1,1,0.2,0.2273,0.64,0.0896,2,41,43 +945,2011-02-11,1,0,2,23,0,5,1,1,0.18,0.2424,0.69,0,5,48,53 +946,2011-02-12,1,0,2,0,0,6,0,1,0.16,0.197,0.69,0.0896,3,27,30 +947,2011-02-12,1,0,2,1,0,6,0,1,0.14,0.2121,0.86,0,2,22,24 +948,2011-02-12,1,0,2,2,0,6,0,1,0.14,0.2121,0.8,0,2,13,15 +949,2011-02-12,1,0,2,3,0,6,0,1,0.12,0.197,0.8,0,3,7,10 +950,2011-02-12,1,0,2,4,0,6,0,1,0.12,0.1667,0.74,0.0896,0,4,4 +951,2011-02-12,1,0,2,5,0,6,0,1,0.12,0.1667,0.74,0.0896,0,1,1 +952,2011-02-12,1,0,2,6,0,6,0,1,0.12,0.1364,0.93,0.194,1,1,2 +953,2011-02-12,1,0,2,7,0,6,0,1,0.12,0.1515,0.8,0.1045,2,9,11 +954,2011-02-12,1,0,2,8,0,6,0,1,0.14,0.1515,0.86,0.1343,2,28,30 +955,2011-02-12,1,0,2,9,0,6,0,1,0.16,0.1818,0.64,0.1343,5,38,43 +956,2011-02-12,1,0,2,10,0,6,0,1,0.22,0.2121,0.41,0.2537,13,71,84 +957,2011-02-12,1,0,2,11,0,6,0,1,0.3,0.2727,0.28,0.3284,30,84,114 +958,2011-02-12,1,0,2,12,0,6,0,1,0.3,0.2727,0.39,0.4627,27,93,120 +959,2011-02-12,1,0,2,13,0,6,0,1,0.3,0.2727,0.39,0.4179,32,103,135 +960,2011-02-12,1,0,2,14,0,6,0,1,0.34,0.3182,0.31,0.2836,30,90,120 +961,2011-02-12,1,0,2,15,0,6,0,1,0.34,0.303,0.29,0.4179,47,127,174 +962,2011-02-12,1,0,2,16,0,6,0,1,0.34,0.303,0.29,0.4179,42,103,145 +963,2011-02-12,1,0,2,17,0,6,0,1,0.32,0.2879,0.31,0.5224,24,113,137 +964,2011-02-12,1,0,2,18,0,6,0,1,0.28,0.2576,0.38,0.3284,4,60,64 +965,2011-02-12,1,0,2,19,0,6,0,1,0.28,0.2727,0.38,0.1642,2,39,41 
+966,2011-02-12,1,0,2,20,0,6,0,1,0.26,0.2576,0.41,0.2239,1,39,40 +967,2011-02-12,1,0,2,21,0,6,0,1,0.26,0.303,0.41,0,9,42,51 +968,2011-02-12,1,0,2,22,0,6,0,1,0.24,0.2576,0.44,0.0896,6,39,45 +969,2011-02-12,1,0,2,23,0,6,0,1,0.22,0.2273,0.51,0.1343,1,31,32 +970,2011-02-13,1,0,2,0,0,0,0,1,0.2,0.2273,0.64,0.1045,5,34,39 +971,2011-02-13,1,0,2,1,0,0,0,1,0.2,0.2273,0.59,0.0896,1,23,24 +972,2011-02-13,1,0,2,2,0,0,0,2,0.2,0.2273,0.75,0.0896,1,19,20 +973,2011-02-13,1,0,2,3,0,0,0,2,0.2,0.2273,0.69,0.1045,4,8,12 +974,2011-02-13,1,0,2,4,0,0,0,2,0.2,0.2121,0.69,0.1642,0,2,2 +975,2011-02-13,1,0,2,6,0,0,0,2,0.2,0.2121,0.69,0.1343,2,3,5 +976,2011-02-13,1,0,2,7,0,0,0,2,0.22,0.2727,0.55,0,0,3,3 +977,2011-02-13,1,0,2,8,0,0,0,2,0.22,0.2273,0.64,0.194,1,11,12 +978,2011-02-13,1,0,2,9,0,0,0,2,0.24,0.2273,0.6,0.2239,12,35,47 +979,2011-02-13,1,0,2,10,0,0,0,1,0.3,0.2727,0.45,0.3284,19,86,105 +980,2011-02-13,1,0,2,11,0,0,0,1,0.32,0.2879,0.39,0.4478,26,86,112 +981,2011-02-13,1,0,2,12,0,0,0,1,0.36,0.3182,0.32,0.4627,58,94,152 +982,2011-02-13,1,0,2,13,0,0,0,1,0.38,0.3939,0.29,0.3582,62,92,154 +983,2011-02-13,1,0,2,14,0,0,0,2,0.4,0.4091,0.3,0.4179,51,110,161 +984,2011-02-13,1,0,2,15,0,0,0,2,0.4,0.4091,0.3,0.2985,40,122,162 +985,2011-02-13,1,0,2,16,0,0,0,2,0.42,0.4242,0.28,0.3284,28,106,134 +986,2011-02-13,1,0,2,17,0,0,0,1,0.42,0.4242,0.28,0.3284,30,95,125 +987,2011-02-13,1,0,2,18,0,0,0,1,0.4,0.4091,0.32,0.2985,17,78,95 +988,2011-02-13,1,0,2,19,0,0,0,1,0.4,0.4091,0.35,0.2836,11,50,61 +989,2011-02-13,1,0,2,20,0,0,0,1,0.4,0.4091,0.35,0.3284,15,32,47 +990,2011-02-13,1,0,2,21,0,0,0,1,0.4,0.4091,0.35,0.3582,6,45,51 +991,2011-02-13,1,0,2,22,0,0,0,1,0.4,0.4091,0.35,0.2985,5,31,36 +992,2011-02-13,1,0,2,23,0,0,0,1,0.4,0.4091,0.35,0.3582,3,27,30 +993,2011-02-14,1,0,2,0,0,1,1,1,0.38,0.3939,0.37,0.3582,3,8,11 +994,2011-02-14,1,0,2,1,0,1,1,1,0.38,0.3939,0.37,0.3582,1,6,7 +995,2011-02-14,1,0,2,2,0,1,1,1,0.36,0.3333,0.4,0.2985,0,2,2 +996,2011-02-14,1,0,2,3,0,1,1,1,0.34,0.3182,0.46,0.2239,1,1,2 
+997,2011-02-14,1,0,2,4,0,1,1,1,0.32,0.303,0.53,0.2836,0,2,2 +998,2011-02-14,1,0,2,5,0,1,1,1,0.32,0.303,0.53,0.2836,0,3,3 +999,2011-02-14,1,0,2,6,0,1,1,1,0.34,0.303,0.46,0.2985,1,25,26 +1000,2011-02-14,1,0,2,7,0,1,1,1,0.34,0.303,0.46,0.2985,2,96,98 +1001,2011-02-14,1,0,2,8,0,1,1,1,0.38,0.3939,0.4,0.4627,7,249,256 +1002,2011-02-14,1,0,2,9,0,1,1,1,0.4,0.4091,0.37,0.3881,8,122,130 +1003,2011-02-14,1,0,2,10,0,1,1,1,0.44,0.4394,0.33,0.2239,9,46,55 +1004,2011-02-14,1,0,2,11,0,1,1,1,0.52,0.5,0.23,0.2537,10,43,53 +1005,2011-02-14,1,0,2,12,0,1,1,1,0.56,0.5303,0.22,0.4478,27,99,126 +1006,2011-02-14,1,0,2,13,0,1,1,1,0.58,0.5455,0.19,0.3881,27,93,120 +1007,2011-02-14,1,0,2,14,0,1,1,1,0.6,0.5909,0.15,0.4925,14,76,90 +1008,2011-02-14,1,0,2,15,0,1,1,1,0.56,0.5303,0.21,0.6567,19,71,90 +1009,2011-02-14,1,0,2,16,0,1,1,1,0.52,0.5,0.27,0.4627,16,102,118 +1010,2011-02-14,1,0,2,17,0,1,1,1,0.46,0.4545,0.33,0.6119,25,218,243 +1011,2011-02-14,1,0,2,18,0,1,1,1,0.4,0.4091,0.4,0.6119,11,194,205 +1012,2011-02-14,1,0,2,19,0,1,1,1,0.38,0.3939,0.43,0.4925,12,86,98 +1013,2011-02-14,1,0,2,20,0,1,1,1,0.36,0.3182,0.46,0.4627,5,65,70 +1014,2011-02-14,1,0,2,21,0,1,1,1,0.36,0.3182,0.5,0.5224,8,35,43 +1015,2011-02-14,1,0,2,22,0,1,1,1,0.34,0.2879,0.46,0.6567,1,44,45 +1016,2011-02-14,1,0,2,23,0,1,1,1,0.32,0.2879,0.49,0.4925,1,19,20 +1017,2011-02-15,1,0,2,0,0,2,1,1,0.3,0.2727,0.49,0.4179,7,12,19 +1018,2011-02-15,1,0,2,1,0,2,1,1,0.3,0.2424,0.42,0.7761,0,5,5 +1019,2011-02-15,1,0,2,2,0,2,1,1,0.28,0.2273,0.41,0.6866,1,2,3 +1020,2011-02-15,1,0,2,4,0,2,1,1,0.22,0.1818,0.37,0.5224,0,1,1 +1021,2011-02-15,1,0,2,5,0,2,1,1,0.22,0.1818,0.32,0.4627,0,4,4 +1022,2011-02-15,1,0,2,6,0,2,1,1,0.2,0.1818,0.32,0.3284,0,30,30 +1023,2011-02-15,1,0,2,7,0,2,1,1,0.2,0.1818,0.32,0.3582,2,103,105 +1024,2011-02-15,1,0,2,8,0,2,1,1,0.2,0.1818,0.32,0.3582,10,213,223 +1025,2011-02-15,1,0,2,9,0,2,1,1,0.22,0.197,0.29,0.4478,2,108,110 +1026,2011-02-15,1,0,2,10,0,2,1,1,0.24,0.2121,0.27,0.2836,5,47,52 
+1027,2011-02-15,1,0,2,11,0,2,1,1,0.26,0.2424,0.25,0.2537,11,46,57 +1028,2011-02-15,1,0,2,12,0,2,1,1,0.28,0.2727,0.24,0.2537,6,65,71 +1029,2011-02-15,1,0,2,13,0,2,1,1,0.32,0.303,0.21,0.2239,14,68,82 +1030,2011-02-15,1,0,2,14,0,2,1,1,0.34,0.3182,0.19,0.2239,10,69,79 +1031,2011-02-15,1,0,2,15,0,2,1,1,0.34,0.3333,0.19,0.1642,11,74,85 +1032,2011-02-15,1,0,2,16,0,2,1,1,0.34,0.3182,0.19,0.2537,21,77,98 +1033,2011-02-15,1,0,2,17,0,2,1,1,0.32,0.303,0.22,0.2239,15,191,206 +1034,2011-02-15,1,0,2,18,0,2,1,1,0.3,0.303,0.22,0.1343,14,198,212 +1035,2011-02-15,1,0,2,19,0,2,1,1,0.28,0.2879,0.26,0.1343,3,142,145 +1036,2011-02-15,1,0,2,20,0,2,1,1,0.26,0.2727,0.33,0.1045,3,98,101 +1037,2011-02-15,1,0,2,21,0,2,1,1,0.24,0.2879,0.52,0,5,61,66 +1038,2011-02-15,1,0,2,22,0,2,1,1,0.24,0.2879,0.44,0,0,41,41 +1039,2011-02-15,1,0,2,23,0,2,1,2,0.22,0.2576,0.44,0.0896,0,20,20 +1040,2011-02-16,1,0,2,0,0,3,1,1,0.22,0.2576,0.41,0.0896,0,15,15 +1041,2011-02-16,1,0,2,1,0,3,1,1,0.2,0.2273,0.44,0.0896,0,9,9 +1042,2011-02-16,1,0,2,3,0,3,1,2,0.2,0.197,0.47,0.194,0,1,1 +1043,2011-02-16,1,0,2,4,0,3,1,1,0.2,0.197,0.51,0.194,0,1,1 +1044,2011-02-16,1,0,2,5,0,3,1,1,0.2,0.197,0.47,0.194,0,5,5 +1045,2011-02-16,1,0,2,6,0,3,1,1,0.2,0.197,0.55,0.2239,1,32,33 +1046,2011-02-16,1,0,2,7,0,3,1,2,0.2,0.197,0.55,0.2239,5,103,108 +1047,2011-02-16,1,0,2,8,0,3,1,2,0.22,0.2273,0.55,0.1642,6,224,230 +1048,2011-02-16,1,0,2,9,0,3,1,1,0.24,0.2121,0.52,0.2836,2,122,124 +1049,2011-02-16,1,0,2,10,0,3,1,1,0.26,0.2273,0.41,0.3881,14,55,69 +1050,2011-02-16,1,0,2,11,0,3,1,1,0.34,0.303,0.34,0.2985,7,59,66 +1051,2011-02-16,1,0,2,12,0,3,1,1,0.38,0.3939,0.32,0.3284,14,72,86 +1052,2011-02-16,1,0,2,13,0,3,1,1,0.42,0.4242,0.3,0.3582,13,80,93 +1053,2011-02-16,1,0,2,14,0,3,1,1,0.46,0.4545,0.28,0.4179,17,65,82 +1054,2011-02-16,1,0,2,15,0,3,1,1,0.46,0.4545,0.28,0.4179,35,82,117 +1055,2011-02-16,1,0,2,16,0,3,1,1,0.46,0.4545,0.31,0.3881,26,96,122 +1056,2011-02-16,1,0,2,17,0,3,1,1,0.46,0.4545,0.28,0.2836,11,244,255 
+1057,2011-02-16,1,0,2,18,0,3,1,1,0.4,0.4091,0.4,0.2239,20,202,222 +1058,2011-02-16,1,0,2,19,0,3,1,1,0.34,0.3182,0.53,0.2239,18,143,161 +1059,2011-02-16,1,0,2,20,0,3,1,1,0.38,0.3939,0.43,0.194,10,108,118 +1060,2011-02-16,1,0,2,21,0,3,1,1,0.36,0.3485,0.46,0.194,5,87,92 +1061,2011-02-16,1,0,2,22,0,3,1,1,0.34,0.3333,0.53,0.194,12,61,73 +1062,2011-02-16,1,0,2,23,0,3,1,1,0.38,0.3939,0.4,0.2239,2,31,33 +1063,2011-02-17,1,0,2,0,0,4,1,1,0.34,0.3333,0.53,0.194,1,16,17 +1064,2011-02-17,1,0,2,1,0,4,1,1,0.34,0.3182,0.53,0.2239,0,6,6 +1065,2011-02-17,1,0,2,2,0,4,1,2,0.34,0.3182,0.53,0.2239,2,4,6 +1066,2011-02-17,1,0,2,3,0,4,1,1,0.34,0.3333,0.53,0.194,3,1,4 +1067,2011-02-17,1,0,2,4,0,4,1,1,0.32,0.3182,0.57,0.194,3,1,4 +1068,2011-02-17,1,0,2,5,0,4,1,1,0.32,0.3333,0.66,0.0896,1,11,12 +1069,2011-02-17,1,0,2,6,0,4,1,1,0.3,0.303,0.7,0.1343,3,44,47 +1070,2011-02-17,1,0,2,7,0,4,1,1,0.32,0.3333,0.57,0.1045,7,119,126 +1071,2011-02-17,1,0,2,8,0,4,1,1,0.32,0.3333,0.57,0.0896,18,267,285 +1072,2011-02-17,1,0,2,9,0,4,1,1,0.36,0.3485,0.57,0.194,16,163,179 +1073,2011-02-17,1,0,2,10,0,4,1,1,0.38,0.3939,0.54,0.194,18,70,88 +1074,2011-02-17,1,0,2,11,0,4,1,2,0.44,0.4394,0.44,0.2537,19,71,90 +1075,2011-02-17,1,0,2,12,0,4,1,1,0.48,0.4697,0.41,0.2239,15,86,101 +1076,2011-02-17,1,0,2,13,0,4,1,1,0.54,0.5152,0.34,0.2239,15,108,123 +1077,2011-02-17,1,0,2,14,0,4,1,1,0.6,0.6212,0.31,0.2239,26,55,81 +1078,2011-02-17,1,0,2,15,0,4,1,1,0.6,0.6061,0.28,0.2537,15,91,106 +1079,2011-02-17,1,0,2,16,0,4,1,2,0.56,0.5303,0.35,0.2985,29,117,146 +1080,2011-02-17,1,0,2,17,0,4,1,2,0.58,0.5455,0.32,0.2985,18,256,274 +1081,2011-02-17,1,0,2,18,0,4,1,2,0.54,0.5152,0.42,0.2836,11,211,222 +1082,2011-02-17,1,0,2,19,0,4,1,1,0.48,0.4697,0.55,0.3284,14,161,175 +1083,2011-02-17,1,0,2,20,0,4,1,1,0.48,0.4697,0.59,0.3284,8,131,139 +1084,2011-02-17,1,0,2,21,0,4,1,1,0.52,0.5,0.55,0.3881,5,119,124 +1085,2011-02-17,1,0,2,22,0,4,1,1,0.5,0.4848,0.59,0.2836,8,68,76 +1086,2011-02-17,1,0,2,23,0,4,1,1,0.46,0.4545,0.67,0.2985,4,40,44 
+1087,2011-02-18,1,0,2,0,0,5,1,1,0.44,0.4394,0.72,0.2836,12,20,32 +1088,2011-02-18,1,0,2,1,0,5,1,1,0.44,0.4394,0.72,0.2239,1,7,8 +1089,2011-02-18,1,0,2,2,0,5,1,1,0.44,0.4394,0.72,0.2836,2,5,7 +1090,2011-02-18,1,0,2,3,0,5,1,1,0.46,0.4545,0.67,0.2537,2,6,8 +1091,2011-02-18,1,0,2,4,0,5,1,1,0.46,0.4545,0.67,0.2537,0,1,1 +1092,2011-02-18,1,0,2,5,0,5,1,2,0.46,0.4545,0.67,0.1045,1,6,7 +1093,2011-02-18,1,0,2,6,0,5,1,2,0.44,0.4394,0.72,0.1642,2,48,50 +1094,2011-02-18,1,0,2,7,0,5,1,2,0.42,0.4242,0.77,0.2239,8,108,116 +1095,2011-02-18,1,0,2,8,0,5,1,2,0.42,0.4242,0.77,0.194,26,246,272 +1096,2011-02-18,1,0,2,9,0,5,1,2,0.42,0.4242,0.77,0.194,15,154,169 +1097,2011-02-18,1,0,2,10,0,5,1,2,0.44,0.4394,0.72,0.2239,17,78,95 +1098,2011-02-18,1,0,2,11,0,5,1,1,0.44,0.4394,0.72,0.1642,31,82,113 +1099,2011-02-18,1,0,2,12,0,5,1,1,0.5,0.4848,0.59,0.194,59,126,185 +1100,2011-02-18,1,0,2,13,0,5,1,1,0.6,0.6212,0.43,0.194,45,131,176 +1101,2011-02-18,1,0,2,14,0,5,1,1,0.66,0.6212,0.36,0.2985,73,118,191 +1102,2011-02-18,1,0,2,15,0,5,1,1,0.66,0.6212,0.36,0.3284,55,117,172 +1103,2011-02-18,1,0,2,16,0,5,1,1,0.66,0.6212,0.36,0.2836,68,164,232 +1104,2011-02-18,1,0,2,17,0,5,1,1,0.66,0.6212,0.34,0.3582,52,275,327 +1105,2011-02-18,1,0,2,18,0,5,1,1,0.64,0.6212,0.33,0.3284,29,195,224 +1106,2011-02-18,1,0,2,19,0,5,1,1,0.62,0.6212,0.29,0.5821,16,146,162 +1107,2011-02-18,1,0,2,20,0,5,1,1,0.6,0.6212,0.31,0.194,19,105,124 +1108,2011-02-18,1,0,2,21,0,5,1,1,0.58,0.5455,0.21,0.4925,11,61,72 +1109,2011-02-18,1,0,2,22,0,5,1,1,0.54,0.5152,0.1,0.2537,19,88,107 +1110,2011-02-18,1,0,2,23,0,5,1,1,0.52,0.5,0.08,0.2836,16,61,77 +1111,2011-02-19,1,0,2,0,0,6,0,1,0.48,0.4697,0.12,0.4925,6,23,29 +1112,2011-02-19,1,0,2,1,0,6,0,1,0.46,0.4545,0.14,0.4179,10,21,31 +1113,2011-02-19,1,0,2,2,0,6,0,1,0.44,0.4394,0.13,0.3881,3,14,17 +1114,2011-02-19,1,0,2,3,0,6,0,1,0.42,0.4242,0.14,0.2985,0,7,7 +1115,2011-02-19,1,0,2,4,0,6,0,1,0.4,0.4091,0.15,0.3284,0,3,3 +1116,2011-02-19,1,0,2,5,0,6,0,1,0.4,0.4091,0.15,0.3284,0,3,3 
+1117,2011-02-19,1,0,2,6,0,6,0,1,0.4,0.4091,0.17,0.4179,3,3,6 +1118,2011-02-19,1,0,2,7,0,6,0,1,0.38,0.3939,0.17,0.5224,6,16,22 +1119,2011-02-19,1,0,2,8,0,6,0,1,0.38,0.3939,0.17,0.5821,9,36,45 +1120,2011-02-19,1,0,2,9,0,6,0,1,0.4,0.4091,0.16,0.6567,18,37,55 +1121,2011-02-19,1,0,2,10,0,6,0,1,0.42,0.4242,0.16,0.5821,34,72,106 +1122,2011-02-19,1,0,2,11,0,6,0,1,0.44,0.4394,0.16,0.5821,47,76,123 +1123,2011-02-19,1,0,2,12,0,6,0,1,0.44,0.4394,0.18,0.4925,38,81,119 +1124,2011-02-19,1,0,2,13,0,6,0,1,0.44,0.4394,0.16,0.6119,52,103,155 +1125,2011-02-19,1,0,2,14,0,6,0,1,0.46,0.4545,0.15,0.6567,102,94,196 +1126,2011-02-19,1,0,2,15,0,6,0,1,0.44,0.4394,0.16,0.7463,84,87,171 +1127,2011-02-19,1,0,2,16,0,6,0,1,0.44,0.4394,0.16,0.6418,39,81,120 +1128,2011-02-19,1,0,2,17,0,6,0,1,0.42,0.4242,0.19,0.6119,36,91,127 +1129,2011-02-19,1,0,2,18,0,6,0,1,0.36,0.3182,0.25,0.4478,21,67,88 +1130,2011-02-19,1,0,2,19,0,6,0,1,0.34,0.303,0.29,0.3582,5,54,59 +1131,2011-02-19,1,0,2,20,0,6,0,1,0.32,0.2879,0.28,0.5224,9,38,47 +1132,2011-02-19,1,0,2,21,0,6,0,1,0.32,0.2727,0.26,0.5522,4,29,33 +1133,2011-02-19,1,0,2,22,0,6,0,1,0.3,0.2576,0.28,0.4925,2,42,44 +1134,2011-02-19,1,0,2,23,0,6,0,1,0.28,0.2424,0.33,0.4478,4,25,29 +1135,2011-02-20,1,0,2,0,0,0,0,1,0.26,0.2121,0.35,0.4478,3,14,17 +1136,2011-02-20,1,0,2,1,0,0,0,1,0.24,0.197,0.41,0.4627,5,11,16 +1137,2011-02-20,1,0,2,2,0,0,0,1,0.24,0.197,0.41,0.5522,0,17,17 +1138,2011-02-20,1,0,2,3,0,0,0,1,0.22,0.1818,0.44,0.5522,9,9,18 +1139,2011-02-20,1,0,2,4,0,0,0,1,0.22,0.1818,0.44,0.5522,0,1,1 +1140,2011-02-20,1,0,2,6,0,0,0,1,0.2,0.1818,0.47,0.2985,1,1,2 +1141,2011-02-20,1,0,2,7,0,0,0,1,0.18,0.1667,0.51,0.2537,0,2,2 +1142,2011-02-20,1,0,2,8,0,0,0,1,0.2,0.2273,0.51,0.1045,2,22,24 +1143,2011-02-20,1,0,2,9,0,0,0,1,0.22,0.2121,0.47,0.2836,7,48,55 +1144,2011-02-20,1,0,2,10,0,0,0,2,0.26,0.2576,0.41,0.194,34,70,104 +1145,2011-02-20,1,0,2,11,0,0,0,2,0.3,0.303,0.33,0.1642,72,89,161 +1146,2011-02-20,1,0,2,12,0,0,0,1,0.3,0.3182,0.33,0.1045,62,120,182 
+1147,2011-02-20,1,0,2,13,0,0,0,1,0.34,0.3333,0.29,0,76,122,198 +1148,2011-02-20,1,0,2,14,0,0,0,1,0.36,0.3485,0.27,0.1642,108,104,212 +1149,2011-02-20,1,0,2,15,0,0,0,1,0.36,0.3636,0.27,0.0896,66,102,168 +1150,2011-02-20,1,0,2,16,0,0,0,1,0.36,0.3636,0.29,0.0896,59,88,147 +1151,2011-02-20,1,0,2,17,0,0,0,1,0.34,0.3485,0.33,0.1642,60,86,146 +1152,2011-02-20,1,0,2,18,0,0,0,2,0.34,0.3333,0.36,0.1343,30,71,101 +1153,2011-02-20,1,0,2,19,0,0,0,2,0.34,0.3333,0.36,0.1343,3,39,42 +1154,2011-02-20,1,0,2,20,0,0,0,2,0.34,0.3636,0.42,0,12,30,42 +1155,2011-02-20,1,0,2,21,0,0,0,2,0.32,0.3485,0.53,0,17,39,56 +1156,2011-02-20,1,0,2,22,0,0,0,2,0.32,0.303,0.57,0.2239,4,43,47 +1157,2011-02-20,1,0,2,23,0,0,0,2,0.3,0.303,0.61,0.1642,9,45,54 +1158,2011-02-21,1,0,2,0,1,1,0,2,0.34,0.303,0.42,0.3284,7,30,37 +1159,2011-02-21,1,0,2,1,1,1,0,2,0.34,0.303,0.42,0.3284,2,11,13 +1160,2011-02-21,1,0,2,2,1,1,0,2,0.34,0.303,0.42,0.3284,1,3,4 +1161,2011-02-21,1,0,2,3,1,1,0,2,0.34,0.303,0.42,0.2985,2,3,5 +1162,2011-02-21,1,0,2,4,1,1,0,1,0.32,0.3182,0.45,0.1642,1,0,1 +1163,2011-02-21,1,0,2,5,1,1,0,2,0.34,0.3636,0.36,0,1,2,3 +1164,2011-02-21,1,0,2,6,1,1,0,2,0.42,0.4242,0.26,0.2985,2,6,8 +1165,2011-02-21,1,0,2,7,1,1,0,2,0.42,0.4242,0.26,0.2836,3,16,19 +1166,2011-02-21,1,0,2,8,1,1,0,2,0.32,0.303,0.57,0.2985,7,56,63 +1167,2011-02-21,1,0,2,9,1,1,0,2,0.32,0.303,0.57,0.2836,11,46,57 +1168,2011-02-21,1,0,2,10,1,1,0,2,0.32,0.303,0.57,0.2537,29,52,81 +1169,2011-02-21,1,0,2,11,1,1,0,2,0.32,0.3333,0.57,0.1045,20,70,90 +1170,2011-02-21,1,0,2,12,1,1,0,2,0.32,0.3182,0.66,0.194,26,67,93 +1171,2011-02-21,1,0,2,13,1,1,0,3,0.3,0.2727,0.81,0.3284,28,75,103 +1172,2011-02-21,1,0,2,14,1,1,0,2,0.32,0.303,0.76,0.2537,15,101,116 +1173,2011-02-21,1,0,2,15,1,1,0,2,0.3,0.2727,0.7,0.4478,11,76,87 +1174,2011-02-21,1,0,2,16,1,1,0,3,0.28,0.2424,0.75,0.4179,8,48,56 +1175,2011-02-21,1,0,2,17,1,1,0,2,0.28,0.2576,0.75,0.3881,18,62,80 +1176,2011-02-21,1,0,2,18,1,1,0,2,0.24,0.2121,0.87,0.3582,2,64,66 
+1177,2011-02-21,1,0,2,19,1,1,0,2,0.24,0.2121,0.87,0.3582,0,49,49 +1178,2011-02-21,1,0,2,20,1,1,0,3,0.24,0.2121,0.81,0.3881,0,29,29 +1179,2011-02-21,1,0,2,21,1,1,0,3,0.22,0.197,0.75,0.4478,1,33,34 +1180,2011-02-21,1,0,2,22,1,1,0,3,0.2,0.1667,0.75,0.4179,0,11,11 +1181,2011-02-21,1,0,2,23,1,1,0,3,0.2,0.1667,0.75,0.4179,0,2,2 +1182,2011-02-22,1,0,2,6,0,2,1,2,0.12,0.1212,0.8,0.2836,0,7,7 +1183,2011-02-22,1,0,2,7,0,2,1,2,0.12,0.1212,0.8,0.2836,0,40,40 +1184,2011-02-22,1,0,2,8,0,2,1,2,0.12,0.1212,0.8,0.2537,7,107,114 +1185,2011-02-22,1,0,2,9,0,2,1,2,0.14,0.1212,0.74,0.2537,5,101,106 +1186,2011-02-22,1,0,2,10,0,2,1,1,0.16,0.1818,0.69,0,0,44,44 +1187,2011-02-22,1,0,2,11,0,2,1,1,0.16,0.2273,0.64,0,7,43,50 +1188,2011-02-22,1,0,2,12,0,2,1,1,0.2,0.2273,0.59,0.1045,7,48,55 +1189,2011-02-22,1,0,2,13,0,2,1,1,0.22,0.2273,0.55,0.1642,3,52,55 +1190,2011-02-22,1,0,2,14,0,2,1,1,0.22,0.2273,0.55,0.194,9,49,58 +1191,2011-02-22,1,0,2,15,0,2,1,1,0.24,0.2273,0.48,0.194,2,67,69 +1192,2011-02-22,1,0,2,16,0,2,1,1,0.22,0.2273,0.51,0.1642,7,79,86 +1193,2011-02-22,1,0,2,17,0,2,1,1,0.22,0.2121,0.51,0.2239,8,188,196 +1194,2011-02-22,1,0,2,18,0,2,1,1,0.22,0.2273,0.47,0.1343,6,161,167 +1195,2011-02-22,1,0,2,19,0,2,1,1,0.2,0.1818,0.47,0.2836,4,114,118 +1196,2011-02-22,1,0,2,20,0,2,1,1,0.2,0.197,0.47,0.2239,3,102,105 +1197,2011-02-22,1,0,2,21,0,2,1,1,0.2,0.197,0.47,0.2537,2,80,82 +1198,2011-02-22,1,0,2,22,0,2,1,1,0.16,0.1515,0.43,0.2537,1,76,77 +1199,2011-02-22,1,0,2,23,0,2,1,1,0.16,0.1515,0.43,0.2537,3,18,21 +1200,2011-02-23,1,0,2,0,0,3,1,1,0.14,0.1364,0.5,0.194,0,6,6 +1201,2011-02-23,1,0,2,1,0,3,1,1,0.14,0.1515,0.46,0.1642,0,4,4 +1202,2011-02-23,1,0,2,2,0,3,1,1,0.12,0.1515,0.5,0.1343,0,1,1 +1203,2011-02-23,1,0,2,3,0,3,1,1,0.12,0.1364,0.5,0.1642,0,2,2 +1204,2011-02-23,1,0,2,5,0,3,1,1,0.12,0.1515,0.5,0.1045,0,8,8 +1205,2011-02-23,1,0,2,6,0,3,1,1,0.12,0.1515,0.5,0.1045,0,36,36 +1206,2011-02-23,1,0,2,7,0,3,1,1,0.12,0.1515,0.54,0.1343,2,94,96 
+1207,2011-02-23,1,0,2,8,0,3,1,1,0.14,0.1515,0.54,0.1642,8,227,235 +1208,2011-02-23,1,0,2,9,0,3,1,1,0.18,0.2121,0.51,0.0896,9,130,139 +1209,2011-02-23,1,0,2,10,0,3,1,1,0.2,0.2576,0.4,0,4,47,51 +1210,2011-02-23,1,0,2,11,0,3,1,1,0.24,0.2576,0.41,0.1045,16,53,69 +1211,2011-02-23,1,0,2,12,0,3,1,1,0.26,0.2879,0.35,0.0896,11,56,67 +1212,2011-02-23,1,0,2,13,0,3,1,1,0.3,0.3182,0.28,0.0896,9,78,87 +1213,2011-02-23,1,0,2,14,0,3,1,1,0.32,0.3333,0.29,0.1045,17,61,78 +1214,2011-02-23,1,0,2,15,0,3,1,1,0.34,0.3485,0.25,0.0896,18,54,72 +1215,2011-02-23,1,0,2,16,0,3,1,1,0.34,0.3636,0.25,0,15,79,94 +1216,2011-02-23,1,0,2,17,0,3,1,1,0.34,0.3485,0.25,0.0896,15,207,222 +1217,2011-02-23,1,0,2,18,0,3,1,1,0.32,0.3333,0.26,0.0896,3,206,209 +1218,2011-02-23,1,0,2,19,0,3,1,1,0.3,0.3333,0.33,0,6,135,141 +1219,2011-02-23,1,0,2,20,0,3,1,1,0.24,0.2424,0.6,0.1642,2,107,109 +1220,2011-02-23,1,0,2,21,0,3,1,1,0.24,0.2879,0.48,0,2,89,91 +1221,2011-02-23,1,0,2,22,0,3,1,1,0.24,0.2879,0.48,0,1,60,61 +1222,2011-02-23,1,0,2,23,0,3,1,1,0.22,0.2576,0.55,0.0896,1,38,39 +1223,2011-02-24,1,0,2,0,0,4,1,1,0.22,0.2576,0.55,0.0896,0,11,11 +1224,2011-02-24,1,0,2,1,0,4,1,1,0.22,0.2576,0.6,0.0896,0,7,7 +1225,2011-02-24,1,0,2,2,0,4,1,1,0.2,0.2273,0.64,0.0896,0,4,4 +1226,2011-02-24,1,0,2,3,0,4,1,1,0.2,0.2121,0.64,0.1343,0,2,2 +1227,2011-02-24,1,0,2,5,0,4,1,1,0.2,0.2121,0.69,0.1343,1,3,4 +1228,2011-02-24,1,0,2,6,0,4,1,1,0.2,0.197,0.69,0.194,0,58,58 +1229,2011-02-24,1,0,2,7,0,4,1,1,0.2,0.197,0.72,0.194,0,104,104 +1230,2011-02-24,1,0,2,8,0,4,1,1,0.24,0.2273,0.7,0.2239,8,244,252 +1231,2011-02-24,1,0,2,9,0,4,1,2,0.24,0.2121,0.75,0.2985,4,133,137 +1232,2011-02-24,1,0,2,10,0,4,1,2,0.26,0.2424,0.7,0.2836,8,49,57 +1233,2011-02-24,1,0,2,11,0,4,1,2,0.32,0.2879,0.57,0.3881,8,71,79 +1234,2011-02-24,1,0,2,12,0,4,1,2,0.36,0.3333,0.53,0.3582,10,85,95 +1235,2011-02-24,1,0,2,13,0,4,1,2,0.38,0.3939,0.54,0.2985,16,72,88 +1236,2011-02-24,1,0,2,14,0,4,1,2,0.4,0.4091,0.5,0.2985,2,67,69 
+1237,2011-02-24,1,0,2,15,0,4,1,3,0.4,0.4091,0.54,0.3881,5,58,63 +1238,2011-02-24,1,0,2,16,0,4,1,3,0.38,0.3939,0.62,0.2836,4,67,71 +1239,2011-02-24,1,0,2,17,0,4,1,3,0.36,0.3485,0.66,0.194,9,168,177 +1240,2011-02-24,1,0,2,18,0,4,1,2,0.34,0.303,0.87,0.3284,5,132,137 +1241,2011-02-24,1,0,2,19,0,4,1,3,0.34,0.303,0.93,0.2985,3,112,115 +1242,2011-02-24,1,0,2,20,0,4,1,2,0.34,0.303,0.93,0.3582,2,87,89 +1243,2011-02-24,1,0,2,21,0,4,1,2,0.34,0.303,0.87,0.3284,7,76,83 +1244,2011-02-24,1,0,2,22,0,4,1,3,0.34,0.3182,0.87,0.2836,5,50,55 +1245,2011-02-24,1,0,2,23,0,4,1,2,0.32,0.303,0.93,0.2239,3,47,50 +1246,2011-02-25,1,0,2,0,0,5,1,3,0.32,0.3485,0.93,0,1,8,9 +1247,2011-02-25,1,0,2,1,0,5,1,2,0.32,0.3485,1,0,1,9,10 +1248,2011-02-25,1,0,2,2,0,5,1,2,0.32,0.3485,1,0,0,3,3 +1249,2011-02-25,1,0,2,3,0,5,1,2,0.32,0.3333,0.93,0.1045,1,1,2 +1250,2011-02-25,1,0,2,5,0,5,1,2,0.32,0.3333,0.93,0.1045,1,5,6 +1251,2011-02-25,1,0,2,6,0,5,1,3,0.34,0.3485,0.93,0.0896,0,11,11 +1252,2011-02-25,1,0,2,7,0,5,1,3,0.34,0.3333,1,0.1343,1,34,35 +1253,2011-02-25,1,0,2,8,0,5,1,3,0.36,0.3485,0.93,0.1343,3,70,73 +1254,2011-02-25,1,0,2,9,0,5,1,3,0.34,0.303,0.93,0.3582,3,111,114 +1255,2011-02-25,1,0,2,10,0,5,1,3,0.42,0.4242,0.94,0.3284,7,42,49 +1256,2011-02-25,1,0,2,11,0,5,1,1,0.52,0.5,0.77,0.4478,9,50,59 +1257,2011-02-25,1,0,2,12,0,5,1,3,0.54,0.5152,0.6,0.4627,20,95,115 +1258,2011-02-25,1,0,2,13,0,5,1,3,0.54,0.5152,0.6,0.4627,6,77,83 +1259,2011-02-25,1,0,2,14,0,5,1,3,0.56,0.5303,0.56,0.6119,14,71,85 +1260,2011-02-25,1,0,2,15,0,5,1,1,0.46,0.4545,0.41,0.806,5,50,55 +1261,2011-02-25,1,0,2,16,0,5,1,1,0.32,0.2879,0.49,0.4627,11,91,102 +1262,2011-02-25,1,0,2,17,0,5,1,1,0.32,0.2727,0.49,0.7463,8,181,189 +1263,2011-02-25,1,0,2,18,0,5,1,1,0.32,0.2879,0.49,0.4925,7,150,157 +1264,2011-02-25,1,0,2,19,0,5,1,1,0.3,0.2727,0.52,0.4478,4,86,90 +1265,2011-02-25,1,0,2,20,0,5,1,1,0.3,0.2576,0.49,0.6119,2,60,62 +1266,2011-02-25,1,0,2,21,0,5,1,1,0.28,0.2576,0.48,0.3881,7,56,63 
+1267,2011-02-25,1,0,2,22,0,5,1,1,0.26,0.2121,0.48,0.4478,7,43,50 +1268,2011-02-25,1,0,2,23,0,5,1,1,0.26,0.2273,0.48,0.3284,2,37,39 +1269,2011-02-26,1,0,2,0,0,6,0,1,0.24,0.2273,0.52,0.194,3,25,28 +1270,2011-02-26,1,0,2,1,0,6,0,1,0.24,0.2121,0.52,0.2985,2,25,27 +1271,2011-02-26,1,0,2,2,0,6,0,1,0.22,0.197,0.6,0.3582,3,9,12 +1272,2011-02-26,1,0,2,3,0,6,0,1,0.22,0.2273,0.55,0.1343,1,7,8 +1273,2011-02-26,1,0,2,4,0,6,0,2,0.22,0.2576,0.6,0.0896,1,1,2 +1274,2011-02-26,1,0,2,5,0,6,0,2,0.22,0.2424,0.64,0.1045,1,9,10 +1275,2011-02-26,1,0,2,6,0,6,0,1,0.22,0.2727,0.6,0,1,6,7 +1276,2011-02-26,1,0,2,7,0,6,0,1,0.22,0.2576,0.6,0.0896,1,21,22 +1277,2011-02-26,1,0,2,8,0,6,0,2,0.24,0.2879,0.6,0,2,55,57 +1278,2011-02-26,1,0,2,9,0,6,0,2,0.26,0.2576,0.56,0.1642,9,65,74 +1279,2011-02-26,1,0,2,10,0,6,0,2,0.3,0.303,0.49,0.1343,14,71,85 +1280,2011-02-26,1,0,2,11,0,6,0,1,0.3,0.3333,0.45,0,26,100,126 +1281,2011-02-26,1,0,2,12,0,6,0,2,0.32,0.3182,0.45,0.1642,45,115,160 +1282,2011-02-26,1,0,2,13,0,6,0,2,0.34,0.3182,0.42,0.2537,38,136,174 +1283,2011-02-26,1,0,2,14,0,6,0,2,0.34,0.303,0.36,0.3284,57,154,211 +1284,2011-02-26,1,0,2,15,0,6,0,2,0.36,0.3333,0.4,0.2836,40,125,165 +1285,2011-02-26,1,0,2,16,0,6,0,1,0.36,0.3333,0.46,0.2836,41,130,171 +1286,2011-02-26,1,0,2,17,0,6,0,1,0.36,0.3485,0.43,0.2239,53,130,183 +1287,2011-02-26,1,0,2,18,0,6,0,1,0.34,0.3333,0.46,0.194,26,111,137 +1288,2011-02-26,1,0,2,19,0,6,0,1,0.32,0.303,0.49,0.2537,30,64,94 +1289,2011-02-26,1,0,2,20,0,6,0,1,0.3,0.303,0.56,0.1642,8,60,68 +1290,2011-02-26,1,0,2,21,0,6,0,1,0.28,0.2727,0.65,0.2537,9,59,68 +1291,2011-02-26,1,0,2,22,0,6,0,1,0.28,0.2727,0.75,0.2239,8,38,46 +1292,2011-02-26,1,0,2,23,0,6,0,1,0.28,0.2576,0.75,0.2836,5,29,34 +1293,2011-02-27,1,0,2,0,0,0,0,1,0.26,0.2424,0.87,0.2836,8,26,34 +1294,2011-02-27,1,0,2,1,0,0,0,1,0.26,0.2576,0.87,0.194,7,30,37 +1295,2011-02-27,1,0,2,2,0,0,0,1,0.26,0.2727,0.87,0.1343,2,20,22 +1296,2011-02-27,1,0,2,3,0,0,0,1,0.26,0.2879,0.87,0.0896,3,8,11 
+1297,2011-02-27,1,0,2,4,0,0,0,1,0.24,0.2424,0.87,0.1343,0,2,2 +1298,2011-02-27,1,0,2,6,0,0,0,1,0.24,0.2879,0.87,0,2,1,3 +1299,2011-02-27,1,0,2,7,0,0,0,1,0.24,0.2424,0.87,0.1343,6,8,14 +1300,2011-02-27,1,0,2,8,0,0,0,1,0.26,0.303,0.87,0,9,26,35 +1301,2011-02-27,1,0,2,9,0,0,0,1,0.28,0.303,0.87,0.0896,17,42,59 +1302,2011-02-27,1,0,2,10,0,0,0,1,0.3,0.303,0.81,0.1642,24,79,103 +1303,2011-02-27,1,0,2,11,0,0,0,1,0.36,0.3485,0.62,0.1343,33,92,125 +1304,2011-02-27,1,0,2,12,0,0,0,1,0.4,0.4091,0.54,0.0896,61,132,193 +1305,2011-02-27,1,0,2,13,0,0,0,1,0.42,0.4242,0.47,0,90,169,259 +1306,2011-02-27,1,0,2,14,0,0,0,1,0.44,0.4394,0.47,0.0896,105,177,282 +1307,2011-02-27,1,0,2,15,0,0,0,1,0.46,0.4545,0.44,0.1343,98,163,261 +1308,2011-02-27,1,0,2,16,0,0,0,1,0.48,0.4697,0.44,0.1642,98,170,268 +1309,2011-02-27,1,0,2,17,0,0,0,1,0.42,0.4242,0.54,0.194,66,121,187 +1310,2011-02-27,1,0,2,18,0,0,0,1,0.4,0.4091,0.54,0.194,24,103,127 +1311,2011-02-27,1,0,2,19,0,0,0,1,0.4,0.4091,0.54,0.1343,16,86,102 +1312,2011-02-27,1,0,2,20,0,0,0,1,0.4,0.4091,0.54,0.0896,9,72,81 +1313,2011-02-27,1,0,2,21,0,0,0,1,0.38,0.3939,0.62,0.1642,8,61,69 +1314,2011-02-27,1,0,2,22,0,0,0,2,0.38,0.3939,0.62,0.1045,2,67,69 +1315,2011-02-27,1,0,2,23,0,0,0,2,0.36,0.3485,0.62,0.1642,6,53,59 +1316,2011-02-28,1,0,2,0,0,1,1,2,0.36,0.3636,0.66,0.1045,5,25,30 +1317,2011-02-28,1,0,2,1,0,1,1,3,0.34,0.303,0.87,0.3582,1,7,8 +1318,2011-02-28,1,0,2,3,0,1,1,3,0.32,0.3182,0.93,0.1642,0,1,1 +1319,2011-02-28,1,0,2,5,0,1,1,2,0.34,0.3636,0.93,0,1,4,5 +1320,2011-02-28,1,0,2,6,0,1,1,2,0.34,0.3485,0.96,0.1045,1,27,28 +1321,2011-02-28,1,0,2,7,0,1,1,2,0.36,0.3636,0.93,0.1045,2,90,92 +1322,2011-02-28,1,0,2,8,0,1,1,2,0.34,0.303,0.93,0.2985,13,242,255 +1323,2011-02-28,1,0,2,9,0,1,1,1,0.42,0.4242,0.82,0.2836,15,127,142 +1324,2011-02-28,1,0,2,10,0,1,1,2,0.52,0.5,0.72,0.4925,13,79,92 +1325,2011-02-28,1,0,2,11,0,1,1,2,0.56,0.5303,0.64,0.2985,13,74,87 +1326,2011-02-28,1,0,2,12,0,1,1,2,0.56,0.5303,0.64,0.2985,0,36,36 
+1327,2011-02-28,1,0,2,13,0,1,1,3,0.46,0.4545,0.94,0.2239,1,31,32 +1328,2011-02-28,1,0,2,14,0,1,1,3,0.42,0.4242,1,0.2985,1,24,25 +1329,2011-02-28,1,0,2,15,0,1,1,3,0.42,0.4242,1,0.2985,0,35,35 +1330,2011-02-28,1,0,2,16,0,1,1,3,0.42,0.4242,1,0.1343,2,40,42 +1331,2011-02-28,1,0,2,17,0,1,1,3,0.4,0.4091,1,0.2985,2,77,79 +1332,2011-02-28,1,0,2,18,0,1,1,3,0.46,0.4545,0.94,0.194,4,127,131 +1333,2011-02-28,1,0,2,19,0,1,1,3,0.44,0.4394,0.88,0.6119,1,79,80 +1334,2011-02-28,1,0,2,20,0,1,1,3,0.44,0.4394,0.88,0.6119,0,45,45 +1335,2011-02-28,1,0,2,21,0,1,1,2,0.38,0.3939,0.87,0.3881,2,78,80 +1336,2011-02-28,1,0,2,22,0,1,1,3,0.34,0.303,0.93,0.4179,4,72,76 +1337,2011-02-28,1,0,2,23,0,1,1,2,0.32,0.2879,0.81,0.3881,0,45,45 +1338,2011-03-01,1,0,3,0,0,2,1,1,0.3,0.2727,0.7,0.4627,0,7,7 +1339,2011-03-01,1,0,3,1,0,2,1,1,0.26,0.2273,0.7,0.3582,0,3,3 +1340,2011-03-01,1,0,3,2,0,2,1,1,0.24,0.2121,0.65,0.3881,0,4,4 +1341,2011-03-01,1,0,3,3,0,2,1,1,0.22,0.2121,0.69,0.2836,0,2,2 +1342,2011-03-01,1,0,3,4,0,2,1,1,0.22,0.2121,0.69,0.2537,0,1,1 +1343,2011-03-01,1,0,3,5,0,2,1,1,0.2,0.1818,0.64,0.2836,1,1,2 +1344,2011-03-01,1,0,3,6,0,2,1,1,0.2,0.1818,0.59,0.2985,0,46,46 +1345,2011-03-01,1,0,3,7,0,2,1,1,0.2,0.1818,0.59,0.3284,2,105,107 +1346,2011-03-01,1,0,3,8,0,2,1,1,0.2,0.1818,0.59,0.3881,10,204,214 +1347,2011-03-01,1,0,3,9,0,2,1,1,0.22,0.197,0.55,0.4179,8,116,124 +1348,2011-03-01,1,0,3,10,0,2,1,1,0.24,0.2121,0.52,0.2836,13,55,68 +1349,2011-03-01,1,0,3,11,0,2,1,1,0.28,0.2727,0.45,0.2537,8,48,56 +1350,2011-03-01,1,0,3,12,0,2,1,1,0.3,0.303,0.39,0.1343,6,80,86 +1351,2011-03-01,1,0,3,13,0,2,1,1,0.32,0.3485,0.39,0,13,65,78 +1352,2011-03-01,1,0,3,14,0,2,1,1,0.32,0.3333,0.36,0.1343,18,61,79 +1353,2011-03-01,1,0,3,15,0,2,1,1,0.34,0.3636,0.34,0,7,57,64 +1354,2011-03-01,1,0,3,16,0,2,1,1,0.34,0.3485,0.34,0.0896,10,92,102 +1355,2011-03-01,1,0,3,17,0,2,1,1,0.34,0.3485,0.34,0.0896,12,230,242 +1356,2011-03-01,1,0,3,18,0,2,1,1,0.32,0.3182,0.39,0.194,10,214,224 
+1357,2011-03-01,1,0,3,19,0,2,1,1,0.3,0.303,0.49,0.1343,4,115,119 +1358,2011-03-01,1,0,3,20,0,2,1,1,0.3,0.3182,0.61,0.0896,2,86,88 +1359,2011-03-01,1,0,3,21,0,2,1,1,0.26,0.303,0.56,0,8,55,63 +1360,2011-03-01,1,0,3,22,0,2,1,1,0.24,0.2727,0.62,0.1045,3,44,47 +1361,2011-03-01,1,0,3,23,0,2,1,1,0.24,0.2273,0.65,0.2239,2,23,25 +1362,2011-03-02,1,0,3,0,0,3,1,1,0.22,0.2273,0.69,0.1642,3,5,8 +1363,2011-03-02,1,0,3,1,0,3,1,1,0.22,0.2273,0.69,0.194,0,4,4 +1364,2011-03-02,1,0,3,2,0,3,1,1,0.22,0.2273,0.69,0.194,0,2,2 +1365,2011-03-02,1,0,3,3,0,3,1,1,0.22,0.2121,0.69,0.2836,3,1,4 +1366,2011-03-02,1,0,3,4,0,3,1,1,0.2,0.2121,0.75,0.1343,1,0,1 +1367,2011-03-02,1,0,3,5,0,3,1,1,0.22,0.2121,0.69,0.2239,0,5,5 +1368,2011-03-02,1,0,3,6,0,3,1,1,0.22,0.2121,0.55,0.2537,1,39,40 +1369,2011-03-02,1,0,3,7,0,3,1,1,0.22,0.2121,0.64,0.2537,2,108,110 +1370,2011-03-02,1,0,3,8,0,3,1,1,0.24,0.2121,0.65,0.2836,13,243,256 +1371,2011-03-02,1,0,3,9,0,3,1,1,0.28,0.2576,0.56,0.2985,7,141,148 +1372,2011-03-02,1,0,3,10,0,3,1,1,0.32,0.303,0.49,0.2836,11,65,76 +1373,2011-03-02,1,0,3,11,0,3,1,1,0.34,0.303,0.53,0.3284,8,65,73 +1374,2011-03-02,1,0,3,12,0,3,1,1,0.4,0.4091,0.43,0.194,20,62,82 +1375,2011-03-02,1,0,3,13,0,3,1,1,0.5,0.4848,0.25,0,35,90,125 +1376,2011-03-02,1,0,3,14,0,3,1,1,0.52,0.5,0.23,0.2985,21,75,96 +1377,2011-03-02,1,0,3,15,0,3,1,1,0.54,0.5152,0.19,0.4179,19,91,110 +1378,2011-03-02,1,0,3,16,0,3,1,1,0.54,0.5152,0.19,0.3284,27,112,139 +1379,2011-03-02,1,0,3,17,0,3,1,1,0.5,0.4848,0.23,0.2836,30,238,268 +1380,2011-03-02,1,0,3,18,0,3,1,1,0.46,0.4545,0.23,0.4925,8,193,201 +1381,2011-03-02,1,0,3,19,0,3,1,1,0.4,0.4091,0.28,0.5224,6,144,150 +1382,2011-03-02,1,0,3,20,0,3,1,1,0.36,0.3333,0.29,0.4179,9,86,95 +1383,2011-03-02,1,0,3,21,0,3,1,1,0.34,0.2879,0.29,0.4627,3,68,71 +1384,2011-03-02,1,0,3,22,0,3,1,1,0.3,0.2576,0.26,0.5522,4,44,48 +1385,2011-03-02,1,0,3,23,0,3,1,1,0.26,0.2121,0.3,0.5224,0,22,22 +1386,2011-03-03,1,0,3,0,0,4,1,1,0.24,0.197,0.3,0.4627,3,10,13 
+1387,2011-03-03,1,0,3,1,0,4,1,1,0.24,0.197,0.3,0.4627,0,1,1 +1388,2011-03-03,1,0,3,2,0,4,1,1,0.2,0.1667,0.27,0.4627,1,2,3 +1389,2011-03-03,1,0,3,3,0,4,1,1,0.2,0.1667,0.27,0.4627,0,1,1 +1390,2011-03-03,1,0,3,4,0,4,1,1,0.16,0.1212,0.31,0.4925,0,1,1 +1391,2011-03-03,1,0,3,5,0,4,1,1,0.14,0.1212,0.33,0.2985,1,7,8 +1392,2011-03-03,1,0,3,6,0,4,1,1,0.14,0.1212,0.33,0.2985,1,34,35 +1393,2011-03-03,1,0,3,7,0,4,1,1,0.12,0.0909,0.42,0.4179,1,110,111 +1394,2011-03-03,1,0,3,8,0,4,1,1,0.14,0.1212,0.39,0.2985,4,216,220 +1395,2011-03-03,1,0,3,9,0,4,1,1,0.16,0.1364,0.37,0.2836,10,135,145 +1396,2011-03-03,1,0,3,10,0,4,1,1,0.18,0.1818,0.29,0.194,7,49,56 +1397,2011-03-03,1,0,3,11,0,4,1,1,0.2,0.2273,0.29,0.0896,10,40,50 +1398,2011-03-03,1,0,3,12,0,4,1,1,0.22,0.2273,0.25,0.1343,7,65,72 +1399,2011-03-03,1,0,3,13,0,4,1,1,0.22,0.2576,0.25,0.0896,13,67,80 +1400,2011-03-03,1,0,3,14,0,4,1,1,0.24,0.2576,0.21,0.1045,18,60,78 +1401,2011-03-03,1,0,3,15,0,4,1,1,0.24,0.2879,0.23,0,6,62,68 +1402,2011-03-03,1,0,3,16,0,4,1,1,0.26,0.303,0.23,0,6,65,71 +1403,2011-03-03,1,0,3,17,0,4,1,1,0.26,0.303,0.22,0,17,185,202 +1404,2011-03-03,1,0,3,18,0,4,1,1,0.24,0.2121,0.35,0.3284,6,161,167 +1405,2011-03-03,1,0,3,19,0,4,1,1,0.2,0.197,0.4,0.2537,5,101,106 +1406,2011-03-03,1,0,3,20,0,4,1,1,0.2,0.2273,0.4,0.0896,1,69,70 +1407,2011-03-03,1,0,3,21,0,4,1,1,0.18,0.2121,0.4,0.1045,3,48,51 +1408,2011-03-03,1,0,3,22,0,4,1,1,0.2,0.2576,0.4,0,3,50,53 +1409,2011-03-03,1,0,3,23,0,4,1,2,0.18,0.2121,0.43,0.0896,0,23,23 +1410,2011-03-04,1,0,3,0,0,5,1,2,0.2,0.197,0.55,0.194,0,12,12 +1411,2011-03-04,1,0,3,1,0,5,1,2,0.18,0.1818,0.64,0.194,0,4,4 +1412,2011-03-04,1,0,3,2,0,5,1,2,0.18,0.1818,0.64,0.194,0,2,2 +1413,2011-03-04,1,0,3,3,0,5,1,2,0.18,0.1667,0.74,0.2537,0,1,1 +1414,2011-03-04,1,0,3,4,0,5,1,2,0.18,0.1818,0.74,0.194,1,0,1 +1415,2011-03-04,1,0,3,5,0,5,1,2,0.16,0.1818,0.74,0.1343,0,7,7 +1416,2011-03-04,1,0,3,6,0,5,1,2,0.16,0.197,0.74,0.0896,1,28,29 +1417,2011-03-04,1,0,3,7,0,5,1,1,0.16,0.1818,0.8,0.1343,0,83,83 
+1418,2011-03-04,1,0,3,8,0,5,1,1,0.18,0.197,0.74,0.1343,6,222,228 +1419,2011-03-04,1,0,3,9,0,5,1,1,0.22,0.2273,0.6,0.1343,12,138,150 +1420,2011-03-04,1,0,3,10,0,5,1,1,0.24,0.2273,0.6,0.2537,10,56,66 +1421,2011-03-04,1,0,3,11,0,5,1,1,0.28,0.2727,0.52,0.2239,16,73,89 +1422,2011-03-04,1,0,3,12,0,5,1,1,0.32,0.303,0.43,0.1642,10,87,97 +1423,2011-03-04,1,0,3,13,0,5,1,1,0.34,0.3182,0.42,0.2239,13,74,87 +1424,2011-03-04,1,0,3,14,0,5,1,1,0.36,0.3333,0.4,0.2836,30,65,95 +1425,2011-03-04,1,0,3,15,0,5,1,1,0.36,0.3333,0.4,0.2985,31,75,106 +1426,2011-03-04,1,0,3,16,0,5,1,1,0.36,0.3485,0.46,0.194,17,101,118 +1427,2011-03-04,1,0,3,17,0,5,1,2,0.36,0.3333,0.5,0.2537,22,206,228 +1428,2011-03-04,1,0,3,18,0,5,1,2,0.34,0.303,0.53,0.2985,15,172,187 +1429,2011-03-04,1,0,3,19,0,5,1,1,0.32,0.303,0.61,0.2537,5,102,107 +1430,2011-03-04,1,0,3,20,0,5,1,2,0.3,0.2879,0.7,0.194,9,78,87 +1431,2011-03-04,1,0,3,21,0,5,1,2,0.3,0.2879,0.7,0.2239,6,64,70 +1432,2011-03-04,1,0,3,22,0,5,1,1,0.3,0.2879,0.7,0.194,4,40,44 +1433,2011-03-04,1,0,3,23,0,5,1,2,0.3,0.303,0.75,0.1642,6,40,46 +1434,2011-03-05,1,0,3,0,0,6,0,2,0.28,0.2879,0.81,0.1045,4,15,19 +1435,2011-03-05,1,0,3,1,0,6,0,2,0.3,0.3182,0.81,0.1045,5,20,25 +1436,2011-03-05,1,0,3,2,0,6,0,2,0.3,0.2879,0.87,0.194,5,15,20 +1437,2011-03-05,1,0,3,3,0,6,0,2,0.3,0.2879,0.87,0.194,0,2,2 +1438,2011-03-05,1,0,3,4,0,6,0,2,0.3,0.303,0.93,0.1642,0,1,1 +1439,2011-03-05,1,0,3,5,0,6,0,2,0.3,0.303,1,0.1343,0,3,3 +1440,2011-03-05,1,0,3,6,0,6,0,2,0.3,0.2879,1,0.2239,1,3,4 +1441,2011-03-05,1,0,3,7,0,6,0,2,0.3,0.2727,1,0.2985,5,10,15 +1442,2011-03-05,1,0,3,8,0,6,0,2,0.3,0.2727,1,0.2985,11,34,45 +1443,2011-03-05,1,0,3,9,0,6,0,2,0.32,0.303,0.93,0.2239,15,48,63 +1444,2011-03-05,1,0,3,10,0,6,0,2,0.34,0.3333,0.93,0.194,34,69,103 +1445,2011-03-05,1,0,3,11,0,6,0,2,0.4,0.4091,0.76,0.2239,43,116,159 +1446,2011-03-05,1,0,3,12,0,6,0,2,0.44,0.4394,0.67,0.2537,46,121,167 +1447,2011-03-05,1,0,3,13,0,6,0,2,0.46,0.4545,0.67,0.2836,60,130,190 
+1448,2011-03-05,1,0,3,14,0,6,0,2,0.48,0.4697,0.59,0.2836,80,118,198 +1449,2011-03-05,1,0,3,15,0,6,0,2,0.46,0.4545,0.63,0.2985,83,122,205 +1450,2011-03-05,1,0,3,16,0,6,0,2,0.48,0.4697,0.59,0.2985,74,130,204 +1451,2011-03-05,1,0,3,17,0,6,0,2,0.48,0.4697,0.59,0.3582,60,104,164 +1452,2011-03-05,1,0,3,18,0,6,0,2,0.48,0.4697,0.59,0.3284,37,115,152 +1453,2011-03-05,1,0,3,19,0,6,0,2,0.46,0.4545,0.67,0.3284,27,62,89 +1454,2011-03-05,1,0,3,20,0,6,0,2,0.44,0.4394,0.77,0.3284,12,58,70 +1455,2011-03-05,1,0,3,21,0,6,0,2,0.44,0.4394,0.72,0.3284,12,60,72 +1456,2011-03-05,1,0,3,22,0,6,0,2,0.42,0.4242,0.77,0.2985,16,47,63 +1457,2011-03-05,1,0,3,23,0,6,0,2,0.44,0.4394,0.77,0.2985,10,34,44 +1458,2011-03-06,1,0,3,0,0,0,0,2,0.42,0.4242,0.77,0.3582,11,41,52 +1459,2011-03-06,1,0,3,1,0,0,0,2,0.42,0.4242,0.77,0.2836,12,27,39 +1460,2011-03-06,1,0,3,2,0,0,0,2,0.4,0.4091,0.82,0.2836,5,27,32 +1461,2011-03-06,1,0,3,3,0,0,0,2,0.42,0.4242,0.82,0.2985,2,9,11 +1462,2011-03-06,1,0,3,4,0,0,0,2,0.42,0.4242,0.88,0.3582,0,3,3 +1463,2011-03-06,1,0,3,6,0,0,0,2,0.42,0.4242,0.94,0.3582,1,1,2 +1464,2011-03-06,1,0,3,7,0,0,0,3,0.42,0.4242,1,0.4478,0,5,5 +1465,2011-03-06,1,0,3,8,0,0,0,2,0.4,0.4091,1,0.2985,1,8,9 +1466,2011-03-06,1,0,3,9,0,0,0,2,0.42,0.4242,1,0.2836,4,18,22 +1467,2011-03-06,1,0,3,10,0,0,0,2,0.42,0.4242,1,0.2985,4,27,31 +1468,2011-03-06,1,0,3,11,0,0,0,2,0.42,0.4242,1,0.2239,18,44,62 +1469,2011-03-06,1,0,3,12,0,0,0,2,0.46,0.4545,0.94,0.3284,10,69,79 +1470,2011-03-06,1,0,3,13,0,0,0,2,0.46,0.4545,0.94,0.3582,22,83,105 +1471,2011-03-06,1,0,3,14,0,0,0,3,0.44,0.4394,1,0.2239,12,27,39 +1472,2011-03-06,1,0,3,15,0,0,0,3,0.44,0.4394,1,0.2239,3,4,7 +1473,2011-03-06,1,0,3,16,0,0,0,3,0.36,0.3333,1,0.2836,3,8,11 +1474,2011-03-06,1,0,3,17,0,0,0,3,0.34,0.303,1,0.2985,2,23,25 +1475,2011-03-06,1,0,3,18,0,0,0,3,0.32,0.2879,1,0.3582,0,23,23 +1476,2011-03-06,1,0,3,19,0,0,0,3,0.3,0.2576,1,0.4925,0,11,11 +1477,2011-03-06,1,0,3,20,0,0,0,3,0.28,0.2424,1,0.4179,3,8,11 
+1478,2011-03-06,1,0,3,21,0,0,0,3,0.24,0.1818,0.93,0.6119,1,6,7 +1479,2011-03-06,1,0,3,22,0,0,0,2,0.22,0.197,1,0.3881,0,10,10 +1480,2011-03-06,1,0,3,23,0,0,0,2,0.22,0.197,1,0.4179,0,9,9 +1481,2011-03-07,1,0,3,0,0,1,1,3,0.2,0.1818,1,0.3284,0,4,4 +1482,2011-03-07,1,0,3,1,0,1,1,3,0.2,0.1818,1,0.3284,0,2,2 +1483,2011-03-07,1,0,3,3,0,1,1,1,0.2,0.1515,0.8,0.5821,0,1,1 +1484,2011-03-07,1,0,3,4,0,1,1,1,0.2,0.1515,0.8,0.5224,1,0,1 +1485,2011-03-07,1,0,3,5,0,1,1,1,0.2,0.1818,0.75,0.3582,0,1,1 +1486,2011-03-07,1,0,3,6,0,1,1,1,0.2,0.1818,0.75,0.3881,3,31,34 +1487,2011-03-07,1,0,3,7,0,1,1,1,0.2,0.1667,0.75,0.4179,3,88,91 +1488,2011-03-07,1,0,3,8,0,1,1,1,0.2,0.1818,0.75,0.3881,11,200,211 +1489,2011-03-07,1,0,3,9,0,1,1,1,0.22,0.1818,0.64,0.4627,5,129,134 +1490,2011-03-07,1,0,3,10,0,1,1,1,0.24,0.2121,0.6,0.2985,15,43,58 +1491,2011-03-07,1,0,3,11,0,1,1,1,0.26,0.2121,0.48,0.4478,19,41,60 +1492,2011-03-07,1,0,3,12,0,1,1,1,0.3,0.2727,0.42,0.4179,28,68,96 +1493,2011-03-07,1,0,3,13,0,1,1,1,0.32,0.2879,0.36,0.3881,16,54,70 +1494,2011-03-07,1,0,3,14,0,1,1,1,0.32,0.2879,0.36,0.4179,21,56,77 +1495,2011-03-07,1,0,3,15,0,1,1,1,0.34,0.303,0.31,0.3881,32,64,96 +1496,2011-03-07,1,0,3,16,0,1,1,1,0.34,0.303,0.31,0.3582,26,96,122 +1497,2011-03-07,1,0,3,17,0,1,1,1,0.34,0.303,0.29,0.3582,27,206,233 +1498,2011-03-07,1,0,3,18,0,1,1,1,0.32,0.303,0.31,0.2239,19,214,233 +1499,2011-03-07,1,0,3,19,0,1,1,1,0.3,0.2879,0.33,0.2239,11,134,145 +1500,2011-03-07,1,0,3,20,0,1,1,1,0.3,0.303,0.33,0.1642,5,87,92 +1501,2011-03-07,1,0,3,21,0,1,1,1,0.28,0.2727,0.38,0.2239,1,53,54 +1502,2011-03-07,1,0,3,22,0,1,1,1,0.26,0.2576,0.48,0.1642,1,34,35 +1503,2011-03-07,1,0,3,23,0,1,1,1,0.28,0.3182,0.48,0,0,22,22 +1504,2011-03-08,1,0,3,0,0,2,1,1,0.26,0.2879,0.48,0.0896,1,9,10 +1505,2011-03-08,1,0,3,1,0,2,1,1,0.24,0.2424,0.52,0.1343,0,4,4 +1506,2011-03-08,1,0,3,2,0,2,1,1,0.24,0.2424,0.52,0.1343,1,0,1 +1507,2011-03-08,1,0,3,3,0,2,1,1,0.24,0.2576,0.52,0.0896,5,2,7 +1508,2011-03-08,1,0,3,4,0,2,1,1,0.22,0.2727,0.64,0,0,2,2 
+1509,2011-03-08,1,0,3,5,0,2,1,1,0.2,0.2273,0.69,0.1045,2,8,10 +1510,2011-03-08,1,0,3,6,0,2,1,1,0.2,0.2576,0.59,0,3,42,45 +1511,2011-03-08,1,0,3,7,0,2,1,1,0.18,0.197,0.64,0.1343,9,119,128 +1512,2011-03-08,1,0,3,8,0,2,1,1,0.22,0.2273,0.55,0.194,10,247,257 +1513,2011-03-08,1,0,3,9,0,2,1,1,0.26,0.2576,0.41,0.1642,11,140,151 +1514,2011-03-08,1,0,3,10,0,2,1,2,0.3,0.2879,0.33,0.194,25,46,71 +1515,2011-03-08,1,0,3,11,0,2,1,2,0.36,0.3485,0.25,0.1642,26,52,78 +1516,2011-03-08,1,0,3,12,0,2,1,2,0.36,0.3788,0.23,0,29,70,99 +1517,2011-03-08,1,0,3,13,0,2,1,2,0.38,0.3939,0.25,0.1045,25,73,98 +1518,2011-03-08,1,0,3,14,0,2,1,2,0.38,0.3939,0.2,0,16,56,72 +1519,2011-03-08,1,0,3,15,0,2,1,2,0.36,0.3788,0.18,0,35,77,112 +1520,2011-03-08,1,0,3,16,0,2,1,1,0.38,0.3939,0.27,0.1642,26,82,108 +1521,2011-03-08,1,0,3,17,0,2,1,1,0.36,0.3485,0.27,0.2239,39,209,248 +1522,2011-03-08,1,0,3,18,0,2,1,1,0.34,0.3333,0.27,0.194,21,214,235 +1523,2011-03-08,1,0,3,19,0,2,1,1,0.34,0.3485,0.31,0.1045,9,141,150 +1524,2011-03-08,1,0,3,20,0,2,1,1,0.32,0.3333,0.39,0.0896,2,74,76 +1525,2011-03-08,1,0,3,21,0,2,1,1,0.3,0.2879,0.49,0.194,7,68,75 +1526,2011-03-08,1,0,3,22,0,2,1,1,0.3,0.2879,0.49,0.2239,11,44,55 +1527,2011-03-08,1,0,3,23,0,2,1,1,0.28,0.2727,0.61,0.194,3,38,41 +1528,2011-03-09,1,0,3,0,0,3,1,1,0.26,0.2879,0.65,0.0896,0,9,9 +1529,2011-03-09,1,0,3,1,0,3,1,1,0.26,0.2727,0.52,0.1343,0,4,4 +1530,2011-03-09,1,0,3,2,0,3,1,1,0.26,0.2727,0.52,0.1343,1,1,2 +1531,2011-03-09,1,0,3,3,0,3,1,1,0.24,0.2576,0.7,0.0896,1,2,3 +1532,2011-03-09,1,0,3,4,0,3,1,1,0.24,0.2576,0.75,0.1045,0,2,2 +1533,2011-03-09,1,0,3,5,0,3,1,2,0.24,0.2424,0.81,0.1343,1,7,8 +1534,2011-03-09,1,0,3,6,0,3,1,2,0.24,0.2576,0.85,0.1045,5,44,49 +1535,2011-03-09,1,0,3,7,0,3,1,2,0.24,0.2424,0.87,0.1642,18,123,141 +1536,2011-03-09,1,0,3,8,0,3,1,2,0.24,0.2424,0.87,0.1343,11,238,249 +1537,2011-03-09,1,0,3,9,0,3,1,2,0.26,0.2424,0.87,0.2537,5,136,141 +1538,2011-03-09,1,0,3,10,0,3,1,2,0.3,0.2879,0.75,0.2537,8,49,57 
+1539,2011-03-09,1,0,3,11,0,3,1,2,0.32,0.2879,0.76,0.3881,15,44,59 +1540,2011-03-09,1,0,3,12,0,3,1,2,0.32,0.303,0.76,0.2537,14,84,98 +1541,2011-03-09,1,0,3,13,0,3,1,2,0.32,0.2879,0.76,0.3582,17,82,99 +1542,2011-03-09,1,0,3,14,0,3,1,2,0.34,0.303,0.76,0.2985,10,61,71 +1543,2011-03-09,1,0,3,15,0,3,1,2,0.36,0.3333,0.71,0.2985,15,64,79 +1544,2011-03-09,1,0,3,16,0,3,1,2,0.34,0.303,0.76,0.3284,10,102,112 +1545,2011-03-09,1,0,3,17,0,3,1,2,0.34,0.303,0.76,0.2985,18,176,194 +1546,2011-03-09,1,0,3,18,0,3,1,2,0.34,0.3182,0.76,0.2537,12,176,188 +1547,2011-03-09,1,0,3,19,0,3,1,2,0.34,0.303,0.76,0.3284,11,123,134 +1548,2011-03-09,1,0,3,20,0,3,1,2,0.32,0.3182,0.87,0.194,4,87,91 +1549,2011-03-09,1,0,3,21,0,3,1,2,0.32,0.303,0.93,0.3284,10,52,62 +1550,2011-03-09,1,0,3,22,0,3,1,3,0.32,0.3182,0.93,0.1642,4,17,21 +1551,2011-03-09,1,0,3,23,0,3,1,3,0.34,0.3333,0.93,0.194,1,17,18 +1552,2011-03-10,1,0,3,0,0,4,1,3,0.34,0.3182,0,0.2537,3,0,3 +1553,2011-03-10,1,0,3,1,0,4,1,3,0.34,0.3182,0,0.2537,0,2,2 +1554,2011-03-10,1,0,3,2,0,4,1,3,0.34,0.3182,0,0.2537,0,1,1 +1555,2011-03-10,1,0,3,5,0,4,1,3,0.36,0.3485,0,0.194,1,2,3 +1556,2011-03-10,1,0,3,6,0,4,1,3,0.36,0.3333,0,0.3284,0,12,12 +1557,2011-03-10,1,0,3,7,0,4,1,3,0.38,0.3939,0,0.2239,1,36,37 +1558,2011-03-10,1,0,3,8,0,4,1,3,0.38,0.3939,0,0.2836,1,43,44 +1559,2011-03-10,1,0,3,9,0,4,1,3,0.4,0.4091,0,0.2239,1,23,24 +1560,2011-03-10,1,0,3,10,0,4,1,3,0.4,0.4091,0,0.1642,0,17,17 +1561,2011-03-10,1,0,3,11,0,4,1,3,0.4,0.4091,0,0.2537,6,5,11 +1562,2011-03-10,1,0,3,12,0,4,1,3,0.42,0.4242,0,0.2239,4,30,34 +1563,2011-03-10,1,0,3,13,0,4,1,3,0.42,0.4242,0,0.2239,1,11,12 +1564,2011-03-10,1,0,3,14,0,4,1,3,0.44,0.4394,0,0.2985,0,12,12 +1565,2011-03-10,1,0,3,15,0,4,1,3,0.44,0.4394,0,0.2239,3,11,14 +1566,2011-03-10,1,0,3,16,0,4,1,3,0.42,0.4242,0,0.2537,1,20,21 +1567,2011-03-10,1,0,3,17,0,4,1,2,0.44,0.4394,0,0.3881,2,109,111 +1568,2011-03-10,1,0,3,18,0,4,1,3,0.44,0.4394,0,0.3582,2,80,82 +1569,2011-03-10,1,0,3,19,0,4,1,3,0.44,0.4394,0,0.5821,5,51,56 
+1570,2011-03-10,1,0,3,20,0,4,1,3,0.36,0.3333,0,0.3284,9,29,38 +1571,2011-03-10,1,0,3,21,0,4,1,3,0.36,0.3485,0,0.2239,1,27,28 +1572,2011-03-10,1,0,3,22,0,4,1,2,0.34,0.3333,0,0.1343,4,30,34 +1573,2011-03-10,1,0,3,23,0,4,1,3,0.34,0.3485,0,0.0896,1,26,27 +1574,2011-03-11,1,0,3,0,0,5,1,2,0.34,0.3485,1,0.0896,0,6,6 +1575,2011-03-11,1,0,3,1,0,5,1,3,0.34,0.3485,1,0.1045,0,8,8 +1576,2011-03-11,1,0,3,2,0,5,1,3,0.34,0.3485,1,0.1045,2,3,5 +1577,2011-03-11,1,0,3,3,0,5,1,2,0.32,0.3333,0.93,0.0896,0,2,2 +1578,2011-03-11,1,0,3,5,0,5,1,1,0.3,0.2879,0.81,0.2239,1,6,7 +1579,2011-03-11,1,0,3,6,0,5,1,1,0.26,0.2424,0.81,0.2836,1,31,32 +1580,2011-03-11,1,0,3,7,0,5,1,1,0.26,0.2576,0.7,0.194,10,104,114 +1581,2011-03-11,1,0,3,8,0,5,1,1,0.28,0.2727,0.7,0.2537,15,244,259 +1582,2011-03-11,1,0,3,9,0,5,1,2,0.3,0.2879,0.61,0.2836,13,143,156 +1583,2011-03-11,1,0,3,10,0,5,1,2,0.32,0.303,0.57,0.2985,18,60,78 +1584,2011-03-11,1,0,3,11,0,5,1,2,0.32,0.303,0.57,0.2985,14,67,81 +1585,2011-03-11,1,0,3,12,0,5,1,2,0.34,0.3182,0.53,0.2537,15,96,111 +1586,2011-03-11,1,0,3,13,0,5,1,1,0.36,0.3333,0.53,0.2537,15,92,107 +1587,2011-03-11,1,0,3,14,0,5,1,2,0.36,0.3333,0.5,0.3582,25,64,89 +1588,2011-03-11,1,0,3,15,0,5,1,2,0.34,0.3182,0.53,0.2537,21,64,85 +1589,2011-03-11,1,0,3,16,0,5,1,2,0.34,0.303,0.49,0.2985,18,95,113 +1590,2011-03-11,1,0,3,17,0,5,1,1,0.34,0.3182,0.49,0.2239,23,200,223 +1591,2011-03-11,1,0,3,18,0,5,1,1,0.32,0.303,0.49,0.3284,19,134,153 +1592,2011-03-11,1,0,3,19,0,5,1,1,0.3,0.2727,0.56,0.3284,7,111,118 +1593,2011-03-11,1,0,3,20,0,5,1,1,0.3,0.303,0.56,0.1642,6,70,76 +1594,2011-03-11,1,0,3,21,0,5,1,2,0.3,0.2879,0.52,0.2537,10,43,53 +1595,2011-03-11,1,0,3,22,0,5,1,1,0.3,0.303,0.52,0.1642,11,53,64 +1596,2011-03-11,1,0,3,23,0,5,1,1,0.3,0.2879,0.52,0.2537,3,34,37 +1597,2011-03-12,1,0,3,0,0,6,0,1,0.26,0.2727,0.6,0.1343,4,30,34 +1598,2011-03-12,1,0,3,1,0,6,0,1,0.24,0.2273,0.65,0.194,3,15,18 +1599,2011-03-12,1,0,3,2,0,6,0,1,0.24,0.2273,0.65,0.194,0,14,14 
+1600,2011-03-12,1,0,3,3,0,6,0,1,0.24,0.2273,0.65,0.2537,1,6,7 +1601,2011-03-12,1,0,3,4,0,6,0,1,0.24,0.2273,0.65,0.2537,0,1,1 +1602,2011-03-12,1,0,3,5,0,6,0,1,0.22,0.2273,0.69,0.194,0,2,2 +1603,2011-03-12,1,0,3,6,0,6,0,1,0.22,0.2121,0.75,0.2537,2,2,4 +1604,2011-03-12,1,0,3,7,0,6,0,1,0.24,0.2273,0.7,0.194,4,19,23 +1605,2011-03-12,1,0,3,8,0,6,0,1,0.26,0.2576,0.65,0.1642,9,44,53 +1606,2011-03-12,1,0,3,9,0,6,0,1,0.28,0.2879,0.65,0.1045,25,76,101 +1607,2011-03-12,1,0,3,10,0,6,0,1,0.32,0.303,0.66,0.2239,21,78,99 +1608,2011-03-12,1,0,3,11,0,6,0,1,0.34,0.303,0.49,0.3284,36,83,119 +1609,2011-03-12,1,0,3,12,0,6,0,1,0.34,0.3182,0.53,0.2836,51,107,158 +1610,2011-03-12,1,0,3,13,0,6,0,1,0.36,0.3333,0.51,0.3582,62,95,157 +1611,2011-03-12,1,0,3,14,0,6,0,1,0.4,0.4091,0.5,0.4478,70,96,166 +1612,2011-03-12,1,0,3,15,0,6,0,1,0.42,0.4242,0.44,0.4925,81,101,182 +1613,2011-03-12,1,0,3,16,0,6,0,1,0.46,0.4545,0.41,0.4478,100,144,244 +1614,2011-03-12,1,0,3,17,0,6,0,1,0.46,0.4545,0.44,0.3284,99,114,213 +1615,2011-03-12,1,0,3,18,0,6,0,1,0.42,0.4242,0.54,0.1642,54,91,145 +1616,2011-03-12,1,0,3,19,0,6,0,1,0.42,0.4242,0.54,0.0896,26,86,112 +1617,2011-03-12,1,0,3,20,0,6,0,1,0.4,0.4091,0.58,0.1045,22,64,86 +1618,2011-03-12,1,0,3,21,0,6,0,1,0.38,0.3939,0.62,0,36,46,82 +1619,2011-03-12,1,0,3,22,0,6,0,1,0.36,0.3788,0.71,0,7,56,63 +1620,2011-03-12,1,0,3,23,0,6,0,1,0.38,0.3939,0.66,0.0896,11,38,49 +1621,2011-03-13,1,0,3,0,0,0,0,1,0.38,0.3939,0.62,0.1045,3,35,38 +1622,2011-03-13,1,0,3,1,0,0,0,1,0.36,0.3485,0.66,0.1343,10,23,33 +1623,2011-03-13,1,0,3,3,0,0,0,1,0.34,0.3333,0.76,0.1343,6,17,23 +1624,2011-03-13,1,0,3,4,0,0,0,1,0.34,0.3333,0.66,0.1642,4,9,13 +1625,2011-03-13,1,0,3,5,0,0,0,1,0.36,0.3485,0.62,0.1343,0,3,3 +1626,2011-03-13,1,0,3,6,0,0,0,1,0.34,0.3636,0.66,0,0,2,2 +1627,2011-03-13,1,0,3,7,0,0,0,1,0.36,0.3485,0.62,0.194,2,8,10 +1628,2011-03-13,1,0,3,8,0,0,0,1,0.4,0.4091,0.5,0.2985,11,23,34 +1629,2011-03-13,1,0,3,9,0,0,0,1,0.4,0.4091,0.5,0.4179,8,36,44 
+1630,2011-03-13,1,0,3,10,0,0,0,1,0.42,0.4242,0.47,0.2537,36,86,122 +1631,2011-03-13,1,0,3,11,0,0,0,1,0.44,0.4394,0.41,0.4179,88,93,181 +1632,2011-03-13,1,0,3,12,0,0,0,1,0.46,0.4545,0.38,0.3881,74,120,194 +1633,2011-03-13,1,0,3,13,0,0,0,1,0.46,0.4545,0.38,0.3881,97,124,221 +1634,2011-03-13,1,0,3,14,0,0,0,1,0.46,0.4545,0.41,0.2985,144,106,250 +1635,2011-03-13,1,0,3,15,0,0,0,1,0.48,0.4697,0.39,0.3284,149,155,304 +1636,2011-03-13,1,0,3,16,0,0,0,1,0.46,0.4545,0.41,0.3881,124,132,256 +1637,2011-03-13,1,0,3,17,0,0,0,1,0.44,0.4394,0.41,0.3582,98,143,241 +1638,2011-03-13,1,0,3,18,0,0,0,1,0.4,0.4091,0.43,0.3582,55,92,147 +1639,2011-03-13,1,0,3,19,0,0,0,1,0.36,0.3333,0.5,0.3582,28,73,101 +1640,2011-03-13,1,0,3,20,0,0,0,1,0.32,0.303,0.57,0.2985,23,54,77 +1641,2011-03-13,1,0,3,21,0,0,0,1,0.3,0.2879,0.56,0.2537,9,49,58 +1642,2011-03-13,1,0,3,22,0,0,0,1,0.3,0.2727,0.56,0.3284,8,29,37 +1643,2011-03-13,1,0,3,23,0,0,0,1,0.26,0.2576,0.65,0.2239,5,23,28 +1644,2011-03-14,1,0,3,0,0,1,1,1,0.26,0.2727,0.65,0.1045,4,7,11 +1645,2011-03-14,1,0,3,1,0,1,1,1,0.26,0.2879,0.65,0.0896,0,1,1 +1646,2011-03-14,1,0,3,2,0,1,1,1,0.26,0.2879,0.65,0.0896,0,1,1 +1647,2011-03-14,1,0,3,3,0,1,1,1,0.26,0.2727,0.65,0.1343,0,3,3 +1648,2011-03-14,1,0,3,5,0,1,1,1,0.24,0.2424,0.7,0.1343,0,8,8 +1649,2011-03-14,1,0,3,6,0,1,1,1,0.26,0.2727,0.65,0.1343,1,27,28 +1650,2011-03-14,1,0,3,7,0,1,1,2,0.28,0.3182,0.61,0,4,84,88 +1651,2011-03-14,1,0,3,8,0,1,1,2,0.3,0.303,0.56,0.1343,24,217,241 +1652,2011-03-14,1,0,3,9,0,1,1,1,0.32,0.303,0.53,0.3284,13,127,140 +1653,2011-03-14,1,0,3,10,0,1,1,1,0.34,0.3182,0.42,0.2537,27,57,84 +1654,2011-03-14,1,0,3,11,0,1,1,1,0.36,0.3333,0.4,0.2537,26,45,71 +1655,2011-03-14,1,0,3,12,0,1,1,1,0.38,0.3939,0.37,0,40,54,94 +1656,2011-03-14,1,0,3,13,0,1,1,1,0.38,0.3939,0.36,0.3284,24,51,75 +1657,2011-03-14,1,0,3,14,0,1,1,2,0.38,0.3939,0.34,0,27,52,79 +1658,2011-03-14,1,0,3,15,0,1,1,1,0.38,0.3939,0.37,0.1642,24,77,101 +1659,2011-03-14,1,0,3,16,0,1,1,1,0.4,0.4091,0.35,0.1045,27,80,107 
+1660,2011-03-14,1,0,3,17,0,1,1,1,0.38,0.3939,0.4,0.194,42,229,271 +1661,2011-03-14,1,0,3,18,0,1,1,1,0.36,0.3636,0.4,0.1045,25,210,235 +1662,2011-03-14,1,0,3,19,0,1,1,1,0.36,0.3485,0.43,0.1343,17,133,150 +1663,2011-03-14,1,0,3,20,0,1,1,1,0.34,0.3182,0.46,0.2239,23,106,129 +1664,2011-03-14,1,0,3,21,0,1,1,1,0.34,0.3333,0.46,0.1343,5,58,63 +1665,2011-03-14,1,0,3,22,0,1,1,1,0.32,0.3333,0.49,0.1045,4,43,47 +1666,2011-03-14,1,0,3,23,0,1,1,1,0.32,0.3485,0.53,0,2,17,19 +1667,2011-03-15,1,0,3,0,0,2,1,1,0.32,0.3485,0.53,0,7,7,14 +1668,2011-03-15,1,0,3,1,0,2,1,1,0.3,0.303,0.62,0.0896,4,6,10 +1669,2011-03-15,1,0,3,2,0,2,1,1,0.3,0.303,0.62,0.0896,1,2,3 +1670,2011-03-15,1,0,3,4,0,2,1,1,0.24,0.2576,0.75,0.0896,1,1,2 +1671,2011-03-15,1,0,3,5,0,2,1,1,0.24,0.2273,0.75,0.194,0,11,11 +1672,2011-03-15,1,0,3,6,0,2,1,1,0.22,0.2273,0.8,0.194,3,32,35 +1673,2011-03-15,1,0,3,7,0,2,1,1,0.24,0.2424,0.75,0.1642,10,109,119 +1674,2011-03-15,1,0,3,8,0,2,1,1,0.26,0.2879,0.7,0.0896,23,259,282 +1675,2011-03-15,1,0,3,9,0,2,1,2,0.3,0.3333,0.7,0,10,147,157 +1676,2011-03-15,1,0,3,10,0,2,1,2,0.32,0.3182,0.66,0.194,22,56,78 +1677,2011-03-15,1,0,3,11,0,2,1,2,0.34,0.3182,0.57,0.2239,27,51,78 +1678,2011-03-15,1,0,3,12,0,2,1,2,0.36,0.3485,0.5,0.2239,32,70,102 +1679,2011-03-15,1,0,3,13,0,2,1,2,0.36,0.3333,0.57,0.2836,23,74,97 +1680,2011-03-15,1,0,3,14,0,2,1,2,0.38,0.3939,0.46,0.2985,14,74,88 +1681,2011-03-15,1,0,3,15,0,2,1,2,0.38,0.3939,0.43,0.2836,24,66,90 +1682,2011-03-15,1,0,3,16,0,2,1,2,0.38,0.3939,0.46,0.3284,21,93,114 +1683,2011-03-15,1,0,3,17,0,2,1,2,0.36,0.3485,0.62,0.194,22,195,217 +1684,2011-03-15,1,0,3,18,0,2,1,2,0.36,0.3333,0.62,0.2985,18,207,225 +1685,2011-03-15,1,0,3,19,0,2,1,2,0.34,0.3182,0.71,0.2836,14,138,152 +1686,2011-03-15,1,0,3,20,0,2,1,2,0.34,0.3182,0.71,0.2836,9,79,88 +1687,2011-03-15,1,0,3,21,0,2,1,3,0.32,0.3333,0.81,0.1045,2,53,55 +1688,2011-03-15,1,0,3,22,0,2,1,2,0.32,0.3182,0.87,0.1642,1,20,21 +1689,2011-03-15,1,0,3,23,0,2,1,2,0.32,0.3182,0.87,0.1642,1,17,18 
+1690,2011-03-16,1,0,3,0,0,3,1,3,0.3,0.2879,0.93,0.2537,0,8,8 +1691,2011-03-16,1,0,3,1,0,3,1,3,0.3,0.2727,1,0.2985,1,2,3 +1692,2011-03-16,1,0,3,3,0,3,1,2,0.28,0.2727,1,0.2239,1,2,3 +1693,2011-03-16,1,0,3,4,0,3,1,2,0.3,0.303,0.93,0.1642,0,1,1 +1694,2011-03-16,1,0,3,5,0,3,1,2,0.3,0.3182,0.93,0.1045,0,3,3 +1695,2011-03-16,1,0,3,6,0,3,1,2,0.3,0.3182,0.93,0.1045,1,30,31 +1696,2011-03-16,1,0,3,7,0,3,1,2,0.3,0.2879,0.93,0.194,10,101,111 +1697,2011-03-16,1,0,3,8,0,3,1,2,0.32,0.3182,0.93,0,26,227,253 +1698,2011-03-16,1,0,3,9,0,3,1,2,0.32,0.3333,0.93,0.1343,10,144,154 +1699,2011-03-16,1,0,3,10,0,3,1,2,0.36,0.3485,0.81,0.1642,19,49,68 +1700,2011-03-16,1,0,3,11,0,3,1,2,0.36,0.3485,0.81,0.1343,13,62,75 +1701,2011-03-16,1,0,3,12,0,3,1,2,0.4,0.4091,0.71,0.1045,23,65,88 +1702,2011-03-16,1,0,3,13,0,3,1,2,0.4,0.4091,0.76,0.1343,18,81,99 +1703,2011-03-16,1,0,3,14,0,3,1,2,0.4,0.4091,0.76,0.2239,33,67,100 +1704,2011-03-16,1,0,3,15,0,3,1,2,0.4,0.4091,0.76,0.2537,22,79,101 +1705,2011-03-16,1,0,3,16,0,3,1,2,0.4,0.4091,0.76,0.1642,30,121,151 +1706,2011-03-16,1,0,3,17,0,3,1,2,0.44,0.4394,0.54,0.2985,37,216,253 +1707,2011-03-16,1,0,3,18,0,3,1,2,0.44,0.4394,0.54,0.2239,26,211,237 +1708,2011-03-16,1,0,3,19,0,3,1,2,0.44,0.4394,0.51,0.3284,18,150,168 +1709,2011-03-16,1,0,3,20,0,3,1,2,0.42,0.4242,0.54,0.2985,10,111,121 +1710,2011-03-16,1,0,3,21,0,3,1,2,0.42,0.4242,0.58,0.2836,3,75,78 +1711,2011-03-16,1,0,3,22,0,3,1,2,0.4,0.4091,0.62,0.3881,11,48,59 +1712,2011-03-16,1,0,3,23,0,3,1,1,0.4,0.4091,0.65,0.194,9,18,27 +1713,2011-03-17,1,0,3,0,0,4,1,1,0.38,0.3939,0.66,0.2537,4,19,23 +1714,2011-03-17,1,0,3,1,0,4,1,1,0.36,0.3485,0.71,0.2239,1,11,12 +1715,2011-03-17,1,0,3,2,0,4,1,1,0.34,0.303,0.71,0.2985,3,5,8 +1716,2011-03-17,1,0,3,3,0,4,1,1,0.34,0.3333,0.66,0.1642,1,1,2 +1717,2011-03-17,1,0,3,4,0,4,1,1,0.34,0.3333,0.71,0.194,0,3,3 +1718,2011-03-17,1,0,3,5,0,4,1,1,0.32,0.3333,0.76,0.1045,0,13,13 +1719,2011-03-17,1,0,3,6,0,4,1,1,0.32,0.3182,0.76,0.194,4,47,51 
+1720,2011-03-17,1,0,3,7,0,4,1,1,0.32,0.3333,0.76,0.1343,12,128,140 +1721,2011-03-17,1,0,3,8,0,4,1,1,0.36,0.3485,0.66,0.1343,17,282,299 +1722,2011-03-17,1,0,3,9,0,4,1,1,0.4,0.4091,0.62,0.2537,23,162,185 +1723,2011-03-17,1,0,3,10,0,4,1,1,0.44,0.4394,0.54,0.3284,9,69,78 +1724,2011-03-17,1,0,3,11,0,4,1,1,0.44,0.4394,0.54,0.2836,20,66,86 +1725,2011-03-17,1,0,3,12,0,4,1,1,0.5,0.4848,0.45,0.2239,24,81,105 +1726,2011-03-17,1,0,3,13,0,4,1,2,0.52,0.5,0.42,0.0896,26,85,111 +1727,2011-03-17,1,0,3,14,0,4,1,1,0.5,0.4848,0.45,0.1045,32,87,119 +1728,2011-03-17,1,0,3,15,0,4,1,1,0.52,0.5,0.42,0.2537,30,95,125 +1729,2011-03-17,1,0,3,16,0,4,1,1,0.5,0.4848,0.48,0.2836,34,108,142 +1730,2011-03-17,1,0,3,17,0,4,1,1,0.5,0.4848,0.42,0.194,48,265,313 +1731,2011-03-17,1,0,3,18,0,4,1,2,0.46,0.4545,0.59,0.1642,50,260,310 +1732,2011-03-17,1,0,3,19,0,4,1,1,0.44,0.4394,0.58,0.194,18,189,207 +1733,2011-03-17,1,0,3,20,0,4,1,1,0.42,0.4242,0.67,0.2239,25,112,137 +1734,2011-03-17,1,0,3,21,0,4,1,1,0.42,0.4242,0.62,0.2239,19,119,138 +1735,2011-03-17,1,0,3,22,0,4,1,1,0.4,0.4091,0.66,0.2239,17,70,87 +1736,2011-03-17,1,0,3,23,0,4,1,1,0.42,0.4242,0.62,0.2836,7,43,50 +1737,2011-03-18,1,0,3,0,0,5,1,1,0.42,0.4242,0.58,0.2836,5,24,29 +1738,2011-03-18,1,0,3,1,0,5,1,1,0.4,0.4091,0.62,0.2537,4,12,16 +1739,2011-03-18,1,0,3,2,0,5,1,1,0.4,0.4091,0.62,0.194,5,9,14 +1740,2011-03-18,1,0,3,3,0,5,1,1,0.36,0.3485,0.71,0.194,1,4,5 +1741,2011-03-18,1,0,3,5,0,5,1,1,0.38,0.3939,0.66,0.1642,2,8,10 +1742,2011-03-18,1,0,3,6,0,5,1,1,0.4,0.4091,0.66,0.194,1,35,36 +1743,2011-03-18,1,0,3,7,0,5,1,1,0.4,0.4091,0.66,0.194,11,112,123 +1744,2011-03-18,1,0,3,8,0,5,1,1,0.42,0.4242,0.67,0.2537,24,256,280 +1745,2011-03-18,1,0,3,9,0,5,1,1,0.46,0.4545,0.63,0.2239,18,192,210 +1746,2011-03-18,1,0,3,10,0,5,1,1,0.52,0.5,0.53,0.2836,43,74,117 +1747,2011-03-18,1,0,3,11,0,5,1,1,0.54,0.5152,0.52,0.2836,55,104,159 +1748,2011-03-18,1,0,3,12,0,5,1,2,0.56,0.5303,0.49,0.3582,72,123,195 +1749,2011-03-18,1,0,3,13,0,5,1,1,0.64,0.6212,0.41,0.2836,57,118,175 
+1750,2011-03-18,1,0,3,14,0,5,1,1,0.66,0.6212,0.39,0.2537,71,103,174 +1751,2011-03-18,1,0,3,15,0,5,1,2,0.68,0.6364,0.39,0.3582,62,111,173 +1752,2011-03-18,1,0,3,16,0,5,1,1,0.68,0.6364,0.39,0.2836,67,137,204 +1753,2011-03-18,1,0,3,17,0,5,1,1,0.7,0.6364,0.37,0.3284,95,237,332 +1754,2011-03-18,1,0,3,18,0,5,1,1,0.68,0.6364,0.39,0.1642,84,247,331 +1755,2011-03-18,1,0,3,19,0,5,1,1,0.66,0.6212,0.44,0.1343,58,132,190 +1756,2011-03-18,1,0,3,20,0,5,1,1,0.62,0.6212,0.46,0.1343,46,103,149 +1757,2011-03-18,1,0,3,21,0,5,1,1,0.62,0.6212,0.46,0.1045,22,91,113 +1758,2011-03-18,1,0,3,22,0,5,1,1,0.62,0.6212,0.5,0.1642,55,63,118 +1759,2011-03-18,1,0,3,23,0,5,1,2,0.6,0.6212,0.53,0.2239,26,60,86 +1760,2011-03-19,1,0,3,0,0,6,0,2,0.6,0.6212,0.53,0.2537,26,50,76 +1761,2011-03-19,1,0,3,1,0,6,0,2,0.58,0.5455,0.46,0.3582,16,35,51 +1762,2011-03-19,1,0,3,2,0,6,0,2,0.56,0.5303,0.43,0.2239,5,20,25 +1763,2011-03-19,1,0,3,3,0,6,0,2,0.54,0.5152,0.39,0.3284,1,7,8 +1764,2011-03-19,1,0,3,4,0,6,0,1,0.52,0.5,0.34,0.4179,1,2,3 +1765,2011-03-19,1,0,3,5,0,6,0,1,0.52,0.5,0.34,0.4179,0,2,2 +1766,2011-03-19,1,0,3,6,0,6,0,1,0.44,0.4394,0.44,0.4179,0,10,10 +1767,2011-03-19,1,0,3,7,0,6,0,1,0.4,0.4091,0.5,0.3284,4,9,13 +1768,2011-03-19,1,0,3,8,0,6,0,1,0.42,0.4242,0.47,0.4925,11,37,48 +1769,2011-03-19,1,0,3,9,0,6,0,1,0.42,0.4242,0.44,0.4627,35,41,76 +1770,2011-03-19,1,0,3,10,0,6,0,1,0.44,0.4394,0.38,0.4179,55,85,140 +1771,2011-03-19,1,0,3,11,0,6,0,1,0.46,0.4545,0.36,0.4478,90,106,196 +1772,2011-03-19,1,0,3,12,0,6,0,1,0.46,0.4545,0.33,0.4179,126,141,267 +1773,2011-03-19,1,0,3,13,0,6,0,1,0.5,0.4848,0.34,0.4627,174,127,301 +1774,2011-03-19,1,0,3,14,0,6,0,1,0.5,0.4848,0.31,0.4925,168,144,312 +1775,2011-03-19,1,0,3,15,0,6,0,1,0.5,0.4848,0.29,0.4179,170,143,313 +1776,2011-03-19,1,0,3,16,0,6,0,1,0.5,0.4848,0.31,0.3881,175,129,304 +1777,2011-03-19,1,0,3,17,0,6,0,1,0.48,0.4697,0.31,0.3284,138,140,278 +1778,2011-03-19,1,0,3,18,0,6,0,1,0.46,0.4545,0.31,0.3284,92,125,217 
+1779,2011-03-19,1,0,3,19,0,6,0,1,0.44,0.4394,0.33,0.2836,38,116,154 +1780,2011-03-19,1,0,3,20,0,6,0,1,0.42,0.4242,0.35,0.2239,39,69,108 +1781,2011-03-19,1,0,3,21,0,6,0,1,0.4,0.4091,0.37,0.2985,20,73,93 +1782,2011-03-19,1,0,3,22,0,6,0,1,0.4,0.4091,0.37,0.3284,27,45,72 +1783,2011-03-19,1,0,3,23,0,6,0,1,0.38,0.3939,0.4,0.2985,13,37,50 +1784,2011-03-20,1,0,3,0,0,0,0,1,0.34,0.303,0.49,0.4179,7,33,40 +1785,2011-03-20,1,0,3,1,0,0,0,1,0.32,0.2879,0.53,0.3582,2,22,24 +1786,2011-03-20,1,0,3,2,0,0,0,1,0.3,0.2879,0.52,0.2836,6,24,30 +1787,2011-03-20,1,0,3,3,0,0,0,1,0.28,0.2727,0.56,0.2537,1,11,12 +1788,2011-03-20,1,0,3,4,0,0,0,1,0.26,0.2576,0.56,0.2239,2,1,3 +1789,2011-03-20,1,0,3,5,0,0,0,1,0.26,0.2576,0.6,0.194,3,6,9 +1790,2011-03-20,1,0,3,6,0,0,0,1,0.26,0.2576,0.6,0.2239,1,3,4 +1791,2011-03-20,1,0,3,7,0,0,0,1,0.24,0.2273,0.6,0.2239,5,9,14 +1792,2011-03-20,1,0,3,8,0,0,0,1,0.28,0.2727,0.56,0.1642,7,30,37 +1793,2011-03-20,1,0,3,9,0,0,0,1,0.3,0.303,0.52,0.1343,35,43,78 +1794,2011-03-20,1,0,3,10,0,0,0,2,0.32,0.3485,0.45,0,55,81,136 +1795,2011-03-20,1,0,3,11,0,0,0,1,0.34,0.3485,0.39,0.1045,92,111,203 +1796,2011-03-20,1,0,3,12,0,0,0,1,0.36,0.3636,0.4,0.1045,120,108,228 +1797,2011-03-20,1,0,3,13,0,0,0,1,0.38,0.3939,0.34,0,88,115,203 +1798,2011-03-20,1,0,3,14,0,0,0,1,0.4,0.4091,0.3,0,145,134,279 +1799,2011-03-20,1,0,3,15,0,0,0,1,0.4,0.4091,0.32,0.0896,172,136,308 +1800,2011-03-20,1,0,3,16,0,0,0,1,0.42,0.4242,0.32,0,92,132,224 +1801,2011-03-20,1,0,3,17,0,0,0,1,0.4,0.4091,0.35,0.2836,83,143,226 +1802,2011-03-20,1,0,3,18,0,0,0,1,0.38,0.3939,0.4,0.3582,58,98,156 +1803,2011-03-20,1,0,3,19,0,0,0,1,0.36,0.3333,0.43,0.2836,21,58,79 +1804,2011-03-20,1,0,3,20,0,0,0,1,0.36,0.3333,0.46,0.2836,21,46,67 +1805,2011-03-20,1,0,3,21,0,0,0,2,0.34,0.3182,0.53,0.2836,10,39,49 +1806,2011-03-20,1,0,3,22,0,0,0,2,0.34,0.303,0.53,0.3284,8,30,38 +1807,2011-03-20,1,0,3,23,0,0,0,3,0.34,0.303,0.61,0.3881,13,11,24 +1808,2011-03-21,2,0,3,0,0,1,1,3,0.34,0.303,0.66,0.3881,2,11,13 
+1809,2011-03-21,2,0,3,1,0,1,1,2,0.34,0.303,0.71,0.3881,1,6,7 +1810,2011-03-21,2,0,3,2,0,1,1,2,0.34,0.303,0.71,0.3284,1,5,6 +1811,2011-03-21,2,0,3,3,0,1,1,2,0.34,0.303,0.71,0.3284,0,1,1 +1812,2011-03-21,2,0,3,5,0,1,1,1,0.32,0.303,0.81,0.2985,1,1,2 +1813,2011-03-21,2,0,3,6,0,1,1,3,0.32,0.303,0.76,0.2537,2,30,32 +1814,2011-03-21,2,0,3,7,0,1,1,3,0.3,0.2727,0.87,0.4179,3,15,18 +1815,2011-03-21,2,0,3,8,0,1,1,2,0.3,0.2727,0.87,0.4179,3,95,98 +1816,2011-03-21,2,0,3,9,0,1,1,2,0.34,0.303,0.81,0.3284,12,115,127 +1817,2011-03-21,2,0,3,10,0,1,1,2,0.38,0.3939,0.76,0.3881,17,45,62 +1818,2011-03-21,2,0,3,11,0,1,1,2,0.42,0.4242,0.71,0.2985,32,47,79 +1819,2011-03-21,2,0,3,12,0,1,1,2,0.44,0.4394,0.77,0.2537,25,56,81 +1820,2011-03-21,2,0,3,13,0,1,1,2,0.5,0.4848,0.63,0.2239,35,77,112 +1821,2011-03-21,2,0,3,14,0,1,1,2,0.54,0.5152,0.64,0.2239,36,65,101 +1822,2011-03-21,2,0,3,15,0,1,1,2,0.56,0.5303,0.6,0.2239,41,66,107 +1823,2011-03-21,2,0,3,16,0,1,1,2,0.54,0.5152,0.64,0.2836,29,112,141 +1824,2011-03-21,2,0,3,17,0,1,1,2,0.54,0.5152,0.64,0.2537,41,231,272 +1825,2011-03-21,2,0,3,18,0,1,1,2,0.52,0.5,0.72,0.2239,44,232,276 +1826,2011-03-21,2,0,3,19,0,1,1,1,0.58,0.5455,0.6,0.4179,22,199,221 +1827,2011-03-21,2,0,3,20,0,1,1,1,0.56,0.5303,0.64,0.194,16,106,122 +1828,2011-03-21,2,0,3,21,0,1,1,1,0.46,0.4545,0.94,0.194,12,79,91 +1829,2011-03-21,2,0,3,22,0,1,1,1,0.46,0.4545,0.88,0.2239,18,60,78 +1830,2011-03-21,2,0,3,23,0,1,1,1,0.46,0.4545,0.88,0.0896,8,22,30 +1831,2011-03-22,2,0,3,0,0,2,1,1,0.46,0.4545,0.88,0.194,4,7,11 +1832,2011-03-22,2,0,3,1,0,2,1,1,0.42,0.4242,1,0.1343,6,13,19 +1833,2011-03-22,2,0,3,2,0,2,1,1,0.44,0.4394,0.94,0.0896,0,1,1 +1834,2011-03-22,2,0,3,3,0,2,1,1,0.44,0.4394,0.82,0.2239,3,2,5 +1835,2011-03-22,2,0,3,4,0,2,1,2,0.42,0.4242,0.82,0.1642,0,1,1 +1836,2011-03-22,2,0,3,5,0,2,1,2,0.4,0.4091,0.87,0.1343,1,11,12 +1837,2011-03-22,2,0,3,6,0,2,1,1,0.4,0.4091,0.87,0.2537,2,58,60 +1838,2011-03-22,2,0,3,7,0,2,1,2,0.4,0.4091,0.76,0.1045,4,132,136 
+1839,2011-03-22,2,0,3,8,0,2,1,2,0.4,0.4091,0.58,0.2985,24,312,336 +1840,2011-03-22,2,0,3,9,0,2,1,2,0.4,0.4091,0.58,0.4627,19,159,178 +1841,2011-03-22,2,0,3,10,0,2,1,2,0.44,0.4394,0.54,0.2985,29,95,124 +1842,2011-03-22,2,0,3,11,0,2,1,1,0.44,0.4394,0.51,0.3284,22,64,86 +1843,2011-03-22,2,0,3,12,0,2,1,1,0.46,0.4545,0.47,0.2537,35,86,121 +1844,2011-03-22,2,0,3,13,0,2,1,1,0.5,0.4848,0.45,0.2239,21,85,106 +1845,2011-03-22,2,0,3,14,0,2,1,1,0.5,0.4848,0.42,0.3284,30,71,101 +1846,2011-03-22,2,0,3,15,0,2,1,1,0.5,0.4848,0.42,0.2239,55,79,134 +1847,2011-03-22,2,0,3,16,0,2,1,1,0.5,0.4848,0.42,0.1045,40,125,165 +1848,2011-03-22,2,0,3,17,0,2,1,1,0.5,0.4848,0.42,0.1045,42,265,307 +1849,2011-03-22,2,0,3,18,0,2,1,1,0.48,0.4697,0.44,0.0896,41,271,312 +1850,2011-03-22,2,0,3,19,0,2,1,2,0.44,0.4394,0.54,0.2537,22,142,164 +1851,2011-03-22,2,0,3,20,0,2,1,2,0.44,0.4394,0.54,0.3582,19,116,135 +1852,2011-03-22,2,0,3,21,0,2,1,2,0.42,0.4242,0.54,0.2836,24,63,87 +1853,2011-03-22,2,0,3,22,0,2,1,2,0.4,0.4091,0.58,0.2537,13,57,70 +1854,2011-03-22,2,0,3,23,0,2,1,2,0.4,0.4091,0.58,0.2537,4,28,32 +1855,2011-03-23,2,0,3,0,0,3,1,3,0.36,0.3485,0.71,0.2239,3,8,11 +1856,2011-03-23,2,0,3,1,0,3,1,3,0.34,0.3182,0.76,0.2239,0,7,7 +1857,2011-03-23,2,0,3,2,0,3,1,3,0.34,0.3182,0.76,0.2239,1,1,2 +1858,2011-03-23,2,0,3,3,0,3,1,2,0.34,0.3182,0.76,0.2537,0,3,3 +1859,2011-03-23,2,0,3,5,0,3,1,3,0.32,0.3182,0.87,0.194,0,10,10 +1860,2011-03-23,2,0,3,6,0,3,1,2,0.34,0.3182,0.87,0.2537,1,43,44 +1861,2011-03-23,2,0,3,7,0,3,1,2,0.32,0.303,0.93,0.2239,7,111,118 +1862,2011-03-23,2,0,3,8,0,3,1,2,0.32,0.3182,0.87,0.194,14,247,261 +1863,2011-03-23,2,0,3,9,0,3,1,2,0.32,0.303,0.81,0.2537,8,174,182 +1864,2011-03-23,2,0,3,10,0,3,1,2,0.34,0.3182,0.81,0.2836,15,60,75 +1865,2011-03-23,2,0,3,11,0,3,1,2,0.34,0.3333,0.87,0.194,18,53,71 +1866,2011-03-23,2,0,3,12,0,3,1,2,0.34,0.3182,0.87,0.2239,14,73,87 +1867,2011-03-23,2,0,3,13,0,3,1,3,0.34,0.3333,0.87,0.1343,14,94,108 +1868,2011-03-23,2,0,3,14,0,3,1,2,0.36,0.3636,0.87,0.1045,13,71,84 
+1869,2011-03-23,2,0,3,15,0,3,1,2,0.38,0.3939,0.82,0.2239,11,74,85 +1870,2011-03-23,2,0,3,16,0,3,1,2,0.4,0.4091,0.76,0.2239,13,123,136 +1871,2011-03-23,2,0,3,17,0,3,1,2,0.4,0.4091,0.76,0.2239,28,245,273 +1872,2011-03-23,2,0,3,18,0,3,1,3,0.38,0.3939,0.82,0.2537,20,218,238 +1873,2011-03-23,2,0,3,19,0,3,1,3,0.38,0.3939,0.82,0.2537,12,147,159 +1874,2011-03-23,2,0,3,20,0,3,1,3,0.36,0.303,0.87,0.6418,3,72,75 +1875,2011-03-23,2,0,3,21,0,3,1,3,0.32,0.2879,1,0.4179,0,22,22 +1876,2011-03-23,2,0,3,22,0,3,1,2,0.32,0.3182,0.93,0.1642,6,38,44 +1877,2011-03-23,2,0,3,23,0,3,1,3,0.32,0.3333,0.9,0,2,24,26 +1878,2011-03-24,2,0,3,0,0,4,1,2,0.32,0.303,0.93,0.2239,0,11,11 +1879,2011-03-24,2,0,3,1,0,4,1,2,0.3,0.2879,1,0.2239,0,3,3 +1880,2011-03-24,2,0,3,2,0,4,1,2,0.3,0.2879,1,0.2537,0,6,6 +1881,2011-03-24,2,0,3,3,0,4,1,2,0.28,0.2727,1,0.194,1,0,1 +1882,2011-03-24,2,0,3,4,0,4,1,2,0.28,0.2727,1,0.194,0,1,1 +1883,2011-03-24,2,0,3,5,0,4,1,2,0.28,0.2727,0.93,0.2239,1,8,9 +1884,2011-03-24,2,0,3,6,0,4,1,2,0.28,0.2727,0.93,0.2239,0,41,41 +1885,2011-03-24,2,0,3,7,0,4,1,3,0.26,0.2576,1,0.2239,2,106,108 +1886,2011-03-24,2,0,3,8,0,4,1,3,0.26,0.2273,1,0.2985,4,120,124 +1887,2011-03-24,2,0,3,9,0,4,1,2,0.26,0.2576,0.93,0.2239,3,94,97 +1888,2011-03-24,2,0,3,10,0,4,1,2,0.26,0.2273,0.93,0.3284,10,55,65 +1889,2011-03-24,2,0,3,11,0,4,1,2,0.3,0.2727,0.81,0.2985,10,61,71 +1890,2011-03-24,2,0,3,12,0,4,1,2,0.3,0.2727,0.7,0.3284,12,82,94 +1891,2011-03-24,2,0,3,13,0,4,1,2,0.32,0.303,0.7,0.2239,11,90,101 +1892,2011-03-24,2,0,3,14,0,4,1,2,0.3,0.303,0.7,0.1343,17,66,83 +1893,2011-03-24,2,0,3,15,0,4,1,2,0.3,0.2879,0.7,0.2537,13,77,90 +1894,2011-03-24,2,0,3,16,0,4,1,2,0.3,0.2879,0.65,0.2239,6,78,84 +1895,2011-03-24,2,0,3,17,0,4,1,2,0.3,0.2879,0.7,0.2239,12,221,233 +1896,2011-03-24,2,0,3,18,0,4,1,2,0.3,0.2727,0.61,0.2985,17,197,214 +1897,2011-03-24,2,0,3,19,0,4,1,1,0.3,0.2727,0.56,0.3881,16,122,138 +1898,2011-03-24,2,0,3,20,0,4,1,1,0.28,0.2576,0.61,0.2836,8,110,118 
+1899,2011-03-24,2,0,3,21,0,4,1,1,0.26,0.2576,0.65,0.2239,11,67,78 +1900,2011-03-24,2,0,3,22,0,4,1,1,0.26,0.2576,0.65,0.1642,3,59,62 +1901,2011-03-24,2,0,3,23,0,4,1,1,0.24,0.2273,0.65,0.194,9,24,33 +1902,2011-03-25,2,0,3,0,0,5,1,1,0.24,0.2273,0.6,0.194,1,17,18 +1903,2011-03-25,2,0,3,1,0,5,1,1,0.2,0.2121,0.69,0.1642,0,5,5 +1904,2011-03-25,2,0,3,2,0,5,1,1,0.2,0.197,0.69,0.2239,0,5,5 +1905,2011-03-25,2,0,3,3,0,5,1,1,0.2,0.2273,0.69,0.1045,0,2,2 +1906,2011-03-25,2,0,3,4,0,5,1,1,0.18,0.2121,0.74,0.1045,3,2,5 +1907,2011-03-25,2,0,3,5,0,5,1,1,0.18,0.197,0.74,0.1642,0,9,9 +1908,2011-03-25,2,0,3,6,0,5,1,1,0.18,0.1818,0.74,0.2239,0,32,32 +1909,2011-03-25,2,0,3,7,0,5,1,1,0.2,0.1818,0.59,0.3582,4,100,104 +1910,2011-03-25,2,0,3,8,0,5,1,1,0.22,0.2121,0.51,0.2836,9,228,237 +1911,2011-03-25,2,0,3,9,0,5,1,1,0.24,0.2121,0.48,0.2985,16,150,166 +1912,2011-03-25,2,0,3,10,0,5,1,1,0.26,0.2576,0.48,0,15,53,68 +1913,2011-03-25,2,0,3,11,0,5,1,1,0.3,0.2879,0.45,0.2836,17,58,75 +1914,2011-03-25,2,0,3,12,0,5,1,1,0.32,0.303,0.42,0.2537,37,108,145 +1915,2011-03-25,2,0,3,13,0,5,1,1,0.32,0.3182,0.39,0.194,34,104,138 +1916,2011-03-25,2,0,3,14,0,5,1,1,0.36,0.3333,0.32,0.3284,30,103,133 +1917,2011-03-25,2,0,3,15,0,5,1,2,0.34,0.3182,0.36,0.2537,43,79,122 +1918,2011-03-25,2,0,3,16,0,5,1,2,0.34,0.303,0.34,0.2985,23,127,150 +1919,2011-03-25,2,0,3,17,0,5,1,2,0.32,0.2879,0.36,0.3582,23,202,225 +1920,2011-03-25,2,0,3,18,0,5,1,1,0.3,0.2879,0.39,0.2537,8,190,198 +1921,2011-03-25,2,0,3,19,0,5,1,2,0.3,0.2879,0.36,0.2836,9,126,135 +1922,2011-03-25,2,0,3,20,0,5,1,2,0.3,0.303,0.39,0.1343,6,75,81 +1923,2011-03-25,2,0,3,21,0,5,1,2,0.3,0.303,0.39,0.1642,11,64,75 +1924,2011-03-25,2,0,3,22,0,5,1,2,0.28,0.2576,0.38,0.3284,3,40,43 +1925,2011-03-25,2,0,3,23,0,5,1,2,0.26,0.2424,0.38,0.2836,8,31,39 +1926,2011-03-26,2,0,3,0,0,6,0,2,0.26,0.2424,0.38,0.2836,3,25,28 +1927,2011-03-26,2,0,3,1,0,6,0,1,0.26,0.2273,0.38,0.2985,3,24,27 +1928,2011-03-26,2,0,3,2,0,6,0,1,0.22,0.2121,0.44,0.2239,4,21,25 
+1929,2011-03-26,2,0,3,3,0,6,0,1,0.2,0.197,0.47,0.194,2,7,9 +1930,2011-03-26,2,0,3,4,0,6,0,1,0.22,0.2121,0.44,0.2239,2,4,6 +1931,2011-03-26,2,0,3,5,0,6,0,1,0.2,0.197,0.51,0.194,0,8,8 +1932,2011-03-26,2,0,3,6,0,6,0,1,0.18,0.1818,0.55,0.194,2,8,10 +1933,2011-03-26,2,0,3,7,0,6,0,1,0.2,0.197,0.51,0.194,18,23,41 +1934,2011-03-26,2,0,3,8,0,6,0,1,0.22,0.2273,0.44,0.1343,24,42,66 +1935,2011-03-26,2,0,3,9,0,6,0,1,0.24,0.2424,0.41,0.1343,32,68,100 +1936,2011-03-26,2,0,3,10,0,6,0,1,0.26,0.2576,0.41,0.1642,32,93,125 +1937,2011-03-26,2,0,3,11,0,6,0,1,0.28,0.2727,0.38,0.1642,75,93,168 +1938,2011-03-26,2,0,3,12,0,6,0,1,0.3,0.2879,0.36,0.194,98,129,227 +1939,2011-03-26,2,0,3,13,0,6,0,1,0.32,0.303,0.33,0.2239,94,145,239 +1940,2011-03-26,2,0,3,14,0,6,0,1,0.34,0.3182,0.29,0.2239,93,123,216 +1941,2011-03-26,2,0,3,15,0,6,0,1,0.34,0.3333,0.29,0.194,110,109,219 +1942,2011-03-26,2,0,3,16,0,6,0,1,0.34,0.3333,0.23,0.194,118,130,248 +1943,2011-03-26,2,0,3,17,0,6,0,1,0.32,0.303,0.24,0.2239,102,101,203 +1944,2011-03-26,2,0,3,18,0,6,0,2,0.32,0.3182,0.21,0.1642,64,89,153 +1945,2011-03-26,2,0,3,19,0,6,0,2,0.28,0.2727,0.45,0.2537,45,93,138 +1946,2011-03-26,2,0,3,20,0,6,0,2,0.28,0.2727,0.45,0.2537,18,67,85 +1947,2011-03-26,2,0,3,21,0,6,0,2,0.26,0.2576,0.44,0.194,15,36,51 +1948,2011-03-26,2,0,3,22,0,6,0,2,0.28,0.2727,0.41,0.2239,17,46,63 +1949,2011-03-26,2,0,3,23,0,6,0,2,0.26,0.2424,0.44,0.2836,10,31,41 +1950,2011-03-27,2,0,3,0,0,0,0,2,0.26,0.2576,0.41,0.1642,5,26,31 +1951,2011-03-27,2,0,3,1,0,0,0,2,0.24,0.2273,0.52,0.194,3,17,20 +1952,2011-03-27,2,0,3,2,0,0,0,3,0.22,0.2273,0.55,0.194,6,15,21 +1953,2011-03-27,2,0,3,3,0,0,0,3,0.2,0.197,0.69,0.2239,0,14,14 +1954,2011-03-27,2,0,3,4,0,0,0,3,0.18,0.197,0.74,0.1642,1,5,6 +1955,2011-03-27,2,0,3,6,0,0,0,3,0.16,0.1818,0.86,0.1343,0,2,2 +1956,2011-03-27,2,0,3,7,0,0,0,3,0.16,0.1667,0.86,0.1642,0,7,7 +1957,2011-03-27,2,0,3,8,0,0,0,1,0.2,0.197,0.69,0.2239,6,8,14 +1958,2011-03-27,2,0,3,9,0,0,0,1,0.22,0.2273,0.55,0.1343,17,37,54 
+1959,2011-03-27,2,0,3,10,0,0,0,2,0.22,0.2121,0.55,0.2239,27,59,86 +1960,2011-03-27,2,0,3,11,0,0,0,2,0.24,0.2424,0.48,0.1642,23,85,108 +1961,2011-03-27,2,0,3,12,0,0,0,2,0.26,0.2576,0.44,0.2239,43,120,163 +1962,2011-03-27,2,0,3,13,0,0,0,1,0.3,0.2879,0.39,0.2239,47,100,147 +1963,2011-03-27,2,0,3,14,0,0,0,1,0.32,0.3182,0.36,0.1642,61,103,164 +1964,2011-03-27,2,0,3,15,0,0,0,1,0.32,0.303,0.36,0.2836,61,111,172 +1965,2011-03-27,2,0,3,16,0,0,0,1,0.34,0.3333,0.34,0,45,98,143 +1966,2011-03-27,2,0,3,17,0,0,0,1,0.32,0.303,0.31,0.2537,50,109,159 +1967,2011-03-27,2,0,3,18,0,0,0,1,0.3,0.2879,0.33,0.194,32,105,137 +1968,2011-03-27,2,0,3,19,0,0,0,1,0.3,0.2879,0.31,0.2239,23,80,103 +1969,2011-03-27,2,0,3,20,0,0,0,1,0.28,0.2727,0.36,0.1642,7,50,57 +1970,2011-03-27,2,0,3,21,0,0,0,1,0.26,0.2576,0.41,0.1642,7,29,36 +1971,2011-03-27,2,0,3,22,0,0,0,1,0.26,0.2576,0.41,0.1642,5,25,30 +1972,2011-03-27,2,0,3,23,0,0,0,1,0.26,0.2576,0.44,0.194,3,16,19 +1973,2011-03-28,2,0,3,0,0,1,1,1,0.22,0.2273,0.55,0.194,1,11,12 +1974,2011-03-28,2,0,3,1,0,1,1,1,0.22,0.2273,0.44,0.194,2,4,6 +1975,2011-03-28,2,0,3,2,0,1,1,1,0.22,0.2273,0.44,0.1642,0,5,5 +1976,2011-03-28,2,0,3,3,0,1,1,1,0.22,0.2273,0.44,0.1642,0,2,2 +1977,2011-03-28,2,0,3,5,0,1,1,1,0.18,0.1818,0.43,0.194,0,8,8 +1978,2011-03-28,2,0,3,6,0,1,1,1,0.18,0.1667,0.34,0.2836,0,40,40 +1979,2011-03-28,2,0,3,7,0,1,1,1,0.2,0.1818,0.32,0.3284,6,104,110 +1980,2011-03-28,2,0,3,8,0,1,1,1,0.2,0.197,0.34,0.2239,15,239,254 +1981,2011-03-28,2,0,3,9,0,1,1,2,0.22,0.2121,0.32,0.2239,12,109,121 +1982,2011-03-28,2,0,3,10,0,1,1,1,0.24,0.2273,0.3,0.194,22,47,69 +1983,2011-03-28,2,0,3,11,0,1,1,2,0.26,0.2424,0.3,0.2836,18,39,57 +1984,2011-03-28,2,0,3,12,0,1,1,2,0.26,0.2727,0.25,0,8,71,79 +1985,2011-03-28,2,0,3,13,0,1,1,2,0.3,0.2879,0.24,0.2239,14,69,83 +1986,2011-03-28,2,0,3,14,0,1,1,1,0.32,0.303,0.22,0.2239,15,65,80 +1987,2011-03-28,2,0,3,15,0,1,1,1,0.32,0.303,0.24,0.2836,19,53,72 +1988,2011-03-28,2,0,3,16,0,1,1,1,0.34,0.3182,0.23,0.2239,15,117,132 
+1989,2011-03-28,2,0,3,17,0,1,1,1,0.34,0.303,0.23,0.2985,25,227,252 +1990,2011-03-28,2,0,3,18,0,1,1,1,0.32,0.303,0.21,0.2836,21,220,241 +1991,2011-03-28,2,0,3,19,0,1,1,1,0.3,0.303,0.22,0.1343,17,156,173 +1992,2011-03-28,2,0,3,20,0,1,1,1,0.32,0.3333,0.22,0.1045,5,107,112 +1993,2011-03-28,2,0,3,21,0,1,1,1,0.32,0.3333,0.21,0.1343,6,55,61 +1994,2011-03-28,2,0,3,22,0,1,1,1,0.3,0.2879,0.22,0.2239,0,40,40 +1995,2011-03-28,2,0,3,23,0,1,1,1,0.28,0.2576,0.24,0.2985,1,18,19 +1996,2011-03-29,2,0,3,0,0,2,1,1,0.26,0.2273,0.28,0.3284,1,15,16 +1997,2011-03-29,2,0,3,1,0,2,1,1,0.24,0.2121,0.32,0.2985,0,4,4 +1998,2011-03-29,2,0,3,2,0,2,1,1,0.24,0.2121,0.32,0.2985,0,3,3 +1999,2011-03-29,2,0,3,3,0,2,1,1,0.24,0.2121,0.32,0.2985,2,1,3 +2000,2011-03-29,2,0,3,4,0,2,1,1,0.2,0.2121,0.44,0.1642,1,1,2 +2001,2011-03-29,2,0,3,5,0,2,1,1,0.22,0.2121,0.37,0.2537,0,8,8 +2002,2011-03-29,2,0,3,6,0,2,1,1,0.22,0.2121,0.41,0.2985,4,46,50 +2003,2011-03-29,2,0,3,7,0,2,1,1,0.22,0.197,0.41,0.3582,5,128,133 +2004,2011-03-29,2,0,3,8,0,2,1,1,0.24,0.2121,0.35,0.3582,19,268,287 +2005,2011-03-29,2,0,3,9,0,2,1,1,0.28,0.2576,0.28,0.3881,16,156,172 +2006,2011-03-29,2,0,3,10,0,2,1,1,0.3,0.2727,0.24,0.2985,13,46,59 +2007,2011-03-29,2,0,3,11,0,2,1,2,0.34,0.303,0.25,0.3881,18,63,81 +2008,2011-03-29,2,0,3,12,0,2,1,1,0.34,0.3333,0.25,0.1642,29,77,106 +2009,2011-03-29,2,0,3,13,0,2,1,1,0.34,0.303,0.25,0.2985,24,80,104 +2010,2011-03-29,2,0,3,14,0,2,1,1,0.36,0.3333,0.25,0.2537,20,66,86 +2011,2011-03-29,2,0,3,15,0,2,1,1,0.38,0.3939,0.25,0.2239,22,65,87 +2012,2011-03-29,2,0,3,16,0,2,1,1,0.38,0.3939,0.22,0.1642,12,124,136 +2013,2011-03-29,2,0,3,17,0,2,1,1,0.4,0.4091,0.2,0.1343,34,265,299 +2014,2011-03-29,2,0,3,18,0,2,1,1,0.36,0.3636,0.21,0.0896,42,252,294 +2015,2011-03-29,2,0,3,19,0,2,1,1,0.34,0.3333,0.23,0.194,20,170,190 +2016,2011-03-29,2,0,3,20,0,2,1,1,0.36,0.3636,0.21,0.0896,13,119,132 +2017,2011-03-29,2,0,3,21,0,2,1,1,0.34,0.3636,0.42,0,8,75,83 +2018,2011-03-29,2,0,3,22,0,2,1,1,0.34,0.3636,0.49,0,6,49,55 
+2019,2011-03-29,2,0,3,23,0,2,1,2,0.32,0.3333,0.57,0.1045,8,27,35 +2020,2011-03-30,2,0,3,0,0,3,1,2,0.32,0.3485,0.57,0,3,8,11 +2021,2011-03-30,2,0,3,1,0,3,1,1,0.32,0.3485,0.39,0,1,9,10 +2022,2011-03-30,2,0,3,2,0,3,1,2,0.32,0.3485,0.57,0,0,4,4 +2023,2011-03-30,2,0,3,3,0,3,1,2,0.32,0.3485,0.49,0,0,4,4 +2024,2011-03-30,2,0,3,4,0,3,1,2,0.32,0.3485,0.57,0,1,0,1 +2025,2011-03-30,2,0,3,5,0,3,1,2,0.3,0.2879,0.56,0.194,0,7,7 +2026,2011-03-30,2,0,3,6,0,3,1,2,0.3,0.303,0.49,0.1343,4,44,48 +2027,2011-03-30,2,0,3,7,0,3,1,2,0.32,0.3333,0.45,0.1045,8,128,136 +2028,2011-03-30,2,0,3,8,0,3,1,2,0.32,0.3182,0.49,0.1642,16,247,263 +2029,2011-03-30,2,0,3,9,0,3,1,2,0.32,0.303,0.45,0.2239,7,147,154 +2030,2011-03-30,2,0,3,10,0,3,1,2,0.34,0.3182,0.46,0.2239,13,51,64 +2031,2011-03-30,2,0,3,11,0,3,1,2,0.34,0.3182,0.46,0.2537,26,64,90 +2032,2011-03-30,2,0,3,12,0,3,1,2,0.34,0.3333,0.46,0.1642,15,65,80 +2033,2011-03-30,2,0,3,13,0,3,1,2,0.36,0.3485,0.46,0.2239,14,84,98 +2034,2011-03-30,2,0,3,14,0,3,1,2,0.36,0.3333,0.46,0.2985,16,64,80 +2035,2011-03-30,2,0,3,15,0,3,1,3,0.28,0.2576,0.81,0.2985,14,56,70 +2036,2011-03-30,2,0,3,16,0,3,1,3,0.28,0.2727,0.87,0.2537,0,36,36 +2037,2011-03-30,2,0,3,17,0,3,1,3,0.26,0.2727,0.93,0.1045,11,105,116 +2038,2011-03-30,2,0,3,18,0,3,1,2,0.26,0.2576,0.93,0.2239,6,79,85 +2039,2011-03-30,2,0,3,19,0,3,1,2,0.26,0.2576,0.93,0.2239,5,67,72 +2040,2011-03-30,2,0,3,20,0,3,1,3,0.24,0.2121,0.93,0.2985,4,40,44 +2041,2011-03-30,2,0,3,21,0,3,1,2,0.24,0.2273,0.93,0.2537,0,27,27 +2042,2011-03-30,2,0,3,22,0,3,1,3,0.24,0.2273,0.93,0.2239,3,21,24 +2043,2011-03-30,2,0,3,23,0,3,1,3,0.24,0.2121,0.93,0.2836,1,11,12 +2044,2011-03-31,2,0,3,0,0,4,1,3,0.24,0.2273,0.93,0.2239,0,3,3 +2045,2011-03-31,2,0,3,1,0,4,1,3,0.24,0.2273,0.93,0.194,1,4,5 +2046,2011-03-31,2,0,3,2,0,4,1,3,0.24,0.2273,0.93,0.194,0,5,5 +2047,2011-03-31,2,0,3,3,0,4,1,3,0.24,0.2273,0.93,0.2239,0,1,1 +2048,2011-03-31,2,0,3,4,0,4,1,3,0.24,0.2273,0.93,0.2537,0,2,2 +2049,2011-03-31,2,0,3,5,0,4,1,3,0.24,0.2273,0.93,0.2239,0,8,8 
+2050,2011-03-31,2,0,3,6,0,4,1,3,0.24,0.2121,0.93,0.2985,2,34,36 +2051,2011-03-31,2,0,3,7,0,4,1,3,0.24,0.2273,0.93,0.2537,5,87,92 +2052,2011-03-31,2,0,3,8,0,4,1,2,0.24,0.2273,1,0.2537,7,185,192 +2053,2011-03-31,2,0,3,9,0,4,1,3,0.26,0.2424,0.93,0.2537,6,126,132 +2054,2011-03-31,2,0,3,10,0,4,1,3,0.26,0.2424,0.93,0.2537,15,54,69 +2055,2011-03-31,2,0,3,11,0,4,1,3,0.28,0.2727,0.93,0.194,6,52,58 +2056,2011-03-31,2,0,3,12,0,4,1,3,0.28,0.2727,0.93,0.194,17,73,90 +2057,2011-03-31,2,0,3,13,0,4,1,3,0.3,0.303,0.87,0.1343,13,55,68 +2058,2011-03-31,2,0,3,14,0,4,1,3,0.3,0.303,0.87,0.1343,27,49,76 +2059,2011-03-31,2,0,3,15,0,4,1,3,0.3,0.2727,0.87,0.2985,4,61,65 +2060,2011-03-31,2,0,3,16,0,4,1,2,0.3,0.2879,0.87,0.2537,13,72,85 +2061,2011-03-31,2,0,3,17,0,4,1,2,0.3,0.2879,0.87,0.2537,15,153,168 +2062,2011-03-31,2,0,3,18,0,4,1,2,0.3,0.303,0.87,0.1343,12,165,177 +2063,2011-03-31,2,0,3,19,0,4,1,2,0.3,0.303,0.87,0.1343,12,118,130 +2064,2011-03-31,2,0,3,20,0,4,1,3,0.28,0.2576,0.93,0.2836,6,79,85 +2065,2011-03-31,2,0,3,21,0,4,1,2,0.28,0.2727,0.93,0.1642,7,53,60 +2066,2011-03-31,2,0,3,22,0,4,1,3,0.28,0.2727,0.93,0.2537,7,44,51 +2067,2011-03-31,2,0,3,23,0,4,1,3,0.26,0.2576,1,0.1642,4,23,27 +2068,2011-04-01,2,0,4,0,0,5,1,3,0.26,0.2576,1,0.1642,0,6,6 +2069,2011-04-01,2,0,4,1,0,5,1,3,0.26,0.2576,1,0.1642,0,4,4 +2070,2011-04-01,2,0,4,2,0,5,1,3,0.26,0.2576,0.93,0.194,0,7,7 +2071,2011-04-01,2,0,4,3,0,5,1,2,0.24,0.2273,0.93,0.2537,0,4,4 +2072,2011-04-01,2,0,4,4,0,5,1,2,0.24,0.2273,0.93,0.2537,0,3,3 +2073,2011-04-01,2,0,4,5,0,5,1,3,0.24,0.2273,0.93,0.2239,1,11,12 +2074,2011-04-01,2,0,4,6,0,5,1,3,0.24,0.2273,0.93,0.2239,2,26,28 +2075,2011-04-01,2,0,4,7,0,5,1,3,0.24,0.2424,0.93,0,4,91,95 +2076,2011-04-01,2,0,4,8,0,5,1,2,0.26,0.2424,0.87,0.2537,8,198,206 +2077,2011-04-01,2,0,4,9,0,5,1,1,0.32,0.2879,0.7,0.3582,11,162,173 +2078,2011-04-01,2,0,4,10,0,5,1,1,0.32,0.303,0.66,0.2836,12,63,75 +2079,2011-04-01,2,0,4,11,0,5,1,1,0.32,0.2879,0.61,0.3582,17,72,89 
+2080,2011-04-01,2,0,4,12,0,5,1,1,0.34,0.303,0.53,0.3582,15,80,95 +2081,2011-04-01,2,0,4,13,0,5,1,1,0.36,0.3333,0.5,0.3582,18,92,110 +2082,2011-04-01,2,0,4,14,0,5,1,2,0.36,0.3333,0.46,0.3582,26,61,87 +2083,2011-04-01,2,0,4,15,0,5,1,1,0.34,0.303,0.46,0.4179,30,81,111 +2084,2011-04-01,2,0,4,16,0,5,1,1,0.34,0.303,0.46,0.4179,42,125,167 +2085,2011-04-01,2,0,4,17,0,5,1,1,0.34,0.303,0.46,0.3284,36,245,281 +2086,2011-04-01,2,0,4,18,0,5,1,1,0.34,0.303,0.46,0.3284,36,205,241 +2087,2011-04-01,2,0,4,19,0,5,1,1,0.34,0.303,0.49,0.3284,16,120,136 +2088,2011-04-01,2,0,4,20,0,5,1,1,0.32,0.3182,0.53,0.194,2,75,77 +2089,2011-04-01,2,0,4,21,0,5,1,1,0.32,0.3333,0.53,0.1343,9,84,93 +2090,2011-04-01,2,0,4,22,0,5,1,1,0.3,0.303,0.56,0.1642,10,64,74 +2091,2011-04-01,2,0,4,23,0,5,1,1,0.3,0.3182,0.61,0.0896,12,41,53 +2092,2011-04-02,2,0,4,0,0,6,0,2,0.3,0.3333,0.61,0,3,29,32 +2093,2011-04-02,2,0,4,1,0,6,0,1,0.26,0.2727,0.65,0.1343,4,28,32 +2094,2011-04-02,2,0,4,2,0,6,0,1,0.24,0.2424,0.75,0.1343,1,20,21 +2095,2011-04-02,2,0,4,3,0,6,0,1,0.24,0.2424,0.7,0.1642,1,8,9 +2096,2011-04-02,2,0,4,4,0,6,0,1,0.24,0.2424,0.7,0.1642,0,5,5 +2097,2011-04-02,2,0,4,5,0,6,0,2,0.24,0.2424,0.75,0.1642,1,4,5 +2098,2011-04-02,2,0,4,6,0,6,0,1,0.26,0.2727,0.7,0.1045,5,7,12 +2099,2011-04-02,2,0,4,7,0,6,0,2,0.26,0.2727,0.7,0.1045,2,16,18 +2100,2011-04-02,2,0,4,8,0,6,0,1,0.3,0.3182,0.7,0.1045,10,45,55 +2101,2011-04-02,2,0,4,9,0,6,0,2,0.34,0.3485,0.66,0.1045,22,65,87 +2102,2011-04-02,2,0,4,10,0,6,0,2,0.36,0.3485,0.57,0.1642,41,113,154 +2103,2011-04-02,2,0,4,11,0,6,0,2,0.4,0.4091,0.47,0.1642,72,126,198 +2104,2011-04-02,2,0,4,12,0,6,0,3,0.32,0.3333,0.81,0.1343,84,100,184 +2105,2011-04-02,2,0,4,13,0,6,0,1,0.34,0.3333,0.81,0.1343,56,81,137 +2106,2011-04-02,2,0,4,14,0,6,0,3,0.32,0.303,0.76,0.3284,97,93,190 +2107,2011-04-02,2,0,4,15,0,6,0,3,0.34,0.303,0.76,0.3881,72,64,136 +2108,2011-04-02,2,0,4,16,0,6,0,1,0.38,0.3939,0.62,0.3284,111,85,196 +2109,2011-04-02,2,0,4,17,0,6,0,1,0.38,0.3939,0.54,0.3582,89,95,184 
+2110,2011-04-02,2,0,4,18,0,6,0,1,0.38,0.3939,0.54,0.194,69,110,179 +2111,2011-04-02,2,0,4,19,0,6,0,1,0.36,0.3333,0.53,0.3582,71,77,148 +2112,2011-04-02,2,0,4,20,0,6,0,1,0.34,0.3182,0.53,0.2836,29,56,85 +2113,2011-04-02,2,0,4,21,0,6,0,1,0.32,0.303,0.61,0.2537,24,53,77 +2114,2011-04-02,2,0,4,22,0,6,0,1,0.32,0.303,0.61,0.2985,14,41,55 +2115,2011-04-02,2,0,4,23,0,6,0,1,0.32,0.3182,0.61,0.1642,20,33,53 +2116,2011-04-03,2,0,4,0,0,0,0,1,0.3,0.2879,0.65,0.194,8,31,39 +2117,2011-04-03,2,0,4,1,0,0,0,1,0.3,0.3182,0.61,0.1045,8,26,34 +2118,2011-04-03,2,0,4,2,0,0,0,1,0.26,0.2727,0.7,0.1343,5,19,24 +2119,2011-04-03,2,0,4,3,0,0,0,1,0.3,0.3333,0.61,0,3,8,11 +2120,2011-04-03,2,0,4,4,0,0,0,1,0.28,0.303,0.7,0.0896,3,0,3 +2121,2011-04-03,2,0,4,5,0,0,0,1,0.28,0.2727,0.65,0.2239,1,4,5 +2122,2011-04-03,2,0,4,6,0,0,0,1,0.28,0.2727,0.65,0.194,10,23,33 +2123,2011-04-03,2,0,4,7,0,0,0,1,0.32,0.303,0.57,0.2239,13,20,33 +2124,2011-04-03,2,0,4,8,0,0,0,1,0.34,0.3333,0.57,0.1642,18,44,62 +2125,2011-04-03,2,0,4,9,0,0,0,1,0.36,0.3182,0.5,0.4925,68,74,142 +2126,2011-04-03,2,0,4,10,0,0,0,1,0.4,0.4091,0.43,0.4179,111,104,215 +2127,2011-04-03,2,0,4,11,0,0,0,1,0.42,0.4242,0.41,0.2537,139,104,243 +2128,2011-04-03,2,0,4,12,0,0,0,1,0.44,0.4394,0.38,0.2836,166,147,313 +2129,2011-04-03,2,0,4,13,0,0,0,1,0.44,0.4394,0.33,0.2985,219,148,367 +2130,2011-04-03,2,0,4,14,0,0,0,1,0.46,0.4545,0.31,0.1343,240,109,349 +2131,2011-04-03,2,0,4,15,0,0,0,2,0.46,0.4545,0.33,0,174,118,292 +2132,2011-04-03,2,0,4,16,0,0,0,2,0.46,0.4545,0.31,0.194,147,156,303 +2133,2011-04-03,2,0,4,17,0,0,0,2,0.46,0.4545,0.28,0.1343,148,126,274 +2134,2011-04-03,2,0,4,18,0,0,0,2,0.46,0.4545,0.31,0,71,101,172 +2135,2011-04-03,2,0,4,19,0,0,0,2,0.42,0.4242,0.44,0.194,51,93,144 +2136,2011-04-03,2,0,4,20,0,0,0,3,0.42,0.4242,0.41,0.2239,24,55,79 +2137,2011-04-03,2,0,4,21,0,0,0,2,0.42,0.4242,0.44,0.2239,8,37,45 +2138,2011-04-03,2,0,4,22,0,0,0,1,0.4,0.4091,0.43,0.1045,7,29,36 +2139,2011-04-03,2,0,4,23,0,0,0,1,0.4,0.4091,0.5,0.0896,9,22,31 
+2140,2011-04-04,2,0,4,0,0,1,1,1,0.4,0.4091,0.5,0.2239,1,5,6 +2141,2011-04-04,2,0,4,1,0,1,1,1,0.4,0.4091,0.56,0.3582,7,4,11 +2142,2011-04-04,2,0,4,2,0,1,1,1,0.38,0.3939,0.66,0.2537,1,1,2 +2143,2011-04-04,2,0,4,3,0,1,1,1,0.38,0.3939,0.66,0.2836,1,0,1 +2144,2011-04-04,2,0,4,4,0,1,1,1,0.38,0.3939,0.71,0.2985,1,1,2 +2145,2011-04-04,2,0,4,5,0,1,1,1,0.4,0.4091,0.62,0.2537,0,7,7 +2146,2011-04-04,2,0,4,6,0,1,1,1,0.42,0.4242,0.58,0.2836,3,43,46 +2147,2011-04-04,2,0,4,7,0,1,1,2,0.44,0.4394,0.54,0.2537,7,150,157 +2148,2011-04-04,2,0,4,8,0,1,1,2,0.46,0.4545,0.55,0.2985,31,308,339 +2149,2011-04-04,2,0,4,9,0,1,1,2,0.5,0.4848,0.51,0.3881,24,134,158 +2150,2011-04-04,2,0,4,10,0,1,1,2,0.54,0.5152,0.45,0.3582,35,55,90 +2151,2011-04-04,2,0,4,11,0,1,1,1,0.6,0.6212,0.4,0.3881,58,66,124 +2152,2011-04-04,2,0,4,12,0,1,1,1,0.64,0.6212,0.36,0.4627,59,98,157 +2153,2011-04-04,2,0,4,13,0,1,1,1,0.68,0.6364,0.34,0.3284,47,92,139 +2154,2011-04-04,2,0,4,14,0,1,1,2,0.74,0.6515,0.27,0.4925,47,76,123 +2155,2011-04-04,2,0,4,15,0,1,1,1,0.76,0.6667,0.23,0.5522,47,96,143 +2156,2011-04-04,2,0,4,16,0,1,1,1,0.76,0.6515,0.22,0.5224,59,130,189 +2157,2011-04-04,2,0,4,17,0,1,1,1,0.74,0.6515,0.23,0.6119,83,283,366 +2158,2011-04-04,2,0,4,18,0,1,1,2,0.72,0.6364,0.23,0.4925,78,308,386 +2159,2011-04-04,2,0,4,19,0,1,1,1,0.7,0.6364,0.24,0.4627,51,227,278 +2160,2011-04-04,2,0,4,20,0,1,1,2,0.7,0.6364,0.24,0.5522,40,133,173 +2161,2011-04-04,2,0,4,21,0,1,1,2,0.7,0.6364,0.3,0.4478,19,76,95 +2162,2011-04-04,2,0,4,22,0,1,1,2,0.68,0.6364,0.36,0.4627,17,58,75 +2163,2011-04-04,2,0,4,23,0,1,1,2,0.64,0.6212,0.47,0.2239,18,30,48 +2164,2011-04-05,2,0,4,0,0,2,1,1,0.62,0.6212,0.5,0.1045,10,12,22 +2165,2011-04-05,2,0,4,1,0,2,1,3,0.62,0.6212,0.57,0.4179,10,5,15 +2166,2011-04-05,2,0,4,2,0,2,1,2,0.54,0.5152,0.73,0.3284,0,5,5 +2167,2011-04-05,2,0,4,3,0,2,1,2,0.54,0.5152,0.73,0.3284,1,3,4 +2168,2011-04-05,2,0,4,4,0,2,1,3,0.5,0.4848,0.88,0.4925,0,2,2 +2169,2011-04-05,2,0,4,5,0,2,1,3,0.46,0.4545,0.94,0.2985,0,5,5 
+2170,2011-04-05,2,0,4,6,0,2,1,2,0.48,0.4697,0.88,0.3284,2,36,38 +2171,2011-04-05,2,0,4,7,0,2,1,2,0.48,0.4697,0.88,0.3284,10,124,134 +2172,2011-04-05,2,0,4,8,0,2,1,3,0.38,0.3939,0.87,0.5821,9,148,157 +2173,2011-04-05,2,0,4,9,0,2,1,3,0.36,0.3182,0.87,0.4925,2,44,46 +2174,2011-04-05,2,0,4,10,0,2,1,3,0.34,0.303,0.81,0.3881,3,25,28 +2175,2011-04-05,2,0,4,11,0,2,1,3,0.32,0.2879,0.81,0.4478,1,18,19 +2176,2011-04-05,2,0,4,12,0,2,1,2,0.34,0.303,0.71,0.3881,6,32,38 +2177,2011-04-05,2,0,4,13,0,2,1,2,0.36,0.3333,0.63,0.4179,5,51,56 +2178,2011-04-05,2,0,4,14,0,2,1,2,0.36,0.3182,0.57,0.4925,9,67,76 +2179,2011-04-05,2,0,4,15,0,2,1,1,0.4,0.4091,0.46,0.4627,6,62,68 +2180,2011-04-05,2,0,4,16,0,2,1,1,0.38,0.3939,0.46,0.4179,17,113,130 +2181,2011-04-05,2,0,4,17,0,2,1,1,0.42,0.4242,0.38,0.4179,27,246,273 +2182,2011-04-05,2,0,4,18,0,2,1,1,0.38,0.3939,0.4,0.4925,19,248,267 +2183,2011-04-05,2,0,4,19,0,2,1,1,0.36,0.3333,0.43,0.2985,12,148,160 +2184,2011-04-05,2,0,4,20,0,2,1,1,0.34,0.3182,0.46,0.2836,4,87,91 +2185,2011-04-05,2,0,4,21,0,2,1,1,0.34,0.2879,0.46,0.5224,8,81,89 +2186,2011-04-05,2,0,4,22,0,2,1,1,0.32,0.303,0.49,0.2985,3,43,46 +2187,2011-04-05,2,0,4,23,0,2,1,1,0.3,0.2879,0.49,0.2836,3,23,26 +2188,2011-04-06,2,0,4,0,0,3,1,1,0.3,0.3182,0.49,0.1045,0,15,15 +2189,2011-04-06,2,0,4,1,0,3,1,1,0.26,0.2727,0.6,0.1343,0,2,2 +2190,2011-04-06,2,0,4,2,0,3,1,1,0.24,0.2424,0.7,0.1343,0,5,5 +2191,2011-04-06,2,0,4,3,0,3,1,1,0.26,0.2727,0.6,0.1343,0,4,4 +2192,2011-04-06,2,0,4,4,0,3,1,1,0.24,0.2424,0.65,0.1343,0,1,1 +2193,2011-04-06,2,0,4,5,0,3,1,1,0.24,0.2576,0.7,0.1045,1,12,13 +2194,2011-04-06,2,0,4,6,0,3,1,1,0.24,0.2576,0.7,0.1045,4,52,56 +2195,2011-04-06,2,0,4,7,0,3,1,1,0.26,0.2727,0.65,0.1045,3,130,133 +2196,2011-04-06,2,0,4,8,0,3,1,1,0.32,0.303,0.61,0.2239,17,308,325 +2197,2011-04-06,2,0,4,9,0,3,1,1,0.36,0.3333,0.46,0.2537,8,157,165 +2198,2011-04-06,2,0,4,10,0,3,1,1,0.4,0.4091,0.37,0.3881,21,48,69 +2199,2011-04-06,2,0,4,11,0,3,1,1,0.42,0.4242,0.38,0.3881,23,70,93 
+2200,2011-04-06,2,0,4,12,0,3,1,1,0.44,0.4394,0.41,0.2537,22,112,134 +2201,2011-04-06,2,0,4,13,0,3,1,1,0.46,0.4545,0.38,0.4925,29,83,112 +2202,2011-04-06,2,0,4,14,0,3,1,1,0.5,0.4848,0.34,0.4179,35,80,115 +2203,2011-04-06,2,0,4,15,0,3,1,1,0.52,0.5,0.32,0.3284,34,83,117 +2204,2011-04-06,2,0,4,16,0,3,1,1,0.54,0.5152,0.28,0.4179,27,142,169 +2205,2011-04-06,2,0,4,17,0,3,1,1,0.52,0.5,0.32,0.4478,53,303,356 +2206,2011-04-06,2,0,4,18,0,3,1,1,0.52,0.5,0.29,0.3582,43,282,325 +2207,2011-04-06,2,0,4,19,0,3,1,1,0.5,0.4848,0.31,0.2985,36,196,232 +2208,2011-04-06,2,0,4,20,0,3,1,1,0.46,0.4545,0.51,0.2239,15,126,141 +2209,2011-04-06,2,0,4,21,0,3,1,1,0.46,0.4545,0.41,0.2836,18,84,102 +2210,2011-04-06,2,0,4,22,0,3,1,1,0.46,0.4545,0.41,0.2836,13,71,84 +2211,2011-04-06,2,0,4,23,0,3,1,1,0.46,0.4545,0.41,0.2985,11,29,40 +2212,2011-04-07,2,0,4,0,0,4,1,1,0.46,0.4545,0.41,0.2836,5,15,20 +2213,2011-04-07,2,0,4,1,0,4,1,1,0.42,0.4242,0.47,0.1343,2,11,13 +2214,2011-04-07,2,0,4,2,0,4,1,1,0.42,0.4242,0.54,0,0,7,7 +2215,2011-04-07,2,0,4,3,0,4,1,1,0.36,0.3485,0.66,0.194,0,3,3 +2216,2011-04-07,2,0,4,4,0,4,1,1,0.34,0.3333,0.76,0.194,0,1,1 +2217,2011-04-07,2,0,4,5,0,4,1,1,0.34,0.3182,0.76,0.2239,0,6,6 +2218,2011-04-07,2,0,4,6,0,4,1,1,0.32,0.303,0.81,0.2537,5,59,64 +2219,2011-04-07,2,0,4,7,0,4,1,2,0.34,0.3333,0.76,0.1642,8,152,160 +2220,2011-04-07,2,0,4,8,0,4,1,1,0.36,0.3485,0.76,0.194,23,291,314 +2221,2011-04-07,2,0,4,9,0,4,1,2,0.4,0.4091,0.66,0.1343,15,155,170 +2222,2011-04-07,2,0,4,10,0,4,1,2,0.42,0.4242,0.67,0.1045,29,66,95 +2223,2011-04-07,2,0,4,11,0,4,1,2,0.46,0.4545,0.55,0,36,86,122 +2224,2011-04-07,2,0,4,12,0,4,1,2,0.46,0.4545,0.55,0,39,114,153 +2225,2011-04-07,2,0,4,13,0,4,1,2,0.52,0.5,0.48,0.0896,36,99,135 +2226,2011-04-07,2,0,4,14,0,4,1,1,0.56,0.5303,0.43,0.0896,46,111,157 +2227,2011-04-07,2,0,4,15,0,4,1,1,0.6,0.6212,0.38,0.1045,27,133,160 +2228,2011-04-07,2,0,4,16,0,4,1,1,0.6,0.6212,0.35,0,52,161,213 +2229,2011-04-07,2,0,4,17,0,4,1,1,0.52,0.5,0.52,0.2836,63,280,343 
+2230,2011-04-07,2,0,4,18,0,4,1,1,0.48,0.4697,0.55,0.2537,68,265,333 +2231,2011-04-07,2,0,4,19,0,4,1,1,0.46,0.4545,0.59,0.2985,34,192,226 +2232,2011-04-07,2,0,4,20,0,4,1,1,0.44,0.4394,0.62,0.1642,37,166,203 +2233,2011-04-07,2,0,4,21,0,4,1,2,0.44,0.4394,0.67,0.2836,19,89,108 +2234,2011-04-07,2,0,4,22,0,4,1,2,0.4,0.4091,0.76,0.194,18,63,81 +2235,2011-04-07,2,0,4,23,0,4,1,1,0.38,0.3939,0.76,0.2537,9,45,54 +2236,2011-04-08,2,0,4,0,0,5,1,2,0.36,0.3485,0.76,0.2239,4,21,25 +2237,2011-04-08,2,0,4,1,0,5,1,2,0.34,0.3333,0.76,0.1343,3,6,9 +2238,2011-04-08,2,0,4,2,0,5,1,3,0.34,0.3333,0.76,0.1642,3,10,13 +2239,2011-04-08,2,0,4,3,0,5,1,3,0.34,0.3333,0.76,0.1642,0,1,1 +2240,2011-04-08,2,0,4,4,0,5,1,2,0.34,0.3333,0.86,0.1343,0,1,1 +2241,2011-04-08,2,0,4,5,0,5,1,2,0.32,0.3182,0.87,0.194,1,8,9 +2242,2011-04-08,2,0,4,6,0,5,1,2,0.34,0.3636,0.87,0,2,33,35 +2243,2011-04-08,2,0,4,7,0,5,1,3,0.34,0.3333,0.87,0.1642,4,109,113 +2244,2011-04-08,2,0,4,8,0,5,1,3,0.34,0.3333,0.87,0.1642,13,208,221 +2245,2011-04-08,2,0,4,9,0,5,1,2,0.36,0.3485,0.76,0.194,17,168,185 +2246,2011-04-08,2,0,4,10,0,5,1,2,0.36,0.3333,0.71,0.2836,10,63,73 +2247,2011-04-08,2,0,4,11,0,5,1,2,0.4,0.4091,0.62,0.3881,19,79,98 +2248,2011-04-08,2,0,4,12,0,5,1,3,0.38,0.3939,0.66,0.2239,29,51,80 +2249,2011-04-08,2,0,4,13,0,5,1,3,0.36,0.3333,0.76,0.2836,7,35,42 +2250,2011-04-08,2,0,4,14,0,5,1,3,0.34,0.3182,0.87,0.2239,2,13,15 +2251,2011-04-08,2,0,4,15,0,5,1,3,0.34,0.3182,0.87,0.2239,1,24,25 +2252,2011-04-08,2,0,4,16,0,5,1,2,0.32,0.303,0.93,0.2537,2,58,60 +2253,2011-04-08,2,0,4,17,0,5,1,3,0.32,0.303,0.93,0.2537,10,138,148 +2254,2011-04-08,2,0,4,18,0,5,1,3,0.32,0.303,0.93,0.2537,8,54,62 +2255,2011-04-08,2,0,4,19,0,5,1,3,0.3,0.2727,0.93,0.4179,1,52,53 +2256,2011-04-08,2,0,4,20,0,5,1,2,0.3,0.2727,0.93,0.3284,12,51,63 +2257,2011-04-08,2,0,4,21,0,5,1,2,0.3,0.2879,0.93,0.2537,11,43,54 +2258,2011-04-08,2,0,4,22,0,5,1,2,0.3,0.2727,0.93,0.2985,9,42,51 +2259,2011-04-08,2,0,4,23,0,5,1,2,0.3,0.2879,0.93,0.2239,4,31,35 
+2260,2011-04-09,2,0,4,0,0,6,0,2,0.3,0.2879,0.87,0.2239,5,26,31 +2261,2011-04-09,2,0,4,1,0,6,0,2,0.3,0.303,0.87,0.1343,3,17,20 +2262,2011-04-09,2,0,4,2,0,6,0,2,0.32,0.303,0.87,0.2239,2,15,17 +2263,2011-04-09,2,0,4,3,0,6,0,3,0.3,0.2879,1,0.2537,3,11,14 +2264,2011-04-09,2,0,4,4,0,6,0,3,0.3,0.2727,1,0.2537,0,3,3 +2265,2011-04-09,2,0,4,5,0,6,0,2,0.3,0.2879,0.93,0.2239,0,5,5 +2266,2011-04-09,2,0,4,6,0,6,0,2,0.3,0.2879,0.93,0.194,0,13,13 +2267,2011-04-09,2,0,4,7,0,6,0,2,0.3,0.303,0.93,0.1642,8,13,21 +2268,2011-04-09,2,0,4,8,0,6,0,2,0.32,0.303,0.93,0.1642,7,47,54 +2269,2011-04-09,2,0,4,9,0,6,0,2,0.34,0.3485,0.87,0.1045,27,71,98 +2270,2011-04-09,2,0,4,10,0,6,0,2,0.34,0.3333,0.87,0.1343,43,90,133 +2271,2011-04-09,2,0,4,11,0,6,0,2,0.36,0.3636,0.81,0,51,91,142 +2272,2011-04-09,2,0,4,12,0,6,0,2,0.36,0.3485,0.81,0.1343,79,123,202 +2273,2011-04-09,2,0,4,13,0,6,0,2,0.36,0.3788,0.81,0,114,108,222 +2274,2011-04-09,2,0,4,14,0,6,0,2,0.36,0.3485,0.81,0.1343,94,118,212 +2275,2011-04-09,2,0,4,15,0,6,0,2,0.36,0.3788,0.87,0,85,116,201 +2276,2011-04-09,2,0,4,16,0,6,0,2,0.38,0.3939,0.82,0,73,118,191 +2277,2011-04-09,2,0,4,17,0,6,0,2,0.38,0.3939,0.82,0.0896,128,129,257 +2278,2011-04-09,2,0,4,18,0,6,0,2,0.38,0.3939,0.82,0.1045,48,129,177 +2279,2011-04-09,2,0,4,19,0,6,0,2,0.38,0.3939,0.82,0.1343,47,83,130 +2280,2011-04-09,2,0,4,20,0,6,0,2,0.38,0.3939,0.87,0.1642,19,74,93 +2281,2011-04-09,2,0,4,21,0,6,0,2,0.36,0.3485,0.93,0.1343,10,65,75 +2282,2011-04-09,2,0,4,22,0,6,0,2,0.36,0.3485,0.93,0.1343,21,66,87 +2283,2011-04-09,2,0,4,23,0,6,0,2,0.38,0.3939,0.87,0.0896,12,45,57 +2284,2011-04-10,2,0,4,0,0,0,0,2,0.38,0.3939,0.87,0.0896,5,48,53 +2285,2011-04-10,2,0,4,1,0,0,0,2,0.38,0.3939,0.87,0.1343,6,31,37 +2286,2011-04-10,2,0,4,2,0,0,0,2,0.38,0.3939,0.87,0.0896,12,24,36 +2287,2011-04-10,2,0,4,3,0,0,0,2,0.38,0.3939,0.87,0.1343,5,11,16 +2288,2011-04-10,2,0,4,4,0,0,0,2,0.36,0.3636,0.93,0.1045,3,2,5 +2289,2011-04-10,2,0,4,5,0,0,0,2,0.36,0.3636,0.93,0.0896,0,4,4 
+2290,2011-04-10,2,0,4,6,0,0,0,2,0.36,0.3636,0.93,0.0896,0,4,4 +2291,2011-04-10,2,0,4,7,0,0,0,2,0.36,0.3636,1,0.0896,3,7,10 +2292,2011-04-10,2,0,4,8,0,0,0,2,0.38,0.3939,0.94,0.0896,17,38,55 +2293,2011-04-10,2,0,4,9,0,0,0,2,0.38,0.3939,0.94,0.1343,31,50,81 +2294,2011-04-10,2,0,4,10,0,0,0,2,0.4,0.4091,0.87,0,69,81,150 +2295,2011-04-10,2,0,4,11,0,0,0,2,0.4,0.4091,0.87,0.1343,93,109,202 +2296,2011-04-10,2,0,4,12,0,0,0,2,0.42,0.4242,0.88,0.1343,94,136,230 +2297,2011-04-10,2,0,4,13,0,0,0,1,0.46,0.4545,0.82,0.1045,121,142,263 +2298,2011-04-10,2,0,4,14,0,0,0,1,0.5,0.4848,0.72,0.194,148,133,281 +2299,2011-04-10,2,0,4,15,0,0,0,1,0.5,0.4848,0.77,0.2239,156,141,297 +2300,2011-04-10,2,0,4,16,0,0,0,2,0.52,0.5,0.72,0.1642,135,153,288 +2301,2011-04-10,2,0,4,17,0,0,0,2,0.52,0.5,0.72,0.194,84,152,236 +2302,2011-04-10,2,0,4,18,0,0,0,1,0.5,0.4848,0.77,0.194,103,137,240 +2303,2011-04-10,2,0,4,19,0,0,0,1,0.5,0.4848,0.77,0.194,47,84,131 +2304,2011-04-10,2,0,4,20,0,0,0,1,0.46,0.4545,0.82,0.1343,21,71,92 +2305,2011-04-10,2,0,4,21,0,0,0,1,0.44,0.4394,0.88,0.194,18,77,95 +2306,2011-04-10,2,0,4,22,0,0,0,1,0.44,0.4394,0.94,0.2537,12,45,57 +2307,2011-04-10,2,0,4,23,0,0,0,1,0.46,0.4545,0.88,0.3582,5,27,32 +2308,2011-04-11,2,0,4,0,0,1,1,1,0.48,0.4697,0.88,0.3284,7,16,23 +2309,2011-04-11,2,0,4,1,0,1,1,1,0.46,0.4545,0.94,0.2836,1,3,4 +2310,2011-04-11,2,0,4,2,0,1,1,1,0.46,0.4545,0.94,0.2836,7,2,9 +2311,2011-04-11,2,0,4,4,0,1,1,1,0.46,0.4545,0.94,0.194,0,1,1 +2312,2011-04-11,2,0,4,5,0,1,1,1,0.46,0.4545,0.94,0.194,1,12,13 +2313,2011-04-11,2,0,4,6,0,1,1,2,0.46,0.4545,1,0.2239,2,59,61 +2314,2011-04-11,2,0,4,7,0,1,1,2,0.5,0.4848,0.88,0.2239,12,164,176 +2315,2011-04-11,2,0,4,8,0,1,1,2,0.52,0.5,0.88,0.2836,28,286,314 +2316,2011-04-11,2,0,4,9,0,1,1,2,0.56,0.5303,0.83,0.2836,33,132,165 +2317,2011-04-11,2,0,4,10,0,1,1,2,0.56,0.5303,0.83,0.2985,41,55,96 +2318,2011-04-11,2,0,4,11,0,1,1,2,0.6,0.5909,0.73,0.2985,45,59,104 +2319,2011-04-11,2,0,4,12,0,1,1,2,0.6,0.5909,0.73,0.3284,37,97,134 
+2320,2011-04-11,2,0,4,13,0,1,1,2,0.64,0.6061,0.69,0.4179,62,77,139 +2321,2011-04-11,2,0,4,14,0,1,1,2,0.72,0.6667,0.54,0.3582,60,85,145 +2322,2011-04-11,2,0,4,15,0,1,1,1,0.74,0.6667,0.48,0.5224,56,85,141 +2323,2011-04-11,2,0,4,16,0,1,1,1,0.74,0.6667,0.48,0.5224,73,162,235 +2324,2011-04-11,2,0,4,17,0,1,1,1,0.74,0.6667,0.48,0.5224,100,352,452 +2325,2011-04-11,2,0,4,18,0,1,1,1,0.72,0.6667,0.51,0.3881,93,290,383 +2326,2011-04-11,2,0,4,19,0,1,1,1,0.72,0.6667,0.51,0.4478,73,211,284 +2327,2011-04-11,2,0,4,20,0,1,1,1,0.68,0.6364,0.57,0.5224,44,122,166 +2328,2011-04-11,2,0,4,21,0,1,1,1,0.66,0.6212,0.61,0.2537,39,119,158 +2329,2011-04-11,2,0,4,22,0,1,1,2,0.64,0.6212,0.5,0.2836,25,66,91 +2330,2011-04-11,2,0,4,23,0,1,1,3,0.58,0.5455,0.6,0,16,38,54 +2331,2011-04-12,2,0,4,0,0,2,1,2,0.62,0.6212,0.5,0.1045,11,13,24 +2332,2011-04-12,2,0,4,1,0,2,1,2,0.62,0.6212,0.5,0.0896,6,7,13 +2333,2011-04-12,2,0,4,2,0,2,1,2,0.6,0.6212,0.53,0.1343,7,7,14 +2334,2011-04-12,2,0,4,3,0,2,1,2,0.58,0.5455,0.56,0.1343,0,1,1 +2335,2011-04-12,2,0,4,4,0,2,1,2,0.56,0.5303,0.64,0.1343,4,2,6 +2336,2011-04-12,2,0,4,5,0,2,1,2,0.54,0.5152,0.68,0.1343,1,15,16 +2337,2011-04-12,2,0,4,6,0,2,1,2,0.54,0.5152,0.68,0.2836,3,55,58 +2338,2011-04-12,2,0,4,7,0,2,1,2,0.54,0.5152,0.68,0.2239,12,177,189 +2339,2011-04-12,2,0,4,8,0,2,1,3,0.48,0.4697,0.88,0.3881,15,211,226 +2340,2011-04-12,2,0,4,9,0,2,1,3,0.46,0.4545,0.94,0.4925,4,50,54 +2341,2011-04-12,2,0,4,10,0,2,1,3,0.5,0.4848,0.82,0.4179,6,34,40 +2342,2011-04-12,2,0,4,11,0,2,1,1,0.52,0.5,0.77,0.3582,11,38,49 +2343,2011-04-12,2,0,4,12,0,2,1,2,0.56,0.5303,0.64,0.4627,9,83,92 +2344,2011-04-12,2,0,4,13,0,2,1,3,0.54,0.5152,0.68,0.2836,22,84,106 +2345,2011-04-12,2,0,4,14,0,2,1,2,0.5,0.4848,0.82,0.2836,16,28,44 +2346,2011-04-12,2,0,4,15,0,2,1,2,0.48,0.4697,0.77,0.5224,24,54,78 +2347,2011-04-12,2,0,4,16,0,2,1,2,0.48,0.4697,0.77,0.3881,19,80,99 +2348,2011-04-12,2,0,4,17,0,2,1,3,0.44,0.4394,0.77,0.2239,29,262,291 +2349,2011-04-12,2,0,4,18,0,2,1,3,0.44,0.4394,0.77,0.2239,21,203,224 
+2350,2011-04-12,2,0,4,19,0,2,1,2,0.42,0.4242,0.82,0.2836,3,127,130 +2351,2011-04-12,2,0,4,20,0,2,1,2,0.42,0.4242,0.82,0.2836,9,94,103 +2352,2011-04-12,2,0,4,21,0,2,1,2,0.42,0.4242,0.82,0.194,10,74,84 +2353,2011-04-12,2,0,4,22,0,2,1,2,0.4,0.4091,0.94,0.2985,7,47,54 +2354,2011-04-12,2,0,4,23,0,2,1,2,0.4,0.4091,0.94,0.2537,8,31,39 +2355,2011-04-13,2,0,4,0,0,3,1,2,0.4,0.4091,1,0.2985,3,12,15 +2356,2011-04-13,2,0,4,1,0,3,1,2,0.4,0.4091,1,0.2985,0,1,1 +2357,2011-04-13,2,0,4,2,0,3,1,3,0.4,0.4091,1,0.2985,0,2,2 +2358,2011-04-13,2,0,4,3,0,3,1,3,0.4,0.4091,0.94,0.3284,0,2,2 +2359,2011-04-13,2,0,4,4,0,3,1,3,0.4,0.4091,0.94,0.3284,1,2,3 +2360,2011-04-13,2,0,4,5,0,3,1,3,0.38,0.3939,0.94,0.2537,1,4,5 +2361,2011-04-13,2,0,4,6,0,3,1,3,0.38,0.3939,0.94,0.2239,1,33,34 +2362,2011-04-13,2,0,4,7,0,3,1,3,0.36,0.3485,1,0.2239,3,67,70 +2363,2011-04-13,2,0,4,8,0,3,1,3,0.38,0.3939,0.94,0.194,6,158,164 +2364,2011-04-13,2,0,4,9,0,3,1,2,0.38,0.3939,0.94,0.2239,2,63,65 +2365,2011-04-13,2,0,4,10,0,3,1,2,0.4,0.4091,0.87,0.194,4,42,46 +2366,2011-04-13,2,0,4,11,0,3,1,2,0.42,0.4242,0.82,0.194,4,57,61 +2367,2011-04-13,2,0,4,12,0,3,1,3,0.42,0.4242,0.82,0.2537,12,83,95 +2368,2011-04-13,2,0,4,13,0,3,1,2,0.44,0.4394,0.77,0.194,6,55,61 +2369,2011-04-13,2,0,4,14,0,3,1,2,0.42,0.4242,0.77,0.2537,24,82,106 +2370,2011-04-13,2,0,4,15,0,3,1,2,0.44,0.4394,0.72,0.2836,12,69,81 +2371,2011-04-13,2,0,4,16,0,3,1,1,0.46,0.4545,0.67,0.2836,23,119,142 +2372,2011-04-13,2,0,4,17,0,3,1,1,0.46,0.4545,0.67,0.2985,25,284,309 +2373,2011-04-13,2,0,4,18,0,3,1,1,0.44,0.4394,0.62,0.3881,28,293,321 +2374,2011-04-13,2,0,4,19,0,3,1,1,0.44,0.4394,0.62,0.2537,15,160,175 +2375,2011-04-13,2,0,4,20,0,3,1,1,0.44,0.4394,0.62,0.2537,7,119,126 +2376,2011-04-13,2,0,4,21,0,3,1,1,0.42,0.4242,0.67,0.1642,17,100,117 +2377,2011-04-13,2,0,4,22,0,3,1,1,0.42,0.4242,0.67,0.194,10,89,99 +2378,2011-04-13,2,0,4,23,0,3,1,1,0.4,0.4091,0.71,0.1343,5,57,62 +2379,2011-04-14,2,0,4,0,0,4,1,1,0.38,0.3939,0.76,0.1343,3,18,21 
+2380,2011-04-14,2,0,4,1,0,4,1,1,0.38,0.3939,0.76,0.2239,1,9,10 +2381,2011-04-14,2,0,4,2,0,4,1,1,0.36,0.3485,0.76,0.1343,0,3,3 +2382,2011-04-14,2,0,4,3,0,4,1,1,0.34,0.3333,0.81,0.1343,0,2,2 +2383,2011-04-14,2,0,4,4,0,4,1,1,0.34,0.3636,0.76,0,1,8,9 +2384,2011-04-14,2,0,4,5,0,4,1,1,0.34,0.3485,0.76,0.0896,2,12,14 +2385,2011-04-14,2,0,4,6,0,4,1,1,0.34,0.3485,0.76,0.1045,4,66,70 +2386,2011-04-14,2,0,4,7,0,4,1,1,0.38,0.3939,0.66,0,11,182,193 +2387,2011-04-14,2,0,4,8,0,4,1,1,0.42,0.4242,0.58,0.1642,21,316,337 +2388,2011-04-14,2,0,4,9,0,4,1,1,0.46,0.4545,0.51,0.1343,18,152,170 +2389,2011-04-14,2,0,4,10,0,4,1,1,0.5,0.4848,0.51,0.194,21,68,89 +2390,2011-04-14,2,0,4,11,0,4,1,1,0.52,0.5,0.45,0.1642,28,87,115 +2391,2011-04-14,2,0,4,12,0,4,1,1,0.54,0.5152,0.42,0.1343,35,110,145 +2392,2011-04-14,2,0,4,13,0,4,1,1,0.56,0.5303,0.37,0.2239,38,121,159 +2393,2011-04-14,2,0,4,14,0,4,1,1,0.56,0.5303,0.35,0.0896,36,101,137 +2394,2011-04-14,2,0,4,15,0,4,1,1,0.6,0.6212,0.31,0.0896,57,85,142 +2395,2011-04-14,2,0,4,16,0,4,1,1,0.6,0.6061,0.26,0.1045,49,153,202 +2396,2011-04-14,2,0,4,17,0,4,1,1,0.6,0.6061,0.28,0,50,338,388 +2397,2011-04-14,2,0,4,18,0,4,1,1,0.56,0.5303,0.3,0,47,290,337 +2398,2011-04-14,2,0,4,19,0,4,1,1,0.54,0.5152,0.32,0,37,222,259 +2399,2011-04-14,2,0,4,20,0,4,1,1,0.5,0.4848,0.51,0.194,22,147,169 +2400,2011-04-14,2,0,4,21,0,4,1,1,0.46,0.4545,0.63,0.1343,19,126,145 +2401,2011-04-14,2,0,4,22,0,4,1,1,0.48,0.4697,0.55,0.1045,22,82,104 +2402,2011-04-14,2,0,4,23,0,4,1,1,0.46,0.4545,0.59,0.1045,7,40,47 +2403,2011-04-15,2,0,4,0,1,5,0,1,0.44,0.4394,0.67,0,6,21,27 +2404,2011-04-15,2,0,4,1,1,5,0,1,0.44,0.4394,0.62,0,6,9,15 +2405,2011-04-15,2,0,4,2,1,5,0,1,0.4,0.4091,0.76,0,8,10,18 +2406,2011-04-15,2,0,4,3,1,5,0,1,0.4,0.4091,0.76,0,0,3,3 +2407,2011-04-15,2,0,4,4,1,5,0,1,0.38,0.3939,0.82,0.0896,0,3,3 +2408,2011-04-15,2,0,4,5,1,5,0,1,0.36,0.3636,0.81,0.1045,2,11,13 +2409,2011-04-15,2,0,4,6,1,5,0,1,0.36,0.3636,0.81,0.0896,5,42,47 
+2410,2011-04-15,2,0,4,7,1,5,0,1,0.4,0.4091,0.71,0.1642,10,139,149 +2411,2011-04-15,2,0,4,8,1,5,0,1,0.44,0.4394,0.62,0.2985,21,279,300 +2412,2011-04-15,2,0,4,9,1,5,0,1,0.5,0.4848,0.55,0.194,16,162,178 +2413,2011-04-15,2,0,4,10,1,5,0,1,0.5,0.4848,0.55,0.194,31,91,122 +2414,2011-04-15,2,0,4,11,1,5,0,1,0.52,0.5,0.55,0.194,41,95,136 +2415,2011-04-15,2,0,4,12,1,5,0,1,0.52,0.5,0.55,0.2836,45,155,200 +2416,2011-04-15,2,0,4,13,1,5,0,1,0.54,0.5152,0.52,0.2537,47,133,180 +2417,2011-04-15,2,0,4,14,1,5,0,1,0.54,0.5152,0.52,0.2537,57,106,163 +2418,2011-04-15,2,0,4,15,1,5,0,1,0.54,0.5152,0.52,0.4478,50,112,162 +2419,2011-04-15,2,0,4,16,1,5,0,1,0.52,0.5,0.55,0.4925,70,173,243 +2420,2011-04-15,2,0,4,17,1,5,0,2,0.5,0.4848,0.63,0.3881,64,267,331 +2421,2011-04-15,2,0,4,18,1,5,0,2,0.46,0.4545,0.67,0.3881,47,216,263 +2422,2011-04-15,2,0,4,19,1,5,0,1,0.42,0.4242,0.77,0.3881,41,168,209 +2423,2011-04-15,2,0,4,20,1,5,0,2,0.4,0.4091,0.76,0.3284,26,92,118 +2424,2011-04-15,2,0,4,21,1,5,0,2,0.4,0.4091,0.76,0.2985,25,77,102 +2425,2011-04-15,2,0,4,22,1,5,0,2,0.38,0.3939,0.82,0.3881,16,64,80 +2426,2011-04-15,2,0,4,23,1,5,0,2,0.36,0.3485,0.81,0.194,8,56,64 +2427,2011-04-16,2,0,4,0,0,6,0,2,0.36,0.3485,0.81,0.2239,7,36,43 +2428,2011-04-16,2,0,4,1,0,6,0,2,0.36,0.3333,0.87,0.2836,5,28,33 +2429,2011-04-16,2,0,4,2,0,6,0,2,0.36,0.3485,0.87,0.1343,5,19,24 +2430,2011-04-16,2,0,4,3,0,6,0,2,0.36,0.3485,0.87,0.1642,5,9,14 +2431,2011-04-16,2,0,4,4,0,6,0,2,0.36,0.3485,0.87,0.2239,1,4,5 +2432,2011-04-16,2,0,4,5,0,6,0,2,0.38,0.3939,0.87,0.2836,1,3,4 +2433,2011-04-16,2,0,4,6,0,6,0,3,0.4,0.4091,0.82,0.3284,1,9,10 +2434,2011-04-16,2,0,4,7,0,6,0,2,0.4,0.4091,0.82,0.4179,3,18,21 +2435,2011-04-16,2,0,4,8,0,6,0,3,0.4,0.4091,0.87,0.3582,7,38,45 +2436,2011-04-16,2,0,4,9,0,6,0,3,0.4,0.4091,1,0.2985,2,27,29 +2437,2011-04-16,2,0,4,10,0,6,0,3,0.42,0.4242,0.94,0.3881,1,21,22 +2438,2011-04-16,2,0,4,11,0,6,0,3,0.42,0.4242,0.94,0.4478,4,27,31 +2439,2011-04-16,2,0,4,12,0,6,0,2,0.46,0.4545,0.88,0.5821,6,26,32 
+2440,2011-04-16,2,0,4,13,0,6,0,3,0.46,0.4545,0.94,0.5224,9,49,58 +2441,2011-04-16,2,0,4,14,0,6,0,3,0.52,0.5,0.83,0.5821,14,49,63 +2442,2011-04-16,2,0,4,15,0,6,0,3,0.52,0.5,0.83,0.5821,16,62,78 +2443,2011-04-16,2,0,4,16,0,6,0,3,0.5,0.4848,0.88,0.4627,6,27,33 +2444,2011-04-16,2,0,4,17,0,6,0,3,0.5,0.4848,0.88,0.5821,1,14,15 +2445,2011-04-16,2,0,4,18,0,6,0,2,0.5,0.4848,0.94,0.3881,6,32,38 +2446,2011-04-16,2,0,4,19,0,6,0,2,0.52,0.5,0.94,0.4179,9,62,71 +2447,2011-04-16,2,0,4,20,0,6,0,3,0.44,0.4394,0.94,0.2239,4,49,53 +2448,2011-04-16,2,0,4,21,0,6,0,2,0.44,0.4394,0.94,0,1,13,14 +2449,2011-04-16,2,0,4,22,0,6,0,3,0.42,0.4242,1,0,2,22,24 +2450,2011-04-16,2,0,4,23,0,6,0,2,0.44,0.4394,0.77,0.2836,5,30,35 +2451,2011-04-17,2,0,4,0,0,0,0,2,0.44,0.4394,0.77,0.2836,4,29,33 +2452,2011-04-17,2,0,4,1,0,0,0,1,0.42,0.4242,0.67,0.2239,6,25,31 +2453,2011-04-17,2,0,4,2,0,0,0,1,0.4,0.4091,0.62,0.194,4,25,29 +2454,2011-04-17,2,0,4,3,0,0,0,1,0.4,0.4091,0.66,0.1343,12,13,25 +2455,2011-04-17,2,0,4,4,0,0,0,1,0.36,0.3636,0.76,0.1045,2,5,7 +2456,2011-04-17,2,0,4,5,0,0,0,1,0.36,0.3485,0.71,0.1642,2,1,3 +2457,2011-04-17,2,0,4,6,0,0,0,1,0.36,0.3182,0.66,0.4478,3,5,8 +2458,2011-04-17,2,0,4,7,0,0,0,1,0.36,0.3333,0.5,0.3582,2,14,16 +2459,2011-04-17,2,0,4,8,0,0,0,1,0.38,0.3939,0.46,0.3881,7,36,43 +2460,2011-04-17,2,0,4,9,0,0,0,1,0.4,0.4091,0.43,0.2836,31,71,102 +2461,2011-04-17,2,0,4,10,0,0,0,1,0.42,0.4242,0.44,0.4478,91,120,211 +2462,2011-04-17,2,0,4,11,0,0,0,1,0.46,0.4545,0.41,0.4179,119,185,304 +2463,2011-04-17,2,0,4,12,0,0,0,1,0.46,0.4545,0.38,0.3881,167,187,354 +2464,2011-04-17,2,0,4,13,0,0,0,1,0.5,0.4848,0.34,0.3881,181,162,343 +2465,2011-04-17,2,0,4,14,0,0,0,1,0.52,0.5,0.34,0.3881,170,191,361 +2466,2011-04-17,2,0,4,15,0,0,0,1,0.54,0.5152,0.32,0.5224,179,209,388 +2467,2011-04-17,2,0,4,16,0,0,0,1,0.54,0.5152,0.3,0.3582,161,182,343 +2468,2011-04-17,2,0,4,17,0,0,0,1,0.56,0.5303,0.3,0.4478,143,163,306 +2469,2011-04-17,2,0,4,18,0,0,0,1,0.56,0.5303,0.3,0.3881,102,175,277 
+2470,2011-04-17,2,0,4,19,0,0,0,1,0.56,0.5303,0.3,0.2985,75,135,210 +2471,2011-04-17,2,0,4,20,0,0,0,1,0.52,0.5,0.36,0.194,44,97,141 +2472,2011-04-17,2,0,4,21,0,0,0,1,0.5,0.4848,0.42,0.194,13,70,83 +2473,2011-04-17,2,0,4,22,0,0,0,1,0.5,0.4848,0.39,0.1045,21,49,70 +2474,2011-04-17,2,0,4,23,0,0,0,1,0.44,0.4394,0.67,0.1642,19,37,56 +2475,2011-04-18,2,0,4,0,0,1,1,1,0.46,0.4545,0.47,0.0896,15,24,39 +2476,2011-04-18,2,0,4,1,0,1,1,1,0.46,0.4545,0.47,0.1343,11,8,19 +2477,2011-04-18,2,0,4,2,0,1,1,1,0.42,0.4242,0.67,0.1045,14,2,16 +2478,2011-04-18,2,0,4,3,0,1,1,1,0.42,0.4242,0.62,0.1045,6,1,7 +2479,2011-04-18,2,0,4,4,0,1,1,1,0.4,0.4091,0.87,0.1642,2,4,6 +2480,2011-04-18,2,0,4,5,0,1,1,1,0.4,0.4091,0.76,0.0896,0,16,16 +2481,2011-04-18,2,0,4,6,0,1,1,1,0.4,0.4091,0.82,0,3,51,54 +2482,2011-04-18,2,0,4,7,0,1,1,1,0.46,0.4545,0.63,0,12,156,168 +2483,2011-04-18,2,0,4,8,0,1,1,1,0.46,0.4545,0.63,0,20,277,297 +2484,2011-04-18,2,0,4,9,0,1,1,2,0.5,0.4848,0.55,0.1045,37,132,169 +2485,2011-04-18,2,0,4,10,0,1,1,2,0.52,0.5,0.55,0.1045,41,68,109 +2486,2011-04-18,2,0,4,11,0,1,1,2,0.54,0.5152,0.52,0.0896,36,87,123 +2487,2011-04-18,2,0,4,12,0,1,1,2,0.56,0.5303,0.49,0.0896,60,124,184 +2488,2011-04-18,2,0,4,13,0,1,1,2,0.56,0.5303,0.49,0.1642,41,128,169 +2489,2011-04-18,2,0,4,14,0,1,1,2,0.58,0.5455,0.49,0.194,49,95,144 +2490,2011-04-18,2,0,4,15,0,1,1,2,0.6,0.6212,0.43,0.194,64,112,176 +2491,2011-04-18,2,0,4,16,0,1,1,1,0.6,0.6212,0.46,0.2836,40,165,205 +2492,2011-04-18,2,0,4,17,0,1,1,1,0.58,0.5455,0.49,0.2836,66,362,428 +2493,2011-04-18,2,0,4,18,0,1,1,1,0.64,0.6212,0.31,0.3582,41,321,362 +2494,2011-04-18,2,0,4,19,0,1,1,1,0.56,0.5303,0.46,0.2537,42,244,286 +2495,2011-04-18,2,0,4,20,0,1,1,1,0.6,0.6212,0.33,0.2985,29,136,165 +2496,2011-04-18,2,0,4,21,0,1,1,1,0.56,0.5303,0.37,0.2836,20,131,151 +2497,2011-04-18,2,0,4,22,0,1,1,1,0.52,0.5,0.55,0.2836,9,81,90 +2498,2011-04-18,2,0,4,23,0,1,1,1,0.5,0.4848,0.59,0.2537,11,35,46 +2499,2011-04-19,2,0,4,0,0,2,1,1,0.5,0.4848,0.55,0.2239,2,23,25 
+2500,2011-04-19,2,0,4,1,0,2,1,1,0.46,0.4545,0.63,0.2239,5,2,7 +2501,2011-04-19,2,0,4,2,0,2,1,1,0.46,0.4545,0.67,0.2836,8,5,13 +2502,2011-04-19,2,0,4,3,0,2,1,1,0.48,0.4697,0.63,0.2239,0,3,3 +2503,2011-04-19,2,0,4,4,0,2,1,1,0.46,0.4545,0.67,0.0896,1,4,5 +2504,2011-04-19,2,0,4,5,0,2,1,2,0.46,0.4545,0.72,0.194,1,17,18 +2505,2011-04-19,2,0,4,6,0,2,1,2,0.48,0.4697,0.67,0,2,63,65 +2506,2011-04-19,2,0,4,7,0,2,1,2,0.52,0.5,0.55,0,22,166,188 +2507,2011-04-19,2,0,4,8,0,2,1,2,0.5,0.4848,0.68,0.1343,20,331,351 +2508,2011-04-19,2,0,4,9,0,2,1,3,0.52,0.5,0.63,0.0896,23,139,162 +2509,2011-04-19,2,0,4,10,0,2,1,2,0.5,0.4848,0.72,0.2239,15,60,75 +2510,2011-04-19,2,0,4,11,0,2,1,2,0.54,0.5152,0.68,0.1642,19,53,72 +2511,2011-04-19,2,0,4,12,0,2,1,2,0.54,0.5152,0.64,0.0896,19,84,103 +2512,2011-04-19,2,0,4,13,0,2,1,2,0.54,0.5152,0.68,0.2239,26,103,129 +2513,2011-04-19,2,0,4,14,0,2,1,2,0.54,0.5152,0.64,0.194,19,88,107 +2514,2011-04-19,2,0,4,15,0,2,1,2,0.54,0.5152,0.65,0.1642,44,83,127 +2515,2011-04-19,2,0,4,16,0,2,1,1,0.56,0.5303,0.52,0.2985,30,162,192 +2516,2011-04-19,2,0,4,17,0,2,1,1,0.56,0.5303,0.6,0.1045,39,372,411 +2517,2011-04-19,2,0,4,18,0,2,1,2,0.54,0.5152,0.64,0.1343,44,377,421 +2518,2011-04-19,2,0,4,19,0,2,1,2,0.5,0.4848,0.72,0.194,28,248,276 +2519,2011-04-19,2,0,4,20,0,2,1,2,0.5,0.4848,0.72,0.2239,20,148,168 +2520,2011-04-19,2,0,4,21,0,2,1,2,0.5,0.4848,0.72,0.1343,8,114,122 +2521,2011-04-19,2,0,4,22,0,2,1,1,0.48,0.4697,0.77,0.0896,3,109,112 +2522,2011-04-19,2,0,4,23,0,2,1,1,0.46,0.4545,0.88,0.0896,11,41,52 +2523,2011-04-20,2,0,4,0,0,3,1,1,0.44,0.4394,0.88,0.1343,4,29,33 +2524,2011-04-20,2,0,4,1,0,3,1,1,0.42,0.4242,0.94,0.1642,2,5,7 +2525,2011-04-20,2,0,4,2,0,3,1,1,0.42,0.4242,0.94,0.1642,0,2,2 +2526,2011-04-20,2,0,4,3,0,3,1,1,0.42,0.4242,0.94,0,0,2,2 +2527,2011-04-20,2,0,4,4,0,3,1,1,0.4,0.4091,1,0.1045,5,4,9 +2528,2011-04-20,2,0,4,5,0,3,1,1,0.4,0.4091,0.94,0.194,1,14,15 +2529,2011-04-20,2,0,4,6,0,3,1,1,0.42,0.4242,0.94,0.1642,2,62,64 
+2530,2011-04-20,2,0,4,7,0,3,1,1,0.44,0.4394,0.94,0,26,211,237 +2531,2011-04-20,2,0,4,8,0,3,1,1,0.62,0.6061,0.69,0.4627,22,374,396 +2532,2011-04-20,2,0,4,9,0,3,1,1,0.66,0.6212,0.61,0.2836,27,170,197 +2533,2011-04-20,2,0,4,10,0,3,1,1,0.62,0.6061,0.69,0.0896,45,86,131 +2534,2011-04-20,2,0,4,11,0,3,1,1,0.62,0.6061,0.69,0.0896,31,87,118 +2535,2011-04-20,2,0,4,12,0,3,1,1,0.7,0.6515,0.48,0.3881,28,117,145 +2536,2011-04-20,2,0,4,13,0,3,1,1,0.7,0.6515,0.48,0.3881,42,135,177 +2537,2011-04-20,2,0,4,14,0,3,1,1,0.74,0.6515,0.4,0.4627,33,115,148 +2538,2011-04-20,2,0,4,15,0,3,1,2,0.76,0.6667,0.35,0.3582,32,113,145 +2539,2011-04-20,2,0,4,16,0,3,1,2,0.76,0.6667,0.35,0.3582,54,201,255 +2540,2011-04-20,2,0,4,17,0,3,1,2,0.74,0.6515,0.37,0.4179,34,398,432 +2541,2011-04-20,2,0,4,18,0,3,1,1,0.74,0.6515,0.3,0.3582,56,385,441 +2542,2011-04-20,2,0,4,19,0,3,1,1,0.7,0.6364,0.32,0.2537,74,309,383 +2543,2011-04-20,2,0,4,20,0,3,1,1,0.68,0.6212,0.32,0.1045,31,194,225 +2544,2011-04-20,2,0,4,21,0,3,1,1,0.66,0.6212,0.36,0.2836,38,134,172 +2545,2011-04-20,2,0,4,22,0,3,1,1,0.62,0.6212,0.41,0.2537,12,105,117 +2546,2011-04-20,2,0,4,23,0,3,1,1,0.6,0.6212,0.4,0.3284,14,79,93 +2547,2011-04-21,2,0,4,0,0,4,1,1,0.56,0.5303,0.43,0.3582,11,33,44 +2548,2011-04-21,2,0,4,1,0,4,1,1,0.52,0.5,0.45,0.2239,6,20,26 +2549,2011-04-21,2,0,4,2,0,4,1,1,0.5,0.4848,0.42,0.3881,3,10,13 +2550,2011-04-21,2,0,4,3,0,4,1,2,0.46,0.4545,0.51,0.4627,4,3,7 +2551,2011-04-21,2,0,4,4,0,4,1,2,0.44,0.4394,0.54,0.4478,2,4,6 +2552,2011-04-21,2,0,4,5,0,4,1,2,0.42,0.4242,0.54,0.5522,0,14,14 +2553,2011-04-21,2,0,4,6,0,4,1,2,0.4,0.4091,0.58,0.5821,2,73,75 +2554,2011-04-21,2,0,4,7,0,4,1,2,0.42,0.4242,0.47,0.4627,15,183,198 +2555,2011-04-21,2,0,4,8,0,4,1,2,0.4,0.4091,0.5,0.4925,19,346,365 +2556,2011-04-21,2,0,4,9,0,4,1,2,0.42,0.4242,0.47,0.2836,18,178,196 +2557,2011-04-21,2,0,4,10,0,4,1,1,0.42,0.4242,0.44,0.3582,25,73,98 +2558,2011-04-21,2,0,4,11,0,4,1,1,0.44,0.4394,0.41,0.2836,34,99,133 
+2559,2011-04-21,2,0,4,12,0,4,1,1,0.46,0.4545,0.38,0.1642,49,150,199 +2560,2011-04-21,2,0,4,13,0,4,1,1,0.48,0.4697,0.33,0.3284,55,122,177 +2561,2011-04-21,2,0,4,14,0,4,1,1,0.5,0.4848,0.34,0.3881,68,160,228 +2562,2011-04-21,2,0,4,15,0,4,1,1,0.52,0.5,0.29,0.4179,69,166,235 +2563,2011-04-21,2,0,4,16,0,4,1,1,0.52,0.5,0.27,0.3582,66,228,294 +2564,2011-04-21,2,0,4,17,0,4,1,1,0.5,0.4848,0.29,0.1642,79,402,481 +2565,2011-04-21,2,0,4,18,0,4,1,1,0.5,0.4848,0.25,0.2836,71,381,452 +2566,2011-04-21,2,0,4,19,0,4,1,1,0.46,0.4545,0.28,0.2836,52,272,324 +2567,2011-04-21,2,0,4,20,0,4,1,1,0.44,0.4394,0.3,0.1343,28,176,204 +2568,2011-04-21,2,0,4,21,0,4,1,1,0.42,0.4242,0.47,0.194,25,159,184 +2569,2011-04-21,2,0,4,22,0,4,1,1,0.42,0.4242,0.38,0.0896,31,106,137 +2570,2011-04-21,2,0,4,23,0,4,1,1,0.4,0.4091,0.43,0.1045,13,86,99 +2571,2011-04-22,2,0,4,0,0,5,1,1,0.36,0.3485,0.62,0.194,3,30,33 +2572,2011-04-22,2,0,4,1,0,5,1,1,0.36,0.3333,0.53,0.2537,2,22,24 +2573,2011-04-22,2,0,4,2,0,5,1,1,0.36,0.3333,0.53,0.2985,2,7,9 +2574,2011-04-22,2,0,4,3,0,5,1,1,0.36,0.3333,0.53,0.2985,0,2,2 +2575,2011-04-22,2,0,4,4,0,5,1,1,0.34,0.3182,0.53,0.2836,1,2,3 +2576,2011-04-22,2,0,4,5,0,5,1,1,0.34,0.3182,0.53,0.2239,0,16,16 +2577,2011-04-22,2,0,4,6,0,5,1,1,0.34,0.3182,0.53,0.2836,2,47,49 +2578,2011-04-22,2,0,4,7,0,5,1,2,0.34,0.3182,0.53,0.2836,5,175,180 +2579,2011-04-22,2,0,4,8,0,5,1,2,0.34,0.3182,0.53,0.2537,15,331,346 +2580,2011-04-22,2,0,4,9,0,5,1,3,0.34,0.303,0.61,0.2985,17,178,195 +2581,2011-04-22,2,0,4,10,0,5,1,2,0.36,0.3485,0.62,0.194,32,104,136 +2582,2011-04-22,2,0,4,11,0,5,1,3,0.34,0.3333,0.66,0.194,23,83,106 +2583,2011-04-22,2,0,4,12,0,5,1,3,0.34,0.3333,0.76,0.1343,19,61,80 +2584,2011-04-22,2,0,4,13,0,5,1,3,0.34,0.3333,0.87,0.194,3,37,40 +2585,2011-04-22,2,0,4,14,0,5,1,3,0.34,0.3182,0.87,0.2537,3,33,36 +2586,2011-04-22,2,0,4,15,0,5,1,3,0.34,0.3333,0.87,0.194,13,62,75 +2587,2011-04-22,2,0,4,16,0,5,1,3,0.34,0.3333,0.87,0.1343,4,58,62 +2588,2011-04-22,2,0,4,17,0,5,1,3,0.32,0.303,0.93,0.2239,6,63,69 
+2589,2011-04-22,2,0,4,18,0,5,1,3,0.32,0.303,0.93,0.2239,4,47,51 +2590,2011-04-22,2,0,4,19,0,5,1,3,0.32,0.303,0.93,0.2239,7,48,55 +2591,2011-04-22,2,0,4,20,0,5,1,3,0.32,0.303,0.87,0.2239,8,39,47 +2592,2011-04-22,2,0,4,21,0,5,1,3,0.3,0.303,0.93,0.1642,5,23,28 +2593,2011-04-22,2,0,4,22,0,5,1,3,0.3,0.303,1,0.1343,3,21,24 +2594,2011-04-22,2,0,4,23,0,5,1,3,0.32,0.3333,0.93,0.1045,0,17,17 +2595,2011-04-23,2,0,4,0,0,6,0,2,0.32,0.3182,1,0.194,0,18,18 +2596,2011-04-23,2,0,4,1,0,6,0,2,0.32,0.3182,1,0.194,5,11,16 +2597,2011-04-23,2,0,4,2,0,6,0,3,0.32,0.3333,1,0.1343,3,14,17 +2598,2011-04-23,2,0,4,3,0,6,0,3,0.32,0.3333,1,0.0896,0,4,4 +2599,2011-04-23,2,0,4,4,0,6,0,3,0.34,0.3636,1,0,2,3,5 +2600,2011-04-23,2,0,4,5,0,6,0,3,0.34,0.3636,1,0,1,6,7 +2601,2011-04-23,2,0,4,6,0,6,0,2,0.34,0.3333,1,0.1642,0,11,11 +2602,2011-04-23,2,0,4,7,0,6,0,3,0.34,0.3333,1,0.1642,2,17,19 +2603,2011-04-23,2,0,4,8,0,6,0,2,0.36,0.3333,1,0.2537,15,32,47 +2604,2011-04-23,2,0,4,9,0,6,0,2,0.38,0.3939,0.94,0.2985,13,55,68 +2605,2011-04-23,2,0,4,10,0,6,0,2,0.42,0.4242,0.88,0.2239,36,93,129 +2606,2011-04-23,2,0,4,11,0,6,0,2,0.46,0.4545,0.88,0.2239,68,164,232 +2607,2011-04-23,2,0,4,12,0,6,0,1,0.52,0.5,0.83,0.4179,94,197,291 +2608,2011-04-23,2,0,4,13,0,6,0,1,0.52,0.5,0.83,0.3881,126,186,312 +2609,2011-04-23,2,0,4,14,0,6,0,1,0.58,0.5455,0.78,0.3582,182,209,391 +2610,2011-04-23,2,0,4,15,0,6,0,1,0.6,0.5909,0.73,0.3881,171,226,397 +2611,2011-04-23,2,0,4,16,0,6,0,1,0.6,0.5909,0.73,0.2836,180,246,426 +2612,2011-04-23,2,0,4,17,0,6,0,1,0.6,0.5909,0.73,0.3284,168,215,383 +2613,2011-04-23,2,0,4,18,0,6,0,1,0.58,0.5455,0.78,0.194,149,227,376 +2614,2011-04-23,2,0,4,19,0,6,0,1,0.56,0.5303,0.83,0.2836,84,199,283 +2615,2011-04-23,2,0,4,20,0,6,0,2,0.54,0.5152,0.88,0.2537,45,138,183 +2616,2011-04-23,2,0,4,21,0,6,0,2,0.56,0.5303,0.83,0.2836,32,103,135 +2617,2011-04-23,2,0,4,22,0,6,0,2,0.58,0.5455,0.78,0.2537,55,114,169 +2618,2011-04-23,2,0,4,23,0,6,0,2,0.54,0.5152,0.88,0.1642,31,86,117 
+2619,2011-04-24,2,0,4,0,0,0,0,1,0.52,0.5,0.83,0.2239,30,66,96 +2620,2011-04-24,2,0,4,1,0,0,0,1,0.5,0.4848,0.88,0.194,14,40,54 +2621,2011-04-24,2,0,4,2,0,0,0,1,0.5,0.4848,0.88,0.194,27,45,72 +2622,2011-04-24,2,0,4,3,0,0,0,1,0.5,0.4848,0.94,0.0896,4,20,24 +2623,2011-04-24,2,0,4,4,0,0,0,1,0.5,0.4848,0.94,0.0896,0,5,5 +2624,2011-04-24,2,0,4,5,0,0,0,1,0.5,0.4848,0.94,0.0896,5,7,12 +2625,2011-04-24,2,0,4,6,0,0,0,2,0.5,0.4848,1,0,10,3,13 +2626,2011-04-24,2,0,4,7,0,0,0,2,0.5,0.4848,1,0,16,11,27 +2627,2011-04-24,2,0,4,8,0,0,0,1,0.52,0.5,0.94,0.2239,38,42,80 +2628,2011-04-24,2,0,4,9,0,0,0,1,0.56,0.5303,0.88,0.2537,73,104,177 +2629,2011-04-24,2,0,4,10,0,0,0,1,0.6,0.5758,0.78,0.1642,118,171,289 +2630,2011-04-24,2,0,4,11,0,0,0,1,0.66,0.6212,0.65,0.1343,124,193,317 +2631,2011-04-24,2,0,4,12,0,0,0,1,0.68,0.6364,0.65,0.2537,168,220,388 +2632,2011-04-24,2,0,4,13,0,0,0,1,0.7,0.6515,0.58,0.2836,205,236,441 +2633,2011-04-24,2,0,4,14,0,0,0,1,0.7,0.6515,0.54,0.3284,197,223,420 +2634,2011-04-24,2,0,4,15,0,0,0,1,0.66,0.6212,0.61,0.2537,167,202,369 +2635,2011-04-24,2,0,4,16,0,0,0,1,0.74,0.6667,0.48,0.3881,162,197,359 +2636,2011-04-24,2,0,4,17,0,0,0,1,0.66,0.6212,0.65,0.2985,142,189,331 +2637,2011-04-24,2,0,4,18,0,0,0,1,0.64,0.6061,0.69,0.1642,96,174,270 +2638,2011-04-24,2,0,4,19,0,0,0,3,0.6,0.5606,0.83,0.2836,62,132,194 +2639,2011-04-24,2,0,4,20,0,0,0,3,0.6,0.5606,0.83,0.2836,34,71,105 +2640,2011-04-24,2,0,4,21,0,0,0,3,0.54,0.5152,0.94,0.1642,8,53,61 +2641,2011-04-24,2,0,4,22,0,0,0,3,0.54,0.5152,1,0.1642,5,40,45 +2642,2011-04-24,2,0,4,23,0,0,0,3,0.54,0.5152,1,0.0896,5,37,42 +2643,2011-04-25,2,0,4,0,0,1,1,3,0.52,0.5,1,0,2,10,12 +2644,2011-04-25,2,0,4,1,0,1,1,1,0.54,0.5152,1,0.0896,2,8,10 +2645,2011-04-25,2,0,4,2,0,1,1,1,0.54,0.5152,1,0.0896,1,9,10 +2646,2011-04-25,2,0,4,3,0,1,1,2,0.5,0.4848,1,0.1045,2,8,10 +2647,2011-04-25,2,0,4,4,0,1,1,2,0.52,0.5,1,0,1,4,5 +2648,2011-04-25,2,0,4,5,0,1,1,2,0.46,0.4545,1,0,3,14,17 +2649,2011-04-25,2,0,4,6,0,1,1,2,0.5,0.4848,1,0,7,59,66 
+2650,2011-04-25,2,0,4,7,0,1,1,2,0.52,0.5,1,0.0896,13,183,196 +2651,2011-04-25,2,0,4,8,0,1,1,1,0.56,0.5303,0.88,0.1045,27,326,353 +2652,2011-04-25,2,0,4,9,0,1,1,1,0.6,0.5606,0.83,0.1343,27,147,174 +2653,2011-04-25,2,0,4,10,0,1,1,1,0.64,0.6061,0.73,0.2239,61,64,125 +2654,2011-04-25,2,0,4,11,0,1,1,1,0.64,0.6061,0.73,0.2239,53,94,147 +2655,2011-04-25,2,0,4,12,0,1,1,1,0.66,0.6212,0.74,0.2537,48,140,188 +2656,2011-04-25,2,0,4,13,0,1,1,1,0.7,0.6515,0.65,0.2239,57,146,203 +2657,2011-04-25,2,0,4,14,0,1,1,1,0.72,0.6667,0.54,0.2239,47,99,146 +2658,2011-04-25,2,0,4,15,0,1,1,1,0.74,0.6667,0.51,0.2239,50,125,175 +2659,2011-04-25,2,0,4,16,0,1,1,1,0.7,0.6515,0.54,0.3582,41,198,239 +2660,2011-04-25,2,0,4,17,0,1,1,1,0.7,0.6515,0.54,0.2985,80,441,521 +2661,2011-04-25,2,0,4,18,0,1,1,1,0.68,0.6364,0.57,0.3582,74,425,499 +2662,2011-04-25,2,0,4,19,0,1,1,1,0.66,0.6212,0.57,0.3284,62,320,382 +2663,2011-04-25,2,0,4,20,0,1,1,1,0.66,0.6212,0.61,0.2836,42,195,237 +2664,2011-04-25,2,0,4,21,0,1,1,1,0.62,0.6061,0.69,0.2537,41,149,190 +2665,2011-04-25,2,0,4,22,0,1,1,1,0.6,0.5909,0.73,0.2985,23,83,106 +2666,2011-04-25,2,0,4,23,0,1,1,1,0.58,0.5455,0.78,0.2836,9,53,62 +2667,2011-04-26,2,0,4,0,0,2,1,1,0.62,0.6061,0.69,0.2836,10,17,27 +2668,2011-04-26,2,0,4,1,0,2,1,1,0.62,0.5909,0.73,0.2836,16,5,21 +2669,2011-04-26,2,0,4,2,0,2,1,1,0.56,0.5303,0.83,0.3284,17,9,26 +2670,2011-04-26,2,0,4,3,0,2,1,1,0.54,0.5152,0.88,0.2836,17,4,21 +2671,2011-04-26,2,0,4,4,0,2,1,1,0.56,0.5303,0.83,0.2537,2,4,6 +2672,2011-04-26,2,0,4,5,0,2,1,1,0.54,0.5152,0.88,0.2537,0,16,16 +2673,2011-04-26,2,0,4,6,0,2,1,1,0.56,0.5303,0.88,0.2239,0,80,80 +2674,2011-04-26,2,0,4,7,0,2,1,1,0.58,0.5455,0.83,0.2985,16,254,270 +2675,2011-04-26,2,0,4,8,0,2,1,1,0.58,0.5455,0.83,0.2985,32,417,449 +2676,2011-04-26,2,0,4,9,0,2,1,1,0.64,0.6061,0.73,0.3582,35,164,199 +2677,2011-04-26,2,0,4,10,0,2,1,1,0.66,0.6212,0.69,0.3582,22,98,120 +2678,2011-04-26,2,0,4,11,0,2,1,1,0.68,0.6364,0.65,0.3881,40,132,172 
+2679,2011-04-26,2,0,4,12,0,2,1,1,0.7,0.6515,0.61,0.4179,40,146,186 +2680,2011-04-26,2,0,4,13,0,2,1,1,0.74,0.6667,0.51,0.4478,37,139,176 +2681,2011-04-26,2,0,4,14,0,2,1,1,0.72,0.6667,0.58,0.3284,40,115,155 +2682,2011-04-26,2,0,4,15,0,2,1,1,0.7,0.6515,0.61,0.3582,34,119,153 +2683,2011-04-26,2,0,4,16,0,2,1,1,0.7,0.6515,0.58,0.3881,40,251,291 +2684,2011-04-26,2,0,4,17,0,2,1,1,0.68,0.6364,0.61,0.3582,66,455,521 +2685,2011-04-26,2,0,4,18,0,2,1,1,0.68,0.6364,0.65,0.4478,65,463,528 +2686,2011-04-26,2,0,4,19,0,2,1,1,0.64,0.6061,0.73,0.4179,42,286,328 +2687,2011-04-26,2,0,4,20,0,2,1,1,0.64,0.6061,0.73,0.3582,35,199,234 +2688,2011-04-26,2,0,4,21,0,2,1,1,0.62,0.5909,0.78,0.2836,33,162,195 +2689,2011-04-26,2,0,4,22,0,2,1,2,0.6,0.5606,0.83,0.194,32,116,148 +2690,2011-04-26,2,0,4,23,0,2,1,2,0.6,0.5606,0.83,0.2239,7,71,78 +2691,2011-04-27,2,0,4,0,0,3,1,1,0.6,0.5606,0.83,0.2239,3,24,27 +2692,2011-04-27,2,0,4,1,0,3,1,1,0.6,0.5606,0.83,0.2537,2,15,17 +2693,2011-04-27,2,0,4,2,0,3,1,1,0.58,0.5455,0.88,0.2537,0,5,5 +2694,2011-04-27,2,0,4,3,0,3,1,2,0.58,0.5455,0.88,0.2836,3,4,7 +2695,2011-04-27,2,0,4,4,0,3,1,1,0.56,0.5303,0.94,0.2239,0,6,6 +2696,2011-04-27,2,0,4,5,0,3,1,2,0.56,0.5303,0.94,0.2537,1,16,17 +2697,2011-04-27,2,0,4,6,0,3,1,1,0.56,0.5303,0.94,0.2537,5,79,84 +2698,2011-04-27,2,0,4,7,0,3,1,2,0.58,0.5455,0.88,0.2836,17,229,246 +2699,2011-04-27,2,0,4,8,0,3,1,2,0.58,0.5455,0.88,0.3284,31,413,444 +2700,2011-04-27,2,0,4,9,0,3,1,2,0.6,0.5455,0.88,0.4179,20,161,181 +2701,2011-04-27,2,0,4,10,0,3,1,2,0.62,0.5758,0.83,0.2836,26,66,92 +2702,2011-04-27,2,0,4,11,0,3,1,2,0.64,0.5909,0.78,0.2836,53,103,156 +2703,2011-04-27,2,0,4,12,0,3,1,1,0.66,0.6061,0.78,0.3284,38,135,173 +2704,2011-04-27,2,0,4,13,0,3,1,1,0.64,0.5909,0.78,0.2985,31,119,150 +2705,2011-04-27,2,0,4,14,0,3,1,1,0.68,0.6364,0.74,0.2836,29,119,148 +2706,2011-04-27,2,0,4,15,0,3,1,1,0.7,0.6515,0.65,0.4925,18,120,138 +2707,2011-04-27,2,0,4,16,0,3,1,1,0.7,0.6515,0.7,0.3881,29,189,218 
+2708,2011-04-27,2,0,4,17,0,3,1,3,0.66,0.6061,0.83,0.3881,63,458,521 +2709,2011-04-27,2,0,4,18,0,3,1,3,0.66,0.6061,0.83,0.3881,46,366,412 +2710,2011-04-27,2,0,4,19,0,3,1,1,0.62,0.5909,0.78,0.2836,40,220,260 +2711,2011-04-27,2,0,4,20,0,3,1,1,0.64,0.5758,0.83,0.3284,30,188,218 +2712,2011-04-27,2,0,4,21,0,3,1,1,0.62,0.5606,0.88,0.2836,19,126,145 +2713,2011-04-27,2,0,4,22,0,3,1,2,0.62,0.5606,0.88,0.3284,18,111,129 +2714,2011-04-27,2,0,4,23,0,3,1,2,0.62,0.5606,0.88,0.3582,25,53,78 +2715,2011-04-28,2,0,4,0,0,4,1,1,0.64,0.5909,0.78,0.3582,13,41,54 +2716,2011-04-28,2,0,4,1,0,4,1,2,0.62,0.5758,0.83,0.3881,8,12,20 +2717,2011-04-28,2,0,4,2,0,4,1,2,0.62,0.5758,0.83,0.4478,14,7,21 +2718,2011-04-28,2,0,4,3,0,4,1,2,0.64,0.5909,0.78,0.3881,2,1,3 +2719,2011-04-28,2,0,4,4,0,4,1,2,0.62,0.5758,0.83,0.4179,4,5,9 +2720,2011-04-28,2,0,4,5,0,4,1,2,0.62,0.5758,0.83,0.3881,0,13,13 +2721,2011-04-28,2,0,4,6,0,4,1,2,0.64,0.5909,0.78,0.4627,7,79,86 +2722,2011-04-28,2,0,4,7,0,4,1,2,0.64,0.5909,0.78,0.4478,14,201,215 +2723,2011-04-28,2,0,4,8,0,4,1,2,0.66,0.6061,0.78,0.4478,31,367,398 +2724,2011-04-28,2,0,4,9,0,4,1,3,0.62,0.5606,0.88,0.3284,18,165,183 +2725,2011-04-28,2,0,4,10,0,4,1,2,0.62,0.5606,0.88,0.2836,9,45,54 +2726,2011-04-28,2,0,4,11,0,4,1,2,0.62,0.5758,0.83,0.2836,10,74,84 +2727,2011-04-28,2,0,4,12,0,4,1,2,0.62,0.5909,0.78,0.2537,23,84,107 +2728,2011-04-28,2,0,4,13,0,4,1,2,0.62,0.5758,0.83,0.2985,18,103,121 +2729,2011-04-28,2,0,4,14,0,4,1,1,0.62,0.5758,0.83,0.2985,21,121,142 +2730,2011-04-28,2,0,4,15,0,4,1,1,0.66,0.6212,0.69,0.3284,46,117,163 +2731,2011-04-28,2,0,4,16,0,4,1,1,0.62,0.5909,0.78,0.2537,42,228,270 +2732,2011-04-28,2,0,4,17,0,4,1,1,0.64,0.6212,0.47,0.3582,49,406,455 +2733,2011-04-28,2,0,4,18,0,4,1,1,0.64,0.6212,0.47,0.3582,44,486,530 +2734,2011-04-28,2,0,4,19,0,4,1,1,0.62,0.6212,0.43,0.2985,57,347,404 +2735,2011-04-28,2,0,4,20,0,4,1,1,0.58,0.5455,0.43,0.2239,49,220,269 +2736,2011-04-28,2,0,4,21,0,4,1,1,0.56,0.5303,0.46,0.1343,27,167,194 
+2737,2011-04-28,2,0,4,22,0,4,1,1,0.54,0.5152,0.45,0.1642,34,119,153 +2738,2011-04-28,2,0,4,23,0,4,1,1,0.54,0.5152,0.39,0.0896,29,81,110 +2739,2011-04-29,2,0,4,0,0,5,1,1,0.52,0.5,0.39,0.2836,11,37,48 +2740,2011-04-29,2,0,4,1,0,5,1,1,0.5,0.4848,0.45,0.0896,10,21,31 +2741,2011-04-29,2,0,4,2,0,5,1,1,0.46,0.4545,0.63,0.1045,2,8,10 +2742,2011-04-29,2,0,4,3,0,5,1,1,0.46,0.4545,0.59,0.0896,1,4,5 +2743,2011-04-29,2,0,4,4,0,5,1,1,0.46,0.4545,0.51,0.1343,0,6,6 +2744,2011-04-29,2,0,4,5,0,5,1,1,0.46,0.4545,0.47,0.1343,0,25,25 +2745,2011-04-29,2,0,4,6,0,5,1,1,0.5,0.4848,0.45,0.2836,4,67,71 +2746,2011-04-29,2,0,4,7,0,5,1,1,0.52,0.5,0.42,0.194,18,222,240 +2747,2011-04-29,2,0,4,8,0,5,1,1,0.54,0.5152,0.39,0.2985,35,386,421 +2748,2011-04-29,2,0,4,9,0,5,1,1,0.54,0.5152,0.39,0.3284,45,185,230 +2749,2011-04-29,2,0,4,10,0,5,1,1,0.56,0.5303,0.4,0.2836,57,99,156 +2750,2011-04-29,2,0,4,11,0,5,1,1,0.6,0.6212,0.38,0.2239,49,108,157 +2751,2011-04-29,2,0,4,12,0,5,1,1,0.56,0.5303,0.37,0.2836,63,160,223 +2752,2011-04-29,2,0,4,13,0,5,1,1,0.56,0.5303,0.37,0.2985,74,234,308 +2753,2011-04-29,2,0,4,14,0,5,1,1,0.6,0.6212,0.35,0.3284,58,190,248 +2754,2011-04-29,2,0,4,15,0,5,1,2,0.56,0.5303,0.4,0.2985,77,175,252 +2755,2011-04-29,2,0,4,16,0,5,1,2,0.54,0.5152,0.42,0.3582,59,240,299 +2756,2011-04-29,2,0,4,17,0,5,1,2,0.52,0.5,0.45,0.3881,75,433,508 +2757,2011-04-29,2,0,4,18,0,5,1,1,0.52,0.5,0.48,0.3582,55,384,439 +2758,2011-04-29,2,0,4,19,0,5,1,1,0.46,0.4545,0.55,0.3284,40,230,270 +2759,2011-04-29,2,0,4,20,0,5,1,1,0.48,0.4697,0.48,0.2239,34,160,194 +2760,2011-04-29,2,0,4,21,0,5,1,1,0.46,0.4545,0.51,0.194,50,145,195 +2761,2011-04-29,2,0,4,22,0,5,1,1,0.42,0.4242,0.58,0.1642,36,110,146 +2762,2011-04-29,2,0,4,23,0,5,1,2,0.44,0.4394,0.54,0.0896,25,88,113 +2763,2011-04-30,2,0,4,0,0,6,0,2,0.44,0.4394,0.54,0.2537,33,73,106 +2764,2011-04-30,2,0,4,1,0,6,0,2,0.44,0.4394,0.54,0.1642,15,59,74 +2765,2011-04-30,2,0,4,2,0,6,0,2,0.44,0.4394,0.54,0.2239,12,53,65 +2766,2011-04-30,2,0,4,3,0,6,0,2,0.42,0.4242,0.54,0.194,4,13,17 
+2767,2011-04-30,2,0,4,4,0,6,0,2,0.42,0.4242,0.54,0.2239,2,5,7 +2768,2011-04-30,2,0,4,5,0,6,0,1,0.42,0.4242,0.54,0.3881,3,4,7 +2769,2011-04-30,2,0,4,6,0,6,0,1,0.4,0.4091,0.58,0.2836,10,11,21 +2770,2011-04-30,2,0,4,7,0,6,0,1,0.4,0.4091,0.54,0.3881,21,27,48 +2771,2011-04-30,2,0,4,8,0,6,0,1,0.4,0.4091,0.54,0.2985,21,70,91 +2772,2011-04-30,2,0,4,9,0,6,0,1,0.44,0.4394,0.51,0.3582,35,121,156 +2773,2011-04-30,2,0,4,10,0,6,0,1,0.48,0.4697,0.44,0.3881,94,185,279 +2774,2011-04-30,2,0,4,11,0,6,0,1,0.5,0.4848,0.42,0.3582,120,201,321 +2775,2011-04-30,2,0,4,12,0,6,0,1,0.52,0.5,0.42,0.3881,178,238,416 +2776,2011-04-30,2,0,4,13,0,6,0,1,0.54,0.5152,0.39,0.2985,185,270,455 +2777,2011-04-30,2,0,4,14,0,6,0,1,0.54,0.5152,0.39,0.2239,184,268,452 +2778,2011-04-30,2,0,4,15,0,6,0,1,0.56,0.5303,0.37,0.1343,217,282,499 +2779,2011-04-30,2,0,4,16,0,6,0,1,0.58,0.5455,0.37,0.194,191,273,464 +2780,2011-04-30,2,0,4,17,0,6,0,1,0.56,0.5303,0.43,0.1343,162,245,407 +2781,2011-04-30,2,0,4,18,0,6,0,1,0.54,0.5152,0.43,0.1343,134,237,371 +2782,2011-04-30,2,0,4,19,0,6,0,1,0.54,0.5152,0.43,0.1343,150,237,387 +2783,2011-04-30,2,0,4,20,0,6,0,1,0.44,0.4394,0.62,0.1343,65,159,224 +2784,2011-04-30,2,0,4,21,0,6,0,1,0.44,0.4394,0.62,0.1343,58,119,177 +2785,2011-04-30,2,0,4,22,0,6,0,1,0.44,0.4394,0.67,0.1045,37,106,143 +2786,2011-04-30,2,0,4,23,0,6,0,1,0.44,0.4394,0.67,0.1045,34,91,125 +2787,2011-05-01,2,0,5,0,0,0,0,1,0.42,0.4242,0.67,0.0896,19,77,96 +2788,2011-05-01,2,0,5,1,0,0,0,1,0.42,0.4242,0.69,0.1045,9,50,59 +2789,2011-05-01,2,0,5,2,0,0,0,1,0.42,0.4242,0.77,0.1045,7,43,50 +2790,2011-05-01,2,0,5,3,0,0,0,1,0.4,0.4091,0.82,0.1045,8,15,23 +2791,2011-05-01,2,0,5,4,0,0,0,1,0.4,0.4091,0.76,0.1045,6,11,17 +2792,2011-05-01,2,0,5,5,0,0,0,1,0.4,0.4091,0.82,0.0896,0,10,10 +2793,2011-05-01,2,0,5,6,0,0,0,2,0.4,0.4091,0.82,0.0896,4,9,13 +2794,2011-05-01,2,0,5,7,0,0,0,2,0.42,0.4242,0.77,0.0896,7,26,33 +2795,2011-05-01,2,0,5,8,0,0,0,2,0.44,0.4394,0.77,0.1343,16,43,59 
+2796,2011-05-01,2,0,5,9,0,0,0,2,0.46,0.4545,0.72,0.1343,45,96,141 +2797,2011-05-01,2,0,5,10,0,0,0,2,0.48,0.4697,0.63,0.194,109,155,264 +2798,2011-05-01,2,0,5,11,0,0,0,1,0.46,0.4545,0.82,0.1045,109,141,250 +2799,2011-05-01,2,0,5,12,0,0,0,2,0.48,0.4697,0.77,0.1045,109,172,281 +2800,2011-05-01,2,0,5,13,0,0,0,2,0.5,0.4848,0.63,0.1642,123,209,332 +2801,2011-05-01,2,0,5,14,0,0,0,2,0.5,0.4848,0.68,0.1045,85,153,238 +2802,2011-05-01,2,0,5,15,0,0,0,2,0.5,0.4848,0.72,0,113,153,266 +2803,2011-05-01,2,0,5,16,0,0,0,2,0.5,0.4848,0.72,0.194,75,139,214 +2804,2011-05-01,2,0,5,17,0,0,0,2,0.48,0.4697,0.82,0.2537,60,136,196 +2805,2011-05-01,2,0,5,18,0,0,0,2,0.46,0.4545,0.82,0.0896,33,126,159 +2806,2011-05-01,2,0,5,19,0,0,0,2,0.46,0.4545,0.84,0.1045,47,131,178 +2807,2011-05-01,2,0,5,20,0,0,0,2,0.46,0.4545,0.82,0,35,86,121 +2808,2011-05-01,2,0,5,21,0,0,0,2,0.46,0.4545,0.82,0,23,82,105 +2809,2011-05-01,2,0,5,22,0,0,0,2,0.46,0.4545,0.82,0,32,68,100 +2810,2011-05-01,2,0,5,23,0,0,0,1,0.46,0.4545,0.77,0.194,64,82,146 +2811,2011-05-02,2,0,5,0,0,1,1,1,0.46,0.4545,0.72,0.1343,68,109,177 +2812,2011-05-02,2,0,5,1,0,1,1,1,0.46,0.4545,0.72,0.1343,41,73,114 +2813,2011-05-02,2,0,5,2,0,1,1,2,0.44,0.4394,0.77,0.2239,16,19,35 +2814,2011-05-02,2,0,5,3,0,1,1,1,0.44,0.4394,0.77,0.1343,9,7,16 +2815,2011-05-02,2,0,5,4,0,1,1,1,0.44,0.4394,0.77,0.1642,9,8,17 +2816,2011-05-02,2,0,5,5,0,1,1,2,0.44,0.4394,0.77,0.1343,4,16,20 +2817,2011-05-02,2,0,5,6,0,1,1,1,0.44,0.4394,0.88,0.1045,3,59,62 +2818,2011-05-02,2,0,5,7,0,1,1,2,0.46,0.4545,0.82,0.3284,16,193,209 +2819,2011-05-02,2,0,5,8,0,1,1,2,0.48,0.4697,0.77,0.2239,21,350,371 +2820,2011-05-02,2,0,5,9,0,1,1,2,0.5,0.4848,0.77,0.2836,41,131,172 +2821,2011-05-02,2,0,5,10,0,1,1,2,0.54,0.5152,0.77,0.2239,31,77,108 +2822,2011-05-02,2,0,5,11,0,1,1,1,0.58,0.5455,0.73,0.2985,34,96,130 +2823,2011-05-02,2,0,5,12,0,1,1,1,0.62,0.6061,0.69,0.2239,40,147,187 +2824,2011-05-02,2,0,5,13,0,1,1,2,0.62,0.6061,0.69,0.194,51,119,170 
+2825,2011-05-02,2,0,5,14,0,1,1,1,0.64,0.6061,0.69,0.2239,48,133,181 +2826,2011-05-02,2,0,5,15,0,1,1,1,0.66,0.6212,0.61,0.194,45,110,155 +2827,2011-05-02,2,0,5,16,0,1,1,1,0.66,0.6212,0.61,0.2239,49,220,269 +2828,2011-05-02,2,0,5,17,0,1,1,1,0.66,0.6212,0.65,0.194,65,472,537 +2829,2011-05-02,2,0,5,18,0,1,1,2,0.64,0.6061,0.65,0.194,68,450,518 +2830,2011-05-02,2,0,5,19,0,1,1,2,0.62,0.6061,0.69,0.1343,46,268,314 +2831,2011-05-02,2,0,5,20,0,1,1,2,0.62,0.6061,0.69,0.1343,44,174,218 +2832,2011-05-02,2,0,5,21,0,1,1,2,0.6,0.5909,0.73,0.1045,54,178,232 +2833,2011-05-02,2,0,5,22,0,1,1,2,0.6,0.5909,0.73,0,28,97,125 +2834,2011-05-02,2,0,5,23,0,1,1,2,0.56,0.5303,0.83,0.194,16,48,64 +2835,2011-05-03,2,0,5,0,0,2,1,2,0.56,0.5303,0.83,0.2239,0,16,16 +2836,2011-05-03,2,0,5,1,0,2,1,2,0.56,0.5303,0.78,0.2537,0,14,14 +2837,2011-05-03,2,0,5,2,0,2,1,2,0.56,0.5303,0.78,0.2537,0,5,5 +2838,2011-05-03,2,0,5,3,0,2,1,1,0.54,0.5152,0.83,0.2836,0,2,2 +2839,2011-05-03,2,0,5,4,0,2,1,1,0.52,0.5,0.88,0.2239,3,1,4 +2840,2011-05-03,2,0,5,5,0,2,1,1,0.52,0.5,0.88,0.1343,0,14,14 +2841,2011-05-03,2,0,5,6,0,2,1,2,0.52,0.5,0.94,0.1642,7,102,109 +2842,2011-05-03,2,0,5,7,0,2,1,2,0.54,0.5152,0.88,0.2239,17,248,265 +2843,2011-05-03,2,0,5,8,0,2,1,2,0.56,0.5303,0.83,0.2836,24,435,459 +2844,2011-05-03,2,0,5,9,0,2,1,2,0.6,0.5758,0.78,0.3582,29,157,186 +2845,2011-05-03,2,0,5,10,0,2,1,2,0.64,0.6061,0.69,0.3582,36,91,127 +2846,2011-05-03,2,0,5,11,0,2,1,2,0.66,0.6212,0.65,0.3284,20,120,140 +2847,2011-05-03,2,0,5,12,0,2,1,2,0.68,0.6364,0.61,0.4925,48,169,217 +2848,2011-05-03,2,0,5,13,0,2,1,2,0.7,0.6515,0.58,0.6119,50,144,194 +2849,2011-05-03,2,0,5,14,0,2,1,2,0.7,0.6515,0.58,0.5224,36,122,158 +2850,2011-05-03,2,0,5,15,0,2,1,2,0.7,0.6515,0.58,0.4478,37,117,154 +2851,2011-05-03,2,0,5,16,0,2,1,1,0.72,0.6667,0.54,0.4627,46,225,271 +2852,2011-05-03,2,0,5,17,0,2,1,1,0.7,0.6515,0.54,0.4627,53,464,517 +2853,2011-05-03,2,0,5,18,0,2,1,1,0.7,0.6515,0.48,0.4179,59,485,544 
+2854,2011-05-03,2,0,5,19,0,2,1,1,0.68,0.6364,0.57,0.3582,42,323,365 +2855,2011-05-03,2,0,5,20,0,2,1,1,0.66,0.6212,0.61,0.194,28,262,290 +2856,2011-05-03,2,0,5,21,0,2,1,2,0.64,0.6212,0.61,0.4478,42,183,225 +2857,2011-05-03,2,0,5,22,0,2,1,2,0.58,0.5455,0.6,0.3881,14,99,113 +2858,2011-05-03,2,0,5,23,0,2,1,2,0.56,0.5303,0.68,0.3284,12,50,62 +2859,2011-05-04,2,0,5,0,0,3,1,3,0.52,0.5,0.77,0.1642,5,22,27 +2860,2011-05-04,2,0,5,1,0,3,1,3,0.5,0.4848,0.82,0.2836,1,6,7 +2861,2011-05-04,2,0,5,2,0,3,1,3,0.5,0.4848,0.82,0.2836,0,4,4 +2862,2011-05-04,2,0,5,3,0,3,1,3,0.42,0.4242,0.88,0.5224,0,1,1 +2863,2011-05-04,2,0,5,4,0,3,1,3,0.38,0.3939,0.94,0.3284,1,2,3 +2864,2011-05-04,2,0,5,5,0,3,1,3,0.36,0.3333,0.87,0.3284,1,8,9 +2865,2011-05-04,2,0,5,6,0,3,1,3,0.34,0.303,0.93,0.3881,0,21,21 +2866,2011-05-04,2,0,5,7,0,3,1,3,0.34,0.303,0.93,0.3582,6,46,52 +2867,2011-05-04,2,0,5,8,0,3,1,3,0.34,0.303,0.93,0.3881,5,74,79 +2868,2011-05-04,2,0,5,9,0,3,1,3,0.34,0.303,0.93,0.3582,5,35,40 +2869,2011-05-04,2,0,5,10,0,3,1,2,0.34,0.3182,0.93,0.2836,5,26,31 +2870,2011-05-04,2,0,5,11,0,3,1,1,0.4,0.4091,0.82,0.3284,3,69,72 +2871,2011-05-04,2,0,5,12,0,3,1,2,0.44,0.4394,0.62,0.3582,19,105,124 +2872,2011-05-04,2,0,5,13,0,3,1,2,0.46,0.4545,0.59,0.4478,15,128,143 +2873,2011-05-04,2,0,5,14,0,3,1,1,0.5,0.4848,0.51,0.3881,13,94,107 +2874,2011-05-04,2,0,5,15,0,3,1,1,0.48,0.4697,0.51,0.4179,22,107,129 +2875,2011-05-04,2,0,5,16,0,3,1,1,0.5,0.4848,0.45,0.4627,22,173,195 +2876,2011-05-04,2,0,5,17,0,3,1,1,0.4,0.4091,0.71,0.2537,22,388,410 +2877,2011-05-04,2,0,5,18,0,3,1,1,0.42,0.4242,0.67,0.2537,29,367,396 +2878,2011-05-04,2,0,5,19,0,3,1,1,0.42,0.4242,0.58,0.3582,30,266,296 +2879,2011-05-04,2,0,5,20,0,3,1,1,0.4,0.4091,0.62,0.3284,17,174,191 +2880,2011-05-04,2,0,5,21,0,3,1,1,0.4,0.4091,0.62,0.1642,17,133,150 +2881,2011-05-04,2,0,5,22,0,3,1,1,0.38,0.3939,0.62,0.2239,9,80,89 +2882,2011-05-04,2,0,5,23,0,3,1,1,0.36,0.3485,0.62,0.2239,8,49,57 +2883,2011-05-05,2,0,5,0,0,4,1,1,0.36,0.3485,0.66,0.194,4,23,27 
+2884,2011-05-05,2,0,5,1,0,4,1,1,0.34,0.3333,0.71,0.194,4,6,10 +2885,2011-05-05,2,0,5,2,0,4,1,1,0.34,0.3182,0.66,0.2239,1,4,5 +2886,2011-05-05,2,0,5,3,0,4,1,1,0.34,0.3182,0.66,0.2537,0,4,4 +2887,2011-05-05,2,0,5,4,0,4,1,1,0.34,0.3182,0.66,0.2537,4,3,7 +2888,2011-05-05,2,0,5,5,0,4,1,1,0.34,0.3182,0.66,0.2537,1,29,30 +2889,2011-05-05,2,0,5,6,0,4,1,1,0.34,0.3182,0.66,0.2537,3,86,89 +2890,2011-05-05,2,0,5,7,0,4,1,1,0.38,0.3939,0.58,0.2537,16,255,271 +2891,2011-05-05,2,0,5,8,0,4,1,1,0.42,0.4242,0.5,0.2537,25,415,440 +2892,2011-05-05,2,0,5,9,0,4,1,1,0.46,0.4545,0.44,0.2836,20,164,184 +2893,2011-05-05,2,0,5,10,0,4,1,1,0.5,0.4848,0.39,0.2836,30,98,128 +2894,2011-05-05,2,0,5,11,0,4,1,1,0.52,0.5,0.36,0.4478,43,105,148 +2895,2011-05-05,2,0,5,12,0,4,1,1,0.52,0.5,0.34,0.3582,27,169,196 +2896,2011-05-05,2,0,5,13,0,4,1,1,0.54,0.5152,0.3,0.5821,50,142,192 +2897,2011-05-05,2,0,5,14,0,4,1,1,0.54,0.5152,0.28,0.4478,19,135,154 +2898,2011-05-05,2,0,5,15,0,4,1,1,0.56,0.5303,0.26,0.4925,27,120,147 +2899,2011-05-05,2,0,5,16,0,4,1,1,0.58,0.5455,0.24,0.4179,36,233,269 +2900,2011-05-05,2,0,5,17,0,4,1,1,0.56,0.5303,0.26,0.3881,66,467,533 +2901,2011-05-05,2,0,5,18,0,4,1,1,0.56,0.5303,0.26,0.2836,64,456,520 +2902,2011-05-05,2,0,5,19,0,4,1,1,0.54,0.5152,0.28,0.2239,56,305,361 +2903,2011-05-05,2,0,5,20,0,4,1,1,0.5,0.4848,0.34,0.1642,33,225,258 +2904,2011-05-05,2,0,5,21,0,4,1,1,0.5,0.4848,0.36,0.194,39,141,180 +2905,2011-05-05,2,0,5,22,0,4,1,1,0.48,0.4697,0.39,0.194,30,135,165 +2906,2011-05-05,2,0,5,23,0,4,1,1,0.46,0.4545,0.41,0.194,16,99,115 +2907,2011-05-06,2,0,5,0,0,5,1,1,0.44,0.4394,0.44,0.1642,13,43,56 +2908,2011-05-06,2,0,5,1,0,5,1,1,0.4,0.4091,0.62,0.1642,8,24,32 +2909,2011-05-06,2,0,5,2,0,5,1,1,0.38,0.3939,0.62,0.1045,1,15,16 +2910,2011-05-06,2,0,5,3,0,5,1,1,0.36,0.3636,0.71,0.1045,3,6,9 +2911,2011-05-06,2,0,5,4,0,5,1,1,0.36,0.3636,0.71,0.1045,0,1,1 +2912,2011-05-06,2,0,5,5,0,5,1,1,0.34,0.3485,0.81,0.1045,0,16,16 +2913,2011-05-06,2,0,5,6,0,5,1,1,0.36,0.3636,0.81,0.0896,8,74,82 
+2914,2011-05-06,2,0,5,7,0,5,1,1,0.4,0.4091,0.82,0,20,202,222 +2915,2011-05-06,2,0,5,8,0,5,1,1,0.42,0.4242,0.77,0.2537,35,415,450 +2916,2011-05-06,2,0,5,9,0,5,1,1,0.46,0.4545,0.67,0.2836,27,161,188 +2917,2011-05-06,2,0,5,10,0,5,1,2,0.54,0.5152,0.49,0.3284,30,97,127 +2918,2011-05-06,2,0,5,11,0,5,1,2,0.54,0.5152,0.49,0.3284,42,123,165 +2919,2011-05-06,2,0,5,12,0,5,1,1,0.56,0.5303,0.49,0.3582,48,198,246 +2920,2011-05-06,2,0,5,13,0,5,1,1,0.58,0.5455,0.4,0.3284,71,182,253 +2921,2011-05-06,2,0,5,14,0,5,1,1,0.6,0.6212,0.4,0.4478,86,127,213 +2922,2011-05-06,2,0,5,15,0,5,1,1,0.6,0.6212,0.38,0.3881,89,171,260 +2923,2011-05-06,2,0,5,16,0,5,1,1,0.6,0.6212,0.4,0.2537,82,262,344 +2924,2011-05-06,2,0,5,17,0,5,1,1,0.58,0.5455,0.4,0.3582,83,470,553 +2925,2011-05-06,2,0,5,18,0,5,1,3,0.54,0.5152,0.52,0.3582,85,385,470 +2926,2011-05-06,2,0,5,19,0,5,1,3,0.54,0.5152,0.52,0.3582,39,253,292 +2927,2011-05-06,2,0,5,20,0,5,1,1,0.52,0.5,0.48,0.1045,29,161,190 +2928,2011-05-06,2,0,5,21,0,5,1,1,0.48,0.4697,0.67,0.1343,22,130,152 +2929,2011-05-06,2,0,5,22,0,5,1,1,0.46,0.4545,0.72,0.2239,46,105,151 +2930,2011-05-06,2,0,5,23,0,5,1,1,0.44,0.4394,0.82,0.1343,27,93,120 +2931,2011-05-07,2,0,5,0,0,6,0,1,0.42,0.4242,0.82,0.1343,10,76,86 +2932,2011-05-07,2,0,5,1,0,6,0,1,0.42,0.4242,0.82,0.1045,8,50,58 +2933,2011-05-07,2,0,5,2,0,6,0,1,0.42,0.4242,0.82,0.0896,5,47,52 +2934,2011-05-07,2,0,5,3,0,6,0,1,0.42,0.4242,0.77,0.0896,9,9,18 +2935,2011-05-07,2,0,5,4,0,6,0,1,0.4,0.4091,0.82,0.1045,1,4,5 +2936,2011-05-07,2,0,5,5,0,6,0,1,0.46,0.4545,0.59,0.1343,4,3,7 +2937,2011-05-07,2,0,5,6,0,6,0,1,0.42,0.4242,0.77,0.0896,1,12,13 +2938,2011-05-07,2,0,5,7,0,6,0,1,0.48,0.4697,0.59,0,8,32,40 +2939,2011-05-07,2,0,5,8,0,6,0,1,0.52,0.5,0.52,0.194,19,96,115 +2940,2011-05-07,2,0,5,9,0,6,0,1,0.54,0.5152,0.49,0.2836,54,164,218 +2941,2011-05-07,2,0,5,10,0,6,0,1,0.56,0.5303,0.49,0.2985,90,208,298 +2942,2011-05-07,2,0,5,11,0,6,0,1,0.56,0.5303,0.43,0,132,215,347 +2943,2011-05-07,2,0,5,12,0,6,0,1,0.6,0.6212,0.4,0.194,129,244,373 
+2944,2011-05-07,2,0,5,13,0,6,0,1,0.6,0.6212,0.35,0.2836,196,240,436 +2945,2011-05-07,2,0,5,14,0,6,0,3,0.6,0.6212,0.35,0.1642,143,235,378 +2946,2011-05-07,2,0,5,15,0,6,0,1,0.6,0.6212,0.38,0.2537,148,230,378 +2947,2011-05-07,2,0,5,16,0,6,0,1,0.6,0.6212,0.38,0.194,119,223,342 +2948,2011-05-07,2,0,5,17,0,6,0,1,0.58,0.5455,0.4,0.2239,138,216,354 +2949,2011-05-07,2,0,5,18,0,6,0,1,0.58,0.5455,0.37,0.1642,114,175,289 +2950,2011-05-07,2,0,5,19,0,6,0,1,0.56,0.5303,0.43,0.194,88,179,267 +2951,2011-05-07,2,0,5,20,0,6,0,1,0.56,0.5303,0.43,0.1343,82,137,219 +2952,2011-05-07,2,0,5,21,0,6,0,1,0.54,0.5152,0.49,0.2239,58,124,182 +2953,2011-05-07,2,0,5,22,0,6,0,1,0.54,0.5152,0.49,0.194,26,94,120 +2954,2011-05-07,2,0,5,23,0,6,0,1,0.5,0.4848,0.59,0.1045,30,89,119 +2955,2011-05-08,2,0,5,0,0,0,0,2,0.5,0.4848,0.59,0.1045,22,78,100 +2956,2011-05-08,2,0,5,1,0,0,0,2,0.52,0.5,0.55,0,8,56,64 +2957,2011-05-08,2,0,5,2,0,0,0,1,0.48,0.4697,0.63,0,17,42,59 +2958,2011-05-08,2,0,5,3,0,0,0,1,0.46,0.4545,0.72,0.0896,10,21,31 +2959,2011-05-08,2,0,5,4,0,0,0,1,0.42,0.4242,0.82,0,2,8,10 +2960,2011-05-08,2,0,5,5,0,0,0,2,0.44,0.4394,0.77,0,0,5,5 +2961,2011-05-08,2,0,5,6,0,0,0,2,0.44,0.4394,0.82,0,2,4,6 +2962,2011-05-08,2,0,5,7,0,0,0,2,0.46,0.4545,0.82,0,8,15,23 +2963,2011-05-08,2,0,5,8,0,0,0,1,0.52,0.5,0.63,0.1343,28,58,86 +2964,2011-05-08,2,0,5,9,0,0,0,1,0.56,0.5303,0.52,0.2239,48,112,160 +2965,2011-05-08,2,0,5,10,0,0,0,1,0.58,0.5455,0.49,0.1343,91,153,244 +2966,2011-05-08,2,0,5,11,0,0,0,1,0.58,0.5455,0.49,0.0896,142,198,340 +2967,2011-05-08,2,0,5,12,0,0,0,1,0.6,0.6212,0.46,0.0896,139,243,382 +2968,2011-05-08,2,0,5,13,0,0,0,1,0.6,0.6212,0.49,0.1343,166,224,390 +2969,2011-05-08,2,0,5,14,0,0,0,1,0.6,0.6212,0.49,0,126,240,366 +2970,2011-05-08,2,0,5,15,0,0,0,1,0.6,0.6212,0.49,0,128,230,358 +2971,2011-05-08,2,0,5,16,0,0,0,1,0.6,0.6212,0.46,0.1045,122,263,385 +2972,2011-05-08,2,0,5,17,0,0,0,3,0.58,0.5455,0.49,0.2836,106,245,351 +2973,2011-05-08,2,0,5,18,0,0,0,1,0.58,0.5455,0.49,0,63,205,268 
+2974,2011-05-08,2,0,5,19,0,0,0,1,0.56,0.5303,0.64,0,61,178,239 +2975,2011-05-08,2,0,5,20,0,0,0,1,0.52,0.5,0.77,0.1343,42,132,174 +2976,2011-05-08,2,0,5,21,0,0,0,1,0.52,0.5,0.77,0.0896,32,95,127 +2977,2011-05-08,2,0,5,22,0,0,0,1,0.5,0.4848,0.88,0.0896,29,86,115 +2978,2011-05-08,2,0,5,23,0,0,0,1,0.46,0.4545,0.88,0.0896,9,41,50 +2979,2011-05-09,2,0,5,0,0,1,1,1,0.46,0.4545,0.82,0.0896,31,22,53 +2980,2011-05-09,2,0,5,1,0,1,1,1,0.44,0.4394,0.88,0,25,8,33 +2981,2011-05-09,2,0,5,2,0,1,1,1,0.44,0.4394,0.94,0,6,2,8 +2982,2011-05-09,2,0,5,3,0,1,1,1,0.46,0.4545,0.82,0.0896,0,7,7 +2983,2011-05-09,2,0,5,4,0,1,1,1,0.42,0.4242,0.82,0.1642,0,4,4 +2984,2011-05-09,2,0,5,5,0,1,1,1,0.42,0.4242,0.77,0.1642,0,23,23 +2985,2011-05-09,2,0,5,6,0,1,1,1,0.44,0.4394,0.72,0.2537,2,87,89 +2986,2011-05-09,2,0,5,7,0,1,1,1,0.48,0.4697,0.63,0.2537,16,221,237 +2987,2011-05-09,2,0,5,8,0,1,1,1,0.52,0.5,0.55,0.2239,23,351,374 +2988,2011-05-09,2,0,5,9,0,1,1,1,0.54,0.5152,0.56,0.1642,36,142,178 +2989,2011-05-09,2,0,5,10,0,1,1,1,0.56,0.5303,0.52,0.194,24,95,119 +2990,2011-05-09,2,0,5,11,0,1,1,1,0.6,0.6212,0.46,0.2239,26,99,125 +2991,2011-05-09,2,0,5,12,0,1,1,1,0.6,0.6212,0.43,0.194,35,160,195 +2992,2011-05-09,2,0,5,13,0,1,1,1,0.62,0.6212,0.38,0.2836,36,138,174 +2993,2011-05-09,2,0,5,14,0,1,1,1,0.62,0.6212,0.41,0.2239,44,118,162 +2994,2011-05-09,2,0,5,15,0,1,1,1,0.62,0.6212,0.41,0.2239,51,148,199 +2995,2011-05-09,2,0,5,16,0,1,1,1,0.64,0.6212,0.38,0.2239,49,255,304 +2996,2011-05-09,2,0,5,17,0,1,1,1,0.62,0.6212,0.38,0.2537,59,539,598 +2997,2011-05-09,2,0,5,18,0,1,1,1,0.62,0.6212,0.38,0.2985,66,458,524 +2998,2011-05-09,2,0,5,19,0,1,1,1,0.58,0.5455,0.43,0.194,45,339,384 +2999,2011-05-09,2,0,5,20,0,1,1,1,0.54,0.5152,0.49,0.194,25,214,239 +3000,2011-05-09,2,0,5,21,0,1,1,1,0.52,0.5,0.59,0.1343,28,128,156 +3001,2011-05-09,2,0,5,22,0,1,1,1,0.52,0.5,0.68,0.0896,21,95,116 +3002,2011-05-09,2,0,5,23,0,1,1,1,0.5,0.4848,0.68,0.0896,16,45,61 +3003,2011-05-10,2,0,5,0,0,2,1,1,0.48,0.4697,0.63,0,6,12,18 
+3004,2011-05-10,2,0,5,1,0,2,1,1,0.46,0.4545,0.59,0.0896,3,12,15 +3005,2011-05-10,2,0,5,2,0,2,1,1,0.44,0.4394,0.58,0.1343,1,4,5 +3006,2011-05-10,2,0,5,3,0,2,1,1,0.44,0.4394,0.54,0.194,1,3,4 +3007,2011-05-10,2,0,5,4,0,2,1,1,0.42,0.4242,0.62,0.2537,0,2,2 +3008,2011-05-10,2,0,5,5,0,2,1,1,0.4,0.4091,0.66,0.2239,1,28,29 +3009,2011-05-10,2,0,5,6,0,2,1,1,0.42,0.4242,0.67,0.1343,9,103,112 +3010,2011-05-10,2,0,5,7,0,2,1,2,0.44,0.4394,0.62,0.1343,13,301,314 +3011,2011-05-10,2,0,5,8,0,2,1,2,0.5,0.4848,0.51,0.1045,28,397,425 +3012,2011-05-10,2,0,5,9,0,2,1,2,0.52,0.5,0.48,0.1045,29,176,205 +3013,2011-05-10,2,0,5,10,0,2,1,2,0.56,0.5303,0.37,0.0896,27,100,127 +3014,2011-05-10,2,0,5,11,0,2,1,2,0.56,0.5303,0.43,0.1343,17,108,125 +3015,2011-05-10,2,0,5,12,0,2,1,2,0.6,0.6212,0.38,0,47,170,217 +3016,2011-05-10,2,0,5,13,0,2,1,2,0.62,0.6212,0.38,0.2537,50,152,202 +3017,2011-05-10,2,0,5,14,0,2,1,2,0.62,0.6212,0.33,0,42,134,176 +3018,2011-05-10,2,0,5,15,0,2,1,2,0.64,0.6212,0.33,0.0896,28,152,180 +3019,2011-05-10,2,0,5,16,0,2,1,1,0.66,0.6212,0.31,0.1045,56,271,327 +3020,2011-05-10,2,0,5,17,0,2,1,1,0.64,0.6212,0.33,0,79,532,611 +3021,2011-05-10,2,0,5,18,0,2,1,1,0.64,0.6212,0.29,0.1343,70,480,550 +3022,2011-05-10,2,0,5,19,0,2,1,1,0.6,0.6212,0.43,0.2239,69,365,434 +3023,2011-05-10,2,0,5,20,0,2,1,1,0.54,0.5152,0.6,0.1343,50,241,291 +3024,2011-05-10,2,0,5,21,0,2,1,1,0.54,0.5152,0.52,0,30,173,203 +3025,2011-05-10,2,0,5,22,0,2,1,1,0.52,0.5,0.55,0.1045,29,121,150 +3026,2011-05-10,2,0,5,23,0,2,1,2,0.52,0.5,0.59,0.1343,9,72,81 +3027,2011-05-11,2,0,5,0,0,3,1,2,0.52,0.5,0.55,0.1343,8,30,38 +3028,2011-05-11,2,0,5,1,0,3,1,2,0.5,0.4848,0.59,0.1045,4,16,20 +3029,2011-05-11,2,0,5,2,0,3,1,1,0.52,0.5,0.59,0,6,4,10 +3030,2011-05-11,2,0,5,3,0,3,1,1,0.5,0.4848,0.63,0.2239,0,3,3 +3031,2011-05-11,2,0,5,4,0,3,1,2,0.48,0.4697,0.88,0.1343,0,2,2 +3032,2011-05-11,2,0,5,5,0,3,1,1,0.46,0.4545,0.88,0.1343,0,20,20 +3033,2011-05-11,2,0,5,6,0,3,1,1,0.46,0.4545,0.82,0.1343,6,93,99 
+3034,2011-05-11,2,0,5,7,0,3,1,1,0.48,0.4697,0.72,0.194,25,293,318 +3035,2011-05-11,2,0,5,8,0,3,1,1,0.48,0.4697,0.77,0.2537,21,421,442 +3036,2011-05-11,2,0,5,9,0,3,1,1,0.52,0.5,0.72,0.1343,21,182,203 +3037,2011-05-11,2,0,5,10,0,3,1,1,0.54,0.5152,0.68,0.0896,30,124,154 +3038,2011-05-11,2,0,5,11,0,3,1,1,0.56,0.5303,0.64,0.0896,18,123,141 +3039,2011-05-11,2,0,5,12,0,3,1,1,0.6,0.6212,0.56,0,41,185,226 +3040,2011-05-11,2,0,5,13,0,3,1,1,0.62,0.6212,0.5,0,36,157,193 +3041,2011-05-11,2,0,5,14,0,3,1,1,0.62,0.6212,0.53,0.0896,36,141,177 +3042,2011-05-11,2,0,5,15,0,3,1,1,0.64,0.6212,0.41,0,48,131,179 +3043,2011-05-11,2,0,5,16,0,3,1,1,0.66,0.6212,0.39,0,49,240,289 +3044,2011-05-11,2,0,5,17,0,3,1,1,0.64,0.6212,0.47,0.1045,17,242,259 +3045,2011-05-11,2,0,5,18,0,3,1,1,0.62,0.6212,0.5,0.2985,40,234,274 +3046,2011-05-11,2,0,5,19,0,3,1,1,0.56,0.5303,0.52,0.1343,60,341,401 +3047,2011-05-11,2,0,5,20,0,3,1,1,0.54,0.5152,0.64,0.1343,28,245,273 +3048,2011-05-11,2,0,5,21,0,3,1,1,0.54,0.5152,0.6,0,32,202,234 +3049,2011-05-11,2,0,5,22,0,3,1,1,0.5,0.4848,0.72,0.2537,12,134,146 +3050,2011-05-11,2,0,5,23,0,3,1,1,0.46,0.4545,0.88,0.2537,12,69,81 +3051,2011-05-12,2,0,5,0,0,4,1,2,0.46,0.4545,0.88,0.2239,5,39,44 +3052,2011-05-12,2,0,5,1,0,4,1,1,0.46,0.4545,0.88,0.2239,1,16,17 +3053,2011-05-12,2,0,5,2,0,4,1,1,0.44,0.4394,0.94,0.194,1,14,15 +3054,2011-05-12,2,0,5,3,0,4,1,1,0.44,0.4394,0.88,0.1343,0,5,5 +3055,2011-05-12,2,0,5,4,0,4,1,1,0.42,0.4242,0.94,0,1,5,6 +3056,2011-05-12,2,0,5,5,0,4,1,1,0.42,0.4242,0.94,0.1343,2,27,29 +3057,2011-05-12,2,0,5,6,0,4,1,1,0.44,0.4394,0.88,0,9,103,112 +3058,2011-05-12,2,0,5,7,0,4,1,1,0.46,0.4545,0.88,0.1343,14,283,297 +3059,2011-05-12,2,0,5,8,0,4,1,1,0.48,0.4697,0.82,0.194,27,394,421 +3060,2011-05-12,2,0,5,9,0,4,1,2,0.5,0.4848,0.77,0.2836,21,191,212 +3061,2011-05-12,2,0,5,10,0,4,1,2,0.54,0.5152,0.68,0.2239,28,119,147 +3062,2011-05-12,2,0,5,11,0,4,1,1,0.58,0.5455,0.6,0.2239,43,136,179 +3063,2011-05-12,2,0,5,12,0,4,1,1,0.58,0.5455,0.6,0.2239,43,164,207 
+3064,2011-05-12,2,0,5,13,0,4,1,2,0.62,0.6212,0.57,0.2985,48,170,218 +3065,2011-05-12,2,0,5,14,0,4,1,2,0.62,0.6061,0.61,0.2239,43,148,191 +3066,2011-05-12,2,0,5,15,0,4,1,2,0.64,0.6212,0.53,0.2537,54,163,217 +3067,2011-05-12,2,0,5,16,0,4,1,2,0.64,0.6212,0.57,0.2537,50,238,288 +3068,2011-05-12,2,0,5,17,0,4,1,2,0.64,0.6212,0.57,0.194,54,540,594 +3069,2011-05-12,2,0,5,18,0,4,1,1,0.62,0.6061,0.69,0.2239,64,463,527 +3070,2011-05-12,2,0,5,19,0,4,1,2,0.6,0.6061,0.64,0.194,59,305,364 +3071,2011-05-12,2,0,5,20,0,4,1,1,0.6,0.6061,0.64,0.194,56,220,276 +3072,2011-05-12,2,0,5,21,0,4,1,1,0.56,0.5303,0.83,0.1343,31,192,223 +3073,2011-05-12,2,0,5,22,0,4,1,1,0.54,0.5152,0.83,0.194,25,147,172 +3074,2011-05-12,2,0,5,23,0,4,1,2,0.54,0.5152,0.77,0.194,16,87,103 +3075,2011-05-13,2,0,5,0,0,5,1,2,0.52,0.5,0.83,0.1642,6,46,52 +3076,2011-05-13,2,0,5,1,0,5,1,1,0.52,0.5,0.83,0.1642,5,15,20 +3077,2011-05-13,2,0,5,2,0,5,1,2,0.5,0.4848,0.88,0.1343,3,8,11 +3078,2011-05-13,2,0,5,3,0,5,1,2,0.5,0.4848,0.88,0.1343,2,2,4 +3079,2011-05-13,2,0,5,4,0,5,1,3,0.5,0.4848,0.88,0.1642,2,3,5 +3080,2011-05-13,2,0,5,5,0,5,1,2,0.5,0.4848,0.88,0.194,1,24,25 +3081,2011-05-13,2,0,5,6,0,5,1,3,0.5,0.4848,0.88,0.1343,6,76,82 +3082,2011-05-13,2,0,5,7,0,5,1,3,0.5,0.4848,0.88,0.1642,16,141,157 +3083,2011-05-13,2,0,5,8,0,5,1,2,0.5,0.4848,0.88,0.194,26,361,387 +3084,2011-05-13,2,0,5,9,0,5,1,2,0.5,0.4848,0.88,0.194,18,215,233 +3085,2011-05-13,2,0,5,10,0,5,1,2,0.52,0.5,0.83,0.1343,31,99,130 +3086,2011-05-13,2,0,5,11,0,5,1,2,0.52,0.5,0.83,0.1642,56,90,146 +3087,2011-05-13,2,0,5,12,0,5,1,2,0.52,0.5,0.88,0.194,53,170,223 +3088,2011-05-13,2,0,5,13,0,5,1,2,0.52,0.5,0.88,0.3881,69,157,226 +3089,2011-05-13,2,0,5,14,0,5,1,2,0.52,0.5,0.87,0.2836,70,120,190 +3090,2011-05-13,2,0,5,15,0,5,1,2,0.52,0.5,0.88,0.1642,56,142,198 +3091,2011-05-13,2,0,5,16,0,5,1,2,0.52,0.5,0.88,0.1343,44,256,300 +3092,2011-05-13,2,0,5,17,0,5,1,3,0.52,0.5,0.88,0.1642,62,429,491 +3093,2011-05-13,2,0,5,18,0,5,1,3,0.52,0.5,0.88,0.1642,39,359,398 
+3094,2011-05-13,2,0,5,19,0,5,1,2,0.52,0.5,0.83,0.194,25,245,270 +3095,2011-05-13,2,0,5,20,0,5,1,2,0.52,0.5,0.83,0.1343,22,130,152 +3096,2011-05-13,2,0,5,21,0,5,1,2,0.52,0.5,0.84,0.1642,24,130,154 +3097,2011-05-13,2,0,5,22,0,5,1,2,0.52,0.5,0.83,0.2239,25,107,132 +3098,2011-05-13,2,0,5,23,0,5,1,3,0.5,0.4848,0.88,0.1642,31,88,119 +3099,2011-05-14,2,0,5,0,0,6,0,2,0.5,0.4848,0.88,0.1343,24,78,102 +3100,2011-05-14,2,0,5,1,0,6,0,2,0.5,0.4848,0.88,0.1343,18,64,82 +3101,2011-05-14,2,0,5,2,0,6,0,2,0.5,0.4848,0.88,0.1343,15,37,52 +3102,2011-05-14,2,0,5,3,0,6,0,2,0.5,0.4848,0.88,0.1343,5,30,35 +3103,2011-05-14,2,0,5,4,0,6,0,2,0.5,0.4848,0.82,0.1642,1,4,5 +3104,2011-05-14,2,0,5,5,0,6,0,2,0.5,0.4848,0.82,0.1045,7,7,14 +3105,2011-05-14,2,0,5,6,0,6,0,2,0.5,0.4848,0.88,0.1045,2,9,11 +3106,2011-05-14,2,0,5,7,0,6,0,3,0.48,0.4697,1,0,4,28,32 +3107,2011-05-14,2,0,5,8,0,6,0,3,0.5,0.4848,0.94,0,16,72,88 +3108,2011-05-14,2,0,5,9,0,6,0,2,0.52,0.5,0.94,0.0896,14,86,100 +3109,2011-05-14,2,0,5,10,0,6,0,2,0.52,0.5,0.94,0.1642,34,133,167 +3110,2011-05-14,2,0,5,11,0,6,0,2,0.52,0.5,0.94,0.1343,72,207,279 +3111,2011-05-14,2,0,5,12,0,6,0,3,0.52,0.5,1,0.0896,75,204,279 +3112,2011-05-14,2,0,5,13,0,6,0,3,0.52,0.5,1,0.0896,68,180,248 +3113,2011-05-14,2,0,5,14,0,6,0,2,0.54,0.5152,0.94,0.1343,57,159,216 +3114,2011-05-14,2,0,5,15,0,6,0,2,0.54,0.5152,0.94,0.1642,73,221,294 +3115,2011-05-14,2,0,5,16,0,6,0,2,0.54,0.5152,0.94,0.194,107,188,295 +3116,2011-05-14,2,0,5,17,0,6,0,2,0.56,0.5303,0.88,0.194,78,194,272 +3117,2011-05-14,2,0,5,18,0,6,0,2,0.56,0.5303,0.88,0.2239,88,216,304 +3118,2011-05-14,2,0,5,19,0,6,0,3,0.54,0.5152,0.94,0.1343,69,179,248 +3119,2011-05-14,2,0,5,20,0,6,0,3,0.54,0.5152,0.94,0.1343,42,115,157 +3120,2011-05-14,2,0,5,21,0,6,0,3,0.54,0.5152,0.94,0.2239,15,44,59 +3121,2011-05-14,2,0,5,22,0,6,0,3,0.54,0.5152,0.94,0.1642,7,19,26 +3122,2011-05-14,2,0,5,23,0,6,0,3,0.52,0.5,1,0.194,11,33,44 +3123,2011-05-15,2,0,5,0,0,0,0,2,0.52,0.5,1,0,5,34,39 
+3124,2011-05-15,2,0,5,1,0,0,0,2,0.52,0.5,1,0.1045,4,43,47 +3125,2011-05-15,2,0,5,2,0,0,0,2,0.52,0.5,1,0.1343,13,37,50 +3126,2011-05-15,2,0,5,3,0,0,0,2,0.52,0.5,1,0,11,21,32 +3127,2011-05-15,2,0,5,4,0,0,0,2,0.52,0.5,1,0,5,8,13 +3128,2011-05-15,2,0,5,5,0,0,0,3,0.52,0.5,1,0.1045,3,11,14 +3129,2011-05-15,2,0,5,6,0,0,0,2,0.52,0.5,1,0.1343,3,14,17 +3130,2011-05-15,2,0,5,7,0,0,0,2,0.52,0.5,1,0.1343,4,38,42 +3131,2011-05-15,2,0,5,8,0,0,0,2,0.54,0.5152,0.94,0.1343,24,46,70 +3132,2011-05-15,2,0,5,9,0,0,0,1,0.58,0.5455,0.83,0.2537,36,98,134 +3133,2011-05-15,2,0,5,10,0,0,0,1,0.58,0.5455,0.83,0.2239,73,153,226 +3134,2011-05-15,2,0,5,11,0,0,0,1,0.6,0.5758,0.78,0.2239,120,202,322 +3135,2011-05-15,2,0,5,12,0,0,0,1,0.62,0.5909,0.73,0.2985,120,247,367 +3136,2011-05-15,2,0,5,13,0,0,0,1,0.62,0.5909,0.73,0.2836,195,261,456 +3137,2011-05-15,2,0,5,14,0,0,0,1,0.64,0.6061,0.73,0.194,183,254,437 +3138,2011-05-15,2,0,5,15,0,0,0,1,0.66,0.6212,0.69,0.2537,206,253,459 +3139,2011-05-15,2,0,5,16,0,0,0,1,0.64,0.6061,0.69,0.2537,158,282,440 +3140,2011-05-15,2,0,5,17,0,0,0,3,0.56,0.5303,0.78,0.3582,137,255,392 +3141,2011-05-15,2,0,5,18,0,0,0,1,0.56,0.5303,0.78,0.2537,60,177,237 +3142,2011-05-15,2,0,5,19,0,0,0,1,0.56,0.5303,0.83,0.1045,78,153,231 +3143,2011-05-15,2,0,5,20,0,0,0,1,0.54,0.5152,0.88,0,53,138,191 +3144,2011-05-15,2,0,5,21,0,0,0,1,0.54,0.5152,0.88,0,44,107,151 +3145,2011-05-15,2,0,5,22,0,0,0,1,0.56,0.5303,0.83,0,29,88,117 +3146,2011-05-15,2,0,5,23,0,0,0,1,0.54,0.5152,0.88,0.2239,18,51,69 +3147,2011-05-16,2,0,5,0,0,1,1,1,0.52,0.5,0.94,0.1045,17,21,38 +3148,2011-05-16,2,0,5,1,0,1,1,1,0.52,0.5,0.94,0.1343,6,8,14 +3149,2011-05-16,2,0,5,2,0,1,1,1,0.5,0.4848,1,0.0896,4,9,13 +3150,2011-05-16,2,0,5,3,0,1,1,1,0.5,0.4848,0.94,0.1642,1,3,4 +3151,2011-05-16,2,0,5,4,0,1,1,1,0.5,0.4848,1,0.1343,1,5,6 +3152,2011-05-16,2,0,5,5,0,1,1,1,0.5,0.4848,0.93,0.194,1,20,21 +3153,2011-05-16,2,0,5,6,0,1,1,1,0.52,0.5,0.88,0.1642,11,93,104 +3154,2011-05-16,2,0,5,7,0,1,1,1,0.52,0.5,0.83,0.2836,27,245,272 
+3155,2011-05-16,2,0,5,8,0,1,1,1,0.56,0.5303,0.73,0.2985,28,366,394 +3156,2011-05-16,2,0,5,9,0,1,1,1,0.6,0.5909,0.69,0.194,38,156,194 +3157,2011-05-16,2,0,5,10,0,1,1,1,0.62,0.6061,0.65,0.0896,37,75,112 +3158,2011-05-16,2,0,5,11,0,1,1,1,0.64,0.6212,0.57,0,56,129,185 +3159,2011-05-16,2,0,5,12,0,1,1,1,0.66,0.6212,0.54,0.0896,72,137,209 +3160,2011-05-16,2,0,5,13,0,1,1,1,0.68,0.6364,0.51,0,61,153,214 +3161,2011-05-16,2,0,5,14,0,1,1,1,0.68,0.6364,0.51,0,76,117,193 +3162,2011-05-16,2,0,5,15,0,1,1,1,0.72,0.6667,0.51,0,55,110,165 +3163,2011-05-16,2,0,5,16,0,1,1,3,0.6,0.5758,0.78,0.2537,45,181,226 +3164,2011-05-16,2,0,5,17,0,1,1,1,0.58,0.5455,0.88,0.2239,47,227,274 +3165,2011-05-16,2,0,5,18,0,1,1,3,0.58,0.5455,0.83,0.194,55,398,453 +3166,2011-05-16,2,0,5,19,0,1,1,1,0.58,0.5455,0.83,0.2537,36,272,308 +3167,2011-05-16,2,0,5,20,0,1,1,1,0.58,0.5455,0.83,0,31,167,198 +3168,2011-05-16,2,0,5,21,0,1,1,1,0.58,0.5455,0.83,0,28,149,177 +3169,2011-05-16,2,0,5,22,0,1,1,1,0.56,0.5303,0.88,0.0896,22,105,127 +3170,2011-05-16,2,0,5,23,0,1,1,1,0.56,0.5303,0.88,0.0896,18,39,57 +3171,2011-05-17,2,0,5,0,0,2,1,2,0.56,0.5303,0.88,0,13,18,31 +3172,2011-05-17,2,0,5,1,0,2,1,3,0.56,0.5303,0.88,0,7,6,13 +3173,2011-05-17,2,0,5,2,0,2,1,3,0.56,0.5303,0.88,0,4,4,8 +3174,2011-05-17,2,0,5,3,0,2,1,3,0.54,0.5152,0.94,0.2537,1,3,4 +3175,2011-05-17,2,0,5,4,0,2,1,3,0.54,0.5152,0.94,0.2537,2,2,4 +3176,2011-05-17,2,0,5,5,0,2,1,3,0.52,0.5,0.94,0.2537,0,22,22 +3177,2011-05-17,2,0,5,6,0,2,1,3,0.52,0.5,0.94,0.2537,0,49,49 +3178,2011-05-17,2,0,5,7,0,2,1,2,0.52,0.5,0.94,0.2537,13,138,151 +3179,2011-05-17,2,0,5,8,0,2,1,2,0.52,0.5,0.94,0.2985,22,325,347 +3180,2011-05-17,2,0,5,9,0,2,1,2,0.54,0.5152,0.88,0.2537,17,190,207 +3181,2011-05-17,2,0,5,10,0,2,1,3,0.54,0.5152,0.88,0.2537,31,73,104 +3182,2011-05-17,2,0,5,11,0,2,1,2,0.56,0.5303,0.83,0.4179,26,104,130 +3183,2011-05-17,2,0,5,12,0,2,1,3,0.56,0.5303,0.83,0.4179,38,115,153 +3184,2011-05-17,2,0,5,13,0,2,1,2,0.56,0.5303,0.88,0.3284,31,141,172 
+3185,2011-05-17,2,0,5,14,0,2,1,1,0.62,0.5909,0.73,0.4627,61,123,184 +3186,2011-05-17,2,0,5,15,0,2,1,1,0.62,0.6061,0.69,0.4478,79,131,210 +3187,2011-05-17,2,0,5,16,0,2,1,1,0.62,0.6061,0.61,0.3284,73,217,290 +3188,2011-05-17,2,0,5,17,0,2,1,1,0.62,0.6061,0.65,0.4179,83,521,604 +3189,2011-05-17,2,0,5,18,0,2,1,1,0.6,0.5909,0.69,0.2985,54,426,480 +3190,2011-05-17,2,0,5,19,0,2,1,1,0.6,0.5909,0.69,0.2537,51,298,349 +3191,2011-05-17,2,0,5,20,0,2,1,1,0.58,0.5455,0.83,0.4179,30,253,283 +3192,2011-05-17,2,0,5,21,0,2,1,1,0.54,0.5152,0.88,0.2537,23,151,174 +3193,2011-05-17,2,0,5,22,0,2,1,2,0.54,0.5152,0.88,0.2836,12,86,98 +3194,2011-05-17,2,0,5,23,0,2,1,2,0.54,0.5152,0.88,0.2537,7,49,56 +3195,2011-05-18,2,0,5,0,0,3,1,2,0.54,0.5152,0.88,0.2239,8,15,23 +3196,2011-05-18,2,0,5,1,0,3,1,2,0.54,0.5152,0.88,0.2537,3,9,12 +3197,2011-05-18,2,0,5,2,0,3,1,3,0.52,0.5,0.94,0.2239,1,5,6 +3198,2011-05-18,2,0,5,3,0,3,1,3,0.52,0.5,0.94,0.2985,6,3,9 +3199,2011-05-18,2,0,5,4,0,3,1,3,0.52,0.5,0.94,0.2985,1,2,3 +3200,2011-05-18,2,0,5,5,0,3,1,1,0.52,0.5,1,0.2239,0,9,9 +3201,2011-05-18,2,0,5,6,0,3,1,1,0.52,0.5,1,0.1343,2,99,101 +3202,2011-05-18,2,0,5,7,0,3,1,1,0.54,0.5152,0.88,0.194,14,260,274 +3203,2011-05-18,2,0,5,8,0,3,1,1,0.56,0.5303,0.88,0.2836,25,428,453 +3204,2011-05-18,2,0,5,9,0,3,1,3,0.56,0.5303,0.83,0.2537,26,176,202 +3205,2011-05-18,2,0,5,10,0,3,1,3,0.54,0.5152,0.88,0.2836,33,73,106 +3206,2011-05-18,2,0,5,11,0,3,1,2,0.54,0.5152,0.94,0.0896,6,17,23 +3207,2011-05-18,2,0,5,12,0,3,1,1,0.56,0.5303,0.94,0.2985,12,42,54 +3208,2011-05-18,2,0,5,13,0,3,1,2,0.56,0.5303,0.83,0.3284,30,92,122 +3209,2011-05-18,2,0,5,14,0,3,1,2,0.58,0.5455,0.78,0.2836,44,94,138 +3210,2011-05-18,2,0,5,15,0,3,1,1,0.6,0.5758,0.78,0.2537,34,133,167 +3211,2011-05-18,2,0,5,16,0,3,1,1,0.6,0.5909,0.73,0.194,53,241,294 +3212,2011-05-18,2,0,5,17,0,3,1,1,0.6,0.5909,0.69,0.1343,78,487,565 +3213,2011-05-18,2,0,5,18,0,3,1,1,0.6,0.5909,0.73,0.2239,58,431,489 +3214,2011-05-18,2,0,5,19,0,3,1,3,0.56,0.5303,0.88,0.1343,43,288,331 
+3215,2011-05-18,2,0,5,20,0,3,1,3,0.56,0.5303,0.88,0.1343,24,189,213 +3216,2011-05-18,2,0,5,21,0,3,1,1,0.52,0.5,0.88,0,11,74,85 +3217,2011-05-18,2,0,5,22,0,3,1,1,0.52,0.5,0.83,0,14,103,117 +3218,2011-05-18,2,0,5,23,0,3,1,1,0.52,0.5,0.94,0.0896,10,49,59 +3219,2011-05-19,2,0,5,0,0,4,1,1,0.52,0.5,0.94,0.1045,6,23,29 +3220,2011-05-19,2,0,5,1,0,4,1,1,0.5,0.4848,0.94,0,2,4,6 +3221,2011-05-19,2,0,5,2,0,4,1,1,0.5,0.4848,0.94,0,3,12,15 +3222,2011-05-19,2,0,5,3,0,4,1,1,0.48,0.4697,1,0,1,3,4 +3223,2011-05-19,2,0,5,4,0,4,1,1,0.48,0.4697,0.94,0,2,3,5 +3224,2011-05-19,2,0,5,5,0,4,1,1,0.48,0.4697,0.94,0,2,24,26 +3225,2011-05-19,2,0,5,6,0,4,1,2,0.5,0.4848,0.94,0,17,86,103 +3226,2011-05-19,2,0,5,7,0,4,1,2,0.5,0.4848,1,0,18,239,257 +3227,2011-05-19,2,0,5,8,0,4,1,2,0.52,0.5,0.88,0,34,453,487 +3228,2011-05-19,2,0,5,9,0,4,1,2,0.54,0.5152,0.88,0.0896,40,176,216 +3229,2011-05-19,2,0,5,10,0,4,1,2,0.54,0.5152,0.83,0,35,95,130 +3230,2011-05-19,2,0,5,11,0,4,1,3,0.54,0.5152,0.88,0.1642,53,111,164 +3231,2011-05-19,2,0,5,12,0,4,1,2,0.58,0.5455,0.78,0.1045,38,130,168 +3232,2011-05-19,2,0,5,13,0,4,1,3,0.58,0.5455,0.64,0.2537,44,139,183 +3233,2011-05-19,2,0,5,14,0,4,1,3,0.54,0.5152,0.73,0.194,43,137,180 +3234,2011-05-19,2,0,5,15,0,4,1,1,0.56,0.5303,0.78,0.1642,63,125,188 +3235,2011-05-19,2,0,5,16,0,4,1,1,0.58,0.5455,0.64,0.194,55,247,302 +3236,2011-05-19,2,0,5,17,0,4,1,1,0.56,0.5303,0.78,0.1642,60,487,547 +3237,2011-05-19,2,0,5,18,0,4,1,1,0.6,0.6212,0.55,0.1642,59,454,513 +3238,2011-05-19,2,0,5,19,0,4,1,1,0.58,0.5455,0.6,0.1343,65,345,410 +3239,2011-05-19,2,0,5,20,0,4,1,1,0.54,0.5152,0.77,0.2836,46,236,282 +3240,2011-05-19,2,0,5,21,0,4,1,1,0.54,0.5152,0.77,0.2836,29,180,209 +3241,2011-05-19,2,0,5,22,0,4,1,3,0.5,0.4848,0.82,0.2985,11,68,79 +3242,2011-05-19,2,0,5,23,0,4,1,1,0.48,0.4697,0.94,0,9,63,72 +3243,2011-05-20,2,0,5,0,0,5,1,1,0.46,0.4545,0.94,0.0896,15,33,48 +3244,2011-05-20,2,0,5,1,0,5,1,1,0.46,0.4545,1,0,8,17,25 +3245,2011-05-20,2,0,5,2,0,5,1,1,0.44,0.4394,1,0.0896,4,4,8 
+3246,2011-05-20,2,0,5,3,0,5,1,1,0.44,0.4394,1,0,1,4,5 +3247,2011-05-20,2,0,5,4,0,5,1,1,0.44,0.4394,0.94,0.0896,1,4,5 +3248,2011-05-20,2,0,5,5,0,5,1,1,0.44,0.4394,0.94,0.1045,0,28,28 +3249,2011-05-20,2,0,5,6,0,5,1,1,0.46,0.4545,0.88,0.1045,15,102,117 +3250,2011-05-20,2,0,5,7,0,5,1,1,0.5,0.4848,0.77,0.1642,30,289,319 +3251,2011-05-20,2,0,5,8,0,5,1,1,0.54,0.5152,0.68,0,41,476,517 +3252,2011-05-20,2,0,5,9,0,5,1,1,0.54,0.5152,0.68,0.2239,35,199,234 +3253,2011-05-20,2,0,5,10,0,5,1,1,0.56,0.5303,0.64,0.194,44,104,148 +3254,2011-05-20,2,0,5,11,0,5,1,2,0.58,0.5455,0.6,0.194,63,133,196 +3255,2011-05-20,2,0,5,12,0,5,1,2,0.6,0.6212,0.56,0.2537,69,186,255 +3256,2011-05-20,2,0,5,13,0,5,1,3,0.6,0.6212,0.56,0.2836,60,172,232 +3257,2011-05-20,2,0,5,14,0,5,1,1,0.62,0.6212,0.53,0.2239,66,131,197 +3258,2011-05-20,2,0,5,15,0,5,1,2,0.6,0.6212,0.56,0.194,74,168,242 +3259,2011-05-20,2,0,5,16,0,5,1,2,0.6,0.6212,0.53,0.1642,61,269,330 +3260,2011-05-20,2,0,5,17,0,5,1,1,0.62,0.6212,0.53,0,71,483,554 +3261,2011-05-20,2,0,5,18,0,5,1,1,0.6,0.6212,0.56,0.1642,79,394,473 +3262,2011-05-20,2,0,5,19,0,5,1,1,0.58,0.5455,0.64,0.0896,49,253,302 +3263,2011-05-20,2,0,5,20,0,5,1,1,0.58,0.5455,0.64,0.1045,40,180,220 +3264,2011-05-20,2,0,5,21,0,5,1,1,0.56,0.5303,0.64,0.1045,40,155,195 +3265,2011-05-20,2,0,5,22,0,5,1,1,0.54,0.5152,0.73,0,28,124,152 +3266,2011-05-20,2,0,5,23,0,5,1,1,0.52,0.5,0.72,0.1642,15,100,115 +3267,2011-05-21,2,0,5,0,0,6,0,1,0.52,0.5,0.77,0.1045,20,78,98 +3268,2011-05-21,2,0,5,1,0,6,0,1,0.52,0.5,0.72,0,8,64,72 +3269,2011-05-21,2,0,5,2,0,6,0,1,0.52,0.5,0.72,0.1045,5,35,40 +3270,2011-05-21,2,0,5,3,0,6,0,1,0.48,0.4697,0.82,0.1045,7,12,19 +3271,2011-05-21,2,0,5,4,0,6,0,1,0.46,0.4545,0.88,0.1343,1,6,7 +3272,2011-05-21,2,0,5,5,0,6,0,1,0.46,0.4545,0.88,0.1045,1,4,5 +3273,2011-05-21,2,0,5,6,0,6,0,1,0.5,0.4848,0.82,0.1343,6,22,28 +3274,2011-05-21,2,0,5,7,0,6,0,1,0.54,0.5152,0.73,0.1343,10,33,43 +3275,2011-05-21,2,0,5,8,0,6,0,1,0.56,0.5303,0.68,0.2836,29,97,126 
+3276,2011-05-21,2,0,5,9,0,6,0,1,0.6,0.6061,0.64,0.2239,60,165,225 +3277,2011-05-21,2,0,5,10,0,6,0,1,0.62,0.6061,0.61,0.1045,122,201,323 +3278,2011-05-21,2,0,5,11,0,6,0,1,0.64,0.6212,0.53,0.1343,173,245,418 +3279,2011-05-21,2,0,5,12,0,6,0,1,0.66,0.6212,0.5,0,222,271,493 +3280,2011-05-21,2,0,5,13,0,6,0,1,0.7,0.6364,0.45,0,191,271,462 +3281,2011-05-21,2,0,5,14,0,6,0,1,0.72,0.6515,0.42,0,187,269,456 +3282,2011-05-21,2,0,5,15,0,6,0,1,0.72,0.6515,0.45,0.2537,232,274,506 +3283,2011-05-21,2,0,5,16,0,6,0,2,0.72,0.6515,0.39,0.2537,204,267,471 +3284,2011-05-21,2,0,5,17,0,6,0,2,0.72,0.6515,0.42,0.1642,191,253,444 +3285,2011-05-21,2,0,5,18,0,6,0,2,0.7,0.6364,0.45,0.1642,191,253,444 +3286,2011-05-21,2,0,5,19,0,6,0,1,0.68,0.6364,0.47,0.1642,117,184,301 +3287,2011-05-21,2,0,5,20,0,6,0,1,0.62,0.6061,0.61,0,84,164,248 +3288,2011-05-21,2,0,5,21,0,6,0,1,0.62,0.6061,0.61,0.1045,77,150,227 +3289,2011-05-21,2,0,5,22,0,6,0,1,0.6,0.5909,0.69,0.1343,78,126,204 +3290,2011-05-21,2,0,5,23,0,6,0,1,0.58,0.5455,0.78,0.0896,42,103,145 +3291,2011-05-22,2,0,5,0,0,0,0,1,0.54,0.5152,0.88,0.1642,31,100,131 +3292,2011-05-22,2,0,5,1,0,0,0,1,0.52,0.5,0.94,0.1343,17,81,98 +3293,2011-05-22,2,0,5,2,0,0,0,1,0.52,0.5,0.94,0.1045,18,50,68 +3294,2011-05-22,2,0,5,3,0,0,0,1,0.5,0.4848,1,0.1045,10,23,33 +3295,2011-05-22,2,0,5,4,0,0,0,1,0.5,0.4848,1,0,3,9,12 +3296,2011-05-22,2,0,5,5,0,0,0,1,0.5,0.4848,1,0.1045,1,5,6 +3297,2011-05-22,2,0,5,6,0,0,0,1,0.52,0.5,0.94,0.1045,5,13,18 +3298,2011-05-22,2,0,5,7,0,0,0,1,0.54,0.5152,0.88,0.1045,13,27,40 +3299,2011-05-22,2,0,5,8,0,0,0,1,0.6,0.5909,0.73,0.0896,29,65,94 +3300,2011-05-22,2,0,5,9,0,0,0,1,0.62,0.6061,0.69,0.1045,59,130,189 +3301,2011-05-22,2,0,5,10,0,0,0,1,0.64,0.6061,0.65,0.1343,135,170,305 +3302,2011-05-22,2,0,5,11,0,0,0,1,0.7,0.6515,0.54,0.194,164,209,373 +3303,2011-05-22,2,0,5,12,0,0,0,2,0.72,0.6667,0.51,0.194,146,255,401 +3304,2011-05-22,2,0,5,13,0,0,0,2,0.66,0.6212,0.54,0.2239,180,275,455 +3305,2011-05-22,2,0,5,14,0,0,0,3,0.62,0.6061,0.69,0.194,125,191,316 
+3306,2011-05-22,2,0,5,15,0,0,0,2,0.66,0.6212,0.61,0.1642,110,221,331 +3307,2011-05-22,2,0,5,16,0,0,0,1,0.68,0.6364,0.61,0.1343,72,201,273 +3308,2011-05-22,2,0,5,17,0,0,0,1,0.7,0.6515,0.58,0.2537,93,213,306 +3309,2011-05-22,2,0,5,18,0,0,0,1,0.66,0.6212,0.61,0.2239,120,238,358 +3310,2011-05-22,2,0,5,19,0,0,0,1,0.66,0.6212,0.61,0.194,107,193,300 +3311,2011-05-22,2,0,5,20,0,0,0,1,0.64,0.6061,0.65,0.1642,57,174,231 +3312,2011-05-22,2,0,5,21,0,0,0,1,0.62,0.5909,0.73,0.194,27,109,136 +3313,2011-05-22,2,0,5,22,0,0,0,1,0.6,0.5606,0.83,0.1343,35,86,121 +3314,2011-05-22,2,0,5,23,0,0,0,2,0.58,0.5455,0.83,0.1343,19,46,65 +3315,2011-05-23,2,0,5,0,0,1,1,2,0.56,0.5303,0.88,0.1343,23,18,41 +3316,2011-05-23,2,0,5,1,0,1,1,1,0.56,0.5303,0.88,0.1045,4,5,9 +3317,2011-05-23,2,0,5,2,0,1,1,3,0.56,0.5303,0.88,0.1642,2,3,5 +3318,2011-05-23,2,0,5,3,0,1,1,3,0.54,0.5152,0.94,0.1642,0,4,4 +3319,2011-05-23,2,0,5,4,0,1,1,2,0.54,0.5152,0.94,0,2,2,4 +3320,2011-05-23,2,0,5,5,0,1,1,2,0.54,0.5152,0.94,0,0,11,11 +3321,2011-05-23,2,0,5,6,0,1,1,2,0.54,0.5152,1,0,4,92,96 +3322,2011-05-23,2,0,5,7,0,1,1,2,0.56,0.5303,0.94,0.1045,13,223,236 +3323,2011-05-23,2,0,5,8,0,1,1,1,0.6,0.5455,0.88,0.194,50,359,409 +3324,2011-05-23,2,0,5,9,0,1,1,1,0.6,0.5455,0.88,0.2239,36,131,167 +3325,2011-05-23,2,0,5,10,0,1,1,1,0.66,0.6212,0.74,0.2985,49,72,121 +3326,2011-05-23,2,0,5,11,0,1,1,1,0.68,0.6364,0.69,0.3881,73,107,180 +3327,2011-05-23,2,0,5,12,0,1,1,1,0.68,0.6364,0.69,0.3881,65,139,204 +3328,2011-05-23,2,0,5,13,0,1,1,2,0.74,0.6818,0.58,0.4179,49,155,204 +3329,2011-05-23,2,0,5,14,0,1,1,2,0.72,0.6818,0.62,0.5224,57,130,187 +3330,2011-05-23,2,0,5,15,0,1,1,2,0.72,0.6818,0.62,0.4478,57,119,176 +3331,2011-05-23,2,0,5,16,0,1,1,2,0.72,0.6818,0.62,0.2985,56,223,279 +3332,2011-05-23,2,0,5,17,0,1,1,3,0.72,0.6818,0.7,0.3881,56,373,429 +3333,2011-05-23,2,0,5,18,0,1,1,1,0.7,0.6667,0.74,0.2836,68,385,453 +3334,2011-05-23,2,0,5,19,0,1,1,1,0.7,0.6667,0.74,0.2239,56,302,358 
+3335,2011-05-23,2,0,5,20,0,1,1,1,0.64,0.5758,0.83,0.2239,39,271,310 +3336,2011-05-23,2,0,5,21,0,1,1,1,0.64,0.5758,0.89,0.2239,33,155,188 +3337,2011-05-23,2,0,5,22,0,1,1,1,0.62,0.5455,0.94,0.2239,25,106,131 +3338,2011-05-23,2,0,5,23,0,1,1,2,0.62,0.5606,0.88,0.194,19,53,72 +3339,2011-05-24,2,0,5,0,0,2,1,2,0.6,0.5152,0.94,0.194,12,23,35 +3340,2011-05-24,2,0,5,1,0,2,1,2,0.6,0.5152,0.94,0.1642,6,9,15 +3341,2011-05-24,2,0,5,2,0,2,1,2,0.6,0.5152,0.94,0.194,2,8,10 +3342,2011-05-24,2,0,5,3,0,2,1,2,0.58,0.5455,0.94,0.1045,0,2,2 +3343,2011-05-24,2,0,5,4,0,2,1,1,0.6,0.5455,0.88,0.1045,1,3,4 +3344,2011-05-24,2,0,5,5,0,2,1,2,0.58,0.5455,0.94,0.2537,2,22,24 +3345,2011-05-24,2,0,5,6,0,2,1,2,0.58,0.5455,0.94,0.194,10,102,112 +3346,2011-05-24,2,0,5,7,0,2,1,3,0.64,0.5909,0.78,0.194,26,288,314 +3347,2011-05-24,2,0,5,8,0,2,1,3,0.62,0.5758,0.83,0.2239,31,403,434 +3348,2011-05-24,2,0,5,9,0,2,1,2,0.64,0.6061,0.73,0.194,27,111,138 +3349,2011-05-24,2,0,5,10,0,2,1,1,0.7,0.6515,0.61,0.3284,18,90,108 +3350,2011-05-24,2,0,5,11,0,2,1,1,0.74,0.6667,0.51,0.2239,33,142,175 +3351,2011-05-24,2,0,5,12,0,2,1,1,0.76,0.6818,0.45,0.3881,41,129,170 +3352,2011-05-24,2,0,5,13,0,2,1,1,0.78,0.697,0.46,0.2537,38,152,190 +3353,2011-05-24,2,0,5,14,0,2,1,1,0.78,0.697,0.46,0.2836,27,126,153 +3354,2011-05-24,2,0,5,15,0,2,1,3,0.74,0.6818,0.55,0.4179,59,129,188 +3355,2011-05-24,2,0,5,16,0,2,1,1,0.66,0.6212,0.69,0.194,52,225,277 +3356,2011-05-24,2,0,5,17,0,2,1,1,0.7,0.6515,0.61,0.1045,56,492,548 +3357,2011-05-24,2,0,5,18,0,2,1,1,0.7,0.6515,0.7,0.194,74,490,564 +3358,2011-05-24,2,0,5,19,0,2,1,1,0.66,0.6212,0.74,0.194,45,310,355 +3359,2011-05-24,2,0,5,20,0,2,1,1,0.66,0.6212,0.74,0.0896,39,241,280 +3360,2011-05-24,2,0,5,21,0,2,1,1,0.64,0.5909,0.78,0.0896,21,168,189 +3361,2011-05-24,2,0,5,22,0,2,1,3,0.62,0.5606,0.88,0.2239,26,115,141 +3362,2011-05-24,2,0,5,23,0,2,1,3,0.66,0.6212,0.74,0.1642,13,53,66 +3363,2011-05-25,2,0,5,0,0,3,1,3,0.6,0.5606,0.83,0.2239,3,32,35 
+3364,2011-05-25,2,0,5,1,0,3,1,3,0.58,0.5455,0.83,0.194,6,2,8 +3365,2011-05-25,2,0,5,2,0,3,1,2,0.56,0.5303,0.94,0,0,8,8 +3366,2011-05-25,2,0,5,3,0,3,1,1,0.54,0.5152,1,0.1343,0,9,9 +3367,2011-05-25,2,0,5,4,0,3,1,1,0.54,0.5152,1,0.1045,0,3,3 +3368,2011-05-25,2,0,5,5,0,3,1,1,0.56,0.5303,0.88,0.0896,2,23,25 +3369,2011-05-25,2,0,5,6,0,3,1,1,0.56,0.5303,0.94,0.0896,4,113,117 +3370,2011-05-25,2,0,5,7,0,3,1,1,0.62,0.5909,0.73,0.1343,29,284,313 +3371,2011-05-25,2,0,5,8,0,3,1,1,0.64,0.6061,0.69,0.2239,36,495,531 +3372,2011-05-25,2,0,5,9,0,3,1,1,0.68,0.6364,0.65,0.0896,27,207,234 +3373,2011-05-25,2,0,5,10,0,3,1,1,0.7,0.6515,0.61,0.0896,30,105,135 +3374,2011-05-25,2,0,5,11,0,3,1,1,0.74,0.6667,0.51,0.0896,36,104,140 +3375,2011-05-25,2,0,5,12,0,3,1,1,0.74,0.6818,0.55,0.1343,32,178,210 +3376,2011-05-25,2,0,5,13,0,3,1,1,0.74,0.6818,0.55,0.1343,39,163,202 +3377,2011-05-25,2,0,5,14,0,3,1,1,0.74,0.6818,0.55,0.1642,47,121,168 +3378,2011-05-25,2,0,5,15,0,3,1,1,0.76,0.697,0.52,0.2537,37,145,182 +3379,2011-05-25,2,0,5,16,0,3,1,1,0.74,0.6667,0.51,0.2836,61,238,299 +3380,2011-05-25,2,0,5,17,0,3,1,1,0.74,0.6667,0.51,0.2239,77,524,601 +3381,2011-05-25,2,0,5,18,0,3,1,1,0.72,0.6667,0.58,0.2239,66,451,517 +3382,2011-05-25,2,0,5,19,0,3,1,1,0.7,0.6515,0.61,0.1343,73,332,405 +3383,2011-05-25,2,0,5,20,0,3,1,1,0.7,0.6515,0.61,0.1642,56,284,340 +3384,2011-05-25,2,0,5,21,0,3,1,1,0.66,0.6212,0.69,0.1343,33,203,236 +3385,2011-05-25,2,0,5,22,0,3,1,1,0.66,0.6212,0.69,0.2537,34,140,174 +3386,2011-05-25,2,0,5,23,0,3,1,1,0.64,0.6061,0.73,0.1343,12,74,86 +3387,2011-05-26,2,0,5,0,0,4,1,1,0.64,0.6061,0.73,0.1343,11,34,45 +3388,2011-05-26,2,0,5,1,0,4,1,1,0.64,0.6061,0.73,0.1343,3,13,16 +3389,2011-05-26,2,0,5,2,0,4,1,1,0.62,0.5758,0.83,0.1045,7,7,14 +3390,2011-05-26,2,0,5,3,0,4,1,1,0.6,0.5455,0.88,0.0896,0,4,4 +3391,2011-05-26,2,0,5,4,0,4,1,1,0.6,0.5455,0.88,0,0,2,2 +3392,2011-05-26,2,0,5,5,0,4,1,2,0.6,0.5455,0.88,0.0896,2,25,27 +3393,2011-05-26,2,0,5,6,0,4,1,2,0.6,0.5,1,0.0896,4,94,98 
+3394,2011-05-26,2,0,5,7,0,4,1,2,0.62,0.5455,0.94,0.2537,16,270,286 +3395,2011-05-26,2,0,5,8,0,4,1,2,0.66,0.6061,0.83,0.1642,31,458,489 +3396,2011-05-26,2,0,5,9,0,4,1,1,0.72,0.6818,0.7,0.1642,22,194,216 +3397,2011-05-26,2,0,5,10,0,4,1,1,0.7,0.6667,0.74,0.2836,49,94,143 +3398,2011-05-26,2,0,5,11,0,4,1,1,0.74,0.697,0.66,0.2239,36,124,160 +3399,2011-05-26,2,0,5,12,0,4,1,1,0.78,0.7424,0.59,0.3284,53,141,194 +3400,2011-05-26,2,0,5,13,0,4,1,1,0.82,0.7727,0.52,0.2836,52,145,197 +3401,2011-05-26,2,0,5,14,0,4,1,1,0.82,0.7727,0.52,0.2836,44,145,189 +3402,2011-05-26,2,0,5,15,0,4,1,2,0.82,0.7576,0.46,0.2836,40,101,141 +3403,2011-05-26,2,0,5,16,0,4,1,2,0.8,0.7424,0.49,0.3582,58,233,291 +3404,2011-05-26,2,0,5,17,0,4,1,1,0.8,0.7273,0.46,0.2836,49,445,494 +3405,2011-05-26,2,0,5,18,0,4,1,1,0.8,0.7273,0.46,0.3284,74,404,478 +3406,2011-05-26,2,0,5,19,0,4,1,1,0.78,0.697,0.46,0.3284,67,311,378 +3407,2011-05-26,2,0,5,20,0,4,1,1,0.72,0.6667,0.58,0.1642,55,267,322 +3408,2011-05-26,2,0,5,21,0,4,1,1,0.72,0.6818,0.62,0.0896,45,178,223 +3409,2011-05-26,2,0,5,22,0,4,1,1,0.7,0.6515,0.65,0.1343,23,157,180 +3410,2011-05-26,2,0,5,23,0,4,1,1,0.7,0.6515,0.65,0.194,17,73,90 +3411,2011-05-27,2,0,5,0,0,5,1,1,0.68,0.6364,0.69,0.2537,14,55,69 +3412,2011-05-27,2,0,5,1,0,5,1,1,0.7,0.6515,0.58,0.2985,14,36,50 +3413,2011-05-27,2,0,5,2,0,5,1,1,0.68,0.6364,0.61,0.2239,6,12,18 +3414,2011-05-27,2,0,5,3,0,5,1,1,0.66,0.6212,0.65,0.0896,1,11,12 +3415,2011-05-27,2,0,5,4,0,5,1,1,0.64,0.6061,0.69,0.1343,2,5,7 +3416,2011-05-27,2,0,5,5,0,5,1,1,0.64,0.6061,0.65,0.2836,2,29,31 +3417,2011-05-27,2,0,5,6,0,5,1,1,0.64,0.6061,0.69,0.0896,8,73,81 +3418,2011-05-27,2,0,5,7,0,5,1,1,0.64,0.6061,0.73,0.1642,20,229,249 +3419,2011-05-27,2,0,5,8,0,5,1,1,0.66,0.6212,0.74,0.2836,25,393,418 +3420,2011-05-27,2,0,5,9,0,5,1,1,0.7,0.6515,0.65,0.2836,37,190,227 +3421,2011-05-27,2,0,5,10,0,5,1,1,0.72,0.6667,0.54,0.3881,42,111,153 +3422,2011-05-27,2,0,5,11,0,5,1,2,0.74,0.6818,0.55,0.3284,40,126,166 
+3423,2011-05-27,2,0,5,12,0,5,1,2,0.76,0.697,0.55,0.2985,53,190,243 +3424,2011-05-27,2,0,5,13,0,5,1,1,0.74,0.6818,0.58,0.194,51,158,209 +3425,2011-05-27,2,0,5,14,0,5,1,1,0.76,0.697,0.55,0.2537,51,188,239 +3426,2011-05-27,2,0,5,15,0,5,1,1,0.78,0.7121,0.49,0.2836,98,258,356 +3427,2011-05-27,2,0,5,16,0,5,1,2,0.76,0.697,0.55,0.3881,63,356,419 +3428,2011-05-27,2,0,5,17,0,5,1,1,0.74,0.6818,0.58,0.3881,77,414,491 +3429,2011-05-27,2,0,5,18,0,5,1,2,0.72,0.6818,0.62,0.2985,89,310,399 +3430,2011-05-27,2,0,5,19,0,5,1,3,0.6,0.5758,0.78,0.2537,69,210,279 +3431,2011-05-27,2,0,5,20,0,5,1,2,0.62,0.5909,0.73,0.2239,24,120,144 +3432,2011-05-27,2,0,5,21,0,5,1,1,0.6,0.5606,0.83,0.1045,37,114,151 +3433,2011-05-27,2,0,5,22,0,5,1,1,0.6,0.5758,0.78,0.1343,25,116,141 +3434,2011-05-27,2,0,5,23,0,5,1,1,0.58,0.5455,0.88,0.1343,23,104,127 +3435,2011-05-28,2,0,5,0,0,6,0,1,0.58,0.5455,0.88,0.1343,15,79,94 +3436,2011-05-28,2,0,5,1,0,6,0,1,0.58,0.5455,0.88,0.194,8,47,55 +3437,2011-05-28,2,0,5,2,0,6,0,1,0.56,0.5303,0.94,0.194,13,35,48 +3438,2011-05-28,2,0,5,3,0,6,0,1,0.56,0.5303,0.88,0.2239,2,13,15 +3439,2011-05-28,2,0,5,4,0,6,0,1,0.56,0.5303,0.88,0.2537,0,4,4 +3440,2011-05-28,2,0,5,5,0,6,0,1,0.56,0.5303,0.88,0.2239,4,3,7 +3441,2011-05-28,2,0,5,6,0,6,0,1,0.58,0.5455,0.83,0.2537,3,16,19 +3442,2011-05-28,2,0,5,7,0,6,0,1,0.6,0.5606,0.83,0.2537,17,33,50 +3443,2011-05-28,2,0,5,8,0,6,0,1,0.62,0.5909,0.78,0.194,27,75,102 +3444,2011-05-28,2,0,5,9,0,6,0,1,0.64,0.6061,0.73,0.2537,61,116,177 +3445,2011-05-28,2,0,5,10,0,6,0,1,0.7,0.6515,0.65,0.2239,110,166,276 +3446,2011-05-28,2,0,5,11,0,6,0,1,0.7,0.6515,0.65,0.2239,171,178,349 +3447,2011-05-28,2,0,5,12,0,6,0,1,0.72,0.6667,0.58,0.2537,145,213,358 +3448,2011-05-28,2,0,5,13,0,6,0,1,0.72,0.6818,0.62,0.2239,168,217,385 +3449,2011-05-28,2,0,5,14,0,6,0,1,0.74,0.6818,0.58,0.2239,172,210,382 +3450,2011-05-28,2,0,5,15,0,6,0,1,0.74,0.6818,0.55,0.2239,187,187,374 +3451,2011-05-28,2,0,5,16,0,6,0,1,0.76,0.697,0.55,0.2836,201,201,402 
+3452,2011-05-28,2,0,5,17,0,6,0,1,0.74,0.6818,0.55,0.2985,180,161,341 +3453,2011-05-28,2,0,5,18,0,6,0,1,0.72,0.6818,0.62,0.2537,173,212,385 +3454,2011-05-28,2,0,5,19,0,6,0,1,0.7,0.6515,0.65,0.194,99,153,252 +3455,2011-05-28,2,0,5,20,0,6,0,1,0.7,0.6515,0.65,0.2239,68,128,196 +3456,2011-05-28,2,0,5,21,0,6,0,1,0.66,0.6212,0.74,0.2985,87,114,201 +3457,2011-05-28,2,0,5,22,0,6,0,1,0.66,0.6061,0.78,0.2239,46,115,161 +3458,2011-05-28,2,0,5,23,0,6,0,1,0.64,0.5758,0.83,0.194,44,81,125 +3459,2011-05-29,2,0,5,0,0,0,0,1,0.64,0.5758,0.83,0.2239,32,51,83 +3460,2011-05-29,2,0,5,1,0,0,0,1,0.62,0.5606,0.88,0.1642,17,49,66 +3461,2011-05-29,2,0,5,2,0,0,0,1,0.62,0.5606,0.88,0.1045,12,49,61 +3462,2011-05-29,2,0,5,3,0,0,0,1,0.62,0.5606,0.88,0.1045,2,24,26 +3463,2011-05-29,2,0,5,4,0,0,0,1,0.6,0.5455,0.88,0.0896,13,10,23 +3464,2011-05-29,2,0,5,5,0,0,0,1,0.6,0.5152,0.94,0.1642,3,3,6 +3465,2011-05-29,2,0,5,6,0,0,0,2,0.62,0.5606,0.88,0.194,5,10,15 +3466,2011-05-29,2,0,5,7,0,0,0,2,0.62,0.5606,0.88,0.2537,13,17,30 +3467,2011-05-29,2,0,5,8,0,0,0,2,0.62,0.5606,0.88,0.2239,34,62,96 +3468,2011-05-29,2,0,5,9,0,0,0,2,0.62,0.5455,0.94,0.2537,74,78,152 +3469,2011-05-29,2,0,5,10,0,0,0,2,0.66,0.6061,0.83,0.2239,130,121,251 +3470,2011-05-29,2,0,5,11,0,0,0,2,0.66,0.6061,0.83,0.2836,139,180,319 +3471,2011-05-29,2,0,5,12,0,0,0,2,0.7,0.6667,0.74,0.2239,216,186,402 +3472,2011-05-29,2,0,5,13,0,0,0,1,0.72,0.6818,0.7,0.2836,237,181,418 +3473,2011-05-29,2,0,5,14,0,0,0,1,0.74,0.697,0.7,0.2836,183,168,351 +3474,2011-05-29,2,0,5,15,0,0,0,1,0.74,0.697,0.7,0.2985,221,155,376 +3475,2011-05-29,2,0,5,16,0,0,0,1,0.74,0.697,0.7,0.2836,194,167,361 +3476,2011-05-29,2,0,5,17,0,0,0,1,0.74,0.697,0.7,0.2836,214,169,383 +3477,2011-05-29,2,0,5,18,0,0,0,1,0.72,0.697,0.74,0.2836,151,177,328 +3478,2011-05-29,2,0,5,19,0,0,0,1,0.72,0.697,0.74,0.2537,141,146,287 +3479,2011-05-29,2,0,5,20,0,0,0,1,0.7,0.6667,0.79,0.194,91,128,219 +3480,2011-05-29,2,0,5,21,0,0,0,1,0.68,0.6364,0.83,0.1642,107,122,229 
+3481,2011-05-29,2,0,5,22,0,0,0,1,0.66,0.5909,0.89,0.1343,78,93,171 +3482,2011-05-29,2,0,5,23,0,0,0,1,0.66,0.5909,0.89,0.1642,48,87,135 +3483,2011-05-30,2,0,5,0,1,1,0,1,0.64,0.5606,0.94,0.194,38,65,103 +3484,2011-05-30,2,0,5,1,1,1,0,1,0.64,0.5606,0.94,0.1343,26,53,79 +3485,2011-05-30,2,0,5,2,1,1,0,1,0.64,0.5606,0.94,0.194,17,28,45 +3486,2011-05-30,2,0,5,3,1,1,0,1,0.64,0.5606,0.94,0.194,10,8,18 +3487,2011-05-30,2,0,5,4,1,1,0,1,0.62,0.5455,0.94,0.1642,2,4,6 +3488,2011-05-30,2,0,5,5,1,1,0,2,0.62,0.5455,0.94,0.1045,1,6,7 +3489,2011-05-30,2,0,5,6,1,1,0,2,0.64,0.5606,0.94,0.0896,6,10,16 +3490,2011-05-30,2,0,5,7,1,1,0,2,0.64,0.5606,0.94,0,12,19,31 +3491,2011-05-30,2,0,5,8,1,1,0,2,0.66,0.5909,0.89,0.1045,36,58,94 +3492,2011-05-30,2,0,5,9,1,1,0,1,0.72,0.697,0.74,0.0896,47,81,128 +3493,2011-05-30,2,0,5,10,1,1,0,1,0.8,0.7576,0.55,0.2537,116,153,269 +3494,2011-05-30,2,0,5,11,1,1,0,1,0.82,0.7727,0.52,0,153,168,321 +3495,2011-05-30,2,0,5,12,1,1,0,1,0.86,0.803,0.47,0.1045,179,187,366 +3496,2011-05-30,2,0,5,13,1,1,0,1,0.86,0.7879,0.44,0.0896,133,217,350 +3497,2011-05-30,2,0,5,14,1,1,0,1,0.88,0.803,0.39,0,142,185,327 +3498,2011-05-30,2,0,5,15,1,1,0,1,0.88,0.803,0.39,0.2537,132,179,311 +3499,2011-05-30,2,0,5,16,1,1,0,1,0.88,0.7879,0.37,0.2836,101,208,309 +3500,2011-05-30,2,0,5,17,1,1,0,1,0.86,0.7879,0.41,0.2239,102,202,304 +3501,2011-05-30,2,0,5,18,1,1,0,1,0.86,0.7879,0.41,0.1642,77,172,249 +3502,2011-05-30,2,0,5,19,1,1,0,2,0.52,0.5,0.48,0.1045,68,157,225 +3503,2011-05-30,2,0,5,20,1,1,0,1,0.76,0.7121,0.58,0,69,168,237 +3504,2011-05-30,2,0,5,21,1,1,0,1,0.74,0.697,0.7,0.1343,42,107,149 +3505,2011-05-30,2,0,5,22,1,1,0,1,0.72,0.697,0.74,0.1642,26,80,106 +3506,2011-05-30,2,0,5,23,1,1,0,1,0.7,0.6667,0.84,0.1045,14,34,48 +3507,2011-05-31,2,0,5,0,0,2,1,1,0.7,0.6667,0.79,0.0896,9,20,29 +3508,2011-05-31,2,0,5,1,0,2,1,1,0.68,0.6364,0.89,0.0896,8,9,17 +3509,2011-05-31,2,0,5,2,0,2,1,2,0.66,0.5909,0.94,0.1343,5,7,12 +3510,2011-05-31,2,0,5,3,0,2,1,2,0.64,0.5606,0.94,0,1,3,4 
+3511,2011-05-31,2,0,5,4,0,2,1,1,0.66,0.6061,0.83,0.0896,0,3,3 +3512,2011-05-31,2,0,5,5,0,2,1,1,0.66,0.6061,0.83,0.0896,0,26,26 +3513,2011-05-31,2,0,5,6,0,2,1,1,0.68,0.6364,0.79,0.0896,11,98,109 +3514,2011-05-31,2,0,5,7,0,2,1,1,0.74,0.697,0.66,0.1045,16,219,235 +3515,2011-05-31,2,0,5,8,0,2,1,1,0.78,0.7424,0.59,0.1343,34,372,406 +3516,2011-05-31,2,0,5,9,0,2,1,1,0.8,0.7576,0.55,0.1343,22,153,175 +3517,2011-05-31,2,0,5,10,0,2,1,1,0.82,0.7879,0.56,0.1343,37,72,109 +3518,2011-05-31,2,0,5,11,0,2,1,1,0.86,0.803,0.47,0,28,78,106 +3519,2011-05-31,2,0,5,12,0,2,1,1,0.86,0.803,0.47,0.1343,37,125,162 +3520,2011-05-31,2,0,5,13,0,2,1,1,0.9,0.8333,0.39,0.1642,37,110,147 +3521,2011-05-31,2,0,5,14,0,2,1,1,0.9,0.8182,0.37,0,41,99,140 +3522,2011-05-31,2,0,5,15,0,2,1,1,0.9,0.8182,0.37,0.1642,42,133,175 +3523,2011-05-31,2,0,5,16,0,2,1,1,0.9,0.8333,0.39,0.1343,44,205,249 +3524,2011-05-31,2,0,5,17,0,2,1,1,0.84,0.803,0.53,0.2239,67,428,495 +3525,2011-05-31,2,0,5,18,0,2,1,1,0.84,0.803,0.53,0.194,55,362,417 +3526,2011-05-31,2,0,5,19,0,2,1,1,0.78,0.7424,0.62,0.1045,39,246,285 +3527,2011-05-31,2,0,5,20,0,2,1,1,0.78,0.7424,0.62,0.1045,38,213,251 +3528,2011-05-31,2,0,5,21,0,2,1,1,0.76,0.7273,0.66,0.1045,45,165,210 +3529,2011-05-31,2,0,5,22,0,2,1,1,0.74,0.697,0.7,0.1642,42,117,159 +3530,2011-05-31,2,0,5,23,0,2,1,1,0.72,0.697,0.79,0.0896,15,46,61 +3531,2011-06-01,2,0,6,0,0,3,1,1,0.7,0.6667,0.79,0.1642,9,25,34 +3532,2011-06-01,2,0,6,1,0,3,1,1,0.7,0.6667,0.84,0.2537,8,9,17 +3533,2011-06-01,2,0,6,2,0,3,1,1,0.7,0.6667,0.79,0.1642,0,3,3 +3534,2011-06-01,2,0,6,3,0,3,1,1,0.68,0.6364,0.89,0.194,0,6,6 +3535,2011-06-01,2,0,6,4,0,3,1,2,0.68,0.6364,0.89,0.2239,0,4,4 +3536,2011-06-01,2,0,6,5,0,3,1,2,0.66,0.5909,0.89,0.0896,2,19,21 +3537,2011-06-01,2,0,6,6,0,3,1,2,0.68,0.6364,0.89,0.0896,7,121,128 +3538,2011-06-01,2,0,6,7,0,3,1,2,0.7,0.6667,0.79,0.1642,19,265,284 +3539,2011-06-01,2,0,6,8,0,3,1,2,0.72,0.697,0.79,0.194,37,417,454 +3540,2011-06-01,2,0,6,9,0,3,1,2,0.74,0.7121,0.74,0.2239,34,173,207 
+3541,2011-06-01,2,0,6,10,0,3,1,2,0.76,0.7424,0.75,0.2537,31,84,115 +3542,2011-06-01,2,0,6,11,0,3,1,2,0.82,0.803,0.59,0.2239,26,86,112 +3543,2011-06-01,2,0,6,12,0,3,1,1,0.86,0.8333,0.53,0,32,137,169 +3544,2011-06-01,2,0,6,13,0,3,1,1,0.9,0.8182,0.37,0.194,29,125,154 +3545,2011-06-01,2,0,6,14,0,3,1,1,0.9,0.8182,0.37,0.2537,40,105,145 +3546,2011-06-01,2,0,6,15,0,3,1,1,0.9,0.8333,0.39,0.2985,25,127,152 +3547,2011-06-01,2,0,6,16,0,3,1,1,0.86,0.803,0.47,0.2985,39,227,266 +3548,2011-06-01,2,0,6,17,0,3,1,1,0.86,0.803,0.47,0.2985,52,434,486 +3549,2011-06-01,2,0,6,18,0,3,1,3,0.82,0.7879,0.56,0.3881,31,278,309 +3550,2011-06-01,2,0,6,19,0,3,1,2,0.74,0.7121,0.79,0.1642,16,248,264 +3551,2011-06-01,2,0,6,20,0,3,1,2,0.74,0.7121,0.79,0.2537,23,233,256 +3552,2011-06-01,2,0,6,21,0,3,1,1,0.74,0.697,0.7,0.2836,29,161,190 +3553,2011-06-01,2,0,6,22,0,3,1,1,0.74,0.697,0.66,0.1343,14,115,129 +3554,2011-06-01,2,0,6,23,0,3,1,1,0.74,0.6667,0.51,0.1642,10,59,69 +3555,2011-06-02,2,0,6,0,0,4,1,1,0.74,0.6515,0.4,0.3582,11,31,42 +3556,2011-06-02,2,0,6,1,0,4,1,1,0.72,0.6515,0.38,0.1045,3,12,15 +3557,2011-06-02,2,0,6,2,0,4,1,1,0.7,0.6364,0.39,0,0,6,6 +3558,2011-06-02,2,0,6,3,0,4,1,1,0.66,0.6212,0.44,0.1045,0,4,4 +3559,2011-06-02,2,0,6,4,0,4,1,1,0.66,0.6212,0.41,0.194,0,3,3 +3560,2011-06-02,2,0,6,5,0,4,1,1,0.66,0.6212,0.39,0.2239,3,28,31 +3561,2011-06-02,2,0,6,6,0,4,1,1,0.66,0.6212,0.39,0.1343,7,106,113 +3562,2011-06-02,2,0,6,7,0,4,1,1,0.7,0.6364,0.34,0.2537,19,285,304 +3563,2011-06-02,2,0,6,8,0,4,1,1,0.7,0.6364,0.32,0.2985,25,442,467 +3564,2011-06-02,2,0,6,9,0,4,1,1,0.72,0.6364,0.26,0.3582,28,197,225 +3565,2011-06-02,2,0,6,10,0,4,1,1,0.72,0.6515,0.28,0.4925,33,106,139 +3566,2011-06-02,2,0,6,11,0,4,1,1,0.74,0.6515,0.28,0.3881,42,123,165 +3567,2011-06-02,2,0,6,12,0,4,1,1,0.76,0.6667,0.25,0.4627,39,193,232 +3568,2011-06-02,2,0,6,13,0,4,1,1,0.8,0.6818,0.24,0.3582,35,168,203 +3569,2011-06-02,2,0,6,14,0,4,1,1,0.78,0.6667,0.24,0.3284,43,142,185 
+3570,2011-06-02,2,0,6,15,0,4,1,1,0.8,0.6818,0.21,0.3881,44,144,188 +3571,2011-06-02,2,0,6,16,0,4,1,1,0.8,0.6818,0.22,0.4925,63,255,318 +3572,2011-06-02,2,0,6,17,0,4,1,1,0.76,0.6515,0.2,0.5224,88,484,572 +3573,2011-06-02,2,0,6,18,0,4,1,1,0.74,0.6515,0.22,0.3881,74,451,525 +3574,2011-06-02,2,0,6,19,0,4,1,1,0.72,0.6364,0.23,0.2985,58,321,379 +3575,2011-06-02,2,0,6,20,0,4,1,1,0.7,0.6364,0.24,0.194,51,286,337 +3576,2011-06-02,2,0,6,21,0,4,1,1,0.66,0.6212,0.31,0.194,33,215,248 +3577,2011-06-02,2,0,6,22,0,4,1,1,0.64,0.6212,0.33,0.2537,14,141,155 +3578,2011-06-02,2,0,6,23,0,4,1,1,0.62,0.6212,0.35,0.2239,23,89,112 +3579,2011-06-03,2,0,6,0,0,5,1,1,0.62,0.6212,0.38,0.2836,15,53,68 +3580,2011-06-03,2,0,6,1,0,5,1,1,0.6,0.6212,0.38,0.3582,7,15,22 +3581,2011-06-03,2,0,6,2,0,5,1,1,0.56,0.5303,0.43,0.2239,0,12,12 +3582,2011-06-03,2,0,6,3,0,5,1,1,0.54,0.5152,0.49,0.2239,0,5,5 +3583,2011-06-03,2,0,6,4,0,5,1,1,0.52,0.5,0.48,0.3284,1,5,6 +3584,2011-06-03,2,0,6,5,0,5,1,1,0.52,0.5,0.48,0.3582,4,24,28 +3585,2011-06-03,2,0,6,6,0,5,1,1,0.54,0.5152,0.45,0.2985,8,98,106 +3586,2011-06-03,2,0,6,7,0,5,1,1,0.54,0.5152,0.45,0.2537,25,252,277 +3587,2011-06-03,2,0,6,8,0,5,1,2,0.56,0.5303,0.43,0.3284,31,471,502 +3588,2011-06-03,2,0,6,9,0,5,1,1,0.58,0.5455,0.4,0.1642,36,194,230 +3589,2011-06-03,2,0,6,10,0,5,1,1,0.62,0.6212,0.33,0.2537,54,107,161 +3590,2011-06-03,2,0,6,11,0,5,1,1,0.64,0.6212,0.31,0.3284,57,135,192 +3591,2011-06-03,2,0,6,12,0,5,1,1,0.66,0.6212,0.27,0.2836,51,208,259 +3592,2011-06-03,2,0,6,13,0,5,1,1,0.68,0.6212,0.26,0.2537,58,190,248 +3593,2011-06-03,2,0,6,14,0,5,1,1,0.7,0.6364,0.28,0.2836,66,174,240 +3594,2011-06-03,2,0,6,15,0,5,1,1,0.7,0.6364,0.28,0.2836,56,156,212 +3595,2011-06-03,2,0,6,16,0,5,1,1,0.72,0.6364,0.26,0.3284,63,286,349 +3596,2011-06-03,2,0,6,17,0,5,1,1,0.72,0.6364,0.25,0.2836,73,485,558 +3597,2011-06-03,2,0,6,18,0,5,1,1,0.7,0.6364,0.24,0.2985,76,488,564 +3598,2011-06-03,2,0,6,19,0,5,1,1,0.68,0.6212,0.26,0.2836,53,338,391 
+3599,2011-06-03,2,0,6,20,0,5,1,1,0.64,0.6212,0.29,0.1642,57,236,293 +3600,2011-06-03,2,0,6,21,0,5,1,1,0.64,0.6212,0.29,0.1045,51,196,247 +3601,2011-06-03,2,0,6,22,0,5,1,1,0.62,0.6212,0.35,0,26,145,171 +3602,2011-06-03,2,0,6,23,0,5,1,1,0.58,0.5455,0.46,0.1045,30,141,171 +3603,2011-06-04,2,0,6,0,0,6,0,1,0.56,0.5303,0.52,0,24,69,93 +3604,2011-06-04,2,0,6,1,0,6,0,1,0.54,0.5152,0.64,0.1045,14,80,94 +3605,2011-06-04,2,0,6,2,0,6,0,1,0.54,0.5152,0.56,0.194,18,41,59 +3606,2011-06-04,2,0,6,3,0,6,0,1,0.52,0.5,0.59,0.1045,8,10,18 +3607,2011-06-04,2,0,6,4,0,6,0,1,0.52,0.5,0.59,0.0896,4,11,15 +3608,2011-06-04,2,0,6,5,0,6,0,1,0.5,0.4848,0.64,0,3,5,8 +3609,2011-06-04,2,0,6,6,0,6,0,1,0.54,0.5152,0.6,0,11,17,28 +3610,2011-06-04,2,0,6,7,0,6,0,1,0.56,0.5303,0.56,0,27,60,87 +3611,2011-06-04,2,0,6,8,0,6,0,1,0.6,0.6212,0.46,0.0896,29,96,125 +3612,2011-06-04,2,0,6,9,0,6,0,1,0.62,0.6212,0.43,0.1343,55,169,224 +3613,2011-06-04,2,0,6,10,0,6,0,1,0.64,0.6212,0.38,0.1045,115,202,317 +3614,2011-06-04,2,0,6,11,0,6,0,1,0.66,0.6212,0.36,0.1343,120,249,369 +3615,2011-06-04,2,0,6,12,0,6,0,1,0.7,0.6364,0.32,0.1642,150,270,420 +3616,2011-06-04,2,0,6,13,0,6,0,1,0.74,0.6515,0.28,0.1343,188,268,456 +3617,2011-06-04,2,0,6,14,0,6,0,1,0.74,0.6515,0.33,0.2239,193,258,451 +3618,2011-06-04,2,0,6,15,0,6,0,2,0.74,0.6515,0.27,0.4179,180,224,404 +3619,2011-06-04,2,0,6,16,0,6,0,2,0.72,0.6515,0.3,0.1343,168,272,440 +3620,2011-06-04,2,0,6,17,0,6,0,2,0.72,0.6515,0.34,0.2537,142,202,344 +3621,2011-06-04,2,0,6,18,0,6,0,2,0.74,0.6515,0.3,0.1642,127,214,341 +3622,2011-06-04,2,0,6,19,0,6,0,2,0.7,0.6364,0.39,0.0896,79,206,285 +3623,2011-06-04,2,0,6,20,0,6,0,2,0.7,0.6515,0.48,0.0896,90,149,239 +3624,2011-06-04,2,0,6,21,0,6,0,2,0.66,0.6212,0.47,0.1045,46,139,185 +3625,2011-06-04,2,0,6,22,0,6,0,1,0.64,0.6212,0.57,0.2239,49,141,190 +3626,2011-06-04,2,0,6,23,0,6,0,2,0.64,0.6212,0.57,0,29,121,150 +3627,2011-06-05,2,0,6,0,0,0,0,1,0.64,0.6212,0.57,0.0896,23,90,113 +3628,2011-06-05,2,0,6,1,0,0,0,2,0.64,0.6212,0.61,0.1642,26,70,96 
+3629,2011-06-05,2,0,6,2,0,0,0,2,0.64,0.6212,0.61,0.1642,16,48,64 +3630,2011-06-05,2,0,6,3,0,0,0,2,0.62,0.6061,0.65,0.1642,22,25,47 +3631,2011-06-05,2,0,6,4,0,0,0,2,0.62,0.6061,0.65,0.1642,4,8,12 +3632,2011-06-05,2,0,6,5,0,0,0,2,0.62,0.6061,0.61,0.1343,1,5,6 +3633,2011-06-05,2,0,6,6,0,0,0,2,0.62,0.6061,0.61,0.1045,10,11,21 +3634,2011-06-05,2,0,6,7,0,0,0,2,0.6,0.5909,0.73,0.2537,8,19,27 +3635,2011-06-05,2,0,6,8,0,0,0,2,0.6,0.5909,0.73,0.2537,24,74,98 +3636,2011-06-05,2,0,6,9,0,0,0,2,0.6,0.5758,0.78,0.194,53,111,164 +3637,2011-06-05,2,0,6,10,0,0,0,2,0.62,0.5909,0.73,0.0896,83,168,251 +3638,2011-06-05,2,0,6,11,0,0,0,2,0.64,0.6061,0.69,0.1045,121,214,335 +3639,2011-06-05,2,0,6,12,0,0,0,2,0.64,0.6061,0.69,0.1045,123,212,335 +3640,2011-06-05,2,0,6,13,0,0,0,1,0.66,0.6212,0.65,0.1045,154,213,367 +3641,2011-06-05,2,0,6,14,0,0,0,1,0.7,0.6515,0.58,0.1642,161,224,385 +3642,2011-06-05,2,0,6,15,0,0,0,1,0.7,0.6515,0.54,0,161,256,417 +3643,2011-06-05,2,0,6,16,0,0,0,1,0.72,0.6667,0.51,0.0896,138,260,398 +3644,2011-06-05,2,0,6,17,0,0,0,1,0.74,0.6667,0.51,0,126,264,390 +3645,2011-06-05,2,0,6,18,0,0,0,1,0.7,0.6515,0.58,0.1045,124,239,363 +3646,2011-06-05,2,0,6,19,0,0,0,1,0.68,0.6364,0.65,0.1642,108,249,357 +3647,2011-06-05,2,0,6,20,0,0,0,1,0.66,0.6212,0.69,0.194,84,195,279 +3648,2011-06-05,2,0,6,21,0,0,0,1,0.64,0.6061,0.73,0.1642,54,111,165 +3649,2011-06-05,2,0,6,22,0,0,0,1,0.64,0.5909,0.78,0.2239,36,94,130 +3650,2011-06-05,2,0,6,23,0,0,0,1,0.62,0.5909,0.78,0.1343,25,61,86 +3651,2011-06-06,2,0,6,0,0,1,1,1,0.62,0.5909,0.78,0.1343,11,18,29 +3652,2011-06-06,2,0,6,1,0,1,1,1,0.6,0.5606,0.83,0.1343,9,5,14 +3653,2011-06-06,2,0,6,2,0,1,1,1,0.58,0.5455,0.88,0.1045,4,4,8 +3654,2011-06-06,2,0,6,3,0,1,1,1,0.58,0.5455,0.88,0,2,3,5 +3655,2011-06-06,2,0,6,4,0,1,1,1,0.56,0.5303,0.94,0.1045,4,4,8 +3656,2011-06-06,2,0,6,5,0,1,1,1,0.56,0.5303,0.88,0.0896,7,24,31 +3657,2011-06-06,2,0,6,6,0,1,1,1,0.58,0.5455,0.88,0,11,101,112 +3658,2011-06-06,2,0,6,7,0,1,1,1,0.62,0.5909,0.78,0.1045,18,281,299 
+3659,2011-06-06,2,0,6,8,0,1,1,1,0.64,0.6061,0.73,0.0896,28,410,438 +3660,2011-06-06,2,0,6,9,0,1,1,1,0.7,0.6515,0.58,0.1642,26,162,188 +3661,2011-06-06,2,0,6,10,0,1,1,1,0.72,0.6667,0.51,0.2239,31,64,95 +3662,2011-06-06,2,0,6,11,0,1,1,1,0.74,0.6667,0.42,0.1343,39,72,111 +3663,2011-06-06,2,0,6,12,0,1,1,1,0.76,0.6667,0.37,0.1642,25,128,153 +3664,2011-06-06,2,0,6,13,0,1,1,1,0.78,0.6818,0.35,0,28,152,180 +3665,2011-06-06,2,0,6,14,0,1,1,1,0.78,0.6818,0.31,0.1343,46,99,145 +3666,2011-06-06,2,0,6,15,0,1,1,1,0.8,0.697,0.29,0.194,38,127,165 +3667,2011-06-06,2,0,6,16,0,1,1,1,0.76,0.6667,0.33,0.1343,34,233,267 +3668,2011-06-06,2,0,6,17,0,1,1,1,0.78,0.6818,0.33,0.2836,63,516,579 +3669,2011-06-06,2,0,6,18,0,1,1,1,0.76,0.6667,0.31,0.1343,56,500,556 +3670,2011-06-06,2,0,6,19,0,1,1,1,0.72,0.6667,0.54,0.194,64,343,407 +3671,2011-06-06,2,0,6,20,0,1,1,1,0.68,0.6364,0.61,0.1343,59,277,336 +3672,2011-06-06,2,0,6,21,0,1,1,1,0.66,0.6212,0.57,0.0896,29,160,189 +3673,2011-06-06,2,0,6,22,0,1,1,1,0.66,0.6212,0.61,0.0896,36,137,173 +3674,2011-06-06,2,0,6,23,0,1,1,1,0.64,0.6061,0.69,0.0896,5,55,60 +3675,2011-06-07,2,0,6,0,0,2,1,1,0.64,0.6061,0.65,0.0896,5,15,20 +3676,2011-06-07,2,0,6,1,0,2,1,1,0.62,0.6061,0.69,0.1343,2,2,4 +3677,2011-06-07,2,0,6,2,0,2,1,1,0.6,0.5909,0.73,0.0896,3,2,5 +3678,2011-06-07,2,0,6,3,0,2,1,1,0.6,0.5909,0.73,0.0896,1,1,2 +3679,2011-06-07,2,0,6,4,0,2,1,1,0.58,0.5455,0.83,0.1343,2,3,5 +3680,2011-06-07,2,0,6,5,0,2,1,1,0.58,0.5455,0.88,0.1642,2,30,32 +3681,2011-06-07,2,0,6,6,0,2,1,1,0.6,0.5758,0.78,0.1343,10,116,126 +3682,2011-06-07,2,0,6,7,0,2,1,2,0.64,0.6061,0.69,0.194,23,311,334 +3683,2011-06-07,2,0,6,8,0,2,1,2,0.68,0.6364,0.61,0.1343,45,432,477 +3684,2011-06-07,2,0,6,9,0,2,1,2,0.72,0.6667,0.58,0.2239,27,190,217 +3685,2011-06-07,2,0,6,10,0,2,1,2,0.76,0.6818,0.48,0.194,37,86,123 +3686,2011-06-07,2,0,6,11,0,2,1,2,0.76,0.697,0.52,0.2537,38,113,151 +3687,2011-06-07,2,0,6,12,0,2,1,2,0.8,0.7273,0.46,0.2836,42,141,183 
+3688,2011-06-07,2,0,6,13,0,2,1,2,0.8,0.7273,0.43,0.2537,55,141,196 +3689,2011-06-07,2,0,6,14,0,2,1,2,0.8,0.7273,0.43,0.2836,34,112,146 +3690,2011-06-07,2,0,6,15,0,2,1,2,0.82,0.7424,0.41,0.3284,56,127,183 +3691,2011-06-07,2,0,6,16,0,2,1,1,0.8,0.7121,0.38,0.2239,55,253,308 +3692,2011-06-07,2,0,6,17,0,2,1,1,0.8,0.7273,0.43,0.2537,64,475,539 +3693,2011-06-07,2,0,6,18,0,2,1,1,0.78,0.697,0.46,0.2836,72,479,551 +3694,2011-06-07,2,0,6,19,0,2,1,1,0.76,0.6818,0.48,0.2537,69,355,424 +3695,2011-06-07,2,0,6,20,0,2,1,1,0.74,0.6818,0.56,0.1343,45,301,346 +3696,2011-06-07,2,0,6,21,0,2,1,1,0.72,0.6818,0.7,0.1343,31,187,218 +3697,2011-06-07,2,0,6,22,0,2,1,1,0.7,0.6515,0.65,0.1343,27,126,153 +3698,2011-06-07,2,0,6,23,0,2,1,1,0.68,0.6364,0.79,0.1045,18,72,90 +3699,2011-06-08,2,0,6,0,0,3,1,1,0.66,0.5909,0.89,0.1343,12,29,41 +3700,2011-06-08,2,0,6,1,0,3,1,1,0.66,0.6061,0.78,0,3,20,23 +3701,2011-06-08,2,0,6,2,0,3,1,1,0.64,0.5758,0.89,0.0896,0,7,7 +3702,2011-06-08,2,0,6,3,0,3,1,1,0.64,0.5758,0.89,0.0896,2,1,3 +3703,2011-06-08,2,0,6,4,0,3,1,1,0.62,0.5758,0.83,0,0,6,6 +3704,2011-06-08,2,0,6,5,0,3,1,1,0.62,0.5606,0.88,0,0,21,21 +3705,2011-06-08,2,0,6,6,0,3,1,1,0.64,0.5758,0.89,0.1343,13,103,116 +3706,2011-06-08,2,0,6,7,0,3,1,1,0.66,0.6212,0.74,0.1045,24,329,353 +3707,2011-06-08,2,0,6,8,0,3,1,1,0.76,0.7121,0.58,0.1642,46,435,481 +3708,2011-06-08,2,0,6,9,0,3,1,1,0.76,0.7121,0.58,0.1642,34,168,202 +3709,2011-06-08,2,0,6,10,0,3,1,1,0.82,0.7424,0.43,0.2239,27,63,90 +3710,2011-06-08,2,0,6,11,0,3,1,1,0.84,0.7576,0.41,0.194,28,106,134 +3711,2011-06-08,2,0,6,12,0,3,1,1,0.88,0.8182,0.42,0.1343,15,134,149 +3712,2011-06-08,2,0,6,13,0,3,1,1,0.9,0.8485,0.42,0,20,116,136 +3713,2011-06-08,2,0,6,14,0,3,1,1,0.92,0.8788,0.4,0.2537,44,111,155 +3714,2011-06-08,2,0,6,15,0,3,1,1,0.92,0.8788,0.4,0.1642,28,100,128 +3715,2011-06-08,2,0,6,16,0,3,1,1,0.92,0.8788,0.4,0.2239,34,199,233 +3716,2011-06-08,2,0,6,17,0,3,1,1,0.92,0.8485,0.35,0.2239,80,426,506 +3717,2011-06-08,2,0,6,18,0,3,1,1,0.9,0.8182,0.37,0.2239,61,398,459 
+3718,2011-06-08,2,0,6,19,0,3,1,1,0.82,0.803,0.59,0.1343,61,323,384 +3719,2011-06-08,2,0,6,20,0,3,1,1,0.8,0.7879,0.63,0.1343,55,225,280 +3720,2011-06-08,2,0,6,21,0,3,1,1,0.8,0.803,0.66,0.1343,34,168,202 +3721,2011-06-08,2,0,6,22,0,3,1,1,0.76,0.7424,0.75,0.1642,37,139,176 +3722,2011-06-08,2,0,6,23,0,3,1,1,0.76,0.7424,0.75,0.194,18,98,116 +3723,2011-06-09,2,0,6,0,0,4,1,2,0.74,0.7121,0.79,0.1343,7,40,47 +3724,2011-06-09,2,0,6,1,0,4,1,1,0.74,0.7121,0.79,0.1045,3,13,16 +3725,2011-06-09,2,0,6,2,0,4,1,2,0.72,0.7121,0.84,0.1642,0,6,6 +3726,2011-06-09,2,0,6,3,0,4,1,2,0.72,0.7121,0.84,0.1045,0,2,2 +3727,2011-06-09,2,0,6,4,0,4,1,2,0.72,0.697,0.79,0.0896,2,4,6 +3728,2011-06-09,2,0,6,5,0,4,1,2,0.7,0.6667,0.84,0,1,19,20 +3729,2011-06-09,2,0,6,6,0,4,1,2,0.72,0.7121,0.84,0.0896,13,105,118 +3730,2011-06-09,2,0,6,7,0,4,1,2,0.72,0.697,0.79,0.1343,31,283,314 +3731,2011-06-09,2,0,6,8,0,4,1,2,0.76,0.7273,0.7,0.1045,32,400,432 +3732,2011-06-09,2,0,6,9,0,4,1,1,0.84,0.7879,0.49,0,20,148,168 +3733,2011-06-09,2,0,6,10,0,4,1,1,0.86,0.8182,0.5,0.2836,20,74,94 +3734,2011-06-09,2,0,6,11,0,4,1,1,0.9,0.8485,0.42,0.2239,36,82,118 +3735,2011-06-09,2,0,6,12,0,4,1,1,0.92,0.8485,0.35,0.194,40,100,140 +3736,2011-06-09,2,0,6,13,0,4,1,1,0.9,0.8182,0.37,0.0896,18,118,136 +3737,2011-06-09,2,0,6,14,0,4,1,1,0.92,0.8788,0.4,0.194,25,93,118 +3738,2011-06-09,2,0,6,15,0,4,1,1,0.94,0.8333,0.31,0.1642,18,86,104 +3739,2011-06-09,2,0,6,16,0,4,1,1,0.92,0.8333,0.33,0.1343,30,170,200 +3740,2011-06-09,2,0,6,17,0,4,1,1,0.9,0.8182,0.37,0.2537,54,355,409 +3741,2011-06-09,2,0,6,18,0,4,1,1,0.88,0.803,0.39,0.2836,52,414,466 +3742,2011-06-09,2,0,6,19,0,4,1,2,0.84,0.7424,0.39,0.3284,55,271,326 +3743,2011-06-09,2,0,6,20,0,4,1,2,0.8,0.7273,0.43,0.194,40,214,254 +3744,2011-06-09,2,0,6,21,0,4,1,3,0.76,0.697,0.52,0,29,142,171 +3745,2011-06-09,2,0,6,22,0,4,1,2,0.74,0.6818,0.58,0.1642,22,131,153 +3746,2011-06-09,2,0,6,23,0,4,1,2,0.74,0.697,0.57,0.1642,15,82,97 +3747,2011-06-10,2,0,6,0,0,5,1,3,0.7,0.6515,0.65,0.2239,8,61,69 
+3748,2011-06-10,2,0,6,1,0,5,1,3,0.7,0.6515,0.65,0.2239,5,18,23 +3749,2011-06-10,2,0,6,2,0,5,1,1,0.68,0.6364,0.69,0.1642,6,7,13 +3750,2011-06-10,2,0,6,3,0,5,1,1,0.68,0.6364,0.69,0.1642,3,3,6 +3751,2011-06-10,2,0,6,4,0,5,1,1,0.66,0.6061,0.78,0.1642,0,4,4 +3752,2011-06-10,2,0,6,5,0,5,1,1,0.66,0.6061,0.78,0.1642,0,28,28 +3753,2011-06-10,2,0,6,6,0,5,1,1,0.66,0.6061,0.78,0.0896,10,94,104 +3754,2011-06-10,2,0,6,7,0,5,1,1,0.72,0.6818,0.66,0.1045,25,242,267 +3755,2011-06-10,2,0,6,8,0,5,1,1,0.74,0.6818,0.58,0.1343,29,423,452 +3756,2011-06-10,2,0,6,9,0,5,1,1,0.76,0.697,0.55,0,37,176,213 +3757,2011-06-10,2,0,6,10,0,5,1,1,0.78,0.7121,0.52,0,23,88,111 +3758,2011-06-10,2,0,6,11,0,5,1,1,0.82,0.7576,0.46,0,46,107,153 +3759,2011-06-10,2,0,6,12,0,5,1,1,0.84,0.7576,0.44,0.1343,40,167,207 +3760,2011-06-10,2,0,6,13,0,5,1,1,0.84,0.7576,0.44,0.1045,40,149,189 +3761,2011-06-10,2,0,6,14,0,5,1,1,0.86,0.7727,0.39,0,44,121,165 +3762,2011-06-10,2,0,6,15,0,5,1,1,0.84,0.7576,0.44,0.1045,49,163,212 +3763,2011-06-10,2,0,6,16,0,5,1,1,0.82,0.7879,0.56,0.2985,46,239,285 +3764,2011-06-10,2,0,6,17,0,5,1,1,0.82,0.7727,0.52,0.2239,63,454,517 +3765,2011-06-10,2,0,6,18,0,5,1,1,0.8,0.7727,0.59,0.2239,96,367,463 +3766,2011-06-10,2,0,6,19,0,5,1,1,0.78,0.7424,0.59,0.194,61,245,306 +3767,2011-06-10,2,0,6,20,0,5,1,1,0.76,0.7121,0.62,0.1045,53,197,250 +3768,2011-06-10,2,0,6,21,0,5,1,1,0.76,0.7273,0.66,0.1343,55,163,218 +3769,2011-06-10,2,0,6,22,0,5,1,1,0.72,0.697,0.74,0.2537,40,145,185 +3770,2011-06-10,2,0,6,23,0,5,1,1,0.72,0.697,0.74,0.1642,36,110,146 +3771,2011-06-11,2,0,6,0,0,6,0,1,0.7,0.6667,0.79,0.1343,28,89,117 +3772,2011-06-11,2,0,6,1,0,6,0,1,0.7,0.6667,0.79,0.0896,10,67,77 +3773,2011-06-11,2,0,6,2,0,6,0,1,0.7,0.6667,0.79,0.1045,10,50,60 +3774,2011-06-11,2,0,6,3,0,6,0,1,0.68,0.6364,0.83,0.0896,10,18,28 +3775,2011-06-11,2,0,6,4,0,6,0,1,0.68,0.6364,0.79,0.0896,4,9,13 +3776,2011-06-11,2,0,6,5,0,6,0,1,0.68,0.6364,0.74,0.1343,4,12,16 +3777,2011-06-11,2,0,6,6,0,6,0,1,0.7,0.6515,0.61,0.1343,9,31,40 
+3778,2011-06-11,2,0,6,7,0,6,0,1,0.72,0.6667,0.58,0.194,18,50,68 +3779,2011-06-11,2,0,6,8,0,6,0,1,0.74,0.6818,0.58,0.194,32,111,143 +3780,2011-06-11,2,0,6,9,0,6,0,1,0.74,0.6818,0.62,0.1343,59,171,230 +3781,2011-06-11,2,0,6,10,0,6,0,1,0.74,0.6818,0.58,0.1642,91,173,264 +3782,2011-06-11,2,0,6,11,0,6,0,1,0.76,0.7121,0.62,0.1343,110,215,325 +3783,2011-06-11,2,0,6,12,0,6,0,1,0.8,0.7576,0.55,0.2239,112,235,347 +3784,2011-06-11,2,0,6,13,0,6,0,1,0.8,0.7576,0.55,0.194,150,237,387 +3785,2011-06-11,2,0,6,14,0,6,0,1,0.82,0.7727,0.49,0.1045,148,232,380 +3786,2011-06-11,2,0,6,15,0,6,0,1,0.82,0.7727,0.49,0.1045,142,232,374 +3787,2011-06-11,2,0,6,16,0,6,0,2,0.8,0.7273,0.46,0.194,169,225,394 +3788,2011-06-11,2,0,6,17,0,6,0,2,0.74,0.6818,0.58,0.1642,147,190,337 +3789,2011-06-11,2,0,6,18,0,6,0,2,0.72,0.6818,0.62,0.1343,131,155,286 +3790,2011-06-11,2,0,6,19,0,6,0,1,0.7,0.6515,0.65,0.194,97,167,264 +3791,2011-06-11,2,0,6,20,0,6,0,1,0.68,0.6364,0.74,0.2537,95,180,275 +3792,2011-06-11,2,0,6,21,0,6,0,1,0.66,0.6212,0.74,0.2239,76,144,220 +3793,2011-06-11,2,0,6,22,0,6,0,1,0.66,0.6061,0.78,0.2239,53,137,190 +3794,2011-06-11,2,0,6,23,0,6,0,1,0.66,0.6212,0.74,0.1045,24,107,131 +3795,2011-06-12,2,0,6,0,0,0,0,1,0.66,0.6061,0.78,0.2239,19,100,119 +3796,2011-06-12,2,0,6,1,0,0,0,1,0.64,0.5758,0.83,0.1045,19,74,93 +3797,2011-06-12,2,0,6,2,0,0,0,1,0.64,0.5758,0.89,0.2537,9,57,66 +3798,2011-06-12,2,0,6,3,0,0,0,1,0.64,0.5758,0.89,0.2836,8,20,28 +3799,2011-06-12,2,0,6,4,0,0,0,1,0.62,0.5606,0.88,0.1642,8,6,14 +3800,2011-06-12,2,0,6,5,0,0,0,1,0.62,0.5606,0.88,0.2239,5,5,10 +3801,2011-06-12,2,0,6,6,0,0,0,1,0.62,0.5606,0.88,0.1045,5,9,14 +3802,2011-06-12,2,0,6,7,0,0,0,1,0.64,0.5909,0.78,0.1045,7,27,34 +3803,2011-06-12,2,0,6,8,0,0,0,1,0.7,0.6667,0.74,0,28,64,92 +3804,2011-06-12,2,0,6,9,0,0,0,1,0.72,0.6818,0.66,0,75,107,182 +3805,2011-06-12,2,0,6,10,0,0,0,1,0.76,0.7121,0.58,0.0896,120,191,311 +3806,2011-06-12,2,0,6,11,0,0,0,1,0.78,0.7273,0.55,0.1045,131,236,367 
+3807,2011-06-12,2,0,6,12,0,0,0,1,0.82,0.7576,0.46,0,176,244,420 +3808,2011-06-12,2,0,6,13,0,0,0,1,0.78,0.7424,0.59,0.2836,109,264,373 +3809,2011-06-12,2,0,6,14,0,0,0,1,0.82,0.7879,0.56,0.2537,96,219,315 +3810,2011-06-12,2,0,6,15,0,0,0,1,0.8,0.7727,0.59,0.2985,142,218,360 +3811,2011-06-12,2,0,6,16,0,0,0,3,0.68,0.6364,0.83,0.2836,115,235,350 +3812,2011-06-12,2,0,6,17,0,0,0,3,0.68,0.6364,0.83,0.2836,94,158,252 +3813,2011-06-12,2,0,6,18,0,0,0,1,0.7,0.6667,0.74,0.1642,69,187,256 +3814,2011-06-12,2,0,6,19,0,0,0,1,0.7,0.6667,0.74,0.1045,67,156,223 +3815,2011-06-12,2,0,6,20,0,0,0,1,0.66,0.6061,0.83,0.1642,62,144,206 +3816,2011-06-12,2,0,6,21,0,0,0,1,0.66,0.6061,0.83,0.1045,39,101,140 +3817,2011-06-12,2,0,6,22,0,0,0,1,0.64,0.5758,0.83,0.1343,31,97,128 +3818,2011-06-12,2,0,6,23,0,0,0,1,0.64,0.5909,0.78,0.194,33,74,107 +3819,2011-06-13,2,0,6,0,0,1,1,1,0.64,0.5909,0.78,0.1343,8,20,28 +3820,2011-06-13,2,0,6,1,0,1,1,1,0.64,0.6061,0.73,0.2239,6,10,16 +3821,2011-06-13,2,0,6,2,0,1,1,1,0.62,0.5909,0.73,0.2836,2,8,10 +3822,2011-06-13,2,0,6,3,0,1,1,1,0.6,0.6061,0.6,0.4925,1,6,7 +3823,2011-06-13,2,0,6,4,0,1,1,1,0.56,0.5303,0.64,0.2985,1,3,4 +3824,2011-06-13,2,0,6,5,0,1,1,1,0.54,0.5152,0.64,0.3881,2,27,29 +3825,2011-06-13,2,0,6,6,0,1,1,1,0.54,0.5152,0.64,0.4179,7,98,105 +3826,2011-06-13,2,0,6,7,0,1,1,1,0.56,0.5303,0.56,0.3582,26,302,328 +3827,2011-06-13,2,0,6,8,0,1,1,1,0.58,0.5455,0.49,0.4478,44,432,476 +3828,2011-06-13,2,0,6,9,0,1,1,1,0.62,0.6212,0.46,0.3582,35,189,224 +3829,2011-06-13,2,0,6,10,0,1,1,1,0.64,0.6212,0.41,0.3284,38,80,118 +3830,2011-06-13,2,0,6,11,0,1,1,1,0.66,0.6212,0.41,0.2985,53,106,159 +3831,2011-06-13,2,0,6,12,0,1,1,1,0.66,0.6212,0.41,0.2836,35,161,196 +3832,2011-06-13,2,0,6,13,0,1,1,1,0.7,0.6364,0.37,0,53,174,227 +3833,2011-06-13,2,0,6,14,0,1,1,1,0.7,0.6364,0.34,0.2537,55,145,200 +3834,2011-06-13,2,0,6,15,0,1,1,1,0.7,0.6364,0.37,0.4478,58,127,185 +3835,2011-06-13,2,0,6,16,0,1,1,1,0.7,0.6364,0.37,0.4179,64,262,326 
+3836,2011-06-13,2,0,6,17,0,1,1,1,0.7,0.6364,0.39,0.3284,72,529,601 +3837,2011-06-13,2,0,6,18,0,1,1,1,0.7,0.6364,0.37,0.3582,76,510,586 +3838,2011-06-13,2,0,6,19,0,1,1,1,0.66,0.6212,0.39,0.2836,75,348,423 +3839,2011-06-13,2,0,6,20,0,1,1,1,0.64,0.6212,0.44,0.2537,78,253,331 +3840,2011-06-13,2,0,6,21,0,1,1,1,0.64,0.6212,0.41,0.2537,40,178,218 +3841,2011-06-13,2,0,6,22,0,1,1,1,0.62,0.6212,0.46,0.2239,20,114,134 +3842,2011-06-13,2,0,6,23,0,1,1,1,0.62,0.6212,0.46,0.194,14,75,89 +3843,2011-06-14,2,0,6,0,0,2,1,1,0.6,0.6212,0.49,0.194,13,18,31 +3844,2011-06-14,2,0,6,1,0,2,1,1,0.6,0.6212,0.49,0.2537,3,10,13 +3845,2011-06-14,2,0,6,2,0,2,1,1,0.6,0.6212,0.49,0.2836,2,8,10 +3846,2011-06-14,2,0,6,3,0,2,1,1,0.58,0.5455,0.53,0.3284,1,1,2 +3847,2011-06-14,2,0,6,4,0,2,1,1,0.54,0.5152,0.6,0.2537,0,3,3 +3848,2011-06-14,2,0,6,5,0,2,1,1,0.54,0.5152,0.56,0.3284,3,23,26 +3849,2011-06-14,2,0,6,6,0,2,1,1,0.54,0.5152,0.56,0.2985,6,107,113 +3850,2011-06-14,2,0,6,7,0,2,1,1,0.56,0.5303,0.52,0.4179,19,346,365 +3851,2011-06-14,2,0,6,8,0,2,1,1,0.58,0.5455,0.49,0.3881,45,441,486 +3852,2011-06-14,2,0,6,9,0,2,1,1,0.62,0.6212,0.46,0.194,19,174,193 +3853,2011-06-14,2,0,6,10,0,2,1,1,0.62,0.6212,0.46,0.2239,33,97,130 +3854,2011-06-14,2,0,6,11,0,2,1,1,0.64,0.6212,0.41,0.2836,40,103,143 +3855,2011-06-14,2,0,6,12,0,2,1,1,0.64,0.6212,0.41,0.3582,30,139,169 +3856,2011-06-14,2,0,6,13,0,2,1,1,0.64,0.6212,0.44,0.2239,33,176,209 +3857,2011-06-14,2,0,6,14,0,2,1,2,0.64,0.6212,0.47,0.194,41,118,159 +3858,2011-06-14,2,0,6,15,0,2,1,1,0.64,0.6212,0.47,0.1642,42,136,178 +3859,2011-06-14,2,0,6,16,0,2,1,1,0.68,0.6364,0.41,0.194,51,279,330 +3860,2011-06-14,2,0,6,17,0,2,1,3,0.64,0.6212,0.5,0.2836,85,484,569 +3861,2011-06-14,2,0,6,18,0,2,1,1,0.62,0.6212,0.51,0.3284,65,473,538 +3862,2011-06-14,2,0,6,19,0,2,1,1,0.64,0.6212,0.5,0.3582,51,335,386 +3863,2011-06-14,2,0,6,20,0,2,1,1,0.6,0.6212,0.56,0.1642,60,237,297 +3864,2011-06-14,2,0,6,21,0,2,1,1,0.6,0.6212,0.56,0.2985,37,206,243 
+3865,2011-06-14,2,0,6,22,0,2,1,1,0.58,0.5455,0.6,0.2239,29,159,188 +3866,2011-06-14,2,0,6,23,0,2,1,1,0.56,0.5303,0.68,0.2239,19,91,110 +3867,2011-06-15,2,0,6,0,0,3,1,1,0.56,0.5303,0.64,0.3582,8,44,52 +3868,2011-06-15,2,0,6,1,0,3,1,1,0.54,0.5152,0.64,0.2836,0,14,14 +3869,2011-06-15,2,0,6,2,0,3,1,1,0.52,0.5,0.63,0.2239,0,9,9 +3870,2011-06-15,2,0,6,3,0,3,1,1,0.5,0.4848,0.68,0.2537,0,1,1 +3871,2011-06-15,2,0,6,4,0,3,1,1,0.5,0.4848,0.63,0.2836,0,4,4 +3872,2011-06-15,2,0,6,5,0,3,1,1,0.48,0.4697,0.67,0.2836,1,20,21 +3873,2011-06-15,2,0,6,6,0,3,1,1,0.5,0.4848,0.63,0.2537,11,110,121 +3874,2011-06-15,2,0,6,7,0,3,1,1,0.54,0.5152,0.6,0.1343,22,348,370 +3875,2011-06-15,2,0,6,8,0,3,1,1,0.58,0.5455,0.53,0.1343,53,445,498 +3876,2011-06-15,2,0,6,9,0,3,1,1,0.6,0.6212,0.49,0,22,185,207 +3877,2011-06-15,2,0,6,10,0,3,1,1,0.64,0.6212,0.47,0.1642,27,108,135 +3878,2011-06-15,2,0,6,11,0,3,1,1,0.68,0.6364,0.36,0,49,115,164 +3879,2011-06-15,2,0,6,12,0,3,1,1,0.7,0.6364,0.34,0.2239,33,178,211 +3880,2011-06-15,2,0,6,13,0,3,1,1,0.74,0.6515,0.28,0.2836,43,146,189 +3881,2011-06-15,2,0,6,14,0,3,1,1,0.74,0.6515,0.28,0,50,128,178 +3882,2011-06-15,2,0,6,15,0,3,1,1,0.76,0.6667,0.27,0,33,131,164 +3883,2011-06-15,2,0,6,16,0,3,1,1,0.76,0.6667,0.27,0.1343,47,265,312 +3884,2011-06-15,2,0,6,17,0,3,1,1,0.74,0.6515,0.28,0.1045,83,555,638 +3885,2011-06-15,2,0,6,18,0,3,1,1,0.72,0.6515,0.32,0.1343,80,527,607 +3886,2011-06-15,2,0,6,19,0,3,1,1,0.7,0.6364,0.37,0.2836,54,362,416 +3887,2011-06-15,2,0,6,20,0,3,1,1,0.66,0.6212,0.44,0.1642,57,273,330 +3888,2011-06-15,2,0,6,21,0,3,1,1,0.64,0.6212,0.47,0.2239,48,209,257 +3889,2011-06-15,2,0,6,22,0,3,1,1,0.62,0.6212,0.5,0.1045,31,144,175 +3890,2011-06-15,2,0,6,23,0,3,1,1,0.62,0.6212,0.53,0,17,90,107 +3891,2011-06-16,2,0,6,0,0,4,1,1,0.6,0.6212,0.56,0.1045,9,38,47 +3892,2011-06-16,2,0,6,1,0,4,1,1,0.58,0.5455,0.6,0,4,13,17 +3893,2011-06-16,2,0,6,2,0,4,1,1,0.6,0.6212,0.56,0.0896,1,4,5 +3894,2011-06-16,2,0,6,3,0,4,1,2,0.56,0.5303,0.73,0.0896,0,4,4 
+3895,2011-06-16,2,0,6,4,0,4,1,2,0.56,0.5303,0.68,0.1045,0,6,6 +3896,2011-06-16,2,0,6,5,0,4,1,1,0.6,0.6061,0.64,0.194,7,18,25 +3897,2011-06-16,2,0,6,6,0,4,1,3,0.58,0.5455,0.73,0.1642,8,104,112 +3898,2011-06-16,2,0,6,7,0,4,1,3,0.56,0.5303,0.78,0.1642,16,172,188 +3899,2011-06-16,2,0,6,8,0,4,1,2,0.6,0.5909,0.69,0.1343,24,364,388 +3900,2011-06-16,2,0,6,9,0,4,1,2,0.62,0.6061,0.65,0.194,29,232,261 +3901,2011-06-16,2,0,6,10,0,4,1,2,0.66,0.6212,0.61,0.2537,18,103,121 +3902,2011-06-16,2,0,6,11,0,4,1,2,0.68,0.6364,0.61,0.2985,33,117,150 +3903,2011-06-16,2,0,6,12,0,4,1,1,0.72,0.6667,0.54,0.3881,41,165,206 +3904,2011-06-16,2,0,6,13,0,4,1,2,0.72,0.6667,0.54,0.2537,37,114,151 +3905,2011-06-16,2,0,6,14,0,4,1,2,0.72,0.6667,0.54,0.3284,39,114,153 +3906,2011-06-16,2,0,6,15,0,4,1,2,0.7,0.6515,0.61,0.2985,32,127,159 +3907,2011-06-16,2,0,6,16,0,4,1,3,0.66,0.6212,0.69,0.194,36,258,294 +3908,2011-06-16,2,0,6,17,0,4,1,2,0.64,0.5909,0.78,0.1045,42,230,272 +3909,2011-06-16,2,0,6,18,0,4,1,1,0.64,0.5758,0.83,0.2239,26,299,325 +3910,2011-06-16,2,0,6,19,0,4,1,1,0.62,0.5758,0.83,0.2836,58,269,327 +3911,2011-06-16,2,0,6,20,0,4,1,1,0.62,0.5758,0.83,0.3284,25,176,201 +3912,2011-06-16,2,0,6,21,0,4,1,3,0.62,0.5758,0.83,0.2537,34,154,188 +3913,2011-06-16,2,0,6,22,0,4,1,3,0.62,0.5758,0.83,0.2537,24,127,151 +3914,2011-06-16,2,0,6,23,0,4,1,3,0.6,0.5606,0.83,0.2537,2,14,16 +3915,2011-06-17,2,0,6,0,0,5,1,1,0.56,0.5303,0.88,0.0896,3,21,24 +3916,2011-06-17,2,0,6,1,0,5,1,1,0.56,0.5303,0.94,0.1343,4,19,23 +3917,2011-06-17,2,0,6,2,0,5,1,1,0.56,0.5303,0.94,0.1045,2,11,13 +3918,2011-06-17,2,0,6,3,0,5,1,1,0.56,0.5303,0.94,0.1045,1,5,6 +3919,2011-06-17,2,0,6,4,0,5,1,1,0.54,0.5152,0.94,0.1343,1,5,6 +3920,2011-06-17,2,0,6,5,0,5,1,1,0.54,0.5152,1,0,1,12,13 +3921,2011-06-17,2,0,6,6,0,5,1,1,0.56,0.5303,0.94,0,8,89,97 +3922,2011-06-17,2,0,6,7,0,5,1,1,0.6,0.5606,0.83,0,25,225,250 +3923,2011-06-17,2,0,6,8,0,5,1,2,0.6,0.5606,0.83,0,28,426,454 +3924,2011-06-17,2,0,6,9,0,5,1,1,0.7,0.6515,0.58,0.194,28,196,224 
+3925,2011-06-17,2,0,6,10,0,5,1,1,0.7,0.6515,0.58,0.194,44,126,170 +3926,2011-06-17,2,0,6,11,0,5,1,1,0.68,0.6364,0.69,0.2537,43,138,181 +3927,2011-06-17,2,0,6,12,0,5,1,1,0.7,0.6515,0.65,0.2537,49,194,243 +3928,2011-06-17,2,0,6,13,0,5,1,1,0.74,0.6818,0.62,0.2836,48,156,204 +3929,2011-06-17,2,0,6,14,0,5,1,1,0.76,0.697,0.52,0.194,73,142,215 +3930,2011-06-17,2,0,6,15,0,5,1,1,0.78,0.697,0.43,0.2239,62,181,243 +3931,2011-06-17,2,0,6,16,0,5,1,1,0.76,0.6818,0.45,0.1642,69,286,355 +3932,2011-06-17,2,0,6,17,0,5,1,3,0.76,0.6818,0.4,0.194,85,467,552 +3933,2011-06-17,2,0,6,18,0,5,1,3,0.76,0.6818,0.4,0.194,62,388,450 +3934,2011-06-17,2,0,6,19,0,5,1,3,0.64,0.5758,0.83,0.2537,53,275,328 +3935,2011-06-17,2,0,6,20,0,5,1,1,0.64,0.5909,0.78,0.1045,40,192,232 +3936,2011-06-17,2,0,6,21,0,5,1,1,0.64,0.5909,0.78,0.0896,49,159,208 +3937,2011-06-17,2,0,6,22,0,5,1,1,0.62,0.5606,0.88,0.1343,48,127,175 +3938,2011-06-17,2,0,6,23,0,5,1,1,0.62,0.5758,0.83,0.1343,37,141,178 +3939,2011-06-18,2,0,6,0,0,6,0,1,0.62,0.5606,0.88,0.1343,21,83,104 +3940,2011-06-18,2,0,6,1,0,6,0,1,0.62,0.5758,0.83,0.1045,15,80,95 +3941,2011-06-18,2,0,6,2,0,6,0,1,0.6,0.5455,0.88,0,16,37,53 +3942,2011-06-18,2,0,6,3,0,6,0,1,0.6,0.5455,0.88,0.0896,4,16,20 +3943,2011-06-18,2,0,6,4,0,6,0,1,0.6,0.5455,0.88,0.0896,1,4,5 +3944,2011-06-18,2,0,6,5,0,6,0,1,0.62,0.5758,0.83,0,1,6,7 +3945,2011-06-18,2,0,6,6,0,6,0,1,0.62,0.5758,0.83,0.1045,9,18,27 +3946,2011-06-18,2,0,6,7,0,6,0,1,0.64,0.5909,0.78,0.1343,12,45,57 +3947,2011-06-18,2,0,6,8,0,6,0,1,0.66,0.6061,0.78,0.0896,28,103,131 +3948,2011-06-18,2,0,6,9,0,6,0,1,0.7,0.6515,0.61,0.2239,62,156,218 +3949,2011-06-18,2,0,6,10,0,6,0,1,0.72,0.6818,0.62,0.194,68,176,244 +3950,2011-06-18,2,0,6,11,0,6,0,1,0.72,0.6818,0.62,0.1343,134,270,404 +3951,2011-06-18,2,0,6,12,0,6,0,1,0.74,0.697,0.66,0.0896,162,258,420 +3952,2011-06-18,2,0,6,13,0,6,0,1,0.76,0.697,0.55,0,135,192,327 +3953,2011-06-18,2,0,6,14,0,6,0,2,0.8,0.7273,0.46,0.2836,138,196,334 
+3954,2011-06-18,2,0,6,15,0,6,0,2,0.82,0.7576,0.46,0.1343,153,214,367 +3955,2011-06-18,2,0,6,16,0,6,0,2,0.76,0.697,0.52,0.2239,193,275,468 +3956,2011-06-18,2,0,6,17,0,6,0,2,0.76,0.6818,0.48,0.1642,210,239,449 +3957,2011-06-18,2,0,6,18,0,6,0,2,0.74,0.6818,0.55,0.194,118,263,381 +3958,2011-06-18,2,0,6,19,0,6,0,2,0.74,0.6667,0.51,0.1045,99,227,326 +3959,2011-06-18,2,0,6,20,0,6,0,2,0.72,0.6667,0.58,0.1343,61,127,188 +3960,2011-06-18,2,0,6,21,0,6,0,2,0.72,0.6818,0.66,0.1045,58,125,183 +3961,2011-06-18,2,0,6,22,0,6,0,2,0.72,0.6818,0.62,0.1343,56,105,161 +3962,2011-06-18,2,0,6,23,0,6,0,2,0.72,0.6818,0.62,0,53,97,150 +3963,2011-06-19,2,0,6,0,0,0,0,1,0.7,0.6515,0.65,0,18,71,89 +3964,2011-06-19,2,0,6,1,0,0,0,2,0.7,0.6515,0.65,0.0896,17,59,76 +3965,2011-06-19,2,0,6,2,0,0,0,1,0.68,0.6364,0.74,0.0896,13,59,72 +3966,2011-06-19,2,0,6,3,0,0,0,2,0.66,0.6061,0.78,0.1343,14,16,30 +3967,2011-06-19,2,0,6,4,0,0,0,2,0.66,0.6212,0.74,0.1343,7,10,17 +3968,2011-06-19,2,0,6,5,0,0,0,2,0.66,0.6061,0.78,0,7,12,19 +3969,2011-06-19,2,0,6,6,0,0,0,2,0.66,0.6061,0.78,0.1642,11,22,33 +3970,2011-06-19,2,0,6,7,0,0,0,2,0.66,0.6061,0.78,0.1642,35,36,71 +3971,2011-06-19,2,0,6,8,0,0,0,3,0.68,0.6364,0.74,0.1045,25,59,84 +3972,2011-06-19,2,0,6,9,0,0,0,2,0.68,0.6364,0.74,0.1045,58,111,169 +3973,2011-06-19,2,0,6,10,0,0,0,2,0.7,0.6515,0.61,0.1045,86,158,244 +3974,2011-06-19,2,0,6,11,0,0,0,2,0.72,0.6667,0.58,0.0896,141,236,377 +3975,2011-06-19,2,0,6,12,0,0,0,2,0.74,0.6818,0.55,0.0896,141,255,396 +3976,2011-06-19,2,0,6,13,0,0,0,2,0.74,0.6667,0.51,0,149,214,363 +3977,2011-06-19,2,0,6,14,0,0,0,3,0.74,0.6818,0.58,0.1045,124,193,317 +3978,2011-06-19,2,0,6,15,0,0,0,2,0.74,0.6818,0.55,0.1045,146,216,362 +3979,2011-06-19,2,0,6,16,0,0,0,2,0.76,0.6818,0.48,0,121,257,378 +3980,2011-06-19,2,0,6,17,0,0,0,1,0.74,0.6667,0.51,0,159,238,397 +3981,2011-06-19,2,0,6,18,0,0,0,2,0.74,0.6818,0.58,0.0896,78,218,296 +3982,2011-06-19,2,0,6,19,0,0,0,1,0.72,0.6818,0.62,0.1343,94,217,311 
+3983,2011-06-19,2,0,6,20,0,0,0,1,0.7,0.6515,0.7,0.194,86,146,232 +3984,2011-06-19,2,0,6,21,0,0,0,1,0.68,0.6364,0.79,0.1343,57,110,167 +3985,2011-06-19,2,0,6,22,0,0,0,1,0.66,0.6061,0.78,0.2239,38,114,152 +3986,2011-06-19,2,0,6,23,0,0,0,1,0.66,0.6061,0.78,0.194,14,78,92 +3987,2011-06-20,2,0,6,0,0,1,1,1,0.66,0.6061,0.83,0.1642,7,19,26 +3988,2011-06-20,2,0,6,1,0,1,1,1,0.64,0.5758,0.89,0.0896,9,3,12 +3989,2011-06-20,2,0,6,2,0,1,1,1,0.64,0.5758,0.89,0.0896,2,0,2 +3990,2011-06-20,2,0,6,3,0,1,1,3,0.64,0.5758,0.89,0,1,0,1 +3991,2011-06-20,2,0,6,4,0,1,1,3,0.62,0.5606,0.88,0.1642,1,2,3 +3992,2011-06-20,2,0,6,5,0,1,1,3,0.62,0.5758,0.83,0.1343,0,3,3 +3993,2011-06-20,2,0,6,6,0,1,1,3,0.6,0.5758,0.78,0.1343,1,25,26 +3994,2011-06-20,2,0,6,7,0,1,1,3,0.56,0.5303,0.88,0.194,0,46,46 +3995,2011-06-20,2,0,6,8,0,1,1,3,0.56,0.5303,0.83,0.194,12,209,221 +3996,2011-06-20,2,0,6,9,0,1,1,2,0.6,0.5758,0.78,0.2537,19,200,219 +3997,2011-06-20,2,0,6,10,0,1,1,2,0.6,0.5758,0.78,0.2537,33,58,91 +3998,2011-06-20,2,0,6,11,0,1,1,2,0.6,0.5758,0.78,0.1045,27,67,94 +3999,2011-06-20,2,0,6,12,0,1,1,2,0.62,0.6061,0.69,0.1045,33,129,162 +4000,2011-06-20,2,0,6,13,0,1,1,1,0.64,0.6061,0.65,0.1045,50,129,179 +4001,2011-06-20,2,0,6,14,0,1,1,1,0.66,0.6212,0.61,0.1045,45,130,175 +4002,2011-06-20,2,0,6,15,0,1,1,1,0.66,0.6212,0.61,0.1343,66,139,205 +4003,2011-06-20,2,0,6,16,0,1,1,1,0.7,0.6515,0.58,0.1642,61,238,299 +4004,2011-06-20,2,0,6,17,0,1,1,1,0.7,0.6515,0.54,0,81,484,565 +4005,2011-06-20,2,0,6,18,0,1,1,1,0.7,0.6515,0.61,0.2537,64,474,538 +4006,2011-06-20,2,0,6,19,0,1,1,1,0.66,0.6212,0.65,0.2239,62,369,431 +4007,2011-06-20,2,0,6,20,0,1,1,1,0.66,0.6212,0.69,0.2239,52,264,316 +4008,2011-06-20,2,0,6,21,0,1,1,1,0.64,0.6061,0.73,0.1642,39,167,206 +4009,2011-06-20,2,0,6,22,0,1,1,2,0.64,0.6061,0.73,0.2239,24,106,130 +4010,2011-06-20,2,0,6,23,0,1,1,1,0.62,0.5909,0.78,0.2537,10,50,60 +4011,2011-06-21,3,0,6,0,0,2,1,1,0.62,0.5909,0.78,0.1642,10,23,33 +4012,2011-06-21,3,0,6,1,0,2,1,3,0.62,0.5909,0.78,0.1642,2,12,14 
+4013,2011-06-21,3,0,6,2,0,2,1,3,0.62,0.5758,0.83,0.1642,1,5,6 +4014,2011-06-21,3,0,6,3,0,2,1,3,0.62,0.5758,0.83,0.1642,0,2,2 +4015,2011-06-21,3,0,6,4,0,2,1,2,0.6,0.5455,0.88,0.1045,2,7,9 +4016,2011-06-21,3,0,6,5,0,2,1,2,0.6,0.5455,0.88,0.1343,3,22,25 +4017,2011-06-21,3,0,6,6,0,2,1,2,0.6,0.5455,0.88,0.1343,8,107,115 +4018,2011-06-21,3,0,6,7,0,2,1,2,0.6,0.5152,0.94,0,21,288,309 +4019,2011-06-21,3,0,6,8,0,2,1,2,0.62,0.5606,0.88,0.1045,33,368,401 +4020,2011-06-21,3,0,6,9,0,2,1,2,0.62,0.5606,0.88,0.2537,32,243,275 +4021,2011-06-21,3,0,6,10,0,2,1,2,0.64,0.5758,0.83,0.194,41,120,161 +4022,2011-06-21,3,0,6,11,0,2,1,1,0.66,0.6061,0.83,0.1642,52,119,171 +4023,2011-06-21,3,0,6,12,0,2,1,1,0.7,0.6667,0.74,0.1642,40,149,189 +4024,2011-06-21,3,0,6,13,0,2,1,1,0.74,0.697,0.66,0.1642,49,158,207 +4025,2011-06-21,3,0,6,14,0,2,1,1,0.76,0.7121,0.62,0.2239,44,108,152 +4026,2011-06-21,3,0,6,15,0,2,1,1,0.78,0.7424,0.59,0.2537,42,128,170 +4027,2011-06-21,3,0,6,16,0,2,1,1,0.8,0.7727,0.59,0.2239,62,273,335 +4028,2011-06-21,3,0,6,17,0,2,1,1,0.78,0.7576,0.66,0.2239,65,507,572 +4029,2011-06-21,3,0,6,18,0,2,1,2,0.76,0.7273,0.7,0.2537,67,469,536 +4030,2011-06-21,3,0,6,19,0,2,1,2,0.74,0.7121,0.74,0.2239,67,358,425 +4031,2011-06-21,3,0,6,20,0,2,1,3,0.74,0.697,0.7,0.2239,72,254,326 +4032,2011-06-21,3,0,6,21,0,2,1,1,0.72,0.697,0.74,0.194,38,191,229 +4033,2011-06-21,3,0,6,22,0,2,1,1,0.7,0.6667,0.79,0.1045,12,97,109 +4034,2011-06-21,3,0,6,23,0,2,1,1,0.7,0.6667,0.74,0.1045,11,53,64 +4035,2011-06-22,3,0,6,0,0,3,1,1,0.66,0.6061,0.78,0.194,15,18,33 +4036,2011-06-22,3,0,6,1,0,3,1,1,0.66,0.6061,0.78,0.1343,2,19,21 +4037,2011-06-22,3,0,6,2,0,3,1,1,0.66,0.6061,0.78,0,2,5,7 +4038,2011-06-22,3,0,6,3,0,3,1,1,0.66,0.6061,0.78,0.1343,0,7,7 +4039,2011-06-22,3,0,6,4,0,3,1,1,0.64,0.5758,0.83,0,3,8,11 +4040,2011-06-22,3,0,6,5,0,3,1,1,0.64,0.5758,0.89,0,2,21,23 +4041,2011-06-22,3,0,6,6,0,3,1,1,0.66,0.6061,0.83,0,12,110,122 +4042,2011-06-22,3,0,6,7,0,3,1,1,0.7,0.6667,0.79,0,23,288,311 
+4043,2011-06-22,3,0,6,8,0,3,1,1,0.72,0.697,0.74,0.1045,27,396,423 +4044,2011-06-22,3,0,6,9,0,3,1,1,0.72,0.697,0.79,0.1642,14,205,219 +4045,2011-06-22,3,0,6,10,0,3,1,1,0.74,0.697,0.7,0.2537,31,104,135 +4046,2011-06-22,3,0,6,11,0,3,1,1,0.74,0.697,0.7,0.1642,37,108,145 +4047,2011-06-22,3,0,6,12,0,3,1,2,0.8,0.7727,0.59,0.2239,44,144,188 +4048,2011-06-22,3,0,6,13,0,3,1,2,0.82,0.7879,0.56,0.2537,34,113,147 +4049,2011-06-22,3,0,6,14,0,3,1,2,0.82,0.7727,0.52,0.2836,32,113,145 +4050,2011-06-22,3,0,6,15,0,3,1,2,0.82,0.7879,0.56,0.2836,32,114,146 +4051,2011-06-22,3,0,6,16,0,3,1,2,0.82,0.803,0.59,0.3284,34,224,258 +4052,2011-06-22,3,0,6,17,0,3,1,2,0.8,0.7576,0.55,0.3881,67,462,529 +4053,2011-06-22,3,0,6,18,0,3,1,2,0.8,0.7727,0.59,0.2537,66,447,513 +4054,2011-06-22,3,0,6,19,0,3,1,2,0.8,0.7424,0.52,0.2537,62,280,342 +4055,2011-06-22,3,0,6,20,0,3,1,1,0.74,0.7121,0.74,0.1343,52,230,282 +4056,2011-06-22,3,0,6,21,0,3,1,1,0.74,0.7121,0.79,0.1642,31,216,247 +4057,2011-06-22,3,0,6,22,0,3,1,2,0.72,0.697,0.79,0.194,25,123,148 +4058,2011-06-22,3,0,6,23,0,3,1,2,0.72,0.697,0.79,0.2239,14,91,105 +4059,2011-06-23,3,0,6,0,0,4,1,1,0.72,0.697,0.79,0.2239,5,44,49 +4060,2011-06-23,3,0,6,1,0,4,1,1,0.72,0.697,0.74,0.2239,5,26,31 +4061,2011-06-23,3,0,6,2,0,4,1,1,0.72,0.697,0.74,0.2239,0,4,4 +4062,2011-06-23,3,0,6,3,0,4,1,2,0.72,0.697,0.74,0.1642,1,4,5 +4063,2011-06-23,3,0,6,4,0,4,1,2,0.7,0.6667,0.79,0.1343,0,7,7 +4064,2011-06-23,3,0,6,5,0,4,1,1,0.72,0.697,0.74,0.1642,2,22,24 +4065,2011-06-23,3,0,6,6,0,4,1,1,0.72,0.697,0.74,0.1642,8,92,100 +4066,2011-06-23,3,0,6,7,0,4,1,2,0.72,0.697,0.74,0.1343,26,301,327 +4067,2011-06-23,3,0,6,8,0,4,1,2,0.74,0.697,0.7,0.194,20,412,432 +4068,2011-06-23,3,0,6,9,0,4,1,2,0.74,0.697,0.7,0.1343,33,194,227 +4069,2011-06-23,3,0,6,10,0,4,1,2,0.76,0.7273,0.66,0.1343,39,88,127 +4070,2011-06-23,3,0,6,11,0,4,1,3,0.76,0.7273,0.66,0.2537,28,108,136 +4071,2011-06-23,3,0,6,12,0,4,1,2,0.76,0.7273,0.66,0.2836,30,130,160 
+4072,2011-06-23,3,0,6,13,0,4,1,2,0.76,0.7121,0.62,0.2836,37,149,186 +4073,2011-06-23,3,0,6,14,0,4,1,2,0.76,0.7273,0.66,0.2985,46,103,149 +4074,2011-06-23,3,0,6,15,0,4,1,2,0.76,0.697,0.52,0.3582,62,127,189 +4075,2011-06-23,3,0,6,16,0,4,1,2,0.74,0.6818,0.62,0.2985,58,217,275 +4076,2011-06-23,3,0,6,17,0,4,1,2,0.72,0.6818,0.7,0.3582,74,495,569 +4077,2011-06-23,3,0,6,18,0,4,1,1,0.72,0.6818,0.7,0.4179,57,483,540 +4078,2011-06-23,3,0,6,19,0,4,1,1,0.72,0.6818,0.7,0.3582,56,344,400 +4079,2011-06-23,3,0,6,20,0,4,1,1,0.7,0.6667,0.74,0.2239,75,277,352 +4080,2011-06-23,3,0,6,21,0,4,1,1,0.7,0.6667,0.74,0.194,46,188,234 +4081,2011-06-23,3,0,6,22,0,4,1,2,0.7,0.6667,0.74,0.2239,23,139,162 +4082,2011-06-23,3,0,6,23,0,4,1,1,0.7,0.6667,0.74,0.2836,15,90,105 +4083,2011-06-24,3,0,6,0,0,5,1,1,0.68,0.6364,0.79,0.194,10,53,63 +4084,2011-06-24,3,0,6,1,0,5,1,1,0.66,0.6061,0.83,0.3284,8,20,28 +4085,2011-06-24,3,0,6,2,0,5,1,1,0.66,0.6061,0.83,0.3284,4,10,14 +4086,2011-06-24,3,0,6,3,0,5,1,1,0.66,0.6061,0.83,0.1642,1,4,5 +4087,2011-06-24,3,0,6,4,0,5,1,1,0.66,0.6061,0.83,0.0896,1,8,9 +4088,2011-06-24,3,0,6,5,0,5,1,1,0.64,0.5758,0.89,0.0896,2,18,20 +4089,2011-06-24,3,0,6,6,0,5,1,1,0.66,0.6061,0.83,0.1343,4,87,91 +4090,2011-06-24,3,0,6,7,0,5,1,2,0.66,0.6061,0.83,0.2239,21,247,268 +4091,2011-06-24,3,0,6,8,0,5,1,1,0.7,0.6667,0.74,0.2239,27,439,466 +4092,2011-06-24,3,0,6,9,0,5,1,1,0.74,0.6818,0.62,0.1343,28,203,231 +4093,2011-06-24,3,0,6,10,0,5,1,1,0.8,0.7273,0.43,0.2836,43,102,145 +4094,2011-06-24,3,0,6,11,0,5,1,1,0.8,0.7121,0.38,0.3284,56,147,203 +4095,2011-06-24,3,0,6,12,0,5,1,1,0.8,0.7121,0.41,0.2836,51,150,201 +4096,2011-06-24,3,0,6,13,0,5,1,1,0.8,0.7121,0.36,0.2239,43,178,221 +4097,2011-06-24,3,0,6,14,0,5,1,1,0.78,0.697,0.43,0.2836,67,162,229 +4098,2011-06-24,3,0,6,15,0,5,1,1,0.82,0.7273,0.34,0.3881,69,147,216 +4099,2011-06-24,3,0,6,16,0,5,1,1,0.8,0.697,0.33,0.4179,80,247,327 +4100,2011-06-24,3,0,6,17,0,5,1,1,0.76,0.6667,0.37,0.2537,85,472,557 
+4101,2011-06-24,3,0,6,18,0,5,1,1,0.76,0.6667,0.37,0.2985,87,365,452 +4102,2011-06-24,3,0,6,19,0,5,1,1,0.74,0.6515,0.4,0.2239,92,293,385 +4103,2011-06-24,3,0,6,20,0,5,1,1,0.72,0.6515,0.45,0.1642,66,222,288 +4104,2011-06-24,3,0,6,21,0,5,1,1,0.7,0.6515,0.48,0.1343,50,183,233 +4105,2011-06-24,3,0,6,22,0,5,1,1,0.7,0.6515,0.48,0,41,126,167 +4106,2011-06-24,3,0,6,23,0,5,1,1,0.68,0.6364,0.51,0.1343,33,139,172 +4107,2011-06-25,3,0,6,0,0,6,0,1,0.68,0.6364,0.47,0.1642,19,97,116 +4108,2011-06-25,3,0,6,1,0,6,0,1,0.66,0.6212,0.5,0.1642,20,75,95 +4109,2011-06-25,3,0,6,2,0,6,0,2,0.66,0.6212,0.5,0.194,8,51,59 +4110,2011-06-25,3,0,6,3,0,6,0,1,0.64,0.6212,0.57,0.194,18,21,39 +4111,2011-06-25,3,0,6,4,0,6,0,1,0.64,0.6212,0.57,0.194,7,4,11 +4112,2011-06-25,3,0,6,5,0,6,0,1,0.62,0.6061,0.61,0.1343,6,8,14 +4113,2011-06-25,3,0,6,6,0,6,0,1,0.64,0.6212,0.57,0.2239,12,17,29 +4114,2011-06-25,3,0,6,7,0,6,0,1,0.64,0.6212,0.57,0.2836,7,35,42 +4115,2011-06-25,3,0,6,8,0,6,0,1,0.7,0.6515,0.48,0.2537,27,85,112 +4116,2011-06-25,3,0,6,9,0,6,0,1,0.72,0.6515,0.45,0.2537,47,139,186 +4117,2011-06-25,3,0,6,10,0,6,0,1,0.72,0.6515,0.45,0.194,99,203,302 +4118,2011-06-25,3,0,6,11,0,6,0,1,0.74,0.6667,0.42,0,122,209,331 +4119,2011-06-25,3,0,6,12,0,6,0,1,0.74,0.6515,0.4,0.2836,140,252,392 +4120,2011-06-25,3,0,6,13,0,6,0,1,0.74,0.6515,0.4,0.2836,140,231,371 +4121,2011-06-25,3,0,6,14,0,6,0,1,0.74,0.6667,0.45,0.2985,151,232,383 +4122,2011-06-25,3,0,6,15,0,6,0,1,0.74,0.6667,0.45,0,157,235,392 +4123,2011-06-25,3,0,6,16,0,6,0,1,0.76,0.6818,0.43,0.2985,175,235,410 +4124,2011-06-25,3,0,6,17,0,6,0,1,0.74,0.6667,0.42,0.2239,152,247,399 +4125,2011-06-25,3,0,6,18,0,6,0,1,0.72,0.6515,0.45,0.2537,108,199,307 +4126,2011-06-25,3,0,6,19,0,6,0,1,0.72,0.6667,0.48,0.2836,121,254,375 +4127,2011-06-25,3,0,6,20,0,6,0,1,0.7,0.6515,0.48,0.194,74,195,269 +4128,2011-06-25,3,0,6,21,0,6,0,1,0.68,0.6364,0.51,0.1642,62,140,202 +4129,2011-06-25,3,0,6,22,0,6,0,1,0.68,0.6364,0.47,0.194,61,126,187 
+4130,2011-06-25,3,0,6,23,0,6,0,2,0.66,0.6212,0.5,0.2985,49,130,179 +4131,2011-06-26,3,0,6,0,0,0,0,1,0.64,0.6212,0.53,0,30,85,115 +4132,2011-06-26,3,0,6,1,0,0,0,1,0.64,0.6212,0.53,0.1045,17,73,90 +4133,2011-06-26,3,0,6,2,0,0,0,1,0.62,0.6212,0.57,0.1045,17,70,87 +4134,2011-06-26,3,0,6,3,0,0,0,1,0.6,0.6061,0.64,0.1045,12,22,34 +4135,2011-06-26,3,0,6,4,0,0,0,1,0.6,0.6061,0.64,0.0896,3,8,11 +4136,2011-06-26,3,0,6,5,0,0,0,1,0.6,0.6061,0.64,0,3,9,12 +4137,2011-06-26,3,0,6,6,0,0,0,1,0.6,0.6061,0.64,0,7,13,20 +4138,2011-06-26,3,0,6,7,0,0,0,1,0.62,0.6061,0.61,0,24,27,51 +4139,2011-06-26,3,0,6,8,0,0,0,1,0.64,0.6212,0.61,0.194,29,78,107 +4140,2011-06-26,3,0,6,9,0,0,0,1,0.66,0.6212,0.54,0.0896,75,131,206 +4141,2011-06-26,3,0,6,10,0,0,0,1,0.72,0.6667,0.48,0.1045,136,196,332 +4142,2011-06-26,3,0,6,11,0,0,0,1,0.72,0.6515,0.45,0.1343,127,219,346 +4143,2011-06-26,3,0,6,12,0,0,0,1,0.74,0.6515,0.4,0.1642,175,268,443 +4144,2011-06-26,3,0,6,13,0,0,0,1,0.74,0.6515,0.4,0,197,245,442 +4145,2011-06-26,3,0,6,14,0,0,0,1,0.74,0.6515,0.4,0.1343,197,246,443 +4146,2011-06-26,3,0,6,15,0,0,0,1,0.74,0.6515,0.4,0,174,229,403 +4147,2011-06-26,3,0,6,16,0,0,0,1,0.74,0.6667,0.42,0.194,177,278,455 +4148,2011-06-26,3,0,6,17,0,0,0,2,0.72,0.6515,0.42,0.1642,178,243,421 +4149,2011-06-26,3,0,6,18,0,0,0,2,0.72,0.6515,0.45,0.2537,117,265,382 +4150,2011-06-26,3,0,6,19,0,0,0,2,0.72,0.6667,0.48,0.1343,89,228,317 +4151,2011-06-26,3,0,6,20,0,0,0,2,0.7,0.6515,0.48,0.1045,55,158,213 +4152,2011-06-26,3,0,6,21,0,0,0,2,0.7,0.6515,0.54,0.1045,48,135,183 +4153,2011-06-26,3,0,6,22,0,0,0,1,0.7,0.6515,0.51,0,19,95,114 +4154,2011-06-26,3,0,6,23,0,0,0,1,0.7,0.6515,0.54,0.0896,14,64,78 +4155,2011-06-27,3,0,6,0,0,1,1,3,0.64,0.6061,0.73,0.2836,7,25,32 +4156,2011-06-27,3,0,6,1,0,1,1,2,0.62,0.5909,0.78,0.1343,4,11,15 +4157,2011-06-27,3,0,6,2,0,1,1,3,0.62,0.5909,0.78,0.1343,3,3,6 +4158,2011-06-27,3,0,6,3,0,1,1,3,0.62,0.5909,0.78,0.1343,0,2,2 +4159,2011-06-27,3,0,6,4,0,1,1,2,0.6,0.5606,0.83,0.1642,1,8,9 
+4160,2011-06-27,3,0,6,5,0,1,1,2,0.62,0.5758,0.83,0.1642,1,21,22 +4161,2011-06-27,3,0,6,6,0,1,1,2,0.62,0.5909,0.78,0.1642,4,90,94 +4162,2011-06-27,3,0,6,7,0,1,1,2,0.64,0.6061,0.73,0.2239,14,247,261 +4163,2011-06-27,3,0,6,8,0,1,1,1,0.66,0.6212,0.74,0.194,29,418,447 +4164,2011-06-27,3,0,6,9,0,1,1,2,0.72,0.6667,0.54,0,37,150,187 +4165,2011-06-27,3,0,6,10,0,1,1,2,0.72,0.6667,0.54,0.1343,55,95,150 +4166,2011-06-27,3,0,6,11,0,1,1,2,0.7,0.6515,0.58,0,55,95,150 +4167,2011-06-27,3,0,6,12,0,1,1,2,0.72,0.6667,0.54,0,40,130,170 +4168,2011-06-27,3,0,6,13,0,1,1,2,0.72,0.6667,0.58,0.2239,37,127,164 +4169,2011-06-27,3,0,6,14,0,1,1,2,0.72,0.6667,0.58,0.0896,58,100,158 +4170,2011-06-27,3,0,6,15,0,1,1,2,0.74,0.6818,0.55,0,76,126,202 +4171,2011-06-27,3,0,6,16,0,1,1,1,0.74,0.6818,0.55,0.1045,53,222,275 +4172,2011-06-27,3,0,6,17,0,1,1,1,0.74,0.6818,0.55,0.1343,90,514,604 +4173,2011-06-27,3,0,6,18,0,1,1,1,0.74,0.6667,0.51,0,79,512,591 +4174,2011-06-27,3,0,6,19,0,1,1,1,0.72,0.6818,0.62,0.194,56,340,396 +4175,2011-06-27,3,0,6,20,0,1,1,1,0.7,0.6515,0.65,0.1045,54,251,305 +4176,2011-06-27,3,0,6,21,0,1,1,1,0.7,0.6515,0.65,0,52,190,242 +4177,2011-06-27,3,0,6,22,0,1,1,1,0.68,0.6364,0.69,0,33,112,145 +4178,2011-06-27,3,0,6,23,0,1,1,1,0.68,0.6364,0.69,0,16,65,81 +4179,2011-06-28,3,0,6,0,0,2,1,2,0.66,0.6061,0.78,0.1343,5,25,30 +4180,2011-06-28,3,0,6,1,0,2,1,1,0.66,0.6212,0.74,0,5,8,13 +4181,2011-06-28,3,0,6,2,0,2,1,2,0.66,0.6212,0.74,0,2,9,11 +4182,2011-06-28,3,0,6,3,0,2,1,2,0.66,0.6212,0.74,0,0,2,2 +4183,2011-06-28,3,0,6,4,0,2,1,1,0.66,0.6212,0.74,0,1,7,8 +4184,2011-06-28,3,0,6,5,0,2,1,1,0.66,0.6061,0.78,0,1,20,21 +4185,2011-06-28,3,0,6,6,0,2,1,1,0.66,0.6061,0.83,0.1642,7,117,124 +4186,2011-06-28,3,0,6,7,0,2,1,1,0.68,0.6364,0.83,0.194,27,320,347 +4187,2011-06-28,3,0,6,8,0,2,1,1,0.7,0.6667,0.79,0.1642,39,417,456 +4188,2011-06-28,3,0,6,9,0,2,1,1,0.74,0.697,0.7,0,38,208,246 +4189,2011-06-28,3,0,6,10,0,2,1,1,0.8,0.7424,0.49,0.1642,45,112,157 
+4190,2011-06-28,3,0,6,11,0,2,1,1,0.82,0.7576,0.46,0,49,118,167 +4191,2011-06-28,3,0,6,12,0,2,1,1,0.84,0.7727,0.47,0.2239,41,161,202 +4192,2011-06-28,3,0,6,13,0,2,1,1,0.84,0.7576,0.44,0,33,156,189 +4193,2011-06-28,3,0,6,14,0,2,1,1,0.86,0.7879,0.44,0.2239,43,117,160 +4194,2011-06-28,3,0,6,15,0,2,1,1,0.86,0.7879,0.41,0,39,127,166 +4195,2011-06-28,3,0,6,16,0,2,1,1,0.86,0.803,0.47,0.2836,41,220,261 +4196,2011-06-28,3,0,6,17,0,2,1,3,0.82,0.7879,0.56,0.2836,70,509,579 +4197,2011-06-28,3,0,6,18,0,2,1,3,0.8,0.7576,0.55,0.2985,68,436,504 +4198,2011-06-28,3,0,6,19,0,2,1,2,0.74,0.6818,0.58,0.4627,44,255,299 +4199,2011-06-28,3,0,6,20,0,2,1,1,0.74,0.6818,0.62,0,50,246,296 +4200,2011-06-28,3,0,6,21,0,2,1,1,0.74,0.697,0.66,0.2836,31,164,195 +4201,2011-06-28,3,0,6,22,0,2,1,1,0.7,0.6515,0.7,0.2985,30,104,134 +4202,2011-06-28,3,0,6,23,0,2,1,1,0.7,0.6515,0.7,0.2836,23,58,81 +4203,2011-06-29,3,0,6,0,0,3,1,2,0.68,0.6364,0.79,0.2836,11,25,36 +4204,2011-06-29,3,0,6,1,0,3,1,1,0.68,0.6364,0.79,0.2985,10,18,28 +4205,2011-06-29,3,0,6,2,0,3,1,1,0.66,0.6061,0.83,0.2239,7,9,16 +4206,2011-06-29,3,0,6,3,0,3,1,2,0.66,0.6061,0.83,0.1045,2,3,5 +4207,2011-06-29,3,0,6,4,0,3,1,1,0.66,0.6061,0.83,0.1045,0,5,5 +4208,2011-06-29,3,0,6,5,0,3,1,1,0.64,0.5758,0.83,0.194,3,19,22 +4209,2011-06-29,3,0,6,6,0,3,1,1,0.66,0.6212,0.69,0.2985,12,104,116 +4210,2011-06-29,3,0,6,7,0,3,1,1,0.7,0.6515,0.54,0.2239,18,318,336 +4211,2011-06-29,3,0,6,8,0,3,1,1,0.7,0.6515,0.48,0.2836,41,499,540 +4212,2011-06-29,3,0,6,9,0,3,1,1,0.72,0.6515,0.45,0.3881,32,189,221 +4213,2011-06-29,3,0,6,10,0,3,1,1,0.74,0.6667,0.42,0.2836,31,95,126 +4214,2011-06-29,3,0,6,11,0,3,1,1,0.76,0.6667,0.35,0.2239,53,117,170 +4215,2011-06-29,3,0,6,12,0,3,1,1,0.76,0.6667,0.35,0.2537,55,188,243 +4216,2011-06-29,3,0,6,13,0,3,1,1,0.8,0.697,0.31,0.3284,45,186,231 +4217,2011-06-29,3,0,6,14,0,3,1,1,0.8,0.697,0.31,0.194,50,138,188 +4218,2011-06-29,3,0,6,15,0,3,1,1,0.82,0.7121,0.3,0.1642,43,146,189 +4219,2011-06-29,3,0,6,16,0,3,1,1,0.82,0.7121,0.3,0.4179,60,268,328 
+4220,2011-06-29,3,0,6,17,0,3,1,1,0.82,0.7121,0.26,0.4179,78,492,570 +4221,2011-06-29,3,0,6,18,0,3,1,1,0.8,0.697,0.29,0.2985,78,510,588 +4222,2011-06-29,3,0,6,19,0,3,1,1,0.76,0.6667,0.35,0.3881,60,341,401 +4223,2011-06-29,3,0,6,20,0,3,1,1,0.74,0.6515,0.37,0.2985,69,264,333 +4224,2011-06-29,3,0,6,21,0,3,1,1,0.72,0.6515,0.39,0.2239,26,195,221 +4225,2011-06-29,3,0,6,22,0,3,1,1,0.7,0.6364,0.42,0.194,32,151,183 +4226,2011-06-29,3,0,6,23,0,3,1,1,0.68,0.6364,0.47,0.194,32,97,129 +4227,2011-06-30,3,0,6,0,0,4,1,1,0.66,0.6212,0.5,0.1642,15,39,54 +4228,2011-06-30,3,0,6,1,0,4,1,1,0.64,0.6212,0.53,0.1343,13,19,32 +4229,2011-06-30,3,0,6,2,0,4,1,1,0.62,0.6212,0.57,0.1343,2,6,8 +4230,2011-06-30,3,0,6,3,0,4,1,1,0.6,0.6061,0.6,0.1642,2,7,9 +4231,2011-06-30,3,0,6,4,0,4,1,1,0.58,0.5455,0.64,0.1343,2,4,6 +4232,2011-06-30,3,0,6,5,0,4,1,1,0.58,0.5455,0.64,0.2537,1,24,25 +4233,2011-06-30,3,0,6,6,0,4,1,1,0.6,0.6061,0.64,0.2537,4,119,123 +4234,2011-06-30,3,0,6,7,0,4,1,1,0.62,0.6061,0.61,0.2239,29,291,320 +4235,2011-06-30,3,0,6,8,0,4,1,1,0.68,0.6364,0.47,0.2239,44,511,555 +4236,2011-06-30,3,0,6,9,0,4,1,1,0.7,0.6364,0.42,0.2239,33,227,260 +4237,2011-06-30,3,0,6,10,0,4,1,1,0.72,0.6515,0.42,0.194,35,97,132 +4238,2011-06-30,3,0,6,11,0,4,1,1,0.74,0.6515,0.37,0.1343,47,122,169 +4239,2011-06-30,3,0,6,12,0,4,1,1,0.76,0.6667,0.35,0.1642,50,183,233 +4240,2011-06-30,3,0,6,13,0,4,1,1,0.76,0.6667,0.33,0.194,70,162,232 +4241,2011-06-30,3,0,6,14,0,4,1,1,0.78,0.6818,0.31,0.194,46,117,163 +4242,2011-06-30,3,0,6,15,0,4,1,1,0.78,0.6667,0.25,0.2537,64,176,240 +4243,2011-06-30,3,0,6,16,0,4,1,1,0.8,0.697,0.26,0.1343,63,260,323 +4244,2011-06-30,3,0,6,17,0,4,1,1,0.78,0.6818,0.27,0.3582,98,496,594 +4245,2011-06-30,3,0,6,18,0,4,1,1,0.76,0.6667,0.29,0.2537,114,472,586 +4246,2011-06-30,3,0,6,19,0,4,1,1,0.74,0.6515,0.33,0.194,89,366,455 +4247,2011-06-30,3,0,6,20,0,4,1,1,0.74,0.6515,0.33,0.194,55,285,340 +4248,2011-06-30,3,0,6,21,0,4,1,1,0.72,0.6515,0.37,0.1045,81,242,323 
+4249,2011-06-30,3,0,6,22,0,4,1,1,0.7,0.6364,0.42,0.1642,43,164,207 +4250,2011-06-30,3,0,6,23,0,4,1,1,0.66,0.6212,0.5,0,27,99,126 +4251,2011-07-01,3,0,7,0,0,5,1,1,0.66,0.6212,0.5,0,20,48,68 +4252,2011-07-01,3,0,7,1,0,5,1,1,0.66,0.6212,0.5,0,15,16,31 +4253,2011-07-01,3,0,7,2,0,5,1,1,0.62,0.6061,0.69,0,6,7,13 +4254,2011-07-01,3,0,7,3,0,5,1,1,0.64,0.6212,0.53,0,5,6,11 +4255,2011-07-01,3,0,7,4,0,5,1,1,0.62,0.6212,0.57,0,1,5,6 +4256,2011-07-01,3,0,7,5,0,5,1,1,0.6,0.6061,0.64,0.1343,3,27,30 +4257,2011-07-01,3,0,7,6,0,5,1,1,0.62,0.6061,0.61,0.1343,11,97,108 +4258,2011-07-01,3,0,7,7,0,5,1,1,0.66,0.6212,0.54,0.1045,25,218,243 +4259,2011-07-01,3,0,7,8,0,5,1,1,0.7,0.6364,0.42,0.1642,39,453,492 +4260,2011-07-01,3,0,7,9,0,5,1,1,0.74,0.6515,0.35,0.1642,58,202,260 +4261,2011-07-01,3,0,7,10,0,5,1,1,0.76,0.6667,0.33,0.1642,61,109,170 +4262,2011-07-01,3,0,7,11,0,5,1,1,0.8,0.697,0.26,0,68,146,214 +4263,2011-07-01,3,0,7,12,0,5,1,1,0.8,0.697,0.26,0,83,180,263 +4264,2011-07-01,3,0,7,13,0,5,1,1,0.8,0.6818,0.24,0,83,209,292 +4265,2011-07-01,3,0,7,14,0,5,1,1,0.82,0.697,0.23,0.194,78,225,303 +4266,2011-07-01,3,0,7,15,0,5,1,1,0.82,0.697,0.21,0.2537,82,299,381 +4267,2011-07-01,3,0,7,16,0,5,1,1,0.82,0.697,0.17,0,85,342,427 +4268,2011-07-01,3,0,7,17,0,5,1,1,0.82,0.697,0.21,0.194,99,362,461 +4269,2011-07-01,3,0,7,18,0,5,1,1,0.8,0.6818,0.24,0.2537,98,324,422 +4270,2011-07-01,3,0,7,19,0,5,1,1,0.78,0.6667,0.25,0.194,80,238,318 +4271,2011-07-01,3,0,7,20,0,5,1,1,0.72,0.6515,0.42,0.2239,84,185,269 +4272,2011-07-01,3,0,7,21,0,5,1,1,0.7,0.6364,0.45,0.194,69,149,218 +4273,2011-07-01,3,0,7,22,0,5,1,1,0.7,0.6364,0.42,0,49,173,222 +4274,2011-07-01,3,0,7,23,0,5,1,1,0.68,0.6364,0.47,0.0896,44,96,140 +4275,2011-07-02,3,0,7,0,0,6,0,1,0.68,0.6364,0.47,0.0896,31,84,115 +4276,2011-07-02,3,0,7,1,0,6,0,1,0.66,0.6212,0.54,0,20,58,78 +4277,2011-07-02,3,0,7,2,0,6,0,1,0.64,0.6212,0.53,0,9,43,52 +4278,2011-07-02,3,0,7,3,0,6,0,1,0.64,0.6212,0.53,0,5,21,26 +4279,2011-07-02,3,0,7,4,0,6,0,1,0.62,0.6061,0.61,0,3,8,11 
+4280,2011-07-02,3,0,7,5,0,6,0,1,0.58,0.5455,0.73,0.1045,1,13,14 +4281,2011-07-02,3,0,7,6,0,6,0,1,0.62,0.6061,0.69,0.1045,11,21,32 +4282,2011-07-02,3,0,7,7,0,6,0,1,0.64,0.6061,0.65,0,10,35,45 +4283,2011-07-02,3,0,7,8,0,6,0,1,0.68,0.6364,0.57,0,38,94,132 +4284,2011-07-02,3,0,7,9,0,6,0,1,0.74,0.6667,0.51,0.0896,55,124,179 +4285,2011-07-02,3,0,7,10,0,6,0,1,0.8,0.7121,0.36,0,103,168,271 +4286,2011-07-02,3,0,7,11,0,6,0,1,0.8,0.7121,0.36,0.0896,144,213,357 +4287,2011-07-02,3,0,7,12,0,6,0,1,0.82,0.7121,0.3,0.1343,142,232,374 +4288,2011-07-02,3,0,7,13,0,6,0,1,0.82,0.7121,0.3,0.1045,178,226,404 +4289,2011-07-02,3,0,7,14,0,6,0,1,0.84,0.7121,0.2,0,177,215,392 +4290,2011-07-02,3,0,7,15,0,6,0,1,0.86,0.7273,0.19,0.1642,168,163,331 +4291,2011-07-02,3,0,7,16,0,6,0,1,0.88,0.7424,0.22,0.2239,206,192,398 +4292,2011-07-02,3,0,7,17,0,6,0,1,0.84,0.7273,0.3,0.2239,179,170,349 +4293,2011-07-02,3,0,7,18,0,6,0,1,0.82,0.7121,0.3,0.2836,193,190,383 +4294,2011-07-02,3,0,7,19,0,6,0,1,0.8,0.697,0.33,0.2239,156,149,305 +4295,2011-07-02,3,0,7,20,0,6,0,1,0.76,0.6818,0.45,0.2836,136,148,284 +4296,2011-07-02,3,0,7,21,0,6,0,1,0.74,0.6667,0.48,0.194,95,137,232 +4297,2011-07-02,3,0,7,22,0,6,0,1,0.72,0.6667,0.51,0.1642,78,123,201 +4298,2011-07-02,3,0,7,23,0,6,0,1,0.72,0.6667,0.54,0.2836,66,88,154 +4299,2011-07-03,3,0,7,0,0,0,0,1,0.7,0.6515,0.58,0.2239,47,97,144 +4300,2011-07-03,3,0,7,1,0,0,0,2,0.7,0.6515,0.61,0.1642,23,55,78 +4301,2011-07-03,3,0,7,2,0,0,0,1,0.7,0.6515,0.58,0.194,19,50,69 +4302,2011-07-03,3,0,7,3,0,0,0,3,0.68,0.6364,0.65,0.0896,8,25,33 +4303,2011-07-03,3,0,7,4,0,0,0,3,0.68,0.6364,0.65,0.0896,3,2,5 +4304,2011-07-03,3,0,7,5,0,0,0,3,0.62,0.5758,0.83,0.194,0,1,1 +4305,2011-07-03,3,0,7,6,0,0,0,2,0.62,0.5455,0.94,0.0896,0,3,3 +4306,2011-07-03,3,0,7,7,0,0,0,2,0.64,0.5758,0.89,0.194,3,22,25 +4307,2011-07-03,3,0,7,8,0,0,0,1,0.68,0.6364,0.79,0.2239,39,53,92 +4308,2011-07-03,3,0,7,9,0,0,0,1,0.7,0.6667,0.74,0.2537,82,99,181 +4309,2011-07-03,3,0,7,10,0,0,0,1,0.74,0.697,0.66,0.1642,131,133,264 
+4310,2011-07-03,3,0,7,11,0,0,0,1,0.76,0.7121,0.62,0.1045,215,175,390 +4311,2011-07-03,3,0,7,12,0,0,0,1,0.8,0.7576,0.55,0.1045,198,206,404 +4312,2011-07-03,3,0,7,13,0,0,0,1,0.8,0.7727,0.59,0.2239,248,173,421 +4313,2011-07-03,3,0,7,14,0,0,0,1,0.82,0.7879,0.56,0.2537,225,150,375 +4314,2011-07-03,3,0,7,15,0,0,0,1,0.84,0.803,0.53,0.2985,194,182,376 +4315,2011-07-03,3,0,7,16,0,0,0,1,0.84,0.803,0.53,0.2985,195,219,414 +4316,2011-07-03,3,0,7,17,0,0,0,3,0.8,0.7424,0.49,0.8507,181,177,358 +4317,2011-07-03,3,0,7,18,0,0,0,3,0.8,0.7424,0.49,0.8507,74,107,181 +4318,2011-07-03,3,0,7,19,0,0,0,2,0.66,0.6061,0.83,0.194,100,83,183 +4319,2011-07-03,3,0,7,20,0,0,0,3,0.66,0.6061,0.83,0,83,93,176 +4320,2011-07-03,3,0,7,21,0,0,0,2,0.66,0.6061,0.83,0.1642,75,92,167 +4321,2011-07-03,3,0,7,22,0,0,0,2,0.66,0.6061,0.78,0.1045,69,93,162 +4322,2011-07-03,3,0,7,23,0,0,0,1,0.64,0.5758,0.83,0.1642,70,77,147 +4323,2011-07-04,3,0,7,0,1,1,0,1,0.64,0.5758,0.83,0.1343,63,77,140 +4324,2011-07-04,3,0,7,1,1,1,0,1,0.64,0.5758,0.83,0.0896,43,76,119 +4325,2011-07-04,3,0,7,2,1,1,0,1,0.64,0.5758,0.83,0,33,30,63 +4326,2011-07-04,3,0,7,3,1,1,0,1,0.64,0.5909,0.78,0,13,13,26 +4327,2011-07-04,3,0,7,4,1,1,0,2,0.66,0.6061,0.78,0.0896,8,4,12 +4328,2011-07-04,3,0,7,5,1,1,0,2,0.64,0.5758,0.83,0.1343,1,3,4 +4329,2011-07-04,3,0,7,6,1,1,0,1,0.64,0.5758,0.89,0.1642,5,11,16 +4330,2011-07-04,3,0,7,7,1,1,0,1,0.66,0.6061,0.83,0.1343,17,19,36 +4331,2011-07-04,3,0,7,8,1,1,0,1,0.7,0.6667,0.74,0.1045,42,44,86 +4332,2011-07-04,3,0,7,9,1,1,0,2,0.72,0.6818,0.66,0.0896,142,96,238 +4333,2011-07-04,3,0,7,10,1,1,0,2,0.76,0.697,0.55,0.0896,166,114,280 +4334,2011-07-04,3,0,7,11,1,1,0,2,0.76,0.697,0.55,0.0896,177,172,349 +4335,2011-07-04,3,0,7,12,1,1,0,2,0.78,0.7121,0.52,0,237,210,447 +4336,2011-07-04,3,0,7,13,1,1,0,2,0.78,0.7121,0.49,0,242,181,423 +4337,2011-07-04,3,0,7,14,1,1,0,1,0.8,0.7273,0.46,0,235,173,408 +4338,2011-07-04,3,0,7,15,1,1,0,1,0.82,0.7424,0.43,0.0896,224,184,408 
+4339,2011-07-04,3,0,7,16,1,1,0,1,0.82,0.7424,0.41,0,236,216,452 +4340,2011-07-04,3,0,7,17,1,1,0,1,0.8,0.7273,0.46,0,240,196,436 +4341,2011-07-04,3,0,7,18,1,1,0,1,0.8,0.7273,0.46,0.1343,222,196,418 +4342,2011-07-04,3,0,7,19,1,1,0,2,0.78,0.7121,0.49,0.1642,170,205,375 +4343,2011-07-04,3,0,7,20,1,1,0,2,0.76,0.697,0.55,0.1045,195,191,386 +4344,2011-07-04,3,0,7,21,1,1,0,2,0.74,0.6818,0.62,0.1045,195,262,457 +4345,2011-07-04,3,0,7,22,1,1,0,2,0.74,0.6818,0.62,0.1343,115,211,326 +4346,2011-07-04,3,0,7,23,1,1,0,2,0.72,0.6818,0.7,0.1045,44,94,138 +4347,2011-07-05,3,0,7,0,0,2,1,2,0.7,0.6515,0.65,0.1642,19,36,55 +4348,2011-07-05,3,0,7,1,0,2,1,1,0.7,0.6515,0.65,0.1045,15,24,39 +4349,2011-07-05,3,0,7,2,0,2,1,1,0.66,0.6212,0.74,0.0896,8,5,13 +4350,2011-07-05,3,0,7,3,0,2,1,1,0.66,0.6212,0.74,0.2239,1,7,8 +4351,2011-07-05,3,0,7,4,0,2,1,1,0.66,0.6212,0.69,0.1642,1,4,5 +4352,2011-07-05,3,0,7,5,0,2,1,1,0.66,0.6212,0.69,0,1,19,20 +4353,2011-07-05,3,0,7,6,0,2,1,1,0.66,0.6212,0.74,0.0896,3,91,94 +4354,2011-07-05,3,0,7,7,0,2,1,1,0.7,0.6515,0.65,0.0896,22,248,270 +4355,2011-07-05,3,0,7,8,0,2,1,1,0.72,0.6818,0.62,0.1343,41,391,432 +4356,2011-07-05,3,0,7,9,0,2,1,1,0.74,0.6818,0.58,0.1343,50,177,227 +4357,2011-07-05,3,0,7,10,0,2,1,1,0.76,0.697,0.52,0.1045,55,84,139 +4358,2011-07-05,3,0,7,11,0,2,1,1,0.78,0.697,0.46,0.1045,75,97,172 +4359,2011-07-05,3,0,7,12,0,2,1,1,0.8,0.7273,0.43,0.1045,71,130,201 +4360,2011-07-05,3,0,7,13,0,2,1,1,0.82,0.7424,0.43,0.1045,67,136,203 +4361,2011-07-05,3,0,7,14,0,2,1,1,0.8,0.7424,0.49,0.1642,66,112,178 +4362,2011-07-05,3,0,7,15,0,2,1,1,0.82,0.7424,0.43,0,51,111,162 +4363,2011-07-05,3,0,7,16,0,2,1,1,0.82,0.7576,0.46,0.1045,79,202,281 +4364,2011-07-05,3,0,7,17,0,2,1,1,0.82,0.7576,0.46,0.1045,79,466,545 +4365,2011-07-05,3,0,7,18,0,2,1,1,0.8,0.7576,0.55,0.1642,70,426,496 +4366,2011-07-05,3,0,7,19,0,2,1,1,0.8,0.7727,0.59,0.1343,71,297,368 +4367,2011-07-05,3,0,7,20,0,2,1,1,0.76,0.7273,0.66,0.1045,59,225,284 
+4368,2011-07-05,3,0,7,21,0,2,1,1,0.78,0.7424,0.62,0.2537,77,168,245 +4369,2011-07-05,3,0,7,22,0,2,1,1,0.76,0.7273,0.66,0.2239,31,124,155 +4370,2011-07-05,3,0,7,23,0,2,1,1,0.74,0.697,0.66,0.1642,19,54,73 +4371,2011-07-06,3,0,7,0,0,3,1,1,0.72,0.697,0.74,0.194,11,26,37 +4372,2011-07-06,3,0,7,1,0,3,1,1,0.7,0.6667,0.74,0.1343,4,11,15 +4373,2011-07-06,3,0,7,2,0,3,1,1,0.7,0.6667,0.79,0.0896,1,4,5 +4374,2011-07-06,3,0,7,3,0,3,1,1,0.7,0.6667,0.79,0,0,6,6 +4375,2011-07-06,3,0,7,4,0,3,1,2,0.68,0.6364,0.83,0.0896,0,6,6 +4376,2011-07-06,3,0,7,5,0,3,1,1,0.68,0.6364,0.83,0.0896,5,30,35 +4377,2011-07-06,3,0,7,6,0,3,1,1,0.7,0.6667,0.79,0.0896,9,112,121 +4378,2011-07-06,3,0,7,7,0,3,1,2,0.72,0.697,0.79,0,24,288,312 +4379,2011-07-06,3,0,7,8,0,3,1,2,0.72,0.697,0.74,0.1045,31,397,428 +4380,2011-07-06,3,0,7,9,0,3,1,3,0.7,0.6667,0.79,0.194,31,158,189 +4381,2011-07-06,3,0,7,10,0,3,1,3,0.7,0.6667,0.79,0.0896,13,30,43 +4382,2011-07-06,3,0,7,11,0,3,1,2,0.72,0.697,0.74,0,24,103,127 +4383,2011-07-06,3,0,7,12,0,3,1,2,0.74,0.697,0.66,0.0896,38,114,152 +4384,2011-07-06,3,0,7,13,0,3,1,2,0.74,0.697,0.7,0.1045,33,107,140 +4385,2011-07-06,3,0,7,14,0,3,1,1,0.76,0.7273,0.66,0.2836,50,113,163 +4386,2011-07-06,3,0,7,15,0,3,1,1,0.76,0.7273,0.7,0.2836,55,130,185 +4387,2011-07-06,3,0,7,16,0,3,1,1,0.76,0.7273,0.7,0.2537,57,218,275 +4388,2011-07-06,3,0,7,17,0,3,1,1,0.78,0.7424,0.59,0.2239,79,517,596 +4389,2011-07-06,3,0,7,18,0,3,1,1,0.76,0.7121,0.62,0.194,77,486,563 +4390,2011-07-06,3,0,7,19,0,3,1,1,0.74,0.697,0.7,0.2239,71,323,394 +4391,2011-07-06,3,0,7,20,0,3,1,1,0.72,0.697,0.74,0.2537,55,257,312 +4392,2011-07-06,3,0,7,21,0,3,1,1,0.7,0.6667,0.79,0.194,66,175,241 +4393,2011-07-06,3,0,7,22,0,3,1,1,0.7,0.6667,0.79,0.194,30,156,186 +4394,2011-07-06,3,0,7,23,0,3,1,1,0.68,0.6364,0.83,0.2239,20,78,98 +4395,2011-07-07,3,0,7,0,0,4,1,1,0.66,0.5909,0.89,0.1343,6,29,35 +4396,2011-07-07,3,0,7,1,0,4,1,1,0.66,0.5909,0.89,0.194,2,14,16 +4397,2011-07-07,3,0,7,2,0,4,1,1,0.66,0.5909,0.89,0.194,1,7,8 
+4398,2011-07-07,3,0,7,3,0,4,1,1,0.64,0.5606,0.94,0.1045,2,2,4 +4399,2011-07-07,3,0,7,4,0,4,1,1,0.64,0.5758,0.89,0.1045,0,4,4 +4400,2011-07-07,3,0,7,5,0,4,1,1,0.64,0.5758,0.89,0.1045,2,30,32 +4401,2011-07-07,3,0,7,6,0,4,1,2,0.64,0.5758,0.89,0.1343,11,107,118 +4402,2011-07-07,3,0,7,7,0,4,1,2,0.66,0.5909,0.89,0.1045,13,279,292 +4403,2011-07-07,3,0,7,8,0,4,1,2,0.7,0.6667,0.79,0.194,28,415,443 +4404,2011-07-07,3,0,7,9,0,4,1,1,0.74,0.697,0.7,0.1642,25,154,179 +4405,2011-07-07,3,0,7,10,0,4,1,1,0.78,0.7424,0.62,0,29,72,101 +4406,2011-07-07,3,0,7,11,0,4,1,1,0.82,0.7727,0.52,0.1343,30,112,142 +4407,2011-07-07,3,0,7,12,0,4,1,1,0.84,0.7727,0.41,0,32,139,171 +4408,2011-07-07,3,0,7,13,0,4,1,1,0.86,0.7879,0.41,0,48,107,155 +4409,2011-07-07,3,0,7,14,0,4,1,1,0.86,0.7879,0.41,0.2537,39,121,160 +4410,2011-07-07,3,0,7,15,0,4,1,1,0.86,0.7879,0.41,0.2537,47,119,166 +4411,2011-07-07,3,0,7,16,0,4,1,1,0.86,0.7727,0.39,0.2239,47,218,265 +4412,2011-07-07,3,0,7,17,0,4,1,1,0.86,0.7576,0.36,0.2537,80,489,569 +4413,2011-07-07,3,0,7,18,0,4,1,1,0.84,0.7576,0.44,0.2985,70,492,562 +4414,2011-07-07,3,0,7,19,0,4,1,1,0.82,0.7727,0.52,0.2836,79,286,365 +4415,2011-07-07,3,0,7,20,0,4,1,1,0.76,0.7121,0.58,0.3582,61,224,285 +4416,2011-07-07,3,0,7,21,0,4,1,1,0.74,0.6818,0.62,0.1642,39,170,209 +4417,2011-07-07,3,0,7,22,0,4,1,1,0.74,0.6818,0.62,0.1642,32,150,182 +4418,2011-07-07,3,0,7,23,0,4,1,1,0.72,0.6818,0.66,0,31,98,129 +4419,2011-07-08,3,0,7,0,0,5,1,1,0.74,0.697,0.66,0.1045,6,41,47 +4420,2011-07-08,3,0,7,1,0,5,1,1,0.72,0.697,0.79,0,9,25,34 +4421,2011-07-08,3,0,7,2,0,5,1,2,0.7,0.6667,0.74,0.1642,10,12,22 +4422,2011-07-08,3,0,7,3,0,5,1,1,0.68,0.6364,0.79,0.1045,2,4,6 +4423,2011-07-08,3,0,7,4,0,5,1,1,0.7,0.6667,0.79,0.2537,7,4,11 +4424,2011-07-08,3,0,7,5,0,5,1,1,0.68,0.6364,0.79,0.2537,3,24,27 +4425,2011-07-08,3,0,7,6,0,5,1,1,0.7,0.6667,0.74,0.1045,11,91,102 +4426,2011-07-08,3,0,7,7,0,5,1,2,0.7,0.6667,0.79,0.194,30,302,332 +4427,2011-07-08,3,0,7,8,0,5,1,2,0.72,0.697,0.74,0.2537,36,421,457 
+4428,2011-07-08,3,0,7,9,0,5,1,2,0.76,0.7273,0.7,0.2985,30,211,241 +4429,2011-07-08,3,0,7,10,0,5,1,2,0.74,0.697,0.7,0.3582,30,108,138 +4430,2011-07-08,3,0,7,11,0,5,1,2,0.74,0.7121,0.74,0.2537,61,104,165 +4431,2011-07-08,3,0,7,12,0,5,1,2,0.76,0.7273,0.7,0.2836,49,164,213 +4432,2011-07-08,3,0,7,13,0,5,1,2,0.78,0.7576,0.66,0.2836,54,175,229 +4433,2011-07-08,3,0,7,14,0,5,1,2,0.8,0.7879,0.63,0.3582,53,163,216 +4434,2011-07-08,3,0,7,15,0,5,1,2,0.8,0.7879,0.63,0.3582,39,102,141 +4435,2011-07-08,3,0,7,16,0,5,1,3,0.68,0.6364,0.83,0.4478,27,105,132 +4436,2011-07-08,3,0,7,17,0,5,1,3,0.66,0.5909,0.89,0.4478,6,161,167 +4437,2011-07-08,3,0,7,18,0,5,1,1,0.66,0.6061,0.83,0.0896,35,281,316 +4438,2011-07-08,3,0,7,19,0,5,1,1,0.66,0.6061,0.83,0.0896,45,225,270 +4439,2011-07-08,3,0,7,20,0,5,1,1,0.66,0.6061,0.83,0.1642,30,220,250 +4440,2011-07-08,3,0,7,21,0,5,1,2,0.66,0.6061,0.83,0.2537,46,158,204 +4441,2011-07-08,3,0,7,22,0,5,1,2,0.66,0.6061,0.78,0.194,33,136,169 +4442,2011-07-08,3,0,7,23,0,5,1,2,0.66,0.6061,0.78,0.0896,40,111,151 +4443,2011-07-09,3,0,7,0,0,6,0,2,0.66,0.6061,0.83,0.1343,31,90,121 +4444,2011-07-09,3,0,7,1,0,6,0,2,0.64,0.5758,0.89,0.1045,5,48,53 +4445,2011-07-09,3,0,7,2,0,6,0,1,0.64,0.5758,0.89,0.1642,12,43,55 +4446,2011-07-09,3,0,7,3,0,6,0,1,0.64,0.5758,0.89,0.1343,7,23,30 +4447,2011-07-09,3,0,7,4,0,6,0,1,0.64,0.5758,0.83,0.2239,2,4,6 +4448,2011-07-09,3,0,7,5,0,6,0,2,0.62,0.5606,0.88,0.1642,4,11,15 +4449,2011-07-09,3,0,7,6,0,6,0,2,0.64,0.5758,0.83,0.194,11,27,38 +4450,2011-07-09,3,0,7,7,0,6,0,1,0.66,0.6061,0.78,0.194,21,50,71 +4451,2011-07-09,3,0,7,8,0,6,0,1,0.7,0.6515,0.7,0.2537,68,114,182 +4452,2011-07-09,3,0,7,9,0,6,0,1,0.74,0.6818,0.58,0.2836,77,170,247 +4453,2011-07-09,3,0,7,10,0,6,0,1,0.76,0.697,0.52,0.1642,87,177,264 +4454,2011-07-09,3,0,7,11,0,6,0,1,0.78,0.697,0.46,0.2985,101,189,290 +4455,2011-07-09,3,0,7,12,0,6,0,1,0.8,0.7273,0.46,0.2537,145,221,366 +4456,2011-07-09,3,0,7,13,0,6,0,1,0.82,0.7424,0.41,0.2239,167,249,416 
+4457,2011-07-09,3,0,7,14,0,6,0,1,0.84,0.7424,0.36,0.1343,158,215,373 +4458,2011-07-09,3,0,7,15,0,6,0,1,0.84,0.7273,0.34,0.0896,182,220,402 +4459,2011-07-09,3,0,7,16,0,6,0,1,0.82,0.7273,0.38,0,171,245,416 +4460,2011-07-09,3,0,7,17,0,6,0,1,0.84,0.7273,0.32,0,160,241,401 +4461,2011-07-09,3,0,7,18,0,6,0,1,0.82,0.7273,0.36,0.1045,132,246,378 +4462,2011-07-09,3,0,7,19,0,6,0,1,0.76,0.697,0.55,0.1642,128,178,306 +4463,2011-07-09,3,0,7,20,0,6,0,1,0.76,0.7121,0.58,0.2239,129,185,314 +4464,2011-07-09,3,0,7,21,0,6,0,1,0.74,0.6818,0.58,0.1642,83,155,238 +4465,2011-07-09,3,0,7,22,0,6,0,1,0.72,0.6818,0.62,0.1642,60,154,214 +4466,2011-07-09,3,0,7,23,0,6,0,1,0.72,0.6667,0.58,0.194,47,93,140 +4467,2011-07-10,3,0,7,0,0,0,0,1,0.7,0.6515,0.61,0.1642,42,112,154 +4468,2011-07-10,3,0,7,1,0,0,0,1,0.7,0.6515,0.61,0.1343,19,94,113 +4469,2011-07-10,3,0,7,2,0,0,0,1,0.68,0.6364,0.69,0.0896,28,68,96 +4470,2011-07-10,3,0,7,3,0,0,0,1,0.66,0.6212,0.74,0.0896,6,19,25 +4471,2011-07-10,3,0,7,4,0,0,0,1,0.66,0.6212,0.69,0.0896,0,5,5 +4472,2011-07-10,3,0,7,5,0,0,0,1,0.66,0.6212,0.74,0,7,10,17 +4473,2011-07-10,3,0,7,6,0,0,0,1,0.64,0.5909,0.78,0.0896,8,17,25 +4474,2011-07-10,3,0,7,7,0,0,0,1,0.7,0.6515,0.7,0.0896,27,38,65 +4475,2011-07-10,3,0,7,8,0,0,0,1,0.72,0.6818,0.66,0,27,65,92 +4476,2011-07-10,3,0,7,9,0,0,0,1,0.74,0.697,0.66,0.0896,62,107,169 +4477,2011-07-10,3,0,7,10,0,0,0,1,0.76,0.697,0.55,0.1343,95,173,268 +4478,2011-07-10,3,0,7,11,0,0,0,1,0.78,0.7273,0.55,0.194,110,177,287 +4479,2011-07-10,3,0,7,12,0,0,0,1,0.82,0.7576,0.46,0.2836,133,244,377 +4480,2011-07-10,3,0,7,13,0,0,0,1,0.8,0.7273,0.46,0.2985,139,228,367 +4481,2011-07-10,3,0,7,14,0,0,0,1,0.82,0.7424,0.41,0.3284,122,227,349 +4482,2011-07-10,3,0,7,15,0,0,0,1,0.84,0.7424,0.39,0.2836,142,219,361 +4483,2011-07-10,3,0,7,16,0,0,0,1,0.84,0.7576,0.41,0.2836,128,244,372 +4484,2011-07-10,3,0,7,17,0,0,0,1,0.84,0.7424,0.36,0.2239,146,217,363 +4485,2011-07-10,3,0,7,18,0,0,0,1,0.82,0.7424,0.43,0.2836,141,222,363 
+4486,2011-07-10,3,0,7,19,0,0,0,1,0.8,0.7424,0.49,0.3284,120,183,303 +4487,2011-07-10,3,0,7,20,0,0,0,1,0.76,0.697,0.55,0.194,101,167,268 +4488,2011-07-10,3,0,7,21,0,0,0,1,0.74,0.6818,0.62,0.2537,65,162,227 +4489,2011-07-10,3,0,7,22,0,0,0,1,0.74,0.697,0.66,0.2239,56,87,143 +4490,2011-07-10,3,0,7,23,0,0,0,1,0.72,0.6818,0.66,0.2537,19,53,72 +4491,2011-07-11,3,0,7,0,0,1,1,1,0.7,0.6515,0.65,0.194,10,25,35 +4492,2011-07-11,3,0,7,1,0,1,1,1,0.7,0.6515,0.61,0.2836,5,5,10 +4493,2011-07-11,3,0,7,2,0,1,1,1,0.7,0.6515,0.61,0.2239,8,4,12 +4494,2011-07-11,3,0,7,3,0,1,1,1,0.68,0.6364,0.65,0.1642,2,8,10 +4495,2011-07-11,3,0,7,4,0,1,1,1,0.66,0.6212,0.74,0.2239,0,4,4 +4496,2011-07-11,3,0,7,5,0,1,1,1,0.66,0.6212,0.74,0.1343,5,21,26 +4497,2011-07-11,3,0,7,6,0,1,1,1,0.68,0.6364,0.74,0.194,19,102,121 +4498,2011-07-11,3,0,7,7,0,1,1,1,0.7,0.6667,0.74,0.2537,28,289,317 +4499,2011-07-11,3,0,7,8,0,1,1,1,0.74,0.697,0.66,0.2836,35,385,420 +4500,2011-07-11,3,0,7,9,0,1,1,1,0.78,0.7424,0.59,0.2985,41,167,208 +4501,2011-07-11,3,0,7,10,0,1,1,1,0.8,0.7576,0.55,0.3582,44,73,117 +4502,2011-07-11,3,0,7,11,0,1,1,1,0.82,0.7879,0.56,0.2985,39,89,128 +4503,2011-07-11,3,0,7,12,0,1,1,1,0.84,0.803,0.53,0.2836,40,108,148 +4504,2011-07-11,3,0,7,13,0,1,1,1,0.86,0.8182,0.5,0.2537,22,119,141 +4505,2011-07-11,3,0,7,14,0,1,1,1,0.86,0.8333,0.53,0.2836,40,94,134 +4506,2011-07-11,3,0,7,15,0,1,1,1,0.86,0.8485,0.56,0.4179,38,108,146 +4507,2011-07-11,3,0,7,16,0,1,1,1,0.86,0.8485,0.56,0.3881,57,182,239 +4508,2011-07-11,3,0,7,17,0,1,1,1,0.86,0.8485,0.56,0.4179,59,455,514 +4509,2011-07-11,3,0,7,18,0,1,1,1,0.86,0.8333,0.53,0.3881,57,415,472 +4510,2011-07-11,3,0,7,19,0,1,1,1,0.84,0.8333,0.59,0.3284,80,293,373 +4511,2011-07-11,3,0,7,20,0,1,1,1,0.72,0.697,0.74,0.4179,42,194,236 +4512,2011-07-11,3,0,7,21,0,1,1,1,0.7,0.6667,0.79,0.4179,21,109,130 +4513,2011-07-11,3,0,7,22,0,1,1,1,0.72,0.697,0.74,0.1343,20,70,90 +4514,2011-07-11,3,0,7,23,0,1,1,1,0.7,0.6667,0.79,0.1343,11,44,55 
+4515,2011-07-12,3,0,7,0,0,2,1,1,0.7,0.6667,0.79,0.1343,3,19,22 +4516,2011-07-12,3,0,7,1,0,2,1,1,0.7,0.6667,0.79,0.0896,6,8,14 +4517,2011-07-12,3,0,7,2,0,2,1,1,0.7,0.6667,0.79,0.1045,2,7,9 +4518,2011-07-12,3,0,7,3,0,2,1,2,0.7,0.6667,0.79,0,2,6,8 +4519,2011-07-12,3,0,7,4,0,2,1,1,0.7,0.6667,0.79,0.1642,0,6,6 +4520,2011-07-12,3,0,7,5,0,2,1,1,0.7,0.6667,0.79,0.0896,4,21,25 +4521,2011-07-12,3,0,7,6,0,2,1,1,0.74,0.6818,0.58,0.2239,13,102,115 +4522,2011-07-12,3,0,7,7,0,2,1,1,0.76,0.697,0.55,0.194,29,301,330 +4523,2011-07-12,3,0,7,8,0,2,1,1,0.76,0.7121,0.58,0.2239,35,382,417 +4524,2011-07-12,3,0,7,9,0,2,1,1,0.78,0.7273,0.55,0.4925,27,149,176 +4525,2011-07-12,3,0,7,10,0,2,1,1,0.82,0.7727,0.52,0.1642,38,85,123 +4526,2011-07-12,3,0,7,11,0,2,1,1,0.82,0.7727,0.52,0.1642,39,86,125 +4527,2011-07-12,3,0,7,12,0,2,1,1,0.86,0.7879,0.44,0.2985,38,125,163 +4528,2011-07-12,3,0,7,13,0,2,1,1,0.86,0.803,0.47,0.2239,32,104,136 +4529,2011-07-12,3,0,7,14,0,2,1,1,0.9,0.8485,0.42,0.2985,33,91,124 +4530,2011-07-12,3,0,7,15,0,2,1,1,0.9,0.8485,0.42,0.3582,37,111,148 +4531,2011-07-12,3,0,7,16,0,2,1,1,0.86,0.7879,0.44,0.2836,25,177,202 +4532,2011-07-12,3,0,7,17,0,2,1,1,0.88,0.8182,0.44,0.2836,48,380,428 +4533,2011-07-12,3,0,7,18,0,2,1,1,0.86,0.803,0.47,0.2985,39,472,511 +4534,2011-07-12,3,0,7,19,0,2,1,1,0.84,0.7576,0.44,0.2836,42,295,337 +4535,2011-07-12,3,0,7,20,0,2,1,1,0.82,0.7424,0.43,0.194,57,251,308 +4536,2011-07-12,3,0,7,21,0,2,1,1,0.82,0.7424,0.43,0.1045,49,209,258 +4537,2011-07-12,3,0,7,22,0,2,1,1,0.8,0.7273,0.46,0.1343,44,150,194 +4538,2011-07-12,3,0,7,23,0,2,1,1,0.78,0.7121,0.52,0,20,59,79 +4539,2011-07-13,3,0,7,0,0,3,1,1,0.76,0.7121,0.62,0,9,37,46 +4540,2011-07-13,3,0,7,1,0,3,1,1,0.76,0.697,0.55,0,1,11,12 +4541,2011-07-13,3,0,7,2,0,3,1,1,0.76,0.697,0.55,0,2,3,5 +4542,2011-07-13,3,0,7,3,0,3,1,1,0.74,0.6818,0.58,0.194,0,4,4 +4543,2011-07-13,3,0,7,4,0,3,1,1,0.74,0.6818,0.58,0.2537,0,5,5 +4544,2011-07-13,3,0,7,5,0,3,1,1,0.74,0.6818,0.58,0.194,4,22,26 
+4545,2011-07-13,3,0,7,6,0,3,1,1,0.74,0.6818,0.58,0.1642,18,103,121 +4546,2011-07-13,3,0,7,7,0,3,1,1,0.76,0.697,0.55,0.2537,32,281,313 +4547,2011-07-13,3,0,7,8,0,3,1,1,0.8,0.7424,0.49,0.2537,31,418,449 +4548,2011-07-13,3,0,7,9,0,3,1,1,0.82,0.7424,0.41,0.194,23,163,186 +4549,2011-07-13,3,0,7,10,0,3,1,1,0.82,0.7576,0.46,0.194,33,80,113 +4550,2011-07-13,3,0,7,11,0,3,1,1,0.84,0.7424,0.39,0.1642,55,110,165 +4551,2011-07-13,3,0,7,12,0,3,1,1,0.84,0.7576,0.41,0.2836,40,134,174 +4552,2011-07-13,3,0,7,13,0,3,1,3,0.82,0.7576,0.46,0.1343,39,119,158 +4553,2011-07-13,3,0,7,14,0,3,1,3,0.82,0.7576,0.46,0.1343,27,89,116 +4554,2011-07-13,3,0,7,15,0,3,1,3,0.64,0.5606,0.94,0.1343,13,33,46 +4555,2011-07-13,3,0,7,16,0,3,1,2,0.66,0.5909,0.89,0,23,118,141 +4556,2011-07-13,3,0,7,17,0,3,1,1,0.7,0.6667,0.79,0,70,418,488 +4557,2011-07-13,3,0,7,18,0,3,1,1,0.72,0.697,0.79,0.1343,51,412,463 +4558,2011-07-13,3,0,7,19,0,3,1,1,0.7,0.6667,0.79,0.1045,79,340,419 +4559,2011-07-13,3,0,7,20,0,3,1,1,0.7,0.6667,0.84,0.1642,61,281,342 +4560,2011-07-13,3,0,7,21,0,3,1,1,0.7,0.6667,0.79,0.1045,56,194,250 +4561,2011-07-13,3,0,7,22,0,3,1,1,0.68,0.6364,0.83,0.194,48,136,184 +4562,2011-07-13,3,0,7,23,0,3,1,1,0.66,0.6061,0.83,0.2537,33,83,116 +4563,2011-07-14,3,0,7,0,0,4,1,1,0.66,0.6212,0.69,0.2985,14,32,46 +4564,2011-07-14,3,0,7,1,0,4,1,1,0.66,0.6212,0.61,0.2537,4,21,25 +4565,2011-07-14,3,0,7,2,0,4,1,1,0.64,0.6212,0.57,0.2985,3,7,10 +4566,2011-07-14,3,0,7,3,0,4,1,1,0.62,0.6212,0.57,0.2985,1,5,6 +4567,2011-07-14,3,0,7,4,0,4,1,1,0.62,0.6212,0.53,0.2537,0,6,6 +4568,2011-07-14,3,0,7,5,0,4,1,1,0.6,0.6212,0.56,0.2239,6,22,28 +4569,2011-07-14,3,0,7,6,0,4,1,1,0.6,0.6061,0.6,0.2239,14,116,130 +4570,2011-07-14,3,0,7,7,0,4,1,1,0.62,0.6212,0.53,0.2537,23,311,334 +4571,2011-07-14,3,0,7,8,0,4,1,1,0.64,0.6212,0.5,0.3582,53,433,486 +4572,2011-07-14,3,0,7,9,0,4,1,1,0.68,0.6364,0.47,0.2836,29,195,224 +4573,2011-07-14,3,0,7,10,0,4,1,1,0.7,0.6364,0.42,0.2537,32,95,127 
+4574,2011-07-14,3,0,7,11,0,4,1,1,0.72,0.6515,0.37,0.2239,38,110,148 +4575,2011-07-14,3,0,7,12,0,4,1,1,0.74,0.6515,0.35,0.2537,51,172,223 +4576,2011-07-14,3,0,7,13,0,4,1,1,0.76,0.6667,0.33,0.3284,39,138,177 +4577,2011-07-14,3,0,7,14,0,4,1,1,0.74,0.6515,0.35,0.2836,49,129,178 +4578,2011-07-14,3,0,7,15,0,4,1,1,0.76,0.6667,0.33,0.2985,42,138,180 +4579,2011-07-14,3,0,7,16,0,4,1,1,0.76,0.6667,0.33,0.1343,43,231,274 +4580,2011-07-14,3,0,7,17,0,4,1,1,0.74,0.6515,0.37,0.1343,78,517,595 +4581,2011-07-14,3,0,7,18,0,4,1,1,0.72,0.6515,0.42,0.194,94,484,578 +4582,2011-07-14,3,0,7,19,0,4,1,1,0.72,0.6515,0.39,0.2239,67,333,400 +4583,2011-07-14,3,0,7,20,0,4,1,1,0.7,0.6364,0.42,0.2239,81,267,348 +4584,2011-07-14,3,0,7,21,0,4,1,1,0.66,0.6212,0.5,0.1343,58,203,261 +4585,2011-07-14,3,0,7,22,0,4,1,1,0.64,0.6212,0.57,0.0896,42,134,176 +4586,2011-07-14,3,0,7,23,0,4,1,1,0.64,0.6061,0.65,0.2537,27,97,124 +4587,2011-07-15,3,0,7,0,0,5,1,1,0.62,0.5909,0.73,0.2239,23,57,80 +4588,2011-07-15,3,0,7,1,0,5,1,1,0.62,0.5909,0.73,0.194,7,13,20 +4589,2011-07-15,3,0,7,2,0,5,1,1,0.6,0.5758,0.78,0.1642,16,22,38 +4590,2011-07-15,3,0,7,3,0,5,1,1,0.6,0.5909,0.73,0.194,1,6,7 +4591,2011-07-15,3,0,7,4,0,5,1,1,0.6,0.5909,0.73,0.1642,2,8,10 +4592,2011-07-15,3,0,7,5,0,5,1,1,0.6,0.5909,0.73,0.1343,7,16,23 +4593,2011-07-15,3,0,7,6,0,5,1,2,0.6,0.5909,0.73,0.1045,12,108,120 +4594,2011-07-15,3,0,7,7,0,5,1,1,0.64,0.6061,0.65,0.1045,17,257,274 +4595,2011-07-15,3,0,7,8,0,5,1,1,0.64,0.6061,0.65,0.2537,50,514,564 +4596,2011-07-15,3,0,7,9,0,5,1,1,0.68,0.6364,0.54,0.194,31,176,207 +4597,2011-07-15,3,0,7,10,0,5,1,2,0.66,0.6212,0.61,0.1642,59,107,166 +4598,2011-07-15,3,0,7,11,0,5,1,1,0.7,0.6515,0.54,0.1343,73,137,210 +4599,2011-07-15,3,0,7,12,0,5,1,1,0.7,0.6515,0.48,0.194,61,177,238 +4600,2011-07-15,3,0,7,13,0,5,1,1,0.7,0.6515,0.54,0.194,85,162,247 +4601,2011-07-15,3,0,7,14,0,5,1,1,0.7,0.6515,0.51,0.194,105,160,265 +4602,2011-07-15,3,0,7,15,0,5,1,1,0.72,0.6515,0.45,0.1642,74,163,237 
+4603,2011-07-15,3,0,7,16,0,5,1,1,0.72,0.6667,0.51,0.194,96,265,361 +4604,2011-07-15,3,0,7,17,0,5,1,1,0.74,0.6667,0.42,0.2537,104,483,587 +4605,2011-07-15,3,0,7,18,0,5,1,1,0.72,0.6515,0.45,0.2239,102,394,496 +4606,2011-07-15,3,0,7,19,0,5,1,1,0.7,0.6364,0.45,0.2239,114,280,394 +4607,2011-07-15,3,0,7,20,0,5,1,1,0.7,0.6515,0.51,0.2239,92,251,343 +4608,2011-07-15,3,0,7,21,0,5,1,1,0.66,0.6212,0.54,0.1045,75,186,261 +4609,2011-07-15,3,0,7,22,0,5,1,1,0.66,0.6212,0.57,0.2239,71,152,223 +4610,2011-07-15,3,0,7,23,0,5,1,1,0.64,0.6212,0.61,0.1642,41,126,167 +4611,2011-07-16,3,0,7,0,0,6,0,1,0.62,0.6061,0.65,0.194,42,68,110 +4612,2011-07-16,3,0,7,1,0,6,0,1,0.6,0.5909,0.69,0.1045,24,51,75 +4613,2011-07-16,3,0,7,2,0,6,0,1,0.6,0.5909,0.73,0.1343,10,48,58 +4614,2011-07-16,3,0,7,3,0,6,0,1,0.6,0.5758,0.78,0.1343,6,20,26 +4615,2011-07-16,3,0,7,4,0,6,0,1,0.6,0.5909,0.73,0.0896,9,7,16 +4616,2011-07-16,3,0,7,5,0,6,0,1,0.58,0.5455,0.78,0.0896,4,6,10 +4617,2011-07-16,3,0,7,6,0,6,0,1,0.6,0.5758,0.78,0.1642,16,21,37 +4618,2011-07-16,3,0,7,7,0,6,0,1,0.62,0.5909,0.73,0.1343,14,38,52 +4619,2011-07-16,3,0,7,8,0,6,0,1,0.66,0.6212,0.65,0.1642,37,90,127 +4620,2011-07-16,3,0,7,9,0,6,0,1,0.7,0.6515,0.54,0.2239,72,150,222 +4621,2011-07-16,3,0,7,10,0,6,0,1,0.72,0.6667,0.54,0.194,114,208,322 +4622,2011-07-16,3,0,7,11,0,6,0,1,0.74,0.6667,0.48,0.2239,171,225,396 +4623,2011-07-16,3,0,7,12,0,6,0,1,0.76,0.6818,0.43,0.2836,187,276,463 +4624,2011-07-16,3,0,7,13,0,6,0,1,0.76,0.6818,0.45,0.2985,221,276,497 +4625,2011-07-16,3,0,7,14,0,6,0,1,0.74,0.6667,0.45,0.2985,201,232,433 +4626,2011-07-16,3,0,7,15,0,6,0,1,0.76,0.6818,0.45,0.3284,205,223,428 +4627,2011-07-16,3,0,7,16,0,6,0,1,0.76,0.6818,0.45,0.2836,183,242,425 +4628,2011-07-16,3,0,7,17,0,6,0,1,0.76,0.6818,0.45,0.2239,234,241,475 +4629,2011-07-16,3,0,7,18,0,6,0,1,0.76,0.6818,0.45,0.2836,185,243,428 +4630,2011-07-16,3,0,7,19,0,6,0,1,0.74,0.6818,0.48,0.194,164,242,406 +4631,2011-07-16,3,0,7,20,0,6,0,1,0.72,0.6667,0.51,0.2239,108,188,296 
+4632,2011-07-16,3,0,7,21,0,6,0,1,0.7,0.6515,0.58,0.2239,85,167,252 +4633,2011-07-16,3,0,7,22,0,6,0,1,0.7,0.6515,0.65,0.2836,65,136,201 +4634,2011-07-16,3,0,7,23,0,6,0,1,0.68,0.6364,0.61,0.2239,61,107,168 +4635,2011-07-17,3,0,7,0,0,0,0,1,0.66,0.6212,0.65,0.2239,34,91,125 +4636,2011-07-17,3,0,7,1,0,0,0,1,0.64,0.6061,0.69,0.194,16,86,102 +4637,2011-07-17,3,0,7,2,0,0,0,1,0.64,0.6061,0.69,0.2239,28,66,94 +4638,2011-07-17,3,0,7,3,0,0,0,1,0.64,0.6061,0.69,0.2239,21,26,47 +4639,2011-07-17,3,0,7,4,0,0,0,1,0.62,0.5909,0.73,0.1642,4,6,10 +4640,2011-07-17,3,0,7,5,0,0,0,1,0.64,0.6061,0.69,0.2239,4,8,12 +4641,2011-07-17,3,0,7,6,0,0,0,1,0.62,0.5909,0.73,0.1642,10,11,21 +4642,2011-07-17,3,0,7,7,0,0,0,1,0.64,0.6061,0.73,0.1642,20,30,50 +4643,2011-07-17,3,0,7,8,0,0,0,1,0.68,0.6364,0.65,0.2239,45,73,118 +4644,2011-07-17,3,0,7,9,0,0,0,1,0.72,0.6667,0.58,0.2239,74,110,184 +4645,2011-07-17,3,0,7,10,0,0,0,1,0.74,0.6818,0.6,0.2239,127,177,304 +4646,2011-07-17,3,0,7,11,0,0,0,1,0.76,0.697,0.55,0.2836,147,244,391 +4647,2011-07-17,3,0,7,12,0,0,0,1,0.76,0.697,0.55,0.2836,177,243,420 +4648,2011-07-17,3,0,7,13,0,0,0,1,0.8,0.7424,0.49,0.2985,200,272,472 +4649,2011-07-17,3,0,7,14,0,0,0,1,0.8,0.7424,0.49,0.3582,168,262,430 +4650,2011-07-17,3,0,7,15,0,0,0,1,0.82,0.7424,0.43,0.2985,129,195,324 +4651,2011-07-17,3,0,7,16,0,0,0,1,0.8,0.7273,0.46,0.2985,129,188,317 +4652,2011-07-17,3,0,7,17,0,0,0,1,0.8,0.7424,0.49,0.3881,133,236,369 +4653,2011-07-17,3,0,7,18,0,0,0,1,0.8,0.7273,0.46,0.2985,147,243,390 +4654,2011-07-17,3,0,7,19,0,0,0,1,0.76,0.697,0.55,0.2836,145,234,379 +4655,2011-07-17,3,0,7,20,0,0,0,1,0.76,0.7121,0.58,0.2537,85,177,262 +4656,2011-07-17,3,0,7,21,0,0,0,1,0.74,0.6818,0.62,0.2239,59,148,207 +4657,2011-07-17,3,0,7,22,0,0,0,1,0.72,0.6818,0.66,0.2239,65,116,181 +4658,2011-07-17,3,0,7,23,0,0,0,1,0.7,0.6667,0.74,0.1343,39,54,93 +4659,2011-07-18,3,0,7,0,0,1,1,1,0.7,0.6667,0.74,0.1343,21,30,51 +4660,2011-07-18,3,0,7,1,0,1,1,1,0.7,0.6667,0.74,0.2239,17,8,25 
+4661,2011-07-18,3,0,7,2,0,1,1,1,0.66,0.6061,0.83,0.2239,2,8,10 +4662,2011-07-18,3,0,7,3,0,1,1,1,0.66,0.6061,0.78,0.194,3,4,7 +4663,2011-07-18,3,0,7,4,0,1,1,1,0.64,0.5909,0.78,0.1642,1,3,4 +4664,2011-07-18,3,0,7,5,0,1,1,1,0.64,0.5909,0.78,0.1045,2,15,17 +4665,2011-07-18,3,0,7,6,0,1,1,1,0.64,0.5758,0.83,0.1045,10,95,105 +4666,2011-07-18,3,0,7,7,0,1,1,1,0.68,0.6364,0.74,0.2239,22,255,277 +4667,2011-07-18,3,0,7,8,0,1,1,1,0.7,0.6515,0.7,0.194,18,329,347 +4668,2011-07-18,3,0,7,9,0,1,1,1,0.74,0.697,0.67,0.2239,33,170,203 +4669,2011-07-18,3,0,7,10,0,1,1,1,0.76,0.7121,0.62,0.194,52,78,130 +4670,2011-07-18,3,0,7,11,0,1,1,1,0.8,0.7576,0.55,0.0896,42,99,141 +4671,2011-07-18,3,0,7,12,0,1,1,1,0.8,0.7727,0.59,0.194,39,126,165 +4672,2011-07-18,3,0,7,13,0,1,1,1,0.82,0.7879,0.56,0.2537,27,125,152 +4673,2011-07-18,3,0,7,14,0,1,1,1,0.82,0.7879,0.56,0.3284,32,106,138 +4674,2011-07-18,3,0,7,15,0,1,1,1,0.84,0.803,0.53,0.3284,53,111,164 +4675,2011-07-18,3,0,7,16,0,1,1,1,0.84,0.803,0.53,0.2836,45,220,265 +4676,2011-07-18,3,0,7,17,0,1,1,1,0.84,0.7879,0.49,0.3284,72,473,545 +4677,2011-07-18,3,0,7,18,0,1,1,1,0.82,0.7727,0.49,0.3582,80,478,558 +4678,2011-07-18,3,0,7,19,0,1,1,1,0.8,0.7576,0.55,0.2239,63,335,398 +4679,2011-07-18,3,0,7,20,0,1,1,1,0.78,0.7424,0.59,0.2239,91,232,323 +4680,2011-07-18,3,0,7,21,0,1,1,1,0.74,0.697,0.66,0.194,48,154,202 +4681,2011-07-18,3,0,7,22,0,1,1,1,0.76,0.7273,0.66,0.2239,36,104,140 +4682,2011-07-18,3,0,7,23,0,1,1,1,0.74,0.697,0.66,0.1642,32,59,91 +4683,2011-07-19,3,0,7,0,0,2,1,1,0.74,0.697,0.66,0.1045,25,26,51 +4684,2011-07-19,3,0,7,1,0,2,1,2,0.74,0.697,0.66,0.1343,7,6,13 +4685,2011-07-19,3,0,7,2,0,2,1,1,0.72,0.6818,0.7,0.1642,9,4,13 +4686,2011-07-19,3,0,7,3,0,2,1,1,0.72,0.6818,0.7,0.1642,2,1,3 +4687,2011-07-19,3,0,7,4,0,2,1,1,0.72,0.6818,0.7,0.1343,1,4,5 +4688,2011-07-19,3,0,7,5,0,2,1,2,0.7,0.6667,0.74,0.1045,0,19,19 +4689,2011-07-19,3,0,7,6,0,2,1,2,0.72,0.697,0.74,0.1045,8,126,134 +4690,2011-07-19,3,0,7,7,0,2,1,2,0.74,0.697,0.7,0.1642,28,287,315 
+4691,2011-07-19,3,0,7,8,0,2,1,2,0.76,0.7273,0.7,0.1642,31,381,412 +4692,2011-07-19,3,0,7,9,0,2,1,2,0.8,0.7879,0.63,0.1343,22,177,199 +4693,2011-07-19,3,0,7,10,0,2,1,2,0.82,0.7879,0.56,0.1642,29,112,141 +4694,2011-07-19,3,0,7,11,0,2,1,1,0.82,0.803,0.59,0,29,98,127 +4695,2011-07-19,3,0,7,12,0,2,1,1,0.84,0.8182,0.56,0.0896,26,127,153 +4696,2011-07-19,3,0,7,13,0,2,1,1,0.86,0.8333,0.53,0.1045,33,139,172 +4697,2011-07-19,3,0,7,14,0,2,1,1,0.86,0.8182,0.5,0.194,41,111,152 +4698,2011-07-19,3,0,7,15,0,2,1,1,0.88,0.8182,0.44,0.194,48,110,158 +4699,2011-07-19,3,0,7,16,0,2,1,1,0.9,0.8485,0.42,0.2985,61,216,277 +4700,2011-07-19,3,0,7,17,0,2,1,1,0.8,0.803,0.66,0,68,445,513 +4701,2011-07-19,3,0,7,18,0,2,1,1,0.8,0.803,0.66,0.1343,80,450,530 +4702,2011-07-19,3,0,7,19,0,2,1,1,0.76,0.7424,0.75,0.194,54,334,388 +4703,2011-07-19,3,0,7,20,0,2,1,1,0.74,0.7273,0.74,0.0896,48,229,277 +4704,2011-07-19,3,0,7,21,0,2,1,1,0.74,0.7121,0.74,0.1343,47,194,241 +4705,2011-07-19,3,0,7,22,0,2,1,1,0.74,0.7121,0.74,0,36,120,156 +4706,2011-07-19,3,0,7,23,0,2,1,3,0.72,0.697,0.79,0.1642,19,73,92 +4707,2011-07-20,3,0,7,0,0,3,1,1,0.7,0.6667,0.84,0.1045,11,29,40 +4708,2011-07-20,3,0,7,1,0,3,1,1,0.7,0.6667,0.84,0,4,7,11 +4709,2011-07-20,3,0,7,2,0,3,1,1,0.7,0.6667,0.84,0,2,7,9 +4710,2011-07-20,3,0,7,3,0,3,1,1,0.7,0.6667,0.84,0.1045,2,4,6 +4711,2011-07-20,3,0,7,4,0,3,1,1,0.68,0.6364,0.83,0,0,4,4 +4712,2011-07-20,3,0,7,5,0,3,1,1,0.68,0.6364,0.83,0.1045,4,19,23 +4713,2011-07-20,3,0,7,6,0,3,1,2,0.68,0.6364,0.89,0.1045,5,111,116 +4714,2011-07-20,3,0,7,7,0,3,1,2,0.7,0.6667,0.84,0.1045,27,276,303 +4715,2011-07-20,3,0,7,8,0,3,1,1,0.74,0.7121,0.74,0,34,404,438 +4716,2011-07-20,3,0,7,9,0,3,1,1,0.74,0.697,0.7,0.0896,28,181,209 +4717,2011-07-20,3,0,7,10,0,3,1,1,0.76,0.7273,0.7,0,34,76,110 +4718,2011-07-20,3,0,7,11,0,3,1,1,0.8,0.7879,0.63,0.0896,26,116,142 +4719,2011-07-20,3,0,7,12,0,3,1,1,0.82,0.803,0.59,0,28,121,149 +4720,2011-07-20,3,0,7,13,0,3,1,1,0.84,0.8182,0.56,0.194,23,97,120 
+4721,2011-07-20,3,0,7,14,0,3,1,1,0.86,0.8333,0.53,0.2239,44,83,127 +4722,2011-07-20,3,0,7,15,0,3,1,1,0.86,0.8182,0.5,0.1642,36,102,138 +4723,2011-07-20,3,0,7,16,0,3,1,1,0.84,0.8182,0.56,0,30,204,234 +4724,2011-07-20,3,0,7,17,0,3,1,1,0.84,0.8333,0.59,0.194,50,405,455 +4725,2011-07-20,3,0,7,18,0,3,1,1,0.84,0.8333,0.59,0.2239,68,429,497 +4726,2011-07-20,3,0,7,19,0,3,1,1,0.82,0.8485,0.67,0.1642,59,323,382 +4727,2011-07-20,3,0,7,20,0,3,1,1,0.8,0.8333,0.75,0.2239,45,264,309 +4728,2011-07-20,3,0,7,21,0,3,1,1,0.8,0.803,0.66,0.1642,33,209,242 +4729,2011-07-20,3,0,7,22,0,3,1,1,0.78,0.7727,0.7,0.2239,25,144,169 +4730,2011-07-20,3,0,7,23,0,3,1,1,0.76,0.7424,0.75,0.2537,26,73,99 +4731,2011-07-21,3,0,7,0,0,4,1,2,0.76,0.7424,0.75,0.194,13,29,42 +4732,2011-07-21,3,0,7,1,0,4,1,2,0.74,0.7273,0.84,0.2239,5,16,21 +4733,2011-07-21,3,0,7,2,0,4,1,2,0.74,0.7273,0.84,0.2239,2,4,6 +4734,2011-07-21,3,0,7,3,0,4,1,2,0.74,0.7121,0.79,0.1343,1,5,6 +4735,2011-07-21,3,0,7,4,0,4,1,2,0.72,0.7121,0.84,0.1045,0,4,4 +4736,2011-07-21,3,0,7,5,0,4,1,2,0.72,0.7121,0.84,0.1045,2,16,18 +4737,2011-07-21,3,0,7,6,0,4,1,2,0.72,0.7121,0.84,0.1045,6,111,117 +4738,2011-07-21,3,0,7,7,0,4,1,2,0.74,0.7273,0.84,0.1343,23,251,274 +4739,2011-07-21,3,0,7,8,0,4,1,2,0.76,0.7576,0.79,0.1642,23,358,381 +4740,2011-07-21,3,0,7,9,0,4,1,2,0.8,0.8182,0.75,0.2537,34,166,200 +4741,2011-07-21,3,0,7,10,0,4,1,2,0.82,0.8636,0.71,0.2239,31,47,78 +4742,2011-07-21,3,0,7,11,0,4,1,2,0.86,0.8939,0.63,0.2985,27,79,106 +4743,2011-07-21,3,0,7,12,0,4,1,1,0.88,0.8939,0.56,0.3284,29,106,135 +4744,2011-07-21,3,0,7,13,0,4,1,1,0.88,0.8939,0.56,0.2985,21,81,102 +4745,2011-07-21,3,0,7,14,0,4,1,1,0.9,0.8939,0.5,0.3582,50,97,147 +4746,2011-07-21,3,0,7,15,0,4,1,1,0.9,0.9242,0.53,0.2537,40,93,133 +4747,2011-07-21,3,0,7,16,0,4,1,1,0.92,0.9242,0.48,0.2836,43,167,210 +4748,2011-07-21,3,0,7,17,0,4,1,1,0.92,0.9091,0.45,0.2985,40,374,414 +4749,2011-07-21,3,0,7,18,0,4,1,1,0.9,0.8939,0.5,0.2836,39,343,382 
+4750,2011-07-21,3,0,7,19,0,4,1,1,0.86,0.9242,0.67,0.2836,51,233,284 +4751,2011-07-21,3,0,7,20,0,4,1,1,0.84,0.8939,0.71,0.2239,38,199,237 +4752,2011-07-21,3,0,7,21,0,4,1,1,0.82,0.8939,0.75,0.2239,38,187,225 +4753,2011-07-21,3,0,7,22,0,4,1,1,0.82,0.8636,0.71,0.1642,36,113,149 +4754,2011-07-21,3,0,7,23,0,4,1,1,0.8,0.8182,0.71,0.1642,40,73,113 +4755,2011-07-22,3,0,7,0,0,5,1,1,0.82,0.8333,0.63,0.1343,9,51,60 +4756,2011-07-22,3,0,7,1,0,5,1,1,0.8,0.7879,0.63,0,7,17,24 +4757,2011-07-22,3,0,7,2,0,5,1,1,0.78,0.803,0.79,0.1343,0,14,14 +4758,2011-07-22,3,0,7,3,0,5,1,1,0.78,0.7879,0.75,0.1045,1,6,7 +4759,2011-07-22,3,0,7,4,0,5,1,1,0.76,0.7424,0.75,0.1045,8,5,13 +4760,2011-07-22,3,0,7,5,0,5,1,1,0.74,0.7121,0.79,0.1045,2,17,19 +4761,2011-07-22,3,0,7,6,0,5,1,1,0.76,0.7424,0.75,0.0896,13,83,96 +4762,2011-07-22,3,0,7,7,0,5,1,2,0.8,0.803,0.66,0.1045,20,232,252 +4763,2011-07-22,3,0,7,8,0,5,1,2,0.84,0.8485,0.63,0.1642,30,292,322 +4764,2011-07-22,3,0,7,9,0,5,1,1,0.86,0.8939,0.63,0.0896,31,177,208 +4765,2011-07-22,3,0,7,10,0,5,1,1,0.9,0.9242,0.53,0.0896,32,83,115 +4766,2011-07-22,3,0,7,11,0,5,1,1,0.9,0.8939,0.5,0.1045,23,86,109 +4767,2011-07-22,3,0,7,12,0,5,1,1,0.94,0.9545,0.48,0.1642,20,95,115 +4768,2011-07-22,3,0,7,13,0,5,1,1,0.94,0.9848,0.51,0.1642,25,98,123 +4769,2011-07-22,3,0,7,14,0,5,1,1,0.96,1,0.48,0.2985,24,77,101 +4770,2011-07-22,3,0,7,15,0,5,1,1,0.94,0.9848,0.51,0.2985,32,101,133 +4771,2011-07-22,3,0,7,16,0,5,1,3,0.9,0.8333,0.39,0.2985,29,153,182 +4772,2011-07-22,3,0,7,17,0,5,1,1,0.88,0.8485,0.47,0.1045,35,271,306 +4773,2011-07-22,3,0,7,18,0,5,1,1,0.9,0.8182,0.37,0.1642,34,250,284 +4774,2011-07-22,3,0,7,19,0,5,1,2,0.86,0.7879,0.41,0.1045,40,210,250 +4775,2011-07-22,3,0,7,20,0,5,1,1,0.84,0.8182,0.56,0,46,166,212 +4776,2011-07-22,3,0,7,21,0,5,1,1,0.82,0.803,0.59,0.0896,38,152,190 +4777,2011-07-22,3,0,7,22,0,5,1,2,0.84,0.7879,0.49,0.0896,44,105,149 +4778,2011-07-22,3,0,7,23,0,5,1,1,0.8,0.7879,0.63,0.194,19,84,103 +4779,2011-07-23,3,0,7,0,0,6,0,1,0.82,0.7879,0.56,0,16,85,101 
+4780,2011-07-23,3,0,7,1,0,6,0,1,0.82,0.7727,0.52,0.1045,13,57,70 +4781,2011-07-23,3,0,7,2,0,6,0,1,0.82,0.7727,0.52,0,13,48,61 +4782,2011-07-23,3,0,7,3,0,6,0,1,0.78,0.7576,0.66,0.0896,6,16,22 +4783,2011-07-23,3,0,7,4,0,6,0,1,0.76,0.7273,0.66,0.1045,5,4,9 +4784,2011-07-23,3,0,7,5,0,6,0,1,0.76,0.7121,0.62,0.1343,1,6,7 +4785,2011-07-23,3,0,7,6,0,6,0,1,0.8,0.7424,0.52,0.1343,6,19,25 +4786,2011-07-23,3,0,7,7,0,6,0,1,0.8,0.7424,0.52,0.1045,7,38,45 +4787,2011-07-23,3,0,7,8,0,6,0,1,0.84,0.7879,0.49,0.2537,24,85,109 +4788,2011-07-23,3,0,7,9,0,6,0,1,0.84,0.7879,0.49,0.1642,30,102,132 +4789,2011-07-23,3,0,7,10,0,6,0,1,0.86,0.7879,0.41,0.2239,61,137,198 +4790,2011-07-23,3,0,7,11,0,6,0,1,0.9,0.8182,0.37,0.194,62,154,216 +4791,2011-07-23,3,0,7,12,0,6,0,1,0.92,0.8636,0.37,0.1045,71,164,235 +4792,2011-07-23,3,0,7,13,0,6,0,1,0.94,0.8788,0.38,0.1642,78,167,245 +4793,2011-07-23,3,0,7,14,0,6,0,1,0.92,0.8939,0.42,0,84,152,236 +4794,2011-07-23,3,0,7,15,0,6,0,1,0.94,0.8788,0.38,0,78,137,215 +4795,2011-07-23,3,0,7,16,0,6,0,1,0.94,0.8788,0.38,0.1343,70,151,221 +4796,2011-07-23,3,0,7,17,0,6,0,1,0.94,0.8788,0.38,0.1343,70,127,197 +4797,2011-07-23,3,0,7,18,0,6,0,1,0.92,0.8788,0.4,0.2239,62,134,196 +4798,2011-07-23,3,0,7,19,0,6,0,2,0.82,0.803,0.59,0.1343,80,137,217 +4799,2011-07-23,3,0,7,20,0,6,0,1,0.82,0.803,0.59,0.2239,47,98,145 +4800,2011-07-23,3,0,7,21,0,6,0,1,0.82,0.803,0.59,0.2239,39,115,154 +4801,2011-07-23,3,0,7,22,0,6,0,1,0.8,0.7727,0.59,0.194,37,88,125 +4802,2011-07-23,3,0,7,23,0,6,0,1,0.8,0.7727,0.59,0.1045,27,77,104 +4803,2011-07-24,3,0,7,0,0,0,0,1,0.8,0.7727,0.59,0.1045,42,77,119 +4804,2011-07-24,3,0,7,1,0,0,0,1,0.78,0.7576,0.66,0.1045,33,63,96 +4805,2011-07-24,3,0,7,2,0,0,0,1,0.8,0.7879,0.63,0.0896,12,48,60 +4806,2011-07-24,3,0,7,3,0,0,0,1,0.8,0.7879,0.63,0.0896,17,20,37 +4807,2011-07-24,3,0,7,4,0,0,0,1,0.78,0.7576,0.66,0.1045,1,5,6 +4808,2011-07-24,3,0,7,5,0,0,0,1,0.78,0.7576,0.66,0.0896,5,3,8 +4809,2011-07-24,3,0,7,6,0,0,0,2,0.8,0.7879,0.63,0.0896,9,13,22 
+4810,2011-07-24,3,0,7,7,0,0,0,1,0.8,0.7727,0.59,0.1343,6,29,35 +4811,2011-07-24,3,0,7,8,0,0,0,1,0.82,0.7879,0.56,0.2836,22,82,104 +4812,2011-07-24,3,0,7,9,0,0,0,1,0.82,0.7879,0.56,0.3881,42,112,154 +4813,2011-07-24,3,0,7,10,0,0,0,1,0.86,0.8182,0.5,0.3284,61,165,226 +4814,2011-07-24,3,0,7,11,0,0,0,1,0.84,0.803,0.53,0.2537,74,169,243 +4815,2011-07-24,3,0,7,12,0,0,0,1,0.9,0.8636,0.45,0.1642,85,161,246 +4816,2011-07-24,3,0,7,13,0,0,0,1,0.86,0.803,0.47,0.2239,90,162,252 +4817,2011-07-24,3,0,7,14,0,0,0,1,0.86,0.8182,0.5,0.2836,79,204,283 +4818,2011-07-24,3,0,7,15,0,0,0,1,0.9,0.8485,0.42,0.2537,68,170,238 +4819,2011-07-24,3,0,7,16,0,0,0,1,0.9,0.8485,0.42,0.2537,62,187,249 +4820,2011-07-24,3,0,7,17,0,0,0,1,0.9,0.8333,0.39,0.1045,75,174,249 +4821,2011-07-24,3,0,7,18,0,0,0,1,0.88,0.8182,0.42,0.1642,67,176,243 +4822,2011-07-24,3,0,7,19,0,0,0,1,0.86,0.803,0.47,0.1642,70,170,240 +4823,2011-07-24,3,0,7,20,0,0,0,1,0.84,0.7879,0.49,0.1642,52,143,195 +4824,2011-07-24,3,0,7,21,0,0,0,1,0.8,0.7879,0.63,0.1343,42,111,153 +4825,2011-07-24,3,0,7,22,0,0,0,1,0.78,0.7576,0.66,0,23,75,98 +4826,2011-07-24,3,0,7,23,0,0,0,2,0.76,0.7273,0.7,0.0896,13,37,50 +4827,2011-07-25,3,0,7,0,0,1,1,2,0.76,0.7273,0.7,0,17,17,34 +4828,2011-07-25,3,0,7,1,0,1,1,2,0.76,0.7424,0.75,0.0896,4,8,12 +4829,2011-07-25,3,0,7,2,0,1,1,2,0.74,0.7121,0.79,0.0896,3,3,6 +4830,2011-07-25,3,0,7,3,0,1,1,1,0.74,0.7121,0.79,0,2,2,4 +4831,2011-07-25,3,0,7,4,0,1,1,1,0.72,0.7121,0.84,0.0896,0,6,6 +4832,2011-07-25,3,0,7,5,0,1,1,1,0.72,0.7121,0.84,0.0896,2,22,24 +4833,2011-07-25,3,0,7,6,0,1,1,2,0.72,0.7121,0.84,0.0896,7,100,107 +4834,2011-07-25,3,0,7,7,0,1,1,2,0.74,0.7121,0.79,0,20,257,277 +4835,2011-07-25,3,0,7,8,0,1,1,2,0.76,0.7424,0.75,0,34,353,387 +4836,2011-07-25,3,0,7,9,0,1,1,1,0.8,0.803,0.66,0,11,124,135 +4837,2011-07-25,3,0,7,10,0,1,1,1,0.82,0.803,0.59,0,30,54,84 +4838,2011-07-25,3,0,7,11,0,1,1,2,0.84,0.8333,0.59,0.194,24,83,107 +4839,2011-07-25,3,0,7,12,0,1,1,2,0.84,0.8333,0.59,0.194,17,50,67 
+4840,2011-07-25,3,0,7,13,0,1,1,2,0.74,0.7121,0.74,0.1343,14,64,78 +4841,2011-07-25,3,0,7,14,0,1,1,2,0.72,0.697,0.79,0.194,28,62,90 +4842,2011-07-25,3,0,7,15,0,1,1,1,0.7,0.6667,0.84,0.3881,16,96,112 +4843,2011-07-25,3,0,7,16,0,1,1,1,0.7,0.6667,0.84,0.2239,37,161,198 +4844,2011-07-25,3,0,7,17,0,1,1,1,0.72,0.697,0.74,0,48,437,485 +4845,2011-07-25,3,0,7,18,0,1,1,1,0.74,0.7121,0.74,0.0896,60,473,533 +4846,2011-07-25,3,0,7,19,0,1,1,1,0.74,0.697,0.7,0.0896,62,324,386 +4847,2011-07-25,3,0,7,20,0,1,1,1,0.72,0.697,0.74,0.1343,55,247,302 +4848,2011-07-25,3,0,7,21,0,1,1,1,0.7,0.6667,0.84,0,41,153,194 +4849,2011-07-25,3,0,7,22,0,1,1,1,0.7,0.6667,0.84,0,13,112,125 +4850,2011-07-25,3,0,7,23,0,1,1,1,0.7,0.6667,0.84,0.0896,23,64,87 +4851,2011-07-26,3,0,7,0,0,2,1,1,0.7,0.6667,0.79,0.1045,8,20,28 +4852,2011-07-26,3,0,7,1,0,2,1,1,0.7,0.6667,0.79,0.1045,6,6,12 +4853,2011-07-26,3,0,7,2,0,2,1,1,0.68,0.6364,0.89,0.1045,1,6,7 +4854,2011-07-26,3,0,7,3,0,2,1,1,0.68,0.6364,0.89,0.1343,2,2,4 +4855,2011-07-26,3,0,7,4,0,2,1,1,0.68,0.6364,0.88,0.1642,0,6,6 +4856,2011-07-26,3,0,7,5,0,2,1,1,0.66,0.5909,0.89,0.0896,3,24,27 +4857,2011-07-26,3,0,7,6,0,2,1,1,0.68,0.6364,0.83,0.2239,2,108,110 +4858,2011-07-26,3,0,7,7,0,2,1,1,0.72,0.6818,0.66,0.2836,30,288,318 +4859,2011-07-26,3,0,7,8,0,2,1,1,0.74,0.6818,0.55,0.194,32,392,424 +4860,2011-07-26,3,0,7,9,0,2,1,1,0.76,0.6818,0.4,0.2239,25,162,187 +4861,2011-07-26,3,0,7,10,0,2,1,1,0.8,0.697,0.31,0.2239,42,96,138 +4862,2011-07-26,3,0,7,11,0,2,1,1,0.82,0.7121,0.3,0,49,102,151 +4863,2011-07-26,3,0,7,12,0,2,1,1,0.84,0.7273,0.28,0,25,122,147 +4864,2011-07-26,3,0,7,13,0,2,1,1,0.84,0.7273,0.32,0.2985,43,139,182 +4865,2011-07-26,3,0,7,14,0,2,1,1,0.86,0.7424,0.3,0.2836,46,101,147 +4866,2011-07-26,3,0,7,15,0,2,1,1,0.88,0.7576,0.28,0.3284,49,114,163 +4867,2011-07-26,3,0,7,16,0,2,1,1,0.86,0.7576,0.36,0.2836,41,213,254 +4868,2011-07-26,3,0,7,17,0,2,1,1,0.86,0.7576,0.36,0.3582,72,465,537 +4869,2011-07-26,3,0,7,18,0,2,1,1,0.84,0.7424,0.39,0.3881,68,478,546 
+4870,2011-07-26,3,0,7,19,0,2,1,1,0.82,0.7424,0.43,0.2985,69,329,398 +4871,2011-07-26,3,0,7,20,0,2,1,1,0.8,0.7273,0.46,0.2537,47,243,290 +4872,2011-07-26,3,0,7,21,0,2,1,1,0.78,0.7121,0.49,0.2239,45,222,267 +4873,2011-07-26,3,0,7,22,0,2,1,1,0.76,0.697,0.55,0.1343,27,131,158 +4874,2011-07-26,3,0,7,23,0,2,1,1,0.76,0.7121,0.58,0.1045,18,71,89 +4875,2011-07-27,3,0,7,0,0,3,1,2,0.78,0.7121,0.52,0.1642,19,26,45 +4876,2011-07-27,3,0,7,1,0,3,1,1,0.78,0.7121,0.49,0.194,1,9,10 +4877,2011-07-27,3,0,7,2,0,3,1,1,0.76,0.697,0.52,0.2985,1,5,6 +4878,2011-07-27,3,0,7,3,0,3,1,1,0.72,0.6667,0.58,0.1642,0,4,4 +4879,2011-07-27,3,0,7,4,0,3,1,1,0.72,0.6667,0.51,0.194,1,3,4 +4880,2011-07-27,3,0,7,5,0,3,1,1,0.7,0.6515,0.54,0.2836,0,18,18 +4881,2011-07-27,3,0,7,6,0,3,1,1,0.7,0.6515,0.54,0.2239,4,119,123 +4882,2011-07-27,3,0,7,7,0,3,1,1,0.72,0.6667,0.48,0.2537,34,313,347 +4883,2011-07-27,3,0,7,8,0,3,1,1,0.74,0.6667,0.42,0.2537,28,405,433 +4884,2011-07-27,3,0,7,9,0,3,1,1,0.76,0.6667,0.35,0.2537,27,189,216 +4885,2011-07-27,3,0,7,10,0,3,1,1,0.78,0.6818,0.33,0,32,82,114 +4886,2011-07-27,3,0,7,11,0,3,1,1,0.8,0.697,0.31,0,42,94,136 +4887,2011-07-27,3,0,7,12,0,3,1,1,0.82,0.7121,0.3,0.2836,37,132,169 +4888,2011-07-27,3,0,7,13,0,3,1,1,0.84,0.7273,0.3,0.3582,29,151,180 +4889,2011-07-27,3,0,7,14,0,3,1,1,0.84,0.7273,0.28,0.2239,20,107,127 +4890,2011-07-27,3,0,7,15,0,3,1,1,0.86,0.7424,0.26,0.2537,37,103,140 +4891,2011-07-27,3,0,7,16,0,3,1,1,0.86,0.7273,0.25,0.1642,40,228,268 +4892,2011-07-27,3,0,7,17,0,3,1,1,0.84,0.7121,0.26,0.2239,81,491,572 +4893,2011-07-27,3,0,7,18,0,3,1,1,0.82,0.7121,0.28,0.1045,78,463,541 +4894,2011-07-27,3,0,7,19,0,3,1,1,0.8,0.697,0.29,0.0896,87,288,375 +4895,2011-07-27,3,0,7,20,0,3,1,1,0.76,0.6818,0.45,0.2239,55,239,294 +4896,2011-07-27,3,0,7,21,0,3,1,1,0.74,0.6667,0.48,0.1045,49,196,245 +4897,2011-07-27,3,0,7,22,0,3,1,1,0.72,0.6667,0.48,0,37,143,180 +4898,2011-07-27,3,0,7,23,0,3,1,1,0.74,0.6667,0.45,0.0896,16,93,109 +4899,2011-07-28,3,0,7,0,0,4,1,1,0.7,0.6515,0.51,0.1343,14,31,45 
+4900,2011-07-28,3,0,7,1,0,4,1,1,0.72,0.6667,0.51,0,3,26,29 +4901,2011-07-28,3,0,7,2,0,4,1,1,0.72,0.6667,0.54,0.1045,2,5,7 +4902,2011-07-28,3,0,7,3,0,4,1,1,0.72,0.6667,0.54,0,0,5,5 +4903,2011-07-28,3,0,7,4,0,4,1,1,0.7,0.6515,0.61,0.1045,0,6,6 +4904,2011-07-28,3,0,7,5,0,4,1,1,0.7,0.6515,0.65,0.1045,1,26,27 +4905,2011-07-28,3,0,7,6,0,4,1,1,0.7,0.6515,0.7,0.0896,6,110,116 +4906,2011-07-28,3,0,7,7,0,4,1,1,0.72,0.6818,0.7,0.1642,27,278,305 +4907,2011-07-28,3,0,7,8,0,4,1,1,0.74,0.697,0.66,0.1642,47,409,456 +4908,2011-07-28,3,0,7,9,0,4,1,1,0.78,0.7424,0.59,0.194,21,189,210 +4909,2011-07-28,3,0,7,10,0,4,1,1,0.8,0.7576,0.55,0.2836,30,86,116 +4910,2011-07-28,3,0,7,11,0,4,1,1,0.84,0.8182,0.56,0.2836,41,113,154 +4911,2011-07-28,3,0,7,12,0,4,1,1,0.84,0.803,0.53,0.2537,35,151,186 +4912,2011-07-28,3,0,7,13,0,4,1,1,0.86,0.803,0.47,0.2537,26,126,152 +4913,2011-07-28,3,0,7,14,0,4,1,2,0.84,0.7879,0.49,0.2537,27,111,138 +4914,2011-07-28,3,0,7,15,0,4,1,1,0.86,0.8182,0.5,0.194,22,124,146 +4915,2011-07-28,3,0,7,16,0,4,1,1,0.86,0.803,0.47,0.1642,42,214,256 +4916,2011-07-28,3,0,7,17,0,4,1,1,0.84,0.803,0.53,0.194,45,423,468 +4917,2011-07-28,3,0,7,18,0,4,1,1,0.84,0.8182,0.56,0.2239,55,428,483 +4918,2011-07-28,3,0,7,19,0,4,1,1,0.82,0.803,0.59,0.194,37,285,322 +4919,2011-07-28,3,0,7,20,0,4,1,1,0.8,0.7879,0.63,0.194,41,229,270 +4920,2011-07-28,3,0,7,21,0,4,1,1,0.78,0.7576,0.66,0.2836,31,165,196 +4921,2011-07-28,3,0,7,22,0,4,1,1,0.76,0.7273,0.7,0.2537,25,149,174 +4922,2011-07-28,3,0,7,23,0,4,1,1,0.76,0.7424,0.75,0.194,28,95,123 +4923,2011-07-29,3,0,7,0,0,5,1,1,0.74,0.7121,0.79,0.1642,12,45,57 +4924,2011-07-29,3,0,7,1,0,5,1,1,0.74,0.7121,0.79,0.1343,12,22,34 +4925,2011-07-29,3,0,7,2,0,5,1,1,0.74,0.7121,0.79,0,3,10,13 +4926,2011-07-29,3,0,7,3,0,5,1,1,0.74,0.7121,0.79,0,2,9,11 +4927,2011-07-29,3,0,7,4,0,5,1,1,0.72,0.7121,0.84,0.1343,2,4,6 +4928,2011-07-29,3,0,7,5,0,5,1,1,0.72,0.7121,0.84,0,4,23,27 +4929,2011-07-29,3,0,7,6,0,5,1,2,0.72,0.7121,0.84,0,7,83,90 
+4930,2011-07-29,3,0,7,7,0,5,1,2,0.74,0.7121,0.79,0.0896,26,228,254 +4931,2011-07-29,3,0,7,8,0,5,1,2,0.76,0.7727,0.77,0,30,354,384 +4932,2011-07-29,3,0,7,9,0,5,1,1,0.86,0.803,0.47,0.2537,28,151,179 +4933,2011-07-29,3,0,7,10,0,5,1,1,0.9,0.8485,0.42,0.2836,37,69,106 +4934,2011-07-29,3,0,7,11,0,5,1,1,0.92,0.8788,0.4,0.2985,43,118,161 +4935,2011-07-29,3,0,7,12,0,5,1,1,0.96,0.8636,0.31,0.2836,40,95,135 +4936,2011-07-29,3,0,7,13,0,5,1,1,0.94,0.8333,0.31,0,42,114,156 +4937,2011-07-29,3,0,7,14,0,5,1,1,0.96,0.8636,0.3,0.2239,34,114,148 +4938,2011-07-29,3,0,7,15,0,5,1,1,0.96,0.8636,0.3,0.2537,40,129,169 +4939,2011-07-29,3,0,7,16,0,5,1,1,0.96,0.8636,0.3,0.3881,35,198,233 +4940,2011-07-29,3,0,7,17,0,5,1,1,0.96,0.8636,0.3,0.2985,47,374,421 +4941,2011-07-29,3,0,7,18,0,5,1,1,0.92,0.8636,0.37,0.3284,49,313,362 +4942,2011-07-29,3,0,7,19,0,5,1,1,0.9,0.8485,0.42,0.2239,22,219,241 +4943,2011-07-29,3,0,7,20,0,5,1,1,0.86,0.7879,0.41,0.3284,53,153,206 +4944,2011-07-29,3,0,7,21,0,5,1,1,0.82,0.7576,0.46,0.1642,29,134,163 +4945,2011-07-29,3,0,7,22,0,5,1,1,0.8,0.7424,0.49,0.2239,47,127,174 +4946,2011-07-29,3,0,7,23,0,5,1,1,0.78,0.7121,0.52,0.1045,26,90,116 +4947,2011-07-30,3,0,7,0,0,6,0,1,0.76,0.697,0.55,0.194,60,73,133 +4948,2011-07-30,3,0,7,1,0,6,0,1,0.76,0.697,0.55,0.0896,15,75,90 +4949,2011-07-30,3,0,7,2,0,6,0,1,0.76,0.697,0.55,0.1045,11,31,42 +4950,2011-07-30,3,0,7,3,0,6,0,1,0.74,0.6818,0.62,0.1642,18,25,43 +4951,2011-07-30,3,0,7,4,0,6,0,1,0.72,0.6818,0.66,0.1642,4,6,10 +4952,2011-07-30,3,0,7,5,0,6,0,1,0.72,0.6818,0.7,0.2836,4,9,13 +4953,2011-07-30,3,0,7,6,0,6,0,1,0.72,0.6818,0.7,0.2985,6,18,24 +4954,2011-07-30,3,0,7,7,0,6,0,1,0.76,0.7273,0.66,0.194,3,39,42 +4955,2011-07-30,3,0,7,8,0,6,0,1,0.78,0.7424,0.59,0.2239,27,87,114 +4956,2011-07-30,3,0,7,9,0,6,0,1,0.82,0.7576,0.46,0.2985,49,125,174 +4957,2011-07-30,3,0,7,10,0,6,0,1,0.82,0.7424,0.41,0.2985,79,182,261 +4958,2011-07-30,3,0,7,11,0,6,0,1,0.84,0.7424,0.36,0.194,111,206,317 
+4959,2011-07-30,3,0,7,12,0,6,0,1,0.84,0.7273,0.32,0.1642,111,230,341 +4960,2011-07-30,3,0,7,13,0,6,0,1,0.88,0.7576,0.29,0,125,183,308 +4961,2011-07-30,3,0,7,14,0,6,0,1,0.9,0.7879,0.29,0,112,193,305 +4962,2011-07-30,3,0,7,15,0,6,0,1,0.9,0.803,0.31,0.1343,113,189,302 +4963,2011-07-30,3,0,7,16,0,6,0,1,0.9,0.7879,0.27,0.2239,126,176,302 +4964,2011-07-30,3,0,7,17,0,6,0,1,0.9,0.7879,0.29,0.1045,93,209,302 +4965,2011-07-30,3,0,7,18,0,6,0,1,0.88,0.7576,0.28,0.194,92,207,299 +4966,2011-07-30,3,0,7,19,0,6,0,1,0.74,0.6515,0.33,0.2239,101,155,256 +4967,2011-07-30,3,0,7,20,0,6,0,1,0.82,0.7424,0.43,0.1343,74,132,206 +4968,2011-07-30,3,0,7,21,0,6,0,1,0.8,0.7424,0.49,0.1642,82,123,205 +4969,2011-07-30,3,0,7,22,0,6,0,1,0.78,0.7121,0.52,0.1045,80,129,209 +4970,2011-07-30,3,0,7,23,0,6,0,1,0.76,0.697,0.55,0.0896,63,114,177 +4971,2011-07-31,3,0,7,0,0,0,0,1,0.76,0.697,0.55,0.0896,33,80,113 +4972,2011-07-31,3,0,7,1,0,0,0,1,0.74,0.6818,0.62,0.0896,8,71,79 +4973,2011-07-31,3,0,7,2,0,0,0,1,0.74,0.697,0.66,0.0896,19,48,67 +4974,2011-07-31,3,0,7,3,0,0,0,1,0.74,0.697,0.66,0,24,26,50 +4975,2011-07-31,3,0,7,4,0,0,0,1,0.72,0.6818,0.7,0,6,7,13 +4976,2011-07-31,3,0,7,5,0,0,0,1,0.72,0.6818,0.69,0,4,4,8 +4977,2011-07-31,3,0,7,6,0,0,0,1,0.74,0.697,0.7,0,5,8,13 +4978,2011-07-31,3,0,7,7,0,0,0,1,0.74,0.6818,0.58,0.1343,19,26,45 +4979,2011-07-31,3,0,7,8,0,0,0,1,0.76,0.697,0.55,0.2239,54,88,142 +4980,2011-07-31,3,0,7,9,0,0,0,1,0.8,0.7424,0.49,0,55,128,183 +4981,2011-07-31,3,0,7,10,0,0,0,1,0.84,0.7424,0.36,0.1642,91,163,254 +4982,2011-07-31,3,0,7,11,0,0,0,1,0.86,0.7424,0.28,0.1642,112,163,275 +4983,2011-07-31,3,0,7,12,0,0,0,1,0.9,0.7727,0.25,0.194,146,192,338 +4984,2011-07-31,3,0,7,13,0,0,0,1,0.9,0.7727,0.24,0.1343,142,152,294 +4985,2011-07-31,3,0,7,14,0,0,0,1,0.9,0.7727,0.24,0.1642,130,196,326 +4986,2011-07-31,3,0,7,15,0,0,0,1,0.92,0.7727,0.18,0.2985,109,184,293 +4987,2011-07-31,3,0,7,16,0,0,0,1,0.92,0.7727,0.17,0.2836,98,208,306 +4988,2011-07-31,3,0,7,17,0,0,0,1,0.92,0.7727,0.16,0.1642,114,206,320 
+4989,2011-07-31,3,0,7,18,0,0,0,1,0.86,0.7879,0.41,0.2836,91,213,304 +4990,2011-07-31,3,0,7,19,0,0,0,1,0.8,0.7576,0.55,0.3284,87,187,274 +4991,2011-07-31,3,0,7,20,0,0,0,1,0.8,0.7576,0.55,0.3284,70,205,275 +4992,2011-07-31,3,0,7,21,0,0,0,1,0.78,0.7424,0.59,0.2985,68,96,164 +4993,2011-07-31,3,0,7,22,0,0,0,1,0.74,0.697,0.7,0.2985,25,71,96 +4994,2011-07-31,3,0,7,23,0,0,0,1,0.74,0.697,0.66,0.2239,14,56,70 +4995,2011-08-01,3,0,8,0,0,1,1,1,0.72,0.6818,0.7,0.2239,7,22,29 +4996,2011-08-01,3,0,8,1,0,1,1,1,0.72,0.697,0.74,0.194,5,12,17 +4997,2011-08-01,3,0,8,2,0,1,1,1,0.7,0.6667,0.74,0.1045,4,7,11 +4998,2011-08-01,3,0,8,3,0,1,1,1,0.7,0.6667,0.79,0.1642,0,4,4 +4999,2011-08-01,3,0,8,4,0,1,1,1,0.66,0.6061,0.83,0.1343,2,2,4 +5000,2011-08-01,3,0,8,5,0,1,1,1,0.66,0.6061,0.83,0.1045,2,24,26 +5001,2011-08-01,3,0,8,6,0,1,1,1,0.66,0.6061,0.83,0.0896,3,97,100 +5002,2011-08-01,3,0,8,7,0,1,1,1,0.74,0.6818,0.62,0,24,258,282 +5003,2011-08-01,3,0,8,8,0,1,1,1,0.8,0.7273,0.43,0.194,35,347,382 +5004,2011-08-01,3,0,8,9,0,1,1,1,0.82,0.7424,0.41,0,27,139,166 +5005,2011-08-01,3,0,8,10,0,1,1,1,0.86,0.7576,0.36,0.1642,27,70,97 +5006,2011-08-01,3,0,8,11,0,1,1,1,0.88,0.7727,0.32,0.1642,53,66,119 +5007,2011-08-01,3,0,8,12,0,1,1,1,0.9,0.803,0.33,0.2537,53,115,168 +5008,2011-08-01,3,0,8,13,0,1,1,1,0.9,0.803,0.31,0.2985,38,112,150 +5009,2011-08-01,3,0,8,14,0,1,1,1,0.92,0.8182,0.29,0.194,37,86,123 +5010,2011-08-01,3,0,8,15,0,1,1,1,0.9,0.7879,0.27,0,52,77,129 +5011,2011-08-01,3,0,8,16,0,1,1,1,0.9,0.7879,0.27,0,34,197,231 +5012,2011-08-01,3,0,8,17,0,1,1,1,0.76,0.6818,0.45,0.2985,69,445,514 +5013,2011-08-01,3,0,8,18,0,1,1,1,0.78,0.697,0.43,0.194,68,475,543 +5014,2011-08-01,3,0,8,19,0,1,1,1,0.74,0.6667,0.51,0.2239,63,350,413 +5015,2011-08-01,3,0,8,20,0,1,1,1,0.72,0.6818,0.62,0.2239,49,256,305 +5016,2011-08-01,3,0,8,21,0,1,1,1,0.7,0.6515,0.7,0.2537,42,178,220 +5017,2011-08-01,3,0,8,22,0,1,1,1,0.7,0.6515,0.7,0.0896,21,116,137 +5018,2011-08-01,3,0,8,23,0,1,1,1,0.68,0.6364,0.74,0.194,14,82,96 
+5019,2011-08-02,3,0,8,0,0,2,1,1,0.66,0.6212,0.74,0,11,18,29 +5020,2011-08-02,3,0,8,1,0,2,1,1,0.66,0.6212,0.74,0.1045,4,8,12 +5021,2011-08-02,3,0,8,2,0,2,1,1,0.68,0.6364,0.69,0,2,6,8 +5022,2011-08-02,3,0,8,3,0,2,1,1,0.66,0.6212,0.74,0.194,0,5,5 +5023,2011-08-02,3,0,8,4,0,2,1,1,0.66,0.6212,0.74,0.1045,0,10,10 +5024,2011-08-02,3,0,8,5,0,2,1,1,0.66,0.6212,0.74,0.1343,4,17,21 +5025,2011-08-02,3,0,8,6,0,2,1,1,0.68,0.6364,0.69,0.1045,12,105,117 +5026,2011-08-02,3,0,8,7,0,2,1,1,0.72,0.6667,0.61,0.2836,15,320,335 +5027,2011-08-02,3,0,8,8,0,2,1,1,0.74,0.6818,0.58,0.2836,37,398,435 +5028,2011-08-02,3,0,8,9,0,2,1,1,0.78,0.697,0.46,0.2985,31,182,213 +5029,2011-08-02,3,0,8,10,0,2,1,1,0.82,0.7273,0.36,0.2985,50,90,140 +5030,2011-08-02,3,0,8,11,0,2,1,1,0.84,0.7273,0.32,0.4179,32,96,128 +5031,2011-08-02,3,0,8,12,0,2,1,1,0.86,0.7424,0.3,0.2836,45,112,157 +5032,2011-08-02,3,0,8,13,0,2,1,1,0.86,0.7424,0.3,0.2239,50,153,203 +5033,2011-08-02,3,0,8,14,0,2,1,1,0.9,0.7727,0.25,0.2836,45,114,159 +5034,2011-08-02,3,0,8,15,0,2,1,1,0.9,0.7727,0.25,0.2836,48,120,168 +5035,2011-08-02,3,0,8,16,0,2,1,1,0.9,0.7727,0.25,0.2836,58,216,274 +5036,2011-08-02,3,0,8,17,0,2,1,2,0.9,0.7879,0.27,0.1343,63,493,556 +5037,2011-08-02,3,0,8,18,0,2,1,2,0.86,0.7576,0.34,0.194,65,491,556 +5038,2011-08-02,3,0,8,19,0,2,1,2,0.86,0.7576,0.36,0.2239,85,369,454 +5039,2011-08-02,3,0,8,20,0,2,1,2,0.82,0.7576,0.46,0.2537,40,248,288 +5040,2011-08-02,3,0,8,21,0,2,1,1,0.8,0.7424,0.52,0.2239,53,201,254 +5041,2011-08-02,3,0,8,22,0,2,1,2,0.78,0.7424,0.59,0.1642,22,169,191 +5042,2011-08-02,3,0,8,23,0,2,1,2,0.8,0.7424,0.49,0.1642,29,103,132 +5043,2011-08-03,3,0,8,0,0,3,1,2,0.8,0.7424,0.49,0.1642,11,32,43 +5044,2011-08-03,3,0,8,1,0,3,1,2,0.78,0.7121,0.52,0.1642,7,9,16 +5045,2011-08-03,3,0,8,2,0,3,1,2,0.78,0.7121,0.49,0.1642,2,9,11 +5046,2011-08-03,3,0,8,3,0,3,1,2,0.76,0.697,0.55,0.0896,0,4,4 +5047,2011-08-03,3,0,8,4,0,3,1,2,0.76,0.697,0.52,0.1642,0,7,7 +5048,2011-08-03,3,0,8,5,0,3,1,2,0.76,0.697,0.55,0.0896,2,22,24 
+5049,2011-08-03,3,0,8,6,0,3,1,3,0.72,0.6818,0.66,0.4925,9,101,110 +5050,2011-08-03,3,0,8,7,0,3,1,2,0.74,0.6667,0.51,0.2239,19,252,271 +5051,2011-08-03,3,0,8,8,0,3,1,2,0.74,0.6667,0.51,0.1642,29,408,437 +5052,2011-08-03,3,0,8,9,0,3,1,2,0.74,0.6818,0.55,0.194,23,172,195 +5053,2011-08-03,3,0,8,10,0,3,1,3,0.74,0.6818,0.58,0.0896,9,59,68 +5054,2011-08-03,3,0,8,11,0,3,1,2,0.74,0.6818,0.58,0.1045,29,93,122 +5055,2011-08-03,3,0,8,12,0,3,1,3,0.76,0.697,0.55,0.1343,19,142,161 +5056,2011-08-03,3,0,8,13,0,3,1,2,0.76,0.7121,0.58,0.2239,37,107,144 +5057,2011-08-03,3,0,8,14,0,3,1,2,0.76,0.7121,0.58,0.2239,43,104,147 +5058,2011-08-03,3,0,8,15,0,3,1,2,0.7,0.6667,0.79,0.1343,31,87,118 +5059,2011-08-03,3,0,8,16,0,3,1,3,0.68,0.6364,0.79,0,21,129,150 +5060,2011-08-03,3,0,8,17,0,3,1,2,0.7,0.6667,0.79,0,47,378,425 +5061,2011-08-03,3,0,8,18,0,3,1,3,0.7,0.6667,0.84,0,49,443,492 +5062,2011-08-03,3,0,8,19,0,3,1,3,0.7,0.6667,0.84,0,51,270,321 +5063,2011-08-03,3,0,8,20,0,3,1,3,0.7,0.6667,0.84,0.1045,7,80,87 +5064,2011-08-03,3,0,8,21,0,3,1,2,0.68,0.6364,0.89,0.1343,7,81,88 +5065,2011-08-03,3,0,8,22,0,3,1,3,0.68,0.6364,0.89,0.1045,11,66,77 +5066,2011-08-03,3,0,8,23,0,3,1,2,0.68,0.6364,0.89,0.0896,4,52,56 +5067,2011-08-04,3,0,8,0,0,4,1,2,0.68,0.6364,0.89,0.0896,2,15,17 +5068,2011-08-04,3,0,8,1,0,4,1,2,0.66,0.5909,0.94,0.0896,3,14,17 +5069,2011-08-04,3,0,8,2,0,4,1,2,0.66,0.5909,0.94,0.0896,2,5,7 +5070,2011-08-04,3,0,8,3,0,4,1,3,0.66,0.5909,0.94,0.0896,0,3,3 +5071,2011-08-04,3,0,8,4,0,4,1,2,0.68,0.6364,0.89,0,0,7,7 +5072,2011-08-04,3,0,8,5,0,4,1,2,0.68,0.6364,0.89,0,3,17,20 +5073,2011-08-04,3,0,8,6,0,4,1,2,0.68,0.6364,0.89,0.2985,7,90,97 +5074,2011-08-04,3,0,8,7,0,4,1,2,0.68,0.6364,0.89,0.2985,11,271,282 +5075,2011-08-04,3,0,8,8,0,4,1,2,0.7,0.6515,0.82,0.3582,29,369,398 +5076,2011-08-04,3,0,8,9,0,4,1,1,0.72,0.697,0.74,0.3284,25,162,187 +5077,2011-08-04,3,0,8,10,0,4,1,1,0.72,0.6667,0.71,0.2985,24,89,113 +5078,2011-08-04,3,0,8,11,0,4,1,2,0.74,0.697,0.66,0.2239,37,112,149 
+5079,2011-08-04,3,0,8,12,0,4,1,2,0.76,0.7121,0.62,0.2239,53,152,205 +5080,2011-08-04,3,0,8,13,0,4,1,1,0.76,0.7121,0.58,0,63,142,205 +5081,2011-08-04,3,0,8,14,0,4,1,1,0.8,0.7576,0.55,0.1642,53,114,167 +5082,2011-08-04,3,0,8,15,0,4,1,1,0.8,0.7576,0.55,0.2537,50,115,165 +5083,2011-08-04,3,0,8,16,0,4,1,1,0.76,0.7121,0.62,0.3284,70,220,290 +5084,2011-08-04,3,0,8,17,0,4,1,1,0.76,0.7121,0.62,0.3284,91,464,555 +5085,2011-08-04,3,0,8,18,0,4,1,1,0.7,0.6667,0.74,0.2985,88,435,523 +5086,2011-08-04,3,0,8,19,0,4,1,1,0.7,0.6667,0.74,0.2836,65,301,366 +5087,2011-08-04,3,0,8,20,0,4,1,2,0.7,0.6667,0.74,0.194,41,245,286 +5088,2011-08-04,3,0,8,21,0,4,1,2,0.7,0.6667,0.74,0.194,25,186,211 +5089,2011-08-04,3,0,8,22,0,4,1,1,0.68,0.6364,0.74,0.194,33,141,174 +5090,2011-08-04,3,0,8,23,0,4,1,1,0.66,0.6212,0.74,0.1045,24,108,132 +5091,2011-08-05,3,0,8,0,0,5,1,1,0.66,0.6212,0.74,0.1045,13,41,54 +5092,2011-08-05,3,0,8,1,0,5,1,1,0.64,0.5909,0.78,0.0896,3,16,19 +5093,2011-08-05,3,0,8,2,0,5,1,1,0.66,0.6212,0.69,0,5,14,19 +5094,2011-08-05,3,0,8,3,0,5,1,1,0.64,0.5909,0.78,0.0896,0,6,6 +5095,2011-08-05,3,0,8,4,0,5,1,1,0.64,0.5909,0.78,0.1343,1,6,7 +5096,2011-08-05,3,0,8,5,0,5,1,1,0.64,0.5909,0.78,0.0896,0,16,16 +5097,2011-08-05,3,0,8,6,0,5,1,1,0.64,0.5909,0.78,0.1343,7,94,101 +5098,2011-08-05,3,0,8,7,0,5,1,1,0.66,0.6061,0.78,0.1343,23,247,270 +5099,2011-08-05,3,0,8,8,0,5,1,1,0.7,0.6515,0.65,0.2239,39,415,454 +5100,2011-08-05,3,0,8,9,0,5,1,1,0.72,0.6667,0.58,0.1045,43,183,226 +5101,2011-08-05,3,0,8,10,0,5,1,1,0.74,0.6818,0.55,0.1343,50,113,163 +5102,2011-08-05,3,0,8,11,0,5,1,2,0.76,0.697,0.52,0.2537,44,103,147 +5103,2011-08-05,3,0,8,12,0,5,1,1,0.76,0.697,0.52,0.2537,57,159,216 +5104,2011-08-05,3,0,8,13,0,5,1,1,0.78,0.7121,0.52,0.2537,40,164,204 +5105,2011-08-05,3,0,8,14,0,5,1,2,0.78,0.7121,0.52,0.2537,81,164,245 +5106,2011-08-05,3,0,8,15,0,5,1,1,0.76,0.697,0.55,0.2537,50,160,210 +5107,2011-08-05,3,0,8,16,0,5,1,1,0.8,0.7424,0.49,0.2537,76,254,330 
+5108,2011-08-05,3,0,8,17,0,5,1,1,0.78,0.7121,0.49,0.2985,84,466,550 +5109,2011-08-05,3,0,8,18,0,5,1,1,0.76,0.697,0.52,0.2537,105,361,466 +5110,2011-08-05,3,0,8,19,0,5,1,1,0.74,0.6818,0.55,0.2537,90,282,372 +5111,2011-08-05,3,0,8,20,0,5,1,1,0.72,0.6667,0.58,0.2239,70,221,291 +5112,2011-08-05,3,0,8,21,0,5,1,1,0.7,0.6515,0.65,0.194,54,119,173 +5113,2011-08-05,3,0,8,22,0,5,1,1,0.7,0.6515,0.65,0.2239,51,137,188 +5114,2011-08-05,3,0,8,23,0,5,1,1,0.68,0.6364,0.69,0.2239,37,102,139 +5115,2011-08-06,3,0,8,0,0,6,0,1,0.66,0.6061,0.78,0.194,29,104,133 +5116,2011-08-06,3,0,8,1,0,6,0,1,0.66,0.6061,0.78,0.1642,17,50,67 +5117,2011-08-06,3,0,8,2,0,6,0,1,0.66,0.6061,0.78,0.2239,14,39,53 +5118,2011-08-06,3,0,8,3,0,6,0,2,0.66,0.6061,0.78,0.1642,14,28,42 +5119,2011-08-06,3,0,8,4,0,6,0,2,0.66,0.6061,0.83,0.1642,5,5,10 +5120,2011-08-06,3,0,8,5,0,6,0,1,0.64,0.5758,0.89,0.0896,2,4,6 +5121,2011-08-06,3,0,8,6,0,6,0,1,0.64,0.5758,0.89,0.1045,5,22,27 +5122,2011-08-06,3,0,8,7,0,6,0,1,0.68,0.6364,0.83,0.1642,14,49,63 +5123,2011-08-06,3,0,8,8,0,6,0,1,0.68,0.6364,0.83,0.1642,27,94,121 +5124,2011-08-06,3,0,8,9,0,6,0,1,0.74,0.697,0.7,0.2239,60,155,215 +5125,2011-08-06,3,0,8,10,0,6,0,2,0.74,0.697,0.7,0.2836,91,199,290 +5126,2011-08-06,3,0,8,11,0,6,0,1,0.78,0.7424,0.62,0.2239,138,211,349 +5127,2011-08-06,3,0,8,12,0,6,0,1,0.8,0.7576,0.55,0.2239,130,252,382 +5128,2011-08-06,3,0,8,13,0,6,0,2,0.8,0.7727,0.59,0.3284,176,265,441 +5129,2011-08-06,3,0,8,14,0,6,0,1,0.82,0.7727,0.52,0.2537,176,204,380 +5130,2011-08-06,3,0,8,15,0,6,0,2,0.84,0.803,0.53,0.2537,130,232,362 +5131,2011-08-06,3,0,8,16,0,6,0,3,0.74,0.7121,0.79,0.1642,155,188,343 +5132,2011-08-06,3,0,8,17,0,6,0,3,0.74,0.7121,0.79,0.1642,61,88,149 +5133,2011-08-06,3,0,8,18,0,6,0,2,0.72,0.697,0.79,0.4478,81,130,211 +5134,2011-08-06,3,0,8,19,0,6,0,3,0.72,0.7121,0.84,0.2537,57,114,171 +5135,2011-08-06,3,0,8,20,0,6,0,3,0.72,0.697,0.79,0.2836,58,79,137 +5136,2011-08-06,3,0,8,21,0,6,0,2,0.7,0.6667,0.84,0.3284,28,86,114 
+5137,2011-08-06,3,0,8,22,0,6,0,2,0.7,0.6667,0.84,0.3284,24,96,120 +5138,2011-08-06,3,0,8,23,0,6,0,3,0.7,0.6667,0.84,0.2836,29,79,108 +5139,2011-08-07,3,0,8,0,0,0,0,2,0.7,0.6667,0.84,0.2836,14,66,80 +5140,2011-08-07,3,0,8,1,0,0,0,2,0.7,0.6667,0.84,0.2537,10,63,73 +5141,2011-08-07,3,0,8,2,0,0,0,2,0.7,0.6667,0.84,0.2239,18,48,66 +5142,2011-08-07,3,0,8,3,0,0,0,2,0.7,0.6667,0.84,0.1343,9,23,32 +5143,2011-08-07,3,0,8,4,0,0,0,1,0.7,0.6667,0.84,0.1343,1,5,6 +5144,2011-08-07,3,0,8,5,0,0,0,1,0.7,0.6667,0.89,0.194,1,4,5 +5145,2011-08-07,3,0,8,6,0,0,0,1,0.7,0.6667,0.89,0.1642,3,10,13 +5146,2011-08-07,3,0,8,7,0,0,0,1,0.7,0.6667,0.84,0.194,11,28,39 +5147,2011-08-07,3,0,8,8,0,0,0,1,0.72,0.697,0.79,0.194,23,66,89 +5148,2011-08-07,3,0,8,9,0,0,0,1,0.76,0.7424,0.75,0.1045,82,102,184 +5149,2011-08-07,3,0,8,10,0,0,0,1,0.8,0.803,0.66,0.1343,88,178,266 +5150,2011-08-07,3,0,8,11,0,0,0,1,0.82,0.8333,0.63,0,113,156,269 +5151,2011-08-07,3,0,8,12,0,0,0,1,0.9,0.8485,0.42,0.2985,161,209,370 +5152,2011-08-07,3,0,8,13,0,0,0,1,0.9,0.8485,0.42,0.2985,118,206,324 +5153,2011-08-07,3,0,8,14,0,0,0,1,0.86,0.803,0.47,0.194,130,192,322 +5154,2011-08-07,3,0,8,15,0,0,0,3,0.72,0.697,0.74,0.2985,118,208,326 +5155,2011-08-07,3,0,8,16,0,0,0,3,0.72,0.697,0.74,0.2985,74,119,193 +5156,2011-08-07,3,0,8,17,0,0,0,3,0.74,0.7121,0.74,0.6418,63,131,194 +5157,2011-08-07,3,0,8,18,0,0,0,1,0.72,0.697,0.74,0.0896,74,155,229 +5158,2011-08-07,3,0,8,19,0,0,0,1,0.72,0.697,0.79,0.1642,68,160,228 +5159,2011-08-07,3,0,8,20,0,0,0,1,0.72,0.7121,0.84,0.1343,50,133,183 +5160,2011-08-07,3,0,8,21,0,0,0,1,0.72,0.7121,0.84,0.1045,36,100,136 +5161,2011-08-07,3,0,8,22,0,0,0,1,0.7,0.6667,0.89,0.194,21,82,103 +5162,2011-08-07,3,0,8,23,0,0,0,1,0.7,0.6667,0.79,0.1045,12,43,55 +5163,2011-08-08,3,0,8,0,0,1,1,1,0.7,0.6667,0.79,0.1045,13,17,30 +5164,2011-08-08,3,0,8,1,0,1,1,1,0.68,0.6364,0.83,0.1343,4,8,12 +5165,2011-08-08,3,0,8,2,0,1,1,1,0.66,0.5909,0.89,0,4,3,7 +5166,2011-08-08,3,0,8,3,0,1,1,1,0.66,0.5909,0.89,0,1,0,1 
+5167,2011-08-08,3,0,8,4,0,1,1,1,0.66,0.5909,0.89,0.1045,2,8,10 +5168,2011-08-08,3,0,8,5,0,1,1,1,0.7,0.6667,0.79,0.0896,1,14,15 +5169,2011-08-08,3,0,8,6,0,1,1,1,0.7,0.6515,0.65,0.2836,8,87,95 +5170,2011-08-08,3,0,8,7,0,1,1,1,0.72,0.6818,0.62,0.2985,18,249,267 +5171,2011-08-08,3,0,8,8,0,1,1,1,0.74,0.6818,0.58,0.3284,29,320,349 +5172,2011-08-08,3,0,8,9,0,1,1,1,0.76,0.697,0.55,0.3881,37,146,183 +5173,2011-08-08,3,0,8,10,0,1,1,1,0.8,0.7424,0.52,0.3881,62,62,124 +5174,2011-08-08,3,0,8,11,0,1,1,1,0.82,0.7424,0.48,0.3582,90,89,179 +5175,2011-08-08,3,0,8,12,0,1,1,1,0.82,0.7576,0.46,0.2537,48,134,182 +5176,2011-08-08,3,0,8,13,0,1,1,1,0.84,0.7576,0.44,0.2985,43,108,151 +5177,2011-08-08,3,0,8,14,0,1,1,1,0.82,0.7424,0.43,0.2985,41,109,150 +5178,2011-08-08,3,0,8,15,0,1,1,1,0.84,0.7576,0.41,0.2985,40,86,126 +5179,2011-08-08,3,0,8,16,0,1,1,1,0.86,0.7727,0.39,0.2836,64,218,282 +5180,2011-08-08,3,0,8,17,0,1,1,1,0.86,0.7727,0.39,0.1642,67,460,527 +5181,2011-08-08,3,0,8,18,0,1,1,1,0.84,0.7576,0.41,0.1343,64,465,529 +5182,2011-08-08,3,0,8,19,0,1,1,1,0.82,0.7424,0.43,0.1045,64,352,416 +5183,2011-08-08,3,0,8,20,0,1,1,1,0.8,0.7576,0.55,0,71,215,286 +5184,2011-08-08,3,0,8,21,0,1,1,1,0.76,0.7121,0.58,0.0896,30,156,186 +5185,2011-08-08,3,0,8,22,0,1,1,1,0.76,0.7121,0.58,0.1045,26,121,147 +5186,2011-08-08,3,0,8,23,0,1,1,1,0.74,0.697,0.66,0.1045,19,53,72 +5187,2011-08-09,3,0,8,0,0,2,1,1,0.72,0.697,0.79,0.1343,10,25,35 +5188,2011-08-09,3,0,8,1,0,2,1,1,0.72,0.6818,0.66,0,5,9,14 +5189,2011-08-09,3,0,8,2,0,2,1,1,0.72,0.6818,0.7,0,9,3,12 +5190,2011-08-09,3,0,8,3,0,2,1,1,0.72,0.6818,0.62,0,0,3,3 +5191,2011-08-09,3,0,8,4,0,2,1,1,0.7,0.6667,0.74,0,1,6,7 +5192,2011-08-09,3,0,8,5,0,2,1,1,0.7,0.6515,0.65,0.0896,5,25,30 +5193,2011-08-09,3,0,8,6,0,2,1,1,0.72,0.6667,0.64,0,6,95,101 +5194,2011-08-09,3,0,8,7,0,2,1,1,0.72,0.697,0.74,0.1045,30,313,343 +5195,2011-08-09,3,0,8,8,0,2,1,1,0.74,0.697,0.66,0,40,352,392 +5196,2011-08-09,3,0,8,9,0,2,1,2,0.78,0.7424,0.62,0.1343,34,141,175 
+5197,2011-08-09,3,0,8,10,0,2,1,2,0.8,0.7727,0.59,0.2239,55,89,144 +5198,2011-08-09,3,0,8,11,0,2,1,1,0.8,0.7727,0.59,0.2239,61,88,149 +5199,2011-08-09,3,0,8,12,0,2,1,1,0.82,0.7879,0.56,0.194,60,125,185 +5200,2011-08-09,3,0,8,13,0,2,1,1,0.84,0.7879,0.49,0.1642,48,127,175 +5201,2011-08-09,3,0,8,14,0,2,1,1,0.86,0.7879,0.44,0.2836,53,95,148 +5202,2011-08-09,3,0,8,15,0,2,1,1,0.86,0.7727,0.39,0.2836,58,106,164 +5203,2011-08-09,3,0,8,16,0,2,1,1,0.86,0.7727,0.39,0.2985,57,228,285 +5204,2011-08-09,3,0,8,17,0,2,1,1,0.8,0.7576,0.55,0.1642,79,453,532 +5205,2011-08-09,3,0,8,18,0,2,1,1,0.8,0.7576,0.55,0.194,97,488,585 +5206,2011-08-09,3,0,8,19,0,2,1,1,0.8,0.7424,0.49,0.1045,69,302,371 +5207,2011-08-09,3,0,8,20,0,2,1,1,0.8,0.7273,0.43,0.194,34,258,292 +5208,2011-08-09,3,0,8,21,0,2,1,1,0.78,0.697,0.46,0.2537,46,184,230 +5209,2011-08-09,3,0,8,22,0,2,1,1,0.78,0.697,0.46,0.2985,28,127,155 +5210,2011-08-09,3,0,8,23,0,2,1,1,0.76,0.6818,0.48,0.2836,22,53,75 +5211,2011-08-10,3,0,8,0,0,3,1,1,0.74,0.6667,0.51,0.2239,19,18,37 +5212,2011-08-10,3,0,8,1,0,3,1,1,0.72,0.6667,0.54,0.194,7,10,17 +5213,2011-08-10,3,0,8,2,0,3,1,1,0.72,0.6667,0.54,0.1642,1,10,11 +5214,2011-08-10,3,0,8,3,0,3,1,1,0.7,0.6515,0.58,0.1642,1,3,4 +5215,2011-08-10,3,0,8,4,0,3,1,1,0.7,0.6515,0.54,0.1343,1,4,5 +5216,2011-08-10,3,0,8,5,0,3,1,1,0.68,0.6364,0.61,0.0896,2,30,32 +5217,2011-08-10,3,0,8,6,0,3,1,1,0.68,0.6364,0.61,0,9,110,119 +5218,2011-08-10,3,0,8,7,0,3,1,1,0.7,0.6515,0.58,0.0896,16,289,305 +5219,2011-08-10,3,0,8,8,0,3,1,1,0.74,0.6667,0.51,0.194,38,361,399 +5220,2011-08-10,3,0,8,9,0,3,1,1,0.76,0.6818,0.48,0.2239,32,148,180 +5221,2011-08-10,3,0,8,10,0,3,1,1,0.8,0.7273,0.43,0.2239,41,72,113 +5222,2011-08-10,3,0,8,11,0,3,1,1,0.8,0.7121,0.41,0.3881,54,95,149 +5223,2011-08-10,3,0,8,12,0,3,1,1,0.82,0.7273,0.38,0.2985,74,166,240 +5224,2011-08-10,3,0,8,13,0,3,1,1,0.82,0.7273,0.38,0.2836,41,150,191 +5225,2011-08-10,3,0,8,14,0,3,1,1,0.84,0.7273,0.34,0.3284,62,116,178 
+5226,2011-08-10,3,0,8,15,0,3,1,1,0.86,0.7424,0.3,0.3284,53,141,194 +5227,2011-08-10,3,0,8,16,0,3,1,1,0.86,0.7424,0.26,0.2985,76,206,282 +5228,2011-08-10,3,0,8,17,0,3,1,1,0.84,0.7121,0.26,0.2985,71,513,584 +5229,2011-08-10,3,0,8,18,0,3,1,1,0.82,0.7121,0.28,0.2985,70,489,559 +5230,2011-08-10,3,0,8,19,0,3,1,1,0.8,0.697,0.26,0.2537,79,334,413 +5231,2011-08-10,3,0,8,20,0,3,1,1,0.76,0.6667,0.31,0.1343,59,253,312 +5232,2011-08-10,3,0,8,21,0,3,1,1,0.76,0.6667,0.33,0.1045,47,162,209 +5233,2011-08-10,3,0,8,22,0,3,1,1,0.74,0.6515,0.37,0,17,122,139 +5234,2011-08-10,3,0,8,23,0,3,1,1,0.74,0.6515,0.37,0.0896,14,94,108 +5235,2011-08-11,3,0,8,0,0,4,1,1,0.7,0.6364,0.45,0.1045,6,40,46 +5236,2011-08-11,3,0,8,1,0,4,1,1,0.7,0.6364,0.45,0.1343,9,17,26 +5237,2011-08-11,3,0,8,2,0,4,1,1,0.66,0.6212,0.54,0.1343,3,8,11 +5238,2011-08-11,3,0,8,3,0,4,1,1,0.64,0.6212,0.57,0.1343,2,7,9 +5239,2011-08-11,3,0,8,4,0,4,1,1,0.64,0.6212,0.57,0.1642,1,4,5 +5240,2011-08-11,3,0,8,5,0,4,1,1,0.64,0.6212,0.61,0.194,2,23,25 +5241,2011-08-11,3,0,8,6,0,4,1,1,0.62,0.6061,0.65,0.1642,5,101,106 +5242,2011-08-11,3,0,8,7,0,4,1,1,0.66,0.6212,0.54,0.1343,15,292,307 +5243,2011-08-11,3,0,8,8,0,4,1,1,0.7,0.6364,0.45,0.1642,29,361,390 +5244,2011-08-11,3,0,8,9,0,4,1,1,0.72,0.6515,0.39,0.2239,31,166,197 +5245,2011-08-11,3,0,8,10,0,4,1,1,0.74,0.6515,0.37,0.2537,36,96,132 +5246,2011-08-11,3,0,8,11,0,4,1,1,0.76,0.6667,0.35,0.2537,59,128,187 +5247,2011-08-11,3,0,8,12,0,4,1,1,0.78,0.6818,0.31,0.2537,50,164,214 +5248,2011-08-11,3,0,8,13,0,4,1,1,0.76,0.6667,0.31,0.2239,51,166,217 +5249,2011-08-11,3,0,8,14,0,4,1,1,0.8,0.697,0.29,0,39,103,142 +5250,2011-08-11,3,0,8,15,0,4,1,1,0.8,0.697,0.29,0.2985,52,126,178 +5251,2011-08-11,3,0,8,16,0,4,1,1,0.8,0.697,0.27,0.194,57,241,298 +5252,2011-08-11,3,0,8,17,0,4,1,1,0.8,0.697,0.29,0.2537,96,486,582 +5253,2011-08-11,3,0,8,18,0,4,1,1,0.78,0.6818,0.29,0.0896,74,497,571 +5254,2011-08-11,3,0,8,19,0,4,1,1,0.74,0.6515,0.35,0.194,48,323,371 
+5255,2011-08-11,3,0,8,20,0,4,1,1,0.72,0.6515,0.37,0.1343,56,232,288 +5256,2011-08-11,3,0,8,21,0,4,1,1,0.7,0.6515,0.48,0.0896,35,171,206 +5257,2011-08-11,3,0,8,22,0,4,1,1,0.7,0.6515,0.48,0,34,123,157 +5258,2011-08-11,3,0,8,23,0,4,1,1,0.66,0.6212,0.5,0.1642,22,105,127 +5259,2011-08-12,3,0,8,0,0,5,1,1,0.64,0.6212,0.53,0.1642,12,53,65 +5260,2011-08-12,3,0,8,1,0,5,1,1,0.64,0.6212,0.53,0.1343,4,19,23 +5261,2011-08-12,3,0,8,2,0,5,1,1,0.62,0.6212,0.57,0.194,0,15,15 +5262,2011-08-12,3,0,8,3,0,5,1,1,0.62,0.6212,0.53,0.1642,0,9,9 +5263,2011-08-12,3,0,8,4,0,5,1,1,0.62,0.6212,0.53,0.1343,0,5,5 +5264,2011-08-12,3,0,8,5,0,5,1,1,0.6,0.6212,0.56,0.1343,2,24,26 +5265,2011-08-12,3,0,8,6,0,5,1,1,0.62,0.6212,0.5,0.1343,3,73,76 +5266,2011-08-12,3,0,8,7,0,5,1,1,0.64,0.6212,0.44,0.1642,17,247,264 +5267,2011-08-12,3,0,8,8,0,5,1,1,0.68,0.6364,0.39,0.2239,29,397,426 +5268,2011-08-12,3,0,8,9,0,5,1,1,0.72,0.6515,0.34,0.1343,27,178,205 +5269,2011-08-12,3,0,8,10,0,5,1,1,0.74,0.6515,0.33,0.1045,44,86,130 +5270,2011-08-12,3,0,8,11,0,5,1,1,0.76,0.6667,0.29,0,48,108,156 +5271,2011-08-12,3,0,8,12,0,5,1,1,0.8,0.697,0.27,0,55,167,222 +5272,2011-08-12,3,0,8,13,0,5,1,1,0.78,0.6818,0.29,0.0896,74,179,253 +5273,2011-08-12,3,0,8,14,0,5,1,1,0.8,0.697,0.29,0.194,51,135,186 +5274,2011-08-12,3,0,8,15,0,5,1,1,0.8,0.697,0.29,0.1642,59,190,249 +5275,2011-08-12,3,0,8,16,0,5,1,1,0.82,0.7121,0.28,0,68,266,334 +5276,2011-08-12,3,0,8,17,0,5,1,1,0.82,0.7121,0.26,0,93,423,516 +5277,2011-08-12,3,0,8,18,0,5,1,1,0.76,0.6818,0.4,0.2985,89,376,465 +5278,2011-08-12,3,0,8,19,0,5,1,1,0.74,0.6667,0.42,0.2239,101,284,385 +5279,2011-08-12,3,0,8,20,0,5,1,1,0.72,0.6515,0.42,0.2239,77,210,287 +5280,2011-08-12,3,0,8,21,0,5,1,1,0.7,0.6515,0.48,0.1343,73,153,226 +5281,2011-08-12,3,0,8,22,0,5,1,1,0.68,0.6364,0.51,0,74,157,231 +5282,2011-08-12,3,0,8,23,0,5,1,1,0.68,0.6364,0.51,0,51,100,151 +5283,2011-08-13,3,0,8,0,0,6,0,1,0.68,0.6364,0.51,0.1045,15,69,84 +5284,2011-08-13,3,0,8,1,0,6,0,2,0.68,0.6364,0.57,0.0896,13,64,77 
+5285,2011-08-13,3,0,8,2,0,6,0,1,0.68,0.6364,0.57,0,16,58,74 +5286,2011-08-13,3,0,8,3,0,6,0,1,0.66,0.6212,0.65,0.1343,9,18,27 +5287,2011-08-13,3,0,8,4,0,6,0,2,0.64,0.6061,0.73,0.1343,5,5,10 +5288,2011-08-13,3,0,8,5,0,6,0,1,0.64,0.6061,0.73,0.1045,3,15,18 +5289,2011-08-13,3,0,8,6,0,6,0,1,0.64,0.6061,0.73,0.0896,3,19,22 +5290,2011-08-13,3,0,8,7,0,6,0,2,0.66,0.6212,0.74,0.194,10,34,44 +5291,2011-08-13,3,0,8,8,0,6,0,2,0.7,0.6515,0.65,0.1642,28,90,118 +5292,2011-08-13,3,0,8,9,0,6,0,2,0.7,0.6667,0.74,0.1642,70,138,208 +5293,2011-08-13,3,0,8,10,0,6,0,2,0.72,0.697,0.74,0.2239,97,163,260 +5294,2011-08-13,3,0,8,11,0,6,0,2,0.74,0.697,0.7,0.194,122,192,314 +5295,2011-08-13,3,0,8,12,0,6,0,2,0.66,0.5909,0.89,0.1642,175,246,421 +5296,2011-08-13,3,0,8,13,0,6,0,2,0.74,0.7121,0.74,0.3284,61,92,153 +5297,2011-08-13,3,0,8,14,0,6,0,1,0.74,0.697,0.7,0.2537,113,146,259 +5298,2011-08-13,3,0,8,15,0,6,0,3,0.68,0.6364,0.79,0.3582,150,208,358 +5299,2011-08-13,3,0,8,16,0,6,0,2,0.7,0.6667,0.74,0.2836,148,133,281 +5300,2011-08-13,3,0,8,17,0,6,0,2,0.72,0.697,0.74,0.2537,97,172,269 +5301,2011-08-13,3,0,8,18,0,6,0,1,0.7,0.6667,0.74,0.2239,123,169,292 +5302,2011-08-13,3,0,8,19,0,6,0,2,0.68,0.6364,0.79,0.2836,67,170,237 +5303,2011-08-13,3,0,8,20,0,6,0,2,0.68,0.6364,0.83,0.2239,78,105,183 +5304,2011-08-13,3,0,8,21,0,6,0,2,0.68,0.6364,0.83,0.3881,40,127,167 +5305,2011-08-13,3,0,8,22,0,6,0,2,0.68,0.6364,0.83,0.3881,36,118,154 +5306,2011-08-13,3,0,8,23,0,6,0,2,0.66,0.6061,0.83,0.3284,25,95,120 +5307,2011-08-14,3,0,8,0,0,0,0,3,0.66,0.6061,0.83,0.2985,19,71,90 +5308,2011-08-14,3,0,8,1,0,0,0,3,0.64,0.5758,0.89,0.2239,16,57,73 +5309,2011-08-14,3,0,8,2,0,0,0,3,0.64,0.5758,0.89,0.2239,13,49,62 +5310,2011-08-14,3,0,8,3,0,0,0,2,0.64,0.5606,0.94,0.2537,5,22,27 +5311,2011-08-14,3,0,8,4,0,0,0,3,0.64,0.5606,0.94,0.194,1,2,3 +5312,2011-08-14,3,0,8,5,0,0,0,2,0.64,0.5606,0.94,0.1642,1,9,10 +5313,2011-08-14,3,0,8,6,0,0,0,2,0.64,0.5606,0.94,0.1045,2,4,6 +5314,2011-08-14,3,0,8,7,0,0,0,2,0.66,0.5909,0.89,0.194,4,20,24 
+5315,2011-08-14,3,0,8,8,0,0,0,2,0.66,0.5909,0.89,0.2836,20,21,41 +5316,2011-08-14,3,0,8,9,0,0,0,3,0.66,0.5909,0.89,0.2836,27,76,103 +5317,2011-08-14,3,0,8,10,0,0,0,1,0.7,0.6667,0.79,0.2239,67,116,183 +5318,2011-08-14,3,0,8,11,0,0,0,1,0.7,0.6667,0.79,0.194,81,117,198 +5319,2011-08-14,3,0,8,12,0,0,0,2,0.7,0.6667,0.79,0.2239,98,183,281 +5320,2011-08-14,3,0,8,13,0,0,0,1,0.72,0.697,0.74,0.2537,144,233,377 +5321,2011-08-14,3,0,8,14,0,0,0,1,0.74,0.697,0.7,0.194,126,244,370 +5322,2011-08-14,3,0,8,15,0,0,0,1,0.74,0.697,0.66,0.194,128,203,331 +5323,2011-08-14,3,0,8,16,0,0,0,1,0.74,0.697,0.66,0.1642,116,176,292 +5324,2011-08-14,3,0,8,17,0,0,0,1,0.76,0.7121,0.62,0.1343,133,196,329 +5325,2011-08-14,3,0,8,18,0,0,0,1,0.74,0.697,0.66,0.1642,125,222,347 +5326,2011-08-14,3,0,8,19,0,0,0,1,0.72,0.697,0.74,0.2239,75,184,259 +5327,2011-08-14,3,0,8,20,0,0,0,1,0.7,0.6667,0.84,0.1642,67,126,193 +5328,2011-08-14,3,0,8,21,0,0,0,3,0.6,0.5455,0.88,0.4925,51,104,155 +5329,2011-08-14,3,0,8,22,0,0,0,3,0.6,0.5455,0.88,0.4925,14,25,39 +5330,2011-08-14,3,0,8,23,0,0,0,3,0.6,0.5606,0.83,0,5,22,27 +5331,2011-08-15,3,0,8,0,0,1,1,3,0.6,0.5455,0.88,0.0896,7,18,25 +5332,2011-08-15,3,0,8,1,0,1,1,2,0.6,0.5455,0.88,0.1343,1,9,10 +5333,2011-08-15,3,0,8,2,0,1,1,2,0.6,0.5606,0.83,0,0,3,3 +5334,2011-08-15,3,0,8,3,0,1,1,1,0.6,0.5606,0.83,0.194,1,6,7 +5335,2011-08-15,3,0,8,4,0,1,1,1,0.6,0.5606,0.83,0.1343,1,4,5 +5336,2011-08-15,3,0,8,5,0,1,1,2,0.6,0.5455,0.88,0.194,3,14,17 +5337,2011-08-15,3,0,8,6,0,1,1,1,0.6,0.5455,0.88,0.2537,3,87,90 +5338,2011-08-15,3,0,8,7,0,1,1,1,0.6,0.5455,0.88,0.2537,10,248,258 +5339,2011-08-15,3,0,8,8,0,1,1,1,0.64,0.5909,0.78,0.3284,29,326,355 +5340,2011-08-15,3,0,8,9,0,1,1,1,0.64,0.5909,0.78,0.3284,52,170,222 +5341,2011-08-15,3,0,8,10,0,1,1,1,0.68,0.6364,0.69,0.2836,54,87,141 +5342,2011-08-15,3,0,8,11,0,1,1,2,0.72,0.6818,0.62,0.2537,41,112,153 +5343,2011-08-15,3,0,8,12,0,1,1,1,0.74,0.6818,0.58,0.2836,62,116,178 +5344,2011-08-15,3,0,8,13,0,1,1,1,0.78,0.7121,0.52,0.2836,53,140,193 
+5345,2011-08-15,3,0,8,14,0,1,1,1,0.74,0.6818,0.55,0.0896,56,95,151 +5346,2011-08-15,3,0,8,15,0,1,1,3,0.74,0.6818,0.55,0.2985,49,133,182 +5347,2011-08-15,3,0,8,16,0,1,1,1,0.74,0.6818,0.55,0.3284,58,225,283 +5348,2011-08-15,3,0,8,17,0,1,1,1,0.74,0.6667,0.51,0.2836,59,471,530 +5349,2011-08-15,3,0,8,18,0,1,1,1,0.7,0.6515,0.65,0.194,70,413,483 +5350,2011-08-15,3,0,8,19,0,1,1,1,0.7,0.6515,0.65,0,54,343,397 +5351,2011-08-15,3,0,8,20,0,1,1,1,0.66,0.6212,0.74,0.1343,45,240,285 +5352,2011-08-15,3,0,8,21,0,1,1,1,0.66,0.6212,0.65,0.194,34,150,184 +5353,2011-08-15,3,0,8,22,0,1,1,1,0.66,0.6212,0.69,0.2537,22,76,98 +5354,2011-08-15,3,0,8,23,0,1,1,1,0.64,0.6061,0.69,0.2239,11,77,88 +5355,2011-08-16,3,0,8,0,0,2,1,1,0.64,0.6061,0.69,0.1045,8,23,31 +5356,2011-08-16,3,0,8,1,0,2,1,1,0.64,0.6061,0.69,0.1642,4,12,16 +5357,2011-08-16,3,0,8,2,0,2,1,1,0.62,0.5909,0.73,0.2239,1,3,4 +5358,2011-08-16,3,0,8,3,0,2,1,1,0.62,0.5909,0.73,0.2239,1,5,6 +5359,2011-08-16,3,0,8,4,0,2,1,1,0.64,0.6061,0.69,0.2985,0,5,5 +5360,2011-08-16,3,0,8,5,0,2,1,1,0.62,0.5909,0.73,0.2836,1,29,30 +5361,2011-08-16,3,0,8,6,0,2,1,1,0.62,0.5909,0.73,0.3284,4,115,119 +5362,2011-08-16,3,0,8,7,0,2,1,1,0.64,0.6061,0.69,0.2537,18,328,346 +5363,2011-08-16,3,0,8,8,0,2,1,1,0.7,0.6515,0.58,0.4478,24,417,441 +5364,2011-08-16,3,0,8,9,0,2,1,1,0.68,0.6364,0.61,0.2836,27,171,198 +5365,2011-08-16,3,0,8,10,0,2,1,1,0.7,0.6515,0.58,0.3881,28,83,111 +5366,2011-08-16,3,0,8,11,0,2,1,1,0.74,0.6667,0.51,0.2239,40,110,150 +5367,2011-08-16,3,0,8,12,0,2,1,1,0.76,0.6818,0.48,0.3284,42,120,162 +5368,2011-08-16,3,0,8,13,0,2,1,1,0.76,0.6818,0.48,0.2985,45,147,192 +5369,2011-08-16,3,0,8,14,0,2,1,1,0.76,0.6818,0.45,0.4179,65,117,182 +5370,2011-08-16,3,0,8,15,0,2,1,1,0.8,0.7121,0.41,0.2985,36,123,159 +5371,2011-08-16,3,0,8,16,0,2,1,1,0.8,0.7121,0.41,0.2239,55,248,303 +5372,2011-08-16,3,0,8,17,0,2,1,1,0.76,0.6818,0.45,0.3284,75,525,600 +5373,2011-08-16,3,0,8,18,0,2,1,1,0.76,0.6818,0.45,0.2239,54,516,570 
+5374,2011-08-16,3,0,8,19,0,2,1,1,0.74,0.6667,0.48,0.194,56,320,376 +5375,2011-08-16,3,0,8,20,0,2,1,1,0.74,0.6667,0.51,0,48,232,280 +5376,2011-08-16,3,0,8,21,0,2,1,1,0.72,0.6667,0.54,0,38,177,215 +5377,2011-08-16,3,0,8,22,0,2,1,1,0.7,0.6515,0.61,0,31,121,152 +5378,2011-08-16,3,0,8,23,0,2,1,1,0.66,0.6212,0.65,0.1343,20,57,77 +5379,2011-08-17,3,0,8,0,0,3,1,1,0.66,0.6212,0.69,0.0896,8,16,24 +5380,2011-08-17,3,0,8,1,0,3,1,1,0.64,0.6061,0.73,0.1642,2,12,14 +5381,2011-08-17,3,0,8,2,0,3,1,1,0.66,0.6212,0.69,0,0,6,6 +5382,2011-08-17,3,0,8,3,0,3,1,1,0.64,0.5909,0.78,0,1,4,5 +5383,2011-08-17,3,0,8,4,0,3,1,1,0.64,0.6061,0.73,0,0,5,5 +5384,2011-08-17,3,0,8,5,0,3,1,1,0.64,0.6061,0.73,0,0,28,28 +5385,2011-08-17,3,0,8,6,0,3,1,1,0.62,0.5909,0.78,0,4,101,105 +5386,2011-08-17,3,0,8,7,0,3,1,1,0.66,0.6212,0.65,0.0896,12,296,308 +5387,2011-08-17,3,0,8,8,0,3,1,1,0.7,0.6515,0.54,0.1642,35,452,487 +5388,2011-08-17,3,0,8,9,0,3,1,1,0.74,0.6667,0.48,0.0896,31,178,209 +5389,2011-08-17,3,0,8,10,0,3,1,1,0.76,0.6818,0.43,0.0896,27,91,118 +5390,2011-08-17,3,0,8,11,0,3,1,1,0.78,0.697,0.43,0.1045,20,108,128 +5391,2011-08-17,3,0,8,12,0,3,1,1,0.8,0.7273,0.43,0.2836,26,163,189 +5392,2011-08-17,3,0,8,13,0,3,1,1,0.8,0.7121,0.41,0.194,38,138,176 +5393,2011-08-17,3,0,8,14,0,3,1,1,0.82,0.7273,0.36,0.1642,54,138,192 +5394,2011-08-17,3,0,8,15,0,3,1,1,0.8,0.7273,0.43,0.1343,42,117,159 +5395,2011-08-17,3,0,8,16,0,3,1,1,0.82,0.7424,0.43,0.2239,62,238,300 +5396,2011-08-17,3,0,8,17,0,3,1,1,0.8,0.7424,0.49,0.2239,46,506,552 +5397,2011-08-17,3,0,8,18,0,3,1,1,0.76,0.697,0.55,0.2537,86,470,556 +5398,2011-08-17,3,0,8,19,0,3,1,1,0.76,0.697,0.55,0.2239,50,297,347 +5399,2011-08-17,3,0,8,20,0,3,1,1,0.74,0.6818,0.58,0.2537,37,243,280 +5400,2011-08-17,3,0,8,21,0,3,1,1,0.72,0.6818,0.62,0.2239,38,192,230 +5401,2011-08-17,3,0,8,22,0,3,1,1,0.7,0.6515,0.65,0.194,29,142,171 +5402,2011-08-17,3,0,8,23,0,3,1,1,0.7,0.6515,0.65,0.2836,20,85,105 +5403,2011-08-18,3,0,8,0,0,4,1,1,0.68,0.6364,0.65,0.1343,18,38,56 
+5404,2011-08-18,3,0,8,1,0,4,1,2,0.66,0.6212,0.65,0.1642,13,11,24 +5405,2011-08-18,3,0,8,2,0,4,1,1,0.66,0.6212,0.65,0.2239,2,4,6 +5406,2011-08-18,3,0,8,3,0,4,1,1,0.66,0.6212,0.69,0.2836,0,6,6 +5407,2011-08-18,3,0,8,4,0,4,1,1,0.64,0.6061,0.73,0.2239,0,9,9 +5408,2011-08-18,3,0,8,5,0,4,1,1,0.64,0.5909,0.78,0.2836,0,27,27 +5409,2011-08-18,3,0,8,6,0,4,1,2,0.64,0.5909,0.78,0.2537,6,97,103 +5410,2011-08-18,3,0,8,7,0,4,1,1,0.64,0.5909,0.78,0.2985,19,289,308 +5411,2011-08-18,3,0,8,8,0,4,1,1,0.66,0.6212,0.74,0.2836,27,393,420 +5412,2011-08-18,3,0,8,9,0,4,1,1,0.72,0.6818,0.66,0.2537,42,186,228 +5413,2011-08-18,3,0,8,10,0,4,1,1,0.74,0.697,0.66,0.2537,37,92,129 +5414,2011-08-18,3,0,8,11,0,4,1,1,0.76,0.7121,0.58,0.2537,44,97,141 +5415,2011-08-18,3,0,8,12,0,4,1,1,0.8,0.7424,0.52,0.2836,34,139,173 +5416,2011-08-18,3,0,8,13,0,4,1,1,0.8,0.7424,0.52,0.3582,33,154,187 +5417,2011-08-18,3,0,8,14,0,4,1,1,0.82,0.7576,0.46,0.3284,67,115,182 +5418,2011-08-18,3,0,8,15,0,4,1,1,0.8,0.7273,0.46,0.3582,56,118,174 +5419,2011-08-18,3,0,8,16,0,4,1,1,0.8,0.7273,0.46,0.3582,49,196,245 +5420,2011-08-18,3,0,8,17,0,4,1,3,0.8,0.7273,0.46,0.2985,44,365,409 +5421,2011-08-18,3,0,8,18,0,4,1,3,0.76,0.697,0.55,0.2537,48,226,274 +5422,2011-08-18,3,0,8,19,0,4,1,1,0.66,0.6061,0.83,0.1045,26,115,141 +5423,2011-08-18,3,0,8,20,0,4,1,1,0.66,0.6061,0.83,0.1045,24,141,165 +5424,2011-08-18,3,0,8,21,0,4,1,2,0.68,0.6364,0.79,0,12,137,149 +5425,2011-08-18,3,0,8,22,0,4,1,2,0.7,0.6667,0.74,0.1045,18,117,135 +5426,2011-08-18,3,0,8,23,0,4,1,2,0.7,0.6667,0.74,0.1343,20,94,114 +5427,2011-08-19,3,0,8,0,0,5,1,3,0.68,0.6364,0.69,0.194,9,53,62 +5428,2011-08-19,3,0,8,1,0,5,1,3,0.64,0.5758,0.89,0,1,19,20 +5429,2011-08-19,3,0,8,2,0,5,1,3,0.64,0.5758,0.83,0,0,8,8 +5430,2011-08-19,3,0,8,3,0,5,1,1,0.64,0.5909,0.78,0,2,1,3 +5431,2011-08-19,3,0,8,4,0,5,1,1,0.64,0.5758,0.83,0,1,7,8 +5432,2011-08-19,3,0,8,5,0,5,1,1,0.64,0.5758,0.83,0,1,16,17 +5433,2011-08-19,3,0,8,6,0,5,1,1,0.64,0.5758,0.83,0,2,91,93 
+5434,2011-08-19,3,0,8,7,0,5,1,2,0.66,0.6061,0.83,0.0896,12,251,263 +5435,2011-08-19,3,0,8,8,0,5,1,2,0.66,0.6061,0.83,0,30,368,398 +5436,2011-08-19,3,0,8,9,0,5,1,1,0.7,0.6515,0.65,0.1343,31,187,218 +5437,2011-08-19,3,0,8,10,0,5,1,1,0.72,0.6818,0.62,0.0896,57,108,165 +5438,2011-08-19,3,0,8,11,0,5,1,2,0.72,0.6818,0.66,0.0896,45,120,165 +5439,2011-08-19,3,0,8,12,0,5,1,2,0.74,0.6818,0.62,0.0896,54,168,222 +5440,2011-08-19,3,0,8,13,0,5,1,2,0.76,0.7121,0.62,0.194,63,169,232 +5441,2011-08-19,3,0,8,14,0,5,1,1,0.78,0.7273,0.55,0.194,78,142,220 +5442,2011-08-19,3,0,8,15,0,5,1,1,0.8,0.7424,0.49,0.1045,66,156,222 +5443,2011-08-19,3,0,8,16,0,5,1,1,0.8,0.7424,0.49,0,66,261,327 +5444,2011-08-19,3,0,8,17,0,5,1,1,0.76,0.7121,0.58,0.5224,85,442,527 +5445,2011-08-19,3,0,8,18,0,5,1,1,0.76,0.7121,0.58,0.5224,88,337,425 +5446,2011-08-19,3,0,8,19,0,5,1,3,0.62,0.5909,0.73,0.4627,33,132,165 +5447,2011-08-19,3,0,8,20,0,5,1,3,0.62,0.5758,0.83,0.2985,12,46,58 +5448,2011-08-19,3,0,8,21,0,5,1,1,0.62,0.5758,0.83,0.1343,18,89,107 +5449,2011-08-19,3,0,8,22,0,5,1,1,0.6,0.5455,0.88,0.1343,25,97,122 +5450,2011-08-19,3,0,8,23,0,5,1,2,0.6,0.5455,0.88,0.0896,18,88,106 +5451,2011-08-20,3,0,8,0,0,6,0,1,0.6,0.5455,0.88,0,21,107,128 +5452,2011-08-20,3,0,8,1,0,6,0,1,0.62,0.5758,0.83,0,12,35,47 +5453,2011-08-20,3,0,8,2,0,6,0,1,0.62,0.5909,0.78,0,10,59,69 +5454,2011-08-20,3,0,8,3,0,6,0,1,0.62,0.5909,0.78,0,9,42,51 +5455,2011-08-20,3,0,8,4,0,6,0,2,0.6,0.5455,0.88,0,1,6,7 +5456,2011-08-20,3,0,8,5,0,6,0,1,0.6,0.5606,0.83,0,0,8,8 +5457,2011-08-20,3,0,8,6,0,6,0,2,0.6,0.5606,0.83,0.1343,6,18,24 +5458,2011-08-20,3,0,8,7,0,6,0,1,0.62,0.5758,0.83,0,8,36,44 +5459,2011-08-20,3,0,8,8,0,6,0,1,0.66,0.6212,0.69,0,27,106,133 +5460,2011-08-20,3,0,8,9,0,6,0,1,0.7,0.6515,0.65,0,51,139,190 +5461,2011-08-20,3,0,8,10,0,6,0,1,0.72,0.6818,0.62,0.1045,114,192,306 +5462,2011-08-20,3,0,8,11,0,6,0,1,0.72,0.6667,0.58,0,115,223,338 +5463,2011-08-20,3,0,8,12,0,6,0,1,0.74,0.6818,0.58,0.1343,156,253,409 
+5464,2011-08-20,3,0,8,13,0,6,0,1,0.76,0.697,0.55,0.1343,158,220,378 +5465,2011-08-20,3,0,8,14,0,6,0,1,0.8,0.7424,0.49,0.194,147,228,375 +5466,2011-08-20,3,0,8,15,0,6,0,1,0.8,0.7273,0.43,0.194,175,215,390 +5467,2011-08-20,3,0,8,16,0,6,0,1,0.8,0.7273,0.46,0.194,151,222,373 +5468,2011-08-20,3,0,8,17,0,6,0,1,0.8,0.7424,0.49,0.194,180,203,383 +5469,2011-08-20,3,0,8,18,0,6,0,1,0.76,0.7121,0.58,0.2537,169,225,394 +5470,2011-08-20,3,0,8,19,0,6,0,1,0.74,0.6818,0.62,0.194,109,182,291 +5471,2011-08-20,3,0,8,20,0,6,0,1,0.74,0.6818,0.62,0.2239,78,170,248 +5472,2011-08-20,3,0,8,21,0,6,0,1,0.72,0.6818,0.7,0.2239,81,144,225 +5473,2011-08-20,3,0,8,22,0,6,0,2,0.7,0.6667,0.74,0.194,77,146,223 +5474,2011-08-20,3,0,8,23,0,6,0,1,0.7,0.6667,0.74,0.1343,59,98,157 +5475,2011-08-21,3,0,8,0,0,0,0,1,0.7,0.6667,0.74,0.194,6,68,74 +5476,2011-08-21,3,0,8,1,0,0,0,1,0.68,0.6364,0.79,0.2239,17,69,86 +5477,2011-08-21,3,0,8,2,0,0,0,1,0.68,0.6364,0.79,0.194,13,51,64 +5478,2011-08-21,3,0,8,3,0,0,0,1,0.66,0.6061,0.83,0.2239,25,26,51 +5479,2011-08-21,3,0,8,4,0,0,0,1,0.68,0.6364,0.83,0.2239,1,5,6 +5480,2011-08-21,3,0,8,5,0,0,0,1,0.66,0.5909,0.89,0.2985,0,8,8 +5481,2011-08-21,3,0,8,6,0,0,0,1,0.66,0.5909,0.89,0.2836,4,10,14 +5482,2011-08-21,3,0,8,7,0,0,0,1,0.68,0.6364,0.83,0.2985,12,19,31 +5483,2011-08-21,3,0,8,8,0,0,0,1,0.7,0.6667,0.79,0.2239,30,62,92 +5484,2011-08-21,3,0,8,9,0,0,0,1,0.72,0.697,0.74,0.2239,80,151,231 +5485,2011-08-21,3,0,8,10,0,0,0,1,0.74,0.7121,0.74,0.2537,97,177,274 +5486,2011-08-21,3,0,8,11,0,0,0,1,0.8,0.7879,0.63,0.2836,132,181,313 +5487,2011-08-21,3,0,8,12,0,0,0,1,0.82,0.803,0.59,0.2836,135,239,374 +5488,2011-08-21,3,0,8,13,0,0,0,1,0.8,0.7727,0.59,0.4179,94,216,310 +5489,2011-08-21,3,0,8,14,0,0,0,3,0.7,0.6667,0.79,0.3881,78,157,235 +5490,2011-08-21,3,0,8,15,0,0,0,1,0.7,0.6667,0.84,0.2836,17,51,68 +5491,2011-08-21,3,0,8,16,0,0,0,1,0.72,0.697,0.79,0.1642,97,161,258 +5492,2011-08-21,3,0,8,17,0,0,0,1,0.72,0.697,0.79,0.1343,91,179,270 
+5493,2011-08-21,3,0,8,18,0,0,0,1,0.72,0.697,0.79,0.2537,96,215,311 +5494,2011-08-21,3,0,8,19,0,0,0,1,0.72,0.6818,0.7,0.2537,81,201,282 +5495,2011-08-21,3,0,8,20,0,0,0,1,0.7,0.6667,0.79,0.2537,41,140,181 +5496,2011-08-21,3,0,8,21,0,0,0,1,0.7,0.6667,0.79,0.2537,45,93,138 +5497,2011-08-21,3,0,8,22,0,0,0,2,0.7,0.6667,0.79,0.194,37,97,134 +5498,2011-08-21,3,0,8,23,0,0,0,1,0.7,0.6667,0.74,0.1642,20,48,68 +5499,2011-08-22,3,0,8,0,0,1,1,1,0.68,0.6364,0.79,0.1642,18,22,40 +5500,2011-08-22,3,0,8,1,0,1,1,1,0.68,0.6364,0.79,0.0896,8,13,21 +5501,2011-08-22,3,0,8,2,0,1,1,2,0.7,0.6515,0.65,0.1045,2,6,8 +5502,2011-08-22,3,0,8,3,0,1,1,3,0.7,0.6515,0.65,0.1343,0,5,5 +5503,2011-08-22,3,0,8,4,0,1,1,1,0.68,0.6364,0.65,0.194,2,10,12 +5504,2011-08-22,3,0,8,5,0,1,1,1,0.66,0.6212,0.65,0.3582,1,23,24 +5505,2011-08-22,3,0,8,6,0,1,1,1,0.66,0.6212,0.61,0.2985,6,106,112 +5506,2011-08-22,3,0,8,7,0,1,1,1,0.66,0.6212,0.61,0.3284,11,270,281 +5507,2011-08-22,3,0,8,8,0,1,1,1,0.66,0.6212,0.61,0.3284,20,331,351 +5508,2011-08-22,3,0,8,9,0,1,1,1,0.68,0.6364,0.54,0.2537,36,151,187 +5509,2011-08-22,3,0,8,10,0,1,1,1,0.7,0.6515,0.48,0.2836,36,85,121 +5510,2011-08-22,3,0,8,11,0,1,1,1,0.72,0.6515,0.34,0.3284,46,85,131 +5511,2011-08-22,3,0,8,12,0,1,1,1,0.74,0.6515,0.35,0.3881,54,153,207 +5512,2011-08-22,3,0,8,13,0,1,1,1,0.74,0.6515,0.3,0.4627,47,158,205 +5513,2011-08-22,3,0,8,14,0,1,1,1,0.74,0.6515,0.3,0.3881,88,137,225 +5514,2011-08-22,3,0,8,15,0,1,1,1,0.74,0.6515,0.3,0.2985,72,127,199 +5515,2011-08-22,3,0,8,16,0,1,1,1,0.74,0.6515,0.3,0.3881,66,254,320 +5516,2011-08-22,3,0,8,17,0,1,1,1,0.74,0.6515,0.27,0.3284,82,509,591 +5517,2011-08-22,3,0,8,18,0,1,1,1,0.72,0.6515,0.28,0.2985,72,537,609 +5518,2011-08-22,3,0,8,19,0,1,1,1,0.7,0.6364,0.3,0.2836,66,350,416 +5519,2011-08-22,3,0,8,20,0,1,1,1,0.66,0.6212,0.34,0.194,21,247,268 +5520,2011-08-22,3,0,8,21,0,1,1,1,0.64,0.6212,0.38,0.194,28,180,208 +5521,2011-08-22,3,0,8,22,0,1,1,1,0.64,0.6212,0.38,0.2985,32,121,153 
+5522,2011-08-22,3,0,8,23,0,1,1,1,0.62,0.6212,0.41,0.2537,19,45,64 +5523,2011-08-23,3,0,8,0,0,2,1,1,0.6,0.6212,0.43,0.194,13,22,35 +5524,2011-08-23,3,0,8,1,0,2,1,1,0.58,0.5455,0.49,0.194,6,13,19 +5525,2011-08-23,3,0,8,2,0,2,1,1,0.56,0.5303,0.52,0.194,2,4,6 +5526,2011-08-23,3,0,8,3,0,2,1,1,0.56,0.5303,0.52,0.2836,2,4,6 +5527,2011-08-23,3,0,8,4,0,2,1,1,0.54,0.5152,0.56,0.1642,0,5,5 +5528,2011-08-23,3,0,8,5,0,2,1,1,0.54,0.5152,0.56,0.194,1,35,36 +5529,2011-08-23,3,0,8,6,0,2,1,1,0.54,0.5152,0.6,0.194,3,111,114 +5530,2011-08-23,3,0,8,7,0,2,1,1,0.6,0.6212,0.49,0.194,11,333,344 +5531,2011-08-23,3,0,8,8,0,2,1,2,0.62,0.6212,0.5,0,43,461,504 +5532,2011-08-23,3,0,8,9,0,2,1,2,0.66,0.6212,0.44,0,34,191,225 +5533,2011-08-23,3,0,8,10,0,2,1,1,0.7,0.6364,0.37,0,56,92,148 +5534,2011-08-23,3,0,8,11,0,2,1,1,0.7,0.6364,0.34,0.1045,62,124,186 +5535,2011-08-23,3,0,8,12,0,2,1,1,0.7,0.6364,0.39,0.0896,60,177,237 +5536,2011-08-23,3,0,8,13,0,2,1,1,0.72,0.6515,0.3,0.1045,70,164,234 +5537,2011-08-23,3,0,8,14,0,2,1,1,0.72,0.6515,0.3,0.0896,149,502,651 +5538,2011-08-23,3,0,8,15,0,2,1,1,0.72,0.6515,0.34,0.2239,178,423,601 +5539,2011-08-23,3,0,8,16,0,2,1,1,0.72,0.6515,0.32,0.1343,133,311,444 +5540,2011-08-23,3,0,8,17,0,2,1,1,0.72,0.6515,0.34,0.2239,133,339,472 +5541,2011-08-23,3,0,8,18,0,2,1,1,0.72,0.6515,0.34,0.194,98,421,519 +5542,2011-08-23,3,0,8,19,0,2,1,1,0.66,0.6212,0.5,0.2239,70,297,367 +5543,2011-08-23,3,0,8,20,0,2,1,1,0.64,0.6212,0.53,0.2537,54,206,260 +5544,2011-08-23,3,0,8,21,0,2,1,1,0.62,0.6212,0.57,0,38,173,211 +5545,2011-08-23,3,0,8,22,0,2,1,1,0.62,0.6212,0.57,0.1343,46,145,191 +5546,2011-08-23,3,0,8,23,0,2,1,1,0.62,0.6061,0.61,0.1343,19,61,80 +5547,2011-08-24,3,0,8,0,0,3,1,1,0.62,0.6212,0.57,0.1343,9,29,38 +5548,2011-08-24,3,0,8,1,0,3,1,1,0.6,0.6061,0.6,0.1045,7,17,24 +5549,2011-08-24,3,0,8,2,0,3,1,1,0.58,0.5455,0.68,0.1343,2,4,6 +5550,2011-08-24,3,0,8,3,0,3,1,1,0.58,0.5455,0.68,0.1343,0,1,1 +5551,2011-08-24,3,0,8,4,0,3,1,1,0.56,0.5303,0.73,0.1045,0,7,7 
+5552,2011-08-24,3,0,8,5,0,3,1,1,0.56,0.5303,0.73,0.1045,1,26,27 +5553,2011-08-24,3,0,8,6,0,3,1,1,0.56,0.5303,0.73,0,3,105,108 +5554,2011-08-24,3,0,8,7,0,3,1,1,0.6,0.6061,0.64,0.0896,11,297,308 +5555,2011-08-24,3,0,8,8,0,3,1,1,0.62,0.6061,0.69,0.1343,32,436,468 +5556,2011-08-24,3,0,8,9,0,3,1,1,0.64,0.6061,0.69,0.2537,44,169,213 +5557,2011-08-24,3,0,8,10,0,3,1,1,0.7,0.6515,0.58,0.2836,51,95,146 +5558,2011-08-24,3,0,8,11,0,3,1,1,0.72,0.6667,0.54,0.2985,57,128,185 +5559,2011-08-24,3,0,8,12,0,3,1,1,0.74,0.6818,0.55,0.3284,60,175,235 +5560,2011-08-24,3,0,8,13,0,3,1,1,0.76,0.6818,0.48,0.4179,56,180,236 +5561,2011-08-24,3,0,8,14,0,3,1,1,0.76,0.6818,0.48,0.4179,77,127,204 +5562,2011-08-24,3,0,8,15,0,3,1,1,0.76,0.6818,0.48,0.4478,63,141,204 +5563,2011-08-24,3,0,8,16,0,3,1,1,0.76,0.697,0.52,0.4179,49,222,271 +5564,2011-08-24,3,0,8,17,0,3,1,1,0.76,0.697,0.52,0.3582,83,484,567 +5565,2011-08-24,3,0,8,18,0,3,1,1,0.74,0.6818,0.55,0.2985,102,519,621 +5566,2011-08-24,3,0,8,19,0,3,1,1,0.74,0.6818,0.55,0.2985,93,347,440 +5567,2011-08-24,3,0,8,20,0,3,1,1,0.72,0.6667,0.58,0.2985,45,289,334 +5568,2011-08-24,3,0,8,21,0,3,1,1,0.7,0.6515,0.61,0.3582,47,189,236 +5569,2011-08-24,3,0,8,22,0,3,1,1,0.7,0.6515,0.65,0.3582,41,140,181 +5570,2011-08-24,3,0,8,23,0,3,1,1,0.68,0.6364,0.69,0.2985,16,54,70 +5571,2011-08-25,3,0,8,0,0,4,1,1,0.68,0.6364,0.74,0.2985,11,41,52 +5572,2011-08-25,3,0,8,1,0,4,1,1,0.68,0.6364,0.69,0.3881,4,11,15 +5573,2011-08-25,3,0,8,2,0,4,1,1,0.68,0.6364,0.69,0.3881,3,2,5 +5574,2011-08-25,3,0,8,3,0,4,1,1,0.66,0.6212,0.74,0.3284,2,2,4 +5575,2011-08-25,3,0,8,4,0,4,1,1,0.66,0.6061,0.78,0.3881,1,6,7 +5576,2011-08-25,3,0,8,5,0,4,1,1,0.66,0.6061,0.83,0.4179,1,25,26 +5577,2011-08-25,3,0,8,6,0,4,1,1,0.66,0.6061,0.83,0.3284,1,101,102 +5578,2011-08-25,3,0,8,7,0,4,1,2,0.68,0.6364,0.79,0.3284,17,296,313 +5579,2011-08-25,3,0,8,8,0,4,1,2,0.7,0.6667,0.74,0.2985,23,461,484 +5580,2011-08-25,3,0,8,9,0,4,1,2,0.72,0.6818,0.7,0.2985,21,141,162 
+5581,2011-08-25,3,0,8,10,0,4,1,2,0.74,0.697,0.66,0.2985,37,105,142 +5582,2011-08-25,3,0,8,11,0,4,1,3,0.7,0.6667,0.79,0,39,112,151 +5583,2011-08-25,3,0,8,12,0,4,1,3,0.7,0.6667,0.79,0,9,32,41 +5584,2011-08-25,3,0,8,13,0,4,1,1,0.7,0.6667,0.79,0.1045,7,30,37 +5585,2011-08-25,3,0,8,14,0,4,1,2,0.72,0.697,0.74,0,27,86,113 +5586,2011-08-25,3,0,8,15,0,4,1,2,0.74,0.697,0.7,0,28,104,132 +5587,2011-08-25,3,0,8,16,0,4,1,1,0.72,0.7121,0.84,0,16,105,121 +5588,2011-08-25,3,0,8,17,0,4,1,1,0.72,0.7121,0.84,0,28,284,312 +5589,2011-08-25,3,0,8,18,0,4,1,2,0.72,0.7121,0.84,0.1343,31,377,408 +5590,2011-08-25,3,0,8,19,0,4,1,2,0.66,0.6061,0.78,0.3284,31,241,272 +5591,2011-08-25,3,0,8,20,0,4,1,1,0.64,0.5909,0.78,0.3582,33,192,225 +5592,2011-08-25,3,0,8,21,0,4,1,1,0.64,0.5909,0.78,0.2836,29,158,187 +5593,2011-08-25,3,0,8,22,0,4,1,1,0.62,0.5758,0.83,0,21,126,147 +5594,2011-08-25,3,0,8,23,0,4,1,1,0.62,0.5758,0.83,0.0896,15,69,84 +5595,2011-08-26,3,0,8,0,0,5,1,1,0.62,0.5758,0.83,0.1343,9,42,51 +5596,2011-08-26,3,0,8,1,0,5,1,1,0.62,0.5758,0.83,0,10,13,23 +5597,2011-08-26,3,0,8,2,0,5,1,1,0.62,0.5606,0.88,0.1343,4,16,20 +5598,2011-08-26,3,0,8,3,0,5,1,1,0.6,0.5455,0.88,0.1045,3,8,11 +5599,2011-08-26,3,0,8,4,0,5,1,1,0.62,0.5606,0.88,0,2,6,8 +5600,2011-08-26,3,0,8,5,0,5,1,2,0.62,0.5606,0.88,0,3,23,26 +5601,2011-08-26,3,0,8,6,0,5,1,1,0.62,0.5606,0.88,0,5,100,105 +5602,2011-08-26,3,0,8,7,0,5,1,1,0.64,0.5758,0.83,0,3,263,266 +5603,2011-08-26,3,0,8,8,0,5,1,1,0.66,0.6061,0.83,0.1045,17,387,404 +5604,2011-08-26,3,0,8,9,0,5,1,1,0.7,0.6667,0.74,0,39,191,230 +5605,2011-08-26,3,0,8,10,0,5,1,1,0.74,0.697,0.66,0.0896,34,89,123 +5606,2011-08-26,3,0,8,11,0,5,1,2,0.76,0.7121,0.62,0,63,151,214 +5607,2011-08-26,3,0,8,12,0,5,1,2,0.76,0.7273,0.66,0.1045,70,173,243 +5608,2011-08-26,3,0,8,13,0,5,1,2,0.78,0.7424,0.59,0.1045,72,165,237 +5609,2011-08-26,3,0,8,14,0,5,1,1,0.8,0.7727,0.59,0,55,164,219 +5610,2011-08-26,3,0,8,15,0,5,1,2,0.8,0.7727,0.59,0.194,53,196,249 
+5611,2011-08-26,3,0,8,16,0,5,1,1,0.78,0.7424,0.62,0.1045,50,293,343 +5612,2011-08-26,3,0,8,17,0,5,1,1,0.74,0.7121,0.74,0.2537,56,433,489 +5613,2011-08-26,3,0,8,18,0,5,1,2,0.74,0.7121,0.74,0.1343,40,370,410 +5614,2011-08-26,3,0,8,19,0,5,1,1,0.72,0.697,0.79,0.1045,55,235,290 +5615,2011-08-26,3,0,8,20,0,5,1,1,0.72,0.697,0.79,0.0896,39,183,222 +5616,2011-08-26,3,0,8,21,0,5,1,1,0.72,0.697,0.79,0.1642,38,152,190 +5617,2011-08-26,3,0,8,22,0,5,1,1,0.72,0.697,0.79,0.1045,26,126,152 +5618,2011-08-26,3,0,8,23,0,5,1,1,0.7,0.6667,0.84,0.0896,22,114,136 +5619,2011-08-27,3,0,8,0,0,6,0,1,0.7,0.6667,0.84,0.1045,33,112,145 +5620,2011-08-27,3,0,8,1,0,6,0,1,0.7,0.6667,0.84,0.1642,13,51,64 +5621,2011-08-27,3,0,8,2,0,6,0,1,0.7,0.6667,0.84,0.194,18,59,77 +5622,2011-08-27,3,0,8,3,0,6,0,2,0.7,0.6667,0.84,0.2239,8,22,30 +5623,2011-08-27,3,0,8,4,0,6,0,2,0.7,0.6667,0.84,0.2239,1,3,4 +5624,2011-08-27,3,0,8,5,0,6,0,2,0.7,0.6667,0.84,0.2985,1,11,12 +5625,2011-08-27,3,0,8,6,0,6,0,2,0.7,0.6667,0.84,0.2985,3,15,18 +5626,2011-08-27,3,0,8,7,0,6,0,2,0.7,0.6667,0.84,0.3582,2,26,28 +5627,2011-08-27,3,0,8,8,0,6,0,2,0.7,0.6667,0.84,0.2537,14,62,76 +5628,2011-08-27,3,0,8,9,0,6,0,2,0.7,0.6667,0.84,0.4179,28,128,156 +5629,2011-08-27,3,0,8,10,0,6,0,2,0.7,0.6667,0.79,0.4627,51,154,205 +5630,2011-08-27,3,0,8,11,0,6,0,3,0.66,0.5909,0.89,0.4179,15,73,88 +5631,2011-08-27,3,0,8,12,0,6,0,3,0.66,0.6061,0.83,0.4925,11,65,76 +5632,2011-08-27,3,0,8,13,0,6,0,3,0.66,0.6061,0.83,0.3881,10,33,43 +5633,2011-08-27,3,0,8,14,0,6,0,3,0.64,0.5758,0.89,0.5522,4,19,23 +5634,2011-08-27,3,0,8,15,0,6,0,3,0.64,0.5758,0.89,0.5522,2,28,30 +5635,2011-08-27,3,0,8,16,0,6,0,3,0.64,0.5758,0.89,0.5224,10,14,24 +5636,2011-08-27,3,0,8,17,0,6,0,3,0.64,0.5758,0.89,0.8358,2,14,16 +5637,2011-08-28,3,0,8,7,0,0,0,3,0.62,0.5758,0.83,0.3582,0,1,1 +5638,2011-08-28,3,0,8,8,0,0,0,3,0.62,0.5758,0.83,0.4179,2,6,8 +5639,2011-08-28,3,0,8,9,0,0,0,1,0.66,0.6212,0.74,0.4179,7,46,53 +5640,2011-08-28,3,0,8,10,0,0,0,1,0.7,0.6515,0.61,0.6119,27,115,142 
+5641,2011-08-28,3,0,8,11,0,0,0,1,0.7,0.6515,0.58,0.3881,59,178,237 +5642,2011-08-28,3,0,8,12,0,0,0,1,0.74,0.6667,0.51,0.3881,88,218,306 +5643,2011-08-28,3,0,8,13,0,0,0,1,0.76,0.6818,0.45,0.3582,129,302,431 +5644,2011-08-28,3,0,8,14,0,0,0,1,0.78,0.697,0.43,0.4179,157,290,447 +5645,2011-08-28,3,0,8,15,0,0,0,1,0.78,0.697,0.43,0.3582,155,314,469 +5646,2011-08-28,3,0,8,16,0,0,0,1,0.8,0.7121,0.36,0.3881,196,295,491 +5647,2011-08-28,3,0,8,17,0,0,0,1,0.76,0.6818,0.4,0.2985,145,253,398 +5648,2011-08-28,3,0,8,18,0,0,0,1,0.74,0.6667,0.42,0.2239,145,257,402 +5649,2011-08-28,3,0,8,19,0,0,0,1,0.72,0.6667,0.48,0.1045,140,247,387 +5650,2011-08-28,3,0,8,20,0,0,0,1,0.72,0.6515,0.45,0.1045,64,150,214 +5651,2011-08-28,3,0,8,21,0,0,0,1,0.66,0.6212,0.61,0.1343,60,126,186 +5652,2011-08-28,3,0,8,22,0,0,0,1,0.64,0.6061,0.69,0.1045,25,73,98 +5653,2011-08-28,3,0,8,23,0,0,0,1,0.62,0.5909,0.73,0.1045,16,48,64 +5654,2011-08-29,3,0,8,0,0,1,1,1,0.62,0.5909,0.73,0.1045,13,21,34 +5655,2011-08-29,3,0,8,1,0,1,1,1,0.6,0.5909,0.69,0.1045,5,15,20 +5656,2011-08-29,3,0,8,2,0,1,1,1,0.6,0.5909,0.69,0.194,8,5,13 +5657,2011-08-29,3,0,8,3,0,1,1,1,0.6,0.5909,0.69,0.1045,1,5,6 +5658,2011-08-29,3,0,8,4,0,1,1,1,0.56,0.5303,0.68,0.1343,0,3,3 +5659,2011-08-29,3,0,8,5,0,1,1,1,0.56,0.5303,0.73,0.1343,0,17,17 +5660,2011-08-29,3,0,8,6,0,1,1,1,0.56,0.5303,0.73,0.1343,3,99,102 +5661,2011-08-29,3,0,8,7,0,1,1,1,0.6,0.6061,0.6,0.2239,11,273,284 +5662,2011-08-29,3,0,8,8,0,1,1,1,0.62,0.6212,0.57,0.2537,20,384,404 +5663,2011-08-29,3,0,8,9,0,1,1,1,0.62,0.6212,0.53,0.2537,22,166,188 +5664,2011-08-29,3,0,8,10,0,1,1,1,0.66,0.6212,0.47,0.2239,42,73,115 +5665,2011-08-29,3,0,8,11,0,1,1,1,0.66,0.6212,0.47,0.1642,38,100,138 +5666,2011-08-29,3,0,8,12,0,1,1,2,0.68,0.6364,0.44,0.1642,48,159,207 +5667,2011-08-29,3,0,8,13,0,1,1,2,0.7,0.6364,0.39,0,48,170,218 +5668,2011-08-29,3,0,8,14,0,1,1,1,0.7,0.6364,0.42,0,55,127,182 +5669,2011-08-29,3,0,8,15,0,1,1,1,0.7,0.6364,0.42,0.1642,56,164,220 
+5670,2011-08-29,3,0,8,16,0,1,1,1,0.72,0.6515,0.42,0.1642,56,226,282 +5671,2011-08-29,3,0,8,17,0,1,1,2,0.7,0.6364,0.47,0.2239,67,524,591 +5672,2011-08-29,3,0,8,18,0,1,1,1,0.66,0.6212,0.47,0.2537,60,487,547 +5673,2011-08-29,3,0,8,19,0,1,1,1,0.66,0.6212,0.47,0.2537,50,317,367 +5674,2011-08-29,3,0,8,20,0,1,1,1,0.64,0.6212,0.53,0.1642,50,227,277 +5675,2011-08-29,3,0,8,21,0,1,1,1,0.64,0.6212,0.53,0.194,34,168,202 +5676,2011-08-29,3,0,8,22,0,1,1,1,0.62,0.6212,0.57,0.1343,23,119,142 +5677,2011-08-29,3,0,8,23,0,1,1,1,0.6,0.6061,0.6,0.0896,19,56,75 +5678,2011-08-30,3,0,8,0,0,2,1,1,0.56,0.5303,0.73,0.1343,10,17,27 +5679,2011-08-30,3,0,8,1,0,2,1,1,0.56,0.5303,0.68,0.1045,6,7,13 +5680,2011-08-30,3,0,8,2,0,2,1,1,0.54,0.5152,0.73,0.1045,2,8,10 +5681,2011-08-30,3,0,8,3,0,2,1,1,0.54,0.5152,0.73,0.194,2,2,4 +5682,2011-08-30,3,0,8,4,0,2,1,1,0.52,0.5,0.77,0.1642,0,6,6 +5683,2011-08-30,3,0,8,5,0,2,1,1,0.52,0.5,0.77,0.0896,1,27,28 +5684,2011-08-30,3,0,8,6,0,2,1,1,0.54,0.5152,0.73,0.1045,4,115,119 +5685,2011-08-30,3,0,8,7,0,2,1,1,0.56,0.5303,0.68,0.1343,19,338,357 +5686,2011-08-30,3,0,8,8,0,2,1,1,0.62,0.6212,0.57,0.1343,34,459,493 +5687,2011-08-30,3,0,8,9,0,2,1,1,0.66,0.6212,0.5,0.2239,38,180,218 +5688,2011-08-30,3,0,8,10,0,2,1,1,0.7,0.6364,0.39,0.2239,28,109,137 +5689,2011-08-30,3,0,8,11,0,2,1,1,0.72,0.6515,0.37,0.194,60,115,175 +5690,2011-08-30,3,0,8,12,0,2,1,1,0.72,0.6515,0.37,0,65,172,237 +5691,2011-08-30,3,0,8,13,0,2,1,1,0.74,0.6515,0.35,0.1045,59,170,229 +5692,2011-08-30,3,0,8,14,0,2,1,1,0.74,0.6515,0.33,0.1642,58,151,209 +5693,2011-08-30,3,0,8,15,0,2,1,1,0.74,0.6515,0.35,0.2239,34,162,196 +5694,2011-08-30,3,0,8,16,0,2,1,1,0.74,0.6515,0.37,0.1642,42,295,337 +5695,2011-08-30,3,0,8,17,0,2,1,1,0.72,0.6515,0.42,0.194,62,549,611 +5696,2011-08-30,3,0,8,18,0,2,1,1,0.7,0.6364,0.45,0.1642,60,516,576 +5697,2011-08-30,3,0,8,19,0,2,1,1,0.66,0.6212,0.5,0.0896,68,397,465 +5698,2011-08-30,3,0,8,20,0,2,1,1,0.66,0.6212,0.5,0,39,259,298 
+5699,2011-08-30,3,0,8,21,0,2,1,1,0.64,0.6061,0.65,0.0896,43,180,223 +5700,2011-08-30,3,0,8,22,0,2,1,1,0.62,0.6061,0.61,0,22,121,143 +5701,2011-08-30,3,0,8,23,0,2,1,1,0.62,0.6061,0.61,0,19,74,93 +5702,2011-08-31,3,0,8,0,0,3,1,1,0.6,0.5909,0.69,0,8,24,32 +5703,2011-08-31,3,0,8,1,0,3,1,1,0.6,0.5909,0.69,0,2,13,15 +5704,2011-08-31,3,0,8,2,0,3,1,1,0.56,0.5303,0.73,0,1,5,6 +5705,2011-08-31,3,0,8,3,0,3,1,1,0.56,0.5303,0.78,0,2,4,6 +5706,2011-08-31,3,0,8,4,0,3,1,1,0.56,0.5303,0.73,0,0,5,5 +5707,2011-08-31,3,0,8,5,0,3,1,1,0.54,0.5152,0.83,0.0896,2,25,27 +5708,2011-08-31,3,0,8,6,0,3,1,1,0.54,0.5152,0.77,0,4,107,111 +5709,2011-08-31,3,0,8,7,0,3,1,1,0.6,0.5758,0.78,0,12,316,328 +5710,2011-08-31,3,0,8,8,0,3,1,1,0.62,0.6061,0.69,0,27,440,467 +5711,2011-08-31,3,0,8,9,0,3,1,1,0.64,0.6061,0.69,0.0896,27,217,244 +5712,2011-08-31,3,0,8,10,0,3,1,1,0.7,0.6364,0.45,0.1045,33,105,138 +5713,2011-08-31,3,0,8,11,0,3,1,1,0.72,0.6515,0.42,0.1343,28,143,171 +5714,2011-08-31,3,0,8,12,0,3,1,1,0.74,0.6667,0.43,0.1343,68,192,260 +5715,2011-08-31,3,0,8,13,0,3,1,1,0.74,0.6515,0.37,0.1045,39,182,221 +5716,2011-08-31,3,0,8,14,0,3,1,1,0.74,0.6515,0.4,0.1343,48,148,196 +5717,2011-08-31,3,0,8,15,0,3,1,1,0.74,0.6515,0.4,0.1343,34,138,172 +5718,2011-08-31,3,0,8,16,0,3,1,1,0.74,0.6667,0.42,0.1343,46,263,309 +5719,2011-08-31,3,0,8,17,0,3,1,1,0.72,0.6515,0.45,0.1343,83,525,608 +5720,2011-08-31,3,0,8,18,0,3,1,1,0.7,0.6515,0.51,0.2239,70,495,565 +5721,2011-08-31,3,0,8,19,0,3,1,1,0.84,0.7727,0.47,0.2537,53,393,446 +5722,2011-08-31,3,0,8,20,0,3,1,1,0.66,0.6212,0.57,0.1343,30,259,289 +5723,2011-08-31,3,0,8,21,0,3,1,1,0.66,0.6212,0.61,0.0896,27,174,201 +5724,2011-08-31,3,0,8,22,0,3,1,1,0.64,0.6061,0.69,0,20,137,157 +5725,2011-08-31,3,0,8,23,0,3,1,1,0.6,0.5758,0.78,0.1045,24,60,84 +5726,2011-09-01,3,0,9,0,0,4,1,1,0.6,0.5758,0.78,0.1045,18,33,51 +5727,2011-09-01,3,0,9,1,0,4,1,1,0.6,0.5909,0.73,0.0896,7,14,21 +5728,2011-09-01,3,0,9,2,0,4,1,1,0.58,0.5455,0.78,0.0896,14,11,25 
+5729,2011-09-01,3,0,9,3,0,4,1,1,0.58,0.5455,0.78,0.0896,7,7,14 +5730,2011-09-01,3,0,9,4,0,4,1,1,0.56,0.5303,0.83,0.0896,0,7,7 +5731,2011-09-01,3,0,9,5,0,4,1,1,0.56,0.5303,0.73,0.0896,1,22,23 +5732,2011-09-01,3,0,9,6,0,4,1,1,0.6,0.5758,0.78,0,2,103,105 +5733,2011-09-01,3,0,9,7,0,4,1,1,0.6,0.5758,0.78,0,7,335,342 +5734,2011-09-01,3,0,9,8,0,4,1,1,0.62,0.5909,0.73,0.1343,31,467,498 +5735,2011-09-01,3,0,9,9,0,4,1,1,0.64,0.6061,0.69,0.1045,29,178,207 +5736,2011-09-01,3,0,9,10,0,4,1,1,0.68,0.6364,0.61,0.1343,46,92,138 +5737,2011-09-01,3,0,9,11,0,4,1,1,0.72,0.6667,0.51,0.1343,51,141,192 +5738,2011-09-01,3,0,9,12,0,4,1,2,0.72,0.6667,0.51,0.1343,64,165,229 +5739,2011-09-01,3,0,9,13,0,4,1,2,0.72,0.6667,0.48,0.0896,50,169,219 +5740,2011-09-01,3,0,9,14,0,4,1,3,0.72,0.6667,0.54,0.2537,54,144,198 +5741,2011-09-01,3,0,9,15,0,4,1,1,0.72,0.6667,0.51,0.1343,39,135,174 +5742,2011-09-01,3,0,9,16,0,4,1,1,0.74,0.6667,0.51,0.194,55,253,308 +5743,2011-09-01,3,0,9,17,0,4,1,1,0.72,0.6667,0.54,0.2537,61,567,628 +5744,2011-09-01,3,0,9,18,0,4,1,1,0.72,0.6667,0.54,0.2239,69,462,531 +5745,2011-09-01,3,0,9,19,0,4,1,1,0.7,0.6515,0.58,0.1642,79,364,443 +5746,2011-09-01,3,0,9,20,0,4,1,1,0.66,0.6212,0.61,0.2239,33,247,280 +5747,2011-09-01,3,0,9,21,0,4,1,2,0.66,0.6212,0.57,0.194,17,160,177 +5748,2011-09-01,3,0,9,22,0,4,1,2,0.66,0.6212,0.57,0.2239,34,145,179 +5749,2011-09-01,3,0,9,23,0,4,1,1,0.64,0.6061,0.65,0.2537,15,111,126 +5750,2011-09-02,3,0,9,0,0,5,1,1,0.64,0.6061,0.65,0.194,6,58,64 +5751,2011-09-02,3,0,9,1,0,5,1,3,0.62,0.5909,0.73,0.1045,4,28,32 +5752,2011-09-02,3,0,9,2,0,5,1,3,0.62,0.5909,0.73,0.1045,9,11,20 +5753,2011-09-02,3,0,9,3,0,5,1,2,0.6,0.5606,0.83,0.1642,4,4,8 +5754,2011-09-02,3,0,9,4,0,5,1,1,0.6,0.5606,0.83,0.0896,2,2,4 +5755,2011-09-02,3,0,9,5,0,5,1,2,0.6,0.5606,0.83,0.1343,0,20,20 +5756,2011-09-02,3,0,9,6,0,5,1,1,0.6,0.5606,0.83,0.1343,3,73,76 +5757,2011-09-02,3,0,9,7,0,5,1,1,0.6,0.5606,0.83,0.1045,6,253,259 +5758,2011-09-02,3,0,9,8,0,5,1,1,0.62,0.5909,0.78,0.194,22,434,456 
+5759,2011-09-02,3,0,9,9,0,5,1,2,0.62,0.5758,0.83,0.1045,30,190,220 +5760,2011-09-02,3,0,9,10,0,5,1,2,0.64,0.6061,0.76,0.1642,34,106,140 +5761,2011-09-02,3,0,9,11,0,5,1,2,0.66,0.6212,0.74,0.1045,51,141,192 +5762,2011-09-02,3,0,9,12,0,5,1,2,0.66,0.6212,0.69,0.2239,82,178,260 +5763,2011-09-02,3,0,9,13,0,5,1,2,0.68,0.6364,0.65,0.1343,72,207,279 +5764,2011-09-02,3,0,9,14,0,5,1,2,0.7,0.6515,0.61,0.1045,75,208,283 +5765,2011-09-02,3,0,9,15,0,5,1,2,0.7,0.6515,0.61,0.2537,69,277,346 +5766,2011-09-02,3,0,9,16,0,5,1,2,0.7,0.6515,0.61,0.1343,82,299,381 +5767,2011-09-02,3,0,9,17,0,5,1,2,0.68,0.6364,0.65,0.1642,78,377,455 +5768,2011-09-02,3,0,9,18,0,5,1,2,0.68,0.6364,0.65,0.194,50,305,355 +5769,2011-09-02,3,0,9,19,0,5,1,1,0.66,0.6212,0.69,0.1343,67,220,287 +5770,2011-09-02,3,0,9,20,0,5,1,1,0.64,0.6061,0.73,0,38,158,196 +5771,2011-09-02,3,0,9,21,0,5,1,2,0.64,0.6061,0.73,0.0896,28,121,149 +5772,2011-09-02,3,0,9,22,0,5,1,2,0.64,0.6061,0.73,0.1642,33,114,147 +5773,2011-09-02,3,0,9,23,0,5,1,2,0.64,0.6061,0.73,0.1642,30,68,98 +5774,2011-09-03,3,0,9,0,0,6,0,2,0.64,0.6061,0.73,0.1045,22,65,87 +5775,2011-09-03,3,0,9,1,0,6,0,2,0.64,0.6061,0.69,0.2537,17,57,74 +5776,2011-09-03,3,0,9,2,0,6,0,2,0.64,0.6061,0.69,0.2239,15,26,41 +5777,2011-09-03,3,0,9,3,0,6,0,1,0.62,0.5909,0.73,0.1642,17,18,35 +5778,2011-09-03,3,0,9,4,0,6,0,1,0.62,0.5909,0.73,0.1642,3,4,7 +5779,2011-09-03,3,0,9,5,0,6,0,1,0.62,0.6061,0.69,0.194,3,9,12 +5780,2011-09-03,3,0,9,6,0,6,0,1,0.62,0.5909,0.73,0.1642,4,19,23 +5781,2011-09-03,3,0,9,7,0,6,0,1,0.62,0.5909,0.73,0.194,5,33,38 +5782,2011-09-03,3,0,9,8,0,6,0,1,0.64,0.6061,0.69,0.1642,24,65,89 +5783,2011-09-03,3,0,9,9,0,6,0,2,0.66,0.6212,0.69,0.2537,83,118,201 +5784,2011-09-03,3,0,9,10,0,6,0,3,0.66,0.6212,0.65,0.2537,90,168,258 +5785,2011-09-03,3,0,9,11,0,6,0,3,0.66,0.6212,0.69,0.2836,66,128,194 +5786,2011-09-03,3,0,9,12,0,6,0,1,0.7,0.6515,0.61,0.2239,97,160,257 +5787,2011-09-03,3,0,9,13,0,6,0,1,0.7,0.6515,0.65,0.2239,153,200,353 
+5788,2011-09-03,3,0,9,14,0,6,0,2,0.72,0.6818,0.66,0.1642,204,176,380 +5789,2011-09-03,3,0,9,15,0,6,0,1,0.72,0.6818,0.7,0.1642,187,201,388 +5790,2011-09-03,3,0,9,16,0,6,0,1,0.72,0.6818,0.7,0.194,186,188,374 +5791,2011-09-03,3,0,9,17,0,6,0,1,0.72,0.6818,0.7,0.2239,170,201,371 +5792,2011-09-03,3,0,9,18,0,6,0,1,0.72,0.6818,0.7,0.1642,160,179,339 +5793,2011-09-03,3,0,9,19,0,6,0,1,0.7,0.6667,0.74,0.1343,147,148,295 +5794,2011-09-03,3,0,9,20,0,6,0,1,0.7,0.6667,0.79,0.1642,99,120,219 +5795,2011-09-03,3,0,9,21,0,6,0,1,0.68,0.6364,0.83,0.1343,71,93,164 +5796,2011-09-03,3,0,9,22,0,6,0,1,0.68,0.6364,0.83,0.1045,66,96,162 +5797,2011-09-03,3,0,9,23,0,6,0,1,0.66,0.6212,0.85,0.1343,46,77,123 +5798,2011-09-04,3,0,9,0,0,0,0,1,0.66,0.5909,0.89,0.194,33,76,109 +5799,2011-09-04,3,0,9,1,0,0,0,1,0.66,0.6061,0.83,0.1343,37,38,75 +5800,2011-09-04,3,0,9,2,0,0,0,1,0.66,0.6061,0.83,0.194,17,43,60 +5801,2011-09-04,3,0,9,3,0,0,0,1,0.64,0.5758,0.89,0.194,20,23,43 +5802,2011-09-04,3,0,9,4,0,0,0,1,0.64,0.5758,0.83,0.1642,0,4,4 +5803,2011-09-04,3,0,9,5,0,0,0,1,0.64,0.5758,0.83,0.1343,1,5,6 +5804,2011-09-04,3,0,9,6,0,0,0,1,0.64,0.5758,0.84,0.2239,0,3,3 +5805,2011-09-04,3,0,9,7,0,0,0,1,0.66,0.6061,0.78,0.194,10,20,30 +5806,2011-09-04,3,0,9,8,0,0,0,1,0.66,0.6061,0.78,0.1642,21,49,70 +5807,2011-09-04,3,0,9,9,0,0,0,1,0.66,0.6061,0.78,0.1642,87,102,189 +5808,2011-09-04,3,0,9,10,0,0,0,1,0.7,0.6667,0.74,0.1343,150,147,297 +5809,2011-09-04,3,0,9,11,0,0,0,1,0.74,0.697,0.7,0.1642,174,163,337 +5810,2011-09-04,3,0,9,12,0,0,0,1,0.76,0.7273,0.66,0.1642,214,221,435 +5811,2011-09-04,3,0,9,13,0,0,0,2,0.78,0.7424,0.62,0.2836,245,174,419 +5812,2011-09-04,3,0,9,14,0,0,0,1,0.78,0.7576,0.66,0.3284,205,156,361 +5813,2011-09-04,3,0,9,15,0,0,0,1,0.78,0.7576,0.66,0.2836,218,192,410 +5814,2011-09-04,3,0,9,16,0,0,0,1,0.8,0.7727,0.59,0.2239,196,141,337 +5815,2011-09-04,3,0,9,17,0,0,0,1,0.76,0.7273,0.66,0.2239,204,172,376 +5816,2011-09-04,3,0,9,18,0,0,0,1,0.76,0.7273,0.66,0.194,187,169,356 
+5817,2011-09-04,3,0,9,19,0,0,0,1,0.74,0.697,0.7,0.194,178,150,328 +5818,2011-09-04,3,0,9,20,0,0,0,1,0.74,0.697,0.7,0.2985,104,125,229 +5819,2011-09-04,3,0,9,21,0,0,0,1,0.72,0.697,0.74,0.2537,104,103,207 +5820,2011-09-04,3,0,9,22,0,0,0,2,0.72,0.697,0.74,0.2836,70,85,155 +5821,2011-09-04,3,0,9,23,0,0,0,2,0.72,0.6818,0.7,0.1642,46,58,104 +5822,2011-09-05,3,0,9,0,1,1,0,2,0.7,0.6667,0.74,0.2239,31,66,97 +5823,2011-09-05,3,0,9,1,1,1,0,2,0.68,0.6364,0.79,0.1045,19,35,54 +5824,2011-09-05,3,0,9,2,1,1,0,2,0.68,0.6364,0.79,0.1642,17,22,39 +5825,2011-09-05,3,0,9,3,1,1,0,2,0.68,0.6364,0.74,0.2985,4,12,16 +5826,2011-09-05,3,0,9,4,1,1,0,2,0.68,0.6364,0.69,0.2836,3,5,8 +5827,2011-09-05,3,0,9,5,1,1,0,2,0.66,0.6212,0.74,0.1642,2,4,6 +5828,2011-09-05,3,0,9,6,1,1,0,2,0.66,0.6212,0.74,0.1343,6,5,11 +5829,2011-09-05,3,0,9,7,1,1,0,1,0.66,0.6212,0.74,0.1642,11,30,41 +5830,2011-09-05,3,0,9,8,1,1,0,2,0.66,0.6061,0.78,0.194,45,56,101 +5831,2011-09-05,3,0,9,9,1,1,0,3,0.68,0.6364,0.74,0.1642,63,89,152 +5832,2011-09-05,3,0,9,10,1,1,0,2,0.7,0.6515,0.7,0.1642,107,137,244 +5833,2011-09-05,3,0,9,11,1,1,0,2,0.7,0.6667,0.74,0.1642,101,207,308 +5834,2011-09-05,3,0,9,12,1,1,0,2,0.72,0.6818,0.7,0.194,141,212,353 +5835,2011-09-05,3,0,9,13,1,1,0,2,0.74,0.697,0.7,0.1343,154,235,389 +5836,2011-09-05,3,0,9,14,1,1,0,1,0.74,0.697,0.7,0.1642,145,212,357 +5837,2011-09-05,3,0,9,15,1,1,0,2,0.68,0.6364,0.79,0.2836,111,142,253 +5838,2011-09-05,3,0,9,16,1,1,0,2,0.68,0.6364,0.79,0.2836,92,192,284 +5839,2011-09-05,3,0,9,17,1,1,0,2,0.66,0.5909,0.89,0.1045,37,77,114 +5840,2011-09-05,3,0,9,18,1,1,0,1,0.66,0.5909,0.89,0.0896,31,92,123 +5841,2011-09-05,3,0,9,19,1,1,0,3,0.66,0.5909,0.94,0.1045,52,123,175 +5842,2011-09-05,3,0,9,20,1,1,0,3,0.66,0.5909,0.94,0.2537,26,56,82 +5843,2011-09-05,3,0,9,21,1,1,0,2,0.66,0.5909,0.94,0.2985,20,40,60 +5844,2011-09-05,3,0,9,22,1,1,0,2,0.6,0.5455,0.88,0.5821,15,49,64 +5845,2011-09-05,3,0,9,23,1,1,0,3,0.56,0.5303,0.88,0.3881,3,17,20 
+5846,2011-09-06,3,0,9,0,0,2,1,3,0.54,0.5152,0.94,0.3582,1,7,8 +5847,2011-09-06,3,0,9,2,0,2,1,3,0.54,0.5152,0.94,0.2537,0,2,2 +5848,2011-09-06,3,0,9,3,0,2,1,3,0.54,0.5152,0.94,0.2985,1,0,1 +5849,2011-09-06,3,0,9,4,0,2,1,2,0.54,0.5152,0.94,0.2985,1,3,4 +5850,2011-09-06,3,0,9,5,0,2,1,3,0.54,0.5152,0.88,0.3582,1,20,21 +5851,2011-09-06,3,0,9,6,0,2,1,2,0.54,0.5152,0.88,0.3284,0,72,72 +5852,2011-09-06,3,0,9,7,0,2,1,2,0.54,0.5152,0.83,0.3582,6,166,172 +5853,2011-09-06,3,0,9,8,0,2,1,3,0.54,0.5152,0.83,0.3881,15,349,364 +5854,2011-09-06,3,0,9,9,0,2,1,2,0.54,0.5152,0.81,0.4179,18,167,185 +5855,2011-09-06,3,0,9,10,0,2,1,3,0.54,0.5152,0.83,0.3582,16,90,106 +5856,2011-09-06,3,0,9,11,0,2,1,3,0.54,0.5152,0.83,0.3881,11,78,89 +5857,2011-09-06,3,0,9,12,0,2,1,3,0.54,0.5152,0.83,0.3881,16,51,67 +5858,2011-09-06,3,0,9,13,0,2,1,3,0.54,0.5152,0.88,0.2985,5,24,29 +5859,2011-09-06,3,0,9,14,0,2,1,3,0.54,0.5152,0.88,0.2836,3,21,24 +5860,2011-09-06,3,0,9,15,0,2,1,3,0.54,0.5152,0.94,0.3881,18,71,89 +5861,2011-09-06,3,0,9,16,0,2,1,3,0.54,0.5152,0.94,0.3881,11,95,106 +5862,2011-09-06,3,0,9,17,0,2,1,2,0.54,0.5152,0.88,0.3881,15,276,291 +5863,2011-09-06,3,0,9,18,0,2,1,2,0.54,0.5152,0.88,0.3881,22,351,373 +5864,2011-09-06,3,0,9,19,0,2,1,2,0.54,0.5152,0.88,0.4179,12,269,281 +5865,2011-09-06,3,0,9,20,0,2,1,3,0.54,0.5152,0.88,0.2985,13,150,163 +5866,2011-09-06,3,0,9,21,0,2,1,3,0.54,0.5152,0.88,0.2836,12,109,121 +5867,2011-09-06,3,0,9,22,0,2,1,2,0.54,0.5152,0.94,0.3284,5,79,84 +5868,2011-09-06,3,0,9,23,0,2,1,3,0.54,0.5152,0.94,0.2537,2,56,58 +5869,2011-09-07,3,0,9,0,0,3,1,2,0.54,0.5152,0.94,0.2239,1,12,13 +5870,2011-09-07,3,0,9,1,0,3,1,3,0.54,0.5152,0.94,0.2537,2,3,5 +5871,2011-09-07,3,0,9,2,0,3,1,2,0.54,0.5152,0.94,0.2239,2,4,6 +5872,2011-09-07,3,0,9,3,0,3,1,3,0.56,0.5303,0.94,0.1343,1,1,2 +5873,2011-09-07,3,0,9,4,0,3,1,3,0.56,0.5303,0.94,0.1642,0,4,4 +5874,2011-09-07,3,0,9,5,0,3,1,2,0.56,0.5303,1,0.1343,1,15,16 +5875,2011-09-07,3,0,9,6,0,3,1,3,0.6,0.5455,0.88,0.1045,1,74,75 
+5876,2011-09-07,3,0,9,7,0,3,1,3,0.6,0.5455,0.88,0.0896,3,83,86 +5877,2011-09-07,3,0,9,8,0,3,1,3,0.62,0.5455,0.94,0.0896,9,319,328 +5878,2011-09-07,3,0,9,9,0,3,1,3,0.6,0.5455,0.88,0.1343,14,176,190 +5879,2011-09-07,3,0,9,10,0,3,1,3,0.6,0.5455,0.88,0.1343,3,63,66 +5880,2011-09-07,3,0,9,11,0,3,1,3,0.6,0.5152,0.94,0,1,9,10 +5881,2011-09-07,3,0,9,12,0,3,1,3,0.56,0.5303,0.94,0,1,21,22 +5882,2011-09-07,3,0,9,13,0,3,1,3,0.6,0.5455,0.88,0,2,9,11 +5883,2011-09-07,3,0,9,14,0,3,1,3,0.6,0.5152,0.93,0,1,24,25 +5884,2011-09-07,3,0,9,15,0,3,1,3,0.62,0.5455,0.94,0.1045,3,55,58 +5885,2011-09-07,3,0,9,16,0,3,1,1,0.64,0.5758,0.89,0,7,137,144 +5886,2011-09-07,3,0,9,17,0,3,1,3,0.64,0.5758,0.89,0,21,264,285 +5887,2011-09-07,3,0,9,18,0,3,1,3,0.64,0.5758,0.89,0,18,219,237 +5888,2011-09-07,3,0,9,19,0,3,1,2,0.64,0.5758,0.89,0,14,212,226 +5889,2011-09-07,3,0,9,20,0,3,1,3,0.64,0.5758,0.89,0.0896,3,93,96 +5890,2011-09-07,3,0,9,21,0,3,1,3,0.64,0.5758,0.89,0.0896,4,34,38 +5891,2011-09-07,3,0,9,22,0,3,1,3,0.62,0.5455,0.94,0.1642,3,26,29 +5892,2011-09-07,3,0,9,23,0,3,1,3,0.62,0.5455,0.94,0.194,3,21,24 +5893,2011-09-08,3,0,9,0,0,4,1,3,0.6,0.5,1,0.194,3,11,14 +5894,2011-09-08,3,0,9,1,0,4,1,3,0.62,0.5455,0.94,0.1045,0,4,4 +5895,2011-09-08,3,0,9,3,0,4,1,3,0.62,0.5455,0.94,0,0,2,2 +5896,2011-09-08,3,0,9,4,0,4,1,2,0.62,0.5455,0.94,0.0896,0,3,3 +5897,2011-09-08,3,0,9,5,0,4,1,3,0.62,0.5455,0.94,0.0896,1,13,14 +5898,2011-09-08,3,0,9,6,0,4,1,3,0.62,0.5455,0.94,0.1642,1,55,56 +5899,2011-09-08,3,0,9,7,0,4,1,3,0.62,0.5455,0.94,0.1045,7,172,179 +5900,2011-09-08,3,0,9,8,0,4,1,3,0.62,0.5455,0.94,0.1343,7,188,195 +5901,2011-09-08,3,0,9,9,0,4,1,2,0.62,0.5152,1,0.2537,4,65,69 +5902,2011-09-08,3,0,9,10,0,4,1,2,0.64,0.5606,0.94,0.1045,8,57,65 +5903,2011-09-08,3,0,9,11,0,4,1,2,0.64,0.5606,0.94,0.2239,16,82,98 +5904,2011-09-08,3,0,9,12,0,4,1,2,0.66,0.5909,0.94,0.2239,17,85,102 +5905,2011-09-08,3,0,9,13,0,4,1,2,0.68,0.6364,0.83,0.3881,14,112,126 +5906,2011-09-08,3,0,9,14,0,4,1,2,0.7,0.6667,0.79,0.3582,15,105,120 
+5907,2011-09-08,3,0,9,15,0,4,1,3,0.66,0.5909,0.89,0.2985,24,115,139 +5908,2011-09-08,3,0,9,16,0,4,1,3,0.64,0.5606,0.94,0.2836,5,151,156 +5909,2011-09-08,3,0,9,17,0,4,1,3,0.64,0.5606,0.94,0.2836,11,102,113 +5910,2011-09-08,3,0,9,18,0,4,1,3,0.64,0.5606,0.94,0.3582,2,66,68 +5911,2011-09-08,3,0,9,19,0,4,1,3,0.62,0.5152,1,0.3284,1,51,52 +5912,2011-09-08,3,0,9,20,0,4,1,2,0.62,0.5455,0.94,0.1642,6,83,89 +5913,2011-09-08,3,0,9,21,0,4,1,3,0.64,0.5606,0.94,0.1045,6,76,82 +5914,2011-09-08,3,0,9,22,0,4,1,3,0.62,0.5152,1,0.0896,5,65,70 +5915,2011-09-08,3,0,9,23,0,4,1,3,0.62,0.5152,1,0.0896,0,26,26 +5916,2011-09-09,3,0,9,0,0,5,1,3,0.64,0.5606,0.94,0.194,1,15,16 +5917,2011-09-09,3,0,9,1,0,5,1,2,0.62,0.5152,1,0.1642,0,8,8 +5918,2011-09-09,3,0,9,2,0,5,1,2,0.62,0.5152,1,0.2537,1,7,8 +5919,2011-09-09,3,0,9,3,0,5,1,2,0.62,0.5455,0.94,0.2537,1,1,2 +5920,2011-09-09,3,0,9,4,0,5,1,3,0.62,0.5455,0.94,0.1642,0,3,3 +5921,2011-09-09,3,0,9,5,0,5,1,2,0.62,0.5455,0.94,0.1343,0,14,14 +5922,2011-09-09,3,0,9,6,0,5,1,2,0.62,0.5152,1,0.1343,3,54,57 +5923,2011-09-09,3,0,9,7,0,5,1,3,0.62,0.5152,1,0.2537,4,104,108 +5924,2011-09-09,3,0,9,8,0,5,1,3,0.62,0.5455,0.94,0.1642,12,276,288 +5925,2011-09-09,3,0,9,9,0,5,1,3,0.62,0.5455,0.94,0.1642,5,131,136 +5926,2011-09-09,3,0,9,10,0,5,1,2,0.62,0.5152,1,0,2,27,29 +5927,2011-09-09,3,0,9,11,0,5,1,2,0.62,0.5455,0.94,0,6,66,72 +5928,2011-09-09,3,0,9,12,0,5,1,1,0.64,0.5606,0.94,0,4,71,75 +5929,2011-09-09,3,0,9,13,0,5,1,1,0.7,0.6667,0.79,0.0896,14,108,122 +5930,2011-09-09,3,0,9,14,0,5,1,1,0.72,0.697,0.74,0.0896,29,119,148 +5931,2011-09-09,3,0,9,15,0,5,1,1,0.74,0.697,0.66,0,27,161,188 +5932,2011-09-09,3,0,9,16,0,5,1,1,0.74,0.697,0.66,0.1343,41,244,285 +5933,2011-09-09,3,0,9,17,0,5,1,1,0.7,0.6667,0.79,0.2537,54,451,505 +5934,2011-09-09,3,0,9,18,0,5,1,1,0.7,0.6667,0.79,0.194,33,377,410 +5935,2011-09-09,3,0,9,19,0,5,1,1,0.66,0.5909,0.89,0.1642,33,316,349 +5936,2011-09-09,3,0,9,20,0,5,1,1,0.64,0.5606,0.94,0.0896,30,180,210 
+5937,2011-09-09,3,0,9,21,0,5,1,1,0.66,0.5909,0.89,0,49,154,203 +5938,2011-09-09,3,0,9,22,0,5,1,1,0.62,0.5455,0.94,0,33,127,160 +5939,2011-09-09,3,0,9,23,0,5,1,1,0.62,0.5455,0.94,0.0896,35,113,148 +5940,2011-09-10,3,0,9,0,0,6,0,1,0.62,0.5455,0.94,0,32,84,116 +5941,2011-09-10,3,0,9,1,0,6,0,2,0.62,0.5455,0.94,0,16,67,83 +5942,2011-09-10,3,0,9,2,0,6,0,2,0.6,0.5,0.97,0.1045,18,46,64 +5943,2011-09-10,3,0,9,3,0,6,0,1,0.58,0.5455,1,0.1343,8,29,37 +5944,2011-09-10,3,0,9,4,0,6,0,1,0.58,0.5455,0.94,0.0896,3,4,7 +5945,2011-09-10,3,0,9,5,0,6,0,1,0.58,0.5455,0.94,0.1343,2,6,8 +5946,2011-09-10,3,0,9,6,0,6,0,1,0.58,0.5455,0.94,0.1642,0,6,6 +5947,2011-09-10,3,0,9,7,0,6,0,1,0.6,0.5455,0.88,0.1343,9,43,52 +5948,2011-09-10,3,0,9,8,0,6,0,1,0.62,0.5758,0.83,0.1642,16,103,119 +5949,2011-09-10,3,0,9,9,0,6,0,1,0.64,0.5909,0.78,0.2985,39,168,207 +5950,2011-09-10,3,0,9,10,0,6,0,1,0.7,0.6515,0.65,0.2537,85,233,318 +5951,2011-09-10,3,0,9,11,0,6,0,1,0.72,0.6818,0.62,0.2537,108,252,360 +5952,2011-09-10,3,0,9,12,0,6,0,1,0.72,0.6818,0.62,0.2239,144,260,404 +5953,2011-09-10,3,0,9,13,0,6,0,1,0.74,0.6818,0.55,0.2836,131,239,370 +5954,2011-09-10,3,0,9,14,0,6,0,1,0.74,0.6818,0.55,0.2836,129,230,359 +5955,2011-09-10,3,0,9,15,0,6,0,1,0.74,0.6818,0.55,0.2239,217,263,480 +5956,2011-09-10,3,0,9,16,0,6,0,1,0.74,0.6818,0.55,0.194,170,261,431 +5957,2011-09-10,3,0,9,17,0,6,0,1,0.74,0.6818,0.55,0.2239,185,275,460 +5958,2011-09-10,3,0,9,18,0,6,0,1,0.72,0.6667,0.58,0.194,119,241,360 +5959,2011-09-10,3,0,9,19,0,6,0,1,0.7,0.6515,0.58,0.1343,101,214,315 +5960,2011-09-10,3,0,9,20,0,6,0,1,0.66,0.6212,0.69,0.1045,78,167,245 +5961,2011-09-10,3,0,9,21,0,6,0,1,0.64,0.5909,0.78,0.0896,59,171,230 +5962,2011-09-10,3,0,9,22,0,6,0,1,0.64,0.5909,0.78,0,49,126,175 +5963,2011-09-10,3,0,9,23,0,6,0,2,0.62,0.5606,0.88,0,32,107,139 +5964,2011-09-11,3,0,9,0,0,0,0,1,0.62,0.5606,0.88,0,29,79,108 +5965,2011-09-11,3,0,9,1,0,0,0,2,0.62,0.5606,0.88,0,22,66,88 +5966,2011-09-11,3,0,9,2,0,0,0,2,0.62,0.5909,0.78,0.194,16,60,76 
+5967,2011-09-11,3,0,9,3,0,0,0,2,0.62,0.5909,0.78,0.0896,15,30,45 +5968,2011-09-11,3,0,9,4,0,0,0,1,0.6,0.5606,0.83,0.1343,3,6,9 +5969,2011-09-11,3,0,9,5,0,0,0,1,0.6,0.5606,0.83,0.1343,15,24,39 +5970,2011-09-11,3,0,9,6,0,0,0,1,0.58,0.5455,0.88,0.1045,4,16,20 +5971,2011-09-11,3,0,9,7,0,0,0,1,0.6,0.5455,0.88,0,9,28,37 +5972,2011-09-11,3,0,9,8,0,0,0,1,0.64,0.5909,0.78,0,25,69,94 +5973,2011-09-11,3,0,9,9,0,0,0,1,0.66,0.6212,0.74,0,59,168,227 +5974,2011-09-11,3,0,9,10,0,0,0,1,0.7,0.6515,0.65,0,120,214,334 +5975,2011-09-11,3,0,9,11,0,0,0,1,0.7,0.6515,0.65,0,144,258,402 +5976,2011-09-11,3,0,9,12,0,0,0,1,0.7,0.6515,0.65,0,113,298,411 +5977,2011-09-11,3,0,9,13,0,0,0,1,0.74,0.6818,0.58,0.1045,119,231,350 +5978,2011-09-11,3,0,9,14,0,0,0,1,0.72,0.6667,0.58,0.2836,120,222,342 +5979,2011-09-11,3,0,9,15,0,0,0,1,0.74,0.6667,0.51,0.2985,134,270,404 +5980,2011-09-11,3,0,9,16,0,0,0,1,0.72,0.6667,0.51,0.2985,180,303,483 +5981,2011-09-11,3,0,9,17,0,0,0,1,0.72,0.6667,0.51,0.1343,152,228,380 +5982,2011-09-11,3,0,9,18,0,0,0,1,0.68,0.6364,0.65,0.2985,110,229,339 +5983,2011-09-11,3,0,9,19,0,0,0,1,0.64,0.6061,0.69,0.1045,101,232,333 +5984,2011-09-11,3,0,9,20,0,0,0,1,0.62,0.5909,0.73,0.1045,65,161,226 +5985,2011-09-11,3,0,9,21,0,0,0,1,0.64,0.6061,0.69,0,45,114,159 +5986,2011-09-11,3,0,9,22,0,0,0,1,0.62,0.6061,0.69,0.3881,21,74,95 +5987,2011-09-11,3,0,9,23,0,0,0,3,0.58,0.5455,0.78,0.0896,12,33,45 +5988,2011-09-12,3,0,9,0,0,1,1,1,0.56,0.5303,0.88,0.0896,5,11,16 +5989,2011-09-12,3,0,9,1,0,1,1,1,0.56,0.5303,0.88,0,1,11,12 +5990,2011-09-12,3,0,9,2,0,1,1,1,0.56,0.5303,0.88,0,2,0,2 +5991,2011-09-12,3,0,9,4,0,1,1,1,0.54,0.5152,0.94,0.1045,0,4,4 +5992,2011-09-12,3,0,9,5,0,1,1,1,0.56,0.5303,0.83,0,1,23,24 +5993,2011-09-12,3,0,9,6,0,1,1,1,0.56,0.5303,0.83,0,1,108,109 +5994,2011-09-12,3,0,9,7,0,1,1,1,0.58,0.5455,0.83,0,12,300,312 +5995,2011-09-12,3,0,9,8,0,1,1,1,0.6,0.5758,0.78,0.0896,26,382,408 +5996,2011-09-12,3,0,9,9,0,1,1,1,0.64,0.6061,0.69,0,21,155,176 
+5997,2011-09-12,3,0,9,10,0,1,1,1,0.68,0.6364,0.62,0.1642,40,89,129 +5998,2011-09-12,3,0,9,11,0,1,1,1,0.7,0.6515,0.58,0.194,34,131,165 +5999,2011-09-12,3,0,9,12,0,1,1,1,0.72,0.6667,0.51,0.194,37,147,184 +6000,2011-09-12,3,0,9,13,0,1,1,1,0.72,0.6667,0.54,0.1045,43,120,163 +6001,2011-09-12,3,0,9,14,0,1,1,1,0.74,0.6667,0.51,0.1642,60,129,189 +6002,2011-09-12,3,0,9,15,0,1,1,1,0.74,0.6667,0.48,0.0896,60,152,212 +6003,2011-09-12,3,0,9,16,0,1,1,1,0.72,0.6667,0.48,0,60,238,298 +6004,2011-09-12,3,0,9,17,0,1,1,1,0.72,0.6667,0.51,0.1642,75,515,590 +6005,2011-09-12,3,0,9,18,0,1,1,1,0.7,0.6515,0.54,0.194,56,515,571 +6006,2011-09-12,3,0,9,19,0,1,1,1,0.68,0.6364,0.57,0.1343,63,373,436 +6007,2011-09-12,3,0,9,20,0,1,1,1,0.66,0.6212,0.65,0,41,258,299 +6008,2011-09-12,3,0,9,21,0,1,1,1,0.64,0.6061,0.73,0.0896,29,166,195 +6009,2011-09-12,3,0,9,22,0,1,1,1,0.62,0.5758,0.83,0.1642,16,134,150 +6010,2011-09-12,3,0,9,23,0,1,1,1,0.62,0.5758,0.83,0.1045,7,62,69 +6011,2011-09-13,3,0,9,0,0,2,1,1,0.6,0.5455,0.88,0.1045,7,19,26 +6012,2011-09-13,3,0,9,1,0,2,1,1,0.58,0.5455,0.83,0.1045,4,6,10 +6013,2011-09-13,3,0,9,2,0,2,1,1,0.6,0.5758,0.78,0,2,0,2 +6014,2011-09-13,3,0,9,3,0,2,1,1,0.58,0.5455,0.83,0,2,2,4 +6015,2011-09-13,3,0,9,4,0,2,1,1,0.56,0.5303,0.88,0,2,6,8 +6016,2011-09-13,3,0,9,5,0,2,1,1,0.56,0.5303,0.88,0,1,19,20 +6017,2011-09-13,3,0,9,6,0,2,1,1,0.56,0.5303,0.88,0.1045,6,116,122 +6018,2011-09-13,3,0,9,7,0,2,1,1,0.58,0.5455,0.83,0.0896,14,348,362 +6019,2011-09-13,3,0,9,8,0,2,1,1,0.6,0.5606,0.81,0.1045,26,399,425 +6020,2011-09-13,3,0,9,9,0,2,1,1,0.64,0.6061,0.69,0.0896,35,179,214 +6021,2011-09-13,3,0,9,10,0,2,1,1,0.68,0.6364,0.69,0.1343,42,93,135 +6022,2011-09-13,3,0,9,11,0,2,1,1,0.7,0.6515,0.61,0.1642,33,120,153 +6023,2011-09-13,3,0,9,12,0,2,1,1,0.72,0.6667,0.58,0.2239,43,144,187 +6024,2011-09-13,3,0,9,13,0,2,1,1,0.74,0.6667,0.51,0.2836,48,136,184 +6025,2011-09-13,3,0,9,14,0,2,1,1,0.74,0.6667,0.51,0.3284,51,139,190 +6026,2011-09-13,3,0,9,15,0,2,1,1,0.74,0.6667,0.51,0.2985,35,145,180 
+6027,2011-09-13,3,0,9,16,0,2,1,1,0.74,0.6667,0.51,0.2239,41,251,292 +6028,2011-09-13,3,0,9,17,0,2,1,1,0.74,0.6667,0.51,0.2239,72,507,579 +6029,2011-09-13,3,0,9,18,0,2,1,1,0.7,0.6515,0.61,0.1642,67,472,539 +6030,2011-09-13,3,0,9,19,0,2,1,1,0.68,0.6364,0.69,0.1642,55,341,396 +6031,2011-09-13,3,0,9,20,0,2,1,1,0.66,0.6212,0.74,0.194,31,241,272 +6032,2011-09-13,3,0,9,21,0,2,1,1,0.64,0.5909,0.78,0.1642,45,200,245 +6033,2011-09-13,3,0,9,22,0,2,1,1,0.64,0.5909,0.78,0.1343,24,120,144 +6034,2011-09-13,3,0,9,23,0,2,1,1,0.64,0.5909,0.78,0.1045,15,59,74 +6035,2011-09-14,3,0,9,0,0,3,1,1,0.62,0.5909,0.78,0.0896,5,28,33 +6036,2011-09-14,3,0,9,1,0,3,1,1,0.62,0.5909,0.78,0.0896,1,7,8 +6037,2011-09-14,3,0,9,2,0,3,1,1,0.6,0.5606,0.83,0,1,4,5 +6038,2011-09-14,3,0,9,3,0,3,1,1,0.6,0.5455,0.88,0.1343,1,7,8 +6039,2011-09-14,3,0,9,4,0,3,1,1,0.6,0.5606,0.83,0.0896,1,8,9 +6040,2011-09-14,3,0,9,5,0,3,1,1,0.58,0.5455,0.88,0.1045,1,30,31 +6041,2011-09-14,3,0,9,6,0,3,1,1,0.58,0.5455,0.88,0.1045,7,138,145 +6042,2011-09-14,3,0,9,7,0,3,1,1,0.6,0.5606,0.83,0.1045,20,350,370 +6043,2011-09-14,3,0,9,8,0,3,1,1,0.62,0.5758,0.83,0.1642,33,396,429 +6044,2011-09-14,3,0,9,9,0,3,1,1,0.64,0.5909,0.78,0.1642,19,183,202 +6045,2011-09-14,3,0,9,10,0,3,1,1,0.7,0.6515,0.7,0.1343,27,115,142 +6046,2011-09-14,3,0,9,11,0,3,1,1,0.72,0.6818,0.66,0.1642,43,121,164 +6047,2011-09-14,3,0,9,12,0,3,1,1,0.74,0.6818,0.58,0.2239,56,156,212 +6048,2011-09-14,3,0,9,13,0,3,1,1,0.76,0.697,0.55,0.2537,39,139,178 +6049,2011-09-14,3,0,9,14,0,3,1,1,0.78,0.7121,0.52,0.2239,19,105,124 +6050,2011-09-14,3,0,9,15,0,3,1,1,0.78,0.7121,0.49,0.2537,30,146,176 +6051,2011-09-14,3,0,9,16,0,3,1,1,0.76,0.697,0.52,0.2836,36,241,277 +6052,2011-09-14,3,0,9,17,0,3,1,1,0.76,0.697,0.52,0.2836,87,512,599 +6053,2011-09-14,3,0,9,18,0,3,1,1,0.72,0.6818,0.62,0.1642,83,503,586 +6054,2011-09-14,3,0,9,19,0,3,1,2,0.7,0.6515,0.7,0.1045,44,337,381 +6055,2011-09-14,3,0,9,20,0,3,1,2,0.72,0.6667,0.58,0.4179,40,221,261 
+6056,2011-09-14,3,0,9,21,0,3,1,2,0.66,0.6212,0.65,0.2239,24,189,213 +6057,2011-09-14,3,0,9,22,0,3,1,2,0.66,0.6212,0.65,0.1343,20,140,160 +6058,2011-09-14,3,0,9,23,0,3,1,2,0.64,0.6061,0.69,0.1045,10,62,72 +6059,2011-09-15,3,0,9,0,0,4,1,1,0.64,0.6061,0.69,0.1343,3,34,37 +6060,2011-09-15,3,0,9,1,0,4,1,1,0.64,0.6061,0.69,0.1642,0,20,20 +6061,2011-09-15,3,0,9,2,0,4,1,1,0.62,0.5909,0.73,0.1045,0,5,5 +6062,2011-09-15,3,0,9,3,0,4,1,1,0.62,0.5909,0.73,0.0896,3,5,8 +6063,2011-09-15,3,0,9,4,0,4,1,1,0.6,0.5758,0.78,0.1045,2,5,7 +6064,2011-09-15,3,0,9,5,0,4,1,1,0.6,0.5758,0.78,0,1,30,31 +6065,2011-09-15,3,0,9,6,0,4,1,2,0.6,0.5606,0.83,0.2537,5,119,124 +6066,2011-09-15,3,0,9,7,0,4,1,2,0.62,0.5909,0.78,0.194,17,321,338 +6067,2011-09-15,3,0,9,8,0,4,1,1,0.64,0.6061,0.73,0.1343,27,364,391 +6068,2011-09-15,3,0,9,9,0,4,1,2,0.64,0.6061,0.73,0.3582,23,186,209 +6069,2011-09-15,3,0,9,10,0,4,1,2,0.66,0.6212,0.69,0.2537,25,96,121 +6070,2011-09-15,3,0,9,11,0,4,1,2,0.68,0.6364,0.65,0.1343,39,120,159 +6071,2011-09-15,3,0,9,12,0,4,1,2,0.66,0.6212,0.69,0.2985,31,148,179 +6072,2011-09-15,3,0,9,13,0,4,1,2,0.64,0.6061,0.65,0.2836,38,151,189 +6073,2011-09-15,3,0,9,14,0,4,1,2,0.64,0.6061,0.65,0.4179,36,134,170 +6074,2011-09-15,3,0,9,15,0,4,1,2,0.6,0.5909,0.69,0.4179,25,110,135 +6075,2011-09-15,3,0,9,16,0,4,1,2,0.54,0.5152,0.77,0.4627,29,193,222 +6076,2011-09-15,3,0,9,17,0,4,1,3,0.48,0.4697,0.82,0.4627,31,230,261 +6077,2011-09-15,3,0,9,18,0,4,1,3,0.48,0.4697,0.67,0.6119,22,222,244 +6078,2011-09-15,3,0,9,19,0,4,1,1,0.46,0.4545,0.67,0.4627,20,212,232 +6079,2011-09-15,3,0,9,20,0,4,1,1,0.46,0.4545,0.63,0.3284,25,223,248 +6080,2011-09-15,3,0,9,21,0,4,1,1,0.46,0.4545,0.63,0.3284,11,134,145 +6081,2011-09-15,3,0,9,22,0,4,1,1,0.44,0.4394,0.67,0.2836,7,108,115 +6082,2011-09-15,3,0,9,23,0,4,1,1,0.44,0.4394,0.67,0.2239,8,61,69 +6083,2011-09-16,3,0,9,0,0,5,1,1,0.42,0.4242,0.71,0.2239,5,43,48 +6084,2011-09-16,3,0,9,1,0,5,1,1,0.42,0.4242,0.67,0.194,7,19,26 
+6085,2011-09-16,3,0,9,2,0,5,1,1,0.4,0.4091,0.71,0.2836,3,7,10 +6086,2011-09-16,3,0,9,3,0,5,1,1,0.4,0.4091,0.71,0.2537,2,4,6 +6087,2011-09-16,3,0,9,4,0,5,1,1,0.4,0.4091,0.71,0.2836,1,3,4 +6088,2011-09-16,3,0,9,5,0,5,1,1,0.38,0.3939,0.76,0.194,2,30,32 +6089,2011-09-16,3,0,9,6,0,5,1,1,0.38,0.3939,0.71,0.2239,6,87,93 +6090,2011-09-16,3,0,9,7,0,5,1,1,0.4,0.4091,0.71,0.2836,16,283,299 +6091,2011-09-16,3,0,9,8,0,5,1,1,0.42,0.4242,0.67,0.2239,23,386,409 +6092,2011-09-16,3,0,9,9,0,5,1,1,0.46,0.4545,0.55,0.2537,21,189,210 +6093,2011-09-16,3,0,9,10,0,5,1,2,0.5,0.4848,0.51,0.2836,36,104,140 +6094,2011-09-16,3,0,9,11,0,5,1,2,0.5,0.4848,0.51,0.1343,40,139,179 +6095,2011-09-16,3,0,9,12,0,5,1,2,0.52,0.5,0.45,0,39,212,251 +6096,2011-09-16,3,0,9,13,0,5,1,2,0.54,0.5152,0.42,0,55,168,223 +6097,2011-09-16,3,0,9,14,0,5,1,2,0.54,0.5152,0.45,0.1642,49,176,225 +6098,2011-09-16,3,0,9,15,0,5,1,2,0.54,0.5152,0.45,0.2239,38,165,203 +6099,2011-09-16,3,0,9,16,0,5,1,2,0.52,0.5,0.48,0.1642,67,291,358 +6100,2011-09-16,3,0,9,17,0,5,1,2,0.52,0.5,0.48,0.1343,86,480,566 +6101,2011-09-16,3,0,9,18,0,5,1,2,0.52,0.5,0.52,0.1642,55,427,482 +6102,2011-09-16,3,0,9,19,0,5,1,2,0.5,0.4848,0.55,0.1642,59,256,315 +6103,2011-09-16,3,0,9,20,0,5,1,2,0.5,0.4848,0.55,0,41,184,225 +6104,2011-09-16,3,0,9,21,0,5,1,2,0.5,0.4848,0.63,0,39,124,163 +6105,2011-09-16,3,0,9,22,0,5,1,1,0.48,0.4697,0.67,0.0896,32,135,167 +6106,2011-09-16,3,0,9,23,0,5,1,1,0.5,0.4848,0.59,0,20,106,126 +6107,2011-09-17,3,0,9,0,0,6,0,1,0.46,0.4545,0.72,0.1642,28,80,108 +6108,2011-09-17,3,0,9,1,0,6,0,1,0.46,0.4545,0.72,0.1045,28,52,80 +6109,2011-09-17,3,0,9,2,0,6,0,1,0.46,0.4545,0.82,0.0896,18,61,79 +6110,2011-09-17,3,0,9,3,0,6,0,1,0.46,0.4545,0.72,0.1343,7,21,28 +6111,2011-09-17,3,0,9,4,0,6,0,1,0.46,0.4545,0.72,0.1343,1,4,5 +6112,2011-09-17,3,0,9,5,0,6,0,1,0.46,0.4545,0.72,0.1642,2,3,5 +6113,2011-09-17,3,0,9,6,0,6,0,2,0.46,0.4545,0.72,0.2239,5,17,22 +6114,2011-09-17,3,0,9,7,0,6,0,2,0.46,0.4545,0.77,0.194,4,33,37 
+6115,2011-09-17,3,0,9,8,0,6,0,2,0.46,0.4545,0.77,0.2537,27,81,108 +6116,2011-09-17,3,0,9,9,0,6,0,2,0.48,0.4697,0.67,0.2239,32,145,177 +6117,2011-09-17,3,0,9,10,0,6,0,2,0.5,0.4848,0.68,0.2836,72,177,249 +6118,2011-09-17,3,0,9,11,0,6,0,2,0.52,0.5,0.68,0.2239,119,248,367 +6119,2011-09-17,3,0,9,12,0,6,0,2,0.52,0.5,0.67,0.2239,115,257,372 +6120,2011-09-17,3,0,9,13,0,6,0,2,0.52,0.5,0.68,0.194,124,225,349 +6121,2011-09-17,3,0,9,14,0,6,0,2,0.52,0.5,0.72,0.194,123,174,297 +6122,2011-09-17,3,0,9,15,0,6,0,2,0.52,0.5,0.72,0.194,148,206,354 +6123,2011-09-17,3,0,9,16,0,6,0,1,0.54,0.5152,0.68,0.2537,116,189,305 +6124,2011-09-17,3,0,9,17,0,6,0,1,0.52,0.5,0.72,0.2239,141,218,359 +6125,2011-09-17,3,0,9,18,0,6,0,2,0.52,0.5,0.72,0.2239,95,234,329 +6126,2011-09-17,3,0,9,19,0,6,0,1,0.52,0.5,0.68,0.1343,70,186,256 +6127,2011-09-17,3,0,9,20,0,6,0,1,0.52,0.5,0.68,0.1343,43,133,176 +6128,2011-09-17,3,0,9,21,0,6,0,2,0.5,0.4848,0.72,0.194,49,121,170 +6129,2011-09-17,3,0,9,22,0,6,0,1,0.5,0.4848,0.72,0.2239,31,112,143 +6130,2011-09-17,3,0,9,23,0,6,0,1,0.46,0.4545,0.82,0.1642,36,100,136 +6131,2011-09-18,3,0,9,0,0,0,0,1,0.46,0.4545,0.77,0.2239,24,88,112 +6132,2011-09-18,3,0,9,1,0,0,0,1,0.46,0.4545,0.77,0.2239,13,66,79 +6133,2011-09-18,3,0,9,2,0,0,0,1,0.44,0.4394,0.82,0.2537,21,68,89 +6134,2011-09-18,3,0,9,3,0,0,0,1,0.44,0.4394,0.82,0.194,11,25,36 +6135,2011-09-18,3,0,9,4,0,0,0,1,0.44,0.4394,0.77,0.2239,1,0,1 +6136,2011-09-18,3,0,9,5,0,0,0,1,0.44,0.4394,0.77,0.2239,1,5,6 +6137,2011-09-18,3,0,9,6,0,0,0,1,0.44,0.4394,0.77,0.2537,2,10,12 +6138,2011-09-18,3,0,9,7,0,0,0,1,0.46,0.4545,0.72,0.2239,15,29,44 +6139,2011-09-18,3,0,9,8,0,0,0,1,0.46,0.4545,0.72,0.2537,16,53,69 +6140,2011-09-18,3,0,9,9,0,0,0,1,0.48,0.4697,0.67,0.2537,46,94,140 +6141,2011-09-18,3,0,9,10,0,0,0,1,0.5,0.4848,0.72,0.1642,82,178,260 +6142,2011-09-18,3,0,9,11,0,0,0,1,0.52,0.5,0.68,0.2239,116,201,317 +6143,2011-09-18,3,0,9,12,0,0,0,1,0.54,0.5152,0.64,0.1343,135,229,364 
+6144,2011-09-18,3,0,9,13,0,0,0,1,0.56,0.5303,0.6,0.1343,162,214,376 +6145,2011-09-18,3,0,9,14,0,0,0,2,0.58,0.5455,0.56,0.1343,127,184,311 +6146,2011-09-18,3,0,9,15,0,0,0,2,0.58,0.5455,0.6,0.1343,132,233,365 +6147,2011-09-18,3,0,9,16,0,0,0,2,0.56,0.5303,0.64,0.1642,134,235,369 +6148,2011-09-18,3,0,9,17,0,0,0,2,0.56,0.5303,0.64,0.194,95,230,325 +6149,2011-09-18,3,0,9,18,0,0,0,2,0.56,0.5303,0.6,0.1045,66,223,289 +6150,2011-09-18,3,0,9,19,0,0,0,2,0.54,0.5152,0.68,0.1045,54,191,245 +6151,2011-09-18,3,0,9,20,0,0,0,2,0.54,0.5152,0.68,0.0896,44,136,180 +6152,2011-09-18,3,0,9,21,0,0,0,2,0.54,0.5152,0.68,0.0896,37,110,147 +6153,2011-09-18,3,0,9,22,0,0,0,2,0.54,0.5152,0.68,0.0896,9,75,84 +6154,2011-09-18,3,0,9,23,0,0,0,1,0.54,0.5152,0.68,0.194,10,44,54 +6155,2011-09-19,3,0,9,0,0,1,1,2,0.52,0.5,0.72,0.1642,14,23,37 +6156,2011-09-19,3,0,9,1,0,1,1,2,0.52,0.5,0.72,0.1045,3,7,10 +6157,2011-09-19,3,0,9,2,0,1,1,2,0.52,0.5,0.72,0.1343,6,7,13 +6158,2011-09-19,3,0,9,3,0,1,1,2,0.5,0.4848,0.77,0.1343,1,4,5 +6159,2011-09-19,3,0,9,4,0,1,1,2,0.5,0.4848,0.77,0.1642,2,6,8 +6160,2011-09-19,3,0,9,5,0,1,1,2,0.5,0.4848,0.77,0.194,2,26,28 +6161,2011-09-19,3,0,9,6,0,1,1,1,0.5,0.4848,0.77,0.2239,6,107,113 +6162,2011-09-19,3,0,9,7,0,1,1,2,0.5,0.4848,0.82,0.2239,20,312,332 +6163,2011-09-19,3,0,9,8,0,1,1,2,0.52,0.5,0.77,0.1343,29,391,420 +6164,2011-09-19,3,0,9,9,0,1,1,2,0.54,0.5152,0.68,0.1642,32,183,215 +6165,2011-09-19,3,0,9,10,0,1,1,2,0.56,0.5303,0.64,0.1343,21,84,105 +6166,2011-09-19,3,0,9,11,0,1,1,2,0.58,0.5455,0.64,0.1642,41,98,139 +6167,2011-09-19,3,0,9,12,0,1,1,2,0.58,0.5455,0.6,0.0896,51,138,189 +6168,2011-09-19,3,0,9,13,0,1,1,2,0.6,0.6061,0.6,0,53,124,177 +6169,2011-09-19,3,0,9,14,0,1,1,2,0.6,0.6061,0.6,0.194,45,143,188 +6170,2011-09-19,3,0,9,15,0,1,1,2,0.6,0.6061,0.6,0.2239,44,143,187 +6171,2011-09-19,3,0,9,16,0,1,1,2,0.6,0.6061,0.6,0.1642,55,208,263 +6172,2011-09-19,3,0,9,17,0,1,1,2,0.58,0.5455,0.64,0.1343,85,483,568 +6173,2011-09-19,3,0,9,18,0,1,1,2,0.56,0.5303,0.68,0.1343,56,484,540 
+6174,2011-09-19,3,0,9,19,0,1,1,2,0.56,0.5303,0.68,0.1045,41,333,374 +6175,2011-09-19,3,0,9,20,0,1,1,2,0.56,0.5303,0.68,0.1343,27,204,231 +6176,2011-09-19,3,0,9,21,0,1,1,2,0.56,0.5303,0.68,0.1642,19,181,200 +6177,2011-09-19,3,0,9,22,0,1,1,2,0.56,0.5303,0.68,0.1642,25,104,129 +6178,2011-09-19,3,0,9,23,0,1,1,2,0.56,0.5303,0.73,0.194,13,55,68 +6179,2011-09-20,3,0,9,0,0,2,1,2,0.56,0.5303,0.73,0.1642,4,21,25 +6180,2011-09-20,3,0,9,1,0,2,1,2,0.54,0.5152,0.88,0.1642,3,11,14 +6181,2011-09-20,3,0,9,2,0,2,1,2,0.54,0.5152,0.88,0.1642,1,4,5 +6182,2011-09-20,3,0,9,3,0,2,1,2,0.54,0.5152,0.83,0.2239,0,3,3 +6183,2011-09-20,3,0,9,4,0,2,1,1,0.54,0.5152,0.88,0.194,2,4,6 +6184,2011-09-20,3,0,9,5,0,2,1,2,0.54,0.5152,0.88,0.194,1,21,22 +6185,2011-09-20,3,0,9,6,0,2,1,2,0.54,0.5152,0.88,0.194,3,111,114 +6186,2011-09-20,3,0,9,7,0,2,1,2,0.54,0.5152,0.88,0.1642,23,306,329 +6187,2011-09-20,3,0,9,8,0,2,1,3,0.54,0.5152,0.94,0.2537,13,196,209 +6188,2011-09-20,3,0,9,9,0,2,1,3,0.54,0.5152,0.94,0.2239,5,64,69 +6189,2011-09-20,3,0,9,10,0,2,1,3,0.56,0.5303,0.88,0.1642,4,26,30 +6190,2011-09-20,3,0,9,11,0,2,1,3,0.56,0.5303,0.88,0.1642,8,48,56 +6191,2011-09-20,3,0,9,12,0,2,1,2,0.56,0.5303,0.94,0.1642,13,47,60 +6192,2011-09-20,3,0,9,13,0,2,1,2,0.56,0.5303,0.94,0.1343,22,81,103 +6193,2011-09-20,3,0,9,14,0,2,1,2,0.58,0.5455,0.88,0.194,27,85,112 +6194,2011-09-20,3,0,9,15,0,2,1,2,0.62,0.5909,0.78,0.1045,34,158,192 +6195,2011-09-20,3,0,9,16,0,2,1,2,0.6,0.5606,0.83,0.1642,23,241,264 +6196,2011-09-20,3,0,9,17,0,2,1,1,0.6,0.5606,0.83,0.1045,66,445,511 +6197,2011-09-20,3,0,9,18,0,2,1,1,0.6,0.5606,0.83,0.1045,39,453,492 +6198,2011-09-20,3,0,9,19,0,2,1,1,0.58,0.5455,0.88,0,43,294,337 +6199,2011-09-20,3,0,9,20,0,2,1,1,0.56,0.5303,0.94,0,41,212,253 +6200,2011-09-20,3,0,9,21,0,2,1,1,0.56,0.5303,0.94,0,29,177,206 +6201,2011-09-20,3,0,9,22,0,2,1,1,0.56,0.5303,0.94,0,23,125,148 +6202,2011-09-20,3,0,9,23,0,2,1,1,0.56,0.5303,0.94,0,11,70,81 +6203,2011-09-21,3,0,9,0,0,3,1,1,0.54,0.5152,1,0,7,20,27 
+6204,2011-09-21,3,0,9,1,0,3,1,1,0.54,0.5152,0.94,0,0,17,17 +6205,2011-09-21,3,0,9,2,0,3,1,1,0.54,0.5152,0.94,0,1,5,6 +6206,2011-09-21,3,0,9,3,0,3,1,2,0.54,0.5152,0.94,0.0896,0,6,6 +6207,2011-09-21,3,0,9,4,0,3,1,2,0.54,0.5152,1,0.1343,2,5,7 +6208,2011-09-21,3,0,9,5,0,3,1,2,0.54,0.5152,1,0.1642,1,30,31 +6209,2011-09-21,3,0,9,6,0,3,1,2,0.54,0.5152,1,0.194,5,112,117 +6210,2011-09-21,3,0,9,7,0,3,1,2,0.54,0.5152,1,0.194,18,312,330 +6211,2011-09-21,3,0,9,8,0,3,1,2,0.54,0.5152,1,0.1045,18,444,462 +6212,2011-09-21,3,0,9,9,0,3,1,2,0.56,0.5303,1,0.0896,21,187,208 +6213,2011-09-21,3,0,9,10,0,3,1,2,0.6,0.5455,0.88,0,30,103,133 +6214,2011-09-21,3,0,9,11,0,3,1,2,0.62,0.5758,0.83,0.1642,42,138,180 +6215,2011-09-21,3,0,9,12,0,3,1,2,0.64,0.5909,0.78,0.0896,42,151,193 +6216,2011-09-21,3,0,9,13,0,3,1,2,0.66,0.6212,0.74,0.1642,37,144,181 +6217,2011-09-21,3,0,9,14,0,3,1,2,0.66,0.6212,0.74,0.194,40,139,179 +6218,2011-09-21,3,0,9,15,0,3,1,2,0.66,0.6212,0.74,0.1045,43,142,185 +6219,2011-09-21,3,0,9,16,0,3,1,2,0.66,0.6212,0.74,0.1045,51,230,281 +6220,2011-09-21,3,0,9,17,0,3,1,3,0.66,0.6212,0.74,0.1642,61,475,536 +6221,2011-09-21,3,0,9,18,0,3,1,3,0.64,0.5758,0.83,0.1642,24,384,408 +6222,2011-09-21,3,0,9,19,0,3,1,3,0.62,0.5455,0.94,0,30,253,283 +6223,2011-09-21,3,0,9,20,0,3,1,3,0.62,0.5455,0.94,0,11,149,160 +6224,2011-09-21,3,0,9,21,0,3,1,2,0.6,0.5,1,0.0896,25,175,200 +6225,2011-09-21,3,0,9,22,0,3,1,2,0.62,0.5455,0.94,0,15,112,127 +6226,2011-09-21,3,0,9,23,0,3,1,1,0.6,0.5152,0.94,0.1045,15,80,95 +6227,2011-09-22,3,0,9,0,0,4,1,1,0.6,0.5152,0.94,0.1045,11,30,41 +6228,2011-09-22,3,0,9,1,0,4,1,2,0.6,0.5152,0.94,0.0896,5,6,11 +6229,2011-09-22,3,0,9,2,0,4,1,2,0.6,0.5152,0.94,0.1045,2,8,10 +6230,2011-09-22,3,0,9,3,0,4,1,2,0.6,0.5152,0.94,0.0896,5,7,12 +6231,2011-09-22,3,0,9,4,0,4,1,2,0.6,0.5152,0.94,0.0896,0,2,2 +6232,2011-09-22,3,0,9,5,0,4,1,2,0.6,0.5152,0.94,0.1045,2,28,30 +6233,2011-09-22,3,0,9,6,0,4,1,2,0.6,0.5,1,0.1343,7,94,101 +6234,2011-09-22,3,0,9,7,0,4,1,2,0.6,0.5,1,0.1045,16,295,311 
+6235,2011-09-22,3,0,9,8,0,4,1,2,0.6,0.5,1,0.1642,26,389,415 +6236,2011-09-22,3,0,9,9,0,4,1,2,0.62,0.5455,0.94,0.194,35,168,203 +6237,2011-09-22,3,0,9,10,0,4,1,2,0.64,0.5758,0.89,0.2239,19,101,120 +6238,2011-09-22,3,0,9,11,0,4,1,2,0.64,0.5758,0.89,0.194,23,142,165 +6239,2011-09-22,3,0,9,12,0,4,1,2,0.66,0.6061,0.83,0.1642,25,151,176 +6240,2011-09-22,3,0,9,13,0,4,1,2,0.66,0.6061,0.83,0.194,38,155,193 +6241,2011-09-22,3,0,9,14,0,4,1,2,0.66,0.6061,0.83,0.2239,41,142,183 +6242,2011-09-22,3,0,9,15,0,4,1,2,0.68,0.6364,0.79,0.1642,32,149,181 +6243,2011-09-22,3,0,9,16,0,4,1,2,0.68,0.6364,0.74,0.1343,39,262,301 +6244,2011-09-22,3,0,9,17,0,4,1,2,0.66,0.6061,0.78,0.0896,50,513,563 +6245,2011-09-22,3,0,9,18,0,4,1,2,0.64,0.5758,0.83,0.1045,50,501,551 +6246,2011-09-22,3,0,9,19,0,4,1,2,0.64,0.5758,0.89,0.0896,32,388,420 +6247,2011-09-22,3,0,9,20,0,4,1,2,0.62,0.5455,0.94,0.1343,35,250,285 +6248,2011-09-22,3,0,9,21,0,4,1,2,0.64,0.5758,0.89,0.0896,27,194,221 +6249,2011-09-22,3,0,9,22,0,4,1,2,0.62,0.5152,1,0.0896,21,166,187 +6250,2011-09-22,3,0,9,23,0,4,1,2,0.62,0.5455,0.94,0,14,99,113 +6251,2011-09-23,4,0,9,0,0,5,1,2,0.62,0.5455,0.94,0.0896,11,41,52 +6252,2011-09-23,4,0,9,1,0,5,1,2,0.6,0.5,1,0,2,29,31 +6253,2011-09-23,4,0,9,2,0,5,1,2,0.6,0.5,1,0.1045,6,14,20 +6254,2011-09-23,4,0,9,3,0,5,1,2,0.6,0.5,1,0,3,5,8 +6255,2011-09-23,4,0,9,4,0,5,1,3,0.6,0.5,1,0,6,7,13 +6256,2011-09-23,4,0,9,5,0,5,1,2,0.62,0.5455,0.94,0,2,20,22 +6257,2011-09-23,4,0,9,6,0,5,1,2,0.62,0.5455,0.94,0.0896,5,99,104 +6258,2011-09-23,4,0,9,7,0,5,1,3,0.62,0.5455,0.94,0.1343,14,240,254 +6259,2011-09-23,4,0,9,8,0,5,1,3,0.62,0.5455,0.94,0.1343,20,297,317 +6260,2011-09-23,4,0,9,9,0,5,1,3,0.62,0.5152,1,0.1343,3,108,111 +6261,2011-09-23,4,0,9,10,0,5,1,3,0.62,0.5152,1,0.1045,7,28,35 +6262,2011-09-23,4,0,9,11,0,5,1,3,0.62,0.5152,1,0.1045,1,20,21 +6263,2011-09-23,4,0,9,12,0,5,1,3,0.62,0.5455,0.94,0.1045,1,28,29 +6264,2011-09-23,4,0,9,13,0,5,1,3,0.6,0.5,1,0.0896,7,27,34 +6265,2011-09-23,4,0,9,14,0,5,1,3,0.6,0.5,1,0.0896,3,21,24 
+6266,2011-09-23,4,0,9,15,0,5,1,3,0.62,0.5455,0.94,0.1045,9,52,61 +6267,2011-09-23,4,0,9,16,0,5,1,3,0.62,0.5455,0.94,0,5,51,56 +6268,2011-09-23,4,0,9,17,0,5,1,2,0.6,0.5,1,0,13,86,99 +6269,2011-09-23,4,0,9,18,0,5,1,1,0.6,0.5,1,0,14,220,234 +6270,2011-09-23,4,0,9,19,0,5,1,1,0.62,0.5455,0.94,0.1343,16,232,248 +6271,2011-09-23,4,0,9,20,0,5,1,1,0.6,0.5152,0.94,0,21,158,179 +6272,2011-09-23,4,0,9,21,0,5,1,1,0.6,0.5,1,0.1642,17,121,138 +6273,2011-09-23,4,0,9,22,0,5,1,1,0.6,0.5152,0.94,0.1642,31,126,157 +6274,2011-09-23,4,0,9,23,0,5,1,2,0.58,0.5455,1,0.1343,41,107,148 +6275,2011-09-24,4,0,9,0,0,6,0,1,0.58,0.5455,1,0.0896,7,86,93 +6276,2011-09-24,4,0,9,1,0,6,0,2,0.58,0.5455,0.94,0.0896,18,61,79 +6277,2011-09-24,4,0,9,2,0,6,0,2,0.58,0.5455,0.94,0.0896,14,54,68 +6278,2011-09-24,4,0,9,3,0,6,0,1,0.58,0.5455,0.94,0,4,27,31 +6279,2011-09-24,4,0,9,4,0,6,0,1,0.56,0.5303,0.94,0.1045,1,7,8 +6280,2011-09-24,4,0,9,5,0,6,0,2,0.56,0.5303,0.94,0.1045,2,3,5 +6281,2011-09-24,4,0,9,6,0,6,0,2,0.56,0.5303,0.94,0.1045,6,12,18 +6282,2011-09-24,4,0,9,7,0,6,0,2,0.56,0.5303,0.88,0.1045,12,34,46 +6283,2011-09-24,4,0,9,8,0,6,0,2,0.58,0.5455,0.9,0.0896,19,102,121 +6284,2011-09-24,4,0,9,9,0,6,0,2,0.6,0.5455,0.88,0.0896,41,134,175 +6285,2011-09-24,4,0,9,10,0,6,0,2,0.6,0.5455,0.88,0.0896,90,203,293 +6286,2011-09-24,4,0,9,11,0,6,0,2,0.62,0.5758,0.83,0.0896,118,268,386 +6287,2011-09-24,4,0,9,12,0,6,0,2,0.62,0.5758,0.83,0.0896,141,266,407 +6288,2011-09-24,4,0,9,13,0,6,0,1,0.64,0.5909,0.78,0,170,290,460 +6289,2011-09-24,4,0,9,14,0,6,0,2,0.66,0.6212,0.74,0.1343,164,226,390 +6290,2011-09-24,4,0,9,15,0,6,0,2,0.66,0.6212,0.74,0.1343,168,284,452 +6291,2011-09-24,4,0,9,16,0,6,0,2,0.64,0.5909,0.78,0.0896,167,282,449 +6292,2011-09-24,4,0,9,17,0,6,0,1,0.66,0.6212,0.74,0.1045,180,246,426 +6293,2011-09-24,4,0,9,18,0,6,0,1,0.64,0.5909,0.78,0.1045,157,243,400 +6294,2011-09-24,4,0,9,19,0,6,0,1,0.62,0.5606,0.88,0,77,231,308 +6295,2011-09-24,4,0,9,20,0,6,0,1,0.62,0.5758,0.83,0,77,166,243 
+6296,2011-09-24,4,0,9,21,0,6,0,1,0.62,0.5758,0.83,0.0896,66,156,222 +6297,2011-09-24,4,0,9,22,0,6,0,2,0.62,0.5606,0.88,0,50,143,193 +6298,2011-09-24,4,0,9,23,0,6,0,2,0.6,0.5455,0.88,0.0896,27,123,150 +6299,2011-09-25,4,0,9,0,0,0,0,2,0.6,0.5455,0.88,0,37,136,173 +6300,2011-09-25,4,0,9,1,0,0,0,2,0.6,0.5455,0.88,0.1045,31,93,124 +6301,2011-09-25,4,0,9,2,0,0,0,3,0.6,0.5152,0.94,0,27,79,106 +6302,2011-09-25,4,0,9,3,0,0,0,3,0.6,0.5152,0.94,0,20,39,59 +6303,2011-09-25,4,0,9,4,0,0,0,3,0.6,0.5152,0.94,0.1343,3,5,8 +6304,2011-09-25,4,0,9,5,0,0,0,1,0.6,0.5455,0.88,0.0896,1,2,3 +6305,2011-09-25,4,0,9,6,0,0,0,2,0.6,0.5455,0.88,0.1045,4,13,17 +6306,2011-09-25,4,0,9,7,0,0,0,2,0.6,0.5152,0.94,0.1343,15,23,38 +6307,2011-09-25,4,0,9,8,0,0,0,2,0.62,0.5606,0.88,0,27,59,86 +6308,2011-09-25,4,0,9,9,0,0,0,2,0.64,0.5758,0.83,0,55,100,155 +6309,2011-09-25,4,0,9,10,0,0,0,2,0.64,0.5758,0.83,0,85,190,275 +6310,2011-09-25,4,0,9,11,0,0,0,2,0.66,0.6061,0.78,0.1343,131,230,361 +6311,2011-09-25,4,0,9,12,0,0,0,2,0.66,0.6061,0.78,0,143,270,413 +6312,2011-09-25,4,0,9,13,0,0,0,2,0.68,0.6364,0.74,0,122,247,369 +6313,2011-09-25,4,0,9,14,0,0,0,2,0.66,0.6061,0.78,0,116,231,347 +6314,2011-09-25,4,0,9,15,0,0,0,2,0.66,0.6061,0.78,0,119,234,353 +6315,2011-09-25,4,0,9,16,0,0,0,1,0.7,0.6515,0.7,0,144,320,464 +6316,2011-09-25,4,0,9,17,0,0,0,1,0.68,0.6364,0.74,0,145,275,420 +6317,2011-09-25,4,0,9,18,0,0,0,1,0.68,0.6364,0.74,0,125,279,404 +6318,2011-09-25,4,0,9,19,0,0,0,1,0.64,0.5758,0.83,0.1343,87,242,329 +6319,2011-09-25,4,0,9,20,0,0,0,1,0.64,0.5758,0.83,0.0896,51,135,186 +6320,2011-09-25,4,0,9,21,0,0,0,1,0.62,0.5455,0.94,0.0896,24,113,137 +6321,2011-09-25,4,0,9,22,0,0,0,1,0.62,0.5606,0.88,0.0896,20,84,104 +6322,2011-09-25,4,0,9,23,0,0,0,2,0.62,0.5455,0.94,0.1045,12,67,79 +6323,2011-09-26,4,0,9,0,0,1,1,2,0.62,0.5455,0.94,0,6,23,29 +6324,2011-09-26,4,0,9,1,0,1,1,2,0.62,0.5606,0.88,0.1045,3,14,17 +6325,2011-09-26,4,0,9,2,0,1,1,2,0.62,0.5606,0.88,0.1045,4,4,8 
+6326,2011-09-26,4,0,9,3,0,1,1,2,0.62,0.5455,0.94,0,0,5,5 +6327,2011-09-26,4,0,9,4,0,1,1,2,0.62,0.5455,0.94,0.0896,2,4,6 +6328,2011-09-26,4,0,9,5,0,1,1,2,0.62,0.5606,0.88,0,0,25,25 +6329,2011-09-26,4,0,9,6,0,1,1,2,0.62,0.5455,0.94,0.0896,6,107,113 +6330,2011-09-26,4,0,9,7,0,1,1,2,0.62,0.5455,0.94,0,17,315,332 +6331,2011-09-26,4,0,9,8,0,1,1,2,0.62,0.5455,0.94,0,29,326,355 +6332,2011-09-26,4,0,9,9,0,1,1,2,0.64,0.5758,0.89,0.194,44,161,205 +6333,2011-09-26,4,0,9,10,0,1,1,2,0.64,0.5758,0.89,0.194,37,117,154 +6334,2011-09-26,4,0,9,11,0,1,1,1,0.68,0.6364,0.79,0.2239,43,116,159 +6335,2011-09-26,4,0,9,12,0,1,1,2,0.68,0.6364,0.79,0.1642,48,161,209 +6336,2011-09-26,4,0,9,13,0,1,1,2,0.7,0.6667,0.74,0.1642,48,139,187 +6337,2011-09-26,4,0,9,14,0,1,1,1,0.7,0.6515,0.7,0.1642,44,110,154 +6338,2011-09-26,4,0,9,15,0,1,1,1,0.72,0.6818,0.62,0.1343,52,157,209 +6339,2011-09-26,4,0,9,16,0,1,1,1,0.7,0.6515,0.65,0.1642,74,239,313 +6340,2011-09-26,4,0,9,17,0,1,1,1,0.7,0.6515,0.7,0.1343,61,509,570 +6341,2011-09-26,4,0,9,18,0,1,1,1,0.66,0.6061,0.83,0.1045,63,490,553 +6342,2011-09-26,4,0,9,19,0,1,1,1,0.66,0.6061,0.82,0.0896,36,347,383 +6343,2011-09-26,4,0,9,20,0,1,1,1,0.64,0.5758,0.89,0.1343,24,227,251 +6344,2011-09-26,4,0,9,21,0,1,1,1,0.64,0.5758,0.89,0.1343,18,164,182 +6345,2011-09-26,4,0,9,22,0,1,1,1,0.62,0.5455,0.94,0.1343,14,122,136 +6346,2011-09-26,4,0,9,23,0,1,1,1,0.62,0.5455,0.94,0.1343,11,64,75 +6347,2011-09-27,4,0,9,0,0,2,1,1,0.62,0.5455,0.94,0.1343,9,28,37 +6348,2011-09-27,4,0,9,1,0,2,1,1,0.62,0.5455,0.94,0.1642,4,6,10 +6349,2011-09-27,4,0,9,2,0,2,1,1,0.62,0.5455,0.94,0.1045,7,4,11 +6350,2011-09-27,4,0,9,3,0,2,1,2,0.62,0.5455,0.94,0,5,5,10 +6351,2011-09-27,4,0,9,4,0,2,1,3,0.62,0.5455,0.94,0.0896,5,3,8 +6352,2011-09-27,4,0,9,5,0,2,1,3,0.62,0.5455,0.94,0.0896,0,24,24 +6353,2011-09-27,4,0,9,6,0,2,1,2,0.62,0.5455,0.94,0.1045,8,116,124 +6354,2011-09-27,4,0,9,7,0,2,1,2,0.62,0.5455,0.94,0.1045,15,234,249 +6355,2011-09-27,4,0,9,8,0,2,1,2,0.64,0.5758,0.89,0.1045,27,400,427 
+6356,2011-09-27,4,0,9,9,0,2,1,3,0.64,0.5758,0.89,0.194,20,173,193 +6357,2011-09-27,4,0,9,10,0,2,1,3,0.64,0.5606,0.94,0.1642,21,101,122 +6358,2011-09-27,4,0,9,11,0,2,1,2,0.66,0.5909,0.89,0.1045,26,97,123 +6359,2011-09-27,4,0,9,12,0,2,1,2,0.66,0.5909,0.89,0.1343,33,129,162 +6360,2011-09-27,4,0,9,13,0,2,1,2,0.7,0.6667,0.74,0.2239,29,122,151 +6361,2011-09-27,4,0,9,14,0,2,1,2,0.68,0.6364,0.79,0.1045,29,123,152 +6362,2011-09-27,4,0,9,15,0,2,1,2,0.68,0.6364,0.83,0.0896,35,170,205 +6363,2011-09-27,4,0,9,16,0,2,1,2,0.66,0.6061,0.83,0.194,33,225,258 +6364,2011-09-27,4,0,9,17,0,2,1,2,0.66,0.5909,0.89,0.194,50,480,530 +6365,2011-09-27,4,0,9,18,0,2,1,2,0.64,0.5758,0.89,0.2239,31,325,356 +6366,2011-09-27,4,0,9,19,0,2,1,2,0.64,0.5758,0.89,0.1343,26,294,320 +6367,2011-09-27,4,0,9,20,0,2,1,1,0.62,0.5909,0.78,0,17,221,238 +6368,2011-09-27,4,0,9,21,0,2,1,1,0.6,0.5606,0.83,0,17,179,196 +6369,2011-09-27,4,0,9,22,0,2,1,1,0.6,0.5455,0.88,0.0896,19,102,121 +6370,2011-09-27,4,0,9,23,0,2,1,1,0.6,0.5455,0.88,0.0896,11,82,93 +6371,2011-09-28,4,0,9,0,0,3,1,1,0.6,0.5455,0.88,0,7,29,36 +6372,2011-09-28,4,0,9,1,0,3,1,1,0.6,0.5455,0.88,0,1,13,14 +6373,2011-09-28,4,0,9,2,0,3,1,1,0.6,0.5455,0.88,0.2537,0,9,9 +6374,2011-09-28,4,0,9,3,0,3,1,2,0.6,0.5152,0.94,0.0896,0,3,3 +6375,2011-09-28,4,0,9,4,0,3,1,1,0.6,0.5455,0.88,0,1,3,4 +6376,2011-09-28,4,0,9,5,0,3,1,2,0.6,0.5152,0.94,0,2,20,22 +6377,2011-09-28,4,0,9,6,0,3,1,3,0.6,0.5152,0.94,0.1642,3,111,114 +6378,2011-09-28,4,0,9,7,0,3,1,3,0.6,0.5152,0.94,0.1642,8,177,185 +6379,2011-09-28,4,0,9,8,0,3,1,3,0.62,0.5455,0.94,0.2239,17,270,287 +6380,2011-09-28,4,0,9,9,0,3,1,3,0.62,0.5455,0.94,0.2239,14,141,155 +6381,2011-09-28,4,0,9,10,0,3,1,2,0.62,0.5455,0.94,0,20,109,129 +6382,2011-09-28,4,0,9,11,0,3,1,2,0.66,0.6061,0.83,0.1045,24,127,151 +6383,2011-09-28,4,0,9,12,0,3,1,2,0.68,0.6364,0.79,0,25,159,184 +6384,2011-09-28,4,0,9,13,0,3,1,1,0.7,0.6667,0.74,0.2537,24,138,162 +6385,2011-09-28,4,0,9,14,0,3,1,1,0.72,0.6818,0.62,0.2537,25,145,170 
+6386,2011-09-28,4,0,9,15,0,3,1,1,0.7,0.6515,0.7,0.1343,35,146,181 +6387,2011-09-28,4,0,9,16,0,3,1,1,0.7,0.6515,0.7,0.194,39,241,280 +6388,2011-09-28,4,0,9,17,0,3,1,1,0.7,0.6515,0.7,0.2239,64,527,591 +6389,2011-09-28,4,0,9,18,0,3,1,3,0.66,0.6061,0.83,0.2537,65,459,524 +6390,2011-09-28,4,0,9,19,0,3,1,3,0.66,0.6061,0.83,0.2537,57,315,372 +6391,2011-09-28,4,0,9,20,0,3,1,3,0.6,0.5606,0.83,0.2985,11,91,102 +6392,2011-09-28,4,0,9,21,0,3,1,3,0.6,0.5455,0.88,0.2537,7,67,74 +6393,2011-09-28,4,0,9,22,0,3,1,1,0.6,0.5455,0.88,0.0896,19,68,87 +6394,2011-09-28,4,0,9,23,0,3,1,1,0.6,0.5152,0.94,0.1343,12,59,71 +6395,2011-09-29,4,0,9,0,0,4,1,1,0.6,0.5152,0.94,0.194,7,33,40 +6396,2011-09-29,4,0,9,1,0,4,1,2,0.6,0.5,1,0.194,3,20,23 +6397,2011-09-29,4,0,9,2,0,4,1,2,0.6,0.5,1,0.194,0,2,2 +6398,2011-09-29,4,0,9,3,0,4,1,2,0.6,0.5152,0.94,0.194,0,8,8 +6399,2011-09-29,4,0,9,4,0,4,1,2,0.6,0.5455,0.88,0.194,2,5,7 +6400,2011-09-29,4,0,9,5,0,4,1,2,0.6,0.5455,0.88,0.1045,0,22,22 +6401,2011-09-29,4,0,9,6,0,4,1,2,0.6,0.5455,0.88,0.0896,5,106,111 +6402,2011-09-29,4,0,9,7,0,4,1,1,0.6,0.5455,0.88,0.1343,14,269,283 +6403,2011-09-29,4,0,9,8,0,4,1,1,0.6,0.5606,0.83,0,29,402,431 +6404,2011-09-29,4,0,9,9,0,4,1,1,0.62,0.5909,0.78,0.2537,27,198,225 +6405,2011-09-29,4,0,9,10,0,4,1,1,0.64,0.6061,0.65,0.194,24,103,127 +6406,2011-09-29,4,0,9,11,0,4,1,1,0.62,0.6061,0.61,0.194,36,122,158 +6407,2011-09-29,4,0,9,12,0,4,1,1,0.66,0.6212,0.61,0.1642,38,157,195 +6408,2011-09-29,4,0,9,13,0,4,1,1,0.64,0.6212,0.61,0.194,36,175,211 +6409,2011-09-29,4,0,9,14,0,4,1,1,0.68,0.6364,0.51,0.2985,33,134,167 +6410,2011-09-29,4,0,9,15,0,4,1,1,0.68,0.6364,0.45,0.3881,42,185,227 +6411,2011-09-29,4,0,9,16,0,4,1,1,0.68,0.6364,0.41,0.3582,64,227,291 +6412,2011-09-29,4,0,9,17,0,4,1,1,0.66,0.6212,0.47,0.2985,62,527,589 +6413,2011-09-29,4,0,9,18,0,4,1,1,0.64,0.6212,0.41,0.2239,84,500,584 +6414,2011-09-29,4,0,9,19,0,4,1,1,0.62,0.6212,0.5,0.0896,36,348,384 +6415,2011-09-29,4,0,9,20,0,4,1,1,0.6,0.6212,0.53,0.0896,39,234,273 
+6416,2011-09-29,4,0,9,21,0,4,1,1,0.58,0.5455,0.56,0,30,195,225 +6417,2011-09-29,4,0,9,22,0,4,1,1,0.56,0.5303,0.68,0,25,150,175 +6418,2011-09-29,4,0,9,23,0,4,1,1,0.52,0.5,0.77,0.1045,17,64,81 +6419,2011-09-30,4,0,9,0,0,5,1,1,0.52,0.5,0.83,0.1045,7,53,60 +6420,2011-09-30,4,0,9,1,0,5,1,1,0.52,0.5,0.77,0.0896,5,18,23 +6421,2011-09-30,4,0,9,2,0,5,1,1,0.52,0.5,0.83,0.1045,6,9,15 +6422,2011-09-30,4,0,9,3,0,5,1,1,0.52,0.5,0.83,0.1045,0,3,3 +6423,2011-09-30,4,0,9,4,0,5,1,1,0.52,0.5,0.83,0.194,2,6,8 +6424,2011-09-30,4,0,9,5,0,5,1,1,0.52,0.5,0.83,0.194,3,23,26 +6425,2011-09-30,4,0,9,6,0,5,1,1,0.52,0.5,0.83,0.2239,0,98,98 +6426,2011-09-30,4,0,9,7,0,5,1,1,0.52,0.5,0.83,0.2239,14,283,297 +6427,2011-09-30,4,0,9,8,0,5,1,1,0.54,0.5152,0.77,0.1343,31,425,456 +6428,2011-09-30,4,0,9,9,0,5,1,1,0.56,0.5303,0.73,0.194,23,214,237 +6429,2011-09-30,4,0,9,10,0,5,1,1,0.6,0.6061,0.64,0,46,122,168 +6430,2011-09-30,4,0,9,11,0,5,1,1,0.64,0.6212,0.57,0.0896,49,168,217 +6431,2011-09-30,4,0,9,12,0,5,1,2,0.64,0.6212,0.57,0.194,59,195,254 +6432,2011-09-30,4,0,9,13,0,5,1,2,0.66,0.6212,0.47,0.3284,72,194,266 +6433,2011-09-30,4,0,9,14,0,5,1,2,0.64,0.6212,0.47,0.2985,93,173,266 +6434,2011-09-30,4,0,9,15,0,5,1,1,0.64,0.6212,0.5,0.2836,52,193,245 +6435,2011-09-30,4,0,9,16,0,5,1,1,0.62,0.6212,0.53,0.2239,62,298,360 +6436,2011-09-30,4,0,9,17,0,5,1,1,0.62,0.6212,0.53,0.194,94,497,591 +6437,2011-09-30,4,0,9,18,0,5,1,1,0.58,0.5455,0.49,0.3582,62,449,511 +6438,2011-09-30,4,0,9,19,0,5,1,1,0.54,0.5152,0.52,0.2836,42,332,374 +6439,2011-09-30,4,0,9,20,0,5,1,1,0.54,0.5152,0.52,0.2836,28,202,230 +6440,2011-09-30,4,0,9,21,0,5,1,1,0.52,0.5,0.55,0.194,27,179,206 +6441,2011-09-30,4,0,9,22,0,5,1,1,0.52,0.5,0.55,0.3284,30,129,159 +6442,2011-09-30,4,0,9,23,0,5,1,1,0.52,0.5,0.55,0.3284,23,109,132 +6443,2011-10-01,4,0,10,0,0,6,0,1,0.5,0.4848,0.63,0.3881,24,106,130 +6444,2011-10-01,4,0,10,1,0,6,0,1,0.48,0.4697,0.67,0.3284,11,47,58 +6445,2011-10-01,4,0,10,2,0,6,0,1,0.46,0.4545,0.63,0.4179,21,46,67 
+6446,2011-10-01,4,0,10,3,0,6,0,1,0.46,0.4545,0.59,0.4179,8,17,25 +6447,2011-10-01,4,0,10,4,0,6,0,1,0.44,0.4394,0.58,0.4179,2,6,8 +6448,2011-10-01,4,0,10,5,0,6,0,2,0.44,0.4394,0.62,0.2985,1,4,5 +6449,2011-10-01,4,0,10,6,0,6,0,1,0.42,0.4242,0.67,0.3284,4,15,19 +6450,2011-10-01,4,0,10,7,0,6,0,3,0.42,0.4242,0.67,0.2537,6,30,36 +6451,2011-10-01,4,0,10,8,0,6,0,2,0.4,0.4091,0.76,0.3284,9,58,67 +6452,2011-10-01,4,0,10,9,0,6,0,3,0.4,0.4091,0.82,0.3582,17,112,129 +6453,2011-10-01,4,0,10,10,0,6,0,3,0.4,0.4091,0.76,0.3582,21,100,121 +6454,2011-10-01,4,0,10,11,0,6,0,3,0.38,0.3939,0.82,0.3284,30,102,132 +6455,2011-10-01,4,0,10,12,0,6,0,2,0.4,0.4091,0.82,0.2537,28,130,158 +6456,2011-10-01,4,0,10,13,0,6,0,2,0.4,0.4091,0.82,0.3284,27,98,125 +6457,2011-10-01,4,0,10,14,0,6,0,2,0.42,0.4242,0.71,0.3582,33,147,180 +6458,2011-10-01,4,0,10,15,0,6,0,2,0.42,0.4242,0.71,0.2836,41,154,195 +6459,2011-10-01,4,0,10,16,0,6,0,2,0.4,0.4091,0.76,0.2985,58,165,223 +6460,2011-10-01,4,0,10,17,0,6,0,3,0.4,0.4091,0.76,0.194,58,170,228 +6461,2011-10-01,4,0,10,18,0,6,0,3,0.38,0.3939,0.87,0.2239,23,117,140 +6462,2011-10-01,4,0,10,19,0,6,0,2,0.38,0.3939,0.82,0.194,27,99,126 +6463,2011-10-01,4,0,10,20,0,6,0,3,0.36,0.3636,0.87,0.1045,8,58,66 +6464,2011-10-01,4,0,10,21,0,6,0,3,0.36,0.3636,0.87,0.1045,3,53,56 +6465,2011-10-01,4,0,10,22,0,6,0,2,0.36,0.3485,0.93,0.2239,12,58,70 +6466,2011-10-01,4,0,10,23,0,6,0,3,0.36,0.3485,0.93,0.2239,8,57,65 +6467,2011-10-02,4,0,10,0,0,0,0,3,0.36,0.3333,0.93,0.2985,4,43,47 +6468,2011-10-02,4,0,10,1,0,0,0,3,0.34,0.3182,0.87,0.2537,1,23,24 +6469,2011-10-02,4,0,10,2,0,0,0,3,0.34,0.303,0.87,0.2985,3,27,30 +6470,2011-10-02,4,0,10,3,0,0,0,3,0.34,0.3182,0.87,0.2836,4,5,9 +6471,2011-10-02,4,0,10,4,0,0,0,2,0.34,0.3182,0.87,0.2537,4,3,7 +6472,2011-10-02,4,0,10,5,0,0,0,1,0.32,0.303,0.87,0.2836,0,7,7 +6473,2011-10-02,4,0,10,6,0,0,0,1,0.32,0.3182,0.81,0.1642,2,13,15 +6474,2011-10-02,4,0,10,7,0,0,0,1,0.34,0.3333,0.76,0.194,5,24,29 
+6475,2011-10-02,4,0,10,8,0,0,0,2,0.34,0.3333,0.76,0.194,14,50,64 +6476,2011-10-02,4,0,10,9,0,0,0,1,0.36,0.3333,0.76,0.3582,19,96,115 +6477,2011-10-02,4,0,10,10,0,0,0,2,0.36,0.3333,0.71,0.2836,60,188,248 +6478,2011-10-02,4,0,10,11,0,0,0,2,0.38,0.3939,0.66,0.2537,74,218,292 +6479,2011-10-02,4,0,10,12,0,0,0,2,0.4,0.4091,0.62,0.194,78,243,321 +6480,2011-10-02,4,0,10,13,0,0,0,3,0.4,0.4091,0.62,0.2239,84,224,308 +6481,2011-10-02,4,0,10,14,0,0,0,3,0.36,0.3485,0.81,0.1642,61,195,256 +6482,2011-10-02,4,0,10,15,0,0,0,3,0.36,0.3333,0.81,0.2537,29,144,173 +6483,2011-10-02,4,0,10,16,0,0,0,3,0.36,0.3485,0.87,0.194,44,162,206 +6484,2011-10-02,4,0,10,17,0,0,0,3,0.36,0.3485,0.87,0.1642,32,135,167 +6485,2011-10-02,4,0,10,18,0,0,0,2,0.36,0.3485,0.81,0.194,16,158,174 +6486,2011-10-02,4,0,10,19,0,0,0,2,0.38,0.3939,0.76,0.1642,34,128,162 +6487,2011-10-02,4,0,10,20,0,0,0,2,0.36,0.3333,0.76,0.2537,16,71,87 +6488,2011-10-02,4,0,10,21,0,0,0,1,0.36,0.3485,0.71,0.2239,17,71,88 +6489,2011-10-02,4,0,10,22,0,0,0,3,0.36,0.3636,0.81,0.0896,9,55,64 +6490,2011-10-02,4,0,10,23,0,0,0,1,0.36,0.3636,0.81,0.0896,6,19,25 +6491,2011-10-03,4,0,10,0,0,1,1,1,0.34,0.3485,0.87,0.1045,3,13,16 +6492,2011-10-03,4,0,10,1,0,1,1,1,0.36,0.3636,0.81,0.0896,1,5,6 +6493,2011-10-03,4,0,10,2,0,1,1,1,0.36,0.3636,0.76,0.1045,0,4,4 +6494,2011-10-03,4,0,10,3,0,1,1,1,0.36,0.3788,0.81,0,1,6,7 +6495,2011-10-03,4,0,10,4,0,1,1,1,0.34,0.3485,0.87,0.1045,2,6,8 +6496,2011-10-03,4,0,10,5,0,1,1,2,0.36,0.3636,0.76,0.1045,4,17,21 +6497,2011-10-03,4,0,10,6,0,1,1,2,0.36,0.3788,0.71,0,3,91,94 +6498,2011-10-03,4,0,10,7,0,1,1,2,0.36,0.3485,0.71,0.1343,8,241,249 +6499,2011-10-03,4,0,10,8,0,1,1,2,0.36,0.3636,0.71,0.1045,13,359,372 +6500,2011-10-03,4,0,10,9,0,1,1,2,0.4,0.4091,0.66,0.0896,12,141,153 +6501,2011-10-03,4,0,10,10,0,1,1,2,0.4,0.4091,0.66,0.1045,19,82,101 +6502,2011-10-03,4,0,10,11,0,1,1,2,0.4,0.4091,0.71,0.2239,26,100,126 +6503,2011-10-03,4,0,10,12,0,1,1,2,0.4,0.4091,0.71,0.1343,29,118,147 
+6504,2011-10-03,4,0,10,13,0,1,1,3,0.42,0.4242,0.67,0.1642,16,102,118 +6505,2011-10-03,4,0,10,14,0,1,1,3,0.4,0.4091,0.76,0,26,94,120 +6506,2011-10-03,4,0,10,15,0,1,1,3,0.4,0.4091,0.76,0,22,79,101 +6507,2011-10-03,4,0,10,16,0,1,1,2,0.4,0.4091,0.76,0.1045,16,202,218 +6508,2011-10-03,4,0,10,17,0,1,1,2,0.4,0.4091,0.76,0.1642,40,455,495 +6509,2011-10-03,4,0,10,18,0,1,1,2,0.4,0.4091,0.76,0.1642,28,384,412 +6510,2011-10-03,4,0,10,19,0,1,1,1,0.4,0.4091,0.76,0,15,300,315 +6511,2011-10-03,4,0,10,20,0,1,1,3,0.4,0.4091,0.82,0,13,191,204 +6512,2011-10-03,4,0,10,21,0,1,1,2,0.4,0.4091,0.82,0,10,128,138 +6513,2011-10-03,4,0,10,22,0,1,1,2,0.4,0.4091,0.82,0,12,79,91 +6514,2011-10-03,4,0,10,23,0,1,1,2,0.4,0.4091,0.82,0.1045,11,43,54 +6515,2011-10-04,4,0,10,0,0,2,1,2,0.4,0.4091,0.87,0.0896,5,22,27 +6516,2011-10-04,4,0,10,1,0,2,1,2,0.4,0.4091,0.87,0.1045,1,8,9 +6517,2011-10-04,4,0,10,2,0,2,1,2,0.4,0.4091,0.87,0,1,2,3 +6518,2011-10-04,4,0,10,3,0,2,1,1,0.4,0.4091,0.82,0.1045,3,3,6 +6519,2011-10-04,4,0,10,4,0,2,1,1,0.4,0.4091,0.82,0.0896,3,4,7 +6520,2011-10-04,4,0,10,5,0,2,1,1,0.4,0.4091,0.82,0.1343,2,25,27 +6521,2011-10-04,4,0,10,6,0,2,1,1,0.4,0.4091,0.82,0.1045,3,109,112 +6522,2011-10-04,4,0,10,7,0,2,1,1,0.42,0.4242,0.82,0.2836,11,298,309 +6523,2011-10-04,4,0,10,8,0,2,1,1,0.44,0.4394,0.77,0.2239,28,372,400 +6524,2011-10-04,4,0,10,9,0,2,1,1,0.46,0.4545,0.67,0.2537,19,168,187 +6525,2011-10-04,4,0,10,10,0,2,1,1,0.48,0.4697,0.67,0.2836,19,107,126 +6526,2011-10-04,4,0,10,11,0,2,1,1,0.54,0.5152,0.64,0.2836,22,102,124 +6527,2011-10-04,4,0,10,12,0,2,1,1,0.56,0.5303,0.56,0.2985,39,160,199 +6528,2011-10-04,4,0,10,13,0,2,1,1,0.56,0.5303,0.56,0.2985,20,153,173 +6529,2011-10-04,4,0,10,14,0,2,1,1,0.58,0.5455,0.53,0.2537,27,156,183 +6530,2011-10-04,4,0,10,15,0,2,1,1,0.58,0.5455,0.56,0.2836,28,166,194 +6531,2011-10-04,4,0,10,16,0,2,1,1,0.58,0.5455,0.56,0.3284,36,273,309 +6532,2011-10-04,4,0,10,17,0,2,1,1,0.56,0.5303,0.6,0.3881,55,530,585 
+6533,2011-10-04,4,0,10,18,0,2,1,1,0.54,0.5152,0.64,0.2836,68,456,524 +6534,2011-10-04,4,0,10,19,0,2,1,1,0.52,0.5,0.68,0.2239,35,310,345 +6535,2011-10-04,4,0,10,20,0,2,1,1,0.52,0.5,0.68,0.2239,25,236,261 +6536,2011-10-04,4,0,10,21,0,2,1,1,0.5,0.4848,0.72,0.1343,22,144,166 +6537,2011-10-04,4,0,10,22,0,2,1,1,0.5,0.4848,0.72,0.1343,8,106,114 +6538,2011-10-04,4,0,10,23,0,2,1,1,0.48,0.4697,0.77,0.1343,6,60,66 +6539,2011-10-05,4,0,10,0,0,3,1,1,0.48,0.4697,0.77,0.1343,7,36,43 +6540,2011-10-05,4,0,10,1,0,3,1,1,0.48,0.4697,0.77,0.1642,4,7,11 +6541,2011-10-05,4,0,10,2,0,3,1,1,0.46,0.4545,0.82,0.1642,1,2,3 +6542,2011-10-05,4,0,10,3,0,3,1,1,0.46,0.4545,0.82,0.1045,1,5,6 +6543,2011-10-05,4,0,10,4,0,3,1,1,0.44,0.4394,0.88,0,1,4,5 +6544,2011-10-05,4,0,10,5,0,3,1,1,0.44,0.4394,0.82,0.1343,3,29,32 +6545,2011-10-05,4,0,10,6,0,3,1,1,0.44,0.4394,0.82,0.1045,5,112,117 +6546,2011-10-05,4,0,10,7,0,3,1,1,0.46,0.4545,0.77,0.1045,9,292,301 +6547,2011-10-05,4,0,10,8,0,3,1,1,0.52,0.5,0.63,0,26,441,467 +6548,2011-10-05,4,0,10,9,0,3,1,1,0.54,0.5152,0.6,0.2985,32,187,219 +6549,2011-10-05,4,0,10,10,0,3,1,1,0.56,0.5303,0.56,0.3881,30,103,133 +6550,2011-10-05,4,0,10,11,0,3,1,1,0.6,0.6212,0.46,0.2836,32,138,170 +6551,2011-10-05,4,0,10,12,0,3,1,1,0.62,0.6212,0.43,0.3284,34,180,214 +6552,2011-10-05,4,0,10,13,0,3,1,1,0.64,0.6212,0.41,0.2239,30,189,219 +6553,2011-10-05,4,0,10,14,0,3,1,1,0.64,0.6212,0.41,0.2836,34,156,190 +6554,2011-10-05,4,0,10,15,0,3,1,1,0.64,0.6212,0.47,0.2836,35,147,182 +6555,2011-10-05,4,0,10,16,0,3,1,1,0.64,0.6212,0.47,0.3284,37,245,282 +6556,2011-10-05,4,0,10,17,0,3,1,1,0.64,0.6212,0.47,0.2537,66,525,591 +6557,2011-10-05,4,0,10,18,0,3,1,1,0.6,0.6212,0.53,0.1642,57,536,593 +6558,2011-10-05,4,0,10,19,0,3,1,1,0.58,0.5455,0.56,0.1343,40,334,374 +6559,2011-10-05,4,0,10,20,0,3,1,1,0.52,0.5,0.77,0.1642,18,228,246 +6560,2011-10-05,4,0,10,21,0,3,1,1,0.5,0.4848,0.82,0.1045,28,173,201 +6561,2011-10-05,4,0,10,22,0,3,1,1,0.52,0.5,0.72,0,20,134,154 
+6562,2011-10-05,4,0,10,23,0,3,1,1,0.5,0.4848,0.77,0.1045,9,64,73 +6563,2011-10-06,4,0,10,0,0,4,1,1,0.5,0.4848,0.72,0.194,6,34,40 +6564,2011-10-06,4,0,10,1,0,4,1,1,0.48,0.4697,0.63,0.194,5,17,22 +6565,2011-10-06,4,0,10,2,0,4,1,1,0.46,0.4545,0.67,0.194,2,8,10 +6566,2011-10-06,4,0,10,3,0,4,1,1,0.44,0.4394,0.67,0.2239,1,4,5 +6567,2011-10-06,4,0,10,4,0,4,1,1,0.44,0.4394,0.62,0.2836,0,5,5 +6568,2011-10-06,4,0,10,5,0,4,1,1,0.42,0.4242,0.62,0.194,0,24,24 +6569,2011-10-06,4,0,10,6,0,4,1,1,0.4,0.4091,0.66,0.194,4,110,114 +6570,2011-10-06,4,0,10,7,0,4,1,1,0.42,0.4242,0.62,0.194,15,287,302 +6571,2011-10-06,4,0,10,8,0,4,1,1,0.44,0.4394,0.62,0.1642,20,437,457 +6572,2011-10-06,4,0,10,9,0,4,1,1,0.46,0.4545,0.63,0.194,20,188,208 +6573,2011-10-06,4,0,10,10,0,4,1,1,0.52,0.5,0.52,0,24,105,129 +6574,2011-10-06,4,0,10,11,0,4,1,1,0.54,0.5152,0.49,0,29,115,144 +6575,2011-10-06,4,0,10,12,0,4,1,1,0.56,0.5303,0.49,0.0896,46,180,226 +6576,2011-10-06,4,0,10,13,0,4,1,1,0.56,0.5303,0.46,0.0896,33,158,191 +6577,2011-10-06,4,0,10,14,0,4,1,1,0.58,0.5455,0.43,0.0896,56,139,195 +6578,2011-10-06,4,0,10,15,0,4,1,1,0.58,0.5455,0.46,0.1343,62,157,219 +6579,2011-10-06,4,0,10,16,0,4,1,1,0.58,0.5455,0.49,0.194,51,272,323 +6580,2011-10-06,4,0,10,17,0,4,1,1,0.56,0.5303,0.52,0.1343,63,505,568 +6581,2011-10-06,4,0,10,18,0,4,1,1,0.54,0.5152,0.64,0.0896,59,479,538 +6582,2011-10-06,4,0,10,19,0,4,1,1,0.52,0.5,0.59,0.1045,38,305,343 +6583,2011-10-06,4,0,10,20,0,4,1,1,0.5,0.4848,0.77,0.1045,29,231,260 +6584,2011-10-06,4,0,10,21,0,4,1,1,0.46,0.4545,0.88,0.0896,34,161,195 +6585,2011-10-06,4,0,10,22,0,4,1,1,0.46,0.4545,0.82,0,24,124,148 +6586,2011-10-06,4,0,10,23,0,4,1,1,0.44,0.4394,0.88,0.0896,18,81,99 +6587,2011-10-07,4,0,10,0,0,5,1,1,0.44,0.4394,0.88,0,19,48,67 +6588,2011-10-07,4,0,10,1,0,5,1,1,0.44,0.4394,0.88,0,3,25,28 +6589,2011-10-07,4,0,10,2,0,5,1,1,0.42,0.4242,0.88,0,0,5,5 +6590,2011-10-07,4,0,10,3,0,5,1,1,0.4,0.4091,0.87,0,1,9,10 +6591,2011-10-07,4,0,10,4,0,5,1,1,0.4,0.4091,0.87,0.1045,0,5,5 
+6592,2011-10-07,4,0,10,5,0,5,1,1,0.42,0.4242,0.82,0,3,24,27 +6593,2011-10-07,4,0,10,6,0,5,1,1,0.42,0.4242,0.88,0,3,84,87 +6594,2011-10-07,4,0,10,7,0,5,1,1,0.42,0.4242,0.94,0,5,237,242 +6595,2011-10-07,4,0,10,8,0,5,1,1,0.46,0.4545,0.82,0,31,386,417 +6596,2011-10-07,4,0,10,9,0,5,1,1,0.48,0.4697,0.77,0,34,207,241 +6597,2011-10-07,4,0,10,10,0,5,1,1,0.52,0.5,0.68,0.0896,48,126,174 +6598,2011-10-07,4,0,10,11,0,5,1,1,0.56,0.5303,0.63,0,80,150,230 +6599,2011-10-07,4,0,10,12,0,5,1,1,0.6,0.6212,0.53,0,61,174,235 +6600,2011-10-07,4,0,10,13,0,5,1,1,0.6,0.6212,0.46,0,95,210,305 +6601,2011-10-07,4,0,10,14,0,5,1,1,0.62,0.6212,0.5,0,77,179,256 +6602,2011-10-07,4,0,10,15,0,5,1,1,0.66,0.6212,0.41,0,68,202,270 +6603,2011-10-07,4,0,10,16,0,5,1,1,0.64,0.6212,0.44,0,73,346,419 +6604,2011-10-07,4,0,10,17,0,5,1,1,0.6,0.6212,0.56,0.2537,101,462,563 +6605,2011-10-07,4,0,10,18,0,5,1,1,0.56,0.5303,0.6,0.0896,94,371,465 +6606,2011-10-07,4,0,10,19,0,5,1,1,0.56,0.5303,0.52,0,51,243,294 +6607,2011-10-07,4,0,10,20,0,5,1,1,0.54,0.5152,0.49,0,30,167,197 +6608,2011-10-07,4,0,10,21,0,5,1,1,0.5,0.4848,0.68,0,22,144,166 +6609,2011-10-07,4,0,10,22,0,5,1,1,0.5,0.4848,0.68,0,27,119,146 +6610,2011-10-07,4,0,10,23,0,5,1,1,0.5,0.4848,0.63,0,23,113,136 +6611,2011-10-08,4,0,10,0,0,6,0,1,0.48,0.4697,0.72,0,17,72,89 +6612,2011-10-08,4,0,10,1,0,6,0,1,0.46,0.4545,0.82,0,14,51,65 +6613,2011-10-08,4,0,10,2,0,6,0,1,0.46,0.4545,0.82,0,4,41,45 +6614,2011-10-08,4,0,10,3,0,6,0,1,0.44,0.4394,0.88,0,4,16,20 +6615,2011-10-08,4,0,10,4,0,6,0,1,0.42,0.4242,0.94,0.1343,3,7,10 +6616,2011-10-08,4,0,10,5,0,6,0,1,0.42,0.4242,0.88,0.1045,1,7,8 +6617,2011-10-08,4,0,10,6,0,6,0,1,0.42,0.4242,0.94,0,6,16,22 +6618,2011-10-08,4,0,10,7,0,6,0,1,0.42,0.4242,0.94,0.0896,27,46,73 +6619,2011-10-08,4,0,10,8,0,6,0,1,0.46,0.4545,0.82,0,14,89,103 +6620,2011-10-08,4,0,10,9,0,6,0,1,0.5,0.4848,0.77,0,54,163,217 +6621,2011-10-08,4,0,10,10,0,6,0,1,0.52,0.5,0.77,0,120,217,337 +6622,2011-10-08,4,0,10,11,0,6,0,1,0.58,0.5455,0.56,0,176,221,397 
+6623,2011-10-08,4,0,10,12,0,6,0,1,0.62,0.6212,0.35,0,191,255,446 +6624,2011-10-08,4,0,10,13,0,6,0,1,0.62,0.6212,0.46,0.1045,256,229,485 +6625,2011-10-08,4,0,10,14,0,6,0,1,0.62,0.6212,0.5,0.0896,251,214,465 +6626,2011-10-08,4,0,10,15,0,6,0,1,0.66,0.6212,0.39,0,262,234,496 +6627,2011-10-08,4,0,10,16,0,6,0,1,0.64,0.6212,0.44,0.0896,221,230,451 +6628,2011-10-08,4,0,10,17,0,6,0,1,0.62,0.6212,0.5,0.1343,174,219,393 +6629,2011-10-08,4,0,10,18,0,6,0,1,0.6,0.6212,0.46,0.1045,135,224,359 +6630,2011-10-08,4,0,10,19,0,6,0,1,0.54,0.5152,0.68,0.1343,76,172,248 +6631,2011-10-08,4,0,10,20,0,6,0,1,0.52,0.5,0.77,0.1045,78,124,202 +6632,2011-10-08,4,0,10,21,0,6,0,1,0.52,0.5,0.77,0,54,133,187 +6633,2011-10-08,4,0,10,22,0,6,0,1,0.5,0.4848,0.77,0,45,104,149 +6634,2011-10-08,4,0,10,23,0,6,0,1,0.48,0.4697,0.88,0,52,90,142 +6635,2011-10-09,4,0,10,0,0,0,0,1,0.46,0.4545,0.88,0.0896,46,83,129 +6636,2011-10-09,4,0,10,1,0,0,0,1,0.46,0.4545,0.88,0,15,63,78 +6637,2011-10-09,4,0,10,2,0,0,0,1,0.46,0.4545,0.88,0,20,35,55 +6638,2011-10-09,4,0,10,3,0,0,0,1,0.44,0.4394,0.88,0,10,20,30 +6639,2011-10-09,4,0,10,4,0,0,0,1,0.44,0.4394,0.88,0,7,7,14 +6640,2011-10-09,4,0,10,5,0,0,0,1,0.44,0.4394,0.88,0,1,14,15 +6641,2011-10-09,4,0,10,6,0,0,0,1,0.44,0.4394,0.88,0,7,14,21 +6642,2011-10-09,4,0,10,7,0,0,0,1,0.44,0.4394,0.86,0,18,26,44 +6643,2011-10-09,4,0,10,8,0,0,0,1,0.46,0.4545,0.88,0,25,82,107 +6644,2011-10-09,4,0,10,9,0,0,0,1,0.5,0.4848,0.77,0,66,119,185 +6645,2011-10-09,4,0,10,10,0,0,0,1,0.56,0.5303,0.68,0.0896,155,210,365 +6646,2011-10-09,4,0,10,11,0,0,0,1,0.62,0.6212,0.53,0.0896,189,234,423 +6647,2011-10-09,4,0,10,12,0,0,0,1,0.64,0.6212,0.44,0.0896,212,243,455 +6648,2011-10-09,4,0,10,13,0,0,0,1,0.66,0.6212,0.44,0.0896,225,193,418 +6649,2011-10-09,4,0,10,14,0,0,0,1,0.68,0.6364,0.47,0.1642,272,228,500 +6650,2011-10-09,4,0,10,15,0,0,0,1,0.68,0.6364,0.51,0.1642,198,209,407 +6651,2011-10-09,4,0,10,16,0,0,0,1,0.66,0.6212,0.5,0.1343,223,247,470 
+6652,2011-10-09,4,0,10,17,0,0,0,1,0.62,0.6061,0.61,0.1045,208,230,438 +6653,2011-10-09,4,0,10,18,0,0,0,1,0.64,0.6212,0.5,0.1045,186,228,414 +6654,2011-10-09,4,0,10,19,0,0,0,1,0.56,0.5303,0.73,0.0896,116,186,302 +6655,2011-10-09,4,0,10,20,0,0,0,1,0.56,0.5303,0.73,0.1045,59,134,193 +6656,2011-10-09,4,0,10,21,0,0,0,1,0.54,0.5152,0.83,0.1045,53,120,173 +6657,2011-10-09,4,0,10,22,0,0,0,2,0.52,0.5,0.94,0,48,104,152 +6658,2011-10-09,4,0,10,23,0,0,0,1,0.5,0.4848,0.88,0.1045,38,85,123 +6659,2011-10-10,4,0,10,0,1,1,0,1,0.5,0.4848,0.94,0,11,42,53 +6660,2011-10-10,4,0,10,1,1,1,0,1,0.48,0.4697,0.88,0,4,30,34 +6661,2011-10-10,4,0,10,2,1,1,0,1,0.48,0.4697,0.88,0,4,15,19 +6662,2011-10-10,4,0,10,3,1,1,0,1,0.46,0.4545,0.88,0,2,9,11 +6663,2011-10-10,4,0,10,4,1,1,0,1,0.46,0.4545,0.94,0,2,4,6 +6664,2011-10-10,4,0,10,5,1,1,0,1,0.46,0.4545,0.94,0,2,19,21 +6665,2011-10-10,4,0,10,6,1,1,0,1,0.44,0.4394,0.94,0,5,39,44 +6666,2011-10-10,4,0,10,7,1,1,0,1,0.46,0.4545,0.88,0,12,102,114 +6667,2011-10-10,4,0,10,8,1,1,0,1,0.52,0.5,0.83,0,27,227,254 +6668,2011-10-10,4,0,10,9,1,1,0,1,0.54,0.5152,0.77,0,58,161,219 +6669,2011-10-10,4,0,10,10,1,1,0,1,0.56,0.5303,0.73,0.1642,107,158,265 +6670,2011-10-10,4,0,10,11,1,1,0,1,0.62,0.6061,0.69,0.0896,139,183,322 +6671,2011-10-10,4,0,10,12,1,1,0,2,0.7,0.6515,0.48,0.0896,164,201,365 +6672,2011-10-10,4,0,10,13,1,1,0,1,0.72,0.6515,0.37,0,146,219,365 +6673,2011-10-10,4,0,10,14,1,1,0,1,0.74,0.6515,0.37,0,147,223,370 +6674,2011-10-10,4,0,10,15,1,1,0,2,0.72,0.6515,0.42,0.1045,130,254,384 +6675,2011-10-10,4,0,10,16,1,1,0,2,0.7,0.6364,0.45,0.0896,154,248,402 +6676,2011-10-10,4,0,10,17,1,1,0,2,0.66,0.6212,0.54,0.1045,125,334,459 +6677,2011-10-10,4,0,10,18,1,1,0,2,0.64,0.6212,0.53,0,108,365,473 +6678,2011-10-10,4,0,10,19,1,1,0,2,0.58,0.5455,0.78,0.0896,69,267,336 +6679,2011-10-10,4,0,10,20,1,1,0,1,0.6,0.5909,0.73,0,37,188,225 +6680,2011-10-10,4,0,10,21,1,1,0,1,0.56,0.5303,0.88,0.0896,27,148,175 +6681,2011-10-10,4,0,10,22,1,1,0,1,0.56,0.5303,0.88,0.0896,22,104,126 
+6682,2011-10-10,4,0,10,23,1,1,0,2,0.54,0.5152,0.88,0.1045,12,63,75 +6683,2011-10-11,4,0,10,0,0,2,1,2,0.54,0.5152,0.88,0,14,16,30 +6684,2011-10-11,4,0,10,1,0,2,1,2,0.52,0.5,0.94,0.0896,2,9,11 +6685,2011-10-11,4,0,10,2,0,2,1,2,0.52,0.5,0.94,0.1343,1,4,5 +6686,2011-10-11,4,0,10,3,0,2,1,2,0.52,0.5,0.88,0.1045,2,1,3 +6687,2011-10-11,4,0,10,4,0,2,1,2,0.52,0.5,0.88,0.1343,0,5,5 +6688,2011-10-11,4,0,10,5,0,2,1,2,0.52,0.5,0.88,0,0,20,20 +6689,2011-10-11,4,0,10,6,0,2,1,1,0.52,0.5,0.83,0.0896,9,114,123 +6690,2011-10-11,4,0,10,7,0,2,1,2,0.52,0.5,0.88,0.0896,19,333,352 +6691,2011-10-11,4,0,10,8,0,2,1,2,0.56,0.5303,0.83,0.1045,31,375,406 +6692,2011-10-11,4,0,10,9,0,2,1,2,0.6,0.5909,0.69,0,40,195,235 +6693,2011-10-11,4,0,10,10,0,2,1,2,0.6,0.5909,0.69,0.1343,38,111,149 +6694,2011-10-11,4,0,10,11,0,2,1,2,0.6,0.5909,0.73,0.1642,31,92,123 +6695,2011-10-11,4,0,10,12,0,2,1,2,0.6,0.5909,0.73,0.1642,51,130,181 +6696,2011-10-11,4,0,10,13,0,2,1,2,0.6,0.5909,0.73,0.194,42,138,180 +6697,2011-10-11,4,0,10,14,0,2,1,3,0.6,0.5909,0.73,0.1045,54,117,171 +6698,2011-10-11,4,0,10,15,0,2,1,2,0.6,0.5758,0.78,0.1642,36,139,175 +6699,2011-10-11,4,0,10,16,0,2,1,2,0.6,0.5909,0.73,0.1642,63,257,320 +6700,2011-10-11,4,0,10,17,0,2,1,2,0.6,0.5758,0.78,0.2537,70,534,604 +6701,2011-10-11,4,0,10,18,0,2,1,2,0.6,0.5758,0.78,0.2239,46,493,539 +6702,2011-10-11,4,0,10,19,0,2,1,2,0.58,0.5455,0.83,0.2239,33,285,318 +6703,2011-10-11,4,0,10,20,0,2,1,2,0.58,0.5455,0.83,0.2239,31,203,234 +6704,2011-10-11,4,0,10,21,0,2,1,2,0.58,0.5455,0.78,0.194,17,158,175 +6705,2011-10-11,4,0,10,22,0,2,1,2,0.56,0.5303,0.83,0.194,27,120,147 +6706,2011-10-11,4,0,10,23,0,2,1,2,0.56,0.5303,0.83,0.2836,10,47,57 +6707,2011-10-12,4,0,10,0,0,3,1,2,0.56,0.5303,0.83,0.1343,2,23,25 +6708,2011-10-12,4,0,10,1,0,3,1,3,0.56,0.5303,0.83,0.1343,2,8,10 +6709,2011-10-12,4,0,10,2,0,3,1,2,0.56,0.5303,0.83,0.2836,0,2,2 +6710,2011-10-12,4,0,10,3,0,3,1,3,0.54,0.5152,0.94,0.2985,0,2,2 +6711,2011-10-12,4,0,10,4,0,3,1,3,0.54,0.5152,0.94,0.3881,1,6,7 
+6712,2011-10-12,4,0,10,5,0,3,1,2,0.54,0.5152,0.88,0.3284,0,22,22 +6713,2011-10-12,4,0,10,6,0,3,1,2,0.54,0.5152,0.88,0.2537,5,107,112 +6714,2011-10-12,4,0,10,7,0,3,1,2,0.54,0.5152,0.88,0.3881,15,244,259 +6715,2011-10-12,4,0,10,8,0,3,1,2,0.54,0.5152,0.88,0.3881,27,377,404 +6716,2011-10-12,4,0,10,9,0,3,1,2,0.54,0.5152,0.88,0.4478,14,183,197 +6717,2011-10-12,4,0,10,10,0,3,1,2,0.54,0.5152,0.88,0.3881,15,98,113 +6718,2011-10-12,4,0,10,11,0,3,1,3,0.54,0.5152,0.94,0.3284,18,102,120 +6719,2011-10-12,4,0,10,12,0,3,1,3,0.54,0.5152,0.88,0.3284,9,53,62 +6720,2011-10-12,4,0,10,13,0,3,1,3,0.54,0.5152,0.88,0.2239,25,76,101 +6721,2011-10-12,4,0,10,14,0,3,1,3,0.54,0.5152,0.94,0.2836,8,70,78 +6722,2011-10-12,4,0,10,15,0,3,1,3,0.54,0.5152,0.94,0.2537,3,42,45 +6723,2011-10-12,4,0,10,16,0,3,1,2,0.56,0.5303,0.88,0.2239,5,50,55 +6724,2011-10-12,4,0,10,17,0,3,1,3,0.54,0.5152,0.94,0.2239,16,235,251 +6725,2011-10-12,4,0,10,18,0,3,1,3,0.54,0.5152,0.94,0.1343,11,161,172 +6726,2011-10-12,4,0,10,19,0,3,1,3,0.54,0.5152,0.94,0.1343,7,102,109 +6727,2011-10-12,4,0,10,20,0,3,1,3,0.54,0.5152,0.94,0.1045,9,81,90 +6728,2011-10-12,4,0,10,21,0,3,1,3,0.54,0.5152,1,0.1045,9,74,83 +6729,2011-10-12,4,0,10,22,0,3,1,3,0.54,0.5152,0.94,0.0896,3,40,43 +6730,2011-10-12,4,0,10,23,0,3,1,3,0.54,0.5152,0.94,0.0896,13,41,54 +6731,2011-10-13,4,0,10,0,0,4,1,2,0.54,0.5152,0.94,0.1343,1,14,15 +6732,2011-10-13,4,0,10,1,0,4,1,2,0.54,0.5152,0.94,0,3,9,12 +6733,2011-10-13,4,0,10,2,0,4,1,2,0.54,0.5152,0.94,0,0,4,4 +6734,2011-10-13,4,0,10,3,0,4,1,2,0.54,0.5152,1,0.194,2,4,6 +6735,2011-10-13,4,0,10,4,0,4,1,2,0.54,0.5152,1,0.1045,2,5,7 +6736,2011-10-13,4,0,10,5,0,4,1,2,0.54,0.5152,1,0.1045,1,16,17 +6737,2011-10-13,4,0,10,6,0,4,1,3,0.54,0.5152,1,0.0896,1,51,52 +6738,2011-10-13,4,0,10,7,0,4,1,3,0.54,0.5152,1,0.1343,5,76,81 +6739,2011-10-13,4,0,10,8,0,4,1,3,0.54,0.5152,1,0.0896,10,199,209 +6740,2011-10-13,4,0,10,9,0,4,1,3,0.56,0.5303,1,0.1045,3,106,109 +6741,2011-10-13,4,0,10,10,0,4,1,3,0.56,0.5303,1,0.1343,16,53,69 
+6742,2011-10-13,4,0,10,11,0,4,1,2,0.62,0.5758,0.83,0,8,76,84 +6743,2011-10-13,4,0,10,12,0,4,1,2,0.62,0.5606,0.88,0,10,95,105 +6744,2011-10-13,4,0,10,13,0,4,1,2,0.64,0.5758,0.83,0.1642,20,142,162 +6745,2011-10-13,4,0,10,14,0,4,1,1,0.66,0.6061,0.78,0.194,22,111,133 +6746,2011-10-13,4,0,10,15,0,4,1,1,0.66,0.6061,0.78,0.194,27,128,155 +6747,2011-10-13,4,0,10,16,0,4,1,1,0.66,0.6212,0.74,0.2239,36,240,276 +6748,2011-10-13,4,0,10,17,0,4,1,3,0.62,0.5758,0.83,0.2537,47,432,479 +6749,2011-10-13,4,0,10,18,0,4,1,3,0.62,0.5758,0.83,0.2537,20,248,268 +6750,2011-10-13,4,0,10,19,0,4,1,2,0.62,0.5758,0.83,0.1642,18,187,205 +6751,2011-10-13,4,0,10,20,0,4,1,1,0.62,0.5758,0.83,0.2836,15,166,181 +6752,2011-10-13,4,0,10,21,0,4,1,1,0.62,0.5758,0.83,0.194,8,115,123 +6753,2011-10-13,4,0,10,22,0,4,1,1,0.62,0.5758,0.83,0.194,13,101,114 +6754,2011-10-13,4,0,10,23,0,4,1,2,0.58,0.5455,0.88,0.194,2,45,47 +6755,2011-10-14,4,0,10,0,0,5,1,1,0.58,0.5455,0.94,0.0896,10,29,39 +6756,2011-10-14,4,0,10,1,0,5,1,1,0.58,0.5455,0.94,0.1343,8,16,24 +6757,2011-10-14,4,0,10,2,0,5,1,1,0.56,0.5303,0.94,0.1642,0,6,6 +6758,2011-10-14,4,0,10,3,0,5,1,1,0.56,0.5303,0.94,0.1343,1,6,7 +6759,2011-10-14,4,0,10,4,0,5,1,1,0.56,0.5303,0.94,0.194,0,8,8 +6760,2011-10-14,4,0,10,5,0,5,1,2,0.56,0.5303,0.94,0.1642,0,17,17 +6761,2011-10-14,4,0,10,6,0,5,1,3,0.56,0.5303,0.88,0.194,4,90,94 +6762,2011-10-14,4,0,10,7,0,5,1,3,0.56,0.5303,0.88,0.194,10,137,147 +6763,2011-10-14,4,0,10,8,0,5,1,2,0.56,0.5303,0.9,0.1642,23,186,209 +6764,2011-10-14,4,0,10,9,0,5,1,2,0.56,0.5303,0.94,0.2239,16,164,180 +6765,2011-10-14,4,0,10,10,0,5,1,3,0.54,0.5152,0.88,0.3582,17,108,125 +6766,2011-10-14,4,0,10,11,0,5,1,3,0.52,0.5,0.88,0.1642,11,76,87 +6767,2011-10-14,4,0,10,12,0,5,1,3,0.52,0.5,0.88,0.1642,13,62,75 +6768,2011-10-14,4,0,10,13,0,5,1,1,0.56,0.5303,0.73,0.3582,18,114,132 +6769,2011-10-14,4,0,10,14,0,5,1,1,0.62,0.6212,0.43,0.3582,37,119,156 +6770,2011-10-14,4,0,10,15,0,5,1,1,0.62,0.6212,0.43,0.2836,61,169,230 
+6771,2011-10-14,4,0,10,16,0,5,1,1,0.6,0.6212,0.43,0.2239,80,264,344 +6772,2011-10-14,4,0,10,17,0,5,1,1,0.58,0.5455,0.46,0.4179,55,426,481 +6773,2011-10-14,4,0,10,18,0,5,1,1,0.54,0.5152,0.39,0.2836,62,361,423 +6774,2011-10-14,4,0,10,19,0,5,1,1,0.54,0.5152,0.39,0.3284,27,221,248 +6775,2011-10-14,4,0,10,20,0,5,1,1,0.52,0.5,0.42,0.194,19,177,196 +6776,2011-10-14,4,0,10,21,0,5,1,1,0.5,0.4848,0.45,0.194,22,156,178 +6777,2011-10-14,4,0,10,22,0,5,1,1,0.46,0.4545,0.59,0.2239,20,123,143 +6778,2011-10-14,4,0,10,23,0,5,1,1,0.46,0.4545,0.59,0.1642,15,80,95 +6779,2011-10-15,4,0,10,0,0,6,0,1,0.46,0.4545,0.55,0.1045,18,88,106 +6780,2011-10-15,4,0,10,1,0,6,0,1,0.46,0.4545,0.55,0.2537,9,64,73 +6781,2011-10-15,4,0,10,2,0,6,0,1,0.46,0.4545,0.59,0.194,17,39,56 +6782,2011-10-15,4,0,10,3,0,6,0,1,0.44,0.4394,0.62,0.194,5,18,23 +6783,2011-10-15,4,0,10,4,0,6,0,1,0.42,0.4242,0.67,0.1343,0,6,6 +6784,2011-10-15,4,0,10,5,0,6,0,1,0.42,0.4242,0.71,0.1642,1,6,7 +6785,2011-10-15,4,0,10,6,0,6,0,1,0.4,0.4091,0.71,0,5,15,20 +6786,2011-10-15,4,0,10,7,0,6,0,1,0.4,0.4091,0.76,0.1045,17,40,57 +6787,2011-10-15,4,0,10,8,0,6,0,1,0.46,0.4545,0.59,0.2836,24,101,125 +6788,2011-10-15,4,0,10,9,0,6,0,1,0.52,0.5,0.48,0.2985,50,157,207 +6789,2011-10-15,4,0,10,10,0,6,0,1,0.54,0.5152,0.45,0.4478,115,207,322 +6790,2011-10-15,4,0,10,11,0,6,0,1,0.56,0.5303,0.4,0.3881,153,221,374 +6791,2011-10-15,4,0,10,12,0,6,0,1,0.58,0.5455,0.35,0.5522,195,261,456 +6792,2011-10-15,4,0,10,13,0,6,0,1,0.6,0.6212,0.35,0.4627,171,223,394 +6793,2011-10-15,4,0,10,14,0,6,0,1,0.6,0.6212,0.33,0.4627,242,230,472 +6794,2011-10-15,4,0,10,15,0,6,0,1,0.62,0.6212,0.31,0.4925,166,211,377 +6795,2011-10-15,4,0,10,16,0,6,0,1,0.62,0.6212,0.29,0.3881,179,264,443 +6796,2011-10-15,4,0,10,17,0,6,0,1,0.58,0.5455,0.32,0.2836,177,264,441 +6797,2011-10-15,4,0,10,18,0,6,0,1,0.54,0.5152,0.39,0.194,121,251,372 +6798,2011-10-15,4,0,10,19,0,6,0,1,0.5,0.4848,0.51,0.1343,61,177,238 +6799,2011-10-15,4,0,10,20,0,6,0,1,0.5,0.4848,0.48,0.1343,42,137,179 
+6800,2011-10-15,4,0,10,21,0,6,0,1,0.5,0.4848,0.42,0.2239,53,115,168 +6801,2011-10-15,4,0,10,22,0,6,0,1,0.5,0.4848,0.36,0.1343,45,121,166 +6802,2011-10-15,4,0,10,23,0,6,0,1,0.48,0.4697,0.41,0.1642,33,102,135 +6803,2011-10-16,4,0,10,0,0,0,0,1,0.46,0.4545,0.44,0.1343,22,85,107 +6804,2011-10-16,4,0,10,1,0,0,0,1,0.44,0.4394,0.47,0.1343,13,64,77 +6805,2011-10-16,4,0,10,2,0,0,0,1,0.42,0.4242,0.54,0.1343,13,52,65 +6806,2011-10-16,4,0,10,3,0,0,0,1,0.42,0.4242,0.54,0.1343,11,31,42 +6807,2011-10-16,4,0,10,4,0,0,0,1,0.42,0.4242,0.54,0.194,5,7,12 +6808,2011-10-16,4,0,10,5,0,0,0,1,0.42,0.4242,0.54,0.1642,1,6,7 +6809,2011-10-16,4,0,10,6,0,0,0,1,0.38,0.3939,0.66,0.1642,3,7,10 +6810,2011-10-16,4,0,10,7,0,0,0,1,0.4,0.4091,0.62,0.1343,9,39,48 +6811,2011-10-16,4,0,10,8,0,0,0,1,0.44,0.4394,0.62,0.194,28,71,99 +6812,2011-10-16,4,0,10,9,0,0,0,1,0.5,0.4848,0.45,0.2239,39,151,190 +6813,2011-10-16,4,0,10,10,0,0,0,1,0.54,0.5152,0.45,0.2537,121,226,347 +6814,2011-10-16,4,0,10,11,0,0,0,1,0.56,0.5303,0.35,0.2836,159,218,377 +6815,2011-10-16,4,0,10,12,0,0,0,1,0.58,0.5455,0.35,0.2537,167,277,444 +6816,2011-10-16,4,0,10,13,0,0,0,1,0.6,0.6212,0.35,0.2985,169,269,438 +6817,2011-10-16,4,0,10,14,0,0,0,1,0.6,0.6212,0.4,0.2985,204,258,462 +6818,2011-10-16,4,0,10,15,0,0,0,1,0.62,0.6212,0.35,0.3881,182,230,412 +6819,2011-10-16,4,0,10,16,0,0,0,1,0.6,0.6212,0.46,0.3881,177,289,466 +6820,2011-10-16,4,0,10,17,0,0,0,1,0.58,0.5455,0.49,0.4627,152,253,405 +6821,2011-10-16,4,0,10,18,0,0,0,1,0.56,0.5303,0.49,0.4478,102,226,328 +6822,2011-10-16,4,0,10,19,0,0,0,1,0.54,0.5152,0.52,0.3881,52,181,233 +6823,2011-10-16,4,0,10,20,0,0,0,1,0.54,0.5152,0.52,0.4627,49,129,178 +6824,2011-10-16,4,0,10,21,0,0,0,1,0.54,0.5152,0.52,0.3881,14,88,102 +6825,2011-10-16,4,0,10,22,0,0,0,1,0.56,0.5303,0.49,0.4179,42,88,130 +6826,2011-10-16,4,0,10,23,0,0,0,1,0.56,0.5303,0.52,0.4179,14,48,62 +6827,2011-10-17,4,0,10,0,0,1,1,1,0.54,0.5152,0.56,0.3284,12,25,37 +6828,2011-10-17,4,0,10,1,0,1,1,1,0.56,0.5303,0.54,0.3881,2,15,17 
+6829,2011-10-17,4,0,10,2,0,1,1,1,0.56,0.5303,0.54,0.3881,4,3,7 +6830,2011-10-17,4,0,10,3,0,1,1,1,0.56,0.5303,0.49,0.194,0,2,2 +6831,2011-10-17,4,0,10,4,0,1,1,1,0.52,0.5,0.59,0.1642,2,3,5 +6832,2011-10-17,4,0,10,5,0,1,1,1,0.5,0.4848,0.63,0.2985,1,28,29 +6833,2011-10-17,4,0,10,6,0,1,1,1,0.5,0.4848,0.55,0.2537,7,107,114 +6834,2011-10-17,4,0,10,7,0,1,1,2,0.5,0.4848,0.55,0.1642,10,299,309 +6835,2011-10-17,4,0,10,8,0,1,1,2,0.5,0.4848,0.51,0.2537,33,381,414 +6836,2011-10-17,4,0,10,9,0,1,1,2,0.5,0.4848,0.45,0.2836,26,180,206 +6837,2011-10-17,4,0,10,10,0,1,1,1,0.52,0.5,0.45,0.1642,43,92,135 +6838,2011-10-17,4,0,10,11,0,1,1,1,0.56,0.5303,0.43,0,35,131,166 +6839,2011-10-17,4,0,10,12,0,1,1,1,0.56,0.5303,0.4,0.0896,39,166,205 +6840,2011-10-17,4,0,10,13,0,1,1,1,0.56,0.5303,0.46,0.0896,49,156,205 +6841,2011-10-17,4,0,10,14,0,1,1,1,0.58,0.5455,0.49,0.194,44,122,166 +6842,2011-10-17,4,0,10,15,0,1,1,1,0.56,0.5303,0.6,0.2537,66,142,208 +6843,2011-10-17,4,0,10,16,0,1,1,1,0.58,0.5455,0.56,0.2537,64,238,302 +6844,2011-10-17,4,0,10,17,0,1,1,1,0.56,0.5303,0.64,0.1343,80,540,620 +6845,2011-10-17,4,0,10,18,0,1,1,1,0.56,0.5303,0.6,0,49,469,518 +6846,2011-10-17,4,0,10,19,0,1,1,1,0.54,0.5152,0.68,0.1045,45,288,333 +6847,2011-10-17,4,0,10,20,0,1,1,1,0.52,0.5,0.77,0.1045,40,189,229 +6848,2011-10-17,4,0,10,21,0,1,1,1,0.5,0.4848,0.77,0.1045,36,141,177 +6849,2011-10-17,4,0,10,22,0,1,1,1,0.5,0.4848,0.77,0,16,88,104 +6850,2011-10-17,4,0,10,23,0,1,1,2,0.48,0.4697,0.88,0,10,52,62 +6851,2011-10-18,4,0,10,0,0,2,1,1,0.48,0.4697,0.82,0,8,23,31 +6852,2011-10-18,4,0,10,1,0,2,1,2,0.46,0.4545,0.88,0,4,7,11 +6853,2011-10-18,4,0,10,2,0,2,1,1,0.44,0.4394,0.94,0.0896,1,4,5 +6854,2011-10-18,4,0,10,3,0,2,1,1,0.44,0.4394,0.94,0.0896,0,1,1 +6855,2011-10-18,4,0,10,4,0,2,1,2,0.44,0.4394,0.94,0.1045,0,4,4 +6856,2011-10-18,4,0,10,5,0,2,1,2,0.46,0.4545,0.88,0,2,19,21 +6857,2011-10-18,4,0,10,6,0,2,1,1,0.46,0.4545,0.82,0.0896,4,105,109 +6858,2011-10-18,4,0,10,7,0,2,1,1,0.46,0.4545,0.88,0.0896,15,326,341 
+6859,2011-10-18,4,0,10,8,0,2,1,1,0.5,0.4848,0.77,0.0896,25,474,499 +6860,2011-10-18,4,0,10,9,0,2,1,2,0.52,0.5,0.77,0.0896,29,198,227 +6861,2011-10-18,4,0,10,10,0,2,1,2,0.56,0.5303,0.68,0,27,75,102 +6862,2011-10-18,4,0,10,11,0,2,1,2,0.6,0.6212,0.49,0.1045,35,112,147 +6863,2011-10-18,4,0,10,12,0,2,1,2,0.62,0.6212,0.43,0.0896,42,159,201 +6864,2011-10-18,4,0,10,13,0,2,1,2,0.64,0.6212,0.38,0.194,48,171,219 +6865,2011-10-18,4,0,10,14,0,2,1,2,0.62,0.6212,0.41,0.1642,32,137,169 +6866,2011-10-18,4,0,10,15,0,2,1,1,0.62,0.6212,0.5,0.194,66,139,205 +6867,2011-10-18,4,0,10,16,0,2,1,1,0.62,0.6212,0.57,0.2537,58,246,304 +6868,2011-10-18,4,0,10,17,0,2,1,1,0.6,0.6061,0.6,0.1343,72,553,625 +6869,2011-10-18,4,0,10,18,0,2,1,1,0.56,0.5303,0.68,0.1045,58,512,570 +6870,2011-10-18,4,0,10,19,0,2,1,1,0.56,0.5303,0.6,0.1045,31,286,317 +6871,2011-10-18,4,0,10,20,0,2,1,1,0.54,0.5152,0.64,0.1642,23,204,227 +6872,2011-10-18,4,0,10,21,0,2,1,2,0.54,0.5152,0.68,0.1343,28,178,206 +6873,2011-10-18,4,0,10,22,0,2,1,2,0.52,0.5,0.77,0.1642,22,115,137 +6874,2011-10-18,4,0,10,23,0,2,1,2,0.52,0.5,0.77,0.194,7,63,70 +6875,2011-10-19,4,0,10,0,0,3,1,2,0.52,0.5,0.77,0.194,11,23,34 +6876,2011-10-19,4,0,10,1,0,3,1,2,0.52,0.5,0.77,0.194,7,9,16 +6877,2011-10-19,4,0,10,2,0,3,1,2,0.52,0.5,0.77,0.194,1,6,7 +6878,2011-10-19,4,0,10,4,0,3,1,3,0.5,0.4848,0.94,0.194,0,3,3 +6879,2011-10-19,4,0,10,5,0,3,1,3,0.5,0.4848,1,0.1045,0,3,3 +6880,2011-10-19,4,0,10,6,0,3,1,3,0.52,0.5,0.83,0.2537,3,28,31 +6881,2011-10-19,4,0,10,7,0,3,1,3,0.52,0.5,0.77,0.2537,1,67,68 +6882,2011-10-19,4,0,10,8,0,3,1,3,0.54,0.5152,0.7,0.4925,10,200,210 +6883,2011-10-19,4,0,10,9,0,3,1,3,0.52,0.5,0.88,0.4925,17,185,202 +6884,2011-10-19,4,0,10,10,0,3,1,3,0.52,0.5,0.88,0.4925,14,109,123 +6885,2011-10-19,4,0,10,11,0,3,1,3,0.52,0.5,0.94,0.2537,33,116,149 +6886,2011-10-19,4,0,10,12,0,3,1,3,0.52,0.5,0.94,0.3284,13,117,130 +6887,2011-10-19,4,0,10,13,0,3,1,2,0.54,0.5152,0.94,0.2836,20,86,106 +6888,2011-10-19,4,0,10,14,0,3,1,3,0.54,0.5152,0.94,0.1642,17,80,97 
+6889,2011-10-19,4,0,10,15,0,3,1,3,0.54,0.5152,1,0.1343,21,99,120 +6890,2011-10-19,4,0,10,16,0,3,1,3,0.56,0.5303,0.94,0.1642,21,180,201 +6891,2011-10-19,4,0,10,17,0,3,1,3,0.56,0.5303,0.94,0.1642,11,170,181 +6892,2011-10-19,4,0,10,18,0,3,1,3,0.56,0.5303,1,0.1642,8,132,140 +6893,2011-10-19,4,0,10,19,0,3,1,2,0.6,0.5455,0.88,0.1343,8,189,197 +6894,2011-10-19,4,0,10,20,0,3,1,3,0.6,0.5152,0.94,0.0896,5,140,145 +6895,2011-10-19,4,0,10,21,0,3,1,2,0.58,0.5455,0.94,0.1343,6,79,85 +6896,2011-10-19,4,0,10,22,0,3,1,2,0.58,0.5455,0.94,0.3582,13,94,107 +6897,2011-10-19,4,0,10,23,0,3,1,2,0.58,0.5455,0.94,0.3582,14,55,69 +6898,2011-10-20,4,0,10,0,0,4,1,1,0.56,0.5303,0.94,0.2985,4,22,26 +6899,2011-10-20,4,0,10,1,0,4,1,1,0.56,0.5303,0.88,0.3582,4,11,15 +6900,2011-10-20,4,0,10,2,0,4,1,1,0.56,0.5303,0.88,0.4478,0,4,4 +6901,2011-10-20,4,0,10,3,0,4,1,1,0.5,0.4848,0.82,0.4627,0,1,1 +6902,2011-10-20,4,0,10,4,0,4,1,1,0.5,0.4848,0.82,0.4627,2,6,8 +6903,2011-10-20,4,0,10,5,0,4,1,1,0.48,0.4697,0.82,0.3582,2,24,26 +6904,2011-10-20,4,0,10,6,0,4,1,1,0.44,0.4394,0.82,0.4179,5,87,92 +6905,2011-10-20,4,0,10,7,0,4,1,1,0.42,0.4242,0.77,0.4478,15,303,318 +6906,2011-10-20,4,0,10,8,0,4,1,1,0.44,0.4394,0.67,0.5224,17,409,426 +6907,2011-10-20,4,0,10,9,0,4,1,1,0.44,0.4394,0.67,0.5821,26,197,223 +6908,2011-10-20,4,0,10,10,0,4,1,1,0.46,0.4545,0.59,0.4478,25,105,130 +6909,2011-10-20,4,0,10,11,0,4,1,1,0.48,0.4697,0.55,0.5224,30,99,129 +6910,2011-10-20,4,0,10,12,0,4,1,1,0.48,0.4697,0.48,0.5224,28,171,199 +6911,2011-10-20,4,0,10,13,0,4,1,1,0.5,0.4848,0.45,0.5821,28,136,164 +6912,2011-10-20,4,0,10,14,0,4,1,2,0.48,0.4697,0.48,0.5522,38,117,155 +6913,2011-10-20,4,0,10,15,0,4,1,2,0.48,0.4697,0.51,0.4478,22,138,160 +6914,2011-10-20,4,0,10,16,0,4,1,2,0.48,0.4697,0.51,0.4478,31,203,234 +6915,2011-10-20,4,0,10,17,0,4,1,2,0.48,0.4697,0.48,0.3881,56,438,494 +6916,2011-10-20,4,0,10,18,0,4,1,2,0.46,0.4545,0.51,0.2985,39,430,469 +6917,2011-10-20,4,0,10,19,0,4,1,1,0.46,0.4545,0.51,0.3582,20,278,298 
+6918,2011-10-20,4,0,10,20,0,4,1,2,0.46,0.4545,0.51,0.3284,34,207,241 +6919,2011-10-20,4,0,10,21,0,4,1,1,0.44,0.4394,0.51,0.2985,16,149,165 +6920,2011-10-20,4,0,10,22,0,4,1,1,0.44,0.4394,0.51,0.3284,22,106,128 +6921,2011-10-20,4,0,10,23,0,4,1,1,0.42,0.4242,0.58,0.2537,7,83,90 +6922,2011-10-21,4,0,10,0,0,5,1,1,0.4,0.4091,0.62,0.2537,8,42,50 +6923,2011-10-21,4,0,10,1,0,5,1,1,0.4,0.4091,0.62,0.2239,7,19,26 +6924,2011-10-21,4,0,10,2,0,5,1,1,0.38,0.3939,0.66,0.1642,6,12,18 +6925,2011-10-21,4,0,10,3,0,5,1,1,0.36,0.3485,0.71,0.1343,0,7,7 +6926,2011-10-21,4,0,10,4,0,5,1,1,0.36,0.3485,0.71,0.1642,0,6,6 +6927,2011-10-21,4,0,10,5,0,5,1,1,0.34,0.3333,0.76,0.194,0,31,31 +6928,2011-10-21,4,0,10,6,0,5,1,1,0.36,0.3485,0.71,0.1642,7,78,85 +6929,2011-10-21,4,0,10,7,0,5,1,1,0.36,0.3485,0.71,0.1642,5,228,233 +6930,2011-10-21,4,0,10,8,0,5,1,1,0.4,0.4091,0.62,0.194,16,386,402 +6931,2011-10-21,4,0,10,9,0,5,1,2,0.42,0.4242,0.58,0.1343,33,189,222 +6932,2011-10-21,4,0,10,10,0,5,1,1,0.46,0.4545,0.51,0.1642,32,106,138 +6933,2011-10-21,4,0,10,11,0,5,1,1,0.5,0.4848,0.45,0.2239,39,135,174 +6934,2011-10-21,4,0,10,12,0,5,1,1,0.5,0.4848,0.45,0.2985,47,193,240 +6935,2011-10-21,4,0,10,13,0,5,1,1,0.48,0.4697,0.48,0.3284,43,163,206 +6936,2011-10-21,4,0,10,14,0,5,1,1,0.52,0.5,0.42,0.2985,67,131,198 +6937,2011-10-21,4,0,10,15,0,5,1,1,0.5,0.4848,0.45,0.2836,57,178,235 +6938,2011-10-21,4,0,10,16,0,5,1,1,0.5,0.4848,0.45,0.2537,60,242,302 +6939,2011-10-21,4,0,10,17,0,5,1,2,0.46,0.4545,0.51,0.2537,79,445,524 +6940,2011-10-21,4,0,10,18,0,5,1,2,0.44,0.4394,0.54,0.2537,43,354,397 +6941,2011-10-21,4,0,10,19,0,5,1,2,0.44,0.4394,0.54,0.3881,25,228,253 +6942,2011-10-21,4,0,10,20,0,5,1,2,0.44,0.4394,0.54,0.2239,29,155,184 +6943,2011-10-21,4,0,10,21,0,5,1,1,0.42,0.4242,0.58,0.1642,31,112,143 +6944,2011-10-21,4,0,10,22,0,5,1,1,0.42,0.4242,0.54,0.1642,24,89,113 +6945,2011-10-21,4,0,10,23,0,5,1,1,0.4,0.4091,0.62,0.2239,18,99,117 +6946,2011-10-22,4,0,10,0,0,6,0,1,0.4,0.4091,0.62,0.194,16,80,96 
+6947,2011-10-22,4,0,10,1,0,6,0,1,0.4,0.4091,0.62,0.194,20,50,70 +6948,2011-10-22,4,0,10,2,0,6,0,1,0.4,0.4091,0.62,0.2537,6,25,31 +6949,2011-10-22,4,0,10,3,0,6,0,1,0.4,0.4091,0.62,0.1343,1,19,20 +6950,2011-10-22,4,0,10,4,0,6,0,1,0.38,0.3939,0.66,0.0896,4,4,8 +6951,2011-10-22,4,0,10,5,0,6,0,1,0.38,0.3939,0.66,0,1,7,8 +6952,2011-10-22,4,0,10,6,0,6,0,1,0.36,0.3788,0.71,0,1,17,18 +6953,2011-10-22,4,0,10,7,0,6,0,1,0.36,0.3636,0.76,0.1045,8,49,57 +6954,2011-10-22,4,0,10,8,0,6,0,1,0.4,0.4091,0.71,0,26,88,114 +6955,2011-10-22,4,0,10,9,0,6,0,1,0.42,0.4242,0.67,0.0896,47,122,169 +6956,2011-10-22,4,0,10,10,0,6,0,2,0.44,0.4394,0.62,0.1045,87,149,236 +6957,2011-10-22,4,0,10,11,0,6,0,2,0.44,0.4394,0.62,0.1045,110,172,282 +6958,2011-10-22,4,0,10,12,0,6,0,1,0.46,0.4545,0.55,0.0896,132,211,343 +6959,2011-10-22,4,0,10,13,0,6,0,1,0.5,0.4848,0.45,0,136,186,322 +6960,2011-10-22,4,0,10,14,0,6,0,2,0.48,0.4697,0.48,0.2239,133,177,310 +6961,2011-10-22,4,0,10,15,0,6,0,1,0.48,0.4697,0.51,0.2239,159,223,382 +6962,2011-10-22,4,0,10,16,0,6,0,1,0.5,0.4848,0.45,0.194,169,190,359 +6963,2011-10-22,4,0,10,17,0,6,0,1,0.46,0.4545,0.55,0.1343,143,217,360 +6964,2011-10-22,4,0,10,18,0,6,0,1,0.44,0.4394,0.54,0.0896,111,224,335 +6965,2011-10-22,4,0,10,19,0,6,0,1,0.44,0.4394,0.67,0,47,173,220 +6966,2011-10-22,4,0,10,20,0,6,0,1,0.42,0.4242,0.67,0,48,111,159 +6967,2011-10-22,4,0,10,21,0,6,0,1,0.4,0.4091,0.76,0,37,120,157 +6968,2011-10-22,4,0,10,22,0,6,0,1,0.4,0.4091,0.76,0,30,114,144 +6969,2011-10-22,4,0,10,23,0,6,0,1,0.38,0.3939,0.82,0,27,81,108 +6970,2011-10-23,4,0,10,0,0,0,0,1,0.36,0.3788,0.87,0,24,64,88 +6971,2011-10-23,4,0,10,1,0,0,0,1,0.36,0.3788,0.87,0,11,60,71 +6972,2011-10-23,4,0,10,2,0,0,0,1,0.34,0.3485,0.87,0.0896,10,40,50 +6973,2011-10-23,4,0,10,3,0,0,0,1,0.34,0.3636,0.87,0,15,31,46 +6974,2011-10-23,4,0,10,4,0,0,0,1,0.34,0.3636,0.87,0,5,1,6 +6975,2011-10-23,4,0,10,5,0,0,0,1,0.32,0.3485,0.87,0,1,2,3 +6976,2011-10-23,4,0,10,6,0,0,0,1,0.32,0.3333,0.87,0.1045,0,12,12 
+6977,2011-10-23,4,0,10,7,0,0,0,1,0.34,0.3485,0.87,0.0896,3,18,21 +6978,2011-10-23,4,0,10,8,0,0,0,1,0.36,0.3485,0.87,0.1343,32,56,88 +6979,2011-10-23,4,0,10,9,0,0,0,1,0.4,0.4091,0.76,0,48,95,143 +6980,2011-10-23,4,0,10,10,0,0,0,1,0.42,0.4242,0.71,0.0896,104,185,289 +6981,2011-10-23,4,0,10,11,0,0,0,1,0.46,0.4545,0.72,0.1642,131,202,333 +6982,2011-10-23,4,0,10,12,0,0,0,1,0.5,0.4848,0.59,0.2537,164,249,413 +6983,2011-10-23,4,0,10,13,0,0,0,1,0.52,0.5,0.55,0.194,160,255,415 +6984,2011-10-23,4,0,10,14,0,0,0,1,0.52,0.5,0.55,0.2239,192,216,408 +6985,2011-10-23,4,0,10,15,0,0,0,1,0.52,0.5,0.52,0.1642,171,216,387 +6986,2011-10-23,4,0,10,16,0,0,0,1,0.52,0.5,0.55,0.194,188,217,405 +6987,2011-10-23,4,0,10,17,0,0,0,1,0.5,0.4848,0.63,0.1343,120,220,340 +6988,2011-10-23,4,0,10,18,0,0,0,1,0.5,0.4848,0.63,0.1045,88,196,284 +6989,2011-10-23,4,0,10,19,0,0,0,1,0.46,0.4545,0.77,0,53,140,193 +6990,2011-10-23,4,0,10,20,0,0,0,1,0.44,0.4394,0.72,0.0896,32,112,144 +6991,2011-10-23,4,0,10,21,0,0,0,1,0.44,0.4394,0.72,0.0896,24,75,99 +6992,2011-10-23,4,0,10,22,0,0,0,1,0.42,0.4242,0.77,0.1045,25,69,94 +6993,2011-10-23,4,0,10,23,0,0,0,1,0.42,0.4242,0.77,0.1642,18,31,49 +6994,2011-10-24,4,0,10,0,0,1,1,1,0.42,0.4242,0.77,0,6,26,32 +6995,2011-10-24,4,0,10,1,0,1,1,1,0.4,0.4091,0.82,0.1343,7,8,15 +6996,2011-10-24,4,0,10,2,0,1,1,1,0.4,0.4091,0.87,0.1045,1,6,7 +6997,2011-10-24,4,0,10,3,0,1,1,1,0.4,0.4091,0.82,0.0896,1,3,4 +6998,2011-10-24,4,0,10,4,0,1,1,1,0.4,0.4091,0.82,0,1,5,6 +6999,2011-10-24,4,0,10,5,0,1,1,1,0.38,0.3939,0.87,0,2,19,21 +7000,2011-10-24,4,0,10,6,0,1,1,1,0.4,0.4091,0.87,0.1045,3,83,86 +7001,2011-10-24,4,0,10,7,0,1,1,2,0.38,0.3939,0.94,0.1045,11,274,285 +7002,2011-10-24,4,0,10,8,0,1,1,2,0.42,0.4242,0.88,0.1642,20,378,398 +7003,2011-10-24,4,0,10,9,0,1,1,2,0.44,0.4394,0.82,0.1642,29,166,195 +7004,2011-10-24,4,0,10,10,0,1,1,2,0.46,0.4545,0.82,0.2537,47,82,129 +7005,2011-10-24,4,0,10,11,0,1,1,2,0.5,0.4848,0.72,0.1642,59,109,168 
+7006,2011-10-24,4,0,10,12,0,1,1,2,0.52,0.5,0.68,0.1045,64,156,220 +7007,2011-10-24,4,0,10,13,0,1,1,2,0.54,0.5152,0.64,0.2239,60,138,198 +7008,2011-10-24,4,0,10,14,0,1,1,2,0.54,0.5152,0.64,0.194,63,143,206 +7009,2011-10-24,4,0,10,15,0,1,1,1,0.56,0.5303,0.6,0.1642,52,125,177 +7010,2011-10-24,4,0,10,16,0,1,1,1,0.54,0.5152,0.64,0.194,68,242,310 +7011,2011-10-24,4,0,10,17,0,1,1,1,0.52,0.5,0.68,0.1343,87,527,614 +7012,2011-10-24,4,0,10,18,0,1,1,1,0.54,0.5152,0.6,0.0896,58,428,486 +7013,2011-10-24,4,0,10,19,0,1,1,3,0.48,0.4697,0.82,0,25,212,237 +7014,2011-10-24,4,0,10,20,0,1,1,1,0.48,0.4697,0.82,0.1045,11,118,129 +7015,2011-10-24,4,0,10,21,0,1,1,1,0.46,0.4545,0.87,0.1045,12,114,126 +7016,2011-10-24,4,0,10,22,0,1,1,1,0.48,0.4697,0.75,0.1642,3,76,79 +7017,2011-10-24,4,0,10,23,0,1,1,1,0.46,0.4545,0.77,0.0896,9,50,59 +7018,2011-10-25,4,0,10,0,0,2,1,1,0.44,0.4394,0.77,0.1343,4,26,30 +7019,2011-10-25,4,0,10,1,0,2,1,1,0.44,0.4394,0.77,0.1343,5,6,11 +7020,2011-10-25,4,0,10,2,0,2,1,1,0.42,0.4242,0.82,0.1343,2,3,5 +7021,2011-10-25,4,0,10,3,0,2,1,1,0.4,0.4091,0.87,0,1,3,4 +7022,2011-10-25,4,0,10,4,0,2,1,1,0.38,0.3939,0.87,0.1343,0,5,5 +7023,2011-10-25,4,0,10,5,0,2,1,1,0.38,0.3939,0.87,0.1343,0,24,24 +7024,2011-10-25,4,0,10,6,0,2,1,1,0.38,0.3939,0.82,0.1045,3,95,98 +7025,2011-10-25,4,0,10,7,0,2,1,1,0.4,0.4091,0.76,0.194,15,299,314 +7026,2011-10-25,4,0,10,8,0,2,1,1,0.44,0.4394,0.72,0,25,383,408 +7027,2011-10-25,4,0,10,9,0,2,1,1,0.48,0.4697,0.55,0.2239,29,194,223 +7028,2011-10-25,4,0,10,10,0,2,1,1,0.5,0.4848,0.51,0.2537,26,102,128 +7029,2011-10-25,4,0,10,11,0,2,1,1,0.52,0.5,0.48,0.2836,46,120,166 +7030,2011-10-25,4,0,10,12,0,2,1,1,0.54,0.5152,0.39,0.3582,35,165,200 +7031,2011-10-25,4,0,10,13,0,2,1,1,0.56,0.5303,0.37,0.3881,37,161,198 +7032,2011-10-25,4,0,10,14,0,2,1,1,0.56,0.5303,0.4,0.2537,66,134,200 +7033,2011-10-25,4,0,10,15,0,2,1,1,0.56,0.5303,0.37,0.1045,63,171,234 +7034,2011-10-25,4,0,10,16,0,2,1,1,0.56,0.5303,0.37,0.194,52,233,285 
+7035,2011-10-25,4,0,10,17,0,2,1,1,0.56,0.5303,0.37,0.1045,68,517,585 +7036,2011-10-25,4,0,10,18,0,2,1,1,0.52,0.5,0.42,0.1045,65,453,518 +7037,2011-10-25,4,0,10,19,0,2,1,1,0.48,0.4697,0.67,0.194,44,294,338 +7038,2011-10-25,4,0,10,20,0,2,1,1,0.46,0.4545,0.67,0.1045,40,208,248 +7039,2011-10-25,4,0,10,21,0,2,1,1,0.44,0.4394,0.72,0.1343,32,169,201 +7040,2011-10-25,4,0,10,22,0,2,1,1,0.46,0.4545,0.67,0.194,22,148,170 +7041,2011-10-25,4,0,10,23,0,2,1,1,0.44,0.4394,0.72,0.1343,15,79,94 +7042,2011-10-26,4,0,10,0,0,3,1,1,0.44,0.4394,0.72,0.194,4,28,32 +7043,2011-10-26,4,0,10,1,0,3,1,1,0.44,0.4394,0.67,0.2537,3,9,12 +7044,2011-10-26,4,0,10,2,0,3,1,1,0.44,0.4394,0.67,0.1642,2,1,3 +7045,2011-10-26,4,0,10,3,0,3,1,1,0.44,0.4394,0.67,0.2239,0,3,3 +7046,2011-10-26,4,0,10,4,0,3,1,1,0.42,0.4242,0.71,0.2537,0,5,5 +7047,2011-10-26,4,0,10,5,0,3,1,1,0.42,0.4242,0.71,0.2239,0,21,21 +7048,2011-10-26,4,0,10,6,0,3,1,1,0.42,0.4242,0.71,0.2239,0,92,92 +7049,2011-10-26,4,0,10,7,0,3,1,1,0.42,0.4242,0.71,0.1642,15,284,299 +7050,2011-10-26,4,0,10,8,0,3,1,2,0.44,0.4394,0.67,0.194,26,438,464 +7051,2011-10-26,4,0,10,9,0,3,1,2,0.48,0.4697,0.63,0.2239,14,226,240 +7052,2011-10-26,4,0,10,10,0,3,1,2,0.52,0.5,0.62,0.2537,26,105,131 +7053,2011-10-26,4,0,10,11,0,3,1,1,0.52,0.5,0.59,0.3284,23,132,155 +7054,2011-10-26,4,0,10,12,0,3,1,3,0.58,0.5455,0.53,0.2239,39,155,194 +7055,2011-10-26,4,0,10,13,0,3,1,3,0.56,0.5303,0.6,0.1045,20,62,82 +7056,2011-10-26,4,0,10,14,0,3,1,2,0.52,0.5,0.77,0,14,52,66 +7057,2011-10-26,4,0,10,15,0,3,1,2,0.52,0.5,0.77,0,11,59,70 +7058,2011-10-26,4,0,10,16,0,3,1,2,0.52,0.5,0.77,0.0896,17,196,213 +7059,2011-10-26,4,0,10,17,0,3,1,2,0.52,0.5,0.77,0.1045,40,414,454 +7060,2011-10-26,4,0,10,18,0,3,1,1,0.52,0.5,0.77,0.1642,55,398,453 +7061,2011-10-26,4,0,10,19,0,3,1,1,0.5,0.4848,0.82,0,25,306,331 +7062,2011-10-26,4,0,10,20,0,3,1,1,0.5,0.4848,0.88,0,27,180,207 +7063,2011-10-26,4,0,10,21,0,3,1,2,0.52,0.5,0.77,0,17,140,157 +7064,2011-10-26,4,0,10,22,0,3,1,2,0.48,0.4697,0.88,0.0896,23,112,135 
+7065,2011-10-26,4,0,10,23,0,3,1,2,0.48,0.4697,0.88,0.0896,3,72,75 +7066,2011-10-27,4,0,10,0,0,4,1,1,0.46,0.4545,0.94,0,3,23,26 +7067,2011-10-27,4,0,10,1,0,4,1,1,0.46,0.4545,0.94,0,2,9,11 +7068,2011-10-27,4,0,10,2,0,4,1,3,0.46,0.4545,0.94,0,3,5,8 +7069,2011-10-27,4,0,10,3,0,4,1,3,0.46,0.4545,0.94,0,1,3,4 +7070,2011-10-27,4,0,10,4,0,4,1,3,0.48,0.4697,0.88,0,1,3,4 +7071,2011-10-27,4,0,10,5,0,4,1,3,0.48,0.4697,0.88,0.0896,1,14,15 +7072,2011-10-27,4,0,10,6,0,4,1,3,0.5,0.4848,0.82,0,4,42,46 +7073,2011-10-27,4,0,10,7,0,4,1,2,0.46,0.4545,0.94,0.0896,9,128,137 +7074,2011-10-27,4,0,10,8,0,4,1,3,0.48,0.4697,0.88,0.194,12,304,316 +7075,2011-10-27,4,0,10,9,0,4,1,2,0.5,0.4848,0.82,0.1642,16,155,171 +7076,2011-10-27,4,0,10,10,0,4,1,3,0.5,0.4848,0.88,0.194,10,52,62 +7077,2011-10-27,4,0,10,11,0,4,1,3,0.5,0.4848,0.88,0.1343,8,44,52 +7078,2011-10-27,4,0,10,12,0,4,1,3,0.5,0.4848,0.88,0.1343,8,45,53 +7079,2011-10-27,4,0,10,13,0,4,1,2,0.5,0.4848,0.88,0.1642,10,64,74 +7080,2011-10-27,4,0,10,14,0,4,1,2,0.52,0.5,0.83,0,16,76,92 +7081,2011-10-27,4,0,10,15,0,4,1,2,0.52,0.5,0.83,0.1045,12,95,107 +7082,2011-10-27,4,0,10,16,0,4,1,2,0.56,0.5303,0.73,0.1642,14,193,207 +7083,2011-10-27,4,0,10,17,0,4,1,2,0.5,0.4848,0.72,0.4627,31,291,322 +7084,2011-10-27,4,0,10,18,0,4,1,2,0.48,0.4697,0.67,0.4627,14,226,240 +7085,2011-10-27,4,0,10,19,0,4,1,2,0.44,0.4394,0.67,0.4627,19,200,219 +7086,2011-10-27,4,0,10,20,0,4,1,1,0.42,0.4242,0.67,0.4925,11,171,182 +7087,2011-10-27,4,0,10,21,0,4,1,1,0.4,0.4091,0.62,0.5224,13,119,132 +7088,2011-10-27,4,0,10,22,0,4,1,1,0.36,0.3182,0.66,0.4925,14,97,111 +7089,2011-10-27,4,0,10,23,0,4,1,1,0.34,0.303,0.61,0.4179,8,60,68 +7090,2011-10-28,4,0,10,0,0,5,1,1,0.34,0.3182,0.66,0.2537,4,40,44 +7091,2011-10-28,4,0,10,1,0,5,1,1,0.32,0.303,0.66,0.2537,7,9,16 +7092,2011-10-28,4,0,10,2,0,5,1,1,0.32,0.303,0.66,0.2836,1,3,4 +7093,2011-10-28,4,0,10,3,0,5,1,1,0.3,0.2879,0.7,0.2836,4,4,8 +7094,2011-10-28,4,0,10,4,0,5,1,1,0.3,0.2879,0.65,0.2836,5,5,10 
+7095,2011-10-28,4,0,10,5,0,5,1,2,0.3,0.2879,0.61,0.2537,0,25,25 +7096,2011-10-28,4,0,10,6,0,5,1,1,0.28,0.2727,0.65,0.2537,1,74,75 +7097,2011-10-28,4,0,10,7,0,5,1,1,0.28,0.2727,0.65,0.2537,6,199,205 +7098,2011-10-28,4,0,10,8,0,5,1,1,0.3,0.2879,0.61,0.2836,13,361,374 +7099,2011-10-28,4,0,10,9,0,5,1,1,0.32,0.303,0.57,0.2239,19,210,229 +7100,2011-10-28,4,0,10,10,0,5,1,1,0.34,0.3182,0.57,0.2239,28,102,130 +7101,2011-10-28,4,0,10,11,0,5,1,1,0.36,0.3485,0.5,0.2239,40,128,168 +7102,2011-10-28,4,0,10,12,0,5,1,2,0.38,0.3939,0.4,0.1642,46,167,213 +7103,2011-10-28,4,0,10,13,0,5,1,2,0.38,0.3939,0.42,0.2239,43,185,228 +7104,2011-10-28,4,0,10,14,0,5,1,2,0.36,0.3485,0.46,0.194,33,152,185 +7105,2011-10-28,4,0,10,15,0,5,1,2,0.36,0.3333,0.43,0.2537,44,171,215 +7106,2011-10-28,4,0,10,16,0,5,1,2,0.36,0.3485,0.46,0.1642,46,262,308 +7107,2011-10-28,4,0,10,17,0,5,1,2,0.36,0.3485,0.5,0.1343,35,411,446 +7108,2011-10-28,4,0,10,18,0,5,1,2,0.36,0.3485,0.53,0.1642,36,332,368 +7109,2011-10-28,4,0,10,19,0,5,1,2,0.36,0.3485,0.5,0.1343,16,188,204 +7110,2011-10-28,4,0,10,20,0,5,1,3,0.34,0.3182,0.61,0.2239,12,131,143 +7111,2011-10-28,4,0,10,21,0,5,1,3,0.32,0.3182,0.7,0.1642,5,69,74 +7112,2011-10-28,4,0,10,22,0,5,1,3,0.3,0.2879,0.75,0.2836,4,32,36 +7113,2011-10-28,4,0,10,23,0,5,1,3,0.3,0.2727,0.81,0.3284,8,31,39 +7114,2011-10-29,4,0,10,0,0,6,0,3,0.28,0.2576,0.87,0.2985,0,19,19 +7115,2011-10-29,4,0,10,1,0,6,0,3,0.3,0.2727,0.87,0.2985,0,18,18 +7116,2011-10-29,4,0,10,2,0,6,0,3,0.3,0.2727,0.87,0.2985,1,16,17 +7117,2011-10-29,4,0,10,3,0,6,0,3,0.3,0.2727,0.81,0.4179,0,8,8 +7118,2011-10-29,4,0,10,4,0,6,0,3,0.3,0.2727,0.81,0.4179,0,1,1 +7119,2011-10-29,4,0,10,5,0,6,0,3,0.26,0.2273,0.93,0.3582,0,1,1 +7120,2011-10-29,4,0,10,6,0,6,0,3,0.26,0.2273,0.87,0.3582,4,1,5 +7121,2011-10-29,4,0,10,7,0,6,0,3,0.26,0.2273,0.87,0.3582,1,6,7 +7122,2011-10-29,4,0,10,8,0,6,0,3,0.28,0.2576,0.87,0.3582,4,16,20 +7123,2011-10-29,4,0,10,9,0,6,0,3,0.28,0.2576,0.87,0.3582,1,19,20 
+7124,2011-10-29,4,0,10,10,0,6,0,3,0.26,0.2273,0.93,0.3284,0,12,12 +7125,2011-10-29,4,0,10,11,0,6,0,3,0.26,0.2273,0.93,0.3881,1,26,27 +7126,2011-10-29,4,0,10,12,0,6,0,3,0.24,0.197,0.87,0.4925,6,44,50 +7127,2011-10-29,4,0,10,13,0,6,0,3,0.24,0.197,0.87,0.5224,0,30,30 +7128,2011-10-29,4,0,10,14,0,6,0,3,0.24,0.197,0.87,0.4478,0,29,29 +7129,2011-10-29,4,0,10,15,0,6,0,3,0.22,0.2121,0.93,0.2537,3,38,41 +7130,2011-10-29,4,0,10,16,0,6,0,3,0.22,0.197,0.93,0.3284,3,19,22 +7131,2011-10-29,4,0,10,17,0,6,0,3,0.22,0.197,0.93,0.3284,3,28,31 +7132,2011-10-29,4,0,10,18,0,6,0,3,0.22,0.197,0.93,0.3284,6,37,43 +7133,2011-10-29,4,0,10,19,0,6,0,1,0.24,0.2121,0.87,0.3582,3,36,39 +7134,2011-10-29,4,0,10,20,0,6,0,1,0.24,0.2121,0.87,0.3582,7,40,47 +7135,2011-10-29,4,0,10,21,0,6,0,1,0.24,0.2121,0.87,0.3582,1,49,50 +7136,2011-10-29,4,0,10,22,0,6,0,1,0.22,0.2121,0.87,0.2239,10,44,54 +7137,2011-10-29,4,0,10,23,0,6,0,1,0.22,0.2273,0.87,0.194,3,33,36 +7138,2011-10-30,4,0,10,0,0,0,0,1,0.22,0.2121,0.87,0.2239,7,47,54 +7139,2011-10-30,4,0,10,1,0,0,0,1,0.22,0.2121,0.87,0.2537,9,34,43 +7140,2011-10-30,4,0,10,2,0,0,0,1,0.22,0.2121,0.87,0.2836,7,43,50 +7141,2011-10-30,4,0,10,3,0,0,0,1,0.24,0.2121,0.75,0.3582,7,26,33 +7142,2011-10-30,4,0,10,4,0,0,0,1,0.22,0.197,0.8,0.3284,1,10,11 +7143,2011-10-30,4,0,10,5,0,0,0,1,0.24,0.2121,0.75,0.2985,0,4,4 +7144,2011-10-30,4,0,10,6,0,0,0,1,0.24,0.2273,0.75,0.2537,2,8,10 +7145,2011-10-30,4,0,10,7,0,0,0,1,0.24,0.2879,0.75,0,7,15,22 +7146,2011-10-30,4,0,10,8,0,0,0,1,0.26,0.2576,0.7,0.2239,20,60,80 +7147,2011-10-30,4,0,10,9,0,0,0,1,0.3,0.2879,0.65,0.2537,55,92,147 +7148,2011-10-30,4,0,10,10,0,0,0,1,0.32,0.3333,0.61,0.0896,53,125,178 +7149,2011-10-30,4,0,10,11,0,0,0,1,0.36,0.3485,0.53,0.2239,58,182,240 +7150,2011-10-30,4,0,10,12,0,0,0,1,0.38,0.3939,0.5,0.194,85,229,314 +7151,2011-10-30,4,0,10,13,0,0,0,1,0.4,0.4091,0.43,0.2239,113,232,345 +7152,2011-10-30,4,0,10,14,0,0,0,1,0.42,0.4242,0.38,0.194,91,209,300 +7153,2011-10-30,4,0,10,15,0,0,0,1,0.42,0.4242,0.35,0.1642,88,202,290 
+7154,2011-10-30,4,0,10,16,0,0,0,1,0.42,0.4242,0.32,0.1343,107,213,320 +7155,2011-10-30,4,0,10,17,0,0,0,1,0.4,0.4091,0.35,0.1045,54,191,245 +7156,2011-10-30,4,0,10,18,0,0,0,1,0.36,0.3485,0.4,0.1343,51,162,213 +7157,2011-10-30,4,0,10,19,0,0,0,1,0.56,0.5303,0.49,0.2985,28,125,153 +7158,2011-10-30,4,0,10,20,0,0,0,1,0.34,0.3636,0.57,0,18,74,92 +7159,2011-10-30,4,0,10,21,0,0,0,1,0.32,0.3485,0.66,0,4,75,79 +7160,2011-10-30,4,0,10,22,0,0,0,1,0.3,0.3333,0.75,0,13,58,71 +7161,2011-10-30,4,0,10,23,0,0,0,1,0.26,0.303,0.87,0,7,30,37 +7162,2011-10-31,4,0,10,0,0,1,1,1,0.26,0.303,0.87,0,3,20,23 +7163,2011-10-31,4,0,10,1,0,1,1,1,0.26,0.303,0.81,0,5,8,13 +7164,2011-10-31,4,0,10,2,0,1,1,1,0.24,0.2879,0.87,0,0,3,3 +7165,2011-10-31,4,0,10,3,0,1,1,1,0.24,0.2576,0.87,0.1045,0,3,3 +7166,2011-10-31,4,0,10,4,0,1,1,1,0.24,0.2879,0.87,0,1,5,6 +7167,2011-10-31,4,0,10,5,0,1,1,1,0.22,0.2727,0.93,0,0,18,18 +7168,2011-10-31,4,0,10,6,0,1,1,1,0.24,0.2879,0.87,0,4,82,86 +7169,2011-10-31,4,0,10,7,0,1,1,1,0.24,0.2879,0.93,0,11,216,227 +7170,2011-10-31,4,0,10,8,0,1,1,2,0.28,0.3182,0.87,0,17,355,372 +7171,2011-10-31,4,0,10,9,0,1,1,2,0.32,0.3182,0.76,0.1642,14,197,211 +7172,2011-10-31,4,0,10,10,0,1,1,2,0.36,0.3485,0.57,0.1642,18,87,105 +7173,2011-10-31,4,0,10,11,0,1,1,1,0.4,0.4091,0.5,0.1343,31,97,128 +7174,2011-10-31,4,0,10,12,0,1,1,1,0.42,0.4242,0.44,0.2239,30,140,170 +7175,2011-10-31,4,0,10,13,0,1,1,2,0.44,0.4394,0.44,0.2239,24,128,152 +7176,2011-10-31,4,0,10,14,0,1,1,2,0.44,0.4394,0.47,0.2239,22,119,141 +7177,2011-10-31,4,0,10,15,0,1,1,1,0.44,0.4394,0.47,0.1642,27,141,168 +7178,2011-10-31,4,0,10,16,0,1,1,1,0.42,0.4242,0.54,0.1642,30,242,272 +7179,2011-10-31,4,0,10,17,0,1,1,1,0.42,0.4242,0.54,0.1343,44,442,486 +7180,2011-10-31,4,0,10,18,0,1,1,1,0.4,0.4091,0.66,0.0896,30,392,422 +7181,2011-10-31,4,0,10,19,0,1,1,1,0.4,0.4091,0.66,0.0896,12,226,238 +7182,2011-10-31,4,0,10,20,0,1,1,1,0.4,0.4091,0.66,0.0896,18,154,172 +7183,2011-10-31,4,0,10,21,0,1,1,2,0.36,0.3485,0.76,0.194,7,109,116 
+7184,2011-10-31,4,0,10,22,0,1,1,2,0.36,0.3485,0.76,0.194,8,77,85 +7185,2011-10-31,4,0,10,23,0,1,1,2,0.36,0.3485,0.76,0.194,6,46,52 +7186,2011-11-01,4,0,11,0,0,2,1,2,0.36,0.3485,0.87,0.1642,3,18,21 +7187,2011-11-01,4,0,11,1,0,2,1,1,0.36,0.3485,0.81,0.1343,3,8,11 +7188,2011-11-01,4,0,11,2,0,2,1,2,0.36,0.3485,0.81,0.1642,1,3,4 +7189,2011-11-01,4,0,11,3,0,2,1,2,0.36,0.3485,0.81,0.1343,1,5,6 +7190,2011-11-01,4,0,11,4,0,2,1,1,0.34,0.3182,0.81,0.2239,1,7,8 +7191,2011-11-01,4,0,11,5,0,2,1,1,0.32,0.3182,0.81,0.194,0,18,18 +7192,2011-11-01,4,0,11,6,0,2,1,1,0.32,0.3182,0.81,0.194,3,90,93 +7193,2011-11-01,4,0,11,7,0,2,1,1,0.34,0.3333,0.76,0.194,8,246,254 +7194,2011-11-01,4,0,11,8,0,2,1,1,0.36,0.3333,0.71,0.2537,17,402,419 +7195,2011-11-01,4,0,11,9,0,2,1,1,0.4,0.4091,0.66,0.1642,16,206,222 +7196,2011-11-01,4,0,11,10,0,2,1,1,0.44,0.4394,0.62,0.1343,21,114,135 +7197,2011-11-01,4,0,11,11,0,2,1,1,0.46,0.4545,0.63,0.1045,16,101,117 +7198,2011-11-01,4,0,11,12,0,2,1,1,0.5,0.4848,0.48,0.0896,23,153,176 +7199,2011-11-01,4,0,11,13,0,2,1,1,0.48,0.4697,0.55,0.194,20,147,167 +7200,2011-11-01,4,0,11,14,0,2,1,1,0.5,0.4848,0.48,0.2537,32,118,150 +7201,2011-11-01,4,0,11,15,0,2,1,1,0.5,0.4848,0.45,0.1343,38,148,186 +7202,2011-11-01,4,0,11,16,0,2,1,1,0.48,0.4697,0.44,0.1642,46,252,298 +7203,2011-11-01,4,0,11,17,0,2,1,1,0.44,0.4394,0.54,0.1642,36,470,506 +7204,2011-11-01,4,0,11,18,0,2,1,1,0.42,0.4242,0.58,0.1045,39,421,460 +7205,2011-11-01,4,0,11,19,0,2,1,1,0.42,0.4242,0.58,0.0896,39,274,313 +7206,2011-11-01,4,0,11,20,0,2,1,1,0.4,0.4091,0.71,0,18,191,209 +7207,2011-11-01,4,0,11,21,0,2,1,1,0.36,0.3788,0.81,0,13,114,127 +7208,2011-11-01,4,0,11,22,0,2,1,1,0.36,0.3788,0.81,0,5,91,96 +7209,2011-11-01,4,0,11,23,0,2,1,1,0.34,0.3636,0.87,0,11,61,72 +7210,2011-11-02,4,0,11,0,0,3,1,1,0.32,0.3485,0.87,0,0,19,19 +7211,2011-11-02,4,0,11,1,0,3,1,1,0.3,0.3333,0.87,0,2,8,10 +7212,2011-11-02,4,0,11,2,0,3,1,1,0.3,0.3333,0.87,0,0,2,2 +7213,2011-11-02,4,0,11,3,0,3,1,1,0.3,0.3333,0.75,0,0,2,2 
+7214,2011-11-02,4,0,11,4,0,3,1,1,0.3,0.3333,0.87,0,0,4,4 +7215,2011-11-02,4,0,11,5,0,3,1,1,0.3,0.3333,0.81,0,0,27,27 +7216,2011-11-02,4,0,11,6,0,3,1,1,0.3,0.3333,0.81,0,1,91,92 +7217,2011-11-02,4,0,11,7,0,3,1,1,0.3,0.3333,0.81,0,11,240,251 +7218,2011-11-02,4,0,11,8,0,3,1,1,0.32,0.3485,0.87,0,20,452,472 +7219,2011-11-02,4,0,11,9,0,3,1,1,0.34,0.3636,0.87,0,15,213,228 +7220,2011-11-02,4,0,11,10,0,3,1,1,0.4,0.4091,0.71,0.0896,25,108,133 +7221,2011-11-02,4,0,11,11,0,3,1,1,0.42,0.4242,0.71,0.1343,27,117,144 +7222,2011-11-02,4,0,11,12,0,3,1,1,0.46,0.4545,0.55,0.1642,32,157,189 +7223,2011-11-02,4,0,11,13,0,3,1,1,0.48,0.4697,0.48,0.1642,22,139,161 +7224,2011-11-02,4,0,11,14,0,3,1,1,0.5,0.4848,0.45,0.1642,40,132,172 +7225,2011-11-02,4,0,11,15,0,3,1,1,0.48,0.4697,0.51,0.1642,26,161,187 +7226,2011-11-02,4,0,11,16,0,3,1,1,0.48,0.4697,0.51,0.1045,35,231,266 +7227,2011-11-02,4,0,11,17,0,3,1,1,0.44,0.4394,0.62,0.194,30,523,553 +7228,2011-11-02,4,0,11,18,0,3,1,1,0.42,0.4242,0.71,0.1642,31,448,479 +7229,2011-11-02,4,0,11,19,0,3,1,1,0.4,0.4091,0.76,0.1343,25,257,282 +7230,2011-11-02,4,0,11,20,0,3,1,1,0.4,0.4091,0.66,0.1343,5,177,182 +7231,2011-11-02,4,0,11,21,0,3,1,1,0.38,0.3939,0.71,0.1343,13,155,168 +7232,2011-11-02,4,0,11,22,0,3,1,1,0.36,0.3485,0.71,0.1343,7,99,106 +7233,2011-11-02,4,0,11,23,0,3,1,1,0.36,0.3636,0.76,0.0896,3,54,57 +7234,2011-11-03,4,0,11,0,0,4,1,1,0.36,0.3636,0.76,0.0896,3,28,31 +7235,2011-11-03,4,0,11,1,0,4,1,1,0.34,0.3333,0.81,0.1343,3,12,15 +7236,2011-11-03,4,0,11,2,0,4,1,1,0.34,0.3333,0.76,0.1343,2,5,7 +7237,2011-11-03,4,0,11,3,0,4,1,1,0.34,0.3333,0.81,0.1343,1,4,5 +7238,2011-11-03,4,0,11,4,0,4,1,1,0.32,0.3333,0.81,0.0896,1,3,4 +7239,2011-11-03,4,0,11,5,0,4,1,1,0.32,0.3333,0.81,0.1343,1,27,28 +7240,2011-11-03,4,0,11,6,0,4,1,1,0.34,0.3333,0.76,0.194,3,96,99 +7241,2011-11-03,4,0,11,7,0,4,1,2,0.32,0.3333,0.81,0.0896,12,280,292 +7242,2011-11-03,4,0,11,8,0,4,1,2,0.34,0.3485,0.81,0.0896,8,394,402 +7243,2011-11-03,4,0,11,9,0,4,1,2,0.36,0.3485,0.81,0.194,9,155,164 
+7244,2011-11-03,4,0,11,10,0,4,1,2,0.4,0.4091,0.76,0.194,12,98,110 +7245,2011-11-03,4,0,11,11,0,4,1,1,0.44,0.4394,0.67,0.194,12,108,120 +7246,2011-11-03,4,0,11,12,0,4,1,1,0.5,0.4848,0.51,0.194,17,162,179 +7247,2011-11-03,4,0,11,13,0,4,1,2,0.52,0.5,0.48,0.2537,19,150,169 +7248,2011-11-03,4,0,11,14,0,4,1,1,0.52,0.5,0.52,0.1642,32,120,152 +7249,2011-11-03,4,0,11,15,0,4,1,1,0.52,0.5,0.52,0.1642,24,138,162 +7250,2011-11-03,4,0,11,16,0,4,1,1,0.52,0.5,0.48,0.194,33,234,267 +7251,2011-11-03,4,0,11,17,0,4,1,1,0.48,0.4697,0.55,0.0896,33,465,498 +7252,2011-11-03,4,0,11,18,0,4,1,1,0.46,0.4545,0.63,0.1343,24,409,433 +7253,2011-11-03,4,0,11,19,0,4,1,1,0.44,0.4394,0.67,0.0896,18,234,252 +7254,2011-11-03,4,0,11,20,0,4,1,1,0.42,0.4242,0.71,0.1045,11,198,209 +7255,2011-11-03,4,0,11,21,0,4,1,1,0.4,0.4091,0.76,0.1343,4,140,144 +7256,2011-11-03,4,0,11,22,0,4,1,1,0.4,0.4091,0.82,0,21,116,137 +7257,2011-11-03,4,0,11,23,0,4,1,1,0.4,0.4091,0.82,0.0896,15,80,95 +7258,2011-11-04,4,0,11,0,0,5,1,2,0.4,0.4091,0.82,0,11,32,43 +7259,2011-11-04,4,0,11,1,0,5,1,2,0.4,0.4091,0.82,0,1,16,17 +7260,2011-11-04,4,0,11,2,0,5,1,2,0.4,0.4091,0.76,0,2,8,10 +7261,2011-11-04,4,0,11,3,0,5,1,2,0.38,0.3939,0.87,0.1642,1,7,8 +7262,2011-11-04,4,0,11,4,0,5,1,2,0.38,0.3939,0.87,0.0896,1,6,7 +7263,2011-11-04,4,0,11,5,0,5,1,2,0.38,0.3939,0.87,0,0,23,23 +7264,2011-11-04,4,0,11,6,0,5,1,2,0.38,0.3939,0.87,0.0896,2,64,66 +7265,2011-11-04,4,0,11,7,0,5,1,2,0.4,0.4091,0.86,0.2239,10,235,245 +7266,2011-11-04,4,0,11,8,0,5,1,2,0.4,0.4091,0.87,0.3582,8,387,395 +7267,2011-11-04,4,0,11,9,0,5,1,2,0.42,0.4242,0.71,0.4627,15,239,254 +7268,2011-11-04,4,0,11,10,0,5,1,2,0.42,0.4242,0.71,0.3284,19,115,134 +7269,2011-11-04,4,0,11,11,0,5,1,2,0.44,0.4394,0.67,0.4478,34,128,162 +7270,2011-11-04,4,0,11,12,0,5,1,2,0.48,0.4697,0.55,0.4925,44,153,197 +7271,2011-11-04,4,0,11,13,0,5,1,1,0.44,0.4394,0.51,0.4179,33,161,194 +7272,2011-11-04,4,0,11,14,0,5,1,1,0.44,0.4394,0.47,0.2985,54,147,201 
+7273,2011-11-04,4,0,11,15,0,5,1,1,0.46,0.4545,0.44,0.3881,45,192,237 +7274,2011-11-04,4,0,11,16,0,5,1,1,0.46,0.4545,0.36,0.3881,53,237,290 +7275,2011-11-04,4,0,11,17,0,5,1,1,0.42,0.4242,0.38,0.3582,42,438,480 +7276,2011-11-04,4,0,11,18,0,5,1,1,0.42,0.4242,0.32,0.4627,35,339,374 +7277,2011-11-04,4,0,11,19,0,5,1,1,0.4,0.4091,0.37,0.3582,14,182,196 +7278,2011-11-04,4,0,11,20,0,5,1,1,0.36,0.3333,0.43,0.2836,15,171,186 +7279,2011-11-04,4,0,11,21,0,5,1,1,0.34,0.3182,0.46,0.2537,8,115,123 +7280,2011-11-04,4,0,11,22,0,5,1,1,0.34,0.303,0.46,0.3284,6,109,115 +7281,2011-11-04,4,0,11,23,0,5,1,1,0.32,0.303,0.49,0.3284,17,72,89 +7282,2011-11-05,4,0,11,0,0,6,0,1,0.32,0.2879,0.45,0.3582,4,48,52 +7283,2011-11-05,4,0,11,1,0,6,0,1,0.32,0.303,0.45,0.2985,5,57,62 +7284,2011-11-05,4,0,11,2,0,6,0,1,0.3,0.2727,0.52,0.3284,7,24,31 +7285,2011-11-05,4,0,11,3,0,6,0,1,0.3,0.2727,0.52,0.2985,0,8,8 +7286,2011-11-05,4,0,11,4,0,6,0,1,0.28,0.2576,0.56,0.3284,3,5,8 +7287,2011-11-05,4,0,11,5,0,6,0,1,0.26,0.2273,0.6,0.3284,0,2,2 +7288,2011-11-05,4,0,11,6,0,6,0,1,0.26,0.2424,0.6,0.2537,3,20,23 +7289,2011-11-05,4,0,11,7,0,6,0,1,0.26,0.2424,0.6,0.2836,3,31,34 +7290,2011-11-05,4,0,11,8,0,6,0,1,0.28,0.2576,0.56,0.3284,4,80,84 +7291,2011-11-05,4,0,11,9,0,6,0,1,0.3,0.2727,0.56,0.2985,30,111,141 +7292,2011-11-05,4,0,11,10,0,6,0,1,0.32,0.303,0.53,0.2537,56,171,227 +7293,2011-11-05,4,0,11,11,0,6,0,1,0.36,0.3636,0.46,0.1045,69,169,238 +7294,2011-11-05,4,0,11,12,0,6,0,1,0.36,0.3485,0.46,0.1642,137,235,372 +7295,2011-11-05,4,0,11,13,0,6,0,1,0.4,0.4091,0.37,0.1045,148,207,355 +7296,2011-11-05,4,0,11,14,0,6,0,1,0.42,0.4242,0.35,0.1343,159,227,386 +7297,2011-11-05,4,0,11,15,0,6,0,1,0.4,0.4091,0.37,0.194,141,202,343 +7298,2011-11-05,4,0,11,16,0,6,0,1,0.4,0.4091,0.4,0.2239,128,207,335 +7299,2011-11-05,4,0,11,17,0,6,0,1,0.38,0.3939,0.4,0.1642,100,234,334 +7300,2011-11-05,4,0,11,18,0,6,0,1,0.36,0.3636,0.5,0.0896,52,185,237 +7301,2011-11-05,4,0,11,19,0,6,0,1,0.34,0.3636,0.53,0,45,159,204 
+7302,2011-11-05,4,0,11,20,0,6,0,1,0.32,0.3485,0.66,0,29,136,165 +7303,2011-11-05,4,0,11,21,0,6,0,1,0.32,0.3485,0.66,0,19,99,118 +7304,2011-11-05,4,0,11,22,0,6,0,1,0.3,0.3333,0.65,0,10,72,82 +7305,2011-11-05,4,0,11,23,0,6,0,1,0.28,0.3182,0.7,0,4,81,85 +7306,2011-11-06,4,0,11,0,0,0,0,1,0.28,0.3182,0.75,0,10,65,75 +7307,2011-11-06,4,0,11,1,0,0,0,1,0.26,0.303,0.81,0,11,104,115 +7308,2011-11-06,4,0,11,2,0,0,0,1,0.26,0.303,0.81,0,6,23,29 +7309,2011-11-06,4,0,11,3,0,0,0,1,0.24,0.2879,0.87,0,5,4,9 +7310,2011-11-06,4,0,11,4,0,0,0,1,0.24,0.2879,0.87,0,0,6,6 +7311,2011-11-06,4,0,11,5,0,0,0,1,0.24,0.2424,0.87,0.1343,0,5,5 +7312,2011-11-06,4,0,11,6,0,0,0,1,0.26,0.2879,0.81,0.0896,5,11,16 +7313,2011-11-06,4,0,11,7,0,0,0,1,0.26,0.2727,0.87,0.1045,4,24,28 +7314,2011-11-06,4,0,11,8,0,0,0,2,0.28,0.303,0.87,0.0896,19,71,90 +7315,2011-11-06,4,0,11,9,0,0,0,2,0.3,0.3182,0.75,0.0896,36,134,170 +7316,2011-11-06,4,0,11,10,0,0,0,1,0.36,0.3636,0.66,0.0896,76,186,262 +7317,2011-11-06,4,0,11,11,0,0,0,1,0.42,0.4242,0.58,0.0896,104,216,320 +7318,2011-11-06,4,0,11,12,0,0,0,1,0.44,0.4394,0.54,0,120,226,346 +7319,2011-11-06,4,0,11,13,0,0,0,1,0.46,0.4545,0.51,0.1642,105,209,314 +7320,2011-11-06,4,0,11,14,0,0,0,1,0.46,0.4545,0.55,0.194,101,219,320 +7321,2011-11-06,4,0,11,15,0,0,0,1,0.48,0.4697,0.51,0.194,96,210,306 +7322,2011-11-06,4,0,11,16,0,0,0,1,0.46,0.4545,0.55,0.2239,84,263,347 +7323,2011-11-06,4,0,11,17,0,0,0,1,0.44,0.4394,0.62,0.1642,71,185,256 +7324,2011-11-06,4,0,11,18,0,0,0,1,0.42,0.4242,0.71,0.1045,39,146,185 +7325,2011-11-06,4,0,11,19,0,0,0,1,0.38,0.3939,0.76,0.194,21,130,151 +7326,2011-11-06,4,0,11,20,0,0,0,1,0.36,0.3636,0.81,0.1045,16,100,116 +7327,2011-11-06,4,0,11,21,0,0,0,1,0.36,0.3636,0.81,0.0896,11,77,88 +7328,2011-11-06,4,0,11,22,0,0,0,1,0.36,0.3788,0.87,0,5,40,45 +7329,2011-11-06,4,0,11,23,0,0,0,1,0.34,0.3485,0.87,0.0896,7,43,50 +7330,2011-11-07,4,0,11,0,0,1,1,1,0.34,0.3636,0.87,0,1,14,15 +7331,2011-11-07,4,0,11,1,0,1,1,1,0.34,0.3636,0.87,0,2,6,8 
+7332,2011-11-07,4,0,11,2,0,1,1,1,0.32,0.3485,0.93,0,0,2,2 +7333,2011-11-07,4,0,11,3,0,1,1,1,0.32,0.3485,0.93,0,0,3,3 +7334,2011-11-07,4,0,11,4,0,1,1,1,0.32,0.3485,0.93,0,2,4,6 +7335,2011-11-07,4,0,11,5,0,1,1,1,0.3,0.3333,0.93,0,1,25,26 +7336,2011-11-07,4,0,11,6,0,1,1,2,0.28,0.3182,0.93,0,2,97,99 +7337,2011-11-07,4,0,11,7,0,1,1,2,0.28,0.3182,0.93,0,6,305,311 +7338,2011-11-07,4,0,11,8,0,1,1,2,0.3,0.3182,1,0.0896,13,397,410 +7339,2011-11-07,4,0,11,9,0,1,1,2,0.34,0.3333,1,0.1343,18,156,174 +7340,2011-11-07,4,0,11,10,0,1,1,1,0.36,0.3485,0.93,0.1343,28,95,123 +7341,2011-11-07,4,0,11,11,0,1,1,1,0.42,0.4242,0.77,0.1045,28,100,128 +7342,2011-11-07,4,0,11,12,0,1,1,1,0.46,0.4545,0.67,0,21,158,179 +7343,2011-11-07,4,0,11,13,0,1,1,1,0.54,0.5152,0.49,0.0896,32,157,189 +7344,2011-11-07,4,0,11,14,0,1,1,1,0.56,0.5303,0.37,0.1642,30,115,145 +7345,2011-11-07,4,0,11,15,0,1,1,1,0.56,0.5303,0.37,0.1045,38,132,170 +7346,2011-11-07,4,0,11,16,0,1,1,1,0.52,0.5,0.45,0,40,255,295 +7347,2011-11-07,4,0,11,17,0,1,1,1,0.5,0.4848,0.45,0.1642,39,489,528 +7348,2011-11-07,4,0,11,18,0,1,1,1,0.46,0.4545,0.59,0.1045,18,407,425 +7349,2011-11-07,4,0,11,19,0,1,1,1,0.46,0.4545,0.59,0,20,280,300 +7350,2011-11-07,4,0,11,20,0,1,1,1,0.4,0.4091,0.71,0,17,187,204 +7351,2011-11-07,4,0,11,21,0,1,1,1,0.38,0.3939,0.82,0.1045,10,129,139 +7352,2011-11-07,4,0,11,22,0,1,1,1,0.36,0.3636,0.81,0.0896,6,102,108 +7353,2011-11-07,4,0,11,23,0,1,1,1,0.36,0.3636,0.87,0.0896,1,47,48 +7354,2011-11-08,4,0,11,0,0,2,1,1,0.34,0.3485,0.87,0.0896,0,18,18 +7355,2011-11-08,4,0,11,1,0,2,1,1,0.32,0.3333,0.93,0.1045,3,8,11 +7356,2011-11-08,4,0,11,2,0,2,1,1,0.32,0.3485,0.87,0,0,1,1 +7357,2011-11-08,4,0,11,3,0,2,1,1,0.32,0.3485,0.87,0,0,3,3 +7358,2011-11-08,4,0,11,4,0,2,1,1,0.3,0.3333,0.87,0,0,4,4 +7359,2011-11-08,4,0,11,5,0,2,1,1,0.3,0.3182,0.87,0.0896,1,17,18 +7360,2011-11-08,4,0,11,6,0,2,1,1,0.3,0.3333,0.87,0,3,96,99 +7361,2011-11-08,4,0,11,7,0,2,1,1,0.3,0.3182,0.87,0.1045,7,316,323 
+7362,2011-11-08,4,0,11,8,0,2,1,1,0.32,0.3333,0.93,0.1045,11,455,466 +7363,2011-11-08,4,0,11,9,0,2,1,1,0.36,0.3788,0.87,0,14,177,191 +7364,2011-11-08,4,0,11,10,0,2,1,1,0.42,0.4242,0.71,0.0896,25,104,129 +7365,2011-11-08,4,0,11,11,0,2,1,1,0.46,0.4545,0.63,0.1343,21,126,147 +7366,2011-11-08,4,0,11,12,0,2,1,1,0.52,0.5,0.52,0.1343,28,175,203 +7367,2011-11-08,4,0,11,13,0,2,1,1,0.54,0.5152,0.49,0.1642,31,165,196 +7368,2011-11-08,4,0,11,14,0,2,1,1,0.56,0.5303,0.46,0.1045,32,129,161 +7369,2011-11-08,4,0,11,15,0,2,1,1,0.58,0.5455,0.46,0.1045,33,155,188 +7370,2011-11-08,4,0,11,16,0,2,1,1,0.56,0.5303,0.46,0.1343,39,250,289 +7371,2011-11-08,4,0,11,17,0,2,1,1,0.52,0.5,0.43,0.1642,40,459,499 +7372,2011-11-08,4,0,11,18,0,2,1,1,0.48,0.4697,0.55,0.1343,30,432,462 +7373,2011-11-08,4,0,11,19,0,2,1,1,0.46,0.4545,0.59,0,14,264,278 +7374,2011-11-08,4,0,11,20,0,2,1,1,0.4,0.4091,0.76,0,12,169,181 +7375,2011-11-08,4,0,11,21,0,2,1,1,0.4,0.4091,0.76,0,16,166,182 +7376,2011-11-08,4,0,11,22,0,2,1,1,0.36,0.3788,0.87,0,13,95,108 +7377,2011-11-08,4,0,11,23,0,2,1,1,0.36,0.3788,0.81,0,3,45,48 +7378,2011-11-09,4,0,11,0,0,3,1,1,0.36,0.3788,0.87,0,0,13,13 +7379,2011-11-09,4,0,11,1,0,3,1,1,0.34,0.3636,0.87,0,0,10,10 +7380,2011-11-09,4,0,11,2,0,3,1,1,0.32,0.3485,0.93,0,0,5,5 +7381,2011-11-09,4,0,11,3,0,3,1,1,0.32,0.3485,0.93,0,0,4,4 +7382,2011-11-09,4,0,11,4,0,3,1,1,0.32,0.3485,0.87,0,0,4,4 +7383,2011-11-09,4,0,11,5,0,3,1,1,0.3,0.3333,0.87,0,0,28,28 +7384,2011-11-09,4,0,11,6,0,3,1,1,0.3,0.3333,0.87,0,0,98,98 +7385,2011-11-09,4,0,11,7,0,3,1,1,0.3,0.3333,0.87,0,0,300,300 +7386,2011-11-09,4,0,11,8,0,3,1,1,0.32,0.3485,0.93,0,19,437,456 +7387,2011-11-09,4,0,11,9,0,3,1,1,0.34,0.3636,0.93,0,22,197,219 +7388,2011-11-09,4,0,11,10,0,3,1,1,0.4,0.4091,0.76,0.1045,16,89,105 +7389,2011-11-09,4,0,11,11,0,3,1,1,0.46,0.4545,0.67,0.1642,21,131,152 +7390,2011-11-09,4,0,11,12,0,3,1,1,0.5,0.4848,0.51,0.1642,24,156,180 +7391,2011-11-09,4,0,11,13,0,3,1,1,0.5,0.4848,0.51,0.1642,31,136,167 
+7392,2011-11-09,4,0,11,14,0,3,1,1,0.52,0.5,0.48,0.1642,26,153,179 +7393,2011-11-09,4,0,11,15,0,3,1,1,0.52,0.5,0.45,0.2239,21,130,151 +7394,2011-11-09,4,0,11,16,0,3,1,1,0.52,0.5,0.48,0.194,24,257,281 +7395,2011-11-09,4,0,11,17,0,3,1,1,0.46,0.4545,0.63,0.1045,27,458,485 +7396,2011-11-09,4,0,11,18,0,3,1,1,0.44,0.4394,0.67,0.1045,21,387,408 +7397,2011-11-09,4,0,11,19,0,3,1,1,0.44,0.4394,0.72,0.1045,12,292,304 +7398,2011-11-09,4,0,11,20,0,3,1,1,0.44,0.4394,0.77,0,18,201,219 +7399,2011-11-09,4,0,11,21,0,3,1,1,0.4,0.4091,0.87,0,9,152,161 +7400,2011-11-09,4,0,11,22,0,3,1,1,0.4,0.4091,0.87,0,9,105,114 +7401,2011-11-09,4,0,11,23,0,3,1,1,0.38,0.3939,0.87,0,5,61,66 +7402,2011-11-10,4,0,11,0,0,4,1,1,0.4,0.4091,0.87,0,0,24,24 +7403,2011-11-10,4,0,11,1,0,4,1,2,0.4,0.4091,0.87,0,1,10,11 +7404,2011-11-10,4,0,11,2,0,4,1,2,0.38,0.3939,0.94,0,0,5,5 +7405,2011-11-10,4,0,11,3,0,4,1,1,0.38,0.3939,0.94,0.0896,0,11,11 +7406,2011-11-10,4,0,11,4,0,4,1,1,0.38,0.3939,0.94,0.0896,1,2,3 +7407,2011-11-10,4,0,11,5,0,4,1,2,0.36,0.3485,1,0.1343,1,22,23 +7408,2011-11-10,4,0,11,6,0,4,1,2,0.36,0.3485,1,0.1343,4,110,114 +7409,2011-11-10,4,0,11,7,0,4,1,2,0.38,0.3939,0.94,0.1642,6,266,272 +7410,2011-11-10,4,0,11,8,0,4,1,2,0.4,0.4091,0.94,0.0896,18,418,436 +7411,2011-11-10,4,0,11,9,0,4,1,2,0.42,0.4242,0.94,0.1045,23,188,211 +7412,2011-11-10,4,0,11,10,0,4,1,2,0.44,0.4394,0.88,0.2239,17,100,117 +7413,2011-11-10,4,0,11,11,0,4,1,2,0.46,0.4545,0.67,0.3881,9,99,108 +7414,2011-11-10,4,0,11,12,0,4,1,2,0.42,0.4242,0.67,0.3582,18,149,167 +7415,2011-11-10,4,0,11,13,0,4,1,3,0.36,0.3333,0.81,0.3582,9,87,96 +7416,2011-11-10,4,0,11,14,0,4,1,3,0.36,0.3333,0.87,0.2836,6,58,64 +7417,2011-11-10,4,0,11,15,0,4,1,3,0.36,0.3333,0.81,0.2985,5,57,62 +7418,2011-11-10,4,0,11,16,0,4,1,3,0.36,0.3485,0.81,0.2239,6,67,73 +7419,2011-11-10,4,0,11,17,0,4,1,3,0.36,0.3485,0.81,0.2239,9,168,177 +7420,2011-11-10,4,0,11,18,0,4,1,2,0.36,0.3485,0.81,0.1642,10,263,273 +7421,2011-11-10,4,0,11,19,0,4,1,2,0.36,0.3485,0.71,0.2239,19,192,211 
+7422,2011-11-10,4,0,11,20,0,4,1,2,0.36,0.3333,0.62,0.2836,8,160,168 +7423,2011-11-10,4,0,11,21,0,4,1,1,0.36,0.3333,0.57,0.2537,6,130,136 +7424,2011-11-10,4,0,11,22,0,4,1,1,0.36,0.3333,0.53,0.2836,5,84,89 +7425,2011-11-10,4,0,11,23,0,4,1,2,0.34,0.3333,0.57,0.1642,9,73,82 +7426,2011-11-11,4,0,11,0,1,5,0,1,0.34,0.3182,0.53,0.2537,10,56,66 +7427,2011-11-11,4,0,11,1,1,5,0,1,0.32,0.303,0.57,0.3284,3,16,19 +7428,2011-11-11,4,0,11,2,1,5,0,1,0.32,0.2879,0.57,0.2836,1,10,11 +7429,2011-11-11,4,0,11,3,1,5,0,1,0.3,0.2879,0.61,0.194,0,6,6 +7430,2011-11-11,4,0,11,4,1,5,0,1,0.3,0.2879,0.61,0.194,0,8,8 +7431,2011-11-11,4,0,11,5,1,5,0,1,0.3,0.2879,0.49,0.2537,0,13,13 +7432,2011-11-11,4,0,11,6,1,5,0,1,0.28,0.2576,0.45,0.3284,0,46,46 +7433,2011-11-11,4,0,11,7,1,5,0,1,0.28,0.2727,0.45,0.2537,5,116,121 +7434,2011-11-11,4,0,11,8,1,5,0,1,0.3,0.2727,0.42,0.3582,9,249,258 +7435,2011-11-11,4,0,11,9,1,5,0,1,0.32,0.2879,0.45,0.4478,15,186,201 +7436,2011-11-11,4,0,11,10,1,5,0,1,0.32,0.2727,0.42,0.5522,38,162,200 +7437,2011-11-11,4,0,11,11,1,5,0,1,0.34,0.2879,0.42,0.4925,20,150,170 +7438,2011-11-11,4,0,11,12,1,5,0,1,0.34,0.2879,0.42,0.4627,41,198,239 +7439,2011-11-11,4,0,11,13,1,5,0,1,0.38,0.3939,0.37,0.5522,57,179,236 +7440,2011-11-11,4,0,11,14,1,5,0,1,0.38,0.3939,0.37,0.4627,64,183,247 +7441,2011-11-11,4,0,11,15,1,5,0,1,0.36,0.3333,0.37,0.4179,29,187,216 +7442,2011-11-11,4,0,11,16,1,5,0,1,0.36,0.3333,0.34,0.3284,49,189,238 +7443,2011-11-11,4,0,11,17,1,5,0,1,0.34,0.303,0.39,0.3284,24,286,310 +7444,2011-11-11,4,0,11,18,1,5,0,1,0.34,0.3182,0.39,0.2537,23,185,208 +7445,2011-11-11,4,0,11,19,1,5,0,1,0.32,0.3182,0.39,0.1642,17,182,199 +7446,2011-11-11,4,0,11,20,1,5,0,1,0.32,0.3182,0.39,0.194,12,97,109 +7447,2011-11-11,4,0,11,21,1,5,0,1,0.3,0.303,0.45,0.1642,4,79,83 +7448,2011-11-11,4,0,11,22,1,5,0,1,0.32,0.3182,0.39,0.194,13,78,91 +7449,2011-11-11,4,0,11,23,1,5,0,1,0.3,0.3182,0.45,0.0896,6,67,73 +7450,2011-11-12,4,0,11,0,0,6,0,1,0.24,0.2424,0.6,0.1343,12,52,64 
+7451,2011-11-12,4,0,11,1,0,6,0,1,0.24,0.2424,0.6,0.1343,9,45,54 +7452,2011-11-12,4,0,11,2,0,6,0,1,0.24,0.2576,0.65,0.0896,7,39,46 +7453,2011-11-12,4,0,11,3,0,6,0,1,0.24,0.2424,0.7,0.1343,4,13,17 +7454,2011-11-12,4,0,11,4,0,6,0,1,0.2,0.2121,0.8,0.1343,0,7,7 +7455,2011-11-12,4,0,11,5,0,6,0,1,0.22,0.2576,0.75,0.0896,1,3,4 +7456,2011-11-12,4,0,11,6,0,6,0,1,0.22,0.2273,0.75,0.1642,0,7,7 +7457,2011-11-12,4,0,11,7,0,6,0,1,0.22,0.2273,0.75,0.194,3,24,27 +7458,2011-11-12,4,0,11,8,0,6,0,1,0.26,0.2576,0.7,0.194,14,87,101 +7459,2011-11-12,4,0,11,9,0,6,0,1,0.3,0.303,0.61,0.1642,18,142,160 +7460,2011-11-12,4,0,11,10,0,6,0,1,0.34,0.3182,0.61,0.2836,62,170,232 +7461,2011-11-12,4,0,11,11,0,6,0,1,0.38,0.3939,0.54,0.2836,102,213,315 +7462,2011-11-12,4,0,11,12,0,6,0,1,0.44,0.4394,0.44,0.2836,142,224,366 +7463,2011-11-12,4,0,11,13,0,6,0,1,0.48,0.4697,0.36,0.2836,128,225,353 +7464,2011-11-12,4,0,11,14,0,6,0,1,0.5,0.4848,0.36,0.2985,191,244,435 +7465,2011-11-12,4,0,11,15,0,6,0,1,0.52,0.5,0.29,0.2836,165,221,386 +7466,2011-11-12,4,0,11,16,0,6,0,1,0.52,0.5,0.32,0.2537,137,224,361 +7467,2011-11-12,4,0,11,17,0,6,0,1,0.5,0.4848,0.34,0.2537,92,193,285 +7468,2011-11-12,4,0,11,18,0,6,0,1,0.42,0.4242,0.58,0.194,53,150,203 +7469,2011-11-12,4,0,11,19,0,6,0,1,0.42,0.4242,0.5,0.2537,32,139,171 +7470,2011-11-12,4,0,11,20,0,6,0,1,0.42,0.4242,0.5,0.2836,31,101,132 +7471,2011-11-12,4,0,11,21,0,6,0,1,0.42,0.4242,0.47,0.2836,19,103,122 +7472,2011-11-12,4,0,11,22,0,6,0,1,0.42,0.4242,0.47,0.2537,30,88,118 +7473,2011-11-12,4,0,11,23,0,6,0,1,0.4,0.4091,0.58,0.1642,23,78,101 +7474,2011-11-13,4,0,11,0,0,0,0,2,0.4,0.4091,0.58,0.194,12,61,73 +7475,2011-11-13,4,0,11,1,0,0,0,2,0.36,0.3485,0.66,0.194,13,58,71 +7476,2011-11-13,4,0,11,2,0,0,0,1,0.36,0.3636,0.57,0.1045,9,48,57 +7477,2011-11-13,4,0,11,3,0,0,0,2,0.36,0.3485,0.62,0.1343,8,20,28 +7478,2011-11-13,4,0,11,4,0,0,0,2,0.36,0.3485,0.62,0.1343,1,5,6 +7479,2011-11-13,4,0,11,5,0,0,0,1,0.34,0.3333,0.61,0.1642,2,3,5 
+7480,2011-11-13,4,0,11,6,0,0,0,1,0.34,0.3333,0.66,0.1343,5,18,23 +7481,2011-11-13,4,0,11,7,0,0,0,1,0.34,0.3182,0.66,0.2239,13,30,43 +7482,2011-11-13,4,0,11,8,0,0,0,1,0.34,0.3485,0.66,0.0896,24,55,79 +7483,2011-11-13,4,0,11,9,0,0,0,1,0.4,0.4091,0.54,0.2836,38,97,135 +7484,2011-11-13,4,0,11,10,0,0,0,1,0.44,0.4394,0.44,0.3284,63,178,241 +7485,2011-11-13,4,0,11,11,0,0,0,1,0.46,0.4545,0.4,0.4179,108,187,295 +7486,2011-11-13,4,0,11,12,0,0,0,2,0.52,0.5,0.29,0.4179,112,242,354 +7487,2011-11-13,4,0,11,13,0,0,0,2,0.52,0.5,0.29,0.4179,105,234,339 +7488,2011-11-13,4,0,11,14,0,0,0,2,0.54,0.5152,0.28,0.4925,108,263,371 +7489,2011-11-13,4,0,11,15,0,0,0,2,0.5,0.4848,0.42,0.3284,89,221,310 +7490,2011-11-13,4,0,11,16,0,0,0,2,0.54,0.5152,0.28,0.3881,93,226,319 +7491,2011-11-13,4,0,11,17,0,0,0,1,0.52,0.5,0.27,0.3284,44,187,231 +7492,2011-11-13,4,0,11,18,0,0,0,1,0.52,0.5,0.27,0.3284,35,155,190 +7493,2011-11-13,4,0,11,19,0,0,0,1,0.5,0.4848,0.29,0.3582,36,121,157 +7494,2011-11-13,4,0,11,20,0,0,0,1,0.5,0.4848,0.31,0.3582,24,115,139 +7495,2011-11-13,4,0,11,21,0,0,0,1,0.48,0.4697,0.36,0.2537,19,76,95 +7496,2011-11-13,4,0,11,22,0,0,0,1,0.48,0.4697,0.41,0.3881,25,64,89 +7497,2011-11-13,4,0,11,23,0,0,0,1,0.46,0.4545,0.51,0.2985,18,49,67 +7498,2011-11-14,4,0,11,0,0,1,1,1,0.46,0.4545,0.59,0.2836,8,22,30 +7499,2011-11-14,4,0,11,1,0,1,1,1,0.46,0.4545,0.63,0.2985,5,6,11 +7500,2011-11-14,4,0,11,2,0,1,1,1,0.46,0.4545,0.63,0.3582,7,10,17 +7501,2011-11-14,4,0,11,3,0,1,1,1,0.44,0.4394,0.67,0.2836,4,3,7 +7502,2011-11-14,4,0,11,4,0,1,1,1,0.44,0.4394,0.67,0.2239,0,5,5 +7503,2011-11-14,4,0,11,5,0,1,1,2,0.44,0.4394,0.67,0.2537,0,19,19 +7504,2011-11-14,4,0,11,6,0,1,1,2,0.44,0.4394,0.72,0.2239,10,104,114 +7505,2011-11-14,4,0,11,7,0,1,1,1,0.44,0.4394,0.72,0.2537,11,311,322 +7506,2011-11-14,4,0,11,8,0,1,1,2,0.46,0.4545,0.67,0.194,27,425,452 +7507,2011-11-14,4,0,11,9,0,1,1,2,0.48,0.4697,0.67,0.3284,29,204,233 +7508,2011-11-14,4,0,11,10,0,1,1,2,0.5,0.4848,0.63,0.2836,26,85,111 
+7509,2011-11-14,4,0,11,11,0,1,1,2,0.54,0.5152,0.6,0.2836,22,106,128 +7510,2011-11-14,4,0,11,12,0,1,1,1,0.56,0.5303,0.56,0.2985,36,166,202 +7511,2011-11-14,4,0,11,13,0,1,1,1,0.6,0.6212,0.49,0.3881,50,153,203 +7512,2011-11-14,4,0,11,14,0,1,1,1,0.62,0.6212,0.46,0.4478,47,138,185 +7513,2011-11-14,4,0,11,15,0,1,1,1,0.64,0.6212,0.44,0.3284,53,142,195 +7514,2011-11-14,4,0,11,16,0,1,1,1,0.62,0.6212,0.46,0.4179,51,264,315 +7515,2011-11-14,4,0,11,17,0,1,1,1,0.62,0.6212,0.46,0.2537,55,464,519 +7516,2011-11-14,4,0,11,18,0,1,1,1,0.56,0.5303,0.56,0.2836,29,460,489 +7517,2011-11-14,4,0,11,19,0,1,1,1,0.6,0.6212,0.53,0.2985,28,274,302 +7518,2011-11-14,4,0,11,20,0,1,1,1,0.6,0.6212,0.53,0.3582,30,210,240 +7519,2011-11-14,4,0,11,21,0,1,1,1,0.6,0.6212,0.53,0.3582,37,176,213 +7520,2011-11-14,4,0,11,22,0,1,1,1,0.58,0.5455,0.56,0.3582,17,96,113 +7521,2011-11-14,4,0,11,23,0,1,1,1,0.56,0.5303,0.64,0.2985,13,48,61 +7522,2011-11-15,4,0,11,0,0,2,1,1,0.56,0.5303,0.64,0.3582,7,15,22 +7523,2011-11-15,4,0,11,1,0,2,1,1,0.56,0.5303,0.6,0.2985,5,5,10 +7524,2011-11-15,4,0,11,2,0,2,1,1,0.56,0.5303,0.64,0.2537,7,8,15 +7525,2011-11-15,4,0,11,3,0,2,1,2,0.54,0.5152,0.68,0.3284,0,4,4 +7526,2011-11-15,4,0,11,4,0,2,1,1,0.56,0.5303,0.64,0.2985,1,6,7 +7527,2011-11-15,4,0,11,5,0,2,1,1,0.54,0.5152,0.68,0.2836,2,26,28 +7528,2011-11-15,4,0,11,6,0,2,1,2,0.56,0.5303,0.64,0.0896,4,104,108 +7529,2011-11-15,4,0,11,7,0,2,1,2,0.54,0.5152,0.68,0.1642,21,298,319 +7530,2011-11-15,4,0,11,8,0,2,1,2,0.54,0.5152,0.68,0.1045,27,453,480 +7531,2011-11-15,4,0,11,9,0,2,1,2,0.56,0.5303,0.64,0.0896,26,174,200 +7532,2011-11-15,4,0,11,10,0,2,1,2,0.56,0.5303,0.64,0.2836,23,115,138 +7533,2011-11-15,4,0,11,11,0,2,1,2,0.56,0.5303,0.64,0.194,18,116,134 +7534,2011-11-15,4,0,11,12,0,2,1,2,0.54,0.5152,0.68,0,28,148,176 +7535,2011-11-15,4,0,11,13,0,2,1,3,0.54,0.5152,0.6,0.2239,21,132,153 +7536,2011-11-15,4,0,11,14,0,2,1,2,0.54,0.5152,0.6,0.2836,27,120,147 +7537,2011-11-15,4,0,11,15,0,2,1,2,0.54,0.5152,0.6,0.1343,36,155,191 
+7538,2011-11-15,4,0,11,16,0,2,1,2,0.52,0.5,0.68,0.1343,31,240,271 +7539,2011-11-15,4,0,11,17,0,2,1,2,0.5,0.4848,0.72,0.194,29,422,451 +7540,2011-11-15,4,0,11,18,0,2,1,2,0.5,0.4848,0.72,0.2836,27,414,441 +7541,2011-11-15,4,0,11,19,0,2,1,2,0.5,0.4848,0.72,0.3582,27,259,286 +7542,2011-11-15,4,0,11,20,0,2,1,2,0.48,0.4697,0.82,0.1343,26,212,238 +7543,2011-11-15,4,0,11,21,0,2,1,2,0.5,0.4848,0.77,0.0896,23,145,168 +7544,2011-11-15,4,0,11,22,0,2,1,2,0.46,0.4545,0.88,0.1045,19,109,128 +7545,2011-11-15,4,0,11,23,0,2,1,2,0.46,0.4545,0.94,0.1045,14,66,80 +7546,2011-11-16,4,0,11,0,0,3,1,2,0.46,0.4545,0.94,0,5,26,31 +7547,2011-11-16,4,0,11,1,0,3,1,3,0.46,0.4545,0.94,0.0896,0,5,5 +7548,2011-11-16,4,0,11,2,0,3,1,2,0.46,0.4545,0.94,0.1045,4,6,10 +7549,2011-11-16,4,0,11,3,0,3,1,2,0.46,0.4545,0.94,0.1045,2,3,5 +7550,2011-11-16,4,0,11,4,0,3,1,2,0.44,0.4394,1,0.2239,1,3,4 +7551,2011-11-16,4,0,11,5,0,3,1,3,0.46,0.4545,0.94,0.1045,0,13,13 +7552,2011-11-16,4,0,11,6,0,3,1,2,0.46,0.4545,0.94,0.0896,4,52,56 +7553,2011-11-16,4,0,11,7,0,3,1,3,0.46,0.4545,0.94,0,7,130,137 +7554,2011-11-16,4,0,11,8,0,3,1,3,0.46,0.4545,0.94,0,9,223,232 +7555,2011-11-16,4,0,11,9,0,3,1,3,0.46,0.4545,0.94,0,5,77,82 +7556,2011-11-16,4,0,11,10,0,3,1,3,0.46,0.4545,0.94,0,4,32,36 +7557,2011-11-16,4,0,11,11,0,3,1,3,0.46,0.4545,0.94,0.0896,7,53,60 +7558,2011-11-16,4,0,11,12,0,3,1,3,0.46,0.4545,1,0,5,49,54 +7559,2011-11-16,4,0,11,13,0,3,1,3,0.46,0.4545,0.94,0.1343,6,52,58 +7560,2011-11-16,4,0,11,14,0,3,1,2,0.46,0.4545,1,0.1343,12,49,61 +7561,2011-11-16,4,0,11,15,0,3,1,3,0.48,0.4697,0.94,0.1045,16,50,66 +7562,2011-11-16,4,0,11,16,0,3,1,3,0.48,0.4697,0.94,0.1045,13,110,123 +7563,2011-11-16,4,0,11,17,0,3,1,2,0.48,0.4697,0.88,0.194,17,216,233 +7564,2011-11-16,4,0,11,18,0,3,1,3,0.46,0.4545,0.88,0.3881,13,176,189 +7565,2011-11-16,4,0,11,19,0,3,1,3,0.46,0.4545,0.88,0.3881,3,108,111 +7566,2011-11-16,4,0,11,20,0,3,1,3,0.44,0.4394,0.88,0.3881,5,94,99 +7567,2011-11-16,4,0,11,21,0,3,1,3,0.44,0.4394,0.88,0.2836,3,72,75 
+7568,2011-11-16,4,0,11,22,0,3,1,3,0.42,0.4242,0.88,0.2239,1,45,46 +7569,2011-11-16,4,0,11,23,0,3,1,3,0.42,0.4242,0.88,0.1343,3,28,31 +7570,2011-11-17,4,0,11,0,0,4,1,2,0.42,0.4242,0.88,0.2537,2,22,24 +7571,2011-11-17,4,0,11,1,0,4,1,2,0.42,0.4242,0.82,0.1642,0,5,5 +7572,2011-11-17,4,0,11,2,0,4,1,2,0.42,0.4242,0.82,0.2239,0,5,5 +7573,2011-11-17,4,0,11,3,0,4,1,2,0.42,0.4242,0.77,0.3284,0,3,3 +7574,2011-11-17,4,0,11,4,0,4,1,2,0.4,0.4091,0.62,0.4179,1,3,4 +7575,2011-11-17,4,0,11,5,0,4,1,3,0.34,0.303,0.71,0.3284,1,21,22 +7576,2011-11-17,4,0,11,6,0,4,1,3,0.36,0.3485,0.57,0.2239,3,72,75 +7577,2011-11-17,4,0,11,7,0,4,1,3,0.34,0.3182,0.61,0.2836,4,164,168 +7578,2011-11-17,4,0,11,8,0,4,1,3,0.34,0.303,0.61,0.2985,12,343,355 +7579,2011-11-17,4,0,11,9,0,4,1,2,0.34,0.303,0.61,0.3284,14,184,198 +7580,2011-11-17,4,0,11,10,0,4,1,2,0.34,0.303,0.61,0.3284,7,74,81 +7581,2011-11-17,4,0,11,11,0,4,1,3,0.32,0.303,0.66,0.3284,4,93,97 +7582,2011-11-17,4,0,11,12,0,4,1,2,0.34,0.303,0.53,0.2985,6,115,121 +7583,2011-11-17,4,0,11,13,0,4,1,2,0.34,0.303,0.49,0.4478,15,109,124 +7584,2011-11-17,4,0,11,14,0,4,1,2,0.34,0.303,0.49,0.3881,5,105,110 +7585,2011-11-17,4,0,11,15,0,4,1,2,0.34,0.303,0.49,0.4478,10,106,116 +7586,2011-11-17,4,0,11,16,0,4,1,1,0.32,0.2879,0.39,0.4179,8,187,195 +7587,2011-11-17,4,0,11,17,0,4,1,1,0.32,0.2879,0.42,0.3881,20,379,399 +7588,2011-11-17,4,0,11,18,0,4,1,1,0.32,0.3182,0.39,0.194,9,298,307 +7589,2011-11-17,4,0,11,19,0,4,1,1,0.3,0.2879,0.45,0.2239,7,210,217 +7590,2011-11-17,4,0,11,20,0,4,1,1,0.3,0.2879,0.42,0.2836,7,162,169 +7591,2011-11-17,4,0,11,21,0,4,1,1,0.3,0.2879,0.42,0.2537,3,113,116 +7592,2011-11-17,4,0,11,22,0,4,1,1,0.26,0.2424,0.52,0.2537,1,83,84 +7593,2011-11-17,4,0,11,23,0,4,1,1,0.26,0.2576,0.52,0.2239,0,58,58 +7594,2011-11-18,4,0,11,0,0,5,1,1,0.26,0.2576,0.48,0.1642,2,28,30 +7595,2011-11-18,4,0,11,1,0,5,1,1,0.26,0.2273,0.44,0.3284,0,10,10 +7596,2011-11-18,4,0,11,2,0,5,1,1,0.24,0.2273,0.44,0.2239,2,8,10 
+7597,2011-11-18,4,0,11,3,0,5,1,1,0.24,0.2121,0.41,0.2836,0,2,2 +7598,2011-11-18,4,0,11,4,0,5,1,1,0.22,0.2273,0.44,0.1642,0,5,5 +7599,2011-11-18,4,0,11,5,0,5,1,1,0.22,0.2273,0.44,0.1343,0,22,22 +7600,2011-11-18,4,0,11,6,0,5,1,1,0.22,0.2576,0.44,0.0896,1,70,71 +7601,2011-11-18,4,0,11,7,0,5,1,1,0.22,0.2273,0.44,0.1642,5,211,216 +7602,2011-11-18,4,0,11,8,0,5,1,1,0.22,0.2273,0.44,0.194,6,369,375 +7603,2011-11-18,4,0,11,9,0,5,1,1,0.26,0.2576,0.41,0.194,6,207,213 +7604,2011-11-18,4,0,11,10,0,5,1,1,0.26,0.2424,0.41,0.2836,10,105,115 +7605,2011-11-18,4,0,11,11,0,5,1,1,0.3,0.303,0.36,0.1343,18,122,140 +7606,2011-11-18,4,0,11,12,0,5,1,1,0.32,0.3333,0.33,0,22,143,165 +7607,2011-11-18,4,0,11,13,0,5,1,1,0.34,0.3182,0.31,0.2239,31,146,177 +7608,2011-11-18,4,0,11,14,0,5,1,1,0.34,0.3333,0.31,0.1343,31,123,154 +7609,2011-11-18,4,0,11,15,0,5,1,1,0.34,0.3333,0.29,0.1343,27,151,178 +7610,2011-11-18,4,0,11,16,0,5,1,1,0.34,0.3333,0.29,0.1343,19,190,209 +7611,2011-11-18,4,0,11,17,0,5,1,1,0.32,0.3333,0.31,0.1343,17,361,378 +7612,2011-11-18,4,0,11,18,0,5,1,1,0.3,0.303,0.36,0.1642,16,312,328 +7613,2011-11-18,4,0,11,19,0,5,1,1,0.28,0.2727,0.41,0.1642,7,183,190 +7614,2011-11-18,4,0,11,20,0,5,1,1,0.28,0.2727,0.48,0.194,8,129,137 +7615,2011-11-18,4,0,11,21,0,5,1,1,0.28,0.2727,0.48,0.1642,3,108,111 +7616,2011-11-18,4,0,11,22,0,5,1,1,0.26,0.2727,0.52,0.1045,9,88,97 +7617,2011-11-18,4,0,11,23,0,5,1,1,0.26,0.2727,0.6,0.1343,5,54,59 +7618,2011-11-19,4,0,11,0,0,6,0,1,0.26,0.2576,0.56,0.194,4,49,53 +7619,2011-11-19,4,0,11,1,0,6,0,1,0.26,0.2273,0.52,0.2985,1,34,35 +7620,2011-11-19,4,0,11,2,0,6,0,1,0.24,0.2424,0.56,0.1642,12,34,46 +7621,2011-11-19,4,0,11,3,0,6,0,1,0.26,0.2424,0.48,0.2537,4,11,15 +7622,2011-11-19,4,0,11,4,0,6,0,1,0.26,0.2273,0.48,0.3284,4,4,8 +7623,2011-11-19,4,0,11,5,0,6,0,1,0.24,0.2273,0.56,0.2537,0,2,2 +7624,2011-11-19,4,0,11,6,0,6,0,1,0.24,0.2121,0.6,0.3284,2,10,12 +7625,2011-11-19,4,0,11,7,0,6,0,1,0.24,0.2273,0.6,0.2537,3,38,41 
+7626,2011-11-19,4,0,11,8,0,6,0,1,0.26,0.2424,0.56,0.2537,12,80,92 +7627,2011-11-19,4,0,11,9,0,6,0,1,0.28,0.2576,0.52,0.2985,12,130,142 +7628,2011-11-19,4,0,11,10,0,6,0,1,0.32,0.303,0.45,0.2537,35,165,200 +7629,2011-11-19,4,0,11,11,0,6,0,1,0.36,0.3788,0.4,0,50,183,233 +7630,2011-11-19,4,0,11,12,0,6,0,1,0.4,0.4091,0.35,0.2836,93,218,311 +7631,2011-11-19,4,0,11,13,0,6,0,1,0.42,0.4242,0.3,0.2836,118,245,363 +7632,2011-11-19,4,0,11,14,0,6,0,1,0.42,0.4242,0.32,0.2985,121,228,349 +7633,2011-11-19,4,0,11,15,0,6,0,1,0.42,0.4242,0.41,0.194,120,262,382 +7634,2011-11-19,4,0,11,16,0,6,0,1,0.42,0.4242,0.38,0.2985,99,188,287 +7635,2011-11-19,4,0,11,17,0,6,0,1,0.4,0.4091,0.4,0.194,61,171,232 +7636,2011-11-19,4,0,11,18,0,6,0,1,0.38,0.3939,0.46,0.1343,36,172,208 +7637,2011-11-19,4,0,11,19,0,6,0,2,0.34,0.3333,0.71,0.1343,33,149,182 +7638,2011-11-19,4,0,11,20,0,6,0,2,0.36,0.3636,0.57,0.1045,48,115,163 +7639,2011-11-19,4,0,11,21,0,6,0,2,0.36,0.3485,0.62,0.194,29,79,108 +7640,2011-11-19,4,0,11,22,0,6,0,2,0.38,0.3939,0.62,0.1642,26,80,106 +7641,2011-11-19,4,0,11,23,0,6,0,1,0.38,0.3939,0.62,0.2239,20,73,93 +7642,2011-11-20,4,0,11,0,0,0,0,1,0.38,0.3939,0.66,0.1642,14,79,93 +7643,2011-11-20,4,0,11,1,0,0,0,1,0.4,0.4091,0.62,0.2537,11,73,84 +7644,2011-11-20,4,0,11,2,0,0,0,1,0.4,0.4091,0.62,0.2836,12,44,56 +7645,2011-11-20,4,0,11,3,0,0,0,2,0.4,0.4091,0.66,0.2836,6,31,37 +7646,2011-11-20,4,0,11,4,0,0,0,2,0.4,0.4091,0.71,0.2836,2,10,12 +7647,2011-11-20,4,0,11,5,0,0,0,2,0.42,0.4242,0.67,0.2836,0,4,4 +7648,2011-11-20,4,0,11,6,0,0,0,2,0.42,0.4242,0.67,0.2537,3,6,9 +7649,2011-11-20,4,0,11,7,0,0,0,1,0.42,0.4242,0.67,0.2239,4,19,23 +7650,2011-11-20,4,0,11,8,0,0,0,2,0.42,0.4242,0.71,0.2836,13,52,65 +7651,2011-11-20,4,0,11,9,0,0,0,1,0.44,0.4394,0.72,0.1642,29,109,138 +7652,2011-11-20,4,0,11,10,0,0,0,2,0.44,0.4394,0.72,0.1045,46,183,229 +7653,2011-11-20,4,0,11,11,0,0,0,2,0.5,0.4848,0.63,0.1045,74,212,286 +7654,2011-11-20,4,0,11,12,0,0,0,2,0.5,0.4848,0.63,0.1343,70,234,304 
+7655,2011-11-20,4,0,11,13,0,0,0,2,0.54,0.5152,0.6,0.194,84,285,369 +7656,2011-11-20,4,0,11,14,0,0,0,1,0.52,0.5,0.63,0.2537,113,250,363 +7657,2011-11-20,4,0,11,15,0,0,0,2,0.52,0.5,0.68,0.194,109,242,351 +7658,2011-11-20,4,0,11,16,0,0,0,2,0.52,0.5,0.63,0.1343,81,225,306 +7659,2011-11-20,4,0,11,17,0,0,0,2,0.52,0.5,0.63,0.194,35,168,203 +7660,2011-11-20,4,0,11,18,0,0,0,2,0.54,0.5152,0.64,0.194,22,123,145 +7661,2011-11-20,4,0,11,19,0,0,0,2,0.5,0.4848,0.72,0.1045,17,140,157 +7662,2011-11-20,4,0,11,20,0,0,0,2,0.52,0.5,0.68,0.1045,23,90,113 +7663,2011-11-20,4,0,11,21,0,0,0,2,0.48,0.4697,0.77,0.0896,11,94,105 +7664,2011-11-20,4,0,11,22,0,0,0,3,0.46,0.4545,0.88,0.0896,5,35,40 +7665,2011-11-20,4,0,11,23,0,0,0,3,0.46,0.4545,0.88,0.0896,3,25,28 +7666,2011-11-21,4,0,11,0,0,1,1,2,0.46,0.4545,0.94,0,4,13,17 +7667,2011-11-21,4,0,11,1,0,1,1,3,0.46,0.4545,0.94,0.194,4,8,12 +7668,2011-11-21,4,0,11,2,0,1,1,3,0.44,0.4394,1,0.194,1,2,3 +7669,2011-11-21,4,0,11,3,0,1,1,3,0.44,0.4394,1,0.1343,0,4,4 +7670,2011-11-21,4,0,11,4,0,1,1,3,0.44,0.4394,1,0.1343,0,5,5 +7671,2011-11-21,4,0,11,5,0,1,1,2,0.44,0.4394,1,0,0,20,20 +7672,2011-11-21,4,0,11,6,0,1,1,2,0.42,0.4242,1,0,0,76,76 +7673,2011-11-21,4,0,11,7,0,1,1,2,0.46,0.4545,0.94,0.1642,17,229,246 +7674,2011-11-21,4,0,11,8,0,1,1,2,0.46,0.4545,0.94,0,13,378,391 +7675,2011-11-21,4,0,11,9,0,1,1,2,0.48,0.4697,0.94,0.1045,15,222,237 +7676,2011-11-21,4,0,11,10,0,1,1,2,0.52,0.5,0.83,0,25,110,135 +7677,2011-11-21,4,0,11,11,0,1,1,2,0.52,0.5,0.83,0.1045,16,112,128 +7678,2011-11-21,4,0,11,12,0,1,1,2,0.5,0.4848,0.77,0.1343,24,138,162 +7679,2011-11-21,4,0,11,13,0,1,1,2,0.5,0.4848,0.77,0.1343,13,121,134 +7680,2011-11-21,4,0,11,14,0,1,1,2,0.48,0.4697,0.82,0.194,14,121,135 +7681,2011-11-21,4,0,11,15,0,1,1,3,0.44,0.4394,0.94,0.2239,15,93,108 +7682,2011-11-21,4,0,11,16,0,1,1,3,0.44,0.4394,0.94,0.2239,13,95,108 +7683,2011-11-21,4,0,11,17,0,1,1,3,0.42,0.4242,0.94,0.2985,10,200,210 +7684,2011-11-21,4,0,11,18,0,1,1,3,0.42,0.4242,0.88,0.194,8,184,192 
+7685,2011-11-21,4,0,11,19,0,1,1,3,0.4,0.4091,0.94,0.194,2,135,137 +7686,2011-11-21,4,0,11,20,0,1,1,3,0.4,0.4091,0.87,0.2239,5,75,80 +7687,2011-11-21,4,0,11,21,0,1,1,3,0.4,0.4091,0.87,0.1642,11,103,114 +7688,2011-11-21,4,0,11,22,0,1,1,3,0.4,0.4091,0.87,0.1343,6,72,78 +7689,2011-11-21,4,0,11,23,0,1,1,3,0.4,0.4091,0.87,0.1642,4,29,33 +7690,2011-11-22,4,0,11,0,0,2,1,3,0.38,0.3939,0.94,0.1045,0,14,14 +7691,2011-11-22,4,0,11,1,0,2,1,3,0.4,0.4091,0.94,0.1343,1,5,6 +7692,2011-11-22,4,0,11,2,0,2,1,3,0.38,0.3939,1,0.1045,2,4,6 +7693,2011-11-22,4,0,11,3,0,2,1,3,0.38,0.3939,1,0.1045,1,2,3 +7694,2011-11-22,4,0,11,4,0,2,1,3,0.38,0.3939,0.94,0.1642,0,7,7 +7695,2011-11-22,4,0,11,5,0,2,1,3,0.38,0.3939,0.94,0.1642,1,17,18 +7696,2011-11-22,4,0,11,6,0,2,1,2,0.38,0.3939,0.94,0.1045,1,63,64 +7697,2011-11-22,4,0,11,7,0,2,1,3,0.38,0.3939,0.94,0.2239,2,119,121 +7698,2011-11-22,4,0,11,8,0,2,1,3,0.38,0.3939,0.94,0.2239,5,185,190 +7699,2011-11-22,4,0,11,9,0,2,1,3,0.4,0.4091,0.94,0,2,147,149 +7700,2011-11-22,4,0,11,10,0,2,1,3,0.4,0.4091,0.94,0,6,46,52 +7701,2011-11-22,4,0,11,11,0,2,1,3,0.4,0.4091,0.94,0.1343,4,28,32 +7702,2011-11-22,4,0,11,12,0,2,1,3,0.4,0.4091,1,0.0896,3,18,21 +7703,2011-11-22,4,0,11,13,0,2,1,3,0.42,0.4242,1,0.0896,4,22,26 +7704,2011-11-22,4,0,11,14,0,2,1,3,0.42,0.4242,1,0.0896,4,31,35 +7705,2011-11-22,4,0,11,15,0,2,1,3,0.44,0.4394,0.94,0,2,32,34 +7706,2011-11-22,4,0,11,16,0,2,1,3,0.44,0.4394,0.94,0,3,59,62 +7707,2011-11-22,4,0,11,17,0,2,1,3,0.44,0.4394,1,0.0896,4,161,165 +7708,2011-11-22,4,0,11,18,0,2,1,3,0.44,0.4394,1,0,0,148,148 +7709,2011-11-22,4,0,11,19,0,2,1,3,0.46,0.4545,0.94,0.1045,5,106,111 +7710,2011-11-22,4,0,11,20,0,2,1,2,0.46,0.4545,1,0.2239,6,121,127 +7711,2011-11-22,4,0,11,21,0,2,1,2,0.46,0.4545,1,0.2537,4,86,90 +7712,2011-11-22,4,0,11,22,0,2,1,2,0.5,0.4848,0.94,0.194,6,81,87 +7713,2011-11-22,4,0,11,23,0,2,1,2,0.48,0.4697,0.94,0.2537,3,36,39 +7714,2011-11-23,4,0,11,0,0,3,1,2,0.48,0.4697,0.94,0.2537,2,14,16 
+7715,2011-11-23,4,0,11,1,0,3,1,2,0.48,0.4697,0.94,0.2985,0,8,8 +7716,2011-11-23,4,0,11,2,0,3,1,3,0.5,0.4848,0.94,0.3582,1,5,6 +7717,2011-11-23,4,0,11,3,0,3,1,3,0.5,0.4848,0.94,0.3582,1,2,3 +7718,2011-11-23,4,0,11,4,0,3,1,3,0.52,0.5,0.88,0.3582,0,5,5 +7719,2011-11-23,4,0,11,5,0,3,1,2,0.46,0.4545,0.94,0.194,1,16,17 +7720,2011-11-23,4,0,11,6,0,3,1,2,0.44,0.4394,1,0,1,68,69 +7721,2011-11-23,4,0,11,7,0,3,1,2,0.46,0.4545,1,0.0896,2,154,156 +7722,2011-11-23,4,0,11,8,0,3,1,2,0.48,0.4697,0.94,0.1045,7,316,323 +7723,2011-11-23,4,0,11,9,0,3,1,2,0.52,0.5,0.94,0.194,3,164,167 +7724,2011-11-23,4,0,11,10,0,3,1,2,0.52,0.5,0.94,0.194,6,70,76 +7725,2011-11-23,4,0,11,11,0,3,1,2,0.5,0.4848,0.72,0.4179,9,107,116 +7726,2011-11-23,4,0,11,12,0,3,1,1,0.48,0.4697,0.55,0.4179,9,151,160 +7727,2011-11-23,4,0,11,13,0,3,1,2,0.44,0.4394,0.54,0.4925,12,162,174 +7728,2011-11-23,4,0,11,14,0,3,1,1,0.42,0.4242,0.54,0.4627,13,200,213 +7729,2011-11-23,4,0,11,15,0,3,1,2,0.42,0.4242,0.54,0.4478,13,194,207 +7730,2011-11-23,4,0,11,16,0,3,1,2,0.4,0.4091,0.62,0.4627,4,169,173 +7731,2011-11-23,4,0,11,17,0,3,1,3,0.38,0.3939,0.71,0.3881,7,156,163 +7732,2011-11-23,4,0,11,18,0,3,1,2,0.4,0.4091,0.58,0.5224,7,138,145 +7733,2011-11-23,4,0,11,19,0,3,1,1,0.38,0.3939,0.58,0.3881,6,116,122 +7734,2011-11-23,4,0,11,20,0,3,1,1,0.36,0.3333,0.62,0.3881,1,67,68 +7735,2011-11-23,4,0,11,21,0,3,1,1,0.36,0.3182,0.57,0.4627,0,66,66 +7736,2011-11-23,4,0,11,22,0,3,1,1,0.34,0.303,0.61,0.4478,3,59,62 +7737,2011-11-23,4,0,11,23,0,3,1,1,0.34,0.303,0.61,0.3582,4,47,51 +7738,2011-11-24,4,0,11,0,1,4,0,1,0.32,0.303,0.57,0.2239,1,22,23 +7739,2011-11-24,4,0,11,1,1,4,0,1,0.32,0.2879,0.57,0.4179,1,23,24 +7740,2011-11-24,4,0,11,2,1,4,0,1,0.3,0.2879,0.61,0.2836,3,19,22 +7741,2011-11-24,4,0,11,3,1,4,0,1,0.28,0.2879,0.65,0.1343,1,4,5 +7742,2011-11-24,4,0,11,4,1,4,0,1,0.3,0.3182,0.61,0.0896,1,1,2 +7743,2011-11-24,4,0,11,5,1,4,0,1,0.3,0.3182,0.61,0.1045,1,10,11 +7744,2011-11-24,4,0,11,6,1,4,0,1,0.3,0.3182,0.61,0.0896,1,5,6 
+7745,2011-11-24,4,0,11,7,1,4,0,1,0.26,0.2727,0.75,0.1045,9,31,40 +7746,2011-11-24,4,0,11,8,1,4,0,1,0.3,0.3182,0.65,0.0896,4,42,46 +7747,2011-11-24,4,0,11,9,1,4,0,1,0.34,0.3485,0.61,0,13,68,81 +7748,2011-11-24,4,0,11,10,1,4,0,1,0.36,0.3788,0.57,0,34,64,98 +7749,2011-11-24,4,0,11,11,1,4,0,1,0.42,0.4242,0.41,0.2985,52,90,142 +7750,2011-11-24,4,0,11,12,1,4,0,1,0.46,0.4545,0.36,0.2239,62,88,150 +7751,2011-11-24,4,0,11,13,1,4,0,1,0.48,0.4697,0.33,0.2239,84,91,175 +7752,2011-11-24,4,0,11,14,1,4,0,1,0.5,0.4848,0.31,0.2985,74,94,168 +7753,2011-11-24,4,0,11,15,1,4,0,1,0.5,0.4848,0.31,0.2537,78,71,149 +7754,2011-11-24,4,0,11,16,1,4,0,1,0.5,0.4848,0.34,0.2836,48,66,114 +7755,2011-11-24,4,0,11,17,1,4,0,1,0.48,0.4697,0.33,0.1045,34,40,74 +7756,2011-11-24,4,0,11,18,1,4,0,1,0.42,0.4242,0.5,0.0896,13,24,37 +7757,2011-11-24,4,0,11,19,1,4,0,1,0.4,0.4091,0.54,0.1045,15,13,28 +7758,2011-11-24,4,0,11,20,1,4,0,1,0.36,0.3485,0.76,0.1642,14,17,31 +7759,2011-11-24,4,0,11,21,1,4,0,1,0.36,0.3485,0.71,0.1642,7,19,26 +7760,2011-11-24,4,0,11,22,1,4,0,1,0.36,0.3485,0.71,0.1642,8,14,22 +7761,2011-11-24,4,0,11,23,1,4,0,1,0.34,0.3485,0.76,0.1045,2,19,21 +7762,2011-11-25,4,0,11,0,0,5,1,1,0.34,0.3333,0.76,0.1642,7,22,29 +7763,2011-11-25,4,0,11,1,0,5,1,1,0.34,0.3636,0.76,0,2,12,14 +7764,2011-11-25,4,0,11,2,0,5,1,1,0.28,0.2879,0.81,0.1045,4,6,10 +7765,2011-11-25,4,0,11,3,0,5,1,1,0.28,0.2879,0.75,0.1045,0,2,2 +7766,2011-11-25,4,0,11,4,0,5,1,1,0.3,0.3182,0.75,0.1045,4,3,7 +7767,2011-11-25,4,0,11,5,0,5,1,1,0.28,0.3182,0.75,0,2,3,5 +7768,2011-11-25,4,0,11,6,0,5,1,1,0.26,0.2727,0.81,0.1045,4,9,13 +7769,2011-11-25,4,0,11,7,0,5,1,1,0.26,0.303,0.81,0,4,31,35 +7770,2011-11-25,4,0,11,8,0,5,1,1,0.26,0.2727,0.81,0.1343,6,68,74 +7771,2011-11-25,4,0,11,9,0,5,1,1,0.32,0.3485,0.76,0,25,70,95 +7772,2011-11-25,4,0,11,10,0,5,1,1,0.36,0.3485,0.71,0.1642,60,82,142 +7773,2011-11-25,4,0,11,11,0,5,1,1,0.4,0.4091,0.66,0.1045,99,127,226 +7774,2011-11-25,4,0,11,12,0,5,1,1,0.46,0.4545,0.51,0.1045,126,146,272 
+7775,2011-11-25,4,0,11,13,0,5,1,1,0.5,0.4848,0.45,0.2239,143,140,283 +7776,2011-11-25,4,0,11,14,0,5,1,1,0.52,0.5,0.39,0.2239,122,150,272 +7777,2011-11-25,4,0,11,15,0,5,1,1,0.52,0.5,0.36,0.194,165,145,310 +7778,2011-11-25,4,0,11,16,0,5,1,1,0.5,0.4848,0.39,0.1642,122,139,261 +7779,2011-11-25,4,0,11,17,0,5,1,1,0.5,0.4848,0.34,0.1045,57,127,184 +7780,2011-11-25,4,0,11,18,0,5,1,1,0.46,0.4545,0.44,0.0896,45,108,153 +7781,2011-11-25,4,0,11,19,0,5,1,1,0.42,0.4242,0.67,0,38,96,134 +7782,2011-11-25,4,0,11,20,0,5,1,1,0.4,0.4091,0.58,0.0896,19,76,95 +7783,2011-11-25,4,0,11,21,0,5,1,1,0.36,0.3788,0.71,0,24,61,85 +7784,2011-11-25,4,0,11,22,0,5,1,1,0.34,0.3485,0.71,0.0896,12,46,58 +7785,2011-11-25,4,0,11,23,0,5,1,1,0.34,0.3485,0.76,0.1045,5,28,33 +7786,2011-11-26,4,0,11,0,0,6,0,1,0.34,0.3485,0.76,0.1045,9,38,47 +7787,2011-11-26,4,0,11,1,0,6,0,1,0.32,0.3333,0.76,0.0896,5,24,29 +7788,2011-11-26,4,0,11,2,0,6,0,1,0.3,0.3182,0.81,0.1045,3,20,23 +7789,2011-11-26,4,0,11,3,0,6,0,1,0.3,0.3333,0.81,0,8,9,17 +7790,2011-11-26,4,0,11,4,0,6,0,1,0.3,0.3333,0.81,0,0,4,4 +7791,2011-11-26,4,0,11,5,0,6,0,1,0.3,0.3182,0.75,0.1045,0,3,3 +7792,2011-11-26,4,0,11,6,0,6,0,2,0.3,0.3182,0.75,0.0896,2,8,10 +7793,2011-11-26,4,0,11,7,0,6,0,1,0.26,0.2727,0.87,0.1045,4,13,17 +7794,2011-11-26,4,0,11,8,0,6,0,1,0.32,0.3333,0.76,0.0896,10,50,60 +7795,2011-11-26,4,0,11,9,0,6,0,1,0.34,0.3485,0.76,0.0896,16,67,83 +7796,2011-11-26,4,0,11,10,0,6,0,1,0.36,0.3788,0.81,0,57,84,141 +7797,2011-11-26,4,0,11,11,0,6,0,1,0.4,0.4091,0.62,0.1045,107,123,230 +7798,2011-11-26,4,0,11,12,0,6,0,1,0.44,0.4394,0.51,0,137,172,309 +7799,2011-11-26,4,0,11,13,0,6,0,1,0.48,0.4697,0.44,0,177,148,325 +7800,2011-11-26,4,0,11,14,0,6,0,2,0.48,0.4697,0.48,0.1343,141,158,299 +7801,2011-11-26,4,0,11,15,0,6,0,2,0.5,0.4848,0.42,0.1045,154,160,314 +7802,2011-11-26,4,0,11,16,0,6,0,2,0.46,0.4545,0.47,0.1343,137,138,275 +7803,2011-11-26,4,0,11,17,0,6,0,2,0.46,0.4545,0.47,0.1343,62,122,184 
+7804,2011-11-26,4,0,11,18,0,6,0,2,0.42,0.4242,0.62,0.1642,67,118,185 +7805,2011-11-26,4,0,11,19,0,6,0,2,0.4,0.4091,0.71,0.0896,46,100,146 +7806,2011-11-26,4,0,11,20,0,6,0,2,0.42,0.4242,0.58,0,55,86,141 +7807,2011-11-26,4,0,11,21,0,6,0,2,0.38,0.3939,0.82,0,20,72,92 +7808,2011-11-26,4,0,11,22,0,6,0,2,0.38,0.3939,0.76,0,14,51,65 +7809,2011-11-26,4,0,11,23,0,6,0,1,0.36,0.3788,0.81,0,18,51,69 +7810,2011-11-27,4,0,11,0,0,0,0,1,0.36,0.3788,0.81,0,7,39,46 +7811,2011-11-27,4,0,11,1,0,0,0,1,0.36,0.3788,0.81,0,9,35,44 +7812,2011-11-27,4,0,11,2,0,0,0,1,0.34,0.3636,0.81,0,9,22,31 +7813,2011-11-27,4,0,11,3,0,0,0,1,0.34,0.3636,0.87,0,9,8,17 +7814,2011-11-27,4,0,11,4,0,0,0,1,0.34,0.3485,0.87,0.1045,0,4,4 +7815,2011-11-27,4,0,11,5,0,0,0,1,0.36,0.3485,0.87,0.194,0,5,5 +7816,2011-11-27,4,0,11,6,0,0,0,1,0.38,0.3939,0.82,0.1642,0,9,9 +7817,2011-11-27,4,0,11,7,0,0,0,1,0.38,0.3939,0.87,0.2239,12,11,23 +7818,2011-11-27,4,0,11,8,0,0,0,1,0.4,0.4091,0.82,0.2985,6,36,42 +7819,2011-11-27,4,0,11,9,0,0,0,1,0.46,0.4545,0.72,0.2836,21,90,111 +7820,2011-11-27,4,0,11,10,0,0,0,1,0.46,0.4545,0.72,0.2836,58,131,189 +7821,2011-11-27,4,0,11,11,0,0,0,1,0.5,0.4848,0.63,0.3582,83,157,240 +7822,2011-11-27,4,0,11,12,0,0,0,1,0.54,0.5152,0.56,0.2836,63,193,256 +7823,2011-11-27,4,0,11,13,0,0,0,1,0.54,0.5152,0.6,0.194,97,216,313 +7824,2011-11-27,4,0,11,14,0,0,0,1,0.62,0.6212,0.43,0.4627,113,200,313 +7825,2011-11-27,4,0,11,15,0,0,0,1,0.62,0.6212,0.43,0.2836,96,221,317 +7826,2011-11-27,4,0,11,16,0,0,0,1,0.56,0.5303,0.52,0.2537,94,229,323 +7827,2011-11-27,4,0,11,17,0,0,0,1,0.54,0.5152,0.56,0.194,37,167,204 +7828,2011-11-27,4,0,11,18,0,0,0,1,0.5,0.4848,0.63,0.1642,25,130,155 +7829,2011-11-27,4,0,11,19,0,0,0,1,0.48,0.4697,0.67,0.2239,30,109,139 +7830,2011-11-27,4,0,11,20,0,0,0,1,0.5,0.4848,0.63,0.2537,10,94,104 +7831,2011-11-27,4,0,11,21,0,0,0,1,0.48,0.4697,0.67,0.2836,13,75,88 +7832,2011-11-27,4,0,11,22,0,0,0,1,0.48,0.4697,0.72,0.2537,13,53,66 +7833,2011-11-27,4,0,11,23,0,0,0,1,0.48,0.4697,0.72,0.2537,5,27,32 
+7834,2011-11-28,4,0,11,0,0,1,1,1,0.46,0.4545,0.77,0.2836,6,10,16 +7835,2011-11-28,4,0,11,1,0,1,1,1,0.46,0.4545,0.77,0.2985,1,12,13 +7836,2011-11-28,4,0,11,3,0,1,1,1,0.44,0.4394,0.88,0.2239,1,4,5 +7837,2011-11-28,4,0,11,4,0,1,1,1,0.44,0.4394,0.82,0.0896,0,4,4 +7838,2011-11-28,4,0,11,5,0,1,1,1,0.4,0.4091,0.87,0,0,34,34 +7839,2011-11-28,4,0,11,6,0,1,1,1,0.42,0.4242,0.82,0.0896,6,98,104 +7840,2011-11-28,4,0,11,7,0,1,1,1,0.42,0.4242,0.88,0,7,270,277 +7841,2011-11-28,4,0,11,8,0,1,1,2,0.42,0.4242,0.94,0,13,394,407 +7842,2011-11-28,4,0,11,9,0,1,1,2,0.44,0.4394,0.88,0.2239,17,191,208 +7843,2011-11-28,4,0,11,10,0,1,1,2,0.48,0.4697,0.77,0.1343,21,86,107 +7844,2011-11-28,4,0,11,11,0,1,1,2,0.52,0.5,0.68,0.1045,11,107,118 +7845,2011-11-28,4,0,11,12,0,1,1,1,0.56,0.5303,0.6,0.1045,13,165,178 +7846,2011-11-28,4,0,11,13,0,1,1,2,0.58,0.5455,0.56,0,25,137,162 +7847,2011-11-28,4,0,11,14,0,1,1,1,0.6,0.6212,0.56,0,14,129,143 +7848,2011-11-28,4,0,11,15,0,1,1,1,0.58,0.5455,0.6,0.1642,18,132,150 +7849,2011-11-28,4,0,11,16,0,1,1,1,0.56,0.5303,0.6,0.194,24,240,264 +7850,2011-11-28,4,0,11,17,0,1,1,1,0.58,0.5455,0.56,0.194,24,444,468 +7851,2011-11-28,4,0,11,18,0,1,1,1,0.56,0.5303,0.64,0.2239,20,396,416 +7852,2011-11-28,4,0,11,19,0,1,1,1,0.54,0.5152,0.73,0.2239,4,260,264 +7853,2011-11-28,4,0,11,20,0,1,1,2,0.54,0.5152,0.73,0.2537,12,206,218 +7854,2011-11-28,4,0,11,21,0,1,1,2,0.54,0.5152,0.77,0.2239,4,152,156 +7855,2011-11-28,4,0,11,22,0,1,1,2,0.52,0.5,0.83,0.1045,7,102,109 +7856,2011-11-28,4,0,11,23,0,1,1,2,0.52,0.5,0.83,0.1343,5,41,46 +7857,2011-11-29,4,0,11,0,0,2,1,1,0.52,0.5,0.83,0,4,18,22 +7858,2011-11-29,4,0,11,1,0,2,1,1,0.52,0.5,0.83,0.0896,1,16,17 +7859,2011-11-29,4,0,11,2,0,2,1,2,0.5,0.4848,0.88,0,0,5,5 +7860,2011-11-29,4,0,11,3,0,2,1,2,0.5,0.4848,0.88,0,0,2,2 +7861,2011-11-29,4,0,11,4,0,2,1,2,0.52,0.5,0.83,0.3284,1,5,6 +7862,2011-11-29,4,0,11,5,0,2,1,1,0.5,0.4848,0.88,0.2836,1,21,22 +7863,2011-11-29,4,0,11,6,0,2,1,1,0.52,0.5,0.83,0.2836,1,88,89 
+7864,2011-11-29,4,0,11,7,0,2,1,1,0.52,0.5,0.77,0.2239,12,330,342 +7865,2011-11-29,4,0,11,8,0,2,1,1,0.54,0.5152,0.77,0.2836,9,440,449 +7866,2011-11-29,4,0,11,9,0,2,1,3,0.56,0.5303,0.73,0.4179,5,197,202 +7867,2011-11-29,4,0,11,10,0,2,1,3,0.56,0.5303,0.73,0.4179,2,34,36 +7868,2011-11-29,4,0,11,11,0,2,1,3,0.5,0.4848,0.88,0.2985,1,10,11 +7869,2011-11-29,4,0,11,12,0,2,1,3,0.42,0.4242,0.82,0.4179,1,17,18 +7870,2011-11-29,4,0,11,13,0,2,1,3,0.42,0.4242,0.77,0.2836,0,22,22 +7871,2011-11-29,4,0,11,14,0,2,1,3,0.4,0.4091,0.94,0.194,1,35,36 +7872,2011-11-29,4,0,11,15,0,2,1,3,0.4,0.4091,0.87,0.3284,5,60,65 +7873,2011-11-29,4,0,11,16,0,2,1,3,0.4,0.4091,0.87,0.2239,5,146,151 +7874,2011-11-29,4,0,11,17,0,2,1,2,0.4,0.4091,0.87,0.2985,8,346,354 +7875,2011-11-29,4,0,11,18,0,2,1,2,0.4,0.4091,0.82,0.2985,11,347,358 +7876,2011-11-29,4,0,11,19,0,2,1,2,0.4,0.4091,0.82,0.3284,6,244,250 +7877,2011-11-29,4,0,11,20,0,2,1,2,0.38,0.3939,0.87,0.2537,8,176,184 +7878,2011-11-29,4,0,11,21,0,2,1,2,0.38,0.3939,0.87,0.2537,9,116,125 +7879,2011-11-29,4,0,11,22,0,2,1,1,0.38,0.3939,0.82,0.2985,5,90,95 +7880,2011-11-29,4,0,11,23,0,2,1,1,0.36,0.3333,0.76,0.3881,0,53,53 +7881,2011-11-30,4,0,11,0,0,3,1,1,0.36,0.3333,0.62,0.4179,1,23,24 +7882,2011-11-30,4,0,11,1,0,3,1,1,0.34,0.303,0.66,0.3284,0,8,8 +7883,2011-11-30,4,0,11,2,0,3,1,1,0.32,0.3182,0.7,0.1642,0,5,5 +7884,2011-11-30,4,0,11,3,0,3,1,1,0.32,0.3182,0.7,0.1642,0,1,1 +7885,2011-11-30,4,0,11,4,0,3,1,1,0.3,0.2879,0.75,0.2239,0,5,5 +7886,2011-11-30,4,0,11,5,0,3,1,1,0.28,0.2727,0.81,0.194,1,21,22 +7887,2011-11-30,4,0,11,6,0,3,1,1,0.26,0.2727,0.87,0.1045,3,100,103 +7888,2011-11-30,4,0,11,7,0,3,1,1,0.26,0.2727,0.81,0.1045,9,276,285 +7889,2011-11-30,4,0,11,8,0,3,1,1,0.28,0.2727,0.81,0.194,13,465,478 +7890,2011-11-30,4,0,11,9,0,3,1,1,0.3,0.2879,0.75,0.2836,7,186,193 +7891,2011-11-30,4,0,11,10,0,3,1,1,0.34,0.3182,0.66,0.2239,15,95,110 +7892,2011-11-30,4,0,11,11,0,3,1,1,0.38,0.3939,0.54,0.2985,10,112,122 
+7893,2011-11-30,4,0,11,12,0,3,1,2,0.36,0.3333,0.53,0.2836,12,148,160 +7894,2011-11-30,4,0,11,13,0,3,1,2,0.38,0.3939,0.43,0.3582,16,131,147 +7895,2011-11-30,4,0,11,14,0,3,1,2,0.38,0.3939,0.43,0.3582,13,99,112 +7896,2011-11-30,4,0,11,15,0,3,1,1,0.36,0.3333,0.46,0.3284,11,104,115 +7897,2011-11-30,4,0,11,16,0,3,1,1,0.36,0.3333,0.46,0.2836,27,199,226 +7898,2011-11-30,4,0,11,17,0,3,1,2,0.34,0.303,0.49,0.3284,15,365,380 +7899,2011-11-30,4,0,11,18,0,3,1,1,0.34,0.303,0.49,0.3284,7,369,376 +7900,2011-11-30,4,0,11,19,0,3,1,2,0.32,0.303,0.57,0.2836,6,273,279 +7901,2011-11-30,4,0,11,20,0,3,1,2,0.32,0.303,0.57,0.2836,6,184,190 +7902,2011-11-30,4,0,11,21,0,3,1,1,0.32,0.2879,0.53,0.3881,7,128,135 +7903,2011-11-30,4,0,11,22,0,3,1,1,0.3,0.2727,0.56,0.2985,5,82,87 +7904,2011-11-30,4,0,11,23,0,3,1,1,0.28,0.2576,0.52,0.2836,4,46,50 +7905,2011-12-01,4,0,12,0,0,4,1,1,0.28,0.2576,0.52,0.3284,1,19,20 +7906,2011-12-01,4,0,12,1,0,4,1,1,0.26,0.2424,0.6,0.2836,1,9,10 +7907,2011-12-01,4,0,12,2,0,4,1,1,0.26,0.2273,0.56,0.2985,1,8,9 +7908,2011-12-01,4,0,12,3,0,4,1,1,0.26,0.2424,0.56,0.2537,1,6,7 +7909,2011-12-01,4,0,12,4,0,4,1,1,0.26,0.2424,0.56,0.2836,0,1,1 +7910,2011-12-01,4,0,12,5,0,4,1,1,0.26,0.2424,0.56,0.2537,1,23,24 +7911,2011-12-01,4,0,12,6,0,4,1,1,0.24,0.2121,0.65,0.2836,5,92,97 +7912,2011-12-01,4,0,12,7,0,4,1,1,0.24,0.2121,0.65,0.3582,11,265,276 +7913,2011-12-01,4,0,12,8,0,4,1,1,0.26,0.2273,0.6,0.3284,15,462,477 +7914,2011-12-01,4,0,12,9,0,4,1,1,0.3,0.2727,0.52,0.3284,9,215,224 +7915,2011-12-01,4,0,12,10,0,4,1,1,0.32,0.303,0.53,0.2985,7,86,93 +7916,2011-12-01,4,0,12,11,0,4,1,1,0.34,0.3182,0.49,0.2537,7,95,102 +7917,2011-12-01,4,0,12,12,0,4,1,1,0.36,0.3485,0.46,0.2239,9,139,148 +7918,2011-12-01,4,0,12,13,0,4,1,1,0.4,0.4091,0.4,0.2239,17,137,154 +7919,2011-12-01,4,0,12,14,0,4,1,1,0.4,0.4091,0.4,0.2239,20,130,150 +7920,2011-12-01,4,0,12,15,0,4,1,1,0.4,0.4091,0.4,0.2239,10,144,154 +7921,2011-12-01,4,0,12,16,0,4,1,1,0.4,0.4091,0.4,0.194,9,192,201 
+7922,2011-12-01,4,0,12,17,0,4,1,1,0.36,0.3485,0.46,0.1343,9,409,418 +7923,2011-12-01,4,0,12,18,0,4,1,1,0.36,0.3636,0.43,0.1045,12,393,405 +7924,2011-12-01,4,0,12,19,0,4,1,1,0.34,0.3485,0.46,0.1045,9,219,228 +7925,2011-12-01,4,0,12,20,0,4,1,1,0.34,0.3485,0.46,0.1045,12,178,190 +7926,2011-12-01,4,0,12,21,0,4,1,1,0.3,0.3182,0.61,0.1045,8,154,162 +7927,2011-12-01,4,0,12,22,0,4,1,1,0.3,0.3333,0.61,0,5,99,104 +7928,2011-12-01,4,0,12,23,0,4,1,1,0.26,0.2879,0.7,0.0896,3,70,73 +7929,2011-12-02,4,0,12,0,0,5,1,1,0.26,0.2879,0.7,0.0896,6,32,38 +7930,2011-12-02,4,0,12,1,0,5,1,1,0.24,0.2879,0.75,0,1,14,15 +7931,2011-12-02,4,0,12,2,0,5,1,1,0.24,0.2879,0.75,0,0,10,10 +7932,2011-12-02,4,0,12,3,0,5,1,1,0.22,0.2727,0.8,0,0,1,1 +7933,2011-12-02,4,0,12,4,0,5,1,1,0.22,0.2727,0.8,0,0,3,3 +7934,2011-12-02,4,0,12,5,0,5,1,1,0.22,0.2576,0.8,0.0896,0,23,23 +7935,2011-12-02,4,0,12,6,0,5,1,1,0.22,0.2727,0.8,0,3,84,87 +7936,2011-12-02,4,0,12,7,0,5,1,1,0.22,0.2576,0.87,0.0896,6,199,205 +7937,2011-12-02,4,0,12,8,0,5,1,1,0.24,0.2879,0.87,0,13,432,445 +7938,2011-12-02,4,0,12,9,0,5,1,1,0.28,0.3182,0.81,0,12,234,246 +7939,2011-12-02,4,0,12,10,0,5,1,1,0.3,0.2879,0.75,0.2239,16,89,105 +7940,2011-12-02,4,0,12,11,0,5,1,1,0.34,0.3333,0.66,0.1642,18,142,160 +7941,2011-12-02,4,0,12,12,0,5,1,1,0.36,0.3485,0.57,0.1343,18,186,204 +7942,2011-12-02,4,0,12,13,0,5,1,1,0.4,0.4091,0.47,0.1343,19,188,207 +7943,2011-12-02,4,0,12,14,0,5,1,1,0.42,0.4242,0.44,0.1045,25,145,170 +7944,2011-12-02,4,0,12,15,0,5,1,1,0.42,0.4242,0.41,0.0896,24,175,199 +7945,2011-12-02,4,0,12,16,0,5,1,1,0.42,0.4242,0.41,0.0896,22,254,276 +7946,2011-12-02,4,0,12,17,0,5,1,1,0.42,0.4242,0.38,0,20,391,411 +7947,2011-12-02,4,0,12,18,0,5,1,1,0.4,0.4091,0.5,0,10,362,372 +7948,2011-12-02,4,0,12,19,0,5,1,1,0.34,0.3182,0.53,0.2537,13,252,265 +7949,2011-12-02,4,0,12,20,0,5,1,1,0.38,0.3939,0.4,0.2836,12,178,190 +7950,2011-12-02,4,0,12,21,0,5,1,1,0.34,0.3333,0.49,0.194,9,102,111 +7951,2011-12-02,4,0,12,22,0,5,1,1,0.32,0.3182,0.53,0.194,7,95,102 
+7952,2011-12-02,4,0,12,23,0,5,1,1,0.32,0.303,0.53,0.2836,14,81,95 +7953,2011-12-03,4,0,12,0,0,6,0,1,0.3,0.2879,0.56,0.2537,10,65,75 +7954,2011-12-03,4,0,12,1,0,6,0,1,0.26,0.2576,0.65,0.194,9,62,71 +7955,2011-12-03,4,0,12,2,0,6,0,1,0.26,0.2576,0.65,0.1642,9,41,50 +7956,2011-12-03,4,0,12,3,0,6,0,1,0.24,0.2424,0.7,0.1343,4,5,9 +7957,2011-12-03,4,0,12,4,0,6,0,1,0.22,0.2273,0.75,0.1343,1,7,8 +7958,2011-12-03,4,0,12,5,0,6,0,1,0.24,0.2576,0.7,0.0896,0,6,6 +7959,2011-12-03,4,0,12,6,0,6,0,1,0.24,0.2576,0.65,0.1045,1,10,11 +7960,2011-12-03,4,0,12,7,0,6,0,1,0.22,0.2576,0.75,0.0896,2,24,26 +7961,2011-12-03,4,0,12,8,0,6,0,1,0.24,0.2576,0.75,0.0896,1,62,63 +7962,2011-12-03,4,0,12,9,0,6,0,1,0.26,0.2727,0.7,0.1045,25,99,124 +7963,2011-12-03,4,0,12,10,0,6,0,1,0.32,0.3333,0.61,0.1343,22,173,195 +7964,2011-12-03,4,0,12,11,0,6,0,1,0.32,0.3333,0.57,0.0896,48,227,275 +7965,2011-12-03,4,0,12,12,0,6,0,1,0.36,0.3788,0.5,0,78,280,358 +7966,2011-12-03,4,0,12,13,0,6,0,1,0.36,0.3636,0.5,0.0896,70,222,292 +7967,2011-12-03,4,0,12,14,0,6,0,1,0.36,0.3788,0.46,0,92,251,343 +7968,2011-12-03,4,0,12,15,0,6,0,1,0.38,0.3939,0.46,0,100,237,337 +7969,2011-12-03,4,0,12,16,0,6,0,1,0.38,0.3939,0.46,0,75,230,305 +7970,2011-12-03,4,0,12,17,0,6,0,1,0.36,0.3788,0.62,0,46,186,232 +7971,2011-12-03,4,0,12,18,0,6,0,1,0.34,0.3333,0.53,0.1343,31,193,224 +7972,2011-12-03,4,0,12,19,0,6,0,1,0.3,0.3182,0.61,0.0896,31,144,175 +7973,2011-12-03,4,0,12,20,0,6,0,1,0.3,0.3182,0.61,0.0896,17,110,127 +7974,2011-12-03,4,0,12,21,0,6,0,2,0.3,0.3182,0.61,0.0896,5,104,109 +7975,2011-12-03,4,0,12,22,0,6,0,2,0.32,0.3333,0.61,0.0896,17,97,114 +7976,2011-12-03,4,0,12,23,0,6,0,1,0.3,0.303,0.7,0.1343,12,73,85 +7977,2011-12-04,4,0,12,0,0,0,0,1,0.3,0.3182,0.7,0.1045,19,64,83 +7978,2011-12-04,4,0,12,1,0,0,0,1,0.3,0.3182,0.75,0.0896,20,54,74 +7979,2011-12-04,4,0,12,2,0,0,0,1,0.26,0.303,0.81,0,11,51,62 +7980,2011-12-04,4,0,12,3,0,0,0,1,0.28,0.3182,0.75,0,7,34,41 +7981,2011-12-04,4,0,12,4,0,0,0,1,0.26,0.303,0.87,0,2,9,11 
+7982,2011-12-04,4,0,12,5,0,0,0,1,0.26,0.303,0.81,0,2,2,4 +7983,2011-12-04,4,0,12,6,0,0,0,1,0.24,0.2879,0.93,0,1,3,4 +7984,2011-12-04,4,0,12,7,0,0,0,1,0.26,0.303,0.87,0,3,23,26 +7985,2011-12-04,4,0,12,8,0,0,0,1,0.26,0.303,0.87,0,7,48,55 +7986,2011-12-04,4,0,12,9,0,0,0,1,0.3,0.3333,0.81,0,12,114,126 +7987,2011-12-04,4,0,12,10,0,0,0,1,0.32,0.3333,0.81,0.1343,36,180,216 +7988,2011-12-04,4,0,12,11,0,0,0,1,0.34,0.3333,0.81,0.1642,48,206,254 +7989,2011-12-04,4,0,12,12,0,0,0,1,0.36,0.3485,0.81,0.1642,82,247,329 +7990,2011-12-04,4,0,12,13,0,0,0,1,0.4,0.4091,0.66,0.2239,88,269,357 +7991,2011-12-04,4,0,12,14,0,0,0,1,0.42,0.4242,0.62,0.1343,77,246,323 +7992,2011-12-04,4,0,12,15,0,0,0,1,0.42,0.4242,0.62,0.194,69,227,296 +7993,2011-12-04,4,0,12,16,0,0,0,1,0.42,0.4242,0.58,0.2239,64,266,330 +7994,2011-12-04,4,0,12,17,0,0,0,1,0.38,0.3939,0.76,0.1343,38,207,245 +7995,2011-12-04,4,0,12,18,0,0,0,1,0.38,0.3939,0.71,0,10,161,171 +7996,2011-12-04,4,0,12,19,0,0,0,1,0.38,0.3939,0.76,0,15,158,173 +7997,2011-12-04,4,0,12,20,0,0,0,1,0.36,0.3485,0.76,0.1343,8,116,124 +7998,2011-12-04,4,0,12,21,0,0,0,1,0.36,0.3636,0.81,0.1045,7,65,72 +7999,2011-12-04,4,0,12,22,0,0,0,1,0.34,0.3485,0.87,0.1045,5,59,64 +8000,2011-12-04,4,0,12,23,0,0,0,1,0.34,0.3485,0.87,0.1045,3,42,45 +8001,2011-12-05,4,0,12,0,0,1,1,1,0.32,0.3333,0.87,0.0896,3,21,24 +8002,2011-12-05,4,0,12,1,0,1,1,1,0.32,0.3485,0.87,0,2,10,12 +8003,2011-12-05,4,0,12,2,0,1,1,1,0.32,0.3485,0.87,0,0,8,8 +8004,2011-12-05,4,0,12,3,0,1,1,1,0.32,0.3485,0.87,0,0,2,2 +8005,2011-12-05,4,0,12,4,0,1,1,1,0.3,0.3182,0.87,0.1045,0,7,7 +8006,2011-12-05,4,0,12,5,0,1,1,1,0.3,0.3182,0.87,0.1045,1,24,25 +8007,2011-12-05,4,0,12,6,0,1,1,1,0.3,0.3182,0.93,0.0896,1,89,90 +8008,2011-12-05,4,0,12,7,0,1,1,1,0.32,0.3485,0.9,0,9,274,283 +8009,2011-12-05,4,0,12,8,0,1,1,1,0.32,0.3485,0.93,0,12,392,404 +8010,2011-12-05,4,0,12,9,0,1,1,2,0.36,0.3636,0.87,0.0896,12,178,190 +8011,2011-12-05,4,0,12,10,0,1,1,2,0.36,0.3636,0.87,0.0896,9,78,87 
+8012,2011-12-05,4,0,12,11,0,1,1,2,0.36,0.3636,0.87,0.0896,17,106,123 +8013,2011-12-05,4,0,12,12,0,1,1,2,0.4,0.4091,0.76,0.1045,29,125,154 +8014,2011-12-05,4,0,12,13,0,1,1,2,0.42,0.4242,0.71,0.0896,18,153,171 +8015,2011-12-05,4,0,12,14,0,1,1,2,0.46,0.4545,0.67,0,7,126,133 +8016,2011-12-05,4,0,12,15,0,1,1,2,0.46,0.4545,0.72,0.0896,16,132,148 +8017,2011-12-05,4,0,12,16,0,1,1,1,0.5,0.4848,0.63,0.0896,10,238,248 +8018,2011-12-05,4,0,12,17,0,1,1,1,0.42,0.4242,0.77,0,16,430,446 +8019,2011-12-05,4,0,12,18,0,1,1,1,0.44,0.4394,0.77,0,13,386,399 +8020,2011-12-05,4,0,12,19,0,1,1,2,0.46,0.4545,0.77,0,13,295,308 +8021,2011-12-05,4,0,12,20,0,1,1,2,0.46,0.4545,0.82,0.1045,15,190,205 +8022,2011-12-05,4,0,12,21,0,1,1,2,0.44,0.4394,0.88,0.0896,10,166,176 +8023,2011-12-05,4,0,12,22,0,1,1,2,0.44,0.4394,0.88,0.1343,12,94,106 +8024,2011-12-05,4,0,12,23,0,1,1,2,0.46,0.4545,0.88,0.1343,8,54,62 +8025,2011-12-06,4,0,12,0,0,2,1,2,0.5,0.4848,0.77,0.2985,3,25,28 +8026,2011-12-06,4,0,12,1,0,2,1,2,0.46,0.4545,0.88,0.2985,2,14,16 +8027,2011-12-06,4,0,12,2,0,2,1,2,0.46,0.4545,0.87,0.194,0,3,3 +8028,2011-12-06,4,0,12,3,0,2,1,2,0.46,0.4545,0.87,0.194,0,3,3 +8029,2011-12-06,4,0,12,4,0,2,1,3,0.46,0.4545,0.88,0.2239,0,7,7 +8030,2011-12-06,4,0,12,5,0,2,1,3,0.46,0.4545,0.88,0.2239,0,20,20 +8031,2011-12-06,4,0,12,6,0,2,1,2,0.44,0.4394,0.94,0.2836,5,81,86 +8032,2011-12-06,4,0,12,7,0,2,1,2,0.46,0.4545,0.94,0.194,7,227,234 +8033,2011-12-06,4,0,12,8,0,2,1,2,0.46,0.4545,0.94,0.194,13,401,414 +8034,2011-12-06,4,0,12,9,0,2,1,2,0.46,0.4545,0.94,0.2239,4,200,204 +8035,2011-12-06,4,0,12,10,0,2,1,3,0.46,0.4545,0.94,0.2836,6,44,50 +8036,2011-12-06,4,0,12,11,0,2,1,3,0.46,0.4545,0.94,0.2537,4,28,32 +8037,2011-12-06,4,0,12,12,0,2,1,3,0.46,0.4545,1,0.2239,5,50,55 +8038,2011-12-06,4,0,12,13,0,2,1,3,0.46,0.4545,1,0.194,6,65,71 +8039,2011-12-06,4,0,12,14,0,2,1,3,0.48,0.4697,1,0.2239,6,69,75 +8040,2011-12-06,4,0,12,15,0,2,1,3,0.5,0.4848,1,0.194,2,43,45 +8041,2011-12-06,4,0,12,16,0,2,1,3,0.46,0.4545,1,0.2537,4,84,88 
+8042,2011-12-06,4,0,12,17,0,2,1,2,0.46,0.4545,1,0.2239,13,249,262 +8043,2011-12-06,4,0,12,18,0,2,1,2,0.46,0.4545,1,0.2836,17,287,304 +8044,2011-12-06,4,0,12,19,0,2,1,3,0.46,0.4545,1,0.2985,10,223,233 +8045,2011-12-06,4,0,12,20,0,2,1,3,0.46,0.4545,1,0.2985,3,107,110 +8046,2011-12-06,4,0,12,21,0,2,1,2,0.44,0.4394,1,0.194,6,108,114 +8047,2011-12-06,4,0,12,22,0,2,1,3,0.46,0.4545,1,0.1642,5,82,87 +8048,2011-12-06,4,0,12,23,0,2,1,3,0.46,0.4545,1,0.1642,5,48,53 +8049,2011-12-07,4,0,12,0,0,3,1,1,0.44,0.4394,1,0.194,3,30,33 +8050,2011-12-07,4,0,12,1,0,3,1,2,0.46,0.4545,1,0.2537,4,9,13 +8051,2011-12-07,4,0,12,2,0,3,1,3,0.46,0.4545,1,0.2239,2,4,6 +8052,2011-12-07,4,0,12,3,0,3,1,3,0.46,0.4545,1,0.2239,0,3,3 +8053,2011-12-07,4,0,12,4,0,3,1,3,0.46,0.4545,1,0.1642,0,1,1 +8054,2011-12-07,4,0,12,5,0,3,1,3,0.46,0.4545,1,0.1642,0,3,3 +8055,2011-12-07,4,0,12,6,0,3,1,3,0.48,0.4697,1,0.4179,0,18,18 +8056,2011-12-07,4,0,12,7,0,3,1,3,0.44,0.4394,1,0.1045,3,43,46 +8057,2011-12-07,4,0,12,8,0,3,1,3,0.44,0.4394,1,0,6,80,86 +8058,2011-12-07,4,0,12,9,0,3,1,3,0.44,0.4394,1,0.2537,0,64,64 +8059,2011-12-07,4,0,12,10,0,3,1,3,0.44,0.4394,1,0.0896,1,32,33 +8060,2011-12-07,4,0,12,11,0,3,1,3,0.44,0.4394,1,0.0896,3,47,50 +8061,2011-12-07,4,0,12,12,0,3,1,3,0.44,0.4394,1,0.1343,4,29,33 +8062,2011-12-07,4,0,12,13,0,3,1,3,0.44,0.4394,1,0.194,5,28,33 +8063,2011-12-07,4,0,12,14,0,3,1,3,0.44,0.4394,1,0.194,1,24,25 +8064,2011-12-07,4,0,12,15,0,3,1,3,0.42,0.4242,0.94,0.2239,5,25,30 +8065,2011-12-07,4,0,12,16,0,3,1,3,0.42,0.4242,0.94,0.194,3,28,31 +8066,2011-12-07,4,0,12,17,0,3,1,3,0.4,0.4091,1,0.3284,4,48,52 +8067,2011-12-07,4,0,12,18,0,3,1,3,0.4,0.4091,0.94,0.2985,3,48,51 +8068,2011-12-07,4,0,12,19,0,3,1,3,0.34,0.2879,0.87,0.6418,2,31,33 +8069,2011-12-07,4,0,12,20,0,3,1,3,0.34,0.2879,0.87,0.6418,1,25,26 +8070,2011-12-07,4,0,12,21,0,3,1,3,0.3,0.2576,0.87,0.5224,0,6,6 +8071,2011-12-07,4,0,12,22,0,3,1,3,0.24,0.197,0.93,0.4478,0,13,13 +8072,2011-12-07,4,0,12,23,0,3,1,1,0.24,0.2121,0.93,0.3881,0,16,16 
+8073,2011-12-08,4,0,12,0,0,4,1,2,0.26,0.2273,0.87,0.2985,0,21,21 +8074,2011-12-08,4,0,12,1,0,4,1,1,0.26,0.2273,0.81,0.3582,1,6,7 +8075,2011-12-08,4,0,12,2,0,4,1,1,0.28,0.2727,0.65,0.2239,0,4,4 +8076,2011-12-08,4,0,12,3,0,4,1,1,0.26,0.2273,0.6,0.2985,0,1,1 +8077,2011-12-08,4,0,12,4,0,4,1,2,0.24,0.197,0.6,0.4627,0,2,2 +8078,2011-12-08,4,0,12,5,0,4,1,1,0.22,0.2121,0.64,0.2239,0,12,12 +8079,2011-12-08,4,0,12,6,0,4,1,1,0.22,0.2121,0.55,0.2836,0,71,71 +8080,2011-12-08,4,0,12,7,0,4,1,1,0.22,0.2121,0.6,0.2239,11,233,244 +8081,2011-12-08,4,0,12,8,0,4,1,1,0.22,0.2121,0.55,0.2836,11,418,429 +8082,2011-12-08,4,0,12,9,0,4,1,1,0.24,0.2121,0.52,0.3881,7,218,225 +8083,2011-12-08,4,0,12,10,0,4,1,1,0.26,0.2273,0.52,0.3284,1,109,110 +8084,2011-12-08,4,0,12,11,0,4,1,1,0.28,0.2727,0.52,0.2537,6,100,106 +8085,2011-12-08,4,0,12,12,0,4,1,1,0.28,0.2576,0.52,0.3284,8,148,156 +8086,2011-12-08,4,0,12,13,0,4,1,1,0.3,0.2727,0.49,0.3582,9,115,124 +8087,2011-12-08,4,0,12,14,0,4,1,1,0.32,0.303,0.45,0.2537,16,121,137 +8088,2011-12-08,4,0,12,15,0,4,1,1,0.32,0.3333,0.45,0.1343,19,98,117 +8089,2011-12-08,4,0,12,16,0,4,1,1,0.32,0.3182,0.45,0.194,7,201,208 +8090,2011-12-08,4,0,12,17,0,4,1,1,0.3,0.3182,0.49,0.1045,10,321,331 +8091,2011-12-08,4,0,12,18,0,4,1,1,0.28,0.2727,0.52,0.2239,12,306,318 +8092,2011-12-08,4,0,12,19,0,4,1,1,0.26,0.2727,0.56,0.1045,12,220,232 +8093,2011-12-08,4,0,12,20,0,4,1,1,0.26,0.2727,0.6,0.1045,5,170,175 +8094,2011-12-08,4,0,12,21,0,4,1,1,0.28,0.303,0.61,0.0896,4,135,139 +8095,2011-12-08,4,0,12,22,0,4,1,1,0.26,0.2727,0.65,0.1343,8,84,92 +8096,2011-12-08,4,0,12,23,0,4,1,1,0.24,0.2576,0.7,0.1045,3,58,61 +8097,2011-12-09,4,0,12,0,0,5,1,1,0.24,0.2879,0.65,0,8,26,34 +8098,2011-12-09,4,0,12,1,0,5,1,1,0.24,0.2879,0.65,0,2,12,14 +8099,2011-12-09,4,0,12,2,0,5,1,1,0.24,0.2879,0.7,0,2,10,12 +8100,2011-12-09,4,0,12,3,0,5,1,1,0.22,0.2727,0.8,0,1,5,6 +8101,2011-12-09,4,0,12,4,0,5,1,1,0.22,0.2727,0.8,0,1,1,2 +8102,2011-12-09,4,0,12,5,0,5,1,1,0.22,0.2727,0.8,0,0,20,20 
+8103,2011-12-09,4,0,12,6,0,5,1,1,0.22,0.2576,0.8,0.0896,1,62,63 +8104,2011-12-09,4,0,12,7,0,5,1,1,0.22,0.2576,0.75,0.0896,4,195,199 +8105,2011-12-09,4,0,12,8,0,5,1,1,0.24,0.2879,0.81,0,7,378,385 +8106,2011-12-09,4,0,12,9,0,5,1,1,0.26,0.2879,0.81,0.0896,8,256,264 +8107,2011-12-09,4,0,12,10,0,5,1,2,0.3,0.303,0.7,0.1642,11,115,126 +8108,2011-12-09,4,0,12,11,0,5,1,1,0.32,0.303,0.7,0.2537,15,121,136 +8109,2011-12-09,4,0,12,12,0,5,1,1,0.36,0.3333,0.57,0.2985,27,146,173 +8110,2011-12-09,4,0,12,13,0,5,1,1,0.38,0.3939,0.5,0.194,27,141,168 +8111,2011-12-09,4,0,12,14,0,5,1,1,0.38,0.3939,0.46,0.2836,24,163,187 +8112,2011-12-09,4,0,12,15,0,5,1,1,0.38,0.3939,0.46,0.2836,29,160,189 +8113,2011-12-09,4,0,12,16,0,5,1,1,0.36,0.3485,0.62,0.1343,24,226,250 +8114,2011-12-09,4,0,12,17,0,5,1,1,0.34,0.3485,0.66,0.1045,10,352,362 +8115,2011-12-09,4,0,12,18,0,5,1,1,0.34,0.3636,0.61,0,10,308,318 +8116,2011-12-09,4,0,12,19,0,5,1,1,0.34,0.3636,0.61,0,10,189,199 +8117,2011-12-09,4,0,12,20,0,5,1,1,0.3,0.3333,0.81,0,10,140,150 +8118,2011-12-09,4,0,12,21,0,5,1,1,0.3,0.3333,0.75,0,9,131,140 +8119,2011-12-09,4,0,12,22,0,5,1,1,0.28,0.3182,0.87,0,15,114,129 +8120,2011-12-09,4,0,12,23,0,5,1,1,0.28,0.3182,0.81,0,6,88,94 +8121,2011-12-10,4,0,12,0,0,6,0,2,0.26,0.2727,0.81,0.1045,11,66,77 +8122,2011-12-10,4,0,12,1,0,6,0,2,0.26,0.2576,0.7,0.1642,11,55,66 +8123,2011-12-10,4,0,12,2,0,6,0,2,0.28,0.303,0.61,0.0896,12,48,60 +8124,2011-12-10,4,0,12,3,0,6,0,2,0.28,0.303,0.61,0.0896,5,20,25 +8125,2011-12-10,4,0,12,4,0,6,0,2,0.26,0.2727,0.65,0.1045,0,3,3 +8126,2011-12-10,4,0,12,5,0,6,0,1,0.26,0.2576,0.6,0.1642,0,6,6 +8127,2011-12-10,4,0,12,6,0,6,0,1,0.24,0.2576,0.65,0.1045,1,10,11 +8128,2011-12-10,4,0,12,7,0,6,0,1,0.24,0.2273,0.65,0.194,0,11,11 +8129,2011-12-10,4,0,12,8,0,6,0,1,0.26,0.2727,0.65,0.1343,5,68,73 +8130,2011-12-10,4,0,12,9,0,6,0,1,0.28,0.2576,0.61,0.2836,15,134,149 +8131,2011-12-10,4,0,12,10,0,6,0,1,0.32,0.303,0.57,0.2537,31,161,192 +8132,2011-12-10,4,0,12,11,0,6,0,1,0.32,0.2879,0.53,0.4179,44,192,236 
+8133,2011-12-10,4,0,12,12,0,6,0,1,0.32,0.2879,0.45,0.4179,48,235,283 +8134,2011-12-10,4,0,12,13,0,6,0,1,0.34,0.303,0.34,0.4179,47,239,286 +8135,2011-12-10,4,0,12,14,0,6,0,1,0.34,0.303,0.31,0.3284,51,242,293 +8136,2011-12-10,4,0,12,15,0,6,0,1,0.34,0.303,0.34,0.3582,60,239,299 +8137,2011-12-10,4,0,12,16,0,6,0,1,0.32,0.2879,0.31,0.3881,58,183,241 +8138,2011-12-10,4,0,12,17,0,6,0,1,0.28,0.2576,0.36,0.3284,34,151,185 +8139,2011-12-10,4,0,12,18,0,6,0,1,0.26,0.2576,0.38,0.2239,28,144,172 +8140,2011-12-10,4,0,12,19,0,6,0,1,0.26,0.2424,0.38,0.2836,15,139,154 +8141,2011-12-10,4,0,12,20,0,6,0,1,0.24,0.2273,0.35,0.2239,10,114,124 +8142,2011-12-10,4,0,12,21,0,6,0,1,0.22,0.2121,0.44,0.2239,4,79,83 +8143,2011-12-10,4,0,12,22,0,6,0,1,0.22,0.2273,0.41,0.1343,7,88,95 +8144,2011-12-10,4,0,12,23,0,6,0,1,0.2,0.2121,0.47,0.1642,5,61,66 +8145,2011-12-11,4,0,12,0,0,0,0,1,0.2,0.197,0.51,0.2239,5,69,74 +8146,2011-12-11,4,0,12,1,0,0,0,1,0.16,0.1515,0.59,0.2239,8,54,62 +8147,2011-12-11,4,0,12,2,0,0,0,1,0.18,0.1667,0.55,0.2537,5,47,52 +8148,2011-12-11,4,0,12,3,0,0,0,1,0.16,0.1818,0.59,0.1343,2,17,19 +8149,2011-12-11,4,0,12,4,0,0,0,1,0.16,0.1667,0.59,0.1642,0,5,5 +8150,2011-12-11,4,0,12,5,0,0,0,1,0.18,0.2121,0.59,0.1045,0,6,6 +8151,2011-12-11,4,0,12,6,0,0,0,1,0.16,0.2273,0.59,0,0,10,10 +8152,2011-12-11,4,0,12,7,0,0,0,1,0.18,0.2424,0.55,0,1,30,31 +8153,2011-12-11,4,0,12,8,0,0,0,1,0.16,0.2273,0.69,0,4,68,72 +8154,2011-12-11,4,0,12,9,0,0,0,1,0.2,0.2576,0.59,0,19,88,107 +8155,2011-12-11,4,0,12,10,0,0,0,1,0.24,0.2879,0.44,0,30,164,194 +8156,2011-12-11,4,0,12,11,0,0,0,1,0.26,0.303,0.35,0,31,166,197 +8157,2011-12-11,4,0,12,12,0,0,0,1,0.26,0.303,0.35,0,42,246,288 +8158,2011-12-11,4,0,12,13,0,0,0,1,0.3,0.3333,0.31,0,44,224,268 +8159,2011-12-11,4,0,12,14,0,0,0,1,0.3,0.3333,0.28,0,44,198,242 +8160,2011-12-11,4,0,12,15,0,0,0,1,0.3,0.3333,0.26,0,47,219,266 +8161,2011-12-11,4,0,12,16,0,0,0,1,0.3,0.3333,0.26,0,32,174,206 +8162,2011-12-11,4,0,12,17,0,0,0,1,0.28,0.3182,0.3,0,13,150,163 
+8163,2011-12-11,4,0,12,18,0,0,0,1,0.24,0.2273,0.48,0.194,12,151,163 +8164,2011-12-11,4,0,12,19,0,0,0,1,0.22,0.2273,0.55,0.194,7,98,105 +8165,2011-12-11,4,0,12,20,0,0,0,1,0.22,0.2424,0.55,0.1045,10,62,72 +8166,2011-12-11,4,0,12,21,0,0,0,1,0.22,0.2727,0.55,0,10,53,63 +8167,2011-12-11,4,0,12,22,0,0,0,1,0.22,0.2727,0.55,0,7,36,43 +8168,2011-12-11,4,0,12,23,0,0,0,1,0.2,0.2576,0.69,0,4,31,35 +8169,2011-12-12,4,0,12,0,0,1,1,1,0.2,0.2273,0.69,0.0896,4,16,20 +8170,2011-12-12,4,0,12,1,0,1,1,1,0.2,0.2576,0.69,0,0,9,9 +8171,2011-12-12,4,0,12,2,0,1,1,1,0.2,0.2576,0.69,0,0,4,4 +8172,2011-12-12,4,0,12,3,0,1,1,1,0.18,0.2424,0.74,0,0,2,2 +8173,2011-12-12,4,0,12,4,0,1,1,1,0.18,0.2424,0.74,0,0,3,3 +8174,2011-12-12,4,0,12,5,0,1,1,1,0.16,0.2273,0.8,0,0,21,21 +8175,2011-12-12,4,0,12,6,0,1,1,1,0.16,0.1818,0.8,0.1045,1,63,64 +8176,2011-12-12,4,0,12,7,0,1,1,1,0.14,0.1515,0.86,0.1343,4,231,235 +8177,2011-12-12,4,0,12,8,0,1,1,2,0.16,0.1818,0.8,0.1045,7,378,385 +8178,2011-12-12,4,0,12,9,0,1,1,2,0.18,0.2121,0.74,0.0896,12,208,220 +8179,2011-12-12,4,0,12,10,0,1,1,2,0.22,0.2424,0.69,0.1045,14,91,105 +8180,2011-12-12,4,0,12,11,0,1,1,2,0.26,0.2879,0.6,0.0896,15,100,115 +8181,2011-12-12,4,0,12,12,0,1,1,1,0.28,0.2879,0.52,0.1045,7,138,145 +8182,2011-12-12,4,0,12,13,0,1,1,2,0.3,0.3182,0.52,0.1045,13,153,166 +8183,2011-12-12,4,0,12,14,0,1,1,2,0.32,0.3333,0.53,0.1343,16,95,111 +8184,2011-12-12,4,0,12,15,0,1,1,2,0.32,0.3333,0.57,0.1045,8,124,132 +8185,2011-12-12,4,0,12,16,0,1,1,2,0.32,0.3333,0.57,0.0896,5,209,214 +8186,2011-12-12,4,0,12,17,0,1,1,2,0.3,0.3333,0.56,0,7,352,359 +8187,2011-12-12,4,0,12,18,0,1,1,1,0.3,0.3182,0.56,0.0896,8,349,357 +8188,2011-12-12,4,0,12,19,0,1,1,1,0.28,0.303,0.61,0.0896,9,232,241 +8189,2011-12-12,4,0,12,20,0,1,1,1,0.3,0.3182,0.56,0.0896,8,157,165 +8190,2011-12-12,4,0,12,21,0,1,1,1,0.26,0.303,0.81,0,2,119,121 +8191,2011-12-12,4,0,12,22,0,1,1,1,0.26,0.303,0.7,0,0,71,71 +8192,2011-12-12,4,0,12,23,0,1,1,1,0.24,0.2879,0.75,0,3,42,45 
+8193,2011-12-13,4,0,12,0,0,2,1,1,0.22,0.2727,0.8,0,1,16,17 +8194,2011-12-13,4,0,12,1,0,2,1,1,0.2,0.2576,0.8,0,0,4,4 +8195,2011-12-13,4,0,12,2,0,2,1,1,0.2,0.2576,0.8,0,1,1,2 +8196,2011-12-13,4,0,12,3,0,2,1,1,0.18,0.2424,0.8,0,0,3,3 +8197,2011-12-13,4,0,12,4,0,2,1,1,0.2,0.2576,0.8,0,0,4,4 +8198,2011-12-13,4,0,12,5,0,2,1,1,0.18,0.2424,0.86,0,0,20,20 +8199,2011-12-13,4,0,12,6,0,2,1,1,0.2,0.2576,0.8,0,0,92,92 +8200,2011-12-13,4,0,12,7,0,2,1,1,0.16,0.1667,0.8,0.1642,2,221,223 +8201,2011-12-13,4,0,12,8,0,2,1,1,0.2,0.2273,0.69,0.1045,9,391,400 +8202,2011-12-13,4,0,12,9,0,2,1,1,0.26,0.2576,0.6,0.1642,10,223,233 +8203,2011-12-13,4,0,12,10,0,2,1,1,0.3,0.303,0.52,0.1642,3,99,102 +8204,2011-12-13,4,0,12,11,0,2,1,1,0.34,0.3182,0.46,0.2537,9,115,124 +8205,2011-12-13,4,0,12,12,0,2,1,1,0.36,0.3333,0.43,0.3582,20,147,167 +8206,2011-12-13,4,0,12,13,0,2,1,1,0.4,0.4091,0.37,0.3284,7,120,127 +8207,2011-12-13,4,0,12,14,0,2,1,1,0.4,0.4091,0.37,0.2537,18,116,134 +8208,2011-12-13,4,0,12,15,0,2,1,1,0.4,0.4091,0.35,0.2537,7,144,151 +8209,2011-12-13,4,0,12,16,0,2,1,1,0.38,0.3939,0.37,0.2537,13,194,207 +8210,2011-12-13,4,0,12,17,0,2,1,1,0.36,0.3485,0.4,0.194,17,381,398 +8211,2011-12-13,4,0,12,18,0,2,1,1,0.34,0.3182,0.42,0.2239,12,370,382 +8212,2011-12-13,4,0,12,19,0,2,1,1,0.32,0.3182,0.49,0.1642,5,262,267 +8213,2011-12-13,4,0,12,20,0,2,1,1,0.32,0.3182,0.49,0.1642,7,154,161 +8214,2011-12-13,4,0,12,21,0,2,1,1,0.3,0.303,0.52,0.1642,5,134,139 +8215,2011-12-13,4,0,12,22,0,2,1,1,0.3,0.303,0.52,0.1642,5,102,107 +8216,2011-12-13,4,0,12,23,0,2,1,1,0.26,0.303,0.7,0,4,55,59 +8217,2011-12-14,4,0,12,0,0,3,1,1,0.26,0.303,0.75,0,0,22,22 +8218,2011-12-14,4,0,12,1,0,3,1,1,0.26,0.303,0.75,0,0,8,8 +8219,2011-12-14,4,0,12,2,0,3,1,1,0.26,0.303,0.75,0,0,2,2 +8220,2011-12-14,4,0,12,3,0,3,1,1,0.28,0.3182,0.65,0,0,3,3 +8221,2011-12-14,4,0,12,4,0,3,1,1,0.26,0.303,0.75,0,0,4,4 +8222,2011-12-14,4,0,12,5,0,3,1,2,0.26,0.303,0.81,0,0,26,26 +8223,2011-12-14,4,0,12,6,0,3,1,2,0.26,0.303,0.87,0,0,87,87 
+8224,2011-12-14,4,0,12,7,0,3,1,1,0.28,0.3182,0.75,0,4,243,247 +8225,2011-12-14,4,0,12,8,0,3,1,2,0.26,0.303,0.87,0,9,458,467 +8226,2011-12-14,4,0,12,9,0,3,1,2,0.3,0.3333,0.61,0,9,230,239 +8227,2011-12-14,4,0,12,10,0,3,1,2,0.32,0.3485,0.57,0,9,133,142 +8228,2011-12-14,4,0,12,11,0,3,1,2,0.34,0.3636,0.55,0,12,103,115 +8229,2011-12-14,4,0,12,12,0,3,1,2,0.36,0.3636,0.57,0.0896,6,138,144 +8230,2011-12-14,4,0,12,13,0,3,1,2,0.36,0.3636,0.57,0.1045,14,119,133 +8231,2011-12-14,4,0,12,14,0,3,1,2,0.38,0.3939,0.54,0.0896,17,134,151 +8232,2011-12-14,4,0,12,15,0,3,1,2,0.38,0.3939,0.54,0.1045,24,143,167 +8233,2011-12-14,4,0,12,16,0,3,1,2,0.38,0.3939,0.54,0.1045,12,233,245 +8234,2011-12-14,4,0,12,17,0,3,1,2,0.36,0.3636,0.57,0.1045,19,377,396 +8235,2011-12-14,4,0,12,18,0,3,1,2,0.36,0.3485,0.62,0.1642,11,379,390 +8236,2011-12-14,4,0,12,19,0,3,1,2,0.36,0.3636,0.62,0.1045,9,259,268 +8237,2011-12-14,4,0,12,20,0,3,1,2,0.34,0.3485,0.66,0.1045,6,169,175 +8238,2011-12-14,4,0,12,21,0,3,1,2,0.34,0.3333,0.66,0.1642,9,145,154 +8239,2011-12-14,4,0,12,22,0,3,1,1,0.34,0.3333,0.66,0.1642,5,99,104 +8240,2011-12-14,4,0,12,23,0,3,1,1,0.32,0.3182,0.7,0.1642,3,48,51 +8241,2011-12-15,4,0,12,0,0,4,1,1,0.32,0.3182,0.7,0.1642,5,45,50 +8242,2011-12-15,4,0,12,1,0,4,1,2,0.32,0.3182,0.7,0.1642,5,15,20 +8243,2011-12-15,4,0,12,2,0,4,1,2,0.32,0.3333,0.7,0.1343,2,5,7 +8244,2011-12-15,4,0,12,3,0,4,1,2,0.32,0.3333,0.7,0.1343,1,2,3 +8245,2011-12-15,4,0,12,4,0,4,1,2,0.32,0.303,0.7,0.2239,0,6,6 +8246,2011-12-15,4,0,12,5,0,4,1,2,0.32,0.303,0.7,0.2836,0,24,24 +8247,2011-12-15,4,0,12,6,0,4,1,1,0.32,0.3182,0.66,0.194,2,90,92 +8248,2011-12-15,4,0,12,7,0,4,1,1,0.32,0.3182,0.7,0.194,6,252,258 +8249,2011-12-15,4,0,12,8,0,4,1,2,0.34,0.3182,0.66,0.2239,8,449,457 +8250,2011-12-15,4,0,12,9,0,4,1,2,0.36,0.3333,0.62,0.2836,6,274,280 +8251,2011-12-15,4,0,12,10,0,4,1,2,0.4,0.4091,0.58,0.2985,4,105,109 +8252,2011-12-15,4,0,12,11,0,4,1,2,0.4,0.4091,0.62,0.4179,9,126,135 
+8253,2011-12-15,4,0,12,12,0,4,1,2,0.46,0.4545,0.59,0.4925,17,140,157 +8254,2011-12-15,4,0,12,13,0,4,1,2,0.5,0.4848,0.55,0.3582,14,150,164 +8255,2011-12-15,4,0,12,14,0,4,1,2,0.52,0.5,0.52,0.3284,8,124,132 +8256,2011-12-15,4,0,12,15,0,4,1,3,0.52,0.5,0.55,0.2239,6,76,82 +8257,2011-12-15,4,0,12,16,0,4,1,3,0.46,0.4545,0.72,0.1642,7,124,131 +8258,2011-12-15,4,0,12,17,0,4,1,2,0.52,0.5,0.59,0.2836,14,310,324 +8259,2011-12-15,4,0,12,18,0,4,1,2,0.52,0.5,0.59,0.2985,14,363,377 +8260,2011-12-15,4,0,12,19,0,4,1,2,0.52,0.5,0.59,0.2836,13,274,287 +8261,2011-12-15,4,0,12,20,0,4,1,1,0.52,0.5,0.59,0.3284,12,209,221 +8262,2011-12-15,4,0,12,21,0,4,1,1,0.5,0.4848,0.63,0.2985,10,145,155 +8263,2011-12-15,4,0,12,22,0,4,1,1,0.52,0.5,0.63,0.3284,8,116,124 +8264,2011-12-15,4,0,12,23,0,4,1,1,0.52,0.5,0.63,0.3284,10,104,114 +8265,2011-12-16,4,0,12,0,0,5,1,1,0.5,0.4848,0.72,0.3284,12,60,72 +8266,2011-12-16,4,0,12,1,0,5,1,1,0.5,0.4848,0.72,0.2239,3,28,31 +8267,2011-12-16,4,0,12,2,0,5,1,1,0.48,0.4697,0.77,0.1045,0,23,23 +8268,2011-12-16,4,0,12,3,0,5,1,1,0.46,0.4545,0.82,0.1343,2,3,5 +8269,2011-12-16,4,0,12,4,0,5,1,2,0.5,0.4848,0.59,0.194,1,4,5 +8270,2011-12-16,4,0,12,5,0,5,1,1,0.48,0.4697,0.44,0.3881,0,26,26 +8271,2011-12-16,4,0,12,6,0,5,1,2,0.44,0.4394,0.33,0.2985,2,80,82 +8272,2011-12-16,4,0,12,7,0,5,1,2,0.38,0.3939,0.46,0.2985,5,226,231 +8273,2011-12-16,4,0,12,8,0,5,1,1,0.36,0.3333,0.46,0.4179,10,449,459 +8274,2011-12-16,4,0,12,9,0,5,1,1,0.36,0.3333,0.46,0.3881,8,235,243 +8275,2011-12-16,4,0,12,10,0,5,1,1,0.34,0.303,0.46,0.2985,10,122,132 +8276,2011-12-16,4,0,12,11,0,5,1,2,0.34,0.303,0.42,0.2985,9,135,144 +8277,2011-12-16,4,0,12,12,0,5,1,2,0.34,0.303,0.42,0.3582,10,163,173 +8278,2011-12-16,4,0,12,13,0,5,1,2,0.34,0.303,0.39,0.3582,12,125,137 +8279,2011-12-16,4,0,12,14,0,5,1,2,0.34,0.3182,0.42,0.2239,13,115,128 +8280,2011-12-16,4,0,12,15,0,5,1,2,0.32,0.303,0.45,0.2985,12,141,153 +8281,2011-12-16,4,0,12,16,0,5,1,2,0.32,0.303,0.45,0.2239,12,212,224 
+8282,2011-12-16,4,0,12,17,0,5,1,2,0.32,0.303,0.45,0.2836,13,352,365 +8283,2011-12-16,4,0,12,18,0,5,1,2,0.32,0.303,0.45,0.2537,10,285,295 +8284,2011-12-16,4,0,12,19,0,5,1,2,0.32,0.303,0.45,0.2239,11,203,214 +8285,2011-12-16,4,0,12,20,0,5,1,2,0.32,0.3182,0.45,0.194,6,146,152 +8286,2011-12-16,4,0,12,21,0,5,1,2,0.32,0.3182,0.45,0.1642,7,95,102 +8287,2011-12-16,4,0,12,22,0,5,1,2,0.3,0.303,0.49,0.1642,6,100,106 +8288,2011-12-16,4,0,12,23,0,5,1,2,0.3,0.303,0.49,0.1343,4,71,75 +8289,2011-12-17,4,0,12,0,0,6,0,1,0.3,0.2879,0.49,0.2836,3,59,62 +8290,2011-12-17,4,0,12,1,0,6,0,2,0.3,0.2879,0.52,0.2537,4,39,43 +8291,2011-12-17,4,0,12,2,0,6,0,2,0.28,0.2727,0.56,0.2239,11,47,58 +8292,2011-12-17,4,0,12,3,0,6,0,1,0.26,0.2576,0.6,0.194,4,22,26 +8293,2011-12-17,4,0,12,4,0,6,0,1,0.24,0.2424,0.65,0.1642,0,9,9 +8294,2011-12-17,4,0,12,5,0,6,0,2,0.26,0.2576,0.6,0.194,2,7,9 +8295,2011-12-17,4,0,12,6,0,6,0,2,0.26,0.2424,0.56,0.2836,0,17,17 +8296,2011-12-17,4,0,12,7,0,6,0,2,0.26,0.2576,0.56,0.2239,0,20,20 +8297,2011-12-17,4,0,12,8,0,6,0,2,0.26,0.2576,0.56,0.2239,4,55,59 +8298,2011-12-17,4,0,12,9,0,6,0,2,0.26,0.2424,0.56,0.2537,5,95,100 +8299,2011-12-17,4,0,12,10,0,6,0,2,0.26,0.2424,0.52,0.2836,10,171,181 +8300,2011-12-17,4,0,12,11,0,6,0,2,0.28,0.2576,0.45,0.3284,21,188,209 +8301,2011-12-17,4,0,12,12,0,6,0,2,0.28,0.2576,0.45,0.2836,28,216,244 +8302,2011-12-17,4,0,12,13,0,6,0,2,0.28,0.2727,0.48,0.2537,32,204,236 +8303,2011-12-17,4,0,12,14,0,6,0,1,0.28,0.2576,0.45,0.3582,30,207,237 +8304,2011-12-17,4,0,12,15,0,6,0,1,0.28,0.2576,0.45,0.2836,33,195,228 +8305,2011-12-17,4,0,12,16,0,6,0,1,0.28,0.2727,0.45,0.2239,30,192,222 +8306,2011-12-17,4,0,12,17,0,6,0,1,0.26,0.2576,0.48,0.2239,12,153,165 +8307,2011-12-17,4,0,12,18,0,6,0,1,0.26,0.2273,0.48,0.3881,7,150,157 +8308,2011-12-17,4,0,12,19,0,6,0,3,0.24,0.2121,0.65,0.3284,11,95,106 +8309,2011-12-17,4,0,12,20,0,6,0,1,0.22,0.2121,0.69,0.2537,7,99,106 +8310,2011-12-17,4,0,12,21,0,6,0,1,0.2,0.2121,0.75,0.1642,8,88,96 
+8311,2011-12-17,4,0,12,22,0,6,0,1,0.2,0.2121,0.75,0.1642,7,73,80 +8312,2011-12-17,4,0,12,23,0,6,0,1,0.2,0.2273,0.75,0,6,63,69 +8313,2011-12-18,4,0,12,0,0,0,0,1,0.2,0.2121,0.75,0.1343,7,64,71 +8314,2011-12-18,4,0,12,1,0,0,0,1,0.2,0.2273,0.69,0.1045,3,43,46 +8315,2011-12-18,4,0,12,2,0,0,0,1,0.22,0.2424,0.55,0.1045,2,39,41 +8316,2011-12-18,4,0,12,3,0,0,0,1,0.22,0.2121,0.55,0.2239,2,16,18 +8317,2011-12-18,4,0,12,4,0,0,0,1,0.22,0.2121,0.55,0.2239,0,1,1 +8318,2011-12-18,4,0,12,5,0,0,0,1,0.2,0.2273,0.59,0.1045,1,4,5 +8319,2011-12-18,4,0,12,6,0,0,0,1,0.2,0.2121,0.59,0.1343,0,3,3 +8320,2011-12-18,4,0,12,7,0,0,0,1,0.2,0.2121,0.59,0.1642,1,11,12 +8321,2011-12-18,4,0,12,8,0,0,0,1,0.2,0.2576,0.69,0,1,31,32 +8322,2011-12-18,4,0,12,9,0,0,0,1,0.22,0.2273,0.64,0.1642,6,75,81 +8323,2011-12-18,4,0,12,10,0,0,0,1,0.24,0.2273,0.6,0.2537,7,122,129 +8324,2011-12-18,4,0,12,11,0,0,0,2,0.24,0.2273,0.6,0.2239,20,168,188 +8325,2011-12-18,4,0,12,12,0,0,0,1,0.26,0.2576,0.6,0.2239,26,202,228 +8326,2011-12-18,4,0,12,13,0,0,0,1,0.3,0.2727,0.49,0.2985,18,210,228 +8327,2011-12-18,4,0,12,14,0,0,0,1,0.3,0.2879,0.49,0.2836,19,200,219 +8328,2011-12-18,4,0,12,15,0,0,0,1,0.32,0.303,0.45,0.2836,23,184,207 +8329,2011-12-18,4,0,12,16,0,0,0,1,0.28,0.2727,0.48,0.2537,27,207,234 +8330,2011-12-18,4,0,12,17,0,0,0,1,0.28,0.2727,0.46,0.1642,10,126,136 +8331,2011-12-18,4,0,12,18,0,0,0,1,0.28,0.2727,0.45,0.194,15,123,138 +8332,2011-12-18,4,0,12,19,0,0,0,1,0.26,0.2727,0.52,0.1343,5,130,135 +8333,2011-12-18,4,0,12,20,0,0,0,1,0.24,0.2576,0.6,0.1045,10,94,104 +8334,2011-12-18,4,0,12,21,0,0,0,1,0.22,0.2727,0.64,0,9,80,89 +8335,2011-12-18,4,0,12,22,0,0,0,1,0.22,0.2273,0.75,0.194,2,47,49 +8336,2011-12-18,4,0,12,23,0,0,0,1,0.2,0.2273,0.75,0.1045,6,31,37 +8337,2011-12-19,4,0,12,0,0,1,1,1,0.2,0.2273,0.75,0.0896,3,14,17 +8338,2011-12-19,4,0,12,1,0,1,1,1,0.18,0.2121,0.8,0.0896,3,8,11 +8339,2011-12-19,4,0,12,2,0,1,1,1,0.18,0.2121,0.8,0.0896,0,3,3 +8340,2011-12-19,4,0,12,3,0,1,1,1,0.16,0.197,0.86,0.0896,0,3,3 
+8341,2011-12-19,4,0,12,4,0,1,1,1,0.16,0.1818,0.86,0.1343,0,4,4 +8342,2011-12-19,4,0,12,5,0,1,1,1,0.14,0.1667,0.93,0.1045,0,21,21 +8343,2011-12-19,4,0,12,6,0,1,1,1,0.16,0.1818,0.86,0.1045,0,68,68 +8344,2011-12-19,4,0,12,7,0,1,1,1,0.18,0.2121,0.74,0.1045,4,187,191 +8345,2011-12-19,4,0,12,8,0,1,1,1,0.2,0.197,0.75,0.194,8,389,397 +8346,2011-12-19,4,0,12,9,0,1,1,1,0.22,0.2121,0.69,0.2239,17,166,183 +8347,2011-12-19,4,0,12,10,0,1,1,1,0.24,0.2273,0.67,0.194,12,96,108 +8348,2011-12-19,4,0,12,11,0,1,1,1,0.26,0.2424,0.65,0.2537,22,105,127 +8349,2011-12-19,4,0,12,12,0,1,1,1,0.3,0.2727,0.61,0.3284,13,128,141 +8350,2011-12-19,4,0,12,13,0,1,1,1,0.34,0.303,0.49,0.3284,15,127,142 +8351,2011-12-19,4,0,12,14,0,1,1,1,0.36,0.3485,0.46,0.2239,30,107,137 +8352,2011-12-19,4,0,12,15,0,1,1,1,0.38,0.3939,0.46,0.1642,24,135,159 +8353,2011-12-19,4,0,12,16,0,1,1,1,0.4,0.4091,0.4,0.2537,19,189,208 +8354,2011-12-19,4,0,12,17,0,1,1,1,0.38,0.3939,0.46,0.194,16,362,378 +8355,2011-12-19,4,0,12,18,0,1,1,1,0.36,0.3485,0.5,0.194,11,343,354 +8356,2011-12-19,4,0,12,19,0,1,1,2,0.36,0.3485,0.53,0.1642,17,234,251 +8357,2011-12-19,4,0,12,20,0,1,1,1,0.4,0.4091,0.4,0.1642,9,197,206 +8358,2011-12-19,4,0,12,21,0,1,1,2,0.36,0.3485,0.53,0.1343,15,112,127 +8359,2011-12-19,4,0,12,22,0,1,1,2,0.36,0.3485,0.57,0.1642,12,95,107 +8360,2011-12-19,4,0,12,23,0,1,1,1,0.36,0.3485,0.53,0.1642,10,50,60 +8361,2011-12-20,4,0,12,0,0,2,1,1,0.36,0.3636,0.57,0.1045,0,21,21 +8362,2011-12-20,4,0,12,1,0,2,1,1,0.36,0.3788,0.53,0,0,6,6 +8363,2011-12-20,4,0,12,2,0,2,1,2,0.34,0.3636,0.61,0,1,10,11 +8364,2011-12-20,4,0,12,3,0,2,1,2,0.34,0.3636,0.61,0,1,1,2 +8365,2011-12-20,4,0,12,4,0,2,1,2,0.36,0.3636,0.53,0.1045,0,4,4 +8366,2011-12-20,4,0,12,5,0,2,1,2,0.36,0.3788,0.57,0,0,15,15 +8367,2011-12-20,4,0,12,6,0,2,1,2,0.36,0.3788,0.57,0,1,71,72 +8368,2011-12-20,4,0,12,7,0,2,1,2,0.36,0.3788,0.62,0,7,261,268 +8369,2011-12-20,4,0,12,8,0,2,1,2,0.42,0.4242,0.54,0.0896,10,422,432 +8370,2011-12-20,4,0,12,9,0,2,1,2,0.36,0.3788,0.66,0,14,245,259 
+8371,2011-12-20,4,0,12,10,0,2,1,2,0.42,0.4242,0.58,0.194,20,109,129 +8372,2011-12-20,4,0,12,11,0,2,1,2,0.4,0.4091,0.62,0.1642,13,116,129 +8373,2011-12-20,4,0,12,12,0,2,1,2,0.44,0.4394,0.54,0.1343,16,159,175 +8374,2011-12-20,4,0,12,13,0,2,1,2,0.44,0.4394,0.54,0.1343,18,163,181 +8375,2011-12-20,4,0,12,14,0,2,1,2,0.44,0.4394,0.58,0.1045,11,127,138 +8376,2011-12-20,4,0,12,15,0,2,1,2,0.44,0.4394,0.54,0,15,139,154 +8377,2011-12-20,4,0,12,16,0,2,1,1,0.44,0.4394,0.54,0,20,218,238 +8378,2011-12-20,4,0,12,17,0,2,1,1,0.4,0.4091,0.62,0.1642,14,417,431 +8379,2011-12-20,4,0,12,18,0,2,1,1,0.38,0.3939,0.66,0.194,26,385,411 +8380,2011-12-20,4,0,12,19,0,2,1,1,0.38,0.3939,0.66,0,6,218,224 +8381,2011-12-20,4,0,12,20,0,2,1,1,0.36,0.3788,0.66,0,3,144,147 +8382,2011-12-20,4,0,12,21,0,2,1,1,0.36,0.3636,0.66,0.0896,4,139,143 +8383,2011-12-20,4,0,12,22,0,2,1,2,0.36,0.3788,0.66,0,3,104,107 +8384,2011-12-20,4,0,12,23,0,2,1,2,0.38,0.3939,0.62,0,13,40,53 +8385,2011-12-21,1,0,12,0,0,3,1,2,0.34,0.3333,0.71,0.1343,7,18,25 +8386,2011-12-21,1,0,12,1,0,3,1,2,0.36,0.3636,0.66,0.1045,1,10,11 +8387,2011-12-21,1,0,12,2,0,3,1,2,0.36,0.3788,0.71,0,0,4,4 +8388,2011-12-21,1,0,12,3,0,3,1,2,0.36,0.3788,0.71,0,0,2,2 +8389,2011-12-21,1,0,12,4,0,3,1,2,0.36,0.3788,0.71,0,0,2,2 +8390,2011-12-21,1,0,12,5,0,3,1,2,0.38,0.3939,0.82,0.1045,0,28,28 +8391,2011-12-21,1,0,12,6,0,3,1,2,0.36,0.3788,0.87,0,1,75,76 +8392,2011-12-21,1,0,12,7,0,3,1,2,0.36,0.3636,0.87,0.1045,5,224,229 +8393,2011-12-21,1,0,12,8,0,3,1,3,0.36,0.3788,0.93,0,12,393,405 +8394,2011-12-21,1,0,12,9,0,3,1,2,0.4,0.4091,0.87,0.2239,14,220,234 +8395,2011-12-21,1,0,12,10,0,3,1,2,0.48,0.4697,0.82,0.4179,6,83,89 +8396,2011-12-21,1,0,12,11,0,3,1,2,0.48,0.4697,0.82,0.4179,3,53,56 +8397,2011-12-21,1,0,12,12,0,3,1,3,0.46,0.4545,0.88,0.3881,3,58,61 +8398,2011-12-21,1,0,12,13,0,3,1,3,0.44,0.4394,0.94,0.2985,3,55,58 +8399,2011-12-21,1,0,12,14,0,3,1,2,0.48,0.4697,0.88,0.3582,11,40,51 +8400,2011-12-21,1,0,12,15,0,3,1,2,0.48,0.4697,0.88,0.3881,4,57,61 
+8401,2011-12-21,1,0,12,16,0,3,1,3,0.44,0.4394,1,0.3284,6,95,101 +8402,2011-12-21,1,0,12,17,0,3,1,3,0.44,0.4394,1,0.3284,3,226,229 +8403,2011-12-21,1,0,12,18,0,3,1,2,0.44,0.4394,1,0.2985,2,278,280 +8404,2011-12-21,1,0,12,19,0,3,1,2,0.5,0.4848,0.94,0.3284,9,200,209 +8405,2011-12-21,1,0,12,20,0,3,1,2,0.5,0.4848,0.94,0.3582,5,162,167 +8406,2011-12-21,1,0,12,21,0,3,1,1,0.5,0.4848,0.88,0.2836,9,114,123 +8407,2011-12-21,1,0,12,22,0,3,1,1,0.5,0.4848,0.88,0.2537,3,102,105 +8408,2011-12-21,1,0,12,23,0,3,1,1,0.5,0.4848,0.88,0.194,0,54,54 +8409,2011-12-22,1,0,12,0,0,4,1,2,0.5,0.4848,0.82,0.1045,3,24,27 +8410,2011-12-22,1,0,12,1,0,4,1,1,0.44,0.4394,0.94,0.0896,0,15,15 +8411,2011-12-22,1,0,12,2,0,4,1,1,0.48,0.4697,0.82,0.1045,1,10,11 +8412,2011-12-22,1,0,12,3,0,4,1,1,0.44,0.4394,0.67,0.1642,0,6,6 +8413,2011-12-22,1,0,12,4,0,4,1,1,0.38,0.3939,0.82,0.1343,0,3,3 +8414,2011-12-22,1,0,12,5,0,4,1,1,0.38,0.3939,0.82,0,1,15,16 +8415,2011-12-22,1,0,12,6,0,4,1,1,0.36,0.3788,0.87,0,1,63,64 +8416,2011-12-22,1,0,12,7,0,4,1,1,0.34,0.3636,0.93,0,4,182,186 +8417,2011-12-22,1,0,12,8,0,4,1,2,0.36,0.3788,0.87,0,10,333,343 +8418,2011-12-22,1,0,12,9,0,4,1,2,0.38,0.3939,0.87,0.0896,10,218,228 +8419,2011-12-22,1,0,12,10,0,4,1,2,0.4,0.4091,0.82,0.0896,13,128,141 +8420,2011-12-22,1,0,12,11,0,4,1,2,0.44,0.4394,0.77,0,10,135,145 +8421,2011-12-22,1,0,12,12,0,4,1,2,0.48,0.4697,0.55,0,22,180,202 +8422,2011-12-22,1,0,12,13,0,4,1,2,0.46,0.4545,0.63,0,21,167,188 +8423,2011-12-22,1,0,12,14,0,4,1,2,0.46,0.4545,0.63,0,23,136,159 +8424,2011-12-22,1,0,12,15,0,4,1,2,0.48,0.4697,0.59,0,28,186,214 +8425,2011-12-22,1,0,12,16,0,4,1,2,0.46,0.4545,0.63,0,30,226,256 +8426,2011-12-22,1,0,12,17,0,4,1,2,0.44,0.4394,0.67,0.1642,24,332,356 +8427,2011-12-22,1,0,12,18,0,4,1,2,0.44,0.4394,0.62,0.0896,11,277,288 +8428,2011-12-22,1,0,12,19,0,4,1,2,0.44,0.4394,0.62,0,12,130,142 +8429,2011-12-22,1,0,12,20,0,4,1,3,0.42,0.4242,0.71,0,0,33,33 +8430,2011-12-22,1,0,12,21,0,4,1,3,0.42,0.4242,0.71,0,2,15,17 
+8431,2011-12-22,1,0,12,22,0,4,1,3,0.36,0.3636,0.93,0.1045,0,17,17 +8432,2011-12-22,1,0,12,23,0,4,1,3,0.4,0.4091,0.87,0,1,10,11 +8433,2011-12-23,1,0,12,0,0,5,1,3,0.4,0.4091,0.87,0.194,2,7,9 +8434,2011-12-23,1,0,12,1,0,5,1,3,0.4,0.4091,0.87,0.194,1,11,12 +8435,2011-12-23,1,0,12,2,0,5,1,3,0.38,0.3939,0.94,0.1343,1,12,13 +8436,2011-12-23,1,0,12,3,0,5,1,3,0.38,0.3939,0.94,0.1343,0,4,4 +8437,2011-12-23,1,0,12,4,0,5,1,3,0.38,0.3939,0.94,0.2836,0,2,2 +8438,2011-12-23,1,0,12,5,0,5,1,1,0.36,0.3485,0.93,0.194,0,8,8 +8439,2011-12-23,1,0,12,6,0,5,1,1,0.38,0.3939,0.82,0.2537,0,40,40 +8440,2011-12-23,1,0,12,7,0,5,1,1,0.38,0.3939,0.82,0.2239,4,88,92 +8441,2011-12-23,1,0,12,8,0,5,1,1,0.4,0.4091,0.76,0.2537,9,173,182 +8442,2011-12-23,1,0,12,9,0,5,1,1,0.4,0.4091,0.66,0.2985,4,152,156 +8443,2011-12-23,1,0,12,10,0,5,1,1,0.4,0.4091,0.62,0.3582,8,96,104 +8444,2011-12-23,1,0,12,11,0,5,1,1,0.4,0.4091,0.58,0.4478,26,148,174 +8445,2011-12-23,1,0,12,12,0,5,1,1,0.4,0.4091,0.54,0.5224,14,156,170 +8446,2011-12-23,1,0,12,13,0,5,1,1,0.4,0.4091,0.5,0.4627,17,177,194 +8447,2011-12-23,1,0,12,14,0,5,1,1,0.4,0.4091,0.54,0.4179,23,177,200 +8448,2011-12-23,1,0,12,15,0,5,1,1,0.38,0.3939,0.54,0.3582,12,191,203 +8449,2011-12-23,1,0,12,16,0,5,1,1,0.38,0.3939,0.5,0.2985,15,166,181 +8450,2011-12-23,1,0,12,17,0,5,1,1,0.36,0.3485,0.53,0.2239,11,129,140 +8451,2011-12-23,1,0,12,18,0,5,1,1,0.36,0.3333,0.5,0.2537,4,91,95 +8452,2011-12-23,1,0,12,19,0,5,1,1,0.34,0.3333,0.53,0.194,1,75,76 +8453,2011-12-23,1,0,12,20,0,5,1,1,0.32,0.303,0.61,0.3284,3,47,50 +8454,2011-12-23,1,0,12,21,0,5,1,1,0.32,0.303,0.61,0.2239,0,32,32 +8455,2011-12-23,1,0,12,22,0,5,1,1,0.32,0.3182,0.66,0.194,4,43,47 +8456,2011-12-23,1,0,12,23,0,5,1,1,0.32,0.3333,0.66,0.1343,4,21,25 +8457,2011-12-24,1,0,12,0,0,6,0,2,0.32,0.3182,0.7,0.1642,3,20,23 +8458,2011-12-24,1,0,12,1,0,6,0,2,0.32,0.3485,0.66,0,1,15,16 +8459,2011-12-24,1,0,12,2,0,6,0,2,0.32,0.3333,0.66,0.0896,4,22,26 +8460,2011-12-24,1,0,12,3,0,6,0,2,0.32,0.3333,0.7,0.0896,0,5,5 
+8461,2011-12-24,1,0,12,4,0,6,0,2,0.32,0.3182,0.66,0.1642,0,3,3 +8462,2011-12-24,1,0,12,5,0,6,0,2,0.32,0.3182,0.66,0.1642,1,3,4 +8463,2011-12-24,1,0,12,6,0,6,0,2,0.3,0.303,0.75,0.1642,0,10,10 +8464,2011-12-24,1,0,12,7,0,6,0,1,0.3,0.2727,0.7,0.3582,0,10,10 +8465,2011-12-24,1,0,12,8,0,6,0,1,0.28,0.2727,0.7,0.194,0,27,27 +8466,2011-12-24,1,0,12,9,0,6,0,1,0.3,0.2727,0.45,0.4179,3,53,56 +8467,2011-12-24,1,0,12,10,0,6,0,1,0.3,0.2576,0.42,0.4925,4,52,56 +8468,2011-12-24,1,0,12,11,0,6,0,1,0.32,0.2879,0.39,0.3582,9,62,71 +8469,2011-12-24,1,0,12,12,0,6,0,1,0.34,0.3182,0.39,0.2836,15,79,94 +8470,2011-12-24,1,0,12,13,0,6,0,1,0.34,0.303,0.39,0.2985,24,97,121 +8471,2011-12-24,1,0,12,14,0,6,0,1,0.34,0.303,0.39,0.2985,15,70,85 +8472,2011-12-24,1,0,12,15,0,6,0,1,0.32,0.303,0.39,0.2239,15,82,97 +8473,2011-12-24,1,0,12,16,0,6,0,1,0.32,0.3182,0.42,0.194,25,74,99 +8474,2011-12-24,1,0,12,17,0,6,0,1,0.28,0.2879,0.45,0.1343,15,53,68 +8475,2011-12-24,1,0,12,18,0,6,0,1,0.3,0.3182,0.42,0.0896,2,33,35 +8476,2011-12-24,1,0,12,19,0,6,0,1,0.3,0.3182,0.52,0.0896,1,24,25 +8477,2011-12-24,1,0,12,20,0,6,0,1,0.26,0.2727,0.56,0.1343,5,19,24 +8478,2011-12-24,1,0,12,21,0,6,0,1,0.26,0.2576,0.56,0.1642,4,15,19 +8479,2011-12-24,1,0,12,22,0,6,0,1,0.24,0.2879,0.52,0,8,12,20 +8480,2011-12-24,1,0,12,23,0,6,0,1,0.24,0.2576,0.56,0,1,16,17 +8481,2011-12-25,1,0,12,0,0,0,0,1,0.24,0.2273,0.6,0.194,2,4,6 +8482,2011-12-25,1,0,12,1,0,0,0,1,0.22,0.2273,0.69,0.1343,2,2,4 +8483,2011-12-25,1,0,12,2,0,0,0,1,0.22,0.2576,0.75,0.0896,0,2,2 +8484,2011-12-25,1,0,12,3,0,0,0,1,0.22,0.2727,0.72,0,1,3,4 +8485,2011-12-25,1,0,12,5,0,0,0,1,0.2,0.2273,0.75,0.1045,0,1,1 +8486,2011-12-25,1,0,12,6,0,0,0,1,0.2,0.2273,0.75,0.1045,0,1,1 +8487,2011-12-25,1,0,12,7,0,0,0,1,0.22,0.2424,0.8,0.1045,0,4,4 +8488,2011-12-25,1,0,12,8,0,0,0,1,0.2,0.2121,0.8,0.1642,1,4,5 +8489,2011-12-25,1,0,12,9,0,0,0,1,0.24,0.2424,0.87,0.1642,3,20,23 +8490,2011-12-25,1,0,12,10,0,0,0,1,0.26,0.2424,0.81,0.2537,31,12,43 
+8491,2011-12-25,1,0,12,11,0,0,0,1,0.3,0.303,0.7,0.1642,43,42,85 +8492,2011-12-25,1,0,12,12,0,0,0,1,0.3,0.2727,0.75,0.2985,25,41,66 +8493,2011-12-25,1,0,12,13,0,0,0,1,0.32,0.303,0.7,0.2537,40,39,79 +8494,2011-12-25,1,0,12,14,0,0,0,1,0.34,0.3182,0.61,0.2537,41,45,86 +8495,2011-12-25,1,0,12,15,0,0,0,1,0.34,0.3333,0.61,0.194,43,48,91 +8496,2011-12-25,1,0,12,16,0,0,0,1,0.36,0.3636,0.57,0.1045,35,51,86 +8497,2011-12-25,1,0,12,17,0,0,0,1,0.34,0.3333,0.61,0.1343,11,33,44 +8498,2011-12-25,1,0,12,18,0,0,0,1,0.32,0.3182,0.7,0.194,7,23,30 +8499,2011-12-25,1,0,12,19,0,0,0,1,0.32,0.3333,0.57,0.1343,2,14,16 +8500,2011-12-25,1,0,12,20,0,0,0,1,0.32,0.3333,0.49,0.0896,8,18,26 +8501,2011-12-25,1,0,12,21,0,0,0,1,0.3,0.303,0.56,0.1343,4,15,19 +8502,2011-12-25,1,0,12,22,0,0,0,1,0.28,0.2727,0.61,0.1642,2,15,17 +8503,2011-12-25,1,0,12,23,0,0,0,1,0.26,0.2727,0.65,0.1343,2,14,16 +8504,2011-12-26,1,0,12,0,1,1,0,1,0.22,0.2273,0.75,0.1642,5,6,11 +8505,2011-12-26,1,0,12,1,1,1,0,1,0.28,0.2727,0.56,0.194,3,7,10 +8506,2011-12-26,1,0,12,2,1,1,0,1,0.34,0.3182,0.46,0.2239,2,5,7 +8507,2011-12-26,1,0,12,4,1,1,0,1,0.34,0.303,0.46,0.2985,0,2,2 +8508,2011-12-26,1,0,12,5,1,1,0,1,0.34,0.303,0.42,0.3881,0,4,4 +8509,2011-12-26,1,0,12,6,1,1,0,1,0.32,0.303,0.45,0.2836,0,4,4 +8510,2011-12-26,1,0,12,7,1,1,0,1,0.32,0.3182,0.45,0.194,2,17,19 +8511,2011-12-26,1,0,12,8,1,1,0,1,0.32,0.3333,0.45,0.1343,5,13,18 +8512,2011-12-26,1,0,12,9,1,1,0,1,0.34,0.303,0.46,0.2985,24,38,62 +8513,2011-12-26,1,0,12,10,1,1,0,1,0.34,0.2879,0.42,0.5224,31,39,70 +8514,2011-12-26,1,0,12,11,1,1,0,1,0.34,0.2879,0.46,0.5821,40,51,91 +8515,2011-12-26,1,0,12,12,1,1,0,1,0.36,0.3182,0.43,0.4627,34,66,100 +8516,2011-12-26,1,0,12,13,1,1,0,1,0.38,0.3939,0.4,0.2985,57,83,140 +8517,2011-12-26,1,0,12,14,1,1,0,1,0.38,0.3939,0.4,0.3284,73,80,153 +8518,2011-12-26,1,0,12,15,1,1,0,1,0.38,0.3939,0.4,0.2537,26,105,131 +8519,2011-12-26,1,0,12,16,1,1,0,1,0.36,0.3333,0.43,0.2836,28,69,97 +8520,2011-12-26,1,0,12,17,1,1,0,1,0.34,0.3182,0.46,0.2239,30,67,97 
+8521,2011-12-26,1,0,12,18,1,1,0,1,0.32,0.3333,0.49,0.0896,21,54,75 +8522,2011-12-26,1,0,12,19,1,1,0,1,0.3,0.3182,0.65,0.0896,16,49,65 +8523,2011-12-26,1,0,12,20,1,1,0,1,0.3,0.3182,0.56,0.0896,14,42,56 +8524,2011-12-26,1,0,12,21,1,1,0,1,0.26,0.303,0.7,0,10,39,49 +8525,2011-12-26,1,0,12,22,1,1,0,1,0.26,0.303,0.7,0,5,22,27 +8526,2011-12-26,1,0,12,23,1,1,0,1,0.26,0.2727,0.7,0.1045,4,25,29 +8527,2011-12-27,1,0,12,0,0,2,1,1,0.26,0.2576,0.7,0.1642,3,9,12 +8528,2011-12-27,1,0,12,1,0,2,1,1,0.26,0.2727,0.7,0.1343,0,6,6 +8529,2011-12-27,1,0,12,2,0,2,1,1,0.26,0.2727,0.7,0.1343,2,2,4 +8530,2011-12-27,1,0,12,3,0,2,1,2,0.3,0.2879,0.61,0.2239,0,3,3 +8531,2011-12-27,1,0,12,4,0,2,1,2,0.3,0.2879,0.65,0.2239,0,3,3 +8532,2011-12-27,1,0,12,5,0,2,1,2,0.3,0.2727,0.64,0.2836,0,8,8 +8533,2011-12-27,1,0,12,6,0,2,1,2,0.3,0.2727,0.61,0.2985,0,35,35 +8534,2011-12-27,1,0,12,7,0,2,1,2,0.3,0.2879,0.65,0.2836,1,79,80 +8535,2011-12-27,1,0,12,8,0,2,1,2,0.3,0.303,0.65,0.1642,9,155,164 +8536,2011-12-27,1,0,12,9,0,2,1,3,0.32,0.3333,0.66,0.1045,19,85,104 +8537,2011-12-27,1,0,12,10,0,2,1,2,0.32,0.3333,0.66,0.0896,13,47,60 +8538,2011-12-27,1,0,12,11,0,2,1,3,0.32,0.3333,0.66,0.1045,6,24,30 +8539,2011-12-27,1,0,12,12,0,2,1,3,0.3,0.3182,0.81,0.0896,8,16,24 +8540,2011-12-27,1,0,12,13,0,2,1,3,0.3,0.303,0.87,0.1343,1,19,20 +8541,2011-12-27,1,0,12,14,0,2,1,3,0.42,0.4242,0.82,0.3881,0,14,14 +8542,2011-12-27,1,0,12,15,0,2,1,3,0.42,0.4242,0.82,0.3881,2,17,19 +8543,2011-12-27,1,0,12,16,0,2,1,3,0.44,0.4394,0.88,0.3284,2,44,46 +8544,2011-12-27,1,0,12,17,0,2,1,3,0.4,0.4091,0.87,0.1343,6,109,115 +8545,2011-12-27,1,0,12,18,0,2,1,1,0.38,0.3939,0.87,0.2836,9,126,135 +8546,2011-12-27,1,0,12,19,0,2,1,1,0.34,0.3333,0.87,0.1642,3,87,90 +8547,2011-12-27,1,0,12,20,0,2,1,1,0.32,0.3333,0.93,0.0896,3,66,69 +8548,2011-12-27,1,0,12,21,0,2,1,1,0.32,0.3333,0.87,0.0896,11,52,63 +8549,2011-12-27,1,0,12,22,0,2,1,1,0.32,0.3333,0.87,0.0896,3,29,32 +8550,2011-12-27,1,0,12,23,0,2,1,1,0.3,0.303,0.93,0.1343,2,24,26 
+8551,2011-12-28,1,0,12,0,0,3,1,1,0.32,0.3182,0.87,0.1642,0,10,10 +8552,2011-12-28,1,0,12,1,0,3,1,1,0.32,0.3182,0.76,0.1642,0,12,12 +8553,2011-12-28,1,0,12,2,0,3,1,1,0.32,0.303,0.61,0.2537,0,7,7 +8554,2011-12-28,1,0,12,3,0,3,1,1,0.32,0.303,0.61,0.2537,0,4,4 +8555,2011-12-28,1,0,12,5,0,3,1,1,0.32,0.2879,0.57,0.3582,0,9,9 +8556,2011-12-28,1,0,12,6,0,3,1,1,0.32,0.303,0.57,0.2537,1,42,43 +8557,2011-12-28,1,0,12,7,0,3,1,1,0.32,0.303,0.57,0.2537,4,106,110 +8558,2011-12-28,1,0,12,8,0,3,1,1,0.32,0.2879,0.57,0.3582,11,206,217 +8559,2011-12-28,1,0,12,9,0,3,1,1,0.32,0.303,0.57,0.3284,18,171,189 +8560,2011-12-28,1,0,12,10,0,3,1,1,0.34,0.3182,0.57,0.2239,12,84,96 +8561,2011-12-28,1,0,12,11,0,3,1,1,0.36,0.3333,0.46,0.3582,18,93,111 +8562,2011-12-28,1,0,12,12,0,3,1,1,0.34,0.2879,0.46,0.5522,32,108,140 +8563,2011-12-28,1,0,12,13,0,3,1,1,0.34,0.2879,0.39,0.4627,24,135,159 +8564,2011-12-28,1,0,12,14,0,3,1,1,0.32,0.2879,0.39,0.3881,24,96,120 +8565,2011-12-28,1,0,12,15,0,3,1,1,0.32,0.2879,0.36,0.4179,16,101,117 +8566,2011-12-28,1,0,12,16,0,3,1,1,0.3,0.2727,0.36,0.4179,23,144,167 +8567,2011-12-28,1,0,12,17,0,3,1,1,0.28,0.2727,0.38,0.2537,25,225,250 +8568,2011-12-28,1,0,12,18,0,3,1,1,0.26,0.2273,0.38,0.3284,10,159,169 +8569,2011-12-28,1,0,12,19,0,3,1,1,0.24,0.2273,0.41,0.2537,16,135,151 +8570,2011-12-28,1,0,12,20,0,3,1,1,0.24,0.2273,0.41,0.2239,9,70,79 +8571,2011-12-28,1,0,12,21,0,3,1,1,0.22,0.2273,0.44,0.1343,7,63,70 +8572,2011-12-28,1,0,12,22,0,3,1,1,0.22,0.2424,0.44,0.1045,2,31,33 +8573,2011-12-28,1,0,12,23,0,3,1,1,0.22,0.2121,0.44,0.2537,3,36,39 +8574,2011-12-29,1,0,12,0,0,4,1,2,0.22,0.2273,0.47,0.194,4,24,28 +8575,2011-12-29,1,0,12,1,0,4,1,1,0.22,0.2121,0.47,0.2239,0,15,15 +8576,2011-12-29,1,0,12,2,0,4,1,1,0.2,0.2273,0.51,0.1045,0,3,3 +8577,2011-12-29,1,0,12,3,0,4,1,1,0.2,0.197,0.55,0.194,0,2,2 +8578,2011-12-29,1,0,12,4,0,4,1,1,0.2,0.2121,0.55,0.1642,1,2,3 +8579,2011-12-29,1,0,12,5,0,4,1,1,0.2,0.2273,0.55,0.1045,0,10,10 +8580,2011-12-29,1,0,12,6,0,4,1,1,0.2,0.2576,0.59,0,2,39,41 
+8581,2011-12-29,1,0,12,7,0,4,1,1,0.18,0.2121,0.64,0.0896,2,104,106 +8582,2011-12-29,1,0,12,8,0,4,1,2,0.2,0.2576,0.64,0,3,207,210 +8583,2011-12-29,1,0,12,9,0,4,1,2,0.2,0.2121,0.64,0.1343,15,155,170 +8584,2011-12-29,1,0,12,10,0,4,1,2,0.22,0.2273,0.64,0.1642,13,97,110 +8585,2011-12-29,1,0,12,11,0,4,1,2,0.24,0.2576,0.6,0.0896,23,99,122 +8586,2011-12-29,1,0,12,12,0,4,1,2,0.26,0.2727,0.56,0.1045,12,118,130 +8587,2011-12-29,1,0,12,13,0,4,1,2,0.28,0.2879,0.52,0.1045,36,95,131 +8588,2011-12-29,1,0,12,14,0,4,1,2,0.3,0.3333,0.52,0,30,112,142 +8589,2011-12-29,1,0,12,15,0,4,1,2,0.3,0.3182,0.52,0.0896,27,150,177 +8590,2011-12-29,1,0,12,16,0,4,1,1,0.3,0.303,0.52,0.1642,22,155,177 +8591,2011-12-29,1,0,12,17,0,4,1,1,0.3,0.303,0.61,0.1642,14,226,240 +8592,2011-12-29,1,0,12,18,0,4,1,1,0.3,0.303,0.56,0.1343,15,192,207 +8593,2011-12-29,1,0,12,19,0,4,1,1,0.3,0.3182,0.56,0.1045,11,124,135 +8594,2011-12-29,1,0,12,20,0,4,1,1,0.28,0.2879,0.65,0.1045,8,88,96 +8595,2011-12-29,1,0,12,21,0,4,1,2,0.28,0.2879,0.61,0.1343,8,60,68 +8596,2011-12-29,1,0,12,22,0,4,1,1,0.28,0.2879,0.65,0.1343,7,48,55 +8597,2011-12-29,1,0,12,23,0,4,1,1,0.3,0.303,0.65,0.1642,1,44,45 +8598,2011-12-30,1,0,12,0,0,5,1,1,0.28,0.2879,0.7,0.1343,4,26,30 +8599,2011-12-30,1,0,12,1,0,5,1,1,0.26,0.2879,0.65,0.0896,9,11,20 +8600,2011-12-30,1,0,12,2,0,5,1,1,0.24,0.2576,0.7,0.1045,2,10,12 +8601,2011-12-30,1,0,12,3,0,5,1,1,0.24,0.2576,0.7,0.1045,0,6,6 +8602,2011-12-30,1,0,12,4,0,5,1,1,0.24,0.2576,0.7,0.0896,0,2,2 +8603,2011-12-30,1,0,12,5,0,5,1,1,0.22,0.2576,0.75,0.0896,0,10,10 +8604,2011-12-30,1,0,12,6,0,5,1,1,0.24,0.2576,0.7,0.0896,1,31,32 +8605,2011-12-30,1,0,12,7,0,5,1,1,0.24,0.2576,0.75,0.0896,3,92,95 +8606,2011-12-30,1,0,12,8,0,5,1,2,0.24,0.2576,0.75,0.0896,12,193,205 +8607,2011-12-30,1,0,12,9,0,5,1,1,0.26,0.303,0.75,0,19,175,194 +8608,2011-12-30,1,0,12,10,0,5,1,1,0.3,0.2879,0.65,0.2537,10,108,118 +8609,2011-12-30,1,0,12,11,0,5,1,1,0.32,0.3182,0.66,0.194,45,126,171 
+8610,2011-12-30,1,0,12,12,0,5,1,1,0.32,0.3333,0.57,0.1343,38,159,197 +8611,2011-12-30,1,0,12,13,0,5,1,2,0.36,0.3485,0.53,0.1343,53,154,207 +8612,2011-12-30,1,0,12,14,0,5,1,2,0.4,0.4091,0.47,0.0896,67,178,245 +8613,2011-12-30,1,0,12,15,0,5,1,1,0.42,0.4242,0.44,0,56,236,292 +8614,2011-12-30,1,0,12,16,0,5,1,1,0.42,0.4242,0.47,0.194,36,247,283 +8615,2011-12-30,1,0,12,17,0,5,1,1,0.38,0.3939,0.54,0.1642,54,188,242 +8616,2011-12-30,1,0,12,18,0,5,1,1,0.36,0.3485,0.62,0.1343,19,163,182 +8617,2011-12-30,1,0,12,19,0,5,1,2,0.34,0.303,0.49,0.4179,16,96,112 +8618,2011-12-30,1,0,12,20,0,5,1,2,0.34,0.3485,0.66,0.1045,16,75,91 +8619,2011-12-30,1,0,12,21,0,5,1,2,0.36,0.3636,0.66,0.1045,11,84,95 +8620,2011-12-30,1,0,12,22,0,5,1,2,0.34,0.3333,0.71,0.1642,7,78,85 +8621,2011-12-30,1,0,12,23,0,5,1,2,0.36,0.3333,0.66,0.2537,13,60,73 +8622,2011-12-31,1,0,12,0,0,6,0,2,0.38,0.3939,0.62,0.3284,7,37,44 +8623,2011-12-31,1,0,12,1,0,6,0,2,0.4,0.4091,0.62,0.2836,4,31,35 +8624,2011-12-31,1,0,12,2,0,6,0,2,0.4,0.4091,0.62,0.2836,1,27,28 +8625,2011-12-31,1,0,12,3,0,6,0,2,0.4,0.4091,0.62,0.2836,1,17,18 +8626,2011-12-31,1,0,12,4,0,6,0,1,0.38,0.3939,0.71,0.2239,1,9,10 +8627,2011-12-31,1,0,12,5,0,6,0,2,0.36,0.3485,0.76,0.2239,1,0,1 +8628,2011-12-31,1,0,12,6,0,6,0,2,0.4,0.4091,0.71,0.0896,1,5,6 +8629,2011-12-31,1,0,12,7,0,6,0,3,0.38,0.3939,0.76,0,6,13,19 +8630,2011-12-31,1,0,12,8,0,6,0,1,0.34,0.3333,0.81,0.1343,7,42,49 +8631,2011-12-31,1,0,12,9,0,6,0,1,0.38,0.3939,0.76,0,18,72,90 +8632,2011-12-31,1,0,12,10,0,6,0,1,0.4,0.4091,0.76,0.1642,20,108,128 +8633,2011-12-31,1,0,12,11,0,6,0,1,0.42,0.4242,0.71,0.1642,65,152,217 +8634,2011-12-31,1,0,12,12,0,6,0,1,0.52,0.5,0.39,0.2985,93,180,273 +8635,2011-12-31,1,0,12,13,0,6,0,1,0.5,0.4848,0.42,0.4925,108,205,313 +8636,2011-12-31,1,0,12,14,0,6,0,1,0.46,0.4545,0.51,0.3284,115,185,300 +8637,2011-12-31,1,0,12,15,0,6,0,1,0.46,0.4545,0.47,0.4925,89,164,253 +8638,2011-12-31,1,0,12,16,0,6,0,1,0.44,0.4394,0.51,0.3881,52,143,195 
+8639,2011-12-31,1,0,12,17,0,6,0,1,0.42,0.4242,0.54,0.194,28,101,129 +8640,2011-12-31,1,0,12,18,0,6,0,1,0.42,0.4242,0.54,0.1343,13,80,93 +8641,2011-12-31,1,0,12,19,0,6,0,1,0.42,0.4242,0.54,0.2239,19,73,92 +8642,2011-12-31,1,0,12,20,0,6,0,1,0.42,0.4242,0.54,0.2239,8,63,71 +8643,2011-12-31,1,0,12,21,0,6,0,1,0.4,0.4091,0.58,0.194,2,50,52 +8644,2011-12-31,1,0,12,22,0,6,0,1,0.38,0.3939,0.62,0.1343,2,36,38 +8645,2011-12-31,1,0,12,23,0,6,0,1,0.36,0.3788,0.66,0,4,27,31 +8646,2012-01-01,1,1,1,0,0,0,0,1,0.36,0.3788,0.66,0,5,43,48 +8647,2012-01-01,1,1,1,1,0,0,0,1,0.36,0.3485,0.66,0.1343,15,78,93 +8648,2012-01-01,1,1,1,2,0,0,0,1,0.32,0.3485,0.76,0,16,59,75 +8649,2012-01-01,1,1,1,3,0,0,0,1,0.3,0.3333,0.81,0,11,41,52 +8650,2012-01-01,1,1,1,4,0,0,0,1,0.28,0.303,0.81,0.0896,0,8,8 +8651,2012-01-01,1,1,1,5,0,0,0,1,0.28,0.2879,0.81,0.1045,0,5,5 +8652,2012-01-01,1,1,1,6,0,0,0,1,0.26,0.2727,0.93,0.1343,1,1,2 +8653,2012-01-01,1,1,1,7,0,0,0,1,0.26,0.2576,0.93,0.1642,1,6,7 +8654,2012-01-01,1,1,1,8,0,0,0,1,0.26,0.2727,0.87,0.1045,4,10,14 +8655,2012-01-01,1,1,1,9,0,0,0,1,0.26,0.2727,0.93,0.1045,13,27,40 +8656,2012-01-01,1,1,1,10,0,0,0,1,0.3,0.3182,0.81,0.1045,18,52,70 +8657,2012-01-01,1,1,1,11,0,0,0,1,0.34,0.3333,0.76,0.1343,40,98,138 +8658,2012-01-01,1,1,1,12,0,0,0,1,0.4,0.4091,0.62,0.2836,58,143,201 +8659,2012-01-01,1,1,1,13,0,0,0,1,0.42,0.4242,0.58,0.2836,82,141,223 +8660,2012-01-01,1,1,1,14,0,0,0,1,0.44,0.4394,0.54,0.2985,120,147,267 +8661,2012-01-01,1,1,1,15,0,0,0,1,0.46,0.4545,0.51,0.2985,101,164,265 +8662,2012-01-01,1,1,1,16,0,0,0,2,0.44,0.4394,0.54,0.2985,68,147,215 +8663,2012-01-01,1,1,1,17,0,0,0,2,0.48,0.4697,0.48,0.1642,36,75,111 +8664,2012-01-01,1,1,1,18,0,0,0,3,0.46,0.4545,0.59,0.2537,25,81,106 +8665,2012-01-01,1,1,1,19,0,0,0,3,0.42,0.4242,0.67,0.3881,20,85,105 +8666,2012-01-01,1,1,1,20,0,0,0,2,0.44,0.4394,0.62,0.2985,25,58,83 +8667,2012-01-01,1,1,1,21,0,0,0,2,0.44,0.4394,0.67,0.2537,10,61,71 +8668,2012-01-01,1,1,1,22,0,0,0,1,0.46,0.4545,0.55,0.4179,13,53,66 
+8669,2012-01-01,1,1,1,23,0,0,0,1,0.44,0.4394,0.51,0.2985,4,25,29 +8670,2012-01-02,1,1,1,0,1,1,0,1,0.4,0.4091,0.4,0.4627,8,31,39 +8671,2012-01-02,1,1,1,1,1,1,0,1,0.36,0.3333,0.43,0.4179,1,11,12 +8672,2012-01-02,1,1,1,2,1,1,0,1,0.36,0.3182,0.34,0.4478,1,6,7 +8673,2012-01-02,1,1,1,4,1,1,0,1,0.28,0.2576,0.45,0.3284,0,4,4 +8674,2012-01-02,1,1,1,5,1,1,0,1,0.28,0.2576,0.45,0.3284,1,3,4 +8675,2012-01-02,1,1,1,6,1,1,0,1,0.26,0.2273,0.41,0.3881,0,14,14 +8676,2012-01-02,1,1,1,7,1,1,0,1,0.24,0.2121,0.32,0.3881,0,16,16 +8677,2012-01-02,1,1,1,8,1,1,0,1,0.24,0.2273,0.35,0.2537,2,51,53 +8678,2012-01-02,1,1,1,9,1,1,0,1,0.24,0.2576,0.35,0,15,53,68 +8679,2012-01-02,1,1,1,10,1,1,0,1,0.26,0.2424,0.35,0.2836,20,89,109 +8680,2012-01-02,1,1,1,11,1,1,0,1,0.26,0.2121,0.35,0.4925,33,142,175 +8681,2012-01-02,1,1,1,12,1,1,0,1,0.28,0.2727,0.36,0.2537,41,161,202 +8682,2012-01-02,1,1,1,13,1,1,0,1,0.3,0.2727,0.36,0.2985,26,150,176 +8683,2012-01-02,1,1,1,14,1,1,0,1,0.3,0.2727,0.36,0.4179,10,141,151 +8684,2012-01-02,1,1,1,15,1,1,0,1,0.28,0.2424,0.38,0.4478,29,139,168 +8685,2012-01-02,1,1,1,16,1,1,0,1,0.26,0.2273,0.35,0.4179,10,144,154 +8686,2012-01-02,1,1,1,17,1,1,0,1,0.26,0.2273,0.35,0.3881,17,136,153 +8687,2012-01-02,1,1,1,18,1,1,0,1,0.26,0.2424,0.33,0.2537,13,113,126 +8688,2012-01-02,1,1,1,19,1,1,0,1,0.24,0.2273,0.38,0.2239,4,89,93 +8689,2012-01-02,1,1,1,20,1,1,0,1,0.24,0.2273,0.41,0.2239,5,83,88 +8690,2012-01-02,1,1,1,21,1,1,0,2,0.24,0.2121,0.41,0.3284,3,63,66 +8691,2012-01-02,1,1,1,22,1,1,0,2,0.22,0.2121,0.44,0.2836,3,36,39 +8692,2012-01-02,1,1,1,23,1,1,0,1,0.22,0.2121,0.44,0.2537,2,32,34 +8693,2012-01-03,1,1,1,0,0,2,1,1,0.2,0.197,0.51,0.2537,0,13,13 +8694,2012-01-03,1,1,1,1,0,2,1,1,0.18,0.1818,0.55,0.2239,1,5,6 +8695,2012-01-03,1,1,1,2,0,2,1,1,0.18,0.1667,0.51,0.2537,0,3,3 +8696,2012-01-03,1,1,1,3,0,2,1,1,0.16,0.1364,0.55,0.2836,0,2,2 +8697,2012-01-03,1,1,1,4,0,2,1,1,0.16,0.1515,0.55,0.2537,0,5,5 +8698,2012-01-03,1,1,1,5,0,2,1,1,0.14,0.1364,0.54,0.194,0,12,12 
+8699,2012-01-03,1,1,1,6,0,2,1,1,0.14,0.1212,0.59,0.2836,4,81,85 +8700,2012-01-03,1,1,1,7,0,2,1,1,0.14,0.1212,0.59,0.2537,2,168,170 +8701,2012-01-03,1,1,1,8,0,2,1,1,0.16,0.1364,0.55,0.2836,5,349,354 +8702,2012-01-03,1,1,1,9,0,2,1,1,0.16,0.1364,0.5,0.2836,8,145,153 +8703,2012-01-03,1,1,1,10,0,2,1,1,0.16,0.1212,0.47,0.4627,5,55,60 +8704,2012-01-03,1,1,1,11,0,2,1,1,0.18,0.1364,0.37,0.5224,12,63,75 +8705,2012-01-03,1,1,1,12,0,2,1,1,0.18,0.1364,0.37,0.5224,4,70,74 +8706,2012-01-03,1,1,1,13,0,2,1,1,0.18,0.1212,0.34,0.6567,5,68,73 +8707,2012-01-03,1,1,1,14,0,2,1,1,0.16,0.1212,0.43,0.5224,7,72,79 +8708,2012-01-03,1,1,1,15,0,2,1,1,0.16,0.1212,0.37,0.4179,9,68,77 +8709,2012-01-03,1,1,1,16,0,2,1,1,0.14,0.0909,0.39,0.5821,7,129,136 +8710,2012-01-03,1,1,1,17,0,2,1,1,0.14,0.1061,0.26,0.4627,4,241,245 +8711,2012-01-03,1,1,1,18,0,2,1,1,0.14,0.0909,0.26,0.5224,10,214,224 +8712,2012-01-03,1,1,1,19,0,2,1,1,0.12,0.0909,0.28,0.4179,4,152,156 +8713,2012-01-03,1,1,1,20,0,2,1,1,0.12,0.1212,0.33,0.2537,0,115,115 +8714,2012-01-03,1,1,1,21,0,2,1,1,0.1,0.1061,0.36,0.2239,2,66,68 +8715,2012-01-03,1,1,1,22,0,2,1,1,0.1,0.1061,0.46,0.2537,0,33,33 +8716,2012-01-03,1,1,1,23,0,2,1,1,0.1,0.0758,0.46,0.3881,0,18,18 +8717,2012-01-04,1,1,1,0,0,3,1,1,0.08,0.0606,0.42,0.3284,0,9,9 +8718,2012-01-04,1,1,1,1,0,3,1,1,0.04,0.0303,0.38,0.2985,0,3,3 +8719,2012-01-04,1,1,1,2,0,3,1,1,0.02,0.0152,0.34,0.2836,0,1,1 +8720,2012-01-04,1,1,1,3,0,3,1,1,0.02,0.0152,0.34,0.2836,0,1,1 +8721,2012-01-04,1,1,1,4,0,3,1,1,0.02,0.0455,0.41,0.194,0,2,2 +8722,2012-01-04,1,1,1,5,0,3,1,1,0.02,0.0455,0.41,0.194,0,14,14 +8723,2012-01-04,1,1,1,6,0,3,1,1,0.02,0.0455,0.41,0.1642,0,59,59 +8724,2012-01-04,1,1,1,7,0,3,1,1,0.02,0.0455,0.44,0.194,1,151,152 +8725,2012-01-04,1,1,1,8,0,3,1,1,0.02,0.0606,0.44,0.1343,5,310,315 +8726,2012-01-04,1,1,1,9,0,3,1,1,0.04,0.0606,0.45,0.1343,7,173,180 +8727,2012-01-04,1,1,1,10,0,3,1,1,0.06,0.1061,0.45,0,7,57,64 +8728,2012-01-04,1,1,1,11,0,3,1,2,0.08,0.1212,0.42,0.0896,6,40,46 
+8729,2012-01-04,1,1,1,12,0,3,1,2,0.1,0.1061,0.46,0.194,9,75,84 +8730,2012-01-04,1,1,1,13,0,3,1,2,0.14,0.1212,0.43,0.2537,9,82,91 +8731,2012-01-04,1,1,1,14,0,3,1,2,0.14,0.1212,0.46,0.2537,6,69,75 +8732,2012-01-04,1,1,1,15,0,3,1,2,0.18,0.1515,0.37,0.3284,9,81,90 +8733,2012-01-04,1,1,1,16,0,3,1,2,0.18,0.1667,0.4,0.2836,8,123,131 +8734,2012-01-04,1,1,1,17,0,3,1,2,0.2,0.197,0.4,0.194,9,272,281 +8735,2012-01-04,1,1,1,18,0,3,1,2,0.2,0.2121,0.37,0.1343,9,280,289 +8736,2012-01-04,1,1,1,19,0,3,1,2,0.2,0.2273,0.4,0.1045,2,182,184 +8737,2012-01-04,1,1,1,20,0,3,1,2,0.2,0.197,0.4,0.194,2,121,123 +8738,2012-01-04,1,1,1,21,0,3,1,2,0.2,0.2576,0.44,0,2,88,90 +8739,2012-01-04,1,1,1,22,0,3,1,2,0.2,0.2273,0.47,0.0896,4,48,52 +8740,2012-01-04,1,1,1,23,0,3,1,2,0.2,0.2273,0.44,0.1045,0,32,32 +8741,2012-01-05,1,1,1,0,0,4,1,2,0.22,0.2273,0.47,0.194,1,13,14 +8742,2012-01-05,1,1,1,1,0,4,1,1,0.2,0.2273,0.51,0.0896,0,5,5 +8743,2012-01-05,1,1,1,2,0,4,1,1,0.2,0.2273,0.51,0.0896,0,4,4 +8744,2012-01-05,1,1,1,3,0,4,1,1,0.2,0.2576,0.61,0,0,4,4 +8745,2012-01-05,1,1,1,4,0,4,1,2,0.2,0.2273,0.59,0.0896,0,5,5 +8746,2012-01-05,1,1,1,5,0,4,1,2,0.2,0.2273,0.59,0.0896,0,26,26 +8747,2012-01-05,1,1,1,6,0,4,1,2,0.2,0.2121,0.75,0.1343,0,78,78 +8748,2012-01-05,1,1,1,7,0,4,1,2,0.2,0.2273,0.69,0.0896,3,212,215 +8749,2012-01-05,1,1,1,8,0,4,1,2,0.2,0.2576,0.75,0,11,377,388 +8750,2012-01-05,1,1,1,9,0,4,1,2,0.22,0.2727,0.69,0,7,220,227 +8751,2012-01-05,1,1,1,10,0,4,1,2,0.24,0.2576,0.52,0.0896,5,81,86 +8752,2012-01-05,1,1,1,11,0,4,1,1,0.3,0.3333,0.49,0,6,78,84 +8753,2012-01-05,1,1,1,12,0,4,1,1,0.3,0.2879,0.49,0.2537,6,114,120 +8754,2012-01-05,1,1,1,13,0,4,1,1,0.34,0.303,0.42,0.2985,6,112,118 +8755,2012-01-05,1,1,1,14,0,4,1,1,0.34,0.3182,0.39,0.2836,13,104,117 +8756,2012-01-05,1,1,1,15,0,4,1,1,0.36,0.3333,0.34,0.2836,6,113,119 +8757,2012-01-05,1,1,1,16,0,4,1,1,0.36,0.3333,0.34,0.2836,19,178,197 +8758,2012-01-05,1,1,1,17,0,4,1,2,0.36,0.3485,0.34,0.194,19,393,412 
+8759,2012-01-05,1,1,1,18,0,4,1,1,0.34,0.3333,0.36,0.1642,9,374,383 +8760,2012-01-05,1,1,1,19,0,4,1,1,0.34,0.3485,0.34,0.0896,10,255,265 +8761,2012-01-05,1,1,1,20,0,4,1,1,0.3,0.3333,0.49,0,5,172,177 +8762,2012-01-05,1,1,1,21,0,4,1,1,0.26,0.2727,0.6,0.1343,9,88,97 +8763,2012-01-05,1,1,1,22,0,4,1,1,0.26,0.2576,0.6,0.1642,1,70,71 +8764,2012-01-05,1,1,1,23,0,4,1,1,0.24,0.2576,0.7,0.1045,4,56,60 +8765,2012-01-06,1,1,1,0,0,5,1,1,0.22,0.2576,0.75,0.0896,1,24,25 +8766,2012-01-06,1,1,1,1,0,5,1,1,0.22,0.2727,0.75,0,2,6,8 +8767,2012-01-06,1,1,1,2,0,5,1,1,0.22,0.2273,0.69,0.1642,2,3,5 +8768,2012-01-06,1,1,1,3,0,5,1,1,0.22,0.2424,0.69,0.1045,1,3,4 +8769,2012-01-06,1,1,1,4,0,5,1,1,0.22,0.2273,0.69,0.1343,0,3,3 +8770,2012-01-06,1,1,1,5,0,5,1,2,0.2,0.2121,0.8,0.1343,0,13,13 +8771,2012-01-06,1,1,1,6,0,5,1,2,0.22,0.2424,0.73,0.1045,1,69,70 +8772,2012-01-06,1,1,1,7,0,5,1,2,0.24,0.2879,0.65,0,4,201,205 +8773,2012-01-06,1,1,1,8,0,5,1,1,0.24,0.2424,0.7,0.1343,11,436,447 +8774,2012-01-06,1,1,1,9,0,5,1,1,0.24,0.2424,0.7,0.1642,4,237,241 +8775,2012-01-06,1,1,1,10,0,5,1,1,0.26,0.2576,0.65,0.194,14,102,116 +8776,2012-01-06,1,1,1,11,0,5,1,1,0.3,0.303,0.56,0.1642,8,130,138 +8777,2012-01-06,1,1,1,12,0,5,1,1,0.36,0.3333,0.5,0.2537,23,168,191 +8778,2012-01-06,1,1,1,13,0,5,1,1,0.4,0.4091,0.43,0.2836,16,188,204 +8779,2012-01-06,1,1,1,14,0,5,1,1,0.46,0.4545,0.36,0.194,26,152,178 +8780,2012-01-06,1,1,1,15,0,5,1,1,0.52,0.5,0.27,0.2537,44,178,222 +8781,2012-01-06,1,1,1,16,0,5,1,1,0.52,0.5,0.27,0.2836,35,259,294 +8782,2012-01-06,1,1,1,17,0,5,1,1,0.46,0.4545,0.36,0.2239,20,456,476 +8783,2012-01-06,1,1,1,18,0,5,1,1,0.5,0.4848,0.29,0.2537,28,391,419 +8784,2012-01-06,1,1,1,19,0,5,1,1,0.46,0.4545,0.33,0.2836,11,261,272 +8785,2012-01-06,1,1,1,20,0,5,1,1,0.42,0.4242,0.41,0.194,14,163,177 +8786,2012-01-06,1,1,1,21,0,5,1,1,0.4,0.4091,0.43,0.2239,17,137,154 +8787,2012-01-06,1,1,1,22,0,5,1,1,0.36,0.3485,0.5,0.194,12,123,135 +8788,2012-01-06,1,1,1,23,0,5,1,1,0.36,0.3788,0.5,0,13,88,101 
+8789,2012-01-07,1,1,1,0,0,6,0,1,0.36,0.3485,0.5,0.1642,2,77,79 +8790,2012-01-07,1,1,1,1,0,6,0,1,0.38,0.3939,0.46,0.1642,6,56,62 +8791,2012-01-07,1,1,1,2,0,6,0,1,0.36,0.3636,0.5,0.1045,2,36,38 +8792,2012-01-07,1,1,1,3,0,6,0,1,0.32,0.3333,0.57,0.1045,1,19,20 +8793,2012-01-07,1,1,1,4,0,6,0,1,0.32,0.3333,0.57,0.0896,1,9,10 +8794,2012-01-07,1,1,1,5,0,6,0,1,0.26,0.2727,0.75,0.1045,2,7,9 +8795,2012-01-07,1,1,1,6,0,6,0,1,0.26,0.2727,0.75,0.1045,0,7,7 +8796,2012-01-07,1,1,1,7,0,6,0,1,0.22,0.2273,0.87,0.194,0,20,20 +8797,2012-01-07,1,1,1,8,0,6,0,1,0.24,0.2576,0.75,0.1045,0,64,64 +8798,2012-01-07,1,1,1,9,0,6,0,1,0.22,0.2273,0.8,0.1343,14,116,130 +8799,2012-01-07,1,1,1,10,0,6,0,1,0.28,0.303,0.75,0.0896,43,160,203 +8800,2012-01-07,1,1,1,11,0,6,0,1,0.34,0.3182,0.57,0.2239,74,250,324 +8801,2012-01-07,1,1,1,12,0,6,0,1,0.4,0.4091,0.43,0.2239,100,276,376 +8802,2012-01-07,1,1,1,13,0,6,0,1,0.44,0.4394,0.44,0.194,149,296,445 +8803,2012-01-07,1,1,1,14,0,6,0,1,0.5,0.4848,0.42,0.2836,156,356,512 +8804,2012-01-07,1,1,1,15,0,6,0,1,0.58,0.5455,0.37,0.2985,132,317,449 +8805,2012-01-07,1,1,1,16,0,6,0,2,0.56,0.5303,0.37,0.2836,133,268,401 +8806,2012-01-07,1,1,1,17,0,6,0,2,0.54,0.5152,0.43,0.1343,87,235,322 +8807,2012-01-07,1,1,1,18,0,6,0,1,0.54,0.5152,0.45,0.1343,49,248,297 +8808,2012-01-07,1,1,1,19,0,6,0,1,0.52,0.5,0.42,0.1045,44,171,215 +8809,2012-01-07,1,1,1,20,0,6,0,1,0.5,0.4848,0.39,0.2537,21,149,170 +8810,2012-01-07,1,1,1,21,0,6,0,1,0.44,0.4394,0.44,0.2239,22,118,140 +8811,2012-01-07,1,1,1,22,0,6,0,1,0.44,0.4394,0.38,0.2537,16,93,109 +8812,2012-01-07,1,1,1,23,0,6,0,1,0.42,0.4242,0.38,0.2239,16,103,119 +8813,2012-01-08,1,1,1,0,0,0,0,1,0.38,0.3939,0.4,0.2836,14,77,91 +8814,2012-01-08,1,1,1,1,0,0,0,1,0.34,0.3333,0.42,0.194,10,62,72 +8815,2012-01-08,1,1,1,2,0,0,0,1,0.36,0.3333,0.37,0.2836,10,57,67 +8816,2012-01-08,1,1,1,3,0,0,0,1,0.34,0.3333,0.42,0.1343,6,26,32 +8817,2012-01-08,1,1,1,4,0,0,0,1,0.32,0.303,0.49,0.2537,2,4,6 +8818,2012-01-08,1,1,1,5,0,0,0,2,0.32,0.3333,0.49,0.1045,0,2,2 
+8819,2012-01-08,1,1,1,6,0,0,0,2,0.3,0.3333,0.52,0,0,2,2 +8820,2012-01-08,1,1,1,7,0,0,0,2,0.3,0.3333,0.52,0,1,23,24 +8821,2012-01-08,1,1,1,8,0,0,0,1,0.3,0.303,0.52,0.1343,4,53,57 +8822,2012-01-08,1,1,1,9,0,0,0,1,0.32,0.3333,0.53,0.1343,23,102,125 +8823,2012-01-08,1,1,1,10,0,0,0,1,0.34,0.3333,0.49,0.1642,27,181,208 +8824,2012-01-08,1,1,1,11,0,0,0,1,0.36,0.3485,0.46,0.2239,55,201,256 +8825,2012-01-08,1,1,1,12,0,0,0,1,0.38,0.3939,0.43,0.2239,78,273,351 +8826,2012-01-08,1,1,1,13,0,0,0,1,0.4,0.4091,0.37,0.2985,77,266,343 +8827,2012-01-08,1,1,1,14,0,0,0,1,0.4,0.4091,0.4,0.194,75,253,328 +8828,2012-01-08,1,1,1,15,0,0,0,1,0.4,0.4091,0.37,0.2836,89,241,330 +8829,2012-01-08,1,1,1,16,0,0,0,1,0.4,0.4091,0.37,0.2985,58,256,314 +8830,2012-01-08,1,1,1,17,0,0,0,1,0.38,0.3939,0.4,0.2239,22,197,219 +8831,2012-01-08,1,1,1,18,0,0,0,1,0.34,0.3182,0.46,0.2239,19,162,181 +8832,2012-01-08,1,1,1,19,0,0,0,1,0.32,0.3182,0.49,0.194,8,104,112 +8833,2012-01-08,1,1,1,20,0,0,0,1,0.3,0.303,0.52,0.1642,7,119,126 +8834,2012-01-08,1,1,1,21,0,0,0,1,0.28,0.2727,0.56,0.2239,10,81,91 +8835,2012-01-08,1,1,1,22,0,0,0,1,0.26,0.2727,0.6,0.1045,4,54,58 +8836,2012-01-08,1,1,1,23,0,0,0,1,0.26,0.2424,0.56,0.2537,0,30,30 +8837,2012-01-09,1,1,1,0,0,1,1,1,0.24,0.2273,0.6,0.2239,3,12,15 +8838,2012-01-09,1,1,1,1,0,1,1,1,0.24,0.2424,0.6,0.1343,1,4,5 +8839,2012-01-09,1,1,1,2,0,1,1,1,0.24,0.2424,0.56,0.1343,2,3,5 +8840,2012-01-09,1,1,1,3,0,1,1,1,0.24,0.2424,0.52,0.1642,0,3,3 +8841,2012-01-09,1,1,1,4,0,1,1,1,0.22,0.2424,0.64,0.1045,0,4,4 +8842,2012-01-09,1,1,1,5,0,1,1,2,0.2,0.2273,0.64,0.0896,0,21,21 +8843,2012-01-09,1,1,1,6,0,1,1,2,0.22,0.2273,0.6,0.1343,3,85,88 +8844,2012-01-09,1,1,1,7,0,1,1,2,0.22,0.2424,0.6,0.1045,1,239,240 +8845,2012-01-09,1,1,1,8,0,1,1,2,0.22,0.2576,0.55,0.0896,13,407,420 +8846,2012-01-09,1,1,1,9,0,1,1,2,0.22,0.2727,0.64,0,9,188,197 +8847,2012-01-09,1,1,1,10,0,1,1,1,0.24,0.2879,0.6,0,13,95,108 +8848,2012-01-09,1,1,1,11,0,1,1,2,0.26,0.2879,0.56,0.0896,6,82,88 
+8849,2012-01-09,1,1,1,12,0,1,1,2,0.26,0.2727,0.56,0.1045,10,93,103 +8850,2012-01-09,1,1,1,13,0,1,1,2,0.26,0.2727,0.56,0.1045,3,77,80 +8851,2012-01-09,1,1,1,14,0,1,1,3,0.22,0.2273,0.75,0.1642,5,45,50 +8852,2012-01-09,1,1,1,15,0,1,1,3,0.22,0.2273,0.75,0.1642,5,64,69 +8853,2012-01-09,1,1,1,16,0,1,1,3,0.22,0.2576,0.87,0.0896,3,46,49 +8854,2012-01-09,1,1,1,17,0,1,1,3,0.22,0.2727,0.87,0,5,147,152 +8855,2012-01-09,1,1,1,18,0,1,1,4,0.2,0.2273,0.86,0.0896,6,158,164 +8856,2012-01-09,1,1,1,19,0,1,1,3,0.2,0.2273,0.93,0.0896,3,187,190 +8857,2012-01-09,1,1,1,20,0,1,1,2,0.2,0.2273,0.86,0.0896,5,127,132 +8858,2012-01-09,1,1,1,21,0,1,1,2,0.2,0.2576,0.93,0,1,78,79 +8859,2012-01-09,1,1,1,22,0,1,1,2,0.2,0.2273,0.92,0.1045,8,54,62 +8860,2012-01-09,1,1,1,23,0,1,1,2,0.22,0.2424,0.87,0.1045,1,51,52 +8861,2012-01-10,1,1,1,0,0,2,1,2,0.22,0.2424,0.87,0.1045,0,14,14 +8862,2012-01-10,1,1,1,1,0,2,1,2,0.22,0.2424,0.93,0.1045,2,3,5 +8863,2012-01-10,1,1,1,2,0,2,1,2,0.22,0.2273,0.87,0.1642,2,2,4 +8864,2012-01-10,1,1,1,4,0,2,1,2,0.2,0.2121,0.86,0.1642,0,4,4 +8865,2012-01-10,1,1,1,5,0,2,1,1,0.4,0.4091,0.47,0.2239,0,23,23 +8866,2012-01-10,1,1,1,6,0,2,1,2,0.18,0.1818,0.93,0.194,0,79,79 +8867,2012-01-10,1,1,1,7,0,2,1,2,0.18,0.197,0.93,0.1642,3,219,222 +8868,2012-01-10,1,1,1,8,0,2,1,2,0.18,0.197,0.93,0.1642,6,465,471 +8869,2012-01-10,1,1,1,9,0,2,1,2,0.18,0.197,0.93,0.1642,11,184,195 +8870,2012-01-10,1,1,1,10,0,2,1,2,0.18,0.197,0.93,0.1642,3,80,83 +8871,2012-01-10,1,1,1,11,0,2,1,1,0.42,0.4242,0.38,0.2985,8,93,101 +8872,2012-01-10,1,1,1,12,0,2,1,1,0.42,0.4242,0.38,0.2985,13,127,140 +8873,2012-01-10,1,1,1,13,0,2,1,1,0.42,0.4242,0.38,0.2985,15,163,178 +8874,2012-01-10,1,1,1,14,0,2,1,1,0.42,0.4242,0.38,0.2985,10,96,106 +8875,2012-01-10,1,1,1,15,0,2,1,1,0.42,0.4242,0.38,0.2985,17,119,136 +8876,2012-01-10,1,1,1,16,0,2,1,1,0.42,0.4242,0.38,0.3582,15,225,240 +8877,2012-01-10,1,1,1,17,0,2,1,1,0.42,0.4242,0.38,0.2537,14,446,460 +8878,2012-01-10,1,1,1,18,0,2,1,1,0.4,0.4091,0.4,0.1045,13,372,385 
+8879,2012-01-10,1,1,1,19,0,2,1,1,0.38,0.3939,0.46,0.1045,11,270,281 +8880,2012-01-10,1,1,1,20,0,2,1,1,0.34,0.3485,0.53,0.1045,10,190,200 +8881,2012-01-10,1,1,1,21,0,2,1,1,0.32,0.3333,0.66,0.0896,7,109,116 +8882,2012-01-10,1,1,1,22,0,2,1,1,0.3,0.303,0.7,0.0896,10,104,114 +8883,2012-01-10,1,1,1,23,0,2,1,1,0.26,0.2727,0.81,0.1045,3,38,41 +8884,2012-01-11,1,1,1,0,0,3,1,1,0.26,0.303,0.81,0,2,26,28 +8885,2012-01-11,1,1,1,1,0,3,1,1,0.24,0.2576,0.87,0.0896,1,6,7 +8886,2012-01-11,1,1,1,2,0,3,1,1,0.2,0.2121,0.85,0.1642,0,5,5 +8887,2012-01-11,1,1,1,3,0,3,1,1,0.22,0.2727,0.8,0,0,4,4 +8888,2012-01-11,1,1,1,4,0,3,1,1,0.2,0.2121,0.8,0.1343,0,2,2 +8889,2012-01-11,1,1,1,5,0,3,1,1,0.2,0.2121,0.8,0.1343,0,22,22 +8890,2012-01-11,1,1,1,6,0,3,1,1,0.2,0.2121,0.86,0.1343,2,72,74 +8891,2012-01-11,1,1,1,7,0,3,1,1,0.2,0.2121,0.8,0.1343,9,247,256 +8892,2012-01-11,1,1,1,8,0,3,1,1,0.2,0.2273,0.82,0.1045,11,488,499 +8893,2012-01-11,1,1,1,9,0,3,1,2,0.2,0.2121,0.86,0.1343,7,218,225 +8894,2012-01-11,1,1,1,10,0,3,1,2,0.24,0.2424,0.87,0.1343,13,84,97 +8895,2012-01-11,1,1,1,11,0,3,1,2,0.26,0.2879,0.87,0.0896,9,90,99 +8896,2012-01-11,1,1,1,12,0,3,1,3,0.3,0.3333,0.81,0,4,51,55 +8897,2012-01-11,1,1,1,13,0,3,1,3,0.32,0.3485,0.76,0,8,57,65 +8898,2012-01-11,1,1,1,14,0,3,1,3,0.32,0.3333,0.81,0.1045,3,70,73 +8899,2012-01-11,1,1,1,15,0,3,1,2,0.34,0.3485,0.87,0.0896,6,73,79 +8900,2012-01-11,1,1,1,16,0,3,1,2,0.34,0.3182,0.87,0.2537,6,112,118 +8901,2012-01-11,1,1,1,17,0,3,1,3,0.34,0.3182,0.87,0.2239,6,128,134 +8902,2012-01-11,1,1,1,18,0,3,1,3,0.34,0.3333,0.87,0.194,2,96,98 +8903,2012-01-11,1,1,1,19,0,3,1,3,0.32,0.303,0.93,0.2239,1,92,93 +8904,2012-01-11,1,1,1,20,0,3,1,3,0.32,0.3182,0.93,0.1642,2,55,57 +8905,2012-01-11,1,1,1,21,0,3,1,3,0.34,0.3182,0.87,0.2239,0,29,29 +8906,2012-01-11,1,1,1,22,0,3,1,3,0.34,0.3333,0.87,0.194,0,49,49 +8907,2012-01-11,1,1,1,23,0,3,1,3,0.34,0.3182,0.87,0.2239,0,9,9 +8908,2012-01-12,1,1,1,0,0,4,1,3,0.34,0.3333,0.93,0.194,0,3,3 
+8909,2012-01-12,1,1,1,1,0,4,1,3,0.36,0.3485,0.93,0.194,0,3,3 +8910,2012-01-12,1,1,1,2,0,4,1,3,0.36,0.3485,0.93,0.2537,1,1,2 +8911,2012-01-12,1,1,1,3,0,4,1,3,0.36,0.3485,0.93,0.2537,0,3,3 +8912,2012-01-12,1,1,1,4,0,4,1,3,0.36,0.3485,0.93,0.194,0,2,2 +8913,2012-01-12,1,1,1,5,0,4,1,3,0.34,0.3485,0.93,0.0896,0,16,16 +8914,2012-01-12,1,1,1,6,0,4,1,3,0.36,0.3485,0.93,0.1343,1,88,89 +8915,2012-01-12,1,1,1,7,0,4,1,2,0.34,0.3182,0.87,0.2836,2,218,220 +8916,2012-01-12,1,1,1,8,0,4,1,1,0.32,0.3182,0.81,0.194,11,497,508 +8917,2012-01-12,1,1,1,9,0,4,1,1,0.32,0.3333,0.81,0.1045,10,220,230 +8918,2012-01-12,1,1,1,10,0,4,1,1,0.34,0.3485,0.81,0.1045,6,84,90 +8919,2012-01-12,1,1,1,11,0,4,1,2,0.34,0.3485,0.81,0.1045,18,101,119 +8920,2012-01-12,1,1,1,12,0,4,1,2,0.34,0.3485,0.81,0.1045,14,130,144 +8921,2012-01-12,1,1,1,13,0,4,1,1,0.42,0.4242,0.77,0.1642,13,170,183 +8922,2012-01-12,1,1,1,14,0,4,1,1,0.42,0.4242,0.77,0.1642,24,132,156 +8923,2012-01-12,1,1,1,15,0,4,1,1,0.44,0.4394,0.72,0.1343,32,156,188 +8924,2012-01-12,1,1,1,16,0,4,1,1,0.46,0.4545,0.67,0.194,33,248,281 +8925,2012-01-12,1,1,1,17,0,4,1,1,0.46,0.4545,0.63,0.1642,23,472,495 +8926,2012-01-12,1,1,1,18,0,4,1,1,0.44,0.4394,0.67,0.194,22,399,421 +8927,2012-01-12,1,1,1,19,0,4,1,2,0.42,0.4242,0.71,0.1642,17,313,330 +8928,2012-01-12,1,1,1,20,0,4,1,2,0.42,0.4242,0.71,0.194,11,229,240 +8929,2012-01-12,1,1,1,21,0,4,1,1,0.44,0.4394,0.67,0.1343,12,162,174 +8930,2012-01-12,1,1,1,22,0,4,1,1,0.38,0.3939,0.76,0.2985,12,121,133 +8931,2012-01-12,1,1,1,23,0,4,1,1,0.4,0.4091,0.76,0.3284,7,60,67 +8932,2012-01-13,1,1,1,0,0,5,1,2,0.4,0.4091,0.71,0.2836,4,38,42 +8933,2012-01-13,1,1,1,1,0,5,1,2,0.38,0.3939,0.76,0.2985,0,12,12 +8934,2012-01-13,1,1,1,2,0,5,1,1,0.4,0.4091,0.76,0.3582,1,13,14 +8935,2012-01-13,1,1,1,3,0,5,1,1,0.36,0.3333,0.87,0.2537,3,4,7 +8936,2012-01-13,1,1,1,4,0,5,1,3,0.32,0.2879,0.7,0.4925,1,2,3 +8937,2012-01-13,1,1,1,5,0,5,1,2,0.3,0.2424,0.7,0.5522,0,21,21 +8938,2012-01-13,1,1,1,6,0,5,1,2,0.26,0.2576,0.75,0.2239,2,71,73 
+8939,2012-01-13,1,1,1,7,0,5,1,1,0.26,0.2576,0.6,0.2239,3,174,177 +8940,2012-01-13,1,1,1,8,0,5,1,1,0.24,0.2121,0.52,0.3582,11,408,419 +8941,2012-01-13,1,1,1,9,0,5,1,1,0.22,0.197,0.47,0.3881,10,204,214 +8942,2012-01-13,1,1,1,10,0,5,1,1,0.24,0.197,0.44,0.4478,2,101,103 +8943,2012-01-13,1,1,1,11,0,5,1,1,0.24,0.1818,0.38,0.6119,12,102,114 +8944,2012-01-13,1,1,1,12,0,5,1,1,0.24,0.197,0.38,0.4925,14,140,154 +8945,2012-01-13,1,1,1,13,0,5,1,1,0.26,0.2121,0.33,0.5224,14,140,154 +8946,2012-01-13,1,1,1,14,0,5,1,1,0.26,0.2121,0.33,0.5522,10,119,129 +8947,2012-01-13,1,1,1,15,0,5,1,1,0.26,0.2121,0.33,0.5821,15,123,138 +8948,2012-01-13,1,1,1,16,0,5,1,1,0.28,0.2424,0.3,0.4627,18,215,233 +8949,2012-01-13,1,1,1,17,0,5,1,1,0.28,0.2424,0.33,0.4179,15,317,332 +8950,2012-01-13,1,1,1,18,0,5,1,1,0.26,0.2273,0.38,0.2985,7,294,301 +8951,2012-01-13,1,1,1,19,0,5,1,1,0.26,0.2424,0.38,0.2836,7,199,206 +8952,2012-01-13,1,1,1,20,0,5,1,1,0.24,0.2424,0.41,0.1642,6,130,136 +8953,2012-01-13,1,1,1,21,0,5,1,1,0.24,0.2121,0.41,0.2836,7,95,102 +8954,2012-01-13,1,1,1,22,0,5,1,1,0.2,0.1818,0.47,0.3284,7,58,65 +8955,2012-01-13,1,1,1,23,0,5,1,1,0.18,0.1818,0.47,0.194,5,60,65 +8956,2012-01-14,1,1,1,0,0,6,0,1,0.16,0.1818,0.47,0.1045,2,42,44 +8957,2012-01-14,1,1,1,1,0,6,0,1,0.16,0.1515,0.47,0.194,6,44,50 +8958,2012-01-14,1,1,1,2,0,6,0,1,0.16,0.1667,0.47,0.1642,6,32,38 +8959,2012-01-14,1,1,1,3,0,6,0,1,0.14,0.1667,0.54,0.1045,6,14,20 +8960,2012-01-14,1,1,1,4,0,6,0,1,0.14,0.1515,0.5,0.1343,0,3,3 +8961,2012-01-14,1,1,1,5,0,6,0,1,0.14,0.1515,0.5,0.1343,0,4,4 +8962,2012-01-14,1,1,1,6,0,6,0,1,0.14,0.1515,0.5,0,0,5,5 +8963,2012-01-14,1,1,1,7,0,6,0,1,0.14,0.1515,0.5,0.1343,0,24,24 +8964,2012-01-14,1,1,1,8,0,6,0,1,0.12,0.1515,0.54,0.1045,3,89,92 +8965,2012-01-14,1,1,1,9,0,6,0,1,0.14,0.1667,0.5,0.1045,3,78,81 +8966,2012-01-14,1,1,1,10,0,6,0,1,0.16,0.2273,0.47,0,14,135,149 +8967,2012-01-14,1,1,1,11,0,6,0,1,0.2,0.197,0.4,0.194,28,150,178 +8968,2012-01-14,1,1,1,12,0,6,0,1,0.22,0.2121,0.41,0.2537,30,189,219 
+8969,2012-01-14,1,1,1,13,0,6,0,1,0.22,0.2121,0.44,0.2836,38,182,220 +8970,2012-01-14,1,1,1,14,0,6,0,1,0.24,0.2121,0.41,0.2985,52,179,231 +8971,2012-01-14,1,1,1,15,0,6,0,1,0.24,0.2121,0.38,0.2985,49,177,226 +8972,2012-01-14,1,1,1,16,0,6,0,1,0.24,0.2273,0.38,0.2239,28,178,206 +8973,2012-01-14,1,1,1,17,0,6,0,1,0.24,0.2273,0.41,0.2537,15,129,144 +8974,2012-01-14,1,1,1,18,0,6,0,1,0.22,0.2273,0.41,0.194,21,122,143 +8975,2012-01-14,1,1,1,19,0,6,0,1,0.2,0.197,0.47,0.2239,12,90,102 +8976,2012-01-14,1,1,1,20,0,6,0,1,0.2,0.197,0.44,0.2537,8,95,103 +8977,2012-01-14,1,1,1,21,0,6,0,1,0.18,0.1667,0.43,0.2537,3,66,69 +8978,2012-01-14,1,1,1,22,0,6,0,1,0.16,0.1364,0.47,0.3582,4,62,66 +8979,2012-01-14,1,1,1,23,0,6,0,1,0.16,0.1515,0.47,0.2239,5,71,76 +8980,2012-01-15,1,1,1,0,0,0,0,1,0.16,0.1364,0.47,0.3284,9,50,59 +8981,2012-01-15,1,1,1,1,0,0,0,2,0.16,0.1364,0.47,0.2985,2,40,42 +8982,2012-01-15,1,1,1,2,0,0,0,2,0.16,0.1364,0.47,0.2836,5,38,43 +8983,2012-01-15,1,1,1,3,0,0,0,2,0.16,0.1364,0.48,0.3881,1,24,25 +8984,2012-01-15,1,1,1,4,0,0,0,2,0.16,0.1364,0.47,0.3881,1,5,6 +8985,2012-01-15,1,1,1,5,0,0,0,2,0.16,0.1364,0.47,0.3284,0,5,5 +8986,2012-01-15,1,1,1,6,0,0,0,2,0.14,0.1212,0.5,0.2836,1,7,8 +8987,2012-01-15,1,1,1,7,0,0,0,1,0.12,0.1212,0.54,0.2836,3,14,17 +8988,2012-01-15,1,1,1,8,0,0,0,1,0.12,0.1212,0.54,0.2239,2,34,36 +8989,2012-01-15,1,1,1,9,0,0,0,1,0.12,0.1364,0.54,0.194,10,80,90 +8990,2012-01-15,1,1,1,10,0,0,0,1,0.14,0.1212,0.54,0.2836,11,115,126 +8991,2012-01-15,1,1,1,11,0,0,0,1,0.16,0.1515,0.47,0.2239,21,144,165 +8992,2012-01-15,1,1,1,12,0,0,0,1,0.18,0.1667,0.4,0.2985,42,192,234 +8993,2012-01-15,1,1,1,13,0,0,0,1,0.2,0.1818,0.34,0.2836,40,184,224 +8994,2012-01-15,1,1,1,14,0,0,0,1,0.2,0.1818,0.32,0.3582,25,163,188 +8995,2012-01-15,1,1,1,15,0,0,0,1,0.22,0.2273,0.27,0.194,33,173,206 +8996,2012-01-15,1,1,1,16,0,0,0,1,0.22,0.2121,0.29,0.2985,31,174,205 +8997,2012-01-15,1,1,1,17,0,0,0,1,0.22,0.2121,0.25,0.2537,15,131,146 
+8998,2012-01-15,1,1,1,18,0,0,0,1,0.18,0.1818,0.29,0.2239,7,113,120 +8999,2012-01-15,1,1,1,19,0,0,0,1,0.16,0.1667,0.37,0.1642,9,101,110 +9000,2012-01-15,1,1,1,20,0,0,0,1,0.16,0.1818,0.37,0.1045,10,85,95 +9001,2012-01-15,1,1,1,21,0,0,0,1,0.16,0.1667,0.4,0.1642,1,71,72 +9002,2012-01-15,1,1,1,22,0,0,0,1,0.18,0.2121,0.37,0.0896,2,58,60 +9003,2012-01-15,1,1,1,23,0,0,0,1,0.16,0.197,0.43,0.0896,3,26,29 +9004,2012-01-16,1,1,1,0,1,1,0,1,0.14,0.1515,0.46,0.1343,2,23,25 +9005,2012-01-16,1,1,1,1,1,1,0,1,0.14,0.1667,0.43,0.1045,2,18,20 +9006,2012-01-16,1,1,1,2,1,1,0,1,0.14,0.2121,0.46,0,3,14,17 +9007,2012-01-16,1,1,1,3,1,1,0,1,0.12,0.197,0.5,0,0,3,3 +9008,2012-01-16,1,1,1,4,1,1,0,1,0.12,0.1515,0.58,0.1045,2,6,8 +9009,2012-01-16,1,1,1,5,1,1,0,1,0.12,0.1667,0.54,0.0896,1,5,6 +9010,2012-01-16,1,1,1,6,1,1,0,1,0.1,0.1364,0.54,0.0896,0,13,13 +9011,2012-01-16,1,1,1,7,1,1,0,1,0.1,0.1364,0.54,0.1045,5,28,33 +9012,2012-01-16,1,1,1,8,1,1,0,1,0.1,0.1212,0.58,0.1642,3,75,78 +9013,2012-01-16,1,1,1,9,1,1,0,1,0.1,0.1212,0.58,0.1642,7,89,96 +9014,2012-01-16,1,1,1,10,1,1,0,2,0.14,0.1364,0.59,0.194,19,107,126 +9015,2012-01-16,1,1,1,11,1,1,0,1,0.16,0.1515,0.59,0.2537,21,158,179 +9016,2012-01-16,1,1,1,12,1,1,0,1,0.2,0.1818,0.55,0.3284,20,165,185 +9017,2012-01-16,1,1,1,13,1,1,0,2,0.24,0.197,0.41,0.4179,26,202,228 +9018,2012-01-16,1,1,1,14,1,1,0,1,0.24,0.2121,0.44,0.3881,23,158,181 +9019,2012-01-16,1,1,1,15,1,1,0,1,0.26,0.2121,0.44,0.4478,17,175,192 +9020,2012-01-16,1,1,1,16,1,1,0,1,0.26,0.2273,0.44,0.3582,22,158,180 +9021,2012-01-16,1,1,1,17,1,1,0,1,0.26,0.2273,0.48,0.3284,19,174,193 +9022,2012-01-16,1,1,1,18,1,1,0,1,0.26,0.2273,0.52,0.3284,6,175,181 +9023,2012-01-16,1,1,1,19,1,1,0,1,0.26,0.2424,0.56,0.2836,10,134,144 +9024,2012-01-16,1,1,1,20,1,1,0,1,0.26,0.2424,0.6,0.2836,2,88,90 +9025,2012-01-16,1,1,1,21,1,1,0,1,0.28,0.2576,0.52,0.3284,3,46,49 +9026,2012-01-16,1,1,1,22,1,1,0,2,0.3,0.2727,0.49,0.3582,4,39,43 +9027,2012-01-16,1,1,1,23,1,1,0,3,0.26,0.2273,0.7,0.2985,0,28,28 
+9028,2012-01-17,1,1,1,0,0,2,1,2,0.26,0.2273,0.7,0.3284,0,12,12 +9029,2012-01-17,1,1,1,1,0,2,1,2,0.28,0.2576,0.65,0.3582,0,2,2 +9030,2012-01-17,1,1,1,2,0,2,1,3,0.28,0.2727,0.75,0.2537,0,12,12 +9031,2012-01-17,1,1,1,4,0,2,1,3,0.32,0.303,0.66,0.3284,0,2,2 +9032,2012-01-17,1,1,1,5,0,2,1,3,0.32,0.303,0.66,0.3284,0,13,13 +9033,2012-01-17,1,1,1,6,0,2,1,3,0.3,0.2879,0.75,0.2537,0,57,57 +9034,2012-01-17,1,1,1,7,0,2,1,3,0.3,0.2879,0.81,0.2239,5,121,126 +9035,2012-01-17,1,1,1,8,0,2,1,3,0.3,0.2879,0.81,0.2836,3,90,93 +9036,2012-01-17,1,1,1,9,0,2,1,3,0.32,0.303,0.81,0.2537,7,61,68 +9037,2012-01-17,1,1,1,10,0,2,1,3,0.3,0.2879,0.87,0.2537,1,61,62 +9038,2012-01-17,1,1,1,11,0,2,1,2,0.34,0.3485,0.87,0.1045,2,73,75 +9039,2012-01-17,1,1,1,12,0,2,1,2,0.32,0.303,0.87,0.2836,3,83,86 +9040,2012-01-17,1,1,1,13,0,2,1,2,0.38,0.3939,0.76,0.3284,8,95,103 +9041,2012-01-17,1,1,1,14,0,2,1,1,0.44,0.4394,0.62,0.4925,12,120,132 +9042,2012-01-17,1,1,1,15,0,2,1,1,0.44,0.4394,0.62,0.4925,6,123,129 +9043,2012-01-17,1,1,1,16,0,2,1,1,0.46,0.4545,0.63,0.5522,14,205,219 +9044,2012-01-17,1,1,1,17,0,2,1,1,0.46,0.4545,0.63,0.4478,20,442,462 +9045,2012-01-17,1,1,1,18,0,2,1,1,0.46,0.4545,0.63,0.4478,12,451,463 +9046,2012-01-17,1,1,1,19,0,2,1,1,0.46,0.4545,0.63,0.4179,11,297,308 +9047,2012-01-17,1,1,1,20,0,2,1,1,0.46,0.4545,0.63,0.3582,11,169,180 +9048,2012-01-17,1,1,1,21,0,2,1,1,0.46,0.4545,0.67,0.4627,6,174,180 +9049,2012-01-17,1,1,1,22,0,2,1,1,0.46,0.4545,0.72,0.3881,5,85,90 +9050,2012-01-17,1,1,1,23,0,2,1,1,0.46,0.4545,0.72,0.3881,1,60,61 +9051,2012-01-18,1,1,1,0,0,3,1,2,0.46,0.4545,0.77,0.3284,3,14,17 +9052,2012-01-18,1,1,1,1,0,3,1,2,0.46,0.4545,0.77,0.3284,0,10,10 +9053,2012-01-18,1,1,1,2,0,3,1,2,0.44,0.4394,0.51,0.4925,1,1,2 +9054,2012-01-18,1,1,1,3,0,3,1,2,0.44,0.4394,0.51,0.4925,1,3,4 +9055,2012-01-18,1,1,1,4,0,3,1,1,0.34,0.2879,0.46,0.5224,0,1,1 +9056,2012-01-18,1,1,1,5,0,3,1,1,0.34,0.2879,0.46,0.5224,0,29,29 +9057,2012-01-18,1,1,1,6,0,3,1,1,0.34,0.2879,0.36,0.5224,0,88,88 
+9058,2012-01-18,1,1,1,7,0,3,1,1,0.32,0.2879,0.42,0.3582,1,262,263 +9059,2012-01-18,1,1,1,8,0,3,1,1,0.3,0.2727,0.52,0.4179,15,474,489 +9060,2012-01-18,1,1,1,9,0,3,1,1,0.28,0.2576,0.52,0.3284,6,196,202 +9061,2012-01-18,1,1,1,10,0,3,1,1,0.3,0.2727,0.42,0.4627,5,88,93 +9062,2012-01-18,1,1,1,11,0,3,1,1,0.3,0.2727,0.42,0.3881,4,110,114 +9063,2012-01-18,1,1,1,12,0,3,1,1,0.3,0.2576,0.39,0.4925,6,155,161 +9064,2012-01-18,1,1,1,13,0,3,1,1,0.3,0.2727,0.42,0.4627,6,124,130 +9065,2012-01-18,1,1,1,14,0,3,1,1,0.3,0.2727,0.33,0.3881,7,98,105 +9066,2012-01-18,1,1,1,15,0,3,1,1,0.3,0.2576,0.33,0.5224,8,122,130 +9067,2012-01-18,1,1,1,16,0,3,1,1,0.28,0.2273,0.3,0.5224,13,163,176 +9068,2012-01-18,1,1,1,17,0,3,1,1,0.26,0.2121,0.33,0.4478,8,338,346 +9069,2012-01-18,1,1,1,18,0,3,1,1,0.24,0.2121,0.32,0.3582,8,347,355 +9070,2012-01-18,1,1,1,19,0,3,1,1,0.22,0.197,0.37,0.3881,5,251,256 +9071,2012-01-18,1,1,1,20,0,3,1,1,0.22,0.2121,0.37,0.2537,1,164,165 +9072,2012-01-18,1,1,1,21,0,3,1,1,0.2,0.1667,0.4,0.4179,5,119,124 +9073,2012-01-18,1,1,1,22,0,3,1,1,0.18,0.1667,0.47,0.2537,4,74,78 +9074,2012-01-18,1,1,1,23,0,3,1,1,0.16,0.1364,0.47,0.2985,2,36,38 +9075,2012-01-19,1,1,1,0,0,4,1,1,0.14,0.1364,0.5,0.194,0,16,16 +9076,2012-01-19,1,1,1,1,0,4,1,1,0.14,0.1515,0.5,0.1642,0,5,5 +9077,2012-01-19,1,1,1,2,0,4,1,1,0.14,0.1515,0.5,0.1343,0,3,3 +9078,2012-01-19,1,1,1,3,0,4,1,1,0.12,0.1515,0.54,0.1045,0,4,4 +9079,2012-01-19,1,1,1,4,0,4,1,1,0.12,0.1515,0.54,0.1045,0,1,1 +9080,2012-01-19,1,1,1,5,0,4,1,1,0.12,0.1212,0.55,0.2537,0,19,19 +9081,2012-01-19,1,1,1,6,0,4,1,1,0.12,0.1364,0.54,0.1642,0,86,86 +9082,2012-01-19,1,1,1,7,0,4,1,1,0.12,0.1667,0.54,0.0896,2,204,206 +9083,2012-01-19,1,1,1,8,0,4,1,1,0.12,0.1515,0.54,0.1045,9,414,423 +9084,2012-01-19,1,1,1,9,0,4,1,1,0.12,0.197,0.54,0,6,204,210 +9085,2012-01-19,1,1,1,10,0,4,1,1,0.14,0.2121,0.59,0,10,98,108 +9086,2012-01-19,1,1,1,11,0,4,1,1,0.16,0.1667,0.55,0.1642,6,93,99 +9087,2012-01-19,1,1,1,12,0,4,1,1,0.2,0.197,0.51,0.194,5,106,111 
+9088,2012-01-19,1,1,1,13,0,4,1,1,0.22,0.2121,0.47,0.2836,6,118,124 +9089,2012-01-19,1,1,1,14,0,4,1,1,0.24,0.2273,0.44,0.194,15,104,119 +9090,2012-01-19,1,1,1,15,0,4,1,1,0.26,0.2273,0.44,0.2985,9,118,127 +9091,2012-01-19,1,1,1,16,0,4,1,1,0.26,0.2273,0.41,0.3284,8,185,193 +9092,2012-01-19,1,1,1,17,0,4,1,2,0.26,0.2273,0.44,0.3284,21,364,385 +9093,2012-01-19,1,1,1,18,0,4,1,2,0.26,0.2273,0.44,0.3284,16,345,361 +9094,2012-01-19,1,1,1,19,0,4,1,2,0.26,0.2273,0.48,0.2985,5,229,234 +9095,2012-01-19,1,1,1,20,0,4,1,1,0.26,0.2121,0.52,0.4478,7,184,191 +9096,2012-01-19,1,1,1,21,0,4,1,1,0.26,0.2273,0.44,0.3582,4,117,121 +9097,2012-01-19,1,1,1,22,0,4,1,2,0.26,0.2273,0.44,0.4179,0,90,90 +9098,2012-01-19,1,1,1,23,0,4,1,1,0.26,0.2273,0.48,0.3284,1,55,56 +9099,2012-01-20,1,1,1,0,0,5,1,2,0.26,0.2273,0.48,0.2985,3,24,27 +9100,2012-01-20,1,1,1,1,0,5,1,1,0.26,0.2727,0.48,0.1343,0,15,15 +9101,2012-01-20,1,1,1,2,0,5,1,1,0.26,0.2727,0.48,0.1343,0,11,11 +9102,2012-01-20,1,1,1,3,0,5,1,1,0.24,0.2576,0.52,0.1045,0,4,4 +9103,2012-01-20,1,1,1,4,0,5,1,1,0.22,0.2424,0.6,0.1045,0,3,3 +9104,2012-01-20,1,1,1,5,0,5,1,1,0.22,0.2424,0.6,0.1045,0,19,19 +9105,2012-01-20,1,1,1,6,0,5,1,1,0.24,0.2121,0.6,0.2836,0,68,68 +9106,2012-01-20,1,1,1,7,0,5,1,1,0.22,0.2121,0.69,0.2239,2,183,185 +9107,2012-01-20,1,1,1,8,0,5,1,1,0.22,0.2121,0.64,0.2537,4,421,425 +9108,2012-01-20,1,1,1,9,0,5,1,1,0.22,0.197,0.47,0.3881,21,242,263 +9109,2012-01-20,1,1,1,10,0,5,1,2,0.22,0.2121,0.44,0.2836,1,96,97 +9110,2012-01-20,1,1,1,11,0,5,1,2,0.2,0.1818,0.32,0.3582,6,117,123 +9111,2012-01-20,1,1,1,12,0,5,1,2,0.2,0.1818,0.34,0.3284,5,119,124 +9112,2012-01-20,1,1,1,13,0,5,1,1,0.22,0.2121,0.32,0.2985,12,144,156 +9113,2012-01-20,1,1,1,14,0,5,1,2,0.22,0.2121,0.35,0.2985,8,107,115 +9114,2012-01-20,1,1,1,15,0,5,1,2,0.22,0.2273,0.35,0.194,12,144,156 +9115,2012-01-20,1,1,1,16,0,5,1,2,0.2,0.2121,0.37,0.1642,11,163,174 +9116,2012-01-20,1,1,1,17,0,5,1,2,0.22,0.2576,0.32,0.0896,9,340,349 
+9117,2012-01-20,1,1,1,18,0,5,1,2,0.2,0.2273,0.34,0.0896,5,296,301 +9118,2012-01-20,1,1,1,19,0,5,1,2,0.2,0.2273,0.37,0.1045,6,186,192 +9119,2012-01-20,1,1,1,20,0,5,1,2,0.2,0.2273,0.34,0.0896,1,141,142 +9120,2012-01-20,1,1,1,21,0,5,1,2,0.2,0.2273,0.37,0.0896,5,99,104 +9121,2012-01-20,1,1,1,22,0,5,1,2,0.2,0.197,0.37,0.2239,1,71,72 +9122,2012-01-20,1,1,1,23,0,5,1,3,0.16,0.1515,0.64,0.2239,3,35,38 +9123,2012-01-21,1,1,1,0,0,6,0,3,0.16,0.1515,0.64,0.2239,0,24,24 +9124,2012-01-21,1,1,1,1,0,6,0,4,0.14,0.1364,0.86,0.194,1,22,23 +9125,2012-01-21,1,1,1,2,0,6,0,3,0.14,0.2121,0.86,0,1,25,26 +9126,2012-01-21,1,1,1,3,0,6,0,3,0.14,0.1818,0.93,0.0896,0,13,13 +9127,2012-01-21,1,1,1,4,0,6,0,3,0.16,0.197,0.86,0.0896,0,1,1 +9128,2012-01-21,1,1,1,5,0,6,0,3,0.16,0.1818,0.86,0.1045,0,2,2 +9129,2012-01-21,1,1,1,6,0,6,0,3,0.16,0.1667,0.93,0.1642,0,1,1 +9130,2012-01-21,1,1,1,7,0,6,0,3,0.16,0.1667,0.93,0.1642,3,10,13 +9131,2012-01-21,1,1,1,8,0,6,0,2,0.16,0.1515,0.93,0.2537,3,22,25 +9132,2012-01-21,1,1,1,9,0,6,0,2,0.16,0.1515,0.93,0.2537,2,40,42 +9133,2012-01-21,1,1,1,10,0,6,0,3,0.16,0.1515,0.93,0.2537,0,45,45 +9134,2012-01-21,1,1,1,11,0,6,0,2,0.18,0.2121,0.86,0.1045,1,62,63 +9135,2012-01-21,1,1,1,12,0,6,0,2,0.2,0.2121,0.8,0.1642,5,62,67 +9136,2012-01-21,1,1,1,13,0,6,0,2,0.2,0.197,0.8,0.2537,10,66,76 +9137,2012-01-21,1,1,1,14,0,6,0,3,0.2,0.197,0.8,0.194,6,89,95 +9138,2012-01-21,1,1,1,15,0,6,0,3,0.2,0.197,0.8,0.194,6,113,119 +9139,2012-01-21,1,1,1,16,0,6,0,2,0.2,0.1818,0.8,0.2985,5,113,118 +9140,2012-01-21,1,1,1,17,0,6,0,2,0.2,0.1818,0.8,0.3881,7,99,106 +9141,2012-01-21,1,1,1,18,0,6,0,2,0.2,0.1818,0.75,0.3881,3,107,110 +9142,2012-01-21,1,1,1,19,0,6,0,2,0.18,0.1515,0.8,0.3881,0,85,85 +9143,2012-01-21,1,1,1,20,0,6,0,2,0.18,0.1667,0.8,0.2985,5,62,67 +9144,2012-01-21,1,1,1,21,0,6,0,1,0.18,0.1515,0.74,0.3284,5,64,69 +9145,2012-01-21,1,1,1,22,0,6,0,2,0.18,0.1667,0.74,0.2985,3,56,59 +9146,2012-01-21,1,1,1,23,0,6,0,2,0.16,0.1515,0.8,0.2537,1,51,52 
+9147,2012-01-22,1,1,1,0,0,0,0,1,0.16,0.1364,0.8,0.2985,1,51,52 +9148,2012-01-22,1,1,1,1,0,0,0,2,0.18,0.1667,0.74,0.2985,1,49,50 +9149,2012-01-22,1,1,1,2,0,0,0,2,0.18,0.1667,0.74,0.2836,1,46,47 +9150,2012-01-22,1,1,1,3,0,0,0,2,0.18,0.1667,0.69,0.2836,1,20,21 +9151,2012-01-22,1,1,1,4,0,0,0,3,0.16,0.1364,0.8,0.2836,1,1,2 +9152,2012-01-22,1,1,1,5,0,0,0,3,0.16,0.1515,0.74,0.2239,1,2,3 +9153,2012-01-22,1,1,1,6,0,0,0,2,0.16,0.1515,0.74,0.2239,0,4,4 +9154,2012-01-22,1,1,1,7,0,0,0,2,0.16,0.1515,0.74,0.2239,2,11,13 +9155,2012-01-22,1,1,1,8,0,0,0,2,0.16,0.1515,0.74,0.2239,3,29,32 +9156,2012-01-22,1,1,1,9,0,0,0,2,0.16,0.1515,0.74,0.2239,9,54,63 +9157,2012-01-22,1,1,1,10,0,0,0,2,0.16,0.1364,0.74,0.2836,9,120,129 +9158,2012-01-22,1,1,1,11,0,0,0,3,0.16,0.1515,0.8,0.2537,18,115,133 +9159,2012-01-22,1,1,1,12,0,0,0,2,0.16,0.1515,0.8,0.2239,24,161,185 +9160,2012-01-22,1,1,1,13,0,0,0,2,0.16,0.1818,0.8,0.1343,24,175,199 +9161,2012-01-22,1,1,1,14,0,0,0,2,0.16,0.1667,0.8,0.1642,19,163,182 +9162,2012-01-22,1,1,1,15,0,0,0,2,0.16,0.1515,0.8,0.194,27,149,176 +9163,2012-01-22,1,1,1,16,0,0,0,2,0.16,0.1515,0.8,0.194,12,143,155 +9164,2012-01-22,1,1,1,17,0,0,0,2,0.16,0.1818,0.8,0.1045,8,100,108 +9165,2012-01-22,1,1,1,18,0,0,0,3,0.16,0.1818,0.86,0.1045,11,110,121 +9166,2012-01-22,1,1,1,19,0,0,0,3,0.16,0.1818,0.86,0.1045,9,80,89 +9167,2012-01-22,1,1,1,20,0,0,0,3,0.16,0.1667,0.86,0.1642,5,69,74 +9168,2012-01-22,1,1,1,21,0,0,0,3,0.16,0.1818,0.86,0.1045,8,36,44 +9169,2012-01-22,1,1,1,22,0,0,0,2,0.16,0.197,0.93,0.0896,0,58,58 +9170,2012-01-22,1,1,1,23,0,0,0,2,0.16,0.1818,0.93,0.1045,2,35,37 +9171,2012-01-23,1,1,1,0,0,1,1,2,0.18,0.2424,0.86,0,2,17,19 +9172,2012-01-23,1,1,1,1,0,1,1,2,0.18,0.2121,0.86,0.1045,0,4,4 +9173,2012-01-23,1,1,1,2,0,1,1,2,0.18,0.2121,0.86,0.1045,0,2,2 +9174,2012-01-23,1,1,1,3,0,1,1,2,0.18,0.2424,0.86,0,0,1,1 +9175,2012-01-23,1,1,1,4,0,1,1,2,0.2,0.2576,0.8,0,0,1,1 +9176,2012-01-23,1,1,1,5,0,1,1,2,0.2,0.2576,0.86,0,0,19,19 +9177,2012-01-23,1,1,1,6,0,1,1,2,0.2,0.2576,0.86,0,6,36,42 
+9178,2012-01-23,1,1,1,7,0,1,1,3,0.2,0.2576,0.86,0,8,114,122 +9179,2012-01-23,1,1,1,8,0,1,1,3,0.2,0.2121,0.86,0.1343,5,267,272 +9180,2012-01-23,1,1,1,9,0,1,1,2,0.2,0.2273,0.86,0.0896,13,198,211 +9181,2012-01-23,1,1,1,10,0,1,1,2,0.2,0.2576,0.93,0,10,139,149 +9182,2012-01-23,1,1,1,11,0,1,1,2,0.2,0.2273,0.93,0.0896,5,57,62 +9183,2012-01-23,1,1,1,12,0,1,1,2,0.22,0.2727,0.93,0,10,52,62 +9184,2012-01-23,1,1,1,13,0,1,1,3,0.22,0.2576,0.93,0.0896,5,32,37 +9185,2012-01-23,1,1,1,14,0,1,1,3,0.22,0.2273,0.93,0.194,13,48,61 +9186,2012-01-23,1,1,1,15,0,1,1,2,0.22,0.2121,1,0.2239,10,64,74 +9187,2012-01-23,1,1,1,16,0,1,1,2,0.24,0.2424,0.93,0.1642,14,108,122 +9188,2012-01-23,1,1,1,17,0,1,1,2,0.24,0.2576,0.96,0.1045,6,270,276 +9189,2012-01-23,1,1,1,18,0,1,1,2,0.24,0.2424,0.93,0.1343,7,275,282 +9190,2012-01-23,1,1,1,19,0,1,1,2,0.26,0.2576,0.93,0.1642,9,211,220 +9191,2012-01-23,1,1,1,20,0,1,1,2,0.26,0.2576,1,0.1642,8,161,169 +9192,2012-01-23,1,1,1,21,0,1,1,2,0.26,0.2424,1,0.2836,8,112,120 +9193,2012-01-23,1,1,1,22,0,1,1,2,0.26,0.2424,1,0.2836,4,64,68 +9194,2012-01-23,1,1,1,23,0,1,1,2,0.28,0.2576,0.93,0.3284,2,35,37 +9195,2012-01-24,1,1,1,0,0,2,1,2,0.3,0.2879,1,0.2836,6,19,25 +9196,2012-01-24,1,1,1,1,0,2,1,2,0.32,0.303,0.93,0.2537,2,7,9 +9197,2012-01-24,1,1,1,2,0,2,1,2,0.32,0.303,0.93,0.2239,2,6,8 +9198,2012-01-24,1,1,1,3,0,2,1,2,0.32,0.3333,0.93,0.1343,0,3,3 +9199,2012-01-24,1,1,1,4,0,2,1,2,0.3,0.3182,1,0.0896,0,2,2 +9200,2012-01-24,1,1,1,5,0,2,1,2,0.3,0.3182,1,0.0896,0,26,26 +9201,2012-01-24,1,1,1,6,0,2,1,2,0.32,0.3182,0.93,0.194,1,88,89 +9202,2012-01-24,1,1,1,7,0,2,1,2,0.32,0.3333,0.93,0.1343,11,221,232 +9203,2012-01-24,1,1,1,8,0,2,1,2,0.32,0.3333,0.93,0.1343,11,479,490 +9204,2012-01-24,1,1,1,9,0,2,1,2,0.32,0.3333,0.87,0.0896,11,244,255 +9205,2012-01-24,1,1,1,10,0,2,1,1,0.32,0.3333,0.93,0.0896,23,93,116 +9206,2012-01-24,1,1,1,11,0,2,1,1,0.34,0.3333,0.87,0.1343,30,123,153 +9207,2012-01-24,1,1,1,12,0,2,1,1,0.34,0.3333,0.87,0.1642,25,150,175 
+9208,2012-01-24,1,1,1,13,0,2,1,1,0.34,0.3333,0.87,0.1642,33,170,203 +9209,2012-01-24,1,1,1,14,0,2,1,1,0.36,0.3485,0.81,0.2239,35,138,173 +9210,2012-01-24,1,1,1,15,0,2,1,2,0.4,0.4091,0.71,0.1642,52,156,208 +9211,2012-01-24,1,1,1,16,0,2,1,1,0.4,0.4091,0.71,0.1343,35,243,278 +9212,2012-01-24,1,1,1,17,0,2,1,1,0.42,0.4242,0.67,0.0896,41,474,515 +9213,2012-01-24,1,1,1,18,0,2,1,1,0.4,0.4091,0.62,0.0896,32,391,423 +9214,2012-01-24,1,1,1,19,0,2,1,1,0.44,0.4394,0.51,0.0896,20,311,331 +9215,2012-01-24,1,1,1,20,0,2,1,1,0.32,0.3485,0.76,0,18,211,229 +9216,2012-01-24,1,1,1,21,0,2,1,1,0.36,0.3788,0.71,0,18,154,172 +9217,2012-01-24,1,1,1,22,0,2,1,1,0.32,0.3485,0.81,0,14,111,125 +9218,2012-01-24,1,1,1,23,0,2,1,1,0.32,0.3485,0.76,0,19,80,99 +9219,2012-01-25,1,1,1,0,0,3,1,1,0.26,0.303,0.93,0,6,25,31 +9220,2012-01-25,1,1,1,1,0,3,1,1,0.3,0.2879,0.7,0.2239,4,9,13 +9221,2012-01-25,1,1,1,2,0,3,1,1,0.28,0.2727,0.7,0.194,3,3,6 +9222,2012-01-25,1,1,1,3,0,3,1,1,0.26,0.2576,0.75,0.194,0,1,1 +9223,2012-01-25,1,1,1,4,0,3,1,1,0.26,0.2424,0.7,0.2836,0,4,4 +9224,2012-01-25,1,1,1,5,0,3,1,1,0.26,0.2424,0.7,0.2836,0,33,33 +9225,2012-01-25,1,1,1,6,0,3,1,1,0.24,0.2273,0.73,0.2239,1,87,88 +9226,2012-01-25,1,1,1,7,0,3,1,1,0.24,0.2273,0.75,0.2239,14,243,257 +9227,2012-01-25,1,1,1,8,0,3,1,1,0.24,0.2273,0.7,0.2537,18,495,513 +9228,2012-01-25,1,1,1,9,0,3,1,1,0.24,0.2273,0.7,0.2537,18,218,236 +9229,2012-01-25,1,1,1,10,0,3,1,2,0.26,0.2273,0.7,0.3284,30,111,141 +9230,2012-01-25,1,1,1,11,0,3,1,2,0.3,0.2727,0.61,0.2985,52,105,157 +9231,2012-01-25,1,1,1,12,0,3,1,1,0.32,0.3333,0.53,0.1343,46,147,193 +9232,2012-01-25,1,1,1,13,0,3,1,1,0.34,0.3333,0.49,0.1343,63,149,212 +9233,2012-01-25,1,1,1,14,0,3,1,1,0.34,0.3485,0.49,0.0896,53,107,160 +9234,2012-01-25,1,1,1,15,0,3,1,1,0.36,0.3788,0.46,0,30,125,155 +9235,2012-01-25,1,1,1,16,0,3,1,1,0.34,0.3333,0.53,0.1642,29,217,246 +9236,2012-01-25,1,1,1,17,0,3,1,1,0.36,0.3485,0.53,0.1642,27,443,470 +9237,2012-01-25,1,1,1,18,0,3,1,1,0.32,0.3333,0.66,0.1343,24,415,439 
+9238,2012-01-25,1,1,1,19,0,3,1,2,0.34,0.3636,0.53,0,5,311,316 +9239,2012-01-25,1,1,1,20,0,3,1,1,0.32,0.3182,0.61,0.194,10,225,235 +9240,2012-01-25,1,1,1,21,0,3,1,1,0.3,0.3333,0.65,0,19,166,185 +9241,2012-01-25,1,1,1,22,0,3,1,1,0.3,0.3182,0.65,0.0896,12,101,113 +9242,2012-01-25,1,1,1,23,0,3,1,2,0.28,0.3182,0.65,0,3,63,66 +9243,2012-01-26,1,1,1,0,0,4,1,2,0.28,0.2879,0.75,0.1045,3,22,25 +9244,2012-01-26,1,1,1,1,0,4,1,2,0.28,0.3182,0.75,0,5,16,21 +9245,2012-01-26,1,1,1,2,0,4,1,2,0.28,0.2727,0.75,0.1642,1,6,7 +9246,2012-01-26,1,1,1,3,0,4,1,2,0.28,0.303,0.75,0.0896,1,5,6 +9247,2012-01-26,1,1,1,4,0,4,1,1,0.28,0.3182,0.75,0,0,3,3 +9248,2012-01-26,1,1,1,5,0,4,1,2,0.3,0.3182,0.75,0.1045,0,28,28 +9249,2012-01-26,1,1,1,6,0,4,1,2,0.28,0.3182,0.81,0,0,88,88 +9250,2012-01-26,1,1,1,7,0,4,1,2,0.28,0.303,0.81,0.0896,11,228,239 +9251,2012-01-26,1,1,1,8,0,4,1,2,0.28,0.303,0.81,0.0896,16,514,530 +9252,2012-01-26,1,1,1,9,0,4,1,2,0.3,0.3333,0.81,0,18,256,274 +9253,2012-01-26,1,1,1,10,0,4,1,2,0.3,0.303,0.81,0.1343,16,92,108 +9254,2012-01-26,1,1,1,11,0,4,1,2,0.3,0.3182,0.87,0.1045,11,100,111 +9255,2012-01-26,1,1,1,12,0,4,1,2,0.34,0.3333,0.81,0.1343,29,136,165 +9256,2012-01-26,1,1,1,13,0,4,1,2,0.38,0.3939,0.71,0,17,156,173 +9257,2012-01-26,1,1,1,14,0,4,1,2,0.4,0.4091,0.71,0.1045,16,112,128 +9258,2012-01-26,1,1,1,15,0,4,1,2,0.42,0.4242,0.71,0.1045,22,142,164 +9259,2012-01-26,1,1,1,16,0,4,1,2,0.4,0.4091,0.71,0.0896,15,209,224 +9260,2012-01-26,1,1,1,17,0,4,1,2,0.42,0.4242,0.71,0,16,397,413 +9261,2012-01-26,1,1,1,18,0,4,1,2,0.42,0.4242,0.71,0,12,394,406 +9262,2012-01-26,1,1,1,19,0,4,1,2,0.4,0.4091,0.76,0.1045,11,302,313 +9263,2012-01-26,1,1,1,20,0,4,1,2,0.4,0.4091,0.82,0.1343,8,216,224 +9264,2012-01-26,1,1,1,21,0,4,1,2,0.4,0.4091,0.76,0.1045,8,184,192 +9265,2012-01-26,1,1,1,22,0,4,1,1,0.38,0.3939,0.82,0.1045,3,139,142 +9266,2012-01-26,1,1,1,23,0,4,1,2,0.4,0.4091,0.82,0,5,86,91 +9267,2012-01-27,1,1,1,0,0,5,1,2,0.4,0.4091,0.82,0.2537,6,32,38 
+9268,2012-01-27,1,1,1,1,0,5,1,2,0.4,0.4091,0.87,0.2239,6,23,29 +9269,2012-01-27,1,1,1,2,0,5,1,1,0.42,0.4242,0.94,0.1343,2,11,13 +9270,2012-01-27,1,1,1,3,0,5,1,2,0.42,0.4242,0.94,0.2836,3,5,8 +9271,2012-01-27,1,1,1,4,0,5,1,2,0.4,0.4091,1,0.2985,1,3,4 +9272,2012-01-27,1,1,1,5,0,5,1,2,0.42,0.4242,0.94,0.3284,0,24,24 +9273,2012-01-27,1,1,1,6,0,5,1,2,0.42,0.4242,1,0.3284,0,72,72 +9274,2012-01-27,1,1,1,7,0,5,1,2,0.42,0.4242,1,0.2985,6,128,134 +9275,2012-01-27,1,1,1,8,0,5,1,3,0.5,0.4848,0.88,0.2836,14,206,220 +9276,2012-01-27,1,1,1,9,0,5,1,3,0.5,0.4848,0.88,0.2836,6,101,107 +9277,2012-01-27,1,1,1,10,0,5,1,3,0.48,0.4697,1,0.1642,9,117,126 +9278,2012-01-27,1,1,1,11,0,5,1,2,0.48,0.4697,0.94,0.1343,10,95,105 +9279,2012-01-27,1,1,1,12,0,5,1,2,0.5,0.4848,0.88,0.1642,25,155,180 +9280,2012-01-27,1,1,1,13,0,5,1,1,0.5,0.4848,0.59,0.4627,26,175,201 +9281,2012-01-27,1,1,1,14,0,5,1,1,0.5,0.4848,0.55,0.5522,23,153,176 +9282,2012-01-27,1,1,1,15,0,5,1,1,0.48,0.4697,0.51,0.3284,27,170,197 +9283,2012-01-27,1,1,1,16,0,5,1,1,0.44,0.4394,0.51,0.5522,26,241,267 +9284,2012-01-27,1,1,1,17,0,5,1,1,0.42,0.4242,0.5,0.5821,16,406,422 +9285,2012-01-27,1,1,1,18,0,5,1,1,0.4,0.4091,0.5,0.4925,20,363,383 +9286,2012-01-27,1,1,1,19,0,5,1,1,0.36,0.3333,0.5,0.3881,10,241,251 +9287,2012-01-27,1,1,1,20,0,5,1,1,0.34,0.303,0.53,0.3582,6,153,159 +9288,2012-01-27,1,1,1,21,0,5,1,1,0.34,0.2879,0.53,0.5224,8,127,135 +9289,2012-01-27,1,1,1,22,0,5,1,1,0.34,0.303,0.49,0.3881,11,108,119 +9290,2012-01-27,1,1,1,23,0,5,1,1,0.32,0.2879,0.49,0.4179,8,78,86 +9291,2012-01-28,1,1,1,0,0,6,0,1,0.3,0.2879,0.52,0.2836,1,68,69 +9292,2012-01-28,1,1,1,1,0,6,0,1,0.3,0.303,0.52,0.1642,1,57,58 +9293,2012-01-28,1,1,1,2,0,6,0,1,0.26,0.303,0.6,0,3,38,41 +9294,2012-01-28,1,1,1,3,0,6,0,1,0.26,0.2879,0.6,0.0896,1,16,17 +9295,2012-01-28,1,1,1,4,0,6,0,1,0.26,0.2576,0.6,0.1642,0,8,8 +9296,2012-01-28,1,1,1,5,0,6,0,1,0.24,0.2576,0.75,0.0896,0,10,10 +9297,2012-01-28,1,1,1,6,0,6,0,1,0.24,0.2879,0.7,0,3,12,15 
+9298,2012-01-28,1,1,1,7,0,6,0,1,0.22,0.2727,0.75,0,3,28,31 +9299,2012-01-28,1,1,1,8,0,6,0,1,0.2,0.2273,0.8,0.0896,9,82,91 +9300,2012-01-28,1,1,1,9,0,6,0,1,0.22,0.2727,0.8,0,13,145,158 +9301,2012-01-28,1,1,1,10,0,6,0,1,0.26,0.2727,0.7,0.1343,27,164,191 +9302,2012-01-28,1,1,1,11,0,6,0,1,0.3,0.303,0.56,0.1343,55,241,296 +9303,2012-01-28,1,1,1,12,0,6,0,1,0.3,0.2879,0.61,0.2836,70,244,314 +9304,2012-01-28,1,1,1,13,0,6,0,1,0.34,0.3182,0.57,0.2537,80,293,373 +9305,2012-01-28,1,1,1,14,0,6,0,1,0.38,0.3939,0.5,0.3881,123,239,362 +9306,2012-01-28,1,1,1,15,0,6,0,1,0.4,0.4091,0.47,0.3284,90,320,410 +9307,2012-01-28,1,1,1,16,0,6,0,1,0.42,0.4242,0.44,0.3582,131,276,407 +9308,2012-01-28,1,1,1,17,0,6,0,1,0.42,0.4242,0.44,0.3881,57,233,290 +9309,2012-01-28,1,1,1,18,0,6,0,1,0.42,0.4242,0.41,0.2239,40,201,241 +9310,2012-01-28,1,1,1,19,0,6,0,2,0.42,0.4242,0.41,0.5522,33,173,206 +9311,2012-01-28,1,1,1,20,0,6,0,1,0.4,0.4091,0.35,0.3881,5,115,120 +9312,2012-01-28,1,1,1,21,0,6,0,1,0.36,0.3333,0.29,0.3582,18,109,127 +9313,2012-01-28,1,1,1,22,0,6,0,1,0.34,0.3333,0.34,0.194,6,97,103 +9314,2012-01-28,1,1,1,23,0,6,0,1,0.32,0.3182,0.31,0.194,6,79,85 +9315,2012-01-29,1,1,1,0,0,0,0,1,0.3,0.2879,0.33,0.2239,15,81,96 +9316,2012-01-29,1,1,1,1,0,0,0,1,0.3,0.2727,0.28,0.3582,5,68,73 +9317,2012-01-29,1,1,1,2,0,0,0,1,0.28,0.2576,0.26,0.3284,3,48,51 +9318,2012-01-29,1,1,1,3,0,0,0,1,0.26,0.2576,0.28,0.194,4,17,21 +9319,2012-01-29,1,1,1,4,0,0,0,1,0.24,0.2576,0.3,0,0,5,5 +9320,2012-01-29,1,1,1,5,0,0,0,1,0.2,0.2121,0.51,0.1343,1,6,7 +9321,2012-01-29,1,1,1,6,0,0,0,1,0.2,0.2273,0.4,0.1045,1,4,5 +9322,2012-01-29,1,1,1,7,0,0,0,1,0.2,0.2121,0.37,0.1343,0,12,12 +9323,2012-01-29,1,1,1,8,0,0,0,1,0.18,0.197,0.4,0.1343,9,61,70 +9324,2012-01-29,1,1,1,9,0,0,0,1,0.18,0.197,0.43,0.1343,13,98,111 +9325,2012-01-29,1,1,1,10,0,0,0,1,0.24,0.2424,0.35,0.1343,38,153,191 +9326,2012-01-29,1,1,1,11,0,0,0,1,0.28,0.2727,0.28,0.2239,57,198,255 +9327,2012-01-29,1,1,1,12,0,0,0,1,0.32,0.303,0.26,0.2239,73,215,288 
+9328,2012-01-29,1,1,1,13,0,0,0,1,0.32,0.303,0.26,0.2239,78,238,316 +9329,2012-01-29,1,1,1,14,0,0,0,1,0.32,0.303,0.26,0.2537,51,229,280 +9330,2012-01-29,1,1,1,15,0,0,0,1,0.36,0.3333,0.25,0.2836,68,242,310 +9331,2012-01-29,1,1,1,16,0,0,0,1,0.36,0.3333,0.21,0.4179,54,268,322 +9332,2012-01-29,1,1,1,17,0,0,0,1,0.36,0.3333,0.21,0.4179,33,201,234 +9333,2012-01-29,1,1,1,18,0,0,0,1,0.34,0.303,0.23,0.3582,17,151,168 +9334,2012-01-29,1,1,1,19,0,0,0,1,0.32,0.303,0.29,0.2537,14,145,159 +9335,2012-01-29,1,1,1,20,0,0,0,1,0.3,0.2879,0.39,0.2239,13,98,111 +9336,2012-01-29,1,1,1,21,0,0,0,1,0.3,0.2879,0.33,0.2836,0,69,69 +9337,2012-01-29,1,1,1,22,0,0,0,1,0.32,0.2879,0.26,0.3881,7,57,64 +9338,2012-01-29,1,1,1,23,0,0,0,1,0.3,0.2727,0.33,0.3284,4,21,25 +9339,2012-01-30,1,1,1,0,0,1,1,2,0.32,0.2879,0.26,0.4179,0,10,10 +9340,2012-01-30,1,1,1,1,0,1,1,1,0.26,0.2273,0.6,0.3582,0,9,9 +9341,2012-01-30,1,1,1,2,0,1,1,1,0.24,0.2121,0.48,0.3582,1,6,7 +9342,2012-01-30,1,1,1,3,0,1,1,1,0.24,0.2273,0.48,0.194,0,4,4 +9343,2012-01-30,1,1,1,4,0,1,1,1,0.2,0.2121,0.51,0.1343,0,5,5 +9344,2012-01-30,1,1,1,5,0,1,1,1,0.2,0.2121,0.51,0.1343,0,26,26 +9345,2012-01-30,1,1,1,6,0,1,1,1,0.22,0.2121,0.44,0.2537,0,80,80 +9346,2012-01-30,1,1,1,7,0,1,1,1,0.22,0.2121,0.44,0.2836,4,221,225 +9347,2012-01-30,1,1,1,8,0,1,1,1,0.2,0.197,0.47,0.2239,12,481,493 +9348,2012-01-30,1,1,1,9,0,1,1,1,0.2,0.197,0.47,0.2239,5,193,198 +9349,2012-01-30,1,1,1,10,0,1,1,1,0.22,0.2121,0.41,0.2836,3,82,85 +9350,2012-01-30,1,1,1,11,0,1,1,1,0.26,0.2576,0.38,0,6,93,99 +9351,2012-01-30,1,1,1,12,0,1,1,1,0.26,0.2273,0.35,0.3881,5,129,134 +9352,2012-01-30,1,1,1,13,0,1,1,1,0.3,0.2879,0.31,0.194,15,117,132 +9353,2012-01-30,1,1,1,14,0,1,1,1,0.32,0.3333,0.24,0.1343,15,100,115 +9354,2012-01-30,1,1,1,15,0,1,1,1,0.32,0.3182,0.24,0.194,7,131,138 +9355,2012-01-30,1,1,1,16,0,1,1,1,0.34,0.3333,0.25,0.194,11,210,221 +9356,2012-01-30,1,1,1,17,0,1,1,1,0.36,0.3485,0.21,0.194,16,422,438 +9357,2012-01-30,1,1,1,18,0,1,1,1,0.34,0.3333,0.23,0.1642,11,399,410 
+9358,2012-01-30,1,1,1,19,0,1,1,1,0.3,0.3182,0.45,0.1045,8,298,306 +9359,2012-01-30,1,1,1,20,0,1,1,1,0.3,0.303,0.39,0.1642,2,179,181 +9360,2012-01-30,1,1,1,21,0,1,1,1,0.3,0.303,0.42,0.1642,3,159,162 +9361,2012-01-30,1,1,1,22,0,1,1,1,0.28,0.2727,0.48,0.2239,2,92,94 +9362,2012-01-30,1,1,1,23,0,1,1,1,0.26,0.2576,0.6,0.194,0,52,52 +9363,2012-01-31,1,1,1,0,0,2,1,1,0.28,0.2727,0.52,0.2537,0,15,15 +9364,2012-01-31,1,1,1,1,0,2,1,1,0.3,0.2727,0.42,0.2985,1,7,8 +9365,2012-01-31,1,1,1,2,0,2,1,1,0.3,0.2879,0.42,0.2239,0,2,2 +9366,2012-01-31,1,1,1,3,0,2,1,1,0.3,0.2879,0.45,0.2239,0,2,2 +9367,2012-01-31,1,1,1,4,0,2,1,1,0.3,0.2879,0.49,0.194,0,1,1 +9368,2012-01-31,1,1,1,5,0,2,1,1,0.3,0.2879,0.49,0.194,0,22,22 +9369,2012-01-31,1,1,1,6,0,2,1,1,0.28,0.2727,0.52,0.1642,0,104,104 +9370,2012-01-31,1,1,1,7,0,2,1,1,0.26,0.2576,0.6,0.194,6,273,279 +9371,2012-01-31,1,1,1,8,0,2,1,1,0.26,0.2727,0.6,0.1045,14,498,512 +9372,2012-01-31,1,1,1,9,0,2,1,1,0.3,0.2879,0.52,0.194,30,231,261 +9373,2012-01-31,1,1,1,10,0,2,1,2,0.32,0.3182,0.57,0.194,34,110,144 +9374,2012-01-31,1,1,1,11,0,2,1,1,0.4,0.4091,0.37,0.2836,14,125,139 +9375,2012-01-31,1,1,1,12,0,2,1,1,0.4,0.4091,0.43,0.2985,13,181,194 +9376,2012-01-31,1,1,1,13,0,2,1,1,0.46,0.4545,0.36,0.2239,27,171,198 +9377,2012-01-31,1,1,1,14,0,2,1,1,0.5,0.4848,0.29,0.3284,18,142,160 +9378,2012-01-31,1,1,1,15,0,2,1,1,0.54,0.5152,0.26,0.4478,27,150,177 +9379,2012-01-31,1,1,1,16,0,2,1,1,0.54,0.5152,0.26,0.4478,30,277,307 +9380,2012-01-31,1,1,1,17,0,2,1,1,0.54,0.5152,0.26,0.4179,41,518,559 +9381,2012-01-31,1,1,1,18,0,2,1,1,0.54,0.5152,0.24,0.3582,16,486,502 +9382,2012-01-31,1,1,1,19,0,2,1,1,0.5,0.4848,0.29,0.2836,22,306,328 +9383,2012-01-31,1,1,1,20,0,2,1,1,0.4,0.4091,0.58,0.1045,15,223,238 +9384,2012-01-31,1,1,1,21,0,2,1,1,0.46,0.4545,0.33,0.2985,3,162,165 +9385,2012-01-31,1,1,1,22,0,2,1,1,0.44,0.4394,0.35,0.2985,11,118,129 +9386,2012-01-31,1,1,1,23,0,2,1,1,0.44,0.4394,0.38,0.2537,2,61,63 +9387,2012-02-01,1,1,2,0,0,3,1,1,0.44,0.4394,0.38,0.2836,0,31,31 
+9388,2012-02-01,1,1,2,1,0,3,1,1,0.44,0.4394,0.41,0.2836,0,4,4 +9389,2012-02-01,1,1,2,2,0,3,1,1,0.44,0.4394,0.44,0.2836,1,6,7 +9390,2012-02-01,1,1,2,3,0,3,1,1,0.44,0.4394,0.44,0.194,0,3,3 +9391,2012-02-01,1,1,2,4,0,3,1,1,0.4,0.4091,0.5,0.1642,0,1,1 +9392,2012-02-01,1,1,2,5,0,3,1,1,0.4,0.4091,0.5,0.1642,0,18,18 +9393,2012-02-01,1,1,2,6,0,3,1,3,0.4,0.4091,0.54,0.1642,0,67,67 +9394,2012-02-01,1,1,2,7,0,3,1,3,0.38,0.3939,0.62,0.0896,7,201,208 +9395,2012-02-01,1,1,2,8,0,3,1,3,0.36,0.3636,0.73,0.1045,15,505,520 +9396,2012-02-01,1,1,2,9,0,3,1,2,0.36,0.3485,0.71,0.194,9,267,276 +9397,2012-02-01,1,1,2,10,0,3,1,1,0.4,0.4091,0.62,0.1343,9,124,133 +9398,2012-02-01,1,1,2,11,0,3,1,1,0.4,0.4091,0.58,0.1642,27,116,143 +9399,2012-02-01,1,1,2,12,0,3,1,1,0.44,0.4394,0.54,0.1045,26,153,179 +9400,2012-02-01,1,1,2,13,0,3,1,1,0.46,0.4545,0.47,0.2239,12,174,186 +9401,2012-02-01,1,1,2,14,0,3,1,1,0.62,0.6212,0.33,0.2239,33,159,192 +9402,2012-02-01,1,1,2,15,0,3,1,1,0.6,0.6212,0.35,0.2836,25,169,194 +9403,2012-02-01,1,1,2,16,0,3,1,1,0.6,0.6212,0.4,0.2239,22,261,283 +9404,2012-02-01,1,1,2,17,0,3,1,1,0.58,0.5455,0.43,0.194,31,508,539 +9405,2012-02-01,1,1,2,18,0,3,1,1,0.56,0.5303,0.46,0.1642,12,514,526 +9406,2012-02-01,1,1,2,19,0,3,1,1,0.54,0.5152,0.52,0.2836,20,315,335 +9407,2012-02-01,1,1,2,20,0,3,1,1,0.52,0.5,0.52,0.1642,10,254,264 +9408,2012-02-01,1,1,2,21,0,3,1,1,0.52,0.5,0.52,0.1642,16,217,233 +9409,2012-02-01,1,1,2,22,0,3,1,1,0.5,0.4848,0.55,0.194,7,133,140 +9410,2012-02-01,1,1,2,23,0,3,1,1,0.46,0.4545,0.63,0.0896,22,75,97 +9411,2012-02-02,1,1,2,0,0,4,1,1,0.46,0.4545,0.59,0.1343,7,24,31 +9412,2012-02-02,1,1,2,1,0,4,1,2,0.46,0.4545,0.55,0,3,14,17 +9413,2012-02-02,1,1,2,2,0,4,1,2,0.42,0.4242,0.67,0.1343,2,7,9 +9414,2012-02-02,1,1,2,3,0,4,1,2,0.42,0.4242,0.67,0.1343,0,2,2 +9415,2012-02-02,1,1,2,4,0,4,1,3,0.4,0.4091,0.82,0.0896,0,1,1 +9416,2012-02-02,1,1,2,5,0,4,1,3,0.4,0.4091,0.82,0,0,19,19 +9417,2012-02-02,1,1,2,6,0,4,1,3,0.36,0.3788,0.9,0,1,73,74 
+9418,2012-02-02,1,1,2,7,0,4,1,3,0.36,0.3788,0.93,0,4,201,205 +9419,2012-02-02,1,1,2,8,0,4,1,3,0.4,0.4091,0.87,0,9,436,445 +9420,2012-02-02,1,1,2,9,0,4,1,3,0.38,0.3939,0.87,0,15,213,228 +9421,2012-02-02,1,1,2,10,0,4,1,3,0.42,0.4242,0.77,0.1045,6,72,78 +9422,2012-02-02,1,1,2,11,0,4,1,3,0.4,0.4091,0.82,0.2537,6,99,105 +9423,2012-02-02,1,1,2,12,0,4,1,1,0.42,0.4242,0.82,0,8,140,148 +9424,2012-02-02,1,1,2,13,0,4,1,2,0.42,0.4242,0.77,0,13,124,137 +9425,2012-02-02,1,1,2,14,0,4,1,1,0.4,0.4091,0.76,0.1045,12,129,141 +9426,2012-02-02,1,1,2,15,0,4,1,1,0.46,0.4545,0.67,0.1642,24,150,174 +9427,2012-02-02,1,1,2,16,0,4,1,1,0.46,0.4545,0.55,0.2836,18,211,229 +9428,2012-02-02,1,1,2,17,0,4,1,1,0.42,0.4242,0.54,0.4925,13,437,450 +9429,2012-02-02,1,1,2,18,0,4,1,1,0.4,0.4091,0.43,0.4627,17,393,410 +9430,2012-02-02,1,1,2,19,0,4,1,1,0.38,0.3939,0.43,0.5224,16,311,327 +9431,2012-02-02,1,1,2,20,0,4,1,1,0.36,0.3182,0.46,0.5224,9,204,213 +9432,2012-02-02,1,1,2,21,0,4,1,1,0.34,0.303,0.46,0.4179,4,144,148 +9433,2012-02-02,1,1,2,22,0,4,1,1,0.32,0.2879,0.49,0.4179,3,106,109 +9434,2012-02-02,1,1,2,23,0,4,1,1,0.32,0.303,0.49,0.2537,0,61,61 +9435,2012-02-03,1,1,2,0,0,5,1,1,0.32,0.2879,0.53,0.3881,1,37,38 +9436,2012-02-03,1,1,2,1,0,5,1,1,0.3,0.2727,0.56,0.3284,3,19,22 +9437,2012-02-03,1,1,2,2,0,5,1,1,0.3,0.2879,0.52,0.2239,0,6,6 +9438,2012-02-03,1,1,2,3,0,5,1,1,0.28,0.2576,0.56,0.2985,0,4,4 +9439,2012-02-03,1,1,2,4,0,5,1,1,0.26,0.2576,0.6,0.2239,0,2,2 +9440,2012-02-03,1,1,2,5,0,5,1,1,0.26,0.2576,0.6,0.2239,0,17,17 +9441,2012-02-03,1,1,2,6,0,5,1,1,0.24,0.2424,0.65,0.1642,0,88,88 +9442,2012-02-03,1,1,2,7,0,5,1,1,0.24,0.2273,0.65,0.2239,10,216,226 +9443,2012-02-03,1,1,2,8,0,5,1,1,0.24,0.2273,0.65,0.2239,12,429,441 +9444,2012-02-03,1,1,2,9,0,5,1,1,0.24,0.2273,0.65,0.194,18,252,270 +9445,2012-02-03,1,1,2,10,0,5,1,1,0.28,0.2576,0.56,0.2836,15,114,129 +9446,2012-02-03,1,1,2,11,0,5,1,1,0.32,0.303,0.53,0.2985,15,149,164 +9447,2012-02-03,1,1,2,12,0,5,1,1,0.34,0.3333,0.46,0.1343,25,191,216 
+9448,2012-02-03,1,1,2,13,0,5,1,1,0.36,0.3636,0.46,0,25,173,198 +9449,2012-02-03,1,1,2,14,0,5,1,1,0.36,0.3485,0.46,0.2239,22,148,170 +9450,2012-02-03,1,1,2,15,0,5,1,1,0.38,0.3939,0.4,0,48,168,216 +9451,2012-02-03,1,1,2,16,0,5,1,1,0.4,0.4091,0.37,0.1045,36,275,311 +9452,2012-02-03,1,1,2,17,0,5,1,1,0.4,0.4091,0.37,0.0896,30,458,488 +9453,2012-02-03,1,1,2,18,0,5,1,1,0.38,0.3939,0.4,0.1343,17,368,385 +9454,2012-02-03,1,1,2,19,0,5,1,1,0.34,0.3485,0.49,0.1045,11,256,267 +9455,2012-02-03,1,1,2,20,0,5,1,1,0.34,0.3485,0.49,0.1045,8,155,163 +9456,2012-02-03,1,1,2,21,0,5,1,1,0.34,0.3485,0.49,0.1045,8,126,134 +9457,2012-02-03,1,1,2,22,0,5,1,1,0.34,0.3485,0.49,0.1045,3,116,119 +9458,2012-02-03,1,1,2,23,0,5,1,1,0.26,0.2727,0.7,0.1045,3,74,77 +9459,2012-02-04,1,1,2,0,0,6,0,1,0.26,0.2727,0.7,0.1045,10,76,86 +9460,2012-02-04,1,1,2,1,0,6,0,1,0.26,0.2727,0.7,0.1045,0,43,43 +9461,2012-02-04,1,1,2,2,0,6,0,1,0.24,0.2576,0.7,0.0896,6,38,44 +9462,2012-02-04,1,1,2,3,0,6,0,1,0.24,0.2879,0.75,0,1,17,18 +9463,2012-02-04,1,1,2,4,0,6,0,1,0.24,0.2576,0.81,0.0896,1,5,6 +9464,2012-02-04,1,1,2,5,0,6,0,1,0.24,0.2576,0.81,0.0896,1,3,4 +9465,2012-02-04,1,1,2,6,0,6,0,1,0.22,0.2273,0.87,0.1343,1,4,5 +9466,2012-02-04,1,1,2,7,0,6,0,2,0.24,0.2424,0.87,0.1343,1,27,28 +9467,2012-02-04,1,1,2,8,0,6,0,1,0.24,0.2879,0.87,0,4,86,90 +9468,2012-02-04,1,1,2,9,0,6,0,1,0.24,0.2424,0.81,0.1343,16,141,157 +9469,2012-02-04,1,1,2,10,0,6,0,1,0.24,0.2576,0.87,0.1045,31,193,224 +9470,2012-02-04,1,1,2,11,0,6,0,1,0.3,0.3333,0.7,0,46,224,270 +9471,2012-02-04,1,1,2,12,0,6,0,2,0.3,0.3182,0.7,0.0896,40,260,300 +9472,2012-02-04,1,1,2,13,0,6,0,2,0.3,0.2879,0.7,0.194,58,244,302 +9473,2012-02-04,1,1,2,14,0,6,0,2,0.32,0.303,0.66,0.2239,47,229,276 +9474,2012-02-04,1,1,2,15,0,6,0,2,0.32,0.303,0.66,0.2239,61,244,305 +9475,2012-02-04,1,1,2,16,0,6,0,2,0.34,0.3333,0.61,0.1642,20,123,143 +9476,2012-02-04,1,1,2,17,0,6,0,3,0.3,0.2879,0.7,0.194,13,54,67 +9477,2012-02-04,1,1,2,18,0,6,0,3,0.28,0.2727,0.75,0.1642,6,58,64 
+9478,2012-02-04,1,1,2,19,0,6,0,3,0.26,0.2727,0.87,0.1343,5,75,80 +9479,2012-02-04,1,1,2,20,0,6,0,3,0.24,0.2273,0.93,0.194,4,73,77 +9480,2012-02-04,1,1,2,21,0,6,0,3,0.24,0.2424,0.93,0.1343,4,74,78 +9481,2012-02-04,1,1,2,22,0,6,0,2,0.24,0.2576,0.87,0.0896,4,91,95 +9482,2012-02-04,1,1,2,23,0,6,0,2,0.24,0.2424,0.87,0.1343,4,66,70 +9483,2012-02-05,1,1,2,0,0,0,0,2,0.24,0.2424,0.87,0.1343,3,58,61 +9484,2012-02-05,1,1,2,1,0,0,0,2,0.24,0.2576,0.93,0.1045,5,46,51 +9485,2012-02-05,1,1,2,2,0,0,0,2,0.24,0.2576,0.93,0.1045,4,59,63 +9486,2012-02-05,1,1,2,3,0,0,0,3,0.24,0.2576,0.87,0.0896,4,13,17 +9487,2012-02-05,1,1,2,4,0,0,0,2,0.24,0.2424,0.87,0.1343,0,9,9 +9488,2012-02-05,1,1,2,5,0,0,0,3,0.26,0.2576,0.81,0.194,0,4,4 +9489,2012-02-05,1,1,2,6,0,0,0,2,0.26,0.2424,0.7,0.2537,0,4,4 +9490,2012-02-05,1,1,2,7,0,0,0,2,0.24,0.2424,0.75,0.1642,0,16,16 +9491,2012-02-05,1,1,2,8,0,0,0,2,0.24,0.2424,0.75,0.1642,4,58,62 +9492,2012-02-05,1,1,2,9,0,0,0,2,0.24,0.2273,0.7,0.2239,9,83,92 +9493,2012-02-05,1,1,2,10,0,0,0,2,0.26,0.2424,0.65,0.2836,10,138,148 +9494,2012-02-05,1,1,2,11,0,0,0,2,0.26,0.2576,0.7,0.194,22,160,182 +9495,2012-02-05,1,1,2,12,0,0,0,2,0.26,0.2576,0.65,0.194,25,223,248 +9496,2012-02-05,1,1,2,13,0,0,0,2,0.28,0.2727,0.61,0.1642,33,241,274 +9497,2012-02-05,1,1,2,14,0,0,0,1,0.3,0.303,0.61,0.1642,40,203,243 +9498,2012-02-05,1,1,2,15,0,0,0,2,0.3,0.2879,0.56,0.2537,55,217,272 +9499,2012-02-05,1,1,2,16,0,0,0,1,0.3,0.2879,0.56,0.2239,35,261,296 +9500,2012-02-05,1,1,2,17,0,0,0,1,0.32,0.3182,0.53,0.1642,31,275,306 +9501,2012-02-05,1,1,2,18,0,0,0,1,0.3,0.2879,0.52,0.2537,10,171,181 +9502,2012-02-05,1,1,2,19,0,0,0,1,0.3,0.2879,0.52,0.2239,6,63,69 +9503,2012-02-05,1,1,2,20,0,0,0,1,0.28,0.2727,0.56,0.194,3,47,50 +9504,2012-02-05,1,1,2,21,0,0,0,1,0.28,0.2879,0.56,0.1045,6,60,66 +9505,2012-02-05,1,1,2,22,0,0,0,1,0.26,0.2727,0.6,0.1045,9,182,191 +9506,2012-02-05,1,1,2,23,0,0,0,1,0.24,0.2424,0.7,0.1343,4,38,42 +9507,2012-02-06,1,1,2,0,0,1,1,1,0.24,0.2424,0.65,0.1343,7,14,21 
+9508,2012-02-06,1,1,2,1,0,1,1,1,0.22,0.2576,0.75,0.0896,0,6,6 +9509,2012-02-06,1,1,2,2,0,1,1,1,0.22,0.2727,0.75,0,0,3,3 +9510,2012-02-06,1,1,2,4,0,1,1,1,0.18,0.2121,0.8,0.1045,0,2,2 +9511,2012-02-06,1,1,2,5,0,1,1,1,0.18,0.2121,0.8,0.1045,0,17,17 +9512,2012-02-06,1,1,2,6,0,1,1,1,0.16,0.1818,0.86,0.1045,0,72,72 +9513,2012-02-06,1,1,2,7,0,1,1,1,0.16,0.1818,0.86,0.1045,5,229,234 +9514,2012-02-06,1,1,2,8,0,1,1,1,0.16,0.1818,0.86,0.1343,10,434,444 +9515,2012-02-06,1,1,2,9,0,1,1,1,0.18,0.2121,0.8,0.1045,8,269,277 +9516,2012-02-06,1,1,2,10,0,1,1,1,0.22,0.2424,0.75,0.1045,20,96,116 +9517,2012-02-06,1,1,2,11,0,1,1,1,0.26,0.2727,0.7,0.1343,12,104,116 +9518,2012-02-06,1,1,2,12,0,1,1,1,0.32,0.3182,0.61,0.1642,19,131,150 +9519,2012-02-06,1,1,2,13,0,1,1,1,0.38,0.3939,0.4,0.2239,17,124,141 +9520,2012-02-06,1,1,2,14,0,1,1,1,0.4,0.4091,0.35,0.1642,14,126,140 +9521,2012-02-06,1,1,2,15,0,1,1,1,0.42,0.4242,0.32,0.2836,25,146,171 +9522,2012-02-06,1,1,2,16,0,1,1,1,0.4,0.4091,0.43,0.2537,13,215,228 +9523,2012-02-06,1,1,2,17,0,1,1,1,0.4,0.4091,0.43,0.2537,13,417,430 +9524,2012-02-06,1,1,2,18,0,1,1,1,0.38,0.3939,0.43,0.2239,11,426,437 +9525,2012-02-06,1,1,2,19,0,1,1,1,0.36,0.3333,0.43,0.2537,7,284,291 +9526,2012-02-06,1,1,2,20,0,1,1,1,0.34,0.3333,0.46,0.1642,7,194,201 +9527,2012-02-06,1,1,2,21,0,1,1,1,0.32,0.3182,0.57,0.1642,11,143,154 +9528,2012-02-06,1,1,2,22,0,1,1,1,0.3,0.303,0.65,0.1343,4,83,87 +9529,2012-02-06,1,1,2,23,0,1,1,1,0.3,0.303,0.65,0.1343,3,43,46 +9530,2012-02-07,1,1,2,0,0,2,1,1,0.3,0.2879,0.7,0.194,1,22,23 +9531,2012-02-07,1,1,2,1,0,2,1,1,0.28,0.2879,0.61,0.1343,8,6,14 +9532,2012-02-07,1,1,2,2,0,2,1,1,0.3,0.3333,0.61,0,0,4,4 +9533,2012-02-07,1,1,2,3,0,2,1,2,0.3,0.3333,0.7,0,1,1,2 +9534,2012-02-07,1,1,2,4,0,2,1,2,0.3,0.3333,0.7,0,0,3,3 +9535,2012-02-07,1,1,2,5,0,2,1,1,0.32,0.3333,0.49,0.0896,0,15,15 +9536,2012-02-07,1,1,2,6,0,2,1,1,0.32,0.303,0.45,0.2985,2,98,100 +9537,2012-02-07,1,1,2,7,0,2,1,1,0.32,0.3485,0.45,0,6,293,299 
+9538,2012-02-07,1,1,2,8,0,2,1,1,0.3,0.3182,0.49,0,7,522,529 +9539,2012-02-07,1,1,2,9,0,2,1,1,0.32,0.3333,0.49,0,11,278,289 +9540,2012-02-07,1,1,2,10,0,2,1,1,0.34,0.3333,0.46,0.1642,12,104,116 +9541,2012-02-07,1,1,2,11,0,2,1,1,0.36,0.3485,0.46,0.194,9,120,129 +9542,2012-02-07,1,1,2,12,0,2,1,1,0.4,0.4091,0.4,0.3284,9,163,172 +9543,2012-02-07,1,1,2,13,0,2,1,1,0.44,0.4394,0.38,0.2537,15,169,184 +9544,2012-02-07,1,1,2,14,0,2,1,1,0.44,0.4394,0.38,0.2537,11,127,138 +9545,2012-02-07,1,1,2,15,0,2,1,1,0.44,0.4394,0.38,0.194,13,133,146 +9546,2012-02-07,1,1,2,16,0,2,1,1,0.44,0.4394,0.35,0.2537,19,257,276 +9547,2012-02-07,1,1,2,17,0,2,1,1,0.44,0.4394,0.35,0.194,15,494,509 +9548,2012-02-07,1,1,2,18,0,2,1,1,0.42,0.4242,0.38,0.194,10,508,518 +9549,2012-02-07,1,1,2,19,0,2,1,1,0.38,0.3939,0.43,0.2239,11,327,338 +9550,2012-02-07,1,1,2,20,0,2,1,1,0.36,0.3485,0.5,0.1343,6,206,212 +9551,2012-02-07,1,1,2,21,0,2,1,1,0.34,0.3485,0.61,0.1045,20,170,190 +9552,2012-02-07,1,1,2,22,0,2,1,1,0.32,0.3182,0.57,0.194,11,107,118 +9553,2012-02-07,1,1,2,23,0,2,1,1,0.32,0.3333,0.57,0.1343,2,49,51 +9554,2012-02-08,1,1,2,0,0,3,1,1,0.3,0.2879,0.56,0.2239,2,21,23 +9555,2012-02-08,1,1,2,1,0,3,1,1,0.28,0.2727,0.61,0.1642,1,9,10 +9556,2012-02-08,1,1,2,2,0,3,1,1,0.26,0.303,0.65,0,0,1,1 +9557,2012-02-08,1,1,2,3,0,3,1,1,0.26,0.303,0.65,0,0,1,1 +9558,2012-02-08,1,1,2,4,0,3,1,1,0.24,0.2576,0.7,0.0896,0,1,1 +9559,2012-02-08,1,1,2,5,0,3,1,1,0.24,0.2576,0.7,0.0896,0,20,20 +9560,2012-02-08,1,1,2,6,0,3,1,2,0.24,0.2879,0.7,0,3,95,98 +9561,2012-02-08,1,1,2,7,0,3,1,2,0.24,0.2576,0.7,0.1045,5,276,281 +9562,2012-02-08,1,1,2,8,0,3,1,2,0.24,0.2273,0.7,0.2239,13,495,508 +9563,2012-02-08,1,1,2,9,0,3,1,2,0.24,0.2576,0.7,0.1642,8,245,253 +9564,2012-02-08,1,1,2,10,0,3,1,2,0.26,0.2576,0.7,0.2239,1,107,108 +9565,2012-02-08,1,1,2,11,0,3,1,2,0.26,0.2576,0.65,0.2239,5,97,102 +9566,2012-02-08,1,1,2,12,0,3,1,2,0.28,0.2576,0.65,0.2836,14,124,138 +9567,2012-02-08,1,1,2,13,0,3,1,2,0.28,0.2879,0.61,0.1045,15,111,126 
+9568,2012-02-08,1,1,2,14,0,3,1,2,0.3,0.303,0.56,0.1642,8,100,108 +9569,2012-02-08,1,1,2,15,0,3,1,3,0.3,0.3182,0.61,0.1045,5,49,54 +9570,2012-02-08,1,1,2,16,0,3,1,3,0.26,0.2727,0.81,0.1045,2,73,75 +9571,2012-02-08,1,1,2,17,0,3,1,3,0.24,0.2576,0.87,0.1045,8,147,155 +9572,2012-02-08,1,1,2,18,0,3,1,3,0.24,0.2576,0.87,0.1045,4,163,167 +9573,2012-02-08,1,1,2,19,0,3,1,3,0.24,0.2576,0.87,0.1045,3,158,161 +9574,2012-02-08,1,1,2,20,0,3,1,3,0.24,0.2424,0.87,0.1343,3,122,125 +9575,2012-02-08,1,1,2,21,0,3,1,3,0.24,0.2273,0.87,0.194,3,126,129 +9576,2012-02-08,1,1,2,22,0,3,1,2,0.24,0.2424,0.87,0.1343,3,95,98 +9577,2012-02-08,1,1,2,23,0,3,1,2,0.24,0.2424,0.87,0.1642,3,57,60 +9578,2012-02-09,1,1,2,0,0,4,1,2,0.24,0.2424,0.87,0.1642,3,23,26 +9579,2012-02-09,1,1,2,1,0,4,1,2,0.24,0.2424,0.87,0.1343,2,10,12 +9580,2012-02-09,1,1,2,2,0,4,1,1,0.22,0.2273,0.8,0.1343,0,2,2 +9581,2012-02-09,1,1,2,3,0,4,1,1,0.22,0.2121,0.8,0.2836,0,4,4 +9582,2012-02-09,1,1,2,4,0,4,1,2,0.22,0.197,0.75,0.4478,0,1,1 +9583,2012-02-09,1,1,2,5,0,4,1,1,0.2,0.1818,0.75,0.2836,0,18,18 +9584,2012-02-09,1,1,2,6,0,4,1,1,0.2,0.1818,0.75,0.2836,1,75,76 +9585,2012-02-09,1,1,2,7,0,4,1,1,0.2,0.197,0.69,0.2537,4,263,267 +9586,2012-02-09,1,1,2,8,0,4,1,1,0.2,0.197,0.64,0.2537,11,473,484 +9587,2012-02-09,1,1,2,9,0,4,1,1,0.2,0.197,0.59,0.2239,9,229,238 +9588,2012-02-09,1,1,2,10,0,4,1,1,0.22,0.2121,0.51,0.2239,7,90,97 +9589,2012-02-09,1,1,2,11,0,4,1,1,0.24,0.2273,0.52,0.2239,4,95,99 +9590,2012-02-09,1,1,2,12,0,4,1,1,0.26,0.2273,0.48,0.3284,10,148,158 +9591,2012-02-09,1,1,2,13,0,4,1,1,0.3,0.2727,0.42,0.3284,10,144,154 +9592,2012-02-09,1,1,2,14,0,4,1,1,0.32,0.303,0.39,0.2985,18,126,144 +9593,2012-02-09,1,1,2,15,0,4,1,1,0.32,0.303,0.36,0.2985,14,120,134 +9594,2012-02-09,1,1,2,16,0,4,1,1,0.34,0.3333,0.36,0,15,223,238 +9595,2012-02-09,1,1,2,17,0,4,1,1,0.34,0.3333,0.34,0.1343,12,387,399 +9596,2012-02-09,1,1,2,18,0,4,1,1,0.34,0.3333,0.31,0.1642,13,404,417 +9597,2012-02-09,1,1,2,19,0,4,1,1,0.32,0.3333,0.33,0.1045,8,287,295 
+9598,2012-02-09,1,1,2,20,0,4,1,1,0.32,0.3485,0.36,0,2,197,199 +9599,2012-02-09,1,1,2,21,0,4,1,1,0.3,0.3182,0.52,0.0896,8,169,177 +9600,2012-02-09,1,1,2,22,0,4,1,2,0.3,0.3333,0.56,0,3,104,107 +9601,2012-02-09,1,1,2,23,0,4,1,2,0.3,0.3333,0.52,0,9,75,84 +9602,2012-02-10,1,1,2,0,0,5,1,2,0.28,0.303,0.65,0.0896,4,39,43 +9603,2012-02-10,1,1,2,1,0,5,1,2,0.28,0.303,0.61,0.0896,2,11,13 +9604,2012-02-10,1,1,2,2,0,5,1,1,0.26,0.2879,0.52,0.0896,7,9,16 +9605,2012-02-10,1,1,2,3,0,5,1,1,0.26,0.2879,0.56,0.0896,0,5,5 +9606,2012-02-10,1,1,2,4,0,5,1,1,0.26,0.2879,0.56,0.0896,0,2,2 +9607,2012-02-10,1,1,2,5,0,5,1,1,0.24,0.2273,0.52,0.2239,0,17,17 +9608,2012-02-10,1,1,2,6,0,5,1,1,0.2,0.197,0.59,0.194,3,61,64 +9609,2012-02-10,1,1,2,7,0,5,1,1,0.2,0.2121,0.59,0.1642,5,225,230 +9610,2012-02-10,1,1,2,8,0,5,1,1,0.2,0.2121,0.59,0.1642,1,447,448 +9611,2012-02-10,1,1,2,9,0,5,1,1,0.22,0.2273,0.55,0.1642,10,265,275 +9612,2012-02-10,1,1,2,10,0,5,1,2,0.26,0.2727,0.48,0.1343,5,119,124 +9613,2012-02-10,1,1,2,11,0,5,1,2,0.28,0.2879,0.45,0.1045,19,134,153 +9614,2012-02-10,1,1,2,12,0,5,1,2,0.3,0.303,0.42,0.1343,8,168,176 +9615,2012-02-10,1,1,2,13,0,5,1,2,0.3,0.3182,0.42,0.0896,22,170,192 +9616,2012-02-10,1,1,2,14,0,5,1,2,0.32,0.3333,0.39,0.0896,8,138,146 +9617,2012-02-10,1,1,2,15,0,5,1,2,0.32,0.3333,0.45,0.1343,27,164,191 +9618,2012-02-10,1,1,2,16,0,5,1,2,0.34,0.3182,0.42,0.2537,28,240,268 +9619,2012-02-10,1,1,2,17,0,5,1,2,0.34,0.3333,0.42,0.194,15,381,396 +9620,2012-02-10,1,1,2,18,0,5,1,3,0.3,0.303,0.61,0.1343,14,345,359 +9621,2012-02-10,1,1,2,19,0,5,1,2,0.32,0.3485,0.57,0,16,246,262 +9622,2012-02-10,1,1,2,20,0,5,1,2,0.32,0.3485,0.57,0,7,144,151 +9623,2012-02-10,1,1,2,21,0,5,1,2,0.32,0.3333,0.66,0.0896,14,108,122 +9624,2012-02-10,1,1,2,22,0,5,1,2,0.32,0.3485,0.66,0,11,102,113 +9625,2012-02-10,1,1,2,23,0,5,1,2,0.3,0.3182,0.7,0.0896,1,64,65 +9626,2012-02-11,1,1,2,0,0,6,0,2,0.3,0.303,0.7,0.1343,3,50,53 +9627,2012-02-11,1,1,2,1,0,6,0,3,0.28,0.2879,0.81,0.1343,2,43,45 
+9628,2012-02-11,1,1,2,2,0,6,0,3,0.26,0.2576,0.81,0.194,2,24,26 +9629,2012-02-11,1,1,2,3,0,6,0,3,0.24,0.2273,0.87,0.2239,0,9,9 +9630,2012-02-11,1,1,2,4,0,6,0,3,0.24,0.2273,0.87,0.2239,0,4,4 +9631,2012-02-11,1,1,2,5,0,6,0,3,0.22,0.2273,0.93,0.1642,0,2,2 +9632,2012-02-11,1,1,2,6,0,6,0,3,0.22,0.2273,0.93,0.1343,1,8,9 +9633,2012-02-11,1,1,2,7,0,6,0,3,0.22,0.2273,0.93,0.1343,0,19,19 +9634,2012-02-11,1,1,2,8,0,6,0,3,0.22,0.2273,0.93,0.1343,1,77,78 +9635,2012-02-11,1,1,2,9,0,6,0,3,0.22,0.2424,0.93,0.1045,5,85,90 +9636,2012-02-11,1,1,2,10,0,6,0,3,0.22,0.2424,0.93,0.1045,23,143,166 +9637,2012-02-11,1,1,2,11,0,6,0,2,0.24,0.2576,0.81,0.1045,26,157,183 +9638,2012-02-11,1,1,2,12,0,6,0,2,0.26,0.2576,0.81,0.1642,25,196,221 +9639,2012-02-11,1,1,2,13,0,6,0,2,0.26,0.2424,0.81,0.2836,18,217,235 +9640,2012-02-11,1,1,2,14,0,6,0,2,0.26,0.2424,0.81,0.2537,38,205,243 +9641,2012-02-11,1,1,2,15,0,6,0,2,0.3,0.2727,0.61,0.2985,12,112,124 +9642,2012-02-11,1,1,2,16,0,6,0,3,0.22,0.197,0.75,0.4179,12,134,146 +9643,2012-02-11,1,1,2,17,0,6,0,3,0.22,0.1818,0.69,0.4627,13,131,144 +9644,2012-02-11,1,1,2,18,0,6,0,2,0.22,0.1818,0.47,0.6567,3,105,108 +9645,2012-02-11,1,1,2,19,0,6,0,2,0.2,0.1515,0.4,0.5522,2,85,87 +9646,2012-02-11,1,1,2,20,0,6,0,2,0.16,0.1212,0.43,0.5522,1,62,63 +9647,2012-02-11,1,1,2,21,0,6,0,1,0.14,0.0758,0.43,0.6418,5,43,48 +9648,2012-02-11,1,1,2,22,0,6,0,2,0.14,0.1061,0.39,0.3881,0,46,46 +9649,2012-02-11,1,1,2,23,0,6,0,3,0.12,0.0758,0.5,0.4925,0,20,20 +9650,2012-02-12,1,1,2,0,0,0,0,3,0.1,0.0758,0.68,0.3881,0,21,21 +9651,2012-02-12,1,1,2,1,0,0,0,3,0.08,0.0455,0.79,0.4627,0,24,24 +9652,2012-02-12,1,1,2,2,0,0,0,2,0.1,0.0606,0.58,0.5821,1,26,27 +9653,2012-02-12,1,1,2,3,0,0,0,2,0.1,0.0455,0.46,0.6866,0,14,14 +9654,2012-02-12,1,1,2,4,0,0,0,2,0.1,0.0455,0.46,0.7164,0,1,1 +9655,2012-02-12,1,1,2,5,0,0,0,1,0.1,0.0758,0.49,0.3881,0,3,3 +9656,2012-02-12,1,1,2,6,0,0,0,1,0.1,0.0758,0.49,0.3881,0,2,2 +9657,2012-02-12,1,1,2,7,0,0,0,1,0.08,0.0909,0.53,0.194,0,18,18 
+9658,2012-02-12,1,1,2,8,0,0,0,1,0.08,0.0758,0.53,0.2537,0,26,26 +9659,2012-02-12,1,1,2,9,0,0,0,1,0.1,0.0909,0.49,0.2985,3,60,63 +9660,2012-02-12,1,1,2,10,0,0,0,1,0.12,0.1061,0.42,0.3582,8,83,91 +9661,2012-02-12,1,1,2,11,0,0,0,1,0.12,0.1061,0.42,0.2985,3,121,124 +9662,2012-02-12,1,1,2,12,0,0,0,1,0.14,0.0758,0.39,0.6418,7,133,140 +9663,2012-02-12,1,1,2,13,0,0,0,1,0.16,0.1212,0.4,0.4478,10,128,138 +9664,2012-02-12,1,1,2,14,0,0,0,1,0.16,0.1212,0.4,0.5224,9,102,111 +9665,2012-02-12,1,1,2,15,0,0,0,1,0.16,0.1212,0.4,0.5522,5,150,155 +9666,2012-02-12,1,1,2,16,0,0,0,1,0.2,0.1667,0.34,0.4627,16,148,164 +9667,2012-02-12,1,1,2,17,0,0,0,1,0.18,0.1515,0.37,0.4179,4,83,87 +9668,2012-02-12,1,1,2,18,0,0,0,1,0.16,0.1212,0.4,0.4627,4,92,96 +9669,2012-02-12,1,1,2,19,0,0,0,1,0.16,0.1364,0.4,0.2836,2,78,80 +9670,2012-02-12,1,1,2,20,0,0,0,1,0.14,0.1212,0.42,0.3284,1,56,57 +9671,2012-02-12,1,1,2,21,0,0,0,1,0.14,0.1212,0.43,0.2985,0,36,36 +9672,2012-02-12,1,1,2,22,0,0,0,1,0.14,0.1515,0.43,0.1642,0,28,28 +9673,2012-02-12,1,1,2,23,0,0,0,1,0.14,0.1364,0.43,0.2239,0,23,23 +9674,2012-02-13,1,1,2,0,0,1,1,1,0.14,0.1364,0.46,0.2239,0,9,9 +9675,2012-02-13,1,1,2,1,0,1,1,1,0.12,0.1212,0.5,0.2239,0,6,6 +9676,2012-02-13,1,1,2,2,0,1,1,1,0.14,0.1364,0.46,0.194,0,4,4 +9677,2012-02-13,1,1,2,3,0,1,1,1,0.12,0.1515,0.54,0.1045,0,1,1 +9678,2012-02-13,1,1,2,4,0,1,1,1,0.12,0.1667,0.54,0.0896,0,2,2 +9679,2012-02-13,1,1,2,5,0,1,1,1,0.1,0.1212,0.58,0.1343,0,17,17 +9680,2012-02-13,1,1,2,6,0,1,1,1,0.1,0.1061,0.54,0.2239,1,71,72 +9681,2012-02-13,1,1,2,7,0,1,1,1,0.1,0.1061,0.54,0.194,1,194,195 +9682,2012-02-13,1,1,2,8,0,1,1,1,0.1,0.1212,0.54,0.1343,4,413,417 +9683,2012-02-13,1,1,2,9,0,1,1,1,0.12,0.1364,0.5,0.194,7,198,205 +9684,2012-02-13,1,1,2,10,0,1,1,1,0.14,0.1515,0.5,0.1343,1,70,71 +9685,2012-02-13,1,1,2,11,0,1,1,1,0.22,0.2273,0.37,0.194,5,85,90 +9686,2012-02-13,1,1,2,12,0,1,1,1,0.28,0.2576,0.28,0.3284,7,111,118 +9687,2012-02-13,1,1,2,13,0,1,1,1,0.32,0.303,0.22,0.2239,11,129,140 
+9688,2012-02-13,1,1,2,14,0,1,1,1,0.34,0.3485,0.23,0.1045,10,125,135 +9689,2012-02-13,1,1,2,15,0,1,1,1,0.36,0.3485,0.27,0.1343,10,145,155 +9690,2012-02-13,1,1,2,16,0,1,1,1,0.36,0.3485,0.23,0.194,5,206,211 +9691,2012-02-13,1,1,2,17,0,1,1,1,0.36,0.3485,0.23,0.194,8,399,407 +9692,2012-02-13,1,1,2,18,0,1,1,1,0.34,0.3182,0.25,0.2537,6,399,405 +9693,2012-02-13,1,1,2,19,0,1,1,1,0.34,0.3333,0.25,0.1343,7,256,263 +9694,2012-02-13,1,1,2,20,0,1,1,1,0.3,0.303,0.42,0.1642,4,210,214 +9695,2012-02-13,1,1,2,21,0,1,1,1,0.28,0.2879,0.45,0.1343,2,145,147 +9696,2012-02-13,1,1,2,22,0,1,1,1,0.26,0.2727,0.56,0.1045,3,99,102 +9697,2012-02-13,1,1,2,23,0,1,1,1,0.28,0.3182,0.41,0,2,34,36 +9698,2012-02-14,1,1,2,0,0,2,1,2,0.26,0.303,0.6,0,0,14,14 +9699,2012-02-14,1,1,2,1,0,2,1,2,0.26,0.303,0.56,0,0,6,6 +9700,2012-02-14,1,1,2,2,0,2,1,2,0.28,0.3182,0.41,0,0,3,3 +9701,2012-02-14,1,1,2,3,0,2,1,2,0.26,0.303,0.52,0,0,3,3 +9702,2012-02-14,1,1,2,4,0,2,1,2,0.26,0.2727,0.65,0.1045,0,2,2 +9703,2012-02-14,1,1,2,5,0,2,1,2,0.26,0.2727,0.65,0.1045,2,20,22 +9704,2012-02-14,1,1,2,6,0,2,1,2,0.26,0.303,0.56,0,1,89,90 +9705,2012-02-14,1,1,2,7,0,2,1,2,0.24,0.2576,0.6,0.0896,4,276,280 +9706,2012-02-14,1,1,2,8,0,2,1,2,0.24,0.2879,0.7,0,3,510,513 +9707,2012-02-14,1,1,2,9,0,2,1,2,0.26,0.2879,0.56,0.0896,6,256,262 +9708,2012-02-14,1,1,2,10,0,2,1,2,0.26,0.2727,0.6,0.1343,8,90,98 +9709,2012-02-14,1,1,2,11,0,2,1,2,0.3,0.303,0.45,0.1642,5,107,112 +9710,2012-02-14,1,1,2,12,0,2,1,2,0.3,0.2879,0.52,0.2239,10,162,172 +9711,2012-02-14,1,1,2,13,0,2,1,2,0.36,0.3333,0.4,0.2537,6,167,173 +9712,2012-02-14,1,1,2,14,0,2,1,1,0.4,0.4091,0.4,0.2239,13,111,124 +9713,2012-02-14,1,1,2,15,0,2,1,2,0.42,0.4242,0.38,0.4478,7,138,145 +9714,2012-02-14,1,1,2,16,0,2,1,1,0.42,0.4242,0.35,0.2985,15,251,266 +9715,2012-02-14,1,1,2,17,0,2,1,1,0.42,0.4242,0.38,0.3582,19,504,523 +9716,2012-02-14,1,1,2,18,0,2,1,2,0.38,0.3939,0.46,0.2985,8,398,406 +9717,2012-02-14,1,1,2,19,0,2,1,2,0.36,0.3485,0.5,0.1642,11,270,281 
+9718,2012-02-14,1,1,2,20,0,2,1,2,0.38,0.3939,0.46,0.1343,6,178,184 +9719,2012-02-14,1,1,2,21,0,2,1,2,0.36,0.3636,0.5,0.1045,4,105,109 +9720,2012-02-14,1,1,2,22,0,2,1,2,0.36,0.3636,0.5,0.1045,5,72,77 +9721,2012-02-14,1,1,2,23,0,2,1,2,0.36,0.3636,0.5,0.0896,2,55,57 +9722,2012-02-15,1,1,2,0,0,3,1,1,0.32,0.3485,0.57,0,0,22,22 +9723,2012-02-15,1,1,2,1,0,3,1,1,0.34,0.3636,0.53,0,0,5,5 +9724,2012-02-15,1,1,2,2,0,3,1,1,0.3,0.3333,0.61,0,0,4,4 +9725,2012-02-15,1,1,2,3,0,3,1,1,0.28,0.3182,0.75,0,0,3,3 +9726,2012-02-15,1,1,2,4,0,3,1,1,0.28,0.2879,0.7,0.1045,0,1,1 +9727,2012-02-15,1,1,2,5,0,3,1,1,0.28,0.2879,0.7,0.1045,0,25,25 +9728,2012-02-15,1,1,2,6,0,3,1,1,0.3,0.303,0.61,0.1343,3,92,95 +9729,2012-02-15,1,1,2,7,0,3,1,2,0.3,0.2879,0.65,0.194,3,318,321 +9730,2012-02-15,1,1,2,8,0,3,1,1,0.32,0.2879,0.61,0.3582,7,508,515 +9731,2012-02-15,1,1,2,9,0,3,1,1,0.32,0.303,0.61,0.3284,5,226,231 +9732,2012-02-15,1,1,2,10,0,3,1,1,0.34,0.303,0.57,0.2985,12,108,120 +9733,2012-02-15,1,1,2,11,0,3,1,1,0.36,0.3333,0.53,0.2836,13,125,138 +9734,2012-02-15,1,1,2,12,0,3,1,1,0.38,0.3939,0.46,0.3881,11,152,163 +9735,2012-02-15,1,1,2,13,0,3,1,1,0.4,0.4091,0.43,0.2836,11,159,170 +9736,2012-02-15,1,1,2,14,0,3,1,1,0.42,0.4242,0.41,0.3582,6,128,134 +9737,2012-02-15,1,1,2,15,0,3,1,1,0.42,0.4242,0.41,0.2239,7,164,171 +9738,2012-02-15,1,1,2,16,0,3,1,1,0.42,0.4242,0.38,0.194,9,222,231 +9739,2012-02-15,1,1,2,17,0,3,1,1,0.42,0.4242,0.38,0.194,17,470,487 +9740,2012-02-15,1,1,2,18,0,3,1,1,0.4,0.4091,0.4,0.2239,9,459,468 +9741,2012-02-15,1,1,2,19,0,3,1,1,0.38,0.3939,0.43,0.1642,9,302,311 +9742,2012-02-15,1,1,2,20,0,3,1,1,0.36,0.3485,0.46,0.1642,4,207,211 +9743,2012-02-15,1,1,2,21,0,3,1,1,0.34,0.3333,0.53,0.1642,8,170,178 +9744,2012-02-15,1,1,2,22,0,3,1,1,0.34,0.3485,0.49,0.1045,3,104,107 +9745,2012-02-15,1,1,2,23,0,3,1,1,0.34,0.3485,0.53,0.0896,4,54,58 +9746,2012-02-16,1,1,2,0,0,4,1,2,0.3,0.303,0.7,0.1343,1,22,23 +9747,2012-02-16,1,1,2,1,0,4,1,2,0.3,0.3333,0.7,0,0,5,5 
+9748,2012-02-16,1,1,2,2,0,4,1,2,0.3,0.3182,0.7,0.0896,1,6,7 +9749,2012-02-16,1,1,2,3,0,4,1,2,0.3,0.3333,0.61,0,0,1,1 +9750,2012-02-16,1,1,2,4,0,4,1,3,0.3,0.3182,0.7,0.0896,0,3,3 +9751,2012-02-16,1,1,2,5,0,4,1,2,0.3,0.3333,0.7,0,0,20,20 +9752,2012-02-16,1,1,2,6,0,4,1,2,0.3,0.3182,0.7,0.0896,4,83,87 +9753,2012-02-16,1,1,2,7,0,4,1,2,0.3,0.303,0.7,0.1343,3,285,288 +9754,2012-02-16,1,1,2,8,0,4,1,2,0.3,0.3182,0.7,0.0896,9,489,498 +9755,2012-02-16,1,1,2,9,0,4,1,2,0.3,0.303,0.7,0.1642,3,199,202 +9756,2012-02-16,1,1,2,10,0,4,1,3,0.32,0.3333,0.7,0.1343,2,42,44 +9757,2012-02-16,1,1,2,11,0,4,1,3,0.32,0.3333,0.7,0.1045,2,69,71 +9758,2012-02-16,1,1,2,12,0,4,1,3,0.32,0.3333,0.76,0.0896,1,44,45 +9759,2012-02-16,1,1,2,13,0,4,1,3,0.34,0.3182,0.71,0.2537,2,62,64 +9760,2012-02-16,1,1,2,14,0,4,1,3,0.32,0.3182,0.81,0.194,1,30,31 +9761,2012-02-16,1,1,2,15,0,4,1,3,0.34,0.3636,0.76,0,3,50,53 +9762,2012-02-16,1,1,2,16,0,4,1,3,0.34,0.3636,0.81,0,8,110,118 +9763,2012-02-16,1,1,2,17,0,4,1,3,0.34,0.3636,0.87,0,8,281,289 +9764,2012-02-16,1,1,2,18,0,4,1,3,0.34,0.3333,0.87,0.1343,8,345,353 +9765,2012-02-16,1,1,2,19,0,4,1,2,0.32,0.3182,0.87,0.1642,4,254,258 +9766,2012-02-16,1,1,2,20,0,4,1,2,0.32,0.3333,0.87,0.1343,5,211,216 +9767,2012-02-16,1,1,2,21,0,4,1,2,0.34,0.3485,0.81,0.1045,3,142,145 +9768,2012-02-16,1,1,2,22,0,4,1,2,0.32,0.3485,0.81,0,3,108,111 +9769,2012-02-16,1,1,2,23,0,4,1,1,0.32,0.3333,0.81,0.0896,3,70,73 +9770,2012-02-17,1,1,2,0,0,5,1,2,0.3,0.3333,0.87,0,2,32,34 +9771,2012-02-17,1,1,2,1,0,5,1,2,0.3,0.3333,0.93,0,2,16,18 +9772,2012-02-17,1,1,2,2,0,5,1,2,0.3,0.3182,0.93,0.1045,1,11,12 +9773,2012-02-17,1,1,2,3,0,5,1,2,0.28,0.2879,0.93,0.1045,0,1,1 +9774,2012-02-17,1,1,2,4,0,5,1,2,0.28,0.2727,0.93,0.194,0,1,1 +9775,2012-02-17,1,1,2,5,0,5,1,2,0.28,0.2727,0.93,0.1642,0,16,16 +9776,2012-02-17,1,1,2,6,0,5,1,2,0.28,0.2879,0.96,0.1343,1,70,71 +9777,2012-02-17,1,1,2,7,0,5,1,2,0.28,0.2879,0.93,0.1045,4,222,226 +9778,2012-02-17,1,1,2,8,0,5,1,2,0.26,0.303,1,0,4,516,520 
+9779,2012-02-17,1,1,2,9,0,5,1,2,0.26,0.303,1,0,8,275,283 +9780,2012-02-17,1,1,2,10,0,5,1,1,0.34,0.3182,0.71,0.2836,13,103,116 +9781,2012-02-17,1,1,2,11,0,5,1,1,0.36,0.3333,0.62,0.4179,9,133,142 +9782,2012-02-17,1,1,2,12,0,5,1,1,0.38,0.3939,0.46,0.3881,21,177,198 +9783,2012-02-17,1,1,2,13,0,5,1,1,0.4,0.4091,0.43,0.3582,34,171,205 +9784,2012-02-17,1,1,2,14,0,5,1,1,0.4,0.4091,0.37,0.3582,33,137,170 +9785,2012-02-17,1,1,2,15,0,5,1,1,0.42,0.4242,0.35,0.4179,41,180,221 +9786,2012-02-17,1,1,2,16,0,5,1,1,0.42,0.4242,0.32,0.2985,56,251,307 +9787,2012-02-17,1,1,2,17,0,5,1,1,0.42,0.4242,0.32,0.3284,36,429,465 +9788,2012-02-17,1,1,2,18,0,5,1,1,0.4,0.4091,0.35,0.2537,15,362,377 +9789,2012-02-17,1,1,2,19,0,5,1,1,0.38,0.3939,0.37,0.2239,16,266,282 +9790,2012-02-17,1,1,2,20,0,5,1,1,0.4,0.4091,0.35,0.194,13,158,171 +9791,2012-02-17,1,1,2,21,0,5,1,1,0.38,0.3939,0.37,0.194,15,114,129 +9792,2012-02-17,1,1,2,22,0,5,1,1,0.36,0.3485,0.4,0.2239,15,92,107 +9793,2012-02-17,1,1,2,23,0,5,1,1,0.36,0.3485,0.4,0.194,10,72,82 +9794,2012-02-18,1,1,2,0,0,6,0,1,0.34,0.3333,0.42,0.1642,5,81,86 +9795,2012-02-18,1,1,2,1,0,6,0,1,0.32,0.3182,0.49,0.1642,7,38,45 +9796,2012-02-18,1,1,2,2,0,6,0,1,0.3,0.2879,0.61,0.2537,6,39,45 +9797,2012-02-18,1,1,2,3,0,6,0,1,0.28,0.2879,0.65,0.1045,3,15,18 +9798,2012-02-18,1,1,2,4,0,6,0,1,0.26,0.303,0.7,0,3,4,7 +9799,2012-02-18,1,1,2,5,0,6,0,1,0.24,0.2576,0.7,0.0896,0,1,1 +9800,2012-02-18,1,1,2,6,0,6,0,1,0.24,0.2879,0.7,0,1,8,9 +9801,2012-02-18,1,1,2,7,0,6,0,1,0.24,0.2879,0.7,0,8,33,41 +9802,2012-02-18,1,1,2,8,0,6,0,1,0.22,0.2727,0.87,0,10,92,102 +9803,2012-02-18,1,1,2,9,0,6,0,1,0.22,0.2424,0.8,0.1045,22,116,138 +9804,2012-02-18,1,1,2,10,0,6,0,1,0.26,0.2727,0.81,0.1045,48,157,205 +9805,2012-02-18,1,1,2,11,0,6,0,1,0.32,0.3333,0.57,0.1343,102,184,286 +9806,2012-02-18,1,1,2,12,0,6,0,1,0.36,0.3485,0.43,0.194,128,241,369 +9807,2012-02-18,1,1,2,13,0,6,0,1,0.4,0.4091,0.43,0.2836,165,219,384 +9808,2012-02-18,1,1,2,14,0,6,0,1,0.44,0.4394,0.35,0.3582,183,244,427 
+9809,2012-02-18,1,1,2,15,0,6,0,1,0.42,0.4242,0.41,0.2537,229,270,499 +9810,2012-02-18,1,1,2,16,0,6,0,1,0.44,0.4394,0.44,0.3284,186,218,404 +9811,2012-02-18,1,1,2,17,0,6,0,1,0.48,0.4697,0.36,0.2836,142,222,364 +9812,2012-02-18,1,1,2,18,0,6,0,1,0.46,0.4545,0.38,0.2985,70,176,246 +9813,2012-02-18,1,1,2,19,0,6,0,1,0.44,0.4394,0.44,0.1642,38,150,188 +9814,2012-02-18,1,1,2,20,0,6,0,1,0.44,0.4394,0.44,0.1642,25,123,148 +9815,2012-02-18,1,1,2,21,0,6,0,1,0.44,0.4394,0.33,0.2985,25,89,114 +9816,2012-02-18,1,1,2,22,0,6,0,1,0.4,0.4091,0.4,0.4478,17,101,118 +9817,2012-02-18,1,1,2,23,0,6,0,2,0.36,0.3333,0.4,0.3881,12,62,74 +9818,2012-02-19,1,1,2,0,0,0,0,2,0.34,0.303,0.39,0.3582,9,65,74 +9819,2012-02-19,1,1,2,1,0,0,0,2,0.32,0.303,0.39,0.2537,10,62,72 +9820,2012-02-19,1,1,2,2,0,0,0,2,0.3,0.2879,0.36,0.2836,3,45,48 +9821,2012-02-19,1,1,2,3,0,0,0,2,0.3,0.2879,0.36,0.2239,3,12,15 +9822,2012-02-19,1,1,2,4,0,0,0,2,0.28,0.2576,0.41,0.3881,0,3,3 +9823,2012-02-19,1,1,2,5,0,0,0,2,0.26,0.2273,0.48,0.2985,1,1,2 +9824,2012-02-19,1,1,2,6,0,0,0,1,0.24,0.2121,0.52,0.3284,0,5,5 +9825,2012-02-19,1,1,2,7,0,0,0,2,0.24,0.2121,0.52,0.2836,2,12,14 +9826,2012-02-19,1,1,2,8,0,0,0,2,0.24,0.2273,0.56,0.194,8,48,56 +9827,2012-02-19,1,1,2,9,0,0,0,2,0.24,0.2273,0.56,0.2537,20,73,93 +9828,2012-02-19,1,1,2,10,0,0,0,2,0.26,0.2424,0.52,0.2537,49,142,191 +9829,2012-02-19,1,1,2,11,0,0,0,2,0.26,0.2273,0.56,0.3284,82,144,226 +9830,2012-02-19,1,1,2,12,0,0,0,2,0.28,0.2727,0.52,0.2537,71,184,255 +9831,2012-02-19,1,1,2,13,0,0,0,1,0.3,0.303,0.49,0.1642,64,197,261 +9832,2012-02-19,1,1,2,14,0,0,0,2,0.32,0.3182,0.45,0.194,48,189,237 +9833,2012-02-19,1,1,2,15,0,0,0,2,0.3,0.2879,0.49,0.194,85,168,253 +9834,2012-02-19,1,1,2,16,0,0,0,2,0.32,0.303,0.45,0.2836,62,150,212 +9835,2012-02-19,1,1,2,17,0,0,0,2,0.3,0.2879,0.49,0.2537,22,135,157 +9836,2012-02-19,1,1,2,18,0,0,0,2,0.3,0.2727,0.49,0.3284,23,144,167 +9837,2012-02-19,1,1,2,19,0,0,0,2,0.3,0.2879,0.52,0.2537,18,101,119 
+9838,2012-02-19,1,1,2,20,0,0,0,2,0.3,0.2879,0.52,0.2836,22,81,103 +9839,2012-02-19,1,1,2,21,0,0,0,3,0.26,0.2576,0.65,0.1642,3,33,36 +9840,2012-02-19,1,1,2,22,0,0,0,3,0.24,0.2576,0.75,0.0896,8,47,55 +9841,2012-02-19,1,1,2,23,0,0,0,2,0.22,0.2273,0.93,0.1642,5,30,35 +9842,2012-02-20,1,1,2,0,1,1,0,2,0.24,0.2576,0.87,0.0896,5,36,41 +9843,2012-02-20,1,1,2,1,1,1,0,2,0.24,0.2576,0.87,0.0896,3,12,15 +9844,2012-02-20,1,1,2,2,1,1,0,2,0.24,0.2576,0.87,0.0896,1,19,20 +9845,2012-02-20,1,1,2,3,1,1,0,2,0.24,0.2273,0.81,0.194,0,8,8 +9846,2012-02-20,1,1,2,5,1,1,0,2,0.24,0.2121,0.6,0.3582,0,6,6 +9847,2012-02-20,1,1,2,6,1,1,0,1,0.24,0.197,0.52,0.4179,1,9,10 +9848,2012-02-20,1,1,2,7,1,1,0,2,0.22,0.197,0.55,0.3284,2,42,44 +9849,2012-02-20,1,1,2,8,1,1,0,1,0.2,0.1818,0.59,0.3582,9,111,120 +9850,2012-02-20,1,1,2,9,1,1,0,1,0.2,0.1818,0.59,0.3582,12,98,110 +9851,2012-02-20,1,1,2,10,1,1,0,1,0.22,0.197,0.55,0.4478,30,113,143 +9852,2012-02-20,1,1,2,11,1,1,0,1,0.26,0.2273,0.48,0.3582,45,163,208 +9853,2012-02-20,1,1,2,12,1,1,0,1,0.28,0.2727,0.45,0.2239,65,216,281 +9854,2012-02-20,1,1,2,13,1,1,0,1,0.3,0.2879,0.42,0.2836,66,246,312 +9855,2012-02-20,1,1,2,14,1,1,0,1,0.32,0.3182,0.36,0.194,57,206,263 +9856,2012-02-20,1,1,2,15,1,1,0,1,0.34,0.3182,0.34,0.2239,54,235,289 +9857,2012-02-20,1,1,2,16,1,1,0,1,0.36,0.3788,0.29,0.1045,64,216,280 +9858,2012-02-20,1,1,2,17,1,1,0,1,0.38,0.3939,0.27,0.2537,27,226,253 +9859,2012-02-20,1,1,2,18,1,1,0,1,0.34,0.3182,0.34,0.2239,27,215,242 +9860,2012-02-20,1,1,2,19,1,1,0,1,0.34,0.3182,0.34,0.2239,8,164,172 +9861,2012-02-20,1,1,2,20,1,1,0,1,0.34,0.3333,0.34,0.194,9,117,126 +9862,2012-02-20,1,1,2,21,1,1,0,1,0.3,0.303,0.39,0.1642,8,90,98 +9863,2012-02-20,1,1,2,22,1,1,0,1,0.3,0.3182,0.45,0.0896,9,53,62 +9864,2012-02-20,1,1,2,23,1,1,0,1,0.3,0.3333,0.39,0,0,26,26 +9865,2012-02-21,1,1,2,0,0,2,1,1,0.26,0.303,0.65,0,1,11,12 +9866,2012-02-21,1,1,2,1,0,2,1,1,0.26,0.303,0.56,0,0,7,7 +9867,2012-02-21,1,1,2,2,0,2,1,1,0.24,0.2879,0.6,0,0,3,3 
+9868,2012-02-21,1,1,2,4,0,2,1,1,0.22,0.2727,0.64,0,0,2,2 +9869,2012-02-21,1,1,2,5,0,2,1,1,0.2,0.2576,0.69,0,0,15,15 +9870,2012-02-21,1,1,2,6,0,2,1,1,0.2,0.2576,0.75,0,3,83,86 +9871,2012-02-21,1,1,2,7,0,2,1,1,0.22,0.2727,0.64,0,6,273,279 +9872,2012-02-21,1,1,2,8,0,2,1,1,0.2,0.2273,0.69,0.0896,5,459,464 +9873,2012-02-21,1,1,2,9,0,2,1,1,0.2,0.2273,0.75,0.1045,6,211,217 +9874,2012-02-21,1,1,2,10,0,2,1,1,0.24,0.2273,0.7,0.194,8,77,85 +9875,2012-02-21,1,1,2,11,0,2,1,2,0.26,0.2424,0.7,0.2836,5,81,86 +9876,2012-02-21,1,1,2,12,0,2,1,2,0.3,0.2727,0.56,0.3881,16,147,163 +9877,2012-02-21,1,1,2,13,0,2,1,1,0.34,0.303,0.49,0.4179,8,123,131 +9878,2012-02-21,1,1,2,14,0,2,1,2,0.32,0.2879,0.53,0.3881,16,108,124 +9879,2012-02-21,1,1,2,15,0,2,1,1,0.34,0.303,0.49,0.4179,23,137,160 +9880,2012-02-21,1,1,2,16,0,2,1,1,0.36,0.3333,0.46,0.4179,14,240,254 +9881,2012-02-21,1,1,2,17,0,2,1,1,0.38,0.3939,0.4,0.4478,21,411,432 +9882,2012-02-21,1,1,2,18,0,2,1,1,0.38,0.3939,0.4,0.4478,11,414,425 +9883,2012-02-21,1,1,2,19,0,2,1,1,0.36,0.3333,0.46,0.2985,5,299,304 +9884,2012-02-21,1,1,2,20,0,2,1,1,0.34,0.3182,0.53,0.2537,4,223,227 +9885,2012-02-21,1,1,2,21,0,2,1,1,0.34,0.3333,0.61,0.1642,7,147,154 +9886,2012-02-21,1,1,2,22,0,2,1,2,0.34,0.3333,0.71,0.194,3,97,100 +9887,2012-02-21,1,1,2,23,0,2,1,1,0.32,0.303,0.66,0.2239,1,46,47 +9888,2012-02-22,1,1,2,0,0,3,1,1,0.32,0.303,0.66,0.2239,5,19,24 +9889,2012-02-22,1,1,2,1,0,3,1,1,0.32,0.3333,0.7,0.1343,3,4,7 +9890,2012-02-22,1,1,2,2,0,3,1,1,0.34,0.3182,0.66,0.2836,1,5,6 +9891,2012-02-22,1,1,2,3,0,3,1,2,0.34,0.3182,0.71,0.2239,1,3,4 +9892,2012-02-22,1,1,2,4,0,3,1,1,0.34,0.3333,0.71,0.1642,0,2,2 +9893,2012-02-22,1,1,2,5,0,3,1,1,0.32,0.3333,0.7,0.1045,1,29,30 +9894,2012-02-22,1,1,2,6,0,3,1,1,0.32,0.3182,0.76,0.1642,5,85,90 +9895,2012-02-22,1,1,2,7,0,3,1,1,0.32,0.3333,0.76,0.0896,10,292,302 +9896,2012-02-22,1,1,2,8,0,3,1,1,0.3,0.3182,0.75,0.0896,9,567,576 +9897,2012-02-22,1,1,2,9,0,3,1,1,0.32,0.3333,0.7,0.0896,11,241,252 
+9898,2012-02-22,1,1,2,10,0,3,1,1,0.34,0.3485,0.71,0.0896,17,122,139 +9899,2012-02-22,1,1,2,11,0,3,1,1,0.36,0.3485,0.66,0.1642,22,149,171 +9900,2012-02-22,1,1,2,12,0,3,1,1,0.4,0.4091,0.62,0.2836,21,186,207 +9901,2012-02-22,1,1,2,13,0,3,1,1,0.42,0.4242,0.54,0.2985,15,175,190 +9902,2012-02-22,1,1,2,14,0,3,1,1,0.44,0.4394,0.54,0.3284,35,142,177 +9903,2012-02-22,1,1,2,15,0,3,1,1,0.5,0.4848,0.31,0.4179,34,161,195 +9904,2012-02-22,1,1,2,16,0,3,1,1,0.52,0.5,0.29,0.4627,30,268,298 +9905,2012-02-22,1,1,2,17,0,3,1,1,0.52,0.5,0.32,0.3881,43,486,529 +9906,2012-02-22,1,1,2,18,0,3,1,2,0.48,0.4697,0.41,0.3284,42,483,525 +9907,2012-02-22,1,1,2,19,0,3,1,2,0.46,0.4545,0.44,0.2836,22,305,327 +9908,2012-02-22,1,1,2,20,0,3,1,2,0.44,0.4394,0.47,0.194,25,249,274 +9909,2012-02-22,1,1,2,21,0,3,1,2,0.48,0.4697,0.33,0.2836,19,220,239 +9910,2012-02-22,1,1,2,22,0,3,1,2,0.44,0.4394,0.47,0.2537,18,129,147 +9911,2012-02-22,1,1,2,23,0,3,1,2,0.46,0.4545,0.41,0.2836,5,57,62 +9912,2012-02-23,1,1,2,0,0,4,1,2,0.44,0.4394,0.47,0.2537,12,30,42 +9913,2012-02-23,1,1,2,1,0,4,1,2,0.44,0.4394,0.58,0.2836,3,12,15 +9914,2012-02-23,1,1,2,2,0,4,1,2,0.44,0.4394,0.62,0.2239,5,4,9 +9915,2012-02-23,1,1,2,3,0,4,1,3,0.44,0.4394,0.67,0.194,2,4,6 +9916,2012-02-23,1,1,2,4,0,4,1,1,0.4,0.4091,0.76,0.2239,0,1,1 +9917,2012-02-23,1,1,2,5,0,4,1,1,0.4,0.4091,0.76,0.2239,2,30,32 +9918,2012-02-23,1,1,2,6,0,4,1,1,0.36,0.3485,0.87,0.194,3,88,91 +9919,2012-02-23,1,1,2,7,0,4,1,1,0.36,0.3485,0.81,0.2239,9,317,326 +9920,2012-02-23,1,1,2,8,0,4,1,1,0.34,0.3333,0.87,0.194,11,549,560 +9921,2012-02-23,1,1,2,9,0,4,1,1,0.36,0.3333,0.81,0.2537,14,228,242 +9922,2012-02-23,1,1,2,10,0,4,1,1,0.42,0.4242,0.67,0.194,41,107,148 +9923,2012-02-23,1,1,2,11,0,4,1,1,0.48,0.4697,0.48,0.2836,54,128,182 +9924,2012-02-23,1,1,2,12,0,4,1,1,0.5,0.4848,0.42,0.2537,25,199,224 +9925,2012-02-23,1,1,2,13,0,4,1,1,0.52,0.5,0.39,0.194,30,188,218 +9926,2012-02-23,1,1,2,14,0,4,1,1,0.54,0.5152,0.37,0.2537,36,156,192 
+9927,2012-02-23,1,1,2,15,0,4,1,1,0.54,0.5152,0.33,0.1343,46,171,217 +9928,2012-02-23,1,1,2,16,0,4,1,1,0.56,0.5303,0.32,0,45,298,343 +9929,2012-02-23,1,1,2,17,0,4,1,1,0.54,0.5152,0.32,0.1642,49,561,610 +9930,2012-02-23,1,1,2,18,0,4,1,1,0.54,0.5152,0.37,0.1343,36,486,522 +9931,2012-02-23,1,1,2,19,0,4,1,1,0.5,0.4848,0.39,0.0896,20,349,369 +9932,2012-02-23,1,1,2,20,0,4,1,1,0.46,0.4545,0.44,0.1343,23,246,269 +9933,2012-02-23,1,1,2,21,0,4,1,1,0.44,0.4394,0.51,0.2239,17,205,222 +9934,2012-02-23,1,1,2,22,0,4,1,1,0.44,0.4394,0.54,0.2537,19,118,137 +9935,2012-02-23,1,1,2,23,0,4,1,1,0.44,0.4394,0.54,0,14,71,85 +9936,2012-02-24,1,1,2,0,0,5,1,2,0.44,0.4394,0.58,0.0896,12,34,46 +9937,2012-02-24,1,1,2,1,0,5,1,1,0.44,0.4394,0.62,0.194,3,15,18 +9938,2012-02-24,1,1,2,2,0,5,1,3,0.42,0.4242,0.77,0.2836,1,4,5 +9939,2012-02-24,1,1,2,3,0,5,1,1,0.42,0.4242,0.82,0.1343,0,5,5 +9940,2012-02-24,1,1,2,4,0,5,1,1,0.4,0.4091,0.87,0.1343,0,2,2 +9941,2012-02-24,1,1,2,5,0,5,1,2,0.4,0.4091,0.87,0.194,1,25,26 +9942,2012-02-24,1,1,2,6,0,5,1,2,0.4,0.4091,0.87,0.1642,2,80,82 +9943,2012-02-24,1,1,2,7,0,5,1,2,0.4,0.4091,0.76,0.2239,7,265,272 +9944,2012-02-24,1,1,2,8,0,5,1,2,0.38,0.3939,0.76,0.3284,17,452,469 +9945,2012-02-24,1,1,2,9,0,5,1,3,0.36,0.3333,0.81,0.2537,15,229,244 +9946,2012-02-24,1,1,2,10,0,5,1,3,0.36,0.3485,0.81,0.1642,22,119,141 +9947,2012-02-24,1,1,2,11,0,5,1,2,0.36,0.3636,0.81,0.1045,27,136,163 +9948,2012-02-24,1,1,2,12,0,5,1,2,0.4,0.4091,0.76,0.1343,27,201,228 +9949,2012-02-24,1,1,2,13,0,5,1,2,0.42,0.4242,0.71,0.1045,30,153,183 +9950,2012-02-24,1,1,2,14,0,5,1,2,0.46,0.4545,0.67,0.1045,5,26,31 +9951,2012-02-24,1,1,2,15,0,5,1,3,0.4,0.4091,0.87,0.194,11,83,94 +9952,2012-02-24,1,1,2,16,0,5,1,3,0.4,0.4091,0.87,0.194,12,166,178 +9953,2012-02-24,1,1,2,17,0,5,1,3,0.42,0.4242,0.94,0.1045,14,242,256 +9954,2012-02-24,1,1,2,18,0,5,1,2,0.42,0.4242,0.94,0,11,289,300 +9955,2012-02-24,1,1,2,19,0,5,1,2,0.44,0.4394,0.88,0.1045,6,227,233 +9956,2012-02-24,1,1,2,20,0,5,1,1,0.44,0.4394,0.62,0.4179,8,159,167 
+9957,2012-02-24,1,1,2,21,0,5,1,1,0.42,0.4242,0.35,0.806,1,151,152 +9958,2012-02-24,1,1,2,22,0,5,1,1,0.4,0.4091,0.37,0.5821,4,102,106 +9959,2012-02-24,1,1,2,23,0,5,1,1,0.38,0.3939,0.37,0.6866,10,76,86 +9960,2012-02-25,1,1,2,0,0,6,0,1,0.36,0.3333,0.43,0.3881,5,56,61 +9961,2012-02-25,1,1,2,1,0,6,0,1,0.34,0.303,0.42,0.3582,4,42,46 +9962,2012-02-25,1,1,2,2,0,6,0,1,0.32,0.303,0.45,0.2537,1,37,38 +9963,2012-02-25,1,1,2,3,0,6,0,1,0.3,0.2727,0.45,0.2985,0,7,7 +9964,2012-02-25,1,1,2,4,0,6,0,1,0.3,0.2727,0.42,0.3284,0,2,2 +9965,2012-02-25,1,1,2,5,0,6,0,2,0.3,0.2727,0.42,0.2985,0,3,3 +9966,2012-02-25,1,1,2,6,0,6,0,2,0.3,0.2727,0.42,0.2985,1,12,13 +9967,2012-02-25,1,1,2,7,0,6,0,1,0.28,0.2424,0.45,0.4478,3,20,23 +9968,2012-02-25,1,1,2,8,0,6,0,1,0.26,0.2273,0.48,0.2985,3,90,93 +9969,2012-02-25,1,1,2,9,0,6,0,3,0.22,0.197,0.75,0.3284,15,107,122 +9970,2012-02-25,1,1,2,10,0,6,0,1,0.26,0.2273,0.6,0.3582,24,159,183 +9971,2012-02-25,1,1,2,11,0,6,0,1,0.3,0.2576,0.39,0.6119,23,187,210 +9972,2012-02-25,1,1,2,12,0,6,0,1,0.32,0.2727,0.29,0.6567,41,217,258 +9973,2012-02-25,1,1,2,13,0,6,0,1,0.32,0.2879,0.31,0.4925,48,178,226 +9974,2012-02-25,1,1,2,14,0,6,0,1,0.32,0.2727,0.29,0.5522,35,181,216 +9975,2012-02-25,1,1,2,15,0,6,0,1,0.32,0.2879,0.29,0.4179,21,211,232 +9976,2012-02-25,1,1,2,16,0,6,0,1,0.32,0.2727,0.26,0.5821,32,163,195 +9977,2012-02-25,1,1,2,17,0,6,0,1,0.3,0.2727,0.33,0.3582,16,143,159 +9978,2012-02-25,1,1,2,18,0,6,0,1,0.3,0.2576,0.28,0.5821,7,126,133 +9979,2012-02-25,1,1,2,19,0,6,0,1,0.26,0.2121,0.33,0.4478,12,137,149 +9980,2012-02-25,1,1,2,20,0,6,0,1,0.26,0.2121,0.3,0.5522,10,95,105 +9981,2012-02-25,1,1,2,21,0,6,0,1,0.24,0.2121,0.35,0.3284,5,80,85 +9982,2012-02-25,1,1,2,22,0,6,0,1,0.24,0.197,0.38,0.4627,4,78,82 +9983,2012-02-25,1,1,2,23,0,6,0,1,0.24,0.197,0.41,0.4179,7,84,91 +9984,2012-02-26,1,1,2,0,0,0,0,1,0.24,0.2121,0.41,0.3582,7,63,70 +9985,2012-02-26,1,1,2,1,0,0,0,1,0.24,0.2121,0.44,0.3284,5,50,55 +9986,2012-02-26,1,1,2,2,0,0,0,1,0.24,0.2121,0.44,0.3284,5,37,42 
+9987,2012-02-26,1,1,2,3,0,0,0,1,0.24,0.2273,0.41,0.194,0,21,21 +9988,2012-02-26,1,1,2,4,0,0,0,1,0.24,0.2273,0.41,0.194,1,5,6 +9989,2012-02-26,1,1,2,5,0,0,0,1,0.22,0.2121,0.44,0.2239,0,5,5 +9990,2012-02-26,1,1,2,6,0,0,0,1,0.22,0.197,0.44,0.3284,0,2,2 +9991,2012-02-26,1,1,2,7,0,0,0,1,0.2,0.1818,0.51,0.2985,2,20,22 +9992,2012-02-26,1,1,2,8,0,0,0,1,0.2,0.197,0.51,0.2537,2,57,59 +9993,2012-02-26,1,1,2,9,0,0,0,1,0.22,0.2121,0.47,0.2239,11,76,87 +9994,2012-02-26,1,1,2,10,0,0,0,1,0.26,0.2576,0.41,0.1642,27,161,188 +9995,2012-02-26,1,1,2,11,0,0,0,1,0.28,0.2576,0.41,0.2836,43,184,227 +9996,2012-02-26,1,1,2,12,0,0,0,1,0.3,0.303,0.39,0.1343,49,256,305 +9997,2012-02-26,1,1,2,13,0,0,0,1,0.32,0.3333,0.36,0,58,267,325 +9998,2012-02-26,1,1,2,14,0,0,0,1,0.34,0.3485,0.31,0,68,263,331 +9999,2012-02-26,1,1,2,15,0,0,0,1,0.36,0.3485,0.29,0,72,281,353 +10000,2012-02-26,1,1,2,16,0,0,0,1,0.36,0.3333,0.32,0.2537,64,275,339 +10001,2012-02-26,1,1,2,17,0,0,0,1,0.36,0.3485,0.32,0.2239,46,241,287 +10002,2012-02-26,1,1,2,18,0,0,0,1,0.34,0.3182,0.36,0.2836,17,199,216 +10003,2012-02-26,1,1,2,19,0,0,0,1,0.34,0.3333,0.34,0.1642,13,130,143 +10004,2012-02-26,1,1,2,20,0,0,0,1,0.32,0.3182,0.42,0.1642,9,79,88 +10005,2012-02-26,1,1,2,21,0,0,0,1,0.3,0.303,0.45,0.1343,8,75,83 +10006,2012-02-26,1,1,2,22,0,0,0,1,0.3,0.2879,0.42,0.2239,3,68,71 +10007,2012-02-26,1,1,2,23,0,0,0,1,0.26,0.2576,0.56,0.1642,5,59,64 +10008,2012-02-27,1,1,2,0,0,1,1,1,0.26,0.2424,0.6,0.2537,2,27,29 +10009,2012-02-27,1,1,2,1,0,1,1,1,0.26,0.2576,0.65,0.2239,0,6,6 +10010,2012-02-27,1,1,2,2,0,1,1,1,0.26,0.2273,0.7,0.2985,0,4,4 +10011,2012-02-27,1,1,2,3,0,1,1,1,0.24,0.2424,0.75,0.1642,0,2,2 +10012,2012-02-27,1,1,2,4,0,1,1,1,0.22,0.2273,0.8,0.1343,0,1,1 +10013,2012-02-27,1,1,2,5,0,1,1,1,0.22,0.2273,0.8,0.1343,0,16,16 +10014,2012-02-27,1,1,2,6,0,1,1,1,0.24,0.2273,0.7,0.2537,1,89,90 +10015,2012-02-27,1,1,2,7,0,1,1,1,0.24,0.2273,0.6,0.2239,2,278,280 +10016,2012-02-27,1,1,2,8,0,1,1,1,0.24,0.2424,0.6,0,14,514,528 
+10017,2012-02-27,1,1,2,9,0,1,1,1,0.26,0.2576,0.6,0.194,11,219,230 +10018,2012-02-27,1,1,2,10,0,1,1,1,0.32,0.303,0.45,0.2537,16,88,104 +10019,2012-02-27,1,1,2,11,0,1,1,1,0.36,0.3333,0.43,0.2537,11,92,103 +10020,2012-02-27,1,1,2,12,0,1,1,1,0.4,0.4091,0.37,0.2985,13,164,177 +10021,2012-02-27,1,1,2,13,0,1,1,1,0.4,0.4091,0.43,0.2836,23,159,182 +10022,2012-02-27,1,1,2,14,0,1,1,1,0.46,0.4545,0.36,0.2836,24,134,158 +10023,2012-02-27,1,1,2,15,0,1,1,1,0.52,0.5,0.25,0.4179,24,150,174 +10024,2012-02-27,1,1,2,16,0,1,1,1,0.54,0.5152,0.26,0.4478,20,266,286 +10025,2012-02-27,1,1,2,17,0,1,1,1,0.54,0.5152,0.24,0.4478,26,503,529 +10026,2012-02-27,1,1,2,18,0,1,1,1,0.52,0.5,0.27,0.4179,10,517,527 +10027,2012-02-27,1,1,2,19,0,1,1,1,0.5,0.4848,0.29,0.3881,16,318,334 +10028,2012-02-27,1,1,2,20,0,1,1,1,0.48,0.4697,0.33,0.3582,11,208,219 +10029,2012-02-27,1,1,2,21,0,1,1,1,0.46,0.4545,0.36,0.2836,14,181,195 +10030,2012-02-27,1,1,2,22,0,1,1,1,0.44,0.4394,0.44,0.2537,9,83,92 +10031,2012-02-27,1,1,2,23,0,1,1,1,0.42,0.4242,0.5,0.1642,6,50,56 +10032,2012-02-28,1,1,2,0,0,2,1,1,0.42,0.4242,0.54,0.1343,0,10,10 +10033,2012-02-28,1,1,2,1,0,2,1,1,0.38,0.3939,0.62,0.1642,0,6,6 +10034,2012-02-28,1,1,2,2,0,2,1,1,0.4,0.4091,0.54,0.1045,1,2,3 +10035,2012-02-28,1,1,2,3,0,2,1,1,0.4,0.4091,0.32,0.3582,0,6,6 +10036,2012-02-28,1,1,2,4,0,2,1,1,0.32,0.2879,0.42,0.3582,0,2,2 +10037,2012-02-28,1,1,2,5,0,2,1,1,0.32,0.2879,0.42,0.3582,1,20,21 +10038,2012-02-28,1,1,2,6,0,2,1,1,0.3,0.2727,0.45,0.3284,1,100,101 +10039,2012-02-28,1,1,2,7,0,2,1,1,0.28,0.2576,0.45,0.2985,12,313,325 +10040,2012-02-28,1,1,2,8,0,2,1,1,0.26,0.2424,0.48,0.2836,16,543,559 +10041,2012-02-28,1,1,2,9,0,2,1,1,0.28,0.2576,0.45,0.3582,7,225,232 +10042,2012-02-28,1,1,2,10,0,2,1,1,0.3,0.2727,0.42,0.2985,14,118,132 +10043,2012-02-28,1,1,2,11,0,2,1,1,0.34,0.3182,0.36,0.2537,8,155,163 +10044,2012-02-28,1,1,2,12,0,2,1,1,0.36,0.3485,0.34,0.1343,19,164,183 +10045,2012-02-28,1,1,2,13,0,2,1,1,0.36,0.3636,0.32,0.1045,16,174,190 
+10046,2012-02-28,1,1,2,14,0,2,1,1,0.4,0.4091,0.28,0.1045,13,121,134 +10047,2012-02-28,1,1,2,15,0,2,1,1,0.42,0.4242,0.26,0.0896,17,149,166 +10048,2012-02-28,1,1,2,16,0,2,1,1,0.42,0.4242,0.28,0.0896,17,270,287 +10049,2012-02-28,1,1,2,17,0,2,1,1,0.42,0.4242,0.26,0.0896,33,497,530 +10050,2012-02-28,1,1,2,18,0,2,1,1,0.42,0.4242,0.28,0.1642,17,432,449 +10051,2012-02-28,1,1,2,19,0,2,1,1,0.42,0.4242,0.26,0,15,291,306 +10052,2012-02-28,1,1,2,20,0,2,1,1,0.4,0.4091,0.3,0.1343,2,202,204 +10053,2012-02-28,1,1,2,21,0,2,1,1,0.34,0.3485,0.46,0.1045,10,178,188 +10054,2012-02-28,1,1,2,22,0,2,1,1,0.34,0.3485,0.46,0.1045,6,95,101 +10055,2012-02-28,1,1,2,23,0,2,1,1,0.32,0.303,0.53,0.2239,4,61,65 +10056,2012-02-29,1,1,2,0,0,3,1,1,0.32,0.3333,0.53,0.1045,2,30,32 +10057,2012-02-29,1,1,2,1,0,3,1,1,0.32,0.3333,0.49,0.1045,0,6,6 +10058,2012-02-29,1,1,2,2,0,3,1,1,0.3,0.2879,0.56,0.2239,0,4,4 +10059,2012-02-29,1,1,2,3,0,3,1,1,0.28,0.303,0.7,0.0896,0,2,2 +10060,2012-02-29,1,1,2,5,0,3,1,1,0.26,0.2879,0.75,0.0896,0,29,29 +10061,2012-02-29,1,1,2,6,0,3,1,2,0.28,0.303,0.75,0.0896,4,100,104 +10062,2012-02-29,1,1,2,7,0,3,1,2,0.26,0.2727,0.81,0.1343,11,242,253 +10063,2012-02-29,1,1,2,8,0,3,1,2,0.28,0.2879,0.75,0.1045,2,114,116 +10064,2012-02-29,1,1,2,9,0,3,1,3,0.3,0.303,0.81,0.1642,5,33,38 +10065,2012-02-29,1,1,2,10,0,3,1,3,0.3,0.303,0.81,0.1642,0,10,10 +10066,2012-02-29,1,1,2,11,0,3,1,3,0.3,0.2879,0.87,0.194,1,9,10 +10067,2012-02-29,1,1,2,12,0,3,1,3,0.32,0.303,0.93,0.3284,1,69,70 +10068,2012-02-29,1,1,2,13,0,3,1,2,0.36,0.3333,0.93,0.3582,4,72,76 +10069,2012-02-29,1,1,2,14,0,3,1,2,0.36,0.3333,0.93,0.3582,2,54,56 +10070,2012-02-29,1,1,2,15,0,3,1,2,0.4,0.4091,0.82,0.2836,3,87,90 +10071,2012-02-29,1,1,2,16,0,3,1,3,0.4,0.4091,0.82,0.2239,5,162,167 +10072,2012-02-29,1,1,2,17,0,3,1,2,0.4,0.4091,0.87,0.1642,7,246,253 +10073,2012-02-29,1,1,2,18,0,3,1,3,0.4,0.4091,0.87,0,11,181,192 +10074,2012-02-29,1,1,2,19,0,3,1,2,0.4,0.4091,0.87,0.1045,0,102,102 
+10075,2012-02-29,1,1,2,20,0,3,1,3,0.42,0.4242,0.88,0.1642,2,95,97 +10076,2012-02-29,1,1,2,21,0,3,1,3,0.42,0.4242,0.88,0.1642,0,39,39 +10077,2012-02-29,1,1,2,22,0,3,1,3,0.42,0.4242,0.94,0.2537,2,53,55 +10078,2012-02-29,1,1,2,23,0,3,1,3,0.42,0.4242,0.94,0.2537,3,30,33 +10079,2012-03-01,1,1,3,0,0,4,1,2,0.42,0.4242,0.94,0,1,10,11 +10080,2012-03-01,1,1,3,1,0,4,1,3,0.46,0.4545,0.94,0.1045,0,12,12 +10081,2012-03-01,1,1,3,2,0,4,1,2,0.46,0.4545,0.94,0.2836,0,6,6 +10082,2012-03-01,1,1,3,3,0,4,1,1,0.46,0.4545,0.94,0.2239,0,3,3 +10083,2012-03-01,1,1,3,4,0,4,1,1,0.48,0.4697,0.88,0.194,0,5,5 +10084,2012-03-01,1,1,3,5,0,4,1,1,0.48,0.4697,0.88,0.1642,0,18,18 +10085,2012-03-01,1,1,3,6,0,4,1,2,0.42,0.4242,0.94,0,2,107,109 +10086,2012-03-01,1,1,3,7,0,4,1,2,0.44,0.4394,0.94,0.1343,8,296,304 +10087,2012-03-01,1,1,3,8,0,4,1,1,0.42,0.4242,0.94,0.1343,15,579,594 +10088,2012-03-01,1,1,3,9,0,4,1,1,0.46,0.4545,0.77,0.2537,5,281,286 +10089,2012-03-01,1,1,3,10,0,4,1,1,0.52,0.5,0.55,0.1642,11,122,133 +10090,2012-03-01,1,1,3,11,0,4,1,1,0.56,0.5303,0.43,0.194,19,149,168 +10091,2012-03-01,1,1,3,12,0,4,1,1,0.56,0.5303,0.37,0.2239,16,204,220 +10092,2012-03-01,1,1,3,13,0,4,1,1,0.58,0.5455,0.32,0.2537,24,187,211 +10093,2012-03-01,1,1,3,14,0,4,1,1,0.58,0.5455,0.35,0.2239,37,174,211 +10094,2012-03-01,1,1,3,15,0,4,1,1,0.6,0.6212,0.31,0.2537,31,192,223 +10095,2012-03-01,1,1,3,16,0,4,1,1,0.58,0.5455,0.32,0.2985,38,323,361 +10096,2012-03-01,1,1,3,17,0,4,1,1,0.56,0.5303,0.35,0.2985,49,551,600 +10097,2012-03-01,1,1,3,18,0,4,1,1,0.54,0.5152,0.34,0.3582,27,498,525 +10098,2012-03-01,1,1,3,19,0,4,1,1,0.48,0.4697,0.41,0.3284,11,317,328 +10099,2012-03-01,1,1,3,20,0,4,1,1,0.44,0.4394,0.44,0.4179,12,242,254 +10100,2012-03-01,1,1,3,21,0,4,1,1,0.42,0.4242,0.47,0.4627,7,190,197 +10101,2012-03-01,1,1,3,22,0,4,1,1,0.38,0.3939,0.5,0.2537,3,123,126 +10102,2012-03-01,1,1,3,23,0,4,1,1,0.36,0.3485,0.5,0.2239,9,76,85 +10103,2012-03-02,1,1,3,0,0,5,1,1,0.34,0.3333,0.53,0.1642,1,45,46 
+10104,2012-03-02,1,1,3,1,0,5,1,1,0.34,0.3182,0.49,0.2239,2,18,20 +10105,2012-03-02,1,1,3,2,0,5,1,1,0.34,0.3182,0.53,0.2836,2,4,6 +10106,2012-03-02,1,1,3,3,0,5,1,1,0.32,0.3182,0.57,0.194,0,3,3 +10107,2012-03-02,1,1,3,4,0,5,1,1,0.32,0.3485,0.61,0,0,2,2 +10108,2012-03-02,1,1,3,5,0,5,1,1,0.3,0.3333,0.7,0,0,24,24 +10109,2012-03-02,1,1,3,6,0,5,1,1,0.3,0.3333,0.7,0,4,88,92 +10110,2012-03-02,1,1,3,7,0,5,1,1,0.3,0.3333,0.7,0,4,258,262 +10111,2012-03-02,1,1,3,8,0,5,1,1,0.28,0.303,0.75,0.0896,16,533,549 +10112,2012-03-02,1,1,3,9,0,5,1,1,0.3,0.3333,0.81,0,15,299,314 +10113,2012-03-02,1,1,3,10,0,5,1,1,0.36,0.3485,0.5,0.1642,14,118,132 +10114,2012-03-02,1,1,3,11,0,5,1,1,0.38,0.3939,0.46,0.194,40,154,194 +10115,2012-03-02,1,1,3,12,0,5,1,1,0.4,0.4091,0.43,0.1642,40,194,234 +10116,2012-03-02,1,1,3,13,0,5,1,1,0.42,0.4242,0.44,0.2239,31,191,222 +10117,2012-03-02,1,1,3,14,0,5,1,2,0.44,0.4394,0.44,0.2836,29,176,205 +10118,2012-03-02,1,1,3,15,0,5,1,2,0.42,0.4242,0.5,0.2985,5,73,78 +10119,2012-03-02,1,1,3,16,0,5,1,3,0.4,0.4091,0.58,0.2985,1,49,50 +10120,2012-03-02,1,1,3,17,0,5,1,3,0.36,0.3485,0.81,0.2239,5,123,128 +10121,2012-03-02,1,1,3,18,0,5,1,3,0.36,0.3485,0.87,0.194,6,154,160 +10122,2012-03-02,1,1,3,19,0,5,1,3,0.36,0.3485,0.87,0.194,12,162,174 +10123,2012-03-02,1,1,3,20,0,5,1,2,0.36,0.3485,0.87,0.194,7,93,100 +10124,2012-03-02,1,1,3,21,0,5,1,3,0.36,0.3636,0.87,0.0896,2,99,101 +10125,2012-03-02,1,1,3,22,0,5,1,2,0.36,0.3788,0.87,0,6,69,75 +10126,2012-03-02,1,1,3,23,0,5,1,2,0.36,0.3788,0.87,0,4,19,23 +10127,2012-03-03,1,1,3,0,0,6,0,3,0.36,0.3636,0.93,0.0896,1,21,22 +10128,2012-03-03,1,1,3,1,0,6,0,3,0.36,0.3788,0.93,0,0,44,44 +10129,2012-03-03,1,1,3,2,0,6,0,3,0.36,0.3636,0.93,0.0896,4,34,38 +10130,2012-03-03,1,1,3,3,0,6,0,2,0.36,0.3485,0.93,0.1642,1,20,21 +10131,2012-03-03,1,1,3,4,0,6,0,2,0.36,0.3485,0.93,0.194,0,2,2 +10132,2012-03-03,1,1,3,5,0,6,0,2,0.36,0.3636,0.93,0.0896,1,1,2 +10133,2012-03-03,1,1,3,6,0,6,0,2,0.36,0.3636,0.93,0.0896,1,6,7 
+10134,2012-03-03,1,1,3,7,0,6,0,2,0.36,0.3485,0.93,0.1343,2,14,16 +10135,2012-03-03,1,1,3,8,0,6,0,3,0.36,0.3788,0.93,0,2,46,48 +10136,2012-03-03,1,1,3,9,0,6,0,3,0.38,0.3939,0.87,0.0896,7,87,94 +10137,2012-03-03,1,1,3,10,0,6,0,2,0.4,0.4091,0.87,0,31,137,168 +10138,2012-03-03,1,1,3,11,0,6,0,2,0.4,0.4091,0.87,0.0896,33,181,214 +10139,2012-03-03,1,1,3,12,0,6,0,2,0.42,0.4242,0.67,0.2537,47,252,299 +10140,2012-03-03,1,1,3,13,0,6,0,1,0.44,0.4394,0.51,0.2836,87,252,339 +10141,2012-03-03,1,1,3,14,0,6,0,2,0.46,0.4545,0.44,0.3582,151,279,430 +10142,2012-03-03,1,1,3,15,0,6,0,1,0.48,0.4697,0.29,0.2985,145,254,399 +10143,2012-03-03,1,1,3,16,0,6,0,1,0.5,0.4848,0.27,0.2985,167,300,467 +10144,2012-03-03,1,1,3,17,0,6,0,1,0.5,0.4848,0.23,0.3284,106,279,385 +10145,2012-03-03,1,1,3,18,0,6,0,1,0.5,0.4848,0.25,0.2537,53,250,303 +10146,2012-03-03,1,1,3,19,0,6,0,1,0.46,0.4545,0.33,0.1642,34,191,225 +10147,2012-03-03,1,1,3,20,0,6,0,1,0.44,0.4394,0.3,0.1343,28,121,149 +10148,2012-03-03,1,1,3,21,0,6,0,1,0.46,0.4545,0.19,0.3284,24,130,154 +10149,2012-03-03,1,1,3,22,0,6,0,1,0.44,0.4394,0.21,0.1343,19,115,134 +10150,2012-03-03,1,1,3,23,0,6,0,1,0.42,0.4242,0.24,0,12,94,106 +10151,2012-03-04,1,1,3,0,0,0,0,2,0.4,0.4091,0.3,0.3582,5,67,72 +10152,2012-03-04,1,1,3,1,0,0,0,2,0.38,0.3939,0.34,0.194,16,60,76 +10153,2012-03-04,1,1,3,2,0,0,0,2,0.36,0.3485,0.4,0.1642,14,66,80 +10154,2012-03-04,1,1,3,3,0,0,0,2,0.36,0.3485,0.43,0.1343,5,21,26 +10155,2012-03-04,1,1,3,4,0,0,0,2,0.34,0.3333,0.46,0.194,3,11,14 +10156,2012-03-04,1,1,3,5,0,0,0,2,0.34,0.3182,0.46,0.2239,0,5,5 +10157,2012-03-04,1,1,3,6,0,0,0,2,0.3,0.2727,0.52,0.2985,0,5,5 +10158,2012-03-04,1,1,3,7,0,0,0,2,0.3,0.2727,0.52,0.2985,2,21,23 +10159,2012-03-04,1,1,3,8,0,0,0,1,0.3,0.2727,0.52,0.3284,12,54,66 +10160,2012-03-04,1,1,3,9,0,0,0,1,0.3,0.2727,0.49,0.4478,14,104,118 +10161,2012-03-04,1,1,3,10,0,0,0,1,0.3,0.2727,0.49,0.3284,31,161,192 +10162,2012-03-04,1,1,3,11,0,0,0,1,0.32,0.2879,0.45,0.3881,64,192,256 
+10163,2012-03-04,1,1,3,12,0,0,0,1,0.34,0.303,0.42,0.3582,71,256,327 +10164,2012-03-04,1,1,3,13,0,0,0,1,0.36,0.3333,0.37,0.3582,108,256,364 +10165,2012-03-04,1,1,3,14,0,0,0,1,0.36,0.3333,0.34,0.3582,106,226,332 +10166,2012-03-04,1,1,3,15,0,0,0,1,0.36,0.3182,0.32,0.5224,82,252,334 +10167,2012-03-04,1,1,3,16,0,0,0,1,0.36,0.3182,0.29,0.4925,68,231,299 +10168,2012-03-04,1,1,3,17,0,0,0,1,0.34,0.2879,0.31,0.5522,49,214,263 +10169,2012-03-04,1,1,3,18,0,0,0,1,0.32,0.2727,0.33,0.6119,20,164,184 +10170,2012-03-04,1,1,3,19,0,0,0,1,0.3,0.2576,0.36,0.4925,12,120,132 +10171,2012-03-04,1,1,3,20,0,0,0,1,0.28,0.2727,0.38,0.2537,8,80,88 +10172,2012-03-04,1,1,3,21,0,0,0,1,0.28,0.2727,0.36,0.2239,9,70,79 +10173,2012-03-04,1,1,3,22,0,0,0,1,0.26,0.2424,0.41,0.2537,9,53,62 +10174,2012-03-04,1,1,3,23,0,0,0,1,0.26,0.2576,0.41,0.194,2,24,26 +10175,2012-03-05,1,1,3,0,0,1,1,1,0.24,0.2273,0.44,0.194,2,15,17 +10176,2012-03-05,1,1,3,1,0,1,1,1,0.24,0.2424,0.48,0.1343,3,3,6 +10177,2012-03-05,1,1,3,2,0,1,1,1,0.24,0.2424,0.48,0.1343,1,3,4 +10178,2012-03-05,1,1,3,3,0,1,1,1,0.22,0.2273,0.51,0.1642,0,1,1 +10179,2012-03-05,1,1,3,4,0,1,1,1,0.2,0.2273,0.55,0.0896,0,1,1 +10180,2012-03-05,1,1,3,5,0,1,1,1,0.2,0.2273,0.55,0.0896,1,17,18 +10181,2012-03-05,1,1,3,6,0,1,1,1,0.18,0.197,0.59,0.1343,2,89,91 +10182,2012-03-05,1,1,3,7,0,1,1,1,0.18,0.197,0.59,0.1343,7,253,260 +10183,2012-03-05,1,1,3,8,0,1,1,1,0.2,0.2273,0.59,0.1045,13,415,428 +10184,2012-03-05,1,1,3,9,0,1,1,2,0.22,0.2576,0.6,0.0896,11,186,197 +10185,2012-03-05,1,1,3,10,0,1,1,2,0.24,0.2576,0.6,0,12,74,86 +10186,2012-03-05,1,1,3,11,0,1,1,2,0.26,0.2727,0.56,0.1343,17,86,103 +10187,2012-03-05,1,1,3,12,0,1,1,2,0.26,0.303,0.52,0,15,122,137 +10188,2012-03-05,1,1,3,13,0,1,1,2,0.26,0.2576,0.55,0.2985,11,109,120 +10189,2012-03-05,1,1,3,14,0,1,1,2,0.28,0.2576,0.57,0.3582,13,115,128 +10190,2012-03-05,1,1,3,15,0,1,1,2,0.3,0.2879,0.53,0.194,20,110,130 +10191,2012-03-05,1,1,3,16,0,1,1,1,0.3,0.2727,0.45,0.2985,30,180,210 
+10192,2012-03-05,1,1,3,17,0,1,1,3,0.3,0.2727,0.45,0.2985,11,376,387 +10193,2012-03-05,1,1,3,18,0,1,1,3,0.28,0.2273,0.55,0.6567,12,363,375 +10194,2012-03-05,1,1,3,19,0,1,1,1,0.26,0.2576,0.53,0.2239,6,220,226 +10195,2012-03-05,1,1,3,20,0,1,1,1,0.26,0.2273,0.37,0.4627,8,171,179 +10196,2012-03-05,1,1,3,21,0,1,1,1,0.26,0.2273,0.37,0.3881,4,116,120 +10197,2012-03-05,1,1,3,22,0,1,1,1,0.24,0.2121,0.35,0.3284,2,76,78 +10198,2012-03-05,1,1,3,23,0,1,1,1,0.22,0.1818,0.37,0.5821,2,29,31 +10199,2012-03-06,1,1,3,0,0,2,1,1,0.22,0.2121,0.37,0.2985,0,8,8 +10200,2012-03-06,1,1,3,1,0,2,1,1,0.2,0.197,0.44,0.2537,0,6,6 +10201,2012-03-06,1,1,3,2,0,2,1,1,0.2,0.197,0.44,0.2239,0,4,4 +10202,2012-03-06,1,1,3,3,0,2,1,1,0.18,0.1667,0.51,0.2836,0,1,1 +10203,2012-03-06,1,1,3,4,0,2,1,1,0.18,0.1667,0.51,0.2985,0,3,3 +10204,2012-03-06,1,1,3,5,0,2,1,1,0.18,0.1818,0.51,0.2239,0,25,25 +10205,2012-03-06,1,1,3,6,0,2,1,1,0.18,0.2121,0.51,0.1045,3,99,102 +10206,2012-03-06,1,1,3,7,0,2,1,1,0.16,0.1818,0.55,0.1343,5,270,275 +10207,2012-03-06,1,1,3,8,0,2,1,1,0.16,0.1818,0.59,0.1045,14,487,501 +10208,2012-03-06,1,1,3,9,0,2,1,1,0.22,0.2727,0.47,0,11,222,233 +10209,2012-03-06,1,1,3,10,0,2,1,1,0.24,0.2576,0.44,0,16,113,129 +10210,2012-03-06,1,1,3,11,0,2,1,1,0.26,0.2576,0.41,0.1642,20,110,130 +10211,2012-03-06,1,1,3,12,0,2,1,1,0.26,0.2727,0.41,0.1045,9,129,138 +10212,2012-03-06,1,1,3,13,0,2,1,1,0.3,0.303,0.39,0.1343,16,148,164 +10213,2012-03-06,1,1,3,14,0,2,1,1,0.32,0.303,0.36,0.2239,20,116,136 +10214,2012-03-06,1,1,3,15,0,2,1,1,0.34,0.303,0.36,0.2985,24,142,166 +10215,2012-03-06,1,1,3,16,0,2,1,1,0.36,0.3333,0.32,0.2836,22,228,250 +10216,2012-03-06,1,1,3,17,0,2,1,1,0.36,0.3333,0.34,0.2836,21,425,446 +10217,2012-03-06,1,1,3,18,0,2,1,1,0.34,0.303,0.42,0.3284,15,442,457 +10218,2012-03-06,1,1,3,19,0,2,1,1,0.34,0.303,0.46,0.2985,9,278,287 +10219,2012-03-06,1,1,3,20,0,2,1,1,0.32,0.3182,0.49,0.194,3,184,187 +10220,2012-03-06,1,1,3,21,0,2,1,1,0.32,0.3182,0.49,0.194,5,143,148 
+10221,2012-03-06,1,1,3,22,0,2,1,1,0.28,0.2727,0.56,0.1642,5,101,106 +10222,2012-03-06,1,1,3,23,0,2,1,1,0.28,0.2727,0.61,0.2239,3,51,54 +10223,2012-03-07,1,1,3,0,0,3,1,1,0.28,0.2576,0.65,0.2836,3,12,15 +10224,2012-03-07,1,1,3,1,0,3,1,1,0.3,0.2727,0.61,0.2985,1,4,5 +10225,2012-03-07,1,1,3,2,0,3,1,1,0.3,0.2879,0.56,0.2836,0,4,4 +10226,2012-03-07,1,1,3,3,0,3,1,1,0.3,0.2727,0.56,0.3582,1,2,3 +10227,2012-03-07,1,1,3,4,0,3,1,1,0.28,0.2576,0.61,0.3284,0,3,3 +10228,2012-03-07,1,1,3,5,0,3,1,1,0.28,0.2576,0.56,0.2985,0,18,18 +10229,2012-03-07,1,1,3,6,0,3,1,1,0.28,0.2576,0.61,0.3881,4,104,108 +10230,2012-03-07,1,1,3,7,0,3,1,1,0.26,0.2273,0.65,0.3881,12,332,344 +10231,2012-03-07,1,1,3,8,0,3,1,1,0.28,0.2576,0.61,0.2985,12,554,566 +10232,2012-03-07,1,1,3,9,0,3,1,1,0.3,0.2727,0.61,0.3284,12,252,264 +10233,2012-03-07,1,1,3,10,0,3,1,1,0.36,0.3333,0.5,0.2985,20,127,147 +10234,2012-03-07,1,1,3,11,0,3,1,1,0.42,0.4242,0.38,0.2985,33,128,161 +10235,2012-03-07,1,1,3,12,0,3,1,1,0.44,0.4394,0.38,0.3881,40,175,215 +10236,2012-03-07,1,1,3,13,0,3,1,1,0.5,0.4848,0.34,0.4179,26,179,205 +10237,2012-03-07,1,1,3,14,0,3,1,1,0.52,0.5,0.36,0.3284,38,137,175 +10238,2012-03-07,1,1,3,15,0,3,1,1,0.54,0.5152,0.37,0.4478,44,165,209 +10239,2012-03-07,1,1,3,16,0,3,1,1,0.56,0.5303,0.37,0.4179,34,254,288 +10240,2012-03-07,1,1,3,17,0,3,1,1,0.56,0.5303,0.43,0.4925,58,554,612 +10241,2012-03-07,1,1,3,18,0,3,1,1,0.56,0.5303,0.4,0.4627,35,509,544 +10242,2012-03-07,1,1,3,19,0,3,1,1,0.54,0.5152,0.42,0.3582,16,345,361 +10243,2012-03-07,1,1,3,20,0,3,1,1,0.5,0.4848,0.51,0.2537,18,242,260 +10244,2012-03-07,1,1,3,21,0,3,1,1,0.44,0.4394,0.62,0.2537,11,177,188 +10245,2012-03-07,1,1,3,22,0,3,1,1,0.46,0.4545,0.59,0.2985,10,149,159 +10246,2012-03-07,1,1,3,23,0,3,1,1,0.44,0.4394,0.62,0.3284,4,58,62 +10247,2012-03-08,1,1,3,0,0,4,1,1,0.44,0.4394,0.62,0.3284,11,35,46 +10248,2012-03-08,1,1,3,1,0,4,1,1,0.46,0.4545,0.63,0.3284,4,17,21 +10249,2012-03-08,1,1,3,2,0,4,1,1,0.46,0.4545,0.63,0.3881,6,5,11 
+10250,2012-03-08,1,1,3,3,0,4,1,1,0.46,0.4545,0.63,0.3881,0,3,3 +10251,2012-03-08,1,1,3,4,0,4,1,1,0.44,0.4394,0.72,0.2836,0,2,2 +10252,2012-03-08,1,1,3,5,0,4,1,1,0.42,0.4242,0.77,0.2239,1,28,29 +10253,2012-03-08,1,1,3,6,0,4,1,1,0.42,0.4242,0.77,0.2537,4,105,109 +10254,2012-03-08,1,1,3,7,0,4,1,1,0.42,0.4242,0.77,0.3582,8,326,334 +10255,2012-03-08,1,1,3,8,0,4,1,1,0.44,0.4394,0.77,0.3881,12,573,585 +10256,2012-03-08,1,1,3,9,0,4,1,1,0.46,0.4545,0.72,0.4627,19,282,301 +10257,2012-03-08,1,1,3,10,0,4,1,1,0.5,0.4848,0.68,0.4627,19,119,138 +10258,2012-03-08,1,1,3,11,0,4,1,2,0.54,0.5152,0.6,0.4627,48,156,204 +10259,2012-03-08,1,1,3,12,0,4,1,2,0.56,0.5303,0.6,0.4478,27,224,251 +10260,2012-03-08,1,1,3,13,0,4,1,2,0.6,0.6212,0.49,0.6418,35,198,233 +10261,2012-03-08,1,1,3,14,0,4,1,2,0.62,0.6212,0.43,0.6418,48,155,203 +10262,2012-03-08,1,1,3,15,0,4,1,1,0.64,0.6212,0.38,0.6866,24,161,185 +10263,2012-03-08,1,1,3,16,0,4,1,2,0.62,0.6212,0.41,0.6418,37,305,342 +10264,2012-03-08,1,1,3,17,0,4,1,1,0.62,0.6212,0.38,0.6567,52,545,597 +10265,2012-03-08,1,1,3,18,0,4,1,2,0.62,0.6212,0.38,0.5522,45,545,590 +10266,2012-03-08,1,1,3,19,0,4,1,2,0.6,0.6212,0.4,0.4478,21,395,416 +10267,2012-03-08,1,1,3,20,0,4,1,2,0.6,0.6212,0.43,0.3881,20,282,302 +10268,2012-03-08,1,1,3,21,0,4,1,1,0.6,0.6212,0.43,0.3881,27,206,233 +10269,2012-03-08,1,1,3,22,0,4,1,1,0.56,0.5303,0.49,0.3284,12,141,153 +10270,2012-03-08,1,1,3,23,0,4,1,2,0.56,0.5303,0.49,0.4478,6,88,94 +10271,2012-03-09,1,1,3,0,0,5,1,2,0.58,0.5455,0.49,0.4627,3,51,54 +10272,2012-03-09,1,1,3,1,0,5,1,3,0.56,0.5303,0.52,0.4925,4,22,26 +10273,2012-03-09,1,1,3,2,0,5,1,3,0.48,0.4697,0.77,0.4179,2,9,11 +10274,2012-03-09,1,1,3,3,0,5,1,1,0.46,0.4545,0.77,0.5224,0,7,7 +10275,2012-03-09,1,1,3,4,0,5,1,3,0.4,0.4091,0.66,0.2836,0,1,1 +10276,2012-03-09,1,1,3,5,0,5,1,3,0.4,0.4091,0.66,0.2836,2,27,29 +10277,2012-03-09,1,1,3,6,0,5,1,2,0.4,0.4091,0.5,0.3582,2,83,85 +10278,2012-03-09,1,1,3,7,0,5,1,2,0.38,0.3939,0.4,0.2985,6,262,268 
+10279,2012-03-09,1,1,3,8,0,5,1,2,0.34,0.2879,0.42,0.5224,17,484,501 +10280,2012-03-09,1,1,3,9,0,5,1,2,0.34,0.2879,0.42,0.5224,17,267,284 +10281,2012-03-09,1,1,3,10,0,5,1,2,0.34,0.2879,0.36,0.4925,14,145,159 +10282,2012-03-09,1,1,3,11,0,5,1,2,0.36,0.3333,0.34,0.3582,17,170,187 +10283,2012-03-09,1,1,3,12,0,5,1,1,0.36,0.3333,0.34,0.4179,27,174,201 +10284,2012-03-09,1,1,3,13,0,5,1,2,0.38,0.3939,0.29,0.3881,49,175,224 +10285,2012-03-09,1,1,3,14,0,5,1,1,0.42,0.4242,0.28,0.4627,49,129,178 +10286,2012-03-09,1,1,3,15,0,5,1,1,0.46,0.4545,0.24,0.2537,50,188,238 +10287,2012-03-09,1,1,3,16,0,5,1,1,0.48,0.4697,0.23,0.4179,51,292,343 +10288,2012-03-09,1,1,3,17,0,5,1,1,0.46,0.4545,0.24,0.3881,68,498,566 +10289,2012-03-09,1,1,3,18,0,5,1,1,0.44,0.4394,0.26,0.5224,30,440,470 +10290,2012-03-09,1,1,3,19,0,5,1,1,0.42,0.4242,0.28,0.6119,12,232,244 +10291,2012-03-09,1,1,3,20,0,5,1,1,0.38,0.3939,0.32,0.3582,3,156,159 +10292,2012-03-09,1,1,3,21,0,5,1,1,0.36,0.3333,0.34,0.3284,8,133,141 +10293,2012-03-09,1,1,3,22,0,5,1,1,0.34,0.2879,0.31,0.4925,7,100,107 +10294,2012-03-09,1,1,3,23,0,5,1,1,0.32,0.303,0.33,0.2985,9,77,86 +10295,2012-03-10,1,1,3,0,0,6,0,1,0.3,0.2727,0.36,0.3582,9,68,77 +10296,2012-03-10,1,1,3,1,0,6,0,1,0.3,0.2879,0.36,0.2537,1,50,51 +10297,2012-03-10,1,1,3,2,0,6,0,1,0.26,0.2424,0.41,0.2537,12,30,42 +10298,2012-03-10,1,1,3,3,0,6,0,1,0.26,0.2273,0.41,0.2985,0,16,16 +10299,2012-03-10,1,1,3,4,0,6,0,1,0.24,0.2121,0.41,0.3582,1,3,4 +10300,2012-03-10,1,1,3,5,0,6,0,1,0.22,0.2121,0.42,0.2836,3,7,10 +10301,2012-03-10,1,1,3,6,0,6,0,1,0.22,0.2121,0.44,0.2836,1,11,12 +10302,2012-03-10,1,1,3,7,0,6,0,1,0.22,0.2121,0.44,0.2537,4,36,40 +10303,2012-03-10,1,1,3,8,0,6,0,1,0.22,0.2121,0.44,0.2985,15,96,111 +10304,2012-03-10,1,1,3,9,0,6,0,1,0.24,0.2121,0.41,0.3284,21,127,148 +10305,2012-03-10,1,1,3,10,0,6,0,1,0.26,0.2273,0.35,0.4179,47,176,223 +10306,2012-03-10,1,1,3,11,0,6,0,1,0.3,0.2727,0.33,0.3284,56,218,274 +10307,2012-03-10,1,1,3,12,0,6,0,1,0.3,0.2727,0.33,0.3284,88,241,329 
+10308,2012-03-10,1,1,3,13,0,6,0,1,0.32,0.303,0.29,0.2836,89,268,357 +10309,2012-03-10,1,1,3,14,0,6,0,1,0.34,0.3182,0.27,0.2836,117,262,379 +10310,2012-03-10,1,1,3,15,0,6,0,1,0.34,0.3636,0.25,0,132,274,406 +10311,2012-03-10,1,1,3,16,0,6,0,1,0.36,0.3636,0.23,0,115,275,390 +10312,2012-03-10,1,1,3,17,0,6,0,1,0.36,0.3788,0.23,0,104,250,354 +10313,2012-03-10,1,1,3,18,0,6,0,1,0.36,0.3636,0.21,0.0896,67,230,297 +10314,2012-03-10,1,1,3,19,0,6,0,1,0.34,0.3636,0.25,0,25,159,184 +10315,2012-03-10,1,1,3,20,0,6,0,1,0.32,0.3333,0.33,0.1343,20,101,121 +10316,2012-03-10,1,1,3,21,0,6,0,1,0.3,0.2879,0.39,0.2239,16,94,110 +10317,2012-03-10,1,1,3,22,0,6,0,1,0.26,0.2576,0.44,0.1642,19,81,100 +10318,2012-03-10,1,1,3,23,0,6,0,1,0.26,0.2576,0.41,0.194,6,77,83 +10319,2012-03-11,1,1,3,0,0,0,0,1,0.26,0.2879,0.44,0.0896,7,62,69 +10320,2012-03-11,1,1,3,1,0,0,0,1,0.24,0.2424,0.52,0.1642,4,57,61 +10321,2012-03-11,1,1,3,3,0,0,0,1,0.24,0.2424,0.6,0.1343,15,51,66 +10322,2012-03-11,1,1,3,4,0,0,0,1,0.24,0.2424,0.6,0.1343,7,15,22 +10323,2012-03-11,1,1,3,5,0,0,0,1,0.24,0.2424,0.6,0.1642,2,5,7 +10324,2012-03-11,1,1,3,6,0,0,0,1,0.24,0.2424,0.7,0.1642,2,8,10 +10325,2012-03-11,1,1,3,7,0,0,0,1,0.22,0.2273,0.69,0.1343,2,15,17 +10326,2012-03-11,1,1,3,8,0,0,0,1,0.22,0.2273,0.69,0.1343,4,68,72 +10327,2012-03-11,1,1,3,9,0,0,0,1,0.26,0.2576,0.6,0.2239,20,70,90 +10328,2012-03-11,1,1,3,10,0,0,0,1,0.32,0.303,0.49,0.2836,71,147,218 +10329,2012-03-11,1,1,3,11,0,0,0,1,0.36,0.3333,0.43,0.2836,90,209,299 +10330,2012-03-11,1,1,3,12,0,0,0,1,0.4,0.4091,0.37,0.2985,146,264,410 +10331,2012-03-11,1,1,3,13,0,0,0,1,0.42,0.4242,0.41,0.2985,176,288,464 +10332,2012-03-11,1,1,3,14,0,0,0,1,0.46,0.4545,0.31,0.2836,212,289,501 +10333,2012-03-11,1,1,3,15,0,0,0,1,0.5,0.4848,0.29,0.2985,201,286,487 +10334,2012-03-11,1,1,3,16,0,0,0,1,0.5,0.4848,0.31,0.3582,208,301,509 +10335,2012-03-11,1,1,3,17,0,0,0,1,0.52,0.5,0.27,0.2985,199,299,498 +10336,2012-03-11,1,1,3,18,0,0,0,1,0.52,0.5,0.29,0.2836,133,256,389 
+10337,2012-03-11,1,1,3,19,0,0,0,1,0.5,0.4848,0.31,0.2836,55,203,258 +10338,2012-03-11,1,1,3,20,0,0,0,1,0.44,0.4394,0.47,0.2239,42,129,171 +10339,2012-03-11,1,1,3,21,0,0,0,1,0.42,0.4242,0.54,0.2239,37,110,147 +10340,2012-03-11,1,1,3,22,0,0,0,1,0.4,0.4091,0.54,0.194,13,81,94 +10341,2012-03-11,1,1,3,23,0,0,0,1,0.4,0.4091,0.5,0.1642,12,40,52 +10342,2012-03-12,1,1,3,0,0,1,1,1,0.38,0.3939,0.54,0.194,4,20,24 +10343,2012-03-12,1,1,3,1,0,1,1,1,0.38,0.3939,0.5,0.1343,1,9,10 +10344,2012-03-12,1,1,3,2,0,1,1,1,0.38,0.3939,0.54,0.1045,4,5,9 +10345,2012-03-12,1,1,3,3,0,1,1,1,0.36,0.3485,0.5,0.1343,0,2,2 +10346,2012-03-12,1,1,3,4,0,1,1,1,0.34,0.3333,0.61,0.194,0,3,3 +10347,2012-03-12,1,1,3,5,0,1,1,1,0.34,0.3333,0.61,0.194,1,15,16 +10348,2012-03-12,1,1,3,6,0,1,1,1,0.34,0.3485,0.57,0.1045,2,86,88 +10349,2012-03-12,1,1,3,7,0,1,1,1,0.34,0.3485,0.53,0.1045,9,259,268 +10350,2012-03-12,1,1,3,8,0,1,1,1,0.34,0.3333,0.61,0.1343,17,547,564 +10351,2012-03-12,1,1,3,9,0,1,1,1,0.38,0.3939,0.54,0.1642,21,260,281 +10352,2012-03-12,1,1,3,10,0,1,1,1,0.4,0.4091,0.5,0.2239,39,98,137 +10353,2012-03-12,1,1,3,11,0,1,1,1,0.44,0.4394,0.44,0.2239,39,111,150 +10354,2012-03-12,1,1,3,12,0,1,1,1,0.48,0.4697,0.39,0.2239,59,162,221 +10355,2012-03-12,1,1,3,13,0,1,1,2,0.54,0.5152,0.32,0.2537,74,176,250 +10356,2012-03-12,1,1,3,14,0,1,1,2,0.58,0.5455,0.32,0.2836,76,145,221 +10357,2012-03-12,1,1,3,15,0,1,1,1,0.56,0.5303,0.4,0.1343,74,159,233 +10358,2012-03-12,1,1,3,16,0,1,1,1,0.62,0.6212,0.35,0.4478,77,255,332 +10359,2012-03-12,1,1,3,17,0,1,1,2,0.62,0.6212,0.38,0.4179,87,557,644 +10360,2012-03-12,1,1,3,18,0,1,1,2,0.6,0.6212,0.43,0.194,89,623,712 +10361,2012-03-12,1,1,3,19,0,1,1,2,0.56,0.5303,0.49,0.2239,67,379,446 +10362,2012-03-12,1,1,3,20,0,1,1,2,0.56,0.5303,0.49,0.2239,49,237,286 +10363,2012-03-12,1,1,3,21,0,1,1,2,0.54,0.5152,0.56,0.2239,22,183,205 +10364,2012-03-12,1,1,3,22,0,1,1,2,0.56,0.5303,0.56,0.2239,17,116,133 +10365,2012-03-12,1,1,3,23,0,1,1,1,0.56,0.5303,0.56,0.2239,10,53,63 
+10366,2012-03-13,1,1,3,0,0,2,1,2,0.56,0.5303,0.52,0.194,5,21,26 +10367,2012-03-13,1,1,3,1,0,2,1,2,0.52,0.5,0.59,0.2239,2,14,16 +10368,2012-03-13,1,1,3,2,0,2,1,3,0.52,0.5,0.72,0.2985,0,1,1 +10369,2012-03-13,1,1,3,3,0,2,1,3,0.52,0.5,0.72,0.2985,0,2,2 +10370,2012-03-13,1,1,3,4,0,2,1,2,0.46,0.4545,0.82,0.194,0,1,1 +10371,2012-03-13,1,1,3,5,0,2,1,2,0.46,0.4545,0.82,0.194,0,24,24 +10372,2012-03-13,1,1,3,6,0,2,1,3,0.46,0.4545,0.82,0.194,5,108,113 +10373,2012-03-13,1,1,3,7,0,2,1,2,0.46,0.4545,0.82,0.2836,16,292,308 +10374,2012-03-13,1,1,3,8,0,2,1,1,0.46,0.4545,0.82,0.2836,22,571,593 +10375,2012-03-13,1,1,3,9,0,2,1,1,0.48,0.4697,0.82,0.2836,18,324,342 +10376,2012-03-13,1,1,3,10,0,2,1,1,0.52,0.5,0.77,0.2239,28,115,143 +10377,2012-03-13,1,1,3,11,0,2,1,1,0.54,0.5152,0.73,0.2836,64,155,219 +10378,2012-03-13,1,1,3,12,0,2,1,1,0.6,0.6061,0.6,0.2836,47,197,244 +10379,2012-03-13,1,1,3,13,0,2,1,1,0.6,0.6061,0.6,0.3881,53,180,233 +10380,2012-03-13,1,1,3,14,0,2,1,1,0.64,0.6212,0.53,0.3284,52,160,212 +10381,2012-03-13,1,1,3,15,0,2,1,1,0.7,0.6364,0.39,0.2239,68,196,264 +10382,2012-03-13,1,1,3,16,0,2,1,1,0.72,0.6515,0.34,0.3881,53,312,365 +10383,2012-03-13,1,1,3,17,0,2,1,1,0.7,0.6364,0.37,0.1045,62,614,676 +10384,2012-03-13,1,1,3,18,0,2,1,1,0.7,0.6364,0.34,0.2985,96,638,734 +10385,2012-03-13,1,1,3,19,0,2,1,1,0.64,0.6212,0.47,0.2239,50,429,479 +10386,2012-03-13,1,1,3,20,0,2,1,1,0.6,0.6212,0.49,0.1642,45,306,351 +10387,2012-03-13,1,1,3,21,0,2,1,1,0.58,0.5455,0.56,0.1045,44,200,244 +10388,2012-03-13,1,1,3,22,0,2,1,1,0.56,0.5303,0.6,0.0896,24,146,170 +10389,2012-03-13,1,1,3,23,0,2,1,1,0.56,0.5303,0.56,0.1343,8,79,87 +10390,2012-03-14,1,1,3,0,0,3,1,1,0.54,0.5152,0.6,0.1045,5,34,39 +10391,2012-03-14,1,1,3,1,0,3,1,1,0.52,0.5,0.63,0.0896,2,25,27 +10392,2012-03-14,1,1,3,2,0,3,1,1,0.5,0.4848,0.68,0.194,0,2,2 +10393,2012-03-14,1,1,3,3,0,3,1,1,0.48,0.4697,0.72,0.194,1,3,4 +10394,2012-03-14,1,1,3,4,0,3,1,1,0.48,0.4697,0.67,0.0896,1,4,5 +10395,2012-03-14,1,1,3,5,0,3,1,1,0.44,0.4394,0.82,0.1343,2,25,27 
+10396,2012-03-14,1,1,3,6,0,3,1,1,0.44,0.4394,0.82,0.0896,1,120,121 +10397,2012-03-14,1,1,3,7,0,3,1,1,0.44,0.4394,0.82,0.1045,20,348,368 +10398,2012-03-14,1,1,3,8,0,3,1,1,0.44,0.4394,0.82,0,34,628,662 +10399,2012-03-14,1,1,3,9,0,3,1,1,0.52,0.5,0.68,0,26,325,351 +10400,2012-03-14,1,1,3,10,0,3,1,1,0.56,0.5303,0.56,0.1045,38,150,188 +10401,2012-03-14,1,1,3,11,0,3,1,2,0.62,0.6212,0.41,0.1642,65,155,220 +10402,2012-03-14,1,1,3,12,0,3,1,2,0.64,0.6212,0.29,0,55,212,267 +10403,2012-03-14,1,1,3,13,0,3,1,1,0.66,0.6212,0.27,0.2239,57,197,254 +10404,2012-03-14,1,1,3,14,0,3,1,1,0.7,0.6364,0.23,0.3284,61,163,224 +10405,2012-03-14,1,1,3,15,0,3,1,1,0.7,0.6364,0.24,0,86,197,283 +10406,2012-03-14,1,1,3,16,0,3,1,1,0.72,0.6364,0.25,0.194,78,278,356 +10407,2012-03-14,1,1,3,17,0,3,1,1,0.7,0.6364,0.28,0.0896,140,642,782 +10408,2012-03-14,1,1,3,18,0,3,1,1,0.7,0.6364,0.32,0,102,647,749 +10409,2012-03-14,1,1,3,19,0,3,1,1,0.64,0.6212,0.33,0.1642,70,402,472 +10410,2012-03-14,1,1,3,20,0,3,1,1,0.62,0.6212,0.35,0.1045,44,286,330 +10411,2012-03-14,1,1,3,21,0,3,1,1,0.6,0.6212,0.4,0.0896,47,241,288 +10412,2012-03-14,1,1,3,22,0,3,1,1,0.52,0.5,0.55,0.1343,43,159,202 +10413,2012-03-14,1,1,3,23,0,3,1,1,0.56,0.5303,0.43,0.1642,19,72,91 +10414,2012-03-15,1,1,3,0,0,4,1,1,0.54,0.5152,0.49,0.1343,14,46,60 +10415,2012-03-15,1,1,3,1,0,4,1,1,0.5,0.4848,0.59,0.1045,15,8,23 +10416,2012-03-15,1,1,3,2,0,4,1,1,0.5,0.4848,0.59,0,14,5,19 +10417,2012-03-15,1,1,3,3,0,4,1,1,0.5,0.4848,0.63,0,0,7,7 +10418,2012-03-15,1,1,3,4,0,4,1,1,0.44,0.4394,0.77,0.1045,11,3,14 +10419,2012-03-15,1,1,3,5,0,4,1,1,0.46,0.4545,0.67,0,2,24,26 +10420,2012-03-15,1,1,3,6,0,4,1,1,0.44,0.4394,0.72,0.0896,4,113,117 +10421,2012-03-15,1,1,3,7,0,4,1,1,0.44,0.4394,0.72,0.0896,14,367,381 +10422,2012-03-15,1,1,3,8,0,4,1,1,0.44,0.4394,0.77,0.1045,21,602,623 +10423,2012-03-15,1,1,3,9,0,4,1,1,0.48,0.4697,0.77,0.0896,30,285,315 +10424,2012-03-15,1,1,3,10,0,4,1,1,0.52,0.5,0.68,0.1045,34,130,164 
+10425,2012-03-15,1,1,3,11,0,4,1,1,0.56,0.5303,0.6,0.1343,60,151,211 +10426,2012-03-15,1,1,3,12,0,4,1,2,0.62,0.6212,0.5,0.1343,59,206,265 +10427,2012-03-15,1,1,3,13,0,4,1,2,0.66,0.6212,0.41,0.0896,62,211,273 +10428,2012-03-15,1,1,3,14,0,4,1,2,0.72,0.6515,0.3,0.1045,81,177,258 +10429,2012-03-15,1,1,3,15,0,4,1,1,0.72,0.6515,0.32,0.2239,100,187,287 +10430,2012-03-15,1,1,3,16,0,4,1,1,0.72,0.6515,0.37,0.3881,95,331,426 +10431,2012-03-15,1,1,3,17,0,4,1,1,0.7,0.6364,0.39,0.2537,79,634,713 +10432,2012-03-15,1,1,3,18,0,4,1,1,0.66,0.6212,0.44,0.2836,98,648,746 +10433,2012-03-15,1,1,3,19,0,4,1,2,0.64,0.6212,0.5,0.194,72,353,425 +10434,2012-03-15,1,1,3,20,0,4,1,2,0.58,0.5455,0.6,0.3582,60,270,330 +10435,2012-03-15,1,1,3,21,0,4,1,1,0.54,0.5152,0.68,0.2836,36,207,243 +10436,2012-03-15,1,1,3,22,0,4,1,1,0.52,0.5,0.68,0.1343,32,137,169 +10437,2012-03-15,1,1,3,23,0,4,1,1,0.48,0.4697,0.72,0.194,12,85,97 +10438,2012-03-16,1,1,3,0,0,5,1,1,0.44,0.4394,0.77,0.2537,8,49,57 +10439,2012-03-16,1,1,3,1,0,5,1,2,0.42,0.4242,0.82,0.3284,4,22,26 +10440,2012-03-16,1,1,3,2,0,5,1,2,0.42,0.4242,0.82,0.194,0,4,4 +10441,2012-03-16,1,1,3,3,0,5,1,2,0.4,0.4091,0.87,0.2836,0,3,3 +10442,2012-03-16,1,1,3,4,0,5,1,2,0.4,0.4091,0.87,0.2537,0,3,3 +10443,2012-03-16,1,1,3,5,0,5,1,2,0.4,0.4091,0.87,0.2239,2,30,32 +10444,2012-03-16,1,1,3,6,0,5,1,2,0.4,0.4091,0.87,0.1642,3,96,99 +10445,2012-03-16,1,1,3,7,0,5,1,2,0.4,0.4091,0.87,0.194,13,265,278 +10446,2012-03-16,1,1,3,8,0,5,1,2,0.4,0.4091,0.87,0.2239,28,534,562 +10447,2012-03-16,1,1,3,9,0,5,1,2,0.42,0.4242,0.82,0,35,277,312 +10448,2012-03-16,1,1,3,10,0,5,1,2,0.4,0.4091,0.87,0.0896,44,136,180 +10449,2012-03-16,1,1,3,11,0,5,1,2,0.44,0.4394,0.77,0,40,167,207 +10450,2012-03-16,1,1,3,12,0,5,1,2,0.44,0.4394,0.82,0.1343,72,222,294 +10451,2012-03-16,1,1,3,13,0,5,1,2,0.46,0.4545,0.77,0.0896,52,208,260 +10452,2012-03-16,1,1,3,14,0,5,1,2,0.48,0.4697,0.72,0,56,146,202 +10453,2012-03-16,1,1,3,15,0,5,1,3,0.46,0.4545,0.82,0.1045,19,88,107 
+10454,2012-03-16,1,1,3,16,0,5,1,3,0.48,0.4697,0.77,0,18,111,129 +10455,2012-03-16,1,1,3,17,0,5,1,3,0.48,0.4697,0.82,0.0896,23,235,258 +10456,2012-03-16,1,1,3,18,0,5,1,3,0.48,0.4697,0.82,0.0896,31,377,408 +10457,2012-03-16,1,1,3,19,0,5,1,2,0.46,0.4545,0.88,0,23,273,296 +10458,2012-03-16,1,1,3,20,0,5,1,1,0.46,0.4545,0.88,0,31,204,235 +10459,2012-03-16,1,1,3,21,0,5,1,2,0.44,0.4394,0.94,0,8,144,152 +10460,2012-03-16,1,1,3,22,0,5,1,2,0.44,0.4394,0.94,0,16,132,148 +10461,2012-03-16,1,1,3,23,0,5,1,2,0.44,0.4394,0.94,0,22,104,126 +10462,2012-03-17,1,1,3,0,0,6,0,2,0.44,0.4394,0.94,0.0896,13,87,100 +10463,2012-03-17,1,1,3,1,0,6,0,2,0.44,0.4394,0.94,0,12,57,69 +10464,2012-03-17,1,1,3,2,0,6,0,2,0.44,0.4394,0.88,0,10,32,42 +10465,2012-03-17,1,1,3,3,0,6,0,2,0.44,0.4394,0.88,0,2,24,26 +10466,2012-03-17,1,1,3,4,0,6,0,2,0.42,0.4242,0.94,0,0,2,2 +10467,2012-03-17,1,1,3,5,0,6,0,2,0.42,0.4242,0.94,0.0896,5,3,8 +10468,2012-03-17,1,1,3,6,0,6,0,2,0.42,0.4242,0.94,0.194,1,29,30 +10469,2012-03-17,1,1,3,7,0,6,0,2,0.4,0.4091,1,0.1343,29,57,86 +10470,2012-03-17,1,1,3,8,0,6,0,2,0.42,0.4242,0.94,0.1045,63,155,218 +10471,2012-03-17,1,1,3,9,0,6,0,2,0.44,0.4394,0.88,0.0896,104,217,321 +10472,2012-03-17,1,1,3,10,0,6,0,2,0.5,0.4848,0.77,0.0896,140,303,443 +10473,2012-03-17,1,1,3,11,0,6,0,2,0.52,0.5,0.77,0.1343,226,359,585 +10474,2012-03-17,1,1,3,12,0,6,0,1,0.56,0.5303,0.68,0.1642,286,365,651 +10475,2012-03-17,1,1,3,13,0,6,0,1,0.6,0.6061,0.6,0.1045,286,400,686 +10476,2012-03-17,1,1,3,14,0,6,0,1,0.62,0.6212,0.53,0.0896,352,338,690 +10477,2012-03-17,1,1,3,15,0,6,0,1,0.64,0.6212,0.53,0.1343,357,322,679 +10478,2012-03-17,1,1,3,16,0,6,0,1,0.64,0.6212,0.5,0,367,318,685 +10479,2012-03-17,1,1,3,17,0,6,0,1,0.64,0.6212,0.5,0.1343,291,357,648 +10480,2012-03-17,1,1,3,18,0,6,0,1,0.62,0.6212,0.57,0.2985,221,339,560 +10481,2012-03-17,1,1,3,19,0,6,0,1,0.58,0.5455,0.64,0.2836,155,262,417 +10482,2012-03-17,1,1,3,20,0,6,0,1,0.56,0.5303,0.64,0.194,89,182,271 
+10483,2012-03-17,1,1,3,21,0,6,0,1,0.54,0.5152,0.68,0.1642,54,169,223 +10484,2012-03-17,1,1,3,22,0,6,0,1,0.54,0.5152,0.68,0,58,153,211 +10485,2012-03-17,1,1,3,23,0,6,0,1,0.5,0.4848,0.77,0.1642,34,151,185 +10486,2012-03-18,1,1,3,0,0,0,0,1,0.46,0.4545,0.88,0.194,27,80,107 +10487,2012-03-18,1,1,3,1,0,0,0,1,0.46,0.4545,0.82,0.1343,25,88,113 +10488,2012-03-18,1,1,3,2,0,0,0,2,0.46,0.4545,0.82,0.1045,15,41,56 +10489,2012-03-18,1,1,3,3,0,0,0,2,0.44,0.4394,0.88,0.1343,3,15,18 +10490,2012-03-18,1,1,3,4,0,0,0,2,0.42,0.4242,0.94,0.1642,6,8,14 +10491,2012-03-18,1,1,3,5,0,0,0,2,0.4,0.4091,0.94,0.1045,0,6,6 +10492,2012-03-18,1,1,3,6,0,0,0,2,0.4,0.4091,0.94,0.1045,2,9,11 +10493,2012-03-18,1,1,3,7,0,0,0,3,0.42,0.4242,0.88,0.1642,17,25,42 +10494,2012-03-18,1,1,3,8,0,0,0,2,0.42,0.4242,0.88,0.1045,25,71,96 +10495,2012-03-18,1,1,3,9,0,0,0,2,0.42,0.4242,0.88,0.0896,65,113,178 +10496,2012-03-18,1,1,3,10,0,0,0,2,0.42,0.4242,0.88,0.1045,139,212,351 +10497,2012-03-18,1,1,3,11,0,0,0,2,0.44,0.4394,0.82,0.1642,129,239,368 +10498,2012-03-18,1,1,3,12,0,0,0,2,0.44,0.4394,0.88,0.1343,222,281,503 +10499,2012-03-18,1,1,3,13,0,0,0,2,0.46,0.4545,0.82,0.1642,198,346,544 +10500,2012-03-18,1,1,3,14,0,0,0,2,0.5,0.4848,0.77,0.0896,218,303,521 +10501,2012-03-18,1,1,3,15,0,0,0,1,0.54,0.5152,0.68,0.0896,240,314,554 +10502,2012-03-18,1,1,3,16,0,0,0,1,0.54,0.5152,0.73,0.0896,229,312,541 +10503,2012-03-18,1,1,3,17,0,0,0,1,0.56,0.5303,0.64,0.0896,233,308,541 +10504,2012-03-18,1,1,3,18,0,0,0,1,0.56,0.5303,0.64,0.1642,165,294,459 +10505,2012-03-18,1,1,3,19,0,0,0,1,0.56,0.5303,0.64,0.1642,118,234,352 +10506,2012-03-18,1,1,3,20,0,0,0,1,0.52,0.5,0.77,0.1343,59,139,198 +10507,2012-03-18,1,1,3,21,0,0,0,1,0.52,0.5,0.72,0.0896,44,129,173 +10508,2012-03-18,1,1,3,22,0,0,0,1,0.5,0.4848,0.77,0.1045,21,79,100 +10509,2012-03-18,1,1,3,23,0,0,0,1,0.48,0.4697,0.82,0.1642,7,39,46 +10510,2012-03-19,1,1,3,0,0,1,1,1,0.48,0.4697,0.82,0.1045,4,19,23 +10511,2012-03-19,1,1,3,1,0,1,1,1,0.46,0.4545,0.88,0.1045,0,15,15 
+10512,2012-03-19,1,1,3,2,0,1,1,1,0.46,0.4545,0.88,0.1642,1,7,8 +10513,2012-03-19,1,1,3,3,0,1,1,1,0.46,0.4545,0.88,0.0896,0,2,2 +10514,2012-03-19,1,1,3,4,0,1,1,1,0.44,0.4394,0.94,0,0,3,3 +10515,2012-03-19,1,1,3,5,0,1,1,1,0.44,0.4394,0.94,0,0,31,31 +10516,2012-03-19,1,1,3,6,0,1,1,1,0.46,0.4545,0.88,0.1343,2,118,120 +10517,2012-03-19,1,1,3,7,0,1,1,1,0.46,0.4545,0.88,0.194,25,329,354 +10518,2012-03-19,1,1,3,8,0,1,1,1,0.46,0.4545,0.88,0.1045,16,563,579 +10519,2012-03-19,1,1,3,9,0,1,1,1,0.5,0.4848,0.72,0.1642,55,276,331 +10520,2012-03-19,1,1,3,10,0,1,1,1,0.52,0.5,0.72,0.1343,56,128,184 +10521,2012-03-19,1,1,3,11,0,1,1,1,0.56,0.5303,0.68,0.1642,56,145,201 +10522,2012-03-19,1,1,3,12,0,1,1,1,0.6,0.6061,0.64,0.2537,71,211,282 +10523,2012-03-19,1,1,3,13,0,1,1,1,0.62,0.6061,0.61,0.2537,69,194,263 +10524,2012-03-19,1,1,3,14,0,1,1,2,0.64,0.6212,0.57,0.194,60,200,260 +10525,2012-03-19,1,1,3,15,0,1,1,1,0.64,0.6212,0.57,0.2239,97,189,286 +10526,2012-03-19,1,1,3,16,0,1,1,1,0.66,0.6212,0.5,0.2239,65,320,385 +10527,2012-03-19,1,1,3,17,0,1,1,1,0.64,0.6212,0.53,0.2239,106,615,721 +10528,2012-03-19,1,1,3,18,0,1,1,1,0.64,0.6212,0.57,0.2537,120,681,801 +10529,2012-03-19,1,1,3,19,0,1,1,1,0.62,0.6061,0.61,0.2239,86,463,549 +10530,2012-03-19,1,1,3,20,0,1,1,1,0.6,0.6061,0.64,0.2239,34,296,330 +10531,2012-03-19,1,1,3,21,0,1,1,1,0.6,0.6061,0.64,0.1642,33,190,223 +10532,2012-03-19,1,1,3,22,0,1,1,2,0.56,0.5303,0.73,0.1642,17,131,148 +10533,2012-03-19,1,1,3,23,0,1,1,2,0.56,0.5303,0.78,0.1343,9,45,54 +10534,2012-03-20,1,1,3,0,0,2,1,2,0.56,0.5303,0.78,0.0896,5,24,29 +10535,2012-03-20,1,1,3,1,0,2,1,1,0.54,0.5152,0.88,0,6,9,15 +10536,2012-03-20,1,1,3,2,0,2,1,1,0.54,0.5152,0.88,0.0896,2,8,10 +10537,2012-03-20,1,1,3,3,0,2,1,2,0.54,0.5152,0.88,0,0,3,3 +10538,2012-03-20,1,1,3,4,0,2,1,2,0.52,0.5,0.88,0.1343,0,6,6 +10539,2012-03-20,1,1,3,5,0,2,1,2,0.52,0.5,0.94,0,0,20,20 +10540,2012-03-20,1,1,3,6,0,2,1,2,0.52,0.5,0.94,0.1642,6,94,100 +10541,2012-03-20,1,1,3,7,0,2,1,3,0.52,0.5,0.94,0.2239,3,167,170 
+10542,2012-03-20,1,1,3,8,0,2,1,2,0.52,0.5,0.88,0.1045,28,488,516 +10543,2012-03-20,1,1,3,9,0,2,1,2,0.54,0.5152,0.88,0.1642,41,284,325 +10544,2012-03-20,1,1,3,10,0,2,1,1,0.54,0.5152,0.88,0.194,44,119,163 +10545,2012-03-20,1,1,3,11,0,2,1,1,0.58,0.5455,0.83,0.1343,74,156,230 +10546,2012-03-20,1,1,3,12,0,2,1,1,0.6,0.5758,0.78,0.2537,56,205,261 +10547,2012-03-20,1,1,3,13,0,2,1,1,0.6,0.5758,0.78,0.194,77,207,284 +10548,2012-03-20,1,1,3,14,0,2,1,2,0.6,0.5909,0.73,0.1642,66,182,248 +10549,2012-03-20,1,1,3,15,0,2,1,2,0.62,0.6061,0.69,0.1045,67,177,244 +10550,2012-03-20,1,1,3,16,0,2,1,1,0.62,0.6061,0.66,0.2239,99,332,431 +10551,2012-03-20,1,1,3,17,0,2,1,1,0.6,0.5909,0.73,0.194,108,642,750 +10552,2012-03-20,1,1,3,18,0,2,1,1,0.6,0.5909,0.69,0.2537,136,665,801 +10553,2012-03-20,1,1,3,19,0,2,1,1,0.58,0.5455,0.73,0.1343,75,480,555 +10554,2012-03-20,1,1,3,20,0,2,1,1,0.58,0.5455,0.68,0.0896,78,299,377 +10555,2012-03-20,1,1,3,21,0,2,1,1,0.56,0.5303,0.73,0,38,239,277 +10556,2012-03-20,1,1,3,22,0,2,1,1,0.54,0.5152,0.77,0,32,156,188 +10557,2012-03-20,1,1,3,23,0,2,1,1,0.52,0.5,0.83,0,10,80,90 +10558,2012-03-21,2,1,3,0,0,3,1,1,0.52,0.5,0.88,0,4,29,33 +10559,2012-03-21,2,1,3,1,0,3,1,1,0.52,0.5,0.83,0.0896,4,22,26 +10560,2012-03-21,2,1,3,2,0,3,1,1,0.5,0.4848,0.88,0.1343,2,8,10 +10561,2012-03-21,2,1,3,3,0,3,1,2,0.5,0.4848,0.88,0.194,1,7,8 +10562,2012-03-21,2,1,3,4,0,3,1,2,0.5,0.4848,0.88,0.1343,0,4,4 +10563,2012-03-21,2,1,3,5,0,3,1,2,0.5,0.4848,0.88,0.1343,4,35,39 +10564,2012-03-21,2,1,3,6,0,3,1,2,0.48,0.4697,0.94,0.2239,10,139,149 +10565,2012-03-21,2,1,3,7,0,3,1,3,0.48,0.4697,0.94,0.194,34,338,372 +10566,2012-03-21,2,1,3,8,0,3,1,3,0.48,0.4697,0.94,0.1045,33,502,535 +10567,2012-03-21,2,1,3,9,0,3,1,2,0.5,0.4848,0.88,0.1642,38,255,293 +10568,2012-03-21,2,1,3,10,0,3,1,2,0.52,0.5,0.83,0.1642,30,124,154 +10569,2012-03-21,2,1,3,11,0,3,1,2,0.52,0.5,0.83,0.1642,38,154,192 +10570,2012-03-21,2,1,3,12,0,3,1,2,0.54,0.5152,0.77,0,58,171,229 
+10571,2012-03-21,2,1,3,13,0,3,1,2,0.54,0.5152,0.83,0,81,222,303 +10572,2012-03-21,2,1,3,14,0,3,1,2,0.56,0.5303,0.78,0,74,151,225 +10573,2012-03-21,2,1,3,15,0,3,1,2,0.56,0.5303,0.78,0,68,175,243 +10574,2012-03-21,2,1,3,16,0,3,1,1,0.58,0.5455,0.73,0,91,287,378 +10575,2012-03-21,2,1,3,17,0,3,1,1,0.6,0.5909,0.69,0.0896,113,616,729 +10576,2012-03-21,2,1,3,18,0,3,1,1,0.6,0.5909,0.69,0,152,627,779 +10577,2012-03-21,2,1,3,19,0,3,1,1,0.6,0.6061,0.64,0,86,496,582 +10578,2012-03-21,2,1,3,20,0,3,1,1,0.56,0.5303,0.78,0.0896,76,298,374 +10579,2012-03-21,2,1,3,21,0,3,1,1,0.54,0.5152,0.83,0.1045,47,204,251 +10580,2012-03-21,2,1,3,22,0,3,1,1,0.52,0.5,0.83,0.1642,41,156,197 +10581,2012-03-21,2,1,3,23,0,3,1,1,0.54,0.5152,0.77,0,37,88,125 +10582,2012-03-22,2,1,3,0,0,4,1,1,0.52,0.5,0.83,0.0896,9,32,41 +10583,2012-03-22,2,1,3,1,0,4,1,1,0.52,0.5,0.83,0.0896,14,16,30 +10584,2012-03-22,2,1,3,2,0,4,1,1,0.52,0.5,0.83,0.0896,1,5,6 +10585,2012-03-22,2,1,3,3,0,4,1,1,0.52,0.5,0.83,0.0896,0,7,7 +10586,2012-03-22,2,1,3,4,0,4,1,2,0.48,0.4697,1,0.0896,0,6,6 +10587,2012-03-22,2,1,3,5,0,4,1,2,0.48,0.4697,1,0.0896,2,32,34 +10588,2012-03-22,2,1,3,6,0,4,1,2,0.48,0.4697,1,0.0896,10,126,136 +10589,2012-03-22,2,1,3,7,0,4,1,2,0.48,0.4697,1,0.0896,29,332,361 +10590,2012-03-22,2,1,3,8,0,4,1,2,0.5,0.4848,0.94,0.0896,51,598,649 +10591,2012-03-22,2,1,3,9,0,4,1,2,0.5,0.4848,0.94,0.0896,41,277,318 +10592,2012-03-22,2,1,3,10,0,4,1,2,0.5,0.4848,1,0.0896,32,110,142 +10593,2012-03-22,2,1,3,11,0,4,1,2,0.52,0.5,0.94,0,53,166,219 +10594,2012-03-22,2,1,3,12,0,4,1,2,0.54,0.5152,0.88,0.194,48,224,272 +10595,2012-03-22,2,1,3,13,0,4,1,1,0.58,0.5455,0.78,0.0896,83,215,298 +10596,2012-03-22,2,1,3,14,0,4,1,1,0.6,0.5909,0.73,0.1045,96,161,257 +10597,2012-03-22,2,1,3,15,0,4,1,1,0.62,0.6061,0.69,0.194,102,202,304 +10598,2012-03-22,2,1,3,16,0,4,1,1,0.64,0.6061,0.65,0.194,125,300,425 +10599,2012-03-22,2,1,3,17,0,4,1,1,0.66,0.6212,0.65,0.1642,154,656,810 +10600,2012-03-22,2,1,3,18,0,4,1,1,0.66,0.6212,0.65,0.1642,147,654,801 
+10601,2012-03-22,2,1,3,19,0,4,1,1,0.66,0.6212,0.61,0,117,469,586 +10602,2012-03-22,2,1,3,20,0,4,1,1,0.62,0.5909,0.73,0.2836,64,360,424 +10603,2012-03-22,2,1,3,21,0,4,1,1,0.58,0.5455,0.78,0.1642,66,308,374 +10604,2012-03-22,2,1,3,22,0,4,1,1,0.56,0.5303,0.83,0.194,56,164,220 +10605,2012-03-22,2,1,3,23,0,4,1,1,0.56,0.5303,0.83,0.0896,34,117,151 +10606,2012-03-23,2,1,3,0,0,5,1,1,0.56,0.5303,0.83,0,30,65,95 +10607,2012-03-23,2,1,3,1,0,5,1,1,0.54,0.5152,0.88,0,18,32,50 +10608,2012-03-23,2,1,3,2,0,5,1,1,0.54,0.5152,0.88,0.0896,12,20,32 +10609,2012-03-23,2,1,3,3,0,5,1,1,0.52,0.5,0.88,0.1045,4,6,10 +10610,2012-03-23,2,1,3,4,0,5,1,2,0.5,0.4848,0.94,0.1045,0,3,3 +10611,2012-03-23,2,1,3,5,0,5,1,2,0.5,0.4848,0.94,0,5,29,34 +10612,2012-03-23,2,1,3,6,0,5,1,2,0.5,0.4848,0.88,0,6,110,116 +10613,2012-03-23,2,1,3,7,0,5,1,2,0.5,0.4848,0.93,0.1343,28,318,346 +10614,2012-03-23,2,1,3,8,0,5,1,2,0.5,0.4848,0.94,0.1343,47,615,662 +10615,2012-03-23,2,1,3,9,0,5,1,2,0.52,0.5,0.9,0.0896,75,305,380 +10616,2012-03-23,2,1,3,10,0,5,1,2,0.56,0.5303,0.88,0.1045,125,150,275 +10617,2012-03-23,2,1,3,11,0,5,1,2,0.62,0.5909,0.73,0.1045,131,187,318 +10618,2012-03-23,2,1,3,12,0,5,1,2,0.66,0.6212,0.61,0.2239,199,272,471 +10619,2012-03-23,2,1,3,13,0,5,1,2,0.7,0.6515,0.48,0,172,256,428 +10620,2012-03-23,2,1,3,14,0,5,1,2,0.72,0.6515,0.42,0.1045,208,224,432 +10621,2012-03-23,2,1,3,15,0,5,1,2,0.72,0.6515,0.42,0.1343,191,281,472 +10622,2012-03-23,2,1,3,16,0,5,1,2,0.72,0.6515,0.42,0.1642,219,370,589 +10623,2012-03-23,2,1,3,17,0,5,1,2,0.72,0.6515,0.42,0.1642,264,693,957 +10624,2012-03-23,2,1,3,18,0,5,1,1,0.7,0.6364,0.45,0.1642,237,593,830 +10625,2012-03-23,2,1,3,19,0,5,1,1,0.66,0.6212,0.5,0.194,213,473,686 +10626,2012-03-23,2,1,3,20,0,5,1,1,0.66,0.6212,0.47,0.1343,117,328,445 +10627,2012-03-23,2,1,3,21,0,5,1,1,0.62,0.6212,0.53,0.1045,64,220,284 +10628,2012-03-23,2,1,3,22,0,5,1,1,0.6,0.6061,0.64,0.2836,53,218,271 +10629,2012-03-23,2,1,3,23,0,5,1,1,0.6,0.5909,0.69,0.2537,51,125,176 
+10630,2012-03-24,2,1,3,0,0,6,0,1,0.58,0.5455,0.68,0,45,111,156 +10631,2012-03-24,2,1,3,1,0,6,0,1,0.56,0.5303,0.73,0.0896,20,108,128 +10632,2012-03-24,2,1,3,2,0,6,0,1,0.54,0.5152,0.77,0.1045,14,55,69 +10633,2012-03-24,2,1,3,3,0,6,0,1,0.54,0.5152,0.77,0.1045,10,22,32 +10634,2012-03-24,2,1,3,4,0,6,0,1,0.52,0.5,0.83,0.0896,1,6,7 +10635,2012-03-24,2,1,3,5,0,6,0,2,0.52,0.5,0.83,0.1642,0,4,4 +10636,2012-03-24,2,1,3,6,0,6,0,2,0.52,0.5,0.83,0.1642,4,24,28 +10637,2012-03-24,2,1,3,7,0,6,0,2,0.5,0.4848,0.88,0.1343,25,45,70 +10638,2012-03-24,2,1,3,8,0,6,0,2,0.5,0.4848,0.94,0.194,41,113,154 +10639,2012-03-24,2,1,3,9,0,6,0,2,0.52,0.5,0.83,0.1343,96,153,249 +10640,2012-03-24,2,1,3,10,0,6,0,2,0.52,0.5,0.88,0.194,148,197,345 +10641,2012-03-24,2,1,3,11,0,6,0,2,0.5,0.4848,0.94,0.194,98,175,273 +10642,2012-03-24,2,1,3,12,0,6,0,3,0.52,0.5,0.94,0.2239,61,122,183 +10643,2012-03-24,2,1,3,13,0,6,0,3,0.52,0.5,0.94,0.2537,62,134,196 +10644,2012-03-24,2,1,3,14,0,6,0,3,0.52,0.5,0.88,0.3582,63,160,223 +10645,2012-03-24,2,1,3,15,0,6,0,3,0.5,0.4848,0.94,0.4179,118,172,290 +10646,2012-03-24,2,1,3,16,0,6,0,3,0.5,0.4848,0.94,0.2985,49,128,177 +10647,2012-03-24,2,1,3,17,0,6,0,3,0.5,0.4848,0.88,0.2239,47,68,115 +10648,2012-03-24,2,1,3,18,0,6,0,3,0.46,0.4545,0.94,0.2239,27,110,137 +10649,2012-03-24,2,1,3,19,0,6,0,3,0.46,0.4545,0.94,0.1343,28,105,133 +10650,2012-03-24,2,1,3,20,0,6,0,3,0.44,0.4394,1,0.194,31,80,111 +10651,2012-03-24,2,1,3,21,0,6,0,3,0.44,0.4394,1,0.194,16,88,104 +10652,2012-03-24,2,1,3,22,0,6,0,3,0.44,0.4394,1,0.2537,12,71,83 +10653,2012-03-24,2,1,3,23,0,6,0,2,0.44,0.4394,0.94,0.2836,17,88,105 +10654,2012-03-25,2,1,3,0,0,0,0,2,0.44,0.4394,0.94,0.2836,18,62,80 +10655,2012-03-25,2,1,3,1,0,0,0,2,0.42,0.4242,1,0.2537,24,65,89 +10656,2012-03-25,2,1,3,2,0,0,0,3,0.42,0.4242,1,0.2985,6,29,35 +10657,2012-03-25,2,1,3,3,0,0,0,3,0.42,0.4242,0.94,0.2985,8,10,18 +10658,2012-03-25,2,1,3,4,0,0,0,2,0.42,0.4242,0.94,0.3284,1,7,8 +10659,2012-03-25,2,1,3,5,0,0,0,2,0.4,0.4091,1,0.2537,0,6,6 
+10660,2012-03-25,2,1,3,6,0,0,0,2,0.4,0.4091,1,0.2537,5,13,18 +10661,2012-03-25,2,1,3,7,0,0,0,2,0.4,0.4091,0.94,0.3284,14,25,39 +10662,2012-03-25,2,1,3,8,0,0,0,2,0.4,0.4091,0.94,0.3284,21,68,89 +10663,2012-03-25,2,1,3,9,0,0,0,2,0.4,0.4091,0.94,0.2537,26,92,118 +10664,2012-03-25,2,1,3,10,0,0,0,2,0.4,0.4091,0.87,0.3284,78,172,250 +10665,2012-03-25,2,1,3,11,0,0,0,2,0.4,0.4091,0.94,0.2537,106,217,323 +10666,2012-03-25,2,1,3,12,0,0,0,2,0.4,0.4091,0.94,0.2239,122,238,360 +10667,2012-03-25,2,1,3,13,0,0,0,2,0.42,0.4242,0.88,0.1642,110,257,367 +10668,2012-03-25,2,1,3,14,0,0,0,2,0.44,0.4394,0.88,0.1642,123,291,414 +10669,2012-03-25,2,1,3,15,0,0,0,2,0.44,0.4394,0.88,0.194,139,282,421 +10670,2012-03-25,2,1,3,16,0,0,0,1,0.48,0.4697,0.77,0,153,339,492 +10671,2012-03-25,2,1,3,17,0,0,0,1,0.5,0.4848,0.72,0.0896,146,273,419 +10672,2012-03-25,2,1,3,18,0,0,0,1,0.5,0.4848,0.72,0.0896,202,289,491 +10673,2012-03-25,2,1,3,19,0,0,0,1,0.5,0.4848,0.72,0.2537,114,261,375 +10674,2012-03-25,2,1,3,20,0,0,0,1,0.48,0.4697,0.77,0.2239,63,196,259 +10675,2012-03-25,2,1,3,21,0,0,0,1,0.48,0.4697,0.77,0.1642,25,124,149 +10676,2012-03-25,2,1,3,22,0,0,0,1,0.46,0.4545,0.82,0.1343,19,92,111 +10677,2012-03-25,2,1,3,23,0,0,0,1,0.48,0.4697,0.82,0.1343,9,56,65 +10678,2012-03-26,2,1,3,0,0,1,1,2,0.48,0.4697,0.82,0.1642,10,23,33 +10679,2012-03-26,2,1,3,1,0,1,1,1,0.48,0.4697,0.82,0.2537,18,10,28 +10680,2012-03-26,2,1,3,2,0,1,1,1,0.46,0.4545,0.88,0.2239,14,6,20 +10681,2012-03-26,2,1,3,3,0,1,1,1,0.46,0.4545,0.82,0.194,0,1,1 +10682,2012-03-26,2,1,3,4,0,1,1,1,0.44,0.4394,0.82,0.1642,0,4,4 +10683,2012-03-26,2,1,3,5,0,1,1,1,0.44,0.4394,0.77,0.1642,0,36,36 +10684,2012-03-26,2,1,3,6,0,1,1,1,0.42,0.4242,0.82,0.194,7,110,117 +10685,2012-03-26,2,1,3,7,0,1,1,1,0.44,0.4394,0.72,0.194,15,355,370 +10686,2012-03-26,2,1,3,8,0,1,1,1,0.44,0.4394,0.62,0.2239,32,625,657 +10687,2012-03-26,2,1,3,9,0,1,1,1,0.46,0.4545,0.47,0.2836,37,245,282 +10688,2012-03-26,2,1,3,10,0,1,1,1,0.5,0.4848,0.36,0.5821,55,98,153 
+10689,2012-03-26,2,1,3,11,0,1,1,1,0.48,0.4697,0.31,0.5821,47,131,178 +10690,2012-03-26,2,1,3,12,0,1,1,1,0.48,0.4697,0.33,0.5224,63,216,279 +10691,2012-03-26,2,1,3,13,0,1,1,1,0.48,0.4697,0.29,0.6866,79,222,301 +10692,2012-03-26,2,1,3,14,0,1,1,1,0.48,0.4697,0.33,0.4478,58,159,217 +10693,2012-03-26,2,1,3,15,0,1,1,1,0.48,0.4697,0.29,0.6119,66,157,223 +10694,2012-03-26,2,1,3,16,0,1,1,1,0.48,0.4697,0.25,0.6418,60,245,305 +10695,2012-03-26,2,1,3,17,0,1,1,1,0.46,0.4545,0.24,0.4478,65,599,664 +10696,2012-03-26,2,1,3,18,0,1,1,1,0.44,0.4394,0.26,0.4478,90,594,684 +10697,2012-03-26,2,1,3,19,0,1,1,1,0.42,0.4242,0.24,0.3582,41,417,458 +10698,2012-03-26,2,1,3,20,0,1,1,1,0.4,0.4091,0.26,0.5224,12,209,221 +10699,2012-03-26,2,1,3,21,0,1,1,1,0.38,0.3939,0.25,0.4925,11,175,186 +10700,2012-03-26,2,1,3,22,0,1,1,1,0.36,0.3333,0.25,0.3582,13,80,93 +10701,2012-03-26,2,1,3,23,0,1,1,1,0.34,0.2879,0.25,0.5224,2,46,48 +10702,2012-03-27,2,1,3,0,0,2,1,1,0.32,0.2879,0.26,0.5224,1,9,10 +10703,2012-03-27,2,1,3,1,0,2,1,1,0.3,0.2727,0.26,0.4627,1,4,5 +10704,2012-03-27,2,1,3,2,0,2,1,1,0.26,0.2273,0.3,0.2985,1,6,7 +10705,2012-03-27,2,1,3,3,0,2,1,1,0.26,0.2273,0.3,0.3284,0,4,4 +10706,2012-03-27,2,1,3,4,0,2,1,1,0.24,0.2121,0.32,0.3582,0,3,3 +10707,2012-03-27,2,1,3,5,0,2,1,1,0.22,0.2121,0.37,0.2836,1,16,17 +10708,2012-03-27,2,1,3,6,0,2,1,1,0.22,0.2273,0.37,0.194,5,96,101 +10709,2012-03-27,2,1,3,7,0,2,1,1,0.22,0.2273,0.37,0.194,9,277,286 +10710,2012-03-27,2,1,3,8,0,2,1,1,0.22,0.2121,0.37,0.2985,14,567,581 +10711,2012-03-27,2,1,3,9,0,2,1,1,0.24,0.2273,0.38,0.194,8,259,267 +10712,2012-03-27,2,1,3,10,0,2,1,1,0.26,0.2576,0.35,0.2239,20,118,138 +10713,2012-03-27,2,1,3,11,0,2,1,1,0.3,0.303,0.31,0.1642,34,141,175 +10714,2012-03-27,2,1,3,12,0,2,1,1,0.32,0.3182,0.31,0.1642,34,186,220 +10715,2012-03-27,2,1,3,13,0,2,1,1,0.36,0.3485,0.27,0,46,165,211 +10716,2012-03-27,2,1,3,14,0,2,1,1,0.4,0.4091,0.24,0.1343,37,147,184 +10717,2012-03-27,2,1,3,15,0,2,1,1,0.42,0.4242,0.2,0,42,158,200 
+10718,2012-03-27,2,1,3,16,0,2,1,1,0.42,0.4242,0.2,0,38,267,305 +10719,2012-03-27,2,1,3,17,0,2,1,1,0.44,0.4394,0.16,0,72,542,614 +10720,2012-03-27,2,1,3,18,0,2,1,1,0.42,0.4242,0.17,0.0896,67,577,644 +10721,2012-03-27,2,1,3,19,0,2,1,1,0.42,0.4242,0.17,0,40,377,417 +10722,2012-03-27,2,1,3,20,0,2,1,1,0.42,0.4242,0.17,0.0896,28,266,294 +10723,2012-03-27,2,1,3,21,0,2,1,1,0.36,0.3485,0.37,0.1642,21,200,221 +10724,2012-03-27,2,1,3,22,0,2,1,1,0.36,0.3485,0.37,0.1343,7,127,134 +10725,2012-03-27,2,1,3,23,0,2,1,1,0.36,0.3485,0.37,0.194,5,59,64 +10726,2012-03-28,2,1,3,0,0,3,1,1,0.36,0.3333,0.4,0.2836,1,29,30 +10727,2012-03-28,2,1,3,1,0,3,1,1,0.36,0.3333,0.46,0.2985,0,8,8 +10728,2012-03-28,2,1,3,2,0,3,1,1,0.34,0.303,0.49,0.2985,0,3,3 +10729,2012-03-28,2,1,3,3,0,3,1,1,0.34,0.303,0.42,0.3284,1,4,5 +10730,2012-03-28,2,1,3,4,0,3,1,1,0.36,0.3333,0.43,0.2985,1,0,1 +10731,2012-03-28,2,1,3,5,0,3,1,1,0.36,0.3333,0.43,0.2985,2,38,40 +10732,2012-03-28,2,1,3,6,0,3,1,1,0.36,0.3333,0.46,0.3582,4,111,115 +10733,2012-03-28,2,1,3,7,0,3,1,1,0.36,0.3333,0.46,0.3284,10,348,358 +10734,2012-03-28,2,1,3,8,0,3,1,1,0.38,0.3939,0.46,0.3582,19,639,658 +10735,2012-03-28,2,1,3,9,0,3,1,1,0.4,0.4091,0.43,0.4478,16,298,314 +10736,2012-03-28,2,1,3,10,0,3,1,1,0.46,0.4545,0.41,0.4179,33,140,173 +10737,2012-03-28,2,1,3,11,0,3,1,1,0.5,0.4848,0.42,0.3881,54,168,222 +10738,2012-03-28,2,1,3,12,0,3,1,2,0.54,0.5152,0.39,0.3582,50,218,268 +10739,2012-03-28,2,1,3,13,0,3,1,1,0.56,0.5303,0.43,0.3284,60,213,273 +10740,2012-03-28,2,1,3,14,0,3,1,1,0.62,0.6212,0.38,0.3582,52,169,221 +10741,2012-03-28,2,1,3,15,0,3,1,2,0.62,0.6212,0.41,0.2985,47,152,199 +10742,2012-03-28,2,1,3,16,0,3,1,3,0.62,0.6212,0.43,0.0896,51,170,221 +10743,2012-03-28,2,1,3,17,0,3,1,1,0.6,0.6061,0.6,0.2239,72,532,604 +10744,2012-03-28,2,1,3,18,0,3,1,1,0.6,0.6061,0.6,0.2239,61,580,641 +10745,2012-03-28,2,1,3,19,0,3,1,1,0.6,0.6212,0.56,0.194,41,415,456 +10746,2012-03-28,2,1,3,20,0,3,1,1,0.6,0.6212,0.56,0.2985,38,324,362 
+10747,2012-03-28,2,1,3,21,0,3,1,1,0.58,0.5455,0.6,0.2537,28,233,261 +10748,2012-03-28,2,1,3,22,0,3,1,1,0.56,0.5303,0.64,0.1343,21,151,172 +10749,2012-03-28,2,1,3,23,0,3,1,1,0.54,0.5152,0.68,0.1343,12,81,93 +10750,2012-03-29,2,1,3,0,0,4,1,1,0.54,0.5152,0.68,0.194,10,38,48 +10751,2012-03-29,2,1,3,1,0,4,1,1,0.56,0.5303,0.6,0.1343,15,18,33 +10752,2012-03-29,2,1,3,2,0,4,1,1,0.6,0.6061,0.28,0.2985,0,6,6 +10753,2012-03-29,2,1,3,3,0,4,1,1,0.58,0.5455,0.28,0.2537,1,4,5 +10754,2012-03-29,2,1,3,4,0,4,1,1,0.52,0.5,0.36,0.2537,1,8,9 +10755,2012-03-29,2,1,3,5,0,4,1,1,0.5,0.4848,0.45,0.2239,2,30,32 +10756,2012-03-29,2,1,3,6,0,4,1,1,0.46,0.4545,0.59,0.2537,3,114,117 +10757,2012-03-29,2,1,3,7,0,4,1,1,0.46,0.4545,0.51,0.3582,14,353,367 +10758,2012-03-29,2,1,3,8,0,4,1,1,0.46,0.4545,0.47,0.4179,26,628,654 +10759,2012-03-29,2,1,3,9,0,4,1,1,0.48,0.4697,0.48,0.4179,22,299,321 +10760,2012-03-29,2,1,3,10,0,4,1,1,0.46,0.4545,0.47,0.4478,42,124,166 +10761,2012-03-29,2,1,3,11,0,4,1,1,0.5,0.4848,0.45,0.4179,54,166,220 +10762,2012-03-29,2,1,3,12,0,4,1,1,0.5,0.4848,0.42,0.4925,64,228,292 +10763,2012-03-29,2,1,3,13,0,4,1,1,0.46,0.4545,0.44,0.3284,50,223,273 +10764,2012-03-29,2,1,3,14,0,4,1,1,0.5,0.4848,0.42,0.3582,63,175,238 +10765,2012-03-29,2,1,3,15,0,4,1,1,0.52,0.5,0.39,0.3582,109,198,307 +10766,2012-03-29,2,1,3,16,0,4,1,1,0.52,0.5,0.36,0.3284,67,321,388 +10767,2012-03-29,2,1,3,17,0,4,1,1,0.52,0.5,0.38,0.3881,83,620,703 +10768,2012-03-29,2,1,3,18,0,4,1,1,0.5,0.4848,0.39,0.2985,83,598,681 +10769,2012-03-29,2,1,3,19,0,4,1,1,0.48,0.4697,0.39,0.2836,47,421,468 +10770,2012-03-29,2,1,3,20,0,4,1,1,0.46,0.4545,0.38,0.2985,34,301,335 +10771,2012-03-29,2,1,3,21,0,4,1,1,0.44,0.4394,0.44,0.2836,10,214,224 +10772,2012-03-29,2,1,3,22,0,4,1,1,0.42,0.4242,0.44,0.2836,17,135,152 +10773,2012-03-29,2,1,3,23,0,4,1,1,0.42,0.4242,0.47,0.2985,17,77,94 +10774,2012-03-30,2,1,3,0,0,5,1,1,0.4,0.4091,0.47,0.2985,10,49,59 +10775,2012-03-30,2,1,3,1,0,5,1,1,0.38,0.3939,0.5,0.2836,6,21,27 
+10776,2012-03-30,2,1,3,2,0,5,1,1,0.36,0.3333,0.53,0.2985,1,6,7 +10777,2012-03-30,2,1,3,3,0,5,1,1,0.34,0.3333,0.61,0.1343,0,7,7 +10778,2012-03-30,2,1,3,4,0,5,1,1,0.34,0.3485,0.53,0.0896,1,1,2 +10779,2012-03-30,2,1,3,5,0,5,1,1,0.32,0.3333,0.61,0.0896,0,26,26 +10780,2012-03-30,2,1,3,6,0,5,1,1,0.32,0.3333,0.57,0.1343,5,81,86 +10781,2012-03-30,2,1,3,7,0,5,1,1,0.32,0.3182,0.57,0.1642,9,280,289 +10782,2012-03-30,2,1,3,8,0,5,1,1,0.32,0.3182,0.66,0.1642,38,555,593 +10783,2012-03-30,2,1,3,9,0,5,1,1,0.34,0.3333,0.57,0.1642,29,292,321 +10784,2012-03-30,2,1,3,10,0,5,1,2,0.36,0.3636,0.57,0.1045,50,137,187 +10785,2012-03-30,2,1,3,11,0,5,1,2,0.36,0.3788,0.53,0,48,153,201 +10786,2012-03-30,2,1,3,12,0,5,1,2,0.38,0.3939,0.54,0,62,208,270 +10787,2012-03-30,2,1,3,13,0,5,1,3,0.4,0.4091,0.5,0,44,198,242 +10788,2012-03-30,2,1,3,14,0,5,1,3,0.4,0.4091,0.54,0,56,173,229 +10789,2012-03-30,2,1,3,15,0,5,1,3,0.42,0.4242,0.54,0,73,207,280 +10790,2012-03-30,2,1,3,16,0,5,1,3,0.4,0.4091,0.62,0.2836,75,292,367 +10791,2012-03-30,2,1,3,17,0,5,1,2,0.4,0.4091,0.62,0.2239,93,513,606 +10792,2012-03-30,2,1,3,18,0,5,1,1,0.42,0.4242,0.54,0.194,68,492,560 +10793,2012-03-30,2,1,3,19,0,5,1,1,0.4,0.4091,0.62,0.2239,42,353,395 +10794,2012-03-30,2,1,3,20,0,5,1,1,0.4,0.4091,0.62,0.1642,17,192,209 +10795,2012-03-30,2,1,3,21,0,5,1,1,0.38,0.3939,0.66,0.1343,30,190,220 +10796,2012-03-30,2,1,3,22,0,5,1,1,0.36,0.3485,0.71,0.1642,23,130,153 +10797,2012-03-30,2,1,3,23,0,5,1,1,0.36,0.3788,0.71,0,16,107,123 +10798,2012-03-31,2,1,3,0,0,6,0,1,0.38,0.3939,0.82,0.194,12,80,92 +10799,2012-03-31,2,1,3,1,0,6,0,2,0.4,0.4091,0.76,0.194,17,65,82 +10800,2012-03-31,2,1,3,2,0,6,0,2,0.42,0.4242,0.77,0.2537,14,55,69 +10801,2012-03-31,2,1,3,3,0,6,0,1,0.4,0.4091,0.82,0.2537,10,21,31 +10802,2012-03-31,2,1,3,4,0,6,0,1,0.42,0.4242,0.77,0.4179,4,4,8 +10803,2012-03-31,2,1,3,5,0,6,0,1,0.4,0.4091,0.82,0.2239,0,5,5 +10804,2012-03-31,2,1,3,6,0,6,0,2,0.36,0.3485,0.87,0.1642,7,16,23 +10805,2012-03-31,2,1,3,7,0,6,0,2,0.36,0.3485,0.87,0.1642,5,39,44 
+10806,2012-03-31,2,1,3,8,0,6,0,2,0.36,0.3485,0.87,0.194,19,126,145 +10807,2012-03-31,2,1,3,9,0,6,0,2,0.4,0.4091,0.76,0.194,62,161,223 +10808,2012-03-31,2,1,3,10,0,6,0,2,0.4,0.4091,0.76,0.1343,108,205,313 +10809,2012-03-31,2,1,3,11,0,6,0,1,0.46,0.4545,0.72,0.1343,184,279,463 +10810,2012-03-31,2,1,3,12,0,6,0,1,0.48,0.4697,0.67,0,263,358,621 +10811,2012-03-31,2,1,3,13,0,6,0,1,0.54,0.5152,0.6,0.1343,265,373,638 +10812,2012-03-31,2,1,3,14,0,6,0,2,0.5,0.4848,0.63,0.2537,240,311,551 +10813,2012-03-31,2,1,3,15,0,6,0,2,0.52,0.5,0.59,0.2836,275,330,605 +10814,2012-03-31,2,1,3,16,0,6,0,2,0.5,0.4848,0.63,0.2985,243,292,535 +10815,2012-03-31,2,1,3,17,0,6,0,1,0.48,0.4697,0.67,0.4179,238,305,543 +10816,2012-03-31,2,1,3,18,0,6,0,2,0.44,0.4394,0.72,0.4627,102,239,341 +10817,2012-03-31,2,1,3,19,0,6,0,1,0.42,0.4242,0.71,0.4478,93,191,284 +10818,2012-03-31,2,1,3,20,0,6,0,2,0.4,0.4091,0.71,0.3284,52,156,208 +10819,2012-03-31,2,1,3,21,0,6,0,2,0.4,0.4091,0.71,0.3284,36,107,143 +10820,2012-03-31,2,1,3,22,0,6,0,2,0.38,0.3939,0.71,0.2537,40,116,156 +10821,2012-03-31,2,1,3,23,0,6,0,2,0.36,0.3333,0.76,0.2836,12,100,112 +10822,2012-04-01,2,1,4,0,0,0,0,2,0.36,0.3333,0.76,0.2537,8,59,67 +10823,2012-04-01,2,1,4,1,0,0,0,2,0.36,0.3485,0.76,0.1642,13,49,62 +10824,2012-04-01,2,1,4,2,0,0,0,2,0.36,0.3485,0.76,0.1642,20,61,81 +10825,2012-04-01,2,1,4,3,0,0,0,2,0.34,0.3333,0.81,0.1343,4,21,25 +10826,2012-04-01,2,1,4,4,0,0,0,2,0.34,0.3485,0.81,0.0896,3,9,12 +10827,2012-04-01,2,1,4,5,0,0,0,2,0.34,0.3485,0.81,0.0896,8,10,18 +10828,2012-04-01,2,1,4,6,0,0,0,2,0.36,0.3485,0.76,0.1343,9,88,97 +10829,2012-04-01,2,1,4,7,0,0,0,2,0.36,0.3485,0.71,0.1343,29,55,84 +10830,2012-04-01,2,1,4,8,0,0,0,2,0.36,0.3636,0.76,0.0896,37,88,125 +10831,2012-04-01,2,1,4,9,0,0,0,2,0.36,0.3485,0.81,0.1343,102,173,275 +10832,2012-04-01,2,1,4,10,0,0,0,2,0.4,0.4091,0.71,0.1045,132,228,360 +10833,2012-04-01,2,1,4,11,0,0,0,2,0.4,0.4091,0.71,0.194,180,271,451 +10834,2012-04-01,2,1,4,12,0,0,0,2,0.42,0.4242,0.67,0.2239,187,258,445 
+10835,2012-04-01,2,1,4,13,0,0,0,2,0.44,0.4394,0.62,0.194,190,274,464 +10836,2012-04-01,2,1,4,14,0,0,0,2,0.48,0.4697,0.55,0.2239,283,278,561 +10837,2012-04-01,2,1,4,15,0,0,0,1,0.5,0.4848,0.55,0.2537,295,278,573 +10838,2012-04-01,2,1,4,16,0,0,0,1,0.52,0.5,0.52,0.2537,236,275,511 +10839,2012-04-01,2,1,4,17,0,0,0,1,0.52,0.5,0.55,0.2836,232,323,555 +10840,2012-04-01,2,1,4,18,0,0,0,1,0.52,0.5,0.52,0.2985,153,279,432 +10841,2012-04-01,2,1,4,19,0,0,0,1,0.52,0.5,0.55,0.2239,110,236,346 +10842,2012-04-01,2,1,4,20,0,0,0,2,0.5,0.4848,0.59,0.2239,66,166,232 +10843,2012-04-01,2,1,4,21,0,0,0,2,0.5,0.4848,0.59,0.1642,33,101,134 +10844,2012-04-01,2,1,4,22,0,0,0,3,0.5,0.4848,0.63,0.1045,7,61,68 +10845,2012-04-01,2,1,4,23,0,0,0,3,0.46,0.4545,0.72,0,10,53,63 +10846,2012-04-02,2,1,4,0,0,1,1,3,0.44,0.4394,0.82,0.1045,6,11,17 +10847,2012-04-02,2,1,4,1,0,1,1,3,0.42,0.4242,0.88,0.0896,0,5,5 +10848,2012-04-02,2,1,4,2,0,1,1,3,0.44,0.4394,0.94,0.1343,2,3,5 +10849,2012-04-02,2,1,4,4,0,1,1,1,0.4,0.4091,0.76,0.4925,2,2,4 +10850,2012-04-02,2,1,4,5,0,1,1,1,0.4,0.4091,0.76,0.4925,2,22,24 +10851,2012-04-02,2,1,4,6,0,1,1,1,0.36,0.3182,0.71,0.4627,6,104,110 +10852,2012-04-02,2,1,4,7,0,1,1,1,0.34,0.303,0.66,0.3881,12,294,306 +10853,2012-04-02,2,1,4,8,0,1,1,1,0.36,0.3333,0.57,0.3284,36,553,589 +10854,2012-04-02,2,1,4,9,0,1,1,1,0.38,0.3939,0.54,0.3582,52,265,317 +10855,2012-04-02,2,1,4,10,0,1,1,1,0.4,0.4091,0.5,0.5224,85,119,204 +10856,2012-04-02,2,1,4,11,0,1,1,1,0.42,0.4242,0.47,0.4478,81,128,209 +10857,2012-04-02,2,1,4,12,0,1,1,1,0.44,0.4394,0.44,0.4478,100,181,281 +10858,2012-04-02,2,1,4,13,0,1,1,1,0.46,0.4545,0.36,0.2985,97,184,281 +10859,2012-04-02,2,1,4,14,0,1,1,1,0.46,0.4545,0.31,0.4179,127,141,268 +10860,2012-04-02,2,1,4,15,0,1,1,1,0.48,0.4697,0.33,0.4179,132,192,324 +10861,2012-04-02,2,1,4,16,0,1,1,1,0.5,0.4848,0.34,0.3284,104,291,395 +10862,2012-04-02,2,1,4,17,0,1,1,1,0.5,0.4848,0.29,0.2836,128,601,729 +10863,2012-04-02,2,1,4,18,0,1,1,1,0.5,0.4848,0.27,0.2537,75,543,618 
+10864,2012-04-02,2,1,4,19,0,1,1,1,0.5,0.4848,0.27,0.2537,66,428,494 +10865,2012-04-02,2,1,4,20,0,1,1,1,0.46,0.4545,0.31,0.1642,48,281,329 +10866,2012-04-02,2,1,4,21,0,1,1,1,0.46,0.4545,0.28,0.1343,24,214,238 +10867,2012-04-02,2,1,4,22,0,1,1,1,0.42,0.4242,0.44,0.194,17,106,123 +10868,2012-04-02,2,1,4,23,0,1,1,1,0.44,0.4394,0.35,0.1642,6,60,66 +10869,2012-04-03,2,1,4,0,0,2,1,1,0.4,0.4091,0.43,0.1343,1,32,33 +10870,2012-04-03,2,1,4,1,0,2,1,1,0.38,0.3939,0.46,0.2239,0,11,11 +10871,2012-04-03,2,1,4,2,0,2,1,1,0.36,0.3485,0.53,0.1642,0,5,5 +10872,2012-04-03,2,1,4,3,0,2,1,1,0.38,0.3939,0.46,0,0,3,3 +10873,2012-04-03,2,1,4,4,0,2,1,1,0.32,0.3333,0.66,0.0896,0,1,1 +10874,2012-04-03,2,1,4,5,0,2,1,1,0.32,0.3333,0.66,0.0896,2,25,27 +10875,2012-04-03,2,1,4,6,0,2,1,1,0.3,0.3182,0.61,0.0896,2,98,100 +10876,2012-04-03,2,1,4,7,0,2,1,1,0.32,0.3333,0.66,0.0896,14,327,341 +10877,2012-04-03,2,1,4,8,0,2,1,1,0.34,0.3636,0.57,0,27,577,604 +10878,2012-04-03,2,1,4,9,0,2,1,1,0.42,0.4242,0.38,0.0896,54,354,408 +10879,2012-04-03,2,1,4,10,0,2,1,1,0.44,0.4394,0.33,0,77,142,219 +10880,2012-04-03,2,1,4,11,0,2,1,1,0.48,0.4697,0.23,0,91,167,258 +10881,2012-04-03,2,1,4,12,0,2,1,1,0.52,0.5,0.25,0,101,230,331 +10882,2012-04-03,2,1,4,13,0,2,1,1,0.56,0.5303,0.19,0.2537,120,225,345 +10883,2012-04-03,2,1,4,14,0,2,1,1,0.56,0.5303,0.21,0.2239,120,191,311 +10884,2012-04-03,2,1,4,15,0,2,1,1,0.6,0.5909,0.2,0.194,109,207,316 +10885,2012-04-03,2,1,4,16,0,2,1,1,0.62,0.6061,0.21,0.1642,145,340,485 +10886,2012-04-03,2,1,4,17,0,2,1,1,0.62,0.6061,0.17,0.194,123,634,757 +10887,2012-04-03,2,1,4,18,0,2,1,1,0.6,0.6061,0.26,0.1642,139,661,800 +10888,2012-04-03,2,1,4,19,0,2,1,1,0.56,0.5303,0.26,0.1045,98,460,558 +10889,2012-04-03,2,1,4,20,0,2,1,1,0.56,0.5303,0.24,0.1343,54,325,379 +10890,2012-04-03,2,1,4,21,0,2,1,1,0.54,0.5152,0.45,0,33,210,243 +10891,2012-04-03,2,1,4,22,0,2,1,1,0.5,0.4848,0.51,0,25,133,158 +10892,2012-04-03,2,1,4,23,0,2,1,1,0.5,0.4848,0.59,0,13,66,79 
+10893,2012-04-04,2,1,4,0,0,3,1,1,0.48,0.4697,0.48,0,18,23,41 +10894,2012-04-04,2,1,4,1,0,3,1,1,0.46,0.4545,0.59,0,0,11,11 +10895,2012-04-04,2,1,4,2,0,3,1,1,0.46,0.4545,0.51,0,0,5,5 +10896,2012-04-04,2,1,4,3,0,3,1,1,0.44,0.4394,0.58,0.0896,0,3,3 +10897,2012-04-04,2,1,4,4,0,3,1,1,0.42,0.4242,0.67,0.0896,0,1,1 +10898,2012-04-04,2,1,4,5,0,3,1,1,0.42,0.4242,0.67,0.0896,0,27,27 +10899,2012-04-04,2,1,4,6,0,3,1,2,0.42,0.4242,0.58,0,6,120,126 +10900,2012-04-04,2,1,4,7,0,3,1,2,0.44,0.4394,0.62,0,12,354,366 +10901,2012-04-04,2,1,4,8,0,3,1,2,0.44,0.4394,0.72,0.1045,31,653,684 +10902,2012-04-04,2,1,4,9,0,3,1,2,0.48,0.4697,0.55,0.1343,47,316,363 +10903,2012-04-04,2,1,4,10,0,3,1,2,0.54,0.5152,0.45,0.1343,68,99,167 +10904,2012-04-04,2,1,4,11,0,3,1,2,0.6,0.6212,0.43,0.3582,61,152,213 +10905,2012-04-04,2,1,4,12,0,3,1,2,0.6,0.6212,0.46,0.4179,73,210,283 +10906,2012-04-04,2,1,4,13,0,3,1,2,0.6,0.6212,0.46,0.2537,70,181,251 +10907,2012-04-04,2,1,4,14,0,3,1,2,0.62,0.6212,0.46,0.194,82,176,258 +10908,2012-04-04,2,1,4,15,0,3,1,2,0.64,0.6212,0.44,0.2836,86,182,268 +10909,2012-04-04,2,1,4,16,0,3,1,2,0.64,0.6212,0.44,0.2836,86,304,390 +10910,2012-04-04,2,1,4,17,0,3,1,1,0.66,0.6212,0.41,0.2985,99,645,744 +10911,2012-04-04,2,1,4,18,0,3,1,1,0.66,0.6212,0.41,0.2537,113,646,759 +10912,2012-04-04,2,1,4,19,0,3,1,1,0.66,0.6212,0.39,0.2239,75,419,494 +10913,2012-04-04,2,1,4,20,0,3,1,1,0.64,0.6212,0.27,0.2239,39,333,372 +10914,2012-04-04,2,1,4,21,0,3,1,1,0.6,0.6061,0.23,0.3284,43,255,298 +10915,2012-04-04,2,1,4,22,0,3,1,1,0.56,0.5303,0.22,0.1642,30,151,181 +10916,2012-04-04,2,1,4,23,0,3,1,1,0.52,0.5,0.23,0.4179,19,112,131 +10917,2012-04-05,2,1,4,0,0,4,1,1,0.52,0.5,0.23,0.4179,12,33,45 +10918,2012-04-05,2,1,4,1,0,4,1,1,0.48,0.4697,0.27,0.3284,4,9,13 +10919,2012-04-05,2,1,4,2,0,4,1,1,0.46,0.4545,0.33,0.4478,7,11,18 +10920,2012-04-05,2,1,4,3,0,4,1,1,0.42,0.4242,0.41,0.2985,0,5,5 +10921,2012-04-05,2,1,4,4,0,4,1,1,0.38,0.3939,0.43,0.2836,0,1,1 
+10922,2012-04-05,2,1,4,5,0,4,1,1,0.38,0.3939,0.43,0.2836,1,28,29 +10923,2012-04-05,2,1,4,6,0,4,1,1,0.36,0.3333,0.46,0.3284,8,118,126 +10924,2012-04-05,2,1,4,7,0,4,1,1,0.34,0.3333,0.49,0.194,15,339,354 +10925,2012-04-05,2,1,4,8,0,4,1,1,0.36,0.3485,0.46,0.2239,28,610,638 +10926,2012-04-05,2,1,4,9,0,4,1,1,0.4,0.4091,0.37,0.2537,61,290,351 +10927,2012-04-05,2,1,4,10,0,4,1,1,0.42,0.4242,0.35,0.2836,88,158,246 +10928,2012-04-05,2,1,4,11,0,4,1,2,0.42,0.4242,0.38,0.2239,109,159,268 +10929,2012-04-05,2,1,4,12,0,4,1,1,0.44,0.4394,0.33,0,66,210,276 +10930,2012-04-05,2,1,4,13,0,4,1,1,0.46,0.4545,0.31,0,89,202,291 +10931,2012-04-05,2,1,4,14,0,4,1,1,0.46,0.4545,0.33,0.1642,143,144,287 +10932,2012-04-05,2,1,4,15,0,4,1,1,0.5,0.4848,0.31,0.1045,98,171,269 +10933,2012-04-05,2,1,4,16,0,4,1,1,0.5,0.4848,0.31,0.194,84,318,402 +10934,2012-04-05,2,1,4,17,0,4,1,1,0.5,0.4848,0.31,0.1642,145,677,822 +10935,2012-04-05,2,1,4,18,0,4,1,1,0.5,0.4848,0.31,0.1045,80,618,698 +10936,2012-04-05,2,1,4,19,0,4,1,1,0.48,0.4697,0.31,0.2537,64,423,487 +10937,2012-04-05,2,1,4,20,0,4,1,1,0.44,0.4394,0.41,0.194,33,279,312 +10938,2012-04-05,2,1,4,21,0,4,1,1,0.42,0.4242,0.47,0.194,25,209,234 +10939,2012-04-05,2,1,4,22,0,4,1,1,0.4,0.4091,0.47,0.194,20,129,149 +10940,2012-04-05,2,1,4,23,0,4,1,1,0.4,0.4091,0.5,0.1343,12,124,136 +10941,2012-04-06,2,1,4,0,0,5,1,1,0.4,0.4091,0.43,0,12,49,61 +10942,2012-04-06,2,1,4,1,0,5,1,1,0.36,0.3485,0.62,0.1343,10,28,38 +10943,2012-04-06,2,1,4,2,0,5,1,1,0.36,0.3333,0.5,0.3881,6,5,11 +10944,2012-04-06,2,1,4,3,0,5,1,1,0.34,0.303,0.53,0.3284,3,5,8 +10945,2012-04-06,2,1,4,4,0,5,1,1,0.34,0.3182,0.49,0.2537,2,3,5 +10946,2012-04-06,2,1,4,5,0,5,1,1,0.32,0.303,0.53,0.3284,0,23,23 +10947,2012-04-06,2,1,4,6,0,5,1,1,0.3,0.2727,0.49,0.3881,0,80,80 +10948,2012-04-06,2,1,4,7,0,5,1,1,0.3,0.2879,0.49,0.2836,15,231,246 +10949,2012-04-06,2,1,4,8,0,5,1,1,0.3,0.2727,0.49,0.3284,19,489,508 +10950,2012-04-06,2,1,4,9,0,5,1,1,0.34,0.303,0.42,0.3284,58,318,376 
+10951,2012-04-06,2,1,4,10,0,5,1,1,0.36,0.3333,0.4,0.3284,81,157,238 +10952,2012-04-06,2,1,4,11,0,5,1,1,0.4,0.4091,0.37,0.2985,112,179,291 +10953,2012-04-06,2,1,4,12,0,5,1,1,0.42,0.4242,0.35,0.2985,113,224,337 +10954,2012-04-06,2,1,4,13,0,5,1,1,0.44,0.4394,0.33,0.3284,173,238,411 +10955,2012-04-06,2,1,4,14,0,5,1,1,0.48,0.4697,0.29,0.3881,240,250,490 +10956,2012-04-06,2,1,4,15,0,5,1,1,0.5,0.4848,0.27,0.2836,218,319,537 +10957,2012-04-06,2,1,4,16,0,5,1,1,0.5,0.4848,0.22,0.4179,184,352,536 +10958,2012-04-06,2,1,4,17,0,5,1,1,0.52,0.5,0.23,0.4179,172,483,655 +10959,2012-04-06,2,1,4,18,0,5,1,1,0.5,0.4848,0.22,0.4627,104,380,484 +10960,2012-04-06,2,1,4,19,0,5,1,1,0.48,0.4697,0.23,0.2985,115,297,412 +10961,2012-04-06,2,1,4,20,0,5,1,1,0.46,0.4545,0.24,0.2836,61,191,252 +10962,2012-04-06,2,1,4,21,0,5,1,1,0.44,0.4394,0.26,0.1343,54,136,190 +10963,2012-04-06,2,1,4,22,0,5,1,1,0.4,0.4091,0.35,0.2239,34,124,158 +10964,2012-04-06,2,1,4,23,0,5,1,1,0.42,0.4242,0.3,0.2836,21,92,113 +10965,2012-04-07,2,1,4,0,0,6,0,1,0.42,0.4242,0.26,0.2985,9,85,94 +10966,2012-04-07,2,1,4,1,0,6,0,1,0.4,0.4091,0.28,0.2537,9,60,69 +10967,2012-04-07,2,1,4,2,0,6,0,1,0.36,0.3333,0.29,0.3582,11,25,36 +10968,2012-04-07,2,1,4,3,0,6,0,1,0.36,0.3333,0.32,0.2537,7,21,28 +10969,2012-04-07,2,1,4,4,0,6,0,1,0.32,0.3182,0.36,0.1642,3,4,7 +10970,2012-04-07,2,1,4,5,0,6,0,1,0.32,0.3182,0.33,0.1642,0,2,2 +10971,2012-04-07,2,1,4,6,0,6,0,1,0.32,0.3182,0.36,0.194,1,18,19 +10972,2012-04-07,2,1,4,7,0,6,0,1,0.32,0.303,0.33,0.2836,14,40,54 +10973,2012-04-07,2,1,4,8,0,6,0,1,0.34,0.3182,0.34,0.2537,29,94,123 +10974,2012-04-07,2,1,4,9,0,6,0,1,0.4,0.4091,0.3,0.2537,103,173,276 +10975,2012-04-07,2,1,4,10,0,6,0,1,0.42,0.4242,0.28,0.2537,187,176,363 +10976,2012-04-07,2,1,4,11,0,6,0,1,0.44,0.4394,0.24,0.2985,251,244,495 +10977,2012-04-07,2,1,4,12,0,6,0,1,0.46,0.4545,0.23,0.3881,320,270,590 +10978,2012-04-07,2,1,4,13,0,6,0,1,0.5,0.4848,0.22,0.194,355,288,643 +10979,2012-04-07,2,1,4,14,0,6,0,1,0.5,0.4848,0.22,0.2985,326,252,578 
+10980,2012-04-07,2,1,4,15,0,6,0,1,0.52,0.5,0.19,0.4179,321,305,626 +10981,2012-04-07,2,1,4,16,0,6,0,1,0.54,0.5152,0.19,0.3881,354,261,615 +10982,2012-04-07,2,1,4,17,0,6,0,1,0.54,0.5152,0.19,0.4179,299,268,567 +10983,2012-04-07,2,1,4,18,0,6,0,1,0.54,0.5152,0.18,0.2985,227,290,517 +10984,2012-04-07,2,1,4,19,0,6,0,1,0.54,0.5152,0.16,0.3284,170,243,413 +10985,2012-04-07,2,1,4,20,0,6,0,1,0.5,0.4848,0.2,0.2239,73,135,208 +10986,2012-04-07,2,1,4,21,0,6,0,1,0.5,0.4848,0.2,0.194,75,159,234 +10987,2012-04-07,2,1,4,22,0,6,0,1,0.48,0.4697,0.2,0.194,76,106,182 +10988,2012-04-07,2,1,4,23,0,6,0,1,0.46,0.4545,0.23,0.2239,32,86,118 +10989,2012-04-08,2,1,4,0,0,0,0,1,0.44,0.4394,0.24,0.2239,21,64,85 +10990,2012-04-08,2,1,4,1,0,0,0,1,0.44,0.4394,0.24,0.2239,14,50,64 +10991,2012-04-08,2,1,4,2,0,0,0,1,0.44,0.4394,0.24,0.2239,7,25,32 +10992,2012-04-08,2,1,4,3,0,0,0,1,0.42,0.4242,0.26,0.1343,8,18,26 +10993,2012-04-08,2,1,4,4,0,0,0,1,0.4,0.4091,0.4,0,0,7,7 +10994,2012-04-08,2,1,4,5,0,0,0,1,0.38,0.3939,0.46,0,4,8,12 +10995,2012-04-08,2,1,4,6,0,0,0,1,0.36,0.3788,0.37,0,7,21,28 +10996,2012-04-08,2,1,4,7,0,0,0,1,0.38,0.3939,0.27,0.1642,12,25,37 +10997,2012-04-08,2,1,4,8,0,0,0,1,0.4,0.4091,0.3,0.194,19,65,84 +10998,2012-04-08,2,1,4,9,0,0,0,1,0.44,0.4394,0.26,0.2239,71,108,179 +10999,2012-04-08,2,1,4,10,0,0,0,1,0.5,0.4848,0.25,0.2239,113,186,299 +11000,2012-04-08,2,1,4,11,0,0,0,1,0.5,0.4848,0.25,0.2836,195,209,404 +11001,2012-04-08,2,1,4,12,0,0,0,1,0.54,0.5152,0.24,0.2537,198,259,457 +11002,2012-04-08,2,1,4,13,0,0,0,1,0.58,0.5455,0.23,0.1642,229,254,483 +11003,2012-04-08,2,1,4,14,0,0,0,1,0.6,0.5909,0.18,0.4478,254,249,503 +11004,2012-04-08,2,1,4,15,0,0,0,1,0.62,0.6061,0.17,0.3582,260,226,486 +11005,2012-04-08,2,1,4,16,0,0,0,1,0.64,0.6061,0.19,0.3284,232,244,476 +11006,2012-04-08,2,1,4,17,0,0,0,1,0.62,0.6061,0.22,0.2985,185,226,411 +11007,2012-04-08,2,1,4,18,0,0,0,1,0.62,0.6061,0.25,0.3284,151,159,310 +11008,2012-04-08,2,1,4,19,0,0,0,1,0.6,0.6061,0.23,0.3881,89,187,276 
+11009,2012-04-08,2,1,4,20,0,0,0,1,0.56,0.5303,0.28,0.4179,65,146,211 +11010,2012-04-08,2,1,4,21,0,0,0,1,0.52,0.5,0.34,0.2239,53,77,130 +11011,2012-04-08,2,1,4,22,0,0,0,1,0.5,0.4848,0.36,0.2239,33,85,118 +11012,2012-04-08,2,1,4,23,0,0,0,1,0.5,0.4848,0.39,0.2537,10,41,51 +11013,2012-04-09,2,1,4,0,0,1,1,1,0.48,0.4697,0.39,0.2836,8,29,37 +11014,2012-04-09,2,1,4,1,0,1,1,1,0.46,0.4545,0.41,0.1343,4,12,16 +11015,2012-04-09,2,1,4,2,0,1,1,1,0.46,0.4545,0.31,0.194,2,4,6 +11016,2012-04-09,2,1,4,3,0,1,1,1,0.44,0.4394,0.3,0.1642,0,3,3 +11017,2012-04-09,2,1,4,4,0,1,1,1,0.44,0.4394,0.24,0.194,2,3,5 +11018,2012-04-09,2,1,4,5,0,1,1,1,0.44,0.4394,0.24,0.194,2,30,32 +11019,2012-04-09,2,1,4,6,0,1,1,1,0.42,0.4242,0.26,0.2239,3,108,111 +11020,2012-04-09,2,1,4,7,0,1,1,1,0.42,0.4242,0.28,0.194,11,320,331 +11021,2012-04-09,2,1,4,8,0,1,1,1,0.44,0.4394,0.3,0.194,22,595,617 +11022,2012-04-09,2,1,4,9,0,1,1,1,0.46,0.4545,0.31,0.3582,50,236,286 +11023,2012-04-09,2,1,4,10,0,1,1,1,0.5,0.4848,0.27,0.4925,65,113,178 +11024,2012-04-09,2,1,4,11,0,1,1,1,0.52,0.5,0.27,0.4179,96,127,223 +11025,2012-04-09,2,1,4,12,0,1,1,1,0.54,0.5152,0.28,0.7164,94,186,280 +11026,2012-04-09,2,1,4,13,0,1,1,1,0.54,0.5152,0.28,0.5821,108,173,281 +11027,2012-04-09,2,1,4,14,0,1,1,1,0.56,0.5303,0.28,0.5522,66,152,218 +11028,2012-04-09,2,1,4,15,0,1,1,1,0.58,0.5455,0.28,0.5522,82,173,255 +11029,2012-04-09,2,1,4,16,0,1,1,1,0.56,0.5303,0.3,0.4179,81,310,391 +11030,2012-04-09,2,1,4,17,0,1,1,1,0.54,0.5152,0.32,0.4179,67,565,632 +11031,2012-04-09,2,1,4,18,0,1,1,1,0.54,0.5152,0.34,0.2985,60,586,646 +11032,2012-04-09,2,1,4,19,0,1,1,1,0.52,0.5,0.34,0.3582,35,386,421 +11033,2012-04-09,2,1,4,20,0,1,1,1,0.5,0.4848,0.36,0.4478,21,255,276 +11034,2012-04-09,2,1,4,21,0,1,1,3,0.48,0.4697,0.41,0.4179,14,151,165 +11035,2012-04-09,2,1,4,22,0,1,1,2,0.46,0.4545,0.41,0.4925,6,102,108 +11036,2012-04-09,2,1,4,23,0,1,1,1,0.44,0.4394,0.44,0.2985,6,61,67 +11037,2012-04-10,2,1,4,0,0,2,1,2,0.44,0.4394,0.47,0.2836,3,23,26 
+11038,2012-04-10,2,1,4,1,0,2,1,1,0.42,0.4242,0.5,0.2239,0,9,9 +11039,2012-04-10,2,1,4,2,0,2,1,1,0.36,0.3485,0.71,0.1343,0,2,2 +11040,2012-04-10,2,1,4,3,0,2,1,1,0.36,0.3485,0.71,0.1343,0,2,2 +11041,2012-04-10,2,1,4,4,0,2,1,1,0.34,0.3333,0.71,0.1343,0,2,2 +11042,2012-04-10,2,1,4,5,0,2,1,1,0.34,0.3333,0.71,0.1343,0,24,24 +11043,2012-04-10,2,1,4,6,0,2,1,1,0.34,0.3485,0.71,0.1045,3,100,103 +11044,2012-04-10,2,1,4,7,0,2,1,1,0.34,0.3485,0.71,0.1045,16,368,384 +11045,2012-04-10,2,1,4,8,0,2,1,1,0.38,0.3939,0.62,0.1343,22,670,692 +11046,2012-04-10,2,1,4,9,0,2,1,1,0.42,0.4242,0.67,0.194,16,303,319 +11047,2012-04-10,2,1,4,10,0,2,1,1,0.46,0.4545,0.47,0.1642,49,132,181 +11048,2012-04-10,2,1,4,11,0,2,1,1,0.5,0.4848,0.36,0,78,161,239 +11049,2012-04-10,2,1,4,12,0,2,1,1,0.54,0.5152,0.22,0.2836,75,184,259 +11050,2012-04-10,2,1,4,13,0,2,1,1,0.52,0.5,0.23,0.1642,81,202,283 +11051,2012-04-10,2,1,4,14,0,2,1,1,0.54,0.5152,0.24,0.194,75,149,224 +11052,2012-04-10,2,1,4,15,0,2,1,1,0.56,0.5303,0.21,0.2985,46,157,203 +11053,2012-04-10,2,1,4,16,0,2,1,1,0.58,0.5455,0.21,0.2537,66,296,362 +11054,2012-04-10,2,1,4,17,0,2,1,1,0.56,0.5303,0.21,0.5224,88,656,744 +11055,2012-04-10,2,1,4,18,0,2,1,1,0.54,0.5152,0.22,0.4179,78,626,704 +11056,2012-04-10,2,1,4,19,0,2,1,1,0.52,0.5,0.2,0.3582,51,428,479 +11057,2012-04-10,2,1,4,20,0,2,1,1,0.46,0.4545,0.28,0.3881,34,248,282 +11058,2012-04-10,2,1,4,21,0,2,1,1,0.44,0.4394,0.3,0.4478,15,191,206 +11059,2012-04-10,2,1,4,22,0,2,1,1,0.4,0.4091,0.37,0.4925,9,118,127 +11060,2012-04-10,2,1,4,23,0,2,1,1,0.36,0.3333,0.4,0.4179,14,48,62 +11061,2012-04-11,2,1,4,0,0,3,1,1,0.34,0.303,0.36,0.4179,8,22,30 +11062,2012-04-11,2,1,4,1,0,3,1,1,0.34,0.3182,0.39,0.2836,3,9,12 +11063,2012-04-11,2,1,4,2,0,3,1,1,0.32,0.303,0.45,0.2836,0,2,2 +11064,2012-04-11,2,1,4,3,0,3,1,1,0.3,0.2879,0.49,0.2239,0,2,2 +11065,2012-04-11,2,1,4,5,0,3,1,1,0.28,0.2727,0.56,0.194,1,26,27 +11066,2012-04-11,2,1,4,6,0,3,1,1,0.28,0.2879,0.56,0.1045,3,103,106 
+11067,2012-04-11,2,1,4,7,0,3,1,1,0.28,0.2879,0.52,0.1343,18,303,321 +11068,2012-04-11,2,1,4,8,0,3,1,1,0.3,0.2879,0.49,0.2239,14,581,595 +11069,2012-04-11,2,1,4,9,0,3,1,2,0.32,0.303,0.49,0.2985,20,282,302 +11070,2012-04-11,2,1,4,10,0,3,1,2,0.34,0.303,0.49,0.3284,26,127,153 +11071,2012-04-11,2,1,4,11,0,3,1,2,0.38,0.3939,0.43,0.3284,38,142,180 +11072,2012-04-11,2,1,4,12,0,3,1,2,0.36,0.3333,0.46,0.3881,35,147,182 +11073,2012-04-11,2,1,4,13,0,3,1,2,0.4,0.4091,0.4,0.3582,16,147,163 +11074,2012-04-11,2,1,4,14,0,3,1,1,0.36,0.3485,0.57,0.194,34,101,135 +11075,2012-04-11,2,1,4,15,0,3,1,3,0.38,0.3939,0.46,0.2239,50,152,202 +11076,2012-04-11,2,1,4,16,0,3,1,1,0.44,0.4394,0.33,0.2985,51,256,307 +11077,2012-04-11,2,1,4,17,0,3,1,1,0.42,0.4242,0.35,0.3881,52,527,579 +11078,2012-04-11,2,1,4,18,0,3,1,3,0.4,0.4091,0.43,0.4179,39,546,585 +11079,2012-04-11,2,1,4,19,0,3,1,1,0.4,0.4091,0.4,0.3582,28,356,384 +11080,2012-04-11,2,1,4,20,0,3,1,1,0.36,0.3333,0.46,0.3284,22,213,235 +11081,2012-04-11,2,1,4,21,0,3,1,1,0.34,0.303,0.57,0.3881,11,154,165 +11082,2012-04-11,2,1,4,22,0,3,1,1,0.34,0.303,0.57,0.3284,4,130,134 +11083,2012-04-11,2,1,4,23,0,3,1,1,0.34,0.303,0.57,0.2985,9,52,61 +11084,2012-04-12,2,1,4,0,0,4,1,1,0.32,0.303,0.66,0.2836,1,22,23 +11085,2012-04-12,2,1,4,1,0,4,1,1,0.32,0.303,0.66,0.2239,2,7,9 +11086,2012-04-12,2,1,4,2,0,4,1,1,0.32,0.303,0.61,0.2836,0,4,4 +11087,2012-04-12,2,1,4,3,0,4,1,1,0.32,0.303,0.61,0.2537,2,3,5 +11088,2012-04-12,2,1,4,4,0,4,1,1,0.3,0.2879,0.61,0.2239,0,1,1 +11089,2012-04-12,2,1,4,5,0,4,1,1,0.3,0.2879,0.61,0.2239,0,21,21 +11090,2012-04-12,2,1,4,6,0,4,1,1,0.3,0.2879,0.65,0.2537,2,94,96 +11091,2012-04-12,2,1,4,7,0,4,1,1,0.3,0.2727,0.65,0.3284,12,294,306 +11092,2012-04-12,2,1,4,8,0,4,1,1,0.32,0.303,0.66,0.3284,20,604,624 +11093,2012-04-12,2,1,4,9,0,4,1,1,0.36,0.3333,0.57,0.3582,16,271,287 +11094,2012-04-12,2,1,4,10,0,4,1,1,0.4,0.4091,0.5,0.2985,38,103,141 +11095,2012-04-12,2,1,4,11,0,4,1,1,0.42,0.4242,0.47,0.2985,50,174,224 
+11096,2012-04-12,2,1,4,12,0,4,1,1,0.46,0.4545,0.41,0.4478,60,199,259 +11097,2012-04-12,2,1,4,13,0,4,1,1,0.48,0.4697,0.36,0.4478,55,148,203 +11098,2012-04-12,2,1,4,14,0,4,1,1,0.5,0.4848,0.31,0.3284,74,150,224 +11099,2012-04-12,2,1,4,15,0,4,1,1,0.48,0.4697,0.31,0.2537,42,169,211 +11100,2012-04-12,2,1,4,16,0,4,1,1,0.48,0.4697,0.31,0.2836,54,299,353 +11101,2012-04-12,2,1,4,17,0,4,1,1,0.5,0.4848,0.27,0.3881,60,596,656 +11102,2012-04-12,2,1,4,18,0,4,1,1,0.5,0.4848,0.27,0.3881,63,547,610 +11103,2012-04-12,2,1,4,19,0,4,1,1,0.46,0.4545,0.31,0.2985,50,383,433 +11104,2012-04-12,2,1,4,20,0,4,1,1,0.44,0.4394,0.33,0.2537,26,254,280 +11105,2012-04-12,2,1,4,21,0,4,1,1,0.42,0.4242,0.35,0.1343,13,174,187 +11106,2012-04-12,2,1,4,22,0,4,1,1,0.42,0.4242,0.35,0.194,15,143,158 +11107,2012-04-12,2,1,4,23,0,4,1,1,0.42,0.4242,0.35,0.194,8,86,94 +11108,2012-04-13,2,1,4,0,0,5,1,1,0.4,0.4091,0.37,0.2537,14,41,55 +11109,2012-04-13,2,1,4,1,0,5,1,1,0.36,0.3485,0.46,0.194,2,28,30 +11110,2012-04-13,2,1,4,2,0,5,1,1,0.34,0.3333,0.49,0.194,6,5,11 +11111,2012-04-13,2,1,4,3,0,5,1,1,0.34,0.3333,0.61,0.1642,3,8,11 +11112,2012-04-13,2,1,4,4,0,5,1,2,0.34,0.3333,0.53,0.1343,1,4,5 +11113,2012-04-13,2,1,4,5,0,5,1,2,0.34,0.3333,0.53,0.1642,2,19,21 +11114,2012-04-13,2,1,4,6,0,5,1,1,0.32,0.3182,0.57,0.1642,0,84,84 +11115,2012-04-13,2,1,4,7,0,5,1,1,0.34,0.3182,0.61,0.2239,13,283,296 +11116,2012-04-13,2,1,4,8,0,5,1,1,0.38,0.3939,0.46,0.1642,24,539,563 +11117,2012-04-13,2,1,4,9,0,5,1,1,0.42,0.4242,0.41,0.1642,36,294,330 +11118,2012-04-13,2,1,4,10,0,5,1,1,0.44,0.4394,0.38,0.194,59,133,192 +11119,2012-04-13,2,1,4,11,0,5,1,1,0.46,0.4545,0.36,0.1343,75,158,233 +11120,2012-04-13,2,1,4,12,0,5,1,1,0.5,0.4848,0.31,0,104,218,322 +11121,2012-04-13,2,1,4,13,0,5,1,1,0.52,0.5,0.27,0.2836,112,246,358 +11122,2012-04-13,2,1,4,14,0,5,1,1,0.52,0.5,0.27,0,125,223,348 +11123,2012-04-13,2,1,4,15,0,5,1,1,0.54,0.5152,0.28,0.2537,115,237,352 +11124,2012-04-13,2,1,4,16,0,5,1,1,0.56,0.5303,0.26,0.2537,113,350,463 
+11125,2012-04-13,2,1,4,17,0,5,1,1,0.56,0.5303,0.24,0.1642,117,621,738 +11126,2012-04-13,2,1,4,18,0,5,1,1,0.54,0.5152,0.26,0.1343,107,564,671 +11127,2012-04-13,2,1,4,19,0,5,1,1,0.54,0.5152,0.24,0.1642,75,352,427 +11128,2012-04-13,2,1,4,20,0,5,1,1,0.5,0.4848,0.29,0.1343,44,242,286 +11129,2012-04-13,2,1,4,21,0,5,1,1,0.5,0.4848,0.36,0,36,153,189 +11130,2012-04-13,2,1,4,22,0,5,1,1,0.44,0.4394,0.62,0.1045,34,181,215 +11131,2012-04-13,2,1,4,23,0,5,1,1,0.42,0.4242,0.62,0.0896,35,163,198 +11132,2012-04-14,2,1,4,0,0,6,0,1,0.4,0.4091,0.58,0.1343,11,85,96 +11133,2012-04-14,2,1,4,1,0,6,0,1,0.4,0.4091,0.58,0.1343,10,60,70 +11134,2012-04-14,2,1,4,2,0,6,0,1,0.38,0.3939,0.66,0.1343,3,53,56 +11135,2012-04-14,2,1,4,3,0,6,0,1,0.36,0.3636,0.66,0.0896,6,26,32 +11136,2012-04-14,2,1,4,4,0,6,0,1,0.36,0.3636,0.71,0.1045,5,9,14 +11137,2012-04-14,2,1,4,5,0,6,0,1,0.36,0.3636,0.66,0.1045,2,5,7 +11138,2012-04-14,2,1,4,6,0,6,0,1,0.34,0.3333,0.76,0.1642,2,19,21 +11139,2012-04-14,2,1,4,7,0,6,0,1,0.36,0.3636,0.71,0.1045,8,56,64 +11140,2012-04-14,2,1,4,8,0,6,0,1,0.36,0.3485,0.71,0.1343,24,136,160 +11141,2012-04-14,2,1,4,9,0,6,0,1,0.42,0.4242,0.54,0.1343,59,206,265 +11142,2012-04-14,2,1,4,10,0,6,0,1,0.46,0.4545,0.51,0.1642,139,267,406 +11143,2012-04-14,2,1,4,11,0,6,0,1,0.52,0.5,0.45,0.194,207,357,564 +11144,2012-04-14,2,1,4,12,0,6,0,1,0.58,0.5455,0.35,0.2537,274,404,678 +11145,2012-04-14,2,1,4,13,0,6,0,1,0.62,0.6212,0.33,0.2985,308,370,678 +11146,2012-04-14,2,1,4,14,0,6,0,2,0.64,0.6212,0.27,0.4478,288,372,660 +11147,2012-04-14,2,1,4,15,0,6,0,2,0.64,0.6212,0.27,0.3582,311,347,658 +11148,2012-04-14,2,1,4,16,0,6,0,2,0.64,0.6212,0.27,0.3582,253,329,582 +11149,2012-04-14,2,1,4,17,0,6,0,2,0.62,0.6212,0.33,0.2537,251,309,560 +11150,2012-04-14,2,1,4,18,0,6,0,2,0.62,0.6212,0.35,0.2239,197,284,481 +11151,2012-04-14,2,1,4,19,0,6,0,2,0.6,0.6212,0.4,0.1642,163,296,459 +11152,2012-04-14,2,1,4,20,0,6,0,2,0.56,0.5303,0.46,0.1343,112,209,321 +11153,2012-04-14,2,1,4,21,0,6,0,2,0.56,0.5303,0.43,0.1343,53,167,220 
+11154,2012-04-14,2,1,4,22,0,6,0,2,0.54,0.5152,0.52,0.1642,64,148,212 +11155,2012-04-14,2,1,4,23,0,6,0,2,0.54,0.5152,0.56,0.194,45,151,196 +11156,2012-04-15,2,1,4,0,0,0,0,2,0.54,0.5152,0.56,0.194,30,94,124 +11157,2012-04-15,2,1,4,1,0,0,0,2,0.54,0.5152,0.56,0.1642,17,89,106 +11158,2012-04-15,2,1,4,2,0,0,0,2,0.54,0.5152,0.56,0.1045,20,54,74 +11159,2012-04-15,2,1,4,3,0,0,0,2,0.52,0.5,0.59,0.1343,18,27,45 +11160,2012-04-15,2,1,4,4,0,0,0,2,0.54,0.5152,0.52,0,2,10,12 +11161,2012-04-15,2,1,4,5,0,0,0,1,0.5,0.4848,0.68,0.1642,4,6,10 +11162,2012-04-15,2,1,4,6,0,0,0,1,0.5,0.4848,0.63,0.1343,5,11,16 +11163,2012-04-15,2,1,4,7,0,0,0,1,0.5,0.4848,0.63,0.1045,16,30,46 +11164,2012-04-15,2,1,4,8,0,0,0,1,0.52,0.5,0.59,0,33,64,97 +11165,2012-04-15,2,1,4,9,0,0,0,1,0.54,0.5152,0.56,0.2239,97,154,251 +11166,2012-04-15,2,1,4,10,0,0,0,1,0.58,0.5455,0.53,0.2239,186,235,421 +11167,2012-04-15,2,1,4,11,0,0,0,1,0.62,0.6212,0.46,0.2985,227,301,528 +11168,2012-04-15,2,1,4,12,0,0,0,1,0.64,0.6212,0.44,0.2836,275,360,635 +11169,2012-04-15,2,1,4,13,0,0,0,1,0.66,0.6212,0.44,0.2836,298,383,681 +11170,2012-04-15,2,1,4,14,0,0,0,1,0.7,0.6364,0.39,0.2836,282,346,628 +11171,2012-04-15,2,1,4,15,0,0,0,1,0.7,0.6364,0.39,0.3881,266,351,617 +11172,2012-04-15,2,1,4,16,0,0,0,1,0.72,0.6515,0.39,0.3582,286,330,616 +11173,2012-04-15,2,1,4,17,0,0,0,1,0.7,0.6364,0.42,0.3284,262,361,623 +11174,2012-04-15,2,1,4,18,0,0,0,1,0.7,0.6364,0.42,0.2537,184,295,479 +11175,2012-04-15,2,1,4,19,0,0,0,1,0.7,0.6364,0.42,0.2537,114,265,379 +11176,2012-04-15,2,1,4,20,0,0,0,1,0.64,0.6212,0.5,0.2239,86,205,291 +11177,2012-04-15,2,1,4,21,0,0,0,1,0.62,0.6212,0.57,0.2537,76,153,229 +11178,2012-04-15,2,1,4,22,0,0,0,1,0.68,0.6364,0.44,0.3582,37,97,134 +11179,2012-04-15,2,1,4,23,0,0,0,1,0.66,0.6212,0.5,0.3881,25,65,90 +11180,2012-04-16,2,1,4,0,1,1,0,1,0.64,0.6212,0.53,0.3284,15,26,41 +11181,2012-04-16,2,1,4,1,1,1,0,1,0.62,0.6061,0.61,0.3582,7,21,28 +11182,2012-04-16,2,1,4,2,1,1,0,1,0.6,0.6061,0.64,0.2537,5,9,14 
+11183,2012-04-16,2,1,4,3,1,1,0,1,0.58,0.5455,0.68,0.194,3,5,8 +11184,2012-04-16,2,1,4,4,1,1,0,1,0.54,0.5152,0.77,0.2239,0,6,6 +11185,2012-04-16,2,1,4,5,1,1,0,1,0.54,0.5152,0.77,0.2239,4,34,38 +11186,2012-04-16,2,1,4,6,1,1,0,1,0.52,0.5,0.83,0.1642,10,123,133 +11187,2012-04-16,2,1,4,7,1,1,0,1,0.52,0.5,0.83,0.1343,20,367,387 +11188,2012-04-16,2,1,4,8,1,1,0,2,0.56,0.5303,0.83,0.1642,48,549,597 +11189,2012-04-16,2,1,4,9,1,1,0,1,0.62,0.6061,0.69,0.2239,59,238,297 +11190,2012-04-16,2,1,4,10,1,1,0,1,0.62,0.6061,0.69,0.2537,75,149,224 +11191,2012-04-16,2,1,4,11,1,1,0,1,0.64,0.6061,0.65,0.2537,115,136,251 +11192,2012-04-16,2,1,4,12,1,1,0,1,0.68,0.6364,0.57,0.2537,75,196,271 +11193,2012-04-16,2,1,4,13,1,1,0,1,0.74,0.6667,0.45,0.2985,79,184,263 +11194,2012-04-16,2,1,4,14,1,1,0,1,0.76,0.6818,0.45,0.3284,105,183,288 +11195,2012-04-16,2,1,4,15,1,1,0,1,0.8,0.697,0.33,0.4478,81,194,275 +11196,2012-04-16,2,1,4,16,1,1,0,1,0.8,0.697,0.33,0.4627,92,266,358 +11197,2012-04-16,2,1,4,17,1,1,0,1,0.8,0.697,0.33,0.3881,111,601,712 +11198,2012-04-16,2,1,4,18,1,1,0,1,0.8,0.697,0.31,0.3881,87,589,676 +11199,2012-04-16,2,1,4,19,1,1,0,1,0.76,0.6667,0.35,0.3881,61,461,522 +11200,2012-04-16,2,1,4,20,1,1,0,1,0.74,0.6515,0.37,0.2985,48,327,375 +11201,2012-04-16,2,1,4,21,1,1,0,1,0.68,0.6364,0.51,0.2537,39,232,271 +11202,2012-04-16,2,1,4,22,1,1,0,1,0.68,0.6364,0.51,0.2537,35,179,214 +11203,2012-04-16,2,1,4,23,1,1,0,1,0.7,0.6364,0.45,0.2985,24,97,121 +11204,2012-04-17,2,1,4,0,0,2,1,1,0.66,0.6212,0.54,0.194,10,33,43 +11205,2012-04-17,2,1,4,1,0,2,1,1,0.64,0.6212,0.57,0.1642,6,5,11 +11206,2012-04-17,2,1,4,2,0,2,1,1,0.56,0.5303,0.83,0.1045,1,3,4 +11207,2012-04-17,2,1,4,3,0,2,1,1,0.56,0.5303,0.83,0.0896,0,6,6 +11208,2012-04-17,2,1,4,4,0,2,1,1,0.66,0.6212,0.44,0.2239,3,6,9 +11209,2012-04-17,2,1,4,5,0,2,1,1,0.6,0.6212,0.53,0.2537,1,29,30 +11210,2012-04-17,2,1,4,6,0,2,1,1,0.6,0.6212,0.46,0.2537,8,136,144 +11211,2012-04-17,2,1,4,7,0,2,1,1,0.6,0.6212,0.43,0.2537,18,443,461 
+11212,2012-04-17,2,1,4,8,0,2,1,1,0.6,0.6212,0.43,0.2239,42,631,673 +11213,2012-04-17,2,1,4,9,0,2,1,1,0.6,0.6212,0.38,0.4925,71,329,400 +11214,2012-04-17,2,1,4,10,0,2,1,1,0.6,0.6212,0.38,0.4179,52,165,217 +11215,2012-04-17,2,1,4,11,0,2,1,1,0.6,0.6212,0.33,0.4179,65,174,239 +11216,2012-04-17,2,1,4,12,0,2,1,1,0.6,0.6212,0.33,0.3582,65,179,244 +11217,2012-04-17,2,1,4,13,0,2,1,1,0.62,0.6212,0.29,0.2537,52,187,239 +11218,2012-04-17,2,1,4,14,0,2,1,1,0.64,0.6212,0.27,0.2836,59,198,257 +11219,2012-04-17,2,1,4,15,0,2,1,1,0.66,0.6212,0.24,0.3284,91,217,308 +11220,2012-04-17,2,1,4,16,0,2,1,1,0.68,0.6212,0.22,0.3582,50,320,370 +11221,2012-04-17,2,1,4,17,0,2,1,1,0.64,0.6061,0.25,0.2985,108,673,781 +11222,2012-04-17,2,1,4,18,0,2,1,1,0.64,0.6061,0.25,0.2836,105,670,775 +11223,2012-04-17,2,1,4,19,0,2,1,1,0.6,0.6061,0.26,0.2537,73,464,537 +11224,2012-04-17,2,1,4,20,0,2,1,1,0.6,0.6061,0.23,0.2836,44,358,402 +11225,2012-04-17,2,1,4,21,0,2,1,1,0.56,0.5303,0.26,0.2985,37,260,297 +11226,2012-04-17,2,1,4,22,0,2,1,1,0.54,0.5152,0.3,0.2537,18,144,162 +11227,2012-04-17,2,1,4,23,0,2,1,1,0.54,0.5152,0.32,0.2239,10,72,82 +11228,2012-04-18,2,1,4,0,0,3,1,1,0.54,0.5152,0.32,0.1642,3,35,38 +11229,2012-04-18,2,1,4,1,0,3,1,1,0.52,0.5,0.34,0.194,1,13,14 +11230,2012-04-18,2,1,4,2,0,3,1,2,0.5,0.4848,0.42,0.194,1,8,9 +11231,2012-04-18,2,1,4,3,0,3,1,2,0.5,0.4848,0.42,0.2239,0,6,6 +11232,2012-04-18,2,1,4,4,0,3,1,2,0.5,0.4848,0.42,0.2836,0,7,7 +11233,2012-04-18,2,1,4,5,0,3,1,2,0.5,0.4848,0.39,0.2836,0,34,34 +11234,2012-04-18,2,1,4,6,0,3,1,2,0.5,0.4848,0.39,0.2836,6,128,134 +11235,2012-04-18,2,1,4,7,0,3,1,2,0.5,0.4848,0.39,0.194,10,408,418 +11236,2012-04-18,2,1,4,8,0,3,1,3,0.5,0.4848,0.39,0.194,25,551,576 +11237,2012-04-18,2,1,4,9,0,3,1,3,0.48,0.4697,0.44,0.2985,20,245,265 +11238,2012-04-18,2,1,4,10,0,3,1,3,0.48,0.4697,0.47,0.2239,31,116,147 +11239,2012-04-18,2,1,4,11,0,3,1,3,0.46,0.4545,0.47,0.2537,14,105,119 +11240,2012-04-18,2,1,4,12,0,3,1,3,0.46,0.4545,0.51,0.1045,15,95,110 
+11241,2012-04-18,2,1,4,13,0,3,1,3,0.46,0.4545,0.59,0,13,51,64 +11242,2012-04-18,2,1,4,14,0,3,1,3,0.44,0.4394,0.72,0.1045,8,43,51 +11243,2012-04-18,2,1,4,15,0,3,1,3,0.42,0.4242,0.77,0.194,10,88,98 +11244,2012-04-18,2,1,4,16,0,3,1,2,0.42,0.4242,0.77,0.2239,36,199,235 +11245,2012-04-18,2,1,4,17,0,3,1,2,0.42,0.4242,0.77,0.194,46,442,488 +11246,2012-04-18,2,1,4,18,0,3,1,2,0.42,0.4242,0.77,0,27,478,505 +11247,2012-04-18,2,1,4,19,0,3,1,2,0.42,0.4242,0.77,0.1343,26,342,368 +11248,2012-04-18,2,1,4,20,0,3,1,2,0.42,0.4242,0.77,0.1045,8,252,260 +11249,2012-04-18,2,1,4,21,0,3,1,2,0.42,0.4242,0.82,0.0896,24,165,189 +11250,2012-04-18,2,1,4,22,0,3,1,2,0.42,0.4242,0.77,0.0896,9,153,162 +11251,2012-04-18,2,1,4,23,0,3,1,2,0.42,0.4242,0.77,0,14,56,70 +11252,2012-04-19,2,1,4,0,0,4,1,3,0.42,0.4242,0.77,0,5,20,25 +11253,2012-04-19,2,1,4,1,0,4,1,3,0.42,0.4242,0.77,0.1045,2,11,13 +11254,2012-04-19,2,1,4,2,0,4,1,2,0.42,0.4242,0.77,0,2,4,6 +11255,2012-04-19,2,1,4,3,0,4,1,2,0.42,0.4242,0.77,0,1,3,4 +11256,2012-04-19,2,1,4,4,0,4,1,2,0.42,0.4242,0.71,0,0,4,4 +11257,2012-04-19,2,1,4,5,0,4,1,1,0.4,0.4091,0.82,0,0,30,30 +11258,2012-04-19,2,1,4,6,0,4,1,2,0.4,0.4091,0.82,0.1045,4,116,120 +11259,2012-04-19,2,1,4,7,0,4,1,2,0.4,0.4091,0.82,0.1343,15,391,406 +11260,2012-04-19,2,1,4,8,0,4,1,2,0.42,0.4242,0.82,0.1045,26,651,677 +11261,2012-04-19,2,1,4,9,0,4,1,2,0.46,0.4545,0.72,0,31,270,301 +11262,2012-04-19,2,1,4,10,0,4,1,1,0.48,0.4697,0.63,0.0896,46,143,189 +11263,2012-04-19,2,1,4,11,0,4,1,1,0.52,0.5,0.48,0,46,183,229 +11264,2012-04-19,2,1,4,12,0,4,1,1,0.54,0.5152,0.49,0,51,213,264 +11265,2012-04-19,2,1,4,13,0,4,1,1,0.56,0.5303,0.46,0,52,186,238 +11266,2012-04-19,2,1,4,14,0,4,1,1,0.58,0.5455,0.43,0,54,183,237 +11267,2012-04-19,2,1,4,15,0,4,1,1,0.6,0.6212,0.43,0,66,196,262 +11268,2012-04-19,2,1,4,16,0,4,1,1,0.62,0.6212,0.41,0.1045,63,324,387 +11269,2012-04-19,2,1,4,17,0,4,1,1,0.6,0.6212,0.43,0.0896,85,663,748 +11270,2012-04-19,2,1,4,18,0,4,1,1,0.6,0.6212,0.43,0.0896,99,677,776 
+11271,2012-04-19,2,1,4,19,0,4,1,1,0.58,0.5455,0.46,0.1343,70,516,586 +11272,2012-04-19,2,1,4,20,0,4,1,1,0.56,0.5303,0.49,0.2239,52,352,404 +11273,2012-04-19,2,1,4,21,0,4,1,1,0.52,0.5,0.55,0.1045,27,240,267 +11274,2012-04-19,2,1,4,22,0,4,1,1,0.52,0.5,0.59,0.194,29,219,248 +11275,2012-04-19,2,1,4,23,0,4,1,1,0.5,0.4848,0.63,0.1045,20,124,144 +11276,2012-04-20,2,1,4,0,0,5,1,1,0.48,0.4697,0.67,0.1045,5,59,64 +11277,2012-04-20,2,1,4,1,0,5,1,1,0.46,0.4545,0.82,0.194,3,22,25 +11278,2012-04-20,2,1,4,2,0,5,1,1,0.44,0.4394,0.88,0.194,3,18,21 +11279,2012-04-20,2,1,4,3,0,5,1,1,0.44,0.4394,0.88,0.1045,1,9,10 +11280,2012-04-20,2,1,4,4,0,5,1,1,0.42,0.4242,0.94,0.1045,0,3,3 +11281,2012-04-20,2,1,4,5,0,5,1,1,0.42,0.4242,0.94,0,1,24,25 +11282,2012-04-20,2,1,4,6,0,5,1,1,0.42,0.4242,0.88,0,8,105,113 +11283,2012-04-20,2,1,4,7,0,5,1,1,0.42,0.4242,0.94,0,9,350,359 +11284,2012-04-20,2,1,4,8,0,5,1,1,0.44,0.4394,0.94,0,32,668,700 +11285,2012-04-20,2,1,4,9,0,5,1,1,0.46,0.4545,0.88,0.0896,30,329,359 +11286,2012-04-20,2,1,4,10,0,5,1,1,0.5,0.4848,0.77,0.194,67,170,237 +11287,2012-04-20,2,1,4,11,0,5,1,1,0.54,0.5152,0.68,0.1642,64,190,254 +11288,2012-04-20,2,1,4,12,0,5,1,1,0.58,0.5455,0.53,0.194,102,260,362 +11289,2012-04-20,2,1,4,13,0,5,1,1,0.62,0.6212,0.43,0.194,105,276,381 +11290,2012-04-20,2,1,4,14,0,5,1,1,0.64,0.6212,0.44,0.2239,156,238,394 +11291,2012-04-20,2,1,4,15,0,5,1,1,0.64,0.6212,0.47,0.194,145,262,407 +11292,2012-04-20,2,1,4,16,0,5,1,1,0.66,0.6212,0.44,0.194,151,429,580 +11293,2012-04-20,2,1,4,17,0,5,1,1,0.64,0.6212,0.47,0.2836,122,697,819 +11294,2012-04-20,2,1,4,18,0,5,1,1,0.62,0.6212,0.5,0.2836,110,558,668 +11295,2012-04-20,2,1,4,19,0,5,1,1,0.6,0.6212,0.53,0.194,82,410,492 +11296,2012-04-20,2,1,4,20,0,5,1,1,0.58,0.5455,0.56,0.1642,52,289,341 +11297,2012-04-20,2,1,4,21,0,5,1,1,0.56,0.5303,0.68,0.1343,35,252,287 +11298,2012-04-20,2,1,4,22,0,5,1,1,0.54,0.5152,0.68,0.2537,26,176,202 +11299,2012-04-20,2,1,4,23,0,5,1,1,0.52,0.5,0.72,0.1343,31,156,187 
+11300,2012-04-21,2,1,4,0,0,6,0,1,0.52,0.5,0.77,0.2239,31,111,142 +11301,2012-04-21,2,1,4,1,0,6,0,1,0.52,0.5,0.77,0.2239,24,64,88 +11302,2012-04-21,2,1,4,2,0,6,0,1,0.52,0.5,0.77,0.1343,12,69,81 +11303,2012-04-21,2,1,4,3,0,6,0,1,0.5,0.4848,0.82,0.2239,4,25,29 +11304,2012-04-21,2,1,4,4,0,6,0,1,0.5,0.4848,0.82,0.2537,0,9,9 +11305,2012-04-21,2,1,4,5,0,6,0,1,0.5,0.4848,0.82,0.2537,0,1,1 +11306,2012-04-21,2,1,4,6,0,6,0,1,0.5,0.4848,0.82,0.2239,10,21,31 +11307,2012-04-21,2,1,4,7,0,6,0,1,0.5,0.4848,0.82,0.2836,21,55,76 +11308,2012-04-21,2,1,4,8,0,6,0,1,0.52,0.5,0.81,0.2537,52,149,201 +11309,2012-04-21,2,1,4,9,0,6,0,1,0.54,0.5152,0.77,0.1642,115,216,331 +11310,2012-04-21,2,1,4,10,0,6,0,1,0.6,0.6061,0.64,0.2836,183,299,482 +11311,2012-04-21,2,1,4,11,0,6,0,1,0.62,0.6061,0.61,0.3284,260,348,608 +11312,2012-04-21,2,1,4,12,0,6,0,1,0.64,0.6212,0.57,0.2239,262,378,640 +11313,2012-04-21,2,1,4,13,0,6,0,1,0.66,0.6212,0.54,0.2239,291,358,649 +11314,2012-04-21,2,1,4,14,0,6,0,1,0.7,0.6364,0.45,0.2836,278,331,609 +11315,2012-04-21,2,1,4,15,0,6,0,1,0.72,0.6515,0.39,0.2537,267,361,628 +11316,2012-04-21,2,1,4,16,0,6,0,1,0.7,0.6364,0.42,0.3284,267,372,639 +11317,2012-04-21,2,1,4,17,0,6,0,2,0.74,0.6515,0.3,0.3881,259,309,568 +11318,2012-04-21,2,1,4,18,0,6,0,2,0.68,0.6364,0.47,0.4925,133,283,416 +11319,2012-04-21,2,1,4,19,0,6,0,3,0.56,0.5303,0.73,0.4925,9,45,54 +11320,2012-04-21,2,1,4,20,0,6,0,3,0.5,0.4848,0.82,0.2985,5,34,39 +11321,2012-04-21,2,1,4,21,0,6,0,3,0.5,0.4848,0.82,0.3881,20,82,102 +11322,2012-04-21,2,1,4,22,0,6,0,1,0.48,0.4697,0.82,0.2985,20,90,110 +11323,2012-04-21,2,1,4,23,0,6,0,2,0.46,0.4545,0.82,0.2836,18,73,91 +11324,2012-04-22,2,1,4,0,0,0,0,2,0.46,0.4545,0.82,0.194,32,85,117 +11325,2012-04-22,2,1,4,1,0,0,0,2,0.46,0.4545,0.82,0.3284,10,63,73 +11326,2012-04-22,2,1,4,2,0,0,0,1,0.46,0.4545,0.72,0.2836,7,49,56 +11327,2012-04-22,2,1,4,3,0,0,0,2,0.44,0.4394,0.82,0.1642,7,33,40 +11328,2012-04-22,2,1,4,4,0,0,0,2,0.44,0.4394,0.77,0.1642,0,7,7 
+11329,2012-04-22,2,1,4,5,0,0,0,2,0.44,0.4394,0.77,0.2239,0,6,6 +11330,2012-04-22,2,1,4,6,0,0,0,2,0.44,0.4394,0.72,0.3284,6,6,12 +11331,2012-04-22,2,1,4,7,0,0,0,3,0.42,0.4242,0.77,0.3881,2,22,24 +11332,2012-04-22,2,1,4,8,0,0,0,3,0.4,0.4091,0.82,0.2537,8,43,51 +11333,2012-04-22,2,1,4,9,0,0,0,3,0.4,0.4091,0.82,0.2537,8,71,79 +11334,2012-04-22,2,1,4,10,0,0,0,3,0.38,0.3939,0.82,0.3284,11,57,68 +11335,2012-04-22,2,1,4,11,0,0,0,3,0.38,0.3939,0.87,0.2985,7,70,77 +11336,2012-04-22,2,1,4,12,0,0,0,3,0.38,0.3939,0.87,0.3284,5,56,61 +11337,2012-04-22,2,1,4,13,0,0,0,3,0.38,0.3939,0.82,0.3582,5,47,52 +11338,2012-04-22,2,1,4,14,0,0,0,3,0.38,0.3939,0.87,0.3284,3,50,53 +11339,2012-04-22,2,1,4,15,0,0,0,3,0.38,0.3939,0.87,0.3881,0,38,38 +11340,2012-04-22,2,1,4,16,0,0,0,3,0.38,0.3939,0.87,0.4478,4,32,36 +11341,2012-04-22,2,1,4,17,0,0,0,3,0.36,0.3182,0.93,0.4478,2,25,27 +11342,2012-04-22,2,1,4,18,0,0,0,3,0.36,0.3182,0.93,0.4478,1,38,39 +11343,2012-04-22,2,1,4,19,0,0,0,3,0.36,0.3182,0.87,0.5224,1,34,35 +11344,2012-04-22,2,1,4,20,0,0,0,3,0.36,0.3333,0.93,0.4179,1,34,35 +11345,2012-04-22,2,1,4,21,0,0,0,3,0.36,0.3333,0.87,0.3881,0,11,11 +11346,2012-04-22,2,1,4,22,0,0,0,3,0.34,0.2879,0.87,0.5224,0,13,13 +11347,2012-04-22,2,1,4,23,0,0,0,3,0.36,0.3182,0.81,0.4627,0,17,17 +11348,2012-04-23,2,1,4,0,0,1,1,3,0.36,0.3333,0.81,0.4179,0,6,6 +11349,2012-04-23,2,1,4,1,0,1,1,3,0.34,0.303,0.87,0.3582,1,1,2 +11350,2012-04-23,2,1,4,2,0,1,1,2,0.32,0.303,0.87,0.3284,1,3,4 +11351,2012-04-23,2,1,4,3,0,1,1,2,0.32,0.303,0.81,0.2836,0,1,1 +11352,2012-04-23,2,1,4,4,0,1,1,3,0.3,0.2879,0.81,0.2836,0,3,3 +11353,2012-04-23,2,1,4,5,0,1,1,3,0.3,0.2879,0.81,0.2836,0,14,14 +11354,2012-04-23,2,1,4,6,0,1,1,2,0.3,0.2727,0.81,0.2985,2,43,45 +11355,2012-04-23,2,1,4,7,0,1,1,3,0.3,0.2879,0.81,0.2537,1,206,207 +11356,2012-04-23,2,1,4,8,0,1,1,2,0.3,0.2727,0.75,0.3284,11,471,482 +11357,2012-04-23,2,1,4,9,0,1,1,2,0.3,0.2727,0.75,0.3881,17,255,272 +11358,2012-04-23,2,1,4,10,0,1,1,2,0.32,0.303,0.7,0.3284,12,88,100 
+11359,2012-04-23,2,1,4,11,0,1,1,2,0.32,0.2879,0.7,0.3582,10,91,101 +11360,2012-04-23,2,1,4,12,0,1,1,3,0.34,0.303,0.71,0.3582,12,117,129 +11361,2012-04-23,2,1,4,13,0,1,1,3,0.34,0.303,0.66,0.3881,9,98,107 +11362,2012-04-23,2,1,4,14,0,1,1,3,0.36,0.3182,0.62,0.4478,12,102,114 +11363,2012-04-23,2,1,4,15,0,1,1,2,0.36,0.3333,0.62,0.3881,11,101,112 +11364,2012-04-23,2,1,4,16,0,1,1,3,0.3,0.2727,0.81,0.3582,10,93,103 +11365,2012-04-23,2,1,4,17,0,1,1,3,0.3,0.2727,0.82,0.3284,13,219,232 +11366,2012-04-23,2,1,4,18,0,1,1,3,0.32,0.3333,0.81,0.1343,10,369,379 +11367,2012-04-23,2,1,4,19,0,1,1,2,0.34,0.3182,0.76,0.2239,11,287,298 +11368,2012-04-23,2,1,4,20,0,1,1,2,0.32,0.3182,0.76,0.1642,17,186,203 +11369,2012-04-23,2,1,4,21,0,1,1,2,0.32,0.303,0.81,0.2239,7,145,152 +11370,2012-04-23,2,1,4,22,0,1,1,1,0.32,0.3182,0.76,0.194,18,71,89 +11371,2012-04-23,2,1,4,23,0,1,1,1,0.32,0.3182,0.76,0.1642,10,49,59 +11372,2012-04-24,2,1,4,0,0,2,1,2,0.32,0.3333,0.76,0.1343,0,13,13 +11373,2012-04-24,2,1,4,1,0,2,1,2,0.32,0.3333,0.81,0.1343,1,6,7 +11374,2012-04-24,2,1,4,2,0,2,1,1,0.32,0.3182,0.81,0.194,0,3,3 +11375,2012-04-24,2,1,4,3,0,2,1,1,0.32,0.303,0.81,0.2239,0,2,2 +11376,2012-04-24,2,1,4,4,0,2,1,1,0.3,0.2879,0.7,0.194,0,3,3 +11377,2012-04-24,2,1,4,5,0,2,1,1,0.3,0.2879,0.7,0.194,0,22,22 +11378,2012-04-24,2,1,4,6,0,2,1,1,0.3,0.2879,0.7,0.2239,3,117,120 +11379,2012-04-24,2,1,4,7,0,2,1,1,0.32,0.303,0.66,0.2836,3,387,390 +11380,2012-04-24,2,1,4,8,0,2,1,1,0.34,0.3182,0.61,0.2537,12,599,611 +11381,2012-04-24,2,1,4,9,0,2,1,1,0.4,0.4091,0.47,0.3582,6,302,308 +11382,2012-04-24,2,1,4,10,0,2,1,1,0.42,0.4242,0.47,0.2985,33,129,162 +11383,2012-04-24,2,1,4,11,0,2,1,1,0.46,0.4545,0.31,0.3881,27,157,184 +11384,2012-04-24,2,1,4,12,0,2,1,1,0.48,0.4697,0.29,0.3881,36,188,224 +11385,2012-04-24,2,1,4,13,0,2,1,1,0.5,0.4848,0.29,0.3881,35,189,224 +11386,2012-04-24,2,1,4,14,0,2,1,1,0.5,0.4848,0.25,0.3881,50,166,216 +11387,2012-04-24,2,1,4,15,0,2,1,1,0.5,0.4848,0.23,0.4179,60,176,236 
+11388,2012-04-24,2,1,4,16,0,2,1,1,0.5,0.4848,0.23,0.194,54,308,362 +11389,2012-04-24,2,1,4,17,0,2,1,1,0.52,0.5,0.22,0.3881,72,619,691 +11390,2012-04-24,2,1,4,18,0,2,1,1,0.5,0.4848,0.18,0,52,580,632 +11391,2012-04-24,2,1,4,19,0,2,1,1,0.5,0.4848,0.18,0.2537,33,449,482 +11392,2012-04-24,2,1,4,20,0,2,1,1,0.48,0.4697,0.21,0.3284,16,269,285 +11393,2012-04-24,2,1,4,21,0,2,1,2,0.46,0.4545,0.28,0.1642,13,213,226 +11394,2012-04-24,2,1,4,22,0,2,1,1,0.44,0.4394,0.35,0.1045,8,149,157 +11395,2012-04-24,2,1,4,23,0,2,1,1,0.42,0.4242,0.38,0.0896,4,69,73 +11396,2012-04-25,2,1,4,0,0,3,1,1,0.44,0.4394,0.38,0,13,23,36 +11397,2012-04-25,2,1,4,1,0,3,1,1,0.42,0.4242,0.41,0,8,17,25 +11398,2012-04-25,2,1,4,2,0,3,1,1,0.36,0.3788,0.62,0,1,6,7 +11399,2012-04-25,2,1,4,3,0,3,1,1,0.32,0.3333,0.7,0.0896,0,3,3 +11400,2012-04-25,2,1,4,4,0,3,1,1,0.34,0.3485,0.66,0.0896,0,6,6 +11401,2012-04-25,2,1,4,5,0,3,1,1,0.34,0.3636,0.66,0,0,24,24 +11402,2012-04-25,2,1,4,6,0,3,1,1,0.34,0.3636,0.66,0,3,103,106 +11403,2012-04-25,2,1,4,7,0,3,1,1,0.34,0.3636,0.71,0,13,376,389 +11404,2012-04-25,2,1,4,8,0,3,1,1,0.36,0.3636,0.76,0.0896,20,634,654 +11405,2012-04-25,2,1,4,9,0,3,1,1,0.42,0.4242,0.54,0,33,288,321 +11406,2012-04-25,2,1,4,10,0,3,1,1,0.52,0.5,0.34,0,44,138,182 +11407,2012-04-25,2,1,4,11,0,3,1,1,0.52,0.5,0.32,0.1343,42,177,219 +11408,2012-04-25,2,1,4,12,0,3,1,1,0.54,0.5152,0.28,0.2836,49,220,269 +11409,2012-04-25,2,1,4,13,0,3,1,1,0.56,0.5303,0.26,0.2239,35,193,228 +11410,2012-04-25,2,1,4,14,0,3,1,1,0.56,0.5303,0.22,0.3582,47,176,223 +11411,2012-04-25,2,1,4,15,0,3,1,1,0.58,0.5455,0.19,0.1642,47,182,229 +11412,2012-04-25,2,1,4,16,0,3,1,1,0.6,0.5909,0.2,0.3284,49,327,376 +11413,2012-04-25,2,1,4,17,0,3,1,1,0.6,0.5909,0.2,0.2985,68,664,732 +11414,2012-04-25,2,1,4,18,0,3,1,1,0.6,0.5909,0.2,0.2836,60,649,709 +11415,2012-04-25,2,1,4,19,0,3,1,1,0.6,0.5909,0.2,0.194,42,501,543 +11416,2012-04-25,2,1,4,20,0,3,1,1,0.56,0.5303,0.22,0.0896,22,358,380 +11417,2012-04-25,2,1,4,21,0,3,1,1,0.52,0.5,0.52,0.1343,27,212,239 
+11418,2012-04-25,2,1,4,22,0,3,1,1,0.5,0.4848,0.51,0,22,170,192 +11419,2012-04-25,2,1,4,23,0,3,1,2,0.5,0.4848,0.51,0.0896,10,94,104 +11420,2012-04-26,2,1,4,0,0,4,1,2,0.48,0.4697,0.63,0,1,42,43 +11421,2012-04-26,2,1,4,1,0,4,1,2,0.48,0.4697,0.48,0.0896,1,22,23 +11422,2012-04-26,2,1,4,2,0,4,1,2,0.48,0.4697,0.55,0.1045,3,10,13 +11423,2012-04-26,2,1,4,3,0,4,1,2,0.48,0.4697,0.55,0.0896,1,5,6 +11424,2012-04-26,2,1,4,4,0,4,1,2,0.48,0.4697,0.59,0.0896,1,2,3 +11425,2012-04-26,2,1,4,5,0,4,1,2,0.48,0.4697,0.63,0.1343,0,17,17 +11426,2012-04-26,2,1,4,6,0,4,1,2,0.48,0.4697,0.63,0.2537,5,122,127 +11427,2012-04-26,2,1,4,7,0,4,1,3,0.46,0.4545,0.82,0.3582,3,222,225 +11428,2012-04-26,2,1,4,8,0,4,1,3,0.46,0.4545,0.82,0.194,4,358,362 +11429,2012-04-26,2,1,4,9,0,4,1,3,0.46,0.4545,0.88,0.1642,12,189,201 +11430,2012-04-26,2,1,4,10,0,4,1,2,0.46,0.4545,0.94,0.1642,14,125,139 +11431,2012-04-26,2,1,4,11,0,4,1,2,0.46,0.4545,0.94,0.2239,16,135,151 +11432,2012-04-26,2,1,4,12,0,4,1,2,0.52,0.5,0.83,0.1343,24,180,204 +11433,2012-04-26,2,1,4,13,0,4,1,2,0.5,0.4848,0.82,0.1642,19,178,197 +11434,2012-04-26,2,1,4,14,0,4,1,2,0.54,0.5152,0.77,0.194,37,132,169 +11435,2012-04-26,2,1,4,15,0,4,1,2,0.52,0.5,0.83,0.3284,39,165,204 +11436,2012-04-26,2,1,4,16,0,4,1,2,0.52,0.5,0.83,0.2239,37,281,318 +11437,2012-04-26,2,1,4,17,0,4,1,1,0.54,0.5152,0.77,0.1642,49,565,614 +11438,2012-04-26,2,1,4,18,0,4,1,2,0.54,0.5152,0.77,0.194,75,589,664 +11439,2012-04-26,2,1,4,19,0,4,1,2,0.54,0.5152,0.77,0.2239,57,407,464 +11440,2012-04-26,2,1,4,20,0,4,1,1,0.54,0.5152,0.77,0.194,23,314,337 +11441,2012-04-26,2,1,4,21,0,4,1,1,0.52,0.5,0.83,0.2239,28,222,250 +11442,2012-04-26,2,1,4,22,0,4,1,1,0.52,0.5,0.83,0.1642,20,163,183 +11443,2012-04-26,2,1,4,23,0,4,1,1,0.5,0.4848,0.88,0.1642,6,106,112 +11444,2012-04-27,2,1,4,0,0,5,1,1,0.54,0.5152,0.68,0.5522,12,61,73 +11445,2012-04-27,2,1,4,1,0,5,1,1,0.5,0.4848,0.59,0.4478,3,32,35 +11446,2012-04-27,2,1,4,2,0,5,1,1,0.48,0.4697,0.59,0.4179,4,18,22 
+11447,2012-04-27,2,1,4,3,0,5,1,1,0.44,0.4394,0.47,0.4925,1,2,3 +11448,2012-04-27,2,1,4,4,0,5,1,1,0.42,0.4242,0.44,0.2537,0,6,6 +11449,2012-04-27,2,1,4,5,0,5,1,1,0.42,0.4242,0.44,0.3582,0,19,19 +11450,2012-04-27,2,1,4,6,0,5,1,1,0.4,0.4091,0.43,0.2537,5,115,120 +11451,2012-04-27,2,1,4,7,0,5,1,1,0.38,0.3939,0.5,0.2836,10,305,315 +11452,2012-04-27,2,1,4,8,0,5,1,1,0.4,0.4091,0.5,0,17,575,592 +11453,2012-04-27,2,1,4,9,0,5,1,1,0.42,0.4242,0.44,0.4925,42,292,334 +11454,2012-04-27,2,1,4,10,0,5,1,1,0.44,0.4394,0.38,0.4179,64,172,236 +11455,2012-04-27,2,1,4,11,0,5,1,1,0.44,0.4394,0.38,0.3284,47,186,233 +11456,2012-04-27,2,1,4,12,0,5,1,1,0.46,0.4545,0.36,0.3582,63,244,307 +11457,2012-04-27,2,1,4,13,0,5,1,1,0.46,0.4545,0.33,0.3881,94,212,306 +11458,2012-04-27,2,1,4,14,0,5,1,1,0.46,0.4545,0.33,0.3284,94,206,300 +11459,2012-04-27,2,1,4,15,0,5,1,1,0.52,0.5,0.29,0.4925,101,223,324 +11460,2012-04-27,2,1,4,16,0,5,1,1,0.52,0.5,0.29,0.2537,105,335,440 +11461,2012-04-27,2,1,4,17,0,5,1,1,0.54,0.5152,0.24,0.3881,89,613,702 +11462,2012-04-27,2,1,4,18,0,5,1,1,0.52,0.5,0.25,0.3582,94,560,654 +11463,2012-04-27,2,1,4,19,0,5,1,1,0.5,0.4848,0.27,0.3881,62,379,441 +11464,2012-04-27,2,1,4,20,0,5,1,1,0.46,0.4545,0.31,0.2836,49,240,289 +11465,2012-04-27,2,1,4,21,0,5,1,1,0.44,0.4394,0.33,0.2985,23,185,208 +11466,2012-04-27,2,1,4,22,0,5,1,1,0.42,0.4242,0.38,0.2537,24,134,158 +11467,2012-04-27,2,1,4,23,0,5,1,1,0.4,0.4091,0.4,0.2537,11,105,116 +11468,2012-04-28,2,1,4,0,0,6,0,1,0.38,0.3939,0.43,0.1642,4,99,103 +11469,2012-04-28,2,1,4,1,0,6,0,1,0.36,0.3333,0.5,0.2985,7,60,67 +11470,2012-04-28,2,1,4,2,0,6,0,1,0.36,0.3485,0.43,0.1642,8,49,57 +11471,2012-04-28,2,1,4,3,0,6,0,1,0.34,0.3182,0.49,0.2239,3,23,26 +11472,2012-04-28,2,1,4,4,0,6,0,1,0.34,0.3333,0.49,0.1642,0,2,2 +11473,2012-04-28,2,1,4,5,0,6,0,1,0.34,0.3333,0.46,0.1343,2,3,5 +11474,2012-04-28,2,1,4,6,0,6,0,1,0.34,0.3333,0.49,0.194,10,18,28 +11475,2012-04-28,2,1,4,7,0,6,0,1,0.34,0.3333,0.49,0.1642,10,45,55 
+11476,2012-04-28,2,1,4,8,0,6,0,1,0.38,0.3939,0.4,0.1343,24,134,158 +11477,2012-04-28,2,1,4,9,0,6,0,2,0.38,0.3939,0.37,0,45,169,214 +11478,2012-04-28,2,1,4,10,0,6,0,2,0.38,0.3939,0.4,0.1642,60,231,291 +11479,2012-04-28,2,1,4,11,0,6,0,2,0.38,0.3939,0.4,0.1045,96,262,358 +11480,2012-04-28,2,1,4,12,0,6,0,2,0.38,0.3939,0.43,0,105,276,381 +11481,2012-04-28,2,1,4,13,0,6,0,2,0.4,0.4091,0.37,0,120,269,389 +11482,2012-04-28,2,1,4,14,0,6,0,1,0.42,0.4242,0.38,0.1045,147,293,440 +11483,2012-04-28,2,1,4,15,0,6,0,1,0.42,0.4242,0.41,0.2239,148,260,408 +11484,2012-04-28,2,1,4,16,0,6,0,2,0.44,0.4394,0.38,0.194,132,244,376 +11485,2012-04-28,2,1,4,17,0,6,0,2,0.42,0.4242,0.41,0.1343,98,224,322 +11486,2012-04-28,2,1,4,18,0,6,0,3,0.4,0.4091,0.47,0,17,73,90 +11487,2012-04-28,2,1,4,19,0,6,0,3,0.36,0.3485,0.71,0.1343,29,110,139 +11488,2012-04-28,2,1,4,20,0,6,0,2,0.38,0.3939,0.66,0,21,81,102 +11489,2012-04-28,2,1,4,21,0,6,0,3,0.36,0.3485,0.71,0.194,12,72,84 +11490,2012-04-28,2,1,4,22,0,6,0,2,0.36,0.3485,0.71,0.1343,17,66,83 +11491,2012-04-28,2,1,4,23,0,6,0,2,0.38,0.3939,0.76,0.0896,5,37,42 +11492,2012-04-29,2,1,4,0,0,0,0,3,0.36,0.3485,0.81,0.1343,0,36,36 +11493,2012-04-29,2,1,4,1,0,0,0,2,0.36,0.3485,0.81,0.1343,5,37,42 +11494,2012-04-29,2,1,4,2,0,0,0,2,0.36,0.3636,0.87,0.1045,4,36,40 +11495,2012-04-29,2,1,4,3,0,0,0,2,0.36,0.3485,0.87,0.194,2,14,16 +11496,2012-04-29,2,1,4,4,0,0,0,2,0.36,0.3485,0.87,0.1343,0,4,4 +11497,2012-04-29,2,1,4,5,0,0,0,1,0.34,0.3636,0.87,0,0,6,6 +11498,2012-04-29,2,1,4,6,0,0,0,1,0.34,0.3333,0.87,0.1343,1,6,7 +11499,2012-04-29,2,1,4,7,0,0,0,2,0.34,0.3333,0.87,0.1343,7,20,27 +11500,2012-04-29,2,1,4,8,0,0,0,1,0.38,0.3939,0.76,0.1343,26,81,107 +11501,2012-04-29,2,1,4,9,0,0,0,1,0.4,0.4091,0.71,0.1343,52,133,185 +11502,2012-04-29,2,1,4,10,0,0,0,1,0.42,0.4242,0.67,0.0896,127,253,380 +11503,2012-04-29,2,1,4,11,0,0,0,1,0.46,0.4545,0.51,0,128,283,411 +11504,2012-04-29,2,1,4,12,0,0,0,1,0.48,0.4697,0.44,0.1045,223,361,584 
+11505,2012-04-29,2,1,4,13,0,0,0,1,0.5,0.4848,0.45,0.1045,226,369,595 +11506,2012-04-29,2,1,4,14,0,0,0,1,0.54,0.5152,0.34,0,281,372,653 +11507,2012-04-29,2,1,4,15,0,0,0,1,0.56,0.5303,0.37,0.1343,279,324,603 +11508,2012-04-29,2,1,4,16,0,0,0,1,0.6,0.6061,0.28,0.194,240,347,587 +11509,2012-04-29,2,1,4,17,0,0,0,1,0.62,0.6061,0.27,0.2985,188,334,522 +11510,2012-04-29,2,1,4,18,0,0,0,1,0.6,0.6061,0.26,0.2836,164,323,487 +11511,2012-04-29,2,1,4,19,0,0,0,1,0.56,0.5303,0.35,0.1343,128,247,375 +11512,2012-04-29,2,1,4,20,0,0,0,1,0.56,0.5303,0.32,0.0896,45,198,243 +11513,2012-04-29,2,1,4,21,0,0,0,1,0.52,0.5,0.48,0.1343,65,139,204 +11514,2012-04-29,2,1,4,22,0,0,0,1,0.5,0.4848,0.45,0,23,85,108 +11515,2012-04-29,2,1,4,23,0,0,0,1,0.48,0.4697,0.59,0,15,67,82 +11516,2012-04-30,2,1,4,0,0,1,1,1,0.44,0.4394,0.72,0.1045,12,36,48 +11517,2012-04-30,2,1,4,1,0,1,1,1,0.42,0.4242,0.77,0.0896,9,15,24 +11518,2012-04-30,2,1,4,2,0,1,1,1,0.42,0.4242,0.71,0.1343,3,5,8 +11519,2012-04-30,2,1,4,3,0,1,1,1,0.4,0.4091,0.76,0.1642,0,4,4 +11520,2012-04-30,2,1,4,4,0,1,1,1,0.38,0.3939,0.62,0.2537,0,2,2 +11521,2012-04-30,2,1,4,5,0,1,1,1,0.38,0.3939,0.62,0.2537,1,19,20 +11522,2012-04-30,2,1,4,6,0,1,1,1,0.4,0.4091,0.47,0.194,2,121,123 +11523,2012-04-30,2,1,4,7,0,1,1,1,0.4,0.4091,0.5,0.194,6,343,349 +11524,2012-04-30,2,1,4,8,0,1,1,1,0.4,0.4091,0.43,0.194,17,579,596 +11525,2012-04-30,2,1,4,9,0,1,1,2,0.4,0.4091,0.43,0.194,29,239,268 +11526,2012-04-30,2,1,4,10,0,1,1,2,0.42,0.4242,0.41,0.1642,41,116,157 +11527,2012-04-30,2,1,4,11,0,1,1,2,0.44,0.4394,0.44,0.1045,52,111,163 +11528,2012-04-30,2,1,4,12,0,1,1,2,0.44,0.4394,0.47,0.1343,45,180,225 +11529,2012-04-30,2,1,4,13,0,1,1,2,0.48,0.4697,0.51,0.0896,48,199,247 +11530,2012-04-30,2,1,4,14,0,1,1,2,0.5,0.4848,0.55,0.1343,56,162,218 +11531,2012-04-30,2,1,4,15,0,1,1,2,0.52,0.5,0.55,0.1045,55,176,231 +11532,2012-04-30,2,1,4,16,0,1,1,1,0.56,0.5303,0.6,0.194,44,303,347 +11533,2012-04-30,2,1,4,17,0,1,1,1,0.56,0.5303,0.6,0.1642,66,617,683 
+11534,2012-04-30,2,1,4,18,0,1,1,2,0.56,0.5303,0.6,0.2239,53,611,664 +11535,2012-04-30,2,1,4,19,0,1,1,2,0.54,0.5152,0.64,0.2537,51,420,471 +11536,2012-04-30,2,1,4,20,0,1,1,2,0.52,0.5,0.59,0.194,28,281,309 +11537,2012-04-30,2,1,4,21,0,1,1,2,0.52,0.5,0.59,0.194,26,195,221 +11538,2012-04-30,2,1,4,22,0,1,1,2,0.52,0.5,0.55,0.1642,11,123,134 +11539,2012-04-30,2,1,4,23,0,1,1,2,0.52,0.5,0.55,0.2239,10,50,60 +11540,2012-05-01,2,1,5,0,0,2,1,2,0.5,0.4848,0.59,0.194,7,28,35 +11541,2012-05-01,2,1,5,1,0,2,1,2,0.5,0.4848,0.63,0.1343,0,21,21 +11542,2012-05-01,2,1,5,2,0,2,1,2,0.5,0.4848,0.72,0.0896,1,7,8 +11543,2012-05-01,2,1,5,3,0,2,1,2,0.5,0.4848,0.77,0,1,2,3 +11544,2012-05-01,2,1,5,4,0,2,1,2,0.52,0.5,0.72,0.0896,1,7,8 +11545,2012-05-01,2,1,5,5,0,2,1,2,0.52,0.5,0.72,0.0896,0,17,17 +11546,2012-05-01,2,1,5,6,0,2,1,3,0.52,0.5,0.77,0.1045,2,24,26 +11547,2012-05-01,2,1,5,7,0,2,1,2,0.5,0.4848,0.94,0.2537,8,161,169 +11548,2012-05-01,2,1,5,8,0,2,1,2,0.52,0.5,0.94,0.2836,19,538,557 +11549,2012-05-01,2,1,5,9,0,2,1,2,0.54,0.5152,0.88,0.194,18,331,349 +11550,2012-05-01,2,1,5,10,0,2,1,1,0.62,0.5909,0.78,0.2537,30,144,174 +11551,2012-05-01,2,1,5,11,0,2,1,2,0.66,0.6212,0.65,0.2836,50,179,229 +11552,2012-05-01,2,1,5,12,0,2,1,2,0.7,0.6515,0.58,0.2239,41,228,269 +11553,2012-05-01,2,1,5,13,0,2,1,2,0.72,0.6667,0.51,0.2239,41,208,249 +11554,2012-05-01,2,1,5,14,0,2,1,2,0.74,0.6667,0.48,0.2537,37,167,204 +11555,2012-05-01,2,1,5,15,0,2,1,2,0.74,0.6667,0.45,0.2239,48,186,234 +11556,2012-05-01,2,1,5,16,0,2,1,1,0.74,0.6667,0.45,0.194,41,313,354 +11557,2012-05-01,2,1,5,17,0,2,1,1,0.74,0.6667,0.48,0.0896,65,616,681 +11558,2012-05-01,2,1,5,18,0,2,1,1,0.7,0.6515,0.54,0.1343,81,662,743 +11559,2012-05-01,2,1,5,19,0,2,1,1,0.7,0.6515,0.54,0.1343,58,429,487 +11560,2012-05-01,2,1,5,20,0,2,1,1,0.66,0.6212,0.61,0.1642,36,299,335 +11561,2012-05-01,2,1,5,21,0,2,1,1,0.64,0.6061,0.65,0.1343,31,251,282 +11562,2012-05-01,2,1,5,22,0,2,1,1,0.64,0.6061,0.65,0,21,190,211 
+11563,2012-05-01,2,1,5,23,0,2,1,1,0.6,0.5758,0.78,0,16,79,95 +11564,2012-05-02,2,1,5,0,0,3,1,1,0.6,0.5758,0.78,0.1045,4,43,47 +11565,2012-05-02,2,1,5,1,0,3,1,1,0.56,0.5303,0.83,0,8,7,15 +11566,2012-05-02,2,1,5,2,0,3,1,1,0.56,0.5303,0.83,0,7,9,16 +11567,2012-05-02,2,1,5,3,0,3,1,1,0.56,0.5303,0.83,0.0896,0,6,6 +11568,2012-05-02,2,1,5,4,0,3,1,1,0.54,0.5152,0.88,0,2,2,4 +11569,2012-05-02,2,1,5,5,0,3,1,1,0.54,0.5152,0.9,0.1343,2,31,33 +11570,2012-05-02,2,1,5,6,0,3,1,3,0.56,0.5303,0.84,0,3,117,120 +11571,2012-05-02,2,1,5,7,0,3,1,2,0.54,0.5152,0.88,0.194,14,344,358 +11572,2012-05-02,2,1,5,8,0,3,1,2,0.56,0.5303,0.88,0,26,640,666 +11573,2012-05-02,2,1,5,9,0,3,1,2,0.56,0.5303,0.88,0,26,289,315 +11574,2012-05-02,2,1,5,10,0,3,1,2,0.58,0.5455,0.83,0.1045,28,147,175 +11575,2012-05-02,2,1,5,11,0,3,1,2,0.58,0.5455,0.81,0.1343,39,145,184 +11576,2012-05-02,2,1,5,12,0,3,1,2,0.58,0.5455,0.83,0.2239,35,210,245 +11577,2012-05-02,2,1,5,13,0,3,1,2,0.58,0.5455,0.78,0.194,49,179,228 +11578,2012-05-02,2,1,5,14,0,3,1,1,0.62,0.6061,0.69,0.0896,51,189,240 +11579,2012-05-02,2,1,5,15,0,3,1,1,0.64,0.6061,0.65,0.194,52,204,256 +11580,2012-05-02,2,1,5,16,0,3,1,1,0.64,0.6061,0.65,0.1642,54,313,367 +11581,2012-05-02,2,1,5,17,0,3,1,1,0.6,0.5909,0.73,0.2537,70,659,729 +11582,2012-05-02,2,1,5,18,0,3,1,1,0.56,0.5303,0.73,0.2239,43,770,813 +11583,2012-05-02,2,1,5,19,0,3,1,1,0.54,0.5152,0.77,0.2537,43,461,504 +11584,2012-05-02,2,1,5,20,0,3,1,1,0.52,0.5,0.77,0.2537,42,296,338 +11585,2012-05-02,2,1,5,21,0,3,1,1,0.52,0.5,0.77,0.2239,21,218,239 +11586,2012-05-02,2,1,5,22,0,3,1,1,0.5,0.4848,0.82,0.2239,27,146,173 +11587,2012-05-02,2,1,5,23,0,3,1,2,0.5,0.4848,0.77,0.2537,21,77,98 +11588,2012-05-03,2,1,5,0,0,4,1,2,0.48,0.4697,0.82,0.2239,15,75,90 +11589,2012-05-03,2,1,5,1,0,4,1,2,0.46,0.4545,0.88,0.1045,9,15,24 +11590,2012-05-03,2,1,5,2,0,4,1,2,0.46,0.4545,0.88,0.1343,4,13,17 +11591,2012-05-03,2,1,5,3,0,4,1,2,0.46,0.4545,0.88,0,2,4,6 +11592,2012-05-03,2,1,5,4,0,4,1,2,0.46,0.4545,0.88,0.2537,0,2,2 
+11593,2012-05-03,2,1,5,5,0,4,1,2,0.46,0.4545,0.88,0.2537,1,20,21 +11594,2012-05-03,2,1,5,6,0,4,1,2,0.46,0.4545,0.88,0.1642,8,128,136 +11595,2012-05-03,2,1,5,7,0,4,1,3,0.44,0.4394,0.94,0.1045,8,376,384 +11596,2012-05-03,2,1,5,8,0,4,1,3,0.46,0.4545,0.88,0.194,19,608,627 +11597,2012-05-03,2,1,5,9,0,4,1,2,0.48,0.4697,0.88,0.1045,25,279,304 +11598,2012-05-03,2,1,5,10,0,4,1,2,0.52,0.5,0.77,0,36,132,168 +11599,2012-05-03,2,1,5,11,0,4,1,1,0.54,0.5152,0.77,0,34,167,201 +11600,2012-05-03,2,1,5,12,0,4,1,1,0.56,0.5303,0.73,0.1343,64,237,301 +11601,2012-05-03,2,1,5,13,0,4,1,1,0.6,0.5909,0.69,0.194,48,217,265 +11602,2012-05-03,2,1,5,14,0,4,1,1,0.64,0.6061,0.65,0.194,33,171,204 +11603,2012-05-03,2,1,5,15,0,4,1,1,0.66,0.6212,0.61,0.194,63,220,283 +11604,2012-05-03,2,1,5,16,0,4,1,1,0.68,0.6364,0.57,0.1642,45,325,370 +11605,2012-05-03,2,1,5,17,0,4,1,1,0.72,0.6667,0.54,0.1343,87,617,704 +11606,2012-05-03,2,1,5,18,0,4,1,1,0.72,0.6667,0.58,0.1343,64,642,706 +11607,2012-05-03,2,1,5,19,0,4,1,1,0.7,0.6515,0.61,0.194,55,467,522 +11608,2012-05-03,2,1,5,20,0,4,1,1,0.64,0.6061,0.73,0.194,52,368,420 +11609,2012-05-03,2,1,5,21,0,4,1,1,0.62,0.5909,0.78,0.1343,42,257,299 +11610,2012-05-03,2,1,5,22,0,4,1,1,0.62,0.5909,0.78,0,29,217,246 +11611,2012-05-03,2,1,5,23,0,4,1,1,0.6,0.5606,0.83,0,21,100,121 +11612,2012-05-04,2,1,5,0,0,5,1,1,0.6,0.5758,0.78,0.3284,19,70,89 +11613,2012-05-04,2,1,5,1,0,5,1,1,0.6,0.5758,0.78,0.1045,15,33,48 +11614,2012-05-04,2,1,5,2,0,5,1,1,0.58,0.5455,0.83,0.1343,0,11,11 +11615,2012-05-04,2,1,5,3,0,5,1,1,0.56,0.5303,0.88,0.2239,0,14,14 +11616,2012-05-04,2,1,5,4,0,5,1,1,0.56,0.5303,0.88,0.194,0,4,4 +11617,2012-05-04,2,1,5,5,0,5,1,1,0.56,0.5303,0.88,0.194,0,24,24 +11618,2012-05-04,2,1,5,6,0,5,1,1,0.54,0.5152,0.94,0.1045,8,127,135 +11619,2012-05-04,2,1,5,7,0,5,1,1,0.56,0.5303,0.88,0,9,347,356 +11620,2012-05-04,2,1,5,8,0,5,1,1,0.56,0.5303,0.88,0,34,584,618 +11621,2012-05-04,2,1,5,9,0,5,1,1,0.6,0.5758,0.78,0,32,262,294 
+11622,2012-05-04,2,1,5,10,0,5,1,2,0.62,0.5909,0.73,0.1343,45,154,199 +11623,2012-05-04,2,1,5,11,0,5,1,2,0.7,0.6515,0.61,0,77,181,258 +11624,2012-05-04,2,1,5,12,0,5,1,3,0.62,0.5909,0.78,0.3582,46,125,171 +11625,2012-05-04,2,1,5,13,0,5,1,2,0.64,0.6061,0.73,0,71,207,278 +11626,2012-05-04,2,1,5,14,0,5,1,2,0.66,0.6212,0.74,0.2836,83,159,242 +11627,2012-05-04,2,1,5,15,0,5,1,2,0.72,0.6818,0.62,0.1045,103,240,343 +11628,2012-05-04,2,1,5,16,0,5,1,1,0.76,0.6818,0.48,0.1642,82,358,440 +11629,2012-05-04,2,1,5,17,0,5,1,1,0.7,0.6515,0.54,0.4627,96,547,643 +11630,2012-05-04,2,1,5,18,0,5,1,1,0.7,0.6515,0.54,0.4627,77,564,641 +11631,2012-05-04,2,1,5,19,0,5,1,1,0.7,0.6515,0.51,0.1343,64,388,452 +11632,2012-05-04,2,1,5,20,0,5,1,1,0.68,0.6364,0.51,0.1045,58,255,313 +11633,2012-05-04,2,1,5,21,0,5,1,1,0.64,0.6061,0.69,0.194,49,186,235 +11634,2012-05-04,2,1,5,22,0,5,1,1,0.6,0.5606,0.83,0.1343,50,182,232 +11635,2012-05-04,2,1,5,23,0,5,1,1,0.6,0.5606,0.83,0.0896,51,205,256 +11636,2012-05-05,2,1,5,0,0,6,0,1,0.6,0.5606,0.83,0,42,111,153 +11637,2012-05-05,2,1,5,1,0,6,0,1,0.6,0.5606,0.83,0,28,79,107 +11638,2012-05-05,2,1,5,2,0,6,0,1,0.58,0.5455,0.88,0,24,51,75 +11639,2012-05-05,2,1,5,3,0,6,0,1,0.58,0.5455,0.83,0,11,21,32 +11640,2012-05-05,2,1,5,4,0,6,0,1,0.56,0.5303,0.88,0.1045,0,9,9 +11641,2012-05-05,2,1,5,5,0,6,0,1,0.56,0.5303,0.88,0.0896,8,14,22 +11642,2012-05-05,2,1,5,6,0,6,0,2,0.54,0.5152,0.94,0,9,33,42 +11643,2012-05-05,2,1,5,7,0,6,0,1,0.56,0.5303,0.88,0,8,79,87 +11644,2012-05-05,2,1,5,8,0,6,0,1,0.58,0.5455,0.83,0.0896,42,155,197 +11645,2012-05-05,2,1,5,9,0,6,0,1,0.62,0.5909,0.78,0.1045,61,223,284 +11646,2012-05-05,2,1,5,10,0,6,0,1,0.66,0.6212,0.65,0.1343,148,284,432 +11647,2012-05-05,2,1,5,11,0,6,0,1,0.7,0.6515,0.58,0.0896,221,354,575 +11648,2012-05-05,2,1,5,12,0,6,0,1,0.72,0.6667,0.58,0.1045,220,330,550 +11649,2012-05-05,2,1,5,13,0,6,0,1,0.74,0.6818,0.55,0.1343,217,307,524 +11650,2012-05-05,2,1,5,14,0,6,0,3,0.7,0.6515,0.61,0.2985,187,241,428 
+11651,2012-05-05,2,1,5,15,0,6,0,3,0.66,0.6212,0.74,0.194,241,300,541 +11652,2012-05-05,2,1,5,16,0,6,0,2,0.68,0.6364,0.65,0.3881,230,310,540 +11653,2012-05-05,2,1,5,17,0,6,0,2,0.68,0.6364,0.65,0.2985,204,298,502 +11654,2012-05-05,2,1,5,18,0,6,0,2,0.66,0.6212,0.69,0.3582,156,278,434 +11655,2012-05-05,2,1,5,19,0,6,0,2,0.62,0.5909,0.73,0.3284,151,254,405 +11656,2012-05-05,2,1,5,20,0,6,0,1,0.6,0.5758,0.78,0.2537,100,209,309 +11657,2012-05-05,2,1,5,21,0,6,0,2,0.58,0.5455,0.78,0.2239,86,183,269 +11658,2012-05-05,2,1,5,22,0,6,0,2,0.58,0.5455,0.78,0.194,67,150,217 +11659,2012-05-05,2,1,5,23,0,6,0,2,0.56,0.5303,0.83,0.2836,35,114,149 +11660,2012-05-06,2,1,5,0,0,0,0,2,0.56,0.5303,0.83,0.1343,23,111,134 +11661,2012-05-06,2,1,5,1,0,0,0,2,0.54,0.5152,0.83,0.2239,37,84,121 +11662,2012-05-06,2,1,5,2,0,0,0,2,0.54,0.5152,0.83,0.2836,29,64,93 +11663,2012-05-06,2,1,5,3,0,0,0,2,0.54,0.5152,0.83,0.194,9,19,28 +11664,2012-05-06,2,1,5,4,0,0,0,3,0.52,0.5,0.83,0.1642,9,7,16 +11665,2012-05-06,2,1,5,5,0,0,0,3,0.52,0.5,0.83,0.1642,1,10,11 +11666,2012-05-06,2,1,5,6,0,0,0,3,0.52,0.5,0.83,0.2239,3,14,17 +11667,2012-05-06,2,1,5,7,0,0,0,3,0.52,0.5,0.83,0.1642,5,31,36 +11668,2012-05-06,2,1,5,8,0,0,0,2,0.5,0.4848,0.88,0.1343,23,91,114 +11669,2012-05-06,2,1,5,9,0,0,0,2,0.52,0.5,0.83,0.1642,63,128,191 +11670,2012-05-06,2,1,5,10,0,0,0,2,0.54,0.5152,0.77,0.1045,112,221,333 +11671,2012-05-06,2,1,5,11,0,0,0,1,0.56,0.5303,0.73,0.1045,144,270,414 +11672,2012-05-06,2,1,5,12,0,0,0,2,0.6,0.6061,0.64,0.0896,207,351,558 +11673,2012-05-06,2,1,5,13,0,0,0,2,0.6,0.6061,0.64,0,197,368,565 +11674,2012-05-06,2,1,5,14,0,0,0,2,0.6,0.6061,0.64,0.1045,226,292,518 +11675,2012-05-06,2,1,5,15,0,0,0,2,0.62,0.6061,0.61,0.1045,229,342,571 +11676,2012-05-06,2,1,5,16,0,0,0,2,0.62,0.6061,0.61,0,181,363,544 +11677,2012-05-06,2,1,5,17,0,0,0,2,0.62,0.6061,0.65,0.1343,195,316,511 +11678,2012-05-06,2,1,5,18,0,0,0,2,0.62,0.6061,0.61,0.194,131,332,463 +11679,2012-05-06,2,1,5,19,0,0,0,2,0.6,0.6061,0.64,0.1642,117,266,383 
+11680,2012-05-06,2,1,5,20,0,0,0,1,0.6,0.6061,0.64,0.194,73,203,276 +11681,2012-05-06,2,1,5,21,0,0,0,1,0.56,0.5303,0.73,0.194,55,148,203 +11682,2012-05-06,2,1,5,22,0,0,0,1,0.56,0.5303,0.73,0.1642,44,113,157 +11683,2012-05-06,2,1,5,23,0,0,0,1,0.52,0.5,0.77,0.194,22,80,102 +11684,2012-05-07,2,1,5,0,0,1,1,2,0.52,0.5,0.77,0.1343,13,21,34 +11685,2012-05-07,2,1,5,1,0,1,1,2,0.52,0.5,0.77,0.194,1,8,9 +11686,2012-05-07,2,1,5,2,0,1,1,1,0.5,0.4848,0.77,0.194,1,5,6 +11687,2012-05-07,2,1,5,3,0,1,1,1,0.5,0.4848,0.77,0.194,2,3,5 +11688,2012-05-07,2,1,5,4,0,1,1,2,0.46,0.4545,0.88,0.1343,0,2,2 +11689,2012-05-07,2,1,5,5,0,1,1,1,0.46,0.4545,0.88,0.1045,2,21,23 +11690,2012-05-07,2,1,5,6,0,1,1,1,0.46,0.4545,0.88,0.1045,8,134,142 +11691,2012-05-07,2,1,5,7,0,1,1,2,0.48,0.4697,0.82,0.1045,18,367,385 +11692,2012-05-07,2,1,5,8,0,1,1,2,0.5,0.4848,0.82,0.194,31,608,639 +11693,2012-05-07,2,1,5,9,0,1,1,2,0.5,0.4848,0.82,0.2239,61,289,350 +11694,2012-05-07,2,1,5,10,0,1,1,2,0.52,0.5,0.77,0.194,62,128,190 +11695,2012-05-07,2,1,5,11,0,1,1,2,0.54,0.5152,0.73,0.194,64,149,213 +11696,2012-05-07,2,1,5,12,0,1,1,2,0.56,0.5303,0.68,0.2239,74,189,263 +11697,2012-05-07,2,1,5,13,0,1,1,2,0.6,0.6212,0.53,0.3284,75,197,272 +11698,2012-05-07,2,1,5,14,0,1,1,2,0.62,0.6212,0.5,0.3582,59,180,239 +11699,2012-05-07,2,1,5,15,0,1,1,2,0.62,0.6212,0.53,0.2985,76,186,262 +11700,2012-05-07,2,1,5,16,0,1,1,2,0.6,0.6212,0.53,0.3881,68,320,388 +11701,2012-05-07,2,1,5,17,0,1,1,2,0.6,0.6212,0.53,0.3284,102,667,769 +11702,2012-05-07,2,1,5,18,0,1,1,2,0.6,0.6212,0.53,0.3582,78,602,680 +11703,2012-05-07,2,1,5,19,0,1,1,1,0.58,0.5455,0.56,0.3284,83,463,546 +11704,2012-05-07,2,1,5,20,0,1,1,1,0.54,0.5152,0.52,0.2836,46,277,323 +11705,2012-05-07,2,1,5,21,0,1,1,2,0.54,0.5152,0.45,0.2239,37,210,247 +11706,2012-05-07,2,1,5,22,0,1,1,2,0.54,0.5152,0.45,0.2537,26,147,173 +11707,2012-05-07,2,1,5,23,0,1,1,2,0.54,0.5152,0.45,0.194,21,92,113 +11708,2012-05-08,2,1,5,0,0,2,1,2,0.52,0.5,0.52,0.2537,10,28,38 
+11709,2012-05-08,2,1,5,1,0,2,1,2,0.52,0.5,0.59,0.2239,6,3,9 +11710,2012-05-08,2,1,5,2,0,2,1,2,0.52,0.5,0.63,0.2985,9,7,16 +11711,2012-05-08,2,1,5,3,0,2,1,2,0.52,0.5,0.68,0.2836,2,4,6 +11712,2012-05-08,2,1,5,4,0,2,1,2,0.52,0.5,0.68,0.3284,0,5,5 +11713,2012-05-08,2,1,5,5,0,2,1,2,0.52,0.5,0.72,0.2836,0,20,20 +11714,2012-05-08,2,1,5,6,0,2,1,1,0.5,0.4848,0.77,0.2985,7,158,165 +11715,2012-05-08,2,1,5,7,0,2,1,1,0.52,0.5,0.72,0.2836,21,442,463 +11716,2012-05-08,2,1,5,8,0,2,1,2,0.54,0.5152,0.73,0.2985,36,605,641 +11717,2012-05-08,2,1,5,9,0,2,1,2,0.56,0.5303,0.68,0.3582,37,258,295 +11718,2012-05-08,2,1,5,10,0,2,1,2,0.58,0.5455,0.68,0.4179,46,111,157 +11719,2012-05-08,2,1,5,11,0,2,1,2,0.6,0.6061,0.64,0.3881,60,157,217 +11720,2012-05-08,2,1,5,12,0,2,1,2,0.62,0.6061,0.65,0.3284,60,203,263 +11721,2012-05-08,2,1,5,13,0,2,1,3,0.64,0.6061,0.65,0.2537,53,166,219 +11722,2012-05-08,2,1,5,14,0,2,1,3,0.62,0.6061,0.69,0.3284,8,63,71 +11723,2012-05-08,2,1,5,15,0,2,1,3,0.62,0.5909,0.73,0.2537,40,94,134 +11724,2012-05-08,2,1,5,16,0,2,1,3,0.64,0.6061,0.73,0.2239,54,335,389 +11725,2012-05-08,2,1,5,17,0,2,1,2,0.64,0.6061,0.69,0.2836,77,640,717 +11726,2012-05-08,2,1,5,18,0,2,1,2,0.64,0.6061,0.69,0.2836,69,641,710 +11727,2012-05-08,2,1,5,19,0,2,1,2,0.64,0.6061,0.69,0.2836,59,399,458 +11728,2012-05-08,2,1,5,20,0,2,1,2,0.64,0.6061,0.69,0.2836,40,262,302 +11729,2012-05-08,2,1,5,21,0,2,1,2,0.64,0.6061,0.65,0.2836,21,202,223 +11730,2012-05-08,2,1,5,22,0,2,1,3,0.6,0.5758,0.78,0.2537,13,134,147 +11731,2012-05-08,2,1,5,23,0,2,1,3,0.6,0.5758,0.78,0.3284,10,53,63 +11732,2012-05-09,2,1,5,0,0,3,1,2,0.58,0.5455,0.83,0.194,8,27,35 +11733,2012-05-09,2,1,5,1,0,3,1,2,0.6,0.5758,0.78,0.2537,3,11,14 +11734,2012-05-09,2,1,5,2,0,3,1,3,0.56,0.5303,0.88,0.4478,0,1,1 +11735,2012-05-09,2,1,5,3,0,3,1,3,0.56,0.5303,0.88,0.1642,0,2,2 +11736,2012-05-09,2,1,5,4,0,3,1,3,0.56,0.5303,0.88,0.2239,0,5,5 +11737,2012-05-09,2,1,5,5,0,3,1,2,0.56,0.5303,0.88,0.1343,1,27,28 
+11738,2012-05-09,2,1,5,6,0,3,1,2,0.56,0.5303,0.88,0.1045,5,121,126 +11739,2012-05-09,2,1,5,7,0,3,1,2,0.56,0.5303,0.94,0.194,17,401,418 +11740,2012-05-09,2,1,5,8,0,3,1,2,0.6,0.5606,0.83,0.1343,28,594,622 +11741,2012-05-09,2,1,5,9,0,3,1,2,0.6,0.5606,0.83,0,40,285,325 +11742,2012-05-09,2,1,5,10,0,3,1,2,0.62,0.6061,0.65,0.2537,40,113,153 +11743,2012-05-09,2,1,5,11,0,3,1,2,0.62,0.6212,0.59,0.2985,45,156,201 +11744,2012-05-09,2,1,5,12,0,3,1,2,0.62,0.6212,0.57,0.2239,58,222,280 +11745,2012-05-09,2,1,5,13,0,3,1,2,0.64,0.6212,0.53,0.2537,50,216,266 +11746,2012-05-09,2,1,5,14,0,3,1,2,0.64,0.6212,0.53,0.2537,68,175,243 +11747,2012-05-09,2,1,5,15,0,3,1,2,0.64,0.6212,0.53,0.1045,68,191,259 +11748,2012-05-09,2,1,5,16,0,3,1,2,0.66,0.6212,0.5,0.194,68,289,357 +11749,2012-05-09,2,1,5,17,0,3,1,2,0.64,0.6212,0.53,0.1343,76,629,705 +11750,2012-05-09,2,1,5,18,0,3,1,3,0.52,0.5,0.77,0.5821,23,349,372 +11751,2012-05-09,2,1,5,19,0,3,1,3,0.5,0.4848,0.77,0.3582,2,96,98 +11752,2012-05-09,2,1,5,20,0,3,1,3,0.5,0.4848,0.82,0.2239,3,48,51 +11753,2012-05-09,2,1,5,21,0,3,1,3,0.5,0.4848,0.82,0.1642,4,30,34 +11754,2012-05-09,2,1,5,22,0,3,1,3,0.48,0.4697,0.82,0.1045,5,62,67 +11755,2012-05-09,2,1,5,23,0,3,1,3,0.48,0.4697,0.82,0.194,8,47,55 +11756,2012-05-10,2,1,5,0,0,4,1,3,0.5,0.4848,0.77,0.1642,2,31,33 +11757,2012-05-10,2,1,5,1,0,4,1,2,0.48,0.4697,0.88,0.1343,2,9,11 +11758,2012-05-10,2,1,5,2,0,4,1,2,0.48,0.4697,0.88,0.1045,0,3,3 +11759,2012-05-10,2,1,5,3,0,4,1,1,0.46,0.4545,0.88,0.2239,0,3,3 +11760,2012-05-10,2,1,5,4,0,4,1,1,0.46,0.4545,0.77,0.194,0,2,2 +11761,2012-05-10,2,1,5,5,0,4,1,1,0.46,0.4545,0.72,0.2985,1,23,24 +11762,2012-05-10,2,1,5,6,0,4,1,1,0.44,0.4394,0.72,0.2239,9,130,139 +11763,2012-05-10,2,1,5,7,0,4,1,1,0.44,0.4394,0.72,0.2985,8,393,401 +11764,2012-05-10,2,1,5,8,0,4,1,1,0.46,0.4545,0.67,0.3582,27,603,630 +11765,2012-05-10,2,1,5,9,0,4,1,1,0.5,0.4848,0.59,0.4627,62,299,361 +11766,2012-05-10,2,1,5,10,0,4,1,1,0.5,0.4848,0.51,0.4179,42,112,154 
+11767,2012-05-10,2,1,5,11,0,4,1,1,0.52,0.5,0.48,0.5224,77,179,256 +11768,2012-05-10,2,1,5,12,0,4,1,1,0.52,0.5,0.48,0.4179,65,207,272 +11769,2012-05-10,2,1,5,13,0,4,1,1,0.54,0.5152,0.39,0.4478,90,228,318 +11770,2012-05-10,2,1,5,14,0,4,1,1,0.56,0.5303,0.37,0.4179,61,186,247 +11771,2012-05-10,2,1,5,15,0,4,1,1,0.54,0.5152,0.37,0.3881,79,192,271 +11772,2012-05-10,2,1,5,16,0,4,1,1,0.58,0.5455,0.3,0.4627,87,334,421 +11773,2012-05-10,2,1,5,17,0,4,1,1,0.6,0.6212,0.31,0.3881,84,648,732 +11774,2012-05-10,2,1,5,18,0,4,1,1,0.56,0.5303,0.35,0.5224,109,661,770 +11775,2012-05-10,2,1,5,19,0,4,1,1,0.54,0.5152,0.37,0.3582,84,469,553 +11776,2012-05-10,2,1,5,20,0,4,1,1,0.52,0.5,0.4,0.3284,59,315,374 +11777,2012-05-10,2,1,5,21,0,4,1,1,0.5,0.4848,0.42,0.194,19,211,230 +11778,2012-05-10,2,1,5,22,0,4,1,1,0.5,0.4848,0.42,0.1045,36,196,232 +11779,2012-05-10,2,1,5,23,0,4,1,1,0.48,0.4697,0.48,0.1045,23,112,135 +11780,2012-05-11,2,1,5,0,0,5,1,1,0.46,0.4545,0.55,0.1045,17,47,64 +11781,2012-05-11,2,1,5,1,0,5,1,1,0.46,0.4545,0.51,0.1343,14,32,46 +11782,2012-05-11,2,1,5,2,0,5,1,1,0.46,0.4545,0.44,0.1642,15,16,31 +11783,2012-05-11,2,1,5,3,0,5,1,1,0.46,0.4545,0.41,0.2836,5,8,13 +11784,2012-05-11,2,1,5,4,0,5,1,1,0.42,0.4242,0.5,0.194,3,6,9 +11785,2012-05-11,2,1,5,5,0,5,1,1,0.42,0.4242,0.47,0.1642,3,23,26 +11786,2012-05-11,2,1,5,6,0,5,1,1,0.42,0.4242,0.5,0.2537,7,128,135 +11787,2012-05-11,2,1,5,7,0,5,1,1,0.42,0.4242,0.5,0.2537,6,345,351 +11788,2012-05-11,2,1,5,8,0,5,1,1,0.46,0.4545,0.44,0.2537,41,579,620 +11789,2012-05-11,2,1,5,9,0,5,1,1,0.48,0.4697,0.41,0.2985,34,288,322 +11790,2012-05-11,2,1,5,10,0,5,1,1,0.52,0.5,0.36,0.3284,69,141,210 +11791,2012-05-11,2,1,5,11,0,5,1,1,0.56,0.5303,0.35,0.2537,88,203,291 +11792,2012-05-11,2,1,5,12,0,5,1,1,0.56,0.5303,0.28,0.4179,89,259,348 +11793,2012-05-11,2,1,5,13,0,5,1,1,0.6,0.6061,0.26,0.3881,100,263,363 +11794,2012-05-11,2,1,5,14,0,5,1,1,0.62,0.6061,0.23,0.2239,116,218,334 +11795,2012-05-11,2,1,5,15,0,5,1,1,0.62,0.6061,0.25,0.194,140,299,439 
+11796,2012-05-11,2,1,5,16,0,5,1,1,0.62,0.6061,0.25,0.3284,128,397,525 +11797,2012-05-11,2,1,5,17,0,5,1,1,0.64,0.6061,0.23,0.2537,102,677,779 +11798,2012-05-11,2,1,5,18,0,5,1,1,0.64,0.6061,0.23,0.2836,78,518,596 +11799,2012-05-11,2,1,5,19,0,5,1,1,0.64,0.6061,0.23,0.2537,73,430,503 +11800,2012-05-11,2,1,5,20,0,5,1,1,0.62,0.6061,0.25,0.1343,64,277,341 +11801,2012-05-11,2,1,5,21,0,5,1,1,0.6,0.6061,0.28,0.1045,45,225,270 +11802,2012-05-11,2,1,5,22,0,5,1,1,0.56,0.5303,0.35,0.2239,44,190,234 +11803,2012-05-11,2,1,5,23,0,5,1,1,0.54,0.5152,0.37,0.194,38,142,180 +11804,2012-05-12,2,1,5,0,0,6,0,1,0.52,0.5,0.42,0.1642,25,111,136 +11805,2012-05-12,2,1,5,1,0,6,0,1,0.5,0.4848,0.48,0,14,79,93 +11806,2012-05-12,2,1,5,2,0,6,0,1,0.46,0.4545,0.59,0.1343,10,46,56 +11807,2012-05-12,2,1,5,3,0,6,0,1,0.48,0.4697,0.59,0,14,20,34 +11808,2012-05-12,2,1,5,4,0,6,0,1,0.44,0.4394,0.67,0.1045,3,6,9 +11809,2012-05-12,2,1,5,5,0,6,0,1,0.42,0.4242,0.67,0.1343,5,8,13 +11810,2012-05-12,2,1,5,6,0,6,0,1,0.42,0.4242,0.67,0.0896,10,23,33 +11811,2012-05-12,2,1,5,7,0,6,0,1,0.42,0.4242,0.71,0.1343,10,57,67 +11812,2012-05-12,2,1,5,8,0,6,0,1,0.52,0.5,0.48,0,22,156,178 +11813,2012-05-12,2,1,5,9,0,6,0,1,0.54,0.5152,0.49,0,87,248,335 +11814,2012-05-12,2,1,5,10,0,6,0,1,0.56,0.5303,0.49,0.0896,132,279,411 +11815,2012-05-12,2,1,5,11,0,6,0,1,0.62,0.6212,0.41,0.0896,157,365,522 +11816,2012-05-12,2,1,5,12,0,6,0,1,0.64,0.6212,0.36,0.1642,206,353,559 +11817,2012-05-12,2,1,5,13,0,6,0,1,0.64,0.6212,0.36,0.194,293,366,659 +11818,2012-05-12,2,1,5,14,0,6,0,1,0.66,0.6212,0.36,0.194,257,358,615 +11819,2012-05-12,2,1,5,15,0,6,0,1,0.68,0.6212,0.24,0.194,269,321,590 +11820,2012-05-12,2,1,5,16,0,6,0,2,0.7,0.6364,0.21,0.194,254,337,591 +11821,2012-05-12,2,1,5,17,0,6,0,2,0.66,0.6212,0.36,0.194,233,343,576 +11822,2012-05-12,2,1,5,18,0,6,0,2,0.66,0.6212,0.31,0.194,164,382,546 +11823,2012-05-12,2,1,5,19,0,6,0,2,0.64,0.6212,0.36,0.1642,166,273,439 +11824,2012-05-12,2,1,5,20,0,6,0,1,0.62,0.6212,0.46,0.1642,113,174,287 
+11825,2012-05-12,2,1,5,21,0,6,0,2,0.6,0.6061,0.6,0.1343,74,156,230 +11826,2012-05-12,2,1,5,22,0,6,0,1,0.56,0.5303,0.68,0.1343,62,212,274 +11827,2012-05-12,2,1,5,23,0,6,0,1,0.58,0.5455,0.56,0.0896,42,134,176 +11828,2012-05-13,2,1,5,0,0,0,0,1,0.56,0.5303,0.64,0.1642,19,79,98 +11829,2012-05-13,2,1,5,1,0,0,0,2,0.54,0.5152,0.68,0.1343,22,72,94 +11830,2012-05-13,2,1,5,2,0,0,0,2,0.54,0.5152,0.68,0.1642,22,60,82 +11831,2012-05-13,2,1,5,3,0,0,0,2,0.54,0.5152,0.64,0.1343,2,26,28 +11832,2012-05-13,2,1,5,4,0,0,0,2,0.54,0.5152,0.68,0.194,2,7,9 +11833,2012-05-13,2,1,5,5,0,0,0,1,0.52,0.5,0.72,0.194,4,8,12 +11834,2012-05-13,2,1,5,6,0,0,0,1,0.52,0.5,0.68,0.1642,6,15,21 +11835,2012-05-13,2,1,5,7,0,0,0,1,0.52,0.5,0.68,0.1642,18,42,60 +11836,2012-05-13,2,1,5,8,0,0,0,1,0.56,0.5303,0.64,0.1642,32,124,156 +11837,2012-05-13,2,1,5,9,0,0,0,1,0.6,0.6212,0.56,0.1343,79,143,222 +11838,2012-05-13,2,1,5,10,0,0,0,1,0.6,0.6061,0.6,0.2537,128,222,350 +11839,2012-05-13,2,1,5,11,0,0,0,1,0.62,0.6212,0.57,0.2537,150,278,428 +11840,2012-05-13,2,1,5,12,0,0,0,1,0.64,0.6212,0.57,0.2239,189,342,531 +11841,2012-05-13,2,1,5,13,0,0,0,1,0.68,0.6364,0.54,0.2537,255,347,602 +11842,2012-05-13,2,1,5,14,0,0,0,1,0.7,0.6515,0.48,0.1343,228,324,552 +11843,2012-05-13,2,1,5,15,0,0,0,1,0.72,0.6515,0.45,0.3582,190,309,499 +11844,2012-05-13,2,1,5,16,0,0,0,1,0.72,0.6515,0.42,0.2537,225,339,564 +11845,2012-05-13,2,1,5,17,0,0,0,1,0.7,0.6364,0.45,0.4179,191,287,478 +11846,2012-05-13,2,1,5,18,0,0,0,2,0.68,0.6364,0.44,0.3582,129,260,389 +11847,2012-05-13,2,1,5,19,0,0,0,2,0.66,0.6212,0.5,0.4179,107,232,339 +11848,2012-05-13,2,1,5,20,0,0,0,2,0.66,0.6212,0.5,0.2836,67,163,230 +11849,2012-05-13,2,1,5,21,0,0,0,2,0.64,0.6212,0.53,0.194,48,121,169 +11850,2012-05-13,2,1,5,22,0,0,0,2,0.62,0.6061,0.61,0.1642,42,89,131 +11851,2012-05-13,2,1,5,23,0,0,0,2,0.62,0.6212,0.57,0.2239,17,57,74 +11852,2012-05-14,2,1,5,0,0,1,1,2,0.6,0.6212,0.56,0.2537,21,14,35 +11853,2012-05-14,2,1,5,1,0,1,1,2,0.6,0.6061,0.6,0.2836,5,6,11 
+11854,2012-05-14,2,1,5,2,0,1,1,1,0.6,0.6061,0.6,0.2985,1,1,2 +11855,2012-05-14,2,1,5,3,0,1,1,2,0.58,0.5455,0.68,0.3284,0,2,2 +11856,2012-05-14,2,1,5,4,0,1,1,2,0.54,0.5152,0.77,0.2239,3,3,6 +11857,2012-05-14,2,1,5,5,0,1,1,2,0.54,0.5152,0.77,0.1642,1,25,26 +11858,2012-05-14,2,1,5,6,0,1,1,3,0.52,0.5,0.94,0.2985,3,62,65 +11859,2012-05-14,2,1,5,7,0,1,1,3,0.52,0.5,0.94,0.194,3,72,75 +11860,2012-05-14,2,1,5,8,0,1,1,3,0.54,0.5152,0.88,0.2239,1,155,156 +11861,2012-05-14,2,1,5,9,0,1,1,3,0.54,0.5152,0.88,0.2836,5,105,110 +11862,2012-05-14,2,1,5,10,0,1,1,3,0.54,0.5152,0.88,0.2537,6,53,59 +11863,2012-05-14,2,1,5,11,0,1,1,2,0.56,0.5303,0.83,0.2239,12,67,79 +11864,2012-05-14,2,1,5,12,0,1,1,2,0.56,0.5303,0.88,0.2836,53,132,185 +11865,2012-05-14,2,1,5,13,0,1,1,2,0.62,0.5909,0.73,0.2239,33,143,176 +11866,2012-05-14,2,1,5,14,0,1,1,2,0.62,0.5909,0.73,0.2537,43,128,171 +11867,2012-05-14,2,1,5,15,0,1,1,2,0.62,0.5909,0.73,0.1642,40,156,196 +11868,2012-05-14,2,1,5,16,0,1,1,3,0.62,0.6061,0.69,0.2836,20,91,111 +11869,2012-05-14,2,1,5,17,0,1,1,3,0.56,0.5303,0.88,0.2537,25,204,229 +11870,2012-05-14,2,1,5,18,0,1,1,2,0.58,0.5455,0.83,0,17,283,300 +11871,2012-05-14,2,1,5,19,0,1,1,3,0.58,0.5455,0.88,0.1343,10,294,304 +11872,2012-05-14,2,1,5,20,0,1,1,2,0.58,0.5455,0.88,0.194,11,178,189 +11873,2012-05-14,2,1,5,21,0,1,1,2,0.58,0.5455,0.83,0.0896,7,145,152 +11874,2012-05-14,2,1,5,22,0,1,1,2,0.58,0.5455,0.78,0.0896,11,118,129 +11875,2012-05-14,2,1,5,23,0,1,1,2,0.58,0.5455,0.78,0.1045,11,64,75 +11876,2012-05-15,2,1,5,0,0,2,1,2,0.56,0.5303,0.88,0.1343,7,25,32 +11877,2012-05-15,2,1,5,1,0,2,1,3,0.56,0.5303,0.88,0.1343,5,14,19 +11878,2012-05-15,2,1,5,2,0,2,1,2,0.56,0.5303,0.88,0.1343,1,3,4 +11879,2012-05-15,2,1,5,3,0,2,1,3,0.58,0.5455,0.83,0.1343,1,5,6 +11880,2012-05-15,2,1,5,4,0,2,1,3,0.56,0.5303,0.94,0,1,4,5 +11881,2012-05-15,2,1,5,5,0,2,1,3,0.56,0.5303,0.94,0.2239,0,8,8 +11882,2012-05-15,2,1,5,6,0,2,1,3,0.56,0.5303,0.94,0.2239,2,22,24 +11883,2012-05-15,2,1,5,7,0,2,1,3,0.56,0.5303,0.94,0.1343,1,91,92 
+11884,2012-05-15,2,1,5,8,0,2,1,2,0.58,0.5455,0.88,0.0896,8,401,409 +11885,2012-05-15,2,1,5,9,0,2,1,2,0.58,0.5455,0.88,0.0896,28,327,355 +11886,2012-05-15,2,1,5,10,0,2,1,1,0.62,0.5909,0.78,0.1045,27,144,171 +11887,2012-05-15,2,1,5,11,0,2,1,1,0.64,0.5909,0.78,0.194,38,186,224 +11888,2012-05-15,2,1,5,12,0,2,1,1,0.64,0.6061,0.73,0.194,44,220,264 +11889,2012-05-15,2,1,5,13,0,2,1,3,0.64,0.6061,0.73,0.194,35,204,239 +11890,2012-05-15,2,1,5,14,0,2,1,2,0.64,0.6061,0.73,0.194,39,145,184 +11891,2012-05-15,2,1,5,15,0,2,1,2,0.7,0.6515,0.61,0.0896,51,205,256 +11892,2012-05-15,2,1,5,16,0,2,1,1,0.68,0.6364,0.65,0.1642,74,300,374 +11893,2012-05-15,2,1,5,17,0,2,1,1,0.68,0.6364,0.65,0.1045,75,603,678 +11894,2012-05-15,2,1,5,18,0,2,1,1,0.7,0.6515,0.61,0.0896,68,665,733 +11895,2012-05-15,2,1,5,19,0,2,1,1,0.66,0.6212,0.69,0.1642,57,455,512 +11896,2012-05-15,2,1,5,20,0,2,1,3,0.64,0.6061,0.73,0.1343,30,233,263 +11897,2012-05-15,2,1,5,21,0,2,1,3,0.64,0.6061,0.73,0.2836,11,98,109 +11898,2012-05-15,2,1,5,22,0,2,1,1,0.58,0.5455,0.78,0.2239,9,74,83 +11899,2012-05-15,2,1,5,23,0,2,1,3,0.56,0.5303,0.88,0.1045,13,58,71 +11900,2012-05-16,2,1,5,0,0,3,1,2,0.58,0.5455,0.88,0,12,27,39 +11901,2012-05-16,2,1,5,1,0,3,1,2,0.58,0.5455,0.88,0.1343,7,14,21 +11902,2012-05-16,2,1,5,2,0,3,1,1,0.56,0.5303,0.94,0.194,5,14,19 +11903,2012-05-16,2,1,5,3,0,3,1,1,0.56,0.5303,0.88,0.1045,1,5,6 +11904,2012-05-16,2,1,5,4,0,3,1,1,0.54,0.5152,0.94,0.1343,1,3,4 +11905,2012-05-16,2,1,5,5,0,3,1,1,0.54,0.5152,0.94,0.1045,4,34,38 +11906,2012-05-16,2,1,5,6,0,3,1,1,0.54,0.5152,0.94,0.0896,8,150,158 +11907,2012-05-16,2,1,5,7,0,3,1,1,0.56,0.5303,0.83,0,14,426,440 +11908,2012-05-16,2,1,5,8,0,3,1,1,0.58,0.5455,0.83,0,33,617,650 +11909,2012-05-16,2,1,5,9,0,3,1,2,0.6,0.5758,0.78,0.1343,33,314,347 +11910,2012-05-16,2,1,5,10,0,3,1,1,0.66,0.6212,0.61,0,43,149,192 +11911,2012-05-16,2,1,5,11,0,3,1,1,0.7,0.6515,0.54,0.1343,55,212,267 +11912,2012-05-16,2,1,5,12,0,3,1,1,0.7,0.6515,0.54,0.194,59,271,330 
+11913,2012-05-16,2,1,5,13,0,3,1,1,0.72,0.6515,0.42,0,77,273,350 +11914,2012-05-16,2,1,5,14,0,3,1,1,0.72,0.6515,0.42,0.0896,42,221,263 +11915,2012-05-16,2,1,5,15,0,3,1,2,0.72,0.6667,0.48,0.1045,52,222,274 +11916,2012-05-16,2,1,5,16,0,3,1,1,0.72,0.6515,0.45,0.1045,70,376,446 +11917,2012-05-16,2,1,5,17,0,3,1,1,0.72,0.6667,0.51,0.194,104,769,873 +11918,2012-05-16,2,1,5,18,0,3,1,1,0.72,0.6667,0.51,0.2239,97,749,846 +11919,2012-05-16,2,1,5,19,0,3,1,1,0.7,0.6515,0.54,0.2836,91,499,590 +11920,2012-05-16,2,1,5,20,0,3,1,1,0.66,0.6212,0.65,0.2537,61,398,459 +11921,2012-05-16,2,1,5,21,0,3,1,1,0.64,0.6061,0.73,0.194,63,330,393 +11922,2012-05-16,2,1,5,22,0,3,1,1,0.64,0.6061,0.73,0.1642,33,253,286 +11923,2012-05-16,2,1,5,23,0,3,1,1,0.62,0.5909,0.78,0.1045,26,107,133 +11924,2012-05-17,2,1,5,0,0,4,1,1,0.6,0.5758,0.78,0.1343,30,49,79 +11925,2012-05-17,2,1,5,1,0,4,1,1,0.6,0.5606,0.83,0.1642,12,16,28 +11926,2012-05-17,2,1,5,2,0,4,1,1,0.6,0.5758,0.78,0.1642,8,8,16 +11927,2012-05-17,2,1,5,3,0,4,1,1,0.6,0.5909,0.73,0.2537,0,3,3 +11928,2012-05-17,2,1,5,4,0,4,1,1,0.6,0.6061,0.64,0.3284,3,13,16 +11929,2012-05-17,2,1,5,5,0,4,1,1,0.56,0.5303,0.68,0.3284,1,34,35 +11930,2012-05-17,2,1,5,6,0,4,1,1,0.54,0.5152,0.73,0.3582,10,149,159 +11931,2012-05-17,2,1,5,7,0,4,1,1,0.54,0.5152,0.73,0.3881,25,449,474 +11932,2012-05-17,2,1,5,8,0,4,1,1,0.52,0.5,0.55,0.4478,29,605,634 +11933,2012-05-17,2,1,5,9,0,4,1,1,0.54,0.5152,0.52,0.4179,57,289,346 +11934,2012-05-17,2,1,5,10,0,4,1,1,0.56,0.5303,0.52,0.2537,43,162,205 +11935,2012-05-17,2,1,5,11,0,4,1,1,0.58,0.5455,0.49,0.2239,77,202,279 +11936,2012-05-17,2,1,5,12,0,4,1,1,0.6,0.6212,0.43,0.194,72,276,348 +11937,2012-05-17,2,1,5,13,0,4,1,1,0.62,0.6212,0.41,0.194,86,241,327 +11938,2012-05-17,2,1,5,14,0,4,1,1,0.64,0.6212,0.38,0.194,68,202,270 +11939,2012-05-17,2,1,5,15,0,4,1,1,0.64,0.6212,0.36,0.2537,83,233,316 +11940,2012-05-17,2,1,5,16,0,4,1,1,0.66,0.6212,0.31,0.194,86,344,430 +11941,2012-05-17,2,1,5,17,0,4,1,1,0.66,0.6212,0.31,0.1343,133,719,852 
+11942,2012-05-17,2,1,5,18,0,4,1,1,0.66,0.6212,0.27,0.1642,134,734,868 +11943,2012-05-17,2,1,5,19,0,4,1,1,0.64,0.6212,0.31,0.194,86,451,537 +11944,2012-05-17,2,1,5,20,0,4,1,1,0.6,0.6212,0.4,0.194,83,363,446 +11945,2012-05-17,2,1,5,21,0,4,1,1,0.58,0.5455,0.4,0.1045,45,254,299 +11946,2012-05-17,2,1,5,22,0,4,1,1,0.56,0.5303,0.43,0.0896,42,209,251 +11947,2012-05-17,2,1,5,23,0,4,1,1,0.54,0.5152,0.49,0.1343,29,137,166 +11948,2012-05-18,2,1,5,0,0,5,1,1,0.52,0.5,0.59,0.1045,13,57,70 +11949,2012-05-18,2,1,5,1,0,5,1,1,0.5,0.4848,0.59,0.0896,16,33,49 +11950,2012-05-18,2,1,5,2,0,5,1,1,0.48,0.4697,0.67,0.1343,12,9,21 +11951,2012-05-18,2,1,5,3,0,5,1,1,0.46,0.4545,0.72,0.0896,6,8,14 +11952,2012-05-18,2,1,5,4,0,5,1,1,0.46,0.4545,0.77,0.1642,0,11,11 +11953,2012-05-18,2,1,5,5,0,5,1,1,0.46,0.4545,0.77,0.1045,1,33,34 +11954,2012-05-18,2,1,5,6,0,5,1,1,0.44,0.4394,0.77,0.1343,14,168,182 +11955,2012-05-18,2,1,5,7,0,5,1,1,0.46,0.4545,0.82,0.1045,40,475,515 +11956,2012-05-18,2,1,5,8,0,5,1,1,0.52,0.5,0.59,0.1343,49,696,745 +11957,2012-05-18,2,1,5,9,0,5,1,1,0.56,0.5303,0.56,0.1045,57,304,361 +11958,2012-05-18,2,1,5,10,0,5,1,1,0.6,0.6212,0.46,0.1343,74,181,255 +11959,2012-05-18,2,1,5,11,0,5,1,1,0.62,0.6212,0.43,0,109,225,334 +11960,2012-05-18,2,1,5,12,0,5,1,1,0.62,0.6212,0.41,0.194,108,252,360 +11961,2012-05-18,2,1,5,13,0,5,1,1,0.64,0.6212,0.36,0.1642,98,297,395 +11962,2012-05-18,2,1,5,14,0,5,1,1,0.66,0.6212,0.34,0.1343,108,242,350 +11963,2012-05-18,2,1,5,15,0,5,1,1,0.66,0.6212,0.36,0.194,131,260,391 +11964,2012-05-18,2,1,5,16,0,5,1,1,0.66,0.6212,0.34,0.2537,151,417,568 +11965,2012-05-18,2,1,5,17,0,5,1,1,0.66,0.6212,0.34,0.1343,124,688,812 +11966,2012-05-18,2,1,5,18,0,5,1,1,0.64,0.6212,0.38,0.2239,99,570,669 +11967,2012-05-18,2,1,5,19,0,5,1,1,0.62,0.6212,0.41,0.194,91,392,483 +11968,2012-05-18,2,1,5,20,0,5,1,1,0.62,0.6212,0.41,0.1642,73,264,337 +11969,2012-05-18,2,1,5,21,0,5,1,1,0.6,0.6212,0.4,0,49,209,258 +11970,2012-05-18,2,1,5,22,0,5,1,1,0.56,0.5303,0.52,0.3284,57,194,251 
+11971,2012-05-18,2,1,5,23,0,5,1,1,0.52,0.5,0.55,0,41,133,174 +11972,2012-05-19,2,1,5,0,0,6,0,1,0.52,0.5,0.48,0.1045,30,118,148 +11973,2012-05-19,2,1,5,1,0,6,0,1,0.5,0.4848,0.48,0,20,84,104 +11974,2012-05-19,2,1,5,2,0,6,0,1,0.5,0.4848,0.51,0.1045,13,56,69 +11975,2012-05-19,2,1,5,3,0,6,0,1,0.5,0.4848,0.48,0.1045,10,23,33 +11976,2012-05-19,2,1,5,4,0,6,0,1,0.48,0.4697,0.59,0.0896,6,8,14 +11977,2012-05-19,2,1,5,5,0,6,0,1,0.46,0.4545,0.67,0,1,11,12 +11978,2012-05-19,2,1,5,6,0,6,0,1,0.44,0.4394,0.72,0.0896,12,38,50 +11979,2012-05-19,2,1,5,7,0,6,0,1,0.46,0.4545,0.67,0.1045,28,67,95 +11980,2012-05-19,2,1,5,8,0,6,0,1,0.52,0.5,0.52,0.1642,33,162,195 +11981,2012-05-19,2,1,5,9,0,6,0,1,0.56,0.5303,0.43,0.1642,79,213,292 +11982,2012-05-19,2,1,5,10,0,6,0,1,0.56,0.5303,0.43,0,177,275,452 +11983,2012-05-19,2,1,5,11,0,6,0,1,0.62,0.6212,0.38,0.0896,235,351,586 +11984,2012-05-19,2,1,5,12,0,6,0,1,0.66,0.6212,0.36,0,276,366,642 +11985,2012-05-19,2,1,5,13,0,6,0,1,0.7,0.6364,0.37,0.1343,332,372,704 +11986,2012-05-19,2,1,5,14,0,6,0,1,0.72,0.6515,0.3,0.0896,361,369,730 +11987,2012-05-19,2,1,5,15,0,6,0,1,0.72,0.6515,0.3,0.1045,356,316,672 +11988,2012-05-19,2,1,5,16,0,6,0,1,0.74,0.6515,0.3,0.1045,331,311,642 +11989,2012-05-19,2,1,5,17,0,6,0,1,0.74,0.6515,0.3,0.0896,279,347,626 +11990,2012-05-19,2,1,5,18,0,6,0,1,0.74,0.6515,0.33,0.0896,254,391,645 +11991,2012-05-19,2,1,5,19,0,6,0,1,0.7,0.6364,0.39,0.1343,203,229,432 +11992,2012-05-19,2,1,5,20,0,6,0,1,0.68,0.6364,0.41,0.0896,118,197,315 +11993,2012-05-19,2,1,5,21,0,6,0,1,0.64,0.6212,0.5,0,81,178,259 +11994,2012-05-19,2,1,5,22,0,6,0,1,0.64,0.6212,0.47,0,104,234,338 +11995,2012-05-19,2,1,5,23,0,6,0,1,0.6,0.6212,0.56,0.1642,71,168,239 +11996,2012-05-20,2,1,5,0,0,0,0,1,0.58,0.5455,0.53,0.1045,42,128,170 +11997,2012-05-20,2,1,5,1,0,0,0,1,0.56,0.5303,0.52,0,28,102,130 +11998,2012-05-20,2,1,5,2,0,0,0,1,0.56,0.5303,0.52,0,36,62,98 +11999,2012-05-20,2,1,5,3,0,0,0,1,0.54,0.5152,0.56,0.0896,26,40,66 
+12000,2012-05-20,2,1,5,4,0,0,0,1,0.52,0.5,0.68,0.0896,2,14,16 +12001,2012-05-20,2,1,5,5,0,0,0,1,0.5,0.4848,0.72,0.1045,1,7,8 +12002,2012-05-20,2,1,5,6,0,0,0,1,0.5,0.4848,0.63,0.1343,4,21,25 +12003,2012-05-20,2,1,5,7,0,0,0,1,0.52,0.5,0.68,0.194,35,55,90 +12004,2012-05-20,2,1,5,8,0,0,0,1,0.56,0.5303,0.56,0.1642,51,120,171 +12005,2012-05-20,2,1,5,9,0,0,0,1,0.62,0.6212,0.32,0.2537,129,184,313 +12006,2012-05-20,2,1,5,10,0,0,0,1,0.66,0.6212,0.34,0.2985,174,258,432 +12007,2012-05-20,2,1,5,11,0,0,0,1,0.66,0.6212,0.36,0.3284,258,323,581 +12008,2012-05-20,2,1,5,12,0,0,0,1,0.68,0.6364,0.36,0.3284,247,390,637 +12009,2012-05-20,2,1,5,13,0,0,0,1,0.7,0.6364,0.37,0.2836,244,363,607 +12010,2012-05-20,2,1,5,14,0,0,0,1,0.72,0.6515,0.39,0.3881,236,307,543 +12011,2012-05-20,2,1,5,15,0,0,0,2,0.7,0.6364,0.39,0.4179,246,256,502 +12012,2012-05-20,2,1,5,16,0,0,0,1,0.72,0.6515,0.42,0.3582,238,339,577 +12013,2012-05-20,2,1,5,17,0,0,0,1,0.72,0.6515,0.45,0.3284,209,340,549 +12014,2012-05-20,2,1,5,18,0,0,0,1,0.7,0.6515,0.51,0.3284,146,328,474 +12015,2012-05-20,2,1,5,19,0,0,0,1,0.66,0.6212,0.61,0.4179,142,260,402 +12016,2012-05-20,2,1,5,20,0,0,0,1,0.66,0.6212,0.61,0.4179,83,191,274 +12017,2012-05-20,2,1,5,21,0,0,0,1,0.64,0.6061,0.69,0.3881,51,182,233 +12018,2012-05-20,2,1,5,22,0,0,0,1,0.62,0.5909,0.73,0.3284,44,101,145 +12019,2012-05-20,2,1,5,23,0,0,0,3,0.6,0.5758,0.78,0.3582,32,54,86 +12020,2012-05-21,2,1,5,0,0,1,1,3,0.58,0.5455,0.88,0.2985,12,28,40 +12021,2012-05-21,2,1,5,1,0,1,1,3,0.58,0.5455,0.88,0.3582,4,11,15 +12022,2012-05-21,2,1,5,2,0,1,1,3,0.56,0.5303,0.94,0.2537,2,9,11 +12023,2012-05-21,2,1,5,3,0,1,1,2,0.56,0.5303,0.88,0.2985,0,2,2 +12024,2012-05-21,2,1,5,4,0,1,1,3,0.56,0.5303,0.88,0.2985,2,5,7 +12025,2012-05-21,2,1,5,5,0,1,1,3,0.56,0.5303,0.88,0.2985,1,14,15 +12026,2012-05-21,2,1,5,6,0,1,1,3,0.56,0.5303,0.88,0.2985,3,76,79 +12027,2012-05-21,2,1,5,7,0,1,1,3,0.56,0.5303,0.88,0.2985,7,146,153 +12028,2012-05-21,2,1,5,8,0,1,1,3,0.56,0.5303,0.88,0.2537,12,258,270 
+12029,2012-05-21,2,1,5,9,0,1,1,3,0.56,0.5303,0.88,0.2239,14,172,186 +12030,2012-05-21,2,1,5,10,0,1,1,3,0.58,0.5455,0.88,0.2836,28,71,99 +12031,2012-05-21,2,1,5,11,0,1,1,2,0.6,0.5606,0.83,0.1642,24,82,106 +12032,2012-05-21,2,1,5,12,0,1,1,2,0.6,0.5606,0.83,0.2537,46,124,170 +12033,2012-05-21,2,1,5,13,0,1,1,3,0.6,0.5606,0.83,0.2537,49,135,184 +12034,2012-05-21,2,1,5,14,0,1,1,3,0.6,0.5606,0.83,0.194,45,135,180 +12035,2012-05-21,2,1,5,15,0,1,1,2,0.64,0.6061,0.73,0.194,67,158,225 +12036,2012-05-21,2,1,5,16,0,1,1,2,0.64,0.6061,0.73,0.2537,46,269,315 +12037,2012-05-21,2,1,5,17,0,1,1,2,0.66,0.6212,0.69,0.2537,40,468,508 +12038,2012-05-21,2,1,5,18,0,1,1,1,0.64,0.6061,0.73,0.1045,52,478,530 +12039,2012-05-21,2,1,5,19,0,1,1,1,0.66,0.6212,0.69,0.0896,57,391,448 +12040,2012-05-21,2,1,5,20,0,1,1,1,0.64,0.6061,0.69,0.1343,44,292,336 +12041,2012-05-21,2,1,5,21,0,1,1,1,0.62,0.6061,0.69,0.0896,31,210,241 +12042,2012-05-21,2,1,5,22,0,1,1,2,0.62,0.5909,0.73,0.1642,26,116,142 +12043,2012-05-21,2,1,5,23,0,1,1,1,0.62,0.5909,0.73,0.2836,18,79,97 +12044,2012-05-22,2,1,5,0,0,2,1,1,0.58,0.5455,0.83,0.2537,10,26,36 +12045,2012-05-22,2,1,5,1,0,2,1,2,0.58,0.5455,0.83,0.1343,11,16,27 +12046,2012-05-22,2,1,5,2,0,2,1,2,0.58,0.5455,0.83,0.1045,1,10,11 +12047,2012-05-22,2,1,5,3,0,2,1,2,0.58,0.5455,0.83,0.1045,0,5,5 +12048,2012-05-22,2,1,5,4,0,2,1,2,0.56,0.5303,0.88,0.1642,0,7,7 +12049,2012-05-22,2,1,5,5,0,2,1,3,0.56,0.5303,0.88,0.194,0,17,17 +12050,2012-05-22,2,1,5,6,0,2,1,3,0.56,0.5303,0.88,0.1045,8,129,137 +12051,2012-05-22,2,1,5,7,0,2,1,3,0.56,0.5303,0.88,0.1045,9,315,324 +12052,2012-05-22,2,1,5,8,0,2,1,2,0.56,0.5303,0.88,0.1343,20,570,590 +12053,2012-05-22,2,1,5,9,0,2,1,2,0.6,0.5606,0.83,0.1045,46,287,333 +12054,2012-05-22,2,1,5,10,0,2,1,2,0.6,0.5758,0.78,0.0896,52,148,200 +12055,2012-05-22,2,1,5,11,0,2,1,2,0.6,0.5758,0.78,0,39,150,189 +12056,2012-05-22,2,1,5,12,0,2,1,2,0.62,0.5909,0.78,0.1045,76,178,254 +12057,2012-05-22,2,1,5,13,0,2,1,2,0.64,0.6061,0.73,0,48,195,243 
+12058,2012-05-22,2,1,5,14,0,2,1,2,0.64,0.6061,0.73,0.1045,73,149,222 +12059,2012-05-22,2,1,5,15,0,2,1,1,0.7,0.6515,0.61,0.0896,54,191,245 +12060,2012-05-22,2,1,5,16,0,2,1,1,0.68,0.6364,0.61,0.1045,70,316,386 +12061,2012-05-22,2,1,5,17,0,2,1,1,0.72,0.6667,0.54,0,69,716,785 +12062,2012-05-22,2,1,5,18,0,2,1,1,0.68,0.6364,0.61,0.1343,76,709,785 +12063,2012-05-22,2,1,5,19,0,2,1,1,0.66,0.6212,0.69,0.2239,44,320,364 +12064,2012-05-22,2,1,5,20,0,2,1,1,0.64,0.6061,0.73,0.2537,54,319,373 +12065,2012-05-22,2,1,5,21,0,2,1,1,0.62,0.5909,0.78,0.1343,28,233,261 +12066,2012-05-22,2,1,5,22,0,2,1,1,0.62,0.5909,0.73,0.0896,17,157,174 +12067,2012-05-22,2,1,5,23,0,2,1,1,0.62,0.5909,0.73,0.1045,14,91,105 +12068,2012-05-23,2,1,5,0,0,3,1,1,0.62,0.5909,0.78,0.1343,19,28,47 +12069,2012-05-23,2,1,5,1,0,3,1,1,0.6,0.5606,0.83,0.1642,23,9,32 +12070,2012-05-23,2,1,5,2,0,3,1,2,0.6,0.5606,0.83,0.1642,2,10,12 +12071,2012-05-23,2,1,5,3,0,3,1,1,0.6,0.5758,0.78,0.1045,0,5,5 +12072,2012-05-23,2,1,5,4,0,3,1,3,0.58,0.5455,0.83,0.194,0,3,3 +12073,2012-05-23,2,1,5,5,0,3,1,3,0.56,0.5303,0.88,0.2537,6,29,35 +12074,2012-05-23,2,1,5,6,0,3,1,1,0.56,0.5303,0.88,0.1045,6,154,160 +12075,2012-05-23,2,1,5,7,0,3,1,1,0.58,0.5455,0.83,0,16,452,468 +12076,2012-05-23,2,1,5,8,0,3,1,1,0.58,0.5455,0.83,0,38,681,719 +12077,2012-05-23,2,1,5,9,0,3,1,2,0.62,0.5909,0.78,0,39,258,297 +12078,2012-05-23,2,1,5,10,0,3,1,1,0.62,0.5909,0.78,0.0896,52,139,191 +12079,2012-05-23,2,1,5,11,0,3,1,1,0.64,0.6061,0.73,0,46,151,197 +12080,2012-05-23,2,1,5,12,0,3,1,2,0.66,0.6212,0.69,0.0896,42,206,248 +12081,2012-05-23,2,1,5,13,0,3,1,2,0.66,0.6212,0.74,0.1642,69,201,270 +12082,2012-05-23,2,1,5,14,0,3,1,1,0.68,0.6364,0.65,0.1343,67,181,248 +12083,2012-05-23,2,1,5,15,0,3,1,2,0.7,0.6515,0.61,0,65,192,257 +12084,2012-05-23,2,1,5,16,0,3,1,3,0.7,0.6515,0.61,0.0896,62,216,278 +12085,2012-05-23,2,1,5,17,0,3,1,3,0.64,0.6061,0.69,0.1045,45,240,285 +12086,2012-05-23,2,1,5,18,0,3,1,3,0.64,0.6061,0.69,0.1045,31,296,327 
+12087,2012-05-23,2,1,5,19,0,3,1,2,0.62,0.5758,0.83,0.1642,40,336,376 +12088,2012-05-23,2,1,5,20,0,3,1,2,0.62,0.5758,0.83,0.0896,28,267,295 +12089,2012-05-23,2,1,5,21,0,3,1,2,0.62,0.5758,0.83,0,24,221,245 +12090,2012-05-23,2,1,5,22,0,3,1,2,0.62,0.5758,0.83,0.1642,23,136,159 +12091,2012-05-23,2,1,5,23,0,3,1,1,0.6,0.5606,0.83,0.1343,23,83,106 +12092,2012-05-24,2,1,5,0,0,4,1,1,0.6,0.5606,0.83,0.0896,7,49,56 +12093,2012-05-24,2,1,5,1,0,4,1,1,0.6,0.5455,0.88,0.0896,9,21,30 +12094,2012-05-24,2,1,5,2,0,4,1,1,0.6,0.5455,0.88,0,1,15,16 +12095,2012-05-24,2,1,5,3,0,4,1,2,0.6,0.5455,0.88,0.1045,1,3,4 +12096,2012-05-24,2,1,5,4,0,4,1,2,0.6,0.5455,0.88,0.1642,0,5,5 +12097,2012-05-24,2,1,5,5,0,4,1,2,0.6,0.5455,0.88,0.1642,1,29,30 +12098,2012-05-24,2,1,5,6,0,4,1,2,0.6,0.5455,0.88,0.1642,10,146,156 +12099,2012-05-24,2,1,5,7,0,4,1,1,0.6,0.5606,0.83,0.1045,22,393,415 +12100,2012-05-24,2,1,5,8,0,4,1,1,0.62,0.5909,0.73,0.1343,33,659,692 +12101,2012-05-24,2,1,5,9,0,4,1,1,0.62,0.5909,0.73,0.1642,34,267,301 +12102,2012-05-24,2,1,5,10,0,4,1,1,0.64,0.6061,0.65,0.1343,47,138,185 +12103,2012-05-24,2,1,5,11,0,4,1,1,0.66,0.6212,0.65,0.1343,61,160,221 +12104,2012-05-24,2,1,5,12,0,4,1,1,0.7,0.6515,0.58,0.1045,76,213,289 +12105,2012-05-24,2,1,5,13,0,4,1,1,0.72,0.6667,0.54,0.1642,69,233,302 +12106,2012-05-24,2,1,5,14,0,4,1,1,0.74,0.6818,0.55,0.2239,72,180,252 +12107,2012-05-24,2,1,5,15,0,4,1,1,0.74,0.6818,0.55,0.2537,69,222,291 +12108,2012-05-24,2,1,5,16,0,4,1,1,0.74,0.6818,0.55,0.3582,61,329,390 +12109,2012-05-24,2,1,5,17,0,4,1,1,0.74,0.6818,0.55,0.2836,120,678,798 +12110,2012-05-24,2,1,5,18,0,4,1,1,0.72,0.6667,0.54,0.3284,118,634,752 +12111,2012-05-24,2,1,5,19,0,4,1,1,0.7,0.6515,0.61,0.2985,76,416,492 +12112,2012-05-24,2,1,5,20,0,4,1,2,0.66,0.6212,0.69,0.2836,55,343,398 +12113,2012-05-24,2,1,5,21,0,4,1,3,0.64,0.5909,0.78,0.1642,48,237,285 +12114,2012-05-24,2,1,5,22,0,4,1,2,0.64,0.5909,0.78,0.1045,33,210,243 +12115,2012-05-24,2,1,5,23,0,4,1,2,0.64,0.5909,0.78,0.1343,36,131,167 
+12116,2012-05-25,2,1,5,0,0,5,1,1,0.62,0.5758,0.83,0.1343,10,63,73 +12117,2012-05-25,2,1,5,1,0,5,1,1,0.62,0.5758,0.83,0.0896,8,34,42 +12118,2012-05-25,2,1,5,2,0,5,1,2,0.62,0.5758,0.83,0,7,16,23 +12119,2012-05-25,2,1,5,3,0,5,1,2,0.62,0.5606,0.88,0.1045,8,6,14 +12120,2012-05-25,2,1,5,4,0,5,1,2,0.62,0.5606,0.88,0,3,6,9 +12121,2012-05-25,2,1,5,5,0,5,1,2,0.62,0.5606,0.88,0.0896,2,28,30 +12122,2012-05-25,2,1,5,6,0,5,1,2,0.62,0.5606,0.88,0.1045,5,108,113 +12123,2012-05-25,2,1,5,7,0,5,1,2,0.62,0.5606,0.88,0.1642,7,352,359 +12124,2012-05-25,2,1,5,8,0,5,1,2,0.64,0.5758,0.89,0.1045,32,551,583 +12125,2012-05-25,2,1,5,9,0,5,1,2,0.64,0.5758,0.89,0.1343,52,288,340 +12126,2012-05-25,2,1,5,10,0,5,1,1,0.68,0.6364,0.79,0.1642,69,163,232 +12127,2012-05-25,2,1,5,11,0,5,1,1,0.68,0.6364,0.79,0.1343,90,193,283 +12128,2012-05-25,2,1,5,12,0,5,1,1,0.72,0.6818,0.66,0.1045,84,237,321 +12129,2012-05-25,2,1,5,13,0,5,1,1,0.74,0.6818,0.62,0.1343,86,247,333 +12130,2012-05-25,2,1,5,14,0,5,1,1,0.74,0.6818,0.58,0.1343,104,263,367 +12131,2012-05-25,2,1,5,15,0,5,1,1,0.76,0.7121,0.62,0.194,106,360,466 +12132,2012-05-25,2,1,5,16,0,5,1,1,0.78,0.7121,0.52,0.2239,137,447,584 +12133,2012-05-25,2,1,5,17,0,5,1,1,0.76,0.7121,0.62,0.2836,124,529,653 +12134,2012-05-25,2,1,5,18,0,5,1,1,0.76,0.7121,0.62,0.194,131,391,522 +12135,2012-05-25,2,1,5,19,0,5,1,1,0.74,0.6818,0.62,0.2537,111,293,404 +12136,2012-05-25,2,1,5,20,0,5,1,1,0.7,0.6515,0.65,0.1642,80,270,350 +12137,2012-05-25,2,1,5,21,0,5,1,1,0.7,0.6515,0.65,0.1343,62,193,255 +12138,2012-05-25,2,1,5,22,0,5,1,1,0.66,0.6212,0.74,0.1045,65,147,212 +12139,2012-05-25,2,1,5,23,0,5,1,1,0.66,0.6061,0.78,0.2239,34,132,166 +12140,2012-05-26,2,1,5,0,0,6,0,1,0.64,0.5758,0.83,0.1642,18,98,116 +12141,2012-05-26,2,1,5,1,0,6,0,1,0.64,0.5758,0.83,0.1343,18,64,82 +12142,2012-05-26,2,1,5,2,0,6,0,1,0.64,0.5758,0.89,0.1343,9,55,64 +12143,2012-05-26,2,1,5,3,0,6,0,1,0.62,0.5606,0.88,0.1045,9,22,31 +12144,2012-05-26,2,1,5,4,0,6,0,1,0.62,0.5606,0.88,0.1343,0,2,2 
+12145,2012-05-26,2,1,5,5,0,6,0,1,0.62,0.5758,0.83,0.2537,0,10,10 +12146,2012-05-26,2,1,5,6,0,6,0,1,0.62,0.5606,0.88,0.194,6,25,31 +12147,2012-05-26,2,1,5,7,0,6,0,2,0.62,0.5606,0.88,0.1642,10,44,54 +12148,2012-05-26,2,1,5,8,0,6,0,1,0.64,0.5909,0.78,0.1343,38,103,141 +12149,2012-05-26,2,1,5,9,0,6,0,1,0.64,0.5758,0.83,0.194,97,192,289 +12150,2012-05-26,2,1,5,10,0,6,0,1,0.68,0.6364,0.79,0.2239,181,256,437 +12151,2012-05-26,2,1,5,11,0,6,0,1,0.7,0.6667,0.74,0.194,208,312,520 +12152,2012-05-26,2,1,5,12,0,6,0,1,0.74,0.697,0.66,0.2239,236,293,529 +12153,2012-05-26,2,1,5,13,0,6,0,1,0.76,0.7273,0.66,0.194,265,274,539 +12154,2012-05-26,2,1,5,14,0,6,0,1,0.76,0.7273,0.66,0.2239,307,257,564 +12155,2012-05-26,2,1,5,15,0,6,0,1,0.76,0.7273,0.66,0.2836,274,239,513 +12156,2012-05-26,2,1,5,16,0,6,0,1,0.78,0.7273,0.55,0.2537,261,243,504 +12157,2012-05-26,2,1,5,17,0,6,0,2,0.78,0.7121,0.52,0.2537,235,227,462 +12158,2012-05-26,2,1,5,18,0,6,0,2,0.76,0.697,0.52,0.194,199,216,415 +12159,2012-05-26,2,1,5,19,0,6,0,2,0.74,0.697,0.66,0.2239,159,214,373 +12160,2012-05-26,2,1,5,20,0,6,0,2,0.74,0.6818,0.55,0.194,103,152,255 +12161,2012-05-26,2,1,5,21,0,6,0,2,0.72,0.6818,0.62,0.194,100,151,251 +12162,2012-05-26,2,1,5,22,0,6,0,1,0.7,0.6667,0.74,0.2537,86,118,204 +12163,2012-05-26,2,1,5,23,0,6,0,1,0.7,0.6667,0.74,0.2537,36,114,150 +12164,2012-05-27,2,1,5,0,0,0,0,2,0.68,0.6364,0.69,0.1642,48,89,137 +12165,2012-05-27,2,1,5,1,0,0,0,1,0.66,0.6212,0.74,0.2239,17,61,78 +12166,2012-05-27,2,1,5,2,0,0,0,1,0.64,0.6061,0.73,0.2239,16,45,61 +12167,2012-05-27,2,1,5,3,0,0,0,1,0.64,0.6061,0.73,0.1642,11,25,36 +12168,2012-05-27,2,1,5,4,0,0,0,1,0.62,0.5758,0.83,0.194,4,7,11 +12169,2012-05-27,2,1,5,5,0,0,0,1,0.62,0.5758,0.83,0.194,2,5,7 +12170,2012-05-27,2,1,5,6,0,0,0,1,0.62,0.5758,0.83,0.1642,8,14,22 +12171,2012-05-27,2,1,5,7,0,0,0,1,0.62,0.5606,0.88,0.1642,22,30,52 +12172,2012-05-27,2,1,5,8,0,0,0,1,0.64,0.5758,0.83,0.194,69,82,151 +12173,2012-05-27,2,1,5,9,0,0,0,1,0.66,0.6061,0.83,0.2239,130,140,270 
+12174,2012-05-27,2,1,5,10,0,0,0,1,0.68,0.6364,0.74,0.2537,209,215,424 +12175,2012-05-27,2,1,5,11,0,0,0,1,0.72,0.6818,0.66,0.2239,268,251,519 +12176,2012-05-27,2,1,5,12,0,0,0,1,0.74,0.697,0.66,0.2537,301,269,570 +12177,2012-05-27,2,1,5,13,0,0,0,1,0.76,0.7121,0.58,0.194,270,228,498 +12178,2012-05-27,2,1,5,14,0,0,0,1,0.76,0.7121,0.58,0.2537,317,230,547 +12179,2012-05-27,2,1,5,15,0,0,0,1,0.78,0.7121,0.52,0.2836,290,245,535 +12180,2012-05-27,2,1,5,16,0,0,0,1,0.78,0.7121,0.52,0.2836,258,243,501 +12181,2012-05-27,2,1,5,17,0,0,0,1,0.78,0.7121,0.49,0.2985,275,247,522 +12182,2012-05-27,2,1,5,18,0,0,0,1,0.76,0.7121,0.58,0.2985,266,252,518 +12183,2012-05-27,2,1,5,19,0,0,0,1,0.76,0.7121,0.58,0.194,230,212,442 +12184,2012-05-27,2,1,5,20,0,0,0,1,0.74,0.6818,0.55,0.1642,168,224,392 +12185,2012-05-27,2,1,5,21,0,0,0,3,0.66,0.6212,0.69,0.4179,64,79,143 +12186,2012-05-27,2,1,5,22,0,0,0,3,0.62,0.5758,0.83,0,9,50,59 +12187,2012-05-27,2,1,5,23,0,0,0,3,0.62,0.5758,0.83,0.1343,31,65,96 +12188,2012-05-28,2,1,5,0,1,1,0,1,0.62,0.5758,0.83,0.2239,21,44,65 +12189,2012-05-28,2,1,5,1,1,1,0,2,0.62,0.5909,0.78,0.2985,14,45,59 +12190,2012-05-28,2,1,5,2,1,1,0,1,0.6,0.5606,0.83,0.2239,20,28,48 +12191,2012-05-28,2,1,5,3,1,1,0,1,0.62,0.5909,0.73,0.1642,6,12,18 +12192,2012-05-28,2,1,5,4,1,1,0,1,0.6,0.5758,0.78,0.1343,2,4,6 +12193,2012-05-28,2,1,5,5,1,1,0,1,0.6,0.5758,0.78,0.1343,4,5,9 +12194,2012-05-28,2,1,5,6,1,1,0,1,0.62,0.5909,0.73,0.1642,6,14,20 +12195,2012-05-28,2,1,5,7,1,1,0,1,0.62,0.5909,0.78,0.1045,24,43,67 +12196,2012-05-28,2,1,5,8,1,1,0,1,0.66,0.6212,0.69,0,62,93,155 +12197,2012-05-28,2,1,5,9,1,1,0,1,0.7,0.6515,0.61,0,121,142,263 +12198,2012-05-28,2,1,5,10,1,1,0,1,0.7,0.6515,0.61,0,189,175,364 +12199,2012-05-28,2,1,5,11,1,1,0,1,0.72,0.6818,0.66,0.1343,254,222,476 +12200,2012-05-28,2,1,5,12,1,1,0,1,0.76,0.7121,0.62,0.1343,233,292,525 +12201,2012-05-28,2,1,5,13,1,1,0,1,0.8,0.7879,0.63,0.2836,272,284,556 +12202,2012-05-28,2,1,5,14,1,1,0,1,0.8,0.7879,0.63,0.2836,238,227,465 
+12203,2012-05-28,2,1,5,15,1,1,0,1,0.82,0.7879,0.56,0.2985,181,279,460 +12204,2012-05-28,2,1,5,16,1,1,0,1,0.82,0.7879,0.56,0.3284,223,278,501 +12205,2012-05-28,2,1,5,17,1,1,0,1,0.82,0.7879,0.56,0.2985,158,247,405 +12206,2012-05-28,2,1,5,18,1,1,0,1,0.8,0.7727,0.59,0.2985,132,251,383 +12207,2012-05-28,2,1,5,19,1,1,0,1,0.8,0.7576,0.55,0.2985,148,247,395 +12208,2012-05-28,2,1,5,20,1,1,0,1,0.78,0.7424,0.62,0.2985,114,203,317 +12209,2012-05-28,2,1,5,21,1,1,0,1,0.76,0.7273,0.66,0.2836,66,181,247 +12210,2012-05-28,2,1,5,22,1,1,0,1,0.74,0.697,0.7,0.1642,47,110,157 +12211,2012-05-28,2,1,5,23,1,1,0,1,0.72,0.697,0.74,0.1642,22,60,82 +12212,2012-05-29,2,1,5,0,0,2,1,1,0.7,0.6667,0.79,0.194,12,33,45 +12213,2012-05-29,2,1,5,1,0,2,1,1,0.7,0.6667,0.79,0.194,1,13,14 +12214,2012-05-29,2,1,5,2,0,2,1,1,0.68,0.6364,0.83,0.1343,0,8,8 +12215,2012-05-29,2,1,5,3,0,2,1,1,0.68,0.6364,0.79,0.2836,1,2,3 +12216,2012-05-29,2,1,5,4,0,2,1,1,0.66,0.6061,0.78,0.2239,0,5,5 +12217,2012-05-29,2,1,5,5,0,2,1,1,0.66,0.6061,0.78,0.2239,1,32,33 +12218,2012-05-29,2,1,5,6,0,2,1,1,0.66,0.6061,0.78,0.2239,9,145,154 +12219,2012-05-29,2,1,5,7,0,2,1,1,0.66,0.6061,0.78,0.2239,19,431,450 +12220,2012-05-29,2,1,5,8,0,2,1,1,0.7,0.6667,0.74,0.194,46,588,634 +12221,2012-05-29,2,1,5,9,0,2,1,1,0.72,0.6818,0.7,0.2537,26,231,257 +12222,2012-05-29,2,1,5,10,0,2,1,1,0.74,0.697,0.7,0.2985,67,116,183 +12223,2012-05-29,2,1,5,11,0,2,1,1,0.76,0.7121,0.62,0.2985,50,147,197 +12224,2012-05-29,2,1,5,12,0,2,1,1,0.8,0.7576,0.55,0.3284,56,181,237 +12225,2012-05-29,2,1,5,13,0,2,1,1,0.82,0.7727,0.52,0.3881,63,176,239 +12226,2012-05-29,2,1,5,14,0,2,1,1,0.82,0.7727,0.52,0.4179,80,158,238 +12227,2012-05-29,2,1,5,15,0,2,1,1,0.82,0.7727,0.49,0.3881,71,185,256 +12228,2012-05-29,2,1,5,16,0,2,1,1,0.82,0.7727,0.49,0.4925,73,320,393 +12229,2012-05-29,2,1,5,17,0,2,1,1,0.82,0.7576,0.46,0.4179,107,674,781 +12230,2012-05-29,2,1,5,18,0,2,1,1,0.8,0.7424,0.49,0.4925,78,632,710 +12231,2012-05-29,2,1,5,19,0,2,1,1,0.78,0.7121,0.52,0.3582,69,457,526 
+12232,2012-05-29,2,1,5,20,0,2,1,3,0.7,0.6515,0.61,0.5224,36,168,204 +12233,2012-05-29,2,1,5,21,0,2,1,3,0.6,0.5455,0.88,0.4478,3,68,71 +12234,2012-05-29,2,1,5,22,0,2,1,2,0.62,0.5606,0.88,0.0896,2,53,55 +12235,2012-05-29,2,1,5,23,0,2,1,3,0.62,0.5455,0.94,0,10,40,50 +12236,2012-05-30,2,1,5,0,0,3,1,3,0.64,0.5758,0.89,0.1045,4,32,36 +12237,2012-05-30,2,1,5,1,0,3,1,3,0.64,0.5758,0.89,0.194,0,4,4 +12238,2012-05-30,2,1,5,2,0,3,1,3,0.62,0.5455,0.94,0.0896,1,4,5 +12239,2012-05-30,2,1,5,3,0,3,1,3,0.62,0.5606,0.88,0.1343,0,5,5 +12240,2012-05-30,2,1,5,4,0,3,1,2,0.62,0.5606,0.88,0.1343,0,6,6 +12241,2012-05-30,2,1,5,5,0,3,1,2,0.62,0.5909,0.78,0.1343,1,40,41 +12242,2012-05-30,2,1,5,6,0,3,1,2,0.6,0.5758,0.78,0.1045,5,139,144 +12243,2012-05-30,2,1,5,7,0,3,1,3,0.6,0.5758,0.78,0.194,14,469,483 +12244,2012-05-30,2,1,5,8,0,3,1,2,0.6,0.5909,0.73,0.1642,29,642,671 +12245,2012-05-30,2,1,5,9,0,3,1,2,0.62,0.6061,0.69,0.194,23,282,305 +12246,2012-05-30,2,1,5,10,0,3,1,2,0.62,0.6061,0.69,0.2836,39,138,177 +12247,2012-05-30,2,1,5,11,0,3,1,2,0.66,0.6212,0.65,0.194,53,155,208 +12248,2012-05-30,2,1,5,12,0,3,1,1,0.7,0.6515,0.54,0,33,220,253 +12249,2012-05-30,2,1,5,13,0,3,1,1,0.7,0.6515,0.51,0,53,236,289 +12250,2012-05-30,2,1,5,14,0,3,1,1,0.72,0.6515,0.42,0.1343,44,208,252 +12251,2012-05-30,2,1,5,15,0,3,1,1,0.72,0.6667,0.51,0.1343,54,221,275 +12252,2012-05-30,2,1,5,16,0,3,1,1,0.72,0.6667,0.51,0.1642,46,352,398 +12253,2012-05-30,2,1,5,17,0,3,1,1,0.74,0.6667,0.48,0.0896,83,756,839 +12254,2012-05-30,2,1,5,18,0,3,1,1,0.74,0.6667,0.48,0.1642,50,746,796 +12255,2012-05-30,2,1,5,19,0,3,1,1,0.72,0.6667,0.54,0.1642,50,506,556 +12256,2012-05-30,2,1,5,20,0,3,1,2,0.54,0.5152,0.6,0.1343,57,374,431 +12257,2012-05-30,2,1,5,21,0,3,1,1,0.68,0.6364,0.57,0.1045,41,263,304 +12258,2012-05-30,2,1,5,22,0,3,1,1,0.66,0.6212,0.69,0.1045,26,191,217 +12259,2012-05-30,2,1,5,23,0,3,1,1,0.66,0.6212,0.65,0.1045,39,121,160 +12260,2012-05-31,2,1,5,0,0,4,1,1,0.64,0.6061,0.69,0.1343,13,48,61 
+12261,2012-05-31,2,1,5,1,0,4,1,1,0.64,0.6061,0.69,0.0896,4,22,26 +12262,2012-05-31,2,1,5,2,0,4,1,1,0.64,0.6061,0.65,0.0896,6,8,14 +12263,2012-05-31,2,1,5,3,0,4,1,1,0.62,0.6061,0.69,0.0896,0,8,8 +12264,2012-05-31,2,1,5,4,0,4,1,1,0.62,0.6061,0.69,0,0,8,8 +12265,2012-05-31,2,1,5,5,0,4,1,1,0.62,0.6061,0.69,0.2537,1,32,33 +12266,2012-05-31,2,1,5,6,0,4,1,1,0.6,0.5909,0.73,0.2239,7,171,178 +12267,2012-05-31,2,1,5,7,0,4,1,1,0.6,0.6061,0.64,0.2836,18,489,507 +12268,2012-05-31,2,1,5,8,0,4,1,1,0.62,0.6212,0.57,0.2537,18,675,693 +12269,2012-05-31,2,1,5,9,0,4,1,1,0.66,0.6212,0.47,0.2537,47,264,311 +12270,2012-05-31,2,1,5,10,0,4,1,1,0.7,0.6364,0.39,0.3881,65,155,220 +12271,2012-05-31,2,1,5,11,0,4,1,1,0.72,0.6515,0.37,0.2836,65,153,218 +12272,2012-05-31,2,1,5,12,0,4,1,1,0.72,0.6515,0.37,0.194,62,230,292 +12273,2012-05-31,2,1,5,13,0,4,1,1,0.74,0.6515,0.35,0.1343,77,216,293 +12274,2012-05-31,2,1,5,14,0,4,1,1,0.76,0.6667,0.31,0.194,61,193,254 +12275,2012-05-31,2,1,5,15,0,4,1,1,0.76,0.6667,0.33,0.2985,81,209,290 +12276,2012-05-31,2,1,5,16,0,4,1,1,0.76,0.6667,0.33,0.2836,103,384,487 +12277,2012-05-31,2,1,5,17,0,4,1,1,0.76,0.6667,0.31,0.1642,85,742,827 +12278,2012-05-31,2,1,5,18,0,4,1,1,0.74,0.6515,0.3,0.194,85,700,785 +12279,2012-05-31,2,1,5,19,0,4,1,1,0.72,0.6515,0.34,0.1642,104,487,591 +12280,2012-05-31,2,1,5,20,0,4,1,1,0.68,0.6364,0.44,0.1642,88,391,479 +12281,2012-05-31,2,1,5,21,0,4,1,1,0.68,0.6364,0.44,0.1642,41,318,359 +12282,2012-05-31,2,1,5,22,0,4,1,1,0.66,0.6212,0.5,0.1642,43,221,264 +12283,2012-05-31,2,1,5,23,0,4,1,1,0.66,0.6212,0.54,0.2239,26,114,140 +12284,2012-06-01,2,1,6,0,0,5,1,1,0.66,0.6212,0.5,0.2537,10,76,86 +12285,2012-06-01,2,1,6,1,0,5,1,1,0.64,0.6212,0.53,0.2239,0,34,34 +12286,2012-06-01,2,1,6,2,0,5,1,1,0.64,0.6212,0.57,0.2239,3,13,16 +12287,2012-06-01,2,1,6,3,0,5,1,2,0.62,0.5909,0.73,0.0896,0,4,4 +12288,2012-06-01,2,1,6,4,0,5,1,2,0.62,0.5909,0.73,0.0896,1,5,6 +12289,2012-06-01,2,1,6,5,0,5,1,2,0.62,0.5909,0.78,0.0896,4,41,45 
+12290,2012-06-01,2,1,6,6,0,5,1,3,0.62,0.5606,0.88,0.1343,5,136,141 +12291,2012-06-01,2,1,6,7,0,5,1,2,0.64,0.5758,0.89,0.194,33,369,402 +12292,2012-06-01,2,1,6,8,0,5,1,2,0.64,0.5758,0.89,0.194,19,675,694 +12293,2012-06-01,2,1,6,9,0,5,1,2,0.64,0.5758,0.83,0.2239,24,274,298 +12294,2012-06-01,2,1,6,10,0,5,1,2,0.66,0.6061,0.78,0.2836,49,155,204 +12295,2012-06-01,2,1,6,11,0,5,1,2,0.7,0.6667,0.74,0.2836,42,193,235 +12296,2012-06-01,2,1,6,12,0,5,1,2,0.7,0.6667,0.74,0.3582,72,238,310 +12297,2012-06-01,2,1,6,13,0,5,1,1,0.72,0.6818,0.7,0.2985,76,240,316 +12298,2012-06-01,2,1,6,14,0,5,1,3,0.72,0.6818,0.66,0.2836,55,199,254 +12299,2012-06-01,2,1,6,15,0,5,1,3,0.72,0.6818,0.66,0.2836,45,213,258 +12300,2012-06-01,2,1,6,16,0,5,1,3,0.74,0.6818,0.58,0.4478,36,186,222 +12301,2012-06-01,2,1,6,17,0,5,1,3,0.7,0.6515,0.7,0.2537,25,202,227 +12302,2012-06-01,2,1,6,18,0,5,1,3,0.62,0.5606,0.88,0.3582,10,100,110 +12303,2012-06-01,2,1,6,19,0,5,1,3,0.62,0.5606,0.88,0.3582,4,41,45 +12304,2012-06-01,2,1,6,20,0,5,1,3,0.62,0.5455,0.94,0.2537,1,38,39 +12305,2012-06-01,2,1,6,21,0,5,1,3,0.64,0.5909,0.78,0.2537,12,73,85 +12306,2012-06-01,2,1,6,22,0,5,1,3,0.6,0.5455,0.88,0.1343,1,22,23 +12307,2012-06-01,2,1,6,23,0,5,1,3,0.6,0.5455,0.88,0.1343,6,67,73 +12308,2012-06-02,2,1,6,0,0,6,0,2,0.56,0.5303,0.83,0.1343,5,81,86 +12309,2012-06-02,2,1,6,1,0,6,0,2,0.56,0.5303,0.83,0.1045,15,61,76 +12310,2012-06-02,2,1,6,2,0,6,0,2,0.54,0.5152,0.88,0,3,38,41 +12311,2012-06-02,2,1,6,3,0,6,0,1,0.56,0.5303,0.73,0.2537,8,18,26 +12312,2012-06-02,2,1,6,4,0,6,0,1,0.54,0.5152,0.68,0.2836,2,11,13 +12313,2012-06-02,2,1,6,5,0,6,0,1,0.5,0.4848,0.72,0.194,1,11,12 +12314,2012-06-02,2,1,6,6,0,6,0,2,0.5,0.4848,0.77,0.194,5,31,36 +12315,2012-06-02,2,1,6,7,0,6,0,1,0.52,0.5,0.72,0.1045,9,77,86 +12316,2012-06-02,2,1,6,8,0,6,0,1,0.54,0.5152,0.64,0.1642,30,180,210 +12317,2012-06-02,2,1,6,9,0,6,0,1,0.56,0.5303,0.6,0.2537,89,243,332 +12318,2012-06-02,2,1,6,10,0,6,0,1,0.6,0.6212,0.53,0.2985,145,348,493 
+12319,2012-06-02,2,1,6,11,0,6,0,1,0.58,0.5455,0.46,0.2836,179,357,536 +12320,2012-06-02,2,1,6,12,0,6,0,1,0.62,0.6212,0.43,0.3582,250,418,668 +12321,2012-06-02,2,1,6,13,0,6,0,1,0.62,0.6212,0.41,0.2537,279,400,679 +12322,2012-06-02,2,1,6,14,0,6,0,1,0.64,0.6212,0.38,0.2985,259,388,647 +12323,2012-06-02,2,1,6,15,0,6,0,1,0.64,0.6212,0.38,0.2537,297,405,702 +12324,2012-06-02,2,1,6,16,0,6,0,1,0.64,0.6212,0.38,0.2537,275,369,644 +12325,2012-06-02,2,1,6,17,0,6,0,1,0.64,0.6212,0.36,0,248,338,586 +12326,2012-06-02,2,1,6,18,0,6,0,1,0.64,0.6212,0.36,0,171,341,512 +12327,2012-06-02,2,1,6,19,0,6,0,1,0.64,0.6212,0.36,0.1642,185,369,554 +12328,2012-06-02,2,1,6,20,0,6,0,1,0.62,0.6212,0.35,0.2836,139,260,399 +12329,2012-06-02,2,1,6,21,0,6,0,1,0.6,0.6212,0.4,0.1045,96,220,316 +12330,2012-06-02,2,1,6,22,0,6,0,1,0.58,0.5455,0.46,0.1343,52,176,228 +12331,2012-06-02,2,1,6,23,0,6,0,1,0.56,0.5303,0.52,0.1045,53,185,238 +12332,2012-06-03,2,1,6,0,0,0,0,1,0.54,0.5152,0.6,0.1642,27,142,169 +12333,2012-06-03,2,1,6,1,0,0,0,1,0.54,0.5152,0.56,0.1343,21,100,121 +12334,2012-06-03,2,1,6,2,0,0,0,1,0.52,0.5,0.59,0.1045,22,67,89 +12335,2012-06-03,2,1,6,3,0,0,0,1,0.52,0.5,0.63,0.1045,16,34,50 +12336,2012-06-03,2,1,6,4,0,0,0,1,0.5,0.4848,0.72,0.0896,1,12,13 +12337,2012-06-03,2,1,6,5,0,0,0,1,0.5,0.4848,0.68,0,1,11,12 +12338,2012-06-03,2,1,6,6,0,0,0,1,0.46,0.4545,0.82,0.1045,6,16,22 +12339,2012-06-03,2,1,6,7,0,0,0,1,0.5,0.4848,0.68,0.1045,6,29,35 +12340,2012-06-03,2,1,6,8,0,0,0,1,0.54,0.5152,0.6,0.0896,30,110,140 +12341,2012-06-03,2,1,6,9,0,0,0,1,0.6,0.6212,0.46,0.194,70,230,300 +12342,2012-06-03,2,1,6,10,0,0,0,1,0.62,0.6212,0.43,0.2239,127,277,404 +12343,2012-06-03,2,1,6,11,0,0,0,1,0.64,0.6212,0.41,0.2537,199,312,511 +12344,2012-06-03,2,1,6,12,0,0,0,1,0.66,0.6212,0.39,0.2985,276,408,684 +12345,2012-06-03,2,1,6,13,0,0,0,1,0.68,0.6364,0.36,0.2836,265,421,686 +12346,2012-06-03,2,1,6,14,0,0,0,1,0.68,0.6364,0.36,0.3284,267,411,678 +12347,2012-06-03,2,1,6,15,0,0,0,1,0.7,0.6364,0.34,0,236,408,644 
+12348,2012-06-03,2,1,6,16,0,0,0,1,0.7,0.6364,0.34,0.2985,226,436,662 +12349,2012-06-03,2,1,6,17,0,0,0,1,0.7,0.6364,0.34,0.2537,192,386,578 +12350,2012-06-03,2,1,6,18,0,0,0,1,0.7,0.6364,0.34,0.2537,153,343,496 +12351,2012-06-03,2,1,6,19,0,0,0,1,0.68,0.6364,0.34,0.2836,116,337,453 +12352,2012-06-03,2,1,6,20,0,0,0,1,0.66,0.6212,0.39,0.194,84,230,314 +12353,2012-06-03,2,1,6,21,0,0,0,1,0.62,0.6212,0.43,0.1642,66,180,246 +12354,2012-06-03,2,1,6,22,0,0,0,1,0.62,0.6212,0.5,0.3881,61,165,226 +12355,2012-06-03,2,1,6,23,0,0,0,1,0.58,0.5455,0.53,0.1045,26,82,108 +12356,2012-06-04,2,1,6,0,0,1,1,1,0.58,0.5455,0.53,0,11,38,49 +12357,2012-06-04,2,1,6,1,0,1,1,3,0.56,0.5303,0.6,0.1642,4,10,14 +12358,2012-06-04,2,1,6,2,0,1,1,1,0.56,0.5303,0.6,0.2239,1,10,11 +12359,2012-06-04,2,1,6,3,0,1,1,2,0.56,0.5303,0.6,0.1642,0,5,5 +12360,2012-06-04,2,1,6,4,0,1,1,1,0.54,0.5152,0.64,0.0896,2,6,8 +12361,2012-06-04,2,1,6,5,0,1,1,1,0.52,0.5,0.68,0,2,33,35 +12362,2012-06-04,2,1,6,6,0,1,1,1,0.52,0.5,0.68,0,4,135,139 +12363,2012-06-04,2,1,6,7,0,1,1,1,0.56,0.5303,0.6,0.2239,13,504,517 +12364,2012-06-04,2,1,6,8,0,1,1,1,0.6,0.6212,0.49,0.2239,22,643,665 +12365,2012-06-04,2,1,6,9,0,1,1,1,0.62,0.6212,0.43,0.2836,38,244,282 +12366,2012-06-04,2,1,6,10,0,1,1,1,0.64,0.6212,0.41,0.4627,67,120,187 +12367,2012-06-04,2,1,6,11,0,1,1,1,0.64,0.6212,0.38,0.4925,71,168,239 +12368,2012-06-04,2,1,6,12,0,1,1,1,0.64,0.6212,0.41,0.4478,62,220,282 +12369,2012-06-04,2,1,6,13,0,1,1,1,0.64,0.6212,0.41,0.3582,85,189,274 +12370,2012-06-04,2,1,6,14,0,1,1,1,0.62,0.6212,0.41,0.4925,84,186,270 +12371,2012-06-04,2,1,6,15,0,1,1,1,0.66,0.6212,0.39,0.2985,77,215,292 +12372,2012-06-04,2,1,6,16,0,1,1,1,0.66,0.6212,0.39,0.4627,73,375,448 +12373,2012-06-04,2,1,6,17,0,1,1,1,0.64,0.6212,0.38,0.4478,101,733,834 +12374,2012-06-04,2,1,6,18,0,1,1,1,0.64,0.6212,0.38,0.3881,103,719,822 +12375,2012-06-04,2,1,6,19,0,1,1,1,0.62,0.6212,0.38,0.4179,91,554,645 +12376,2012-06-04,2,1,6,20,0,1,1,1,0.6,0.6212,0.4,0.3582,71,390,461 
+12377,2012-06-04,2,1,6,21,0,1,1,1,0.58,0.5455,0.49,0.2537,40,225,265 +12378,2012-06-04,2,1,6,22,0,1,1,1,0.56,0.5303,0.52,0.2836,41,127,168 +12379,2012-06-04,2,1,6,23,0,1,1,1,0.58,0.5455,0.49,0.2985,8,78,86 +12380,2012-06-05,2,1,6,0,0,2,1,1,0.54,0.5152,0.68,0.4179,6,28,34 +12381,2012-06-05,2,1,6,1,0,2,1,2,0.52,0.5,0.77,0.3284,2,18,20 +12382,2012-06-05,2,1,6,2,0,2,1,3,0.52,0.5,0.72,0.2985,0,8,8 +12383,2012-06-05,2,1,6,3,0,2,1,3,0.5,0.4848,0.82,0.194,0,5,5 +12384,2012-06-05,2,1,6,4,0,2,1,3,0.5,0.4848,0.82,0.194,1,4,5 +12385,2012-06-05,2,1,6,5,0,2,1,3,0.48,0.4697,0.82,0.1642,1,35,36 +12386,2012-06-05,2,1,6,6,0,2,1,1,0.48,0.4697,0.82,0.2537,7,177,184 +12387,2012-06-05,2,1,6,7,0,2,1,1,0.48,0.4697,0.77,0.2985,29,540,569 +12388,2012-06-05,2,1,6,8,0,2,1,2,0.48,0.4697,0.77,0.2239,30,680,710 +12389,2012-06-05,2,1,6,9,0,2,1,2,0.52,0.5,0.72,0.2537,50,285,335 +12390,2012-06-05,2,1,6,10,0,2,1,2,0.52,0.5,0.63,0.2985,58,118,176 +12391,2012-06-05,2,1,6,11,0,2,1,2,0.56,0.5303,0.56,0.194,65,162,227 +12392,2012-06-05,2,1,6,12,0,2,1,2,0.54,0.5152,0.56,0.2239,45,202,247 +12393,2012-06-05,2,1,6,13,0,2,1,2,0.6,0.6212,0.46,0.1642,66,201,267 +12394,2012-06-05,2,1,6,14,0,2,1,2,0.58,0.5455,0.46,0.1642,82,194,276 +12395,2012-06-05,2,1,6,15,0,2,1,2,0.58,0.5455,0.46,0.1343,54,218,272 +12396,2012-06-05,2,1,6,16,0,2,1,1,0.6,0.6212,0.43,0.194,91,382,473 +12397,2012-06-05,2,1,6,17,0,2,1,1,0.6,0.6212,0.43,0,86,764,850 +12398,2012-06-05,2,1,6,18,0,2,1,1,0.6,0.6212,0.43,0.194,111,679,790 +12399,2012-06-05,2,1,6,19,0,2,1,1,0.58,0.5455,0.46,0.2537,68,445,513 +12400,2012-06-05,2,1,6,20,0,2,1,1,0.56,0.5303,0.49,0.2239,44,371,415 +12401,2012-06-05,2,1,6,21,0,2,1,1,0.54,0.5152,0.56,0.1642,30,253,283 +12402,2012-06-05,2,1,6,22,0,2,1,1,0.56,0.5303,0.52,0.1045,22,171,193 +12403,2012-06-05,2,1,6,23,0,2,1,1,0.54,0.5152,0.56,0.0896,20,93,113 +12404,2012-06-06,2,1,6,0,0,3,1,1,0.52,0.5,0.59,0.0896,3,46,49 +12405,2012-06-06,2,1,6,1,0,3,1,1,0.52,0.5,0.68,0,6,21,27 
+12406,2012-06-06,2,1,6,2,0,3,1,1,0.5,0.4848,0.68,0,4,7,11 +12407,2012-06-06,2,1,6,3,0,3,1,1,0.46,0.4545,0.82,0,0,8,8 +12408,2012-06-06,2,1,6,4,0,3,1,1,0.46,0.4545,0.82,0.1045,3,7,10 +12409,2012-06-06,2,1,6,5,0,3,1,1,0.48,0.4697,0.77,0,1,37,38 +12410,2012-06-06,2,1,6,6,0,3,1,1,0.48,0.4697,0.77,0,7,165,172 +12411,2012-06-06,2,1,6,7,0,3,1,1,0.5,0.4848,0.72,0,16,531,547 +12412,2012-06-06,2,1,6,8,0,3,1,1,0.54,0.5152,0.68,0,31,637,668 +12413,2012-06-06,2,1,6,9,0,3,1,1,0.56,0.5303,0.6,0.0896,35,268,303 +12414,2012-06-06,2,1,6,10,0,3,1,1,0.6,0.6212,0.46,0.0896,55,148,203 +12415,2012-06-06,2,1,6,11,0,3,1,1,0.62,0.6212,0.43,0.0896,65,161,226 +12416,2012-06-06,2,1,6,12,0,3,1,1,0.62,0.6212,0.38,0,80,235,315 +12417,2012-06-06,2,1,6,13,0,3,1,1,0.62,0.6212,0.41,0.0896,73,230,303 +12418,2012-06-06,2,1,6,14,0,3,1,1,0.64,0.6212,0.38,0,111,191,302 +12419,2012-06-06,2,1,6,15,0,3,1,1,0.64,0.6212,0.38,0,75,225,300 +12420,2012-06-06,2,1,6,16,0,3,1,1,0.64,0.6212,0.41,0.1343,107,345,452 +12421,2012-06-06,2,1,6,17,0,3,1,1,0.62,0.6212,0.43,0.1343,72,652,724 +12422,2012-06-06,2,1,6,18,0,3,1,3,0.6,0.6212,0.53,0.194,94,688,782 +12423,2012-06-06,2,1,6,19,0,3,1,1,0.56,0.5303,0.73,0.2985,52,486,538 +12424,2012-06-06,2,1,6,20,0,3,1,1,0.54,0.5152,0.73,0.2836,59,349,408 +12425,2012-06-06,2,1,6,21,0,3,1,1,0.54,0.5152,0.73,0.1642,38,260,298 +12426,2012-06-06,2,1,6,22,0,3,1,1,0.52,0.5,0.77,0.0896,27,221,248 +12427,2012-06-06,2,1,6,23,0,3,1,1,0.52,0.5,0.77,0,13,110,123 +12428,2012-06-07,2,1,6,0,0,4,1,1,0.52,0.5,0.77,0.1343,9,50,59 +12429,2012-06-07,2,1,6,1,0,4,1,1,0.52,0.5,0.77,0.1642,5,17,22 +12430,2012-06-07,2,1,6,2,0,4,1,1,0.52,0.5,0.77,0.0896,0,12,12 +12431,2012-06-07,2,1,6,3,0,4,1,1,0.5,0.4848,0.77,0.1045,0,5,5 +12432,2012-06-07,2,1,6,4,0,4,1,1,0.46,0.4545,0.88,0.1045,0,6,6 +12433,2012-06-07,2,1,6,5,0,4,1,1,0.48,0.4697,0.82,0.0896,0,34,34 +12434,2012-06-07,2,1,6,6,0,4,1,1,0.46,0.4545,0.88,0.1045,5,165,170 +12435,2012-06-07,2,1,6,7,0,4,1,1,0.5,0.4848,0.82,0.1642,20,506,526 
+12436,2012-06-07,2,1,6,8,0,4,1,1,0.52,0.5,0.77,0,20,661,681
+12437,2012-06-07,2,1,6,9,0,4,1,1,0.58,0.5455,0.68,0,37,300,337
+12438,2012-06-07,2,1,6,10,0,4,1,1,0.6,0.6061,0.64,0.0896,56,154,210
+12439,2012-06-07,2,1,6,11,0,4,1,1,0.66,0.6212,0.47,0.2239,67,170,237
+12440,2012-06-07,2,1,6,12,0,4,1,1,0.7,0.6364,0.42,0,62,274,336
+12441,2012-06-07,2,1,6,13,0,4,1,1,0.7,0.6364,0.37,0.2836,65,250,315
+12442,2012-06-07,2,1,6,14,0,4,1,1,0.72,0.6515,0.37,0.2537,67,198,265
+12443,2012-06-07,2,1,6,15,0,4,1,1,0.74,0.6515,0.3,0,99,234,333
+12444,2012-06-07,2,1,6,16,0,4,1,1,0.72,0.6515,0.34,0.2239,77,389,466
+12445,2012-06-07,2,1,6,17,0,4,1,1,0.72,0.6515,0.34,0.2537,91,778,869
+12446,2012-06-07,2,1,6,18,0,4,1,1,0.7,0.6364,0.34,0.2836,110,703,813
+12447,2012-06-07,2,1,6,19,0,4,1,1,0.7,0.6364,0.34,0.2836,65,537,602
+12448,2012-06-07,2,1,6,20,0,4,1,1,0.66,0.6212,0.36,0.2985,76,402,478
+12449,2012-06-07,2,1,6,21,0,4,1,1,0.62,0.6212,0.41,0.3284,30,283,313
+12450,2012-06-07,2,1,6,22,0,4,1,1,0.6,0.6212,0.46,0.1642,35,193,228
+12451,2012-06-07,2,1,6,23,0,4,1,1,0.56,0.5303,0.52,0.1343,42,135,177
+12452,2012-06-08,2,1,6,0,0,5,1,1,0.56,0.5303,0.56,0.1642,18,59,77
+12453,2012-06-08,2,1,6,1,0,5,1,1,0.56,0.5303,0.56,0.1642,4,25,29
+12454,2012-06-08,2,1,6,2,0,5,1,1,0.56,0.5303,0.56,0,2,17,19
+12455,2012-06-08,2,1,6,3,0,5,1,1,0.54,0.5152,0.6,0.0896,0,8,8
+12456,2012-06-08,2,1,6,4,0,5,1,1,0.52,0.5,0.63,0,1,9,10
+12457,2012-06-08,2,1,6,5,0,5,1,1,0.5,0.4848,0.72,0.1045,1,28,29
+12458,2012-06-08,2,1,6,6,0,5,1,1,0.52,0.5,0.68,0.0896,8,131,139
+12459,2012-06-08,2,1,6,7,0,5,1,1,0.52,0.5,0.72,0.1045,19,409,428
+12460,2012-06-08,2,1,6,8,0,5,1,1,0.58,0.5455,0.6,0.0896,37,663,700
+12461,2012-06-08,2,1,6,9,0,5,1,1,0.62,0.6212,0.53,0.1642,47,335,382
+12462,2012-06-08,2,1,6,10,0,5,1,1,0.66,0.6212,0.47,0.2537,62,187,249
+12463,2012-06-08,2,1,6,11,0,5,1,1,0.7,0.6364,0.42,0.2537,65,199,264
+12464,2012-06-08,2,1,6,12,0,5,1,1,0.72,0.6515,0.39,0.3284,89,280,369
+12465,2012-06-08,2,1,6,13,0,5,1,1,0.74,0.6515,0.37,0.2537,129,287,416
+12466,2012-06-08,2,1,6,14,0,5,1,1,0.76,0.6667,0.31,0.3881,128,245,373
+12467,2012-06-08,2,1,6,15,0,5,1,1,0.76,0.6667,0.29,0.2836,98,299,397
+12468,2012-06-08,2,1,6,16,0,5,1,1,0.76,0.6667,0.29,0.2985,111,412,523
+12469,2012-06-08,2,1,6,17,0,5,1,1,0.76,0.6667,0.29,0.194,114,679,793
+12470,2012-06-08,2,1,6,18,0,5,1,1,0.76,0.6667,0.27,0.2239,147,576,723
+12471,2012-06-08,2,1,6,19,0,5,1,1,0.74,0.6515,0.3,0.2239,95,460,555
+12472,2012-06-08,2,1,6,20,0,5,1,1,0.72,0.6515,0.32,0.1343,117,291,408
+12473,2012-06-08,2,1,6,21,0,5,1,1,0.7,0.6364,0.37,0.1343,83,256,339
+12474,2012-06-08,2,1,6,22,0,5,1,1,0.68,0.6364,0.41,0.1045,61,230,291
+12475,2012-06-08,2,1,6,23,0,5,1,1,0.64,0.6212,0.57,0.1642,52,163,215
+12476,2012-06-09,2,1,6,0,0,6,0,1,0.64,0.6212,0.53,0.1045,54,152,206
+12477,2012-06-09,2,1,6,1,0,6,0,1,0.64,0.6212,0.53,0.0896,34,89,123
+12478,2012-06-09,2,1,6,2,0,6,0,1,0.64,0.6212,0.5,0.2239,21,68,89
+12479,2012-06-09,2,1,6,3,0,6,0,1,0.62,0.6212,0.57,0,12,28,40
+12480,2012-06-09,2,1,6,4,0,6,0,1,0.62,0.6212,0.57,0,2,6,8
+12481,2012-06-09,2,1,6,5,0,6,0,1,0.6,0.6212,0.56,0.1045,4,15,19
+12482,2012-06-09,2,1,6,6,0,6,0,1,0.56,0.5303,0.64,0.1045,17,63,80
+12483,2012-06-09,2,1,6,7,0,6,0,1,0.58,0.5455,0.64,0.1045,26,61,87
+12484,2012-06-09,2,1,6,8,0,6,0,1,0.62,0.6061,0.61,0.1045,42,202,244
+12485,2012-06-09,2,1,6,9,0,6,0,1,0.64,0.6212,0.57,0.1642,99,241,340
+12486,2012-06-09,2,1,6,10,0,6,0,1,0.7,0.6515,0.51,0.1045,168,298,466
+12487,2012-06-09,2,1,6,11,0,6,0,1,0.76,0.6667,0.35,0,218,351,569
+12488,2012-06-09,2,1,6,12,0,6,0,1,0.8,0.697,0.29,0.2537,196,335,531
+12489,2012-06-09,2,1,6,13,0,6,0,1,0.82,0.697,0.24,0.2836,236,349,585
+12490,2012-06-09,2,1,6,14,0,6,0,1,0.82,0.697,0.23,0.194,239,354,593
+12491,2012-06-09,2,1,6,15,0,6,0,1,0.84,0.7121,0.2,0.2985,218,337,555
+12492,2012-06-09,2,1,6,16,0,6,0,1,0.82,0.697,0.24,0.194,219,315,534
+12493,2012-06-09,2,1,6,17,0,6,0,1,0.84,0.7121,0.24,0,238,257,495
+12494,2012-06-09,2,1,6,18,0,6,0,1,0.82,0.697,0.24,0.2239,175,276,451
+12495,2012-06-09,2,1,6,19,0,6,0,1,0.8,0.697,0.29,0.1642,128,232,360
+12496,2012-06-09,2,1,6,20,0,6,0,1,0.76,0.6667,0.35,0.2239,125,238,363
+12497,2012-06-09,2,1,6,21,0,6,0,1,0.74,0.6515,0.4,0.194,105,211,316
+12498,2012-06-09,2,1,6,22,0,6,0,1,0.7,0.6515,0.58,0.1642,63,151,214
+12499,2012-06-09,2,1,6,23,0,6,0,1,0.68,0.6364,0.61,0.1642,69,161,230
+12500,2012-06-10,2,1,6,0,0,0,0,1,0.66,0.6212,0.69,0.1045,33,125,158
+12501,2012-06-10,2,1,6,1,0,0,0,1,0.64,0.6061,0.73,0.1343,32,84,116
+12502,2012-06-10,2,1,6,2,0,0,0,1,0.64,0.6061,0.69,0.1343,17,56,73
+12503,2012-06-10,2,1,6,3,0,0,0,1,0.64,0.6061,0.73,0.1045,6,23,29
+12504,2012-06-10,2,1,6,4,0,0,0,1,0.62,0.5909,0.78,0.0896,1,9,10
+12505,2012-06-10,2,1,6,5,0,0,0,1,0.62,0.6061,0.69,0.0896,3,10,13
+12506,2012-06-10,2,1,6,6,0,0,0,1,0.62,0.5909,0.73,0,3,27,30
+12507,2012-06-10,2,1,6,7,0,0,0,1,0.6,0.5758,0.78,0.1045,21,41,62
+12508,2012-06-10,2,1,6,8,0,0,0,1,0.64,0.6061,0.73,0.0896,59,122,181
+12509,2012-06-10,2,1,6,9,0,0,0,1,0.7,0.6515,0.58,0,91,175,266
+12510,2012-06-10,2,1,6,10,0,0,0,1,0.76,0.6818,0.4,0.1045,148,265,413
+12511,2012-06-10,2,1,6,11,0,0,0,1,0.76,0.6818,0.45,0.0896,184,315,499
+12512,2012-06-10,2,1,6,12,0,0,0,1,0.82,0.7121,0.3,0,173,329,502
+12513,2012-06-10,2,1,6,13,0,0,0,1,0.84,0.7273,0.28,0.1343,204,288,492
+12514,2012-06-10,2,1,6,14,0,0,0,1,0.84,0.7273,0.3,0.1343,186,314,500
+12515,2012-06-10,2,1,6,15,0,0,0,1,0.84,0.7273,0.3,0.2239,159,316,475
+12516,2012-06-10,2,1,6,16,0,0,0,1,0.84,0.7424,0.36,0.2836,173,335,508
+12517,2012-06-10,2,1,6,17,0,0,0,1,0.82,0.7424,0.41,0.2836,184,337,521
+12518,2012-06-10,2,1,6,18,0,0,0,1,0.82,0.7273,0.38,0.2239,164,326,490
+12519,2012-06-10,2,1,6,19,0,0,0,1,0.82,0.7121,0.28,0.2836,115,260,375
+12520,2012-06-10,2,1,6,20,0,0,0,1,0.76,0.6818,0.48,0.1642,113,250,363
+12521,2012-06-10,2,1,6,21,0,0,0,1,0.72,0.6818,0.62,0.1642,83,170,253
+12522,2012-06-10,2,1,6,22,0,0,0,1,0.7,0.6515,0.65,0.1642,42,117,159
+12523,2012-06-10,2,1,6,23,0,0,0,1,0.72,0.6667,0.58,0.1045,30,80,110
+12524,2012-06-11,2,1,6,0,0,1,1,1,0.7,0.6515,0.65,0.1045,8,33,41
+12525,2012-06-11,2,1,6,1,0,1,1,1,0.66,0.6212,0.74,0.1642,9,15,24
+12526,2012-06-11,2,1,6,2,0,1,1,1,0.66,0.6212,0.74,0.1045,5,12,17
+12527,2012-06-11,2,1,6,3,0,1,1,1,0.64,0.5909,0.78,0.1045,2,4,6
+12528,2012-06-11,2,1,6,4,0,1,1,1,0.64,0.5909,0.78,0.1343,0,7,7
+12529,2012-06-11,2,1,6,5,0,1,1,1,0.64,0.5909,0.78,0.1343,1,36,37
+12530,2012-06-11,2,1,6,6,0,1,1,1,0.62,0.5758,0.83,0.1045,8,136,144
+12531,2012-06-11,2,1,6,7,0,1,1,1,0.64,0.5909,0.78,0.1045,25,478,503
+12532,2012-06-11,2,1,6,8,0,1,1,1,0.66,0.6212,0.74,0.1343,38,613,651
+12533,2012-06-11,2,1,6,9,0,1,1,1,0.7,0.6515,0.65,0.1343,57,259,316
+12534,2012-06-11,2,1,6,10,0,1,1,1,0.72,0.6818,0.62,0.1642,52,95,147
+12535,2012-06-11,2,1,6,11,0,1,1,2,0.76,0.6818,0.48,0.2239,59,143,202
+12536,2012-06-11,2,1,6,12,0,1,1,2,0.8,0.7273,0.43,0.194,63,198,261
+12537,2012-06-11,2,1,6,13,0,1,1,2,0.8,0.7121,0.41,0.2985,63,185,248
+12538,2012-06-11,2,1,6,14,0,1,1,2,0.82,0.7273,0.34,0.2836,66,193,259
+12539,2012-06-11,2,1,6,15,0,1,1,1,0.8,0.7121,0.41,0.3582,72,187,259
+12540,2012-06-11,2,1,6,16,0,1,1,2,0.8,0.7121,0.41,0.4179,87,340,427
+12541,2012-06-11,2,1,6,17,0,1,1,2,0.8,0.7121,0.41,0.3582,85,715,800
+12542,2012-06-11,2,1,6,18,0,1,1,2,0.8,0.7273,0.43,0.2985,105,726,831
+12543,2012-06-11,2,1,6,19,0,1,1,2,0.76,0.6818,0.48,0.3284,89,507,596
+12544,2012-06-11,2,1,6,20,0,1,1,2,0.74,0.6667,0.48,0.2239,61,333,394
+12545,2012-06-11,2,1,6,21,0,1,1,2,0.72,0.6667,0.58,0.194,32,218,250
+12546,2012-06-11,2,1,6,22,0,1,1,2,0.72,0.6667,0.58,0.194,22,142,164
+12547,2012-06-11,2,1,6,23,0,1,1,2,0.7,0.6515,0.58,0.2239,8,72,80
+12548,2012-06-12,2,1,6,0,0,2,1,2,0.7,0.6515,0.58,0.2239,2,26,28
+12549,2012-06-12,2,1,6,1,0,2,1,2,0.68,0.6364,0.65,0.1642,5,11,16
+12550,2012-06-12,2,1,6,2,0,2,1,2,0.68,0.6364,0.65,0.194,0,6,6
+12551,2012-06-12,2,1,6,3,0,2,1,2,0.66,0.6212,0.69,0.2537,3,9,12
+12552,2012-06-12,2,1,6,4,0,2,1,3,0.6,0.5455,0.88,0.1642,0,4,4
+12553,2012-06-12,2,1,6,5,0,2,1,3,0.6,0.5455,0.88,0.1343,1,28,29
+12554,2012-06-12,2,1,6,6,0,2,1,2,0.6,0.5455,0.88,0.1343,6,130,136
+12555,2012-06-12,2,1,6,7,0,2,1,3,0.62,0.5606,0.88,0.2537,16,287,303
+12556,2012-06-12,2,1,6,8,0,2,1,2,0.62,0.5606,0.88,0.2239,15,598,613
+12557,2012-06-12,2,1,6,9,0,2,1,2,0.64,0.5758,0.89,0.2836,31,254,285
+12558,2012-06-12,2,1,6,10,0,2,1,3,0.64,0.5758,0.89,0.2836,29,96,125
+12559,2012-06-12,2,1,6,11,0,2,1,3,0.66,0.6061,0.83,0.3582,8,57,65
+12560,2012-06-12,2,1,6,12,0,2,1,3,0.66,0.6061,0.83,0.2537,19,76,95
+12561,2012-06-12,2,1,6,13,0,2,1,3,0.66,0.6061,0.83,0.194,11,48,59
+12562,2012-06-12,2,1,6,14,0,2,1,3,0.64,0.5758,0.89,0.2537,12,48,60
+12563,2012-06-12,2,1,6,15,0,2,1,3,0.66,0.5909,0.89,0.194,22,94,116
+12564,2012-06-12,2,1,6,16,0,2,1,2,0.66,0.5909,0.89,0.2836,30,209,239
+12565,2012-06-12,2,1,6,17,0,2,1,2,0.66,0.5909,0.89,0.2836,56,625,681
+12566,2012-06-12,2,1,6,18,0,2,1,1,0.68,0.6364,0.83,0.194,57,596,653
+12567,2012-06-12,2,1,6,19,0,2,1,1,0.68,0.6364,0.83,0.194,47,444,491
+12568,2012-06-12,2,1,6,20,0,2,1,1,0.68,0.6364,0.83,0.194,36,350,386
+12569,2012-06-12,2,1,6,21,0,2,1,1,0.68,0.6364,0.83,0.1045,41,233,274
+12570,2012-06-12,2,1,6,22,0,2,1,1,0.66,0.5909,0.94,0.1045,19,188,207
+12571,2012-06-12,2,1,6,23,0,2,1,3,0.66,0.5909,0.94,0.2239,11,78,89
+12572,2012-06-13,2,1,6,0,0,3,1,2,0.66,0.5909,0.94,0.194,7,27,34
+12573,2012-06-13,2,1,6,1,0,3,1,1,0.64,0.5909,0.78,0,7,21,28
+12574,2012-06-13,2,1,6,2,0,3,1,1,0.62,0.5758,0.83,0.1343,0,4,4
+12575,2012-06-13,2,1,6,3,0,3,1,1,0.62,0.5758,0.83,0.194,1,7,8
+12576,2012-06-13,2,1,6,4,0,3,1,1,0.6,0.5606,0.83,0.1343,1,9,10
+12577,2012-06-13,2,1,6,5,0,3,1,1,0.6,0.5606,0.83,0.194,2,38,40
+12578,2012-06-13,2,1,6,6,0,3,1,1,0.6,0.5606,0.83,0.2239,6,188,194
+12579,2012-06-13,2,1,6,7,0,3,1,1,0.6,0.5606,0.83,0.2985,25,480,505
+12580,2012-06-13,2,1,6,8,0,3,1,2,0.62,0.5909,0.73,0.5224,41,672,713
+12581,2012-06-13,2,1,6,9,0,3,1,2,0.64,0.6061,0.69,0.5821,54,298,352
+12582,2012-06-13,2,1,6,10,0,3,1,2,0.64,0.6061,0.65,0.4179,64,134,198
+12583,2012-06-13,2,1,6,11,0,3,1,2,0.66,0.6212,0.54,0.4478,64,182,246
+12584,2012-06-13,2,1,6,12,0,3,1,2,0.68,0.6364,0.44,0.4179,93,241,334
+12585,2012-06-13,2,1,6,13,0,3,1,2,0.7,0.6364,0.45,0.3881,52,203,255
+12586,2012-06-13,2,1,6,14,0,3,1,2,0.72,0.6515,0.37,0.4179,98,222,320
+12587,2012-06-13,2,1,6,15,0,3,1,1,0.72,0.6515,0.32,0.4925,65,216,281
+12588,2012-06-13,2,1,6,16,0,3,1,1,0.72,0.6515,0.3,0.4925,72,320,392
+12589,2012-06-13,2,1,6,17,0,3,1,1,0.72,0.6515,0.32,0.4925,75,782,857
+12590,2012-06-13,2,1,6,18,0,3,1,1,0.72,0.6515,0.32,0.4478,104,640,744
+12591,2012-06-13,2,1,6,19,0,3,1,1,0.7,0.6364,0.34,0.3881,123,548,671
+12592,2012-06-13,2,1,6,20,0,3,1,1,0.68,0.6364,0.36,0.4179,86,362,448
+12593,2012-06-13,2,1,6,21,0,3,1,1,0.64,0.6212,0.44,0.3582,79,317,396
+12594,2012-06-13,2,1,6,22,0,3,1,1,0.62,0.6212,0.5,0.2985,29,209,238
+12595,2012-06-13,2,1,6,23,0,3,1,1,0.62,0.6212,0.5,0.2836,25,128,153
+12596,2012-06-14,2,1,6,0,0,4,1,1,0.6,0.6212,0.53,0.2985,3,45,48
+12597,2012-06-14,2,1,6,1,0,4,1,1,0.6,0.6212,0.56,0.2239,4,17,21
+12598,2012-06-14,2,1,6,2,0,4,1,2,0.62,0.6212,0.53,0.2537,1,8,9
+12599,2012-06-14,2,1,6,3,0,4,1,2,0.62,0.6212,0.53,0.2836,1,4,5
+12600,2012-06-14,2,1,6,4,0,4,1,2,0.6,0.6212,0.56,0.2537,0,6,6
+12601,2012-06-14,2,1,6,5,0,4,1,2,0.6,0.6212,0.56,0.2537,0,40,40
+12602,2012-06-14,2,1,6,6,0,4,1,1,0.58,0.5455,0.6,0.2836,12,169,181
+12603,2012-06-14,2,1,6,7,0,4,1,1,0.6,0.6061,0.6,0.2836,26,480,506
+12604,2012-06-14,2,1,6,8,0,4,1,1,0.6,0.6061,0.64,0.3284,37,682,719
+12605,2012-06-14,2,1,6,9,0,4,1,1,0.62,0.6061,0.61,0.2239,45,312,357
+12606,2012-06-14,2,1,6,10,0,4,1,1,0.64,0.6061,0.65,0.4179,47,132,179
+12607,2012-06-14,2,1,6,11,0,4,1,1,0.66,0.6212,0.57,0.2537,60,168,228
+12608,2012-06-14,2,1,6,12,0,4,1,2,0.66,0.6212,0.61,0.2239,55,210,265
+12609,2012-06-14,2,1,6,13,0,4,1,1,0.68,0.6364,0.57,0.1642,71,213,284
+12610,2012-06-14,2,1,6,14,0,4,1,1,0.7,0.6515,0.54,0.2836,84,214,298
+12611,2012-06-14,2,1,6,15,0,4,1,1,0.74,0.6667,0.48,0.2985,111,213,324
+12612,2012-06-14,2,1,6,16,0,4,1,1,0.72,0.6667,0.51,0.2836,70,368,438
+12613,2012-06-14,2,1,6,17,0,4,1,1,0.72,0.6667,0.48,0.2239,117,750,867
+12614,2012-06-14,2,1,6,18,0,4,1,1,0.74,0.6667,0.42,0.194,107,716,823
+12615,2012-06-14,2,1,6,19,0,4,1,1,0.7,0.6515,0.48,0.2537,96,483,579
+12616,2012-06-14,2,1,6,20,0,4,1,1,0.66,0.6212,0.65,0.2537,84,351,435
+12617,2012-06-14,2,1,6,21,0,4,1,1,0.64,0.6061,0.65,0.1642,67,270,337
+12618,2012-06-14,2,1,6,22,0,4,1,2,0.64,0.6061,0.65,0.2239,46,196,242
+12619,2012-06-14,2,1,6,23,0,4,1,2,0.62,0.6061,0.69,0.1642,36,136,172
+12620,2012-06-15,2,1,6,0,0,5,1,1,0.6,0.5909,0.73,0.1642,26,68,94
+12621,2012-06-15,2,1,6,1,0,5,1,1,0.58,0.5455,0.78,0.194,9,42,51
+12622,2012-06-15,2,1,6,2,0,5,1,1,0.56,0.5303,0.83,0.1045,0,15,15
+12623,2012-06-15,2,1,6,3,0,5,1,1,0.56,0.5303,0.83,0.1642,2,3,5
+12624,2012-06-15,2,1,6,4,0,5,1,1,0.56,0.5303,0.83,0.1642,1,13,14
+12625,2012-06-15,2,1,6,5,0,5,1,1,0.54,0.5152,0.88,0.1045,2,31,33
+12626,2012-06-15,2,1,6,6,0,5,1,1,0.54,0.5152,0.83,0.1343,9,142,151
+12627,2012-06-15,2,1,6,7,0,5,1,1,0.56,0.5303,0.83,0.1642,17,413,430
+12628,2012-06-15,2,1,6,8,0,5,1,1,0.6,0.5909,0.69,0.1642,44,609,653
+12629,2012-06-15,2,1,6,9,0,5,1,1,0.62,0.6061,0.65,0.1642,58,308,366
+12630,2012-06-15,2,1,6,10,0,5,1,1,0.64,0.6212,0.57,0.1642,56,160,216
+12631,2012-06-15,2,1,6,11,0,5,1,1,0.66,0.6212,0.5,0.194,106,180,286
+12632,2012-06-15,2,1,6,12,0,5,1,1,0.7,0.6364,0.45,0.1642,111,292,403
+12633,2012-06-15,2,1,6,13,0,5,1,1,0.7,0.6364,0.42,0.2537,121,272,393
+12634,2012-06-15,2,1,6,14,0,5,1,1,0.72,0.6515,0.39,0.194,121,255,376
+12635,2012-06-15,2,1,6,15,0,5,1,1,0.72,0.6515,0.39,0.194,107,270,377
+12636,2012-06-15,2,1,6,16,0,5,1,1,0.72,0.6515,0.37,0.194,132,426,558
+12637,2012-06-15,2,1,6,17,0,5,1,1,0.74,0.6515,0.37,0.2836,125,698,823
+12638,2012-06-15,2,1,6,18,0,5,1,1,0.72,0.6515,0.39,0.2239,121,572,693
+12639,2012-06-15,2,1,6,19,0,5,1,1,0.7,0.6364,0.42,0.1642,98,369,467
+12640,2012-06-15,2,1,6,20,0,5,1,1,0.68,0.6364,0.44,0.194,89,296,385
+12641,2012-06-15,2,1,6,21,0,5,1,1,0.66,0.6212,0.5,0.1045,83,250,333
+12642,2012-06-15,2,1,6,22,0,5,1,1,0.64,0.6212,0.53,0.1343,79,242,321
+12643,2012-06-15,2,1,6,23,0,5,1,1,0.62,0.6212,0.53,0.2537,46,176,222
+12644,2012-06-16,2,1,6,0,0,6,0,1,0.58,0.5455,0.56,0.1343,25,112,137
+12645,2012-06-16,2,1,6,1,0,6,0,1,0.56,0.5303,0.56,0,16,79,95
+12646,2012-06-16,2,1,6,2,0,6,0,1,0.58,0.5455,0.53,0,12,55,67
+12647,2012-06-16,2,1,6,3,0,6,0,1,0.56,0.5303,0.6,0.0896,8,19,27
+12648,2012-06-16,2,1,6,4,0,6,0,1,0.54,0.5152,0.68,0.1343,3,5,8
+12649,2012-06-16,2,1,6,5,0,6,0,1,0.54,0.5152,0.68,0.1045,2,21,23
+12650,2012-06-16,2,1,6,6,0,6,0,1,0.54,0.5152,0.68,0.194,11,36,47
+12651,2012-06-16,2,1,6,7,0,6,0,1,0.54,0.5152,0.68,0.194,10,68,78
+12652,2012-06-16,2,1,6,8,0,6,0,1,0.58,0.5455,0.64,0.1642,47,157,204
+12653,2012-06-16,2,1,6,9,0,6,0,1,0.6,0.6061,0.6,0.0896,88,279,367
+12654,2012-06-16,2,1,6,10,0,6,0,1,0.64,0.6212,0.47,0.194,139,296,435
+12655,2012-06-16,2,1,6,11,0,6,0,1,0.66,0.6212,0.39,0.1642,213,353,566
+12656,2012-06-16,2,1,6,12,0,6,0,1,0.68,0.6364,0.41,0.1045,254,349,603
+12657,2012-06-16,2,1,6,13,0,6,0,1,0.72,0.6515,0.34,0.1642,293,324,617
+12658,2012-06-16,2,1,6,14,0,6,0,1,0.72,0.6515,0.37,0.2239,264,309,573
+12659,2012-06-16,2,1,6,15,0,6,0,1,0.74,0.6515,0.35,0.2537,295,288,583
+12660,2012-06-16,2,1,6,16,0,6,0,2,0.72,0.6515,0.37,0.1045,238,304,542
+12661,2012-06-16,2,1,6,17,0,6,0,2,0.72,0.6515,0.37,0.2537,244,349,593
+12662,2012-06-16,2,1,6,18,0,6,0,2,0.72,0.6515,0.37,0.2239,256,315,571
+12663,2012-06-16,2,1,6,19,0,6,0,2,0.7,0.6364,0.42,0.194,184,277,461
+12664,2012-06-16,2,1,6,20,0,6,0,2,0.66,0.6212,0.47,0.194,123,229,352
+12665,2012-06-16,2,1,6,21,0,6,0,2,0.64,0.6212,0.5,0.2239,98,192,290
+12666,2012-06-16,2,1,6,22,0,6,0,1,0.62,0.6212,0.53,0.2985,91,189,280
+12667,2012-06-16,2,1,6,23,0,6,0,1,0.6,0.6212,0.53,0.2985,49,134,183
+12668,2012-06-17,2,1,6,0,0,0,0,1,0.56,0.5303,0.56,0.1343,31,117,148
+12669,2012-06-17,2,1,6,1,0,0,0,1,0.56,0.5303,0.6,0.1642,21,67,88
+12670,2012-06-17,2,1,6,2,0,0,0,1,0.56,0.5303,0.56,0.0896,19,55,74
+12671,2012-06-17,2,1,6,3,0,0,0,1,0.54,0.5152,0.64,0.1045,6,22,28
+12672,2012-06-17,2,1,6,4,0,0,0,1,0.54,0.5152,0.64,0.0896,2,16,18
+12673,2012-06-17,2,1,6,5,0,0,0,1,0.54,0.5152,0.68,0.1343,4,13,17
+12674,2012-06-17,2,1,6,6,0,0,0,1,0.52,0.5,0.72,0.1045,7,16,23
+12675,2012-06-17,2,1,6,7,0,0,0,1,0.52,0.5,0.72,0.0896,18,30,48
+12676,2012-06-17,2,1,6,8,0,0,0,1,0.56,0.5303,0.68,0.2537,24,95,119
+12677,2012-06-17,2,1,6,9,0,0,0,1,0.56,0.5303,0.68,0.1642,91,183,274
+12678,2012-06-17,2,1,6,10,0,0,0,1,0.58,0.5455,0.64,0,148,288,436
+12679,2012-06-17,2,1,6,11,0,0,0,1,0.62,0.6212,0.57,0.1045,196,350,546
+12680,2012-06-17,2,1,6,12,0,0,0,1,0.62,0.6212,0.57,0,260,355,615
+12681,2012-06-17,2,1,6,13,0,0,0,1,0.64,0.6212,0.53,0.1642,267,347,614
+12682,2012-06-17,2,1,6,14,0,0,0,1,0.64,0.6212,0.53,0,255,327,582
+12683,2012-06-17,2,1,6,15,0,0,0,1,0.66,0.6212,0.5,0.2239,203,260,463
+12684,2012-06-17,2,1,6,16,0,0,0,1,0.66,0.6212,0.54,0.2985,240,340,580
+12685,2012-06-17,2,1,6,17,0,0,0,1,0.64,0.6212,0.57,0.2239,232,361,593
+12686,2012-06-17,2,1,6,18,0,0,0,1,0.64,0.6212,0.53,0.194,203,310,513
+12687,2012-06-17,2,1,6,19,0,0,0,1,0.64,0.6212,0.53,0.1642,146,244,390
+12688,2012-06-17,2,1,6,20,0,0,0,1,0.62,0.6212,0.57,0.194,101,201,302
+12689,2012-06-17,2,1,6,21,0,0,0,1,0.62,0.6212,0.57,0.194,76,170,246
+12690,2012-06-17,2,1,6,22,0,0,0,1,0.6,0.6061,0.6,0.194,55,98,153
+12691,2012-06-17,2,1,6,23,0,0,0,1,0.58,0.5455,0.64,0.194,29,79,108
+12692,2012-06-18,2,1,6,0,0,1,1,1,0.56,0.5303,0.73,0.2537,12,28,40
+12693,2012-06-18,2,1,6,1,0,1,1,1,0.54,0.5152,0.77,0.2239,9,5,14
+12694,2012-06-18,2,1,6,2,0,1,1,1,0.54,0.5152,0.77,0.2239,3,6,9
+12695,2012-06-18,2,1,6,3,0,1,1,2,0.52,0.5,0.83,0.2239,1,3,4
+12696,2012-06-18,2,1,6,4,0,1,1,2,0.54,0.5152,0.77,0.194,0,9,9
+12697,2012-06-18,2,1,6,5,0,1,1,3,0.52,0.5,0.83,0.1343,1,22,23
+12698,2012-06-18,2,1,6,6,0,1,1,3,0.52,0.5,0.88,0.1343,0,37,37
+12699,2012-06-18,2,1,6,7,0,1,1,3,0.52,0.5,0.94,0.2239,10,135,145
+12700,2012-06-18,2,1,6,8,0,1,1,2,0.54,0.5152,0.83,0.194,27,447,474
+12701,2012-06-18,2,1,6,9,0,1,1,2,0.54,0.5152,0.83,0.2239,28,222,250
+12702,2012-06-18,2,1,6,10,0,1,1,3,0.54,0.5152,0.88,0.1045,13,78,91
+12703,2012-06-18,2,1,6,11,0,1,1,2,0.54,0.5152,0.88,0.1642,25,96,121
+12704,2012-06-18,2,1,6,12,0,1,1,3,0.56,0.5303,0.78,0.1343,29,139,168
+12705,2012-06-18,2,1,6,13,0,1,1,2,0.58,0.5455,0.73,0.1343,51,174,225
+12706,2012-06-18,2,1,6,14,0,1,1,2,0.58,0.5455,0.68,0.194,59,160,219
+12707,2012-06-18,2,1,6,15,0,1,1,2,0.6,0.5909,0.69,0.194,61,176,237
+12708,2012-06-18,2,1,6,16,0,1,1,2,0.6,0.5909,0.73,0.194,53,279,332
+12709,2012-06-18,2,1,6,17,0,1,1,2,0.6,0.5909,0.73,0.194,54,669,723
+12710,2012-06-18,2,1,6,18,0,1,1,2,0.62,0.6061,0.69,0.1642,47,595,642
+12711,2012-06-18,2,1,6,19,0,1,1,2,0.62,0.5909,0.73,0.0896,52,411,463
+12712,2012-06-18,2,1,6,20,0,1,1,2,0.62,0.5909,0.73,0.1343,45,317,362
+12713,2012-06-18,2,1,6,21,0,1,1,2,0.62,0.5909,0.73,0.1045,35,238,273
+12714,2012-06-18,2,1,6,22,0,1,1,2,0.62,0.5909,0.73,0.1642,21,143,164
+12715,2012-06-18,2,1,6,23,0,1,1,3,0.6,0.5758,0.78,0.194,17,57,74
+12716,2012-06-19,2,1,6,0,0,2,1,2,0.6,0.5758,0.78,0.1642,5,30,35
+12717,2012-06-19,2,1,6,1,0,2,1,2,0.6,0.5758,0.78,0.1642,1,13,14
+12718,2012-06-19,2,1,6,2,0,2,1,2,0.6,0.5758,0.78,0.1343,1,14,15
+12719,2012-06-19,2,1,6,3,0,2,1,2,0.6,0.5758,0.78,0.1045,1,7,8
+12720,2012-06-19,2,1,6,4,0,2,1,2,0.6,0.5758,0.78,0.1343,1,9,10
+12721,2012-06-19,2,1,6,5,0,2,1,2,0.6,0.5758,0.78,0.194,3,34,37
+12722,2012-06-19,2,1,6,6,0,2,1,2,0.6,0.5758,0.78,0.1045,9,152,161
+12723,2012-06-19,2,1,6,7,0,2,1,2,0.6,0.5758,0.78,0.1343,18,462,480
+12724,2012-06-19,2,1,6,8,0,2,1,2,0.6,0.5758,0.78,0.1642,40,633,673
+12725,2012-06-19,2,1,6,9,0,2,1,1,0.64,0.6061,0.73,0.0896,40,288,328
+12726,2012-06-19,2,1,6,10,0,2,1,1,0.64,0.6061,0.73,0.0896,44,136,180
+12727,2012-06-19,2,1,6,11,0,2,1,1,0.66,0.6212,0.69,0.1343,56,174,230
+12728,2012-06-19,2,1,6,12,0,2,1,1,0.7,0.6515,0.65,0.0896,70,222,292
+12729,2012-06-19,2,1,6,13,0,2,1,1,0.74,0.6818,0.58,0.1343,69,203,272
+12730,2012-06-19,2,1,6,14,0,2,1,1,0.76,0.7121,0.58,0.1045,61,166,227
+12731,2012-06-19,2,1,6,15,0,2,1,1,0.78,0.7424,0.59,0.1642,72,187,259
+12732,2012-06-19,2,1,6,16,0,2,1,1,0.8,0.7576,0.55,0.2239,48,286,334
+12733,2012-06-19,2,1,6,17,0,2,1,1,0.8,0.7576,0.55,0.194,86,725,811
+12734,2012-06-19,2,1,6,18,0,2,1,1,0.8,0.7576,0.55,0.1642,91,704,795
+12735,2012-06-19,2,1,6,19,0,2,1,1,0.8,0.7727,0.59,0.2239,82,432,514
+12736,2012-06-19,2,1,6,20,0,2,1,1,0.8,0.7576,0.55,0.2537,59,399,458
+12737,2012-06-19,2,1,6,21,0,2,1,1,0.76,0.7121,0.62,0.1642,37,239,276
+12738,2012-06-19,2,1,6,22,0,2,1,1,0.72,0.697,0.79,0.0896,51,240,291
+12739,2012-06-19,2,1,6,23,0,2,1,1,0.72,0.697,0.79,0.1343,23,102,125
+12740,2012-06-20,2,1,6,0,0,3,1,1,0.7,0.6667,0.79,0.194,5,48,53
+12741,2012-06-20,2,1,6,1,0,3,1,1,0.68,0.6364,0.79,0.194,7,20,27
+12742,2012-06-20,2,1,6,2,0,3,1,1,0.66,0.6061,0.83,0.1642,1,5,6
+12743,2012-06-20,2,1,6,3,0,3,1,1,0.66,0.6061,0.83,0.1343,2,6,8
+12744,2012-06-20,2,1,6,4,0,3,1,1,0.66,0.6061,0.83,0.0896,1,6,7
+12745,2012-06-20,2,1,6,5,0,3,1,1,0.66,0.6061,0.83,0.0896,2,37,39
+12746,2012-06-20,2,1,6,6,0,3,1,2,0.64,0.5758,0.89,0.0896,9,156,165
+12747,2012-06-20,2,1,6,7,0,3,1,2,0.66,0.5909,0.89,0,20,444,464
+12748,2012-06-20,2,1,6,8,0,3,1,2,0.7,0.6667,0.79,0.1045,43,600,643
+12749,2012-06-20,2,1,6,9,0,3,1,2,0.72,0.697,0.74,0,40,276,316
+12750,2012-06-20,2,1,6,10,0,3,1,1,0.82,0.7727,0.52,0.1343,41,126,167
+12751,2012-06-20,2,1,6,11,0,3,1,1,0.84,0.7879,0.49,0.194,54,134,188
+12752,2012-06-20,2,1,6,12,0,3,1,1,0.86,0.803,0.47,0.1642,30,150,180
+12753,2012-06-20,2,1,6,13,0,3,1,1,0.86,0.7879,0.44,0.1045,48,163,211
+12754,2012-06-20,2,1,6,14,0,3,1,1,0.88,0.803,0.39,0.194,56,134,190
+12755,2012-06-20,2,1,6,15,0,3,1,1,0.9,0.8182,0.35,0,36,143,179
+12756,2012-06-20,2,1,6,16,0,3,1,1,0.9,0.8182,0.35,0.1045,51,284,335
+12757,2012-06-20,2,1,6,17,0,3,1,1,0.9,0.8182,0.37,0,80,611,691
+12758,2012-06-20,2,1,6,18,0,3,1,1,0.9,0.8182,0.37,0,81,591,672
+12759,2012-06-20,2,1,6,19,0,3,1,1,0.88,0.7879,0.37,0.2537,90,449,539
+12760,2012-06-20,2,1,6,20,0,3,1,1,0.86,0.7727,0.39,0.2239,81,326,407
+12761,2012-06-20,2,1,6,21,0,3,1,1,0.84,0.7576,0.44,0.0896,42,302,344
+12762,2012-06-20,2,1,6,22,0,3,1,1,0.82,0.7576,0.46,0.1045,31,226,257
+12763,2012-06-20,2,1,6,23,0,3,1,1,0.78,0.7424,0.59,0.1045,21,102,123
+12764,2012-06-21,3,1,6,0,0,4,1,1,0.74,0.697,0.7,0.0896,16,53,69
+12765,2012-06-21,3,1,6,1,0,4,1,1,0.72,0.697,0.79,0.1343,2,16,18
+12766,2012-06-21,3,1,6,2,0,4,1,1,0.72,0.697,0.79,0.1045,11,7,18
+12767,2012-06-21,3,1,6,3,0,4,1,1,0.72,0.697,0.74,0,3,8,11
+12768,2012-06-21,3,1,6,4,0,4,1,1,0.7,0.6667,0.84,0.0896,1,11,12
+12769,2012-06-21,3,1,6,5,0,4,1,1,0.7,0.6667,0.84,0.1045,3,37,40
+12770,2012-06-21,3,1,6,6,0,4,1,2,0.7,0.6667,0.84,0.0896,3,147,150
+12771,2012-06-21,3,1,6,7,0,4,1,2,0.7,0.6667,0.84,0.0896,26,437,463
+12772,2012-06-21,3,1,6,8,0,4,1,1,0.74,0.697,0.7,0,27,563,590
+12773,2012-06-21,3,1,6,9,0,4,1,1,0.8,0.7576,0.55,0.194,32,224,256
+12774,2012-06-21,3,1,6,10,0,4,1,1,0.84,0.7879,0.49,0.1045,36,114,150
+12775,2012-06-21,3,1,6,11,0,4,1,1,0.86,0.7879,0.41,0.1343,40,121,161
+12776,2012-06-21,3,1,6,12,0,4,1,1,0.9,0.8182,0.35,0.194,33,182,215
+12777,2012-06-21,3,1,6,13,0,4,1,1,0.9,0.8182,0.35,0,31,141,172
+12778,2012-06-21,3,1,6,14,0,4,1,1,0.88,0.7879,0.37,0.1642,27,149,176
+12779,2012-06-21,3,1,6,15,0,4,1,1,0.92,0.8333,0.33,0.1642,40,154,194
+12780,2012-06-21,3,1,6,16,0,4,1,1,0.9,0.8182,0.35,0.1343,37,283,320
+12781,2012-06-21,3,1,6,17,0,4,1,1,0.88,0.7879,0.37,0,80,535,615
+12782,2012-06-21,3,1,6,18,0,4,1,1,0.86,0.7879,0.41,0.194,78,562,640
+12783,2012-06-21,3,1,6,19,0,4,1,1,0.88,0.803,0.39,0.2239,84,415,499
+12784,2012-06-21,3,1,6,20,0,4,1,1,0.86,0.7879,0.44,0.194,79,357,436
+12785,2012-06-21,3,1,6,21,0,4,1,1,0.82,0.7727,0.52,0.194,55,256,311
+12786,2012-06-21,3,1,6,22,0,4,1,1,0.8,0.7879,0.63,0.1642,22,221,243
+12787,2012-06-21,3,1,6,23,0,4,1,1,0.8,0.7727,0.59,0.0896,12,134,146
+12788,2012-06-22,3,1,6,0,0,5,1,2,0.76,0.7273,0.7,0.1045,16,77,93
+12789,2012-06-22,3,1,6,1,0,5,1,1,0.76,0.7424,0.75,0.1642,3,28,31
+12790,2012-06-22,3,1,6,2,0,5,1,1,0.74,0.7121,0.79,0,0,12,12
+12791,2012-06-22,3,1,6,3,0,5,1,1,0.74,0.7121,0.79,0,1,6,7
+12792,2012-06-22,3,1,6,4,0,5,1,2,0.76,0.7273,0.66,0,3,7,10
+12793,2012-06-22,3,1,6,5,0,5,1,2,0.74,0.697,0.66,0.0896,3,35,38
+12794,2012-06-22,3,1,6,6,0,5,1,2,0.74,0.697,0.66,0,4,118,122
+12795,2012-06-22,3,1,6,7,0,5,1,1,0.74,0.6818,0.58,0.1642,19,340,359
+12796,2012-06-22,3,1,6,8,0,5,1,1,0.76,0.697,0.55,0.1343,49,564,613
+12797,2012-06-22,3,1,6,9,0,5,1,1,0.76,0.7121,0.58,0.2239,55,278,333
+12798,2012-06-22,3,1,6,10,0,5,1,1,0.8,0.7424,0.52,0.3582,52,133,185
+12799,2012-06-22,3,1,6,11,0,5,1,1,0.8,0.7424,0.49,0.1642,76,160,236
+12800,2012-06-22,3,1,6,12,0,5,1,1,0.84,0.7879,0.49,0.1343,63,191,254
+12801,2012-06-22,3,1,6,13,0,5,1,1,0.88,0.803,0.39,0.2239,72,228,300
+12802,2012-06-22,3,1,6,14,0,5,1,1,0.88,0.803,0.39,0.2239,65,198,263
+12803,2012-06-22,3,1,6,15,0,5,1,1,0.9,0.8182,0.35,0.3582,72,212,284
+12804,2012-06-22,3,1,6,16,0,5,1,1,0.9,0.803,0.31,0.2239,67,313,380
+12805,2012-06-22,3,1,6,17,0,5,1,2,0.82,0.7273,0.38,0.1642,95,551,646
+12806,2012-06-22,3,1,6,18,0,5,1,2,0.8,0.7576,0.55,0.194,62,443,505
+12807,2012-06-22,3,1,6,19,0,5,1,2,0.76,0.697,0.55,0.5224,45,317,362
+12808,2012-06-22,3,1,6,20,0,5,1,2,0.76,0.697,0.55,0.5224,31,171,202
+12809,2012-06-22,3,1,6,21,0,5,1,3,0.68,0.6364,0.65,0.2836,40,177,217
+12810,2012-06-22,3,1,6,22,0,5,1,1,0.68,0.6364,0.69,0.1343,42,155,197
+12811,2012-06-22,3,1,6,23,0,5,1,2,0.66,0.6212,0.74,0,29,145,174
+12812,2012-06-23,3,1,6,0,0,6,0,1,0.64,0.5758,0.83,0,14,102,116
+12813,2012-06-23,3,1,6,1,0,6,0,1,0.66,0.6061,0.78,0.1343,16,107,123
+12814,2012-06-23,3,1,6,2,0,6,0,2,0.64,0.5758,0.83,0.194,18,75,93
+12815,2012-06-23,3,1,6,3,0,6,0,2,0.64,0.5758,0.83,0.0896,10,32,42
+12816,2012-06-23,3,1,6,4,0,6,0,2,0.64,0.5758,0.83,0.1642,6,13,19
+12817,2012-06-23,3,1,6,5,0,6,0,2,0.64,0.5758,0.83,0.194,1,11,12
+12818,2012-06-23,3,1,6,6,0,6,0,1,0.64,0.5758,0.83,0.0896,10,37,47
+12819,2012-06-23,3,1,6,7,0,6,0,1,0.64,0.5758,0.83,0.1642,19,59,78
+12820,2012-06-23,3,1,6,8,0,6,0,1,0.68,0.6364,0.69,0.2239,34,133,167
+12821,2012-06-23,3,1,6,9,0,6,0,1,0.72,0.6818,0.62,0.194,85,247,332
+12822,2012-06-23,3,1,6,10,0,6,0,1,0.74,0.6818,0.55,0.194,177,317,494
+12823,2012-06-23,3,1,6,11,0,6,0,1,0.76,0.6818,0.48,0.2239,167,349,516
+12824,2012-06-23,3,1,6,12,0,6,0,1,0.8,0.7273,0.43,0.194,203,399,602
+12825,2012-06-23,3,1,6,13,0,6,0,1,0.82,0.7273,0.34,0.194,226,338,564
+12826,2012-06-23,3,1,6,14,0,6,0,1,0.78,0.6818,0.33,0.2537,238,371,609
+12827,2012-06-23,3,1,6,15,0,6,0,1,0.84,0.7273,0.3,0.3284,217,266,483
+12828,2012-06-23,3,1,6,16,0,6,0,1,0.84,0.7121,0.26,0.2985,196,317,513
+12829,2012-06-23,3,1,6,17,0,6,0,1,0.82,0.7121,0.26,0.2537,214,316,530
+12830,2012-06-23,3,1,6,18,0,6,0,1,0.82,0.7121,0.26,0.2985,194,287,481
+12831,2012-06-23,3,1,6,19,0,6,0,1,0.8,0.697,0.27,0.194,185,240,425
+12832,2012-06-23,3,1,6,20,0,6,0,1,0.78,0.6818,0.31,0.194,158,228,386
+12833,2012-06-23,3,1,6,21,0,6,0,1,0.76,0.6667,0.33,0.1343,99,223,322
+12834,2012-06-23,3,1,6,22,0,6,0,1,0.74,0.6515,0.37,0.1045,101,178,279
+12835,2012-06-23,3,1,6,23,0,6,0,1,0.72,0.6515,0.44,0,69,156,225
+12836,2012-06-24,3,1,6,0,0,0,0,1,0.7,0.6515,0.51,0,31,132,163
+12837,2012-06-24,3,1,6,1,0,0,0,2,0.68,0.6364,0.54,0.0896,39,88,127
+12838,2012-06-24,3,1,6,2,0,0,0,2,0.68,0.6364,0.51,0,34,80,114
+12839,2012-06-24,3,1,6,3,0,0,0,2,0.66,0.6212,0.65,0.0896,19,41,60
+12840,2012-06-24,3,1,6,4,0,0,0,2,0.66,0.6212,0.65,0,4,9,13
+12841,2012-06-24,3,1,6,5,0,0,0,1,0.62,0.6061,0.69,0.1343,1,7,8
+12842,2012-06-24,3,1,6,6,0,0,0,1,0.64,0.6061,0.65,0.1642,6,19,25
+12843,2012-06-24,3,1,6,7,0,0,0,1,0.64,0.6061,0.69,0.1045,23,48,71
+12844,2012-06-24,3,1,6,8,0,0,0,1,0.7,0.6515,0.58,0,38,103,141
+12845,2012-06-24,3,1,6,9,0,0,0,1,0.72,0.6667,0.51,0,63,169,232
+12846,2012-06-24,3,1,6,10,0,0,0,1,0.76,0.6818,0.4,0,161,291,452
+12847,2012-06-24,3,1,6,11,0,0,0,1,0.76,0.6818,0.4,0.1343,215,291,506
+12848,2012-06-24,3,1,6,12,0,0,0,1,0.8,0.697,0.33,0.2239,227,329,556
+12849,2012-06-24,3,1,6,13,0,0,0,1,0.8,0.697,0.33,0.2836,237,348,585
+12850,2012-06-24,3,1,6,14,0,0,0,1,0.84,0.7273,0.3,0.2985,253,298,551
+12851,2012-06-24,3,1,6,15,0,0,0,1,0.86,0.7424,0.28,0.2239,197,290,487
+12852,2012-06-24,3,1,6,16,0,0,0,1,0.84,0.7273,0.3,0.2537,181,280,461
+12853,2012-06-24,3,1,6,17,0,0,0,1,0.84,0.7273,0.32,0.2537,203,329,532
+12854,2012-06-24,3,1,6,18,0,0,0,1,0.82,0.7273,0.34,0.2537,186,348,534
+12855,2012-06-24,3,1,6,19,0,0,0,1,0.8,0.7121,0.38,0.2985,176,247,423
+12856,2012-06-24,3,1,6,20,0,0,0,1,0.78,0.697,0.46,0.2537,108,224,332
+12857,2012-06-24,3,1,6,21,0,0,0,1,0.76,0.697,0.55,0.2239,76,177,253
+12858,2012-06-24,3,1,6,22,0,0,0,1,0.74,0.6818,0.55,0.1045,53,125,178
+12859,2012-06-24,3,1,6,23,0,0,0,1,0.74,0.6818,0.58,0.1045,20,67,87
+12860,2012-06-25,3,1,6,0,0,1,1,1,0.72,0.6818,0.62,0.2537,31,37,68
+12861,2012-06-25,3,1,6,1,0,1,1,1,0.7,0.6515,0.65,0.2239,11,20,31
+12862,2012-06-25,3,1,6,2,0,1,1,1,0.7,0.6515,0.7,0.2239,4,10,14
+12863,2012-06-25,3,1,6,3,0,1,1,1,0.68,0.6364,0.69,0.2537,1,4,5
+12864,2012-06-25,3,1,6,4,0,1,1,1,0.66,0.6061,0.78,0.2537,0,4,4
+12865,2012-06-25,3,1,6,5,0,1,1,1,0.66,0.6212,0.74,0.1642,1,42,43
+12866,2012-06-25,3,1,6,6,0,1,1,1,0.64,0.5909,0.78,0.1642,11,147,158
+12867,2012-06-25,3,1,6,7,0,1,1,1,0.66,0.6212,0.74,0.2239,30,455,485
+12868,2012-06-25,3,1,6,8,0,1,1,1,0.66,0.6212,0.69,0.194,53,555,608
+12869,2012-06-25,3,1,6,9,0,1,1,1,0.7,0.6515,0.65,0.194,37,231,268
+12870,2012-06-25,3,1,6,10,0,1,1,1,0.76,0.697,0.55,0.1642,58,126,184
+12871,2012-06-25,3,1,6,11,0,1,1,1,0.8,0.7273,0.43,0.3284,54,127,181
+12872,2012-06-25,3,1,6,12,0,1,1,2,0.76,0.6818,0.45,0.2985,74,196,270
+12873,2012-06-25,3,1,6,13,0,1,1,1,0.72,0.6667,0.51,0.2985,79,181,260
+12874,2012-06-25,3,1,6,14,0,1,1,1,0.78,0.697,0.43,0.2537,90,161,251
+12875,2012-06-25,3,1,6,15,0,1,1,1,0.78,0.6818,0.38,0.4627,65,168,233
+12876,2012-06-25,3,1,6,16,0,1,1,1,0.8,0.697,0.31,0.3881,104,310,414
+12877,2012-06-25,3,1,6,17,0,1,1,1,0.8,0.6818,0.24,0.4627,76,757,833
+12878,2012-06-25,3,1,6,18,0,1,1,1,0.76,0.6667,0.27,0.4478,100,691,791
+12879,2012-06-25,3,1,6,19,0,1,1,1,0.74,0.6515,0.27,0.4627,74,508,582
+12880,2012-06-25,3,1,6,20,0,1,1,1,0.72,0.6515,0.3,0.4478,100,375,475
+12881,2012-06-25,3,1,6,21,0,1,1,1,0.68,0.6212,0.3,0.3284,50,281,331
+12882,2012-06-25,3,1,6,22,0,1,1,1,0.66,0.6212,0.29,0.3881,21,166,187
+12883,2012-06-25,3,1,6,23,0,1,1,1,0.64,0.6212,0.33,0.3284,15,88,103
+12884,2012-06-26,3,1,6,0,0,2,1,1,0.62,0.6212,0.33,0.3284,8,29,37
+12885,2012-06-26,3,1,6,1,0,2,1,1,0.6,0.6212,0.35,0.3881,11,19,30
+12886,2012-06-26,3,1,6,2,0,2,1,1,0.56,0.5303,0.43,0.2836,2,7,9
+12887,2012-06-26,3,1,6,3,0,2,1,1,0.54,0.5152,0.45,0.2836,2,7,9
+12888,2012-06-26,3,1,6,4,0,2,1,1,0.54,0.5152,0.45,0.2985,1,9,10
+12889,2012-06-26,3,1,6,5,0,2,1,1,0.52,0.5,0.48,0.3881,3,36,39
+12890,2012-06-26,3,1,6,6,0,2,1,1,0.52,0.5,0.52,0.2985,12,176,188
+12891,2012-06-26,3,1,6,7,0,2,1,1,0.52,0.5,0.52,0.3582,21,531,552
+12892,2012-06-26,3,1,6,8,0,2,1,1,0.56,0.5303,0.49,0.3284,34,622,656
+12893,2012-06-26,3,1,6,9,0,2,1,1,0.58,0.5455,0.46,0.4179,30,298,328
+12894,2012-06-26,3,1,6,10,0,2,1,1,0.62,0.6212,0.41,0.3582,37,153,190
+12895,2012-06-26,3,1,6,11,0,2,1,1,0.64,0.6212,0.36,0.4179,46,189,235
+12896,2012-06-26,3,1,6,12,0,2,1,1,0.66,0.6212,0.34,0.4478,80,228,308
+12897,2012-06-26,3,1,6,13,0,2,1,1,0.7,0.6364,0.3,0.4627,74,232,306
+12898,2012-06-26,3,1,6,14,0,2,1,1,0.7,0.6364,0.3,0.4627,81,186,267
+12899,2012-06-26,3,1,6,15,0,2,1,1,0.72,0.6515,0.28,0.4627,88,229,317
+12900,2012-06-26,3,1,6,16,0,2,1,1,0.72,0.6364,0.26,0.4925,67,349,416
+12901,2012-06-26,3,1,6,17,0,2,1,1,0.72,0.6515,0.28,0.4179,104,796,900
+12902,2012-06-26,3,1,6,18,0,2,1,1,0.72,0.6515,0.28,0.3582,105,719,824
+12903,2012-06-26,3,1,6,19,0,2,1,1,0.72,0.6515,0.28,0.2239,83,529,612
+12904,2012-06-26,3,1,6,20,0,2,1,1,0.7,0.6364,0.3,0.2836,88,417,505
+12905,2012-06-26,3,1,6,21,0,2,1,1,0.68,0.6364,0.34,0.194,61,267,328
+12906,2012-06-26,3,1,6,22,0,2,1,1,0.66,0.6212,0.34,0.2537,26,219,245
+12907,2012-06-26,3,1,6,23,0,2,1,1,0.62,0.6212,0.41,0.1343,13,118,131
+12908,2012-06-27,3,1,6,0,0,3,1,1,0.62,0.6212,0.41,0.194,15,39,54
+12909,2012-06-27,3,1,6,1,0,3,1,1,0.6,0.6212,0.43,0.1642,5,18,23
+12910,2012-06-27,3,1,6,2,0,3,1,1,0.56,0.5303,0.49,0.1343,2,12,14
+12911,2012-06-27,3,1,6,3,0,3,1,1,0.58,0.5455,0.46,0.0896,1,7,8
+12912,2012-06-27,3,1,6,4,0,3,1,1,0.6,0.6212,0.4,0.2537,0,11,11
+12913,2012-06-27,3,1,6,5,0,3,1,1,0.6,0.6212,0.43,0.2836,2,34,36
+12914,2012-06-27,3,1,6,6,0,3,1,1,0.58,0.5455,0.46,0.2836,7,193,200
+12915,2012-06-27,3,1,6,7,0,3,1,1,0.6,0.6212,0.43,0.3582,29,498,527
+12916,2012-06-27,3,1,6,8,0,3,1,1,0.62,0.6212,0.43,0.3284,49,638,687
+12917,2012-06-27,3,1,6,9,0,3,1,1,0.66,0.6212,0.41,0.3582,29,253,282
+12918,2012-06-27,3,1,6,10,0,3,1,1,0.7,0.6364,0.37,0.2537,34,138,172
+12919,2012-06-27,3,1,6,11,0,3,1,1,0.74,0.6515,0.3,0.3284,64,145,209
+12920,2012-06-27,3,1,6,12,0,3,1,1,0.76,0.6667,0.29,0.2985,77,245,322
+12921,2012-06-27,3,1,6,13,0,3,1,1,0.8,0.697,0.27,0.3284,77,244,321
+12922,2012-06-27,3,1,6,14,0,3,1,1,0.8,0.6818,0.24,0.3582,78,192,270
+12923,2012-06-27,3,1,6,15,0,3,1,1,0.8,0.697,0.26,0.4627,61,188,249
+12924,2012-06-27,3,1,6,16,0,3,1,1,0.82,0.697,0.24,0.4478,55,337,392
+12925,2012-06-27,3,1,6,17,0,3,1,1,0.82,0.697,0.24,0.3881,113,730,843
+12926,2012-06-27,3,1,6,18,0,3,1,1,0.8,0.697,0.27,0.3284,85,719,804
+12927,2012-06-27,3,1,6,19,0,3,1,1,0.78,0.6818,0.29,0.3284,101,542,643
+12928,2012-06-27,3,1,6,20,0,3,1,1,0.76,0.6667,0.31,0.194,76,369,445
+12929,2012-06-27,3,1,6,21,0,3,1,1,0.74,0.6515,0.35,0.1343,54,313,367 +12930,2012-06-27,3,1,6,22,0,3,1,1,0.72,0.6515,0.39,0.0896,52,245,297 +12931,2012-06-27,3,1,6,23,0,3,1,1,0.68,0.6364,0.47,0.1343,11,148,159 +12932,2012-06-28,3,1,6,0,0,4,1,1,0.66,0.6212,0.5,0.1045,2,52,54 +12933,2012-06-28,3,1,6,1,0,4,1,1,0.66,0.6212,0.54,0,11,14,25 +12934,2012-06-28,3,1,6,2,0,4,1,1,0.66,0.6212,0.5,0.0896,1,12,13 +12935,2012-06-28,3,1,6,3,0,4,1,1,0.66,0.6212,0.5,0.0896,1,6,7 +12936,2012-06-28,3,1,6,4,0,4,1,1,0.62,0.6061,0.61,0.1045,0,11,11 +12937,2012-06-28,3,1,6,5,0,4,1,1,0.62,0.6212,0.57,0.0896,8,39,47 +12938,2012-06-28,3,1,6,6,0,4,1,1,0.62,0.6212,0.57,0.0896,8,185,193 +12939,2012-06-28,3,1,6,7,0,4,1,1,0.62,0.6061,0.61,0.1045,24,484,508 +12940,2012-06-28,3,1,6,8,0,4,1,1,0.66,0.6212,0.5,0.1343,40,577,617 +12941,2012-06-28,3,1,6,9,0,4,1,1,0.7,0.6515,0.54,0.1642,33,318,351 +12942,2012-06-28,3,1,6,10,0,4,1,1,0.74,0.6667,0.48,0.194,40,144,184 +12943,2012-06-28,3,1,6,11,0,4,1,1,0.8,0.697,0.31,0.194,41,170,211 +12944,2012-06-28,3,1,6,12,0,4,1,1,0.84,0.7121,0.26,0.2836,49,227,276 +12945,2012-06-28,3,1,6,13,0,4,1,1,0.86,0.7424,0.26,0.2239,31,205,236 +12946,2012-06-28,3,1,6,14,0,4,1,1,0.86,0.7273,0.25,0.3284,66,188,254 +12947,2012-06-28,3,1,6,15,0,4,1,1,0.86,0.7273,0.25,0.3284,24,195,219 +12948,2012-06-28,3,1,6,16,0,4,1,1,0.88,0.7424,0.23,0.194,56,331,387 +12949,2012-06-28,3,1,6,17,0,4,1,1,0.88,0.7424,0.22,0.194,71,634,705 +12950,2012-06-28,3,1,6,18,0,4,1,1,0.86,0.7273,0.25,0.2239,100,597,697 +12951,2012-06-28,3,1,6,19,0,4,1,1,0.84,0.7273,0.3,0.2985,79,492,571 +12952,2012-06-28,3,1,6,20,0,4,1,1,0.82,0.7273,0.34,0.194,71,399,470 +12953,2012-06-28,3,1,6,21,0,4,1,1,0.76,0.697,0.55,0.194,89,309,398 +12954,2012-06-28,3,1,6,22,0,4,1,1,0.76,0.697,0.52,0.1642,50,224,274 +12955,2012-06-28,3,1,6,23,0,4,1,1,0.74,0.6667,0.48,0.1343,26,145,171 +12956,2012-06-29,3,1,6,0,0,5,1,1,0.74,0.6667,0.48,0.1343,24,90,114 +12957,2012-06-29,3,1,6,1,0,5,1,1,0.74,0.6667,0.51,0.1642,9,33,42 
+12958,2012-06-29,3,1,6,2,0,5,1,1,0.72,0.6667,0.58,0.194,3,15,18 +12959,2012-06-29,3,1,6,3,0,5,1,1,0.72,0.6667,0.58,0.194,3,5,8 +12960,2012-06-29,3,1,6,4,0,5,1,1,0.7,0.6515,0.65,0.1642,1,9,10 +12961,2012-06-29,3,1,6,5,0,5,1,1,0.7,0.6515,0.61,0.1045,6,39,45 +12962,2012-06-29,3,1,6,6,0,5,1,2,0.7,0.6515,0.61,0.1642,15,148,163 +12963,2012-06-29,3,1,6,7,0,5,1,1,0.7,0.6515,0.61,0.1045,19,361,380 +12964,2012-06-29,3,1,6,8,0,5,1,2,0.74,0.6818,0.55,0.1045,48,576,624 +12965,2012-06-29,3,1,6,9,0,5,1,1,0.76,0.697,0.55,0.1642,39,261,300 +12966,2012-06-29,3,1,6,10,0,5,1,1,0.9,0.8333,0.39,0.2985,32,139,171 +12967,2012-06-29,3,1,6,11,0,5,1,2,0.9,0.8485,0.42,0.2836,51,137,188 +12968,2012-06-29,3,1,6,12,0,5,1,2,0.92,0.8788,0.4,0.2537,53,173,226 +12969,2012-06-29,3,1,6,13,0,5,1,1,0.94,0.8788,0.38,0.194,45,194,239 +12970,2012-06-29,3,1,6,14,0,5,1,1,0.96,0.9091,0.36,0.2239,49,184,233 +12971,2012-06-29,3,1,6,15,0,5,1,1,0.96,0.9091,0.36,0,47,183,230 +12972,2012-06-29,3,1,6,16,0,5,1,1,0.96,0.9091,0.36,0,59,292,351 +12973,2012-06-29,3,1,6,17,0,5,1,1,0.98,0.9242,0.34,0.194,82,457,539 +12974,2012-06-29,3,1,6,18,0,5,1,1,0.96,0.8636,0.31,0,50,414,464 +12975,2012-06-29,3,1,6,19,0,5,1,1,0.9,0.8788,0.47,0.194,60,303,363 +12976,2012-06-29,3,1,6,20,0,5,1,1,0.92,0.8939,0.42,0.2537,53,255,308 +12977,2012-06-29,3,1,6,21,0,5,1,2,0.86,0.8333,0.53,0.194,41,195,236 +12978,2012-06-29,3,1,6,22,0,5,1,3,0.82,0.8333,0.63,0.194,34,129,163 +12979,2012-06-29,3,1,6,23,0,5,1,3,0.82,0.8333,0.63,0.194,6,42,48 +12980,2012-06-30,3,1,6,0,0,6,0,3,0.64,0.5758,0.89,0.1642,4,65,69 +12981,2012-06-30,3,1,6,1,0,6,0,3,0.64,0.5758,0.89,0.2239,3,55,58 +12982,2012-06-30,3,1,6,2,0,6,0,2,0.64,0.5758,0.89,0,7,54,61 +12983,2012-06-30,3,1,6,3,0,6,0,2,0.64,0.5758,0.89,0,3,20,23 +12984,2012-06-30,3,1,6,4,0,6,0,2,0.62,0.5455,0.94,0,3,15,18 +12985,2012-06-30,3,1,6,5,0,6,0,1,0.64,0.5758,0.89,0.1642,3,7,10 +12986,2012-06-30,3,1,6,6,0,6,0,1,0.64,0.5758,0.89,0.1642,6,36,42 +12987,2012-06-30,3,1,6,7,0,6,0,1,0.64,0.5758,0.89,0.1642,10,82,92 
+12988,2012-06-30,3,1,6,8,0,6,0,1,0.64,0.5758,0.89,0.1642,26,168,194 +12989,2012-06-30,3,1,6,9,0,6,0,1,0.64,0.5758,0.89,0.1642,41,234,275 +12990,2012-06-30,3,1,6,10,0,6,0,1,0.88,0.7727,0.3,0.2537,96,308,404 +12991,2012-06-30,3,1,6,11,0,6,0,1,0.88,0.7727,0.3,0.2537,102,350,452 +12992,2012-06-30,3,1,6,12,0,6,0,1,0.88,0.7727,0.3,0.2537,143,328,471 +12993,2012-06-30,3,1,6,13,0,6,0,1,0.88,0.7727,0.3,0.2537,105,323,428 +12994,2012-06-30,3,1,6,14,0,6,0,1,0.88,0.7727,0.3,0.2537,114,295,409 +12995,2012-06-30,3,1,6,15,0,6,0,1,0.88,0.7727,0.3,0.2537,117,287,404 +12996,2012-06-30,3,1,6,16,0,6,0,1,0.9,0.7879,0.29,0.1642,109,264,373 +12997,2012-06-30,3,1,6,17,0,6,0,1,0.88,0.7727,0.32,0.1343,131,231,362 +12998,2012-06-30,3,1,6,18,0,6,0,1,0.88,0.7879,0.35,0,91,248,339 +12999,2012-06-30,3,1,6,19,0,6,0,1,0.84,0.7576,0.44,0.2537,134,240,374 +13000,2012-06-30,3,1,6,20,0,6,0,1,0.82,0.7727,0.52,0.1642,88,204,292 +13001,2012-06-30,3,1,6,21,0,6,0,1,0.82,0.7727,0.52,0.1642,48,165,213 +13002,2012-06-30,3,1,6,22,0,6,0,1,0.78,0.7424,0.62,0.1642,38,134,172 +13003,2012-06-30,3,1,6,23,0,6,0,1,0.78,0.7424,0.62,0.0896,33,119,152 +13004,2012-07-01,3,1,7,0,0,0,0,1,0.76,0.7273,0.66,0,27,122,149 +13005,2012-07-01,3,1,7,1,0,0,0,1,0.74,0.697,0.7,0.1343,12,81,93 +13006,2012-07-01,3,1,7,2,0,0,0,1,0.72,0.697,0.74,0.0896,21,69,90 +13007,2012-07-01,3,1,7,3,0,0,0,1,0.72,0.7121,0.84,0.1343,6,27,33 +13008,2012-07-01,3,1,7,4,0,0,0,1,0.7,0.6667,0.79,0.194,0,4,4 +13009,2012-07-01,3,1,7,5,0,0,0,1,0.68,0.6364,0.79,0.1045,3,7,10 +13010,2012-07-01,3,1,7,6,0,0,0,1,0.7,0.6667,0.79,0.0896,8,19,27 +13011,2012-07-01,3,1,7,7,0,0,0,1,0.74,0.697,0.7,0,13,37,50 +13012,2012-07-01,3,1,7,8,0,0,0,1,0.78,0.7424,0.62,0.1045,36,106,142 +13013,2012-07-01,3,1,7,9,0,0,0,1,0.82,0.7879,0.56,0,51,168,219 +13014,2012-07-01,3,1,7,10,0,0,0,1,0.86,0.7879,0.44,0.2985,98,268,366 +13015,2012-07-01,3,1,7,11,0,0,0,1,0.88,0.803,0.39,0.2239,121,256,377 +13016,2012-07-01,3,1,7,12,0,0,0,1,0.9,0.8182,0.37,0.2239,114,319,433 
+13017,2012-07-01,3,1,7,13,0,0,0,1,0.92,0.8333,0.33,0.2537,111,309,420 +13018,2012-07-01,3,1,7,14,0,0,0,1,0.92,0.8333,0.33,0.2537,98,346,444 +13019,2012-07-01,3,1,7,15,0,0,0,1,0.9,0.8182,0.37,0.2836,101,244,345 +13020,2012-07-01,3,1,7,16,0,0,0,1,0.92,0.8333,0.33,0.2985,85,228,313 +13021,2012-07-01,3,1,7,17,0,0,0,1,0.92,0.803,0.26,0.2985,90,323,413 +13022,2012-07-01,3,1,7,18,0,0,0,1,0.88,0.7727,0.32,0.3582,95,275,370 +13023,2012-07-01,3,1,7,19,0,0,0,1,0.84,0.7424,0.39,0.1343,103,279,382 +13024,2012-07-01,3,1,7,20,0,0,0,1,0.84,0.7424,0.39,0.1343,88,244,332 +13025,2012-07-01,3,1,7,21,0,0,0,1,0.84,0.7424,0.39,0.1343,77,181,258 +13026,2012-07-01,3,1,7,22,0,0,0,1,0.82,0.7424,0.43,0.1343,41,110,151 +13027,2012-07-01,3,1,7,23,0,0,0,3,0.78,0.7121,0.52,0.1642,22,88,110 +13028,2012-07-02,3,1,7,0,0,1,1,2,0.76,0.7121,0.58,0.2239,12,31,43 +13029,2012-07-02,3,1,7,1,0,1,1,2,0.76,0.7121,0.58,0.194,3,14,17 +13030,2012-07-02,3,1,7,2,0,1,1,2,0.74,0.6667,0.51,0.2537,1,14,15 +13031,2012-07-02,3,1,7,3,0,1,1,2,0.72,0.6667,0.54,0.4179,0,5,5 +13032,2012-07-02,3,1,7,4,0,1,1,2,0.72,0.6667,0.54,0.2985,2,10,12 +13033,2012-07-02,3,1,7,5,0,1,1,2,0.72,0.6667,0.51,0,3,37,40 +13034,2012-07-02,3,1,7,6,0,1,1,1,0.7,0.6515,0.58,0.1045,4,132,136 +13035,2012-07-02,3,1,7,7,0,1,1,1,0.74,0.6667,0.51,0.1045,8,390,398 +13036,2012-07-02,3,1,7,8,0,1,1,1,0.76,0.697,0.52,0.194,20,548,568 +13037,2012-07-02,3,1,7,9,0,1,1,1,0.78,0.697,0.46,0.2537,56,239,295 +13038,2012-07-02,3,1,7,10,0,1,1,1,0.8,0.7273,0.43,0.2537,70,104,174 +13039,2012-07-02,3,1,7,11,0,1,1,1,0.84,0.7273,0.28,0.2985,67,134,201 +13040,2012-07-02,3,1,7,12,0,1,1,1,0.84,0.7273,0.32,0.194,74,183,257 +13041,2012-07-02,3,1,7,13,0,1,1,1,0.86,0.7576,0.34,0.2239,74,162,236 +13042,2012-07-02,3,1,7,14,0,1,1,1,0.86,0.7576,0.34,0.2239,79,151,230 +13043,2012-07-02,3,1,7,15,0,1,1,1,0.84,0.7424,0.36,0.194,59,175,234 +13044,2012-07-02,3,1,7,16,0,1,1,1,0.84,0.7424,0.36,0.2836,61,304,365 +13045,2012-07-02,3,1,7,17,0,1,1,1,0.84,0.7273,0.32,0.1343,82,665,747 
+13046,2012-07-02,3,1,7,18,0,1,1,1,0.84,0.7273,0.32,0.1642,72,658,730 +13047,2012-07-02,3,1,7,19,0,1,1,1,0.78,0.697,0.43,0.1343,71,510,581 +13048,2012-07-02,3,1,7,20,0,1,1,1,0.78,0.697,0.43,0.1343,38,357,395 +13049,2012-07-02,3,1,7,21,0,1,1,1,0.76,0.6818,0.45,0.1343,17,241,258 +13050,2012-07-02,3,1,7,22,0,1,1,1,0.74,0.6667,0.51,0.1642,19,169,188 +13051,2012-07-02,3,1,7,23,0,1,1,1,0.74,0.6667,0.51,0.1045,12,90,102 +13052,2012-07-03,3,1,7,0,0,2,1,1,0.72,0.6667,0.54,0,9,44,53 +13053,2012-07-03,3,1,7,1,0,2,1,1,0.72,0.6667,0.58,0,5,9,14 +13054,2012-07-03,3,1,7,2,0,2,1,1,0.72,0.6667,0.58,0,2,8,10 +13055,2012-07-03,3,1,7,3,0,2,1,1,0.7,0.6515,0.65,0.1343,0,2,2 +13056,2012-07-03,3,1,7,4,0,2,1,1,0.68,0.6364,0.65,0.1045,0,6,6 +13057,2012-07-03,3,1,7,5,0,2,1,1,0.7,0.6515,0.61,0,2,33,35 +13058,2012-07-03,3,1,7,6,0,2,1,1,0.7,0.6515,0.65,0.1343,5,149,154 +13059,2012-07-03,3,1,7,7,0,2,1,1,0.74,0.6818,0.58,0,21,462,483 +13060,2012-07-03,3,1,7,8,0,2,1,1,0.74,0.6818,0.62,0.0896,42,604,646 +13061,2012-07-03,3,1,7,9,0,2,1,1,0.8,0.7424,0.49,0.1642,46,226,272 +13062,2012-07-03,3,1,7,10,0,2,1,1,0.82,0.7424,0.43,0.2239,56,153,209 +13063,2012-07-03,3,1,7,11,0,2,1,1,0.86,0.7576,0.34,0.194,66,151,217 +13064,2012-07-03,3,1,7,12,0,2,1,1,0.88,0.7576,0.28,0.194,74,198,272 +13065,2012-07-03,3,1,7,13,0,2,1,1,0.9,0.7727,0.25,0.1343,80,203,283 +13066,2012-07-03,3,1,7,14,0,2,1,1,0.9,0.7727,0.25,0.194,69,241,310 +13067,2012-07-03,3,1,7,15,0,2,1,2,0.9,0.7727,0.24,0,71,330,401 +13068,2012-07-03,3,1,7,16,0,2,1,2,0.9,0.7727,0.24,0.194,90,437,527 +13069,2012-07-03,3,1,7,17,0,2,1,2,0.86,0.7424,0.3,0.1642,102,620,722 +13070,2012-07-03,3,1,7,18,0,2,1,2,0.86,0.7424,0.3,0,85,542,627 +13071,2012-07-03,3,1,7,19,0,2,1,2,0.84,0.7424,0.36,0.1343,66,431,497 +13072,2012-07-03,3,1,7,20,0,2,1,3,0.76,0.697,0.55,0.4925,80,296,376 +13073,2012-07-03,3,1,7,21,0,2,1,2,0.7,0.6515,0.7,0.2836,36,160,196 +13074,2012-07-03,3,1,7,22,0,2,1,1,0.68,0.6364,0.79,0.194,26,151,177 
+13075,2012-07-03,3,1,7,23,0,2,1,1,0.66,0.6061,0.83,0,19,152,171 +13076,2012-07-04,3,1,7,0,1,3,0,1,0.68,0.6364,0.79,0.0896,19,140,159 +13077,2012-07-04,3,1,7,1,1,3,0,1,0.68,0.6364,0.74,0,27,96,123 +13078,2012-07-04,3,1,7,2,1,3,0,1,0.66,0.6061,0.83,0.1343,27,66,93 +13079,2012-07-04,3,1,7,3,1,3,0,1,0.68,0.6364,0.74,0.194,9,23,32 +13080,2012-07-04,3,1,7,4,1,3,0,1,0.68,0.6364,0.69,0.2537,5,11,16 +13081,2012-07-04,3,1,7,5,1,3,0,1,0.66,0.6212,0.69,0,5,14,19 +13082,2012-07-04,3,1,7,6,1,3,0,1,0.66,0.6212,0.69,0,9,23,32 +13083,2012-07-04,3,1,7,7,1,3,0,1,0.68,0.6364,0.65,0,10,62,72 +13084,2012-07-04,3,1,7,8,1,3,0,1,0.7,0.6515,0.61,0.1045,43,110,153 +13085,2012-07-04,3,1,7,9,1,3,0,1,0.76,0.697,0.52,0,90,203,293 +13086,2012-07-04,3,1,7,10,1,3,0,1,0.8,0.7273,0.46,0,143,304,447 +13087,2012-07-04,3,1,7,11,1,3,0,1,0.82,0.7576,0.46,0.1343,164,321,485 +13088,2012-07-04,3,1,7,12,1,3,0,1,0.86,0.7879,0.41,0.2239,164,330,494 +13089,2012-07-04,3,1,7,13,1,3,0,1,0.9,0.8182,0.35,0,177,322,499 +13090,2012-07-04,3,1,7,14,1,3,0,1,0.9,0.8182,0.37,0.1642,190,357,547 +13091,2012-07-04,3,1,7,15,1,3,0,1,0.92,0.8485,0.35,0.2985,155,299,454 +13092,2012-07-04,3,1,7,16,1,3,0,1,0.92,0.8485,0.35,0.2537,163,226,389 +13093,2012-07-04,3,1,7,17,1,3,0,1,0.92,0.8485,0.35,0.2985,161,253,414 +13094,2012-07-04,3,1,7,18,1,3,0,1,0.9,0.8485,0.42,0.194,159,271,430 +13095,2012-07-04,3,1,7,19,1,3,0,1,0.86,0.803,0.47,0.2239,177,255,432 +13096,2012-07-04,3,1,7,20,1,3,0,1,0.86,0.803,0.47,0.2239,237,314,551 +13097,2012-07-04,3,1,7,21,1,3,0,1,0.84,0.803,0.53,0.1343,222,362,584 +13098,2012-07-04,3,1,7,22,1,3,0,1,0.82,0.7879,0.56,0.2239,175,327,502 +13099,2012-07-04,3,1,7,23,1,3,0,2,0.78,0.697,0.43,0.0896,31,152,183 +13100,2012-07-05,3,1,7,0,0,4,1,1,0.74,0.6667,0.51,0.1045,17,71,88 +13101,2012-07-05,3,1,7,1,0,4,1,1,0.74,0.6667,0.51,0.2836,6,24,30 +13102,2012-07-05,3,1,7,2,0,4,1,1,0.74,0.6667,0.51,0,4,14,18 +13103,2012-07-05,3,1,7,3,0,4,1,1,0.72,0.6667,0.54,0.194,2,5,7 +13104,2012-07-05,3,1,7,4,0,4,1,1,0.72,0.6667,0.58,0,0,7,7 
+13105,2012-07-05,3,1,7,5,0,4,1,1,0.72,0.6818,0.62,0,1,28,29 +13106,2012-07-05,3,1,7,6,0,4,1,1,0.72,0.6667,0.58,0,3,130,133 +13107,2012-07-05,3,1,7,7,0,4,1,1,0.76,0.697,0.55,0,27,316,343 +13108,2012-07-05,3,1,7,8,0,4,1,1,0.8,0.7424,0.49,0.1343,36,514,550 +13109,2012-07-05,3,1,7,9,0,4,1,1,0.84,0.7879,0.49,0.2985,47,246,293 +13110,2012-07-05,3,1,7,10,0,4,1,1,0.86,0.7879,0.44,0.3582,69,117,186 +13111,2012-07-05,3,1,7,11,0,4,1,1,0.9,0.8182,0.35,0.2985,107,137,244 +13112,2012-07-05,3,1,7,12,0,4,1,1,0.9,0.8182,0.35,0.3582,88,188,276 +13113,2012-07-05,3,1,7,13,0,4,1,1,0.92,0.8333,0.33,0.2239,129,181,310 +13114,2012-07-05,3,1,7,14,0,4,1,1,0.92,0.8333,0.33,0.3284,87,154,241 +13115,2012-07-05,3,1,7,15,0,4,1,1,0.92,0.8333,0.33,0.3582,71,183,254 +13116,2012-07-05,3,1,7,16,0,4,1,1,0.92,0.8333,0.33,0.2537,109,324,433 +13117,2012-07-05,3,1,7,17,0,4,1,1,0.92,0.8485,0.35,0.1642,114,575,689 +13118,2012-07-05,3,1,7,18,0,4,1,2,0.9,0.8333,0.39,0.3284,96,511,607 +13119,2012-07-05,3,1,7,19,0,4,1,2,0.88,0.803,0.39,0.2537,128,375,503 +13120,2012-07-05,3,1,7,20,0,4,1,2,0.86,0.7879,0.41,0.2239,78,243,321 +13121,2012-07-05,3,1,7,21,0,4,1,1,0.84,0.7727,0.47,0.194,91,232,323 +13122,2012-07-05,3,1,7,22,0,4,1,1,0.82,0.803,0.59,0.1343,63,162,225 +13123,2012-07-05,3,1,7,23,0,4,1,1,0.8,0.7576,0.55,0.1642,32,99,131 +13124,2012-07-06,3,1,7,0,0,5,1,1,0.78,0.7424,0.62,0.1343,39,63,102 +13125,2012-07-06,3,1,7,1,0,5,1,1,0.78,0.7273,0.55,0.2239,7,22,29 +13126,2012-07-06,3,1,7,2,0,5,1,1,0.76,0.7121,0.62,0.1343,4,13,17 +13127,2012-07-06,3,1,7,3,0,5,1,2,0.74,0.697,0.7,0.194,1,6,7 +13128,2012-07-06,3,1,7,4,0,5,1,1,0.74,0.697,0.66,0.2239,2,4,6 +13129,2012-07-06,3,1,7,5,0,5,1,1,0.74,0.697,0.66,0.194,4,31,35 +13130,2012-07-06,3,1,7,6,0,5,1,1,0.74,0.697,0.66,0.194,4,127,131 +13131,2012-07-06,3,1,7,7,0,5,1,1,0.78,0.697,0.46,0.1343,20,333,353 +13132,2012-07-06,3,1,7,8,0,5,1,1,0.8,0.7121,0.41,0.1642,28,557,585 +13133,2012-07-06,3,1,7,9,0,5,1,1,0.82,0.7273,0.36,0.1343,55,249,304 
+13134,2012-07-06,3,1,7,10,0,5,1,1,0.86,0.7424,0.3,0,80,130,210 +13135,2012-07-06,3,1,7,11,0,5,1,1,0.84,0.7273,0.32,0.0896,92,141,233 +13136,2012-07-06,3,1,7,12,0,5,1,1,0.88,0.7576,0.28,0.1045,111,220,331 +13137,2012-07-06,3,1,7,13,0,5,1,1,0.9,0.7879,0.27,0.1045,99,194,293 +13138,2012-07-06,3,1,7,14,0,5,1,1,0.9,0.803,0.31,0.1642,91,184,275 +13139,2012-07-06,3,1,7,15,0,5,1,1,0.92,0.8182,0.29,0.194,79,206,285 +13140,2012-07-06,3,1,7,16,0,5,1,1,0.92,0.8182,0.29,0.1045,69,300,369 +13141,2012-07-06,3,1,7,17,0,5,1,1,0.92,0.8182,0.29,0,90,486,576 +13142,2012-07-06,3,1,7,18,0,5,1,1,0.9,0.803,0.33,0.2537,106,454,560 +13143,2012-07-06,3,1,7,19,0,5,1,1,0.88,0.7879,0.35,0.2239,112,306,418 +13144,2012-07-06,3,1,7,20,0,5,1,1,0.86,0.7727,0.39,0.1045,91,245,336 +13145,2012-07-06,3,1,7,21,0,5,1,1,0.82,0.7727,0.52,0.1343,77,223,300 +13146,2012-07-06,3,1,7,22,0,5,1,1,0.8,0.7576,0.55,0.194,78,210,288 +13147,2012-07-06,3,1,7,23,0,5,1,1,0.8,0.7879,0.63,0.1045,27,137,164 +13148,2012-07-07,3,1,7,0,0,6,0,1,0.78,0.7576,0.66,0.1343,35,115,150 +13149,2012-07-07,3,1,7,1,0,6,0,1,0.76,0.7273,0.7,0.1642,16,55,71 +13150,2012-07-07,3,1,7,2,0,6,0,1,0.76,0.7424,0.72,0.1642,11,47,58 +13151,2012-07-07,3,1,7,3,0,6,0,1,0.74,0.7121,0.74,0.1343,12,19,31 +13152,2012-07-07,3,1,7,4,0,6,0,1,0.74,0.7121,0.74,0.1642,3,13,16 +13153,2012-07-07,3,1,7,5,0,6,0,1,0.74,0.7121,0.74,0.194,2,12,14 +13154,2012-07-07,3,1,7,6,0,6,0,1,0.74,0.7121,0.74,0.1045,4,27,31 +13155,2012-07-07,3,1,7,7,0,6,0,1,0.78,0.7576,0.66,0.1045,8,63,71 +13156,2012-07-07,3,1,7,8,0,6,0,1,0.82,0.8333,0.63,0.194,32,120,152 +13157,2012-07-07,3,1,7,9,0,6,0,1,0.84,0.8333,0.59,0.1642,69,187,256 +13158,2012-07-07,3,1,7,10,0,6,0,1,0.92,0.8636,0.37,0.2239,100,225,325 +13159,2012-07-07,3,1,7,11,0,6,0,1,0.94,0.8788,0.38,0.194,122,245,367 +13160,2012-07-07,3,1,7,12,0,6,0,1,0.96,0.8636,0.31,0.3582,124,218,342 +13161,2012-07-07,3,1,7,13,0,6,0,2,0.96,0.8636,0.31,0.2537,116,244,360 +13162,2012-07-07,3,1,7,14,0,6,0,2,0.96,0.8636,0.3,0.1343,105,203,308 
+13163,2012-07-07,3,1,7,15,0,6,0,1,0.96,0.8485,0.26,0,113,193,306 +13164,2012-07-07,3,1,7,16,0,6,0,1,1,0.8636,0.19,0.1642,102,192,294 +13165,2012-07-07,3,1,7,17,0,6,0,1,0.96,0.8485,0.26,0.1343,103,176,279 +13166,2012-07-07,3,1,7,18,0,6,0,1,0.94,0.8333,0.29,0.0896,83,194,277 +13167,2012-07-07,3,1,7,19,0,6,0,1,0.92,0.8333,0.33,0.1343,68,219,287 +13168,2012-07-07,3,1,7,20,0,6,0,1,0.9,0.8182,0.37,0.1642,79,197,276 +13169,2012-07-07,3,1,7,21,0,6,0,1,0.88,0.7879,0.37,0.194,51,150,201 +13170,2012-07-07,3,1,7,22,0,6,0,1,0.84,0.8333,0.59,0.194,56,164,220 +13171,2012-07-07,3,1,7,23,0,6,0,1,0.84,0.8182,0.56,0.1642,34,114,148 +13172,2012-07-08,3,1,7,0,0,0,0,1,0.82,0.7879,0.56,0.1343,22,125,147 +13173,2012-07-08,3,1,7,1,0,0,0,1,0.82,0.7879,0.56,0.1045,25,99,124 +13174,2012-07-08,3,1,7,2,0,0,0,1,0.82,0.7727,0.52,0.1045,11,59,70 +13175,2012-07-08,3,1,7,3,0,0,0,1,0.8,0.7727,0.59,0.1045,1,34,35 +13176,2012-07-08,3,1,7,4,0,0,0,1,0.78,0.7576,0.66,0.1045,2,11,13 +13177,2012-07-08,3,1,7,5,0,0,0,1,0.78,0.7424,0.62,0,1,5,6 +13178,2012-07-08,3,1,7,6,0,0,0,1,0.78,0.7727,0.7,0.1343,5,16,21 +13179,2012-07-08,3,1,7,7,0,0,0,1,0.8,0.803,0.66,0,19,33,52 +13180,2012-07-08,3,1,7,8,0,0,0,1,0.84,0.803,0.53,0.1343,35,95,130 +13181,2012-07-08,3,1,7,9,0,0,0,1,0.86,0.8182,0.5,0.1045,70,172,242 +13182,2012-07-08,3,1,7,10,0,0,0,1,0.9,0.8636,0.45,0,85,234,319 +13183,2012-07-08,3,1,7,11,0,0,0,1,0.92,0.8939,0.42,0.2985,120,269,389 +13184,2012-07-08,3,1,7,12,0,0,0,1,0.92,0.8939,0.42,0.194,105,271,376 +13185,2012-07-08,3,1,7,13,0,0,0,1,0.94,0.8788,0.38,0.2537,118,219,337 +13186,2012-07-08,3,1,7,14,0,0,0,1,0.96,0.9091,0.36,0.1642,77,235,312 +13187,2012-07-08,3,1,7,15,0,0,0,3,0.92,0.8939,0.42,0.2836,80,218,298 +13188,2012-07-08,3,1,7,16,0,0,0,2,0.8,0.803,0.66,0.2239,65,226,291 +13189,2012-07-08,3,1,7,17,0,0,0,1,0.78,0.7424,0.59,0,68,202,270 +13190,2012-07-08,3,1,7,18,0,0,0,1,0.78,0.7424,0.62,0.1045,54,199,253 +13191,2012-07-08,3,1,7,19,0,0,0,1,0.78,0.7424,0.62,0.1642,66,184,250 
+13192,2012-07-08,3,1,7,20,0,0,0,1,0.76,0.7273,0.7,0.0896,79,206,285 +13193,2012-07-08,3,1,7,21,0,0,0,1,0.76,0.7273,0.7,0.1343,58,149,207 +13194,2012-07-08,3,1,7,22,0,0,0,3,0.74,0.697,0.7,0.0896,12,110,122 +13195,2012-07-08,3,1,7,23,0,0,0,3,0.68,0.6364,0.83,0.0896,25,98,123 +13196,2012-07-09,3,1,7,0,0,1,1,2,0.68,0.6364,0.89,0.0896,6,33,39 +13197,2012-07-09,3,1,7,1,0,1,1,3,0.7,0.6667,0.89,0.1045,5,13,18 +13198,2012-07-09,3,1,7,2,0,1,1,3,0.7,0.6667,0.89,0.194,1,4,5 +13199,2012-07-09,3,1,7,3,0,1,1,3,0.7,0.6667,0.89,0.194,0,5,5 +13200,2012-07-09,3,1,7,4,0,1,1,3,0.66,0.5909,0.89,0.3881,1,1,2 +13201,2012-07-09,3,1,7,5,0,1,1,2,0.66,0.6061,0.83,0.2239,1,27,28 +13202,2012-07-09,3,1,7,6,0,1,1,3,0.64,0.5758,0.89,0.2537,6,86,92 +13203,2012-07-09,3,1,7,7,0,1,1,3,0.66,0.6061,0.83,0.2239,7,223,230 +13204,2012-07-09,3,1,7,8,0,1,1,2,0.66,0.6061,0.83,0.1642,26,429,455 +13205,2012-07-09,3,1,7,9,0,1,1,2,0.66,0.6061,0.83,0.1642,38,325,363 +13206,2012-07-09,3,1,7,10,0,1,1,2,0.66,0.6061,0.83,0.194,58,117,175 +13207,2012-07-09,3,1,7,11,0,1,1,2,0.72,0.6818,0.62,0.1343,72,144,216 +13208,2012-07-09,3,1,7,12,0,1,1,2,0.74,0.6818,0.58,0.1343,61,205,266 +13209,2012-07-09,3,1,7,13,0,1,1,2,0.76,0.697,0.55,0.2537,67,188,255 +13210,2012-07-09,3,1,7,14,0,1,1,2,0.74,0.6818,0.55,0.2537,58,184,242 +13211,2012-07-09,3,1,7,15,0,1,1,2,0.76,0.6818,0.45,0.2239,66,214,280 +13212,2012-07-09,3,1,7,16,0,1,1,1,0.78,0.697,0.4,0.2836,88,325,413 +13213,2012-07-09,3,1,7,17,0,1,1,1,0.78,0.697,0.4,0.1642,108,741,849 +13214,2012-07-09,3,1,7,18,0,1,1,1,0.76,0.6818,0.48,0.1343,82,790,872 +13215,2012-07-09,3,1,7,19,0,1,1,1,0.74,0.6818,0.55,0.1642,88,543,631 +13216,2012-07-09,3,1,7,20,0,1,1,2,0.74,0.6818,0.55,0.1045,69,378,447 +13217,2012-07-09,3,1,7,21,0,1,1,2,0.74,0.6818,0.55,0.0896,33,298,331 +13218,2012-07-09,3,1,7,22,0,1,1,2,0.72,0.6667,0.58,0.1045,35,189,224 +13219,2012-07-09,3,1,7,23,0,1,1,2,0.7,0.6515,0.65,0.1045,22,109,131 +13220,2012-07-10,3,1,7,0,0,2,1,1,0.7,0.6515,0.7,0.0896,16,53,69 
+13221,2012-07-10,3,1,7,1,0,2,1,1,0.7,0.6667,0.74,0.1343,0,15,15 +13222,2012-07-10,3,1,7,2,0,2,1,1,0.7,0.6667,0.74,0.1343,1,15,16 +13223,2012-07-10,3,1,7,3,0,2,1,1,0.68,0.6364,0.79,0,0,3,3 +13224,2012-07-10,3,1,7,4,0,2,1,1,0.68,0.6364,0.79,0.1343,0,4,4 +13225,2012-07-10,3,1,7,5,0,2,1,1,0.66,0.6061,0.83,0.1343,3,39,42 +13226,2012-07-10,3,1,7,6,0,2,1,2,0.68,0.6364,0.79,0.1045,4,183,187 +13227,2012-07-10,3,1,7,7,0,2,1,2,0.7,0.6667,0.74,0.1045,31,489,520 +13228,2012-07-10,3,1,7,8,0,2,1,1,0.72,0.697,0.74,0.1343,34,615,649 +13229,2012-07-10,3,1,7,9,0,2,1,1,0.74,0.697,0.66,0.1343,47,284,331 +13230,2012-07-10,3,1,7,10,0,2,1,1,0.76,0.7121,0.62,0.194,56,134,190 +13231,2012-07-10,3,1,7,11,0,2,1,1,0.8,0.7273,0.46,0.194,92,157,249 +13232,2012-07-10,3,1,7,12,0,2,1,1,0.8,0.7121,0.41,0.1343,69,203,272 +13233,2012-07-10,3,1,7,13,0,2,1,1,0.82,0.7273,0.38,0.194,77,203,280 +13234,2012-07-10,3,1,7,14,0,2,1,1,0.82,0.7273,0.38,0.1642,71,171,242 +13235,2012-07-10,3,1,7,15,0,2,1,1,0.8,0.7121,0.41,0.2239,77,188,265 +13236,2012-07-10,3,1,7,16,0,2,1,1,0.8,0.7273,0.43,0.2239,89,346,435 +13237,2012-07-10,3,1,7,17,0,2,1,1,0.78,0.7121,0.49,0.194,103,769,872 +13238,2012-07-10,3,1,7,18,0,2,1,3,0.7,0.6667,0.74,0.2239,70,749,819 +13239,2012-07-10,3,1,7,19,0,2,1,3,0.7,0.6667,0.74,0.2239,55,359,414 +13240,2012-07-10,3,1,7,20,0,2,1,3,0.64,0.5758,0.83,0.2239,9,75,84 +13241,2012-07-10,3,1,7,21,0,2,1,3,0.64,0.5758,0.89,0.1045,7,83,90 +13242,2012-07-10,3,1,7,22,0,2,1,2,0.64,0.5758,0.83,0.1343,14,125,139 +13243,2012-07-10,3,1,7,23,0,2,1,2,0.64,0.5758,0.89,0.1045,29,74,103 +13244,2012-07-11,3,1,7,0,0,3,1,2,0.66,0.6061,0.83,0.1343,6,38,44 +13245,2012-07-11,3,1,7,1,0,3,1,1,0.64,0.5758,0.89,0,3,11,14 +13246,2012-07-11,3,1,7,2,0,3,1,1,0.64,0.5758,0.89,0,0,5,5 +13247,2012-07-11,3,1,7,3,0,3,1,1,0.64,0.5758,0.89,0.1343,0,5,5 +13248,2012-07-11,3,1,7,4,0,3,1,1,0.64,0.5758,0.89,0.1343,1,6,7 +13249,2012-07-11,3,1,7,5,0,3,1,1,0.64,0.5758,0.89,0,5,36,41 +13250,2012-07-11,3,1,7,6,0,3,1,1,0.64,0.5758,0.89,0.1343,17,168,185 
+13251,2012-07-11,3,1,7,7,0,3,1,2,0.66,0.6061,0.83,0.1045,26,471,497 +13252,2012-07-11,3,1,7,8,0,3,1,2,0.68,0.6364,0.79,0.1642,30,644,674 +13253,2012-07-11,3,1,7,9,0,3,1,2,0.7,0.6667,0.74,0.1343,30,298,328 +13254,2012-07-11,3,1,7,10,0,3,1,2,0.72,0.6818,0.62,0.1343,47,148,195 +13255,2012-07-11,3,1,7,11,0,3,1,2,0.76,0.7121,0.58,0.1343,56,173,229 +13256,2012-07-11,3,1,7,12,0,3,1,2,0.78,0.697,0.43,0,69,223,292 +13257,2012-07-11,3,1,7,13,0,3,1,2,0.8,0.7121,0.42,0.1642,65,208,273 +13258,2012-07-11,3,1,7,14,0,3,1,1,0.8,0.7121,0.41,0.2239,66,175,241 +13259,2012-07-11,3,1,7,15,0,3,1,1,0.8,0.7121,0.41,0.2537,56,220,276 +13260,2012-07-11,3,1,7,16,0,3,1,1,0.8,0.7121,0.38,0.194,53,358,411 +13261,2012-07-11,3,1,7,17,0,3,1,1,0.8,0.697,0.31,0.2239,90,740,830 +13262,2012-07-11,3,1,7,18,0,3,1,1,0.78,0.6818,0.38,0.2537,79,735,814 +13263,2012-07-11,3,1,7,19,0,3,1,1,0.74,0.6667,0.48,0.2239,73,560,633 +13264,2012-07-11,3,1,7,20,0,3,1,1,0.74,0.6667,0.51,0.1642,73,410,483 +13265,2012-07-11,3,1,7,21,0,3,1,1,0.72,0.6667,0.58,0.2239,69,322,391 +13266,2012-07-11,3,1,7,22,0,3,1,1,0.72,0.6667,0.58,0.2537,41,203,244 +13267,2012-07-11,3,1,7,23,0,3,1,2,0.7,0.6515,0.58,0.2537,20,132,152 +13268,2012-07-12,3,1,7,0,0,4,1,2,0.68,0.6364,0.61,0,1,55,56 +13269,2012-07-12,3,1,7,1,0,4,1,2,0.66,0.6212,0.61,0,0,21,21 +13270,2012-07-12,3,1,7,2,0,4,1,2,0.66,0.6212,0.65,0.0896,0,9,9 +13271,2012-07-12,3,1,7,3,0,4,1,1,0.64,0.6061,0.69,0.1045,0,10,10 +13272,2012-07-12,3,1,7,4,0,4,1,1,0.64,0.6061,0.73,0.1045,0,5,5 +13273,2012-07-12,3,1,7,5,0,4,1,1,0.62,0.5909,0.73,0.1045,4,40,44 +13274,2012-07-12,3,1,7,6,0,4,1,1,0.64,0.6061,0.73,0.0896,7,171,178 +13275,2012-07-12,3,1,7,7,0,4,1,1,0.64,0.6061,0.73,0.1343,32,480,512 +13276,2012-07-12,3,1,7,8,0,4,1,1,0.68,0.6364,0.65,0.1343,49,653,702 +13277,2012-07-12,3,1,7,9,0,4,1,1,0.72,0.6667,0.58,0.1045,42,285,327 +13278,2012-07-12,3,1,7,10,0,4,1,1,0.74,0.6667,0.48,0.1642,69,145,214 +13279,2012-07-12,3,1,7,11,0,4,1,1,0.76,0.6818,0.45,0.1045,49,166,215 
+13280,2012-07-12,3,1,7,12,0,4,1,1,0.78,0.6818,0.35,0.1343,71,221,292 +13281,2012-07-12,3,1,7,13,0,4,1,1,0.8,0.697,0.33,0.2239,52,225,277 +13282,2012-07-12,3,1,7,14,0,4,1,1,0.78,0.6818,0.33,0.2239,42,167,209 +13283,2012-07-12,3,1,7,15,0,4,1,1,0.8,0.7121,0.36,0.194,63,208,271 +13284,2012-07-12,3,1,7,16,0,4,1,1,0.8,0.7121,0.36,0.194,69,370,439 +13285,2012-07-12,3,1,7,17,0,4,1,1,0.76,0.6818,0.43,0.2239,91,704,795 +13286,2012-07-12,3,1,7,18,0,4,1,1,0.76,0.6818,0.45,0.1343,86,739,825 +13287,2012-07-12,3,1,7,19,0,4,1,1,0.74,0.6667,0.48,0.2239,97,532,629 +13288,2012-07-12,3,1,7,20,0,4,1,1,0.74,0.6667,0.48,0.2239,75,439,514 +13289,2012-07-12,3,1,7,21,0,4,1,1,0.72,0.6667,0.48,0.2239,44,329,373 +13290,2012-07-12,3,1,7,22,0,4,1,1,0.72,0.6667,0.51,0.1642,56,262,318 +13291,2012-07-12,3,1,7,23,0,4,1,2,0.7,0.6515,0.51,0.2239,33,178,211 +13292,2012-07-13,3,1,7,0,0,5,1,2,0.7,0.6515,0.54,0.1642,19,76,95 +13293,2012-07-13,3,1,7,1,0,5,1,2,0.68,0.6364,0.54,0.1045,17,42,59 +13294,2012-07-13,3,1,7,2,0,5,1,2,0.66,0.6212,0.61,0.0896,2,15,17 +13295,2012-07-13,3,1,7,3,0,5,1,2,0.68,0.6364,0.61,0,2,7,9 +13296,2012-07-13,3,1,7,4,0,5,1,2,0.66,0.6212,0.61,0,0,11,11 +13297,2012-07-13,3,1,7,5,0,5,1,2,0.66,0.6364,0.56,0.0896,4,30,34 +13298,2012-07-13,3,1,7,6,0,5,1,2,0.7,0.6515,0.54,0,7,120,127 +13299,2012-07-13,3,1,7,7,0,5,1,2,0.7,0.6515,0.54,0,27,353,380 +13300,2012-07-13,3,1,7,8,0,5,1,2,0.7,0.6515,0.54,0,53,660,713 +13301,2012-07-13,3,1,7,9,0,5,1,2,0.72,0.6667,0.51,0,70,344,414 +13302,2012-07-13,3,1,7,10,0,5,1,2,0.74,0.6667,0.48,0,78,190,268 +13303,2012-07-13,3,1,7,11,0,5,1,2,0.76,0.6818,0.45,0.1343,68,220,288 +13304,2012-07-13,3,1,7,12,0,5,1,2,0.76,0.6818,0.4,0.1045,125,254,379 +13305,2012-07-13,3,1,7,13,0,5,1,2,0.8,0.7121,0.38,0.1343,100,274,374 +13306,2012-07-13,3,1,7,14,0,5,1,2,0.8,0.7121,0.38,0,120,249,369 +13307,2012-07-13,3,1,7,15,0,5,1,2,0.8,0.7121,0.36,0.1045,92,261,353 +13308,2012-07-13,3,1,7,16,0,5,1,2,0.8,0.7121,0.36,0.2239,111,381,492 
+13309,2012-07-13,3,1,7,17,0,5,1,2,0.8,0.7121,0.36,0.1642,138,697,835 +13310,2012-07-13,3,1,7,18,0,5,1,1,0.78,0.697,0.46,0.194,108,523,631 +13311,2012-07-13,3,1,7,19,0,5,1,1,0.74,0.6667,0.51,0.1642,103,385,488 +13312,2012-07-13,3,1,7,20,0,5,1,1,0.74,0.6667,0.48,0.1343,95,294,389 +13313,2012-07-13,3,1,7,21,0,5,1,1,0.72,0.6667,0.54,0,76,223,299 +13314,2012-07-13,3,1,7,22,0,5,1,1,0.74,0.6667,0.45,0,59,204,263 +13315,2012-07-13,3,1,7,23,0,5,1,2,0.72,0.6515,0.45,0.1343,37,175,212 +13316,2012-07-14,3,1,7,0,0,6,0,2,0.72,0.6667,0.48,0.0896,35,156,191 +13317,2012-07-14,3,1,7,1,0,6,0,2,0.7,0.6515,0.51,0.1045,13,105,118 +13318,2012-07-14,3,1,7,2,0,6,0,2,0.7,0.6515,0.54,0.1045,10,91,101 +13319,2012-07-14,3,1,7,3,0,6,0,2,0.7,0.6515,0.54,0.1045,9,31,40 +13320,2012-07-14,3,1,7,4,0,6,0,3,0.66,0.6212,0.61,0.1045,3,6,9 +13321,2012-07-14,3,1,7,5,0,6,0,2,0.64,0.6061,0.73,0.1045,0,5,5 +13322,2012-07-14,3,1,7,6,0,6,0,2,0.64,0.5909,0.78,0.0896,6,29,35 +13323,2012-07-14,3,1,7,7,0,6,0,1,0.64,0.5909,0.78,0.2836,7,41,48 +13324,2012-07-14,3,1,7,8,0,6,0,1,0.62,0.5909,0.78,0.2836,33,118,151 +13325,2012-07-14,3,1,7,9,0,6,0,1,0.66,0.6212,0.74,0.194,82,200,282 +13326,2012-07-14,3,1,7,10,0,6,0,2,0.72,0.6818,0.7,0.0896,123,278,401 +13327,2012-07-14,3,1,7,11,0,6,0,2,0.72,0.6818,0.7,0.1045,143,296,439 +13328,2012-07-14,3,1,7,12,0,6,0,2,0.74,0.697,0.66,0.0896,181,318,499 +13329,2012-07-14,3,1,7,13,0,6,0,2,0.76,0.7121,0.62,0.0896,219,372,591 +13330,2012-07-14,3,1,7,14,0,6,0,1,0.8,0.7576,0.55,0.1045,269,363,632 +13331,2012-07-14,3,1,7,15,0,6,0,1,0.8,0.7576,0.55,0.1642,243,317,560 +13332,2012-07-14,3,1,7,16,0,6,0,2,0.72,0.697,0.79,0.3582,226,340,566 +13333,2012-07-14,3,1,7,17,0,6,0,3,0.7,0.6667,0.79,0.1642,199,259,458 +13334,2012-07-14,3,1,7,18,0,6,0,1,0.72,0.697,0.79,0.1343,108,243,351 +13335,2012-07-14,3,1,7,19,0,6,0,1,0.72,0.7121,0.84,0.0896,156,275,431 +13336,2012-07-14,3,1,7,20,0,6,0,1,0.72,0.7121,0.84,0.0896,101,255,356 +13337,2012-07-14,3,1,7,21,0,6,0,1,0.7,0.6667,0.84,0.2239,69,200,269 
+13338,2012-07-14,3,1,7,22,0,6,0,1,0.7,0.6667,0.79,0.194,65,160,225 +13339,2012-07-14,3,1,7,23,0,6,0,2,0.68,0.6364,0.83,0.0896,55,156,211 +13340,2012-07-15,3,1,7,0,0,0,0,1,0.68,0.6364,0.89,0.1642,40,147,187 +13341,2012-07-15,3,1,7,1,0,0,0,2,0.68,0.6364,0.89,0.1343,49,119,168 +13342,2012-07-15,3,1,7,2,0,0,0,2,0.68,0.6364,0.89,0.1642,29,86,115 +13343,2012-07-15,3,1,7,3,0,0,0,2,0.68,0.6364,0.89,0.1642,15,42,57 +13344,2012-07-15,3,1,7,4,0,0,0,2,0.68,0.6364,0.89,0.1045,8,11,19 +13345,2012-07-15,3,1,7,5,0,0,0,1,0.68,0.6364,0.89,0.1343,2,7,9 +13346,2012-07-15,3,1,7,6,0,0,0,1,0.68,0.6364,0.83,0.1343,6,9,15 +13347,2012-07-15,3,1,7,7,0,0,0,1,0.7,0.6667,0.79,0.0896,17,30,47 +13348,2012-07-15,3,1,7,8,0,0,0,2,0.7,0.6667,0.84,0.1343,37,96,133 +13349,2012-07-15,3,1,7,9,0,0,0,2,0.72,0.697,0.79,0,58,163,221 +13350,2012-07-15,3,1,7,10,0,0,0,1,0.76,0.7273,0.66,0.1343,141,241,382 +13351,2012-07-15,3,1,7,11,0,0,0,1,0.82,0.7727,0.52,0.1642,170,281,451 +13352,2012-07-15,3,1,7,12,0,0,0,1,0.84,0.7727,0.47,0.1045,153,336,489 +13353,2012-07-15,3,1,7,13,0,0,0,1,0.86,0.7879,0.41,0.1045,171,309,480 +13354,2012-07-15,3,1,7,14,0,0,0,1,0.9,0.8182,0.37,0.2537,162,314,476 +13355,2012-07-15,3,1,7,15,0,0,0,1,0.86,0.803,0.47,0.1343,182,307,489 +13356,2012-07-15,3,1,7,16,0,0,0,1,0.86,0.7879,0.41,0.2239,152,343,495 +13357,2012-07-15,3,1,7,17,0,0,0,1,0.8,0.7727,0.59,0.4925,122,314,436 +13358,2012-07-15,3,1,7,18,0,0,0,3,0.72,0.697,0.79,0.3582,84,228,312 +13359,2012-07-15,3,1,7,19,0,0,0,1,0.72,0.7121,0.84,0.2239,109,203,312 +13360,2012-07-15,3,1,7,20,0,0,0,1,0.72,0.697,0.79,0.2537,95,199,294 +13361,2012-07-15,3,1,7,21,0,0,0,1,0.72,0.697,0.79,0.0896,59,164,223 +13362,2012-07-15,3,1,7,22,0,0,0,1,0.72,0.697,0.74,0.1045,34,96,130 +13363,2012-07-15,3,1,7,23,0,0,0,1,0.72,0.697,0.79,0.1343,25,66,91 +13364,2012-07-16,3,1,7,0,0,1,1,3,0.72,0.697,0.79,0.1045,11,32,43 +13365,2012-07-16,3,1,7,1,0,1,1,2,0.7,0.6667,0.74,0.194,1,10,11 +13366,2012-07-16,3,1,7,2,0,1,1,1,0.7,0.6667,0.74,0.1045,4,11,15 
+13367,2012-07-16,3,1,7,3,0,1,1,1,0.68,0.6364,0.79,0.0896,2,4,6 +13368,2012-07-16,3,1,7,4,0,1,1,1,0.68,0.6364,0.79,0.1045,1,11,12 +13369,2012-07-16,3,1,7,5,0,1,1,1,0.66,0.6061,0.78,0.1045,10,40,50 +13370,2012-07-16,3,1,7,6,0,1,1,1,0.68,0.6364,0.79,0.1343,8,132,140 +13371,2012-07-16,3,1,7,7,0,1,1,1,0.72,0.6818,0.7,0.0896,24,459,483 +13372,2012-07-16,3,1,7,8,0,1,1,1,0.74,0.697,0.66,0.1045,48,619,667 +13373,2012-07-16,3,1,7,9,0,1,1,1,0.76,0.7121,0.62,0.1642,51,297,348 +13374,2012-07-16,3,1,7,10,0,1,1,1,0.8,0.7727,0.59,0,69,117,186 +13375,2012-07-16,3,1,7,11,0,1,1,1,0.84,0.803,0.53,0.2239,57,125,182 +13376,2012-07-16,3,1,7,12,0,1,1,1,0.84,0.7879,0.49,0.1343,79,158,237 +13377,2012-07-16,3,1,7,13,0,1,1,1,0.84,0.7879,0.49,0.2836,63,176,239 +13378,2012-07-16,3,1,7,14,0,1,1,1,0.88,0.8182,0.42,0.1045,53,159,212 +13379,2012-07-16,3,1,7,15,0,1,1,1,0.86,0.803,0.47,0.2537,61,176,237 +13380,2012-07-16,3,1,7,16,0,1,1,3,0.76,0.7273,0.66,0.5821,77,309,386 +13381,2012-07-16,3,1,7,17,0,1,1,3,0.76,0.7273,0.66,0.5821,86,669,755 +13382,2012-07-16,3,1,7,18,0,1,1,1,0.84,0.803,0.53,0,97,697,794 +13383,2012-07-16,3,1,7,19,0,1,1,1,0.8,0.7879,0.63,0.1343,92,503,595 +13384,2012-07-16,3,1,7,20,0,1,1,1,0.8,0.7727,0.59,0.1343,66,388,454 +13385,2012-07-16,3,1,7,21,0,1,1,1,0.76,0.7273,0.66,0.1343,45,283,328 +13386,2012-07-16,3,1,7,22,0,1,1,1,0.76,0.7121,0.62,0.0896,56,228,284 +13387,2012-07-16,3,1,7,23,0,1,1,1,0.74,0.7121,0.74,0.0896,27,139,166 +13388,2012-07-17,3,1,7,0,0,2,1,1,0.72,0.7121,0.84,0.1045,10,43,53 +13389,2012-07-17,3,1,7,1,0,2,1,1,0.74,0.697,0.7,0,4,21,25 +13390,2012-07-17,3,1,7,2,0,2,1,1,0.72,0.6818,0.7,0.0896,1,6,7 +13391,2012-07-17,3,1,7,3,0,2,1,1,0.72,0.6818,0.66,0,0,3,3 +13392,2012-07-17,3,1,7,4,0,2,1,1,0.72,0.6818,0.62,0,0,7,7 +13393,2012-07-17,3,1,7,5,0,2,1,1,0.68,0.6364,0.74,0.0896,7,31,38 +13394,2012-07-17,3,1,7,6,0,2,1,1,0.72,0.6818,0.66,0.1045,16,197,213 +13395,2012-07-17,3,1,7,7,0,2,1,2,0.74,0.697,0.7,0,26,500,526 
+13396,2012-07-17,3,1,7,8,0,2,1,2,0.8,0.7576,0.55,0,43,618,661 +13397,2012-07-17,3,1,7,9,0,2,1,2,0.82,0.7727,0.49,0.2239,32,288,320 +13398,2012-07-17,3,1,7,10,0,2,1,2,0.84,0.7879,0.49,0.194,37,122,159 +13399,2012-07-17,3,1,7,11,0,2,1,2,0.86,0.803,0.47,0,60,146,206 +13400,2012-07-17,3,1,7,12,0,2,1,2,0.9,0.8182,0.35,0,80,174,254 +13401,2012-07-17,3,1,7,13,0,2,1,1,0.92,0.8182,0.31,0.194,53,168,221 +13402,2012-07-17,3,1,7,14,0,2,1,1,0.94,0.8333,0.29,0.2537,54,160,214 +13403,2012-07-17,3,1,7,15,0,2,1,1,0.92,0.8182,0.31,0.1642,57,187,244 +13404,2012-07-17,3,1,7,16,0,2,1,1,0.92,0.8182,0.29,0.1642,64,317,381 +13405,2012-07-17,3,1,7,17,0,2,1,1,0.92,0.8182,0.29,0,95,675,770 +13406,2012-07-17,3,1,7,18,0,2,1,1,0.9,0.803,0.33,0.194,60,712,772 +13407,2012-07-17,3,1,7,19,0,2,1,1,0.88,0.7879,0.35,0.2537,69,478,547 +13408,2012-07-17,3,1,7,20,0,2,1,1,0.86,0.7576,0.34,0.1642,52,375,427 +13409,2012-07-17,3,1,7,21,0,2,1,1,0.82,0.7727,0.52,0.1343,50,283,333 +13410,2012-07-17,3,1,7,22,0,2,1,1,0.8,0.7576,0.55,0.2239,29,215,244 +13411,2012-07-17,3,1,7,23,0,2,1,1,0.78,0.7424,0.59,0.194,22,139,161 +13412,2012-07-18,3,1,7,0,0,3,1,1,0.76,0.7273,0.66,0.1642,8,45,53 +13413,2012-07-18,3,1,7,1,0,3,1,1,0.76,0.7273,0.66,0.1045,2,14,16 +13414,2012-07-18,3,1,7,2,0,3,1,1,0.74,0.697,0.66,0.1343,3,5,8 +13415,2012-07-18,3,1,7,3,0,3,1,1,0.74,0.697,0.66,0.1343,2,2,4 +13416,2012-07-18,3,1,7,4,0,3,1,1,0.72,0.6818,0.7,0.1343,0,6,6 +13417,2012-07-18,3,1,7,5,0,3,1,1,0.72,0.6818,0.7,0.1642,5,41,46 +13418,2012-07-18,3,1,7,6,0,3,1,1,0.72,0.6818,0.7,0.1642,10,152,162 +13419,2012-07-18,3,1,7,7,0,3,1,1,0.74,0.697,0.66,0.1642,23,472,495 +13420,2012-07-18,3,1,7,8,0,3,1,1,0.78,0.7424,0.62,0.1045,55,624,679 +13421,2012-07-18,3,1,7,9,0,3,1,1,0.82,0.7879,0.56,0,44,253,297 +13422,2012-07-18,3,1,7,10,0,3,1,1,0.9,0.8485,0.42,0.194,39,99,138 +13423,2012-07-18,3,1,7,11,0,3,1,1,0.9,0.8485,0.42,0,47,136,183 +13424,2012-07-18,3,1,7,12,0,3,1,1,0.92,0.8636,0.37,0.1343,36,166,202 
+13425,2012-07-18,3,1,7,13,0,3,1,1,0.94,0.8485,0.33,0,49,154,203 +13426,2012-07-18,3,1,7,14,0,3,1,1,0.94,0.8788,0.38,0.3284,54,116,170 +13427,2012-07-18,3,1,7,15,0,3,1,3,0.92,0.8485,0.35,0.3582,42,152,194 +13428,2012-07-18,3,1,7,16,0,3,1,1,0.74,0.697,0.66,0.2537,34,190,224 +13429,2012-07-18,3,1,7,17,0,3,1,1,0.74,0.697,0.7,0.2537,35,335,370 +13430,2012-07-18,3,1,7,18,0,3,1,2,0.76,0.7121,0.62,0.1343,63,580,643 +13431,2012-07-18,3,1,7,19,0,3,1,2,0.76,0.7121,0.62,0.1045,78,438,516 +13432,2012-07-18,3,1,7,20,0,3,1,1,0.74,0.6818,0.62,0.0896,56,310,366 +13433,2012-07-18,3,1,7,21,0,3,1,1,0.76,0.7121,0.62,0,53,266,319 +13434,2012-07-18,3,1,7,22,0,3,1,1,0.76,0.7121,0.58,0.0896,42,246,288 +13435,2012-07-18,3,1,7,23,0,3,1,1,0.76,0.7121,0.58,0.0896,19,112,131 +13436,2012-07-19,3,1,7,0,0,4,1,1,0.74,0.697,0.66,0.0896,16,50,66 +13437,2012-07-19,3,1,7,1,0,4,1,1,0.74,0.697,0.66,0.0896,6,19,25 +13438,2012-07-19,3,1,7,2,0,4,1,1,0.72,0.6818,0.7,0.194,0,7,7 +13439,2012-07-19,3,1,7,3,0,4,1,1,0.72,0.6818,0.7,0.1045,1,4,5 +13440,2012-07-19,3,1,7,4,0,4,1,1,0.72,0.6818,0.7,0.1045,0,8,8 +13441,2012-07-19,3,1,7,5,0,4,1,2,0.72,0.6818,0.7,0.1045,5,35,40 +13442,2012-07-19,3,1,7,6,0,4,1,1,0.72,0.6818,0.66,0.1343,8,144,152 +13443,2012-07-19,3,1,7,7,0,4,1,1,0.74,0.6818,0.58,0.2836,23,450,473 +13444,2012-07-19,3,1,7,8,0,4,1,1,0.76,0.7121,0.58,0.2537,32,625,657 +13445,2012-07-19,3,1,7,9,0,4,1,2,0.76,0.7121,0.58,0.2836,36,282,318 +13446,2012-07-19,3,1,7,10,0,4,1,1,0.76,0.7121,0.58,0.194,41,157,198 +13447,2012-07-19,3,1,7,11,0,4,1,1,0.8,0.7576,0.55,0.1045,54,162,216 +13448,2012-07-19,3,1,7,12,0,4,1,1,0.84,0.7879,0.49,0.2836,59,215,274 +13449,2012-07-19,3,1,7,13,0,4,1,1,0.84,0.7727,0.47,0.1343,74,180,254 +13450,2012-07-19,3,1,7,14,0,4,1,1,0.86,0.7879,0.44,0.0896,46,151,197 +13451,2012-07-19,3,1,7,15,0,4,1,1,0.84,0.7727,0.47,0.0896,66,226,292 +13452,2012-07-19,3,1,7,16,0,4,1,1,0.84,0.7727,0.47,0.0896,70,288,358 +13453,2012-07-19,3,1,7,17,0,4,1,1,0.86,0.7879,0.41,0,93,678,771 
+13454,2012-07-19,3,1,7,18,0,4,1,1,0.86,0.7879,0.41,0.194,93,684,777 +13455,2012-07-19,3,1,7,19,0,4,1,1,0.84,0.7576,0.44,0.2537,53,480,533 +13456,2012-07-19,3,1,7,20,0,4,1,1,0.82,0.7727,0.49,0.1642,65,440,505 +13457,2012-07-19,3,1,7,21,0,4,1,3,0.66,0.5909,0.89,0.2537,38,294,332 +13458,2012-07-19,3,1,7,22,0,4,1,3,0.66,0.5909,0.89,0.2537,6,62,68 +13459,2012-07-19,3,1,7,23,0,4,1,3,0.66,0.5909,0.89,0.2239,3,62,65 +13460,2012-07-20,3,1,7,0,0,5,1,3,0.66,0.5909,0.89,0.0896,2,32,34 +13461,2012-07-20,3,1,7,1,0,5,1,1,0.66,0.5909,0.89,0.1343,2,9,11 +13462,2012-07-20,3,1,7,2,0,5,1,2,0.66,0.5909,0.89,0.0896,5,14,19 +13463,2012-07-20,3,1,7,3,0,5,1,2,0.66,0.5909,0.89,0.1045,7,29,36 +13464,2012-07-20,3,1,7,4,0,5,1,2,0.66,0.5909,0.89,0.1642,3,6,9 +13465,2012-07-20,3,1,7,5,0,5,1,2,0.66,0.5909,0.89,0.1343,2,40,42 +13466,2012-07-20,3,1,7,6,0,5,1,2,0.66,0.5909,0.89,0.1642,3,123,126 +13467,2012-07-20,3,1,7,7,0,5,1,2,0.66,0.5909,0.89,0.194,14,346,360 +13468,2012-07-20,3,1,7,8,0,5,1,2,0.68,0.6364,0.83,0.1642,39,652,691 +13469,2012-07-20,3,1,7,9,0,5,1,2,0.7,0.6667,0.79,0.1343,45,328,373 +13470,2012-07-20,3,1,7,10,0,5,1,2,0.7,0.6667,0.79,0.1642,51,182,233 +13471,2012-07-20,3,1,7,11,0,5,1,3,0.72,0.697,0.74,0.194,51,161,212 +13472,2012-07-20,3,1,7,12,0,5,1,3,0.68,0.6364,0.83,0.1642,41,233,274 +13473,2012-07-20,3,1,7,13,0,5,1,3,0.68,0.6364,0.83,0.1045,20,136,156 +13474,2012-07-20,3,1,7,14,0,5,1,3,0.7,0.6667,0.84,0.1642,53,176,229 +13475,2012-07-20,3,1,7,15,0,5,1,2,0.7,0.6667,0.84,0.2985,47,169,216 +13476,2012-07-20,3,1,7,16,0,5,1,2,0.68,0.6364,0.83,0.3284,70,366,436 +13477,2012-07-20,3,1,7,17,0,5,1,2,0.66,0.5909,0.89,0.4179,95,620,715 +13478,2012-07-20,3,1,7,18,0,5,1,2,0.66,0.6061,0.78,0.3284,73,549,622 +13479,2012-07-20,3,1,7,19,0,5,1,2,0.64,0.5758,0.83,0.2836,44,358,402 +13480,2012-07-20,3,1,7,20,0,5,1,3,0.64,0.5758,0.83,0.3881,35,216,251 +13481,2012-07-20,3,1,7,21,0,5,1,3,0.62,0.5758,0.83,0.3284,14,108,122 +13482,2012-07-20,3,1,7,22,0,5,1,2,0.62,0.5758,0.83,0.194,13,122,135 
+13483,2012-07-20,3,1,7,23,0,5,1,2,0.62,0.5758,0.83,0.2836,18,148,166 +13484,2012-07-21,3,1,7,0,0,6,0,3,0.62,0.5758,0.83,0.2836,11,92,103 +13485,2012-07-21,3,1,7,1,0,6,0,2,0.62,0.5758,0.83,0.2537,10,62,72 +13486,2012-07-21,3,1,7,2,0,6,0,3,0.6,0.5455,0.88,0.2836,5,55,60 +13487,2012-07-21,3,1,7,3,0,6,0,3,0.6,0.5455,0.88,0.2537,7,18,25 +13488,2012-07-21,3,1,7,4,0,6,0,3,0.58,0.5455,0.88,0.2836,11,5,16 +13489,2012-07-21,3,1,7,5,0,6,0,3,0.58,0.5455,0.88,0.2537,7,10,17 +13490,2012-07-21,3,1,7,6,0,6,0,3,0.58,0.5455,0.88,0.2537,3,12,15 +13491,2012-07-21,3,1,7,7,0,6,0,2,0.58,0.5455,0.88,0.2537,8,29,37 +13492,2012-07-21,3,1,7,8,0,6,0,2,0.58,0.5455,0.83,0.2537,18,94,112 +13493,2012-07-21,3,1,7,9,0,6,0,2,0.58,0.5455,0.83,0.194,32,175,207 +13494,2012-07-21,3,1,7,10,0,6,0,3,0.58,0.5455,0.88,0.2537,49,219,268 +13495,2012-07-21,3,1,7,11,0,6,0,2,0.6,0.5606,0.83,0.2537,80,225,305 +13496,2012-07-21,3,1,7,12,0,6,0,3,0.6,0.5606,0.83,0.2239,156,301,457 +13497,2012-07-21,3,1,7,13,0,6,0,3,0.6,0.5606,0.83,0.1642,106,248,354 +13498,2012-07-21,3,1,7,14,0,6,0,2,0.6,0.5455,0.88,0.2537,121,222,343 +13499,2012-07-21,3,1,7,15,0,6,0,3,0.6,0.5455,0.88,0.2537,148,232,380 +13500,2012-07-21,3,1,7,16,0,6,0,3,0.6,0.5455,0.88,0.194,130,196,326 +13501,2012-07-21,3,1,7,17,0,6,0,3,0.6,0.5455,0.88,0.1642,92,149,241 +13502,2012-07-21,3,1,7,18,0,6,0,2,0.6,0.5455,0.88,0.194,51,126,177 +13503,2012-07-21,3,1,7,19,0,6,0,2,0.6,0.5455,0.88,0.1642,67,171,238 +13504,2012-07-21,3,1,7,20,0,6,0,3,0.6,0.5455,0.88,0.1642,51,163,214 +13505,2012-07-21,3,1,7,21,0,6,0,2,0.6,0.5455,0.88,0.1642,29,134,163 +13506,2012-07-21,3,1,7,22,0,6,0,2,0.6,0.5455,0.88,0.1045,40,140,180 +13507,2012-07-21,3,1,7,23,0,6,0,2,0.6,0.5455,0.88,0,32,117,149 +13508,2012-07-22,3,1,7,0,0,0,0,2,0.6,0.5455,0.88,0.1045,17,96,113 +13509,2012-07-22,3,1,7,1,0,0,0,2,0.6,0.5455,0.88,0.1642,31,99,130 +13510,2012-07-22,3,1,7,2,0,0,0,2,0.6,0.5455,0.88,0.1045,32,90,122 +13511,2012-07-22,3,1,7,3,0,0,0,2,0.6,0.5455,0.88,0.1045,13,30,43 
+13512,2012-07-22,3,1,7,4,0,0,0,1,0.6,0.5455,0.88,0,6,10,16 +13513,2012-07-22,3,1,7,5,0,0,0,1,0.6,0.5455,0.88,0,0,6,6 +13514,2012-07-22,3,1,7,6,0,0,0,2,0.62,0.5758,0.83,0.0896,5,14,19 +13515,2012-07-22,3,1,7,7,0,0,0,2,0.62,0.5758,0.83,0,13,25,38 +13516,2012-07-22,3,1,7,8,0,0,0,2,0.64,0.5909,0.78,0,31,122,153 +13517,2012-07-22,3,1,7,9,0,0,0,2,0.66,0.6212,0.69,0,79,179,258 +13518,2012-07-22,3,1,7,10,0,0,0,2,0.66,0.6212,0.74,0,114,265,379 +13519,2012-07-22,3,1,7,11,0,0,0,2,0.66,0.6212,0.74,0,191,314,505 +13520,2012-07-22,3,1,7,12,0,0,0,2,0.7,0.6515,0.7,0.1045,178,326,504 +13521,2012-07-22,3,1,7,13,0,0,0,2,0.7,0.6515,0.65,0.0896,238,388,626 +13522,2012-07-22,3,1,7,14,0,0,0,1,0.72,0.6818,0.66,0.1045,256,360,616 +13523,2012-07-22,3,1,7,15,0,0,0,1,0.72,0.6818,0.66,0.1045,254,370,624 +13524,2012-07-22,3,1,7,16,0,0,0,1,0.74,0.697,0.66,0.1045,245,383,628 +13525,2012-07-22,3,1,7,17,0,0,0,1,0.74,0.697,0.66,0.2239,188,368,556 +13526,2012-07-22,3,1,7,18,0,0,0,1,0.74,0.697,0.66,0.1642,173,342,515 +13527,2012-07-22,3,1,7,19,0,0,0,1,0.72,0.6818,0.7,0.1642,180,342,522 +13528,2012-07-22,3,1,7,20,0,0,0,1,0.7,0.6667,0.74,0.1642,124,254,378 +13529,2012-07-22,3,1,7,21,0,0,0,1,0.7,0.6667,0.79,0.1045,88,257,345 +13530,2012-07-22,3,1,7,22,0,0,0,1,0.7,0.6667,0.74,0.194,55,136,191 +13531,2012-07-22,3,1,7,23,0,0,0,1,0.68,0.6364,0.79,0.1642,33,90,123 +13532,2012-07-23,3,1,7,0,0,1,1,1,0.66,0.5909,0.89,0.0896,11,36,47 +13533,2012-07-23,3,1,7,1,0,1,1,1,0.66,0.5909,0.89,0.194,7,20,27 +13534,2012-07-23,3,1,7,2,0,1,1,2,0.66,0.5909,0.94,0.0896,3,14,17 +13535,2012-07-23,3,1,7,3,0,1,1,2,0.66,0.5909,0.94,0.0896,0,2,2 +13536,2012-07-23,3,1,7,4,0,1,1,2,0.66,0.5909,0.94,0.1343,0,8,8 +13537,2012-07-23,3,1,7,5,0,1,1,2,0.66,0.5909,0.94,0.1343,3,48,51 +13538,2012-07-23,3,1,7,6,0,1,1,2,0.68,0.6364,0.89,0.0896,9,148,157 +13539,2012-07-23,3,1,7,7,0,1,1,1,0.7,0.6667,0.84,0.1343,17,404,421 +13540,2012-07-23,3,1,7,8,0,1,1,1,0.72,0.697,0.79,0.1045,47,691,738 
+13541,2012-07-23,3,1,7,9,0,1,1,2,0.74,0.7121,0.74,0.1343,57,288,345 +13542,2012-07-23,3,1,7,10,0,1,1,2,0.78,0.7424,0.62,0,63,137,200 +13543,2012-07-23,3,1,7,11,0,1,1,2,0.8,0.7424,0.52,0.1642,69,129,198 +13544,2012-07-23,3,1,7,12,0,1,1,2,0.8,0.7424,0.52,0.1642,109,205,314 +13545,2012-07-23,3,1,7,13,0,1,1,2,0.8,0.7424,0.52,0.1045,64,176,240 +13546,2012-07-23,3,1,7,14,0,1,1,2,0.78,0.7424,0.59,0.1642,82,189,271 +13547,2012-07-23,3,1,7,15,0,1,1,1,0.8,0.7576,0.55,0.194,76,205,281 +13548,2012-07-23,3,1,7,16,0,1,1,1,0.82,0.7727,0.52,0.1343,111,346,457 +13549,2012-07-23,3,1,7,17,0,1,1,1,0.82,0.7727,0.52,0.1343,87,760,847 +13550,2012-07-23,3,1,7,18,0,1,1,1,0.82,0.7576,0.46,0.1045,69,672,741 +13551,2012-07-23,3,1,7,19,0,1,1,1,0.8,0.7424,0.52,0.2537,94,488,582 +13552,2012-07-23,3,1,7,20,0,1,1,1,0.76,0.7121,0.62,0.1642,56,347,403 +13553,2012-07-23,3,1,7,21,0,1,1,1,0.76,0.7121,0.62,0.194,45,251,296 +13554,2012-07-23,3,1,7,22,0,1,1,1,0.74,0.6818,0.62,0.194,40,169,209 +13555,2012-07-23,3,1,7,23,0,1,1,1,0.72,0.6818,0.66,0.1642,16,98,114 +13556,2012-07-24,3,1,7,0,0,2,1,1,0.7,0.6667,0.74,0.1343,12,52,64 +13557,2012-07-24,3,1,7,1,0,2,1,1,0.68,0.6364,0.79,0.1642,7,10,17 +13558,2012-07-24,3,1,7,2,0,2,1,1,0.66,0.6061,0.83,0.1642,6,10,16 +13559,2012-07-24,3,1,7,3,0,2,1,1,0.66,0.6061,0.83,0.1642,1,5,6 +13560,2012-07-24,3,1,7,4,0,2,1,1,0.66,0.6061,0.83,0.0896,0,6,6 +13561,2012-07-24,3,1,7,5,0,2,1,1,0.66,0.6061,0.83,0.1045,5,45,50 +13562,2012-07-24,3,1,7,6,0,2,1,1,0.66,0.6061,0.83,0.1343,17,173,190 +13563,2012-07-24,3,1,7,7,0,2,1,1,0.7,0.6667,0.74,0.1642,24,492,516 +13564,2012-07-24,3,1,7,8,0,2,1,1,0.74,0.697,0.7,0.194,49,694,743 +13565,2012-07-24,3,1,7,9,0,2,1,1,0.84,0.803,0.53,0,40,299,339 +13566,2012-07-24,3,1,7,10,0,2,1,1,0.82,0.7879,0.56,0.2985,62,156,218 +13567,2012-07-24,3,1,7,11,0,2,1,2,0.84,0.803,0.53,0.3284,45,156,201 +13568,2012-07-24,3,1,7,12,0,2,1,2,0.82,0.7879,0.56,0.2239,60,169,229 +13569,2012-07-24,3,1,7,13,0,2,1,2,0.84,0.803,0.53,0.2985,73,200,273 
+13570,2012-07-24,3,1,7,14,0,2,1,2,0.84,0.803,0.53,0.2239,48,196,244 +13571,2012-07-24,3,1,7,15,0,2,1,1,0.82,0.7727,0.49,0.2985,73,212,285 +13572,2012-07-24,3,1,7,16,0,2,1,1,0.8,0.7576,0.55,0.2985,86,369,455 +13573,2012-07-24,3,1,7,17,0,2,1,1,0.8,0.7424,0.52,0.3284,94,775,869 +13574,2012-07-24,3,1,7,18,0,2,1,1,0.76,0.7121,0.62,0.3284,110,767,877 +13575,2012-07-24,3,1,7,19,0,2,1,1,0.76,0.7121,0.62,0.1642,109,523,632 +13576,2012-07-24,3,1,7,20,0,2,1,1,0.74,0.697,0.66,0.1642,85,438,523 +13577,2012-07-24,3,1,7,21,0,2,1,1,0.74,0.697,0.66,0.2239,53,325,378 +13578,2012-07-24,3,1,7,22,0,2,1,1,0.74,0.697,0.66,0.2537,43,226,269 +13579,2012-07-24,3,1,7,23,0,2,1,1,0.74,0.6818,0.58,0.3284,38,154,192 +13580,2012-07-25,3,1,7,0,0,3,1,1,0.72,0.6667,0.58,0.2985,9,57,66 +13581,2012-07-25,3,1,7,1,0,3,1,1,0.7,0.6515,0.58,0.194,2,26,28 +13582,2012-07-25,3,1,7,2,0,3,1,1,0.68,0.6364,0.61,0.2239,1,11,12 +13583,2012-07-25,3,1,7,3,0,3,1,1,0.66,0.6212,0.65,0.2239,1,11,12 +13584,2012-07-25,3,1,7,4,0,3,1,1,0.64,0.6061,0.65,0.2239,0,4,4 +13585,2012-07-25,3,1,7,5,0,3,1,1,0.64,0.6061,0.65,0.1642,5,54,59 +13586,2012-07-25,3,1,7,6,0,3,1,1,0.64,0.6061,0.65,0.194,11,179,190 +13587,2012-07-25,3,1,7,7,0,3,1,1,0.66,0.6212,0.5,0.3284,34,473,507 +13588,2012-07-25,3,1,7,8,0,3,1,1,0.7,0.6364,0.42,0.2537,43,745,788 +13589,2012-07-25,3,1,7,9,0,3,1,1,0.72,0.6515,0.37,0.2985,65,277,342 +13590,2012-07-25,3,1,7,10,0,3,1,1,0.74,0.6515,0.37,0,68,163,231 +13591,2012-07-25,3,1,7,11,0,3,1,1,0.76,0.6667,0.33,0,79,202,281 +13592,2012-07-25,3,1,7,12,0,3,1,1,0.76,0.6667,0.33,0,77,233,310 +13593,2012-07-25,3,1,7,13,0,3,1,1,0.78,0.6818,0.31,0.1642,81,231,312 +13594,2012-07-25,3,1,7,14,0,3,1,1,0.8,0.697,0.29,0.1343,88,159,247 +13595,2012-07-25,3,1,7,15,0,3,1,1,0.78,0.6818,0.31,0.0896,94,223,317 +13596,2012-07-25,3,1,7,16,0,3,1,1,0.8,0.697,0.29,0.1642,91,393,484 +13597,2012-07-25,3,1,7,17,0,3,1,1,0.8,0.697,0.31,0,130,783,913 +13598,2012-07-25,3,1,7,18,0,3,1,1,0.8,0.697,0.27,0.1343,104,787,891 
+13599,2012-07-25,3,1,7,19,0,3,1,1,0.76,0.6667,0.33,0.1343,116,582,698 +13600,2012-07-25,3,1,7,20,0,3,1,1,0.74,0.6515,0.4,0.1642,106,461,567 +13601,2012-07-25,3,1,7,21,0,3,1,1,0.72,0.6515,0.45,0.1343,71,326,397 +13602,2012-07-25,3,1,7,22,0,3,1,1,0.7,0.6515,0.58,0.1343,55,223,278 +13603,2012-07-25,3,1,7,23,0,3,1,1,0.68,0.6364,0.57,0.2985,52,187,239 +13604,2012-07-26,3,1,7,0,0,4,1,1,0.66,0.6212,0.65,0.194,27,65,92 +13605,2012-07-26,3,1,7,1,0,4,1,1,0.66,0.6212,0.65,0.2239,6,23,29 +13606,2012-07-26,3,1,7,2,0,4,1,1,0.66,0.6212,0.69,0.2537,5,16,21 +13607,2012-07-26,3,1,7,3,0,4,1,1,0.66,0.6212,0.69,0.2239,0,8,8 +13608,2012-07-26,3,1,7,4,0,4,1,1,0.66,0.6212,0.74,0.194,0,4,4 +13609,2012-07-26,3,1,7,5,0,4,1,1,0.66,0.6212,0.74,0.194,2,40,42 +13610,2012-07-26,3,1,7,6,0,4,1,1,0.66,0.6061,0.78,0.2537,16,165,181 +13611,2012-07-26,3,1,7,7,0,4,1,1,0.7,0.6667,0.74,0.2537,17,478,495 +13612,2012-07-26,3,1,7,8,0,4,1,2,0.7,0.6667,0.74,0.2537,49,680,729 +13613,2012-07-26,3,1,7,9,0,4,1,2,0.72,0.6818,0.7,0.1343,55,289,344 +13614,2012-07-26,3,1,7,10,0,4,1,1,0.74,0.697,0.7,0.2239,79,143,222 +13615,2012-07-26,3,1,7,11,0,4,1,1,0.8,0.803,0.66,0.2239,69,143,212 +13616,2012-07-26,3,1,7,12,0,4,1,1,0.84,0.8182,0.56,0.2537,73,206,279 +13617,2012-07-26,3,1,7,13,0,4,1,1,0.86,0.8333,0.53,0.2985,53,189,242 +13618,2012-07-26,3,1,7,14,0,4,1,1,0.9,0.8788,0.47,0.2537,62,151,213 +13619,2012-07-26,3,1,7,15,0,4,1,1,0.92,0.8939,0.42,0.2985,65,173,238 +13620,2012-07-26,3,1,7,16,0,4,1,1,0.94,0.8788,0.38,0.3881,62,312,374 +13621,2012-07-26,3,1,7,17,0,4,1,1,0.9,0.8636,0.45,0.4179,110,628,738 +13622,2012-07-26,3,1,7,18,0,4,1,1,0.92,0.8788,0.4,0.3582,73,615,688 +13623,2012-07-26,3,1,7,19,0,4,1,1,0.9,0.8636,0.45,0.3284,80,534,614 +13624,2012-07-26,3,1,7,20,0,4,1,1,0.86,0.8182,0.5,0.2537,47,399,446 +13625,2012-07-26,3,1,7,21,0,4,1,1,0.8,0.7424,0.49,0.6119,35,236,271 +13626,2012-07-26,3,1,7,22,0,4,1,1,0.8,0.7424,0.49,0.6119,33,201,234 +13627,2012-07-26,3,1,7,23,0,4,1,1,0.72,0.6818,0.7,0.1343,18,127,145 
+13628,2012-07-27,3,1,7,0,0,5,1,1,0.72,0.6818,0.7,0,14,83,97 +13629,2012-07-27,3,1,7,1,0,5,1,1,0.72,0.6818,0.7,0.1045,12,42,54 +13630,2012-07-27,3,1,7,2,0,5,1,1,0.7,0.6667,0.79,0.1343,8,18,26 +13631,2012-07-27,3,1,7,3,0,5,1,1,0.7,0.6667,0.74,0.1642,3,9,12 +13632,2012-07-27,3,1,7,4,0,5,1,1,0.7,0.6667,0.79,0,3,5,8 +13633,2012-07-27,3,1,7,5,0,5,1,1,0.7,0.6667,0.84,0,3,34,37 +13634,2012-07-27,3,1,7,6,0,5,1,1,0.7,0.6667,0.79,0,10,129,139 +13635,2012-07-27,3,1,7,7,0,5,1,1,0.72,0.6818,0.7,0.194,13,397,410 +13636,2012-07-27,3,1,7,8,0,5,1,1,0.76,0.7273,0.66,0.0896,34,642,676 +13637,2012-07-27,3,1,7,9,0,5,1,1,0.82,0.7727,0.49,0.2836,34,329,363 +13638,2012-07-27,3,1,7,10,0,5,1,1,0.84,0.7727,0.47,0,85,146,231 +13639,2012-07-27,3,1,7,11,0,5,1,1,0.84,0.7727,0.47,0.3284,67,184,251 +13640,2012-07-27,3,1,7,12,0,5,1,1,0.84,0.7879,0.49,0.3582,92,207,299 +13641,2012-07-27,3,1,7,13,0,5,1,1,0.84,0.7727,0.47,0.2836,83,231,314 +13642,2012-07-27,3,1,7,14,0,5,1,1,0.86,0.803,0.47,0.2985,105,221,326 +13643,2012-07-27,3,1,7,15,0,5,1,1,0.86,0.7879,0.41,0.2239,81,242,323 +13644,2012-07-27,3,1,7,16,0,5,1,1,0.84,0.7727,0.47,0.2239,80,419,499 +13645,2012-07-27,3,1,7,17,0,5,1,1,0.86,0.7879,0.41,0.2836,101,598,699 +13646,2012-07-27,3,1,7,18,0,5,1,1,0.82,0.7576,0.46,0.1642,100,522,622 +13647,2012-07-27,3,1,7,19,0,5,1,1,0.82,0.7727,0.52,0.194,93,423,516 +13648,2012-07-27,3,1,7,20,0,5,1,1,0.78,0.7424,0.59,0.1045,85,269,354 +13649,2012-07-27,3,1,7,21,0,5,1,1,0.78,0.7424,0.59,0.1045,71,189,260 +13650,2012-07-27,3,1,7,22,0,5,1,1,0.78,0.7424,0.59,0.1343,48,151,199 +13651,2012-07-27,3,1,7,23,0,5,1,1,0.76,0.7273,0.66,0,34,155,189 +13652,2012-07-28,3,1,7,0,0,6,0,1,0.74,0.6818,0.62,0.1343,21,170,191 +13653,2012-07-28,3,1,7,1,0,6,0,1,0.72,0.6818,0.7,0.0896,11,69,80 +13654,2012-07-28,3,1,7,2,0,6,0,1,0.72,0.6818,0.7,0,14,76,90 +13655,2012-07-28,3,1,7,3,0,6,0,1,0.72,0.6818,0.7,0,10,33,43 +13656,2012-07-28,3,1,7,4,0,6,0,1,0.7,0.6667,0.79,0.194,1,10,11 +13657,2012-07-28,3,1,7,5,0,6,0,1,0.68,0.6364,0.79,0.1642,4,10,14 
+13658,2012-07-28,3,1,7,6,0,6,0,1,0.7,0.6515,0.7,0.1045,10,40,50 +13659,2012-07-28,3,1,7,7,0,6,0,1,0.72,0.6818,0.7,0.1343,18,58,76 +13660,2012-07-28,3,1,7,8,0,6,0,1,0.74,0.697,0.66,0.1642,36,127,163 +13661,2012-07-28,3,1,7,9,0,6,0,1,0.78,0.7424,0.59,0.1045,110,227,337 +13662,2012-07-28,3,1,7,10,0,6,0,1,0.82,0.7727,0.52,0.2239,129,266,395 +13663,2012-07-28,3,1,7,11,0,6,0,1,0.84,0.7424,0.39,0.2239,162,295,457 +13664,2012-07-28,3,1,7,12,0,6,0,1,0.84,0.7576,0.41,0.2239,157,330,487 +13665,2012-07-28,3,1,7,13,0,6,0,1,0.86,0.7879,0.41,0.2836,173,325,498 +13666,2012-07-28,3,1,7,14,0,6,0,1,0.86,0.7727,0.39,0.2537,186,308,494 +13667,2012-07-28,3,1,7,15,0,6,0,1,0.88,0.7727,0.32,0.2836,206,291,497 +13668,2012-07-28,3,1,7,16,0,6,0,1,0.86,0.7727,0.39,0.2239,208,279,487 +13669,2012-07-28,3,1,7,17,0,6,0,1,0.82,0.7576,0.46,0.194,199,280,479 +13670,2012-07-28,3,1,7,18,0,6,0,1,0.82,0.7424,0.43,0.2836,151,330,481 +13671,2012-07-28,3,1,7,19,0,6,0,1,0.68,0.6364,0.83,0,115,223,338 +13672,2012-07-28,3,1,7,20,0,6,0,1,0.66,0.6061,0.83,0.2239,79,154,233 +13673,2012-07-28,3,1,7,21,0,6,0,1,0.66,0.6061,0.78,0.0896,94,206,300 +13674,2012-07-28,3,1,7,22,0,6,0,1,0.66,0.6061,0.78,0.0896,88,188,276 +13675,2012-07-28,3,1,7,23,0,6,0,1,0.66,0.6061,0.83,0.0896,52,156,208 +13676,2012-07-29,3,1,7,0,0,0,0,1,0.66,0.5909,0.89,0.1343,59,129,188 +13677,2012-07-29,3,1,7,1,0,0,0,1,0.66,0.6061,0.83,0.1045,49,109,158 +13678,2012-07-29,3,1,7,2,0,0,0,1,0.66,0.6212,0.74,0.1343,33,79,112 +13679,2012-07-29,3,1,7,3,0,0,0,1,0.66,0.6212,0.69,0.2239,18,37,55 +13680,2012-07-29,3,1,7,4,0,0,0,1,0.64,0.6061,0.69,0.194,2,11,13 +13681,2012-07-29,3,1,7,5,0,0,0,1,0.62,0.5909,0.78,0.194,3,14,17 +13682,2012-07-29,3,1,7,6,0,0,0,1,0.64,0.6061,0.73,0.1642,6,12,18 +13683,2012-07-29,3,1,7,7,0,0,0,1,0.66,0.6212,0.69,0.2537,4,36,40 +13684,2012-07-29,3,1,7,8,0,0,0,1,0.7,0.6515,0.65,0.2239,37,90,127 +13685,2012-07-29,3,1,7,9,0,0,0,1,0.72,0.6818,0.62,0.2537,64,162,226 +13686,2012-07-29,3,1,7,10,0,0,0,1,0.74,0.6818,0.62,0,123,258,381 
+13687,2012-07-29,3,1,7,11,0,0,0,1,0.76,0.697,0.55,0.2239,183,326,509 +13688,2012-07-29,3,1,7,12,0,0,0,1,0.76,0.697,0.55,0.3284,154,357,511 +13689,2012-07-29,3,1,7,13,0,0,0,1,0.8,0.7424,0.49,0.2239,181,302,483 +13690,2012-07-29,3,1,7,14,0,0,0,1,0.8,0.7273,0.46,0.194,231,291,522 +13691,2012-07-29,3,1,7,15,0,0,0,1,0.8,0.7273,0.43,0.1343,195,306,501 +13692,2012-07-29,3,1,7,16,0,0,0,1,0.8,0.7273,0.43,0.1343,191,330,521 +13693,2012-07-29,3,1,7,17,0,0,0,1,0.82,0.7424,0.41,0.1642,164,367,531 +13694,2012-07-29,3,1,7,18,0,0,0,1,0.8,0.7121,0.41,0.2985,117,302,419 +13695,2012-07-29,3,1,7,19,0,0,0,1,0.76,0.697,0.55,0.1642,143,300,443 +13696,2012-07-29,3,1,7,20,0,0,0,1,0.74,0.697,0.66,0.1045,75,228,303 +13697,2012-07-29,3,1,7,21,0,0,0,1,0.72,0.6818,0.7,0.1343,72,199,271 +13698,2012-07-29,3,1,7,22,0,0,0,1,0.7,0.6515,0.7,0.1045,31,117,148 +13699,2012-07-29,3,1,7,23,0,0,0,1,0.7,0.6515,0.7,0,18,82,100 +13700,2012-07-30,3,1,7,0,0,1,1,1,0.7,0.6667,0.74,0.0896,12,46,58 +13701,2012-07-30,3,1,7,1,0,1,1,1,0.7,0.6667,0.74,0,4,15,19 +13702,2012-07-30,3,1,7,2,0,1,1,1,0.68,0.6364,0.74,0,2,6,8 +13703,2012-07-30,3,1,7,3,0,1,1,1,0.66,0.6061,0.83,0,0,3,3 +13704,2012-07-30,3,1,7,4,0,1,1,1,0.66,0.6061,0.83,0,0,5,5 +13705,2012-07-30,3,1,7,5,0,1,1,1,0.66,0.6061,0.83,0.1045,4,43,47 +13706,2012-07-30,3,1,7,6,0,1,1,1,0.66,0.6061,0.83,0.1642,6,155,161 +13707,2012-07-30,3,1,7,7,0,1,1,1,0.7,0.6667,0.79,0.0896,14,455,469 +13708,2012-07-30,3,1,7,8,0,1,1,2,0.72,0.697,0.74,0.1045,31,720,751 +13709,2012-07-30,3,1,7,9,0,1,1,2,0.74,0.697,0.7,0,34,259,293 +13710,2012-07-30,3,1,7,10,0,1,1,2,0.74,0.697,0.66,0.2239,54,135,189 +13711,2012-07-30,3,1,7,11,0,1,1,2,0.78,0.7273,0.55,0.1343,69,146,215 +13712,2012-07-30,3,1,7,12,0,1,1,2,0.76,0.7121,0.58,0.1045,77,201,278 +13713,2012-07-30,3,1,7,13,0,1,1,1,0.8,0.7424,0.49,0.1642,57,169,226 +13714,2012-07-30,3,1,7,14,0,1,1,1,0.8,0.7424,0.49,0.2239,87,167,254 +13715,2012-07-30,3,1,7,15,0,1,1,1,0.8,0.7424,0.49,0.2537,51,185,236 
+13716,2012-07-30,3,1,7,16,0,1,1,1,0.8,0.7424,0.49,0.2836,88,337,425 +13717,2012-07-30,3,1,7,17,0,1,1,1,0.76,0.697,0.55,0.3284,62,765,827 +13718,2012-07-30,3,1,7,18,0,1,1,1,0.76,0.7121,0.58,0.4179,77,735,812 +13719,2012-07-30,3,1,7,19,0,1,1,1,0.82,0.7727,0.52,0.1642,81,584,665 +13720,2012-07-30,3,1,7,20,0,1,1,1,0.72,0.6818,0.7,0.2239,78,384,462 +13721,2012-07-30,3,1,7,21,0,1,1,1,0.72,0.6818,0.7,0.194,71,251,322 +13722,2012-07-30,3,1,7,22,0,1,1,2,0.7,0.6667,0.74,0.2239,47,183,230 +13723,2012-07-30,3,1,7,23,0,1,1,1,0.7,0.6667,0.74,0.194,34,116,150 +13724,2012-07-31,3,1,7,0,0,2,1,1,0.68,0.6364,0.79,0.1343,11,27,38 +13725,2012-07-31,3,1,7,1,0,2,1,1,0.66,0.6061,0.83,0.1343,3,18,21 +13726,2012-07-31,3,1,7,2,0,2,1,1,0.66,0.6061,0.83,0.1343,0,8,8 +13727,2012-07-31,3,1,7,3,0,2,1,1,0.66,0.6061,0.83,0.0896,1,5,6 +13728,2012-07-31,3,1,7,4,0,2,1,1,0.66,0.6061,0.83,0.0896,0,6,6 +13729,2012-07-31,3,1,7,5,0,2,1,1,0.66,0.6061,0.83,0.1642,5,40,45 +13730,2012-07-31,3,1,7,6,0,2,1,1,0.64,0.5758,0.89,0.1642,2,192,194 +13731,2012-07-31,3,1,7,7,0,2,1,1,0.68,0.6364,0.83,0.2239,21,492,513 +13732,2012-07-31,3,1,7,8,0,2,1,1,0.7,0.6667,0.79,0.1343,30,730,760 +13733,2012-07-31,3,1,7,9,0,2,1,1,0.72,0.6818,0.7,0.1045,31,302,333 +13734,2012-07-31,3,1,7,10,0,2,1,1,0.72,0.6818,0.66,0,73,154,227 +13735,2012-07-31,3,1,7,11,0,2,1,1,0.76,0.7121,0.58,0,54,170,224 +13736,2012-07-31,3,1,7,12,0,2,1,1,0.8,0.7273,0.46,0,50,198,248 +13737,2012-07-31,3,1,7,13,0,2,1,1,0.8,0.7576,0.55,0.2836,69,219,288 +13738,2012-07-31,3,1,7,14,0,2,1,1,0.8,0.7576,0.55,0.2836,72,188,260 +13739,2012-07-31,3,1,7,15,0,2,1,1,0.76,0.7121,0.62,0.1343,29,115,144 +13740,2012-07-31,3,1,7,16,0,2,1,1,0.76,0.7121,0.58,0.2239,66,336,402 +13741,2012-07-31,3,1,7,17,0,2,1,1,0.78,0.7121,0.52,0.2239,94,726,820 +13742,2012-07-31,3,1,7,18,0,2,1,1,0.76,0.697,0.55,0.2836,99,758,857 +13743,2012-07-31,3,1,7,19,0,2,1,3,0.74,0.6818,0.62,0.3582,90,524,614 +13744,2012-07-31,3,1,7,20,0,2,1,3,0.7,0.6515,0.7,0.194,55,397,452 
+13745,2012-07-31,3,1,7,21,0,2,1,1,0.68,0.6364,0.79,0.194,60,292,352 +13746,2012-07-31,3,1,7,22,0,2,1,1,0.68,0.6364,0.74,0.2537,33,224,257 +13747,2012-07-31,3,1,7,23,0,2,1,2,0.66,0.6061,0.83,0.1642,20,127,147 +13748,2012-08-01,3,1,8,0,0,3,1,1,0.68,0.6364,0.79,0.1642,3,44,47 +13749,2012-08-01,3,1,8,1,0,3,1,1,0.66,0.6061,0.83,0.0896,5,28,33 +13750,2012-08-01,3,1,8,2,0,3,1,1,0.64,0.5758,0.83,0.1045,0,13,13 +13751,2012-08-01,3,1,8,3,0,3,1,1,0.64,0.5758,0.83,0.1045,0,7,7 +13752,2012-08-01,3,1,8,4,0,3,1,2,0.64,0.5909,0.78,0.1343,1,3,4 +13753,2012-08-01,3,1,8,5,0,3,1,2,0.64,0.5909,0.78,0.1343,3,46,49 +13754,2012-08-01,3,1,8,6,0,3,1,1,0.64,0.5909,0.78,0.1343,6,179,185 +13755,2012-08-01,3,1,8,7,0,3,1,2,0.64,0.5758,0.83,0.1343,19,468,487 +13756,2012-08-01,3,1,8,8,0,3,1,2,0.66,0.6061,0.78,0.194,32,649,681 +13757,2012-08-01,3,1,8,9,0,3,1,2,0.68,0.6364,0.74,0.0896,34,316,350 +13758,2012-08-01,3,1,8,10,0,3,1,1,0.72,0.6818,0.66,0.0896,61,175,236 +13759,2012-08-01,3,1,8,11,0,3,1,1,0.76,0.7121,0.58,0.0896,54,180,234 +13760,2012-08-01,3,1,8,12,0,3,1,1,0.8,0.7424,0.49,0.1045,75,209,284 +13761,2012-08-01,3,1,8,13,0,3,1,1,0.8,0.7424,0.52,0.1642,59,221,280 +13762,2012-08-01,3,1,8,14,0,3,1,1,0.82,0.7576,0.46,0.1642,88,175,263 +13763,2012-08-01,3,1,8,15,0,3,1,1,0.82,0.7576,0.46,0.1642,89,206,295 +13764,2012-08-01,3,1,8,16,0,3,1,1,0.8,0.7576,0.55,0.2836,93,386,479 +13765,2012-08-01,3,1,8,17,0,3,1,1,0.8,0.7424,0.49,0.1642,103,734,837 +13766,2012-08-01,3,1,8,18,0,3,1,3,0.76,0.7121,0.62,0.2239,105,786,891 +13767,2012-08-01,3,1,8,19,0,3,1,1,0.74,0.697,0.66,0.2537,88,564,652 +13768,2012-08-01,3,1,8,20,0,3,1,1,0.74,0.697,0.66,0.1642,65,448,513 +13769,2012-08-01,3,1,8,21,0,3,1,1,0.72,0.6818,0.7,0.1343,36,284,320 +13770,2012-08-01,3,1,8,22,0,3,1,1,0.72,0.6818,0.7,0.1045,37,251,288 +13771,2012-08-01,3,1,8,23,0,3,1,1,0.7,0.6667,0.74,0,18,134,152 +13772,2012-08-02,3,1,8,0,0,4,1,1,0.7,0.6667,0.74,0,8,55,63 +13773,2012-08-02,3,1,8,1,0,4,1,1,0.68,0.6364,0.79,0.1045,6,36,42 
+13774,2012-08-02,3,1,8,2,0,4,1,1,0.66,0.6061,0.83,0.0896,3,8,11 +13775,2012-08-02,3,1,8,3,0,4,1,1,0.66,0.6061,0.83,0,0,6,6 +13776,2012-08-02,3,1,8,4,0,4,1,1,0.66,0.6061,0.83,0,0,9,9 +13777,2012-08-02,3,1,8,5,0,4,1,1,0.66,0.5909,0.89,0.1343,4,37,41 +13778,2012-08-02,3,1,8,6,0,4,1,1,0.66,0.6061,0.83,0,6,177,183 +13779,2012-08-02,3,1,8,7,0,4,1,1,0.7,0.6667,0.79,0,21,452,473 +13780,2012-08-02,3,1,8,8,0,4,1,1,0.72,0.697,0.74,0.1045,21,718,739 +13781,2012-08-02,3,1,8,9,0,4,1,1,0.76,0.7273,0.66,0.1045,31,312,343 +13782,2012-08-02,3,1,8,10,0,4,1,1,0.8,0.7576,0.55,0,51,130,181 +13783,2012-08-02,3,1,8,11,0,4,1,1,0.82,0.7727,0.52,0.1045,59,167,226 +13784,2012-08-02,3,1,8,12,0,4,1,1,0.84,0.7576,0.44,0.1343,56,230,286 +13785,2012-08-02,3,1,8,13,0,4,1,1,0.86,0.7879,0.44,0.2239,98,212,310 +13786,2012-08-02,3,1,8,14,0,4,1,1,0.86,0.7879,0.41,0.1343,54,171,225 +13787,2012-08-02,3,1,8,15,0,4,1,1,0.78,0.7424,0.62,0.4627,68,202,270 +13788,2012-08-02,3,1,8,16,0,4,1,1,0.82,0.7576,0.46,0.194,71,335,406 +13789,2012-08-02,3,1,8,17,0,4,1,1,0.82,0.7727,0.52,0.2537,90,775,865 +13790,2012-08-02,3,1,8,18,0,4,1,1,0.8,0.7727,0.59,0.2239,86,681,767 +13791,2012-08-02,3,1,8,19,0,4,1,1,0.8,0.7879,0.63,0.1642,98,509,607 +13792,2012-08-02,3,1,8,20,0,4,1,1,0.78,0.7576,0.66,0,51,376,427 +13793,2012-08-02,3,1,8,21,0,4,1,1,0.76,0.7273,0.66,0.2239,41,301,342 +13794,2012-08-02,3,1,8,22,0,4,1,1,0.74,0.697,0.7,0.194,35,227,262 +13795,2012-08-02,3,1,8,23,0,4,1,1,0.72,0.6818,0.7,0.2537,25,152,177 +13796,2012-08-03,3,1,8,0,0,5,1,1,0.72,0.6818,0.7,0.194,15,56,71 +13797,2012-08-03,3,1,8,1,0,5,1,1,0.7,0.6667,0.74,0.194,11,32,43 +13798,2012-08-03,3,1,8,2,0,5,1,1,0.68,0.6364,0.79,0.2239,0,14,14 +13799,2012-08-03,3,1,8,3,0,5,1,1,0.68,0.6364,0.79,0.2239,1,5,6 +13800,2012-08-03,3,1,8,4,0,5,1,1,0.66,0.6061,0.83,0.2537,1,11,12 +13801,2012-08-03,3,1,8,5,0,5,1,1,0.66,0.6061,0.83,0.1642,2,35,37 +13802,2012-08-03,3,1,8,6,0,5,1,1,0.66,0.6061,0.83,0.2239,5,158,163 
+13803,2012-08-03,3,1,8,7,0,5,1,1,0.7,0.6667,0.74,0.1045,25,396,421 +13804,2012-08-03,3,1,8,8,0,5,1,1,0.72,0.6818,0.7,0.1343,32,636,668 +13805,2012-08-03,3,1,8,9,0,5,1,2,0.74,0.697,0.66,0.1045,69,327,396 +13806,2012-08-03,3,1,8,10,0,5,1,2,0.8,0.7424,0.52,0.194,82,167,249 +13807,2012-08-03,3,1,8,11,0,5,1,2,0.84,0.7879,0.49,0.2836,77,186,263 +13808,2012-08-03,3,1,8,12,0,5,1,2,0.86,0.8182,0.5,0.2985,95,243,338 +13809,2012-08-03,3,1,8,13,0,5,1,2,0.86,0.803,0.47,0.2836,98,247,345 +13810,2012-08-03,3,1,8,14,0,5,1,2,0.86,0.7879,0.44,0.2537,102,229,331 +13811,2012-08-03,3,1,8,15,0,5,1,1,0.88,0.8182,0.42,0.2537,112,268,380 +13812,2012-08-03,3,1,8,16,0,5,1,2,0.86,0.7879,0.41,0.2239,112,380,492 +13813,2012-08-03,3,1,8,17,0,5,1,2,0.84,0.803,0.53,0.2836,95,646,741 +13814,2012-08-03,3,1,8,18,0,5,1,2,0.82,0.803,0.59,0.2537,98,573,671 +13815,2012-08-03,3,1,8,19,0,5,1,2,0.8,0.7879,0.63,0.2239,81,388,469 +13816,2012-08-03,3,1,8,20,0,5,1,2,0.78,0.7576,0.66,0.2239,89,300,389 +13817,2012-08-03,3,1,8,21,0,5,1,2,0.76,0.7273,0.66,0.194,54,224,278 +13818,2012-08-03,3,1,8,22,0,5,1,1,0.76,0.7273,0.7,0.2537,46,156,202 +13819,2012-08-03,3,1,8,23,0,5,1,1,0.74,0.7121,0.79,0.1343,26,170,196 +13820,2012-08-04,3,1,8,0,0,6,0,1,0.74,0.7121,0.79,0.1642,24,137,161 +13821,2012-08-04,3,1,8,1,0,6,0,1,0.72,0.7121,0.84,0.194,23,99,122 +13822,2012-08-04,3,1,8,2,0,6,0,1,0.72,0.7121,0.84,0.1343,20,64,84 +13823,2012-08-04,3,1,8,3,0,6,0,1,0.72,0.7121,0.84,0.1343,11,19,30 +13824,2012-08-04,3,1,8,4,0,6,0,1,0.72,0.697,0.79,0.194,2,11,13 +13825,2012-08-04,3,1,8,5,0,6,0,1,0.7,0.6667,0.84,0.2239,1,16,17 +13826,2012-08-04,3,1,8,6,0,6,0,1,0.7,0.6667,0.84,0.2239,12,37,49 +13827,2012-08-04,3,1,8,7,0,6,0,1,0.72,0.697,0.77,0.1642,18,49,67 +13828,2012-08-04,3,1,8,8,0,6,0,1,0.74,0.7273,0.72,0.194,49,132,181 +13829,2012-08-04,3,1,8,9,0,6,0,1,0.78,0.7576,0.65,0.2239,61,217,278 +13830,2012-08-04,3,1,8,10,0,6,0,1,0.82,0.7727,0.57,0.194,108,288,396 +13831,2012-08-04,3,1,8,11,0,6,0,1,0.86,0.803,0.47,0.2537,155,315,470 
+13832,2012-08-04,3,1,8,12,0,6,0,1,0.86,0.7879,0.4,0.3582,222,325,547 +13833,2012-08-04,3,1,8,13,0,6,0,1,0.88,0.8182,0.42,0.2985,195,332,527 +13834,2012-08-04,3,1,8,14,0,6,0,1,0.88,0.803,0.4,0.3881,183,289,472 +13835,2012-08-04,3,1,8,15,0,6,0,1,0.88,0.8182,0.42,0.4179,205,284,489 +13836,2012-08-04,3,1,8,16,0,6,0,1,0.9,0.8182,0.39,0.2985,197,253,450 +13837,2012-08-04,3,1,8,17,0,6,0,1,0.88,0.7879,0.37,0.2985,179,313,492 +13838,2012-08-04,3,1,8,18,0,6,0,1,0.86,0.7879,0.41,0.3284,169,321,490 +13839,2012-08-04,3,1,8,19,0,6,0,1,0.82,0.7727,0.52,0.3582,133,264,397 +13840,2012-08-04,3,1,8,20,0,6,0,1,0.82,0.7727,0.52,0.2985,117,195,312 +13841,2012-08-04,3,1,8,21,0,6,0,1,0.8,0.7727,0.59,0.2537,96,193,289 +13842,2012-08-04,3,1,8,22,0,6,0,1,0.76,0.7273,0.66,0.2836,104,165,269 +13843,2012-08-04,3,1,8,23,0,6,0,1,0.76,0.7273,0.66,0.2985,61,161,222 +13844,2012-08-05,3,1,8,0,0,0,0,1,0.74,0.697,0.7,0.2836,32,121,153 +13845,2012-08-05,3,1,8,1,0,0,0,1,0.74,0.697,0.7,0.2537,8,79,87 +13846,2012-08-05,3,1,8,2,0,0,0,1,0.74,0.697,0.7,0.2985,5,68,73 +13847,2012-08-05,3,1,8,3,0,0,0,1,0.72,0.697,0.74,0.2537,9,32,41 +13848,2012-08-05,3,1,8,4,0,0,0,1,0.72,0.697,0.74,0.2537,3,11,14 +13849,2012-08-05,3,1,8,5,0,0,0,1,0.72,0.697,0.74,0.2537,1,18,19 +13850,2012-08-05,3,1,8,6,0,0,0,1,0.72,0.697,0.79,0.2537,7,12,19 +13851,2012-08-05,3,1,8,7,0,0,0,1,0.74,0.7121,0.74,0.2985,18,50,68 +13852,2012-08-05,3,1,8,8,0,0,0,1,0.76,0.7273,0.66,0.3284,27,81,108 +13853,2012-08-05,3,1,8,9,0,0,0,1,0.8,0.7727,0.59,0.3582,61,168,229 +13854,2012-08-05,3,1,8,10,0,0,0,1,0.8,0.7727,0.59,0.4179,111,253,364 +13855,2012-08-05,3,1,8,11,0,0,0,1,0.82,0.7879,0.56,0.4179,155,282,437 +13856,2012-08-05,3,1,8,12,0,0,0,1,0.86,0.803,0.47,0.5224,161,330,491 +13857,2012-08-05,3,1,8,13,0,0,0,1,0.88,0.8182,0.44,0.4627,208,315,523 +13858,2012-08-05,3,1,8,14,0,0,0,1,0.9,0.8485,0.42,0.4627,161,365,526 +13859,2012-08-05,3,1,8,15,0,0,0,1,0.9,0.8182,0.37,0.5224,164,286,450 +13860,2012-08-05,3,1,8,16,0,0,0,1,0.82,0.7576,0.46,0.2537,143,278,421 
+13861,2012-08-05,3,1,8,17,0,0,0,1,0.82,0.7576,0.46,0.2537,122,260,382 +13862,2012-08-05,3,1,8,18,0,0,0,3,0.7,0.6667,0.74,0.1343,75,154,229 +13863,2012-08-05,3,1,8,19,0,0,0,1,0.7,0.6667,0.84,0.2239,46,139,185 +13864,2012-08-05,3,1,8,20,0,0,0,1,0.7,0.6667,0.84,0.1045,53,140,193 +13865,2012-08-05,3,1,8,21,0,0,0,1,0.72,0.697,0.79,0.1045,51,157,208 +13866,2012-08-05,3,1,8,22,0,0,0,1,0.72,0.697,0.79,0.0896,57,100,157 +13867,2012-08-05,3,1,8,23,0,0,0,2,0.72,0.697,0.79,0.1642,29,58,87 +13868,2012-08-06,3,1,8,0,0,1,1,2,0.72,0.697,0.79,0,9,24,33 +13869,2012-08-06,3,1,8,1,0,1,1,1,0.72,0.697,0.79,0.0896,1,10,11 +13870,2012-08-06,3,1,8,2,0,1,1,2,0.72,0.697,0.79,0.1343,0,5,5 +13871,2012-08-06,3,1,8,3,0,1,1,3,0.72,0.697,0.79,0.0896,0,5,5 +13872,2012-08-06,3,1,8,4,0,1,1,2,0.72,0.697,0.74,0.1343,3,8,11 +13873,2012-08-06,3,1,8,5,0,1,1,1,0.7,0.6667,0.79,0.194,2,23,25 +13874,2012-08-06,3,1,8,6,0,1,1,2,0.7,0.6667,0.79,0.2239,4,137,141 +13875,2012-08-06,3,1,8,7,0,1,1,3,0.72,0.6818,0.7,0.194,14,393,407 +13876,2012-08-06,3,1,8,8,0,1,1,2,0.7,0.6667,0.79,0,27,578,605 +13877,2012-08-06,3,1,8,9,0,1,1,3,0.7,0.6667,0.79,0.1642,28,248,276 +13878,2012-08-06,3,1,8,10,0,1,1,1,0.74,0.697,0.7,0.2239,54,159,213 +13879,2012-08-06,3,1,8,11,0,1,1,1,0.76,0.7273,0.66,0.1343,108,152,260 +13880,2012-08-06,3,1,8,12,0,1,1,2,0.78,0.7273,0.55,0.0896,72,213,285 +13881,2012-08-06,3,1,8,13,0,1,1,1,0.8,0.7424,0.49,0.2239,85,204,289 +13882,2012-08-06,3,1,8,14,0,1,1,1,0.82,0.7576,0.46,0,99,188,287 +13883,2012-08-06,3,1,8,15,0,1,1,1,0.82,0.7576,0.46,0.1343,91,183,274 +13884,2012-08-06,3,1,8,16,0,1,1,1,0.84,0.7576,0.41,0.1045,88,363,451 +13885,2012-08-06,3,1,8,17,0,1,1,1,0.82,0.7424,0.41,0,112,746,858 +13886,2012-08-06,3,1,8,18,0,1,1,1,0.82,0.7424,0.41,0.1343,100,743,843 +13887,2012-08-06,3,1,8,19,0,1,1,1,0.78,0.7424,0.59,0.194,109,531,640 +13888,2012-08-06,3,1,8,20,0,1,1,1,0.76,0.7273,0.66,0.1642,89,368,457 +13889,2012-08-06,3,1,8,21,0,1,1,1,0.74,0.697,0.7,0.194,72,245,317 
+13890,2012-08-06,3,1,8,22,0,1,1,1,0.74,0.697,0.7,0.0896,50,157,207 +13891,2012-08-06,3,1,8,23,0,1,1,1,0.72,0.697,0.74,0.194,16,97,113 +13892,2012-08-07,3,1,8,0,0,2,1,1,0.7,0.6667,0.84,0.194,15,32,47 +13893,2012-08-07,3,1,8,1,0,2,1,1,0.7,0.6667,0.84,0.1343,2,16,18 +13894,2012-08-07,3,1,8,2,0,2,1,1,0.7,0.6667,0.84,0.1343,6,7,13 +13895,2012-08-07,3,1,8,3,0,2,1,1,0.7,0.6667,0.84,0,0,6,6 +13896,2012-08-07,3,1,8,4,0,2,1,1,0.68,0.6364,0.83,0.1343,2,7,9 +13897,2012-08-07,3,1,8,5,0,2,1,1,0.7,0.6667,0.79,0.1045,3,33,36 +13898,2012-08-07,3,1,8,6,0,2,1,2,0.7,0.6667,0.79,0.194,3,176,179 +13899,2012-08-07,3,1,8,7,0,2,1,2,0.7,0.6667,0.74,0.1343,21,481,502 +13900,2012-08-07,3,1,8,8,0,2,1,2,0.7,0.6515,0.7,0.1642,41,664,705 +13901,2012-08-07,3,1,8,9,0,2,1,2,0.7,0.6667,0.74,0.1343,44,283,327 +13902,2012-08-07,3,1,8,10,0,2,1,2,0.74,0.697,0.7,0.1343,89,161,250 +13903,2012-08-07,3,1,8,11,0,2,1,2,0.76,0.7273,0.66,0.0896,84,130,214 +13904,2012-08-07,3,1,8,12,0,2,1,2,0.8,0.7576,0.55,0.1343,86,197,283 +13905,2012-08-07,3,1,8,13,0,2,1,2,0.8,0.7424,0.52,0.194,68,185,253 +13906,2012-08-07,3,1,8,14,0,2,1,2,0.82,0.7576,0.46,0,76,185,261 +13907,2012-08-07,3,1,8,15,0,2,1,1,0.8,0.7424,0.52,0,100,206,306 +13908,2012-08-07,3,1,8,16,0,2,1,3,0.76,0.7273,0.66,0.2836,101,344,445 +13909,2012-08-07,3,1,8,17,0,2,1,2,0.78,0.7424,0.62,0.1343,125,743,868 +13910,2012-08-07,3,1,8,18,0,2,1,2,0.76,0.7121,0.62,0.1642,103,711,814 +13911,2012-08-07,3,1,8,19,0,2,1,2,0.76,0.7273,0.66,0.1045,104,506,610 +13912,2012-08-07,3,1,8,20,0,2,1,2,0.74,0.697,0.7,0.1343,74,374,448 +13913,2012-08-07,3,1,8,21,0,2,1,2,0.72,0.697,0.74,0,70,247,317 +13914,2012-08-07,3,1,8,22,0,2,1,1,0.72,0.697,0.74,0.1045,43,181,224 +13915,2012-08-07,3,1,8,23,0,2,1,1,0.72,0.697,0.79,0,18,120,138 +13916,2012-08-08,3,1,8,0,0,3,1,1,0.7,0.6667,0.84,0,12,46,58 +13917,2012-08-08,3,1,8,1,0,3,1,1,0.72,0.697,0.79,0,3,20,23 +13918,2012-08-08,3,1,8,2,0,3,1,1,0.7,0.6667,0.84,0,1,5,6 +13919,2012-08-08,3,1,8,3,0,3,1,1,0.7,0.6667,0.84,0,1,6,7 
+13920,2012-08-08,3,1,8,4,0,3,1,1,0.7,0.6667,0.84,0,1,6,7 +13921,2012-08-08,3,1,8,5,0,3,1,2,0.68,0.6364,0.89,0.1642,3,40,43 +13922,2012-08-08,3,1,8,6,0,3,1,2,0.7,0.6667,0.84,0.1642,4,169,173 +13923,2012-08-08,3,1,8,7,0,3,1,2,0.7,0.6667,0.84,0.1642,24,458,482 +13924,2012-08-08,3,1,8,8,0,3,1,2,0.74,0.7273,0.72,0.0896,48,689,737 +13925,2012-08-08,3,1,8,9,0,3,1,2,0.76,0.7273,0.66,0,33,308,341 +13926,2012-08-08,3,1,8,10,0,3,1,2,0.8,0.7424,0.52,0.1343,63,151,214 +13927,2012-08-08,3,1,8,11,0,3,1,2,0.8,0.7424,0.52,0,80,159,239 +13928,2012-08-08,3,1,8,12,0,3,1,2,0.8,0.7424,0.52,0,63,217,280 +13929,2012-08-08,3,1,8,13,0,3,1,2,0.8,0.7424,0.52,0,101,212,313 +13930,2012-08-08,3,1,8,14,0,3,1,2,0.8,0.7424,0.52,0.2537,63,173,236 +13931,2012-08-08,3,1,8,15,0,3,1,2,0.84,0.7576,0.44,0.194,110,168,278 +13932,2012-08-08,3,1,8,16,0,3,1,2,0.8,0.7424,0.49,0.2836,113,328,441 +13933,2012-08-08,3,1,8,17,0,3,1,2,0.8,0.7576,0.55,0.2836,107,751,858 +13934,2012-08-08,3,1,8,18,0,3,1,2,0.78,0.7273,0.55,0.1642,117,745,862 +13935,2012-08-08,3,1,8,19,0,3,1,1,0.76,0.7273,0.66,0.2239,119,567,686 +13936,2012-08-08,3,1,8,20,0,3,1,1,0.74,0.697,0.66,0.194,76,424,500 +13937,2012-08-08,3,1,8,21,0,3,1,1,0.74,0.697,0.66,0.1045,70,311,381 +13938,2012-08-08,3,1,8,22,0,3,1,1,0.72,0.697,0.74,0.1045,35,198,233 +13939,2012-08-08,3,1,8,23,0,3,1,1,0.72,0.6818,0.7,0.1343,16,120,136 +13940,2012-08-09,3,1,8,0,0,4,1,1,0.72,0.697,0.74,0.194,16,51,67 +13941,2012-08-09,3,1,8,1,0,4,1,1,0.7,0.6667,0.79,0.1343,5,23,28 +13942,2012-08-09,3,1,8,2,0,4,1,1,0.7,0.6667,0.84,0.0896,2,12,14 +13943,2012-08-09,3,1,8,3,0,4,1,1,0.68,0.6364,0.83,0.0896,1,5,6 +13944,2012-08-09,3,1,8,4,0,4,1,2,0.68,0.6364,0.89,0.1343,0,10,10 +13945,2012-08-09,3,1,8,5,0,4,1,2,0.66,0.5909,0.94,0,4,37,41 +13946,2012-08-09,3,1,8,6,0,4,1,1,0.66,0.6061,0.83,0.1642,5,162,167 +13947,2012-08-09,3,1,8,7,0,4,1,2,0.7,0.6667,0.84,0.1642,24,451,475 +13948,2012-08-09,3,1,8,8,0,4,1,1,0.72,0.697,0.74,0.1045,28,670,698 
+13949,2012-08-09,3,1,8,9,0,4,1,1,0.76,0.7273,0.66,0.1642,54,299,353 +13950,2012-08-09,3,1,8,10,0,4,1,1,0.8,0.7576,0.55,0.2239,72,133,205 +13951,2012-08-09,3,1,8,11,0,4,1,1,0.84,0.7576,0.44,0,94,166,260 +13952,2012-08-09,3,1,8,12,0,4,1,1,0.84,0.7576,0.44,0.1045,85,192,277 +13953,2012-08-09,3,1,8,13,0,4,1,1,0.86,0.7879,0.41,0.1045,80,201,281 +13954,2012-08-09,3,1,8,14,0,4,1,1,0.88,0.7727,0.32,0.2239,86,161,247 +13955,2012-08-09,3,1,8,15,0,4,1,1,0.88,0.7727,0.32,0.2239,63,204,267 +13956,2012-08-09,3,1,8,16,0,4,1,1,0.86,0.7727,0.39,0.2537,97,320,417 +13957,2012-08-09,3,1,8,17,0,4,1,1,0.86,0.7576,0.36,0.2537,111,699,810 +13958,2012-08-09,3,1,8,18,0,4,1,2,0.82,0.7576,0.46,0.2537,78,733,811 +13959,2012-08-09,3,1,8,19,0,4,1,1,0.8,0.7576,0.55,0.194,90,533,623 +13960,2012-08-09,3,1,8,20,0,4,1,1,0.7,0.6515,0.54,0.2836,91,387,478 +13961,2012-08-09,3,1,8,21,0,4,1,2,0.7,0.6515,0.58,0.1642,50,286,336 +13962,2012-08-09,3,1,8,22,0,4,1,2,0.66,0.6212,0.69,0.2239,41,218,259 +13963,2012-08-09,3,1,8,23,0,4,1,1,0.66,0.6212,0.74,0,19,137,156 +13964,2012-08-10,3,1,8,0,0,5,1,1,0.7,0.6515,0.65,0.0896,17,68,85 +13965,2012-08-10,3,1,8,1,0,5,1,1,0.7,0.6515,0.65,0.1045,6,35,41 +13966,2012-08-10,3,1,8,2,0,5,1,1,0.7,0.6667,0.74,0.1642,6,14,20 +13967,2012-08-10,3,1,8,3,0,5,1,2,0.72,0.697,0.79,0.194,3,6,9 +13968,2012-08-10,3,1,8,4,0,5,1,3,0.7,0.6667,0.84,0.2836,0,7,7 +13969,2012-08-10,3,1,8,5,0,5,1,3,0.7,0.6667,0.84,0.2836,4,29,33 +13970,2012-08-10,3,1,8,6,0,5,1,3,0.64,0.5758,0.89,0.2985,3,41,44 +13971,2012-08-10,3,1,8,7,0,5,1,3,0.64,0.5758,0.89,0.1045,5,128,133 +13972,2012-08-10,3,1,8,8,0,5,1,3,0.62,0.5455,0.91,0.1642,7,112,119 +13973,2012-08-10,3,1,8,9,0,5,1,2,0.64,0.5758,0.89,0,21,199,220 +13974,2012-08-10,3,1,8,10,0,5,1,1,0.66,0.5909,0.89,0.1045,27,166,193 +13975,2012-08-10,3,1,8,11,0,5,1,3,0.68,0.6364,0.79,0.1343,73,178,251 +13976,2012-08-10,3,1,8,12,0,5,1,2,0.74,0.697,0.66,0.3582,75,240,315 +13977,2012-08-10,3,1,8,13,0,5,1,2,0.76,0.7121,0.58,0.3582,66,224,290 
+13978,2012-08-10,3,1,8,14,0,5,1,1,0.76,0.7121,0.58,0.3582,71,236,307 +13979,2012-08-10,3,1,8,15,0,5,1,1,0.8,0.7424,0.52,0.4478,100,224,324 +13980,2012-08-10,3,1,8,16,0,5,1,1,0.78,0.7273,0.55,0.2985,96,371,467 +13981,2012-08-10,3,1,8,17,0,5,1,1,0.8,0.7424,0.52,0.2836,111,619,730 +13982,2012-08-10,3,1,8,18,0,5,1,1,0.76,0.7121,0.62,0.2985,88,552,640 +13983,2012-08-10,3,1,8,19,0,5,1,1,0.76,0.7121,0.62,0.2836,98,394,492 +13984,2012-08-10,3,1,8,20,0,5,1,1,0.74,0.697,0.66,0.2985,55,315,370 +13985,2012-08-10,3,1,8,21,0,5,1,1,0.74,0.697,0.66,0.2537,50,236,286 +13986,2012-08-10,3,1,8,22,0,5,1,1,0.72,0.697,0.74,0.2836,50,168,218 +13987,2012-08-10,3,1,8,23,0,5,1,1,0.72,0.6818,0.7,0.2836,33,159,192 +13988,2012-08-11,3,1,8,0,0,6,0,1,0.7,0.6515,0.7,0.2836,38,142,180 +13989,2012-08-11,3,1,8,1,0,6,0,1,0.66,0.6212,0.74,0.2239,32,84,116 +13990,2012-08-11,3,1,8,2,0,6,0,3,0.64,0.5909,0.78,0.194,19,66,85 +13991,2012-08-11,3,1,8,3,0,6,0,3,0.62,0.5758,0.83,0.1642,2,19,21 +13992,2012-08-11,3,1,8,4,0,6,0,1,0.62,0.5758,0.83,0.1045,3,7,10 +13993,2012-08-11,3,1,8,5,0,6,0,1,0.62,0.5606,0.88,0.1045,2,9,11 +13994,2012-08-11,3,1,8,6,0,6,0,2,0.62,0.5606,0.88,0.1045,5,18,23 +13995,2012-08-11,3,1,8,7,0,6,0,1,0.64,0.5758,0.83,0.0896,8,54,62 +13996,2012-08-11,3,1,8,8,0,6,0,1,0.66,0.6061,0.83,0.1045,30,132,162 +13997,2012-08-11,3,1,8,9,0,6,0,1,0.68,0.6364,0.79,0.194,54,217,271 +13998,2012-08-11,3,1,8,10,0,6,0,1,0.72,0.6818,0.62,0.0896,125,282,407 +13999,2012-08-11,3,1,8,11,0,6,0,1,0.74,0.697,0.66,0.1642,203,296,499 +14000,2012-08-11,3,1,8,12,0,6,0,1,0.76,0.7121,0.58,0.2239,214,332,546 +14001,2012-08-11,3,1,8,13,0,6,0,1,0.8,0.7424,0.49,0.3582,228,341,569 +14002,2012-08-11,3,1,8,14,0,6,0,1,0.8,0.7424,0.49,0.2985,248,290,538 +14003,2012-08-11,3,1,8,15,0,6,0,1,0.8,0.7273,0.46,0.2836,246,316,562 +14004,2012-08-11,3,1,8,16,0,6,0,1,0.8,0.7273,0.46,0.2985,227,304,531 +14005,2012-08-11,3,1,8,17,0,6,0,3,0.74,0.697,0.66,0.3582,220,292,512 +14006,2012-08-11,3,1,8,18,0,6,0,2,0.7,0.6667,0.74,0.2985,107,193,300 
+14007,2012-08-11,3,1,8,19,0,6,0,2,0.66,0.5909,0.89,0.1045,97,178,275 +14008,2012-08-11,3,1,8,20,0,6,0,3,0.66,0.5909,0.89,0.4179,31,129,160 +14009,2012-08-11,3,1,8,21,0,6,0,2,0.66,0.5909,0.89,0.1642,26,102,128 +14010,2012-08-11,3,1,8,22,0,6,0,2,0.66,0.5909,0.89,0.1642,40,128,168 +14011,2012-08-11,3,1,8,23,0,6,0,3,0.66,0.6061,0.78,0.1642,42,121,163 +14012,2012-08-12,3,1,8,0,0,0,0,2,0.64,0.6061,0.73,0.1343,24,96,120 +14013,2012-08-12,3,1,8,1,0,0,0,1,0.62,0.5909,0.78,0.1343,21,92,113 +14014,2012-08-12,3,1,8,2,0,0,0,1,0.64,0.6061,0.69,0.1343,19,67,86 +14015,2012-08-12,3,1,8,3,0,0,0,1,0.62,0.5909,0.73,0.0896,11,37,48 +14016,2012-08-12,3,1,8,4,0,0,0,1,0.64,0.6061,0.69,0.0896,2,8,10 +14017,2012-08-12,3,1,8,5,0,0,0,1,0.64,0.6212,0.61,0.194,1,9,10 +14018,2012-08-12,3,1,8,6,0,0,0,1,0.64,0.6061,0.65,0.1642,2,14,16 +14019,2012-08-12,3,1,8,7,0,0,0,1,0.64,0.6061,0.65,0.1343,9,30,39 +14020,2012-08-12,3,1,8,8,0,0,0,1,0.66,0.6212,0.54,0.2239,35,84,119 +14021,2012-08-12,3,1,8,9,0,0,0,1,0.7,0.6515,0.48,0.2239,64,153,217 +14022,2012-08-12,3,1,8,10,0,0,0,1,0.7,0.6515,0.51,0.1642,88,240,328 +14023,2012-08-12,3,1,8,11,0,0,0,1,0.74,0.6667,0.42,0.1642,167,282,449 +14024,2012-08-12,3,1,8,12,0,0,0,1,0.74,0.6667,0.42,0.194,163,342,505 +14025,2012-08-12,3,1,8,13,0,0,0,1,0.78,0.697,0.4,0,178,365,543 +14026,2012-08-12,3,1,8,14,0,0,0,1,0.76,0.6818,0.4,0.194,213,366,579 +14027,2012-08-12,3,1,8,15,0,0,0,1,0.76,0.6818,0.4,0.1343,235,342,577 +14028,2012-08-12,3,1,8,16,0,0,0,1,0.76,0.6818,0.4,0.1045,213,300,513 +14029,2012-08-12,3,1,8,17,0,0,0,1,0.8,0.697,0.33,0,186,319,505 +14030,2012-08-12,3,1,8,18,0,0,0,1,0.78,0.6818,0.35,0.1045,164,327,491 +14031,2012-08-12,3,1,8,19,0,0,0,1,0.74,0.6515,0.4,0.1343,148,317,465 +14032,2012-08-12,3,1,8,20,0,0,0,1,0.72,0.6515,0.45,0,96,204,300 +14033,2012-08-12,3,1,8,21,0,0,0,1,0.72,0.6667,0.51,0.0896,78,142,220 +14034,2012-08-12,3,1,8,22,0,0,0,1,0.7,0.6515,0.58,0.1343,40,141,181 +14035,2012-08-12,3,1,8,23,0,0,0,1,0.68,0.6364,0.61,0,25,85,110 
+14036,2012-08-13,3,1,8,0,0,1,1,1,0.66,0.6212,0.65,0.1045,14,33,47 +14037,2012-08-13,3,1,8,1,0,1,1,1,0.66,0.6212,0.65,0.0896,3,11,14 +14038,2012-08-13,3,1,8,2,0,1,1,1,0.64,0.6061,0.69,0.1045,1,8,9 +14039,2012-08-13,3,1,8,3,0,1,1,1,0.64,0.6061,0.69,0,1,5,6 +14040,2012-08-13,3,1,8,4,0,1,1,1,0.64,0.6061,0.69,0.1045,0,11,11 +14041,2012-08-13,3,1,8,5,0,1,1,1,0.64,0.6061,0.69,0.1045,3,33,36 +14042,2012-08-13,3,1,8,6,0,1,1,1,0.62,0.5909,0.73,0.0896,4,155,159 +14043,2012-08-13,3,1,8,7,0,1,1,1,0.66,0.6212,0.65,0.1343,11,425,436 +14044,2012-08-13,3,1,8,8,0,1,1,2,0.72,0.6667,0.51,0.0896,25,648,673 +14045,2012-08-13,3,1,8,9,0,1,1,2,0.74,0.6667,0.45,0,40,265,305 +14046,2012-08-13,3,1,8,10,0,1,1,2,0.76,0.6818,0.4,0,88,111,199 +14047,2012-08-13,3,1,8,11,0,1,1,1,0.76,0.6818,0.4,0.0896,94,151,245 +14048,2012-08-13,3,1,8,12,0,1,1,1,0.78,0.697,0.4,0.194,96,180,276 +14049,2012-08-13,3,1,8,13,0,1,1,1,0.78,0.697,0.4,0.1642,79,175,254 +14050,2012-08-13,3,1,8,14,0,1,1,1,0.8,0.697,0.33,0.1045,85,163,248 +14051,2012-08-13,3,1,8,15,0,1,1,1,0.82,0.7273,0.34,0.2239,80,194,274 +14052,2012-08-13,3,1,8,16,0,1,1,1,0.8,0.7121,0.36,0.0896,116,348,464 +14053,2012-08-13,3,1,8,17,0,1,1,1,0.8,0.7121,0.36,0.1642,102,716,818 +14054,2012-08-13,3,1,8,18,0,1,1,1,0.76,0.697,0.52,0.2836,103,709,812 +14055,2012-08-13,3,1,8,19,0,1,1,1,0.74,0.6818,0.58,0.1642,88,467,555 +14056,2012-08-13,3,1,8,20,0,1,1,1,0.74,0.6818,0.62,0.2239,58,374,432 +14057,2012-08-13,3,1,8,21,0,1,1,1,0.72,0.6818,0.66,0.194,44,246,290 +14058,2012-08-13,3,1,8,22,0,1,1,1,0.72,0.6818,0.62,0.3284,44,148,192 +14059,2012-08-13,3,1,8,23,0,1,1,1,0.7,0.6515,0.7,0.2239,28,100,128 +14060,2012-08-14,3,1,8,0,0,2,1,1,0.7,0.6667,0.74,0.2239,12,48,60 +14061,2012-08-14,3,1,8,1,0,2,1,1,0.68,0.6364,0.79,0.2537,8,19,27 +14062,2012-08-14,3,1,8,2,0,2,1,2,0.68,0.6364,0.83,0.194,2,9,11 +14063,2012-08-14,3,1,8,3,0,2,1,2,0.68,0.6364,0.83,0.194,0,3,3 +14064,2012-08-14,3,1,8,4,0,2,1,2,0.68,0.6364,0.83,0.194,0,5,5 
+14065,2012-08-14,3,1,8,5,0,2,1,2,0.68,0.6364,0.89,0.194,1,35,36 +14066,2012-08-14,3,1,8,6,0,2,1,3,0.64,0.5758,0.83,0.194,1,63,64 +14067,2012-08-14,3,1,8,7,0,2,1,2,0.66,0.6061,0.83,0.0896,1,178,179 +14068,2012-08-14,3,1,8,8,0,2,1,2,0.66,0.5909,0.89,0.0896,27,591,618 +14069,2012-08-14,3,1,8,9,0,2,1,2,0.7,0.6667,0.74,0,48,354,402 +14070,2012-08-14,3,1,8,10,0,2,1,1,0.7,0.6667,0.79,0.1045,53,155,208 +14071,2012-08-14,3,1,8,11,0,2,1,1,0.74,0.6818,0.62,0,69,127,196 +14072,2012-08-14,3,1,8,12,0,2,1,1,0.78,0.7121,0.49,0.1343,93,207,300 +14073,2012-08-14,3,1,8,13,0,2,1,1,0.8,0.7273,0.46,0.0896,83,203,286 +14074,2012-08-14,3,1,8,14,0,2,1,1,0.82,0.7424,0.41,0.2239,80,175,255 +14075,2012-08-14,3,1,8,15,0,2,1,1,0.82,0.7576,0.46,0.2239,84,199,283 +14076,2012-08-14,3,1,8,16,0,2,1,1,0.82,0.7424,0.43,0.2239,95,363,458 +14077,2012-08-14,3,1,8,17,0,2,1,3,0.76,0.7273,0.66,0.2537,78,734,812 +14078,2012-08-14,3,1,8,18,0,2,1,1,0.76,0.7121,0.62,0.2836,97,757,854 +14079,2012-08-14,3,1,8,19,0,2,1,1,0.76,0.7273,0.66,0.194,94,533,627 +14080,2012-08-14,3,1,8,20,0,2,1,1,0.74,0.7121,0.74,0.2239,65,371,436 +14081,2012-08-14,3,1,8,21,0,2,1,1,0.74,0.6818,0.62,0.2239,56,252,308 +14082,2012-08-14,3,1,8,22,0,2,1,1,0.72,0.6818,0.66,0.0896,28,161,189 +14083,2012-08-14,3,1,8,23,0,2,1,1,0.72,0.6818,0.66,0.1642,53,114,167 +14084,2012-08-15,3,1,8,0,0,3,1,1,0.7,0.6515,0.61,0.194,8,52,60 +14085,2012-08-15,3,1,8,1,0,3,1,1,0.7,0.6515,0.61,0,6,31,37 +14086,2012-08-15,3,1,8,2,0,3,1,1,0.68,0.6364,0.65,0,1,7,8 +14087,2012-08-15,3,1,8,3,0,3,1,1,0.66,0.6212,0.74,0.1045,1,10,11 +14088,2012-08-15,3,1,8,4,0,3,1,1,0.64,0.5758,0.83,0.1045,0,4,4 +14089,2012-08-15,3,1,8,5,0,3,1,1,0.64,0.5909,0.78,0.1045,4,35,39 +14090,2012-08-15,3,1,8,6,0,3,1,1,0.64,0.5909,0.78,0.1045,12,160,172 +14091,2012-08-15,3,1,8,7,0,3,1,2,0.66,0.6212,0.69,0.194,25,467,492 +14092,2012-08-15,3,1,8,8,0,3,1,2,0.7,0.6515,0.65,0.2836,40,642,682 +14093,2012-08-15,3,1,8,9,0,3,1,2,0.72,0.6818,0.62,0.194,57,310,367 
+14094,2012-08-15,3,1,8,10,0,3,1,2,0.72,0.6818,0.62,0.194,70,163,233 +14095,2012-08-15,3,1,8,11,0,3,1,2,0.74,0.6818,0.58,0.194,80,155,235 +14096,2012-08-15,3,1,8,12,0,3,1,1,0.74,0.6818,0.55,0.2239,77,230,307 +14097,2012-08-15,3,1,8,13,0,3,1,2,0.78,0.7121,0.49,0.194,88,206,294 +14098,2012-08-15,3,1,8,14,0,3,1,2,0.76,0.697,0.52,0.2985,94,161,255 +14099,2012-08-15,3,1,8,15,0,3,1,1,0.76,0.697,0.52,0.2537,70,196,266 +14100,2012-08-15,3,1,8,16,0,3,1,1,0.76,0.6818,0.48,0.194,91,340,431 +14101,2012-08-15,3,1,8,17,0,3,1,1,0.74,0.6818,0.55,0.2836,102,749,851 +14102,2012-08-15,3,1,8,18,0,3,1,1,0.74,0.6667,0.51,0.2239,80,768,848 +14103,2012-08-15,3,1,8,19,0,3,1,1,0.7,0.6515,0.61,0.1343,124,525,649 +14104,2012-08-15,3,1,8,20,0,3,1,1,0.7,0.6515,0.61,0.1343,72,355,427 +14105,2012-08-15,3,1,8,21,0,3,1,1,0.7,0.6515,0.61,0.194,49,266,315 +14106,2012-08-15,3,1,8,22,0,3,1,1,0.7,0.6515,0.61,0.1642,29,197,226 +14107,2012-08-15,3,1,8,23,0,3,1,1,0.68,0.6364,0.65,0.1045,18,120,138 +14108,2012-08-16,3,1,8,0,0,4,1,1,0.66,0.6212,0.69,0,13,63,76 +14109,2012-08-16,3,1,8,1,0,4,1,1,0.64,0.6061,0.73,0,3,18,21 +14110,2012-08-16,3,1,8,2,0,4,1,1,0.64,0.6061,0.69,0.194,0,15,15 +14111,2012-08-16,3,1,8,3,0,4,1,1,0.62,0.5909,0.73,0.2537,0,4,4 +14112,2012-08-16,3,1,8,4,0,4,1,1,0.62,0.5909,0.73,0.2239,2,3,5 +14113,2012-08-16,3,1,8,5,0,4,1,1,0.6,0.5758,0.78,0.1642,7,30,37 +14114,2012-08-16,3,1,8,6,0,4,1,1,0.6,0.5758,0.78,0.1642,3,162,165 +14115,2012-08-16,3,1,8,7,0,4,1,1,0.64,0.6061,0.69,0.2239,16,448,464 +14116,2012-08-16,3,1,8,8,0,4,1,1,0.68,0.6364,0.61,0.194,33,649,682 +14117,2012-08-16,3,1,8,9,0,4,1,1,0.72,0.6667,0.54,0.1045,41,296,337 +14118,2012-08-16,3,1,8,10,0,4,1,1,0.74,0.6667,0.48,0.1343,78,121,199 +14119,2012-08-16,3,1,8,11,0,4,1,1,0.76,0.6667,0.37,0.0896,83,191,274 +14120,2012-08-16,3,1,8,12,0,4,1,1,0.8,0.697,0.33,0.1045,96,247,343 +14121,2012-08-16,3,1,8,13,0,4,1,1,0.8,0.697,0.31,0.2239,81,219,300 +14122,2012-08-16,3,1,8,14,0,4,1,1,0.82,0.7121,0.32,0,72,176,248 
+14123,2012-08-16,3,1,8,15,0,4,1,1,0.8,0.697,0.33,0.2537,63,197,260 +14124,2012-08-16,3,1,8,16,0,4,1,1,0.82,0.7121,0.32,0.2239,79,340,419 +14125,2012-08-16,3,1,8,17,0,4,1,1,0.82,0.7273,0.34,0.194,130,767,897 +14126,2012-08-16,3,1,8,18,0,4,1,1,0.8,0.7121,0.36,0,109,723,832 +14127,2012-08-16,3,1,8,19,0,4,1,1,0.76,0.6818,0.4,0.1045,119,558,677 +14128,2012-08-16,3,1,8,20,0,4,1,1,0.76,0.6818,0.4,0.0896,100,414,514 +14129,2012-08-16,3,1,8,21,0,4,1,1,0.74,0.6667,0.48,0.194,83,273,356 +14130,2012-08-16,3,1,8,22,0,4,1,1,0.72,0.6667,0.51,0.1642,69,185,254 +14131,2012-08-16,3,1,8,23,0,4,1,1,0.7,0.6515,0.54,0.1045,58,168,226 +14132,2012-08-17,3,1,8,0,0,5,1,1,0.68,0.2424,0.57,0.1642,21,67,88 +14133,2012-08-17,3,1,8,1,0,5,1,1,0.66,0.2424,0.65,0.1045,16,38,54 +14134,2012-08-17,3,1,8,2,0,5,1,1,0.66,0.2424,0.61,0.1343,4,15,19 +14135,2012-08-17,3,1,8,3,0,5,1,1,0.64,0.2424,0.65,0.1045,0,6,6 +14136,2012-08-17,3,1,8,4,0,5,1,1,0.64,0.2424,0.73,0.1642,0,9,9 +14137,2012-08-17,3,1,8,5,0,5,1,1,0.64,0.2424,0.73,0.1045,2,34,36 +14138,2012-08-17,3,1,8,6,0,5,1,1,0.62,0.2424,0.78,0.1343,6,151,157 +14139,2012-08-17,3,1,8,7,0,5,1,1,0.64,0.2424,0.73,0.1045,11,368,379 +14140,2012-08-17,3,1,8,8,0,5,1,1,0.68,0.2424,0.65,0.1343,43,625,668 +14141,2012-08-17,3,1,8,9,0,5,1,1,0.7,0.2424,0.58,0.1045,58,320,378 +14142,2012-08-17,3,1,8,10,0,5,1,1,0.74,0.2424,0.55,0.1642,82,149,231 +14143,2012-08-17,3,1,8,11,0,5,1,1,0.76,0.2424,0.52,0.2836,98,205,303 +14144,2012-08-17,3,1,8,12,0,5,1,1,0.82,0.2424,0.41,0.2239,110,255,365 +14145,2012-08-17,3,1,8,13,0,5,1,1,0.84,0.2424,0.36,0.3881,103,254,357 +14146,2012-08-17,3,1,8,14,0,5,1,1,0.86,0.2424,0.34,0.4179,128,200,328 +14147,2012-08-17,3,1,8,15,0,5,1,1,0.86,0.2424,0.3,0.4627,127,256,383 +14148,2012-08-17,3,1,8,16,0,5,1,2,0.84,0.2424,0.32,0.4478,116,372,488 +14149,2012-08-17,3,1,8,17,0,5,1,1,0.82,0.2424,0.36,0.3284,144,647,791 +14150,2012-08-17,3,1,8,18,0,5,1,2,0.82,0.2424,0.38,0.2537,108,561,669 +14151,2012-08-17,3,1,8,19,0,5,1,2,0.74,0.2424,0.55,0.3881,88,403,491 
+14152,2012-08-17,3,1,8,20,0,5,1,2,0.72,0.2424,0.58,0.2239,97,262,359 +14153,2012-08-17,3,1,8,21,0,5,1,2,0.68,0.2424,0.69,0.2985,57,198,255 +14154,2012-08-17,3,1,8,22,0,5,1,3,0.66,0.2424,0.83,0.194,43,170,213 +14155,2012-08-17,3,1,8,23,0,5,1,3,0.64,0.2424,0.83,0.2239,21,100,121 +14156,2012-08-18,3,1,8,0,0,6,0,2,0.64,0.5758,0.83,0.2239,6,99,105 +14157,2012-08-18,3,1,8,1,0,6,0,3,0.62,0.5455,0.94,0.1642,14,78,92 +14158,2012-08-18,3,1,8,2,0,6,0,3,0.62,0.5606,0.88,0.1642,4,39,43 +14159,2012-08-18,3,1,8,3,0,6,0,1,0.62,0.5606,0.88,0.194,7,23,30 +14160,2012-08-18,3,1,8,4,0,6,0,1,0.62,0.5909,0.73,0.2239,5,8,13 +14161,2012-08-18,3,1,8,5,0,6,0,1,0.6,0.5758,0.78,0.2537,2,7,9 +14162,2012-08-18,3,1,8,6,0,6,0,1,0.6,0.5606,0.83,0.2239,4,23,27 +14163,2012-08-18,3,1,8,7,0,6,0,1,0.6,0.5606,0.83,0.2239,12,52,64 +14164,2012-08-18,3,1,8,8,0,6,0,1,0.64,0.6061,0.69,0.2537,28,161,189 +14165,2012-08-18,3,1,8,9,0,6,0,1,0.66,0.6212,0.65,0.2836,81,211,292 +14166,2012-08-18,3,1,8,10,0,6,0,1,0.7,0.6515,0.54,0.1642,166,314,480 +14167,2012-08-18,3,1,8,11,0,6,0,1,0.72,0.6667,0.51,0,180,356,536 +14168,2012-08-18,3,1,8,12,0,6,0,1,0.76,0.6667,0.37,0.194,266,388,654 +14169,2012-08-18,3,1,8,13,0,6,0,1,0.74,0.6515,0.4,0.194,289,355,644 +14170,2012-08-18,3,1,8,14,0,6,0,1,0.76,0.6667,0.37,0.2836,242,356,598 +14171,2012-08-18,3,1,8,15,0,6,0,1,0.76,0.6667,0.37,0.2239,250,346,596 +14172,2012-08-18,3,1,8,16,0,6,0,1,0.76,0.6667,0.37,0.2537,287,354,641 +14173,2012-08-18,3,1,8,17,0,6,0,1,0.76,0.6667,0.37,0.2239,256,379,635 +14174,2012-08-18,3,1,8,18,0,6,0,1,0.74,0.6515,0.37,0.1642,225,329,554 +14175,2012-08-18,3,1,8,19,0,6,0,1,0.7,0.6364,0.45,0.1642,164,324,488 +14176,2012-08-18,3,1,8,20,0,6,0,1,0.7,0.6364,0.45,0.1045,99,242,341 +14177,2012-08-18,3,1,8,21,0,6,0,1,0.66,0.6212,0.61,0,90,248,338 +14178,2012-08-18,3,1,8,22,0,6,0,1,0.66,0.6212,0.61,0,90,171,261 +14179,2012-08-18,3,1,8,23,0,6,0,1,0.64,0.6061,0.65,0.0896,60,175,235 +14180,2012-08-19,3,1,8,0,0,0,0,1,0.66,0.6212,0.57,0.0896,44,143,187 
+14181,2012-08-19,3,1,8,1,0,0,0,1,0.64,0.6061,0.65,0,29,102,131 +14182,2012-08-19,3,1,8,2,0,0,0,2,0.62,0.6061,0.69,0.0896,16,103,119 +14183,2012-08-19,3,1,8,3,0,0,0,2,0.62,0.6061,0.61,0.1642,21,34,55 +14184,2012-08-19,3,1,8,4,0,0,0,2,0.62,0.6061,0.65,0.1642,4,22,26 +14185,2012-08-19,3,1,8,5,0,0,0,2,0.6,0.5909,0.73,0.1343,3,8,11 +14186,2012-08-19,3,1,8,6,0,0,0,2,0.62,0.6061,0.69,0.1045,5,15,20 +14187,2012-08-19,3,1,8,7,0,0,0,2,0.62,0.6061,0.69,0.1343,12,29,41 +14188,2012-08-19,3,1,8,8,0,0,0,2,0.64,0.6061,0.69,0.1045,34,90,124 +14189,2012-08-19,3,1,8,9,0,0,0,2,0.66,0.6212,0.74,0,86,184,270 +14190,2012-08-19,3,1,8,10,0,0,0,2,0.68,0.6364,0.69,0,138,287,425 +14191,2012-08-19,3,1,8,11,0,0,0,3,0.64,0.5909,0.78,0.1642,83,189,272 +14192,2012-08-19,3,1,8,12,0,0,0,3,0.64,0.6061,0.73,0,112,186,298 +14193,2012-08-19,3,1,8,13,0,0,0,3,0.64,0.5758,0.83,0.1343,50,112,162 +14194,2012-08-19,3,1,8,14,0,0,0,3,0.64,0.6061,0.73,0,41,108,149 +14195,2012-08-19,3,1,8,15,0,0,0,2,0.64,0.6061,0.65,0,89,187,276 +14196,2012-08-19,3,1,8,16,0,0,0,2,0.66,0.6212,0.65,0,97,259,356 +14197,2012-08-19,3,1,8,17,0,0,0,2,0.64,0.5909,0.78,0.1045,76,267,343 +14198,2012-08-19,3,1,8,18,0,0,0,2,0.64,0.5909,0.78,0.1045,86,291,377 +14199,2012-08-19,3,1,8,19,0,0,0,2,0.64,0.6061,0.73,0.1343,72,269,341 +14200,2012-08-19,3,1,8,20,0,0,0,2,0.64,0.6061,0.73,0.194,61,213,274 +14201,2012-08-19,3,1,8,21,0,0,0,3,0.62,0.5909,0.78,0.0896,36,154,190 +14202,2012-08-19,3,1,8,22,0,0,0,2,0.62,0.5909,0.78,0.1642,6,50,56 +14203,2012-08-19,3,1,8,23,0,0,0,2,0.62,0.5909,0.73,0,7,39,46 +14204,2012-08-20,3,1,8,0,0,1,1,2,0.6,0.5606,0.83,0.1045,1,25,26 +14205,2012-08-20,3,1,8,1,0,1,1,1,0.6,0.5606,0.83,0.1642,0,10,10 +14206,2012-08-20,3,1,8,2,0,1,1,1,0.6,0.5606,0.83,0.1642,0,5,5 +14207,2012-08-20,3,1,8,3,0,1,1,1,0.6,0.5455,0.88,0.1045,0,3,3 +14208,2012-08-20,3,1,8,4,0,1,1,2,0.6,0.5606,0.83,0.0896,0,7,7 +14209,2012-08-20,3,1,8,5,0,1,1,2,0.6,0.5606,0.83,0,2,35,37 +14210,2012-08-20,3,1,8,6,0,1,1,2,0.6,0.5758,0.78,0.1343,6,155,161 
+14211,2012-08-20,3,1,8,7,0,1,1,2,0.62,0.5909,0.78,0.1343,15,427,442 +14212,2012-08-20,3,1,8,8,0,1,1,2,0.62,0.5909,0.73,0.1045,37,618,655 +14213,2012-08-20,3,1,8,9,0,1,1,2,0.64,0.6061,0.69,0,55,302,357 +14214,2012-08-20,3,1,8,10,0,1,1,2,0.64,0.6061,0.69,0.0896,69,124,193 +14215,2012-08-20,3,1,8,11,0,1,1,1,0.66,0.6212,0.65,0.1045,90,151,241 +14216,2012-08-20,3,1,8,12,0,1,1,2,0.66,0.6212,0.61,0.1045,66,216,282 +14217,2012-08-20,3,1,8,13,0,1,1,1,0.7,0.6515,0.58,0.1343,97,194,291 +14218,2012-08-20,3,1,8,14,0,1,1,1,0.72,0.6667,0.54,0.1642,87,188,275 +14219,2012-08-20,3,1,8,15,0,1,1,1,0.72,0.6667,0.54,0.2537,102,207,309 +14220,2012-08-20,3,1,8,16,0,1,1,1,0.72,0.6667,0.54,0.2537,103,357,460 +14221,2012-08-20,3,1,8,17,0,1,1,1,0.7,0.6515,0.58,0.2836,83,810,893 +14222,2012-08-20,3,1,8,18,0,1,1,2,0.62,0.5909,0.78,0.2537,89,726,815 +14223,2012-08-20,3,1,8,19,0,1,1,2,0.62,0.5909,0.78,0.194,33,266,299 +14224,2012-08-20,3,1,8,20,0,1,1,2,0.62,0.5758,0.83,0.2836,30,231,261 +14225,2012-08-20,3,1,8,21,0,1,1,1,0.6,0.5606,0.83,0,26,219,245 +14226,2012-08-20,3,1,8,22,0,1,1,2,0.6,0.5606,0.83,0,25,136,161 +14227,2012-08-20,3,1,8,23,0,1,1,2,0.6,0.5606,0.83,0,10,92,102 +14228,2012-08-21,3,1,8,0,0,2,1,2,0.6,0.5606,0.83,0.0896,7,52,59 +14229,2012-08-21,3,1,8,1,0,2,1,2,0.6,0.5606,0.83,0.1343,11,22,33 +14230,2012-08-21,3,1,8,2,0,2,1,2,0.58,0.5455,0.88,0.2836,1,7,8 +14231,2012-08-21,3,1,8,3,0,2,1,1,0.56,0.5303,0.88,0,0,3,3 +14232,2012-08-21,3,1,8,4,0,2,1,1,0.56,0.5303,0.88,0,0,4,4 +14233,2012-08-21,3,1,8,5,0,2,1,1,0.56,0.5303,0.83,0.0896,4,30,34 +14234,2012-08-21,3,1,8,6,0,2,1,1,0.56,0.5303,0.83,0.1045,5,164,169 +14235,2012-08-21,3,1,8,7,0,2,1,1,0.6,0.5758,0.78,0.1045,19,500,519 +14236,2012-08-21,3,1,8,8,0,2,1,1,0.6,0.5758,0.78,0.1343,27,696,723 +14237,2012-08-21,3,1,8,9,0,2,1,1,0.64,0.6061,0.69,0,47,281,328 +14238,2012-08-21,3,1,8,10,0,2,1,1,0.66,0.6212,0.61,0.1045,58,120,178 +14239,2012-08-21,3,1,8,11,0,2,1,1,0.7,0.6515,0.54,0,61,195,256 
+14240,2012-08-21,3,1,8,12,0,2,1,1,0.72,0.6515,0.45,0,79,226,305 +14241,2012-08-21,3,1,8,13,0,2,1,1,0.74,0.6515,0.4,0,76,255,331 +14242,2012-08-21,3,1,8,14,0,2,1,1,0.76,0.6667,0.37,0.0896,110,192,302 +14243,2012-08-21,3,1,8,15,0,2,1,1,0.76,0.6667,0.37,0,76,226,302 +14244,2012-08-21,3,1,8,16,0,2,1,1,0.76,0.6667,0.37,0,109,358,467 +14245,2012-08-21,3,1,8,17,0,2,1,1,0.74,0.6667,0.51,0.2836,92,786,878 +14246,2012-08-21,3,1,8,18,0,2,1,1,0.72,0.6667,0.54,0.2239,93,532,625 +14247,2012-08-21,3,1,8,19,0,2,1,3,0.62,0.5909,0.73,0.1045,56,420,476 +14248,2012-08-21,3,1,8,20,0,2,1,1,0.64,0.6061,0.73,0,62,296,358 +14249,2012-08-21,3,1,8,21,0,2,1,1,0.64,0.6061,0.73,0,41,239,280 +14250,2012-08-21,3,1,8,22,0,2,1,1,0.64,0.5909,0.78,0,24,208,232 +14251,2012-08-21,3,1,8,23,0,2,1,1,0.62,0.5758,0.83,0,23,113,136 +14252,2012-08-22,3,1,8,0,0,3,1,1,0.62,0.5758,0.83,0.1045,9,46,55 +14253,2012-08-22,3,1,8,1,0,3,1,1,0.62,0.5909,0.78,0,1,20,21 +14254,2012-08-22,3,1,8,2,0,3,1,1,0.62,0.5909,0.78,0,7,10,17 +14255,2012-08-22,3,1,8,3,0,3,1,1,0.62,0.5909,0.78,0,0,7,7 +14256,2012-08-22,3,1,8,4,0,3,1,1,0.58,0.5455,0.83,0.1045,1,7,8 +14257,2012-08-22,3,1,8,5,0,3,1,1,0.6,0.5758,0.78,0,2,38,40 +14258,2012-08-22,3,1,8,6,0,3,1,1,0.62,0.5909,0.73,0,12,175,187 +14259,2012-08-22,3,1,8,7,0,3,1,1,0.62,0.5909,0.78,0,16,537,553 +14260,2012-08-22,3,1,8,8,0,3,1,1,0.64,0.6061,0.73,0.0896,37,703,740 +14261,2012-08-22,3,1,8,9,0,3,1,1,0.68,0.6364,0.65,0,41,335,376 +14262,2012-08-22,3,1,8,10,0,3,1,1,0.7,0.6515,0.61,0.0896,66,159,225 +14263,2012-08-22,3,1,8,11,0,3,1,1,0.72,0.6667,0.58,0.0896,71,196,267 +14264,2012-08-22,3,1,8,12,0,3,1,1,0.74,0.6667,0.51,0,59,253,312 +14265,2012-08-22,3,1,8,13,0,3,1,1,0.76,0.6818,0.48,0.1045,74,242,316 +14266,2012-08-22,3,1,8,14,0,3,1,1,0.76,0.6818,0.45,0.1045,82,184,266 +14267,2012-08-22,3,1,8,15,0,3,1,1,0.76,0.6818,0.48,0.1343,94,222,316 +14268,2012-08-22,3,1,8,16,0,3,1,3,0.74,0.6818,0.55,0.2239,102,358,460 +14269,2012-08-22,3,1,8,17,0,3,1,3,0.74,0.6818,0.55,0.2239,72,711,783 
+14270,2012-08-22,3,1,8,18,0,3,1,3,0.66,0.6212,0.61,0.2239,91,592,683 +14271,2012-08-22,3,1,8,19,0,3,1,2,0.66,0.6212,0.74,0,56,524,580 +14272,2012-08-22,3,1,8,20,0,3,1,1,0.64,0.6061,0.73,0.194,59,332,391 +14273,2012-08-22,3,1,8,21,0,3,1,1,0.64,0.6061,0.73,0,60,291,351 +14274,2012-08-22,3,1,8,22,0,3,1,1,0.64,0.6061,0.73,0,55,226,281 +14275,2012-08-22,3,1,8,23,0,3,1,1,0.64,0.5758,0.83,0,27,113,140 +14276,2012-08-23,3,1,8,0,0,4,1,1,0.64,0.5909,0.78,0,9,51,60 +14277,2012-08-23,3,1,8,1,0,4,1,1,0.62,0.5758,0.83,0,10,15,25 +14278,2012-08-23,3,1,8,2,0,4,1,1,0.62,0.5758,0.83,0,9,11,20 +14279,2012-08-23,3,1,8,3,0,4,1,1,0.62,0.5758,0.83,0.0896,3,6,9 +14280,2012-08-23,3,1,8,4,0,4,1,1,0.62,0.5758,0.83,0,0,6,6 +14281,2012-08-23,3,1,8,5,0,4,1,1,0.62,0.5758,0.83,0,1,36,37 +14282,2012-08-23,3,1,8,6,0,4,1,2,0.6,0.5455,0.88,0.0896,14,178,192 +14283,2012-08-23,3,1,8,7,0,4,1,2,0.62,0.5758,0.83,0.0896,18,463,481 +14284,2012-08-23,3,1,8,8,0,4,1,2,0.66,0.6212,0.74,0.0896,45,662,707 +14285,2012-08-23,3,1,8,9,0,4,1,2,0.7,0.6515,0.7,0,58,354,412 +14286,2012-08-23,3,1,8,10,0,4,1,1,0.74,0.6818,0.58,0.0896,76,157,233 +14287,2012-08-23,3,1,8,11,0,4,1,1,0.76,0.6818,0.48,0.0896,86,192,278 +14288,2012-08-23,3,1,8,12,0,4,1,1,0.78,0.6818,0.38,0,97,235,332 +14289,2012-08-23,3,1,8,13,0,4,1,1,0.82,0.7273,0.38,0.0896,93,253,346 +14290,2012-08-23,3,1,8,14,0,4,1,1,0.78,0.697,0.46,0.2239,114,191,305 +14291,2012-08-23,3,1,8,15,0,4,1,1,0.78,0.697,0.46,0.1343,104,201,305 +14292,2012-08-23,3,1,8,16,0,4,1,1,0.76,0.6818,0.48,0.1642,121,344,465 +14293,2012-08-23,3,1,8,17,0,4,1,1,0.76,0.6818,0.45,0.2239,111,709,820 +14294,2012-08-23,3,1,8,18,0,4,1,1,0.74,0.6667,0.48,0.2239,130,811,941 +14295,2012-08-23,3,1,8,19,0,4,1,1,0.72,0.6667,0.51,0.0896,96,537,633 +14296,2012-08-23,3,1,8,20,0,4,1,1,0.7,0.6515,0.61,0.1045,70,404,474 +14297,2012-08-23,3,1,8,21,0,4,1,1,0.7,0.6515,0.61,0.1343,53,276,329 +14298,2012-08-23,3,1,8,22,0,4,1,2,0.68,0.6364,0.61,0.1045,21,177,198 
+14299,2012-08-23,3,1,8,23,0,4,1,2,0.66,0.6212,0.69,0,24,133,157 +14300,2012-08-24,3,1,8,0,0,5,1,2,0.66,0.6212,0.69,0,27,84,111 +14301,2012-08-24,3,1,8,1,0,5,1,2,0.64,0.6061,0.73,0,5,37,42 +14302,2012-08-24,3,1,8,2,0,5,1,1,0.66,0.6212,0.69,0,1,15,16 +14303,2012-08-24,3,1,8,3,0,5,1,1,0.64,0.6061,0.73,0,2,6,8 +14304,2012-08-24,3,1,8,4,0,5,1,2,0.64,0.6061,0.73,0,2,6,8 +14305,2012-08-24,3,1,8,5,0,5,1,1,0.64,0.5909,0.78,0,1,37,38 +14306,2012-08-24,3,1,8,6,0,5,1,2,0.62,0.5758,0.83,0,9,142,151 +14307,2012-08-24,3,1,8,7,0,5,1,2,0.64,0.5909,0.78,0,13,412,425 +14308,2012-08-24,3,1,8,8,0,5,1,2,0.66,0.6212,0.74,0,41,703,744 +14309,2012-08-24,3,1,8,9,0,5,1,2,0.72,0.6667,0.58,0,58,331,389 +14310,2012-08-24,3,1,8,10,0,5,1,2,0.76,0.6818,0.48,0,78,184,262 +14311,2012-08-24,3,1,8,11,0,5,1,2,0.76,0.6818,0.48,0.1045,69,212,281 +14312,2012-08-24,3,1,8,12,0,5,1,2,0.76,0.6818,0.45,0.0896,95,276,371 +14313,2012-08-24,3,1,8,13,0,5,1,2,0.8,0.7121,0.41,0.1045,79,272,351 +14314,2012-08-24,3,1,8,14,0,5,1,1,0.78,0.697,0.43,0.1045,109,229,338 +14315,2012-08-24,3,1,8,15,0,5,1,2,0.76,0.6818,0.48,0.1343,99,251,350 +14316,2012-08-24,3,1,8,16,0,5,1,2,0.76,0.6818,0.48,0.1343,101,414,515 +14317,2012-08-24,3,1,8,17,0,5,1,2,0.74,0.6667,0.51,0.2239,117,695,812 +14318,2012-08-24,3,1,8,18,0,5,1,2,0.72,0.6818,0.62,0.194,109,627,736 +14319,2012-08-24,3,1,8,19,0,5,1,2,0.72,0.6667,0.58,0.1642,106,430,536 +14320,2012-08-24,3,1,8,20,0,5,1,2,0.7,0.6515,0.65,0.2239,66,297,363 +14321,2012-08-24,3,1,8,21,0,5,1,2,0.7,0.6515,0.61,0.1642,58,248,306 +14322,2012-08-24,3,1,8,22,0,5,1,2,0.7,0.6515,0.61,0,42,209,251 +14323,2012-08-24,3,1,8,23,0,5,1,2,0.68,0.6364,0.69,0.0896,38,140,178 +14324,2012-08-25,3,1,8,0,0,6,0,2,0.7,0.6515,0.61,0,21,114,135 +14325,2012-08-25,3,1,8,1,0,6,0,2,0.68,0.6364,0.69,0,15,100,115 +14326,2012-08-25,3,1,8,2,0,6,0,2,0.66,0.6212,0.74,0.0896,18,61,79 +14327,2012-08-25,3,1,8,3,0,6,0,1,0.66,0.6212,0.69,0.0896,7,31,38 +14328,2012-08-25,3,1,8,4,0,6,0,2,0.66,0.6212,0.69,0.1045,3,9,12 
+14329,2012-08-25,3,1,8,5,0,6,0,2,0.64,0.6061,0.73,0.1642,4,10,14 +14330,2012-08-25,3,1,8,6,0,6,0,2,0.64,0.5909,0.78,0.1642,5,25,30 +14331,2012-08-25,3,1,8,7,0,6,0,2,0.64,0.5909,0.78,0.194,14,88,102 +14332,2012-08-25,3,1,8,8,0,6,0,2,0.66,0.6212,0.74,0.2537,29,133,162 +14333,2012-08-25,3,1,8,9,0,6,0,2,0.68,0.6364,0.69,0.2836,60,228,288 +14334,2012-08-25,3,1,8,10,0,6,0,2,0.7,0.6515,0.65,0.2985,148,294,442 +14335,2012-08-25,3,1,8,11,0,6,0,2,0.72,0.6818,0.62,0.2836,143,314,457 +14336,2012-08-25,3,1,8,12,0,6,0,3,0.72,0.6667,0.54,0.3881,217,367,584 +14337,2012-08-25,3,1,8,13,0,6,0,2,0.72,0.6667,0.51,0.3582,186,331,517 +14338,2012-08-25,3,1,8,14,0,6,0,3,0.66,0.6212,0.65,0.2537,150,332,482 +14339,2012-08-25,3,1,8,15,0,6,0,2,0.64,0.6061,0.73,0.3284,79,154,233 +14340,2012-08-25,3,1,8,16,0,6,0,2,0.66,0.6212,0.74,0.2985,174,260,434 +14341,2012-08-25,3,1,8,17,0,6,0,2,0.66,0.6212,0.74,0.3284,132,271,403 +14342,2012-08-25,3,1,8,18,0,6,0,2,0.66,0.6212,0.69,0.4627,122,261,383 +14343,2012-08-25,3,1,8,19,0,6,0,2,0.64,0.5909,0.78,0.2985,97,257,354 +14344,2012-08-25,3,1,8,20,0,6,0,3,0.62,0.5758,0.83,0.2537,65,176,241 +14345,2012-08-25,3,1,8,21,0,6,0,1,0.62,0.5758,0.83,0.3582,60,148,208 +14346,2012-08-25,3,1,8,22,0,6,0,1,0.62,0.5758,0.83,0.3582,46,136,182 +14347,2012-08-25,3,1,8,23,0,6,0,2,0.62,0.5758,0.83,0.2537,34,124,158 +14348,2012-08-26,3,1,8,0,0,0,0,1,0.64,0.5758,0.89,0.2537,23,111,134 +14349,2012-08-26,3,1,8,1,0,0,0,1,0.64,0.5758,0.89,0.194,18,116,134 +14350,2012-08-26,3,1,8,2,0,0,0,2,0.64,0.5758,0.89,0.2239,12,104,116 +14351,2012-08-26,3,1,8,3,0,0,0,2,0.64,0.5758,0.89,0.2537,11,42,53 +14352,2012-08-26,3,1,8,4,0,0,0,2,0.64,0.5758,0.89,0.194,5,11,16 +14353,2012-08-26,3,1,8,5,0,0,0,1,0.64,0.5758,0.89,0.194,3,13,16 +14354,2012-08-26,3,1,8,6,0,0,0,1,0.62,0.5606,0.88,0.2985,5,16,21 +14355,2012-08-26,3,1,8,7,0,0,0,1,0.64,0.5758,0.89,0.194,11,23,34 +14356,2012-08-26,3,1,8,8,0,0,0,1,0.68,0.6364,0.79,0.2836,21,112,133 +14357,2012-08-26,3,1,8,9,0,0,0,1,0.68,0.6364,0.79,0.2537,67,196,263 
+14358,2012-08-26,3,1,8,10,0,0,0,1,0.74,0.697,0.66,0.3881,127,258,385 +14359,2012-08-26,3,1,8,11,0,0,0,1,0.74,0.697,0.66,0.3582,150,332,482 +14360,2012-08-26,3,1,8,12,0,0,0,3,0.72,0.697,0.74,0.3582,225,401,626 +14361,2012-08-26,3,1,8,13,0,0,0,3,0.66,0.6061,0.78,0.4179,166,375,541 +14362,2012-08-26,3,1,8,14,0,0,0,1,0.64,0.5758,0.89,0.1045,125,252,377 +14363,2012-08-26,3,1,8,15,0,0,0,3,0.64,0.5758,0.89,0.2836,132,259,391 +14364,2012-08-26,3,1,8,16,0,0,0,3,0.64,0.5758,0.89,0.2836,57,116,173 +14365,2012-08-26,3,1,8,17,0,0,0,1,0.64,0.5758,0.89,0.2239,61,172,233 +14366,2012-08-26,3,1,8,18,0,0,0,2,0.64,0.5758,0.89,0.0896,77,187,264 +14367,2012-08-26,3,1,8,19,0,0,0,2,0.64,0.5758,0.89,0.1642,80,212,292 +14368,2012-08-26,3,1,8,20,0,0,0,1,0.64,0.5758,0.89,0.1642,40,171,211 +14369,2012-08-26,3,1,8,21,0,0,0,1,0.64,0.5758,0.83,0.1343,26,140,166 +14370,2012-08-26,3,1,8,22,0,0,0,1,0.62,0.5606,0.88,0.0896,20,94,114 +14371,2012-08-26,3,1,8,23,0,0,0,1,0.62,0.5758,0.83,0.0896,21,59,80 +14372,2012-08-27,3,1,8,0,0,1,1,1,0.62,0.5758,0.83,0,4,28,32 +14373,2012-08-27,3,1,8,1,0,1,1,1,0.62,0.5606,0.88,0.0896,3,4,7 +14374,2012-08-27,3,1,8,2,0,1,1,1,0.62,0.5606,0.88,0.1343,0,8,8 +14375,2012-08-27,3,1,8,3,0,1,1,1,0.62,0.5606,0.88,0.0896,1,3,4 +14376,2012-08-27,3,1,8,4,0,1,1,1,0.62,0.5758,0.83,0,1,11,12 +14377,2012-08-27,3,1,8,5,0,1,1,1,0.62,0.5606,0.88,0,4,32,36 +14378,2012-08-27,3,1,8,6,0,1,1,1,0.62,0.5606,0.88,0,4,141,145 +14379,2012-08-27,3,1,8,7,0,1,1,1,0.66,0.6061,0.83,0,15,418,433 +14380,2012-08-27,3,1,8,8,0,1,1,1,0.66,0.6061,0.83,0.0896,29,670,699 +14381,2012-08-27,3,1,8,9,0,1,1,1,0.66,0.6061,0.83,0.1045,40,302,342 +14382,2012-08-27,3,1,8,10,0,1,1,1,0.72,0.697,0.74,0.1343,63,143,206 +14383,2012-08-27,3,1,8,11,0,1,1,1,0.74,0.697,0.66,0.194,47,182,229 +14384,2012-08-27,3,1,8,12,0,1,1,1,0.76,0.7121,0.62,0,69,214,283 +14385,2012-08-27,3,1,8,13,0,1,1,1,0.76,0.7121,0.57,0.1642,77,211,288 +14386,2012-08-27,3,1,8,14,0,1,1,1,0.8,0.7424,0.49,0.0896,66,180,246 
+14387,2012-08-27,3,1,8,15,0,1,1,1,0.8,0.7576,0.55,0.2985,77,205,282 +14388,2012-08-27,3,1,8,16,0,1,1,1,0.8,0.7576,0.55,0.2537,77,367,444 +14389,2012-08-27,3,1,8,17,0,1,1,1,0.8,0.7273,0.46,0.2239,101,744,845 +14390,2012-08-27,3,1,8,18,0,1,1,1,0.78,0.7121,0.52,0.2537,83,751,834 +14391,2012-08-27,3,1,8,19,0,1,1,1,0.74,0.697,0.7,0.2239,74,499,573 +14392,2012-08-27,3,1,8,20,0,1,1,1,0.74,0.697,0.7,0.194,50,366,416 +14393,2012-08-27,3,1,8,21,0,1,1,1,0.72,0.697,0.74,0.194,39,239,278 +14394,2012-08-27,3,1,8,22,0,1,1,1,0.7,0.6667,0.84,0.194,30,131,161 +14395,2012-08-27,3,1,8,23,0,1,1,1,0.7,0.6667,0.84,0.1642,35,79,114 +14396,2012-08-28,3,1,8,0,0,2,1,1,0.7,0.6667,0.84,0.2537,9,29,38 +14397,2012-08-28,3,1,8,1,0,2,1,1,0.68,0.6364,0.89,0.194,4,17,21 +14398,2012-08-28,3,1,8,2,0,2,1,1,0.68,0.6364,0.83,0.2537,2,13,15 +14399,2012-08-28,3,1,8,3,0,2,1,1,0.66,0.5909,0.89,0.1343,0,4,4 +14400,2012-08-28,3,1,8,4,0,2,1,1,0.66,0.6061,0.83,0.1343,2,6,8 +14401,2012-08-28,3,1,8,5,0,2,1,2,0.66,0.6061,0.83,0.0896,3,39,42 +14402,2012-08-28,3,1,8,6,0,2,1,2,0.66,0.5909,0.89,0.1045,3,144,147 +14403,2012-08-28,3,1,8,7,0,2,1,2,0.66,0.6061,0.83,0.1045,14,255,269 +14404,2012-08-28,3,1,8,8,0,2,1,2,0.66,0.6212,0.85,0.1045,25,668,693 +14405,2012-08-28,3,1,8,9,0,2,1,2,0.7,0.6667,0.79,0.1642,44,351,395 +14406,2012-08-28,3,1,8,10,0,2,1,1,0.74,0.6818,0.62,0.2537,39,165,204 +14407,2012-08-28,3,1,8,11,0,2,1,1,0.78,0.7273,0.55,0.2537,47,171,218 +14408,2012-08-28,3,1,8,12,0,2,1,1,0.8,0.7424,0.49,0.2836,56,241,297 +14409,2012-08-28,3,1,8,13,0,2,1,1,0.82,0.7424,0.43,0.2985,56,232,288 +14410,2012-08-28,3,1,8,14,0,2,1,1,0.84,0.7424,0.38,0.3284,72,179,251 +14411,2012-08-28,3,1,8,15,0,2,1,1,0.82,0.7273,0.34,0.2985,59,258,317 +14412,2012-08-28,3,1,8,16,0,2,1,1,0.82,0.7121,0.33,0.3284,84,396,480 +14413,2012-08-28,3,1,8,17,0,2,1,1,0.8,0.697,0.35,0.3881,90,774,864 +14414,2012-08-28,3,1,8,18,0,2,1,1,0.78,0.697,0.4,0.2537,82,736,818 +14415,2012-08-28,3,1,8,19,0,2,1,1,0.74,0.6667,0.45,0.1642,82,515,597 
+14416,2012-08-28,3,1,8,20,0,2,1,1,0.72,0.6667,0.48,0.1045,87,348,435 +14417,2012-08-28,3,1,8,21,0,2,1,1,0.72,0.6667,0.48,0.0896,38,276,314 +14418,2012-08-28,3,1,8,22,0,2,1,1,0.7,0.6515,0.54,0,20,196,216 +14419,2012-08-28,3,1,8,23,0,2,1,1,0.68,0.6364,0.57,0,17,92,109 +14420,2012-08-29,3,1,8,0,0,3,1,1,0.66,0.6212,0.69,0,8,39,47 +14421,2012-08-29,3,1,8,1,0,3,1,1,0.64,0.6061,0.69,0.1343,2,12,14 +14422,2012-08-29,3,1,8,2,0,3,1,1,0.62,0.5909,0.73,0.1343,2,4,6 +14423,2012-08-29,3,1,8,3,0,3,1,1,0.62,0.5909,0.73,0.1045,0,9,9 +14424,2012-08-29,3,1,8,4,0,3,1,1,0.6,0.5909,0.73,0.2239,2,8,10 +14425,2012-08-29,3,1,8,5,0,3,1,1,0.6,0.5909,0.73,0.194,3,37,40 +14426,2012-08-29,3,1,8,6,0,3,1,1,0.6,0.5909,0.73,0.194,6,175,181 +14427,2012-08-29,3,1,8,7,0,3,1,1,0.64,0.6061,0.65,0.0896,23,531,554 +14428,2012-08-29,3,1,8,8,0,3,1,1,0.66,0.6212,0.61,0.1642,28,780,808 +14429,2012-08-29,3,1,8,9,0,3,1,1,0.7,0.6515,0.51,0.194,40,308,348 +14430,2012-08-29,3,1,8,10,0,3,1,1,0.72,0.6667,0.48,0.1642,34,144,178 +14431,2012-08-29,3,1,8,11,0,3,1,1,0.72,0.6515,0.45,0,67,186,253 +14432,2012-08-29,3,1,8,12,0,3,1,1,0.76,0.6818,0.4,0,79,281,360 +14433,2012-08-29,3,1,8,13,0,3,1,1,0.74,0.6667,0.42,0,68,232,300 +14434,2012-08-29,3,1,8,14,0,3,1,1,0.76,0.6818,0.4,0.0896,87,214,301 +14435,2012-08-29,3,1,8,15,0,3,1,1,0.76,0.6818,0.4,0.194,96,242,338 +14436,2012-08-29,3,1,8,16,0,3,1,1,0.76,0.6667,0.37,0.1343,104,389,493 +14437,2012-08-29,3,1,8,17,0,3,1,1,0.78,0.6818,0.35,0.2239,108,762,870 +14438,2012-08-29,3,1,8,18,0,3,1,1,0.74,0.6667,0.42,0.1642,119,693,812 +14439,2012-08-29,3,1,8,19,0,3,1,1,0.7,0.6515,0.54,0.194,120,523,643 +14440,2012-08-29,3,1,8,20,0,3,1,1,0.68,0.6364,0.57,0,49,378,427 +14441,2012-08-29,3,1,8,21,0,3,1,1,0.66,0.6212,0.54,0.1045,48,265,313 +14442,2012-08-29,3,1,8,22,0,3,1,1,0.66,0.6212,0.54,0,50,185,235 +14443,2012-08-29,3,1,8,23,0,3,1,1,0.66,0.6212,0.57,0,34,123,157 +14444,2012-08-30,3,1,8,0,0,4,1,1,0.64,0.6061,0.69,0,14,51,65 +14445,2012-08-30,3,1,8,1,0,4,1,1,0.64,0.6061,0.65,0,6,17,23 
+14446,2012-08-30,3,1,8,2,0,4,1,1,0.62,0.5909,0.73,0,3,14,17 +14447,2012-08-30,3,1,8,3,0,4,1,1,0.62,0.5909,0.73,0,0,5,5 +14448,2012-08-30,3,1,8,4,0,4,1,1,0.62,0.5909,0.73,0,4,4,8 +14449,2012-08-30,3,1,8,5,0,4,1,1,0.62,0.5909,0.73,0,2,33,35 +14450,2012-08-30,3,1,8,6,0,4,1,1,0.6,0.5606,0.83,0.1642,8,155,163 +14451,2012-08-30,3,1,8,7,0,4,1,1,0.64,0.6061,0.73,0,31,501,532 +14452,2012-08-30,3,1,8,8,0,4,1,1,0.66,0.6212,0.65,0,53,701,754 +14453,2012-08-30,3,1,8,9,0,4,1,1,0.7,0.6515,0.58,0,57,339,396 +14454,2012-08-30,3,1,8,10,0,4,1,1,0.72,0.6667,0.54,0,73,161,234 +14455,2012-08-30,3,1,8,11,0,4,1,1,0.76,0.6818,0.4,0.0896,85,191,276 +14456,2012-08-30,3,1,8,12,0,4,1,1,0.76,0.6818,0.45,0.1343,73,221,294 +14457,2012-08-30,3,1,8,13,0,4,1,1,0.76,0.6818,0.48,0.1045,76,214,290 +14458,2012-08-30,3,1,8,14,0,4,1,1,0.76,0.6818,0.48,0.1343,74,199,273 +14459,2012-08-30,3,1,8,15,0,4,1,1,0.8,0.7273,0.46,0.1642,99,239,338 +14460,2012-08-30,3,1,8,16,0,4,1,1,0.8,0.7273,0.43,0.1642,77,405,482 +14461,2012-08-30,3,1,8,17,0,4,1,1,0.82,0.7273,0.34,0.2239,93,751,844 +14462,2012-08-30,3,1,8,18,0,4,1,1,0.8,0.7121,0.38,0.1343,109,744,853 +14463,2012-08-30,3,1,8,19,0,4,1,1,0.76,0.6818,0.48,0.1343,78,532,610 +14464,2012-08-30,3,1,8,20,0,4,1,1,0.74,0.6818,0.58,0.1642,68,403,471 +14465,2012-08-30,3,1,8,21,0,4,1,1,0.72,0.6818,0.66,0.1343,32,275,307 +14466,2012-08-30,3,1,8,22,0,4,1,1,0.7,0.6515,0.7,0,38,253,291 +14467,2012-08-30,3,1,8,23,0,4,1,1,0.7,0.6667,0.74,0.1045,19,133,152 +14468,2012-08-31,3,1,8,0,0,5,1,1,0.68,0.6364,0.83,0.1642,13,69,82 +14469,2012-08-31,3,1,8,1,0,5,1,1,0.68,0.6364,0.83,0.1642,8,24,32 +14470,2012-08-31,3,1,8,2,0,5,1,1,0.66,0.5909,0.89,0.1343,9,11,20 +14471,2012-08-31,3,1,8,3,0,5,1,1,0.66,0.5909,0.89,0.1045,3,2,5 +14472,2012-08-31,3,1,8,4,0,5,1,1,0.64,0.5758,0.89,0.1343,0,7,7 +14473,2012-08-31,3,1,8,5,0,5,1,1,0.64,0.5758,0.89,0.1642,2,27,29 +14474,2012-08-31,3,1,8,6,0,5,1,2,0.64,0.5758,0.89,0.1642,7,101,108 +14475,2012-08-31,3,1,8,7,0,5,1,2,0.64,0.5758,0.89,0.0896,21,400,421 
+14476,2012-08-31,3,1,8,8,0,5,1,2,0.7,0.6667,0.74,0,36,654,690 +14477,2012-08-31,3,1,8,9,0,5,1,1,0.74,0.697,0.7,0,61,337,398 +14478,2012-08-31,3,1,8,10,0,5,1,1,0.76,0.7121,0.62,0.0896,59,197,256 +14479,2012-08-31,3,1,8,11,0,5,1,1,0.86,0.7424,0.28,0.2985,88,221,309 +14480,2012-08-31,3,1,8,12,0,5,1,1,0.86,0.7424,0.26,0.2537,120,290,410 +14481,2012-08-31,3,1,8,13,0,5,1,1,0.88,0.7576,0.27,0.3284,97,271,368 +14482,2012-08-31,3,1,8,14,0,5,1,1,0.9,0.7727,0.25,0.2239,112,324,436 +14483,2012-08-31,3,1,8,15,0,5,1,1,0.9,0.7879,0.27,0.2985,109,417,526 +14484,2012-08-31,3,1,8,16,0,5,1,1,0.9,0.7879,0.27,0.2537,111,417,528 +14485,2012-08-31,3,1,8,17,0,5,1,1,0.88,0.7727,0.32,0.1642,129,488,617 +14486,2012-08-31,3,1,8,18,0,5,1,1,0.86,0.7576,0.36,0.2537,87,459,546 +14487,2012-08-31,3,1,8,19,0,5,1,1,0.8,0.7424,0.49,0.1343,91,361,452 +14488,2012-08-31,3,1,8,20,0,5,1,1,0.8,0.7424,0.49,0.1343,102,254,356 +14489,2012-08-31,3,1,8,21,0,5,1,1,0.76,0.7121,0.58,0.194,71,232,303 +14490,2012-08-31,3,1,8,22,0,5,1,1,0.76,0.7121,0.58,0.194,65,212,277 +14491,2012-08-31,3,1,8,23,0,5,1,1,0.74,0.6818,0.62,0.1045,32,142,174 +14492,2012-09-01,3,1,9,0,0,6,0,1,0.74,0.6818,0.62,0.1045,22,146,168 +14493,2012-09-01,3,1,9,1,0,6,0,1,0.72,0.697,0.74,0.1343,11,68,79 +14494,2012-09-01,3,1,9,2,0,6,0,1,0.7,0.6515,0.7,0.1642,8,61,69 +14495,2012-09-01,3,1,9,3,0,6,0,1,0.7,0.6515,0.7,0.1045,6,29,35 +14496,2012-09-01,3,1,9,4,0,6,0,1,0.7,0.6515,0.7,0,3,9,12 +14497,2012-09-01,3,1,9,5,0,6,0,1,0.68,0.6364,0.79,0,2,20,22 +14498,2012-09-01,3,1,9,6,0,6,0,1,0.68,0.6364,0.79,0,13,23,36 +14499,2012-09-01,3,1,9,7,0,6,0,2,0.74,0.6818,0.58,0.1343,8,58,66 +14500,2012-09-01,3,1,9,8,0,6,0,2,0.76,0.7121,0.58,0.2537,28,134,162 +14501,2012-09-01,3,1,9,9,0,6,0,2,0.76,0.7121,0.58,0.194,62,175,237 +14502,2012-09-01,3,1,9,10,0,6,0,2,0.78,0.7273,0.55,0.194,128,289,417 +14503,2012-09-01,3,1,9,11,0,6,0,2,0.8,0.7576,0.55,0.2239,219,273,492 +14504,2012-09-01,3,1,9,12,0,6,0,2,0.8,0.7576,0.55,0.1642,187,284,471 
+14505,2012-09-01,3,1,9,13,0,6,0,2,0.82,0.7727,0.52,0.0896,237,267,504 +14506,2012-09-01,3,1,9,14,0,6,0,2,0.86,0.7879,0.44,0.1642,240,274,514 +14507,2012-09-01,3,1,9,15,0,6,0,2,0.86,0.7879,0.44,0.2836,257,306,563 +14508,2012-09-01,3,1,9,16,0,6,0,2,0.84,0.7879,0.49,0.1642,209,253,462 +14509,2012-09-01,3,1,9,17,0,6,0,2,0.82,0.7727,0.52,0,205,258,463 +14510,2012-09-01,3,1,9,18,0,6,0,2,0.82,0.7727,0.52,0,184,258,442 +14511,2012-09-01,3,1,9,19,0,6,0,3,0.78,0.7424,0.62,0,157,235,392 +14512,2012-09-01,3,1,9,20,0,6,0,3,0.76,0.7273,0.67,0.2537,67,140,207 +14513,2012-09-01,3,1,9,21,0,6,0,3,0.64,0.5758,0.89,0,21,83,104 +14514,2012-09-01,3,1,9,22,0,6,0,2,0.66,0.5909,0.89,0,21,57,78 +14515,2012-09-01,3,1,9,23,0,6,0,1,0.66,0.5909,0.89,0.0896,57,88,145 +14516,2012-09-02,3,1,9,0,0,0,0,2,0.66,0.5909,0.89,0.0896,27,72,99 +14517,2012-09-02,3,1,9,1,0,0,0,1,0.66,0.5909,0.89,0,15,58,73 +14518,2012-09-02,3,1,9,2,0,0,0,1,0.66,0.5909,0.89,0,28,56,84 +14519,2012-09-02,3,1,9,3,0,0,0,1,0.66,0.5909,0.89,0.0896,9,33,42 +14520,2012-09-02,3,1,9,4,0,0,0,1,0.66,0.5909,0.89,0,7,15,22 +14521,2012-09-02,3,1,9,5,0,0,0,1,0.66,0.5909,0.89,0,2,10,12 +14522,2012-09-02,3,1,9,6,0,0,0,2,0.66,0.5909,0.89,0,2,13,15 +14523,2012-09-02,3,1,9,7,0,0,0,1,0.66,0.5909,0.94,0.0896,13,29,42 +14524,2012-09-02,3,1,9,8,0,0,0,1,0.68,0.6364,0.89,0.0896,30,99,129 +14525,2012-09-02,3,1,9,9,0,0,0,2,0.7,0.6667,0.84,0,88,144,232 +14526,2012-09-02,3,1,9,10,0,0,0,2,0.72,0.697,0.79,0,158,230,388 +14527,2012-09-02,3,1,9,11,0,0,0,2,0.74,0.697,0.7,0,226,263,489 +14528,2012-09-02,3,1,9,12,0,0,0,3,0.74,0.697,0.7,0,248,297,545 +14529,2012-09-02,3,1,9,13,0,0,0,3,0.72,0.697,0.74,0.0896,263,230,493 +14530,2012-09-02,3,1,9,14,0,0,0,3,0.72,0.697,0.79,0.0896,249,208,457 +14531,2012-09-02,3,1,9,15,0,0,0,3,0.74,0.7121,0.74,0.1343,179,176,355 +14532,2012-09-02,3,1,9,16,0,0,0,1,0.74,0.697,0.7,0,222,252,474 +14533,2012-09-02,3,1,9,17,0,0,0,1,0.76,0.7273,0.66,0.0896,219,284,503 +14534,2012-09-02,3,1,9,18,0,0,0,1,0.74,0.7121,0.74,0,227,273,500 
+14535,2012-09-02,3,1,9,19,0,0,0,2,0.7,0.6667,0.84,0.2239,140,152,292 +14536,2012-09-02,3,1,9,20,0,0,0,2,0.7,0.6667,0.84,0.1642,63,41,104 +14537,2012-09-02,3,1,9,21,0,0,0,1,0.7,0.6515,0.7,0.1642,79,92,171 +14538,2012-09-02,3,1,9,22,0,0,0,3,0.68,0.6364,0.83,0.1343,66,100,166 +14539,2012-09-02,3,1,9,23,0,0,0,3,0.66,0.5909,0.89,0.0896,53,70,123 +14540,2012-09-03,3,1,9,0,1,1,0,2,0.66,0.5909,0.94,0,42,62,104 +14541,2012-09-03,3,1,9,1,1,1,0,2,0.68,0.6364,0.89,0.1642,18,37,55 +14542,2012-09-03,3,1,9,2,1,1,0,1,0.66,0.5909,0.89,0.1045,10,30,40 +14543,2012-09-03,3,1,9,3,1,1,0,1,0.66,0.5909,0.89,0.1343,6,22,28 +14544,2012-09-03,3,1,9,4,1,1,0,1,0.66,0.5909,0.89,0.0896,6,5,11 +14545,2012-09-03,3,1,9,5,1,1,0,1,0.66,0.5909,0.89,0,1,3,4 +14546,2012-09-03,3,1,9,6,1,1,0,1,0.66,0.5909,0.94,0,3,12,15 +14547,2012-09-03,3,1,9,7,1,1,0,1,0.68,0.6364,0.85,0,6,37,43 +14548,2012-09-03,3,1,9,8,1,1,0,2,0.7,0.6667,0.84,0.1045,36,91,127 +14549,2012-09-03,3,1,9,9,1,1,0,2,0.7,0.6667,0.84,0.1343,71,158,229 +14550,2012-09-03,3,1,9,10,1,1,0,2,0.74,0.7121,0.74,0.2239,135,224,359 +14551,2012-09-03,3,1,9,11,1,1,0,2,0.74,0.697,0.7,0.2836,187,272,459 +14552,2012-09-03,3,1,9,12,1,1,0,2,0.76,0.7121,0.62,0.2985,191,352,543 +14553,2012-09-03,3,1,9,13,1,1,0,2,0.76,0.7273,0.66,0.2537,209,357,566 +14554,2012-09-03,3,1,9,14,1,1,0,1,0.74,0.697,0.7,0.2985,163,349,512 +14555,2012-09-03,3,1,9,15,1,1,0,1,0.74,0.697,0.7,0.2239,168,281,449 +14556,2012-09-03,3,1,9,16,1,1,0,1,0.74,0.697,0.7,0.2239,165,335,500 +14557,2012-09-03,3,1,9,17,1,1,0,1,0.74,0.697,0.7,0.1343,181,317,498 +14558,2012-09-03,3,1,9,18,1,1,0,1,0.74,0.697,0.66,0.1642,145,337,482 +14559,2012-09-03,3,1,9,19,1,1,0,1,0.72,0.697,0.74,0.194,73,301,374 +14560,2012-09-03,3,1,9,20,1,1,0,1,0.72,0.6818,0.76,0.1045,78,183,261 +14561,2012-09-03,3,1,9,21,1,1,0,1,0.72,0.6818,0.76,0.1045,42,142,184 +14562,2012-09-03,3,1,9,22,1,1,0,2,0.7,0.6667,0.84,0.1343,17,104,121 +14563,2012-09-03,3,1,9,23,1,1,0,2,0.7,0.6667,0.84,0.2537,12,58,70 
+14564,2012-09-04,3,1,9,0,0,2,1,2,0.7,0.6667,0.84,0.2537,10,19,29 +14565,2012-09-04,3,1,9,1,0,2,1,1,0.68,0.6364,0.82,0.1343,15,9,24 +14566,2012-09-04,3,1,9,2,0,2,1,1,0.68,0.6364,0.82,0.1343,1,4,5 +14567,2012-09-04,3,1,9,3,0,2,1,2,0.66,0.5909,0.89,0.0896,1,9,10 +14568,2012-09-04,3,1,9,4,0,2,1,2,0.66,0.5909,0.89,0,0,8,8 +14569,2012-09-04,3,1,9,5,0,2,1,2,0.66,0.5909,0.89,0,1,37,38 +14570,2012-09-04,3,1,9,6,0,2,1,2,0.68,0.6364,0.89,0.1045,7,165,172 +14571,2012-09-04,3,1,9,7,0,2,1,2,0.68,0.6364,0.89,0.1045,13,480,493 +14572,2012-09-04,3,1,9,8,0,2,1,1,0.7,0.6667,0.84,0.1642,15,617,632 +14573,2012-09-04,3,1,9,9,0,2,1,2,0.72,0.697,0.79,0.2239,30,293,323 +14574,2012-09-04,3,1,9,10,0,2,1,2,0.72,0.697,0.74,0.2537,42,130,172 +14575,2012-09-04,3,1,9,11,0,2,1,2,0.74,0.697,0.7,0.2836,39,159,198 +14576,2012-09-04,3,1,9,12,0,2,1,1,0.76,0.7273,0.7,0.2836,54,233,287 +14577,2012-09-04,3,1,9,13,0,2,1,1,0.8,0.7576,0.55,0.4478,72,203,275 +14578,2012-09-04,3,1,9,14,0,2,1,1,0.8,0.7576,0.55,0.4179,72,202,274 +14579,2012-09-04,3,1,9,15,0,2,1,1,0.8,0.7727,0.59,0.4627,68,224,292 +14580,2012-09-04,3,1,9,16,0,2,1,1,0.76,0.7273,0.7,0.4179,61,397,458 +14581,2012-09-04,3,1,9,17,0,2,1,1,0.76,0.7273,0.7,0.3582,110,746,856 +14582,2012-09-04,3,1,9,18,0,2,1,1,0.76,0.7424,0.75,0.2239,94,745,839 +14583,2012-09-04,3,1,9,19,0,2,1,1,0.76,0.7273,0.7,0.2239,50,502,552 +14584,2012-09-04,3,1,9,20,0,2,1,1,0.74,0.697,0.7,0.2537,47,338,385 +14585,2012-09-04,3,1,9,21,0,2,1,1,0.74,0.7121,0.74,0.2537,28,240,268 +14586,2012-09-04,3,1,9,22,0,2,1,1,0.74,0.697,0.7,0.2836,25,154,179 +14587,2012-09-04,3,1,9,23,0,2,1,1,0.72,0.697,0.74,0.2985,12,83,95 +14588,2012-09-05,3,1,9,0,0,3,1,1,0.72,0.697,0.74,0.2537,10,27,37 +14589,2012-09-05,3,1,9,1,0,3,1,1,0.7,0.6667,0.84,0.194,2,11,13 +14590,2012-09-05,3,1,9,2,0,3,1,1,0.7,0.6667,0.84,0.194,0,9,9 +14591,2012-09-05,3,1,9,3,0,3,1,1,0.7,0.6667,0.79,0.2836,0,4,4 +14592,2012-09-05,3,1,9,4,0,3,1,1,0.7,0.6667,0.79,0.2836,2,3,5 
+14593,2012-09-05,3,1,9,5,0,3,1,2,0.7,0.6667,0.79,0.2985,1,39,40 +14594,2012-09-05,3,1,9,6,0,3,1,1,0.7,0.6667,0.79,0.2836,7,203,210 +14595,2012-09-05,3,1,9,7,0,3,1,2,0.7,0.6667,0.79,0.2537,11,489,500 +14596,2012-09-05,3,1,9,8,0,3,1,1,0.72,0.697,0.74,0.194,33,692,725 +14597,2012-09-05,3,1,9,9,0,3,1,1,0.74,0.697,0.7,0.2537,27,263,290 +14598,2012-09-05,3,1,9,10,0,3,1,2,0.76,0.7273,0.7,0.1642,54,181,235 +14599,2012-09-05,3,1,9,11,0,3,1,2,0.78,0.7424,0.62,0.1642,61,156,217 +14600,2012-09-05,3,1,9,12,0,3,1,1,0.78,0.7727,0.7,0.1045,49,217,266 +14601,2012-09-05,3,1,9,13,0,3,1,1,0.8,0.7879,0.63,0.1642,50,184,234 +14602,2012-09-05,3,1,9,14,0,3,1,1,0.82,0.7879,0.56,0.1045,34,177,211 +14603,2012-09-05,3,1,9,15,0,3,1,1,0.78,0.7727,0.7,0.194,41,204,245 +14604,2012-09-05,3,1,9,16,0,3,1,1,0.76,0.7273,0.7,0.2537,86,344,430 +14605,2012-09-05,3,1,9,17,0,3,1,1,0.76,0.7273,0.66,0.2537,83,780,863 +14606,2012-09-05,3,1,9,18,0,3,1,1,0.76,0.7424,0.75,0.194,82,757,839 +14607,2012-09-05,3,1,9,19,0,3,1,1,0.74,0.697,0.7,0.1642,64,598,662 +14608,2012-09-05,3,1,9,20,0,3,1,1,0.72,0.7121,0.84,0.1642,55,357,412 +14609,2012-09-05,3,1,9,21,0,3,1,1,0.72,0.697,0.79,0,18,278,296 +14610,2012-09-05,3,1,9,22,0,3,1,1,0.72,0.697,0.79,0,38,181,219 +14611,2012-09-05,3,1,9,23,0,3,1,1,0.7,0.6667,0.84,0.0896,24,126,150 +14612,2012-09-06,3,1,9,0,0,4,1,2,0.7,0.6667,0.84,0.0896,30,35,65 +14613,2012-09-06,3,1,9,1,0,4,1,2,0.7,0.6667,0.84,0,2,16,18 +14614,2012-09-06,3,1,9,2,0,4,1,1,0.72,0.697,0.79,0,1,7,8 +14615,2012-09-06,3,1,9,3,0,4,1,1,0.7,0.6667,0.84,0.1045,0,9,9 +14616,2012-09-06,3,1,9,4,0,4,1,2,0.7,0.6667,0.84,0.0896,0,8,8 +14617,2012-09-06,3,1,9,5,0,4,1,2,0.7,0.6667,0.84,0.194,3,35,38 +14618,2012-09-06,3,1,9,6,0,4,1,2,0.7,0.6667,0.84,0.0896,11,189,200 +14619,2012-09-06,3,1,9,7,0,4,1,2,0.7,0.6667,0.84,0.1045,21,461,482 +14620,2012-09-06,3,1,9,8,0,4,1,3,0.7,0.6667,0.79,0,24,622,646 +14621,2012-09-06,3,1,9,9,0,4,1,3,0.66,0.5909,0.94,0.1343,8,115,123 +14622,2012-09-06,3,1,9,10,0,4,1,3,0.66,0.5909,0.94,0.1343,9,40,49 
+14623,2012-09-06,3,1,9,11,0,4,1,3,0.7,0.6667,0.84,0.1642,10,52,62 +14624,2012-09-06,3,1,9,12,0,4,1,2,0.72,0.697,0.79,0.2239,11,126,137 +14625,2012-09-06,3,1,9,13,0,4,1,2,0.72,0.697,0.79,0.1045,39,152,191 +14626,2012-09-06,3,1,9,14,0,4,1,1,0.74,0.7121,0.74,0.0896,38,178,216 +14627,2012-09-06,3,1,9,15,0,4,1,1,0.74,0.697,0.66,0.194,42,228,270 +14628,2012-09-06,3,1,9,16,0,4,1,1,0.74,0.697,0.7,0.2239,42,374,416 +14629,2012-09-06,3,1,9,17,0,4,1,1,0.74,0.697,0.7,0.2537,67,741,808 +14630,2012-09-06,3,1,9,18,0,4,1,1,0.72,0.6818,0.7,0.2537,74,761,835 +14631,2012-09-06,3,1,9,19,0,4,1,1,0.68,0.6364,0.74,0.2537,54,504,558 +14632,2012-09-06,3,1,9,20,0,4,1,1,0.66,0.6061,0.78,0.2239,50,356,406 +14633,2012-09-06,3,1,9,21,0,4,1,1,0.64,0.5758,0.89,0.1642,32,240,272 +14634,2012-09-06,3,1,9,22,0,4,1,1,0.64,0.5758,0.89,0.1642,24,193,217 +14635,2012-09-06,3,1,9,23,0,4,1,1,0.64,0.5758,0.89,0.1642,19,150,169 +14636,2012-09-07,3,1,9,0,0,5,1,1,0.64,0.5758,0.89,0.1343,17,79,96 +14637,2012-09-07,3,1,9,1,0,5,1,1,0.62,0.5606,0.88,0.1045,10,19,29 +14638,2012-09-07,3,1,9,2,0,5,1,1,0.62,0.5758,0.83,0.0896,1,10,11 +14639,2012-09-07,3,1,9,3,0,5,1,1,0.62,0.5758,0.83,0.0896,0,9,9 +14640,2012-09-07,3,1,9,4,0,5,1,1,0.62,0.5606,0.88,0,1,7,8 +14641,2012-09-07,3,1,9,5,0,5,1,1,0.62,0.5606,0.88,0,1,39,40 +14642,2012-09-07,3,1,9,6,0,5,1,1,0.62,0.5606,0.88,0.1045,5,135,140 +14643,2012-09-07,3,1,9,7,0,5,1,1,0.64,0.5758,0.89,0.1045,21,465,486 +14644,2012-09-07,3,1,9,8,0,5,1,1,0.66,0.6061,0.83,0.1045,29,690,719 +14645,2012-09-07,3,1,9,9,0,5,1,1,0.7,0.6667,0.79,0.1343,33,311,344 +14646,2012-09-07,3,1,9,10,0,5,1,1,0.72,0.697,0.74,0.1343,53,171,224 +14647,2012-09-07,3,1,9,11,0,5,1,1,0.76,0.7121,0.62,0.1343,81,215,296 +14648,2012-09-07,3,1,9,12,0,5,1,1,0.78,0.7424,0.59,0.1642,76,247,323 +14649,2012-09-07,3,1,9,13,0,5,1,1,0.8,0.7727,0.59,0.2537,71,283,354 +14650,2012-09-07,3,1,9,14,0,5,1,1,0.8,0.7727,0.59,0.2537,66,187,253 +14651,2012-09-07,3,1,9,15,0,5,1,1,0.8,0.7576,0.55,0.2537,66,265,331 
+14652,2012-09-07,3,1,9,16,0,5,1,1,0.8,0.7576,0.55,0.2836,75,395,470 +14653,2012-09-07,3,1,9,17,0,5,1,1,0.78,0.7424,0.59,0.3284,88,684,772 +14654,2012-09-07,3,1,9,18,0,5,1,1,0.76,0.7121,0.62,0.2985,95,697,792 +14655,2012-09-07,3,1,9,19,0,5,1,1,0.74,0.6818,0.62,0.2836,70,498,568 +14656,2012-09-07,3,1,9,20,0,5,1,1,0.72,0.6818,0.66,0.2239,51,340,391 +14657,2012-09-07,3,1,9,21,0,5,1,1,0.7,0.6667,0.74,0.1642,55,246,301 +14658,2012-09-07,3,1,9,22,0,5,1,1,0.7,0.6667,0.74,0.2239,52,247,299 +14659,2012-09-07,3,1,9,23,0,5,1,1,0.66,0.5909,0.89,0.2537,28,220,248 +14660,2012-09-08,3,1,9,0,0,6,0,1,0.66,0.5909,0.89,0.2537,20,130,150 +14661,2012-09-08,3,1,9,1,0,6,0,2,0.66,0.5909,0.89,0.2836,10,107,117 +14662,2012-09-08,3,1,9,2,0,6,0,1,0.66,0.5909,0.89,0.3284,9,68,77 +14663,2012-09-08,3,1,9,3,0,6,0,2,0.66,0.5909,0.89,0.2239,3,40,43 +14664,2012-09-08,3,1,9,4,0,6,0,2,0.66,0.5909,0.94,0.2239,3,8,11 +14665,2012-09-08,3,1,9,5,0,6,0,2,0.66,0.5909,0.94,0.2537,2,13,15 +14666,2012-09-08,3,1,9,6,0,6,0,2,0.66,0.5909,0.89,0.2836,8,26,34 +14667,2012-09-08,3,1,9,7,0,6,0,1,0.66,0.5909,0.89,0.2836,11,71,82 +14668,2012-09-08,3,1,9,8,0,6,0,1,0.7,0.6667,0.79,0.3284,33,197,230 +14669,2012-09-08,3,1,9,9,0,6,0,1,0.7,0.6667,0.74,0.3881,87,261,348 +14670,2012-09-08,3,1,9,10,0,6,0,1,0.74,0.697,0.7,0.3881,100,322,422 +14671,2012-09-08,3,1,9,11,0,6,0,1,0.76,0.7121,0.62,0.4478,173,405,578 +14672,2012-09-08,3,1,9,12,0,6,0,1,0.8,0.7576,0.55,0.4925,220,474,694 +14673,2012-09-08,3,1,9,13,0,6,0,1,0.82,0.7727,0.52,0.5224,233,435,668 +14674,2012-09-08,3,1,9,14,0,6,0,1,0.82,0.7727,0.52,0.4478,260,366,626 +14675,2012-09-08,3,1,9,15,0,6,0,3,0.56,0.5303,0.88,0.2537,175,337,512 +14676,2012-09-08,3,1,9,16,0,6,0,3,0.56,0.5303,0.88,0.194,19,95,114 +14677,2012-09-08,3,1,9,17,0,6,0,3,0.58,0.5455,0.83,0.194,52,119,171 +14678,2012-09-08,3,1,9,18,0,6,0,3,0.58,0.5455,0.83,0.1343,27,140,167 +14679,2012-09-08,3,1,9,19,0,6,0,3,0.58,0.5455,0.88,0.1045,28,187,215 +14680,2012-09-08,3,1,9,20,0,6,0,2,0.58,0.5455,0.88,0.0896,28,166,194 
+14681,2012-09-08,3,1,9,21,0,6,0,2,0.6,0.5606,0.83,0.1642,26,155,181 +14682,2012-09-08,3,1,9,22,0,6,0,2,0.58,0.5455,0.78,0.2985,18,176,194 +14683,2012-09-08,3,1,9,23,0,6,0,2,0.58,0.5455,0.73,0.1642,12,121,133 +14684,2012-09-09,3,1,9,0,0,0,0,2,0.56,0.5303,0.73,0.194,18,106,124 +14685,2012-09-09,3,1,9,1,0,0,0,1,0.56,0.5303,0.73,0.2239,20,104,124 +14686,2012-09-09,3,1,9,2,0,0,0,1,0.54,0.5152,0.77,0.2836,15,81,96 +14687,2012-09-09,3,1,9,3,0,0,0,1,0.54,0.5152,0.73,0.2239,14,32,46 +14688,2012-09-09,3,1,9,4,0,0,0,1,0.52,0.5,0.77,0.2537,12,16,28 +14689,2012-09-09,3,1,9,5,0,0,0,1,0.52,0.5,0.77,0.194,16,39,55 +14690,2012-09-09,3,1,9,6,0,0,0,1,0.52,0.5,0.77,0.194,11,24,35 +14691,2012-09-09,3,1,9,7,0,0,0,1,0.54,0.5152,0.73,0.1642,20,50,70 +14692,2012-09-09,3,1,9,8,0,0,0,1,0.58,0.5455,0.68,0.2239,27,143,170 +14693,2012-09-09,3,1,9,9,0,0,0,1,0.62,0.6061,0.61,0.2537,85,214,299 +14694,2012-09-09,3,1,9,10,0,0,0,1,0.64,0.6212,0.57,0.2985,172,323,495 +14695,2012-09-09,3,1,9,11,0,0,0,1,0.66,0.6212,0.47,0.2537,194,391,585 +14696,2012-09-09,3,1,9,12,0,0,0,1,0.66,0.6212,0.47,0,247,510,757 +14697,2012-09-09,3,1,9,13,0,0,0,1,0.68,0.6364,0.39,0,238,491,729 +14698,2012-09-09,3,1,9,14,0,0,0,1,0.72,0.6515,0.3,0.2836,232,415,647 +14699,2012-09-09,3,1,9,15,0,0,0,1,0.72,0.6515,0.32,0.2537,284,412,696 +14700,2012-09-09,3,1,9,16,0,0,0,1,0.68,0.6364,0.36,0.2836,228,473,701 +14701,2012-09-09,3,1,9,17,0,0,0,1,0.7,0.6364,0.37,0.2537,213,458,671 +14702,2012-09-09,3,1,9,18,0,0,0,1,0.66,0.6212,0.36,0.2537,156,404,560 +14703,2012-09-09,3,1,9,19,0,0,0,1,0.66,0.6212,0.36,0.2537,134,362,496 +14704,2012-09-09,3,1,9,20,0,0,0,1,0.62,0.6212,0.41,0.2836,91,265,356 +14705,2012-09-09,3,1,9,21,0,0,0,1,0.6,0.6212,0.46,0.2239,57,149,206 +14706,2012-09-09,3,1,9,22,0,0,0,1,0.58,0.5455,0.49,0.2537,64,125,189 +14707,2012-09-09,3,1,9,23,0,0,0,1,0.56,0.5303,0.52,0.2836,22,70,92 +14708,2012-09-10,3,1,9,0,0,1,1,1,0.56,0.5303,0.52,0.2836,6,35,41 +14709,2012-09-10,3,1,9,1,0,1,1,1,0.54,0.5152,0.6,0.194,7,11,18 
+14710,2012-09-10,3,1,9,2,0,1,1,1,0.54,0.5152,0.6,0.3582,11,6,17 +14711,2012-09-10,3,1,9,3,0,1,1,1,0.52,0.5,0.68,0.2836,5,7,12 +14712,2012-09-10,3,1,9,4,0,1,1,1,0.52,0.5,0.68,0.2836,1,4,5 +14713,2012-09-10,3,1,9,5,0,1,1,1,0.5,0.4848,0.72,0.2537,4,36,40 +14714,2012-09-10,3,1,9,6,0,1,1,1,0.5,0.4848,0.72,0.1343,8,172,180 +14715,2012-09-10,3,1,9,7,0,1,1,1,0.52,0.5,0.68,0.2239,20,427,447 +14716,2012-09-10,3,1,9,8,0,1,1,1,0.56,0.5303,0.6,0.2836,33,697,730 +14717,2012-09-10,3,1,9,9,0,1,1,1,0.6,0.6212,0.53,0.2985,40,321,361 +14718,2012-09-10,3,1,9,10,0,1,1,1,0.62,0.6212,0.5,0.2836,42,168,210 +14719,2012-09-10,3,1,9,11,0,1,1,1,0.62,0.6212,0.43,0.2985,69,167,236 +14720,2012-09-10,3,1,9,12,0,1,1,1,0.64,0.6212,0.44,0.3881,76,256,332 +14721,2012-09-10,3,1,9,13,0,1,1,1,0.64,0.6212,0.38,0.2239,72,251,323 +14722,2012-09-10,3,1,9,14,0,1,1,1,0.66,0.6212,0.36,0.2836,107,233,340 +14723,2012-09-10,3,1,9,15,0,1,1,1,0.66,0.6212,0.31,0.3881,115,231,346 +14724,2012-09-10,3,1,9,16,0,1,1,1,0.66,0.6212,0.34,0.3284,108,380,488 +14725,2012-09-10,3,1,9,17,0,1,1,1,0.66,0.6212,0.34,0.3284,127,744,871 +14726,2012-09-10,3,1,9,18,0,1,1,1,0.62,0.6212,0.35,0.2985,111,857,968 +14727,2012-09-10,3,1,9,19,0,1,1,1,0.62,0.6212,0.38,0.2985,71,562,633 +14728,2012-09-10,3,1,9,20,0,1,1,1,0.6,0.6212,0.4,0.194,34,356,390 +14729,2012-09-10,3,1,9,21,0,1,1,1,0.56,0.5303,0.49,0.1642,29,256,285 +14730,2012-09-10,3,1,9,22,0,1,1,1,0.54,0.5152,0.52,0.1343,15,144,159 +14731,2012-09-10,3,1,9,23,0,1,1,1,0.54,0.5152,0.52,0,7,86,93 +14732,2012-09-11,3,1,9,0,0,2,1,1,0.52,0.5,0.59,0.1045,5,27,32 +14733,2012-09-11,3,1,9,1,0,2,1,1,0.5,0.4848,0.63,0,0,10,10 +14734,2012-09-11,3,1,9,2,0,2,1,1,0.48,0.4697,0.77,0.1642,1,5,6 +14735,2012-09-11,3,1,9,3,0,2,1,1,0.48,0.4697,0.67,0.1343,1,7,8 +14736,2012-09-11,3,1,9,4,0,2,1,1,0.48,0.4697,0.72,0.1045,1,8,9 +14737,2012-09-11,3,1,9,5,0,2,1,1,0.46,0.4545,0.72,0.0896,3,41,44 +14738,2012-09-11,3,1,9,6,0,2,1,1,0.46,0.4545,0.77,0.1045,11,200,211 
+14739,2012-09-11,3,1,9,7,0,2,1,1,0.5,0.4848,0.72,0.1045,24,572,596 +14740,2012-09-11,3,1,9,8,0,2,1,1,0.54,0.5152,0.68,0.0896,48,702,750 +14741,2012-09-11,3,1,9,9,0,2,1,1,0.6,0.6212,0.49,0,35,323,358 +14742,2012-09-11,3,1,9,10,0,2,1,1,0.62,0.6212,0.38,0,54,142,196 +14743,2012-09-11,3,1,9,11,0,2,1,1,0.64,0.6212,0.33,0.0896,60,187,247 +14744,2012-09-11,3,1,9,12,0,2,1,1,0.64,0.6212,0.33,0,62,254,316 +14745,2012-09-11,3,1,9,13,0,2,1,1,0.66,0.6212,0.29,0.0896,72,256,328 +14746,2012-09-11,3,1,9,14,0,2,1,1,0.68,0.6212,0.3,0.0896,74,181,255 +14747,2012-09-11,3,1,9,15,0,2,1,1,0.7,0.6364,0.28,0.1045,73,253,326 +14748,2012-09-11,3,1,9,16,0,2,1,1,0.7,0.6364,0.28,0.1343,99,437,536 +14749,2012-09-11,3,1,9,17,0,2,1,1,0.7,0.6364,0.28,0,168,802,970 +14750,2012-09-11,3,1,9,18,0,2,1,1,0.64,0.6212,0.36,0.1642,110,767,877 +14751,2012-09-11,3,1,9,19,0,2,1,1,0.62,0.6212,0.41,0.194,56,540,596 +14752,2012-09-11,3,1,9,20,0,2,1,1,0.58,0.5455,0.56,0.1343,40,421,461 +14753,2012-09-11,3,1,9,21,0,2,1,1,0.56,0.5303,0.6,0.1045,28,282,310 +14754,2012-09-11,3,1,9,22,0,2,1,1,0.56,0.5303,0.64,0.1045,27,189,216 +14755,2012-09-11,3,1,9,23,0,2,1,1,0.54,0.5152,0.68,0.1045,18,91,109 +14756,2012-09-12,3,1,9,0,0,3,1,1,0.52,0.5,0.72,0.1045,8,41,49 +14757,2012-09-12,3,1,9,1,0,3,1,1,0.52,0.5,0.72,0.1045,2,19,21 +14758,2012-09-12,3,1,9,2,0,3,1,1,0.52,0.5,0.72,0.0896,2,9,11 +14759,2012-09-12,3,1,9,3,0,3,1,1,0.5,0.4848,0.77,0.0896,0,7,7 +14760,2012-09-12,3,1,9,4,0,3,1,1,0.5,0.4848,0.72,0.0896,0,5,5 +14761,2012-09-12,3,1,9,5,0,3,1,1,0.5,0.4848,0.72,0.0896,4,44,48 +14762,2012-09-12,3,1,9,6,0,3,1,1,0.5,0.4848,0.72,0.0896,6,199,205 +14763,2012-09-12,3,1,9,7,0,3,1,1,0.52,0.5,0.72,0,24,533,557 +14764,2012-09-12,3,1,9,8,0,3,1,1,0.56,0.5303,0.64,0.1045,43,727,770 +14765,2012-09-12,3,1,9,9,0,3,1,1,0.6,0.6061,0.64,0.1343,50,278,328 +14766,2012-09-12,3,1,9,10,0,3,1,1,0.62,0.6061,0.61,0.1343,57,148,205 +14767,2012-09-12,3,1,9,11,0,3,1,1,0.66,0.6212,0.5,0.194,51,181,232 
+14768,2012-09-12,3,1,9,12,0,3,1,1,0.68,0.6364,0.36,0.1343,88,264,352 +14769,2012-09-12,3,1,9,13,0,3,1,1,0.7,0.6364,0.34,0.1343,85,238,323 +14770,2012-09-12,3,1,9,14,0,3,1,1,0.7,0.6364,0.37,0.1642,78,200,278 +14771,2012-09-12,3,1,9,15,0,3,1,1,0.72,0.6515,0.37,0.1343,75,243,318 +14772,2012-09-12,3,1,9,16,0,3,1,1,0.72,0.6515,0.37,0.1642,90,419,509 +14773,2012-09-12,3,1,9,17,0,3,1,1,0.7,0.6364,0.41,0.2985,114,811,925 +14774,2012-09-12,3,1,9,18,0,3,1,1,0.66,0.6212,0.44,0.2537,91,886,977 +14775,2012-09-12,3,1,9,19,0,3,1,1,0.64,0.6212,0.5,0.194,78,557,635 +14776,2012-09-12,3,1,9,20,0,3,1,1,0.62,0.6212,0.57,0.1343,38,432,470 +14777,2012-09-12,3,1,9,21,0,3,1,1,0.6,0.6061,0.6,0.1343,27,279,306 +14778,2012-09-12,3,1,9,22,0,3,1,1,0.56,0.5303,0.64,0.0896,23,189,212 +14779,2012-09-12,3,1,9,23,0,3,1,1,0.56,0.5303,0.68,0.1045,16,111,127 +14780,2012-09-13,3,1,9,0,0,4,1,1,0.56,0.5303,0.73,0.0896,11,46,57 +14781,2012-09-13,3,1,9,1,0,4,1,1,0.54,0.5152,0.77,0.0896,6,22,28 +14782,2012-09-13,3,1,9,2,0,4,1,1,0.54,0.5152,0.77,0,0,9,9 +14783,2012-09-13,3,1,9,3,0,4,1,1,0.52,0.5,0.83,0,2,9,11 +14784,2012-09-13,3,1,9,4,0,4,1,1,0.52,0.5,0.83,0,0,6,6 +14785,2012-09-13,3,1,9,5,0,4,1,1,0.52,0.5,0.83,0,3,54,57 +14786,2012-09-13,3,1,9,6,0,4,1,1,0.5,0.4848,0.82,0,9,186,195 +14787,2012-09-13,3,1,9,7,0,4,1,1,0.54,0.5152,0.77,0,22,549,571 +14788,2012-09-13,3,1,9,8,0,4,1,1,0.56,0.5303,0.73,0,33,725,758 +14789,2012-09-13,3,1,9,9,0,4,1,1,0.6,0.6061,0.64,0.0896,46,292,338 +14790,2012-09-13,3,1,9,10,0,4,1,1,0.64,0.6212,0.61,0.0896,63,149,212 +14791,2012-09-13,3,1,9,11,0,4,1,1,0.66,0.6212,0.5,0.1642,71,183,254 +14792,2012-09-13,3,1,9,12,0,4,1,1,0.68,0.6364,0.44,0.1045,57,240,297 +14793,2012-09-13,3,1,9,13,0,4,1,1,0.7,0.6364,0.42,0,60,223,283 +14794,2012-09-13,3,1,9,14,0,4,1,1,0.72,0.6515,0.39,0.1343,60,214,274 +14795,2012-09-13,3,1,9,15,0,4,1,1,0.72,0.6515,0.42,0.1045,99,253,352 +14796,2012-09-13,3,1,9,16,0,4,1,1,0.72,0.6515,0.39,0.0896,85,406,491 
+14797,2012-09-13,3,1,9,17,0,4,1,1,0.72,0.6515,0.42,0.194,97,787,884 +14798,2012-09-13,3,1,9,18,0,4,1,1,0.68,0.6364,0.57,0.2537,108,744,852 +14799,2012-09-13,3,1,9,19,0,4,1,1,0.64,0.6061,0.65,0.1343,80,594,674 +14800,2012-09-13,3,1,9,20,0,4,1,1,0.62,0.6061,0.69,0.1045,55,408,463 +14801,2012-09-13,3,1,9,21,0,4,1,1,0.62,0.6061,0.65,0.1045,26,291,317 +14802,2012-09-13,3,1,9,22,0,4,1,1,0.6,0.5909,0.69,0.1343,28,223,251 +14803,2012-09-13,3,1,9,23,0,4,1,1,0.58,0.5455,0.73,0.1045,33,137,170 +14804,2012-09-14,3,1,9,0,0,5,1,1,0.56,0.5303,0.73,0.0896,24,63,87 +14805,2012-09-14,3,1,9,1,0,5,1,1,0.56,0.5303,0.73,0.0896,6,35,41 +14806,2012-09-14,3,1,9,2,0,5,1,1,0.56,0.5303,0.73,0,7,23,30 +14807,2012-09-14,3,1,9,3,0,5,1,1,0.54,0.5152,0.77,0,5,9,14 +14808,2012-09-14,3,1,9,4,0,5,1,1,0.54,0.5152,0.77,0,1,11,12 +14809,2012-09-14,3,1,9,5,0,5,1,1,0.54,0.5152,0.83,0,1,40,41 +14810,2012-09-14,3,1,9,6,0,5,1,1,0.54,0.5152,0.88,0,8,144,152 +14811,2012-09-14,3,1,9,7,0,5,1,2,0.56,0.5303,0.88,0,17,437,454 +14812,2012-09-14,3,1,9,8,0,5,1,2,0.6,0.5758,0.78,0.1642,51,715,766 +14813,2012-09-14,3,1,9,9,0,5,1,2,0.62,0.5909,0.78,0.1642,46,301,347 +14814,2012-09-14,3,1,9,10,0,5,1,2,0.68,0.6364,0.65,0,75,171,246 +14815,2012-09-14,3,1,9,11,0,5,1,2,0.68,0.6364,0.61,0.1045,84,196,280 +14816,2012-09-14,3,1,9,12,0,5,1,2,0.7,0.6515,0.54,0.194,119,290,409 +14817,2012-09-14,3,1,9,13,0,5,1,2,0.7,0.6515,0.54,0.194,107,301,408 +14818,2012-09-14,3,1,9,14,0,5,1,2,0.72,0.6515,0.45,0.1642,113,258,371 +14819,2012-09-14,3,1,9,15,0,5,1,2,0.72,0.6667,0.48,0.1343,94,273,367 +14820,2012-09-14,3,1,9,16,0,5,1,2,0.72,0.6667,0.51,0,109,454,563 +14821,2012-09-14,3,1,9,17,0,5,1,1,0.72,0.6667,0.51,0.194,137,757,894 +14822,2012-09-14,3,1,9,18,0,5,1,1,0.72,0.6515,0.45,0.2239,116,692,808 +14823,2012-09-14,3,1,9,19,0,5,1,1,0.68,0.6364,0.57,0.1642,83,496,579 +14824,2012-09-14,3,1,9,20,0,5,1,1,0.66,0.6212,0.61,0.1045,67,337,404 +14825,2012-09-14,3,1,9,21,0,5,1,1,0.64,0.6061,0.73,0.2239,42,270,312 
+14826,2012-09-14,3,1,9,22,0,5,1,1,0.62,0.5758,0.83,0.194,40,189,229 +14827,2012-09-14,3,1,9,23,0,5,1,1,0.62,0.5909,0.78,0.0896,27,168,195 +14828,2012-09-15,3,1,9,0,0,6,0,1,0.6,0.5606,0.83,0.1045,38,169,207 +14829,2012-09-15,3,1,9,1,0,6,0,1,0.6,0.5909,0.73,0,8,101,109 +14830,2012-09-15,3,1,9,2,0,6,0,1,0.58,0.5455,0.78,0.1045,18,75,93 +14831,2012-09-15,3,1,9,3,0,6,0,1,0.6,0.5909,0.73,0.2537,6,31,37 +14832,2012-09-15,3,1,9,4,0,6,0,2,0.6,0.5909,0.69,0.3582,3,3,6 +14833,2012-09-15,3,1,9,5,0,6,0,1,0.58,0.5455,0.6,0.5224,1,15,16 +14834,2012-09-15,3,1,9,6,0,6,0,1,0.54,0.5152,0.49,0.4179,6,27,33 +14835,2012-09-15,3,1,9,7,0,6,0,1,0.54,0.5152,0.52,0.2836,10,63,73 +14836,2012-09-15,3,1,9,8,0,6,0,1,0.56,0.5303,0.49,0.4179,43,169,212 +14837,2012-09-15,3,1,9,9,0,6,0,1,0.6,0.6212,0.43,0.4179,79,263,342 +14838,2012-09-15,3,1,9,10,0,6,0,1,0.62,0.6212,0.41,0.3881,119,323,442 +14839,2012-09-15,3,1,9,11,0,6,0,1,0.64,0.6212,0.38,0.3881,228,399,627 +14840,2012-09-15,3,1,9,12,0,6,0,1,0.66,0.6212,0.36,0.3582,287,419,706 +14841,2012-09-15,3,1,9,13,0,6,0,1,0.68,0.6364,0.36,0.194,327,377,704 +14842,2012-09-15,3,1,9,14,0,6,0,1,0.68,0.6364,0.34,0.3284,325,390,715 +14843,2012-09-15,3,1,9,15,0,6,0,2,0.68,0.6364,0.34,0.2836,312,342,654 +14844,2012-09-15,3,1,9,16,0,6,0,2,0.66,0.6212,0.36,0.2239,350,433,783 +14845,2012-09-15,3,1,9,17,0,6,0,2,0.66,0.6212,0.36,0.2537,295,434,729 +14846,2012-09-15,3,1,9,18,0,6,0,2,0.64,0.6212,0.36,0.2836,232,382,614 +14847,2012-09-15,3,1,9,19,0,6,0,1,0.62,0.6212,0.41,0.1642,169,309,478 +14848,2012-09-15,3,1,9,20,0,6,0,1,0.6,0.6212,0.43,0.0896,89,241,330 +14849,2012-09-15,3,1,9,21,0,6,0,1,0.56,0.5303,0.52,0.1045,86,210,296 +14850,2012-09-15,3,1,9,22,0,6,0,1,0.56,0.5303,0.52,0,82,197,279 +14851,2012-09-15,3,1,9,23,0,6,0,1,0.54,0.5152,0.6,0,47,182,229 +14852,2012-09-16,3,1,9,0,0,0,0,1,0.54,0.5152,0.64,0,28,123,151 +14853,2012-09-16,3,1,9,1,0,0,0,1,0.54,0.5152,0.64,0,35,82,117 +14854,2012-09-16,3,1,9,2,0,0,0,1,0.52,0.5,0.63,0,27,62,89 
+14855,2012-09-16,3,1,9,3,0,0,0,1,0.5,0.4848,0.68,0,10,38,48 +14856,2012-09-16,3,1,9,4,0,0,0,1,0.5,0.4848,0.72,0,2,6,8 +14857,2012-09-16,3,1,9,5,0,0,0,1,0.5,0.4848,0.72,0,3,10,13 +14858,2012-09-16,3,1,9,6,0,0,0,1,0.5,0.4848,0.72,0,9,26,35 +14859,2012-09-16,3,1,9,7,0,0,0,1,0.5,0.4848,0.77,0.1045,28,43,71 +14860,2012-09-16,3,1,9,8,0,0,0,1,0.54,0.5152,0.64,0.1642,34,94,128 +14861,2012-09-16,3,1,9,9,0,0,0,1,0.58,0.5455,0.6,0.1343,70,226,296 +14862,2012-09-16,3,1,9,10,0,0,0,1,0.62,0.6212,0.5,0.1045,160,330,490 +14863,2012-09-16,3,1,9,11,0,0,0,1,0.64,0.6212,0.47,0.1642,183,376,559 +14864,2012-09-16,3,1,9,12,0,0,0,1,0.66,0.6212,0.44,0.1642,188,468,656 +14865,2012-09-16,3,1,9,13,0,0,0,1,0.64,0.6212,0.41,0.1642,240,454,694 +14866,2012-09-16,3,1,9,14,0,0,0,1,0.66,0.6212,0.39,0.1343,225,410,635 +14867,2012-09-16,3,1,9,15,0,0,0,1,0.64,0.6212,0.41,0.1642,215,342,557 +14868,2012-09-16,3,1,9,16,0,0,0,1,0.64,0.6212,0.44,0.1642,194,402,596 +14869,2012-09-16,3,1,9,17,0,0,0,1,0.66,0.6212,0.41,0,177,393,570 +14870,2012-09-16,3,1,9,18,0,0,0,1,0.62,0.6212,0.5,0.1642,120,361,481 +14871,2012-09-16,3,1,9,19,0,0,0,1,0.62,0.6212,0.5,0.1343,91,312,403 +14872,2012-09-16,3,1,9,20,0,0,0,1,0.6,0.6212,0.53,0.1045,57,267,324 +14873,2012-09-16,3,1,9,21,0,0,0,1,0.6,0.6212,0.56,0.1045,43,148,191 +14874,2012-09-16,3,1,9,22,0,0,0,1,0.56,0.5303,0.68,0.0896,18,109,127 +14875,2012-09-16,3,1,9,23,0,0,0,1,0.54,0.5152,0.68,0.1045,9,85,94 +14876,2012-09-17,3,1,9,0,0,1,1,1,0.54,0.5152,0.68,0.1642,14,31,45 +14877,2012-09-17,3,1,9,1,0,1,1,1,0.52,0.5,0.72,0.0896,9,12,21 +14878,2012-09-17,3,1,9,2,0,1,1,1,0.52,0.5,0.72,0,5,8,13 +14879,2012-09-17,3,1,9,3,0,1,1,1,0.5,0.4848,0.77,0.0896,0,7,7 +14880,2012-09-17,3,1,9,4,0,1,1,1,0.5,0.4848,0.77,0,2,9,11 +14881,2012-09-17,3,1,9,5,0,1,1,1,0.48,0.4697,0.82,0.0896,1,44,45 +14882,2012-09-17,3,1,9,6,0,1,1,1,0.48,0.4697,0.82,0,7,157,164 +14883,2012-09-17,3,1,9,7,0,1,1,1,0.52,0.5,0.77,0,18,474,492 +14884,2012-09-17,3,1,9,8,0,1,1,1,0.54,0.5152,0.77,0.0896,36,647,683 
+14885,2012-09-17,3,1,9,9,0,1,1,1,0.58,0.5455,0.68,0.1343,34,265,299 +14886,2012-09-17,3,1,9,10,0,1,1,2,0.62,0.6061,0.65,0.194,73,129,202 +14887,2012-09-17,3,1,9,11,0,1,1,2,0.62,0.6061,0.69,0.2239,61,191,252 +14888,2012-09-17,3,1,9,12,0,1,1,2,0.62,0.6061,0.69,0.194,62,233,295 +14889,2012-09-17,3,1,9,13,0,1,1,1,0.64,0.6061,0.65,0.2537,112,231,343 +14890,2012-09-17,3,1,9,14,0,1,1,2,0.64,0.6061,0.65,0.2537,129,203,332 +14891,2012-09-17,3,1,9,15,0,1,1,2,0.64,0.6061,0.65,0.2537,82,256,338 +14892,2012-09-17,3,1,9,16,0,1,1,2,0.64,0.6061,0.69,0.2537,74,379,453 +14893,2012-09-17,3,1,9,17,0,1,1,2,0.64,0.6061,0.65,0.2239,102,740,842 +14894,2012-09-17,3,1,9,18,0,1,1,2,0.62,0.5909,0.73,0.1642,66,708,774 +14895,2012-09-17,3,1,9,19,0,1,1,2,0.62,0.5909,0.73,0.1045,51,435,486 +14896,2012-09-17,3,1,9,20,0,1,1,2,0.62,0.5909,0.73,0.1343,32,308,340 +14897,2012-09-17,3,1,9,21,0,1,1,3,0.62,0.5758,0.83,0.2239,28,205,233 +14898,2012-09-17,3,1,9,22,0,1,1,3,0.62,0.5758,0.83,0.2537,20,109,129 +14899,2012-09-17,3,1,9,23,0,1,1,3,0.6,0.5152,0.94,0.2537,4,66,70 +14900,2012-09-18,3,1,9,0,0,2,1,3,0.6,0.5152,0.94,0.2537,2,11,13 +14901,2012-09-18,3,1,9,1,0,2,1,3,0.6,0.5152,0.94,0.194,0,5,5 +14902,2012-09-18,3,1,9,2,0,2,1,3,0.6,0.5152,0.94,0.194,0,4,4 +14903,2012-09-18,3,1,9,3,0,2,1,2,0.62,0.5455,0.94,0.2836,0,6,6 +14904,2012-09-18,3,1,9,4,0,2,1,2,0.62,0.5455,0.94,0.3582,2,5,7 +14905,2012-09-18,3,1,9,5,0,2,1,2,0.64,0.5758,0.89,0.3284,3,45,48 +14906,2012-09-18,3,1,9,6,0,2,1,2,0.64,0.5758,0.89,0.3284,4,163,167 +14907,2012-09-18,3,1,9,7,0,2,1,2,0.64,0.5758,0.89,0.4478,13,343,356 +14908,2012-09-18,3,1,9,8,0,2,1,2,0.66,0.6061,0.83,0.5522,32,640,672 +14909,2012-09-18,3,1,9,9,0,2,1,2,0.66,0.6061,0.83,0.5821,27,266,293 +14910,2012-09-18,3,1,9,10,0,2,1,3,0.68,0.6364,0.79,0.6418,30,130,160 +14911,2012-09-18,3,1,9,11,0,2,1,2,0.68,0.6364,0.79,0.6418,36,115,151 +14912,2012-09-18,3,1,9,12,0,2,1,3,0.68,0.6364,0.83,0.4478,16,72,88 +14913,2012-09-18,3,1,9,13,0,2,1,2,0.7,0.6667,0.74,0.5821,17,97,114 
+14914,2012-09-18,3,1,9,14,0,2,1,3,0.6,0.5152,0.94,0.4627,27,116,143 +14915,2012-09-18,3,1,9,15,0,2,1,3,0.6,0.5455,0.88,0.2537,1,35,36 +14916,2012-09-18,3,1,9,16,0,2,1,3,0.6,0.5455,0.88,0.2985,19,122,141 +14917,2012-09-18,3,1,9,17,0,2,1,3,0.6,0.5455,0.88,0.194,36,302,338 +14918,2012-09-18,3,1,9,18,0,2,1,2,0.6,0.5455,0.88,0.1642,19,262,281 +14919,2012-09-18,3,1,9,19,0,2,1,2,0.6,0.5455,0.88,0.2985,22,302,324 +14920,2012-09-18,3,1,9,20,0,2,1,2,0.6,0.5455,0.88,0.194,19,271,290 +14921,2012-09-18,3,1,9,21,0,2,1,2,0.6,0.5455,0.88,0.194,21,186,207 +14922,2012-09-18,3,1,9,22,0,2,1,3,0.58,0.5455,0.83,0.4627,14,137,151 +14923,2012-09-18,3,1,9,23,0,2,1,2,0.56,0.5303,0.83,0.2239,11,67,78 +14924,2012-09-19,3,1,9,0,0,3,1,2,0.56,0.5303,0.68,0.2836,3,23,26 +14925,2012-09-19,3,1,9,1,0,3,1,2,0.54,0.5152,0.68,0.2537,0,12,12 +14926,2012-09-19,3,1,9,2,0,3,1,2,0.54,0.5152,0.64,0.3284,0,3,3 +14927,2012-09-19,3,1,9,3,0,3,1,2,0.52,0.5,0.63,0.2537,1,4,5 +14928,2012-09-19,3,1,9,4,0,3,1,2,0.52,0.5,0.68,0.3582,0,10,10 +14929,2012-09-19,3,1,9,5,0,3,1,2,0.52,0.5,0.68,0.2836,2,54,56 +14930,2012-09-19,3,1,9,6,0,3,1,1,0.52,0.5,0.63,0.2537,6,166,172 +14931,2012-09-19,3,1,9,7,0,3,1,1,0.52,0.5,0.63,0.2239,16,529,545 +14932,2012-09-19,3,1,9,8,0,3,1,1,0.52,0.5,0.59,0.2239,39,758,797 +14933,2012-09-19,3,1,9,9,0,3,1,1,0.54,0.5152,0.6,0.2537,26,336,362 +14934,2012-09-19,3,1,9,10,0,3,1,1,0.56,0.5303,0.52,0.194,23,169,192 +14935,2012-09-19,3,1,9,11,0,3,1,1,0.6,0.6212,0.43,0.2537,44,203,247 +14936,2012-09-19,3,1,9,12,0,3,1,1,0.6,0.6212,0.43,0.2537,53,260,313 +14937,2012-09-19,3,1,9,13,0,3,1,1,0.6,0.6212,0.4,0.2537,55,234,289 +14938,2012-09-19,3,1,9,14,0,3,1,1,0.62,0.6212,0.38,0.2537,59,227,286 +14939,2012-09-19,3,1,9,15,0,3,1,1,0.62,0.6212,0.35,0,57,254,311 +14940,2012-09-19,3,1,9,16,0,3,1,1,0.62,0.6212,0.35,0,69,397,466 +14941,2012-09-19,3,1,9,17,0,3,1,1,0.6,0.6212,0.38,0.2239,74,812,886 +14942,2012-09-19,3,1,9,18,0,3,1,1,0.58,0.5455,0.4,0.2836,85,807,892 
+14943,2012-09-19,3,1,9,19,0,3,1,1,0.56,0.5303,0.43,0.2239,72,539,611 +14944,2012-09-19,3,1,9,20,0,3,1,1,0.52,0.5,0.48,0.1642,31,378,409 +14945,2012-09-19,3,1,9,21,0,3,1,1,0.5,0.4848,0.59,0.1642,25,324,349 +14946,2012-09-19,3,1,9,22,0,3,1,1,0.5,0.4848,0.63,0.0896,31,198,229 +14947,2012-09-19,3,1,9,23,0,3,1,1,0.48,0.4697,0.67,0.0896,17,106,123 +14948,2012-09-20,3,1,9,0,0,4,1,1,0.5,0.4848,0.63,0,12,46,58 +14949,2012-09-20,3,1,9,1,0,4,1,1,0.46,0.4545,0.72,0.1045,4,14,18 +14950,2012-09-20,3,1,9,2,0,4,1,1,0.46,0.4545,0.72,0.0896,2,9,11 +14951,2012-09-20,3,1,9,3,0,4,1,1,0.44,0.4394,0.77,0,0,6,6 +14952,2012-09-20,3,1,9,4,0,4,1,1,0.44,0.4394,0.77,0.1045,0,6,6 +14953,2012-09-20,3,1,9,5,0,4,1,1,0.44,0.4394,0.77,0.1045,4,52,56 +14954,2012-09-20,3,1,9,6,0,4,1,1,0.44,0.4394,0.77,0,9,161,170 +14955,2012-09-20,3,1,9,7,0,4,1,1,0.46,0.4545,0.82,0.1045,20,514,534 +14956,2012-09-20,3,1,9,8,0,4,1,1,0.5,0.4848,0.68,0.0896,44,746,790 +14957,2012-09-20,3,1,9,9,0,4,1,1,0.56,0.5303,0.6,0.0896,25,330,355 +14958,2012-09-20,3,1,9,10,0,4,1,1,0.58,0.5455,0.53,0.0896,33,157,190 +14959,2012-09-20,3,1,9,11,0,4,1,1,0.62,0.6212,0.43,0,37,206,243 +14960,2012-09-20,3,1,9,12,0,4,1,1,0.62,0.6212,0.43,0.1343,57,233,290 +14961,2012-09-20,3,1,9,13,0,4,1,1,0.64,0.6212,0.44,0.1343,68,268,336 +14962,2012-09-20,3,1,9,14,0,4,1,1,0.64,0.6212,0.47,0.194,76,222,298 +14963,2012-09-20,3,1,9,15,0,4,1,1,0.66,0.6212,0.47,0.194,60,231,291 +14964,2012-09-20,3,1,9,16,0,4,1,1,0.64,0.6212,0.47,0.1642,72,385,457 +14965,2012-09-20,3,1,9,17,0,4,1,1,0.64,0.6212,0.5,0.2239,91,885,976 +14966,2012-09-20,3,1,9,18,0,4,1,1,0.6,0.6212,0.56,0.2537,119,781,900 +14967,2012-09-20,3,1,9,19,0,4,1,1,0.58,0.5455,0.6,0.2239,69,534,603 +14968,2012-09-20,3,1,9,20,0,4,1,1,0.56,0.5303,0.64,0.1343,57,360,417 +14969,2012-09-20,3,1,9,21,0,4,1,1,0.56,0.5303,0.64,0.194,28,246,274 +14970,2012-09-20,3,1,9,22,0,4,1,1,0.54,0.5152,0.73,0.1045,35,252,287 +14971,2012-09-20,3,1,9,23,0,4,1,1,0.54,0.5152,0.68,0.1045,17,137,154 
+14972,2012-09-21,3,1,9,0,0,5,1,1,0.54,0.5152,0.68,0,13,53,66 +14973,2012-09-21,3,1,9,1,0,5,1,1,0.52,0.5,0.72,0,11,38,49 +14974,2012-09-21,3,1,9,2,0,5,1,1,0.52,0.5,0.72,0,6,8,14 +14975,2012-09-21,3,1,9,3,0,5,1,1,0.5,0.4848,0.77,0.0896,1,11,12 +14976,2012-09-21,3,1,9,4,0,5,1,1,0.5,0.4848,0.77,0,1,9,10 +14977,2012-09-21,3,1,9,5,0,5,1,1,0.5,0.4848,0.82,0,2,47,49 +14978,2012-09-21,3,1,9,6,0,5,1,1,0.5,0.4848,0.77,0.0896,19,146,165 +14979,2012-09-21,3,1,9,7,0,5,1,1,0.52,0.5,0.77,0,35,468,503 +14980,2012-09-21,3,1,9,8,0,5,1,1,0.54,0.5152,0.73,0.1343,31,726,757 +14981,2012-09-21,3,1,9,9,0,5,1,1,0.58,0.5455,0.68,0.1343,35,348,383 +14982,2012-09-21,3,1,9,10,0,5,1,1,0.62,0.6061,0.61,0.2239,44,181,225 +14983,2012-09-21,3,1,9,11,0,5,1,1,0.66,0.6212,0.57,0.2239,79,242,321 +14984,2012-09-21,3,1,9,12,0,5,1,1,0.68,0.6364,0.57,0.1642,84,305,389 +14985,2012-09-21,3,1,9,13,0,5,1,1,0.7,0.6515,0.54,0.1642,105,316,421 +14986,2012-09-21,3,1,9,14,0,5,1,1,0.7,0.6515,0.54,0.2985,113,299,412 +14987,2012-09-21,3,1,9,15,0,5,1,1,0.7,0.6515,0.54,0.2239,90,333,423 +14988,2012-09-21,3,1,9,16,0,5,1,1,0.7,0.6515,0.54,0.194,103,464,567 +14989,2012-09-21,3,1,9,17,0,5,1,1,0.68,0.6364,0.57,0.2836,107,739,846 +14990,2012-09-21,3,1,9,18,0,5,1,1,0.66,0.6212,0.61,0.2537,106,699,805 +14991,2012-09-21,3,1,9,19,0,5,1,1,0.64,0.6061,0.65,0.2537,95,493,588 +14992,2012-09-21,3,1,9,20,0,5,1,1,0.62,0.6061,0.69,0.194,54,315,369 +14993,2012-09-21,3,1,9,21,0,5,1,1,0.6,0.5909,0.73,0.2537,46,266,312 +14994,2012-09-21,3,1,9,22,0,5,1,1,0.6,0.5909,0.73,0.194,47,248,295 +14995,2012-09-21,3,1,9,23,0,5,1,1,0.6,0.5909,0.73,0.3284,23,163,186 +14996,2012-09-22,3,1,9,0,0,6,0,1,0.6,0.5909,0.73,0.2836,32,140,172 +14997,2012-09-22,3,1,9,1,0,6,0,1,0.58,0.5455,0.83,0.2985,18,106,124 +14998,2012-09-22,3,1,9,2,0,6,0,1,0.56,0.5303,0.88,0.2537,10,73,83 +14999,2012-09-22,3,1,9,3,0,6,0,1,0.56,0.5303,0.83,0.2836,6,39,45 +15000,2012-09-22,3,1,9,4,0,6,0,1,0.56,0.5303,0.83,0.2836,5,10,15 
+15001,2012-09-22,3,1,9,5,0,6,0,1,0.56,0.5303,0.83,0.3284,2,15,17 +15002,2012-09-22,3,1,9,6,0,6,0,1,0.56,0.5303,0.83,0.2985,6,35,41 +15003,2012-09-22,3,1,9,7,0,6,0,1,0.58,0.5455,0.78,0.2537,7,70,77 +15004,2012-09-22,3,1,9,8,0,6,0,1,0.6,0.5909,0.73,0.2985,21,197,218 +15005,2012-09-22,3,1,9,9,0,6,0,1,0.64,0.6061,0.69,0.2537,88,289,377 +15006,2012-09-22,3,1,9,10,0,6,0,1,0.66,0.6212,0.65,0.3284,124,350,474 +15007,2012-09-22,3,1,9,11,0,6,0,1,0.7,0.6515,0.58,0.2836,189,437,626 +15008,2012-09-22,3,1,9,12,0,6,0,1,0.72,0.6667,0.54,0.2985,228,460,688 +15009,2012-09-22,3,1,9,13,0,6,0,1,0.74,0.6667,0.51,0.3582,273,434,707 +15010,2012-09-22,3,1,9,14,0,6,0,1,0.76,0.6818,0.48,0.2836,250,404,654 +15011,2012-09-22,3,1,9,15,0,6,0,1,0.74,0.6667,0.48,0.3881,307,443,750 +15012,2012-09-22,3,1,9,16,0,6,0,1,0.74,0.6667,0.51,0.3284,253,427,680 +15013,2012-09-22,3,1,9,17,0,6,0,1,0.74,0.6667,0.51,0.3284,195,451,646 +15014,2012-09-22,3,1,9,18,0,6,0,1,0.72,0.6667,0.54,0.2836,171,427,598 +15015,2012-09-22,3,1,9,19,0,6,0,1,0.7,0.6515,0.58,0.194,99,308,407 +15016,2012-09-22,3,1,9,20,0,6,0,1,0.7,0.6515,0.54,0.2537,76,249,325 +15017,2012-09-22,3,1,9,21,0,6,0,1,0.64,0.6212,0.57,0.2537,59,202,261 +15018,2012-09-22,3,1,9,22,0,6,0,1,0.62,0.6212,0.57,0.194,59,180,239 +15019,2012-09-22,3,1,9,23,0,6,0,1,0.62,0.6212,0.5,0.194,34,137,171 +15020,2012-09-23,4,1,9,0,0,0,0,1,0.62,0.6212,0.38,0.2537,34,146,180 +15021,2012-09-23,4,1,9,1,0,0,0,1,0.54,0.5152,0.52,0.3582,23,119,142 +15022,2012-09-23,4,1,9,2,0,0,0,1,0.52,0.5,0.48,0.2537,18,77,95 +15023,2012-09-23,4,1,9,3,0,0,0,1,0.48,0.4697,0.55,0.2985,13,41,54 +15024,2012-09-23,4,1,9,4,0,0,0,1,0.46,0.4545,0.55,0.2239,1,9,10 +15025,2012-09-23,4,1,9,5,0,0,0,1,0.46,0.4545,0.59,0.2537,1,5,6 +15026,2012-09-23,4,1,9,6,0,0,0,1,0.44,0.4394,0.62,0.2537,5,11,16 +15027,2012-09-23,4,1,9,7,0,0,0,1,0.46,0.4545,0.59,0.2239,9,48,57 +15028,2012-09-23,4,1,9,8,0,0,0,1,0.48,0.4697,0.55,0.3582,38,137,175 +15029,2012-09-23,4,1,9,9,0,0,0,1,0.5,0.4848,0.51,0.3284,71,205,276 
+15030,2012-09-23,4,1,9,10,0,0,0,1,0.54,0.5152,0.45,0.2537,138,351,489 +15031,2012-09-23,4,1,9,11,0,0,0,1,0.56,0.5303,0.46,0.2537,186,384,570 +15032,2012-09-23,4,1,9,12,0,0,0,1,0.56,0.5303,0.43,0.1045,250,526,776 +15033,2012-09-23,4,1,9,13,0,0,0,1,0.6,0.6212,0.38,0.1642,257,445,702 +15034,2012-09-23,4,1,9,14,0,0,0,1,0.6,0.6212,0.38,0.194,266,400,666 +15035,2012-09-23,4,1,9,15,0,0,0,1,0.6,0.6212,0.33,0.2239,265,375,640 +15036,2012-09-23,4,1,9,16,0,0,0,1,0.6,0.6212,0.33,0.3284,254,437,691 +15037,2012-09-23,4,1,9,17,0,0,0,1,0.56,0.5303,0.35,0.2985,227,496,723 +15038,2012-09-23,4,1,9,18,0,0,0,1,0.54,0.5152,0.37,0.194,135,405,540 +15039,2012-09-23,4,1,9,19,0,0,0,1,0.52,0.5,0.42,0.1642,82,331,413 +15040,2012-09-23,4,1,9,20,0,0,0,1,0.52,0.5,0.42,0.1642,60,192,252 +15041,2012-09-23,4,1,9,21,0,0,0,2,0.52,0.5,0.48,0.1045,51,145,196 +15042,2012-09-23,4,1,9,22,0,0,0,1,0.52,0.5,0.48,0,40,98,138 +15043,2012-09-23,4,1,9,23,0,0,0,2,0.5,0.4848,0.59,0.1045,30,70,100 +15044,2012-09-24,4,1,9,0,0,1,1,1,0.46,0.4545,0.63,0.194,10,54,64 +15045,2012-09-24,4,1,9,1,0,1,1,1,0.46,0.4545,0.67,0.1343,6,12,18 +15046,2012-09-24,4,1,9,2,0,1,1,2,0.46,0.4545,0.63,0.1343,0,8,8 +15047,2012-09-24,4,1,9,3,0,1,1,1,0.46,0.4545,0.63,0.1642,0,5,5 +15048,2012-09-24,4,1,9,4,0,1,1,1,0.44,0.4394,0.67,0.1045,0,8,8 +15049,2012-09-24,4,1,9,5,0,1,1,1,0.46,0.4545,0.63,0,2,35,37 +15050,2012-09-24,4,1,9,6,0,1,1,1,0.46,0.4545,0.59,0,6,162,168 +15051,2012-09-24,4,1,9,7,0,1,1,1,0.46,0.4545,0.67,0.1343,17,513,530 +15052,2012-09-24,4,1,9,8,0,1,1,1,0.5,0.4848,0.55,0.194,31,719,750 +15053,2012-09-24,4,1,9,9,0,1,1,1,0.52,0.5,0.52,0.194,42,286,328 +15054,2012-09-24,4,1,9,10,0,1,1,1,0.54,0.5152,0.49,0.2239,52,146,198 +15055,2012-09-24,4,1,9,11,0,1,1,1,0.56,0.5303,0.37,0.194,71,167,238 +15056,2012-09-24,4,1,9,12,0,1,1,1,0.56,0.5303,0.35,0.2985,77,265,342 +15057,2012-09-24,4,1,9,13,0,1,1,1,0.6,0.6212,0.31,0.2985,80,248,328 +15058,2012-09-24,4,1,9,14,0,1,1,1,0.6,0.6061,0.28,0,77,226,303 
+15059,2012-09-24,4,1,9,15,0,1,1,1,0.6,0.6212,0.31,0,84,216,300 +15060,2012-09-24,4,1,9,16,0,1,1,1,0.58,0.5455,0.3,0.2537,99,417,516 +15061,2012-09-24,4,1,9,17,0,1,1,1,0.58,0.5455,0.3,0,89,809,898 +15062,2012-09-24,4,1,9,18,0,1,1,1,0.54,0.5152,0.37,0.1642,95,758,853 +15063,2012-09-24,4,1,9,19,0,1,1,1,0.52,0.5,0.39,0.1343,56,534,590 +15064,2012-09-24,4,1,9,20,0,1,1,1,0.52,0.5,0.52,0.1045,52,380,432 +15065,2012-09-24,4,1,9,21,0,1,1,1,0.5,0.4848,0.51,0.1343,26,230,256 +15066,2012-09-24,4,1,9,22,0,1,1,1,0.5,0.4848,0.51,0.2239,18,154,172 +15067,2012-09-24,4,1,9,23,0,1,1,1,0.46,0.4545,0.63,0.1343,11,83,94 +15068,2012-09-25,4,1,9,0,0,2,1,1,0.46,0.4545,0.67,0.1642,8,56,64 +15069,2012-09-25,4,1,9,1,0,2,1,1,0.44,0.4394,0.72,0.1343,2,11,13 +15070,2012-09-25,4,1,9,2,0,2,1,1,0.42,0.4242,0.77,0.1343,5,9,14 +15071,2012-09-25,4,1,9,3,0,2,1,1,0.42,0.4242,0.71,0.0896,2,3,5 +15072,2012-09-25,4,1,9,4,0,2,1,1,0.42,0.4242,0.71,0.1343,2,7,9 +15073,2012-09-25,4,1,9,5,0,2,1,1,0.42,0.4242,0.77,0.0896,1,46,47 +15074,2012-09-25,4,1,9,6,0,2,1,1,0.44,0.4394,0.77,0.2239,5,189,194 +15075,2012-09-25,4,1,9,7,0,2,1,1,0.44,0.4394,0.77,0.194,17,539,556 +15076,2012-09-25,4,1,9,8,0,2,1,1,0.48,0.4697,0.67,0.194,41,764,805 +15077,2012-09-25,4,1,9,9,0,2,1,1,0.54,0.5152,0.56,0.2537,28,381,409 +15078,2012-09-25,4,1,9,10,0,2,1,1,0.56,0.5303,0.52,0.2985,45,125,170 +15079,2012-09-25,4,1,9,11,0,2,1,1,0.6,0.6212,0.46,0.2985,63,155,218 +15080,2012-09-25,4,1,9,12,0,2,1,1,0.62,0.6212,0.43,0.2836,69,233,302 +15081,2012-09-25,4,1,9,13,0,2,1,1,0.64,0.6212,0.47,0.2537,48,257,305 +15082,2012-09-25,4,1,9,14,0,2,1,1,0.66,0.6212,0.39,0.3284,80,213,293 +15083,2012-09-25,4,1,9,15,0,2,1,1,0.66,0.6212,0.36,0.3284,56,247,303 +15084,2012-09-25,4,1,9,16,0,2,1,1,0.66,0.6212,0.39,0.2985,59,436,495 +15085,2012-09-25,4,1,9,17,0,2,1,1,0.66,0.6212,0.39,0.2836,107,860,967 +15086,2012-09-25,4,1,9,18,0,2,1,1,0.64,0.6212,0.41,0.2239,64,758,822 +15087,2012-09-25,4,1,9,19,0,2,1,1,0.62,0.6212,0.5,0.2537,49,490,539 
+15088,2012-09-25,4,1,9,20,0,2,1,1,0.6,0.6212,0.56,0.2985,37,388,425 +15089,2012-09-25,4,1,9,21,0,2,1,1,0.6,0.6212,0.56,0.2985,34,263,297 +15090,2012-09-25,4,1,9,22,0,2,1,1,0.6,0.6212,0.56,0.2836,14,174,188 +15091,2012-09-25,4,1,9,23,0,2,1,1,0.6,0.6212,0.56,0.3284,9,89,98 +15092,2012-09-26,4,1,9,0,0,3,1,1,0.58,0.5455,0.64,0.2985,6,34,40 +15093,2012-09-26,4,1,9,1,0,3,1,1,0.58,0.5455,0.64,0.3284,10,19,29 +15094,2012-09-26,4,1,9,2,0,3,1,2,0.58,0.5455,0.64,0.2836,2,15,17 +15095,2012-09-26,4,1,9,3,0,3,1,2,0.58,0.5455,0.64,0.2836,4,6,10 +15096,2012-09-26,4,1,9,4,0,3,1,1,0.56,0.5303,0.73,0.2537,2,12,14 +15097,2012-09-26,4,1,9,5,0,3,1,1,0.56,0.5303,0.73,0.2239,1,35,36 +15098,2012-09-26,4,1,9,6,0,3,1,1,0.54,0.5152,0.73,0.194,3,179,182 +15099,2012-09-26,4,1,9,7,0,3,1,1,0.54,0.5152,0.77,0.1343,9,523,532 +15100,2012-09-26,4,1,9,8,0,3,1,1,0.56,0.5303,0.73,0.2985,30,808,838 +15101,2012-09-26,4,1,9,9,0,3,1,1,0.6,0.6061,0.64,0.194,27,307,334 +15102,2012-09-26,4,1,9,10,0,3,1,1,0.6,0.5909,0.73,0.2836,29,156,185 +15103,2012-09-26,4,1,9,11,0,3,1,1,0.64,0.6212,0.61,0.2537,46,181,227 +15104,2012-09-26,4,1,9,12,0,3,1,1,0.68,0.6364,0.61,0.2239,56,260,316 +15105,2012-09-26,4,1,9,13,0,3,1,1,0.7,0.6515,0.58,0.2239,67,273,340 +15106,2012-09-26,4,1,9,14,0,3,1,1,0.74,0.6667,0.48,0.2836,60,217,277 +15107,2012-09-26,4,1,9,15,0,3,1,1,0.74,0.6667,0.48,0.2985,69,231,300 +15108,2012-09-26,4,1,9,16,0,3,1,1,0.74,0.6667,0.45,0.2836,71,397,468 +15109,2012-09-26,4,1,9,17,0,3,1,1,0.74,0.6667,0.48,0.2985,77,876,953 +15110,2012-09-26,4,1,9,18,0,3,1,1,0.74,0.6667,0.48,0.2239,69,815,884 +15111,2012-09-26,4,1,9,19,0,3,1,1,0.7,0.6515,0.54,0.1642,38,589,627 +15112,2012-09-26,4,1,9,20,0,3,1,1,0.66,0.6212,0.69,0.194,45,389,434 +15113,2012-09-26,4,1,9,21,0,3,1,1,0.66,0.6212,0.65,0.194,32,328,360 +15114,2012-09-26,4,1,9,22,0,3,1,3,0.62,0.6061,0.69,0.194,21,194,215 +15115,2012-09-26,4,1,9,23,0,3,1,2,0.6,0.5758,0.78,0.2537,13,102,115 +15116,2012-09-27,4,1,9,0,0,4,1,1,0.6,0.5758,0.78,0.0896,5,60,65 
+15117,2012-09-27,4,1,9,1,0,4,1,1,0.6,0.5758,0.78,0.0896,5,16,21 +15118,2012-09-27,4,1,9,2,0,4,1,1,0.6,0.5758,0.78,0.0896,3,13,16 +15119,2012-09-27,4,1,9,3,0,4,1,1,0.6,0.5758,0.78,0,0,7,7 +15120,2012-09-27,4,1,9,4,0,4,1,1,0.56,0.5303,0.94,0.1343,2,6,8 +15121,2012-09-27,4,1,9,5,0,4,1,1,0.58,0.5455,0.83,0,2,64,66 +15122,2012-09-27,4,1,9,6,0,4,1,1,0.56,0.5303,0.88,0.0896,5,164,169 +15123,2012-09-27,4,1,9,7,0,4,1,1,0.6,0.5758,0.78,0,12,546,558 +15124,2012-09-27,4,1,9,8,0,4,1,1,0.62,0.6061,0.71,0.0896,20,774,794 +15125,2012-09-27,4,1,9,9,0,4,1,1,0.66,0.6212,0.65,0,30,305,335 +15126,2012-09-27,4,1,9,10,0,4,1,2,0.7,0.6515,0.51,0.2836,40,168,208 +15127,2012-09-27,4,1,9,11,0,4,1,2,0.72,0.6667,0.51,0.1642,67,227,294 +15128,2012-09-27,4,1,9,12,0,4,1,2,0.74,0.6667,0.48,0.1343,63,272,335 +15129,2012-09-27,4,1,9,13,0,4,1,2,0.74,0.6667,0.48,0,61,265,326 +15130,2012-09-27,4,1,9,14,0,4,1,2,0.76,0.6818,0.45,0.0896,44,214,258 +15131,2012-09-27,4,1,9,15,0,4,1,2,0.76,0.6818,0.45,0,41,255,296 +15132,2012-09-27,4,1,9,16,0,4,1,2,0.72,0.6667,0.58,0.3284,73,399,472 +15133,2012-09-27,4,1,9,17,0,4,1,2,0.66,0.6212,0.69,0.2985,87,818,905 +15134,2012-09-27,4,1,9,18,0,4,1,2,0.66,0.6212,0.69,0.2239,77,822,899 +15135,2012-09-27,4,1,9,19,0,4,1,1,0.66,0.6212,0.65,0.1343,48,511,559 +15136,2012-09-27,4,1,9,20,0,4,1,2,0.64,0.6061,0.69,0.2239,45,412,457 +15137,2012-09-27,4,1,9,21,0,4,1,3,0.62,0.5758,0.83,0.2985,18,235,253 +15138,2012-09-27,4,1,9,22,0,4,1,3,0.62,0.5758,0.83,0.2985,2,45,47 +15139,2012-09-27,4,1,9,23,0,4,1,3,0.62,0.5758,0.83,0.1642,1,44,45 +15140,2012-09-28,4,1,9,0,0,5,1,3,0.6,0.5455,0.88,0.1642,2,15,17 +15141,2012-09-28,4,1,9,1,0,5,1,3,0.6,0.5455,0.88,0,0,13,13 +15142,2012-09-28,4,1,9,2,0,5,1,3,0.6,0.5455,0.88,0.2985,1,9,10 +15143,2012-09-28,4,1,9,3,0,5,1,2,0.6,0.5606,0.83,0.1343,0,7,7 +15144,2012-09-28,4,1,9,4,0,5,1,3,0.56,0.5303,0.88,0.1045,2,10,12 +15145,2012-09-28,4,1,9,5,0,5,1,1,0.56,0.5303,0.88,0,1,37,38 +15146,2012-09-28,4,1,9,6,0,5,1,2,0.56,0.5303,0.94,0.0896,2,136,138 
+15147,2012-09-28,4,1,9,7,0,5,1,2,0.56,0.5303,0.94,0,10,384,394 +15148,2012-09-28,4,1,9,8,0,5,1,1,0.6,0.5606,0.83,0.0896,31,674,705 +15149,2012-09-28,4,1,9,9,0,5,1,2,0.62,0.5758,0.83,0.1045,27,399,426 +15150,2012-09-28,4,1,9,10,0,5,1,2,0.64,0.5909,0.78,0.1343,51,194,245 +15151,2012-09-28,4,1,9,11,0,5,1,2,0.66,0.6212,0.69,0,109,252,361 +15152,2012-09-28,4,1,9,12,0,5,1,2,0.7,0.6515,0.54,0.2537,94,286,380 +15153,2012-09-28,4,1,9,13,0,5,1,2,0.7,0.6515,0.51,0.2537,106,304,410 +15154,2012-09-28,4,1,9,14,0,5,1,2,0.7,0.6515,0.54,0.2537,105,294,399 +15155,2012-09-28,4,1,9,15,0,5,1,2,0.68,0.6364,0.54,0.2836,104,288,392 +15156,2012-09-28,4,1,9,16,0,5,1,2,0.66,0.6212,0.5,0.2239,83,419,502 +15157,2012-09-28,4,1,9,17,0,5,1,2,0.66,0.6212,0.54,0.2985,101,707,808 +15158,2012-09-28,4,1,9,18,0,5,1,2,0.64,0.6212,0.5,0.2239,58,609,667 +15159,2012-09-28,4,1,9,19,0,5,1,3,0.62,0.6212,0.5,0.2537,38,470,508 +15160,2012-09-28,4,1,9,20,0,5,1,3,0.6,0.6212,0.53,0.1642,32,304,336 +15161,2012-09-28,4,1,9,21,0,5,1,3,0.6,0.6212,0.53,0.194,32,205,237 +15162,2012-09-28,4,1,9,22,0,5,1,2,0.58,0.5455,0.53,0.2239,30,190,220 +15163,2012-09-28,4,1,9,23,0,5,1,2,0.56,0.5303,0.56,0.194,26,164,190 +15164,2012-09-29,4,1,9,0,0,6,0,1,0.54,0.5152,0.6,0.2239,15,134,149 +15165,2012-09-29,4,1,9,1,0,6,0,1,0.54,0.5152,0.6,0.2239,12,89,101 +15166,2012-09-29,4,1,9,2,0,6,0,2,0.54,0.5152,0.56,0.2239,12,80,92 +15167,2012-09-29,4,1,9,3,0,6,0,2,0.52,0.5,0.59,0.194,4,25,29 +15168,2012-09-29,4,1,9,4,0,6,0,1,0.52,0.5,0.59,0.2537,6,8,14 +15169,2012-09-29,4,1,9,5,0,6,0,1,0.5,0.4848,0.63,0.194,2,13,15 +15170,2012-09-29,4,1,9,6,0,6,0,1,0.5,0.4848,0.63,0.2537,4,33,37 +15171,2012-09-29,4,1,9,7,0,6,0,1,0.48,0.4697,0.67,0.2836,12,61,73 +15172,2012-09-29,4,1,9,8,0,6,0,1,0.5,0.4848,0.68,0.2836,37,174,211 +15173,2012-09-29,4,1,9,9,0,6,0,1,0.5,0.4848,0.63,0.194,69,263,332 +15174,2012-09-29,4,1,9,10,0,6,0,1,0.54,0.5152,0.56,0.2537,134,338,472 +15175,2012-09-29,4,1,9,11,0,6,0,1,0.56,0.5303,0.52,0.2836,217,446,663 
+15176,2012-09-29,4,1,9,12,0,6,0,1,0.62,0.6212,0.41,0.2985,191,491,682 +15177,2012-09-29,4,1,9,13,0,6,0,1,0.62,0.6212,0.41,0.2985,233,453,686 +15178,2012-09-29,4,1,9,14,0,6,0,1,0.6,0.6212,0.4,0.2537,302,448,750 +15179,2012-09-29,4,1,9,15,0,6,0,1,0.6,0.6212,0.43,0.2537,271,456,727 +15180,2012-09-29,4,1,9,16,0,6,0,1,0.62,0.6212,0.41,0.2537,275,447,722 +15181,2012-09-29,4,1,9,17,0,6,0,1,0.6,0.6212,0.38,0.3582,256,456,712 +15182,2012-09-29,4,1,9,18,0,6,0,1,0.56,0.5303,0.46,0.194,174,420,594 +15183,2012-09-29,4,1,9,19,0,6,0,1,0.52,0.5,0.55,0.1642,119,351,470 +15184,2012-09-29,4,1,9,20,0,6,0,1,0.52,0.5,0.55,0.194,88,227,315 +15185,2012-09-29,4,1,9,21,0,6,0,1,0.52,0.5,0.55,0.1343,73,219,292 +15186,2012-09-29,4,1,9,22,0,6,0,1,0.5,0.4848,0.63,0.1045,48,173,221 +15187,2012-09-29,4,1,9,23,0,6,0,1,0.5,0.4848,0.59,0.0896,35,161,196 +15188,2012-09-30,4,1,9,0,0,0,0,1,0.5,0.4848,0.59,0,35,112,147 +15189,2012-09-30,4,1,9,1,0,0,0,1,0.5,0.4848,0.59,0,31,85,116 +15190,2012-09-30,4,1,9,2,0,0,0,1,0.48,0.4697,0.63,0,21,71,92 +15191,2012-09-30,4,1,9,3,0,0,0,1,0.46,0.4545,0.67,0.0896,17,41,58 +15192,2012-09-30,4,1,9,4,0,0,0,1,0.46,0.4545,0.67,0.0896,1,6,7 +15193,2012-09-30,4,1,9,5,0,0,0,1,0.44,0.4394,0.72,0,1,9,10 +15194,2012-09-30,4,1,9,6,0,0,0,1,0.44,0.4394,0.72,0.0896,2,20,22 +15195,2012-09-30,4,1,9,7,0,0,0,1,0.44,0.4394,0.77,0,14,43,57 +15196,2012-09-30,4,1,9,8,0,0,0,1,0.48,0.4697,0.67,0.1045,18,126,144 +15197,2012-09-30,4,1,9,9,0,0,0,1,0.52,0.5,0.63,0.1045,74,211,285 +15198,2012-09-30,4,1,9,10,0,0,0,1,0.58,0.5455,0.46,0.1642,160,319,479 +15199,2012-09-30,4,1,9,11,0,0,0,1,0.6,0.6212,0.43,0.2239,222,369,591 +15200,2012-09-30,4,1,9,12,0,0,0,1,0.62,0.6212,0.41,0.2985,206,474,680 +15201,2012-09-30,4,1,9,13,0,0,0,1,0.62,0.6212,0.41,0.2537,180,414,594 +15202,2012-09-30,4,1,9,14,0,0,0,1,0.62,0.6212,0.41,0.2537,208,404,612 +15203,2012-09-30,4,1,9,15,0,0,0,1,0.64,0.6212,0.36,0.2537,230,419,649 +15204,2012-09-30,4,1,9,16,0,0,0,1,0.64,0.6212,0.36,0.2239,202,446,648 
+15205,2012-09-30,4,1,9,17,0,0,0,1,0.62,0.6212,0.35,0.2836,195,380,575 +15206,2012-09-30,4,1,9,18,0,0,0,1,0.52,0.5,0.59,0.4478,91,310,401 +15207,2012-09-30,4,1,9,19,0,0,0,3,0.5,0.4848,0.72,0.1343,34,223,257 +15208,2012-09-30,4,1,9,20,0,0,0,3,0.5,0.4848,0.72,0.1343,31,163,194 +15209,2012-09-30,4,1,9,21,0,0,0,1,0.5,0.4848,0.68,0,19,104,123 +15210,2012-09-30,4,1,9,22,0,0,0,1,0.48,0.4697,0.72,0,15,76,91 +15211,2012-09-30,4,1,9,23,0,0,0,1,0.48,0.4697,0.72,0.0896,8,49,57 +15212,2012-10-01,4,1,10,0,0,1,1,1,0.46,0.4545,0.72,0.1045,6,39,45 +15213,2012-10-01,4,1,10,1,0,1,1,1,0.44,0.4394,0.77,0.0896,5,13,18 +15214,2012-10-01,4,1,10,2,0,1,1,1,0.46,0.4545,0.72,0,6,6,12 +15215,2012-10-01,4,1,10,3,0,1,1,1,0.44,0.4394,0.77,0,1,6,7 +15216,2012-10-01,4,1,10,4,0,1,1,1,0.42,0.4242,0.82,0.1045,0,10,10 +15217,2012-10-01,4,1,10,5,0,1,1,1,0.44,0.4394,0.77,0,2,34,36 +15218,2012-10-01,4,1,10,6,0,1,1,1,0.44,0.4394,0.77,0.1045,8,147,155 +15219,2012-10-01,4,1,10,7,0,1,1,1,0.44,0.4394,0.77,0.1642,13,470,483 +15220,2012-10-01,4,1,10,8,0,1,1,2,0.46,0.4545,0.77,0.1045,40,744,784 +15221,2012-10-01,4,1,10,9,0,1,1,2,0.52,0.5,0.63,0,26,314,340 +15222,2012-10-01,4,1,10,10,0,1,1,1,0.54,0.5152,0.56,0,44,135,179 +15223,2012-10-01,4,1,10,11,0,1,1,1,0.58,0.5455,0.46,0,76,196,272 +15224,2012-10-01,4,1,10,12,0,1,1,2,0.6,0.6212,0.43,0.1642,61,262,323 +15225,2012-10-01,4,1,10,13,0,1,1,2,0.6,0.6212,0.43,0.1642,80,225,305 +15226,2012-10-01,4,1,10,14,0,1,1,2,0.6,0.6212,0.43,0.1045,51,193,244 +15227,2012-10-01,4,1,10,15,0,1,1,2,0.62,0.6212,0.43,0.1343,95,234,329 +15228,2012-10-01,4,1,10,16,0,1,1,2,0.62,0.6212,0.46,0.2537,51,408,459 +15229,2012-10-01,4,1,10,17,0,1,1,3,0.56,0.5303,0.6,0.2537,65,791,856 +15230,2012-10-01,4,1,10,18,0,1,1,3,0.56,0.5303,0.64,0.1045,42,571,613 +15231,2012-10-01,4,1,10,19,0,1,1,2,0.54,0.5152,0.68,0.1045,33,483,516 +15232,2012-10-01,4,1,10,20,0,1,1,3,0.54,0.5152,0.68,0,11,251,262 +15233,2012-10-01,4,1,10,21,0,1,1,2,0.54,0.5152,0.73,0.1343,13,205,218 
+15234,2012-10-01,4,1,10,22,0,1,1,3,0.54,0.5152,0.77,0.0896,17,190,207 +15235,2012-10-01,4,1,10,23,0,1,1,3,0.54,0.5152,0.77,0,17,88,105 +15236,2012-10-02,4,1,10,0,0,2,1,2,0.56,0.5303,0.73,0,1,30,31 +15237,2012-10-02,4,1,10,1,0,2,1,2,0.54,0.5152,0.77,0.1045,0,11,11 +15238,2012-10-02,4,1,10,2,0,2,1,2,0.54,0.5152,0.77,0.1045,0,2,2 +15239,2012-10-02,4,1,10,3,0,2,1,2,0.54,0.5152,0.88,0.194,1,4,5 +15240,2012-10-02,4,1,10,4,0,2,1,2,0.56,0.5303,0.83,0.0896,2,8,10 +15241,2012-10-02,4,1,10,5,0,2,1,2,0.56,0.5303,0.83,0.1343,1,42,43 +15242,2012-10-02,4,1,10,6,0,2,1,3,0.58,0.5455,0.83,0.1045,3,176,179 +15243,2012-10-02,4,1,10,7,0,2,1,3,0.58,0.5455,0.83,0.1045,4,256,260 +15244,2012-10-02,4,1,10,8,0,2,1,3,0.6,0.5455,0.88,0,6,128,134 +15245,2012-10-02,4,1,10,9,0,2,1,3,0.58,0.5455,0.88,0.1343,3,83,86 +15246,2012-10-02,4,1,10,10,0,2,1,3,0.58,0.5455,0.88,0.1343,3,42,45 +15247,2012-10-02,4,1,10,11,0,2,1,3,0.58,0.5455,0.94,0.1343,7,92,99 +15248,2012-10-02,4,1,10,12,0,2,1,2,0.6,0.5455,0.88,0.194,6,98,104 +15249,2012-10-02,4,1,10,13,0,2,1,2,0.6,0.5455,0.88,0.1343,6,148,154 +15250,2012-10-02,4,1,10,14,0,2,1,3,0.6,0.5455,0.88,0.1343,16,147,163 +15251,2012-10-02,4,1,10,15,0,2,1,3,0.62,0.5758,0.83,0.1343,14,195,209 +15252,2012-10-02,4,1,10,16,0,2,1,3,0.62,0.5606,0.88,0.1343,46,328,374 +15253,2012-10-02,4,1,10,17,0,2,1,3,0.62,0.5606,0.88,0.1045,38,677,715 +15254,2012-10-02,4,1,10,18,0,2,1,3,0.62,0.5455,0.94,0.0896,48,639,687 +15255,2012-10-02,4,1,10,19,0,2,1,3,0.62,0.5455,0.94,0,27,368,395 +15256,2012-10-02,4,1,10,20,0,2,1,3,0.62,0.5455,0.94,0.1343,20,286,306 +15257,2012-10-02,4,1,10,21,0,2,1,3,0.62,0.5455,0.94,0.1045,24,265,289 +15258,2012-10-02,4,1,10,22,0,2,1,2,0.62,0.5455,0.94,0.1045,28,212,240 +15259,2012-10-02,4,1,10,23,0,2,1,2,0.62,0.5455,0.94,0,11,87,98 +15260,2012-10-03,4,1,10,0,0,3,1,3,0.62,0.5455,0.94,0,5,47,52 +15261,2012-10-03,4,1,10,1,0,3,1,2,0.62,0.5455,0.94,0,3,16,19 +15262,2012-10-03,4,1,10,2,0,3,1,2,0.62,0.5455,0.94,0,2,7,9 
+15263,2012-10-03,4,1,10,3,0,3,1,2,0.6,0.5152,0.94,0,0,7,7 +15264,2012-10-03,4,1,10,4,0,3,1,2,0.6,0.5,1,0,2,9,11 +15265,2012-10-03,4,1,10,5,0,3,1,2,0.62,0.5455,0.94,0.1045,2,32,34 +15266,2012-10-03,4,1,10,6,0,3,1,2,0.6,0.5152,0.94,0.1343,4,173,177 +15267,2012-10-03,4,1,10,7,0,3,1,2,0.6,0.5152,0.94,0.194,11,504,515 +15268,2012-10-03,4,1,10,8,0,3,1,2,0.62,0.5606,0.88,0.1045,28,781,809 +15269,2012-10-03,4,1,10,9,0,3,1,2,0.62,0.5758,0.83,0.194,30,332,362 +15270,2012-10-03,4,1,10,10,0,3,1,1,0.66,0.6061,0.78,0.1642,29,146,175 +15271,2012-10-03,4,1,10,11,0,3,1,1,0.66,0.6061,0.78,0.1642,52,178,230 +15272,2012-10-03,4,1,10,12,0,3,1,2,0.7,0.6515,0.7,0.1045,69,289,358 +15273,2012-10-03,4,1,10,13,0,3,1,2,0.7,0.6667,0.74,0.1045,55,224,279 +15274,2012-10-03,4,1,10,14,0,3,1,2,0.72,0.6818,0.7,0.1642,56,195,251 +15275,2012-10-03,4,1,10,15,0,3,1,2,0.72,0.6818,0.62,0.1642,59,260,319 +15276,2012-10-03,4,1,10,16,0,3,1,2,0.72,0.6667,0.58,0,42,436,478 +15277,2012-10-03,4,1,10,17,0,3,1,1,0.7,0.6515,0.65,0,84,833,917 +15278,2012-10-03,4,1,10,18,0,3,1,1,0.7,0.6515,0.65,0,54,756,810 +15279,2012-10-03,4,1,10,19,0,3,1,1,0.7,0.6515,0.65,0,49,544,593 +15280,2012-10-03,4,1,10,20,0,3,1,1,0.7,0.6515,0.65,0,48,449,497 +15281,2012-10-03,4,1,10,21,0,3,1,2,0.66,0.6212,0.74,0,14,195,209 +15282,2012-10-03,4,1,10,22,0,3,1,1,0.66,0.6212,0.74,0,9,232,241 +15283,2012-10-03,4,1,10,23,0,3,1,2,0.66,0.6061,0.78,0,21,199,220 +15284,2012-10-04,4,1,10,0,0,4,1,3,0.64,0.5758,0.89,0,11,65,76 +15285,2012-10-04,4,1,10,1,0,4,1,2,0.62,0.5455,0.94,0,2,23,25 +15286,2012-10-04,4,1,10,2,0,4,1,1,0.64,0.5758,0.89,0,3,10,13 +15287,2012-10-04,4,1,10,3,0,4,1,2,0.62,0.5606,0.88,0,5,6,11 +15288,2012-10-04,4,1,10,4,0,4,1,2,0.64,0.5758,0.89,0,0,10,10 +15289,2012-10-04,4,1,10,5,0,4,1,3,0.64,0.5758,0.89,0.1045,2,37,39 +15290,2012-10-04,4,1,10,6,0,4,1,3,0.62,0.5455,0.94,0.0896,1,132,133 +15291,2012-10-04,4,1,10,7,0,4,1,2,0.64,0.5758,0.89,0.0896,12,379,391 +15292,2012-10-04,4,1,10,8,0,4,1,1,0.64,0.5758,0.89,0,27,711,738 
+15293,2012-10-04,4,1,10,9,0,4,1,2,0.64,0.5758,0.89,0.194,40,319,359 +15294,2012-10-04,4,1,10,10,0,4,1,2,0.66,0.6061,0.83,0.1343,27,150,177 +15295,2012-10-04,4,1,10,11,0,4,1,2,0.7,0.6515,0.7,0.1045,38,176,214 +15296,2012-10-04,4,1,10,12,0,4,1,2,0.72,0.6667,0.54,0.194,57,231,288 +15297,2012-10-04,4,1,10,13,0,4,1,2,0.72,0.6667,0.58,0.3284,63,231,294 +15298,2012-10-04,4,1,10,14,0,4,1,1,0.7,0.6515,0.61,0.2537,56,211,267 +15299,2012-10-04,4,1,10,15,0,4,1,1,0.72,0.6667,0.54,0.2836,77,248,325 +15300,2012-10-04,4,1,10,16,0,4,1,1,0.7,0.6515,0.54,0.2836,86,411,497 +15301,2012-10-04,4,1,10,17,0,4,1,1,0.7,0.6515,0.51,0.2239,112,789,901 +15302,2012-10-04,4,1,10,18,0,4,1,1,0.66,0.6212,0.57,0.1045,75,812,887 +15303,2012-10-04,4,1,10,19,0,4,1,1,0.66,0.6212,0.57,0.1045,67,467,534 +15304,2012-10-04,4,1,10,20,0,4,1,1,0.64,0.6061,0.65,0.1045,50,391,441 +15305,2012-10-04,4,1,10,21,0,4,1,1,0.62,0.6061,0.69,0,33,288,321 +15306,2012-10-04,4,1,10,22,0,4,1,1,0.64,0.6212,0.47,0.1343,29,203,232 +15307,2012-10-04,4,1,10,23,0,4,1,1,0.6,0.6212,0.56,0.0896,18,137,155 +15308,2012-10-05,4,1,10,0,0,5,1,1,0.56,0.5303,0.73,0,16,86,102 +15309,2012-10-05,4,1,10,1,0,5,1,1,0.54,0.5152,0.68,0.1045,3,43,46 +15310,2012-10-05,4,1,10,2,0,5,1,1,0.54,0.5152,0.73,0.0896,1,10,11 +15311,2012-10-05,4,1,10,3,0,5,1,1,0.54,0.5152,0.73,0,5,11,16 +15312,2012-10-05,4,1,10,4,0,5,1,1,0.54,0.5152,0.77,0,0,11,11 +15313,2012-10-05,4,1,10,5,0,5,1,1,0.52,0.5,0.77,0,0,41,41 +15314,2012-10-05,4,1,10,6,0,5,1,1,0.52,0.5,0.83,0,4,133,137 +15315,2012-10-05,4,1,10,7,0,5,1,1,0.52,0.5,0.83,0.1045,11,417,428 +15316,2012-10-05,4,1,10,8,0,5,1,1,0.58,0.5455,0.64,0,36,749,785 +15317,2012-10-05,4,1,10,9,0,5,1,1,0.6,0.6061,0.6,0.0896,58,326,384 +15318,2012-10-05,4,1,10,10,0,5,1,1,0.66,0.6212,0.54,0,68,192,260 +15319,2012-10-05,4,1,10,11,0,5,1,1,0.7,0.6515,0.48,0.194,90,214,304 +15320,2012-10-05,4,1,10,12,0,5,1,1,0.72,0.6515,0.42,0.1642,161,307,468 +15321,2012-10-05,4,1,10,13,0,5,1,1,0.7,0.6515,0.48,0,117,307,424 
+15322,2012-10-05,4,1,10,14,0,5,1,1,0.74,0.6515,0.37,0.2239,113,287,400 +15323,2012-10-05,4,1,10,15,0,5,1,1,0.72,0.6515,0.39,0.2537,150,320,470 +15324,2012-10-05,4,1,10,16,0,5,1,1,0.72,0.6515,0.37,0,153,481,634 +15325,2012-10-05,4,1,10,17,0,5,1,1,0.7,0.6364,0.42,0.1642,158,742,900 +15326,2012-10-05,4,1,10,18,0,5,1,1,0.64,0.6212,0.57,0.1343,106,655,761 +15327,2012-10-05,4,1,10,19,0,5,1,1,0.62,0.5909,0.73,0.1642,67,433,500 +15328,2012-10-05,4,1,10,20,0,5,1,1,0.6,0.5758,0.78,0.1343,66,306,372 +15329,2012-10-05,4,1,10,21,0,5,1,1,0.6,0.5909,0.69,0.2239,47,220,267 +15330,2012-10-05,4,1,10,22,0,5,1,1,0.6,0.5909,0.73,0.2836,63,201,264 +15331,2012-10-05,4,1,10,23,0,5,1,1,0.58,0.5455,0.78,0.2239,23,148,171 +15332,2012-10-06,4,1,10,0,0,6,0,1,0.56,0.5303,0.83,0.1642,37,154,191 +15333,2012-10-06,4,1,10,1,0,6,0,1,0.56,0.5303,0.83,0.2537,25,116,141 +15334,2012-10-06,4,1,10,2,0,6,0,1,0.56,0.5303,0.78,0.194,13,62,75 +15335,2012-10-06,4,1,10,3,0,6,0,1,0.54,0.5152,0.77,0.2239,2,54,56 +15336,2012-10-06,4,1,10,4,0,6,0,1,0.54,0.5152,0.83,0.2537,4,7,11 +15337,2012-10-06,4,1,10,5,0,6,0,1,0.54,0.5152,0.83,0.194,2,8,10 +15338,2012-10-06,4,1,10,6,0,6,0,1,0.54,0.5152,0.88,0.2537,2,28,30 +15339,2012-10-06,4,1,10,7,0,6,0,1,0.54,0.5152,0.83,0.2985,13,71,84 +15340,2012-10-06,4,1,10,8,0,6,0,1,0.56,0.5303,0.83,0.194,22,184,206 +15341,2012-10-06,4,1,10,9,0,6,0,1,0.6,0.5909,0.73,0.194,130,265,395 +15342,2012-10-06,4,1,10,10,0,6,0,1,0.62,0.6061,0.69,0.2836,198,341,539 +15343,2012-10-06,4,1,10,11,0,6,0,1,0.64,0.6061,0.65,0.2537,258,389,647 +15344,2012-10-06,4,1,10,12,0,6,0,1,0.7,0.6515,0.54,0.1045,362,381,743 +15345,2012-10-06,4,1,10,13,0,6,0,2,0.64,0.6212,0.57,0.5224,310,400,710 +15346,2012-10-06,4,1,10,14,0,6,0,2,0.6,0.6212,0.56,0.4179,269,307,576 +15347,2012-10-06,4,1,10,15,0,6,0,1,0.6,0.6212,0.46,0.4179,279,341,620 +15348,2012-10-06,4,1,10,16,0,6,0,1,0.6,0.6212,0.43,0.5224,317,342,659 +15349,2012-10-06,4,1,10,17,0,6,0,1,0.54,0.5152,0.49,0.3881,268,342,610 
+15350,2012-10-06,4,1,10,18,0,6,0,1,0.52,0.5,0.48,0.4179,183,312,495 +15351,2012-10-06,4,1,10,19,0,6,0,1,0.48,0.4697,0.55,0.2239,102,239,341 +15352,2012-10-06,4,1,10,20,0,6,0,1,0.48,0.4697,0.55,0.2537,75,172,247 +15353,2012-10-06,4,1,10,21,0,6,0,1,0.46,0.4545,0.59,0.1343,78,137,215 +15354,2012-10-06,4,1,10,22,0,6,0,1,0.44,0.4394,0.62,0.1343,45,140,185 +15355,2012-10-06,4,1,10,23,0,6,0,1,0.44,0.4394,0.62,0.1343,37,142,179 +15356,2012-10-07,4,1,10,0,0,0,0,1,0.44,0.4394,0.62,0.1642,28,99,127 +15357,2012-10-07,4,1,10,1,0,0,0,1,0.44,0.4394,0.54,0.2239,29,80,109 +15358,2012-10-07,4,1,10,2,0,0,0,1,0.42,0.4242,0.62,0.1343,10,64,74 +15359,2012-10-07,4,1,10,3,0,0,0,2,0.44,0.4394,0.62,0.1343,5,17,22 +15360,2012-10-07,4,1,10,4,0,0,0,2,0.44,0.4394,0.54,0.2537,5,6,11 +15361,2012-10-07,4,1,10,5,0,0,0,2,0.44,0.4394,0.54,0.1343,2,8,10 +15362,2012-10-07,4,1,10,6,0,0,0,2,0.44,0.4394,0.54,0.194,4,19,23 +15363,2012-10-07,4,1,10,7,0,0,0,3,0.42,0.4242,0.58,0.1642,6,29,35 +15364,2012-10-07,4,1,10,8,0,0,0,3,0.42,0.4242,0.67,0.1343,11,51,62 +15365,2012-10-07,4,1,10,9,0,0,0,3,0.4,0.4091,0.71,0.1343,16,70,86 +15366,2012-10-07,4,1,10,10,0,0,0,2,0.42,0.4242,0.67,0.194,41,144,185 +15367,2012-10-07,4,1,10,11,0,0,0,2,0.42,0.4242,0.71,0.1045,76,260,336 +15368,2012-10-07,4,1,10,12,0,0,0,2,0.42,0.4242,0.71,0,100,292,392 +15369,2012-10-07,4,1,10,13,0,0,0,2,0.44,0.4394,0.72,0.0896,80,240,320 +15370,2012-10-07,4,1,10,14,0,0,0,2,0.44,0.4394,0.67,0,71,243,314 +15371,2012-10-07,4,1,10,15,0,0,0,3,0.42,0.4242,0.77,0.1343,74,232,306 +15372,2012-10-07,4,1,10,16,0,0,0,3,0.4,0.4091,0.82,0.1642,87,246,333 +15373,2012-10-07,4,1,10,17,0,0,0,3,0.4,0.4091,0.82,0.2239,35,122,157 +15374,2012-10-07,4,1,10,18,0,0,0,3,0.4,0.4091,0.82,0.194,24,82,106 +15375,2012-10-07,4,1,10,19,0,0,0,2,0.4,0.4091,0.82,0.1045,17,97,114 +15376,2012-10-07,4,1,10,20,0,0,0,1,0.4,0.4091,0.82,0.1343,19,97,116 +15377,2012-10-07,4,1,10,21,0,0,0,1,0.38,0.3939,0.87,0.1045,22,91,113 +15378,2012-10-07,4,1,10,22,0,0,0,1,0.38,0.3939,0.87,0.1343,7,72,79 
+15379,2012-10-07,4,1,10,23,0,0,0,1,0.36,0.3485,0.93,0.1343,12,68,80 +15380,2012-10-08,4,1,10,0,1,1,0,1,0.38,0.3939,0.76,0.2836,7,44,51 +15381,2012-10-08,4,1,10,1,1,1,0,1,0.34,0.3333,0.87,0.1642,2,35,37 +15382,2012-10-08,4,1,10,2,1,1,0,1,0.34,0.3333,0.81,0.194,3,12,15 +15383,2012-10-08,4,1,10,3,1,1,0,1,0.34,0.3485,0.81,0.1045,2,8,10 +15384,2012-10-08,4,1,10,4,1,1,0,1,0.32,0.3333,0.87,0.1343,1,6,7 +15385,2012-10-08,4,1,10,5,1,1,0,1,0.34,0.3182,0.76,0.2239,2,17,19 +15386,2012-10-08,4,1,10,6,1,1,0,1,0.34,0.3485,0.76,0.1045,5,54,59 +15387,2012-10-08,4,1,10,7,1,1,0,2,0.34,0.3333,0.76,0.1642,3,151,154 +15388,2012-10-08,4,1,10,8,1,1,0,2,0.36,0.3485,0.71,0.2239,23,374,397 +15389,2012-10-08,4,1,10,9,1,1,0,2,0.38,0.3939,0.66,0.1642,59,273,332 +15390,2012-10-08,4,1,10,10,1,1,0,2,0.38,0.3939,0.68,0.1343,69,229,298 +15391,2012-10-08,4,1,10,11,1,1,0,2,0.4,0.4091,0.62,0.0896,72,236,308 +15392,2012-10-08,4,1,10,12,1,1,0,2,0.42,0.4242,0.63,0.1642,74,267,341 +15393,2012-10-08,4,1,10,13,1,1,0,2,0.42,0.4242,0.62,0.194,82,323,405 +15394,2012-10-08,4,1,10,14,1,1,0,2,0.42,0.4242,0.58,0.1045,121,299,420 +15395,2012-10-08,4,1,10,15,1,1,0,3,0.42,0.4242,0.58,0.194,76,294,370 +15396,2012-10-08,4,1,10,16,1,1,0,3,0.42,0.4242,0.58,0.194,61,316,377 +15397,2012-10-08,4,1,10,17,1,1,0,2,0.42,0.4242,0.66,0.2537,81,416,497 +15398,2012-10-08,4,1,10,18,1,1,0,2,0.42,0.4242,0.66,0.2836,41,415,456 +15399,2012-10-08,4,1,10,19,1,1,0,2,0.4,0.4091,0.71,0.2537,38,333,371 +15400,2012-10-08,4,1,10,20,1,1,0,3,0.4,0.4091,0.71,0.2239,20,207,227 +15401,2012-10-08,4,1,10,21,1,1,0,2,0.4,0.4091,0.76,0.2239,17,134,151 +15402,2012-10-08,4,1,10,22,1,1,0,3,0.4,0.4091,0.76,0.2537,6,101,107 +15403,2012-10-08,4,1,10,23,1,1,0,2,0.4,0.4091,0.71,0.2239,9,60,69 +15404,2012-10-09,4,1,10,0,0,2,1,2,0.38,0.3939,0.82,0.2239,9,22,31 +15405,2012-10-09,4,1,10,1,0,2,1,3,0.36,0.3485,0.87,0.2239,0,13,13 +15406,2012-10-09,4,1,10,2,0,2,1,3,0.36,0.3333,0.87,0.2537,2,5,7 +15407,2012-10-09,4,1,10,3,0,2,1,3,0.36,0.3333,0.87,0.2836,0,3,3 
+15408,2012-10-09,4,1,10,4,0,2,1,3,0.36,0.3485,0.87,0.1642,1,6,7 +15409,2012-10-09,4,1,10,5,0,2,1,2,0.38,0.3939,0.82,0.194,0,19,19 +15410,2012-10-09,4,1,10,6,0,2,1,2,0.38,0.3939,0.82,0.2537,2,141,143 +15411,2012-10-09,4,1,10,7,0,2,1,1,0.38,0.3939,0.83,0.2537,5,357,362 +15412,2012-10-09,4,1,10,8,0,2,1,2,0.42,0.4242,0.8,0.194,15,698,713 +15413,2012-10-09,4,1,10,9,0,2,1,2,0.42,0.4242,0.77,0.2239,37,345,382 +15414,2012-10-09,4,1,10,10,0,2,1,2,0.44,0.4394,0.75,0.2239,33,131,164 +15415,2012-10-09,4,1,10,11,0,2,1,2,0.48,0.4697,0.69,0.194,37,156,193 +15416,2012-10-09,4,1,10,12,0,2,1,2,0.48,0.4697,0.69,0.194,43,217,260 +15417,2012-10-09,4,1,10,13,0,2,1,2,0.5,0.4848,0.67,0.1343,54,184,238 +15418,2012-10-09,4,1,10,14,0,2,1,2,0.54,0.5152,0.64,0.194,43,201,244 +15419,2012-10-09,4,1,10,15,0,2,1,2,0.52,0.5,0.63,0.1642,59,195,254 +15420,2012-10-09,4,1,10,16,0,2,1,2,0.5,0.4848,0.72,0.1642,70,354,424 +15421,2012-10-09,4,1,10,17,0,2,1,2,0.52,0.5,0.68,0.1642,73,733,806 +15422,2012-10-09,4,1,10,18,0,2,1,2,0.5,0.4848,0.72,0.2239,26,758,784 +15423,2012-10-09,4,1,10,19,0,2,1,2,0.5,0.4848,0.72,0.1642,31,483,514 +15424,2012-10-09,4,1,10,20,0,2,1,2,0.5,0.4848,0.72,0.1642,30,330,360 +15425,2012-10-09,4,1,10,21,0,2,1,2,0.48,0.4697,0.77,0.1343,16,209,225 +15426,2012-10-09,4,1,10,22,0,2,1,2,0.48,0.4697,0.77,0.0896,7,155,162 +15427,2012-10-09,4,1,10,23,0,2,1,2,0.48,0.4697,0.77,0.0896,8,76,84 +15428,2012-10-10,4,1,10,0,0,3,1,2,0.46,0.4545,0.88,0.0896,0,33,33 +15429,2012-10-10,4,1,10,1,0,3,1,2,0.46,0.4545,0.88,0,2,6,8 +15430,2012-10-10,4,1,10,2,0,3,1,2,0.46,0.4545,0.88,0,2,6,8 +15431,2012-10-10,4,1,10,3,0,3,1,2,0.46,0.4545,0.88,0,0,6,6 +15432,2012-10-10,4,1,10,4,0,3,1,2,0.46,0.4545,0.88,0.1045,0,6,6 +15433,2012-10-10,4,1,10,5,0,3,1,2,0.46,0.4545,0.88,0.0896,2,38,40 +15434,2012-10-10,4,1,10,6,0,3,1,2,0.46,0.4545,0.88,0.0896,3,172,175 +15435,2012-10-10,4,1,10,7,0,3,1,1,0.46,0.4545,0.82,0.1045,9,498,507 +15436,2012-10-10,4,1,10,8,0,3,1,1,0.5,0.4848,0.77,0.1642,33,806,839 
+15437,2012-10-10,4,1,10,9,0,3,1,1,0.54,0.5152,0.6,0.2836,35,331,366 +15438,2012-10-10,4,1,10,10,0,3,1,1,0.56,0.5303,0.52,0.2985,27,190,217 +15439,2012-10-10,4,1,10,11,0,3,1,1,0.54,0.5152,0.52,0.194,53,238,291 +15440,2012-10-10,4,1,10,12,0,3,1,1,0.54,0.5152,0.56,0.2239,78,312,390 +15441,2012-10-10,4,1,10,13,0,3,1,1,0.56,0.5303,0.49,0,60,237,297 +15442,2012-10-10,4,1,10,14,0,3,1,1,0.6,0.6212,0.4,0.1642,63,208,271 +15443,2012-10-10,4,1,10,15,0,3,1,1,0.6,0.6212,0.4,0.1642,58,261,319 +15444,2012-10-10,4,1,10,16,0,3,1,1,0.6,0.6212,0.4,0.2239,92,474,566 +15445,2012-10-10,4,1,10,17,0,3,1,1,0.58,0.5455,0.43,0.2239,91,857,948 +15446,2012-10-10,4,1,10,18,0,3,1,2,0.56,0.5303,0.49,0.1642,57,787,844 +15447,2012-10-10,4,1,10,19,0,3,1,2,0.56,0.5303,0.46,0.5821,32,534,566 +15448,2012-10-10,4,1,10,20,0,3,1,1,0.52,0.5,0.48,0.4925,21,371,392 +15449,2012-10-10,4,1,10,21,0,3,1,1,0.5,0.4848,0.51,0.3582,18,251,269 +15450,2012-10-10,4,1,10,22,0,3,1,1,0.46,0.4545,0.55,0.2985,27,170,197 +15451,2012-10-10,4,1,10,23,0,3,1,1,0.44,0.4394,0.58,0.194,17,119,136 +15452,2012-10-11,4,1,10,0,0,4,1,1,0.44,0.4394,0.51,0.1343,1,41,42 +15453,2012-10-11,4,1,10,1,0,4,1,1,0.44,0.4394,0.47,0.194,1,10,11 +15454,2012-10-11,4,1,10,2,0,4,1,1,0.42,0.4242,0.5,0.1343,4,7,11 +15455,2012-10-11,4,1,10,3,0,4,1,1,0.42,0.4242,0.47,0.2985,0,3,3 +15456,2012-10-11,4,1,10,4,0,4,1,1,0.4,0.4091,0.5,0.2537,0,10,10 +15457,2012-10-11,4,1,10,5,0,4,1,1,0.36,0.3485,0.5,0.1642,0,42,42 +15458,2012-10-11,4,1,10,6,0,4,1,1,0.36,0.3485,0.5,0.1642,1,158,159 +15459,2012-10-11,4,1,10,7,0,4,1,1,0.36,0.3333,0.5,0.2537,11,467,478 +15460,2012-10-11,4,1,10,8,0,4,1,1,0.38,0.3939,0.46,0.3881,25,773,798 +15461,2012-10-11,4,1,10,9,0,4,1,1,0.4,0.4091,0.43,0.3881,27,328,355 +15462,2012-10-11,4,1,10,10,0,4,1,1,0.44,0.4394,0.41,0.3582,39,165,204 +15463,2012-10-11,4,1,10,11,0,4,1,1,0.46,0.4545,0.36,0,76,175,251 +15464,2012-10-11,4,1,10,12,0,4,1,1,0.46,0.4545,0.36,0,47,279,326 +15465,2012-10-11,4,1,10,13,0,4,1,1,0.5,0.4848,0.31,0.1045,48,243,291 
+15466,2012-10-11,4,1,10,14,0,4,1,1,0.52,0.5,0.32,0.0896,78,270,348 +15467,2012-10-11,4,1,10,15,0,4,1,1,0.52,0.5,0.34,0.1642,102,358,460 +15468,2012-10-11,4,1,10,16,0,4,1,1,0.5,0.4848,0.36,0.1343,68,413,481 +15469,2012-10-11,4,1,10,17,0,4,1,1,0.5,0.4848,0.39,0.2836,90,737,827 +15470,2012-10-11,4,1,10,18,0,4,1,1,0.46,0.4545,0.44,0.1642,64,628,692 +15471,2012-10-11,4,1,10,19,0,4,1,1,0.44,0.4394,0.51,0.1343,81,662,743 +15472,2012-10-11,4,1,10,20,0,4,1,1,0.42,0.4242,0.62,0.1642,27,388,415 +15473,2012-10-11,4,1,10,21,0,4,1,1,0.42,0.4242,0.62,0.1343,24,236,260 +15474,2012-10-11,4,1,10,22,0,4,1,1,0.4,0.4091,0.66,0.1642,10,167,177 +15475,2012-10-11,4,1,10,23,0,4,1,1,0.42,0.4242,0.58,0.0896,10,176,186 +15476,2012-10-12,4,1,10,0,0,5,1,1,0.4,0.4091,0.66,0.1045,8,60,68 +15477,2012-10-12,4,1,10,1,0,5,1,1,0.4,0.4091,0.66,0.0896,8,29,37 +15478,2012-10-12,4,1,10,2,0,5,1,1,0.4,0.4091,0.66,0.0896,0,16,16 +15479,2012-10-12,4,1,10,3,0,5,1,1,0.36,0.3636,0.76,0.0896,0,6,6 +15480,2012-10-12,4,1,10,4,0,5,1,1,0.36,0.3636,0.76,0.1045,0,8,8 +15481,2012-10-12,4,1,10,5,0,5,1,1,0.36,0.3636,0.81,0.0896,2,33,35 +15482,2012-10-12,4,1,10,6,0,5,1,1,0.34,0.3485,0.81,0.0896,3,140,143 +15483,2012-10-12,4,1,10,7,0,5,1,1,0.36,0.3636,0.81,0.0896,8,384,392 +15484,2012-10-12,4,1,10,8,0,5,1,1,0.42,0.4242,0.71,0.0896,34,711,745 +15485,2012-10-12,4,1,10,9,0,5,1,1,0.5,0.4848,0.39,0.2985,26,374,400 +15486,2012-10-12,4,1,10,10,0,5,1,1,0.52,0.5,0.36,0.2985,75,200,275 +15487,2012-10-12,4,1,10,11,0,5,1,1,0.54,0.5152,0.37,0.3582,61,252,313 +15488,2012-10-12,4,1,10,12,0,5,1,1,0.54,0.5152,0.39,0.4627,75,312,387 +15489,2012-10-12,4,1,10,13,0,5,1,1,0.56,0.5303,0.37,0.3881,81,300,381 +15490,2012-10-12,4,1,10,14,0,5,1,1,0.56,0.5303,0.37,0.4627,109,262,371 +15491,2012-10-12,4,1,10,15,0,5,1,1,0.52,0.5,0.42,0.4478,138,317,455 +15492,2012-10-12,4,1,10,16,0,5,1,1,0.46,0.4545,0.41,0.3582,108,412,520 +15493,2012-10-12,4,1,10,17,0,5,1,1,0.46,0.4545,0.38,0.3881,131,706,837 
+15494,2012-10-12,4,1,10,18,0,5,1,1,0.44,0.4394,0.38,0.2985,76,566,642 +15495,2012-10-12,4,1,10,19,0,5,1,1,0.42,0.4242,0.41,0.2239,40,453,493 +15496,2012-10-12,4,1,10,20,0,5,1,1,0.42,0.4242,0.47,0.1343,28,280,308 +15497,2012-10-12,4,1,10,21,0,5,1,1,0.4,0.4091,0.54,0.194,21,169,190 +15498,2012-10-12,4,1,10,22,0,5,1,1,0.4,0.4091,0.47,0.2985,17,143,160 +15499,2012-10-12,4,1,10,23,0,5,1,1,0.36,0.3485,0.57,0.194,11,89,100 +15500,2012-10-13,4,1,10,0,0,6,0,1,0.36,0.3636,0.57,0.0896,31,218,249 +15501,2012-10-13,4,1,10,1,0,6,0,1,0.34,0.3333,0.53,0.1343,23,123,146 +15502,2012-10-13,4,1,10,2,0,6,0,1,0.34,0.3333,0.53,0.194,7,60,67 +15503,2012-10-13,4,1,10,3,0,6,0,1,0.32,0.303,0.66,0.2239,2,13,15 +15504,2012-10-13,4,1,10,4,0,6,0,1,0.3,0.303,0.65,0.1642,2,9,11 +15505,2012-10-13,4,1,10,5,0,6,0,1,0.3,0.2879,0.61,0.194,0,11,11 +15506,2012-10-13,4,1,10,6,0,6,0,1,0.3,0.2879,0.61,0.194,5,23,28 +15507,2012-10-13,4,1,10,7,0,6,0,1,0.3,0.303,0.61,0.1642,10,60,70 +15508,2012-10-13,4,1,10,8,0,6,0,1,0.34,0.3485,0.53,0.1045,23,151,174 +15509,2012-10-13,4,1,10,9,0,6,0,1,0.36,0.3636,0.5,0.0896,46,220,266 +15510,2012-10-13,4,1,10,10,0,6,0,1,0.4,0.4091,0.43,0,82,265,347 +15511,2012-10-13,4,1,10,11,0,6,0,1,0.42,0.4242,0.35,0.1343,192,335,527 +15512,2012-10-13,4,1,10,12,0,6,0,1,0.44,0.4394,0.35,0,202,371,573 +15513,2012-10-13,4,1,10,13,0,6,0,1,0.46,0.4545,0.36,0.194,235,435,670 +15514,2012-10-13,4,1,10,14,0,6,0,1,0.48,0.4697,0.33,0.194,243,354,597 +15515,2012-10-13,4,1,10,15,0,6,0,1,0.5,0.4848,0.36,0.194,251,364,615 +15516,2012-10-13,4,1,10,16,0,6,0,1,0.5,0.4848,0.31,0.194,294,343,637 +15517,2012-10-13,4,1,10,17,0,6,0,1,0.46,0.4545,0.38,0.1642,193,335,528 +15518,2012-10-13,4,1,10,18,0,6,0,1,0.46,0.4545,0.44,0.1343,174,299,473 +15519,2012-10-13,4,1,10,19,0,6,0,1,0.44,0.4394,0.44,0.1642,73,259,332 +15520,2012-10-13,4,1,10,20,0,6,0,1,0.4,0.4091,0.54,0.0896,57,198,255 +15521,2012-10-13,4,1,10,21,0,6,0,1,0.4,0.4091,0.58,0.194,47,157,204 +15522,2012-10-13,4,1,10,22,0,6,0,1,0.4,0.4091,0.62,0,33,156,189 
+15523,2012-10-13,4,1,10,23,0,6,0,1,0.42,0.4242,0.58,0.2985,27,98,125 +15524,2012-10-14,4,1,10,0,0,0,0,1,0.4,0.4091,0.71,0.2537,8,103,111 +15525,2012-10-14,4,1,10,1,0,0,0,1,0.42,0.4242,0.77,0.2836,16,96,112 +15526,2012-10-14,4,1,10,2,0,0,0,1,0.4,0.4091,0.76,0.0896,8,58,66 +15527,2012-10-14,4,1,10,3,0,0,0,1,0.42,0.4242,0.77,0.2985,3,25,28 +15528,2012-10-14,4,1,10,4,0,0,0,1,0.42,0.4242,0.77,0.2836,2,10,12 +15529,2012-10-14,4,1,10,5,0,0,0,1,0.4,0.4091,0.82,0.194,5,6,11 +15530,2012-10-14,4,1,10,6,0,0,0,1,0.4,0.4091,0.82,0.1642,4,20,24 +15531,2012-10-14,4,1,10,7,0,0,0,1,0.42,0.4242,0.82,0.2239,7,44,51 +15532,2012-10-14,4,1,10,8,0,0,0,1,0.44,0.4394,0.77,0.194,28,104,132 +15533,2012-10-14,4,1,10,9,0,0,0,2,0.48,0.4697,0.67,0.2836,62,161,223 +15534,2012-10-14,4,1,10,10,0,0,0,1,0.54,0.5152,0.56,0.194,150,278,428 +15535,2012-10-14,4,1,10,11,0,0,0,1,0.56,0.5303,0.52,0.2985,156,338,494 +15536,2012-10-14,4,1,10,12,0,0,0,1,0.62,0.6212,0.46,0.3284,200,362,562 +15537,2012-10-14,4,1,10,13,0,0,0,1,0.64,0.6212,0.41,0.4478,218,401,619 +15538,2012-10-14,4,1,10,14,0,0,0,1,0.66,0.6212,0.39,0.4179,249,368,617 +15539,2012-10-14,4,1,10,15,0,0,0,1,0.66,0.6212,0.39,0.4179,213,355,568 +15540,2012-10-14,4,1,10,16,0,0,0,1,0.64,0.6212,0.41,0.2985,203,378,581 +15541,2012-10-14,4,1,10,17,0,0,0,1,0.64,0.6212,0.44,0.194,193,346,539 +15542,2012-10-14,4,1,10,18,0,0,0,1,0.58,0.5455,0.6,0.2537,134,319,453 +15543,2012-10-14,4,1,10,19,0,0,0,1,0.56,0.5303,0.64,0.2239,68,268,336 +15544,2012-10-14,4,1,10,20,0,0,0,1,0.54,0.5152,0.73,0.2239,48,198,246 +15545,2012-10-14,4,1,10,21,0,0,0,1,0.56,0.5303,0.73,0.4478,40,156,196 +15546,2012-10-14,4,1,10,22,0,0,0,1,0.56,0.5303,0.73,0.3881,38,87,125 +15547,2012-10-14,4,1,10,23,0,0,0,2,0.56,0.5303,0.68,0.2836,27,78,105 +15548,2012-10-15,4,1,10,0,0,1,1,2,0.56,0.5303,0.73,0.2537,17,31,48 +15549,2012-10-15,4,1,10,1,0,1,1,2,0.58,0.5455,0.64,0.3881,5,24,29 +15550,2012-10-15,4,1,10,2,0,1,1,2,0.58,0.5455,0.64,0.3582,1,5,6 
+15551,2012-10-15,4,1,10,3,0,1,1,2,0.56,0.5303,0.73,0.2537,2,2,4 +15552,2012-10-15,4,1,10,4,0,1,1,2,0.56,0.5303,0.68,0.2239,1,9,10 +15553,2012-10-15,4,1,10,5,0,1,1,2,0.56,0.5303,0.73,0.2836,2,38,40 +15554,2012-10-15,4,1,10,6,0,1,1,2,0.56,0.5303,0.73,0.2985,6,141,147 +15555,2012-10-15,4,1,10,7,0,1,1,2,0.58,0.5455,0.68,0.3284,15,461,476 +15556,2012-10-15,4,1,10,8,0,1,1,2,0.6,0.6061,0.64,0.3881,24,713,737 +15557,2012-10-15,4,1,10,9,0,1,1,3,0.6,0.5909,0.69,0.3881,31,328,359 +15558,2012-10-15,4,1,10,10,0,1,1,2,0.6,0.5909,0.73,0.3881,43,125,168 +15559,2012-10-15,4,1,10,11,0,1,1,2,0.6,0.5909,0.73,0.3881,88,175,263 +15560,2012-10-15,4,1,10,12,0,1,1,3,0.6,0.5758,0.78,0.3284,73,211,284 +15561,2012-10-15,4,1,10,13,0,1,1,2,0.56,0.5303,0.88,0.2239,37,109,146 +15562,2012-10-15,4,1,10,14,0,1,1,2,0.6,0.5758,0.78,0.2836,57,128,185 +15563,2012-10-15,4,1,10,15,0,1,1,1,0.6,0.5758,0.78,0.3284,70,190,260 +15564,2012-10-15,4,1,10,16,0,1,1,1,0.56,0.5303,0.64,0.2985,76,371,447 +15565,2012-10-15,4,1,10,17,0,1,1,1,0.56,0.5303,0.64,0.2537,96,670,766 +15566,2012-10-15,4,1,10,18,0,1,1,3,0.54,0.5152,0.77,0.1343,52,540,592 +15567,2012-10-15,4,1,10,19,0,1,1,3,0.52,0.5,0.83,0.0896,12,227,239 +15568,2012-10-15,4,1,10,20,0,1,1,2,0.52,0.5,0.72,0.3881,19,237,256 +15569,2012-10-15,4,1,10,21,0,1,1,1,0.52,0.5,0.59,0.3284,16,190,206 +15570,2012-10-15,4,1,10,22,0,1,1,1,0.5,0.4848,0.59,0.2836,8,118,126 +15571,2012-10-15,4,1,10,23,0,1,1,1,0.46,0.4545,0.63,0.2239,9,72,81 +15572,2012-10-16,4,1,10,0,0,2,1,1,0.46,0.4545,0.63,0.3881,3,45,48 +15573,2012-10-16,4,1,10,1,0,2,1,1,0.44,0.4394,0.67,0.194,4,9,13 +15574,2012-10-16,4,1,10,2,0,2,1,1,0.44,0.4394,0.67,0.1343,2,1,3 +15575,2012-10-16,4,1,10,3,0,2,1,1,0.42,0.4242,0.67,0.1045,0,2,2 +15576,2012-10-16,4,1,10,4,0,2,1,1,0.42,0.4242,0.67,0.1642,0,7,7 +15577,2012-10-16,4,1,10,5,0,2,1,1,0.42,0.4242,0.67,0.2537,5,47,52 +15578,2012-10-16,4,1,10,6,0,2,1,1,0.42,0.4242,0.67,0.1642,4,168,172 +15579,2012-10-16,4,1,10,7,0,2,1,1,0.42,0.4242,0.67,0.0896,20,505,525 
+15580,2012-10-16,4,1,10,8,0,2,1,1,0.44,0.4394,0.62,0,35,800,835 +15581,2012-10-16,4,1,10,9,0,2,1,1,0.48,0.4697,0.55,0.2537,32,323,355 +15582,2012-10-16,4,1,10,10,0,2,1,1,0.5,0.4848,0.48,0.2836,65,157,222 +15583,2012-10-16,4,1,10,11,0,2,1,1,0.5,0.4848,0.45,0.4179,56,172,228 +15584,2012-10-16,4,1,10,12,0,2,1,2,0.52,0.5,0.45,0.3284,69,256,325 +15585,2012-10-16,4,1,10,13,0,2,1,1,0.52,0.5,0.45,0.2836,68,260,328 +15586,2012-10-16,4,1,10,14,0,2,1,1,0.54,0.5152,0.39,0.2537,94,214,308 +15587,2012-10-16,4,1,10,15,0,2,1,1,0.54,0.5152,0.41,0.3284,76,270,346 +15588,2012-10-16,4,1,10,16,0,2,1,1,0.54,0.5152,0.39,0.2836,79,367,446 +15589,2012-10-16,4,1,10,17,0,2,1,1,0.52,0.5,0.39,0.194,104,839,943 +15590,2012-10-16,4,1,10,18,0,2,1,1,0.5,0.4848,0.42,0.1642,71,767,838 +15591,2012-10-16,4,1,10,19,0,2,1,1,0.48,0.4697,0.48,0,40,491,531 +15592,2012-10-16,4,1,10,20,0,2,1,1,0.46,0.4545,0.55,0,35,397,432 +15593,2012-10-16,4,1,10,21,0,2,1,1,0.46,0.4545,0.63,0,19,176,195 +15594,2012-10-16,4,1,10,22,0,2,1,1,0.4,0.4091,0.71,0.0896,18,163,181 +15595,2012-10-16,4,1,10,23,0,2,1,1,0.4,0.4091,0.71,0,23,176,199 +15596,2012-10-17,4,1,10,0,0,3,1,1,0.38,0.3939,0.76,0,1,48,49 +15597,2012-10-17,4,1,10,1,0,3,1,1,0.38,0.3939,0.76,0,3,14,17 +15598,2012-10-17,4,1,10,2,0,3,1,1,0.38,0.3939,0.76,0,4,12,16 +15599,2012-10-17,4,1,10,3,0,3,1,1,0.36,0.3788,0.87,0,0,7,7 +15600,2012-10-17,4,1,10,4,0,3,1,1,0.36,0.3636,0.81,0.0896,0,4,4 +15601,2012-10-17,4,1,10,5,0,3,1,2,0.36,0.3788,0.81,0,2,39,41 +15602,2012-10-17,4,1,10,6,0,3,1,1,0.38,0.3939,0.82,0,7,171,178 +15603,2012-10-17,4,1,10,7,0,3,1,1,0.36,0.3636,0.81,0.1045,15,449,464 +15604,2012-10-17,4,1,10,8,0,3,1,2,0.4,0.4091,0.76,0,38,779,817 +15605,2012-10-17,4,1,10,9,0,3,1,2,0.42,0.4242,0.77,0,38,344,382 +15606,2012-10-17,4,1,10,10,0,3,1,2,0.46,0.4545,0.67,0,60,168,228 +15607,2012-10-17,4,1,10,11,0,3,1,2,0.5,0.4848,0.51,0.0896,78,156,234 +15608,2012-10-17,4,1,10,12,0,3,1,2,0.52,0.5,0.48,0.1642,71,261,332 
+15609,2012-10-17,4,1,10,13,0,3,1,2,0.54,0.5152,0.49,0.1343,73,237,310 +15610,2012-10-17,4,1,10,14,0,3,1,2,0.56,0.5303,0.43,0.1642,82,188,270 +15611,2012-10-17,4,1,10,15,0,3,1,2,0.58,0.5455,0.43,0.2239,62,239,301 +15612,2012-10-17,4,1,10,16,0,3,1,1,0.56,0.5303,0.52,0.194,70,396,466 +15613,2012-10-17,4,1,10,17,0,3,1,1,0.54,0.5152,0.56,0.1045,122,766,888 +15614,2012-10-17,4,1,10,18,0,3,1,1,0.52,0.5,0.59,0.1642,90,794,884 +15615,2012-10-17,4,1,10,19,0,3,1,1,0.5,0.4848,0.72,0.194,49,467,516 +15616,2012-10-17,4,1,10,20,0,3,1,1,0.5,0.4848,0.72,0.1343,54,360,414 +15617,2012-10-17,4,1,10,21,0,3,1,1,0.46,0.4545,0.82,0.194,28,301,329 +15618,2012-10-17,4,1,10,22,0,3,1,1,0.46,0.4545,0.88,0.2537,17,198,215 +15619,2012-10-17,4,1,10,23,0,3,1,1,0.46,0.4545,0.88,0.2239,15,84,99 +15620,2012-10-18,4,1,10,0,0,4,1,1,0.46,0.4545,0.88,0.194,4,53,57 +15621,2012-10-18,4,1,10,1,0,4,1,1,0.46,0.4545,0.82,0.2239,0,14,14 +15622,2012-10-18,4,1,10,2,0,4,1,1,0.46,0.4545,0.82,0.194,1,13,14 +15623,2012-10-18,4,1,10,3,0,4,1,1,0.44,0.4394,0.88,0.2239,0,5,5 +15624,2012-10-18,4,1,10,4,0,4,1,1,0.44,0.4394,0.88,0.2239,0,5,5 +15625,2012-10-18,4,1,10,5,0,4,1,1,0.44,0.4394,0.82,0.2239,1,41,42 +15626,2012-10-18,4,1,10,6,0,4,1,1,0.44,0.4394,0.88,0.2985,3,150,153 +15627,2012-10-18,4,1,10,7,0,4,1,1,0.44,0.4394,0.88,0.2537,20,488,508 +15628,2012-10-18,4,1,10,8,0,4,1,1,0.46,0.4545,0.82,0.2537,31,803,834 +15629,2012-10-18,4,1,10,9,0,4,1,2,0.5,0.4848,0.77,0.1642,41,346,387 +15630,2012-10-18,4,1,10,10,0,4,1,2,0.52,0.5,0.68,0.2239,60,158,218 +15631,2012-10-18,4,1,10,11,0,4,1,2,0.56,0.5303,0.56,0.2239,79,189,268 +15632,2012-10-18,4,1,10,12,0,4,1,2,0.6,0.6212,0.53,0.2537,93,284,377 +15633,2012-10-18,4,1,10,13,0,4,1,1,0.62,0.6212,0.53,0.3582,96,236,332 +15634,2012-10-18,4,1,10,14,0,4,1,2,0.62,0.6212,0.5,0.2985,94,191,285 +15635,2012-10-18,4,1,10,15,0,4,1,2,0.6,0.6061,0.6,0.194,69,284,353 +15636,2012-10-18,4,1,10,16,0,4,1,1,0.6,0.6212,0.56,0.2836,94,356,450 
+15637,2012-10-18,4,1,10,17,0,4,1,2,0.58,0.5455,0.64,0.3284,102,788,890 +15638,2012-10-18,4,1,10,18,0,4,1,2,0.56,0.5303,0.64,0.3284,68,720,788 +15639,2012-10-18,4,1,10,19,0,4,1,2,0.56,0.5303,0.68,0.2985,42,471,513 +15640,2012-10-18,4,1,10,20,0,4,1,2,0.56,0.5303,0.68,0.2537,39,348,387 +15641,2012-10-18,4,1,10,21,0,4,1,2,0.54,0.5152,0.77,0.194,38,245,283 +15642,2012-10-18,4,1,10,22,0,4,1,2,0.54,0.5152,0.83,0.194,27,202,229 +15643,2012-10-18,4,1,10,23,0,4,1,2,0.54,0.5152,0.83,0,6,111,117 +15644,2012-10-19,4,1,10,0,0,5,1,2,0.56,0.5303,0.83,0.1045,5,51,56 +15645,2012-10-19,4,1,10,1,0,5,1,3,0.54,0.5152,0.88,0,4,12,16 +15646,2012-10-19,4,1,10,2,0,5,1,2,0.54,0.5152,0.88,0,1,9,10 +15647,2012-10-19,4,1,10,3,0,5,1,2,0.56,0.5303,0.83,0,0,5,5 +15648,2012-10-19,4,1,10,4,0,5,1,2,0.56,0.5303,0.83,0,1,5,6 +15649,2012-10-19,4,1,10,5,0,5,1,2,0.54,0.5152,0.88,0,1,35,36 +15650,2012-10-19,4,1,10,6,0,5,1,3,0.54,0.5152,0.94,0.1642,5,126,131 +15651,2012-10-19,4,1,10,7,0,5,1,3,0.54,0.5152,0.94,0,5,149,154 +15652,2012-10-19,4,1,10,8,0,5,1,2,0.54,0.5152,0.94,0.1045,20,447,467 +15653,2012-10-19,4,1,10,9,0,5,1,2,0.54,0.5152,0.94,0.1045,26,363,389 +15654,2012-10-19,4,1,10,10,0,5,1,2,0.58,0.5455,0.88,0.194,26,198,224 +15655,2012-10-19,4,1,10,11,0,5,1,1,0.6,0.5606,0.83,0.194,41,211,252 +15656,2012-10-19,4,1,10,12,0,5,1,1,0.64,0.6061,0.69,0.2239,74,258,332 +15657,2012-10-19,4,1,10,13,0,5,1,1,0.62,0.5909,0.73,0.2239,87,288,375 +15658,2012-10-19,4,1,10,14,0,5,1,1,0.66,0.6212,0.61,0.2537,88,277,365 +15659,2012-10-19,4,1,10,15,0,5,1,1,0.64,0.6061,0.65,0.2985,93,302,395 +15660,2012-10-19,4,1,10,16,0,5,1,3,0.62,0.6061,0.69,0.3582,131,434,565 +15661,2012-10-19,4,1,10,17,0,5,1,3,0.62,0.6061,0.69,0.3582,48,377,425 +15662,2012-10-19,4,1,10,18,0,5,1,1,0.56,0.5303,0.83,0.1045,21,212,233 +15663,2012-10-19,4,1,10,19,0,5,1,1,0.54,0.5152,0.88,0.1642,19,213,232 +15664,2012-10-19,4,1,10,20,0,5,1,1,0.52,0.5,0.77,0.194,13,216,229 +15665,2012-10-19,4,1,10,21,0,5,1,1,0.5,0.4848,0.77,0,17,189,206 
+15666,2012-10-19,4,1,10,22,0,5,1,1,0.5,0.4848,0.77,0.0896,14,176,190 +15667,2012-10-19,4,1,10,23,0,5,1,1,0.46,0.4545,0.88,0.1045,13,118,131 +15668,2012-10-20,4,1,10,0,0,6,0,1,0.46,0.4545,0.82,0.1343,13,110,123 +15669,2012-10-20,4,1,10,1,0,6,0,1,0.46,0.4545,0.72,0.1343,2,93,95 +15670,2012-10-20,4,1,10,2,0,6,0,1,0.44,0.4394,0.77,0.1045,9,69,78 +15671,2012-10-20,4,1,10,3,0,6,0,1,0.42,0.4242,0.77,0.1343,2,27,29 +15672,2012-10-20,4,1,10,4,0,6,0,1,0.44,0.4394,0.72,0,9,9,18 +15673,2012-10-20,4,1,10,5,0,6,0,1,0.4,0.4091,0.82,0,6,7,13 +15674,2012-10-20,4,1,10,6,0,6,0,1,0.42,0.4242,0.71,0,5,26,31 +15675,2012-10-20,4,1,10,7,0,6,0,1,0.4,0.4091,0.82,0,6,59,65 +15676,2012-10-20,4,1,10,8,0,6,0,1,0.46,0.4545,0.67,0.1343,29,164,193 +15677,2012-10-20,4,1,10,9,0,6,0,1,0.5,0.4848,0.59,0.1343,106,257,363 +15678,2012-10-20,4,1,10,10,0,6,0,1,0.52,0.5,0.52,0.1343,111,312,423 +15679,2012-10-20,4,1,10,11,0,6,0,1,0.54,0.5152,0.52,0.2239,204,408,612 +15680,2012-10-20,4,1,10,12,0,6,0,1,0.56,0.5303,0.46,0.1045,267,436,703 +15681,2012-10-20,4,1,10,13,0,6,0,1,0.56,0.5303,0.43,0.1642,273,441,714 +15682,2012-10-20,4,1,10,14,0,6,0,1,0.56,0.5303,0.4,0,335,376,711 +15683,2012-10-20,4,1,10,15,0,6,0,1,0.56,0.5303,0.43,0,308,403,711 +15684,2012-10-20,4,1,10,16,0,6,0,1,0.54,0.5152,0.42,0.2836,325,366,691 +15685,2012-10-20,4,1,10,17,0,6,0,1,0.54,0.5152,0.37,0.2239,347,384,731 +15686,2012-10-20,4,1,10,18,0,6,0,1,0.52,0.5,0.39,0.2239,165,356,521 +15687,2012-10-20,4,1,10,19,0,6,0,1,0.5,0.4848,0.45,0.1343,77,268,345 +15688,2012-10-20,4,1,10,20,0,6,0,1,0.5,0.4848,0.39,0.2239,61,198,259 +15689,2012-10-20,4,1,10,21,0,6,0,1,0.46,0.4545,0.47,0.2239,72,224,296 +15690,2012-10-20,4,1,10,22,0,6,0,1,0.44,0.4394,0.51,0.1045,47,155,202 +15691,2012-10-20,4,1,10,23,0,6,0,1,0.42,0.4242,0.58,0,27,136,163 +15692,2012-10-21,4,1,10,0,0,0,0,1,0.44,0.4394,0.54,0.1642,35,119,154 +15693,2012-10-21,4,1,10,1,0,0,0,1,0.42,0.4242,0.58,0.2239,20,97,117 +15694,2012-10-21,4,1,10,2,0,0,0,1,0.4,0.4091,0.62,0.1642,44,88,132 
+15695,2012-10-21,4,1,10,3,0,0,0,1,0.38,0.3939,0.66,0.0896,12,29,41 +15696,2012-10-21,4,1,10,4,0,0,0,1,0.36,0.3485,0.71,0.1642,8,13,21 +15697,2012-10-21,4,1,10,5,0,0,0,1,0.38,0.3939,0.66,0.0896,6,4,10 +15698,2012-10-21,4,1,10,6,0,0,0,1,0.4,0.4091,0.62,0,8,18,26 +15699,2012-10-21,4,1,10,7,0,0,0,1,0.4,0.4091,0.62,0,13,43,56 +15700,2012-10-21,4,1,10,8,0,0,0,1,0.42,0.4242,0.67,0,39,104,143 +15701,2012-10-21,4,1,10,9,0,0,0,1,0.46,0.4545,0.55,0.1642,77,192,269 +15702,2012-10-21,4,1,10,10,0,0,0,1,0.5,0.4848,0.48,0.2985,167,276,443 +15703,2012-10-21,4,1,10,11,0,0,0,1,0.52,0.5,0.44,0.3284,191,356,547 +15704,2012-10-21,4,1,10,12,0,0,0,1,0.54,0.5152,0.39,0.3284,236,439,675 +15705,2012-10-21,4,1,10,13,0,0,0,1,0.56,0.5303,0.37,0.2537,243,383,626 +15706,2012-10-21,4,1,10,14,0,0,0,1,0.56,0.5303,0.35,0.194,235,405,640 +15707,2012-10-21,4,1,10,15,0,0,0,1,0.56,0.5303,0.35,0.3284,240,383,623 +15708,2012-10-21,4,1,10,16,0,0,0,1,0.54,0.5152,0.37,0.2239,182,409,591 +15709,2012-10-21,4,1,10,17,0,0,0,1,0.52,0.5,0.39,0.2537,153,338,491 +15710,2012-10-21,4,1,10,18,0,0,0,1,0.52,0.5,0.39,0.194,71,342,413 +15711,2012-10-21,4,1,10,19,0,0,0,1,0.5,0.4848,0.42,0.1642,50,216,266 +15712,2012-10-21,4,1,10,20,0,0,0,1,0.48,0.4697,0.44,0.194,35,160,195 +15713,2012-10-21,4,1,10,21,0,0,0,1,0.42,0.4242,0.54,0.1642,42,124,166 +15714,2012-10-21,4,1,10,22,0,0,0,1,0.42,0.4242,0.54,0,18,95,113 +15715,2012-10-21,4,1,10,23,0,0,0,1,0.44,0.4394,0.54,0,7,59,66 +15716,2012-10-22,4,1,10,0,0,1,1,1,0.4,0.4091,0.62,0,3,28,31 +15717,2012-10-22,4,1,10,1,0,1,1,1,0.4,0.4091,0.71,0,0,11,11 +15718,2012-10-22,4,1,10,2,0,1,1,1,0.38,0.3939,0.71,0,1,4,5 +15719,2012-10-22,4,1,10,3,0,1,1,1,0.4,0.4091,0.62,0,1,5,6 +15720,2012-10-22,4,1,10,4,0,1,1,1,0.38,0.3939,0.66,0,1,6,7 +15721,2012-10-22,4,1,10,5,0,1,1,1,0.38,0.3939,0.76,0.0896,2,44,46 +15722,2012-10-22,4,1,10,6,0,1,1,1,0.36,0.3636,0.76,0.0896,4,140,144 +15723,2012-10-22,4,1,10,7,0,1,1,1,0.38,0.3939,0.66,0.0896,15,437,452 
+15724,2012-10-22,4,1,10,8,0,1,1,1,0.44,0.4394,0.67,0,32,696,728 +15725,2012-10-22,4,1,10,9,0,1,1,1,0.46,0.4545,0.63,0.0896,25,335,360 +15726,2012-10-22,4,1,10,10,0,1,1,1,0.5,0.4848,0.55,0,51,161,212 +15727,2012-10-22,4,1,10,11,0,1,1,1,0.56,0.5303,0.37,0.1642,63,148,211 +15728,2012-10-22,4,1,10,12,0,1,1,1,0.58,0.5455,0.35,0,52,231,283 +15729,2012-10-22,4,1,10,13,0,1,1,1,0.62,0.6212,0.29,0.1642,88,227,315 +15730,2012-10-22,4,1,10,14,0,1,1,1,0.62,0.6061,0.27,0.194,73,209,282 +15731,2012-10-22,4,1,10,15,0,1,1,1,0.64,0.6212,0.29,0.1642,74,222,296 +15732,2012-10-22,4,1,10,16,0,1,1,1,0.62,0.6212,0.33,0.1642,80,444,524 +15733,2012-10-22,4,1,10,17,0,1,1,1,0.62,0.6212,0.33,0.1045,84,838,922 +15734,2012-10-22,4,1,10,18,0,1,1,1,0.54,0.5152,0.64,0.1045,60,726,786 +15735,2012-10-22,4,1,10,19,0,1,1,1,0.52,0.5,0.52,0.1045,36,478,514 +15736,2012-10-22,4,1,10,20,0,1,1,1,0.5,0.4848,0.59,0.1045,21,382,403 +15737,2012-10-22,4,1,10,21,0,1,1,1,0.48,0.4697,0.67,0.0896,19,155,174 +15738,2012-10-22,4,1,10,22,0,1,1,1,0.46,0.4545,0.82,0.1343,25,158,183 +15739,2012-10-22,4,1,10,23,0,1,1,1,0.46,0.4545,0.82,0.1045,20,143,163 +15740,2012-10-23,4,1,10,0,0,2,1,1,0.46,0.4545,0.88,0.1642,5,32,37 +15741,2012-10-23,4,1,10,1,0,2,1,1,0.46,0.4545,0.77,0.1642,1,16,17 +15742,2012-10-23,4,1,10,2,0,2,1,1,0.44,0.4394,0.88,0.1343,1,6,7 +15743,2012-10-23,4,1,10,3,0,2,1,1,0.44,0.4394,0.77,0.1343,0,1,1 +15744,2012-10-23,4,1,10,4,0,2,1,1,0.42,0.4242,0.77,0.1045,1,6,7 +15745,2012-10-23,4,1,10,5,0,2,1,1,0.44,0.4394,0.77,0.0896,1,49,50 +15746,2012-10-23,4,1,10,6,0,2,1,1,0.4,0.4091,0.82,0.0896,6,152,158 +15747,2012-10-23,4,1,10,7,0,2,1,2,0.44,0.4394,0.77,0.1045,12,519,531 +15748,2012-10-23,4,1,10,8,0,2,1,2,0.46,0.4545,0.72,0.0896,28,733,761 +15749,2012-10-23,4,1,10,9,0,2,1,2,0.5,0.4848,0.72,0.1045,29,305,334 +15750,2012-10-23,4,1,10,10,0,2,1,1,0.54,0.5152,0.6,0,50,171,221 +15751,2012-10-23,4,1,10,11,0,2,1,1,0.6,0.6212,0.53,0,52,201,253 +15752,2012-10-23,4,1,10,12,0,2,1,2,0.62,0.6212,0.46,0.0896,80,298,378 
+15753,2012-10-23,4,1,10,13,0,2,1,2,0.64,0.6212,0.44,0,59,244,303 +15754,2012-10-23,4,1,10,14,0,2,1,1,0.68,0.6364,0.41,0.1642,67,229,296 +15755,2012-10-23,4,1,10,15,0,2,1,1,0.68,0.6364,0.41,0.2239,76,270,346 +15756,2012-10-23,4,1,10,16,0,2,1,1,0.68,0.6364,0.44,0.0896,108,433,541 +15757,2012-10-23,4,1,10,17,0,2,1,1,0.68,0.6364,0.41,0,67,871,938 +15758,2012-10-23,4,1,10,18,0,2,1,1,0.62,0.6212,0.53,0.0896,64,762,826 +15759,2012-10-23,4,1,10,19,0,2,1,1,0.62,0.6212,0.57,0.0896,25,457,482 +15760,2012-10-23,4,1,10,20,0,2,1,1,0.58,0.5455,0.64,0.1343,45,334,379 +15761,2012-10-23,4,1,10,21,0,2,1,1,0.56,0.5303,0.68,0,35,260,295 +15762,2012-10-23,4,1,10,22,0,2,1,1,0.56,0.5303,0.68,0.1045,20,174,194 +15763,2012-10-23,4,1,10,23,0,2,1,1,0.54,0.5152,0.73,0.1045,9,102,111 +15764,2012-10-24,4,1,10,0,0,3,1,1,0.54,0.5152,0.73,0.0896,7,39,46 +15765,2012-10-24,4,1,10,1,0,3,1,1,0.54,0.5152,0.68,0,4,23,27 +15766,2012-10-24,4,1,10,2,0,3,1,1,0.54,0.5152,0.68,0,0,6,6 +15767,2012-10-24,4,1,10,3,0,3,1,1,0.5,0.4848,0.77,0.0896,0,4,4 +15768,2012-10-24,4,1,10,4,0,3,1,1,0.5,0.4848,0.77,0.0896,1,5,6 +15769,2012-10-24,4,1,10,5,0,3,1,1,0.5,0.4848,0.82,0,1,52,53 +15770,2012-10-24,4,1,10,6,0,3,1,1,0.48,0.4697,0.82,0.1045,11,162,173 +15771,2012-10-24,4,1,10,7,0,3,1,1,0.5,0.4848,0.88,0.0896,20,506,526 +15772,2012-10-24,4,1,10,8,0,3,1,2,0.52,0.5,0.83,0.1045,24,777,801 +15773,2012-10-24,4,1,10,9,0,3,1,2,0.52,0.5,0.77,0.0896,24,349,373 +15774,2012-10-24,4,1,10,10,0,3,1,2,0.54,0.5152,0.73,0,29,142,171 +15775,2012-10-24,4,1,10,11,0,3,1,1,0.58,0.5455,0.64,0,47,189,236 +15776,2012-10-24,4,1,10,12,0,3,1,1,0.62,0.6061,0.61,0.0896,48,268,316 +15777,2012-10-24,4,1,10,13,0,3,1,1,0.66,0.6212,0.54,0,65,241,306 +15778,2012-10-24,4,1,10,14,0,3,1,1,0.74,0.6515,0.3,0.2537,64,237,301 +15779,2012-10-24,4,1,10,15,0,3,1,1,0.7,0.6364,0.39,0.1045,63,245,308 +15780,2012-10-24,4,1,10,16,0,3,1,1,0.74,0.6515,0.33,0.1343,67,465,532 +15781,2012-10-24,4,1,10,17,0,3,1,1,0.66,0.6212,0.47,0,87,876,963 
+15782,2012-10-24,4,1,10,18,0,3,1,1,0.66,0.6212,0.44,0,63,795,858 +15783,2012-10-24,4,1,10,19,0,3,1,2,0.64,0.6212,0.53,0.0896,50,522,572 +15784,2012-10-24,4,1,10,20,0,3,1,2,0.62,0.6061,0.61,0.0896,45,396,441 +15785,2012-10-24,4,1,10,21,0,3,1,2,0.62,0.6061,0.61,0.1045,33,280,313 +15786,2012-10-24,4,1,10,22,0,3,1,2,0.6,0.6061,0.64,0.1343,30,208,238 +15787,2012-10-24,4,1,10,23,0,3,1,2,0.58,0.5455,0.68,0.0896,12,111,123 +15788,2012-10-25,4,1,10,0,0,4,1,2,0.6,0.6061,0.64,0.1045,19,57,76 +15789,2012-10-25,4,1,10,1,0,4,1,1,0.58,0.5455,0.68,0,6,22,28 +15790,2012-10-25,4,1,10,2,0,4,1,1,0.54,0.5152,0.77,0,3,15,18 +15791,2012-10-25,4,1,10,3,0,4,1,2,0.52,0.5,0.88,0.1343,0,8,8 +15792,2012-10-25,4,1,10,4,0,4,1,2,0.52,0.5,0.88,0.1045,1,4,5 +15793,2012-10-25,4,1,10,5,0,4,1,2,0.52,0.5,0.88,0.1642,2,53,55 +15794,2012-10-25,4,1,10,6,0,4,1,2,0.52,0.5,0.88,0.194,3,168,171 +15795,2012-10-25,4,1,10,7,0,4,1,2,0.52,0.5,0.88,0.194,18,477,495 +15796,2012-10-25,4,1,10,8,0,4,1,2,0.52,0.5,0.83,0.1642,33,746,779 +15797,2012-10-25,4,1,10,9,0,4,1,2,0.52,0.5,0.83,0.194,23,320,343 +15798,2012-10-25,4,1,10,10,0,4,1,2,0.54,0.5152,0.83,0.1343,53,172,225 +15799,2012-10-25,4,1,10,11,0,4,1,2,0.54,0.5152,0.83,0.1045,57,181,238 +15800,2012-10-25,4,1,10,12,0,4,1,2,0.56,0.5303,0.78,0.1045,61,258,319 +15801,2012-10-25,4,1,10,13,0,4,1,2,0.56,0.5303,0.78,0,49,264,313 +15802,2012-10-25,4,1,10,14,0,4,1,1,0.6,0.5909,0.73,0.1343,71,214,285 +15803,2012-10-25,4,1,10,15,0,4,1,1,0.62,0.6061,0.65,0.1343,63,242,305 +15804,2012-10-25,4,1,10,16,0,4,1,1,0.6,0.5909,0.69,0.1642,92,407,499 +15805,2012-10-25,4,1,10,17,0,4,1,1,0.6,0.5909,0.69,0.1642,112,774,886 +15806,2012-10-25,4,1,10,18,0,4,1,1,0.56,0.5303,0.83,0.1045,77,732,809 +15807,2012-10-25,4,1,10,19,0,4,1,1,0.54,0.5152,0.83,0.194,45,497,542 +15808,2012-10-25,4,1,10,20,0,4,1,2,0.52,0.5,0.88,0.1343,28,319,347 +15809,2012-10-25,4,1,10,21,0,4,1,2,0.52,0.5,0.88,0.1343,24,247,271 +15810,2012-10-25,4,1,10,22,0,4,1,2,0.54,0.5152,0.83,0.0896,22,185,207 
+15811,2012-10-25,4,1,10,23,0,4,1,2,0.54,0.5152,0.83,0.1343,13,122,135 +15812,2012-10-26,4,1,10,0,0,5,1,2,0.54,0.5152,0.77,0,6,65,71 +15813,2012-10-26,4,1,10,1,0,5,1,2,0.52,0.5,0.88,0.0896,4,32,36 +15814,2012-10-26,4,1,10,2,0,5,1,2,0.52,0.5,0.88,0.1045,4,15,19 +15815,2012-10-26,4,1,10,3,0,5,1,2,0.52,0.5,0.88,0.1642,5,8,13 +15816,2012-10-26,4,1,10,4,0,5,1,2,0.52,0.5,0.88,0.1642,1,4,5 +15817,2012-10-26,4,1,10,5,0,5,1,2,0.52,0.5,0.88,0.1045,3,47,50 +15818,2012-10-26,4,1,10,6,0,5,1,2,0.52,0.5,0.88,0.0896,2,142,144 +15819,2012-10-26,4,1,10,7,0,5,1,2,0.52,0.5,0.88,0.1045,13,408,421 +15820,2012-10-26,4,1,10,8,0,5,1,2,0.54,0.5152,0.88,0.1343,20,714,734 +15821,2012-10-26,4,1,10,9,0,5,1,2,0.54,0.5152,0.88,0.1343,56,347,403 +15822,2012-10-26,4,1,10,10,0,5,1,2,0.54,0.5152,0.88,0.1045,41,156,197 +15823,2012-10-26,4,1,10,11,0,5,1,2,0.56,0.5303,0.78,0.1642,68,222,290 +15824,2012-10-26,4,1,10,12,0,5,1,2,0.58,0.5455,0.73,0.194,90,302,392 +15825,2012-10-26,4,1,10,13,0,5,1,2,0.58,0.5455,0.68,0.1343,77,260,337 +15826,2012-10-26,4,1,10,14,0,5,1,2,0.6,0.6061,0.64,0.1343,99,252,351 +15827,2012-10-26,4,1,10,15,0,5,1,2,0.62,0.6061,0.61,0.1045,142,306,448 +15828,2012-10-26,4,1,10,16,0,5,1,2,0.58,0.5455,0.73,0.1642,137,445,582 +15829,2012-10-26,4,1,10,17,0,5,1,2,0.56,0.5303,0.76,0.194,125,692,817 +15830,2012-10-26,4,1,10,18,0,5,1,2,0.56,0.5303,0.78,0.1343,81,584,665 +15831,2012-10-26,4,1,10,19,0,5,1,2,0.54,0.5152,0.83,0.1642,72,399,471 +15832,2012-10-26,4,1,10,20,0,5,1,2,0.54,0.5152,0.83,0.1642,40,271,311 +15833,2012-10-26,4,1,10,21,0,5,1,2,0.54,0.5152,0.77,0.1343,40,252,292 +15834,2012-10-26,4,1,10,22,0,5,1,2,0.52,0.5,0.83,0.1343,37,180,217 +15835,2012-10-26,4,1,10,23,0,5,1,1,0.52,0.5,0.83,0.1642,19,159,178 +15836,2012-10-27,4,1,10,0,0,6,0,1,0.5,0.4848,0.88,0.1343,10,132,142 +15837,2012-10-27,4,1,10,1,0,6,0,2,0.5,0.4848,0.88,0.1343,23,123,146 +15838,2012-10-27,4,1,10,2,0,6,0,2,0.5,0.4848,0.88,0.194,19,71,90 +15839,2012-10-27,4,1,10,3,0,6,0,1,0.5,0.4848,0.88,0.1642,5,21,26 
+15840,2012-10-27,4,1,10,4,0,6,0,2,0.5,0.4848,0.88,0.1642,1,12,13 +15841,2012-10-27,4,1,10,5,0,6,0,2,0.5,0.4848,0.82,0.194,2,7,9 +15842,2012-10-27,4,1,10,6,0,6,0,2,0.5,0.4848,0.82,0.194,2,29,31 +15843,2012-10-27,4,1,10,7,0,6,0,1,0.48,0.4697,0.83,0.194,7,79,86 +15844,2012-10-27,4,1,10,8,0,6,0,1,0.5,0.4848,0.77,0.2537,26,187,213 +15845,2012-10-27,4,1,10,9,0,6,0,1,0.5,0.4848,0.77,0.2537,88,240,328 +15846,2012-10-27,4,1,10,10,0,6,0,1,0.54,0.5152,0.73,0.2836,165,314,479 +15847,2012-10-27,4,1,10,11,0,6,0,1,0.56,0.5303,0.64,0.2836,197,388,585 +15848,2012-10-27,4,1,10,12,0,6,0,1,0.6,0.6212,0.53,0.2537,264,404,668 +15849,2012-10-27,4,1,10,13,0,6,0,1,0.6,0.6212,0.43,0.2537,310,450,760 +15850,2012-10-27,4,1,10,14,0,6,0,1,0.6,0.6212,0.46,0.2537,325,425,750 +15851,2012-10-27,4,1,10,15,0,6,0,2,0.56,0.5303,0.6,0.2836,310,401,711 +15852,2012-10-27,4,1,10,16,0,6,0,2,0.56,0.5303,0.64,0.2836,257,355,612 +15853,2012-10-27,4,1,10,17,0,6,0,2,0.56,0.5303,0.64,0.2836,248,370,618 +15854,2012-10-27,4,1,10,18,0,6,0,2,0.56,0.5303,0.64,0.2836,138,318,456 +15855,2012-10-27,4,1,10,19,0,6,0,2,0.52,0.5,0.72,0.2239,67,233,300 +15856,2012-10-27,4,1,10,20,0,6,0,2,0.52,0.5,0.72,0.2239,58,238,296 +15857,2012-10-27,4,1,10,21,0,6,0,2,0.52,0.5,0.72,0.2239,49,160,209 +15858,2012-10-27,4,1,10,22,0,6,0,2,0.52,0.5,0.72,0.2537,41,116,157 +15859,2012-10-27,4,1,10,23,0,6,0,2,0.52,0.5,0.68,0.3881,31,136,167 +15860,2012-10-28,4,1,10,0,0,0,0,3,0.52,0.5,0.68,0.2985,20,97,117 +15861,2012-10-28,4,1,10,1,0,0,0,2,0.5,0.4848,0.72,0.2836,22,111,133 +15862,2012-10-28,4,1,10,2,0,0,0,2,0.5,0.4848,0.68,0.2985,17,99,116 +15863,2012-10-28,4,1,10,3,0,0,0,2,0.5,0.4848,0.63,0.3284,18,61,79 +15864,2012-10-28,4,1,10,4,0,0,0,2,0.5,0.4848,0.63,0.3582,1,19,20 +15865,2012-10-28,4,1,10,5,0,0,0,2,0.5,0.4848,0.63,0.2985,6,22,28 +15866,2012-10-28,4,1,10,6,0,0,0,2,0.5,0.4848,0.59,0.3284,7,32,39 +15867,2012-10-28,4,1,10,7,0,0,0,2,0.5,0.4848,0.59,0.2985,17,48,65 +15868,2012-10-28,4,1,10,8,0,0,0,2,0.5,0.4848,0.59,0.5522,55,118,173 
+15869,2012-10-28,4,1,10,9,0,0,0,2,0.5,0.4848,0.55,0.4179,124,206,330 +15870,2012-10-28,4,1,10,10,0,0,0,2,0.48,0.4697,0.51,0.4179,120,314,434 +15871,2012-10-28,4,1,10,11,0,0,0,2,0.5,0.4848,0.51,0.3881,110,352,462 +15872,2012-10-28,4,1,10,12,0,0,0,2,0.48,0.4697,0.59,0.4925,118,373,491 +15873,2012-10-28,4,1,10,13,0,0,0,2,0.48,0.4697,0.59,0.4179,75,316,391 +15874,2012-10-28,4,1,10,14,0,0,0,2,0.5,0.4848,0.59,0.4478,98,304,402 +15875,2012-10-28,4,1,10,15,0,0,0,2,0.5,0.4848,0.63,0.4925,76,225,301 +15876,2012-10-28,4,1,10,16,0,0,0,3,0.48,0.4697,0.67,0.4627,42,251,293 +15877,2012-10-28,4,1,10,17,0,0,0,3,0.46,0.4545,0.77,0.5224,37,188,225 +15878,2012-10-28,4,1,10,18,0,0,0,3,0.42,0.4242,0.94,0.4925,18,136,154 +15879,2012-10-28,4,1,10,19,0,0,0,3,0.42,0.4242,0.94,0.3582,7,47,54 +15880,2012-10-28,4,1,10,20,0,0,0,3,0.42,0.4242,0.94,0.4627,4,51,55 +15881,2012-10-28,4,1,10,21,0,0,0,3,0.44,0.4394,0.88,0.4179,2,44,46 +15882,2012-10-28,4,1,10,22,0,0,0,3,0.44,0.4394,0.88,0.3582,2,35,37 +15883,2012-10-28,4,1,10,23,0,0,0,3,0.42,0.4242,0.94,0.3582,2,12,14 +15884,2012-10-29,4,1,10,0,0,1,1,3,0.44,0.4394,0.88,0.3582,2,20,22 +15885,2012-10-30,4,1,10,13,0,2,1,3,0.3,0.2727,0.81,0.3582,11,105,116 +15886,2012-10-30,4,1,10,14,0,2,1,3,0.3,0.2727,0.81,0.3582,8,118,126 +15887,2012-10-30,4,1,10,15,0,2,1,3,0.3,0.2879,0.87,0.2537,10,114,124 +15888,2012-10-30,4,1,10,16,0,2,1,3,0.3,0.2879,0.87,0.2537,15,83,98 +15889,2012-10-30,4,1,10,17,0,2,1,3,0.3,0.2879,0.87,0.2239,19,105,124 +15890,2012-10-30,4,1,10,18,0,2,1,3,0.3,0.303,0.87,0.1343,4,139,143 +15891,2012-10-30,4,1,10,19,0,2,1,2,0.5,0.4848,0.68,0.194,6,109,115 +15892,2012-10-30,4,1,10,20,0,2,1,2,0.3,0.2879,0.81,0.194,5,76,81 +15893,2012-10-30,4,1,10,21,0,2,1,2,0.3,0.3182,0.87,0.1045,4,60,64 +15894,2012-10-30,4,1,10,22,0,2,1,1,0.3,0.303,0.81,0.1343,2,64,66 +15895,2012-10-30,4,1,10,23,0,2,1,1,0.3,0.303,0.81,0.1343,3,36,39 +15896,2012-10-31,4,1,10,0,0,3,1,2,0.3,0.3182,0.81,0.1045,0,16,16 +15897,2012-10-31,4,1,10,1,0,3,1,2,0.3,0.3182,0.81,0.0896,0,8,8 
+15898,2012-10-31,4,1,10,2,0,3,1,2,0.3,0.303,0.81,0.1343,0,7,7 +15899,2012-10-31,4,1,10,3,0,3,1,2,0.3,0.3182,0.81,0.1045,0,3,3 +15900,2012-10-31,4,1,10,4,0,3,1,2,0.3,0.2879,0.87,0.194,0,5,5 +15901,2012-10-31,4,1,10,5,0,3,1,2,0.3,0.303,0.81,0.1343,0,24,24 +15902,2012-10-31,4,1,10,6,0,3,1,1,0.3,0.303,0.81,0.1642,0,116,116 +15903,2012-10-31,4,1,10,7,0,3,1,2,0.3,0.303,0.87,0.1343,3,334,337 +15904,2012-10-31,4,1,10,8,0,3,1,1,0.32,0.3333,0.81,0.1343,6,615,621 +15905,2012-10-31,4,1,10,9,0,3,1,1,0.34,0.3333,0.76,0.1343,17,280,297 +15906,2012-10-31,4,1,10,10,0,3,1,1,0.36,0.3333,0.71,0.2836,21,147,168 +15907,2012-10-31,4,1,10,11,0,3,1,2,0.36,0.3485,0.66,0.2239,37,152,189 +15908,2012-10-31,4,1,10,12,0,3,1,1,0.4,0.4091,0.58,0.194,44,197,241 +15909,2012-10-31,4,1,10,13,0,3,1,1,0.42,0.4242,0.5,0.2239,22,191,213 +15910,2012-10-31,4,1,10,14,0,3,1,2,0.42,0.4242,0.5,0.2239,45,179,224 +15911,2012-10-31,4,1,10,15,0,3,1,1,0.42,0.4242,0.54,0.1642,33,197,230 +15912,2012-10-31,4,1,10,16,0,3,1,2,0.42,0.4242,0.54,0.2239,51,373,424 +15913,2012-10-31,4,1,10,17,0,3,1,2,0.4,0.4091,0.58,0.1642,39,684,723 +15914,2012-10-31,4,1,10,18,0,3,1,2,0.4,0.4091,0.58,0.1642,28,556,584 +15915,2012-10-31,4,1,10,19,0,3,1,2,0.4,0.4091,0.5,0.194,27,383,410 +15916,2012-10-31,4,1,10,20,0,3,1,2,0.4,0.4091,0.5,0.194,9,259,268 +15917,2012-10-31,4,1,10,21,0,3,1,2,0.4,0.4091,0.5,0.194,12,180,192 +15918,2012-10-31,4,1,10,22,0,3,1,1,0.36,0.3485,0.57,0.1343,18,147,165 +15919,2012-10-31,4,1,10,23,0,3,1,1,0.36,0.3636,0.57,0.0896,7,94,101 +15920,2012-11-01,4,1,11,0,0,4,1,1,0.36,0.3636,0.57,0.0896,8,52,60 +15921,2012-11-01,4,1,11,1,0,4,1,1,0.3,0.3182,0.75,0.1045,8,22,30 +15922,2012-11-01,4,1,11,2,0,4,1,1,0.32,0.3333,0.66,0.1343,10,10,20 +15923,2012-11-01,4,1,11,3,0,4,1,1,0.34,0.3333,0.61,0.1343,5,10,15 +15924,2012-11-01,4,1,11,4,0,4,1,2,0.34,0.3485,0.66,0.1045,2,8,10 +15925,2012-11-01,4,1,11,5,0,4,1,1,0.34,0.3333,0.66,0.1343,1,39,40 +15926,2012-11-01,4,1,11,6,0,4,1,1,0.34,0.3333,0.61,0.1642,2,146,148 
+15927,2012-11-01,4,1,11,7,0,4,1,2,0.36,0.3485,0.57,0.1642,6,414,420 +15928,2012-11-01,4,1,11,8,0,4,1,2,0.36,0.3788,0.56,0.1045,12,668,680 +15929,2012-11-01,4,1,11,9,0,4,1,3,0.36,0.3636,0.57,0.1045,21,310,331 +15930,2012-11-01,4,1,11,10,0,4,1,3,0.36,0.3485,0.62,0.194,13,133,146 +15931,2012-11-01,4,1,11,11,0,4,1,3,0.36,0.3485,0.62,0.194,14,156,170 +15932,2012-11-01,4,1,11,12,0,4,1,3,0.38,0.3939,0.62,0.1642,23,198,221 +15933,2012-11-01,4,1,11,13,0,4,1,2,0.4,0.4091,0.58,0.1642,49,199,248 +15934,2012-11-01,4,1,11,14,0,4,1,2,0.4,0.4091,0.5,0.2239,31,154,185 +15935,2012-11-01,4,1,11,15,0,4,1,2,0.4,0.4091,0.54,0.1642,35,179,214 +15936,2012-11-01,4,1,11,16,0,4,1,2,0.4,0.4091,0.54,0.1642,31,313,344 +15937,2012-11-01,4,1,11,17,0,4,1,3,0.4,0.4091,0.5,0.2239,37,652,689 +15938,2012-11-01,4,1,11,18,0,4,1,2,0.4,0.4091,0.5,0.1642,50,628,678 +15939,2012-11-01,4,1,11,19,0,4,1,2,0.4,0.4091,0.5,0.1642,28,424,452 +15940,2012-11-01,4,1,11,20,0,4,1,2,0.38,0.3939,0.54,0.2537,16,280,296 +15941,2012-11-01,4,1,11,21,0,4,1,2,0.38,0.3939,0.54,0.1343,29,238,267 +15942,2012-11-01,4,1,11,22,0,4,1,1,0.36,0.3485,0.57,0.1642,27,175,202 +15943,2012-11-01,4,1,11,23,0,4,1,1,0.34,0.3333,0.57,0.1642,8,112,120 +15944,2012-11-02,4,1,11,0,0,5,1,1,0.34,0.3333,0.57,0.1642,10,40,50 +15945,2012-11-02,4,1,11,1,0,5,1,1,0.32,0.3182,0.66,0.1642,5,19,24 +15946,2012-11-02,4,1,11,2,0,5,1,1,0.32,0.3182,0.66,0.1642,3,7,10 +15947,2012-11-02,4,1,11,3,0,5,1,1,0.3,0.303,0.7,0.1642,0,3,3 +15948,2012-11-02,4,1,11,4,0,5,1,1,0.3,0.3182,0.7,0.1045,1,6,7 +15949,2012-11-02,4,1,11,5,0,5,1,2,0.32,0.3182,0.66,0.1642,1,25,26 +15950,2012-11-02,4,1,11,6,0,5,1,1,0.3,0.303,0.7,0.1642,2,120,122 +15951,2012-11-02,4,1,11,7,0,5,1,2,0.3,0.303,0.7,0.1343,8,349,357 +15952,2012-11-02,4,1,11,8,0,5,1,1,0.32,0.3333,0.57,0.1045,31,656,687 +15953,2012-11-02,4,1,11,9,0,5,1,1,0.34,0.3485,0.53,0.1045,27,355,382 +15954,2012-11-02,4,1,11,10,0,5,1,1,0.38,0.3939,0.46,0.2985,21,183,204 +15955,2012-11-02,4,1,11,11,0,5,1,1,0.42,0.4242,0.41,0.4179,42,179,221 
+15956,2012-11-02,4,1,11,12,0,5,1,1,0.42,0.4242,0.41,0.3582,52,240,292 +15957,2012-11-02,4,1,11,13,0,5,1,1,0.4,0.4091,0.43,0.3582,64,230,294 +15958,2012-11-02,4,1,11,14,0,5,1,2,0.4,0.4091,0.4,0.4925,63,199,262 +15959,2012-11-02,4,1,11,15,0,5,1,2,0.4,0.4091,0.4,0.4179,51,255,306 +15960,2012-11-02,4,1,11,16,0,5,1,1,0.38,0.3939,0.4,0.3284,48,373,421 +15961,2012-11-02,4,1,11,17,0,5,1,1,0.38,0.3939,0.4,0.3284,57,581,638 +15962,2012-11-02,4,1,11,18,0,5,1,2,0.38,0.3939,0.43,0.3284,32,490,522 +15963,2012-11-02,4,1,11,19,0,5,1,2,0.38,0.3939,0.43,0.2985,38,336,374 +15964,2012-11-02,4,1,11,20,0,5,1,2,0.36,0.3333,0.46,0.4179,14,207,221 +15965,2012-11-02,4,1,11,21,0,5,1,2,0.36,0.3333,0.46,0.3582,23,133,156 +15966,2012-11-02,4,1,11,22,0,5,1,1,0.36,0.3333,0.46,0.2537,14,135,149 +15967,2012-11-02,4,1,11,23,0,5,1,1,0.34,0.303,0.53,0.2985,11,108,119 +15968,2012-11-03,4,1,11,0,0,6,0,1,0.34,0.3333,0.53,0.1642,9,99,108 +15969,2012-11-03,4,1,11,1,0,6,0,2,0.34,0.303,0.49,0.3284,6,83,89 +15970,2012-11-03,4,1,11,2,0,6,0,2,0.34,0.3182,0.49,0.2537,10,36,46 +15971,2012-11-03,4,1,11,3,0,6,0,2,0.34,0.3182,0.49,0.2537,6,22,28 +15972,2012-11-03,4,1,11,4,0,6,0,2,0.32,0.303,0.49,0.2985,8,8,16 +15973,2012-11-03,4,1,11,5,0,6,0,2,0.32,0.303,0.49,0.2537,1,8,9 +15974,2012-11-03,4,1,11,6,0,6,0,2,0.32,0.3182,0.49,0.194,4,17,21 +15975,2012-11-03,4,1,11,7,0,6,0,2,0.32,0.303,0.49,0.2537,1,58,59 +15976,2012-11-03,4,1,11,8,0,6,0,2,0.34,0.303,0.46,0.2985,10,132,142 +15977,2012-11-03,4,1,11,9,0,6,0,2,0.34,0.303,0.49,0.2985,31,188,219 +15978,2012-11-03,4,1,11,10,0,6,0,1,0.36,0.3333,0.46,0.3881,68,260,328 +15979,2012-11-03,4,1,11,11,0,6,0,2,0.36,0.3333,0.46,0.3881,56,284,340 +15980,2012-11-03,4,1,11,12,0,6,0,2,0.36,0.3182,0.46,0.4627,74,320,394 +15981,2012-11-03,4,1,11,13,0,6,0,2,0.36,0.3333,0.46,0.3582,110,339,449 +15982,2012-11-03,4,1,11,14,0,6,0,2,0.36,0.3333,0.5,0.3284,136,319,455 +15983,2012-11-03,4,1,11,15,0,6,0,2,0.36,0.3182,0.46,0.4478,117,331,448 
+15984,2012-11-03,4,1,11,16,0,6,0,2,0.36,0.3333,0.46,0.3284,108,292,400 +15985,2012-11-03,4,1,11,17,0,6,0,2,0.36,0.3333,0.46,0.2985,112,298,410 +15986,2012-11-03,4,1,11,18,0,6,0,2,0.36,0.3333,0.46,0.2836,58,239,297 +15987,2012-11-03,4,1,11,19,0,6,0,1,0.36,0.3485,0.5,0.1642,22,217,239 +15988,2012-11-03,4,1,11,20,0,6,0,1,0.34,0.3333,0.53,0.1343,23,158,181 +15989,2012-11-03,4,1,11,21,0,6,0,1,0.34,0.3485,0.53,0.1045,31,135,166 +15990,2012-11-03,4,1,11,22,0,6,0,1,0.32,0.3333,0.57,0.1045,13,133,146 +15991,2012-11-03,4,1,11,23,0,6,0,1,0.32,0.3333,0.57,0.1045,15,133,148 +15992,2012-11-04,4,1,11,0,0,0,0,1,0.3,0.3182,0.61,0.1045,5,97,102 +15993,2012-11-04,4,1,11,1,0,0,0,1,0.26,0.2879,0.81,0.0896,26,139,165 +15994,2012-11-04,4,1,11,2,0,0,0,1,0.3,0.3333,0.65,0,6,31,37 +15995,2012-11-04,4,1,11,3,0,0,0,1,0.28,0.2879,0.61,0.1045,1,10,11 +15996,2012-11-04,4,1,11,4,0,0,0,1,0.26,0.2727,0.65,0.1343,2,7,9 +15997,2012-11-04,4,1,11,5,0,0,0,1,0.26,0.2727,0.65,0.1045,0,5,5 +15998,2012-11-04,4,1,11,6,0,0,0,1,0.28,0.2879,0.61,0.1343,2,14,16 +15999,2012-11-04,4,1,11,7,0,0,0,1,0.26,0.2727,0.65,0.1045,11,39,50 +16000,2012-11-04,4,1,11,8,0,0,0,1,0.3,0.303,0.56,0.1343,34,115,149 +16001,2012-11-04,4,1,11,9,0,0,0,1,0.32,0.3333,0.53,0,56,161,217 +16002,2012-11-04,4,1,11,10,0,0,0,1,0.34,0.3333,0.49,0.194,73,287,360 +16003,2012-11-04,4,1,11,11,0,0,0,1,0.38,0.3939,0.46,0.2239,134,311,445 +16004,2012-11-04,4,1,11,12,0,0,0,1,0.38,0.3939,0.46,0.2239,150,354,504 +16005,2012-11-04,4,1,11,13,0,0,0,1,0.4,0.4091,0.43,0.2239,122,371,493 +16006,2012-11-04,4,1,11,14,0,0,0,1,0.4,0.4091,0.4,0.194,149,360,509 +16007,2012-11-04,4,1,11,15,0,0,0,1,0.4,0.4091,0.4,0.2239,149,300,449 +16008,2012-11-04,4,1,11,16,0,0,0,1,0.4,0.4091,0.4,0.3284,119,316,435 +16009,2012-11-04,4,1,11,17,0,0,0,1,0.36,0.3485,0.43,0.194,57,270,327 +16010,2012-11-04,4,1,11,18,0,0,0,1,0.36,0.3485,0.43,0.2239,31,206,237 +16011,2012-11-04,4,1,11,19,0,0,0,1,0.36,0.3333,0.43,0.2537,24,203,227 
+16012,2012-11-04,4,1,11,20,0,0,0,1,0.32,0.2879,0.53,0.3582,7,134,141 +16013,2012-11-04,4,1,11,21,0,0,0,1,0.3,0.2727,0.52,0.2985,16,71,87 +16014,2012-11-04,4,1,11,22,0,0,0,1,0.3,0.2879,0.52,0.2836,15,64,79 +16015,2012-11-04,4,1,11,23,0,0,0,2,0.3,0.303,0.56,0.1642,12,41,53 +16016,2012-11-05,4,1,11,0,0,1,1,2,0.3,0.2879,0.56,0.194,3,20,23 +16017,2012-11-05,4,1,11,1,0,1,1,2,0.3,0.303,0.56,0.1642,0,8,8 +16018,2012-11-05,4,1,11,2,0,1,1,2,0.3,0.303,0.56,0.1642,2,4,6 +16019,2012-11-05,4,1,11,3,0,1,1,2,0.3,0.2879,0.56,0.2239,1,3,4 +16020,2012-11-05,4,1,11,4,0,1,1,2,0.3,0.2879,0.52,0.2537,5,11,16 +16021,2012-11-05,4,1,11,5,0,1,1,2,0.3,0.2727,0.52,0.3284,1,38,39 +16022,2012-11-05,4,1,11,6,0,1,1,2,0.3,0.2879,0.49,0.2836,4,135,139 +16023,2012-11-05,4,1,11,7,0,1,1,2,0.3,0.2879,0.49,0.2537,8,453,461 +16024,2012-11-05,4,1,11,8,0,1,1,2,0.3,0.2879,0.49,0.2239,19,629,648 +16025,2012-11-05,4,1,11,9,0,1,1,2,0.3,0.2879,0.52,0.2836,18,239,257 +16026,2012-11-05,4,1,11,10,0,1,1,1,0.32,0.303,0.49,0.2836,30,112,142 +16027,2012-11-05,4,1,11,11,0,1,1,1,0.34,0.3182,0.46,0.2239,31,119,150 +16028,2012-11-05,4,1,11,12,0,1,1,1,0.36,0.3333,0.43,0.3284,38,206,244 +16029,2012-11-05,4,1,11,13,0,1,1,1,0.36,0.3333,0.4,0.2537,39,183,222 +16030,2012-11-05,4,1,11,14,0,1,1,1,0.38,0.3939,0.4,0.1642,31,165,196 +16031,2012-11-05,4,1,11,15,0,1,1,1,0.38,0.3939,0.4,0.2836,24,207,231 +16032,2012-11-05,4,1,11,16,0,1,1,1,0.36,0.3333,0.43,0.2836,35,325,360 +16033,2012-11-05,4,1,11,17,0,1,1,1,0.34,0.3182,0.46,0.2239,34,604,638 +16034,2012-11-05,4,1,11,18,0,1,1,1,0.34,0.3182,0.46,0.2239,19,523,542 +16035,2012-11-05,4,1,11,19,0,1,1,1,0.32,0.3182,0.49,0.1642,11,361,372 +16036,2012-11-05,4,1,11,20,0,1,1,1,0.32,0.303,0.49,0.2537,13,228,241 +16037,2012-11-05,4,1,11,21,0,1,1,1,0.3,0.2879,0.52,0.2836,8,144,152 +16038,2012-11-05,4,1,11,22,0,1,1,1,0.28,0.2727,0.56,0.194,4,111,115 +16039,2012-11-05,4,1,11,23,0,1,1,1,0.26,0.2727,0.6,0.1343,0,53,53 +16040,2012-11-06,4,1,11,0,0,2,1,1,0.24,0.2576,0.6,0.1045,1,18,19 
+16041,2012-11-06,4,1,11,1,0,2,1,1,0.24,0.2424,0.6,0.1642,0,8,8 +16042,2012-11-06,4,1,11,2,0,2,1,1,0.24,0.2576,0.65,0.0896,0,2,2 +16043,2012-11-06,4,1,11,3,0,2,1,1,0.22,0.2424,0.64,0.1045,0,4,4 +16044,2012-11-06,4,1,11,4,0,2,1,1,0.22,0.2576,0.69,0.0896,0,7,7 +16045,2012-11-06,4,1,11,5,0,2,1,2,0.22,0.2273,0.64,0.194,1,40,41 +16046,2012-11-06,4,1,11,6,0,2,1,1,0.22,0.2273,0.69,0.1343,6,143,149 +16047,2012-11-06,4,1,11,7,0,2,1,1,0.2,0.2121,0.69,0.1642,9,378,387 +16048,2012-11-06,4,1,11,8,0,2,1,1,0.22,0.2273,0.69,0.1642,20,568,588 +16049,2012-11-06,4,1,11,9,0,2,1,1,0.26,0.2576,0.6,0.1642,25,338,363 +16050,2012-11-06,4,1,11,10,0,2,1,1,0.3,0.2879,0.49,0.194,41,189,230 +16051,2012-11-06,4,1,11,11,0,2,1,1,0.32,0.3333,0.45,0.1343,39,177,216 +16052,2012-11-06,4,1,11,12,0,2,1,1,0.32,0.3333,0.45,0.1343,31,217,248 +16053,2012-11-06,4,1,11,13,0,2,1,1,0.34,0.3333,0.46,0.1343,46,232,278 +16054,2012-11-06,4,1,11,14,0,2,1,1,0.36,0.3636,0.4,0.1045,44,196,240 +16055,2012-11-06,4,1,11,15,0,2,1,1,0.34,0.3182,0.46,0.2537,45,227,272 +16056,2012-11-06,4,1,11,16,0,2,1,1,0.34,0.3333,0.46,0.1642,33,369,402 +16057,2012-11-06,4,1,11,17,0,2,1,1,0.32,0.303,0.53,0.2239,30,597,627 +16058,2012-11-06,4,1,11,18,0,2,1,1,0.32,0.303,0.57,0.2985,20,477,497 +16059,2012-11-06,4,1,11,19,0,2,1,1,0.3,0.2879,0.56,0.2239,15,356,371 +16060,2012-11-06,4,1,11,20,0,2,1,1,0.3,0.2879,0.61,0.2239,10,218,228 +16061,2012-11-06,4,1,11,21,0,2,1,2,0.3,0.2879,0.56,0.2537,14,137,151 +16062,2012-11-06,4,1,11,22,0,2,1,2,0.3,0.2879,0.56,0.2239,17,136,153 +16063,2012-11-06,4,1,11,23,0,2,1,1,0.3,0.2879,0.56,0.2239,19,186,205 +16064,2012-11-07,4,1,11,0,0,3,1,2,0.3,0.2879,0.56,0.2836,49,234,283 +16065,2012-11-07,4,1,11,1,0,3,1,2,0.28,0.2727,0.61,0.2537,6,86,92 +16066,2012-11-07,4,1,11,2,0,3,1,2,0.28,0.2576,0.56,0.2985,6,68,74 +16067,2012-11-07,4,1,11,3,0,3,1,2,0.28,0.2576,0.52,0.3284,2,9,11 +16068,2012-11-07,4,1,11,4,0,3,1,2,0.28,0.2576,0.52,0.2985,0,9,9 +16069,2012-11-07,4,1,11,5,0,3,1,2,0.28,0.2576,0.52,0.3284,0,27,27 
+16070,2012-11-07,4,1,11,6,0,3,1,2,0.26,0.2424,0.56,0.2836,3,115,118 +16071,2012-11-07,4,1,11,7,0,3,1,2,0.26,0.2273,0.56,0.2985,5,314,319 +16072,2012-11-07,4,1,11,8,0,3,1,2,0.26,0.2424,0.6,0.2836,18,583,601 +16073,2012-11-07,4,1,11,9,0,3,1,2,0.26,0.2424,0.6,0.2836,17,351,368 +16074,2012-11-07,4,1,11,10,0,3,1,2,0.28,0.2576,0.56,0.2985,27,151,178 +16075,2012-11-07,4,1,11,11,0,3,1,2,0.32,0.2879,0.51,0.3582,15,132,147 +16076,2012-11-07,4,1,11,12,0,3,1,2,0.32,0.303,0.53,0.2836,24,188,212 +16077,2012-11-07,4,1,11,13,0,3,1,2,0.32,0.303,0.53,0.2537,23,158,181 +16078,2012-11-07,4,1,11,14,0,3,1,2,0.32,0.2879,0.53,0.3582,19,142,161 +16079,2012-11-07,4,1,11,15,0,3,1,2,0.32,0.303,0.53,0.2985,20,178,198 +16080,2012-11-07,4,1,11,16,0,3,1,2,0.32,0.2879,0.53,0.3582,23,250,273 +16081,2012-11-07,4,1,11,17,0,3,1,2,0.32,0.2879,0.53,0.3881,16,501,517 +16082,2012-11-07,4,1,11,18,0,3,1,2,0.32,0.2879,0.53,0.3582,17,448,465 +16083,2012-11-07,4,1,11,19,0,3,1,2,0.3,0.2727,0.56,0.2985,17,302,319 +16084,2012-11-07,4,1,11,20,0,3,1,2,0.32,0.303,0.49,0.3284,7,249,256 +16085,2012-11-07,4,1,11,21,0,3,1,2,0.32,0.303,0.49,0.2985,5,121,126 +16086,2012-11-07,4,1,11,22,0,3,1,3,0.3,0.2879,0.56,0.2537,4,56,60 +16087,2012-11-07,4,1,11,23,0,3,1,3,0.28,0.2727,0.65,0.2239,3,37,40 +16088,2012-11-08,4,1,11,0,0,4,1,3,0.28,0.2727,0.61,0.2239,1,14,15 +16089,2012-11-08,4,1,11,1,0,4,1,3,0.3,0.2727,0.52,0.3582,1,11,12 +16090,2012-11-08,4,1,11,2,0,4,1,2,0.3,0.2879,0.49,0.2239,1,5,6 +16091,2012-11-08,4,1,11,4,0,4,1,2,0.3,0.2879,0.45,0.2836,1,9,10 +16092,2012-11-08,4,1,11,5,0,4,1,2,0.3,0.2879,0.42,0.2239,0,35,35 +16093,2012-11-08,4,1,11,6,0,4,1,1,0.3,0.2879,0.39,0.2239,2,122,124 +16094,2012-11-08,4,1,11,7,0,4,1,1,0.28,0.2576,0.36,0.3881,12,411,423 +16095,2012-11-08,4,1,11,8,0,4,1,1,0.3,0.2879,0.33,0.2836,16,652,668 +16096,2012-11-08,4,1,11,9,0,4,1,1,0.32,0.2879,0.31,0.3582,22,253,275 +16097,2012-11-08,4,1,11,10,0,4,1,1,0.32,0.303,0.31,0.3284,13,148,161 
+16098,2012-11-08,4,1,11,11,0,4,1,1,0.36,0.3333,0.29,0.3881,16,155,171 +16099,2012-11-08,4,1,11,12,0,4,1,1,0.4,0.4091,0.24,0.4925,33,202,235 +16100,2012-11-08,4,1,11,13,0,4,1,1,0.44,0.4394,0.18,0.4179,33,195,228 +16101,2012-11-08,4,1,11,14,0,4,1,1,0.44,0.4394,0.18,0.4179,33,149,182 +16102,2012-11-08,4,1,11,15,0,4,1,1,0.44,0.4394,0.18,0.4179,32,201,233 +16103,2012-11-08,4,1,11,16,0,4,1,1,0.42,0.4242,0.16,0.3881,18,321,339 +16104,2012-11-08,4,1,11,17,0,4,1,1,0.4,0.4091,0.2,0.2985,36,556,592 +16105,2012-11-08,4,1,11,18,0,4,1,1,0.4,0.4091,0.24,0.4179,16,491,507 +16106,2012-11-08,4,1,11,19,0,4,1,1,0.38,0.3939,0.27,0.3582,14,359,373 +16107,2012-11-08,4,1,11,20,0,4,1,1,0.38,0.3939,0.29,0.4627,10,235,245 +16108,2012-11-08,4,1,11,21,0,4,1,1,0.36,0.3333,0.37,0.3881,11,222,233 +16109,2012-11-08,4,1,11,22,0,4,1,1,0.34,0.303,0.42,0.2985,14,147,161 +16110,2012-11-08,4,1,11,23,0,4,1,1,0.34,0.303,0.46,0.3582,5,82,87 +16111,2012-11-09,4,1,11,0,0,5,1,1,0.34,0.303,0.49,0.3284,9,46,55 +16112,2012-11-09,4,1,11,1,0,5,1,1,0.32,0.303,0.53,0.2836,3,17,20 +16113,2012-11-09,4,1,11,2,0,5,1,1,0.32,0.303,0.53,0.2537,1,11,12 +16114,2012-11-09,4,1,11,3,0,5,1,1,0.32,0.2879,0.53,0.4179,3,6,9 +16115,2012-11-09,4,1,11,4,0,5,1,1,0.32,0.2879,0.53,0.3582,0,14,14 +16116,2012-11-09,4,1,11,5,0,5,1,1,0.3,0.2879,0.56,0.194,2,25,27 +16117,2012-11-09,4,1,11,6,0,5,1,1,0.26,0.2727,0.65,0.1045,5,126,131 +16118,2012-11-09,4,1,11,7,0,5,1,1,0.26,0.2727,0.65,0.1343,9,332,341 +16119,2012-11-09,4,1,11,8,0,5,1,1,0.32,0.3182,0.57,0.1642,25,668,693 +16120,2012-11-09,4,1,11,9,0,5,1,1,0.34,0.303,0.53,0.4179,23,304,327 +16121,2012-11-09,4,1,11,10,0,5,1,1,0.36,0.3333,0.5,0.3284,34,163,197 +16122,2012-11-09,4,1,11,11,0,5,1,1,0.4,0.4091,0.47,0.2985,66,185,251 +16123,2012-11-09,4,1,11,12,0,5,1,1,0.44,0.4394,0.41,0.3284,48,214,262 +16124,2012-11-09,4,1,11,13,0,5,1,1,0.46,0.4545,0.38,0.2985,62,237,299 +16125,2012-11-09,4,1,11,14,0,5,1,1,0.46,0.4545,0.38,0.194,67,207,274 
+16126,2012-11-09,4,1,11,15,0,5,1,1,0.46,0.4545,0.36,0.2239,78,278,356 +16127,2012-11-09,4,1,11,16,0,5,1,1,0.46,0.4545,0.38,0.1642,57,377,434 +16128,2012-11-09,4,1,11,17,0,5,1,1,0.42,0.4242,0.44,0.1343,61,593,654 +16129,2012-11-09,4,1,11,18,0,5,1,1,0.42,0.4242,0.44,0.1045,46,450,496 +16130,2012-11-09,4,1,11,19,0,5,1,1,0.34,0.3485,0.71,0.0896,30,331,361 +16131,2012-11-09,4,1,11,20,0,5,1,1,0.36,0.3485,0.66,0.1343,20,235,255 +16132,2012-11-09,4,1,11,21,0,5,1,1,0.34,0.3636,0.76,0,23,176,199 +16133,2012-11-09,4,1,11,22,0,5,1,1,0.34,0.3485,0.71,0.1045,17,146,163 +16134,2012-11-09,4,1,11,23,0,5,1,1,0.32,0.3333,0.81,0.0896,20,142,162 +16135,2012-11-10,4,1,11,0,0,6,0,1,0.32,0.3333,0.76,0.0896,16,106,122 +16136,2012-11-10,4,1,11,1,0,6,0,1,0.3,0.3182,0.81,0.0896,8,68,76 +16137,2012-11-10,4,1,11,2,0,6,0,1,0.32,0.3485,0.81,0,2,42,44 +16138,2012-11-10,4,1,11,3,0,6,0,1,0.3,0.3333,0.81,0,9,11,20 +16139,2012-11-10,4,1,11,4,0,6,0,1,0.26,0.2727,0.87,0.1045,2,4,6 +16140,2012-11-10,4,1,11,5,0,6,0,1,0.28,0.3182,0.87,0,1,9,10 +16141,2012-11-10,4,1,11,6,0,6,0,1,0.24,0.2576,0.93,0.0896,5,11,16 +16142,2012-11-10,4,1,11,7,0,6,0,1,0.26,0.2727,0.87,0.1045,14,57,71 +16143,2012-11-10,4,1,11,8,0,6,0,1,0.32,0.3485,0.76,0,16,130,146 +16144,2012-11-10,4,1,11,9,0,6,0,1,0.36,0.3485,0.66,0.194,43,216,259 +16145,2012-11-10,4,1,11,10,0,6,0,1,0.4,0.4091,0.58,0,86,264,350 +16146,2012-11-10,4,1,11,11,0,6,0,1,0.4,0.4091,0.58,0.1045,143,323,466 +16147,2012-11-10,4,1,11,12,0,6,0,1,0.48,0.4697,0.48,0,258,348,606 +16148,2012-11-10,4,1,11,13,0,6,0,1,0.52,0.5,0.39,0,268,383,651 +16149,2012-11-10,4,1,11,14,0,6,0,1,0.54,0.5152,0.37,0,280,347,627 +16150,2012-11-10,4,1,11,15,0,6,0,1,0.52,0.5,0.45,0,216,351,567 +16151,2012-11-10,4,1,11,16,0,6,0,1,0.54,0.5152,0.37,0,227,378,605 +16152,2012-11-10,4,1,11,17,0,6,0,1,0.5,0.4848,0.45,0.194,183,318,501 +16153,2012-11-10,4,1,11,18,0,6,0,1,0.44,0.4394,0.54,0.1343,103,256,359 +16154,2012-11-10,4,1,11,19,0,6,0,1,0.46,0.4545,0.51,0.0896,55,228,283 
+16155,2012-11-10,4,1,11,20,0,6,0,1,0.44,0.4394,0.54,0,39,153,192 +16156,2012-11-10,4,1,11,21,0,6,0,1,0.4,0.4091,0.66,0,51,169,220 +16157,2012-11-10,4,1,11,22,0,6,0,1,0.38,0.3939,0.71,0.0896,39,154,193 +16158,2012-11-10,4,1,11,23,0,6,0,1,0.36,0.3636,0.71,0.1045,26,120,146 +16159,2012-11-11,4,1,11,0,0,0,0,1,0.34,0.3485,0.81,0.0896,14,110,124 +16160,2012-11-11,4,1,11,1,0,0,0,1,0.34,0.3485,0.76,0.0896,26,82,108 +16161,2012-11-11,4,1,11,2,0,0,0,1,0.32,0.3333,0.87,0.1045,12,58,70 +16162,2012-11-11,4,1,11,3,0,0,0,1,0.34,0.3636,0.81,0,9,39,48 +16163,2012-11-11,4,1,11,4,0,0,0,1,0.32,0.3333,0.81,0.1045,5,6,11 +16164,2012-11-11,4,1,11,5,0,0,0,1,0.3,0.303,0.87,0.1343,0,12,12 +16165,2012-11-11,4,1,11,6,0,0,0,1,0.3,0.3182,0.87,0.0896,3,16,19 +16166,2012-11-11,4,1,11,7,0,0,0,1,0.3,0.3182,0.89,0.0896,12,56,68 +16167,2012-11-11,4,1,11,8,0,0,0,1,0.32,0.3333,0.87,0.1045,32,87,119 +16168,2012-11-11,4,1,11,9,0,0,0,1,0.36,0.3788,0.77,0,94,179,273 +16169,2012-11-11,4,1,11,10,0,0,0,1,0.4,0.4091,0.69,0.1045,133,272,405 +16170,2012-11-11,4,1,11,11,0,0,0,1,0.46,0.4545,0.59,0.1343,180,324,504 +16171,2012-11-11,4,1,11,12,0,0,0,1,0.5,0.4848,0.48,0.0896,195,390,585 +16172,2012-11-11,4,1,11,13,0,0,0,1,0.54,0.5152,0.45,0.1045,262,424,686 +16173,2012-11-11,4,1,11,14,0,0,0,1,0.6,0.6212,0.33,0.1642,292,362,654 +16174,2012-11-11,4,1,11,15,0,0,0,1,0.56,0.5303,0.37,0.2239,304,420,724 +16175,2012-11-11,4,1,11,16,0,0,0,1,0.54,0.5152,0.42,0.2239,260,393,653 +16176,2012-11-11,4,1,11,17,0,0,0,1,0.54,0.5152,0.45,0.1642,151,342,493 +16177,2012-11-11,4,1,11,18,0,0,0,1,0.52,0.5,0.45,0.1343,102,303,405 +16178,2012-11-11,4,1,11,19,0,0,0,1,0.48,0.4697,0.55,0.1343,69,208,277 +16179,2012-11-11,4,1,11,20,0,0,0,1,0.44,0.4394,0.62,0.1642,54,146,200 +16180,2012-11-11,4,1,11,21,0,0,0,1,0.44,0.4394,0.67,0.2239,44,127,171 +16181,2012-11-11,4,1,11,22,0,0,0,1,0.42,0.4242,0.71,0.194,21,113,134 +16182,2012-11-11,4,1,11,23,0,0,0,1,0.42,0.4242,0.71,0.194,16,93,109 
+16183,2012-11-12,4,1,11,0,1,1,0,1,0.42,0.4242,0.71,0.1045,6,43,49 +16184,2012-11-12,4,1,11,1,1,1,0,1,0.4,0.4091,0.76,0.1642,4,26,30 +16185,2012-11-12,4,1,11,2,1,1,0,1,0.4,0.4091,0.76,0.1343,6,14,20 +16186,2012-11-12,4,1,11,3,1,1,0,1,0.4,0.4091,0.76,0.1343,1,3,4 +16187,2012-11-12,4,1,11,4,1,1,0,1,0.4,0.4091,0.76,0.0896,1,3,4 +16188,2012-11-12,4,1,11,5,1,1,0,1,0.38,0.3939,0.87,0,1,23,24 +16189,2012-11-12,4,1,11,6,1,1,0,1,0.38,0.3939,0.87,0.1045,7,64,71 +16190,2012-11-12,4,1,11,7,1,1,0,1,0.4,0.4091,0.87,0,16,248,264 +16191,2012-11-12,4,1,11,8,1,1,0,1,0.42,0.4242,0.82,0.1642,50,490,540 +16192,2012-11-12,4,1,11,9,1,1,0,1,0.44,0.4394,0.88,0.1045,60,337,397 +16193,2012-11-12,4,1,11,10,1,1,0,1,0.48,0.4697,0.77,0.1642,82,184,266 +16194,2012-11-12,4,1,11,11,1,1,0,1,0.52,0.5,0.72,0.1642,112,255,367 +16195,2012-11-12,4,1,11,12,1,1,0,1,0.56,0.5303,0.64,0.2239,105,314,419 +16196,2012-11-12,4,1,11,13,1,1,0,1,0.6,0.6061,0.6,0.2239,108,312,420 +16197,2012-11-12,4,1,11,14,1,1,0,1,0.58,0.5455,0.6,0.2836,134,310,444 +16198,2012-11-12,4,1,11,15,1,1,0,1,0.56,0.5303,0.64,0.2537,102,280,382 +16199,2012-11-12,4,1,11,16,1,1,0,1,0.56,0.5303,0.64,0.1642,87,347,434 +16200,2012-11-12,4,1,11,17,1,1,0,1,0.56,0.5303,0.64,0.2836,66,530,596 +16201,2012-11-12,4,1,11,18,1,1,0,1,0.52,0.5,0.72,0.1343,65,486,551 +16202,2012-11-12,4,1,11,19,1,1,0,1,0.54,0.5152,0.73,0.2836,30,323,353 +16203,2012-11-12,4,1,11,20,1,1,0,2,0.52,0.5,0.77,0.2836,31,273,304 +16204,2012-11-12,4,1,11,21,1,1,0,3,0.54,0.5152,0.73,0.2239,10,145,155 +16205,2012-11-12,4,1,11,22,1,1,0,1,0.52,0.5,0.77,0.2537,12,100,112 +16206,2012-11-12,4,1,11,23,1,1,0,2,0.54,0.5152,0.77,0.2239,1,62,63 +16207,2012-11-13,4,1,11,0,0,2,1,2,0.52,0.5,0.83,0.2836,5,18,23 +16208,2012-11-13,4,1,11,1,0,2,1,3,0.44,0.4394,0.88,0.6418,0,5,5 +16209,2012-11-13,4,1,11,2,0,2,1,3,0.36,0.3182,0.87,0.4478,7,4,11 +16210,2012-11-13,4,1,11,3,0,2,1,3,0.36,0.3333,0.87,0.3582,2,3,5 +16211,2012-11-13,4,1,11,4,0,2,1,2,0.36,0.3333,0.81,0.3881,0,9,9 
+16212,2012-11-13,4,1,11,5,0,2,1,2,0.34,0.3182,0.87,0.2836,0,18,18 +16213,2012-11-13,4,1,11,6,0,2,1,3,0.32,0.2879,0.81,0.4627,2,48,50 +16214,2012-11-13,4,1,11,7,0,2,1,3,0.32,0.303,0.87,0.2537,1,106,107 +16215,2012-11-13,4,1,11,8,0,2,1,3,0.32,0.303,0.87,0.2537,4,207,211 +16216,2012-11-13,4,1,11,9,0,2,1,3,0.32,0.303,0.81,0.3284,1,109,110 +16217,2012-11-13,4,1,11,10,0,2,1,3,0.3,0.2727,0.75,0.3881,10,84,94 +16218,2012-11-13,4,1,11,11,0,2,1,1,0.32,0.303,0.7,0.2537,22,133,155 +16219,2012-11-13,4,1,11,12,0,2,1,1,0.34,0.3182,0.61,0.2836,16,180,196 +16220,2012-11-13,4,1,11,13,0,2,1,1,0.34,0.3182,0.53,0.2537,30,188,218 +16221,2012-11-13,4,1,11,14,0,2,1,1,0.38,0.3939,0.46,0.3881,34,169,203 +16222,2012-11-13,4,1,11,15,0,2,1,1,0.4,0.4091,0.4,0.2985,33,184,217 +16223,2012-11-13,4,1,11,16,0,2,1,1,0.38,0.3939,0.4,0.2985,28,282,310 +16224,2012-11-13,4,1,11,17,0,2,1,1,0.34,0.303,0.49,0.3582,33,575,608 +16225,2012-11-13,4,1,11,18,0,2,1,1,0.34,0.3182,0.49,0.2836,45,514,559 +16226,2012-11-13,4,1,11,19,0,2,1,1,0.32,0.2879,0.49,0.3582,12,344,356 +16227,2012-11-13,4,1,11,20,0,2,1,1,0.3,0.2727,0.49,0.3582,12,211,223 +16228,2012-11-13,4,1,11,21,0,2,1,1,0.3,0.2727,0.49,0.3582,9,178,187 +16229,2012-11-13,4,1,11,22,0,2,1,1,0.26,0.2273,0.56,0.3284,15,120,135 +16230,2012-11-13,4,1,11,23,0,2,1,1,0.26,0.2273,0.56,0.2985,6,78,84 +16231,2012-11-14,4,1,11,0,0,3,1,1,0.26,0.2424,0.56,0.2836,4,20,24 +16232,2012-11-14,4,1,11,1,0,3,1,1,0.24,0.2121,0.6,0.2985,2,10,12 +16233,2012-11-14,4,1,11,2,0,3,1,1,0.24,0.2121,0.6,0.2985,0,1,1 +16234,2012-11-14,4,1,11,3,0,3,1,1,0.24,0.2273,0.65,0.2239,0,5,5 +16235,2012-11-14,4,1,11,4,0,3,1,1,0.22,0.2273,0.69,0.194,0,6,6 +16236,2012-11-14,4,1,11,5,0,3,1,1,0.22,0.2273,0.69,0.194,0,39,39 +16237,2012-11-14,4,1,11,6,0,3,1,1,0.24,0.2273,0.65,0.194,4,142,146 +16238,2012-11-14,4,1,11,7,0,3,1,1,0.24,0.2121,0.65,0.2836,10,405,415 +16239,2012-11-14,4,1,11,8,0,3,1,1,0.28,0.2576,0.56,0.2985,27,664,691 +16240,2012-11-14,4,1,11,9,0,3,1,1,0.3,0.2727,0.52,0.3881,22,310,332 
+16241,2012-11-14,4,1,11,10,0,3,1,1,0.32,0.2879,0.45,0.3582,18,153,171 +16242,2012-11-14,4,1,11,11,0,3,1,1,0.32,0.303,0.49,0.2537,25,126,151 +16243,2012-11-14,4,1,11,12,0,3,1,1,0.34,0.3333,0.46,0,40,200,240 +16244,2012-11-14,4,1,11,13,0,3,1,1,0.36,0.3333,0.43,0.2537,32,182,214 +16245,2012-11-14,4,1,11,14,0,3,1,1,0.36,0.3485,0.43,0.1343,20,161,181 +16246,2012-11-14,4,1,11,15,0,3,1,1,0.36,0.3333,0.43,0.2836,32,228,260 +16247,2012-11-14,4,1,11,16,0,3,1,1,0.34,0.3485,0.46,0.1045,31,290,321 +16248,2012-11-14,4,1,11,17,0,3,1,1,0.32,0.3182,0.49,0.194,19,564,583 +16249,2012-11-14,4,1,11,18,0,3,1,1,0.32,0.3182,0.53,0.1642,24,543,567 +16250,2012-11-14,4,1,11,19,0,3,1,1,0.3,0.303,0.52,0.1642,28,368,396 +16251,2012-11-14,4,1,11,20,0,3,1,1,0.3,0.3182,0.52,0.0896,15,252,267 +16252,2012-11-14,4,1,11,21,0,3,1,1,0.3,0.3333,0.56,0,5,208,213 +16253,2012-11-14,4,1,11,22,0,3,1,1,0.28,0.3182,0.61,0,6,167,173 +16254,2012-11-14,4,1,11,23,0,3,1,1,0.24,0.2424,0.7,0.1343,9,78,87 +16255,2012-11-15,4,1,11,0,0,4,1,2,0.26,0.303,0.65,0,1,33,34 +16256,2012-11-15,4,1,11,1,0,4,1,2,0.26,0.2727,0.65,0.1045,2,14,16 +16257,2012-11-15,4,1,11,2,0,4,1,2,0.26,0.2576,0.7,0.1642,0,7,7 +16258,2012-11-15,4,1,11,3,0,4,1,2,0.26,0.2879,0.7,0.0896,0,2,2 +16259,2012-11-15,4,1,11,4,0,4,1,2,0.28,0.2879,0.65,0.1343,0,5,5 +16260,2012-11-15,4,1,11,5,0,4,1,2,0.3,0.2879,0.61,0.2239,3,34,37 +16261,2012-11-15,4,1,11,6,0,4,1,2,0.3,0.2879,0.65,0.2537,1,146,147 +16262,2012-11-15,4,1,11,7,0,4,1,2,0.3,0.2879,0.65,0.194,7,403,410 +16263,2012-11-15,4,1,11,8,0,4,1,2,0.3,0.2879,0.61,0.2239,21,625,646 +16264,2012-11-15,4,1,11,9,0,4,1,2,0.32,0.3182,0.61,0.194,17,306,323 +16265,2012-11-15,4,1,11,10,0,4,1,2,0.32,0.3333,0.66,0.1343,11,142,153 +16266,2012-11-15,4,1,11,11,0,4,1,2,0.34,0.3333,0.61,0.1343,28,138,166 +16267,2012-11-15,4,1,11,12,0,4,1,2,0.36,0.3636,0.62,0.1045,29,184,213 +16268,2012-11-15,4,1,11,13,0,4,1,2,0.38,0.3939,0.54,0.1045,20,197,217 +16269,2012-11-15,4,1,11,14,0,4,1,2,0.38,0.3939,0.54,0.1642,27,174,201 
+16270,2012-11-15,4,1,11,15,0,4,1,2,0.38,0.3939,0.58,0.1642,26,191,217 +16271,2012-11-15,4,1,11,16,0,4,1,2,0.36,0.3485,0.57,0.194,24,318,342 +16272,2012-11-15,4,1,11,17,0,4,1,2,0.36,0.3485,0.57,0.2239,22,541,563 +16273,2012-11-15,4,1,11,18,0,4,1,2,0.36,0.3485,0.57,0.194,10,563,573 +16274,2012-11-15,4,1,11,19,0,4,1,2,0.34,0.3333,0.61,0.1343,11,375,386 +16275,2012-11-15,4,1,11,20,0,4,1,1,0.34,0.3485,0.61,0.1045,23,262,285 +16276,2012-11-15,4,1,11,21,0,4,1,2,0.32,0.3333,0.66,0.1045,20,204,224 +16277,2012-11-15,4,1,11,22,0,4,1,2,0.32,0.3182,0.66,0.1642,10,143,153 +16278,2012-11-15,4,1,11,23,0,4,1,2,0.32,0.3182,0.61,0.1642,7,118,125 +16279,2012-11-16,4,1,11,0,0,5,1,2,0.32,0.3333,0.66,0.1045,7,58,65 +16280,2012-11-16,4,1,11,1,0,5,1,2,0.3,0.3182,0.65,0.0896,5,16,21 +16281,2012-11-16,4,1,11,2,0,5,1,2,0.3,0.303,0.7,0.1642,0,9,9 +16282,2012-11-16,4,1,11,3,0,5,1,2,0.3,0.303,0.65,0.1343,0,6,6 +16283,2012-11-16,4,1,11,4,0,5,1,2,0.3,0.3182,0.65,0.1045,0,5,5 +16284,2012-11-16,4,1,11,5,0,5,1,2,0.3,0.3182,0.65,0.0896,2,34,36 +16285,2012-11-16,4,1,11,6,0,5,1,2,0.3,0.3182,0.61,0.0896,4,126,130 +16286,2012-11-16,4,1,11,7,0,5,1,2,0.3,0.303,0.61,0.1642,5,362,367 +16287,2012-11-16,4,1,11,8,0,5,1,2,0.32,0.3333,0.57,0.1343,17,694,711 +16288,2012-11-16,4,1,11,9,0,5,1,1,0.34,0.3485,0.53,0.1045,21,330,351 +16289,2012-11-16,4,1,11,10,0,5,1,1,0.34,0.3333,0.53,0.194,33,165,198 +16290,2012-11-16,4,1,11,11,0,5,1,1,0.38,0.3939,0.43,0.2239,33,185,218 +16291,2012-11-16,4,1,11,12,0,5,1,1,0.4,0.4091,0.43,0.2836,28,262,290 +16292,2012-11-16,4,1,11,13,0,5,1,1,0.42,0.4242,0.38,0.194,52,226,278 +16293,2012-11-16,4,1,11,14,0,5,1,1,0.42,0.4242,0.35,0.1045,47,204,251 +16294,2012-11-16,4,1,11,15,0,5,1,1,0.42,0.4242,0.38,0.1343,30,237,267 +16295,2012-11-16,4,1,11,16,0,5,1,1,0.42,0.4242,0.35,0.2836,33,350,383 +16296,2012-11-16,4,1,11,17,0,5,1,1,0.36,0.3485,0.46,0.194,41,539,580 +16297,2012-11-16,4,1,11,18,0,5,1,1,0.34,0.3333,0.53,0.1642,22,483,505 
+16298,2012-11-16,4,1,11,19,0,5,1,1,0.36,0.3485,0.46,0.2239,26,306,332 +16299,2012-11-16,4,1,11,20,0,5,1,1,0.36,0.3333,0.46,0.2537,20,207,227 +16300,2012-11-16,4,1,11,21,0,5,1,1,0.34,0.3333,0.49,0.194,21,157,178 +16301,2012-11-16,4,1,11,22,0,5,1,1,0.32,0.303,0.53,0.2537,23,139,162 +16302,2012-11-16,4,1,11,23,0,5,1,1,0.32,0.303,0.53,0.2239,14,114,128 +16303,2012-11-17,4,1,11,0,0,6,0,1,0.3,0.303,0.52,0.1642,11,95,106 +16304,2012-11-17,4,1,11,1,0,6,0,1,0.26,0.2576,0.6,0.194,13,74,87 +16305,2012-11-17,4,1,11,2,0,6,0,1,0.26,0.2576,0.65,0.194,8,41,49 +16306,2012-11-17,4,1,11,3,0,6,0,1,0.26,0.2576,0.65,0.194,2,19,21 +16307,2012-11-17,4,1,11,4,0,6,0,1,0.24,0.2273,0.7,0.194,1,6,7 +16308,2012-11-17,4,1,11,5,0,6,0,1,0.24,0.2424,0.7,0.1642,1,10,11 +16309,2012-11-17,4,1,11,6,0,6,0,1,0.24,0.2273,0.7,0.2239,0,21,21 +16310,2012-11-17,4,1,11,7,0,6,0,1,0.24,0.2273,0.7,0.194,8,70,78 +16311,2012-11-17,4,1,11,8,0,6,0,1,0.26,0.2576,0.65,0.2239,30,138,168 +16312,2012-11-17,4,1,11,9,0,6,0,1,0.34,0.3182,0.49,0.2239,48,200,248 +16313,2012-11-17,4,1,11,10,0,6,0,1,0.36,0.3333,0.46,0.2537,62,258,320 +16314,2012-11-17,4,1,11,11,0,6,0,1,0.38,0.3939,0.43,0.194,80,343,423 +16315,2012-11-17,4,1,11,12,0,6,0,1,0.4,0.4091,0.4,0.1642,117,359,476 +16316,2012-11-17,4,1,11,13,0,6,0,1,0.42,0.4242,0.38,0.2537,179,346,525 +16317,2012-11-17,4,1,11,14,0,6,0,1,0.42,0.4242,0.38,0.194,175,380,555 +16318,2012-11-17,4,1,11,15,0,6,0,1,0.42,0.4242,0.35,0.2985,175,374,549 +16319,2012-11-17,4,1,11,16,0,6,0,1,0.4,0.4091,0.4,0.2239,144,325,469 +16320,2012-11-17,4,1,11,17,0,6,0,1,0.38,0.3939,0.4,0.1642,101,302,403 +16321,2012-11-17,4,1,11,18,0,6,0,1,0.36,0.3636,0.5,0.1045,34,237,271 +16322,2012-11-17,4,1,11,19,0,6,0,1,0.34,0.3485,0.53,0.0896,45,208,253 +16323,2012-11-17,4,1,11,20,0,6,0,2,0.34,0.3485,0.66,0.0896,30,142,172 +16324,2012-11-17,4,1,11,21,0,6,0,2,0.32,0.3333,0.57,0.1045,15,124,139 +16325,2012-11-17,4,1,11,22,0,6,0,2,0.32,0.3333,0.57,0.1045,15,130,145 
+16326,2012-11-17,4,1,11,23,0,6,0,2,0.3,0.3182,0.7,0.1045,19,114,133 +16327,2012-11-18,4,1,11,0,0,0,0,2,0.3,0.303,0.7,0.1642,11,118,129 +16328,2012-11-18,4,1,11,1,0,0,0,1,0.3,0.2879,0.7,0.194,14,81,95 +16329,2012-11-18,4,1,11,2,0,0,0,1,0.28,0.2879,0.81,0.1045,8,65,73 +16330,2012-11-18,4,1,11,3,0,0,0,2,0.3,0.303,0.81,0.1642,9,37,46 +16331,2012-11-18,4,1,11,4,0,0,0,2,0.3,0.2879,0.81,0.194,4,8,12 +16332,2012-11-18,4,1,11,5,0,0,0,1,0.28,0.2879,0.81,0.1343,5,7,12 +16333,2012-11-18,4,1,11,6,0,0,0,1,0.28,0.2727,0.81,0.1642,2,13,15 +16334,2012-11-18,4,1,11,7,0,0,0,1,0.28,0.2727,0.81,0.194,3,39,42 +16335,2012-11-18,4,1,11,8,0,0,0,1,0.3,0.2879,0.75,0.2537,19,100,119 +16336,2012-11-18,4,1,11,9,0,0,0,1,0.32,0.303,0.7,0.2537,49,155,204 +16337,2012-11-18,4,1,11,10,0,0,0,1,0.34,0.303,0.66,0.2985,79,250,329 +16338,2012-11-18,4,1,11,11,0,0,0,1,0.38,0.3939,0.62,0.2836,92,267,359 +16339,2012-11-18,4,1,11,12,0,0,0,1,0.4,0.4091,0.62,0.2836,101,341,442 +16340,2012-11-18,4,1,11,13,0,0,0,1,0.4,0.4091,0.62,0.3284,113,334,447 +16341,2012-11-18,4,1,11,14,0,0,0,1,0.4,0.4091,0.62,0.2985,125,303,428 +16342,2012-11-18,4,1,11,15,0,0,0,1,0.42,0.4242,0.54,0.2836,89,318,407 +16343,2012-11-18,4,1,11,16,0,0,0,2,0.4,0.4091,0.62,0.3284,59,313,372 +16344,2012-11-18,4,1,11,17,0,0,0,2,0.38,0.3939,0.66,0.2537,48,232,280 +16345,2012-11-18,4,1,11,18,0,0,0,1,0.36,0.3485,0.66,0.2239,36,240,276 +16346,2012-11-18,4,1,11,19,0,0,0,1,0.36,0.3485,0.66,0.1642,16,194,210 +16347,2012-11-18,4,1,11,20,0,0,0,1,0.36,0.3485,0.66,0.1642,9,120,129 +16348,2012-11-18,4,1,11,21,0,0,0,1,0.36,0.3333,0.66,0.2537,17,93,110 +16349,2012-11-18,4,1,11,22,0,0,0,2,0.36,0.3333,0.66,0.2537,8,66,74 +16350,2012-11-18,4,1,11,23,0,0,0,1,0.36,0.3485,0.66,0.2239,6,53,59 +16351,2012-11-19,4,1,11,0,0,1,1,1,0.36,0.3485,0.66,0.2239,5,22,27 +16352,2012-11-19,4,1,11,1,0,1,1,2,0.36,0.3485,0.66,0.2239,0,19,19 +16353,2012-11-19,4,1,11,2,0,1,1,2,0.36,0.3333,0.66,0.2537,0,5,5 +16354,2012-11-19,4,1,11,3,0,1,1,2,0.36,0.3485,0.66,0.2239,0,2,2 
+16355,2012-11-19,4,1,11,4,0,1,1,2,0.36,0.3485,0.66,0.2239,1,11,12 +16356,2012-11-19,4,1,11,5,0,1,1,2,0.36,0.3485,0.66,0.2239,1,38,39 +16357,2012-11-19,4,1,11,6,0,1,1,2,0.36,0.3333,0.66,0.2537,3,128,131 +16358,2012-11-19,4,1,11,7,0,1,1,2,0.36,0.3333,0.66,0.2985,5,381,386 +16359,2012-11-19,4,1,11,8,0,1,1,2,0.36,0.3333,0.66,0.2985,13,650,663 +16360,2012-11-19,4,1,11,9,0,1,1,2,0.36,0.3333,0.66,0.2537,18,260,278 +16361,2012-11-19,4,1,11,10,0,1,1,2,0.36,0.3333,0.66,0.2537,33,106,139 +16362,2012-11-19,4,1,11,11,0,1,1,2,0.38,0.3939,0.66,0.2537,33,164,197 +16363,2012-11-19,4,1,11,12,0,1,1,2,0.4,0.4091,0.62,0.2836,35,207,242 +16364,2012-11-19,4,1,11,13,0,1,1,1,0.44,0.4394,0.54,0.2537,46,205,251 +16365,2012-11-19,4,1,11,14,0,1,1,1,0.44,0.4394,0.54,0.2537,47,170,217 +16366,2012-11-19,4,1,11,15,0,1,1,1,0.44,0.4394,0.54,0.194,44,213,257 +16367,2012-11-19,4,1,11,16,0,1,1,1,0.42,0.4242,0.58,0.2537,55,325,380 +16368,2012-11-19,4,1,11,17,0,1,1,2,0.42,0.4242,0.58,0.194,33,586,619 +16369,2012-11-19,4,1,11,18,0,1,1,2,0.4,0.4091,0.58,0.2537,21,559,580 +16370,2012-11-19,4,1,11,19,0,1,1,2,0.4,0.4091,0.58,0.2836,25,381,406 +16371,2012-11-19,4,1,11,20,0,1,1,2,0.38,0.3939,0.62,0.2537,7,252,259 +16372,2012-11-19,4,1,11,21,0,1,1,2,0.38,0.3939,0.54,0.2239,15,188,203 +16373,2012-11-19,4,1,11,22,0,1,1,1,0.34,0.3485,0.66,0.1045,7,106,113 +16374,2012-11-19,4,1,11,23,0,1,1,1,0.34,0.3485,0.66,0.1045,2,72,74 +16375,2012-11-20,4,1,11,0,0,2,1,1,0.32,0.3333,0.7,0.1343,3,26,29 +16376,2012-11-20,4,1,11,1,0,2,1,1,0.34,0.3485,0.66,0.1045,3,10,13 +16377,2012-11-20,4,1,11,2,0,2,1,2,0.32,0.3333,0.76,0.0896,0,5,5 +16378,2012-11-20,4,1,11,3,0,2,1,2,0.34,0.3485,0.71,0.1045,0,2,2 +16379,2012-11-20,4,1,11,4,0,2,1,2,0.34,0.3333,0.66,0.194,0,6,6 +16380,2012-11-20,4,1,11,5,0,2,1,2,0.34,0.3333,0.66,0.1343,2,32,34 +16381,2012-11-20,4,1,11,6,0,2,1,2,0.34,0.3333,0.71,0.1642,2,146,148 +16382,2012-11-20,4,1,11,7,0,2,1,1,0.32,0.3333,0.76,0.0896,7,411,418 +16383,2012-11-20,4,1,11,8,0,2,1,1,0.34,0.3333,0.71,0.1343,16,649,665 
+16384,2012-11-20,4,1,11,9,0,2,1,2,0.36,0.3788,0.71,0,28,298,326 +16385,2012-11-20,4,1,11,10,0,2,1,2,0.4,0.4091,0.66,0.0896,32,144,176 +16386,2012-11-20,4,1,11,11,0,2,1,2,0.42,0.4242,0.67,0.0896,45,160,205 +16387,2012-11-20,4,1,11,12,0,2,1,2,0.42,0.4242,0.67,0,70,243,313 +16388,2012-11-20,4,1,11,13,0,2,1,2,0.44,0.4394,0.58,0,49,218,267 +16389,2012-11-20,4,1,11,14,0,2,1,2,0.44,0.4394,0.58,0.1045,47,177,224 +16390,2012-11-20,4,1,11,15,0,2,1,2,0.46,0.4545,0.55,0.1045,61,226,287 +16391,2012-11-20,4,1,11,16,0,2,1,2,0.42,0.4242,0.62,0.0896,60,335,395 +16392,2012-11-20,4,1,11,17,0,2,1,2,0.4,0.4091,0.71,0.1642,37,553,590 +16393,2012-11-20,4,1,11,18,0,2,1,2,0.4,0.4091,0.66,0.0896,16,534,550 +16394,2012-11-20,4,1,11,19,0,2,1,2,0.4,0.4091,0.71,0,23,361,384 +16395,2012-11-20,4,1,11,20,0,2,1,2,0.38,0.3939,0.76,0,11,224,235 +16396,2012-11-20,4,1,11,21,0,2,1,1,0.36,0.3788,0.71,0,14,143,157 +16397,2012-11-20,4,1,11,22,0,2,1,1,0.36,0.3788,0.71,0,7,115,122 +16398,2012-11-20,4,1,11,23,0,2,1,1,0.32,0.3333,0.81,0.1045,1,82,83 +16399,2012-11-21,4,1,11,0,0,3,1,1,0.34,0.3636,0.81,0,1,25,26 +16400,2012-11-21,4,1,11,1,0,3,1,1,0.32,0.3333,0.81,0.1045,1,13,14 +16401,2012-11-21,4,1,11,2,0,3,1,1,0.28,0.2879,0.87,0.1045,2,6,8 +16402,2012-11-21,4,1,11,3,0,3,1,1,0.28,0.2879,0.87,0.1045,0,2,2 +16403,2012-11-21,4,1,11,4,0,3,1,1,0.28,0.2879,0.87,0.1045,0,10,10 +16404,2012-11-21,4,1,11,5,0,3,1,1,0.28,0.2879,0.87,0.1045,1,28,29 +16405,2012-11-21,4,1,11,6,0,3,1,1,0.28,0.303,0.87,0.0896,2,99,101 +16406,2012-11-21,4,1,11,7,0,3,1,1,0.26,0.2879,0.87,0.0896,11,262,273 +16407,2012-11-21,4,1,11,8,0,3,1,2,0.28,0.303,0.81,0.0896,12,539,551 +16408,2012-11-21,4,1,11,9,0,3,1,1,0.32,0.3333,0.76,0.0896,45,320,365 +16409,2012-11-21,4,1,11,10,0,3,1,1,0.36,0.3788,0.62,0,22,150,172 +16410,2012-11-21,4,1,11,11,0,3,1,1,0.4,0.4091,0.5,0.194,43,198,241 +16411,2012-11-21,4,1,11,12,0,3,1,1,0.44,0.4394,0.41,0.194,74,270,344 +16412,2012-11-21,4,1,11,13,0,3,1,1,0.44,0.4394,0.35,0.2239,70,295,365 
+16413,2012-11-21,4,1,11,14,0,3,1,1,0.46,0.4545,0.33,0.194,84,329,413 +16414,2012-11-21,4,1,11,15,0,3,1,1,0.46,0.4545,0.36,0.0896,68,381,449 +16415,2012-11-21,4,1,11,16,0,3,1,1,0.44,0.4394,0.33,0.1343,59,410,469 +16416,2012-11-21,4,1,11,17,0,3,1,1,0.4,0.4091,0.4,0.1642,41,333,374 +16417,2012-11-21,4,1,11,18,0,3,1,1,0.42,0.4242,0.38,0.1045,24,287,311 +16418,2012-11-21,4,1,11,19,0,3,1,1,0.38,0.3939,0.43,0.194,23,220,243 +16419,2012-11-21,4,1,11,20,0,3,1,1,0.34,0.3485,0.61,0.1045,11,125,136 +16420,2012-11-21,4,1,11,21,0,3,1,1,0.36,0.3788,0.5,0,8,97,105 +16421,2012-11-21,4,1,11,22,0,3,1,1,0.34,0.3636,0.49,0,6,82,88 +16422,2012-11-21,4,1,11,23,0,3,1,1,0.32,0.3485,0.61,0,7,50,57 +16423,2012-11-22,4,1,11,0,1,4,0,1,0.32,0.3333,0.66,0.0896,3,43,46 +16424,2012-11-22,4,1,11,1,1,4,0,1,0.28,0.303,0.65,0.0896,5,37,42 +16425,2012-11-22,4,1,11,2,1,4,0,1,0.24,0.2576,0.75,0.0896,3,15,18 +16426,2012-11-22,4,1,11,3,1,4,0,1,0.24,0.2576,0.75,0.1045,0,6,6 +16427,2012-11-22,4,1,11,4,1,4,0,1,0.22,0.2273,0.8,0.1343,1,2,3 +16428,2012-11-22,4,1,11,5,1,4,0,1,0.24,0.2576,0.75,0.1045,2,6,8 +16429,2012-11-22,4,1,11,6,1,4,0,1,0.26,0.2727,0.7,0.1045,2,15,17 +16430,2012-11-22,4,1,11,7,1,4,0,1,0.22,0.2273,0.75,0.1343,7,49,56 +16431,2012-11-22,4,1,11,8,1,4,0,1,0.24,0.2424,0.75,0.1343,20,77,97 +16432,2012-11-22,4,1,11,9,1,4,0,1,0.3,0.3333,0.65,0,25,94,119 +16433,2012-11-22,4,1,11,10,1,4,0,1,0.36,0.3636,0.57,0.1045,65,154,219 +16434,2012-11-22,4,1,11,11,1,4,0,1,0.42,0.4242,0.41,0.0896,89,143,232 +16435,2012-11-22,4,1,11,12,1,4,0,1,0.44,0.4394,0.35,0,117,145,262 +16436,2012-11-22,4,1,11,13,1,4,0,1,0.46,0.4545,0.31,0,125,144,269 +16437,2012-11-22,4,1,11,14,1,4,0,1,0.48,0.4697,0.29,0,125,95,220 +16438,2012-11-22,4,1,11,15,1,4,0,1,0.48,0.4697,0.29,0,132,96,228 +16439,2012-11-22,4,1,11,16,1,4,0,1,0.46,0.4545,0.33,0,101,81,182 +16440,2012-11-22,4,1,11,17,1,4,0,1,0.44,0.4394,0.35,0,66,43,109 +16441,2012-11-22,4,1,11,18,1,4,0,1,0.4,0.4091,0.43,0,16,54,70 
+16442,2012-11-22,4,1,11,19,1,4,0,1,0.36,0.3788,0.76,0,13,31,44 +16443,2012-11-22,4,1,11,20,1,4,0,1,0.34,0.3636,0.71,0,15,37,52 +16444,2012-11-22,4,1,11,21,1,4,0,1,0.34,0.3485,0.61,0.0896,7,39,46 +16445,2012-11-22,4,1,11,22,1,4,0,1,0.32,0.3485,0.61,0,8,36,44 +16446,2012-11-22,4,1,11,23,1,4,0,1,0.3,0.3333,0.7,0,8,28,36 +16447,2012-11-23,4,1,11,0,0,5,1,1,0.28,0.3182,0.65,0,0,32,32 +16448,2012-11-23,4,1,11,1,0,5,1,1,0.28,0.303,0.81,0.0896,2,11,13 +16449,2012-11-23,4,1,11,2,0,5,1,1,0.26,0.2879,0.75,0.0896,1,4,5 +16450,2012-11-23,4,1,11,3,0,5,1,1,0.26,0.303,0.7,0,1,0,1 +16451,2012-11-23,4,1,11,4,0,5,1,1,0.24,0.2576,0.75,0.1045,1,3,4 +16452,2012-11-23,4,1,11,5,0,5,1,1,0.24,0.2879,0.81,0,0,10,10 +16453,2012-11-23,4,1,11,6,0,5,1,1,0.24,0.2879,0.81,0,0,20,20 +16454,2012-11-23,4,1,11,7,0,5,1,1,0.24,0.2879,0.81,0,5,72,77 +16455,2012-11-23,4,1,11,8,0,5,1,1,0.24,0.2576,0.87,0.0896,11,83,94 +16456,2012-11-23,4,1,11,9,0,5,1,1,0.28,0.2879,0.75,0.1045,32,80,112 +16457,2012-11-23,4,1,11,10,0,5,1,1,0.34,0.3333,0.61,0.1642,87,114,201 +16458,2012-11-23,4,1,11,11,0,5,1,1,0.4,0.4091,0.5,0.194,121,130,251 +16459,2012-11-23,4,1,11,12,0,5,1,1,0.44,0.4394,0.38,0.2537,186,193,379 +16460,2012-11-23,4,1,11,13,0,5,1,1,0.46,0.4545,0.38,0.2836,224,200,424 +16461,2012-11-23,4,1,11,14,0,5,1,1,0.5,0.4848,0.36,0.2537,240,195,435 +16462,2012-11-23,4,1,11,15,0,5,1,1,0.48,0.4697,0.41,0.2537,233,214,447 +16463,2012-11-23,4,1,11,16,0,5,1,2,0.48,0.4697,0.41,0.1045,158,199,357 +16464,2012-11-23,4,1,11,17,0,5,1,2,0.48,0.4697,0.39,0.0896,117,183,300 +16465,2012-11-23,4,1,11,18,0,5,1,2,0.46,0.4545,0.44,0,50,158,208 +16466,2012-11-23,4,1,11,19,0,5,1,2,0.46,0.4545,0.47,0.1642,33,121,154 +16467,2012-11-23,4,1,11,20,0,5,1,1,0.46,0.4545,0.41,0.0896,42,100,142 +16468,2012-11-23,4,1,11,21,0,5,1,1,0.46,0.4545,0.47,0.2537,23,75,98 +16469,2012-11-23,4,1,11,22,0,5,1,1,0.44,0.4394,0.33,0.5224,16,63,79 +16470,2012-11-23,4,1,11,23,0,5,1,1,0.42,0.4242,0.38,0.4478,20,47,67 
+16471,2012-11-24,4,1,11,0,0,6,0,1,0.4,0.4091,0.37,0.4179,9,33,42 +16472,2012-11-24,4,1,11,1,0,6,0,1,0.34,0.2879,0.42,0.4925,1,19,20 +16473,2012-11-24,4,1,11,2,0,6,0,1,0.32,0.2727,0.39,0.6119,10,22,32 +16474,2012-11-24,4,1,11,3,0,6,0,1,0.28,0.2576,0.41,0.3582,6,5,11 +16475,2012-11-24,4,1,11,4,0,6,0,1,0.26,0.2121,0.41,0.4478,1,2,3 +16476,2012-11-24,4,1,11,5,0,6,0,1,0.26,0.2273,0.41,0.3284,1,2,3 +16477,2012-11-24,4,1,11,6,0,6,0,1,0.24,0.2121,0.44,0.2836,1,9,10 +16478,2012-11-24,4,1,11,7,0,6,0,2,0.26,0.2273,0.41,0.3284,4,21,25 +16479,2012-11-24,4,1,11,8,0,6,0,2,0.26,0.2121,0.44,0.4627,7,55,62 +16480,2012-11-24,4,1,11,9,0,6,0,2,0.26,0.2121,0.41,0.4478,26,96,122 +16481,2012-11-24,4,1,11,10,0,6,0,2,0.26,0.2273,0.44,0.2985,46,96,142 +16482,2012-11-24,4,1,11,11,0,6,0,2,0.28,0.2576,0.41,0.2985,55,131,186 +16483,2012-11-24,4,1,11,12,0,6,0,2,0.3,0.2576,0.36,0.6119,53,146,199 +16484,2012-11-24,4,1,11,13,0,6,0,1,0.32,0.303,0.36,0.3284,64,161,225 +16485,2012-11-24,4,1,11,14,0,6,0,1,0.32,0.2879,0.33,0.4627,58,149,207 +16486,2012-11-24,4,1,11,15,0,6,0,1,0.3,0.2727,0.36,0.4627,59,139,198 +16487,2012-11-24,4,1,11,16,0,6,0,1,0.28,0.2424,0.38,0.4478,43,127,170 +16488,2012-11-24,4,1,11,17,0,6,0,1,0.26,0.2273,0.41,0.3881,29,97,126 +16489,2012-11-24,4,1,11,18,0,6,0,1,0.26,0.2273,0.38,0.3881,22,123,145 +16490,2012-11-24,4,1,11,19,0,6,0,1,0.26,0.2424,0.41,0.2537,9,74,83 +16491,2012-11-24,4,1,11,20,0,6,0,1,0.24,0.2121,0.44,0.2836,3,55,58 +16492,2012-11-24,4,1,11,21,0,6,0,1,0.24,0.2273,0.44,0.194,9,66,75 +16493,2012-11-24,4,1,11,22,0,6,0,1,0.24,0.2273,0.44,0.2537,7,69,76 +16494,2012-11-24,4,1,11,23,0,6,0,1,0.24,0.2273,0.44,0.194,9,48,57 +16495,2012-11-25,4,1,11,0,0,0,0,1,0.22,0.2121,0.47,0.2537,3,31,34 +16496,2012-11-25,4,1,11,1,0,0,0,1,0.22,0.2273,0.44,0.194,4,32,36 +16497,2012-11-25,4,1,11,2,0,0,0,1,0.22,0.197,0.44,0.3582,1,27,28 +16498,2012-11-25,4,1,11,3,0,0,0,1,0.22,0.2727,0.44,0,0,8,8 +16499,2012-11-25,4,1,11,4,0,0,0,1,0.22,0.2727,0.44,0,1,1,2 
+16500,2012-11-25,4,1,11,5,0,0,0,2,0.22,0.2273,0.47,0.1642,0,3,3 +16501,2012-11-25,4,1,11,6,0,0,0,2,0.22,0.2424,0.47,0.1045,1,10,11 +16502,2012-11-25,4,1,11,7,0,0,0,2,0.22,0.2424,0.51,0.1045,2,19,21 +16503,2012-11-25,4,1,11,8,0,0,0,2,0.22,0.2273,0.51,0.194,2,32,34 +16504,2012-11-25,4,1,11,9,0,0,0,2,0.24,0.2273,0.48,0.2239,13,83,96 +16505,2012-11-25,4,1,11,10,0,0,0,1,0.24,0.2273,0.48,0.2537,15,124,139 +16506,2012-11-25,4,1,11,11,0,0,0,2,0.26,0.2576,0.41,0.2239,49,155,204 +16507,2012-11-25,4,1,11,12,0,0,0,2,0.26,0.2576,0.41,0.2239,37,165,202 +16508,2012-11-25,4,1,11,13,0,0,0,2,0.26,0.2576,0.41,0.1642,37,158,195 +16509,2012-11-25,4,1,11,14,0,0,0,2,0.3,0.2879,0.42,0.2537,33,170,203 +16510,2012-11-25,4,1,11,15,0,0,0,1,0.3,0.2879,0.42,0.2537,36,209,245 +16511,2012-11-25,4,1,11,16,0,0,0,1,0.3,0.2879,0.39,0.2239,25,203,228 +16512,2012-11-25,4,1,11,17,0,0,0,1,0.28,0.2727,0.41,0.1642,11,170,181 +16513,2012-11-25,4,1,11,18,0,0,0,1,0.26,0.2576,0.44,0.1642,9,132,141 +16514,2012-11-25,4,1,11,19,0,0,0,1,0.26,0.303,0.41,0,13,114,127 +16515,2012-11-25,4,1,11,20,0,0,0,1,0.24,0.2879,0.6,0,9,110,119 +16516,2012-11-25,4,1,11,21,0,0,0,1,0.26,0.303,0.48,0,4,60,64 +16517,2012-11-25,4,1,11,22,0,0,0,1,0.24,0.2879,0.6,0,3,59,62 +16518,2012-11-25,4,1,11,23,0,0,0,1,0.22,0.2576,0.69,0.0896,1,40,41 +16519,2012-11-26,4,1,11,0,0,1,1,1,0.22,0.2727,0.69,0,4,19,23 +16520,2012-11-26,4,1,11,1,0,1,1,1,0.2,0.2576,0.69,0,1,9,10 +16521,2012-11-26,4,1,11,2,0,1,1,1,0.2,0.2576,0.69,0,0,5,5 +16522,2012-11-26,4,1,11,3,0,1,1,1,0.2,0.2576,0.69,0,1,4,5 +16523,2012-11-26,4,1,11,4,0,1,1,1,0.22,0.2727,0.69,0,0,10,10 +16524,2012-11-26,4,1,11,5,0,1,1,1,0.22,0.2273,0.69,0.1642,0,41,41 +16525,2012-11-26,4,1,11,6,0,1,1,1,0.22,0.2727,0.69,0,1,123,124 +16526,2012-11-26,4,1,11,7,0,1,1,1,0.24,0.2879,0.65,0,2,363,365 +16527,2012-11-26,4,1,11,8,0,1,1,1,0.24,0.2879,0.65,0,9,629,638 +16528,2012-11-26,4,1,11,9,0,1,1,1,0.28,0.303,0.56,0.0896,10,276,286 +16529,2012-11-26,4,1,11,10,0,1,1,1,0.32,0.3485,0.49,0,20,127,147 
+16530,2012-11-26,4,1,11,11,0,1,1,1,0.4,0.4091,0.37,0.2985,18,112,130 +16531,2012-11-26,4,1,11,12,0,1,1,1,0.42,0.4242,0.35,0.1045,22,188,210 +16532,2012-11-26,4,1,11,13,0,1,1,1,0.42,0.4242,0.35,0.1045,20,197,217 +16533,2012-11-26,4,1,11,14,0,1,1,1,0.42,0.4242,0.35,0,51,177,228 +16534,2012-11-26,4,1,11,15,0,1,1,1,0.42,0.4242,0.35,0.0896,49,181,230 +16535,2012-11-26,4,1,11,16,0,1,1,1,0.44,0.4394,0.3,0,49,297,346 +16536,2012-11-26,4,1,11,17,0,1,1,1,0.42,0.4242,0.32,0,13,540,553 +16537,2012-11-26,4,1,11,18,0,1,1,1,0.36,0.3485,0.5,0.1642,19,502,521 +16538,2012-11-26,4,1,11,19,0,1,1,1,0.34,0.3636,0.53,0,16,355,371 +16539,2012-11-26,4,1,11,20,0,1,1,1,0.34,0.3636,0.49,0,12,265,277 +16540,2012-11-26,4,1,11,21,0,1,1,1,0.34,0.3636,0.49,0,9,172,181 +16541,2012-11-26,4,1,11,22,0,1,1,1,0.32,0.3485,0.61,0,3,101,104 +16542,2012-11-26,4,1,11,23,0,1,1,2,0.32,0.3333,0.66,0.1045,8,57,65 +16543,2012-11-27,4,1,11,0,0,2,1,2,0.32,0.3333,0.66,0.0896,2,24,26 +16544,2012-11-27,4,1,11,1,0,2,1,2,0.32,0.3333,0.7,0.1343,0,6,6 +16545,2012-11-27,4,1,11,2,0,2,1,2,0.32,0.3182,0.7,0.1642,0,5,5 +16546,2012-11-27,4,1,11,3,0,2,1,3,0.32,0.3333,0.7,0.1343,1,3,4 +16547,2012-11-27,4,1,11,4,0,2,1,3,0.3,0.2879,0.81,0.194,0,5,5 +16548,2012-11-27,4,1,11,5,0,2,1,2,0.3,0.2879,0.81,0.2239,0,31,31 +16549,2012-11-27,4,1,11,6,0,2,1,2,0.3,0.303,0.81,0.1642,3,97,100 +16550,2012-11-27,4,1,11,7,0,2,1,3,0.3,0.303,0.81,0.1343,4,289,293 +16551,2012-11-27,4,1,11,8,0,2,1,3,0.32,0.3182,0.81,0.1642,6,494,500 +16552,2012-11-27,4,1,11,9,0,2,1,2,0.32,0.3333,0.81,0.1343,11,257,268 +16553,2012-11-27,4,1,11,10,0,2,1,3,0.32,0.3182,0.81,0.194,5,52,57 +16554,2012-11-27,4,1,11,11,0,2,1,3,0.3,0.2727,0.87,0.2985,6,59,65 +16555,2012-11-27,4,1,11,12,0,2,1,3,0.28,0.2576,0.87,0.3881,6,70,76 +16556,2012-11-27,4,1,11,13,0,2,1,3,0.28,0.2727,0.87,0.2537,4,73,77 +16557,2012-11-27,4,1,11,14,0,2,1,3,0.28,0.2576,0.87,0.3582,8,86,94 +16558,2012-11-27,4,1,11,15,0,2,1,2,0.3,0.2727,0.75,0.2985,9,146,155 
+16559,2012-11-27,4,1,11,16,0,2,1,2,0.3,0.2727,0.75,0.2985,5,252,257 +16560,2012-11-27,4,1,11,17,0,2,1,2,0.3,0.2727,0.75,0.3881,13,514,527 +16561,2012-11-27,4,1,11,18,0,2,1,1,0.26,0.2273,0.81,0.3582,13,469,482 +16562,2012-11-27,4,1,11,19,0,2,1,1,0.26,0.2576,0.73,0.2985,12,338,350 +16563,2012-11-27,4,1,11,20,0,2,1,1,0.26,0.2273,0.75,0.3284,4,228,232 +16564,2012-11-27,4,1,11,21,0,2,1,1,0.24,0.2121,0.81,0.2836,6,179,185 +16565,2012-11-27,4,1,11,22,0,2,1,1,0.24,0.2424,0.81,0.1642,2,95,97 +16566,2012-11-27,4,1,11,23,0,2,1,2,0.26,0.2424,0.81,0.2537,3,64,67 +16567,2012-11-28,4,1,11,0,0,3,1,2,0.26,0.2424,0.75,0.2537,1,22,23 +16568,2012-11-28,4,1,11,1,0,3,1,2,0.26,0.2576,0.75,0.2239,0,12,12 +16569,2012-11-28,4,1,11,2,0,3,1,2,0.26,0.2576,0.75,0.2239,0,1,1 +16570,2012-11-28,4,1,11,3,0,3,1,2,0.26,0.2576,0.7,0.2239,0,3,3 +16571,2012-11-28,4,1,11,4,0,3,1,2,0.26,0.2576,0.7,0.194,0,4,4 +16572,2012-11-28,4,1,11,5,0,3,1,2,0.26,0.2727,0.7,0.1343,0,39,39 +16573,2012-11-28,4,1,11,6,0,3,1,2,0.26,0.2576,0.65,0.2239,4,132,136 +16574,2012-11-28,4,1,11,7,0,3,1,2,0.26,0.2576,0.56,0.194,6,403,409 +16575,2012-11-28,4,1,11,8,0,3,1,1,0.24,0.2121,0.6,0.2836,9,683,692 +16576,2012-11-28,4,1,11,9,0,3,1,1,0.28,0.2727,0.52,0.194,13,309,322 +16577,2012-11-28,4,1,11,10,0,3,1,1,0.3,0.3182,0.49,0.1045,17,144,161 +16578,2012-11-28,4,1,11,11,0,3,1,1,0.32,0.3182,0.45,0.1642,19,127,146 +16579,2012-11-28,4,1,11,12,0,3,1,1,0.34,0.303,0.42,0.3881,19,189,208 +16580,2012-11-28,4,1,11,13,0,3,1,1,0.34,0.3333,0.42,0.194,21,193,214 +16581,2012-11-28,4,1,11,14,0,3,1,1,0.36,0.3333,0.4,0.2985,15,161,176 +16582,2012-11-28,4,1,11,15,0,3,1,1,0.36,0.3333,0.32,0.3284,7,172,179 +16583,2012-11-28,4,1,11,16,0,3,1,1,0.36,0.3333,0.29,0.2537,9,316,325 +16584,2012-11-28,4,1,11,17,0,3,1,1,0.34,0.3333,0.31,0.1343,17,546,563 +16585,2012-11-28,4,1,11,18,0,3,1,1,0.32,0.303,0.33,0.2836,12,530,542 +16586,2012-11-28,4,1,11,19,0,3,1,1,0.32,0.3182,0.36,0.1642,9,378,387 +16587,2012-11-28,4,1,11,20,0,3,1,1,0.32,0.3333,0.36,0.1343,5,258,263 
+16588,2012-11-28,4,1,11,21,0,3,1,1,0.3,0.303,0.39,0.1642,4,219,223 +16589,2012-11-28,4,1,11,22,0,3,1,1,0.28,0.2727,0.45,0.1642,6,148,154 +16590,2012-11-28,4,1,11,23,0,3,1,1,0.26,0.2727,0.48,0.1343,5,73,78 +16591,2012-11-29,4,1,11,0,0,4,1,1,0.24,0.2576,0.56,0.0896,4,25,29 +16592,2012-11-29,4,1,11,1,0,4,1,1,0.22,0.2576,0.64,0.0896,1,15,16 +16593,2012-11-29,4,1,11,2,0,4,1,1,0.22,0.2727,0.64,0,1,6,7 +16594,2012-11-29,4,1,11,4,0,4,1,1,0.2,0.2273,0.75,0.0896,0,7,7 +16595,2012-11-29,4,1,11,5,0,4,1,1,0.2,0.2576,0.69,0,0,42,42 +16596,2012-11-29,4,1,11,6,0,4,1,1,0.22,0.2727,0.69,0,1,120,121 +16597,2012-11-29,4,1,11,7,0,4,1,1,0.2,0.2576,0.64,0,8,354,362 +16598,2012-11-29,4,1,11,8,0,4,1,1,0.2,0.2273,0.69,0.1045,15,664,679 +16599,2012-11-29,4,1,11,9,0,4,1,1,0.24,0.2879,0.6,0,13,286,299 +16600,2012-11-29,4,1,11,10,0,4,1,1,0.28,0.303,0.65,0.0896,25,153,178 +16601,2012-11-29,4,1,11,11,0,4,1,1,0.36,0.3485,0.37,0.2239,13,150,163 +16602,2012-11-29,4,1,11,12,0,4,1,1,0.36,0.3485,0.4,0.1642,11,225,236 +16603,2012-11-29,4,1,11,13,0,4,1,1,0.38,0.3939,0.32,0.1343,15,209,224 +16604,2012-11-29,4,1,11,14,0,4,1,1,0.36,0.3485,0.43,0.2239,13,174,187 +16605,2012-11-29,4,1,11,15,0,4,1,1,0.36,0.3333,0.43,0.2836,19,201,220 +16606,2012-11-29,4,1,11,16,0,4,1,1,0.36,0.3485,0.4,0.2239,24,346,370 +16607,2012-11-29,4,1,11,17,0,4,1,1,0.34,0.3333,0.39,0.194,17,544,561 +16608,2012-11-29,4,1,11,18,0,4,1,1,0.32,0.3182,0.49,0.1642,17,520,537 +16609,2012-11-29,4,1,11,19,0,4,1,1,0.3,0.2879,0.52,0.194,18,326,344 +16610,2012-11-29,4,1,11,20,0,4,1,1,0.3,0.303,0.52,0.1642,11,241,252 +16611,2012-11-29,4,1,11,21,0,4,1,1,0.26,0.2727,0.65,0.1343,7,201,208 +16612,2012-11-29,4,1,11,22,0,4,1,1,0.26,0.2879,0.7,0.0896,4,147,151 +16613,2012-11-29,4,1,11,23,0,4,1,1,0.28,0.3182,0.61,0,6,124,130 +16614,2012-11-30,4,1,11,0,0,5,1,1,0.26,0.2576,0.7,0.1642,4,48,52 +16615,2012-11-30,4,1,11,1,0,5,1,1,0.24,0.2576,0.7,0.0896,2,17,19 +16616,2012-11-30,4,1,11,2,0,5,1,1,0.24,0.2879,0.75,0,2,10,12 
+16617,2012-11-30,4,1,11,3,0,5,1,1,0.24,0.2879,0.75,0,0,4,4 +16618,2012-11-30,4,1,11,4,0,5,1,1,0.22,0.2727,0.75,0,0,3,3 +16619,2012-11-30,4,1,11,5,0,5,1,1,0.2,0.2576,0.8,0,1,39,40 +16620,2012-11-30,4,1,11,6,0,5,1,1,0.2,0.2576,0.86,0,2,104,106 +16621,2012-11-30,4,1,11,7,0,5,1,1,0.22,0.2727,0.8,0,6,346,352 +16622,2012-11-30,4,1,11,8,0,5,1,2,0.22,0.2576,0.8,0.0896,20,709,729 +16623,2012-11-30,4,1,11,9,0,5,1,1,0.24,0.2576,0.75,0.0896,17,313,330 +16624,2012-11-30,4,1,11,10,0,5,1,2,0.3,0.3333,0.65,0,10,159,169 +16625,2012-11-30,4,1,11,11,0,5,1,2,0.34,0.3485,0.53,0.0896,31,170,201 +16626,2012-11-30,4,1,11,12,0,5,1,2,0.38,0.3939,0.46,0,25,243,268 +16627,2012-11-30,4,1,11,13,0,5,1,2,0.4,0.4091,0.4,0.1642,21,241,262 +16628,2012-11-30,4,1,11,14,0,5,1,2,0.4,0.4091,0.4,0.1045,26,225,251 +16629,2012-11-30,4,1,11,15,0,5,1,2,0.42,0.4242,0.38,0.1045,34,262,296 +16630,2012-11-30,4,1,11,16,0,5,1,1,0.4,0.4091,0.43,0.1343,37,368,405 +16631,2012-11-30,4,1,11,17,0,5,1,1,0.36,0.3485,0.5,0.194,31,551,582 +16632,2012-11-30,4,1,11,18,0,5,1,2,0.34,0.3636,0.61,0,20,489,509 +16633,2012-11-30,4,1,11,19,0,5,1,1,0.32,0.3485,0.66,0,18,359,377 +16634,2012-11-30,4,1,11,20,0,5,1,1,0.32,0.3485,0.66,0,12,233,245 +16635,2012-11-30,4,1,11,21,0,5,1,1,0.3,0.3182,0.75,0.0896,14,169,183 +16636,2012-11-30,4,1,11,22,0,5,1,1,0.3,0.3333,0.75,0,18,145,163 +16637,2012-11-30,4,1,11,23,0,5,1,2,0.3,0.3182,0.75,0.0896,11,99,110 +16638,2012-12-01,4,1,12,0,0,6,0,1,0.26,0.303,0.81,0,9,99,108 +16639,2012-12-01,4,1,12,1,0,6,0,1,0.26,0.303,0.81,0,5,64,69 +16640,2012-12-01,4,1,12,2,0,6,0,2,0.26,0.303,0.81,0,3,47,50 +16641,2012-12-01,4,1,12,3,0,6,0,2,0.26,0.2727,0.81,0.1343,1,14,15 +16642,2012-12-01,4,1,12,4,0,6,0,1,0.26,0.2879,0.81,0.0896,0,5,5 +16643,2012-12-01,4,1,12,5,0,6,0,1,0.24,0.2576,0.87,0.0896,1,12,13 +16644,2012-12-01,4,1,12,6,0,6,0,1,0.24,0.2424,0.87,0.1343,7,20,27 +16645,2012-12-01,4,1,12,7,0,6,0,2,0.24,0.2424,0.87,0.1343,7,56,63 +16646,2012-12-01,4,1,12,8,0,6,0,2,0.24,0.2424,0.87,0.1343,11,133,144 
+16647,2012-12-01,4,1,12,9,0,6,0,2,0.26,0.2424,0.93,0.2537,34,159,193 +16648,2012-12-01,4,1,12,10,0,6,0,2,0.28,0.2727,0.89,0.1642,45,211,256 +16649,2012-12-01,4,1,12,11,0,6,0,2,0.32,0.3333,0.76,0.1045,74,318,392 +16650,2012-12-01,4,1,12,12,0,6,0,2,0.32,0.3333,0.81,0.0896,119,327,446 +16651,2012-12-01,4,1,12,13,0,6,0,2,0.34,0.3636,0.76,0,123,386,509 +16652,2012-12-01,4,1,12,14,0,6,0,2,0.36,0.3788,0.71,0,110,369,479 +16653,2012-12-01,4,1,12,15,0,6,0,2,0.4,0.4091,0.62,0,113,371,484 +16654,2012-12-01,4,1,12,16,0,6,0,2,0.38,0.3939,0.66,0,89,354,443 +16655,2012-12-01,4,1,12,17,0,6,0,2,0.34,0.3485,0.76,0.1045,50,270,320 +16656,2012-12-01,4,1,12,18,0,6,0,2,0.34,0.3636,0.76,0,42,255,297 +16657,2012-12-01,4,1,12,19,0,6,0,2,0.32,0.3485,0.81,0,30,219,249 +16658,2012-12-01,4,1,12,20,0,6,0,1,0.32,0.3485,0.81,0,28,170,198 +16659,2012-12-01,4,1,12,21,0,6,0,2,0.3,0.3333,0.87,0,23,135,158 +16660,2012-12-01,4,1,12,22,0,6,0,2,0.3,0.3333,0.87,0,17,130,147 +16661,2012-12-01,4,1,12,23,0,6,0,2,0.32,0.3485,0.81,0,10,116,126 +16662,2012-12-02,4,1,12,0,0,0,0,2,0.3,0.3182,0.87,0.0896,9,108,117 +16663,2012-12-02,4,1,12,1,0,0,0,2,0.3,0.3182,0.87,0.0896,10,84,94 +16664,2012-12-02,4,1,12,2,0,0,0,2,0.3,0.3333,0.87,0,2,72,74 +16665,2012-12-02,4,1,12,3,0,0,0,2,0.26,0.303,0.93,0,4,21,25 +16666,2012-12-02,4,1,12,4,0,0,0,2,0.26,0.303,0.93,0,1,6,7 +16667,2012-12-02,4,1,12,5,0,0,0,2,0.26,0.303,0.93,0,1,7,8 +16668,2012-12-02,4,1,12,6,0,0,0,2,0.24,0.2424,0.93,0.1343,1,15,16 +16669,2012-12-02,4,1,12,7,0,0,0,2,0.24,0.2424,0.93,0.1343,5,26,31 +16670,2012-12-02,4,1,12,8,0,0,0,2,0.26,0.2879,0.93,0.0896,12,81,93 +16671,2012-12-02,4,1,12,9,0,0,0,2,0.28,0.2727,0.93,0.1642,37,135,172 +16672,2012-12-02,4,1,12,10,0,0,0,2,0.3,0.3333,0.87,0,63,230,293 +16673,2012-12-02,4,1,12,11,0,0,0,2,0.32,0.3182,0.81,0.1642,81,274,355 +16674,2012-12-02,4,1,12,12,0,0,0,2,0.34,0.3333,0.81,0.1642,111,409,520 +16675,2012-12-02,4,1,12,13,0,0,0,1,0.4,0.4091,0.71,0.2239,84,347,431 
+16676,2012-12-02,4,1,12,14,0,0,0,1,0.44,0.4394,0.62,0.2537,142,331,473 +16677,2012-12-02,4,1,12,15,0,0,0,1,0.44,0.4394,0.67,0.1343,106,311,417 +16678,2012-12-02,4,1,12,16,0,0,0,2,0.44,0.4394,0.67,0.1045,85,358,443 +16679,2012-12-02,4,1,12,17,0,0,0,3,0.4,0.4091,0.76,0.194,59,244,303 +16680,2012-12-02,4,1,12,18,0,0,0,2,0.44,0.4394,0.72,0.1045,25,178,203 +16681,2012-12-02,4,1,12,19,0,0,0,2,0.4,0.4091,0.82,0.194,16,158,174 +16682,2012-12-02,4,1,12,20,0,0,0,2,0.42,0.4242,0.82,0.194,12,142,154 +16683,2012-12-02,4,1,12,21,0,0,0,2,0.44,0.4394,0.77,0.2239,8,91,99 +16684,2012-12-02,4,1,12,22,0,0,0,2,0.44,0.4394,0.77,0.194,11,85,96 +16685,2012-12-02,4,1,12,23,0,0,0,2,0.42,0.4242,0.82,0.1343,7,44,51 +16686,2012-12-03,4,1,12,0,0,1,1,2,0.42,0.4242,0.82,0.1642,3,18,21 +16687,2012-12-03,4,1,12,1,0,1,1,1,0.4,0.4091,0.82,0.1343,2,11,13 +16688,2012-12-03,4,1,12,2,0,1,1,1,0.42,0.4242,0.77,0.1642,1,9,10 +16689,2012-12-03,4,1,12,3,0,1,1,1,0.38,0.3939,0.87,0,2,6,8 +16690,2012-12-03,4,1,12,4,0,1,1,1,0.36,0.3788,0.93,0,1,4,5 +16691,2012-12-03,4,1,12,5,0,1,1,1,0.34,0.3485,0.93,0.0896,0,38,38 +16692,2012-12-03,4,1,12,6,0,1,1,1,0.36,0.3788,0.93,0,2,136,138 +16693,2012-12-03,4,1,12,7,0,1,1,2,0.34,0.3636,0.93,0,9,387,396 +16694,2012-12-03,4,1,12,8,0,1,1,1,0.36,0.3788,0.93,0,19,712,731 +16695,2012-12-03,4,1,12,9,0,1,1,1,0.4,0.4091,0.87,0.0896,19,289,308 +16696,2012-12-03,4,1,12,10,0,1,1,1,0.44,0.4394,0.77,0.1343,31,105,136 +16697,2012-12-03,4,1,12,11,0,1,1,1,0.48,0.4697,0.77,0.1343,51,182,233 +16698,2012-12-03,4,1,12,12,0,1,1,1,0.52,0.5,0.68,0.1642,40,228,268 +16699,2012-12-03,4,1,12,13,0,1,1,1,0.58,0.5455,0.56,0.0896,81,240,321 +16700,2012-12-03,4,1,12,14,0,1,1,1,0.6,0.6212,0.53,0,51,209,260 +16701,2012-12-03,4,1,12,15,0,1,1,1,0.6,0.6212,0.56,0,58,210,268 +16702,2012-12-03,4,1,12,16,0,1,1,1,0.6,0.6212,0.53,0.1343,45,397,442 +16703,2012-12-03,4,1,12,17,0,1,1,1,0.52,0.5,0.63,0.1045,43,665,708 +16704,2012-12-03,4,1,12,18,0,1,1,1,0.5,0.4848,0.68,0.0896,26,666,692 
+16705,2012-12-03,4,1,12,19,0,1,1,1,0.5,0.4848,0.68,0.0896,27,444,471 +16706,2012-12-03,4,1,12,20,0,1,1,2,0.46,0.4545,0.77,0,16,284,300 +16707,2012-12-03,4,1,12,21,0,1,1,1,0.44,0.4394,0.82,0.194,14,207,221 +16708,2012-12-03,4,1,12,22,0,1,1,1,0.42,0.4242,0.82,0.1045,5,139,144 +16709,2012-12-03,4,1,12,23,0,1,1,1,0.42,0.4242,0.82,0.1045,9,93,102 +16710,2012-12-04,4,1,12,0,0,2,1,1,0.42,0.4242,0.88,0.1045,6,49,55 +16711,2012-12-04,4,1,12,1,0,2,1,1,0.42,0.4242,0.82,0.1045,3,22,25 +16712,2012-12-04,4,1,12,2,0,2,1,1,0.42,0.4242,0.88,0.0896,3,5,8 +16713,2012-12-04,4,1,12,3,0,2,1,2,0.4,0.4091,0.87,0.1343,1,3,4 +16714,2012-12-04,4,1,12,4,0,2,1,2,0.4,0.4091,0.87,0.1343,0,7,7 +16715,2012-12-04,4,1,12,5,0,2,1,2,0.44,0.4394,0.88,0,1,45,46 +16716,2012-12-04,4,1,12,6,0,2,1,2,0.36,0.3788,0.93,0,3,150,153 +16717,2012-12-04,4,1,12,7,0,2,1,1,0.42,0.4242,0.88,0.1343,7,495,502 +16718,2012-12-04,4,1,12,8,0,2,1,2,0.44,0.4394,0.88,0.1642,21,700,721 +16719,2012-12-04,4,1,12,9,0,2,1,1,0.46,0.4545,0.82,0.1045,19,317,336 +16720,2012-12-04,4,1,12,10,0,2,1,1,0.46,0.4545,0.82,0.194,26,130,156 +16721,2012-12-04,4,1,12,11,0,2,1,1,0.48,0.4697,0.77,0.194,24,183,207 +16722,2012-12-04,4,1,12,12,0,2,1,1,0.52,0.5,0.68,0.194,39,273,312 +16723,2012-12-04,4,1,12,13,0,2,1,2,0.54,0.5152,0.64,0.1642,39,233,272 +16724,2012-12-04,4,1,12,14,0,2,1,2,0.58,0.5455,0.56,0.2239,39,231,270 +16725,2012-12-04,4,1,12,15,0,2,1,2,0.6,0.6212,0.49,0.2537,56,244,300 +16726,2012-12-04,4,1,12,16,0,2,1,1,0.58,0.5455,0.49,0.2836,44,391,435 +16727,2012-12-04,4,1,12,17,0,2,1,1,0.52,0.5,0.59,0.2239,43,700,743 +16728,2012-12-04,4,1,12,18,0,2,1,1,0.5,0.4848,0.63,0.2239,38,693,731 +16729,2012-12-04,4,1,12,19,0,2,1,1,0.48,0.4697,0.67,0.2239,46,414,460 +16730,2012-12-04,4,1,12,20,0,2,1,2,0.5,0.4848,0.63,0.2239,34,272,306 +16731,2012-12-04,4,1,12,21,0,2,1,2,0.5,0.4848,0.63,0.2537,28,252,280 +16732,2012-12-04,4,1,12,22,0,2,1,2,0.5,0.4848,0.63,0.2985,26,155,181 +16733,2012-12-04,4,1,12,23,0,2,1,1,0.48,0.4697,0.67,0.2537,5,91,96 
+16734,2012-12-05,4,1,12,0,0,3,1,1,0.5,0.4848,0.59,0.2836,6,31,37 +16735,2012-12-05,4,1,12,1,0,3,1,1,0.52,0.5,0.55,0.3284,0,11,11 +16736,2012-12-05,4,1,12,2,0,3,1,1,0.46,0.4545,0.67,0.1642,2,7,9 +16737,2012-12-05,4,1,12,3,0,3,1,1,0.48,0.4697,0.67,0.1343,0,7,7 +16738,2012-12-05,4,1,12,4,0,3,1,1,0.5,0.4848,0.63,0.4478,1,9,10 +16739,2012-12-05,4,1,12,5,0,3,1,1,0.5,0.4848,0.59,0.2836,1,48,49 +16740,2012-12-05,4,1,12,6,0,3,1,3,0.48,0.4697,0.55,0.3881,5,119,124 +16741,2012-12-05,4,1,12,7,0,3,1,3,0.46,0.4545,0.59,0.2985,9,389,398 +16742,2012-12-05,4,1,12,8,0,3,1,2,0.44,0.4394,0.58,0.2836,22,737,759 +16743,2012-12-05,4,1,12,9,0,3,1,2,0.44,0.4394,0.51,0.2239,26,362,388 +16744,2012-12-05,4,1,12,10,0,3,1,2,0.44,0.4394,0.47,0.3582,14,127,141 +16745,2012-12-05,4,1,12,11,0,3,1,1,0.44,0.4394,0.44,0.2836,11,161,172 +16746,2012-12-05,4,1,12,12,0,3,1,1,0.44,0.4394,0.41,0.2836,24,208,232 +16747,2012-12-05,4,1,12,13,0,3,1,1,0.46,0.4545,0.41,0.2836,14,200,214 +16748,2012-12-05,4,1,12,14,0,3,1,1,0.48,0.4697,0.33,0.4925,39,179,218 +16749,2012-12-05,4,1,12,15,0,3,1,1,0.48,0.4697,0.33,0.2836,20,265,285 +16750,2012-12-05,4,1,12,16,0,3,1,1,0.46,0.4545,0.33,0.2836,23,354,377 +16751,2012-12-05,4,1,12,17,0,3,1,1,0.44,0.4394,0.35,0.2836,29,576,605 +16752,2012-12-05,4,1,12,18,0,3,1,1,0.42,0.4242,0.35,0.4478,29,580,609 +16753,2012-12-05,4,1,12,19,0,3,1,1,0.38,0.3939,0.43,0.3582,14,400,414 +16754,2012-12-05,4,1,12,20,0,3,1,1,0.34,0.303,0.46,0.3881,19,274,293 +16755,2012-12-05,4,1,12,21,0,3,1,1,0.34,0.303,0.46,0.4179,14,184,198 +16756,2012-12-05,4,1,12,22,0,3,1,1,0.32,0.2879,0.45,0.3881,4,101,105 +16757,2012-12-05,4,1,12,23,0,3,1,1,0.3,0.2727,0.49,0.3881,5,69,74 +16758,2012-12-06,4,1,12,0,0,4,1,1,0.26,0.2424,0.48,0.2836,1,43,44 +16759,2012-12-06,4,1,12,1,0,4,1,1,0.26,0.2424,0.48,0.2836,0,16,16 +16760,2012-12-06,4,1,12,2,0,4,1,1,0.24,0.2121,0.52,0.2836,0,9,9 +16761,2012-12-06,4,1,12,3,0,4,1,1,0.22,0.2273,0.55,0.194,0,2,2 +16762,2012-12-06,4,1,12,4,0,4,1,1,0.22,0.2121,0.55,0.2985,1,8,9 
+16763,2012-12-06,4,1,12,5,0,4,1,1,0.22,0.2121,0.55,0.2836,0,32,32 +16764,2012-12-06,4,1,12,6,0,4,1,1,0.22,0.2121,0.55,0.2985,2,122,124 +16765,2012-12-06,4,1,12,7,0,4,1,1,0.22,0.2121,0.51,0.2985,8,381,389 +16766,2012-12-06,4,1,12,8,0,4,1,1,0.22,0.2121,0.51,0.2985,13,646,659 +16767,2012-12-06,4,1,12,9,0,4,1,1,0.24,0.2273,0.48,0.194,19,257,276 +16768,2012-12-06,4,1,12,10,0,4,1,1,0.24,0.2273,0.52,0.194,23,122,145 +16769,2012-12-06,4,1,12,11,0,4,1,1,0.26,0.2576,0.48,0.1642,16,162,178 +16770,2012-12-06,4,1,12,12,0,4,1,1,0.28,0.3182,0.41,0,29,206,235 +16771,2012-12-06,4,1,12,13,0,4,1,1,0.3,0.3333,0.39,0,33,212,245 +16772,2012-12-06,4,1,12,14,0,4,1,1,0.3,0.3182,0.42,0.0896,37,175,212 +16773,2012-12-06,4,1,12,15,0,4,1,1,0.32,0.3485,0.39,0,38,232,270 +16774,2012-12-06,4,1,12,16,0,4,1,1,0.32,0.3485,0.42,0,39,292,331 +16775,2012-12-06,4,1,12,17,0,4,1,1,0.3,0.3182,0.45,0.1045,31,586,617 +16776,2012-12-06,4,1,12,18,0,4,1,1,0.28,0.2727,0.45,0.1642,23,542,565 +16777,2012-12-06,4,1,12,19,0,4,1,1,0.26,0.2576,0.6,0.194,13,360,373 +16778,2012-12-06,4,1,12,20,0,4,1,2,0.24,0.2576,0.6,0.1045,5,222,227 +16779,2012-12-06,4,1,12,21,0,4,1,2,0.24,0.2424,0.6,0.1343,7,184,191 +16780,2012-12-06,4,1,12,22,0,4,1,1,0.24,0.2424,0.65,0.1642,2,131,133 +16781,2012-12-06,4,1,12,23,0,4,1,1,0.24,0.2424,0.65,0.1642,0,93,93 +16782,2012-12-07,4,1,12,0,0,5,1,1,0.24,0.2576,0.7,0.1045,3,45,48 +16783,2012-12-07,4,1,12,1,0,5,1,2,0.24,0.2273,0.7,0.194,2,26,28 +16784,2012-12-07,4,1,12,2,0,5,1,2,0.26,0.2727,0.7,0.1343,0,11,11 +16785,2012-12-07,4,1,12,3,0,5,1,2,0.26,0.2576,0.81,0.194,0,5,5 +16786,2012-12-07,4,1,12,4,0,5,1,1,0.26,0.2727,0.75,0.1045,0,10,10 +16787,2012-12-07,4,1,12,5,0,5,1,2,0.26,0.2727,0.81,0.1045,1,25,26 +16788,2012-12-07,4,1,12,6,0,5,1,2,0.28,0.2727,0.75,0.1642,0,84,84 +16789,2012-12-07,4,1,12,7,0,5,1,3,0.28,0.2727,0.81,0.1642,3,212,215 +16790,2012-12-07,4,1,12,8,0,5,1,2,0.3,0.2879,0.75,0.194,11,430,441 +16791,2012-12-07,4,1,12,9,0,5,1,2,0.3,0.303,0.81,0.1343,10,291,301 
+16792,2012-12-07,4,1,12,10,0,5,1,2,0.32,0.3182,0.76,0.1642,16,150,166 +16793,2012-12-07,4,1,12,11,0,5,1,2,0.32,0.3182,0.76,0.194,20,183,203 +16794,2012-12-07,4,1,12,12,0,5,1,2,0.34,0.3333,0.71,0.194,36,204,240 +16795,2012-12-07,4,1,12,13,0,5,1,2,0.36,0.3485,0.66,0.1343,23,197,220 +16796,2012-12-07,4,1,12,14,0,5,1,2,0.36,0.3485,0.71,0.1642,40,175,215 +16797,2012-12-07,4,1,12,15,0,5,1,2,0.36,0.3485,0.71,0.1642,34,269,303 +16798,2012-12-07,4,1,12,16,0,5,1,1,0.36,0.3485,0.76,0.1343,39,336,375 +16799,2012-12-07,4,1,12,17,0,5,1,2,0.38,0.3939,0.66,0,29,539,568 +16800,2012-12-07,4,1,12,18,0,5,1,2,0.38,0.3939,0.71,0.1045,25,473,498 +16801,2012-12-07,4,1,12,19,0,5,1,2,0.38,0.3939,0.76,0,15,337,352 +16802,2012-12-07,4,1,12,20,0,5,1,3,0.38,0.3939,0.76,0.1045,12,229,241 +16803,2012-12-07,4,1,12,21,0,5,1,2,0.36,0.3636,0.93,0.1045,9,162,171 +16804,2012-12-07,4,1,12,22,0,5,1,2,0.36,0.3636,0.93,0.0896,15,150,165 +16805,2012-12-07,4,1,12,23,0,5,1,2,0.36,0.3636,0.93,0.0896,6,116,122 +16806,2012-12-08,4,1,12,0,0,6,0,2,0.36,0.3636,0.93,0.0896,5,98,103 +16807,2012-12-08,4,1,12,1,0,6,0,2,0.36,0.3485,0.93,0.1343,16,84,100 +16808,2012-12-08,4,1,12,2,0,6,0,2,0.36,0.3485,0.93,0.1343,3,67,70 +16809,2012-12-08,4,1,12,3,0,6,0,2,0.36,0.3636,0.93,0.1045,6,23,29 +16810,2012-12-08,4,1,12,4,0,6,0,2,0.36,0.3788,0.93,0,3,9,12 +16811,2012-12-08,4,1,12,5,0,6,0,2,0.36,0.3636,0.93,0.1045,2,4,6 +16812,2012-12-08,4,1,12,6,0,6,0,2,0.36,0.3636,0.93,0.1045,1,19,20 +16813,2012-12-08,4,1,12,7,0,6,0,2,0.36,0.3636,1,0.0896,3,36,39 +16814,2012-12-08,4,1,12,8,0,6,0,2,0.38,0.3939,0.94,0.0896,5,106,111 +16815,2012-12-08,4,1,12,9,0,6,0,2,0.38,0.3939,0.94,0,17,153,170 +16816,2012-12-08,4,1,12,10,0,6,0,2,0.4,0.4091,0.87,0.1343,43,244,287 +16817,2012-12-08,4,1,12,11,0,6,0,2,0.4,0.4091,0.87,0.1045,63,341,404 +16818,2012-12-08,4,1,12,12,0,6,0,2,0.4,0.4091,0.87,0.2239,122,364,486 +16819,2012-12-08,4,1,12,13,0,6,0,2,0.4,0.4091,0.87,0.1642,148,399,547 +16820,2012-12-08,4,1,12,14,0,6,0,2,0.4,0.4091,0.87,0.2836,164,378,542 
+16821,2012-12-08,4,1,12,15,0,6,0,1,0.42,0.4242,0.82,0.1642,167,374,541 +16822,2012-12-08,4,1,12,16,0,6,0,1,0.42,0.4242,0.82,0.1642,139,368,507 +16823,2012-12-08,4,1,12,17,0,6,0,1,0.38,0.3939,0.87,0.1343,77,268,345 +16824,2012-12-08,4,1,12,18,0,6,0,1,0.4,0.4091,0.87,0.1045,40,264,304 +16825,2012-12-08,4,1,12,19,0,6,0,1,0.4,0.4091,0.87,0,34,212,246 +16826,2012-12-08,4,1,12,20,0,6,0,2,0.36,0.3788,1,0,20,162,182 +16827,2012-12-08,4,1,12,21,0,6,0,2,0.36,0.3788,1,0,34,175,209 +16828,2012-12-08,4,1,12,22,0,6,0,2,0.38,0.3939,0.94,0.1045,23,137,160 +16829,2012-12-08,4,1,12,23,0,6,0,2,0.4,0.4091,0.94,0,18,144,162 +16830,2012-12-09,4,1,12,0,0,0,0,2,0.4,0.4091,0.87,0.1343,15,103,118 +16831,2012-12-09,4,1,12,1,0,0,0,2,0.4,0.4091,0.87,0.1045,17,85,102 +16832,2012-12-09,4,1,12,2,0,0,0,2,0.4,0.4091,0.87,0.1045,8,70,78 +16833,2012-12-09,4,1,12,3,0,0,0,2,0.4,0.4091,0.87,0.1045,14,34,48 +16834,2012-12-09,4,1,12,4,0,0,0,2,0.4,0.4091,0.94,0.1045,1,11,12 +16835,2012-12-09,4,1,12,5,0,0,0,2,0.4,0.4091,0.87,0.1045,0,8,8 +16836,2012-12-09,4,1,12,6,0,0,0,3,0.4,0.4091,0.94,0.1045,0,6,6 +16837,2012-12-09,4,1,12,7,0,0,0,3,0.4,0.4091,0.87,0.2239,1,22,23 +16838,2012-12-09,4,1,12,8,0,0,0,3,0.4,0.4091,0.87,0.1642,1,68,69 +16839,2012-12-09,4,1,12,9,0,0,0,2,0.4,0.4091,0.87,0.2537,9,94,103 +16840,2012-12-09,4,1,12,10,0,0,0,2,0.4,0.4091,0.87,0.2836,39,180,219 +16841,2012-12-09,4,1,12,11,0,0,0,2,0.4,0.4091,0.87,0.2985,40,210,250 +16842,2012-12-09,4,1,12,12,0,0,0,2,0.4,0.4091,0.82,0.2985,60,255,315 +16843,2012-12-09,4,1,12,13,0,0,0,2,0.38,0.3939,0.87,0.2836,65,220,285 +16844,2012-12-09,4,1,12,14,0,0,0,2,0.38,0.3939,0.87,0.194,42,190,232 +16845,2012-12-09,4,1,12,15,0,0,0,3,0.36,0.3485,0.93,0.1642,25,200,225 +16846,2012-12-09,4,1,12,16,0,0,0,3,0.36,0.3485,0.93,0.1642,33,220,253 +16847,2012-12-09,4,1,12,17,0,0,0,3,0.36,0.3485,0.93,0.1343,20,209,229 +16848,2012-12-09,4,1,12,18,0,0,0,3,0.36,0.3636,0.93,0.0896,17,181,198 +16849,2012-12-09,4,1,12,19,0,0,0,3,0.38,0.3939,0.94,0.0896,12,110,122 
+16850,2012-12-09,4,1,12,20,0,0,0,3,0.36,0.3636,1,0.0896,6,102,108 +16851,2012-12-09,4,1,12,21,0,0,0,2,0.36,0.3636,1,0.1045,10,86,96 +16852,2012-12-09,4,1,12,22,0,0,0,3,0.36,0.3636,0.93,0.1045,3,75,78 +16853,2012-12-09,4,1,12,23,0,0,0,3,0.36,0.3636,1,0.0896,3,48,51 +16854,2012-12-10,4,1,12,0,0,1,1,3,0.36,0.3636,1,0.0896,0,20,20 +16855,2012-12-10,4,1,12,1,0,1,1,2,0.36,0.3788,1,0,0,4,4 +16856,2012-12-10,4,1,12,2,0,1,1,2,0.38,0.3939,0.94,0.1045,2,3,5 +16857,2012-12-10,4,1,12,3,0,1,1,2,0.38,0.3939,0.94,0.1045,0,4,4 +16858,2012-12-10,4,1,12,4,0,1,1,2,0.38,0.3939,0.94,0.1045,3,9,12 +16859,2012-12-10,4,1,12,5,0,1,1,2,0.38,0.3939,0.94,0.1045,0,27,27 +16860,2012-12-10,4,1,12,6,0,1,1,2,0.38,0.3939,0.94,0.1642,2,121,123 +16861,2012-12-10,4,1,12,7,0,1,1,2,0.38,0.3939,0.94,0.2537,3,291,294 +16862,2012-12-10,4,1,12,8,0,1,1,2,0.42,0.4242,1,0.2537,9,575,584 +16863,2012-12-10,4,1,12,9,0,1,1,2,0.42,0.4242,1,0.2239,11,273,284 +16864,2012-12-10,4,1,12,10,0,1,1,2,0.44,0.4394,0.94,0.2239,12,121,133 +16865,2012-12-10,4,1,12,11,0,1,1,3,0.46,0.4545,0.94,0.2239,8,126,134 +16866,2012-12-10,4,1,12,12,0,1,1,3,0.44,0.4394,1,0.2239,23,150,173 +16867,2012-12-10,4,1,12,13,0,1,1,3,0.44,0.4394,1,0.2239,30,190,220 +16868,2012-12-10,4,1,12,14,0,1,1,2,0.5,0.4848,0.94,0.2239,31,179,210 +16869,2012-12-10,4,1,12,15,0,1,1,2,0.5,0.4848,0.87,0.1642,29,207,236 +16870,2012-12-10,4,1,12,16,0,1,1,2,0.5,0.4848,0.88,0.1045,37,308,345 +16871,2012-12-10,4,1,12,17,0,1,1,2,0.48,0.4697,0.82,0.2836,38,578,616 +16872,2012-12-10,4,1,12,18,0,1,1,2,0.46,0.4545,0.88,0.2836,20,544,564 +16873,2012-12-10,4,1,12,19,0,1,1,1,0.52,0.5,0.77,0.2836,18,409,427 +16874,2012-12-10,4,1,12,20,0,1,1,1,0.46,0.4545,0.88,0.2537,13,287,300 +16875,2012-12-10,4,1,12,21,0,1,1,2,0.46,0.4545,0.94,0.194,21,224,245 +16876,2012-12-10,4,1,12,22,0,1,1,2,0.5,0.4848,0.82,0.2239,11,115,126 +16877,2012-12-10,4,1,12,23,0,1,1,1,0.46,0.4545,0.88,0.2537,8,76,84 +16878,2012-12-11,4,1,12,0,0,2,1,3,0.46,0.4545,0.77,0.4627,2,29,31 
+16879,2012-12-11,4,1,12,1,0,2,1,1,0.42,0.4242,0.71,0.4478,1,7,8 +16880,2012-12-11,4,1,12,2,0,2,1,2,0.4,0.4091,0.66,0.3284,0,1,1 +16881,2012-12-11,4,1,12,3,0,2,1,3,0.36,0.3333,0.76,0.2836,0,3,3 +16882,2012-12-11,4,1,12,4,0,2,1,2,0.34,0.3182,0.76,0.2836,0,8,8 +16883,2012-12-11,4,1,12,5,0,2,1,3,0.34,0.303,0.71,0.3284,1,40,41 +16884,2012-12-11,4,1,12,6,0,2,1,2,0.34,0.303,0.71,0.2985,0,118,118 +16885,2012-12-11,4,1,12,7,0,2,1,1,0.34,0.303,0.66,0.4179,8,372,380 +16886,2012-12-11,4,1,12,8,0,2,1,2,0.34,0.303,0.61,0.2985,16,708,724 +16887,2012-12-11,4,1,12,9,0,2,1,1,0.34,0.303,0.61,0.3881,12,322,334 +16888,2012-12-11,4,1,12,10,0,2,1,1,0.36,0.3333,0.57,0.3284,12,142,154 +16889,2012-12-11,4,1,12,11,0,2,1,2,0.36,0.3333,0.53,0.3582,28,145,173 +16890,2012-12-11,4,1,12,12,0,2,1,2,0.36,0.3182,0.53,0.4478,13,213,226 +16891,2012-12-11,4,1,12,13,0,2,1,2,0.38,0.3939,0.5,0.2239,28,226,254 +16892,2012-12-11,4,1,12,14,0,2,1,2,0.38,0.3939,0.5,0.2239,19,185,204 +16893,2012-12-11,4,1,12,15,0,2,1,2,0.38,0.3939,0.5,0.2239,20,250,270 +16894,2012-12-11,4,1,12,16,0,2,1,2,0.38,0.3939,0.5,0.2239,24,334,358 +16895,2012-12-11,4,1,12,17,0,2,1,1,0.32,0.303,0.53,0.2239,21,580,601 +16896,2012-12-11,4,1,12,18,0,2,1,1,0.32,0.303,0.53,0.2239,23,523,546 +16897,2012-12-11,4,1,12,19,0,2,1,1,0.32,0.303,0.53,0.2239,21,412,433 +16898,2012-12-11,4,1,12,20,0,2,1,1,0.32,0.303,0.53,0.2239,9,248,257 +16899,2012-12-11,4,1,12,21,0,2,1,1,0.32,0.3182,0.53,0.194,14,193,207 +16900,2012-12-11,4,1,12,22,0,2,1,1,0.3,0.2879,0.56,0.2239,6,100,106 +16901,2012-12-11,4,1,12,23,0,2,1,1,0.3,0.2879,0.52,0.2239,4,60,64 +16902,2012-12-12,4,1,12,0,0,3,1,1,0.3,0.303,0.52,0.1642,1,33,34 +16903,2012-12-12,4,1,12,1,0,3,1,1,0.28,0.2727,0.61,0.2239,3,18,21 +16904,2012-12-12,4,1,12,2,0,3,1,1,0.28,0.2727,0.56,0.1642,5,4,9 +16905,2012-12-12,4,1,12,3,0,3,1,1,0.26,0.2727,0.6,0.1343,3,7,10 +16906,2012-12-12,4,1,12,4,0,3,1,1,0.26,0.303,0.6,0,0,4,4 +16907,2012-12-12,4,1,12,5,0,3,1,1,0.26,0.2576,0.6,0.194,0,37,37 
+16908,2012-12-12,4,1,12,6,0,3,1,2,0.26,0.2424,0.6,0.2537,2,126,128 +16909,2012-12-12,4,1,12,7,0,3,1,2,0.26,0.2576,0.6,0.194,3,366,369 +16910,2012-12-12,4,1,12,8,0,3,1,2,0.26,0.2576,0.65,0.1642,18,670,688 +16911,2012-12-12,4,1,12,9,0,3,1,2,0.28,0.2879,0.61,0.1343,13,272,285 +16912,2012-12-12,4,1,12,10,0,3,1,2,0.3,0.303,0.56,0.1642,20,116,136 +16913,2012-12-12,4,1,12,11,0,3,1,2,0.32,0.303,0.49,0.2239,24,148,172 +16914,2012-12-12,4,1,12,12,0,3,1,2,0.34,0.3636,0.42,0,14,218,232 +16915,2012-12-12,4,1,12,13,0,3,1,2,0.34,0.3636,0.42,0,18,220,238 +16916,2012-12-12,4,1,12,14,0,3,1,2,0.34,0.3636,0.46,0,34,191,225 +16917,2012-12-12,4,1,12,15,0,3,1,2,0.34,0.3485,0.46,0.1045,21,207,228 +16918,2012-12-12,4,1,12,16,0,3,1,2,0.34,0.3333,0.46,0.1343,19,310,329 +16919,2012-12-12,4,1,12,17,0,3,1,2,0.32,0.303,0.53,0.2239,21,540,561 +16920,2012-12-12,4,1,12,18,0,3,1,2,0.32,0.303,0.49,0.2239,25,515,540 +16921,2012-12-12,4,1,12,19,0,3,1,2,0.3,0.2879,0.56,0.2239,18,384,402 +16922,2012-12-12,4,1,12,20,0,3,1,1,0.3,0.2879,0.52,0.2836,25,243,268 +16923,2012-12-12,4,1,12,21,0,3,1,2,0.3,0.2879,0.52,0.2537,16,186,202 +16924,2012-12-12,4,1,12,22,0,3,1,2,0.3,0.2879,0.52,0.2239,4,118,122 +16925,2012-12-12,4,1,12,23,0,3,1,2,0.28,0.2727,0.56,0.2239,3,76,79 +16926,2012-12-13,4,1,12,0,0,4,1,2,0.28,0.2727,0.52,0.2239,1,31,32 +16927,2012-12-13,4,1,12,1,0,4,1,2,0.28,0.2576,0.52,0.2836,4,19,23 +16928,2012-12-13,4,1,12,2,0,4,1,2,0.26,0.2424,0.56,0.2537,3,5,8 +16929,2012-12-13,4,1,12,3,0,4,1,2,0.26,0.2424,0.56,0.2836,0,2,2 +16930,2012-12-13,4,1,12,4,0,4,1,2,0.26,0.2576,0.56,0.194,0,8,8 +16931,2012-12-13,4,1,12,5,0,4,1,2,0.26,0.2576,0.56,0.1642,2,31,33 +16932,2012-12-13,4,1,12,6,0,4,1,1,0.26,0.2424,0.56,0.2537,2,112,114 +16933,2012-12-13,4,1,12,7,0,4,1,1,0.24,0.2273,0.6,0.2239,5,380,385 +16934,2012-12-13,4,1,12,8,0,4,1,1,0.24,0.2273,0.6,0.194,24,655,679 +16935,2012-12-13,4,1,12,9,0,4,1,1,0.28,0.2727,0.52,0.2537,25,300,325 +16936,2012-12-13,4,1,12,10,0,4,1,1,0.32,0.303,0.45,0.2836,28,139,167 
+16937,2012-12-13,4,1,12,11,0,4,1,1,0.34,0.3182,0.42,0.2537,25,164,189 +16938,2012-12-13,4,1,12,12,0,4,1,1,0.36,0.3333,0.4,0.2537,30,252,282 +16939,2012-12-13,4,1,12,13,0,4,1,1,0.36,0.3485,0.34,0.194,41,230,271 +16940,2012-12-13,4,1,12,14,0,4,1,1,0.36,0.3485,0.34,0.1642,31,211,242 +16941,2012-12-13,4,1,12,15,0,4,1,1,0.36,0.3485,0.34,0.1343,40,240,280 +16942,2012-12-13,4,1,12,16,0,4,1,1,0.34,0.3485,0.39,0.1045,50,356,406 +16943,2012-12-13,4,1,12,17,0,4,1,1,0.32,0.3333,0.42,0.1045,43,507,550 +16944,2012-12-13,4,1,12,18,0,4,1,1,0.32,0.3333,0.42,0.1045,20,446,466 +16945,2012-12-13,4,1,12,19,0,4,1,1,0.3,0.303,0.45,0.1642,20,328,348 +16946,2012-12-13,4,1,12,20,0,4,1,1,0.3,0.3333,0.45,0,6,235,241 +16947,2012-12-13,4,1,12,21,0,4,1,1,0.28,0.3182,0.48,0,13,200,213 +16948,2012-12-13,4,1,12,22,0,4,1,1,0.26,0.2879,0.6,0.0896,7,141,148 +16949,2012-12-13,4,1,12,23,0,4,1,1,0.26,0.303,0.6,0,5,115,120 +16950,2012-12-14,4,1,12,0,0,5,1,1,0.22,0.2576,0.64,0.0896,4,43,47 +16951,2012-12-14,4,1,12,1,0,5,1,1,0.22,0.2576,0.69,0.0896,0,26,26 +16952,2012-12-14,4,1,12,2,0,5,1,1,0.22,0.2727,0.69,0,3,6,9 +16953,2012-12-14,4,1,12,3,0,5,1,1,0.2,0.2273,0.69,0.1045,0,12,12 +16954,2012-12-14,4,1,12,4,0,5,1,1,0.2,0.2273,0.75,0.0896,1,9,10 +16955,2012-12-14,4,1,12,5,0,5,1,1,0.2,0.2273,0.69,0.1045,0,34,34 +16956,2012-12-14,4,1,12,6,0,5,1,1,0.16,0.1818,0.8,0.1045,1,112,113 +16957,2012-12-14,4,1,12,7,0,5,1,1,0.2,0.2121,0.72,0.1045,3,305,308 +16958,2012-12-14,4,1,12,8,0,5,1,1,0.2,0.2273,0.69,0.0896,13,623,636 +16959,2012-12-14,4,1,12,9,0,5,1,1,0.24,0.2424,0.75,0.1343,13,330,343 +16960,2012-12-14,4,1,12,10,0,5,1,1,0.26,0.2576,0.81,0.194,28,162,190 +16961,2012-12-14,4,1,12,11,0,5,1,1,0.32,0.303,0.65,0.194,31,180,211 +16962,2012-12-14,4,1,12,12,0,5,1,1,0.36,0.3485,0.5,0.194,29,244,273 +16963,2012-12-14,4,1,12,13,0,5,1,1,0.36,0.3485,0.5,0.2239,39,274,313 +16964,2012-12-14,4,1,12,14,0,5,1,1,0.4,0.4091,0.43,0.194,48,251,299 +16965,2012-12-14,4,1,12,15,0,5,1,1,0.4,0.4091,0.47,0.1642,57,252,309 
+16966,2012-12-14,4,1,12,16,0,5,1,1,0.38,0.3939,0.54,0.194,45,372,417 +16967,2012-12-14,4,1,12,17,0,5,1,1,0.34,0.3333,0.57,0.1642,40,582,622 +16968,2012-12-14,4,1,12,18,0,5,1,1,0.36,0.3485,0.46,0.1343,23,432,455 +16969,2012-12-14,4,1,12,19,0,5,1,1,0.34,0.3485,0.53,0.0896,17,302,319 +16970,2012-12-14,4,1,12,20,0,5,1,1,0.32,0.3333,0.61,0.1343,12,209,221 +16971,2012-12-14,4,1,12,21,0,5,1,1,0.3,0.303,0.75,0.1642,7,165,172 +16972,2012-12-14,4,1,12,22,0,5,1,1,0.28,0.2879,0.75,0.1045,4,134,138 +16973,2012-12-14,4,1,12,23,0,5,1,1,0.28,0.303,0.75,0.0896,11,123,134 +16974,2012-12-15,4,1,12,0,0,6,0,1,0.3,0.3333,0.7,0,4,90,94 +16975,2012-12-15,4,1,12,1,0,6,0,2,0.26,0.2879,0.81,0.0896,9,86,95 +16976,2012-12-15,4,1,12,2,0,6,0,1,0.26,0.303,0.81,0,6,63,69 +16977,2012-12-15,4,1,12,3,0,6,0,2,0.24,0.2879,0.81,0,5,18,23 +16978,2012-12-15,4,1,12,4,0,6,0,2,0.24,0.2576,0.87,0.1045,1,5,6 +16979,2012-12-15,4,1,12,5,0,6,0,2,0.26,0.303,0.75,0,0,3,3 +16980,2012-12-15,4,1,12,6,0,6,0,2,0.24,0.2879,0.75,0,0,11,11 +16981,2012-12-15,4,1,12,7,0,6,0,2,0.24,0.2576,0.75,0.0896,1,47,48 +16982,2012-12-15,4,1,12,8,0,6,0,2,0.26,0.2879,0.75,0.0896,8,111,119 +16983,2012-12-15,4,1,12,9,0,6,0,1,0.26,0.2727,0.75,0.1045,17,203,220 +16984,2012-12-15,4,1,12,10,0,6,0,1,0.32,0.3182,0.61,0.1642,23,250,273 +16985,2012-12-15,4,1,12,11,0,6,0,1,0.34,0.3485,0.61,0.0896,66,327,393 +16986,2012-12-15,4,1,12,12,0,6,0,1,0.38,0.3939,0.5,0.0896,89,364,453 +16987,2012-12-15,4,1,12,13,0,6,0,1,0.38,0.3939,0.54,0.2537,88,368,456 +16988,2012-12-15,4,1,12,14,0,6,0,1,0.4,0.4091,0.5,0.1642,92,334,426 +16989,2012-12-15,4,1,12,15,0,6,0,1,0.42,0.4242,0.47,0,95,352,447 +16990,2012-12-15,4,1,12,16,0,6,0,1,0.4,0.4091,0.5,0.1343,85,328,413 +16991,2012-12-15,4,1,12,17,0,6,0,1,0.4,0.4091,0.5,0.2239,35,274,309 +16992,2012-12-15,4,1,12,18,0,6,0,1,0.38,0.3939,0.58,0.1642,40,232,272 +16993,2012-12-15,4,1,12,19,0,6,0,1,0.36,0.3485,0.62,0.1343,32,225,257 +16994,2012-12-15,4,1,12,20,0,6,0,2,0.36,0.3485,0.57,0.194,23,178,201 
+16995,2012-12-15,4,1,12,21,0,6,0,1,0.36,0.3485,0.62,0.1343,15,169,184 +16996,2012-12-15,4,1,12,22,0,6,0,1,0.36,0.3485,0.62,0.194,22,134,156 +16997,2012-12-15,4,1,12,23,0,6,0,2,0.36,0.3485,0.62,0.1343,11,108,119 +16998,2012-12-16,4,1,12,0,0,0,0,2,0.36,0.3788,0.62,0,8,102,110 +16999,2012-12-16,4,1,12,1,0,0,0,3,0.34,0.3485,0.76,0.1045,14,82,96 +17000,2012-12-16,4,1,12,2,0,0,0,2,0.34,0.3485,0.87,0.0896,8,79,87 +17001,2012-12-16,4,1,12,3,0,0,0,2,0.34,0.3333,0.87,0.194,1,37,38 +17002,2012-12-16,4,1,12,4,0,0,0,2,0.34,0.3636,0.87,0,1,10,11 +17003,2012-12-16,4,1,12,5,0,0,0,2,0.34,0.3636,0.87,0,0,9,9 +17004,2012-12-16,4,1,12,6,0,0,0,2,0.34,0.3636,0.87,0,0,6,6 +17005,2012-12-16,4,1,12,7,0,0,0,2,0.34,0.3485,0.87,0.0896,5,22,27 +17006,2012-12-16,4,1,12,8,0,0,0,2,0.36,0.3485,0.87,0.1343,12,76,88 +17007,2012-12-16,4,1,12,9,0,0,0,2,0.36,0.3485,0.87,0.1343,19,113,132 +17008,2012-12-16,4,1,12,10,0,0,0,2,0.36,0.3485,0.87,0.1343,42,215,257 +17009,2012-12-16,4,1,12,11,0,0,0,2,0.36,0.3485,0.87,0.1343,47,248,295 +17010,2012-12-16,4,1,12,12,0,0,0,2,0.38,0.3939,0.82,0.194,67,350,417 +17011,2012-12-16,4,1,12,13,0,0,0,2,0.38,0.3939,0.76,0.194,67,289,356 +17012,2012-12-16,4,1,12,14,0,0,0,2,0.38,0.3939,0.76,0.1343,50,260,310 +17013,2012-12-16,4,1,12,15,0,0,0,2,0.38,0.3939,0.82,0.1045,46,292,338 +17014,2012-12-16,4,1,12,16,0,0,0,2,0.38,0.3939,0.82,0.1045,66,334,400 +17015,2012-12-16,4,1,12,17,0,0,0,2,0.38,0.3939,0.82,0.1045,29,214,243 +17016,2012-12-16,4,1,12,18,0,0,0,3,0.38,0.3939,0.82,0.1045,8,99,107 +17017,2012-12-16,4,1,12,19,0,0,0,1,0.36,0.3485,0.93,0.1343,10,99,109 +17018,2012-12-16,4,1,12,20,0,0,0,2,0.38,0.3939,0.82,0,14,108,122 +17019,2012-12-16,4,1,12,21,0,0,0,2,0.36,0.3788,0.93,0,14,92,106 +17020,2012-12-16,4,1,12,22,0,0,0,2,0.4,0.4091,0.82,0.194,6,83,89 +17021,2012-12-16,4,1,12,23,0,0,0,2,0.36,0.3485,0.93,0.1343,4,29,33 +17022,2012-12-17,4,1,12,0,0,1,1,2,0.38,0.3939,0.87,0,2,26,28 +17023,2012-12-17,4,1,12,1,0,1,1,2,0.38,0.3939,0.87,0.1045,1,14,15 
+17024,2012-12-17,4,1,12,2,0,1,1,2,0.38,0.3939,0.94,0,1,4,5 +17025,2012-12-17,4,1,12,3,0,1,1,2,0.36,0.3788,0.93,0,0,3,3 +17026,2012-12-17,4,1,12,4,0,1,1,3,0.36,0.3788,1,0,2,3,5 +17027,2012-12-17,4,1,12,5,0,1,1,2,0.38,0.3939,0.87,0,0,24,24 +17028,2012-12-17,4,1,12,6,0,1,1,2,0.36,0.3485,0.93,0.1343,1,107,108 +17029,2012-12-17,4,1,12,7,0,1,1,2,0.36,0.3485,0.93,0.1343,5,314,319 +17030,2012-12-17,4,1,12,8,0,1,1,2,0.38,0.3939,0.87,0.1045,10,582,592 +17031,2012-12-17,4,1,12,9,0,1,1,2,0.4,0.4091,0.87,0,11,271,282 +17032,2012-12-17,4,1,12,10,0,1,1,2,0.38,0.3939,0.87,0.0896,15,120,135 +17033,2012-12-17,4,1,12,11,0,1,1,2,0.4,0.4091,0.87,0.0896,22,148,170 +17034,2012-12-17,4,1,12,12,0,1,1,2,0.4,0.4091,0.87,0,21,211,232 +17035,2012-12-17,4,1,12,13,0,1,1,2,0.4,0.4091,0.87,0.1343,16,194,210 +17036,2012-12-17,4,1,12,14,0,1,1,2,0.4,0.4091,0.87,0.0896,12,169,181 +17037,2012-12-17,4,1,12,15,0,1,1,2,0.42,0.4242,0.88,0,15,196,211 +17038,2012-12-17,4,1,12,16,0,1,1,3,0.4,0.4091,0.94,0.2537,15,287,302 +17039,2012-12-17,4,1,12,17,0,1,1,2,0.4,0.4091,0.94,0.2537,17,478,495 +17040,2012-12-17,4,1,12,18,0,1,1,2,0.4,0.4091,0.94,0.0896,14,493,507 +17041,2012-12-17,4,1,12,19,0,1,1,2,0.42,0.4242,0.88,0.1343,7,333,340 +17042,2012-12-17,4,1,12,20,0,1,1,2,0.42,0.4242,0.94,0.2537,8,192,200 +17043,2012-12-17,4,1,12,21,0,1,1,2,0.42,0.4242,0.94,0.1343,6,114,120 +17044,2012-12-17,4,1,12,22,0,1,1,2,0.42,0.4242,0.94,0.1343,5,49,54 +17045,2012-12-17,4,1,12,23,0,1,1,3,0.42,0.4242,0.94,0.2239,6,41,47 +17046,2012-12-18,4,1,12,0,0,2,1,2,0.44,0.4394,0.94,0.1343,0,18,18 +17047,2012-12-18,4,1,12,1,0,2,1,2,0.44,0.4394,0.94,0.1343,0,15,15 +17048,2012-12-18,4,1,12,2,0,2,1,2,0.44,0.4394,0.88,0.2239,2,5,7 +17049,2012-12-18,4,1,12,3,0,2,1,1,0.42,0.4242,0.88,0.194,0,5,5 +17050,2012-12-18,4,1,12,4,0,2,1,1,0.42,0.4242,0.82,0.1642,3,5,8 +17051,2012-12-18,4,1,12,5,0,2,1,1,0.38,0.3939,0.87,0.0896,0,36,36 +17052,2012-12-18,4,1,12,6,0,2,1,1,0.36,0.3485,0.93,0.1642,1,117,118 
+17053,2012-12-18,4,1,12,7,0,2,1,1,0.36,0.3485,0.93,0.1343,4,351,355 +17054,2012-12-18,4,1,12,8,0,2,1,1,0.38,0.3939,0.94,0,10,652,662 +17055,2012-12-18,4,1,12,9,0,2,1,1,0.4,0.4091,0.87,0.0896,19,307,326 +17056,2012-12-18,4,1,12,10,0,2,1,1,0.44,0.4394,0.77,0.0896,22,162,184 +17057,2012-12-18,4,1,12,11,0,2,1,1,0.48,0.4697,0.63,0.2239,58,211,269 +17058,2012-12-18,4,1,12,12,0,2,1,3,0.48,0.4697,0.48,0.2537,49,264,313 +17059,2012-12-18,4,1,12,13,0,2,1,1,0.5,0.4848,0.42,0.2836,51,235,286 +17060,2012-12-18,4,1,12,14,0,2,1,1,0.46,0.4545,0.47,0.4478,56,191,247 +17061,2012-12-18,4,1,12,15,0,2,1,1,0.46,0.4545,0.44,0.4925,28,218,246 +17062,2012-12-18,4,1,12,16,0,2,1,1,0.44,0.4394,0.41,0.4627,40,323,363 +17063,2012-12-18,4,1,12,17,0,2,1,1,0.4,0.4091,0.47,0.4478,39,533,572 +17064,2012-12-18,4,1,12,18,0,2,1,1,0.38,0.3939,0.46,0.3284,13,512,525 +17065,2012-12-18,4,1,12,19,0,2,1,1,0.38,0.3939,0.46,0.3881,19,334,353 +17066,2012-12-18,4,1,12,20,0,2,1,1,0.36,0.3333,0.5,0.2537,4,264,268 +17067,2012-12-18,4,1,12,21,0,2,1,1,0.36,0.3485,0.5,0.2239,9,159,168 +17068,2012-12-18,4,1,12,22,0,2,1,1,0.34,0.3333,0.49,0,5,127,132 +17069,2012-12-18,4,1,12,23,0,2,1,1,0.34,0.3485,0.49,0.0896,1,80,81 +17070,2012-12-19,4,1,12,0,0,3,1,1,0.3,0.3182,0.61,0,6,35,41 +17071,2012-12-19,4,1,12,1,0,3,1,1,0.3,0.3182,0.65,0.0896,1,14,15 +17072,2012-12-19,4,1,12,2,0,3,1,1,0.28,0.303,0.65,0.0896,1,2,3 +17073,2012-12-19,4,1,12,3,0,3,1,1,0.26,0.2727,0.75,0.1343,0,5,5 +17074,2012-12-19,4,1,12,4,0,3,1,1,0.24,0.2424,0.75,0.1343,1,6,7 +17075,2012-12-19,4,1,12,5,0,3,1,1,0.26,0.2879,0.75,0.0896,2,29,31 +17076,2012-12-19,4,1,12,6,0,3,1,1,0.24,0.2576,0.75,0.0896,3,109,112 +17077,2012-12-19,4,1,12,7,0,3,1,1,0.26,0.2727,0.75,0.1343,3,360,363 +17078,2012-12-19,4,1,12,8,0,3,1,1,0.24,0.2576,0.87,0.1045,13,665,678 +17079,2012-12-19,4,1,12,9,0,3,1,1,0.28,0.2879,0.75,0.1045,8,309,317 +17080,2012-12-19,4,1,12,10,0,3,1,1,0.32,0.3333,0.7,0.1045,17,147,164 +17081,2012-12-19,4,1,12,11,0,3,1,1,0.4,0.4091,0.54,0.2239,31,169,200 
+17082,2012-12-19,4,1,12,12,0,3,1,1,0.4,0.4091,0.54,0.2836,33,203,236 +17083,2012-12-19,4,1,12,13,0,3,1,1,0.42,0.4242,0.5,0.194,30,183,213 +17084,2012-12-19,4,1,12,14,0,3,1,1,0.42,0.4242,0.5,0.194,33,185,218 +17085,2012-12-19,4,1,12,15,0,3,1,1,0.42,0.4242,0.5,0.2836,28,209,237 +17086,2012-12-19,4,1,12,16,0,3,1,1,0.42,0.4242,0.5,0.3582,37,297,334 +17087,2012-12-19,4,1,12,17,0,3,1,1,0.4,0.4091,0.5,0.3881,26,536,562 +17088,2012-12-19,4,1,12,18,0,3,1,1,0.38,0.3939,0.5,0.3582,23,546,569 +17089,2012-12-19,4,1,12,19,0,3,1,1,0.38,0.3939,0.5,0.3881,7,329,336 +17090,2012-12-19,4,1,12,20,0,3,1,1,0.36,0.3485,0.57,0.2239,10,231,241 +17091,2012-12-19,4,1,12,21,0,3,1,1,0.34,0.3182,0.61,0.2239,4,164,168 +17092,2012-12-19,4,1,12,22,0,3,1,1,0.34,0.3485,0.61,0.0896,12,117,129 +17093,2012-12-19,4,1,12,23,0,3,1,1,0.32,0.3333,0.66,0.1343,4,84,88 +17094,2012-12-20,4,1,12,0,0,4,1,1,0.32,0.3333,0.61,0.1343,2,40,42 +17095,2012-12-20,4,1,12,1,0,4,1,1,0.32,0.3485,0.66,0,1,19,20 +17096,2012-12-20,4,1,12,2,0,4,1,1,0.32,0.3485,0.66,0,2,6,8 +17097,2012-12-20,4,1,12,3,0,4,1,2,0.3,0.3182,0.7,0.0896,1,3,4 +17098,2012-12-20,4,1,12,4,0,4,1,2,0.3,0.3182,0.7,0.0896,0,6,6 +17099,2012-12-20,4,1,12,5,0,4,1,2,0.3,0.3182,0.7,0.1045,0,35,35 +17100,2012-12-20,4,1,12,6,0,4,1,2,0.3,0.3333,0.65,0,4,114,118 +17101,2012-12-20,4,1,12,7,0,4,1,2,0.3,0.3333,0.7,0,4,346,350 +17102,2012-12-20,4,1,12,8,0,4,1,2,0.3,0.3333,0.7,0,14,585,599 +17103,2012-12-20,4,1,12,9,0,4,1,2,0.3,0.3333,0.7,0,14,303,317 +17104,2012-12-20,4,1,12,10,0,4,1,2,0.32,0.3333,0.66,0,23,138,161 +17105,2012-12-20,4,1,12,11,0,4,1,2,0.34,0.3182,0.57,0.2239,22,168,190 +17106,2012-12-20,4,1,12,12,0,4,1,2,0.36,0.3485,0.53,0.1642,31,181,212 +17107,2012-12-20,4,1,12,13,0,4,1,2,0.36,0.3485,0.57,0.194,46,171,217 +17108,2012-12-20,4,1,12,14,0,4,1,2,0.36,0.3485,0.53,0.194,43,171,214 +17109,2012-12-20,4,1,12,15,0,4,1,2,0.34,0.3333,0.57,0.194,33,216,249 +17110,2012-12-20,4,1,12,16,0,4,1,2,0.34,0.303,0.53,0.3284,21,281,302 
+17111,2012-12-20,4,1,12,17,0,4,1,2,0.34,0.3333,0.66,0.194,25,450,475 +17112,2012-12-20,4,1,12,18,0,4,1,2,0.34,0.3333,0.71,0.194,22,359,381 +17113,2012-12-20,4,1,12,19,0,4,1,3,0.34,0.3182,0.71,0.2537,3,115,118 +17114,2012-12-20,4,1,12,20,0,4,1,3,0.34,0.3333,0.76,0.1642,1,49,50 +17115,2012-12-20,4,1,12,21,0,4,1,3,0.34,0.3333,0.76,0.1642,1,25,26 +17116,2012-12-20,4,1,12,22,0,4,1,3,0.34,0.3333,0.87,0.194,1,20,21 +17117,2012-12-20,4,1,12,23,0,4,1,3,0.4,0.4091,0.82,0.2985,0,13,13 +17118,2012-12-21,1,1,12,0,0,5,1,3,0.42,0.4242,0.88,0.2985,0,17,17 +17119,2012-12-21,1,1,12,1,0,5,1,3,0.44,0.4394,0.88,0.3881,2,7,9 +17120,2012-12-21,1,1,12,2,0,5,1,3,0.46,0.4545,0.94,0.3881,0,3,3 +17121,2012-12-21,1,1,12,3,0,5,1,3,0.46,0.4545,0.94,0.3881,0,3,3 +17122,2012-12-21,1,1,12,4,0,5,1,2,0.36,0.3182,0.71,0.4925,1,5,6 +17123,2012-12-21,1,1,12,5,0,5,1,2,0.34,0.303,0.76,0.4179,0,13,13 +17124,2012-12-21,1,1,12,6,0,5,1,2,0.34,0.2879,0.57,0.5821,4,76,80 +17125,2012-12-21,1,1,12,7,0,5,1,2,0.32,0.3182,0.57,0.194,3,205,208 +17126,2012-12-21,1,1,12,8,0,5,1,2,0.32,0.303,0.57,0.2836,8,464,472 +17127,2012-12-21,1,1,12,9,0,5,1,2,0.32,0.2727,0.45,0.5522,2,265,267 +17128,2012-12-21,1,1,12,10,0,5,1,1,0.32,0.2879,0.42,0.4627,8,146,154 +17129,2012-12-21,1,1,12,11,0,5,1,2,0.32,0.2879,0.42,0.4179,12,150,162 +17130,2012-12-21,1,1,12,12,0,5,1,2,0.32,0.2879,0.39,0.3582,25,199,224 +17131,2012-12-21,1,1,12,13,0,5,1,2,0.32,0.2879,0.39,0.4179,26,214,240 +17132,2012-12-21,1,1,12,14,0,5,1,2,0.32,0.2879,0.39,0.4478,20,199,219 +17133,2012-12-21,1,1,12,15,0,5,1,2,0.3,0.2727,0.42,0.3881,29,234,263 +17134,2012-12-21,1,1,12,16,0,5,1,2,0.3,0.2727,0.39,0.3284,28,253,281 +17135,2012-12-21,1,1,12,17,0,5,1,2,0.26,0.2273,0.56,0.3284,24,297,321 +17136,2012-12-21,1,1,12,18,0,5,1,2,0.26,0.2273,0.56,0.2985,7,236,243 +17137,2012-12-21,1,1,12,19,0,5,1,1,0.28,0.2576,0.45,0.2985,5,148,153 +17138,2012-12-21,1,1,12,20,0,5,1,1,0.28,0.2727,0.41,0.2537,8,104,112 +17139,2012-12-21,1,1,12,21,0,5,1,1,0.26,0.2273,0.41,0.4179,2,68,70 
+17140,2012-12-21,1,1,12,22,0,5,1,1,0.26,0.2273,0.44,0.2985,6,57,63 +17141,2012-12-21,1,1,12,23,0,5,1,1,0.26,0.2424,0.44,0.2836,1,39,40 +17142,2012-12-22,1,1,12,0,0,6,0,1,0.26,0.2121,0.44,0.5821,1,30,31 +17143,2012-12-22,1,1,12,1,0,6,0,1,0.26,0.2121,0.44,0.4478,1,34,35 +17144,2012-12-22,1,1,12,2,0,6,0,1,0.26,0.2121,0.44,0.5224,0,23,23 +17145,2012-12-22,1,1,12,3,0,6,0,1,0.26,0.2121,0.48,0.4478,0,7,7 +17146,2012-12-22,1,1,12,4,0,6,0,1,0.26,0.2576,0.48,0.194,3,2,5 +17147,2012-12-22,1,1,12,5,0,6,0,1,0.26,0.2273,0.48,0.4179,0,5,5 +17148,2012-12-22,1,1,12,6,0,6,0,1,0.26,0.2273,0.44,0.4179,0,13,13 +17149,2012-12-22,1,1,12,7,0,6,0,1,0.26,0.2273,0.44,0.4179,2,18,20 +17150,2012-12-22,1,1,12,8,0,6,0,1,0.26,0.2121,0.44,0.6119,0,29,29 +17151,2012-12-22,1,1,12,9,0,6,0,1,0.26,0.2273,0.48,0.3881,13,77,90 +17152,2012-12-22,1,1,12,10,0,6,0,1,0.26,0.2273,0.48,0.3284,13,105,118 +17153,2012-12-22,1,1,12,11,0,6,0,1,0.28,0.2576,0.41,0.3582,29,130,159 +17154,2012-12-22,1,1,12,12,0,6,0,1,0.3,0.2576,0.36,0.6567,18,128,146 +17155,2012-12-22,1,1,12,13,0,6,0,1,0.32,0.2879,0.36,0.4478,23,135,158 +17156,2012-12-22,1,1,12,14,0,6,0,1,0.32,0.2879,0.36,0.4627,20,141,161 +17157,2012-12-22,1,1,12,15,0,6,0,1,0.32,0.2727,0.33,0.5821,14,139,153 +17158,2012-12-22,1,1,12,16,0,6,0,1,0.3,0.2727,0.39,0.4478,23,124,147 +17159,2012-12-22,1,1,12,17,0,6,0,1,0.28,0.2576,0.41,0.2985,15,99,114 +17160,2012-12-22,1,1,12,18,0,6,0,1,0.26,0.2121,0.44,0.4627,7,75,82 +17161,2012-12-22,1,1,12,19,0,6,0,1,0.24,0.2121,0.48,0.3582,9,51,60 +17162,2012-12-22,1,1,12,20,0,6,0,1,0.24,0.2424,0.48,0.1642,6,55,61 +17163,2012-12-22,1,1,12,21,0,6,0,1,0.22,0.2121,0.51,0.2836,4,49,53 +17164,2012-12-22,1,1,12,22,0,6,0,1,0.22,0.2121,0.51,0.2836,1,38,39 +17165,2012-12-22,1,1,12,23,0,6,0,1,0.22,0.2273,0.51,0.194,3,37,40 +17166,2012-12-23,1,1,12,0,0,0,0,1,0.22,0.2121,0.47,0.2836,3,24,27 +17167,2012-12-23,1,1,12,1,0,0,0,1,0.22,0.2273,0.47,0.194,1,19,20 +17168,2012-12-23,1,1,12,2,0,0,0,1,0.2,0.2121,0.51,0.1642,1,17,18 
+17169,2012-12-23,1,1,12,3,0,0,0,1,0.2,0.2121,0.51,0.1343,0,9,9 +17170,2012-12-23,1,1,12,4,0,0,0,1,0.2,0.2121,0.47,0.1343,1,3,4 +17171,2012-12-23,1,1,12,5,0,0,0,1,0.2,0.2576,0.51,0,0,6,6 +17172,2012-12-23,1,1,12,6,0,0,0,1,0.16,0.197,0.64,0.0896,4,5,9 +17173,2012-12-23,1,1,12,7,0,0,0,1,0.16,0.197,0.64,0.0896,3,16,19 +17174,2012-12-23,1,1,12,8,0,0,0,1,0.14,0.1667,0.69,0.1045,5,43,48 +17175,2012-12-23,1,1,12,9,0,0,0,1,0.2,0.2273,0.59,0.1045,7,49,56 +17176,2012-12-23,1,1,12,10,0,0,0,1,0.22,0.2273,0.64,0.194,26,67,93 +17177,2012-12-23,1,1,12,11,0,0,0,1,0.24,0.2273,0.52,0.194,48,104,152 +17178,2012-12-23,1,1,12,12,0,0,0,1,0.3,0.2879,0.42,0.194,52,144,196 +17179,2012-12-23,1,1,12,13,0,0,0,1,0.32,0.303,0.39,0.2239,63,123,186 +17180,2012-12-23,1,1,12,14,0,0,0,1,0.34,0.303,0.31,0.2985,57,118,175 +17181,2012-12-23,1,1,12,15,0,0,0,1,0.34,0.3333,0.34,0.1343,39,98,137 +17182,2012-12-23,1,1,12,16,0,0,0,1,0.34,0.3333,0.34,0.194,53,137,190 +17183,2012-12-23,1,1,12,17,0,0,0,1,0.32,0.3182,0.49,0.194,21,93,114 +17184,2012-12-23,1,1,12,18,0,0,0,1,0.3,0.303,0.45,0.1642,8,85,93 +17185,2012-12-23,1,1,12,19,0,0,0,1,0.3,0.3333,0.42,0,4,54,58 +17186,2012-12-23,1,1,12,20,0,0,0,1,0.26,0.303,0.65,0,0,52,52 +17187,2012-12-23,1,1,12,21,0,0,0,1,0.24,0.2879,0.7,0,4,38,42 +17188,2012-12-23,1,1,12,22,0,0,0,1,0.24,0.2576,0.6,0.1045,5,53,58 +17189,2012-12-23,1,1,12,23,0,0,0,1,0.24,0.2879,0.6,0,3,22,25 +17190,2012-12-24,1,1,12,0,0,1,1,1,0.22,0.2727,0.69,0,0,12,12 +17191,2012-12-24,1,1,12,1,0,1,1,1,0.22,0.2727,0.69,0,0,11,11 +17192,2012-12-24,1,1,12,2,0,1,1,1,0.2,0.2576,0.75,0,0,5,5 +17193,2012-12-24,1,1,12,3,0,1,1,1,0.2,0.2576,0.75,0,1,2,3 +17194,2012-12-24,1,1,12,5,0,1,1,1,0.18,0.197,0.8,0.1343,0,9,9 +17195,2012-12-24,1,1,12,6,0,1,1,1,0.2,0.2576,0.75,0,1,16,17 +17196,2012-12-24,1,1,12,7,0,1,1,1,0.2,0.2576,0.69,0,3,27,30 +17197,2012-12-24,1,1,12,8,0,1,1,1,0.22,0.2727,0.69,0,4,62,66 +17198,2012-12-24,1,1,12,9,0,1,1,2,0.24,0.2576,0.65,0.1045,17,68,85 
+17199,2012-12-24,1,1,12,10,0,1,1,2,0.26,0.303,0.65,0,21,82,103 +17200,2012-12-24,1,1,12,11,0,1,1,2,0.26,0.2576,0.6,0.1642,20,104,124 +17201,2012-12-24,1,1,12,12,0,1,1,2,0.28,0.303,0.56,0.0896,22,113,135 +17202,2012-12-24,1,1,12,13,0,1,1,3,0.24,0.2576,0.81,0.1045,16,54,70 +17203,2012-12-24,1,1,12,14,0,1,1,3,0.24,0.2576,0.87,0.1045,13,33,46 +17204,2012-12-24,1,1,12,15,0,1,1,3,0.24,0.2273,0.87,0.2239,6,27,33 +17205,2012-12-24,1,1,12,16,0,1,1,3,0.24,0.2273,0.87,0.2239,6,27,33 +17206,2012-12-24,1,1,12,17,0,1,1,3,0.24,0.2424,0.93,0.1642,7,19,26 +17207,2012-12-24,1,1,12,18,0,1,1,3,0.24,0.2576,0.93,0.1045,6,20,26 +17208,2012-12-24,1,1,12,19,0,1,1,2,0.24,0.2424,0.93,0.1343,4,14,18 +17209,2012-12-24,1,1,12,20,0,1,1,3,0.24,0.2424,0.93,0.1343,6,17,23 +17210,2012-12-24,1,1,12,21,0,1,1,3,0.24,0.2576,0.93,0.0896,13,9,22 +17211,2012-12-24,1,1,12,22,0,1,1,2,0.24,0.2879,0.93,0,7,5,12 +17212,2012-12-24,1,1,12,23,0,1,1,3,0.24,0.2879,0.93,0,1,10,11 +17213,2012-12-25,1,1,12,0,1,2,0,3,0.24,0.2576,0.93,0.0896,3,10,13 +17214,2012-12-25,1,1,12,1,1,2,0,2,0.26,0.2576,0.87,0.1642,0,13,13 +17215,2012-12-25,1,1,12,2,1,2,0,2,0.26,0.2576,0.87,0.1642,0,7,7 +17216,2012-12-25,1,1,12,4,1,2,0,2,0.24,0.2576,0.87,0.0896,0,1,1 +17217,2012-12-25,1,1,12,5,1,2,0,2,0.22,0.2273,0.93,0.1343,2,1,3 +17218,2012-12-25,1,1,12,6,1,2,0,2,0.22,0.2273,0.93,0.1343,1,6,7 +17219,2012-12-25,1,1,12,7,1,2,0,2,0.22,0.2273,0.93,0.1642,0,6,6 +17220,2012-12-25,1,1,12,8,1,2,0,2,0.24,0.2879,0.87,0,1,10,11 +17221,2012-12-25,1,1,12,9,1,2,0,2,0.24,0.2576,0.87,0,7,21,28 +17222,2012-12-25,1,1,12,10,1,2,0,1,0.28,0.3182,0.81,0,11,21,32 +17223,2012-12-25,1,1,12,11,1,2,0,1,0.3,0.3182,0.75,0.0896,43,43,86 +17224,2012-12-25,1,1,12,12,1,2,0,1,0.32,0.3333,0.76,0.0896,62,52,114 +17225,2012-12-25,1,1,12,13,1,2,0,1,0.4,0.4091,0.5,0.3284,75,46,121 +17226,2012-12-25,1,1,12,14,1,2,0,1,0.38,0.3939,0.46,0.2985,58,68,126 +17227,2012-12-25,1,1,12,15,1,2,0,1,0.36,0.3333,0.5,0.2537,51,56,107 
+17228,2012-12-25,1,1,12,16,1,2,0,2,0.36,0.3333,0.5,0.2537,48,38,86 +17229,2012-12-25,1,1,12,17,1,2,0,2,0.32,0.303,0.57,0.2537,16,34,50 +17230,2012-12-25,1,1,12,18,1,2,0,2,0.32,0.303,0.66,0.2537,20,23,43 +17231,2012-12-25,1,1,12,19,1,2,0,2,0.32,0.303,0.66,0.2239,16,20,36 +17232,2012-12-25,1,1,12,20,1,2,0,2,0.32,0.303,0.66,0.2836,11,29,40 +17233,2012-12-25,1,1,12,21,1,2,0,2,0.3,0.2879,0.65,0.194,8,26,34 +17234,2012-12-25,1,1,12,22,1,2,0,2,0.3,0.303,0.7,0.1642,3,16,19 +17235,2012-12-25,1,1,12,23,1,2,0,2,0.28,0.2727,0.65,0.2537,4,26,30 +17236,2012-12-26,1,1,12,0,0,3,1,2,0.28,0.2727,0.65,0.2537,1,8,9 +17237,2012-12-26,1,1,12,1,0,3,1,2,0.26,0.2273,0.65,0.2985,0,7,7 +17238,2012-12-26,1,1,12,2,0,3,1,2,0.26,0.2273,0.65,0.2985,0,1,1 +17239,2012-12-26,1,1,12,3,0,3,1,2,0.24,0.2121,0.7,0.3582,0,2,2 +17240,2012-12-26,1,1,12,4,0,3,1,1,0.22,0.197,0.69,0.3582,0,2,2 +17241,2012-12-26,1,1,12,5,0,3,1,2,0.22,0.2121,0.69,0.2836,0,11,11 +17242,2012-12-26,1,1,12,6,0,3,1,2,0.22,0.2121,0.69,0.2239,0,36,36 +17243,2012-12-26,1,1,12,7,0,3,1,3,0.2,0.1818,0.86,0.3284,0,26,26 +17244,2012-12-26,1,1,12,8,0,3,1,3,0.2,0.1818,0.86,0.3284,0,31,31 +17245,2012-12-26,1,1,12,9,0,3,1,3,0.2,0.1818,0.86,0.3284,1,22,23 +17246,2012-12-26,1,1,12,10,0,3,1,3,0.2,0.1818,0.86,0.2985,0,8,8 +17247,2012-12-26,1,1,12,11,0,3,1,3,0.2,0.1667,0.86,0.4627,0,10,10 +17248,2012-12-26,1,1,12,12,0,3,1,3,0.22,0.197,0.87,0.3284,0,10,10 +17249,2012-12-26,1,1,12,13,0,3,1,3,0.22,0.197,0.87,0.3284,0,15,15 +17250,2012-12-26,1,1,12,14,0,3,1,3,0.22,0.2121,0.93,0.2537,0,20,20 +17251,2012-12-26,1,1,12,15,0,3,1,3,0.26,0.2273,0.87,0.3582,0,13,13 +17252,2012-12-26,1,1,12,16,0,3,1,3,0.26,0.2273,0.87,0.3582,0,13,13 +17253,2012-12-26,1,1,12,17,0,3,1,3,0.26,0.2273,0.93,0.3582,2,51,53 +17254,2012-12-26,1,1,12,18,0,3,1,3,0.28,0.2576,0.93,0.2985,1,42,43 +17255,2012-12-26,1,1,12,19,0,3,1,3,0.3,0.2879,0.87,0.194,2,33,35 +17256,2012-12-26,1,1,12,20,0,3,1,3,0.32,0.303,0.93,0.2537,0,32,32 +17257,2012-12-26,1,1,12,21,0,3,1,2,0.3,0.2727,0.87,0.2985,0,20,20 
+17258,2012-12-26,1,1,12,22,0,3,1,3,0.24,0.197,0.93,0.4478,0,11,11 +17259,2012-12-26,1,1,12,23,0,3,1,3,0.26,0.2273,0.87,0.2985,2,8,10 +17260,2012-12-27,1,1,12,0,0,4,1,3,0.26,0.2273,0.87,0.2985,0,3,3 +17261,2012-12-27,1,1,12,1,0,4,1,3,0.24,0.197,0.93,0.4478,0,5,5 +17262,2012-12-27,1,1,12,2,0,4,1,2,0.24,0.197,0.87,0.4478,0,2,2 +17263,2012-12-27,1,1,12,3,0,4,1,2,0.24,0.2273,0.87,0.2239,0,1,1 +17264,2012-12-27,1,1,12,4,0,4,1,2,0.24,0.2121,0.87,0.3284,0,3,3 +17265,2012-12-27,1,1,12,5,0,4,1,1,0.24,0.2424,0.75,0.1642,1,10,11 +17266,2012-12-27,1,1,12,6,0,4,1,1,0.24,0.197,0.7,0.4925,0,45,45 +17267,2012-12-27,1,1,12,7,0,4,1,1,0.24,0.2273,0.7,0.2239,0,90,90 +17268,2012-12-27,1,1,12,8,0,4,1,1,0.26,0.2273,0.65,0.3582,2,206,208 +17269,2012-12-27,1,1,12,9,0,4,1,1,0.26,0.2121,0.6,0.4925,6,127,133 +17270,2012-12-27,1,1,12,10,0,4,1,1,0.28,0.2273,0.56,0.5224,8,67,75 +17271,2012-12-27,1,1,12,11,0,4,1,2,0.28,0.2424,0.56,0.4478,23,80,103 +17272,2012-12-27,1,1,12,12,0,4,1,1,0.3,0.2727,0.49,0.4627,22,87,109 +17273,2012-12-27,1,1,12,13,0,4,1,1,0.3,0.2727,0.49,0.4627,19,99,118 +17274,2012-12-27,1,1,12,14,0,4,1,2,0.26,0.2121,0.56,0.5224,25,94,119 +17275,2012-12-27,1,1,12,15,0,4,1,2,0.26,0.2121,0.56,0.4478,31,89,120 +17276,2012-12-27,1,1,12,16,0,4,1,2,0.26,0.2424,0.52,0.2836,23,151,174 +17277,2012-12-27,1,1,12,17,0,4,1,2,0.26,0.2424,0.52,0.2836,30,227,257 +17278,2012-12-27,1,1,12,18,0,4,1,2,0.24,0.2273,0.6,0.2537,9,188,197 +17279,2012-12-27,1,1,12,19,0,4,1,1,0.24,0.2121,0.6,0.2836,11,106,117 +17280,2012-12-27,1,1,12,20,0,4,1,1,0.24,0.2424,0.6,0.1642,12,79,91 +17281,2012-12-27,1,1,12,21,0,4,1,2,0.24,0.2273,0.6,0.2537,12,51,63 +17282,2012-12-27,1,1,12,22,0,4,1,2,0.24,0.2121,0.6,0.2836,11,33,44 +17283,2012-12-27,1,1,12,23,0,4,1,2,0.24,0.2273,0.6,0.2537,2,24,26 +17284,2012-12-28,1,1,12,0,0,5,1,2,0.24,0.2424,0.6,0.1642,3,22,25 +17285,2012-12-28,1,1,12,1,0,5,1,1,0.24,0.2273,0.6,0.194,0,9,9 +17286,2012-12-28,1,1,12,2,0,5,1,1,0.24,0.2273,0.6,0.2537,0,5,5 
+17287,2012-12-28,1,1,12,3,0,5,1,2,0.24,0.2424,0.6,0.1642,0,2,2 +17288,2012-12-28,1,1,12,4,0,5,1,2,0.24,0.2576,0.6,0.1045,0,4,4 +17289,2012-12-28,1,1,12,5,0,5,1,2,0.24,0.2576,0.6,0.0896,0,15,15 +17290,2012-12-28,1,1,12,6,0,5,1,1,0.22,0.2121,0.64,0.2836,2,49,51 +17291,2012-12-28,1,1,12,7,0,5,1,1,0.22,0.2273,0.64,0.1642,1,111,112 +17292,2012-12-28,1,1,12,8,0,5,1,2,0.24,0.2424,0.6,0.1343,1,238,239 +17293,2012-12-28,1,1,12,9,0,5,1,2,0.24,0.2121,0.6,0.2836,18,173,191 +17294,2012-12-28,1,1,12,10,0,5,1,2,0.26,0.2424,0.56,0.2537,77,85,162 +17295,2012-12-28,1,1,12,11,0,5,1,2,0.28,0.2727,0.52,0.2239,71,107,178 +17296,2012-12-28,1,1,12,12,0,5,1,2,0.3,0.303,0.49,0.1343,72,150,222 +17297,2012-12-28,1,1,12,13,0,5,1,2,0.3,0.303,0.49,0.1343,76,146,222 +17298,2012-12-28,1,1,12,14,0,5,1,2,0.3,0.303,0.49,0.1343,84,177,261 +17299,2012-12-28,1,1,12,15,0,5,1,2,0.3,0.3182,0.49,0,74,151,225 +17300,2012-12-28,1,1,12,16,0,5,1,1,0.3,0.303,0.49,0.1343,42,208,250 +17301,2012-12-28,1,1,12,17,0,5,1,1,0.24,0.2424,0.6,0.1343,43,228,271 +17302,2012-12-28,1,1,12,18,0,5,1,1,0.24,0.2424,0.6,0.1343,16,197,213 +17303,2012-12-28,1,1,12,19,0,5,1,2,0.24,0.2273,0.65,0.194,15,113,128 +17304,2012-12-28,1,1,12,20,0,5,1,2,0.24,0.2576,0.65,0.1045,14,83,97 +17305,2012-12-28,1,1,12,21,0,5,1,2,0.24,0.2424,0.7,0.1343,17,75,92 +17306,2012-12-28,1,1,12,22,0,5,1,2,0.24,0.2576,0.7,0.0896,13,49,62 +17307,2012-12-28,1,1,12,23,0,5,1,2,0.24,0.2576,0.65,0.0896,5,54,59 +17308,2012-12-29,1,1,12,0,0,6,0,2,0.24,0.2424,0.7,0,1,25,26 +17309,2012-12-29,1,1,12,1,0,6,0,2,0.24,0.2424,0.75,0.0896,6,31,37 +17310,2012-12-29,1,1,12,2,0,6,0,2,0.24,0.2424,0.7,0,1,18,19 +17311,2012-12-29,1,1,12,3,0,6,0,2,0.24,0.2424,0.75,0,1,5,6 +17312,2012-12-29,1,1,12,4,0,6,0,2,0.24,0.2424,0.75,0.0896,0,3,3 +17313,2012-12-29,1,1,12,5,0,6,0,2,0.24,0.2424,0.75,0.0896,0,3,3 +17314,2012-12-29,1,1,12,6,0,6,0,2,0.26,0.2424,0.7,0.1045,0,7,7 +17315,2012-12-29,1,1,12,7,0,6,0,2,0.26,0.2424,0.7,0.1343,1,17,18 
+17316,2012-12-29,1,1,12,8,0,6,0,2,0.26,0.2424,0.81,0,9,35,44 +17317,2012-12-29,1,1,12,9,0,6,0,2,0.26,0.2424,0.81,0,16,33,49 +17318,2012-12-29,1,1,12,10,0,6,0,3,0.26,0.2424,0.81,0.1343,6,35,41 +17319,2012-12-29,1,1,12,11,0,6,0,3,0.2,0.2424,0.93,0.0896,7,38,45 +17320,2012-12-29,1,1,12,12,0,6,0,3,0.2,0.2424,1,0,5,43,48 +17321,2012-12-29,1,1,12,13,0,6,0,3,0.2,0.2424,1,0,13,71,84 +17322,2012-12-29,1,1,12,14,0,6,0,2,0.24,0.2424,0.87,0.0896,10,88,98 +17323,2012-12-29,1,1,12,15,0,6,0,2,0.24,0.2424,0.87,0,19,110,129 +17324,2012-12-29,1,1,12,16,0,6,0,1,0.3,0.2424,0.75,0.1045,22,125,147 +17325,2012-12-29,1,1,12,17,0,6,0,1,0.26,0.2424,0.79,0.1045,18,100,118 +17326,2012-12-29,1,1,12,18,0,6,0,1,0.3,0.2424,0.7,0.194,8,102,110 +17327,2012-12-29,1,1,12,19,0,6,0,2,0.3,0.2424,0.61,0.2537,7,90,97 +17328,2012-12-29,1,1,12,20,0,6,0,2,0.3,0.2424,0.56,0.5522,2,64,66 +17329,2012-12-29,1,1,12,21,0,6,0,2,0.28,0.2424,0.56,0.4925,4,56,60 +17330,2012-12-29,1,1,12,22,0,6,0,2,0.26,0.2424,0.6,0.4627,3,51,54 +17331,2012-12-29,1,1,12,23,0,6,0,2,0.26,0.2424,0.6,0,0,32,32 +17332,2012-12-30,1,1,12,0,0,0,0,2,0.26,0.2576,0.6,0.1642,0,41,41 +17333,2012-12-30,1,1,12,1,0,0,0,2,0.26,0.2273,0.56,0.4179,1,27,28 +17334,2012-12-30,1,1,12,2,0,0,0,2,0.26,0.2424,0.56,0.2836,0,19,19 +17335,2012-12-30,1,1,12,3,0,0,0,2,0.26,0.2273,0.56,0.4179,1,14,15 +17336,2012-12-30,1,1,12,4,0,0,0,2,0.26,0.2576,0.56,0.2239,0,7,7 +17337,2012-12-30,1,1,12,5,0,0,0,2,0.26,0.2273,0.48,0.2985,0,2,2 +17338,2012-12-30,1,1,12,6,0,0,0,2,0.24,0.197,0.52,0.4179,1,7,8 +17339,2012-12-30,1,1,12,7,0,0,0,1,0.24,0.2121,0.56,0.3582,0,13,13 +17340,2012-12-30,1,1,12,8,0,0,0,1,0.24,0.197,0.52,0.4627,1,32,33 +17341,2012-12-30,1,1,12,9,0,0,0,1,0.24,0.2121,0.52,0.3881,9,65,74 +17342,2012-12-30,1,1,12,10,0,0,0,1,0.26,0.2121,0.41,0.5821,31,91,122 +17343,2012-12-30,1,1,12,11,0,0,0,1,0.26,0.2273,0.41,0.4179,33,103,136 +17344,2012-12-30,1,1,12,12,0,0,0,1,0.28,0.2273,0.36,0.5821,47,97,144 +17345,2012-12-30,1,1,12,13,0,0,0,1,0.3,0.2576,0.36,0.6567,49,120,169 
+17346,2012-12-30,1,1,12,14,0,0,0,1,0.3,0.2727,0.36,0.4627,39,121,160 +17347,2012-12-30,1,1,12,15,0,0,0,1,0.28,0.2576,0.38,0.3284,37,101,138 +17348,2012-12-30,1,1,12,16,0,0,0,1,0.28,0.2424,0.38,0.4179,31,102,133 +17349,2012-12-30,1,1,12,17,0,0,0,1,0.26,0.2273,0.41,0.3284,26,97,123 +17350,2012-12-30,1,1,12,18,0,0,0,2,0.24,0.2121,0.44,0.2985,12,113,125 +17351,2012-12-30,1,1,12,19,0,0,0,1,0.34,0.3636,0.61,0,16,86,102 +17352,2012-12-30,1,1,12,20,0,0,0,1,0.22,0.197,0.47,0.3284,9,63,72 +17353,2012-12-30,1,1,12,21,0,0,0,1,0.2,0.2121,0.51,0.1642,5,42,47 +17354,2012-12-30,1,1,12,22,0,0,0,1,0.2,0.197,0.55,0.194,6,30,36 +17355,2012-12-30,1,1,12,23,0,0,0,1,0.2,0.197,0.51,0.2239,10,39,49 +17356,2012-12-31,1,1,12,0,0,1,1,1,0.18,0.1818,0.55,0.194,4,30,34 +17357,2012-12-31,1,1,12,1,0,1,1,1,0.18,0.1818,0.55,0.194,6,13,19 +17358,2012-12-31,1,1,12,2,0,1,1,1,0.16,0.1667,0.59,0.1642,3,8,11 +17359,2012-12-31,1,1,12,3,0,1,1,1,0.16,0.1818,0.59,0.1045,0,1,1 +17360,2012-12-31,1,1,12,4,0,1,1,1,0.14,0.1667,0.69,0.1045,0,3,3 +17361,2012-12-31,1,1,12,5,0,1,1,1,0.16,0.1515,0.64,0.194,0,9,9 +17362,2012-12-31,1,1,12,6,0,1,1,1,0.16,0.1667,0.64,0.1642,0,40,40 +17363,2012-12-31,1,1,12,7,0,1,1,1,0.16,0.1818,0.64,0.1343,2,83,85 +17364,2012-12-31,1,1,12,8,0,1,1,1,0.14,0.1515,0.69,0.1343,9,187,196 +17365,2012-12-31,1,1,12,9,0,1,1,2,0.18,0.2121,0.64,0.1045,13,144,157 +17366,2012-12-31,1,1,12,10,0,1,1,2,0.2,0.2121,0.69,0.1343,33,87,120 +17367,2012-12-31,1,1,12,11,0,1,1,2,0.22,0.2273,0.6,0.194,43,114,157 +17368,2012-12-31,1,1,12,12,0,1,1,2,0.24,0.2273,0.56,0.194,52,172,224 +17369,2012-12-31,1,1,12,13,0,1,1,2,0.26,0.2576,0.44,0.1642,38,165,203 +17370,2012-12-31,1,1,12,14,0,1,1,2,0.28,0.2727,0.45,0.2239,62,185,247 +17371,2012-12-31,1,1,12,15,0,1,1,2,0.28,0.2879,0.45,0.1343,69,246,315 +17372,2012-12-31,1,1,12,16,0,1,1,2,0.26,0.2576,0.48,0.194,30,184,214 +17373,2012-12-31,1,1,12,17,0,1,1,2,0.26,0.2879,0.48,0.0896,14,150,164 +17374,2012-12-31,1,1,12,18,0,1,1,2,0.26,0.2727,0.48,0.1343,10,112,122 
+17375,2012-12-31,1,1,12,19,0,1,1,2,0.26,0.2576,0.6,0.1642,11,108,119 +17376,2012-12-31,1,1,12,20,0,1,1,2,0.26,0.2576,0.6,0.1642,8,81,89 +17377,2012-12-31,1,1,12,21,0,1,1,1,0.26,0.2576,0.6,0.1642,7,83,90 +17378,2012-12-31,1,1,12,22,0,1,1,1,0.26,0.2727,0.56,0.1343,13,48,61 +17379,2012-12-31,1,1,12,23,0,1,1,1,0.26,0.2727,0.65,0.1343,12,37,49 diff --git a/first-neural-network/DLND-your-first-network/dlnd-your-first-neural-network.html b/first-neural-network/DLND-your-first-network/dlnd-your-first-neural-network.html new file mode 100644 index 0000000..52fa711 --- /dev/null +++ b/first-neural-network/DLND-your-first-network/dlnd-your-first-neural-network.html @@ -0,0 +1,18956 @@ + + + +dlnd-your-first-neural-network + + + + + + + + + + + + + + + + + + + +
+
+ +
+
+
+
+
+

Your first neural network

In this project, you'll build your first neural network and use it to predict daily bike rental ridership. We've provided some of the code, but left the implementation of the neural network up to you (for the most part). After you've submitted this project, feel free to explore the data and the model more.

+ +
+
+
+
+
+
In [14]:
+
+
+
%matplotlib inline
+%config InlineBackend.figure_format = 'retina'
+
+import numpy as np
+import pandas as pd
+import matplotlib.pyplot as plt
+
+ +
+
+
+ +
+
+
+
+
+
+

Load and prepare the data

A critical step in working with neural networks is preparing the data correctly. Variables on different scales make it difficult for the network to efficiently learn the correct weights. Below, we've written the code to load and prepare the data. You'll learn more about this soon!

+ +
+
+
+
+
+
In [15]:
+
+
+
data_path = 'Bike-Sharing-Dataset/hour.csv'
+
+rides = pd.read_csv(data_path)
+
+ +
+
+
+ +
+
+
+
In [16]:
+
+
+
rides.head()
+
+ +
+
+
+ +
+
+ + +
+
Out[16]:
+ + +
+
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
instantdtedayseasonyrmnthhrholidayweekdayworkingdayweathersittempatemphumwindspeedcasualregisteredcnt
012011-01-01101006010.240.28790.810.031316
122011-01-01101106010.220.27270.800.083240
232011-01-01101206010.220.27270.800.052732
342011-01-01101306010.240.28790.750.031013
452011-01-01101406010.240.28790.750.0011
+
+
+ +
+ +
+
+ +
+
+
+
+
+
+

Checking out the data

This dataset has the number of riders for each hour of each day from January 1 2011 to December 31 2012. The number of riders is split between casual and registered, summed up in the cnt column. You can see the first few rows of the data above.

+

Below is a plot showing the number of bike riders over the first 10 days in the data set. You can see the hourly rentals here. This data is pretty complicated! The weekends have lower overall ridership, and there are spikes when people are biking to and from work during the week. Looking at the data above, we also have information about temperature, humidity, and windspeed, all of which likely affect the number of riders. You'll be trying to capture all this with your model.

+ +
+
+
+
+
+
In [17]:
+
+
+
rides[:24*10].plot(x='dteday', y='cnt')
+
+ +
+
+
+ +
+
+ + +
+
Out[17]:
+ + + +
+
<matplotlib.axes._subplots.AxesSubplot at 0x7fa490564c18>
+
+ +
+ +
+
+ + + +
+ +
+ +
+ +
+
+ +
+
+
+
+
+
+

Dummy variables

Here we have some categorical variables like season, weather, month. To include these in our model, we'll need to make binary dummy variables. This is simple to do with Pandas thanks to get_dummies().
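As a quick illustration of what `get_dummies()` produces, here is a miniature, hypothetical `season` column (not the real data) turned into binary indicator columns:

```python
import pandas as pd

# Hypothetical miniature column standing in for rides['season']
s = pd.Series([1, 2, 3, 1], name='season')
dummies = pd.get_dummies(s, prefix='season')

print(list(dummies.columns))   # ['season_1', 'season_2', 'season_3']
print(dummies.shape)           # (4, 3) -- one indicator column per category
```

The real cell below does the same thing for every categorical field and concatenates the result onto `rides`.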

+ +
+
+
+
+
+
In [18]:
+
+
+
dummy_fields = ['season', 'weathersit', 'mnth', 'hr', 'weekday']
+for each in dummy_fields:
+    dummies = pd.get_dummies(rides[each], prefix=each, drop_first=False)
+    rides = pd.concat([rides, dummies], axis=1)
+
+fields_to_drop = ['instant', 'dteday', 'season', 'weathersit', 
+                  'weekday', 'atemp', 'mnth', 'workingday', 'hr']
+data = rides.drop(fields_to_drop, axis=1)
+data.head()
+
+ +
+
+
+ +
+
+ + +
+
Out[18]:
+ + +
+
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
yrholidaytemphumwindspeedcasualregisteredcntseason_1season_2...hr_21hr_22hr_23weekday_0weekday_1weekday_2weekday_3weekday_4weekday_5weekday_6
0000.240.810.03131610...0000000001
1000.220.800.08324010...0000000001
2000.220.800.05273210...0000000001
3000.240.750.03101310...0000000001
4000.240.750.001110...0000000001
+

5 rows × 59 columns

+
+
+ +
+ +
+
+ +
+
+
+
+
+
+

Scaling target variables

To make training the network easier, we'll standardize each of the continuous variables. That is, we'll shift and scale the variables such that they have zero mean and a standard deviation of 1.

+

The scaling factors are saved so we can go backwards when we use the network for predictions.
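The "go backwards" step is just the inverse of standardization. A minimal sketch with made-up numbers (using `ddof=1` to match pandas' default `.std()`):

```python
import numpy as np

# Toy stand-in for one of the quantitative columns (hypothetical values)
col = np.array([3.0, 13.0, 16.0, 40.0])

mean, std = col.mean(), col.std(ddof=1)  # pandas .std() defaults to ddof=1
scaled = (col - mean) / std              # forward: zero mean, unit std
restored = scaled * std + mean           # backward: undo the scaling

print(np.allclose(restored, col))        # True -- the round trip is exact
```

This is exactly what the prediction cell at the end does with the saved `scaled_features['cnt']` pair.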

+ +
+
+
+
+
+
In [19]:
+
+
+
quant_features = ['casual', 'registered', 'cnt', 'temp', 'hum', 'windspeed']
+# Store scalings in a dictionary so we can convert back later
+scaled_features = {}
+for each in quant_features:
+    mean, std = data[each].mean(), data[each].std()
+    scaled_features[each] = [mean, std]
+    data.loc[:, each] = (data[each] - mean)/std
+
+ +
+
+
+ +
+
+
+
+
+
+

Splitting the data into training, testing, and validation sets

We'll save the last 21 days of the data to use as a test set after we've trained the network. We'll use this set to make predictions and compare them with the actual number of riders.

+ +
+
+
+
+
+
In [20]:
+
+
+
# Save the last 21 days 
+test_data = data[-21*24:]
+data = data[:-21*24]
+
+# Separate the data into features and targets
+target_fields = ['cnt', 'casual', 'registered']
+features, targets = data.drop(target_fields, axis=1), data[target_fields]
+test_features, test_targets = test_data.drop(target_fields, axis=1), test_data[target_fields]
+
+ +
+
+
+ +
+
+
+
+
+
+

We'll split the data into two sets, one for training and one for validating as the network is being trained. Since this is time series data, we'll train on historical data, then try to predict on future data (the validation set).

+ +
+
+
+
+
+
In [21]:
+
+
+
# Hold out the last 60 days of the remaining data as a validation set
+train_features, train_targets = features[:-60*24], targets[:-60*24]
+val_features, val_targets = features[-60*24:], targets[-60*24:]
+
+ +
+
+
+ +
+
+
+
+
+
+

Time to build the network

Below you'll build your network. We've built out the structure and the backwards pass. You'll implement the forward pass through the network. You'll also set the hyperparameters: the learning rate, the number of hidden units, and the number of training passes.

+

The network has two layers, a hidden layer and an output layer. The hidden layer will use the sigmoid function for activations. The output layer has only one node and is used for the regression; the output of the node is the same as its input. That is, the activation function is $f(x)=x$. A function that takes the input signal and generates an output signal, but takes into account the threshold, is called an activation function. We work through each layer of our network, calculating the outputs for each neuron. All of the outputs from one layer become inputs to the neurons on the next layer. This process is called forward propagation.

+

We use the weights to propagate signals forward from the input to the output layers in a neural network. We use the weights to also propagate error backwards from the output back into the network to update our weights. This is called backpropagation.

+

Hint: You'll need the derivative of the output activation function ($f(x) = x$) for the backpropagation implementation. If you aren't familiar with calculus, this function is equivalent to the equation $y = x$. What is the slope of that equation? That is the derivative of $f(x)$.
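A quick numerical check of both derivatives used in backpropagation (a standalone sketch, separate from the project code): the sigmoid gradient expressed in terms of the sigmoid's output, and the slope of the identity output activation.

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_grad_from_output(out):
    # gradient written in terms of the sigmoid's *output*: sigma * (1 - sigma)
    return out * (1 - out)

x, eps = 0.7, 1e-6
numeric = (sigmoid(x + eps) - sigmoid(x - eps)) / (2 * eps)   # finite difference
analytic = sigmoid_grad_from_output(sigmoid(x))
print(np.isclose(numeric, analytic))      # True

# The output activation f(x) = x has slope 1 everywhere:
slope_identity = ((x + eps) - (x - eps)) / (2 * eps)
print(np.isclose(slope_identity, 1.0))    # True
```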

+
+

Below, you have these tasks:

+
    +
  1. Implement the sigmoid function to use as the activation function. Set self.activation_function in __init__ to your sigmoid function.
  2. +
  3. Implement the forward pass in the train method.
  4. +
  5. Implement the backpropagation algorithm in the train method, including calculating the output error.
  6. +
  7. Implement the forward pass in the run method.
  8. +
+ +
+
+
+
+
+
+
+
+

Shape of weight matrices

+
Node j1 -> | w1 | w2 | w3 | ... | wn
+-----------|----|----|----|-----|----
+Node j2 -> | w1 | w2 | w3 | ... | wn
+-----------|----|----|----|-----|----
+Node j3 -> | w1 | w2 | w3 | ... | wn
+...
+...
+Node jm -> | w1 | w2 | w3 | ... | wn
+ +
+
+
+
+
+
In [29]:
+
+
+
class NeuralNetwork(object):
+    def __init__(self, input_nodes, hidden_nodes, output_nodes, learning_rate):
+        # Set number of nodes in input, hidden and output layers.
+        self.input_nodes = input_nodes
+        self.hidden_nodes = hidden_nodes
+        self.output_nodes = output_nodes
+        
+        #print("DEBUG: hidden units ",self.hidden_nodes)
+        
+
+        # Initialize weights
+        self.weights_input_to_hidden = np.random.normal(0.0, self.hidden_nodes**-0.5, 
+                                       (self.hidden_nodes, self.input_nodes)) # shape hidden_n x n_features
+
+        self.weights_hidden_to_output = np.random.normal(0.0, self.output_nodes**-0.5, 
+                                       (self.output_nodes, self.hidden_nodes)) # shape 1 x hidden_n
+        self.lr = learning_rate
+        
+        #### Set this to your implemented sigmoid function ####
+        # Activation function is the sigmoid function
+        self.activation_function = self._sigmoid
+        self.hidden_gradient_function = self._gradient_sigmoid
+        self.output_activation = lambda x: x
+    
+    def _sigmoid(self, x):
+        return 1 / (1 + np.exp(-x))
+    
+    def _gradient_sigmoid(self, output):
+        return output * (1-output)
+    
+    
+    def train(self, inputs_list, targets_list):
+        # Convert inputs list to 2d array
+        inputs = np.array(inputs_list, ndmin=2).T
+        targets = np.array(targets_list, ndmin=2).T
+        
+        #### Implement the forward pass here ####
+        ### Forward pass ###
+        # TODO: Hidden layer
+        
+        # signal into hidden layer
+        hidden_inputs = np.dot(self.weights_input_to_hidden, inputs) # shape: hidden_n x n . nx1 -> hidden_n x 1
+        # signals from hidden layer
+        hidden_outputs = self.activation_function(hidden_inputs) # shape: hidden_n x 1
+        
+        # TODO: Output layer
+        
+        # signals into final output layer
+        final_inputs = np.dot(self.weights_hidden_to_output, hidden_outputs) # 1 x hidden_n . hidden_n x 1 -> 1x1
+        # signals from final output layer - same, h(f(x)) = f(x)
+        final_outputs = self.output_activation(final_inputs)  # shape 1x1
+        
+        #### Implement the backward pass here ####
+        ### Backward pass ###
+        
+        # TODO: Output error
+        
+        # Output layer error is the difference between desired target and actual output.
+        output_errors = targets - final_outputs
+        
+        # since the gradient of f(x) = x is 1, the error and output error are the same 
+        
+        # TODO: Backpropagated error
+        
+        # errors propagated to the hidden layer
+        # shape: 1xhidden_n . 1x1
+        hidden_errors = np.multiply(self.weights_hidden_to_output, output_errors) # shape 1 x hidden_n
+        
+        # hidden layer gradients
+        # Calculate using sigmoid gradient of the hidden output
+        #shape hidden_nx1  
+        hidden_grad = self.hidden_gradient_function(hidden_outputs)
+        
+        # TODO: Update the weights
+        
+        # update hidden-to-output weights with gradient descent step 
+        self.weights_hidden_to_output += self.lr * output_errors.dot(hidden_outputs.T) # shape 1 x hidden_n
+
+        
+        # update input-to-hidden weights with gradient descent step
+        # shape: hidden_n x n_inputs += (hidden_n x 1 elementwise hidden_n x 1) . (1 x n_inputs)
+        self.weights_input_to_hidden += self.lr * (hidden_grad * hidden_errors.T).dot(inputs.T) 
+        
+    def run(self, inputs_list):
+        # Run a forward pass through the network
+        inputs = np.array(inputs_list, ndmin=2).T
+        
+        #### Implement the forward pass here ####
+        # TODO: Hidden layer
+        hidden_inputs = np.dot(self.weights_input_to_hidden, inputs)# signals into hidden layer
+        hidden_outputs = self.activation_function(hidden_inputs)# signals from hidden layer
+        
+        # TODO: Output layer
+        final_inputs = np.dot(self.weights_hidden_to_output, hidden_outputs)# signals into final output layer
+        final_outputs = self.output_activation(final_inputs)# signals from final output layer
+        return final_outputs
+
+ +
+
+
+ +
+
+
+
In [23]:
+
+
+
def MSE(y, Y):
+    return np.mean((y-Y)**2)
+
+ +
+
+
+ +
+
+
+
+
+
+

Training the network

Here you'll set the hyperparameters for the network. The strategy here is to find hyperparameters such that the error on the training set is low, but you're not overfitting to the data. If you train the network too long or have too many hidden nodes, it can become overly specific to the training set and will fail to generalize to the validation set. That is, the loss on the validation set will start increasing as the training set loss drops.

+

You'll also be using a method known as Stochastic Gradient Descent (SGD) to train the network. The idea is that for each training pass, you grab a random sample of the data instead of using the whole data set. You use many more training passes than with normal gradient descent, but each pass is much faster. This ends up training the network more efficiently. You'll learn more about SGD later.
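The random-sampling idea can be seen on a toy problem. A minimal SGD sketch with hypothetical data (fitting a single weight to y = 2x), where each pass uses a small random minibatch just like the training loop below:

```python
import numpy as np

rng = np.random.RandomState(42)
x = rng.rand(1000)
y = 2.0 * x                                # true relationship: y = 2x

w, lr = 0.0, 0.5
for _ in range(200):                       # many cheap passes
    batch = rng.choice(len(x), size=32)    # random minibatch each pass
    xb, yb = x[batch], y[batch]
    grad = np.mean((w * xb - yb) * xb)     # gradient of the batch MSE/2
    w -= lr * grad

print(abs(w - 2.0) < 1e-3)                 # True -- converged to the true weight
```

Each update only touches 32 of the 1000 records, yet the weight still converges because the minibatch gradients point in the right direction on average.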

+

Choose the number of epochs

This is the number of times the dataset will pass through the network, each time updating the weights. As the number of epochs increases, the network becomes better and better at predicting the targets in the training set. You'll need to choose enough epochs to train the network well but not too many or you'll be overfitting.

+

Choose the learning rate

This scales the size of weight updates. If this is too big, the weights tend to explode and the network fails to fit the data. A good choice to start at is 0.1. If the network has problems fitting the data, try reducing the learning rate. Note that the lower the learning rate, the smaller the steps are in the weight updates and the longer it takes for the neural network to converge.

+

Choose the number of hidden nodes

The more hidden nodes you have, the more capacity the model has to fit the training data, which can improve its predictions. Try a few different numbers and see how it affects the performance. You can look at the losses dictionary for a metric of the network performance. If the number of hidden units is too low, the model won't have enough capacity to learn the data; if it is too high, there are too many options for the direction the learning can take. The trick here is to find the right balance in the number of hidden units you choose.

+ +
+
+
+
+
+
In [28]:
+
+
+
import sys
+
+### Set the hyperparameters here ###
+epochs = 1000
+#learning_rate = 0.03
+learning_rate = 0.07
+N_i = train_features.shape[1]
+hidden_nodes = int(N_i / 1.2)
+output_nodes = 1
+
+
+network = NeuralNetwork(N_i, hidden_nodes, output_nodes, learning_rate)
+
+losses = {'train':[], 'validation':[]}
+for e in range(epochs):
+    # Go through a random batch of 128 records from the training data set
+    batch = np.random.choice(train_features.index, size=128)
+    for record, target in zip(train_features.loc[batch].values, 
+                              train_targets.loc[batch]['cnt']):
+        network.train(record, target)
+    
+    # Printing out the training progress
+    train_loss = MSE(network.run(train_features), train_targets['cnt'].values)
+    val_loss = MSE(network.run(val_features), val_targets['cnt'].values)
+    sys.stdout.write("\rProgress: " + str(100 * e/float(epochs))[:4] \
+                     + "% ... Training loss: " + str(train_loss)[:5] \
+                     + " ... Validation loss: " + str(val_loss)[:5])
+    
+    losses['train'].append(train_loss)
+    losses['validation'].append(val_loss)
+
+ +
+
+
+ +
+
+ + +
+
+ +
+
Progress: 99.9% ... Training loss: 0.067 ... Validation loss: 0.162
+
+
+ +
+
+ +
+
+
+
In [25]:
+
+
+
plt.plot(losses['train'], label='Training loss')
+plt.plot(losses['validation'], label='Validation loss')
+plt.legend()
+plt.ylim(ymax=0.8)
+
+ +
+
+
+ +
+
+ + +
+
Out[25]:
+ + + +
+
(0.0, 0.8)
+
+ +
+ +
+
+ + + +
+ +
+ +
+ +
+
+ +
+
+
+
+
+
+

Check out your predictions

Here, use the test data to view how well your network is modeling the data. If something is completely wrong here, make sure each step in your network is implemented correctly.

+ +
+
+
+
+
+
In [30]:
+
+
+
fig, ax = plt.subplots(figsize=(8,4))
+
+mean, std = scaled_features['cnt']
+predictions = network.run(test_features)*std + mean
+ax.plot(predictions[0], label='Prediction')
+ax.plot((test_targets['cnt']*std + mean).values, label='Data')
+ax.set_xlim(right=len(predictions))
+ax.legend()
+
+dates = pd.to_datetime(rides.loc[test_data.index]['dteday'])
+dates = dates.apply(lambda d: d.strftime('%b %d'))
+ax.set_xticks(np.arange(len(dates))[12::24])
+_ = ax.set_xticklabels(dates[12::24], rotation=45)
+
+ +
+
+
+ +
+
+ + +
+
+ + + +
+ +
+ +
+ +
+
+ +
+
+
+
+
+
+

Thinking about your results

Answer these questions about your results. How well does the model predict the data? Where does it fail? Why does it fail where it does?

+

Note: You can edit the text in this cell by double clicking on it. When you want to render the text, press control + enter

+
+

Your answer below

    +
  • The model predicts the data well overall.

    +
  • +
  • The model fails for the Christmas holiday period.

    +
  • +
  • Reason: We can expect users to change their behavior around Christmas (family visits, holidays, ...). We also have far less data for the Christmas holidays, which occur only once a year, than for the rest of the year, so the model is overfitted to normal days. We would need data spanning several years to fit the model to these seasonal changes.

    +
  • +
+ +
+
+
+
+
+
+
+
+

Unit tests

Run these unit tests to check the correctness of your network implementation. These tests must all be successful to pass the project.

+ +
+
+
+
+
+
In [160]:
+
+
+
import unittest
+
+inputs = [0.5, -0.2, 0.1]
+targets = [0.4]
+test_w_i_h = np.array([[0.1, 0.4, -0.3], 
+                       [-0.2, 0.5, 0.2]])
+test_w_h_o = np.array([[0.3, -0.1]])
+
+class TestMethods(unittest.TestCase):
+    
+    ##########
+    # Unit tests for data loading
+    ##########
+    
+    def test_data_path(self):
+        # Test that file path to dataset has been unaltered
+        self.assertTrue(data_path.lower() == 'bike-sharing-dataset/hour.csv')
+        
+    def test_data_loaded(self):
+        # Test that data frame loaded
+        self.assertTrue(isinstance(rides, pd.DataFrame))
+    
+    ##########
+    # Unit tests for network functionality
+    ##########
+
+    def test_activation(self):
+        network = NeuralNetwork(3, 2, 1, 0.5)
+        # Test that the activation function is a sigmoid
+        self.assertTrue(np.all(network.activation_function(0.5) == 1/(1+np.exp(-0.5))))
+
+    def test_train(self):
+        # Test that weights are updated correctly on training
+        network = NeuralNetwork(3, 2, 1, 0.5)
+        network.weights_input_to_hidden = test_w_i_h.copy()
+        network.weights_hidden_to_output = test_w_h_o.copy()
+        
+        network.train(inputs, targets)
+        self.assertTrue(np.allclose(network.weights_hidden_to_output, 
+                                    np.array([[ 0.37275328, -0.03172939]])))
+        self.assertTrue(np.allclose(network.weights_input_to_hidden,
+                                    np.array([[ 0.10562014,  0.39775194, -0.29887597],
+                                              [-0.20185996,  0.50074398,  0.19962801]])))
+
+    def test_run(self):
+        # Test correctness of run method
+        network = NeuralNetwork(3, 2, 1, 0.5)
+        network.weights_input_to_hidden = test_w_i_h.copy()
+        network.weights_hidden_to_output = test_w_h_o.copy()
+
+        self.assertTrue(np.allclose(network.run(inputs), 0.09998924))
+
+suite = unittest.TestLoader().loadTestsFromModule(TestMethods())
+unittest.TextTestRunner().run(suite)
+
+ +
+
+
+ +
+
+ + +
+
+ +
+
.....
+----------------------------------------------------------------------
+Ran 5 tests in 0.004s
+
+OK
+
+
+
+ +
+
Out[160]:
+ + + +
+
<unittest.runner.TextTestResult run=5 errors=0 failures=0>
+
+ +
+ +
+
+ +
+
+
+ + + + + + diff --git a/first-neural-network/DLND-your-first-network/dlnd-your-first-neural-network.ipynb b/first-neural-network/DLND-your-first-network/dlnd-your-first-neural-network.ipynb new file mode 100644 index 0000000..ec69803 --- /dev/null +++ b/first-neural-network/DLND-your-first-network/dlnd-your-first-neural-network.ipynb @@ -0,0 +1,1001 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Your first neural network\n", + "\n", + "In this project, you'll build your first neural network and use it to predict daily bike rental ridership. We've provided some of the code, but left the implementation of the neural network up to you (for the most part). After you've submitted this project, feel free to explore the data and the model more.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "%config InlineBackend.figure_format = 'retina'\n", + "\n", + "import numpy as np\n", + "import pandas as pd\n", + "import matplotlib.pyplot as plt" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Load and prepare the data\n", + "\n", + "A critical step in working with neural networks is preparing the data correctly. Variables on different scales make it difficult for the network to efficiently learn the correct weights. Below, we've written the code to load and prepare the data. You'll learn more about this soon!" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "data_path = 'Bike-Sharing-Dataset/hour.csv'\n", + "\n", + "rides = pd.read_csv(data_path)" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
instantdtedayseasonyrmnthhrholidayweekdayworkingdayweathersittempatemphumwindspeedcasualregisteredcnt
012011-01-01101006010.240.28790.810.031316
122011-01-01101106010.220.27270.800.083240
232011-01-01101206010.220.27270.800.052732
342011-01-01101306010.240.28790.750.031013
452011-01-01101406010.240.28790.750.0011
\n", + "
" + ], + "text/plain": [ + " instant dteday season yr mnth hr holiday weekday workingday \\\n", + "0 1 2011-01-01 1 0 1 0 0 6 0 \n", + "1 2 2011-01-01 1 0 1 1 0 6 0 \n", + "2 3 2011-01-01 1 0 1 2 0 6 0 \n", + "3 4 2011-01-01 1 0 1 3 0 6 0 \n", + "4 5 2011-01-01 1 0 1 4 0 6 0 \n", + "\n", + " weathersit temp atemp hum windspeed casual registered cnt \n", + "0 1 0.24 0.2879 0.81 0.0 3 13 16 \n", + "1 1 0.22 0.2727 0.80 0.0 8 32 40 \n", + "2 1 0.22 0.2727 0.80 0.0 5 27 32 \n", + "3 1 0.24 0.2879 0.75 0.0 3 10 13 \n", + "4 1 0.24 0.2879 0.75 0.0 0 1 1 " + ] + }, + "execution_count": 16, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "rides.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Checking out the data\n", + "\n", + "This dataset has the number of riders for each hour of each day from January 1 2011 to December 31 2012. The number of riders is split between casual and registered, summed up in the `cnt` column. You can see the first few rows of the data above.\n", + "\n", + "Below is a plot showing the number of bike riders over the first 10 days in the data set. You can see the hourly rentals here. This data is pretty complicated! The weekends have lower over all ridership and there are spikes when people are biking to and from work during the week. Looking at the data above, we also have information about temperature, humidity, and windspeed, all of these likely affecting the number of riders. You'll be trying to capture all this with your model." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 17, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAABBoAAALzCAYAAAC/R2QvAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAewgAAHsIBbtB1PgAAIABJREFUeJzs3Xu0ZHV55//P95zTzaGhuQittN1KUEFhElcQlBjNj4iY\n6BgZNTomJBnFqMREXIyuhESjXMyKK4NrVMJEFC/gTBxiBk1iJJhBRMcLChIXysUIotJtg402Td+b\nPvX9/bFrd33P7r2rdlXty/Pd9X6tddap06dOnV3Vdfbls5/n2c57LwAAAAAAgCrMtb0AAAAAAACg\nOwgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQga\nAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAA\nAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABAZQgaAAAAAABA\nZcwHDc65U5xz73DOfc45d79zbrdzbptz7rvOuY86555T4jFe7Zzrlfz4LyUe72Dn3J84577hnPup\nc267c+4u59x7nHNPrOaZAwAAAAAQn4W2F2AY59yXJD23/6UPvrVC0lMkHS/pNc65/ynpdd77R0c8\npB/x/TLL9BRJ1/V/f/h4J0h6qqTXOed+x3v/2Wl/FwAAAAAAsTEdNEhaq+Rg/seS/l7S/5P0I0nz\nkp4t6a2S1kn6vf6//W6Jx/w1SZuGfH9D0Tecc4dK+qwGIcOHJP2dpF2SnifpzyQdJuka59xzvPe3\nl1geAAAAAAA6w3k/9Un+2jjn/knS1ZI+5XMW1Dn3GElfVVJN4CWd7r3/cs79Xi3pY/37HOe9/9GE\ny3OJpD/vP84fe+//e+b7vyTpS0pCjy9678+Y5PcAAAAAABAr0zMavPdnee+vzQsZ+t//mZKqhtQr\n6loW59yCpPOUhAx3ZUOG/vLcLOkjkpyk051zp9S1PAAAAAAAWGQ6aCjppuD2k2v8Pc+TdHj/9tVD\n7ndVcPtltS0NAAAAAAAGdSFoWBncXqrx9zw3uP3FIfe7VdKO/u2RV8QAAAAAAKBLuhA0/Gpw+64S\n97/KObfRObfHObfZOfc159y7nHOPH/FzJwW37y66k/d+SdK9StonTiyxPAAAAAAAdEbUQYNzzkm6\nIPinT5b4sdMlHaPkihuPkfQsSW+XdI9z7g1Dfm59//MO7/0jI37H/f3Pa5xzK0osEwAAAAAAnWD9\n8pajvEVJUOAlXeu9/7ch971X0rWSbtYgCHiSpN9UMkRyUdIHnHM97/2Hc35+df/z9hLLtSO4faik\nLSV+BgAAAACA6Jm+vOUwzrnTJf1fJWHJA5Ke7r1/qOC+q73324Y81n+U9On+Y+2U9GTv/U8y97lH\nSTDxI+/9z41Ytqsl/Z6SAOQJ3vsfl31eAAAAAADELMrWCefcf5D0KSXBwC5JrywKGSRpWMjQ//51\nki5RMldhlaTfz7nb7v7nlTnfyzoouL2rxP0BAAAAAOiE6FonnHPHSfqcpCMl7ZP0Ku/9Vyp46A8p\nCRukZI7DuzPfT8OKQ0s81iHB7TKtFvs553YoCSq8pJ+V+JElSb1xfgcAAAAAYCbMSZovcb/HKDnx\nvsd7
f8ioO48SVdDQvzLEDZIer+Tg+hzv/T9X8dje+83OuZ9KOkrSupy7bJB0mqRDnHOHjRgI+YT+\n583e+0fHXJSDNHgjPHbMnwUAAAAAYFIHjb7LaNEEDc65o5TMZDhOydn+N3nv/7biXzNsYMWdSgZH\nStLTJH0j707OuXlJT+4/VpnLbRYuw5o1a0beeX5+XvPzZQIqIB579+7V5s2btWbNGq1cWaZbCegm\n/hYA/g6AFH8LmMTS0pKWlpZG3u+hhx5Sf35jJdXyUQQNzrnDJP2rpBOVHIhf4L2/ouLfcbSko/tf\n5g1v/HJw+3QVBA2STlXSOuElTdLS8TNJj12zZo1+8pOfjLwz0EW33XabTjnlFF1//fV6xjOe0fbi\nAK3hbwHg7wBI8beAOq1fv14bN26UpEoOQs0Pg3TOHSzpOkknKzl4/wvv/Xtq+FXnKulJkaQv5nz/\nJklb+7dfPeRxzgluf3r6xQIAAAAAIB6mgwbn3ApJ/yDpl5WEDO/z3l845mMc65z7xRH3+Q1J7+h/\nuVvSx7L36c9auExJGHGic+6tOY/zbEmv7S/rTd77b46zrAAAAAAAxM5668Q1kl6g5MD9Rkkf7V/a\nsshe7/33Mv/2c5K+4Jz7mqTPSPqWknIQJ+lJkl6pZPaC6/+et3rvNxU8/qWSXiXpBEmXOueO7y/j\nLklnSPozJa/pTknnj/VMAQAAAADoAOtBw8v6n52k50v69oj7/0BJeJDlJf2SpGcX/JyXtEPS+d77\njxQ9uPd+u3PuxZI+K+l4SW/of4SPs1XS2d77UcsKAAAAAEDnWA8ahl0Fouz9vynpd5WEDKdKWqtk\n6OOCpC2S7pD0eUkf9t4/NPIXeH+vc+5kSX+kpBriKZJWSrpfSQBxmff+/jGXGwAAAACATjAdNHjv\np75uo/d+u6T/3f+ohPd+l6T39D8AAAAAAECf6WGQAAAAAAAgLgQNAAAAAACgMgQN9ixJ0vz81F0j\nQLTWrl2rCy+8UGvXrm17UYBW8bcA8HcApPhbQEyc9+POW0SdnHMbJK1bt26dNmzY0PbiAAAAAAA6\nbv369dq4caMkbfTer5/28ahoAAAAAAAAlSFoAAAAAAAAlSFoAAAAAAAAlVloewEAAAAAwJJTTz1V\nDzzwQNuLAYzlmGOO0a233tr2YkgiaAAAAACAZR544IF0MB6ACRA0AAAAAECOubk5LicJ8zZt2qRe\nr9f2YixD0AAAAAAAOdauXcsl52FecGlKMxgGCQAAAAAAKkPQAAAAAAAAKkPQAAAAAAAAKkPQAAAA\nAAAAKkPQAAAAAAAAKkPQAAAAAAAAKkPQAAAAAAAAKkPQAAAAAAAAKkPQAAAAAAAAKkPQAAAAAAAA\nKkPQAAAAAAAAKkPQAAAAAAAAKkPQAAAAAABAja6++mrNzc1pbm5Or33ta9tenNoRNAAAAAAA0ADn\nXNuL0AiCBgAAAAAAUBmCBgAAAAAAUBmCBgAAAAAAauS9b3sRGkXQAAAAAADonG3btunyyy/XWWed\npeOOO06rV6/W4uKi1q1bpzPPPFOXXHKJ7rzzzgN+7pxzztk/uPHjH/+4JGnnzp36m7/5G/3Kr/yK\njjnmGC0uLuqJT3yizj77bH31q18tXIbXvOY1ywZAeu911VVX7X/88OOMM86o54VowULbCwAAAAAA\nQJWuuOIKvf3tb9eWLVskLR/C+MADD2jTpk268cYbddFFF+n666/Xr/3arx3wGOnP3H333Xr5y1+u\nu+++e9njbNiwQddcc42uueYaXXjhhbrwwgtzHyP9mbSqYRYGQhI0AAAAAAA6481vfrMuv/zy/Qf5\n8/PzeuYzn6njjz9ei4uL2rx5s771rW/pBz/4gSRp9+7dhY+1ceNGnXnmmdq0aZOOPPLI/RUNDz30\nkG688UZt3bpVknTJJZfopJNO0itf+cplP/+CF7xAq1ev1t13360bbr
hBzjk97WlP0/Of//wDftfx\nxx9f3YvQMoIGAAAAAEAnXHHFFftDBkl61atepUsvvVTr1q074L533nmnrrzySq1atarw8S655BLt\n3btXF1xwgd75zndqcXFx//cefvhhveIVr9CNN94o55ze9ra3HRA0nH322Tr77LN19dVX64YbbpAk\nnXbaabrsssuqeLpmMaMBAAAAABC9hx9+WBdccMH+kOGNb3yjPvGJT+SGDJJ00kkn6b3vfa/OPPPM\n3O9777V371697W1v01/+5V8uCxkk6YgjjtAnPvEJHXLIIfLe6/vf/75uueWWap9UpAgaAAAAAADR\n+9CHPqRt27bJe69jjz1W733ve6d+zDVr1ugd73hH4fcf+9jH6sUvfvH+r7/xjW9M/Tu7gNYJAAAA\nAGjIqadKDzzQ9lJM55hjpFtvbXspDnT99ddLSoYtvv71r9eKFSumejznnF7ykpdo5cqVQ+938skn\n65Of/KQk6Yc//OFUv7MrCBoAAAAAoCEPPCBt3Nj2UnTT17/+9f23n/e851XymL/wC78w8j5HHXXU\n/tvpcMhZR9AAAAAAAA055pi2l2B6Fp/Dtm3btGvXrv1fP+lJT6rkcQ8//PCR9wkrJx599NFKfm/s\nCBoAAAAAoCEWWw66YNu2bcu+PvTQQyt53HSwJMbDMEgAAAAAQNRWr1697Ovt27e3tCSQCBoAAAAA\nAJFbvXq1Dj744P1f33fffS0uDQgaAAAAAADRO+200/bfvvHGG1tckgPNWgsGQQMAAAAAIHovetGL\n9t++8sorTQ1mXFxc3H/b0nLVhaABAAAAABC917/+9Tr00EPlvdcPf/hDnX/++W0v0n7hJTA3zsD1\nTQkaAAAAAADRO+KII/RXf/VXkiTvvT7wgQ/ot37rtwoP7O+44w6df/75uuGGG2pftp//+Z/ff/vr\nX/+6NmzYUPvvbBOXtwQAAAAAdMIb3/hG3XHHHfrABz4g770++clP6tprr9Uzn/lMnXDCCVpcXNTm\nzZv1b//2b/rBD34g55zOOOOM2pfrcY97nH75l39ZX/3qV7Vr1y49/elP1wtf+EKtXbtWc3PJ+f8n\nP/nJ+oM/+IPal6UJBA0AAAAAgM64/PLL9dSnPlXvfOc79cgjj6jX6+nmm2/WzTffvP8+zrn9H6tW\nrWpkud7//vfr+c9/vrZt26atW7fqmmuuWfb9X/3VX+1M0EDrBAAAAACgU8477zx9//vf13ve8x69\n4AUv0Pr167W4uKjFxUWtX79eZ555pt71rnfpu9/9rs4888wDfj4NIcpK7zvsZ0455RTdfvvtestb\n3qKTTz5ZRxxxhBYWFpaFHl3hvPdtLwMCzrkNktatW7eu8307AAAAgEXr16/Xxo0bxT45YlDF+zV9\nDEkbvffrp10mKhoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoA\nAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAA\nAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBlCBoAAAAAAEBl\nFtpeAAAAgKbcd5/0uc9Jr3iFdPTRbS8NAOs2bdqk9evXt70YwFCbNm1qexEOQNAAAABmxktfKt1+\nu/SlL0mf+ETbSwPAul6vp40bN7a9GEB0CBoAAMDMuOOO5POdd7a7HABsO+aYY9peBGBslt63BA0A\nAGBmLC0ln/fta3c5ANh26623tr0IQNQYBgkAAGZCrze4/eij7S0HAABdR9AAAABmQljFQEUDAACJ\nD39Y2rq12sckaAAAADMhbZuQCBoAAEj94z9K27dX+5gEDQAAYCYQNAAAcKA6tokEDQAAYCYQNAAA\ncKBw+1gVggYAADATmNEAAMCBqG
Gg3dQsMv\nfmFcDPfcAzz4oDp+5StVy6m8cA6AXQoNdHEKZBeDtJU6kXV+dVAkdQIIT2jgEKyVLQYJpO+0U6Hh\nuOPMcR5HQ9XUCYBvIFykVSvgJiCo6mgA+I6vS+K4utBw991T55+7765+bk1ndNSsc9OEhp/8RH18\n+mlzTaalTQCSOgHUUwyyqNAAmHun7iKqWSnIthChQaiE7dQJ+juaqMoWpVcxSCBpJU9b6MYx8N73\nAg89NPV7eXK3OZOVOmGrvWW30AAYV8NXvmK+dvnlxX5vKAGwDaFhctIEaN1Cw/CwechlCQ1VUydC\nHmf6mqJCQ1ZtCptiMTehoWyNBqC/0DBnTjKYLSo0NC11oorQYCt1QhftBaoLDdwcIy45cMDUtigi\nNMyfb4qTaqhD7a67qp9b06Fzth67V73KjOu116o1dlbHCaD5joY8qRMhFIMETMeW3bvrTekUR4MQ\nBK4cDd2/u630KgYJJO2haQHxl74EfPvb6njOHOCP/9h877HH7J1jHfgqBkkXr5qbb1YP/299S30+\nZw7wm79Z7Pc2JQDO82DfscMEu91CA2AWWG12NNgWGvTiZ+dOVcS0HzYdDdzy3KukTvQTGubPTwpf\neVInbNRo0ALq2Bgvx18eWzPFdeqEOBryQ2vjFBEaBgeT67W5c4Hvf998fued1c+t6dCx18/BlSuB\nd75THb/4IvCFL2QXggTE0QDUkzpRtL0lkFwH0c1E34ijQQgCVzUaACkICfR3NCxdao7ThIa/+ztz\nfN11wK//uvm86Y4G26kT8+YBxx+vju+4Q+006F2vSy4ptkADmhMA53kfvTpOaPQCK60YpA7gaF5j\nETgX3Sy6C5wlNNBdLNomLasgpNRoMOQVGubNA5YsMV8v4mgYGEgGBUXg5hjRcEudKFOjARChoehz\nbPlyc3z99Wo3Xndjue++8GtBuYaKw3Tsr7nGrG0+/3mTQgHkExqa6GjII2b6Sp2gv7tMkE7XMnUK\nDeJoEILAVdcJQBwNgFmcDg5OFXL6CQ1xrKoWA8AJJwBvfnMyp1gcDdnEsRnXZcuAt7xFHXc6aiGg\nKZo2AYjQQMnjaFi4sNzOQVPGmb4mj6PhvPPMcVadhibXaLBZDHL/fvP75s0r7mjQc/m8eVMt53nh\nNr4aH6kTmzcD3/lO77QGLTTMmFG9BsaBA/za4boibVc9L5/5DHDuucDXvga88Y3qa2efrT7u2wfc\nf7+dc2wqaakTgOoqcdll5jU33mi+J6kT+RwNLp/39O9WRjSmP5O2weILaW8pBIHt1AlxNCTRamfa\n4rSf0LBrl5nAtI102TKzCBRHQza7dpmfXbrUCA2AmaBXr04Wb8pLUwJgG0KD3snZsye5uI9jIzSU\nqc+Q9/zqoqzQMD5ucqopdPHz6lebY59CA7fUCZvFIOnzaN48NcfoZ16R9pZl0yaAZO0ZTnUEigoN\nRVMnOh3ggguAt72td3tmneZ2+OF2hBwOjhwfVHE0XHSRcvjRrlZaaAAkfSKLXkIDAHz841PvpSVL\neotobUqdyNPe0qWjoaqTm95ndQkNcSztLYVAcFkMUhwNZhcszTbeT2hIs5EODKjWSADwxBO8cnyL\nUsTRUHZBTsd06VLg5S9P2qWBcm4GIJwAOCs4y/M+aGVlXQSJohdYnU5ycb9rl/mdZeoz5D2/uijb\n3hJIX0TRResJJ5gxyyoIaXMO57bjbrMYZLfQAJj5IEto6HTsCA3cxlfjOnVi61bT3u+ee6Z+f+9e\ns14oW58B4Du+LqkiNKRxzjnmWApC9qef0HD44cD735/8Wi83A9B8R0PR9pYun/c2hYY0J6cPJifN\nukAcDQJrbNdooGpt2x0NtFp/2uJ00SJjJ+8WGnpV4NbpE+PjwKZN9s7VJ3FsRBKXjoZuoWFgQKWg\n0P/jkkvK/W7OAXBdqRNA8qFbtRBk3vOri7LtLbt/
VtNt59ROm+3b+7ezlRoNhiKOBsA4baj7KY3d\nu40LRYSG4qkTzzxjjmmBZI2NQpCAvW5FIWFbaDj1VHOditDQn6y0lY98JOlU6Cc0tMnRUHcxSDq+\nZVInODga6LwrjgaBNS5TJ9ruaKDvP83RMDBgFrrd3RF6Lby0owEIt04DLTDlskZDt9AAAG99q/na\n295WPjDzZfErAz0fLkKDpE4UExpmzUqm9PRLn9DzzOBgMtAqA7dA2GaNhn6OBqB/nQYaHJetHwC0\nN3WCCmVpxdPS5uoycLt+fdCrIGFZpk0DzjxTHT/5ZL60orbSz9EAKAfghz5kPj/llN6/a2TE3FdN\ndDTkERokdSI/RdMKqyBCg1AJl8Ug2+5o6NdxQqMXVVu3JlMhegkNtCBkqHUa8vQwtrEzlbZ4fe1r\ngQ9+ELjwQuDP/7zc7wWaEwDbFhroQ5cGbk10NFRJnUh7L3pxGUXq+i8qNMydWz63XcOtRoNrR0Pe\nzhM2WlsCfANh16kTVGjYuXNq2h99VqalZ+XFRspdaNh2NADJOg3iauhNHpHnyiuB97xHbWz8zu/0\n/3066G260NBrjgkxdaINjoYey3RByIe+4QYGqu+GAeJooNBdsF6t/XQA3OkosUF/3mRHQ54J0pWj\nIYqAL36x3O+jcA6AywoNvQKGvMUgAbepE9ycI0VTJ7IWUXrsZs1S1yltcZlHaLDhSOMWCBdx5wDl\nUyeA/I4GERqqpU7Esbpme7WIS5tj8sJ1fF1SpetEL7oLQv6n/2Tn9zaNLEcDoNbV//RP+X7f7Nnq\nedvE1ImiNRpcPu+pkBOq0CCOBiEY6CK16m4YII4GSh5HA+0XTgNjmkpBX9MWR4ONBSMdwyp23DSa\nKDRI6kQxqqRO9CsGqcfzqKOMcNurzZwO2gA7jjTONRryLKbKFoME+jsabAkNkjqh6K7TQOeYXqJ8\nHtouNIijwS95hIYi6HoBTXc09BIaBgdN3TLOqRO9XJw+EaFBCAZ9w9lYpHb/HnE0mOMsRwOQFBq0\noyGKkoth2uJSHA39sZX3m4Yvi18ZXAkN06enu556CQ2SOpEkb40GPZ5RBJx8sjp++ul04fbAAXM/\n2RYaOARqvopBAuZ6/fGPgSOOAH7rt0xVb1s1GriNr8Zn6gQwtU4D/VyEhmK4EBoWLwaOOUYd/+xn\n/NxkXLDtJtFBL53Xm0IeoQEwc4ukTvRHikEKwaBvOBu2W0ApstoZ0Xahge7SZNVoANKSc6TgAAAg\nAElEQVSFhsMOS04iTWhxmcfRMG2aUbarCg0zZ5arLNwPqrxzC4BdCQ29cqel64TdrhN0PGnxsAcf\nnPozNjtOAM2q0VC0GORzz6m56fLL1fx7ww3Az38+9WebmDqRJ3+aUiV1ApjqaJDUifK4EBoA42oY\nHTX3gZDElaMBaJ6rQd+Pg4O9132AmedD6TpRV3tLcTQIQTA+bm5+W0LDwID5XW1PnaA7unRBS0kT\nGiYmjI03rdWXFhrGxvq3veNKnq4TgFk0lrUY6/G07WbQ6Aci5wC4qtAQx6q9ItA7AOhlI2yT0FDV\n0TA2ZgK2XkLDAw9M/Z22hQbOqRM2HQ1aLOgWGq67Dnj0UfM1XRtDUieSFEmdmJxM1hwCpjoaJHWi\nPLa7TmjOOcccS/pEOrbHnq7DmyY06Dmmn5sB8O9oKCMQiaNBEHJiu+OERu8Wtd3RQHN+e+WopwkN\nzz9veranCQ2h12mgE2Q/ZVs/kMosGPftM9c3rXFhkzYIDXv2mL9XL6Ghl7qvhbZZs8oXmm2L0EAX\nlXTho1MngHShwfYczi1QK1oMsmiNBiqAbdoEXHNN8mfWrlUfXRSDDFloKJI68fzzSXEZ6O9oqCI0\n2OhWFBquHQ0AcPfd9n5vk3DpaGhaQUh9P+YVGnw4GmbNMs7UInAQGsTRIARB1TylXugFb9sdDVRo\nyONo0MULexWC
1ITeeSJP6gRQTWhwWZ9B0wShYXBQuZC6f06TVQgSyE6dKOtmAHjXwrBZo4GOG11s\nUkfDL3859Xe6dDRwCNSKFoPM42iIIjPGw8MmsF27dqpDTDsa0kSKMtDnLCchvoqjISt1ojttAuhd\no2FkJDsQ6Qe369cHLrpOAMDLXmaeDU88Ye/3NgnbjoY2pE5kzS8+Uif02JaNezgIDXTeFaFBYAtd\n6LgQGkZHedhv60Lv6EYRsGhR+msOO8zUtNDBca/WlpomORr6Wb707pQIDcUou9NuU2gYGzO7llWE\nhlAcDVVrNPTaGZs/38wBDzxgihNqpEZDkjw1GubONQEUkO4202LCv/+7mq9sORponRN6b9VNUaGB\nCpRZAUFael8vR8Ohh1brfsXVMeISV46GadPMBkmIKZo+0GM/MmLHvk7X4W13NPhInSgb91DnFAdH\ng6ROCGxxnToB8Nq18Y12NCxc2L/ooV58lhEaxNGQjggNirqEhm3bzNfKtrak59br/OqEnk+eB30Z\noQEw6RM7diTdToB9sZgGmRxEYhc1GrodCd1us7e/HXj9683/v369va4T9B7StU84UFRoAPJbnNOC\n1F41GqqkTQDiaCibotaLI49UH59/XjpPpJFWwLcKTXY0FK3R4Op663SqOxoGB8374CA0iKNBYIvr\n1AmgvUJDHBuhoVfahEanRzz3nPq5LKGBtrhssqNBT+Tj48W7a4jQoPAtNOiHro1CkPTcep1fneiF\n08hIvl3YskJDv/QJ246GKDJzC4dAzZbQEMf5hIaBAeBP/xQ4/XTztbVrjdAwZ065nF4NDaRDdjQA\nZqz1OD/7LPDf//vU7ihpqRNUuNm/31xrIjQUR8+506dXuzbTOOII9TGOp4qcgpm3bTlJmloMstMx\nc3mW0EBTJ7odfDbYu9f83ipxj/6b19V1QopBCkHgKnWCLuTaWqdh506jOGbt6OpAeHxcLT7pAz1N\naAi9xWVRRwNQfNFIx9CH0ODigVgWV0JDkfaWbRAa9PnkDcxsCA3dBSFtCw1ANSeRbYpey72KQR44\nYObjbqGBzs/veQ9wwgnAmjXma+vW9RYpijJtmnnWhi40dO88XnIJ8NGPAm98Y3I+zHI0UNFBhIbi\n2A52KdrRAEj6RBouHQ1NSp0oMr/oeSWOpxaRtUHV1pYafb+Jo0EQ+uAqdaKpqmwR8hSC1HR3nqCO\nhl4dE0JucVnU0QAUXzT6dDTEMS+xp+gDqJ/QQO3dRbpO0NauTU+dyBMAd78ub9cJoH/nCRdCAydH\nQ9FruZejoV8xxze+Ue0EL14MXH21+trq1eb71NFQpT6DRt9HTRIafvlL4Ec/Up8/9ZT6p6HPJ/33\noeICFR16zTF5aaPQoAMd10JDmjOlzXQ6ZuwldaI/dH7J62gA3KRP2HJy1y00iKNBCAJXqRN00m3S\nZFkEGmiVFRpmzOi9gxZynYYyjoaihb18Cg0AryDYd+pEmtDQBkcDTZ3IQ1lHw6pVpvCe69QJwNx3\noddooIvUfjUWLrhABcbr1wPLl6uvLVgArFihjteuNb/LptCwY4dpY1w3VVMn/vEfk9/T3ToAE6Au\nXmxcUVRcoHNMVUdDm9tb2uw4odGpE0B4GxquodeXrbFv6iYdHau8NRoAN898Oq62hIY6HK3iaBCC\nwMUiFUiqsnXlL9UNdTTkTZ0AkkLD4Yf3zv2mLS5Dq9Pg09EwMmInOEiDaxBcVmhI2z3IIzTQwkgu\nhQZuxciKpk70W0D1am8JqLHVwuJDDyXdM21Knciza1PG0QCogKp7ntB1Gug52BQaOh0+qYVFW7UC\n5nreswe47rrk99auVR8nJ00aGx3jXo4GSZ0oRhz7czSI0JCknzhclqamTpQVGkJwNMRxPaK8tLcU\ngkAcDe4omzrx6KPm75JWn0HTNkdDWaFhyZJq7dL60TShYXJyagpIHqEBMPd8mt
AgqRNTX9e9gMpa\ntOr0iQMHkve7i/Q3KjTUXXuEjnGe+7is0JAGrdNQ9Gf7wbHFpV4oDw8nW3/2Qy9ud++e2kVCOxq2\nbDFzypFHGiFh3z7ztxWhoTwHDph7VIQGv7gQGtrgaMgS5kNJnUgrgu0TaW8pBIEroUEcDeVTJ/RO\nENBfaGiDo6GsDXZszCzgXaVNAO4tfmXR5zI0lC9o6BfM63EcGOgfyOqH7t696p7/6U/N95qaOmGz\nRkPWorVX5wntaIgie4tdvRDsdKa2iPRN0THuVQzSltBg09EA8GlxqYWGvO4cIH3u1vPNunUqAKZ5\n/UcemRw/7WpwVaOhaLpdiNAAx4XQsHSp6WQhNRqS0Dnb1tg31dFQpEaD63WVbUcDUE+cI6kTQhC4\nSp0QR0P51AkqNPQqBAkoEUIvvp98svj51YlrRwMde5dCA9cg2GYATPvb9xMt9EN3927gfe8zu+5n\nnlkteKDBDKcxnpgwO7W+hQZaEFLP4XPm5N+JzoLed3XXadCLqbxjbNPRQAtCamwLDdwcDUWEhu7F\n7bHHqnoXALBtmwpM6S74EUckHQtaYLBZo4GefxscDa6FhsFB8wwVR0MS16kTTVo7F0md8OlosNF1\nAqjH0SDFIIUgEEeDO8qmTtAdrn6OhoEBU6zsiSfqtzgXgQoNLmo0+CgECbRLaOjV2lKjF1r79wM3\n3KCO58wBrr++WupKFPUvVlkX9FxstLfs13UC6N15ggoNtuBkP9fjlHfHplcxSCo05BULDjts6hxs\nI3WiqULDZZcBZ5xhPl+3Lhmc5nE0VBUaoohX1xTXuNhV70anT2zdymsOrhsXQsP06WbzpalCQ972\nlkA4joa6UyfE0SCwRS9Sh4aKLTCyEEeDSZ0YHMze0Z0+PX3x209oAICVK9XHffvUDlIo5C3wZkNo\n6OcKqUrThYaxMXP/Zl3DaQut664Djj8+3znkOT+OYwz4cTQcc4yZo9NSJ2w60jjtChe9lm06GgBT\nEFIjjgYDHeuhIeDSS5PpJuvWTU2doEKCC6EB4FXM1DWuHQ1Ask4Dbb3ddujY2xIaoshs1DUpdaLJ\nxSABcTQIQk/0DTd3rt2CeeJoMI6GxYvzWZrTdt6zhIajjzbHTzyR/9zqJu9OQFmhQVc5B8TRkIde\nOwh5C0ECU/+OH/sY8OY35/v/s+AoNJSp0p9XaEizcw4OAiedpI4fe0zdD2Nj5jxsCg2cUifqrNEA\nTK3TIEKDgY71m96knHvdQkN36gQdPy0w2KzRALRXaHDR3hKQFpe9cOFoAEzw26RNuiI1GlynTthu\nbwmIo0EQeqKFBpu2W6C5eWZ56XSMoyErbUKTFhBn7cZrRwMQVp2GvIujsoW9JHVCfawaABcRGug9\n/7rXAX/6p/n+7zzohyjHMQbspE7oRWsU9V6I6ToNnQ7w8MPuauxwTJ2oy9HQBqGBtmcrIjTQ1152\nmfq4cqUZ37VrTWAaRUo4T3M06DEYGckOQvKgiwjXfe36wLejQYQGg6u0FXE0mGPOqROcuk6I0CCw\nxUV+L5C8AdvoaHjxRVMoLm9rvzJCgzga0hGhQX30KTRccon6PWvWAN/4hqlUbgOOjgZXqRMzZ/Z2\nQNGCkLfdBtx/v/m8iUJDHFcrBtmrRkMVoaGJNRroOBURGn7zN9W1+qpXAa9/vfpaFJkx27LF1BM5\n7DC1GO7naFiwwI6zsq2OBh9Cg3SeMLhyNGihYf/+ZD2rkClboyGU1Ik64hyfqRN9arYLQm9GR81N\nbHORCqhF4dCQmiTb6GgoUghS0x0QL1qUrVKG6mgQocEdcVyP0HDhher1M2bY637QfX5cxhhwlzrR\n736gBSGvvDL5vSbWaKALqTLFINMcDQMDxYKCZctUkKwdajYcDbSwKgehgV7LRYSGd71LpUx0p16u\nWaOEMMA40bT9vl+NBhtpE0BSaIhju2mhnB
gfB26/3XzuSmiQ1Il0XKdO6P/DhrhZN9J1wj7iaBDY\n46rjBJDs6d5GoUEvSoHyQkNWfQag+Y4GbYEFygkNg4NKsHEFR6FhYsJ0IPEpNAD9d+OrwFFocNV1\not/9sGZN77/pCSfkO4c8cKnRUMY1kiU0zJ1b7BqNIuC889TxggV2guFDDjHvh3YZqouyQgOggqDu\nQL7bBQKYXfFuR8OBA0aMsFEIEkhev5zmDJvce6/q8PHlL5uv5VkzlEFSJ9JxVR+DCgsc5gcbFKnR\n4Ct1Ytq0/M+VNOoWGsTRILCH5vfaFhoApRS+9FI7Uyeoo6Fs6kSeRcPcuWrhtmNHuI6GfrsweR0N\n990HXHutWbA+8oj6eNhhbgJfDUehwaalv6jQ4Ap9fuPjqj6By79pXsqM89CQCsqo60STx9GwaBFw\n443qn07NAoBjjzU58jbgkjpRZoyzikGW2R38/OeVqHvhhXYWdFGk7qdnnw3b0dCLfkJDt6PBdscJ\nYGptH5sdteqm0wE++lHgc59Tx4CaD//4j4GLLnLzfx52mHGoSuqEwZWjgTpInnlGze+hw9HRMGdO\nNbdT3UKDT0eDCA1CKaijwXbqBNBuR0OZ1Inuegx52zKuXKkWbJs2qYW1a2XTBmWKQfYLeC69VBXH\n68Zla0vA/QOxDGUePnmEBmr39k33OHMIHMoEwVGkXnvgwNQ2ojoozrJyvuEN6p9LQhYa0hwNcVxN\naDjiCOAznyn+c/2gQkPd9v4iu415OO44Na/TQEwHT3T8X3zRvdDQtDoNd9wBfPaz5vPTTgO+8hXl\nbnDF4KB6lm7aJI4GiiuhoYkOkrI1Glxs4OiYpOoGa91Cg7S3FNjjMnUCMAvmvXuN8t4WfKVOACZ9\notMJ56Fks0bD7t3pIsPQEPCf/3O588uLOBr8wHGcy9RooK+l78PVgrUsXGo0UNEs7xgPDBjHi/75\n/fvNooxLvrO+n0ZHi3XUcYFtR8PAALB6dfJrOngaGjLrjW5Hg+0aDUDzhAa6ifGe95gUCtfov9/2\n7c0b07K46jrRxJoYZVMnXDsaqlC30CA1GgT2uGqNptEL5jiufyHlG1+pE0CyIGQodRrytPID8i0Y\ndZoEALzjHerB/PTTakF0+eXVz7UfHAPgqkIDfXjR/FAuQgMX50iZGg2AWRDQ98FNaOBYo6HIQkrv\n7mhxoWzHCZdw6jxhW2gApqZP0F1aXadBHA3FoWupl7/cn4OR/v02b/bzf3LHh6OhKakqXFInRkfN\nc6Wq0CDtLQUhA1+OBqB9dRrKpE7MmpWcuIo6GoBw6jTQfPR+luE8C8b1683xmjVqN+CII9yIZ924\ntviVwZWjwVYQUIamCDr0tZwdDVwCtbJjLEJDMXwIDXSXVgsNO3a4mWPKFhEOAfp+6Pt0TRPt/FWh\nwaXNv0UTx5pL6gRN5bbpaGh6e0sRGoRSuBYa6IK5bXUadOrE8HCxhS11NbTB0ZBlN+wu6pUGFRpO\nPLHaeRWlKQFwltAwe7Z7xbwfHMfZZuoEnR9FaDCUFRr0taoXYrqNImCnPaUNOLW4dCE0nH66OY6i\nZL0cLShMTAAbN079elW4XL8uoM9Bn0JDE+38VaHrGJsFihcvVilGQHPGuoijwWXqhK3WloCkTghC\nJq5TJ8TRoNImihT5ouJCkx0NelLOCqqKOhpWrap2XkXhGAC7EBrqTJsA+I9zkeAsBEcDlxoNVR0N\neiFGhQYfTqc80Huq7hZ2LoSGE04w8/eyZckdNyr2PP64OZYaDdnUJTQ0cZe9Knk6BZVhcNCs/5qS\nOlGkRoPL1AmbG6z0/pNikIKQgs/UiTY5GiYmgG3b1HHetAnN7/++Ukl/93fzL7qWLzdiRmiOhqwH\n9LRp6qELZAsNw8PAihVWTi833APgKkJDp2Pyp0VomIqr1Imquyw24FijoUrqBM0p766FUxdNT50Y\nGgI+/G
H1+/7wD5Pfo86Fxx5L/3oV2iI02OgQkpcm1g2oSl5nZhn0eL/wQjNqnJV1NHBOnRgYMO+l\nbkeDCA0CS3ymTrTJ0bB9uyqACeQvBKl5+9tVPvFXv5r/Z0ZGjK0xBEfD2JiZILOEBlosMm3BODEB\nbNigjo87ztgNfdGUADjtwb5zp+kWU2drS4DnOFdNnRgfN+PLzdHAJVAr03UCmCo00B1YGjDVCSeh\noUj+dBGuuUYt7K+8Mvl16mhwLTQ0IUijSOoEH1w5GoDkeDdB2NFzTBRl2/xDcTQARmSq09EwNGQ3\ndScNERqEUvhMnWiTo6FMIUhKmWBZ12nYvp3/WNMJOc8Dup/Q8OST5kHkuz4DwDMAtuVo4NLaEuBf\ndLNM6gRgrl0RGtIp23Wiu0YDXaiL0DAVF44GTdrzjAoK0nWiGHUVg1y82Ah4IjSoTQ49P7mYs5uW\nqqKv2+nTs9OJXT7vmyQ06PWDj84zIjQIpZBikG7QhSCBckJDGUKq01C093Q/oaHOQpBAs4UGLq0t\ngeaMc/dr9e/gJjTQYLMJqRN0oU53CuukLUJDGmkFOUdG7AXOTRYa6nI0DAyYe6cJO+xVKbphUpSm\nparoOSZPuo+vYpA24h79t6/Dta3HxkehbhEahFJIe0s3UEdD0dSJsoTUecKmo0GEhqk00dHAcZzL\npk7QQE7fC9J1Ih1bxSC10DB7Ns9ikG0TGtKcCwsWFCuc3A8u168L6hIaACM0vPhi81JSiuJaHG6q\noyGP0OArdcJGPSS9WbZvn0mZ9oUW0sXRILBFp06MjBRbxOWlrY6GqqkTZaBCQ0iOBptCg++OEwDP\nANiW0PDss+ZrdRfQa8o4A8luMrq1HzdHA5dArarQMDGhFn96R5BL2gSg2h7rwLptQkOao8FW2gSQ\nDMCbLDT4LAYJNC/4rUJRZ2ZRmlYTg6ZOZBFi6kQc+59rxNEgsEffcC7cDEB7HQ11p05wdzSUFRrG\nxoDJyeT3qNBwwgnVz60oTQmA094HrdSft9WqK7iPc5Hg7LjjzLEuZMqt6wSX9pZVi0ECwNat5j1w\nSZsAVDcdHVw3sb1lP9JEBZuuKS5CmQvqqtEANM/OXwWfjoYmjDVHR4NNoQHwX6dBHA0Ce3RvcVdW\nUnE01JM60VRHA5BcZMUx8PDD6viII+rZCXb5QCxLmeAsS2hYtqz6eVWBu9BQJAimQoOuuM/N0TA4\naBYvXGo0lCkGCQCPP26OOTkaABNci6PBrqOhyUKDdjTQe9QXTdtlr4LrOXvRIjOPhT7WcVy+RgPn\n9pZAvUKDOBoE1oyPqzaKgJrQXNBWR0MdqRNLlpgFYkhCQx7LYS8b7LZtRiyroz4DwD8AzvsAEkdD\nccrWaDj2WHOc5mjgIDQA/VOWfFE1dQJIOry4Cg27dpndqTrg4GgQoSEfWmg45BB7NS3yIqkTBtfF\nIAcGzHM39LEeGzP1C4qmToijoTciNFgmiqJOzn+35fhdF0VRdFMURU9HUXTg4Meboii60Md74QBt\nKbVwoZv/o63tLXXqxIwZ/oKGgQFgxQp1/OST/ovSFKFsMUgguWisuxAkoMZdt2/jEgDbrtEwbZq7\nOSIvHJ0jZVMnVq40Pa+1o4FbMUigOUIDFV45pU4AyXQB+kz2jW+hYfZstSNPcSU0NK1oIRUafEPv\nHypEtxEf4rAWdl56KezNOvoMaVrqBP3b+/4bSeqEG+Kc/1KJFP8A4HsA3gJgGYBpBz++BcD3oyj6\nsss3wAWaE+oqiKjzBqwT7WhYssTvjoOu07B/f7JOBDdspU7UXQhSo9XkpggN+sGuF5LLlpnAuC44\nOhqqtLdcvlwdb9igREF6T9QRQKTRFKEhBEcDUG/6hG+hIYqmpk9IjYZ8aKHBdyFIILlW1G7CtuJT\naADCrtNA55c81+3goFlzFHneT04C99zT/56nQoONv1tdjoY4FkeDS/4O
wCl9/r23z8/++cHvxwDW\nAvhtAGcd/Lju4Ncvi6Lo065Ongs+hAZ6A7bF0TA2ZnamfKVNaEJpcelCaKjL0QCYACjkALg7kB8d\nNXNE3WkTQPL86qwZQCmbOgGYOg07d6px1vfEzJn1izoaHXDWOd5li0HShZcIDdn4FhqAqUKDpE7k\nQ7+fOgTJefPMsU69bSuuu04AzamJUdTRAJg5PK+jYXwceP3rgVe8AnjHO3q/TgsNs2bZedbWJTTQ\nwujiaLDP1jiOH+rzb2PaD0VRdByAD0GJCfcBODeO4xviOF4bx/ENAF4FJT5EAK6MougYX2+oDqjQ\n4KpGw8CAuQnb4mjYutUc+yoEqaGdJzjXaRChwS1lhIbBQWNlHh1NtrasuxAkkLyXHnywvvOglC1U\nCCTrNDz2mLknuKRNAPwcDUXGuJejgXPqRNuEhm5hQdpbZhPH9aZOzJ5tXJoiNJhjH46GpggNeecX\nvXbJKzR8+MPArbeq4//zf3pfn7a77dUlNNBxEUcDH/4IwMFsanwgjuNEWBDH8X4AHzj46RCAD3o8\nN+9s22aOXeZf6zoNbXE01FEIUhOioyHPTkCW0DB7NrB0qZ1zK0MThAb62tFRXoUgAeC008zC4Lbb\ngE6n3vMBzDiPjBRPkepucanvCQ6tLTX6vhsfn9pW1hc2Uif0tTx3Lq/xBZLP3jpbXDbN0cClPatt\nxsbM3FeH0DAwYLqUtV1ocF0MEmhO6kQVR0OeddU3vwn81V+Zz+NYpVCkoWOR0IUGWjxYHA18uBjK\nzbA+juP70l4Qx/E9AB6BcjW82eO5ecdH6gRgJuC2CA3U0bB4sd//mwoNG1N9PTyoUgxS7+bs3w88\n9ZQ6PvFE/9W3KU0RGuiDnToaOAgNQ0PA+eer4+3bgQceqPV0AJjgrGjaBBCGo4FDsGZDaNCFcbml\nTQA8HQ1lrucydAsLNms0RJG5fpskNNDClnXVctHpE20XGnw4GpqSOlG0RgOQ39HwwAPAZZdN/fpd\nd039WqfTHKFBHA3MiKJoJVTBRwD4UcbL9fcPj6LoKHdnVS++hAa9g9SW1AkqqNB8Rh/QFJi6+7L3\nw0bqxKOPmgCizrQJwDwQ9+1Tyvo3v6mse3XltjfR0QAAF1xgjrVFsk6oo6Eo1NHw4INm0cBJaKD3\nXWjXctoOT0hCw333mY4kPqCimS/R1qWjAeCR+mMbKjTUUQwSSAoNnLtbuUZSJ/LjytHw0kvAW99q\n7ouLLjLfu/POqa/fu9dcsyI0FKNtQsPboyh6MIqivVEU7Yqi6NEoiv5XFEXn9/mZk8jx+p6vmvr9\nGmvZu8W3o2F8nM+Or0t8PHx6QRdqnCtCFx2jtHxbLh0nABMAjY8Dl1yi/l18sfpYB7aFBg41GgC+\nQkMZqzltcfmLX5ivcxUa6grWbBSD1HCrzwCkCw3f+Q5w1lnASSf5ayGohQZfaROACA1loO+lbkfD\n2Bifwrx14KMY5MKFZt5rSupE3jkmTzHIa64xguyaNcCNN5qU5XvumZpiabu1JVBfdz1JnXDLKgAn\nApgOYCaAYwC8G8BtURTdFEVR2uVDlxhZtyvVDRnugdjBt6MBaIergToafAcNM2aYh1KdPdmzsOFo\n4FIIEgDOOCP963fc4fc8NE11NJx0kikK+eMfJx+0dVDF0UBbXNJ6KiI0JGmjo+Ev/1J9HB8HfvYz\nP+dRh9BAhYXhYfuBs75+qQsgdDilTgDtTp/wsakURUYgbZujIU/qxN13m+Mbb1S/+5xz1Oe7dgEP\nPZR8PRUabNXrEUdDs9gL4JsALofqELEawOsA/BmA7VD1F94C4OYoiga7fpZeUlnhLr1UGC377KKF\nhsFBU9zHBfRmbkOdhjodDYBZvIXgaJg2Ld8EyV1o+Pznga9/Hfjrv1b/dGBe1/XeVKEhioBf+zV1\nvGcPcO+99Z5PlRoNQLJOg4aT0ECD
Tg6pE2W7TmhCEBruvz95XdOFsUvqdjQceqj9lI0mOhpEaOCD\nj2KQgJm3du3yNx/YpkyNBpo60StFR2+ozZsHrFihjs8+23y/O33ChaNBikE2i8PjOP6dOI6/Gsfx\nnXEc3x/H8a1xHH8SwMsAaAPqeQB+v+tn6eMzq1kKNfjXlAXnHi00HHqo277tddmK6qJuoUEv3jg7\nGvRknNdumCY06GKAg4PAMTU3oj3kEJUm8V/+i/qnH3ijo/XsutsQGnQxyHnz6lvQpkHTJ267rb7z\nAKqlTgDJOg0aTl0RmuZo4Jg6MTJi5sEXXgD+8R+T32+y0EAdDbbTJgAzrvv31+9+sgU3oYHzhoZr\n9FpvcNBtAdUmdJ6okjoRx727Hul1LhVstaMBmFoQkm7+hC40iKPBAXEc93zkxnG8DcBvAtCPkw90\nvYTux2T9SeiU0SAtPIkWGlymTQDtdjTUETToBdu+ffn7D/umaIX97oBn505jiTgVqy0AACAASURB\nVDv1VD+TbBHqFtfKPoD0YqnTMTZNLm4GDZc6DRMTZvHTVEcDJ6FhcFD9y0sojgbALJK3bAH+9/9O\nfq/JQgN1NLgQGppSSI/CoRgk/bu12dFA1zEuC6g2ofNEldQJIH0t2+kYoYvOH2vWmPlfHA32GHL/\nX/AnjuMnoyj6fwB+HcCxURQtieP4uYPfpiFu1lKO7rNWChPGxsawbt26zNctXboUS5curfJfFWL/\nfnNDuBYa6g66fMPF0QCoSVjntHOiqtBwzz3GSkfVay50i2vdRc9co4OzadOKuZXSHuxcCkFqVqwA\njj5a1TW46y618K5jZ6/sTjslzdEgQkMSfR0WHeNQikEC6hm8aVN60OZDnI/j6u6cMrh2NNB2z088\noeaN0OFUDBIQoQFwVwhS0wTBrErXCUDNT93X+86dptgjnT+mTwdOP13Vb3j0UeUU02JuE4SGLVu2\nYMuWLYkW3zt2AN2h5pjlnUYRGgwPQQkNAHA4AC00UMNR1nKD7ntUuq23bduG008/PfN1V199Na65\n5poq/1UhaBstcTTYpW6hgU64L77IT2iI42pCw759SZWa5uNxoW5xrWyRwrTXc3M0AMrV8MQTKgj9\nyU+A173O/zlQoaFscMbd0cCpRkPRa7l7h2f+fPcBQVmo7bcbH44GG9dyGY45RlWIf+454FWvsv/7\nqbDw5JP2f38dcEudEKHB/ZzdhNSJMjUashwNND24W6g8+2xTKPLuu4E3vEEduxAa6H3oY7335S9/\nGZ/61KcSX7v+evXPJSI0GHp19aW1R7NKx9HvP1zlZBYtWoRbbrkl83U+3QyAv44TQP1Bl2/q7DoB\nTHU0cOPAAaNCl3U00Ly7EBwNvtGBQ9GUkpCEhq98RR3femv9QkNZR8PRRyvLLS10xUlo4OBoKHst\ndwsNXNMmgKlCw+zZZt7wITTQIMCn0DB9uuqq8dBDpsirTbodDU1AhIb6uO461U5R71rv3Kk++hQa\nmuBoKFqjAUg+bzVUaOieQ885B/jiF9XxnXemCw22UpsHBtS9uG+fH0fDFVdcgYsvvhh33gl84GCR\ngMsvB37v95Kvu/DCC7Ft2zZr/68IDYaTyPGz+uBgWsWzAJZCFYvsx6sPftwcx/HGKiczPDyMNWvW\nVPkVTvApNNQddPnGR2/lfnQ7GrhBJ+K840MXVHv3GqV6yRLgqKPsnZst6hbXmu5ooEFJXXUaaHBW\nVmjQLS43kqeMCA1JbDkaQhIaLrvMLJKbLDQAan5xNceIo8ENbRUarroq3VGweLHb/7etNRqo0JDm\naKDO7DRHg4ZuTLlwNABqLetLaNCp9rQz2FFHqdoUlGHLxctaUQwyiyiKVgJ4LZSr4Yk4jrd0veSf\nAUQAToyi6Kwev+MVUI6GGMDNDk+3VqjItWiR2/+LLpzbJDSMjPgp0NINd0dDmdQS+mBau9Y8LM4+\n
220RprLULa41XWhYtAj4lV9Rx+vW1SOo2bKbd9dpkK4TScpey91rLK71GYCk0DA0BLz//ebzpgsN\nLlm+3DwfmuhoqKsYZFuFhq1b1cfhYeWWWbkSOPNMJUC4ZMEC87cOVWjwnTpx+OHq/gdUTa+JCXXs\nousEYDbNmlwMsvFCQxRFb4yiqGfN6SiKDgNwI0xHiWtTXvZXAA5ebvibKIoSj9SDn/+Pg59OAPjr\nSifNmLocDW1IndDvsa6AgQoNHB0NVYWGxx83xxzTJoD6r3mbQgO3YpAa3X0ijoEf/9j//28jdQKY\nWqeBk6OhSTUaQnE0vPnNKoDRRVx9Cw11Ba8uGBkxAlNTHA1SDLIeRkdNsPvylyvh6okngHvvBc49\n1+3/HUVm/nr66WSqXSjYKAbZTT+hATCuhn37TDt0l44GQNpbhs61ADZGUfTXURS9I4qiV0RRdGoU\nRRdEUfRpAA8AOA3KiXAHgL/t/gVxHG8A8DkoV8OZAH4aRdHboyg6PYqitwP4KYAzDv6Oz8Zx/Hj3\n72gKddVoaJOjoa6AgU64TXQ0UDgWggTqv+Zt7QIDPB0NAPCKV5jj9ev9//82UieAqY4GTkIDB0dD\n2a4TIQkNOlAZHAT+6I9UYKEXwT7mj6Y6GgBTp2H79masPyR1oh7otVPHJtLxx6uPe/cCGzb4//+r\nUqZGQxFHQ1pBXbo+1AXEXQsN+/aZGmSuEaHBPjFUfYUPAPgGgDsB/BzA/wPwMQCHHnzNdwBcHMfx\neI/fcxWAfzz42tMAfAvAfQc/aqHiH+I4/oSzd8IAcTS4o26hoYmOhrQ2jdOmqRZGHKnzmqet6qo6\nGgYH3eeflmXFCnO8sVIlnXK0wdFQt9DQ6RjLa9VikJxTJ9asUTtuv/gF8MpXqq/pRbCkTlSjaXUa\nOAgNs2aZ53FbhAYXRQSL0KvmQCj4rtEAJB2vP/mJ+uhaaIhjf89KSZ2wz7sBXA3gXwA8AuAFAOMA\ndgC4H8CXAZwTx/FvxXHc89EcKy4H8Aaomg2bAYwe/PjPAC6K4/gKl2+EA1IM0g2Tk2YhII6GdMoU\ny4yiqQ+nNWv4LorrdDRMTBhrZVWhYelSJTZwhBYBrVtosFmjQYQGQxUxJyRHAwCcfLL6pxGhwQ5N\n6zzBoUbDwAAwd646bovQ4Cq3Py9pu/MhUSY9q2rqxGmnmTXmDTcAt91m5tPh4WobBN3Q57av9Anf\njobGd52I4/gOqJQIW7/vFgDZfScbirS3dAOdYMTRkE7ZMZoxI/mzXOszAPWKa1WCs+7Xc63PAKiC\nkCMj6v3WLTRUWbCsXJlscclJaKi7RkOVMQ6pGGQaeg7Zt0+Jh0MOV3lNFhrE0eCGefPURkYbhYY6\nHA1nnqlE/8nJ8B0NLlIn0oSGadOAD38YuPpq5Y77rd8y37P9N6SbZr6EBnE0CKzRQsPIiPsWjHXn\nq/uECikcikFydzQUCaq6F1Vc6zMA9YprNoUGrvUZALWrpqtKb9rkv0CWrRoN06eb9wHU0xK3F01x\nNCxcGF6RQ7pr6vq5WSYICIWmORo4FIMETJ2Gl14KszhhUVxZ7vMya5bptPTLX/pxOtlEX7fDw1PT\nYHuR5WigqRN03Uu56irgwgvV8fbtJvax/TesQ2iQGg0Ca/TNtnCh+/aAw8PmJmi6o6FsEG2ToSEj\ncnB0NJQdo+5AgbPQ0BRHA2ehATDpE3v2+BfVbKVOAMDFF6uP55+ffxHmg7qFBrqQqiI0hOZmAJIL\nYddBhTgawoGTowFQu6p1FYr1Sd2OBsC4OONYtWwMCX2NFBF8s2o06PXtvHm9UzwHB4Gvfz1Z0wlo\nhtAgjgaBLXGcFBp8oCfmNjka6rRAaxtZkxwN9AF15JG8gwdxNPihzjoNtlInAOALX1ALx1uYJfPR\ney601Am68OJenyENERrssGSJeU9NcDRooWFgwM8uZi9o5wmO6w
zb1F0MEgi7IKSeY4oIDXlTJ9LS\nJigLFgA33ZSc25ogNIijQWDLnj3mAvUlNOjASxwNftA2shdf5GdrLFMMEkg+oDi7GYB6HQ1VdoFF\naMiPrdQJQLmQzjrLbnEqG9CFWd2pE0UXUrRbygkn2Dkfn/hMnWiy0BBFJn3iySf5PQ+LooWGGTPc\nu1H70bYWl3UXgwTCFhr086PI/NIvdaLTMQJXltAAAKtXA//zf5rPbYvPdC3rK86RYpACW3wWgtS0\nxdFA3x8HR8PEhFJXORWYq1IMUsO5ECSgAsahITX+ITsaOBeDBPg4GpoWnGnqTp2oci2vXAl85jOq\nZeQHP2j3vHwgjgZ7rFwJPPywep/PPae66YSKFhrqTJsA2i001OVoWLkSOOww4PnnldDQ6fBKtetH\nmdSJfo6GnTvV+weAQw/N9/suvVSlG/zrvwIf/3j+88hDHV0nJHVCYEsdQoO+CfftU1Vzmwo3RwPA\nr06DjWKQ3B0NUWTem9RocAcXoYGbE8EW06aZ3NfQhAYA+JM/Ab71Lf7XcRoiNNiDFoQMvU6Dvg9F\naPBL3cUgAbWu0GufnTuB9evrOY8yVK3R0O1oyOo40YvLLgO+/W1g1ar8P5MHSZ0QBMK2bebYt6MB\n8HcT1gGHrhNAcuLllj9ZVmh43evUx5e9TNnguKP//nU6Goo+fERoyI/N1AnO6MCzjhoNVdKAQoc+\nP0RoqAYtCBl6nQYujga6mdEGoYGDowEIM31iYsJsMNoqBllWaHCFFIMUBAJ1NCxa5Of/rDNn3Sfi\naMim7Bh94APAQw8B997rZ1KtSuiOhtmz611Q5eHww411VBwNbtALw61blWXXJ20Z4zTE0WCPpjga\n4piP0NBmR0Odz0WaNnrnnfWdRxHKts/tlzohQoM4GgTG1Jk6ATS7ICQXoSEUR0ORYpCAsrvVvcDK\nC3U06FxCH9gSGri7GQAlOOk6ElKjwQ36GbFtm7r//umf/BXUq+LOCR0RGuzRFEfD+Hi5nWEXtE1o\n4FAMEgBOP13VfwLCcTRQocFW6sQLL5jjvDUaXCKOBkEg1FkMEmi2o4FLMUjqaOAmNOhJePr03r2P\nmwD9+9Pe566pIjTQBzv3QpAanT6xfbvftKy27LZ/8YvJdrnvfS/wmtcAzzzj/v9uyxinIV0n7NEU\nRwMN2OoW3NssNNTpaJgxA1izRh0//DC/9V0adH6R1Al7iKNBYIs4GtzB0dHANXWCUycMF9QlrlUJ\nzmiQEYKjAUjWadi0yd//25YaDRdeqIqOvfOd5mu33Qb8wR+4/79FaFCIo6Eac+aYXc+QhQYqWIvQ\n4Bd9Dw4P1z8X0ToNd99d33nkpayjIdTUiaa2txShQciNOBrcwUVo4OxoaIvQUJe4ViU4O+UUYMUK\ndfwbv2HtlJxSV0HItqROAKqWz/XXA//yL8ai+dhj7v/fNheDFKHBLtrV8PTTU4OWUBChoT70upVD\n3aLQCkKWrdHgouuEK6S9pSAQqNDgK7epjY4GLl0nxNFQDyE6GoaHlR3zmWeAN7/Z7nm5goPQ0JYg\n+MILVR93wI+A2cYx1ojQYBddpyGO/TqfbCJCQ31wEhpoQcif/rS+88iLC0eD1GgQR4PAGC00zJrl\nb1Ehjga/cHU0dDpmsdQmoSEURwOg5oRQ0iYASZ3wjRYxfQiYbS4GKe0t7ULrNIRaEJIKDXUXg5w1\ny3T8aYPQoO/BOgtBao48Eli+XB3ffjvw4IP1nk8WNmo0cHc0SDFIoVY2bgQ+9SngF7+o+0wUWmjw\nlTYBtNPRULSjgk24trekC6U6x8cH9JoPxdEQIhwcDU0NztLQc8voaHKnygVtu5YpQ0NmUS5CQ3Vo\n54lQ6zRwKgYZRcbV0HShYWzMzEUcHA0A8P73q49xDHzyk/WeSxY2Uif61Wig7pq6oAKKOBoE77z/\n/cA11wBveYu/tmC96HSM5c
in0NAWR4N+byMjfhTGXsyZYzo6cHI0cHF8+CBUR0No6J0dQFInfOAz\nLautY6zRu6c+hYamOkea5mioW2gA2iM0cGltSfmDPwCWLlXHN90E3HdfvefTD5fFIOfONe0+62Rg\nwNyTdQgN4mhoOQ89pD5u3Ag89VStp4KXXlJiA1Cf0GAr6Irj5AKJA1zqD9DdBk6OhjYJDXU5GtpW\nQG/mTDOX+RQa2hCcpeEzLatt13I3Oqjx1d5y+nT17GgiTXA0cBUaduyofxPNJVxaW1IOOQT4xCfM\n5x//eH3nkkVZoaFf6oTeMOVQn0GjXbq+NpZ06sTQkJ95W4QGxtDF2Lp19Z0HAGzbZo7rSp2wsWga\nGwPOPBNYvBj4yU+q/z5b6AmGw8NI7zyKo6EeQiwGGSo6fWLz5mTeokv0OA8PNzc4S0McDf6gjgaX\ngRwVGprK8uWmpsDjj9d7LmXhKjRMTvrbxa0DjkIDALzvfcap82//Bvzwh7WeTk/K1mjo5WjodMy6\nlkN9Bo1e0/p2NPja6BChgTHUVrZ2bX3nAdTT2hKwH3T99KdqLHfvBr7xjeq/zxZcHA2A2Xl86SW1\nEOBAm4SGENtbhopOn+h0lNjgAz3OTQ7O0vDpaGjjtUzRQsPkpNt6GG0QGqZNA449Vh2vW8enZlYR\nOBWDBNrTeYKmLnFJnQBUgPmpT5nPr7qKp7PEdo2GXbuMM5uT0KAdDb6LQfpK0xahgSlxnLzx63Y0\n1CU02A66aA93+p7qZHKSV0cFOgHv3FnfeVDoBNz0YpDiaPBHHQUh9Ti3ZYw1PgvNtrnrBOCvxWUb\nhAZA5bVrOFvNe8GpGCTQHqGBq6MBAC65BDjpJHV8553A979f7/mkUTZ1YnDQuAXps4C2tuQoNOzb\nZ4QQl4ijQQAw9WJbt65exZEG5YsW+ft/bQddGzaYYy41COhuAwehgWPniTY5GjgUg2xLcFaH0KCD\ns7YJDXRhJ44Gt/hqcdkWoeGKK1RrQAD43veUMzIkuKZOAM0WGrg6GgAVjH/60+bza6+t71x6UVZo\niCIz71NHA13PcqzRACTvVVeIo0EAMFVo2LbNn7U3jbocDTNmmPxIGnTt3FlOeKGOBi5BNBVQOATR\nPgOCvLRJaJD2lv6o09HQ9OCsG0md8Ic4GuwyMgJcfbX5nKvVvBehCg27dk3tGhASnB0NgOpop9cb\nHAudlq3RAJjNEvosoGt+To6GxYvN8ZYt7v8/cTQIANLtM3WmT9QlNESRmQj1pP2JT6hF62mnAXff\nXez3UUcDtVHVCbcgWhwN9cLB0dCW4ExSJ/zhsxikdJ0wx66Ehk7HjHPThQYAuPRS4Ljj1PGPfgT8\n4Af1nk8RQhQa7rkHWLIEOOYYP8GXC7gLDVEEHHaYOt66td5zSaNsjQbABNG9HA2chAbtlgKAp592\n///pMRFHQ8tJExrqLAhZl9AAmMByzx5VwPHTn1a7CfffD5xzDvCBD+Tb+e10klWjOQbRHB5G4mio\nF3E0+ENSJ/whjgZ/UKHB1RxCx7gNQsPQEPDf/pv5/GMfC8fVEGIxyK9/XQWazzyTHPeQ4Jw6odFC\nw44d/NwjZVMngPTUCa41GnwLDTp1QhwNLSftAdZGRwNggu+tW4HLLkt+L45VbtlJJ2W36Hn22eTE\ntWcPj4mVWxDN0dHQpmKQ9P2Jo8EtCxaY8fYhNExOmk4ubQjOKHW1t2xLvRGKD0cDtTW35Vp+29uA\nU09Vxz/7GXDzzfWeT164FYOka4xeQsMDD5jjr3wlzNai3B0NQNK2T9vYc8Bl6gSnGg1HHGGOn3nG\n/f8nqRMCAN6pE76VQB18j42ZB+Z73gN87nNm8nnmGeBNbwIefrj376H1GTQcduy5CQ3iaKiXgQET\n/IqjwS1RZFwNmza536Fs4xhr5s41lcDF0eAWERrcMDAA/Nmfmc//9m/rO5cihJY6EcdJoWFi
Arjm\nGuenZR36/ObqaKBCA7f0iSqpE1nFINvqaIhjKQYpHIQKDVp1evZZ4Lnn6jkfbTmaN09ZCH3SrQSf\nfjrwd38HfOhDwIMPAuefr76+Zw/w1rf2Ds5ofQYNhzoN3IpBcnQ0tEloAMw171No0K6RgYF2BWda\naBgddb/QosFZm8YYUNeVDjBczyv6Wh4cFEeDCA12+fVfN88gHzZnG4QmNDz//NS12de/Dvzyl27P\nyzb03uPqaNCpEwBvoaGso0GEhiQTE+ZYHA0thwoNZ59tjn/+c//nApgbtI6bk07Qhx4K3HijWdis\nXAn83/8LnHKK+nz9euB3fzd9ZzLN0cAhkOYWRIujoX5oXRJfaFFj9myz89wGli83x67TJ9q+065F\nTNfzSluvZY2P9pZtFRqiyKSPctioyENoQgN1M+ixjmNVCDwkQkudeP75+s4jDRtCA33mcq3RsGiR\ncRe4Tp3QbgZAHA2thwoNF1xgjusoCNnpmIVhHTfnSSepjwMDwDe/mSzgBiib+U03KWsuoISIz31u\n6u9JczRwExo4PIx8Fm3LS9uEBupo8FVwTI9xG8aXQvMjXVc3b1sBvW7082PHjvT0QFu09VrW+HY0\ncCgw6BOd3/3ii26vY1tooSGKeAicWUIDdS5ccw2wbJk6vvlm4N57nZ6aVUIoBsk5daLKHKOv807H\n1EWi6326zq2bgQGzDnHtaKAOD3E0tJxeQkMddRp27jTnU4fQ8JGPAJ/9rGop9drXpr/m2GOB6683\nn//JnwC33558jTga8sExdaJNxSABIzRMTPgrWEp3gduETwdPm1MnADO3dDpu04Laei1rfHSdqJI/\nHTpaaOh01PqIO/pvNWMGD4fPzJkqrQnIdjScdVbSyXDVVW7PzSb63hsa4jvfh5A6MThYPGWbBtFa\n4Nfr2blz/aeAZ6HTJ3bsSK53bSNCg/Af0F3MM84wAWgdQkPdlVrnzweuvBI499z+r3vjG4FPflId\ndzrARz9qvtfppAsNHKyP3ISGGTPMwpGboyGK2rF75rvFZRybMW5bcEZ316RIoVt8iJhtvpY1vh0N\nbbuW6TqIwxoiC+1o4JA2AajnuJ530+Zc7WiIIuVofe97VZosAPzgB+F0oNDP7jlzeAg8aYSQOlFm\nzUeDaB1c15kCnoWvzhOSOiH8B9pBMHu2umFWr1afb9zo/8HGtYBKGp/8JLBqlTq+917zENuyxUxa\n9GHLYceem9AAmL8zh/EBzBjNnKlsZk2HBkg+6jTs32/mHC7XoC/ytFqzhaROmGNXos6+fUaob9u1\nrPEhNISQf+4KERqqo4WG7jm301FFvgHg6KPVM394GPjt3zavefRRP+dYFX3vcb4/OKdOVBEaqPg5\nNqauK85Cg6+CkOJoEP4DvejXi+A1a8z3fBeEDEloGBwEXv96dRzHwA9/qI5pfYYzzjDHHAJpbl0n\nAH9F2/LStpxr346GNgcNPmuSiKPBHLsa6zZfy5oZM4w13ZXQEEL+uSuo0EBbf3OFu9BAHbxPPGHO\nVxf5BpToQF8TAiGkcM2fb+YKbkKDdk1VdTSMjqr5SsdVdTizs/AlNIijQfgP+gkNvgtCUsWe4w3a\nDa1pceut6iNNm3j5y80xB6GBs6Nh375kcFQXbRMafDsaOF6DvvApNNDq721IAeqGCtWu5t42X8ua\nKDJziA9HQ5uFhpAcDZzmHC00dDrJe5YWgqRCg06dAIAnn3R7bjaYmDA78pzvj4EB42rgmjpRxv3X\n7WjgvmHqK3VCHA3CFPRkTIUG33UauN+g3bz61Uah1UIDdTRQoYHDIoFb1wmAX+cJXRynLYGDOBr8\nkVUB3SZtHmdAHA0+0cGNOBrso1suAjzWEP0YH1dBL8DT0QAk511aCPLkk81xaI6GkOYhLTRs3eqv\ny1UebNVoGB3lH8fUkTohjgYBgFmYnXiiudnuv9/vOXC/
QbuZMwc480x1vH49sHlz0tGwerWpOMvN\n0cClowKnzhPj48ZVwWV8XEMXJj6EhjbvAvsU1docnAHiaPCJT6GBeyBlm5AcDbQ7SGhCA3U0HHGE\n2UAKwdEQotAwPs6ni8rkpAmKbRSDpPcpxzimjtQJcTQIAMwieGgIOPxwdfzcc37PITShAUimT9x2\nm3E0TJsGLF/Oq9ihXhiPjPhTGLPIU7Rt0yY/iyza6qctgQN9nz5SJ0JaFNlmeNgswF0LDW0eZ0Ac\nDT7RQsP+/WZH2yZtFs1CEhpoulYIQoNOnRgeVm3LNUNDwFFHqeMnnuC1855GSPcHbXHJJX2iauHk\nfqkTHFPAFy405yzFIAWv0IXZokXq40svJS8W14RWowFICg0/+IFxNKxcqR5Y+n1wWCRwrD+Q5Wj4\n8Y/VQ3/FCmDbNrfn0sYdSnE0+MVX8dOQFp8uEEeDP+j15WIOafO1LEJDddKEhtFR01Fi1aqpGy+6\nTsOuXTxSOvsRkuDJsfMEdeJUdTTccENyg5bjhmkUmToN0t5S8AqdjLXQAPitdByio+Hss40K+t3v\nmknruOPUR/0+9uzxK9qkoR9InBbFWY6GG25QH/fsAe64w+25tDFw8F0MMqRFkQt6tVqzTdvHWRwN\n/nDd4lKEBgV3oaFqwOaKNKFh/XplmQeSaRMaWqeBe/pESPcHR6GBOlnLCGQrVpjjz38e+MhHzOdc\n4xidPrFzp7sNJnE0CFOgCzM6GbjeRaZQoYE+HDgzfTpw7rnqmN6w2orno597XkJ0NNA8StfpJ21P\nnfBdDLItY0zR1/v+/W67rLQ9CPZR+6WNwmQaIjS4Y9YssxvIXWgIydHQqxCkhnae4F4QMqS5nmPq\nBJ1f5s4t/vNXXAF8+MOmrgfdyecuNADu0ifE0SBMIS11AqhHaJg71xRRDAGaPqHpdjQA9dZpmJw0\nCwFODyM6Pt0Pnjj2KzTQB3Ybi0H6bm/J6Tr0ha+CkG0OzgB1/+rFjTga3ELfuwuhQY/zwACvANYH\nUWRcDT7dpWXgKjTQzh3f+Ibaae1VCFITkqMhpHmIo6OBFqUsIzQMDQF/8RfAffclu/YByffLCR8t\nLsXRIEyBg9CgFftQ6jNo+gkNXKyPdBHAafftxBPN8b//e/J7W7YkgwTXQkMbgzNpb+kXX0JD28c5\nisxY+3A0tHGMNb4cDXPmqL9r29CBsjgaynHeeaowNwDcfTfwoQ+ZQpBA+I6GkNYtHIUGW+O3ejVw\nzz0qfeLII4F3vhM45pjq5+cCH44GaW8pTKFXjQZfQkOnYxbeXO1GvVizZmqqR1rqRJ2OBq4236OO\nMgHB2rXJCs901wFwP35Vle0QqdPRwOk69EWvCui2aXNLQI2ee304Gtp4LWt8FYNs63WsNysOHEgG\n89zgKjTMmAHceKOptH/ttapwN6Ce8zTo0oijwQ3cUyeqCjVDQ8B//a+qU9r11/MVRn2nToijQQBQ\nv6Nh1y4lNgDhCQ2Dg8D555vPdWtLQISGLKIIOP10dbx1q3IxaOiuA+B+R6eNQoM4Gvzi29EwOMir\nMJtP9Fjv3p1c9Nii7deyxqejoY1wcUVmwbUYJACccQbwpS+Zz/Vu68knpweDhx5qno3cHQ10HuJ+\nj9DYgoujoY3rPkmdEGqhl9DgazIIseMEhaZP6NaWAB+hgfPuG81rW7fOOVusywAAIABJREFUHIuj\nwT2+hQaugpcvfAsNs2fz3VVxDZ17XbhH2n4ta1wKDRMTZqecexDlilCEBq6OBs373gdcfnnya2lp\nE4CaM7WrYeNG06GCIyG516ZPN/cxF6EhpNQTW0gxSKEW6k6doA/Q0Go0AEmh4YQTzDGXRQLnfGIq\nNKxda45FaHDP8LBRm6W9pXt8F4Nsy8IpDddj3fZrWeNSaAhpt9YVXNYQWXAXGgDgb/4GOPNM8/mp\np/Z+ra7TMD4ObN7s
9ryqENo8pNMnmpg6EQoLFhjXkY8aDeJoEDB9uvqnqUNoCN3RsGoV8MEPKpHh\nyivN17k4GjjvvqU5GiYngYceSr5OUifcoBcnvh0NbensQfFVo4E6GtqK67m37deyxqXQ0MYgoBsR\nGuwxMqLqNZx/PvCrvwq86129XxtKnYbQ7hFdEHLnTrctnvPSxnVfFJn0CVepE+JoEBLQnR9AiQ56\ngSpCQ36++EVg/XrgVa8yXxOhIZtjjjEPSC00PP64Kn5FEUeDG/S97tPRcMghpu90m/DhaODaytY3\nvhwNM2eq1ottxWV7S3E0iNBgmyOPBG6/Hbjttv5roVA6T4TmaKCdJ3x2tetFaEKNLXT6xO7dybWv\nLcTRICToFhoA42oQoaEaIjRkMzCgWgMBSl3dunVqIUhAFZuiBads01ahQV8PPotBcrsGfeFDaJDg\nTEHH2qWjoa3XskYcDW6hQsP27fWdRxaci0GWgQoNnB0NoRX+pZ0nONRpaOsc47pOA12H+HL8idDA\nmO7WjIARGl58URVkck3oNRp6MXu2KQxZ524E52KQQDJ94uc/T9ZnoGqoj7x2IIydAVvo97p/v/ui\nVzo4a9P4UnwLDW0dZyAp8rp0NLR5jAG37S3bOidTFi40x+Jo8AdNneDsaKDtX0Mo/EsdDRzqNLR1\ng4l2nnAhNNQR04nQwJh+jgbAz8OtqY6GKDLvRxwNvekuCEmFhrPOMscur0X9wDnkEH85ZRyg14Pr\n9Im2Oxp81Gho6w5NN+Jo8IPL1Am5liV1oi5WrDDHITgaQhHiqNDAydEQRe2qtUMdDS7qNIjQICTI\nEhp8pE80VWgA+AkNHB9Ip59ujtetM6kT06cnq0S7HEMtNLRJ1QaS14NLoWFszOTtcbwGfUBFLHE0\nuMWlo0GuZcPQkAksRWiwTz+hYcMG4NZbgU7H7zml0TSh4ZBDgCVL1DF1NOzZA3z3u27dlUUIrcMQ\nV6Fh9ux21dpxnTohQoOQIEto8DEZtEFo2L07WSDFJ9wdDccfbxYnd92lFlAAcNJJyQeTCA32odeD\nyzoN3MUuH0SRmW9FaHCLS0eDjHESHeSI0GCf+fONJZ4u3rduVbWNXvMa4Gtfq+fcKE0TGgBTp+G5\n51RqYRwDb3sb8Na3AhddVO+5AWEW/qU1GjilTrRt3ec6dULXk5k+3d98IEIDY9JqNPiuDEsfoGnC\nR8hQNa8uFZy70DA4CJx2mjp+9lmzQ3PyyX4Kak5OmuChbQ8cukDxJTRwvAZ9oec3SZ1wi0tHg1zL\nSURocMfgoFmj0XXSnXcCe/eq4+99z/95ddO0YpBAsk7DU08BP/gBcMst6vN77pnaGcs3dB4K5f7g\n6mgIZfxs4St1wmfNPREaGMMpdWLuXFM8sSlw6DwRwsKY1mnQnHJKcvxc5ajSALttQoOvGg2yC6zQ\nQcOuXW6Kb8o4K1wW3pQxTqLHYNcutetrC+mgotCLdfr8W7/eHKd1afINdTRMn17fediEdp54/HHg\nYx9Lft9FgFaEEIulchIaxsfNddu2dd+8eaYmhW1HQxyL0CB0wUloaFraBMBDaODedQJI1mnQ+HI0\ntLXyMCCOBt/Q+daFqyHExacLhofNQsr2vCHXchItAnQ6dlsQi6NBoRfrL71kuoA9/LD5/oYNbls/\n50EHbDNmNCfXnToavvhF4Gc/S35/40a/59NNiILn/PlmM7Hu1Ik2C5lRBCxfro6ffNKuO2fPHiXi\nAMmuOa5pyLTTTOoWGjqd9ggNdVWNDmFh3MvRQBVRERrs46sYZIiLIhe4bnHZ5sVTN67qYci1nIRe\nZzbTJ0Q0U9DFun4GUkdDp5MUHupACw1Nqc8AJB0Nt9029ft1Cw0hCnFRZFwNdTsaQhw/m5x9tvo4\nNqZSsWxRRyFIQIQG1qTVaPApNOzaZXLyfV6UvvARKGehA8iREb6tG1etUuenmT8fWL
bMj1DTZqHB\nVzHIEFw1PnDtaJAg2OCq408Iwq1P6CLd5jXd9kBA0915Io6TQgNQf/oEdTQ0Bepo0NAWiHULDaHO\n9VRosJlqVZQ2r/sA4IILzPGtt9r7vSI0CFOo29HQ5I4TAI/UiRB6vk+bBvzKr5jPTz5Zqd8+xq/N\nC1pfjgbpOqGgwq4LR0Obr+Vu9LNtdNSutTzUBb4rVqwwx9328iqIo0HRLTQ899xU58gDD/g9p270\n/dUkR8Phh0/dmPnMZ8wxJ6EhpLled56YmKi3TWjbn5W/9mvmWIQGwSlpQsMhh5gHhmt7kwgN7glB\naACSdRpOOUV9POQQ43SQ1An7iKPBLz5TJ9ocnAHuWlyKoyHJr/6qOba5YNWBwIwZfJ14PugWGrrd\nDAAfR0OThIbBQeCoo8znr30t8O53m8/rFhpCFeK4FIRsu9CwZAnwspep4/vuS66DqyBCgzCFtNQJ\nwLgaXDsa6EXZdKGh7hoN3BfFZ51ljlevVh+pq0GEBvuIo8EvIjT4w1WLSxnjJGefbToN3HqrPTu0\nHuc2BgGUPEJDnY6GiQmV5w00S2gAgBNPNMd/9mdqfaDXCHULDaHOQ1yEhjav+zTa1dDpAD/6kZ3f\nuX27ORahQQDQO/jUk8ELL7hpw6ahwaPUaLBPp2P6bXMXGi65BHjXu4Df/m3gne80X9cBg9RosE8d\njoaQFkW28dl1ou0Bmg9HQ5uvZc306cC556rjp58GHnvMzu9ta4/7bvoJDbomwubN9dnQaVpS04SG\nq69WwdgXvgCceab6mnY5PP2027VxFqGnTgD1dp6QZ6WbOg3iaBASDAyoHeM0tKMhjt0GyJI64RYt\nMgD8F8UjI8B11wHf+EayqJQew/373bTxarPQUIejgbvg5RLXNRr04nNgoHmL/qL4cDS0+Vqm2F6w\nxrEIDZp+QsMb3mCO60qf0GkTQLOKQQLAGWeo6/mP/sh8TQsNExPAli31nBcgqRNVEaEBOO88045W\nhAbBCf36HfsqCNl0oWH2bJXrB9QjNDQhwKOTlYvgrM1Cgzga/OI6dUIvnmbP7i0itwVXYy3X8lRs\nCw3795vd4raPMX3+bd9uWlnOn58s6FZX+kSTHQ1p0LoNdaZPhOpo4CI0tHndp5k3T4lpAPDgg6rQ\nbFWo0EBb87pGhAam9FuI+hIaml6jgdYYqKNGQxOEBteukDY/cOgiXopBusdXjYa2B2eAu3mjCXOq\nbdasMW6d2283LavLIruNBrpY37RJWfYBVT9AF00GeDga2iY0bNpU33mE6miQ1AleUJH4ttuq/z5x\nNAgJuDkamlijATDvqw5HA/0////27jxMiur6G/j3zDDsICoqCiKoILgrIhoUcDduqIlJ3MA1MXmN\nScxuNO7GmGiMif5i3BONuC+RxC2Iu4gLYhB3xAUUUSHsAzP3/eNUeW/39FJVfavX7+d5+pnqmerq\nWz19u6tOnXturX6Ypl1Qs5EDDd26AZ0767KPaHY+HNeu0q7RwECDxYyG8mluBsaN0+XPPgNeeaW0\n7fEkwHKPi6ZNs8vDhuk00KFKZTS8845dboTvz2rMaKilz6GNNrLLlZwthZ8xync2WniM3tRU3s8D\nBhqqVLUFGuoxowGw+7VkCbB6dXmfe84cu+x+QdaStDMaGvkLR8ROcfTmm5k1PXxiRoNyhzT4zmho\nb7cBnUZ7H+fCjIby8nnA2sifydm6drWZAu7rMmyYZpEMGKD3//tffzN+xHHffXbZneq0XlVjoKGW\n+ki/fsDw4br8zDOVy2po5AtMrq98xU4h72PWoDDQsM46hc8xfWOgoUpFDTSkOY7KPQB0r0DVk7SK\nkkXhBhoGDy7vc/uS9swd4RdOly72A7eR7Lij/mxvL/1KZD7hyVlLS2O+xqGmJpti7vuzgFkjmdzv\nE5+ZUOEBfufONhuI/AYaavUkKi25sj3Dk7Vw+M
SiRTr7RDm1tQH336/L3bsD++xT3uevhGoJNITH\nLSK1N2TlsMP0pzHAP/9ZmTYwmKm6dQNGj9bluXOBd98tbXvhd225M9QZaKhShQINbsGWctRo6N0b\n6NQpveeppLRT/wtxPzQ23bS8z+1LuWo0NGpUe8QIu/zSS+k8R3jiwCvA9gTY99CJWh2zm5b+/W0h\n3tde87fdMKDD93KmYcOADTfU5SeeAFpbk2+LJwGZch20DxumPys5fOKZZ+zx4f7719+sE7msv75m\nmQCVCzQYY6eRHTCg9gr/HnqoXb733sq0IfyMaW6uvUCNb76CxK2t9liPgQYCUB3FIMMTx3qtzwCk\nf0W+kHrIaChXjYZGDTSEGQ1AeoGG8OSMJ8CZgQafqc68Cpypa1c7LOi11/xNjcs6GLmJ2APW5csz\n6wnExUBDpuzjo5YW+31eyYKQ7kmie/JYz0SAgQN1ee7cygxX+fBDe9zi/v9rxYgRGggGgEceSbcQ\ndT7h69e7d+0FanzzFWioVCFIgIGGqlXpGg3t7fbEu17rMwDpX5EvJMxoWGut2h2akubr587X3qiB\nhm23tVd+mdGQvnDoRFub3wOsWi0OlqYwiNbW5u9qLzMa8nOnWyzlgJXZOZmyD9qHDLEZoO6JZjkz\nGoyxgYbmZuCgg8r33JUWBhqWLatMkW/3/1yLgYamJhuYam0FHnyw/G0IP2MYyNTAT/g6TJmSfNYg\nBhqog0KBhh49bHpYWoGGJUvsG7pRAg3/93/Aj3+st1tvTfd516yx0y8NHly7Uds0M0KWLrXvwUYN\nNHTrZsf7zpoFrFzpd/vt7bbIJE8a0psNgVeBO/I9LIjv5cJ8XRnjezlT9nz04bCJcDkMFJczo+HV\nV+2FjHHjavdCRhKVrtPg/p/doTO1pNLDJxr9ApOrUyc7a9DChckDlm6gIfszK20MNFSpQoEGEZvV\nkFagoZLRr3JyAw3//jdw2WV6O+oonXM8LR9+qFfygNqtzwCkO3SClYdVeOV3zRr/V8XcmSx4cpbe\nFJfMaOjIHRb04oulb48FNwsbOBDYfHNdfu655MNVGGjIlH18FAaGAb0gNGSILr/2mn6Gl0MjDpsI\nVTrQUOsZDQAwdqzN7ps8ubSaLnG1ttoLKvx8UT6CxMxooA6KTT0SBhoWLkyeSlNII0xtCWg6ab4P\nsylT0ntetxBkrdZnALRQT1jd3XdGAwMNKs2CkJzaMlN4cAX4zWhgoKGj7bazmVw+3tec2rK4XXfV\nn2vW6JS5STDQkCn7oN3NaADsVe1Vq/R9/vnnekvjuC3kBhrGj0/veapRtQQamps7vhdqRUuLHW6z\neDEwdWr5npufLx0x0ECpiBpoaGvzXyEdaJxAw/rra3bBU0/p7Z577N/SGhMPZBaCrOWMBhH7ocVA\nQzrSLAjJq8CZOHSifHr0sAfir75a+lUzBnOKc6+2v/56sm2wsGmmYoEG96r2qFG6/rrr6nppTKk9\ndy7w8su6vNNOwMYb+3+OauYGGsLhqeWyZg0we7YuDx1a29NFV2r4hPtd2cjHfa4ttwQ22ECXn3gC\nWL06/jYYaKAOio3Zd6e4XLDA//M3SqAB0IPS0aP1Nn68PdlIM9BQLxkNgH1/+A408AtH+b7y62JG\nQ6a0Ag08Cc4tzNZZvVprkJSCGQ3FuSfBSQMNDJplyj5o32KLzPujRuV+3FtvAXfe6b897knhYYf5\n3361q2RGw1tv2YBprQ6bCO23nw2U3Hdfuhk4LvcCEz9flIgt5rt0KfD88/G3wUADdRA1owFIp05D\no9RoyCZiryB//DEwb146z1MvGQ2ADTQsX+63WCG/cFSvXnp1BABmzkwWzc6HGQ2Z0qrRwJOz3HzW\naWAwpzgGGvxzj4/69+/43tt3X+D884EDDtDbbrvZv6UxE0Uj12cA9H8QHj+XO9BQD4UgQz17Avvs\no8vz5gHTp5
fnefn5kps7fCLJsO6FC+0yAw0USdqBhkbKaMiW5pj4kJvR4Ebga1FaU1xy6IQVnpC1\ntmpRMV+Y0ZCJNRrKy+ewIGY0FLfZZnYWhFIDDc3NOitOo3MruLtDU0IiwJlnalG9yZMzh2f6DjR8\n8YWmVgNahDJXe+pdS4sGG4DyBxrqoRCky82IeeCB8jwnM1lzK7VOAzMaKDYGGtKT5pj4UJjR0L+/\nnaq0VqU1xSUDDZYb/PJRoT/EjIZMHDpRXttvb5dL/azla1xc584abACAN95Ilg4dngj06lW70zL7\nNHiwHR5x7LHF1+/bF+jXT5dffRUwxl9bpk61/9ODDmrc/0948WbhwsyZldJWb4GGMF0fAGbMKM9z\nMpM1t0GDbPbzs89qBnEcDDRQbOUcOsFAg19Ll9q6GrVenwFgRkM5pPWeZEZDJhaDLK+11rJTLr7y\nSmnT/zGjIZrwKveKFcmK5YXvZb6PlQjw9NPAJ58AEyZEe0x4EvrZZ/o4X9wrne4V0EZTqYKQ4dCJ\n7t3r49huk01s0NYdFpImflfmF/bp1lYtXh9HeE7Xq5edKa5cGGioUWkHGty0yrDaaaPYbDP74ZpG\noOG99+xyrddnADIDDW6AqlQMNFg77GCXfb4nmdGQKa0aDbzanl+YrbNyZfJ0foCvcVSl1mlgoKGj\n5ubMAt3FuOP3fZ7AhYGGTp2AMWP8bbfWDBxol8s1fGLZMuCdd3R5q62K11mrBSL2vfree5mfsWnh\n0In8Shk+ER6bV6LmXh10hcaUZqBh0SKbnr311pkH342gqcme2H3wgf/Xt55mnACY0VAOffrYoNSM\nGTqtrQ/MaMjkvs/Symjo0cPfduuBr4KQfC9HU0qgYc0azYQAGGgohZtW76tOw0cf2f/nzjs3drCt\nEjNPzJ5th8HUeiFIl/teLUdWA4dO5OcOZYkTaGhvt8fmDDRQZG703GfqHZA5zq9R0+/SLAjpzjhR\nD4EG1mgoj/CEbMUKHV/tA68CZ2ppsSepadRo6NWrPq50+eRrWBCzc6IpJdDgfl7wJCC5NAINbiX6\nRj1uC1Ui0FBv9RlCaWXf5MOhE/mttx6w7ba6/NJL0Y+3Fy+253QMNFBkvXrZgynfY9A4zi/dOg1u\nRkO9DZ3wGWhgCl2mNApC8uSsozCDK61AA2XyNSyIQbNoSgk08CTAjy23tIUafZ288bjNqkSNhnoN\nNKQRFCuEx32FhX3bGL0oHEUlC0ECDDTULBF7kjp3rr9UasB+YTU3A2PH+ttuLUkz0FBvGQ1p12ho\naan9mTl88JVi7mK6eUdhoMFnjQaOa89v3XXticHLLyebCQFgMcio+vSxsx4w0FAZ3bvb2T9mzUr+\nng8ZY4/bunUDdtmltO3VOrdGwzPPAG+/nf5zugGjeho64e5LOQINHDpRWJI6DQsX2mUGGiiW8CR1\n9Wodn+fDvHk61gwARo5s3I6+xRZ6MAD4nU4QsBkNnTsDG23kd9uVkHaNht69G3eaLpcbaJg+3c82\nmdHQUZ8++nPVKjsevRTG2NeZr3FuYbbOsmXAm28m2wYzGqILsxo++SRe5o4baOBrXJrwBG758sws\nxyTeegv48ENd3n13oEuX0rZX63r0sMHLOXP0qvzFF+uxclrCk/C+feurgHqa07HmwoyGwsaMscfD\n06ZFewwzGigx92p4qV9UIY7zU83Ndo73OXP8pVEbYzMaBg2qj/Haaddo4JeN6tvXTgX4wgt6Ilwq\nZjR05HuKy2XL7MEZT85y85FBxoyG6JIOn2BGgz8+i+xx2ERHN99sgw0rVwK//KVePEtjKMXChcDH\nH+vy1lvX34WRtKZjzcXNZG30gFkuvXrZbKjXXouWze4GGvr2TaddhdTBaU7jcsf3u+n4peAXluUe\n/L78sp9tLligVzCA+qjPAGjmRzgvr6+hE8Yw0JDLrrvqz9ZWP+/J8ORMxGbw
NDo3Q2fevNK3x5Oz\n4tzP2hdeSLaNMGjG93JxDDRUns+x7+5xm1uZvpHttpsGcH70I3tB55VX9L5vbqConuozhMo5fMId\nZlhvARtfwvfYihXRLjIzo4ES853R4I7z69rVntQ0qjTqNNRbfQZAvwzCkzNfGQ0rVuhUagADDa6v\nfMUuP/ts6dsLT8569KiP7Bof3OKE//536dtjSn9xI0fa5eeeS7aNMGjWsycPUItJGmjgrBP++Krm\n394OPPaYLvfpk/n51eh69gQuu0w/U8JMtf/8p/SaGNnqtRBkqJxTXIaBBh735Rf3/8FAAyXmO6Ph\n7beBDz7Q5d12YwG+tAMN9ZLRANgPL1+BBk5tmZsb/HvmmdK3x9oBHY0fb5fvvbf07fEqcHF9+wJD\nhujyiy8mGxbEmT2iGz7cLjOjoTKGDLGp4aVcJZ4xw37v7rGHDvukTCNH2sLmixfHL4JajPv/q6dC\nkKFyzTzhZrLy8yW/uBkmDDRQYoMG2WUfGQ1Mv8u05Zb2QMBXQUj3/1QvGQ2AzWhYvlzHQ5aKgYbc\ntt7ajj9/5pnSCzOFJ2cc024NHGiDjC+9VPqYXmY0RBNm67S2JgvsuhkNVNiAAXZ4CQMNldGpkw34\nvPlm8po7HO4aje8gfWj1auD++3W5U6f6DDS407GmGWhYtcoW7OTnS35xAz8MNFBi3boBG26oyz4y\nGlgIMlNLC7Dttrr85puZB1lJ1WtGgzuu/dNPS98eAw25NTcDo0bp8rx5NgMpCWN4FTifww6zy6Vm\nNTDQEI17IhB3WBDfy/E0NenMSgDwzjsa3ImCgQa/wpPStrbkV9kZaIjG97DD0OTJtkDi+PH1+fnj\nezrWfDjjRDSbb24vgkYZOsHpLakk4VXxjz+2RQaTaG+3gYa11rLTjTU6d/jEv/5V+vbqNaPBrWS7\n117A1KmlbY9fOPmVckLmWrXKViyux4OjUhx6qF0uNdDAk7NoSrniuHIl38txhXUa2tp02GQUnN7S\nr1JT0levBp58Upc32sgGj6ijESM04wDwG2i45hq7fNJJ/rZbbcKgWNQChEm4F5j4XZmfmw311lvF\ns4jDjIaWlspk/DHQUOPcq+LvvZd8OzNn2jfjuHEc5xc6/HC7fM45tkBhUmFGw9pra+GmevGtb+mH\nGKAffHvsoV+6SacHZEZDfr6uzHBqy/y22spOJfrEE6XNpsKMhmi22sq+Ps8+G29YEKe2jC9JQUgG\nzfwqtSDkzJn2AtOYMSyCWki3brZQ5uzZfupJffAB8OCDujxwILDPPqVvs1qVo04DP1+iC/8fUbKh\nwuOXddetzGcEAw01ztfME0y/y22ffYDdd9flN97QuZmTWr3ajveup2wGANh7b51u0b0qed11+tol\nGXvKyHZ+u+xil0sZa+qenPEEOJOIzWpoawMeeCD5tngVOJpShgUxmBNfkoKQPBHwq9STNzfQ7Aag\nKTf3NZo2rfTt3XCDHUZwwgn1fYGuHDNPMJM1ujgFIcNAg5t5XE4MNNQ4XzNP/POfdnnvvZNvp96I\nABdeaO+fc07yok3u2Lawwno92Wor4KmngCuvtAf7s2YBf/1r/G0xoyG/tde2VyNffllTGZNgRkNh\nvoZPcErA6NwTgThBNGY0xJcko4EBHb8GDLDfb0kCDW4fafTpyKPwWRCyrU0vpgB6nHjCCaVtr9rF\nnekgCV5gii5qkNIt0F6J+gwAAw01z0dGw8KFdpzfkCGZByCkV+W/+lVdnjs3c0xeHO4Xm3tVup40\nNQHf+56d1xsALrgAWLYs3nYYaCgsPCFbsyb5jCjMaChsl12ADTbQ5YceSl4Dhydn0SWtP8LXOL4h\nQ2wabdyMhu7d7Xh3Sk7EnsB98EHm914UYR/p1g3Ybju/batHPgtCPvqozVDdf39g441L21618zUd\nayHMmIouaoZJpWecABhoqHk+MhoeeMBe
aT/sMI7zy+WCCzKX4544A5lfbPV+9WHECOCII3R5wQLg\niiviPZ6BhsJ8XJlhRkNhzc3AIYfo8ooVwMMPJ9sOD56iSzosiBkN8XXtai9UvPaanVaukPC9zPex\nP+4Jw8svR3/c/Pm2LtfIkbZGEuW38cZA//66PG2aLSCbxLXX2uWTTy6tXbUgbgHCJDh0Irr+/aNl\nQzHQQCXbaCP7BZM0o+Gee+yymy5M1o47Al//ui5/8gnwpz/F30YYaOjSxRYlqmfnnacZDgBwySXx\nCkMy0FCYjyszvApcnDvN5XXXAY88orcnnoh+oMXXObo+fXTOdgCYMSN6Fglf42TCmhjLlgHTpxdf\nn4EG/9ygsZsJWEwjXbjwKXytli5NXmtgwQLgvvt0eYMNgIMO8tO2audjOtZCOHQiOhEbpPzww/zH\n1ww0UMmam4FBg3R5zpx4lboBPcAIr9T162cPPKgj98T5t78FFi2K/tgFC3S+cgDYaSegc2f/7as2\nw4YBEyfq8qJFwO9/H/2xDDQUNmyYnbXkmWfi93uAV4Gj2HNP+9o88ACw7756GzsWOOCAaK87T4Lj\nCU8E1qwBXngh2mP4Xk7GLfzsFoTOxRgbaOD72J84/wMXC0Em4yNI//e/2wyg445rnGwSN/vmoov8\nzNzhYkZDPO7/Y9as3Ot8/LFdZqCBEgvTH5cu1XoLcTz8sL0yN368PZGmjoYPByZM0OVFi4BLL43+\n2Ea9+nD22fZL+PLLNRskCgYaCmtqskHBBQuSDZviCXBxXbrYIUDZHnss3lVggCfBUSQ5EeB7OZk4\nJ7nLl9shlrza6E///sAWW+jytGmZ7+VCGqHmUxp8DDu88067XO9FIF3ua3fHHXpMfPvtyS505MJh\nhvEUK9BpDHD11fZ+pYrQ87SyDrgFIeOecLjV1Dlsojj3xPkPf9AjiCRPAAAgAElEQVSTvCjcL7RG\nuvqwySbAKafo8vLlGgWPIvzCaW4GevRIp221rtQrM7wKHM0f/qBBsrPO0tuRR9q/ueN08wlPHHr0\nqO/pz3xJUhCS9UaSGTTI1nl69tnCQ1U4e0p6woDPmjU6LMs1c6YO4frHP+zvVq2yRYA33xxYf/3y\ntLMe7LCDzShN8r05bx7w3HO6vM02wNCh/tpW7XbfXU9cw2DuggXAN7+pwfgoNV6K4dCJeIoVhHz4\nYVvof+hQ4MADy9OubAw0JCQiA0XkUhGZLSJLReQzEXleRH4iIt3K2Ra3IGScOg2rV9tpLXv1AvbY\nw2+76tGgQcB3vqPLy5YBv/lNtMc1akYDAJxxhlYpB4C//MVWai4k/MLp3ZvFSfNx30eTJsV/PK8C\nR7PWWsAPfqBDp847T6drDV+vW2/NDNjkwnTzeLbYQqdwBaIPC+IMKsmFJ7mtrTo9cT5vv22Xw2Fb\n5Ee+zBJjgKOP1gtCxx5rx8W//LKdZrvRjidK1aWLDl8F9D396afxHn///Xa5ES/OffvbWjx2/Hj7\nu7vuAq6/vvRtc+hEPIUyGozRY+/Q+edXbqYgBhoSEJGDAcwE8CMAQwF0A9AHwAgAlwB4WUQ2K1d7\nkmY0PPmkLSBy4IF26hoq7Fe/0umkAOCqq4qfOK9ebVOsBw/WWhiNpF8/PVED9GD2vPOKP8YNNFBu\nY8boPOyA1g+Ie3WGJ2fJ9OxpsxqWLgVuu63w+mFAh+/laJqabCr4p59GC54zaJZc1OETN95ol8eO\nTa05DWncOBtQd/8H06bZK5Xt7cCvf63LjXzhwoek0+gCmVnAbrHgRjJggBaRv+UW+7so2X3FuIEG\nfo4Xt/badhaVV1/NDMrffTfw0ku6vP32tph9JTDQEJOI7ABgEoBeAJYAOAPAVwDsBeAaAAbAEAAP\niEiSpO8mAGiLMe9O0owGDptIJvvE+fzzC68/Y4atg9GoBwU//amNUN94I/DGG4XXX7RoPoBz0L37\n/LSb
VrO6dLEHnoBGr+OMlXQDZEw3j8edzuyaa/KvZ4w9CU564DR//nycc845mD+/cfqCOyzIvYKY\nj1uUjO/lePbc0y7nCzQsWWKzpnr3Br7xjfTbla2e+8E66+jMVoAOlQiHZGafvN1xh548sBBkadzX\nbOrU6I9bvBiYMkWXBw7UE7hKqIa+IAIcdZROZQ5o4d4ZM0rbZniBqUsXXviMKhw+sWiRDusBdFaQ\nM8+061x4YWXr7zHQEN/l0AyGNQD2Mcb81hgzzRgz1RhzCoCfARBopsOPE2y/GYgXaEiS0WCMDTR0\n7gx89auRn46QeeJ8ww3Am2/mX5cHBRp5/elPdbmtTWtd5LNqFdDaOh/Auejatf4OKn067jgdowvo\nAVPUquW33GKn5+reXQ+aKLoRI4DtttPladPyz2Pto4De/Pnzce6559blCVY+++xjl3/5SzsePZeP\nP878LguzfCia9dYDtt1Wl196KXcl+UmTdKggoCcXlaibU+/9wM0seeyxzOCO68wzbc2nnj0z06cp\nmt12s2nk114bvYj6v/5laxEcemjlhnVWU1846SS7XGpWQ5jRwGET0bl1GsLjkJtvtsOsRo+u/Pkd\nAw0xiMhIALtDsxauNcY8n2O1ywDMhgYbfiAiqZf/Wntt2zGjZjQ8+ijwwQe6vNdeTOuNa511gJ/8\nRJfb2oDTT89fSMstBNmoGQ2AZoGst54u33Zb/ui3WxCIVycLa2kBzj3X3o+S1TBzZuYV+T/8ga9z\nXCLRDrDcWVaYChrdqFHAd7+ry6tWAV/7WuZ84K6LLrKfvaecwtc5ifAk15jcV3jd97f7vid/soew\nuMGd44+3weB//xv46CNdHjWKBWaTWH994MQTdXnJEp2uPApmAXd01FG2BtfNN2ceB8+YAey9t2Ze\n5jouue02DdpvvbXewnMSno9E5wYaTzxR73//+/Z3F11U+TpnDDTE43603JhrBWOMAfC34G4fAKmX\nWBSxWQ3vv6+Vi/NZtkxPkPff3/7OLepC0bknzpMna2Tx0Uc7rhdmNHTvbq8cNaKePbW+RchN7XJd\nd51dZtGx4r71LRvVnj7dZirksmgRcPjhwIoVev+EEzKDDhTd0UcDXbvq8t//bodHAXpQNWlSZmAx\nLHBI0Vx+ua3VMHeu1sXITvSbO1cLzAL6+eoWv6LoCg2fmDkTeD64pLL99jbFn/waPdrOaPWf/2QG\nd049FTjnnI6PaeQLF6U66yybnv/nP9vgTT6rVmlGA6AXmnbfPd321Qp3KNXixVoYEtBMswMO0Pfy\n+ed3nA7+6aeBY47RLKpZs/QWfr737Vu+9te6MLMS0KETs2bZ4Zr77ae1vCqNgYZ4dgt+LgNQIJkT\njzvLo9NrjhXWaWhrs1HBbA8/rCckl15q03l33hmYMKEcLaw/vXrpQW74ZfXuu5ryO3Givfr20Ud2\nLPzOO1eu6mu1+M53gI031uXJk/XLxvXII5kBiIMOKl/balVTE3DBBfb+mWd2PCEDtM8feyzwzjt6\nf8QI4MorKx/trlVrr20LLH3xhZ4I3HqrTkN38MF6YhyOte7dmwGduDp31jHp4dR9jzySWZME0Gye\nMJX5hz8ENtigvG2sF2PG2Cvj2YEG94T35JP5eZGWHj1s4ODdd21wZ4cdNLhz7LE6I4uLgYbk+vfX\nAA6gQWL3OzSXKVNsAeWDD+axnCu7ZtHq1TrtpTuy4+c/1yFBgAYhjjjCXhTt1k0vRPXsqZk7v/hF\n+dpe67bdVjMZeve2r2HPnsDw4cAVV1S6dYqBhniGQ4dNvG2MaS+w3utZj0ldoToNCxdqMGG//ezf\nunTRAiFPPWVnUKD4Dj9cr/i4Vbj/9jdg2DAdB8/q0Jm6ds2sz3DEEVq9GLBXLdudnhVOQ0WFHXyw\nptECGtEePz6z2ONbb2kK4wMP6P111gHuvNNekadk3DTy3/5W00iPPl
qDaKHDDtPpwEaXJeRcXwYM\n0PTa8CT4ootsIPf114GbbtLf9+ljh7JRfL17ayAc0EK94dXdFSs0WwfQ44SjjqpM+xqFO3wiFH7G\ndOrUsfB0mPFDyfziF3bY4LXX2iB8LuFxCsBhE9l23VVPbAGdze7II4EnntD7nTvrz/Z2DT7MmaMZ\nEGEQYtw4rc2wZIne5s5llnUcIvreXbzYvoZLlugxx9ChlW6dYqAhIhHpAiBM6Pmw0LrGmEXQrAcA\n2DjNdoVyzTxhjJ7sDh9uDxYA7dgzZ2qaaZiqR8kNHarR7r/+1dbKWLhQ08K+/W27XqMWgsw2cSKw\n1Va6PH++BmsOPzxzHDbTEuMR0RPd8Grj5Mn6Gl9xBfCb32gmU3g1oalJr7wPGlSx5taNMWPyVx7f\ncEOdYuruu+0UVBTfuHHAJZfY+2Egd8IEG5T82c84NKVUuaa5vOceHW4FaFCYQ9nSlR1oyA7ufO1r\n9jhir700YEzJ9e0L/Dgo2b5mTe7hKYBmCIZDErt1A/bdtyzNqxnZNYvC4RMtLXrcsd9+ev/TTzXV\n/8kn9X7//hpIZnZIfWOgITq3xNTSvGtZYaChLGXW3IyGn/1MAw8bb6wnu2FF3T59NPI1ZUr1RLrq\nRVOTpo/Nnp05X+0XX9hlXn1QnToBDz2UWQn3nntsZfnNNis+ZSh1NHasvo79+un9pUu1jsgZZ+j4\nUkCDCw8+yAMlX0S0ONvVVwN//KO93XijfhY06jzrvv3oR/rd5QZyp0/X5fXXB047rXJtqxfuSe5p\np+kxxCmn2N+xCGT6dt45szBvdnCnqUmHwN51V+4ZKSi+00+3AZtbbtH3ffZt8GA7DG6//WzxQ7Im\nTOh44fKKKzQwdsst9sJGWD+gpUWzKsOhcVS/GGiIzk0ybo2w/irozBNlGZjgBg6++ELTk9ziNkcc\noQe+J57IMZZp2nBDHVd8332ZVzG32IIFblz9++tV93/8wxbUBPRqwd13s3J8UuPHaz93M2kAPUA9\n/XTgv//NnDqQStevn77ep51mbxMncooun0T0uys7kAtoTZJKTLdYb3bd1Z5ALV6sxxDhScEWW+iU\ngJSulpbMYZi5gjs9emgGII8n/OjdW6fQBTQLeM6cjje37hmHTeTWt29mYP2447QmFwCsu64Gx9yh\nmldcwYtvjYIJK9E5NcXROcL6XaD1HFak05xMm22mY1RvuSWzENyAATom/pBDytEKCh1yiKb8nn22\nzkTBK/QdiehYvn331ZOFadO0INO222olYkqmTx+9wn700ZrN0KWLDqtgvQuqdWEg9/77tRDkkCEd\ng2qUTJcuOtXtRRfZWWkA/Tz50594gaJczjlHLxKNHcvgTrmceqoef4R1BfIZPVpneaLcLr5YgzKb\nbgpcdVXmZ8aOO+qQzV/9SocAhUEIqn9iik26TgC+rNGwAho8mGyMKXjqLiJLAHQH8JwxJnIZMBFZ\njSAAtJ57qTeP5uZmNHMiZaozra2t+PTTT7Heeuuhc+cocT2i+sS+QMR+QBRiX6Ak2tra0JZrSrIs\nCxcuRBAbWG2MKfkNxkBDDCKyAMC6AF4xxuSdTVpE+gD4HBqUuMMYEzkGKiJrADByQEREREREROXW\nZowpeeQDh07EMxvA7gA2F5GmAlNcDst6TByrYIddfB5h/TYAhabaJCIiIiIiosbUhGgXsteB1hhc\n5eNJGWiI5ylooKEHgBEApudZzynng6fjPIExhmWtiIiIiIiIqGZx1ol47nWWj8+1gogIgAnB3UUA\nHku7UURERERERETVgoGGGIwx0wE8CU0pOVFERuVY7ScAhkOHPlxujCleeYOIiIiIiIioTrAYZEwi\nsj10OEQ3AEsBXATNWugG4EgAJwervg5gpDFmWSXaSURERERERFQJDDQkICIHArgZQG9odoPLAHgD\nwIHGmDnlbhsRERERERFRJTHQkJ
CIbAzgBwAOBDAAQCuAtwHcDuBKY8zKCjaPiIiIiIiIqCIYaCAi\nIiIiIiIib1gMkoiIiIiIiIi8YaCBiIiIiIiIiLxhoIGIiIiIiIiIvEk10CAiI0TkLBF5SEQ+EJGV\nIrJERN4QketFZHTM7X1VRO52tvVBcH//CI/tLCKjRORUEfmbiLwuIm0i0i4ibTHasJ6IHCgi54rI\nv0Tk02Ab7SJyfZz9iaMa9j1BmweKyKUiMltElorIZyLyvIj8RES6RXj8ZiLyLRG5TESeEpFlzms9\nIa12+8Z+4E817HuCNifuB8FjTxKRv4rINBF5L+gHy0XkfRG5T0SOEZFOabXfJ/YFf6ph3xO0uZS+\nMNZ5XYvdfp3WPvjCvuBPNex7gjaX0hei9oP2tPejVOwH/lTDvidoc0nnCcE2NhCRC0XkRRFZJHp8\n9I6IXCsiI9JqO0VkjEnlBuAJAO3BrS3HLfzbTQBaimxLAFybZ3vh764uso0bnHXbsx7fFmO/2vNs\npw3A9Sm8jlWz7zHbfTCARXn+/+0AXgewWYHHjynyWk9I673LfsB+UEX94PwC7x13318BMKjS73f2\nBfaFFPvC2CJ9wb39utLvd/YF9oUU+0KUPuDeXqv0e579gP3Adz8ItnEIgMUFtrEawJmVfq838i3N\njIYNARgAHwH4I4CvA9gZwK4ATgfwYfD3Y6Fv8EIuAnBCsP6LAI4MtnUkgJeC358kIhcU2Y4Jbv8D\n8DiAj+PuVNZ23gfwMLSTp6Xa9r0oEdkBwCQAvQAsAXAGgK8A2AvANUE7hgB4QER65NuM0+42AP8F\n8DzSfa3TwH7gR7Xte1Ge+kE7gBkArgRwMoCDAIwEsGdw/+lgO9sAeCTqFYAKYV/wo9r2vShPfcF1\nAvQ9n+92ledd8I19wY9q2/eiPPWFQu/98HYp7Gt/Uxr74gH7gR/Vtu9F+egHIrI7gDsA9ASwEsDv\nAOwBYCcAxwB4AZq5f66IfCetfaEi0opgALgfwNcQTKGZ4+/rQKNVYRRqtzzrDQHQGqzzHIAuWX/v\nBj0BbQewCnmiXwCOADABwHDnd48hfqTybAAHAFgvuL+Jsw9eI5XVtu8x2v2406adc/z9x85rlvPK\nE4DNoV80uwPoHvxuovO4WsloYD8o/TWsqn2P0W4f/aApwvP8wdnOqZV8vxdpJ/tC6a9hVe17jHb7\n6AtjnXXGVPr9XOLrwb5Q+mtYVfseo90l94WIz/NcsJ01AAZU6r1epI3sB6W/hlW17zHa7eM74dVg\nnVYA43L8vROAh4J1FgFYt9Lv+Ua8VfbJgQOdN9Lleda5yllnZJ51Rjnr/CnG85fciVL+AKnqfc+z\n3ZFOe67Ms44AmBWs9xmA5ojbrrlAQ8T9Yj8ovO2q3vc8202tH+TYzgbOc92W5ns17Rv7QtFtV/W+\n59mul76AOgo0RHzd2BcKb7uq9z3PdsvyvQBgqPM8j6b9Xk3zxn5QdNtVve95tltyPwAwwtnGTQWe\na3NnPQ6hqMCt0rNOTHWWN8uzziHQFJrXjTHTc61gjJkG4A3oG3O8zwZWWC3u+6HO8o25VjDa+/8W\n3O0DTXVqZFOdZfaDjmpx38vZD5Y4y10TbqNaTHWW2Rc6qsV953dCMlOdZfaFjmpx38vVFyYWe54a\nMtVZZj/oqBb33Uc/2MlZfjDfExlj3gbwTnD367FaSV5UOtDQ2Vluy/6jiAwGsFFw9/Ei2wr/3l9E\nNvHQtoqq4X3fLfi5DDpWLB93n2JVFa5D7Ad51PC+l7MfHOksv55wG9WCfSGPGt53fickw76QRw3v\ne7n6wlHO89yd4PHVhP0gjxredx/9YF1n+ZMiz/cJNMiyjYj0itRC8qbSgYZxzvLsHH/f0lkudgDt\n/n140gZVkVrd9+HQ6Orbxpj2AutVU5srbZyzzH6QqVb3PdV+ICJ9RGR7EbkMtvBdK4D/i93S6jLO
\nWWZfyFSr+55GX7hIdKrXlSLyuYi8JDoN8pCSW1s9xjnL7AuZanXfUz8+EpFx0FR9A+AuY8zyuI2s\nMuOcZfaDTLW67z76wVJnea0iz+f+fcu8a1EqKhZoEBEB8HPnV7fnWG2As/xhkU1+4CxvnLRdVaTm\n9l1EugDoG9wt2GZjzCJoNBOoj/9XIuwHRdXcvqfVD0TkxnAubgCfQ6tJ/xBa8GgpgG8ZY94roekV\nxb5QVM3te4rfCbsG67RADyK3g/aF2SJyduIGVwn2haJqbt/LeHw0wVn+e8zHVhX2g6Jqbt899gM3\n6DS2wPOtB2AYNLCRazuUskpmNJwOnX4ljLq+nGMdN8VlaY6/u5Y5yz1LbFs1qMV9j9NmwLa7Hv5f\nSbEfFFaL+55WPzB5bpOglaLvi9nOasO+UFgt7rvvvjAPwJ+hw4VGQQuCHQbgemhGjwA4O8I0btWO\nfaGwWtz31I+PgumNvwZ933xojJkSvXlVif2gsFrcd1/94EnoBRcBcLyI5KvfcQGAZtjpRTl0oswq\nEmgQkbEAfhPc/QTA9/Ks6hY2ay2y2VXOcjXPJR9VLe57nDYD2m5Bffy/YmM/iKQW9z2tfnAG7Bzp\nowF8F5rVcCSAW0Vk8/hNrQ7sC5HU4r777AvPA9jEGPMDY8ztxpgXjDEzjDH3G2NOho77/V+w7i9E\nZNuSWl4h7AuR1OK+l+P46FDYE6laz2ZgPyiuFvfdSz8wxqyEBhEAfc8/ISLHiMg6ItIiItuIyM0A\nTkb17HtD6lTuJxSRraDFaToBWAHgCGPMwjyrr3SWO+dZJ9TFWV6RvIXpCvY/nznOeLqq2XcR2QjA\n2nn+/IUxZl6wHKfNgLbboIr/X2lhP2A/cETqB8aY+QDmO796TkSuAXAlgFOC+3sYY16N8JxVg32B\nfcGRty8YY4r1jxdE5FToCZYA+H8AvhPhOasG+wL7giPJ8VFdDJtgP2A/cBT6TrhcRIZCP+f7wc5S\n4VoAzYI7L7i/JMc6lKKyBhqCCqkPQd+MawB80xjzdIGHuG+IYqk+PZzlKOk4lVLoJGAcgCeC5Wra\n94uQ+QXmuhHACcFynDYDtt3V/P/yjv0AAPuBK3E/MMYYEfkBdK7xAdBikLsVflT1YF8AwL7gKvU7\nYRI08NYLBcbtViP2BQDsC65YfUFE+gHYG3pSNt0Y80aUx1Ub9gMA7Aeugv3AGPM9EXkQOsxmV9jz\n2mXQmh5nIHNmri8iPCd5VLahE0G061HoVCztAI43xjxQ5GFuoZABeddSboGPD/KuVXn5xllnV16t\npn3P1+bwpisZswpAGHUu2GYR6QP7AVLN/y+v2A++xH4AP/3AGLMaOo+0ANhVRDZMsp1yY1/4EvsC\nvPWFNgBvQvtC/yTbqAT2hS+xLyBxXzgGOhYdAG6K+Jiqwn7wJfYDRO8HwfC5cQB6A9gUwCAAaxtj\nTjLGLADgDqObFXmPyIuyZDSIyLoAHgEwGPqGO9UYc0uEh77mLA8rsq7791xT4FQFY0xz8bUAVNG+\nG2OOB3B8xNVnA9gdwOYi0lRg6pqa+H/5xH5gsR98yVebP3WWByJziEXVYV+w2Be+5KvNpvgq1YN9\nwWJf+FKSNh8T/FwNzeypKewHFvvBl2K1OQhizM3xp9HBz8+NMXMitpE8ST2jQUR6A3gYdt7Unxtj\n/hLlscEbIhzXUywNckzw8yNjTK43Wk2p4X1/KvjZA1oRPB93nwqlxdUF9oNkanjfy90P3Ku31ZwS\nyr6QUA3ve9n6gog0ARgKfV/NK7J6xbEvJFPD+55KXxCR7aBXbQ2AycaYmkoPZz9Ipob3vZzfCTvB\nfifclmQbVJpUAw3BVDv/ArAD9J98gTHm9zE3cx80DXKYiOyc53l2gZ0n9d7kLa46tbjvbhtyRjeD\nuZHDsVyLADyWdqMqif2gZLW472XrByLSHcBXg7srALydZDvl
wL5Qslrc93J+JxwJYK1g+fGE2ygL\n9oWS1eK+p9UXJjrLNTVsgv2gZLW47+X8Trgw+GkAXJ1wG1QKY0wqNwAt0IIu7QDaAFyacDtDoFOg\ntAGYBqBr1t+7Qqe9aodOYbJZjG0/FravhP3cxNnH6z2/hlW97wW2/bjTplE5/v5T5zU7K8Z2JzqP\nm5DWe9fza8F+UPprWNX7XmDbJfUDAOsCOLzIc3SBRunD7dzgez88vh7sC6W/hlW97wW2XWpf6ANg\nbJHn2Bk6r3o7tIjcjr73w+Prwb5Q+mtY1fteYNtej4+gFwznBY9ZAKA57fevx9eC/aD017Cq973A\ntkvuB9Civ2sVeI6LnW38MY33MG/Fb2nWaJgEYB9oFGkKgOuLTNnSaox5K/uXxpi3ROT3AH4BYCSA\np0XktwDeAbAZgJ/DRkIvMca8k2vjIrIBgP2zft3P+fvErL89aYx5N8d2RgNw56vv6yxvnr0dY0zi\n6HK17XsMP4CmOXUD8IiIXAT9wOoGveJ0crDeGwAuy7cREfkaMqvSutX0dwsinqGPjTEPldDmtLAf\ngP0AyfpBTwB3isjbAO6CHix8BP1i7gs9sToRWvwI0MJQvyihvWljXwD7ApL1hbUAPCYiM6FXw16E\n1iFpg9YkORg6Rr0zdN9/Z4x5qYT2po19AewLKOH4yLFf0F4D4B9GC6LWCvYDsB8geT8YCuAJEZkE\nLYj9LrT24FYAvg09RjLB81TzsVF9SyuCAY0ixbm9W2BbAuAa6EFFW9bjwt9dXaQ9Y2O2J+cVcwA3\nxNhGyVHAatr3mO0+EDqNTHabw3a/BmBwkW28F6PNU9J6L7MfsB9Uoh9Ar4Lk2t9c+/4kgEGVfr+z\nL7AvpNgX2vM81t1GK4AzK/1eZ19gX0irL+TY1iTncSMq/d5mP2A/KFc/gNZ3yPe9EO73JAA9K/1e\nb+RbmhkNxtf6Rt9RJ4vIXdAo1UhohHAhgOkA/mKMedhjm4qt52s7xTdQffsebSPGTBaRbaFRywOh\n09i0QseP3w7gSmPMyiKbaY/RHi/tTgH7AftB0n7wPoBRAPaAHgQMBrABNF1wafD3FwDcEXHfK419\ngX0haV+YB+Dr0HnSd4YWP+0LTQteDL3qNRXAtcaY9320N2XsC+wLpR4fQUR6QbN5DIDZxpgXfbSv\njNgP2A9K6QdvADgVwF4AtoEeHzVBs92eAnCTMaaqa/U0AtH3JxERERERERFR6VKf3pKIiIiIiIiI\nGgcDDURERERERETkDQMNREREREREROQNAw1ERERERERE5A0DDURERERERETkDQMNREREREREROQN\nAw1ERERERERE5A0DDURERERERETkDQMNREREREREROQNAw1ERERERERE5A0DDURERERERETkDQMN\nREREREREROQNAw1ERERERERE5A0DDURERERERETkDQMNREREREREROQNAw1ERERERERE5A0DDURE\nRERERETkDQMNREREREREROQNAw1ERERERERE5A0DDURERJSYiEwUkXYRaRORgZVuTxQiMjVo85RK\nt4WIiKgeMdBAREREjcYENyIiIkoBAw1ERESUiirPHJBKN4CIiKheMdBAREREaWHmABERUQNioIGI\niIjSxMwBIiKiBsNAAxERERERERF5w0ADERER5SUifUTkYhGZLSLLReQTEXlERL5e4DE3ikg7gLHB\nr8YFtRrc25w8j+0mIj8UkSki8rGIrAqe8yEROU5Eih67iMguInK7iMwXkRUi8q6IXC0iQyPucz8R\n+a6I3CEib4rIUhFZKSIfisi9IvINEcmZqSEidwX795mIdC7yPM3BPraLyANR2kZERFQLOlW6AURE\nRFSdRGQ4gEcBbAhba6ELgD0B7CUiNwB4IsdD3doMgtx1GtpzPN9IAPcA2CjrMX0B7A1gHwCniMgh\nxpgFedr8IwC/g15MCbex
CYCTARwlIt/IubP28U0APsrT7g0BHBLcThSRw4wxy7PWuRbAYQD6ADgU\nwO0Fnu4AAOsHz3NdoXYRERHVEjGGNZqIiIgok4j0AjALQP/gV5MA/A3AAgBDAZwOYCcALwAYCT1Z\nHmyMeV9ENgSwNoAbg3WmAzg+6ylajTFvO8+3DYBnAHQH8DzsGfkAAAd6SURBVD8Afw4e9wGAdaEn\n998B0ALgOQC7G2Pastp8GIC7grYsBnAxgMeDP+8J4GfB8gIAQwBMNcbsmbWNZgArATwG4EEArwL4\nFEAvAJtCAxa7Bqv/zRhzfNbjBcDc4HV72BjzVeQhIndDgxELAWxkjFmTb10iIqJawkADERERdSAi\nvwPwY+hJ+y+NMZdk/b0ZwGQA+wa/+jLQ4KzzGHT4RIcT+hzP9wqArQG8BGBfY8wXOdbZL3hOAfBt\nY8x1zt9aAMyBZh0sBrCLMebNrMdvBeBpAL2D9j6eq10isqkx5t0CbT0bwNnQrIwtjDHvZP39XABn\nAWgDMMgY81GObawH4ENodunlxpgf53s+IiKiWsMaDURERJQhOGk/HnoyPjM7yAAAQTbBiQBWe3i+\nAwFsE9ydmCvIEDznQwDuhAYajsv683jokAsAOC87yBA8fhaAC4u1p1CQIXA+NAtBoJkW2a6HvnZN\nACbk2cax0OwMALihWJuIiIhqCQMNRERElG0EgHWC5ZvyrRRcqX/Yw/OND36+YYx5rci6YU2IkVmF\nIfcOmwUd4pHPDchdMyInURuKyFAR2SrIitgSmo0AANtlP8YYMxda2yJXQCQU/v5FY8x/o7aHiIio\nFrAYJBEREWXbxlmeXmTd5wEcWOLz7RT8HBbMVhFFCzQYsjC4H7Z5jjHm83wPMsYsFJH3AAwqtHER\nOQbACQBGAeiWb3PQQpW5XAstXrm5iIw2xjztbHsEdJgIi0ASEVFdYkYDERERZVvHWc45u4PjEw/P\nF868EPfWPavNJkJ7wzbnm56yi4j8C5oVMRZA1wLPD+QPQtwLGwTJLoR5YvBzJYBbI7SXiIiopjCj\ngYiIiLK5J+HFhhnkPGGPqTn4+TSAU2I8zi2yGLYjyrCIQm0+E8D+wXamArgKWqDyY2PMii83IPI4\ngN3zbcsYs1pE/g7gRwCOEJHTjDHLRaQLgG8G27/bGPO/CO0lIiKqKQw0EBERUTZ36MEGAN7OtyI0\nG6FUnwXPs16EGg35fA496d8gwrphBkUuJwZ/e9IYs1eBbYQZFIVcCw009ATwdWiWxGHQqT8NWASS\niIjqFIdOEBERUbZXneWRRdYt9PeoRRdfDn4OFZGBER+TLWzzYBFZO99KItIXeeoziMg6APoFd+8o\nsI0eALYo1iBjzGwAz0IDIOHwiROCn3ONMVOKbYOIiKgWMdBARERE2V4EEE4xeWy+lUSkP4B9C2xn\nZfCzS5Hnu99Z/nnR1uX2aNgs5J9SEtAT/nxDJ9xMzx4FtnEyomeFXhv8HCMi4wDsCWYzEBFRnWOg\ngYiIiDIYY1qhJ8ICYHsR+Un2OiLSDOAa6OwP+cwPfm5a5CnvAjA7eL5TROSkQisH00welPXre4Pn\nEwBnicjQHI/bEsAZyJ9p8SmARcHykSLSYd9EZCSA8wpsI9ttAJYEy/+AHnsZFJg2lIiIqNYx0EBE\nRES5nAfgQ+iJ+yUicouI7CciO4jIN6FDAvYD8EKBbTwT/FxfRC4TkR1FZLPg9uUQCWNMO7RAYnhC\n/lcReUhEjhWRnYPn3E9EfiEiT0OHSYxxn8gYsxrA94O76wB4TkR+LiKjRGQXEfkltNikgdac6JDV\nYIwxAG4J/rYdgKdF5FsiMkJE9hSRSwE8DmAFgDdzbSPHNpcDmARbP8IAmGKMeb/YY4mIiGqV6Hcq\nERERUaYgA+ARaN2C7JNqA+BGAE8CuD64P9g9gQ5qGbwCYHCOx79njMnIdBCRrQHcCWBI+K
sczQoP\nXH5tjLkwR5tPB/Bb6MWU7McvhQY0fgadunKqMWbPrMf3BvAYgO3zPP9CAIcDOD/fNnK0aSSAaU7b\njzLG3FboMURERLWMGQ1ERESUUzADxFYALoFewV8JHV4wBcCRxphwhobwlv34ZQB2BfBHAK8BWFZk\n/f8C2BLAROhQiPeh2QOrAMyDBgAuADAiV5Ah2MZl0Gkn7wbwSdDm96C1EnYyxvw7XDVPG/4HYDSA\nswDMDJ5/SdD+SwBsb4x5qtA2cmxzOmwGxCIA9xR7DBERUS1jRgMRERFRikSkF4CPAXQFcJUx5vtF\nHkJERFTTmNFARERElK6jAHQLlq+vZEOIiIjKgRkNRERERCkJZueYBWAogGnGmF0r3CQiIqLURZ0D\nmoiIiIgiEJG1oTNfrAvgJ9AggwHwm0q2i4iIqFwYaCAiIiLy6zQAZzv3DYB/GmPur1B7iIiIyoqB\nBiIiIiL/DIA1AOYC+AeAiyvbHCIiovJhjQYiIiIiIiIi8oazThARERERERGRNww0EBEREREREZE3\nDDQQERERERERkTcMNBARERERERGRNww0EBEREREREZE3DDQQERERERERkTcMNBARERERERGRNww0\nEBEREREREZE3DDQQERERERERkTcMNBARERERERGRNww0EBEREREREZE3DDQQERERERERkTcMNBAR\nERERERGRNww0EBEREREREZE3DDQQERERERERkTcMNBARERERERGRNww0EBEREREREZE3DDQQERER\nERERkTcMNBARERERERGRN/8fLeuuH1J9kooAAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "metadata": { + "image/png": { + "height": 377, + "width": 525 + } + }, + "output_type": "display_data" + } + ], + "source": [ + "rides[:24*10].plot(x='dteday', y='cnt')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Dummy variables\n", + "Here we have some categorical variables like season, weather, month. To include these in our model, we'll need to make binary dummy variables. This is simple to do with Pandas thanks to `get_dummies()`." + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
yrholidaytemphumwindspeedcasualregisteredcntseason_1season_2...hr_21hr_22hr_23weekday_0weekday_1weekday_2weekday_3weekday_4weekday_5weekday_6
0000.240.810.03131610...0000000001
1000.220.800.08324010...0000000001
2000.220.800.05273210...0000000001
3000.240.750.03101310...0000000001
4000.240.750.001110...0000000001
\n", + "

5 rows × 59 columns

\n", + "
" + ], + "text/plain": [ + " yr holiday temp hum windspeed casual registered cnt season_1 \\\n", + "0 0 0 0.24 0.81 0.0 3 13 16 1 \n", + "1 0 0 0.22 0.80 0.0 8 32 40 1 \n", + "2 0 0 0.22 0.80 0.0 5 27 32 1 \n", + "3 0 0 0.24 0.75 0.0 3 10 13 1 \n", + "4 0 0 0.24 0.75 0.0 0 1 1 1 \n", + "\n", + " season_2 ... hr_21 hr_22 hr_23 weekday_0 weekday_1 weekday_2 \\\n", + "0 0 ... 0 0 0 0 0 0 \n", + "1 0 ... 0 0 0 0 0 0 \n", + "2 0 ... 0 0 0 0 0 0 \n", + "3 0 ... 0 0 0 0 0 0 \n", + "4 0 ... 0 0 0 0 0 0 \n", + "\n", + " weekday_3 weekday_4 weekday_5 weekday_6 \n", + "0 0 0 0 1 \n", + "1 0 0 0 1 \n", + "2 0 0 0 1 \n", + "3 0 0 0 1 \n", + "4 0 0 0 1 \n", + "\n", + "[5 rows x 59 columns]" + ] + }, + "execution_count": 18, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "dummy_fields = ['season', 'weathersit', 'mnth', 'hr', 'weekday']\n", + "for each in dummy_fields:\n", + " dummies = pd.get_dummies(rides[each], prefix=each, drop_first=False)\n", + " rides = pd.concat([rides, dummies], axis=1)\n", + "\n", + "fields_to_drop = ['instant', 'dteday', 'season', 'weathersit', \n", + " 'weekday', 'atemp', 'mnth', 'workingday', 'hr']\n", + "data = rides.drop(fields_to_drop, axis=1)\n", + "data.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Scaling target variables\n", + "To make training the network easier, we'll standardize each of the continuous variables. That is, we'll shift and scale the variables such that they have zero mean and a standard deviation of 1.\n", + "\n", + "The scaling factors are saved so we can go backwards when we use the network for predictions." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "quant_features = ['casual', 'registered', 'cnt', 'temp', 'hum', 'windspeed']\n", + "# Store scalings in a dictionary so we can convert back later\n", + "scaled_features = {}\n", + "for each in quant_features:\n", + " mean, std = data[each].mean(), data[each].std()\n", + " scaled_features[each] = [mean, std]\n", + " data.loc[:, each] = (data[each] - mean)/std" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Splitting the data into training, testing, and validation sets\n", + "\n", + "We'll save the last 21 days of the data to use as a test set after we've trained the network. We'll use this set to make predictions and compare them with the actual number of riders." + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Save the last 21 days \n", + "test_data = data[-21*24:]\n", + "data = data[:-21*24]\n", + "\n", + "# Separate the data into features and targets\n", + "target_fields = ['cnt', 'casual', 'registered']\n", + "features, targets = data.drop(target_fields, axis=1), data[target_fields]\n", + "test_features, test_targets = test_data.drop(target_fields, axis=1), test_data[target_fields]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We'll split the data into two sets, one for training and one for validating as the network is being trained. Since this is time series data, we'll train on historical data, then try to predict on future data (the validation set)." 
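The two time-ordered splits (last 21 days for testing, then the last 60 remaining days for validation) can be sketched end to end on a toy hourly frame — the 100-day total here is an assumption for illustration only:

```python
import numpy as np
import pandas as pd

# Toy hourly data: 100 days, one row per hour
data = pd.DataFrame({'cnt': np.arange(100 * 24)})

# Save the last 21 days for testing
test_data = data[-21*24:]
rest = data[:-21*24]

# Hold out the last 60 remaining days for validation; the rest is for training
train, val = rest[:-60*24], rest[-60*24:]
```

Because the rows stay in time order, every training row precedes every validation row, which in turn precedes the test set — the network is always evaluated on "future" data.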
+ ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Hold out the last 60 days of the remaining data as a validation set\n", + "train_features, train_targets = features[:-60*24], targets[:-60*24]\n", + "val_features, val_targets = features[-60*24:], targets[-60*24:]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Time to build the network\n", + "\n", + "Below you'll build your network. We've built out the structure and the backwards pass. You'll implement the forward pass through the network. You'll also set the hyperparameters: the learning rate, the number of hidden units, and the number of training passes.\n", + "\n", + "The network has two layers, a hidden layer and an output layer. The hidden layer will use the sigmoid function for activations. The output layer has only one node and is used for the regression; the output of the node is the same as its input. That is, the activation function is $f(x)=x$. A function that takes the input signal and generates an output signal, taking a threshold into account, is called an activation function. We work through each layer of our network calculating the outputs for each neuron. All of the outputs from one layer become inputs to the neurons on the next layer. This process is called *forward propagation*.\n", + "\n", + "We use the weights to propagate signals forward from the input to the output layers in a neural network. We also use the weights to propagate error backwards from the output into the network to update our weights. This is called *backpropagation*.\n", + "\n", + "> **Hint:** You'll need the derivative of the output activation function ($f(x) = x$) for the backpropagation implementation. If you aren't familiar with calculus, this function is equivalent to the equation $y = x$. What is the slope of that equation? 
That is the derivative of $f(x)$.\n", + "\n", + "Below, you have these tasks:\n", + "1. Implement the sigmoid function to use as the activation function. Set `self.activation_function` in `__init__` to your sigmoid function.\n", + "2. Implement the forward pass in the `train` method.\n", + "3. Implement the backpropagation algorithm in the `train` method, including calculating the output error.\n", + "4. Implement the forward pass in the `run` method.\n", + " " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Shape of weight matrices\n", + "\n", + " Node j1 -> | w1|w2|w3|....wn\n", + " ---------------|--|-------\n", + " Node j2 -> | w1|w2|w3|....wn\n", + " ---------------|--|-------\n", + " Node j3 -> | w1|w2|w3|....wn\n", + " . ...\n", + " . ...\n", + " Node jm ...\n", + " \n" + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "class NeuralNetwork(object):\n", + " def __init__(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + " \n", + " #print(\"DEBUG: hidden units \",self.hidden_nodes)\n", + " \n", + "\n", + " # Initialize weights\n", + " self.weights_input_to_hidden = np.random.normal(0.0, self.hidden_nodes**-0.5, \n", + " (self.hidden_nodes, self.input_nodes)) # shape hidden_n x n_features\n", + "\n", + " self.weights_hidden_to_output = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.output_nodes, self.hidden_nodes)) # shape 1 x hidden_n\n", + " self.lr = learning_rate\n", + " \n", + " #### Set this to your implemented sigmoid function ####\n", + " # Activation function is the sigmoid function\n", + " self.activation_function = self._sigmoid\n", + " self.hidden_gradient_function = self._gradient_sigmoid\n", + " 
self.output_activation = lambda x: x\n", + " \n", + " def _sigmoid(self, x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " def _gradient_sigmoid(self, output):\n", + " return output * (1-output)\n", + " \n", + " \n", + " def train(self, inputs_list, targets_list):\n", + " # Convert inputs list to 2d array\n", + " inputs = np.array(inputs_list, ndmin=2).T\n", + " targets = np.array(targets_list, ndmin=2).T\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + " # TODO: Hidden layer\n", + " \n", + " # signal into hidden layer\n", + " hidden_inputs = np.dot(self.weights_input_to_hidden, inputs) # shape: hidden_n x n . nx1 -> hidden_n x 1\n", + " # signals from hidden layer\n", + " hidden_outputs = self.activation_function(hidden_inputs) # shape: hidden_n x 1\n", + " \n", + " # TODO: Output layer\n", + " \n", + " # signals into final output layer\n", + " final_inputs = np.dot(self.weights_hidden_to_output, hidden_outputs) # 1 x hidden_n . hidden_n x 1 -> 1x1\n", + " # signals from final output layer - same, h(f(x)) = f(x)\n", + " final_outputs = self.output_activation(final_inputs) # shape 1x1\n", + " \n", + " #### Implement the backward pass here ####\n", + " ### Backward pass ###\n", + " \n", + " # TODO: Output error\n", + " \n", + " # Output layer error is the difference between desired target and actual output.\n", + " output_errors = targets - final_outputs\n", + " \n", + " # since the gradient of f(x) = x is 1, the error and output error are the same \n", + " \n", + " # TODO: Backpropagated error\n", + " \n", + " # errors propagated to the hidden layer\n", + " # shape: 1xhidden_n . 
1x1\n", + " hidden_errors = np.multiply(self.weights_hidden_to_output, output_errors) # shape 1 x hidden_n\n", + " \n", + " # hidden layer gradients\n", + " # Calculate using sigmoid gradient of the hidden output\n", + " # shape hidden_n x 1\n", + " hidden_grad = self.hidden_gradient_function(hidden_outputs)\n", + " # TODO: Update the weights\n", + " \n", + " # update hidden-to-output weights with gradient descent step \n", + " self.weights_hidden_to_output += self.lr * np.dot(output_errors, hidden_outputs.T) # shape 1 x hidden_n\n", + "\n", + " \n", + " # update input-to-hidden weights with gradient descent step\n", + " # shape hidden_n x input_n += (hidden_nx1 * hidden_nx1) . (1 x input_n)\n", + " self.weights_input_to_hidden += self.lr * np.multiply(hidden_grad, hidden_errors.T).dot(inputs.T) \n", + " \n", + " def run(self, inputs_list):\n", + " # Run a forward pass through the network\n", + " inputs = np.array(inputs_list, ndmin=2).T\n", + " \n", + " #### Implement the forward pass here ####\n", + " # TODO: Hidden layer\n", + " hidden_inputs = np.dot(self.weights_input_to_hidden, inputs) # signals into hidden layer\n", + " hidden_outputs = self.activation_function(hidden_inputs) # signals from hidden layer\n", + " \n", + " # TODO: Output layer\n", + " final_inputs = np.dot(self.weights_hidden_to_output, hidden_outputs) # signals into final output layer\n", + " final_outputs = self.output_activation(final_inputs) # signals from final output layer\n", + " return final_outputs" + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def MSE(y, Y):\n", + " return np.mean((y-Y)**2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training the network\n", + "\n", + "Here you'll set the hyperparameters for the network. The strategy here is to find hyperparameters such that the error on the training set is low, but you're not overfitting to the data. 
If you train the network too long or have too many hidden nodes, it can become overly specific to the training set and will fail to generalize to the validation set. That is, the loss on the validation set will start increasing as the training set loss drops.\n", + "\n", + "You'll also be using a method known as Stochastic Gradient Descent (SGD) to train the network. The idea is that for each training pass, you grab a random sample of the data instead of using the whole data set. You use many more training passes than with normal gradient descent, but each pass is much faster. This ends up training the network more efficiently. You'll learn more about SGD later.\n", + "\n", + "### Choose the number of epochs\n", + "This is the number of times the dataset will pass through the network, each time updating the weights. As the number of epochs increases, the network becomes better and better at predicting the targets in the training set. You'll need to choose enough epochs to train the network well but not too many or you'll be overfitting.\n", + "\n", + "### Choose the learning rate\n", + "This scales the size of weight updates. If this is too big, the weights tend to explode and the network fails to fit the data. A good choice to start at is 0.1. If the network has problems fitting the data, try reducing the learning rate. Note that the lower the learning rate, the smaller the steps are in the weight updates and the longer it takes for the neural network to converge.\n", + "\n", + "### Choose the number of hidden nodes\n", + "More hidden nodes give the model more capacity to fit the training data, but past a point the extra capacity hurts generalization. Try a few different numbers and see how it affects the performance. You can look at the `losses` dictionary for a metric of the network performance. If the number of hidden units is too low, the model won't have enough capacity to learn the patterns in the data; if it is too high, it has too many free parameters and will tend to memorize the training set rather than generalize. 
The trick here is to find the right balance in the number of hidden units you choose." + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress: 99.9% ... Training loss: 0.067 ... Validation loss: 0.162" + ] + } + ], + "source": [ + "import sys\n", + "\n", + "### Set the hyperparameters here ###\n", + "epochs = 1000\n", + "#learning_rate = 0.03\n", + "learning_rate = 0.07\n", + "N_i = train_features.shape[1]\n", + "hidden_nodes = int(N_i / 1.2)\n", + "output_nodes = 1\n", + "\n", + "\n", + "network = NeuralNetwork(N_i, hidden_nodes, output_nodes, learning_rate)\n", + "\n", + "losses = {'train':[], 'validation':[]}\n", + "for e in range(epochs):\n", + " # Go through a random batch of 128 records from the training data set\n", + " batch = np.random.choice(train_features.index, size=128)\n", + " for record, target in zip(train_features.loc[batch].values, \n", + " train_targets.loc[batch]['cnt']):\n", + " network.train(record, target)\n", + " \n", + " # Printing out the training progress\n", + " train_loss = MSE(network.run(train_features), train_targets['cnt'].values)\n", + " val_loss = MSE(network.run(val_features), val_targets['cnt'].values)\n", + " sys.stdout.write(\"\\rProgress: \" + str(100 * e/float(epochs))[:4] \\\n", + " + \"% ... Training loss: \" + str(train_loss)[:5] \\\n", + " + \" ... 
Validation loss: \" + str(val_loss)[:5])\n", + " \n", + " losses['train'].append(train_loss)\n", + " losses['validation'].append(val_loss)" + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "(0.0, 0.8)" + ] + }, + "execution_count": 25, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAABBwAAALJCAYAAAAXuDCjAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAewgAAHsIBbtB1PgAAIABJREFUeJzs3Xd4VNX2N/DvnjQCSUhIqKEIFgzSO9JRFOki8JMiSFGQ\nC/IiXEVQUAQULqiIqFxREUVsFxAQUZCiNDEgAgZEpSYhQAKBBFJn9vvHZE6mz5mTKWHy/TxPnpyZ\n2bPPPkn04axZa20hpQQRERERERERkSfp/L0AIiIiIiIiIgo8DDgQERERERERkccx4EBERERERERE\nHseAAxERERERERF5HAMORERERERERORxDDgQERERERERkccx4EBEREREREREHseAAxERERERERF5\nHAMORERERERERORxDDgQERERERERkccx4EBEREREREREHseAAxERERERERF5HAMORERERERERORx\nDDgQERERERERkccx4EBEREREREREHseAAxERERERERF5HAMORERERERERORxDDgQERERERERkccx\n4EBEREREREREHseAAxERERERERF5nM8CDkKI2kKIxUKI40KIbCFEhhDigBBimhAi3EPnqCOEeE0I\nkSiEuCqEyC86zx4hxItCiMqeOA8REREREREROSeklN4/iRB9AHwCIAqA9QkFgJMAekkp/ynBOR4D\n8B6AcDvnMJ3nCoBHpZTbtJ6HiIiIiIiIiFzzesBBCNEMwG4A5QBkA5gPYCeMgYFHATxRNPQkgJZS\nyhsaztEewC4YgwoGACsBbACQCqA2gJEA+hS9fhNAQynlGY2XREREREREREQu+CLgsAtARwAFADpK\nKQ9YvT4VwH9gzEp4WUo5R8M5NgLoVTTHBCnlcjtjFgF4pmjMMinl0+6eh4iIiIiIiIjU8WrAQQjR\nCsAvMN7kvyel/JedMQLAMQAJAK4CqCKl1Lt5ngwAMQDSpZRVHIyJApBZtJZDUspW7pyDiIiIiIiI\niNTzdtPI/mbHK+0NkMaIx6qih9EAumo4TyiMgYTTjgZIKa8DSDcbT0RERERERERe4u2AQ4ei7zcA\nHHQybpfZcXsN5/kTxv4MdR0NEEJEAogzG09EREREREREXuLtgEMCjJkHf0spDU7GnbB6j7veK/oe\nK4QY52DMLLPjdzWcg4iIiIiIiIhUCvbWxEKIMBgzCiSAZGdjpZSZQogbAMoDqKXhdB/CmBkxAsDb\nQogWMO5ScQHGXSqGA3i4aC1zpZQ7NJyDiIiIiIiIiFTyWsABQKTZcbaK8aaAQ4S7JyrKnhhVtFvF\nTABji77MbQcwX0q53d35iYiIiIiIiMg93iypKGd2nK9ifB6MfRjCtZxMCJEAYCSARjBmMlh/3Qtg\nrBCihpb5iYiIiIiIiEg9bwYccs2O1ewKEQZjYCDH3RMJIToC2AugD4zlG8MBVCs6by0A/wJwE8Cj\nAA4UBSeIiIiIiIiIyEu8WVKRZXaspkyiQtF3NeUXCiFEKIA1AKJg7NnQRkp52WxIKoD3hBA/AUgE\nUB3AxwBau3
meGygOilxR8RY9AGeNMomIiIiIiKhs0gEIUjGuEoyVAHlSygquBpc2Xgs4SCnzhBDp\nAGIB1HQ2VggRDWPAQQI47+apegCoUfTepVbBBvP1JAkhPoWxt0MLIUQjKeVRN84ThuI/iCpurpGI\niIiIiIhIqzB/L0ALb2Y4AMBxAB0B3CGE0DnZGvNuq/e4w7w84pCLsQdR3EzybgDuBBxk8WFlREcD\nISFARgZgUK7KAJTPAACUCy2H2AqxDidLTQVk0YxhYUBcnBsrIfKS/Px8XL58GZUrV0ZoqJpKKKJb\nD//OqSzg3zmVBfw7p1uZXq+HXq93Oe7yZeXzdOlsXGnl7YDDbhgDDhUAtADwq4Nxnc2O97h5jkKz\nY1fXE+LgfWpcAVAFqAzgEjZvBtq1A+rVA06fLhpR4SLw72oAgAfqP4BvHv3G4WTR0cC1a8bjLl2A\nLVvcXA2RFxw6dAgtWrTAli1b0Lx5c38vh8gr+HdOZQH/zqks4N85lQVVqlQxBR3UlPWXOt5sGgkA\n682OR9kbIIQQAEYUPcwEsMPNc5w2O+7oYqx5YOO0w1EqmIJR0jzOZCiOZxQa1Mcz5C0ZqyIiIiIi\nIiJyzKsBBynlrwB+hrHJxRghRBs7w6bBWBYhAbwppbTIKxFCdBZCGIq+PrTz/h9h3IFCAHhKCNHQ\n3lqEEA8BeLjoYYqU8rCmiypiKqOwDDgUJ1i4CjiYv48BByIiIiIiIgo03i6pAIDJMJZJhAPYKoSY\nD2MWQziAIQCeKBr3J4DXncxj97ZcSnlNCPEagDkw7lSxVwixFMBWAFcBVAXQH8beDbqieZ4r4TWZ\n9W0wf7L4x1mgL1A9FwMOREREREREFGi8HnCQUh4WQgwG8CmMAYH51kNgDDb0klLe0HiOuUKIGBiD\nGxUAPF/0ZX2efADPSynXaDmPObsZDnqWVBAREREREREB3u/hAACQUn4LoDGAN2AMLtyAMfvgVwDP\nAmgupXTWU0Fafbd3jqkAWgF4D8bdJ67D2BgyE0AijNkTDaWUb5ToYorY7+FgluFgYIYDERERERER\nlV2+KKkAAEgpz8PYr2Gam+/bBSBI5djfAPzL/dW5z25JBQRgCAJ0evZwICIiIiIiojLNJxkOgchu\nSQWgZDmwpIKIiIiIiIjKMgYcNLJbUgEoAQdXTSOFABCZCoRnOMiWICIiIiIiIrp1+aykIgAUhRiM\n1R0OgwRFjSNdZTjoKx8GhrUE9KHI3f83gBoeWiaRdtWrV8fs2bNRvXp1fy+FyGv4d05lAf/OqSzg\n3zmVBUFBSncBvT/XoZWQzOdXRQiRDCAeiAeQjHXrgP79gfh4IDXVbOC/KwMV0lEvph7+efofh/MF\nTboHhrgkAEDVlDFI++8Kr66fiIiIiIiIbi01a9ZESkoKAKRIKWv6ez3uYkmFRo57OKjLcJDh6cqx\nXpfjyaURERERERER+R0DDhrpHSW0qGwaKXXFPR6EZGULERERERERBRYGHDSyznAQwvSCuqaRCDIL\nOBRlRRAREREREREFCn60rpF1wEGnK8p6UNk0EsLsdQN/DUREROSeli1bIi0tzd/LICIiFapVq4bE\nxER/L8PneKerkXVJhU2Gg4EZDkREROQ9aWlppkZiREREpRIDDho5LqlQmeGgM4tYsIcDERERaaTT\n6bgtIBFRKXXhwgUYTDePZRDvdDWyV1JhfEFd00hzzHAgIiIirapXr47k5GR/L4OIiOww29ayTGLT\nSI2sg1TWJRWFhkJImz0zHU3GuA8REREREREFFgYcNDL1cLApqdAXZyusProazZY3w2dHP3M+GTMc\niIiIiIiIKMAw4KCRq5IKAJi5fSYOpx3GC9tfcDoXSyqIiIiIiIgo0DDgoJHjkori4MHF7IsAgEs3\nLrmYjCUVREREREREFFgYcNDIYUmFWfAgT58HALhRcAN6g9U+mmZUdnogIiIi
IiIiumUw4KCRmpIK\nc9n52c5m89i6SqtCQyEOph50GnghIiIiIiKiwMGAg0YOSyr09vsxZOVnOZ4LgX8T/ujXj6Ll+y3x\n1LdP+XspRERERERE5AMMOGhkneFgr6TC3PW86w7nkiLwAw7bT2+3+E5ERERERESBjQEHjRz3cHCQ\n4ZBXnOEgbZo2BH5JhV4af2AGGfjXSkRERERERAw4aOZ4lwrXGQ7W7y0LJRWmQAMDDkRERERERGUD\nAw4auds00ryHg3WGgywDAQdTs0hTpgMRERERBZYlS5ZAp9NBp9PhmWee8dl5r127ppy3UqVKPjtv\nadC0aVPl2o8cOeLv5RDZYMBBI4c9HBw0jbTMcLCMOJSFHg4sqSAiIqKy7OzZs8qNoae+5syZ4+/L\nskso/zAuG+f1J9M1l8Vrp1uD/Y/jySW9VYzAnZKKfH2B1auBfxNuCjRwW0wiIiIqy3hj6D3StlEa\nEfkZAw4aOS6pcN00Mic/z+K1MpHhYGCGAxEREZVdUVFRmDhxotMxBw4cwIEDByCEQI0aNfDwww87\nHd+6dWtPLtEj/J3dwIAOUenCgINGJdkWM7fQKuAQ4D0cpJSQMP6gGHAgIiKisigmJgZvvfWW0zEv\nv/wyDhw4AAC48847XY4vbSZPnozJkyf7/LwVK1aE3jr9mIhKBfZw0MjdkgrzppF51gEHEdg34eZB\nBjaNJCIiIiIiKhsYcNCoJE0jy1qGg3mQgRkOREREREREZQMDDhqVZFvMnIJci9cCvYeDeaNINo0k\nIiIi8qz+/fsrO1ds2LABAJCRkYFFixahffv2iI+PR0hICIKCgmzem5qaihUrVuCxxx5D06ZNERMT\ng9DQUFSqVAkNGzbEk08+iV27dqlah5ptMb/55htlzIABA5Tnt2zZgoEDB6Ju3boIDw9HlSpVcN99\n9+GDDz6AweD8Ayu122JGR0cr465fN34YeObMGTz77LNo1KgRKlasiKioKDRs2BDTpk1DWlqaqus2\n2bhxIx555BHUqlUL4eHhqFmzJrp164YVK1YgL8/4geOUKVOUNfijZCYtLQ1z585Fhw4dUL16dYSF\nhaFKlSpo3bo1Zs6ciVOnTqme6+jRo3jmmWfQunVrxMXFKX83d911F9q3b4/Jkydj8+bNuHnzpsM5\nrl+/jqVLl6JHjx6oVasWypcvj/Lly6N27dpo0aIFBg8ejPfffx/nzp3zxOWTH7CHg0bW/98rLqlw\nP8MBAR5wMM9qYIYDERERkWdZN0zctm0bhg8fjkuXLjltojhv3jzMnj1buaE3H3vt2jVkZmYiKSkJ\nK1asQO/evbF69WpERkaqXo+aMTk5ORg7dizWrFlj8XxGRgZ27NiBHTt2YNWqVfj2228RERFRovMK\nISzGrF69GuPHj8eNGzcsnk9KSlKue8OGDejUqZPTeXNycjBkyBAl2GM614ULF5CamoqdO3di+fLl\n+N///qd6rd7w1ltvYebMmbhx44bFGjIyMpCeno7ExEQsWrQIzz33nMstV6dOnYolS5bY/O1cu3YN\n165dw99//419+/Zh6dKlmDhxot3givnfqfkcAJCSkoLk5GT89ttv+PrrrxEREaEEiejWwoCDRqYe\nDlqaRtr0cAjwbTFZUkFERETkG7///jtee+015ObmIiYmBp06dULVqlWRkZFhk6mQnJwMKSWEELjz\nzjtx9913K59UZ2Zm4vDhw/jzzz8BAJs2bULv3r2xc+dOj90sSykxfPhwrFu3DiEhIWjXrh3uvPNO\nFBQUYM+ePcqn7bt378a4ceOwevVql/OpOacQAmvXrsWYMWMAAHfccQdat26NChUq4O+//8ZPP/0E\ng8GA69ev4+GHH8bx48dRpUoVu/MZDAb07dsXP/74o/JzqVKlCjp37oyKFSvizJkz2LVrFw4dOoQ+\nffrg3nvvdedH5DEvvPAC5s+frwRdwsPD
0aVLF8THxyM9PR07duzAtWvXUFhYiLlz5yIlJQUffPCB\n3bleeuklvPHGG8pcVatWRZs2bVC1alUAwJUrV5CUlIQTJ044/J38+eef6Nu3L/Ly8iCEQFhYGNq0\naYN69eqhXLlyyMrKwunTp3HkyBFkZ2d77edC3seAg0Zul1TkOWsaGdgZDhYlFWwaSUREROQ1r7zy\nCvR6PaZPn47Zs2cjNDRUeS0/P99ibKNGjbB8+XL07dvX4Q31wYMHMXr0aBw9ehS7d+/Gu+++iwkT\nJnhkrVu2bEFeXh66du2Kjz76CLVr17Z4fd68eXjxxRcBAJ9//jmef/55NGzYsETnNAUFnnrqKVSs\nWBEfffQR+vXrZzHm0KFD6NGjB9LT05GZmYmFCxdi0aJFdud74403LIINL7/8MmbMmAGdrrhyPTU1\nFcOHD8fOnTtx8uTJEq1fi++//x6vvvqqssZBgwbhvffeQ3R0tDImNzcXU6dOxbvvvgshBFauXImO\nHTvi8ccft5grJycHixYtUoINb7/9NsaNG2c3CJWeno5169ahoKDA5jVTUEwIgYceeggff/wxYmNj\nbcbp9Xrs2rULn376aQl/CuQv7OGgkcOSChVNI3MKrXo4BHjTSJZUEBEREXmflBJ6vR7//ve/MW/e\nPItgAwCbxxMmTMDYsWMdBhsAoEWLFti2bRsqVqwIAFi6dKnH1puXl4fmzZtjy5YtNsEGAJg5cybu\nv/9+5bGp7KKkTD+n7777zibYAADNmzfHm2++qYx1dN7c3FzMmzdPudmePn06XnjhBYtgAwDUqFED\nmzZtQkJCgk3QxxeeffZZ5bhHjx5Ys2aNRbABAMqVK4dly5ZhxIgRxi3tpcSMGTNQWFhoMe7QoUNK\nT4aePXti/PjxDjNe4uLi8MQTT9gNUO3evRuAMQDkKNgAAEFBQejWrRs+/PBD9RdMpQoDDhq5W1Jh\n3jSyrPVwsC6pUJPuRkRERETui46Oxssvv+zROStXroyePXtCSomTJ08iJSWlxHOaShsWL16MkBD7\nH9gBwOjRo5XjAwcOlPi8gPEmd9iwYWjTpo3DMYMGDVJ6RqSlpdm95rVr1yIzMxNSSsTFxWHWrFkO\n5ytfvjxee+015bp95ZdffsHRo0chpYROp8OyZcucnv+NN95AhQoVAAAXL17EunXrLF4376NQuXJl\nzesyzRMcHIyYmBjN81Dpx5IKjdwtqbhZcBOFhkIE64LtlFQE9qf+1jtTSEgI+L5RDhEREanXsiXg\nZpP+UqVaNSAx0d+r8C0hBPr27YuwsDC333vhwgX88ssvOHHiBDIzM3Hz5k2LD4n++OMP5fjw4cOI\nj48v8XpjY2PRuXNnp2OaNWumHJ89e7bE5zTd8A8cONDpuJCQENxzzz345ZdflHNbX/POnTsBGH/u\nAwYMcPlz79mzJypVqoQrV674LOiwfft2AMY1durUCbfddpvT8TExMRgwYAA++eQTAMCOHTswaNAg\n5fVatWopx5s3b8b58+ctnlOrVq1auHz5MgoKCrBixQo8+eSTbs9BtwYGHDRyd5cKAMjOz0Z0uegy\n18PBuozCIA3QCSbXEBERlWZpaYAHPsgmH2vRooVb43/77Tc899xz2L59u8vtJ03S09O1LM2CEEJV\nPwbzVPtr166V+LwmjRo1KvG5Dx8+rBw7y5YwCQoKQvPmzbFt2zaVqyy53377TTlW27Cyffv2SsDh\n0KFDFq81bNgQCQkJOH78OC5evIgmTZpg5MiR6NevH9q1a6c62DV48GAcOnQIUkqMHz8eGzZswJAh\nQ3DfffehWrVqKq+ObgUMOGhkneHgqqQCMPZxiC4XjVyrHg4I8B4O1o0i9QY9gnX80yMiIirNbvV/\n89/q69fKnTT3r7/+GkOHDkVhYaHNlpH2mDIesrKynI5Ty9QXwhlTuYWU0m7zQV+cG4Ddc1++fFk5\nVvsp
f82aNVWN8xTzNdapU0fVe8yzIOwFl1atWoUHH3wQV69eRWZmJpYsWYIlS5YgNDQUzZs3R6dO\nndCjRw907tzZ4d/UlClT8OOPPyrBl82bN2Pz5s0AgHr16qFjx47o2rUr+vXrp+p3RaUX7/o0su7h\noJRUOGgaCRQ3jizrJRVsHElERFT6lbVyhEARHh6uatz58+cxcuRI6PV6ZVvMcePGoX379qhbty4q\nVqxo0WRyypQpWLJkCQCozoRwxZe9DLxxbvPtGsuXL6/qPaa+EL5ivkZTbwZXTOOklHaDSy1atMCR\nI0cwZ84crFmzRjlHQUEB9u/fj/3792PhwoW47bbbMG/ePAwZMsRmjpCQEHz33XdYvnw5lixZgr/+\n+kt57dSpUzh16hQ+/vhjhIWF4YknnsD8+fN9/rMjz2Beu0aOSyocx3BMW2Pm6ctW00jrAAO3xiQi\nIiLyr6VLlyInJwcA0KFDB/z++++YMmUKWrdujcqVK9vsaOGprIZAYn4DbNq5wZUbN254azl2ma9R\n7blN44QQiIyMtDumRo0aeO+993Dp0iVs27YNs2fPRvfu3REREaFky5w5cwbDhg3DSy+9ZHcOnU6H\np556CidOnMCxY8fwzjvvYPjw4ahTp44yR35+Pt5++220b9/eInhCtw4GHDTSWlIB2AYcAr2Hg3WA\ngRkORERERP5laiYIAC+99JLL2ntPNGwMNHFxccpxcnKyqveoHecp5iU2586dU/WeM2fOKMfm12hP\nWFgYunbtilmzZmHLli3IyMjAN998g5YtWypZJPPmzbPIYLAnISEB48aNw8cff4zTp0/j6NGjmDBh\nghJ4OHbsGBYuXKhq/VS6MOCgkcOSCidNI01bY1qXVAR6hgNLKoiIiIhKl9TUVOXYVfPGvLw8HDhw\nwK8lEKVR06ZNlWPTbhbOGAwGmyaM3ma+y8fevXtVvcd8XPPmzd06X0hICHr37o3t27crvSAMBgO+\n/fZbt+Zp0KABli5dimeeeUbpHbJhwwa35qDSgQEHjbSUVJgyHKybRkoE9g24TUmFIbADLERERESl\nnU5XfBvgqhzg008/RVZWlsU2mQR06dIFgLHXwdq1a5Gfn+90/ObNm5GRkeHTwE23bt0AGNe4a9cu\nl5kqmZmZWLt2rc373VWhQgV07dpVeXzx4kVN8/Tt2xeAcf1a5yD/YsBBI4clFU6aRpp6OOSXsR4O\nLKkgIiIiKl3q1aunHDv75Dg5ORnPP/88sxvseOSRR5QdFC5fvoxXXnnF4dicnBw8//zzAODTwE2b\nNm3QuHFjAMZMg4kTJzod/8wzzyi9EqpWrYr+/ftbvH716lXV5z5//rxyXKVKFeVYr9fj+vXrquYw\nlYEIISzmoFsHAw4alSTDocz1cLDKaGDTSCIiIiL/6tOnj3L8wgsvYP369TZj9u7di06dOiEjI0P1\nDgdlSbly5TBz5kwAxiDC/PnzMW/ePOj1lv/WTU1NRa9evfDHH3+gXLlyPl/nwoULIYSAlBKbN2/G\n0KFDkZmZaTEmJycHEydOxMqVKwEYb/BfffVVBAdb3tusWrUK99xzD5YuXYqUlBS758vNzcVrr72G\nrVu3Ks/16NFDOc7OzkbNmjUxefJk7Nu3z+G6d+/ejeeee0553LNnT9XXTKUHt8XUyOr/I24FHGwz\nHAL7E3/rjAZmOBARERH51/jx47F06VIkJycjKysLAwYMQKNGjdCoUSMEBwfjyJEjOHz4MIQQuPfe\ne9G0aVMsW7bMb+tVk2HhjyyMZ555Bt999x127twJKSVefPFFvP322+jcuTOioqJw9uxZ7Nq1CwUF\nBWjcuDHatWuH5cuXA7Asa/GmBx54ADNmzMD8+fMhpcTnn3+ODRs2oGvXrqhRowYyMjKwfft2JQgh\nhMCoUaMwcuRIu/MdP34ckydPxuTJk1G3bl00atQIlStXhl6vR1paGv
bt24dr164pc02cOBEJCQkW\nc2RnZ2Pp0qVYunQpoqOj0axZM9SsWRPh4eG4fPkykpKS8Oeffyrja9eubRF8oFsHAw4aGQzF5RSA\ne00jbXo4BHqGA0sqiIiIiFTzRcp9ZGQkNm3ahN69eyufVB89ehRHjx4FAGV3gAceeACffvop5s6d\n69Hzu3uNasb7o8eETqfDpk2bMGTIEGzcuBGAsV/Bl19+qYwRQqB58+ZYu3YtFixYoDwfFRXls3W+\n8sorqFKlCmbMmIGbN28iJyfHopGj6fcdEhKCZ599FnPmzLE7j2nbS5MzZ87g9OnTFmNMcwUHB+OZ\nZ57Bq6++avF6UFAQKlSooPQOuXbtGnbs2GFzLtN52rZti88//xzR0dHaLp78igEHjawDDo4yHEJ0\nISgwFAAwK6kwlLEeDtYlFWwaSURERGSX6SbL3U/rTTd57mjcuDGOHDmCt956C9988w3+/vtv6PV6\nVKtWDU2bNsWwYcPw8MMP26xNzfrVjFG7XjXj1Y5x9+ekZnx4eDjWr1+PjRs3YuXKlThw4ADS09MR\nGxuLu+66C8OGDcNjjz2G0NBQXLlyRXmfp26g1V7TpEmTMGjQIKxYsQJbtmzBP//8g6tXryIqKgp1\n6tTBAw88gDFjxlj097A2ZswY9OzZE99//z327t2Lo0eP4vTp08jMzIQQAtHR0ahfvz46d+6MESNG\n4Pbbb7eZIyIiAhkZGdixYwd+/vlnJCYm4q+//sKlS5eQl5eHChUqoHbt2mjZsiUGDRpkUY5Btx7B\nbrPqCCGSAcQD8QCS0asXsGEDEBRkfL11a+DAAQBxx4GJDZT3xUfGIyXLGDXufVdvbByyEd1W9MGO\nlE3KGN3FptC/85vvLsbH9pzbgw4fdVAen3r6FOrG1PXjioiIiG59NWvWREpKCuLj45GcnOzv5RCR\nCg0bNkRSUhKEEDhx4gTuvPNOfy+JvKyk/682vR9AipSypscX6GVsGqmRXq8uw6FyhcrKseNdKgK7\nxMC6pIJNI4mIiIiorDl27BiSkpIAABUrVmSwgcoEBhw0ctzDwTLgEBVWXJuVrzfuzZunL2M9HAzs\n4UBEREREZZfBYMDTTz8NwFgCMWTIED+viMg3GHDQyPEuFZZNI8uHlIdOGH/Mpl4OthkOgR1w4C4V\nRERERBSopk2bhuXLlys7M1g7efIkevTogZ07dwIAwsLCMHnyZB+ukMh/2DRSo/x8dSUV4cHhCNGF\nIE+fhwK9MeBg2zQysG/AbUoq2DSSiIiIiALEP//8g9dffx1PP/00mjRpgvr16yMyMhJZWVn4448/\ncOTIERgMxn/vCyGwYMEC3HXXXX5eNZFvMOCgkXXAQSmp0FtmOJQLLoeQoKKAQ1GGQ4FVhgNLKoiI\niIiIbl1CCBQWFiIxMRGJiYk2rwkhEBERgcWLF2Ps2LF+WiWR7zHgoFGedZKCgwyHcsHlEKIzBiGU\nDAfrkgpdYAccWFJBRERERIHqgw8+wPr167Fjxw6cOHECly9fRnp6OqSUqFSpEho0aID7778fY8aM\nQaVKlfy9XCKfYsDBTUIYMxvy8iwzHBT2Ag5BRQEHUw8Hg2XTyEDv4cBdKoiIiIgoUFWqVAmjR4/G\n6NGj/b0UolKHTSPdZMpkcNzDwU5JhVWGQ5nbFpMlFURERERERGWOzwIOQojaQojFQojjQohsIUSG\nEOKAEGL/NI8VAAAgAElEQVSaECK8BPPWEUIY3Pw6VdLrsVdSIQRUZjiU7V0q2DSSiIiIiIgo8Pmk\npEII0QfAJwCiAJjyAsIBtADQEsBYIUQvKeU/Gk9hr7jBmRMaz+Mww6F4JTrjV1HWQnhwOIJ1xh+z\nKcOhwDrgEOA9HKxLKJjhQEREREREFPi8HnAQQjQD8DmAcgCyAMwHsBPGgMOjAJ4AcCeATUKIllLK\nG26eIgVAIxXjZgAYCmNw4mM3z6
EwBRyseziYMhykBIQhGDIoH4BVSYWhAHqDHoWy0GrSwL4BZ0kF\nERERERFR2eOLDIc3YQwuFADoLqU8YPbaTiHEXwD+A+AuAFMBzHFncillIYAkZ2OEEDoAXYoeZgH4\nxp1zWM5l/O6wpAKAkMGQMAs4BBX3cLDZoQIAhN4YqBC2LwUCm5IKNo0kIiIiIiIKeF7t4SCEaAWg\nI4xZBSusgg0mrwM4DkAAmCyECPLCUu4HUKNoHV9JKXNdjHfJYUkFACGLG0daZzjkFdoJOOj0DucK\nBCypICIiIiIiKnu83TSyv9nxSnsDpJQSwKqih9EAunphHSPMjlc5HKWCKQvBYAAKCy2fL96pojhx\nxDzDwSANyCnMsTNpgAccrEoq2DSSiIiIiIgo8Hk74NCh6PsNAAedjNtldtzekwsQQkTAGPiQAM5K\nKX8u2XzFx7m5ls8Xl1TYz3AAgOz8bDuTGgI64GCd0cAMByIiIiIiosDn7YBDAow3+n9L6fQu03zX\niAQPr2EggPJFx5qbRdpj3cfBRMjiDIfwkHAlwwFwEHBgSQUREREREREFGK8FHIQQYQDiih4mOxsr\npcyEMQsCAGp5eCnm5RSflHQy8wwH84CDRYaDdUmFywyHAA84WJdUsGkkERERERFRwPNmhkOk2bGd\nu2wbpoBDhKcWIISoBaAzjFkWe6WUp0o+Z/Gx4wwHq5IKlxkOBhgMgRtxYEkFERERERFR2ePNgEM5\ns+N8FePzYNypItyDa3isaE7AQ+UUano42DSNNMtwyMrLsjuvIYBTHKwzGtg0koiIiIiIKPAFux6i\nmfnWk6EqxofBmIlgZxsHzYYXfc8D8KUnJjQY8gEcAgAcOVL8fFZW8TaZMq3AGOaIVJnhAKBQr4f3\nW2r4h3WAgRkORERERERUll24cAEXLlxwOS4/X81n96WXNwMO5h/lqymTqFD0XU35hUtCiFYA7oYx\niLFBSnndE/Pm5l4G0AIAMGZM8fP79hUf538GYyFHV5U9HAAU6PUAQuy+dqtj00giIiIiIqJiy5cv\nx8svv+zvZXid1wIOUso8IUQ6gFgANZ2NFUJEwxhwkADOe2gJ5s0iV3loToSHV0ZOzhYAwJIlwOTJ\nxufbtQMOHTL2dQh7dATy4v8wjg9WsUsFAL0hcG/CrQMMbBpJRERERERl2bhx49C3b1+X43r06IHL\nly/7YEXe4c0MBwA4DqAjgDuEEDonW2PebfWeEhFCBAP4v6KHlwBsKemcJkFBoQCaAwBqme2nER0N\nBAUVjYmLVlpmqs1wKAzgvgYsqSAiIiIiIipWvXp1VK9e3eW40FA13QlKL283Ddhd9L0CTHUI9nU2\nO97jgfP2gnFLTglgtZNAh9tU7VJhCFOOw0PCLQIONwpu2HtLUQ+HwMSmkURERES3piFDhkCn00Gn\n0+HLL+23RFu+fLkyZsKECR45b15enjJn+fLlPTKntzz//PPKWhcuXOjv5fjErfT7If/ydsBhvdnx\nKHsDhBACxeUPmQB2eOC8XimnANTtUlHx3FAEiSAMvmcwyoeUtyipuJFvP+AQyDfh3BaTiIiIyrpp\n06YpN2j16tXTPM/Vq1cRFhamzPXJJ594cJWOCfN/BJdgjDfOW1rcSmv1lLJ4zeQerwYcpJS/AvgZ\nxj0bxggh2tgZNg1AAozZCG9KaflxuBCisxDCUPT1oatzCiFiYMxwkACOSimPuHiLWxxlOJg/H3N6\nDK4+dxVfDPwCACxLKgrKXg8HllQQERFRWff4448DMN6gnT17Fj/99JOmedasWYOCggIIIRAREYFH\nHnnEg6ssfaSPt44vi9kKJeHr3w/denyxD+NkGLe6DAGwVQgxXQjRRgjRRQixHMCConF/AnjdyTxq\n/5ofRfE2nCs1rFc165IKU9BBSiAyLFJ5Xk2GQ0D3cLAuqWDTSCIiIipjGjZsiGbNmik3aKtWaUvC
\nNc9oGDhwYKlJZzd90h0on3i7ex2Bct1Enub1gIOU8jCAwQCuwdjLYT6AfQC2A3gCxkDCCQC9pJT2\n78bdM7Loux7AZx6Yz4KakgprDptGGoKUw0AOOLCkgoiIiAgYOdL4z1QpJb7++mvkOWoI5sBff/2F\nX375RXk8YsQIJ6N9Z9y4cdDr9dDr9Vi2bJm/l+Nzr776qnL9zz77rL+XQ1Sq+CLDAVLKbwE0BvAG\njJkMNwBcBfArgGcBNJdSnnY2hdV3u4QQdwBoVTTuBynlpRIu3c45io8dlVRYZxZZZDiYN40sCFcO\nDWWopCKQ+1UQEREROTJ06FCEhIRACIGsrCysX7/e9ZvMfPzxx8pxrVq10KVLFw+vkIjIs3wScAAA\nKeV5KeU0KWWClDJSShkrpWwjpVwspcx18r5dUsqgoq8xLs7xt9nY3p6/CkvOSirMOcxwKCwOOARy\nhoN1CQUzHIiIiKgsiouLw0MPPaS5rGL16tUAjOn7pmwJIqLSzGcBh0ChJsPBmsMeDmYZDievnMCY\nb8Zg3fF1nlimU1dzrvq0wQtLKoiIiIiMzMsqtm7dikuX1CXk7tq1C2fPnlUeP/bYYw7HJiUl4fXX\nX8fDDz+M+vXrIyoqCqGhoahSpQpat26NadOm4eTJkyW7EDPubouZmZmJV155BS1btkRMTAwiIyOR\nkJCAp556Cr///rvb5z9z5gzeeecdPProo2jYsCGio6MRGhqKuLg4NG3aFBMnTkRiYqLTOdq2bQud\nTocFC4zt5aSUmD59unJd5l/W1+huo8n8/Hy8//776NevH+rUqYPy5csjOjoaCQkJePLJJ7Fjh7pN\n+6pVq6ac1/R3dO7cOcyYMQONGzdGdHQ0IiMjcc8992DKlClISUlRNa+3HD16FFOnTkXTpk0RFxeH\ncuXKIT4+Ht26dcOiRYuQmZmpeq6tW7di1KhRFr/vKlWqoEGDBujWrRtmz56NPXv2QK93/KHu2bNn\nMXv2bHTq1AlVq1ZFWFgYoqKicPvtt6NNmzYYO3YsvvrqK1y9etUTl192SSn5peILQDIAGRsbL405\nDFJOnCiV4379pKxY0Xhcv7608Nb+tyRegsRLkJUWVFKO8a+7leNm77SWeAkycn6kzC3Ild7y2ZHP\nZNDLQbLryq7SYDB47Tzmnt78dPE1vwS5ZP8Sn5yXiIgokMXHx0sAMj4+3t9LITfk5+fL2NhYKYSQ\nOp1OvvHGG6reN2rUKOU97du3dziuT58+UgihfOl0Oosv0/NBQUHyueeec/nvwUcffVSZ54svvrA7\n5r333lPGPPXUU07n2759u6xWrZrd9QkhZHBwsFy4cKHMzc1VxoSHhzucb+LEiaquVwghR44cKXNz\n7f87u23btjbvsZ7L9GV9jdOnT1fGL1iwwOn1//zzz/K2225zeP2mr169esmrV686natatWrKey9e\nvCi/+OILGRUV5XDeiIgIuXXrVqdzqqX29yOllAUFBXL8+PEyKCjI6XXHxsbKNWvWOJ3r+vXrskeP\nHqp/56tXr7Y7z1tvvSXDw8NVzdG9e3fNPycpS/7/atP7ASTLUnBf7O5XsB9jHbckRxkO1q+ZM89w\ncFRSkZp9HgCQlZ+FrPwshAWHlXit9nyV9BX0Uo8dZ3bg0o1LqBpR1SvnMceSCiIiIiKjkJAQDBky\nRGmuuGrVKvy///f/nL4nNzcX//vf/5THpi027Tl//jyEEAgJCUGDBg1wxx13ICYmBkIIXLp0Cb/+\n+itSU1NhMBiwcOFCFBYWYtGiRR65Nlf27duHPn36ICcnB4CxNKRNmzZo0KAB8vLysHfvXpw5cwbT\np09HeHi4i9mMkpOTIYSATqdD/fr1Ub9+fVSqVAkhISHIyMjAwYMHcfq0sVXcqlWrkJ2dja+//tpm\nnsGDB6NVq1bYt28fDh48CCEE2rVrh+bNm9uM7dixo6br37Zt
G/r27Yu8vDwIIZTrT0hIQF5eHvbt\n26esdfPmzejYsSN2796NihUrOpxTSgkhBL799ls88cQTkFKibt26aNu2LSIjI/H333/jp59+gl6v\nx40bNzBw4EAkJSWhRo0amq7BXQaDAb1798YPP/ygXHNcXBw6d+6MmJgYnD17Fjt37kRBQQGuXLmC\nYcOG4fr163jyySftzjd48GB8//33yq4gd911F5o2bYqYmBjk5+fj0qVLOHr0KM6dO+dwTZ9//jkm\nT56srKdixYpo164d4uPjERQUhGvXruHPP//EH3/8gYKCAq/8XMoSBhxKQHXTSLMeDvn6/OIXCsuZ\nPV88WW6hw5YWJWZ+/gKDb/4DYkkFERERUbGRI0di2bJlkFLi999/xx9//IF77rnH4fi1a9ciKysL\nABAWFobBgwc7HPvggw9i9uzZ6N69OypUqGB3zPr16/HEE08gIyMDb775JoYMGYIWLVqU7KJcyM3N\nxYgRI5CTkwMpJerVq4cvv/zS5ob+ww8/xIQJEzBt2jQIIUyZxg61adMGAwcORM+ePRETE2N3zK5d\nuzB69GicPn0a69atw9q1azFgwACLMVOmTAFgLI84ePAgAKBv374e23UiPT0djz32mLIzSYMGDbBm\nzRo0bNjQYtyqVaswfvx45OXlISkpCePGjcPnn3/ucF7TjfeECRMQERGBFStWYODAgRZjjhw5gh49\neiAtLQ1ZWVmYP38+3n77bY9clytz585Vgg0AMHv2bMycORNBQcW79V24cAGPPfYYtm/fDgCYPHky\n2rRpgyZNmljMdeDAASXYULFiRXz99dfo1q2b3fOeOnUKn332GeLi4mxemzNnjrKeadOm4ZVXXkFo\naKjNuOzsbGzevBnHjh3TdvEEgD0c3KZmW0ybgINZhoOFguJ9k3P1xZPlFbq3RZI7zLMNfLVbBHep\nICIiIirWsmVLJCQkKI9dNY/85JNPABhvLvv374+oqCiHY1977TX079/fYbABAPr3749164x9w6SU\nPrn5fP/99/HPP/9ASomIiAhs27bNbvbA6NGjsWzZMuTn59uZxdb06dMxbNgwh8EGAOjcuTN++OEH\nhIQY/02+dOlSbRdRAv/5z39w8eJFSClRuXJl/PjjjzbBBsC41emHH36opKN/9dVXLvtPSClRWFiI\nDRs22AQbAKBx48bK71hK6TSA4UlXr17FggULlJv7F198EbNmzbIINgBA9erVsWnTJjRt2hRSSuTn\n52PmzJk28/3888/K8b///W+HwQYAqFevHl544QU88MADFs9nZGTgxIkTAIDbb78dCxYssBtsAICI\niAgMHjwYc+bMUXfBZBcDDm7SVFKhcxRwKE4VM89wyNN7MeBgdrNfaCj02nkszsmSCiIiIiIL5s0j\nV69e7fCT/LS0NGzbtk15PGLECI+cv0OHDqhbty6klBbze8sHH3wAwBg0mTp1Km677TaHY8eMGePx\njIvbb78dHTp0gJQSe/fuVTINfMFgMFhc/5w5c1ClShWH4x999FF07dpVefzuu+86nV8IgYEDB6JT\np04Ox/Tv3x+VKlUCYAwE/PPPP+5cgiarVq1SMlpq1aqFF154weHYcuXKYcmSJQCM/01s2bLFpizi\n+vXryrG9zAU1zOeoXLmypjnIPSypcJOaXSpUZziYlVSY35R7s6TCIsNB+ibTwDrA4KvzEhERkXYt\n/9sSadlp/l6GZtUiqiHxSeefDPvT8OHDMWPGDBgMBly4cAHbtm1D9+7dbcZ9+umnSqf9atWq4cEH\nH1R9jpMnTyIxMRGnTp3CtWvXkJeXZxHYuHnzJgAgNTUVGRkZiI2NLeFV2Xf16lUcOXJEeexshw2T\nESNGKKUNap07dw4HDhzAyZMnce3aNeVm1+T8eWPPtMLCQhw7dszrZSQmv//+O65cuQIACA0NxdCh\nQ12+Z+zYscpuFc52rTD1
cLCX2WBOp9OhUaNG2LVrFwDjDg2333672kvQxFQiIYTAsGHDEBzs/Naz\nY8eOuPPOO/HXX39BSomdO3daBNhq1aqlHK9cuRIjRoxAWJh7fe+qVauGkJAQFBQU4NChQ/j111/R\nqlUrt+Yg9zDgUAKOSiqsOc5wKG/3aa+WVBj8UFLBDAciIqJbTlp2GlKy/LuNXiCrUaMG7r//fvzw\nww8AjGUTjgIOgPGmbfjw4Up6ujPr16/HSy+9ZHGT70p6errXAg6//fabchwXF4d69eq5fE+7du1U\nz//zzz/j+eefx969e1W/Jz09XfXYkjJdvxACDRs2REREhMv3tG/fHoAxoHD27FlkZmYiOjra4fhG\njRq5nNP893vt2jWX40vK/Pd+7733qnpP+/bt8ddffwEADh06ZBFw6NOnD8qVK4fc3Fzs27cPCQkJ\nGD16NHr16oUmTZpAp3OdvB8eHo5evXph/fr1yMvLQ6dOnTBkyBA88sgj6NSpEyIjI928SnKFAQc3\nOSupMFGf4WC/+65XSyr8kOFgHdhgwIGIiKj0qxZRzd9LKJFbYf0jR47EDz/8ACkl1q1bh5s3b6J8\n+eIPpA4fPmwRNFBTTjF9+nQsXLgQAFQFJ0wZAKamlN5w+fJlZT21a9dW9R614959911MnDhR+aTf\nFV9crzXT9QNAnTp1VL2ndu3a0Ol0MBiM/25OT093GnBwtpOFiamHBQCf7L6g5brNS22sg0JVq1bF\n8uXLMXbsWBQWFuLMmTOYNWsWZs2ahcjISLRt2xadO3dGnz59nAZgli5diiNHjuD06dPIy8vDypUr\nsXLlSuh0OjRs2BCdOnVC9+7d0aNHD4ufGWnDgIObnJVUOGwaqaKHgzlvZjiY923wVQ8Hm5IKNo0k\nIiIq9UpzOUKgePjhhxEVFYXr16/j5s2b+Prrry2CCh9//DEA4416s2bNnO5kAQAbNmzAwoULlRvv\njh07YuTIkWjZsiVq1aqFChUqWNxAtWvXDr/88gsAKDe23pCdXbwtvHlAxRlnTS9Nfv/9d0yaNAmA\n8WfUuHFjPPHEE2jbti3q1KmDyMhIi4aAQ4YMwRdffAHAu9drzfz61VyXSXh4OG7cuAHAdYBETbDF\nlwwGg0WfDLXXbT7O3jU/9thjuOeeezBnzhx89913KCw03s9kZ2dj69at2Lp1K1544QW0bdsWr7/+\nOtq2bWszR3x8PA4dOoQFCxbggw8+UAIjUkocOXIER44cwdtvv424uDhMnz4dU6ZMKXU/31sJm0a6\nSc0uFdbsZjgYdIDBfiDCqz0cWFJBREREVCqUK1cOgwYNUh6bdqMAAL1eb7GbwOOPP+5yvsWLFyvH\n//rXv7Bz506MGjUKjRo1QnR0tM2ntb76lN+8hMDUN8IV0422M4sXL1YCB/369cPBgwcxYcIENG/e\nHLGxsTa7D/gyq8Gc+fWruS6TnJwc5fhWS/XX6XQW/RXUXrf5OEfX3Lx5c6xfvx4XL17E2rVrMXXq\nVLRp0wYhISEQQkAIgf3796NTp07YtGmT3TmioqIwb948pKamYs+ePViwYAH69u2L2NhYZY709HRM\nmzZNVc8NcowBhxJQXVJhL8NBHwYYgmyfR+CXVLBpJBEREZGR+W4VO3fuREqKsW/G999/j4sXLwIA\ngoODMWTIEKfz5OXlKT0MgoODMXfuXKfjpZRITk4u6fJVMe0GIKW02XnAEVODR2dMTQkBYO7cuS5r\n+M+ePavq3J5mvhuC2us/d+6cRRaG1l0Z/EnLdZ85c0Y5dnXN0dHR6NevHxYuXIi9e/fi8uXLeP/9\n9xEfHw8hBPR6PcaNG6c0XbVHp9Ohbdu2mDZtGtatW4fLly9j586deOihh5Sshi+//BKbN29WtX6y\nxYCDmzSVVNjLcNCHAtJBwCHAmkZaZzQww4GIiIjIqEOHDkoTRYPBoDSJNC+n6Nmzp8uGjh
cvXoRe\nr4cQAjVr1kRUVJTT8YcPH7bYItCbmjZtqhynp6fj9OnTLt+zb98+p69LKZWAjE6nQ4MGDZyOz8jI\nwPHjx12mxnsjdb5Zs2YAjGs+evSoqk/7TcEjIQTq1KnjtH9DaWW6bgCqG3qaj2vevLlb54uKisLo\n0aOxdetWBAUFQUqJtLQ0/Prrr27N07FjR2zcuNFim9ENGza4NQcVY8BBA1PwtES7VOhDAWn/x++r\nbTF91cOBJRVEREREjpn3bfjkk09w/fp1bNy4UXnOlAXhjPmn+2rKFt555x03V6ldpUqV0LhxY+Wx\neemII67GmNLeAWOgJjfX+b+f33vvPRgMBottMu0pV65423pPNVZs0qSJEjDKz8/HmjVrXL7ngw8+\nUI67devmkXX4mvm6V69e7TTTAAB2796NkydPAjD+frt06aLpvPXr18ddd92lPDYFptwhhECvXr1K\nNAcZMeCggakcqUS7VBT6qaTCUApKKtg0koiIiEgxYsQI5eb5+PHjePbZZ5Ub6EqVKqF3794u56hW\nrRrKly8PKSUuXbrk9FPd7du346OPPvJpI7wxY8YAMH7Kv3jxYqflDStXrsSvv/7qcn3mOxqYB2is\nJSUl4dVXX1V1veaZJKbylpLS6XQW1z9r1iyLHRysffnll/jxxx+Vx+PHj/fIOnxtxIgRCA8PV0pp\n5s+f73BsXl4eJk+eDMB4s//QQw/Z7FSSkZGh6rwFBQUWAYIqVaoox1lZWUqjSVfMy3rM5yD3MOCg\ngb2Ag7YMBz+UVEiWVBARERGVJrfddhs6duyofPr+3//+F4Dxxmvo0KEIDna9sVxwcDAefPBB5fHw\n4cPx22+/2YxbvXo1+vXrBymlWzsmlNTYsWOV0pHs7Gzcd999OHTokM24jz76COPHj7doOOhInz59\nlONJkyZZ9HQw2bJlC+677z7k5OSout6GDRsqx999951bTR6dmTZtGqpWrQrA+Gl5t27dcOzYMZtx\nq1atwuOPP65kcAwePBgtW7b0yBp8LSYmBtOnTwdgDLS89NJLmDdvnk2mQ2pqKnr16qX8vYaGhtrt\nQTJp0iTcd999WL16tcNyoPT0dIwaNUoJTsTGxqJVq1bK63v37kXdunUxd+5cJZvCmqm0afny5cpz\nDz30kBtXTua4LaYGpoa3+fnFz5kHG1T3cHCQ4eCzXSp8leEg2TSSiIiIyJmRI0fip59+AmAMNJiC\nD+blFq7MmjUL3377LfLz8/HXX3+hVatWaNeuHe68806lqeTZs2chhMCkSZPwyy+/KNtielt4eDhW\nrVqF7t27IycnB6dOnUKrVq3Qpk0bNGjQAHl5edi3bx9OnToFIQTeeustZctLR6ZOnYqVK1fi6tWr\nuHTpEu6//360bNkSCQkJkFIiMTERJ06cgBACffv2RVhYGL788kunc3bo0AFVq1bFxYsXcfbsWdx9\n9924//77ld0LAON2ogMGDHDr+uPi4vDpp5+ib9++yM3NxR9//IEmTZqgXbt2uPvuuy2uHzD+Ddxz\nzz1477333DqPr7nKGpk5cyb27duH77//HlJKvPjii3jrrbfQpUsXREdH4+zZs9i5cyfyi26sdDod\nlixZgiZNmtjMJaXEjh07sGPHDgQFBSEhIQEJCQmIiYnBjRs3kJKSgj179iilMEIIvPnmmzYBu5SU\nFMyaNQuzZs1C9erV0bRpU1StWhVBQUG4ePEiEhMTkZaWpszRvXt39O/f3xM/rjKJAQcNHAVcHTaN\ndLRLhYMeDr7apcJnPRwM7OFARERE5MygQYMwadIki/4LCQkJaNGiheo5mjRpgk8//RQjR45Ebm4u\npJTYs2cP9uzZA6C478GkSZOwePFidOjQwePX4cy9996LjRs3YtiwYbh06RIAYP/+/di/f7+yvqCg\nIMydOxdjx451GXCoUaMG1q9fjwEDBuDKlSsAgMTERC
QmJirzmbIEVqxYgbFjx7pcY1BQEJYtW4Yh\nQ4agsLAQqampSgNPk/Hjx7sdcACA++67Dz/88ANGjBih7Mawd+9ei0aJphv4nj174pNPPkHFihXd\nPo8vueqJodPpsGHDBkyaNAkrVqyAwWBAeno6vvrqK2WM6fcUExODZcuW4f/+7//szhUZGWnRt+PY\nsWM2WSKmuSpWrIglS5bYbGlZvnx5BAcHK1kWaWlp+O677+zOAQBDhw7F+++/r+InQY4w4KCBvYCD\n05IK7lLh9DERERFRWRcREYGHH34Yn332mfLc448/7vY8AwcORPPmzfHGG2/ghx9+QHJyMkJCQlCj\nRg106NABo0aNQrt27ZTx5jdXjqgdY/7dkW7duuH48eN4++23sW7dOpw6dQqFhYWIj49Hly5dMG7c\nODRr1gx5eXmq5uzYsSOOHTuGN998E99++62yA0b16tXRqlUrjBgxwqLURM21DBgwAL/++iuWLVuG\nvXv34vz588jOzlZuru29X20/jPbt2+PEiRNYtWoVNmzYgMOHD+Py5csIDQ1F9erV0bFjRwwdOlR1\nw0Q111OS8a7mMv/uSHBwMN59911MmDABK1euxI8//ojk5GRkZ2cjNjYW9evXR+/evTFmzBinAZb/\n/ve/mDJlCrZt24b9+/cjKSkJ586dQ1ZWFkJDQxEbG4uGDRviwQcfxPDhw1GpUiWbOTp27IhLly5h\n69at2L17Nw4fPox//vkHV65cgV6vR1RUFO644w7ce++9GD58uMVOG6SNcBWVIiMhRDKA+Pj4eFSs\nmIykJMvXhw0DduwAUlOBmjUB862DM25mIO4/VvvInu0AHB8A9HjG5lxPt34aSx5a4vmLAFBjcQ1c\nyL4AAFj3f+vQ/27vpwd1XtkZP539SXn8ZPMnsbzPcifvICIiIldq1qyJlJQUxMfHIzk52d/LISIi\nO25OS7MAACAASURBVEr6/2rT+wGkSClrenyBXsamkRq4XVLhMMPBvyUVvspwYEkFERERERFR2cOA\ngwZqSyrOngVefx1IPe9mSUWAbYtpHWBg00giIiIiIqLAxx4OGph2qTBnb5eKUaOMZRbfbAwBuli9\nQR/mn10q/NE0UjLDgYiIiIiIqKxhhoMGrkoqTP78s+j7CTuBhTLUNJIlFURERERERGUPAw4aOCqp\nMDFlOBQWJRAY9MJ2a8zS0MOBJRVERERERETkJQw4aOCopMK6aaQp4KDX22kcWeinkgp/ZDiwpIKI\niIiIiKjMYcBBA7UlFRYBB7sZDn4oqfBHDwerwIavAh1ERERERETkPww4aKC2pEKvL/5uk+GgD3WY\n4RDou1Qww4GIiIiIiCjwMeCggaaSCpsMhzCHPRy8VVJhkAZIyOIlsKSCiIiIiIiIvIQBBw0clVRY\nc9rDwQ8lFTalDT7KcPDXeYmIiIiIiMh/GHDQQE2Gg5TFJRWFhQ56OPi4pML6Rt9XPRxYUkFERERE\nRFT2MOCggaMeDuZ9HAwGy2O7u1T4uKTCX80bWVJBRERERERU9jDgoIGrkgopi8spTErDLhXWN/5+\nK6ngLhVEREREREQBjwEHDdSUVFgHHILtNY30dUmFn278WVJBRERERERU9jDgoIGakgqbgINQn+GQ\nW5gLKaXd10rCXz0c/JVZQURERERERP7DgIMGWkoqbDMcQh32cACAAkOBxtU55u5uEXqDHhv/3IhD\nFw559LzMcCAiIiIiIgp8DDho4KqkAijeocLEJsOh0HFJBeCdPg42mQYuSiq+/ONL9P28L9qsaIML\nWRc0n5clFURERERERGUPAw4aOCqpMFGf4eAk4OCFPg7uZjiYMhsKDYVIupyk/bxuBjqIiIiIiIjo\n1hfs7wXcihyVVDhtGmm3h4PjeI83tsZ0t4eDedCjJH0XWFJBRETkPRcuXEDNmjX9vQwiIrLjwgXt\nmeKBgAEHDdSUVF
gHHIJsAg5+KKlwc5cK8zWUpMGkdYCBTSOJiIg8x2AwICUlxd/LICIissGAgwZq\nSipc9nDwQ0mFddDA1Y2/RYZDCcogrM/DDAciIqKSq1atmr+XQEREKpXV/2cz4KCBlpIK2wyHUKcZ\nDr4oqXAVRDBfQ0kyHFhSQURE5HmJiYn+XgIREZFTbBqpgZaSCru7VDjp4eCLkgp3ejh4tKSCTSOJ\niIiIiIgCHgMOGmjZpSII/i+psMlwcFVSUVjyppFSSkhIi+eY4UBERERERBT4WFKhgauSCsA24KCz\nF3AQjm+8vVJS4ea2mJ7IcLB3DjaNJCIiIiIiCnzMcNCgfHnb51w1jbS7S4WvSyrc7OFgkeGgsQzC\nXjYDMxyIiIiIiIgCHwMOGjgKODhtGlkaSir80MPBXqCCAQciIiIiIqLAx4CDBvYCDoDzkgqbgENh\nmP93qfBBDwe7JRVsGklERERERBTwGHDQQE1JhaoeDs4yHJyUVCRfT0a/z/thxo8z1Cy3+JTWPRxc\nlVR4IMOBJRVERERERERlEwMOGoSEAEFWsQLrkgqbHg52Aw5Oejg4KalYnrgcG/7cgFd3v4qky0mq\n112iDAeNWQn23semkURERERERIGPAQcNhADCw+0/b+I0w0EKwBCsuaTiQvYF5fjSjUsu16usySpL\nwSc9HOwEF5jhQEREREREFPgYcNDIuqzCnZKKIIQCEJpLKm4U3FCObxbcVLVewP2SCvOgh9asBJZU\nEBERERERlU0MOGhkL+DgNMNBWgccoLmk4kZ+ccAhpyDH9WKLlKSkwpO7VLBpJBERERERUeBjwEEj\nRztVACoyHGSY8UBjSYV5hkNOoRsBBz80jWRJBRERERERUdnEgINGzjIc7DWNNM9w0CkZDhpLKvI1\nllRY3fw7CyIUGgotAgNasxLsBRfYNJKIiIiIiCjw+SzgIISoLYRYLIQ4LoTIFkJkCCEOCCGmCSHs\ntGAs0bk6CyE+EEKcFEJkCSGuFR3/TwjxlBDCSX6COiUqqZBFAQcnGQ5OSyoKNJZUWGc4OLnxtw54\neLKkghkOREREREREgS/YFycRQvQB8AmAKACy6OlwAC0AtAQwVgjRS0r5TwnPEwHgAwCDip6SZi9H\nALgDQH8AewAcKcm57O1SYWKvpEJYNI0sKqlw0sPBaUmFhzIcnGUtWAc8tGYlsKSCiIiIiIiobPJ6\nwEEI0QzA5wDKAcgCMB/AThgDDo8CeALAnQA2CSFaSilvOJjK1XnKA9gC4F4YAw3fFZ33JIyZHHUA\ntEJxMKJEXJVUOMtw0EkVJRVqMxxK0sPBBxkOdksq2DSSiIiIiIgo4Pkiw+FNGIMLBQC6SykPmL22\nUwjxF4D/ALgLwFQAczSeZxGMwQY9gNFSyk+sXt8P4AsA04QQJS4lcdY0ErCT4WCwE3BwVlLh5x4O\nNhkOGoMELKkgIiIiIiIqm7zaw0EI0QpARxgzDlZYBRtMXgdwHIAAMFkI4fgu3PF5mgAYV3SexXaC\nDRakLPkdr7tNI80DDsUlFe7vUiGltAgylKiHg7OSCk/1cLCTRcGmkURERERERIHP200j+5sdr7Q3\nQEopAawqehgNoKuG80yAMWCRB2CBhve7zd2mkcJuSYXljz/ILNbiqKQipzAH0qw1xc1C1xkOUkpI\nKW2CBk5LKjzUw8FeNgMzHIiIiIiIiAKftwMOHYq+3wBw0Mm4XWbH7TWc5xEYsxu2SymvAoAQIqho\nZ4zaQohQDXM65aykwm7TSItdKuw3jYwIjVCOHZVUZOdnWzx2leFw6cYl3PX2XWjwTgNcvnnZ4jWf\nZDiwpIKIiIiIiKhM8nbAIQHGQMDfLsoYTli9RzUhxB0AKhU93CeEiBVC/H/27jxMrrLMG//3qaWr\nuiu9pdNJOhtZ2CIaIQngjDKoDIsICDiKPwdRAQ2+zCgzOs7o64joK44zo+DooIC4
jKgjv0F9FTWC\nsskaDEQim4RshHSSztL7Vsvz/nH6nPOctc45daq6qvv7ua6++tSps1Sn8fI6d3/v+7kZwBEAO6e+\nBoQQvxJCvMH9KuHZV6koNzTSdYYDYJnjkE1lkZgaL+GVcFDnNwDlZzh8+FcfxrbD2/D8wefx6fs+\nbXnPr4hgb+mIs6WiJEvQgi1EREREREQ0U1Wt4CCEyACYN/Vyj9+xUsp+aCkIAFga8lavUrYT0Ja7\n/ACAHLRihwTQBOBsAA8IIT4a8vquyrVU+M1wsBQclJRDOplGNpUF4D3DQV2hAii/SsXmXjNY4lgW\nc5paKgBY2kKIiIiIiIho5qlmwqFV2R72PMqkP0nP8T3Kaa6y/UkACwH8EtoSmFkA8wF8CEA/tDkP\n/yqEODfkPRzCtlTAUnDIKPvNhEM6kUYmqb3n1VJhTziUa6nwe3+6WioAtlUQERERERHNdNUsOGSV\n7ckAx09AKwg0lzvQJqdsNwG4G8D5UsonpZR5KeUhKeUtAM4HoD/l/kvIeziEHxppphqsCQel4KAk\nHDxbKvLhWir8EhChEg5Rl8X0uEfU6xEREREREVFjqGbBQe0JCDK0MQOt/SH4Oo/W++iP+/8oXQYE\nSCkfBvDjqeNOEEKcEPI+Fm4FB5W94NCN43HM3GMgILB09DzzDVvCQS842IdD6hwJhzItFX4JB7/U\nQlwJB68kAxMOREREREREM1uqitceUraDtEnoSYUg7Rde9zkopXza59hfA/irqe2TATwT8l6YnJzE\nk08+ib17rfv37QNGlbBBf38PgB5zRymFP179RxwYOYBPX7PE3G+b4bA8txw7+nfg8NhhHB47jLnN\nasdIzAkHv5aKmGY4eN0j6vWIiIiIiIgaXW9vL3p7e8seNzkZpFmgflWt4CClnBBCHATQBWCJ37FC\niA6YQx5fDnkr/fgg56rvd4e8DwCgr68P69atc+z/1resr7dsuRbAZ4zXxSLQlGzCkrYlsOQvlJaK\nVCKF1fNW476d9wEAnut7Dq9fZl0lNOwMBz9eK0hMFCbim+HgUVhgwoGIiIiIiGarm2++Gdddd910\nf4yqq2bCAQCeA3AagKOFEAmfpTGPt50TxrPKdtLzKOf7kf7E3t3djY0bN2LrVuB97zP3X3EFsGkT\nsHWr9vr443vw+OPKzZS7WQoOtpaKV3Wbi2482/ess+AQMuHgx54+mChM4KSbT8KewT142/Fv8z02\n6j10QQoOfSN9+PVLv8Zbjn4Lulq6It2fiIiIiIio3mzYsAEXXHBB2ePOOecc9PX11eATVUe1Cw4P\nQSs45ACsA/CEx3GnK9sPh7mBlHJACLEVwGsALC9z+Cpl+5Uw99E1NTVh7dq1SNn+5Xp6gNZW9Tjr\n+2rBoaQ+a9taKlZ3rzZeP9un1lI0bjMcpJQQ9iESAdhTC997+nt47qBW77n96dt9jw3Kq7AQpIDx\n7h+/G7/Z/hucd+x5+Pn/9/NI9yciIiIiIqo3PT096OnpKXtck/3BssFUc2gkAPxU2X6/2wFCe1K+\nbOplP4D7Itznx1Pf24QQb/Y57mJl+6EI9zGEXaXCM+EgvRMO+sO/yp5wKMkSJovR+nrs7Q4D4wOB\nj416D12QhMOWfVss34mIiIiIiKhxVLXgIKV8AsDvoK0McYUQ4lSXwz4GYDW0GQw3Sml9QhVCnC6E\nKE19fcvlfAD4T2jDIwWALwshWu0HCCEuBfDGqfvcJaWMlHDQhV2lIlBLRTKNBbkF6Mh2AAiWcADK\nr1ThxZ4y0FfIcBN5hkMFQyPzxbzlOxERERERETWOaiccAOAj0Ja6TAO4RwjxT0KIU4UQbxRC3Azg\ni1PHvQDgyz7XcSx1abwh5UEA/zT1cg2ATUKI9wkh1k7d56sAvj31/iCAv6/g5wHgLDgA/gkH9bXf\n0EghBFbP09oqXh58GUMT6iIczoQD4D04styD
uv2hvznd7HnsdCyLmS9pnz9qgoOIiIiIiIimT7Vn\nOEBKuUUI8U4AtwNoA3C9/RBoxYa3SimdT9PB7/N1IUQntKUhjgVgT0NIAPsBXCilfCnqfXTNtmdz\ne8Ihb3vWDzTDIZEGALyq+1V4dM+jAIDnDz6PkxefbBzjVnDwGhw5NDnkul9VkiUkhPYZmlPeBYfI\nQyMraKkwEg4lJhyIiIiIiIgaTS0SDpBS/gJa8uAGaMWFEQBHoA2R/DiAtVLKHX6XsH33us/1AF4H\nLc2wA8A4gAEAvwfwaQDHSSkf975CcPbZHfYZDn4FB7+WCgBGwgFwznEI01IxODHoul+lJhekzz9v\n7C0VZQoYUkomHIiIiIiIiBpY1RMOOinly9DmNXws5HkPoPxyl+rxTwK4MtynC89tUYg4hkYCcCyN\nqQqTcAhScCiWisa/rl8LRtShkVFbKtT75Yv5yCtxEBERERER0fSoScJhNog8NNK2LCYA36UxXRMO\nHjMcAhUclAd7vyRB5ISDR6GiXAFDLX5IyMgFDyIiIiIiIpoeLDjEJExLhWWGQ8mZcFjWvgwC2sX2\nDe+zXKcqCQf9M/vMSog8w8HjvHIJB/tn4UoVREREREREjYUFhyqJ0lKRSmgdLgmRMJaotM9nqOYM\nB7+H+lqvUmH/LJzjQERERERE1FhYcIhJLEMjpxIOgLlEpb1dIvaEgwyYcIjY0uDZUlEmMeFIOHCl\nCiIiIiIioobCgkNM8vmoQyOdMxwAc4nKQAmHSmY4lKzDGb3oCYehifJLbXpdX09wAEw4EBERERER\nzXQsOMRkzPbM71dwsMxwcFmlAnBPOEgpMTw57Lx3BS0VQRMOhVIBNzx6Azq+2IEP/vyDZa+rUwsL\n6s9XdmgkZzgQERERERE1NBYcKpBUFuscG4uhpUJJOLSkW7TrKsWE8cI4JNSTNZW0VKizGfxSBMVS\nEbdvvR0lWcJ3tnyn7HWN85TCgvrzMeFAREREREQ0s7HgUIGWFnPbXnCoZGgkYLZUjBfGIadOUOc3\nqMfWqqVivDCuHVfKB04cqNdXEw6hV6ngDAciIiIiIqKGwoJDBZqbze1RW8gg0gwHl5YKAMaDvjq/\nYV7LPGO7b7QPv3rxV475CnEPjVSLDF6pCjtLS4WScCg7NLLIlgoiIiIiIqJGxoJDBdSCQ5iWCssM\nB4+WCj3hAJhtFWrCobul29j++u+/jnN/cC4u+tFFlnvGnXBQ2xqCFhwsLRUVJBzYUkFERERERNRY\nWHCogL3goLInHNTXXi0VXgkHvWXCK+Gg++2O31peh53h4JtwKBUt7wcuOJTcZziUHRppTziwpYKI\niIiIiKihsOBQgagJhzDLYgIeCYecmXDwErqlIqaEw67+Xegf7wfgvUoFEw5EREREREQzGwsOFfAr\nOASe4VDySDikyiQcmp0JB8CaWAjbUjFZ8lmlIuAMhwd2PoAVX1mBo248CkfGjniuUnHRjy7C0f9x\nNJ7e/7TrdTjDgYiIiIiIqLGx4FAB+yoVKr+Cg2WGg9cqFWn/hINbSwVgDpgEpifhcPdLd0NCYnBi\nEI/uedRzlYrDY4fx0pGX8O2nvu16HSYciIiIiIiIGhsLDhWwr1IRqaUiyNBIl4SDV0uFXnAoyRKG\nJodcj1EFneGgXhswiyB+x4zmRz1XqdD1T/S7XoczHIiIiIiIiBobCw4VWLHC3F66NGJLRYBlMd0S\nDu2ZdksiQqc/8A9PDgf5EQKvUgEAEuYH90o4TBQnjO2x/JjnKhXlrsOEAxERERERUWNjwaEC110H\nLFkCzJ0L3HKL9b3gQyODJxzUFom2TJvlGJ1ecAjSTgHYWipCpAi8CgX2hIPXKhXlrsMZDkRERERE\nRI3N+SdyCqyjA9ixQysuNDdbEw52njMcvIZGuiQcBsYHjH3t2XY0p5sdbRNuxQk/lqGRIVIEQQsO\nXqtU6NQ2
ERUTDkRERERERI2NBYcKpVLaVzlBWiosQyPLJBzaM+2uD/36A//QRPn5DYBthkOIFEGQ\nlorR/KglQdGUbAp8Hc5wICIiIiIiamxsqYhR0IRDoJYKJeGgP5QPTJgJh7ZMm+ucBr3gsLN/Z6DP\nXM2WirHCWPSWCiYciIiIiIiIGhoLDjGKVHDwaqlIOVsqLAmHbLvrffQH/o0vbTT2uSULjM8VYmik\nSk9deN0fcLZUuA25VAdhqjjDgYiIiIiIqLGx4FAjnjMcAiQc9Id7e8LBzVhhDCVZwsZtWsGhJd2C\nN694s/fnijnhMFGIqaWixJYKIiIiIiKiRsaCQ4yitVR4LIvpk3DIprKeqYXxwjj+sO8P2De8DwDw\n5hVvxpymOZ6fK+4ZDr6rVIQZGllkSwUREREREVEjY8EhRpEKDgePAwAkZROWdyw3drsmHKZWqdDT\nDZ9/8+cd9xkvjONX235lvH7L0W9xfdA3PlfUVSoKwWY4lFulYjQ/Cmn5B9GohRCALRVERERERESN\nhgWHGikoz8+W5+vNG4C7vo53Fe5Gd67b2O2XcGjPaPMbPvGGT+CxKx7Dl876knGsa8HBZVijLvaW\nCp9VKtw+R1EWXe/LoZFERERERESNjQWHGAVNOFhmOORbgN9fhWWl0y3HWxIOhTFIKY2Cg55wEELg\n1CWnYuGchcaxB0YO4NGXHwUAHNd1HFZ0rgiccKhKS4X0b6kA3NsquCwmERERERFRY2PBoUY8Wyo8\n9lkSDvkxy8O7fYUK9dgdR3YYx528+GQA3g/6gG2GQ8zLYtpXqfBKWozkR/DQ7ofwbN+znp+FCQci\nIiIiIqLGwoJDjCLNcPDY15JuMbbHCmO+K1RkU1lj++DYQWO7takVQJllMaV/wuGkhSe5nhdkWcyx\n/FjZoZEAcOezd+K0b5+G137jtdjVv8v1s3CGAxERERERUWNhwSFGlRQcLG0WcA6N1AdGAuYMB51a\ncDg0esjYzqVzALyTBYCtpWIqVfDq+a/Gg+97EPe/9378+dI/dz0vyrKYXp/jnu33ANDSFo/teczy\nWXRMOBARERERETUWFhxqJGzCQS0ijBXGjPkNQJmCw5hZcNCXw/Sd4SCdq1Q0JZtw2lGn4fTlpyOV\nSLme51ZwkFI6hkaWW6UCAPYO7TW2+0b7AHCGAxERERERUaNjwSFGkYZGTrEXHBIigUwyA2Aq4eDT\nUqGmIQ6Omi0VuabyCQd9hoOU0thWCwNJkXQ9z63goBYb9GMsLRUen6N3uNfY7huZKjgw4UBERERE\nRNTQWHCokbAJB8AsJDgSDlnvhMPhscPGttFSEWCVCnV4pFoYCJNwUNsp9M8dZJWKAyMHHNv2ggMT\nDkRERERERI2FBYcYxTk0EjBXn7DPcPAbGqm2MARJOOgFAfWB3pJwSARPOKgDIwEtlaAmE7w+h/qZ\nPVsqODSSiIiIiIioobDgEKPYCw5eCQefGQ6qQDMcphIO6gN9kIRDURYdRQB7wQEAhieHzev6fA6d\nUXBgSwUREREREVFDY8GhRsLOcABsCYeAy2KqgqxSobdSWJIIifIFB8CZcrDPcABsBQefz6EzZjhw\naCQREREREVFDY8EhRtVMOFiWxbTNcNALE3ZGS0WAVSrUB/qmZJOx7TU0EnAWHJhwICIiIiIiIh0L\nDjGq1gyHkixZlru0JxzUAoHKaKnwm+EQsaUCCFZwGJoYKvs5VYdGD6FYcrZrcIYDERERERFRY2HB\noUYqSTgAwL7hfca2fYaDEMK1rSLQKhURh0YCLi0VBWdLxdCkWXAI0lIhIXFo7BATDkRERERERA2O\nBYcY+SUcpDRnN7jNcHDbp7ZK7B/Zb2zbEw6A+xwHvaXCnizIJDPGtj7DoVoJB7Wlwu9aqr6RPs5w\nICIiIiIianAsONSQnnKoNOHQmml1Husyx8GrpUItQBgtFV4JhwpnOHjd10
/faB8TDkRERERERA2O\nBYcY+SUcgAgFB6WIcGDkAACgtakVCeH8tYVpqbAUHKZaKqKsUjFWGLO8dlulQhVkaCTgkXDgDAci\nIiIiIqKGwoJDjKpZcNDZV6jQ2QsOmWTGmL9gTzhkUmZLhdvQSMsqFSFmOPglHJIi6Xst1YGRA0w4\nEBERERERNTgWHGpILzi4zWso11Khsw+M1NkLDno7BeCfcDBmOJSqM8NBNz8337c9Q9U3yhkORERE\nREREjY4FhxjVIuHgNjAScBYc9IGRQJkZDtJlWcyIMxzcVqnQLZizwLUVxE3fCGc4EBERERERNToW\nHGIUe8HBLeHg0VJhP1af3wCUmeHgNjSyCgmHBbkQBQe3hANnOBARERERETUUFhxqqJYJB0tLhX2G\ng7IsZtmEQ0wzHBbMWVDRDAe2VBARERERETUWFhxiFDThUIsZDpaWioT30Eh9hoNllYqACYexvHWV\nimomHCaLk5Bu/0hERERERERUl1hwqCG/hINbEaIl3eLYFzTh0JLK4eKLgfXrgQP7AsxwKLmvUhGm\npcJvWcyFcxaWHRopoFVs3GY4qJ+ViIiIiIiI6p/30ySFVouhkfNa5rle237sxNAc/PIn2vYv7wow\nwyGGoZGVJhyWtS/DroFdODR2yPW+k8VJ3wIIERERERER1Q8mHGJUi6GRZ6862/Xa9oRDsmS2VIyP\nNFneK5dwCDw0shBuhoO94GBPa6yauwoAUJIl14QDB0cSERERERE1DhYcqiythAtCz3CwpRaO7ToW\nJy480fU+9oJDRpgFB1nwGRpZLuEQYmikX0vFgpxzaGR3S7fl8y9uXex5PsClMYmIiIiIiBoJCw4x\ncks4ZMxn+4oTDu864V0QHjEKZ8HBXKWilPduqdCHRlZ9WUyXhENXS5ex3Z5p95xPoeNKFURERERE\nRI2DBYcYlSs4FLRn+8gzHC559SWe97YXHJpgJhxKhfJDI9X0gPr+y7srn+GQFEl0NXc55jKo8yja\ns+2WpTzdVJpwODR6CG/8zhtx/g/Px0TBO41BRERERERElWPBocqalPEJ9oSDWqBwKzjYiwiv6n6V\n533sxYkmpaWi6JNw8Gup+NWvgPe9x5pwSCfSRlLBviym10N8d64byUTSkXBQWyraM+ULDpXOcPjI\nxo/ggV0P4K4/3YWvbfpaRdciIiIiIiIifyw4xChoS4U+wyGh/Ou7FRyO7ToWr57/agDA9y/+vu+9\nnQkHpaXCb4aDz9DIBx4AIK2phKZkk7Fc50h+xPKemnBQP8+C3AIAcLZUNCstFdl2tDa1uv1ohkoT\nDr/a9itje8v+LRVdi4iIiIiIiPzVbI1BIcQyAB8BcC6ApQAmALwE4A4A/ymlHPM5vdy13wvg2wEP\nf5+U8r+i3issvxkOiYT/XIdkIonNH9yMI2NHsGDOAt/72AsOaaWlojgZYIaDS8KhUABQsiUckmk0\np5oxPDmMoYkhy3teLRX6Z7cPjVQTDYESDhXOcFATGW5LjhIREREREVF8apJwEEKcD+BpAH8H4FgA\nzQA6AKwD8K8AnhJCrIrhVjLAV9WEHRqZVJ6/7QWH7duB224DRoeayhYbAJeCg1RnOCQhYH64TMpl\nlQqXhEOxCKDkTDjowx0HJwYt7+mrVDQlmyzFB6+Eg56UAGrTUuGVwCAiIiIiIqL4VT3hIIQ4CcB/\nA8gCGAJwPYD7oRUd3gXgAwCOAXCXEGK9lHLE41JBnQWg1+f9PRVe31MlBQd1qUwpgde9DujrAx5+\nGPjWt8rf276iRUopOOTzWhFBb0koNzRSTzhoBQfnDAe14CClNFbO0B/os6ms5Xp6wcE+NNJScKjB\n0Eip1JtYcCAiIiIiIqquWrRU3AituJAHcKaUcpPy3v1CiBcB/Bu05MNHAXy2wvu9KKXcXeE1YuM2\nNLLcDIehIa3YAADf/nawgoMz4WA+vO
fzWqFAf2DXCwqA+9BIvSDh1lKhJhwkJEbyI0ahQC04qOmH\n+bn52s9rSzjkmsyiSC1aKlQsOBAREREREVVXVVsqhBAnAzgNWivDN23FBt2XATwHQAD4iBDCex3G\nOhc24eBVcDh0KPy97Q/QqZL5MF8omG0SgDZLQU8bGDMcvFoqbEMj00kz4QBY2yr0VSrUoZSAH8oS\nkwAAIABJREFUll4AnAUHPfkAAIvbFqM1U92hkSp72oKIiIiIiIjiVe0ZDhcq299xO0BKKQHoQxw7\nALypyp+ppqLMcIil4GBrqVDbKJIiaQxwNFapcBkaWa6lArAWHNSEg8pr9YmzVp2Fy0+8HJeccAku\nOeGSqs9wUOnzJoiIiIiIiKg6qt1S8Yap7yMANvsc94Cy/XoAv6naJ6qipMsfzaMkHA4fDn9vZ8LB\nfHgvFKxtFGrCwW9opNZS4RwaqRYQghQcOps7ATgTCulkGre97Tbj9dCkddULuzgTDnoag4iIiIiI\niKqj2gmH1dDaKbZJKUs+xz1vO6cS3xFCvCKEmBBC9AkhHhVCfE4IsajC65a1YoVzn1vBodwMhygJ\nB/syj2pLhT40UpcUSaQSWq3JSDiUAiYcbC0V+tKYUkqj4JBJZXDTuTcBAJa2LcWblmuhlcVti3HM\n3GMAAP/8F//s+BncEg5q60MlMxykbRkQJhyIiIiIiIiqq2oJByFEBsA8aAUH35UhpJT9QogRAC0A\nllZ469OV7bkATgFwKoCPCiGukVLeUuH1Pb32tc590zHDQUBAFMwChGvCIWGd4WBZpcJnhoM6NBIw\nEw6FUsFYBSKbyuKq9VfhlMWnYNXcVcYynAmRwKNXPIot+7bg9OWnwy6Xzjn3NeWMe1SScBjNj1pe\nq9fqHerFT5//Kc479jwsba/0Pz8iIiIiIiICqttSoTbuDwc4Xi84+Dfye3sJwJ0AHgPw8tS+lQDe\nDuCvoC3L+XUhRElK+c2I9/B14onOfdMxwyHXlEOpZE6wtCccUomUs6Ui4CoVXjMc9HSD/lmEEFi3\naJ3jc3a1dOGMlWe4/gzJRBIt6RZLcaAl3WLco5IZDgMTA5bXasLh8p9djo3bNuIHf/wBfvf+30W+\nBxEREREREZmqWXBQG/mD/Gl6AtpKFc3lDnTxYynld132bwbw/wshzgXwE2g/7w1CiJ9JKQ9EuI+v\nhQuB7m5zSUtgemY45NI5qGEAR8LBbWikZ0tFsFUq1IKDfZWKMOY0zbEUHNTUQyUJh4FxW8FBmeGw\ndf9Wy3ciIiIiIiKqXDVnOIwr202eR5ky0NovxsLeSErpO21QSvlLAJ+FVtBoAXBF2HsEIYSzrUIt\nOBQK+ufRvsc6wyFt1mlyTTmjuAG4zHBImDMcdvbvxAU/vAAP7DTndlqHRlprUl4tFWpiwD40Mgz7\nHIeWdIv5c1Qww6F/vN/yWv28I/kRANaiSaXG8mMYy4f+T5mIiIiIiGjGqGbCQS0CBGmT0P+UHaT9\nIopboBUdAG3OwxeiXGRychJPPvmk5/sLF+pbPQB6fIdGqi0VJWWkplpwaApSqoE1VTCnaY6l4OCa\ncFCGMf78Tz+3XMuScLDNcAjaUhGVveCQa4op4WBvqVASDnqiYqI4gZIsISEqq8PtHtiN137jtUiI\nBLZ+aCsWtVZ9XikRERERETWQ3t5e9Pb2lj1ucjK+lfqmQ9UKDlLKCSHEQQBdAJb4HSuE6IBWcJAw\n5y/E/Xn6hBCHpj7P4qjX6evrw7p1ztkETtcC+IylYBBlhkOrOgnDhxACJy86GU/sfQKnLj4VxVfM\n99wSDnpLhRvL0MiACYe4WirUJTcBW8KhkhkOtpYKvXhRKBUshYzxwrjlnlH84k+/MBIVd790N953\n4vsquh4REREREc0sN998M6677rrp/hhVV82EAwA8B+A0AEcLIRI+S2MebzunWmT5Q/x1d3dj48aN\nnu
+/+CLwrncBWsLBfYaDLkhLhQzxiTdeuhGPvPwIzlhxBj73qLnfLeFQ8vhVCAgj/aC1VJSZ4TA5\n1VJRmL6Wir6RPnQ2dxptIm68hkba2x7iKDiMFcxrqv8uREREREREALBhwwZccMEFZY8755xz0KcO\nCWww1S44PASt4JADsA7AEx7HqWskPlyNDyKEmAdtmU4A2Bv1Ok1NTVi7dq3n+695jfW1veCgFhCC\nDI20Fyn8zG2ei/OOPQ+AOS8CcE847B7Y7XqNdDINIYR5b2ltL6h5S0WZoZE/e+FnuPhHF+OYrmOw\nZcMWYwlOO6+hkfblMsfyY9HGlirUf4tK2kCIiIiIiGhm6unpQU9PT9njmoL22Nepag6NBICfKtvv\ndztAaE+3l0297AdwX5U+ywZoQyMB4AG/AyuRTltfZ5Vn72LROqvBreBQKAD9ynxDtXAQhn1opL7U\nJQDL/AY7NQmhXUMARbMuFailwuOhPwjfhINLS8V7f/peFGURzx98Hv/z7P94Xtcr4WAvOMQxOFK9\nRiWDLomIiIiIiBpZVQsOUsonAPwO2oP+FUKIU10O+xiA1dDaHW6UUlr+pi+EOF0IUZr6+pb9ZCHE\nUUKIE/0+hxDiPAD/PPVyHMC3w/80wX33u9qwx8suA3LmH+gdCQe3GQ5HjlivFSbh4HWeo6UikcSH\n1n/I9Tw1CWEUO5TBkelEGplUxrhe3KtU2Gc4lEs4qKtPPNP3jOd1vRIO+goVOrUdIiomHIiIiIiI\niKqfcACAj0Bb6jIN4B4hxD8JIU4VQrxRCHEzgC9OHfcCgC/7XMdrmsFyAE8KIR6euvY5Qoi1Qoh1\nQoh3CCHuAPB/oS3NKQF8VEpZfhxoBS67DBgc1AoPalEhSEuFfUnMOAoOjpYKkcTXzv0anvjAE7j1\n/Fst5zkTDrAMjtSTEnrKodotFeVmOKj38moTAYD+CeuymHohwLWlokKWhEMFgy6JiIiIiIgaWbVn\nOEBKuUUI8U4AtwNoA3C9/RBoxYa3SilH7OcHvQ2A1wH4M5/3RwBcI6W8LeI9QtFnN4QtOKjzG4B4\nWirsCYdUIoWESGD9ovUYmbT+k6uFCbPgkHS835Zpw6GxQ7GvUhF2WcwlbUuw7fA2AMBLR17yvK4j\n4VCjlgomHIiIiIiIaLaqesEBAKSUvxBCrIGWdngrtGUyJwFsA3AHgP+UUvo96Unbd9VmAJdCKzas\nh7Y8xDxoP9sRAM8A+C2Ab0opD1b+04RjLzioMxzcWirsCQcptS8hEIo94ZCytVToFrdZVwhVZz0Y\nxQ4l4aAXLuwJh+HJYeMYtUgQVtgZDupqG3888EeUZAkJ4QzuOGY4eA2NjLmlgjMciIiIiIhotqpJ\nwQEApJQvQ5vX8LGQ5z0AwHPKoZRyGMAPp77qjl/CQX1PL0TYCw76eamQvyl7MsK+LKZucetiz+OM\nooUyw8HeUjFZnMREYQKHx8xoRldzV7gPq/ArOEyWnGkBNbkwPDmMXf27sKJzhe9xgHfCIe6WCiYc\niIiIiIhotqrFDIdZrdIZDkC0tgr77IcE3BMOzelmdGY7jdfuLRUpx/v2lSrUgkNns3m9sFoz3kMj\n7QkHKaUjufD0/qddrxs04RD7KhWc4UBERERERLMUCw5VFkfBIcrgSPs5KWG2SqQS1riE2lahvme2\nVHgnHABnwWFu89zwH3iKb8LBlhYYK4yhULJWY7Ye2Op6XXvCIV/KQ0rpmGHBVSqIiIiIiIjiwYJD\nlfnNcAgyNBKIJ+Hwl8veiqRIYlHrIqzrWWd5T22rGJoYcl7DZ4YDAAxNDuHIuLmeZ5wFB3UehH0e\ngr2IALgXHNySEIBWDKj6KhWc4UBERERERLMUCw5Vps5e8JvhECXhICXw+OPAgPNZ2nHOn/e8GXv+\nfg9e+vBLaE43W97rae0xtg+MHHBeQ7qvUqGrVcLhod0P4bMPfNZo
h3ArImzd7yw4DE8OW4ZL6iaK\nE54tFVJKfPq+T+PKn11pDMYMigkHIiIiIiIiFhyqTi0qFArxtlR88pPA616nfUnpf04+DyycsxDZ\nVNZxnfkt843tkbzZYuC2SkW5loqmZBOaU9aCRhitTdYZDp3ZTsxrmQcA6B/vx7X3X4t/e+TfALgn\nHP506E+OuQluhQlAm+PgtUrFIy8/gs89+Dnc9tRtuGXzLaF+BhYciIiIiIiIWHCoumoNjdyxA/iX\nf9G2n38e6OvzP8evLWN+br7rfrOlQkk4uLRUqAWHuc1zIcKu4amwJxyyqSzuvvRunHvMuca+Fw69\nAMC9kFCURfSP91v2uRUmAP+Eg34PAHim75kQPwFbKoiIiIiIiAAWHKou7AyHfuuzsnGe3bXXWl/n\nbc+1bgkHL+ULDs6Eg5pEGJwYxJExbYZDJe0UgLPgkE6mcVLPSbjtgtss9wO8Cwn2QoRXwsFvhsMr\ng68Y+3Yc2RHw02uYcCAiIiIiImLBoerCJhxGrc+/xnmqP/4RuP12675J23NtHAUHt5YKtxkOfSN9\nRitGpQUHdWYDYCYq2jPtxj690KAWEvTjAARPOBQmLC0kgNlSsXdor7FvR3/0ggOXxSQiIiIiotmK\nBYcqC1twGLE+/wJwtkP84AfOmQ3lCg5+LRXrF603tt/72vc6ryH9l8XcNbDL2K604JBMJC2v9QJH\nNpU1igp6oUEtJCxrX2Zs2wsM9qKCzq+l4pUhM+Hw8sDLoZIKTDgQERERERGx4FB1fgUHIbQvQGu1\nKJWAMZdVGe3FA7e2i0oSDl0tXbjnPffg2tOvxb+d+W/Oa5RZFnNn/05juzPb6X2jCPT7CSHQntVS\nDm4Jh6M6jjK27QkHtaiQFOYvxG9opFpwkJDYPbA78GfmDAciIiIiIiIWHKrOb4aDWnCQ0r3YADjT\nCW7Fg0oSDgDwlyv/Ep9542fQnet2nlPyXxYzzoSDnX4/wGyrcEs4HNVuFhzsMxv0uQwA0NlsFkTc\nEg76sWpLBVB+jkPvUC/ueekeTBYnLUUGJhyIiIiIiGi2SpU/hCpRrqVCLTi4tVPo56miFBz8Eg5e\n/IZGqgWH7Ue2G9uxFxyU2QxqwkFKicHJQeM9taXCnnDQUwsA0JHtwMHRgwDch0aOF8aRL+axf3i/\nZb/fHIfJ4iTW3rIW+4b34bo3Xmd5jzMciIiIiIhotmLCocqCtlRI6T4wUj9PZS8uuO0LsyymF+Mc\n6VwWszvXbbQnlKQZ24i74KAXOACzyFGURYzmRz0TDn4tFWrLh1dLxb7hfZCwDsnwSzjs6t+FfcP7\nAAD37rjX8h4TDkRERERENFux4FBlYQoOXgmHOFoq9HP6+4H/+A/g97/3/9xSKp/VJeHQlGyyzE3Q\nxVFwePzKx3H2qrPx3Qu/axkiaVmpYmLA0jrhNzRSbanQUxKA1lLhWKUiP+ZopwCA7f3bHfuMc5QE\nhZ6e0LHgQEREREREsxVbKqoszAyHoC0VQRIOXjMcPv954N//HZg7F9i7F8hkAtzTZYYDABwz9xhL\nOwUQz9DIUxafgo2XbnTsV4sFA+MDRmEhKZJY1LrIeK9/wjvh0JHtMLbdEg7jhXHLwEidX8JBLWjY\nCw4cGklERERERLMVEw5VFmaGQ9CWikoSDi+8oH0/fBg4dMj7c1tSFS6rVABawcEu7pYKlVfCoS3T\nZhkG6Ug4KAkES0uF29DIwhheGXQpOPjMcGDCgYiIiIiIyIkFhyqrl5YK/RrqtfzmOljOV2Y4qDMV\njp57tOM8r4LD5s3A3/898Mwz3vcsx1JwUBIO7dl2y3vlhkbqxgvjliUs9X1qwiGV0IotB0cPYmhi\nyPVzqdcoSus/PIdGEhERERHRbMWCQ5VFLTg0NVnPU0VpqdCLFJEKDgNLAQDZVBZdLV3G7mO6gicc\nLr8cuOEG4G/+xvue5VhaKpSE
Q3umHdlU1iiG2JfF9BoaaS9MAM4ZDmt71hrbXikHtaXCjgkHIiIi\nIiKarVhwqDK14FAoOGc4JKZ+A6WStaWitdXcjtJS4bVKRdCCg+W9h/8RR/Vegx9c/APLcpj2hIOA\nsBQFVLt3a9937fK+ZzlqiuHAyAHjYb492w4hhPG+I+HgMTTyyNgRxz3GCmOWhMMblr7B2LbPq1DP\n8cIZDkRERERENFux4FBlYWY4qAmHNvO5PtZVKtT9gRMOQ4tw3I4bcNHqiyzHLO9YbiyNCWjtCgnh\n/p+Ufi/75wpDLRa8PPCyuX+q0KC3S/jNcFBbKo6MOwsO44VxY4ZDLp3DqUtONd57qvcp18/FhAMR\nEREREZETCw5VllLWASnXUqEmHNrbreepat5S4fIacC6NqQ5u9Lqe3z3LURMOuwd3m/uztoLDxABK\n0oySeK1S4VZwKJQK2D2gXXtx22KcsvgU473HX3nc9XP5JRxKsoRiqYIqCxERERERUYNiwaHKos5w\niDvh4NZS4Zc2sN/T61i1raJQ8q4mxJ1w0IsCgFmI0N8vyRKGJ4eN9/UEQjqRRku6xdjv1lIBmAWE\nRa2LcFT7UZifmw8A2PTKJkj1F2i7vhe2VRARERER0WzEgkOV2QsO9hkOQQoOcS6LGWfCAYDxMA4A\nh8cOl71ebAkHl4KDml5Q2yr0hENLugWZZMbY75ZwUC1qXQQhBE5dfKpx/IuHX3Qc55dwANhWQURE\nREREsxMLDlUWZoaD2lLhV3CI0lJR0bKYLq91c7PmqhRqqkAlpVloiSvhsGdwj2O/19KYekGgOd2M\nTEopOHgkHHQLcwsBwCg4AFrKwa5swoFLYxIRERER0SzEgkOVNTWZRYXR0elrqQibcAjaUrGic4Wx\n3dXc5XpM0EGV5agFBbf9loSDsjSmXhBoSbcYS2cC5RMOC+doBQfLHIc9zjkOTDgQERERERE5seBQ\nZYmEucTl0FC0gkOcy2JGWqUC1lYQ1eUnXW60VXz3wu+WvVYlCYeWdItlVQxduYSD3lLRnGq2tFTY\nl8+0WzBnAQDg5MUnG/vcBkdyhgMREREREZFTqvwhVKnWVmBwUPsKukqFXqQA6neVCgBoy7Rh299u\nw6GxQ1jesbzstSpJOAgh0JZpcyQTFrUuAuA+w0FK6dlSoWrLtGFwYtCyb0FugXHd4+cdj+cPPo8t\n+7ZgvDCObCprHMeEAxERERERkRMTDjWgpxUGB61JgURC+wKmZ5WKOFoqAKA10+pZbLBfq5KEA2Cd\n46A7ofsEx3t6emGyOGkskWkfGqlyawfREw6A2VaRL+XxXN9zluPKFRyizHDYM7gHV//iatzxzB2h\nzyUiIiIiIqoHLDjUgF48GB62PnCrCYdSySw4CAHMmWMe55VwaGlx7vM6p1oJhyDUc0sla8ojLPsc\nh545Pehq0YoFbjMc1GJAc8o74TC3ea5jnz7DAQCWti01tg+NHbIcN14Y9/3MURIO//LQv+Cm39+E\ny35ymWXFDSIiIiIiokbBgkMNqO0RQ0PmtldLRS7nXN1CJ6X5Opcz90/XKhVBhElLlGNPOLx6/quN\nbbXgoCcc1PkK9qGRKr1ooRMQmNcyz7yvx3wI+z3cRJnhsKN/BwBgojiBAyMHQp9PREREREQ03Vhw\nqAG1PWJA+WO119DIlhYgpUzXUB/Y1XYKNQURNOEQdIBjnEUCr+JHFPaEg1pwcCsK6AMjgakZDh4t\nFfaEw7yWeUglzF+CWzFDV40ZDuoSo+WuT0REREREVI9YcKgBr4JDIuFecPBLOKiFBa+Eg1vbwnQm\nHOK8lj3h8Jr5rzG2g7RUJBNJ15Uu7DMc1PkN9vvaWxzKJhwizHCwFBzKXJ+IiIiIiKgeseBQA2pL\nhV/Cwaulwivh4FVwcHugDzvDwf6e17KYQXgt0RmFb8LBZWikvaUCgOscB3vBQZ3fANQ+4TA0Yf
be\nlJsRQUREREREVI9YcKgBNeEwqKy8aB8aqRcc7C0VagEhSEtFHAWHaiYc4iw4vKr7VcZ2W6YNAto/\nqJ5CsLRUpJoBwLWtwj7DQV8S0+2+enpCVy6BUI2WioOjB/HMgWdCX5eIiIiIiKhWWHCogSAzHEbN\n5+KKWyrcigMzpaWiLdNmeZ1rMv8REiKB1owWJ3m271lce9+16B3uNd7XEw5ugyMdLRW2gkMlCYco\nQyP9WiqGJoZwzFePwau//mr85LmfhL42ERERERFRLbDgUANqS0W/8qyqznDQ5zcAWiEhyNBIr2Ux\nq9FSEecqFZUkHOzpArslbUsAAEOTQ/jsg5/FB37+AeO95vRUwsHWUtHa1OpoofCd4VDlhIOU0jfh\n8NS+p4yixz3b7wl1bSIiIiIiolphwaEGgrRU2AsJXgkH9bimJu0LCJZwsM9haMSEQy5tJhrUgZG6\nG8++EWsWrDFeqw/uXi0Vx8873ihG6OKc4RB2aORofhQS5tRPe0FDbRM5Mn4k1LWJiIiIiIhqhQWH\nGvBrqUi4/AaCtlSk0+4FB7dCQj4fLmlQrzMcNqzfgO6WbjSnmvHDt//Q8f6Zq87EH676A3rm9Dje\n8xoaubp7tVGM0NlbKrKprNGKoa5SUSgVUCj5/0BhEw5qkQRwFjTUAsSRMRYciIiIiIioPqXKH0KV\nCrJKhco+NNKrpcKr4OCVcAjTJlHNVSoqKV7Ma5mHXdfsQlEWMadpjudx3bluy/wGQGmpsCccuo5H\nNpW17LO3VABayuHAyAFLwiHIkpVhZzgMTQ5ZXtvvoRYgmHAgIiIiIqJ6xYJDDQQZGqnySzhEbakI\nmnC4/npg2zZg7Vrr/npJOABwtD+46W7pdp43lWKwD410a6mwJxwAs+CgznAo104BxJ9wsLRUMOFA\nRERERER1igWHGvAqOKhDI1X2goP6gB6kpSJowsH+eutW4H//b21782bre3EOjazkWkF155wFB7+W\nCnvCwe18fWnMgfEBlGQJCZHAeGG87GcJO8PBUXCwJxzyTDgQEREREVH94wyHGqi0pSKuhEO5pMEr\nr5jbe/ZY36unhEMQrgkHj5aKVZ2rHDMcUglnLU4fHClhriIRpKUibMJhaMLWUmGf4VCwznCQUoKI\niIiIiKjesOBQA3G2VFQz4TBqJvUxZnuOrpdVKoJyKzjoCYd0Mm3s68x2Ip1MB2rTUJfG1Oc41KSl\nwmeViqIsOo4nIiIiIiKqByw41EA2ayYW1D9GBy04xDE0MsgMh2oVHMKsjhEXt5YIPcXQO2QOk1zR\nuQIAkE6ksa5nHQDgb0/5W9drdmTMpTH1lSqqMTQyzCoVANsqiIiIiIioPnGGQw0IobVVHLE9F3rN\ncIjaUiGldj23B/qwCQc7Kc3rh1UvCQc9xbD9yHZj38rOlQAAIQTufe+9eOKVJ3DaUae5XjNMwiGb\nyhrzHUK3VNhXqfBpqQC0topl7ctC3YOIiIiIiKjamHCoEbWtQhdnSwVgFhCiJhzsqQa7qEtjTssM\nB5+hkWoiYGXHSmO7LdOGM1ae4VjFQqfPcACUgoNHwqEtY/7C4x4aqbZUAEw4EBERERFRfWLBoUa8\nCg4Jl9+APeEQpKUCMIsRcbRUuImaTJiWVSp8lsU879jzjH1vXvHmwNfUV6kAYCyNqaYNOrOdxnZr\nkzkpNO5lMd0SDkRERERERPWGLRU1oq5UoYuScPBqqQC0gkMu5z00slxrQ7UKDvWWcLjh7BswMjmC\nkxaehLNWnRX4muUSDvNz8420gSXhEHKGg2OVCp9lMQEmHIiIiIiIqD6x4FAjbgkHrxkOUVsqGiXh\nUIuCw9zmuUiIBEpS6wNJiqSxOsXRc4/Gve+9N/Q11RkOxtBIJW3QnevGC4deAAC0ZipIOOStCQd9\nFoSOCQciIiIiImoEbKmokTAzHKrRUlHp0Eiv6wYxHUMjEy
KBruYu43WQZS/LKZdwUNs4Kkk4lGup\n4AwHIiIiIiJqBCw41EitWioA9wRBHAmHRhoaCVjbKvR2ikqUm+EwPzff2FYLDqFXqXBpqegb6cMd\nz9yBgfEBZ0sFEw5ERERERFSHWHCokbAJB7XgoD6g+7VU6MWIeks4TMfQSMCaOJBSVny9cgmHNy1/\nEzqznUiIBM4/9nxjfxxDIy/80YW45H8uwYa7NjhbKphwICIiIiKiOsQZDjUSZoZDNgtMTJivi0Xg\nE58AenuBefPM/WFnOJRLGsykoZGANeEwkh+p+HqWGQ5TCQd1vsLCOQux85qdGJwYNFbEAOJZFvPx\nPY8DAJ7Y+4TjeBYciIiIiIioHrHgUCNBWyoyGa0QoSYcHn4Y2LhR2+40V170bKmot4TDdMxwAKwJ\nB/vcgyjaMm0QEJCQZsJBSRs0p5vRlmlDW6YNgxODxv7QLRWTQ56vBycGkU6kLe+zpYKIiIiIiOoR\nWypqJGhLRfPUH8bVgsOQ8vx5RHm25CoV/jqzneUPCiEhEsbqE8YqFUpLhZpqaEqav5hKh0aqBicG\n2VJBREREREQNgQWHGnErOHR0eBccUgGyJ2ESDqWSdf6D23EzLeEwp2lO7NfU5zh4JRx0agqh0hkO\nqsnipCU9ATDhQERERERE9almBQchxDIhxJeEEM8JIYaFEIeEEJuEEB8TQlS+ZqH7PZuFENuFEKWp\nr+3VuE8Q9paK1lbgne/U2idUbgkHL14JB68Ewfi49fVMTzhUo+Cgr1Sxf2Q/7t95Pw6OHjTeUxMO\nyUQSCaH9csPMcCiWimXbP0rSulzIkfEjsQzFJCIiIiIiilNNZjgIIc4H8D0AbQD0J6NmAOsArAdw\npRDirVLKl2K+9ecALFfuOW3sCYePfxzo7g7WUuElTEsFUL7gMGZN6jvEtSzmTEg4AMCbvvsmy3tq\nwgHQUg4TxYlQCYcowy0LpQJG8iNV+XmJiIiIiIiiqnrCQQhxEoD/BtAKYAjAJwH8OYAzANwKrRhw\nDIC7hBC5mO/7EQBjU/d1WQ+idtrbra//7u+0714FByGc6Qe7MC0VwPQlHKZrlYqj5x5tbC9tWxrL\nNZe0LfF8T004AOYchzAzHPzaKfywrYKIiIiIiOpNLVoqboSWZigAOFNK+UUp5eNSyvullFcB+Di0\nYsCxAD4axw2FEAloxYwEgOsBTPvT2OrVwJo1WhHhjjuA3FRpxavgAJRPOYRNONgTDNPVUlGrhMMb\nlr0B71nzHhw992j8+JIfx3LNa0+/Fu941TuwZsEax3vZVNbyOp3U5jiESTgMTQyVP8gFB0cSERER\nEVG9qWpLhRDiZACnQUsxfFNKucnlsC8DuBzAagAfEUJ8XkpZ6SPpNQDWAngOwBcBXFnC/kAsAAAg\nAElEQVTh9SqWSgFPPKGlDNT2Cr+CQyqlrS7hJc6Winze/15+1y1nuhIOQgj810X/Fes1j5t3HO54\nxx24+6W7cfbtZxv7s6kshO2XqSccwhQcmHAgIiIiIqKZotoJhwuV7e+4HSC1aXf6U2EHgDe5HReU\nEGIZgOugFTk+JKWs0eNteU1NzlkOlSQcwrZU+CUcys1v8Ltu2PNqVXCoptcvfb1lJQp7OwVgrlQR\nZmjk0CQTDkRERERENDNUu+DwhqnvIwA2+xz3gLL9+grv+XUALQD+S0r5YIXXqrpatlT4JRzKtVP4\nXbec6WqpqKZcUw6nLD7FeO32wF9pwkE/34ta5GDCgYiIiIiI6k21Cw6roSUNtkkp/dY4eN52TiRC\niHcBeAu0mQ3/EPU6tVSupcJPnMtiBik4xLVKxUxIOADAm5b7h3EqHRrZ3dLte6w6wPLQ2KHA9yAi\nIiIiIqqFqhUchBAZAPOmXu7xO1ZK2Q8tBQEAkZYTEEJ0ALgBWoHjH6WUB6Ncp9bsBYesMncwjpYK\ntWhhLziox1Uz4TBdy2
JW25tW+BccKh0aOT833/fYZe3LjO2+kb7A9yAiIiIiIqqFaiYcWpXtIJPw\n9ILDnIj3+3cACwA8IqW8LeI1aq7aLRWZjLntN8Ohli0VMyXh8GdL/szYXtezzvG+kXAIMcPBknDI\n+Scc1ILDwdGGqK8REREREdEsUs2Cg7pGYJA/8U5AWx7TOX2vDCHEXwB4P4A8gKvCnj+dErbfQBwt\nFWphQE1MTNcMh5macGhON+POd96Ji46/CDe99SbH+/rQyKIsouTbUWQK01JhSTiMxpdw0Oa4EhER\nERERVaaaBQf18dZ/+p0mA60dIsB6CSYhRBOAW6Ze3iilfCbM+dMtasIhldLObYSCw0xNOADAxasv\nxo8v+bFlgKROHfoYNOWgJhV65vT4HqvOcIir4PCVx76CBf++ADc94SygEBERERERhVHmb+gVUdf3\nC9ImkZv6HqT9QvUpAMcC2A1tOcyqmpycxJNPPln2uJ6eHvT0+D8wAtELDumpFRkboeAwUxMO5egz\nHABtjkMmlfE5WrOjf4exvbrbf35qW6YNHdkO9I/3x9ZS8a+P/Cv6Rvvw5Ue/jP918v+K5ZpERERE\nRGTV29uL3t7essdNTgafB1ePqlZwkFJOCCEOAugCsMTv2KmBjzloCYeXQ97q41Pn/QbA+cL+BK/R\nixk5IcQlU9sHpJT3hbwX+vr6sG6ds1/f7tprr8VnPvOZssdFXaVCLzSUW6VCLTj4zXCwv+em1qtU\nlErAM88AJ5zgbD1pBJaEQ8CVKrYf2Q4AyCQzWNW5yvfY5lQzulu60T/eH9vQyIHxAQDA0ORQmSOJ\niIiIiCiqm2++GdddV/W/l0+7aiYcAOA5AKcBOFoIkfBZGvN42zlh6E91l099+ekG8MOp7fsBhC44\ndHd3Y+PGjWWPC5JuAKqfcFCHRtZLS0XQ67z3vcDttwNXXgncemu0e08nfYYDALwy+AraM+1IJpy/\n1J+/8HNsP7IdV62/yig4rOhcgZZ0i+/1W9ItmNcyDy8efhEDEwPIF/OWVEVYUkqM5rX/ECYKE5Gv\nQ0RERERE/jZs2IALLrig7HHnnHMO+voad0W6ahccHoJWcMgBWAfgCY/jTle2Hw55j6AT7oTt2EiT\n8ZqamrB27doop7qqZUvFdK1SETXh8OtfW783GjXhsOYba3DK4lPwyOWPWIoO249sx9v++22QkNh+\nZDvGCtovaWXnSjSn/eenNqebLStZHBw9iJ7WYIUuN2OFMcip/1mMF8bLHE1ERERERFEFbcFvagoy\nDrF+VTuo/lNl+/1uBwitB+KyqZf9CJk6kFImy31Bm+8AALuU/WeE/WGqoRotFUGXxVSPq8cZDhNT\nf2Rv1LYle9pg0yubsOmVTZZ9v972a+Mh/z82/Yexf2XHSjSnyhQcUs2Y1zzPeF3pHAc93QAAE8UJ\nrlZBREREREQVqWrBQUr5BIDfQUsXXCGEONXlsI8BWA0tcXCjlNLyOCqEOF0IUZr6+lY1P+90mA1D\nI6OuUqH/PBMNmu5XEw66u1+62/K6NdPqem6QhENLusWScKh0pYqRyRHL64lig/7DExERERFRXajF\nKL6PQFvqMg3gHiHEPwkhThVCvFEIcTOAL04d9wKAL/tcZ0b+uTVqwiGugoP+R+x6TDjoP0/DJhwS\nznkK92y/x/JaH9Jot7IzQMIhrQ2N1FU6OFJNOADR5zjs7N+J32z/DUqeI1uIiIiIiGg2qHrBQUq5\nBcA7AQxAm+VwPYBHAdwL4APQCgnPA3irlHLE6zozlX31haAJB73QkFaeacutUmEvOADmyhNBCg5R\nV6mIknAoFs37NWrB4fDYYce+x/Y8ZikyHBk/4npuoBkOqWbMa4mvpWIkb/2fX5Q5DkfGjuDkW0/G\nmd87Ezc9cVNFn2e67R/ej+8//X0cGXP/HRERERERkb+aLDYopfwFgDUAboCWZBgBcATa
EMmPA1gr\npdzhdwnb90gfo8Lzq2I6h0YC5sN/vQ2NVNsoCoXoxY7p1D/e79hXlEXct/M+32MAbZWKTDIDAfM/\nkK7mLssx9paKFw+/iOvuvw4bt5VfRcWNvaUiSsHh1idvNQof12y8JtLnqBcX33ExLv3JpbjqF1dN\n90chIiIiImpI1V6lwiClfBnavIaPhTzvAQA+j96BrrGikvOrqdotFS3KyopuBYNCQRssWW8tFfZU\nw+SktXjSCD647oP47Y7foi3Thi+c8QVc/curAQA3PXEThiaG8Lbj3+ZacJifm485TXMAANlU1li5\nYln7MhwaO2Qc15y2Jhy+8vhXjHN6P9qLjmxHqM/raKmIMMNBHYqZSWV8jqx/f9j3BwDAln1bpvmT\nEBERERE1ppoVHMhd1ISDXmhIJrVrSOlecMjl/O9fi4RDlJaKmVBweOcJ78TyjuXobulGd64b12y8\nBvlSHvdsvwf3bL8Hl6651PGQD2jtFLrmdLOl4PDUvqcAaAMpEyJhmeGgGy+MY9vhbVi/aH2ozxu1\npeKFgy/g+1u/j3e/5t34/d7fG/vX9awLdf96ky/lte/F/DR/EiIiIiKixsSCwzSrtKVCCK34MDFR\nPuHgZjpaKoJcx74yRaPOcThl8SnG9gXHXYA7n7vTeP3Iy49gecdyxzmWgoMyOHJp21JjuyWt/WLV\nhIMqSjuEvfgR9Brv+7/vw2N7HsNXN33VktjQUxqNqlDS/sehFx6IiIiIiCicmsxwIG9RWyrUVgp9\nu14TDlFmOLglHBrdbRfchtsvut14fWDkgOtAwpUdZsEhmzJjHcvalxnbeiFiTtMcZJLO1gWv1S/8\nRJ3h8NiexwA451E08rKaJVkyVtnQCw9ERERERBQOCw7TzF5wUFedCJJwACorOOjHug2U9Do2LHuB\nIcoMB3vioRG1Z9vx12v+Gn9x1F8AAIYnh7F3aK/juBMXnmhsqytVLGlb4tgvhLAMjtQNTgyG/nz2\nloqoy2LqoqQs6oVaZGBLBRERERFRNGypmGb2goP6OmrBQX3AjzPhEHWlCCYcrObn5hvb+0f2AwBW\nda7C5SddjmKpiAuPv9B4/7Rlp+GPB/6I05adhs7mTmO/3lIBaG0Vewb3WO4xMBE+4RC1pSIpkihK\nZxWp0oLFdFKLDGypICIiIiKKhgWHaWYvOKjiaKngDIf6M79lvmPf3Oa5+ORpn3Ts/8o5X8E7T3gn\n1i9aj6f3P23sV2c72JfLBGrXUlEoFVyLDUBjt1SoRQYmHIiIiIiIomHBYZqpBYeErcElbMJBf0iv\ntxkOca1SMVO4tUCo6QVVOpnGG5e/EQDQlmkz9qutFupSmbooLRVRlsX0K0o0csLB0lLBhAMRERER\nUSSc4TDN1CJDxjb7zy/hoBYcWlu170ND2vKYYRMOUpoFB7971jLhMJMLDmpLha4j21H2PLXgoLZU\n9A71Oo6N0lIRZVnMsbw5/OO1C16L/3nH/xifs6ETDkqqoVAqQEo5jZ+GiIiIiKgxseAwzdSEQzZr\nfc8v4aC2VLRNPYcWi9rwx7AJh3zePKfT/Q/txvWjiJJwsLdUzIShkTq3gkNn1ucffkrPnB5jcOT6\nnvXG/s+96XOOY2tWcCiYBYdjuo7B21/1duNnmSkJB7fXRERERERUHlsqppn6h9MwBQc14dBm/uEb\ng4PhCw5qO0VHB9DX534sEw7xiJpwSCfTePjyh7F572ace8y5xv7LXnsZxgpjSCVSuPqXVwOIp6Ui\nSMFBPUafK5FJZQKfX6/sbRSFUgHpZNrjaCIiIiIicsOEwzRT/3JvLzgEbalobze3BwbCr1KhFhzU\na9nt3QvceCPwwgv+17TjKhVW3S3OGQ5BCg4AsKx9GS5afZHxUA9oD/gfPvXDeP+J7zf2xTE0MkhC\nQW2pMAoOSe2zNXJLhT3RwDkOREREREThMeEwzcaV
PwJX2lIBOBMO5WY4FIvA8LD5eu5c72Nvukn7\nfsstwLPP+l9XFUdLxUwqOERtqSgnm8oilUihUCrULOGgtlRkU1nL94nCBKSUEH5LsdQp+8oUXKmC\niIiIiCg8JhymWdSCQ9CWimzWf+nNQkEbNqnrcq6w6PDcc+HaK+JoqZhJMxw6mzuRFNZfbtCEgx8h\nBNozWkRlOoZG6itn6OkLCVl29kFJlvDywMthP2rVMeFARERERFQ5FhymmV/Bwa+lQk04qG0Q9oJD\nMum8rqpQCJ5wUKnn6O69F7j7bud+tlRYJUTCsTSm17KYYbVnpwoOcbRUhFwW095SEeQab/vvt2HZ\njcvwhd99IcxHrTp7gYEJByIiIiKi8FhwmGZxJxwGBpwFB/tymyp7wcGecEh4/BcyaEvsP/YYcMYZ\nwNlnA9/8pvMeqtk+NBJwznGII+EAmEtnDk4Mhl7KsdKWCnvCAfCfA5Ev5nHXn+4CANz53J2hPmu1\nMeFARERERFQ5FhymWbVbKlKp8gkHtaWiw/bcqyYpVOo5APDUU+b2Bz4A7Nljvo6ScJjJMxwA5xyH\nuAoOektFvpQPvUpEpS0V+uwGNeHgdw31vSgtINXEGQ5ERERERJVjwWGaqQUHexLB3lKhFhaCDo1M\nJMIlHFpbrdcOWnDI257HPvQhc8nPOBIOM2mGA+AsOMQxNBIwWyqAcA/xUkpHwiFIS4Ul4TDVUqEX\nHspdw1JwiNACUk32hEO5WRREREREROTEgsM081sW055wUOcreC2LOThoPuDr54dJONgLDl7FCnvB\nYdT6rIq77gL+8AdtmzMcnKqVcNBbKgAYK1V89fGv4i3ffwue6n3K6zRMFCdQkiXLviAJB8sMh5At\nFXWdcLDPcGBLBRERERFRaCw4TLMwLRVeBQevGQ76+WESDnPmREs42AsOAPDy1OIDUVapmOktFeoM\nh1w6h3Qy7XN0cHpLBaClBh7a/RA+vPHD2LhtIz734OeM9/YN78Pp3zkdf3XHXyFfzDsGRgLxtFQE\nTThMFid9ixO15pjhwJYKIiIiIqLQWHCYZmFWqVi1ytxeuNDc9mqpiCPhUEnBYWzMvIf9nuXMpoRD\nXOkGwFZwmBjAP9zzD8brnzz/E2N7w10b8OCuB3Hnc3fiFy/+wtFOAUQYGum2SkXAhIP+eeuFY4YD\nEw5ERERERKH5LLxItRCmpeLd7wYWLNCOO/NMc7/XsphBEg7FYvUSDvq+KAkHFhyiUVsqvrPlO3hs\nz2PGaz1VUSgV8LMXfmbs3/TKJhw/73jHtYIkDsq2VARMOABaIsPeajJdmHAgIiIiIqocCw51pFzC\noa0N+OpXnee1tprbURIO1So4jI1pgyNL1tEAkRIOM3loZGdzPAMjAevQyO9v/b7lvcNjh1GSJdy/\n837L/t7h3ugJh/wMTThwhgMRERERUcXYUlFHyiUcvJIKTU3mueoMB71gUYuWihHnCACMjrqnGYpF\ncwULLzN+hkPOnOFQrYSDXVEWcWj0EH649YeW/S8cfCH6DIeCc4aDukpF0GUxgfpaqYIJByIiIiKi\nyrHgUEeiFhwAc46D2yoVlQyNjLpKhb7Pq33Cnnqwm+ktFSs7V+LEhScCAC487sLYrqvOcHCza2AX\n7nzuTsu+5w8+j5G8s+AQelnMSlsq6inhYCswcFlMIiIiIqLw2FJRR8q1VPgVDtrbgQMHtILDnDna\nvihDI3O5eFsqvNonikVnQUU10wsOCZHApis3Ye/QXhzVcVRs11VbKnRtmTZjicwfbv2h48H+yPgR\n7Ozf6Tgv9LKYlbZU1HPCgS0VREREREShMeFQR8olHLwe/gFrwiHqspi5HJBIWI/3uufgoPV12IRD\nuTkO9paKcjMcJieBjRuBgwf9j6sn6WQ61mID4N5Scdaqs4zte3fea2yrrRxP9j7pOC/yspgREw56\nUaQeOGY4sKWC
iIiIiCg0Fhym2S23aN9bWoBLLrG+F6Wlolg00wdhh0bqyYg4Ew5RCw5hEw6f+hTw\nlrcAp59efj7ETObWUnHmSnNJk6f3P21sn3fsecZ25IKDW0vFDBgayYQD0czyrae+hc8/+HlLkZSI\niIiqjy0V0+zyy4Fly4CVK4EO2+zAsC0VOnvBoVzCQT9eX+0izmUx/Voq/IQtODz6qPb92We1AZZ6\n8WS2sbdULGtfhuO6jjNel6Q5POPco8/F7U/fDsC94DBRmICUEkIIz/u5rVKhDo0MuyxmvbAnGphw\nIGpcW/dvxRU/uwIA0NPag8tPunyaPxEREdHswYLDNEsmgbPP9n5PFSThoAqySkWx6J9wCDs0MpEw\nB0LWMuGgtlxMTMzegkNrU6vl9ZoFa7BwzkLHcbl0DqcddZrxWsIZC5GQyJfyaEp69/LoRYOkSCKd\nTAOwtlSEWqWCCQciqoLdA7uN7V39u6bxkxAREc0+bKmoY5UWHIIkHEZHzYf5OBIOXV3WfV4Fh3IJ\nh7AzHOwFh9kqmbD+R7NmvnvBYdXcVVjcuhi5dM7xXiph1iHLtVXoLRVqqmEmtFRwhgPRzKEWELni\nDBERUW2x4FDHwrRU+BUc7AkH9XV/v7kdxwyHefPMfX6rVFQz4TDTVrSoxJoFa9CWabMUBABtWU4h\nBI6fd7zjnLnNc41tv4IBYLZU6PMbgAqWxayjlgr7QwkfUogaV1EWXbeJiIio+lhwqGP2hIO9AKFq\nd84K9Ew4qAWHI0fM7bAFB3044+SkWUCYaz6rVpRwCFtwUN+fzQkHuzUL1kAIgQW5BZb9qzpXAQBW\nd692nKMWHMolHPT39fkNwAxJONhnOLClgqhhFUvm/+GweEhERFRbLDjUMb8Cg10cCYcwLRWFgvlg\nrw6MzOWA5qlnz0pmONiLBkw4BPe6Ja8zto/pOgYAHG0VKztXAgAuOcG2NAqArmazLyZoS0UcCYd6\nWhbTMcOBLRVEDcuScCgx4UBERFRLLDjUMXvCwU+YGQ5hWir82jj0tgp7waGlxdxfq1UqOMPB9M3z\nv4krT7oSG/96ozGPwavg8NZj3oq1PWst74VJOOgtFWrLhmWVijAJhwAtFfuH90PWYN1TxwwHJhyI\nGhZnOBAREU0fFhzqWJiCg19LhV/CQW2pCJNwANwLDi0t8SQc7AWGMEMjwyYctmwBrr4a2LQp3Hn1\n6oT5J+DWC27F2Ueby5/YCw56S4UQAp/+i09b3rPMcPBJKJRkyXjfq6VivOizSkUxXEvFp+79FBZ+\naSE23LXB97g4MOFANHOoqQbOcCAiIqotFhzqWKUtFfr59pSC+jrq0EjAu+AQR8KhkpaKsAmHq68G\nbroJuPLKcOc1EnWGQ0IkcFTHUcbrC467wHi/q7krcMJBTS94tlSESDiM5kctD/b94/345G8/ie/9\n4XsAgB898yPL92riDAeimUMtMjDhQEREVFshHmmp1uJqqajGKhVAbRMOfgWHUgnIK8+DYQsOu6aW\nZd+5M9x5jURNOCxtW4qmpPmLFULg0SsexZce/RIuOv4i3LfzPuM9v4KDPr8B8FkWM8QMB0Cb49DV\nos2Q+Mbvv4EvPPQFCAi8YdkbMDw5DEArTFQbEw5EM4cl4cAZDkRERDXFhEMdq9YMB/X1uPLM59ZS\nkUwCCY//SganZvx5JRwmJrwLBX4JBynDFRzCznuwG5t6bh4dNVfemGnUgoM+v0G1onMFvnbu13DG\nyjMCrzKhz28AbC0VERMOgLWt4sVDLwIAJCR29u80Cg6FUqHqf6W0Jxr4V1GixmWZ4SD5v2UiIqJa\nYsGhjoVpqYg6w0HllnBIpbwLDuUSDgAwPOx+rl/Cwe09v9SC/b2wCQe96FIsWpMSM8mi1kXG9tFz\nj/Y9Vk0rBE04WFoqKkg4qIMj1eLD0OQQRiZHzHsrxY5qcCQc2FJB1LC4SgUREd
H0YcGhjgkR/Fg9\nnaAKskqF2zXsCQevpEW5GQ6Ad8HBL+HgVjAoFLTWiSDHh0k4SGlNeYxWP60/LdYvWo8zV56JBbkF\n+OC6D/oeG7TgoL4XV8JBXRpTLTgcHD0ICTN+ohY7qsGxSgVbKogallpkYFqJiIiotjjDoY6VG6yo\nSqeBlSuB7dvNfV4JB6+lLt0SDkEKDiPmH54dCQf9GDu/hINXwSCfd//slSQc7IWM0VGgoyP4+Y0i\nmUji7vfcjZIsISH864yBEw559xkOCZFAOpFGvpQPXLDQqUWG/nFzwMi+4X2W46o9x4EJB6KZw5Jw\n4CoVRERENcWEQx0rN1jR7qqrrK+jJhyOMhcwwJIllSUc4iw4eO2vpOAwbnvmnakJB125YgNgSyj4\ntERYWiqUhIN6jYpaKpTt/cP7rfeucksFV6kgmjk4NJKIiGj6sOBQx8IkHADgAx+wvtaHOoad4XDW\nWcD11wP/5/8A55wTruCQywVLOIRtqQizP0xLxZjtuXWmFxyCiNRSkbYVHKbmOAQdOqlTEw7q9r4R\na8KBLRVEFJRlaCRbKoiIiGqKLRV1LGzCoaMDuPBC4Kc/1V5v3qx9D5pw0AsOySTwiU+Y++NKOKTT\n5lDGek042AsQs1GlLRVAPAkHtaWi1gkHtlQQzRxsqSAiIpo+TDjUsfnzzW2vIoHd9deb2x/8oPu5\n5Voq7NRVKtRBlkFWqVALDupsCL+EQ6UFhzAJB7+Wiq9+FfjkJ60zKmaDwMti+rRU6AWIqMtijhfG\nMVk0f5H2GQ5VTzgUuSwm0UzBoZFERETThwmHOnbSScCVVwIPPgjcfnuwc1avBn79a+Cxx4C/+Rtt\nnz3h4DZ4MZn0HiapJhzmzgUOHdK2wyYcMhnz4b1eEw76z/LUU8CHP6xtH3UUsGFD8Gs2Oq+Ew8HR\ng3j7HW9HZ7YTd7zjDkvKwLOlwiPhIKU0rp1KpIyHAD3hoCYdAJeCQ60TDmypIGpYXBaTiIho+rDg\nUOduvTX8OWedpX3p0mktmSCnVhV0Kyy0tnovw6kWHObPL19w8Eo4qPeNMsOhlgWH3bvNfer2bKAW\nHP50+E94aPdDeP3S1+PrT3wdD+56EACwcdtGz2UxAbOlwqslI1/KG8tcLsgtwCtDrwAA+ie0Ngp1\nfoPb65rPcGBLBVHD4gwHIiKi6cOWillACOvDvltLhT6/wY1eqAC0hIPeYhEk4TA8bG6rnyFKwqEa\nQyO9Cg7qNcMUMGYCteDw0+d/itO+fRq+8ftvYNPeTcb+Xf27LA/9jhkOUwmHyeIkpPof0BS1ELG4\nbbGxfWhUq2ap8xvcMOFAREFZVqngDAciIqKaYsFhllCLDE1NzjSDX8EhrzxrNTcD7e3adv/UM2Hc\nCYdatlR4rVIxmwsO6rKYujuevQOb9242Xu8d2uvfUqFcQ53FoFMLDvNz85FKaGGrw2OHAThbKuxq\nPcOBCQeixqUWGZhwICIiqi0WHGYJ9WE/ldK+VAsXep+rPuhns0BXl7att1aEmeGg80s4hG2psO/3\nSzjs3g3sU8YBeCUc1P2zreBgTysAwEO7H0LvcK/xeu/wXv+WCnXwpMscB/u5c5vnAgAOjWn/Udlb\nKOyYcCCioCwJB85wICIiqikWHGYJNeHgVnBYs8b7XPUBvrnZLDj092uFA3UVh7hXqVDPqTThsHkz\nsHy59rVnj7aPLRVObgUH+wP43qG91lUq0u6rVADuK1WoBYdsKouuZu0/Kr2lolzCYTQ/6vt+pTjD\ngWjm4AwHIiKi6cOCwyyhpguSSWfB4TWv8T7XXnCYN898ffiw+ZCeyWjXrjThoN5PbfUIOsPB67hL\nL9XmUfw/9s47PI7qXv/v0WrVZRVbluVubAw2xuBCtyEYQscQLoQWcjElkEsIISHkl3aBEJLcEIjJ\nJbEJNQSC6cYmgLkYsMEY9yJ3W5ZlWZbVe1
lJu/P7YzQzZ86cabuzTTqf59Gj2dnZ2dl+zjvv9/0G\nAsDvfy+vS1bBYccO4P33rYWbcOEJDixsSYUhwyHVucMhIzVDdTh09HYg0Bewz3CIckkFOykRkxSB\nIHnRdakQGQ4CgUAgEMQUITgMEliHA915ArB2ONCTWrqkApDLKpRJuiI0eJnhQAsOTh0OZtvR3SaU\n2ySj4FBfD8yeDVx5JfCPf3i//9y0XIzKlYMczxh1Bncbg8PBoqSC16nC4HDI0t5UjV2NcS+pMGQ4\niJIKgSBpocsohHgoEAgEAkFsEYLDIMEqw4EQ4KSTnO2HdTjU1xsFB9rhQE/+w8lwyM3l78tse95l\nBTprIjtb/p+MoZFLlmjHdPvt3u/fl+LDqltX4bkrn8PK767EiBxjwEdzd7Ma8AhwQiPpDAcXJRWA\nnOMQ79BIQ4aDKKkQCJIWncNBZDgIBAKBQBBThOAwSGAdDnSXiokTtQm4HXSGA2DvcKAJp6QiHMHB\nSVtMxTmRjKGRZs+vl0wsnIjbZ96O7LRsnDvuXO42ZU1l6rJVScWt792Kn6z4iW6gbyk4dDbYOxyi\n3aWCzXAQDgeBIGmhBURRUiEQCAQCQWyJmeBACBlLCHmCELKbENJOCGkghKwnhE97dhoAACAASURB\nVDxACIloCkUIOZEQcg8h5CVCyCZCSCUhpKv/fsoIIUsIIfO9eizJCOtwqKvTLluVU7BkZLhzOJgd\nQzglFZFmONAoJSXJWFJh9vxGi/PGnacuE2hKVVmjJjhYlVRsPLoRT379JFaUrVDXmWU4AHJJhW2G\nQ6y7VAiHg0CQtIi2mAKBQCAQxI+YCA6EkCsBbAdwP4DJADIB5AOYBeCPALYQQiZGcBe/BPC/AG4B\ncCqAkQDS+u9nPIBvA1hKCPmMEFJotpOBDOtwoLEKjGRhHQ51ddqE3I3DoaUFOHSIv120SipoeE4G\nen0iCw4Z9pmOnnLDtBswsWAiirKK8L1Z31PXKy6DFJKC7DS9RYYXPLm3fq+6bJXh0NCVAA4HkeEg\nEAwYRFtMgUAgEAjiR9QFB0LIDABLAOQCaAPwCwBnA7gAwLMAJADHA3ifEOLQ2G+gD8DXAJ4EsADA\npQBmA/gmgHsBlPbfz3kAloX7WJIZtksFjRuHQ1qa3uFQWaktu3E4/Pa3wIQJwNKlxu2iERoZCukv\nJ7PgwD6WaFOYWYh99+5D1Y+rdG4HhWnDpyHNl6ZbR5dUKBxu0VI7bUsq7DIchMNBIBA4RDgcBAKB\nQCCIH6n2m0TMQshOg14A35QkaT113eeEkP0AHofsfPgJgN+EcR+3S5JkNg37lBCyCMCbAK4BcBYh\n5ApJkt4P436SFq8cDmlpeocD3flBERrS0+WMCEnS3zbdOAfF0qXA1Vfr10Wa4cATCJqa9JcVYYEN\njVQuJ3KGQ28c5r4pJAUpvhSMzB1puO6s0WcZ1tElFQqHWy0EhwRyOEiSZBAYxCRFIEheRIaDQCAQ\nCATxI6oOB0LIaQDmQnYXPMeIDQpPAtgNgAC4jxDi42xjiYXYoFwvQRY1FOa6vY9kh81woEMijzvO\n+X6cCA6E8MsqeILDkSPGdZGWVPC2ozMrAHcOB3abeOMkFDNaOBYcXDocEinDIcT5OhElFQJB8iLa\nYgoEAoFAED+iXVJBn7t+ibdBvxjwcv/FfADnR+lY2qjlGFfBxx/W4bBsGTBvHvDaa8YSCyusBAda\nxHAqONAlGQrRCI2srdVfTuaSCtbhEMvjK8ktMaw7e8zZhnVch0PLYXx95Guc/fzZeOyLx9T1idYW\nk1c+EZSCkFjLjkCQwEiShB999CNc8a8rcLTtaLwPJ66ItpgCgUAgEMSPaAsOc/r/dwDYZLHdKmr5\nnCgdy43U8p4o3UfCQosB6emy2LByJXDDDe72k5Ym/ynOgxZqXkhnN/ByHNLSjOuOHDGWXkQjwyES\nh0OiCw
HJRQBfmPG/D74+MPoAACAASURBVItNrmrfQ2VDKvzEqX5fzjeTJqWLMVHv\nb3MSFthI9eF7sVWXsV2TBAfmcKjdhP8762pYRVUKDv7PMUmYhH3srcEpjIBNmzZhzpw5WZfvqluO\nJ4TUa/bYQ0fHJ0xwngcNGgAXXhgs1zSDT9mf/qQG2MiRamgde2zwuPVYsAweHNw+7LDg9tChbt3P\ngxD2cOjeXdffe08fLu0D1Y47An/4gyt72WVqTI0YodN1AkFviLapLEJWcFi7Nmho+m7acR4O/nv6\n4IOg6/qAAfHJNaN4/nl9bdwYOPBAXd9vPyd0vPqqtm/CBPcAu2SJejpYioqCLpdRQsJPP6VP/ViV\nHg6AjkDbGUoqQ1hA+eKLyj2ARQkOQHpIREmJzvwCAA884Fz68xFSkcTD4emngaee0j7y6KPJ6vU/\nl/AjR3Gx5gj59a+Bs8/WfZX1cFi0yK37yU/zTZRBX92PVFUZQlQTeecd7Tv+DEY1hZogOIR/N9WZ\nw2HFiuB0t6R28f333wGY87/lnXei7cpNtXz+0yoTHESkGIC9LXTKVNYY0xJOcEg4gVf1sGLFCvTq\n1Svr8sADD1R3UwkhJI3rrw+GSmQa+T7jDDVwbRLKZs2CySPDD3u//W1w+/DDg9s77OByNLz7rjPE\nfENoq610dB9Q4/tf/3IPU507A0OGOE+KhQtVgLj99uj2hwUHEaBfP+ddEfZwaNzYjWBHvaeZM4NC\nSb9+yT0c/Pd5wAFO6CksdF4OK1dqAsawwTl2LPDf/+p6+DOP8nAIezcAFfNwyCQ4hI2uTz7R76dj\nx8qNuEW9n/ffr3h9UTkcgHRhxM/6DgDXXKMJPsOzVFSVh4P/Hv0EpZnwDS/fIBYBzj/f9dVHH9Xf\nUGVzOPiCw6xZyXOO5EqUwFTdXgV+f6/utmwObr9dvWNuuy36flKd+L/V+ujhUF6efj0/zGRzMn26\nzlIU9kwjySkqegBAr/8tw4ZF25Urwn9StYyqnhZzATQvQ1djTKZrdQ+dU2Np164dZs+enXU555xz\nqruphBCSxhZbAM8957b9XA9J8LXUM84IHuvZE+iUkpcLC4H+/dPPtyERmzY5d/k4wQEA7r7brXfu\nrDkZRo1y+8aPj3aPB1xIhT9TxYwZwCmnaPLGsOBgTLqXQ/fuLsxj9mw3K8Suu6pXRC6Cg2XffYPb\ngwa59fvvd4ai733y0EP6GiU4hPMnbA7BIWx03XefXmPlSuDxxzNfKxNRgkNl8jjEeThkExxE9D1t\nLg8Hf6aPOXOS1Rvn4TB2rC4+b79duZCK0tKg91BJSTDEIp9ECQ7V7VXg9/cffqi7id0svmiYbYab\nzUlJSfD/Iqng8PHHmqj3zDPzk2+mOgWHn35Kfw/VIbxs2qTTaN90k4ZwkopRUnIOgNn/W0aMiLYr\n2/kJn2ohDbMXqRQzABwI9V7oBSDu76mft/52FbepUhQWFmLPPfes7mYQQkiFOfJI9Rz44APgyitz\nO7dfPxUsli5NFxysGDBihE6xGZUfYsAAl+X+wgvVqPUf5rfaSoWF1q01ttEPBbAJJXv31hHc++7L\n3FYrNBxzjLqsW8rKgLPOAn73O7evfXt9bd7cxVQWFur23nvr6K5vpB53nL6GQyq22SZoHNn34bPr\nrsFt3xPENxSvuUYf5oqLdUYCIP3BsrhYjXz/WSRKcKjqkAp/hN5PCJorVSU4GBOckSUsOETNgvDx\nx8FEqflIGhnl4VBSEpx1ZcEC9UiIy69iCedwENH3GRYbABWxKhNSsXRpusgyYwbQt2+y83PB99zY\ndls1eP33l4myMp3BoGNHYJ998tcmv7+Xl2sbO3TIX/01Df93WJM8HMJiVFJvqtGjdWrVjz7S/6Ze\nvSrXjuoUHKLuy9UhOHz/vfs+7P8TyY0NG4CNG7cB4OYVb9FCZx8KU1iYJB1izaWqPRxe
8NaHRhUw\nxhgAp6U21wCowAznhBBCcuG004B77lEDOVeOOw645BKgSZPoY598otNoRjFggFtfsCDdcN1qKzXq\nwuEZTZsGc0j84x/qbfDcc8C4ccAbbwTLt2rljMMhQ9Rg+ugjZ+zPng3ccIMrb40H38OhfXs1cOy0\nopaGDQHrxBb2cOjWLbi9445IIyw4bL+9TrfnY4wKOjaEZdEiNfiiHizDRnqUgbBqlZuRI4rKJI0E\nggb8gkr4KfrvxeYISTriH4VtV2FhMOdINg8HQEf0fSGla1fXp3LxcPBHiDt5AaZWcJg/P5hzo7xc\n+2o2fMO8uNh9X1Hi0uTJlQup8MMpLFWVx8EaMca430VJSbLEpI8/rvegAw+MnuGlooT7e10Oq9i4\nMdi3qntKUh+/XYC2LdN9zWITEANuauKKUl5evTkcoq7l/y+E8/dUFX6/CH8vdYVnnwVuvTVZH6sI\nUf+7TBpZAURkJoDp0LCKPxpjovTm4QB6QGeoGCUiAUc1Y0w/Y0x5aonQ7QkhhNQWtt022ggHdPTX\nihh+WAUAvPBCcMrMBg10lOq444BTT1UDwzcobTiFpVMnDfkYMyZ6lNT3cAjvC099efzx+j4A9Szw\nr7vNNkHRwuassBQUBPNgWC67LNiuiy5S74kePXS7tFQNqCjBwYZL3Hqr5sm46y53zHfhzxRWkYuH\nQ3l55lHPhQvjw1yyYWOBGzRwQs+qVdGj80mwwkCjRio6WDJ5ONjvU0T7HaDfze67u5CKXDwcrOHb\nvr1+HzaJqxUc/HAKS9S+MOEH0+XLdYTfetjssYcTt2bNihYikoZU+Aab5Z13Kv49Z8K+r5Yt1VPB\nEmfk//ija9/UqfpaUqI5OPJFOKSjLgsO4RCKmuThEGXYJslf4IcDffhh5doQlaCxpng4DB+uoWN3\n3FH17ajrgsPMmTpYcfXVwIMPVs01okRUCg4V52LoVJeNALxujLnaGLOPMaa/MeYBALemyn0G4M4M\n9WSMujLGnO4vAH7jHR4UOt6nEu+HEEJIJfBzFuy/v3paDBgQzNdw5JEqIrRrB0ycCBx6aOY6Cws1\nq7qloCC63H77ASedlL4/ysPBhin85jfB+i66yK0bEwyraN06KHaExZVu3dJn9wCAP/5RR9mXLNGH\nkNGjdb8VHAD1HIjzcNi4Efjzn9Wo9Ed2/Wk9kwoO9n2vXh0d77x6deYR/g0bsodw+IjoCPysWc7D\nYZttgsJMRUerfQ8HX3AoLlaDec891dPFf3j2wyiscdGtm4pRuYZUbNjgDLguXbS/WFHNGvtR4kI2\nr47y8vSH1e++C+YX6NjRvZfy8uiR3Yp4OFghbu1aTSyYb+wDd9u2QQ8sa/Q/8gjQpw/w2mtatmtX\nYKeddDpNv51Jk29mo6Qk3aDKh+BQXq4eNP60t1XFzz8nF4fCHlNV6eGwbJnOojJuXLLyUYZttnCC\njRuDSQ398KWKECXSZQo/yzdxgoMIcO+9em+6//6qb4ffL9at23yeFZsL3wPyppuq5hoUHPKIiHwI\nYAiAtdBcDjcDeBfAZABnQYWEhQAGi0hlJnZ5JLQcldpvAFwVOvbHSlyHEEJIJbjiCmC33TRs4qWX\nNPxi8mT1VLA0aQJMm6YP9mFvhzhsckcg80PoJZcEt7fYwsXLR3k4bLGFS67Zt6+KFj5+WEXr1m52\nDCDdw8EXRcK0aaN1+V4J3b2Uyh9/HC0aLF2qo3ZRIoAvOGSKd/YfYm2by8qiDdIkxlYueRwmTdKZ\nOnr3dnV36qQGuiXKpT8JcYLDpk36EDl3rj5Y+mEgvuBgsTG11sNBJFniQH9k1b4fG3azcKG+X19w\nsN4P2Twc1qxJNyCXLw8aVr7gEEdSDwf/8//97916ZcJdoigtdaPFbdoAW2/tji1frsb5eeepWDR8\nuIZS2b77/PNBYaoys5v4rFiRLrzlQ3C47jpNINu3
b9V4iohoUt2+fVVIPfzwZAkTk4RoZeK994Cj\nj3bTAGfi5pvV62zYsGQzHVREcFiyJPi+P/qocokjo34zNcHDYc0a5zW1ObxSwkJUXTKURYL3tqh8\njc88o4MNlREjKDjkGRF5CcBuAO6CejKsB7AamkTySgB7ikim1C8Seo0rk8tCCCGkGvjVr3SU6aWX\ngkn0omiQw7+Ub6BmGm3p3Ts4U4QfnxklOAD6cPHii7qEQzJ8waFNG+eV0LJlUAQB0vM3ZMP3cHjz\nTWfk+kLEsmXxo7m+wWln2IjCPjAXFAQ9NqIebpMYW7nkcXj55fR9YcEh3x4OmzYBn32m6yJuxhQg\nmGfEssce+upPm5rEy8E31O378QW0f//bjbh27er6x6efBmexCBM3k0NYcOjXL/035D88hwWlxx9X\nUSw84mxDFpo0CSZbTRL6kQv+A3ibNukeDpMmud/2p58GE4rOnBkMB1iwIN17YMECN01oUqJmyMiH\n4DBpkr7Om1c1niIPPgiceKJOXQhoiEmS8IOw4Z/Nw2HZMp3q9w9/UOFk+HC9T555ZnYhxeYqKStL\nJl5VRHDwRT9A+0RlkixWt+AQda0VK4L3mp9/rrppay3hflGXwirmzw/+7qOE95tu0r5/ww3pIXpJ\noeBQBYjIUhEZLiI9RKS5iLQRkX1E5A4Rif1bFZGpIlKQWmI9E7wySRZ6OBBCSB3jvPOcu7oNSYgj\nLqllVEiF3X/kkdECSTik4qabdPaP555LHxnJVXDYaSdnME6b5vYfcIBbX7o0eorC1q3VG8Oen2m2\nBysstGoV9LDYbTfg5JODxnUSY2vaNJ2F5LHHspeNMjQ6dgyKNdkEh/fe0/e6yy6ac8OOrlqvj7Dg\n8MsvQePLHxHcbbf079l6OPiCQ5LEkX67reBw5JFu3y23OGGhVy93nbKyzIkjox5KozwcWrUCjjoq\nWM5PXBl+kL7qKjV+/dlrysqc0dalSzDDf749HPz3FeXh4E/pW14OPPWU2/7kk2BdIkFB5JdfVIAZ\nOlS9rJIS1d8rKziUlwcNRP+37VNWpiFn2UIOZs1STwHfyJw8Ob1cVC6OMLl6ONxyi067+sQTmkPD\nzliwenV2scIXAz7+OHvb8iE4AJULq4gSHNavzy2RbGXwRWD/9xEWnavayyFOcFi3TmeBuvrq/ExB\nWh38+9/B7aKioKgg4u7tJSUVF8QpOBBCCCG1jE6dgLfeUiP33HMzlx0yxHkP+HOIx3k4ZMJOa1lQ\noHknOnXSBI4DB+qsGz6ZQiqiaNIkPSwD0ASYtu5ly6IFhx9/1FCR3XbT7U8+iXehtw+xLVsGBYe1\na9Wos6OxQNDYipup64UXNJ74tNMyZ4UvLY02rHMJqRDR7/u993QEe+ZMnUWkvNw9JDZqFMydsWhR\ntIdC69ZaNvw9WQ8HG1Jh256NKMGhVy83cu/nuhg0KLkxH2V4RXk4AMD//V+wXJs26XkkAH3QtR4C\n33/vZu5Ytsx9jl276mdk++TcuclCS5Liv69wDodFi4BXXgmWz2ZU+UbYu++69/TYY8kzz1eF4FBU\nFBQH4gSH8eM1Z8zpp8eXWb0a6N9fcyH4cedRYU1JQpNyyeEgEvRQmjkzaBBnEgM2bAiOIocFoyii\nZpPJNjVmlODgJ44UUQP5179OllAyLu9JRRPbxrFhgwo54dFz//O193Yg/T8g029jwYLk+VviiAup\n+Ne/gIce0v/AfCZuzRfffBM9/bJPWHAoLw+eE54dxXrL5QoFB0IIIaQWss8+6tqbbcrqwkJ9mHvn\nHeCvf3X7/USFSb0R9t9fH5a/+CI4kwaQLjiEQyyS4IdPWA44wI1Uf/ll5geefVLzQ5WXR4dVlJe7\nh+Wwh4PFumUDQWOrZ89guaZN08+9997g9saNOsq//fbAk09Ghw506qSfnU3AmWkEadas9BHLlSvV\n4IoLqYgL+bAe
Kb7gsP32rh358HBo0CA9L0mvXtpv/fnXM4UrJA2pAHRU3/fY+OILzWQPBI2O8Gdi\n3fz9/Tb/hG3nL78An38e385cyeThMHFi7m7ifh4H32Bftw74z3+S1VEVIRVhT4Np06JHg/1Qn7fe\niq5rzhwXInLrrS6/SNT3ksTDISqkIi40YtGioMH/5pvB45lCOMJiRK4eDlYczubhECVI+PeL+fPV\nQP70U+DOiNT1a9dqEkb7e4gTbfMdVnH44XqfP++84H5fcPDvv2HBIU4oevBB9QTbbbfKJXqM83Dw\n7xdJRKTNybx5KpbuuGP8f0BRUbTY6/ezcJ/KJWeRjy84WO/KjRuDYsbf/qahUZsjuWxVQsGBEEJI\nvaRVq2DIAQCccALw978DY8emT4eZiV13jfZECAsO/gh5Uvw8DoA+fOy5ZzB3hOWYY9z6hRfqq5+v\nIhxWsWqVeiFYY6dVq/Q2AyrOLFsGjBihhp/FjvwDasT6SSotTz4ZfEi+5x419pYu1VHZKOzIthVo\nli2LNzbHjHHrvmDkjw4mFRysV4svOPjv0SYXBaIN0TBWcGjWLOgx44dVACrKFBTojCg2R0gmD4ek\nIRWA1nf55W7/Hns4Y80XHKwrfHj7pZfcPis0+J4Y+czjEBYcmjcPfua54ns4+KIZoPkqklAVHg5h\nw3/5cud98PTTmgdh1aqgIeOPvj/zjBqbjzySLsbZKVD9UJ2460YRHv31E3mG+e9/g9vhzziT4BD2\nPPAFwjisUVtQ4PriihWZZ8Wx12nY0PUlX3DwP78oA/nii9XoHzBADUFfcPAF1nwKDmvXOoFs7Nig\nt4J/Hd/DIXxPi/NwsKP3X3+d2fssG3GCg/+dVzTZb1UxYYKKZ2VlQa89n9dec+t+AuiqFhysmAsE\n74Ovv66eTkkT/NZUKDgQQgghKQoL9WF/6ND81NeokRr+W2yhxkFF8I1oQEcxAeD449PLHnWUZu4f\nMQK4/nrd5wsO/oiviBq+Tzzh9g0cqEZvmDlz9HojRwZHTn1jfKednJHrs2GDe+8rVwI33uiO+SNs\nBx+sry1auHr9sAr7kPf448CoUW4WDRvHv+WWwalVp093I7NhwSFsXFush4P/mfn5Mvz9b76pD6Fj\nxkQbG6Wl7iHVTolpGTjQvbeLLnJeKM2aOY+Wjz/Wz+6TT9JHIuNCKmxIRNOmQa+G4cP1fbRoAVx6\nqfNw8B9iw5/Jp5/q52fzYRQWAoMH67rviZHPPA7+g7Z92Pe9HAD9jqLEPZ/dd9fXoiL9TDZtCnoL\nABqekSTRnS8uWJHPnxGgIkQZ/tOnq7F2yinAHXfo78Q3ZHzj8IortF9cfnm6J8MLLwTPGzjQhdBk\nExxKS6OFtDjjNRziEhYFM3kfhAWH0tLsrun2+2rb1t0vgHhvFRF3nR13dB4BX3/tRoz9Ni5cGPTm\n+OUX4NlndX3FCr23+r8ZP39PPgWH8G/RF8d88db3wgt7yMR9Z34fqKgHgki84OCLPxXNbVBV+KJd\n3H/Aq6+69XPOcevVJTjY30QuCbRrIrW8+YQQQkjN5h//0BGrM86o2Pm+wXv99cAOO+j60KHAYYcF\ny/burQbGX/7ijOeddnLG53vvuQfT997TkBJAjz/8sBoy+++vhsSkScAfUymWS0qipxn0wz26dQNO\nOknXCwuDD8n33acGxV//Gh/r/NhjGg8+d67zsgjPVPHqqzp96qWX6nt8/HE3Sn/yyToFoBUWpkxx\n5xYWurAIIHp6UcB5IfTqBTzwgOb3OP98d9xOjwro6O7Ageql0adP+gjU0qUuz4P/PgA1AGfMUDf5\ncIJTOyJdUqJ5HXr21D7gu9n6D6R21Hb1amdcdewYFDiaNFGDdtUq/Yys4LBxo2tjeIR0/nz9zq2I\nccghzu23qgQHXwCw35c1li1jxkR70lhatgwao/PmqReGFQjsg3tpqTMmi4ri+6
UvOFghA0iWlO8/\n/9Gkiv7sGUB0uMO0adonrMH78svBfmqnPvz+e7d/9er0OPmw4LDLLs5T6KuvMufcWL48Onwiyj1/\nw4b4MA9LLh4OQPawCl9w8L2EfK8rn1Wr3O+yc+fg92dzx/ht3LAhaDC/+mrwd/fii5tHcAjPWjJ2\nrLtvW8Fhq63cf0EUUf2zpCRoOFd0dpSffkr3RrH3JP/zrGmCgy/aRQkOZWXu99SiheZ6smQSHD77\nrGIJMq3g0LChCmIW+1n+/LO7d/jhfLURCg6EEEJIFVOZh4Xu3dX4GDtWPRcsxug+awS2bZvuDQGo\ngWVH0L//3j0sPfSQKzN6NDBsmDNSDztMR7P79IlvV+fOahxbceC3v1Uj4N13dSTplFOc4ffll8A1\n16jwAKTn2Nh2Wx3JHjQomAcjPFPFAw+47TvuAK67zm2fdZYap9YY9V3DGzXSXAbZ8MMezj5bZxzZ\nYgu3b9ddXbjH6687o2n+fBWU/IfOqPwNPltvrW0KT7PqG/NTp+rrrFnAZZe5/b7gsMsubt0a1VGe\nJoDrh35yVBv/HxVSMWGC2z7uOLferp0ztubMSRYLvmgR8Kc/qffK//1ftItwOKQCCH5vjzyiXjy+\nZ02DBsHkkl26BI9/+GHQ1d8ftXzlFZ3NYbvttI477khPBmpH/Fu3Dhp4mUJqysr0+zrySO33XbsC\nf/6z+37sKHPjxk5QeeutYAhIlCjx4YfpbvDhpKvz5wcN8O7d3ehpSYl6XHXvrsazZcMGba8VDIFg\nv4wSHKZNy55TI1fBwY64r1+f/v5/+cUZ/23bqhBnv48pU5zHwtdf6+/G924A1CvGFxzsaHfYC8MX\n3vz+D6iA5AtT2aYQrihRv0Wbo8FP8OsngA0TJTh8/XVQcMpVcBBR4zrqe125Uj8b//NZsmTzzd6R\njZUrg/8J8+eniwRz5zoRwPdCAzILDmvWZJ+RJQp7rdatg4K4vQ/6vwEKDoQQQgipUgYNUo+GsFvl\nttuqa/9JJ+lof0FB9Pm+cPDII2rsPfOMbrdoER2eET4PUAPp3HP1AenKK1Vs+PRT9ZQ49VQts+++\nLu+EDesAgNtvd8bcn/4UjD/2jWwf/4HvnXeCrtMbNrgHsyFDnNAQJZIUFqpx4rutRhGexjSMMcHR\nc5/nn1evC0BH//zcB1GCQxx+zL3PAw84A8j3BIia+WTbbTNfw3o4ANoXfvopenYCK/A0bJg+veb+\n+7vzw4lBwzz3nPaJm29WQ+8f/9DQnXAS0yjB4frrVVh69VXnJeQLCt26BftP165Bw3LevGDCyIsu\ncnVPnaoimIj2p+HDVWizRpmI83DYeuv0aTqjmDoVOOgg4K673L4NGzREok8fFRusIdy1q/sclyxJ\nN3DDRAkOUfieBzvvHOz399+vRqM/O8+112p7Z8wInmeJMl7936Iv+PgsXhw/8hvn4VBaqrNu7Lxz\ncNaNcLiNMc7LoaRE+8fKlZp7p39//S361+jcORguZvM4hI1nKzgUF6eHaqxYEYzx9wWHfM4uECUE\njBunn6X1pGjVSj8Df5pbnygDOJxTIVfB4corVazq3Tv92MqV6Z5jpaWZ82tsTsIzkKxbl37P88Mp\nDjtMvcfsf0ImwQGoWFhFnOBgvzs/xIiCAyGEEEKqjb320sSM4fAKnzPOcA8s992nnhF2ZPvkk4Oj\n+D7dugWN8OOPB/75T324tCPFHTtq8s3wSD2g04SG27XTTvrgesopbp9vQPr4Hg7PPRc9FWXLlsGw\nBGvA+ViPCj8kIookU6GG62jXzr33kSPVM6JbN80zYdlpp+z1WqJyaFguv1xd3q1x06RJ9CwmcR4O\nFt/DYd26+CSaNlxlwAB9KPa56ir3vv/613SDq7hYR6S//Va9T8Lf3VdfqZDmnxclOHTooPUfeqg7\nttde7jvdb7+gl0eXLmqs2mlQZ850hnS7dv
p59e+v22vXpk+B99prbqrHb791o+pbbx00rMePVy+B\nvn01xKe0VH8T/fs7gaOgQBPR2mSxc+ZoX7Du6DvtFOxP4RkiwsydGy849OqVLkhuvbX+PqJmx5k/\nX0fLZ88O5j6x+CJO2HgVcV4UhYXAmWdGt2n9emdUlZS4/uR7H3Tr5vrjxx/r92WFqBtvdKO84SlT\ngeBsLy++qN+dLTdyZHAGom7d1CvC9lkrOIQ9HKx3wZtvOq8J3xj0fyu+aPrcc/q9PvJIevLMXLFC\nQEGBa++772pftB4DdjYhX/TwiRKJwjk8vv02eSjIvHnqAQREJ/dcuTLa8yEfYRXvvqsidabpLKdM\n0d/2pZdGi1xRU56GPUl8Mcneb2yoQ1GRvu+ysmgRJVfBoaTEeXm1bh0U+GxyZ19wqEjC6ZoEBQdC\nCCGkjrPddiosAGoAXHqpO2bzNERhTDCHxFlnuf1Juemm4Pb996uhfOaZ+sC+3XYazhFF+/ZuNM1/\niLTtANRzwh95zkVwiEpImI2wh8OIEe5BHNAcA/4D6VFHqfCSFBsiYzn+eHfNJUv0wdoa5m3bqndH\nePQrm+DgezisWxd88I4Ky/HzWFj22AM4/XRdX7NGDTzLDz+oR8OWW+qIszVqjj5aH6btDDArVwJX\nX+3Os8bills6wSCKtm01WeiFF6pRakOGbLsaNXIJ9b76yrl5H3KI9t0BA1x5m7PA9w6ynh3jxrl9\nBxygngv2wf+pp/T9TJ+u4tLee+uUg5bOnXXEdPx4NaCjvGu6dcsugvlk8nA4+GAVN3ysGBXn2fPu\nuyqQReVt8Eexw8brvHluNHvAgMwi2eLFarjvvrsKC/36qcBhhZwuXZxHypIlOouNpbRUxUkgKDjY\n32m/fk6seOml9Gk5bb/u3FmFzy23dN5Gn3yi/SIslFlB4YUX3L7bbov2Huvf3wkzs2er+DVsmH4X\nUR4cSVi71glP++4bTCLrCz9WcIjzcEgiOADJvBxE1PsnU56CVauiDfEvv8w++0gmXn5Zv+ebb9Yw\nh7jZGm68UQ30UaPS+wEQ/bvx73s//uhyGnXp4hLTWsGhvFwFj6IiJ576U0hnS3gaxu8fHTqoiGqF\n/ylTXPiKpbZ7OEBEuCRYACwDIB07dhRCCCGktvHRRyL6GOOWgQNFyssznzd7tsiee4pcfHH2snFc\neqle78or049lq3PpUpHu3V2br3zuVQAAIABJREFUDzhApKxMZMwYkaeeij7/tNOC7/Pcc3X/mjUi\nBQVu/ymnBMt98kmy9/Ob32j55s1F1q7VfcOHB+saPFjknXeS1Zep/W+/LTJ+fLDNjRrp+u67a/mz\nzw5ee/z4zPVfdZUrO3lysO1/+lOwrsMPj/+Oli0T2WILLdekicjKlbp/5Mj0vrb11u54UZF+dn4b\nRETatdPtHXbI7fMqLRW54QaRESN0XURk2LD0NjzzjB779NP0Y+PGiWy/va4bI7J4sUjXru74V1/p\nuc8+K9KgQfr5/nLPPSIlJcE2rl0rsvPOwXJjxmhfbtMmvq5GjYL9P2556CGRWbOC+846y31PUef0\n6uXWf/1rkVdeEenSReS3vxX55ht37OijtZ5Jk/QzPussd+zee0U+/DC+XRMmiDz8cPzx888Xue22\nzO/trbdEnnzSbY8a5T7XIUPc/saN089t3FjvYZbjjnPHnnsuvXyrVvqdbLON69fr14sMGJBeduNG\n7VNRbR49Orc+bHnnHVfH2WcH71Hjxrn1YcO0/LXXxn9umzYF6z700PQyDzyQvU3//W/2/geIXHJJ\n+r4WLfT3dMIJ2e/1K1aIXHCByIMP6vaUKenf6e9/L/LddyLFxe680tJgmT593LXmzNH3uPXW6W07\n80xXxyOPuP2XXeb2X3GF2//mm9oX7faJJ7r1QYOyf44+/vVuvln3+d/PZ5+5/5mCApGOHTsKAAGw\nTKT67e
Jcl2pvQG1ZKDgQQgip7Qwa5B5oTjhB5OefN9+1ww+/ufDDDyqOtG0rMm1a9vLl5Wqo/+EP\nIgcdpAamZf/93WcwdqwEHkB/+CFZe2bPVlHgtdfcvrIykTvu0Adm38CpCIsWiRx7rMidd+r2xo1q\nCIUfmA86SI8vWRLc//bbmeu/8UZX9pprRHr0cNtz5wbr+vzzzHVdfLEre/vt+jn86lfBOho0EJk4\nMXje3Xe74w0b6kO+3e7Vq2Kfm8/o0cE2NGrkxKHycpEOHYLtW7UqKJT4BubAgcG6H3tMDdEttxQ5\n55zgda66Kr5NM2cGy06dqvt9oxkQadrUrffoIXLqqenffXixvwvfQBs50r3fbOf/5z+urIgadPbY\nLruIDB0afd4334j89FNw3y67uPU771TxLe66t9+u4k54f2GhWz/++OD3+fjj7jN9/PHoeq0oZ41X\ni/8dh4VJu7z8slv/3e/0vPvuS69fRIWlzp3T6zjiiOR91eehh1wdo0fr52O3zzjDrV9+uZb/5z/j\nP9uiomDd4d8lIPJ//5e9TQMHuvLGBM+3wgwg0rdv5j722Wfx1ygvFzn4YFd27tx0gc5fWrVSoUtE\nheLw8ddeU4HTFzaBoIi4//7u+n4f9YVi/3s/9tjgf8bo0SItW+p6ixYqTInofeaQQ/TzCH8HFl+0\nmzJF991yi9v3z386MbdrVwoO9Wah4EAIIaS2s2yZjuo8+GDFvRWqk3y0edQoffpp3Fjk22/VaLQP\n0nZ0vCZywQWS9lA9ZIg7ftFFum+LLUR+/DFzXe+9l14XoCNq5eXaR5o0STb6uXChO79LF5E33nDb\nAwfq9qxZ6eeVlsYbKIcemttnE8XUqZnr9EcnDzxQ9xUVBT1g7PLEE+n1r1gh8ssvun777XrekCHp\nng1h7rpL+9qOO7rzx4xx12rQQOT00932MceoB0i4TXvuGdxevlzrmjZNBZzCwqCBZz/rjh1F2rcP\nntu2bbQgGCVyhdtgsd4pgHot2PWhQ5140LFj0KAGRJ5/Xs/fd9/g/ksu0XYBKsD4o79vveWuu2pV\n+nd21VUqMkb1u4kTXbmttope79/frd9/v563fHnQs6V1a1fngw+mfzbNm+tn+vnnwdH4bFhvMEDk\n9deD3731wAFUNBRRoci/ri/UzJnj6i0udu3fbrvgbzSKjRtVkF682IkMXbumi0cHHeTW7b0UcIKP\nv9x9twqStt/7PPZYsOwf/uDW99pLvdnC9Z16qp4bFo4B9XJ49NH0/UOHuvffsqXe79asCfbRsjLX\nrqIiFRPs+Z06ufWJE4NC3Lhxes6tt7p9xx4b/fn++td6vKBAZN063ff++8H22/XBgyk41JuFggMh\nhBBS+yktFXn6aWeMWONi112rt13ZCLvLAyoyWDZtUtf1mTOT1XfddcG6mjcXWbDAHc9FfPFHJv0R\nxKefznzehg0if/lLutv0SSclv3Yca9YE67z33uBx30i89Va3Pxxm07JltIEUJhcxbNEiZ2SIBEf4\ne/YMGlDXXKNlwiE7vodI8+bB63/9tYYi+Xz+uYbLfPihhkj4dV1wQXQ7d901WK5Zs6Dnxy23uLK+\nYOCHAPnLhRdq2aeeUiOuc2fnZWWFQLu88Ua69wigI+phUccXCAD1UIgj7A1kl+OPj97vf47hsApL\nebkat08+qS7/9rj1KNhuOz3uG7JLl+rvZtAgkdWr3X7fm+Dbb/VYVLvuuUfLh8NZrBs+oKEQv/yi\nfccXbE480QlETZuqYOb38a++0uNbbRUUGG68Ub24wt9puG3t2qnhHt4/eLAKaIB6B5SUiFx9tX6u\nvlEfXv7+d23XuHHa9mbNdH+LFiqMnHtu9Hk9e6bve+EFDROz259+GvSS8e+plpdfjg6j+vhjkRkz\n3Hbfvlq+d+9guddfd3WVlem9yYo4e+zhjpWUpHtkAOr9RcGhniwUHAgh
hJC6x+efq/E9b151tyQz\n5eX6YF1YqA+rnTtXrs2lpcEQm3DIQy48/3z6Q3KrViooJOG114Ln+rHVlWHHHV2dYQN8wwb1SDj6\naA0J8Pc//LDu79nT5X2oas46Sw2pxx5Tg6RbNx3htyJQcbHI3nvre+nQQT0s7Mh+7965XcsfgQXi\nc42MGaNG3W67qSH2xRfaD8ePV8PTH7m3Xg1NmmjbGjZM7xPWdVxE3c99UauoyBlhW2yhhuSbb6bX\nce216e288053vEEDFzoTRXl5tOfGpEnpRuVvfhM8Nxy+EEXUaLtdrLDz889BYcC+p5kz3WfQoYMT\nkaJCIWzY1sqVwf1+2MWjj0Yb49ddp6Ei/r599nGfm+9hYxdj9Dc0Z05w/9//nl52r72icyb44RiN\nG0d7bUUtfkicSDDEaOJE5+1jjMhNN6Wf366dyJdfah4jERU57LHWrYPeOTbMKYwv8AEqxpWU6Hfk\n51gJ38sADYtav17kr3/VPmY9d/w+YYkKP3rgAQoO9Wah4EAIIYSQusT69Wqsvfpq5eopKUlPbHjF\nFbnV4Rsu4bj7inLXXWqU50vAqGr8EfDy8vQwh7Vr1fCZO1e3//Y3kZ12cvkXkuKHm3TunJ9QpeXL\nNZRh0iTd9kf6ATWysoWb2ESftu+UlKSHf3z5Zfp5ixa540nyf0SNym/cqIZd2DD3+f57d8wPJ/GJ\n86CwBvHMmSJHHRXc36qVihD77ef22fwtIsFEl4DIYYe5Y+XlwZwfd93l1n23fH959FENrzvxxKAI\n0KePiphRYUX2mmVlbl+TJtFhC8ce63JlNGqUHi4Tt/TsmZ6vZMcd0/unHxZzwglO3Np1Vw2xCXtM\nDR2a/j126ZJ+/U6dMnt2ffSReitMn679xeLn2fAXX3SLEo2AYD4SEfVcCZd56y0KDvVmoeBACCGE\nEBLNihWa62D0aPUQ8B/Ik1BerjNAjBqV+7mZSOplUZ/YuNGFvljX/HxTUqKjzTaW/9JLs59TXu5m\nMrGcd54zvOLyDYhoEsXtt08mvqxeHUx2CLhj99yjRniTJtFJDu+5R4UBO7NKFNtu6+odOjSYDNCG\nA4SXAw906z16BMWmG24Ilg17pHTr5o69/np0/TYBY0GBCjSWWbOCs6T4YoPfVt/TZ8IEDR+YODE9\nh4T9rtev14SLU6em52cILxdf7N7vlCnBY+efn/75btwYHYJhZ+4Ii10vvBDdB444Qv4nDBxzjBPy\ncuWHH7S/hNvz3HOZQ0WAdAGtuFi/b+t10aqVemBRcKgnCwUHQgghhBBSF1i/XsM1qjp57BdfqOFV\nUeFn9mwXRvTf/+avXc8+64y+Pn2CxxYtSg/ByQWbj6JnT/VK2bgxfYS7UaP0fAh2CXsc+dNSduuW\nfr0//1mPHXOMfqfh+o46SkfuJ04U+eCD9PPnzHGzLdileXMVXM48Uz09fO8bn3ffTb/emDHBMsuX\nB4/74Rw77BCcLWnTpmAeA+sxEyZq5hab4NMPa2jSJJgvxae8XEWGpLMTZcKfthXQmVpEtP/6go6f\n78YPmwmzYYP2g8WLdZuCQz1ZKDgQQgghhBCyefnoo+TJUJNSXq4j8dtvL/Lii/mv+6uvgiLL0087\nQ7Ow0HliHHKI229MehiHiHqL/O53GjIwf370NRcvVlFg3bqgZ8Kpp2bOaWH54ougEHD99cne6/r1\nzlukQwdNcBiVZNUmatxnHxUVTj5ZvS6i8ofYPBRt27qpJsNMny4BA98Yl++krEzzjgCaj2Jz8eij\nLkTlrrvc/oULg7NDjRyp4RsPP5y87touOBhRY5pkwRizDEDHjh07YtmyZdXdHEIIIYQQQkgtQAS4\n6CJg+nTgttuAww7T/fPmAUccAXTqBNx1F7DPPpW/1ptv6nLSSUDPnrmdO306sHixntuwYbJzfvwR\n+PZboEcPoKAgvszUqcDBBwPNm2eu
b80a4IkngL59M7d/wQJg8mRg4ULggAOAE090x1atAmbNAvr1\nA5o0SfY+8sEHHwBffw2ccALQoEH+6u3UqROKiooAoEhEOuWv5s0DBYeEUHAghBBCCCGEELI5qe2C\nQx61F0IIIYQQQgghhBCFggMhhBBCCCGEEELyDgUHQgghhBBCCCGE5B0KDoQQQgghhBBCCMk7FBwI\nIYQQQgghhBCSdyg4EEIIIYQQQgghJO9QcCCEEEIIIYQQQkjeoeBACCGEEEIIIYSQvEPBgRBCCCGE\nEEIIIXmHggMhhBBCCCGEEELyDgUHQgghhBBCCCGE5B0KDoQQQgghhBBCCMk7FBwIIYQQQgghhBCS\ndyg4EEIIIYQQQgghJO9QcCCEEEIIIYQQQkjeoeBACCGEEEIIIYSQvEPBgRBCCCGEEEIIIXmHggMh\nhBBCCCGEEELyDgUHQgghhBBCCCGE5B0KDoQQQgghhBBCCMk7FBwIIYQQQgghhBCSdyg4EEIIIYQQ\nQgghJO9QcCCEEEIIIYQQQkjeoeBACCGEEEIIIYSQvEPBgRBCCCGEEEIIIXmHggMhhBBCCCGEEELy\nDgUHQgghhBBCCCGE5B0KDoQQQgghhBBCCMk7FBwIIYQQQgghhBCSdyg4EEIIIYQQQgghJO9QcCCE\nEEIIIYQQQkjeoeBACCGEEEIIIYSQvEPBgRBCCCGEEEIIIXmHggMhhBBCCCGEEELyDgUHQgghhBBC\nCCGE5B0KDoQQQgghhBBCCMk7m01wMMZsb4y5wxizwBizzhizyhjzgTFmuDGmaR6vM8gY87wxZqkx\nZmPq9XljzOH5ugYhhBBCCCGEEEIy03BzXMQYcwSAxwC0ACCp3U0B9AKwF4AzjTGDReTLSlzDABgD\nYFhql73OtgCOBnC0MWaMiJxT0WsQQgghhBBCCCEkGVXu4WCM2QPA0wCaA/gZwLUA9gcwECoQCIBu\nACYZY5pV4lI3Q8UGATAbwEkA9k69zkntP9MYc2MF628AAGVlZZVoIiE1m++++w5/+ctf8N1331V3\nUwipMtjPSX2A/ZzUB9jPSX3Asz9rZTqEzdHoUVBvhlIAh4jIrSLyvoi8JSLnArgSgAGwE4DLK3IB\nY0y31LkCYCaAA0RkvIjMFpHxAA6EihAGwBXGmC4VuEwBQMGB1G2+++473HDDDfzjJnUa9nNSH2A/\nJ/UB9nNSH/Dsz4LqbEdFqVLBwRjTG2rsC4CHROSDiGJ3AlgAFQMuNsZU5IO8FC485CIRKfYPisgG\nABelNhsCuKQC1yCEEEIIIYQQQkhCqtrD4Whv/dGoAiIiAMalNlsCGFCB6xwJFTUWisjMmOu8D+Az\nqLBxVAWuQQghhBBCCCGEkIRUteBwQOp1PTSkIY6p3nqfXC5gjPkVNDFkuJ5M1+lojNkhl+sQQggh\nhBBCCCEkOVUtOPSAeh4sEpHyDOUWhs7JhV1i6sn3dQghhBBCCCGEEJKQKhMcjDGNAbRNbS7LVFZE\n1kC9IABguxwv1clbz3gdAEu99VyvQwghhBBCCCGEkIRUpYdDc299XYLyVnDYsgqvs95bz/U6hBBC\nCCGEEEIISUhVCg5NvPVNCcoXQxM6Nq3C6/izV+R6HUIIIYQQQgghhCSkKgWHjd56YYLyjaH5HjZU\n4XUae+u5XocQQgghhBBCCCEJaViFdf/srScJX2iWek0SflHR6zTz1nO9TmsAWLFiBdq3b5+1cEFB\nAQoKCnK8BCHVy6ZN6iR0+OGHo7AwiU5ISO2D/ZzUB9jPSX2A/ZzUZsrKylBWVpa13IoVK+xq6ypt\nUBVRZYKDiBQbY1YCaINgYsc0jDEtoWKAIJjYMQl+osiM10EwUWSu1zF2xfvSCamTsI+T+gD7OakP\nsJ+T+gD7OaknmOxFah5V6eEAAAsAHAigqzGmQYapMbuHzsmF+TH15Ps6xXBhHz8mKF8GINNUoIQQ\n
QgghhBBC6icNACRxiW8NFRuKsxWsiVS14DADKjg0A9ALwMyYcv289bdzuYCIfG2M+RbANqF6ouib\nei0SkSU5XqdZ9lKEEEIIIYQQQggBqjZpJAC84K0PjSpgjDEATkttrgEwpQLXeRGq+nQ3xuwdc519\noR4OEmoXIYQQQgghhBBC8kyVCg4iMhPAdKgY8EdjzD4RxYYD6AEVAkaJSCBzhjGmnzGmPLWMjbnU\nKAClqfV/GGP8qTKR2r47tVkKYHSF3hAhhBBCCCGEEEISUdUeDgBwMXQKykYAXjfGXG2M2ccY098Y\n8wCAW1PlPgNwZ4Z6JPaAyBcAbocKG70BvG2MGWKM6WWMGQIN09grVcdtIvJlpd8VIYQQQgghhBBC\nYjEisXZ8/i5izGAAjwNogfTsmgIVGwaLyNcR5/aDhlkIgH+JyLCYaxgADwKwx/3r2Df5kIicU9H3\nQQghhBBCCCGEkGRsDg8HiMhLAHYDcBdUXFgPYDU0ieSVAPaMEhv8KkKvUdcQETkLwGBoTociaCbP\notT2IIoNhBBCCCGEEELI5mGzeDgQQgghhBBCCCGkfrFZPBwIIYQQQgghhBBSv6DgQAghhBBCCCGE\nkLxDwYEQQgghhBBCCCF5h4JDFowx2xtj7jDGLDDGrDPGrDLGfGCMGW6MaVrd7SP1E2NMO2PMYGPM\nDcaYl40xK4wx5allbAXqG2SMed4Ys9QYszH1+rwx5vAc6igwxpxrjJlmjPnBGPOLMWaRMeZ+Y8wu\nubaJkNTUxn82xrzq9c2fjTGfGWPGGmP65Fgf+zmpURhjmhtjTjTG3G6MecsY84UxZo0xptgY870x\nZoox5gpjTOuE9e1njHnMGLPYGLPBGPOdMeYVY8zvc2zXSanf3Xepehan6t23Yu+UkGiMMbd5zy/l\nxpi+Cc7hvZzUSEJ9OdMyOUFddaafM2lkBowxRwB4DDqdZ/iDMgA+h07n+eXmbhup3xhjykO7/P4Z\nO31sRD0GwBi46WT9euzUsmOyzfBijGkD4L8A9kL0b6UYwIUi8nCSdhFijJkG4IDUZtQfle2fjwE4\nU0RKMtTFfk5qJMaYgQBeR4ZZuKB9ayWAP4jIaxnq+guA66CDSVF9/CUAx4nIpgx1NAEwAcCgiDYZ\nAOUARorIyAztJSQRxpjdoTPWFXi7B4jItJjyvJeTGk3q+TyJcf2WiAyMqaPO9XN6OMRgjNkDwNMA\nmgP4GcC1APYHMBDaCQRANwCTjDHNqqudpF4jqeUbAK/B3YRy4WboDU0AzAZwEoC9U69zUvvPNMbc\nGFeBMaYBgBfgbmj2YXUfAP8H4HsAjQHcb4w5rAJtJPWTbaD9qQjAaADHQ/vmfgAuA7AsdfxUAI9k\nqYv9nNRkvgHwLwAXAzgW2sf7ADgRwLMASgG0BfCiMaZnVAXGmHMAXA/9H1gE7e97AzgawGRon/0t\ngGwecI/AiQ2TU+fvDeCPqXobABhhjDmzYm+VEMUzqgoA/IBkzzC8l5Pawj8B9MywZBoYrHv9XES4\nRCwApkKV/GIAe0ccvzx1vAzA9dXdXi71awEwAvrw2C61vYPXH8cmrKMbgE2pc94D0Dh0vCmAD7zf\nQZeYeoZ517474ngXAGtSxz8D0KC6Pz8uNX8BMBHAcUh54kUcbw1godf3Dogpx37OpcYucf07VOYo\nr+89F3G8JYDVqeNfA2gVvgaAF706+sZcZ4BX5t/htgFoA2BxqswqAFtV9+fHpfYuAC5J9aVPAdyY\noH/yXs6lxi+opG1YV/s5PRwiMMb0BnAgVBF6SEQ+iCh2J4AF0D/yi40xBRFlCKkSROQGEXlZRFZU\noppLATRMrV8kIsWha2wAcFFqsyH04SCKy1OvqwFcGdHWLwH8Dfpb6QrgmEq0mdQTRORIEZkgqX/G\niOM/wvU9QD0gomA/JzWWuP4dKvMi9IHQQJ9NwpwNYKvU+pUisj
riGudDHywB4IqYS9n9ZQAuCLdN\nRFYBuCq12RIAvRxIhTDGdAIwEvqcfR6A2JA4D97LSX2gTvZzCg7RHO2tPxpVIPVHPC612RI6MkBI\nbeJI6J/9QhGZGVVARN6He9A9KnzcGNMNQI9UPc+IyMaYaz3qrfPPm+SLt7z1LjFl2M9JXeDn1GuT\niGO2z/4E9UxIQ0SKALwB7eMDw6GgxpgtARwE7eOvi8i3Me14PnUdgH2cVJx/AmgG4FGJydcQAe/l\npD5QJ/s5BYdobKKy9dDYmTimeus5ZUsnpDoxxvwKwLapzamZynrHOxpjdggdOyCiXBoi8j00yaoB\nfyskfxR662Xhg+znpC5gjNkZwG+QeggNHWsEoHfq2LsiUpqhKtt3G0Pjen16w/2eMvXxEqibrwHQ\n2xjTMK4sIVEYY4YAGAzgR0SMvMacw3s5qfPU5X5OwSEaqwotEpHwbAA+/h9/j6ptEiF5xZ8KZ2Fs\nqfTj4X5ekXq2M5xSluSH/t76gojj7OekVmKMaWqM6WqMuQzqyWMN+7tCRXfyjm3uPt4Q6opLSCKM\nMVtBkwALNPxnVcJTeS8ntY0hxphPjTHrjTE/GWM+N8Y8aozpn+GcOtvPKTiEMMY0hmaDBjQLeiwi\nsgbqBQEA21VluwjJM5289Yz9HMBSbz3czytSjwmdR0jOpDKcX+XtGh9RjP2c1BqMMafbOdqhzxaf\nA7gdQHuogfY3EXk6dFp19vGoegjJxN8BdADwtohkm13Ih/dyUtvoAaA7NAyuGTTs8zQAk40xzxtj\nWkScU2f7OV3h0mnura9LUH49gC0AbFk1zSGkSsiln6/31sP9PF/1EJIrl0GniRIAE0RkbkQZ9nNS\n24hKIvkhgLNFJCrEk32c1AqMMQdAp1ctAXBujqezn5PawnrorECTod4D6wC0A9AP2u/bQHMFvmCM\nOURE/HDQOtvPKTik4ydk2pSgfDFUFaK7FalN5NLP/Qy54X7+v3pEpDL1EJIYY0w/aHZlQOeSPj+m\nKPs5qU38G4BNEtYUOiI2BJrM62ljzCUi8lLonLz38UrWQ0gaqVwjD6Y27xSR+TlWwXs5qS10FJGf\nIva/aYz5B4BXAOwBFSDOA3CPV6bO9nOGVKTjZ/IsjC3laAwdkdhQNc0hpErIpZ839tbD/fx/9Rhj\nKlMPIYkwxuwKzZTfENqPThCRlTHF2c9JrUFEfhKR+alltoiMF5HjoW64naEjYqeFTst7H69kPYRE\n8Seoe/kS6HSYucJ7OakVxIgN9tgK6BTedhrYi0JF6mw/p+CQzs/eehLXEju1VJLwC0JqCrn0c3/6\ntHA/z1c9hGQllcH5VQCtAJQCOFFE3s5wCvs5qfWIyBMAngVQAOAeY0xL7zD7OKnRpGZZuRo6OHeR\niFTEqGE/J3UCEfkawOtQ7/iuxpitvcN1tp8zpCKEiBQbY1ZCY2wyJs9I/ek3g95El2YqS0gNw08i\nky1JjJ+MJtzPw/X8mKAeQfYkNoQEMMZsC+AN6JRR5QCGisikLKexn5O6wovQ8IpmAA4HYJNHVlUf\nn1PBeggJcyl0tPZLAFsaY06MKPNrb32gMWab1PrElEDBezmpS8wH8NvUekcAy1PrdbafU3CIZgGA\nA6HKU4MMU2N2D51DSG3Bj5/sHlsq/Xi4n4fr+ShBPUsrOMJB6inGmDbQEYFfQf8UL0yN+maD/ZzU\nFVZ46/6c658DKIN6rOazj09MUE8pgEVZrkmIddnuAuCpLGUNgD+n1gV6z/8GvJeTukVUcmCgDvdz\nhlREMyP12gxArwzl+nnrmdx6CalRpFy6vk1t9stUFkDf1GuRiCwJHZvhrcfWY4zpAJ0vXsDfCsmB\n1NRRr0GnmBIAV4nI/UnOZT8ndYiO3vr/3F5FpATAB1BDbT9jTKaBJNt3iwHMCh2bCZekLFMfbwRg\nX2gfnykipYlaT+o7kmCJKq
s7eC8ndYtdvHXbr+t0P6fgEM0L3vrQqAKpOeBt8qY1AKZUdaMIyTMv\nQh9Suxtj9o4qYIzZF6p+CoK/CwCAiHwBVVYNgCHGmCbhMin839G/K9NoUn8wxjQF8DI0o7MAuFFE\nbs+xGvZzUhc4wVv/OHTM9tkWAI6NOtkY0wnAwdA+/oaI+FOhQUTWAXgT2scPToUwRXFc6jqAJm8l\nJCMiMlRECjItcIkkBUD/1P6GIvKNVxXv5aTWk8pFdQi0j34lIt+FitTJfk7BIQIRmQlgOvSL+qMx\nZp+IYsPhRtxGheZRJaQ2MArqEgsA/wjfkFLbd6c2SwGMjqnHGoCtAdwWPmiM6QJNGAWo+y3/vElW\nUiOpLwDYH+4+O6ICVbGfkxqLMeZ0Y0zjLGUuhYv3/Qr6fOLzEIC10GeWW4wxrULnNwBwHzTpJOD6\nchi7vyGAe1Pn+fW0BXCBspPIAAAEHklEQVRLanMNgIcztZuQCmJi9vNeTmo0xpjfGWMKMhzvAGAC\n3AwU90QUq5P93IjEhZHUb4wxv4G6lzSFui/eDPViaArgJABnpYouBNA7PFpASFVijOkDoKu3qy2A\nv8O5RQUeBEXkXzH13Ax3w5kL4FZoYqcuAK6CG1m+WUT+HFNHAwBTAfRJ7ZoAYAyA1QD2AXAdgPbQ\nOOPBIvJaDm+V1FOMMRMAHAPtf5MBXJLllE0pVT+qLvZzUiMxxnwNoDm0P82A9st1qX09AZwC1+eK\nAfxWRNI8Ko0xZwOwoUZfArgJ6gmxLTRpX39oH39SRE7N0J4nAfw+tTkF+vD7LYDdAFwL/c0IgHNE\n5KGKvGdCwhhjRgAYAe1bA0RkWkw53stJjcUYsxgq2E4A8C6AxdCpJtsCGADg7NS6QIXjQ1JhceF6\n6lw/p+CQAWPMYACPQ90Hw4qrAPgM+iV9vbnbRuo3xphHAJyesLikXBaj6jEAHgQwzO7yz0u9PiQi\n52RpTxsALwHojejfyiYAF4jI2IRtJvUcY0xcst44FotI55i62M9JjSQlOGyP+FFd2z+XAhgmIpMz\n1DUCmnDPRNQn0L57vIhsCp/r1dEEOgWn9agI/1bKAYwUkb/G1UFIruQgOPBeTmosOdzPnwNwloj8\nFFNPnevnFByyYIzZDsDFAAZDpxbZBHU9GQ/gXhHZWI3NI/WUlOBwWtaCiohIxhlpjDGHQ5XX3lD1\ndSU0idj9SVXPlJp6FoCToeFGzaAjY28AuFtEOJMLSYwxJtcwtcUi0iVLneznpEaRiuc9GDr61QNA\nB+i03BsA/ADgQwCTAIxP8ryRiu29ADrTVgdo6MM8AGNFZHwO7fo9gDMA7A6gJYDvAUyDPve8n7Qe\nQpKQEhyuhxpBB8UJDl553stJjcMYcyA0SeN+ADpD+2YLqNfaUgDvAPhX0ntoXernFBwIIYQQQggh\nhBCSd5g0khBCCCGEEEIIIXmHggMhhBBCCCGEEELyDgUHQgghhBBCCCGE5B0KDoQQQgghhBBCCMk7\nFBwIIYQQQgghhBCSdyg4EEIIIYQQQgghJO9QcCCEEEIIIYQQQkjeoeBACCGEEEIIIYSQvEPBgRBC\nCCGEEEIIIXmHggMhhBBCCCGEEELyDgUHQgghhBBCCCGE5B0KDoQQQgghhBBCCMk7FBwIIYQQQggh\nhBCSdyg4EEIIIYQQQgghJO9QcCCEEEIIIYQQQkjeoeBACCGEEEIIIYSQvEPBgRBCCCGEEEIIIXmH\nggMhhBBCCCGEEELyDgUHQgghhBBCCCGE5B0KDoQQQgghhBBCCMk7FBwIIYQQQgghhBCSdyg4EEII\nIYQQQgghJO9QcCCEEEIIIYQQQkjeoeBACCGEEEIIIYSQvEPBgRBCCCGEEEIIIXnn/wEBtmy1xV9a\nhQAAAABJRU5ErkJggg==\n", + "text/plain": [ + "" + ] + }, + "metadata": 
{ + "image/png": { + "height": 356, + "width": 526 + } + }, + "output_type": "display_data" + } + ], + "source": [ + "plt.plot(losses['train'], label='Training loss')\n", + "plt.plot(losses['validation'], label='Validation loss')\n", + "plt.legend()\n", + "plt.ylim(ymax=0.8)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Check out your predictions\n", + "\n", + "Here, use the test data to view how well your network is modeling the data. If something is completely wrong here, make sure each step in your network is implemented correctly." + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": "[… base64-encoded PNG data omitted (cell output image) …]
lz0Du\nBwCApnrhBekHP3COx42TJk4Mdz4AAMDRu3dvrVixQkE+RwUAr9Y+9tTdXtIKa21v3yeYJ/K1Qtg2\n40OSaiUt9g5iHI9JelnSSEm7S2oT/zxS0ivGmIczTcYY01XSLEkPShoiqauktpL6SrpY0lxjzM9a\nuNasSWoZEaNlBAAgfDU19cdsLAcAAAAArZevPYRHS+qY4ZwDJT0jJxR+3VpbnuacSZLGxM+ZJ2my\npC8k7SPpOkmHSrrQGLPGWjs+3Z0YY4okTZc0KD7OXyU9Jmm9pMMljZe0m6SHjDHLrbWvNn2Z2UXL\nCLRGLCZ9843Uo0fYMwFQSNyBcC2vVQIAAABAq+VlIGytXZrpHGPM+a6L09Jcv5+ka+SEuLMlHWut\nrYxfPdcY86KkmXKC3muNMY9ba79Ic1ej5VQFW0lTrbVXuq6bY4yZIWmupE6Sphhj+lubm2mruyqY\nTeXQXCNGSK++Kt1/v3T55WHPBkChcIfABMIAAAAA0Hr52jKiUcbpUv2T+MUKSS+kOe0Xqg/Er3CF\nwZIka+12SVfEL5ZIGtvA3V0T/7xBTlVxkniIfJskI2lfSWc0bRXZ5w6BqRBGc2zb5oTBkvT00+HO\nBUBhoUK48FVXS6efLh11lLRqVdizAQAAAApfQQbCkr4rqZecqt3nrLXp9iX/Xvz6hdba2ekGsdb+\nV9JncsLc73uvj1cZ94+P80wD9yNJT7iOczcQjhEIo2Wqq+uP168Pbx4ACg8VwoXvjTekl16SZs2S\nnnkm7NkAAAAAha9QA+FRruMnvVcaY/rK2ThOctpCNKbu+l7GmD6e64amOS+FtXa1pEVyguUhGe4v\nNO4QmE3lCtPixdLPfiZNn+7vuO6QZt06f8cGEG1UCBe+LVvqjysqwpsHAAAAEBUFFwgbYzrKqcK1\nkr621qYLag9wHS/MMKT7+v4+jLOHMaZ9hnNDQcuIwnfLLdIf/iCNGiVVVfk3rjukWb9esta/sQFE\nGxXChY+fMQAAAJBdBRcIS/qhpI7x4z82cE5v1/HyDOMtcx3v4cM4xnO7nMGmcoVv5Urn85Yt0tat\n/o3rfgJfUyNt3uzf2ACijQrhwsfPGAAAAMiuQgyEG20XEdfJdZzpzYnu2KwsoHFygrsqmArhwhTU\nk27vWLSNAOAXqkcLH4EwAAAAkF0FFQgbY3pJGianXcR71trPGzi1nes40xvnK13H3lYPiXGsta0Z\nJyfQMqLwEQgDyDeEhYWPnzEAAACQXQUVCEs6T/VreqKR83a4jkszjNnWdby9oXGMMa0ZJycktYxg\nU7mClK1AeP16/8YGEG1UCBc+fsYAAABAdpWEPQGfnRv/XCnp2UbOc+1nnbF9Q0fXsbcthHecxmKw\nxsZpsqqqKs2bNy/jeT179lTPnj2bNTYVwoWPCmEA+Ybq0cLHzxgAAACtVV5ervLy8oznVVVleoN/\nNBRMIGyMGSjpADntIl6y1m5q5HT3BnCZNnhzbyS3zHOdd5zGAuG6cawyb0DXoDVr1mjgwIEZz5sw\nYYJuuummZo3tDoHZVK4wEQgDyDeEhYXP/TN2HwMAAABN9fDDD+vmm28Oexp5o2ACYUnnu46nZTh3\nvuu4X4Zz3dcvyDDOR00YZ5m1tsUtI7p3764ZM2ZkPK+51cFScpsIKoQLE4EwgHwTZDuBykrp2Wel\nAw6QmvBaa0FYsUL66CPp+OOlNm3Cno2D0B8AAACtdckll+h73/texvNOPvlkrVmzJgszym0FEQgb\nY0ok/Sh+cY2kfzR2vrX2S2PMSkk9JR2bYfhj4p9XWGuXeq57x3V8rBpoU2GM2U3St+RUB7+b4f4a\nVVpaqgEDBrRmiAZV1RAIFzoCYQD5JsiwcOpU6ZprpI4dpaVLpa5d/R0/11RXS4MGSatWSbfdJt1w\nQ9gzchAIAwAAoLWa2jq1tDTTFmDRUCibyp0iqbucwPU
pa5uUZv5NkpHUzxgzON0Jxpgj5FT2WknT\nvddbaxfckDHqAAAgAElEQVTLqRo2ks42xrRr4L4ucB2/0IS5haJia/2zsK3beUZWiAiEAeSbICuE\nP/3U+bx1qzR/fuPnFoIVK5wwWJLebdXL0/5iUzkAAAAguwolEB7lOn6yibe5V1JdPHa/N8yNX54S\nv1gj6b4Gxrkr/nkXSXd6rzTG7COprgbnc+VwIOyuCt66jQrhQhRUn0bvE/j1jXXTBoBmCLJ61D1e\nE/afyHvu9W7YEN48vKgQBgAAALIr7wNhY8xOkk6TU8X7ibX2w6bcLl7de5ec6t7DJL1rjDnbGDPQ\nGHO2nNYOg+Lj3mmt/aKBoabFzzWSLjfGPGeMOdEYc5gx5vL4dZ0l1Uq6oonVy6GIqf5ZWHUNz8gK\nERXCAPJNkNWj7sfEqAXCGzeGNw8vAmEAAAAguwqhh/CPJbWVE9xm2kzO69dyWk2MkXSIpKdd19n4\nx2PW2t80NIC1NmaMGSnpZTnB8g/jH+5xKiX93Fr7z2bOL6usqX8WVlObs7m1r95e+rYWrVukn37n\np2pX0lDHj8IR1JNub7UxgTAAv1Ah7B8qhAEAAABIhREInysndK2R9Ofm3NBaayVdZIz5q6SL5QS6\n3SStlTRb0kNNCXGtteuMMUdJukjSTyT1l9RR0kpJr0maYq1d0Jy5haM+BI5CILy6YrW++8fvqjpW\nrR01O/TzwT8Pe0qBo0IYQL4JskKYQDg30EMYAAAAyK68D4SttUN9GGOGpBmtHCMm6eH4R15KrhAu\n/GdkSzYsUXWsWpL02brPQp5NdmQrEN682dnNvk0b/+4DQDQFWT0a5ZYR27dLlZVS27bhzacOFcIA\nAABAduV9D2H4x7p6CNfECr9CuNbWr7c2Fo1noNkKhCU2lgPgDyqE/eP9/uVKH2ECYQAAACC7CIRR\nr8i1qVwEnpG5Q+BY7u715ysCYQD5hgph/3i/f7nSNoJAGAAAAMguAmFIkpx2yvVqI9BD2F0hTCDc\nOunGoo8wAD9ka1O5deukqip/x881uVohTA9hAAAAILvyvocw/OEOR6VobCoXtQrhWCy+zt7vS998\nW7W1Zb6NTSAMICjZahkhSatWSXvu6e995BIqhAEAgCQNGzZMb731Vtrr2rZtqy5duqhz587abbfd\ndOihh2rgwIEaPny49thjjyzPFEBQqBCGpNQeulHYVC6ph7CNwHprJR09UbrwSGnMEFVX24y3adbY\nHgTCAPyQrZYRUu60jfhi/Re67l/X6b3l7/k6bq5WCBMIAwCQXcaYBj+qqqq0Zs0affHFF3r33Xf1\nwAMP6IILLlDfvn112mmn6Z///Gcocx42bJiKiopUVFTUYJgNoOmoEIak1ArZSGwqF7EK4ZoaSX3i\nfzh7fKQt1ZsldfFlbAJhAEHJZoVwrgTCY1/5pV76Yrqe+fh/tfTqJb6N6w3AqRAGACC6rLUyxuiw\nww7T4MGDE1+PxWLatGmTNm7cqE8//VRLly5NnP/KK6/olVde0ejRozVlyhSVlfn3rtNMjDFJnwG0\nDoEwJKVrGVH4z8jcIXBkAuGi+mfd1TX+/YzZVA5AULLVQ1jKnUD4P/O/ktpKX29e6uu4udoygh7C\nAACEZ8SIEbrxxhsbvP6bb77Rk08+qSlTpmj58uWSpCeeeELz58/XzJkz1bZt22xNFYCPaBkBSela\nRhR+QOoOwSurC/8ZqBMI16/Tz58xFcIAghJkWJirLSO27Ygv1Pj7t5iWEQAAoLl23XVXXXPNNVqw\nYIHOOuusRGXx7NmzNXr06LCnB6CFCIQhKbVCuDYCLSPWrKtf8+LPC3+9NTWSTP2aq3181k0gDCAo\nUawQtnL1uPfx73GuVggTCAMAkPs6dOigp59+WqeddpqstbLW6tlnn9U777wT9tQAtACBMCSl6SEc\ngWdkq7+pX+OWiog
EwkUEwgDySxQrhK37xbsA2/sQCAMAgOaaNm2aOnXqlOjlO3HixAbPnTdvnm6/\n/Xadfvrp2meffdSpUye1bdtWPXr00JAhQzR+/HgtW7as0fur20hu5syZkpxexu4N5twff/zjH1Nu\nv2bNGj3xxBMaPXq0BgwYoK5du6q0tFQ777yz+vfvrzFjxujVV19txXcEyE/0EIak1JYRflcIP/yw\n9Oc/S5MmSUOG+Dp0i7mfZMds4T8Dra5WUoUwLSMA5IOoVwhXVcfUrtSfcXO1ZQQ9hAEAyB8777yz\nRo8erfvvv1+S9K9//UsbN27UTjvtlHTe4MGDNWfOnMRl92Zwa9as0TfffKNZs2Zp8uTJ+u1vf6tr\nr722wfusu621NmWsxtx///26+uqrVRv/D4b7dps3b9amTZv02Wef6YknntDw4cP17LPPapdddmnS\n2EC+IxCGpDSbysX8rUi6+mpp2zbpppukf/3Lt6FbxV0hG51N5VyBcMCbyhEIA8H497+lceOkUaOk\nSy8NezbBCzIszNlA2PXiXRUVwgAAIMecddZZiUDYWqt33nlHp512WtI5y5YtkzFGbdu21YEHHqh9\n991XXbp0kbVW5eXl+u9//6u1a9equrpa119/vYwx+uUvf5lyX5dffrkk6fnnn9fKlStljNHIkSPV\nq1evlHP79++fdHnlypWKxWIyxmjvvfdW//791b17d7Vr104bN27Uxx9/rE8//VSS9MYbb+iEE07Q\ne++9pzZt2vjyfQJyGYEwJAVbIVxd7YTBkpTh3SBZFc1AuP5Zd7XPob/X+vWStVITX7wF0ESTJkmz\nZkkffSRdcknh/xsLMiz0toxYvdq5j+Jif++nudwVwtU19BAGAAC5ZeDAgSopKUlU3r733nspgfAP\nf/hDnX766Ro2bJjatm2bMoa1Vk8++aQuv/xyVVRUaPz48TrrrLPUp0+fpPOmTJkiSfr444+1cuVK\nSdJVV12lY445JuM8999/f91///0644wz1LNnz7TnfPLJJ/rZz36m2bNn68MPP9TkyZM1bty4zN8E\nIM/RQxiSUgPRoDaxWbPGt2Fbzd0nOabCfwbq3VSuJsCQQZIqK+tfCADgn7q3+W/dGo3wLJstI2Kx\n3Pg7lVQhXB3ci3e50jKCQBgAgPzSvn179e7dO9HCYfXq1SnnPPDAAzrppJPShsGS075h1KhR+v3v\nfy9Jqq6u1kMPPeTrPEePHq3LLruswTBYkr797W/rX//6l3r06CFrrR588MHEuoBCRiAMSaktIvzc\nVM79RG/9+tSKrLDURLJCOHubykm50Tbi6aeliROlioqwZwL4I2r9VrPZMkLKjbYR2WoZsWmTE4KH\nzf3/glz5PwIAAGhcly5dEscbWvG2ox/+8IcqKyuTJL322mutnldLdO7cWWeccYYkqby8XPPnzw9l\nHkA20TICklLDwaAqhCUnJNxtN9+GbzH3mm1UAmFXyFAb0KZyXbo4IYPk/Kz33NO3u2m2L7+UzjnH\nOe7USbryyvDmAvjFG541UHRRMGpqJB30lNSpXDWfXi6pnb9je6xa5dvwLed6rA6yZYS1zuP1zjv7\ndhctErUXOQAAmQ16ZJBWVeTCH+WW61HWQ3MunpP5xDxVF+JK0pYtWxo995NPPtG8efP01VdfafPm\nzaqsrEy63hgja60+/vjjQOYqORvZvffee1qwYIE2bNigrVu3JlUCuzfA+/DDD3XggQcGNhcgFxAI\nQ5JUXZ38hNPvTeXc1qzJjUA4ki0j3BXCAf2Mu3WrD4TDfjvy11/XHy9dGt48AD9FLTzb1vFT6dRz\nJUlV1d0lne/b2HlRIRxgywjJeZwOOxCmZQQAwGtVxSqt2LIi7GmgEe4QuHPnzmnPmTZtmm677TYt\nWrSoSWNWV1dr06ZNSdXHrTV//nxdf/31mjFjRqLncSZr16717f6BXEUgDElSpecJZ
62PFbPeCqxc\neWx1h97RaRlR/8Pwsy2Ie6j27euPq6p8u4sW4W3IKERRC4R3tP8ycVzb+ctGzmy+dI8LuRAIJ1UI\n+/hujnTr3bBB6tvXt7toEQJhAIBXj7IeYU+h1QphDY3ZVFcFJGmXXXZJuX7MmDF64oknJDkVwJnU\nVetu2bLFt0D41Vdf1ciRI1VZWSljTMZ5uOcAFDoCYUhK7VEYVDsBKTc27JGSA2GriATCWWgZ0c71\nbu6wQ9ioBWeIhqiFZzHrfqz2d8G5WiGc3DIi2ArhVrT8803UfqcBAJkVcquFQrBt2zYtX748EbD2\n6JEcfj/yyCN64oknEteffPLJOuecczRgwAD17t1bHTp0UElJfRzVt29fLY2/pTPmU/vKtWvX6sc/\n/rGqqqpkjFGfPn106aWX6uijj9bee++tnXbaKWnDu5tvvlk333yzr3MAchmBMCSlviW1NuCWEbkg\n6i0jgmoL4u5nWl3t2120CCEDCpH7dznsF12yoTYLgXDHjtLWrc5x2D2Ea2uV3N4n4EA47NY+Ei/e\nAQCQb+bMmZNov2CM0RFHHJF0/d133504vuWWW/TrX/+60fGCqMh99NFHtWnTJhljdPDBB+utt95K\n6nucjTkAuawo7AkgN9R4qkWDbBmRi4FwFDeVC6plRC5VCNMyAoUoauGZOxCWicnPgo26xwX3uxK3\nbfNv/JbwPlYHuamcRIUwAABovmeffTZxXFRUpKFDhyYuL1++XIsXL5Yk7bTTTrrhhhsaHWvLli3a\nEMB/SF5//fXE8fjx4xsNgyUlKpSBqCAQhqQ0LSMiUCHsDr0j0zIiqUI4mJAhlyqEoxacIRqi9nud\n9PfI1Pq65rqx3C9khf24lbIBaEAv3tUhEAYAAM2xfv16/fGPf0z05D3llFPUqVOnxPUrV66U5FQO\n9+vXT8XFxY2O98477yR69zamKX2I3ermIUkHHXRQo+fGYjG9++67zRofyHcEwpCUrmVEcBVJObOp\nXCRbRtQ/6w4q9M+5YCXNMZDPohaeJT0+F0UwEA64QjgXWkZE7XcaAIB8NmrUKFVUVCRC3PHjxydd\nX1RUHzNta8Jbrx588MEm3W8713/YqpvwH7bmzOOFF17QqlWrmh06A/mMQBiSUp9wxqLQMiLim8oF\nVXWWSy0jolZJiWiIXg9h1yJ9rhCu+/61b5/6tbBUViX/PYrCpnI8VgMAkPu2bt2qH/3oR3rllVck\nORW7o0aN0uDBg5PO69u3r4wxstbqk08+0VdffdXgmM8884xefvnlJgWxXbt2TRyvWLEi4/l77713\n4vjvf/97g+etWbNGV199dWLOQFQQCENS6hPOSLSMiGIg7Ko6C6oKPOcq7eIIGVAoohaexTw9hP1a\ns/shsLS0/jjsx63KquQFVtfSQxgAAIRn9erVuuuuu3TAAQfoueeek+SEwUOGDNEjjzyScn7Xrl0T\nm8zFYjGdeeaZWrRoUdI51lpNnTpVo0aNUklJSVL1b0O+/e1vJ47/93//N+P5p59+euL4tttu01NP\nPZVyzrx583Tsscdq+fLl6tixY8YxgUJSEvYEkBu81aJ+biqXq4Gwu0I4Mi0jAtpUzv1kPlcrhMOe\nC+CXqIVnSYGwjy0j3N/HkhLno6YmBwJhTwunoDYArUPLCAAAou3ll1/WGteT9Fgsps2bN2vjxo2a\nP3++vvzyy8R1dZW8F198se655x61adMm7Zi33nqrTjzxRMViMc2bN08HHXSQhgwZor333lsVFRV6\n++23VV5eLmOMJk6cqIcffjjjpm4/+MEPNG7cOFlr9dJLL+k73/mOjjrqqKT+xeecc44GDBggSTr/\n/PN19913a9GiRdqxY4fOO+88TZo0SQcffLDatWunTz75RHPmzJExRgcffLBOOukk3XHHHS3+PgL5\nhkAYkoLdVM4bxK1dK1krhd2ehwrhYEKGXNpUj
pABhShqFcK1SRXC/gXC7nGKi6U2bXIkEPZUCHv/\nPrcGFcIAAMDNWqvZs2dr9uzZaa+v2zhOkoqLi3XKKado7NixOu644xodd/jw4XrwwQd1xRVXqKam\nRjU1Nfr3v/+tf//734lxi4uL9Zvf/EY33HCDHn744Yxz3W+//XTDDTfo9ttvlyR98skn+uSTT5LO\nOeiggxKBcGlpqV588UWNGDFCS5YskSQtWLBACxYsSFrb0KFD9fTTT6etdgYKGYEwJEk1AfYQ9j65\nq6mRNm2SdtrJt7tokagFwlXVMcnU90TKRsuIsKtygw7O1q6VunRxgiQgW6JU+W6tZAPaVM49TkmJ\n8+94+/bwA2FvAFwTgZYRUXuRAwCAXNBQ397S0lJ17txZXbp0UY8ePXTooYdq4MCBOv7447X77rs3\nefyLL75YQ4YM0e9+9zu9+eabWrlypdq3b69evXpp+PDhGjNmjA4++OCk+WTqJTxx4kQdffTRevzx\nxzV37lytXr06sWFcutvut99++uCDDzR16lQ9//zz+uyzz1RVVaUePXrooIMO0k9/+lOdeeaZiQ3o\n2FQOUUIgDEmpLSOCDIQlp21E2IFwja2V4o/3NgItI7x9omsiViHsd3D2xhvSKadIu+8uffyxVFbm\n7/hAQ6IUnsViSnpng58Vwu7HhLoKYSn8x62ot4ywNjfeRQQAQCF78803s3I/Bx54oB577LGM57nb\nUmRy8skn6+STT27y+WVlZbr++ut1/fXXN3rehAkTNGHChCaPC+Q7NpWDpDQtI6x/T0DTBXG50Ec4\nchXCAf6Mc3VTuSCDsxdflKqqpK++csJhIFui9Pb62lol9T73c1O5dC0jpPCrrqs8gXB1jX9/n9Kt\nbcMGJ4ANk3dehf57DQAAAISNQBiSUqtHg2onUCcXAmHvzvWFLrVPdDA/Y3eFcNjBSpDBWVVV/fG8\nef6ODTTE+8+20IMzb+/zoFtGSOG/kOWtEPa+g6c10j1WV1dL8XdahsLa6P1eAwAAAGEjEIak1HAw\nFlD1aJ21a30bvsWSK4QL/9lnaugfrQphv8Np99o++MDfsYGGpOvJXshSK4SDaxlREm+iFfbjlrdC\nOKiWEV271h+H2TYi3fIIhAEAAIBgEQhDUmpY6GcP4ZxtGeEKvS0Vwq2Sq5vKBVkhTCCMMHh/jws9\nOItihXC2NpXr1q3+OMyN5dL9nSj032sAAAAgbATCkBTOpnJhi1oP4WrPs+4obCpXWyvpsAelk36h\nSuNvCZx7bcuWSevW+To8kFbUeq1ms0I4ZwLhLLWMcG+EWVnp2100G4EwAAAAkH0EwpCUWoEU1IZj\ndXIiEE5qE1H4gXAYm8qFXSG8umaxdOrPpSPv1aqev/d1bG9oRJUwsiHyFcJZ2FQu9EDYs4lcbUAV\nwrnS3odAGAAAAMg+AmFICrZCOFdbRsTcFcKm8J99pvyMs7CpXNjByqbalYnjyrYrfB3b+3tNIIxs\niFoP4ZoaJVcIZ6FlRNjf02pvD+EC7/dOD2EAAAAg+wiEISl105oobCqXFHpHoIdwyqZyEagQdofg\nfv5OS6kByrx5vg4PpBXJlhFJFcLBt4ywNtzvq/fdHEG1jMiVQJgKYQAAACD7CIQhKV0gHIEewu5N\n5SLQMsIbKtRGoYewa40xBRsIUyGMbIhky4gsVAgXFztVwnXCfOxKae8TwZYRYb+YCAAAABQ6AmFI\nkmo87QNiPgakOdsyQslVZ4UuW6F/roQMUnYrhBctkioqfL0LIEXUAuHUCuFgegi7W0ZI4T52ed/N\nUegtI6gQBgAAALKPQBiSst8yYts25yNM7jXaCLSMqPI8645Cy4hsVghbK330ka93AaSIfA/hLLSM\nkHKrQti76WtrEAgDAAAAkAiEEedtJ2ADbhkhhV8lnBSIRiAQ9ob+QQXCudQyoiYpEPY3OUsXYtA2\nAkGLfA/hL
GwqJ1EhnE3Z2FRuxYrw/x4BAAAAuYRAGJJSexT6WU3pDjDat68/DntjuaQq6Ai0jPCG\n/rFY8FVnYVcvuivrbMAVwhKBMIIXtZYRYVUIh/nYlVohHMzf41wJhIOuEH7uOWmPPaSDDw7/bxIA\nAACQKwiEISm1Aimo/rK77VZ/vG6db3fRIkkBIRXCrZKrFcLuNQbSMsLzQsIXX/h6F0CKqAXC2aoQ\nzqWWEd7Ham+P/9bIxQrhoAPhl15yWvosWCDNn+/fuAAAAEA+IxCGpDSBsI+byrmf2HXsWH9cWenb\nXbRIzBMIWxveXLIhNfQv/B7C7mDF70B4c8cPpGt6yYw5JhEMV1X5ehdACu+/qbD/jQWtpkZSkWuR\nEdxUrtbHdDSKgbB7/LBfiAYAAAByBYEwJKWGhdbHsLChlhFhh2felhE+FmHlpOra5GfdQVWBl5a6\n7jOHKoT9bhmxodefpbLVsnu+Le0+V1L460Xhi2SFcJZaRpSU1F/OpU3lagN6rHb/PS7kHsLun3PY\nraoAAACAXEEgDEnpeggX9hNQKbVCuNCDlWy0jCgqyp0+nFLyCx3W+LypXNH2xHFJe+c47Bc5UPii\nFgg7FcLRahnh7ffuZw/hKFYIu8eiQhgAAABwEAhDklTrrRD2sZoyF5+ASp5AuCgCgXAW+kQXFzuh\ncFH8kSX0n7E7EPa5Qjim+sWVlDpJcNjrReGLZCAcUIVwrraMSO0hXNh/j2kZAQAAAGRfSeZT8oMx\nZndJF0o6TdJekjpJWiNpqaQ3JT1jrf20kdufIukiSYdJ6h6/7WxJj1hrZzRxDsXxMX4iqZ+kMkkr\nJb0maYq1Nme3MwkyLGyoZUTY4Zk3IKyuial9Ab9G4g0ZgughXFzsfC4pcaplw/4Zu3+vY8bnQNjW\n/2IXEwgjS6LWQzjITeW8LSNy5d0N3grhGJvKtQoVwgAQvPLycvXu3TvsaQCIiPLy8rCnUBAKIhA2\nxlwo6W45IbB7a7Be8Y8hcsLZq9Pc1kh6VNKY+Jfqbr+7pJGSRhpjHrXWXpJhDl0l/UPSIM8c+kq6\nWNL5xpjLrbW/b97qsqM2VptUL26z0DIi7LfXezcZq66JqZCL5lM3DgwuEG7Txvn5hh1WuV/Y8LtC\nuNa4KoTbOscEwggaFcIR2FTOWyGchQ1AC7mHMIEwAAQvFotpxYoVYU8DANAMeR8IG2PGSrpHTgi7\nVNL/k/RfSZvlhMHfkhPsNpRwTpITBltJ8yRNlvSFpH0kXSfpUEkXGmPWWGvHNzCHIknTVR8G/1XS\nY5LWSzpc0nhJu0l6yBiz3Fr7autW7T/vpjVRaBnhXWNVdWHvKpetlhFSfbAS9s+4JkstI4qoEEaW\nRC0QTqkQDnBTuVwJhGtqa6Xi+sveHv+tkYsbgNIyAgDyV48ePcKeAoAI4zGodfI6EDbGHC7pLjkh\n7IuSfmStrXSd8kH88z3xdg7e2+8n6Zr47WdLOtZ1+7nGmBclzZQT9F5rjHncWvtFmqmMllOFbCVN\ntdZe6bpujjFmhqS5ciqYpxhj+lvrYxrng9pY8hNQPyuE86VlhHdn90JTE0t+1h10ywgpFyqE3W81\nr5G1kjE+jS1Xy4g2TiAcdtU7Cl/UAuGUCuEAN5Urcf2PKKcC4YAeq3MmAM9iy4i1a/0bFwAgzZkz\nJ+wpAABaKN/fH///5KxhqaRzPGFwEmvTPqP6hepD8Su8t7fWbpd0RfxiiaSxDQx/TfzzBjlVxd77\n/kLSbZKMpH0lndHQPMOSjXYCUo4Fwp6esjU1/lZhTZ8uzZ7t25Ct5t04MOiWEVL4P+Ok32tTKx9b\nccq6WkYUtaFlBLIj8j2EI7CpXDZ6CEc1EKZCGAAAAHDkbSBsjDlC0iFyqnLvjIe3zfW9+O0XWmvT\nRnfW2v9K+kxOmPv9NPPYT1L/+DjPWGt3NHBfT7iOcy4Q9j7hDKqHsLtlRNj
VlN41VvsYCP/lL9IZ\nZ0hDhkjLlvk2bKt4Q38/i9TrfsZ1FXa5WSHsX5BkrWRNaoUwgTCCFvkKYR97COd0ywj35Zh/P+S6\nNefSeoPuIUzLCAAAACBV3gbCks5yHf9v3YExpqsxZl9jTJfGbmyM6Stn4zjJaQvRmLrrexlj+niu\nG5rmvBTW2tWSFskJlodkuL+s874llZYRrfPRR87n6mpp7lzfhm2VIKvA3SGDlDsVwkm/16bWt4Da\nqVqkQhjZF7VAOKVCOMCWEe6ANMwXs7yP1bSMaB33WBs3Fv6/GQAAAKAp8jkQPjz+eYm1dq0x5hJj\nzGeS1sgJXjcYYz41xlxljGmT5vYHuI4XZrgv9/X9fRhnD2NM+0bPzDJvO4GgNpXLqUDY2zLCx417\n3E9wV670bdhWyWbLiLoK4bB/xkFVCFdXSyqq/yEXuSqErfXnPoB0vOFZoYdbqRXCwWwql0stI1Ie\nq2kZ4dv41kobNvg3NgAAAJCv8jkQPkBOm4alxpin5PQT3jf+tbqPfpJ+J+k1Y0wnz+17u46XZ7gv\n95v+9/BhHOO5XehSKoRNFFpGeCuEg1lz7gTCyc+6g2gZ4a0QDrtlRK07SPExSKqullRcn6CYEueX\n2drCD+gQLu/vV9j/xoIWVoVwLvUQrgmgQjiXAvBsVghLtI0AAAAApDwNhI0xRlJdS4ghks6RVC7p\nXEm7SOog6VhJ78kJhodKeswzjDsgrshwl1tdx2UBjROqWuvtIex/OwEpORAOu3rUWyFc7WPLiJys\nELbZqxDOlZYRSWssqvEtPKupUXLLiJL647DXjMIWtZYRQVYIezeVq3tng0SFcDYF3UOYQBgAAABI\nlZeBsJzA18SP28oJWo+11v7FWrvJWltprX1H0nclfRQ/90xjzCDXGK5oUplqVStdx95WD4lxrLWt\nGSdUsZSWET4/Ad19jnTMrdreZkXi62EHZyktI3ysEM7JQDgLbUG8LSPCrl5MWnOALSPqKoQT1wEB\niWQg7K4QjsKmct5AmB7Cvo6/dq1/YwMAAAD5qiTzKTlph+vYSnrUWvu59yRr7Q5jzK8lvRT/0o8k\nzUkzRmmG+2vrOt7e0FyMMaUZQuHGxmmSqqoqzZs3L+N5PXv2VM+ePZs8bsqmNT62jKipjUk/OkPq\nslzPbVgo6SlJORCceSuEfewh7H4yu2JFw+dlU2qFcPAtI2Ix56MopJeekiqEA20ZQYUwsiNqPYRr\na1cHTZgAACAASURBVJX8WB2BlhE1ngVGZlO5NlulNtulbd2oEAYAAECzlZeXq7y8PON5VWH3L80R\neRkIW2trjTE75FTnWkn/auT01yXVSCqWdJjr61tcx5naN3R0HXvbQnjHWd/CcZpkzZo1GjhwYMbz\nJkyYoJtuuqnJ48YCbBlRVVstdXHaKy+v+rT+62H3EI54y4hsVAhLzveiNNNLLgHxbirnV8UyFcII\nCxXCwbWMcAekYb67wftY7W3p1KqxczUQbr9OunI/qc026bH3VFt7iG/jEwgDAABEw8MPP6ybb745\n7GnkjbwMhOOWSdrPdZyWtbbSGLNW0m6Suruucm8Al2mDN/dGct778o7TWCBcN45V5g3o0urevbtm\nzJiR8bzmVAdLqW9J9XNTuZra+mfWW2P123uHHpylBMLBtIxYv17asSO5f3IYam1ywpGNCmHJ+TmH\nFggHWSHs6iGs4vpAOOwXOlDYIrmpXEAVwrnaMiKlh3AUKoT7vC21j///YN8ZvgbC3n8jBMIAgHSs\nlcrLpd13D3smAFrqkksu0fe+972M55188slas2ZNFmaU2/I5EJ6v+kC4OMO5dde7n1XNdx33y3B7\n9/UL0szDfd5HTRhnmbW2RS0jSktLNWDAgJbctFFBVo/WxOqfjVXU5G4gXBNQywjJ+c9F376+Dd8i\nKaF/FiuEw5JcIVwTWA9hd/uI0H+vUdC
i1jIiXYWwX3useVtGuOVWIBx8hXCoFdG1Sn6BzccNQBPj\nuxAIAwDSufBC6Q9/kMaNkyZODHs2AFqiqa1TS8OqWMsx+bqpnCS95Treu6GTjDGdJHWLX0x0c7XW\nfimp7s38x2a4r2Pqbm+tXeq57h3XcYPjGGN2k/QtOdXB72a4v6yLecNBHyuEq10VwttqtySCtDAr\nKa1VcsggqdrHZMX7ZDYX2kZks2VErlSeJf1e+9gyoqZGSSGwu0KYQBhBilrLiJQK4YA2lSspSX4h\nK5c2lfMGxK2RsxXCRcG8eJcY34VAGACQzl//mvwZAApdPgfCL8gJVyXpjEbO+4EkEz9+23Pd3+LX\n9TPGDE53Y2PMEXIqe62k6d7rrbWL5VQNG0lnG2MaagxwgWfuOSWlh7CfLSNinmdj7TZKCrkCyxsy\nyN8K4VwMhFPedmxiTjDe2nFd37Z0FcKhBsLu32vfW0a4K4QJhJEdtbWSiiul/V6R2q8v+EDYCQtd\n/9aKYqqp9eGBS7m7qVzqBqDBBMK58jid8jP28bFaSn3RZO1a/8YGABSOumKlQm/HBQB18jYQttZ+\nJek5OUHsOcaY47znGGN6SLo1frFK0uOeU+6Vs+GcJN3vDXPjl6fEL9ZIuq+B6dwV/7yLpDvTzGMf\nSTfEL36unAyEvWFhMC0jJEntnLYR4T8BzV7LiNwMhP1563W6t13nyluRvZvKBdZD2HVMD2EEqbZW\n0om/lH56qjTqeFXX+BOO5iqnnUAwj9WNbSqXU4Gwjy0j6h6Pi4slY+pD4fD/HrtD/2ADYSqEAQDp\n1P2NLPQX2wGgTt4GwnHXSVojp0fwy8aYScaYocaYgcaYyyS9L2ejNytpvLW23H3jeHXvXXJC5cMk\nvWuMOTt++7PltHYYFL/9ndbaLxqYx7T4uUbS5caY54wxJxpjDjPGXB6/rrOcHsZXWOvjszufpFYg\n+dkywjN2+xwJhFM2lSvslhEpP2OfnnRnCoRzpmWE8a9lRHW1klpG2CIqhJEdNTWSer3vXOj5gWp9\nfCErF6V7rK7x6ZlaY5vKhdpTN0ubykn1aw79HTtJgTAtIwAA2WVt/d9CAmEAUZHPm8rJWvu1MWaE\nnIrbXnKqcG9wnyIn2Zxorb27gWF+Lam7pDGSDpH0tOf2VtJj1trfNDKPmDFmpKSX5QTLP4x/uMep\nlPRza+0/m77C7EnZcMzHCuFam75COMxKyqArhHMtELZWiskzKZ96caYLhHNlU7mkPskBbipnXRXC\nBMIIkrdi1uk3m2lf1fyVrkLYr37v3seuXGmh4P17HNSmcpITCG/fnmMv0AbcMmLdOudvojHpzwcA\nRI/7XZMEwgCiIt8rhGWtnSvp25J+I2mupI2SdkhaIqdFxGHW2gmN3N5aay+SdKqcnsIr5IS3K+KX\nT7HWXtKEeayTdJSky+T0Kl4rabukLyQ9ImmAtfYPLVxm4Ky3IjjAHsKmYy5UCFvJJL/VupBbRsRi\nSm0D4tOT7rypEPbxbcjOCwpUCCP7vP3P/aqWzVXpKoT9qorOm5YRPvUQtlaJvvF14XcuVAhnu2VE\ndbVUUeHf+ACA/OcuYCnw/1oBQEJeVwjXsdZuljQp/tHSMWZImtHKecQkPRz/yCvpNhzzizcQLu64\nQTUK9wloVXXq+gq5QjhdRXSQLSNyskLY75YRrgAjZgiEkR3et9f7VS2bq2pqJJUGUyHsbRmRuxXC\nwVRES7kaCAfbMkJyqoQ7dfLvPoAoe+opadIk6brrpPPPD3s2QMu4/w4W+H+tACAh7yuE4Y+UCqQA\nN5Ur6hB+y4jK6tT1+Vlpl5OBcEqFcMyXgDSKFcKVVTGpqP4FBGvYVA7Z4X1xpyZW2M9avBXRkn9r\n9j525crjVmoPYf8ronMpEE7pIRxwywhJWrvWv/GBqLv1Vmn+fOczkK+oEAYQRQTCkBRsywhvD+G6\nQDj
cCuF0gXBwLSM2bw73LappK4QDbBmRMxXC7iDFxwrhyurkgagQRrZ4wzNveFho0vd7979COJda\nRngDYOtTy4hcDYSDbBlhbXJfyDpsLAf4Z/Nm5/OWLeHOA2gN99/BdH83AKAQEQhDUpoK4aLaRK/B\n1krpIZwDgXC6CuHqAFtGSFJ5uW/DN1vKE24p0JYRuRCsxGJKDpKKYk7vaB9sr0peVMywqRyyI6WH\ncIEHwukqhGt9eqbWWIVwmC9kef8eF3qFcEro72PLiIbGIRAG/FP374yqSuQzKoQBRBGBMCSlqUAy\n1rfwzLtBjtqFHwhX12S3ZYQUbtuIhlpGFHIgnC5I8qv3aGW1NxCmQhjZ4X1xJxKbygVUIZyrm8ql\n9BCOWoWwjy0jCISB4NX9nzfMF9KA1qKHMIAoIhCGpPRvSa2p9SsQTv4fom2XAz2E0wTCflWdSen/\nI7FihW/DN1sUW0Y4b61PXmC6ViEt4W0ZUSsCYWSH9/e60FtGBNlD2LupXK4Ewt4XUQu9ZURKD2Ef\nW0Y09PeHQBjwT92/M0I05DMqhAFEEYEwJEkxbw9h+ReeeVtG2ByoEA66h3BeVAgXeMuIdGsOrEJY\nbCqH7Ihay4i0FcIBbSrnfiErZzbDVPq/zy2RKRD2q03U/2fv3YNlyeo63+/KzKq993n0OZymaU53\nY4MtjwaZq6CgNlfUCRXUQBwVYyJmHNFRwStzVUavEept1DtjyDg3BF/gY0SN0RnvwKCO2GrIAEI4\nCt00DxGh6QeHfjec3ue1a1c+1v0jMyt/a+XKql2VK3OtzPp9Ik6cOrv2zl15Kiur8pvf/PzWpa+G\n8KlT1W0OhBnGHqyMYMYAN4QZhtlGOBBmcvSwEECcWDoI1RrC2dTTQNhisOJlIFxrCHenjPC1IWxS\nhWzCYcINYcYNtaFyupJnZJgdwuNWRuiN4D4awvr9fVI7eWfRIUx31ddeW91+7DE7y2cYhhvCzDjg\nhjDDMNsIB8IMALOjcG4pPMu0QDiduldGmJqiNhvCpg8SzgPhmkN4+xrCepC7KXpDOJEcCDP9oJ/c\nGbtDOE4kEKj7Zlv7al+VEbpDWFpqCOvrC/ixzrWGsEVlBF3OyZPV7cNDO8tnGIYbwsw40A8RLJoE\nGYZhvIUDYQaA+YDTVkM40RvCkwuLMNLVJaqmhnDagTLi9Onqaw8+aG3xa1M74Aa2oyEs9CCpK4dw\nlaRwIMx0Sa0hPHJlhGm/bLUhvPs48JU/i7/89J8s9l+A2+FIdWVEPw1hpwNAe1BGTKfmrzMMszlS\nckOYGQf6eyBvzwzDbAMcCDM5HSojMhiOrHcfz3+HowNQU/u5C2UEbSQdHFhb/NoYlRFBaiX08DVk\nMK2zLWXEPOWGMOMGXaEwdmWE8WoOm0PlXvBLwNf833jln30rHr78kPMha1L2o4woT9r5s6+mDeFu\nlBE7O9VtPtBnGDvoLUpuVTJDRT8m4vcJhmG2AQ6EGQDmA057ygjDcvbceoRNwWBq8VNs+SGCeild\nBoV9KyN8GM5kco/a2qbnibpSqYwB5HV3HirHdIkenm1jIGy1IfyEe/Lfk8W463N3Od9fZxlqJ7Js\nKSP0ffVhcohPX/eLwOf/JQDXgTB1CHNDmGGGgv5a4tcWM1S4IcwwzDbCgTADAJCGhnDS0VA5AMCu\nW4+wKRC26RAuzzJHUXUQ6jwQ7nGoHG2dubr02rTOtpQRc22lJOTid3FDmOkSfVji2B3CpvWzFYLn\n/5fVC/axK48tTmY5O1kZo3Yiy/T+vAn6vvr7/sf34aM3/DDwL14KHHvUn4ZwR8oIbggzjH30z3gu\ndTsM0wZuCDMMs41wIMzkHl9RD0NttSmNgbDjhrBp3Wy6OMsPFXRQkfNAWA8VLLWw6Acony5DNjWE\n49TOkYreEAYAhPnZDQ6EmS5JUqkEwsYrMEaEMRC2dDWHHkQ+duUx5
/tr04ksKe03hC/v3IXf/dDv\n5v8IUuDUOX8C4Y6UEdwQZhj7cIjGjAVuCDMMs41wIMzkl6iaGsKWGrNmh7BjZYTRS9mtMsKlSsA8\nVK4fZYRPDWHT874Jc1OwXDQNORBmukQfsjZ2ZYTJF2zLIZymAEK1IexFINxDQ/iDp25V7xSpP0Pl\nWBnBMIOBlRHMWOCTGwzDbCMcCDNmnQA6Hiq39zkALpURpsn13SgjXAcMi8fjSBnhV0PYzqe7OOWG\nMOMGfRsefyBcf/+w6hBuaAi7OpEVx+jeIXzth3HX3h+odwaJPyG4RWUED5VjmG7hEI0ZC7wtMwyz\njXAgzBQHY/UDTlvhmTkQdtsQNl2GbKt1BpQfKiTC0COHcEfKCK8bwtp2bXJHb4JRPVE0DXmoHNMl\nekBqU3XjI6b9sq0QPD9RtoUN4a/+KUBI9U7XgbCmjLD1vsENYYbpFg7RmLHAygiGYbYRDoQZc1gI\ne0PlvFRGGILBzFJDWEogvfqjwI88BZ+45cWIdtw3R80N4e4CYR8awsahcpbCs/kWNIR/6IeAZz4T\neN/7XD8ShqKfzBp9Q7hDh/CyhrBP+y3T+/MmpCnyq3Oe9cf1O70KhLtRRtD3JT7QZxg7sDKCGQt8\ncoNhmG2EA2GmWRlhqyFsCiwcN4RN62YrZMgyAF/3b4Gr7sfFM+/BxZt/GYAHDmE9VOhQGeFDQ9ik\njEgsDZUbuzLiwQeBN7wB+MQngDe+0fWjYSh6Q3jsQ+VMDWi7DWEPA2HDvtrG21OaAtjZN9/pMBCu\nOYQ7UkaEYfUexQf6DGMHDtGYsaC/B1o0CTIMw3gLB8KMMTgD7AyVyzLUh5kBi4awO4ewaX0thgzX\nfnjx74Nr/2f+O503hLXnIUithLVDagjbcwiPe6jcQw9Vtx9+2N3jYOro27DtQPhnfxa45Rbg/e+3\nutiNMbX6sw6HypUns5ydrDQ4hK1ezWF6PwY8aAiTFQySThrCUcSBMMPYhgNhZizwtswwzDYSrf4W\nZuw0OoQtKCMaD0B9bAhLO6eCkwTAozcDJx8EABxe9Q/574xznYQQVn7N+o9py5QRxoawpSApzpob\nwmNwCH/uc+bbjHv0xqxNZcT+PnDrrfl+6gUvAC5fBo4ds7b4jeiyIWxSRpwhDWEX++umhnCSqPvV\njZftbSDcvTKCG8IMYx/9teTqqjCGaQs7hBmG2Ua4Icw0KyMsDOBqPAB17BDu3Et5/qbFvw+PfwpA\n4RZ29OFiG5URRoewrRZ4ZlipESkjPvvZ6jYHwn6RSG2onMVA+NKlfD9V8nM/Z23RG9PnULn9w32E\n0+rfLi4XNb4f2xwAOoRAuAdlBIdWDGMHblUyY4G3ZYZhthEOhJnmoXIWlBGDaghbao8mCQCpvbQc\n6wQ6DxkKvG8I21JGGBvC41FG0ECY3mbco2/DNpURegD6+tcDn/yktcVvhFkZ0c1QOQAQe9UZEBev\n5TiG4eSdxX112LBSPjmEWRnBMIOBQzRmLHBDmGGYbYQDYabToXKNgbBrh7Bh3TKbygh9na/+RP57\nPRtUNPqGsKZCiU3N3k2WbXIIj6ghTFvBsxlwcODusTAqtXa6SK01WfX9wXwO/OAPqq3hvulcGaEF\npPLYY4vbLl7LZr1PZs/37mtDmL4/WVRG8FA5hukW/bXEry1mqPDJDYZhthEOhJmiSVlPFGw0hJuV\nERcAkXqljEg6bJ3hSR8F4FlD2LZD+PS9+GDyn3FpfsmfhrC2ztZa4NKwUiMaKqe3glkb4Q+1MLQj\n32rJX/yF25Zwl4Gw6f0p2/UgENZP3nWkjNiNdsnvGKcyghvCDNMtHKIxY4G3ZYZhthEOhJkelBHV\nso9Pjld37j7uVSBsa3K9MQS/9iMA3DWijY/JZsggMuA7/yl+78q/wE/81U941BDuyiE87qFyHAj7\nSyrrDeEuA2EAeOwx89f7wBT+2
tJk5O9P6ms53fEgEDY0hLcqEO5IGcENYYaxD4dozFhgZQTDMNsI\nB8JMr8qIa45fU925d95dQGpsndlURmifKnxoCHepjJheBM7cDQC446E7vG0Im573jZath3IAwuk4\nlREAe4R9onZSw2JDuOkiCZcnOUwNYVuBsOlEmetAuMkh3IUyQgmEw9gjhzArIxhmKLAyghkLfHKD\nYZhthANhxuhaBYC0A2XENcdIILx73quhcl0OKsKT8oaws/WN0a0yIqwSo8vzy942hLtURkQ7rIxg\nuqfWmO2hIew0EDaEv7ZO3pkcwum0CoRd7Ls6HwA6hIYwKyMYZjBwiMaMBW4IMwyzjXAgzCCOJSDq\nU4O6aAg/8dgTqzv33AXCJnVAl15KnLkbmF7yqyEcZPlz35I0BRAdLv59Ob7sbUPY1lC5sTeEORD2\nl9q2F9hpjwLNBz+Hh+av90GXDWHTybt46oEyosurObwNhOlQOXvKCG4IM0y36O8/rkoADNMWPrnB\nMMw2woEwg8PY/I6XWmjM6gegp3ZPVXdGB+7CQmPIYFMZYfhEfM3HHDuETZ5oS4Gw1hD2IRDusiGc\nGhrCwWQ8DWE9AOZA2B9q2/DYG8Iw7KsNX9sEk94njjxQRhiu5uhCGbEX7VV3Og+Eu1FGcEOYYbqF\nlRHMWOCGMMMw2wgHwkxjE9jGUDn9QO/E5ER1ZzRz6BA2KDK6VEYAwJM+4s8Bd/l1Wy3wsKoQXppf\n8kIZsRh2R+jUITwZx1C5LGOHsM+YGsLsEN5w2YZ99dxxILxtDWEpi+2uB2UEN4QZxj7cqmTGAm/L\nDMNsIxwIM4hjcwpgLSykgfCUBMITdw3h3pURAHCt40BYDxlgUQtCG8KeKCNMrWhbz7GxITwSZcT+\nfj0Y5IawP/TlEJ5Oq9suA2FT+Gvjag4pzQ7hWehBILxFDuHFevFQOYYZJByiMWOBt2WGYbYRDoQZ\nzJOGhrAtZQQJIpVAOJq5CwudKCP+3kNlhKXnmDiEkyxBJqoVddsQ7kYZkcHUEB6HMsIU/nIg7A99\nOYT3iE3AlUM4y2A8kWWjIbx4e9P21YeBB4FwrSFsJyDVFRk+BMKLbZeus0WHsK6MKK9e4QN9hrED\nKyOYscDKCIZhthEOhJnGQDhu+Po66OHoyZ2T1Z0OlRFdOoQblRG7+941hO0pI9Qnci4vL2571RC2\npYxAfaXEZBwNYZMeggNhf+jLIUwDYd9OZNlwCOfrKmvLPyCBsIuTWWaHcNaJQ9irQFhRRkgrfnuA\nlREM0zXcqmTGAm/LDMNsIxwIM43KiC6Gyh2fHK/udNgQ7m2onBSIZHHQHR56FY4CNpURaoXwIL0M\nIcjvdkC+7enKCDsPJjMEwkE03kCYHcL+UGvHdjSAy4dA2NTyB+wEwk1XcsTi0mJ/5k1DeMTKCGMg\nDHsn71gZwTDdwiEaMxa4IcwwzDbCgTDTGAp2EQirygiHDmHDwWYXDuEAESKxk98ROQ6EjQ1hS8+x\n1hC+PK88wq7WeW440WHNIWxQRogoX9GhD5VjZYTf1LY9iw1husv3IRBuCm1tKCNM/uAFx/IzIN44\nhEc8VM7oEIa9k3e6MqIMhLMs90gzDNMOVkYwY0E/uWFp1jjDMIzXcCDMNDuELXyqW6WM8KkhLDtQ\nRgSIEKKYzhQeehesmILxddEdwgBwaX5p4Wr0anCgNYewQRlRNIQXw6oGCisj/MbUEB6rQ9jU8gcA\nCfsnKxWO5doIbxrCws5z7GMgXDWEu9H7NDWEAT7YZxgb6PsmV1eFMUxbuCHMMMw2woEw0+gKtjZU\nrrEh7NAh3NHkekANX0MRYVI2hMO5f8oIC55oY0M4rhrCrg4O4qS7hrAU9ZUKourJHbI2whQIHxzk\nfxj31JqTY3cIG65skEhbtzv1AWsKDgNho0N4C5URNk5WAs0NYf0+hmE2g5URzFjgbZlhmG2EA2H
G\nGJwBdho6esNLCYQnfikjbFyGDGyrMkKtEHqhjDCE3dYuNTcESWVDGBh2IEzbwNddZ/46446aP3fs\nDmHDiSwEaet257AawhJJ0t5voK/zXkSeZM8C4S6UEXpDmA/2GaY9rIxgxgI3hBlmHFy+DLznPXzF\nylHhQJjptCGsqwr0hrBPyghbDWFdGTEJyoawu0A4TiQQ1NfPmjLC0BAulRGudsamdUsshAx5g8+w\nnHAcgTBtCD/jGdVtDoT9IHPgEHaljGg6kWVjnZc7hB+rfn/P5Otc31fPG07crkMtEJ64D4TzwYEZ\nINTA26rTv8C3QPj8eeB3fxd44AG3j4Nh2sCtSmYs8LbMMOPgJS8BXvxi4LWvdf1IhgEHwkyjQzi1\n1R6lDuGp6hD2ShlhwUsJLFFGRO4cwo0tcFvPceRfQ9h0osNGQziOYQ6SyNeGPFiOBsJPf3p1mwNh\n92QZ6gFpDw5hHxvCbQ/Ulp6s9E0ZASDpIBD2RhlhCP1tOYR9Vkb8wA8A/+pfAd/yLW4fB8O0gRvC\nzFjgQJhhho+UwHvfm98u/2aWw4Ew0zg8zp4yonqHPTY5Vt3psCFsCga7UkZMFw3hGIdzN1Ns4tSc\nGNloCCcJ/GwIG7Zra4GwyT06sIbwfA689a3AJz+pfr0MfoMAeOpT619n3JG3KesDx8aqjGhuCGd2\nGsLkdfzkE0+u7nQZCCey1pYFgNjCk6y3or0JhA1XXGyDMuLDH1b/ZpghwiEaMxZYGcEww4de7civ\n4aPBgTCDuKElmlpWRgQI1UtUI3cO4b6UEaGIMA13FvfNYjfJSqMWxFrIoDaEL80vOW8Izw3rZuMy\n5KYAQ5CgZQiB8L/7d8C3fRtwyy3A/n719bIhfOYM8MQn1r/OuMPovd1Gh7CFEFx/HZ89cba602kg\nbF6xpit51mFVQ9iZIsMYCI9fGVE+NtePg2HawIEwMxZ4W2aY4UNfx+wQPhocCDNLHMJ2D0ADROoQ\nG5cOYcPBpuxIGbFoCAM4mLuRcTYFvzY80UaH8HzkDWGDMkIGw2oIf/CD+d+PPgrcdlv19TL4vfrq\nPBQu4Yawe4wBaZBsn0PYQgiuh6NPPEbOfuxcAODmddwU/I5VGdE03G8blBHleyMfsDBDRn8d8fbM\nDBVuCDPM8KHvQfwaPhocCDONl6KmFhqzeiAcBRECUWx23jmE7Ssj9IbwQewmWWl8jm2F/rpDOHbv\nEE4MzffaQK4NaBoqN7RA+PLl6vYf/VH+dxwDF/IsDGfO5KFwCQfC7jG2KcV4HcL5+vYzVE7RGRUq\nCSeBcGxeMRvKCH378SEQbnyOgwSybs7YbPkFvjaEpVRPxjDMkOBWJTMWeFtmmOHDDeH14UCYUYIz\nIatNwsbAMdrwChBBCFEdhLp0CBuOvrpTRkwX980SzxrC1pQR9Yaw+0C4G2VEo0OYfG0IQ+UuXapu\nv+Md+XqdP199jRvC/mF0CI9dGdGRQ1gPR5VAuNifufgg2RT8mk5wrYuPDeEmZYQtN/YQGsKA+8fC\nMJvCIRozFrghzDDDhz9brQ8HwowSnAWYLG7bHioXitwhsDgInbhzCJvawNJSIKw3hHdIQ/gwceQQ\nbmwI21JG1BvCpTIiy9y0n0zrbG+oXP780hMoQ24I7+8Df/3Xqif4zBk1EGaHsHuMl9d3NFTuGMlH\nvWsIW1NGNDSEQ3cNYZcOYa8CYUsnOnweKsctFmYM6K8j168rhtkUPrnBMMOHP1utDwfCjBKchTQQ\ntqyMKAPhhUd4S5QROxEdKudIGZGZ94j2JtdrDWGijADc7JCNgwMtPMfUIRzKqkaZDSwQpg1hINdG\n0BYwN4T9wxiQWmwI++YQbm4I23cI70a7EBD5P4r9mVcO4ZE2hJscwrbc2ENQRui3GWZIcIjGjAVu\nCDPM8OGG8PoMNhAWQmRH/PPOIyzrpUKItwkhzgkhZsXfbxN
CvGSNxxMKIV4lhHiPEOIRIcQVIcRd\nQog3CSGe3W5tu4U2hENhtyFMw9EA+ZGYF8qIDofKxUkGBPmyokANhA89U0Z05RC+NL+0aAgD/lx6\nbaMhnG/TRSCMCSZB/pqRotqYhxgI//Efqy3gq6/OW6I7xebLgbB7jAFpRw7hyQQIik8IY2wIJwkU\nh/AkmGASFu9/Dh3CzcoI+w1hZcirbw1hVkYwzCDgQJgZC7wtM8zw4ZPt6zPYQLhAHvGPEZHzmwD+\nFMDLAVwHYFL8/XIA7xBCvHnVgxBCXA3gbwD8KoBbAFwNYAfA0wB8H4DbhRDfs9kqdg9tHoWofLfW\ndAJlIKwrIyJ3yghTQ9heIEwD9gi7ExIIp74Fwh0pI+ZqQ9hJ0GBYZynsDpULUDmiM1GlZkNwCFNl\nBADcey/wrndV/z5zBhCiaglzIOweY5uyI4dwGALT4u3AP4ew/YZwFESLkzsuG8LJNiojTM/xsOon\nWQAAIABJREFUFigj6O/ngxZmqLAyghkLHAgzzPDhQHh9otXf4j2/hjyIbeLykvv+PYDvRh4a3wHg\nPwD4FICbAPwYgC8G8K+FEI9KKX/StAAhRADg7QC+pFjOWwH8JoDPAXghgJ8EcC2ANwkhPiOl/POj\nr1o/0EZSJCaLCN3GAC6jMmJClBGxBMrLdHtEGtQBtpQR80QNGXapMsJVQ7ihCdz09XVoUkacctwQ\nNq2bNYdw0SwMMMEkTIAYyAbUEE5TYDarf/0tb6luX311/veZM8CDD7JD2Afy/Wm9IdxlIDybedgQ\ntjZUrnqhRkF+cudyfNmtQ7jp5J01ZUS1UvTqFe8awh0oI7ghzDD24RCNGQusjGCY4UNfx/waPhpj\nCIQfkVJ+bN0fEkI8HcBrkcef7wfwYillmdbdLoT4EwDvRh70/qgQ4rellJ8yLOq7kLeCJYBfkVL+\nG3LfB4QQtwG4HcBJAG8UQtwsbU0vs0RNGVEEwpmF9qju0wVIKynIMI8TgHiL+6JTZUSqBsJ7pCE8\n960hbCv0n/rXEDYFKNYcwmSbNjWEfQ+EaTv4Wc8C7r47D/3On6++TgNhADg4yP9QtyzTL8bwrCOH\ncBhWuhDvHMLWhspV/5eTkCgjihNcTlQ3DU1ga753ss7TcIpABMhk5p9DuANlhG8NYW6xMGNA33Z5\nW2aGCp/cYJjhw5+t1mfoyog2/DCqQPw1JAwGAEgpDwC8pvhnBOCHGpbz2uLv88hbxQpFiPxzyGuw\nXwDgW9o9bPtQbUAUdDFULn9HrQXCAObSUFPsAVMwaCunny8LhDNXDWHzHtGeMqLeEHbtEDY1hE3N\n8HXJA+HKIVwGwimGEwhTf/DNNwO/93vAiRPq95RBMB0sRwNjpn/6dAgHgXtlRHND2FIgHNYbwgA8\ndQjbHyoXBRGioNhRe9cQthMI+zpUTkpWRjDjgJURzFjQ3wMtHCJ5j5S5Lu4DH3D9SBjGDnz11fps\ncyD8MuRd2I9LKd9v+gYp5d8C+EfkYe436/cXLeObi+X8Vykb0823kNveBcK0PToJ7A6VowegUamM\nIINsYnnQ+ndsgrEhbGqibUBNGTH1oCHc8Fxae459dAiblBEWAmFlUKKovKMZhqOMoA3h48eBV7wC\nuPNO4IUvzL929dXAM59Z3S5hbYRb2CFcYGuoHG0IBxPiEHYXCDfqfTpoCPsTCJueYzvKCF+HynGI\nxowFblUyY2Ebt+V3vhP46q8GvvRLgX/8R9ePhmHaww3h9dnKQFgI8TTkg+OAXAuxjPL+64UQN2r3\nvcjwfTWklA8D+ATyYPmWNR5qL9ADUNoQziw0ZhVlRGBoCGceNYQtKSPmido625tUg/pcNYSbgt+u\nGsKX5pcQTap5jqMbKlcERpEwN4R9HypHG8JlM/imm4D3vhf4q78CPvQh4Nix/OtPelL1vQ891N9j\nZOr06RB+PPsMHvnalwJ
f92/H2xA2OIQBOB0q16iMsHyCFvApEN4+ZQRfZs+MhW0M0Zhxso0O4dtv\nr27feae7x8EwtuCG8PqMIRB+hRDi74UQl4UQF4QQnxBCvEUI8VVLfubZ5PbHVyyf3n+zheU8RQjh\nlYWzsSFsfahcfiRGA+HYJ2WEpUCY6hkmwQR7pCEcZ26SlcaGsK3nOFKD7lSmCKJqXV0c7JrWzZ5D\nuFBGiMo7mg6oIWwKhIG8Qfc1XwNcf331tbNnq9sPPtj9Y2Oa6dohTJfzs/d+PS5fdxvwFf8RByfW\n1vRboTksbD9UbqlD2KEyoqkJnCR2T9ACngfCI1dGcCDMjAVuuzNjQNf4ANuxLdP3fX4fYsYA3Y6l\n3A71S1vGEAjfDOBZAHYBHAdwE4DvBPBOIcTbhBBXGX7mBnL7MyuWf47cfoqF5Qjt55yTkFfKJKza\nrDYawnGSASJvipocwjHcKCOMLlmRWtlpKEPlwgjHlEDYt4awLWWEIeieVl4CX4IVew7h+lC5VA7H\nIawrI5Zx3XXV7Qce6ObxMEeja4fwYv8nUpybVSFwvOOmGt65MqLJIeywIdyojBhpQ7jPoXI+KSO4\nVcmMBd6WmTFg2m63YVvmNiUzNviE+/pEq7/FWy4D+CMA70Tevr0E4BoALwbwKgBXA3g5gLcLIb5W\nSqUueJLcJl25xt9Too1dsrYcp9BQcKo4hC0EwmmyOO1QHnhSh3CCGaQEhGj9q9bC2BQtWmdBy9Mk\nukP42E4VCCdyrEPl6uslJ5cB5BPJfGkI2wiED+cZEOT/b7kyIj/hkSJBrhMX3gfCTQ1hE9wQ9gej\nMqKLhvAN/0v5epYGSFM1TOuD7pURTQ7h/LWcJD2/MWHJ1RxjHipnDP3tOISXNYRdHiTwAQszFjgQ\nZsaA6f1vG7ZlbggzY4Pfk9ZnyIHw9VLKC4av/5UQ4pcA3Abgi5EHxK8G8Mvke3bJ7VXX8NOkS1c9\nLJYjpWyzHKfQA9BJRB3CFtqUhkCYNoQRzpCmeXOnT8wN4TwQpsPQNkFRRugNYUeBcJMawpoywtQQ\nnrhtCJsCFBuDAw9jNVCZ0BMIYQykU+8DYW4IDxPj5fVdOISf+cfqHUGM+RzY6/mdq/uhcmpDeKGM\nAIAwRhxP6z/YMU0O4T4awlmWt8TbnhRdh66VEdwQZphu2cbL7JnxYQpDt2Fb5gFczNjgE+7rM9hA\nuCEMLu97VAjxbcibwxGA10ANhKm4dtUR3w65rfsNFssRQkxXhMLLlnNk5vM57rjjjpXfd/bsWZyl\n1b4lKA3h0O5QuThNgbJ0ZVBGIJohjj0JhC0dgFJlxCSIsDdx3xCmz7GAgETearUR+pscwgAgoyp1\ndLEzNgYoRejRJvA4VIYGTjANSYswnAPpdJBD5ZrghrA/GAPSLhrCeiAczp0Ews0N4W4cwlOiTELg\nJhBu0vjEFp5kus6BCBCIQAmEgfzk3c5OwwI6gIfKmf/NMEOBt2VmDGxrQ5gDYWZs5K/lB4s/wB13\nACdPmr937vsBe08MNhBehZTyHiHEXwL4BgBfIIR4spSyFCFeJN+6St9A+3O6FkJfzuc2XM6RefTR\nR/H85z9/5ffdeuuteN3rXnekZSakSbkT2R0qF1N9QlgoIyYkVZgcII77DxqWKSPaEmfqOu9EJBCG\n+0B4J9rBLJkVX7eljKjvULPIcUPYtP0WIUOrQHhOnl8RYUoTBofu0XVYpyG8uws84QnA+fPcEHaN\n0bdq2yF89SeAa7QZqUUg3DeNDWFbygjNIUyHqiKcI45XvDg6oFEZYW1fna9zGQS7DoQbHcI9KCM4\nEGaY9nDbnRkD29oQZmUEMzby7fjNAH4aAPBVX+XwwQyE0QbCBR9DHggDwPUAykCYDoBbNeCNDpI7\np92nL2dZIFwuR2L1ALpGrrnmGtx2220rv++o7WCg44ZwpoZngLkh3DcShnWzFAinmjJiJ
[base64-encoded binary payload from the notebook diff omitted — not recoverable as text]
gbQo+Xysq\nB1DMSs0JYUNRuSKREULHj9sOuFU50lOVZGKJTWNZTZXiBjfG+cmJgWg7pjOEm5AQbrXk+VtEmxiH\njBgOi/lNraxMEse143gDHfrjVmZZQ3jOlYSMgDPKdRKlMoRrQEYkJ5ImREaw+G0OUqTOGOc3y786\nRZkMRaSid/qyxV9ZiE8Is7Z8bdn7OdFYMSAjdEMYCKeELz5MkBEGM5UaZ60pSAgrRl+CkhLCTUdG\nrJ+T2314D0FGpCgq1xRDmA5mBciIIhPCIk05dtBuyeuDcp1uVEI4/rqVOiEcwxD2vrgv31eiEgey\nAAyGEySEY0xw+nd7oeqEsHoPopvTExV5ZWpROfEd4rODdahY0QzhYu4J6PV4PPZMYSurolV2qtLK\nqiqVjYyIYwjPYkI4Dhmhf7eVVZESxy8d5KCPW5llDeE5F020mjqgec3CbhchNIMuxph8vCJkBO2I\nMThgjBWLjEgyhEmK9NyF8p0VU4cb0PdxzoTwQK7/ymJ8QrhKQzjSWHFkmjKPIXzgkvRF5YQh3JSE\nsGQIeyu04C6AMZb6/UkM4aYnhM+cl8ffpfv3BfsqDTJi/bEGGcJMSwg7JSSEx22lM1QrMiKhAGge\nQ9grKpeOIex9sffauhjCRTD9sySEqzSETTN2imibAD+BRPazMP7pOQM0iyE8LAEZAVhshFU5ssgI\nq1lR2cdy0xnC1DwrGxkB2FkrVuVJnGu6IWzbpHhZQ3jONUZCQjin0ZAmIQyQjrfbrzwh7MC8vZOM\nXPIEZMQiMYQ3tsp3VvQOd8gcBfIjIwayxV9djE8I02RqNQzh+ISwntw9eVIuHz7s/X/VVepr9h0k\nDOEEZITbaXZCOAsuAjAnhKcJGXF2Q55rB/YuYO/iXgBhRITJED5zrhl3rnSgo6UXlSsgITwWCeGR\nagjrReWqTQir1y79uhW3LlHIiG4XwcAIoA7kCJkSwmWfy2nwPlUgI6pPCMcjI/Ie18Mhgn0HyHuN\nRieEx8UnhAHgvvsK+VgrK0V6qtIht9K28201Taq6qFynA4hMRl0JYT21XJYhrCMjAGsIW5UnmhCm\n57FNCMfLGsJzrnFCUbmJEsJpDGFhNlSEjKDJHNdoCE+GjEhKCC+1pflwvgJD2FREDyhmm3tDuf6r\nCQlh7tafEG670UXl0iSE1/bHIyOoseQ0LCE8sSGckBBuOjLi3JY8/vYsL2Lf4j4A6ZARZzaaceeq\nFJUrIyEMmRCmHfu6EsJ6ijJrwbHYhHACS1s3wYHqr1uma3VRRV71QnpNMYRdxw0hI/K2x6MRFExG\n44rKGRjCNiFsNU3SE4aMSVPYGsJW06Sqi8oxJkMVdSWEda5xkYZwEjLCGsJWZUkcv3SQA7BtUpKs\nITznMiWElYRO2YZwkBCuniHs6BxOYCJkhDdFNcEQJsXXzm9XhYwIm/5FVHLPYgijQkM4KiHcSoGM\ncF1ZTI4awnv3As5CAjKCJoTb3g/a7zejeEIZCeFpQkZsbKoG4N4FLyF8oXcBYy53kCkhfO58M+5c\n6XEdQr8UmRAet2KQETUWlTMkhHMzhJMM4RqQESZ8AqBeqzlGuW5qG50Q1gYsi2ibAENC2G1YUTlD\nQnhYUkLYGsJWZUifZg9II812vq2mSVUiI8S5IpKzFhlhZVWcbEI4n6whPOfiBoawXlQutyFsmK6p\nKzAb3H4l0+uVwkymhPAEVc2jOvRUy8QQ3typChmRYPrn3ObeSO6wteV4ZMTYqQ4Z4ZkM0uQT+9d1\npXmmH2vCEL7kEplwociIo0eB3jABGUETwm25b5tw4yMZwt42UPM6jUwJ4WlBRvR6QLevIgIEMoKD\nY7O3GTxnSgifu9CAHQi9qFwZCWH/2qUjI7SiclVhUGhilsFjvWfh28cjI+IN4fqKyiXgfXLO5shi\nCLc63ut2d8sfzDIyhAtomwDxe8YkhC1D2MpqIun
T7On/1hC2miZVXVQOkPfQTSkqJwzhIrY3CRnR\nlNmTVrOnqKJytk2KlzWE51icmxEHeqc7z4W72wXQkb3wlc6K8XVVIyNMCeGiisqlMoQXpBF3Yad8\noynKZCgEGTGSB8ae5fiEMDWEq0naGZAREQnh8Rg4fdpbFrgIALj2Wsn41kxKJAAAIABJREFUuv56\naYQCEcgIYrIyYgg3gSMcGGKut15FJ4SbjIx49FGEDECBjABUjrApIbyx1Qfnpa5iKlGDNFQAtNCE\ncPOKyontVAohxpjgnKsdDnp8pkJGODUhI5LwPpMM0KY0hN2OfF0d3GRl3SYY6NCREWKfBoZzExPC\nvBhDWB+ge+AB2xmyKl56whCwhrDVdKrqonLAbCeELTLCqi7ZhHA+WUN4jjUeIwVfdoIOaFv2Spbb\ny8bXUWREr1dtIkkkhHUDPG8iaTCA0XylomiFrW71DOEoZESebe5zlcuqixotI1Z1QtiAjCAJYXqz\ncvasvOGjhvDhw8Af/AHw/OcDb3yjNEKBCGSEazaEmzASvrUFABxol8MQbnJC+NQphAzAA0sHgr/P\nds8Gy6aE8HDcx4ULpa5iKinXLj3pX0BCmBaVowzheovKqQY4kM4E19exiIRwJdetBGRE3mt1poRw\nW76u6lR0KAU+AUNYR0Y0jiFcITKi3wcefriQj7aqQHfeCXzyk83ATcXJhIwo0lSysqpKVTOEAWkI\nNyUhLLbZIiOspllRCWHbJsXLGsJzLL1TUnhRubZs5Vba5oQwRUYA1SaSXGNCOP8UVaXTzR31c30t\nL8oWcWu3KmREfOX6vKmzwTh9QnjE5GtrSwiTonL0ODMVlBP6nd8BPvUp4IYbgN0kZARNCLealRDe\n2oJijkyUEJ4yZMTp0wDI/lhoLeDg8sHg7/XtdQDAaDzCZn9Tfzvg9pVjpC5RhnBoYKeIhDBrckKY\nJFhTmOBxhnCahDDd5s5SPYZwke3xzg5SG8Ju5YawxsUuoG2Sn00Swk1jCJuKyhWUEDYdqxYbMR16\n5BHgppuAf/EvgPe/v+61iZdFRljNispGRpgYwuIeejCox6yqiiFsQkZYQ9iqLFlkRD5ZQ3iOFVXV\nvLCichmREcH7SpTK4SwWGUENYYebi+jRhPBOBc6Kvo+LREYMSEKYFssTokbLEPUnhN0IZEScIUyV\nhIxQjLNWsxLC29tINMHipCSEpwwZ4RnC6rZfvHxx8PeZnTMAvAJzRjXEEI41zwpOCDemqJx/PVUM\nyxQmuD4IkzUhTAdAFtd2Qp9RhiITwgUwdTMhI2pKCIt10JEREzGEDXUMZEK4PkNY39fB4yUlhAHg\nvvsK+WirkvXlL8t7hs99rt51SZJFRljNiujx6jjVMISXSK6kDmxEFEPYIiOsplkWGZFP1hCeY6Xi\ny07CLMyIjADqSQgXUbQH0AxhmA3hRZIi3a4gIVzmNOQhl+tvMlWiDOG6EsLSSBpjpyuhsHkM4SRk\nhGD1Ag1KCBMTTFnXFDIlhKcZGWEyhCk/WDme3T5Onix9NRM1GI4B5h23LQNDeDDARKzjqISwXlSu\nKkOYDt5RkzBon2IG7+IM4TQJYTqjpbPqvbnsQY/IAqAZCulFyWub5EW+KYawsVCihoyYiOlPGMJi\nYIMOogANKypnE8Jzr/vvl8sPPiiX3/Y24Bd/EThxovJVipQJGWENYatplDCLXNerHSL+AeUhI+gs\nuzoM4aoSwtYQtqpS4lyzCeFssobwHCuqiE0RHdCdHQTICJe11JQZUfC4MwLYuNIOqGOakjtBVfM0\nhjDFCuwMKkJGxKVlgdwm+AhkGr7BYKTb2h83KCEMYLcn4XxpDeEsyAjuNjEhHG/gx8mUEJ4qZISr\nHqsHVwgyYsdDRlB+MEVKNCUhPBjGzObwj/e81y7OObg4Z8YthSFcKzLClBCeEBmRJiFMZ7R0Vqox\nhKMG74pAGk1
gk4O8Y8dg4SKAYFtygXgSwn4mg/Ydkwn3kIMhbE8IU5MliKj5UiSG8NKS07N7V5iicrWWGNnS\nlXoLf9CnaKpQXh9xGsK0g2/kxI20bWibkYJ5dmmfWWkx8Q43EoYwkGxCmBSVs5tnWlqWssALeoJM\nipoI511ILko/hrDD60RVRdk3IUzM6fK06ij2zJStgiXSDKeLVxQT00tCeHRyAyaEO0XlgvZNVkFM\n3rTa0sQNYXI/uyWEpShTl+4+orKSudl0MIQBoMA615mDIVyrAZ//fPdxTp1pKgO8Ipj+g4Rwus7N\nTaaREXNz7s+LQkkmhCU/G0ifQXn2rOobvHARgJ4QpkqDuQJsHUNYJoQZ694hJQ3hWq13TqoTMoIG\nUYB0IiOooemEjDi92j0IOtc4gUfmH7H+fdH2izRDmHM+MIRTJvo+wxjCdPyZhjbL7ZrO59V9nfT8\nNM0aGMJbWF4JYYCYaSQ9evSoqhD6xKJKCK/WV1Fv1XsyhIfzw9YEv1mILiFcb7RFihHdJoPd7KjX\nRecvt2UCetpOm3TlVQujGYEe0tAEUSaEPRjCQHcldwDIl9WgpZwXibsuZETYhDD5XArD0SEjeikq\nR1fqTSEjNPOlw2hNChlRz4gbKQpcBGBL4Q0tJNrhttudRYYoGMIpSQgXPBPCarYmETd2UUN4ejep\nXhWGIdwx6KIyXsIkhBeqava0d0cZf/u3wLvfDbzjHeJnbnibXhLCIxPq4jaZEPZrq/sxhGVCOKgh\nrHHwI2y76HdsX7gDwiMjpI4vC7hulmUxPeywIgJaVC4dDGEAyGdkQribIXzzzc732gPHCHegIs7V\nnkiUhnCj3UCprD7HNE1WBwzh3o6xVYvKPfaYepw2gzJIQTmpbds6hblsqlSiW2wNo61iCMvFDaex\nD72Wez33NCaEwzKEJTJiakq1O6dWu13AEyu6IXxo6pBlCLd4C5VGZVMbwhutHwL09zkaYhNp2pAR\nbtc0YyolPEBGuGtgCG9h0YSwoyEsjYeMbghL0YQwIDjCvRjCjDEryVPPRoiMIKAoz4Rwx+xotZwT\nwoBt0kUSwj0VlStGU1TOjyEMOBvCuaHuhPC20jbtPffDEM6WokNGhEkIOyIjIjGE408IW5P9TBM1\nJu7JWAzhUrKGsIUxIMaoSWREUpOhJndnCLsZZ064CAB47WvF/48cAXafTxPCYvHHlSFsQ0YA0ZlK\nfulRavJTREC5UMaNNwIf/KCaYF06cyledP6LtL9nYNg+vB1BRNv04XH1eUWZELa31faip1JhDOGg\ni5Va21UQfxtFYonuYAmKjJie9jfPTqycACD6KPuuGClVVC6phHD3eRez7siIz35WPb76avX44Wf1\ngnKAe0IYgMawTNNkdZAQ7u0YaWEI2w3hqLfo0oRw0jUL7KL8YL+EMKDmVCMjwEtfqn6ehsTdVjOE\n7dcxoBLCQO/XmkoIk/a3tow2V8wtmhBOmyG8vKyKmFJkk5sh/Og5dYMenDyo7SJcXF/cNIbwuMN0\nMa1YJi9tFmREkEKng4SwuwaG8BYWh7dZqBLCqtWWg5dWu6XxGAHgmeVn0OJighXGEAaUWVWFMoRp\n0Q0TqpHWwp46c5p8Nhq6IeyeEFYmQdBJd3zICHcsCOBsCGeK3YYwY0xLCYdFRlBTpYkqyh3UZxoS\nwhQZYYohrF0HCSSErQ6+pD7gyAzhIWoILyY6QbOuJ5IQ3hzICMoQ9igqF8AQfvObgaefFhzGOu8R\nGRFjQtgPGaEZwvly13MzLIOvv/nruO/t9+FPXvIneMMlb8CHX/5hTJed06N20Xu5NLoBkRHN3pAR\nADA1K64PORk0KS0hnA1mCPvxgxuthsXU3T3qEL2Th5N9IROvX6vFNzmt1bnVL9M+2csQ/va3xf/L\nZeBnf1b9/NGTJKbeSQjPzOh/Sw3hamvV6nvTNFk1wRAuldQkME3n5
qbNZAgPkBFKYRLCAPCnfwq8\n5z1iF8All6ifp8Fg2SqGsMwK+RnCvV7LqqicSghzcKzU1Ip6sagWUuJARriZZ06GsFtBOb+E8ERp\nAtuHt2uhkY1uCEvM4ciI8/VCg2P0c0uzejWENwoyAlB90iAh7C6Hy3mgrSK/hLAyC8VdlssBV10l\nfnRi5QQabb3YHC0yF9YQlhzhBtZF4rY5ZHyVlJoqdnM0wzLIZXJotpvW5LNe15ERtKF3Q0YEnXSP\nFskyYqRF5byxIE6GMDW46fmcP3E+7j19L4DwyAhqylUbVUxMiG1x0SSE+ygqF0VCOAGGsJXcJExu\ntwJL/UpjCCeMjFCGMEkIm0JG5GrJISNIW1vIeSAjyHm7GcL0d5V6j8iITnsRlfHSbHHCWfVYyAKw\nsK5mT+VCtyEMiAWtS2cuxaUzAWboNtF7mfLPjSMjAhcAVd+xX1G5Nm9jvSkWRwIbwjn1vB27q5h/\nVGwHbDadJ0C9qpeicn64iJOrJ8EhsFC7x9wNYfl6nPQVKyu9F/cKKs6BVkulwuh5D+eHgBqAbBML\nS03I4fmpU2IBBxDjr/2kvuvTc2cBeXtUZjA1pQr2SNF7YqW2grEx0femyTSV7TZj4basUjEmJrPn\nzqXr3NxEx7e9muCTk+K8Od/cyAhx34jXabWAxx9Xv0tzQtiNEUy1Zw/wB38gHt9yi/p5GgyWrWII\neyEjjCaECTICEOYoDaFMTop5QtoSwmEM4ScWnsDTS6LDOjh5EIwxbY6wUF2w+qhGQ9zbEZQ2iUyy\nr3LbyXLkiOiD63Xge9+L7331IxMJ4TQjIwC12DJICLtrkBDewmozd4MU6EZGXHaZGuhR81eKJobD\nJi214i/DIhpsnCHcpIloL3NUjHi8EsKuyIiACeFcJqcm6FEmhBnFZARLgbezzobwDXtvsH52ZCbA\nSJeIfi5rjTWrMzVdVE6YDCGREVEkhEkiOlNKkCFcUmbZVkBGRJsQriU2GWq13RPCbulRuoDlJokT\nAOCfEI4RGdFoei9WUpOfbrt0Sgj3K3ovZ4tr1uTFZEI4zEJWmISwLCgHBMcZ6QlhMYFtt82eLwA0\nW4rpbyohLPnBgE9CmPaFTFw/cZiI9l079Hseyqt2amFFtV+3367+/upr2vjrM78AvOVaYPIxnFjS\nkRFOCwT0nlhrrFkTvjQyhMfH3ZE1QSTPbSMZwiMj3SZ+UGWzahEjTcgIk4ZwqwXccIM4z+9+V4zJ\nqVGZJoOSc5UQPv/88IsbaduCvdUM4TiREYA+9wDUwlAShvD82jxW66uOyANaUM4PGXH78dutRdmD\nUwcBwBUZYX8fG0F+hnCxCDz3ueLxww+nq591U6+G8OioauvT0F4FSQgPDGF3DQzhLSwesqicxg9e\nPNb1fMoUHiuEREaUiFk1JAxh08gImhAOkpalCeGhIbF622g18NE7P4qlse+qP6TIiICTboCkqFOU\nEB4eBupt9WaomfbO578Tn3jNJ/Ctn/tWaHORIiaOrxy32EuVitkBQbMJT2QENbXiSggXy+LzjDMh\nbHXwJCEcCzIiLQlhQwxh7W8TTAg3iSFczLlf0/S8vRLCUpohXBfGURqQEfR8fdOjRG4J4X5EF7Pq\n7aplwJhPCLsv0PZsCDfD716hn+G2HWoCS3fLmFCTpN5NJYQlPxgAdo3ucn2e9noxFpaj/GBAN6aH\nC6qvXVxVzsNtt6m/HznyLfzLUx8D9twGPP9DOFshLmBluqugHKB/72uNNWvCv7wsDKw0yG+SHVRp\nNLvdJE2fXnERUhIbkWRC2AsZ0e+45557hBG8vCzwChQXAaTLoHzqKVUMLgg/2K6BIZyMvJAR9Fru\n3xDuTghTybagVoveuKL37yOL92H3/92N3f93Nxbqp1HsDPWcDGEtIVxRF6lceKy11AVyaPIQAG9D\neCNdT/W6as+8+iq5k5pzgWZLu
3o1hBlTY9A0tFdBuPaNhv68gZQGhvAWFkVGOKZHbQnh665Tv3NK\nCFOTODQyIoaEcKMZzhylCeE9e0Tj99Z/eyve9u9vwwee/TFgrDNDzodPCAPEfCxFWVQuHEN41y5g\nramq0jOyl6eUK+HNz3szrtp1Vej3sn+b2uf6xMITWmdqMiXslxCm33tcDOFCWVwfiSSEY0BG6Anh\nxXQYwoYSwpoJmiRDuE0XOforKkdVaXQjI9wSek6vE1lC2Gfxzs3kjyIhbDfUJKP19Glzhprf4l02\nk0WGdb6Yzmefz6sJ5NL6ErjDm6GGf2CcUUFFhMam1Rds2hD24mIDvRnCx1eCJYS1z5fFV1iu0YC2\na4e+j5GS6jPcDOGVye+of8zeg4U6ZQjPhEoIcy4WZJMW56rd7hWdICXN7mo13ckzzs0Zwts7tTEr\nlXjHGHElhOn48D//s9sQThMyIiw/2K60bcHeKoZwksgIKjovMo3Tc31PAL7x9M2otWpYri3jq49/\ntWunhURGlErAvn3q706uiIuUgeE5O7pXQNwSwhux8Bqgt0VehvCVV6rHGwEbQcfxYQxhQLVZ584l\n3zYEXaQcpISdNTCEt6g4h1VQBfBLCNeRyQDXX69+55gQ7scQHiKGcMfEMm4I+6TOLJOhM/mcm1Or\n/Xv3At968lv4xD2fAACst9eAg18EAMzuCZ/CAmwJ4So3ntbx24YMdBvCO3cqEyHMufjpgm0XWI+P\nLR6LbODjlxBmjFnnLFeyNUM4goRwbjj+hLAyhKNHRmgM4dJCSorKmWEI29O3tZpoO++4w7xB5iXP\nhHAuXFE5qn6REUklhN2+00gSwmTXR7VZtVKY1ao5Q63VgqtRKGVvq3fsEOb9x+/6OCb/eBKv+PQr\nukzhhSrhKwc0yynffnibSgibLizX5OEM4Xxe5+c6SUNGeDCEtcXRJBPC5H2MEOdhqSLGFO22aGsA\nsVj7gyWyM2nmB1hp68iIAwe6X5P2RZVGRZvwpSFJS81bUwlhIB3n5ia6M8pUQhiINyUcV1G5VbLL\nfm4O+Nd/1X+ftAlBRdvICy8M//cbISG8EQuB+SloUblex7ZOReUAfXciAKvgJxD9Yh29fxtcndh8\ndb7LEJbX9b59emBAIiOmy9M4b/y8rtc4NOWcEKYLf1Eb3yZF32uQhDAA3HlndO/HlHpNCAN6m2Ua\nKxZWce1a2awaGMJbVO02PIvYAGpClsk38Nd/rRsMTgnhpxafsh73kxBm5YQTwh0z6Qlyirv2NPCO\nL71De37+kpsBAEcuC89pBMhnxDh4rmJ8oGU3GZxS4HrhQC4SwhEYwueNnwcGkTa2J4RNDgjCYDKc\nkBGaudmH6HWQH4o/IbzlkRFRJIQ7yIjPfQ645hqRVjSNtXFTi9Oich4J4U7bVSoFMxp0Qzg8MiIq\n06XpwUwGPJARUTCESdq/2qhaCWHA3AC4F7yP3Kr3iXs+gTZv4+bHbraKuUjdc/oe6/HhKZ94bUc0\nITw0Fl1CuBUGC5Kt48AB/6J2PSWEMzEnhF3GXaPEeVheE+3Xww+r93X1NS3c+uyt6mBDC2hOqu8X\nlWlce233a9JFEpoQBtKRzlpQaxZGDeE0nJub6NjWpCE8N9ffscIorqJydmPsq1/V/50mQ5hihGg/\nEVRTU6r/TashnKbP25SCJoR7vZatz9HGELYnhOM0hOn9W2+rL3V+TTeEV1fVogxFVHHOLUN4dmQW\nu0a6EU0HJ50TwrSdp+1/2kXnquMe2aEjR2BhNzZCQpj2lWG55/SaSLrNClJUDhgkhN00MIS3qPyY\nhYBKV7ZZAzfeqCePJC+YTmjotuawW++pWTU0FRFDmE5AvUyGzoSbGsIn9v0ZHjj7gP78w1/Hn/9F\nHS/7H4QhHAIZoZnmERSW86tcD9gn3Y3IEsKFbAF7x8WKwrGFY1pnahIZIVLR7sgI+V6AaJERWkK
4\npBLCcTEbHZERQ9EgI7RUdVqKykXBEO4c88tfFv9cW4uPD6alKb0M4U7btXevQNw0Wg186dEvaclJ\nqjAJYafXiS4h7FNUzg0ZEUFC2I6MoJxWUxzhrv7Yc/FON4QpN/ehuYe0v7nrpLpAL995eaD3QhPC\n+RFlCEeZEA5iCPsVlAN0Q9iLIax9vilkCK90omgUF3H+1Q9guWa74aYeE/+vl4HmEJ7//O7X1BLC\n9UrqtusGTV0F0VY3hJNKCLvxGgGzCWEnpQkZQT9/+r0EVTarjOSkzRVg6xnCUSWErb7FhoywF5VL\nwhDO54FaS53Y3Nqc1Y42m8CTT6q/oWOfhfUFa84/OzLb1d/OlGes+RTFytkTwhvVEPbqq/J54HnP\nE48fecR88XTTMpUQThpzExfGaLNqYAhvUQXZokrNtBZXz602qji5Ku78S2ecQVn9ICNKExEhI0jl\n+rzXNmS7IZxfwy259wEAMiyDK3cKQFCluYLnvuJW8GyfyAggksJydmSEJ0MYALL1yBLCALB/Quz3\nna/OozSueiDjCWEPZATgnhDOsqyxhCFdGMiW1PbfuLiG1gC0FD0yIpvJYjQvedjJMoStwWVECWEA\neJqEMOMyHZo0IexRKFGa1nI3x/u//X688tOvxHUfuw61ZvdMrlIPzhB2ep00JYRzmZxrcrgfuSEj\nAHMJ4SB4HzvSaOdOkdKhhvCDcw9qf3PXKWIIzwYzhEcKI9bjzFB0CeFm0CKvAJCt48gR/2PKz2K0\nMKoZ23Y5FZVLmiFMr7OVTiN6++3qb9m+W9wPXJnB4cPO5qJ9QSNtpint//tlCKft3Ny0GQzhuHiN\nfsZYmgzKfg1hQBksp093dnImqK1iCEddVE4Ygty3qFxihjAZG1JkBCAMTSkNabKqViycDGGZDgb0\nhPDC+sKmN4QBHRuR9sJypgzhpBexgiaEB8gIZw0M4S2qIIkkaqY1WupOe3LxSevxJdOXOBqH/SAj\nsqMiGryyYtZAC1y5PlcDwJUhPHEMNS5659dd/Dr82rW/Zv3NzY/drFVyD4OM0JKVRfOF5UJxKQEg\nW8f0bMP6nEwbwpQj3BhRvGnjDOGQCWE5KBsrjmlF9PoR/ewyRfXFxtURyQ4+OxI9MgIAxgqd0VFq\nkBFmGML6/SGOKQtNAvGZDm2PNKVbQhiAtcX86aWnu8xCICRD2MEcjywh7FNwzOk7pUamSWnIiKaO\njDCbEA7YVnc++507gZX6ilYY8MGz6jtu87ZlCO8Z24PpcjCHgiIjanzFmvgYN4RDJIQPXVzHO97R\n9RRNnHMrCe/FDwZsi6OdPjJphjBduKrU9IQwY8DJnJchPI2jR51/5VZUDkgHZ3eAjOjvWGlMCEfF\nEHZSq5WeqvH9IiMAZbC0WvEhqdy0VQzhqIvKLS1BFB9n+hbBJA1hmoqW9VQAPSEM6EUcXQ3hcrch\nLPnBwNZjCAO6IZx2bITsKxnTr8Eg2ijIiEFC2F8DQ3iLKgizkJpp0jwD9OJx+yf2OxpN/SSEM2U1\nCjK5eqhXrvdJy2aayhAuqf0ee0b34CUXvsT6982P36yZKulDRgTchgwAmQYmd6hRSFQJYQBYL0Vj\nCPeUEO5s2zKFiwBshnBBfbFxdURyop8hhjDdtmVaE/LYpQWsVWPiYjjIiSFsDBmRZEKYJmZtixxO\nhvCePeKfdEvi/Wfu7zquZgjXvRnCTuZ4Uglhp+80Cn4w4I2MiIoh7Lmbo/Mdz87quAhATwgfWzhm\nIQaCpoMBHRmxUluxrqXjx80ib1ohDOF3/8+6NvFw0nJt2TLHvfjBXa8Xd0LYZdxFDeG1+jrW1oB7\n7xX/vuQS4I7ToqBcMVsEsw/d15z5wUB3Ubm0ISNMpCqlBoZwf8cKI6+icrmcMomjTggD6TEp5edf\nLAIjPa5Ppilxt9UM4SiQEa1WZ5yU776Q04KMWG86F5UDzCS
Eh/PDVl+3WRjCfobwlVeqxxvFEB4b\nE6ZwGKUJGTEoKtefBobwFpU9keSHE6B8YFpQ7oJtF2hmrlRYQ5jyTdslZQibXCEPtQ05W1fspJLq\nBSZKE5gpz1jYiLtP3Y2/v/fvrd+nDxkRLiE8Ma3ehGmDhSaEl7PqGjKfEBbfMwNDhnU3cdJMqrfq\n4JxbyAgtsd2ntKR4If6EsDTq2JAYbY0URhzNcVOy7t9sU8cQxCx7QjiXyTm2bUGlJVE7Jij9DuMy\nHVpQ7a/9PnZK7sqEMC2YaGegA9DSpb4JYc0cj5ghTM3CgMiIKPjBgA0Z0YgyIRy+qJzdEKYM4V5w\nEYCeEF5trGJ3x1ut1cz2x2GKytEFaTcF5QcDdoZw3Alh50KvdDG5nani1luVUfGco6escdc1u6/B\nnuED+oG9EsIpLypH76GBIRxeaUBGeG2173fM45cQBtJjUsrPf3o6vLEilabE3VYxhL2QEf0WlbP6\nlUL3uDg1yAiPhHCvhjBNCDPGrJTwVmAIA2IRV147G8kQDqs0LWDFxbXfrBoYwltUQZiFbsgIWVAO\n6BjCw/0bwoVswZqItvJqtGySI6wlhP1SZ7maxe8a26F6AZkifdmBl1k/kxPRfeP7XJnKToo9Iexj\n+k/O1DE1q0bvxhPC21RC+BxX15BJ4L44Z3GtZuFsgNKE8Hpz3VrsiCohjLz6TOPoiDhXHXy7KG6g\nKHERADA1rEZ4K83k9oDJAVu2KBIP/fCDAeeEMFVshrC2mOWfELYMYZoQPuuTEPZjCBPjOVuIFhmh\nna/fQlZHUSWEqVEXaUI4KN7HwxA+u3YW82vCtaUF5a7YeUXg9+KWEAbMYiPCJISDGML0s9joCWHk\n1jV+8NCh71qPr9t7HS6dfo523HxjxpWxbC8qlzZkhIlt9lJb0RDevl09TgsyAlCGcFQJYfq5paGw\nXLsNzM2Jx/0sbKTJYNkKhjDnyhCOAhlhzW+cEsLrKUwIr81jdFRtBaLICDr2sRvCo8VRDdt1cEol\nhAFsOUM4lwOe+1zx+PHH033fyHFAL4bwzIy8MBZvAAAgAElEQVRa/Eq6vRogI/rTwBDeogpbVI4m\nhE+sqonXnrE9RhLCgOII17IqhmTSEA6bEJaa2as6bdmpUUMYAK7dcy1u/YVbQ6XUok4I279jP2TE\n332yjlYmOkOYJoTPNiJMCHeQEVnmbwhTw8xoQpiiQ3Kq94kjIVytymIkHK1CPIbw5LAaHVWayY3w\n5LWUyYvRVz+4CMCeEO6eHcVmCDOPhHC2+z06JYT9DWFvZEQuk7MS99IQjgwZQczCvIPbkM1ku9qz\nqBLC2UzWajPsDOE4i8pZbXWmBbCWoyEMKGzE90993/rZ5TuDJ4TppG6lvmIlhAGBjTClMAzhQAnh\nZfXm/BjC2ut1+siki8rZDeE771T/XBxT/ODr916PK/fqC8/7tk+73rcaQ7iZ7oTwwBAOrzQYwvbJ\nN2DOEKYJ4cs7zVgmox4D6TBbFhfVZ9LPdTwwhOMVLdwXRVE5yxAudEfd05AQzuX0onK1Vg2lMfXi\ndIzjlRAG9IXYA5P6LhaJrFuqLWF8Qn3om5UhDOgLQ2ntj1otda31Ygjn86oPSjMywquoXLUxcIiB\ngSG8ZdVPUbmFqjJ8poamusymUq7UU7V3eZwqzgEQK5RJIiOkJmZ1ZAQgDGCZuvq5y34O3/zZb2Ln\nqA/k0Ca7IWzaXAllMgA47/y6ZhCZNoR3lHdYRumJ9QgZwplghjAgtkdJRZUQ5rl4E8LWdZRfA+98\nFlHygwFgkiBfKu1kDOF2mwy+cxEkhLPJJYRpUTlvhrBCRjRaDa3g5RMLT3QNfHRkhLg33Ywl+los\nHy0ygiaECw5tNdBdWC6qhDCgFniqjSqGh1VqyNSCZVje++hEAzt2uBjCncJyMiE8OTSJvWN7A7+X\nQrZgvdZGSghTZIRfQlh
HRoj3kUhROeZcVA65dW2b6SncbT2+ds+1uGynnhC+6Dz3SKKdgZ1mhvDA\nEA6vYlGd99yc93NNKigywmRC+MMfBl7zGvH/vaRJS0NC2BQLO01Mzq1gCNOFjSgSwtb8JmXICHne\ndmQEALDh7ok3Y+K6/reH/w2/983fswoWA8oQftORNwEAfuLin+iaP8q5c5u3kRtW5vhmTQgDG6M/\nouMeOjYII9lmnTplts5EWPWaELazvLeqnGdaA216Balq7lZU7lxVjGYZGMZL410J4V7SwYAqLNdG\nSxRyW59IRUJ4ZGoJ6PypTJHmMjnc9pbbcGr1FPaM7UEv0gzI0pLxKu5BONH27zhKQ5gxhv3b9uOB\nsw/gmZVjEKY/iy4h7IOMAIAzFRVPmij2WeKciHJH25l4E8LWwGNI3TxRJ4Sp4VxFMiO8lRUyGOng\nHeymYVg58XmpYksIc/eEML2ey+N1/NK7xWB1bk0f5LR5Gw/PP4zLZi+zfibvd9YqgnPRPngZwsVs\nEevNdbCc4im3Wt5/04taPgxh+V5oexVVQhgQbeFSbcl6vW3bxGTdVNsViiEM4CN/U0c+X3JNCJ9c\nOYnTFRHtuXz2crCQQMvRwijmq/NYqa9gD9n5aTIhbNwQXg7OEKavly8KQnd8yAjncZfGnc9X8dRT\nnefkgCpfsJ6/fXh7F5rqyovcnVQvZEQaJqkyIZzLBZtkeylt5+Ymk4YwIIya5eXNnxC+4grg858X\nj9/+dvXzNJiUURjCg4Rw9PJDn0SKjKgtgXNu9c9pQEYAAB+aA7BP+9n0NHDHyVvx6n98tfbzQrZg\nmb3ve9H78NYr3+q4ICufA4j+rFQaw/r6xjWExwPkhzZCf0TfVy8JYQCY6lhA9bpo74fNWgeB5XUv\nexWVa/EWBhokhLeswvJlKTJCGsITpQlkWKaLIdyzIUyP0zGzzBrC6qb3rVxP0oDF8e6EMCAmZ72a\nwUB3QlhOAE0prMkQtSEMAPsnBEd4vbmO4pQY7UZVVC7LnI0kes5nK2oUbzIhTNNe7Zya0cSR+LBW\nfOM0hElCuIZk9oDR64hnNllCGO73MTWtb3xrDR/8oHhsT58A3YXl5P2ea6uZiBtDWHstYo5HkazU\nkBEO7RbQzRGONCHcMetk4loy8ExNZoIs0NLv+cdeJmbqTobwQ3MPaQXlwvCDpSRHeKWmIyNMLlq2\nTTOECcrKDxlBxzvDowkmhDPuCWGpCy8Eluuq8CljDAcmDyDbVs+/4fLgCeG0MoT7KcQltREm4IDa\n+VYq6RPVXiUNyIUFPSUVpfwSwtIUaDb1iXpYSWMsk9HNuSLpmtNgUm4VQ9jpZxtZLeIF9VpUrtUC\n3vUu4Bd+obtN9UJGyDomUtQQjjo8ohWVa+o3UKvYnRDesQP43onu6mivOvQqbcF5z9gexwVoOnem\nHOGNiIwol50XweyipnFa+yMThjA9T5M1gcIqKDKC3seNVkNuSN/yGhjCW1RBzEKNIUyREetiFixN\nJrvZ1G9CGAAwJDokk4aw3zZkt4RwZtjZEO5XURvCzSY0XqHkf1LFbQhTjvDwHsERjgoZkQuAjDi7\nRgxhgwzhDMtY10ojp1wjWqAhKlkD0qLq6U2em5PofVHLJLPkT6+jdiYChnCSCWGo9terqFyDGGf2\ngiUAcP8ZnSNsGcJQ93oQZARtH6MwlWhb7dQ3AckhIwCVZKxUzBgwYRfv5AROGsKTQ5NWUdYH5x7E\n908SfvBscH6wlDzWan01MmSEKYZwmwseoUwIZ1jG2sLqJvp65ZGYi8oFZAhLHTyo7mW5aJnNZHHR\n9kvUc3a5O1DZTNZqB+2GcNKTVM6VIdwvLgLYOIawyXMGdAPSJGbNS0GLygH9pYSlIVwu6wsG1BBO\nAzLCFAt7ZESZF2k0hNNgvpuUCWTEv/wL8Kd/CnzsY8BnPqP/ThnCzpFful09qYSwHRlRy
3azZ2Zn\n9R2Vf/gjf4jv3PgdfOb1n+l6rpPcDOGNlBCW32XQnSy0P0rSKPWSCUOYfh5JGvy9ICNW6ilYFU+J\nNqwhzBi7kjH2u4yxrzDGnmGMrTPGVhhjDzPGPsYYuz7k8V7OGPscOdYznX+/zP+vrWNkGWO/xBj7\nNmPsDGNsjTH2GGPsrxhjl/gfIT4FKTimMYQ7CeE2b1sMYWkE25ERvZpPmrHcYRiZHNw2ekBGZLNA\nK08KjxlMkcaTEBbfcZC0bJwJYQAozgqOsMmOkiIjcpkAhnBECWEA2D4sSPsVri7iRx4x+hKOsjr4\nfLTfJRVFRtSzCRvCrI12pwhbvwlhJz4vVRoSwm7GmRMXy15YrlIXsw55fRSL0BKhdkljiWfUZxHF\nZ+CHEwAcEsIRIyMAYahxzrUq2SYGwGF37NRbdXDOLUN49+huXDx9MQDgqcWn8PG7P249N0xBOSmZ\nEK42qxgdb1qDaZPICBMJ4f93z//D5B9N4uWfejmOLYr+ZEd5h+s1I0XHO8Mjoo9MmiHsVoj04CFu\npf3p2OpNz30NAFG8x2+nkrw3Ko0KhobULoCkTdPlZTWJM2GOjqh6iImfm5taLcX6NWUI79ihHsdl\nIgZFRgD9GcISGUG/W0A36tJgUppKCDOmMzmT1FYzhHstKved76jH9j7SCRlB+za6kysuQ7jdVni1\nXK4bGUELu0vNzuoBmpdc+BLccN4NjmMVJ9kNYdOL6nFIjvV6MYTT2h9tFUPYrajcci2lX0wC2pCG\nMGPs2wDuAPA+AD8KYBeAPIBhAAcA/ByA7zDGPsGYS0xQHYsxxv4GwBcBvIYca1fn319ijH0kwHua\nAnArgL8EcD2AKQBFAPsB/CKAOxljvxD6ZCMSnYAynnHc4uGUEF5aXwLv5OvlNnFjyIihaJERLa1y\nfXcn5pQG3L8fWK6rFq7Xc3OSTGGJF1/Gk08aOzQA/Tt2MvyBBAzhbcoQzm4XCeHlZX3bVj8KmxCm\nK96mU7Tyel6uLyJXEN9DrAnhvJqFaXzKCESREY3sQiKFBayBCMW99MsQzqYjIUwNYXtROfoeadLD\nKSHshozYPT2MD34Q+OIXvZmW8vNsZ6JFRgQxhO3p7ziQERwc9Vbd+ACYom4YWKDdHAvrC9b3vWt0\nFy7afpH1Hp9YEG3r0T1HcXjqcOj3M1JQ7kulsWotEhgtKuexyAH4G8J3nrgTb/nCW7BUW8LNj91s\nFQj1w0XYX2+oLN5HvR692VGvwxUN4pYQ3ndgzWLc0UXL337Bb+O/bvwv3PHWO3wn5XRBgzE18Usa\nGUFTlf2YaFKZjCqMk9YJ+LlzwpABzBnCuwgy2+SijZeCFpUDzCWEqTYrMgJQhvDCQrLpZyejLg2f\ntUmFQUa4fRe3364e29sdp4QwZdwnYQjbjTM7MmINzglhagjPlMM1XjQ0QhPCwMZICTca6jvZTIYw\nHQNsdGREUIYw7Y8GhrDShjSEAeyEoH4cB/AhAK8HcA2AowDeBeDZzu9/BsDHXY4h9YcAfr7z/DsB\n/FTnWD8F4Pudn7+FMfZ+twMwxjIA/gXAVZ3n/zOAlwN4PoD/D8BpCHP4rxhjLw19thFITEBFT8hc\nagvShLCckElcBOCeEDbBEGZl88iIXorKHTqkjJWRwohv8iiM8tm8Ml2LS1haiiAty9KVEKbICD5+\nzHpsqrMMkhCmRpKGjDCcEKbX876LxIX82GNqMhiVlCGcTEIYpcVEVvwtY46YKX0zhHP+DOE4zG+Z\neAb6Swg/vvC4lQZptBrWzo/RUhnvfjfw4hd7vw/5Wi1EnBD2MQsBB2REhAlhmt6sNqvGJzN0x07G\npT8uZPTvmfKDd43uwsXbL9aeP1Ycw9+/7u9DF5QD9MXKldqKhY1YXjZnInql3gFvQ3iltoKf/Oef\n1GobSPkVlLO/3nBZOQJRG6S9MIR3nq/uY5qyymayu
P686wNhrGT7L3cEyIlf0pNUU9vsqdJybm6K\n4pzpro4T3VjxSBQmIUwTWc0mcOedwY1Ft4Rw2pARJg1hmviOs1CgXVstIdwLMqJeB76vCE2BGMK0\n6BpduKf3TJyGsD0hXGk7M4RpgGa6HO4id0NGABuDI0zn5kEN4a3CEN4ICWG3/mhgCCttVEP4QQBv\nBHAe5/xdnPPPc87v5Jzfzjn/EIDLAMgs3k8xxm5wOghj7CCAd0OYuHcAuIFz/pnOsT4D4AUQJjED\n8BuMsQtd3s/PQaSCOYC/4Jy/kXP+Vc759zjnfwHgBgDLEJ/3n3cM5ERF06MZ+KdH5cRLFpQDgMmS\nWYYwPc7QZAQMYZI6KzgsBTsZwocPqxVck/xgKeuz6vBeTWIjUpkQJsiIRvkJ67GpTkQ/5wBF5SJi\nCAP6Qsnew+J6rlajn7BZHTzZdqxtR45A2r1RWkhkgqYMYZIQ7pch7JMQbjTimSBpCWEbQ5gao3UX\nhrA0m9q8jYfnHgaAnu51+XkIg1o44VEjI4K0XUC0CWF7YS7TA+Cw7VYQQ/ijr/qotgAXRhIZAQjG\nGuUIm0og9mMI//KXfhmPnXsMAHDVrqtwdM9R63cHJw/6vrZmxA6r9xH1pM1uCAdJCG/fQ5BVPfZR\n8t6Q93xaUrTU7BoYwr1rIyWE3/Uu4KqrgFe8IthrSFPSnhBOGzLC5PdK/54eN25tBUPYLyFMFx6c\nku733qt/JvZ2xxofEGQE3cVCE8KZjNraHmtC2MYQXmrMdRUXnp1ViL2RwkjosIWXIbwREsJ0nDdg\nCOvaCIawW1G5gSGslLgx2Ys456/mnP8z587ZLM75OQijV+r1Lof6dcCK47yTc661ipzzKoB3dv6Z\nA/BrLseRr7UA4H86vJ/HAXwAwlg+AOC1LseJTTSRxLhLQtgBGUENYblNnG4XB8wgIwoT5hnCWlG5\ngIbwoUMb3RDuJIQDcDjrrbqVIAKiMYRHi6PWOdcLp62fm+pEGk1uJYTtW+ul4mII0+t5x/nxcYST\nRkZgaKGvLaK9KvqEsDPHNA7jgQdMCNOtfzQhfPWuq63HkiPckyHs8HlEkar048sCDsiIKBPC5P6p\nNiJKCIdYoHUyhK/dc61l/L3l8rfgTZe+qef3Y08I0wSiKcNJQzg5tNVuhvD9Z+7HJ+/9pPU+/+n1\n/4Rv/uw38Xs/9Ht405E34deudRumKenICOUIxGIIu9Ru0NroTttdKgGl8f4NYXl/N9oNNFoNa+K3\ntqYn5OJWlAnhlZXod+P0otNq2KMlQftR0glhu3kEOBvClQrwN38jHn/jG/7XHjXFNkpCuFBQCy69\nit4Lg4RwtPJLCOdyyih2us4oLgIIhozQEsK2nVxy4SMuQziX60ZGzFfnu8xBioyYHg4fgae+wHJt\nWTMRt4IhnNYFSvq+em230oKM6KWo3MAQVtqQhnBA/Sd57JbsfTVEzOkhzvkdTk/gnN8G4GEIM/fH\n7b/vpIwv7hznnzjnbkOTvyOP02EI+0xAnYrKyYJygEr05jI5zSw1gYzIjQrjeWXFHHC+6ZMQ1gyG\nzvbw/QfqqDZF62E6QQrYDWFulCNMuZS9JISjStzJz7GZVQ2xqU6k2aJ8Rn9D+NllBcXUsAcGRK/n\nbbuUIRw1R9gy6GJMCBeyBWTbHVOxtJisIWyQIZzL5BTP1QEZAUQ/0Gu34ZoqBDyQESQhfN3e66zH\n958RhjCdiFBmrJe0RbNOYjopZEScCWF6/9gTwsYN4R4TwjtGduA7N34HN73hJnzkf/iWPfAUNYRX\n66vGt3dyDoD1lhD+3onvWY9/64bfwgXbLkAxV8T7XvQ+/OPr/9G3wBqg94elIfU+4kZG+CWEDx4E\nVur9F7WliyVrjTVtorq66vAHMck0QxhAas7NTZslISyNtHxeFEKzy2kC/uUv65Nxv7aTmmJpTwhL\n43Z62vnzCKNBQ
jg++RWVA9S17GQI33ab/u+wyAiaEAbiMYS1c863u9BLc2tzXYbw9pmmFQgLyw8G\n9D6o0qhsuYTwRjCEN3pC2OteHhSV89dmNoTpbLGrZBVjbD9E4TgA+JbPseTvdzPG9tl+d4PD87rE\nOT8N4BEIY/l6n9eLXLoh3FtCmCIe6ONejVOaqGTDykAz1Vn0goxw4/eZkmUIZ9pAfs18QthiCKcD\nGQGoc25kVENs6juuNdXAJkhCeKUuRm/jxXHMjsyaeRMd0et5ZCY+Q9jq4GNkCANAsd0Z4ZVSlBDO\n9pcQBshCkQMyAoh+oCcWdtyva9eicsTwff7u51uPH1sQW+2fXnra+lkQE83+WlEmhAMVlYuRIUzv\nHztD2BwyIhzv3W4IA8DlOy/H6y95vWNRujCiCwQr9RXjCRA6/gDCGcIy4Q4AV+y8oqfX14zY4ZgT\nwiGKyh08qN/H/SaEgW5DOMmJapQJYSCdk/Aoznl6Wk2A40oIy6CGn4kGKBP4s5/Vn+OHhKOGvldC\nOGmTknPdEO5X9BhpMYSlAZ/0Z21aTsiIf7jvH3DDx27Alx/9MgB17kEMYbeEcLbkj4wA4kdGZArd\nX+j8WndCODemCs2F5QcD+oK93RDeCAxhOkcdD9gNp70vAjaXIeyVEC4W1ULdICHsrM1sCP8wefyg\nw+8vIY8f8jkW/f3Ftt/1cpy9jLFoI3s+ohNQV0PYoaichowgiUpqfvWaEB4vjVsT2XZRGWimsBGt\ntur5gxSVK5eBoW0xGcIAUFoyaghrCeEAKbtGu4G1ZvQmokw51VGx3p8pVrRcuACCJYSlLp25tKfi\nS16iCWGJQAE2JzICAIq8c38kjowwlxDWjpFQQpjex0C3eZbNZK12kxpndKLx3B3PBYO4vo8tiGKO\nTy2qxmbfuH2d01lORfaiOH8/vizggIyIKSEcBTKCfscZl8U7+tnXWjVHQ9iUNIZwLb2G8JHpIz29\nPmUIFxNMCNP3kc/klZHf2d1Bi9oCvSeEaV9eaVS0raFJbvGMkiEMpHMSHoUhnMkAO3eKx0kkhJ1k\nZzZWq8AXv6g/x29s75UQThMyYnFRfR4mvtM0JoRlm7HZDGEnZMRv/Mdv4JZnbsHvfvN3AbgbwouL\nwMMP6z9zNYSHXJAR687IiLW16AoWU+Ms62AI2xPCuRzQLKjGuhdkhJYQrm+8hPD9atih1VTw0lYp\nKrcRkBGMqft4kBB21qY0hJlwdn6T/OgzDk+jt/SzDr+neoY83mvgOMz2d7FL36IavKjcwno3MgLQ\nza9eDeEMy1gJmFZetSqmzEJqMjgiI3J6AalDh4ClmjJVokBGaMcsLkfHEE5hQli8ATEDN8VJq1ND\n2MH0B9wNYdPaPrzdetzMz1ud0WZERgDAEOuM8PJVLFXinzVEwRAGnBPC1FCJeqBnN8/sReUA9R41\nZARJFk6Xp61UypOLTwIAnloihvBEMEPYCRmRFEO4CxkRU0I4bUXlGBh2lA3BSDvSGMIRJIT9FjkA\nPQlPr+sHzj5gvcegyXa76OtRQziWhDBF2pBFDcaYaq8MJ4TpYslaYy01E9WokRFpnIRHYQgDiiN8\n9qzzVn/TCpsQ/upXuxEe/SSE04SMoONXE9dxGhnC8vNP+rM2Ladt5nNrIg17avUUAHdD+A4H0KQb\nMiLTSQgzMG034mLNGRnBuXMROxOixhkrdK+mVJtVlCfUXHDHDmB+XV2IPSEjSB+0Wl/dcAzhW29V\nj48edX8eVamkrqm0FpWj1+tmTggDapFykBB21qY0hAG8C8A1EFzff+ac3+XwHIrP9iON0c0bdtii\nqePEKooTMIGMMJEQBlSSsZ1Vd6wpQ9hvG7I9IUwLygExJISLy0YZwtRkCGKqJGIIl0QvaWrQ22x5\nFyoCnA3h58w8x8wbIKL3xLnqPA4cEI+feELfpmZaVqom5oRwOaOW/M8sxT8qkAO
RXNHZcOlV1vVC\njJyDB9XvY0kIZ92LygHqPWpF5TrJEwaGkcIIzp84H4AoDFKpVzRDWP7OT07IiMQSwnZkRJQJYVpU\nrhlRUblOf5wLaQjPlGccFwn6kVdC2MT3HSQhnM1krYVMaQiv1letBY1Lpi/peVcHXSAtlFRjHEtC\nmCxY2a9huyEcRULYbggnOVGV5mip1G349aqNZAhv3+7+vLCiHOGTJ80d103SSAtqCNtxEcDmSQhH\naQinISGczytTNI7FhjhFx+LZLMA5t9Bbq3UxpZfnbjdo7bgIQG9zmk21qME6DOHh/LBWhNktIQxE\nh42gJjhzQaEVya7G2VngTEVdiH0nhDcYQ7jVAv77v8XjnTuBfcHyE2BM9Udp7IsA80Xl5DzsmWeA\n3/5t4L/+q/f3FlZ+PHDZJw0MYWdtOkOYMfZCAB/o/PM0gHe4PJVGx/y6ONpi2p0V6zic836OE6t6\nLSqnISNIp3Zo6hAAYT4ENRecJCcuLaaMybk5t2eHEzWEnRKzdMKdLdTx8z9vZjLmJbshfPasvp2h\nHzWbUCZDJnxCOCoT0Z6KBqJJCBdcTBInjEAUCWGamp+vzuOQuEVQrwNPP+3yRwZkTY5iZghPDqv2\n4IFj8Y/w5ECkPK6aWSMJ4Vx3Qjh2QzjjvdAh72OnhPBYcQwZltHa5aeWnrKMNSAEMsKh8GYkCWE4\nc1apCpn4EsJ2ZETSCeH15jpOrgrXxzQuAog+IRzEEAa6r+uH5hSV65LpSxz/Jojo6xVKMSeEc+7t\nk1NCmC5Km0gIV+qV1CR6pNk1M9N/IS4paghHbfD3InnO27YBhe616Z61W+1Cj4Uj7IeMoIbw4iLw\nhS90P2ezMIRNJ90nJ9X9kAZDuFBQn7fJz/r++4E3vAF48YuBV74S+PVfj98ctJtIdAy1Ul8B51xL\nCFOMAzWEZzuh33pdfUa0P+EF4e6OFEa0dtytqBwQnSGsJYTzzqsp+XFlCO/YAZytEGREDwzhXCZn\n9ed2ZESSfVCtBvzO7wAf+IA7ouOBB9R3efRouL5qoxjCQ0Pubbmfcjl13crx4W/+pvhMf+Inog1A\nUQVNCA+QEc5yWdvdmGKMHQHwOYjzqgJ4A+fczU6kraDfsIw6SPZNHNZxGGMFH1PY6zixSpgMnYSw\nywTUKSFMkRGUIfzOa96JSr2CS2cuDbz92Ely4t0gH48xQ9gndUYn3L/9u3W85EeBv/1+nAlh0ZI+\n/TRw0UX9HztIQtjOiZaGcCFbcP2bfmU3wQFz33Gz7V58SyouZARNCM9X53GUmIiPPALs32/8JQEo\nQzhbrFq2WhzIiD3bJ3BbB5pz/+PJGcLDY+uQnpURhnC2myEszX0g/qJyTvelPE+tqFxnMUsuZO2f\nUBfcsYVjFkN4amgqsJlqx+oA0SQM05YQticsR0cFt7PdNpgQDoFwOrFyAs22eH4khjBJCK/WVxM1\nhKvNqjVJv/9M//xg++sV404IuyAjANJO56u49FJhlGrICEMJ4TQYwu226vdNohM2SkLY5DkDekI4\nDo5wGGTEzTer72LXLmVY+xnCXgnhtCIjTHyv2axIj589mw5kBDWE63VhnJlYwPn939eT41/6EjA1\nBbznPf0fO6jsReXWm8oaaPM21pvrGOpczJyL614u5NzV2Xs8MQFccYV4/4DoR4pFva/kWXExlwtl\n5LN5DOeHsdZYS9wQzuSdb57MiJqUzc6KnWVSvSSEATFGq7fqqUoI33QT8Id/KB4///nAj/xI93O+\n+131+Lrrwh1/oxjCveIipCYmxPUqxxP33Sf+f+aM+Hm/xw8iP0N4kBD21qYxhBlj+wF8BcA2AE0A\nb+Kc3+LxJ3To77dZjQ5F7FgI+3G8hjhexwmker2O73//+77P27lzJ3bKKhMOCpsQtheVG8oNaQnS\nbUPb8Ec/9ke+78tP8ph1XoUgfjBjAyI/LiW
dnPFMx+wgk7EoDGFazR2dFeQnnzRpCHcYwiETwlEm\nSrWUk2FkBE0Iu22jthvCO0d2amleUxrKD2EoN4Rqs4q5tTktVfroo8BLX2r8JQGoyRErxouMOLRn\n2qKoP3DiKQABQVsG1G6rwffQ6OZKCAdhCHslhOX9RhPCj557FMdXhGsQZgGP3julcg3rAE6dCvzn\ngdVTUbkoE8I2ZEQmI7bJLSyYTwgHQUbQdHcUhjDtlyIrKueDQQG6r2vJDwaAIzO9G8JaMbdicglh\nN2REsbyOr3xFmC4mxiD27brUEE4KGRQ1r+oAACAASURBVHHunGi3AXP8YCDdhnC1qhYdTBvCSSWE\ngxjCcrs1ALzxjcCf/Zl47IeMCJoQ3mzICEBcH2fPCkPFlAEbVk6GsPx5sf+1djzzTPfPfvCD/o8b\nRvaictQQBkRKuFRSF/P6uvg8lpfVwsuRI90FxLZv19vWZlZczHLherw4jrXGmta+AwkkhF2QEays\nIyNoQrgXhjAg+qGF9QVU6hWUy+LzbrWSNYQff1x/7GQIU35wWENYXhe1mvjPxH1jUiYN4ePH1XiY\nzguq1XgN4WxWtJcnT57EScJPkuONWk3wv7NZ4OQjJ4GYEsxp16YwhBljuwB8DcAuAG0AN3LO/93n\nz2gBOL/KJLSQnL0Lsx/HyxCWx+HwL0DnqLNnz+LKK6/0fd573/te/P7v/77r78NuUbUjIyguwqS0\nJGNuHWgOmTOEQySE5QTUxHZNL+mGsBgwmCospyEjAhRYi8sQpgnh0allrMAgQ5hTQzhYMarn7DDP\nD5aaGp7Cs8vPYn5tHofIy0RZWE5OjjIFgv+IISF8/f7Lgc7E71j9dnD+k7FNYlZW1Hav0ghhdBpg\nCFvHyLTE/cSzFg8aSBdDWLZb9VbdmtjIVCE1hG955ha0ebvr536i19H0riqeeURM6kxPWIMYwvQ+\nzrCMke/aTfS8ZRu5bZuYyJiYzIjvWC7e+Z/vscVj1uM4kBEjI+L75Ty+onJA93V9/1mVEDaFjMgX\n1WwgHkPYveil/HcT61biU8NW9TgGsSeEZx2Yf3ErquJqcRb8DCvTSVKqpBLCQZARNMH70pcqQ3gz\nJoRNGsL33y/MlErFHGM7jNwMYVPGlvx+pTEIOJvEUcqOjLAbwqv1VZRK6mZdXxfm1kOKXoSLLxY7\nhqTkoo/VV7IWWkwcV875JkoTOLl6MvGEMCf90fTwtJUEbhX1hPBDa4Qh3AMyAlBmeKVRAWNiDDU3\nlywygo5n3MZyMiFcKIgkeBjZFyhNLn72K85VH9krP1hKGt9ra6LNort+oyqOaJd9kfIjH/kI3ve+\n9zk+95pr4nlPG0kb3hBmjE0B+A8A+yGM1l/hnH8qwJ8+QB775THp7x/0Oc69AY7zDOe8p1tkenoa\nN998s+/zvNLBgF7Exs0QdkRGVEWLSQvKmZRmROarqTKEo08ImzWE9dRZihLCZNtreWoJKzCHjGiQ\nonJuDOEuQziCgnJSU0MdQ7g6jwMHROIdEMiIqCQNYUaKyplIyvrpmt2qh12fug2nTyuuWtSiA8pS\nOaKEMABk65gaH8IUCZTHzRD22t0gi8o5mUjU+P3Wk9+yHgflBwN6ezU5W8EzEIO9c+egfSb9Kiwy\nopwv91xgLIhoe1htiPtKbnlcXOzfEG+1AOSDJ4SPLURsCBd1QziTERObpaVkGcIyITxSGMHesb2O\nfxNEtIZArqDeR+LIiE4SvcVbaLQayGfzVoIsn8n33J6lERkRlTma5oSwlwn+7PKz+Os7/xqvOPgK\nPH/P80MfO80JYamREbEtW2ozMoRNXcvUODpzJnlDmPKuTX3e8vuVzOQzZ6Ktr+EkL2QEIHbI0MUH\naW49SJyAiy7SE5Gy3bH6SlLPQ+7WkPPJ1foqmu2m1QfGXVSO7ljZPbbbMoQndqqE8LXXAjf9wAAy\nonPulXo
FnHNs28YwN5dsQpj2f07v4+xZFeC56qrwCyFpNoSrVXX9m0gIS9kDT3EZwvZFyutecx1+\ncdcv4tWHX42dozvxq7+qitx97WtiDP/ST74Mcx89Cxiq3bSRtaENYcbYGICvArgYwgz+Tc75XwX5\nW875McbYCQA7AbzQ5+k/1Pn/cc653a6jNRRfCOAzLu91B4BDnffphbLwVKFQwBVhl6gc1EtRuWqj\nimpT3NlRGcLa1vZ8FaiaS4/6MYSpwSBZnFEjIzTuZV70/sYSwi0OZEQKMK0J4dK4GDlVKqLTcJpE\nhBFlCBdywQzhKPjBUtuHt3feVxPD21ZQLo+hUhFYkKikisqJe3UoNxSpWSY1XZ7GeOtCLGUfB3Z+\nH3ff28DLZnusUhBSdFBXGCYJYZMMYQDI1bBjx1CspoNoq73Z2NQ445zrOxs6CzB7x/YiwzJo87bG\ng+vVEB6fVrP1p582ZwhzjlBmIRAtLgLoRkYAagDcaomJbT8JC23Hjgveh16HTy+pWXPkCeGacEnH\nx5M1hCv1ipWMvmT6kr7aNPp61BBOCzICEMZEPpvXWOC9nnNXUTliWiWFjIgqIbxRDeF3feVduOmB\nm/Dh2z+Mp3/9aT0sEEBxJ4T9isoNOwwhL79ctJsyEeqHjPBKCG8FZAQ9/gUXmDluGHklhE1Ifr/l\nshg/nDkjFjOaTfeFBtPyQ0aIhLD6t7zWqCF88cW2QlV2Q7igLmTZFtN50Eptxdp1GzsygixQ7h7d\njbtP3S3ex8wcvvIVcX9ffTVw9nZxkY8URnpG0Mlzb/EW6q06tm0TF9XSktjOT1PWcckvIUxxEUd7\noOCluT+iC+AmDWGangeSM4R/6Vu/hGOLx3Bm/Aw+/8LPg+YkDx4EzjsPWP7COlwssC2nBG4/M2KM\nDQH4EoDLIUzW93POPxjyMP8KEdm7iDHmGCBnjF0LkezlAP7F/nvO+aMQqWEG4I2MMbcIx43k8edD\nvk/jajaVWRgkIVxv1V0LypkU3Zo7sV20IqYMYc69K9f7IiN6LOjiJaeEsCmzsNmi5+ufEF5vrluD\nobgM4eKY6iFNfM+NdniGcKQJYcImPledtxKzp09H9pLWYJ3nOoZwDPxgqYtGO/Gf/Dr+4z6vzRJm\npRnCQ1EmhGvYuTPeQV6ghHDnPXJwtHhLX8gqipFaPpvHnrFuOlIYhjA1XkenlCFscptnULOQGqRR\nFpQD3JERUv0mXJqtNsAE8yQIMoK2c1EYwvls3vp8V+rKEAaSM4QfmlOzjH4KygE20521rAl/LAnh\nAMgIQBkTdhZ4L7InhMdThowwmZqi55Zk8sxJXoawxKEsrC/g8w+GnyKMjakUaRwJ4TBF5aSuvBLW\nVnGgv4RwGpER+bw5Via9Puh1E6fiMoRHRoQ5AwhjMI7rV8o3IVxf0a5lN0OYLgh3ISMK6kKWYyg6\n96s0iGFMhjJrEaUW3ZARu0fVNoNz1Xm85CXAi14k/n2mIi7CXtPBgDvL3hSKqhf5JYT74QcD6TaE\n6fvpt92i/e7DD+u/S8IQbrabVoDgnlP3AND7pGoVeOzxNuos4kHfBtKGNIQZY3kIc/Y6CKP2zzjn\n7+3hUH8GWLHRD9vN3M6//7zzzyaAD7kcRxrRkwD+2OH9Xgjgtzr/fAwpMIQbpBd0q2quJYRbDQsX\nAUSYEKaG8LToDZNERtCt11EjI4pjYtBgajBUa6jvOAhPl5rfcRWVy42oz9fE99wiDOFigIQwA8PF\n0xf3/8IumhpShvB8dR47dojHCwtqsG1SnFNDOPq0t13X71P7Qf/7mdtie106qMuVzDKE6fUyuq2G\nX/5lMbDIdprNOA1hxrOOKUH6HmvNmo6MIAtZTrzgXhPC5Qk1iUnEEKbIiIgTwk7ICJPb7ml/HAQZ\nQRWFIQyo73q1LvolOeBfX++/7eqFIWyKH2x/vWa7GVslcD9khN0Q5pxrC
eFeZZ+I08lfGgxhkwlh\neqwoF157EX0/9nOWhgsAfPLeT/Z0fJkSjjoh3G4rZn9YQxhQu0n6YQgnlRCuVICbb1amX6MBPPaY\neLxrlzmW/mY3hNttZXiWy8BeQgCKExvRa0JYpiCHhoB9+5yNP4WMUBfySF70q3QsJftYIP6EMO2P\ndo8pQ3huTXH8mu2mVT+oV34w0L1ThS6qJ9UP+SWEJT8Y6C0hbC82mCaZNITpeDgpQ5hijCQ+DwBO\nrZ4C51zrk9bWgF/9jVUMpLQhDWEA/wjgxyDM4G8A+Bhj7IjHfwedDtJJ934QIt17NYBbGGNvZIxd\nyRh7IwTa4arO6/wx5/xxp+MA+ETnuQzArzDGbmKMvYQxdjVj7Fc6vxuDqGX4Ts471XwSFJ2ABi0q\nJzsEIB6G8HgnIby6ambA12bBDWGJjJAmaSFbiITDSgcFhRHROPltowuqWl2dr9s2ZDdDOMrEHU0I\nZ4ZUj2SCI6wnhP2ThQcmD0RqmGqG8Np81zZA06IDdZ5VyIi49KrLlSH8yFpShrDhhDC5Xu74fg2v\nfa2Y9MVlItGichkXypN9MYsmhOkCzP6J/V1/G6aonLaANaojI0wpbHoUiCEh7ICMMJsQDtdWW89l\n2b4SO16SHGGKjJDqN80T9jtu8RbuO32f9fO+E8JkEbzVblnprngSwu7ICNpWV5tVrDXW0OrsbDKZ\nEM5mVfuVVDIrKobw8LA6N8r1TIPcTPBWu4X5NTXw+9oTX8PxZWdXd62xhn+47x/w6Hx3ZVrJEV5d\njfZapoZSkKJyUtIQnuxMH5aXbeaUTV4J4VxObTGPMyH8+tcDL3858DM/I/79ve8p4+766829jp0h\nHLdaLWHaAtEYwtQkKpdVQhiIt7CcX1E5O0NYLog+3nEDDh8W16FTMUtrXEqREZ3FObs5av0+ZkO4\nnVHnu6O8AxkmbiraHtHHJhPCJsdQvcorIcw5cI8Il2LvXsCnNJOjqNGaVF/rpqgM4TQgI+Q4HRCP\nV+orGsboq18FvvT1lDn0CWujGsKv7fyfAXgxgPt8/vuKx7F+B8DfQpi+l0GYzXd0/n9Z5+d/wzn/\nXbcDdAze1wC4vfP8nwBwM4DbIBLG0wBqAN7GOf9q6LMlMmUWNkgvmHVLCNuKylFDODJkBJl4j02q\nG9qEeUYTwk6Tbmr8WAnhjrESRToY0DvI/LDo/VdWzKRH1+vU9A9nCMdVVI4XDCeE26SoXICE8HN2\nRIeLAHRkBE0IA9Gkl+hAvZ2NHxlx3f7LgJb4fOdLt1kpoqhFB3WZQnQM4QZXH3BchrDOe3e+pul7\nrLWCJ4RHC6Oh2jY6icmXU4SMiJoh7ICMMJkQrhNDOB/AAJeaHZl1NZD7leQI25ERQPyGMADcffpu\n63GUCeEo26ywyAhtYaePhLDdEAbU97nZEsKAKma6UQzhc9Vz4FAXHgfHp+/7tOMx/ve3/zd++nM/\njR/6ux+yxqlScXGE7Saak+yGcLkMHDokHlPevJcR5JUQBhQ2Ik5D+LbOOvcXviC+z29+U/1Obq83\noajDA36ic5AoDGFq9ieZEPZDRtgTwtWqKJol/+7izgZDaqh5IiPy3ciIuBPC9P7lGfVlDuWHLMP3\n5OpJ6+e05sRMuffG2ishnEZDeH5efYeHD/d2/AEyIhlD2H4fn1o9pfVJ//7vAIop+0IS1kY1hHkP\n/zkfSOitAF4JwRQ+DmHeHu/8++Wc87f5viHO5yEQFu8A8B0AcwCqAB4H8FEAV3DOP9bDuWryWk0P\nozo1hF0mY/aicpQhHAcyYmSbWUOY94CMkCZpVIYwHRRkSmpQ4LeVLohqjXATbrpFKC6GcDtvliHc\nDICMODJzxEruvu6i1/X/oh6yJ4SjNoStJD1rWyv/cSaEi7kiJqqXAQD41MO479F4Rnm6IRwdQ5hu\nQ4o1IZwxkxC2G8L7JvaFKlSltVfFZ
JERcSaENWREJAlhb7494GwIR4WLAFRCeL25jma7mbghLBnC\npVwJe8f3Oj4/qOjrtXjLupebzWjNpbDICK2GQR8JYW0i3uFVygWNNBjCpiuvy352eTm+yWgQuRnC\n1HCR+sQ9nwB3WJ2448QdAMQE98SKzhfbrXZ8R8phpYaSW0I4n1dYJUAUlJP/niTTB6+xLjUNnYrU\nSZMyTmSEheTiwBe/CPznf6rf/fAPm3udpJERXoawicCK3exPQ0LYCRmxUu9OCNv5wUBwZIQTQzhJ\nZETb1h/JMeKJlRPWeJfibPpKCNv6IbqonoQhbGcX29/Do2QTxkHHfeb+2iqGMP0uV20khkQSwg39\nRU+tntL6kLvuwsAQtmlDGsKc82zI/y4McMybOeev45zv5ZwPdf7/ujCJXs55m3P+Ec75CznnM5zz\nMuf8IOf87ZzzB/2P4K+2IdhEEGahvahc3MiI4XFF1DeSEA6BjKi36mjztuL39TEZ85JmvBYMG8I0\nIRxgG/J8VcXPnbaWm9JQbshKLDcyZpERTYKMcEsID+eH8dCvPIS73nYXfvo5P93/i3rInhCOepBv\nTYxICi1OhjAAXFhS2Ih//d4dsbymZmrkzDKE7elbKTmAqlbNLdQ5ifJWAxvCARPCYfjBgD6JWeer\nVtLL5AQuKF82ToawhoyIgCHcJDsbcgF471KRGsIFtQd2pbaSuCH87PKzAMQ1LLe19iraHzbbTcft\nvlGoXkdwZESjqt/HBpERgLp+19eTKcold7uVy3qBMBOSCWEgXRxh2efncvqCEjVcpO4/ez/uPnV3\n18/pOJwaSUB8CWHa37klhAE9JXzFFeoxNYS9dj1KQ2x4WOEhqJJICNPXuukm4JZbxOM9e4ALfWea\nwZVmQ9jE5203hNPAEHZLCNuLylFD+KKLxP+d+hCVECYM4c4Yyo5PkIrdEGb6jjo6Rnx6SXwRZytq\nAt4XQ7jgnhBOYmFybU3//hcXdY9FssEB4MCB3l4jzQxhOo6j128vmvDIzCXBEPZLCFcqGBjCNm1I\nQ3gri3MzpnCDbFENVFSurReV2zYUPTJiaNRwQtjHELYnAVfrq9Y2vqgSwhmWsSZrnKwim0CDBEkI\nZ1kWDN0JwSMz/TEavcQYs0yqdZhFRgRJCAPA9uHtuGz2slDpyF60fXi79XhubS4+ZERO3TtxIiMA\n4JrdyhD+zrF4OMJ0MMly8SaEgWh5ja0WLIZwNggyolkLzBAOww8G9AH9an3VSvU8+6y+9bIf9YSM\niDghnM/krUUsaagZTQi36Y6d4AzhOBLCgEhJmTSEwxaVowp7zTpJSwi3W7GleOzICL+EsClkBL1v\nnZAnSbAN5ec8HsFaOzWE04SNkMbezIxefIwaLgcmlfPwqfs+1XUML0M4iYRwUENY8oMBHRkRJCFs\n5wdLxZ0QbrX0fu7LX1aF0V70InMF5QBxX8j09WZERlCzc2RE3LPyWoozIeyHjHBiCPslhLuQETQh\nnAJkhM4Q1sfLtH89tngMQETIiBQwhO39Xrutj+U3e0L42DH1mPYdvcirH0+aIQx0G8IABoawTQND\neAPKxOC9EXKLqp0hHAcyojgSLzLCnoim2zWjMoQBNTBoZ9WgwIwh7M8QZow5Trr7ZTT6SWIjqi2z\nyIgWpwlhj5lKTNKQETEwhK2JUZ4YwjEiIwDg5c9RhvADS/EbwrRIhmmGMGU2xjXQM5kQ3j22W2sL\n+kkIr9ZXrVRPq2XOeLEbwkEM0qgNYcaYtbASCTKCGMJuDGGnazlKQ5h+12lICEudP35+fy8OvT+0\nJ4QjL8bV2aKby+S6ru0uQ5jcx/2MQYrZorXwKwsY0e8ziXSWbDP73arqpDQawpzrhjAVNVzeftXb\nrfvhpgdu6sJG0HG4LPgoRSf1cSWE3ZARgLshHDYh7MQPBpRJGVdC2Ot1TOIiAGEuS5TKVkgIZ7Pq\n+
k0qIeyEjFhtrLoawpmMMgq9kBG0AG8akBH0nNtMR0bQ0MCTi08CsCWETRWVSwFD2Knfo++DGsK9\nJoTTXFSOFn+TSfdelXRCmHN1XbsxhLuwQ4WIqwhvMA0M4Q0oE4N3rap5kKJy7QbOrcdgCJM0Y2HY\nHDKi3YbvBJQxZp1zl6kSETICUAODBjNtCPtPuAE9CQ6ISThNqUQh+XmuNNRnbAIZ0eKkqFzWY6YS\nkzRkxFqcyIjkEsI/esWFwLoYBZ3lD/k824zoIM6eeOhXWkLYARkBxGcIZ5lLQjinm9aLNWf2aC6T\n0/ir+ybCGcL2oiB0m6epVE9Qs/DIzBGrvb5q11VmXtxDcmEliqJydIE2n0JkxGp9NT2GsOGEMC0q\nB8SREBZtiBPOxjMh3McYhDFmTcajuH7DiiaxojCEo1547UWLi2rC2mUIE8Pl8NRhvHj/iwGILdu3\nH7/d+l2r3dKuCXtCeOdO9ThKIzxoQlgmgUdGdMMhLEPYzRCOGxnhxc41WVBOSl4nZ85EW+zSSXEb\nwoDiCJ87F50ZalfYonKViiqadeGF6nOhKfZuQ9gBGZHXd1tJUdMqjoRwywMZIQ1hjSHcDzLCIyFs\nApMYVk7jGDqXkMgIxoALLujtNdKcEJaG8MiIjhvqRUkbwnauvRNDeJAQ9tbAEN6AMrGSRpERgYrK\ntWzIiFI0yAjKussNmUsIC1MleCq61qrFlhCWnWQN5gzhZhPg8GcIA92T7sNThz0NZBOSCeFaq4aR\nCTG6NI2MsBvdSWi8OG4tuMSRELYG6nm1mDKci5chPDTEUKyK0VN96CnUG4ZYAh6ShkahADTaETKE\nXZAR0RvCEhnhb5zVWjXXhDCgYyPCJoSzmaxljFJkBGAu1RPULJwdmcUdb70D//ZT/4afvPQnzby4\nh2TfJAeaUSWEU1NUrhAdMqIfQ3j/tv759rQ/bPFWvAnhDjLCabFKY1U3q573cVjJ69deVA6IP7lU\nqSiDa6skhN0KygG64TJTnsEbLnmD9e+bHrjJekzHpEC3IRzXdxqkqBwAvPe9gh3853+uF5gLgoxo\ntZSREAQZEYdh6maE7tsH7I+g7Ia8TprN+Bdt7IZwgTTFJgxhWnhKGsJRLDD7ybeonA0Z8cgj6rqU\nuAhAmMnSzJV9iPzO8mVvZITctQHEj4xosXDICJMJ4e2KqGckEBRWXglhzlVC+Lzz9AWRMEqrIby+\nrpARF13UP+4maWSEHwvcMSE8MIQ1DQzhDSgjCeE2NUf9E8K0qBwD63ty4ia6vT1bMmcIB+ZSdpJ2\ndmREVOcLkIQwXweY+F76NYRrNQQywIHuSXfUuAhAGcIAMDkrRk9GCgeCGMKZ5A1hxpiVpp9fm8f4\nuBpYx4aMiDkhDABTmc7sKNvELT94NvLXk23ixISe4jWBjLCbrVJxDfRo2+WWEO5CRnRSZBmW0SYe\nAHB0z1EA4h68ePpihJUc1FNkBBB/QhgAnjf7PLzq0Ks8F7xMyY6MiK6oXAoZwoaREWliCMeeEO4g\nI5zapqgSwoAyhGVCOElkhMnq5k7aaIawZriUp/Gai15jLSR/9oHPWtgIiosAug1hurARpSEctKjc\nj/84cOedwI036j8PgoxYU2vavsgI+3uKSm5GqGlchNQ08d7i5ggnmRAG4sNG2I0kOsYDuovK3XWX\nenyxbfgk27LlZXE9yms4PxwcGZHPq0WWuA3hYrao7RqzkBG29qlX2RPCaTaE5+dVG9orPxgQ2BzZ\nRqbJEH70UbWI1i8uAhC7NQrdwzUA8RjCdoyRX1E5AAND2KaBIbwBlUhCuK0YwtuGtvVd5dtN1LzK\nFFQr0m9nEXYCSk0VIB6GsHgDYgTQ7/YZYQj7Y0GA7kn3kenoCspJUYN9YlZ8zufO6YOzXtREuhLC\ngMJGzFfnwZi+DdC0HJERMTOEAeC8MRWX+fZ9T0T+etQQlgOBLMs
aSboHKSoXeUK4U1Quw/wLrNWa\nKiE8Vhzraqv/1wv+F/7qlX+Fr7/569rCTFDJ9qrSSBYZEbfsyIhCQSWC+u2TWwEYwk4LXFsxIRxF\nUbl4E8IhkBEGE8JyMp4GZMRWNISpoTdt81TsCbyp4Sm8+AKBjXhq6SncceIOAMDCut7QrNT1izWb\nVaZwXAnhXko1BEFG2IuOOYkmN+PARtDXoO/pZS+L5vWiRox5KWlDOK6EsG9RubqeEL77bvX4iG2q\nJO+95WW9jcsOBS8qB6jPIw5DuMnV+ZZyJZRyJewcEewZOzKinC9ru3jDyp4QHh5WY6gkDGEvZISJ\ngnKASN7ShYK0yCQ/GBDn6YaNSMIQHhSVC6+BIbwBZcIQ1hPCziO6LMtahUgarYY1GI0KFwHo5lWt\nvWY1pKYTwm4GKTWE4y4qJ96AGBj0mxBeX4eVNgbCJYSPzERvCI8V1CxwbFo1yv0a4TQhnBYjSRaW\nW62vot6qW9iIs2c7bGuDUglhgozoYwDXq47sVMCtu2gp2wjUbquBFk0Im+AHAzazNWGGcC5kQtgp\nVThSGMHbrnpbz9xd2V7ZkRGmJnBBF+/ilryPmu2mleiVA+D+E8L+DOFsJqv1W/lMXitaaVppLCo3\nnB/ua8uqlFZUjsfNEPZARpDxT7VRjSQhXKlXwDlPFBkRtSFMjbS0MITpZ2yfOEvDZSg3ZJkmGjbi\nfoGN8EsIAyr5nQZkhJuCICOckAJ2UZNyfd35OSZFjdA3vhF4//uBP/gD8TgKbRVDWJrrdIE5iYSw\nY1E5G0OYGk9Hj+rHkm3Zyop+/2VKDgzhgp6WpYraEKbn3EL3jjq56Hpq9RQW1xfxxIIIdewZ29PX\n69oTwgCslHBaEsLyZ5IfDPReUE5KXhdpKipn2hAG3LERaUgIn6mcQXFIxxcOTQwMYaqBIbwBZSLN\n0WjSyZizOcoYs1KWtVbNYghHVVAO0M2raqNqJSnMMIQDICM65g9N2QHRFpWjAwPkRSdpBhnRW0I4\nDmQETTuVp9Tn3Lfxz2nSLh0J4e3Dal8ULSzXapkpHkilGMLJIiOuPqgSwo+cjTYhvLystj5t26YG\nAiZwEfbjJJMQ5hb+JeuSEO4yhDttVxSoG5o0nN3ZRqYzioibIRy3NL6rjSPcd0KY+yMjAP173jW6\nC6xf8JuHNGRExAlht90c9r7p/InzjZxzognhXpER/SaEO+MMDo5aq7apkRH5vDId05IQ9jpnWVRu\npqwcwNde9Fpr3HbTAzeBc54aQzgoMsJNo6Pq79zGQGlPCJdKwO/8DvCe98DqA02LJsk3syGcZELY\njz1qZwhLTU+LonJUsh9ptYCTJ8kvCiQh7IOMAOJNCDfQXXODcvo//+DnUW+Ji+HaPdf29bpOJjg1\nhOMunBhHQhjYGglhINmEsP0+theVa/EWahm9sxmeTNEXkgINDOENqLgSwoAy1ebW5sAhWusoDWF7\nURU5IFpY6I8RFjaRFGtCOK8GAX4SdAAAIABJREFUBsVRMwnhXhnC+UweByb7XA4NILpVfWhcNcr9\nGsLtNCIjSIrPXljO9CA/LciI6y9RCeHja9EmhKmZMTGhTNvNkhCuNenCjjf7HACWa8vW+4xiIYtO\nZOp8zapsbxQZEXB3Q5yi95F92/3amncFej9RhrBbQhjoNoSjFEVGrNZXMTqqCo8klRA2gYsA9KJy\ncTKE6w0O5MSFEhoZYSghDIiU8GZGRgAKG3HqVPxGg5PczrnN25ivigEf5XNODU/hBfteAEBgI+bW\n5rTCzoAwq+yShvDaWnRc3X4TwowpbMRGTQj3WmQqjGhCeLMxhP2KysWVEPZDRtgTwlLXX99diIve\n18+S0hk8RxjCqUNG6EXlAOD88fOtn33qvk9Zj6/be11fr6slhOu6Idxqxd8PeTGEqSFsKiFcq8Wz\ncBVE0hDOZPo/P6k0ISPs9zE
ArEJfHS6MDgxhqoEhvAFlxhD2TwgDylSjVZC3DcWDjKCGMNCfSWpP\nCLsxkJNGRoxtN2kIU050sNTZ4e2HYzFgqCFcHFOT3n63DbVSVlQOUAxhQCysUEPY9HbWtBSVO7D9\nfOvxcvaJSAdBdkPYSgg7GC69KOmEcI2MdIIgIyiPMoqEsH0iI1M9p0+bGeymNSGs7V5p6glhoL/J\nTCoNYVtRuUzGHJ+016JydKLajxIrKtdWLovTgpXdEJZjkHwm3/cCF71+1xprmxoZAShDuFqNNvUd\nVG7nfK56Dm0u2FF2HMoFE2ph9eTqye6EcMM9IWx/TZPqNyEM+BvCQRLCpk1KPyVpCKcpIdzP4qeU\nU0J4YkJ910kkhJ2QEdVmFflid3GT6xy8UXpfP/ywetzKiQa2kC1YfVoxW7TmoW6GcLNp5rO2SzOE\nPZARAPCNY9+wHl+/9/q+XtcpIUzn+HFjI7wSwhIZwRhwwQXdzwsj2ianoS9qt5UhfMEF5tqyNCEj\n7AxhAFhq6YZwptTpIKPbaLehNDCEN6CMICNawSbc0lSjE9aLpgztL3CQfdJiqsouNRkYz7luO5Wd\ndaPdwFxV9U5RGsK0kxyZVEXl+km12BnCQZERcRSUA/S0U27EYEKYpS8hTCd5ZytnI+UbKmREsgzh\nUq6EoWbHsJo4hkceie61aHs4Ph4tQ1hunQPiM4SDtNX0PdLFuygSwrS9Wq2vaqkemorpVWk1hOnC\nikwIU0O4n4XaFicMYQ+HhS5OxJkQlsWrTG1H7zUhTLey9iPaH7Z4jMgIUsDHCRlh3yFlscBL432j\nMmg6a62xtqmREYBeWC4NHGF6XdHrjbbXNCEMALMj6iROrZ4KhIygn2dURn+/ReUAZQivrDibXkES\nwkkiI+IwhClr2TRezE9JICMYUynhp5+OJ9nvlxAGgHa2O6rrZAjT+/q229TjRk7hFmU7zhjTCvRS\n0es9ipQwvX/rbXW+cr5P+1m5M3iiNIGLpy/u63VpmMCeEAbiN4TdEsKcq4Tweef1f6/H0SaH0fHj\nYgcJYA4XAXQnhOWQJS0J4cWmbgi3cmIg4hYQ3GoafAobUCYSwi0NGRGcLwsAl++8vP834CI7p9GU\nIdxswjJIMzzYhPupxaesx3Rgblq0kxyeEKPger2/gYA9IRwUGRGXIUwTwtmhaJARaTGSqHFzfOX4\nlkBGAMBMvjOoHDmN790b0d436IONcjlihnACyIg65b0HSAhHbQhTxE2lXtEMYROpntQawrluhrCp\nbfc6+zw4QzhK2RnCgDlDmCaEGZjvjh0pU8iI5BLCJI0VAhlhYkFaQ0Y0KpveEKb9bBo4wm7nLPnB\nADAzTFaLAewc3Wk9Prly0irsLOWFjLC/pkn1i4wAdLPTaU4TNiG8GZERQYrvRaUkDGFAGcLr6/Gc\ns19CGAAaGf0+KxSAK6/sPha9r6khXOXOBdlpgV4q+nmsrcG4nJARpVzJMqud+tmje472bZwVsgWr\n77UzhIH0GMLz82qM0y8/GIhvrhBUQfnBf3nHX2Lm/8zgj2/540DHpePhyUlguDPkiKNt9mMIA8C5\nuj4QqEF8GWwQEQYwMIQ3pEwM3jVkhMcWVaeU5eWz0RnC+Uze6nTsyAhjCWEEM0efXHwSgCj0EWXa\nlBrCQ+NqYNBPIsDOEA6KjIijoBygb2XnRXNF5dpIX1E5Wpn32eVnI00IpwUZAQAXbFP7rG598MnI\nXocawsUit1K8kTCEk0BGNNXoPRsgSfnUklrI2jGyw+npfcmOjKCTVhOfg2YIc3ezMG7Zd68AJhPC\n6UNG0O9ZGk/SbKpW++OT0gXLTMD+GIiGIdxqtzTDKaqEMOdAm3Vvz6WibVa1WcVyTdxQJhZ27Anh\nYhEY6nQNm9EQpgnhVBvCBPFjTwjvHCGGsBMywqOoHBBdGs0kMgJwNv7CMoQ3Y0J4fFwVrNvMC
WHa\n/lKDMG5D2C0h3GD6fXbllXDkCtP7Wn5f+VIDlab4eztuMYghHEVCWCsq11mkpOPcvWN7u0yyfvnB\nUrIfckoIx83Jlu1juazazYUFs/xgYOMawu//9vtxdu0s3vON92hBEzfRvmd2Vo0v0pIQPl05ZbVj\no2McK41BQphq8ClsQJkuKuc1AbWbalNDU5rBZVqMMSuJZRIZQRNJXglhOgGVaYyoJ950oiaLygH9\nDQDX1xE4ZUcHAkdm4k8It/Kqh+x3hTiNyIjdY7utx88uPxtpQtgaqKcgIXzpHrXt7J6noissR1ef\n80PeCbxeRNsEmhAeGVFboqI0VBpN/0UOai4dW1Cf9e7R3U5P70t2Q3iIXF4mBn7UEPYyC+MWNeVk\n30C3iK52ezOB1QpY5DVOQ3isOGZNCqURZcpsov2TW6FEwAEZMWEGGWFPCOdyKs0SKXc1pxorpwUr\n2lbPr81bKBETLHB7UTlAJXo2M0MYSLkhTBLCdoZwL8iIOAxhk8gIwHmsGyQhTE25zZgQzmTUomPS\nCeECaYpNGsKMQRtDmFpkDaogyIhqa8Uy5gFnXASgjwekDlyqTsJekN1ujlo/j9EQrrW7d9QVc0Vt\n3gL0zw+WksixNCWEJybUdWc3hLdqQni5toyTqycBCITmx+/6uO9xaUI4cUO41X0fn1o9hcsuE49/\n+MXrVjCyXxzXZtHAEN6AMmIIc2oW+heVk7pi5xWR3zxy4mISGRHUZHAykeJMYhXKqvfvOyEckCH8\nhkvegCzL4oX7XohDU4d6f9EQooZwI2MuIcxZ+orKUVPOjoyILiGcLEMYAC4/XyWEH194IrLXoRPB\nbNE7gdeL3JARmYyafEdpCNdb/hgUapzJQRyArkG9CdkZwqYn5driXYoMYdoPnFg5AUCZiPj/2Xvz\nMEmO+sz/jTq6u/ru6bnv0WhGo5FGo1uA0LEGhGQhQAs/juXBMgYj8Eq2AcPjNUasF69sbG4DPwlz\nr581YGMuG4RkIQkhgY7RLQ0jzX1390xf1V13Ve4fWRHxzazMqqzMyMyomX716JnqrurMyqrMyIg3\n3vh8EWxpZ9VoH+8T9n0plUiJxOLxOdNRU2U20YSwV0O4v6u/YUDtV/Qz5qYrH8yHlRA2DWHvyAj+\nmQOKEsJd1oQwEE375aTTmSFsT1vS5NXSvibIiDkHZESpOTIiioSwCmTEQkLYXa2K74WlsBPC/Pvt\n7ZUT60Dr5LhqeUFGzJetE9+Xu3ijTm3Zui3ymnVDRpRrZUt9ijgSwvYJSroaJ8mSuHTVpUr2bTfB\ndSgqZzeEn31WvkYFYzcKjE878mII757cbfn5zh13iuKnbqKG8LJlcsIulqJyDsiI43PH8YMfAN/8\nJvC3n5NfxEJC2NTCp9CBmp4ODtuvVP0lhMPERXDxJe4qkRFWZmF7zOSV/dEZwslelcgIbybDO857\nByY+PIH7brovsoaRDnDz1VkxqFDJENYlIZxJZ4SRcXj2MEZHZQf4VEZGbFwk03wnq/tC6dgC1s5G\nsqt5As+P3JARgOwAhZoQ9lBUzqndAsJPCM+X55UbwromhJ0MYVUDN2oIN8P70HORLicPSzyheHzu\nOAzDiDUhvH54vbLJaDpBylMifDAfakI46d0QHpuXNwfVCWFuCPP2K5u1JuXC1unMELanCJshI2hC\n+FhWH2SEjgnhU90Qnp62fu5hKypkhN3sjzoh7AUZkS1lLeeaW0LYqS1bdaa8Zu0TmvbVVlxRFpUr\n1pOU9vsRNYTPX36+ZUIxiHRJCJfL8rMdGpLnXbUKPPywfN327cH3pVtROW4IL15snZijevGktRL4\nvul9uGfPPU23Gycywst1fHzuOFasAG66CUj3EUN4wQoFsGAId6RKpeAXWNXCEPZukIZZUI6LL5uM\nIyHsaAhHmBBOdMtOQZDZ8XYYwoDJtopy2QRNCM+WZsX3H
LRDYCT0KyoHSI7w0exRJJI10Qk6lZER\nlCGM4b3YuTOc/bgmhBUhI9wSwoDVEA6rInap0jr17nasYbRd9kFMqIZwE7MwaumQEH79Wa8HALz6\njFcrKTTWStyQKtfKmCpMxW4Iq5IlIVxrTAiHcS17QUakEikxKUvNPxUJYXtROcCa6IkyuUT35bTM\n2k0PHngQ33zqm5YaGG7SFRlhN40shrANGdGb7hV9JTeGsGE7WaM2hP0mhFUzhE9FZARgNWyiTPKf\nLoYwnQhzSwjPleawfr35eNs262QTlVNbtmSNe0LYvtpK/D7ChHCp6pwQpngmVfxgQCaES9USKrVK\nbAxh2jbShDAAPPaY+e+iRcBKBV1onZARtZq8H65b5/66l06+1PC7O3bc0XTbL3uZ7BNfc400hAuF\n8MZHXA0J4YocAy/rMy9YuuqK12cAFpARXAuGcIcqaMeg4rGIjRMyImzxRGNoDGHNDGHLzGuXGmRE\noQALMkIncxQwTTZuYs0UZkSn9+TJYDcOQ8OicoBMapaqJZzInRAdyrExtTdKnZARKwdWIon69TSy\nD3v2hLMfOhBMRJwQ5p3IcjmcatAAUPZQANSp3epOditbYk9FmedhM4STp0lCmK7YadZWf+Tyj2Df\nn+zDXe+4K5JOrJ1hGoYh7BWRoYofDFgnSO0J4Wo1nERLAzLCAWlDayhQqajbYC8qB1jNwyjNJj4o\n7umxskkBZ0OmZtTw0Xs/iiu/eSXe9aN34X//8n+33MfoqGnwAHobwhQZYU8IA3IlwMGZg5Zl5YD5\nudCBL9A5ReVo397p+/GSED6dkBFAtIXlThdD2FNCuJjF174GfPCDwD/9k/u2nBLCg8vkbIdbUTnA\nyhGOlCFcN4Tt9yM61r9m4zXK9k3Hu/Olecv5HWVCmLaNNCEMyM9n+3YrzsSvdDKEs1nTFAasx2zX\ni5MyIczHUj/Z9RMcnj3s+jdLlgC7dwPPPQdce62VDR72hF2zonLrhk3ne6owJcZw1BBeQEaYWvgU\nOlRBb5TWIjZNGMLEVOvv6seZixSU3GwhbmCVa2V091TFjNPpwBA20uEgI5oxhOMST77MFmeFIVws\n+jfWajUASf2QEYB1QH9k9giW1lGBhYJaZqVOyIgES2BJer35w8hejI+HM0VsMYTT8SSEgfAMFU8J\nYQdzadXgqlBMQ/sg5nRBRoz2jorPX3VCuFTxlhAGzKRsqxUfqrS8LxxD2Lw/med1s+O9ZOUl4ny7\nYfMN/ndoU4IlRME8bgjTdFcYHOFSCS2REUDjRFZXsgvv2PaOwPtvVlQOiMcQthsodzx+B/pv78c7\nf/BO8bu50hxu/O6NuP1Xt4vfff6RzzviEqiSSYj7bNwM4WJRGmwNCeF6UblMKmMx7bk4R9huBnPZ\nP4dOQUbwxCUA7HOoOeslIXw6ICNasZbDUphF5eikm93spwahLkXl5kpz2L4d+PSngfPOc9+WPSGc\nSAA9w+5F5frT8SAjhHmWqAiGvv1+9PqzXo8vXPsF3HH9Hbh+0/XK9k3buPnyPNJpaUxGaQjT+509\nIczV7LtuRzoZwvS4mxnCNCH8J5f9CQCz3sIPdv6g6fZXrADOqdemVx0WaSY3hnAqkbKMvzmKayEh\n3KgFQ7hDFdgQpgnhJj06aqptX7Y9kpkUmpDJV/KiQxTkmGlCuJ2q5kC0hnAlodIQ1jchDEhDeKY4\nY+n0+j1u8zvWr6gcYGW5Hp49bFlyphIboRMyAgBW9dZTfV3zOBBSb8/S0UiHmxC2D8qjMFQsDOE2\nEsJh8IOB0xcZkWAJYc6oTgiXqnpO3sWdEB7tHcXeP96LF295Ea/Z+Br/O3QQN9X5gDjsQZsXZITT\n79953jstxcX8yqmoHG2/omQbuhnCX3vya6gaVfzTM/+EA9MHAAB/dvef4ce7fmx53VRhCl974mst\n98OxEWNjMhUVh+gEg
xsyYknfEseBKb0GnRSHIayiqNyaNaZhBjgbwvR+6saZPh2QEVEXWeOyG8KM\nSVO45Dw34Vl08rRZQjjqonKJhOHKEPYi+3m6cSOQreiHjBDHTCYo7fedBEvg1stuxc0X36zUMLMn\nhAHJEY4rIexmCKvgBwN6FZWjPspwE+oYZwivHVqLN255o/j9c+PPed5XlIawW9I/k8o0BBsAW0J4\nwQoFsGAId6yCmg98EAQAXR6LykVRUA6wJhrz5byYdZ1rHghpKtNkMI9ZO2QEmTEtMzWGMB1wA60Z\nwnGIF8qZLc5i0ahMjwYyhDshIZyVCWFAbXrJnhBOsmSsn8P6IckR3je1N5R9uCaEHVKzfpRKpESa\n0A0ZAYSXZqGGcJfLdxllu2UfxKg2hEslaImMAORnOpGbQKlaUpYQ9lI4MA5FYgg36X8AplG2aXST\n/525iH/OdmQEEKYh3Lp9sn//H3r5h5Ts36moXBzICMNwN4SPZY+Jx3ftvgulagnfee47AMz3/4Vr\nvyCe/+xvPtuSJcwnXsvlaNKGbnJjJteMGk7kTBfEzg/malU8Mlu0mlWdkhBOp4HV9W7R/v2Nz3OT\neHjYnTO9gIwIT3ZDGJDHHPSzpianTsgIJCswYI5FaPip1WoELnt7tnWrlQXfFBlRjh4ZwULoL7eS\nPSEMSEN4eto62RSm6P3OjozgCiMhHHdROXpNuSWET+ZOYqpgvnDTok04e/HZ4rnnJ573vK84E8Lc\nEO5J9TQUZwXMIBrXQkLY1IIh3KEKjoxov6hcFPxgwJpozJVzYknR3Jx/3mo+D2kyNBlw22+KCZbA\n0r6lLq9Wo0w6Iw0nQw1DuFiE1gxhQCaEK7UKhhdLJ8nvLLFp+utZVG7VoHtCOBxD2Bzwx8UP5tq0\nZL14fGT+QCj7oCakkVSfEGaMiXYwDmREpUZT787ntNPy8ygSwvNlKzJCRafP2lbrNZFFTfbjc8eV\nDdzKFT3b6igMYbdzOmzxJLa9qBwQDjKiXIYnZMSBGdlO3rD5Bpy95GzH17WrVkXlojKECwVpwtCB\nsmEYFp7uXXvuwoMHHhQDtxu33IhbL7sV1515HQDzc/qX5/+l6b5oYbljx9xfF7aoIUyPeTI/iZph\nRpfd+pmtDGG7WRWF+aCiqBwAbKgvIJqctH5GlQpw6JD1NU5SPRnZSqczMgI4NQ1hioyoQp5Eoxn5\nwdsnXdxkP5azz4Yw1gAHZESXMzKCTjSHaQinM/J4VSHWWsliCNsSwkB0kx6tEsKJhEQfBJVOyAgv\nhjBPBwPA5tHNGOoZEqGmFyZeaChk6qZYkRF1tn4mnbGssOIJYVpgTqdVeXFqwRDuUEWVEKamygUr\nokkI04FLvpIXhrBh+E9hUZPBaxEbwBwQh21KJFhCpO7my3NikBak89cJDGFaOb13RN6dgyEj9Cwq\nZ08IryBjvKNH1e3HnhCOix/MdeYyaaCdKIYzIrd0NDwYLn7EJ4piSQj7LCpHJyFUyj6IUV04witO\nIA6t7LcWlqMDQL/3pkoFMJieqzmaGcJBBjb0/tSsqG2Yiich3HrCanGvHB1/+BUfVrZ/p6JycSAj\n3MzR6cI0ymTy69699+JfX/hX8fPrz3o9AOtn8ncP/50wVJ1EK8SrvM+2K7dj5vxgwLmgHOCMjKC/\nsxvCqZQ0pnQuKgdYzV6KjTh0SBp1zQzhhYRweHIyhPl5FdSkbGYIZzJyf1EnhMvEEKbX41zZW0I4\nkbBOLG7dCkzl3ZERboZwVAnhVI87MiIsWZAR5UZDOCpsRKuE8FlnWSecgiiTkQVO4zaEvTCEX5qU\n/OBNi8zVWVuXbAVgTnBQM7WZdEkI00lVjnujq5ESiQUrFFgwhDtWKhnCzRLCbz/37ehJ9eDVZ7wa\n5y1TtH6ihSwMYYKMAPwnd3I5eEok2Y2VsHERXHywNleaE4kAlQxhnUwGLp4QBoDMsLx
L+j3uahXa\nIiPsDOFVxKs7ckTdfviAhXFDOEZ+MABsWCyvn+lKOIYwNSFrISSEAWkux5EQLlflOe2GjHAsKteh\nDGGvk3dxiN4PjmaPKkny2PE+Oh1zJAnhmA1hPjkeSULYAzLi41d9HAwMbz3nrXjl2lcq2z+9bjk/\nL46EsJs5you9cGVLWXztSZMTnE6k8dqNrwUAXL3+aly88mIAwFPHn8K3n/626750MYTdGMKcHww0\nQUY48KPXDq2V23bgm/LrVGdkBOBuCNPHXhPCp6ohrFNCmJ+7Qc+rZoYwY9IEj+J4aUK4QgxhOjHn\nNSEMWK/vs89ujoygk3TUEKZGXRgGKTfPkl0kIRwHMqKeEF5Cmr44DGGnhLAqfjBgntO8TY7bEPbC\nEKYF5TaPbgYAnLNExqVfmHjB077iZAjzonKZVMbSbz82d8zyL6BnYC4OLRjCHaqghnCFGMJdafce\n3Ru2vAGTH5nEPe+8J5KCcoA11UiREYB/jvB8rgYwc5lDOwnhqAxhPlibL82LDuD0tLWz0o50Nhm4\nqCGcHlCVEJbmmU6N/HDPsEi+h2kIC0MupUdCmM7MzrFwRuQWEzIZDhPNLSEcCTKCttUuo+8oGcL2\npeensyFMO7yBDGFN8T7DPcPi3ArNEA7iKAUQnySNNCHsYQXDLZfegrm/mMN33vwdpWy70V7pLHEj\nMg6GsKshPNfITuKJ4avXXy1qDjDGcPvv3C5e8+F7PmwxXajofVbHhDB933SJOpUTMoIawk5807DN\nBxVF5YDghvBCUbnw1MwQLhSCFZaj90o6vuPi5lzkCWGDJITJBI1XhjBgvb63bJHIiL50X0M/zYLf\nKskPZbVcUIgDIZDW+DFbEsLJ0yshTPsvTglhVfxgLn5e6GQIuyIjJiUygtdv4AlhwDtHOK6EcCpl\niPBOT6rHMqnKjWCeFE4n0pF5W7pr4VPoUAXtvNcIMiLdJCEMRG8q2ZERNLnj1xCey3lLRNsHaXSJ\ncJjiHQOaEDYM/x0iO0NYJ3OUiyIj0n1qE8KsltYKFM8YE4nNI7NHQjeEDU0YwvRGXOw6FkqldzoQ\nrCaiTQhHgYyoeEgIJ1lScMi5wkJGJFhCnFf2hLCKTp8OfFk32Q3hREJ2ev0iI3SevGOMiZTw8bnj\nlntxkD5IvlADEmZj0AxZFaZEQjhKhrAHZAQQTrudSqSE6cjNV50TwlQcF8H1mo2vwVvOeQsA4ETu\nBD5670cd/44mhFXeZ9uVW1G56YL80O3pQS4nZMTaQW+GcDbrP1TQTFEmhNevd//7BWREeHIyhOkE\nUpA2ko7j7AlhQPar5ufDLzJGz2WaEB7pGRFGkVMK302/+7vy3/5+iYxwur7dkBE9PZJ/HoYhLBPC\nMReVc2AIT0zY/yIcRZkQBtSl64PKEzKinhBOsiQ2DJuNtO4JYUs7YetnLeldIq5lgYyoG8NO99fT\nVQuGcIdKJTLCLXUWl+zICDqD7LcTMp8nbNkmA9C4E8LFahEjo7Jl89sBtDOEdTIZuGhCmGXkXUpF\nQjgB/Y6Xc4SzpSzQPSs6wsqREYmK+O7jRkYMdQ8hUasbH/3HQkm40I5GjUXLEI6mqFzrtosx1tCh\nD7PtohNYqhnCubw0C+PCCbjJbggDwbmKNBEN6Dd5xzvMJ3InUENZCeO+UIw/Ec0/52gZwtEPwKmW\n9ZvVTHnxNq0MYYeEMNcNm29o+N1nrvmMaIfu3HEnHjvyWMNrdE8IU0N4uMd5/e6izKKGfumaoTXi\nsdNydlXGnZtUF5UDgiMjTtWE8NCQ5I/qkhAGghlbzZARQLSF5eiECU0IZ9IZSz/Hq/7+74Hnnwd+\n9CPzZ74KwM4PBqxpWfs++ETIsWPqJzuEIdwdQ1E5DRPCw8ON+IR2E8L3778fV3/zanzrqW85Ps+v\nnWIxmskrN7VCRhiGIYrKbRjZINCLtLit7glhWlw
8k84gmUiKfuyx7DGUq2XRB3JCMp2uWjCEO1SB\nkRHkLtjTBBkRh2gimRaVAwIgI/LeDPDYGMLkJjm4WPaW/HYAO4EhTGfmsuyweOy3Q1CtQhgrCUMf\nfjAXTWwezcqUsPKEcEreeeNGRjDG0FutX0MDR0OZ/ecDwXQaKFVPQYZwjbZd7uc1bbtGM6OhFgnh\nKY/50jxSKbOYCqBmUD6fJ6tXNJusdDKEOUf4VEwIA7KdNmBgIjehhHFfKMd/vPaicpEkhEMqeulV\nS/uWAjAH4hxPxc2mPXuieQ9eEsKcEQwA25dtx7rhdQ3bWTW4Cv/r6v8FwDw3OW+Yatkyk98IxJsQ\ndmMIezGEaUqfqxUyQpVx5yZVReVWrJDG6kJC2FmMSYM07oSwqkmzVoYwTUWHbQhbkBE1a/+RG8Lt\nMIQZM4vJcY4p7zMuyixqeK0FGVG2ziivI03ewYOed+9J0hCOoahci4RwXEXlkkk5kbZokXUy0Ys+\ndPeH8MCBB3DLz24Rq46o6LUTRv/Cq1ohI47PHRfnIi8oB5j3J77K9fnx52EYRst9xcUQribkzvh5\nzdFLY/NjFn5wVB5PJ2jBEO5QBTUfaOqsK62XWUhTjblyTslAzWtCOMqUHRXtGPQNy46B3w6g7iYD\nAJwxcoZ4fLy4T5hKgRLCSZ4Q1s8QXj0gwWCUIzw3pyaRZhj17z0tb4ZxIyMAYJDVZ2AzUzh8XH2M\nh5uQmYzVsA2DIVypVSxafYAZAAAgAElEQVRV7aNIslQNgozwOJkVFi6CiyZnGJNJLSUM4aK3tjoO\nDfcMiw6mqoSw2VbHn5h10/I+a2E5bghPTQVg3BfjvzfxSVJeVC7qhHBUA3CqZX3LxOOx+TF0dwPb\ntpk/P/+8/wn3duRmCPPEDgC898L3isdv3vpm1229c/s7xeN90/sank+lTFMY6NyEMGCdPE+ypKVf\n2gwZAYRjCKtCRiQS0vjat8/sw/DHgPnd9TbpwpwOhjAgC8vplBAO0xCm/aqwj5mfy4kEULQFCga6\nzMFnOwlhKs4PBtpDRgBWQ1g1NoIbwiwdAzLCISEcR1E53i6m09K4vOkm89/3v19OJHpRrpzD08ef\nBmB+j9Rw5KJtcpwcYTsqwy6eDgZkQTkuzhGeKkw1RTxxaZEQrvtJPAlcM2riuwKcGf2nqxYM4Q4T\nN80CJ4Rr+g5ALQzhspqEcI6kzro1TAjTjkHvsDzIQMgIzRnCG0bkWsD9M/sCpyDMhHDdENY8IXwk\nq54jXKnUB1RpGVWMGxkBAKNd8oa7+/hx5dvnJmRPD1CohJsQBoBSVY6UMhm5ZDYKZESzhDB9j2G3\nW7y9ylfyqNaqwhBW0emjbXVcfFk3McbEZ+uUEPYQmmiQ7pN31IyihrBh+D/nC6X4j9eeEI7GEI6+\nqjsVNYS5AXvZZebPtRqwY0f478FLQvh1m1+Hb73xW7jtytvwoZd/yHVbdCXEoZlDjq/h99njx8Ph\n6XqRF4YwralgFx20jmRGhFEFxGMIqyoqB8gEcC5n8kPzefO7AprjIoD4kBF0VUwU4onZ2dnwmbpc\n3BBmTK4iUHVeeS0qB0SHjEilGvuPIiFcynpKRNpFi0Yu6mmeEI7SEOYmeLIrBmSEZgzhoSFp/n7+\n8+b59td/3d62nj7+tJhYBoD90/sbXhN2/8Kr+PXU3+88mffLA78Uj+2GMOUIPz/eGhuhgyFsTwgD\nwI5jsqOzYAhLLRjCHSbeCQmcEK7qyyy0IyNUJIRzRc2REeQm2TOkyBDWOHUGmJ8t/7z3Tu0NvBRZ\n+4TwoHNCGFBjCIsBkUbICABY2iuvob0T6mNavKPR02Nl/Krs4NJ2ge6DMTnLHpoh7CchPBBuQtie\n8uAdPyUMYdpWa4YzAuQ9YaowhXw5LxJO1aq/yut2Q1g3vI+bIQwEWMGiATKC93siLSqnCTICkMze\nSy+Vzz/6aPj
vwQtDeHHvYvze9t/DX/2Xv2p6D2OMYc2gydM9PHvY8TW8sFy1CoyPO74kdAVNCNNB\n66LMIouR5FTwqlMSwkAjR3j/fufnnBRXQjjKdDAQLUKBi9/LurqkYabK1PJaVA6IDhmRTDYawgPd\n5k2hZtQsz3kVLygHOCeE6bjPjSEMhJgQ7ooBGeGQEKac7KiREfaUrFNqtpUeP/q45ed9U42rVcLG\n+HgVv56ccBGFSgFfeuxLAMzi0deeea3leZ4QBrwVlovLEKbICB6Kol7OE8eeEI8XkBFSC4Zwh4kb\nwtmstVPWrmjqTDezsFlROf8JYW8mAzVV0ok0RntHXV+rUrSD392vHhmhm8kAmDec9cPrAZg30NHF\n5gy83xSEJSGsYVE5atIdmVWfEBZmXLrxZhinVg3Kweyh6calVEFFkRGhJYRJms/OEeYdq/CQEfI6\nbra6gb7HsA1hC/uuNK8WGVEgbbVmCWHA2oE8NnfMsqTZD0dY94QwLbqhyhAulSkWJJ7JO3tCmPYz\nTllkRH9jQlgbQ7ieEB7NjLZ1TvACa9lSFjOFxpG26vusH6lERizKLBJGFRA/MiJoQpiavvv3ey8o\nB5jjIb7/U9kQpm1uVNgIaghznYoMYS8JYcB54qWVLMgIh6JyXckucR9qxhBWaQgbBsFkpKNfsWJJ\nCNePmTGZEo7KEOZeAp0I9qvHj1kNYV0TwobR3BD+P0//H3EfftPZb7JgHQHgnKUkIeyhsFxsDGHW\nRkJ4oaic0IIh3GGiy5SCJNLo8gbdzEKaCMmVc2oMYWIydDcxhGlqZ8XACiRYNJcI7Xik+9QjI3Qz\nGbj4DWe+PI+BZXKtkJ/jrlQgjJWk7gnhrDUhfNg53NSWxICIICN0YAivXSRvuE5sraCiyIjQGMKk\nXaAJYUAmCmZnzWXXqlWpydmR7iaj7zgYwoBpSKg0hClOQMuEcL80hI/MHrEMaP1whAsFaN1W2xPC\ndLDup502DD0Swny/vC+UTEpzIrSEcMzICEtCuD7wO/tsaYY/8kj478HJHDUMQySEqWntRfS+emi2\nERuxkgSA4uII0/OJ9mdniqZb25XsajpBQAetIz0jTZeaA9EiI1QnhNsxhAFpzkaJjIgzIRxVYTkn\nQzgMZITWCeEWaJZWsiAjHIrKMcYs9RioqCFMU/NBRY0zFsMEJU0I02PmHOEoDOFyWbZhzRjlXmVP\nCOtqCBcK8rq2G8I1o4ZP//rT4ucPv+LDDX/fMQlh5lBUjtxDOe4NWEBGUC0Ywh0mCjoPYgjrnBC2\nMIQVISNooaJmhjA1VaJcSkBnTZMZeZP0mwYwkRH6YkG4NgzLHn96iRwJ+On0VqsQyIgk088QXtq3\nVFxrDx54EMVBOcMaGjJCg4TwxqXyhjuRVzsiF4X0EDJD2ENC2DDC6ejRhLBXZEToDOG0syGsotNn\naauDug0hiH62R7NHLQNaFQlh3dpq1ciIchkAi7//wSfCK7WK4EPyQVtoCeGYkRFODOFkErj4YvN3\nhw4Bx9TP2VnkZAjPleaQr+Qb3qMXcWQE4MwRpoZw3Anh/n65NBqQCeHhnmGwJlWM7MiIrmQX0gmz\nj3OqISP8GsILCWG14gZLHAnhOIrKtUwIF30khFsgIwA59rMbwv39ciJAZULYYgino78fOTGEAZkQ\nzuf99aPaEe2nBjWE50pz2Dmx0/I7pwKnOhSVo5MrdjTGT3b9BLtO7gIAXLXuKlyy6pKGvx/uGRb9\n38eOPmYpQOek+AxhUlQu3YiMoFpARkgtGMIdJpoQDjJzWtW4qFwoyAiLIew+4I7LEKYdj0R3cGSE\nnSGsWwqcixrCxshe8dh/QlhfhnAykRQV02eKM/jgk68FhsyeXmjICA0YwltWyetouqLWbaCDwIaE\nsMIOrpeEMBBOmqVqeEsI0/cYF0O4Wg2GMgKshnBKc2TE0exRy4DCd0JYY2QEN
ehUGMK6HC/db80w\no/188jm8hHC8yAinhDBgxUY89li478HJEKbvpd2EsMUQdkgI05U4cSWE+THblydTQ7iZ6NJdnojm\n2IhOQ0Ycnj2MTzzwCTwz9gyA4IawytUpraRDQvh0QEbYE8KPPgqccw7wgQ/436ebmiEjgiaEWyEj\nADn2o+YoF+cIHz4cvF/FRY0zFgMyoifVAwZz8otiMmg7vWtXuO+BGs5BDeEnjz0JA9aCg7omhOn4\nxJ4Q/sxvPiMeO6WDud58tjmGLVQKeNu/vq1hPEQVlyFccUoIOySBkyyJJX1Lwn1jHaQFQ7jDpAwZ\nAX0TSRZkRCWnJCFMlyGnm5gM64fXC0P60pWXur5OtaghXE7MiU62KoawbiYDFx3oVPoVJoQ1NIQB\n4Cuv+wouWnERAOB47gjwztcCqfwpnRA+c5m8EWeZWkOYDgIzGatZqzQhnHRPCFNDOIzCcrSt9swQ\njgkZAQQfmNO2Wsd2K+yEsG7HnElnMNRtOkwqDGH76pUUi7eoHCCxEXzQls2aiX+V0g4ZQYq4XXaZ\nfE3YHGE+GE6lpJnH08qAj4TwkDSEnQrL6YCM4MdMTYGaURPM41aG8DlLz8FfXvGXeP1Zr8cfXfJH\nAOC61BzQGxlx0w9vwm3334Ybv3sjADP9ykMf1BBOJIA1a1w2QnQ6JIRPZWQEDfxw2RnCt98OvPAC\n8LnPAceP+9+vk9yQEd3J7sAM4VbICMB6HRu2mw7HRlSr6touS32WGCYoGWMiUEBNcDop+etfh/se\nVCaE7bgIwJyYpKuwAT2KytHxiSWFn5/EgwceBABsHt2M6zZd57qN2191O84aPQsA8OTxJ/Hn//nn\nrq+NiyFcMUhCuD4GXta/TExEcC3rXxYZFrQTtPBJdJhUJIQNA6gZ+iaELcgIRQlhS+qsyfGOZEZw\n30334as3fBV/fNkf+9uZD1mLNM2JQbcqhrBupj/XhhEZAcl1B0sIF8tVgJkdqmRMJkMrDXQP4Gfv\n+Bk2j242f7F4F7D5P5QYwroyhEd7FwFVc1RRSKkdkVPz0Y6MUMoQTrknhMPm3VmKyjWJY924xRxg\n/86G38GS3nBnvRcMYVNH51QlhPW9HwMSG3GqJoT54I1PPtdq6petUmREEqlYBiJ9XX1iyS41Yelg\nPGyOMDVHOSWBmtOBkBEtEsJxICMMQwYZqCmQLWZFsoxPuDTTJ37nE/jR236EtUNrAcg22Gkpu64J\n4cOzh/GLfb8AAOyd2otytQzGZBJ4717g+TpJa80ab9uOyhA2jNMHGVGrybEWvberSjnybTNm3T6X\nvU/1xBPyZ9WT7hQZQSf7e1I9LYs3tpIlIeyGjKibo1Wj2hA2CKOwnJshHCXCiN+DaEL45S+Xzz/8\ncLj7V5kQpgXl+IrXSq2CI7PWm41uCWEaZLl///3iXnT9puub9k36uvrw3Td/V5wvn3vkc7h///2O\nr40vIdyIDkwlUpYJcWCBH2zXgiHcYVKREC6XYWUWaoYTsCAjKnlLAsuvIdyOyXDZ6svw7gvfHely\neztoX4khrMGgu5VoQniGBUsIlyo09a5nQhgAlvQtwcev+rj8xeAhjI3ZOmo+pCsygjGGdNG88VZ6\njilN3tFORoMhHBYyIuqEMEFGNFvdcMult+DIB4/gnnfe05RHqUJ2Dhwd1AXt+Fnaag0ndpolhFUg\nI3S7HwPSEM6WssgMBkMa6WII08+ZI7TCHLRRZERXInpcBBdHMlBMw6pVwIr62Oixx8IpjsnllJal\n78U+aGslmhB2YggvWiRTjnEkhOfnZdqcHjPHRQCtE8JO4svZ58vzAnnCpWtC+PsvfN/y82zRPBlu\nNOcyUavJZKoXXAQQHTKiUpHf46meED52TJrfa9fK36tGRvT1WevicHV3SzNpzx6Tbc6leqKOIyNa\nFZXjaf52ZGEIt0BGANEUlrOMM5LxrFhxS
ghv3y6v5bATwvQcygQcIvGEcHeyG9dvul783o6N0M0Q\nppMu9+69Vzx+1YZXtdzO9uXb8alrPiV+/tJjX3J8XVyGcNlwHgPTwnLAAj/YrgVDuMNEb55+LzDd\nzUJ6AefLeaRSsmHxjYwo6p3AsiSEy/PCEM7n2/+eBcezAxjCwz3DYjA0UZUJYT+VZufz8o6Q0tgQ\nBqyFmtA/BsMIvhROV2QEAPRW6jfivhMYP1lStl17QpgnMpIsaZlkCSrKFi9Vre8/bEO4Bu9t9cqB\nlZEkD8NMCBcr+t6bADPlz1MHJ3InLAmTUxEZAVjbq0qPNO862RBulhAG1A/aSiUIZEQ6ET0ugosn\ncCfzkyhXzXsmYzIlPDMDvNi8VkwgORrCc/4ZwkPdQ2KCyikhzJjERsSREHZiJgPBDWHaBufK1oYn\n7AJGfovKfe+F71l+nimaJttf/iXw/vdbX+vVEObmbLkc7kQGTSCf6gnhPXvk4zPPlI8zGfl9qzKE\n3cQNK/skjmpjqVlROXpd8nO1HVFkhNs1bl0dap1R5gxhQF1C2MIiTkaPjACcE8JdXbK46d69wPi4\n01+qkaqE8HRhWhRWO3/5+dg0ukk8ZzeEdSgq54aMuHefaQinEilcue5KT9t638XvE5O3P971Y8vk\nB1dsCWE4Fxe3J4IXEsJWLRjCHSZlhrDGOAFqYvGOLsdG+E0IF8vxD0CbyW6wBCkiITquGgy6vYin\nhCeKh0RROD9Gw0xW3hG6UnobwpZlsf2mExx0sCqREbJh0AEZAQADTN54dx5SB4GzM4Qn5icAAIt7\nFys1RuNERlRoQjihx3ltb69oxy+IIWwYQFFzZAQAjGbMEfrJ3Ek1CWGm94QlNYRnqsdE4lIJQ1gD\nQ9jOEAZCSggneUI4PkOYJnAnchPiMR+MA8Azz4Sz71JJtg9uCeF2kRGMMZESPjRzqIHDCUhsxORk\nNMXHqOh5RCccVBrCdmxEOi0H47ogIw7PHsbDh6xrwXlCOJ0Gvvxl4B/+Qa6CfPWrvW2XTkaGiY2I\n0xCOuqgcNYQ3bpSPGZPXrQqGcDNDeJEzcje0hHArQ5her17FAwpD3UOuoZz+tLeEcBjICCMZzoq6\nVuJhjUKlYClwT7ERYaaEVTGEnzz2pHh88cqLsX54vfh53/Q+y2t1YAg7JYSPzB7BrpNmFb9LV11q\nwaQ0UyqRwn87978BMEMy33v+ew2vSSTk6pwoDeFSrbGoHOBgCA8sGMJUC4Zwh6lYk3dDvxdYqQQt\nBmRuSiaSIpGXr5gHGaT6t2EARdJa6GaAA9Yl2BQZAbQ/6BYdV41NfyrOXaoaVWDITPj4MRqmiCHc\n027p64hlSQj3mYPhoIawREbINkIHZAQALErLG++Lx9QVlqOD++4eQxgcqivHxomMKFf1a6tp+nq+\nPK8sIVwuAwbT73jt4gViJvOTyGSkAXWqJoSpSTeRGw+ENNLleOk9kSeELYVWFBsvFmREMkZkBPku\naTJ32zb5mueeC2fftP/magi3mRAGJEc4X8lbuJ1ccRaWcztmmjj0hYxowTflibSwkRFJj11LOy4C\nkIYw1y23AL/9LfDAA8Db3+5tu9Sc5X3f5583l6C/+93qikOqMIQrtQru2XMPTubaazgHBmQyNwpk\nhJshDMhzOKqEsF2qDWG3onI9qR4L99ePIcwTwm4F5YBGXCBV+IYwYQhHiIxww2RExRFWlRDePblb\nPD536bliLAs0JoQzGdlW6oCM4OMWng4GvOEiqH5v+++Jx99+5tuOr+ETk1EWlSs7FJUDGhERC8gI\nqxYM4Q5TviZ7l8GQEXonkvhFnC+bBxkkIVwsQsvUJJUbMgIIYAhrMOj2IsoRxrA5q+qn0ztLE8Jp\nfY8XMAtMiO9EUUJYZ2TE0l554903oc4Qpm1gomdOdOhVF1VrlhAO3RCuUYawHhMdzZARQTp++Ty0\n5+kCw
Giv2UAXq0Wke+UBK2EIazh5R026sfmxU8IQtiSE6ymlZcSLHBuz/0UwmYaw2T5FmcayiyaE\naWG5c8+VrwnLEHbDJ1Bjul2GMGArLOfAEaaF5aI2hENDRjRJFgLhGsLURPOKq/+XF/6l4XdOXNZN\nm4Arr/S+XSdD+ItfNFPuX/86sHOnt+20UlBDuFqr4vr/ez2u+adr8Lv/93cdk+xuYkwmZqNICO+W\nPpcFGQHI88qvqVWtyr4qLRhul5shHBYywskQDpIQNgxDLKN3KygHNI79qEZG5GcUBkO4logHGUGT\nmkeycuATVUJYFUP44MxB8Xjd0DqsG5YOvt0Qpul6HQxhfn0FMYTPX34+zl1qdhwePvQwHj/6OD7x\nwCfwZ3f/mVi1EpUhTM/rouGSELYlgheQEVYtGMIdJlo8QhVDWMdBN0828oQwvykWCjYGkgflcrCk\nJnU3hIMmhIUx2AEMYQCWWdXuFSZH2BcyYk6eGBnNE8IJlpBJrX41CWEnZIQuCeGVg/LGe3BK3Yic\nplGr3XL5sx9ToZmaJYRDR0ZomBAOiyGsi1nYSjTxU07LxupUTQjT62lsThrChUL7x6zL8dJ7Ik8I\nh24Ic2REjIaw3dzn2rBBDt6iNoS5MT3UPeTLmLAUlnPgCNOEcNQc4SgYwtlS49I5atypZuvywbfX\nbtaR2SN46NBDDb+3J4T9yOnec1D6NL7qUTipREoH+DGEP/nQJ3H3nrsBAI8eedRihHkRN4SjTAgn\nElaOLSDP4WLRH6KDTprGnRCuVOS53Nvb3BB2WnnQTHOlOYEicisoBzQvKseY/PwPHlRzHdPxs5GI\nBxmxdkhWKqSm6vLl8ngffzx4oW03qUoIH5yV733t0FoMdg+KvqEdGQEEn0wJKjtD2DAM/GLfLwCY\n4aGXrX5ZW9tjjOGd571T/HzpP16K2+6/DZ/+9adx5447ze3GYAiXayQhTIvKLSAjmmrBEO4wGUZN\nDKYCISM0ZxbyZCNnCFP2WrspYdMQlr0QHQ3hnlQPGMxIRBjICB2/Y64NI9IQ7lkeICE8R5ARXXob\nwgAZmPeNA6wWSkJYl3N97Yi88R7NhoOMoIZwVAnhXx38Fb64838AQ2bHUHVC2DCkWQWc+gxhe0JY\n13ZrUY80hCspGdnynxDWu622YAZIQhjwOWGpwXfsVFSOGsJBi3zaVSobQMp0lXpiREa4JYQTCeCc\nc8zHu3eHM4BzTQjXjWk/uAigdUI4TmREWIawV2SEYfivveEmWojLi+7bf594TDmbfgp12UVNRf5Z\nUyqV30LUdlHzs6vL/XVOevjQw7jtvtssv6P8US/ibe7cnNWcDkPcEF6zpvFYg3LWvRrCbgxhle0S\nPTcGBhoN4aFuWQms3YQwLSjXFBmRdkdGABIbUSyqmaTUISHsZggDwCteYf6bzwNPPx3O/pUZwuS9\n80lJHnA6PHtYFG3l0ikhPDwMvDT5Eg7PHgYAXLHuCl/YkHdse4fwLgzIVQ87ju0AEFNCuOqcEF5A\nRjTXgiHcieo1jQ+/M6UNCWENl6hyI8uOjADa79zm87AkhCmzSRcxxoTJki1mA1UVdkJG6Pgdc1Fk\nRGJUJoTbnQ2fnZd3hEy3HsZZMwmTJVEFMifDYQhrgozYuFTeeCfy4SAjSmliCEfAEM6X87jhn2/A\nF578W+B3bwGg3hA2ee/yvNbFLKSDGJUM4U4xhDkyAgCKSdlAq0gI67iaww0ZAficsNTgO6b3RJ7k\nWk7Q7qoTwnQiKUpeo11uDGFAYiMMQ91Seyonc7RQKYikqN+VHasHV4vHTglhioyIMyHsVlSOGk9e\n1SxZCFir2qvGRnBD2GtC+NmxZ8Xj15zxGvFYRUJ4BQl5cbM/bEO4nYTwTGEGb//+20Ubw/Xk8fYM\nYWqQhpkSnpyUxpGdHwxYz6swDeEoEsLNDOHuVDe6U92iD92uIUwTxX4
TwgBwBiHqUbazX7kZwlHe\nk9YNSbTCgWkrHDkKbISqonL8vQ/3DGOw27yh8QmvmlETZiuXuOcVwp/UcRK/rru7TaP2wQMPiufa\nxUVwrRpchRvOugEAwOr/AcDOCbMDETVDOJkEClVnhjBNBDMw5StJO10LhnAnqs9MdahCRug46KbI\nCMMwLB3pdjt4nYCMAKyFioJ0/qQhrHfqjGvd0DpxE6kOmgnhWq39QcxcTvZ0ejvAELYUlusfO6WR\nEWetlDfiqUo4yIhCSqbdokgI//LAL+UgYcMvgERFOTIil4Olre4EhvDphowoMNlAq2AI63jMbsgI\nwGdCOBn/JEerhLBqQ7hADWFNkBHjuXHLc2FzhJ0MYWpKU7O6HemMjHArKqcUGVF0R0YA6rAJXNxU\n8poQfm5CnkyXr7lcPHZiCLcru9lfrQLj5LSO2xD+1MOfEknCLYu3iN8/ceyJtvZPz+HDh91fF1TU\ndLTzgwFbYUQfX19QQziKhHAqkRL3B35tejGEZ4uz+NFvf4T//h//HW/63pvE7z0zhEuNHYjNm+Xj\nF19s+RZaymIIM/N4EywR6X3YkhCetSaEqSH86KPh7F9FQrhaqwrDlxrcdAWEHRsRNF0fVDywwq8t\niq05Z8k5vrf7jTd8A1953VfwzPufweZR84TddXIXakZNGMLVangIEMCKMbIn/bnomHtp31It+9px\nasEQ7kQFNITN1JnefFk+q1MzaijXykoTwroawnzQfSJ3AsOL5PfjnyGsd+qMqzvVLRKdlR65Trfd\n485aEsL6N/SWwW//8cBmos5F5TavXgxUze9kOrGnrYIqzUTNx2IiWobwXbvvki/omgeWPqs8ITw/\nDy3MM7t6071iEidbzIZWVE6X47WLGsJ5qE0I63jMdPnsqYKMsCSE60XlhobkEukwE8I9aT2Kyrkl\nhIEIDeF5BYYwQUbYU1kAsFoGiHGo0S8OVaEhI7qaIyNouvO3v217803VLjKCJ4SHuodwzlJpPKhI\nCNsN4fFx6+oyVcaLH0N4fH4cn/3NZwGYuKd/f/u/i++t3YTwOuk54cAB99cFFTWEnRLCUSEj4kwI\nUxOJm7nNDOFcOYc3fOcNWPTJRXjjd9+ILz/+Zeyd2iueP3vx2a5/2yohHKYhXGXmSR0lLgKwTuDZ\nkRHnnGPiiwDg+efD2b+KonJj82Oi4DM1uGlNHHthubgNYT7G5IWwT+Zk521x72Lf212UWYQ/vOgP\nce7Sc3H2EvNcL1QKODB9wPL5hpkSpoYwX1kOWM/trmQXzlt2HgDgopUXhfdmOlQdawgzxpYwxq5n\njP0VY+ynjLEJxlit/v/XfWzvOsbYvzHGDjHGCvV//40xdm0b20gyxt7HGPslY2ycMZZjjO1mjN3B\nGNva7ntylZKEsH4mAxVNNubKOYsh7Csh3KU3QxiQAzUDBhJ9sqFWwRDWGRkBQCy3qSbll9vucc/l\npMnQcQnhvjFMT5tLdf1KmKNdZqeSgWlzri8aSQDHLgYA5DIv4tEjaqb+qSGcY+EhI7qSEqRXqppr\nve7ac5f1RWsfUm4I2xPCurTVjDFxzc4WZxUzhPW+NwHAaEY6ovM1BQzhDmirebJ0fH5cgSEc/3dM\n0/b8mmZMpoRVM4SLdBljOj6G8EjPiPjMqRkLxGMIHyNMeb8M4YHuATFh4cQQ7u0FFtfHu2GaaU6K\noqick5F0Dgl8qTZW2ikqN1OYEantc5eea8FjzJbUG8LHbESqOBPCf/urv8V82bwpvPei92Ljoo3Y\nvnw7ANMIo4ZMK9Hibvv3e/6zttXKEA6KjKD3C3o92BUFQ5iGi9wMYX5tzpXmLPUcqP5xxz/ix7t+\nbMGCpBIpXLXuKnzh2i/gHee9w/U9UISh03W8aZN8rMIQpkXlqvWEcNQrVvq7+sWkut0Q7umRx7xz\np5ksVS0VCWH6vjg3PuMAACAASUR
BVKkhTBGI9++/3/I3Qa+dICqX5fnOJ1tO5OXSEYpBC6Ito3IV\nxM4TOyM3hFMpa0LYvkr2X/+/f8VnrvkMvvK6r4T3ZjpUHWsIAxgD8BMAHwNwLYBFAIz6/57FTH0V\nwH8AeCOAlQDS9X/fCOCnjLE7PWxnFMCvAXwZwOUARgF0A9gA4L0AdjDG3t3Oe3NVv9mJD5QQrhuk\nDEybFCEVNbLy5XygonINDOG0fgxhwJrcKXfJNW/+kRH6GUlu4qmJciILfgm3e9zzBWkydKX0N4Qt\ng9/+46hU/BlKXMKI6zZ7Gv1d/WCM+d+gQjEGLDnwXvHz///4HUq2S9vAOSM6ZMT+6f347Qlb7GrN\nw8jl1LLBTENYnte6FJUDYDGElSIjOqCtpgnhbFU2VEETwgmW0OaatYunN2eLsxgYkV+yr/uTBkx/\nN0ONc4RPnLAOnoOqVKMFfOJLCDMm2Xm0qBxgLkvn6aGoDGGaoqLLbdsVT50dnj3suAKFG2pHj4a7\ndNUuN4YwL6iWSqR8TdzSonLZUqPrGYUh7CUh/Ny4PJG2Ld0m7htAOMgIXQzhQzOH8OXHvgzAXKn1\n0Ss+CgC4YPkF4jVPHX/K8/6jMoR375aPw0BG7NrVfPtcNCFMv+OwEsL9/c0NYcA5JWwYBu7YIfuz\nf3TxH+Enb/8JJj8yift//37cetmtTcdflC9sR/gAwNq1ctXKSy+1PqZWsiSEYZ7UcTDtuYl6ePaw\nWKHDxduuQgHYu9f+l8GlgiFM2cfUEL5i3RXiO/3n5/7ZsmIlzoQwvVaFIZyThnCQhDAVTwgDJkdY\n1erBVqJc+3zF3BEDaxgzbRrdhA+8/ANYNbjKvonTXp1sCAPSAD4I4G4AfkZStwP4g/p2dgB4O4BL\n6/8+Uf/9exhjf+22AcZYAsAPAVxcf/33AVwH4DIAfwzTvO4GcAdj7LU+3qNFqSEFCeF6irAL+phG\nVNSkzlfygZARncIQpibWVGlcDB5UMIR1RkYAcnBeQwVImQcwMdHsLxo1l9fTOHOTFRlhTvIESZiK\n773b7OXSAaMOuqz/rUDBnCL/znPfwVQ+OHCXmo9zteiQET/f/fPGF615CIDawnK6JoSBcAzhfB7i\n3gToWQAUsBrC04VJMWALyhDW6fu1i15TrF8mS30lhOl3HJPpT40paqjxhLBhqGWvlqp6ICMAee8Z\nnx9HzZDr6xmTKeFDh9QXI3MyhOnSarrctl1xbESxWsRErrHzwJfc12rhMljtasUQHu4Z9tUHb5UQ\n3rBBLodWbQhzU67Zkn+uZ8dlQblzl56LoR6SEFaAjFi61CwkBJiG8FFbiYK4DOG/+dXfCLzUrZfe\nKgoaXbjiQvGadjjCcSAjaEEzrqCmFjWEzzrL/XVbt8p9vfWt8vdRIyNaGcIPHHhAhAOuWncVvnT9\nl/C6za/z3P+midI9k41V45JJmdR+6aX2i23bpQMyApAmaqVWwbE56yxOmJNZgPqEMGUI93f14/0X\nvx+AeWyf/83nxXNBJ1OCiCIJuSHMVygkWdJXYVMnUTzKb0/8NhZkBL+OM+mMlv6WrupkQ/ivANwA\nYLlhGOsBvK/dDTDGNgH4EEwT9zEArzQM43uGYewwDON7AK6AaRIzAB9mjDksoAEA/D7MVLAB4EuG\nYbzFMIy7DcN43DCMLwF4JYBZmJ/3F+oGsm8lB9UZwt2J/uYvjknUEM6Vc4GKypkJ4c5BRgDAxPyE\nWJY7OenyBy4ShkxSRhV1NhoAm3lZPzfbTUHkSEJYl+JbzWQtKmeuTw5iJkpkhHmBUMNDB124rRd4\n6iYA5vLpbz/97cDbpObjbNU0AZIs2bSIhx/ZE8I/2/0z8bO4bocPAoOHlRaWy+VgYQjrdF7z82u+\nPI90t5x8CswQJngfanroJLq8brIwKYwR/wlh8/PTuZ2mE1jVTFBDWH7HcZn+tH2kxlRYheVoQri3\
nKz5kBCDbrEqt0mByUGyE6sG4kyFMC+9sGPFvCK8elKBgJ2xEVIaaXa2QEX5wEYB1UspuqAAmh3Nr\nHVS3Z0+wiTqqalW28f0emmdLQnjZNnQnu8WEPU9JB1EyCayo16zVKSHM+wiZVAYfufwj4vc0IdwO\nR3j5cpkUjQIZsXSpNdHOFXTZu1dDeHAQeOop4O67gT/9U/n7KIrK0QDAcHdzQ/gOstrtfRe3bUNg\noHtA3Ft3T+52fA3nCBeLwRno1BCuIB5kBGA1Ue3YiLDRRSoYwm7ICAC49bJbBWbuzh13ipUQcSaE\n6biErwLiCeHR3lFlxiktnBkHMoIyhOOY6OhkdawhbBjGXxmG8VPDMNrMEVr0AQB8BHarYRhF+qRh\nGHkAt9Z/TAH4UzjrQ/V/pwB8xP6kYRh7APwNTGP5TAA3BnjPIp0TDBlhmm49TM8Btx0ZoTIhrGvq\njBrC4/PjgqE1OdkeW1Z0XOsmeG+6F4lgcxChixZI4YbmnsbJclcZBpAryRFPnFXcvcqCjOgLnhA2\nB3yGuLYtn6kG2r4dwI6bxc937LgjcHE5OsidqZi3gtHeUeXnOz2f5spzuHffvQDMZVbvvoCQgNY8\nfNolhAGgmpK928DICA3So61EzZiTuZMiZeInIUy5ybrygwFre1XqCmgIkwnauL5j2j5SQ3g5madT\nyREu1QjXrivmhDD5LqMsLNcsIdyV7MLKgZW+t92qsBw1hMM01Ozix5xKQaykMAxDmEt+k1nrhtaJ\n4AQ1Xal40q5WU1dYjrZxXgxhe0KYMSZSwioSwoBECoyPAwet3lIsReXK1bIwjM5ecrZlAnHrkq3C\nLGrHEE4k5Dl84ECwehNuyuVkwtoN56AqIbx8udVcdtKGDcBrXmM1pnVKCI/NjeHfdv4bAHOF541b\n/A3tz1xkftjH5o5hvtTYiaCF5YJiIygGqWzUDeEYkRGAc2E5Lm0TwrPuhvDy/uW4absZfsmWsvjK\nDpNXq4shLBLCebPzRmtiBNVA9wBWDZgN8s4TO9GTkQ1V1AzhBUO4PentEoWv18NM9f7WMIzHnF5g\nGMYjAHbBNHPfYH++njI+u76d7xqG4TYc/iZ5HMgQNgIWlSsUDGG6ZZJ6GsIUBG5HRvhLCOuPjLAb\nwjwhXK22t7xEdFzrxoqupgqVxbzsbt8Qnp+HxUjSDZfgpJGeEYm2UJAQFmzOhLmmTLfP4LzzAExs\nBfZfCcBcTvTQoYcCbVO2gQamima7qJofDFg7zPftu08s0X3txtfilWtfKV+45iE84X0VaEvZDWGd\nDEO69FeVIWxfzaFrQrgn1SPuI5N5mRD2jYyo3590nawErAnhefg3hOkKJSC+7zjqhHBZE4YwYP0u\n7RzhKA1hwzBEQnj98PpAE3m0cj0vYkYVd0J4YMBEcgAm4oGjOvwmhJOJJLYuMSPAuyd3Wyqrc4Vh\nrNBARitD2DAMYVavHFgpJtL4taeCIQxYGbP2+28cCeFDs4fE92vHoKSTaZy71LzIdp3Y5Yj7cBM/\nh7NZKF2JxEV5rU4F5YBgy96npkzTHmieDrYrrJQhPTd6+yuiKBw1kuhqM7sh/I2nvoFyzXSi/uCC\nP/BtrHJDGAD2TDUOfFQWlhMJ4WQJlTpDOI7VhNREpTxewDxeXrAy7IRwj0/PkJvYqUTKutqzrg+9\n/ENgdYrp5x/5PGpGLdaicnRsOTJirnTkbY8qfjAX5whP5idhZCR3S9UqFSc5MYR1rI+ls05bQ5gx\ntgFm4TgAeKDFy/nzqxhj62zPvdLhdQ0yDGMMwIswjeXL23irDaplghnCuWIJSJpXT29SL9OIy8IQ\nDlhULpeDZYmqro2EmyEMtDfolugA85h1NVWoqHm5aEX7hvDsLLQwGdoRY0wmteqp/yCd/EIBoqAc\noF9C+Iwz6szBJ/9A/O5nL/3M/Q88iJ7rxXoCTzU/GLAmhOny3
OvOvA4vX/1y+cI1D+OTn1RXWG5+\nHiI9mkBKKx7WYJccRJQTCg3hDmAIAzIlfDIvE8LtppcMw4pQ0HnyjqZKT+THxLLDBWSEN5UNagjr\ngYwAgLF594TwCy+o3S8fBDNm3gvG58eRK5sXDWVp+hFNCOuEjOD9VdqHpcaSX0MYgDAWa0YNO0/s\nbHg+bkP42NwxTOZN5hl/r4CVPx90lRBgNYSffdb6XByGMOViO53XFy43OcIGDDwz9ozn9xB2YTna\n53YzhIOYWl5xEXZ1dZkJaSC8hHB3r3P77JYQninM4POPSD7sey+SRZPbFTWEnbARNCGszBDulm6+\nKn5sO2qWEO7qkse8a5f6IqDcO8lk5HnVrriJvXpwtWONnrMWn4XrNl0HADiSPYInjj2hTUJ4eFim\ngwErAk2FKEd4rkfel6JmCMfdz+o0nbaGMICt5HGrxVT0+bNtz/nZzhrGmG9XstI9DsBAoeAPMJ8l\nM9KZlJ7GGU0I58o5ZQnhNOvWtsDakj6ZbBzP+TeE7QnhTjBHqXm5fK35BY+NeU/cdaIhDJCkVu8E\nwKrBkRHd8uLQLSGcSADbtgHYc434HUcv+JUwH3slOYheR6rklPw4Z8k5eMOWN2AkMyKSWljxJA4c\nnce3vqVmv5QhnGL68IMBq6FWZrJ3q5IhrLNByg3hyfwkevtMU6NUsi7JbKVyub70twPaanuqlCON\nOhUZYSkqV2wsKgeoNoT1QRrR79KOjBgdlcXC7AW6goqmZRMJGz84QEE5QN+EMDdQaR9WlSG8bek2\n8fjZsWcbno/bEKbvib5XbkCVa2VReC2IqCFsb3/jMIT3TTU/ry9YITnCOhWW2yfftqeEsJupVSgA\n99zT+LxfQ5gxmRJWaSrRc7mrV7bPXgzh2+67DcfnzJV9N265MdCEVjuGcFBkhDBXe6QhHKQN8iuL\nITx7sOF5PjFZLgO7ndHKvsUnFfziIrLFLKYKpsNKWch2vX7z68Xju3bfFWtROXtCmPODAWBxRm1C\nmHKEp9PRGsKptGEpKrcg7zqdDeHV5HGrmsO0d7nG9pyf7TDb37UlI1kQxo+fNNZckSSwNDWEKdbB\nXlQuCEM4k9LXYKBL3WlROcCHIcyqQNpsfXVO2XFRI2TxKtmDp0vYmimbRUcawmKpUaIG9J4Ijozo\nkp8dTXDqou3bAcytAMZNA/Wxo48FWjYqOhh9ctlzGMiI4Z5hYaL0pfvw8as+jl+/+9fiPLt8TX3R\nR6IKrHoMt9+uJtVgtl2medaT0Os6poZakSlkCHcAMgKQ3LVStYSefhlbaifBZBaUqwAp023Qua22\np0r5/Wl62sQaeZU9IawFMqIUPkO4QhLCcTAbqeyrkezin4HK4wekQWTnBwPBE8KWonIOhvDwsNxv\nVIZwrSYntcMwhGnq1okjvHatNPfjMIQtBeWIIUyvPRXYCGoI2xUHQ5hOdDid1+cvP188djLy3RR2\nQpimCJe4dKNamVp79wIXXQRccw3wutdZn6OG8JYtaEt+V+E0E50sSPW0NoSn8uYH9OSxJ/HFx74I\nwFxx+pnXfibQ+2hlCC9fLq81ZQnhHtkGxZEQXt6/XCDz7AlhwDqZpRobwc8hvwXl6P3Fzg+meu2Z\nrxWP7YZw1Alhur+hIbP2BVeYCeHJRPiGsGHIPmgyXRa4noWEcHs6nQ1hGp9rZTHSnKK9C6RqO+0p\nAEd4vkyL9ug54B7pkdymyfyksqJymaSe/GDAHCTyGzNNYAFmYTmvEizZunQ2VbgsyIhl8gv2io2w\nJ4R1ThZS0aQW+sdO6YQwUOcIA8C+VwEwl7o+cKAVscddwnzsIwnhEAzhVCKF+3//ftxx/R146daX\n8D+v/p+Wz/dlq18mX7zqEezfD3z728H3a+Ju6gVAk3qd03RQX4AcGZ5uyAgASA3KznU7HOFOMsAt\nhciIIWwY7aFu7AzhuL5je
v1GgYwoVfVBRti/y4bn609PTVmNsKCyG8KtkpTtqDfdK65Jp6JyjMmE\n5aFD/lbXtatcThb/oqGGmaKadN62ZSQhPN5oLCYSwNb64pW9e9WYaW0lhG0F5bgof15FYblmhnDc\nyIgNI43nNf0snL43N4VtCNPvdsCl+9jTYy7pBxpNrQcfBC69VKJmHnzQahr7TQgDehjC04Vp1Iwa\n3v8f7xem021X3Yb1w+sDvY+NIzKO/dJkYwSYMckR3rcvGJJMJOi7400IJ1hCrOpwMoQpukh1Ybmg\nCWH6fpsZwuuH14u07K8P/xq1btlRitMQHhy0JYRDYggDwIQhF86HZQjT8E2ymxTv1RQPqqtOZ0OY\n9shbNa+0S2w/w8R2DMMIsp32FMAQnivLu2CfpgNQ+4AlODLCHHRnNC0ox8WTO4EZwh1mjlJkRP+o\n/ILbMoQ7xFihshQj6D8e3BAmCWHdGMJAPSEMAHtfJX53717/2AgnZEQYDGHAXAZ188U3Y8XAiobn\nLIbw6kcAAN/4RvB9Uv55r2bXMR3UF2oqGcLx4wS8iBrCyX45Y9d2QrhD2ur+rn6xcmdsbizY/ane\nVidYIjZ8ghtDeGhIGh4qDWG+jBGIHxnhNSEMyCJQQVWtSsPJKSHsZJy1K84RPjJ7RBg2VNwQLpXU\np5+d5GaeqkoIr+hfIcITTglhQCbtDAPY2YgZblt+DGEGJrFKsK5eoua4XzUzhAuF9jA+bvKTEGZg\njkvKB7sHxe+fG3/OM0c5bGQEHVs1+2759UtNpv37zVSw/V5AOeTcEO7qsprbXhQGMoIeb9KLIVyc\nxl2778IjR8w+3pbFW/DBl38w8PsYyYyIFUdOCWFAYiOqVSvao105JoR7ok8IA9JMnS5MN0wMhZkQ\n5udQ2IYwAFy78VoAZvjliSk51onbELYwhDNqE8LL+paJ6+Z4JfyEMO1zd/fLncQ98d5pOp0NYTps\n7WrxWnr7t5/SYjuMsSDbaU995kjFV0K4Int0gxqmCIHGAUtvL6nQHCAhrPOAG5DHPVOcweCI7IG2\njYzQYEluO6Jprd6R0wcZQSc+0BcsIVwswlpUTsNrexsPNB24CjDM208QjjA3H5ODBBkRAkO4lbYs\n3iIMpsTa3wAwcLgVQMiD5nOGaLt6NcPdUENtvjIr2ucgnT5qkHYne7TlvQPWTjTrlYZw2wnhDmqr\n+YoGmhAG/ExYyiJ6cRVKpJ81ZQgzJhOyqgxhwwAKFX2QEXQVRbOEMKDuM6D9NpEQbrG0vl3xxFm5\nVm5gIwPRc4TDNoQZYyIlfCR7RCxpp1LNEfZqCBcrRYFDOGvxWRaeY5QJYUBNSthPQnjV4CrXa51/\nb9lS1jEh6aSVK4FUynwcdkLYiyFM078//KHsj43IBZ7CEK5WJf/2zDOBZJu39jATwj09QMVwNoTp\natXpwjSePv60+PmjV3wUXclWw35v4tiIw7OHkS83dqJ4QhgIxhEW/ZOeeIvKAc0Ly23cKK8zlQnh\nalVey34NYV5QDmjOEAaAa8+8Vjz+xaG7xHkftyEcZkKYMSaS0ScqB8UYJixDmLbvmQHn63hBrXU6\nG8K0i9BqFEZH4nY7UtV2vKkK4CgA9jiAJ/DEE87/Hzt2zHUT+Yr+xpndEGZMdlDa7dzN5ytAygxv\n93XrnRCmZhbrl6nH9g3hzkidcdE0a9eAz4RwJxrCFmTE8baWXtvVgIzQMCE8OAhs2ACgMIzEsYsB\nAM9PPC+Kc7Qr3sFIDoaLjGilBEvgkpWXAABqfceAwcOYmGjxRx6UzecBZqaH+rv1uo4tRblKs+ip\n970CJ4Tr6VHdDHC7aEIYvbKBDpIQ1r3d4hNYk/lJjIzKdXrtnOv0mOM83gRLiDbSbkrxhOzEhLp0\nYY3pg4xIJ9Pi/HUyTsPgKNsHpIA0hId7hpUsW+YJYaB1YbkwDDW7aF+VLsGnhnBQM4ayeZ1
Lt+tQSwhvFYaKadPk1VtGM4QBwGaxyfqPiNekt3a+\nhbIyHqbRQkY0N/NN2cmTgXp/JavOoc1ijSgtjjBVejrhn1Nu9K9+BYwe52XJ9z5+cHLqM4RNrAVT\nOUyG7g4XFvIUCzVOgkFg1XbREDYnMgIgEyGnzQmATEajLUB7Q0LYarGylHAIITy/+XnN+4ZCIZYQ\nHpw72NQN1obmDWWfM93ZVJMsIWxSUyWaBhVyztfhenVDOJDRO5ARk1lFsoQ8kJRwZUtlRFNBlhDO\nJgtZp82JDHtG973QJHX+yPNx9rCzAZAxSUx3aGEjmju5IWy2hmOi+aHXEJYlhA1mnsQjLY7w/Pmk\n6c2dd/Iu0bnDyZfYYXOYeiJLy+wb3Y0RzGhR4rlg1M9Y3FRUS4hS/EFHB+9Wr5TY7T43F6zU3Uy4\nCC2JBs3735OOLj/7Gb/9ySeBvXtJaT4g5wcfaDqAzyoIL2Vw7mCcOVTRXdIAmjtwLhtv39rxVsT4\nNXUqWDPmtWvljw0GgX/9ixxnZABZ5QZGRggBiaq2xBLClc2VbN0xq3yWzMRIlYbmDcWNU24EQLiz\n/7P8f/Drr36NU18+VZYyjaV+Ai41VQnhb6t4TfaLF754Qnt2TCiawDbBV1SuYN9FAGjqasKx9mOq\njxs9GvjpT4HsbOC888gGzt69hC9c0cTjlWJDSQC4bvJ17PixtY/FbPQcSzRVDURvcBWPIQyQSiuA\nbE5RXXIJsGABIpXNz3szJYQB4Jdf/BLFjxTjd1//TvfjZbxvExvCQ/OG4qZpHCArNpY7ckTB7wdJ\njlNNmWJ8Q1ip8UXjWSXW6iOrsbNuB0sJHzoEuN2Rj1m7lm9Yz5lDOOdURsRFUInj2jeVFf7Z+AAA\nIABJREFU36j6O0VFxOTfuRP4/e9Jryi60WHmebQR1GcIm1hjCgVDONzUwGbju+HUEL7vPuD+x4TV\ni4mREQ6bg3E5q9uqo5apiglhsxrCAHDDlBtYSuKFzS9o8qNq2mvQ7iWl5mbmBwPECKfYiIrGCk1e\ndG9PCBe7eLTlaDOPtoiGcKeFTP4z7BmmKklWIiNEczTTSwxhf9AfkWCqrQVg7wTySTpnXNE4SAbv\nkKks+6KiCWGAG8KhENDqFhqOnaDUUaqUTELYYXNETQwZXVqGMEBK3Z54gqSUurx+tFrJ+Tsif4Sh\nG9rEkl5shJgCN2rjD9GkvOnDm2RJNkDOw9ViyCoTwr3JEJ43aB67xny872P4g36cfjpvtrZiBSnh\npBIN4ec3Pc8WbjdPvdmQDEuHzYE7TroDAOAL+iKaWTmdwMTwenrXLlLZQbVnD0kmAuRaVt1xCAC5\nLtNmOUaRyBBWa6CoR2JjZ4py6w79dt5vGXaJqs3bhifXPwkACAQDWLJ7Cb448AW6/F1qT9EtyIh1\nVSQinmZNkzX9OhGyWqysyXZ9Z30EX3l7rXZV3VNPkTL6jz4CLryQb3CI/UcuHXspzhp6FgDggpEX\n4IULX2BNFw81H8K7u95N6vXLkBFZ2oZweXY5+9tuOrZJFW9y8sn8eHv4bYvJ7tGjgeuvj3xun9Nc\nTcdnlM1Afxc3rus76/H7b34fFaknysyG8OjC0Wxe+bOTfoax/caywJDYWC4UikwJi4bw1Kl8XmaV\nrIadh4iSJAm3TuMbNM9tfI6FCkIhsqGjlNhEb/ZsOT/YyAlhcVyr66zDnnp1JkZGBpmLSRLHRQCk\nj0efEpd5VyF9Yg3WAJ4QBjg2oraWDBjvvw/ZbqhZm8pRiSUUn+7TbrRGJw9WyWoqtqpSpa5SXDT6\nIgBkZ/3jfR+r3q+38IOpZvbn9ZkbjqoA+8AN4X4Z/bolodLTEheSLf7jbDdYNISbfGT2W5pVanhj\nVFRaGmnyQCUawrY2jg1QcoRrawH02wVIxFwYXzQeRpc4Zokd7AcPBoaFkclr1pDkYXs7EMzkk3fK\nNTWLCjM
KWWJbiyEsKhgKsqZyQ3KHmOocViqaISzqcOtB+IJkY8/slQ1iZ+xobEO6OZDryDUk8xsA\nFk1YxDZvjnccxzVLrpGV5oqG8E4N71s0hHNyQr3KEE63pbOxrNHdiDVH1kCSgDvu4Pd57jl+TA1h\nb8CLF7e8CICUxF4/RcWhMYjumXMPOz9f/e7ViEXpLNLnF6GQHPOzahU/nnNykOG9jDim5Thy2Pph\nffV6NHfF0WktLHFjU7y+pVplrjLs+OkOfLzoY/znR/9h4QjKs73pw5twyX8uwVmLz0Len/Nw/pvn\nR6CmEjWEtZARje5GNt+eUjIF6bYTv6ASeZtK6TUJRcnWDwUj8N7l72HDTRvw3uXvwSJZcO+ce9nt\nf1v3t7ifX5ReZAQAXDiSp4Q/2huJcSkp4fNImgxWfm4XXgiGFQCIkdQmCQlhl/ETwk67E1tu2YKn\nz30acwbMYb9fWblS1+PNbAinWdOw9Zat2HzzZtw5807YLDZMKSXIoQNNBzB8Am/Qu1Lx56CGsNUK\njBsXYvOyQbmDTFNFe/Wkq5kB/tq21zB0FA+MqHGERb79rFnyKttJJcY1hIHoHGE1iYZwX0I4OfUZ\nwiZWv8x+bOdcnLQWFZH/ezwksXDwIOSGsIkTwoDCEK6IbQiXukoNmUaJR9dOupYdq5WyAvId/hEF\nvcsQpmkMUa2eVjaxNLupoiVZqWnRDhw+TBai1BDuP9iNZg9ZzJltkgfwlLDTSXayqQL1vLGciBMI\nhcLIiCK+4BF500bVqIJRbPf6m8pvZI1ZqBHu9RJToaEBQAmPNYgNFs0gSZIYEqCyuTIm666mvQae\ngAeAufnBgH5DuDdVNsgSwsfVXVJ/0M8qdoyKiwAIC/WVi15hG8ifH/gcT6x9gt0uJl71JIQzctwI\nhAg/oTcYwoC82oFy/m68kTRwUooa6Et2L2FM7UvGXBI1FdjTynPmMfMrGArif77+H9ntIkdYxEaI\nhvDYk2rZmGY0XAQVTfUGQgF8eSB6g2alfAEfm4cWZhRiWtm0lL8+USVZJTh3xLm4dNylWDRhEQBi\nyl7yn0vw8taX2f26/F34eN/HuHvZ3bLHKxnCTzxBSqmfeUbezFUprYSw2ExvVvmsxN5UkhJ5mwAg\ngW86ROu7oSW6fshJz0G/jH7IsGdgetl0Vr1y+pDTWVXquup1aOlq0Xwu5fNuOrpJ9ju9yAggNjbC\nYgH6h/1cLUPY4SCNbNm/WQwc6zAfUrBfZj/cNuM2PHY2b7AookuiycyGMEA2saaUTmGba9NLOUKj\naDI/vz7/nD+mq4tfp8eOBdqCdWjztgEwBy6CKteRiyvGk0YUrZ5WHC/+N7tNaQiHQtwQLiggoROa\nEHbanIYPi8XiCCvVZwinTn2GsMk1ph+5QFe3VbMkithYbt26MM8tzNtEwA509jN1QnhE/gi2qFx1\neJXqxMTj9zCeoVgeZ1adOuhUNjHT2jXb18B3+M2OjACAmeV85bW2em3E7TJ+sMlNFS3N6D+D/1C+\nDocOkQUNNR0GjeMzXzPxg6nOIlWJ+OEPCdeOcoW7qrkhLKZ9WlqIcYoivuAxgyEsSRLbyPIGvFh+\naDm7Teyau2NH2BAuDRvCIcnQJV5aoiaIJ+CJyXrsLfxgIJwGDC/MoxrCwtglVvqYUaIhvKNOPZl2\ntO0o404a1SCjKsoswuuXvM4+x4dWPsQazIkdvvUYwvYsXsrdWwzhuQP5gEXNCLudNHsRy7cdDtKw\nKhQK4R/r/8F+f9v0207Ya01Ud8y8A0WZJFnx9q638aP//IjNr7Qay1FD2OEAXAOMyw+mklWtxGjQ\nrNSaqjXMWDl72NknFHlzz+x72PGy/cvY8QUjL2D850/2fYK6Do7YEhnC330H3H03qcj56U8JV5Sy\nZ5WihrDdLk+YigEFMbhwIjWlZAoy7Rwn9cOxP2RjVrwJYY/fwxLtIwpGq
CbaJUnCGUMI4yoYCuLb\nI7GNyOWHlmPCMxMw/fnpeGvHW+z3YkKYfs+0NLlkMjNsvzz4pep6jxrCjY2EqaqW7BaxEeXlhD3K\n7mOyufOU0ilw2BwAgNWHV+t6jNkNYaVEpnJLxkYUhKkyX37JN3l27OA8+wh+cJ55DGFAzvVe0fEM\nEMYvffed/H6VlbwKYsYMoMPXztZQ44vGGz4cJ45rKypXRO0TBZA+M1R9hnBy6jOETS6RI0xTwqIh\nzMonaEK4tRwIWUxtCEuShAtGktbWvqBPxjKjqm4zFx8qllzpLsbS2nF8Bxo6GyLu09uQEcPyhrEE\n/LqqdREXht6UstNSriMXxdbweyvZiopDXTJcRMlwwRA2IRbljTeAzZuBl8MhH5rCaT6kjoxgDeWK\nuSFsBmQEoM0RHiRgr6qqgOP1PqCYlHjlYwRc6ebhQlOJJki0ppAAGC4CMH9CON2Wzial/1cSwnnO\nPGbkr6tST46Jn7FRDTJRpw85nSURG9wNeHkLGaDsDg9Kpm4ELH7s2sUbt4gSDWFrBjeEzfg9VtPw\n/OHol0Ectm+PfMsqADIyCJd0ahinetFFpEx32f5lWH2EmBZjCsdg3qB5qs9rJGWlZeGB+Q+wn9/d\n/S7GPj0Wb2x7AyNGcHNw3TpyDlRXhyvxAJx0ErCvmSfljXq+nzroVGYqLd0fvUGzUjJcRDfyg9U0\noXhCBKLiJ5N+gg+u+AA/nf5TAKQi4d87eIpOTAh/8YX8e7tjB6nS2cenz0zUEC4uJklUqnXVgiFc\n3jOGsN1qx8kD+Q7MNROvYanHnXU7EQgGdD/X/qb9jO8dLUwipvdilXPXd9bjyv+9Et6AFwBphObx\nk9Q8TQgXOAtgt9o1nwMg672LR18MgGymv7H9jYj79BeID0ePyg3hsrD3OWUKcPnlhJl88828grQ4\ns9g06ACqNGsaZpSRsMjB5oM41qbeRFDU0fbeZQiLYZl1R9eySrvWVp6QjdZQzmxVtNPLpjMPYGfT\nJjjP+jMA4JNPOLsekOMiZswgPHH63TZDuEQc1462HcWWmi1R79+XEE6d+gzhFEuSpIGSJD0qSdJu\nSZLaJUlqkCRpvSRJ90iS5Ez1vyfjCIcby4mG8KpVIA2YMsKMnVZijpoZGQHELiMSmw/0BkMYgGwh\nterwqojbaclXmjWtVwyMkiThpP4nASCLcjFJCCgSwr0UGQEA4/PCCw6rD+sPb8XWrfy2vEF8ImjG\nSZ7VSiZqtLEJNYTbjgyEVSI72eLnThtlUmREgbPANE3ITh9yOuMfihzhcmF4qq4GttfsAWxk4TTQ\nPuWEvsZUSdyovPTtS1XZf1QiZ3hInrkTwgAvRWzqakKju1H1PjJDuBeMXeePPB+A9gat2FzQqAaZ\nUiIz87G1j6GmvQZTnpuCmgtnAAt/htZWYj4oJRrCFqeQEE7rHQlhSZIYw7K5q1mGK8vNJSbp2rXA\nK6+QdPCvv/o1u/3++fcbjqerpVum3YLnL3ielbT7g37cufROeINdOIlMS1BbSxJZq4WQ3ty5wL92\n/Iv/LCSqjSSn3ckaNFe1VkXlf4sKhUKy8fzsYWd3x8uLKvG7ObJgJJ48lzSYu3rS1ez3i7ctZsf5\n+erPQxP/gYD8MwRIuTmdb4iGYygUYsiIwozCHq1quX3G7Ui3pmNm/5lYMHwBw0t1+bsi5svRJFYX\nRguTnDLwFHYcrZw7FArh2veulaVSK1sq8cLmFxAKhVhCWO/c7capN7LjZzc+G7F5IX4+1dVyQ5jO\nKSUJ+Ne/gLY24Lob/Ow1mHV9ePIAvhmgJ61NPwu7xR7RpNGMGlkwkm1Mfn3wa5x2ppfdRrERSkN4\nbRWvNDVbaEqSJPzmlN+wn90n/wqY+gI8HmAxH+pkXPsZMxQN5QzOD6aiG0AA8OiaR6PelxrCTpvT\ncM1bzaY+QziFkiTpAgDbANwFYCQAJ
4BcANMA/AXAFkmShmk/Q/yKlRDesAER/GAApk4IAyTdkJOe\nA4B0u/YFfLLbj7QcYce9ARkBkPdMpdydD4aCbPdzWN4ww5eF6JWMI1wt5wj3ppRdNJ08iDPqdjSv\nk+0A55bzCbcZE8JKMU5f0I7+WSQ6u79pP1sAVFYCcDYALjLjH1803jQGgyvdxcyBisYK9n0tE3z8\nqipgRz2fxY5wmdMQvnbytZhcMhkA6Qh/4b8uxIubX1S9r7hwNXtCGNDHEaabWSVZJb0CJRBrg1Y0\nhM2CBZlUMglnDSVMmwNNBzD52cm8ee+0fwK5B1Uby4mGMNJ7HzICkJsRypJlm41gFdLTgSV7lmDT\nMcJ3nFwyGT8a+6MT+jqTkSRJuHHqjai4owLnjjgXANmYfmfXOxHYCJEfPHLGYXx96GsAxHToKaSA\nHonpXr3YiJWHVzJG7YyyGT3Cgz5t8Gl46PSHcNHoi/DhFR8yVMT4ovEsRbfh6AYWkrHZ5MgHABgz\nhrCEqZTf5cpKniQeIgxZ+5v2o8FNInkz+8/s0fnHBaMuQOMvG7HmhjVIs6ZhfD9eLRUPR1isLoyW\nEC7OKmZz7Q1HN8h6IYj627q/sebXYgPRP6z8A2o7atHl72LPp0cTiyeyTajtx7djzf9n77zDpCjS\nP/6p3SXnqESRKKKoIKICignB8PMUs2c4PeXMOZ05e+p5xlPx9Dz1jKfnmc+cUDlzwgiCoKui5Mzu\n1u+P6p7pHTYzsz099f08Tz/b013TW9+pDm+9/dZbc96qtD/TIRy+qOvQwaVwidK8Ofy05KdUbvce\nbfN/QrmqiE4sVx+HcPc23RNjM9dEkSli5/47A87G7DA0fU48F2SSiTqENx5azqOfPwpA85Lma+Tg\nTgJ7Dt6TK3a4Ir1ht0nQ/xluvz19r8qMEP7ox4hDOAERwgCHbHJIyrn74KcPptLZZGKtTTmEe7fr\nXRDndZzIIZwljDGbAQ8AbYDFwB+BrYEdgNtxCV8GAE8aY1pVd5z6EuYQBlKdlahDeNUqKjuEFzrn\naNIdwk2KmzBhgBuCvWDFgtSQxJBCjBAe03tMKkdY5tv52QtnpyYxKYT8wSHR4XiZE8uFDuGSopKC\ncCRVx7gh6d/gu4q3Uw/8Jk3AtE12DuFMohO3rNvUvTtbtHJRqgP26ackbkK5KNEOeGicNmuWngh0\nzhz4esn7qTIbdx7WqPXLFq2atuKN372RcgBZLKc8d0rV6QSiEcIJcRbWRDTyJHMyHYCFKxamhswW\nyousbdbbJuXwrOoF7cyFM1PrSYkQhsqRiNGJkCiqgK3+UmUe4dAhbAyUFRemQ7iSM6KaSY3KK8o5\n96V0RNOl213aqLlms0Xrpq05e/TZqc+3vnsrW0bmEYs6hI2B6a3SQ9oPHnpwXndSK+URnl43h/DV\nb16dWj9h5AlZr1NdMMZw9piz+fd+/17D3j14aNVRwh26z4ctr4MdzoYdzmbdfS6nW79fUvszHcLf\nph9L9I2Yl/mQPzhKyyYtU+dYdALa+uQRrjQhdS2Rk+FIxbKKskoRlyHvl77PGc+fkfp8/8T7mTh4\nIuByBx/06EGpfbVNKBflD8PTOVRvfffWSvuqixDuVo1JXCmlYJtk9g+j9+DM/m8mpYtL+WWZO9eT\n6gCviqg9/cGiZxkUmFNvvw3z56fz6/brBx8vfD31DJ/Qf0LqJVLSOHPUmZyyZTBpZlEFjD+ZaZ9X\n8OabUFEB7wUmZ8+erj8VjRAeus7QGGpcf1o2acmxI44F3KSn0Yl9o8xfMZ+lq91LqUIYFR03ybPO\n8pfrcBHBZcBO1to/WWunWmtfsdb+ATgDMLjI4VOz9U97t+tNixKXiaIqhzCQnlAOCiZlBMD/Daw+\nKimc0RygV7vCiBDu0KJDyuD74McPKjlXCi1/cEiYMgJchPA9H93D6DtHc/7L56eGufXt0LfWPGRJ\nZ
oveG0OZC3OY12JqalbZTTeFucuTnTIik6hDuAPpXlgYRfrJJ1TKHxztACWBaITc3R/dnYp8DtNG\n/PADfLc6HdYwvHsyI4TBOYUf2vuhVC7WRSsXMfm9yWuUC9u2c8vOBZFndbv1t0utP/n1mqkyCnFk\nQ9Pipqkc2VW9oE1aDuGQHfvumIp0B+jQvAPNi1u6D5vdwftfrJnLP3QIt20Lcxan89sVkkN4ePfh\nqbyb1U1q9NBnD6Vs0q17bZ2Ksk0io3qNYkiXIYBzvrTum34G3XZb2vGw0caWh7+6O7Xvt0N/26j1\nrC8DOw1MXY+vzXqNa9+6tsbUEdPmTkuli+jVthf7DdmvMapZLw7Y6IBUuqk7PriDG6feyNVTrua7\n3/SD8SfDmCthzJW8XHQOx72+F23buWfwpxn+06hDOBohnA/5g6sj+oK8oRHCteVWjY5UzAxMWbxy\nMfv/a39WV7gXgqdtdRrj+4/n4u0uTr0Meunbl1Ll6+MQ3nvDvenYwuX+eOizhyrNoxJ1CE+b5tJ9\nQPUO4WjAUFIdpJ1adkqljHy/9H2Wr15ebdlHPn8ktR5ODFgIjOs3LhUk9ez0Z1OTVJeXu/vy8uAn\n2Wwzd86E7LPhPo1d1axhjOHqcVen07d0/hIGPsHtt8NXX7kcygCbb+7srjAH73rt1qNd83Yx1br+\nHDvi2FSO+9vfv535y+evUSYaOSyH8Nojh3AWMMaMAMbgooD/Zq39XxXFrgU+xzmFTzTGZGVMf5Ep\nSuUgnD5vOqvKV1XhEC68lBHgohvCnJz/+fI/vDX7LW5/73ZmL5xdkBHCkH47nznLb/QNfyFFCHds\n0THl4J76/VQOeewQpsyewiWvXcLyMve0LxSnSnU0KW5C60XDAaho9y20dLNnjxhReebgQkgZETXg\nW65MZ9e5+Z2bmbd83hoRwkmZUC6kX8d+qciOz+Z+xoc/uoTQYYemvKKCn0yQJHphT/p1S3ZOLGMM\n521zXurz9VOvT00yA25283C270KIDgY3S3KPNq5BX5zxIktWLam0v1Bzn9eUNiJMGdGpRadEOf2N\nMVy2/WUUmSLaNmvLUwc+xSEbB9PVN10WzPZdmdAh3K7jSq5969rU9ugkOEmneUlzhndzz6Sv533N\n3KVzK+231nLVm1elPl+y3SV5HSlbG8aYSrO8Pzj9tlQe4WXLXGQWwMCx76ZSt22z3jZ5nxPdGJOK\nsltVvopTnzuVIX8dwuWvX15l+T+/mc7neNKWJ+Xli/h1Wq+Tinz+eenPnPDsCZzxwhmUNVnTofD6\nd6/TdSfnwJ89O+1MAZgRScEbRghbays5QaMBC/lAv479aFbsOncNiRDu0rJLpRQPVZGZum7mgpnc\nMPUGrppyFfv9a7+Uc3lE9xFctsNlAGzYZcPUhH8hBlPpmVEbLZq04LBNDgNgZflKhk8eztFPHs3U\nOVMrOYSjQ+arcwhHUwomuX+4dU9nS66uWM27P7xbbblCcYZm0qVVF4Z3d8+hD3/8kM23SwfInJ0e\n1MEmm6bTRTQrbpaa8yCpFJkizhx1ZnrDqKt58EF4KN3MDNt8NQc8ckAqPUvSNHdp1YXfbfo7AJau\nXrrGqACoPKHceu3WW2O/qB9yCGeH30TW76qqgHWhYGHoQHtgu6rKNYQwj3C5Leebed+khh+nqOQQ\ndtGyhRAh3KFFh5RxMmP+DLa+c2uOevIoRt05KmWUF5mixEw6VReqyyNcaVKIhM2eWhu1RWEUukMY\noFtF5Dfo6SJUNhj+c6pz0r55+1oN+SQQjRBuuTgd7XL3R3fT9/p+fNdlMnRNR74kzSEMbjbwkHBI\na2piuQ4zKC8JeqU/bkan5M/9wQadN2CPQXsAbqjm/Z+kJ1yatXBWagbkQkn7Yky6o7uyfCXPTX+u\n0v5CjBAGNwwzjMx7/MvHU9HvZRVlqRe0SYoODtllwC58e+K3TD9
hOlv12oqztjkFKpzpPLvbjSxb\nlY7MstYNVQUo3+RvzFroIljG9x+fd86jtaWmSY1e/PbF1MuuzbtvznZ9smbuxsbBQw+mZRMXHX73\nR3fzwKNLOP74ysEVi/vfUal8Ejh5q5PXGFV23svnpdovpHRxKfd+ci8A7Zq148hhRzZaHevLRWMv\nWnPuEGvgw0PhHy+wS/ltqc1zBp8OLdzkn9EUMFVFCF8/9frUEOyh6wzNO5urpKiEDbtsCDgnb+ni\n0lq+Ae/+8G4qsGBI1yG1lu/Vrlfq5e2U2VMYdNMgTnz2RM584Uye+eYZANo0bcP9E+9PjSIAuG78\ndbx75Lu8fOjLvHzoy8w6aVal0TR1YdLmk1KRxrMWzuLW925l27u2ZWXLtPc+2obVOYSjEbPRnP9J\nY1TvSC73atJGlC4uTU1CPqjToETazDURTRuxvPtztGy5ZpmmAyLpIgZMSNRL6eqYMGBCatQKvaew\novNbXHBBev9n65yfGs3Qt0NfLt+h6pd8+cwpW52SigC/44M71phMMuoQVoTw2iOHcHYIpxFeCqyZ\nNDBNdCawUdWWqifhsBFwQ7qaNs2YQKH9zPR6AUUIQ+W0ESGzF81Odbq7t+meiiIuBKIO4XDiEoAv\nfk3P9F1IKSMAtuyRTtjXpKgJfx73Zw7Y6IDUtjCXdCEzqHXEIdzDPeSntb4plTf68E0PT3QEVkjU\nIdzqx3FcuO2FqWFDC1cugN0nQW/nfOjdrncih2HvO2TfVEfpvk/uo6yiLO0QXjcyC0bpMNrnV3+z\nwURzsV7z1jUpwy7qSCqUCGGoOVq2kkO4gCKEoy9op8+fzv++dwOlZi+cnZrAJ4kOYXD3mnCSk/U7\nrM+684Lc2K1+5pIX/pIqt2JFMG9Dk6X8PPiS1PZLt7u0UevbGNQ0qVE0z+zpW59eEM+mds3bceBG\nLv3N4lWL+fNHZ3HDDc5xeOFF5Wx72ek8N885GpsVN0tMJN7ATgP58rgvmXbMNCYNnwS4EWhHP3U0\nFbYiVe78l89Pje44evOj89qpMrz7cGadNIuP/vARV+14FeeMOYcz2n4Aj91FyewduOaAo9h3yL4A\nrCieCzv8EaicNmLGDKDlL5iNH2RJi89574f3KuXG/dOOf2pMSXUmTKNVYSsYcOMALnrlIlaWray2\nfDTy7qCND6q2XJRwQq6yirJKI35Cbt/9dvp1rDx/enFRMcO7D2dsn7GM7TO2Qan8BnYayP0T72ds\nn7E0KXLR6SvLV3LFWxfSORhMVZE+Zat0CE+dM5XXv3sdcH3nLXtuuWahhBC9B1/z5jWpQKgoj3z+\nSOql+75D9i2Ie3GUaB70V+Y8wyOPwF57uYlN+/SBgw+Gma0eTpVJyn25NopMEadtfVp6w6j0M5c+\nL/PQ9+7+VFJUwgMTH0hkX6l/x/6pl0bT50+vlLPcWlspJdt67RUhvLbIIZwdBuPSRXxjbcSCWpPo\n3XpwtaXqSTRR+B0fuAiFVNqIvi9A//8C0MK0g6UufLhQHMK/2+x3jOo1inVbr8veG+6dch6FJHk4\nUFV0bdU19Vbwf9//j6e/fpqPfvyI56c/D7gch4WQSzbKQUMPYli3YQzuPJiXDn2JU7Y6hfsm3sc3\nx3/Dp0d/yvbrbx93FXPOiHUjRmvPt2nVYSkPzbwZcA/8k7Y8KaaaZZeoQ/inH4u4YOwFfH381xy2\n6WFrlE3ahHIhHVp0YPeBuwNuoqrnpz+fdgh3SzuEWy3ajJICeZc1qvcotuq5FeCGsm5z1zac//L5\n/P7x36fKRHO1Jp3t+myXmrTkya+epLyinCWrlnDxqxfz1Fdu9vUmRU0S6yCtjj032DO9/uCezFww\ns1IntVCc/js1OxcqXDT0te9dksqRHKaLYMvrWN3MRSRNHDwxNay1kIg6I2559xaueP0Klqxawgel\nH6Si4tdvvz57Dd4rripmndO
2Pi1lY978zs089sVjrGoxi7f77carq69JlTt79NmJytdojGFwl8Hc\nMOGGVIDJ23Pe5o73XX/izdlv8rcP/ga4XNhJsDeMMQxdZyinjzqdS7e/lIuP3YSRJmtOAAAgAElE\nQVTJk+G//4XBg+HacdemJ5YaPhl6TK00sdz0ubNh0jDsxP0ZetuGjPzbyDVy4+Yjk4ZPok1T56xf\nunopF756IUc+UXU094IVC7j/Uzdip22ztuy/0f51+h87rr9jar15SXPOHHUmj+77KI/u+ygfTvqQ\n/TbKXW7pfYfsy8uHvkzpqaV0auGGUN378b102iBsvHQUYVUO4ejLqtO2Oi2RE12GDOo0iJ377QzA\nr8t/Zdw94yqlSwR4eFrhOUOjjOw5knbN3L32uenPscNOq3nkETex3LffwvnXfcMDwTnerLhZyvYu\nBA7c+MB0f3+Dx1ywUPFKmuz1h9RLgCt2uCLR6aqiIyrv/iidn//Rzx9N2Rk92/ZM9IudfCG5d8I8\nwRjTDAgTPc6pqay1dgEuihggazOdTeg/IZU/5dlvnuW1Wa85h3Crn2DPg8G4G8PJw8+lZQvX5COS\ne3+oRNtmbXnj8DcoPbWUh/d5mHPGnFNp/xrDxgqAs0aflVo/7unj+MNTf0hFYJ2y1SkF9wa4ffP2\nvHfUe3x2zGeM7j06tb1fx351GuJWCGzWtxcsDryl/V6g+LBxzFvuhjnuv9H+BTNxYpcubqZ2gB9/\ndH97tu3J3/f4OzsuugfK0m+ykjz07ZBN0kbOxa9djOn0DbT/FgY+kdreaXVyJ5Srigu2TY9ne+O7\nN7jktUtS963dB+5eacK9pNOspFmljtqpz53KgBsHcMErF6Ryn4/qPaqgRq8AHDHsiFRqhNIlpYy6\ncxR7Pph2EheKA3ynTTaGqScAsKpiBcc/czzWWubPt7DNJbDDuYCL4rl4u4vjrGrOWKf1Oozo7gzJ\nxasW88eX/kibK9owbPKwVJlTtjqloM7xQZ0H8Zed0xHhB//7YAbeNJBnv3kWgGJTzF93+SsXjL2g\nukPkNU2Lm3LzLjenPp/xwhnc98l9HP3U0altl253Keu0rvtkYPlCs2Zw5JGwfRA/0KNtDy4eG1yb\nxsJuR/PJp+55NP2HX1m0xzhol841Gz6rorlx85Gte23N18d/zTGbH5O69u75+B5enPHiGmXv/fhe\nlq1eBrgUJykHeS3st9F+/HH0Hzl969P5+vivuXLHK9lz8J7sOXhPNll3k+yJqYFOLTul+kIWy7zN\nz4RtL4LTu8Lho6H9TLpnxMZ8M++bVC7ZdVuvm/eTPtaGMYYH934w9TJ99qLZ7Hj3jqkXlKWLS3l9\nlouGLsR0EeACYnbq52aTm79iPrvct0uqb1S6uJSd792Z+StcHqeJG07M65EN9aVpcVNO3epU98FY\nSvbfH7Y/j9VtXU7wrXttzSlbnRJjDdeevQbvlUrV9OBnD7KybCWLVy7mxGdPTJW5fvz1awQDivoj\nh/DaE727LKm2VJrQIVy3J28daFbSjAvHXpj6fM5L59Cx2yLY62Bo47wqO6y3M5fscgoffghTpsCe\ne1ZzsIRz+tanV0qZUGgRwuCGdY3tMxaAbxd8mxpGMbDTwEpDswuNQnN014c+fQy8nY7KWdQuPUT3\ntK1Oq+oriaRJE1JD/2bPdjk5Q1a/91v4+2vw6wC6tlyHwzc7PJ5KZoHx/cfTpWUXwEViHf7eYDhu\nA1gnGLO6pCtdmxeGkz9k5/478+QBT66RN/esUWfx7/3+TXFRVuZZzRuiaSOun3o9Py5xz+KSohKO\nHXEsj+z7SHVfTSwtm7TkqQOfSrXxD4t/SKW16dqqa+ImNqmObbcFXrkQFjmPw1NfP8VeD+3FQS+M\nhe3PT5U7aeRJqZyehciTBz7JUcOOqjLKrmOLjqlJYQqJScMnMXHwRACWrFqSGjLfpWUXnj/4e
Y4e\ncXRNX897tl9/+1TqgAUrFnDQowfx8U8fA27CzKTri3L8yOPTIyy7fcA73ML3i75nj4d3gS5uZEOb\nsvUZ128czUua06d9nzVy4+Yj67Reh5t3vZlbd02ngzj26WMrpY6w1lZKFxGmC6kLJUUlXLbDZVy1\n01Wx9rGOHXFsajLlue2fgu0uhFa/QO8pcOQIZhelszRW2Aoue/2yVOTkCVucQLOS5A+Vbde8Hc8e\n9Cz9OrgUHV/++iUjbh/Bbe/exm8e/E1K7z4b7lOwfaiTRp6USiHywowX2Hzy5vz20d+y7V3bMmO+\nyy+9UdeNuGnCTXFWMyecMPKE1GidsjYzU6kjik0xt+x6S6Ij4AHaNGuTGnk2f8V8Hv/ycc564Sy+\nX+wmo57Qf0KlkWmi4ZjMJM2ifhhjegLf4cap3GOtPayW8rNw0cHfWGsH1uP/zAF69OjRgzlz1gxE\nLqsoY+NbNk4Nz2xa0ZZVRW5yoqKl61J6wUd0bZU521xh8sKMFxh3zzgslvsn3l/nYVBJ4vO5nzP0\n1qGUVZSltj1/8PPs2HfHGr4lksqiRdCunYWRN8LOJ0ORy0wzrt84/vvb/8Zcu+yyww7w0ktufepU\n2GIL5xju3BnmzYNu3S3fzS5PfOTZizNe5MBHD+TnpT9X3rG0MzwxmfF99uSZZ+KpWy4pqyjjjvfv\n4JlvnuHgoQczccOJcVcpJ/yy7BfWuWadSnk4Jw6eyOU7XM7ATnV+9CeS7xZ+x6g7RzFn0RyaFTfj\nxJEnctbos+jQokPtX04I/frBjBYPwT5VDI+2hh3NFTx3/hkF2wmPMm3uNP405U+pzneLkhacvvXp\nqcitQmP+8vkMmzyMmQtm0rykOSeNPIkzR5+Zd5OMNZT5y+ez/yP7V5oQ02B464i3ap3kN2lM+W4K\no/8ejDxb1YoWLStSozhYvC6ndniTa85ZnwpbgcEk6nqusBWMvnM0b815C3AjcX5d/iuf/PQJFbaC\npatdfNKoXqN44/A34qxqg7nlnVs45uljqtxXUlTCbgN3Y1SvUdz3yX188KNLydWqSStmnzy7oJ5H\n387/lvH/HM9Xv361xr5mxc346A8fFdScBZm8Put1Jj40kbnL5q6xb7126zHl8Cn0aNsjhprlnlkL\nZrHpbZuyYMWC1LaTtzyZa3e+NsZaZY/npj/Hzve6EXfFpjg1WqN5SXM+PfrTNfKV15eePXvy/fff\nA3xvrS28KMI6IofwWmKM6Qz8jHMIP2itPbCW8j8CXYBPrbV1HltTm0MY4F/T/sU+D2fkCFrdnI0+\nfopPHi/8PKtRpnw3hV+X/8ruA3dPlAFXH85+4WyunHIl4NIG3D/x/phrJHJJx47BDPZ9n6fTUQex\nvHwprx72Kpt33zzuqmWVO++EI45w68ccAzffDKWlpIb/jRvn8gAWAotXLuaaN6/hmreuYdkyC2+d\nDFPOhJVtOegguPfeuGso1oZJT0xi8vuT2brX1ly909WV8q4WOnOXzuWZb55huz7bFUxKmyhHHAF3\n3mmdQ3hIOk8jyzrBY3dx3TG7ceKJ1X9fJJtflv3Cs988y9g+YwtyJJq1luemP8cZL5zBxz99zAXb\nXlBpJGIhMfisI/iixZ2VNy7tDHe/yH1/GcoBB1T9vSTw0Y8fMWzysEovJjO5Z897Eps+YVX5Ksbf\nO95Nsv3Z3jDlDNj+XOj/XLXfuXqnqytPyFUgLFixgAMeOSCVwgZgSJch3DjhxtTkXIXMrAWz2Pdf\n+6YmtAU3sfzLh75c8C/hH5n2CHs/7NKudW/TnS+O/aJg0mOUV5TT6y+9KF1SWmn7dTtfx4lbrr2R\nJYewQw7htSTIIbwc5xB+ylr7f7WUXwy0BN621o6qx/+ZA/To0qULzz77bJVlKmwFhz12GJ/9/JlL\nZDFzf3jpMn73m77ceWeVXxEJZvnq5Zz47IksWrmIm3e5m
U4tO8VdJZFDxo2D55+H/v3hk2krsVTQ\nokmLuKuVdRYtcpPLLV8OHTo4Z/Crr8LO7gUxp54K11xT8zGSxrLVyxgxAqZ91DK17fjj4YYbYqyU\nyArzl88vqEgk4bjnHjjkEADLiefP4fgTV/HPf8IFJ/eC8qbcdRccemjMlRQiCyxdtZRWTVvFXY2c\ncdVNv3Dmd4Oh1S8U04QhS4/l45vPgWWdefttGJnwoOhT/nsKf3k7nfu6T/s+tGri2nOb9bbhxgk3\nJjplU1lFGU8/U8EeuwWpPIrK6LjnxRSNvIVflv2SKrfZuptx1U5XFfRIyvKKcq544wpemPECBw89\nmMM2PSzRbVtfrLXMXjSb1eVuAsiebXsWRGqQunDD1Bt47IvHuHyHywtukrXzXjqPS1+/FICRPUZy\n9U5XM2a9MTV+p7S0lNLS0hrLAIwfP565c+eC5w7hZI+5zQOstSuNMb8AnYAaTyRjTHugFc55PLum\nstUxd+5chg+vfcbq3fY/iicfuQ2ATRonx79oZFo0acHk3SfHXQ3RSEye7JwQe+4JzZsUroHTti3s\ntRf8858uIvrJJ2HWrPT+jQpvXgxaNmlJ724w7aP0tk56v1MQyBlcmIwdG64ZPnilF/0ugiZLADea\nkfaFkT1AiIJ2BgNsObQznD8V+j/LAVuO55ev+/Kxm2uN9dePt27Z4OqdrmZIlyEYY9ix7470btc7\n7ipllZKiEtaPSqooYaO5F/PyaRfyQekHvDXnLdZrtx67Dtw18TlVa6O4qJhztzmXc7c5N+6qxIIx\npuDO77pywsgTOGHkCXFXIyecs805dG/Tnd7terPLgF3qNPL7tttu46KLLmqE2hUGcghnh8+BMUB/\nY0yRtdWOzdkg4zv1pqYI4SjrrtuNv/aDn39OD78WQiSXPn3gvPPirkXjcOihziEMLoVEi0gg9MYb\nx1OnXNMz43WiHMJC5C+9erk8wtOnw9tvuxENC9Ip/OQQFiIhDBkCzO8L7xzDxyvdtQzQsiV06RJr\n1bJCcVExRwwr7I5gj4z0sN26QZEpYnj34QzvXnsQlRAif2le0rzeE5pOmjSJ//u/GgftA5UihL1G\nDuHs8AbOIdwKGA68U025bSPrUxryj5o2bcqwYcPqVPbSSxvyH4QQIl62394Z+N9/D08/nd5uDAwe\nHF+9ckmmQ7hjx3jqIYSoG2PHOofwqlXOKSyHsBDJo1Mn2Gwz+OAD+Pjj9Pa+fZ3NIfKfDh2geXNY\nscJ97tYt3voIIeKlW7dudKvDjaBp06aNUJv8p7DHTjQej0XWf1dVAePi2w8JPi4AXs51pYQQIokU\nF8PBB6+5feJEF7VTiChCWIhkkU4bAa+8IoewEEnl8svX3FYI6SJ8wZjKUcLhJMRCCCFqRw7hLGCt\nfQd4HTDAEcaYqqYgOA0YjMsffJ21trwRqyiEEIniuOOck7SkxOVNfuIJeOCBuGuVO+QQFiJZbBsZ\n8/Xyyy7neYgcwkIkh513hp12qrxNDuFkEXUIK0JYCCHqjlJGZI8TcWkgWgDPG2Mux0UBtwAOAI4M\nyn0JXBtLDYUQIiH06AEzZ0J5OfgwokcpI4RIFr16uWHlM2bA//4Hgwa57cZAmzbx1k0IUXeMgauv\ndqkjrHXb+vaNt06ifsghLIQQDUMRwlnCWvshsC+wEJdL+HLgLeAlnDPYAl8Au1prl8ZVTyGESArF\nxX44g2HNSVEUISxE/jNmjPu7ciV88olbb9cOimRdC5EoNtnETWgbssEG1ZcV+Ud4L27Z0rWlEEKI\nuqEI4SxirX3KGDMUFy28K9ATWAV8AzwE3GytXRFjFYUQQuQh7dpBq1awdKlzhLdtG3eNhBC1MXo0\n/OMfbj2MLOzQIb76CCEazp//7CaJbN8edtwx7tqI+nDkkS53cP/+0Llz3LURQojkIIdwlrHWzsbl\nCz4t7roIIYRIBsbAx
hvD22/DgAGa3VyIJDB69JrblD9YiGTSsSP8859x10I0hJIS2GOPuGshhBDJ\nQ4PahBBCiDzg1lvh2GPhrrvirokQoi4MGrRmNJocwkIIIYQQIgnIISyEEELkAZtsAjfdBCNHxl0T\nIURdMGbNKGE5hIUQQgghRBKQQ1gIIYQQQogGIIewEEIIIYRIInIICyGEEEII0QDC2e1D5BAWQggh\nhBBJQA5hIYQQQgghGsBmm0HLlunPcggLIYQQQogkIIewEEIIIYQQDaBJE9hyy/RnOYSFEEIIIUQS\nkENYCCGEEEKIBhLNI9ypU3z1EEIIIYQQoq7IISyEEEIIIUQDOfJI6NMHBg2CXXaJuzZCCCGEEELU\nTkncFRBCCCGEECKp9OwJM2aAtVCkUAshhBBCCJEA5BAWQgghhBBiLTDGLUIIIYQQQiQBxTEIIYQQ\nQgghhBBCCCGEJ8ghLIQQQgghhBBCCCGEEJ4gh7AQQgghhBBCCCGEEEJ4ghzCQgghhBBCCCGEEEII\n4QlyCAshhBBCCCGEEEIIIYQnyCEshBBCCCGEEEIIIYQQniCHsBBCCCGEEEIIIYQQQniCHMJCCCGE\nEEIIIYQQQgjhCXIICyGEEEIIIYQQQgghhCfIISyEEEIIIYQQQgghhBCeIIewEEIIIYQQQgghhBBC\neIIcwkIIIYQQQgghhBBCCOEJcggLIYQQQgghhBBCCCGEJ8ghLIQQQgghhBBCCCGEEJ4gh7AQQggh\nhBBCCCGEEEJ4ghzCQgghhBBCCCGEEEII4QlyCAshhBBCCCGEEEIIIYQnyCEshBBCCCGEEEIIIYQQ\nniCHsBBCCCGEEEIIIYQQQniCHMJCCCGEEEIIIYQQQgjhCXIICyGEEEIIIYQQQgghhCfIISyEEEII\nIYQQQgghhBCeIIewEEIIIYQQQgghhBBCeIIcwkIIIYQQQgghhBBCCOEJcggLIYQQQgghhBBCCCGE\nJ8ghLIQQQgghhBBCCCGEEJ4gh7AQQgghhBBCCCGEEEJ4ghzCQgghhBBCCCGEEEII4QlyCAshhBBC\nCCGEEEIIIYQnyCEshBBCCCGEEEIIIYQQniCHsBBCCCGEEEIIIYQQQniCHMJCCCGEEEIIIYQQQgjh\nCXIICyGEEEIIIYQQQgghhCfIISyEEEIIIYQQQgghhBCeIIewEEIIIYQQQgghhBBCeIIcwkIIIYQQ\nQgghhBBCCOEJcggLIYQQQgghhBBCCCGEJyTWIWyMaWWMGWOMOdUY86AxZoYxpiJYZjTgeEOMMbcZ\nY74xxiwzxvxsjHnNGDPJGFNcj+NMMMY8aoyZbYxZEfx91Bgzvr51EkIIIYQQQgghhBBCiGxSEncF\n1oIngW0jn22w1BtjzJHAjUDTyDGaAaOA0cDvjDG7WGvn1XAMA9wOHB6pD0B34DfAb4wxt1trJzWk\njkIIIYQQQgghhBBCCLG2JDZCOCB0As8DngeWAqY+BzDGTABuAZoAPwLHAyOBCcCjwfFHAP8OnL7V\ncTnOGWyB94ADgC2Cv+8H239vjLm0PvXLB0pLS7nwwgspLS2NuyqNgm96wT/NvukF/zT7phf80+yb\nXvBPs296wT/NvukF/zT7phf80+ybXvBPs296wT/NvuktLy8PV5PuE107rLWJXIDfA/sDfSPbvgUq\ngBl1PEYJ8HXwnflAnyrK3BTsLwcOqeY4A4BVQZm3gWYZ+1sA/wuOsxLo1wC9cwDbo0cP29i89957\nFrDvvfdeo//vOPBNr7X+afZNr7X+afZNr7X+afZNr7X+afZNr7X+afZNr7X+afZNr7X+afZNr7X+\nafZNr7X+afZNb5cuXcLg0p9sHvg341oS6w231v7NWvuAtbbe+YIj7An0w50Il1trZ1ZR5nScszhc\nr4qTSaffON5auzKjrstxkccE5U5aizoLIYQQQgghhBBCCCFEg0isQzhL/Cay/o+qCgT
O3IdwqSg2\nNMb0r6LY/+Gcyl9Ya9+p5jhTgS+D4+yxNpUWQgghhBBCCCGEEEKIhuC7Q3h08PdLa+3PNZR7NbI+\nKrrDGLM+buK4zHI1HaeHMWa9OtdSCCGEEEIIIYQQQgghsoC3DmFjTCugF0Fkby3Fo/sHZ+zbsJpy\n9T2OEEIIIYQQQgghhBBC5BRvHcJAz8j6nFrKzo6s98rRcYQQQgghhBBCCCGEECKn+OwQbhNZX1JL\n2aWR9dY5Oo4QQgghhBBCCCGEEELkFJ8dws0j66tqKbsyst4iR8cRQgghhBBCCCGEEEKInJJTh7Ax\npiILyyE5qt6KyHrTWso2i6wvz9FxhBBCCCGEEEIIIYQQIqcYa23uDm5MeRYO8ztr7d11/H/fAusB\nM621fWspOwj4HDep3M3W2hNqKNsJmBuUfcBae1Bk3yTglmDfPtbaR2s4zkTg4aDsH6y1t9dFV/Dd\nVUATYwydO3eutXxxcTHFxcV1PXyNrFq1irlz59KlSxeaNq3N5518fNML/mn2TS/4p9k3veCfZt/0\ngn+afdML/mn2TS/4p9k3veCfZt/0gn+afdML/mkuFL3l5eWUl9fuhpw7d264WmatbZLTSuUxuXYI\nD8zCYUqttYvr+P/q4xBuBSzGOWf/Y63dq4aymwLvB2WvttaeFdm3K/BEsO9ka+0NNRznJODaoOyu\n1tpn66Ir+G4ZkB0PrxBCCCGEEEIIIYQQ/lJurS2JuxJxkVPh1tqvcnn8tcFau9QYMxvoBWxQS/Ho\n/s8z9k2rplx9j1MbK3EpJywwrw7ly4GKev4PIYQQQgghhBBCCCGSRhF1C6TsCBgqz/PlHd56wgPe\nAA4ABhljulprf66m3LaR9SnRHdbab40xPwDdMspVxTbB3++ttbPqU1Frbav6lBdCCCGEEEIIIYQQ\nQohMcjqpXAJ4LLJ+WFUFjDEtgH1xkbnTrLXfVFHsP7i3CxsYY7ao5jhb4iKEbcb/FUIIIYQQQggh\nhBBCiEbBd4fwv4HpOGfu2caY9asocw3QIVi/qprjXAeUBes3GmOaR3cGn8PcwmXA9WtTaSGEEEII\nIYQQQgghhGgIiU0ZYYzpB4zO2NwaF4Hb2hhzaMa+ZzJTQlhry4wxJ+AmhWsHvGmMuRT4H84JfBSw\nV3DM14F7q6qLtfZrY8w1wFnACGCKMeZPOGdzP+BMYLPgOFdZa6c3TLUQQgghhBBCCCGEEEI0HGOt\njbsODSJw+P69Hl8Za619rZpjHQHcBDTFRQtHscBUYDdrbbWTuRljDDAZODzclHEMgL9ZayfVo85C\nCCGEEEIIIYQQQgiRNZKeMsLWcamo8SDW3gEMB27HRfUuB37BRQX/ARhdkzM4OIa11h4J7IrLKfw9\nbsbC74PPE+QMFkIIIYQQQgghhBBCxEliI4SFEEIIIYQQQgghhBBC1I+kRwgLIYQQQgghhBBCCCGE\nqCNyCAshhBBCCCGEEEIIIYQnyCEshBBCCCGEEEIIIYQQniCHsBBCCCGEEEIIIYQQQniCHMJCCCHW\nGmOMibsOjY1vmo0xHeKugxBCCOE7vtkf4J9m2VxCiMZADmEhRE4xxug+4wHWWutLWxtj2htjTKC5\nOO76NAbGmKuAc4wxbeOuS2MQbVdfzmshRPLR/coPZHMVNrK5RKHh2wudJKELTohGoLqboA83R2tt\nRfRzIT/ojTEHGmM2jbsejYkx5g5jzAWwZlsXIsaYPYErgcnGmBJrbXncdco1xpi/AqcBewIdgm2F\nfu9qGq4U+nnt8/MpSiE/m3zD53NaNldhI5tLNleBIpurwNvYWmujnwv52ZQ0TEbbCNFoGGOKCv2m\nH2KMKQH6Ay2BNsCnwLzwDX8h/g7GmIHAxsAoYDXwM3CLtXZZrBXLEcaY24AjgcnAjdbaz2KuUs4x\nxlwHnBB83LjQNRtjzsMZ6W2ACuAwa+298dYqtxh
jbgKOAcqBYuAfwBGFeM8CMMbsDGwB7AcsAOYB\n1wIfW2vnxVm3XGGMaQesh3s+tQA+A+ZmGu+FgjGmLzAI6I67jmcAb1lrV8VasRxTqLZGVcjmks1V\niMjmks1VaMjm8sLm2gjYAGd3rQS+B54CVhSq3ZU0O6Mk7goI/zDGXAI8Y619MxwCFHedcokx5mBg\nR2CfYFNz4F3gHWPMSdba1caY4kJ6622MORk4ANg8Y9dvjTHHWWunFFLbG2NuwXVMwOnGGHOTtfbT\n+GqVW4wxNwDHAWXA8cAX8dYotxhjrgVOAixwN/CUtfbheGuVW4I2Pib4GHZONsYZdtOSZvDUhjHm\nQuAoYN2MXUOBvxljJltrf270iuUQY8xxwK64Z1QZ0Az4BJgZ/B6zCqlTZow5C9gXiEYVLgW+MMZM\nBl6x1n5dYM8n2VyyuWRzJRzZXLK5ZHMlHw9trnOA3+KcwVG+AP5jjHnIWvtBoTyfjDE3A/+21r6Q\nqOvVWqtFS6MtwC24t7zTgc2DbSbueuVQ7xW4h3p5oHtZ8DdcXgKaFNLvAPw5om8V8BawOFgqgI+B\nzoWiGfdi7ZFA29Lg76LgXN8o7vrlSPMNkTb+PVAcd51yrPesiN7TgZ6RfUVx168R2njvQHf4+ay4\n65cDvX+J6PsBeBGYDcwPts0E9i6kNgeuzngefR38XRH8/Qa4HheJFnt9s6D3mojWZbio0eh9+2fg\ndWDbuOuaRc2yuWRzyeZK+CKbSzZX3PXLgV7ZXH7ZXL8CU4N79oJg2+KgzXeLu65Z0ntzoKsMGBNs\nS8S5G3sFtPizAJdGLpSC76DghryEN8KHcMOejgAuCm6Gq4J99xeKcYfL8xVqvhYYH2zfGrgpMNor\ngL/GXdcs674q0PVk8MCrABYWYgfFw47JVoGRVh5cw80i+6rUnhQDoI5tfBQuSmUj4L1g27fAJnHX\nM4t6L4zoPRMYGmwfEJzjpcG+KXHXNYuaL4tovhQYB7QHdgo6LT9G7mNTQ+M2qUvQrqHes3HD6psD\nY3Ed71mR/SuBveKucxY0y+aSzSWbK+GLbC7ZXLK5kr94aHOdEtF7KrBZsH0TYC9chHDUOX4s0Dzu\neq+F3ouo7NxfTYKcwrFXQIsfS3Dx/xA84BdEbgAF2UEJjJhQ49FAp4z9u+HefoZvQXdM+m8Q3PCj\nN/b2Gfs3w+UNqgAeiru+WdJcFPz9v0jnZCzwPgXYQaGGjkltD7ykntuBcV4OTAEGRraXRLUBfXDD\n3pom4eHfwDa+k3Q01oF1afd8X3BD934OdJ0FtMnY3w44DxdpOJMCiLQL7nZ7T0kAACAASURBVFcL\nI/fqFhn7OwIHke6gVAT37h3jrnsD9fbDpQwoD9qyRRVleuIilBZFNB8ad93XQrNsrsr7ZXPlQZ2z\noFk2V8ZvUcN3E3luI5tLNpdsrsTaXMG12R14LbiOL83UG5TrDPwd97IjFQmf+RxLwgLsHrEvfo60\ndWKcwrFXQEvhL0AvXP6nctzwgDNJv9EPh0gUTAcF2B74MtB2AdAqsi9q0Bwe+Q3Oi7vea6l5F+C7\niOY2kX1R4+aFoMzLBMM2C2EJDNOFwIdAF2C7YL3KDkoSz3Pg1sj5egDQNLIvel63wkXf7QLsD4xM\nYlsHRk0J8Hyg+epqyl2EmxwhnMTnE1yk1si4NTRA8/VU0TEh3QnvFbm3fUFgqCd5AS4O9HwFbBht\n/8j66KDMPGCduOucBc1XBHpeBXpEthdF1tvijPOlwbldgRvKOTbu+jdA7w6khyduk9nGkfO8I27I\n3+zIdXBQ5vmQ7wuyuWRzpddlc9lknufI5pLNJZtLNlcyba4tAw2rgF0y2zhynrfB2ScfR66D06L3\nunxfcC+l/oYbibUy0PMg6RfxiXAKx14BLYW/AIdFLvSbg23dgHci2wsmagW4JLgBTAU2rWJ/eEPs\nFPkNHg+MocR
pB7oC9+KGSdwPDKhKMzAY1xFdjZtNdkNgPPA7XLL5DnFraaD+YtxMsR8FbblVsH0s\n8AHVd1CaA23jrn8dNR4buVY/APoH24uIdDxwE508RuVhyvOBt3HDozrFUf+10N0UF1lYAZwYbgv+\n9ohoDfNVroj8XUKQ/ywJC3BdxHipclgqzmC9J9KuR4bnQdz1b4DeIlwkyueBnsdqKDsMWI4bpjwC\nmACcGFzjfYMyeX/vDjS3J5237oZaym9OOp9fmKvza2CLuLXUU/ehpCND16uqvUh3wNvihuFHo1Z2\nj1tDPfUehmyu6H7ZXLK5QDZX3i/I5sosI5tLNlcSba7dgrr/Agyrqr1I21wtcaN6wlEeFcDvE9TG\n+0fqfWuwrRPwLxLkFI69AloKewkMsCeCC+K/RIYCAL2B/1FAHRRgSMRAObcO5Z8m/ea3Ta7rlyPN\n65CevGTXjH3RN74nRwy5r0jn86vARWT9nSDHUBIX0vkLfxt8bkrVHZRBuA7NgbhIh+Fx170O2gbg\nIjKWBe32BNAno8ztkfYMh7hFDffpwB+BbnHrqaf253EdrVMytj8Z0XkHLgLg+kBn9Hc4OCift/c0\nYD1cJNkqYBI15CgExkS0/TuyPW/11aI9fAZ9BXTN1IKbAfqU4DyeSXqoYxi98imwc1J+g+C+NC2o\n/x3htowyUf2v4jooYYTpcpwTqmdj1TkLmo8i3cGqtmNFuoPSBpfTb06knbeKW0cdtcrmqrm8bC7Z\nXLK58nhBNldmWdlcsrmSZnPtRfpFzbgayoUva1sAx5Ge6LcC2CluHXXQ2QS4MajvK+H5HOzrBjxM\nQpzCsVdAS+EvgWFzKy5CIexwhcMFMjsoiR7KiBsm8RruTVeP6nREfodwBs4vgY5x138tdI8Brox8\nrhR5g5sxN2zjabhcjdfiDLxweO7CwEBKVO63SFseH+h4OPKQK2HNDsrfcJMYhcPAriQy/C/floi+\n9YP2WhUsTxF0NIC7SDtc/h38FnvgJvR5lLRB9x0u72HruHXVQ/8DQd3fAboE28LhXwuAERnlOwC3\nUXmSqvFx66iDzi2Cc7Wmjkl4Xt9CutN5aNx1b6BeA7QG3gh0fA8cSSSCDNcx2Z50RMs8XCfmfVx+\n1mgntFqjN1+WQHPbQEMF8Gpk3xoGKi4iazbuubwF6WF9cwiiZknAcxoXKRkOr7+1pvsPlSOF7wju\ndauD+1qPxqhvFvTK5qq+XWVzyeaSzZXHC7K5omVkc8nmSqLN1ZW0E/wZakh1EjnHWwLnk55Q8ANg\nUNxa6qC1Z1DvfVkz5UtinMKxV0BLYS+Ri6I9ayZRL9QOyv8BlxPJY1dD2XBY2GzcG+Mk6i3K+Fyc\n8Tk6nOIfwDYEM4nicp9tSvoN/3zgL7i3hYn6LXATF80H3sjYXoLLbxcOh1kG/Bqs/wD0irvudW1j\n1uygPA78KdCyGJejsXvGd7vjhjCHxtzXwJBgX962cUTzMcFDfBbwG5zB+gquIxYaaCXB33BoY5vg\nd/kp0Pw+0DtuTdXorHcbkB6CX46LUirJ57asRctupCMMP8JNZrJ+cF86lvRMyNNwuWh74KIwewXn\nf2jozSDPZwEnbXhfQXp48eWR/U2Cv+Gzefvgun4z+LwFaWP9dfLYqZKhuyvpDub7BGkFqGXGetyw\nv9A2mUPQASXPDPlq6i6bq+qysrlkc8nmysMF2Vw1fUc2l2yuJNlcbYDnSEeC71HT+Rr5ndoBjwTf\n+wX4Q/T3ybclUu82BM/YyL5EOYVjr4AWvxeq76BUO5QRl5Mn7x6EVI7OaFFV3av4znGB3p9JgJHa\ngN+kNemOx18J3vaHv03kZtqXdATXLBISiZWhpTtpAzzMaRc9J3YhPQtpOOwtnOk8Lx92GRqr6qCs\nCB5uy3AdkOhkNtFJT1rjckSFubEejVtPPXT3w0UpVOAidPrhnAmlBPlIq/md2
pEe4jg3NAAKZcFF\nJVUE50EihtNXo6MDblKm5RE9FbjJIcJOyzQiE5tEnludcB2UMBLtuGB73j2fMjTvHjmnS4ELqigz\nhPSM19cE21rhhuiuCJ5Zm8etpQ5aw2fMb4L7VAXwRGZbVvG9sI03jNzXn4lbTxZ+D9lcsrlANpds\nrjxdkM1V3e8im0s2V97bXBE9oyNt/DLBiyuqcYJS2YEajmKZGreOtfwN6uQUznhulRDDpKBFCBEj\n1tpyY0yxtfY7YB/cZALgDKAHjTGbW2utMaYYwBjTGpdcfqQxpkk8ta6aoJ4mWF8ebquqbFgO98Yb\nwELV16MxprcxpkeWq9soWGuX4HIJXQScb62dG9lnw7a11s7Aze6+HPcWeAxU+p3ymkDKD7icjeBm\nHU1hjGmBm8E+ej4YYIIxZlNrbXnj1LThWGsrjDFF1tpvccMTnwt2FeOGVj8e1WGtLYusL8ENWf05\n2NTHGNO8cWrecAK904FLccbqBNzwU4OLRJmb+Z3I77QQOANntHbCGYSJOaerwxgT3qcewRm4JcAJ\nxpi28dWq4Vhr5+OGY/4e50j5Eddmt+M60/OBo6y1P4XPoeC5VWSt/RUX0TIPFyGwR7C/yvt+vmCt\nfQL4c/BxHeACY8zjxphjjTFHGGMuxRnwXYO/fwy+txR3j2sKdAY2avTK15NIW7yDM8jLgF2NMX8P\n9peH7ZrxvfLgXP8Wl7+zDFjfGNO5cWqeG2RzyeaSzSWbK1+RzbUmsrlkc5FnNldt11SwfyouTdEq\nYFvc8ya8XquyuSqCZ1NpUHYF0N8Ys2G2619fGnoPidybSoETcPnRF+Hu4S8ZY8ZYaysIbJHA3toF\n2N0Y0yk7ta8bcgiL2Il0UGbh8p5ldlBGBGXa4h7wN+IMnVHx1Lh66vpQipSbh+toNgfaZd50jDED\ncPnAPjPG9M1mXRsLa+1HwMXW2l+q2R8atN/ihi2C+03y/iFfBd8Hf/c0xpig89UaF512Fm6I6s+4\n4cutcPmzJhljhsRS23pSRQflZdywnqettfNq+e5XuNxhAEPJ6MDlI8GDGlyexZdww/W2xUUm9cHd\no6r8njGmBNd5WRhsXhXsS9o5XYnIb/JfXNsDbIYzVqOdl8Rgrf3ZWvtP3PC8EcDGwH24CSNKgS+D\n6zna+a4IHGS/4q57i+vA5jVh+1hrLwMuJh09txsut+btwJm49nwRN6v5quB8Jtj2cbCeN85RY0wT\nY0xxFc/Q0GH4PfAg6Tyihxpjbgz2VecUrggcjVNwnfCBuMjavKA6zbWRVJurKr2FbnM1pI2TbHM1\nQG/iba6aNBeizVWb3mC1oGyuht6rIZk2Vw3P49D+KDibqwbNJVBYNldwn13fGNPXGDM0ui+4X1lr\n7WrgP7h7VhmwhzHmweD75VWdt5H2/hj3jO5AHty3atNby3drcwpvk2Fv/Rn4Fy5wodGu7ZLaiwhR\nM8aY9YHv1uaNezRqxRizN+5i2Bz38L/fGHM4bgjUGTijwOKGFDQ62dAbPRzuOgzzQaUMl6Bj8ldg\nWLApNqNmbTXXZJAFN8pwcoyQ5Q35P9mivnrDjgjuwXcmblibNS4iY3fgXNITG22Hm2TgHzgjfR+g\ntTHmcmvt59lXUzfqqjnaQTHGnAhsZ619ppZjlwTRK6tx5/FPwJLI79bo1KeNrbUfG2PuwjmE+uE6\nKW1xURpnV9Uxs9aWGWMqcNc4OO2xkq17V9D+c40xF+KM2YHA2ThDtqLGLzci9dEbnIvzI59H4iLM\npgdaq+q8rjbGtMPlazVUEb3U2NSmORKFUW6tvdAYMxN3j9oNl6sRXF6/qcBJgVGfij6z1q40xoTX\nbOyRV8aYrXGdyQk4h88sY8xrwIPW2pXBfbgocO4+bYzpjuuEdQSODX6LY2w6Aqkicuzw8xxcB84E\nf2OlNs11OUbCbK611hs9HMmwudZKcwJtr
nrpLRCbq06aC8jmqlVv5F5dKDZXVu5dCbK5anseV4Tn\nYAHZXLVpLjPGNLHWri4Qm2s/3GSIB+DsoXbGmH/iRiz8K3K/qrDW/s8YcxsuKnpjYJ+gXQ+oykaN\n3J++x0UIxz6qoY56a7yvRp3CxpgTgBuAnXD3tBeNMbvh2vYM3P2uDJdHuvGubZsHOTa0JHcB7sZN\nWLAlWUiOTTpP0HpUzm/3E+lE83OBwUnWS+W8huW4vDKDI/sH4N6Qh3o3KJQ2ruZ/tMEZ6xW4aKxG\nz5+TDb24vHzzcVFIo4EDgc8CXTMJchbiHnLb4YYxV+A6Ld2SpDmzXHhO11C+A+lJXv4Tl9b66o3q\nws3gXUp6tuevcJO6tAl/k8i1bXDD2RYE7TuyLr9TPmiuxzE3Aj4JfosPCWarj0tjNvRG2u/kQNcn\nQLtgW+bkTUXAfriIpO+B3ZLSxtH9OMffBsDI4L4VzTuaqXk90nkpj4y5jc8mPXt35nILweRxVeg9\nAecMC6/jewgmJ4q0q4msnxqUew9omRTNdTxevttcWdFLsmyurLZxNf8jn2yuBusluTZXvTWTbJur\noffqJNtcWb+OyW+bq0F6SbbNVZ/zOprvO6k216U42ynUuDyy/j/gkMx2DdYPwo3SWB2UfRz3UiOc\nVC/T5joqKPcZ0DkJeut4vGhO4X+RzilcRnr0Wiz2VmwnlZbkL8BNkQvjDbLvFO6G64CFyeYrcMNl\n4uqYZF0vzkBdEegLjZeBpDsmsenNZRtX8X/GBjfDlcD5xDSD7troxRmjTYPvVeCG2X7Amh2TcHbk\nEmAc8BoxzpTbiG28G24CmLnAoeFvlgS9VO6gHI4zAEPD5hOcg6lrpEwJznB/hfTEKB0KsY1x0Vnh\nsU+OS2O29eKiPcJj3B7ZHj6fDC4C7bWgzLNEjPokaCbSka5mf6YDohjXQV+GG9LXfW3rvRZ6/xJo\nLcd1/l/CRdmURn6HO8PrMmiv6HV8HC53YTjz97O4Gc3bR/XjJnp5MyhzG9AsSZrreNx8tbmyrpf8\nt7ly0sZV/J+x5IfN1WC9JNfmaqw2zheba23v1Um0uXLWxuSnzbXWekmezdWQ8zrJNte1EV3/xT03\nTgo0htv/Q+SlecZ1vD/uhVw4ieCUQNu6kTIlOJsrvKffQ0wv4Ruit47HjU6Q+BjO4Rze234lLnsr\nrhNLS7IX4BjSbzXCNyZZdybhHvrhG5R5wIaFpBf3ZjC8OW5LfnVMGquNB5M24t6K64GXLb3ANcF3\nw3adSbpjkvnmtziuh10MbRwacS8RmT04KXqpbNjsHRgI4YzIc3DDvY4EDgWuxuXlrMBFMfQvtDYm\nbdiuB7wdHPdbgoiVpOvFRW+8S9ph+DfSwxTb4yKRXg32zS7ENq7i/0Sdo/8AWsek91LSBvmZwLBg\newtc9NBjkf2nZnw3eh0fGpy7ofPzK1y+voOD45wXObenA+vH2MYN1lzP/5MvNldO9JLfNldjtXG+\n2FxZ0UuybK7GbON8sLmyda9Oks2Vq3tXvtpc2bqOk2RzNdZ1nC8213kZevtG9jWhckDCdhnfjV7H\nu+IiY0M7tRSYhkuVcAxwXeQ6nhn9P0nRW8//cwzuhV2s9pa1cghracCCSwL/CpXfimW1E4obTnFQ\n5Mbwa1wXSi714jony3EdsANxs5nnQ8ckl5pDo6YNLkolNFrnAAOSqjfcD2yDmw061FRlxyTupZHa\nuHXwe4RG3HfEZMRls42D9c1xb4yXkDYMyiPrFbhIlkGF2MbR3wQXORlqPrhQ9OKMtcWRdv0SN0xs\nOu6ZFF7jcQ4vb4zruB3uXv1q5HeIxTmKswuWBfU4noyI3eB8nIAbTluBS6HRM7NMZH0sLtInPGYF\n6WiNcPkq5ut4rTXX4X/kk82VM73kr82VS835aHNl7TomOTZXY7RxPtlc2b5XJ8Hmaox7dT7ZXFnV\nSzJsr
sa4jvPJ5jqM9EvzU4DmkX3h6Iu9SKe12CmqJVouWN8Ilx5kUeQcLqPydfx5XNdxlvTWls4n\nb+ytVJ3i/OdakrkAl5B+E38m7g3WHWSvE9oCOIR07qu4DfWc6cXNFLswOM7cfNDbSG28EfAn0jkK\nv47r5p9tvbgIlAnAA0CfcFuc7RlTGw8CLiKdz+9b4jXisqI380EPbIozzl/DveFdgIsyvAToXeBt\nHBqv65A27gYmXS+VDbujcREM4XHD5UdcZGFsUSq5bmNcZM6WwKOk8+TNjOteDfQM6lKOy5dcbQ5Q\n4O9BfRcBQ2tp41bAKNzwv/dJO4TfBm4muI8nXXMN38sbmyvXeslDm6uR2jhvbK5s6yUBNlcjtXHe\n2Fy5ulcHn/PS5mqkNs4bmytXbUwe21y5bmPyz+baGBdwUIaL3q1SL/D7oK4rCaJpgRYZZZpUcezr\nguP/invR8ypwFW6S0KTrrfIZhMtnfxh5YG9VqlfcFdCSrAXYMHKTvi+yfYvg5peNTmgv4CHSBnuc\nhnpO9QY3n2Wk34bGlj+mMTQDXXG5G0NDZikusiPOobhZ14vroDQL1ktyUe980xw5RgfcEL5fSL/5\nfR3oV2h6SUcohRMj9CRPopMa414d/ga4yLMrifelTlb1UjkyaVvcxGIf4jqh9+KGsHbNhZZ8aWNc\nFMN9wTEW4zpjcV7H+5COCtuvpnYDxpB+zhwVbKsub1/YyW6Jm/V7CM6BlrqPF5rmjO/nk82VU73k\np82VM83kp82Vdb3kv82VyzbOR5srV/fqfLa5cn6vDo9BfthcWdVLMmyuXD+f8sbmCs6zw4P7yUcE\nOfcj+6OTwIUpFJ7BpdN4BufI/g9uBFZq8riMv2E+3Z4ETmBiun/nSG9Vz6qOwF3kgb1VqV5xV0BL\nshagPzAZFzkzOtgWXiTDyU4ntDnwu+AhMKSQ9eLe6Icze88n5iEDjaR5PC5K5RXccIx1s60hn/Tm\n49IIbTwB94b/XeAKoEeB6w2NGxNdL2TNVfy/poWmN3M/GREOcS+N0cZAP+BC3OQfseShDH970hE3\n/4psr85xsAHp2aHPq+P/WGPIX5zXcWNoDr6XFzZXI7VxXtlcjaQ5b2yuxjqn82lppDbOG5urkfTm\nlc0Vx3lNjDZXrvSSxzZXY7UxeWJzBXXZC5eu4sSM7VFb6VAqR3Ivx0XuR7e9Fmoh8uImD6/jnOqN\naB0DPE3M+b8r1SvuCmhJ3gL0wb29axF8jl4o2eqEtgDaxK21MfTicia9Tx44gxtR82ZAl3x52DfG\nOZ1vSyO08QhcNFaruLWqjf3Q3AjndDSCJVaHf2NoJm2gNyEPhmDjOkoPAhdltkdmvYPny9eB7gvj\nrnu+ayZPbK7G0Eue2VyNpDlvbC5dxzlr47yxudTGha+5kc7pvLK5cq2ZPLO5groMq6o9gs97knaC\n/hs4CRft2x/3kvk90vmWXyIPXrTHobea86N5rjQ0SHfcFdCSzCXzoiYLndA8v0HkQm80yXq7uDU2\ntuZ8W3w7p9XGauNC1OybXp804yIrBgAd6lC2BGecVwC31qYpH/X6qDnHeqORSXljc+VYc949j307\np9XGauNC1JxLvfm6+NTGrOkMzfy8HWnn6G244ITMPMHbAS+TTm12fty6pLfqpQghGoANzvzoX2OM\nCdbfw+VXeQCXcHtr4BpgC2NMEUBY1hjTwhgzIHKMvDwnc6S3zBjTJFhf2LiKaieHmn1q47w9p0Ft\nrDYuPM2+6QV/NFvH19ba+WGda6AYl+cPoGl1hYwxrcJj1+GYjY5vmnOst9wYUxKs543NlWPNZZ61\ncd6d09Aobax7dcz4pjmXeoP1vNILfrWxtbaips+4KNdSnHP0j9bamdba1ZBuO2vty8CNOOc4uHkv\n8hLf9GaSVw8QkWzq0QktCcq2xuVrucoYc27wvcwLMG/Jkt7V8dS+Yai
NC1sv+KfZN73gn2bf9ELh\na7bWOb+rInCOlONyxIIbwrfGd4wxA4FLjDGH1nbMfMA3zTnSW5ab2mYHtXGaQtQLOdOse3Ue4Ztm\n3/SCn5qjWGtfxEXEnmWt/TVjX9T+fBSXvx5gW2NM+3x7gVUXCl6vzYMwZS2FtVDzcNUpwbZ2wH64\nmRwrgLeowxCMfFx80+ujZt/0+qjZN70+avZNr6+aI3qfCvTcW8VvMQCX460CeBxoHXd9pVl6pVl6\nfdTsm14fNfumt5A1R3XUUq44+PssaduyqK7fz5fFB71hSLMQWcNa96bEOt4zxtwU7Nof2AoXqfQM\nLhppY2ABcIS1dn7VR8xvfNML/mn2TS/4p9k3veCfZt/0gp+aIRWhUxx8bAfp6Bvj0mLcAowFFuIi\nPpbEUM2s4ptm3/SCf5p90wv+afZNL/in2Te9UNiaQx11KFdujFmXdOqEz20ej2aoDh/0yiEsckIV\nndCbccMnDgJGAhvgbpDzgDHW2s9jrO5a45te8E+zb3rBP82+6QX/NPumF7zVXGGMWRB8XBFuj3TE\ntsfpHW2t/SKGKmYd3zT7phf80+ybXvBPs296wT/NvukFPzVnYlzO/vE4+/I74J/BdlNXJ2uSSLLe\n/M9pIRJL2AkN1t8F/gFMxeUwbIfLrVMQnU/wTy/4p9k3veCfZt/0gn+afdMLfmomyNtHYOsal68v\n2hEbU4AdMd80+6YX/NPsm17wT7NvesE/zb7pBT81RxkMTALaAO8CH0KyciXXk8TqlUNY5JTwIjDG\ntAS6AusAzXDDUrcpsM6nd3rBP82+6QX/NPumF/zT7Jte8FJz8+BvR2PMRsBfqdwRKzS94J9m3/SC\nf5p90wv+afZNL/in2Te94JnmMOjAGNPEGLMZcD1uFNoM4HSbMRlb0ikkvUoZIXKOcbOX7wacCQzC\n3Qi3sdZOi7ViOcI3veCfZt/0gn+afdML/mn2TS94pzmMzmkP3AyMoUA7YhF80+ybXvBPs296wT/N\nvukF/zT7phc80xyMROsK7Aocg5u0uBTYzVr7bayVywGFpFcOYZFTgrcnuwKXAetTwDdC8E8v+KfZ\nN73gn2bf9IJ/mn3TC15qnhf8HYKzdwtdL/in2Te94J9m3/SCf5p90wv+afZNL3ikOXCM/gY4Emdf\ndsSlTTjIWvt1nHXLBYWmVykjBJCaDTNcN8aYFsH6Wr00CIaofo67WCwueXrsN0Lf9IJ/mn3TC/5p\n9k0v+KfZN73gn+Zc6SUd9BB2xPImLYZvmn3TC/5p9k0v+KfZN73gn2bf9IJ/mnOktzWwCy5Kdh5w\nB7B3PjhHfdPbEOQQFhhjiq21FcH6BOAvwGfGmA2stWVreWxjrf0Yl1NliM2D5Om+6QX/NPumF/zT\n7Jte8E+zb3rBP8251As8CHwGlAHb5ktaDN80+6YX/NPsm17wT7NvesE/zb7pBf8050qvtXYGcH6w\nHAWcaq39Lht1Xht809tgrLVaPF6A4sj6icAvuFnHK4C/A0VZ+B9rfQzplWbplWZf9fqo2Te9PmrO\ntV6gKbAD0Cdurb5q9k2vj5p90+ujZt/0+qjZN70+as613nxbfNO7Vr9V3BXQEmPjRy4E4ILgAqkA\n7gL2AErirqP0SrP0SrPPen3U7JteHzXnWi9g4tbou2bf9Pqo2Te9Pmr2Ta+Pmn3T66PmXOvNt8U3\nvWv9e8VdAS3xL7iZEcML5Qyga2RfXt3QpFeapVeafdTro2bf9Pqo2Te9Pmr2Ta+Pmn3T66Nm3/T6\nqNk3vT5qlt7C1tvQZW2TZYuEY4zZDJf7BOBK4GZr7dJgn7HBFVMo+KYX/NPsm17wT7NvesE/zb7p\nBf80+6YX/NPsm17wT7NvesE/zb7pBf80+6YX/NMsvYWtd23QpHJiELAB8APwVHihQGpG8kLDN73g\nn2bf9IJ/mn3TC/5p9k0v+KfZN73
gn2bf9IJ/mn3TC/5p9k0v+KfZN73gn2bpDShQvQ1GDmGPMcYU\nAXvhkp5/aq2dEnOVcopvesE/zb7pBf80+6YX/NPsm17wT7NvesE/zb7pBf80+6YX/NPsm17wT7Nv\nesE/zdJb2HrXFjmE/cYC7YL18oYcwBhTnL3q5Bzf9IJ/mn3TC/5p9k0v+KfZN73gn2bf9IJ/mn3T\nC/5p/v/27i1UtvMg4Pj/2ydtkwabmFuNpqXEJG29YUWsFoxYqlWE4oM+KlbUlljSB9GCFu9Cn+oF\naSnqmxA1D0JqQR8kabVeiBVKqARNTURSsUkvgtIU2718WGt7prtnJ+fk7LNnZv3/f/jYc2atmfP9\n9pr18q3FbJsXfGabF3xmmxd85ryX2J55L6sWhMUtt8v/J/NJc+0Y48Vj6aTXHG0bY7xxjHH7NE1f\nXK7C7Hw2L/jMNi/4zDYv+Mw2L/jMNi/4zDYv+Mw2L/jMNi/4zDYv+Mx51+293BTIetb+BRjA3cAb\npqWTdp6maRpj3AS8F3hsjHHLNE2HZzTX08jmBZ/Z5gWf2eYFn9nmw+Wg/QAADEVJREFUBZ/Z5gWf\n2eYFn9nmBZ/Z5gWf2eYFnznvur3PuxaEJR2/wrFxheQDwCPL418ZY3zrc7zPOeB7gRcDjzGfaDuX\nzQs+s80LPrPNCz6zzQs+s80LPrPNCz6zzQs+s80LPrPNCz5z3nV7r0QtCK+0jZMBgONXODaukHwM\n+Nvl8R3Az48xXnP0Hkub36HyauBe4Bbg/cBnr8D0LzmbF3xmmxd8ZpsXfGabF3xmmxd8ZpsXfGab\nF3xmmxd8ZpsXfOa86/aeSdM0NVY2gHMbj18NvAn4beA3gDcDdx3b/yXA3wGHwH8BDwKvv8A+dy/b\nDoGHgZdt22r0Gs02r9Fs8xrNNq/RbPMazTav0WzzGs02r9Fs8xrNedftPbPf67Yn0DjlA/qlJ8rb\ngI8y/3XFw43xj8C7gbGx71cCHzm23/uAXwd+CniA+btYDoH/AO7cttXoNZptXqPZ5jWabV6j2eY1\nmm1eo9nmNZptXqPZ5jWa867be6a/221PoHGKBxMONh7/0saH/p+WD/t9wKeBzy3Pf+DYa64D7gee\nOHbSHJ1s/wP8PceuvuTNnDdz3sx5vWab12i2eY1mm9dotnmNZpvXaM67bu+Z/363PYHGFTio8/ef\nHH3QfwH4xo1tdwJ/DDyzbH/P8vy55ec1wA8Bvwd8fDlB/g34c+DtwFdv22f3Gs02r9Fs8xrNNq/R\nbPMazTav0WzzGs02r9Fs8xrNedftPbPf67Yn0DjlAwrfsXzID5lvhb+W+a8kjmX7bcBTy/a/Ar5h\n47UHx97reuBW4MZtu/J6zTav0WzzGs02r9Fs8xrNNq/RbPMazTav0WzzGs151+0909/ttifQOOUD\nCj8GfB74MPD1y3NHV0buZP5ulEPgg8CrTniPcyc8P057vnkz581s8xrNNq/RbPMazTav0WzzGs02\nr9Fs8xrNedftPdPf7bYn0DjFgwlXMX9nyiHwm8tzR1dN7tg4UR660IkC3LLxeOdPDJvXaLZ5jWab\n12i2eY1mm9dotnmNZpvXaLZ5jWab12jOu27vWY8Dak1NzLfPAzwKME3TNMa4g/nW+ZcCHwLeOk3T\no0cvGnMvBH5wjPHdR68705k/v2xe8JltXvCZbV7wmW1e8JltXvCZbV7wmW1e8JltXvCZbV7wmfOu\n23u2XcnV5sbpDOCA81dB/v/xCfvez3yF5B3LvzdvoX+Ik2+h/xbmv7T4N8Ar8mbOmzlv5rxus81r\nNNu8RrPNazTbvEazzWs05123d1dHdwjvcGOMATBN0+G0fKI3H5/QY8vPN48x3sT8PSovZb568iVX\nTTb+n+uBtwAD+G/mv7h45tm8y1xUZpt3mYvKbPMuc1GZbd5lLiqzzbvMRWW2eZe5qMw27zIXldnm\
nXeaiMtu8y1xU5rzr9u56V217AnVy0zRNY4xvAt4AfDvwAuAc8BfAI9M0feho3zHGVdM0fQF4P/DD\nwCuAPwKuBh4E3nbCiTKA1wKvBz4D/OHy/47nOClPPZsXfGabF3xmmxd8ZpsXfGabF3xmmxd8ZpsX\nfGabF3xmmxd85rzr9u580w7cptz48gHcDbwLeAb4AvPt8Efj88vPdwGvPfa6a4H7lu3/CzwJfP+y\n7YD5Csm5jf2/jvk7Vw6BPwNuzps5b+a8mfM6zTav0WzzGs02r9Fs8xrNNq/RnHfd3n0YW59A4wIH\nBe4FHmH+vpND4Ang48AngKeOnTgPMd8mv/n6m4F/XrZ/CvhT4HXH9rkO+J7l9YfA48DteTPnzZw3\nc16n2eY1mm1eo9nmNZptXqPZ5jWa867buy9j6xNoHDsg8GsbJ8JfAz8D3MB8G/31wG3AbwEPb+z3\nKPDOY+/zcubvWjlcTrovAu9jvuJyL/Mt+Y8v258EXpk3c97MeTPndZptXqPZ5jWabV6j2eY1mm1e\noznvur37NLY+gcbGwZhPgqMT4FeBbz62/arl5wHwncAfbOz/78DPHdv/VuZb5Z9e9jm6Lf/oqsxn\nmK+e3Jk3c97MeTPndZptXqPZ5jWabV6j2eY1mm1eoznvur37NrY+gcZyIOB3Nj74Pw1ct7Ht4ITH\nNwPv3njdI8CPLtvG8vM64B7gfuarJM8wXzV5APgJ4KvyZs6bOW/mvE6zzWs027xGs81rNNu8RrPN\nazTnXbd3H8fWJ9CYAH534wP/k2x8IfZFvPbqYyfaAyzfkwK84Ni+NwFfA9yQN3PezHkz53WbbV6j\n2eY1mm1eo9nmNZptXqM577q9+zq2PgH7AH5/44N+z/LcuMT3uI3zf3XxEPjZY9sPTnjdJf0/eTPn\nzWz0Gs02r9Fs8xrNNq/RbPMazTav0WzzGs151+3d53FAba0xxg8AP77881+BwzHG1dM0TWOMSzk2\nTwJ/srwHwNvHGHcdbZym6fBCL5qWM+assnnBZ7Z5wWe2ecFntnnBZ7Z5wWe2ecFntnnBZ7Z5wWe2\necFnzrtu777XgvB2e5j5i7UBbme+lf5HxhjXTNN0OMYYF/Mmy4f+AeCjy1NfwfzdK7uWzQs+s80L\nPrPNCz6zzQs+s80LPrPNCz6zzQs+s80LPrPNCz5z3nV797tpB25TNg/gRuAXOX8r/EeYT5qrl+3P\necs7y+3ywBuBz7FxS/3FvD5v5ryZ82Y2e41mm9dotnmNZpvXaLZ5jWab12jOu27vPo/uEN5y0zR9\nCngP8MvLU68B3sp8FeXo1vpnvYoynb9d/pMbTx8u23bqlnmbF3xmmxd8ZpsXfGabF3xmmxd8ZpsX\nfGabF3xmmxd8ZpsXfOa86/bucy0I70DTND3NZZ4wSy8BXrQ8fvzUJ3pK2bzgM9u84DPbvOAz27zg\nM9u84DPbvOAz27zgM9u84DPbvOAz5123d19rQXhHupwTZuP5o+9U+SzwiSs43cvO5gWf2eYFn9nm\nBZ/Z5gWf2eYFn9nmBZ/Z5gWf2eYFn9nmBZ8577q9e9m0A99b0Tg/gJt4Ht+3AtwKfHh5zX3bduTN\nbPUazTav0WzzGs02r9Fs8xrNNq/RbPMazTav0Zx33d59GlufQOMCB+USTxjghcv2TwOPAd+3PH+w\nbUvezEav0WzzGs02r9Fs8xrNNq/RbPMazTav0WzzGs151+3dl7H1CTROODDPfcIcbOz7bcA/LPu9\nF7hu2/PPm9nuNZptXqPZ5jWabV6j2eY1mm1eo9nmNZptXqM577q9+zC2PoHGsxyck0+Yazb2uQv4\n4LL9YeBl25533sx5vWab12i2eY1mm9dotnmNZpvXaLZ5jWab12jOu27vro+tT6DxHAfo5BPmRcDL\ngb9cnn8C+Nptzzdv5ryZbV6j2eY1mm1eo9nmNZptXqPZ5jWab
V6jOe+6vbs8tj6BxkUcpAufMO8E\nHlz+/RTwym3PM2/mvJmtXqPZ5jWabV6j2eY1mm1eo9nmNZptXqM577q9uzq2PoHGRR6oLz9hPrlx\norxq2/PLmzlvZrvXaLZ5jWab12i2eY1mm9dotnmNZpvXaM67bu8ujrEciNqDxhg3AfcwnzQHzCfK\nd03T9OhWJ3aFsnnBZ7Z5wWe2ecFntnnBZ7Z5wWe2ecFntnnBZ7Z5wWe2ecFnzrtu767VgvCeNca4\nEXgH8BbgddM0fWzLU7qi2bzgM9u84DPbvOAz27zgM9u84DPbvOAz27zgM9u84DPbvOAz5123d5dq\nQXgPG2PcABxM0/T0tudyFtm84DPbvOAz27zgM9u84DPbvOAz27zgM9u84DPbvOAz27zgM+ets6gF\n4aqqqqqqqqqqqipJB9ueQFVVVVVVVVVVVVWdTS0IV1VVVVVVVVVVVUlqQbiqqqqqqqqqqqpKUgvC\nVVVVVVVVVVVVVZJaEK6qqqqqqqqqqqqS1IJwVVVVVVVVVVVVlaQWhKuqqqqqqqqqqqoktSBcVVVV\nVVVVVVVVJakF4aqqqqqqqqqqqipJLQhXVVVVVVVVVVVVSWpBuKqqqqqqqqqqqkpSC8JVVVVVVVVV\nVVVVkloQrqqqqqqqqqqqqpLUgnBVVVVVVVVVVVWVpBaEq6qqqqqqqqqqqiS1IFxVVVVVVVVVVVUl\nqQXhqqqqqqqqqqqqKkktCFdVVVVVVVVVVVVJakG4qqqqqqqqqqqqSlILwlVVVVVVVVVVVVWSWhCu\nqqqqqqqqqqqqktSCcFVVVVVVVVVVVZWkFoSrqqqqqqqqqqqqJLUgXFVVVVVVVVVVVSXp/wDBv2CL\n9YhmagAAAABJRU5ErkJggg==\n", + "text/plain": [ + "" + ] + }, + "metadata": { + "image/png": { + "height": 387, + "width": 706 + } + }, + "output_type": "display_data" + } + ], + "source": [ + "fig, ax = plt.subplots(figsize=(8,4))\n", + "\n", + "mean, std = scaled_features['cnt']\n", + "predictions = network.run(test_features)*std + mean\n", + "ax.plot(predictions[0], label='Prediction')\n", + "ax.plot((test_targets['cnt']*std + mean).values, label='Data')\n", + "ax.set_xlim(right=len(predictions))\n", + "ax.legend()\n", + "\n", + "dates = pd.to_datetime(rides.ix[test_data.index]['dteday'])\n", + "dates = dates.apply(lambda d: d.strftime('%b %d'))\n", + "ax.set_xticks(np.arange(len(dates))[12::24])\n", + "_ = ax.set_xticklabels(dates[12::24], rotation=45)\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Thinking about your results\n", + " \n", + "Answer these questions about your results. How well does the model predict the data? Where does it fail? 
Why does it fail where it does?\n", + "\n", + "> **Note:** You can edit the text in this cell by double clicking on it. When you want to render the text, press control + enter\n", + "\n", + "#### Your answer below\n", + "- The model predicts the data reasonably well.\n", + "\n", + "- The model fails for the period of the Christmas holidays. \n", + "\n", + "- **Reason:** We can expect users to change behavior around Christmas time (family visits, holidays, etc.). We also have much less data for the Christmas holidays, which occur only once a year, than for the rest of the year. So the model is overfitted to normal days versus Christmas holidays. We would need data over several years to fit the model to these seasonal changes." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Unit tests\n", + "\n", + "Run these unit tests to check the correctness of your network implementation. These tests must all be successful to pass the project." + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + ".....\n", + "----------------------------------------------------------------------\n", + "Ran 5 tests in 0.005s\n", + "\n", + "OK\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 31, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "import unittest\n", + "\n", + "inputs = [0.5, -0.2, 0.1]\n", + "targets = [0.4]\n", + "test_w_i_h = np.array([[0.1, 0.4, -0.3], \n", + " [-0.2, 0.5, 0.2]])\n", + "test_w_h_o = np.array([[0.3, -0.1]])\n", + "\n", + "class TestMethods(unittest.TestCase):\n", + " \n", + " ##########\n", + " # Unit tests for data loading\n", + " ##########\n", + " \n", + " def test_data_path(self):\n", + " # Test that file path to dataset has been unaltered\n", + " self.assertTrue(data_path.lower() == 'bike-sharing-dataset/hour.csv')\n", + " \n", + " def test_data_loaded(self):\n", 
+ " # Test that data frame loaded\n", + " self.assertTrue(isinstance(rides, pd.DataFrame))\n", + " \n", + " ##########\n", + " # Unit tests for network functionality\n", + " ##########\n", + "\n", + " def test_activation(self):\n", + " network = NeuralNetwork(3, 2, 1, 0.5)\n", + " # Test that the activation function is a sigmoid\n", + " self.assertTrue(np.all(network.activation_function(0.5) == 1/(1+np.exp(-0.5))))\n", + "\n", + " def test_train(self):\n", + " # Test that weights are updated correctly on training\n", + " network = NeuralNetwork(3, 2, 1, 0.5)\n", + " network.weights_input_to_hidden = test_w_i_h.copy()\n", + " network.weights_hidden_to_output = test_w_h_o.copy()\n", + " \n", + " network.train(inputs, targets)\n", + " self.assertTrue(np.allclose(network.weights_hidden_to_output, \n", + " np.array([[ 0.37275328, -0.03172939]])))\n", + " self.assertTrue(np.allclose(network.weights_input_to_hidden,\n", + " np.array([[ 0.10562014, 0.39775194, -0.29887597],\n", + " [-0.20185996, 0.50074398, 0.19962801]])))\n", + "\n", + " def test_run(self):\n", + " # Test correctness of run method\n", + " network = NeuralNetwork(3, 2, 1, 0.5)\n", + " network.weights_input_to_hidden = test_w_i_h.copy()\n", + " network.weights_hidden_to_output = test_w_h_o.copy()\n", + "\n", + " self.assertTrue(np.allclose(network.run(inputs), 0.09998924))\n", + "\n", + "suite = unittest.TestLoader().loadTestsFromModule(TestMethods())\n", + "unittest.TextTestRunner().run(suite)" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "hide_input": false, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.1" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git 
a/first-neural-network/__MACOSX/DLND-your-first-network/._.DS_Store b/first-neural-network/__MACOSX/DLND-your-first-network/._.DS_Store new file mode 100644 index 0000000..09fa6bd Binary files /dev/null and b/first-neural-network/__MACOSX/DLND-your-first-network/._.DS_Store differ diff --git a/first-neural-network/dlnd-your-first-network.zip b/first-neural-network/dlnd-your-first-network.zip new file mode 100644 index 0000000..9942391 --- /dev/null +++ b/first-neural-network/dlnd-your-first-network.zip @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:346a4fc838c6474b144db51a5783842bfd0ddee5bd13e6faebad2dabcbf897e0 +size 293284 diff --git a/image-classification/image-classification/.ipynb_checkpoints/dlnd_image_classification-checkpoint.ipynb b/image-classification/image-classification/.ipynb_checkpoints/dlnd_image_classification-checkpoint.ipynb new file mode 100644 index 0000000..ccce8fb --- /dev/null +++ b/image-classification/image-classification/.ipynb_checkpoints/dlnd_image_classification-checkpoint.ipynb @@ -0,0 +1,1236 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "source": [ + "# Image Classification\n", + "In this project, you'll classify images from the [CIFAR-10 dataset](https://www.cs.toronto.edu/~kriz/cifar.html). The dataset consists of airplanes, dogs, cats, and other objects. You'll preprocess the images, then train a convolutional neural network on all the samples. The images need to be normalized and the labels need to be one-hot encoded. You'll get to apply what you've learned and build convolutional, max pooling, dropout, and fully connected layers. At the end, you'll get to see your neural network's predictions on the sample images.\n", + "## Get the Data\n", + "Run the following cell to download the [CIFAR-10 dataset for python](https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz)." 
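The notebook cells below download and extract the CIFAR-10 python batches before exploring them. As a minimal sketch of what one batch file contains (an illustration only, not the project's `helper` module; `load_cifar_batch` is a hypothetical name), each batch unpickles to a dict whose `data` rows store 1024 red, then 1024 green, then 1024 blue pixel values per image:

```python
import pickle

import numpy as np


def load_cifar_batch(path):
    """Unpickle one CIFAR-10 python batch and reshape images to (N, 32, 32, 3).

    Each row of batch['data'] holds 1024 R, 1024 G, then 1024 B values,
    so reshape to (N, 3, 32, 32) and move the channel axis last.
    """
    with open(path, 'rb') as f:
        # The official batch files are Python 2 pickles; latin1 decodes them in Python 3.
        batch = pickle.load(f, encoding='latin1')
    features = batch['data'].reshape((len(batch['data']), 3, 32, 32)).transpose(0, 2, 3, 1)
    labels = batch['labels']
    return features, labels
```

With this layout in mind, the stats printed below (shape `(32, 32, 3)`, values 0 to 255) follow directly from the raw row format.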
+ ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "All files found!\n" + ] + } + ], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "from urllib.request import urlretrieve\n", + "from os.path import isfile, isdir\n", + "from tqdm import tqdm\n", + "import problem_unittests as tests\n", + "import tarfile\n", + "\n", + "cifar10_dataset_folder_path = 'cifar-10-batches-py'\n", + "\n", + "class DLProgress(tqdm):\n", + " last_block = 0\n", + "\n", + " def hook(self, block_num=1, block_size=1, total_size=None):\n", + " self.total = total_size\n", + " self.update((block_num - self.last_block) * block_size)\n", + " self.last_block = block_num\n", + "\n", + "if not isfile('cifar-10-python.tar.gz'):\n", + " with DLProgress(unit='B', unit_scale=True, miniters=1, desc='CIFAR-10 Dataset') as pbar:\n", + " urlretrieve(\n", + " 'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz',\n", + " 'cifar-10-python.tar.gz',\n", + " pbar.hook)\n", + "\n", + "if not isdir(cifar10_dataset_folder_path):\n", + " with tarfile.open('cifar-10-python.tar.gz') as tar:\n", + " tar.extractall()\n", + " tar.close()\n", + "\n", + "\n", + "tests.test_folder_path(cifar10_dataset_folder_path)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Explore the Data\n", + "The dataset is broken into batches to prevent your machine from running out of memory. The CIFAR-10 dataset consists of 5 batches, named `data_batch_1`, `data_batch_2`, etc. 
Each batch contains labels and images, each belonging to one of the following classes:\n", + "* airplane\n", + "* automobile\n", + "* bird\n", + "* cat\n", + "* deer\n", + "* dog\n", + "* frog\n", + "* horse\n", + "* ship\n", + "* truck\n", + "\n", + "Understanding a dataset is part of making predictions on the data. Play around with the code cell below by changing the `batch_id` and `sample_id`. The `batch_id` is the id for a batch (1-5). The `sample_id` is the id for an image and label pair in the batch.\n", + "\n", + "Ask yourself \"What are all possible labels?\", \"What is the range of values for the image data?\", \"Are the labels in order or random?\". Answers to questions like these will help you preprocess the data and end up with better predictions." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true, + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "Stats of batch 1:\n", + "Samples: 10000\n", + "Label Counts: {0: 1005, 1: 974, 2: 1032, 3: 1016, 4: 999, 5: 937, 6: 1030, 7: 1001, 8: 1025, 9: 981}\n", + "First 20 Labels: [6, 9, 9, 4, 1, 1, 2, 7, 8, 3, 4, 7, 7, 2, 9, 9, 9, 3, 2, 6]\n", + "\n", + "Example of Image 32:\n", + "Image - Min Value: 0 Max Value: 255\n", + "Image - Shape: (32, 32, 3)\n", + "Label - Label Id: 1 Name: automobile\n" + ] + }, + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAfoAAAH0CAYAAADVH+85AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAWJQAAFiUBSVIk8AAAHOhJREFUeJzt3UmT3fd1HuDfnXoeMBATAY7gDA4iNZOSUpFlyZJiRamU\nbKtiV6Vc3uVrJF8gWSSLOE4pjsqJMyqRZDkVSZHFaIomkxQHkQRAAgRAkEA3uvv2dIcstLGX5wQO\npVPPs3/rdN/+3/vibvB2ptNpAwBq6r7TPwAA8DdH0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBF\nDwCFKXoAKEzRA0Bhih4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIUpegAorP9O/wB/U979t++bZnKd\nafwl2d7JXGrtlqML4cx999+duvXqubOp3M9/ciGc2V4fpW5NJpNwZnd3O3VraXkxldtO/LEHg0Hq\n1vxy/Pm44/ZbUrceff894czREwdSt3aGuTfMjas3wplv/8UrqVtnHv1IOHNotZe6NbxyPpV78fx6\nOHPhtRdTt/ZG8Y/T8TT+fm6ttX6nk8r9zqc+Gc588/s/TN167WL8c3E0zr0e62sbuRfkr/CNHgAK\nU/QAUJiiB4DCFD0AFKboAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoLCy63X3\nnLk9lZvpzYczw63cgtp4Es8Nh1upW6dOnkrljh0+Es7sD1PDgW3jRnydbGUl/vdqrbXFZG48iv/b\n+NChg6lb3Zn42tXJU7lbO/vxxcGLl+MLXq211m3xVb7WWnv17LVw5sqlzdSt+98VXxxcX3sjdevU\n8dzrsZb42LlxfSl168b2bjyU3Fyb7I1Tud1h/AWZTHKfVdNEbJxcr7sZfKMHgMIUPQAUpugBoDBF\nDwCFKXoAKEzRA0Bhih4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIWVHbV57qeXcsFRfK1gbj4+gNFa\na91ufPWh0/ZTtzY3hqnc/Q+djN/azg3vHDwaH9BZWs699sPNnVRufS0+vPPahTdTt2YXZsKZZ595\nPXVr0uLPx6HDy7lbo9zQzMxSfFhlcamXuvXTH34nnNl+O/eZ8+EnH0jlWmcxHOkOcq/HzDj+Puv3\nc98jN/dyn1W9Xvx3G43iY06ttba7G/8czt66GXyjB4DCFD0AFKboAaAwRQ8AhSl6AChM0QNAYYoe\nAApT9ABQmKIHgMIUPQAUpugBoDBFDwCFKXoAKKzset2l16+kcguz8cWwdngpdWvzRnylqRMfvGut\ntTbaz63evfbqOJyZ6c+nbr343F+GM6PudurWoCX+zq21bjf+b+P19fjiXWutves994Yzb15aT91a\nWp4NZ4bbuQXAmcEklVu7Hn+/ZBfDXks8i+O9vdStb7Tcz3jy6PFwpj+Ov59ba+2WuYVwZmM3t1I4\nTv6M40n8uZrGx0rTuck093vdDL7RA0Bhih4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIUpegAoTNED\nQGGKHgAKU/QAUJiiB4DCyo7adDu9VG5nJz7+MjfMjVlsJEZtjp84mLo1HuVGXK5c2QhnVlZyr8fs\nTPxxXF1aTt16az3+e7XWWmcaf64WVnKv/dlz58KZQX8xdWt3J76WtDqbe+0PHszlZjrx0ZLTx3KD\nU+1M/NbFS5dTpw4eP5TKPfDQI+HM/HLutT9z8rZw5itf+0rq1reefjqVG3QySzO5QaHMtlgnu0h2\nE/hGDwCFKXoAKEzRA0Bhih4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIUpegAoTNEDQGGKHgAKU/QA\nUFjZ9bppZ5zKzS/El8auvrWeutXvx1/+a9c2U7e6g1SsTRL/FLyxG18AbK211cX4itfG+lrqVq+b\ne/Q7Lf67HVo4kLr14H2PhjN333Y6devOO+K5Rx98KHVrMo4vw
7XW2nMvvBrOnH/9XOrWhcuXwpnu\n8pHUrW4n933rtYvxz50nP/H+1K3f+MwnwpmDR46nbh0/cWsqdygzWJp87aeZpbx3kG/0AFCYogeA\nwhQ9ABSm6AGgMEUPAIUpegAoTNEDQGGKHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8AhZVdr5tM9lK5\n/iA+8zY3n5uGm1+YD2d2hru5W/O5P/XScnx5bX97K3WrM44vDo4nuaW8yV5u3fDUrUfDmc/93j9O\n3VpYja94DXd2UrfeTqzyPXM19xpeffP1VO7shQvhzKS7mrrVWR2GM8dO3p66tbq8lMp996tfC2fO\nfed7qVuHf/e3wplPffrTqVvvfvjxVO5LX/ijcCb5MdB6M/GV02575xbvfKMHgMIUPQAUpugBoDBF\nDwCFKXoAKEzRA0Bhih4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIWVHbVZWZlN5fZ3R+FMfyb376Wd\nnc1wZnYuN6CzuLCYyi3PL4cz3ZncrdEoMUQ0yI0XDXdzwzuP/vrvhjM7h+5M3Xrl0vlwZrSXG/np\n7cZfx2+ejf98rbW2tXE9lZufjQ/23H/mqdStpz76aDhz62p8AKq11k6fio8XtdbayeX4x/cX/91/\nTN360/8Uzz3x7vekbq3M5T4/pi3+DM914p/3rbXW73XCmZ1JPHOz+EYPAIUpegAoTNEDQGGKHgAK\nU/QAUJiiB4DCFD0AFKboAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQWNn1uu3NaSq3mli9G7dJ6lYb\nxJfoet3c7zXbnUnlVuZWw5n5+dxjtb72VjzUif98rbV2z633pnJ33f+RcGZjM/F7tdZ6k/ha29XM\na9hamxvFn+H9xOJda60Nd3MLe0ePx9fhFuZz32Uurr0dzly/vpG6dXA59wy/78kPhTN//Cf/PnXr\n3/zhH4cz21evpW59+H1PpHIrM/HPxsk49yxO9nbDmcFs7jP4ZvCNHgAKU/QAUJiiB4DCFD0AFKbo\nAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUVnbUZnaul8pNJ6Nwpj/I/Xup10v8jKPc\nrSOHDqdy994dH3/5Pz/7VurWXHc+nJnGd19aa63d/uCHU7leP/4z7m1fSd2aTMfhzGwnPpTUWmuj\nyXY4s92J/3yttdbZ30zlThy9L5x57K5bU7fmluLvzRdfeD5165994Y9SuV6Lf1bt7+VGXIYr8aq4\nvJEb+fmnf/ivUrnpXvx5HC0cSd3qXbsRzky6yfGzm8A3egAoTNEDQGGKHgAKU/QAUJiiB4DCFD0A\nFKboAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMLKrtetHJhN5Sbj+GrV6txi6tb+ZCucmXY6\nqVvd3GhVW2rxe6cPnUrdmszEM8OZ3Erh8u1nUrndjbVwZm9rL3WruxtfeRuMh6lb+9P4v/mXu7nv\nCZc34s99a62d/WF8He76xQupW6P96+HM1vV4prXWzl1eT+WmcyvhzOyxu1O3JisL4cyfPftq6tb6\nZm71btCPP48nzzycurWzEf+b7Wzlno+bwTd6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBF\nDwCFKXoAKEzRA0Bhih4AClP0AFCYogeAwsqu181Ml1O58XQ7cWspdatNBuFIL/kXW9/cTeWef/W1\ncGZp5Wjq1upCfCnv/rtuT92a6+T+jbu8HM8dWTqeurW1Pg5nZie51cbdcXyBcXjyUOrWjXsOp3Jt\ndxqOvJxcr9vdjr8312dyz+KR07k3dTexLLm1cS11a/d6YrVxL7dS2EsuUmZWIg8fXE3d+oPf/51w\nZr6f+71uBt/oAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBFDwCFKXoAKEzR\nA0BhhUdtZlK57nw8tz/Kv
Yz9+fhoyWhjJ3Vrvhu/1Vprc5ONcObobHxso7XWjpw4E87c/eRHUreO\n3n46lZvrz4Uzvfn51K29ySPhzKAzSd3qT0bhzPY4d+v6MH6rtdZ+9NNXwpn95Pvl+valcGZ3ez11\na28t/h5rrbXxeD+c6eTemm22E/+bLS/Eh4Faa224Fx+naa21lbn4e3Oxn/vsPn1LZrgr93rcDL7R\nA0Bhih4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIUpegAoTNEDQGGKHgAKU/QAUJiiB4DCFD0AFFZ2\nvW60P03llhKvyNGV3BrXPbfE15bufNcdqVsH7noolTv1yOPxW6fuT93aWDoezuxOc8tw452tVG6v\nF1+g6kxzK28zidW7nf3credfejmc+ca3vpe69dxzL6Zyw7X4otygm1tCm+3E1x6nrZe6Ne3mPoZn\n5+LP4tJ87tbCIP67jce575Gd4WYqtzyTeP2nuZ/x0HgYzjxx6ljq1s3gGz0AFKboAaAwRQ8AhSl6\nAChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBFDwCFKXoAKKzsqM0/+vSjqdypIwfCmaXb707d\n6twaH5qZPXZP6tbc6uFUbtKfCWe2b2ynbg3mF8KZ7n58fKS11jYTv1drrW2P46Mxo+3c6NF3n/5x\nOPP0t59O3bp8/vVwZrx7I3Vr0N9J5VZm40NVg2l8OKq11rrd+KDQYNBJ3TqQGKdprbVOiz/7s73k\nR37iuX/2Zz9Lndrf303lHnvgvnCm08m9N4984EPhzDPf/0bq1s2YwvGNHgAKU/QAUJiiB4DCFD0A\nFKboAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoLCy63X3Pracyr29Ec/1TzyZ\nunX/g4+HM71pbq1t0I2vT/0iF890D8SXv1prrZ8Y/xr1UqfalWHudXzp+WfDma//4JnUrS9/50fh\nzGo/92/35V58xaszl7vV66+kcguJB2QuNyjXpm0/nOlM4+t6rbU2M87ltifxv9ne/l7q1nQcf790\ndtZTtzbfvp7K9c48HM7s7G6lbn3jm98NZ/7W4aXUrZvBN3oAKEzRA0Bhih4AClP0AFCYogeAwhQ9\nABSm6AGgMEUPAIUpegAoTNEDQGGKHgAKU/QAUFjZUZs/f+GVVO5Y965w5vLP/kPq1qt3fS+cWVg9\nnLrVm+RGbVov/ojsj3KDMaNx/GfcHe2kbp0//1oqt70RH8GYX1hI3bpvNZ4ZzOZuXV9bC2emndx4\n0YF+bsRl0I8/H9vJEZfRfvwZTuy+tNZa2xrlfsY2jb8e/ZncR/5M4ivhiVtPpm69+eabqdzzLzwX\nzty7upi6Nfn5T8KZw6d/PXXrZvCNHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8AhSl6AChM0QNAYYoe\nAApT9ABQmKIHgMIUPQAUpugBoLCy63WPd3PLSRfX42top+58PHXrpbOJhb3xudSt0X5uMaw/OxPO\nzC/kVs0mrRPODGZ6qVudlns9Tt93bzhz4sSJ1K3nv/SlcGa8fjl1q9uL/50vXLmYurW3nFvYW+jH\n/9aj8Sh1q5cYe+x2ct+bBv1krjsIZ3JPfWvb6zfimd3cnN+Dd55K5X7tA/HP4ScfeTB1a+XALeHM\ny8P91K3TqdRf5xs9ABSm6AGgMEUPAIUpegAoTNEDQGGKHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8A\nhSl6AChM0QNAYWXX6774P19K5V7bmQ1nfv/3PpS6dfcdd4QzL59/PXVrtLuRyrVx/N+C45291Kl+\nN75O1pnk9rhOHjySyt1xa3wVcX+SW/Ga68cX5db2creuvH4lnHnpf389devE3fEFwNZau/+RM+FM\nfA/xFwYz8WW43iCeaa21/iT3U7557e14Zi2eaa21+JPY2vIg/lnaWmt/8A8+n8p9IPF8vHF
tM3Xr\n7c34mt8gsQR6s/hGDwCFKXoAKEzRA0Bhih4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIUpegAoTNED\nQGGKHgAKKztqs3Xi0VTus+96KJw5tLqYuvXi2fjAxGAm9yebG6ymcqPEaMw4+e/HaWLbYzTNjdq8\nfPFCKvfKm5fCmSceeyx1a3k2PvKzfmMndWv/enzU5tSRQ6lba5dzr3334QfDmf7sXOrWeG8/nLmx\ndj11a3O4lcrNJT4L7j5yPHXr6uX4cz+e5p7F85eupnLjyQvhTKeNUrcyA0aTrdzIz83gGz0AFKbo\nAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBFDwCFKXoAKEzRA0BhnWly/euX\n3U++/fXUL3Zw6UA4c2WSW5T76v/4Sjhz9ODB1K0rV9ZSucl4Es4sLebW/DqT+JLUzl5ufWq0v5fK\nXVu7Fs58/JOfSt361//5v4czw/XLqVv7198KZ+Z749Stbzz9g1Tukfc/Fc7sT+PPb2utbe/shjMr\nSwupW7et5pYlN6/H39M///nPU7cuX44/V3fefzp1660rufW6aYuvPXb7yTXQ+aVwZmsv9yxeOfd8\nYtfzr/ONHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAU\nlvsf/X8VDHPDCD+6+Eo4szV/d+rWmUfeG84sHlhO3brjkdwuwuxMfKBmcTb3WG0Ph/HMznbq1nw/\n93pk7g2ns6lbt55+JJy59MJG6tYbF86HM/sz86lbg27u+bj6xsVw5vCxY6lbJw8lxqP2d1K3fvqT\nH6Vy587F/2Z7e/upW91B/P1y9tWzqVt7u7mfsT8fHxXqDXLvzY2t+GfV8WNHU7duBt/oAaAwRQ8A\nhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBFDwCFKXoAKEzRA0Bhih4ACiu7Xje3FV92\naq21u8bjcOaff/G/pG4d6cRf/jseeCB1a300TeWe+fGz4cy0P0jdeuKDT4Uz8zO91K252dxq1exM\n/HcbjkepW4dvXApnvvPKS6lb40n8dRx0ct8T+v1c7paV+HLjQif33L/8zE/DmTfeeCN1a28/93x0\nEgOM3V7ute9M48cmk9xC5OLqgVRufzwJZxbmcp8DH3nfE+HM3/3Ex1K3bgbf6AGgMEUPAIUpegAo\nTNEDQGGKHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8AhSl6AChM0QNAYYoeAAoru153ZHE+lescXAln\nTiYyrbV2aBL/d9Z0mlvjGu/tp3J3nDwczqzv537G7Y0L4cxkJv7ztdbaW1evp3K9TnzdcH5pMXXr\na9/8ejhz6cqV1K27jh0NZ+Z68bWw1lrr5UbN2ksvvhjObNy4kbo1SqxYdgYzqVu95KJcm8Z/xu7M\nXOrU3NLBcGZvmvvMmUxyz9Vj990bzvz9z3wydevxRx8OZ3Z3dlK3bgbf6AGgMEUPAIUpegAoTNED\nQGGKHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8AhSl6AChM0QNAYWVHbTZGuQGBrb3NcOZzv/lE6tZk\nNBvO/Omf/SB16wtf/otU7v2PPR7O9FcWUree/vL/CmcWZwapWwcOLKdy68PtcGZnPzfS8fLLr4Qz\ne4kxltZa21qOD+/s749St/YnudGj69fXwpnsYEwqlxiZaa217iA5NLMQH+6attyi0Gzi5XjfQ2dS\nt97/ntzn6Xsff1c4s5wcnNraHIYz49xjf1P4Rg8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAU\npugBoDBFDwCFKXoAKEzRA0Bhih4AClP0AFBY2fW6yxvXU7lhYoDqxo3cy/jauXPhzHPnL6ZuvfHm\nhVTuT/7b+XDm9C2rqVv/5B9+LpzZnosveLXW2refiy/
DtdbaM8+8FM6ce/NK6tYgsTQ2GOTW/Hoz\nM+HMrbfdmbr1l8/+LJXrdOPfS5JDeW3Qj7+nF5dyq42tn8vt7O2GM6eOHk7d+u3PfCqc+dhTT6Vu\n9Qa9VO7G1lY4s7mVWzlt/fj7bPwOztf5Rg8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugB\noDBFDwCFKXoAKEzRA0Bhih4ACis7avPMS8NU7tlXLoUzP3725dStC2+8Hs48ccfx1K3bbjmWyp27\ncjWcueXEydSts9fXw5nvn42PzLTW2rd+9Ewqt7a5Hc70Wm6kY9KJLyyNxqPUrYsX4qNHN9Y3UrfG\no71UbjCID+/MzM2lbvVn4rlRcrNkaSb3MfyJD703nPnN3/h46tapk/H39HAnNxizu5nLdXvx17HX\ny33XHe7Gf8bdndxzfzP4Rg8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBFDwCFKXoA\nKEzRA0Bhih4AClP0AFBYZzpNTi79krv/zltTv9i1tfiC2pGF2cyp9skPJtanPvxU6ta//OZ3U7l/\n+1+/msr9f9PJLcN1+53cuUSsn/z39LQljnVzv1fmJ+z3c6trk8zv1Vqb9uPrdd1u7rWf7cdz73ns\n4dStT3/0o6ncww/cF86MJpPUrb39+JLiOPEattZat5PLdRK54dZm6tZoFF+JXFhYSN367G99PveG\n+St8oweAwhQ9ABSm6AGgMEUPAIUpegAoTNEDQGGKHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8AhSl6\nACgsNz/1K+Dv3Xt3KnfX6ZPhzL2PP5G6tXrkVDiztT1M3XrivrtSubVf+0g4c/Hym6lbmxvxJalr\na2upW1vbO6nc7l58tSq++/UOyOxjTQapU/OLi6nc4cX4SuSxo0dSt/7Oxz8Wznzkgx9M3er3c6/j\n+o34+2U0ya2VzgziP2M3OYw6neYW9tY34suj2XXDxaWlcKaTmb68SXyjB4DCFD0AFKboAaAwRQ8A\nhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBFDwCFdabT5PLAL7lXvvgvUr/Y1vxKODNM\nDkVsb8VHKQbd3ODDXD+3X5SZflmP77601lq7lhjpuPzm1dStreF2KreztxfObAy3Ure2tuIDRtPk\nszjciN8aTXJ/6EfOPJTK3XP7HeHM8oHl1K2DKwfDmdE499rvJUdcJi1+r5P+uI8PsuyNc8/H9k7u\nvdnr9cKZ+bm51K1Mb2bfm5/97c//P6/h+EYPAIUpegAoTNEDQGGKHgAKU/QAUJiiB4DCFD0AFKbo\nAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQWG7S7FfAueXFVG5vZxwPbe+nbmWmpLYHg9SptVFuOWmS\nWCjrJG8dO7gaztx24njq1iSxxtVaa4OZmXBmZjaeaa21lli7Gu/lnsXOOL6g1u3kXsPpNPEea60N\np/F1st3kwt76dnxBrd/PvTcTI3SttdY6vcTr3819txtux9cNd3Yy25etLSwspHKDxOs/HeeexdR6\n3Tu4FOsbPQAUpugBoDBFDwCFKXoAKEzRA0Bhih4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIUpegAo\nrOyoTdvKxbqj+ChIdyY+ttFaa60XH97pd3J/sk52gGQSHzsZDXJDIq0b/xn39nK3sgMTu8P42ElL\n3honBjdyf+XWOjOJQZZe7nvCJLniMu0mhkT68ef3F7n4EFF2KCnzHvtFMB7Z2t5MnRqN4u+zxcXc\nsFg3+VyNxvGfsZf8XOwk/taTSW5A52bwjR4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIUpegAoTNED\nQGGKHgAKU/QAUJiiB4DCFD0AFKboAaCwTnbFCwD45ecbPQAUpugBoDBFDwCFKXoAKEzRA0Bhih4A\nClP0AFCYogeAwhQ
9ABSm6AGgMEUPAIUpegAoTNEDQGGKHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8A\nhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBFDwCFKXoAKEzRA0Bhih4AClP0AFCYogeA\nwhQ9ABSm6AGgMEUPAIUpegAoTNEDQGGKHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8AhSl6AChM0QNA\nYYoeAAr7v3To37eZkHdPAAAAAElFTkSuQmCC\n", + "text/plain": [ + "" + ] + }, + "metadata": { + "image/png": { + "height": 250, + "width": 253 + } + }, + "output_type": "display_data" + } + ], + "source": [ + "%matplotlib inline\n", + "%config InlineBackend.figure_format = 'retina'\n", + "\n", + "import helper\n", + "import numpy as np\n", + "\n", + "# Explore the dataset\n", + "batch_id = 1\n", + "sample_id = 32\n", + "helper.display_stats(cifar10_dataset_folder_path, batch_id, sample_id)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Implement Preprocess Functions\n", + "### Normalize\n", + "In the cell below, implement the `normalize` function to take in image data, `x`, and return it as a normalized Numpy array. The values should be in the range of 0 to 1, inclusive. The return object should be the same shape as `x`." + ] + }, + { + "cell_type": "code", + "execution_count": 171, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "def normalize(x):\n", + " \"\"\"\n", + " Normalize a list of sample image data in the range of 0 to 1\n", + " : x: List of image data. 
The image shape is (32, 32, 3)\n", + " : return: Numpy array of normalized data\n", + " \"\"\"\n", + " return (x - x.min()) / (x.max() - x.min()) # This is local (per-batch) normalization\n", + " # Note: dividing by a fixed global maximum such as 255 would be better,\n", + " # so that the normalization does not depend on each input's own maximum\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_normalize(normalize)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### One-hot encode\n", + "Just like the previous code cell, you'll be implementing a function for preprocessing. This time, you'll implement the `one_hot_encode` function. The input, `x`, is a list of labels. Implement the function to return the list of labels as a one-hot encoded Numpy array. The possible values for labels are 0 to 9. The one-hot encoding function should return the same encoding for each value between each call to `one_hot_encode`. Make sure to save the map of encodings outside the function.\n", + "\n", + "Hint: Don't reinvent the wheel." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "encodings = {}\n", + "for i in range(10):\n", + " v = np.zeros(10)\n", + " v[i] = 1\n", + " encodings[i] = v\n", + "\n", + "def one_hot_encode(x): # I could use LabelBinarizer to scale to larger label sets\n", + " \"\"\"\n", + " One-hot encode a list of sample labels. 
Return a one-hot encoded vector for each label.\n", + " : x: List of sample labels\n", + " : return: Numpy array of one-hot encoded labels\n", + " \"\"\"\n", + " assert np.max(x) < 10 and np.min(x) >= 0, 'label must be between 0 and 9'\n", + " encoded = np.array(list(map(lambda v: encodings[v], x)))\n", + " return encoded\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_one_hot_encode(one_hot_encode)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Randomize Data\n", + "As you saw from exploring the data above, the order of the samples is randomized. It doesn't hurt to randomize it again, but you don't need to for this dataset." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Preprocess all the data and save it\n", + "Running the code cell below will preprocess all the CIFAR-10 data and save it to a file. The code below also uses 10% of the training data for validation." + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "# Preprocess Training, Validation, and Testing Data\n", + "helper.preprocess_and_save_data(cifar10_dataset_folder_path, normalize, one_hot_encode)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# Checkpoint\n", + "This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk."
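The checkpoint works because `helper.preprocess_and_save_data` pickles `(features, labels)` tuples to disk. As a minimal sketch of that roundtrip (using a temporary file and small random stand-in arrays, not the real CIFAR-10 data):

```python
import os
import pickle
import tempfile

import numpy as np

# Stand-in arrays with the same shapes the notebook uses:
# (N, 32, 32, 3) normalized images and (N, 10) one-hot labels.
features = np.random.rand(5, 32, 32, 3).astype(np.float32)
labels = np.eye(10)[np.random.randint(0, 10, size=5)]

path = os.path.join(tempfile.mkdtemp(), 'preprocess_validation.p')
with open(path, 'wb') as f:
    pickle.dump((features, labels), f)

# After a kernel restart, this single load is all that's needed to resume.
with open(path, 'rb') as f:
    valid_features, valid_labels = pickle.load(f)

print(valid_features.shape)  # (5, 32, 32, 3)
print(valid_labels.shape)    # (5, 10)
```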
+ ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "import pickle\n", + "import problem_unittests as tests\n", + "import helper\n", + "\n", + "# Load the Preprocessed Validation data\n", + "valid_features, valid_labels = pickle.load(open('preprocess_validation.p', mode='rb'))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Build the network\n", + "For the neural network, you'll build each layer into a function. Most of the code you've seen has been outside of functions. To test your code more thoroughly, we require that you put each layer in a function. This allows us to give you better feedback and test for simple mistakes using our unittests before you submit your project.\n", + "\n", + "If you're finding it hard to dedicate enough time to this course each week, we've provided a small shortcut to this part of the project. In the next couple of problems, you'll have the option to use [TensorFlow Layers](https://www.tensorflow.org/api_docs/python/tf/layers) or [TensorFlow Layers (contrib)](https://www.tensorflow.org/api_guides/python/contrib.layers) to build each layer, except the \"Convolutional & Max Pooling\" layer. TF Layers is similar to Keras's and TFLearn's layer abstractions, so it's easy to pick up.\n", + "\n", + "If you would like to get the most out of this course, try to solve all the problems without TF Layers. Let's begin!\n", + "### Input\n", + "The neural network needs to read the image data, one-hot encoded labels, and dropout keep probability. 
Implement the following functions:\n", + "* Implement `neural_net_image_input`\n", + " * Return a [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder)\n", + " * Set the shape using `image_shape` with batch size set to `None`.\n", + " * Name the TensorFlow placeholder \"x\" using the TensorFlow `name` parameter in the [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder).\n", + "* Implement `neural_net_label_input`\n", + " * Return a [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder)\n", + " * Set the shape using `n_classes` with batch size set to `None`.\n", + " * Name the TensorFlow placeholder \"y\" using the TensorFlow `name` parameter in the [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder).\n", + "* Implement `neural_net_keep_prob_input`\n", + " * Return a [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder) for dropout keep probability.\n", + " * Name the TensorFlow placeholder \"keep_prob\" using the TensorFlow `name` parameter in the [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder).\n", + "\n", + "These names will be used at the end of the project to load your saved model.\n", + "\n", + "Note: `None` for shapes in TensorFlow allows for a dynamic size." 
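The only subtlety here is the shape argument: prepending `None` leaves the batch dimension dynamic. A quick sketch of the shape lists the three placeholders end up with (plain Python, no TensorFlow needed):

```python
image_shape = (32, 32, 3)
n_classes = 10

x_shape = [None] + list(image_shape)  # batch size left dynamic
y_shape = [None, n_classes]
keep_prob_shape = None                # keep_prob is a scalar: no shape argument at all

print(x_shape)  # [None, 32, 32, 3]
print(y_shape)  # [None, 10]
```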
+ ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Image Input Tests Passed.\n", + "Label Input Tests Passed.\n", + "Keep Prob Tests Passed.\n" + ] + } + ], + "source": [ + "import tensorflow as tf\n", + "\n", + "def neural_net_image_input(image_shape):\n", + " \"\"\"\n", + " Return a Tensor for a batch of image input\n", + " : image_shape: Shape of the images\n", + " : return: Tensor for image input.\n", + " \"\"\"\n", + " # shape [None, 32, 32, 3]\n", + " return tf.placeholder(tf.float32, [None] + list(image_shape), name='x')\n", + "\n", + "\n", + "\n", + "def neural_net_label_input(n_classes):\n", + " \"\"\"\n", + " Return a Tensor for a batch of label input\n", + " : n_classes: Number of classes\n", + " : return: Tensor for label input.\n", + " \"\"\"\n", + " return tf.placeholder(tf.float32, [None, n_classes], name='y')\n", + "\n", + "\n", + "def neural_net_keep_prob_input():\n", + " \"\"\"\n", + " Return a Tensor for keep probability\n", + " : return: Tensor for keep probability.\n", + " \"\"\"\n", + " return tf.placeholder(tf.float32, name='keep_prob')\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tf.reset_default_graph()\n", + "tests.test_nn_image_inputs(neural_net_image_input)\n", + "tests.test_nn_label_inputs(neural_net_label_input)\n", + "tests.test_nn_keep_prob_inputs(neural_net_keep_prob_input)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Convolution and Max Pooling Layer\n", + "Convolutional layers have had a lot of success with images. 
For this code cell, you should implement the function `conv2d_maxpool` to apply convolution then max pooling:\n", + "* Create the weight and bias using `conv_ksize`, `conv_num_outputs` and the shape of `x_tensor`.\n", + "* Apply a convolution to `x_tensor` using weight and `conv_strides`.\n", + " * We recommend you use same padding, but you're welcome to use any padding.\n", + "* Add bias\n", + "* Add a nonlinear activation to the convolution.\n", + "* Apply Max Pooling using `pool_ksize` and `pool_strides`.\n", + " * We recommend you use same padding, but you're welcome to use any padding.\n", + "\n", + "Note: You **can't** use [TensorFlow Layers](https://www.tensorflow.org/api_docs/python/tf/layers) or [TensorFlow Layers (contrib)](https://www.tensorflow.org/api_guides/python/contrib.layers) for this layer. You're free to use any TensorFlow package for all the other layers." + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true, + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "def conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides):\n", + " \"\"\"\n", + " Apply convolution then max pooling to x_tensor\n", + " :param x_tensor: TensorFlow Tensor\n", + " :param conv_num_outputs: Number of outputs for the convolutional layer\n", + " :param conv_ksize: Kernel size 2-D Tuple for convolution\n", + " :param conv_strides: Stride 2-D Tuple for convolution\n", + " :param pool_ksize: Kernel size 2-D Tuple for pool\n", + " :param pool_strides: Stride 2-D Tuple for pool\n", + " : return: A tensor that represents convolution and max pooling of x_tensor\n", + " \"\"\"\n", + " w_conv = tf.Variable(tf.truncated_normal((conv_ksize[0], # Kernel height\n", + " conv_ksize[1], # Kernel width\n", + " x_tensor.get_shape()[3].value, # Input channels\n", + " conv_num_outputs), stddev=0.05)) # output depth\n", + " bias_conv = 
tf.Variable(tf.random_normal([conv_num_outputs], stddev=0.05))\n", + " \n", + " padding = 'VALID'\n", + " \n", + " \n", + " # DEBUG\n", + " #print('conv ksize: ', conv_ksize)\n", + " #print('conv strides: ',conv_strides)\n", + " #print('pool ksize: ', pool_ksize)\n", + " #print('pool strides: ',pool_strides)\n", + " #print('conv_depth: ', conv_num_outputs)\n", + " #print('x_tensor :',x_tensor)\n", + " #print('conv weights :', w_conv) \n", + " # DEBUG\n", + " \n", + " \n", + " # Apply conv to x_tensor\n", + " cv1 = tf.nn.conv2d(x_tensor,\n", + " w_conv,\n", + " [1, conv_strides[0], conv_strides[1], 1],\n", + " padding=padding\n", + " )\n", + " # add bias\n", + " cv1 = tf.nn.bias_add(cv1, bias_conv)\n", + " \n", + " # Apply relu\n", + " cv1 = tf.nn.relu(cv1)\n", + " \n", + " # Apply Maxpooling\n", + " cv1 = tf.nn.max_pool(cv1,\n", + " ksize=[1,pool_ksize[0], pool_ksize[1], 1],\n", + " strides=[1, pool_strides[0], pool_strides[1], 1],\n", + " padding=padding\n", + " )\n", + " \n", + " \n", + " \n", + " #print('conv layer :', cv1.get_shape())\n", + " return cv1 \n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_con_pool(conv2d_maxpool)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Flatten Layer\n", + "Implement the `flatten` function to change the dimension of `x_tensor` from a 4-D tensor to a 2-D tensor. The output should be the shape (*Batch Size*, *Flattened Image Size*). You can use [TensorFlow Layers](https://www.tensorflow.org/api_docs/python/tf/layers) or [TensorFlow Layers (contrib)](https://www.tensorflow.org/api_guides/python/contrib.layers) for this layer." 
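Flattening is just a reshape that collapses everything except the batch dimension. A NumPy sketch of the arithmetic, using the (?, 2, 2, 400) shape the conv stack in this notebook produces:

```python
import numpy as np

batch = np.random.rand(16, 2, 2, 400)      # e.g. output of the third conv/pool layer
flat_size = int(np.prod(batch.shape[1:]))  # 2 * 2 * 400 = 1600
flat = batch.reshape(-1, flat_size)        # -1 keeps the batch dimension dynamic

print(flat.shape)  # (16, 1600)
```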
+ ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "\n", + "def flatten(x_tensor):\n", + " \"\"\"\n", + " Flatten x_tensor to (Batch Size, Flattened Image Size)\n", + " : x_tensor: A tensor of size (Batch Size, ...), where ... are the image dimensions.\n", + " : return: A tensor of size (Batch Size, Flattened Image Size).\n", + " \"\"\"\n", + " flat_size = np.multiply.reduce(x_tensor.get_shape()[1:].as_list())\n", + " return tf.reshape(x_tensor, [-1, flat_size])\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_flatten(flatten)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Fully-Connected Layer\n", + "Implement the `fully_conn` function to apply a fully connected layer to `x_tensor` with the shape (*Batch Size*, *num_outputs*). You can use [TensorFlow Layers](https://www.tensorflow.org/api_docs/python/tf/layers) or [TensorFlow Layers (contrib)](https://www.tensorflow.org/api_guides/python/contrib.layers) for this layer." 
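Under the hood a fully connected layer is a matrix multiply, a bias add, and a nonlinearity. A NumPy sketch of the same computation (with weights drawn small at random, the way the TensorFlow code in this notebook initializes them):

```python
import numpy as np

rng = np.random.RandomState(0)
x = rng.rand(16, 1600).astype(np.float32)             # a flattened batch
W = (rng.randn(1600, 600) * 0.05).astype(np.float32)  # small random weights
b = np.zeros(600, dtype=np.float32)

out = np.maximum(x @ W + b, 0.0)  # relu(x W + b)
print(out.shape)  # (16, 600)
```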
+ ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "def fully_conn(x_tensor, num_outputs):\n", + " \"\"\"\n", + " Apply a fully connected layer to x_tensor using weight and bias\n", + " : x_tensor: A 2-D tensor where the first dimension is batch size.\n", + " : num_outputs: The number of output that the new tensor should be.\n", + " : return: A 2-D tensor where the second dimension is num_outputs.\n", + " \"\"\"\n", + " full_w = tf.Variable(tf.truncated_normal((x_tensor.get_shape()[1].value, num_outputs), stddev=0.05))\n", + " full_b = tf.Variable(tf.random_normal([num_outputs], stddev=0.05))\n", + " \n", + " full = tf.add(tf.matmul(x_tensor, full_w), full_b)\n", + " full = tf.nn.relu(full)\n", + " return full\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_fully_conn(fully_conn)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Output Layer\n", + "Implement the `output` function to apply a fully connected layer to `x_tensor` with the shape (*Batch Size*, *num_outputs*). You can use [TensorFlow Layers](https://www.tensorflow.org/api_docs/python/tf/layers) or [TensorFlow Layers (contrib)](https://www.tensorflow.org/api_guides/python/contrib.layers) for this layer.\n", + "\n", + "Note: Activation, softmax, or cross entropy shouldn't be applied to this." 
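The reason the output layer must stay linear is that the loss function used later, `tf.nn.softmax_cross_entropy_with_logits`, applies a numerically stable softmax internally; applying one yourself would double it up. For reference, this is the softmax the loss computes from the raw logits (a NumPy sketch with made-up logit values):

```python
import numpy as np

logits = np.array([[2.0, 1.0, 0.1]])

# Stable softmax: subtract the row max before exponentiating.
shifted = logits - logits.max(axis=1, keepdims=True)
probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)

print(probs.round(3))  # rows sum to 1; the largest logit gets the largest probability
```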
+ ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "def output(x_tensor, num_outputs):\n", + " \"\"\"\n", + " Apply a output layer to x_tensor using weight and bias\n", + " : x_tensor: A 2-D tensor where the first dimension is batch size.\n", + " : num_outputs: The number of output that the new tensor should be.\n", + " : return: A 2-D tensor where the second dimension is num_outputs.\n", + " \"\"\"\n", + " out_w = tf.Variable(tf.truncated_normal((x_tensor.get_shape()[1].value, num_outputs)))\n", + " out_b = tf.Variable(tf.zeros(num_outputs))\n", + " \n", + " out = tf.add(tf.matmul(x_tensor, out_w), out_b)\n", + " return out\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_output(output)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Create Convolutional Model\n", + "Implement the function `conv_net` to create a convolutional neural network model. The function takes in a batch of images, `x`, and outputs logits. Use the layers you created above to create this model:\n", + "\n", + "* Apply 1, 2, or 3 Convolution and Max Pool layers\n", + "* Apply a Flatten Layer\n", + "* Apply 1, 2, or 3 Fully Connected Layers\n", + "* Apply an Output Layer\n", + "* Return the output\n", + "* Apply [TensorFlow's Dropout](https://www.tensorflow.org/api_docs/python/tf/nn/dropout) to one or more layers in the model using `keep_prob`. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 50, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "--------------\n", + "Tensor(\"x:0\", shape=(?, 32, 32, 3), dtype=float32)\n", + "Tensor(\"MaxPool:0\", shape=(?, 15, 15, 60), dtype=float32)\n", + "Tensor(\"MaxPool_1:0\", shape=(?, 6, 6, 120), dtype=float32)\n", + "Tensor(\"MaxPool_2:0\", shape=(?, 2, 2, 400), dtype=float32)\n", + "Tensor(\"Reshape:0\", shape=(?, 1600), dtype=float32)\n", + "Tensor(\"dropout/mul:0\", shape=(?, 600), dtype=float32)\n", + "Tensor(\"dropout_1/mul:0\", shape=(?, 60), dtype=float32)\n", + "Tensor(\"Add_2:0\", shape=(?, 10), dtype=float32)\n", + "--------------\n", + "Tensor(\"Placeholder:0\", shape=(?, 32, 32, 3), dtype=float32)\n", + "Tensor(\"MaxPool_3:0\", shape=(?, 15, 15, 60), dtype=float32)\n", + "Tensor(\"MaxPool_4:0\", shape=(?, 6, 6, 120), dtype=float32)\n", + "Tensor(\"MaxPool_5:0\", shape=(?, 2, 2, 400), dtype=float32)\n", + "Tensor(\"Reshape_4:0\", shape=(?, 1600), dtype=float32)\n", + "Tensor(\"dropout_2/mul:0\", shape=(?, 600), dtype=float32)\n", + "Tensor(\"dropout_3/mul:0\", shape=(?, 60), dtype=float32)\n", + "Tensor(\"Add_5:0\", shape=(?, 10), dtype=float32)\n", + "Neural Network Built!\n" + ] + } + ], + "source": [ + "def conv_net(x, keep_prob):\n", + " \"\"\"\n", + " Create a convolutional neural network model\n", + " : x: Placeholder tensor that holds image data.\n", + " : keep_prob: Placeholder tensor that hold dropout keep probability.\n", + " : return: Tensor that represents logits\n", + " \"\"\"\n", + " \n", + " \n", + " conv1_hyper_params = {\n", + " 'conv_num_outputs': 60,\n", + " 'conv_ksize': (3,3),\n", + " 'conv_strides': (1,1),\n", + " 'pool_ksize': (2,2),\n", + " 'pool_strides': (2,2)\n", + " }\n", + " \n", + " conv2_hyper_params = {\n", + " 'conv_num_outputs': 120,\n", + " 'conv_ksize': (3,3),\n", + " 'conv_strides': (1,1),\n", + 
" 'pool_ksize': (2,2),\n", + " 'pool_strides': (2,2)\n", + " }\n", + " \n", + " conv3_hyper_params = {\n", + " 'conv_num_outputs': 400,\n", + " 'conv_ksize': (3,3),\n", + " 'conv_strides': (1,1),\n", + " 'pool_ksize': (2,2),\n", + " 'pool_strides': (2,2)\n", + " }\n", + " \n", + " \n", + " conv1 = conv2d_maxpool(x, **conv1_hyper_params) \n", + " conv2 = conv2d_maxpool(conv1, **conv2_hyper_params)\n", + " conv3 = conv2d_maxpool(conv2, **conv3_hyper_params)\n", + "\n", + " \n", + " \n", + " \n", + " flat = flatten(conv3)\n", + " \n", + " \n", + " \n", + " \n", + " full1 = fully_conn(flat, 600)\n", + " full1 = tf.nn.dropout(full1, keep_prob)\n", + " \n", + " \n", + " full2 = fully_conn(full1, 60)\n", + " full2 = tf.nn.dropout(full2, keep_prob)\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " # TODO: Apply an Output Layer\n", + " # Set this to the number of classes\n", + " # Function Definition from Above:\n", + " # output(x_tensor, num_outputs)\n", + " out = output(full2, 10)\n", + " \n", + " \n", + " # DEBUG\n", + " print(x)\n", + " print(conv1)\n", + " print(conv2)\n", + " print(conv3)\n", + " print(flat)\n", + " print(full1)\n", + " print(full2)\n", + " print(out)\n", + " # DEBUG\n", + " \n", + " \n", + " return out\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "\n", + "##############################\n", + "## Build the Neural Network ##\n", + "##############################\n", + "\n", + "# Remove previous weights, bias, inputs, etc..\n", + "tf.reset_default_graph()\n", + "\n", + "# Inputs\n", + "x = neural_net_image_input((32, 32, 3))\n", + "y = neural_net_label_input(10)\n", + "keep_prob = neural_net_keep_prob_input()\n", + "\n", + "# Model\n", + "logits = conv_net(x, keep_prob)\n", + "\n", + "# Name logits Tensor, so that is can be loaded from disk after training\n", + "logits = tf.identity(logits, name='logits')\n", + "\n", + "# Loss and Optimizer\n", + "cost = 
tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y))\n", + "optimizer = tf.train.AdamOptimizer().minimize(cost)\n", + "\n", + "# Accuracy\n", + "correct_pred = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))\n", + "accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32), name='accuracy')\n", + "\n", + "tests.test_conv_net(conv_net)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Train the Neural Network\n", + "### Single Optimization\n", + "Implement the function `train_neural_network` to do a single optimization. The optimization should use `optimizer` to optimize in `session` with a `feed_dict` of the following:\n", + "* `x` for image input\n", + "* `y` for labels\n", + "* `keep_prob` for keep probability for dropout\n", + "\n", + "This function will be called for each batch, so `tf.global_variables_initializer()` has already been called.\n", + "\n", + "Note: Nothing needs to be returned. This function is only optimizing the neural network." 
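Since `train_neural_network` is called once per batch, the surrounding loop just slices the training set into consecutive chunks. A plausible sketch of that batching (the `make_batches` helper here is hypothetical, not the actual code in `helper.py`):

```python
def make_batches(features, labels, batch_size):
    """Yield consecutive (features, labels) slices of at most batch_size items."""
    for start in range(0, len(features), batch_size):
        yield features[start:start + batch_size], labels[start:start + batch_size]

# 10 samples in batches of 4 -> batch sizes 4, 4, 2 (last batch is smaller)
sizes = [len(f) for f, _ in make_batches(list(range(10)), list(range(10)), 4)]
print(sizes)  # [4, 4, 2]
```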
+ ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "def train_neural_network(session, optimizer, keep_probability, feature_batch, label_batch):\n", + " \"\"\"\n", + " Optimize the session on a batch of images and labels\n", + " : session: Current TensorFlow session\n", + " : optimizer: TensorFlow optimizer function\n", + " : keep_probability: keep probability\n", + " : feature_batch: Batch of Numpy image data\n", + " : label_batch: Batch of Numpy label data\n", + " \"\"\"\n", + " session.run([optimizer], feed_dict={x: feature_batch,\n", + " y: label_batch,\n", + " keep_prob: keep_probability\n", + " })\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_train_nn(train_neural_network)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Show Stats\n", + "Implement the function `print_stats` to print loss and validation accuracy. Use the global variables `valid_features` and `valid_labels` to calculate validation accuracy. Use a keep probability of `1.0` to calculate the loss and validation accuracy." 
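The `accuracy` tensor reduces to an argmax comparison between logits and one-hot labels. The same computation in NumPy, which is handy for sanity-checking the numbers `print_stats` reports (logits and labels below are made up):

```python
import numpy as np

logits = np.array([[0.1, 2.0], [1.5, 0.2], [0.3, 0.9]])  # 3 samples, 2 classes
labels = np.array([[0, 1], [1, 0], [1, 0]])              # one-hot ground truth

correct = np.argmax(logits, axis=1) == np.argmax(labels, axis=1)
accuracy = correct.mean()  # 2 of 3 predictions match
print('Valid Accuracy: {:.2f}%'.format(accuracy * 100))  # Valid Accuracy: 66.67%
```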
+ ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def print_stats(session, feature_batch, label_batch, cost, accuracy):\n", + " \"\"\"\n", + " Print information about loss and validation accuracy\n", + " : session: Current TensorFlow session\n", + " : feature_batch: Batch of Numpy image data\n", + " : label_batch: Batch of Numpy label data\n", + " : cost: TensorFlow cost function\n", + " : accuracy: TensorFlow accuracy function\n", + " \"\"\"\n", + " loss_feed = {\n", + " x: feature_batch,\n", + " y: label_batch,\n", + " keep_prob: 1.\n", + " }\n", + " \n", + " valid_feed = {\n", + " x: valid_features,\n", + " y: valid_labels,\n", + " keep_prob: 1.\n", + " }\n", + " \n", + " loss = session.run(cost, loss_feed)\n", + " accuracy = session.run(accuracy, valid_feed)\n", + " \n", + " print('Loss: {:>10.4f} - Valid Accuracy: {:.2f}%'.format(loss,accuracy*100))\n", + " \n", + " " + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Hyperparameters\n", + "Tune the following parameters:\n", + "* Set `epochs` to the number of iterations until the network stops learning or start overfitting\n", + "* Set `batch_size` to the highest number that your machine has memory for. 
Most people set them to common sizes of memory:\n", + " * 64\n", + " * 128\n", + " * 256\n", + " * ...\n", + "* Set `keep_probability` to the probability of keeping a node using dropout" + ] + }, + { + "cell_type": "code", + "execution_count": 62, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "# TODO: Tune Parameters\n", + "epochs = 9\n", + "batch_size = 512\n", + "keep_probability = 0.75" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Train on a Single CIFAR-10 Batch\n", + "Instead of training the neural network on all the CIFAR-10 batches of data, let's use a single batch. This should save time while you iterate on the model to get a better accuracy. Once the final validation accuracy is 50% or greater, run the model on all the data in the next section." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true, + "scrolled": false + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "print('Checking the Training on a Single Batch...')\n", + "with tf.Session() as sess:\n", + " # Initializing the variables\n", + " sess.run(tf.global_variables_initializer())\n", + " \n", + " # Training cycle\n", + " for epoch in range(epochs):\n", + " batch_i = 1\n", + " for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):\n", + " train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)\n", + " print('Epoch {:>2}, CIFAR-10 Batch {}: '.format(epoch + 1, batch_i), end='')\n", + " print_stats(sess, batch_features, batch_labels, cost, accuracy)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Fully Train the Model\n", + "Now that you got a good accuracy with a 
single CIFAR-10 batch, try it with all five batches." + ] + }, + { + "cell_type": "code", + "execution_count": 63, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true, + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training...\n", + "Epoch 1, CIFAR-10 Batch 1: Loss: 2.2161 - Valid Accuracy: 18.40%\n", + "Epoch 1, CIFAR-10 Batch 2: Loss: 2.0625 - Valid Accuracy: 25.98%\n", + "Epoch 1, CIFAR-10 Batch 3: Loss: 1.7185 - Valid Accuracy: 34.68%\n", + "Epoch 1, CIFAR-10 Batch 4: Loss: 1.6533 - Valid Accuracy: 39.72%\n", + "Epoch 1, CIFAR-10 Batch 5: Loss: 1.5159 - Valid Accuracy: 43.98%\n", + "Epoch 2, CIFAR-10 Batch 1: Loss: 1.6357 - Valid Accuracy: 47.40%\n", + "Epoch 2, CIFAR-10 Batch 2: Loss: 1.4579 - Valid Accuracy: 48.64%\n", + "Epoch 2, CIFAR-10 Batch 3: Loss: 1.2390 - Valid Accuracy: 51.38%\n", + "Epoch 2, CIFAR-10 Batch 4: Loss: 1.2468 - Valid Accuracy: 53.06%\n", + "Epoch 2, CIFAR-10 Batch 5: Loss: 1.2385 - Valid Accuracy: 53.86%\n", + "Epoch 3, CIFAR-10 Batch 1: Loss: 1.3218 - Valid Accuracy: 55.72%\n", + "Epoch 3, CIFAR-10 Batch 2: Loss: 1.1992 - Valid Accuracy: 54.00%\n", + "Epoch 3, CIFAR-10 Batch 3: Loss: 1.0670 - Valid Accuracy: 57.56%\n", + "Epoch 3, CIFAR-10 Batch 4: Loss: 1.0798 - Valid Accuracy: 57.62%\n", + "Epoch 3, CIFAR-10 Batch 5: Loss: 1.0199 - Valid Accuracy: 59.20%\n", + "Epoch 4, CIFAR-10 Batch 1: Loss: 1.0873 - Valid Accuracy: 61.30%\n", + "Epoch 4, CIFAR-10 Batch 2: Loss: 1.0366 - Valid Accuracy: 61.46%\n", + "Epoch 4, CIFAR-10 Batch 3: Loss: 0.9007 - Valid Accuracy: 61.18%\n", + "Epoch 4, CIFAR-10 Batch 4: Loss: 0.8702 - Valid Accuracy: 63.08%\n", + "Epoch 4, CIFAR-10 Batch 5: Loss: 0.8193 - Valid Accuracy: 65.02%\n", + "Epoch 5, CIFAR-10 Batch 1: Loss: 0.9168 - Valid Accuracy: 63.96%\n", + "Epoch 5, CIFAR-10 Batch 2: Loss: 0.8561 - Valid Accuracy: 65.00%\n", + "Epoch 5, CIFAR-10 Batch 3: Loss: 0.7567 - Valid Accuracy: 65.08%\n", + "Epoch 5, CIFAR-10 Batch 
4: Loss: 0.7559 - Valid Accuracy: 65.62%\n", + "Epoch 5, CIFAR-10 Batch 5: Loss: 0.7026 - Valid Accuracy: 66.76%\n", + "Epoch 6, CIFAR-10 Batch 1: Loss: 0.8156 - Valid Accuracy: 67.12%\n", + "Epoch 6, CIFAR-10 Batch 2: Loss: 0.7452 - Valid Accuracy: 67.18%\n", + "Epoch 6, CIFAR-10 Batch 3: Loss: 0.6952 - Valid Accuracy: 65.44%\n", + "Epoch 6, CIFAR-10 Batch 4: Loss: 0.6694 - Valid Accuracy: 66.68%\n", + "Epoch 6, CIFAR-10 Batch 5: Loss: 0.6272 - Valid Accuracy: 67.82%\n", + "Epoch 7, CIFAR-10 Batch 1: Loss: 0.7157 - Valid Accuracy: 68.12%\n", + "Epoch 7, CIFAR-10 Batch 2: Loss: 0.6450 - Valid Accuracy: 67.92%\n", + "Epoch 7, CIFAR-10 Batch 3: Loss: 0.6158 - Valid Accuracy: 66.10%\n", + "Epoch 7, CIFAR-10 Batch 4: Loss: 0.5492 - Valid Accuracy: 68.74%\n", + "Epoch 7, CIFAR-10 Batch 5: Loss: 0.5193 - Valid Accuracy: 68.12%\n", + "Epoch 8, CIFAR-10 Batch 1: Loss: 0.6347 - Valid Accuracy: 69.08%\n", + "Epoch 8, CIFAR-10 Batch 2: Loss: 0.5660 - Valid Accuracy: 69.54%\n", + "Epoch 8, CIFAR-10 Batch 3: Loss: 0.4913 - Valid Accuracy: 67.54%\n", + "Epoch 8, CIFAR-10 Batch 4: Loss: 0.4737 - Valid Accuracy: 69.96%\n", + "Epoch 8, CIFAR-10 Batch 5: Loss: 0.4385 - Valid Accuracy: 70.38%\n", + "Epoch 9, CIFAR-10 Batch 1: Loss: 0.5353 - Valid Accuracy: 69.56%\n", + "Epoch 9, CIFAR-10 Batch 2: Loss: 0.4946 - Valid Accuracy: 69.56%\n", + "Epoch 9, CIFAR-10 Batch 3: Loss: 0.4304 - Valid Accuracy: 68.02%\n", + "Epoch 9, CIFAR-10 Batch 4: Loss: 0.4035 - Valid Accuracy: 70.64%\n", + "Epoch 9, CIFAR-10 Batch 5: Loss: 0.3885 - Valid Accuracy: 70.56%\n" + ] + } + ], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "save_model_path = './image_classification'\n", + "\n", + "print('Training...')\n", + "with tf.Session() as sess:\n", + " # Initializing the variables\n", + " sess.run(tf.global_variables_initializer())\n", + " \n", + " # Training cycle\n", + " for epoch in range(epochs):\n", + " # Loop over all batches\n", + " n_batches = 5\n", + " for 
batch_i in range(1, n_batches + 1):\n", + " for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):\n", + " train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)\n", + " print('Epoch {:>2}, CIFAR-10 Batch {}: '.format(epoch + 1, batch_i), end='')\n", + " print_stats(sess, batch_features, batch_labels, cost, accuracy)\n", + " \n", + " # Save Model\n", + " saver = tf.train.Saver()\n", + " save_path = saver.save(sess, save_model_path)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# Checkpoint\n", + "The model has been saved to disk.\n", + "## Test Model\n", + "Test your model against the test dataset. This will be your final accuracy. You should have an accuracy greater than 50%. If you don't, keep tweaking the model architecture and parameters." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "%matplotlib inline\n", + "%config InlineBackend.figure_format = 'retina'\n", + "\n", + "import tensorflow as tf\n", + "import pickle\n", + "import helper\n", + "import random\n", + "\n", + "# Set batch size if not already set\n", + "try:\n", + " if batch_size:\n", + " pass\n", + "except NameError:\n", + " batch_size = 64\n", + "\n", + "save_model_path = './image_classification'\n", + "n_samples = 4\n", + "top_n_predictions = 3\n", + "\n", + "def test_model():\n", + " \"\"\"\n", + " Test the saved model against the test dataset\n", + " \"\"\"\n", + "\n", + " test_features, test_labels = pickle.load(open('preprocess_training.p', mode='rb'))\n", + " loaded_graph = tf.Graph()\n", + "\n", + " with tf.Session(graph=loaded_graph) as sess:\n", + " # Load model\n", + " loader = tf.train.import_meta_graph(save_model_path + '.meta')\n", + " 
loader.restore(sess, save_model_path)\n", + "\n", + " # Get Tensors from loaded model\n", + " loaded_x = loaded_graph.get_tensor_by_name('x:0')\n", + " loaded_y = loaded_graph.get_tensor_by_name('y:0')\n", + " loaded_keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')\n", + " loaded_logits = loaded_graph.get_tensor_by_name('logits:0')\n", + " loaded_acc = loaded_graph.get_tensor_by_name('accuracy:0')\n", + " \n", + " # Get accuracy in batches for memory limitations\n", + " test_batch_acc_total = 0\n", + " test_batch_count = 0\n", + " \n", + " for train_feature_batch, train_label_batch in helper.batch_features_labels(test_features, test_labels, batch_size):\n", + " test_batch_acc_total += sess.run(\n", + " loaded_acc,\n", + " feed_dict={loaded_x: train_feature_batch, loaded_y: train_label_batch, loaded_keep_prob: 1.0})\n", + " test_batch_count += 1\n", + "\n", + " print('Testing Accuracy: {}\\n'.format(test_batch_acc_total/test_batch_count))\n", + "\n", + " # Print Random Samples\n", + " random_test_features, random_test_labels = tuple(zip(*random.sample(list(zip(test_features, test_labels)), n_samples)))\n", + " random_test_predictions = sess.run(\n", + " tf.nn.top_k(tf.nn.softmax(loaded_logits), top_n_predictions),\n", + " feed_dict={loaded_x: random_test_features, loaded_y: random_test_labels, loaded_keep_prob: 1.0})\n", + " helper.display_image_predictions(random_test_features, random_test_labels, random_test_predictions)\n", + "\n", + "\n", + "test_model()" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Why 50-70% Accuracy?\n", + "You might be wondering why you can't get an accuracy any higher. First things first, 50% isn't bad for a simple CNN. Pure guessing would get you 10% accuracy. However, you might notice people are getting scores [well above 70%](http://rodrigob.github.io/are_we_there_yet/build/classification_datasets_results.html#43494641522d3130). 
That's because we haven't taught you all there is to know about neural networks. We still need to cover a few more techniques.\n", + "## Submitting This Project\n", + "When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as \"dlnd_image_classification.ipynb\" and save it as a HTML file under \"File\" -> \"Download as\". Include the \"helper.py\" and \"problem_unittests.py\" files in your submission." + ] + } + ], + "metadata": { + "hide_input": false, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.1" + }, + "widgets": { + "state": {}, + "version": "1.1.2" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} diff --git a/image-classification/image-classification/__pycache__/helper.cpython-35.pyc b/image-classification/image-classification/__pycache__/helper.cpython-35.pyc new file mode 100644 index 0000000..869a1e0 Binary files /dev/null and b/image-classification/image-classification/__pycache__/helper.cpython-35.pyc differ diff --git a/image-classification/image-classification/__pycache__/problem_unittests.cpython-35.pyc b/image-classification/image-classification/__pycache__/problem_unittests.cpython-35.pyc new file mode 100644 index 0000000..18ef63d Binary files /dev/null and b/image-classification/image-classification/__pycache__/problem_unittests.cpython-35.pyc differ diff --git a/image-classification/image-classification/checkpoint b/image-classification/image-classification/checkpoint new file mode 100644 index 0000000..313455f --- /dev/null +++ b/image-classification/image-classification/checkpoint @@ -0,0 +1,2 @@ +model_checkpoint_path: "image_classification" +all_model_checkpoint_paths: 
"image_classification" diff --git a/image-classification/image-classification/cifar-10-batches-py/batches.meta b/image-classification/image-classification/cifar-10-batches-py/batches.meta new file mode 100644 index 0000000..4467a6e Binary files /dev/null and b/image-classification/image-classification/cifar-10-batches-py/batches.meta differ diff --git a/image-classification/image-classification/cifar-10-batches-py/data_batch_1 b/image-classification/image-classification/cifar-10-batches-py/data_batch_1 new file mode 100644 index 0000000..1b9ff78 --- /dev/null +++ b/image-classification/image-classification/cifar-10-batches-py/data_batch_1 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:54636561a3ce25bd3e19253c6b0d8538147b0ae398331ac4a2d86c6d987368cd +size 31035704 diff --git a/image-classification/image-classification/cifar-10-batches-py/data_batch_2 b/image-classification/image-classification/cifar-10-batches-py/data_batch_2 new file mode 100644 index 0000000..da8acc0 --- /dev/null +++ b/image-classification/image-classification/cifar-10-batches-py/data_batch_2 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:766b2cef9fbc745cf056b3152224f7cf77163b330ea9a15f9392beb8b89bc5a8 +size 31035320 diff --git a/image-classification/image-classification/cifar-10-batches-py/data_batch_3 b/image-classification/image-classification/cifar-10-batches-py/data_batch_3 new file mode 100644 index 0000000..e98eb3e --- /dev/null +++ b/image-classification/image-classification/cifar-10-batches-py/data_batch_3 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:0f00d98ebfb30b3ec0ad19f9756dc2630b89003e10525f5e148445e82aa6a1f9 +size 31035999 diff --git a/image-classification/image-classification/cifar-10-batches-py/data_batch_4 b/image-classification/image-classification/cifar-10-batches-py/data_batch_4 new file mode 100644 index 0000000..9b81f87 --- /dev/null +++ 
b/image-classification/image-classification/cifar-10-batches-py/data_batch_4 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:3f7bb240661948b8f4d53e36ec720d8306f5668bd0071dcb4e6c947f78e9682b +size 31035696 diff --git a/image-classification/image-classification/cifar-10-batches-py/data_batch_5 b/image-classification/image-classification/cifar-10-batches-py/data_batch_5 new file mode 100644 index 0000000..0428cfd --- /dev/null +++ b/image-classification/image-classification/cifar-10-batches-py/data_batch_5 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:d91802434d8376bbaeeadf58a737e3a1b12ac839077e931237e0dcd43adcb154 +size 31035623 diff --git a/image-classification/image-classification/cifar-10-batches-py/readme.html b/image-classification/image-classification/cifar-10-batches-py/readme.html new file mode 100644 index 0000000..e377ade --- /dev/null +++ b/image-classification/image-classification/cifar-10-batches-py/readme.html @@ -0,0 +1 @@ + diff --git a/image-classification/image-classification/cifar-10-batches-py/test_batch b/image-classification/image-classification/cifar-10-batches-py/test_batch new file mode 100644 index 0000000..3e03f1f Binary files /dev/null and b/image-classification/image-classification/cifar-10-batches-py/test_batch differ diff --git a/image-classification/image-classification/cifar-10-python.tar.gz b/image-classification/image-classification/cifar-10-python.tar.gz new file mode 100644 index 0000000..3026cc5 --- /dev/null +++ b/image-classification/image-classification/cifar-10-python.tar.gz @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:6d958be074577803d12ecdefd02955f39262c83c16fe9348329d7fe0b5c001ce +size 170498071 diff --git a/image-classification/image-classification/dlnd_image_classification.html b/image-classification/image-classification/dlnd_image_classification.html new file mode 100644 index 0000000..0d9470a --- /dev/null +++ 
b/image-classification/image-classification/dlnd_image_classification.html @@ -0,0 +1,14004 @@ + + + +dlnd_image_classification + + + + + + + + + + + + + + + + + + + +
+
+ +
+
+
+
+
+

Image Classification

In this project, you'll classify images from the CIFAR-10 dataset. The dataset consists of airplanes, dogs, cats, and other objects. You'll preprocess the images, then train a convolutional neural network on all the samples. The images need to be normalized and the labels need to be one-hot encoded. You'll apply what you've learned to build convolutional, max pooling, dropout, and fully connected layers. At the end, you'll see your neural network's predictions on sample images.

+

Get the Data

Run the following cell to download the CIFAR-10 dataset for python.

In [3]:
+
+
+
"""
+DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
+"""
+from urllib.request import urlretrieve
+from os.path import isfile, isdir
+from tqdm import tqdm
+import problem_unittests as tests
+import tarfile
+
+cifar10_dataset_folder_path = 'cifar-10-batches-py'
+
+class DLProgress(tqdm):
+    last_block = 0
+
+    def hook(self, block_num=1, block_size=1, total_size=None):
+        self.total = total_size
+        self.update((block_num - self.last_block) * block_size)
+        self.last_block = block_num
+
+if not isfile('cifar-10-python.tar.gz'):
+    with DLProgress(unit='B', unit_scale=True, miniters=1, desc='CIFAR-10 Dataset') as pbar:
+        urlretrieve(
+            'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz',
+            'cifar-10-python.tar.gz',
+            pbar.hook)
+
+if not isdir(cifar10_dataset_folder_path):
+    with tarfile.open('cifar-10-python.tar.gz') as tar:
+        tar.extractall()
+        tar.close()
+
+
+tests.test_folder_path(cifar10_dataset_folder_path)
+
All files found!

Explore the Data

The dataset is broken into batches to prevent your machine from running out of memory. The CIFAR-10 dataset consists of 5 batches, named data_batch_1, data_batch_2, etc. Each batch contains labels and images from the following classes:
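Each data_batch_* file is a Python pickle in the standard CIFAR-10 layout: a dict whose 'data' entry is an (N, 3072) array of flattened red-then-green-then-blue channel values and whose 'labels' entry is a list of N integers. As a minimal sketch of how such a file can be read (the function name load_cifar10_batch is illustrative, not part of this project's helper module; the official files were pickled under Python 2, hence the encoding argument):

```python
import pickle

import numpy as np


def load_cifar10_batch(path):
    """Load one CIFAR-10 batch file (standard pickle format) and return
    images as an (N, 32, 32, 3) array plus the list of N labels."""
    with open(path, 'rb') as f:
        batch = pickle.load(f, encoding='latin1')  # keys: 'data', 'labels'
    # 'data' rows hold 1024 red, 1024 green, then 1024 blue values per image,
    # so reshape to (N, channels, height, width) and move channels last.
    features = batch['data'].reshape((len(batch['data']), 3, 32, 32)).transpose(0, 2, 3, 1)
    return features, batch['labels']
```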

+
  • airplane
  • automobile
  • bird
  • cat
  • deer
  • dog
  • frog
  • horse
  • ship
  • truck

Understanding a dataset is part of making predictions on the data. Play around with the code cell below by changing the batch_id and sample_id. The batch_id is the id for a batch (1-5). The sample_id is the id for an image and label pair in the batch.

+

Ask yourself "What are all possible labels?", "What is the range of values for the image data?", "Are the labels in order or random?". Answers to questions like these will help you preprocess the data and end up with better predictions.

+ +
+
+
+
+
+
In [4]:
+
+
+
%matplotlib inline
+%config InlineBackend.figure_format = 'retina'
+
+import helper
+import numpy as np
+
+# Explore the dataset
+batch_id = 1
+sample_id = 32
+helper.display_stats(cifar10_dataset_folder_path, batch_id, sample_id)
+
+Stats of batch 1:
+Samples: 10000
+Label Counts: {0: 1005, 1: 974, 2: 1032, 3: 1016, 4: 999, 5: 937, 6: 1030, 7: 1001, 8: 1025, 9: 981}
+First 20 Labels: [6, 9, 9, 4, 1, 1, 2, 7, 8, 3, 4, 7, 7, 2, 9, 9, 9, 3, 2, 6]
+
+Example of Image 32:
+Image - Min Value: 0 Max Value: 255
+Image - Shape: (32, 32, 3)
+Label - Label Id: 1 Name: automobile
+

Implement Preprocess Functions

Normalize

In the cell below, implement the normalize function to take in image data, x, and return it as a normalized Numpy array. The values should be in the range of 0 to 1, inclusive. The return object should be the same shape as x.

In [5]:
+
+
+
def normalize(x):
+    """
+    Normalize a list of sample image data in the range of 0 to 1
+    : x: List of image data.  The image shape is (32, 32, 3)
+    : return: Numpy array of normalized data
+    """
+    return (x - x.min()) / (x.max() - x.min())
+
+
+"""
+DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
+"""
+tests.test_normalize(normalize)
+
Tests Passed
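A side note on the per-batch min/max used above: its scale depends on whichever values happen to appear in the batch. Since CIFAR-10 pixels are known to span 0-255, dividing by the fixed maximum scales every batch identically. A small alternative sketch (normalize_fixed is an illustrative name, not part of the project):

```python
import numpy as np


def normalize_fixed(x):
    """Scale uint8 image data to [0, 1] using the known 0-255 pixel range,
    so every batch is scaled identically (unlike per-batch min/max)."""
    return np.asarray(x, dtype=np.float32) / 255.0
```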

One-hot encode

Just like the previous code cell, you'll be implementing a function for preprocessing. This time, you'll implement the one_hot_encode function. The input, x, is a list of labels. Implement the function to return the list of labels as a one-hot encoded Numpy array. The possible values for labels are 0 to 9. The one-hot encoding function should return the same encoding for each value between each call to one_hot_encode. Make sure to save the map of encodings outside the function.

+

Hint: Don't reinvent the wheel.
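One way to take the hint: row i of a 10x10 identity matrix is already the one-hot vector for label i, so NumPy's np.eye can replace a hand-built encoding map entirely. A sketch (the function name is illustrative):

```python
import numpy as np


def one_hot_encode_eye(x):
    """One-hot encode labels 0-9 by indexing rows of a 10x10 identity matrix."""
    return np.eye(10)[np.asarray(x)]
```

Because np.eye(10) is deterministic, the encoding for each label is automatically the same on every call.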

+ +
+
+
+
+
+
In [6]:
+
+
+
encodings = {}
+for i in range(10):
+    v = np.zeros(10)
+    v[i] = 1
+    encodings[i] = v
+
+def one_hot_encode(x):
+    """
+    One hot encode a list of sample labels. Return a one-hot encoded vector for each label.
+    : x: List of sample Labels
+    : return: Numpy array of one-hot encoded labels
+    """
+    assert np.max(x) < 10 and np.min(x) >= 0, 'label must be between 0 and 9'
+    encoded = np.array(list(map(lambda v: encodings[v],x)))
+    return encoded
+
+
+"""
+DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
+"""
+tests.test_one_hot_encode(one_hot_encode)
+
+ +
+
+
+ +
+
+ + +
+
+ +
+
Tests Passed

Randomize Data

As you saw from exploring the data above, the order of the samples is randomized. It doesn't hurt to randomize it again, but you don't need to for this dataset.

+ +
+
+
+
+
+
+
+
+

Preprocess all the data and save it

Running the code cell below will preprocess all the CIFAR-10 data and save it to file. The code below also uses 10% of the training data for validation.

+ +
+
+
+
+
+
In [7]:
+
+
+
"""
+DON'T MODIFY ANYTHING IN THIS CELL
+"""
+# Preprocess Training, Validation, and Testing Data
+helper.preprocess_and_save_data(cifar10_dataset_folder_path, normalize, one_hot_encode)
+
+ +
+
+
+ +
+
+
+
+
+
+

Check Point

This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk.

+ +
+
+
+
+
+
In [8]:
+
+
+
"""
+DON'T MODIFY ANYTHING IN THIS CELL
+"""
+import pickle
+import problem_unittests as tests
+import helper
+
+# Load the Preprocessed Validation data
+valid_features, valid_labels = pickle.load(open('preprocess_validation.p', mode='rb'))
+
+ +
+
+
+ +
+
+
+
+
+
+

Build the network

For the neural network, you'll build each layer into a function. Most of the code you've seen has been outside of functions. To test your code more thoroughly, we require that you put each layer in a function. This allows us to give you better feedback and test for simple mistakes using our unittests before you submit your project.

+

If you're finding it hard to dedicate enough time to this course each week, we've provided a small shortcut to this part of the project. In the next couple of problems, you'll have the option to use TensorFlow Layers or TensorFlow Layers (contrib) to build each layer, except the "Convolutional & Max Pooling" layer. TF Layers is similar to Keras's and TFLearn's abstraction to layers, so it's easy to pick up.

+

If you would like to get the most of this course, try to solve all the problems without TF Layers. Let's begin!

+

Input

The neural network needs to read the image data, one-hot encoded labels, and dropout keep probability. Implement the following functions:

  • Implement neural_net_image_input
    • Return a TF Placeholder
    • Set the shape using image_shape with batch size set to None.
    • Name the TensorFlow placeholder "x" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_label_input
    • Return a TF Placeholder
    • Set the shape using n_classes with batch size set to None.
    • Name the TensorFlow placeholder "y" using the TensorFlow name parameter in the TF Placeholder.
  • Implement neural_net_keep_prob_input
    • Return a TF Placeholder for dropout keep probability.
    • Name the TensorFlow placeholder "keep_prob" using the TensorFlow name parameter in the TF Placeholder.

These names will be used at the end of the project to load your saved model.

+

Note: None for shapes in TensorFlow allows for a dynamic size.

+ +
+
+
+
+
+
In [9]:
+
+
+
import tensorflow as tf
+
+def neural_net_image_input(image_shape):
+    """
+    Return a Tensor for a batch of image input
+    : image_shape: Shape of the images
+    : return: Tensor for image input.
+    """
+    # shape [None, 32, 32, 3]
+    return tf.placeholder(tf.float32, [None] + list(image_shape), name='x')
+
+
+
+def neural_net_label_input(n_classes):
+    """
+    Return a Tensor for a batch of label input
+    : n_classes: Number of classes
+    : return: Tensor for label input.
+    """
+    return tf.placeholder(tf.float32, [None, n_classes], name='y')
+
+
+def neural_net_keep_prob_input():
+    """
+    Return a Tensor for keep probability
+    : return: Tensor for keep probability.
+    """
+    return tf.placeholder(tf.float32, name='keep_prob')
+
+
+"""
+DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
+"""
+tf.reset_default_graph()
+tests.test_nn_image_inputs(neural_net_image_input)
+tests.test_nn_label_inputs(neural_net_label_input)
+tests.test_nn_keep_prob_inputs(neural_net_keep_prob_input)
+
+ +
+
+
+ +
+
+ + +
+
+ +
+
Image Input Tests Passed.
+Label Input Tests Passed.
+Keep Prob Tests Passed.
+
+
+
+ +
+
+ +
+
+
+
+
+
+

Convolution and Max Pooling Layer

Convolutional layers have had a lot of success with images. For this code cell, you should implement the function conv2d_maxpool to apply convolution then max pooling:

+
  • Create the weight and bias using conv_ksize, conv_num_outputs and the shape of x_tensor.
  • Apply a convolution to x_tensor using weight and conv_strides.
    • We recommend you use same padding, but you're welcome to use any padding.
  • Add bias
  • Add a nonlinear activation to the convolution.
  • Apply Max Pooling using pool_ksize and pool_strides.
    • We recommend you use same padding, but you're welcome to use any padding.

Note: You can't use TensorFlow Layers or TensorFlow Layers (contrib) for this layer. You're free to use any TensorFlow package for all the other layers.
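How the padding choice plays out: for side length n, kernel size k, and stride s, 'SAME' padding yields an output side of ceil(n / s), while 'VALID' yields floor((n - k) / s) + 1. A small sketch of both formulas (plain Python arithmetic matching TensorFlow's documented padding rules; conv_output_size is an illustrative helper):

```python
import math


def conv_output_size(n, k, s, padding):
    """Spatial output size of a conv/pool op under TensorFlow's padding rules."""
    if padding == 'SAME':
        return math.ceil(n / s)   # zero-padded so the kernel always fits
    return (n - k) // s + 1       # 'VALID': only fully-overlapping positions
```

For example, a 3x3 stride-1 convolution keeps a 32x32 input at 32x32 under 'SAME' but shrinks it to 30x30 under 'VALID'.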

+ +
+
+
+
+
+
In [10]:
+
+
+
def conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides):
+    """
+    Apply convolution then max pooling to x_tensor
+    :param x_tensor: TensorFlow Tensor
+    :param conv_num_outputs: Number of outputs for the convolutional layer
+    :param conv_ksize: Kernel size 2-D Tuple for the convolutional layer
+    :param conv_strides: Stride 2-D Tuple for convolution
+    :param pool_ksize: Kernel size 2-D Tuple for pool
+    :param pool_strides: Stride 2-D Tuple for pool
+    : return: A tensor that represents convolution and max pooling of x_tensor
+    """
+    w_conv = tf.Variable(tf.truncated_normal((conv_ksize[0], # Kernel height
+                                                 conv_ksize[1], # Kernel width
+                                                 x_tensor.get_shape()[3].value, # Input channels
+                                                 conv_num_outputs), stddev=0.05)) # output depth
+    bias_conv = tf.Variable(tf.random_normal([conv_num_outputs], stddev=0.05))
+    
+    padding = 'VALID'
+    
+    # Apply conv to x_tensor
+    cv1 = tf.nn.conv2d(x_tensor,
+                       w_conv,
+                       [1, conv_strides[0], conv_strides[1], 1],
+                       padding=padding
+                      )
+    # add bias
+    cv1 = tf.nn.bias_add(cv1, bias_conv)
+    
+    # Apply relu
+    cv1 = tf.nn.relu(cv1)
+    
+    # Apply Maxpooling
+    cv1 = tf.nn.max_pool(cv1,
+                         ksize=[1,pool_ksize[0], pool_ksize[1], 1],
+                         strides=[1, pool_strides[0], pool_strides[1], 1],
+                         padding=padding
+                        )
+    
+    return cv1
+
+
+"""
+DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
+"""
+tests.test_con_pool(conv2d_maxpool)
+
+ +
+
+
+ +
+
+ + +
+
+ +
+
Tests Passed
+
+
+
+ +
+
+ +
+
+
+
+
+
+

Flatten Layer

Implement the flatten function to change the dimension of x_tensor from a 4-D tensor to a 2-D tensor. The output should be the shape (Batch Size, Flattened Image Size). You can use TensorFlow Layers or TensorFlow Layers (contrib) for this layer.
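In plain NumPy the same operation is a reshape with one inferred dimension: passing -1 lets the library compute the flattened image size from the remaining axes. A stand-in sketch (the array shape mirrors the last conv layer's output in this notebook):

```python
import numpy as np

# Stand-in for a conv layer's output: (batch, height, width, depth)
batch = np.zeros((10, 2, 2, 400))

# -1 infers the flattened size, here 2 * 2 * 400 = 1600
flat = batch.reshape(batch.shape[0], -1)
```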

+ +
+
+
+
+
+
In [11]:
+
+
+
def flatten(x_tensor):
+    """
+    Flatten x_tensor to (Batch Size, Flattened Image Size)
+    : x_tensor: A tensor of size (Batch Size, ...), where ... are the image dimensions.
+    : return: A tensor of size (Batch Size, Flattened Image Size).
+    """
+    flat_size = np.multiply.reduce(x_tensor.get_shape()[1:].as_list())
+    return tf.reshape(x_tensor, [-1, flat_size])
+
+
+"""
+DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
+"""
+tests.test_flatten(flatten)
+
+ +
+
+
+ +
+
+ + +
+
+ +
+
Tests Passed
+
+
+
+ +
+
+ +
+
+
+
+
+
+

Fully-Connected Layer

Implement the fully_conn function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). You can use TensorFlow Layers or TensorFlow Layers (contrib) for this layer.

+ +
+
+
+
+
+
In [12]:
+
+
+
def fully_conn(x_tensor, num_outputs):
+    """
+    Apply a fully connected layer to x_tensor using weight and bias
+    : x_tensor: A 2-D tensor where the first dimension is batch size.
+    : num_outputs: The number of outputs that the new tensor should have.
+    : return: A 2-D tensor where the second dimension is num_outputs.
+    """
+    full_w = tf.Variable(tf.truncated_normal((x_tensor.get_shape()[1].value, num_outputs), stddev=0.05))
+    full_b = tf.Variable(tf.random_normal([num_outputs], stddev=0.05))
+    
+    full = tf.add(tf.matmul(x_tensor, full_w), full_b)
+    full = tf.nn.relu(full)
+    return full
+
+
+"""
+DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
+"""
+tests.test_fully_conn(fully_conn)
+
+ +
+
+
+ +
+
+ + +
+
+ +
+
Tests Passed
+
+
+
+ +
+
+ +
+
+
+
+
+
+

Output Layer

Implement the output function to apply a fully connected layer to x_tensor with the shape (Batch Size, num_outputs). You can use TensorFlow Layers or TensorFlow Layers (contrib) for this layer.

+

Note: Activation, softmax, or cross entropy shouldn't be applied to this.
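The reason to stop at raw logits: the loss used later, tf.nn.softmax_cross_entropy_with_logits, applies softmax internally in a numerically stable way, so adding softmax or an activation here would distort the loss. The stable trick is to subtract the row maximum before exponentiating, as in this NumPy sketch (illustrative, not the project's code):

```python
import numpy as np


def cross_entropy_from_logits(logits, onehot):
    """Stable softmax cross-entropy: log-softmax via the log-sum-exp trick."""
    shifted = logits - logits.max(axis=1, keepdims=True)   # avoids exp overflow
    log_softmax = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -(onehot * log_softmax).sum(axis=1)
```

A naive log(softmax(x)) would overflow for large logits such as 1000, while the shifted form stays finite.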

+ +
+
+
+
+
+
In [13]:
+
+
+
def output(x_tensor, num_outputs):
+    """
+    Apply an output layer to x_tensor using weight and bias
+    : x_tensor: A 2-D tensor where the first dimension is batch size.
+    : num_outputs: The number of outputs that the new tensor should have.
+    : return: A 2-D tensor where the second dimension is num_outputs.
+    """
+    out_w = tf.Variable(tf.truncated_normal((x_tensor.get_shape()[1].value, num_outputs)))
+    out_b = tf.Variable(tf.zeros(num_outputs))
+    
+    out = tf.add(tf.matmul(x_tensor, out_w), out_b)
+    return out
+
+
+"""
+DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
+"""
+tests.test_output(output)
+
+ +
+
+
+ +
+
+ + +
+
+ +
+
Tests Passed
+
+
+
+ +
+
+ +
+
+
+
+
+
+

Create Convolutional Model

Implement the function conv_net to create a convolutional neural network model. The function takes in a batch of images, x, and outputs logits. Use the layers you created above to create this model:

+
  • Apply 1, 2, or 3 Convolution and Max Pool layers
  • Apply a Flatten Layer
  • Apply 1, 2, or 3 Fully Connected Layers
  • Apply an Output Layer
  • Return the output
  • Apply TensorFlow's Dropout to one or more layers in the model using keep_prob.
+ +
+
+
+
+
+
In [50]:
+
+
+
def conv_net(x, keep_prob):
+    """
+    Create a convolutional neural network model
+    : x: Placeholder tensor that holds image data.
+    : keep_prob: Placeholder tensor that hold dropout keep probability.
+    : return: Tensor that represents logits
+    """
+    
+    
+    conv1_hyper_params = {
+        'conv_num_outputs': 60,
+        'conv_ksize': (3,3),
+        'conv_strides': (1,1),
+        'pool_ksize': (2,2),
+        'pool_strides': (2,2)
+    }
+    
+    conv2_hyper_params = {
+        'conv_num_outputs': 120,
+        'conv_ksize': (3,3),
+        'conv_strides': (1,1),
+        'pool_ksize': (2,2),
+        'pool_strides': (2,2)
+    }
+    
+    conv3_hyper_params = {
+        'conv_num_outputs': 400,
+        'conv_ksize': (3,3),
+        'conv_strides': (1,1),
+        'pool_ksize': (2,2),
+        'pool_strides': (2,2)
+    }
+    
+   
+    conv1 = conv2d_maxpool(x, **conv1_hyper_params)  
+    conv2 = conv2d_maxpool(conv1, **conv2_hyper_params)
+    conv3 = conv2d_maxpool(conv2, **conv3_hyper_params)
+
+    
+    
+    
+    flat = flatten(conv3)
+    
+    
+    
+    
+    full1 = fully_conn(flat, 600)
+    full1 = tf.nn.dropout(full1, keep_prob)
+    
+    
+    full2 = fully_conn(full1, 60)
+    full2 = tf.nn.dropout(full2, keep_prob)
+
+
+    
+    
+    
+    # Output layer: one logit per class
+    out = output(full2, 10)
+    
+    
+    # DEBUG
+    print(x)
+    print(conv1)
+    print(conv2)
+    print(conv3)
+    print(flat)
+    print(full1)
+    print(full2)
+    print(out)
+    # DEBUG
+    
+    
+    return out
+
+
+"""
+DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
+"""
+
+##############################
+## Build the Neural Network ##
+##############################
+
+# Remove previous weights, bias, inputs, etc..
+tf.reset_default_graph()
+
+# Inputs
+x = neural_net_image_input((32, 32, 3))
+y = neural_net_label_input(10)
+keep_prob = neural_net_keep_prob_input()
+
+# Model
+logits = conv_net(x, keep_prob)
+
+# Name logits Tensor, so that it can be loaded from disk after training
+logits = tf.identity(logits, name='logits')
+
+# Loss and Optimizer
+cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y))
+optimizer = tf.train.AdamOptimizer().minimize(cost)
+
+# Accuracy
+correct_pred = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))
+accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32), name='accuracy')
+
+tests.test_conv_net(conv_net)
+
+ +
+
+
+ +
+
+ + +
+
+ +
+
--------------
+Tensor("x:0", shape=(?, 32, 32, 3), dtype=float32)
+Tensor("MaxPool:0", shape=(?, 15, 15, 60), dtype=float32)
+Tensor("MaxPool_1:0", shape=(?, 6, 6, 120), dtype=float32)
+Tensor("MaxPool_2:0", shape=(?, 2, 2, 400), dtype=float32)
+Tensor("Reshape:0", shape=(?, 1600), dtype=float32)
+Tensor("dropout/mul:0", shape=(?, 600), dtype=float32)
+Tensor("dropout_1/mul:0", shape=(?, 60), dtype=float32)
+Tensor("Add_2:0", shape=(?, 10), dtype=float32)
+--------------
+Tensor("Placeholder:0", shape=(?, 32, 32, 3), dtype=float32)
+Tensor("MaxPool_3:0", shape=(?, 15, 15, 60), dtype=float32)
+Tensor("MaxPool_4:0", shape=(?, 6, 6, 120), dtype=float32)
+Tensor("MaxPool_5:0", shape=(?, 2, 2, 400), dtype=float32)
+Tensor("Reshape_4:0", shape=(?, 1600), dtype=float32)
+Tensor("dropout_2/mul:0", shape=(?, 600), dtype=float32)
+Tensor("dropout_3/mul:0", shape=(?, 60), dtype=float32)
+Tensor("Add_5:0", shape=(?, 10), dtype=float32)
+Neural Network Built!
+
+
+
+ +
+
+ +
+
+
+
+
+
+
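The printed shapes follow directly from the 'VALID' padding used in conv2d_maxpool: each 3x3 stride-1 convolution trims the side by 2, and each 2x2 stride-2 pool maps n to (n - 2) // 2 + 1, so the spatial size goes 32 → 15 → 6 → 2 and the flattened size is 2 · 2 · 400 = 1600. A quick check of that arithmetic (valid_conv_pool_chain is an illustrative helper, not notebook code):

```python
def valid_conv_pool_chain(n, depths, k=3, pool=2):
    """Trace the spatial size through conv(k, stride 1) + pool(pool, stride pool)
    layers under 'VALID' padding; return per-layer sizes and flattened length."""
    sizes = []
    for _ in depths:
        n = n - k + 1                # 'VALID' conv, stride 1
        n = (n - pool) // pool + 1   # 'VALID' max pool, stride = pool
        sizes.append(n)
    return sizes, n * n * depths[-1]
```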

Train the Neural Network

Single Optimization

Implement the function train_neural_network to do a single optimization step. The optimization should use optimizer to optimize in session with a feed_dict of the following:

  • x for image input
  • y for labels
  • keep_prob for keep probability for dropout

This function will be called for each batch, so tf.global_variables_initializer() has already been called.

+

Note: Nothing needs to be returned. This function is only optimizing the neural network.

+ +
+
+
+
+
+
In [15]:
+
+
+
def train_neural_network(session, optimizer, keep_probability, feature_batch, label_batch):
+    """
+    Optimize the session on a batch of images and labels
+    : session: Current TensorFlow session
+    : optimizer: TensorFlow optimizer function
+    : keep_probability: keep probability
+    : feature_batch: Batch of Numpy image data
+    : label_batch: Batch of Numpy label data
+    """
+    session.run([optimizer], feed_dict={x: feature_batch,
+                                        y: label_batch,
+                                        keep_prob: keep_probability
+                                       })
+
+
+"""
+DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
+"""
+tests.test_train_nn(train_neural_network)
+
+ +
+
+
+ +
+
+ + +
+
+ +
+
Tests Passed
+
+
+
+ +
+
+ +
+
+
+
+
+
+

Show Stats

Implement the function print_stats to print loss and validation accuracy. Use the global variables valid_features and valid_labels to calculate validation accuracy. Use a keep probability of 1.0 to calculate the loss and validation accuracy.

+ +
+
+
+
+
+
In [16]:
+
+
+
def print_stats(session, feature_batch, label_batch, cost, accuracy):
+    """
+    Print information about loss and validation accuracy
+    : session: Current TensorFlow session
+    : feature_batch: Batch of Numpy image data
+    : label_batch: Batch of Numpy label data
+    : cost: TensorFlow cost function
+    : accuracy: TensorFlow accuracy function
+    """
+    loss_feed = {
+        x: feature_batch,
+        y: label_batch,
+        keep_prob: 1.
+    }
+    
+    valid_feed = {
+        x: valid_features,
+        y: valid_labels,
+        keep_prob: 1.
+    }
+    
+    loss = session.run(cost, loss_feed)
+    accuracy = session.run(accuracy, valid_feed)
+    
+    print('Loss: {:>10.4f} - Valid Accuracy: {:.2f}%'.format(loss,accuracy*100))
+    
+    
+
+ +
+
+
+ +
+
+
+
+
+
+

Hyperparameters

Tune the following parameters:

+
  • Set epochs to the number of iterations until the network stops learning or starts overfitting
  • Set batch_size to the highest number that your machine has memory for. Most people use common sizes:
    • 64
    • 128
    • 256
    • ...
  • Set keep_probability to the probability of keeping a node using dropout
+ +
+
+
+
+
+
In [62]:
+
+
+
# TODO: Tune Parameters
+epochs = 9
+batch_size = 512
+keep_probability = 0.75
+
+ +
+
+
+ +
+
+
+
+
+
+

Train on a Single CIFAR-10 Batch

Instead of training the neural network on all the CIFAR-10 batches of data, let's use a single batch. This should save time while you iterate on the model to get a better accuracy. Once the final validation accuracy is 50% or greater, run the model on all the data in the next section.

+ +
+
+
+
+
+
In [64]:
+
+
+
"""
+DON'T MODIFY ANYTHING IN THIS CELL
+"""
+print('Checking the Training on a Single Batch...')
+with tf.Session() as sess:
+    # Initializing the variables
+    sess.run(tf.global_variables_initializer())
+    
+    # Training cycle
+    for epoch in range(epochs):
+        batch_i = 1
+        for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
+            train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
+        print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
+        print_stats(sess, batch_features, batch_labels, cost, accuracy)
+
+ +
+
+
+ +
+
+ + +
+
+ +
+
Checking the Training on a Single Batch...
+Epoch  1, CIFAR-10 Batch 1:  Loss:     2.2416 - Valid Accuracy: 16.52%
+Epoch  2, CIFAR-10 Batch 1:  Loss:     2.0681 - Valid Accuracy: 26.34%
+Epoch  3, CIFAR-10 Batch 1:  Loss:     1.9934 - Valid Accuracy: 30.72%
+Epoch  4, CIFAR-10 Batch 1:  Loss:     1.8648 - Valid Accuracy: 36.14%
+Epoch  5, CIFAR-10 Batch 1:  Loss:     1.7653 - Valid Accuracy: 38.90%
+Epoch  6, CIFAR-10 Batch 1:  Loss:     1.6582 - Valid Accuracy: 41.92%
+Epoch  7, CIFAR-10 Batch 1:  Loss:     1.4946 - Valid Accuracy: 44.92%
+Epoch  8, CIFAR-10 Batch 1:  Loss:     1.3776 - Valid Accuracy: 48.44%
+Epoch  9, CIFAR-10 Batch 1:  Loss:     1.2710 - Valid Accuracy: 49.32%
+
+
+
+ +
+
+ +
+
+
+
+
+
+

Fully Train the Model

Now that you've gotten good accuracy with a single CIFAR-10 batch, try it with all five batches.

+ +
+
+
+
+
+
In [63]:
+
+
+
"""
+DON'T MODIFY ANYTHING IN THIS CELL
+"""
+save_model_path = './image_classification'
+
+print('Training...')
+with tf.Session() as sess:
+    # Initializing the variables
+    sess.run(tf.global_variables_initializer())
+    
+    # Training cycle
+    for epoch in range(epochs):
+        # Loop over all batches
+        n_batches = 5
+        for batch_i in range(1, n_batches + 1):
+            for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):
+                train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)
+            print('Epoch {:>2}, CIFAR-10 Batch {}:  '.format(epoch + 1, batch_i), end='')
+            print_stats(sess, batch_features, batch_labels, cost, accuracy)
+            
+    # Save Model
+    saver = tf.train.Saver()
+    save_path = saver.save(sess, save_model_path)
+
+ +
+
+
+ +
+
+ + +
+
+ +
+
Training...
+Epoch  1, CIFAR-10 Batch 1:  Loss:     2.2161 - Valid Accuracy: 18.40%
+Epoch  1, CIFAR-10 Batch 2:  Loss:     2.0625 - Valid Accuracy: 25.98%
+Epoch  1, CIFAR-10 Batch 3:  Loss:     1.7185 - Valid Accuracy: 34.68%
+Epoch  1, CIFAR-10 Batch 4:  Loss:     1.6533 - Valid Accuracy: 39.72%
+Epoch  1, CIFAR-10 Batch 5:  Loss:     1.5159 - Valid Accuracy: 43.98%
+Epoch  2, CIFAR-10 Batch 1:  Loss:     1.6357 - Valid Accuracy: 47.40%
+Epoch  2, CIFAR-10 Batch 2:  Loss:     1.4579 - Valid Accuracy: 48.64%
+Epoch  2, CIFAR-10 Batch 3:  Loss:     1.2390 - Valid Accuracy: 51.38%
+Epoch  2, CIFAR-10 Batch 4:  Loss:     1.2468 - Valid Accuracy: 53.06%
+Epoch  2, CIFAR-10 Batch 5:  Loss:     1.2385 - Valid Accuracy: 53.86%
+Epoch  3, CIFAR-10 Batch 1:  Loss:     1.3218 - Valid Accuracy: 55.72%
+Epoch  3, CIFAR-10 Batch 2:  Loss:     1.1992 - Valid Accuracy: 54.00%
+Epoch  3, CIFAR-10 Batch 3:  Loss:     1.0670 - Valid Accuracy: 57.56%
+Epoch  3, CIFAR-10 Batch 4:  Loss:     1.0798 - Valid Accuracy: 57.62%
+Epoch  3, CIFAR-10 Batch 5:  Loss:     1.0199 - Valid Accuracy: 59.20%
+Epoch  4, CIFAR-10 Batch 1:  Loss:     1.0873 - Valid Accuracy: 61.30%
+Epoch  4, CIFAR-10 Batch 2:  Loss:     1.0366 - Valid Accuracy: 61.46%
+Epoch  4, CIFAR-10 Batch 3:  Loss:     0.9007 - Valid Accuracy: 61.18%
+Epoch  4, CIFAR-10 Batch 4:  Loss:     0.8702 - Valid Accuracy: 63.08%
+Epoch  4, CIFAR-10 Batch 5:  Loss:     0.8193 - Valid Accuracy: 65.02%
+Epoch  5, CIFAR-10 Batch 1:  Loss:     0.9168 - Valid Accuracy: 63.96%
+Epoch  5, CIFAR-10 Batch 2:  Loss:     0.8561 - Valid Accuracy: 65.00%
+Epoch  5, CIFAR-10 Batch 3:  Loss:     0.7567 - Valid Accuracy: 65.08%
+Epoch  5, CIFAR-10 Batch 4:  Loss:     0.7559 - Valid Accuracy: 65.62%
+Epoch  5, CIFAR-10 Batch 5:  Loss:     0.7026 - Valid Accuracy: 66.76%
+Epoch  6, CIFAR-10 Batch 1:  Loss:     0.8156 - Valid Accuracy: 67.12%
+Epoch  6, CIFAR-10 Batch 2:  Loss:     0.7452 - Valid Accuracy: 67.18%
+Epoch  6, CIFAR-10 Batch 3:  Loss:     0.6952 - Valid Accuracy: 65.44%
+Epoch  6, CIFAR-10 Batch 4:  Loss:     0.6694 - Valid Accuracy: 66.68%
+Epoch  6, CIFAR-10 Batch 5:  Loss:     0.6272 - Valid Accuracy: 67.82%
+Epoch  7, CIFAR-10 Batch 1:  Loss:     0.7157 - Valid Accuracy: 68.12%
+Epoch  7, CIFAR-10 Batch 2:  Loss:     0.6450 - Valid Accuracy: 67.92%
+Epoch  7, CIFAR-10 Batch 3:  Loss:     0.6158 - Valid Accuracy: 66.10%
+Epoch  7, CIFAR-10 Batch 4:  Loss:     0.5492 - Valid Accuracy: 68.74%
+Epoch  7, CIFAR-10 Batch 5:  Loss:     0.5193 - Valid Accuracy: 68.12%
+Epoch  8, CIFAR-10 Batch 1:  Loss:     0.6347 - Valid Accuracy: 69.08%
+Epoch  8, CIFAR-10 Batch 2:  Loss:     0.5660 - Valid Accuracy: 69.54%
+Epoch  8, CIFAR-10 Batch 3:  Loss:     0.4913 - Valid Accuracy: 67.54%
+Epoch  8, CIFAR-10 Batch 4:  Loss:     0.4737 - Valid Accuracy: 69.96%
+Epoch  8, CIFAR-10 Batch 5:  Loss:     0.4385 - Valid Accuracy: 70.38%
+Epoch  9, CIFAR-10 Batch 1:  Loss:     0.5353 - Valid Accuracy: 69.56%
+Epoch  9, CIFAR-10 Batch 2:  Loss:     0.4946 - Valid Accuracy: 69.56%
+Epoch  9, CIFAR-10 Batch 3:  Loss:     0.4304 - Valid Accuracy: 68.02%
+Epoch  9, CIFAR-10 Batch 4:  Loss:     0.4035 - Valid Accuracy: 70.64%
+Epoch  9, CIFAR-10 Batch 5:  Loss:     0.3885 - Valid Accuracy: 70.56%

Checkpoint

The model has been saved to disk.


Test Model

Test your model against the test dataset. This will be your final accuracy. You should have an accuracy greater than 50%. If you don't, keep tweaking the model architecture and parameters.

In [65]:
+"""
+DON'T MODIFY ANYTHING IN THIS CELL
+"""
+%matplotlib inline
+%config InlineBackend.figure_format = 'retina'
+
+import tensorflow as tf
+import pickle
+import helper
+import random
+
+# Set batch size if not already set
+try:
+    if batch_size:
+        pass
+except NameError:
+    batch_size = 64
+
+save_model_path = './image_classification'
+n_samples = 4
+top_n_predictions = 3
+
+def test_model():
+    """
+    Test the saved model against the test dataset
+    """
+
+    test_features, test_labels = pickle.load(open('preprocess_training.p', mode='rb'))
+    loaded_graph = tf.Graph()
+
+    with tf.Session(graph=loaded_graph) as sess:
+        # Load model
+        loader = tf.train.import_meta_graph(save_model_path + '.meta')
+        loader.restore(sess, save_model_path)
+
+        # Get Tensors from loaded model
+        loaded_x = loaded_graph.get_tensor_by_name('x:0')
+        loaded_y = loaded_graph.get_tensor_by_name('y:0')
+        loaded_keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')
+        loaded_logits = loaded_graph.get_tensor_by_name('logits:0')
+        loaded_acc = loaded_graph.get_tensor_by_name('accuracy:0')
+        
+        # Get accuracy in batches for memory limitations
+        test_batch_acc_total = 0
+        test_batch_count = 0
+        
+        for train_feature_batch, train_label_batch in helper.batch_features_labels(test_features, test_labels, batch_size):
+            test_batch_acc_total += sess.run(
+                loaded_acc,
+                feed_dict={loaded_x: train_feature_batch, loaded_y: train_label_batch, loaded_keep_prob: 1.0})
+            test_batch_count += 1
+
+        print('Testing Accuracy: {}\n'.format(test_batch_acc_total/test_batch_count))
+
+        # Print Random Samples
+        random_test_features, random_test_labels = tuple(zip(*random.sample(list(zip(test_features, test_labels)), n_samples)))
+        random_test_predictions = sess.run(
+            tf.nn.top_k(tf.nn.softmax(loaded_logits), top_n_predictions),
+            feed_dict={loaded_x: random_test_features, loaded_y: random_test_labels, loaded_keep_prob: 1.0})
+        helper.display_image_predictions(random_test_features, random_test_labels, random_test_predictions)
+
+
+test_model()
+
Testing Accuracy: 0.7010512441396713
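The testing accuracy above is averaged over mini-batches produced by `helper.batch_features_labels`. The helper's exact implementation isn't shown in this notebook, but a minimal generator with the same interface might look like this (a sketch, not the project's actual helper code):

```python
import numpy as np

def batch_features_labels(features, labels, batch_size):
    """Yield (features, labels) mini-batches of at most batch_size samples."""
    for start in range(0, len(features), batch_size):
        end = start + batch_size
        yield features[start:end], labels[start:end]

# Example: 10 samples split into batches of 4 gives batch sizes 4, 4, 2
features = np.zeros((10, 32, 32, 3))
labels = np.zeros((10, 10))
sizes = [len(f) for f, _ in batch_features_labels(features, labels, 4)]
print(sizes)  # [4, 4, 2]
```

Note that the last batch can be smaller than `batch_size`, which is why the test loop averages accuracy over `test_batch_count` rather than assuming equal-sized batches.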

Why 50-70% Accuracy?

You might be wondering why you can't push the accuracy any higher. First things first: 50% isn't bad for a simple CNN, since pure guessing would only get you 10% accuracy. However, you might notice people getting scores well above 70%. That's because we haven't taught you all there is to know about neural networks yet. We still need to cover a few more techniques.
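One technique that reliably helps on CIFAR-10, and that this project doesn't cover, is data augmentation: randomly perturbing training images so the network sees more variation per epoch. A minimal sketch using NumPy, assuming images are stored as `(N, 32, 32, 3)` arrays as in the preprocessed batches (this is an illustration, not part of the project code):

```python
import numpy as np

def augment_batch(images):
    """Randomly flip each image left-right -- a cheap augmentation that
    typically adds a few points of test accuracy on CIFAR-10."""
    flip = np.random.rand(len(images)) < 0.5  # pick ~half the images
    out = images.copy()
    out[flip] = out[flip, :, ::-1, :]  # reverse the width axis
    return out
```

If you try this, apply it only to the training batches (e.g. on `batch_features` just before `train_neural_network`), never to the validation or test data.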


Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_image_classification.ipynb" and save it as an HTML file under "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.

diff --git a/image-classification/image-classification/dlnd_image_classification.ipynb b/image-classification/image-classification/dlnd_image_classification.ipynb new file mode 100644 index 0000000..d1ebe06 --- /dev/null +++ b/image-classification/image-classification/dlnd_image_classification.ipynb @@ -0,0 +1,1260 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "source": [ + "# Image Classification\n", + "In this project, you'll classify images from the [CIFAR-10 dataset](https://www.cs.toronto.edu/~kriz/cifar.html). The dataset consists of airplanes, dogs, cats, and other objects. You'll preprocess the images, then train a convolutional neural network on all the samples. The images need to be normalized and the labels need to be one-hot encoded. You'll get to apply what you learned and build convolutional, max pooling, dropout, and fully connected layers. At the end, you'll get to see your neural network's predictions on the sample images.\n", + "## Get the Data\n", + "Run the following cell to download the [CIFAR-10 dataset for python](https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz)."
+ ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "All files found!\n" + ] + } + ], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "from urllib.request import urlretrieve\n", + "from os.path import isfile, isdir\n", + "from tqdm import tqdm\n", + "import problem_unittests as tests\n", + "import tarfile\n", + "\n", + "cifar10_dataset_folder_path = 'cifar-10-batches-py'\n", + "\n", + "class DLProgress(tqdm):\n", + " last_block = 0\n", + "\n", + " def hook(self, block_num=1, block_size=1, total_size=None):\n", + " self.total = total_size\n", + " self.update((block_num - self.last_block) * block_size)\n", + " self.last_block = block_num\n", + "\n", + "if not isfile('cifar-10-python.tar.gz'):\n", + " with DLProgress(unit='B', unit_scale=True, miniters=1, desc='CIFAR-10 Dataset') as pbar:\n", + " urlretrieve(\n", + " 'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz',\n", + " 'cifar-10-python.tar.gz',\n", + " pbar.hook)\n", + "\n", + "if not isdir(cifar10_dataset_folder_path):\n", + " with tarfile.open('cifar-10-python.tar.gz') as tar:\n", + " tar.extractall()\n", + " tar.close()\n", + "\n", + "\n", + "tests.test_folder_path(cifar10_dataset_folder_path)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Explore the Data\n", + "The dataset is broken into batches to prevent your machine from running out of memory. The CIFAR-10 dataset consists of 5 batches, named `data_batch_1`, `data_batch_2`, etc.. 
Each batch contains the labels and images that are one of the following:\n", + "* airplane\n", + "* automobile\n", + "* bird\n", + "* cat\n", + "* deer\n", + "* dog\n", + "* frog\n", + "* horse\n", + "* ship\n", + "* truck\n", + "\n", + "Understanding a dataset is part of making predictions on the data. Play around with the code cell below by changing the `batch_id` and `sample_id`. The `batch_id` is the id for a batch (1-5). The `sample_id` is the id for a image and label pair in the batch.\n", + "\n", + "Ask yourself \"What are all possible labels?\", \"What is the range of values for the image data?\", \"Are the labels in order or random?\". Answers to questions like these will help you preprocess the data and end up with better predictions." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true, + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "Stats of batch 1:\n", + "Samples: 10000\n", + "Label Counts: {0: 1005, 1: 974, 2: 1032, 3: 1016, 4: 999, 5: 937, 6: 1030, 7: 1001, 8: 1025, 9: 981}\n", + "First 20 Labels: [6, 9, 9, 4, 1, 1, 2, 7, 8, 3, 4, 7, 7, 2, 9, 9, 9, 3, 2, 6]\n", + "\n", + "Example of Image 32:\n", + "Image - Min Value: 0 Max Value: 255\n", + "Image - Shape: (32, 32, 3)\n", + "Label - Label Id: 1 Name: automobile\n" + ] + }, + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAfoAAAH0CAYAAADVH+85AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAWJQAAFiUBSVIk8AAAHOhJREFUeJzt3UmT3fd1HuDfnXoeMBATAY7gDA4iNZOSUpFlyZJiRamU\nbKtiV6Vc3uVrJF8gWSSLOE4pjsqJMyqRZDkVSZHFaIomkxQHkQRAAgRAkEA3uvv2dIcstLGX5wQO\npVPPs3/rdN/+3/vibvB2ptNpAwBq6r7TPwAA8DdH0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBF\nDwCFKXoAKEzRA0Bhih4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIUpegAorP9O/wB/U979t++bZnKd\nafwl2d7JXGrtlqML4cx999+duvXqubOp3M9/ciGc2V4fpW5NJpNwZnd3O3VraXkxldtO/LEHg0Hq\n1vxy/Pm44/ZbUrceff894czREwdSt3aGuTfMjas3wplv/8UrqVtnHv1IOHNotZe6NbxyPpV78fx6\nOHPhtRdTt/ZG8Y/T8TT+fm6ttX6nk8r9zqc+Gc588/s/TN167WL8c3E0zr0e62sbuRfkr/CNHgAK\nU/QAUJiiB4DCFD0AFKboAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoLCy63X3\nnLk9lZvpzYczw63cgtp4Es8Nh1upW6dOnkrljh0+Es7sD1PDgW3jRnydbGUl/vdqrbXFZG48iv/b\n+NChg6lb3Zn42tXJU7lbO/vxxcGLl+MLXq211m3xVb7WWnv17LVw5sqlzdSt+98VXxxcX3sjdevU\n8dzrsZb42LlxfSl168b2bjyU3Fyb7I1Tud1h/AWZTHKfVdNEbJxcr7sZfKMHgMIUPQAUpugBoDBF\nDwCFKXoAKEzRA0Bhih4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIWVHbV57qeXcsFRfK1gbj4+gNFa\na91ufPWh0/ZTtzY3hqnc/Q+djN/azg3vHDwaH9BZWs699sPNnVRufS0+vPPahTdTt2YXZsKZZ595\nPXVr0uLPx6HDy7lbo9zQzMxSfFhlcamXuvXTH34nnNl+O/eZ8+EnH0jlWmcxHOkOcq/HzDj+Puv3\nc98jN/dyn1W9Xvx3G43iY06ttba7G/8czt66GXyjB4DCFD0AFKboAaAwRQ8AhSl6AChM0QNAYYoe\nAApT9ABQmKIHgMIUPQAUpugBoDBFDwCFKXoAKKzset2l16+kcguz8cWwdngpdWvzRnylqRMfvGut\ntTbaz63evfbqOJyZ6c+nbr343F+GM6PudurWoCX+zq21bjf+b+P19fjiXWutves994Yzb15aT91a\nWp4NZ4bbuQXAmcEklVu7Hn+/ZBfDXks8i+O9vdStb7Tcz3jy6PFwpj+Ov59ba+2WuYVwZmM3t1I4\nTv6M40n8uZrGx0rTuck093vdDL7RA0Bhih4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIUpegAoTNED\nQGGKHgAKU/QAUJiiB4DCyo7adDu9VG5nJz7+MjfMjVlsJEZtjp84mLo1HuVGXK5c2QhnVlZyr8fs\nTPxxXF1aTt16az3+e7XWWmcaf64WVnKv/dlz58KZQX8xdWt3J76WtDqbe+0PHszlZjrx0ZLTx3KD\nU+1M/NbFS5dTpw4eP5TKPfDQI+HM/HLutT9z8rZw5itf+0rq1reefjqVG3QySzO5QaHMtlgnu0h2\nE/hGDwCFKXoAKEzRA0Bhih4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIUpegAoTNEDQGGKHgAKU/QA\nUFjZ9bppZ5zKzS/El8auvrWeutXvx1/+a9c2U7e6g1SsTRL/FLyxG18AbK211cX4itfG+lrqVq+b\ne/Q7Lf67HVo4kLr14H2PhjN333Y6devOO+K5Rx98KHVrMo4vw
7XW2nMvvBrOnH/9XOrWhcuXwpnu\n8pHUrW4n933rtYvxz50nP/H+1K3f+MwnwpmDR46nbh0/cWsqdygzWJp87aeZpbx3kG/0AFCYogeA\nwhQ9ABSm6AGgMEUPAIUpegAoTNEDQGGKHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8AhZVdr5tM9lK5\n/iA+8zY3n5uGm1+YD2d2hru5W/O5P/XScnx5bX97K3WrM44vDo4nuaW8yV5u3fDUrUfDmc/93j9O\n3VpYja94DXd2UrfeTqzyPXM19xpeffP1VO7shQvhzKS7mrrVWR2GM8dO3p66tbq8lMp996tfC2fO\nfed7qVuHf/e3wplPffrTqVvvfvjxVO5LX/ijcCb5MdB6M/GV02575xbvfKMHgMIUPQAUpugBoDBF\nDwCFKXoAKEzRA0Bhih4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIWVHbVZWZlN5fZ3R+FMfyb376Wd\nnc1wZnYuN6CzuLCYyi3PL4cz3ZncrdEoMUQ0yI0XDXdzwzuP/vrvhjM7h+5M3Xrl0vlwZrSXG/np\n7cZfx2+ejf98rbW2tXE9lZufjQ/23H/mqdStpz76aDhz62p8AKq11k6fio8XtdbayeX4x/cX/91/\nTN360/8Uzz3x7vekbq3M5T4/pi3+DM914p/3rbXW73XCmZ1JPHOz+EYPAIUpegAoTNEDQGGKHgAK\nU/QAUJiiB4DCFD0AFKboAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQWNn1uu3NaSq3mli9G7dJ6lYb\nxJfoet3c7zXbnUnlVuZWw5n5+dxjtb72VjzUif98rbV2z633pnJ33f+RcGZjM/F7tdZ6k/ha29XM\na9hamxvFn+H9xOJda60Nd3MLe0ePx9fhFuZz32Uurr0dzly/vpG6dXA59wy/78kPhTN//Cf/PnXr\n3/zhH4cz21evpW59+H1PpHIrM/HPxsk49yxO9nbDmcFs7jP4ZvCNHgAKU/QAUJiiB4DCFD0AFKbo\nAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUVnbUZnaul8pNJ6Nwpj/I/Xup10v8jKPc\nrSOHDqdy994dH3/5Pz/7VurWXHc+nJnGd19aa63d/uCHU7leP/4z7m1fSd2aTMfhzGwnPpTUWmuj\nyXY4s92J/3yttdbZ30zlThy9L5x57K5bU7fmluLvzRdfeD5165994Y9SuV6Lf1bt7+VGXIYr8aq4\nvJEb+fmnf/ivUrnpXvx5HC0cSd3qXbsRzky6yfGzm8A3egAoTNEDQGGKHgAKU/QAUJiiB4DCFD0A\nFKboAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMLKrtetHJhN5Sbj+GrV6txi6tb+ZCucmXY6\nqVvd3GhVW2rxe6cPnUrdmszEM8OZ3Erh8u1nUrndjbVwZm9rL3WruxtfeRuMh6lb+9P4v/mXu7nv\nCZc34s99a62d/WF8He76xQupW6P96+HM1vV4prXWzl1eT+WmcyvhzOyxu1O3JisL4cyfPftq6tb6\nZm71btCPP48nzzycurWzEf+b7Wzlno+bwTd6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBF\nDwCFKXoAKEzRA0Bhih4AClP0AFCYogeAwsqu181Ml1O58XQ7cWspdatNBuFIL/kXW9/cTeWef/W1\ncGZp5Wjq1upCfCnv/rtuT92a6+T+jbu8HM8dWTqeurW1Pg5nZie51cbdcXyBcXjyUOrWjXsOp3Jt\ndxqOvJxcr9vdjr8312dyz+KR07k3dTexLLm1cS11a/d6YrVxL7dS2EsuUmZWIg8fXE3d+oPf/51w\nZr6f+71uBt/oAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBFDwCFKXoAKEzR\nA0BhhUdtZlK57nw8tz/Kv
Yz9+fhoyWhjJ3Vrvhu/1Vprc5ONcObobHxso7XWjpw4E87c/eRHUreO\n3n46lZvrz4Uzvfn51K29ySPhzKAzSd3qT0bhzPY4d+v6MH6rtdZ+9NNXwpn95Pvl+valcGZ3ez11\na28t/h5rrbXxeD+c6eTemm22E/+bLS/Eh4Faa224Fx+naa21lbn4e3Oxn/vsPn1LZrgr93rcDL7R\nA0Bhih4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIUpegAoTNEDQGGKHgAKU/QAUJiiB4DCFD0AFFZ2\nvW60P03llhKvyNGV3BrXPbfE15bufNcdqVsH7noolTv1yOPxW6fuT93aWDoezuxOc8tw452tVG6v\nF1+g6kxzK28zidW7nf3credfejmc+ca3vpe69dxzL6Zyw7X4otygm1tCm+3E1x6nrZe6Ne3mPoZn\n5+LP4tJ87tbCIP67jce575Gd4WYqtzyTeP2nuZ/x0HgYzjxx6ljq1s3gGz0AFKboAaAwRQ8AhSl6\nAChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBFDwCFKXoAKKzsqM0/+vSjqdypIwfCmaXb707d\n6twaH5qZPXZP6tbc6uFUbtKfCWe2b2ynbg3mF8KZ7n58fKS11jYTv1drrW2P46Mxo+3c6NF3n/5x\nOPP0t59O3bp8/vVwZrx7I3Vr0N9J5VZm40NVg2l8OKq11rrd+KDQYNBJ3TqQGKdprbVOiz/7s73k\nR37iuX/2Zz9Lndrf303lHnvgvnCm08m9N4984EPhzDPf/0bq1s2YwvGNHgAKU/QAUJiiB4DCFD0A\nFKboAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoLCy63X3Pracyr29Ec/1TzyZ\nunX/g4+HM71pbq1t0I2vT/0iF890D8SXv1prrZ8Y/xr1UqfalWHudXzp+WfDma//4JnUrS9/50fh\nzGo/92/35V58xaszl7vV66+kcguJB2QuNyjXpm0/nOlM4+t6rbU2M87ltifxv9ne/l7q1nQcf790\ndtZTtzbfvp7K9c48HM7s7G6lbn3jm98NZ/7W4aXUrZvBN3oAKEzRA0Bhih4AClP0AFCYogeAwhQ9\nABSm6AGgMEUPAIUpegAoTNEDQGGKHgAKU/QAUFjZUZs/f+GVVO5Y965w5vLP/kPq1qt3fS+cWVg9\nnLrVm+RGbVov/ojsj3KDMaNx/GfcHe2kbp0//1oqt70RH8GYX1hI3bpvNZ4ZzOZuXV9bC2emndx4\n0YF+bsRl0I8/H9vJEZfRfvwZTuy+tNZa2xrlfsY2jb8e/ZncR/5M4ivhiVtPpm69+eabqdzzLzwX\nzty7upi6Nfn5T8KZw6d/PXXrZvCNHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8AhSl6AChM0QNAYYoe\nAApT9ABQmKIHgMIUPQAUpugBoLCy63WPd3PLSRfX42top+58PHXrpbOJhb3xudSt0X5uMaw/OxPO\nzC/kVs0mrRPODGZ6qVudlns9Tt93bzhz4sSJ1K3nv/SlcGa8fjl1q9uL/50vXLmYurW3nFvYW+jH\n/9aj8Sh1q5cYe+x2ct+bBv1krjsIZ3JPfWvb6zfimd3cnN+Dd55K5X7tA/HP4ScfeTB1a+XALeHM\ny8P91K3TqdRf5xs9ABSm6AGgMEUPAIUpegAoTNEDQGGKHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8A\nhSl6AChM0QNAYWXX6774P19K5V7bmQ1nfv/3PpS6dfcdd4QzL59/PXVrtLuRyrVx/N+C45291Kl+\nN75O1pnk9rhOHjySyt1xa3wVcX+SW/Ga68cX5db2creuvH4lnHnpf389devE3fEFwNZau/+RM+FM\nfA/xFwYz8WW43iCeaa21/iT3U7557e14Zi2eaa21+JPY2vIg/lnaWmt/8A8+n8p9IPF8vHF
tM3Xr\n7c34mt8gsQR6s/hGDwCFKXoAKEzRA0Bhih4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIUpegAoTNED\nQGGKHgAKKztqs3Xi0VTus+96KJw5tLqYuvXi2fjAxGAm9yebG6ymcqPEaMw4+e/HaWLbYzTNjdq8\nfPFCKvfKm5fCmSceeyx1a3k2PvKzfmMndWv/enzU5tSRQ6lba5dzr3334QfDmf7sXOrWeG8/nLmx\ndj11a3O4lcrNJT4L7j5yPHXr6uX4cz+e5p7F85eupnLjyQvhTKeNUrcyA0aTrdzIz83gGz0AFKbo\nAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBFDwCFKXoAKEzRA0BhnWly/euX\n3U++/fXUL3Zw6UA4c2WSW5T76v/4Sjhz9ODB1K0rV9ZSucl4Es4sLebW/DqT+JLUzl5ufWq0v5fK\nXVu7Fs58/JOfSt361//5v4czw/XLqVv7198KZ+Z749Stbzz9g1Tukfc/Fc7sT+PPb2utbe/shjMr\nSwupW7et5pYlN6/H39M///nPU7cuX44/V3fefzp1660rufW6aYuvPXb7yTXQ+aVwZmsv9yxeOfd8\nYtfzr/ONHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAU\nlvsf/X8VDHPDCD+6+Eo4szV/d+rWmUfeG84sHlhO3brjkdwuwuxMfKBmcTb3WG0Ph/HMznbq1nw/\n93pk7g2ns6lbt55+JJy59MJG6tYbF86HM/sz86lbg27u+bj6xsVw5vCxY6lbJw8lxqP2d1K3fvqT\nH6Vy587F/2Z7e/upW91B/P1y9tWzqVt7u7mfsT8fHxXqDXLvzY2t+GfV8WNHU7duBt/oAaAwRQ8A\nhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBFDwCFKXoAKEzRA0Bhih4ACiu7Xje3FV92\naq21u8bjcOaff/G/pG4d6cRf/jseeCB1a300TeWe+fGz4cy0P0jdeuKDT4Uz8zO91K252dxq1exM\n/HcbjkepW4dvXApnvvPKS6lb40n8dRx0ct8T+v1c7paV+HLjQif33L/8zE/DmTfeeCN1a28/93x0\nEgOM3V7ute9M48cmk9xC5OLqgVRufzwJZxbmcp8DH3nfE+HM3/3Ex1K3bgbf6AGgMEUPAIUpegAo\nTNEDQGGKHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8AhSl6AChM0QNAYYoeAAoru153ZHE+lescXAln\nTiYyrbV2aBL/d9Z0mlvjGu/tp3J3nDwczqzv537G7Y0L4cxkJv7ztdbaW1evp3K9TnzdcH5pMXXr\na9/8ejhz6cqV1K27jh0NZ+Z68bWw1lrr5UbN2ksvvhjObNy4kbo1SqxYdgYzqVu95KJcm8Z/xu7M\nXOrU3NLBcGZvmvvMmUxyz9Vj990bzvz9z3wydevxRx8OZ3Z3dlK3bgbf6AGgMEUPAIUpegAoTNED\nQGGKHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8AhSl6AChM0QNAYWVHbTZGuQGBrb3NcOZzv/lE6tZk\nNBvO/Omf/SB16wtf/otU7v2PPR7O9FcWUree/vL/CmcWZwapWwcOLKdy68PtcGZnPzfS8fLLr4Qz\ne4kxltZa21qOD+/s749St/YnudGj69fXwpnsYEwqlxiZaa217iA5NLMQH+6attyi0Gzi5XjfQ2dS\nt97/ntzn6Xsff1c4s5wcnNraHIYz49xjf1P4Rg8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAU\npugBoDBFDwCFKXoAKEzRA0Bhih4AClP0AFBY2fW6yxvXU7lhYoDqxo3cy/jauXPhzHPnL6ZuvfHm\nhVTuT/7b+XDm9C2rqVv/5B9+LpzZnosveLXW2refiy/
DtdbaM8+8FM6ce/NK6tYgsTQ2GOTW/Hoz\nM+HMrbfdmbr1l8/+LJXrdOPfS5JDeW3Qj7+nF5dyq42tn8vt7O2GM6eOHk7d+u3PfCqc+dhTT6Vu\n9Qa9VO7G1lY4s7mVWzlt/fj7bPwOztf5Rg8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugB\noDBFDwCFKXoAKEzRA0Bhih4ACis7avPMS8NU7tlXLoUzP3725dStC2+8Hs48ccfx1K3bbjmWyp27\ncjWcueXEydSts9fXw5nvn42PzLTW2rd+9Ewqt7a5Hc70Wm6kY9KJLyyNxqPUrYsX4qNHN9Y3UrfG\no71UbjCID+/MzM2lbvVn4rlRcrNkaSb3MfyJD703nPnN3/h46tapk/H39HAnNxizu5nLdXvx17HX\ny33XHe7Gf8bdndxzfzP4Rg8AhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBFDwCFKXoA\nKEzRA0Bhih4AClP0AFBYZzpNTi79krv/zltTv9i1tfiC2pGF2cyp9skPJtanPvxU6ta//OZ3U7l/\n+1+/msr9f9PJLcN1+53cuUSsn/z39LQljnVzv1fmJ+z3c6trk8zv1Vqb9uPrdd1u7rWf7cdz73ns\n4dStT3/0o6ncww/cF86MJpPUrb39+JLiOPEattZat5PLdRK54dZm6tZoFF+JXFhYSN367G99PveG\n+St8oweAwhQ9ABSm6AGgMEUPAIUpegAoTNEDQGGKHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8AhSl6\nACgsNz/1K+Dv3Xt3KnfX6ZPhzL2PP5G6tXrkVDiztT1M3XrivrtSubVf+0g4c/Hym6lbmxvxJalr\na2upW1vbO6nc7l58tSq++/UOyOxjTQapU/OLi6nc4cX4SuSxo0dSt/7Oxz8Wznzkgx9M3er3c6/j\n+o34+2U0ya2VzgziP2M3OYw6neYW9tY34suj2XXDxaWlcKaTmb68SXyjB4DCFD0AFKboAaAwRQ8A\nhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBFDwCFdabT5PLAL7lXvvgvUr/Y1vxKODNM\nDkVsb8VHKQbd3ODDXD+3X5SZflmP77601lq7lhjpuPzm1dStreF2KreztxfObAy3Ure2tuIDRtPk\nszjciN8aTXJ/6EfOPJTK3XP7HeHM8oHl1K2DKwfDmdE499rvJUdcJi1+r5P+uI8PsuyNc8/H9k7u\nvdnr9cKZ+bm51K1Mb2bfm5/97c//P6/h+EYPAIUpegAoTNEDQGGKHgAKU/QAUJiiB4DCFD0AFKbo\nAaAwRQ8AhSl6AChM0QNAYYoeAApT9ABQWG7S7FfAueXFVG5vZxwPbe+nbmWmpLYHg9SptVFuOWmS\nWCjrJG8dO7gaztx24njq1iSxxtVaa4OZmXBmZjaeaa21lli7Gu/lnsXOOL6g1u3kXsPpNPEea60N\np/F1st3kwt76dnxBrd/PvTcTI3SttdY6vcTr3819txtux9cNd3Yy25etLSwspHKDxOs/HeeexdR6\n3Tu4FOsbPQAUpugBoDBFDwCFKXoAKEzRA0Bhih4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIUpegAo\nrOyoTdvKxbqj+ChIdyY+ttFaa60XH97pd3J/sk52gGQSHzsZDXJDIq0b/xn39nK3sgMTu8P42ElL\n3honBjdyf+XWOjOJQZZe7nvCJLniMu0mhkT68ef3F7n4EFF2KCnzHvtFMB7Z2t5MnRqN4u+zxcXc\nsFg3+VyNxvGfsZf8XOwk/taTSW5A52bwjR4AClP0AFCYogeAwhQ9ABSm6AGgMEUPAIUpegAoTNED\nQGGKHgAKU/QAUJiiB4DCFD0AFKboAaCwTnbFCwD45ecbPQAUpugBoDBFDwCFKXoAKEzRA0Bhih4A\nClP0AFCYogeAwhQ
9ABSm6AGgMEUPAIUpegAoTNEDQGGKHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8A\nhSl6AChM0QNAYYoeAApT9ABQmKIHgMIUPQAUpugBoDBFDwCFKXoAKEzRA0Bhih4AClP0AFCYogeA\nwhQ9ABSm6AGgMEUPAIUpegAoTNEDQGGKHgAKU/QAUJiiB4DCFD0AFKboAaAwRQ8AhSl6AChM0QNA\nYYoeAAr7v3To37eZkHdPAAAAAElFTkSuQmCC\n", + "text/plain": [ + "" + ] + }, + "metadata": { + "image/png": { + "height": 250, + "width": 253 + } + }, + "output_type": "display_data" + } + ], + "source": [ + "%matplotlib inline\n", + "%config InlineBackend.figure_format = 'retina'\n", + "\n", + "import helper\n", + "import numpy as np\n", + "\n", + "# Explore the dataset\n", + "batch_id = 1\n", + "sample_id = 32\n", + "helper.display_stats(cifar10_dataset_folder_path, batch_id, sample_id)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Implement Preprocess Functions\n", + "### Normalize\n", + "In the cell below, implement the `normalize` function to take in image data, `x`, and return it as a normalized Numpy array. The values should be in the range of 0 to 1, inclusive. The return object should be the same shape as `x`." + ] + }, + { + "cell_type": "code", + "execution_count": 171, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "def normalize(x):\n", + " \"\"\"\n", + " Normalize a list of sample image data in the range of 0 to 1\n", + " : x: List of image data. 
The image shape is (32, 32, 3)\n", + " : return: Numpy array of normalize data\n", + " \"\"\"\n", + " return (x - x.min()) / (x.max() - x.min()) # This is local normalization\n", + " # Note : I should divide by a global maximum like 255, \n", + " # so that the normalization does not depend on the image maximum\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_normalize(normalize)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### One-hot encode\n", + "Just like the previous code cell, you'll be implementing a function for preprocessing. This time, you'll implement the `one_hot_encode` function. The input, `x`, are a list of labels. Implement the function to return the list of labels as One-Hot encoded Numpy array. The possible values for labels are 0 to 9. The one-hot encoding function should return the same encoding for each value between each call to `one_hot_encode`. Make sure to save the map of encodings outside the function.\n", + "\n", + "Hint: Don't reinvent the wheel." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "encodings = {}\n", + "for i in range(10):\n", + " v = np.zeros(10)\n", + " v[i] = 1\n", + " encodings[i] = v\n", + "\n", + "def one_hot_encode(x): # I could use LabelBinariser to scale to large labels\n", + " \"\"\"\n", + " One hot encode a list of sample labels. 
Return a one-hot encoded vector for each label.\n", + " : x: List of sample Labels\n", + " : return: Numpy array of one-hot encoded labels\n", + " \"\"\"\n", + " assert np.max(x) < 10 and np.min(x) >= 0, 'label must be between 0 and 9'\n", + " encoded = np.array(list(map(lambda v: encodings[v],x)))\n", + " return encoded\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_one_hot_encode(one_hot_encode)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Randomize Data\n", + "As you saw from exploring the data above, the order of the samples are randomized. It doesn't hurt to randomize it again, but you don't need to for this dataset." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Preprocess all the data and save it\n", + "Running the code cell below will preprocess all the CIFAR-10 data and save it to file. The code below also uses 10% of the training data for validation." + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "# Preprocess Training, Validation, and Testing Data\n", + "helper.preprocess_and_save_data(cifar10_dataset_folder_path, normalize, one_hot_encode)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# Check Point\n", + "This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "import pickle\n", + "import problem_unittests as tests\n", + "import helper\n", + "\n", + "# Load the Preprocessed Validation data\n", + "valid_features, valid_labels = pickle.load(open('preprocess_validation.p', mode='rb'))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Build the network\n", + "For the neural network, you'll build each layer into a function. Most of the code you've seen has been outside of functions. To test your code more thoroughly, we require that you put each layer in a function. This allows us to give you better feedback and test for simple mistakes using our unittests before you submit your project.\n", + "\n", + "If you're finding it hard to dedicate enough time for this course a week, we've provided a small shortcut to this part of the project. In the next couple of problems, you'll have the option to use [TensorFlow Layers](https://www.tensorflow.org/api_docs/python/tf/layers) or [TensorFlow Layers (contrib)](https://www.tensorflow.org/api_guides/python/contrib.layers) to build each layer, except \"Convolutional & Max Pooling\" layer. TF Layers is similar to Keras's and TFLearn's abstraction to layers, so it's easy to pickup.\n", + "\n", + "If you would like to get the most of this course, try to solve all the problems without TF Layers. Let's begin!\n", + "### Input\n", + "The neural network needs to read the image data, one-hot encoded labels, and dropout keep probability. 
Implement the following functions\n", + "* Implement `neural_net_image_input`\n", + " * Return a [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder)\n", + " * Set the shape using `image_shape` with batch size set to `None`.\n", + " * Name the TensorFlow placeholder \"x\" using the TensorFlow `name` parameter in the [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder).\n", + "* Implement `neural_net_label_input`\n", + " * Return a [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder)\n", + " * Set the shape using `n_classes` with batch size set to `None`.\n", + " * Name the TensorFlow placeholder \"y\" using the TensorFlow `name` parameter in the [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder).\n", + "* Implement `neural_net_keep_prob_input`\n", + " * Return a [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder) for dropout keep probability.\n", + " * Name the TensorFlow placeholder \"keep_prob\" using the TensorFlow `name` parameter in the [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder).\n", + "\n", + "These names will be used at the end of the project to load your saved model.\n", + "\n", + "Note: `None` for shapes in TensorFlow allow for a dynamic size." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Image Input Tests Passed.\n", + "Label Input Tests Passed.\n", + "Keep Prob Tests Passed.\n" + ] + } + ], + "source": [ + "import tensorflow as tf\n", + "\n", + "def neural_net_image_input(image_shape):\n", + " \"\"\"\n", + " Return a Tensor for a bach of image input\n", + " : image_shape: Shape of the images\n", + " : return: Tensor for image input.\n", + " \"\"\"\n", + " # shape [None, 32, 32, 3]\n", + " return tf.placeholder(tf.float32, [None] + list(image_shape), name='x')\n", + "\n", + "\n", + "\n", + "def neural_net_label_input(n_classes):\n", + " \"\"\"\n", + " Return a Tensor for a batch of label input\n", + " : n_classes: Number of classes\n", + " : return: Tensor for label input.\n", + " \"\"\"\n", + " return tf.placeholder(tf.float32, [None, n_classes], name='y')\n", + "\n", + "\n", + "def neural_net_keep_prob_input():\n", + " \"\"\"\n", + " Return a Tensor for keep probability\n", + " : return: Tensor for keep probability.\n", + " \"\"\"\n", + " # TODO: Implement Function\n", + " return tf.placeholder(tf.float32, name='keep_prob')\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tf.reset_default_graph()\n", + "tests.test_nn_image_inputs(neural_net_image_input)\n", + "tests.test_nn_label_inputs(neural_net_label_input)\n", + "tests.test_nn_keep_prob_inputs(neural_net_keep_prob_input)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Convolution and Max Pooling Layer\n", + "Convolution layers have a lot of success with images. 
For this code cell, you should implement the function `conv2d_maxpool` to apply convolution then max pooling:\n", + "* Create the weight and bias using `conv_ksize`, `conv_num_outputs` and the shape of `x_tensor`.\n", + "* Apply a convolution to `x_tensor` using weight and `conv_strides`.\n", + " * We recommend you use same padding, but you're welcome to use any padding.\n", + "* Add bias\n", + "* Add a nonlinear activation to the convolution.\n", + "* Apply Max Pooling using `pool_ksize` and `pool_strides`.\n", + " * We recommend you use same padding, but you're welcome to use any padding.\n", + "\n", + "Note: You **can't** use [TensorFlow Layers](https://www.tensorflow.org/api_docs/python/tf/layers) or [TensorFlow Layers (contrib)](https://www.tensorflow.org/api_guides/python/contrib.layers) for this layer. You're free to use any TensorFlow package for all the other layers." + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true, + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "def conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides):\n", + " \"\"\"\n", + " Apply convolution then max pooling to x_tensor\n", + " :param x_tensor: TensorFlow Tensor\n", + " :param conv_num_outputs: Number of outputs for the convolutional layer\n", + " :param conv_strides: Stride 2-D Tuple for convolution\n", + " :param pool_ksize: kernal size 2-D Tuple for pool\n", + " :param pool_strides: Stride 2-D Tuple for pool\n", + " : return: A tensor that represents convolution and max pooling of x_tensor\n", + " \"\"\"\n", + " w_conv = tf.Variable(tf.truncated_normal((conv_ksize[0], # Kernel height\n", + " conv_ksize[1], # Kernel width\n", + " x_tensor.get_shape()[3].value, # Input channels\n", + " conv_num_outputs), stddev=0.05)) # output depth\n", + " bias_conv = 
tf.Variable(tf.random_normal([conv_num_outputs], stddev=0.05))\n", + " \n", + " padding = 'VALID'\n", + " \n", + " # Apply the convolution to x_tensor\n", + " cv1 = tf.nn.conv2d(x_tensor,\n", + " w_conv,\n", + " [1, conv_strides[0], conv_strides[1], 1],\n", + " padding=padding\n", + " )\n", + " # Add the bias\n", + " cv1 = tf.nn.bias_add(cv1, bias_conv)\n", + " \n", + " # Apply a ReLU activation\n", + " cv1 = tf.nn.relu(cv1)\n", + " \n", + " # Apply max pooling\n", + " cv1 = tf.nn.max_pool(cv1,\n", + " ksize=[1, pool_ksize[0], pool_ksize[1], 1],\n", + " strides=[1, pool_strides[0], pool_strides[1], 1],\n", + " padding=padding\n", + " )\n", + " \n", + " return cv1\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_con_pool(conv2d_maxpool)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Flatten Layer\n", + "Implement the `flatten` function to change the dimension of `x_tensor` from a 4-D tensor to a 2-D tensor. The output should be the shape (*Batch Size*, *Flattened Image Size*). You can use [TensorFlow Layers](https://www.tensorflow.org/api_docs/python/tf/layers) or [TensorFlow Layers (contrib)](https://www.tensorflow.org/api_guides/python/contrib.layers) for this layer."
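The flattened size is just the product of the non-batch dimensions. A minimal NumPy sketch of the shape transformation (toy shapes, not the notebook's actual tensors):

```python
import numpy as np

# Hedged sketch of the flatten step with toy shapes
# (10 feature maps of 4x4 with 8 channels), not the notebook's tensors.
batch = np.zeros((10, 4, 4, 8), dtype=np.float32)

# Flattened size = product of all non-batch dimensions: 4 * 4 * 8 = 128
flat_size = int(np.prod(batch.shape[1:]))
flat = batch.reshape(-1, flat_size)

print(flat.shape)  # (10, 128)
```

The TensorFlow version below does the same thing with `x_tensor.get_shape()` and `tf.reshape`.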
+ ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "\n", + "def flatten(x_tensor):\n", + " \"\"\"\n", + " Flatten x_tensor to (Batch Size, Flattened Image Size)\n", + " : x_tensor: A tensor of size (Batch Size, ...), where ... are the image dimensions.\n", + " : return: A tensor of size (Batch Size, Flattened Image Size).\n", + " \"\"\"\n", + " flat_size = np.multiply.reduce(x_tensor.get_shape()[1:].as_list())\n", + " return tf.reshape(x_tensor, [-1, flat_size])\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_flatten(flatten)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Fully-Connected Layer\n", + "Implement the `fully_conn` function to apply a fully connected layer to `x_tensor` with the shape (*Batch Size*, *num_outputs*). You can use [TensorFlow Layers](https://www.tensorflow.org/api_docs/python/tf/layers) or [TensorFlow Layers (contrib)](https://www.tensorflow.org/api_guides/python/contrib.layers) for this layer." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "def fully_conn(x_tensor, num_outputs):\n", + " \"\"\"\n", + " Apply a fully connected layer to x_tensor using weight and bias\n", + " : x_tensor: A 2-D tensor where the first dimension is batch size.\n", + " : num_outputs: The number of outputs the new tensor should have.\n", + " : return: A 2-D tensor where the second dimension is num_outputs.\n", + " \"\"\"\n", + " full_w = tf.Variable(tf.truncated_normal((x_tensor.get_shape()[1].value, num_outputs), stddev=0.05))\n", + " full_b = tf.Variable(tf.random_normal([num_outputs], stddev=0.05))\n", + " \n", + " full = tf.add(tf.matmul(x_tensor, full_w), full_b)\n", + " full = tf.nn.relu(full)\n", + " return full\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_fully_conn(fully_conn)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Output Layer\n", + "Implement the `output` function to apply a fully connected layer to `x_tensor` with the shape (*Batch Size*, *num_outputs*). You can use [TensorFlow Layers](https://www.tensorflow.org/api_docs/python/tf/layers) or [TensorFlow Layers (contrib)](https://www.tensorflow.org/api_guides/python/contrib.layers) for this layer.\n", + "\n", + "Note: Activation, softmax, or cross entropy shouldn't be applied to this."
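The reason the output layer returns raw logits: the loss used later in this notebook, `tf.nn.softmax_cross_entropy_with_logits`, applies the softmax itself for numerical stability. A hedged NumPy sketch of what that loss computes for a single example (toy values, assumed for illustration):

```python
import numpy as np

# Toy raw logits and a one-hot label (assumed values, not model output)
logits = np.array([2.0, 1.0, 0.1])
label = np.array([1.0, 0.0, 0.0])

# Softmax with the usual max-subtraction trick for numerical stability
exp = np.exp(logits - logits.max())
probs = exp / exp.sum()

# Cross entropy: -sum(label * log(softmax(logits)))
loss = -np.sum(label * np.log(probs))
print(loss)  # roughly 0.417
```

Applying softmax inside `output` would make the framework apply it twice, which is why the note above says to leave the logits raw.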
+ ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "def output(x_tensor, num_outputs):\n", + " \"\"\"\n", + " Apply an output layer to x_tensor using weight and bias\n", + " : x_tensor: A 2-D tensor where the first dimension is batch size.\n", + " : num_outputs: The number of outputs the new tensor should have.\n", + " : return: A 2-D tensor where the second dimension is num_outputs.\n", + " \"\"\"\n", + " out_w = tf.Variable(tf.truncated_normal((x_tensor.get_shape()[1].value, num_outputs)))\n", + " out_b = tf.Variable(tf.zeros(num_outputs))\n", + " \n", + " out = tf.add(tf.matmul(x_tensor, out_w), out_b)\n", + " return out\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_output(output)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Create Convolutional Model\n", + "Implement the function `conv_net` to create a convolutional neural network model. The function takes in a batch of images, `x`, and outputs logits. Use the layers you created above to create this model:\n", + "\n", + "* Apply 1, 2, or 3 Convolution and Max Pool layers\n", + "* Apply a Flatten Layer\n", + "* Apply 1, 2, or 3 Fully Connected Layers\n", + "* Apply an Output Layer\n", + "* Return the output\n", + "* Apply [TensorFlow's Dropout](https://www.tensorflow.org/api_docs/python/tf/nn/dropout) to one or more layers in the model using `keep_prob`.
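Since `conv2d_maxpool` above uses VALID padding, the spatial sizes printed by the next cell can be checked by hand with the VALID-padding formula, floor((in - k) / s) + 1. A small sketch of that arithmetic for three 3x3-conv (stride 1) plus 2x2-pool (stride 2) blocks on 32x32 input:

```python
def valid_out(size, k, s):
    # Output spatial size under VALID padding: floor((size - k) / s) + 1
    return (size - k) // s + 1

sizes = []
size = 32
for _ in range(3):
    size = valid_out(size, 3, 1)  # 3x3 convolution, stride 1
    size = valid_out(size, 2, 2)  # 2x2 max pooling, stride 2
    sizes.append(size)

print(sizes)  # [15, 6, 2]
```

The last size explains the flatten width: 2 x 2 x 400 = 1600 features, matching the `Reshape:0` tensor in the cell output.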
" + ] + }, + { + "cell_type": "code", + "execution_count": 50, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "--------------\n", + "Tensor(\"x:0\", shape=(?, 32, 32, 3), dtype=float32)\n", + "Tensor(\"MaxPool:0\", shape=(?, 15, 15, 60), dtype=float32)\n", + "Tensor(\"MaxPool_1:0\", shape=(?, 6, 6, 120), dtype=float32)\n", + "Tensor(\"MaxPool_2:0\", shape=(?, 2, 2, 400), dtype=float32)\n", + "Tensor(\"Reshape:0\", shape=(?, 1600), dtype=float32)\n", + "Tensor(\"dropout/mul:0\", shape=(?, 600), dtype=float32)\n", + "Tensor(\"dropout_1/mul:0\", shape=(?, 60), dtype=float32)\n", + "Tensor(\"Add_2:0\", shape=(?, 10), dtype=float32)\n", + "--------------\n", + "Tensor(\"Placeholder:0\", shape=(?, 32, 32, 3), dtype=float32)\n", + "Tensor(\"MaxPool_3:0\", shape=(?, 15, 15, 60), dtype=float32)\n", + "Tensor(\"MaxPool_4:0\", shape=(?, 6, 6, 120), dtype=float32)\n", + "Tensor(\"MaxPool_5:0\", shape=(?, 2, 2, 400), dtype=float32)\n", + "Tensor(\"Reshape_4:0\", shape=(?, 1600), dtype=float32)\n", + "Tensor(\"dropout_2/mul:0\", shape=(?, 600), dtype=float32)\n", + "Tensor(\"dropout_3/mul:0\", shape=(?, 60), dtype=float32)\n", + "Tensor(\"Add_5:0\", shape=(?, 10), dtype=float32)\n", + "Neural Network Built!\n" + ] + } + ], + "source": [ + "def conv_net(x, keep_prob):\n", + " \"\"\"\n", + " Create a convolutional neural network model\n", + " : x: Placeholder tensor that holds image data.\n", + " : keep_prob: Placeholder tensor that hold dropout keep probability.\n", + " : return: Tensor that represents logits\n", + " \"\"\"\n", + " \n", + " \n", + " conv1_hyper_params = {\n", + " 'conv_num_outputs': 60,\n", + " 'conv_ksize': (3,3),\n", + " 'conv_strides': (1,1),\n", + " 'pool_ksize': (2,2),\n", + " 'pool_strides': (2,2)\n", + " }\n", + " \n", + " conv2_hyper_params = {\n", + " 'conv_num_outputs': 120,\n", + " 'conv_ksize': (3,3),\n", + " 'conv_strides': (1,1),\n", + 
" 'pool_ksize': (2,2),\n", + " 'pool_strides': (2,2)\n", + " }\n", + " \n", + " conv3_hyper_params = {\n", + " 'conv_num_outputs': 400,\n", + " 'conv_ksize': (3,3),\n", + " 'conv_strides': (1,1),\n", + " 'pool_ksize': (2,2),\n", + " 'pool_strides': (2,2)\n", + " }\n", + " \n", + " \n", + " conv1 = conv2d_maxpool(x, **conv1_hyper_params) \n", + " conv2 = conv2d_maxpool(conv1, **conv2_hyper_params)\n", + " conv3 = conv2d_maxpool(conv2, **conv3_hyper_params)\n", + "\n", + " \n", + " \n", + " \n", + " flat = flatten(conv3)\n", + " \n", + " \n", + " \n", + " \n", + " full1 = fully_conn(flat, 600)\n", + " full1 = tf.nn.dropout(full1, keep_prob)\n", + " \n", + " \n", + " full2 = fully_conn(full1, 60)\n", + " full2 = tf.nn.dropout(full2, keep_prob)\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " # TODO: Apply an Output Layer\n", + " # Set this to the number of classes\n", + " # Function Definition from Above:\n", + " # output(x_tensor, num_outputs)\n", + " out = output(full2, 10)\n", + " \n", + " \n", + " # DEBUG\n", + " print(x)\n", + " print(conv1)\n", + " print(conv2)\n", + " print(conv3)\n", + " print(flat)\n", + " print(full1)\n", + " print(full2)\n", + " print(out)\n", + " # DEBUG\n", + " \n", + " \n", + " return out\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "\n", + "##############################\n", + "## Build the Neural Network ##\n", + "##############################\n", + "\n", + "# Remove previous weights, bias, inputs, etc..\n", + "tf.reset_default_graph()\n", + "\n", + "# Inputs\n", + "x = neural_net_image_input((32, 32, 3))\n", + "y = neural_net_label_input(10)\n", + "keep_prob = neural_net_keep_prob_input()\n", + "\n", + "# Model\n", + "logits = conv_net(x, keep_prob)\n", + "\n", + "# Name logits Tensor, so that is can be loaded from disk after training\n", + "logits = tf.identity(logits, name='logits')\n", + "\n", + "# Loss and Optimizer\n", + "cost = 
tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y))\n", + "optimizer = tf.train.AdamOptimizer().minimize(cost)\n", + "\n", + "# Accuracy\n", + "correct_pred = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))\n", + "accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32), name='accuracy')\n", + "\n", + "tests.test_conv_net(conv_net)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Train the Neural Network\n", + "### Single Optimization\n", + "Implement the function `train_neural_network` to do a single optimization. The optimization should use `optimizer` to optimize in `session` with a `feed_dict` of the following:\n", + "* `x` for image input\n", + "* `y` for labels\n", + "* `keep_prob` for keep probability for dropout\n", + "\n", + "This function will be called for each batch, so `tf.global_variables_initializer()` has already been called.\n", + "\n", + "Note: Nothing needs to be returned. This function is only optimizing the neural network." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "def train_neural_network(session, optimizer, keep_probability, feature_batch, label_batch):\n", + " \"\"\"\n", + " Optimize the session on a batch of images and labels\n", + " : session: Current TensorFlow session\n", + " : optimizer: TensorFlow optimizer function\n", + " : keep_probability: keep probability\n", + " : feature_batch: Batch of Numpy image data\n", + " : label_batch: Batch of Numpy label data\n", + " \"\"\"\n", + " session.run([optimizer], feed_dict={x: feature_batch,\n", + " y: label_batch,\n", + " keep_prob: keep_probability\n", + " })\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_train_nn(train_neural_network)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Show Stats\n", + "Implement the function `print_stats` to print loss and validation accuracy. Use the global variables `valid_features` and `valid_labels` to calculate validation accuracy. Use a keep probability of `1.0` to calculate the loss and validation accuracy." 
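The `accuracy` tensor built earlier compares the argmax of the logits with the argmax of the one-hot labels and averages the matches. A hedged NumPy sketch with made-up values (not real model output):

```python
import numpy as np

# Toy logits for 4 samples over 3 classes, plus one-hot labels
logits = np.array([[0.1, 2.0, 0.3],
                   [1.5, 0.2, 0.1],
                   [0.0, 0.1, 3.0],
                   [2.2, 0.4, 0.1]])
labels = np.array([[0, 1, 0],
                   [0, 1, 0],
                   [0, 0, 1],
                   [1, 0, 0]])

# Mirrors tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))
correct = np.argmax(logits, 1) == np.argmax(labels, 1)

# Mirrors tf.reduce_mean(tf.cast(correct_pred, tf.float32))
accuracy = correct.mean()
print(accuracy)  # 0.75
```

`print_stats` below evaluates this same metric on the validation set with `keep_prob` fixed at 1.0 so dropout is disabled during evaluation.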
+ ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def print_stats(session, feature_batch, label_batch, cost, accuracy):\n", + " \"\"\"\n", + " Print information about loss and validation accuracy\n", + " : session: Current TensorFlow session\n", + " : feature_batch: Batch of Numpy image data\n", + " : label_batch: Batch of Numpy label data\n", + " : cost: TensorFlow cost function\n", + " : accuracy: TensorFlow accuracy function\n", + " \"\"\"\n", + " loss_feed = {\n", + " x: feature_batch,\n", + " y: label_batch,\n", + " keep_prob: 1.\n", + " }\n", + " \n", + " valid_feed = {\n", + " x: valid_features,\n", + " y: valid_labels,\n", + " keep_prob: 1.\n", + " }\n", + " \n", + " loss = session.run(cost, loss_feed)\n", + " valid_acc = session.run(accuracy, valid_feed)\n", + " \n", + " print('Loss: {:>10.4f} - Valid Accuracy: {:.2f}%'.format(loss, valid_acc * 100))\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Hyperparameters\n", + "Tune the following parameters:\n", + "* Set `epochs` to the number of iterations until the network stops learning or starts overfitting\n", + "* Set `batch_size` to the highest number that your machine has memory for.
Most people set it to a common memory size:\n", + " * 64\n", + " * 128\n", + " * 256\n", + " * ...\n", + "* Set `keep_probability` to the probability of keeping a node using dropout" + ] + }, + { + "cell_type": "code", + "execution_count": 62, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "# Tuned parameters\n", + "epochs = 9\n", + "batch_size = 512\n", + "keep_probability = 0.75" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Train on a Single CIFAR-10 Batch\n", + "Instead of training the neural network on all the CIFAR-10 batches of data, let's use a single batch. This should save time while you iterate on the model to get a better accuracy. Once the final validation accuracy is 50% or greater, run the model on all the data in the next section." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true, + "scrolled": false + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "print('Checking the Training on a Single Batch...')\n", + "with tf.Session() as sess:\n", + " # Initializing the variables\n", + " sess.run(tf.global_variables_initializer())\n", + " \n", + " # Training cycle\n", + " for epoch in range(epochs):\n", + " batch_i = 1\n", + " for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):\n", + " train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)\n", + " print('Epoch {:>2}, CIFAR-10 Batch {}: '.format(epoch + 1, batch_i), end='')\n", + " print_stats(sess, batch_features, batch_labels, cost, accuracy)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Fully Train the Model\n", + "Now that you've got a good accuracy with a
single CIFAR-10 batch, try it with all five batches." + ] + }, + { + "cell_type": "code", + "execution_count": 63, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true, + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training...\n", + "Epoch 1, CIFAR-10 Batch 1: Loss: 2.2161 - Valid Accuracy: 18.40%\n", + "Epoch 1, CIFAR-10 Batch 2: Loss: 2.0625 - Valid Accuracy: 25.98%\n", + "Epoch 1, CIFAR-10 Batch 3: Loss: 1.7185 - Valid Accuracy: 34.68%\n", + "Epoch 1, CIFAR-10 Batch 4: Loss: 1.6533 - Valid Accuracy: 39.72%\n", + "Epoch 1, CIFAR-10 Batch 5: Loss: 1.5159 - Valid Accuracy: 43.98%\n", + "Epoch 2, CIFAR-10 Batch 1: Loss: 1.6357 - Valid Accuracy: 47.40%\n", + "Epoch 2, CIFAR-10 Batch 2: Loss: 1.4579 - Valid Accuracy: 48.64%\n", + "Epoch 2, CIFAR-10 Batch 3: Loss: 1.2390 - Valid Accuracy: 51.38%\n", + "Epoch 2, CIFAR-10 Batch 4: Loss: 1.2468 - Valid Accuracy: 53.06%\n", + "Epoch 2, CIFAR-10 Batch 5: Loss: 1.2385 - Valid Accuracy: 53.86%\n", + "Epoch 3, CIFAR-10 Batch 1: Loss: 1.3218 - Valid Accuracy: 55.72%\n", + "Epoch 3, CIFAR-10 Batch 2: Loss: 1.1992 - Valid Accuracy: 54.00%\n", + "Epoch 3, CIFAR-10 Batch 3: Loss: 1.0670 - Valid Accuracy: 57.56%\n", + "Epoch 3, CIFAR-10 Batch 4: Loss: 1.0798 - Valid Accuracy: 57.62%\n", + "Epoch 3, CIFAR-10 Batch 5: Loss: 1.0199 - Valid Accuracy: 59.20%\n", + "Epoch 4, CIFAR-10 Batch 1: Loss: 1.0873 - Valid Accuracy: 61.30%\n", + "Epoch 4, CIFAR-10 Batch 2: Loss: 1.0366 - Valid Accuracy: 61.46%\n", + "Epoch 4, CIFAR-10 Batch 3: Loss: 0.9007 - Valid Accuracy: 61.18%\n", + "Epoch 4, CIFAR-10 Batch 4: Loss: 0.8702 - Valid Accuracy: 63.08%\n", + "Epoch 4, CIFAR-10 Batch 5: Loss: 0.8193 - Valid Accuracy: 65.02%\n", + "Epoch 5, CIFAR-10 Batch 1: Loss: 0.9168 - Valid Accuracy: 63.96%\n", + "Epoch 5, CIFAR-10 Batch 2: Loss: 0.8561 - Valid Accuracy: 65.00%\n", + "Epoch 5, CIFAR-10 Batch 3: Loss: 0.7567 - Valid Accuracy: 65.08%\n", + "Epoch 5, CIFAR-10 Batch 
4: Loss: 0.7559 - Valid Accuracy: 65.62%\n", + "Epoch 5, CIFAR-10 Batch 5: Loss: 0.7026 - Valid Accuracy: 66.76%\n", + "Epoch 6, CIFAR-10 Batch 1: Loss: 0.8156 - Valid Accuracy: 67.12%\n", + "Epoch 6, CIFAR-10 Batch 2: Loss: 0.7452 - Valid Accuracy: 67.18%\n", + "Epoch 6, CIFAR-10 Batch 3: Loss: 0.6952 - Valid Accuracy: 65.44%\n", + "Epoch 6, CIFAR-10 Batch 4: Loss: 0.6694 - Valid Accuracy: 66.68%\n", + "Epoch 6, CIFAR-10 Batch 5: Loss: 0.6272 - Valid Accuracy: 67.82%\n", + "Epoch 7, CIFAR-10 Batch 1: Loss: 0.7157 - Valid Accuracy: 68.12%\n", + "Epoch 7, CIFAR-10 Batch 2: Loss: 0.6450 - Valid Accuracy: 67.92%\n", + "Epoch 7, CIFAR-10 Batch 3: Loss: 0.6158 - Valid Accuracy: 66.10%\n", + "Epoch 7, CIFAR-10 Batch 4: Loss: 0.5492 - Valid Accuracy: 68.74%\n", + "Epoch 7, CIFAR-10 Batch 5: Loss: 0.5193 - Valid Accuracy: 68.12%\n", + "Epoch 8, CIFAR-10 Batch 1: Loss: 0.6347 - Valid Accuracy: 69.08%\n", + "Epoch 8, CIFAR-10 Batch 2: Loss: 0.5660 - Valid Accuracy: 69.54%\n", + "Epoch 8, CIFAR-10 Batch 3: Loss: 0.4913 - Valid Accuracy: 67.54%\n", + "Epoch 8, CIFAR-10 Batch 4: Loss: 0.4737 - Valid Accuracy: 69.96%\n", + "Epoch 8, CIFAR-10 Batch 5: Loss: 0.4385 - Valid Accuracy: 70.38%\n", + "Epoch 9, CIFAR-10 Batch 1: Loss: 0.5353 - Valid Accuracy: 69.56%\n", + "Epoch 9, CIFAR-10 Batch 2: Loss: 0.4946 - Valid Accuracy: 69.56%\n", + "Epoch 9, CIFAR-10 Batch 3: Loss: 0.4304 - Valid Accuracy: 68.02%\n", + "Epoch 9, CIFAR-10 Batch 4: Loss: 0.4035 - Valid Accuracy: 70.64%\n", + "Epoch 9, CIFAR-10 Batch 5: Loss: 0.3885 - Valid Accuracy: 70.56%\n" + ] + } + ], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "save_model_path = './image_classification'\n", + "\n", + "print('Training...')\n", + "with tf.Session() as sess:\n", + " # Initializing the variables\n", + " sess.run(tf.global_variables_initializer())\n", + " \n", + " # Training cycle\n", + " for epoch in range(epochs):\n", + " # Loop over all batches\n", + " n_batches = 5\n", + " for 
batch_i in range(1, n_batches + 1):\n", + " for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):\n", + " train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)\n", + " print('Epoch {:>2}, CIFAR-10 Batch {}: '.format(epoch + 1, batch_i), end='')\n", + " print_stats(sess, batch_features, batch_labels, cost, accuracy)\n", + " \n", + " # Save Model\n", + " saver = tf.train.Saver()\n", + " save_path = saver.save(sess, save_model_path)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# Checkpoint\n", + "The model has been saved to disk.\n", + "## Test Model\n", + "Test your model against the test dataset. This will be your final accuracy. You should have an accuracy greater than 50%. If you don't, keep tweaking the model architecture and parameters." + ] + }, + { + "cell_type": "code", + "execution_count": 173, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Testing Accuracy: 0.7007892191410064\n", + "\n" + ] + }, + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAA9sAAAN6CAYAAACeyVqXAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAewgAAHsIBbtB1PgAAIABJREFUeJzs3XecLFWZ//HPM3nmXi4qQbIJDIthSaJixoxZ14SLEePu\n6hrWn2FFTLiuYlzDooIu65owK2DAiEhQDIhiBJUkGe6dPPP8/jh1es5UV3VXd9dMz9z5vl+vek1P\nd9Wp09U1Nf3UOec55u6IiIiIiIiISH0G+l0BERERERERke2Ngm0RERERERGRminYFhEREREREamZ\ngm0RERERERGRminYFhEREREREamZgm0RERERERGRminYFhEREREREamZgm0RERERERGRminYFhER\nEREREamZgm0RERERERGRminYFhEREREREamZgm0RERERERGRminYFhEREREREamZgm0RERERERGR\nminYFhEREREREamZgm0RERERERGRminYFhEREREREamZgm0RERERERGRminYFhEREREREamZgm2R\nAmY2Zmb/aGYfNbOfm9nlZjZlZou55fP9rquIrH9m9uKC68t7K267f8G2v1jpOq9VZnZEwfH4cr/r\nJSIiG89Qvysga4uZ7QgcCOwG3CxbFoFtwFbgMuBi4BJ3n+lTNVeUmb0AeBOwU/aUJy978xYiIrXp\n5Rqj69NyOh4iItJXCrYFM9sfOBp4OLAvYBU2WzSzi4DzgHOAb7n7RStXy9VhZh8AXkD4kha/qFU5\nHiJrgpm9CXhth5tNAjcA1wN/Yvnf9Wy9NZQKjO4DRV2vluvlWIqIiPREwfYGZmZ3Aj4A3C97ynM/\nW24O3BG4E/CPWXkXA8e7+/vrrenqMLNnshRoN56mgy9qZnYL4JDc01e7+096rqBIZzoJMMazZTfC\n3/QjsuevMbOTgHe7+6X1Vk9WiLNOAm4z2xk4KPf039z9/Bp3o0BbRET6RsH2BmVmLwfeDIzSHFym\n2rXuptveCrgvsO6CbTMbAI6h+VjcAJwE/AD4G5Bv5bs29/sBwKm5574FPKSuuop0oJugK/0buAXw\ncuC5ZvbP7n5yPdWSFbIuguzEocBXcs99FXh0TeWvt+MhIiLbGQXbG5CZHQv8O8VdpePv08AfCMHm\nFLAj4Yv3noQAndx26bbr0UMINwvS43EdcIi7/7GL8tbzsZDtTyfnY/5a4MAW4BNmtp+7H1NrzaQu\n+c94PV2DVqKu6/l4iIjIdkLB9gZjZs9hKdBuPJ39/mfgBOALwG/cvenLiZkNAncGDiaM8X4Yoftp\nLGe9uk/yOB6P/+oy0M6XI9IvsUvxAnAY5X+jm4GdCX/XRxCGiFiyfTyPX2dmf3H3j6xkpaUz7v4r\nYLDf9ehRbddLd/8a6/94iIjIdkDB9gaSjSf+D4oD7eOAN7ZLhuTuC8DPs+WjZjZOGN/5QuABK1Hv\nVXJAwXPfW/VaiKwQdz+3wmqfAf7NzJ4AfJAQgKct3Aa808y+4O7XrExNRURERLYPmmd7Y3kZoSt4\nFAPtV7r767rJOuzuU+5+irs/CLgLoVV8Pdq54Lm/rnotRNYAdz+FMJ72bwUvb6bzbOciIiIiG46C\n7Y3l8Swfk+zA2e5+fB2Fu/uF7v5/dZTVBzvS3IVxqh8VEVkL3P1PwFE052Uw4Gl9qZSIiIjIOqJg\ne4Mws30I4zDzPrradVmjRvpdAZG1xt2/AZxF81jvXczsrn2okoiIiMi6oTHbG8c+Jc//bFVr0QEz\nuyVhLPWtCS3PQ8A24FLg18CvipK49Vnfk8SZ2TBh7to7ATsBw4TM6hcC57p7Ry322Vj/g4B9CZ/D\nFHAF8HN3/02NVU/3OQDchnCDaHdCNuwxQnb8a7P9n+vuW1di/7LMF4F7Fjx/KPCLXgs3MwPuSjhf\ndwM2EabYuxr4grtf30WZexOGtdyKcO5A6BL/N8J14+Je691m/
xPAPYA7ADcjJKi7FrgIOK/Tv8G1\nwMxuDewP7E04psPAVuB6wswVv+5iHH/fr5fdMLM7An8H7EUYVrEI3AhcAvysH3PSm9lehOv0rbI6\n3QhcBVzo7r9coX2OE/529yP8b9hC6P0ySTg3/gJcDPzJ3edXog4iImueu2vZAAvwBMIXgoVsiY9v\n1++65eq5E/A64IKsjq2Wawgt8wd3uI8TK5RdZTkqK++vNZW3COxRUuc3Faz7mtw6d8ze2w0tyr8R\n+ACwa4Xj9Ejg28Bci/IuAp4PWA2f/SGETPnfJnxZa3es5oDzgFcDW7rc53ElZT+ph/fx2JIy37NK\nf0PxXEn/1md7KO8BBeUtAK/t8Lj+W26d/QmzH1xTcrwWgLt3UM/9gHcCf6pw7vwqq+MuNR/7g4HP\nEqZOLNv3JCER3SG5bV+cvO/4870V97t/wX5+UcP7uQfwYeCyCsd0kXBD7z3AYS3KvLpiWVWWW5Ts\n44iCdb/c47G4C/Ahwg2bdvX6LfBG4JY97K/tZ0rIuP5s4Cdt6nMpITnqzWo4JzYDLwB+CMxX/Jwm\ngR9kf3OH9FoHLVq0aFlPi7qRbxxln/Wuq1qLEmY2YGavJNwFfyOhlcvbLDcDngWcY2anmNkeHeyy\nqLwq6xSt32s5abbnTurdYGZvIGSIfwbhy1DZfjYRvihdZGYPLtqBmd3SzL4BfJkQaA20KG9fQtbq\nM7MW8I6Z2evN7A/A2cCx2T5HW+wzLgPAgcBbgD+b2au62P1rge8XlP3fZrZvF+/lNsDHCso7F3h5\nF/VbC64qeb4oqWBe0/lqZkNm9i5Cr5rnEP6O2/0NlTKzXczs44TeLv9K6MXT7ty5I/Aq4Pdm9ups\nSsOumdmYmX2QcA4/gdDqW7bvUeCJwI/N7L+ynih16eoY5pnZAWb2PeBHwNHALWl/TJ3Qkv/PwA/M\n7IKshb9VHftyveyEme1sZv9LOF+fR7gh3K5+tyPcNP5Tdn3r5TMuu+bfmRBkfwT4+zb12Q14JfAH\nM3t4txUxsycBvyHcsL0XS7lf2i2jhKkHXwWcbWZv77YOIiLrjYLtjePqkuf7Pl2Xme0InE648z7B\n0pcKyy3552Dpn/njgPPN7LCquy0pv906VdbtpJyybdrVPTwI/g94PUvDQvJlFh2zHYGvmtkDlxVs\nth+htfhBNH8OZe/TCS1g3zKzLXTuhYShAukXyrLjk38+brMDcFx206XoC34hd18kJPu6muXvcQfg\nM2ZWeSx/9oX6M4Rjm9b3BkJL+XrtRtlrV9/0fB0DvgG8hKX/P0WfcbWCze4P/BL4R5bP09zu3CFb\ndzPhZs2Xsi6xHTOzmwFnEHp4xHLb1SF6IXCqmY12s++yKtHDZ5bd9DwbuDfd/01CuGE61qaOq3q9\n7JSZHUgIsp+aPdXp+TUKvAE4w8yq3JwqrUq6LzM7HDiT0NruBfsuOl4O3Bz4opk9ruMKmL0W+BRh\nWE88L1p9LmXXaqf8vBAR2e5ozPbG8dvc7/Ef5T+b2Yfc/do+1CmObfwGoQtxeuc+/dJ2BaGr9iSw\nByEwG6I54N6F8MX1oe5+VptdF7V05L8sdNIakq97N+V00/rybuDJLH2eThjv+gdCd8cxwtjn2IMh\n/dIzDHzazO7o7teY2a6EoCHtIeCEsaaXEMZ934LwJXqU5Z8RwN0I3XiP7uJ95D9LCF0ULyGMCb2B\n8JnfjNCaPlGy3eMIXemfXHXH7n6ZmT0dOJWl92TZ+3kvoSdAFccTxkymX8gdeKa7X1K1PmtQWZBQ\ndgOvlf8F7s/y8xVCj5YrgRlgT8JY2JYBqJk9Bvg0S63I0Pyl/o+Elvkpwt/AviyduyTrPYJws+gB\n3sEUiNnNmK8QbjaVXb/+mi0LLF2/0uDkAYTeED+qut+VYmbvB15E8fUsPncZ4dpyLaGnzE6Ea8xg\nbv1W+nW9rMzMDiAMaYnjk
GH5sVgknLeXEc7BvQjnblwvbYE/DPi2md3X3W/osV4HEfIoTLD8fL+K\ncJ5dRwiq9yPcTMoHu8PAx8zsHK84ttzMnkIYnlJ2jk8S/tauJwyh2Ew4brcinCNF24iIbAgKtjcI\nd7/UzC4Cbp97aVfCl8ynuHs+IF8N76U40F4gjBU8wd1/nm6QdVd+EnAMof7pF4nNhADyrl6eWOmN\nhK7PqS8SukrG/cfA7YoWdf9D9vORLAUGd8/eUxpMnAv8S4tyUkXzGpd5BKErXzx2vyK0opzq7pPp\nimZ2T+Dt2fqpWxCOx4sJgUv8srhICIze5+7n5coaJ7QkvjXbPm3leJaZneDu53TwPsi2nyEE+18B\nfkxI7DNXtLKZ3QU4ktCauIXlX+KeaGbPc/f/rrxz92+a2VsJXT/T93O0mX3X3T/VanszewLhGOaD\nvne7+5er1mONOqTk+bLu5WX+kTAONR6jy4E3A19098vTFbNz7MmU/D2Y2f7AJwmBQ+PprOxzCDc+\nvpm/BmTlPozwd3Jnlp839wD+k9DqXtUxhEAqf/2aA94FfNTdf5erw56EY/EaQiBiwFNY+tvri2wo\nSlGgHW9avBP4urv/uWDbMUJX5iMIn9vtWuzqISx9bvcG3sHy6+WZwCsqVrvj5HlVZDeBP8NScj1Y\nqt91hMDzU+5+ZW67OwLPJXSlT28CQTjf/psObgTmeFafz7IUwM4QunSf5LkkaGY2BDwceBvNM5Fs\nIYytf2K7nWZ/M++i+byYJvyPPtndf9Ji+1sT/i8+KqvPzdvtU0Rku9LvQeNaVm8hjJdKEx2lCY9m\ngJMJ3YeHVqk+jy2ozyLhC/ahFba/OXBaroz4+DMd1uVPBWXs08V7OjzZPv78Rg3H6k25cvOJpI6H\n1knKCC1PXy44XtsIScZiedcC96tQpzsTklvlj9uJHb63s4CXApu7OC67EwL0/Hu6AhjusKyBkrJu\nAPZrsd3tCF/A8+fxj1brb6nFuVJHgrQzS/7G7tZim+MoPl9jGZ8FJrqszzjhxlL+eG8DnlWxDCME\nj0Xv68EVy7gboRdJvh6XAnetsP2tCNnc88eplwRpaTmVE6QBD8y9j1jOPPD/Oj2PgQcD3wFu3ma9\nIwrec0+JzOoolzAOuujc+A7VkksewFLizHwZR/Xwmaa//wbYt0I5E8D3CuoyR0lCztz2RclVbwAO\n7OJzGSEkdHtRr5+xFi1atKyXRWO2N5b3E4KQom57Q4Sxq98ArjGz083sjWb2KDPbve6KZNP9vJPm\nu+U3AQ9197PbleHu1xFan89kefc+A55gZveutdJrS9pF8f3u/jJ3b9k9z90XCF908t0YxwjjViEE\nDw9x9++1rYD7BYTWufyxf2In41/d/Z7u/m7vYhovDy2iRxB6D6Tn9S4sjbOsWtZits2VubI2A58t\nGlebdSMuagG7Dniyr99x2gCY2UMJ037lz62rPdfjpI10+68QxrBPlq3cxusJQxkiI5y3D3f3EytV\nJng58D6ar4fHVqzHO1jeO8wIf1sPcfe2U6J5GFrwUODP9LF7bZYc7kP5pwmB9tPc/W2dnsfu/k13\nf0B2jV5XssRjz6a56/hPgEe5e9veR+5+PuGzTd9/vD6+vcuEaWmPm0uAe7n77yvUZZJwXbsp99IA\ncFSF/T40eRzP02Pd/adVKp2ry6y7f8zdP9DptiIi65WC7Q3E3bcR7lLPUJzQJO2K/WBCl9ovAZea\n2aVm9nkze6mZHZgFy714FGGcXxT/ib8u+6JSiYf5ao8kjMnM++eearh2xS9cEFo3qna5xN2vJnQP\nL/r8HHiD57qNt/ExmrtyThCyhK+K7Bx4fsFLT++irCsJ59NifIpwrO5CuFmV925CK1YUz+NnuPtf\nOt3/WpJlVv84xd2K/6+DotLtrwOObndjqEWd4rRD+e76L3f373dR5CsI01XFehpwqJmVdZ2
P9bgd\noTU4X4/Xuvuvqu48u1lUNSfASnkGYSx7FN/LW939M/2pUl8VDfmZBZ7eyQ3B7Dx4Bc3X2l0Iwwa6\nET+bI72DPCvufhnhWp2vS5WEonsXPPe1qvsWEdnoFGxvMB4Shx1BGG8Zv1yWBV7pshuh2/fxhGzV\nl5jZcVn26m68qOC53wL/1WlBHsYQvoPmFtbHrUSr/BriwNu8ZFxzC1/KlRHdQBhvXr0CYd+n0XwO\nrVqwndXjZ4Ss1GmCs7t3WdYZhHHs+fPp2WZ2ZFwvmwanKPB7p7t/tZt9rxXZGPRzCIFB3iRhvH5H\nRRKOzX+5e6djvVPPZynbe3rDqauWsuz8/Q+az99nt9n06IJtfkdzLogqdTiN0KOoX63babAf39Ov\nCcMRNpRsNoWn0fw3/QHvIqdJ1tPiJzSfK//UaVFJXb7p7t0k00tvnMTyDqqwXdEMEzd2sX8RkQ1J\nwfYGlAUTBxCm8VhkedBdFnzD8uB7T8IY8AvN7IROgtqs6+39aP5C89GsK283/pul1shokDCGens1\nTfgMO/XL3O/x+J/SZdfeoi6z+UR8q+Hc3O87ZIm0uvEm4Fs0B9wfNLM7ZDeZTqA5OPoxYfz7mmNm\nh7ZYHmhmTzKzt5vZhYQx1fl50+N58qoqXWlLVOrm3UI+uVQMhHoJUj9DaLmM5RlwnzbbpHMVx+Ny\nUg/1+GiX2/XEzPYBDmb5eezAe7JhJxvNfVia6SB1Qg9lpoka4/l1sJl1myjsY11udz7N/yNvaWab\nilZOFCWh6+pGpojIRqRge4Ny98vc/UjgroSMotewPJiuGnwPAM8BLjCzB1Xc/YEUT+vz6cpvIF+R\nMIXJ92mu6z27LXMNa2Rd7qJVG3e/gqXgIvXDLutzccFzOxY8t9KKAsBWWZFLZUHTkYSM2ek5tQn4\nHOFc3SF53gh/Q09eY0FKrPsQIRFd2fItwo2bVwB3oLnreDzn3t7DeMtL3P3iLrclCwoOoPkGxxe6\nLRPA3WdoHvN/x2z+7LJ6/F1BPT7XQzW+TLh5ttoeUvDcLGGoyUaU/r+I58MF7v7rHsr8NGH8e949\nuizvB91s5O7ThGnK8tpdq/PDIgz4DzO7ZdHKIiKynILtDc7dL3T3FxGyOj+CMPXN2YQvXFWDbydk\nBj8161rbTtGXjCtrGOP649zvxvYZbEe9fAHMJ8vppbyisroOts1sxMweaWbHZnkCLjCzv5jZDWa2\nYGaLRQsha3JeYcBURdbd+amEDLywFFztT5jmKO2ZsUgYp/3Xbve3wvLDQlot+b91J3zGR7t7N632\nsYzS6YEqOoyluZyjK7LxqL26OPe7EbLtFzmooB43VElWVSYL+C+g/MbmSkm7EcfP6fwektetd/n/\nTU74f9g1d7+RMNShjhvBN3pumrxOty94rt21Oh2fHa8Rtwd+aWYvN7OdeqiPiMh2T/NsCwBZttnT\ns4UsW+pdCa3QBxO6190hWz39Et4ogvAF9GNmdlGbTMW3Sh43vuDV8DbSMmLQsE8N5a5VlRPkFChq\nReu2vKKyxjotxMxuBfw7IYlf+gUw34LYqqtu/gtt18E2gLt/38yOIcwHnY6bTPfnwDvc/eu97GuF\ndRLEpe/veuATwPE13Azr9UZEGvzG436lmR3aY7lQ3NMm340+unVBPdpmH6/g54Rr7WoqyrnRU3C5\nzt2K5utLXf+b8jdvuvnf1Ms1H0Ii0fw1rOW12t2/Z2Y/Au6VbOfAToSb828zsx8C3yS0up+btaKL\niAgKtqVE1j35J9lyAkDWbewfgOcRWvfSZFTxn/A4IVlRqyynRWPVerlbH3XTRW49q7v1qc7yOmqh\nM7NXE7Lfj7PUetJVWTntxiO25e5vNbP7EKbASc/3+PNMwhRoa1m7scRThAR5NxBaec8jJEj7Ztbq\nWof8lHOdyregGaGHwVk9llumLNguun51O4a97jI6tSf
N58Za7Z2xGlbzf1MnY7bjNXBbDXUpK7uV\nIwk3YXZJ1o/X6QFCDpb7Zc/PmdnPgO8CZwDfcfeiYUsiIhuCgm2pLJsW6f3A+83s6YQ5anekOQC5\nh5kd7u7fLimq6EtGHdlNi77MD5jZlqwrn6xBZvYBlrJ6p12zoTnwbltcm9+7dRLL55tNHd1DYr+V\nFv8mF9x9pM916fVmTlHwu5LZu8tu1BX1llip69dK21zwXFFCrI2i6DNfqc+22wRpq87dL8l6kHyW\nMPQgP5ws/TscAg7JllcC15vZ54D3uvsFq1RlEZE1Q8G2dMXdT87uXv+QkCgq/6X3CUBZsD1e8FzR\nPNmdKitjE5qqZE0ys39l+fRZsHTjZg74KaFF5RJCi9tNhG7r0zSfcy8AnrkCddyTcGMp7UKZeh3w\nj3XvV5oUBbkrOcY5Py47KupyXkfLXV09CDpR9B7ruBavO9ksGYM0/32vxP8mo4ZeN6vJ3S8B7p7d\naH85YZhZVJTHJdoReC7wHDP7FPBP7n7dilZWRGQNUbAtXXP3C8zs34APsTwQMVpPuVUU+O5Q8Fyn\nysroR4uRtJENSziG5kB7G3AscIK7V/7szOxx9dYQzGyQkKV7Z5qnqovn+tPM7Lvu3pfpmzaQoozO\n/ZiXuuicrOP6VTSf8Uq7Hshnle5HPfrO3WfNbBYYzr20Ev+bnHX6f8ndTwZONrO7Ao8EHkBI9pbe\nRM+3ese/06cC9zaz+7v7n1ajviIi/aZgW3p1IvAOmu/S36bFNkV3tev4glfUBXBuA2fWXeueQfjc\n0yB2K3CYu+fnAq9iJbplvoWQfyAfaOcD7vea2dnqJrmi8jfpHPhvd3/hKtejqJt1Hbkh+pFf4lqa\nE2b1lFRwnbsO2DX33Er9b+o12VlfufsvCIkB35rdlDyIkEj1PoQAPA5RSM8vB/YGvm5mf19jPggR\nkTVLU39JT7JEaufQ3I1s0MzKgp+iYPu2NVSnaE5ldVdbux6dPI5fyI7pMtCG0PpcGzN7OGHMYT7Q\nPpnm+bfHgM+Y2USddZBlirKhdzWPeo+uTh7Hmy1FWb07dfsayujU1QXP1fFe1iv9b+qCuy+4+znu\n/k53fyzhWnwE8EXCtIj57we3B/5plaspItIXCralDleWPF825vHC5HH8snq3GupxQPI4BkYXlqwr\n/XcQzd2AT+6hvAMLyuuKme1FmPIq76eE8YdHEr5EwtI5fAfgw3XsXwoVXTfu0od6FE1ruK+ZFeWi\n6EQ6d/tq+Vnud6N5rumN5EKaA8MDilbs0AE0f7bb7f8md59z99Pc/QnA/WnuDWIoz4WIbBAKtqUO\nRd3sFt29qNUEiqfq2WxmB/ZYj/vT/IVmpaYFKtOPMaTrjpltoTnR1HXuflWX5e3J8vmPu5aM006n\nmjLCGMsnZV8kv0sYV55mTY/jt59TRz2kydk0/33tamZ3X81KuPvvae4CPAA8sNsyzWw/YK9e6tWl\n7yWP47G9Q3azaTWstetl+v8i/k3fu5cCzezWhPm7W+1ru+XuPwReyvJrJcBdzKyO8fAiImuagm2p\nwx1o/tJUGjS5+0UUj1d7ercVyAL1/QteWu0vNEVj0PIJd6R4DOPWHsp7bg/b5h0H3Ivm7uPPzSX1\neTPwLZoD7veYWdG5KD3Iph48j+aWx360kJ1VUI8jeyjvqB627cVZNGdSN+D5q7T/tXa9/FHBc3uY\nWdc3Ugi5KfJmgZ/0UOZ681mWegKl8sn5RES2Owq2pSdmdgCwb/oUIego+tKSOoXmIOWoFuO823l5\nwXPXUT792Eq5Kff7upviZZUUZeLdxcw6viaZ2WZCcNBzK5mZPYJwLuUD7Q+4+ynpuu7uhAArP357\nHPisxm+viP9JHsfrxtFm1ioh42rV43Fmtm/J+qXMbEfgefShldfdp4FP03wtfr6Z1ZoDoUT+egn9\nvV6eTXFugH/tpjA
3a6nVe8w7EWlfdsOLjn\nwTg75+Hj8Xl5AI4168Uw7/QlwiDPuqcKd9qz9Sr++RO64vWmTDy8I0/Z/bK656X6GLQ160FERERE\nRK4ECtuZFLbDPO1Y3c4eT9XsRdgGbx3aMD/a26a7BqcsS0aTCRNgPj3OfDajns9p3Wgc5nVD3bQh\nxLrhbahkW6pmF4vKdn8YOeQhtRdKF2dLN3Q7G8mNeVbZXmzZC8uWvjbIbg+fa9bF5vTcfJp3N/2b\n7OdeFXzJOacKdzrnzS9qcNYiIiIiIiKHk8J2pqts22LOsC/JdAYUFuYve9tS1zXzes4sDhOfTTeo\nZzPcoSyN8XjM2vFjmDmjUcVoZYVyPMbKkqI8S1M7ddNS121oipYao1nRBft8XPdwXvXSQBqHvXdj\n3d1jmE1NyNLc8iI0NCuypmhmXYOy1Jgs7cbyPN4Nq7f4NFs8Zl3ujrezUnnXKC2dU5sdwOPptamQ\n3x95buE/Sn8Iv4iIiIiIyOGisJ3phmG7D+ZsxzDpIZTSNUgLXcrres5sNluE7Y3z1PMaioqyrCjK\nCrPjVKMRK2trjCYTrEhNa43ZdI7N5rQ+BywG4CJWubM5z4uR5IMz7rM8EXfDzuMo+OxZBrGzeklZ\nlosKepy73cZu5qGjufWHhMfzMkvnZdntdLp5q7T8ufEsuhMp6CV6HKOIM8QXs7aNNt/B4nWKyBXL\nzJ4NvJLwN8Ij3P1DB3xKF+XjH/84N95440Gfhojswmw2A+Cmm266ohqkXXfddbz97W8/6NMQkV1Q\n2M7kc7ZboPUwpDyFyXy+smWBtK5r5rMps9lGrGyfp66dclxQFgXFaEw1GtGureFtSzUah+c2DU3d\nYsUGLRvM67j/GLIXoX4xCHtRH16MKV8WuTcFbjy+qsiJy5iVlGVFVVVQWDeE3d1pmgbqJpu/nQ0G\nT8GcbFh993N/1vjmjuOpXO1AOxg6Hr4VsHZxzouqdpENiVfQFpHDx9258847D/o0RGQPYqMlEZF9\np7CdCRXcMIy8LcLo5rbIZhpnlVf3lqZpmE03WD93lrNnz3Lm9CnOnj7N2dNnqN0YO4zLimI0pixL\nyqICK1iZ16xtbDCfz2jaFqvO4RQ0TVo/OlWRF43Y8nniXYUbz84nXNo2nFvbelxrOz3W4l50w8ch\nrBc+Gk/C8mTjcaghx/XEm6bB5vPwHjRZRTl800DXmT19KZDdR/d+ZS3bBvPMO17E0N1/rOv6TtEb\nUm69bxwUuEXkMPrEgz4BEdmVGXASuBa4EirbJyAvnIjIoaewnenCdmG0KWi3UBSLZmCLecwtTT1n\nY+M8Z8+c4b777uXUvac5feo0Z06fprUSt4JiNKGaOFYaRVlRVCMma2uszq6iaWrcHbOStnXqug7n\nkKZYE4atF1aEdbuLkrIs4tDvNIw7nXtD07Sx0t7QNHW8bmhjgHZfzAkvioKyqpisrHaXpknPmzOf\nzXGHpm7iK4/zv7PQ36toL/kyIL5jdAk/zh33fMJ198XBoGlaCttOqGiz+FJBGVtEDi8DtPSXyJXh\nncAXAm8CroSlv24ENHJG5EqisJ1JXb29dbyAtjWKIjZC66q2qYIcho+HsH2a++69h1P3neH0qTOc\nOXUaqgnFaMJodR5DrlFWFeV4wtid1XreBc7WYV7XzKYzmrqOjdfCwlohWId51dWoYlRVVKMqhG6z\nLnQ3TUNdh4A9n4c55PPZnPl8ThvPt21bijIMGy/LitF4wsraMVaOHWN17Rj1fM5sNmU2nWJ2nrZp\nmM/m8d0JdeZelT2/DOdsk1Xfu4t187D7K4sZFkN0Pp88r2R39/eaw6lDmoiIiIiIHE4K2xlfUtlu\nPTRG6+Ztu9M2DW4wm005v36OM2dOc98993Dq1DnOnA5DystxzXjtGKv1PO7XKMqS0ajCmODNsTAf\nHGialvm8pp7PQ9huPRwYFtXssmQ0G
jEajxiNRlRVSVkUlLGLeF3X1PM6hPbZjOnGjOl0ymwWhqq3\nbUvTthRlRVWNqEajELaPHWd17Rgra8eYTTew9QJvnXo+x4oSK8Ia3131uiiWh+3uEl7TInCnofGp\nyVm/q3l/NvxiiLvjcRh6fgnvo8embyIih0iz8yYicrhcD7w4XovIlaIsU6Ppw/+7V2G7JzYoW/Qe\nw7NGXGaGtw3ztsHbmvX1s5w5c4ZT993Hvffcw+mz5zl7doON9fOMvKCezWnqMIzbcEpgVBQUVQXj\nCUUcVt024c9JWVV402DuFB6CaVmWsRpdxJAcmpmVVUlZxMq32aKy3TTMpnOm0ykb0ymz6Zymbanb\nMMe8iFXtMgvc6dI2Tazkx7XCY+XdiiIuDVbsImxnTdSyod9hfnlshuaOu5Hn5fRFRpq/bf3Sd5SG\nk/dbxYnI4WVmDwS+B3gK8HDgDPCnwO3u/iu73MfDgRcBfwf4JKAkjKX8XeAn3P1/72IfXw88nzBm\n9BhhrPcbgR9194+Z2Qfjvl/l7rfs5TVGcSKl/l4SuXJcD7zkoE9CRPYoC9uHvomBwnbGsoWb3fNF\np2LnbjNqb6nnM+r5jPX1c5w9c5pT993Lvffczbnzc86dn3H+/Jy2qJjPZ2EedtNi7pQGo9KoipLC\nx11YNiwM655MsLaljMG8sPCHqQvcVRnCclXGhmvh+WVh1E0b51w3zGZzptMZGxtTprM5dRPmc9dN\nG5YVq0ZxSbISK0qKeJnPZqGSHYedh+oyIWgXRbckWW+OdrwUKYzTX/rL3UOXd28hNjtL1WywfiWb\nNFedbPx4CuBt99/FlidxETlkzOwzgN8m/Is2/QU7AZ4IfJWZvRL4vR328Szg9vi8fO7IpwCPAp5n\nZt/v7v9um328HPj2+GPax6OA7wKeaWZfx2IIjYiIiMi+UNju8d7NsM72YkC0mXXram9Mz3N+/WwM\n2/dxzz33sDFr2Ji1TGcNVo6Zz9Kw8BbcKQxGhYGVlJYq1WPKKgTtlWPHKGmpDEYGVWHdGthFHEpu\nMSCnRmllaZRW0LRNN1R8EbZnTGdz5nVD3bTM6waKIjRqK0vMitC1vAldy6fr6xRWdJVt9zZk2hi2\n0/zxtDxY1ygtVruLrnHaIgq33lJ4i7cFbQzam4JyWmK7G1Ie19fuFbEXTdIWz9e/i0UOKzO7Cvjv\nwHWED+svA68B7gI+DfhnwHOAz95mH08mrMENoSL+I8DvADXwJcC/ILQR/tdmdq+7375kH/8vIWg7\n8GHg3wLvIIT3r4nn8SvA2sW8XhEREZEhhe1MmkPdC4yL1a4AaOqajfPnOXv2DKdPneLM6TOcPXuO\n9XPnmdXOrHbmNYybrBN4HEZeFcaoKjArmcfqsAPj8SQcvyypzBkVMDbCfOwUbosiBM74vK4juRVQ\nGEVRYXiozpdjqCbYaE45r7vGaXUT1sx2C4E3BO0Zbew+PptuMJ9Ns7njoZocKt9F18k8VLMXgbsY\nzNnO19bu1gkvoHDDveg6unebpZJ2lq4dxwrvOpgTg7/5ou2aqtsih9q/IrTOdeBfuPsPZY+9y8x+\nBfivwFcve7KZVcBPxR/PAl/m7n+WbfLHZvZ64G2EyvmPmNnr3P2ebB8PJYwRdeB9wOPd/d5sH39o\nZv8NeAth3R99gyciIiL7RmE7U9jiuoCuy/ZicWenruecP3+eM6dPc+q+U5w5c4Zz59ZZX9+gbo26\nNZqWXtB2b8OQ8MIYVyUUJW6GY7RujEYeu5WXjAzGJYyLsH2I6YZT0Ho35Xnx/C7MxhBuRlk6o7LB\nRg1lHeZyhyW9wpD2pm1pY1M2bxrmsykb6+eZnl+PgXu2WIYsdkS3oh+0uzncg8CdFvEKUnhO9xRA\nC1Zg7ixG7echmzjEvO0vp+1FNqRclW2Rw8zMRsBzCR/SPx0EbQDcvTGz5wF/DYyW7OYbCAtWO/AD\ng
6Cd9vEhM/vnwC8QKtPPBX402+TZwErcxwsHQTvt421xmPk/3durFBEREdmewnamyK77y1ot4l09\nn7Nxfp0zp0/HyvbZELbPT2kpYiguaJs2Ds9uYVDZpihp3bqLFYSg3VYhaJfGpDQKg8aNuoWmhbZ1\nmjSfOgvc7nE+dRm6h5eAjZyydUZtS9PUNPWcpg5LgdXzmvk8/OxtzXw6ZWP9HNONdebTDebzaVwD\nPL4fm4J2vB2bs/XeJ7NF1TrKW6GZFaFu7aEJmocXEoeODxufOW6xWh+vF33L4zh/ETmMvhB4MOED\n/eqtNnL3O83szcCTlzz8pLQZi6Hky7wOeDlwdXxOHrbTPj7u7m/aZh+vQWFbRERE9pnCdqZX2R5U\na1tvaRpnOpuyfi6E7VOnTnP27DnOn58ym9W4FWAlTqxsNzVNrCrThur2qAxDwts2rO4VGrGlgFqG\nIeQxcBuON6Gu3bahtVjbOk0scVsRzpEuBIdGam6GeagQF61jdQjuRQHg0La0TQjKtA1tPWM+3aCe\nzboh5G3TdmPp8+Hjae56F7SLIm4W36tsWa60RFeK0V2WzuZm9zuLp8nbvqhkeziP1GytW6PbQxAX\nkUPpc7Lbd+yw7R+zPGynudwfcPe7t3qyu8/N7F3A32bz/O/PJvwF8yc7nMOfAVPCUHIRERGRfaGw\nnenWiM46bBdGqCQ3LbW3nF9f5+zZM5w6dYrTp0+zfv48s9mcto2FVguDodumDRXk6Yz5xpS2DoG7\nNMMKoy1DVTuFz7TMWGmOeUvbON62zOYNG/PQdK11x1to3SkKY1RVWFUyKlKH7kVjsbaNXxC0LW1T\n09Zz2nqGNzXmoeN5FTuGW2qIlpqi5e8JWffxbkg9XRW7F7TTA8Qp2CFphxhtZJXrGKrbrPlv2sgX\ngdviEmFOWg5sMczcPR+uLiKHzIOz23ftsO3HttmH7+L5AH+z5LgAD4rXJ7d7sru3ZnYv8NBdHGsH\nLfCQXWxXxouIyG6dOOgTELmkTpw4wYkTO/85n81ml+Fs9ofCdiYP2yFoh6WsUgfytpmzvr7OmbNn\nOXXqNKdPn2F9/TyzWd3NpyYOdW5S2J5NmU2nNPUcvKU0KIsiFG4LsLLtOp67W1yLuqVtG5q6YTat\n2diYsz6twzbx/1VVSWlOURpV4VAs1qhu3PFYiW+ahrau42Ue1vFu2/DPPAuLceEtNHUXthfDwNMQ\n8aJ7X/pLfmXz2vM3cMlS2P1gHB+MU7iXPWSxE3w+dHxR2Y7rcGvOtshhlX/kd/qg7vS92W4+6Pux\nj320bbYXERGRJW6//XZuu+22gz6NfaWwnenaeBVxGHlhsUocwvZ8Og2V7TNnF5Xt9Q1m85rWLc5x\nDoG3bRqaumY+C0O0m/k8hNzCqEoLFVsPQTXcjlXcNlXSa+r5nNl0zvnzM86dn/XmkdNWeGnYuKQq\nqjCk2kIQbR08NkGrmxpPQbueE0rwIeeWZiGjty3ehrBNDNtOnLtOVtHuVf4XIwD67143Vnzxc6pa\nxxS96FYeviQI3drTpmlIvXVzulPrNPN0HV+rK2yLHFL3ZLcfSugEvpWtysD3EP722E21OW1zz+D+\nVK3ettRs4RvFB223zW6ZGddcc82O25VlWNZRRGSvrrvuuoM+BZFL4tZbb+Xmm2/ecbubbrqJkyev\njC+2FbZ7rHczNPtqQ1O09XXWz50N3cfPhqZoaa5202Rdty1E1DQP29sYJuP85dIsrLVdhmeUluZh\nt/8/e28ba1uWneU9Y8651trnVjcd2SbqahoiggChCMWWsJDNR5SYH41QjAjEfCjIYENaYEVEgBRF\nOIE2EXFiJSAESA1SwETkR0CxYwwhgShxLEv5aBARYCchIRF0UwQTt7vqnrP3XmvOMfJjzPWxz723\nqrqruupWezyldffX2nOvvW7VqfOud4x3oKqoNbDm86kxSjKmIrQ
xbz3kSYRhyJyGxJRhSEbrfeVq\nsn0mpl2YQsoepJZScqGNMCRjfmViuZxYrk/Q1pjnmeTN3V76vX4trxvv/dpyM1t7P3W95Hstj7/R\n3bfCuB+WvzWlG+Esvfzcw9F6eJqJX1A4LLML/SAIXjKOyeFfD/zIm+z79S94/m8D3wD8bBH56hf1\nbfcRYV+H/3T4249e/jv4nO+vfYvj/YX43O13fAXvYx/7GJ/97Gff6TJBEARB8FOOV199lVdfffUt\n9xvHD07ESnrrXX7qsHcQy6EP2ViWmfPDA2+8/gXeeP11nj59ysP9uYvtxUdkbS6vHNxbvCxbFbAu\nnqFkGIswDsI0wJCNQZRMJVNJNBKNLMqQhdOQeGUqPJkKr2ybi+2xuGjOKGINa76xim2sl8UnTzwf\nM6dp4MndwIeeTHz4yYkPf+gJP+3DH+LJ3YlxHDax7aeil6cLXWTLTRr59lwX4Wm9XZ9bTfEu1m84\nOPXpuFZa1+gXB9YkdPx+upn3HYI7CF5C/jruKgP85hftJCI/gxfM2Qb+2robPtLrRfyrwEcevWfl\nv+23XyMiv/JN1vjWN3ktCIIgCILgSyLE9gvYBCJGXWbO54c+6ut1nr5x353tC8tSac3nUd8IbtZU\nbe+fFrM+axtKFoYMU/FtzEZJSrYuslESSkYZE5yGtAvtU+GV08CTsXA3JqYCJZkHq6m62O7zvb2u\n3T835T7HeyhMU+HJaeRDT0Y+9IqL7Q9/6BWePDkxDi62DwO2/Oukfk5S8n7254rjPSH9RkDL4QLE\nwQjfQ9ZWx34X16vQ3sR9So/23bcgCF4uzGzGx3UJ8LUi8nsf7yMiGfhTPH/GNsD3Af+wr/H7RORx\n0jgi8jOB7+kPH3h2RNj34injAH9ERL76OWt8A/A7iRCIIAiCIAjeZaKM/E0QvPd5dbafvvE690+f\ncn544HK5ssy1+9VpKx/3ULG0qfU1PkzkOFKs74ph4sJarPXNy8cRkJy8BLy74cdxZCUnShZKEnLy\nWdyAi3v1tWEPMPNUcSNnYSiZofiosNM0cHcaWZ6cOE0T0zhSSqGU3MeJHWdrp+5Y7yL6sbPsqej7\ncz60awsYZ6sEX594BrsZn32c0U3iOeXpQRC8pHwX8C3Ax4H/UES+Dp9n/Y+Bnwf8Hnwe92d4Tim5\nmVUR+deBv4g71z8iIt+Du9UN+CXAv4X3Yxvwe8zsJx6t8ZqIfAr4Q8DPBf66iHx3/8wJ+ATwu4HP\nAR8Cvob46RIEQRAEwbtEiO0Da+o2Ij1oTGnNQ84u57ML7fOZ63Wmro72NouamwZnWR3anDwIJx1n\nVXvwl4eo+abNA83A3eEiBcuQ1WjmU7LWsdKCC/CchLyVWu9p4dYDyWQT3InUg8XWSwIJc/GdXHyP\nY2GaRqZp5DSNLEs92vvbBYTbJPLnnMPtXK4xaGkbS+YuPz2xnRuxbYfU8Wd+1ZVVdEtX63vyehAE\nLydm9rqIfAL4q3jf9G/s27YL8GeAH+YFfdtm9pdF5LcAn8bF8Hf17bhGBb7TzP7kC9b4bhH5WcAn\ngZ8J/IlHu/xjvBT9+/rjy9v7hkEQBEEQBG9OiO0DNy5tL/9W9dCwy+XMw9N7zg8utpdaaard5V3f\n6wFpIrel0Jsg7oKbnhbeemJ5a9VHdKm6A55cpIsfhs/X7hv91kvSDyPK1kC3m3BwH53lcW2yueqH\nCDdKhqEkxmFgGgemcWSaJual0XXxXkrepfSWHP7cCm7ZAsjp/eJ+V3qCuD9PTxr3U30ruh+vt330\n6miHsx0EHwjM7EdF5J/DHehfA/ws4A08QO1Pmtl/LiLfyjEy49k1/lMR+SHg38T7u38W/mPsH+Iu\n9x8zs7/zFsfxO0XkLwPfAfwi4AnwWeAvAd/THfCf1nf/wjv5zkEQBEEQBCshtg9IWudJ+4gps9ad\n7evubD88cL1eWZZGa0rK7hZ
vZePb7S60c17d573sWtV8NFftQrtvOWeSJErOPahsHXHl+6s2VF3E\nrs629zSzd+DflGhLD0lbC96NJD5OKz1ytsdx6M72xDw3tH+2PiOMj5HgPPMr8rEfe3uml81v5vXh\nAoKIeMic2bO/bct+Z1+zzwZ/y9G6QRC835jZTwL/dt+e9/r34r3Vb7bG38fLvd/Jcfwg8IPPe60H\ntX0E/wn1d9/J5wRBEARBEKyE2D6Q0zrz1GdNt6osy8I8L1znmct13gLR1lnQz91WwX1wtNMWFOaf\nYGY0NVQBEikV8tBd8JJJ2Ud9+SgwO4jcPSl9K1EvmdGgIjSAqlg1ai9Nz0nJyRiSULbyc7ysPPln\n5Zy5uzvxkY98mKbGKx960h11d9ZrbbTWqNUdeB9Tpj5mbBPO4BcH2MX+Vh7uI8G6ZO/nWfp1Adsq\nA5KkR63ccnt/ezHxjMoPgiD40vhNh/v/4/t2FEEQBEEQfEURYvtAzi62VX0+dmuNulTmZWG+zlyu\nV+Zloba2h3w9b+N2pFVex1elXTqa+XQutd7XnBM5Qc7Jy8N7Gbn0Zm3TW2G5rZ0zpRQGYBJQfM3a\nFFMvUx+KJ4UPSRiyC+6UvO05pbxtp9OJj3zEKEPpI80MNd/m68J1Xphnv+BQW+tl8M33URfeuopv\n9Z53NiGum9C2rS59F91+jcIFtDwS22vlOce3RSl5EARvAxF5Avw0M/tHL3j964Dv7A8/Y2Y/9p4d\nXBAEQRAEX9GE2D6wim1MqeZie1kWlnnmOs9cr1fmzdnmRlw/drWPpeQp5X0EFmuJuov6pmy93Hkb\nq7WOtFJkE6nHyvBdyJeSGUqhCah4a3RTY54N6z3hlhKZxJASw8HZNmN33nPh7u5EGQqvvPLES9u7\n0FY1zucL5/OFh/PVe9bnxc/N4r3r2tSd+qbbY1Fxwa2Kba70HhB3uPSwnRtB9nL4LsptLT+/yU8T\nHqnyIAiC5/HTgR8Tke8H/grwv+PjwD4G/Erg24A7/FrlOypVD4IgCIIgOBJi+0DuPdsNF9rz9crl\ncuFyufp2nVmWhabt4M4+drV31odrC3VrRm0NscTSlKUZVSGLUHJGctmWMVnLrqWP617Vph3WP4hu\ng5b9t8XVvc5iZIwiPtt7LImSfda3G8Me6JaSl5HnnDmdpk0De6m4u9bnhxMP5wtPzheu15m5i+15\nqZ6o3pTalFabu9697LytJeereNfugh8db/VS8/Wfja6xxeTgZN+W1AdBELwNTsCvB37Dc14zXHz/\nNjP7kff0qIIgCIIg+IomxPaBtPZTq1L7uK+H+/s+V/vSg9G6sw1s/dM3PdscQsDYHOzalLlWzteK\n5MRlaVyrUpsxJiGRKCn1tuReaq09SK0L1F2HWv80O4zachFdzMX2mIUpCxSYSmIqmXHI5B5UZija\n3XlJe++3O+XZk81Nt/7su2nkyZPT7mrXSq1eZl9rY6nez11rrwZYmie2r+Xm27Y739tzun9Os9ZT\n13uA+eF87vXkz44OC4IgeAGfw+d9fwIfMfbTga8CHoD/Bx9N9sfM7B+8XwcYBEEQBMFXJiG2D6S1\nzLtpn639wMPTp5zPD1y6o7uKbYwtefwZod1ndq0OrPUe6uvSeJgXUspcZxfbqkYqUESQlFmFsD3q\nfX7cs33Ek8UhC1j2vuwhw5iBLExFmIbMNBTAUG00pa/Ze79L8dFfp4HTNJJzwrYQNGVZ6ratYWmt\nKrW1TVgvS2OePVBu7sFydRXl621rh8fret4DXltDmnTnm718fK8256i8n5NdHgRBcIOZVeAv9C0I\ngiAIguA9I8T2gU3TaWOZZ84PZ+7v73l4OHdnexXbhlkPQjuEoW2szvTqbJuXj89L43xdSFmZl8Zc\nXVQWBUU8tczWkDEPULMe1mZq2wQteTRGiz5HO/dDGBKMWRiLIOrl42NJjENGVVlMt+MCurMN4
zjw\n5O6OV56cGIaC6Zo87ret+eYjyPxCQWvK3EX4vFSu88L1Om/bUusjoV635+br3EPXFpa6kJbKIkJr\nbSudN7NHxeKrxA6hHQRBEARBEATBy0uI7QOqDYBaF+brhfPDPQ/391zOZ+Z57o7uY5f50Wgq2V1u\nNe29zM1d7evC0/NCLl5WXtVF8tJgqcbcDDv0PrfasNawptDaJrSTQFHBUoPcIGj0WAAAACAASURB\nVClNoZlvS1MAhpJJwDAUck6A+MixNfhtqVshelnHiPWS8pQSiiF9LJdID21LHvyWLW1l8sMw9HRy\n3UT3ur6XmB9vK3Vpm/O99n/P88z14Iav5eY+g1xp2jaRbz24zaKMPAiCIAiCIAiCl5QQ2wfUh15T\nl4Xr9cr54YHz/T3Xy2UX2+q9zntA2uF27TE+zNKuqiy1cV28X/v+spDLOqLLSCKUBqUZpWoXrI15\n0S621QW3Kgkjd7E9qCC5kXJDUqMZVIW6BpCJUEqm5MQw5J54fnTZF+rStrT0lPMmuNdxZZiB+Fou\ntL2nPBkHN182l9xD4Bq1ukheg9LWEvFa9xLyZak3Dvj1OnOZ/daPrT5yxT2QrbbWx4ypn8QgCIIg\nCIIgCIKXkBDbB1rbne3r9cL54aE72xeWeaG1ijawdcbWUVnT78samOYCtKmytMZlqTxcF8p5oYzr\naDAhZ1ia9U1ZqnKdlevsoWPWmo/Oaurp4n1rBql0sZ3bJrSrD+52QV6yi/ks7mzL6ra3TcQOZSCl\nXWjnnMjZx4F5UnhC0J5aroitCejrfO70jPD2YDXro810c6lrrZv4npeF66WL7Ou8Jb6fL9c+Zs0d\n7+vcX7v2OHKB2nz2drNQ20EQBEEQBEEQvJyE2D7Q2gLgI6362K9zLyFflrr1Ulu3sLeAtHWu9s1q\nXuash1Jo7YFjal4+vpacKz4be6nGUpW5qSeVVwPdNxXzfdfRYNdGozJroqrR1KjNyDkxlUQaMqlH\nrKsq1djc5VbdIRaBnBPDUBhKpvRS8pQE04QlxSxtCelm60WC3EV57vPEvbwcVtffbpLUm9qeRt68\npHyelq1v+3r10Wqb0z3PXK9eZn6++Izv83ThOi+9VH2h1vae/bsRBEEQBEEQBEHwxRBi+8CyuNie\n59l7iC9X5kMCua5Cu8+nFkm34Wiy+9xmHm62jrQCIyV6UriXZIskUk4IsvVaL2vPtvomeKm5pMSa\nwK3A0qDNjUudkUsPK+vbOGQ+dBrc1U6ZZi7yMfU54X3mNeAzukthHIbe2+1udZKEJQNy/z6KmXSB\nLo82//K3GXH+2EeXJxBFJHdX3LecEiVnxnFgmgbuuoheDmnm13nhfL7ugvty5XKZuVyvzPPyZf43\nIgiCIAiCIAiC4EsjxPaBWo/O9sz1eu0J5IsHox16tdcU8lVhrmXUO+voLk/vFjOy9LFcJblY7yXY\niM/TXqqXkS9Nqao0M3wY2D6WTHvKeVXzEDVT1BZUvWxd1XhyGsji477MPIHcWu0p617KbT2cLaXE\n0AXvMBRKSS7uuygGd6tVfe72+l1TSofveyu0fR+21/06QULELxyoucguRRmHwV33Vrc+73ro1Z6X\nyvl84eF85Xy+cP9w4eHhzP3Dmcvl+i7+7QdBEARBEARBELx7hNg+sDrby9Kd7XV81dJTyG11tffb\nTXRzlNpdaJvuzrZ5r3VJ3dneAsdctKqBriJb97Rtkz2MbBX7BixqXJfGZTHmpe/rH40qnIZMbYP3\njbdGqxWty+bSm9kWhFZKYRwHSjk42wehvX6+ah93djim3dleOT42P0fmlyfMDHrumpkxwJYovpXY\nqyePr+Xuy1I5X66cz1cezlfu7s+8MY2UUhiG4cvzL0IQBEEQBEEQBME7JMT2gWt3Sq+Xaw/o6snY\n1fu1nd3JlmPdOF7kvcnutV+7VtqyYFoRU4oYZdXYyUW3m
QdrW3+qJJDsGWxFIGMU8dLxKtDUQ9Uu\nc+Ph2jjP6iO7UqLkQzhba7S60OpCXRbqMvvcaoGhlE1ku6PdhXZXyutULRf5h/Ff6fD9H5fRP8K7\n2A0TQcwAweRwjtaJ2f2YhO7gS4LMzYWMlBJDKd5bPhRO08g5nO0gCIIgCIIgCF5SQmwfuF672L7O\nzNferz17EJe708dRX3sJudPFZP/TTNHWaK1S64K1hpgnig+uKjfB3RQwTxgXgZK9U1qSkMXvF9zW\ndtcbajMui3J/qdxfGtOYOQ2FnAsGvXy90qptQnuZZ++XzoVSMuMwbL3apXjgmfQRYasQBjZX3YXv\nrdt9vPjwfMTPh/itW/WryL49hevFi5TWnnghZb+IMJRCnSamaeQ0jTx5cmK+zl/i33QQBEEQBEEQ\nBMGXlxDbB+YutudrHz81+/gpF9u9TFtue7VhldnrPQ9Rs4OzXeuCbs42m7MtWbDU3V/Fx3mJ+Wsi\nJPO/oIyRcaGdxPu2l+Zl5A/XyhvnBbWRnDJjryX3EvZKrUqtM8viWymFkgtDKUzTeONse6L48Yzs\nD/z5BOghEO3ZMvLdr/YHYodnZX316Go/+rQEYgkBcva9rJQtDX25m3hyPXHtCfFBEARBEARBEAQv\nIyG2D6zibZ0H3Zo72maKrRLSdg/bVs1oiupeWi7iJeXbiKvrwvVy5fxw5v7+HgPyOFFGSIVNnSbZ\nPHPSWkKtLq7NfDTYdTEus3FZGteqLM3QPm46JaGURM4Cpu5qC9TFw8dM1dfOiWEYXGhv5eO38W7b\nl+3sgvpYRv5sCbnxiC2ofS9HX8vGn0WOVy72t2c5lLXTR48l6hijv4IgCIIgCIIgeDkJsX2gNVet\nNwFlB5G9s4aRGa6GEyTFbdneqWzQVKlLZb5eub9/4Auvv8HpJ+64Lo3Tk1c4PYHxBObS2lPAD/9g\nXmKuDbQZD7Nyf2k8vSoPl8ZcfRxZyolSEmPJnIbMmL1ku9XG3MvZTRUQUsq9V9v7tXPxWdybJ7+N\nN7NtXvb2VVdeWDL+5hz7v+0ZVf7MSd6esn6FY/27yL3Wfu0fD4IgCIIgCIIgeNkIsX2gNXdKtWl3\ntJ8VmnbzxypRFVHpZdjW0898nWVZSClxf//A66+/QRlHlmZ8SPFZ2zkjKXtvsuTjMC3UvLR8qcay\nGPfXLrYvlYdZWaoL4m18V0mcxsyQIZkHo8217kZx74EuJW/OdilrKFoXtGtvdf9mz4piv5jAo1L6\nt8tRcL+I9bzTE9bNdrEtmJ/nlMHSF/35QRAEQRAEQRAE7wUhtg/szrZ6j3Z3t3dPFTaRfXzeegCY\n7OLPzGja48Nl4eHhwvDGPZIHd7JzoUyTb2UNQzu66F4eXitcq5eOP1wb99fK/blyqT6KzJCeQp4Y\nSmIqmSE3pHlAWrVKTqmP80rknCmll5APAzklJMmeVmb2jL9sB8X9WCMfM9ofVYA/W2Le11kF95tz\nFNrHSgL6RY1wtYMgCIIgCIIgeHkJsX1AdXW2myeJV6XVhjbDdHd598yvgwgF7+22xD5n20PStDWW\nZeFyuVIezgzTienJzOm6MM7Vs8PwoG4f2aVoU+alcr4sPFwWzpeZp5fK/aVyvi7URhfQmZyFkqCI\nkWlkPEgtpUQi+5ivoTCUgbu7O6Y+pzrntI8w41YMHx8fvvUthzjxx68+Twy/lcg29tJ9v5ZxLCmw\nm57v/awHQRC8PLz22mt8/OMff78PIwjedT760Y/ymc985v0+jCAIgg8UIbYP2FpGrm0X3Dcl5YcC\n68dCG/Oy5u7EYi7QFaU1ZVkql8tMyhfG04W7y5W7eeG0tG1+dcnJ962NpTYuc+X+MvP0YeH+vHC+\nVh6ulcu1YuDOdIahi+0sRkJJGClBskQWGIaRafLtdDoxTWN/b96P/1Daffxux77t25O1O+E3vEBk\nr+ve9m0f6wX2UvG9b
JxNZO8fu/d8i7xYuAdBELwfqCqf+9zn3u/DCIIgCILgJSDE9oGjs91ac1e7\nKtoTv20LD2M3VW1PJV/d7K383BQUmrhLLZcrJpnh7glPLjPX68K8VFKCkgXTRGuNuTauc+XhUnl6\nXnjjvPDGw8J1rn1rXnaeMmnYxfbmbIsP6RIRcioM48B0mnjSXe1xnPqor3RTov3mzvazvFXv9ePX\nXiS0fbFdcN/cP3zWfr+/IZztIAi+RETknwH+7/7wt5jZn333Vv8Z795SQfC+8xqg7/dBBEEQfCAJ\nsX3DMRjME8lVdU8mXwPTgOPsaLnRfes+24ww31PV08n73O26VGpt1NpoJWGqWO8Vd3dbmZuP9lqa\nURVa39SMbOKudRKmkpiyMGZh6FuSRBaj5OSO9jQxTdOWQC5JDgdtN9L1sZP8Io6i+a16qJ8ntN9O\n7/bNee/VBbKd8HC2gyB4x7zLP0gS8Nl3d8kgeF/5OBDVGkEQBF8KIbaPeLv1Vsq8Cu21jPymVXgr\no+6P+4xo6w73qgd9FnV/n7mgtqa9VL32LaGa+2caTXdxrSaYJJAMooj4mLAsQknCWIRTEaYhMZa+\nDYncy9JLTpymiXGaGKeRUjI5ZWAVuo8vIqzHevv75/NE8WNn+x2FlgmICSYHd/sZoU0/3qPgDoIg\nCIIgCIIgePkIsX1E9m0X3KvDfewdlqOx/Qy2mtprxfl6x6w72Mee8OZivmkPYfOS9daM2nz0lyFI\nSi60xZO4kwglC1MWTiW5u90F9zQUSk7k4uFo02nq5eMjKaV+XM+Zdf0CQf28+296Gp9TPr7ePu4N\n355D+mivg+B+9LnPOttBEARfMvFDJAiCIAiCLysxqPiAYr7ZcdtFsB2c61vhuSZ6yzMFibvwXNdw\nV9t6SrmLbN3WVO3uthm69ohL6rO4s4/uyonSXeypJE5D4rS62iUzlMw4DpymkdNpYhw8fTzlDOLz\nu2tTlqbUprT1+9FN5ee4yo/vH9PW364Ifys2UX74HfjZY3j2OIIg+OAjIt8oIn9KRP43EfmCiLwh\nIj8mIt8nIr9ZRD78aP+PisjvEJE/LyL/h4g8FZGLiHxWRL5fRL5FXlBuIyIK/L31IfBnREQfbf/u\nl/krB0EQBEHwFU442wdqrQDuNtvzRORtb/NNSJfIvh0s8pvhWVtwmm+CJ2rLLnPZRfnqjicf7VUS\nAmSBMcNUhFdOA3enzN2YmMbEUDK5ZEoplGFgGArDMGCSmKux6IJanyPeZ4oPRRi6UF+/13594Ha+\ntd08t38v39+Q1K/eiGyV9btw3idx22Fu+fH2ccn484S235cehB7GVBB80BGRE/CfAL+hP3X8sfnz\n+vargT8AfFd/T8KbSJ8X3vAq8M19+3YR+TVm9vBon2NT0OPPDIIgCIIgeFcIsX1g6WK7tuaCtMvC\n7R+zzbwWA1uFsqRNaK9l3nKQmauQvHVjDREj4Wu44NZd2Bpg0hPFoRTv0bYM1rxP+8mp8GQq3E2J\naciMQ6bkQi6FYSiM40gZCkuji+3qTnZVamsIcDcVnkyZLEJK62+ea7z6ennhMGO8W9+r+w0+zxuU\nRMLSKrQPZ2ArG3/+eX9G0N/cu70IsT63tswHQfDBpTvPPwD8Cvw/9L8L/AngM8ADLpy/EfiWx2/F\n45H/O+CvAH8L+HHgw8A/C/x24Bv6un8c+K2P3v8LgY8B/03/3O8E/stH+/zjd/r9giAIgiD4qU2I\n7QOrs12191HbOuri4DyvolNufetNXh+E96EBnLUC2kw3ASuY73ojtB852ymRSAxJEBOSCWKJuwKv\n3GWeTJnTmJmGfONsD8PAMA6UUpi1MbfG/aWyLG2b450E1JScRsYh+bcQIT1KZX8streLBnro45bU\n3fqj5b39sT1hj+T0MZztRnTb+vnPKxk/7hMEwQeYf4NdaP8XwG8ys+Xw+t8E/ivg3xG
RV9cnzayJ\nyM83s7/Hs/ww8L0i8vuB3w/8ZhH598zs/zq8/0dF5P7wns+Z2Y++e18rCIIgCIIgxPYNu5Y79mgf\ny6Zv+7S3KnKvB38mbscd8P7+vp9I8rFcScjJb1Mvv5ajCPdubbIAyZO6sySKZIoIp4KXjw+JIQs5\nCym5WDY8WK0ZaIPz3Hh6Xnj9YWGeK0sfOZYT5ARjydxN/q9CEv9DDj3mR/HPoa987dfOLZNyIueE\nmSed+7Gkg0jfqz23SvrtdB9t8xcHsUV/dhB85dBd7d+L/yj4LPCtj4T2DWb22qPHzxPaR/4g8B3A\nV+Ml5X/4HR1wEARBEATBF0mI7QNrWpwcNN3ep2y7uN5f2UrKn4+/T7pIFHERmlMiJRenOSdyHw8m\nfd9VbKfDPGkRY8zCmDNjTpwKnAZhKEJJLsoPoefUZlANE3e0X3+Y+ck3Zq5LpdZKq42ShXFI3E2F\n2gYEV/2rIb2J6oPgpvd8t6Y9Sd3IuVF6eJuZl5Wvott7q/fjOhSF7441HG5fcCYfhaHJM655EAQf\nML4WH+BrwJ96Tl/126YL94/iZeTD+jQu4r8a+Off2aEGQRAEQRB88YTYPnBT8GyPHNj1Dzvs+WKV\n3cW5bH3Pbn4LSdLmAueUKEnc3V7F9ia01d1uWXu78ZnaQ+JuEKYCp2yM2d3p1R0HPMVcDa1KA+6v\njdcfFj7/9Mq8LLTqI8fG3vd9vRupbRX33aTvolpVaYe0dAxaU2p1h7w1peRMy0opigiUkvtYr+eI\n535St7J64LaK4Jj6vr9l79tmC0eLnu0g+EDzdYf7P/ylLCAi/xrwbcAvBu5esJsBX/OlrB8EQRAE\nQfBOCLF9ZG0/vnniRm0f/WywvW/7RjBa1+GyB6sZLrYl+YzsnDxl3F1uIQskccGbxUjiItrw+4Yx\nZGEqwjS4sz0ko4h2oe77qJqPE0NBG4sZ95fK00vl6Xk5iO3KNCQu18p1adS2rpOw5N9Q1WjNXGz3\neeOmPi5sWSrLUqm1Ufo876ZKSolh0GdE9o2LzVFY2zMi+7l/NTevhdAOgq8AjgL4tRfu9RxEZAK+\nD/gEt2mKL+JFQvzLgAL/9NvYL/ctCF52vqj/PIMgCL5kXnvtNV577a1/5szz/B4czbtDiO0bDsL5\nJoTrOb/DdUW9pXI/ShvfRfchZOxQRC3sgjoL5ORp40MWhgJT849RpM/bNooYCSWty6hhyba52c2U\npBWr0MTQZCwqPFwr86I0M5qy3zZjacbSlLlqT1I3kgpJfF01o6mhraHNXe5lqVznynVeqEtjHAfq\nYIxqlC66jyXfx+9/FN3rLPNnz9vuft8K7J6VLt4P/oIRukEQfPD4YgMZvpNdaP/3eIL53wD+kZmd\n151E5IeAX8Z73nPy4+/txwVBEATBVwCf/vSn+dSnPvV+H8a7SojtAzfC2o4isSM3L68WrZdMc0zN\nVtRALIGpS+tNOdoWVp6gC24X2iUnBjWfe61gYqgKzVz4ZlEyghiIAskFqaqniqMNmtEwKkpFuXax\nfa1K1R6cptDUqOq93bUac9Ue2gZq66iuPQzNR4a5kz3Plct14XxdWJZGbcakfizj4KXlx3OyC+dn\nL0aY7WXv276698ivYlsO6e5r+FqI7SD4QPNPDvc/ho/9ert8O/4T+IfN7JveZL+v4osX8u8IEeFr\nvuatq9Zzz7kIgg8KH/3oR9/vQwiC4CucT37yk3zzN3/zW+73iU98gh//8Q/Ghe0Q2zc8drYPv6Md\nArnWUnLrDvVaFg23zqzPxko3pdKrSu+h372k3NPES3JXe1SoJfWycHopt1EEEkYy7wc37Xo/4b3V\ngGJUNWZTZlWuTTivzrZy2NzdrqosTVmqUrJQsqBdKB9Tx1tTHxu2LFyvC+fLwsPF082brhckhGla\nxfbtWT2K6+c52fs4sS64/S+in9NVVCefBS4eMBd
iOwg+0PyNw/1fDvzQ23mTiHwVHoZmwJ9/k/1e\nAX7+myz1ZRHhH/vYx/jsZz/75Vg6CIIgCL6iefXVV3n11Vffcr9xHN+Do3l3CLF9ZC1T7iO0ZE0J\nX1O+t17tfutP7ilqsgtuLx+X2/LyHjhmus7U7g537+EuJdMwSlGGngJezdzhNqNVY67uXCexLVgt\ndTdaTVCEaonFMrMlFk3M1VjqYVY2LvfVhKowN+O6qDvr2QV+SlvHOnRnXq2XkbfGXBvXpXKZG5KV\nlJVUjNpcxOtz3Wzdysb1ELr2OIn89jdgL21f+7Q90T35/PEQ20HwQeZ/Bf4B8DOB3yYi/9HbTCQ/\n/n/rlTfZ77f3fV8kqi+H+9Pb+NwgCIIgCIIvihDbR3IXb2kPMlvDx4AtWfzN2v88LXsX3Kwl0l1o\nq7YuNF1w773biZwzBWNQoZZGU8GSh5RhSq2NpVWsNujvS/2igNIFN9AssZCpJJplGolmCevTvBHB\nxHPPm8JSXWwPRRlLckH86Dt6ubp5EnnTLrYbl6WRSiMVo1Rj6a75WpK/Cm419fC2FwjuzcW+qSaQ\n/SgEpPdtr0I7nO0g+OBiZiYi3wP8UXwE2J8Vkd/4vFnb62ivPmv7x4GfBD4C/EYR+cOP3yMiXw98\nF2/uXv9/wIyPCvs578Z3CoIgCIIgOBJi+0jy4Vmbs53Y3G04Cu0XhN+uCeQmh9lhsoWkreXY2tO9\n175mSULO7mwrxlCgqtGyoc1LxzGlLQvzdWGZZ7TpwX1fxbbn4DYSDRfZKgXJA5IHyLI51ZB8X4Wl\nO9vT4CXouyu97rs623bjbF+WxmVu5KKUQRkazzrbPMfZVr0JRnvOmdyOEti+I6uzfRDcQRB8oPnj\nwL8M/ArgXwH+loj8CeAzwANeLv4NwG8A/hzwXV2k/zngO/D52T8iIv8x3vP9EeBXAb8DeAP4HC8o\nJTezJiL/C/BLgG8Tkb8J/E1gFe4/YWaff/e/chAEQRAEP1UIsX1A8lpGnpDs5crpWEfOrbO9jtta\ng8BEZC8nPwjutRfZev+zNqW25rOql0qrg/c5K/u+qyhtjbpUlrlyucxcL1cu5yu1qju/sk7m3kux\nVRIqGZUEqZDHiTxAJm0J42sp+VFsz1VpzcvIbRPm/k2tp6I39VLxpXlJ+6UqpSpl8W2unm7u6/TN\nuqt/ENqqytH5Pia6czzDPqC8f9fdyd8D04Ig+KDShfOvBr4X+HXAzwX+yPN2ffT49wHfCHwt8IuA\n/+zR6/8E+LXAH+TN+7b/feAHgK9+zhp/AHfHgyAIgiAIviRCbB+Qngwr2QO4tm11t+329z3Depu2\n9bHbvbe4v4rJts8qorWZj866zDw8XHh6f3ZNmQRJmdq0O8aVy7XycJ55OC88nGcul32rrfX1wRc4\njNpKCVLGUkLyQFYjq5BNQFIXwJAQmkHtgrs22xLLfdxYwrqm3Xq8Tajqvd7LKtSrkRZFrpUnV+/j\nvi7uftsmtHW/2NBF9+p6byJ7S4CnC+yjnBY/s7IdUGjtIPgKwMwuwK8XkX8B+K3AL8Ud7Yo70z+K\nB6H9xcN7XheRXwL8buBbcJFe8R7wHwT+qJn9w95q8ijt8uaz/7KIfBPwu4CvB346XlYeBEEQBEHw\njgmxfWAX25mUszvc6VFv8Na37X7vjdd9zE/zHfrzu6tt6r3X12sX208feuhXJqVCbcp1aVznyvm6\n8PBw5f5+5v7hyvk6c7ksXK4zteqe7o33ia8mvDvzGVIilYGsUCyRSUgq3rPdj95ndBtzW/ut2YLW\ndiEvGImG0EyohgvuBkuD66JIUpDG+Vq5zJV5qSy19dFnB6d+7d3eZnEfxbYdLlh0US1pP/E3FQZB\nEHwlYWY/xNtMJO/7X4A/1LcX7fMvvtufGwRBEARB8HYJsX1Asgu7lJO723kfMbXN0d72tm1+ttzM\nuTqIwp5avot
NF9x1aVy62J5OZyRlUi6kXGhqzF2sXi4LD/cX7u8vvPH0yvm6cOnbUtvmUJsZKXm+\nm5vaiZQzqSRyGSgmKIkshVSsf17GWOduG0vV7mxbd7a7P+/qHRNBLbnYVnFXW2FRI1UD8eneD9fG\ntTvbS23bBQmRQzl5F9s8EtvW768XA5C0S/5HQjtM7SAIgiAIgiAIXmZCbB9Ij5zt3G9T9lFT3pO9\nJ3UfBd9tCTk3r6x299qbXFvler3y9OGBVBJNvYd7Xnxmda0+9/p6XXj6MPP0/srThytLf14Bk4RJ\n73820OYjwQQjZaMMRiG7YG0VqQssV9QaIhlJGWuNgpKtIlo5FeNugPOYXeRuieLGZfHy9oercp6V\ny6zMS2NeFKjb6LHLvHBZFubFLxgk8dR0SRwS2X3rEee349F6Sf5awq8pezK8umD3hPLbPvogCIIg\nCIIgCIKXjRDbByStznYX2SWTS+5znddU8u74rs3Mj0dk9T8ea8E1ksxMabVyuVzJ5R4DlrqWjc+Y\nCa2ZO9yLcv8wc997trfSbklIXodSG8ahD9qUbOa91kkQbbRWkTpDzog2V76SaTWTtEIrWK1MBe6G\nxN1UDsfvn3OelYer8jA3ztfW+7JdcFsX2s3gPC9c5sp1qcy1krvjnk22/u3W2nas6H7cezq59KqC\njCTrFzoMET2UlYfgDoIgCIIgCILg5SXE9oHHPdvrJjmRJPXk755FdkjDti3Iay9wPuZ8cRDaZkqt\n7mwjeI/2vHC+ztyfr16ybS7ml2o8XCrnS+XhspBKIedMLgmR1Gd4Kyou0LUZrTWKdnc4Q86CtEqr\ni9eYS3VXnOxitlWsFnRZOA3Cw1i4Ow37yLO+nZfGeXZn++GqNyFoal6OnptxuS5cZ9/mZaBkgbyW\n4e+jz1Qb9B72dVsvGCBC0kzK5tUGqv59JeHDzUJoB0EQBEEQBEHwchNi+4AkF9sp9X7tkslr3/Y6\ne9vM+5hXhb21Z9ttkBrr810XHkqymzaWZdlGb9Vm1OYusUgGXFjWBufr7iRnhCKJ3EdxVRKLQMVo\nCKq46EaRJqSWoCl5XkgKuakLbUsofgGhDoWlFOahMGaYhsRp6hcdkiDJhe3DtXF/bTxcFg9BWxpz\n7/MWMUS9fF0xWp/H3VrzxPMeeGZdZG9l5KpY0728vDvf7mx7L3zSfr5JgPT7spXyB0EQBEEQBEEQ\nvIyE2D6Q1jLy3i+c0+OQtHUE2GplH1Kzn2F9zrYE8+OYK+1haa3P206L91enpCAZkUxTXLT2ILTW\nQ8lohkpPBTehkgF1AWqCKaCCVahiSK3ItYFcoY/wMgSRxFgyQy4MJZNpDFkYh0xV666+p5q/cV64\nP8/cn2fO15l5btTmiegpJYahcJoGpnHwtdIeKqdqNNzZtuaCeysh7yUArzkOtAAAIABJREFUxhog\n50FpzbqRLT0Ers/59lA6tjT1IAiCIAiCIAiCl5EQ2wd2sZ17ovc6a1u2Wdu34vpRv7bRy6/lpsrZ\n+otr4vbq5DZVpClSG5IqIKRk3qcs0ExoTd0pBmpPDm/NaMmo6q5xEyORSAhpFaUK2iBjmC7uILd1\nDJdscW5DzpSc/DYZ45AZx0IzSGVASkFy4el54el55v4yc74uLIuLbcVT3IehcDqNTJOL7ZTcznex\n3U9Od65NG+Yx6n1Oec9KWy9C0M9TM0z8YkNTtvR1PQjuIAiCIAiCIAiCl5EQ2wfWgDTp4VxrObnP\n206ehL31a3cRLc/3tTcO5eObq22GmCKt0dZeZFncJc5GSj7KS0m01gW2waz7thjb3GsFCo1iiWJC\nwdAGVSCp0ZZKm2d0nnuZ9l72XlLyLSdKFsZxYJxGVDIyNNIwIsVcbF+8r/x6Xdyt1h5mtjnb4yNn\n+xAK15PHj8Fo+/lhE9xegt4FtymtzwGv6qX2rb9VLZztIAiCIAiCIAheXkJsH
0hdNmdJlFIYx5Fx\nnBiGxceASULFtsTxFbN9lvSeqA1HJW7rtjrcalhytzYdnG5EQXqRtEAqiSELDD4Gq0pGU6HhZeTN\nxIWsrY66z6Y2A2ueX67Lgs4z7Xpxwd/7nkWEmgRLCdXEvKzhZpVhXkgmJDJimbk2lkWp1XuuDUF6\nuf1QMtOYuZsypyExZEgomDvYxu1MbfoFC6FXApjQBEy8X9voDra5o11Vqc3HnvkM8C62Q20HQRAE\nQRAEQfCSEmL7wDolO6VEyS62p2lkGAZKL402QNQ28Sy7jN5E9zq+qtvg2/pubt/2bUvqTrcatHUs\nl/aRV5BLYpRMSoXFEhcyRqapuNvb+5vpReRJEs0UVbqD3rClovOMzddtdBgikBKmXWyLuNju27BU\nEplEI1ljXhpLa9TmYlu6819KZhgy05C4GzOnQSgJEoZp69/Vx3pt4ea9HH/vg0+IGajPDjdWod0d\nbVWW5lszc6GtnkseBEEQBEEQBEHwMhJi+8Aqi1dnexgnxmliGC7ubCdBe7z4Lqxld6Ifie5tRVl7\njHfn++hwr8nciCDqQjt1cZpKIpWBUgaumkia0JaoQDWjolQUcalNdpmLmPla2rBlgcXFNrDOBAPN\nWIKmLnznZdnc7TIvZMskConGUt3V9tFdRk6QkpBLZhxe4Gz3Put1vrb3vicX2KmLbPEjR9NNeb5u\nJeXuatfWXGyr9bne66zzIAiCIAiCIAiCl48Q2wfWnu1UCmUYGKaRcZooY3FnOydSL28W6ZXRxyC0\nrb58LSvfVt6e3fawvaRc16jxVWz3MVjS15AkSMmk5kIVSx4whu1r4SHpltz1NRTxWLV+2/qtC2sk\nQTrMBced4mb0/mhDk5JaI4mL3XVGtuEXHFLuznbJjCUxFWHIkEXBBDe2dQs+83OMX3xgPyEm64UI\nPwbtFybUDuXkfdO+ma0XPIIgCIIgCIIgCF4+QmwfSMVPRx4KeRoo00g5jZRh8OdyQhWS6ZbvJcjm\nyB77s2+c7QNrMvmNC26KqkCDJgmRhqaMJE8rT7k3Kq+92WYkvFQ79zLtjJHEIPW+b2sYC6SKtEZW\nI5mL9ZSyzxRPGSNh0udu5wFJxZ+X5Eeots3Gti7uRcRd7ZQoOVOyMCRhSEYWP05VfNxXH33mX/5w\n0cFkE9jQeviZ9sRx25zr9TOPFwW2R2FsB0EQBEEQBEHwkhJi+0AaynabR3e2yzRSxoFcCjn77Gsx\nPE1cgb0TmZvZ2siN3j56sDdhaazNx54qLqJoUrQ1JKv3cquS3OL18nC64DYjYRSMJIqI94BjikkF\nKmILoo1kUEzIpG2OOJJRyZ56TiaXguSMSUa3A++l6MfgN+kl5Dkx9K1kKAmyeNK6trWXfJ9Jbj4N\nbCv/1nUUmvWk8bVMXA9ufT+v61Lrcs+/lBEEQRAEQRAEQfByEGL7gKxiexwoozvbQxfbZcjknMjN\naAlQkC0A7Si0cUEpfkdeKAl3oWl4fzVmiCRaau48NyU39URv3ceHrc62YWRTxJTUBbeHsilIc8Ft\nC6JKNqOQKGSyZHLKQKaRqRSEvDvb0p3ttWReG9pDzowexbaOCyvp1tnu/eKmHmbmfdmpX3sQjt3t\na3l5011o3zrb2+m8uX3LcWtBEARBEARBEATvMyG2j/RycMmZPI4MdxPjkzvGuxPjNDGOI2oLzSqp\n2aHE+biG/7GOtULWkVy38vBRTrkLbmQX4KqYNlQb0hrWKqKJbIlBE9o8pbuZ0lBSF9xiBrIem2Io\nKoZKQgXUkn94M0QUo6ef95Fhx/JsA5+NbWB9PreXkLvQHkpmGgrTUBiHzDgUhuLhaEJfT/YgNOjB\nZv3CQdtGnhm1J5231vr3WueL2zOOtx9ceNtBEARBEARBELy8hNg+YKmL7ZK9jPx0Ynpyx3Q3MZ4m\nhmnYwsOSaC+1lkeOq9uuchDZu9Dend2j0
F5Lzve08h6QZs1d5VaRWkimFE2MJliD1AV36m736h2b\nx4xtYtsAFUFTpvlAa3fIUZ9tnRJJ3C2X9djxfbT3VLtA9u+Qeq/2WEoX23kT3EMGSIipH08fMyY9\nkV17otvR0a6qLrJXwd3FtnaxXXtiu6215ax931+ufxOCIAiCIAiCIAjeGSG2Dzwjtu8mxsuJ8XRi\nOI2M00ityrIoSVoXtrBL56PQZttgdblf9MFsM7lXsW3WUM3I6mynipiQV7GtkNRI5iPAVmENvdxb\nDMVDzVSkb8mPp5elJ18EkV7CvnVIyzbezEy3iwDr91id7XHInA7O9jQUSnYX3CytX2xz+L0cXjeh\n7X3abRvtVdWd7aZK0zWFnD2BXB+f7yAIgiAIgiAIgpeTENs3dLGdEnkYGKaJ6e7EeJoYp5FhGihz\nJefaRfTek23P1dJduh6c7Vu6Wyy2WbXrKLBtRnYX2qaZZIkBA0kkEZYEC5DwMd3N3AEWlKS6lZNL\nE9CEWfI54QbWPFE9dzd+DU2TlCAlduHfXeVDGfm2+V4kgbwllAsm6fDd9u9tpn4RoE86a2q9UkBv\ntqZ6M/JrHW92PIe2BtAFQRAEQRAEQRC8hITYfg5JErlkhtHnbA/TxDCNDONIKQspp8MM7eMwKttG\nWtE7sB+z9i7vdKEtu7i1bRSYIqmRtIE2coJJhJJgMGFWuDYhqVEryGLU6mO/BCV3Zzu7gd3ncyut\nmfds96PMyZPIUylIyohkrM/C1l7ivTnbqfdfm2HNS9y1lT77GkQSKclWUr+O99pKyFWxnrzeg9b9\nQoHavjXbhLbPNN/LBATZytlDawdBEARBEARB8LISYvtIV29JhJwLMkI7nRh7Cfk4DeQhk5OL7cfd\n19siq839gsbiTag/Lh+nz7UW9XWlYdo3qySg5IRkaAhZBUmCNNy9NkNrQ5qXk29y1NYebEHVS9BN\n/RCzz/FysZ0LKZXubKdeym5oa5uzvKaLm3UhXpu/rn0GuPTRYtnd9zVxXM3HmAkJE93Et5r1FHJo\nzYV2bWsiuTvcKYEkd85tc8lDbAdBEARBEARB8PISYvvAJptTokjBUqJNS3e3vYx8GMomJG+EtB1X\ncCt309KP12cvhubQ+Q2gpiQVFLd8VRtqDdPqCeBJKQVUusDuDrSJB4qlxUvPk7nYTvi+irhbbeL9\n0LqODxOkO9s5+5xtkbytq+pp4FtwWu9HX4V9WxZaHdCmu7udEjkXck40VRf/Kj4DPO2hci64ZSsr\n3257MNo6BgzJZPaqgHVO93obBMHLhYh8K/Cn8R9yP9vM/v77fEjvGa+99hof//jHAfjoRz/KZz7z\nmff5iIIgCIIgeL8IsX0gySqBU2+3doe7DIVxGplOJ8bpwjAWcknklLZyaFt7rtfFekL4WhoucuuD\nv9ngqnWetvW0bqnVg8lyomgm4eO9MkrG/xKrNpJWRBs0JYmRxWdit35su7u892nnUijDyDidKMNA\nznk71v1Ifbr22n+uaszzDAvoLExZmbIyZqW+cuLuNHG68+MVxMW3QNZMUUPLem0i7eXhKSFVQLxv\nXLR5croqKfXXU9qqBsLZDoLgZURV+dznPvd+H0YQBEEQBC8BIbYPJEl+R/bgspwzwzAwjhOnHpY2\nDAOlZFL2kmzZhPbquNIFYX9mFdxdlJvd5IZx45Fv76UHpbnoNKCUjJn2fuxExihAMSP3MWE0v03J\nyAg59UC03gu+XgRIgovtXBiGkWE6UcaRlMtW577PtGYNWnchrsq8LCytMssqtI0h99RySZRhYBwA\nEXfQJVGyYZb3IoCj0E4J0i66pQpQMWwLb0tJtjnh1md2B0EQvHwkQN/vgwiCIAiC4H0mxPaBJAev\nWTJglFwoZejOdk8lHws5Z0/ehi3hey0dBzZVbUehvcnxF+EusvWZ2fQSbqORzGitefiZGCLqbjHW\nxXYl9
TFhqJIwSvaebNuSvRWx9ddA7692Z9uT18swknJG0mEK+NqrDT2gzEvd27zQrhekLS60k28p\nJYZx4O7utH1bSf6+nP3Xz218lyR3so8O93ofnxSezJAutN3x3svQNUaABUHw0pGAV4Fwt4MgCILg\npzohtp+LbMFlktylnU4Td6884XT3wDRNjGPhWgpW1YPCDhOgbwzhVXBjN/3b6+eIHHLSDp3c9Pew\njt4Cn0HdGq02UgZreFn4apevY70OYnQ9njWrTQCSkCVRhsIwDAyTp66vZeTbCDR6n/aaLt6Vtzal\n1sp8nbHlyv2YmAoMGXJOlJIZStnc85S9ZF3V+7Ctp6/7vO51Bvm++QUMJZui1vvje7+59mC11jxQ\nLQiCIAiCIAiC4GUkxPZzsb10WoQyFKbTiScf+hB3rzxwujsxTSOX8YrRaGrQXCua3a5jq9S1TfIC\nhyrq3kHN0VXf3rUKaE/vbk1ZamNeqo/7UtCeQt7nbmEpYWq05EJeBZqsItd7x1Pvix6HgWEcGcap\nO9vDPtZs3Tclt6RX5948nbzWyrIstHnm4ZwYMqQ+czuJ32+tUkqmlEJxVe091/0MCJB7cvnx+xuG\nWqZp8xFoXWibGq0JtSq1wlK/DH/1QRAEXxq9DykuAgbBB4XXXnuNT3/603zyk5/k1Vdffb8PJwiC\nt0lrbb2b3s/jeDu89Af4XrLPuT44wSLef3zXxfaTJ967PY0MgwelpZT2gma58af3cnIOv4L1hm1h\nTfd2YezzqbcFurjVzRFurVFrY1kWlqXSqo8Fw7w421PJEyqZJplFCguFSsYLy710O2UXwMM4+HZw\ntlPKq/zvx+T7y1ra3eduu9iuzPPC+Xzl6dMzX3j9ns//5FM+/5Nv8BOff53Pf/4LfOELb/DGG/fc\n35+5nK/M80KtPlIMhJwzYykMJTMMHkZXSiaX3Pvi01bW7mnlPhqsVr/wsNTtP7YgCN4jROSfEpHv\nFpEfE5EHEfl/ReSvisiv+yLWyCLy7SLyl0TkcyJyEZEfF5EfEpHfJSLT21zn14rIXxCRvy8iZxH5\nCRH5n0TkO0XkI2/yvj8tIioif68//qiI/Aci8rdF5PX+2i9/u98HyF/EvkEQvAS89tprfOpTn+K1\n1157vw8lCIIvgoPYfun/3xvO9gE7NCiLdRNWhFwK43SCV5S7J3dMdxPTNDKOA7UZKTUXo+tIKnm0\nXhfx27yro+rufdBb6fZqVK/J5mv4Wk8mr7WxpEpSf4/1lPB+ZQBLsglrRbw8fV+ozxDvYnsYu7Pt\nGyljWyJ7d7azIJK3edioz9VutVHr4mJb3H1vrQLmc7ExVCvTNHE6eb/7UAb/3DJQCpTBhX3Juc/V\nNoolmgq5z9VOKdGUrYR8n8Wt1BoOUhC814jILwD+Gt6YvP5HOAH/EvBNIvKngf/hLdb4OcAPAL/g\nsAbAVwG/FPhlwO8UkV9lZv/nC9b4GuD7gW98tMYI/CLg64HvEJFfbWb/81sczy8GfrB//kr8gAmC\nIAiC4B0RYvuAWk+PNTYn1/C50alk8jhQptGd4B6WtlQll4pI7ZpXbkrJzWPFsaTuUlu67d1+PANs\n7bvW7lVLAlEQObjKiYx4iFvyBY7Ra3Yo+V7jxJIIKXfHuIvtXDI5Z3eut1JuX0v69157vKu2npKu\niFbQtl0AWJoitaEpkeZKvizIcKUlYZorp+vM6Xz1VPdh6OnuI9M0ME0jYNRWadU3bc2T07Gtp90d\nbajVhXZr2t3xIAjeK0Tkw8B//f+zd+dhlmVlne+/71p7nxOZVQUoRVFV0NricNUGGykKQdQSpy5E\nUbH1wn1UJpVufXyc29ZuQdDmtmNre+2WFkFAvXgRcGjBgdYLqCiTtO3YLcplKiChoIbMiDhnr/Xe\nP9bawzkZkRmRGRkZWfH71HM40469d5wkMvK337XeBVxP+dF8KfBi4AP
acbJoRNvb+S2V7ossDwd1p5oj1kT3EuGukeZuEnXwWZSUEskT1Z6P+SWq\nfSDu887lfB/a7nQ6nU6n0+l0Op0Hiy62D9i1kvKKWwnDrtqaR2ukWmsyUoSym4hdGnrJXhzLIoC9\nCW7HTfG6F9v4XrLqgdiO0uX9PLo4m9vSwiuMyWSndttxvfW6JqLd6rKLfEf9tkaNNQkTx8SpGqng\nIXKlCe2IdBdgtkoxpZhiKlGn3eqpl7ZhS/L7roUYe/GtNMdxmjZuAjvady29uGmp7HHsiseMqeKl\n4poxuUzlCrXVeS814l1sdzqdTqfT6XQ6nQeVLravYxHbFsNqaz+1tNIyxA1ZirG94K6YKbU2cax7\nkWwA0gQ3hrmECN5J1BCrJtpag+0du0XCOM087Xpde3LcmvtZ2JuHc7do6/HtUQtu0gzO2BuROXhL\nRTcSYfwtuCRMDfNKdaO4UakUEXLOlJSZNWGtx7d53BhItKx4h+wwANmdoT0ePEYmUuydiF57Nay5\novuBWJZl64QQr4aXism8u6nhy15L7+5ukdbpdDqdTqfT6XQeULrYPsSXNPIQ1bGt4LoT2EJtwru2\ntO6KSaFWQZv9uKpGPTbsxbaDWJiVReq47hPJxaMtWBPatgh8aNHfENyY42kvtuMzGvXbJk1wE2Hm\n5uqtsjilS6ubDiszF8GTtvZdhtGEthtlEduqzCmRd2I7sViXZYTcxHQI7bYlWpHt+nm3dmBOGLCx\ntPlqr+8j/DspDUtauVTwlpLeasB3orz9NDqdTqfT6XQ6nU7nQaSL7etYIqUGTWS3ousQ3F6bDVhF\nPOSeeQmhK+HujSYgRcsqJ4TwgSaURWyje9GtUcO8r+WWXT20e9ovSz0M0pJHkXbL3xYF1Waepq2A\nW2NjKtf3yPJQ4q5tK61ftjilCe5CuIJn1RDcsjinR914QprIln0Um/Ap06XfdtiOxzWgtR+rrUZ8\nJ7pbRXdLg18i1r5Etr1QbV/vvchxPXBw73Q6ndshIi8Dfqg9/SR3/5F7dayrV6/y2GOP3avp7xqP\nPvoob3zjG+/3MjqdTqfTea+mi+0DFvEm0dw5HsPOU1tZekqHeDSrIZA9csZNErqkmi9T2PVie5n/\nOrEtirT08yVZWtvnpTWdFqFF2Nu6zHF1MNu1EAu3ccXVdqJbFnvxVmsdRt6LCA6RvdSVhxW5tzXG\n++LNTZ143yWM2AaEQZrgNidbpJCHyLZwTPeoZl/miVZlIZYj3T4M09Rpae1LO7CW9l6ibrtWo5pR\nzEgiSMqklCKToNPpdO6ce157YmY8/vjj9/ownU6n0+l03gPoYvsAbXXSLfzMrnmV+z5iuxPbLcXZ\n9u2tRAxzRaQ2wd7muFFsL9Zi0rpkawxEUaKF1zJEFnEac4FGWrZopJWLNBEc0Wc7eByiW3dCO6Lx\nvosis7ilH7QCS9LOWVsauhoJBbe2RiOLMIjuotnJjGxOqhHRdrMWvW657O38dyZqraZdZW+qZjuD\n8XArr6VSSqWWipeZMs9M80zShI4jwziS8nBvvxCdzkOGiLwSeD3xl80Hufuv3OclvYfyovu9gAu4\nys6fpNPpdDqdzj2li+0DUijaELKtTljcr4tse9hqtzbTstOTboRRGYu5Wes5LTcLpMjiGb6LarMY\nq7H0y5Zd32xXgdqcvd0jFTzsx3frdFF8EdpNbNPEsdDcypelNHOxuEHQWpO1vHXR6MMtYbhOcqhY\nuzFgiAhZhFFgFGUQSOZoNVINgW016rPdF5UPqMR8zX1dVVGR3fOlD7cDtVSmuUW1txO23VC2W6bN\nhiFnbL2Gauiq/8LY6XQeNBR42/1exAU8BvTIe6fT6XQ6zwRdbB/g1tpUiS2Z1y3KHU2pFndxb4I3\n2lA1Z2wlWn6hrQ6ZRW/T/L8OaPnlreBaXKPNmO4jvgak1uYruYdZm+7TzZfo+xJVp9VUs
xPZrV3Z\nEq33ln7e1rFzAhcQjfWqth0WkzUVJEULsagrjzrprEpuUfCMoNWa2I5aazGLCHcT26JxfHVHzVCL\nGxepGbjpbk0sfcBItZJKQec5xjSh84RYRVRalL27kXc6nU6n0+l0Op0Hky62DyilxINdDnUTiwam\njlSDZCFitRmbJd+JxYhUJyJ1XFoPbA6Eti8V0c0dPMLNjocjOY5VdkLf3VCrWC2UkvZiWxPI3qHb\nAUkZTQlJaRclj+1y8BYFX1pmHfaoXlK9dS+OQ9CGy/kNOe24WGvHpVTaTYrFEG1ngNbS7qXVcQsh\ntNs5yU7sN9d1a8Zo7nipME1oKWSrjB614qj05BhVAAAgAElEQVSQBLJXpE743MV2p9N5YFjubN7f\nVXQ6nTvm6tWrvO51r+PVr341L3jBC+73cjqdzh1Sa10e6v1cx53wwC/wmaTOJUZZRt09Lq1uuE4z\ndZ6o84yVCSszVgteS/TlrnXvtG0tyttqvXftrpob92Ky5laxg1HLTC0TZZ6YW/r0dH7OdnPOdrNh\nszlns9mwOd/EdrNh2m6Z55lSClYqtVastFENqxVvI143rFS8lhC3pcBcrxtysJUb3vO5YtNMnQo2\nx7WKY9Wd27iYIbWitZJKRUtFS0FK2W2ZZphmfNri2y222eKbLTLPaJnJtTK4sxLnKAkr9b3Ynjb4\ntLnfX5tO5z0eEXmZhJvi65eXgLeKiN0wPrHt/63t+X9qzx8Vka8RkbeIyJM37PuyGz9/wTqW/f6X\n2+z3B0XkH4rIz4nIEyJyIiI/KyLfLSKfKyJX3o1r8GwR+Xft+FsR+dNPcYru2NjpvIdx9epVXvva\n13L16tX7vZROp/MUOBDbD/y/vT2yfcDuB2cR1RYnor5Gq2d2RA08ta23Xs/72ukIaMvOjVx2PcAi\n3hG10hIRXJbIdES2fYlTmyJmoJWDiUCXyHVqtdmtHZYI2ZzsTnbw5Nc5nKu0NmVLw+9dm63mRt5W\nt6x/l/qdBKmtcFxbqy1t6esuLF7ju5Njaf3F7saCthsNy1asol4P6s/DfdyrYTXqvb3GTQutNW5g\nEO7uKcX6hRpZBnW+91+KTufh4TAN5vD5je9f91hEPhb4XuA5t9j3otcuWse7ICJr4FuA/+Ym+35o\nG38C+HLgf73D4yEiLwB+AHgxcAZ8prv/izv9fKfT6XQ6nc7N6GL7gJTi5kikYIeo3ddHt/rqg1Ru\nVW3Z14ugXHo/18g93/Xv2h9jn1G+vBgC0rWJXz9wXDPd1Y2H63htLuMpTNOWdeki+MN6zc1aCrs2\n4a+7x3tn8ji6NrEtrRbd2PfdNo12YpaWfuBRK22yzzwv3kzcaGXeLU1dm7O4uaHNLE28IlapVluk\nf29AF/23HWtbLDIExA3BSS2N3H0xneupmp3OXeTfAy8BPh34CuIP2KcS1tWH/NINzy8D30k0JvgK\n4AcJsfqSm3z2aSHxl+v3AJ/S1veLwDcBb2zHfAHwB4HPeorzfjDwL4EPBJ4A/ri7/+hdW3in0+l0\nOp2Hli62D8ipXY7FIVzS4hq2E7aiGk7aeiBgIVKmgV0za1H2oe7dGzc3S5Noz+Xaor1iuCtgLLLS\nkdZLWzFRfDEJU0XS0kqslV7XqOkWZC+6NUFrudXeiaVIiGSREMZmRrUagls1ener7M9Z2zpazbV7\ntB9rfmo7we14JAhYxT1agYm1SLXVaCW2OL0vrcjaye4d3x0Isa3NZM3joO29Lrg7nbuBu58DPyMi\nLz14+RfvoPXXc4ET4OPd/S0Hr//E3V4j8D+wF9rfBXyOux+mt7wJ+BfAX2+R6tsiIi8B3gA8CvwG\n8HJ3f/NdXXWn0+l0Op2Hli62D0jaItuqLaodYnsxNUNC4KYD4blEpqP314I3R3PdGaUdRrd3T5wW\nsV6i2ctjiZruZn7mTbxWEaootbX2ElUkJcQSLALaw
bXiuwMemKq1dlvahLguqehNJFcr1FqptWBN\nRPuuIXYzXtOI9EfUOlLfXcBaq7Lom+170d1q0v1AaLuVdnOipZzjbfWHBuP7PmVNZyMSKfdm7Rpf\nd807nc59wIGvuUFo33VaVPuL2/HeBrzyBqF9/aLcbxtVF5GPI9Lfnw38MvBH3P0/3p0VdzqdTqfT\n6XSxfR27KPVue/gmraaaXa1xqL4lgXrpXO1460uNWNRJSxPNbUJbRDSCH0a/dZ9aHrHiqGe2Fsmt\nCIUaDuCqaE4IhuIYUC0c0/W6GwRNkGsMby3CUktNt4M2ZbWEwVotM9XtoFZcoc3hKdGy0Vv9dLT9\nXlLdXfY3B5yowT4U2dQmvL3GVWtz7MdBBsDBtd9t292HRex3Op37zv/5DBzjI4kG0Q78Q3c/ezqT\nicinEenvR8DPEUL71572KjudTqfT6XQO6GL7Ovap0SJLSvf+NZBwDl9aZ4m2VO3WMzuad0UsVuyg\n/Ze0NPDFEO1gLOneANbabUkz1hN2wt6spXnjFPcWcfd2VKG6R5p2Kbs67UUoa81IykjyEMxi2C7q\n7rtk9VrCaX2eZ8xqax3WzNFaazFyZmlpJiI7gSzahPKuZ3ZL9bbFobzuhXZLI4/1Xy+2l9NeblDs\nXlhahLX6bmvGap1O575y6u5vfQaO81EHj//N05zrM4HPBwYi3f2PuvtvPc05O51Op9PpdN6FLrYP\nWNKSQ+PK3qhs8esOd65Wp2w7QStocxNfRPRBVHhn770X14dCG29zKBHNbnXMHGSeh3Bd6qljuAqJ\n6GPtCmZCXRKyF7Hd6ro9jXuXcA2ncteWJo4T8W3H5pkyTZRpi1ndm6KJYrniOYf5Woo1q+5T0veC\nm10PbzeD5i4uBxFtb2JbW/RbuUFotweH6ftLTXrUgjdDtV6z3encb975DB3nuQePn67x2he17Qb4\nk3dfaDvw/DvYL3F/Opb0FkedTqfTeTC5evXqHbXim+f3nI5EXWzfhKVFF0s0+0DWhXlXRF1FLKLQ\nTfDGZ+WG9PBFVB9u9WCfvch2DCXE/OItvhiFuTtm0S+7Wm1RcMWSolWwnbkY+2h0q9V2D7GqaATK\nlX29eGvg5Xj0FJ9bH/FamwlcmKO5W7sZ4UTrM2FXD+5NGS8O64dtvXYp5Ae12153Ne6Kh7t5i3Af\nlmwvbc12RnMee0Rq/b38BnQ6nTuk3n6Xu87T/dP/ncBnAGvgn4rIp7n76dNf1iHvuLvTdTqdTqfz\nEPC6172O1772tfd7GXeVLrY795+4p9HpdN47OXQy1FvtJCLHF8zxmwePX0i0/Xp3+bvAjwNfC3wc\n8H0i8sfc/drTmBMO+owvvh+34073uxe8/e1v57HHHrtvx+90HgSmaQLg5S9/OeM43ufVdDqdWivP\ne97zbrvfO96xu6n9nIv2exDoYvuAV/21r+ySr9PpPMzci5yRk4PH73PBfh96wXs/efD4E4F//XQW\n5O5fJyIZ+CrgE4DvbYL7/GlMu/v3w+/QUOJO97sXmBmPP/74fTt+p/MgcfCLe6fTec/igdduXWx3\nOp1OZ2Fz8Hh1l+Z868HjjwG++xb7fc4Fc7wZ+FXg/YA/JyJf93Qdyd39ayTcKL8CeBnwz0Xkj7v7\n5jYfvRVb4poZ8Pans7ZOp9PpdDoX8nwiW257vxdyO7rY7nQ6nc7CoSvJB/P00rUBcPd3isj/B/w+\n4PNE5Gvd/TpjNRH5Q8Bf4BaRdXd3Efla4O8QLcC+XUT+9M16bbee3I/eSa9td//KFuH+cuCTge9p\ngnt6amcJ7n7pqX6m8/+zd+9htm1nXee/7xhzzlVVe++TaCfh5MJFBBtUUEmCwRsEsI0gEAERWiUS\nLlFpLjbY0CCGgA9PNyrS0tiGi4FAt1GQBrXTEaFDBB+ICchFQW4ikOQoh0iIOWdXrTnHePuPd8y5\nZtXZl9r71N5V5
9Tvk2dmrao1a81Rtfc6tX/rHeMdIiIiT243XT8nIiKXzr9hV93+SjP7SDN7XzP7\n7e2422r332237wb8kJn9aTP7vWb24Wb2NcC/AN7MraeDfX07D6LB2U+Z2eea2R9oz/UiM3sFsW/2\nZ552YO7+FcBXtg8/EvgeM+tP/62JiIiI3Jgq2yIiAoC7v8vM/g7wV4APAr73xCkfBvzLu3jqbwT+\nGPBi4P2Bf7C+LPCTRID+T7cYm5vZxwHfCnwi8L7A197o1DsdnLu/vE0p/5I2zu82sxffqHIuIiIi\nclqqbIuIyMLdv5ioDP8g8HZgIgJsPXkqpwy2Hp3APpHY4/pNwLva8RNEwH2Bu992nbO7H7r7nyam\nfH8b8B+AR4F3Aj9DbOv1KUSn8cd8+a3G6+5/Ffjqds6LgO9sU8xFRERE7oqdZzdUERERERERkScj\nVbZFREREREREzpjCtoiIiIiIiMgZU9gWEREREREROWMK2yIiIiIiIiJnTGFbRERERERE5IwpbIuI\niIiIiIicMYVtERG59MzsPczsb5nZz5jZu8zs7Wb2r83sC81s/wyv88fN7LvM7FfN7LDdfpeZveis\nriFymdzL166ZvcTM6imPTz2r70nkycrMnm5mH21mrzCz15rZw6vX0N+/R9f8FDP752b2kJldN7P/\naGbfZmYvuBfXe8z1tc+2iIhcZmb2McC3AQ8AJ38pGvBzwEe7+y8+jmsY8I3AS9un1texdvuN7v6y\nu72GyGVzr1+7ZvYS4FU3eO4b+TR3f/XdXEfksjCzeuJT69fWt7r7SzkjZrYH/GPgj3Pj/z5U4Cvc\n/SvO6po3osq2iIhcWmb2+4DXANeA/wp8CfAHgI8gwrED7wv8MzO78jgu9VVE0HbgR4FPAT643f5Y\n+/xnmNlffxzXELk07uNrd/bfAR9wi+O7z+AaIpeBt+NXgO9l94bzWXsVu6D9/wEvJn7vfjrwC0QO\nfrmZfcY9uj6gyraIiFxiZvYG4A8DI/CH3f1fn3j8C4C/QfyyfsXdvANuZu8L/DsgA28CPtTdj1aP\n7wNvAJ7XxvE7H08VXeQyuE+v3XVl+7e5+6887oGLXGJm9nLi9+Cb3P1hM3tP4JeI19iZVbbN7IXA\n97fn/SfAx/sq9JrZf0O88f0ewG8A7+3uv3kW1z5JlW0REbmUzOz5xD/WHfimk/9Yb74G+BninffP\nM7N8F5f6y0DX7n/OOmgDuPt14HPahx3w+XdxDZFL4z6+dkXkDLn7K9z9te7+8D2+1F9ptwX4bD9R\nXXb3twNf1D58KnDPqtsK2yIiclm9eHX/W250QvsFPa/DfCrwwru4zscSoeDfu/ubbnKdNwI/SwSD\nj7uLa4hcJvfrtSsiTzBmdhX4cOL37r9w97fd5NTvAt7Z7v/JezUehW0REbms/lC7fYSYTnYzb1jd\n/4N3cgEz+23As27wPLe6zrPb1DoRubF7/toVkSes5wNDu3/T37vuPgI/QrzJ/Xwz62527uOhsC0i\nIpfV+xPvfP+Cu5/skLr27098zZ34nTd5nrO+jshlcj9euyd9i5m91cyO2nZFP2xmX2lmz7r9l4rI\nfXQ3v3c74H3uxWAUtkVE5NIxsw3wtPbhW251rru/g6igAbz7HV7qOav7t7wO8Kur+3d6HZFL4T6+\ndk/6UOBB4h/lv5XoavylwC+Y2Wc9zucWkbNzoX7v3pNyuYiIyAV3bXX/Xac4/xHgALh6D6/zyOr+\nnV5H5LK4X6/d2S8Se/X+CLt/mL838AnAJwJ7wP9hZtXdv+kuryEiZ+dC/d5V2BYRkctob3V/e4rz\nj4h1Xfv38DrrLuV3eh2Ry+J+vXYBvsvdv/UGn/9R4DvM7KOA/5v49/TfNrN/4u6/dhfXEZGzc6F+\n72oauYiIXEaHq/vDTc/a2RBrRK/fw+tsVvfv9Doil8X9eu3i7v/1No+/FvgKIsw
fAJ9+p9cQkTN3\noX7vKmyLiMhltP5H9Gmmjl1pt6eZtnq317myun+n1xG5LO7Xa/e0voEI8xDrukXkfF2o37sK2yIi\ncum4+xHw6+3D59zqXDN7KrtfyL96q3NvYN2c5ZbX4Xhzlju9jsilcB9fu6cdz8PA29uHz74X1xCR\nO3Khfu8qbIuIyGX1M8T0z/cxs1v9Pny/E19zJ376Js9z1tcRuUzux2v3TvjtTxGR++Rufu9OwC/c\ni8EobIuIyGX1Q+32CvDcW5y3nhr6r+7kAu7+S8DbbvA8N/JH2u1b3f2X7+Q6IpfMPX/tnpaZPY3d\nVmRvu9W5InJfvIldY7Sb/t41sx54AfFm2ZvcfboXg1HYFhGRy+q7V/c/7UYnmJkBn9o+fAfw+ru4\nzvcQVbj3M7MPvsl1XkC8w+4nxiUij3W/Xrun8TLi9Q3whnt0DRE5JXd/F/D9xOvyI83sWTc59ROA\nB9r977pX41HYFhGRS8nd3wT8IPEL+dPN7Pff4LQvBN6fCMFf6+5l/aCZfaiZ1Xb8/Ztc6muJKWoA\nX2dm621JaB//nfbhBPxvd/UNiVwS9+O1a2bvaWa/91bjMLM/AXxZ+/AQeNWdfzcicifM7CWr1+5f\nu8lpf7PddsDXn1xu0mak/C/tw3cA33xvRqt9tkVE5HL7PGJ66T7wL8zsq4gK2D7wKcBntvN+Fvia\nWzzPTddsuvvPm9nfBL4YeD7wr8zsfwV+EfjtwBcBv689x1e7+y8+ru9I5HK416/d9wJeb2Y/DPxT\n4MeBXyMC/nsDf4qojFl7ji9w94cex/cj8qRnZn8QeJ/Vp562uv8+ZvaS9fk32ed+efimD7i/3sxe\nA3wy8HHEfyO+lljq8YHAlwDv0Z7ji9z9N+/oG7kDCtsiInJpufuPm9knAd9OTCf7qpOnEP9Y/2h3\nf+RxXOpLgacDLwV+L/CaE9dw4Jvc/ctu8LUicsJ9eu06sabzQ27x+CPA57v7PauMiTyJfAbwkht8\n3oA/1I6ZA7cK27fzUuAa8FHAhwEvPPHcBfgKd/+mx3GN21LYFhGRS83d/x8z+0CiUvbRxFYhW6Iz\n6T8Cvt7dD2/1FCdub3QNBz7TzP4x8FlEhftpxBZGbwL+nrt/7+P9XkQuk3v82v1R4M8SQft5wDOJ\n12wH/Abw74h1od/k7r9+g68XkRs7bff+W5132+dor/2PMbNPBv488HuApwL/GfiXxH8f3njKsdw1\ni9//IiIiIiIiInJW1CBNRERERERE5IwpbIuIiIiIiIicMYVtERERERERkTOmsC0iIiIiIiJyxhS2\nRURERERERM6YwraIiIiIiIjIGVPYFhERERERETljCtsiIiIiIiIiZ0xhW0REREREROSMKWzfZ2ZW\n21HO8DlftXreTz2r573Lsbx8NZa/dp5jEREREREROS8K2+fDn2DPezcu0lhERERERETuK4Xt82Hn\nPQARERERERG5dxS2nzwcVZNFREREREQuhO68ByCPn7t/GvBp5z0OERERERERCapsi4iIiIiIiJwx\nhW0RERERERGRM6awfQGY2fPM7BvN7GfN7F1m9nYze6OZfbGZXTvF1992668bbcllZntm9ulm9s/N\n7JfN7Kg9/oE3eY4Xmtn/aWb/0cyum9nbzOxfmtlfNLP9x/dTEBERERERefLQmu1zZmZfDvxV4o2P\nucHZPvD8dny2mf0pd/+RUzzdaRqkebvu+wHfCfzOE1/7mOcwswx8A8fXhTvwbsCDwB9q4/z4U1xf\nRERERETkSU9h+3zMgfdzgL/WPv554I3AFvgA4Hnt3GcD/6+Zfai7/+QZXf9pwOuAdweuAz8E/DJw\nFXjBDc7/NuCT2QXxdwCvB94OvAfwYcD7A68F/skZjVFEREREROQJS2H7fP0NIux+uru/Zv2AmX0I\n8A+B5wAPAN9mZh/k7uUMrvsXgAx8B/DZ7v7
2E9fOq/t/juNB++uAL3L3o9U57wZ8O/ARwF86g/GJ\niIiIiIg8oWnN9vkxoAdecjJoA7j7DwMvAo7aub8b+HNndO0M/HN3/+STQbtduwCYmQFfyS5ov8rd\nP38dtNv5/xn4GOCn2vckIiIiIiJyqSlsnx8HftDdv/OmJ7j/NPD1q0995hlc19rt55/i3D9GTBM3\nogL/V252orsfAl/Qzj3N2nEREREREZEnLYXt8/XqU5zzre3WgOefQddvB37S3X/uFOe+cPU1r3X3\n37jlE7t/H/BWdoFeRERERETkUlLYPh9zGP3h253o7j8FvKt9mIEbbst1h370lOf9vtX92461eeMd\njkVERERERORJR2H7fP3KKc97y+r+08/gug+f8rz1tU471tOeJyIiIiIi8qSlsH2+Hj3leY+s7l87\ng+teP+V5V1f372asIiIiIiIil5LC9vk6OOV5V1b3/+u9GMhNvGt1/27GKiIiIiIicikpbJ+v9zjl\nec9e3f/1ezGQm1hPNz/tWN/9XgxERERERETkiURh+3zMW2O94HYnmtnvZjd1vAA/ca8GdQP/ZnX/\ntmNtfv+9GIiIiIiIiMgTicL2+fpzpzjnJe3WgTe5+2nXW5+F17dbAz7KzJ56q5PN7COA56B9tkVE\nRERE5JJT2D4/BnyomX38TU8we3/gs9mF12+8HwNb+V7gV9v9A+Crb3aimW2AvzV/eI/HJSIiIiIi\ncqEpbJ8fB7bAt5nZJ5980Mw+BHgdsCHC678Fvv2+DtC9Al82Dwn4dDP72y1Yr8f6IPDPiD3Aj+7n\nGEVERERERC4ihe3z9T8B+8D/ZWY/a2avNrNvNrM3Av+KaDZmRAfyl7j7dL8H6O6vBv4R8eaAAZ8H\nvM3MvtPMXmlmrwV+CfgI4D8Af/d+j1FEREREROSi6c57AJeUAe7uX2dmTwO+FHgf4H1X58xTx98K\nfJK7//h9HuPanyH22Z7Xj/8WYD393YGfbp/7lPs7NBERERERkYtHle37z1cH7v5y4A8ArwJ+HngE\neAfwo8CXAL/L3X/kDp73dufc+YDdi7u/lKhe/0NiHfcR8J+AHwI+F/hgd/+5x3MdERERERGRJwtz\nVy4SEREREREROUuqbIuIiIiIiIicMYVtERERERERkTOmsC0iIiIiIiJyxhS2RURERERERM6YwraI\niIiIiIjIGVPYFhERERERETljCtsiIiIiIiIiZ0xhW0REREREROSMKWyLiIiIiIiInDGFbRERERER\nEZEzprAtIiIiIiIicsa68x6AiIjIE52ZPQJsgAr82jkPR0RE5MnsGUTR+Mjdr5z3YG7F3P28x3Bh\nvOyT/mj7YRgpJcwMs0TKOT5Ou/vxseGwHJBwiyN1PV2/Tzfs0fUbsARksEStE2U8okxbyrSlli0+\nHVHLFvOK1Yq5Y15jOO3PyFZXsvXt/GfYbt0dJ/7FB5ByxlKOW1t9bzf4GZiBAWbzo9Y+mbDliM/F\nz+fEE6z+Os1/t/xGD7ZrHLvO6uvWx/Hnbz+D1edf/rWvvtG3IiJy35jZBOTzHoeIiMglUtz9QheP\nL/Tg7rc52JnF/V0QnIMtEYJph89BtIXG3GPtSN1A6jbkbsDyQK1OdagljlKNUhPVE04G6/EE1AlS\nwWrBMfAa16/rWB9h2uZb97jnMVb3OCPONGAXrM0Mr/VE0N09Ngdo8/aDWJ6hXdt2P4v40cyj4FgA\n3v1Md2Oax728edCSvZktOXx+3J1V2N593fL1c+gWEbkYHOK/Z8961rPOeywicgrb7ZaHH36Ypz/9\n6QzDcN7DEZFTetvb3jbntgsfBhS2TzgZsm0VNnchux3mreKbIWVSv08e9sjDPtYNWOqx1IF1lHFi\nmgrjNFEK1JLwmnDv4q+JGaREImF1xG1X2XZ38NIC9fHKrh0Lur5Utefs7RjuTvKMJz9RlY7vc77d\nVfNPBu7
2hO17P8Z2IX+prB/7gbbg7/P3sQvQzNX1do1dgX53nrc3Enah/cRziIhcDP8FeMbTnvY0\n3vKWt5z3WETkFH7sx36M5z73ubzuda/jgz7og857OCJySs94xjN4+OGHIX73XmgK2yvLtOc5SHur\nvN4odHvF3DAMTxlyTzfs0e9fo9u/RsoDTlStqxu1HDH5EYdToRbwanjN4I5hpBbavdWrzQuY4RXc\na4TtE0FzHhPrILoE7t1Ec2tfYzXtqvCsK9lxuHuMYwnd8fzukJK1n4kvZe3VxO/VuGZzgF4F5uq4\n11XFep6Gvq6Os5zLsbB9stpdb1RIFxERERERuRAUtlfm8GZzwrRVFdkdo0YAr0AyYt53VITJHbnf\noxsOGPauQh6YilMKTFNlLMbhCNePCqVUqBX3igHZIKdETgYUrK3tXhZQL2GzLoF7/txuOnddhe15\n8jeAxXsGMd+8Be3d9O15XboliyJ1MhKOWd1VuC0Cv1drU92Xhd3xhkAb2zLN+8T66zk0V3e8Vmpt\nU+N3dfn57FUlfFfFrieD9iqEi4iIiIiIXEQK2zfijhtLiKwe66JrncNqwathGXIeWhO1ecp4ppKZ\nJufwcMv1wy2Hh1sOD4+Wo5SJWiZqLUBl6IyhTwyd0VulxzESmTlwt1tSqwHH9PKYhN3CZwvQj60u\n27GPvVXr52nmrf6MFbAUz12hNUKLqruZkRySGblWlry+TP9ehWDWzdXWj0H1ilePW39MzF5NQ2+h\ne/WcJxumKWyLiIiIiMhFprB9A60IjFvFPSq3tRpQY9o4EXRjhbVjlsgtbLt1VDLj5Dx6feSd73yE\nd73rUbbbLdvtyHa7ZZomSpkoZQScvU3H/qaj7PV4qqQEXWohm8TcEdzmsZFYmqUtiflkRXstJpwv\nq69XGdXbtPB5Svr8vVvadSWPynYU9X3pQL47fx2Il6vPgftEVdo9KtuP6VTufuv78/Osqt/K2iIX\ni5m9HHg54O5+V525zewHgD8C/IC7f/gZDu9W13zc4579+q//Os95znPOZmAick9tt1sAXvSiF51b\ng7QHH3yQN7/5zedybRG59xS2byLWTc/rh2P6tFdbL1kGy219dyLljrSqbI/TyKPXt/zmOx/hHe94\nJ9M0xTFOTNPIOI1MU4TtKwcbat2AGal3ug6GOWzPxzz1m5gT7vO2YMzV7CX6zoM7fsxBfM7A7f/W\nFeWl+Vtqzdna9cwgW6v223ry94lKs+8y9q7qzbFmZ3U1Hf5Y07Ple1lGt/7DOBbqd9+ziDwJPaFf\n3O7OW9/61vMehojcgdZoSUTkzClsH7Nr0nWyQOyrLa7co0o8ryVuvbwYx4nqR9TpEa4fTWy3W7xW\nckqkvqPPGR8Gjo6O4HplHLeUaWQcE+OY4xYYqUzmpEwMxBKkrk1vn9dsz+PxZd10e0cgEvEqaNsS\n1tNjvts5OK/i9u4xB6cu670dX4VtX4XfOTjvLI3PfDclfL0Wez1FvP1Qj79VMDdxW/4v7sRbCjf4\nAxKRJ5P2H5knqmef9wBE5FS2wMPA08ZrtyMAACAASURBVIH7Xdl+CKi3PUtEntgUtm/kZNBepkzv\nAuD8L8Hq4NUppTKWLdsjGJk4Givjdksy2Bt6UkrL8eijj+Je2G4PmWqllhLV7m1iNJjMGRMki728\no0N4XgVUB1Ks3bYWslkH8TmWpnZvVSWH1Tmtcs/x6d+OLVX9+f+X/b5PHsemdq9+fLa61FLBXq21\nXk8jv3FS321P5uvn9HjjQ1lb5EnJ3V943mN4fAzQ1l8iTww/BjwXeB1wv7f+eg6gWTAiT3YK2zfV\nAuCyNdXxTy+bgLWp0bU6R9OW69PE9emQqUSGTMDeZqDve4a+px8G+i6xPTrkEUvUUihToYwTY06M\nyeIokFO0RMNap/J5CnadK9opppOvQnZ0TW8V7fa169slEK+q0stU7t0+Y
qwy8hK0ndqmrx8P27tA\nfrOf4i6J+y1D9u59DpvfLjA78eaHtbXbmkouIiIiIiIXl8L2ym7/6dZ1vM2+nnk7Jyq/8aC7USuU\nUplGZ7uFwyOnuJFTJqVMlzNdTnQ50edElxI5tevg0JqGlVKpNVEqVE8Uj47npI6UiS2zagU7vgUY\nPncn9zarfA7brbJtq/vzdzI3NaMu07uX0Mzueecp2/PWXtXnHbx31z8ZtP3kvfVjj0nl8898/ZnW\nAb3t9w2267e2DM9Zdz0XETln5bwHICJ36plEb8RnnvdAROQO5Lz0Mr3wv3sVtldSimnWyxbX5sva\n4zBPzU6QYh21W6a6QfW2dtvnxc5RhS4w1UoZtxwSz/vIo49y+OijUApdivDd5UTOiZRyhMycSV0i\nd5m+S/RdptYSlfBS8FqWxmNzOJ4L01EMnoP2air5rh85u7Bcl6nkuxA+NzCruBe8xrptq21Kt6/O\nn697sj/b2vzzYA7VuwnnZrsu6fMds3nqfDr++DHa+EvkojOzpwD/I/AJwHsSCyR/AvgGd3/NTb7m\nB7hJN3Ize0/gl9qHf97dX21mHw98BvB7gGcAP3iDr3sW8KXAi4BnAf8FeDPwd9z9+8/gW4Vl8aXe\nBBR54ngm8OXnPQgRuUOrsH3hGx8obK/Mf3CxtbUv+0nPRWRvYTtZxixD6nHSsm67Vto07lZlrqXt\n0+2M48i43TKOI9vtEUfbLV5LVLpzIqccjdRSxnLGUofljtz3dEPHZuiZykSZRpgmainRCb1VnJdx\nt+/lsWE7Phvfxcl117tp4jGlvLbp6rGfeE0l/iobWPV23flNhVXkNVaty1abeQPHk7izBO15XXb7\nkt12Y/ORjn+t775eRC4uM3sv4PuA92b3Ij4APgz4MDN7MfDf+3prheDc/K279TlmZq8G/uytzjez\nPwz8U+CB1XkPAn8C+Bgz+/JTfUMiIiIid0hhe2UdtlnCdoTopYBsGazDrGv7arfp3rV1J6cFxtam\nvLpTp8LhI4/wyCOP8Ogjj1BqxVLsnT1PMc85RdjOmZR6LHdYHsj9QL/ZMOxtsHELaaTaFlJZpna7\n1xZSl/7jwDwFO63C766TeNzWdn8XsmnrsqOinajFoqLNKueyW7d9sr68vlbcnava6x3Ao/y+VLCX\nvbt325vFzAI79lxO/JnEmnUeu+ZbRC6Sf0hUs/8u8I+B3wQ+EPgi4HcAf4roDvQFN/ja07yb9pfb\n870B+HvAzwFPBd5reRKzdyeC9jViqtkrT4zli4mylja5FRERkTOnsL3Sde3H0Srb88xraz3ISoWc\nelLuyWkgdQPW7ZP6Pazbox5NFArbMkVFu8S072mMbcCmMfbWrtWxZFhKYDCVhE0TloyhdpAyud+j\n39tj2N9js7/PZn+PPG6Xo0zHw3Z07F6tll5mhZ9YdN7C9Ry2I2hX8LjvXjCfm5A5kHfrpa11Aa/z\nll5zZ/HV05/kLIF71y7Nl3XZbRrB8bXxy1rsk1PfbZWvVd0WucAMeB7wKe7+j1af/zEz+w7gh4hp\n359rZt/s7j99F9f4AOBb3P2ltzjna9hVtP/MLcbyvLu4voiIiMgtKWyvHA/bQGqV6rmQWyHlga7b\nkLsNXbdHHvbJwwG536fYIWM9xLaF6jWmfY8j43bcBe1SqO5ExdnxAkwROh3Y7G0gdeRhj2H/KsP+\nPpuDffYODhjHLakdtYVtr3OztLk67W1KexxzM7FdEt59M0bFvMTXeoljXntdHUs5Ppyn05e476sG\nbVEFh5t0SGvT8NssgVXlPx6z5TjZMs1bt3c79tldLT0eV2Vb5IJy4J+eCLfxgPsjZvZZwBuJJhh/\nAfjcO3x+A34D+JybnmD2bsCL72AsIiIiImdKYXuly+3HkXbHkiNbATj3Eba7fp9+OKDbHNBvDuiG\nA46KcX1bwI7a3tsl1mgfHTGOW0qZK
LUsQdW8hUyLEFtxSnVImW7YY9i7wubgCpuDA/auHJDHMSrb\n221rklZXgbsule5SorM5U6HWFobna7aqts1VbZ8icLfmZ0v1O7X12PM0cGsB1wxPsf2Y10r1CMRL\no7aTBefl4111e95lbFmvbce7q7UV6Mt1jz/dPBFeUVvkgvuWmz3g7m8ys38H/C7gI+/iuecA/cgt\nznkhkNu5px2LiIiIyJlR2F6Z12z7qum4z6XdFIm4Hzb0wz7DcIV+c0A/7NMNB3TDPv31Lak9R62F\nWqNzeFmq2ZAsUalLM7Go9rbAmRK56+mHPTb7V9g7uMZm/4Bh/4B+bx/LI5YHUjdSpmkXtJewHZXm\nUgrTWLBcKGV9jkd5uk0bdwpWWQL/Lm2n5f5SczaPT0ME81TbDHE71hH9mGXv7uUC7TnnT7U12ra0\nlDtht+XX+hlO1z9JRM7Zm27z+L8mAu7vMLPO3ac7fP6fvM3jH3AXYxERERE5MwrbN2LrI2Ep01nC\nLbPZ7LPZu8Jmc5V+2Cd1m1i73bYNi7AbFezqEaotxxZezgBmMcWbaJ5GSvSbgWFvj2Gz4crVa1y5\n9gBXrj2Fg6sPsNls6Ie4RibGYCmT8zyN3FdBOzqK11LJfSFPE2VqYb/EGnIvE14L1SeijXqCmmLN\n9jxfnNTWctsSjedp3cePhFPxXY5ub054m1p+ozXV65bpS/vx9qEdC+K2vl1x9+WNChG5sH7tNo//\n53ZrwG8BHr7D5/+N2zz+W+9iLGegEjuQ3U5uh4hcTg+d9wBELpyHHnqIhx66/Wtju93eh9GcDYXt\ntWO9xJZNn0k5Y7knpZ7N3gH7B1fY279K3+/j1vbaJsfa4yVsT23PbUg5kemw1m28lEqplakWLCX6\nYcPe/j77B1e4cu0aV689wNVrT+HK1Qfo+o6+60i5j4XkKZNyt2z9NQftuSOaA7UU8hRhe5oKZZyY\npokyTRRLeGvSFtPOE3jC5tsWtOcNzOb9uXcBexW8DZwU4drmZmnzmmpfFbbbc62r1KuQjR3vPr50\nJ7f5+hz72l1PNoVtkQvsdi/Qx9vhsNzB89/rsZxwp+8biIiIyCtf+Upe8YpXnPcwzpTC9o3MDcGI\nfZ5T7sjdQO43bPb22du/wsGVa3T9HqVCqcZYoglYdafUeR/sVtlORraOlCHPa7nLhLfGaP0wsLd/\nwMHVq1y5+gBXrj7A1WsPcOXqtQjoqW2PVTOWKp7LMi18PmLccVtLJU/TErjHPMJ2izNS3aKBGpXq\n5VjYjqC9O2ih21ht9jWvM5+r2u3n5b67nXfsBua+4zwmcBu7UA3Y/D2eOOBExXtFUVvkQns3Ymuv\nm5nLv87tq9R347/cxVgeNzPjaU972m3PyzkvS5dE5PJ68MEHz3sIIhfGy172Mj72Yz/2tue96EUv\n4uGHnxhvbCtsr5S5NZclsEzKidTHGuphs08/7LO3v88wDHQ5kXCmGuujx3GCumXIzpW9jmybaPLd\nwu04FcaxME4FKwlKwrqOlBJXrl7h2rWrPPDUp3Dt2lUODvYY+p6uyy10tj2zDTwZeIqtxVp1utYT\nLcPMyOy2Ltt9OkXwdfBqsS1YzTiJWlK8sUCaW7G3SnfBPWG1xrRzq1jrfE6btr5U1ZetyLxNb2cJ\n2mZpHlobTCslLTPJTwbt1edOVLc1e1zkCeH53DrgPr/d/vxdrNc+jZ+6i7E8bs961rN4y1veclZP\nJyIicmk885nP5JnPfOZtzxuG4T6M5mwobK/UpWYbe2Bb15O7DcNmn739A/b2r9D3EYRzapXfMlK2\nR4xHR1CPGHLl6n7Ppk9ABktUN64fbrl+uMUPt9gUQTt7pes6rly9wgMPPMBTn/oUHrh2lYP9PYah\noztR7Z0b68ay6EqpTq01Oo7P/1vmbne7rbVaAzazHOHfjVpTHIwR3i2D5XYbY4/IPmFkqDUCt/ku\nb
Fvbr9trm0J+PGzPZW9r3ebmcM1yu5qUvgTrGwdv2FW35ynqInKhvQT47hs9YGbPA3438R+B77tH\n1389MdU8nXIsIiIiImdKYXulLiVg6FIm5Z6u3zBs9nZTx3NPTh05JcpUqdOWaXud7eGjWBkZOufq\nQUfxHrMeSz2VRH7XdbDEWKIp2rwfTT8MXLlyhQceuMZveepTuHr1Kgf7+1HZzmlZt7wLopAs9pie\n136XWqnEtPLqvgvZKaawz5Vls9KCtpGLUUrCyVRPeI2Kdk2JZAlsiuZwZKA1U6uV1DYdj4Dcwva8\nL9oqbLNUtuegPbcyX3cor0sn8yjc2+42GWkVwCFC+Dxl/tjW4SJy0RjwsWb2ie7+ncceMLsCvLJ9\nWFf3z5S7/ycz+x7g428zlm/gsZsWioiIiDxuCtsrlmL9XM49XT8s08eHYY++39B3PUbCqzOVkXEc\nGY8OGY8eZTx8hForGWfoDCxjqYPc45452o7kwxwh0lNrlpbYbPbY399n/2CfK1cO2NvfYxgiaKc0\nV3Zj+ndaBVDHSamSaiXPVW6vFK9YtTgKlKXpmEdluxq1GF5STA8fIww7CbMcR4rbqMxPQG7TyGs8\nD3UJ2mYVKNHNnLbXd50DN7v1355O/FPWd83VqEsjtfkNhfX0+WVdd3vMUdIWueAceDPwD8zsw4Dv\nBN4JfCDwRcB/28753939397k68/CFwB/FLh2g7H8HuCLgfdpYz2zqeQiIiIioLB9TN9v4nazx2bv\ngGH/CpvNPn2/if2xi1PLljIWylQYt1sOrz/K4eGjbA8fhdah21pVGS9Qy9IhvJToCu4YfU70fc9m\ns2HYDBGwu46cMymlZYFyskRK8bl0rGu34ymT29TxUgvFo8N5adt8za16LRvJnUSGIfYLN9oa7ZxI\nY2oV8LZe21qTtBa4zcoubKdYs22UJWjb/M9i97Z92NKrbVeVXv3T2dePGHFd3z06N1Vbfaq9wTB/\naLv16SJyUX0S8P3AXwT+0onHnAi9X3CTrz2TKrO7/7KZfSzwPUTg/ksnxuLAK9r1FLZFRETkTCls\nr8xhexj22NuLNdrDZj/CrmVqqYxHI9ujI7ZHW7ZHh+24zrg9JOXYk9tyalOeC/g8dbtEIC4T1sLz\nEraHgb4f6LqeLueoaMOuk3lO5NztppIzV3jbHXOmUmIrsZpINjEdWxMd2T8ZsRG2tzXZlkljTNmO\nInNMK3fmNdtTVOiZYn12inXb5qVNCy9t6XXbesxrC/K7yrNhmM/j3QXxec7mLnjvpocvH8152nbP\ntUTsuQO6iFw0sXFBBN3nAl8I/EngPYER+Angle7+mts9x1089tiT3d9gZr8L+J+BjwKeSXQ/fxPw\nde7+fWb28jt9XhEREZHbUdhe6YcI25tNhO2Dg6sMmz1qiY7ftRS22y2Hj17n+qOPsj28zjgeMY1H\nlGlL7ju6viNZFwG2Ziq7sF3LxFQKnaVd2N7bMAwb+r6n76OybWmecm3L3txd1y1BG1rGTlGRNjNS\nmUhlwspqe622sDkBbtZmcsc67GSZlHIL2lFJrx7rt6snIKaPR6DOkFrIrhXzCeYtvZzjQbu2cbvv\nxjs3Rt/1TFvdsVanXq3Hnv9v+T52Td7mh5YvF5ELw91fQVSK549/E/iydpz2OV54i8d+mfgP052O\n663A/3CLx4+NW0REROQsKGyvbDZ7AHTdQEptbfY4Lcc4jWwPj9geHVKmkVomvBa8FmotpJqotZK8\n4qVSLR6bHLwUvDrmMTW8y5m+Hxj6ga5tARZ9xRyrTqkOpbTO4FHRtSVlRvfulNq67mSMZWJsYX6a\nxmXMpRRww2tMH3ePSdgpQddZVLEtY6mPPcMLTDFjPLYGw2LHL0qb3l3i+nNDNNqYKvjcsbzOFfCY\nVm4kUnuuXSXbjpWQ5v25lz3Dl0Ru7Wdw3GpHMxERERERkQtHYXt
ls9kHIHcDyTJeK9M4sj3acnR0\nxPboiGkcmbZbpnFs1eqC13riiPRZKVSm6Bbezouga3S5Y+gHhmFD1/XRvKwFW6rvwnpp/b5bBdna\nlO1d2I5ma9MStifGaaLMYXuKnW9sXgfdpnSnDJ1FI7eUnNzBWGAcwacoRddqEbqZp4CXVk2umMeb\nABHkd2OOoF3a4cuV5+B+/GDZmsy9rcJu88h9fQ4VmPfp3pWzTZVtERERERG5oBS2V+awbSmTLCrb\ntU5sj444fPRRrl+/HqG5tGp2mfA6UesNArc7lUJxo9RELbWFZciWVmF7oMurynZ13CrVKqnEuu+5\n4k1ta6NrbJWVUibnCNxjmZim0m5bNX4ble1EbOe17NdtRk5EVTwncu3IbtgWMKfi1MmXKeB1DtaL\n1oGc1Krl4GUO22UVtmusFXdbmp6xrCCf9w6PyvVuCrnHdPS56/hyvu9q3cuMcqVtERERERG5mBS2\nV1LaLQWsrUIdXcePGMct0/aoBemYQu21LPeP9dZpHbmjCO3Hp493HX3X0fc9w9Az9AM5d4BF0K61\nrYWesOrkVCmlkHNue13HVOsErbIdDdWmsgvapZS2B3ih1tqG5C1wR9O1XX6ukJzkLE3Bqzu1Em8S\neEwn303ZjoZsNn+/qeJlohJbiVEttvwqjnnd9U5j3tgrxRT0OXxbwmzZ4XzVAM12neDMwSq1vWGw\nfF5EREREROSCUtheqbW0W6fWSvUI29O0xWt05MZqbOnlFafg1AifLQDavHWW7aZuG5BzYug6NgNs\nhg2bIdZr932/VLVjfbXjyfGUsFRJbbp4ntuPt+p2gti6qzVIm6bC1NZse/Ul+y8dvD324W6lajDH\nzYldrqOKPE0T2zGOcSytaZnhSz+i9v21ZmXz/tlewVpFfmlu1t5kiNv4WcbPou5+LtammC8V7l3Y\n9tZ53VfVeDt2zOFdRERERETk4lHYXillardRTS6ltIZjW2qdWDqBMQftaBY2z3RegmBKRAU3RYMw\nM3LK9B1sPLG3GdgMcfRdT06xHnmaCp6cmjwCaHu+1I7I9fO6bdq+2xE5Y5wRuCGu16XcwnGrsHtE\n6wjFJRqSWdsNDBinMZrAbSfGqbY3DXLbd9vijYa2Pzetq7mRYgq5FZyp/WxqbHtWLar61aFNKZ9r\n2/HexC48A6u12sfvz9XsJaCf/DoREREREZELRmF7pZSy3I7jGFOy27TsWieshWyWoB2V7sjF0RXc\nUtpVt2sLyxg5Jfo+4ea7yvYQlW0sGoRNJaZ9V0tUa+uc51nT83Zabd23+W5itsFqvXYhp8zQ96Qu\nkTrDo7V4C9xTW0s+Rehuod6NaK42TmzHkbHEmu7odp5bpd7b95vb0UXgThVLBUsT1XZvRuCtyl6B\nEmvBY/V1W7Xdfm5pCdvHO5TDvEbbVv3UorqusC0iIiIiIheZwvbK4eEhENPJa43KdvXSmpRFuDaL\nSnZKhqdEIjOvP87dhtxt6LoNlRzrlmPnLTpzqsWU7a41NYsGZHU3XducakY1KC1sp6W6HdVpqkOp\nu82oWzqdpmmpbCcrbV9wp5vKqv+3U7xQfKS00E1q3caSxZsLtbaZ6h5ryFvAT8nIOWGpi+7lVkhW\nY0/xusxMx8m4dbjNtxOUEWcieXRjt7b92O6NhPnb2UXtqMKv9t1u5/ry80ht+rmIiIiIiMjFo7C9\ncnh0Pe64416PHfP08eh0NofERMqZlLrWGXyzBO5aEzZVfHIqNcJ2TNwm50xu07/jWtHArDpL2K4Y\nbgnSrqUYddf1e14PDcR+4G3Lr2maSJaopYXtXEhm5BTBvfjEVCemuqV6gZSwZJASUyktbHvbaoy4\ncq10XSblhFlHSpCS06VKomBdhO3kRiVRLeOkCN6MEbyZMC8kjy7lsXbbI+u3Bebe3kBYT3mftwKL\nR+adwqLyryZpIiIiIiJyUSl
srxwdHS73o/I6dxdfBW5j+XxKidz1dP1A1w2kvCHnDSlvKNXwbaHa\nRPV5nXSUf7suOogbc2D2pWO3W6JQo7JtEGnbSWbUudlYqRG451BanXEaW3V7wswouVKmSpc7upzI\nOdHlRKkTYx0ZS4RtSwnyvFe3U9q2ZV7nrbfijYeUOvCMWTR0y9nJyclWMTeqR3yOKfARtiuZyhan\no/oWq1NsCZYieEft3iPQt322ff1GR921b4s/FD+2blthW0RERERELiqF7RWvc6iD3VZec/hrU6rb\nOmNSInUd/bCh3+wzDPukPJBSC9vFIU24jVQfY+9sKsULqT2/1xod0C2qyzmlmDpOhNe5+fi83Ze3\nanUt817etC7jcZTqTDWmmNcKxSpdLvRdpvcOyJRao5laKRQvWHKsVkiJMrVdzIj151SWKeVxtG5q\nc5szA0uxPZnlulpP3sbthlmmWoelDqsTqU6YTyQvJJxkEcvdK7UdXmPtuntpHeLbn8FqezWf9xQT\nERERERG5gBS2V7quB1qQax21vU2pjvXL0Rxs3t/aug3dZp/NwVX29q9E0E4DKQ9Mo+PpiMoRxROV\niVLHtqd0BO1SJso0kvuenHty32Ot4XnradbCNEDBS40GavPWXuw6erul3c5g7ZwJY5wmau3xtka6\neEwVLzXCObW2TmWV6tFlPKdo8FaZA7fvptbXipe2TVdKS6BOqV8CNzUq4+YWITv1UMsqaE9kKsmc\nbJDNW9iOfctrLdQyUetEKdMSur2WFsZdYVtERERERC40he2VLkfYrrWsKrqV2gJnqU5Osb81KZP6\ngW4vwvbB1QcwG7A0kNLAOBaqZYonSjWKQyrRtRv31oBtpJSO3HexD/fQxzbeU2WiRhW7hWd3qKXs\n1mvju87cAGax3hso7ngtUOfNsxxLtP28K6V4HK0J2VwvntdDp5QwT5T2wHJGrdRSqMnwmqFaa7CW\nI1S3hm3ewna0He+xHPuSJy8kn0gUMpWcoGuH17IE7lomahnj5zO12zJSy4SVQqHgpbQ3RURERERE\nRC4ehe2VvhsAKC0Ix/TlBOZ4tWgCRgRt63pyv6HfHERl++oDmPUtcPeko4lSjbHANFVyqaRclm2s\n3CulRqiEgZyMvu/w4kxu5DYNnFalLiXuz1XdeZ9qVuuX513AS419wusUwd7MSDm6ibtXpvactfrS\nDM1xUk7kbKSUwPLSkdwsOoh7nSvblZpSq7wbRo711Jn2RkLFWtCf12QbTvJCJpqkZXP6bHQZ+mRt\nqn6E8lpGpmlLmY7a7ZZpyhG8pxGfrM04UNgWEREREZGLSWF75crBA0AL23WMad51ik7fpTDViW4Y\n6IaBfrNhc3CFzcE1+v2rdJsrmHXEj7TDaiL1E90w0Y2FvlSmqdC1BmZmUbEutUS1uBa8TC28xlZb\nqSbKvLa71CVgx17esf2VpbaG2gyfK9u0SrxHQB/LRBrnjappFWRiu7G51TcObkvDNSyul1Pb2CxK\n7pTJW3OzEp3IPWNW2/rrpWd724W87QNumYRDasGcGluZdQlrR1o1S/Na6MuWUrbUJWzvbqftlmmM\n4C0iIiIiInIRKWyvXLnyFIBYK9zWC091ZJzDdpnILWh3S9h+gGHvGv3eVdxTbHnlieRGHgrdVOin\naErWTRPdOMb05zn0trXJXqYI3J6A3d7atux5Xdue29FZfF3VnruYt77ey1Ty0qbBpzLBGKHdkrXp\n4nOjs3n5d2t+VqFajUZn8z7fmVVVvVBqIlGYKOCZnAxiufdSx64YZdknfO6rlpgv7cmg76DPWN/F\n2u3kpAR4wcuI121MJ29Bu0xHTNsjxnZM4/b+/gURERERERE5JYXtlbmyvYTtdmyniW2ZGKeRvNnQ\n7+3R7W3Y7Edle9i/Srd3JZqazdPNq5H7QjcUuha0+3Fk6ruYEj6H7XVluxYAjEyyRLJWd65OK
QXL\nGUsWzdlSop0c07nbVmGVqGpXorJda4Vpiup4LtHcLWdyew6bK9stfEdvuHmLs9S2KDOqtyr8VPFk\nGAUo4B2eE3TxfHPvttrWkPv8pkFb6x6VecNzhr7Hho409ORsdO1IFKgTXkeoI2U6Wqrc49Eh26Pr\nHB1eZ9oe3c+/HiIiIiIiIqemsL2SW4O0lDKZLtZVe6Grhb4WplrIw4Zub0O32aPf22fYu0K/2Sd3\nA1SnFmK9cq6kLpqf9dNAGUbqOFCmkWmcWlfxts1V60o+bY/wPGBmdJbxbJScmHIiW2oBvO0zDUv3\n8dq25por4vP67Wh+1jqTe6VWI+dMro5noopszJO9I0SbY9VIyemykTPknJmI6eWVglcoU9uaq0x0\nXaZ6xr2jeGwrNpboeG7Wtktzo0sezdU8YSQyidz+P8Ye4T4arqWYou4dljOp9HjfY6mLgVsiZ/31\nFRERERGRi0lpZcU9qrw2d+Q2pzcoOBucgpOHgbzZkIcNebNHt9mj6wZS7vC2XzRO7JudE13u8L6n\n9gN1E120R9syTlvGsbQtwEam8YhtNlJvpC7TdbE2u5ZMKZXSzVuO7aaO19bALCrYc+M0W7YDm7cL\ndyqlVZzdHU/RFC2lZVX1sn/4vA68i5RNSom+y1ALXqJpW4zZYtxmlL6j1DjcK1OJde7FW6W+zVgv\nOdMRU82pKfb3LhWfCpXU1nonugTZjEQip/grmlKCmon56oAbyfL9/OshIiIiIiJyagrbK3PYTim6\ncucukbLFftIptrmyfiAPG9IwIwQBZgAAIABJREFUkPoB6was60mpo3qJtc41uo6nlOi6DLWjDgNe\nS3usrX2eRqYS1eFp3GIGvWVS7umSkS1TS0ctldp1bT9smwcblea5yVqtwByYl8nhQHTttta9O9UI\n26k6lvKuqzm2BPm0VM69dUnPeE2UCZI5k5elMl/d6WtPaYfjTLXEmLziNm8IHuvIK7EnONWhOJ4q\n1eo8KZ3eoLrRpUSX2h7eObU6eN+mz8d4c1LYFhERERGRi0lhe2XeSiqRMMvknOn6jHWZ1HVx2w9Y\nH0GbroeU8Xak6lRrnbyTkXKCrotgXArUgtVoVFamkXEbnce9TNTJmMzJ3YANlS4BOVFKopRM6R1f\nqtNtL+3WAM3bQmnDyClR2nTzZLGymnnbLl82zm7ryuep5zCH7WSJmtr2Xx5TwHOOinpKYObgsc58\nnCZKra27eaW2HblLm35fl328PdaTO9EkrTV6o5bYxgtwcpv0DjWnaNpmsV4cIyruVsnueBfV9bR6\nS0FEREREROQiUdheGVvDrVqjo7h7pnpHpieb02Wg9dqGtve012i17dGpzDy2wZrDorfQm1Km6zq8\nL/RTzzQNlGGCtr91MrBaY1utdiQqfQLvowJdKpS2PzY2dxGPEE6XIxgnI3nsaU2rpHtK0U3cK20j\nL5Y55msObr50QC+lME0T2+3INE2UUpdO6hik1LYOm6e114jbtTVmK7772Nn9bHKbNB7V6kIlU8iY\nt6niOdaP1wwlQTYnWyWbwzRSxy1lHKna+kvkQjKzlwCvIv4j89vc/VfOeUj3zUMPPcRznvOc8x7G\nffPggw/y5je/+byHISIiciEpbK9sx0MAskfYrp6odPTmWDacDF4jVLd6bCTUGm3A5xBOm/ENJDNI\nsV2X5w66Su1jCnmd+vjaZavrCh6BO3khU+kz0UwsJcbiTFNlLDVicsvKrRk5KRm1xp7V1AkvXYTt\nWiNwe43q+Px1fjxsO8QbCEZ0H29hexxHpql1THdv14uu4jELIL46Ktzeppe3Tutz1dsr5pWMUyzi\ndaWjrsI2NeE1U4tRExSDbNCtw3YZ8WmLT2PsSy4icoHUWnnrW9963sMQERGRC0Bhe2U7RmU7V8Nr\nhG2nw7KRSwLvYNnJOgJ3BNY5/FasBfEEuzXW1tYXZ8d6p
5ZCmQZKmWIaeAumXj3WdM+B2wpdiiZl\nHYk0VszKY4I2gKfUquweYbvE3t2UaXnMPS3rvKOD+foZ5ueLaeZzZXscJ8xSdEyv80Rx2h7fCUtt\nG7O2Nnxey13mddu1TSmvheQRrYsVCl0L3JlKBs94StSaKGYRtIml8tkqHRG4rRYoYxxtqzQRkYvl\n2ec9gPvgIdq7zCIiInITCtsr26NHgKgQT52RR6OfYjur4oVKJZdCVyrZwboKKeEWa7YjaLbAWdox\nf87nWGtYznR9z8Y35GStIhwV6M3eHptNz9DnaK5mGbcOrKPLhTxOpJQYcyFNlZQKU4kp7O4J2nRt\n7ztq6eP+PNXcve2X7dR5SngbE7B0MYfopJ5zjq24YkF4a/rWtulKleTpMVuOWWrz2y2q+tWif3g1\n6IifWyre3jTIVE+UmiJotzcWzIyyvGnhMY2cqGzvpsdPEbxFRC6UBLzlvAdxHzwHUAVfRETkVhS2\nV44OI2ybQcoRurttZiwjY4ntuvq9A7q9Ql8quZ8g91juIHXRbbutqV7CdikRbEs0MqseW4vlvseS\n0Q89USWPannfb+g3ewybgdx1WOohx223jT2tcy5sx4kuF7ajkaxE2G5T2s0zXnu8FtLcvLx1MV+6\niLc3BdZ9y213l2QRtufAba0dmbcu675c73hlvFbIsQkaBcfN8GpUS+TkEbhrwSbi88WY0rzlWNp1\nSF9N15+Ddm4/p+Rz9V9VFRERERERuZgUtleODt8Vd6x13TbIXWKYtmynLdtxyzAVhlIZqtNtKqkr\nWDeQslOqU9wplQjYtVKLL0Gbeb1zSnRpwPouqrfJo0GaQdd15K6n6wZy15Nyj+UhtgPLEbRTmqLy\nvB3jjQFj15ncLarZtcN9iK7iy3T2mE5eS21dxH2J2sZqDXfrQh77jbcO5URl21vQXnK5+7Gg7waV\n2trIReM4t0T0VYumb6mW2GPbYgK+m1NTbtPS2z7ac0O32iraxHrv1O7PVW8REREREZGLSGF7Zdxe\nB9q6ZWKP6JQTUy1MZWKcRjbVKR49yfvqpL6Seyf10Sm81LafdGn7YM9Bu9aluVq2Nk07Zbpssad3\narc5k1MmpUzOPbkbSN1A7oaoMucpupcnWwJ6MtrWXqltiTU3b4s9wy0lYt+utFtXXcoSmq1917Tt\nxPDd18+V79QC8xzIj39dbdcHzKmk2F/b5nXhrVt7jW3D4vzYWXveNsxSbLc2h+1aWkO2WkjzdPI5\nbP//7L17sGX5Vd/3Wev32/vc7hGY0gPPiDGUsXkZkwgb8X6YR4hAQRhwbCuFLYQAVUIRx0DiOOY1\ngnLFYBtiCsdCNiAoKCoYAYoNGMcBIRwcJB7BicDmIYIk2tHwEGhm+p69f7+18sf67XPPNH17umf6\nNer16frVOffeffbe53SfPve711rfr4yLEym2k+SOICLvAfz3wGcC7wO8E/hF4JXu/k+vcx8F+Dzg\ns4HnAc8a+3kT8BrgH7n7/jr28znAi4EPB54DXAZ+BfhfgW92998/53HfDrwE+A13f18RuR/468AL\ngfcGngH8OXf/yet5PkmSJEmSJFeSYvsIYbhbb/nQbhhKXxeWUYVFtjlqpZtQuo9FiHCPiC4bedpm\nUdkOpRqC27e2bAkZKSJoEWpVaomIsFoqtU6UoxVZ17GKCrWcrUN7eLezPGyiFT6+EQZkcU4dG6Zp\nIZr98cL5WHAfjNhk0+MHkSuRFRbbm+AGaMykRxSajlbviEkzU7zHdQfr0W7fh3O5IMgQ0QfTuWGs\nZm7jaxsXF84c35Mkub2IyAcB/xvwAGcOizvgk4BPHiL2mgJVRP4E8Frgg472AfBM4GOBjwP+KxF5\nobv/6jn7eDbwg8BHX7GPGfgw4PnAF4vIZ7r7zzzB+XwE8M/G8Tfyal6SJEmSJE+JFNtHCJvh1iY4\nDXOhDaHdzc7ENooNs
v7qjurn6suW74dvbvBc/rOx/P4D\nAAAAa3GDNAAAAFiZsA0AAAArE7YBAABgZcI2AAAArEzYBgAAgJUJ2wAAALAyYRsAAABWJmwDAADA\nyoRtAAAAWJmwDQAAACsTtgEAAGBlwjYAAACsTNgGAACAlQnbAAAAsDJhGwAAAFYmbAMAAMDKhG0A\nAABYmbANAAAAKxO2AQAAYGXCNgAAAKxM2AYAAICVCdsAAACwMmEbAAAAViZsAwAAwMqEbQAAAFjZ\n31kzA73t5PcYAAAAAElFTkSuQmCC\n", + "text/plain": [ + "" + ] + }, + "metadata": { + "image/png": { + "height": 445, + "width": 493 + } + }, + "output_type": "display_data" + } + ], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "%matplotlib inline\n", + "%config InlineBackend.figure_format = 'retina'\n", + "\n", + "import tensorflow as tf\n", + "import pickle\n", + "import helper\n", + "import random\n", + "\n", + "# Set batch size if not already set\n", + "try:\n", + " if batch_size:\n", + " pass\n", + "except NameError:\n", + " batch_size = 64\n", + "\n", + "save_model_path = './image_classification'\n", + "n_samples = 4\n", + "top_n_predictions = 3\n", + "\n", + "def test_model():\n", + " \"\"\"\n", + " Test the saved model against the test dataset\n", + " \"\"\"\n", + "\n", + " test_features, test_labels = pickle.load(open('preprocess_training.p', mode='rb'))\n", + " loaded_graph = tf.Graph()\n", + "\n", + " with tf.Session(graph=loaded_graph) as sess:\n", + " # Load model\n", + " loader = tf.train.import_meta_graph(save_model_path + '.meta')\n", + " loader.restore(sess, save_model_path)\n", + "\n", + " # Get Tensors from loaded model\n", + " loaded_x = loaded_graph.get_tensor_by_name('x:0')\n", + " loaded_y = loaded_graph.get_tensor_by_name('y:0')\n", + " loaded_keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')\n", + " loaded_logits = loaded_graph.get_tensor_by_name('logits:0')\n", + " loaded_acc = loaded_graph.get_tensor_by_name('accuracy:0')\n", + " \n", + " # Get accuracy in batches for memory limitations\n", + " test_batch_acc_total = 0\n", + " test_batch_count = 0\n", + " \n", + " for train_feature_batch, train_label_batch in 
helper.batch_features_labels(test_features, test_labels, batch_size):\n", + " test_batch_acc_total += sess.run(\n", + " loaded_acc,\n", + " feed_dict={loaded_x: train_feature_batch, loaded_y: train_label_batch, loaded_keep_prob: 1.0})\n", + " test_batch_count += 1\n", + "\n", + " print('Testing Accuracy: {}\\n'.format(test_batch_acc_total/test_batch_count))\n", + "\n", + " # Print Random Samples\n", + " random_test_features, random_test_labels = tuple(zip(*random.sample(list(zip(test_features, test_labels)), n_samples)))\n", + " random_test_predictions = sess.run(\n", + " tf.nn.top_k(tf.nn.softmax(loaded_logits), top_n_predictions),\n", + " feed_dict={loaded_x: random_test_features, loaded_y: random_test_labels, loaded_keep_prob: 1.0})\n", + " helper.display_image_predictions(random_test_features, random_test_labels, random_test_predictions)\n", + "\n", + "\n", + "test_model()" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Why 50-70% Accuracy?\n", + "You might be wondering why you can't get an accuracy any higher. First things first, 50% isn't bad for a simple CNN. Pure guessing would get you 10% accuracy. However, you might notice people are getting scores [well above 70%](http://rodrigob.github.io/are_we_there_yet/build/classification_datasets_results.html#43494641522d3130). That's because we haven't taught you all there is to know about neural networks. We still need to cover a few more techniques.\n", + "## Submitting This Project\n", + "When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as \"dlnd_image_classification.ipynb\" and save it as an HTML file under \"File\" -> \"Download as\". Include the \"helper.py\" and \"problem_unittests.py\" files in your submission."
+ ] + } + ], + "metadata": { + "hide_input": false, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.1" + }, + "widgets": { + "state": {}, + "version": "1.1.2" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} diff --git a/image-classification/image-classification/helper.py b/image-classification/image-classification/helper.py new file mode 100644 index 0000000..fc971f9 --- /dev/null +++ b/image-classification/image-classification/helper.py @@ -0,0 +1,165 @@ +import pickle +import numpy as np +import matplotlib.pyplot as plt +from sklearn.preprocessing import LabelBinarizer + + +def _load_label_names(): + """ + Load the label names from file + """ + return ['airplane', 'automobile', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck'] + + +def load_cfar10_batch(cifar10_dataset_folder_path, batch_id): + """ + Load a batch of the dataset + """ + with open(cifar10_dataset_folder_path + '/data_batch_' + str(batch_id), mode='rb') as file: + batch = pickle.load(file, encoding='latin1') + + features = batch['data'].reshape((len(batch['data']), 3, 32, 32)).transpose(0, 2, 3, 1) + labels = batch['labels'] + + return features, labels + + +def display_stats(cifar10_dataset_folder_path, batch_id, sample_id): + """ + Display Stats of the dataset + """ + batch_ids = list(range(1, 6)) + + if batch_id not in batch_ids: + print('Batch Id out of Range. Possible Batch Ids: {}'.format(batch_ids)) + return None + + features, labels = load_cfar10_batch(cifar10_dataset_folder_path, batch_id) + + if not (0 <= sample_id < len(features)): + print('{} samples in batch {}. 
{} is out of range.'.format(len(features), batch_id, sample_id)) + return None + + print('\nStats of batch {}:'.format(batch_id)) + print('Samples: {}'.format(len(features))) + print('Label Counts: {}'.format(dict(zip(*np.unique(labels, return_counts=True))))) + print('First 20 Labels: {}'.format(labels[:20])) + + sample_image = features[sample_id] + sample_label = labels[sample_id] + label_names = _load_label_names() + + print('\nExample of Image {}:'.format(sample_id)) + print('Image - Min Value: {} Max Value: {}'.format(sample_image.min(), sample_image.max())) + print('Image - Shape: {}'.format(sample_image.shape)) + print('Label - Label Id: {} Name: {}'.format(sample_label, label_names[sample_label])) + plt.axis('off') + plt.imshow(sample_image) + + +def _preprocess_and_save(normalize, one_hot_encode, features, labels, filename): + """ + Preprocess data and save it to file + """ + features = normalize(features) + labels = one_hot_encode(labels) + + pickle.dump((features, labels), open(filename, 'wb')) + + +def preprocess_and_save_data(cifar10_dataset_folder_path, normalize, one_hot_encode): + """ + Preprocess Training and Validation Data + """ + n_batches = 5 + valid_features = [] + valid_labels = [] + + for batch_i in range(1, n_batches + 1): + features, labels = load_cfar10_batch(cifar10_dataset_folder_path, batch_i) + validation_count = int(len(features) * 0.1) + + # Preprocess and save a batch of training data + _preprocess_and_save( + normalize, + one_hot_encode, + features[:-validation_count], + labels[:-validation_count], + 'preprocess_batch_' + str(batch_i) + '.p') + + # Use a portion of training batch for validation + valid_features.extend(features[-validation_count:]) + valid_labels.extend(labels[-validation_count:]) + + # Preprocess and Save all validation data + _preprocess_and_save( + normalize, + one_hot_encode, + np.array(valid_features), + np.array(valid_labels), + 'preprocess_validation.p') + + with open(cifar10_dataset_folder_path + 
'/test_batch', mode='rb') as file: + batch = pickle.load(file, encoding='latin1') + + # load the test data + test_features = batch['data'].reshape((len(batch['data']), 3, 32, 32)).transpose(0, 2, 3, 1) + test_labels = batch['labels'] + + # Preprocess and Save all test data + _preprocess_and_save( + normalize, + one_hot_encode, + np.array(test_features), + np.array(test_labels), + 'preprocess_training.p') + + +def batch_features_labels(features, labels, batch_size): + """ + Split features and labels into batches + """ + for start in range(0, len(features), batch_size): + end = min(start + batch_size, len(features)) + yield features[start:end], labels[start:end] + + +def load_preprocess_training_batch(batch_id, batch_size): + """ + Load the Preprocessed Training data and return them in batches of batch_size or less + """ + filename = 'preprocess_batch_' + str(batch_id) + '.p' + features, labels = pickle.load(open(filename, mode='rb')) + + # Return the training data in batches of size batch_size or less + return batch_features_labels(features, labels, batch_size) + + +def display_image_predictions(features, labels, predictions): + n_classes = 10 + label_names = _load_label_names() + label_binarizer = LabelBinarizer() + label_binarizer.fit(range(n_classes)) + label_ids = label_binarizer.inverse_transform(np.array(labels)) + + fig, axies = plt.subplots(nrows=4, ncols=2) + fig.tight_layout() + fig.suptitle('Softmax Predictions', fontsize=20, y=1.1) + + n_predictions = 3 + margin = 0.05 + ind = np.arange(n_predictions) + width = (1. - 2. 
* margin) / n_predictions + + for image_i, (feature, label_id, pred_indices, pred_values) in enumerate(zip(features, label_ids, predictions.indices, predictions.values)): + pred_names = [label_names[pred_i] for pred_i in pred_indices] + correct_name = label_names[label_id] + + axies[image_i][0].imshow(feature) + axies[image_i][0].set_title(correct_name) + axies[image_i][0].set_axis_off() + + axies[image_i][1].barh(ind + margin, pred_values[::-1], width) + axies[image_i][1].set_yticks(ind + margin) + axies[image_i][1].set_yticklabels(pred_names[::-1]) + axies[image_i][1].set_xticks([0, 0.5, 1.0]) diff --git a/image-classification/image-classification/image_classification.data-00000-of-00001 b/image-classification/image-classification/image_classification.data-00000-of-00001 new file mode 100644 index 0000000..a1ed498 --- /dev/null +++ b/image-classification/image-classification/image_classification.data-00000-of-00001 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:e64b4233415742ee2ef69338ffbc2d61210fa3092abd42369a5fd55e41748e46 +size 23940328 diff --git a/image-classification/image-classification/image_classification.index b/image-classification/image-classification/image_classification.index new file mode 100644 index 0000000..e30330b Binary files /dev/null and b/image-classification/image-classification/image_classification.index differ diff --git a/image-classification/image-classification/image_classification.meta b/image-classification/image-classification/image_classification.meta new file mode 100644 index 0000000..7d6d354 Binary files /dev/null and b/image-classification/image-classification/image_classification.meta differ diff --git a/image-classification/image-classification/preprocess_batch_1.p b/image-classification/image-classification/preprocess_batch_1.p new file mode 100644 index 0000000..cd1c47a --- /dev/null +++ b/image-classification/image-classification/preprocess_batch_1.p @@ -0,0 +1,3 @@ +version 
https://git-lfs.github.com/spec/v1 +oid sha256:56ee02f2c8292680e92c5a415a29d4b5dd53bf02608edcfe5a83f8017b8a6a5c +size 221904211 diff --git a/image-classification/image-classification/preprocess_batch_2.p b/image-classification/image-classification/preprocess_batch_2.p new file mode 100644 index 0000000..7cf7861 --- /dev/null +++ b/image-classification/image-classification/preprocess_batch_2.p @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:35bf0c70458928128eb21122c36a32a15bca438842b9062363ad7cef5b5ab8be +size 221904211 diff --git a/image-classification/image-classification/preprocess_batch_3.p b/image-classification/image-classification/preprocess_batch_3.p new file mode 100644 index 0000000..935c62d --- /dev/null +++ b/image-classification/image-classification/preprocess_batch_3.p @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:bf3b607a39cf3a68951b513ce602869ce38bcb86d369042fac95adf0cdbcdb0d +size 221904211 diff --git a/image-classification/image-classification/preprocess_batch_4.p b/image-classification/image-classification/preprocess_batch_4.p new file mode 100644 index 0000000..36a3245 --- /dev/null +++ b/image-classification/image-classification/preprocess_batch_4.p @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:85216cbda3a030e02d55de2cee83124019ff9d1c95d239c166dff276278fc4d9 +size 221904211 diff --git a/image-classification/image-classification/preprocess_batch_5.p b/image-classification/image-classification/preprocess_batch_5.p new file mode 100644 index 0000000..2bb9863 --- /dev/null +++ b/image-classification/image-classification/preprocess_batch_5.p @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:855256a55dc3ea09e29e46f05070f09207ce5ad603ff137fb12fcc617efaed01 +size 221904211 diff --git a/image-classification/image-classification/preprocess_training.p b/image-classification/image-classification/preprocess_training.p new file mode 100644 index 0000000..105ffa0 
--- /dev/null +++ b/image-classification/image-classification/preprocess_training.p @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:ec95efd0e1e788eabcaa51a460868cb63d55ed774a2cca4c5cacfb777360f9c0 +size 246560211 diff --git a/image-classification/image-classification/preprocess_validation.p b/image-classification/image-classification/preprocess_validation.p new file mode 100644 index 0000000..fcf6512 --- /dev/null +++ b/image-classification/image-classification/preprocess_validation.p @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:6db0027275dd6c8f3a8d012bc0e2c6422c638f8dc9db30ce3de8298f0ea10e95 +size 123280211 diff --git a/image-classification/image-classification/problem_unittests.py b/image-classification/image-classification/problem_unittests.py new file mode 100644 index 0000000..35e9f5b --- /dev/null +++ b/image-classification/image-classification/problem_unittests.py @@ -0,0 +1,199 @@ +import os +import numpy as np +import tensorflow as tf +import random +from unittest.mock import MagicMock + + +def _print_success_message(): + return print('Tests Passed') + + +def test_folder_path(cifar10_dataset_folder_path): + assert cifar10_dataset_folder_path is not None,\ + 'Cifar-10 data folder not set.' + assert cifar10_dataset_folder_path[-1] != '/',\ + 'The "/" shouldn\'t be added to the end of the path.' + assert os.path.exists(cifar10_dataset_folder_path),\ + 'Path not found.' 
+ assert os.path.isdir(cifar10_dataset_folder_path),\ + '{} is not a folder.'.format(os.path.basename(cifar10_dataset_folder_path)) + + train_files = [cifar10_dataset_folder_path + '/data_batch_' + str(batch_id) for batch_id in range(1, 6)] + other_files = [cifar10_dataset_folder_path + '/batches.meta', cifar10_dataset_folder_path + '/test_batch'] + missing_files = [path for path in train_files + other_files if not os.path.exists(path)] + + assert not missing_files,\ + 'Missing files in directory: {}'.format(missing_files) + + print('All files found!') + + +def test_normalize(normalize): + test_shape = (np.random.choice(range(1000)), 32, 32, 3) + test_numbers = np.random.choice(range(256), test_shape) + normalize_out = normalize(test_numbers) + + assert type(normalize_out).__module__ == np.__name__,\ + 'Not Numpy Object' + + assert normalize_out.shape == test_shape,\ + 'Incorrect Shape. {} shape found'.format(normalize_out.shape) + + assert normalize_out.max() <= 1 and normalize_out.min() >= 0,\ + 'Incorrect Range. {} to {} found'.format(normalize_out.min(), normalize_out.max()) + + _print_success_message() + + +def test_one_hot_encode(one_hot_encode): + test_shape = np.random.choice(range(1000)) + test_numbers = np.random.choice(range(10), test_shape) + one_hot_out = one_hot_encode(test_numbers) + + assert type(one_hot_out).__module__ == np.__name__,\ + 'Not Numpy Object' + + assert one_hot_out.shape == (test_shape, 10),\ + 'Incorrect Shape. 
{} shape found'.format(one_hot_out.shape) + + n_encode_tests = 5 + test_pairs = list(zip(test_numbers, one_hot_out)) + test_indices = np.random.choice(len(test_numbers), n_encode_tests) + labels = [test_pairs[test_i][0] for test_i in test_indices] + enc_labels = np.array([test_pairs[test_i][1] for test_i in test_indices]) + new_enc_labels = one_hot_encode(labels) + + assert np.array_equal(enc_labels, new_enc_labels),\ + 'Encodings returned different results for the same numbers.\n' \ + 'For the first call it returned:\n' \ + '{}\n' \ + 'For the second call it returned\n' \ + '{}\n' \ + 'Make sure you save the map of labels to encodings outside of the function.'.format(enc_labels, new_enc_labels) + + _print_success_message() + + +def test_nn_image_inputs(neural_net_image_input): + image_shape = (32, 32, 3) + nn_inputs_out_x = neural_net_image_input(image_shape) + + assert nn_inputs_out_x.get_shape().as_list() == [None, image_shape[0], image_shape[1], image_shape[2]],\ + 'Incorrect Image Shape. Found {} shape'.format(nn_inputs_out_x.get_shape().as_list()) + + assert nn_inputs_out_x.op.type == 'Placeholder',\ + 'Incorrect Image Type. Found {} type'.format(nn_inputs_out_x.op.type) + + assert nn_inputs_out_x.name == 'x:0', \ + 'Incorrect Name. Found {}'.format(nn_inputs_out_x.name) + + print('Image Input Tests Passed.') + + +def test_nn_label_inputs(neural_net_label_input): + n_classes = 10 + nn_inputs_out_y = neural_net_label_input(n_classes) + + assert nn_inputs_out_y.get_shape().as_list() == [None, n_classes],\ + 'Incorrect Label Shape. Found {} shape'.format(nn_inputs_out_y.get_shape().as_list()) + + assert nn_inputs_out_y.op.type == 'Placeholder',\ + 'Incorrect Label Type. Found {} type'.format(nn_inputs_out_y.op.type) + + assert nn_inputs_out_y.name == 'y:0', \ + 'Incorrect Name. 
Found {}'.format(nn_inputs_out_y.name) + + print('Label Input Tests Passed.') + + +def test_nn_keep_prob_inputs(neural_net_keep_prob_input): + nn_inputs_out_k = neural_net_keep_prob_input() + + assert nn_inputs_out_k.get_shape().ndims is None,\ + 'Too many dimensions found for keep prob. Found {} dimensions. It should be a scalar (0-Dimension Tensor).'.format(nn_inputs_out_k.get_shape().ndims) + + assert nn_inputs_out_k.op.type == 'Placeholder',\ + 'Incorrect keep prob Type. Found {} type'.format(nn_inputs_out_k.op.type) + + assert nn_inputs_out_k.name == 'keep_prob:0', \ + 'Incorrect Name. Found {}'.format(nn_inputs_out_k.name) + + print('Keep Prob Tests Passed.') + + +def test_con_pool(conv2d_maxpool): + test_x = tf.placeholder(tf.float32, [None, 32, 32, 5]) + test_num_outputs = 10 + test_con_k = (2, 2) + test_con_s = (4, 4) + test_pool_k = (2, 2) + test_pool_s = (2, 2) + + conv2d_maxpool_out = conv2d_maxpool(test_x, test_num_outputs, test_con_k, test_con_s, test_pool_k, test_pool_s) + + assert conv2d_maxpool_out.get_shape().as_list() == [None, 4, 4, 10],\ + 'Incorrect Shape. Found {} shape'.format(conv2d_maxpool_out.get_shape().as_list()) + + _print_success_message() + + +def test_flatten(flatten): + test_x = tf.placeholder(tf.float32, [None, 10, 30, 6]) + flat_out = flatten(test_x) + + assert flat_out.get_shape().as_list() == [None, 10*30*6],\ + 'Incorrect Shape. Found {} shape'.format(flat_out.get_shape().as_list()) + + _print_success_message() + + +def test_fully_conn(fully_conn): + test_x = tf.placeholder(tf.float32, [None, 128]) + test_num_outputs = 40 + + fc_out = fully_conn(test_x, test_num_outputs) + + assert fc_out.get_shape().as_list() == [None, 40],\ + 'Incorrect Shape. 
Found {} shape'.format(fc_out.get_shape().as_list()) + + _print_success_message() + + +def test_output(output): + test_x = tf.placeholder(tf.float32, [None, 128]) + test_num_outputs = 40 + + output_out = output(test_x, test_num_outputs) + + assert output_out.get_shape().as_list() == [None, 40],\ + 'Incorrect Shape. Found {} shape'.format(output_out.get_shape().as_list()) + + _print_success_message() + + +def test_conv_net(conv_net): + test_x = tf.placeholder(tf.float32, [None, 32, 32, 3]) + test_k = tf.placeholder(tf.float32) + + logits_out = conv_net(test_x, test_k) + + assert logits_out.get_shape().as_list() == [None, 10],\ + 'Incorrect Model Output. Found {}'.format(logits_out.get_shape().as_list()) + + print('Neural Network Built!') + + +def test_train_nn(train_neural_network): + mock_session = tf.Session() + test_x = np.random.rand(128, 32, 32, 3) + test_y = np.random.rand(128, 10) + test_k = np.random.rand(1) + test_optimizer = tf.train.AdamOptimizer() + + mock_session.run = MagicMock() + train_neural_network(mock_session, test_optimizer, test_k, test_x, test_y) + + assert mock_session.run.called, 'Session not used' + + _print_success_message() diff --git a/intro-to-rnns/.ipynb_checkpoints/Anna KaRNNa-checkpoint.ipynb b/intro-to-rnns/.ipynb_checkpoints/Anna KaRNNa-checkpoint.ipynb new file mode 100644 index 0000000..82cd954 --- /dev/null +++ b/intro-to-rnns/.ipynb_checkpoints/Anna KaRNNa-checkpoint.ipynb @@ -0,0 +1,705 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Anna KaRNNa\n", + "\n", + "In this notebook, I'll build a character-wise RNN trained on Anna Karenina, one of my all-time favorite books. It'll be able to generate new text based on the text from the book.\n", + "\n", + "This network is based off of Andrej Karpathy's [post on RNNs](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) and [implementation in Torch](https://github.com/karpathy/char-rnn). 
Also, some information [here at r2rt](http://r2rt.com/recurrent-neural-networks-in-tensorflow-ii.html) and from [Sherjil Ozair](https://github.com/sherjilozair/char-rnn-tensorflow) on GitHub. Below is the general architecture of the character-wise RNN.\n", + "\n", + "" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "import time\n", + "from collections import namedtuple\n", + "\n", + "import numpy as np\n", + "import tensorflow as tf" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First we'll load the text file and convert it into integers for our network to use. Here I'm creating a couple dictionaries to convert the characters to and from integers. Encoding the characters as integers makes it easier to use as input in the network." + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "with open('anna.txt', 'r') as f:\n", + " text=f.read()\n", + "vocab = set(text)\n", + "vocab_to_int = {c: i for i, c in enumerate(vocab)}\n", + "int_to_vocab = dict(enumerate(vocab))\n", + "chars = np.array([vocab_to_int[c] for c in text], dtype=np.int32)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's check out the first 100 characters, make sure everything is peachy. According to the [American Book Review](http://americanbookreview.org/100bestlines.asp), this is the 6th best first line of a book ever." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'Chapter 1\\n\\n\\nHappy families are all alike; every unhappy family is unhappy in its own\\nway.\\n\\nEverythin'" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "text[:100]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And we can see the characters encoded as integers." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([47, 21, 34, 65, 42, 26, 54, 15, 1, 32, 32, 32, 19, 34, 65, 65, 57,\n", + " 15, 74, 34, 5, 80, 27, 80, 26, 68, 15, 34, 54, 26, 15, 34, 27, 27,\n", + " 15, 34, 27, 80, 7, 26, 16, 15, 26, 78, 26, 54, 57, 15, 66, 24, 21,\n", + " 34, 65, 65, 57, 15, 74, 34, 5, 80, 27, 57, 15, 80, 68, 15, 66, 24,\n", + " 21, 34, 65, 65, 57, 15, 80, 24, 15, 80, 42, 68, 15, 40, 2, 24, 32,\n", + " 2, 34, 57, 12, 32, 32, 17, 78, 26, 54, 57, 42, 21, 80, 24], dtype=int32)" + ] + }, + "execution_count": 9, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "chars[:100]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Since the network is working with individual characters, it's similar to a classification problem in which we are trying to predict the next character from the previous text. Here's how many 'classes' our network has to pick from." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "82" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "np.max(chars)+1" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Making training and validation batches\n", + "\n", + "Now I need to split up the data into batches, and into training and validation sets. I should be making a test set here, but I'm not going to worry about that. My test will be if the network can generate new text.\n", + "\n", + "Here I'll make both input and target arrays. The targets are the same as the inputs, except shifted one character over. I'll also drop the last bit of data so that I'll only have completely full batches.\n", + "\n", + "The idea here is to make a 2D matrix where the number of rows is equal to the batch size. Each row will be one long concatenated string from the character data. We'll split this data into a training set and validation set using the `split_frac` keyword. This will keep 90% of the batches in the training set, the other 10% in the validation set." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def split_data(chars, batch_size, num_steps, split_frac=0.9):\n", + " \"\"\" \n", + " Split character data into training and validation sets, inputs and targets for each set.\n", + " \n", + " Arguments\n", + " ---------\n", + " chars: character array\n", + " batch_size: Size of examples in each of batch\n", + " num_steps: Number of sequence steps to keep in the input and pass to the network\n", + " split_frac: Fraction of batches to keep in the training set\n", + " \n", + " \n", + " Returns train_x, train_y, val_x, val_y\n", + " \"\"\"\n", + " \n", + " slice_size = batch_size * num_steps\n", + " n_batches = int(len(chars) / slice_size)\n", + " \n", + " # Drop the last few characters to make only full batches\n", + " x = chars[: n_batches*slice_size]\n", + " y = chars[1: n_batches*slice_size + 1]\n", + " \n", + " # Split the data into batch_size slices, then stack them into a 2D matrix \n", + " x = np.stack(np.split(x, batch_size))\n", + " y = np.stack(np.split(y, batch_size))\n", + " \n", + " # Now x and y are arrays with dimensions batch_size x n_batches*num_steps\n", + " \n", + " # Split into training and validation sets, keep the first split_frac batches for training\n", + " split_idx = int(n_batches*split_frac)\n", + " train_x, train_y= x[:, :split_idx*num_steps], y[:, :split_idx*num_steps]\n", + " val_x, val_y = x[:, split_idx*num_steps:], y[:, split_idx*num_steps:]\n", + " \n", + " return train_x, train_y, val_x, val_y" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now I'll make my data sets and we can check out what's going on here. Here I'm going to use a batch size of 10 and 50 sequence steps." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "train_x, train_y, val_x, val_y = split_data(chars, 10, 50)" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "(10, 178650)" + ] + }, + "execution_count": 16, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "train_x.shape" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Looking at the size of this array, we see that we have rows equal to the batch size. When we want to get a batch out of here, we can grab a subset of this array that contains all the rows but has a width equal to the number of steps in the sequence. The first batch looks like this:" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[47, 21, 34, 65, 42, 26, 54, 15, 1, 32, 32, 32, 19, 34, 65, 65, 57,\n", + " 15, 74, 34, 5, 80, 27, 80, 26, 68, 15, 34, 54, 26, 15, 34, 27, 27,\n", + " 15, 34, 27, 80, 7, 26, 16, 15, 26, 78, 26, 54, 57, 15, 66, 24],\n", + " [15, 34, 5, 15, 24, 40, 42, 15, 37, 40, 80, 24, 37, 15, 42, 40, 15,\n", + " 68, 42, 34, 57, 36, 72, 15, 34, 24, 68, 2, 26, 54, 26, 38, 15, 35,\n", + " 24, 24, 34, 36, 15, 68, 5, 80, 27, 80, 24, 37, 36, 15, 11, 66],\n", + " [78, 80, 24, 12, 32, 32, 72, 67, 26, 68, 36, 15, 80, 42, 0, 68, 15,\n", + " 68, 26, 42, 42, 27, 26, 38, 12, 15, 10, 21, 26, 15, 65, 54, 80, 49,\n", + " 26, 15, 80, 68, 15, 5, 34, 37, 24, 80, 74, 80, 49, 26, 24, 42],\n", + " [24, 15, 38, 66, 54, 80, 24, 37, 15, 21, 80, 68, 15, 49, 40, 24, 78,\n", + " 26, 54, 68, 34, 42, 80, 40, 24, 15, 2, 80, 42, 21, 15, 21, 80, 68,\n", + " 32, 11, 54, 40, 42, 21, 26, 54, 15, 2, 34, 68, 15, 42, 21, 80],\n", + " [15, 80, 42, 15, 80, 68, 36, 15, 68, 80, 54, 81, 72, 15, 68, 34, 80,\n", + " 38, 15, 42, 
21, 26, 15, 40, 27, 38, 15, 5, 34, 24, 36, 15, 37, 26,\n", + " 42, 42, 80, 24, 37, 15, 66, 65, 36, 15, 34, 24, 38, 32, 49, 54],\n", + " [15, 79, 42, 15, 2, 34, 68, 32, 40, 24, 27, 57, 15, 2, 21, 26, 24,\n", + " 15, 42, 21, 26, 15, 68, 34, 5, 26, 15, 26, 78, 26, 24, 80, 24, 37,\n", + " 15, 21, 26, 15, 49, 34, 5, 26, 15, 42, 40, 15, 42, 21, 26, 80],\n", + " [21, 26, 24, 15, 49, 40, 5, 26, 15, 74, 40, 54, 15, 5, 26, 36, 72,\n", + " 15, 68, 21, 26, 15, 68, 34, 80, 38, 36, 15, 34, 24, 38, 15, 2, 26,\n", + " 24, 42, 15, 11, 34, 49, 7, 15, 80, 24, 42, 40, 15, 42, 21, 26],\n", + " [16, 15, 11, 66, 42, 15, 24, 40, 2, 15, 68, 21, 26, 15, 2, 40, 66,\n", + " 27, 38, 15, 54, 26, 34, 38, 80, 27, 57, 15, 21, 34, 78, 26, 15, 68,\n", + " 34, 49, 54, 80, 74, 80, 49, 26, 38, 36, 15, 24, 40, 42, 15, 5],\n", + " [42, 15, 80, 68, 24, 0, 42, 12, 15, 10, 21, 26, 57, 0, 54, 26, 15,\n", + " 65, 54, 40, 65, 54, 80, 26, 42, 40, 54, 68, 15, 40, 74, 15, 34, 15,\n", + " 68, 40, 54, 42, 36, 32, 11, 66, 42, 15, 2, 26, 0, 54, 26, 15],\n", + " [15, 68, 34, 80, 38, 15, 42, 40, 15, 21, 26, 54, 68, 26, 27, 74, 36,\n", + " 15, 34, 24, 38, 15, 11, 26, 37, 34, 24, 15, 34, 37, 34, 80, 24, 15,\n", + " 74, 54, 40, 5, 15, 42, 21, 26, 15, 11, 26, 37, 80, 24, 24, 80]], dtype=int32)" + ] + }, + "execution_count": 18, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "train_x[:,:50]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "I'll write another function to grab batches out of the arrays made by `split_data`. Here each batch will be a sliding window on these arrays with size `batch_size X num_steps`. For example, if we want our network to train on a sequence of 100 characters, `num_steps = 100`. For the next batch, we'll shift this window the next sequence of `num_steps` characters. In this way we can feed batches to the network and the cell states will continue through on each batch." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_batch(arrs, num_steps):\n", + " batch_size, slice_size = arrs[0].shape\n", + " \n", + " n_batches = int(slice_size/num_steps)\n", + " for b in range(n_batches):\n", + " yield [x[:, b*num_steps: (b+1)*num_steps] for x in arrs]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Building the model\n", + "\n", + "Below is a function where I build the graph for the network." + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def build_rnn(num_classes, batch_size=50, num_steps=50, lstm_size=128, num_layers=2,\n", + " learning_rate=0.001, grad_clip=5, sampling=False):\n", + " \n", + " # When we're using this network for sampling later, we'll be passing in\n", + " # one character at a time, so providing an option for that\n", + " if sampling == True:\n", + " batch_size, num_steps = 1, 1\n", + "\n", + " tf.reset_default_graph()\n", + " \n", + " # Declare placeholders we'll feed into the graph\n", + " inputs = tf.placeholder(tf.int32, [batch_size, num_steps], name='inputs')\n", + " targets = tf.placeholder(tf.int32, [batch_size, num_steps], name='targets')\n", + " \n", + " # Keep probability placeholder for drop out layers\n", + " keep_prob = tf.placeholder(tf.float32, name='keep_prob')\n", + " \n", + " # One-hot encoding the input and target characters\n", + " x_one_hot = tf.one_hot(inputs, num_classes)\n", + " y_one_hot = tf.one_hot(targets, num_classes)\n", + "\n", + " ### Build the RNN layers\n", + " # Use a basic LSTM cell\n", + " lstm = tf.contrib.rnn.BasicLSTMCell(lstm_size)\n", + " \n", + " # Add dropout to the cell\n", + " drop = tf.contrib.rnn.DropoutWrapper(lstm, output_keep_prob=keep_prob)\n", + " \n", + " # Stack up multiple LSTM layers, for deep learning\n", + " cell 
= tf.contrib.rnn.MultiRNNCell([drop] * num_layers)\n", + " initial_state = cell.zero_state(batch_size, tf.float32)\n", + "\n", + " ### Run the data through the RNN layers\n", + " # This makes a list where each element is one step in the sequence\n", + " rnn_inputs = [tf.squeeze(i, squeeze_dims=[1]) for i in tf.split(x_one_hot, num_steps, 1)]\n", + " \n", + " # Run each sequence step through the RNN and collect the outputs\n", + " outputs, state = tf.contrib.rnn.static_rnn(cell, rnn_inputs, initial_state=initial_state)\n", + " final_state = state\n", + " \n", + " # Reshape output so it's a bunch of rows, one output row for each step for each batch\n", + " seq_output = tf.concat(outputs, axis=1)\n", + " output = tf.reshape(seq_output, [-1, lstm_size])\n", + " \n", + " # Now connect the RNN outputs to a softmax layer\n", + " with tf.variable_scope('softmax'):\n", + " softmax_w = tf.Variable(tf.truncated_normal((lstm_size, num_classes), stddev=0.1))\n", + " softmax_b = tf.Variable(tf.zeros(num_classes))\n", + " \n", + " # Since output is a bunch of rows of RNN cell outputs, logits will be a bunch\n", + " # of rows of logit outputs, one for each step and batch\n", + " logits = tf.matmul(output, softmax_w) + softmax_b\n", + " \n", + " # Use softmax to get the probabilities for predicted characters\n", + " preds = tf.nn.softmax(logits, name='predictions')\n", + " \n", + " # Reshape the targets to match the logits\n", + " y_reshaped = tf.reshape(y_one_hot, [-1, num_classes])\n", + " loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y_reshaped)\n", + " cost = tf.reduce_mean(loss)\n", + "\n", + " # Optimizer for training, using gradient clipping to control exploding gradients\n", + " tvars = tf.trainable_variables()\n", + " grads, _ = tf.clip_by_global_norm(tf.gradients(cost, tvars), grad_clip)\n", + " train_op = tf.train.AdamOptimizer(learning_rate)\n", + " optimizer = train_op.apply_gradients(zip(grads, tvars))\n", + " \n", + " # Export the nodes\n", + " 
# NOTE: I'm using a namedtuple here because I think they are cool\n", + " export_nodes = ['inputs', 'targets', 'initial_state', 'final_state',\n", + " 'keep_prob', 'cost', 'preds', 'optimizer']\n", + " Graph = namedtuple('Graph', export_nodes)\n", + " local_dict = locals()\n", + " graph = Graph(*[local_dict[each] for each in export_nodes])\n", + " \n", + " return graph" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Hyperparameters\n", + "\n", + "Here I'm defining the hyperparameters for the network. \n", + "\n", + "* `batch_size` - Number of sequences running through the network in one pass.\n", + "* `num_steps` - Number of characters in the sequence the network is trained on. Larger is typically better; the network will learn more long-range dependencies, but it takes longer to train. 100 is typically a good number here.\n", + "* `lstm_size` - The number of units in the hidden layers.\n", + "* `num_layers` - Number of hidden LSTM layers to use\n", + "* `learning_rate` - Learning rate for training\n", + "* `keep_prob` - The dropout keep probability when training. If your network is overfitting, try decreasing this.\n", + "\n", + "Here's some good advice from Andrej Karpathy on training the network. I'm going to write it in here for your benefit, but also link to [where it originally came from](https://github.com/karpathy/char-rnn#tips-and-tricks).\n", + "\n", + "> ## Tips and Tricks\n", + "\n", + ">### Monitoring Validation Loss vs. Training Loss\n", + ">If you're somewhat new to Machine Learning or Neural Networks it can take a bit of expertise to get good models. The most important quantity to keep track of is the difference between your training loss (printed during training) and the validation loss (printed once in a while when the RNN is run on the validation data (by default every 1000 iterations)). 
In particular:\n", + "\n", + "> - If your training loss is much lower than validation loss then this means the network might be **overfitting**. Solutions to this are to decrease your network size, or to increase dropout. For example you could try dropout of 0.5 and so on.\n", + "> - If your training/validation loss are about equal then your model is **underfitting**. Increase the size of your model (either number of layers or the raw number of neurons per layer)\n", + "\n", + "> ### Approximate number of parameters\n", + "\n", + "> The two most important parameters that control the model are `lstm_size` and `num_layers`. I would advise that you always use `num_layers` of either 2/3. The `lstm_size` can be adjusted based on how much data you have. The two important quantities to keep track of here are:\n", + "\n", + "> - The number of parameters in your model. This is printed when you start training.\n", + "> - The size of your dataset. 1MB file is approximately 1 million characters.\n", + "\n", + ">These two should be about the same order of magnitude. It's a little tricky to tell. Here are some examples:\n", + "\n", + "> - I have a 100MB dataset and I'm using the default parameter settings (which currently print 150K parameters). My data size is significantly larger (100 mil >> 0.15 mil), so I expect to heavily underfit. I am thinking I can comfortably afford to make `lstm_size` larger.\n", + "> - I have a 10MB dataset and running a 10 million parameter model. I'm slightly nervous and I'm carefully monitoring my validation loss. If it's larger than my training loss then I may want to try to increase dropout a bit and see if that helps the validation loss.\n", + "\n", + "> ### Best models strategy\n", + "\n", + ">The winning strategy to obtaining very good models (if you have the compute time) is to always err on making the network larger (as large as you're willing to wait for it to compute) and then try different dropout values (between 0,1). 
Whatever model has the best validation performance (the loss, written in the checkpoint filename, low is good) is the one you should use in the end.\n", + "\n", + ">It is very common in deep learning to run many different models with many different hyperparameter settings, and in the end take whatever checkpoint gave the best validation performance.\n", + "\n", + ">By the way, the size of your training and validation splits are also parameters. Make sure you have a decent amount of data in your validation set or otherwise the validation performance will be noisy and not very informative.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "batch_size = 100\n", + "num_steps = 100 \n", + "lstm_size = 512\n", + "num_layers = 2\n", + "learning_rate = 0.001\n", + "keep_prob = 0.5" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training\n", + "\n", + "Time for training which is pretty straightforward. Here I pass in some data, and get an LSTM state back. Then I pass that state back in to the network so the next batch can continue the state from the previous batch. 
And every so often (set by `save_every_n`) I calculate the validation loss and save a checkpoint.\n", + "\n", + "Here I'm saving checkpoints with the format\n", + "\n", + "`i{iteration number}_l{# hidden layer units}_v{validation loss}.ckpt`" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true, + "scrolled": true + }, + "outputs": [], + "source": [ + "epochs = 20\n", + "# Save every N iterations\n", + "save_every_n = 200\n", + "train_x, train_y, val_x, val_y = split_data(chars, batch_size, num_steps)\n", + "\n", + "model = build_rnn(len(vocab), \n", + " batch_size=batch_size,\n", + " num_steps=num_steps,\n", + " learning_rate=learning_rate,\n", + " lstm_size=lstm_size,\n", + " num_layers=num_layers)\n", + "\n", + "saver = tf.train.Saver(max_to_keep=100)\n", + "with tf.Session() as sess:\n", + " sess.run(tf.global_variables_initializer())\n", + " \n", + " # Use the line below to load a checkpoint and resume training\n", + " #saver.restore(sess, 'checkpoints/______.ckpt')\n", + " \n", + " n_batches = int(train_x.shape[1]/num_steps)\n", + " iterations = n_batches * epochs\n", + " for e in range(epochs):\n", + " \n", + " # Train network\n", + " new_state = sess.run(model.initial_state)\n", + " loss = 0\n", + " for b, (x, y) in enumerate(get_batch([train_x, train_y], num_steps), 1):\n", + " iteration = e*n_batches + b\n", + " start = time.time()\n", + " feed = {model.inputs: x,\n", + " model.targets: y,\n", + " model.keep_prob: keep_prob,\n", + " model.initial_state: new_state}\n", + " batch_loss, new_state, _ = sess.run([model.cost, model.final_state, model.optimizer], \n", + " feed_dict=feed)\n", + " loss += batch_loss\n", + " end = time.time()\n", + " print('Epoch {}/{} '.format(e+1, epochs),\n", + " 'Iteration {}/{}'.format(iteration, iterations),\n", + " 'Training loss: {:.4f}'.format(loss/b),\n", + " '{:.4f} sec/batch'.format((end-start)))\n", + " \n", + " \n", + " if 
(iteration%save_every_n == 0) or (iteration == iterations):\n", + " # Check performance, notice dropout has been set to 1\n", + " val_loss = []\n", + " new_state = sess.run(model.initial_state)\n", + " for x, y in get_batch([val_x, val_y], num_steps):\n", + " feed = {model.inputs: x,\n", + " model.targets: y,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " batch_loss, new_state = sess.run([model.cost, model.final_state], feed_dict=feed)\n", + " val_loss.append(batch_loss)\n", + "\n", + " print('Validation loss:', np.mean(val_loss),\n", + " 'Saving checkpoint!')\n", + " saver.save(sess, \"checkpoints/i{}_l{}_v{:.3f}.ckpt\".format(iteration, lstm_size, np.mean(val_loss)))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Saved checkpoints\n", + "\n", + "Read up on saving and loading checkpoints here: https://www.tensorflow.org/programmers_guide/variables" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "tf.train.get_checkpoint_state('checkpoints')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Sampling\n", + "\n", + "Now that the network is trained, we can use it to generate new text. The idea is that we pass in a character, then the network will predict the next character. We can use the new one to predict the next one. And we keep doing this to generate all new text. I also included some functionality to prime the network with some text by passing in a string and building up a state from that.\n", + "\n", + "The network gives us predictions for each character. 
To reduce noise and make things a little less random, I'm going to only choose a new character from the top N most likely characters.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def pick_top_n(preds, vocab_size, top_n=5):\n", + " p = np.squeeze(preds)\n", + " p[np.argsort(p)[:-top_n]] = 0\n", + " p = p / np.sum(p)\n", + " c = np.random.choice(vocab_size, 1, p=p)[0]\n", + " return c" + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def sample(checkpoint, n_samples, lstm_size, vocab_size, prime=\"The \"):\n", + " samples = [c for c in prime]\n", + " model = build_rnn(vocab_size, lstm_size=lstm_size, sampling=True)\n", + " saver = tf.train.Saver()\n", + " with tf.Session() as sess:\n", + " saver.restore(sess, checkpoint)\n", + " new_state = sess.run(model.initial_state)\n", + " for c in prime:\n", + " x = np.zeros((1, 1))\n", + " x[0,0] = vocab_to_int[c]\n", + " feed = {model.inputs: x,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " preds, new_state = sess.run([model.preds, model.final_state], \n", + " feed_dict=feed)\n", + "\n", + " c = pick_top_n(preds, len(vocab))\n", + " samples.append(int_to_vocab[c])\n", + "\n", + " for i in range(n_samples):\n", + " x[0,0] = c\n", + " feed = {model.inputs: x,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " preds, new_state = sess.run([model.preds, model.final_state], \n", + " feed_dict=feed)\n", + "\n", + " c = pick_top_n(preds, len(vocab))\n", + " samples.append(int_to_vocab[c])\n", + " \n", + " return ''.join(samples)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here, pass in the path to a checkpoint and sample from the network." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "checkpoint = \"checkpoints/____.ckpt\"\n", + "samp = sample(checkpoint, 2000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + } + ], + "metadata": { + "hide_input": false, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.1" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/intro-to-rnns/Anna KaRNNa.ipynb b/intro-to-rnns/Anna KaRNNa.ipynb new file mode 100644 index 0000000..d198967 --- /dev/null +++ b/intro-to-rnns/Anna KaRNNa.ipynb @@ -0,0 +1,852 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Anna KaRNNa\n", + "\n", + "In this notebook, I'll build a character-wise RNN trained on Anna Karenina, one of my all-time favorite books. It'll be able to generate new text based on the text from the book.\n", + "\n", + "This network is based off of Andrej Karpathy's [post on RNNs](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) and [implementation in Torch](https://github.com/karpathy/char-rnn). Also, some information [here at r2rt](http://r2rt.com/recurrent-neural-networks-in-tensorflow-ii.html) and from [Sherjil Ozair](https://github.com/sherjilozair/char-rnn-tensorflow) on GitHub. 
Below is the general architecture of the character-wise RNN.\n", + "\n", + "" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "import time\n", + "from collections import namedtuple\n", + "\n", + "import numpy as np\n", + "import tensorflow as tf" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First we'll load the text file and convert it into integers for our network to use. Here I'm creating a couple dictionaries to convert the characters to and from integers. Encoding the characters as integers makes it easier to use as input in the network." + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "with open('anna.txt', 'r') as f:\n", + " text=f.read()\n", + "vocab = set(text)\n", + "vocab_to_int = {c: i for i, c in enumerate(vocab)}\n", + "int_to_vocab = dict(enumerate(vocab))\n", + "chars = np.array([vocab_to_int[c] for c in text], dtype=np.int32)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's check out the first 100 characters, make sure everything is peachy. According to the [American Book Review](http://americanbookreview.org/100bestlines.asp), this is the 6th best first line of a book ever." + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'Chapter 1\\n\\n\\nHappy families are all alike; every unhappy family is unhappy in its own\\nway.\\n\\nEverythin'" + ] + }, + "execution_count": 28, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "text[:100]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And we can see the characters encoded as integers." 
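As a quick sanity check of this style of encoding (a standalone sketch using a made-up string instead of the novel), the two dictionaries are exact inverses, so decoding the integer array recovers the original text:

```python
text = "happy families"          # stand-in for the book's text
vocab = set(text)
vocab_to_int = {c: i for i, c in enumerate(vocab)}
int_to_vocab = dict(enumerate(vocab))

# Encode to integers, then decode back through the inverse dictionary.
encoded = [vocab_to_int[c] for c in text]
decoded = ''.join(int_to_vocab[i] for i in encoded)
print(decoded)  # round-trips back to "happy families"
```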
+ ] + }, + { + "cell_type": "code", + "execution_count": 29, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([36, 17, 78, 77, 22, 32, 43, 37, 44, 48, 48, 48, 34, 78, 77, 77, 12,\n", + " 37, 47, 78, 63, 18, 27, 18, 32, 23, 37, 78, 43, 32, 37, 78, 27, 27,\n", + " 37, 78, 27, 18, 66, 32, 21, 37, 32, 64, 32, 43, 12, 37, 30, 68, 17,\n", + " 78, 77, 77, 12, 37, 47, 78, 63, 18, 27, 12, 37, 18, 23, 37, 30, 68,\n", + " 17, 78, 77, 77, 12, 37, 18, 68, 37, 18, 22, 23, 37, 4, 35, 68, 48,\n", + " 35, 78, 12, 42, 48, 48, 57, 64, 32, 43, 12, 22, 17, 18, 68], dtype=int32)" + ] + }, + "execution_count": 29, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "chars[:100]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Since the network is working with individual characters, it's similar to a classification problem in which we are trying to predict the next character from the previous text. Here's how many 'classes' our network has to pick from." + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "83" + ] + }, + "execution_count": 30, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "np.max(chars)+1" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Making training and validation batches\n", + "\n", + "Now I need to split up the data into batches, and into training and validation sets. I should be making a test set here, but I'm not going to worry about that. My test will be if the network can generate new text.\n", + "\n", + "Here I'll make both input and target arrays. The targets are the same as the inputs, except shifted one character over. 
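The shifted-by-one relationship between inputs and targets can be seen in a tiny example (a sketch with made-up text, using the same encoding style as the notebook):

```python
import numpy as np

text = "hello"
vocab = sorted(set(text))                       # ['e', 'h', 'l', 'o']
vocab_to_int = {c: i for i, c in enumerate(vocab)}
encoded = np.array([vocab_to_int[c] for c in text], dtype=np.int32)

# Targets are the inputs shifted one character over: given 'h' the
# network should predict 'e', given 'e' it should predict 'l', and so on.
inputs, targets = encoded[:-1], encoded[1:]
pairs = [(vocab[i], vocab[t]) for i, t in zip(inputs, targets)]
print(pairs)  # [('h', 'e'), ('e', 'l'), ('l', 'l'), ('l', 'o')]
```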
I'll also drop the last bit of data so that I'll only have completely full batches.\n", + "\n", + "The idea here is to make a 2D matrix where the number of rows is equal to the batch size. Each row will be one long concatenated string from the character data. We'll split this data into a training set and validation set using the `split_frac` keyword argument. This will keep 90% of the batches in the training set and the other 10% in the validation set." + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def split_data(chars, batch_size, num_steps, split_frac=0.9):\n", + " \"\"\" \n", + " Split character data into training and validation sets, inputs and targets for each set.\n", + " \n", + " Arguments\n", + " ---------\n", + " chars: character array\n", + " batch_size: Number of sequences in each batch\n", + " num_steps: Number of sequence steps to keep in the input and pass to the network\n", + " split_frac: Fraction of batches to keep in the training set\n", + " \n", + " \n", + " Returns train_x, train_y, val_x, val_y\n", + " \"\"\"\n", + " \n", + " slice_size = batch_size * num_steps\n", + " n_batches = int(len(chars) / slice_size)\n", + " \n", + " # Drop the last few characters to make only full batches\n", + " x = chars[: n_batches*slice_size]\n", + " y = chars[1: n_batches*slice_size + 1]\n", + " \n", + " # Split the data into batch_size slices, then stack them into a 2D matrix \n", + " x = np.stack(np.split(x, batch_size))\n", + " y = np.stack(np.split(y, batch_size))\n", + " \n", + " # Now x and y are arrays with dimensions batch_size x n_batches*num_steps\n", + " \n", + " # Split into training and validation sets, keep the first split_frac batches for training\n", + " split_idx = int(n_batches*split_frac)\n", + " train_x, train_y = x[:, :split_idx*num_steps], y[:, :split_idx*num_steps]\n", + " val_x, val_y = x[:, split_idx*num_steps:], y[:, split_idx*num_steps:]\n", + " \n", + " return 
train_x, train_y, val_x, val_y" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now I'll make my data sets and we can check out what's going on here. Here I'm going to use a batch size of 10 and 50 sequence steps." + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "train_x, train_y, val_x, val_y = split_data(chars, 10, 50)" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "(10, 178650)" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "train_x.shape" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Looking at the size of this array, we see that we have rows equal to the batch size. When we want to get a batch out of here, we can grab a subset of this array that contains all the rows but has a width equal to the number of steps in the sequence. 
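A miniature version of this layout (made-up sizes: 20 characters, a batch size of 2, and 4 sequence steps) shows how the leftover characters get dropped and how one batch is sliced out of the 2D matrix:

```python
import numpy as np

chars = np.arange(20)                  # toy "encoded text"
batch_size, num_steps = 2, 4
slice_size = batch_size * num_steps
n_batches = len(chars) // slice_size   # 2 full batches; 4 characters are dropped

# One long row per sequence in the batch.
x = np.stack(np.split(chars[:n_batches * slice_size], batch_size))
print(x.shape)                         # (2, 8)

# Grabbing a batch = all rows, num_steps columns.
first_batch = x[:, :num_steps]
print(first_batch.tolist())            # [[0, 1, 2, 3], [8, 9, 10, 11]]
```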
The first batch looks like this:" + ] + }, + { + "cell_type": "code", + "execution_count": 34, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[36, 17, 78, 77, 22, 32, 43, 37, 44, 48, 48, 48, 34, 78, 77, 77, 12,\n", + " 37, 47, 78, 63, 18, 27, 18, 32, 23, 37, 78, 43, 32, 37, 78, 27, 27,\n", + " 37, 78, 27, 18, 66, 32, 21, 37, 32, 64, 32, 43, 12, 37, 30, 68],\n", + " [37, 78, 63, 37, 68, 4, 22, 37, 24, 4, 18, 68, 24, 37, 22, 4, 37,\n", + " 23, 22, 78, 12, 75, 49, 37, 78, 68, 23, 35, 32, 43, 32, 69, 37, 0,\n", + " 68, 68, 78, 75, 37, 23, 63, 18, 27, 18, 68, 24, 75, 37, 33, 30],\n", + " [64, 18, 68, 42, 48, 48, 49, 70, 32, 23, 75, 37, 18, 22, 9, 23, 37,\n", + " 23, 32, 22, 22, 27, 32, 69, 42, 37, 10, 17, 32, 37, 77, 43, 18, 73,\n", + " 32, 37, 18, 23, 37, 63, 78, 24, 68, 18, 47, 18, 73, 32, 68, 22],\n", + " [68, 37, 69, 30, 43, 18, 68, 24, 37, 17, 18, 23, 37, 73, 4, 68, 64,\n", + " 32, 43, 23, 78, 22, 18, 4, 68, 37, 35, 18, 22, 17, 37, 17, 18, 23,\n", + " 48, 33, 43, 4, 22, 17, 32, 43, 37, 35, 78, 23, 37, 22, 17, 18],\n", + " [37, 18, 22, 37, 18, 23, 75, 37, 23, 18, 43, 58, 49, 37, 23, 78, 18,\n", + " 69, 37, 22, 17, 32, 37, 4, 27, 69, 37, 63, 78, 68, 75, 37, 24, 32,\n", + " 22, 22, 18, 68, 24, 37, 30, 77, 75, 37, 78, 68, 69, 48, 73, 43],\n", + " [37, 15, 22, 37, 35, 78, 23, 48, 4, 68, 27, 12, 37, 35, 17, 32, 68,\n", + " 37, 22, 17, 32, 37, 23, 78, 63, 32, 37, 32, 64, 32, 68, 18, 68, 24,\n", + " 37, 17, 32, 37, 73, 78, 63, 32, 37, 22, 4, 37, 22, 17, 32, 18],\n", + " [17, 32, 68, 37, 73, 4, 63, 32, 37, 47, 4, 43, 37, 63, 32, 75, 49,\n", + " 37, 23, 17, 32, 37, 23, 78, 18, 69, 75, 37, 78, 68, 69, 37, 35, 32,\n", + " 68, 22, 37, 33, 78, 73, 66, 37, 18, 68, 22, 4, 37, 22, 17, 32],\n", + " [21, 37, 33, 30, 22, 37, 68, 4, 35, 37, 23, 17, 32, 37, 35, 4, 30,\n", + " 27, 69, 37, 43, 32, 78, 69, 18, 27, 12, 37, 17, 78, 64, 32, 37, 23,\n", + " 78, 73, 43, 18, 47, 18, 73, 32, 69, 75, 37, 68, 4, 22, 37, 63],\n", + " [22, 37, 18, 
23, 68, 9, 22, 42, 37, 10, 17, 32, 12, 9, 43, 32, 37,\n", + " 77, 43, 4, 77, 43, 18, 32, 22, 4, 43, 23, 37, 4, 47, 37, 78, 37,\n", + " 23, 4, 43, 22, 75, 48, 33, 30, 22, 37, 35, 32, 9, 43, 32, 37],\n", + " [37, 23, 78, 18, 69, 37, 22, 4, 37, 17, 32, 43, 23, 32, 27, 47, 75,\n", + " 37, 78, 68, 69, 37, 33, 32, 24, 78, 68, 37, 78, 24, 78, 18, 68, 37,\n", + " 47, 43, 4, 63, 37, 22, 17, 32, 37, 33, 32, 24, 18, 68, 68, 18]], dtype=int32)" + ] + }, + "execution_count": 34, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "train_x[:,:50]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "I'll write another function to grab batches out of the arrays made by `split_data`. Here each batch will be a sliding window on these arrays with size `batch_size X num_steps`. For example, if we want our network to train on a sequence of 100 characters, `num_steps = 100`. For the next batch, we'll shift this window to the next sequence of `num_steps` characters. In this way we can feed batches to the network and the cell states will continue through on each batch." + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_batch(arrs, num_steps):\n", + " batch_size, slice_size = arrs[0].shape\n", + " \n", + " n_batches = int(slice_size/num_steps)\n", + " for b in range(n_batches):\n", + " yield [x[:, b*num_steps: (b+1)*num_steps] for x in arrs]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Building the model\n", + "\n", + "Below is a function where I build the graph for the network." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 36, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def build_rnn(num_classes, batch_size=50, num_steps=50, lstm_size=128, num_layers=2,\n", + " learning_rate=0.001, grad_clip=5, sampling=False):\n", + " \n", + " # When we're using this network for sampling later, we'll be passing in\n", + " # one character at a time, so providing an option for that\n", + " if sampling == True:\n", + " batch_size, num_steps = 1, 1\n", + "\n", + " tf.reset_default_graph()\n", + " \n", + " # Declare placeholders we'll feed into the graph\n", + " inputs = tf.placeholder(tf.int32, [batch_size, num_steps], name='inputs')\n", + " targets = tf.placeholder(tf.int32, [batch_size, num_steps], name='targets')\n", + " \n", + " # Keep probability placeholder for drop out layers\n", + " keep_prob = tf.placeholder(tf.float32, name='keep_prob')\n", + " \n", + " # One-hot encoding the input and target characters\n", + " x_one_hot = tf.one_hot(inputs, num_classes)\n", + " y_one_hot = tf.one_hot(targets, num_classes)\n", + "\n", + " ### Build the RNN layers\n", + " # Use a basic LSTM cell\n", + " lstm = tf.contrib.rnn.BasicLSTMCell(lstm_size)\n", + " \n", + " # Add dropout to the cell\n", + " drop = tf.contrib.rnn.DropoutWrapper(lstm, output_keep_prob=keep_prob)\n", + " \n", + " # Stack up multiple LSTM layers, for deep learning\n", + " cell = tf.contrib.rnn.MultiRNNCell([drop] * num_layers)\n", + " initial_state = cell.zero_state(batch_size, tf.float32)\n", + "\n", + " ### Run the data through the RNN layers\n", + " # This makes a list where each element is one step in the sequence\n", + " rnn_inputs = [tf.squeeze(i, squeeze_dims=[1]) for i in tf.split(x_one_hot, num_steps, 1)]\n", + " \n", + " # Run each sequence step through the RNN and collect the outputs\n", + " outputs, state = tf.contrib.rnn.static_rnn(cell, rnn_inputs, initial_state=initial_state)\n", + " final_state = 
state\n", + " \n", + " # Reshape output so it's a bunch of rows, one output row for each step for each batch\n", + " seq_output = tf.concat(outputs, axis=1)\n", + " output = tf.reshape(seq_output, [-1, lstm_size])\n", + " \n", + " # Now connect the RNN outputs to a softmax layer\n", + " with tf.variable_scope('softmax'):\n", + " softmax_w = tf.Variable(tf.truncated_normal((lstm_size, num_classes), stddev=0.1))\n", + " softmax_b = tf.Variable(tf.zeros(num_classes))\n", + " \n", + " # Since output is a bunch of rows of RNN cell outputs, logits will be a bunch\n", + " # of rows of logit outputs, one for each step and batch\n", + " logits = tf.matmul(output, softmax_w) + softmax_b\n", + " \n", + " # Use softmax to get the probabilities for predicted characters\n", + " preds = tf.nn.softmax(logits, name='predictions')\n", + " \n", + " # Reshape the targets to match the logits\n", + " y_reshaped = tf.reshape(y_one_hot, [-1, num_classes])\n", + " loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y_reshaped)\n", + " cost = tf.reduce_mean(loss)\n", + "\n", + " # Optimizer for training, using gradient clipping to control exploding gradients\n", + " tvars = tf.trainable_variables()\n", + " grads, _ = tf.clip_by_global_norm(tf.gradients(cost, tvars), grad_clip)\n", + " train_op = tf.train.AdamOptimizer(learning_rate)\n", + " optimizer = train_op.apply_gradients(zip(grads, tvars))\n", + " \n", + " # Export the nodes\n", + " # NOTE: I'm using a namedtuple here because I think they are cool\n", + " export_nodes = ['inputs', 'targets', 'initial_state', 'final_state',\n", + " 'keep_prob', 'cost', 'preds', 'optimizer']\n", + " Graph = namedtuple('Graph', export_nodes)\n", + " local_dict = locals()\n", + " graph = Graph(*[local_dict[each] for each in export_nodes])\n", + " \n", + " return graph" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Hyperparameters\n", + "\n", + "Here I'm defining the hyperparameters for the network. 
\n", + "\n", + "* `batch_size` - Number of sequences running through the network in one pass.\n", + "* `num_steps` - Number of characters in the sequence the network is trained on. Larger is typically better; the network can learn longer-range dependencies, but training takes longer. 100 is usually a good number here.\n", + "* `lstm_size` - The number of units in the hidden layers.\n", + "* `num_layers` - Number of hidden LSTM layers to use.\n", + "* `learning_rate` - Learning rate for training.\n", + "* `keep_prob` - The dropout keep probability when training. If your network is overfitting, try decreasing this.\n", + "\n", + "Here's some good advice from Andrej Karpathy on training the network. I'm going to write it in here for your benefit, but also link to [where it originally came from](https://github.com/karpathy/char-rnn#tips-and-tricks).\n", + "\n", + "> ## Tips and Tricks\n", + "\n", + ">### Monitoring Validation Loss vs. Training Loss\n", + ">If you're somewhat new to Machine Learning or Neural Networks it can take a bit of expertise to get good models. The most important quantity to keep track of is the difference between your training loss (printed during training) and the validation loss (printed once in a while when the RNN is run on the validation data (by default every 1000 iterations)). In particular:\n", + "\n", + "> - If your training loss is much lower than validation loss then this means the network might be **overfitting**. Solutions to this are to decrease your network size, or to increase dropout. For example you could try dropout of 0.5 and so on.\n", + "> - If your training/validation loss are about equal then your model is **underfitting**. Increase the size of your model (either number of layers or the raw number of neurons per layer)\n", + "\n", + "> ### Approximate number of parameters\n", + "\n", + "> The two most important parameters that control the model are `lstm_size` and `num_layers`. 
I would advise that you always use `num_layers` of either 2/3. The `lstm_size` can be adjusted based on how much data you have. The two important quantities to keep track of here are:\n", + "\n", + "> - The number of parameters in your model. This is printed when you start training.\n", + "> - The size of your dataset. 1MB file is approximately 1 million characters.\n", + "\n", + ">These two should be about the same order of magnitude. It's a little tricky to tell. Here are some examples:\n", + "\n", + "> - I have a 100MB dataset and I'm using the default parameter settings (which currently print 150K parameters). My data size is significantly larger (100 mil >> 0.15 mil), so I expect to heavily underfit. I am thinking I can comfortably afford to make `lstm_size` larger.\n", + "> - I have a 10MB dataset and running a 10 million parameter model. I'm slightly nervous and I'm carefully monitoring my validation loss. If it's larger than my training loss then I may want to try to increase dropout a bit and see if that helps the validation loss.\n", + "\n", + "> ### Best models strategy\n", + "\n", + ">The winning strategy to obtaining very good models (if you have the compute time) is to always err on making the network larger (as large as you're willing to wait for it to compute) and then try different dropout values (between 0,1). Whatever model has the best validation performance (the loss, written in the checkpoint filename, low is good) is the one you should use in the end.\n", + "\n", + ">It is very common in deep learning to run many different models with many different hyperparameter settings, and in the end take whatever checkpoint gave the best validation performance.\n", + "\n", + ">By the way, the size of your training and validation splits are also parameters. 
Make sure you have a decent amount of data in your validation set or otherwise the validation performance will be noisy and not very informative.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "batch_size = 100\n", + "num_steps = 100 \n", + "lstm_size = 512\n", + "num_layers = 2\n", + "learning_rate = 0.001\n", + "keep_prob = 0.5" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training\n", + "\n", + "Time for training which is pretty straightforward. Here I pass in some data, and get an LSTM state back. Then I pass that state back in to the network so the next batch can continue the state from the previous batch. And every so often (set by `save_every_n`) I calculate the validation loss and save a checkpoint.\n", + "\n", + "Here I'm saving checkpoints with the format\n", + "\n", + "`i{iteration number}_l{# hidden layer units}_v{validation loss}.ckpt`" + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true, + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Epoch 1/20 Iteration 1/3560 Training loss: 4.4183 1.6397 sec/batch\n", + "Epoch 1/20 Iteration 2/3560 Training loss: 4.3788 0.7654 sec/batch\n", + "Epoch 1/20 Iteration 3/3560 Training loss: 4.2195 0.7784 sec/batch\n", + "Epoch 1/20 Iteration 4/3560 Training loss: 4.5335 0.7526 sec/batch\n", + "Epoch 1/20 Iteration 5/3560 Training loss: 4.4629 0.7196 sec/batch\n", + "Epoch 1/20 Iteration 6/3560 Training loss: 4.3735 0.7344 sec/batch\n", + "Epoch 1/20 Iteration 7/3560 Training loss: 4.2830 0.6926 sec/batch\n", + "Epoch 1/20 Iteration 8/3560 Training loss: 4.1975 0.6992 sec/batch\n", + "Epoch 1/20 Iteration 9/3560 Training loss: 4.1208 0.7158 sec/batch\n", + "Epoch 1/20 Iteration 10/3560 Training loss: 4.0544 0.7060 
sec/batch\n", + "Epoch 1/20 Iteration 11/3560 Training loss: 3.9961 0.7527 sec/batch\n", + "Epoch 1/20 Iteration 12/3560 Training loss: 3.9460 0.7117 sec/batch\n", + "Epoch 1/20 Iteration 13/3560 Training loss: 3.9025 0.7060 sec/batch\n", + "Epoch 1/20 Iteration 14/3560 Training loss: 3.8645 0.6890 sec/batch\n", + "Epoch 1/20 Iteration 15/3560 Training loss: 3.8308 0.6893 sec/batch\n", + "Epoch 1/20 Iteration 16/3560 Training loss: 3.7992 0.6998 sec/batch\n", + "Epoch 1/20 Iteration 17/3560 Training loss: 3.7699 0.7144 sec/batch\n", + "Epoch 1/20 Iteration 18/3560 Training loss: 3.7447 0.7604 sec/batch\n", + "Epoch 1/20 Iteration 19/3560 Training loss: 3.7213 0.7003 sec/batch\n", + "Epoch 1/20 Iteration 20/3560 Training loss: 3.6979 0.6853 sec/batch\n", + "Epoch 1/20 Iteration 21/3560 Training loss: 3.6773 0.6831 sec/batch\n", + "Epoch 1/20 Iteration 22/3560 Training loss: 3.6585 0.7401 sec/batch\n", + "Epoch 1/20 Iteration 23/3560 Training loss: 3.6412 0.7318 sec/batch\n", + "Epoch 1/20 Iteration 24/3560 Training loss: 3.6250 0.7306 sec/batch\n", + "Epoch 1/20 Iteration 25/3560 Training loss: 3.6095 0.7362 sec/batch\n", + "Epoch 1/20 Iteration 26/3560 Training loss: 3.5959 0.7257 sec/batch\n", + "Epoch 1/20 Iteration 27/3560 Training loss: 3.5829 0.7072 sec/batch\n", + "Epoch 1/20 Iteration 28/3560 Training loss: 3.5698 0.7180 sec/batch\n", + "Epoch 1/20 Iteration 29/3560 Training loss: 3.5579 0.7113 sec/batch\n", + "Epoch 1/20 Iteration 30/3560 Training loss: 3.5468 0.7387 sec/batch\n", + "Epoch 1/20 Iteration 31/3560 Training loss: 3.5370 0.7235 sec/batch\n", + "Epoch 1/20 Iteration 32/3560 Training loss: 3.5269 0.7129 sec/batch\n", + "Epoch 1/20 Iteration 33/3560 Training loss: 3.5169 0.6921 sec/batch\n", + "Epoch 1/20 Iteration 34/3560 Training loss: 3.5079 0.7006 sec/batch\n", + "Epoch 1/20 Iteration 35/3560 Training loss: 3.4990 0.6973 sec/batch\n", + "Epoch 1/20 Iteration 36/3560 Training loss: 3.4912 0.7065 sec/batch\n", + "Epoch 1/20 Iteration 37/3560 
Training loss: 3.4828 0.7117 sec/batch\n", + "Epoch 1/20 Iteration 38/3560 Training loss: 3.4750 0.7141 sec/batch\n", + "Epoch 1/20 Iteration 39/3560 Training loss: 3.4673 0.7443 sec/batch\n", + "Epoch 1/20 Iteration 40/3560 Training loss: 3.4600 0.7414 sec/batch\n", + "Epoch 1/20 Iteration 41/3560 Training loss: 3.4529 0.7337 sec/batch\n", + "Epoch 1/20 Iteration 42/3560 Training loss: 3.4460 0.7220 sec/batch\n", + "Epoch 1/20 Iteration 43/3560 Training loss: 3.4396 0.7769 sec/batch\n", + "Epoch 1/20 Iteration 44/3560 Training loss: 3.4333 0.7866 sec/batch\n", + "Epoch 1/20 Iteration 45/3560 Training loss: 3.4273 0.6979 sec/batch\n", + "Epoch 1/20 Iteration 46/3560 Training loss: 3.4218 0.6913 sec/batch\n", + "Epoch 1/20 Iteration 47/3560 Training loss: 3.4164 0.7000 sec/batch\n", + "Epoch 1/20 Iteration 48/3560 Training loss: 3.4114 0.6916 sec/batch\n", + "Epoch 1/20 Iteration 49/3560 Training loss: 3.4065 0.6946 sec/batch\n", + "Epoch 1/20 Iteration 50/3560 Training loss: 3.4017 0.7230 sec/batch\n", + "Epoch 1/20 Iteration 51/3560 Training loss: 3.3970 0.7899 sec/batch\n", + "Epoch 1/20 Iteration 52/3560 Training loss: 3.3922 0.7018 sec/batch\n", + "Epoch 1/20 Iteration 53/3560 Training loss: 3.3879 0.6916 sec/batch\n", + "Epoch 1/20 Iteration 54/3560 Training loss: 3.3835 0.6925 sec/batch\n", + "Epoch 1/20 Iteration 55/3560 Training loss: 3.3794 0.7105 sec/batch\n", + "Epoch 1/20 Iteration 56/3560 Training loss: 3.3751 0.7442 sec/batch\n", + "Epoch 1/20 Iteration 57/3560 Training loss: 3.3712 0.7132 sec/batch\n", + "Epoch 1/20 Iteration 58/3560 Training loss: 3.3674 0.6911 sec/batch\n", + "Epoch 1/20 Iteration 59/3560 Training loss: 3.3635 0.6949 sec/batch\n", + "Epoch 1/20 Iteration 60/3560 Training loss: 3.3600 0.6954 sec/batch\n", + "Epoch 1/20 Iteration 61/3560 Training loss: 3.3566 0.6968 sec/batch\n", + "Epoch 1/20 Iteration 62/3560 Training loss: 3.3534 0.6942 sec/batch\n", + "Epoch 1/20 Iteration 63/3560 Training loss: 3.3505 0.6893 sec/batch\n", + 
"Epoch 1/20 Iteration 64/3560 Training loss: 3.3470 0.7183 sec/batch\n", + "Epoch 1/20 Iteration 65/3560 Training loss: 3.3437 0.7065 sec/batch\n", + "Epoch 1/20 Iteration 66/3560 Training loss: 3.3408 0.6992 sec/batch\n", + "Epoch 1/20 Iteration 67/3560 Training loss: 3.3380 0.7020 sec/batch\n", + "Epoch 1/20 Iteration 68/3560 Training loss: 3.3344 0.7185 sec/batch\n", + "Epoch 1/20 Iteration 69/3560 Training loss: 3.3314 0.7007 sec/batch\n", + "Epoch 1/20 Iteration 70/3560 Training loss: 3.3286 0.6948 sec/batch\n", + "Epoch 1/20 Iteration 71/3560 Training loss: 3.3259 0.7042 sec/batch\n", + "Epoch 1/20 Iteration 72/3560 Training loss: 3.3234 0.7412 sec/batch\n", + "Epoch 1/20 Iteration 73/3560 Training loss: 3.3207 0.7508 sec/batch\n", + "Epoch 1/20 Iteration 74/3560 Training loss: 3.3182 0.7345 sec/batch\n", + "Epoch 1/20 Iteration 75/3560 Training loss: 3.3159 0.6922 sec/batch\n", + "Epoch 1/20 Iteration 76/3560 Training loss: 3.3136 0.7058 sec/batch\n", + "Epoch 1/20 Iteration 77/3560 Training loss: 3.3112 0.7150 sec/batch\n", + "Epoch 1/20 Iteration 78/3560 Training loss: 3.3088 0.7372 sec/batch\n", + "Epoch 1/20 Iteration 79/3560 Training loss: 3.3064 0.6926 sec/batch\n", + "Epoch 1/20 Iteration 80/3560 Training loss: 3.3039 0.6921 sec/batch\n", + "Epoch 1/20 Iteration 81/3560 Training loss: 3.3015 0.7069 sec/batch\n", + "Epoch 1/20 Iteration 82/3560 Training loss: 3.2993 0.6933 sec/batch\n", + "Epoch 1/20 Iteration 83/3560 Training loss: 3.2972 0.7065 sec/batch\n", + "Epoch 1/20 Iteration 84/3560 Training loss: 3.2949 0.6919 sec/batch\n", + "Epoch 1/20 Iteration 85/3560 Training loss: 3.2925 0.6986 sec/batch\n", + "Epoch 1/20 Iteration 86/3560 Training loss: 3.2902 0.6895 sec/batch\n", + "Epoch 1/20 Iteration 87/3560 Training loss: 3.2880 0.6895 sec/batch\n", + "Epoch 1/20 Iteration 88/3560 Training loss: 3.2858 0.6850 sec/batch\n", + "Epoch 1/20 Iteration 89/3560 Training loss: 3.2839 0.7076 sec/batch\n", + "Epoch 1/20 Iteration 90/3560 Training loss: 
3.2819 0.6936 sec/batch\n", + "Epoch 1/20 Iteration 91/3560 Training loss: 3.2800 0.7680 sec/batch\n", + "Epoch 1/20 Iteration 92/3560 Training loss: 3.2780 0.7508 sec/batch\n", + "Epoch 1/20 Iteration 93/3560 Training loss: 3.2761 0.7522 sec/batch\n", + "Epoch 1/20 Iteration 94/3560 Training loss: 3.2743 0.7540 sec/batch\n", + "Epoch 1/20 Iteration 95/3560 Training loss: 3.2723 0.7220 sec/batch\n", + "Epoch 1/20 Iteration 96/3560 Training loss: 3.2703 0.7042 sec/batch\n", + "Epoch 1/20 Iteration 97/3560 Training loss: 3.2684 0.7808 sec/batch\n", + "Epoch 1/20 Iteration 98/3560 Training loss: 3.2664 0.7194 sec/batch\n", + "Epoch 1/20 Iteration 99/3560 Training loss: 3.2645 0.6983 sec/batch\n" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 33\u001b[0m model.initial_state: new_state}\n\u001b[1;32m 34\u001b[0m batch_loss, new_state, _ = sess.run([model.cost, model.final_state, model.optimizer], \n\u001b[0;32m---> 35\u001b[0;31m feed_dict=feed)\n\u001b[0m\u001b[1;32m 36\u001b[0m \u001b[0mloss\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0mbatch_loss\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 37\u001b[0m \u001b[0mend\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtime\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtime\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36mrun\u001b[0;34m(self, fetches, feed_dict, options, run_metadata)\u001b[0m\n\u001b[1;32m 765\u001b[0m \u001b[0;32mtry\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 766\u001b[0m result = 
self._run(None, fetches, feed_dict, options_ptr,\n\u001b[0;32m--> 767\u001b[0;31m run_metadata_ptr)\n\u001b[0m\u001b[1;32m 768\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mrun_metadata\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 769\u001b[0m \u001b[0mproto_data\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtf_session\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mTF_GetBuffer\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mrun_metadata_ptr\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_run\u001b[0;34m(self, handle, fetches, feed_dict, options, run_metadata)\u001b[0m\n\u001b[1;32m 963\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mfinal_fetches\u001b[0m \u001b[0;32mor\u001b[0m \u001b[0mfinal_targets\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 964\u001b[0m results = self._do_run(handle, final_targets, final_fetches,\n\u001b[0;32m--> 965\u001b[0;31m feed_dict_string, options, run_metadata)\n\u001b[0m\u001b[1;32m 966\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 967\u001b[0m \u001b[0mresults\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_do_run\u001b[0;34m(self, handle, target_list, fetch_list, feed_dict, options, run_metadata)\u001b[0m\n\u001b[1;32m 1013\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mhandle\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1014\u001b[0m return self._do_call(_run_fn, self._session, feed_dict, fetch_list,\n\u001b[0;32m-> 1015\u001b[0;31m target_list, options, run_metadata)\n\u001b[0m\u001b[1;32m 1016\u001b[0m 
\u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1017\u001b[0m return self._do_call(_prun_fn, self._session, handle, feed_dict,\n", + "\u001b[0;32m/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_do_call\u001b[0;34m(self, fn, *args)\u001b[0m\n\u001b[1;32m 1020\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_do_call\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfn\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m*\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1021\u001b[0m \u001b[0;32mtry\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 1022\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0mfn\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m*\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 1023\u001b[0m \u001b[0;32mexcept\u001b[0m \u001b[0merrors\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mOpError\u001b[0m \u001b[0;32mas\u001b[0m \u001b[0me\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1024\u001b[0m \u001b[0mmessage\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mcompat\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mas_text\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0me\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mmessage\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_run_fn\u001b[0;34m(session, feed_dict, fetch_list, target_list, options, run_metadata)\u001b[0m\n\u001b[1;32m 1002\u001b[0m return tf_session.TF_Run(session, options,\n\u001b[1;32m 1003\u001b[0m \u001b[0mfeed_dict\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfetch_list\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0mtarget_list\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 1004\u001b[0;31m status, run_metadata)\n\u001b[0m\u001b[1;32m 1005\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1006\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_prun_fn\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msession\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mhandle\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfeed_dict\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfetch_list\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "epochs = 20\n", + "# Save every N iterations\n", + "save_every_n = 200\n", + "train_x, train_y, val_x, val_y = split_data(chars, batch_size, num_steps)\n", + "\n", + "model = build_rnn(len(vocab), \n", + " batch_size=batch_size,\n", + " num_steps=num_steps,\n", + " learning_rate=learning_rate,\n", + " lstm_size=lstm_size,\n", + " num_layers=num_layers)\n", + "\n", + "saver = tf.train.Saver(max_to_keep=100)\n", + "with tf.Session() as sess:\n", + " sess.run(tf.global_variables_initializer())\n", + " \n", + " # Use the line below to load a checkpoint and resume training\n", + " #saver.restore(sess, 'checkpoints/______.ckpt')\n", + " \n", + " n_batches = int(train_x.shape[1]/num_steps)\n", + " iterations = n_batches * epochs\n", + " for e in range(epochs):\n", + " \n", + " # Train network\n", + " new_state = sess.run(model.initial_state)\n", + " loss = 0\n", + " for b, (x, y) in enumerate(get_batch([train_x, train_y], num_steps), 1):\n", + " iteration = e*n_batches + b\n", + " start = time.time()\n", + " feed = {model.inputs: x,\n", + " model.targets: y,\n", + " model.keep_prob: keep_prob,\n", + " model.initial_state: new_state}\n", + " batch_loss, new_state, _ = sess.run([model.cost, model.final_state, model.optimizer], \n", + " feed_dict=feed)\n", + " loss += batch_loss\n", + " end = time.time()\n", + " print('Epoch {}/{} '.format(e+1, 
epochs),\n", + " 'Iteration {}/{}'.format(iteration, iterations),\n", + " 'Training loss: {:.4f}'.format(loss/b),\n", + " '{:.4f} sec/batch'.format((end-start)))\n", + " \n", + " \n", + " if (iteration%save_every_n == 0) or (iteration == iterations):\n", + " # Check performance, notice dropout has been set to 1\n", + " val_loss = []\n", + " new_state = sess.run(model.initial_state)\n", + " for x, y in get_batch([val_x, val_y], num_steps):\n", + " feed = {model.inputs: x,\n", + " model.targets: y,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " batch_loss, new_state = sess.run([model.cost, model.final_state], feed_dict=feed)\n", + " val_loss.append(batch_loss)\n", + "\n", + " print('Validation loss:', np.mean(val_loss),\n", + " 'Saving checkpoint!')\n", + " saver.save(sess, \"checkpoints/i{}_l{}_v{:.3f}.ckpt\".format(iteration, lstm_size, np.mean(val_loss)))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Saved checkpoints\n", + "\n", + "Read up on saving and loading checkpoints here: https://www.tensorflow.org/programmers_guide/variables" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "tf.train.get_checkpoint_state('checkpoints')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Sampling\n", + "\n", + "Now that the network is trained, we can use it to generate new text. The idea is that we pass in a character, then the network will predict the next character. We can use the new one to predict the next one, and keep doing this to generate entirely new text. I also included some functionality to prime the network with some text by passing in a string and building up a state from that.\n", + "\n", + "The network gives us predictions for each character. 
To reduce noise and make things a little less random, I'm going to only choose a new character from the top N most likely characters.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def pick_top_n(preds, vocab_size, top_n=5):\n", + " p = np.squeeze(preds)\n", + " p[np.argsort(p)[:-top_n]] = 0\n", + " p = p / np.sum(p)\n", + " c = np.random.choice(vocab_size, 1, p=p)[0]\n", + " return c" + ] + }, + { + "cell_type": "code", + "execution_count": 39, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def sample(checkpoint, n_samples, lstm_size, vocab_size, prime=\"The \"):\n", + " samples = [c for c in prime]\n", + " model = build_rnn(vocab_size, lstm_size=lstm_size, sampling=True)\n", + " saver = tf.train.Saver()\n", + " with tf.Session() as sess:\n", + " saver.restore(sess, checkpoint)\n", + " new_state = sess.run(model.initial_state)\n", + " for c in prime:\n", + " x = np.zeros((1, 1))\n", + " x[0,0] = vocab_to_int[c]\n", + " feed = {model.inputs: x,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " preds, new_state = sess.run([model.preds, model.final_state], \n", + " feed_dict=feed)\n", + "\n", + " c = pick_top_n(preds, len(vocab))\n", + " samples.append(int_to_vocab[c])\n", + "\n", + " for i in range(n_samples):\n", + " x[0,0] = c\n", + " feed = {model.inputs: x,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " preds, new_state = sess.run([model.preds, model.final_state], \n", + " feed_dict=feed)\n", + "\n", + " c = pick_top_n(preds, len(vocab))\n", + " samples.append(int_to_vocab[c])\n", + " \n", + " return ''.join(samples)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here, pass in the path to a checkpoint and sample from the network." 
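The top-N trick in the `pick_top_n` cell above can be exercised on its own. The sketch below repeats that function and runs it on a toy distribution; the 6-character vocabulary and the probabilities are made up purely for illustration:

```python
import numpy as np

def pick_top_n(preds, vocab_size, top_n=5):
    # Same logic as the cell above: zero out all but the top_n
    # probabilities, renormalize, and draw one character index.
    p = np.squeeze(preds)
    p[np.argsort(p)[:-top_n]] = 0
    p = p / np.sum(p)
    return np.random.choice(vocab_size, 1, p=p)[0]

# Toy distribution over a 6-character vocabulary
preds = np.array([[0.02, 0.40, 0.25, 0.03, 0.20, 0.10]])
c = pick_top_n(preds, vocab_size=6, top_n=3)
# With top_n=3, only indices 1, 2, and 4 (the three largest
# probabilities) can ever be drawn.
```

Restricting sampling to the top N keeps the generated text from degenerating into noise while still allowing some variety; `top_n=1` would make generation fully greedy and deterministic.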
+ ] + }, + { + "cell_type": "code", + "execution_count": 40, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "ename": "NotFoundError", + "evalue": "Unsuccessful TensorSliceReader constructor: Failed to find any matching files for checkpoints/____.ckpt\n\t [[Node: save/RestoreV2_6 = RestoreV2[dtypes=[DT_FLOAT], _device=\"/job:localhost/replica:0/task:0/cpu:0\"](_recv_save/Const_0, save/RestoreV2_6/tensor_names, save/RestoreV2_6/shape_and_slices)]]\n\nCaused by op 'save/RestoreV2_6', defined at:\n File \"/home/spike/.pyenv/versions/3.5.1/lib/python3.5/runpy.py\", line 170, in _run_module_as_main\n \"__main__\", mod_spec)\n File \"/home/spike/.pyenv/versions/3.5.1/lib/python3.5/runpy.py\", line 85, in _run_code\n exec(code, run_globals)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/ipykernel/__main__.py\", line 3, in \n app.launch_new_instance()\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/traitlets/config/application.py\", line 658, in launch_instance\n app.start()\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/ipykernel/kernelapp.py\", line 474, in start\n ioloop.IOLoop.instance().start()\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/zmq/eventloop/ioloop.py\", line 177, in start\n super(ZMQIOLoop, self).start()\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tornado/ioloop.py\", line 887, in start\n handler_func(fd_obj, events)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tornado/stack_context.py\", line 275, in null_wrapper\n return fn(*args, **kwargs)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/zmq/eventloop/zmqstream.py\", line 440, in _handle_events\n self._handle_recv()\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/zmq/eventloop/zmqstream.py\", line 472, in _handle_recv\n 
self._run_callback(callback, msg)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/zmq/eventloop/zmqstream.py\", line 414, in _run_callback\n callback(*args, **kwargs)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tornado/stack_context.py\", line 275, in null_wrapper\n return fn(*args, **kwargs)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/ipykernel/kernelbase.py\", line 276, in dispatcher\n return self.dispatch_shell(stream, msg)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/ipykernel/kernelbase.py\", line 228, in dispatch_shell\n handler(stream, idents, msg)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/ipykernel/kernelbase.py\", line 390, in execute_request\n user_expressions, allow_stdin)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/ipykernel/ipkernel.py\", line 196, in do_execute\n res = shell.run_cell(code, store_history=store_history, silent=silent)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/ipykernel/zmqshell.py\", line 501, in run_cell\n return super(ZMQInteractiveShell, self).run_cell(*args, **kwargs)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/IPython/core/interactiveshell.py\", line 2717, in run_cell\n interactivity=interactivity, compiler=compiler, result=result)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/IPython/core/interactiveshell.py\", line 2821, in run_ast_nodes\n if self.run_code(code, result):\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/IPython/core/interactiveshell.py\", line 2881, in run_code\n exec(code_obj, self.user_global_ns, self.user_ns)\n File \"\", line 2, in \n samp = sample(checkpoint, 2000, lstm_size, len(vocab), prime=\"Far\")\n File \"\", line 4, in sample\n saver = tf.train.Saver()\n File 
\"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/training/saver.py\", line 1040, in __init__\n self.build()\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/training/saver.py\", line 1070, in build\n restore_sequentially=self._restore_sequentially)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/training/saver.py\", line 675, in build\n restore_sequentially, reshape)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/training/saver.py\", line 402, in _AddRestoreOps\n tensors = self.restore_op(filename_tensor, saveable, preferred_shard)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/training/saver.py\", line 242, in restore_op\n [spec.tensor.dtype])[0])\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/ops/gen_io_ops.py\", line 668, in restore_v2\n dtypes=dtypes, name=name)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/framework/op_def_library.py\", line 763, in apply_op\n op_def=op_def)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/framework/ops.py\", line 2327, in create_op\n original_op=self._default_original_op, op_def=op_def)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/framework/ops.py\", line 1226, in __init__\n self._traceback = _extract_stack()\n\nNotFoundError (see above for traceback): Unsuccessful TensorSliceReader constructor: Failed to find any matching files for checkpoints/____.ckpt\n\t [[Node: save/RestoreV2_6 = RestoreV2[dtypes=[DT_FLOAT], _device=\"/job:localhost/replica:0/task:0/cpu:0\"](_recv_save/Const_0, save/RestoreV2_6/tensor_names, save/RestoreV2_6/shape_and_slices)]]\n", + "output_type": "error", + "traceback": [ 
+ "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mNotFoundError\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_do_call\u001b[0;34m(self, fn, *args)\u001b[0m\n\u001b[1;32m 1021\u001b[0m \u001b[0;32mtry\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 1022\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0mfn\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m*\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 1023\u001b[0m \u001b[0;32mexcept\u001b[0m \u001b[0merrors\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mOpError\u001b[0m \u001b[0;32mas\u001b[0m \u001b[0me\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_run_fn\u001b[0;34m(session, feed_dict, fetch_list, target_list, options, run_metadata)\u001b[0m\n\u001b[1;32m 1003\u001b[0m \u001b[0mfeed_dict\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfetch_list\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtarget_list\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 1004\u001b[0;31m status, run_metadata)\n\u001b[0m\u001b[1;32m 1005\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/spike/.pyenv/versions/3.5.1/lib/python3.5/contextlib.py\u001b[0m in \u001b[0;36m__exit__\u001b[0;34m(self, type, value, traceback)\u001b[0m\n\u001b[1;32m 65\u001b[0m \u001b[0;32mtry\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 66\u001b[0;31m \u001b[0mnext\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgen\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 67\u001b[0m 
\u001b[0;32mexcept\u001b[0m \u001b[0mStopIteration\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/framework/errors_impl.py\u001b[0m in \u001b[0;36mraise_exception_on_not_ok_status\u001b[0;34m()\u001b[0m\n\u001b[1;32m 465\u001b[0m \u001b[0mcompat\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mas_text\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mpywrap_tensorflow\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mTF_Message\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstatus\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 466\u001b[0;31m pywrap_tensorflow.TF_GetCode(status))\n\u001b[0m\u001b[1;32m 467\u001b[0m \u001b[0;32mfinally\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mNotFoundError\u001b[0m: Unsuccessful TensorSliceReader constructor: Failed to find any matching files for checkpoints/____.ckpt\n\t [[Node: save/RestoreV2_6 = RestoreV2[dtypes=[DT_FLOAT], _device=\"/job:localhost/replica:0/task:0/cpu:0\"](_recv_save/Const_0, save/RestoreV2_6/tensor_names, save/RestoreV2_6/shape_and_slices)]]", + "\nDuring handling of the above exception, another exception occurred:\n", + "\u001b[0;31mNotFoundError\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0mcheckpoint\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m\"checkpoints/____.ckpt\"\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0msamp\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0msample\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcheckpoint\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;36m2000\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mlstm_size\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mlen\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mvocab\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m 
\u001b[0mprime\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m\"Far\"\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 3\u001b[0m \u001b[0mprint\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msamp\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m\u001b[0m in \u001b[0;36msample\u001b[0;34m(checkpoint, n_samples, lstm_size, vocab_size, prime)\u001b[0m\n\u001b[1;32m 4\u001b[0m \u001b[0msaver\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtf\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mSaver\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 5\u001b[0m \u001b[0;32mwith\u001b[0m \u001b[0mtf\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mSession\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mas\u001b[0m \u001b[0msess\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 6\u001b[0;31m \u001b[0msaver\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mrestore\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msess\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mcheckpoint\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 7\u001b[0m \u001b[0mnew_state\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0msess\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mrun\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmodel\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0minitial_state\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 8\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mc\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mprime\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/training/saver.py\u001b[0m in \u001b[0;36mrestore\u001b[0;34m(self, sess, save_path)\u001b[0m\n\u001b[1;32m 1426\u001b[0m \u001b[0;32mreturn\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1427\u001b[0m 
sess.run(self.saver_def.restore_op_name,\n\u001b[0;32m-> 1428\u001b[0;31m {self.saver_def.filename_tensor_name: save_path})\n\u001b[0m\u001b[1;32m 1429\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1430\u001b[0m \u001b[0;34m@\u001b[0m\u001b[0mstaticmethod\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36mrun\u001b[0;34m(self, fetches, feed_dict, options, run_metadata)\u001b[0m\n\u001b[1;32m 765\u001b[0m \u001b[0;32mtry\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 766\u001b[0m result = self._run(None, fetches, feed_dict, options_ptr,\n\u001b[0;32m--> 767\u001b[0;31m run_metadata_ptr)\n\u001b[0m\u001b[1;32m 768\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mrun_metadata\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 769\u001b[0m \u001b[0mproto_data\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtf_session\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mTF_GetBuffer\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mrun_metadata_ptr\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_run\u001b[0;34m(self, handle, fetches, feed_dict, options, run_metadata)\u001b[0m\n\u001b[1;32m 963\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mfinal_fetches\u001b[0m \u001b[0;32mor\u001b[0m \u001b[0mfinal_targets\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 964\u001b[0m results = self._do_run(handle, final_targets, final_fetches,\n\u001b[0;32m--> 965\u001b[0;31m feed_dict_string, options, run_metadata)\n\u001b[0m\u001b[1;32m 966\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 967\u001b[0m \u001b[0mresults\u001b[0m \u001b[0;34m=\u001b[0m 
\u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_do_run\u001b[0;34m(self, handle, target_list, fetch_list, feed_dict, options, run_metadata)\u001b[0m\n\u001b[1;32m 1013\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mhandle\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1014\u001b[0m return self._do_call(_run_fn, self._session, feed_dict, fetch_list,\n\u001b[0;32m-> 1015\u001b[0;31m target_list, options, run_metadata)\n\u001b[0m\u001b[1;32m 1016\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1017\u001b[0m return self._do_call(_prun_fn, self._session, handle, feed_dict,\n", + "\u001b[0;32m/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_do_call\u001b[0;34m(self, fn, *args)\u001b[0m\n\u001b[1;32m 1033\u001b[0m \u001b[0;32mexcept\u001b[0m \u001b[0mKeyError\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1034\u001b[0m \u001b[0;32mpass\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 1035\u001b[0;31m \u001b[0;32mraise\u001b[0m \u001b[0mtype\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0me\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnode_def\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mop\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mmessage\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 1036\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1037\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_extend_graph\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mNotFoundError\u001b[0m: Unsuccessful TensorSliceReader constructor: Failed to 
find any matching files for checkpoints/____.ckpt\n\t [[Node: save/RestoreV2_6 = RestoreV2[dtypes=[DT_FLOAT], _device=\"/job:localhost/replica:0/task:0/cpu:0\"](_recv_save/Const_0, save/RestoreV2_6/tensor_names, save/RestoreV2_6/shape_and_slices)]]\n\nCaused by op 'save/RestoreV2_6', defined at:\n File \"/home/spike/.pyenv/versions/3.5.1/lib/python3.5/runpy.py\", line 170, in _run_module_as_main\n \"__main__\", mod_spec)\n File \"/home/spike/.pyenv/versions/3.5.1/lib/python3.5/runpy.py\", line 85, in _run_code\n exec(code, run_globals)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/ipykernel/__main__.py\", line 3, in \n app.launch_new_instance()\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/traitlets/config/application.py\", line 658, in launch_instance\n app.start()\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/ipykernel/kernelapp.py\", line 474, in start\n ioloop.IOLoop.instance().start()\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/zmq/eventloop/ioloop.py\", line 177, in start\n super(ZMQIOLoop, self).start()\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tornado/ioloop.py\", line 887, in start\n handler_func(fd_obj, events)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tornado/stack_context.py\", line 275, in null_wrapper\n return fn(*args, **kwargs)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/zmq/eventloop/zmqstream.py\", line 440, in _handle_events\n self._handle_recv()\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/zmq/eventloop/zmqstream.py\", line 472, in _handle_recv\n self._run_callback(callback, msg)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/zmq/eventloop/zmqstream.py\", line 414, in _run_callback\n callback(*args, **kwargs)\n File 
\"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tornado/stack_context.py\", line 275, in null_wrapper\n return fn(*args, **kwargs)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/ipykernel/kernelbase.py\", line 276, in dispatcher\n return self.dispatch_shell(stream, msg)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/ipykernel/kernelbase.py\", line 228, in dispatch_shell\n handler(stream, idents, msg)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/ipykernel/kernelbase.py\", line 390, in execute_request\n user_expressions, allow_stdin)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/ipykernel/ipkernel.py\", line 196, in do_execute\n res = shell.run_cell(code, store_history=store_history, silent=silent)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/ipykernel/zmqshell.py\", line 501, in run_cell\n return super(ZMQInteractiveShell, self).run_cell(*args, **kwargs)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/IPython/core/interactiveshell.py\", line 2717, in run_cell\n interactivity=interactivity, compiler=compiler, result=result)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/IPython/core/interactiveshell.py\", line 2821, in run_ast_nodes\n if self.run_code(code, result):\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/IPython/core/interactiveshell.py\", line 2881, in run_code\n exec(code_obj, self.user_global_ns, self.user_ns)\n File \"\", line 2, in \n samp = sample(checkpoint, 2000, lstm_size, len(vocab), prime=\"Far\")\n File \"\", line 4, in sample\n saver = tf.train.Saver()\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/training/saver.py\", line 1040, in __init__\n self.build()\n File 
\"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/training/saver.py\", line 1070, in build\n restore_sequentially=self._restore_sequentially)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/training/saver.py\", line 675, in build\n restore_sequentially, reshape)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/training/saver.py\", line 402, in _AddRestoreOps\n tensors = self.restore_op(filename_tensor, saveable, preferred_shard)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/training/saver.py\", line 242, in restore_op\n [spec.tensor.dtype])[0])\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/ops/gen_io_ops.py\", line 668, in restore_v2\n dtypes=dtypes, name=name)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/framework/op_def_library.py\", line 763, in apply_op\n op_def=op_def)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/framework/ops.py\", line 2327, in create_op\n original_op=self._default_original_op, op_def=op_def)\n File \"/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/tensorflow/python/framework/ops.py\", line 1226, in __init__\n self._traceback = _extract_stack()\n\nNotFoundError (see above for traceback): Unsuccessful TensorSliceReader constructor: Failed to find any matching files for checkpoints/____.ckpt\n\t [[Node: save/RestoreV2_6 = RestoreV2[dtypes=[DT_FLOAT], _device=\"/job:localhost/replica:0/task:0/cpu:0\"](_recv_save/Const_0, save/RestoreV2_6/tensor_names, save/RestoreV2_6/shape_and_slices)]]\n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/____.ckpt\"\n", + "samp = sample(checkpoint, 2000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + } + ], + "metadata": { + 
"hide_input": false, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.1" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/intro-to-rnns/anna.txt b/intro-to-rnns/anna.txt new file mode 100644 index 0000000..f177bb0 --- /dev/null +++ b/intro-to-rnns/anna.txt @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:568bb39fc78aca98e9db91c4329dcd1aa5ec4c3df3aca2064ce5d6f023ae16c9 +size 2025486 diff --git a/intro-to-rnns/assets/charseq.jpeg b/intro-to-rnns/assets/charseq.jpeg new file mode 100644 index 0000000..c8e1221 Binary files /dev/null and b/intro-to-rnns/assets/charseq.jpeg differ diff --git a/intro-to-tensorflow/.ipynb_checkpoints/intro_to_tensorflow-checkpoint.ipynb b/intro-to-tensorflow/.ipynb_checkpoints/intro_to_tensorflow-checkpoint.ipynb new file mode 100644 index 0000000..dad092f --- /dev/null +++ b/intro-to-tensorflow/.ipynb_checkpoints/intro_to_tensorflow-checkpoint.ipynb @@ -0,0 +1,817 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "

TensorFlow Neural Network Lab
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "In this lab, you'll use all the tools you learned from *Introduction to TensorFlow* to label images of English letters! The data you are using, notMNIST, consists of images of the letters A through J in different fonts.\n", + "\n", + "The above images are a few examples of the data you'll be training on. After training the network, you will compare your prediction model against test data. Your goal, by the end of this lab, is to make predictions against that test set with at least 80% accuracy. Let's jump in!" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To start this lab, you first need to import all the necessary modules. Run the code below. If it runs successfully, it will print \"`All modules imported`\"." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "All modules imported.\n" + ] + } + ], + "source": [ + "import hashlib\n", + "import os\n", + "import pickle\n", + "from urllib.request import urlretrieve\n", + "\n", + "import numpy as np\n", + "from PIL import Image\n", + "from sklearn.model_selection import train_test_split\n", + "from sklearn.preprocessing import LabelBinarizer\n", + "from sklearn.utils import resample\n", + "from tqdm import tqdm\n", + "from zipfile import ZipFile\n", + "\n", + "print('All modules imported.')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The notMNIST dataset is too large for many computers to handle. It contains 500,000 images for just training. You'll be using a subset of this data, 15,000 images for each label (A-J)." 
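One of the modules imported above, `sklearn.utils.resample`, is what the lab later uses to draw a smaller random subset while keeping the feature and label arrays aligned. A minimal sketch of that pattern, using made-up toy arrays in place of the real notMNIST data:

```python
import numpy as np
from sklearn.utils import resample

# Toy stand-ins for the real feature/label arrays
features = np.arange(20).reshape(10, 2)   # 10 "images", 2 values each
labels = np.array(list('ABABABABAB'))     # one label per image

# Draw a fixed-size random subset; rows of both arrays stay paired
sub_features, sub_labels = resample(features, labels,
                                    n_samples=4, random_state=0)
```

Note that `resample` samples with replacement by default (a bootstrap); pass `replace=False` if you want a plain subset without duplicates.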
+ ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Downloading notMNIST_train.zip...\n", + "Download Finished\n", + "Downloading notMNIST_test.zip...\n", + "Download Finished\n", + "All files downloaded.\n" + ] + } + ], + "source": [ + "def download(url, file):\n", + " \"\"\"\n", + " Download file from <url>\n", + " :param url: URL to file\n", + " :param file: Local file path\n", + " \"\"\"\n", + " if not os.path.isfile(file):\n", + " print('Downloading ' + file + '...')\n", + " urlretrieve(url, file)\n", + " print('Download Finished')\n", + "\n", + "# Download the training and test dataset.\n", + "download('https://s3.amazonaws.com/udacity-sdc/notMNIST_train.zip', 'notMNIST_train.zip')\n", + "download('https://s3.amazonaws.com/udacity-sdc/notMNIST_test.zip', 'notMNIST_test.zip')\n", + "\n", + "# Make sure the files aren't corrupted\n", + "assert hashlib.md5(open('notMNIST_train.zip', 'rb').read()).hexdigest() == 'c8673b3f28f489e9cdf3a3d74e2ac8fa',\\\n", + " 'notMNIST_train.zip file is corrupted. Remove the file and try again.'\n", + "assert hashlib.md5(open('notMNIST_test.zip', 'rb').read()).hexdigest() == '5d3c7e653e63471c88df796156a9dfa9',\\\n", + " 'notMNIST_test.zip file is corrupted. 
Remove the file and try again.'\n", + "\n", + "# Wait until you see that all files have been downloaded.\n", + "print('All files downloaded.')" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "100%|██████████| 210001/210001 [00:26<00:00, 7964.58files/s]\n", + "100%|██████████| 10001/10001 [00:01<00:00, 8404.50files/s]\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "All features and labels uncompressed.\n" + ] + } + ], + "source": [ + "def uncompress_features_labels(file):\n", + " \"\"\"\n", + " Uncompress features and labels from a zip file\n", + " :param file: The zip file to extract the data from\n", + " \"\"\"\n", + " features = []\n", + " labels = []\n", + "\n", + " with ZipFile(file) as zipf:\n", + " # Progress Bar\n", + " filenames_pbar = tqdm(zipf.namelist(), unit='files')\n", + " \n", + " # Get features and labels from all files\n", + " for filename in filenames_pbar:\n", + " # Check if the file is a directory\n", + " if not filename.endswith('/'):\n", + " with zipf.open(filename) as image_file:\n", + " image = Image.open(image_file)\n", + " image.load()\n", + " # Load image data as 1 dimensional array\n", + " # We're using float32 to save on memory space\n", + " feature = np.array(image, dtype=np.float32).flatten()\n", + "\n", + " # Get the letter from the filename. 
This is the letter of the image.\n", + " label = os.path.split(filename)[1][0]\n", + "\n", + " features.append(feature)\n", + " labels.append(label)\n", + " return np.array(features), np.array(labels)\n", + "\n", + "# Get the features and labels from the zip files\n", + "train_features, train_labels = uncompress_features_labels('notMNIST_train.zip')\n", + "test_features, test_labels = uncompress_features_labels('notMNIST_test.zip')\n", + "\n", + "# Limit the amount of data so it fits within a Docker container\n", + "docker_size_limit = 150000\n", + "train_features, train_labels = resample(train_features, train_labels, n_samples=docker_size_limit)\n", + "\n", + "# Set flags for feature engineering. This will prevent you from skipping an important step.\n", + "is_features_normal = False\n", + "is_labels_encod = False\n", + "\n", + "# Wait until you see that all features and labels have been uncompressed.\n", + "print('All features and labels uncompressed.')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## Problem 1\n", + "The first problem involves normalizing the features for your training and test data.\n", + "\n", + "Implement Min-Max scaling in the `normalize_grayscale()` function to a range of `a=0.1` and `b=0.9`. 
After scaling, the values of the pixels in the input data should range from 0.1 to 0.9.\n", + "\n", + "Since the raw notMNIST image data is in [grayscale](https://en.wikipedia.org/wiki/Grayscale), the current values range from a min of 0 to a max of 255.\n", + "\n", + "Min-Max Scaling:\n", + "$\n", + "X'=a+{\\frac {\\left(X-X_{\\min }\\right)\\left(b-a\\right)}{X_{\\max }-X_{\\min }}}\n", + "$\n", + "\n", + "*If you're having trouble solving problem 1, you can view the solution [here](https://github.com/udacity/deep-learning/blob/master/intro-to-tensorFlow/intro_to_tensorflow_solution.ipynb).*" + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "0.9" + ] + }, + "execution_count": 37, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "t = train_features[0]\n", + "t_ = 0.1 + ((t-t.min())*(0.9 - 0.1))/(t.max()-t.min())\n", + "0.1 + (255*0.8/255)" + ] + }, + { + "cell_type": "code", + "execution_count": 40, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed!\n" + ] + } + ], + "source": [ + "# Problem 1 - Implement Min-Max scaling for grayscale image data\n", + "def normalize_grayscale(image_data):\n", + " \"\"\"\n", + " Normalize the image data with Min-Max scaling to a range of [0.1, 0.9]\n", + " :param image_data: The image data to be normalized\n", + " :return: Normalized image data\n", + " \"\"\"\n", + " t = image_data\n", + " # TODO: Implement Min-Max scaling for grayscale image data\n", + " return 0.1 + ((t-t.min())*(0.9 - 0.1))/(t.max()-t.min())\n", + "\n", + "\n", + "### DON'T MODIFY ANYTHING BELOW ###\n", + "# Test Cases\n", + "np.testing.assert_array_almost_equal(\n", + " normalize_grayscale(np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 255])),\n", + " [0.1, 0.103137254902, 0.106274509804, 0.109411764706, 0.112549019608, 0.11568627451, 
0.118823529412, 0.121960784314,\n", + " 0.125098039216, 0.128235294118, 0.13137254902, 0.9],\n", + " decimal=3)\n", + "np.testing.assert_array_almost_equal(\n", + " normalize_grayscale(np.array([0, 1, 10, 20, 30, 40, 233, 244, 254,255])),\n", + " [0.1, 0.103137254902, 0.13137254902, 0.162745098039, 0.194117647059, 0.225490196078, 0.830980392157, 0.865490196078,\n", + " 0.896862745098, 0.9])\n", + "\n", + "if not is_features_normal:\n", + " train_features = normalize_grayscale(train_features)\n", + " test_features = normalize_grayscale(test_features)\n", + " is_features_normal = True\n", + "\n", + "print('Tests Passed!')" + ] + }, + { + "cell_type": "code", + "execution_count": 45, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Labels One-Hot Encoded\n" + ] + } + ], + "source": [ + "if not is_labels_encod:\n", + " # Turn labels into numbers and apply One-Hot Encoding\n", + " encoder = LabelBinarizer()\n", + " encoder.fit(train_labels)\n", + " train_labels = encoder.transform(train_labels)\n", + " test_labels = encoder.transform(test_labels)\n", + "\n", + " # Change to float32, so it can be multiplied against the features in TensorFlow, which are float32\n", + " train_labels = train_labels.astype(np.float32)\n", + " test_labels = test_labels.astype(np.float32)\n", + " is_labels_encod = True\n", + "\n", + "print('Labels One-Hot Encoded')" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training features and labels randomized and split.\n" + ] + } + ], + "source": [ + "assert is_features_normal, 'You skipped the step to normalize the features'\n", + "assert is_labels_encod, 'You skipped the step to One-Hot Encode the labels'\n", + "\n", + "# Get randomized datasets for training and validation\n", + "train_features, valid_features, train_labels, valid_labels = 
train_test_split(\n", + " train_features,\n", + " train_labels,\n", + " test_size=0.05,\n", + " random_state=832289)\n", + "\n", + "print('Training features and labels randomized and split.')" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Saving data to pickle file...\n", + "Data cached in pickle file.\n" + ] + } + ], + "source": [ + "# Save the data for easy access\n", + "pickle_file = 'notMNIST.pickle'\n", + "if not os.path.isfile(pickle_file):\n", + " print('Saving data to pickle file...')\n", + " try:\n", + " with open(pickle_file, 'wb') as pfile:\n", + " pickle.dump(\n", + " {\n", + " 'train_dataset': train_features,\n", + " 'train_labels': train_labels,\n", + " 'valid_dataset': valid_features,\n", + " 'valid_labels': valid_labels,\n", + " 'test_dataset': test_features,\n", + " 'test_labels': test_labels,\n", + " },\n", + " pfile, pickle.HIGHEST_PROTOCOL)\n", + " except Exception as e:\n", + " print('Unable to save data to', pickle_file, ':', e)\n", + " raise\n", + "\n", + "print('Data cached in pickle file.')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Checkpoint\n", + "All your progress is now saved to the pickle file. If you need to leave and come back to this lab, you no longer have to start from the beginning. Just run the code block below and it will load all the data and modules required to proceed." 
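The caching pattern used above (dump once behind an `os.path.isfile` guard, reload on return) can be sketched in miniature. This is a toy round trip, not the notebook's real data: the temp path and the zero arrays are placeholders.

```python
import os
import pickle
import tempfile

import numpy as np

# Hypothetical stand-ins for the notebook's datasets.
data = {
    'train_dataset': np.zeros((4, 784), dtype=np.float32),
    'train_labels': np.zeros((4, 10), dtype=np.float32),
}

path = os.path.join(tempfile.mkdtemp(), 'demo.pickle')

# Save once; on later runs the guard skips the (expensive) dump.
if not os.path.isfile(path):
    with open(path, 'wb') as pfile:
        pickle.dump(data, pfile, pickle.HIGHEST_PROTOCOL)

# Reload and confirm the round trip preserved shapes and dtypes.
with open(path, 'rb') as f:
    restored = pickle.load(f)

assert restored['train_dataset'].shape == (4, 784)
assert restored['train_labels'].dtype == np.float32
```

`pickle.HIGHEST_PROTOCOL` matters here because the default protocol is slower and produces larger files for big NumPy arrays.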
+ ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Data and modules loaded.\n" + ] + } + ], + "source": [ + "%matplotlib inline\n", + "\n", + "# Load the modules\n", + "import pickle\n", + "import math\n", + "\n", + "import numpy as np\n", + "import tensorflow as tf\n", + "from tqdm import tqdm\n", + "import matplotlib.pyplot as plt\n", + "\n", + "# Reload the data\n", + "pickle_file = 'notMNIST.pickle'\n", + "with open(pickle_file, 'rb') as f:\n", + " pickle_data = pickle.load(f)\n", + " train_features = pickle_data['train_dataset']\n", + " train_labels = pickle_data['train_labels']\n", + " valid_features = pickle_data['valid_dataset']\n", + " valid_labels = pickle_data['valid_labels']\n", + " test_features = pickle_data['test_dataset']\n", + " test_labels = pickle_data['test_labels']\n", + " del pickle_data # Free up memory\n", + "\n", + "print('Data and modules loaded.')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## Problem 2\n", + "\n", + "Now it's time to build a simple neural network using TensorFlow. Here, your network will be just an input layer and an output layer.\n", + "\n", + "\n", + "\n", + "For the input, the images have been flattened into a vector of $28 \\times 28 = 784$ features. Then, since we're trying to predict which letter an image shows (notMNIST covers the ten letters A through J), there are 10 output units, one for each label. Of course, feel free to add hidden layers if you want, but this notebook is built to guide you through a single-layer network. 
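In plain NumPy, the single-layer network described above is just a matrix multiply plus a bias followed by a softmax. The sketch below only illustrates the shapes involved; it is not the TensorFlow code this problem asks for, and the random weights and batch size of 32 are arbitrary stand-ins.

```python
import numpy as np

n_features, n_labels = 784, 10

rng = np.random.RandomState(0)
x = rng.rand(32, n_features).astype(np.float32)   # a batch of 32 flattened images
W = rng.randn(n_features, n_labels).astype(np.float32) * 0.1  # stand-in for tf.truncated_normal
b = np.zeros(n_labels, dtype=np.float32)          # stand-in for tf.zeros

# Linear layer: one score per class for each image in the batch.
logits = x @ W + b

# Row-wise softmax turns the scores into class probabilities.
exp = np.exp(logits - logits.max(axis=1, keepdims=True))
probs = exp / exp.sum(axis=1, keepdims=True)

assert logits.shape == (32, n_labels)
assert np.allclose(probs.sum(axis=1), 1.0)
```

Subtracting the row maximum before exponentiating keeps the softmax numerically stable; TensorFlow's `tf.nn.softmax` does the equivalent internally.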
\n", + "\n", + "For the neural network to train on your data, you need the following float32 tensors:\n", + " - `features`\n", + " - Placeholder tensor for feature data (`train_features`/`valid_features`/`test_features`)\n", + " - `labels`\n", + " - Placeholder tensor for label data (`train_labels`/`valid_labels`/`test_labels`)\n", + " - `weights`\n", + " - Variable Tensor with random numbers from a truncated normal distribution.\n", + " - See `tf.truncated_normal()` documentation for help.\n", + " - `biases`\n", + " - Variable Tensor with all zeros.\n", + " - See `tf.zeros()` documentation for help.\n", + "\n", + "*If you're having trouble solving problem 2, review \"TensorFlow Linear Function\" section of the class. If that doesn't help, the solution for this problem is available [here](intro_to_tensorflow_solution.ipynb).*" + ] + }, + { + "cell_type": "code", + "execution_count": 159, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed!\n" + ] + } + ], + "source": [ + "# All the pixels in the image (28 * 28 = 784)\n", + "features_count = 784\n", + "# All the labels\n", + "labels_count = 10\n", + "\n", + "# TODO: Set the features and labels tensors\n", + "features = tf.placeholder(tf.float32, [None, features_count])\n", + "labels = tf.placeholder(tf.float32, [None, labels_count])\n", + "\n", + "# TODO: Set the weights and biases tensors\n", + "weights = tf.Variable(tf.truncated_normal([features_count, labels_count]))\n", + "biases = tf.Variable(tf.zeros([labels_count]))\n", + "\n", + "\n", + "\n", + "### DON'T MODIFY ANYTHING BELOW ###\n", + "\n", + "#Test Cases\n", + "from tensorflow.python.ops.variables import Variable\n", + "\n", + "assert features._op.name.startswith('Placeholder'), 'features must be a placeholder'\n", + "assert labels._op.name.startswith('Placeholder'), 'labels must be a placeholder'\n", + "assert isinstance(weights, Variable), 'weights must be a TensorFlow 
variable'\n", + "assert isinstance(biases, Variable), 'biases must be a TensorFlow variable'\n", + "\n", + "assert features._shape == None or (\\\n", + " features._shape.dims[0].value is None and\\\n", + " features._shape.dims[1].value in [None, 784]), 'The shape of features is incorrect'\n", + "assert labels._shape == None or (\\\n", + " labels._shape.dims[0].value is None and\\\n", + " labels._shape.dims[1].value in [None, 10]), 'The shape of labels is incorrect'\n", + "assert weights._variable._shape == (784, 10), 'The shape of weights is incorrect'\n", + "assert biases._variable._shape == (10), 'The shape of biases is incorrect'\n", + "\n", + "assert features._dtype == tf.float32, 'features must be type float32'\n", + "assert labels._dtype == tf.float32, 'labels must be type float32'\n", + "\n", + "# Feed dicts for training, validation, and test session\n", + "train_feed_dict = {features: train_features, labels: train_labels}\n", + "valid_feed_dict = {features: valid_features, labels: valid_labels}\n", + "test_feed_dict = {features: test_features, labels: test_labels}\n", + "\n", + "# Linear Function WX + b\n", + "logits = tf.matmul(features, weights) + biases\n", + "\n", + "prediction = tf.nn.softmax(logits)\n", + "\n", + "# Cross entropy\n", + "cross_entropy = -tf.reduce_sum(labels * tf.log(prediction), reduction_indices=1)\n", + "\n", + "# Training loss\n", + "loss = tf.reduce_mean(cross_entropy)\n", + "\n", + "\n", + "optimizer = tf.train.AdamOptimizer(0.003).minimize(loss) \n", + "\n", + "# Create an operation that initializes all variables\n", + "init = tf.global_variables_initializer()\n", + "\n", + "# Test Cases\n", + "with tf.Session() as session:\n", + " session.run(init)\n", + " session.run(loss, feed_dict=train_feed_dict)\n", + " session.run(loss, feed_dict=valid_feed_dict)\n", + " session.run(loss, feed_dict=test_feed_dict)\n", + " biases_data = session.run(biases)\n", + "\n", + "assert not np.count_nonzero(biases_data), 'biases must be zeros'\n", 
+ "\n", + "print('Tests Passed!')" + ] + }, + { + "cell_type": "code", + "execution_count": 160, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Accuracy function created.\n" + ] + } + ], + "source": [ + "# Determine if the predictions are correct\n", + "is_correct_prediction = tf.equal(tf.argmax(prediction, 1), tf.argmax(labels, 1))\n", + "# Calculate the accuracy of the predictions\n", + "accuracy = tf.reduce_mean(tf.cast(is_correct_prediction, tf.float32))\n", + "\n", + "print('Accuracy function created.')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## Problem 3\n", + "Below are two parameter configurations for training the neural network. In each configuration, one of the parameters has multiple options. For each configuration, choose the option that gives the best accuracy.\n", + "\n", + "Parameter configurations:\n", + "\n", + "Configuration 1\n", + "* **Epochs:** 1\n", + "* **Learning Rate:**\n", + " * 0.8\n", + " * 0.5\n", + " * 0.1\n", + " * 0.05\n", + " * 0.01\n", + "\n", + "Configuration 2\n", + "* **Epochs:**\n", + " * 1\n", + " * 2\n", + " * 3\n", + " * 4\n", + " * 5\n", + "* **Learning Rate:** 0.2\n", + "\n", + "The code will print out a Loss and Accuracy graph, so you can see how well the neural network performed.\n", + "\n", + "*If you're having trouble solving problem 3, you can view the solution [here](intro_to_tensorflow_solution.ipynb).*" + ] + }, + { + "cell_type": "code", + "execution_count": 161, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Epoch 1/5: 100%|██████████| 557/557 [00:02<00:00, 246.49batches/s]\n", + "Epoch 2/5: 100%|██████████| 557/557 [00:02<00:00, 256.03batches/s]\n", + "Epoch 3/5: 100%|██████████| 557/557 [00:02<00:00, 252.19batches/s]\n", + "Epoch 4/5: 100%|██████████| 557/557 [00:02<00:00, 241.44batches/s]\n", + "Epoch 5/5: 
100%|██████████| 557/557 [00:02<00:00, 238.71batches/s]\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAk4AAAGGCAYAAACNCg6xAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAPYQAAD2EBqD+naQAAIABJREFUeJzs3Xl8VNX9//HXBwhLAgn7voMguCDEfQFHtFrXWlsVtdpa\nf9Uu1trF9vvtYrVfbautS61bVVyq0lqxivsCBpcq1kQRZJFlwk5CWAIkIZDk8/vjTMIQkjCQZRJ4\nPx+P+5jMvWfuPXMY5r7n3HPvNXdHRERERPasVbIrICIiItJSKDiJiIiIJEjBSURERCRBCk4iIiIi\nCVJwEhEREUmQgpOIiIhIghScRERERBKk4CQiIiKSIAUnERERkQQpOImIiIgkSMFJRJqEmV1hZhVm\nNi7ZdRER2VcKTiLSlHRzTBFp0RScRERERBKk4CQizYaZ9TCzR8xsrZmVmNmnZnZ5DeUuNrOPzWyz\nmRWa2Wdm9sO45W3M7EYz+yK2ngIze9fMJjbtOxKR/U2bZFdARATAzNoDWcAw4B4gF/g68JiZZbj7\nPbFypwFPA28CN8RePgo4DvhL7PlNwC+AvwH/BdKBI4FxwPTGfzcisr9ScBKR5uJq4GDgUnf/B4CZ\nPQC8A/yfmU129yLgTGCTu59ex7rOBF529+82dqVF5MCiQ3Ui0lx8GVhbGZoA3L2c0IvUEZgQm70J\n6GhmdQWnTcAhZja8sSorIgcmBScRaS4GAYtqmD8fsNhygPuAL4BXzGxFbExU9RD1G6Az8EVs/NMf\nzeywxqq4iBw4FJxEpLmwRAq5+zrgCOBc4AXgZOBVM3s0rsy7hLFS3wLmAFcBOWZ2ZQPXWUQOMApO\nItJc5AIH1TB/VOxxWeUMdy9z95fd/QfuPgx4ELjczIbGldnk7o+7+6XAAOAz4LeNVXkROTAoOIlI\nc/EK0NvMLqqcYWatgWuBLcDM2LyuNbx2TuyxXU1l3L0YWFy5XERkX+msOhFpSgZ828y+XMOyuwln\n1j1mZkey83IExwHXxc6oA3g4FoxmACuBwcAPgE/dfX6szDwzywKygQ3AUcDX2Hm5AhGRfWLuugOC\niDQ+M7sCmFxHkQHAduAPwDmEay8tBP7s7n+PW8/5wHcI45w6A2sJvVU3uXt+rMz/EMZAjSD0Mi0D\nngD+FDtTT0Rknyg4iYiIiCSo3mOczOx/zOyj2K0P8szs32Y2olqZrNhd0SuncjO7r77bFhEREWlK\nDTE4/CTC7RGOAU4FUoA3zKxDXBkn3PqgF9Ab6MPOWyWIiIiItAj1Hhzu7mfGPzezbwL5QCbwXtyi\n4tj1V0RERERapMa4HEFnQg/ThmrzLzWzdWY2x8xurdYjJSIiItLsNejgcDMz4EWgk7tPiJt/FeGs\nltXA4cBtwCx3/1qDbVxERESkkTV0cLofOB04wd3X1FEuArwFDHf3aA3Lu8XWkwtsa7AKioiIyP6m\nPeF6bq+7+/rG3liDXQDTzP4KnAmcVFdoiplFuBDecGC34EQITU81VN1ERERkv3cp8HRjb6RBglMs\nNJ0HTHD35Qm8ZCxhHFRtASsX4Mknn2TUqFG1FJG6XH/99dx5553JrsZ+TW3c+NTGjU9t3PjUxo1r\n/vz5XHbZZRDLDo2t3sEpdj2mSYSr9BaZWa/YokJ33xa76eYlhCv7rgfGAHcAM919bi2r3QYwatQo\nxo0bV98qHpAyMjLUdo1Mbdz41MaNT23c+NTGTaZJhvY0RI/TNYTeo6xq879FuMXBdsL1na4D0oAV\nwL+AWxpg2yIiIiJNpiGu41TnJQ3cfSVwcn23IyIiIpJsjXEdJxEREZH9koLTfmrSpEnJrsJ+T23c\n+NTGjU9t3PjUxvuXBr2OU0Mxs3FA
dnZ2tgbUiYiISK1ycnLIzMwEyHT3nMbennqcRERERBKk4CQi\nIiKSIAUnERERkQQpOImIiIgkSMFJREREJEEKTiIiIiIJUnASERERSZCCk4iIiEiCFJxEREREEqTg\nJCIiIpIgBScRERGRBCk4iYiIiCRIwUlEREQkQfUOTmb2P2b2kZltNrM8M/u3mY2oVqadmd1rZgVm\ntsXMnjWznvXdtoiIiEhTaogep5OAe4BjgFOBFOANM+sQV+Yu4CzgAmA80BeY2gDbFhEREWkybeq7\nAnc/M/65mX0TyAcygffMLB24ErjY3WfGynwLmG9mR7v7R/Wtg4iIiEhTaIwxTp0BBzbEnmcSAtr0\nygLuvhBYDhzXCNsXERERaRQNGpzMzAiH5d5z93mx2b2B7e6+uVrxvNiyWlV4RUNWT0RERKRe6n2o\nrpr7gNHAiQmUNULPVK02lGyoa7GIiIhIk2qw4GRmfwXOBE5y99Vxi9YCbc0svVqvU09Cr1OtfvHT\nX/DXnn/dZd6kSZOYNGlSA9VaREREWoopU6YwZcqUXeYVFhY2aR3Mvc5On8RWEkLTecAEd19abVk6\nsI4wOPzfsXkjgAXAsTUNDjezcUD27c/ezk8v+Gm96yciIiL7p5ycHDIzMwEy3T2nsbdX7x4nM7sP\nmAScCxSZWa/YokJ33+bum83sEeAOM9sIbAH+Ary/pzPq8rfm17d6IiIiIg2mIQ7VXUMYq5RVbf63\ngCdif18PlAPPAu2A14Dv72nFeUV1HskTERERaVINcR2nPZ6Z5+6lwLWxKWH5xepxEhERkeajWd+r\nLm+repxERESk+WjWwSm/SD1OIiIi0nw06+CUV5RHQ5z1JyIiItIQmnVwKisvo6C4INnVEBEREQGa\neXACWLl5ZbKrICIiIgIoOImIiIgkrFkHp9atWis4iYiISLPRrINTj9QeCk4iIiLSbDTr4NSzY09W\nblFwEhERkeahWQenXmm91OMkIiIizUazDk49U3sqOImIiEiz0ayDU69OocdJF8EUERGR5qB5B6e0\nXhTvKGbTtk3JroqIiIhI8w9OoGs5iYiISPPQrINTz7SegIKTiIiINA/NOjh1S+1GK2vFis0rkl0V\nERERkeYdnNq0akOfjn3U4yQiIiLNQr2Dk5mdZGbTzGyVmVWY2bnVlj8amx8/vZLo+vun91dwEhER\nkWahIXqc0oBPge8DtV034FWgF9A7Nk1KdOUKTiIiItJctKnvCtz9NeA1ADOzWoqVuvu6fVl///T+\nvLHkjX2tnoiIiEiDaaoxTiebWZ6ZLTCz+8ysa6IvVI+TiIiINBdNEZxeBS4HTgFuACYAr9TRO7WL\n/un92bJ9C5tLNzdiFUVERET2rN6H6vbE3Z+Je/q5mc0BlgAnA2/v6fX90/sD4VpOo3uMbowqioiI\niCSk0YNTde4eNbMCYDh7CE7XX389KakpsBS+/c636ZHWg0mTJjFpUsJjy0VERGQ/MWXKFKZMmbLL\nvMLCwiatgzXkDXTNrAL4irtPq6NMf2AZcJ67v1RLmXFAdnZ2NoeOOZR2/9eOR859hCvHXtlgdRUR\nEZGWLycnh8zMTIBMd89p7O3Vu8fJzNIIvUeVY5aGmtkYYENsuhGYCqyNlfsj8AXweiLrb9u6Lb3S\nemmAuIiIiCRdQxyqO5JwyM1j059j8x8HvgccThgc3hlYTQhMv3H3HYluYEDGAAUnERERSbqGuI7T\nTOo+O++M+m5DlyQQERGR5qBZ36uuUv9OCk4iIiKSfC0jOKnHSURERJqBFhOcNm7bSNH2omRXRURE\nRA5gLSY4AazasirJNREREZEDWYsKTjpcJyIiIsnUIoJTv/R+gIKTiIiIJFeLCE7t27Sne2p3BScR\nERFJqhYRnCAcrltRuCLZ1RAREZEDWIsKTiu3qMdJREREkqflBCddBFNERESSrOUEJ10EU0RERJKs\n
RQWnguICtpVtS3ZVRERE5ADVYoLTgIwBAKzarItgioiISHK0mOCki2CKiIhIsrWY4NSvky6CKSIi\nIsnVYoJTWts0urTvouAkIiIiSVPv4GRmJ5nZNDNbZWYVZnZuDWVuNrPVZlZsZm+a2fB92ZbOrBMR\nEZFkaogepzTgU+D7gFdfaGY/B34AXA0cDRQBr5tZ273dkC6CKSIiIsnUpr4rcPfXgNcAzMxqKHId\n8Dt3fzFW5nIgD/gK8MzebKt/en8+WftJ/SosIiIiso8adYyTmQ0BegPTK+e5+2ZgFnDc3q5Ph+pE\nREQkmRp7cHhvwuG7vGrz82LL9kr/9P7kbc1je/n2hqibiIiIyF5J1ll1Rg3jofakf3p/HGfNljWN\nUCURERGRutV7jNMerCWEpF7s2uvUE9jjYKXrr7+ejIyMqudbtm+BjuFaToM6D2rouoqIiEgzNmXK\nFKZMmbLLvMLCwiatQ6MGJ3ePmtlaYCLwGYCZpQPHAPfu6fV33nkn48aNq3q+uXQzGX/I0DgnERGR\nA9CkSZOYNGnSLvNycnLIzMxssjrUOziZWRownNCzBDDUzMYAG9x9BXAX8CszWwzkAr8DVgIv7O22\n0tul06ltJwUnERERSYqG6HE6EnibMGbJgT/H5j8OXOnut5lZKvAg0Bl4F/iyu+/TCG+dWSciIiLJ\n0hDXcZrJHgaZu/tvgd/Wd1sQgtOKzSsaYlUiIiIie6XF3Kuu0oD0AepxEhERkaRoccFJh+pEREQk\nWVpkcFqzdQ1lFWXJroqIiIgcYFpkcKrwCtZuXZvsqoiIiMgBpkUGJ0CH60RERKTJKTiJiIiIJKjF\nBafO7TuTmpKq4CQiIiJNrsUFJzPTmXUiIiKSFC0uOIEuSSAiIiLJoeAkIiIikqCWGZw6KTiJiIhI\n02uZwSm9P6u2rKLCK5JdFRERETmAtNjgVFZRxpota5JdFRERETmAtMjgdFS/o2jfpj0P5zyc7KqI\niIjIAaRFBqfeHXvzvSO/x58/+DPri9cnuzoiIiJygGiRwQngFyf+Ase57f3bkl0VEREROUC02ODU\nI60H1x97Pfd8dI/GOomIiEiTaPTgZGY3mllFtWleQ6z7J8f9hPZt2nPLu7c0xOpERERE6tRUPU5z\ngV5A79h0YkOsNKN9BjeccAN/y/4buZtyG2KVIiIiIrVqquBU5u7r3D0/Nm1oqBVfe/S1dO3QlZtn\n3txQqxQRERGpUVMFp4PMbJWZLTGzJ81sQEOtOK1tGv970v/y+OzHWViwsKFWKyIiIrKbpghOHwLf\nBE4HrgGGAO+YWVpDbeDqzKvp16kfv8n6TUOtUkRERGQ35u5Nu0GzDGAZcL27P1pLmXFA9vjx48nI\nyNhl2aRJk5g0adJur3kk5xGuevEqPrn6E47ofUQj1FxERESSacqUKUyZMmWXeYWFhbzzzjsAme6e\n09h1aPLgBGBmHwFvuvsva1k+DsjOzs5m3LhxCa2zrKKM0feOZmT3kbw46cUGrK2IiIg0Vzk5OWRm\nZkITBacmv46TmXUEhgENevGlNq3acNPJN/HSFy/x4coPG3LVIiIiIkDTXMfpdjMbb2aDzOx44N9A\nGTBlDy/daxcdehGH9TyMX86osSNLREREpF6aosepP/A0sAD4B7AOONbdG/wmc62sFb+L/I4Z0RlM\nXzq9oVcvIiIiB7g2jb0Bd999JHcjOnfkuRzd72h+OeOXnDLkFMysKTcvIiIi+7EWe6+62pgZt5xy\nC7NWzeKlL15KdnVERERkP7LfBSeAiUMmcvLgk/nF9F+wdOPSZFdHRERE9hP7ZXAyM/78pT9TUFzA\niHtGcMXzV7CgYEGyqyUiIiIt3H4ZnADG9RlH9Lood5x+B9OXTmf0vaO5+NmL+Szvs2RXTURERFqo\n/TY4AaSmpPLDY37Ikh8u4f6z7mfWqlmMeWAMX/nHV/h49cfJrp
6IiIi0MPt1cKrUrk07rj7yar74\nwRc8dt5jzC+Yz1EPHcWXn/oyryx6hcUbFrOtbFuyqykiIiLNXKNfjqA5SWmdwhVHXMFlh1/Gv+b9\ni1vevYWznj6ranmP1B70T+/PgIwB9O8Ue0zvz8huIxnbZyxtWh1QzSUiIiLVHJBJoHWr1lx86MVc\neMiFLN6wmBWFK1i5eSUrNu98fHf5u6zcvJKN2zYC0KltJ8YPGk9kcITIkAhjeo2hdavWdW6nZEcJ\nn6/7nNlrZzMnfw4D0gdw1biryGifUefrREREpHk6IINTpVbWihHdRjCi24hay2zdvpU5eXN4O/dt\n3s59m1+//WtK3iyhS/sujB80nlOGnEJkcIQuHbowe+1sPsv7jNl5s5mdN5sv1n9BhVdgGMO6DmPZ\npmXc/M7NXJN5Ddcdex19O/VtwnfbdMoqyli9ZTXLNi1jWeEyUlql8NVRXyWldUqyqyYiIlIv5u7J\nrsNuzGwckJ2dnc24ceOSXZ1dlJaVMmvVLN6OhiD1wcoP2F6+vWp5ert0xvQaw+G9DmdMrzGM6T2G\nQ3seSmpKKqs2r+Ivs/7CA9kPULKjhMsOv4yfHv9TRvcYncR3tO8WrV/E27lvk7spl+WFy1lWuIzl\nhctZtXkV5V6+S9nBnQfzq5N+xeVjLleAEhGRBpOTk0NmZiZAprvnNPb2FJzqqWRHCR+s/ICt27dy\neK/DGZQxaI+3eSncVshDOQ9x54d3snrLas4ecTY3HH8DJw48ca9uEePu5Bfls3Tj0qppycYlLC9c\nTlrbNHqn9aZXx170SutV9di7Y5iX0S5jr29H4+7MzZ/L1PlTeW7+c8zJn0Nra02/9H4MzBjIoIxB\nuz52Do/RjVF+987v+Ne8fylAiYhIg1JwomUFp/rYXr6dp+c8ze3/uZ156+ZxbP9jOb7/8TiOu9f4\nWOEVrN6yuiooFe0oqlpfj9QeDOs6jIEZAyneUczarWvJ25pHXlHeLr1iAO3btGdEtxGM7jGa0d1H\nh8ceoxnedfgugcbd+Xj1xzw3/zmmzp/Kog2LSG+XzjkjzuGro77KGcPPIDUlNaH3OydvDje/czPP\nznuWIZ2H8Kvxv+Ibh3+j2QSo8opylm5cyoKCBRza81CGdBmS7Co1KndnWeEyPl37KZ+s+YRP8z6l\nY9uOXHrYpXxp2Jd0MoSItAgKThw4walShVfw6qJXuWvWXazcvBIj9ASZGYbt9tinUx+Gdh7K0C5D\nGdZ1GEO7DGVI5yF0atepxvW7O4WlheRtzQthqiiP1VtWs7BgIfMK5vF5/uesL1kPQJtWbaoCVbcO\n3Xh18assL1xOtw7dOG/keVww+gImDplIuzbt9vn9zsmbw00zb2Lq/KlVAeriQy+moLiANVvWsHrL\natZsjT1uWVP1d7mX0z21e5g6dN/5d2p3eqT1oHtqdzLaZdAhpQOpKamkpqSS0iplt541d2fl5pXM\nzZ8bpnXhcd66ebtcluK4/sdxyWGX8PXRX6dXx177/H5rsrFkIwvXL2T1ltUMyhjEyO4j6di2Y4Nu\nI15ZRRkLChbwyZpP+GTtJ3y69lM+Xftp1ckPPVJ7MLbPWFZtXsXn6z6nZ1pPLjn0Ei4fczlH9D5C\nN8tuRvK25pG9Jpt+nfpxeK/D9W8jBzwFJw684NQcrCtax+frPmfeunlV05qtazhl8ClcMPoCxg8a\n3+A9EJ/lfcbNM29m6vypuy1r06oNvTv2pm+nvvTp2Ic+HfvQplUb1pesp6C4oGpaV7yuzmtwtbbW\nVSEqNSWV9m3as2rLKjaXbgYgLSWNQ3seWjUd1vMwhncdzvsr3mfK3Cm8tvg1KryCiUMmcslhl3D+\nwecnfFbkjvIdRDdFWViwkIXrF7KgYAEL1y9kYcFC1hWv26185aUvDu5+8M7H7iPpn96fVpb4JdfK\nK8pZuH4hH6/+uGr6dO2nlJ
SVADC0y1DG9h7L2N5jOaL3EYztM5Y+HftgZrg7n679lCdmP8HTc58m\nvyifQ3ocwuVjLufSwy6lX3q/GrdZvKO4qnczvyifdq3b0TOtJ7069qJHao8G61XcUrqFRRsW8cX6\nL1i0fhGrt6zmmP7HcMbwM+jdsXeDbKM52bp9Kzlrcvho1UfMWjWLj1Z9xPLC5VXL+3Xqx5kHncmZ\nB53JqUNPbdTwLYmr7M2dtXIWc/PnMqrHKE4Zcsp++RltDhScUHA60HyW9xk5a3Lo3bE3fTr2oW+n\nvnRL7ZZwWCjeURxCVNE6NpdupqSshOIdxbVOvTv2rgpKAzMG1rmd9cXreXbes0yZO4V3lr1D29Zt\nOWvEWXx99Ndp17od64rXkV+Uz7qideQX5+/8uyifguKCqkHyHdt2ZGS3kYzsPjI8xv7u16kfuZty\nq0LVgoIFLChYwKINi6oOr7Zr3W6X3rXuqd3pkdpjl+dlFWVkr8nm49Ufk7Mmp+oQ7ohuIziy75Ec\n2edIxvUZxxG9j0g4+JVVlPHGkjd4YvYTvLDwBUrLSpk4dCIHdT2IvKK8qqCUtzWPLdu31Lmurh26\nhiCV1oueaT3pmdaTtJQ02rVpR9vWbWnXOvbYpl3V361btWZ54fIQkmJhae3WtVXr7JHagx5pPZi/\nbj6Ok9knk7MOOoszDzqTo/odtVdhs8IrWLNlDcsLl+86bd75d8mOEkZ0G8GoHqMY1T1MB3c/mBHd\nRtSrBxZCW68oXMGiDYtYtH4Rn679lI9Wf8Tc/LlUeAWpKakc2fdIju57NEf3O5rMvpks3biUVxa9\nwsuLXuaL9V/QtnVbxg8aX9UG8WcL7yjfUfVZzdsawm1+UT4bSjbQM60nQ7oMqeq5TmubVq/3ArC5\ndDPLNi2rOmlk2aZlFO8o5uh+R3PiwBMZ3Hlwo/SUuTtbtm+htbVukPeRqMJthfx39X+ZtXIWs1aF\nKb8oH4CeaT2r/j6kxyFMHDKRiUMnMmHQBF2apoEoOKHgJM3Tys0r+efcf/L03KfJWRP+b7ayVnTr\n0K0qDPRM60mP1B5VvS3Duw5nZLeR9O3Ud692FOUV5VWBasnGJbv0slWfdlTsAEJPUmVIOrJvCEoN\n9cVcuK2QZ+c9y1NznmJDyYZdTzaIO/mgV8cQjLaXb99lJ13ZE1X5mF+UT/GOYkrLStlevp3S8tKq\nv+PPyOzUtlPVJUMO6nrQzr+7HUTn9p2B0Fv62uLXeGXxK7y2+DU2bdtEj9QenDH8DM486ExOGngS\nm0s3s2rLKlZvWV01xT9fs2VNVTtWbrfy5IaB6QMZmDGQdm3asbBgIfML5jO/YD4FxQVA+AwM7TKU\nUd1HMShjEGlt0+jYtiNpKWmktU3b7bGy12zxhsUs3rCYRRsWEd0Yrdp+m1ZtGNV9FMf0O4aj+x3N\nMf2PYXSP0XX2+C7esJhXFr3CK4teISs3i9LyUoZ0HkK7Nu2qAlJ1ndp2okuHLuQX5e/Sa9szrSdD\nOu8MUgMzBgKwo2IHO8p3sL18e9XflY9FO4pYsXlFCEqbllFYWli1vjat2jAgfQAprVP4Yv0XAPTt\n1JeTBp7EiQNP5KSBJ3Foz0PrvC5eWUUZG0o2UFBcQH5R/i7/jpWH8iun4h3FAPTu2JthXYYxrOsw\nhncZzrCuwxjWZRjDuw6na4euex3cSnaUVIXoyjOIo5uiZK/OZkHBAhwnvV16+Dfrd0yY+h9Dz7Se\nrN26lhnRGcyIzmB6dDq5m3JpZa04su+RTBwykRMGnEBpeWnVD8DK3vT4x6LtRQzIGBDeU5dhVUM1\nhnUZxoCMAXV+PsoryinaUUTR9iIKSwurxr9WDt3Y5XFrHimtUxjbeyyZfTLJ7JvJuD7j6JnWc6/a\nq7ptZduqfmyt3bqWtVvXkl+Uv8v/tUEZg/bp32a/DU5m9n3gp0BvYDZwrbv/t5ayCk71NGXK
FCZN\nmpTsauy3Vm9ZzYvPvshVV1y1xwuhNqbKX9gVXlEVJFq68opySstLKaso46WpL3HJJZck/NqyijI+\nXPlhVYiYnTd7tzJdO3Slb6e+9O3Ul36d+lX9XXk26MCMgQkFzoLiAuavm8+CggVVYWrl5pUUbS+q\n2kkV7yjG2f07NqVVCkO7DGV41+Ec1PUghncdHv7udhADMwbW67B40fYiZkRn8NbSt2jdqnVVoI/v\n7euZ1pMOKR0AeOrpp4icEyG6MUp0U5SlG5fufNwYZeXmlQC0bd2WlNYppLRKIaV1Snge+7tDmw70\nT++/2xm1gzIG0btj76r/I+uL1/OfFf/hveXv8e7yd/l49cfsqNhBert0jh9wPCO7jWTjto0UFBew\nvjgcll9fsp5N2zbt9j47te1U9W9XeUi/b6e+9OnUhx3lO1iycQmLNyxmycYlLNmwZJfD4xntMuiR\n1oP2bdrToU0H2rdpH/5O6VA1r13rdqwvWV8Vkip7jQAMC5+ZzoM4vOfhHNM/BKWR3UfW2NNZ/ft4\n6calVSFqRnRG1borf4hVjtmM71lOTUll2aZl4f3Ezp6u8AoghNNBGYPo06kPJTtKqj5/lY+l5aU1\nflbat2m/y4+fyjOxi3cUk7Mmh5w1OVUhuH96fzL7hBCV2SeT7qnd2Vy6mcLSQgq3FVb9vbl0M4Xb\nCiksLaSguKAqlFX/NzSM7qnd2bJ9yy7BPS0lrer/YeXUqW2nXXqjq/+dOy+Xb531LdifgpOZXQQ8\nDnwH+Ai4Hvg6MMLdC2oor+BUT+eeey7Tpk1LdjX2a2rjxlffNl65eSUfr/6Y7qndq3aulYGhKbg7\nJWUlu+zEUlNSGZgxMKmBO96e2tjdG20AesmOEv67+r9VQWrZpmV07dC1Kix069AtPKZ2q3reI60H\nfTv13evxXJtLN7Nkw84wtbFkI9vKtlFSVrLL47aybZTsCH937dC1xjDYL70fbVu3TXjbdbWxu7O8\ncDmd2nWic/vOCR9i3lG+g2WFy1iyIQSppRuXkleUR2qb1N16Oju27Vj1d3q79KrL0nRq26nOf1t3\nZ+nGpWSvySZnTU7VY009mGkpaWS0zyC9XToZ7cJj99Tu9O7YuyqcVf3dsRfdU7vTplUb3J11xet2\nObQb37O3onAFRTuKKC0r3e36gFVWA38Dmig4NdX5xtcDD7r7EwBmdg1wFnAlcFsT1UFEDjD90/vT\nP71/0rZvZlUnJvSgR9LqUR+NedZeh5QOjB80nvGDxjfaNiqlt0tnbJ+xjO0zttG3tTfMjEGdB+31\n61Jap1T1VDYWs3DXi2Fdh3HhIRcCOwe+F24rrApK6e3S97mX1MyqekGP6ndUnWXLK8p3O7RfWl7K\nJzmfcOHwQ8D6AAAgAElEQVTfLtyn7e+LRg9OZpYCZAK3Vs5zdzezt4DjGnv7IiIi0jDMjMGdBydl\n261btaZDqw679Rpv7rq5SeuR+Gkn+6470BrIqzY/jzDeSURERKRFSOalgQ1qGDUZtAeYP39+09Vm\nP1NYWEhOTqMf6j2gqY0bn9q48amNG5/auHHFZYX2TbG9Rh8cHjtUVwxc4O7T4uY/BmS4+/k1vOYS\n4KlGrZiIiIjsTy5196cbeyON3uPk7jvMLBuYCEwDsDDacCLwl1pe9jpwKZAL1H5ZaBERETnQtQcG\nE7JDo2uqyxFcSLgcwdXsvBzB14CD3X33e0+IiIiINENNMsbJ3Z8xs+7AzUAv4FPgdIUmERERaUma\n5S1XRERERJqjprgcgYiIiMh+QcFJREREJEEKTi2Emd1oZhXVpnlxy9uZ2b1mVmBmW8zsWTPrWW0d\nA8zsZTMrMrO1ZnabWYI3RtoPmdlJZjbNzFbF2vPcGsrcbGarzazYzN40s+HVlncxs6fMrNDMNprZ\nw2aWVq3M4Wb2jpmVmNkyM/tZY7+35mJPbWxmj9bwuX6l
Whm1cS3M7H/M7CMz22xmeWb2bzMbUa1M\ng3w3mNnJZpZtZtvM7Aszu6Ip3mOyJdjGWdU+w+Vmdl+1MmrjWpjZNWY2O/Z/vNDM/mNmZ8Qtb1af\n4QN2p9lCzSUMru8dm06MW3YX4f5/FwDjgb7A1MqFsQ/QK4QTAo4FrgC+SRiwf6BKI5yo8H1quBir\nmf0c+AHhbNCjgSLgdTOLv7vn08AowuU1ziK0/YNx6+hEOEU2CowDfgb81syuaoT30xzV2cYxr7Lr\n53pSteVq49qdBNwDHAOcCqQAb5hZ/D0p6v3dYGaDgZeA6cAY4G7gYTM7rVHeVfOSSBs74TazlZ/j\nPsANlQvVxnu0Avg54fZsmcAM4AUzGxVb3rw+w+6uqQVMwI1ATi3L0oFS4Py4eSOBCuDo2PMvAzuA\n7nFlrgY2Am2S/f6SPcXa6txq81YD11dr5xLgwtjzUbHXjY0rczpQBvSOPf8uUBDfxsDvgXnJfs/N\npI0fBZ6r4zUHq433qo27x9rrxNjzBvluAP4IfFZtW1OAV5L9npPdxrF5bwN31PEatfHet/N64FvN\n8TOsHqeW5aDYIY8lZvakmQ2Izc8kJO3plQXdfSGwnJ03Uj4WmOPuBXHrex3IAA5p/Kq3LGY2hPDL\nMb5NNwOz2LVNN7r7J3EvfYvw6/OYuDLvuHtZXJnXgZFmltFI1W9pTo4dAllgZveZWde4ZcehNt4b\nnQltsyH2vKG+G44ltDvVyhyIN2qv3saVLjWzdWY2x8xurdYjpTZOkJm1MrOLgVTgA5rhZ1jBqeX4\nkND1eDpwDTAEeCc21qM3sD22Y48XfyPl3tR8o2XQzZZr0pvw5VjXzal7A/nxC929nPCFqnZPzKvA\n5cAphEMbE4BXzMxiy9XGCYq12V3Ae+5eOf6xob4baiuTbmbt6lv3lqKWNoZwi7DLgJOBW4FvAH+P\nW6423gMzO9TMthB6l+4j9DAtoBl+hpN5k1/ZC+4efyn5uWb2EbAMuJDab0tT142Ud1l9Pat3IEmk\nTfdUpjIUHPDt7u7PxD393MzmAEsIO6C363ip2nh39wGj2XXsY20a4rvhQG7jE+JnuvvDcU8/N7O1\nwHQzG+Lu0T2sU20cLCCMPepMGMv0hJmNr6N80j7D6nFqody9EPgCGA6sBdqaWXq1Yj3ZmbDXEgYu\nxqt8Xj2FS2gvY/c2q96m1c/saA10iS2rLFPTOkDtvpvYTqaA8LkGtXFCzOyvwJnAye6+Om5Rfb8b\n9tTGm919e33q3lJUa+M1eyg+K/YY/zlWG9fB3cvcfam757j7L4HZwHU0w8+wglMLZWYdgWGEAczZ\nhMGyE+OWjwAGAv+JzfoAOMzCrW8qfQkoBOK7nIWqHfhadm3TdMK4mvg27WxmY+NeOpEQuD6KKzM+\ntrOv9CVgYSz8Shwz6w90Ayp3TGrjPYjt0M8DIu6+vNri+n43zI8rM5FdfSk2f7+3hzauyVhCL0b8\n51htvHdaAe1ojp/hZI+c15TwGQa3E07DHAQcD7xJSNvdYsvvI5yOfTJhMN37wLtxr29FSPCvAocT\nxkrlAb9L9ntLYpumEbqGjyCcofGj2PMBseU3EM7sOAc4DHgeWAS0jVvHK8DHwFGE7vuFwN/jlqcT\nwu3jhC7+i4CtwLeT/f6T3caxZbcRwuggwpfax4QvuhS1cULtex/hzKGTCL+mK6f21crU67uBcOf5\nrYQzk0YC3wO2A6cmuw2S3cbAUOBXhEthDALOBRYDM9TGCbfxLYRDzIOAQwlnxZYBpzTHz3DSG0xT\nwh+sKcBKwunwywnXthkSt7wd4VojBcAW4F9Az2rrGEC4jsXW2Ifqj0CrZL+3JLbpBMLOvLzaNDmu\nzG8JO+ViwhkYw6utozPwJOGXzUbgISC1WpnDgJmxdSwHfprs994c2hhoD7xG6NnbBiwF7gd6qI0T\nbt+a2rYcuDyuTIN8
N8T+LbNj30GLgG8k+/03hzYG+gNZwLrY528hYcffUW2ccBs/HPv/XxL7PniD\nWGiKLW9Wn2Hd5FdEREQkQRrjJCIiIpIgBScRERGRBCk4iYiIiCRIwUlEREQkQQpOIiIiIglScBIR\nERFJkIKTiIiISIIUnEREREQSpOAkIiIikiAFJxEREZEEKTiJiIiIJEjBSURERCRBCk4iIiIiCVJw\nEhEREUmQgpOIiIhIghScRERERBKk4CQiIiKSIAUnERERkQQpOInIHpnZ98yswsw+SHZdRESSydw9\n2XUQkWbOzN4D+gCDgYPcfWlyayQikhzqcRKROpnZEOB44MdAAXBpcmtUMzNLTXYdRGT/p+AkInty\nKbAReBl4lhqCkwXXmdlnZlZiZvlm9qqZjatW7jIzm2VmRWa2wcxmmtlpccsrzOw3Naw/18wmxz2/\nIlZ2vJndZ2Z5wIrYsoGxeQvMrNjMCszsGTMbVMN6M8zsTjOLmtk2M1thZo+bWVczSzOzrWZ2Zw2v\n62tmZWb2871qSRFp8dokuwIi0uxdAjzr7mVmNgW4xswy3T07rsxk4ApCuHqI8N1yEnAskANgZjcC\nNwLvA78GtgPHABHgzT3UobYxBfcB+cBNQFps3lGx7U4BVhIOL34PeNvMRrv7tlh90oD3gJHAI8An\nQHfgXKC/u39mZv8GLjKzH/uu4xoqw+OTe6i3iOxnFJxEpFZmlgkcDHwfwN3fM7NVhOCQHSsTIYSm\nu9z9x3EvvzNuPcMIYWmqu389rsxf61nFAmBitVDzkrtPrfY+XgQ+BC4AnorNvgEYDZzv7tPiit8a\n9/cThOB4GvBG3PxLgXfcfVU96y8iLYwO1YlIXS4F1gJZcfP+CVxsZhZ7fgFQAdxcx3rOB2wPZfaW\nAw9VC024e2nl32bWxsy6AksJhxvjDx1+FZhdLTRV9xawhrjDk2Z2CHA48Pd6vwMRaXEUnESkRmbW\nCrgIeBsYambDYj1HHwG9gYmxokOB1e6+qY7VDSWEq/kNXM3c6jPMrL2Z3Wxmy4FSQq9UPtAZyIgr\nOgyYW9fKY6HsKeArZtY+NvsyYBthvJeIHGAUnESkNqcQLkFwMbAobvonobenshfGanz1rhIpU5fW\ntcwvqWHeX4H/Af4BfJ1wmO1UYAP79p33BNAJ+Ers+SRgmrtv2Yd1iUgLpzFOIlKby4A8wsDq6sHn\nAuB8M7sGWAycZmad6+h1WkwILaOBz+rY5kZCz1AVM0shBLhEXQA85u43xK2jXfX1AkuAQ/e0Mnf/\n3Mw+AS6Nje8aSGzMl4gceNTjJCK7iR2WOh940d3/7e7PxU+EXp10whloUwnfJTfWscrnCb1Uv4kb\nG1WTJcD4avOuofYep5qUs/t32w9rWMdUYIyZnZfAOv8OnA78iHDo77W9qI+I7EfU4yQiNTmPcHiq\ntoHTHwLrgEvd/Stm9nfgh2Y2ghAqWhEuRzDD3e9z9yVmdgvwK+BdM3uOMP7oKGCVu/8ytt6HgQfM\n7FnCJQrGAF+Kbau62gLYS8A3zGwzMA84jjAeq6BauduBrwH/MrNHCWcJdgPOAa529zlxZZ8CbiMc\nrrvP3ctr2baI7OcUnESkJpcAxYSzynbj7m5mLwOXmFkX4JvAbODbhIBRCHwM/CfuNTea2VLgWuD/\nYuv/jDCGqNJDhOsufZvQw/MOYYzSdHa/llNt13b6IVAWew/tCddqOhV4Pf417l5kZicSrgF1PnA5\nYRD5W4TrP8W/33Vm9gbwZXTtJpED2l7fq87MTgJ+BmQSxh18ZQ+n82JmJwN/Bg4BlgO3uPvj+1Jh\nEZFkiPWSHeruI5JdFxFJnn0Z45QGfEoYHLnH1GVmgwld59MJ3e53Aw/H32ZBRKQ5M7M+wFns2jsm\nIgegve5x2uXFZhXsocfJzP4IfNndD4+bNwXIcPcz93njIiKNLPbD70TgKkIv+zB3z0
", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Validation accuracy at 0.7875999808311462\n" + ] + } + ], + "source": [ + "# Change if you have memory restrictions\n", + "batch_size = 256\n", + "\n", + "# TODO: Find the best parameters for each configuration\n", + "epochs = 5\n", + "#learning_rate = 0.01\n", + " \n", + "\n", + "### DON'T MODIFY ANYTHING BELOW ###\n", + "# Gradient Descent\n", + "#optimizer = tf.train.AdamOptimizer().minimize(loss) \n", + "#with tf.Session() as sess:\n", + "# sess.run(init)\n", + "\n", + "# The accuracy measured against the validation set\n", + "validation_accuracy = 0.0\n", + "\n", + "# Measurements used for graphing loss and accuracy\n", + "log_batch_step = 50\n", + "batches = []\n", + "loss_batch = []\n", + "train_acc_batch = []\n", + "valid_acc_batch = []\n", + "\n", + "with tf.Session() as session:\n", + " session.run(init)\n", + " batch_count = int(math.ceil(len(train_features)/batch_size))\n", + "\n", + " for epoch_i in range(epochs):\n", + " \n", + " # Progress bar\n", + " batches_pbar = tqdm(range(batch_count), desc='Epoch {:>2}/{}'.format(epoch_i+1, epochs), unit='batches')\n", + " \n", + " # The 
training cycle\n", + " for batch_i in batches_pbar:\n", + " # Get a batch of training features and labels\n", + " batch_start = batch_i*batch_size\n", + " batch_features = train_features[batch_start:batch_start + batch_size]\n", + " batch_labels = train_labels[batch_start:batch_start + batch_size]\n", + "\n", + " # Run optimizer and get loss\n", + " _, l = session.run(\n", + " [optimizer, loss],\n", + " feed_dict={features: batch_features, labels: batch_labels})\n", + "\n", + " # Log every 50 batches\n", + " if not batch_i % log_batch_step:\n", + " # Calculate Training and Validation accuracy\n", + " training_accuracy = session.run(accuracy, feed_dict=train_feed_dict)\n", + " validation_accuracy = session.run(accuracy, feed_dict=valid_feed_dict)\n", + "\n", + " # Log batches\n", + " previous_batch = batches[-1] if batches else 0\n", + " batches.append(log_batch_step + previous_batch)\n", + " loss_batch.append(l)\n", + " train_acc_batch.append(training_accuracy)\n", + " valid_acc_batch.append(validation_accuracy)\n", + "\n", + " # Check accuracy against Validation data\n", + " validation_accuracy = session.run(accuracy, feed_dict=valid_feed_dict)\n", + "\n", + "loss_plot = plt.subplot(211)\n", + "loss_plot.set_title('Loss')\n", + "loss_plot.plot(batches, loss_batch, 'g')\n", + "loss_plot.set_xlim([batches[0], batches[-1]])\n", + "acc_plot = plt.subplot(212)\n", + "acc_plot.set_title('Accuracy')\n", + "acc_plot.plot(batches, train_acc_batch, 'r', label='Training Accuracy')\n", + "acc_plot.plot(batches, valid_acc_batch, 'x', label='Validation Accuracy')\n", + "acc_plot.set_ylim([0, 1.0])\n", + "acc_plot.set_xlim([batches[0], batches[-1]])\n", + "acc_plot.legend(loc=4)\n", + "plt.tight_layout()\n", + "plt.show()\n", + "\n", + "print('Validation accuracy at {}'.format(validation_accuracy))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Test\n", + "You're going to test your model against your hold out dataset/testing data. 
This will give you a good indicator of how well the model will do in the real world. You should have a test accuracy of at least 80%." + ] + }, + { + "cell_type": "code", + "execution_count": 162, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Epoch 1/5: 100%|██████████| 557/557 [00:00<00:00, 1103.36batches/s]\n", + "Epoch 2/5: 100%|██████████| 557/557 [00:00<00:00, 1257.45batches/s]\n", + "Epoch 3/5: 100%|██████████| 557/557 [00:00<00:00, 1494.07batches/s]\n", + "Epoch 4/5: 100%|██████████| 557/557 [00:00<00:00, 1577.33batches/s]\n", + "Epoch 5/5: 100%|██████████| 557/557 [00:00<00:00, 1609.31batches/s]" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Nice Job! Test Accuracy is 0.855400025844574\n" + ] + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "\n" + ] + } + ], + "source": [ + "### DON'T MODIFY ANYTHING BELOW ###\n", + "# The accuracy measured against the test set\n", + "test_accuracy = 0.0\n", + "\n", + "with tf.Session() as session:\n", + " \n", + " session.run(init)\n", + " batch_count = int(math.ceil(len(train_features)/batch_size))\n", + "\n", + " for epoch_i in range(epochs):\n", + " \n", + " # Progress bar\n", + " batches_pbar = tqdm(range(batch_count), desc='Epoch {:>2}/{}'.format(epoch_i+1, epochs), unit='batches')\n", + " \n", + " # The training cycle\n", + " for batch_i in batches_pbar:\n", + " # Get a batch of training features and labels\n", + " batch_start = batch_i*batch_size\n", + " batch_features = train_features[batch_start:batch_start + batch_size]\n", + " batch_labels = train_labels[batch_start:batch_start + batch_size]\n", + "\n", + " # Run optimizer\n", + " _ = session.run(optimizer, feed_dict={features: batch_features, labels: batch_labels})\n", + "\n", + " # Check accuracy against Test data\n", + " test_accuracy = session.run(accuracy, feed_dict=test_feed_dict)\n", + "\n", + "\n", + "assert test_accuracy >= 
0.80, 'Test accuracy at {}, should be equal to or greater than 0.80'.format(test_accuracy)\n", + "print('Nice Job! Test Accuracy is {}'.format(test_accuracy))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Multiple layers\n", + "Good job! You built a one layer TensorFlow network! However, you might want to build more than one layer. This is deep learning after all! In the next section, you will start to satisfy your need for more layers." + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "hide_input": false, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.1" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} diff --git a/intro-to-tensorflow/environment.yml b/intro-to-tensorflow/environment.yml new file mode 100644 index 0000000..a20e1fe --- /dev/null +++ b/intro-to-tensorflow/environment.yml @@ -0,0 +1,59 @@ +name: dlnd-tf-lab +dependencies: +- openssl=1.0.2j +- pip>=8.1.2 +- psutil=4.4.1 +- python>=3.4.0 +- readline=6.2 +- setuptools=27.2.0 +- sqlite=3.13.0 +- tk=8.5.18 +- wheel=0.29.0 +- xz=5.2.2 +- zlib=1.2.8 +- pip: + - appnope==0.1.0 + - cycler==0.10.0 + - decorator==4.0.10 + - entrypoints==0.2.2 + - ipykernel==4.5.0 + - ipython==5.1.0 + - ipython-genutils==0.1.0 + - ipywidgets==5.2.2 + - jinja2==2.8 + - jsonschema==2.5.1 + - jupyter==1.0.0 + - jupyter-client==4.4.0 + - jupyter-console==5.0.0 + - jupyter-core==4.2.0 + - markupsafe==0.23 + - matplotlib==1.5.3 + - mistune==0.7.3 + - nbconvert==4.2.0 + - nbformat==4.1.0 + - notebook==4.2.3 + - numpy==1.11.2 + - pexpect==4.2.1 + - pickleshare==0.7.4 + - pillow==3.4.2 + - prompt-toolkit==1.0.8 + - protobuf==3.1.0.post1 + - ptyprocess==0.5.1 + - pygments==2.1.3 + - pyparsing==2.1.10 + - 
python-dateutil==2.5.3 + - pytz==2016.7 + - pyzmq==16.0.0 + - qtconsole==4.2.1 + - scikit-learn==0.18 + - scipy==0.18.1 + - simplegeneric==0.8.1 + - six==1.10.0 + - sklearn==0.0 + - tensorflow>=0.12.1 + - terminado==0.6 + - tornado==4.4.2 + - tqdm==4.8.4 + - traitlets==4.3.1 + - wcwidth==0.1.7 + - widgetsnbextension==1.2.6 diff --git a/intro-to-tensorflow/environment_win.yml b/intro-to-tensorflow/environment_win.yml new file mode 100644 index 0000000..21c8827 --- /dev/null +++ b/intro-to-tensorflow/environment_win.yml @@ -0,0 +1,80 @@ +name: dlnd-tf-lab +channels: !!python/tuple +- defaults +dependencies: +- bleach=1.5.0=py35_0 +- bzip2=1.0.6=vc14_3 +- colorama=0.3.7=py35_0 +- cycler=0.10.0=py35_0 +- decorator=4.0.11=py35_0 +- entrypoints=0.2.2=py35_1 +- freetype=2.5.5=vc14_2 +- html5lib=0.999=py35_0 +- icu=57.1=vc14_0 +- ipykernel=4.5.2=py35_0 +- ipython=5.2.2=py35_0 +- ipython_genutils=0.1.0=py35_0 +- ipywidgets=5.2.2=py35_1 +- jinja2=2.9.4=py35_0 +- jpeg=9b=vc14_0 +- jsonschema=2.5.1=py35_0 +- jupyter=1.0.0=py35_3 +- jupyter_client=4.4.0=py35_0 +- jupyter_console=5.0.0=py35_0 +- jupyter_core=4.3.0=py35_0 +- libpng=1.6.27=vc14_0 +- libtiff=4.0.6=vc14_3 +- markupsafe=0.23=py35_2 +- matplotlib=2.0.0=np112py35_0 +- mistune=0.7.3=py35_0 +- mkl=2017.0.1=0 +- nbconvert=5.1.1=py35_0 +- nbformat=4.2.0=py35_0 +- notebook=4.3.1=py35_1 +- numpy=1.12.0=py35_0 +- olefile=0.44=py35_0 +- openssl=1.0.2k=vc14_0 +- pandas=0.19.2=np112py35_1 +- pandocfilters=1.4.1=py35_0 +- path.py=10.1=py35_0 +- pickleshare=0.7.4=py35_0 +- pillow=4.0.0=py35_1 +- pip=9.0.1=py35_1 +- prompt_toolkit=1.0.9=py35_0 +- pygments=2.1.3=py35_0 +- pyparsing=2.1.4=py35_0 +- pyqt=5.6.0=py35_2 +- python=3.5.2=0 +- python-dateutil=2.6.0=py35_0 +- pytz=2016.10=py35_0 +- pyzmq=16.0.2=py35_0 +- qt=5.6.2=vc14_3 +- qtconsole=4.2.1=py35_2 +- scikit-learn=0.18.1=np112py35_1 +- scipy=0.18.1=np112py35_1 +- setuptools=27.2.0=py35_1 +- simplegeneric=0.8.1=py35_1 +- sip=4.18=py35_0 +- six=1.10.0=py35_0 +- 
testpath=0.3=py35_0 +- tk=8.5.18=vc14_0 +- tornado=4.4.2=py35_0 +- traitlets=4.3.1=py35_0 +- vs2015_runtime=14.0.25123=0 +- wcwidth=0.1.7=py35_0 +- wheel=0.29.0=py35_0 +- widgetsnbextension=1.2.6=py35_0 +- win_unicode_console=0.5=py35_0 +- zlib=1.2.8=vc14_3 +- pip: + - ipython-genutils==0.1.0 + - jupyter-client==4.4.0 + - jupyter-console==5.0.0 + - jupyter-core==4.3.0 + - prompt-toolkit==1.0.9 + - protobuf==3.2.0 + - tensorflow==1.0.0 + - tqdm==4.11.2 + - win-unicode-console==0.5 +prefix: C:\Users\Mat\Anaconda3\envs\dlnd-tf-lab + diff --git a/intro-to-tensorflow/image/Learn Rate Tune - Image.png b/intro-to-tensorflow/image/Learn Rate Tune - Image.png new file mode 100644 index 0000000..b18c844 Binary files /dev/null and b/intro-to-tensorflow/image/Learn Rate Tune - Image.png differ diff --git a/intro-to-tensorflow/image/Mean Variance - Image.png b/intro-to-tensorflow/image/Mean Variance - Image.png new file mode 100644 index 0000000..c6d0e77 Binary files /dev/null and b/intro-to-tensorflow/image/Mean Variance - Image.png differ diff --git a/intro-to-tensorflow/image/network_diagram.png b/intro-to-tensorflow/image/network_diagram.png new file mode 100644 index 0000000..8d72fc0 Binary files /dev/null and b/intro-to-tensorflow/image/network_diagram.png differ diff --git a/intro-to-tensorflow/image/notmnist.png b/intro-to-tensorflow/image/notmnist.png new file mode 100644 index 0000000..85922e4 Binary files /dev/null and b/intro-to-tensorflow/image/notmnist.png differ diff --git a/intro-to-tensorflow/intro_to_tensorflow.ipynb b/intro-to-tensorflow/intro_to_tensorflow.ipynb new file mode 100644 index 0000000..dad092f --- /dev/null +++ b/intro-to-tensorflow/intro_to_tensorflow.ipynb @@ -0,0 +1,817 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "
TensorFlow Neural Network Lab
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "In this lab, you'll use all the tools you learned from *Introduction to TensorFlow* to label images of English letters! The data you are using, notMNIST, consists of images of a letter from A to J in differents font.\n", + "\n", + "The above images are a few examples of the data you'll be training on. After training the network, you will compare your prediction model against test data. Your goal, by the end of this lab, is to make predictions against that test set with at least an 80% accuracy. Let's jump in!" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To start this lab, you first need to import all the necessary modules. Run the code below. If it runs successfully, it will print \"`All modules imported`\"." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "All modules imported.\n" + ] + } + ], + "source": [ + "import hashlib\n", + "import os\n", + "import pickle\n", + "from urllib.request import urlretrieve\n", + "\n", + "import numpy as np\n", + "from PIL import Image\n", + "from sklearn.model_selection import train_test_split\n", + "from sklearn.preprocessing import LabelBinarizer\n", + "from sklearn.utils import resample\n", + "from tqdm import tqdm\n", + "from zipfile import ZipFile\n", + "\n", + "print('All modules imported.')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The notMNIST dataset is too large for many computers to handle. It contains 500,000 images for just training. You'll be using a subset of this data, 15,000 images for each label (A-J)." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Downloading notMNIST_train.zip...\n", + "Download Finished\n", + "Downloading notMNIST_test.zip...\n", + "Download Finished\n", + "All files downloaded.\n" + ] + } + ], + "source": [ + "def download(url, file):\n", + " \"\"\"\n", + " Download a file from a URL\n", + " :param url: URL to file\n", + " :param file: Local file path\n", + " \"\"\"\n", + " if not os.path.isfile(file):\n", + " print('Downloading ' + file + '...')\n", + " urlretrieve(url, file)\n", + " print('Download Finished')\n", + "\n", + "# Download the training and test dataset.\n", + "download('https://s3.amazonaws.com/udacity-sdc/notMNIST_train.zip', 'notMNIST_train.zip')\n", + "download('https://s3.amazonaws.com/udacity-sdc/notMNIST_test.zip', 'notMNIST_test.zip')\n", + "\n", + "# Make sure the files aren't corrupted\n", + "assert hashlib.md5(open('notMNIST_train.zip', 'rb').read()).hexdigest() == 'c8673b3f28f489e9cdf3a3d74e2ac8fa',\\\n", + " 'notMNIST_train.zip file is corrupted. Remove the file and try again.'\n", + "assert hashlib.md5(open('notMNIST_test.zip', 'rb').read()).hexdigest() == '5d3c7e653e63471c88df796156a9dfa9',\\\n", + " 'notMNIST_test.zip file is corrupted. 
Remove the file and try again.'\n", + "\n", + "# Wait until you see that all files have been downloaded.\n", + "print('All files downloaded.')" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "100%|██████████| 210001/210001 [00:26<00:00, 7964.58files/s]\n", + "100%|██████████| 10001/10001 [00:01<00:00, 8404.50files/s]\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "All features and labels uncompressed.\n" + ] + } + ], + "source": [ + "def uncompress_features_labels(file):\n", + " \"\"\"\n", + " Uncompress features and labels from a zip file\n", + " :param file: The zip file to extract the data from\n", + " \"\"\"\n", + " features = []\n", + " labels = []\n", + "\n", + " with ZipFile(file) as zipf:\n", + " # Progress Bar\n", + " filenames_pbar = tqdm(zipf.namelist(), unit='files')\n", + " \n", + " # Get features and labels from all files\n", + " for filename in filenames_pbar:\n", + " # Check if the file is a directory\n", + " if not filename.endswith('/'):\n", + " with zipf.open(filename) as image_file:\n", + " image = Image.open(image_file)\n", + " image.load()\n", + " # Load image data as 1 dimensional array\n", + " # We're using float32 to save on memory space\n", + " feature = np.array(image, dtype=np.float32).flatten()\n", + "\n", + " # Get the letter from the filename. 
This is the letter of the image.\n", + " label = os.path.split(filename)[1][0]\n", + "\n", + " features.append(feature)\n", + " labels.append(label)\n", + " return np.array(features), np.array(labels)\n", + "\n", + "# Get the features and labels from the zip files\n", + "train_features, train_labels = uncompress_features_labels('notMNIST_train.zip')\n", + "test_features, test_labels = uncompress_features_labels('notMNIST_test.zip')\n", + "\n", + "# Limit the amount of data so it fits in a Docker container\n", + "docker_size_limit = 150000\n", + "train_features, train_labels = resample(train_features, train_labels, n_samples=docker_size_limit)\n", + "\n", + "# Set flags for feature engineering. This will prevent you from skipping an important step.\n", + "is_features_normal = False\n", + "is_labels_encod = False\n", + "\n", + "# Wait until you see that all features and labels have been uncompressed.\n", + "print('All features and labels uncompressed.')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## Problem 1\n", + "The first problem involves normalizing the features for your training and test data.\n", + "\n", + "Implement Min-Max scaling in the `normalize_grayscale()` function to a range of `a=0.1` and `b=0.9`. 
After scaling, the values of the pixels in the input data should range from 0.1 to 0.9.\n", + "\n", + "Since the raw notMNIST image data is in [grayscale](https://en.wikipedia.org/wiki/Grayscale), the current values range from a min of 0 to a max of 255.\n", + "\n", + "Min-Max Scaling:\n", + "$\n", + "X'=a+{\\frac {\\left(X-X_{\\min }\\right)\\left(b-a\\right)}{X_{\\max }-X_{\\min }}}\n", + "$\n", + "\n", + "*If you're having trouble solving problem 1, you can view the solution [here](https://github.com/udacity/deep-learning/blob/master/intro-to-tensorFlow/intro_to_tensorflow_solution.ipynb).*" + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "0.9" + ] + }, + "execution_count": 37, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "t = train_features[0]\n", + "t_ = 0.1 + ((t-t.min())*(0.9 - 0.1))/(t.max()-t.min())\n", + "0.1 + (255*0.8/255)" + ] + }, + { + "cell_type": "code", + "execution_count": 40, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed!\n" + ] + } + ], + "source": [ + "# Problem 1 - Implement Min-Max scaling for grayscale image data\n", + "def normalize_grayscale(image_data):\n", + " \"\"\"\n", + " Normalize the image data with Min-Max scaling to a range of [0.1, 0.9]\n", + " :param image_data: The image data to be normalized\n", + " :return: Normalized image data\n", + " \"\"\"\n", + " t = image_data\n", + " # TODO: Implement Min-Max scaling for grayscale image data\n", + " return 0.1 + ((t-t.min())*(0.9 - 0.1))/(t.max()-t.min())\n", + "\n", + "\n", + "### DON'T MODIFY ANYTHING BELOW ###\n", + "# Test Cases\n", + "np.testing.assert_array_almost_equal(\n", + " normalize_grayscale(np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 255])),\n", + " [0.1, 0.103137254902, 0.106274509804, 0.109411764706, 0.112549019608, 0.11568627451, 
0.118823529412, 0.121960784314,\n", + " 0.125098039216, 0.128235294118, 0.13137254902, 0.9],\n", + " decimal=3)\n", + "np.testing.assert_array_almost_equal(\n", + " normalize_grayscale(np.array([0, 1, 10, 20, 30, 40, 233, 244, 254,255])),\n", + " [0.1, 0.103137254902, 0.13137254902, 0.162745098039, 0.194117647059, 0.225490196078, 0.830980392157, 0.865490196078,\n", + " 0.896862745098, 0.9])\n", + "\n", + "if not is_features_normal:\n", + " train_features = normalize_grayscale(train_features)\n", + " test_features = normalize_grayscale(test_features)\n", + " is_features_normal = True\n", + "\n", + "print('Tests Passed!')" + ] + }, + { + "cell_type": "code", + "execution_count": 45, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Labels One-Hot Encoded\n" + ] + } + ], + "source": [ + "if not is_labels_encod:\n", + " # Turn labels into numbers and apply One-Hot Encoding\n", + " encoder = LabelBinarizer()\n", + " encoder.fit(train_labels)\n", + " train_labels = encoder.transform(train_labels)\n", + " test_labels = encoder.transform(test_labels)\n", + "\n", + " # Change to float32, so it can be multiplied against the features in TensorFlow, which are float32\n", + " train_labels = train_labels.astype(np.float32)\n", + " test_labels = test_labels.astype(np.float32)\n", + " is_labels_encod = True\n", + "\n", + "print('Labels One-Hot Encoded')" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training features and labels randomized and split.\n" + ] + } + ], + "source": [ + "assert is_features_normal, 'You skipped the step to normalize the features'\n", + "assert is_labels_encod, 'You skipped the step to One-Hot Encode the labels'\n", + "\n", + "# Get randomized datasets for training and validation\n", + "train_features, valid_features, train_labels, valid_labels = 
train_test_split(\n", + " train_features,\n", + " train_labels,\n", + " test_size=0.05,\n", + " random_state=832289)\n", + "\n", + "print('Training features and labels randomized and split.')" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Saving data to pickle file...\n", + "Data cached in pickle file.\n" + ] + } + ], + "source": [ + "# Save the data for easy access\n", + "pickle_file = 'notMNIST.pickle'\n", + "if not os.path.isfile(pickle_file):\n", + " print('Saving data to pickle file...')\n", + " try:\n", + " with open('notMNIST.pickle', 'wb') as pfile:\n", + " pickle.dump(\n", + " {\n", + " 'train_dataset': train_features,\n", + " 'train_labels': train_labels,\n", + " 'valid_dataset': valid_features,\n", + " 'valid_labels': valid_labels,\n", + " 'test_dataset': test_features,\n", + " 'test_labels': test_labels,\n", + " },\n", + " pfile, pickle.HIGHEST_PROTOCOL)\n", + " except Exception as e:\n", + " print('Unable to save data to', pickle_file, ':', e)\n", + " raise\n", + "\n", + "print('Data cached in pickle file.')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Checkpoint\n", + "All your progress is now saved to the pickle file. If you need to leave and come back to this lab, you no longer have to start from the beginning. Just run the code block below and it will load all the data and modules required to proceed."
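The checkpoint pattern used above (dump a dict of arrays to a pickle once, then reload it on later runs instead of re-processing everything) can be sketched on its own; the filename and arrays below are made up for illustration:

```python
import os
import pickle

import numpy as np

pickle_file = 'checkpoint_demo.pickle'  # hypothetical demo filename
data = {'train_dataset': np.arange(6, dtype=np.float32).reshape(2, 3),
        'train_labels': np.array([0, 1])}

# Save once; skip the dump if the checkpoint already exists.
if not os.path.isfile(pickle_file):
    with open(pickle_file, 'wb') as pfile:
        pickle.dump(data, pfile, pickle.HIGHEST_PROTOCOL)

# On a later run, reload instead of recomputing.
with open(pickle_file, 'rb') as f:
    restored = pickle.load(f)

print(np.array_equal(restored['train_dataset'], data['train_dataset']))  # True
os.remove(pickle_file)  # tidy up the demo file
```

`pickle.HIGHEST_PROTOCOL` trades portability for speed and size, which is fine here since the same environment writes and reads the file.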
+ ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Data and modules loaded.\n" + ] + } + ], + "source": [ + "%matplotlib inline\n", + "\n", + "# Load the modules\n", + "import pickle\n", + "import math\n", + "\n", + "import numpy as np\n", + "import tensorflow as tf\n", + "from tqdm import tqdm\n", + "import matplotlib.pyplot as plt\n", + "\n", + "# Reload the data\n", + "pickle_file = 'notMNIST.pickle'\n", + "with open(pickle_file, 'rb') as f:\n", + " pickle_data = pickle.load(f)\n", + " train_features = pickle_data['train_dataset']\n", + " train_labels = pickle_data['train_labels']\n", + " valid_features = pickle_data['valid_dataset']\n", + " valid_labels = pickle_data['valid_labels']\n", + " test_features = pickle_data['test_dataset']\n", + " test_labels = pickle_data['test_labels']\n", + " del pickle_data # Free up memory\n", + "\n", + "print('Data and modules loaded.')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## Problem 2\n", + "\n", + "Now it's time to build a simple neural network using TensorFlow. Here, your network will be just an input layer and an output layer.\n", + "\n", + "\n", + "\n", + "For the input, the images have been flattened into a vector of $28 \times 28 = 784$ features. Then, we're trying to predict each image's letter, so there are 10 output units, one for each label. Of course, feel free to add hidden layers if you want, but this notebook is built to guide you through a single-layer network. 
\n", + "\n", + "For the neural network to train on your data, you need the following float32 tensors:\n", + " - `features`\n", + " - Placeholder tensor for feature data (`train_features`/`valid_features`/`test_features`)\n", + " - `labels`\n", + " - Placeholder tensor for label data (`train_labels`/`valid_labels`/`test_labels`)\n", + " - `weights`\n", + " - Variable Tensor with random numbers from a truncated normal distribution.\n", + " - See `tf.truncated_normal()` documentation for help.\n", + " - `biases`\n", + " - Variable Tensor with all zeros.\n", + " - See `tf.zeros()` documentation for help.\n", + "\n", + "*If you're having trouble solving problem 2, review the \"TensorFlow Linear Function\" section of the class. If that doesn't help, the solution for this problem is available [here](intro_to_tensorflow_solution.ipynb).*" + ] + }, + { + "cell_type": "code", + "execution_count": 159, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed!\n" + ] + } + ], + "source": [ + "# All the pixels in the image (28 * 28 = 784)\n", + "features_count = 784\n", + "# All the labels\n", + "labels_count = 10\n", + "\n", + "# TODO: Set the features and labels tensors\n", + "features = tf.placeholder(tf.float32, [None, features_count])\n", + "labels = tf.placeholder(tf.float32, [None, labels_count])\n", + "\n", + "# TODO: Set the weights and biases tensors\n", + "weights = tf.Variable(tf.truncated_normal([features_count, labels_count]))\n", + "biases = tf.Variable(tf.zeros([labels_count]))\n", + "\n", + "\n", + "\n", + "### DON'T MODIFY ANYTHING BELOW ###\n", + "\n", + "#Test Cases\n", + "from tensorflow.python.ops.variables import Variable\n", + "\n", + "assert features._op.name.startswith('Placeholder'), 'features must be a placeholder'\n", + "assert labels._op.name.startswith('Placeholder'), 'labels must be a placeholder'\n", + "assert isinstance(weights, Variable), 'weights must be a TensorFlow 
variable'\n", + "assert isinstance(biases, Variable), 'biases must be a TensorFlow variable'\n", + "\n", + "assert features._shape == None or (\\\n", + " features._shape.dims[0].value is None and\\\n", + " features._shape.dims[1].value in [None, 784]), 'The shape of features is incorrect'\n", + "assert labels._shape == None or (\\\n", + " labels._shape.dims[0].value is None and\\\n", + " labels._shape.dims[1].value in [None, 10]), 'The shape of labels is incorrect'\n", + "assert weights._variable._shape == (784, 10), 'The shape of weights is incorrect'\n", + "assert biases._variable._shape == (10), 'The shape of biases is incorrect'\n", + "\n", + "assert features._dtype == tf.float32, 'features must be type float32'\n", + "assert labels._dtype == tf.float32, 'labels must be type float32'\n", + "\n", + "# Feed dicts for training, validation, and test session\n", + "train_feed_dict = {features: train_features, labels: train_labels}\n", + "valid_feed_dict = {features: valid_features, labels: valid_labels}\n", + "test_feed_dict = {features: test_features, labels: test_labels}\n", + "\n", + "# Linear Function WX + b\n", + "logits = tf.matmul(features, weights) + biases\n", + "\n", + "prediction = tf.nn.softmax(logits)\n", + "\n", + "# Cross entropy\n", + "cross_entropy = -tf.reduce_sum(labels * tf.log(prediction), reduction_indices=1)\n", + "\n", + "# Training loss\n", + "loss = tf.reduce_mean(cross_entropy)\n", + "\n", + "\n", + "optimizer = tf.train.AdamOptimizer(0.003).minimize(loss) \n", + "\n", + "# Create an operation that initializes all variables\n", + "init = tf.global_variables_initializer()\n", + "\n", + "# Test Cases\n", + "with tf.Session() as session:\n", + " session.run(init)\n", + " session.run(loss, feed_dict=train_feed_dict)\n", + " session.run(loss, feed_dict=valid_feed_dict)\n", + " session.run(loss, feed_dict=test_feed_dict)\n", + " biases_data = session.run(biases)\n", + "\n", + "assert not np.count_nonzero(biases_data), 'biases must be zeros'\n", 
+ "\n", + "print('Tests Passed!')" + ] + }, + { + "cell_type": "code", + "execution_count": 160, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Accuracy function created.\n" + ] + } + ], + "source": [ + "# Determine if the predictions are correct\n", + "is_correct_prediction = tf.equal(tf.argmax(prediction, 1), tf.argmax(labels, 1))\n", + "# Calculate the accuracy of the predictions\n", + "accuracy = tf.reduce_mean(tf.cast(is_correct_prediction, tf.float32))\n", + "\n", + "print('Accuracy function created.')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## Problem 3\n", + "Below are 2 parameter configurations for training the neural network. In each configuration, one of the parameters has multiple options. For each configuration, choose the option that gives the best accuracy.\n", + "\n", + "Parameter configurations:\n", + "\n", + "Configuration 1\n", + "* **Epochs:** 1\n", + "* **Learning Rate:**\n", + " * 0.8\n", + " * 0.5\n", + " * 0.1\n", + " * 0.05\n", + " * 0.01\n", + "\n", + "Configuration 2\n", + "* **Epochs:**\n", + " * 1\n", + " * 2\n", + " * 3\n", + " * 4\n", + " * 5\n", + "* **Learning Rate:** 0.2\n", + "\n", + "The code will print out a Loss and Accuracy graph, so you can see how well the neural network performed.\n", + "\n", + "*If you're having trouble solving problem 3, you can view the solution [here](intro_to_tensorflow_solution.ipynb).*" + ] + }, + { + "cell_type": "code", + "execution_count": 161, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Epoch 1/5: 100%|██████████| 557/557 [00:02<00:00, 246.49batches/s]\n", + "Epoch 2/5: 100%|██████████| 557/557 [00:02<00:00, 256.03batches/s]\n", + "Epoch 3/5: 100%|██████████| 557/557 [00:02<00:00, 252.19batches/s]\n", + "Epoch 4/5: 100%|██████████| 557/557 [00:02<00:00, 241.44batches/s]\n", + "Epoch 5/5: 
100%|██████████| 557/557 [00:02<00:00, 238.71batches/s]\n" + ] + }, + { + "data": { + "image/png": "<base64-encoded PNG omitted: Loss and Accuracy training graph>
3Y8b/8svu0ae7PP+8+dapH7305bOvXb3lk9FqP\n3vW8+9//7v7EE+6PP+7+2GMeve0Zjxy82rNuyvLIEes9+sJs90WL3PPzPbqwdNd/pyXlHjl+m0ef\n/dj96afdf/9792uucT/jDM8a+A0H96xOZ7t37bpz6tataspKPyeU6XCGe4cO7m3ahBgQN2UxPpRh\n/G7L3MyjbYZ7pNXbnpVyqkeY7lEG7V5uT+tJsEyUQR5humcxftdtmbl36eI+bJj7UUf530Z/o9kH\npwuBEuBywuUFHgTWAz1iy58Abo0rfyOwiXAJgsHAaYRxTk/XsQ0FpyRoboeIGmqH3ZQBojHWu0+/\nIhOsW0Oup0Hf1+Iyj0wo86w3Sj1ycnnYYe7je0+o/d6uCNv5vMh9wwb3tWvdV6xwX7LEo28t9shR\nWzzrgfkeGbfRo0//x/3VV91feMH9mWfcn3wy7PxGrPSs7z8TdtjX/MH92mvdr7zS/aKL3M8+2z0S\n8ayDrw47iuN+4X7FFe4//rH7Lbe4P/CA+zPPePSp9z1y1GbPeniRR8Zu9OjDb7k/+aT7vfeGneQv\nfuH+ve951mn/F9Zz3h3uP/uZ+003hWM799/v/ve/e/SB18JO/e5PPZK5yaPTPnP/7DP3BQvcly4N\ngea/60LoeXSpR8YUePTWp91/9atQr1NOcT/oIPf27fe4c6vXDrJtW/f0dPcePTyrx9dCmR5fc+/d\nO+zsMzLcU1Pd27b1KINr3onGTx06eFaXr4T1dD4v7Fw7dgzbMdu9PnZy2Ea/fu4jR7pnZrpPmOB+\n9tmeFfltKDPuevcjj3QfMSLUq0OHBg0HeyoTbTfSI23f9axeF3rEZuz6vjt3dj/iiBD0+i/0rKv+\n7pGhuR694T73P/xhlyl6w30eGZrrWd95yiPDl3n0xkfd77vP/aGH3B99NHyO/zLNI4fmedbNMz1y\nSJ5H/zItBL1HH3V/+OHwOb33Xs/6wb9CfX/8gvvkyaHMP//p/u9/u7/8skefeMcjYzd41r1zQ4/T\nawvcv/jCPTc3BOv16z36eZFHTi73rNdKPHLido++v8p94UL3Tz5x/89/3N96y33aNM/67Yywrfs+\nD+tYv969rGyX/8NHHtmMD9VVvQi+B+TGAtQHwJFxy2YAk+OetwJ+DXwR62XKBf4CpNexfgWnRrCn\nsHL77Q0TQhoqFCXyvCHLxM9vqiBS107dPSwH96zn1rvPnRtmTJ0afj3feqv7j3/sWaffGsqcfXvY\nsf7xj+HL8Nln3adP9+jLn4df/FMLPHJskUdf/tz9vffcX389rOuJJ9zvu8+zrn46rOd7/ww76ocf\nDsv++U/355/36KNve+SIDZ71l9k7d8azZ7vPmxe+0KJRj36wxiMnlHrWkys8Mm6TRx983f1vfwvh\n4Lrr3C+5xP3UUz1r2JVhW70vcu/b171797ADbd/evVWr2ncmrVuHnWCHDu4dO3pW2pk7d7ZDh7of\nfLD74YeHHd0JJ3jWEdeF5Ydf63700WHZyJHugwa59+rlnpHhWW1PbZidn50cymSc6z54sPshh4Rt\nRiLuZ5/t0bN/4JE+8zzr9Fs90u1Tj449P+yMu3Wres91vu+uXd2HDPHoqC97JCPbsw79vkc6zvLo\n4JPd+/QJ7ben9dT1vlq1CgHi2GPdL7zQ/Sc/8eivH/HIIXme9bt3wuGUB193f+65qin6wGseOTQ/\n7GgPzffo394IPS4vvRR6X1591aOPZYUQd/tHHjlsnUfvfTnsaB96KPQE/elPHv3JPR4ZvMSzLnnQ\nI4OXePT6u8Pn+/bb3e+80/2ee9wfeMCzfvZSqO+t74fDOp98EoJgQYF7aeme/1+VlXl0wTaPnLTD\ns97Y5pFIxT79IPHt20Nv23HFnvX0Ko8cVxJ2/KtWhWCQnx965GYXhm29UuSR8WUenVfsXlzsvm2b\ne2lpWM/ish
AgphV65Lhij7401/3tt0MIeewx97vu8qxvPRbe93XPhUNdn30WDo0l+J3U3L4jG3pb\nzXqMU1NNCk57r7F6cPblQ9+QoSh+/j4fvy8sDDv1F+d45Ij1nvWHDzxy+DqPPjI9fMG/+GKYpk0L\nv3BufT98Sf3xwxAu3nzTfcaM8EX97rsenZrtkcxNnvWn/3rkkLUeveWp8Kvu5z93/8533L/+dffT\nTvOsUdeE9Yz9kfuZZ7qff777xRe7f/Ob7ldf7VlfvTssP+dPIVCcfXb4tTtunPvw4R7tfqRHWmXV\n/AvbzL1bN48OPcUj6R971tgfeaTjRx7tf2L4Bb23XeKth3mkzUzPSj/HI62zPNpm+L4HiOplzEIw\nGj3afcIEj575PY/0W+BZF9/vkUFLPPqju0Jvyp//HHaQDz7o0dv/5ZFRazzrf1/zyKjVHv3jP0MA\nu//+sLO9+26P/uphjwzLDb+yBy/16Hf/GA63XHut+3e+49ELfuKRXnM96+QbPdLrc49e9HP3H/wg\nlPnlL91/9zuP/uIBjxy03LN+Mi1s5+4XQuCcNs39tdfcZ8zw6L/+65HMQs966ItweGjmstAblZfn\nvnFjGBuyuKx+O5PycvcNGzz6djR8tu6fF7Y1K8+9qKjqsNYe11NR4V5SEtupl4QAe/SWsDP+8EP3\nd95xnz49BJq/vRF6F/4yO+z4v9her/+v+8sOu7mWqel978t3+r6W2ZfnTbWt7GwFJwWnapLVg9Mc\nDhH5tm3uq1Z51uQlYWf8hw/c//GP8Gv1jjvCIYqf/jQEkYk3hzKjvxt6FXr0CL/U9yZA1HZMva5w\nUHm8fciQEHomTvTol7/rkd7zPOtLt4QxEqf9P/czzghv8vjjPXrIWR5J+9CzBlwWAs8xF7mfc477\npZe6X3ONR6/+fTjk8+tHwniKh94MPTjTl4Rf12VldbdxWVno0l60yKPPf+qRMQWedev7HjliQxgX\nMm+e+7Jlocu8+jiKyvUsrQjtX1gYxlt8sMYjxxZ71uO5HjkqtjOeNSv0Xr39tvsbb3h08gyPHJbv\nWXfmhB32f9ft1q3eFDuc5rTza8qdUlPVuSkH4DdU2+wPZao/b4njQBtjWwpOCk67acgvzPj5tYaV\nHTs867n1IRg8nhuOOy9eHLrEly0Lgz7XrPGs5zeGMtMKw066ctqwIfx6/nTTzoGlxxaHQ0QzZ4be\nnaeeCr0Ht93mWd94OKznlJvcTzsthI+BA93T0uoOMx07hkMUBx0UgkjnbM864X890nueR6/6v3B4\n6P77w6GmN98MPU7HFYdDVidsC7/m16wJ09q1Hv0o3yMn7BzgGv1gTfgl/u6K8P4XLnSfN8+jry0I\nY16mrA5nnlQbf1PfL/3mtsNJ9HXNZefW3M6qa04nJzRUnZvykg9NucNuTmUSGTrRlJrbpUviKTgd\nYMEp0Q9Ig4SiTZvc58xxf+UVz/rJtBBWvnp3GLwaiYQxGd27JzQIc596Zqovb9XKo50O80i79z1r\nyDc90jnHo1/+bjiz5Ze/dL/jDo/+6dlwWO25HPcFCzz64VqPjN+xS1hpqp16Q6wnkZ16c9vhNGWA\n2B+v45SIplxPU+4AG0JLq29DOVDf975QcDrAglOiO+z4eTWGom3bwims06d71s9fDmHlrNvcTz89\njC3p1Gn30JNxrkfSPvTocZPC2Jsf/jAM0hy5KgzyfPddjz7zkUfGbghnEr35pvtrr8UOx6wLA0Wn\nTg2n5R6a59H7Xw2DjWNT5fys/3s3HCJ6YXbotVmzxr2oKJzq3Ax6KBoyQDTETr2haGcsIgcCBaf9\nKDg1aG9Saan7/Pk7By5ffH848+WYY8LhqtiptlWhqPN5Hun0URhbc+217rfd5j5lShjoGjcQtDFC\nyL48r2k9+2uvgIiINBwFp/0oOO11b9KE8jDw9rB1Hv3lQ+7f/W4Y8zN4cDis
VRmKWp/ikfb/8eix\nsTO0fvMb90ce8eiT74XTzRdsq3E7e6pPIpf4b8pDRCIiInvS1MHJPASVZsXMxgHZ2dnZjBs3LtnV\nqVHlTQYHD959WW4uzJwJV1yx88aFkyeHslXPH65gcOlC+OijqmnmJ+mcXD6dLCYwIfVjGD68asrt\nMpYrnzmdyX8tYfCxvcld3qrm9U7etU7x82fOTKzOTdU2IiIi9ZWTk0NmZiZAprvnNPoGmyKd7e1E\nC+hx2uvepBO2hYvEDVwceorixhz5qFEe/eqPPXLQinCl3+O3hVPB92Jb6sEREZEDkXqcaBk9TlBH\nb9JkGJy+IXStTJ8OM2Ywc34PTmYmWd2/xoSTKuDoo8OUmUnuxow6e4/UgyMiIlKzpu5xatPYG9if\nDR4cws2VV8KNN5Rw0y+KmXzM3xh8wbPwySehP2nYMHKPvpCbWl1H1k0F3HTvs0y+Y2cIqukQW/x6\nJ0+uOxQNHlxzoBIREZGG1yrZFWiOHn88BJqa5OaG5QDs2MHgT5/nxqIbOPnLHbhx9lcZ/OI9MHp0\nSDy5ueS+tZgr197K5Jd6MeGC7lWBqHL9M2fuPi4JdoanmTMb4x2KiIjIvlCPUw0mTNjDQOubV8Kv\nHnywhdwAACAASURBVIDJk8ld05ab0p8l64fPcdOsV5g8JZXBQ2zX8upNEhER2S+ox6kG8eGmsmco\nd9EOrjxnHZPLr2Dw+IFwzz3knnoVVx41h8mzj2TC3V9l8j/SuPLbpt4kERGR/ZSCUy2qwtNlpcy8\n7CGuPHQWk+cexeDtX8Ajj5D7wRquXHkzk5/pVGNvUm5u6E2qrcdo8GAN6BYREWlpdKiuNu4Mfv9p\nbpw9hZPff4msr9zF4N++AGPGADDz8T33Jukwm4iIyP5Fwakm+flwzTXk/juHm3q+TNZThdx014+Y\nnAGD/3979x4XVbU+fvzzDIKCoKJ4yVTwrlCWYN7yQvnzmlppKngt+4aX8nS0r6llGnYyK+t0OifT\njl8rA1E7luUl9WCmpXYRupmXPBpqmhbeTt4B1++PGaYZmIEZBAfweb9e88K99tp7rb0YZx7WXnst\nWxYdm6SUUkpdf4p0q05EHhaRn0Tkgoh8LiK3FZK/qoi8JiJHbcfsEZFeRavy1Sn0iblHvoKoKDI+\nyWD0TV+y6Isouvavmm/Mk1JKKaWuP14HTiIyBHgJmAm0Br4F1otImJv8/kAq0AAYADQHHgKOFLHO\nVyX3ibm8AVDG16cY3e4Hur42iIyYgYyO/JxFq2q5Hb+klFJKqetPUXqcJgILjDGLjTF7gLHAeWC0\nm/wPAtWAe4wxnxtjDhljPjXGfF+0Kl8dl0/MLUxldPsfWHRxKBHJs9kc9zqLkgL0aTillFJKOfFq\njJOt9ygGmJ2bZowxIpIKdHBzWD9gOzBPRO4GfgOWAM8bY64UqdZXyR48jcpmZtBcEte1ZVHs/xGR\n/BHUrUtBD7vp+CWllFLq+uVtj1MY4Accz5N+HKjj5phGwCBbWb2BZ4DHgCe8LLtYRYQbZppEYtdN\nZebjF4n4eBHUrevLKimllFKqlCuueZwE68rE7so4DiQYY742xiwHngXGFVPZRZLx0goSP72DT2Zt\nIfGrPmQcFF9WRymllFJlgLfTEWQCOUDtPOm1yN8LlesX4LIxxjGw2g3UEZEKxphsd4VNnDiRqlWr\nOqXFx8cTHx/vZbWdZXx6mNFTa7Jo4CoinprLohGul1hRSimlVOmRkpJCSkqKU9qZM2euaR3EOZ7x\n4ACRz4EvjDGP2rYFOAS8aox50UX+Z4F4Y0wjh7RHgcnGmHpuyogG0tLS0oiOjvaqfoXJ2J/D6Jhv\nWBT8KBG71kKVKtb0DA2elFJKqbImPT2dmJgYgBhjTHpJl1eUW3UvAwkiMlJEWgDzgSDgLQARWSwi\nsx3yvw7UEJG/iUhTEbkLmAb84+qqXjSb
p6xl0X8HEbF0jj1oAn1iTimllFKF83rmcGPMctucTbOw\n3rL7BuhpjPnNlqUekO2Q/2cR6QH8FeucT0ds/37hKuvuvbQ0Rn0wAKZOhk6d8u3WJ+aUUkopVZAi\nLblijJkHzHOz704XaV8AHYtSVrE5fx6GD4dWreDpp31aFaWUUkqVTdfPWnVTplgHMqWnQ0CAr2uj\nlFJKqTKouKYjKBXcrkO3fj0Z/1jF2wM/hJYtr3W1lFJKKVVOlKvAyeU6dJmZZAyfzujqK+k6q5uv\nqqaUUkqpcqBcBU751qEzxho0nX6ZRWtvIKJRubpcpZRSSl1j5S6ScAyeNk9bx+j1g1n06lki2uWd\ns1MppZRSyjvlLnACa/A0c+xxYp/vzcyeXxAxrrevq6SUUkqpcqBcBk4ZGZA49TyfBPUh8fxk1wPG\nlVJKKaW8VO4CJ/vSKcGP0rVvCIsWV8g/YFwppZRSqgjKVeBkD5rm/ErE96ugb9/8A8aVUkoppYqo\nXAVOmzfbFun97kOwWKC3dWyTrkOnlFJKqeJQrmYOHzXK9o9Vq6BDBwgLs+/TdeiUUkV16NAhMjMz\nfV0Npa5LYWFhNGjQwNfVsCtXgRMAFy5AairMmOHrmiilyoFDhw7RsmVLzp8/7+uqKHVdCgoKYvfu\n3aUmeCp/gdOmTdYFffv183VNlFLlQGZmJufPnycpKYmWumSTUtfU7t27GT58OJmZmRo4lZjVq6Fh\nQ12TTilVrFq2bEl0dLSvq6GU8rFyNTgcY6yBU79+IOLr2iillFKqnClfgdN338Hhw9C3r69ropRS\nSqlyqHwFTqtWQUgIdO3q65oopZRSqhwqUuAkIg+LyE8ickFEPheR2zw8Lk5ErojIe0Upt1CrV0PP\nnhAQUCKnV0oppdT1zevASUSGAC8BM4HWwLfAehEJK+S4cOBFYEsR6lm448fhyy/1Np1SSpVCe/fu\nxWKxsHz5cq+PvXTpEhaLhRdeeKEEaqaUd4rS4zQRWGCMWWyM2QOMBc4Do90dICIWIAmYAfxUlIoW\nas0a688+fUrk9EopVZ5YLJZCX35+fmzZUnx/68pVPLQjIld1fHH4+uuvsVgshISE6Lxe1zGvpiMQ\nEX8gBpidm2aMMSKSCnQo4NCZwK/GmDdFpEuRalqY1auhfXuoWbNETq+UUuVJUlKS0/bbb79Namoq\nSUlJGGPs6cU1d1Xz5s25cOECAUUYSlGxYkUuXLiAv79/sdSlqJKTk6lXrx7Hjx9n5cqVDB061Kf1\nUb7h7TxOYYAfcDxP+nGguasDROR24AHgFq9r56mLF2HDBnjyyRIrQimlypO8X/rbt28nNTWV+Ph4\nj46/ePEilSpV8qrMogRNxXFscTDGsHTpUh544AG+/vprkpOTS23glJ2dDUCFCuVvqsbSoLieqhPA\n5EsUCQbeAR4yxpwqprLy27wZzp3T2cKVUqoErF+/HovFwvvvv8+UKVO48cYbCQ4O5vLly2RmZjJx\n4kRuuukmgoODqVatGv369WPXrl1O53A1xikuLo6aNWty+PBh+vbtS0hICLVr1+bJPH8EuxrjNHXq\nVCwWC4cPH2b48OFUq1aN6tWrM2bMGC5fvux0/Pnz5xk/fjw1atSgSpUq3HfffRw8eNCrcVMbN27k\nl19+IS4ujiFDhpCamup2/cJVq1bRpUsXQkJCqFatGu3bt+df//qXU56tW7fSs2dPQkNDCQ4OpnXr\n1syfP9++v3379vRxMfQkLi7OqRcwt11fe+015s6dS6NGjQgMDOTAgQNcvHiR6dOnExMTQ9WqVQkJ\nCeGOO+5g69at+c575coV5s6dy80330xgYCC1a9fmrrvu4rvvvgOgXbt2tG/f3uX1RkREcO+99xbe\niOWEt+FoJpAD1M6TXov8vVAAjYFwYJX8cXPaAiAil4Hmxhi3Y54mTpxI1apVndLi4+Pz/0W0ahWE\nh0NU
lOdXopRSyitPPfUUlStXZsqUKZw7dw4/Pz/27t3LunXruO+++wgPD+eXX35h/vz5xMbGsmvX\nLsLC3D83JCJkZWXRvXt3YmNjmTt3LuvWrWPOnDk0a9aMUfaV210fKyLcc889NGvWjOeff54vv/yS\nhQsXUrduXWbOnGnPGx8fz+rVqxk9ejQxMTGkpqZyzz33eDVmKjk5maioKKKioggPD2fMmDEsW7aM\nhx9+2Cnf/PnzGT9+PK1bt2b69OlUqVKF9PR0NmzYwH333QfA6tWrGTBgAOHh4UyaNInatWvzww8/\nsGbNGsaOHWu/voKuO6/XX3+dnJwcxo8fT4UKFahatSonTpxg8eLFxMXFMXbsWE6fPs3ChQvp3r07\n6enptGjRwn78sGHDWLZsGXfffbc9+Ny8eTNfffUVrVq1YuTIkfzpT3/iwIEDNGrUyH7cp59+yqFD\nh3j55Zc9bsurkZKSQkpKilPamTNnrknZdsYYr17A58DfHLYFOAxMdpE3AIjM83of+DfQEqjgpoxo\nwKSlpZlCXbliTHi4MY88UnhepZTyUlpamvH486gMe+SRR4zFYnG5b926dUZETGRkpMnKynLad+nS\npXz59+3bZwICAszcuXPtaXv27DEiYpYtW2ZPi4uLMxaLxbz00ktOx0dFRZnOnTvbty9evGhExDz/\n/PP2tKlTpxoRMRMmTHA6tk+fPqZ+/fr27W3bthkRMU8++aRTvvj4eGOxWJzO6c7FixdN1apVzezZ\ns+1pAwcONB06dHDKd+LECRMUFGRiY2PztVOurKwsc+ONN5oWLVqYs2fPui2zffv2pnfv3vnS4+Li\nTMuWLe3bue0aFhZmzpw545Q3JyfHZGdnO6WdPHnS1KhRwzzi8J25du1aIyJm2rRpbutz4sQJExAQ\nYBITE53SExISTGhoqMv3QXHw5P9fbh4g2ngZ0xTlVZQboC8Db4tIGvAl1qfsgoC3AERkMfCzMeYJ\nY8xlwKm/VkROW+M1s7sIZee3cyccPKjTECilfO/8edizp+TLadECgoJKvpw8Ro8enW/cjOPYo5yc\nHM6cOUO1atVo2LAh6enpHp03ISHBabtTp06sXr260ONEhDFjxjilde7cmfXr15OVlYW/vz/r1q1D\nRBg3bpxTvgkTJrB06VKP6vfBBx/w+++/ExcXZ0+Lj49n8ODBTj0wH330ERcvXuSJJ55wO77oiy++\n4OjRoyxYsIDKlSt7VL4n4uLiqFKlilOaxfLHaBxjDKdPnyYnJ4fo6Gin382KFSsICAjId4vUUfXq\n1enTpw/JycnMmDEDgKysLFasWMGgQYN8PgbtWvI6cDLGLLfN2TQL6y27b4CexpjfbFnqAdnFV8VC\nrFoFwcEQG3vNilRKKZf27IGYmJIvJy0NfLDgcERERL603LExCxYs4ODBg1y5cgWwBjVNmjQp9JzV\nqlUjODjYKS00NJRTpzwbFtugQYN8x+YGCTVr1uTgwYNUrFiRG2+80SmfJ3XLlZycTPPmzbly5Qr7\n9+8HoFmzZgQEBLBkyRKmT58OYN8XVcCwkf379yMiBeYpCle/G4CFCxfyyiuv8OOPP9oHjQNERkba\n/33gwAEaNGhQaCA3cuRI7rvvPnbs2EGbNm1Yu3Ytp06dYsSIEcVyDWVFkYbcG2PmAfPc7LuzkGMf\nKEqZbq1eDT16QMWKxXpapZTyWosW1qDmWpTjA4GBgfnSZsyYwezZsxk7dix33HEHoaGhWCwWxo0b\nZw+iCuLn5+cy3Zh8zxuVyPGFOXXqFOvWrSM7O5umTZs67RMRkpOT7YGTJ2V6Wi93Y5xycnJcprv6\n3SxcuJCEhAQGDx7Mk08+SVhYGH5+fiQmJvLbb7/Z83lap759+xIaGkpSUhJt2rQhKSmJBg0a0KlT\nJ4+OLy/K9rOKv/4Kn38O//d/vq6JUkpZb5/5oCfIl1asWEGfPn2YN8
/5b+mTJ0/SuHFjH9XqD+Hh\n4Vy6dIkjR4449Trt27fPo+OXLVtGdnY2ixYtIiQkxGnfzp07SUxMJD09nejoaHsv1s6dO6lbt67L\n8zVp0gRjDDt37qRjx45uy3XX63bw4EGP6g3W301UVFS+W5KPP/54vjpt376ds2fP5uv9c+Tv78+Q\nIUNYtmwZM2fOZM2aNTz22GMe16e8KNuL/H70kfWnzhaulFIlyl0PiJ+fX74ei3feeYcTJ05ci2oV\nqmfPnhhj8gV2f//73z16qi45OZnIyEhGjRrFgAEDnF6TJ0+mYsWKJCcnA9C7d28qVarE7NmzycrK\ncnm+du3aceONN/LSSy/x+++/uy23cePGfP/9905PjH355Zfs2LHDk8sGXP9utmzZkm/s2cCBA7l8\n+TLPPvtsoeccMWIEx48fZ+zYsVy6dIlhw4Z5XJ/yomz3OK1aBW3bQu28syMopZQqTu5u5/Tt25cX\nX3yRhIQEbrvtNr799luWLVvmdszNtdaxY0fuuusu5syZw7Fjx2jTpg0bN27kp5+sM+EUFDxlZGSw\nbds2pk2b5nJ/YGAg3bp1Y+nSpcydO5fq1avz4osvMmHCBNq1a8eQIUOoWrUq33zzDcYYFixYQIUK\nFZg3bx4DBw6kdevWjBo1itq1a7N7924OHDjABx98AMCDDz7IP/7xD3r06MH999/PkSNHWLhwIVFR\nUU5jlQrSt29fxo8fz3333UfPnj35z3/+wxtvvEFkZKTTbdRevXoxaNAgXnjhBXbt2kX37t3Jzs5m\n8+bN9O3blwcffNCet3379jRt2pR3332X6OhopykNrhdlpsfp7bchI8Mh4fJlWL8e+vUjI8O6Xyml\nVNEVFES42/f000/zpz/9iTVr1jBp0iR27drFhg0bqFOnTr5jXJ2joPmK8m57cj5Xli1bxpgxY1i5\nciXTpk2jQoUK9qVlCpr9PHe+oL4FPLXdr18/jh07xsaNGwEYP348K1asIDAwkGeeeYZp06bx/fff\n06tXL6djNm7cSMOGDZk7dy6TJ09my5Yt9HOYxPmWW27hrbfeIjMzk0mTJrF+/XqWLVtGVFSUx+0w\nZswYZs2axY4dO/jzn//Mpk2bePfdd7n55pvzHZOSksJzzz3Hjz/+yOTJk5kzZw5XrlyhXbt2+c47\nYsQIRISRI0e6bZdy7VrMeeDtCxfzOP30kzF33GH9aYwxZsMGY8D8tHaXc7pSShWj62Uep+vN9u3b\njYiY9957z9dVKXPmzJlj/P39zfHjx0u8rNI4j1OZ6XGKiIBFi2D0aFvP06pVZNzQgdEvtmDRIut+\npZRSKq9Lly7lS/vb3/5GhQoVrrsnwq6WMYY333yTHj16UKtWLV9XxyfK1BinP4Inw8zdv5Lo9yaL\nFokGTUoppdyaNWsWe/bsoUuXLogIq1evZuPGjTz66KPUrFnT19UrE86ePcuqVavYsGED+/bt47XX\nXvN1lXymTAVOYA2eZo7MIPaBpXzy/BcaNCmllCpQp06d+OSTT5g1axbnzp0jPDycZ599lilTpvi6\namXGkSNHGDZsGDVq1CAxMZFu3br5uko+U+YCp4wMSJwFnwT2InHtGhYN1tt0Siml3Ovduze9e/f2\ndTXKtNyZ01UZeqoOrEHT6NGwqNLDdL07lEVv+f0x5kkppZRSqoSVmcDJHjTNPEjE7o9g4MD8A8aV\nUkoppUpQmQmcNm+2BkkRXy6HwECwdbvmBk+bN/u2fkoppZQq/8rMGKdRo2z/WLECevUCh1WcIyJ0\nnJNSSimlSl6Z6XEC4Oef4YsvYOBAX9dEKaWUUtehshU4vfce+PtDAdPfK6WUUkqVlLIVOK1YAd27\nQ9Wqvq6JUkoppa5DRQqcRORhEflJRC6IyOciclsBef9HRLaIyEnb698F5Xfr+HH49FMYMKAoVVZK\nKaWUumpeB04iMgR4CZgJtAa+Bd
aLSJibQ7oCS4BYoD1wGNggIjd4VfDKlWCxwN13e1tlpZRS11C9\nevVISEiwb2/cuBGLxcK2bdsKPbZTp0706NGjWOszffp0/P39i/Wc6vpVlB6nicACY8xiY8weYCxw\nHhjtKrMxZoQxZr4x5jtjzI/A/9jK9W6+9hUroGtXCHMXnymllPJU//79qVy5MufOnXObZ9iwYVSs\nWJFTp055dW4R8SjN02M9ce7cORITE/nss89cntNi8e3IlJMnTxIQEICfnx/79+/3aV3U1fHqnSQi\n/kAMsDE3zRhjgFSgg4enqQz4Ayc9LvjkSdi0SZ+mU0qpYjJ8+HAuXrzI+++/73L/hQsX+PDDD+nT\npw+hoaFXVVa3bt24cOECHTt2vKrzFOTs2bMkJiayZcuWfPsSExM5e/ZsiZXtieXLl+Pv70+tWrVI\nTk72aV3U1fE2BA8D/IDjedKPA3U8PMfzwBGswZZnVq2CnBy4916PD1FKKeVe//79CQ4OZsmSJS73\nr1y5kvPnzzNs2LBiKS8gIKBYzuOO9W941ywWi89v1SUlJdG/f3+GDBlSqgMnYwyXLl3ydTVKteLq\nuxTA/bs2N5PIVGAwcI8x5rLHZ1+xAjp2hBu8GxallFIl7e233S/5lJFh3V8az1+pUiUGDBhAamoq\nmZmZ+fYvWbKE4OBg+vXrZ097/vnnuf3226lRowZBQUHcdtttrFy5stCy3I1xev3112ncuDFBQUF0\n6NDB5RioS5cu8dRTTxETE0O1atUIDg4mNjaWTz/91J5n//791K1bFxFh+vTpWCwWLBYLs2fPBlyP\nccrOziYxMZHGjRtTqVIlGjVqxIwZM8jKynLKV69ePQYMGMCWLVto27YtgYGBNGnSxG3A6UpGRgbb\ntm0jPj6eIUOGsG/fPnbs2OEy7/bt2+nduzehoaEEBwdz66238tprrznl2b17N4MGDaJmzZoEBQXR\nsmVLZs6cad8/fPhwmjZtmu/cedshJycHi8XCpEmTeOedd4iKiqJSpUps3Gi9qeTN73vx4sW0bduW\nypUrU6NGDWJjY/n4448B6y3fOnXquFwk+M477+Tmm28upAVLF28Dp0wgB6idJ70W+XuhnIjI/wKP\nA92NMT94UtjEiRPp36cP/desof/p0/Tv35+UlBQvq6yUUiWna1fX62Xmrq/ZtWvpPf+wYcPIzs5m\n+fLlTumnTp1iw4YNDBw4kIoVK9rTX331VWJiYvjLX/7Cc889h8ViYeDAgWzYsKHQsvKOXVqwYAEP\nP/ww9evX58UXX6RDhw7069ePo0ePOuU7ffo0b731Ft26deOFF17g6aef5tixY/To0YMffrB+ldSp\nU4fXXnsNYwyDBg0iKSmJpKQk7rnnHnvZecu///77SUxMpF27dvz1r3+lc+fO/OUvf2H48OH56r13\n717i4uLo1asXL7/8MlWrVmXUqFHs27ev0OsGSE5Oplq1avTu3ZsOHToQHh7ustdp3bp1xMbG8uOP\nP/LYY4/x8ssvExsby5o1a+x5vvnmG9q3b8+WLVsYN24cr776KnfffbdTHlfXW1D6hg0bmDJlCkOH\nDuWVV16hQYMGgOe/76eeeor777+fwMBAnnnmGZ5++mnq1avHpk2bABg5ciS//fYbqanON5qOHj3K\nli1bGDFihEftCJCSkkL//v2dXhMnTvT4+GJhjPHqBXwO/M1hW7A+KTe5gGMmA6eA2zwsIxowaWlp\nxqSkGAPG/PSTUUqpay0tLc3YP4/c+OknY+6444+PqbzbV6ukzp+Tk2Pq1q1rbr/9dqf0+fPnG4vF\nYlJTU53SL1686LSdlZVlIiMjTa9evZzS69WrZx566CH7dmpqqrFYLGbr1q3GGGMuX75swsLCTNu2\nbU12drZTuSJiunfv7lTHrKwsp/OfPn3a1KxZ04wdO9aeduzYMSMi5tlnn813ndOnTzf+/v727bS0
\nNCMiZvz48U75Jk6caCwWi/nss8+crsVisZjPP//cqayAgAAzbdq0fGW5EhkZaR544AH79pQpU8wN\nN9xgrly5Yk/Lzs42DRo0ME2bNjW///6723N17NjRhIaGmqNHj7rNM3z4cNO0adN86XnbITs724iI\n8ff3N/v27cuX35Pf9969e43FYjFDhgxxW5/c99mIESOc0l944QXj5+dnDh8+7PZYT/7/5eYBoo2X\nMU1RXkW5VfcykCAiI0WkBTAfCALeAhCRxSIyOzeziDwOPIP1qbtDIlLb9qqc/9QurFgBMTG6GJ1S\nqtTKXWx89GjrguOjR9sWJY8o3ee3WCzExcWxfft2Dh48aE9fsmQJtWvX5s4773TK79j7dPr0aU6f\nPk2nTp1IT0/3qtwvvviCEydOMG7cOPz8/Ozpo0ePJiQkJF8dK1SwLqtqjOHUqVNkZWXRpk0br8vN\ntXbtWkSESZMmOaU/9thjGGOcem8AWrVqRbt27ezbtWvXpmnTphw4cKDQstLT09m9ezdDhw61p8XH\nx3P8+HGnHpgdO3Zw+PBhJk6cSHBwsMtzHT9+nO3bt/PQQw9xQzEOXenWrRtNmjTJl+7J7/u9994D\ncLpVmJfFYmHo0KGsXLmSCxcu2NOXLFlCly5dqFevXnFcxjXjdeBkjFkOPAbMAr4GWgE9jTG/2bLU\nw3mg+DisT9H9Czjq8Hqs0MIuXIC1a/VpOqVUqRcRATNnQmys9Wdx/61XUucfNmwYxhj7MIgjR47w\n2WefER8fn++2zocffkj79u0JDAykevXq1KpVi3/+85+cOXPGqzIPHjyIiOT7svb39yfCxYW9+eab\ntGrVikqVKlGjRg1q1arFunXrvC7XsfwKFSrQuHFjp/Qbb7yRkJAQpyASsN+6chQaGurRNA1JSUmE\nhIRQv3599u/fz/79+6lcuTL16tVzul23f/9+RISoqCi358qdxqCgPEXhqs3Bs9/3gQMH8PPzo3nz\n5gWWMWrUKM6ePcsHH3wAwA8//MC3337LyJEji+06rpUiDQ43xswzxkQYYwKNMR2MMTsc9t1pjBnt\nsN3QGOPn4jWr0IK2b4fz5zVwUkqVehkZkJgIn3xi/eluQHdpO390dDQtWrSwD3bO/enYQwKwadMm\n7r33XkJCQpg/fz4fffQRqampDBkyxOWg34IY2xNwrsbb5O7L9dZbb/Hggw/SokUL3nzzTdavX09q\naipdu3b1ulx3ZRS2z7FXzNPz5O5ftmwZZ8+epWXLljRt2pSmTZvSrFkzfv75Z95//30uXrzo0bk8\nzQPu58LKyclxmR4YGJgvzdPftzHGo7m3brrpJm655RaSkpIAa0AZGBjIwDL4/V7B1xUo0Mcfw003\nQbNmvq6JUkq5lTtQO/f2We5tteK6XVfS5x82bBgzZszg+++/JyUlhaZNmxITE+OU57333qNy5cqs\nW7fOKZBYsGCB1+VFRERgjOHHH3/k9ttvt6dnZWVx8OBB6tT546bFihUraN68eb4B7E888YTTtjcT\nZ0ZERJCdnc3+/fudep2OHj3K2bNnCQ8P9/aSXNq4cSO//PILzz33XL6n3DIzMxk3bhwffvghgwcP\npkmTJhhj2LlzJ126dHF5vtweup07dxZYbmhoKKdPn86XnuFFtO3p77tJkyZkZ2ezZ88eIiMjCzzn\nyJEjmTp1Kr/++itLly6lf//++W7NlgWle5HfLVu0t0kpVarlDWrAObi52p6hkj4//HG7bsaMGXzz\nzTf5niwDa6+LxWJx6rU4cOAAq1at8rq8du3aUb16debPn+90voULF/L777/nKzevrVu38tVXxg1c\nNgAADcdJREFUXzmlVa5sHTbrKmDIq0+fPhhjeOWVV5zSX3rpJUSEu+66y+NrKUhSUhJVqlThscce\nY8CAAU6vhIQEGjZsaL9dd9ttt9GgQQP++te/8t///tfl+WrX
rk3Hjh1ZuHAhR44ccVtu48aNOXHi\nBLt377anHTlyxKvflae/73tt8ysmJiYW2iM2dOhQrly5woQJEzh06JDL91lZULp7nM6d00V9lVKl\n2ubNrnt+coObzZuvrleopM9vPVcEHTt25IMPPkBE8t2mA+jbty+vvvoqPXv2JD4+nl9++YV58+bR\nvHlz+7QABXH8UvX39+eZZ57hkUce4Y477mDIkCH85z//YfHixTRs2DBfuR9++CEDBgygd+/e7N+/\nnzfeeIPIyEiniRorV65Ms2bNSElJoVGjRoSGhtKqVStatmyZry7R0dEMGzaMefPmceLECTp37sz2\n7dtJSkpi8ODBTr1gRZU7K3vv3r3tg9vz6tevH6+//jonT56kevXqzJs3j3vvvZdbb72VBx54gDp1\n6rBnzx727t3L6tWrAfj73/9O165dad26NQkJCURERHDgwAE2bNhgnxtq6NChPPHEE/Tv358JEyZw\n9uxZ5s+fT4sWLfj22289qr+nv+9mzZoxdepU5syZQ9euXbnnnnsICAjgq6++Ijw8nFmz/hiVU7t2\nbbp37867775LWFgYvXr1Kmrz+ta1eHTP2xe50xHUq2eMw+OaSil1rXnyOHR5MG/ePGOxWEyHDh3c\n5lm4cKFp1qyZCQwMNFFRUeadd97J94i7McbUr1/fJCQk2LfzTkfgWGajRo1MYGCg6dChg9m2bZvp\n3Lmz6dGjh1O+Z5991kRERJigoCDTpk0bs27dOjN8+HDTrFkzp3xbt241bdq0MZUqVTIWi8U+NcH0\n6dNNQECAU97s7GyTmJhoGjVqZCpWrGgiIiLMjBkz8k19UL9+fTNgwIB8bdGpU6d89XS0fPlyY7FY\nTFJSkts8GzduNBaLxbz++uv2tM8++8x0797dVKlSxYSEhJjWrVubBQsWOB23c+dOc++995rq1aub\nypUrm8jISDNr1iynPOvXrzc33XSTqVixoomMjDTLli1zOR2BxWIxkyZNclk/T3/fxhizaNEiEx0d\nbQIDA02NGjXMnXfeaTZt2pQvX0pKihERM2HCBLft4qg0TkcgxsPBZteSiEQDaWmjRhH91lu+ro5S\n6jqWnp5OTEwMaWlpREdH+7o6SpVp7733HoMGDWL79u20bdu20Pye/P/LzQPEGGOKNkeFF0r3GKdu\n3YDiWbZAKaWUUr71xhtv0LRpU4+CptKqdI9xiox0GhiplFJKqbJn6dKlfPPNN/z73/9m3rx5vq7O\nVSnVgdPRX4T/nVy8M/AqpZRS6trJyclh6NChhISEkJCQQEJCgq+rdFVKdeCUmAjvvqtBk1JKKVVW\n+fn5FXmy0tKoVI9xSkjQoEkppZRSpUepDpzeeKP4ly1QSimllCqqUh04zZxZfDPjKqWUUkpdrVId\nONWtW7zLCiillFJKXY1SPTgcindZAaWUKirHdb+UUtdGafx/V+oDJ7AGTBo0KaV8ISwsjKCgoDK7\nIKlSZV1QUBBhYWG+roZdmQiclFLKVxo0aMDu3bvJzMz0dVWUui6FhYXRoEEDX1fDTgOnciolJYX4\n+HhfV6Nc0zYueaWljRs0aFCqPriLU2lp4/JM27h8KdLgcBF5WER+EpELIvK5iNxWSP5BIrLblv9b\nEeldtOoqT6WkpPi6CuWetnHJ0zYuedrGJU/buHzxOnASkSHAS8BMoDXwLbBeRFzegBSRDsAS4J/A\nrcBKYKWIRBa10koppZRSvlCUHqeJwAJjzGJjzB5gLHAeGO0m/6PAR8aYl40xe40xM4F04JEi1Vgp\npZRSyke8CpxExB+IATbmphljDJAKdHBzWAfbfkfrC8ivlFJKKVUqeTs4PAzwA47nST8ONHdzTB03\n+esUUE4lKJ3zN5QVZ86cIT093dfVKNe0jUuetnHJ0zYuedrGJcshVqh0LcorrqfqBDDFmD8C0HlT\nrlJMTIyvq1DuaRuXPG3j
kqdtXPK0ja+JCGBbSRfibeCUCeQAtfOk1yJ/r1KuY17mB+utvGFABnDR\nyzoqpZRS6vpRCWvQtP5aFCbWIUpeHCDyOfCFMeZR27YAh4BXjTEvusi/FAg0xtztkLYV+NYYM/5q\nKq+UUkopdS0V5Vbdy8DbIpIGfIn1Kbsg4C0AEVkM/GyMecKW/2/AZhGZBKwB4rEOMH/o6qqulFJK\nKXVteR04GWOW2+ZsmoX1Ftw3QE9jzG+2LPWAbIf820UkHnjW9toH3G2M2XW1lVdKKaWUupa8vlWn\nlFJKKXW9KtKSK0oppZRS1yMNnMoIEZkpIlfyvHY57K8oIq+JSKaI/C4i/xKRWnnOUV9E1ojIORE5\nJiIviMh1+x4Qkc4i8qGIHLG1Z38XeWaJyFEROS8i/xaRJnn2h4pIsoicEZFTIrJQRCrnydNKRLbY\n1mo8KCKTS/raSovC2lhE3nTxvl6bJ4+2sRsiMk1EvhSR/4rIcRF5X0Sa5clTLJ8NIhIrImkiclFE\nfhSRUdfiGn3Nwzb+JM97OEdE5uXJo23shoiMFes6tmdsr20i0sthf6l6D1+3X5pl1E6s48rq2F6d\nHPa9AtwFDAS6AHWBFbk7bW+gtVjHtbUHRgH3Yx2rdr2qjHWM3sO4mFdMRKZgXRpoDNAWOId1XcYA\nh2xLgJZAN6zt3wVY4HCOEKyPyP4ERAOTgadF5H9K4HpKowLb2OYjnN/XeZeR1zZ2rzPwd6Ad8P8A\nf2CDiAQ65LnqzwYRiQBWY1014hasD/0sFJHuJXJVpYsnbWyAN/jjfXwD8HjuTm3jQh0GpmB9cCwG\n+Bj4QERa2vaXrvewMUZfZeCFdVHldDf7qgCXgHsd0poDV4C2tu3eQBYQ5pBnDHAKqODr6/P1y9ZW\n/fOkHQUm5mnnC8Bg23ZL23GtHfL0xPpwRB3b9jis859VcMjzHLDL19dcStr4TeC9Ao5poW3sVRuH\n2dqrk227WD4bgOeB7/KUlQKs9fU1+7qNbWmbgJcLOEbb2Pt2PgE8UBrfw9rjVLY0td3y2C8iSSJS\n35YegzXSdlxDcC/W+bVy1wRsD3xvjMl0ON96oCoQVfJVL1tEpCHWvxwd2/S/wBc4t+kpY8zXDoem\nYv3rs51Dni3GmGyHPOuB5iJStYSqX9bE2m6B7BGReSJS3WFfB7SNvVENa9uctG0X12dDe3TN0Vx5\n2zjXMBH5TUS+F5HZeXqktI09JCIWEYnDOs3Rdkrhe1gDp7Ljc6xdjz2BsUBDYIttrEcd4LLti92R\n45qA7tYMhILXDbxe1cH64VjQOot1gF8ddxpjcrB+oGq7e+YjYCRwJ9ZbG12BtSIitv3axh6ytdkr\nwGfmj+leiuuzwV2eKiJS8WrrXla4aWOAZGA4EAvMBkYA7zjs1zYuhIjcJCK/Y+1dmoe1h2kPpfA9\nXFxr1akSZoxxnEp+p4h8CRwEBuN+WRpP1xDUOSk850mbFpYnNyi47tvdGLPcYfMHEfke2I/1C2hT\nAYdqG+c3D4jEeeyjO8Xx2XA9t/HtjonGmIUOmz+IyDFgo4g0NMb8VMg5tY2t9mAde1QN61imxSLS\npYD8PnsPa49TGWWMOQP8CDTBuh5ggIhUyZPNcU1AV2sG5m4XtG7g9eoY1v9UBa2zeMy2bScifkCo\nbV9uHlfnAG33fGxfMplY39egbewREfkH0AeINcYcddh1tZ8NhbXxf40xl6+m7mVFnjb+pZDsX9h+\nOr6PtY0LYIzJNsYcMMakG2OeBL4FHqUUvoc1cCqjRCQYaIx1AHMa1sGy3Rz2NwMa8MdK0duBm8U6\n63uuHsAZQGdxz8P2BX4M5zatgnVcjWObVhOR1g6HdsMacH3pkKeL7cs+Vw9gry34VQ5EpB5QA8j9\nYtI2LoTtC/1u4A5jzKE8u6/2s2G3Q55uOOthSy/3CmljV1pj7cVwfB9rG3vHAlSkNL6HfT
1yXl8e\nP2HwItbHMMOBjsC/sUbbNWz752F9HDsW62C6rcCnDsdbsEbwHwGtsI6VOg484+tr82GbVsbaNXwr\n1ic0/mzbrm/b/zjWJzv6ATcDK7EuGRTgcI61wA7gNqzd93uBdxz2V8Ea3L6NtYt/CHAWeNDX1+/r\nNrbtewFrMBqO9UNtB9YPOn9tY4/adx7WJ4c6Y/1rOvdVKU+eq/pswLry/FmsTyY1B8YDl4H/5+s2\n8HUbA42A6VinwggH+gP/AT7WNva4jZ/Feos5HLgJ61Ox2cCdpfE97PMG05fHb6wU4Gesj8Mfwjq3\nTUOH/RWxzjWSCfwOvAvUynOO+ljnsThre1M9D1h8fW0+bNOuWL/Mc/K8FjnkeRrrl/J5rE9gNMlz\njmpAEta/bE4B/wSC8uS5GdhsO8ch4H99fe2loY2BSsA6rD17F4EDwOtATW1jj9vXVdvmACMd8hTL\nZ4Ptd5lm+wzaB4zw9fWXhjbGuj7rJ8BvtvffXqxf/MHaxh638ULb//8Lts+DDdiCJtv+UvUe1rXq\nlFJKKaU8pGOclFJKKaU8pIGTUkoppZSHNHBSSimllPKQBk5KKaWUUh7SwEkppZRSykMaOCmllFJK\neUgDJ6WUUkopD2ngpJRSSinlIQ2clFJKKaU8pIGTUkoppZSHNHBSSimllPKQBk5KKaWUUh76/+/i\nLR/zc9RFAAAAAElFTkSuQmCC\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Validation accuracy at 0.7875999808311462\n" + ] + } + ], + "source": [ + "# Change if you have memory restrictions\n", + "batch_size = 256\n", + "\n", + "# TODO: Find the best parameters for each configuration\n", + "epochs = 5\n", + "#learning_rate = 0.01\n", + " \n", + "\n", + "### DON'T MODIFY ANYTHING BELOW ###\n", + "# Gradient Descent\n", + "#optimizer = tf.train.AdamOptimizer().minimize(loss) \n", + "#with tf.Session() as sess:\n", + "# sess.run(init)\n", + "\n", + "# The accuracy measured against the validation set\n", + "validation_accuracy = 0.0\n", + "\n", + "# Measurements use for graphing loss and accuracy\n", + "log_batch_step = 50\n", + "batches = []\n", + "loss_batch = []\n", + "train_acc_batch = []\n", + "valid_acc_batch = []\n", + "\n", + "with tf.Session() as session:\n", + " session.run(init)\n", + " batch_count = int(math.ceil(len(train_features)/batch_size))\n", + "\n", + " for epoch_i in range(epochs):\n", + " \n", + " # Progress bar\n", + " batches_pbar = tqdm(range(batch_count), desc='Epoch {:>2}/{}'.format(epoch_i+1, epochs), unit='batches')\n", + " \n", + " # The 
training cycle\n", + " for batch_i in batches_pbar:\n", + " # Get a batch of training features and labels\n", + " batch_start = batch_i*batch_size\n", + " batch_features = train_features[batch_start:batch_start + batch_size]\n", + " batch_labels = train_labels[batch_start:batch_start + batch_size]\n", + "\n", + " # Run optimizer and get loss\n", + " _, l = session.run(\n", + " [optimizer, loss],\n", + " feed_dict={features: batch_features, labels: batch_labels})\n", + "\n", + " # Log every 50 batches\n", + " if not batch_i % log_batch_step:\n", + " # Calculate Training and Validation accuracy\n", + " training_accuracy = session.run(accuracy, feed_dict=train_feed_dict)\n", + " validation_accuracy = session.run(accuracy, feed_dict=valid_feed_dict)\n", + "\n", + " # Log batches\n", + " previous_batch = batches[-1] if batches else 0\n", + " batches.append(log_batch_step + previous_batch)\n", + " loss_batch.append(l)\n", + " train_acc_batch.append(training_accuracy)\n", + " valid_acc_batch.append(validation_accuracy)\n", + "\n", + " # Check accuracy against Validation data\n", + " validation_accuracy = session.run(accuracy, feed_dict=valid_feed_dict)\n", + "\n", + "loss_plot = plt.subplot(211)\n", + "loss_plot.set_title('Loss')\n", + "loss_plot.plot(batches, loss_batch, 'g')\n", + "loss_plot.set_xlim([batches[0], batches[-1]])\n", + "acc_plot = plt.subplot(212)\n", + "acc_plot.set_title('Accuracy')\n", + "acc_plot.plot(batches, train_acc_batch, 'r', label='Training Accuracy')\n", + "acc_plot.plot(batches, valid_acc_batch, 'x', label='Validation Accuracy')\n", + "acc_plot.set_ylim([0, 1.0])\n", + "acc_plot.set_xlim([batches[0], batches[-1]])\n", + "acc_plot.legend(loc=4)\n", + "plt.tight_layout()\n", + "plt.show()\n", + "\n", + "print('Validation accuracy at {}'.format(validation_accuracy))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Test\n", + "You're going to test your model against your hold out dataset/testing data. 
This will give you a good indicator of how well the model will do in the real world. You should have a test accuracy of at least 80%." + ] + }, + { + "cell_type": "code", + "execution_count": 162, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Epoch 1/5: 100%|██████████| 557/557 [00:00<00:00, 1103.36batches/s]\n", + "Epoch 2/5: 100%|██████████| 557/557 [00:00<00:00, 1257.45batches/s]\n", + "Epoch 3/5: 100%|██████████| 557/557 [00:00<00:00, 1494.07batches/s]\n", + "Epoch 4/5: 100%|██████████| 557/557 [00:00<00:00, 1577.33batches/s]\n", + "Epoch 5/5: 100%|██████████| 557/557 [00:00<00:00, 1609.31batches/s]" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Nice Job! Test Accuracy is 0.855400025844574\n" + ] + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "\n" + ] + } + ], + "source": [ + "### DON'T MODIFY ANYTHING BELOW ###\n", + "# The accuracy measured against the test set\n", + "test_accuracy = 0.0\n", + "\n", + "with tf.Session() as session:\n", + " \n", + " session.run(init)\n", + " batch_count = int(math.ceil(len(train_features)/batch_size))\n", + "\n", + " for epoch_i in range(epochs):\n", + " # Progress bar\n", + " batches_pbar = tqdm(range(batch_count), desc='Epoch {:>2}/{}'.format(epoch_i+1, epochs), unit='batches')\n", + " \n", + " # The training cycle\n", + " for batch_i in batches_pbar:\n", + " # Get a batch of training features and labels\n", + " batch_start = batch_i*batch_size\n", + " batch_features = train_features[batch_start:batch_start + batch_size]\n", + " batch_labels = train_labels[batch_start:batch_start + batch_size]\n", + "\n", + " # Run optimizer\n", + " _ = session.run(optimizer, feed_dict={features: batch_features, labels: batch_labels})\n", + "\n", + " # Check accuracy against Test data\n", + " test_accuracy = session.run(accuracy, feed_dict=test_feed_dict)\n", + "\n", + "\n", + "assert test_accuracy >= 
0.80, 'Test accuracy at {}, should be equal to or greater than 0.80'.format(test_accuracy)\n", + "print('Nice Job! Test Accuracy is {}'.format(test_accuracy))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Multiple layers\n", + "Good job! You built a one layer TensorFlow network! However, you might want to build more than one layer. This is deep learning after all! In the next section, you will start to satisfy your need for more layers." + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "hide_input": false, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.1" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} diff --git a/intro-to-tensorflow/intro_to_tensorflow_solution.ipynb b/intro-to-tensorflow/intro_to_tensorflow_solution.ipynb new file mode 100644 index 0000000..99b2b60 --- /dev/null +++ b/intro-to-tensorflow/intro_to_tensorflow_solution.ipynb @@ -0,0 +1,112 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "# Solutions\n", + "## Problem 1\n", + "Implement the Min-Max scaling function ($X'=a+{\\frac {\\left(X-X_{\\min }\\right)\\left(b-a\\right)}{X_{\\max }-X_{\\min }}}$) with the parameters:\n", + "\n", + "$X_{\\min }=0$\n", + "\n", + "$X_{\\max }=255$\n", + "\n", + "$a=0.1$\n", + "\n", + "$b=0.9$" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "# Problem 1 - Implement Min-Max scaling for grayscale image data\n", + "def normalize_grayscale(image_data):\n", + " \"\"\"\n", + " Normalize the image data with Min-Max scaling to a range of [0.1, 0.9]\n", + " :param image_data: The image data to 
be normalized\n", + " :return: Normalized image data\n", + " \"\"\"\n", + " a = 0.1\n", + " b = 0.9\n", + " grayscale_min = 0\n", + " grayscale_max = 255\n", + " return a + ( ( (image_data - grayscale_min)*(b - a) )/( grayscale_max - grayscale_min ) )" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Problem 2\n", + "- Use [tf.placeholder()](https://www.tensorflow.org/api_docs/python/io_ops.html#placeholder) for `features` and `labels` since they are the inputs to the model.\n", + "- Any math operations must have the same type on both sides of the operator. The weights are float32, so the `features` and `labels` must also be float32.\n", + "- Use [tf.Variable()](https://www.tensorflow.org/api_docs/python/state_ops.html#Variable) to allow `weights` and `biases` to be modified.\n", + "- The `weights` must be the dimensions of features by labels. The number of features is the size of the image, 28*28=784. The size of labels is 10.\n", + "- The `biases` must be the dimensions of the labels, which is 10." 
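As a sanity check on the Min-Max formula above, here is a minimal plain-Python sketch (the scalar helper `scale_pixel` is introduced here purely for illustration; the notebook's `normalize_grayscale` applies the same formula to whole image arrays):

```python
def scale_pixel(pixel, a=0.1, b=0.9, x_min=0, x_max=255):
    """Min-Max scale one grayscale value from [x_min, x_max] into [a, b]."""
    return a + ((pixel - x_min) * (b - a)) / (x_max - x_min)

# The endpoints of the input range map onto the endpoints of the target range,
# and the midpoint lands halfway between them.
print(round(scale_pixel(0), 3))      # 0.1
print(round(scale_pixel(255), 3))    # 0.9
print(round(scale_pixel(127.5), 3))  # 0.5
```

This is why the weights can be initialized from a unit-scale normal distribution: every input feature is guaranteed to sit in the small range [0.1, 0.9].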
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "features_count = 784\n", + "labels_count = 10\n", + "\n", + "# Problem 2 - Set the features and labels tensors\n", + "features = tf.placeholder(tf.float32)\n", + "labels = tf.placeholder(tf.float32)\n", + "\n", + "# Problem 2 - Set the weights and biases tensors\n", + "weights = tf.Variable(tf.truncated_normal((features_count, labels_count)))\n", + "biases = tf.Variable(tf.zeros(labels_count))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Problem 3\n", + "Configuration 1\n", + "* **Epochs:** 1\n", + "* **Learning Rate:** 0.1\n", + "\n", + "Configuration 2\n", + "* **Epochs:** 4 or 5\n", + "* **Learning Rate:** 0.2" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} diff --git a/intro-to-tensorflow/notMNIST.pickle b/intro-to-tensorflow/notMNIST.pickle new file mode 100644 index 0000000..d1e0615 --- /dev/null +++ b/intro-to-tensorflow/notMNIST.pickle @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:4a205367d7c771405d2334bf0cb2ed4dfb651aa1273dd60454d825091e83a0a5 +size 508160488 diff --git a/intro-to-tensorflow/notMNIST_test.zip b/intro-to-tensorflow/notMNIST_test.zip new file mode 100644 index 0000000..8b04dc9 --- /dev/null +++ b/intro-to-tensorflow/notMNIST_test.zip @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:a9c56d4c0daf330a47778058319132b0f12b50cc592fc354fd6e77be636fc744 +size 6125664 diff --git a/intro-to-tensorflow/notMNIST_train.zip b/intro-to-tensorflow/notMNIST_train.zip 
new file mode 100644 index 0000000..5a4bdf3 --- /dev/null +++ b/intro-to-tensorflow/notMNIST_train.zip @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:8000f214a988720bb02f2d0f21eec838ccf2cb36ee644979597b0ea44a8bf5c3 +size 132817335 diff --git a/intro-to-tflearn/Handwritten Digit Recognition with TFLearn - Solution.ipynb b/intro-to-tflearn/Handwritten Digit Recognition with TFLearn - Solution.ipynb new file mode 100644 index 0000000..9bf309d --- /dev/null +++ b/intro-to-tflearn/Handwritten Digit Recognition with TFLearn - Solution.ipynb @@ -0,0 +1,329 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Handwritten Number Recognition with TFLearn and MNIST\n", + "\n", + "In this notebook, we'll be building a neural network that recognizes handwritten numbers 0-9. \n", + "\n", + "This kind of neural network is used in a variety of real-world applications, including recognizing phone numbers and sorting postal mail by address. To build the network, we'll be using the **MNIST** data set, which consists of images of handwritten numbers and their correct labels 0-9.\n", + "\n", + "We'll be using [TFLearn](http://tflearn.org/), a high-level library built on top of TensorFlow, to build the neural network. We'll start off by importing all the modules we'll need, then load the data, and finally build the network." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Import Numpy, TensorFlow, TFLearn, and MNIST data\n", + "import numpy as np\n", + "import tensorflow as tf\n", + "import tflearn\n", + "import tflearn.datasets.mnist as mnist" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Retrieving training and test data\n", + "\n", + "The MNIST data set already contains both training and test data. 
There are 55,000 data points of training data, and 10,000 points of test data.\n", + "\n", + "Each MNIST data point has:\n", + "1. an image of a handwritten digit and \n", + "2. a corresponding label (a number 0-9 that identifies the image)\n", + "\n", + "We'll call the images, which will be the input to our neural network, **X** and their corresponding labels **Y**.\n", + "\n", + "We're going to want our labels as *one-hot vectors*, which are vectors that hold mostly 0s and a single 1. It's easiest to see this with an example. As a one-hot vector, the number 0 is represented as [1, 0, 0, 0, 0, 0, 0, 0, 0, 0], and 4 is represented as [0, 0, 0, 0, 1, 0, 0, 0, 0, 0].\n", + "\n", + "### Flattened data\n", + "\n", + "For this example, we'll be using *flattened* data, or a representation of MNIST images in one dimension rather than two. So, each handwritten number image, which is 28x28 pixels, will be represented as a one-dimensional array of 784 pixel values. \n", + "\n", + "Flattening the data throws away information about the 2D structure of the image, but it simplifies our data so that all of the training data can be contained in one array whose shape is [55000, 784]; the first dimension is the number of training images and the second dimension is the number of pixels in each image. This is the kind of data that is easy to analyze using a simple neural network."
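The one-hot encoding and flattening described above can be sketched directly in NumPy. This is illustrative only; in the notebook, `mnist.load_data(one_hot=True)` does both steps for you, and the image here is a synthetic stand-in:

```python
import numpy as np

def one_hot(digit, num_classes=10):
    """Return a length-10 vector with a single 1 at position `digit`."""
    vec = np.zeros(num_classes)
    vec[digit] = 1
    return vec

image = np.arange(28 * 28).reshape(28, 28)  # stand-in for one 28x28 image
flat = image.flatten()                      # the 784-element input vector

print(one_hot(4))   # 1 in position 4, 0 everywhere else
print(flat.shape)   # (784,)
```

Stacking 55,000 such flattened rows gives exactly the [55000, 784] training array described above.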
+ ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Extracting mnist/train-images-idx3-ubyte.gz\n", + "Extracting mnist/train-labels-idx1-ubyte.gz\n", + "Extracting mnist/t10k-images-idx3-ubyte.gz\n", + "Extracting mnist/t10k-labels-idx1-ubyte.gz\n" + ] + } + ], + "source": [ + "# Retrieve the training and test data\n", + "trainX, trainY, testX, testY = mnist.load_data(one_hot=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Visualize the training data\n", + "\n", + "Provided below is a function that will help you visualize the MNIST data. By passing in the index of a training example, the function `display_digit` will display that training image along with its corresponding label in the title." + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAP8AAAEICAYAAACQ6CLfAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAFIVJREFUeJzt3X2QHHWdx/H35wghQGIgZC+GpwSEUrgDo+xxRh4KIXA8\ngyWEBx9CIUYUVCBHHXKo1GGVHiIcyiFEiCQikQfloCw8JTwUaqGyQITEAMGwQbiQbMRAeLoQ+N4f\n3csNy07P7Dzv/j6vqq2d6W8/fKd3PtMz3T3bigjMLD1/0+4GzKw9HH6zRDn8Zoly+M0S5fCbJcrh\nN0tUEuGXtImklyTt2MhxG9DXDEm9zV5OmWV/RdJVNU7btr5brZ7H2unrqSPDn4ev/+dNSa+W3P/4\nUOcXEW9ExNiIeLqR47aSpNMk3duo+UXERRFxeqPm1yySzpX0nKQXJF0jaXSN82no+ms0STsPeN6/\nJCkkfalZy+zI8OfhGxsRY4GngaNKhv1o4PiSRrW+S2s2SUcAc4CPADsB7wW+2tammiQiVgx43n8A\neBP4SbOW2ZHhr0TS1yXdKGmhpPXAJyRNl/RbSeskrZL0HUmb5uOPyl9Fp+b3r8/rP5e0XtL9knYa\n6rh5/TBJT+Rbpu9K+o2kU8r0vYWkH0r6q6SlwF4D6hdIWpEvZ6mko/PhewBXAPvlW4S1+fCjJS2W\n9KKkpyV9ZYjr8Lr89i75Y/6UpGck9Uk6bwh9by/p1ny6pySdkQ+XpF9I+veScW+RNLfKNmcBcyNi\nWUQ8D1wEnFLtY6xW/q5gWb7e/yTptEHG+aqkv+SP78SS4WMkXSrpz5JWS7pS0pgGtPUp4O6IeKYB\n8xpcRHT0D9ALzBgw7OvABuAoshewzYF/AP4RGAXsDDwBnJmPPwoIYGp+/3pgLdANbArcCFxfw7h/\nC6wHjslr5wCvA6eUeSyXAPcCWwNTgD8CvSX1mcDk/DGdDLwETMprpwH3DpjfgcDf5eO/P+/zyCrX\n69eB6/Lbu+SP+SpgDPBB4H+BXSv1nS97MXA+MDqfVy9wUF7fFugD9icL85PAlnltJ2AdsG2ZHpcC\nHyu5Pynvc3wNz6N3rL+S2lH5c0b5On0V2DOvzQA2At8CNsvrrwC75PXvArfm6+ZdwB3ARSXTlv59\nrwa+U0WvytfhJ5qarXaHu4oV0cvg4b+7wnT/DNyc3x4s0FeVjHs0sKSGcU8FfjXgj7aK8uF/uvSx\nAJ8vfXIMMv4S4IhKT96S8a8AvlXleh0s/O8uqT8EHFepb2AfYMWAeX8F+H7J/RPyefwFmD6Ev/3K\nAcvdPO9z+xqeRxXXX8m4PwPOyG/PINvQbFFS/ynwZbIXvteAKSW1/YDlJdOW/fsWLP8jwIuly2zG\nz3D+rPzn0juS3gd8m+wt6RZkIf5dwfTPldx+BRhbw7jblvYRESGp6G3a5AF9rywt5h8XzibbupIv\nZ2K5mUmaDnyDbOs/mmzLtLBg+YUiotzjLOp7CrCjpHUlwzYhe6fQ7zbgO2QvmvcPoaWXyLam/cbn\nv9cPYR4VSTqS7AVrV7JAbwE8UDLKXyLilZL7K8n+9u8mW+d/kPTW7BrQ0iyyDdcrFcesw7D8zJ8b\n+HXEq8m2lLtExLvIdgw14g9RZBWwff8dZc+A7QrGfw7YoeT+W4cTJe0MfA/4HLBNRGwFPMb/P4bB\nvn75Y7IdQjtExHjgGprzmMv2TfaisDwitir5GRcRR5WM8w3gD8BUSccPYblLyT7O9Hs/8GxEvDDE\n/suStDlwS97jpHy9/5K3r8dt8vH67Qj8D7Ca7F3Be0se+/j8b1FrP1sCHwPm1zqPag3n8A80DngB\neFnSbsBnW7DMnwEflHRUfsThS0BXwfg3AedL2krZeQRnltTGkgW8j+x15DPA+0rqq4Ht+3di5sYB\nz0fEa5I+BJxYUiPfefeJWh9clX3fD2yQNCff+bWJpD0k7ZX3c
CDwcbIdWLOAKyVNrnK5C4DPSHqf\npAnABcB1/cV8Z+w1Q3gcf5P3+NYP2ZZ7NNl6fyN/F3DQwOmACyWNlnQAcBhwS0S8QfaC+x+SuvId\nnNtLOmQIPQ30MWAN8Ks65lGVkRT+OWRPrvVk7wJubPYCI2I12efZS8k+z74HeJhsZ9lgvkb2bqEX\n+DnZk7t/Xo+Q7Tz6fT7Oe3n7x5Y7geXAakn9b88/B3xD2RGP88lCCmR7ocl2QhV99KlWUd8bgcOB\nvfP6WrL1/y5JW5GF9fMR8VxE3JtPe23eY/+x7W0HW2hE/Ay4DLgvn/cTwL+VjLID8JshPI79yHbm\nvfUTEevIPmrdCjwPHEf2ol7qGeDlfB3MB06LiOV5bQ7Zx4Dfk218fkn28eEdlJ2ncEWFHmcBCyL/\n8N9MasEykiFpE7K3g8dFRNNfuSv0cgDw6Yj4ZDv7aJb8xe0hsr3yG9vdz3Dk8NdJ0qHAb8m2JF8m\n26v8nogot/U36wgj6W1/u+wLrCD7zPhPwEcdfBsOvOU3S5S3/GaJaulJPhMnToypU6e2cpFmSent\n7WXt2rVVnetRV/jznV2Xk53RdU1EfLNo/KlTp9LT01PPIs2sQHd3d9Xj1vy2Pz+s9Z9kJzzsDpwk\nafda52dmrVXPZ/69gScj+x7yBrJTTY9pTFtm1mz1hH873v5lj2cY5Lx2SbMl9Ujq6evrq2NxZtZI\nTd/bHxFzI6I7Irq7uopOezezVqon/M/y9m96bZ8PM7NhoJ7wPwDsKmknZf9U8UTg9sa0ZWbNVvOh\nvojYKOlM4Bdkh/rmRcTShnVmZk1V13H+iLiD7H+Wmdkw49N7zRLl8JslyuE3S5TDb5Yoh98sUQ6/\nWaIcfrNEOfxmiXL4zRLl8JslyuE3S5TDb5Yoh98sUQ6/WaIcfrNEOfxmiXL4zRLl8JslyuE3S5TD\nb5Yoh98sUQ6/WaIcfrNEOfxmiXL4zRLl8JslyuE3S5TDb5Yoh98sUQ6/WaLqukS3pF5gPfAGsDEi\nuhvRlJk1X13hz30kItY2YD5m1kJ+22+WqHrDH8AiSQ9Kmj3YCJJmS+qR1NPX11fn4sysUeoN/74R\nMQ04DDhD0v4DR4iIuRHRHRHdXV1ddS7OzBqlrvBHxLP57zXArcDejWjKzJqv5vBL2lLSuP7bwCHA\nkkY1ZmbNVc/e/knArZL653NDRPx3Q7oys6arOfwRsQJ4fwN7MbMW8qE+s0Q5/GaJcvjNEuXwmyXK\n4TdLVCO+2GNt9oMf/KBsLT8UW9Y222xTWF+2bFlhffr06YX1/fbbr7Bu7eMtv1miHH6zRDn8Zoly\n+M0S5fCbJcrhN0uUw2+WqBFznP+GG24orD/88MOF9Xnz5jWynZZat25dzdOOGlX8FNiwYUNhfcyY\nMYX1LbbYomxtzz33LJz2pptuKqz7P0PVx1t+s0Q5/GaJcvjNEuXwmyXK4TdLlMNvliiH3yxRw+o4\n/znnnFO2dvnllxdO++abbza6nRGh0nH8Sl577bWa6/fee2/htCeccEJhfeHChYX1SZMmFdZT5y2/\nWaIcfrNEOfxmiXL4zRLl8JslyuE3S5TDb5aoYXWc/+abby5bq3Qcv9J3xzfffPOaemqEffbZp7B+\n7LHHtqiToVu0aFFhfcGCBWVrvb29hdPec889hfWTTjqpsH7jjTeWrfl/AVSx5Zc0T9IaSUtKhk2Q\ndKek5fnvrZvbppk1WjVv+68DDh0w7DzgrojYFbgrv29mw0jF8EfEfcDzAwYfA8zPb88HOvd9qZkN\nqtYdfpMiYlV++zmg7EnUkmZL6pHU09fXV+PizKzR6t7bHxEBREF9bkR0R0S3d7KYdY5aw79a0mSA\n/PeaxrVkZq1Qa/hvB2blt2cBtzWmHTNrFWXv2gtGkBYCBwATgdXA14D/Am4CdgRWAjMjYuBOwXfo\n7u6Onp6empt94oknytaWL
FlStgZw8MEHF9bHjRtXU09WbMWKFWVrRxxxROG0jz32WF3LvuSSS8rW\n5syZU9e8O1V3dzc9PT2qZtyKJ/lERLkzKQ4aUldm1lF8eq9Zohx+s0Q5/GaJcvjNEuXwmyWq4qG+\nRqr3UJ+NLLfcckth/fjjj69r/hMnTixbG6mnmg/lUJ+3/GaJcvjNEuXwmyXK4TdLlMNvliiH3yxR\nDr9Zohx+s0Q5/GaJcvjNEuXwmyXK4TdLlMNvliiH3yxRDr9ZoobVJbpt+LnyyivL1pr9vx1effXV\nsrUHH3ywcNq99tqr0e10HG/5zRLl8JslyuE3S5TDb5Yoh98sUQ6/WaIcfrNE+Tj/CLBq1aqyteuv\nv75w2ssuu6zR7bxNUW/N9vLLL5etHXjggYXTvvDCC41up+NU3PJLmidpjaQlJcMulPSspMX5z+HN\nbdPMGq2at/3XAYcOMvyyiJiW/9zR2LbMrNkqhj8i7gOeb0EvZtZC9ezw+4KkR/KPBVuXG0nSbEk9\nknpG6vXRzIajWsP/PWBnYBqwCvh2uREjYm5EdEdEd1dXV42LM7NGqyn8EbE6It6IiDeB7wN7N7Yt\nM2u2msIvaXLJ3Y8CS8qNa2adqeJxfkkLgQOAiZKeAb4GHCBpGhBAL/DZJvY44i1atKiwXum751df\nfXXZ2lNPPVVTTyPdqaee2u4W2q5i+CPipEEGX9uEXsyshXx6r1miHH6zRDn8Zoly+M0S5fCbJcpf\n6W2A5cuXF9ZPP/30wvrdd9/dyHaGZMqUKYX1rbcue+Z2VS666KKytTFjxhROe+aZZxbWH3/88Zp6\nAth2221rnnak8JbfLFEOv1miHH6zRDn8Zoly+M0S5fCbJcrhN0uUj/NXqehfXF9xxRWF065YsaKw\nPnbs2ML6+PHjC+tnn3122Vql49kf/vCHC+uVzgNopkqPu5Jx48aVrR155JF1zXsk8JbfLFEOv1mi\nHH6zRDn8Zoly+M0S5fCbJcrhN0uUj/NX6f777y9bq3Qc/+ijjy6sz5kzp7C+//77F9aHq8WLFxfW\nV65cWdf8N9tss7K13Xbbra55jwTe8pslyuE3S5TDb5Yoh98sUQ6/WaIcfrNEOfxmiarmEt07AAuA\nSWSX5J4bEZdLmgDcCEwlu0z3zIj4a/Naba+rrrqqbG3PPfcsnPaCCy5odDsjwpNPPllYX716dV3z\nnzFjRl3Tj3TVbPk3AnMiYnfgQ8AZknYHzgPuiohdgbvy+2Y2TFQMf0SsioiH8tvrgWXAdsAxwPx8\ntPnAsc1q0swab0if+SVNBT4A/A6YFBGr8tJzZB8LzGyYqDr8ksYCPwHOiogXS2sREWT7Awabbrak\nHkk9fX19dTVrZo1TVfglbUoW/B9FxE/zwaslTc7rk4E1g00bEXMjojsiuru6uhrRs5k1QMXwSxJw\nLbAsIi4tKd0OzMpvzwJua3x7ZtYs1Xyldx/gk8Cjkvq/g3k+8E3gJkmfBlYCM5vTYmeYMGFC2ZoP\n5dWm6GvS1dhqq60K61/84hfrmv9IVzH8EfFrQGXKBzW2HTNrFZ/hZ5Yoh98sUQ6/WaIcfrNEOfxm\niXL4zRLlf91tTbXHHnuUrT322GN1zfuQQw4prE+fPr2u+Y903vKbJcrhN0uUw2+WKIffLFEOv1mi\nHH6zRDn8ZonycX5rqt7e3rK1jRs3Fk47fvz4wvpZZ51VS0uW85bfLFEOv1miHH6zRDn8Zoly+M0S\n5fCbJcrhN0uUj/NbXRYuXFhYf+WVV8rWxo0bVzjt3LlzC+v+vn59vOU3S5TDb5Yoh98sUQ6/WaIc\nfrNEOfxmiXL4zRJV8Ti/pB2ABcAkIIC5EXG5pAuBzwB9+ajnR8QdzWrU2uP1118vrF988cWF9dGj\nR5etHXfccYXTzpw5s7Bu9anmJJ+NwJyIeEjSOOBBSXfmtcsi4pLmtWdmzVIx/BGxCliV314
vaRmw\nXbMbM7PmGtJnfklTgQ8Av8sHfUHSI5LmSdq6zDSzJfVI6unr6xtsFDNrg6rDL2ks8BPgrIh4Efge\nsDMwjeydwbcHmy4i5kZEd0R0d3V1NaBlM2uEqsIvaVOy4P8oIn4KEBGrI+KNiHgT+D6wd/PaNLNG\nqxh+SQKuBZZFxKUlwyeXjPZRYEnj2zOzZqlmb/8+wCeBRyUtzoedD5wkaRrZ4b9e4LNN6dDaKnvt\nL+/kk08urE+bNq1s7eCDD66pJ2uMavb2/xoY7BngY/pmw5jP8DNLlMNvliiH3yxRDr9Zohx+s0Q5\n/GaJ8r/utkKjRhU/Rc4999wWdWKN5i2/WaIcfrNEOfxmiXL4zRLl8JslyuE3S5TDb5YoRUTrFib1\nAStLBk0E1rasgaHp1N46tS9wb7VqZG9TIqKq/5fX0vC/Y+FST0R0t62BAp3aW6f2Be6tVu3qzW/7\nzRLl8Jslqt3hn9vm5Rfp1N46tS9wb7VqS29t/cxvZu3T7i2/mbWJw2+WqLaEX9Khkh6X9KSk89rR\nQzmSeiU9KmmxpJ429zJP0hpJS0qGTZB0p6Tl+e9Br5HYpt4ulPRsvu4WSzq8Tb3tIOkeSX+UtFTS\nl/LhbV13BX21Zb21/DO/pE2AJ4CDgWeAB4CTIuKPLW2kDEm9QHdEtP2EEEn7Ay8BCyLi7/NhFwPP\nR8Q38xfOrSPiXzqktwuBl9p92fb8alKTSy8rDxwLnEIb111BXzNpw3prx5Z/b+DJiFgRERuAHwPH\ntKGPjhcR9wHPDxh8DDA/vz2f7MnTcmV66wgRsSoiHspvrwf6Lyvf1nVX0FdbtCP82wF/Lrn/DG1c\nAYMIYJGkByXNbnczg5gUEavy288Bk9rZzCAqXra9lQZcVr5j1l0tl7tvNO/we6d9I2IacBhwRv72\ntiNF9pmtk47VVnXZ9lYZ5LLyb2nnuqv1cveN1o7wPwvsUHJ/+3xYR4iIZ/Pfa4Bb6bxLj6/uv0Jy\n/ntNm/t5Syddtn2wy8rTAeuuky53347wPwDsKmknSaOBE4Hb29DHO0jaMt8Rg6QtgUPovEuP3w7M\nym/PAm5rYy9v0ymXbS93WXnavO467nL3EdHyH+Bwsj3+fwL+tR09lOlrZ+AP+c/SdvcGLCR7G/g6\n2b6RTwPbAHcBy4FFwIQO6u2HwKPAI2RBm9ym3vYle0v/CLA4/zm83euuoK+2rDef3muWKO/wM0uU\nw2+WKIffLFEOv1miHH6zRDn8Zoly+M0S9X9n3/JewkbQBgAAAABJRU5ErkJggg==\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Visualizing the data\n", + "import matplotlib.pyplot as plt\n", + "%matplotlib inline\n", + "\n", + "# Function for displaying a training image by it's index in the MNIST set\n", + "def display_digit(index):\n", + " label = trainY[index].argmax(axis=0)\n", + " # Reshape 784 array into 28x28 image\n", + " image = trainX[index].reshape([28,28])\n", + " plt.title('Training data, index: %d, Label: %d' % (index, label))\n", + " plt.imshow(image, cmap='gray_r')\n", + " plt.show()\n", + " \n", + "# Display the first (index 0) training image\n", + "display_digit(0)" 
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": false + }, + "source": [ + "## Building the network\n", + "\n", + "TFLearn lets you build the network by defining the layers in that network. \n", + "\n", + "For this example, you'll define:\n", + "\n", + "1. The input layer, which tells the network the number of inputs it should expect for each piece of MNIST data. \n", + "2. Hidden layers, which recognize patterns in data and connect the input to the output layer, and\n", + "3. The output layer, which defines how the network learns and outputs a label for a given image.\n", + "\n", + "Let's start with the input layer; to define the input layer, you'll define the type of data that the network expects. For example,\n", + "\n", + "```\n", + "net = tflearn.input_data([None, 100])\n", + "```\n", + "\n", + "would create a network with 100 inputs. The number of inputs to your network needs to match the size of your data. For this example, we're using 784-element vectors to encode our input data, so we need **784 input units**.\n", + "\n", + "\n", + "### Adding layers\n", + "\n", + "To add new hidden layers, you use \n", + "\n", + "```\n", + "net = tflearn.fully_connected(net, n_units, activation='ReLU')\n", + "```\n", + "\n", + "This adds a fully connected layer where every unit (or node) in the previous layer is connected to every unit in this layer. The first argument `net` is the network you created in the `tflearn.input_data` call; it designates the input to the hidden layer. You can set the number of units in the layer with `n_units`, and set the activation function with the `activation` keyword. You can keep adding layers to your network by repeatedly calling `tflearn.fully_connected(net, n_units)`. 
\n", + "\n", + "Then, to set how you train the network, use:\n", + "\n", + "```\n", + "net = tflearn.regression(net, optimizer='sgd', learning_rate=0.1, loss='categorical_crossentropy')\n", + "```\n", + "\n", + "Again, this is passing in the network you've been building. The keywords: \n", + "\n", + "* `optimizer` sets the training method, here stochastic gradient descent\n", + "* `learning_rate` is the learning rate\n", + "* `loss` determines how the network error is calculated. In this example, with categorical cross-entropy.\n", + "\n", + "Finally, you put all this together to create the model with `tflearn.DNN(net)`." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Define the neural network\n", + "def build_model():\n", + " # This resets all parameters and variables, leave this here\n", + " tf.reset_default_graph()\n", + " \n", + " # Inputs\n", + " net = tflearn.input_data([None, trainX.shape[1]])\n", + "\n", + " # Hidden layer(s)\n", + " net = tflearn.fully_connected(net, 128, activation='ReLU')\n", + " net = tflearn.fully_connected(net, 32, activation='ReLU')\n", + " \n", + " # Output layer and training model\n", + " net = tflearn.fully_connected(net, 10, activation='softmax')\n", + " net = tflearn.regression(net, optimizer='sgd', learning_rate=0.01, loss='categorical_crossentropy')\n", + " \n", + " model = tflearn.DNN(net)\n", + " return model" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "WARNING:tensorflow:From //anaconda3/envs/tensorflow/lib/python3.5/site-packages/tflearn/summaries.py:46 in get_summary.: scalar_summary (from tensorflow.python.ops.logging_ops) is deprecated and will be removed after 2016-11-30.\n", + "Instructions for updating:\n", + "Please switch to tf.summary.scalar. 
Note that tf.summary.scalar uses the node name instead of the tag. This means that TensorFlow will automatically de-duplicate summary names based on the scope they are created in. Also, passing a tensor or list of tags to a scalar summary op is no longer supported.\n", + "WARNING:tensorflow:From //anaconda3/envs/tensorflow/lib/python3.5/site-packages/tflearn/summaries.py:46 in get_summary.: scalar_summary (from tensorflow.python.ops.logging_ops) is deprecated and will be removed after 2016-11-30.\n", + "Instructions for updating:\n", + "Please switch to tf.summary.scalar. Note that tf.summary.scalar uses the node name instead of the tag. This means that TensorFlow will automatically de-duplicate summary names based on the scope they are created in. Also, passing a tensor or list of tags to a scalar summary op is no longer supported.\n", + "WARNING:tensorflow:From //anaconda3/envs/tensorflow/lib/python3.5/site-packages/tflearn/helpers/trainer.py:766 in create_summaries.: merge_summary (from tensorflow.python.ops.logging_ops) is deprecated and will be removed after 2016-11-30.\n", + "Instructions for updating:\n", + "Please switch to tf.summary.merge.\n", + "WARNING:tensorflow:VARIABLES collection name is deprecated, please use GLOBAL_VARIABLES instead; VARIABLES will be removed after 2017-03-02.\n", + "WARNING:tensorflow:From //anaconda3/envs/tensorflow/lib/python3.5/site-packages/tflearn/helpers/trainer.py:130 in __init__.: initialize_all_variables (from tensorflow.python.ops.variables) is deprecated and will be removed after 2017-03-02.\n", + "Instructions for updating:\n", + "Use `tf.global_variables_initializer` instead.\n" + ] + } + ], + "source": [ + "# Build the model\n", + "model = build_model()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training the network\n", + "\n", + "Now that we've constructed the network, saved as the variable `model`, we can fit it to the data. Here we use the `model.fit` method. 
You pass in the training features `trainX` and the training targets `trainY`. Below I set `validation_set=0.1` which reserves 10% of the data set as the validation set. You can also set the batch size and number of epochs with the `batch_size` and `n_epoch` keywords, respectively.\n", + "\n", + "Too few epochs don't effectively train your network, and too many take a long time to execute. Choose wisely!" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training Step: 49500 | total loss: \u001b[1m\u001b[32m0.06119\u001b[0m\u001b[0m\n", + "| SGD | epoch: 100 | loss: 0.06119 - acc: 0.9824 | val_loss: 0.10663 - val_acc: 0.9705 -- iter: 49500/49500\n", + "Training Step: 49500 | total loss: \u001b[1m\u001b[32m0.06119\u001b[0m\u001b[0m\n", + "| SGD | epoch: 100 | loss: 0.06119 - acc: 0.9824 | val_loss: 0.10663 - val_acc: 0.9705 -- iter: 49500/49500\n", + "--\n" + ] + } + ], + "source": [ + "# Training\n", + "model.fit(trainX, trainY, validation_set=0.1, show_metric=True, batch_size=100, n_epoch=100)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Testing\n", + "After you're satisfied with the training output and accuracy, you can then run the network on the **test data set** to measure its performance! Remember, only do this after you've done the training and are satisfied with the results.\n", + "\n", + "A good result will be **higher than 95% accuracy**. Some simple models have been known to get up to 99.7% accuracy!" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Test accuracy: 0.9704\n" + ] + } + ], + "source": [ + "# Compare the labels that our model predicts with the actual labels\n", + "\n", + "# Find the indices of the most confident prediction for each item. 
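`validation_set=0.1` is why the training log reports `iter: 49500/49500`: 10% of the 55,000 training points are held out. TFLearn handles the split internally; this standalone sketch just shows the arithmetic of an equivalent manual split:

```python
import numpy as np

n_samples = 55000
indices = np.random.permutation(n_samples)  # shuffle before splitting

split = int(n_samples * 0.9)                # 90% train, 10% validation
train_idx, val_idx = indices[:split], indices[split:]

print(len(train_idx), len(val_idx))         # 49500 and 5500
```

The `val_loss` and `val_acc` columns in the log are computed on those held-out 5,500 examples, which the optimizer never trains on.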
That tells us the predicted digit for that sample.\n", + "predictions = np.array(model.predict(testX)).argmax(axis=1)\n", + "\n", + "# Calculate the accuracy, which is the percentage of times the predicted labels matched the actual labels\n", + "actual = testY.argmax(axis=1)\n", + "test_accuracy = np.mean(predictions == actual, axis=0)\n", + "\n", + "# Print out the result\n", + "print(\"Test accuracy: \", test_accuracy)" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/intro-to-tflearn/Handwritten Digit Recognition with TFLearn.ipynb b/intro-to-tflearn/Handwritten Digit Recognition with TFLearn.ipynb new file mode 100644 index 0000000..f029341 --- /dev/null +++ b/intro-to-tflearn/Handwritten Digit Recognition with TFLearn.ipynb @@ -0,0 +1,269 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Handwritten Number Recognition with TFLearn and MNIST\n", + "\n", + "In this notebook, we'll be building a neural network that recognizes handwritten numbers 0-9. \n", + "\n", + "This kind of neural network is used in a variety of real-world applications, including recognizing phone numbers and sorting postal mail by address. To build the network, we'll be using the **MNIST** data set, which consists of images of handwritten numbers and their correct labels 0-9.\n", + "\n", + "We'll be using [TFLearn](http://tflearn.org/), a high-level library built on top of TensorFlow, to build the neural network. We'll start off by importing all the modules we'll need, then load the data, and finally build the network." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Import Numpy, TensorFlow, TFLearn, and MNIST data\n", + "import numpy as np\n", + "import tensorflow as tf\n", + "import tflearn\n", + "import tflearn.datasets.mnist as mnist" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Retrieving training and test data\n", + "\n", + "The MNIST data set already contains both training and test data. There are 55,000 data points of training data, and 10,000 points of test data.\n", + "\n", + "Each MNIST data point has:\n", + "1. an image of a handwritten digit and \n", + "2. a corresponding label (a number 0-9 that identifies the image)\n", + "\n", + "We'll call the images, which will be the input to our neural network, **X** and their corresponding labels **Y**.\n", + "\n", + "We're going to want our labels as *one-hot vectors*, which are vectors that hold mostly 0s and a single 1. It's easiest to see this with an example. As a one-hot vector, the number 0 is represented as [1, 0, 0, 0, 0, 0, 0, 0, 0, 0], and 4 is represented as [0, 0, 0, 0, 1, 0, 0, 0, 0, 0].\n", + "\n", + "### Flattened data\n", + "\n", + "For this example, we'll be using *flattened* data, or a representation of MNIST images in one dimension rather than two. So, each handwritten number image, which is 28x28 pixels, will be represented as a one-dimensional array of 784 pixel values. \n", + "\n", + "Flattening the data throws away information about the 2D structure of the image, but it simplifies our data so that all of the training data can be contained in one array whose shape is [55000, 784]; the first dimension is the number of training images and the second dimension is the number of pixels in each image. This is the kind of data that is easy to analyze using a simple neural network." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Retrieve the training and test data\n", + "trainX, trainY, testX, testY = mnist.load_data(one_hot=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Visualize the training data\n", + "\n", + "Provided below is a function that will help you visualize the MNIST data. By passing in the index of a training example, the function `show_digit` will display that training image along with its corresponding label in the title." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Visualizing the data\n", + "import matplotlib.pyplot as plt\n", + "%matplotlib inline\n", + "\n", + "# Function for displaying a training image by its index in the MNIST set\n", + "def show_digit(index):\n", + " label = trainY[index].argmax(axis=0)\n", + " # Reshape 784 array into 28x28 image\n", + " image = trainX[index].reshape([28,28])\n", + " plt.title('Training data, index: %d, Label: %d' % (index, label))\n", + " plt.imshow(image, cmap='gray_r')\n", + " plt.show()\n", + " \n", + "# Display the first (index 0) training image\n", + "show_digit(0)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": false + }, + "source": [ + "## Building the network\n", + "\n", + "TFLearn lets you build the network by defining the layers in that network. \n", + "\n", + "For this example, you'll define:\n", + "\n", + "1. The input layer, which tells the network the number of inputs it should expect for each piece of MNIST data. \n", + "2. Hidden layers, which recognize patterns in data and connect the input to the output layer, and\n", + "3. 
The output layer, which defines how the network learns and outputs a label for a given image.\n", + "\n", + "Let's start with the input layer; to define the input layer, you'll define the type of data that the network expects. For example,\n", + "\n", + "```\n", + "net = tflearn.input_data([None, 100])\n", + "```\n", + "\n", + "would create a network with 100 inputs. The number of inputs to your network needs to match the size of your data. For this example, we're using 784-element vectors to encode our input data, so we need **784 input units**.\n", + "\n", + "\n", + "### Adding layers\n", + "\n", + "To add new hidden layers, you use \n", + "\n", + "```\n", + "net = tflearn.fully_connected(net, n_units, activation='ReLU')\n", + "```\n", + "\n", + "This adds a fully connected layer where every unit (or node) in the previous layer is connected to every unit in this layer. The first argument `net` is the network you created in the `tflearn.input_data` call; it designates the input to the hidden layer. You can set the number of units in the layer with `n_units`, and set the activation function with the `activation` keyword. You can keep adding layers to your network by repeatedly calling `tflearn.fully_connected(net, n_units)`. \n", + "\n", + "Then, to set how you train the network, use:\n", + "\n", + "```\n", + "net = tflearn.regression(net, optimizer='sgd', learning_rate=0.1, loss='categorical_crossentropy')\n", + "```\n", + "\n", + "Again, this is passing in the network you've been building. The keywords: \n", + "\n", + "* `optimizer` sets the training method, here stochastic gradient descent\n", + "* `learning_rate` is the learning rate\n", + "* `loss` determines how the network error is calculated. In this example, with categorical cross-entropy.\n", + "\n", + "Finally, you put all this together to create the model with `tflearn.DNN(net)`." 
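To make the layer stack concrete, here is a NumPy forward pass through the same kind of network (784 inputs, one ReLU hidden layer, a softmax output). This only illustrates what the TFLearn layers compute; the layer sizes and random weights are invented for the sketch, and it is not solution code for the exercise:

```python
import numpy as np

rng = np.random.RandomState(0)

# Parameters for a 784 -> 128 -> 10 network
W1, b1 = rng.randn(784, 128) * 0.01, np.zeros(128)
W2, b2 = rng.randn(128, 10) * 0.01, np.zeros(10)

x = rng.rand(784)                    # one flattened image

hidden = np.maximum(0, x @ W1 + b1)  # fully connected layer + ReLU
logits = hidden @ W2 + b2            # output layer scores
exps = np.exp(logits - logits.max())
probs = exps / exps.sum()            # softmax: probabilities over the 10 digits

print(probs.shape)                   # (10,), and the entries sum to 1
```

Each `tflearn.fully_connected` call adds one matrix multiply plus bias and activation like the lines above; training just adjusts the W and b arrays.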
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "**Exercise:** Below in the `build_model()` function, you'll put together the network using TFLearn. You get to choose how many layers to use, how many hidden units, etc.\n", + "\n", + "**Hint:** The final output layer must have 10 output nodes (one for each digit 0-9). It's also recommended to use a `softmax` activation layer as your final output layer. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Define the neural network\n", + "def build_model():\n", + " # This resets all parameters and variables, leave this here\n", + " tf.reset_default_graph()\n", + " \n", + " #### Your code ####\n", + " # Include the input layer, hidden layer(s), and set how you want to train the model\n", + " \n", + " # This model assumes that your network is named \"net\" \n", + " model = tflearn.DNN(net)\n", + " return model" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Build the model\n", + "model = build_model()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training the network\n", + "\n", + "Now that we've constructed the network, saved as the variable `model`, we can fit it to the data. Here we use the `model.fit` method. You pass in the training features `trainX` and the training targets `trainY`. Below I set `validation_set=0.1` which reserves 10% of the data set as the validation set. You can also set the batch size and number of epochs with the `batch_size` and `n_epoch` keywords, respectively. \n", + "\n", + "Too few epochs don't effectively train your network, and too many take a long time to execute. Choose wisely!" 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Training\n", + "model.fit(trainX, trainY, validation_set=0.1, show_metric=True, batch_size=100, n_epoch=20)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Testing\n", + "After you're satisfied with the training output and accuracy, you can then run the network on the **test data set** to measure its performance! Remember, only do this after you've done the training and are satisfied with the results.\n", + "\n", + "A good result will be **higher than 95% accuracy**. Some simple models have been known to get up to 99.7% accuracy!" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Compare the labels that our model predicts with the actual labels\n", + "\n", + "# Find the indices of the most confident prediction for each item. That tells us the predicted digit for that sample.\n", + "predictions = np.array(model.predict(testX)).argmax(axis=1)\n", + "\n", + "# Calculate the accuracy, which is the percentage of times the predicted labels matched the actual labels\n", + "actual = testY.argmax(axis=1)\n", + "test_accuracy = np.mean(predictions == actual, axis=0)\n", + "\n", + "# Print out the result\n", + "print(\"Test accuracy: \", test_accuracy)" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/intro-to-tflearn/Sentiment Analysis with TFLearn - Solution.ipynb b/intro-to-tflearn/Sentiment 
Analysis with TFLearn - Solution.ipynb new file mode 100644 index 0000000..77a3168 --- /dev/null +++ b/intro-to-tflearn/Sentiment Analysis with TFLearn - Solution.ipynb @@ -0,0 +1,629 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sentiment analysis with TFLearn\n", + "\n", + "In this notebook, we'll continue Andrew Trask's work by building a network for sentiment analysis on the movie review data. Instead of a network written with Numpy, we'll be using [TFLearn](http://tflearn.org/), a high-level library built on top of TensorFlow. TFLearn makes it simpler to build networks just by defining the layers. It takes care of most of the details for you.\n", + "\n", + "We'll start off by importing all the modules we'll need, then load and prepare the data." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import pandas as pd\n", + "import numpy as np\n", + "import tensorflow as tf\n", + "import tflearn\n", + "from tflearn.data_utils import to_categorical" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Preparing the data\n", + "\n", + "Following along with Andrew, our goal here is to convert our reviews into word vectors. The word vectors will have elements representing words in the total vocabulary. If the second position represents the word 'the', for each review we'll count up the number of times 'the' appears in the text and set the second position to that count. I'll show you examples as we build the input data from the reviews data. Check out Andrew's notebook and video for more about this." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Read the data\n", + "\n", + "Use the pandas library to read the reviews and positive/negative labels from comma-separated files. The data we're using has already been preprocessed a bit and we know it uses only lower case characters. 
If we were working from raw data, where we didn't know it was all lower case, we would want to add a step here to convert it. That's so we treat different variations of the same word, like `The`, `the`, and `THE`, all the same way." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "reviews = pd.read_csv('reviews.txt', header=None)\n", + "labels = pd.read_csv('labels.txt', header=None)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Counting word frequency\n", + "\n", + "To start off we'll need to count how often each word appears in the data. We'll use this count to create a vocabulary we'll use to encode the review data. This resulting count is known as a [bag of words](https://en.wikipedia.org/wiki/Bag-of-words_model). We'll use it to select our vocabulary and build the word vectors. You should have seen how to do this in Andrew's lesson. Try to implement it here using the [Counter class](https://docs.python.org/2/library/collections.html#collections.Counter).\n", + "\n", + "> **Exercise:** Create the bag of words from the reviews data and assign it to `total_counts`. The reviews are stored in the `reviews` [Pandas DataFrame](http://pandas.pydata.org/pandas-docs/stable/generated/pandas.DataFrame.html). If you want the reviews as a Numpy array, use `reviews.values`. You can iterate through the rows in the DataFrame with `for idx, row in reviews.iterrows():` ([documentation](http://pandas.pydata.org/pandas-docs/stable/generated/pandas.DataFrame.iterrows.html)). When you break up the reviews into words, use `.split(' ')` instead of `.split()` so your results match ours."
+ ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Total words in data set: 74074\n" + ] + } + ], + "source": [ + "from collections import Counter\n", + "total_counts = Counter()\n", + "for _, row in reviews.iterrows():\n", + " total_counts.update(row[0].split(' '))\n", + "print(\"Total words in data set: \", len(total_counts))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's keep the first 10000 most frequent words. As Andrew noted, most of the words in the vocabulary are rarely used so they will have little effect on our predictions. Below, we'll sort `vocab` by the count value and keep the 10000 most frequent words." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "['', 'the', '.', 'and', 'a', 'of', 'to', 'is', 'br', 'it', 'in', 'i', 'this', 'that', 's', 'was', 'as', 'for', 'with', 'movie', 'but', 'film', 'you', 'on', 't', 'not', 'he', 'are', 'his', 'have', 'be', 'one', 'all', 'at', 'they', 'by', 'an', 'who', 'so', 'from', 'like', 'there', 'her', 'or', 'just', 'about', 'out', 'if', 'has', 'what', 'some', 'good', 'can', 'more', 'she', 'when', 'very', 'up', 'time', 'no']\n" + ] + } + ], + "source": [ + "vocab = sorted(total_counts, key=total_counts.get, reverse=True)[:10000]\n", + "print(vocab[:60])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "What's the last word in our vocabulary? We can use this to judge if 10000 is too few. If the last word is pretty common, we probably need to keep more words." 
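One way to sanity-check a vocabulary cutoff like this is to measure how much of the corpus the truncated vocabulary still covers. Here's a minimal sketch on a made-up toy corpus (the reviews and the cutoff `N` below are purely illustrative, not the real data):

```python
from collections import Counter

# Toy stand-in for the real reviews data (hypothetical).
toy_reviews = [
    "the movie was great great fun",
    "the movie was dull",
    "a great film",
]

counts = Counter()
for review in toy_reviews:
    counts.update(review.split(' '))

# Keep the N most frequent words, mirroring the real trimming step below.
N = 4
toy_vocab = sorted(counts, key=counts.get, reverse=True)[:N]

# Fraction of all word occurrences the truncated vocabulary still covers.
coverage = sum(counts[w] for w in toy_vocab) / sum(counts.values())
print(toy_vocab, coverage)
```

With the real data, the same ratio computed for the top 10000 words tells you directly how much of the corpus falls outside the vocabulary.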
+ ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "float : 30\n" + ] + } + ], + "source": [ + "print(vocab[-1], ': ', total_counts[vocab[-1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The last word in our vocabulary shows up 30 times in the 25000 reviews. I think it's fair to say this is a tiny fraction of the data. We are probably fine with this number of words.\n", + "\n", + "**Note:** When you run, you may see a different word from the one shown above, but it will also have the value `30`. That's because there are many words tied for that number of counts, and the `Counter` class does not guarantee which one will be returned in the case of a tie.\n", + "\n", + "Now for each review in the data, we'll make a word vector. First we need to make a mapping of word to index, pretty easy to do with a dictionary comprehension.\n", + "\n", + "> **Exercise:** Create a dictionary called `word2idx` that maps each word in the vocabulary to an index. The first word in `vocab` has index `0`, the second word has index `1`, and so on." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "word2idx = {word: i for i, word in enumerate(vocab)}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Text to vector function\n", + "\n", + "Now we can write a function that converts some text to a word vector. The function will take a string of words as input and return a vector with the words counted up. Here's the general algorithm to do this:\n", + "\n", + "* Initialize the word vector with [np.zeros](https://docs.scipy.org/doc/numpy/reference/generated/numpy.zeros.html); it should be the length of the vocabulary.\n", + "* Split the input string of text into a list of words with `.split(' ')`.
Again, if you call `.split()` instead, you'll get slightly different results than what we show here.\n", + "* For each word in that list, increment the element in the index associated with that word, which you get from `word2idx`.\n", + "\n", + "**Note:** Since not all words are in the `vocab` dictionary, you'll get a key error if you run into one of those words. You can use the `.get` method of the `word2idx` dictionary to specify a default value to return when the key is missing. For example, `word2idx.get(word, None)` returns `None` if `word` doesn't exist in the dictionary." + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def text_to_vector(text):\n", + " word_vector = np.zeros(len(vocab), dtype=np.int_)\n", + " for word in text.split(' '):\n", + " idx = word2idx.get(word, None)\n", + " if idx is None:\n", + " continue\n", + " else:\n", + " word_vector[idx] += 1\n", + " return word_vector" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If you do this right, the following code should return\n", + "\n", + "```\n", + "text_to_vector('The tea is for a party to celebrate '\n", + " 'the movie so she has no time for a cake')[:65]\n", + " \n", + "array([0, 1, 0, 0, 2, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0, 1, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0])\n", + "``` " + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([0, 1, 0, 0, 2, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0, 1, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0])" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [
+ "text_to_vector('The tea is for a party to celebrate '\n", + " 'the movie so she has no time for a cake')[:65]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, run through our entire review data set and convert each review to a word vector." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "word_vectors = np.zeros((len(reviews), len(vocab)), dtype=np.int_)\n", + "for ii, (_, text) in enumerate(reviews.iterrows()):\n", + " word_vectors[ii] = text_to_vector(text[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18, 9, 27, 1, 4, 4, 6, 4, 0, 2, 2, 5, 0,\n", + " 4, 1, 0, 2, 0, 0, 0, 0, 0, 0],\n", + " [ 5, 4, 8, 1, 7, 3, 1, 2, 0, 4, 0, 0, 0,\n", + " 1, 2, 0, 0, 1, 3, 0, 0, 0, 1],\n", + " [ 78, 24, 12, 4, 17, 5, 20, 2, 8, 8, 2, 1, 1,\n", + " 2, 8, 0, 5, 5, 4, 0, 2, 1, 4],\n", + " [167, 53, 23, 0, 22, 23, 13, 14, 8, 10, 8, 12, 9,\n", + " 4, 11, 2, 11, 5, 11, 0, 5, 3, 0],\n", + " [ 19, 10, 11, 4, 6, 2, 2, 5, 0, 1, 2, 3, 1,\n", + " 0, 0, 0, 3, 1, 0, 1, 0, 0, 0]])" + ] + }, + "execution_count": 10, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# Printing out the first 5 word vectors\n", + "word_vectors[:5, :23]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Train, Validation, Test sets\n", + "\n", + "Now that we have the word_vectors, we're ready to split our data into train, validation, and test sets. Remember that we train on the train data, use the validation data to set the hyperparameters, and at the very end measure the network performance on the test data. Here we're using the function `to_categorical` from TFLearn to reshape the target data so that we'll have two output units and can classify with a softmax activation function. 
We actually won't be creating the validation set here; TFLearn will do that for us later." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "Y = (labels=='positive').astype(np.int_)\n", + "records = len(labels)\n", + "\n", + "shuffle = np.arange(records)\n", + "np.random.shuffle(shuffle)\n", + "train_fraction = 0.9\n", + "\n", + "train_split, test_split = shuffle[:int(records*train_fraction)], shuffle[int(records*train_fraction):]\n", + "trainX, trainY = word_vectors[train_split,:], to_categorical(Y.values[train_split], 2)\n", + "testX, testY = word_vectors[test_split,:], to_categorical(Y.values[test_split], 2)" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 0., 1.],\n", + " [ 1., 0.],\n", + " [ 1., 0.],\n", + " ..., \n", + " [ 1., 0.],\n", + " [ 1., 0.],\n", + " [ 0., 1.]])" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "trainY" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Building the network\n", + "\n", + "[TFLearn](http://tflearn.org/) lets you build the network by [defining the layers](http://tflearn.org/layers/core/). \n", + "\n", + "### Input layer\n", + "\n", + "For the input layer, you just need to tell it how many units you have. For example, \n", + "\n", + "```\n", + "net = tflearn.input_data([None, 100])\n", + "```\n", + "\n", + "would create a network with 100 input units. The first element in the list, `None` in this case, sets the batch size. Setting it to `None` here leaves the batch size unspecified, so it can be set at training time.\n", + "\n", + "The number of inputs to your network needs to match the size of your data.
For this example, we're using 10000-element vectors to encode our input data, so we need 10000 input units.\n", + "\n", + "\n", + "### Adding layers\n", + "\n", + "To add new hidden layers, you use \n", + "\n", + "```\n", + "net = tflearn.fully_connected(net, n_units, activation='ReLU')\n", + "```\n", + "\n", + "This adds a fully connected layer where every unit in the previous layer is connected to every unit in this layer. The first argument `net` is the network you created in the `tflearn.input_data` call. It's telling the network to use the output of the previous layer as the input to this layer. You can set the number of units in the layer with `n_units`, and set the activation function with the `activation` keyword. You can keep adding layers to your network by repeatedly calling `net = tflearn.fully_connected(net, n_units)`.\n", + "\n", + "### Output layer\n", + "\n", + "The last layer you add is used as the output layer. Therefore, you need to set the number of units to match the target data. In this case we are predicting two classes, positive or negative sentiment. You also need to set the activation function so it's appropriate for your model. Again, we're trying to predict if some input data belongs to one of two classes, so we should use softmax.\n", + "\n", + "```\n", + "net = tflearn.fully_connected(net, 2, activation='softmax')\n", + "```\n", + "\n", + "### Training\n", + "To set how you train the network, use \n", + "\n", + "```\n", + "net = tflearn.regression(net, optimizer='sgd', learning_rate=0.1, loss='categorical_crossentropy')\n", + "```\n", + "\n", + "Again, this is passing in the network you've been building. The keywords: \n", + "\n", + "* `optimizer` sets the training method, here stochastic gradient descent\n", + "* `learning_rate` is the learning rate\n", + "* `loss` determines how the network error is calculated.
In this example, it's the categorical cross-entropy.\n", + "\n", + "Finally, you put all this together to create the model with `tflearn.DNN(net)`. So it ends up looking something like \n", + "\n", + "```\n", + "net = tflearn.input_data([None, 10]) # Input\n", + "net = tflearn.fully_connected(net, 5, activation='ReLU') # Hidden\n", + "net = tflearn.fully_connected(net, 2, activation='softmax') # Output\n", + "net = tflearn.regression(net, optimizer='sgd', learning_rate=0.1, loss='categorical_crossentropy')\n", + "model = tflearn.DNN(net)\n", + "```\n", + "\n", + "> **Exercise:** Below in the `build_model()` function, you'll put together the network using TFLearn. You get to choose how many layers to use, how many hidden units, etc." + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Network building\n", + "def build_model():\n", + " # This resets all parameters and variables, leave this here\n", + " tf.reset_default_graph()\n", + " \n", + " # Inputs\n", + " net = tflearn.input_data([None, 10000])\n", + "\n", + " # Hidden layer(s)\n", + " net = tflearn.fully_connected(net, 200, activation='ReLU')\n", + " net = tflearn.fully_connected(net, 25, activation='ReLU')\n", + "\n", + " # Output layer\n", + " net = tflearn.fully_connected(net, 2, activation='softmax')\n", + " net = tflearn.regression(net, optimizer='sgd', \n", + " learning_rate=0.1, \n", + " loss='categorical_crossentropy')\n", + " \n", + " model = tflearn.DNN(net)\n", + " return model" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Initializing the model\n", + "\n", + "Next we need to call the `build_model()` function to actually build the model. In my solution I haven't included any arguments to the function, but you can add arguments so you can change parameters in the model if you want.\n", + "\n", + "> **Note:** You might get a bunch of warnings here.
TFLearn uses a lot of deprecated code in TensorFlow. Hopefully it gets updated to the new TensorFlow version soon." + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "WARNING:tensorflow:From //anaconda3/envs/dl/lib/python3.5/site-packages/tflearn/summaries.py:46 in get_summary.: scalar_summary (from tensorflow.python.ops.logging_ops) is deprecated and will be removed after 2016-11-30.\n", + "Instructions for updating:\n", + "Please switch to tf.summary.scalar. Note that tf.summary.scalar uses the node name instead of the tag. This means that TensorFlow will automatically de-duplicate summary names based on the scope they are created in. Also, passing a tensor or list of tags to a scalar summary op is no longer supported.\n", + "WARNING:tensorflow:From //anaconda3/envs/dl/lib/python3.5/site-packages/tflearn/summaries.py:46 in get_summary.: scalar_summary (from tensorflow.python.ops.logging_ops) is deprecated and will be removed after 2016-11-30.\n", + "Instructions for updating:\n", + "Please switch to tf.summary.scalar. Note that tf.summary.scalar uses the node name instead of the tag. This means that TensorFlow will automatically de-duplicate summary names based on the scope they are created in. 
Also, passing a tensor or list of tags to a scalar summary op is no longer supported.\n", + "WARNING:tensorflow:From //anaconda3/envs/dl/lib/python3.5/site-packages/tflearn/helpers/trainer.py:766 in create_summaries.: merge_summary (from tensorflow.python.ops.logging_ops) is deprecated and will be removed after 2016-11-30.\n", + "Instructions for updating:\n", + "Please switch to tf.summary.merge.\n", + "WARNING:tensorflow:VARIABLES collection name is deprecated, please use GLOBAL_VARIABLES instead; VARIABLES will be removed after 2017-03-02.\n", + "WARNING:tensorflow:From //anaconda3/envs/dl/lib/python3.5/site-packages/tflearn/helpers/trainer.py:130 in __init__.: initialize_all_variables (from tensorflow.python.ops.variables) is deprecated and will be removed after 2017-03-02.\n", + "Instructions for updating:\n", + "Use `tf.global_variables_initializer` instead.\n" + ] + } + ], + "source": [ + "model = build_model()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training the network\n", + "\n", + "Now that we've constructed the network, saved as the variable `model`, we can fit it to the data. Here we use the `model.fit` method. You pass in the training features `trainX` and the training targets `trainY`. Below I set `validation_set=0.1` which reserves 10% of the data set as the validation set. You can also set the batch size and number of epochs with the `batch_size` and `n_epoch` keywords, respectively. Below is the code to fit the network to our word vectors.\n", + "\n", + "You can rerun `model.fit` to train the network further if you think you can increase the validation accuracy. Remember, all hyperparameter adjustments must be done using the validation set.
**Only use the test set after you're completely done training the network.**" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "collapsed": false, + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training Step: 15900 | total loss: \u001b[1m\u001b[32m0.28905\u001b[0m\u001b[0m\n", + "| SGD | epoch: 100 | loss: 0.28905 - acc: 0.8768 | val_loss: 0.45618 - val_acc: 0.8351 -- iter: 20250/20250\n", + "Training Step: 15900 | total loss: \u001b[1m\u001b[32m0.28905\u001b[0m\u001b[0m\n", + "| SGD | epoch: 100 | loss: 0.28905 - acc: 0.8768 | val_loss: 0.45618 - val_acc: 0.8351 -- iter: 20250/20250\n", + "--\n" + ] + } + ], + "source": [ + "# Training\n", + "model.fit(trainX, trainY, validation_set=0.1, show_metric=True, batch_size=128, n_epoch=100)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Testing\n", + "\n", + "After you're satisfied with your hyperparameters, you can run the network on the test set to measure its performance. Remember, *only do this after finalizing the hyperparameters*." + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Test accuracy: 0.8504\n" + ] + } + ], + "source": [ + "predictions = (np.array(model.predict(testX))[:,0] >= 0.5).astype(np.int_)\n", + "test_accuracy = np.mean(predictions == testY[:,0], axis=0)\n", + "print(\"Test accuracy: \", test_accuracy)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Try out your own text!"
+ ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Helper function that uses your model to predict sentiment\n", + "def test_sentence(sentence):\n", + " positive_prob = model.predict([text_to_vector(sentence.lower())])[0][1]\n", + " print('Sentence: {}'.format(sentence))\n", + " print('P(positive) = {:.3f} :'.format(positive_prob), \n", + " 'Positive' if positive_prob > 0.5 else 'Negative')" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Sentence: Moonlight is by far the best movie of 2016.\n", + "P(positive) = 0.932 : Positive\n", + "Sentence: It's amazing anyone could be talented enough to make something this spectacularly awful\n", + "P(positive) = 0.002 : Negative\n" + ] + } + ], + "source": [ + "sentence = \"Moonlight is by far the best movie of 2016.\"\n", + "test_sentence(sentence)\n", + "\n", + "sentence = \"It's amazing anyone could be talented enough to make something this spectacularly awful\"\n", + "test_sentence(sentence)" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/intro-to-tflearn/Sentiment Analysis with TFLearn.ipynb b/intro-to-tflearn/Sentiment Analysis with TFLearn.ipynb new file mode 100644 index 0000000..7d320b3 --- /dev/null +++ b/intro-to-tflearn/Sentiment Analysis with TFLearn.ipynb @@ -0,0 +1,487 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sentiment analysis 
with TFLearn\n", + "\n", + "In this notebook, we'll continue Andrew Trask's work by building a network for sentiment analysis on the movie review data. Instead of a network written with Numpy, we'll be using [TFLearn](http://tflearn.org/), a high-level library built on top of TensorFlow. TFLearn makes it simpler to build networks just by defining the layers. It takes care of most of the details for you.\n", + "\n", + "We'll start off by importing all the modules we'll need, then load and prepare the data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import pandas as pd\n", + "import numpy as np\n", + "import tensorflow as tf\n", + "import tflearn\n", + "from tflearn.data_utils import to_categorical" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Preparing the data\n", + "\n", + "Following along with Andrew, our goal here is to convert our reviews into word vectors. The word vectors will have elements representing words in the total vocabulary. If the second position represents the word 'the', for each review we'll count up the number of times 'the' appears in the text and set the second position to that count. I'll show you examples as we build the input data from the reviews data. Check out Andrew's notebook and video for more about this." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Read the data\n", + "\n", + "Use the pandas library to read the reviews and positive/negative labels from comma-separated files. The data we're using has already been preprocessed a bit and we know it uses only lower case characters. If we were working from raw data, where we didn't know it was all lower case, we would want to add a step here to convert it. That's so we treat different variations of the same word, like `The`, `the`, and `THE`, all the same way."
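To see why that normalization step matters, here's a tiny self-contained sketch (the two sample reviews are made up for illustration): without lowercasing, capitalized variants of a word inflate the vocabulary with duplicate entries.

```python
from collections import Counter

# Hypothetical raw reviews with inconsistent casing.
raw_reviews = ["The movie was GREAT", "the movie was great"]

# Counting without normalization treats 'The'/'the' and 'GREAT'/'great' as different words.
raw_counts = Counter(' '.join(raw_reviews).split(' '))

# Lowercasing first merges those variants into single vocabulary entries.
norm_counts = Counter(' '.join(r.lower() for r in raw_reviews).split(' '))

print(len(raw_counts))   # 6 distinct tokens
print(len(norm_counts))  # 4 distinct tokens
```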
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "reviews = pd.read_csv('reviews.txt', header=None)\n", + "labels = pd.read_csv('labels.txt', header=None)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Counting word frequency\n", + "\n", + "To start off we'll need to count how often each word appears in the data. We'll use this count to create a vocabulary we'll use to encode the review data. This resulting count is known as a [bag of words](https://en.wikipedia.org/wiki/Bag-of-words_model). We'll use it to select our vocabulary and build the word vectors. You should have seen how to do this in Andrew's lesson. Try to implement it here using the [Counter class](https://docs.python.org/2/library/collections.html#collections.Counter).\n", + "\n", + "> **Exercise:** Create the bag of words from the reviews data and assign it to `total_counts`. The reviews are stored in the `reviews` [Pandas DataFrame](http://pandas.pydata.org/pandas-docs/stable/generated/pandas.DataFrame.html). If you want the reviews as a Numpy array, use `reviews.values`. You can iterate through the rows in the DataFrame with `for idx, row in reviews.iterrows():` ([documentation](http://pandas.pydata.org/pandas-docs/stable/generated/pandas.DataFrame.iterrows.html)). When you break up the reviews into words, use `.split(' ')` instead of `.split()` so your results match ours." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from collections import Counter\n", + "\n", + "total_counts = # bag of words here\n", + "\n", + "print(\"Total words in data set: \", len(total_counts))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's keep the first 10000 most frequent words.
As Andrew noted, most of the words in the vocabulary are rarely used so they will have little effect on our predictions. Below, we'll sort `vocab` by the count value and keep the 10000 most frequent words." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "vocab = sorted(total_counts, key=total_counts.get, reverse=True)[:10000]\n", + "print(vocab[:60])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "What's the last word in our vocabulary? We can use this to judge if 10000 is too few. If the last word is pretty common, we probably need to keep more words." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "print(vocab[-1], ': ', total_counts[vocab[-1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The last word in our vocabulary shows up 30 times in the 25000 reviews. I think it's fair to say this is a tiny fraction of the data. We are probably fine with this number of words.\n", + "\n", + "**Note:** When you run, you may see a different word from the one shown above, but it will also have the value `30`. That's because there are many words tied for that number of counts, and the `Counter` class does not guarantee which one will be returned in the case of a tie.\n", + "\n", + "Now for each review in the data, we'll make a word vector. First we need to make a mapping of word to index, pretty easy to do with a dictionary comprehension.\n", + "\n", + "> **Exercise:** Create a dictionary called `word2idx` that maps each word in the vocabulary to an index. The first word in `vocab` has index `0`, the second word has index `1`, and so on."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "word2idx = ## create the word-to-index dictionary here" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Text to vector function\n", + "\n", + "Now we can write a function that converts some text to a word vector. The function will take a string of words as input and return a vector with the words counted up. Here's the general algorithm to do this:\n", + "\n", + "* Initialize the word vector with [np.zeros](https://docs.scipy.org/doc/numpy/reference/generated/numpy.zeros.html); it should be the length of the vocabulary.\n", + "* Split the input string of text into a list of words with `.split(' ')`. Again, if you call `.split()` instead, you'll get slightly different results than what we show here.\n", + "* For each word in that list, increment the element in the index associated with that word, which you get from `word2idx`.\n", + "\n", + "**Note:** Since not all words are in the `vocab` dictionary, you'll get a key error if you run into one of those words. You can use the `.get` method of the `word2idx` dictionary to specify a default value to return when the key is missing. For example, `word2idx.get(word, None)` returns `None` if `word` doesn't exist in the dictionary."
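In isolation, the `.get` behavior the note describes looks like this (toy mapping with made-up entries, standing in for the real `word2idx`):

```python
# A tiny stand-in for the real word2idx mapping (hypothetical values).
toy_word2idx = {'the': 0, 'movie': 1}

# Square-bracket indexing raises KeyError for words outside the vocabulary,
# but .get returns the supplied default instead of raising.
print(toy_word2idx.get('movie', None))     # 1
print(toy_word2idx.get('aardvark', None))  # None
```

Inside `text_to_vector`, checking the returned value against `None` lets you simply skip words that aren't in the vocabulary.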
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def text_to_vector(text):\n", + " \n", + " pass" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If you do this right, the following code should return\n", + "\n", + "```\n", + "text_to_vector('The tea is for a party to celebrate '\n", + " 'the movie so she has no time for a cake')[:65]\n", + " \n", + "array([0, 1, 0, 0, 2, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0, 1, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0])\n", + "``` " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "text_to_vector('The tea is for a party to celebrate '\n", + " 'the movie so she has no time for a cake')[:65]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, run through our entire review data set and convert each review to a word vector." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "word_vectors = np.zeros((len(reviews), len(vocab)), dtype=np.int_)\n", + "for ii, (_, text) in enumerate(reviews.iterrows()):\n", + " word_vectors[ii] = text_to_vector(text[0])" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Printing out the first 5 word vectors\n", + "word_vectors[:5, :23]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Train, Validation, Test sets\n", + "\n", + "Now that we have the word_vectors, we're ready to split our data into train, validation, and test sets. 
Remember that we train on the train data, use the validation data to set the hyperparameters, and at the very end measure the network performance on the test data. Here we're using the function `to_categorical` from TFLearn to reshape the target data so that we'll have two output units and can classify with a softmax activation function. We actually won't be creating the validation set here; TFLearn will do that for us later." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "Y = (labels=='positive').astype(np.int_)\n", + "records = len(labels)\n", + "\n", + "shuffle = np.arange(records)\n", + "np.random.shuffle(shuffle)\n", + "train_fraction = 0.9\n", + "\n", + "train_split, test_split = shuffle[:int(records*train_fraction)], shuffle[int(records*train_fraction):]\n", + "trainX, trainY = word_vectors[train_split,:], to_categorical(Y.values[train_split], 2)\n", + "testX, testY = word_vectors[test_split,:], to_categorical(Y.values[test_split], 2)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "trainY" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Building the network\n", + "\n", + "[TFLearn](http://tflearn.org/) lets you build the network by [defining the layers](http://tflearn.org/layers/core/). \n", + "\n", + "### Input layer\n", + "\n", + "For the input layer, you just need to tell it how many units you have. For example, \n", + "\n", + "```\n", + "net = tflearn.input_data([None, 100])\n", + "```\n", + "\n", + "would create a network with 100 input units. The first element in the list, `None` in this case, sets the batch size. Setting it to `None` here leaves the batch size unspecified, so it can be set at training time.\n", + "\n", + "The number of inputs to your network needs to match the size of your data.
For this example, we're using 10,000-element vectors to encode our input data, so we need 10,000 input units.\n", + "\n", + "\n", + "### Adding layers\n", + "\n", + "To add new hidden layers, you use \n", + "\n", + "```\n", + "net = tflearn.fully_connected(net, n_units, activation='ReLU')\n", + "```\n", + "\n", + "This adds a fully connected layer where every unit in the previous layer is connected to every unit in this layer. The first argument `net` is the network you created in the `tflearn.input_data` call. It's telling the network to use the output of the previous layer as the input to this layer. You can set the number of units in the layer with `n_units`, and set the activation function with the `activation` keyword. You can keep adding layers to your network by repeatedly calling `net = tflearn.fully_connected(net, n_units)`.\n", + "\n", + "### Output layer\n", + "\n", + "The last layer you add is used as the output layer. Therefore, you need to set the number of units to match the target data. In this case we are predicting two classes, positive or negative sentiment. You also need to set the activation function so it's appropriate for your model. Again, we're trying to predict if some input data belongs to one of two classes, so we should use softmax.\n", + "\n", + "```\n", + "net = tflearn.fully_connected(net, 2, activation='softmax')\n", + "```\n", + "\n", + "### Training\n", + "To set how you train the network, use \n", + "\n", + "```\n", + "net = tflearn.regression(net, optimizer='sgd', learning_rate=0.1, loss='categorical_crossentropy')\n", + "```\n", + "\n", + "Again, this is passing in the network you've been building. The keywords: \n", + "\n", + "* `optimizer` sets the training method, here stochastic gradient descent\n", + "* `learning_rate` is the learning rate\n", + "* `loss` determines how the network error is calculated. 
In this example, it's calculated with categorical cross-entropy.\n", + "\n", + "Finally, you put all this together to create the model with `tflearn.DNN(net)`. So it ends up looking something like \n", + "\n", + "```\n", + "net = tflearn.input_data([None, 10]) # Input\n", + "net = tflearn.fully_connected(net, 5, activation='ReLU') # Hidden\n", + "net = tflearn.fully_connected(net, 2, activation='softmax') # Output\n", + "net = tflearn.regression(net, optimizer='sgd', learning_rate=0.1, loss='categorical_crossentropy')\n", + "model = tflearn.DNN(net)\n", + "```\n", + "\n", + "> **Exercise:** Below in the `build_model()` function, you'll put together the network using TFLearn. You get to choose how many layers to use, how many hidden units, etc." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Network building\n", + "def build_model():\n", + " # This resets all parameters and variables, leave this here\n", + " tf.reset_default_graph()\n", + " \n", + " #### Your code ####\n", + " \n", + " model = tflearn.DNN(net)\n", + " return model" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Initializing the model\n", + "\n", + "Next we need to call the `build_model()` function to actually build the model. In my solution I haven't included any arguments to the function, but you can add arguments so you can change parameters in the model if you want.\n", + "\n", + "> **Note:** You might get a bunch of warnings here. TFLearn uses a lot of deprecated code in TensorFlow. Hopefully it gets updated to the new TensorFlow version soon." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "model = build_model()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training the network\n", + "\n", + "Now that we've constructed the network, saved as the variable `model`, we can fit it to the data. Here we use the `model.fit` method. You pass in the training features `trainX` and the training targets `trainY`. Below I set `validation_set=0.1` which reserves 10% of the data set as the validation set. You can also set the batch size and number of epochs with the `batch_size` and `n_epoch` keywords, respectively. Below is the code to fit the network to our word vectors.\n", + "\n", + "You can rerun `model.fit` to train the network further if you think you can increase the validation accuracy. Remember, all hyperparameter adjustments must be done using the validation set. **Only use the test set after you're completely done training the network.**" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "scrolled": false + }, + "outputs": [], + "source": [ + "# Training\n", + "model.fit(trainX, trainY, validation_set=0.1, show_metric=True, batch_size=128, n_epoch=10)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Testing\n", + "\n", + "After you're satisfied with your hyperparameters, you can run the network on the test set to measure its performance. Remember, *only do this after finalizing the hyperparameters*." 
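As a small self-contained illustration of the accuracy calculation used in the test cell: we threshold the predicted probabilities at 0.5 and average the matches against the labels. The probabilities and labels here are made up for the example, not real model output:

```python
import numpy as np

# Made-up predicted probabilities for class 0 (not real model output)
probs = np.array([0.9, 0.2, 0.6, 0.4])
# Made-up true labels for class 0
true_labels = np.array([1, 0, 1, 1])

# Threshold at 0.5, then compare to the labels and average the matches
predictions = (probs >= 0.5).astype(np.int_)
accuracy = np.mean(predictions == true_labels)
print("Test accuracy: ", accuracy)  # 0.75
```

Three of the four thresholded predictions match the labels, giving an accuracy of 0.75.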
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "predictions = (np.array(model.predict(testX))[:,0] >= 0.5).astype(np.int_)\n", + "test_accuracy = np.mean(predictions == testY[:,0], axis=0)\n", + "print(\"Test accuracy: \", test_accuracy)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Try out your own text!" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "# Helper function that uses your model to predict sentiment\n", + "def test_sentence(sentence):\n", + " positive_prob = model.predict([text_to_vector(sentence.lower())])[0][1]\n", + " print('Sentence: {}'.format(sentence))\n", + " print('P(positive) = {:.3f} :'.format(positive_prob), \n", + " 'Positive' if positive_prob > 0.5 else 'Negative')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "sentence = \"Moonlight is by far the best movie of 2016.\"\n", + "test_sentence(sentence)\n", + "\n", + "sentence = \"It's amazing anyone could be talented enough to make something this spectacularly awful\"\n", + "test_sentence(sentence)" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/intro-to-tflearn/labels.txt b/intro-to-tflearn/labels.txt new file mode 100644 index 0000000..10366d9 --- /dev/null +++ b/intro-to-tflearn/labels.txt @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid 
sha256:301d18424e95957a000293f5d8393239451d7f54622a8f21613172aebe64ca06 +size 225000 diff --git a/intro-to-tflearn/reviews.txt b/intro-to-tflearn/reviews.txt new file mode 100644 index 0000000..2940b8f --- /dev/null +++ b/intro-to-tflearn/reviews.txt @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:d7c2f21e5c8f5240859910ba5331b9baa88310aa978dbd8747ebd4ff6ebbefa6 +size 33678267 diff --git a/sentiment-rnn/Sentiment RNN Solution.ipynb b/sentiment-rnn/Sentiment RNN Solution.ipynb new file mode 100644 index 0000000..79cb217 --- /dev/null +++ b/sentiment-rnn/Sentiment RNN Solution.ipynb @@ -0,0 +1,1149 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# Sentiment Analysis with an RNN\n", + "\n", + "In this notebook, you'll implement a recurrent neural network that performs sentiment analysis. Using an RNN rather than a feedforward network is more accurate since we can include information about the *sequence* of words. Here we'll use a dataset of movie reviews, accompanied by labels.\n", + "\n", + "The architecture for this network is shown below.\n", + "\n", + "\n", + "\n", + "Here, we'll pass in words to an embedding layer. We need an embedding layer because we have tens of thousands of words, so we'll need a more efficient representation for our input data than one-hot encoded vectors. You should have seen this before from the word2vec lesson. You can actually train up an embedding with word2vec and use it here. But it's good enough to just have an embedding layer and let the network learn the embedding table on its own.\n", + "\n", + "From the embedding layer, the new representations will be passed to LSTM cells. These will add recurrent connections to the network so we can include information about the sequence of words in the data. Finally, the LSTM cells will go to a sigmoid output layer here. 
We're using the sigmoid because we're trying to predict if this text has positive or negative sentiment. The output layer will just be a single unit then, with a sigmoid activation function.\n", + "\n", + "We don't care about the sigmoid outputs except for the very last one, we can ignore the rest. We'll calculate the cost from the output of the last step and the training label." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "import numpy as np\n", + "import tensorflow as tf" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "with open('reviews.txt', 'r') as f:\n", + " reviews = f.read()\n", + "with open('labels.txt', 'r') as f:\n", + " labels = f.read()" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy . it ran at the same time as some other programs about school life such as teachers . my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers . the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students . when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled . . . . . . . . . at . . . . . . . . . . high . a classic line inspector i m here to sack one of your teachers . student welcome to bromwell high . i expect that many adults of my age think that bromwell high is far fetched . what a pity that it isn t \\nstory of a man who has unnatural feelings for a pig . 
starts out with a opening scene that is a terrific example of absurd comedy . a formal orchestra audience is turned into an insane violent mob by the crazy chantings of it s singers . unfortunately it stays absurd the whole time with no general narrative eventually making it just too off putting . even those from the era should be turned off . the cryptic dialogue would make shakespeare seem easy to a third grader . on a technical level it s better than you might think with some good cinematography by future great vilmos zsigmond . future stars sally kirkland and frederic forrest can be seen briefly . \\nhomelessness or houselessness as george carlin stated has been an issue for years but never a plan to help those on the street that were once considered human who did everything from going to school work or vote for the matter . most people think of the homeless as just a lost cause while worrying about things such as racism the war on iraq pressuring kids to succeed technology the elections inflation or worrying if they ll be next to end up on the streets . br br but what if y'" + ] + }, + "execution_count": 3, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "reviews[:2000]" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Data preprocessing\n", + "\n", + "The first step when building a neural network model is getting your data into the proper form to feed into the network. Since we're using embedding layers, we'll need to encode each word with an integer. We'll also want to clean it up a bit.\n", + "\n", + "You can see an example of the reviews data above. We'll want to get rid of those periods. Also, you might notice that the reviews are delimited with newlines `\\n`. To deal with those, I'm going to split the text into each review using `\\n` as the delimiter. 
Then I can combine all the reviews back together into one big string.\n", + "\n", + "First, let's remove all punctuation. Then get all the text without the newlines and split it into individual words." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "from string import punctuation\n", + "all_text = ''.join([c for c in reviews if c not in punctuation])\n", + "reviews = all_text.split('\\n')\n", + "\n", + "all_text = ' '.join(reviews)\n", + "words = all_text.split()" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy it ran at the same time as some other programs about school life such as teachers my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled at high a classic line inspector i m here to sack one of your teachers student welcome to bromwell high i expect that many adults of my age think that bromwell high is far fetched what a pity that it isn t story of a man who has unnatural feelings for a pig starts out with a opening scene that is a terrific example of absurd comedy a formal orchestra audience is turned into an insane violent mob by the crazy chantings of it s singers unfortunately it stays absurd the whole time with no general narrative eventually making it just too off putting even those from the era should be turned off the cryptic dialogue would make shakespeare seem easy 
to a third grader on a technical level it s better than you might think with some good cinematography by future great vilmos zsigmond future stars sally kirkland and frederic forrest can be seen briefly homelessness or houselessness as george carlin stated has been an issue for years but never a plan to help those on the street that were once considered human who did everything from going to school work or vote for the matter most people think of the homeless as just a lost cause while worrying about things such as racism the war on iraq pressuring kids to succeed technology the elections inflation or worrying if they ll be next to end up on the streets br br but what if you were given a bet to live on the st'" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "all_text[:2000]" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['bromwell',\n", + " 'high',\n", + " 'is',\n", + " 'a',\n", + " 'cartoon',\n", + " 'comedy',\n", + " 'it',\n", + " 'ran',\n", + " 'at',\n", + " 'the',\n", + " 'same',\n", + " 'time',\n", + " 'as',\n", + " 'some',\n", + " 'other',\n", + " 'programs',\n", + " 'about',\n", + " 'school',\n", + " 'life',\n", + " 'such',\n", + " 'as',\n", + " 'teachers',\n", + " 'my',\n", + " 'years',\n", + " 'in',\n", + " 'the',\n", + " 'teaching',\n", + " 'profession',\n", + " 'lead',\n", + " 'me',\n", + " 'to',\n", + " 'believe',\n", + " 'that',\n", + " 'bromwell',\n", + " 'high',\n", + " 's',\n", + " 'satire',\n", + " 'is',\n", + " 'much',\n", + " 'closer',\n", + " 'to',\n", + " 'reality',\n", + " 'than',\n", + " 'is',\n", + " 'teachers',\n", + " 'the',\n", + " 'scramble',\n", + " 'to',\n", + " 'survive',\n", + " 'financially',\n", + " 'the',\n", + " 'insightful',\n", + " 'students',\n", + " 'who',\n", + " 'can',\n", + " 'see',\n", + " 'right',\n", + " 
'through',\n", + " 'their',\n", + " 'pathetic',\n", + " 'teachers',\n", + " 'pomp',\n", + " 'the',\n", + " 'pettiness',\n", + " 'of',\n", + " 'the',\n", + " 'whole',\n", + " 'situation',\n", + " 'all',\n", + " 'remind',\n", + " 'me',\n", + " 'of',\n", + " 'the',\n", + " 'schools',\n", + " 'i',\n", + " 'knew',\n", + " 'and',\n", + " 'their',\n", + " 'students',\n", + " 'when',\n", + " 'i',\n", + " 'saw',\n", + " 'the',\n", + " 'episode',\n", + " 'in',\n", + " 'which',\n", + " 'a',\n", + " 'student',\n", + " 'repeatedly',\n", + " 'tried',\n", + " 'to',\n", + " 'burn',\n", + " 'down',\n", + " 'the',\n", + " 'school',\n", + " 'i',\n", + " 'immediately',\n", + " 'recalled',\n", + " 'at',\n", + " 'high']" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "words[:100]" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Encoding the words\n", + "\n", + "The embedding lookup requires that we pass in integers to our network. The easiest way to do this is to create dictionaries that map the words in the vocabulary to integers. Then we can convert each of our reviews into integers so they can be passed into the network.\n", + "\n", + "> **Exercise:** Now you're going to encode the words with integers. Build a dictionary that maps words to integers. Later we're going to pad our input vectors with zeros, so make sure the integers **start at 1, not 0**.\n", + "> Also, convert the reviews to integers and store the reviews in a new list called `reviews_ints`. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "from collections import Counter\n", + "counts = Counter(words)\n", + "vocab = sorted(counts, key=counts.get, reverse=True)\n", + "vocab_to_int = {word: ii for ii, word in enumerate(vocab, 1)}\n", + "\n", + "reviews_ints = []\n", + "for each in reviews:\n", + " reviews_ints.append([vocab_to_int[word] for word in each.split()])" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "source": [ + "### Encoding the labels\n", + "\n", + "Our labels are \"positive\" or \"negative\". To use these labels in our network, we need to convert them to 0 and 1.\n", + "\n", + "> **Exercise:** Convert labels from `positive` and `negative` to 1 and 0, respectively." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "labels = labels.split('\\n')\n", + "labels = np.array([1 if each == 'positive' else 0 for each in labels])" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Zero-length reviews: 1\n", + "Maximum review length: 2514\n" + ] + } + ], + "source": [ + "review_lens = Counter([len(x) for x in reviews_ints])\n", + "print(\"Zero-length reviews: {}\".format(review_lens[0]))\n", + "print(\"Maximum review length: {}\".format(max(review_lens)))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "Okay, a couple issues here. We seem to have one review with zero length. And, the maximum review length is way too many steps for our RNN. Let's truncate to 200 steps. 
For reviews shorter than 200, we'll pad with 0s. For reviews longer than 200, we can truncate them to the first 200 words.\n", + "\n", + "> **Exercise:** First, remove the review with zero length from the `reviews_ints` list." + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "# Filter out that review with 0 length\n", + "reviews_ints = [each for each in reviews_ints if len(each) > 0]" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "> **Exercise:** Now, create an array `features` that contains the data we'll pass to the network. The data should come from `reviews_ints`, since we want to feed integers to the network. Each row should be 200 elements long. For reviews shorter than 200 words, left pad with 0s. That is, if the review is `['best', 'movie', 'ever']`, `[117, 18, 128]` as integers, the row will look like `[0, 0, 0, ..., 0, 117, 18, 128]`. For reviews longer than 200, use only the first 200 words as the feature vector.\n", + "\n", + "This isn't trivial and there are a bunch of ways to do this. 
But, if you're going to be building your own deep learning networks, you're going to have to get used to preparing your data.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "seq_len = 200\n", + "# Use reviews_ints here, since the zero-length review was filtered out above\n", + "features = np.zeros((len(reviews_ints), seq_len), dtype=int)\n", + "for i, row in enumerate(reviews_ints):\n", + " features[i, -len(row):] = np.array(row)[:seq_len]" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 21282, 308, 6,\n", + " 3, 1050, 207, 8, 2143, 32, 1, 171, 57,\n", + " 15, 49, 81, 5832, 44, 382, 110, 140, 15,\n", + " 5236, 60, 154, 9, 1, 5014, 5899, 475, 71,\n", + " 5, 260, 12, 21282, 308, 13, 1981, 6, 74,\n", + " 2396],\n", + " [ 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 63, 4, 3, 125,\n", + " 36, 47, 7487, 1397, 16, 3, 4212, 505, 45,\n", + " 17],\n", + " [23249, 42, 55982, 15, 707, 17288, 3398, 47, 77,\n", + " 35, 1822, 16, 154, 19, 114, 3, 1308, 5,\n", + " 336, 147, 22, 1, 857, 12, 70, 281, 1170,\n", + " 399, 36, 120, 283, 38, 169, 5, 382, 158,\n", + " 42, 2278, 16, 1, 541, 90, 78, 102, 4,\n", + " 1, 3256, 15, 43, 3, 407, 1069, 136, 8165,\n", + " 44, 182, 140, 15, 3054, 1, 321, 22, 4827,\n", + " 28571, 346, 5, 3093, 2094, 1, 18970, 18062, 42,\n", + " 8165, 46, 33, 
236, 29, 370, 5, 130, 56,\n", + " 22, 1, 1928, 7, 7, 19, 48, 46, 21,\n", + " 70, 345, 3, 2102, 5, 408, 22, 1, 1928,\n", + " 16],\n", + " [ 4504, 505, 15, 3, 3352, 162, 8369, 1655, 6,\n", + " 4860, 56, 17, 4513, 5629, 140, 11938, 5, 996,\n", + " 4969, 2947, 4464, 566, 1202, 36, 6, 1520, 96,\n", + " 3, 744, 4, 28265, 13, 5, 27, 3464, 9,\n", + " 10794, 4, 8, 111, 3024, 5, 1, 1027, 15,\n", + " 3, 4400, 82, 22, 2058, 6, 4464, 538, 2773,\n", + " 7123, 41932, 41, 463, 1, 8369, 62989, 302, 123,\n", + " 15, 4230, 19, 1671, 923, 1, 1655, 6, 6166,\n", + " 20692, 34, 1, 980, 1751, 22804, 646, 25292, 27,\n", + " 106, 11901, 13, 14278, 15496, 18701, 2459, 466, 21189,\n", + " 36, 3267, 1, 6436, 1020, 45, 17, 2700, 2500,\n", + " 33],\n", + " [ 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 520, 119, 113, 34,\n", + " 16673, 1817, 3744, 117, 885, 22019, 721, 10, 28,\n", + " 124, 108, 2, 115, 137, 9, 1626, 7742, 26,\n", + " 330, 5, 590, 1, 6165, 22, 386, 6, 3,\n", + " 349, 15, 50, 15, 231, 9, 7484, 11646, 1,\n", + " 191, 22, 8994, 6, 82, 881, 101, 111, 3590,\n", + " 4],\n", + " [ 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 11, 20, 3662, 141, 10, 423, 23, 272, 60,\n", + " 4362, 22, 32, 84, 3305, 22, 1, 172, 4,\n", + " 1, 952, 507, 11, 4996, 5387, 5, 574, 4,\n", + " 1154, 54, 53, 5328, 1, 261, 17, 41, 952,\n", + " 125, 59, 1, 712, 137, 379, 627, 15, 111,\n", + " 1511],\n", + " [ 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 11, 6, 692, 1, 
90,\n", + " 2158, 20, 11793, 1, 2818, 5218, 249, 92, 3007,\n", + " 8, 126, 24, 201, 3, 803, 634, 4, 23249,\n", + " 1002],\n", + " [ 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 786, 295, 10, 122, 11, 6, 418,\n", + " 5, 29, 35, 482, 20, 19, 1285, 33, 142,\n", + " 28, 2667, 45, 1844, 32, 1, 2790, 37, 78,\n", + " 97, 2439, 67, 3952, 45, 2, 24, 105, 256,\n", + " 1, 134, 1572, 2, 12612, 452, 14, 319, 11,\n", + " 63, 6, 98, 1322, 5, 105, 1, 3766, 4,\n", + " 3],\n", + " [ 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 11, 6,\n", + " 24, 1, 779, 3708, 2818, 20, 8, 14, 74,\n", + " 325, 2739, 73, 90, 4, 27, 99, 2, 165,\n", + " 68],\n", + " [ 54, 10, 14, 116, 60, 798, 552, 71, 364,\n", + " 5, 1, 731, 5, 66, 8089, 8, 14, 30,\n", + " 4, 109, 99, 10, 293, 17, 60, 798, 19,\n", + " 11, 14, 1, 64, 30, 69, 2506, 45, 4,\n", + " 234, 93, 10, 68, 114, 108, 8089, 363, 43,\n", + " 1009, 2, 10, 97, 28, 1431, 45, 1, 357,\n", + " 4, 60, 110, 205, 8, 48, 3, 1929, 11029,\n", + " 2, 2127, 354, 412, 4, 13, 6676, 2, 2975,\n", + " 5174, 2125, 1371, 6, 30, 4, 60, 502, 876,\n", + " 19, 8089, 6, 34, 227, 1, 247, 412, 4,\n", + " 582, 4, 27, 599, 9, 1, 13829, 395, 4,\n", + " 14175]])" + ] + }, + "execution_count": 13, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "features[:10,:100]" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Training, Validation, Test\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "With our data in nice shape, 
we'll split it into training, validation, and test sets.\n", + "\n", + "> **Exercise:** Create the training, validation, and test sets here. You'll need to create sets for the features and the labels, `train_x` and `train_y` for example. Define a split fraction, `split_frac` as the fraction of data to keep in the training set. Usually this is set to 0.8 or 0.9. The rest of the data will be split in half to create the validation and testing data." + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\t\t\tFeature Shapes:\n", + "Train set: \t\t(20000, 200) \n", + "Validation set: \t(2500, 200) \n", + "Test set: \t\t(2501, 200)\n" + ] + } + ], + "source": [ + "split_frac = 0.8\n", + "split_idx = int(len(features)*split_frac)\n", + "train_x, val_x = features[:split_idx], features[split_idx:]\n", + "train_y, val_y = labels[:split_idx], labels[split_idx:]\n", + "\n", + "test_idx = int(len(val_x)*0.5)\n", + "val_x, test_x = val_x[:test_idx], val_x[test_idx:]\n", + "val_y, test_y = val_y[:test_idx], val_y[test_idx:]\n", + "\n", + "print(\"\\t\\t\\tFeature Shapes:\")\n", + "print(\"Train set: \\t\\t{}\".format(train_x.shape), \n", + " \"\\nValidation set: \\t{}\".format(val_x.shape),\n", + " \"\\nTest set: \\t\\t{}\".format(test_x.shape))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "With train, validation, and test fractions of 0.8, 0.1, 0.1, the final shapes should look like:\n", + "```\n", + " Feature Shapes:\n", + "Train set: \t\t (20000, 200) \n", + "Validation set: \t(2500, 200) \n", + "Test set: \t\t (2501, 200)\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Build the graph\n", + "\n", + "Here, we'll build the graph. 
First up, defining the hyperparameters.\n", + "\n", + "* `lstm_size`: Number of units in the hidden layers in the LSTM cells. Usually, larger is better performance-wise. Common values are 128, 256, 512, etc.\n", + "* `lstm_layers`: Number of LSTM layers in the network. I'd start with 1, then add more if I'm underfitting.\n", + "* `batch_size`: The number of reviews to feed the network in one training pass. Typically this should be set as high as you can go without running out of memory.\n", + "* `learning_rate`: Learning rate" + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "lstm_size = 256\n", + "lstm_layers = 1\n", + "batch_size = 500\n", + "learning_rate = 0.001" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "For the network itself, we'll be passing in our 200-element review vectors. Each batch will be `batch_size` vectors. We'll also be using dropout on the LSTM layer, so we'll make a placeholder for the keep probability." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "> **Exercise:** Create the `inputs_`, `labels_`, and dropout `keep_prob` placeholders using `tf.placeholder`. `labels_` needs to be two-dimensional to work with some functions later. Since `keep_prob` is a scalar (a 0-dimensional tensor), you shouldn't provide a size to `tf.placeholder`." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 32, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "n_words = len(vocab)\n", + "\n", + "# Create the graph object\n", + "graph = tf.Graph()\n", + "# Add nodes to the graph\n", + "with graph.as_default():\n", + " inputs_ = tf.placeholder(tf.int32, [None, None], name='inputs')\n", + " labels_ = tf.placeholder(tf.int32, [None, None], name='labels')\n", + " keep_prob = tf.placeholder(tf.float32, name='keep_prob')" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Embedding\n", + "\n", + "Now we'll add an embedding layer. We need to do this because there are 74000 words in our vocabulary. It is massively inefficient to one-hot encode our classes here. You should remember dealing with this problem from the word2vec lesson. Instead of one-hot encoding, we can have an embedding layer and use that layer as a lookup table. You could train an embedding layer using word2vec, then load it here. But, it's fine to just make a new layer and let the network learn the weights.\n", + "\n", + "> **Exercise:** Create the embedding lookup matrix as a `tf.Variable`. Use that embedding matrix to get the embedded vectors to pass to the LSTM cell with [`tf.nn.embedding_lookup`](https://www.tensorflow.org/api_docs/python/tf/nn/embedding_lookup). This function takes the embedding matrix and an input tensor, such as the review vectors. Then, it'll return another tensor with the embedded vectors. 
So, if the embedding layer has 200 units and the input has shape [batch_size, seq_len], the function will return a tensor with size [batch_size, seq_len, 200].\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "# Size of the embedding vectors (number of units in the embedding layer)\n", + "embed_size = 300 \n", + "\n", + "with graph.as_default():\n", + " embedding = tf.Variable(tf.random_uniform((n_words, embed_size), -1, 1))\n", + " embed = tf.nn.embedding_lookup(embedding, inputs_)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### LSTM cell\n", + "\n", + "\n", + "\n", + "Next, we'll create our LSTM cells to use in the recurrent network ([TensorFlow documentation](https://www.tensorflow.org/api_docs/python/tf/contrib/rnn)). Here we are just defining what the cells look like. This isn't actually building the graph, just defining the type of cells we want in our graph.\n", + "\n", + "To create a basic LSTM cell for the graph, you'll want to use `tf.contrib.rnn.BasicLSTMCell`. Looking at the function documentation:\n", + "\n", + "```\n", + "tf.contrib.rnn.BasicLSTMCell(num_units, forget_bias=1.0, input_size=None, state_is_tuple=True, activation=)\n", + "```\n", + "\n", + "you can see it takes a parameter called `num_units`, the number of units in the cell, called `lstm_size` in this code. So then, you can write something like \n", + "\n", + "```\n", + "lstm = tf.contrib.rnn.BasicLSTMCell(num_units)\n", + "```\n", + "\n", + "to create an LSTM cell with `num_units`. Next, you can add dropout to the cell with `tf.contrib.rnn.DropoutWrapper`. This just wraps the cell in another cell, but with dropout added to the inputs and/or outputs. It's a really convenient way to make your network better with almost no effort!
So you'd do something like\n", + "\n", + "```\n", + "drop = tf.contrib.rnn.DropoutWrapper(cell, output_keep_prob=keep_prob)\n", + "```\n", + "\n", + "Most of the time, your network will have better performance with more layers. That's sort of the magic of deep learning: adding more layers allows the network to learn really complex relationships. Again, there is a simple way to create multiple layers of LSTM cells with `tf.contrib.rnn.MultiRNNCell`:\n", + "\n", + "```\n", + "cell = tf.contrib.rnn.MultiRNNCell([drop] * lstm_layers)\n", + "```\n", + "\n", + "Here, `[drop] * lstm_layers` creates a list of cells (`drop`) that is `lstm_layers` long. The `MultiRNNCell` wrapper builds this into multiple layers of RNN cells, one for each cell in the list.\n", + "\n", + "So the final cell you're using in the network is actually multiple (or just one) LSTM cells with dropout. But it all works the same from an architectural viewpoint, just a more complicated graph in the cell.\n", + "\n", + "> **Exercise:** Below, use `tf.contrib.rnn.BasicLSTMCell` to create an LSTM cell. Then, add dropout to it with `tf.contrib.rnn.DropoutWrapper`.
Finally, create multiple LSTM layers with `tf.contrib.rnn.MultiRNNCell`.\n", + "\n", + "Here is [a tutorial on building RNNs](https://www.tensorflow.org/tutorials/recurrent) that will help you out.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 34, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "with graph.as_default():\n", + " # Your basic LSTM cell\n", + " lstm = tf.contrib.rnn.BasicLSTMCell(lstm_size)\n", + " \n", + " # Add dropout to the cell\n", + " drop = tf.contrib.rnn.DropoutWrapper(lstm, output_keep_prob=keep_prob)\n", + " \n", + " # Stack up multiple LSTM layers, for deep learning\n", + " cell = tf.contrib.rnn.MultiRNNCell([drop] * lstm_layers)\n", + " \n", + " # Getting an initial state of all zeros\n", + " initial_state = cell.zero_state(batch_size, tf.float32)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### RNN forward pass\n", + "\n", + "\n", + "\n", + "Now we need to actually run the data through the RNN nodes. You can use [`tf.nn.dynamic_rnn`](https://www.tensorflow.org/api_docs/python/tf/nn/dynamic_rnn) to do this. You'd pass in the RNN cell you created (our multiple layered LSTM `cell` for instance), and the inputs to the network.\n", + "\n", + "```\n", + "outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, initial_state=initial_state)\n", + "```\n", + "\n", + "Above I created an initial state, `initial_state`, to pass to the RNN. This is the cell state that is passed between the hidden layers in successive time steps. `tf.nn.dynamic_rnn` takes care of most of the work for us. We pass in our cell and the input to the cell, then it does the unrolling and everything else for us. It returns outputs for each time step and the final_state of the hidden layer.\n", + "\n", + "> **Exercise:** Use `tf.nn.dynamic_rnn` to add the forward pass through the RNN. 
Remember that we're actually passing in vectors from the embedding layer, `embed`.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "with graph.as_default():\n", + " outputs, final_state = tf.nn.dynamic_rnn(cell, embed,\n", + " initial_state=initial_state)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Output\n", + "\n", + "We only care about the final output, since we'll be using that as our sentiment prediction. So we need to grab the last output with `outputs[:, -1]`, then calculate the cost from that and `labels_`." + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "with graph.as_default():\n", + " predictions = tf.contrib.layers.fully_connected(outputs[:, -1], 1, activation_fn=tf.sigmoid)\n", + " cost = tf.losses.mean_squared_error(labels_, predictions)\n", + " \n", + " optimizer = tf.train.AdamOptimizer(learning_rate).minimize(cost)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Validation accuracy\n", + "\n", + "Here we can add a few nodes to calculate the accuracy, which we'll use in the validation pass."
+ ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "with graph.as_default():\n", + " correct_pred = tf.equal(tf.cast(tf.round(predictions), tf.int32), labels_)\n", + " accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Batching\n", + "\n", + "This is a simple function for returning batches from our data. First it removes data such that we only have full batches. Then it iterates through the `x` and `y` arrays and returns slices out of those arrays with size `[batch_size]`." + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_batches(x, y, batch_size=100):\n", + " \n", + " n_batches = len(x)//batch_size\n", + " x, y = x[:n_batches*batch_size], y[:n_batches*batch_size]\n", + " for ii in range(0, len(x), batch_size):\n", + " yield x[ii:ii+batch_size], y[ii:ii+batch_size]" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Training\n", + "\n", + "Below is the typical training code. If you want to do this yourself, feel free to delete all this code and implement it yourself. Before you run this, make sure the `checkpoints` directory exists." 
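As a quick sanity check, the batching logic above can be exercised on toy arrays. This is a standalone sketch, independent of the notebook's review data, that shows how partial batches get dropped and how the slices line up:

```python
import numpy as np

def get_batches(x, y, batch_size=100):
    # Trim to a whole number of batches, then yield consecutive slices
    n_batches = len(x) // batch_size
    x, y = x[:n_batches * batch_size], y[:n_batches * batch_size]
    for ii in range(0, len(x), batch_size):
        yield x[ii:ii + batch_size], y[ii:ii + batch_size]

# 23 toy examples with batch_size=5 -> 4 full batches, 3 leftovers dropped
x = np.arange(23)
y = np.arange(23) * 10
batches = list(get_batches(x, y, batch_size=5))

print(len(batches))            # 4
print(batches[0][0].tolist())  # [0, 1, 2, 3, 4]
```

Because `get_batches` is a generator, it produces one `(x, y)` slice at a time, so the full dataset is never copied into a separate list of batches during training.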
+ ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Epoch: 0/10 Iteration: 5 Train loss: 0.244\n", + "Epoch: 0/10 Iteration: 10 Train loss: 0.237\n", + "Epoch: 0/10 Iteration: 15 Train loss: 0.198\n", + "Epoch: 0/10 Iteration: 20 Train loss: 0.211\n", + "Epoch: 0/10 Iteration: 25 Train loss: 0.197\n", + "Val acc: 0.713\n", + "Epoch: 0/10 Iteration: 30 Train loss: 0.211\n", + "Epoch: 0/10 Iteration: 35 Train loss: 0.176\n", + "Epoch: 0/10 Iteration: 40 Train loss: 0.170\n", + "Epoch: 1/10 Iteration: 45 Train loss: 0.166\n", + "Epoch: 1/10 Iteration: 50 Train loss: 0.165\n", + "Val acc: 0.749\n", + "Epoch: 1/10 Iteration: 55 Train loss: 0.144\n", + "Epoch: 1/10 Iteration: 60 Train loss: 0.134\n", + "Epoch: 1/10 Iteration: 65 Train loss: 0.128\n", + "Epoch: 1/10 Iteration: 70 Train loss: 0.151\n", + "Epoch: 1/10 Iteration: 75 Train loss: 0.121\n", + "Val acc: 0.744\n", + "Epoch: 1/10 Iteration: 80 Train loss: 0.136\n", + "Epoch: 2/10 Iteration: 85 Train loss: 0.116\n", + "Epoch: 2/10 Iteration: 90 Train loss: 0.134\n", + "Epoch: 2/10 Iteration: 95 Train loss: 0.103\n", + "Epoch: 2/10 Iteration: 100 Train loss: 0.106\n", + "Val acc: 0.807\n", + "Epoch: 2/10 Iteration: 105 Train loss: 0.088\n", + "Epoch: 2/10 Iteration: 110 Train loss: 0.135\n", + "Epoch: 2/10 Iteration: 115 Train loss: 0.102\n", + "Epoch: 2/10 Iteration: 120 Train loss: 0.150\n", + "Epoch: 3/10 Iteration: 125 Train loss: 0.162\n", + "Val acc: 0.682\n", + "Epoch: 3/10 Iteration: 130 Train loss: 0.186\n", + "Epoch: 3/10 Iteration: 135 Train loss: 0.220\n", + "Epoch: 3/10 Iteration: 140 Train loss: 0.209\n", + "Epoch: 3/10 Iteration: 145 Train loss: 0.199\n", + "Epoch: 3/10 Iteration: 150 Train loss: 0.180\n", + "Val acc: 0.701\n", + "Epoch: 3/10 Iteration: 155 Train loss: 0.151\n", + "Epoch: 3/10 Iteration: 160 Train loss: 0.156\n", + 
"Epoch: 4/10 Iteration: 165 Train loss: 0.127\n", + "Epoch: 4/10 Iteration: 170 Train loss: 0.150\n", + "Epoch: 4/10 Iteration: 175 Train loss: 0.148\n", + "Val acc: 0.739\n", + "Epoch: 4/10 Iteration: 180 Train loss: 0.108\n", + "Epoch: 4/10 Iteration: 185 Train loss: 0.074\n", + "Epoch: 4/10 Iteration: 190 Train loss: 0.096\n", + "Epoch: 4/10 Iteration: 195 Train loss: 0.103\n", + "Epoch: 4/10 Iteration: 200 Train loss: 0.094\n", + "Val acc: 0.810\n", + "Epoch: 5/10 Iteration: 205 Train loss: 0.090\n", + "Epoch: 5/10 Iteration: 210 Train loss: 0.111\n", + "Epoch: 5/10 Iteration: 215 Train loss: 0.108\n", + "Epoch: 5/10 Iteration: 220 Train loss: 0.077\n", + "Epoch: 5/10 Iteration: 225 Train loss: 0.075\n", + "Val acc: 0.802\n", + "Epoch: 5/10 Iteration: 230 Train loss: 0.072\n", + "Epoch: 5/10 Iteration: 235 Train loss: 0.070\n", + "Epoch: 5/10 Iteration: 240 Train loss: 0.084\n", + "Epoch: 6/10 Iteration: 245 Train loss: 0.058\n", + "Epoch: 6/10 Iteration: 250 Train loss: 0.073\n", + "Val acc: 0.801\n", + "Epoch: 6/10 Iteration: 255 Train loss: 0.078\n", + "Epoch: 6/10 Iteration: 260 Train loss: 0.062\n", + "Epoch: 6/10 Iteration: 265 Train loss: 0.080\n", + "Epoch: 6/10 Iteration: 270 Train loss: 0.067\n", + "Epoch: 6/10 Iteration: 275 Train loss: 0.053\n", + "Val acc: 0.788\n", + "Epoch: 6/10 Iteration: 280 Train loss: 0.070\n", + "Epoch: 7/10 Iteration: 285 Train loss: 0.108\n", + "Epoch: 7/10 Iteration: 290 Train loss: 0.059\n", + "Epoch: 7/10 Iteration: 295 Train loss: 0.065\n", + "Epoch: 7/10 Iteration: 300 Train loss: 0.113\n", + "Val acc: 0.783\n", + "Epoch: 7/10 Iteration: 305 Train loss: 0.261\n", + "Epoch: 7/10 Iteration: 310 Train loss: 0.219\n", + "Epoch: 7/10 Iteration: 315 Train loss: 0.086\n", + "Epoch: 7/10 Iteration: 320 Train loss: 0.124\n", + "Epoch: 8/10 Iteration: 325 Train loss: 0.098\n", + "Val acc: 0.762\n", + "Epoch: 8/10 Iteration: 330 Train loss: 0.081\n", + "Epoch: 8/10 Iteration: 335 Train loss: 0.056\n", + "Epoch: 8/10 Iteration: 
340 Train loss: 0.055\n", + "Epoch: 8/10 Iteration: 345 Train loss: 0.078\n", + "Epoch: 8/10 Iteration: 350 Train loss: 0.058\n", + "Val acc: 0.845\n", + "Epoch: 8/10 Iteration: 355 Train loss: 0.050\n", + "Epoch: 8/10 Iteration: 360 Train loss: 0.050\n", + "Epoch: 9/10 Iteration: 365 Train loss: 0.049\n", + "Epoch: 9/10 Iteration: 370 Train loss: 0.055\n", + "Epoch: 9/10 Iteration: 375 Train loss: 0.061\n", + "Val acc: 0.842\n", + "Epoch: 9/10 Iteration: 380 Train loss: 0.065\n", + "Epoch: 9/10 Iteration: 385 Train loss: 0.069\n", + "Epoch: 9/10 Iteration: 390 Train loss: 0.075\n", + "Epoch: 9/10 Iteration: 395 Train loss: 0.051\n", + "Epoch: 9/10 Iteration: 400 Train loss: 0.075\n", + "Val acc: 0.826\n" + ] + } + ], + "source": [ + "epochs = 10\n", + "\n", + "with graph.as_default():\n", + " saver = tf.train.Saver()\n", + "\n", + "with tf.Session(graph=graph) as sess:\n", + " sess.run(tf.global_variables_initializer())\n", + " iteration = 1\n", + " for e in range(epochs):\n", + " state = sess.run(initial_state)\n", + " \n", + " for ii, (x, y) in enumerate(get_batches(train_x, train_y, batch_size), 1):\n", + " feed = {inputs_: x,\n", + " labels_: y[:, None],\n", + " keep_prob: 0.5,\n", + " initial_state: state}\n", + " loss, state, _ = sess.run([cost, final_state, optimizer], feed_dict=feed)\n", + " \n", + " if iteration%5==0:\n", + " print(\"Epoch: {}/{}\".format(e, epochs),\n", + " \"Iteration: {}\".format(iteration),\n", + " \"Train loss: {:.3f}\".format(loss))\n", + "\n", + " if iteration%25==0:\n", + " val_acc = []\n", + " val_state = sess.run(cell.zero_state(batch_size, tf.float32))\n", + " for x, y in get_batches(val_x, val_y, batch_size):\n", + " feed = {inputs_: x,\n", + " labels_: y[:, None],\n", + " keep_prob: 1,\n", + " initial_state: val_state}\n", + " batch_acc, val_state = sess.run([accuracy, final_state], feed_dict=feed)\n", + " val_acc.append(batch_acc)\n", + " print(\"Val acc: {:.3f}\".format(np.mean(val_acc)))\n", + " iteration +=1\n", + " 
saver.save(sess, \"checkpoints/sentiment.ckpt\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Testing" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Test accuracy: 0.830\n" + ] + } + ], + "source": [ + "test_acc = []\n", + "with tf.Session(graph=graph) as sess:\n", + " saver.restore(sess, tf.train.latest_checkpoint('checkpoints'))\n", + " test_state = sess.run(cell.zero_state(batch_size, tf.float32))\n", + " for ii, (x, y) in enumerate(get_batches(test_x, test_y, batch_size), 1):\n", + " feed = {inputs_: x,\n", + " labels_: y[:, None],\n", + " keep_prob: 1,\n", + " initial_state: test_state}\n", + " batch_acc, test_state = sess.run([accuracy, final_state], feed_dict=feed)\n", + " test_acc.append(batch_acc)\n", + " print(\"Test accuracy: {:.3f}\".format(np.mean(test_acc)))" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.0" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/sentiment-rnn/Sentiment RNN.ipynb b/sentiment-rnn/Sentiment RNN.ipynb new file mode 100644 index 0000000..459bd55 --- /dev/null +++ b/sentiment-rnn/Sentiment RNN.ipynb @@ -0,0 +1,1034 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# Sentiment Analysis with an RNN\n", + "\n", + "In this notebook, you'll implement a recurrent neural network that performs sentiment analysis.
Using an RNN rather than a feedforward network is more accurate since we can include information about the *sequence* of words. Here we'll use a dataset of movie reviews, accompanied by labels.\n", + "\n", + "The architecture for this network is shown below.\n", + "\n", + "\n", + "\n", + "Here, we'll pass in words to an embedding layer. We need an embedding layer because we have tens of thousands of words, so we'll need a more efficient representation for our input data than one-hot encoded vectors. You should have seen this before from the word2vec lesson. You can actually train up an embedding with word2vec and use it here. But it's good enough to just have an embedding layer and let the network learn the embedding table on its own.\n", + "\n", + "From the embedding layer, the new representations will be passed to LSTM cells. These will add recurrent connections to the network so we can include information about the sequence of words in the data. Finally, the LSTM cells will go to a sigmoid output layer here. We're using the sigmoid because we're trying to predict if this text has positive or negative sentiment. The output layer will just be a single unit then, with a sigmoid activation function.\n", + "\n", + "We don't care about the sigmoid outputs except for the very last one, so we can ignore the rest. We'll calculate the cost from the output of the last step and the training label."
+ ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "import numpy as np\n", + "import tensorflow as tf" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "with open('reviews.txt', 'r') as f:\n", + " reviews = f.read()\n", + "with open('labels.txt', 'r') as f:\n", + " labels = f.read()" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy . it ran at the same time as some other programs about school life such as teachers . my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers . the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students . when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled . . . . . . . . . at . . . . . . . . . . high . a classic line inspector i m here to sack one of your teachers . student welcome to bromwell high . i expect that many adults of my age think that bromwell high is far fetched . what a pity that it isn t \\nstory of a man who has unnatural feelings for a pig . starts out with a opening scene that is a terrific example of absurd comedy . a formal orchestra audience is turned into an insane violent mob by the crazy chantings of it s singers . unfortunately it stays absurd the whole time with no general narrative eventually making it just too off putting . even those from the era should be turned off . 
the cryptic dialogue would make shakespeare seem easy to a third grader . on a technical level it s better than you might think with some good cinematography by future great vilmos zsigmond . future stars sally kirkland and frederic forrest can be seen briefly . \\nhomelessness or houselessness as george carlin stated has been an issue for years but never a plan to help those on the street that were once considered human who did everything from going to school work or vote for the matter . most people think of the homeless as just a lost cause while worrying about things such as racism the war on iraq pressuring kids to succeed technology the elections inflation or worrying if they ll be next to end up on the streets . br br but what if y'" + ] + }, + "execution_count": 3, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "reviews[:2000]" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Data preprocessing\n", + "\n", + "The first step when building a neural network model is getting your data into the proper form to feed into the network. Since we're using embedding layers, we'll need to encode each word with an integer. We'll also want to clean it up a bit.\n", + "\n", + "You can see an example of the reviews data above. We'll want to get rid of those periods. Also, you might notice that the reviews are delimited with newlines `\\n`. To deal with those, I'm going to split the text into each review using `\\n` as the delimiter. Then I can combine all the reviews back together into one big string.\n", + "\n", + "First, let's remove all punctuation. Then get all the text without the newlines and split it into individual words."
+ ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "from string import punctuation\n", + "all_text = ''.join([c for c in reviews if c not in punctuation])\n", + "reviews = all_text.split('\\n')\n", + "\n", + "all_text = ' '.join(reviews)\n", + "words = all_text.split()" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy it ran at the same time as some other programs about school life such as teachers my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled at high a classic line inspector i m here to sack one of your teachers student welcome to bromwell high i expect that many adults of my age think that bromwell high is far fetched what a pity that it isn t story of a man who has unnatural feelings for a pig starts out with a opening scene that is a terrific example of absurd comedy a formal orchestra audience is turned into an insane violent mob by the crazy chantings of it s singers unfortunately it stays absurd the whole time with no general narrative eventually making it just too off putting even those from the era should be turned off the cryptic dialogue would make shakespeare seem easy to a third grader on a technical level it s better than you might think with some good cinematography by future great vilmos zsigmond future stars sally kirkland and frederic forrest can be seen briefly 
homelessness or houselessness as george carlin stated has been an issue for years but never a plan to help those on the street that were once considered human who did everything from going to school work or vote for the matter most people think of the homeless as just a lost cause while worrying about things such as racism the war on iraq pressuring kids to succeed technology the elections inflation or worrying if they ll be next to end up on the streets br br but what if you were given a bet to live on the st'" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "all_text[:2000]" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['bromwell',\n", + " 'high',\n", + " 'is',\n", + " 'a',\n", + " 'cartoon',\n", + " 'comedy',\n", + " 'it',\n", + " 'ran',\n", + " 'at',\n", + " 'the',\n", + " 'same',\n", + " 'time',\n", + " 'as',\n", + " 'some',\n", + " 'other',\n", + " 'programs',\n", + " 'about',\n", + " 'school',\n", + " 'life',\n", + " 'such',\n", + " 'as',\n", + " 'teachers',\n", + " 'my',\n", + " 'years',\n", + " 'in',\n", + " 'the',\n", + " 'teaching',\n", + " 'profession',\n", + " 'lead',\n", + " 'me',\n", + " 'to',\n", + " 'believe',\n", + " 'that',\n", + " 'bromwell',\n", + " 'high',\n", + " 's',\n", + " 'satire',\n", + " 'is',\n", + " 'much',\n", + " 'closer',\n", + " 'to',\n", + " 'reality',\n", + " 'than',\n", + " 'is',\n", + " 'teachers',\n", + " 'the',\n", + " 'scramble',\n", + " 'to',\n", + " 'survive',\n", + " 'financially',\n", + " 'the',\n", + " 'insightful',\n", + " 'students',\n", + " 'who',\n", + " 'can',\n", + " 'see',\n", + " 'right',\n", + " 'through',\n", + " 'their',\n", + " 'pathetic',\n", + " 'teachers',\n", + " 'pomp',\n", + " 'the',\n", + " 'pettiness',\n", + " 'of',\n", + " 'the',\n", + " 'whole',\n", + " 'situation',\n", + " 'all',\n", + " 
'remind',\n", + " 'me',\n", + " 'of',\n", + " 'the',\n", + " 'schools',\n", + " 'i',\n", + " 'knew',\n", + " 'and',\n", + " 'their',\n", + " 'students',\n", + " 'when',\n", + " 'i',\n", + " 'saw',\n", + " 'the',\n", + " 'episode',\n", + " 'in',\n", + " 'which',\n", + " 'a',\n", + " 'student',\n", + " 'repeatedly',\n", + " 'tried',\n", + " 'to',\n", + " 'burn',\n", + " 'down',\n", + " 'the',\n", + " 'school',\n", + " 'i',\n", + " 'immediately',\n", + " 'recalled',\n", + " 'at',\n", + " 'high']" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "words[:100]" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Encoding the words\n", + "\n", + "The embedding lookup requires that we pass in integers to our network. The easiest way to do this is to create dictionaries that map the words in the vocabulary to integers. Then we can convert each of our reviews into integers so they can be passed into the network.\n", + "\n", + "> **Exercise:** Now you're going to encode the words with integers. Build a dictionary that maps words to integers. Later we're going to pad our input vectors with zeros, so make sure the integers **start at 1, not 0**.\n", + "> Also, convert the reviews to integers and store the reviews in a new list called `reviews_ints`. " + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "# Create your dictionary that maps vocab words to integers here\n", + "vocab_to_int = \n", + "\n", + "# Convert the reviews to integers, same shape as reviews list, but with integers\n", + "reviews_ints = " + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "source": [ + "### Encoding the labels\n", + "\n", + "Our labels are \"positive\" or \"negative\". 
To use these labels in our network, we need to convert them to 0 and 1.\n", + "\n", + "> **Exercise:** Convert labels from `positive` and `negative` to 1 and 0, respectively." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "# Convert labels to 1s and 0s for 'positive' and 'negative'\n", + "labels = " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If you built `labels` correctly, you should see the next output." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Zero-length reviews: 1\n", + "Maximum review length: 2514\n" + ] + } + ], + "source": [ + "from collections import Counter\n", + "\n", + "review_lens = Counter([len(x) for x in reviews_ints])\n", + "print(\"Zero-length reviews: {}\".format(review_lens[0]))\n", + "print(\"Maximum review length: {}\".format(max(review_lens)))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "Okay, a couple issues here. We seem to have one review with zero length. And, the maximum review length is way too many steps for our RNN. Let's truncate to 200 steps. For reviews shorter than 200, we'll pad with 0s. For reviews longer than 200, we can truncate them to the first 200 words.\n", + "\n", + "> **Exercise:** First, remove the review with zero length from the `reviews_ints` list."
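One way to drop the empty review is sketched below. This is just one possible solution, shown on a small made-up stand-in for `reviews_ints` and `labels`; the key point is to filter by index so the labels stay aligned with the reviews:

```python
# Hypothetical tiny example standing in for the real reviews_ints / labels
reviews_ints = [[1, 2, 3], [], [4, 5]]
labels = [1, 0, 1]

# Keep the indices of non-empty reviews, then filter both lists with them
non_zero_idx = [ii for ii, review in enumerate(reviews_ints) if len(review) > 0]
reviews_ints = [reviews_ints[ii] for ii in non_zero_idx]
labels = [labels[ii] for ii in non_zero_idx]

print(len(reviews_ints))  # 2
```

Filtering the labels with the same index list matters: removing a review without removing its label would silently shift every later label onto the wrong review.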
+ ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "# Filter out that review with 0 length\n", + "reviews_ints = " + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "> **Exercise:** Now, create an array `features` that contains the data we'll pass to the network. The data should come from `reviews_ints`, since we want to feed integers to the network. Each row should be 200 elements long. For reviews shorter than 200 words, left-pad with 0s. That is, if the review is `['best', 'movie', 'ever']`, `[117, 18, 128]` as integers, the row will look like `[0, 0, 0, ..., 0, 117, 18, 128]`. For reviews longer than 200, use only the first 200 words as the feature vector.\n", + "\n", + "This isn't trivial and there are a bunch of ways to do this. But, if you're going to be building your own deep learning networks, you're going to have to get used to preparing your data.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "seq_len = 200\n", + "features = " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If you built `features` correctly, it should look like the cell output below."
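One possible implementation of the pad-or-truncate step is sketched below. It uses a tiny made-up `reviews_ints` and a shortened `seq_len` of 5 so the result is readable; the notebook itself uses `seq_len = 200`:

```python
import numpy as np

seq_len = 5  # the notebook uses 200; shortened here for readability
# Hypothetical reviews: one shorter than seq_len, one longer
reviews_ints = [[117, 18, 128], [1, 2, 3, 4, 5, 6, 7]]

# Start from all zeros, then write each (possibly truncated) review
# into the rightmost positions of its row: left-padding for free
features = np.zeros((len(reviews_ints), seq_len), dtype=int)
for ii, review in enumerate(reviews_ints):
    features[ii, -len(review[:seq_len]):] = review[:seq_len]

print(features)
# [[  0   0 117  18 128]
#  [  1   2   3   4   5]]
```

Slicing with `review[:seq_len]` handles both cases at once: it leaves short reviews untouched and truncates long ones to the first `seq_len` words.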
+ ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 21282, 308, 6,\n", + " 3, 1050, 207, 8, 2143, 32, 1, 171, 57,\n", + " 15, 49, 81, 5832, 44, 382, 110, 140, 15,\n", + " 5236, 60, 154, 9, 1, 5014, 5899, 475, 71,\n", + " 5, 260, 12, 21282, 308, 13, 1981, 6, 74,\n", + " 2396],\n", + " [ 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 63, 4, 3, 125,\n", + " 36, 47, 7487, 1397, 16, 3, 4212, 505, 45,\n", + " 17],\n", + " [23249, 42, 55982, 15, 707, 17288, 3398, 47, 77,\n", + " 35, 1822, 16, 154, 19, 114, 3, 1308, 5,\n", + " 336, 147, 22, 1, 857, 12, 70, 281, 1170,\n", + " 399, 36, 120, 283, 38, 169, 5, 382, 158,\n", + " 42, 2278, 16, 1, 541, 90, 78, 102, 4,\n", + " 1, 3256, 15, 43, 3, 407, 1069, 136, 8165,\n", + " 44, 182, 140, 15, 3054, 1, 321, 22, 4827,\n", + " 28571, 346, 5, 3093, 2094, 1, 18970, 18062, 42,\n", + " 8165, 46, 33, 236, 29, 370, 5, 130, 56,\n", + " 22, 1, 1928, 7, 7, 19, 48, 46, 21,\n", + " 70, 345, 3, 2102, 5, 408, 22, 1, 1928,\n", + " 16],\n", + " [ 4504, 505, 15, 3, 3352, 162, 8369, 1655, 6,\n", + " 4860, 56, 17, 4513, 5629, 140, 11938, 5, 996,\n", + " 4969, 2947, 4464, 566, 1202, 36, 6, 1520, 96,\n", + " 3, 744, 4, 28265, 13, 5, 27, 3464, 9,\n", + " 10794, 4, 8, 111, 3024, 5, 1, 1027, 15,\n", + " 3, 4400, 82, 22, 2058, 6, 4464, 538, 2773,\n", + " 7123, 41932, 41, 463, 1, 8369, 62989, 302, 123,\n", 
+ " 15, 4230, 19, 1671, 923, 1, 1655, 6, 6166,\n", + " 20692, 34, 1, 980, 1751, 22804, 646, 25292, 27,\n", + " 106, 11901, 13, 14278, 15496, 18701, 2459, 466, 21189,\n", + " 36, 3267, 1, 6436, 1020, 45, 17, 2700, 2500,\n", + " 33],\n", + " [ 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 520, 119, 113, 34,\n", + " 16673, 1817, 3744, 117, 885, 22019, 721, 10, 28,\n", + " 124, 108, 2, 115, 137, 9, 1626, 7742, 26,\n", + " 330, 5, 590, 1, 6165, 22, 386, 6, 3,\n", + " 349, 15, 50, 15, 231, 9, 7484, 11646, 1,\n", + " 191, 22, 8994, 6, 82, 881, 101, 111, 3590,\n", + " 4],\n", + " [ 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 11, 20, 3662, 141, 10, 423, 23, 272, 60,\n", + " 4362, 22, 32, 84, 3305, 22, 1, 172, 4,\n", + " 1, 952, 507, 11, 4996, 5387, 5, 574, 4,\n", + " 1154, 54, 53, 5328, 1, 261, 17, 41, 952,\n", + " 125, 59, 1, 712, 137, 379, 627, 15, 111,\n", + " 1511],\n", + " [ 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 11, 6, 692, 1, 90,\n", + " 2158, 20, 11793, 1, 2818, 5218, 249, 92, 3007,\n", + " 8, 126, 24, 201, 3, 803, 634, 4, 23249,\n", + " 1002],\n", + " [ 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 786, 295, 10, 122, 11, 6, 418,\n", + " 5, 29, 35, 482, 20, 19, 1285, 33, 142,\n", + " 28, 2667, 45, 1844, 32, 1, 2790, 37, 78,\n", + " 97, 2439, 67, 3952, 45, 2, 24, 105, 256,\n", + " 
1, 134, 1572, 2, 12612, 452, 14, 319, 11,\n", + " 63, 6, 98, 1322, 5, 105, 1, 3766, 4,\n", + " 3],\n", + " [ 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 0, 0,\n", + " 0, 0, 0, 0, 0, 0, 0, 11, 6,\n", + " 24, 1, 779, 3708, 2818, 20, 8, 14, 74,\n", + " 325, 2739, 73, 90, 4, 27, 99, 2, 165,\n", + " 68],\n", + " [ 54, 10, 14, 116, 60, 798, 552, 71, 364,\n", + " 5, 1, 731, 5, 66, 8089, 8, 14, 30,\n", + " 4, 109, 99, 10, 293, 17, 60, 798, 19,\n", + " 11, 14, 1, 64, 30, 69, 2506, 45, 4,\n", + " 234, 93, 10, 68, 114, 108, 8089, 363, 43,\n", + " 1009, 2, 10, 97, 28, 1431, 45, 1, 357,\n", + " 4, 60, 110, 205, 8, 48, 3, 1929, 11029,\n", + " 2, 2127, 354, 412, 4, 13, 6676, 2, 2975,\n", + " 5174, 2125, 1371, 6, 30, 4, 60, 502, 876,\n", + " 19, 8089, 6, 34, 227, 1, 247, 412, 4,\n", + " 582, 4, 27, 599, 9, 1, 13829, 395, 4,\n", + " 14175]])" + ] + }, + "execution_count": 13, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "features[:10,:100]" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Training, Validation, Test\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "With our data in nice shape, we'll split it into training, validation, and test sets.\n", + "\n", + "> **Exercise:** Create the training, validation, and test sets here. You'll need to create sets for the features and the labels, `train_x` and `train_y` for example. Define a split fraction, `split_frac` as the fraction of data to keep in the training set. Usually this is set to 0.8 or 0.9. The rest of the data will be split in half to create the validation and testing data." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "split_frac = 0.8\n", + "\n", + "train_x, val_x = \n", + "train_y, val_y = \n", + "\n", + "val_x, test_x = \n", + "val_y, test_y = \n", + "\n", + "print(\"\\t\\t\\tFeature Shapes:\")\n", + "print(\"Train set: \\t\\t{}\".format(train_x.shape), \n", + " \"\\nValidation set: \\t{}\".format(val_x.shape),\n", + " \"\\nTest set: \\t\\t{}\".format(test_x.shape))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "With train, validation, and test fractions of 0.8, 0.1, 0.1, the final shapes should look like:\n", + "```\n", + " Feature Shapes:\n", + "Train set: \t\t (20000, 200) \n", + "Validation set: \t(2500, 200) \n", + "Test set: \t\t (2501, 200)\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Build the graph\n", + "\n", + "Here, we'll build the graph. First up, defining the hyperparameters.\n", + "\n", + "* `lstm_size`: Number of units in the hidden layers in the LSTM cells. Usually, larger is better performance-wise. Common values are 128, 256, 512, etc.\n", + "* `lstm_layers`: Number of LSTM layers in the network. I'd start with 1, then add more if I'm underfitting.\n", + "* `batch_size`: The number of reviews to feed the network in one training pass. 
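One way the split exercise above might be filled in (a sketch only — the notebook intentionally leaves the assignments blank, and the zero arrays here are just stand-ins for the real `features` and `labels`; exact shapes depend on how many reviews survive preprocessing):

```python
import numpy as np

# Stand-ins for the notebook's real arrays (25000 reviews, 200 words each)
features = np.zeros((25000, 200), dtype=np.int32)
labels = np.zeros(25000, dtype=np.int32)

split_frac = 0.8
split_idx = int(len(features) * split_frac)

# First 80% for training; the remaining 20% is halved into validation/test
train_x, val_x = features[:split_idx], features[split_idx:]
train_y, val_y = labels[:split_idx], labels[split_idx:]

half = len(val_x) // 2
val_x, test_x = val_x[:half], val_x[half:]
val_y, test_y = val_y[:half], val_y[half:]

print(train_x.shape, val_x.shape, test_x.shape)  # (20000, 200) (2500, 200) (2500, 200)
```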
Typically this should be set as high as you can go without running out of memory.\n", + "* `learning_rate`: Learning rate" + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "lstm_size = 256\n", + "lstm_layers = 1\n", + "batch_size = 500\n", + "learning_rate = 0.001" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "For the network itself, we'll be passing in our 200-element-long review vectors. Each batch will be `batch_size` vectors. We'll also be using dropout on the LSTM layer, so we'll make a placeholder for the keep probability." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "> **Exercise:** Create the `inputs_`, `labels_`, and dropout `keep_prob` placeholders using `tf.placeholder`. `labels_` needs to be two-dimensional to work with some functions later. Since `keep_prob` is a scalar (a 0-dimensional tensor), you shouldn't provide a size to `tf.placeholder`." + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "n_words = len(vocab)\n", + "\n", + "# Create the graph object\n", + "graph = tf.Graph()\n", + "# Add nodes to the graph\n", + "with graph.as_default():\n", + " inputs_ = \n", + " labels_ = \n", + " keep_prob = " + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Embedding\n", + "\n", + "Now we'll add an embedding layer. We need to do this because there are 74000 words in our vocabulary. It is massively inefficient to one-hot encode our classes here. You should remember dealing with this problem from the word2vec lesson. 
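A small aside on why `labels_` needs two dimensions: the training loop later in the notebook feeds `y[:, None]`, and that indexing is exactly what adds the extra axis. A quick numpy illustration with toy labels (nothing from the real dataset):

```python
import numpy as np

y = np.array([1, 0, 1, 1])   # a toy batch of sentiment labels
print(y.shape)               # (4,)  -- 1-D, wrong rank for a [None, 1] placeholder
print(y[:, None].shape)      # (4, 1) -- 2-D, the shape the labels_ placeholder expects
```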
Instead of one-hot encoding, we can have an embedding layer and use that layer as a lookup table. You could train an embedding layer using word2vec, then load it here. But it's fine to just make a new layer and let the network learn the weights.\n", + "\n", + "> **Exercise:** Create the embedding lookup matrix as a `tf.Variable`. Use that embedding matrix to get the embedded vectors to pass to the LSTM cell with [`tf.nn.embedding_lookup`](https://www.tensorflow.org/api_docs/python/tf/nn/embedding_lookup). This function takes the embedding matrix and an input tensor, such as the review vectors. Then, it'll return another tensor with the embedded vectors. So, if the inputs are 200 words long and the embedding layer has 300 units, the function will return a tensor with size [batch_size, 200, 300].\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "# Size of the embedding vectors (number of units in the embedding layer)\n", + "embed_size = 300 \n", + "\n", + "with graph.as_default():\n", + " embedding = \n", + " embed = " + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### LSTM cell\n", + "\n", + "\n", + "\n", + "Next, we'll create our LSTM cells to use in the recurrent network ([TensorFlow documentation](https://www.tensorflow.org/api_docs/python/tf/contrib/rnn)). Here we are just defining what the cells look like. This isn't actually building the graph, just defining the type of cells we want in our graph.\n", + "\n", + "To create a basic LSTM cell for the graph, you'll want to use `tf.contrib.rnn.BasicLSTMCell`. 
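Conceptually, `tf.nn.embedding_lookup` is plain row indexing into the embedding matrix, which you can see without TensorFlow (toy sizes here; the notebook uses a ~74000-word vocabulary and 300 embedding units):

```python
import numpy as np

vocab_size, embed_size = 10, 4
embedding = np.random.uniform(-1, 1, (vocab_size, embed_size))

# Two toy "reviews", each three word-indices long
inputs = np.array([[1, 5, 2],
                   [0, 3, 3]])

embed = embedding[inputs]  # same idea as tf.nn.embedding_lookup(embedding, inputs)
print(embed.shape)         # (2, 3, 4): [batch_size, seq_len, embed_size]
```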
Looking at the function documentation:\n", + "\n", + "```\n", + "tf.contrib.rnn.BasicLSTMCell(num_units, forget_bias=1.0, input_size=None, state_is_tuple=True, activation=)\n", + "```\n", + "\n", + "you can see it takes a parameter called `num_units`, the number of units in the cell, called `lstm_size` in this code. So then, you can write something like \n", + "\n", + "```\n", + "lstm = tf.contrib.rnn.BasicLSTMCell(num_units)\n", + "```\n", + "\n", + "to create an LSTM cell with `num_units`. Next, you can add dropout to the cell with `tf.contrib.rnn.DropoutWrapper`. This just wraps the cell in another cell, but with dropout added to the inputs and/or outputs. It's a really convenient way to make your network better with almost no effort! So you'd do something like\n", + "\n", + "```\n", + "drop = tf.contrib.rnn.DropoutWrapper(cell, output_keep_prob=keep_prob)\n", + "```\n", + "\n", + "Most of the time, your network will have better performance with more layers. That's sort of the magic of deep learning: adding more layers allows the network to learn really complex relationships. Again, there is a simple way to create multiple layers of LSTM cells with `tf.contrib.rnn.MultiRNNCell`:\n", + "\n", + "```\n", + "cell = tf.contrib.rnn.MultiRNNCell([drop] * lstm_layers)\n", + "```\n", + "\n", + "Here, `[drop] * lstm_layers` creates a list of cells (`drop`) that is `lstm_layers` long. The `MultiRNNCell` wrapper builds this into multiple layers of RNN cells, one for each cell in the list.\n", + "\n", + "So the final cell you're using in the network is actually multiple (or just one) LSTM cells with dropout. But it all works the same from an architectural viewpoint, just a more complicated graph in the cell.\n", + "\n", + "> **Exercise:** Below, use `tf.contrib.rnn.BasicLSTMCell` to create an LSTM cell. Then, add dropout to it with `tf.contrib.rnn.DropoutWrapper`. 
Finally, create multiple LSTM layers with `tf.contrib.rnn.MultiRNNCell`.\n", + "\n", + "Here is [a tutorial on building RNNs](https://www.tensorflow.org/tutorials/recurrent) that will help you out.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 34, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "with graph.as_default():\n", + " # Your basic LSTM cell\n", + " lstm = \n", + " \n", + " # Add dropout to the cell\n", + " drop = \n", + " \n", + " # Stack up multiple LSTM layers, for deep learning\n", + " cell = \n", + " \n", + " # Getting an initial state of all zeros\n", + " initial_state = cell.zero_state(batch_size, tf.float32)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### RNN forward pass\n", + "\n", + "\n", + "\n", + "Now we need to actually run the data through the RNN nodes. You can use [`tf.nn.dynamic_rnn`](https://www.tensorflow.org/api_docs/python/tf/nn/dynamic_rnn) to do this. You'd pass in the RNN cell you created (our multiple layered LSTM `cell` for instance), and the inputs to the network.\n", + "\n", + "```\n", + "outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, initial_state=initial_state)\n", + "```\n", + "\n", + "Above I created an initial state, `initial_state`, to pass to the RNN. This is the cell state that is passed between the hidden layers in successive time steps. `tf.nn.dynamic_rnn` takes care of most of the work for us. We pass in our cell and the input to the cell, then it does the unrolling and everything else for us. It returns outputs for each time step and the final_state of the hidden layer.\n", + "\n", + "> **Exercise:** Use `tf.nn.dynamic_rnn` to add the forward pass through the RNN. 
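One detail of the `[drop] * lstm_layers` idiom worth knowing: Python list multiplication repeats the same object reference rather than making copies, which is easy to check without TensorFlow (the `Cell` class below is just a stand-in for an RNN cell). Early TensorFlow 1.x versions of `MultiRNNCell` accepted such a list; later releases expect a freshly constructed cell per layer, e.g. via a list comprehension.

```python
class Cell:
    """Stand-in for an RNN cell object; for illustration only."""
    pass

drop = Cell()
lstm_layers = 3

cells = [drop] * lstm_layers                   # three references to the *same* object
fresh = [Cell() for _ in range(lstm_layers)]   # three distinct objects

print(all(c is drop for c in cells))   # True
print(fresh[0] is fresh[1])            # False
```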
Remember that we're actually passing in vectors from the embedding layer, `embed`.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "with graph.as_default():\n", + " outputs, final_state = " + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Output\n", + "\n", + "We only care about the final output; we'll be using that as our sentiment prediction. So we need to grab the last output with `outputs[:, -1]`, then calculate the cost from that and `labels_`." + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "with graph.as_default():\n", + " predictions = tf.contrib.layers.fully_connected(outputs[:, -1], 1, activation_fn=tf.sigmoid)\n", + " cost = tf.losses.mean_squared_error(labels_, predictions)\n", + " \n", + " optimizer = tf.train.AdamOptimizer(learning_rate).minimize(cost)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Validation accuracy\n", + "\n", + "Here we can add a few nodes to calculate the accuracy, which we'll use in the validation pass." + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "with graph.as_default():\n", + " correct_pred = tf.equal(tf.cast(tf.round(predictions), tf.int32), labels_)\n", + " accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Batching\n", + "\n", + "This is a simple function for returning batches from our data. First it removes data such that we only have full batches. 
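The accuracy nodes above amount to "round the sigmoid outputs, compare with the labels, average the matches." Here is the same arithmetic in plain numpy, with made-up predictions (purely illustrative):

```python
import numpy as np

predictions = np.array([[0.91], [0.23], [0.67], [0.08]])  # toy sigmoid outputs
labels = np.array([[1], [0], [0], [0]])                   # toy true labels

correct_pred = np.round(predictions).astype(np.int32) == labels
accuracy = np.mean(correct_pred.astype(np.float32))
print(accuracy)  # 0.75 -- three of the four rounded predictions match
```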
Then it iterates through the `x` and `y` arrays and returns slices out of those arrays with size `[batch_size]`." + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_batches(x, y, batch_size=100):\n", + " \n", + " n_batches = len(x)//batch_size\n", + " x, y = x[:n_batches*batch_size], y[:n_batches*batch_size]\n", + " for ii in range(0, len(x), batch_size):\n", + " yield x[ii:ii+batch_size], y[ii:ii+batch_size]" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Training\n", + "\n", + "Below is the typical training code. If you want to do this yourself, feel free to delete all this code and implement it yourself. Before you run this, make sure the `checkpoints` directory exists." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "epochs = 10\n", + "\n", + "with graph.as_default():\n", + " saver = tf.train.Saver()\n", + "\n", + "with tf.Session(graph=graph) as sess:\n", + " sess.run(tf.global_variables_initializer())\n", + " iteration = 1\n", + " for e in range(epochs):\n", + " state = sess.run(initial_state)\n", + " \n", + " for ii, (x, y) in enumerate(get_batches(train_x, train_y, batch_size), 1):\n", + " feed = {inputs_: x,\n", + " labels_: y[:, None],\n", + " keep_prob: 0.5,\n", + " initial_state: state}\n", + " loss, state, _ = sess.run([cost, final_state, optimizer], feed_dict=feed)\n", + " \n", + " if iteration%5==0:\n", + " print(\"Epoch: {}/{}\".format(e, epochs),\n", + " \"Iteration: {}\".format(iteration),\n", + " \"Train loss: {:.3f}\".format(loss))\n", + "\n", + " if iteration%25==0:\n", + " val_acc = []\n", + " val_state = sess.run(cell.zero_state(batch_size, tf.float32))\n", + " for x, y in get_batches(val_x, val_y, 
batch_size):\n", + " feed = {inputs_: x,\n", + " labels_: y[:, None],\n", + " keep_prob: 1,\n", + " initial_state: val_state}\n", + " batch_acc, val_state = sess.run([accuracy, final_state], feed_dict=feed)\n", + " val_acc.append(batch_acc)\n", + " print(\"Val acc: {:.3f}\".format(np.mean(val_acc)))\n", + " iteration += 1\n", + " saver.save(sess, \"checkpoints/sentiment.ckpt\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Testing" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "test_acc = []\n", + "with tf.Session(graph=graph) as sess:\n", + " saver.restore(sess, tf.train.latest_checkpoint('checkpoints'))\n", + " test_state = sess.run(cell.zero_state(batch_size, tf.float32))\n", + " for ii, (x, y) in enumerate(get_batches(test_x, test_y, batch_size), 1):\n", + " feed = {inputs_: x,\n", + " labels_: y[:, None],\n", + " keep_prob: 1,\n", + " initial_state: test_state}\n", + " batch_acc, test_state = sess.run([accuracy, final_state], feed_dict=feed)\n", + " test_acc.append(batch_acc)\n", + " print(\"Test accuracy: {:.3f}\".format(np.mean(test_acc)))" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.0" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/sentiment-rnn/assets/network_diagram.png b/sentiment-rnn/assets/network_diagram.png new file mode 100644 index 0000000..2d645bf Binary files /dev/null and b/sentiment-rnn/assets/network_diagram.png differ diff --git a/sentiment_network/Sentiment Classification - How to Best 
Frame a Problem for a Neural Network (Intro).ipynb b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Intro).ipynb new file mode 100644 index 0000000..71a5ae0 --- /dev/null +++ b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Intro).ipynb @@ -0,0 +1,88 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sentiment Classification & How To \"Frame Problems\" for a Neural Network\n", + "\n", + "by Andrew Trask\n", + "\n", + "- **Twitter**: @iamtrask\n", + "- **Blog**: http://iamtrask.github.io" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### What You Should Already Know\n", + "\n", + "- neural networks, forward and back-propagation\n", + "- stochastic gradient descent\n", + "- mean squared error\n", + "- and train/test splits\n", + "\n", + "### Where to Get Help if You Need it\n", + "- Re-watch previous Udacity Lectures\n", + "- Leverage the recommended Course Reading Material - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) (40% Off: **traskud17**)\n", + "- Shoot me a tweet @iamtrask\n", + "\n", + "\n", + "### Tutorial Outline:\n", + "\n", + "- Intro: The Importance of \"Framing a Problem\"\n", + "\n", + "\n", + "- Curate a Dataset\n", + "- Developing a \"Predictive Theory\"\n", + "- **PROJECT 1**: Quick Theory Validation\n", + "\n", + "\n", + "- Transforming Text to Numbers\n", + "- **PROJECT 2**: Creating the Input/Output Data\n", + "\n", + "\n", + "- Putting it all together in a Neural Network\n", + "- **PROJECT 3**: Building our Neural Network\n", + "\n", + "\n", + "- Understanding Neural Noise\n", + "- **PROJECT 4**: Making Learning Faster by Reducing Noise\n", + "\n", + "\n", + "- Analyzing Inefficiencies in our Network\n", + "- **PROJECT 5**: Making our Network Train and Run Faster\n", + "\n", + "\n", + "- Further Noise Reduction\n", + "- **PROJECT 6**: Reducing Noise by 
Strategically Reducing the Vocabulary\n", + "\n", + "\n", + "- Analysis: What's going on in the weights?" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 1).ipynb b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 1).ipynb new file mode 100644 index 0000000..7d4ff54 --- /dev/null +++ b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 1).ipynb @@ -0,0 +1,236 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sentiment Classification & How To \"Frame Problems\" for a Neural Network\n", + "\n", + "by Andrew Trask\n", + "\n", + "- **Twitter**: @iamtrask\n", + "- **Blog**: http://iamtrask.github.io" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### What You Should Already Know\n", + "\n", + "- neural networks, forward and back-propagation\n", + "- stochastic gradient descent\n", + "- mean squared error\n", + "- and train/test splits\n", + "\n", + "### Where to Get Help if You Need it\n", + "- Re-watch previous Udacity Lectures\n", + "- Leverage the recommended Course Reading Material - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) (40% Off: **traskud17**)\n", + "- Shoot me a tweet @iamtrask\n", + "\n", + "\n", + "### Tutorial Outline:\n", + "\n", + "- Intro: The Importance of \"Framing a Problem\"\n", + "\n", + "\n", + "- Curate a Dataset\n", + "- 
Developing a \"Predictive Theory\"\n", + "- **PROJECT 1**: Quick Theory Validation\n", + "\n", + "\n", + "- Transforming Text to Numbers\n", + "- **PROJECT 2**: Creating the Input/Output Data\n", + "\n", + "\n", + "- Putting it all together in a Neural Network\n", + "- **PROJECT 3**: Building our Neural Network\n", + "\n", + "\n", + "- Understanding Neural Noise\n", + "- **PROJECT 4**: Making Learning Faster by Reducing Noise\n", + "\n", + "\n", + "- Analyzing Inefficiencies in our Network\n", + "- **PROJECT 5**: Making our Network Train and Run Faster\n", + "\n", + "\n", + "- Further Noise Reduction\n", + "- **PROJECT 6**: Reducing Noise by Strategically Reducing the Vocabulary\n", + "\n", + "\n", + "- Analysis: What's going on in the weights?" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbpresent": { + "id": "56bb3cba-260c-4ebe-9ed6-b995b4c72aa3" + } + }, + "source": [ + "# Lesson: Curate a Dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "eba2b193-0419-431e-8db9-60f34dd3fe83" + } + }, + "outputs": [], + "source": [ + "def pretty_print_review_and_label(i):\n", + " print(labels[i] + \"\\t:\\t\" + reviews[i][:80] + \"...\")\n", + "\n", + "g = open('reviews.txt','r') # What we know!\n", + "reviews = list(map(lambda x:x[:-1],g.readlines()))\n", + "g.close()\n", + "\n", + "g = open('labels.txt','r') # What we WANT to know!\n", + "labels = list(map(lambda x:x[:-1].upper(),g.readlines()))\n", + "g.close()" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "25000" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "len(reviews)" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "bb95574b-21a0-4213-ae50-34363cf4f87f" + } + }, + "outputs": [ + { 
+ "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy . it ran at the same time as some other programs about school life such as teachers . my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers . the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students . when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled . . . . . . . . . at . . . . . . . . . . high . a classic line inspector i m here to sack one of your teachers . student welcome to bromwell high . i expect that many adults of my age think that bromwell high is far fetched . what a pity that it isn t '" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "reviews[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e0408810-c424-4ed4-afb9-1735e9ddbd0a" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Lesson: Develop a Predictive Theory" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e67a709f-234f-4493-bae6-4fb192141ee0" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "labels.txt \t : \t reviews.txt\n", + "\n", + "NEGATIVE\t:\tthis movie is terrible but it has some good effects . ...\n", + "POSITIVE\t:\tadrian pasdar is excellent is this film . he makes a fascinating woman . ...\n", + "NEGATIVE\t:\tcomment this movie is impossible . 
is terrible very improbable bad interpretat...\n", + "POSITIVE\t:\texcellent episode movie ala pulp fiction . days suicides . it doesnt get more...\n", + "NEGATIVE\t:\tif you haven t seen this it s terrible . it is pure trash . i saw this about ...\n", + "POSITIVE\t:\tthis schiffer guy is a real genius the movie is of excellent quality and both e...\n" + ] + } + ], + "source": [ + "print(\"labels.txt \\t : \\t reviews.txt\\n\")\n", + "pretty_print_review_and_label(2137)\n", + "pretty_print_review_and_label(12816)\n", + "pretty_print_review_and_label(6267)\n", + "pretty_print_review_and_label(21934)\n", + "pretty_print_review_and_label(5297)\n", + "pretty_print_review_and_label(4998)" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 2).ipynb b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 2).ipynb new file mode 100644 index 0000000..117f5a4 --- /dev/null +++ b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 2).ipynb @@ -0,0 +1,2466 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sentiment Classification & How To \"Frame Problems\" for a Neural Network\n", + "\n", + "by Andrew Trask\n", + "\n", + "- **Twitter**: @iamtrask\n", + "- **Blog**: http://iamtrask.github.io" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### What You Should Already Know\n", + "\n", + "- neural networks, 
forward and back-propagation\n", + "- stochastic gradient descent\n", + "- mean squared error\n", + "- and train/test splits\n", + "\n", + "### Where to Get Help if You Need it\n", + "- Re-watch previous Udacity Lectures\n", + "- Leverage the recommended Course Reading Material - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) (40% Off: **traskud17**)\n", + "- Shoot me a tweet @iamtrask\n", + "\n", + "\n", + "### Tutorial Outline:\n", + "\n", + "- Intro: The Importance of \"Framing a Problem\"\n", + "\n", + "\n", + "- Curate a Dataset\n", + "- Developing a \"Predictive Theory\"\n", + "- **PROJECT 1**: Quick Theory Validation\n", + "\n", + "\n", + "- Transforming Text to Numbers\n", + "- **PROJECT 2**: Creating the Input/Output Data\n", + "\n", + "\n", + "- Putting it all together in a Neural Network\n", + "- **PROJECT 3**: Building our Neural Network\n", + "\n", + "\n", + "- Understanding Neural Noise\n", + "- **PROJECT 4**: Making Learning Faster by Reducing Noise\n", + "\n", + "\n", + "- Analyzing Inefficiencies in our Network\n", + "- **PROJECT 5**: Making our Network Train and Run Faster\n", + "\n", + "\n", + "- Further Noise Reduction\n", + "- **PROJECT 6**: Reducing Noise by Strategically Reducing the Vocabulary\n", + "\n", + "\n", + "- Analysis: What's going on in the weights?" 
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbpresent": { + "id": "56bb3cba-260c-4ebe-9ed6-b995b4c72aa3" + } + }, + "source": [ + "# Lesson: Curate a Dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "eba2b193-0419-431e-8db9-60f34dd3fe83" + } + }, + "outputs": [], + "source": [ + "def pretty_print_review_and_label(i):\n", + " print(labels[i] + \"\\t:\\t\" + reviews[i][:80] + \"...\")\n", + "\n", + "g = open('reviews.txt','r') # What we know!\n", + "reviews = list(map(lambda x:x[:-1],g.readlines()))\n", + "g.close()\n", + "\n", + "g = open('labels.txt','r') # What we WANT to know!\n", + "labels = list(map(lambda x:x[:-1].upper(),g.readlines()))\n", + "g.close()" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "25000" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "len(reviews)" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "bb95574b-21a0-4213-ae50-34363cf4f87f" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy . it ran at the same time as some other programs about school life such as teachers . my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers . the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students . when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled . . . . . . . . . at . . . . . . . . . . high . a classic line inspector i m here to sack one of your teachers . student welcome to bromwell high . 
i expect that many adults of my age think that bromwell high is far fetched . what a pity that it isn t '" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "reviews[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e0408810-c424-4ed4-afb9-1735e9ddbd0a" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Lesson: Develop a Predictive Theory" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e67a709f-234f-4493-bae6-4fb192141ee0" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "labels.txt \t : \t reviews.txt\n", + "\n", + "NEGATIVE\t:\tthis movie is terrible but it has some good effects . ...\n", + "POSITIVE\t:\tadrian pasdar is excellent is this film . he makes a fascinating woman . ...\n", + "NEGATIVE\t:\tcomment this movie is impossible . is terrible very improbable bad interpretat...\n", + "POSITIVE\t:\texcellent episode movie ala pulp fiction . days suicides . it doesnt get more...\n", + "NEGATIVE\t:\tif you haven t seen this it s terrible . it is pure trash . 
i saw this about ...\n", + "POSITIVE\t:\tthis schiffer guy is a real genius the movie is of excellent quality and both e...\n" + ] + } + ], + "source": [ + "print(\"labels.txt \\t : \\t reviews.txt\\n\")\n", + "pretty_print_review_and_label(2137)\n", + "pretty_print_review_and_label(12816)\n", + "pretty_print_review_and_label(6267)\n", + "pretty_print_review_and_label(21934)\n", + "pretty_print_review_and_label(5297)\n", + "pretty_print_review_and_label(4998)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 1: Quick Theory Validation" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from collections import Counter\n", + "import numpy as np" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "positive_counts = Counter()\n", + "negative_counts = Counter()\n", + "total_counts = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for i in range(len(reviews)):\n", + " if(labels[i] == 'POSITIVE'):\n", + " for word in reviews[i].split(\" \"):\n", + " positive_counts[word] += 1\n", + " total_counts[word] += 1\n", + " else:\n", + " for word in reviews[i].split(\" \"):\n", + " negative_counts[word] += 1\n", + " total_counts[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('', 550468),\n", + " ('the', 173324),\n", + " ('.', 159654),\n", + " ('and', 89722),\n", + " ('a', 83688),\n", + " ('of', 76855),\n", + " ('to', 66746),\n", + " ('is', 57245),\n", + " ('in', 50215),\n", + " ('br', 49235),\n", + " ('it', 48025),\n", + " ('i', 40743),\n", + " ('that', 35630),\n", + " ('this', 35080),\n", + " ('s', 33815),\n", + " ('as', 26308),\n", + " ('with', 23247),\n", + " 
('for', 22416),\n", + " ('was', 21917),\n", + " ('film', 20937),\n", + " ('but', 20822),\n", + " ('movie', 19074),\n", + " ('his', 17227),\n", + " ('on', 17008),\n", + " ('you', 16681),\n", + " ('he', 16282),\n", + " ('are', 14807),\n", + " ('not', 14272),\n", + " ('t', 13720),\n", + " ('one', 13655),\n", + " ('have', 12587),\n", + " ('be', 12416),\n", + " ('by', 11997),\n", + " ('all', 11942),\n", + " ('who', 11464),\n", + " ('an', 11294),\n", + " ('at', 11234),\n", + " ('from', 10767),\n", + " ('her', 10474),\n", + " ('they', 9895),\n", + " ('has', 9186),\n", + " ('so', 9154),\n", + " ('like', 9038),\n", + " ('about', 8313),\n", + " ('very', 8305),\n", + " ('out', 8134),\n", + " ('there', 8057),\n", + " ('she', 7779),\n", + " ('what', 7737),\n", + " ('or', 7732),\n", + " ('good', 7720),\n", + " ('more', 7521),\n", + " ('when', 7456),\n", + " ('some', 7441),\n", + " ('if', 7285),\n", + " ('just', 7152),\n", + " ('can', 7001),\n", + " ('story', 6780),\n", + " ('time', 6515),\n", + " ('my', 6488),\n", + " ('great', 6419),\n", + " ('well', 6405),\n", + " ('up', 6321),\n", + " ('which', 6267),\n", + " ('their', 6107),\n", + " ('see', 6026),\n", + " ('also', 5550),\n", + " ('we', 5531),\n", + " ('really', 5476),\n", + " ('would', 5400),\n", + " ('will', 5218),\n", + " ('me', 5167),\n", + " ('had', 5148),\n", + " ('only', 5137),\n", + " ('him', 5018),\n", + " ('even', 4964),\n", + " ('most', 4864),\n", + " ('other', 4858),\n", + " ('were', 4782),\n", + " ('first', 4755),\n", + " ('than', 4736),\n", + " ('much', 4685),\n", + " ('its', 4622),\n", + " ('no', 4574),\n", + " ('into', 4544),\n", + " ('people', 4479),\n", + " ('best', 4319),\n", + " ('love', 4301),\n", + " ('get', 4272),\n", + " ('how', 4213),\n", + " ('life', 4199),\n", + " ('been', 4189),\n", + " ('because', 4079),\n", + " ('way', 4036),\n", + " ('do', 3941),\n", + " ('made', 3823),\n", + " ('films', 3813),\n", + " ('them', 3805),\n", + " ('after', 3800),\n", + " ('many', 3766),\n", + " ('two', 3733),\n", + 
" ('too', 3659),\n", + " ('think', 3655),\n", + " ('movies', 3586),\n", + " ('characters', 3560),\n", + " ('character', 3514),\n", + " ('don', 3468),\n", + " ('man', 3460),\n", + " ('show', 3432),\n", + " ('watch', 3424),\n", + " ('seen', 3414),\n", + " ('then', 3358),\n", + " ('little', 3341),\n", + " ('still', 3340),\n", + " ('make', 3303),\n", + " ('could', 3237),\n", + " ('never', 3226),\n", + " ('being', 3217),\n", + " ('where', 3173),\n", + " ('does', 3069),\n", + " ('over', 3017),\n", + " ('any', 3002),\n", + " ('while', 2899),\n", + " ('know', 2833),\n", + " ('did', 2790),\n", + " ('years', 2758),\n", + " ('here', 2740),\n", + " ('ever', 2734),\n", + " ('end', 2696),\n", + " ('these', 2694),\n", + " ('such', 2590),\n", + " ('real', 2568),\n", + " ('scene', 2567),\n", + " ('back', 2547),\n", + " ('those', 2485),\n", + " ('though', 2475),\n", + " ('off', 2463),\n", + " ('new', 2458),\n", + " ('your', 2453),\n", + " ('go', 2440),\n", + " ('acting', 2437),\n", + " ('plot', 2432),\n", + " ('world', 2429),\n", + " ('scenes', 2427),\n", + " ('say', 2414),\n", + " ('through', 2409),\n", + " ('makes', 2390),\n", + " ('better', 2381),\n", + " ('now', 2368),\n", + " ('work', 2346),\n", + " ('young', 2343),\n", + " ('old', 2311),\n", + " ('ve', 2307),\n", + " ('find', 2272),\n", + " ('both', 2248),\n", + " ('before', 2177),\n", + " ('us', 2162),\n", + " ('again', 2158),\n", + " ('series', 2153),\n", + " ('quite', 2143),\n", + " ('something', 2135),\n", + " ('cast', 2133),\n", + " ('should', 2121),\n", + " ('part', 2098),\n", + " ('always', 2088),\n", + " ('lot', 2087),\n", + " ('another', 2075),\n", + " ('actors', 2047),\n", + " ('director', 2040),\n", + " ('family', 2032),\n", + " ('between', 2016),\n", + " ('own', 2016),\n", + " ('m', 1998),\n", + " ('may', 1997),\n", + " ('same', 1972),\n", + " ('role', 1967),\n", + " ('watching', 1966),\n", + " ('every', 1954),\n", + " ('funny', 1953),\n", + " ('doesn', 1935),\n", + " ('performance', 1928),\n", + " ('few', 
1918),\n", + " ('bad', 1907),\n", + " ('look', 1900),\n", + " ('re', 1884),\n", + " ('why', 1855),\n", + " ('things', 1849),\n", + " ('times', 1832),\n", + " ('big', 1815),\n", + " ('however', 1795),\n", + " ('actually', 1790),\n", + " ('action', 1789),\n", + " ('going', 1783),\n", + " ('bit', 1757),\n", + " ('comedy', 1742),\n", + " ('down', 1740),\n", + " ('music', 1738),\n", + " ('must', 1728),\n", + " ('take', 1709),\n", + " ('saw', 1692),\n", + " ('long', 1690),\n", + " ('right', 1688),\n", + " ('fun', 1686),\n", + " ('fact', 1684),\n", + " ('excellent', 1683),\n", + " ('around', 1674),\n", + " ('didn', 1672),\n", + " ('without', 1671),\n", + " ('thing', 1662),\n", + " ('thought', 1639),\n", + " ('got', 1635),\n", + " ('each', 1630),\n", + " ('day', 1614),\n", + " ('feel', 1597),\n", + " ('seems', 1596),\n", + " ('come', 1594),\n", + " ('done', 1586),\n", + " ('beautiful', 1580),\n", + " ('especially', 1572),\n", + " ('played', 1571),\n", + " ('almost', 1566),\n", + " ('want', 1562),\n", + " ('yet', 1556),\n", + " ('give', 1553),\n", + " ('pretty', 1549),\n", + " ('last', 1543),\n", + " ('since', 1519),\n", + " ('different', 1504),\n", + " ('although', 1501),\n", + " ('gets', 1490),\n", + " ('true', 1487),\n", + " ('interesting', 1481),\n", + " ('job', 1470),\n", + " ('enough', 1455),\n", + " ('our', 1454),\n", + " ('shows', 1447),\n", + " ('horror', 1441),\n", + " ('woman', 1439),\n", + " ('tv', 1400),\n", + " ('probably', 1398),\n", + " ('father', 1395),\n", + " ('original', 1393),\n", + " ('girl', 1390),\n", + " ('point', 1379),\n", + " ('plays', 1378),\n", + " ('wonderful', 1372),\n", + " ('far', 1358),\n", + " ('course', 1358),\n", + " ('john', 1350),\n", + " ('rather', 1340),\n", + " ('isn', 1328),\n", + " ('ll', 1326),\n", + " ('later', 1324),\n", + " ('dvd', 1324),\n", + " ('war', 1310),\n", + " ('whole', 1310),\n", + " ('d', 1307),\n", + " ('away', 1306),\n", + " ('found', 1306),\n", + " ('screen', 1305),\n", + " ('nothing', 1300),\n", + " ('year', 
1297),\n", + " ('once', 1296),\n", + " ('hard', 1294),\n", + " ('together', 1280),\n", + " ('am', 1277),\n", + " ('set', 1277),\n", + " ('having', 1266),\n", + " ('making', 1265),\n", + " ('place', 1263),\n", + " ('comes', 1260),\n", + " ('might', 1260),\n", + " ('sure', 1253),\n", + " ('american', 1248),\n", + " ('play', 1245),\n", + " ('kind', 1244),\n", + " ('takes', 1242),\n", + " ('perfect', 1242),\n", + " ('performances', 1237),\n", + " ('himself', 1230),\n", + " ('worth', 1221),\n", + " ('everyone', 1221),\n", + " ('anyone', 1214),\n", + " ('actor', 1203),\n", + " ('three', 1201),\n", + " ('wife', 1196),\n", + " ('classic', 1192),\n", + " ('goes', 1186),\n", + " ('ending', 1178),\n", + " ('version', 1168),\n", + " ('star', 1149),\n", + " ('enjoy', 1146),\n", + " ('book', 1142),\n", + " ('nice', 1132),\n", + " ('everything', 1128),\n", + " ('during', 1124),\n", + " ('put', 1118),\n", + " ('seeing', 1111),\n", + " ('least', 1102),\n", + " ('house', 1100),\n", + " ('high', 1095),\n", + " ('watched', 1094),\n", + " ('men', 1087),\n", + " ('loved', 1087),\n", + " ('night', 1082),\n", + " ('anything', 1075),\n", + " ('guy', 1071),\n", + " ('believe', 1071),\n", + " ('top', 1063),\n", + " ('amazing', 1058),\n", + " ('hollywood', 1056),\n", + " ('looking', 1053),\n", + " ('main', 1044),\n", + " ('definitely', 1043),\n", + " ('gives', 1031),\n", + " ('home', 1029),\n", + " ('seem', 1028),\n", + " ('episode', 1023),\n", + " ('sense', 1020),\n", + " ('audience', 1020),\n", + " ('truly', 1017),\n", + " ('special', 1011),\n", + " ('fan', 1009),\n", + " ('second', 1009),\n", + " ('short', 1009),\n", + " ('mind', 1005),\n", + " ('human', 1001),\n", + " ('recommend', 999),\n", + " ('full', 996),\n", + " ('black', 995),\n", + " ('help', 991),\n", + " ('along', 989),\n", + " ('trying', 987),\n", + " ('small', 986),\n", + " ('death', 985),\n", + " ('friends', 981),\n", + " ('remember', 974),\n", + " ('often', 970),\n", + " ('said', 966),\n", + " ('favorite', 962),\n", + " 
('heart', 959),\n", + " ('early', 957),\n", + " ('left', 956),\n", + " ('until', 955),\n", + " ('let', 954),\n", + " ('script', 954),\n", + " ('maybe', 937),\n", + " ('today', 936),\n", + " ('live', 934),\n", + " ('less', 934),\n", + " ('moments', 933),\n", + " ('others', 929),\n", + " ('brilliant', 926),\n", + " ('shot', 925),\n", + " ('liked', 923),\n", + " ('become', 916),\n", + " ('won', 915),\n", + " ('used', 910),\n", + " ('style', 907),\n", + " ('mother', 895),\n", + " ('lives', 894),\n", + " ('came', 893),\n", + " ('stars', 890),\n", + " ('cinema', 889),\n", + " ('looks', 885),\n", + " ('perhaps', 884),\n", + " ('read', 882),\n", + " ('enjoyed', 879),\n", + " ('boy', 875),\n", + " ('drama', 873),\n", + " ('highly', 871),\n", + " ('given', 870),\n", + " ('playing', 867),\n", + " ('use', 864),\n", + " ('next', 859),\n", + " ('women', 858),\n", + " ('fine', 857),\n", + " ('effects', 856),\n", + " ('kids', 854),\n", + " ('entertaining', 853),\n", + " ('need', 852),\n", + " ('line', 850),\n", + " ('works', 848),\n", + " ('someone', 847),\n", + " ('mr', 836),\n", + " ('simply', 835),\n", + " ('children', 833),\n", + " ('picture', 833),\n", + " ('face', 831),\n", + " ('friend', 831),\n", + " ('keep', 831),\n", + " ('dark', 830),\n", + " ('overall', 828),\n", + " ('certainly', 828),\n", + " ('minutes', 827),\n", + " ('wasn', 824),\n", + " ('history', 822),\n", + " ('finally', 820),\n", + " ('couple', 816),\n", + " ('against', 815),\n", + " ('son', 809),\n", + " ('understand', 808),\n", + " ('lost', 807),\n", + " ('michael', 805),\n", + " ('else', 801),\n", + " ('throughout', 798),\n", + " ('fans', 797),\n", + " ('city', 792),\n", + " ('reason', 789),\n", + " ('written', 787),\n", + " ('production', 787),\n", + " ('several', 784),\n", + " ('school', 783),\n", + " ('rest', 781),\n", + " ('based', 781),\n", + " ('try', 780),\n", + " ('dead', 776),\n", + " ('hope', 775),\n", + " ('strong', 768),\n", + " ('white', 765),\n", + " ('tell', 759),\n", + " ('itself', 
758),\n", + " ('half', 753),\n", + " ('person', 749),\n", + " ('sometimes', 746),\n", + " ('past', 744),\n", + " ('start', 744),\n", + " ('genre', 743),\n", + " ('final', 739),\n", + " ('beginning', 739),\n", + " ('town', 738),\n", + " ('art', 734),\n", + " ('game', 732),\n", + " ('humor', 732),\n", + " ('yes', 731),\n", + " ('idea', 731),\n", + " ('late', 730),\n", + " ('becomes', 729),\n", + " ('despite', 729),\n", + " ('able', 726),\n", + " ('case', 726),\n", + " ('money', 723),\n", + " ('child', 721),\n", + " ('completely', 721),\n", + " ('side', 719),\n", + " ('camera', 716),\n", + " ('getting', 714),\n", + " ('instead', 712),\n", + " ('soon', 702),\n", + " ('under', 700),\n", + " ('viewer', 699),\n", + " ('age', 697),\n", + " ('days', 696),\n", + " ('stories', 696),\n", + " ('felt', 694),\n", + " ('simple', 694),\n", + " ('roles', 693),\n", + " ('video', 688),\n", + " ('name', 683),\n", + " ('either', 683),\n", + " ('doing', 677),\n", + " ('turns', 674),\n", + " ('wants', 671),\n", + " ('close', 671),\n", + " ('title', 669),\n", + " ('wrong', 668),\n", + " ('went', 666),\n", + " ('james', 665),\n", + " ('evil', 659),\n", + " ('budget', 657),\n", + " ('episodes', 657),\n", + " ('relationship', 655),\n", + " ('piece', 653),\n", + " ('fantastic', 653),\n", + " ('david', 651),\n", + " ('turn', 648),\n", + " ('murder', 646),\n", + " ('parts', 645),\n", + " ('brother', 644),\n", + " ('head', 643),\n", + " ('absolutely', 643),\n", + " ('experience', 642),\n", + " ('eyes', 641),\n", + " ('sex', 638),\n", + " ('direction', 637),\n", + " ('called', 637),\n", + " ('directed', 636),\n", + " ('lines', 634),\n", + " ('behind', 633),\n", + " ('sort', 632),\n", + " ('actress', 631),\n", + " ('lead', 630),\n", + " ('oscar', 628),\n", + " ('example', 627),\n", + " ('including', 627),\n", + " ('known', 625),\n", + " ('musical', 625),\n", + " ('chance', 621),\n", + " ('score', 620),\n", + " ('feeling', 619),\n", + " ('already', 619),\n", + " ('hit', 619),\n", + " ('voice', 
615),\n", + " ('moment', 612),\n", + " ('living', 612),\n", + " ('low', 610),\n", + " ('supporting', 610),\n", + " ('ago', 609),\n", + " ('themselves', 608),\n", + " ('hilarious', 605),\n", + " ('reality', 605),\n", + " ('jack', 604),\n", + " ('told', 603),\n", + " ('hand', 601),\n", + " ('moving', 600),\n", + " ('dialogue', 600),\n", + " ('quality', 600),\n", + " ('song', 599),\n", + " ('happy', 599),\n", + " ('paul', 598),\n", + " ('matter', 598),\n", + " ('light', 594),\n", + " ('future', 593),\n", + " ('entire', 592),\n", + " ('finds', 591),\n", + " ('gave', 589),\n", + " ('laugh', 587),\n", + " ('released', 586),\n", + " ('expect', 584),\n", + " ('fight', 581),\n", + " ('particularly', 580),\n", + " ('cinematography', 579),\n", + " ('police', 579),\n", + " ('whose', 578),\n", + " ('type', 578),\n", + " ('sound', 578),\n", + " ('enjoyable', 573),\n", + " ('view', 573),\n", + " ('husband', 572),\n", + " ('romantic', 572),\n", + " ('number', 572),\n", + " ('daughter', 572),\n", + " ('documentary', 571),\n", + " ('self', 570),\n", + " ('modern', 569),\n", + " ('robert', 569),\n", + " ('took', 569),\n", + " ('superb', 569),\n", + " ('mean', 566),\n", + " ('shown', 563),\n", + " ('coming', 561),\n", + " ('important', 560),\n", + " ('king', 559),\n", + " ('leave', 559),\n", + " ('change', 558),\n", + " ('wanted', 555),\n", + " ('somewhat', 555),\n", + " ('tells', 554),\n", + " ('run', 552),\n", + " ('events', 552),\n", + " ('country', 552),\n", + " ('career', 552),\n", + " ('heard', 550),\n", + " ('season', 550),\n", + " ('girls', 549),\n", + " ('greatest', 549),\n", + " ('etc', 547),\n", + " ('care', 546),\n", + " ('starts', 545),\n", + " ('english', 542),\n", + " ('killer', 541),\n", + " ('animation', 540),\n", + " ('guys', 540),\n", + " ('totally', 540),\n", + " ('tale', 540),\n", + " ('usual', 539),\n", + " ('opinion', 535),\n", + " ('miss', 535),\n", + " ('violence', 531),\n", + " ('easy', 531),\n", + " ('songs', 530),\n", + " ('british', 528),\n", + " ('says', 
526),\n", + " ('realistic', 525),\n", + " ('writing', 524),\n", + " ('act', 522),\n", + " ('writer', 522),\n", + " ('comic', 521),\n", + " ('thriller', 519),\n", + " ('television', 517),\n", + " ('power', 516),\n", + " ('ones', 515),\n", + " ('kid', 514),\n", + " ('novel', 513),\n", + " ('york', 513),\n", + " ('problem', 512),\n", + " ('alone', 512),\n", + " ('attention', 509),\n", + " ('involved', 508),\n", + " ('kill', 507),\n", + " ('extremely', 507),\n", + " ('seemed', 506),\n", + " ('hero', 505),\n", + " ('french', 505),\n", + " ('rock', 504),\n", + " ('stuff', 501),\n", + " ('wish', 499),\n", + " ('begins', 498),\n", + " ('taken', 497),\n", + " ('sad', 497),\n", + " ('ways', 496),\n", + " ('richard', 495),\n", + " ('knows', 494),\n", + " ('atmosphere', 493),\n", + " ('surprised', 491),\n", + " ('similar', 491),\n", + " ('taking', 491),\n", + " ('car', 491),\n", + " ('george', 490),\n", + " ('perfectly', 490),\n", + " ('across', 489),\n", + " ('sequence', 489),\n", + " ('eye', 489),\n", + " ('team', 489),\n", + " ('serious', 488),\n", + " ('powerful', 488),\n", + " ('room', 488),\n", + " ('due', 488),\n", + " ('among', 488),\n", + " ('order', 487),\n", + " ('b', 487),\n", + " ('cannot', 487),\n", + " ('strange', 487),\n", + " ('beauty', 486),\n", + " ('famous', 485),\n", + " ('tries', 484),\n", + " ('myself', 484),\n", + " ('happened', 484),\n", + " ('herself', 484),\n", + " ('class', 483),\n", + " ('four', 482),\n", + " ('cool', 481),\n", + " ('release', 479),\n", + " ('anyway', 479),\n", + " ('theme', 479),\n", + " ('opening', 478),\n", + " ('entertainment', 477),\n", + " ('unique', 475),\n", + " ('ends', 475),\n", + " ('slow', 475),\n", + " ('exactly', 475),\n", + " ('red', 474),\n", + " ('o', 474),\n", + " ('level', 474),\n", + " ('easily', 474),\n", + " ('interest', 472),\n", + " ('happen', 471),\n", + " ('crime', 470),\n", + " ('viewing', 468),\n", + " ('memorable', 467),\n", + " ('sets', 467),\n", + " ('group', 466),\n", + " ('stop', 466),\n", + " 
('dance', 463),\n", + " ('message', 463),\n", + " ('sister', 463),\n", + " ('working', 463),\n", + " ('problems', 463),\n", + " ('knew', 462),\n", + " ('mystery', 461),\n", + " ('nature', 461),\n", + " ('bring', 460),\n", + " ('believable', 459),\n", + " ('thinking', 459),\n", + " ('brought', 459),\n", + " ('mostly', 458),\n", + " ('couldn', 457),\n", + " ('disney', 457),\n", + " ('society', 456),\n", + " ('within', 455),\n", + " ('lady', 455),\n", + " ('blood', 454),\n", + " ('upon', 453),\n", + " ('viewers', 453),\n", + " ('parents', 453),\n", + " ('meets', 452),\n", + " ('form', 452),\n", + " ('soundtrack', 452),\n", + " ('usually', 452),\n", + " ('tom', 452),\n", + " ('peter', 452),\n", + " ('local', 450),\n", + " ('certain', 448),\n", + " ('follow', 448),\n", + " ('whether', 447),\n", + " ('possible', 446),\n", + " ('emotional', 445),\n", + " ('killed', 444),\n", + " ('de', 444),\n", + " ('above', 444),\n", + " ('middle', 443),\n", + " ('god', 443),\n", + " ('happens', 442),\n", + " ('flick', 442),\n", + " ('needs', 442),\n", + " ('masterpiece', 441),\n", + " ('major', 440),\n", + " ('period', 440),\n", + " ('haven', 439),\n", + " ('named', 439),\n", + " ('th', 438),\n", + " ('particular', 438),\n", + " ('earth', 437),\n", + " ('feature', 437),\n", + " ('stand', 436),\n", + " ('words', 435),\n", + " ('typical', 435),\n", + " ('obviously', 433),\n", + " ('elements', 433),\n", + " ('romance', 431),\n", + " ('jane', 430),\n", + " ('yourself', 427),\n", + " ('showing', 427),\n", + " ('fantasy', 426),\n", + " ('brings', 426),\n", + " ('america', 423),\n", + " ('guess', 423),\n", + " ('huge', 422),\n", + " ('unfortunately', 422),\n", + " ('indeed', 421),\n", + " ('running', 421),\n", + " ('talent', 420),\n", + " ('stage', 419),\n", + " ('started', 418),\n", + " ('sweet', 417),\n", + " ('leads', 417),\n", + " ('japanese', 417),\n", + " ('poor', 416),\n", + " ('deal', 416),\n", + " ('personal', 413),\n", + " ('incredible', 413),\n", + " ('fast', 412),\n", + " 
('became', 410),\n", + " ('deep', 410),\n", + " ('hours', 409),\n", + " ('nearly', 408),\n", + " ('dream', 408),\n", + " ('giving', 408),\n", + " ('turned', 407),\n", + " ('clearly', 407),\n", + " ('near', 406),\n", + " ('obvious', 406),\n", + " ('cut', 405),\n", + " ('surprise', 405),\n", + " ('body', 404),\n", + " ('era', 404),\n", + " ('female', 403),\n", + " ('hour', 403),\n", + " ('five', 403),\n", + " ('note', 399),\n", + " ('learn', 398),\n", + " ('truth', 398),\n", + " ('match', 397),\n", + " ('feels', 397),\n", + " ('except', 397),\n", + " ('tony', 397),\n", + " ('filmed', 394),\n", + " ('complete', 394),\n", + " ('clear', 394),\n", + " ('older', 393),\n", + " ('street', 393),\n", + " ('lots', 393),\n", + " ('eventually', 393),\n", + " ('keeps', 393),\n", + " ('buy', 392),\n", + " ('stewart', 391),\n", + " ('william', 391),\n", + " ('joe', 390),\n", + " ('meet', 390),\n", + " ('fall', 390),\n", + " ('shots', 389),\n", + " ('talking', 389),\n", + " ('difficult', 389),\n", + " ('unlike', 389),\n", + " ('rating', 389),\n", + " ('means', 388),\n", + " ('dramatic', 388),\n", + " ('appears', 386),\n", + " ('subject', 386),\n", + " ('wonder', 386),\n", + " ('present', 386),\n", + " ('situation', 386),\n", + " ('comments', 385),\n", + " ('sequences', 383),\n", + " ('general', 383),\n", + " ('lee', 383),\n", + " ('earlier', 382),\n", + " ('points', 382),\n", + " ('check', 379),\n", + " ('gone', 379),\n", + " ('ten', 378),\n", + " ('suspense', 378),\n", + " ('recommended', 378),\n", + " ('business', 377),\n", + " ('third', 377),\n", + " ('talk', 375),\n", + " ('leaves', 375),\n", + " ('beyond', 375),\n", + " ('portrayal', 374),\n", + " ('beautifully', 373),\n", + " ('single', 372),\n", + " ('bill', 372),\n", + " ('word', 371),\n", + " ('plenty', 371),\n", + " ('falls', 370),\n", + " ('whom', 370),\n", + " ('figure', 369),\n", + " ('battle', 369),\n", + " ('scary', 369),\n", + " ('non', 369),\n", + " ('return', 368),\n", + " ('using', 368),\n", + " ('doubt', 
367),\n", + " ('add', 367),\n", + " ('hear', 366),\n", + " ('solid', 366),\n", + " ('success', 366),\n", + " ('touching', 365),\n", + " ('political', 365),\n", + " ('oh', 365),\n", + " ('jokes', 365),\n", + " ('awesome', 364),\n", + " ('hell', 364),\n", + " ('boys', 364),\n", + " ('dog', 362),\n", + " ('recently', 362),\n", + " ('sexual', 362),\n", + " ('please', 361),\n", + " ('wouldn', 361),\n", + " ('features', 361),\n", + " ('straight', 361),\n", + " ('lack', 360),\n", + " ('forget', 360),\n", + " ('setting', 360),\n", + " ('mark', 359),\n", + " ('married', 359),\n", + " ('social', 357),\n", + " ('adventure', 356),\n", + " ('interested', 356),\n", + " ('brothers', 355),\n", + " ('sees', 355),\n", + " ('actual', 355),\n", + " ('terrific', 355),\n", + " ('move', 354),\n", + " ('call', 354),\n", + " ('various', 353),\n", + " ('dr', 353),\n", + " ('theater', 353),\n", + " ('animated', 352),\n", + " ('western', 351),\n", + " ('space', 350),\n", + " ('baby', 350),\n", + " ('leading', 348),\n", + " ('disappointed', 348),\n", + " ('portrayed', 346),\n", + " ('aren', 346),\n", + " ('screenplay', 345),\n", + " ('smith', 345),\n", + " ('hate', 344),\n", + " ('towards', 344),\n", + " ('noir', 343),\n", + " ('outstanding', 342),\n", + " ('decent', 342),\n", + " ('kelly', 342),\n", + " ('directors', 341),\n", + " ('journey', 341),\n", + " ('none', 340),\n", + " ('effective', 340),\n", + " ('looked', 340),\n", + " ('caught', 339),\n", + " ('cold', 339),\n", + " ('storyline', 339),\n", + " ('fi', 339),\n", + " ('sci', 339),\n", + " ('mary', 339),\n", + " ('rich', 338),\n", + " ('charming', 338),\n", + " ('harry', 337),\n", + " ('popular', 337),\n", + " ('manages', 337),\n", + " ('rare', 337),\n", + " ('spirit', 336),\n", + " ('open', 335),\n", + " ('appreciate', 335),\n", + " ('basically', 334),\n", + " ('moves', 334),\n", + " ('acted', 334),\n", + " ('deserves', 333),\n", + " ('subtle', 333),\n", + " ('mention', 333),\n", + " ('inside', 333),\n", + " ('pace', 333),\n", + " 
('century', 333),\n", + " ('boring', 333),\n", + " ('familiar', 332),\n", + " ('background', 332),\n", + " ('ben', 331),\n", + " ('creepy', 330),\n", + " ('supposed', 330),\n", + " ('secret', 329),\n", + " ('jim', 328),\n", + " ('die', 328),\n", + " ('question', 327),\n", + " ('effect', 327),\n", + " ('natural', 327),\n", + " ('rate', 326),\n", + " ('language', 326),\n", + " ('impressive', 326),\n", + " ('intelligent', 325),\n", + " ('saying', 325),\n", + " ('material', 324),\n", + " ('realize', 324),\n", + " ('telling', 324),\n", + " ('scott', 324),\n", + " ('singing', 323),\n", + " ('dancing', 322),\n", + " ('adult', 321),\n", + " ('imagine', 321),\n", + " ('visual', 321),\n", + " ('kept', 320),\n", + " ('office', 320),\n", + " ('uses', 319),\n", + " ('pure', 318),\n", + " ('wait', 318),\n", + " ('stunning', 318),\n", + " ('copy', 317),\n", + " ('review', 317),\n", + " ('previous', 317),\n", + " ('seriously', 317),\n", + " ('somehow', 316),\n", + " ('created', 316),\n", + " ('magic', 316),\n", + " ('create', 316),\n", + " ('hot', 316),\n", + " ('reading', 316),\n", + " ('crazy', 315),\n", + " ('air', 315),\n", + " ('frank', 315),\n", + " ('stay', 315),\n", + " ('escape', 315),\n", + " ('attempt', 315),\n", + " ('hands', 314),\n", + " ('filled', 313),\n", + " ('surprisingly', 312),\n", + " ('expected', 312),\n", + " ('average', 312),\n", + " ('complex', 311),\n", + " ('studio', 310),\n", + " ('successful', 310),\n", + " ('quickly', 310),\n", + " ('male', 309),\n", + " ('plus', 309),\n", + " ('co', 307),\n", + " ('minute', 306),\n", + " ('images', 306),\n", + " ('casting', 306),\n", + " ('exciting', 306),\n", + " ('following', 306),\n", + " ('members', 305),\n", + " ('german', 305),\n", + " ('e', 305),\n", + " ('reasons', 305),\n", + " ('follows', 305),\n", + " ('themes', 305),\n", + " ('touch', 304),\n", + " ('genius', 304),\n", + " ('free', 304),\n", + " ('edge', 304),\n", + " ('cute', 304),\n", + " ('outside', 303),\n", + " ('ok', 302),\n", + " ('admit', 
302),\n", + " ('younger', 302),\n", + " ('reviews', 302),\n", + " ('odd', 301),\n", + " ('fighting', 301),\n", + " ('master', 301),\n", + " ('break', 300),\n", + " ('thanks', 300),\n", + " ('recent', 300),\n", + " ('comment', 300),\n", + " ('apart', 299),\n", + " ('lovely', 298),\n", + " ('begin', 298),\n", + " ('emotions', 298),\n", + " ('doctor', 297),\n", + " ('italian', 297),\n", + " ('party', 297),\n", + " ('la', 296),\n", + " ('missed', 296),\n", + " ...]" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "positive_counts.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "pos_neg_ratios = Counter()\n", + "\n", + "for term,cnt in list(total_counts.most_common()):\n", + " if(cnt > 100):\n", + " pos_neg_ratio = positive_counts[term] / float(negative_counts[term]+1)\n", + " pos_neg_ratios[term] = pos_neg_ratio\n", + "\n", + "for word,ratio in pos_neg_ratios.most_common():\n", + " if(ratio > 1):\n", + " pos_neg_ratios[word] = np.log(ratio)\n", + " else:\n", + " pos_neg_ratios[word] = -np.log((1 / (ratio+0.01)))" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('edie', 4.6913478822291435),\n", + " ('paulie', 4.0775374439057197),\n", + " ('felix', 3.1527360223636558),\n", + " ('polanski', 2.8233610476132043),\n", + " ('matthau', 2.8067217286092401),\n", + " ('victoria', 2.6810215287142909),\n", + " ('mildred', 2.6026896854443837),\n", + " ('gandhi', 2.5389738710582761),\n", + " ('flawless', 2.451005098112319),\n", + " ('superbly', 2.2600254785752498),\n", + " ('perfection', 2.1594842493533721),\n", + " ('astaire', 2.1400661634962708),\n", + " ('captures', 2.0386195471595809),\n", + " ('voight', 2.0301704926730531),\n", + " ('wonderfully', 2.0218960560332353),\n", + " ('powell', 1.9783454248084671),\n", 
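The counting and ratio cells recorded above can be reproduced outside the notebook. The sketch below is a minimal standalone version of the same logic on a toy corpus (the toy reviews/labels are invented stand-ins for `reviews.txt`/`labels.txt`, and the notebook's `cnt > 100` frequency cutoff is dropped since the toy vocabulary is tiny). It shows why the positive-to-negative log ratio separates sentiment-bearing words: the `+1` in the denominator guards against division by zero, and taking the log makes the scale symmetric around 0, so positive words land above 0 and negative words below it.

```python
from collections import Counter
import numpy as np

# Toy stand-ins for the notebook's reviews.txt / labels.txt data
reviews = ["great great film", "terrible boring film", "great fun", "boring mess"]
labels = ["POSITIVE", "NEGATIVE", "POSITIVE", "NEGATIVE"]

positive_counts, negative_counts, total_counts = Counter(), Counter(), Counter()
for review, label in zip(reviews, labels):
    for word in review.split(" "):
        # Tally the word under its label, and in the overall total
        (positive_counts if label == "POSITIVE" else negative_counts)[word] += 1
        total_counts[word] += 1

# Same ratio logic as the notebook cell: +1 avoids division by zero,
# and the log transform centers neutral words near 0.
pos_neg_ratios = Counter()
for term, cnt in total_counts.most_common():
    ratio = positive_counts[term] / float(negative_counts[term] + 1)
    pos_neg_ratios[term] = np.log(ratio) if ratio > 1 else -np.log(1 / (ratio + 0.01))

print(pos_neg_ratios["great"] > 0, pos_neg_ratios["boring"] < 0)  # True True
```

On the full 25,000-review dataset this is what pushes words like `wonderful` and `superbly` to large positive values in the output above, while frequent neutral words (`the`, `film`) hover near zero.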
+ " ('brosnan', 1.9547990964725592),\n", + " ('lily', 1.9203768470501485),\n", + " ('bakshi', 1.9029851043382795),\n", + " ('lincoln', 1.9014583864844796),\n", + " ('refreshing', 1.8551812956655511),\n", + " ('breathtaking', 1.8481124057791867),\n", + " ('bourne', 1.8478489358790986),\n", + " ('lemmon', 1.8458266904983307),\n", + " ('delightful', 1.8002701588959635),\n", + " ('flynn', 1.7996646487351682),\n", + " ('andrews', 1.7764919970972666),\n", + " ('homer', 1.7692866133759964),\n", + " ('beautifully', 1.7626953362841438),\n", + " ('soccer', 1.7578579175523736),\n", + " ('elvira', 1.7397031072720019),\n", + " ('underrated', 1.7197859696029656),\n", + " ('gripping', 1.7165360479904674),\n", + " ('superb', 1.7091514458966952),\n", + " ('delight', 1.6714733033535532),\n", + " ('welles', 1.6677068205580761),\n", + " ('sadness', 1.663505133704376),\n", + " ('sinatra', 1.6389967146756448),\n", + " ('touching', 1.637217476541176),\n", + " ('timeless', 1.62924053973028),\n", + " ('macy', 1.6211339521972916),\n", + " ('unforgettable', 1.6177367152487956),\n", + " ('favorites', 1.6158688027643908),\n", + " ('stewart', 1.6119987332957739),\n", + " ('hartley', 1.6094379124341003),\n", + " ('sullivan', 1.6094379124341003),\n", + " ('extraordinary', 1.6094379124341003),\n", + " ('brilliantly', 1.5950491749820008),\n", + " ('friendship', 1.5677652160335325),\n", + " ('wonderful', 1.5645425925262093),\n", + " ('palma', 1.5553706911638245),\n", + " ('magnificent', 1.54663701119507),\n", + " ('finest', 1.5462590108125689),\n", + " ('jackie', 1.5439233053234738),\n", + " ('ritter', 1.5404450409471491),\n", + " ('tremendous', 1.5184661342283736),\n", + " ('freedom', 1.5091151908062312),\n", + " ('fantastic', 1.5048433868558566),\n", + " ('terrific', 1.5026699370083942),\n", + " ('noir', 1.493925025312256),\n", + " ('sidney', 1.493925025312256),\n", + " ('outstanding', 1.4910053152089213),\n", + " ('mann', 1.4894785973551214),\n", + " ('pleasantly', 1.4894785973551214),\n", + " 
('nancy', 1.488077055429833),\n", + " ('marie', 1.4825711915553104),\n", + " ('marvelous', 1.4739999415389962),\n", + " ('excellent', 1.4647538505723599),\n", + " ('ruth', 1.4596256342054401),\n", + " ('stanwyck', 1.4412101187160054),\n", + " ('widmark', 1.4350845252893227),\n", + " ('splendid', 1.4271163556401458),\n", + " ('chan', 1.423108334242607),\n", + " ('exceptional', 1.4201959127955721),\n", + " ('tender', 1.410986973710262),\n", + " ('gentle', 1.4078005663408544),\n", + " ('poignant', 1.4022947024663317),\n", + " ('gem', 1.3932148039644643),\n", + " ('amazing', 1.3919815802404802),\n", + " ('chilling', 1.3862943611198906),\n", + " ('captivating', 1.3862943611198906),\n", + " ('fisher', 1.3862943611198906),\n", + " ('davies', 1.3862943611198906),\n", + " ('darker', 1.3652409519220583),\n", + " ('april', 1.3499267169490159),\n", + " ('kelly', 1.3461743673304654),\n", + " ('blake', 1.3418425985490567),\n", + " ('overlooked', 1.329135947279942),\n", + " ('ralph', 1.32818673031261),\n", + " ('bette', 1.3156767939059373),\n", + " ('hoffman', 1.3150668518315229),\n", + " ('cole', 1.3121863889661687),\n", + " ('shines', 1.3049487216659381),\n", + " ('powerful', 1.2999662776313934),\n", + " ('notch', 1.2950456896547455),\n", + " ('remarkable', 1.2883688239495823),\n", + " ('pitt', 1.286210902562908),\n", + " ('winters', 1.2833463918674481),\n", + " ('vivid', 1.2762934659055623),\n", + " ('gritty', 1.2757524867200667),\n", + " ('giallo', 1.2745029551317739),\n", + " ('portrait', 1.2704625455947689),\n", + " ('innocence', 1.2694300209805796),\n", + " ('psychiatrist', 1.2685113254635072),\n", + " ('favorite', 1.2668956297860055),\n", + " ('ensemble', 1.2656663733312759),\n", + " ('stunning', 1.2622417124499117),\n", + " ('burns', 1.259880436264232),\n", + " ('garbo', 1.258954938743289),\n", + " ('barbara', 1.2580400255962119),\n", + " ('panic', 1.2527629684953681),\n", + " ('holly', 1.2527629684953681),\n", + " ('philip', 1.2527629684953681),\n", + " ('carol', 
1.2481440226390734),\n", + " ('perfect', 1.246742480713785),\n", + " ('appreciated', 1.2462482874741743),\n", + " ('favourite', 1.2411123512753928),\n", + " ('journey', 1.2367626271489269),\n", + " ('rural', 1.235471471385307),\n", + " ('bond', 1.2321436812926323),\n", + " ('builds', 1.2305398317106577),\n", + " ('brilliant', 1.2287554137664785),\n", + " ('brooklyn', 1.2286654169163074),\n", + " ('von', 1.225175011976539),\n", + " ('unfolds', 1.2163953243244932),\n", + " ('recommended', 1.2163953243244932),\n", + " ('daniel', 1.20215296760895),\n", + " ('perfectly', 1.1971931173405572),\n", + " ('crafted', 1.1962507582320256),\n", + " ('prince', 1.1939224684724346),\n", + " ('troubled', 1.192138346678933),\n", + " ('consequences', 1.1865810616140668),\n", + " ('haunting', 1.1814999484738773),\n", + " ('cinderella', 1.180052620608284),\n", + " ('alexander', 1.1759989522835299),\n", + " ('emotions', 1.1753049094563641),\n", + " ('boxing', 1.1735135968412274),\n", + " ('subtle', 1.1734135017508081),\n", + " ('curtis', 1.1649873576129823),\n", + " ('rare', 1.1566438362402944),\n", + " ('loved', 1.1563661500586044),\n", + " ('daughters', 1.1526795099383853),\n", + " ('courage', 1.1438688802562305),\n", + " ('dentist', 1.1426722784621401),\n", + " ('highly', 1.1420208631618658),\n", + " ('nominated', 1.1409146683587992),\n", + " ('tony', 1.1397491942285991),\n", + " ('draws', 1.1325138403437911),\n", + " ('everyday', 1.1306150197542835),\n", + " ('contrast', 1.1284652518177909),\n", + " ('cried', 1.1213405397456659),\n", + " ('fabulous', 1.1210851445201684),\n", + " ('ned', 1.120591195386885),\n", + " ('fay', 1.120591195386885),\n", + " ('emma', 1.1184149159642893),\n", + " ('sensitive', 1.113318436057805),\n", + " ('smooth', 1.1089750757036563),\n", + " ('dramas', 1.1080910326226534),\n", + " ('today', 1.1050431789984001),\n", + " ('helps', 1.1023091505494358),\n", + " ('inspiring', 1.0986122886681098),\n", + " ('jimmy', 1.0937696641923216),\n", + " ('awesome', 
1.0931328229034842),\n", + " ('unique', 1.0881409888008142),\n", + " ('tragic', 1.0871835928444868),\n", + " ('intense', 1.0870514662670339),\n", + " ('stellar', 1.0857088838322018),\n", + " ('rival', 1.0822184788924332),\n", + " ('provides', 1.0797081340289569),\n", + " ('depression', 1.0782034170369026),\n", + " ('shy', 1.0775588794702773),\n", + " ('carrie', 1.076139432816051),\n", + " ('blend', 1.0753554265038423),\n", + " ('hank', 1.0736109864626924),\n", + " ('diana', 1.0726368022648489),\n", + " ('adorable', 1.0726368022648489),\n", + " ('unexpected', 1.0722255334949147),\n", + " ('achievement', 1.0668635903535293),\n", + " ('bettie', 1.0663514264498881),\n", + " ('happiness', 1.0632729222228008),\n", + " ('glorious', 1.0608719606852626),\n", + " ('davis', 1.0541605260972757),\n", + " ('terrifying', 1.0525211814678428),\n", + " ('beauty', 1.050410186850232),\n", + " ('ideal', 1.0479685558493548),\n", + " ('fears', 1.0467872208035236),\n", + " ('hong', 1.0438040521731147),\n", + " ('seasons', 1.0433496099930604),\n", + " ('fascinating', 1.0414538748281612),\n", + " ('carries', 1.0345904299031787),\n", + " ('satisfying', 1.0321225473992768),\n", + " ('definite', 1.0319209141694374),\n", + " ('touched', 1.0296194171811581),\n", + " ('greatest', 1.0248947127715422),\n", + " ('creates', 1.0241097613701886),\n", + " ('aunt', 1.023388867430522),\n", + " ('walter', 1.022328983918479),\n", + " ('spectacular', 1.0198314108149955),\n", + " ('portrayal', 1.0189810189761024),\n", + " ('ann', 1.0127808528183286),\n", + " ('enterprise', 1.0116009116784799),\n", + " ('musicals', 1.0096648026516135),\n", + " ('deeply', 1.0094845087721023),\n", + " ('incredible', 1.0061677561461084),\n", + " ('mature', 1.0060195018402847),\n", + " ('triumph', 0.99682959435816731),\n", + " ('margaret', 0.99682959435816731),\n", + " ('navy', 0.99493385919326827),\n", + " ('harry', 0.99176919305006062),\n", + " ('lucas', 0.990398704027877),\n", + " ('sweet', 0.98966110487955483),\n", + " 
('joey', 0.98794672078059009),\n", + " ('oscar', 0.98721905111049713),\n", + " ('balance', 0.98649499054740353),\n", + " ('warm', 0.98485340331145166),\n", + " ('ages', 0.98449898190068863),\n", + " ('glover', 0.98082925301172619),\n", + " ('guilt', 0.98082925301172619),\n", + " ('carrey', 0.98082925301172619),\n", + " ('learns', 0.97881108885548895),\n", + " ('unusual', 0.97788374278196932),\n", + " ('sons', 0.97777581552483595),\n", + " ('complex', 0.97761897738147796),\n", + " ('essence', 0.97753435711487369),\n", + " ('brazil', 0.9769153536905899),\n", + " ('widow', 0.97650959186720987),\n", + " ('solid', 0.97537964824416146),\n", + " ('beautiful', 0.97326301262841053),\n", + " ('holmes', 0.97246100334120955),\n", + " ('awe', 0.97186058302896583),\n", + " ('vhs', 0.97116734209998934),\n", + " ('eerie', 0.97116734209998934),\n", + " ('lonely', 0.96873720724669754),\n", + " ('grim', 0.96873720724669754),\n", + " ('sport', 0.96825047080486615),\n", + " ('debut', 0.96508089604358704),\n", + " ('destiny', 0.96343751029985703),\n", + " ('thrillers', 0.96281074750904794),\n", + " ('tears', 0.95977584381389391),\n", + " ('rose', 0.95664202739772253),\n", + " ('feelings', 0.95551144502743635),\n", + " ('ginger', 0.95551144502743635),\n", + " ('winning', 0.95471810900804055),\n", + " ('stanley', 0.95387344302319799),\n", + " ('cox', 0.95343027882361187),\n", + " ('paris', 0.95278479030472663),\n", + " ('heart', 0.95238806924516806),\n", + " ('hooked', 0.95155887071161305),\n", + " ('comfortable', 0.94803943018873538),\n", + " ('mgm', 0.94446160884085151),\n", + " ('masterpiece', 0.94155039863339296),\n", + " ('themes', 0.94118828349588235),\n", + " ('danny', 0.93967118051821874),\n", + " ('anime', 0.93378388932167222),\n", + " ('perry', 0.93328830824272613),\n", + " ('joy', 0.93301752567946861),\n", + " ('lovable', 0.93081883243706487),\n", + " ('hal', 0.92953595862417571),\n", + " ('mysteries', 0.92953595862417571),\n", + " ('louis', 0.92871325187271225),\n", + " 
('charming', 0.92520609553210742),\n", + " ('urban', 0.92367083917177761),\n", + " ('allows', 0.92183091224977043),\n", + " ('impact', 0.91815814604895041),\n", + " ('gradually', 0.91629073187415511),\n", + " ('lifestyle', 0.91629073187415511),\n", + " ('italy', 0.91629073187415511),\n", + " ('spy', 0.91289514287301687),\n", + " ('treat', 0.91193342650519937),\n", + " ('subsequent', 0.91056005716517008),\n", + " ('kennedy', 0.90981821736853763),\n", + " ('loving', 0.90967549275543591),\n", + " ('surprising', 0.90937028902958128),\n", + " ('quiet', 0.90648673177753425),\n", + " ('winter', 0.90624039602065365),\n", + " ('reveals', 0.90490540964902977),\n", + " ('raw', 0.90445627422715225),\n", + " ('funniest', 0.90078654533818991),\n", + " ('pleased', 0.89994159387262562),\n", + " ('norman', 0.89994159387262562),\n", + " ('thief', 0.89874642222324552),\n", + " ('season', 0.89827222637147675),\n", + " ('secrets', 0.89794159320595857),\n", + " ('colorful', 0.89705936994626756),\n", + " ('highest', 0.8967461358011849),\n", + " ('compelling', 0.89462923509297576),\n", + " ('danes', 0.89248008318043659),\n", + " ('castle', 0.88967708335606499),\n", + " ('kudos', 0.88889175768604067),\n", + " ('great', 0.88810470901464589),\n", + " ('baseball', 0.88730319500090271),\n", + " ('subtitles', 0.88730319500090271),\n", + " ('bleak', 0.88730319500090271),\n", + " ('winner', 0.88643776872447388),\n", + " ('tragedy', 0.88563699078315261),\n", + " ('todd', 0.88551907320740142),\n", + " ('nicely', 0.87924946019380601),\n", + " ('arthur', 0.87546873735389985),\n", + " ('essential', 0.87373111745535925),\n", + " ('gorgeous', 0.8731725250935497),\n", + " ('fonda', 0.87294029100054127),\n", + " ('eastwood', 0.87139541196626402),\n", + " ('focuses', 0.87082835779739776),\n", + " ('enjoyed', 0.87070195951624607),\n", + " ('natural', 0.86997924506912838),\n", + " ('intensity', 0.86835126958503595),\n", + " ('witty', 0.86824103423244681),\n", + " ('rob', 0.8642954367557748),\n", + " 
('worlds', 0.86377269759070874),\n", + " ('health', 0.86113891179907498),\n", + " ('magical', 0.85953791528170564),\n", + " ('deeper', 0.85802182375017932),\n", + " ('lucy', 0.85618680780444956),\n", + " ('moving', 0.85566611005772031),\n", + " ('lovely', 0.85290640004681306),\n", + " ('purple', 0.8513711857748395),\n", + " ('memorable', 0.84801189112086062),\n", + " ('sings', 0.84729786038720367),\n", + " ('craig', 0.84342938360928321),\n", + " ('modesty', 0.84342938360928321),\n", + " ('relate', 0.84326559685926517),\n", + " ('episodes', 0.84223712084137292),\n", + " ('strong', 0.84167135777060931),\n", + " ('smith', 0.83959811108590054),\n", + " ('tear', 0.83704136022001441),\n", + " ('apartment', 0.83333115290549531),\n", + " ('princess', 0.83290912293510388),\n", + " ('disagree', 0.83290912293510388),\n", + " ('kung', 0.83173334384609199),\n", + " ('adventure', 0.83150561393278388),\n", + " ('columbo', 0.82667857318446791),\n", + " ('jake', 0.82667857318446791),\n", + " ('adds', 0.82485652591452319),\n", + " ('hart', 0.82472353834866463),\n", + " ('strength', 0.82417544296634937),\n", + " ('realizes', 0.82360006895738058),\n", + " ('dave', 0.8232003088081431),\n", + " ('childhood', 0.82208086393583857),\n", + " ('forbidden', 0.81989888619908913),\n", + " ('tight', 0.81883539572344199),\n", + " ('surreal', 0.8178506590609026),\n", + " ('manager', 0.81770990320170756),\n", + " ('dancer', 0.81574950265227764),\n", + " ('con', 0.81093021621632877),\n", + " ('studios', 0.81093021621632877),\n", + " ('miike', 0.80821651034473263),\n", + " ('realistic', 0.80807714723392232),\n", + " ('explicit', 0.80792269515237358),\n", + " ('kurt', 0.8060875917405409),\n", + " ('traditional', 0.80535917116687328),\n", + " ('deals', 0.80535917116687328),\n", + " ('holds', 0.80493858654806194),\n", + " ('carl', 0.80437281567016972),\n", + " ('touches', 0.80396154690023547),\n", + " ('gene', 0.80314807577427383),\n", + " ('albert', 0.8027669055771679),\n", + " ('abc', 
0.80234647252493729),\n", + " ('cry', 0.80011930011211307),\n", + " ('sides', 0.7995275841185171),\n", + " ('develops', 0.79850769621777162),\n", + " ('eyre', 0.79850769621777162),\n", + " ('dances', 0.79694397424158891),\n", + " ('oscars', 0.79633141679517616),\n", + " ('legendary', 0.79600456599965308),\n", + " ('importance', 0.79492987486988764),\n", + " ('hearted', 0.79492987486988764),\n", + " ('portraying', 0.79356592830699269),\n", + " ('impressed', 0.79258107754813223),\n", + " ('waters', 0.79112758892014912),\n", + " ('empire', 0.79078565012386137),\n", + " ('edge', 0.789774016249017),\n", + " ('environment', 0.78845736036427028),\n", + " ('jean', 0.78845736036427028),\n", + " ('sentimental', 0.7864791203521645),\n", + " ('captured', 0.78623760362595729),\n", + " ('styles', 0.78592891401091158),\n", + " ('daring', 0.78592891401091158),\n", + " ('backgrounds', 0.78275933924963248),\n", + " ('frank', 0.78275933924963248),\n", + " ('matches', 0.78275933924963248),\n", + " ('tense', 0.78275933924963248),\n", + " ('gothic', 0.78209466657644144),\n", + " ('sharp', 0.7814397877056235),\n", + " ('achieved', 0.78015855754957497),\n", + " ('court', 0.77947526404844247),\n", + " ('steals', 0.7789140023173704),\n", + " ('rules', 0.77844476107184035),\n", + " ('colors', 0.77684619943659217),\n", + " ('reunion', 0.77318988823348167),\n", + " ('covers', 0.77139937745969345),\n", + " ('tale', 0.77010822169607374),\n", + " ('rain', 0.7683706017975328),\n", + " ('denzel', 0.76804848873306297),\n", + " ('stays', 0.76787072675588186),\n", + " ('blob', 0.76725515271366718),\n", + " ('conventional', 0.76214005204689672),\n", + " ('maria', 0.76214005204689672),\n", + " ('fresh', 0.76158434211317383),\n", + " ('midnight', 0.76096977689870637),\n", + " ('landscape', 0.75852993982279704),\n", + " ('animated', 0.75768570169751648),\n", + " ('titanic', 0.75666058628227129),\n", + " ('sunday', 0.75666058628227129),\n", + " ('spring', 0.7537718023763802),\n", + " ('cagney', 
0.7537718023763802),\n", + " ('enjoyable', 0.75246375771636476),\n", + " ('immensely', 0.75198768058287868),\n", + " ('sir', 0.7507762933965817),\n", + " ('nevertheless', 0.75067102469813185),\n", + " ('driven', 0.74994477895307854),\n", + " ('performances', 0.74883252516063137),\n", + " ('memories', 0.74721440183022114),\n", + " ('nowadays', 0.74721440183022114),\n", + " ('simple', 0.74641420974143258),\n", + " ('golden', 0.74533293373051557),\n", + " ('leslie', 0.74533293373051557),\n", + " ('lovers', 0.74497224842453125),\n", + " ('relationship', 0.74484232345601786),\n", + " ('supporting', 0.74357803418683721),\n", + " ('che', 0.74262723782331497),\n", + " ('packed', 0.7410032017375805),\n", + " ('trek', 0.74021469141793106),\n", + " ('provoking', 0.73840377214806618),\n", + " ('strikes', 0.73759894313077912),\n", + " ('depiction', 0.73682224406260699),\n", + " ('emotional', 0.73678211645681524),\n", + " ('secretary', 0.7366322924996842),\n", + " ('influenced', 0.73511137965897755),\n", + " ('florida', 0.73511137965897755),\n", + " ('germany', 0.73288750920945944),\n", + " ('brings', 0.73142936713096229),\n", + " ('lewis', 0.73129894652432159),\n", + " ('elderly', 0.73088750854279239),\n", + " ('owner', 0.72743625403857748),\n", + " ('streets', 0.72666987259858895),\n", + " ('henry', 0.72642196944481741),\n", + " ('portrays', 0.72593700338293632),\n", + " ('bears', 0.7252354951114458),\n", + " ('china', 0.72489587887452556),\n", + " ('anger', 0.72439972406404984),\n", + " ('society', 0.72433010799663333),\n", + " ('available', 0.72415741730250549),\n", + " ('best', 0.72347034060446314),\n", + " ('bugs', 0.72270598280148979),\n", + " ('magic', 0.71878961117328299),\n", + " ('verhoeven', 0.71846498854423513),\n", + " ('delivers', 0.71846498854423513),\n", + " ('jim', 0.71783979315031676),\n", + " ('donald', 0.71667767797013937),\n", + " ('endearing', 0.71465338578090898),\n", + " ('relationships', 0.71393795022901896),\n", + " ('greatly', 
0.71256526641704687),\n", + " ('charlie', 0.71024161391924534),\n", + " ('brad', 0.71024161391924534),\n", + " ('simon', 0.70967648251115578),\n", + " ('effectively', 0.70914752190638641),\n", + " ('march', 0.70774597998109789),\n", + " ('atmosphere', 0.70744773070214162),\n", + " ('influence', 0.70733181555190172),\n", + " ('genius', 0.706392407309966),\n", + " ('emotionally', 0.70556970055850243),\n", + " ('ken', 0.70526854109229009),\n", + " ('identity', 0.70484322032313651),\n", + " ('sophisticated', 0.70470800296102132),\n", + " ('dan', 0.70457587638356811),\n", + " ('andrew', 0.70329955202396321),\n", + " ('india', 0.70144598337464037),\n", + " ('roy', 0.69970458110610434),\n", + " ('surprisingly', 0.6995780708902356),\n", + " ('sky', 0.69780919366575667),\n", + " ('romantic', 0.69664981111114743),\n", + " ('match', 0.69566924999265523),\n", + " ('britain', 0.69314718055994529),\n", + " ('beatty', 0.69314718055994529),\n", + " ('affected', 0.69314718055994529),\n", + " ('cowboy', 0.69314718055994529),\n", + " ('wave', 0.69314718055994529),\n", + " ('stylish', 0.69314718055994529),\n", + " ('bitter', 0.69314718055994529),\n", + " ('patient', 0.69314718055994529),\n", + " ('meets', 0.69314718055994529),\n", + " ('love', 0.69198533541937324),\n", + " ('paul', 0.68980827929443067),\n", + " ('andy', 0.68846333124751902),\n", + " ('performance', 0.68797386327972465),\n", + " ('patrick', 0.68645819240914863),\n", + " ('unlike', 0.68546468438792907),\n", + " ('brooks', 0.68433655087779044),\n", + " ('refuses', 0.68348526964820844),\n", + " ('award', 0.6824518914431974),\n", + " ('complaint', 0.6824518914431974),\n", + " ('ride', 0.68229716453587952),\n", + " ('dawson', 0.68171848473632257),\n", + " ('luke', 0.68158635815886937),\n", + " ('wells', 0.68087708796813096),\n", + " ('france', 0.6804081547825156),\n", + " ('handsome', 0.68007509899259255),\n", + " ('sports', 0.68007509899259255),\n", + " ('rebel', 0.67875844310784572),\n", + " ('directs', 
0.67875844310784572),\n", + " ('greater', 0.67605274720064523),\n", + " ('dreams', 0.67599410133369586),\n", + " ('effective', 0.67565402311242806),\n", + " ('interpretation', 0.67479804189174875),\n", + " ('works', 0.67445504754779284),\n", + " ('brando', 0.67445504754779284),\n", + " ('noble', 0.6737290947028437),\n", + " ('paced', 0.67314651385327573),\n", + " ('le', 0.67067432470788668),\n", + " ('master', 0.67015766233524654),\n", + " ('h', 0.6696166831497512),\n", + " ('rings', 0.66904962898088483),\n", + " ('easy', 0.66895995494594152),\n", + " ('city', 0.66820823221269321),\n", + " ('sunshine', 0.66782937257565544),\n", + " ('succeeds', 0.66647893347778397),\n", + " ('relations', 0.664159643686693),\n", + " ('england', 0.66387679825983203),\n", + " ('glimpse', 0.66329421741026418),\n", + " ('aired', 0.66268797307523675),\n", + " ('sees', 0.66263163663399482),\n", + " ('both', 0.66248336767382998),\n", + " ('definitely', 0.66199789483898808),\n", + " ('imaginative', 0.66139848224536502),\n", + " ('appreciate', 0.66083893732728749),\n", + " ('tricks', 0.66071190480679143),\n", + " ('striking', 0.66071190480679143),\n", + " ('carefully', 0.65999497324304479),\n", + " ('complicated', 0.65981076029235353),\n", + " ('perspective', 0.65962448852130173),\n", + " ('trilogy', 0.65877953705573755),\n", + " ('future', 0.65834665141052828),\n", + " ('lion', 0.65742909795786608),\n", + " ('victor', 0.65540685257709819),\n", + " ('douglas', 0.65540685257709819),\n", + " ('inspired', 0.65459851044271034),\n", + " ('marriage', 0.65392646740666405),\n", + " ('demands', 0.65392646740666405),\n", + " ('father', 0.65172321672194655),\n", + " ('page', 0.65123628494430852),\n", + " ('instant', 0.65058756614114943),\n", + " ('era', 0.6495567444850836),\n", + " ('ruthless', 0.64934455790155243),\n", + " ('saga', 0.64934455790155243),\n", + " ('joan', 0.64891392558311978),\n", + " ('joseph', 0.64841128671855386),\n", + " ('workers', 0.64829661439459352),\n", + " ('fantasy', 
0.64726757480925168),\n", + " ('accomplished', 0.64551913157069074),\n", + " ('distant', 0.64551913157069074),\n", + " ('manhattan', 0.64435701639051324),\n", + " ('personal', 0.64355023942057321),\n", + " ('pushing', 0.64313675998528386),\n", + " ('meeting', 0.64313675998528386),\n", + " ('individual', 0.64313675998528386),\n", + " ('pleasant', 0.64250344774119039),\n", + " ('brave', 0.64185388617239469),\n", + " ('william', 0.64083139119578469),\n", + " ('hudson', 0.64077919504262937),\n", + " ('friendly', 0.63949446706762514),\n", + " ('eccentric', 0.63907995928966954),\n", + " ('awards', 0.63875310849414646),\n", + " ('jack', 0.63838309514997038),\n", + " ('seeking', 0.63808740337691783),\n", + " ('colonel', 0.63757732940513456),\n", + " ('divorce', 0.63757732940513456),\n", + " ('jane', 0.63443957973316734),\n", + " ('keeping', 0.63414883979798953),\n", + " ('gives', 0.63383568159497883),\n", + " ('ted', 0.63342794585832296),\n", + " ('animation', 0.63208692379869902),\n", + " ('progress', 0.6317782341836532),\n", + " ('concert', 0.63127177684185776),\n", + " ('larger', 0.63127177684185776),\n", + " ('nation', 0.6296337748376194),\n", + " ('albeit', 0.62739580299716491),\n", + " ('adapted', 0.62613647027698516),\n", + " ('discovers', 0.62542900650499444),\n", + " ('classic', 0.62504956428050518),\n", + " ('segment', 0.62335141862440335),\n", + " ('morgan', 0.62303761437291871),\n", + " ('mouse', 0.62294292188669675),\n", + " ('impressive', 0.62211140744319349),\n", + " ('artist', 0.62168821657780038),\n", + " ('ultimate', 0.62168821657780038),\n", + " ('griffith', 0.62117368093485603),\n", + " ('emily', 0.62082651898031915),\n", + " ('drew', 0.62082651898031915),\n", + " ('moved', 0.6197197120051281),\n", + " ('profound', 0.61903920840622351),\n", + " ('families', 0.61903920840622351),\n", + " ('innocent', 0.61851219917136446),\n", + " ('versions', 0.61730910416844087),\n", + " ('eddie', 0.61691981517206107),\n", + " ('criticism', 0.61651395453902935),\n", + " 
('nature', 0.61594514653194088),\n", + " ('recognized', 0.61518563909023349),\n", + " ('sexuality', 0.61467556511845012),\n", + " ('contract', 0.61400986000122149),\n", + " ('brian', 0.61344043794920278),\n", + " ('remembered', 0.6131044728864089),\n", + " ('determined', 0.6123858239154869),\n", + " ('offers', 0.61207935747116349),\n", + " ('pleasure', 0.61195702582993206),\n", + " ('washington', 0.61180154110599294),\n", + " ('images', 0.61159731359583758),\n", + " ('games', 0.61067095873570676),\n", + " ('academy', 0.60872983874736208),\n", + " ('fashioned', 0.60798937221963845),\n", + " ('melodrama', 0.60749173598145145),\n", + " ('peoples', 0.60613580357031549),\n", + " ('charismatic', 0.60613580357031549),\n", + " ('rough', 0.60613580357031549),\n", + " ('dealing', 0.60517840761398811),\n", + " ('fine', 0.60496962268013299),\n", + " ('tap', 0.60391604683200273),\n", + " ('trio', 0.60157998703445481),\n", + " ('russell', 0.60120968523425966),\n", + " ('figures', 0.60077386042893011),\n", + " ('ward', 0.60005675749393339),\n", + " ('shine', 0.59911823091166894),\n", + " ('brady', 0.59911823091166894),\n", + " ('job', 0.59845562125168661),\n", + " ('satisfied', 0.59652034487087369),\n", + " ('river', 0.59637962862495086),\n", + " ('brown', 0.595773016534769),\n", + " ('believable', 0.59566072133302495),\n", + " ('bound', 0.59470710774669278),\n", + " ('always', 0.59470710774669278),\n", + " ('hall', 0.5933967777928858),\n", + " ('cook', 0.5916777203950857),\n", + " ('claire', 0.59136448625000293),\n", + " ('broadway', 0.59033768669372433),\n", + " ('anna', 0.58778666490211906),\n", + " ('peace', 0.58628403501758408),\n", + " ('visually', 0.58539431926349916),\n", + " ('falk', 0.58525821854876026),\n", + " ('morality', 0.58525821854876026),\n", + " ('growing', 0.58466653756587539),\n", + " ('experiences', 0.58314628534561685),\n", + " ('stood', 0.58314628534561685),\n", + " ('touch', 0.58122926435596001),\n", + " ('lives', 0.5810976767513224),\n", + " ('kubrick', 
0.58066919713325493),\n", + " ('timing', 0.58047401805583243),\n", + " ('struggles', 0.57981849525294216),\n", + " ('expressions', 0.57981849525294216),\n", + " ('authentic', 0.57848427223980559),\n", + " ('helen', 0.57763429343810091),\n", + " ('pre', 0.57700753064729182),\n", + " ('quirky', 0.5753641449035618),\n", + " ('young', 0.57531672344534313),\n", + " ('inner', 0.57454143815209846),\n", + " ('mexico', 0.57443087372056334),\n", + " ('clint', 0.57380042292737909),\n", + " ('sisters', 0.57286101468544337),\n", + " ('realism', 0.57226528899949558),\n", + " ('personalities', 0.5720692490067093),\n", + " ('french', 0.5720692490067093),\n", + " ('surprises', 0.57113222999698177),\n", + " ('adventures', 0.57113222999698177),\n", + " ('overcome', 0.5697681593994407),\n", + " ('timothy', 0.56953322459276867),\n", + " ('tales', 0.56909453188996639),\n", + " ('war', 0.56843317302781682),\n", + " ('civil', 0.5679840376059393),\n", + " ('countries', 0.56737779327091187),\n", + " ('streep', 0.56710645966458029),\n", + " ('tradition', 0.56685345523565323),\n", + " ('oliver', 0.56673325570428668),\n", + " ('australia', 0.56580775818334383),\n", + " ('understanding', 0.56531380905006046),\n", + " ('players', 0.56509525370004821),\n", + " ('knowing', 0.56489284503626647),\n", + " ('rogers', 0.56421349718405212),\n", + " ('suspenseful', 0.56368911332305849),\n", + " ('variety', 0.56368911332305849),\n", + " ('true', 0.56281525180810066),\n", + " ('jr', 0.56220982311246936),\n", + " ('psychological', 0.56108745854687891),\n", + " ('branagh', 0.55961578793542266),\n", + " ('wealth', 0.55961578793542266),\n", + " ('performing', 0.55961578793542266),\n", + " ('odds', 0.55961578793542266),\n", + " ('sent', 0.55961578793542266),\n", + " ('reminiscent', 0.55961578793542266),\n", + " ('grand', 0.55961578793542266),\n", + " ('overwhelming', 0.55961578793542266),\n", + " ('brothers', 0.55891181043362848),\n", + " ('howard', 0.55811089675600245),\n", + " ('david', 
0.55693122256475369),\n", + " ('generation', 0.55628799784274796),\n", + " ('grow', 0.55612538299565417),\n", + " ('survival', 0.55594605904646033),\n", + " ('mainstream', 0.55574731115750231),\n", + " ('dick', 0.55431073570572953),\n", + " ('charm', 0.55288175575407861),\n", + " ('kirk', 0.55278982286502287),\n", + " ('twists', 0.55244729845681018),\n", + " ('gangster', 0.55206858230003986),\n", + " ('jeff', 0.55179306225421365),\n", + " ('family', 0.55116244510065526),\n", + " ('tend', 0.55053307336110335),\n", + " ('thanks', 0.55049088015842218),\n", + " ('world', 0.54744234723432639),\n", + " ('sutherland', 0.54743536937855164),\n", + " ('life', 0.54695514434959924),\n", + " ('disc', 0.54654370636806993),\n", + " ('bug', 0.54654370636806993),\n", + " ('tribute', 0.5455111817538808),\n", + " ('europe', 0.54522705048332309),\n", + " ('sacrifice', 0.54430155296238014),\n", + " ('color', 0.54405127139431109),\n", + " ('superior', 0.54333490233128523),\n", + " ('york', 0.54318235866536513),\n", + " ('pulls', 0.54266622962164945),\n", + " ('hearts', 0.54232429082536171),\n", + " ('jackson', 0.54232429082536171),\n", + " ('enjoy', 0.54124285135906114),\n", + " ('redemption', 0.54056759296472823),\n", + " ('madness', 0.540384426007535),\n", + " ('hamilton', 0.5389965007326869),\n", + " ('stands', 0.5389965007326869),\n", + " ('trial', 0.5389965007326869),\n", + " ('greek', 0.5389965007326869),\n", + " ('each', 0.5388212312554177),\n", + " ('faithful', 0.53773307668591508),\n", + " ('received', 0.5372768098531604),\n", + " ('jealous', 0.53714293208336406),\n", + " ('documentaries', 0.53714293208336406),\n", + " ('different', 0.53709860682460819),\n", + " ('describes', 0.53680111016925136),\n", + " ('shorts', 0.53596159703753288),\n", + " ('brilliance', 0.53551823635636209),\n", + " ('mountains', 0.53492317534505118),\n", + " ('share', 0.53408248593025787),\n", + " ('dealt', 0.53408248593025787),\n", + " ('providing', 0.53329847961804933),\n", + " ('explore', 
0.53329847961804933),\n", + " ('series', 0.5325809226575603),\n", + " ('fellow', 0.5323318289869543),\n", + " ('loves', 0.53062825106217038),\n", + " ('olivier', 0.53062825106217038),\n", + " ('revolution', 0.53062825106217038),\n", + " ('roman', 0.53062825106217038),\n", + " ('century', 0.53002783074992665),\n", + " ('musical', 0.52966871156747064),\n", + " ('heroic', 0.52925932545482868),\n", + " ('ironically', 0.52806743020049673),\n", + " ('approach', 0.52806743020049673),\n", + " ('temple', 0.52806743020049673),\n", + " ('moves', 0.5279372642387119),\n", + " ('gift', 0.52702030968597136),\n", + " ('julie', 0.52609309589677911),\n", + " ('tells', 0.52415107836314001),\n", + " ('radio', 0.52394671172868779),\n", + " ('uncle', 0.52354439617376536),\n", + " ('union', 0.52324814376454787),\n", + " ('deep', 0.52309571635780505),\n", + " ('reminds', 0.52157841554225237),\n", + " ('famous', 0.52118841080153722),\n", + " ('jazz', 0.52053443789295151),\n", + " ('dennis', 0.51987545928590861),\n", + " ('epic', 0.51919387343650736),\n", + " ('adult', 0.519167695083386),\n", + " ('shows', 0.51915322220375304),\n", + " ('performed', 0.5191244265806858),\n", + " ('demons', 0.5191244265806858),\n", + " ('eric', 0.51879379341516751),\n", + " ('discovered', 0.51879379341516751),\n", + " ('youth', 0.5185626062681431),\n", + " ('human', 0.51851411224987087),\n", + " ('tarzan', 0.51813827061227724),\n", + " ('ourselves', 0.51794309153485463),\n", + " ('wwii', 0.51758240622887042),\n", + " ('passion', 0.5162164724008671),\n", + " ('desire', 0.51607497965213445),\n", + " ('pays', 0.51581316527702981),\n", + " ('fox', 0.51557622652458857),\n", + " ('dirty', 0.51557622652458857),\n", + " ('symbolism', 0.51546600332249293),\n", + " ('sympathetic', 0.51546600332249293),\n", + " ('attitude', 0.51530993621331933),\n", + " ('appearances', 0.51466440007315639),\n", + " ('jeremy', 0.51466440007315639),\n", + " ('fun', 0.51439068993048687),\n", + " ('south', 0.51420972175023116),\n", + " 
('arrives', 0.51409894911095988),\n", + " ('present', 0.51341965894303732),\n", + " ('com', 0.51326167856387173),\n", + " ('smile', 0.51265880484765169),\n", + " ('fits', 0.51082562376599072),\n", + " ('provided', 0.51082562376599072),\n", + " ('carter', 0.51082562376599072),\n", + " ('ring', 0.51082562376599072),\n", + " ('aging', 0.51082562376599072),\n", + " ('countryside', 0.51082562376599072),\n", + " ('alan', 0.51082562376599072),\n", + " ('visit', 0.51082562376599072),\n", + " ('begins', 0.51015650363396647),\n", + " ('success', 0.50900578704900468),\n", + " ('japan', 0.50900578704900468),\n", + " ('accurate', 0.50895471583017893),\n", + " ('proud', 0.50800474742434931),\n", + " ('daily', 0.5075946031845443),\n", + " ('atmospheric', 0.50724780241810674),\n", + " ('karloff', 0.50724780241810674),\n", + " ('recently', 0.50714914903668207),\n", + " ('fu', 0.50704490092608467),\n", + " ('horrors', 0.50656122497953315),\n", + " ('finding', 0.50637127341661037),\n", + " ('lust', 0.5059356384717989),\n", + " ('hitchcock', 0.50574947073413001),\n", + " ('among', 0.50334004951332734),\n", + " ('viewing', 0.50302139827440906),\n", + " ('shining', 0.50262885656181222),\n", + " ('investigation', 0.50262885656181222),\n", + " ('duo', 0.5020919437972361),\n", + " ('cameron', 0.5020919437972361),\n", + " ('finds', 0.50128303100539795),\n", + " ('contemporary', 0.50077528791248915),\n", + " ('genuine', 0.50046283673044401),\n", + " ('frightening', 0.49995595152908684),\n", + " ('plays', 0.49975983848890226),\n", + " ('age', 0.49941323171424595),\n", + " ('position', 0.49899116611898781),\n", + " ('continues', 0.49863035067217237),\n", + " ('roles', 0.49839716550752178),\n", + " ('james', 0.49837216269470402),\n", + " ('individuals', 0.49824684155913052),\n", + " ('brought', 0.49783842823917956),\n", + " ('hilarious', 0.49714551986191058),\n", + " ('brutal', 0.49681488669639234),\n", + " ('appropriate', 0.49643688631389105),\n", + " ('dance', 0.49581998314812048),\n", + " 
('league', 0.49578774640145024),\n", + " ('helping', 0.49578774640145024),\n", + " ('answers', 0.49578774640145024),\n", + " ('stunts', 0.49561620510246196),\n", + " ('traveling', 0.49532143723002542),\n", + " ('thoroughly', 0.49414593456733524),\n", + " ('depicted', 0.49317068852726992),\n", + " ('honor', 0.49247648509779424),\n", + " ('combination', 0.49247648509779424),\n", + " ('differences', 0.49247648509779424),\n", + " ('fully', 0.49213349075383811),\n", + " ('tracy', 0.49159426183810306),\n", + " ('battles', 0.49140753790888908),\n", + " ('possibility', 0.49112055268665822),\n", + " ('romance', 0.4901589869574316),\n", + " ('initially', 0.49002249613622745),\n", + " ('happy', 0.4898997500608791),\n", + " ('crime', 0.48977221456815834),\n", + " ('singing', 0.4893852925281213),\n", + " ('especially', 0.48901267837860624),\n", + " ('shakespeare', 0.48754793889664511),\n", + " ('hugh', 0.48729512635579658),\n", + " ('detail', 0.48609484250827351),\n", + " ('guide', 0.48550781578170082),\n", + " ('companion', 0.48550781578170082),\n", + " ('julia', 0.48550781578170082),\n", + " ('san', 0.48550781578170082),\n", + " ('desperation', 0.48550781578170082),\n", + " ('strongly', 0.48460242866688824),\n", + " ('necessary', 0.48302334245403883),\n", + " ('humanity', 0.48265474679929443),\n", + " ('drama', 0.48221998493060503),\n", + " ('warming', 0.48183808689273838),\n", + " ('intrigue', 0.48183808689273838),\n", + " ('nonetheless', 0.48183808689273838),\n", + " ('cuba', 0.48183808689273838),\n", + " ('planned', 0.47957308026188628),\n", + " ('pictures', 0.47929937011921681),\n", + " ('broadcast', 0.47849024312305422),\n", + " ('nine', 0.47803580094299974),\n", + " ('settings', 0.47743860773325364),\n", + " ('history', 0.47732966933780852),\n", + " ('ordinary', 0.47725880012690741),\n", + " ('trade', 0.47692407209030935),\n", + " ('primary', 0.47608267532211779),\n", + " ('official', 0.47608267532211779),\n", + " ('episode', 0.47529620261150429),\n", + " ('role', 
0.47520268270188676),\n", + " ('spirit', 0.47477690799839323),\n", + " ('grey', 0.47409361449726067),\n", + " ('ways', 0.47323464982718205),\n", + " ('cup', 0.47260441094579297),\n", + " ('piano', 0.47260441094579297),\n", + " ('familiar', 0.47241617565111949),\n", + " ('sinister', 0.47198579044972683),\n", + " ('reveal', 0.47171449364936496),\n", + " ('max', 0.47150852042515579),\n", + " ('dated', 0.47121648567094482),\n", + " ('discovery', 0.47000362924573563),\n", + " ('vicious', 0.47000362924573563),\n", + " ('losing', 0.47000362924573563),\n", + " ('genuinely', 0.46871413841586385),\n", + " ('hatred', 0.46734051182625186),\n", + " ('mistaken', 0.46702300110759781),\n", + " ('dream', 0.46608972992459924),\n", + " ('challenge', 0.46608972992459924),\n", + " ('crisis', 0.46575733836428446),\n", + " ('photographed', 0.46488852857896512),\n", + " ('machines', 0.46430560813109778),\n", + " ('critics', 0.46430560813109778),\n", + " ('bird', 0.46430560813109778),\n", + " ('born', 0.46411383518967209),\n", + " ('detective', 0.4636633473511525),\n", + " ('higher', 0.46328467899699055),\n", + " ('remains', 0.46262352194811296),\n", + " ('inevitable', 0.46262352194811296),\n", + " ('soviet', 0.4618180446592961),\n", + " ('ryan', 0.46134556650262099),\n", + " ('african', 0.46112595521371813),\n", + " ('smaller', 0.46081520319132935),\n", + " ('techniques', 0.46052488529119184),\n", + " ('information', 0.46034171833399862),\n", + " ('deserved', 0.45999798712841444),\n", + " ('cynical', 0.45953232937844013),\n", + " ('lynch', 0.45953232937844013),\n", + " ('francisco', 0.45953232937844013),\n", + " ('tour', 0.45953232937844013),\n", + " ('spielberg', 0.45953232937844013),\n", + " ('struggle', 0.45911782160048453),\n", + " ('language', 0.45902121257712653),\n", + " ('visual', 0.45823514408822852),\n", + " ('warner', 0.45724137763188427),\n", + " ('social', 0.45720078250735313),\n", + " ('reality', 0.45719346885019546),\n", + " ('hidden', 0.45675840249571492),\n", + " 
('breaking', 0.45601738727099561),\n", + " ('sometimes', 0.45563021171182794),\n", + " ('modern', 0.45500247579345005),\n", + " ('surfing', 0.45425527227759638),\n", + " ('popular', 0.45410691533051023),\n", + " ('surprised', 0.4534409399850382),\n", + " ('follows', 0.45245361754408348),\n", + " ('keeps', 0.45234869400701483),\n", + " ('john', 0.4520909494482197),\n", + " ('defeat', 0.45198512374305722),\n", + " ('mixed', 0.45198512374305722),\n", + " ('justice', 0.45142724367280018),\n", + " ('treasure', 0.45083371313801535),\n", + " ('presents', 0.44973793178615257),\n", + " ('years', 0.44919197032104968),\n", + " ('chief', 0.44895022004790319),\n", + " ('shadows', 0.44802472252696035),\n", + " ('closely', 0.44701411102103689),\n", + " ('segments', 0.44701411102103689),\n", + " ('lose', 0.44658335503763702),\n", + " ('caine', 0.44628710262841953),\n", + " ('caught', 0.44610275383999071),\n", + " ('hamlet', 0.44558510189758965),\n", + " ('chinese', 0.44507424620321018),\n", + " ('welcome', 0.44438052435783792),\n", + " ('birth', 0.44368632092836219),\n", + " ('represents', 0.44320543609101143),\n", + " ('puts', 0.44279106572085081),\n", + " ('fame', 0.44183275227903923),\n", + " ('closer', 0.44183275227903923),\n", + " ('visuals', 0.44183275227903923),\n", + " ('web', 0.44183275227903923),\n", + " ('criminal', 0.4412745608048752),\n", + " ('minor', 0.4409224199448939),\n", + " ('jon', 0.44086703515908027),\n", + " ('liked', 0.44074991514020723),\n", + " ('restaurant', 0.44031183943833246),\n", + " ('flaws', 0.43983275161237217),\n", + " ('de', 0.43983275161237217),\n", + " ('searching', 0.4393666597838457),\n", + " ('rap', 0.43891304217570443),\n", + " ('light', 0.43884433018199892),\n", + " ('elizabeth', 0.43872232986464677),\n", + " ('marry', 0.43861731542506488),\n", + " ('oz', 0.43825493093115531),\n", + " ('controversial', 0.43825493093115531),\n", + " ('learned', 0.43825493093115531),\n", + " ('slowly', 0.43785660389939979),\n", + " ('bridge', 
0.43721380642274466),\n", + " ('thrilling', 0.43721380642274466),\n", + " ('wayne', 0.43721380642274466),\n", + " ('comedic', 0.43721380642274466),\n", + " ('married', 0.43658501682196887),\n", + " ('nazi', 0.4361020775700542),\n", + " ('murder', 0.4353180712578455),\n", + " ('physical', 0.4353180712578455),\n", + " ('johnny', 0.43483971678806865),\n", + " ('michelle', 0.43445264498141672),\n", + " ('wallace', 0.43403848055222038),\n", + " ('silent', 0.43395706390247063),\n", + " ('comedies', 0.43395706390247063),\n", + " ('played', 0.43387244114515305),\n", + " ('international', 0.43363598507486073),\n", + " ('vision', 0.43286408229627887),\n", + " ('intelligent', 0.43196704885367099),\n", + " ('shop', 0.43078291609245434),\n", + " ('also', 0.43036720209769169),\n", + " ('levels', 0.4302451371066513),\n", + " ('miss', 0.43006426712153217),\n", + " ('ocean', 0.4295626596872249),\n", + " ...]" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"POSITIVE\" label\n", + "pos_neg_ratios.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('boll', -4.0778152602708904),\n", + " ('uwe', -3.9218753018711578),\n", + " ('seagal', -3.3202501058581921),\n", + " ('unwatchable', -3.0269848170580955),\n", + " ('stinker', -2.9876839403711624),\n", + " ('mst', -2.7753833211707968),\n", + " ('incoherent', -2.7641396677532537),\n", + " ('unfunny', -2.5545257844967644),\n", + " ('waste', -2.4907515123361046),\n", + " ('blah', -2.4475792789485005),\n", + " ('horrid', -2.3715779644809971),\n", + " ('pointless', -2.3451073877136341),\n", + " ('atrocious', -2.3187369339642556),\n", + " ('redeeming', -2.2667790015910296),\n", + " ('prom', -2.2601040980178784),\n", + " ('drivel', -2.2476029585766928),\n", + " ('lousy', -2.2118080125207054),\n", + 
" ('worst', -2.1930856334332267),\n", + " ('laughable', -2.172468615469592),\n", + " ('awful', -2.1385076866397488),\n", + " ('poorly', -2.1326133844207011),\n", + " ('wasting', -2.1178155545614512),\n", + " ('remotely', -2.111046881095167),\n", + " ('existent', -2.0024805005437076),\n", + " ('boredom', -1.9241486572738005),\n", + " ('miserably', -1.9216610938019989),\n", + " ('sucks', -1.9166645809588516),\n", + " ('uninspired', -1.9131499212248517),\n", + " ('lame', -1.9117232884159072),\n", + " ('insult', -1.9085323769376259)]" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"NEGATIVE\" label\n", + "list(reversed(pos_neg_ratios.most_common()))[0:30]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Transforming Text into Numbers" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"<base64-encoded PNG image data omitted — inline figure output of the cell above>"
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
uAvnnHFNwOHMTNt4N/jZBhX+cHnJMrbATVLUQSIlAIywivKHx0GOKglgi\nZ5xxho1t4Jw4Gdj9wd3HgAdm1dJNPxe3oZuOZbSXYFXEicgrrFBieqxICTvzpikbcsVbe90Fnw8G\nTawQzockqc6kdzsJJ83jpyMv8XUWLlzYCUgWhHzvGQsEAuULRCZYWmwee+wxax1Br9mzZ9vYJH66\nfh2LhPQLadUjBApGoMi1wEWWxV4sQVOHfQIi0qkiICcjQrSH0xOvwxc/fodflp+GY78cYntECfld\nunA97jzf4RDx/rVwvrg4ItSfpb1RertzxFQI3hrdz9Z8VxniPQuIgb9Q5vDqafdKcTFW6HdiyBQZ\nV4N2VLnXjOKEZLn7lEcI1AOB2k7N4FcRDNSxsS7wq1i3bp1NE+wZE4zvfxQiofKWniUK6h9LKeaI\nMOa8fWLNcYK+AdFKpV/R7XWm6zwrOVx76vS9atUqc+CBB9ZJpa664DdCX6R1YiUfH2cF6FYJlgsc\ngKdOnWqj6bL65tJLL+1pAelWZvgauhCldfPmzTZ0/DXXXGOnbfpxf7k63D0d1k2/hYAQqDcCo+BD\n9VZR2pWFAL4BbNde123a07Z7w4YNJtiAzQ6GafPWIT1kJK0Ta68pGq5DyPfff3/ry9HPwRrfkc9/\n/vMmsL5Y4lMGxpAQ2gQRkggBIdBMBGprEWkmnM3SGufD5cuXN0vpLto+99xz1uehS5JaX8rixNpt\nSS99ixVkzpw5dk+YfpIQgMbqgvVlzZo11iG2CAdbvwNFQnw0dCwEmouALCLN7bvcmjMw4BgazK8X\naqbPrVjGAljaysZtaZ0/M1ZXWjamW+ibpO1w0zM+0bjiiivMypUra4EHbYEMvfnmm2bt2rWFWC9E\nQkq7/VSwEOg7ArKI9B3y+lSIOZupjGXLltVHqYyasAx19OjRiQfvjNX0JRuEgk9SvxHSMtjzQSAh\nrCjDGpGUzJTZMO4zdvjFT2rKlCkdPbPWKRKSFTnlEwL1REAWkXr2S9+0YrDDfM+g1eR59g984APm\nkksuMTNmzOgbdv2oiP5J6jdCWhyjISFFWR6KbqMjSVn1EwkpukdUnhCoHgFZRKrvg0o1wMdg4sSJ\n5u67765UjzyVYw3B55qYJgzG/idPuXXIm8ZvhGBuTMd8/etfry2pvPHGG81+++1nzj///NTwioSk\nhkwZhEAjEJBFpBHdVK6SDNxYRfj2/QzKrbW40g899FC7ZJTlo2GhTb7gE1OH6QpfpyTHvfxGGKSx\nnNRlOqZbm5hCYorm2GOPNfPmzeuWtHNNJKQDhQ6EQOsQEBFpXZdmaxAm87ffftvO5WcroZpcLBFl\nVQZOqkmEQZDB2pekUx9+niqOne5YSXzhPMuwcQhtylJsiAXh4em7cHv8tnEsEhJGRL+FQLsQEBFp\nV39mbg2DGQPyrbfeWlmI7rTKF2UFoJzwjsi9Bse0uhaZHiuPT54IVrZlyxa7RLfIesoui+XFixYt\n6hr3JdzWsnVS+UJACPQfARGR/mNe2xp56DNF04QlsGVbAcJTOiwNrtO0FeTJORejW1OXYGMV4Z4j\n5khY6IM6E8KwvvotBIRANgRERLLh1tpcTHXcddddZuPGjZ2Bro6NZQBjOSjOj/0QfDQY7H3xrRL+\n+X4do9MXv/hFs++++yb2teiXbknrceQ3vGpLJCQpgkonBJqPgIhI8/uw8BbkXWJZuEKhAuuiX3hK\np9+OsBARrCHoccABB4RQas7P008/3e6B46wiIiHN6TtpKgSKQEBEpAgUW1hGXQZ7H1qmY9hMra5x\nMtAv7Ahb5pQOfYT0yyrk90WRxxAP9sNhwzyRkCKRVVlCoBkIiIg0o58q0ZKBbv369TZIVtVLXhnk\nWfKJZA2GVQWIUVM6Rfk9QHKa4M+TBHeWYF9wwQXm4osvTpJcaYSAEGgRAu9oUVvUlIIR4
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "\n", + "review = \"This was a horrible, terrible movie.\"\n", + "\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAi4AAAECCAYAAADZzFwPAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQVdV5/xdNZjIxjRgrM52qFI01ERQVExWNeMMLQy0YiEiNEgOYaJAO\nitIaGYo2TFGQeElQAREjRa0oDEG8AKagosYYkEuSjjUEbP+orZFc/KMzmfe3Pys+57fOfvfZZ1/P\nWXu/zzNz3rPP3uvyrO/a717f/axnPatfTyBGRRFQBBQBRUARUAQUgQog8CcV0FFVVAQUAUVAEVAE\nFAFFwCKgxEVvBEVAEVAEFAFFQBGoDAJKXCrTVaqoIqAIKAKKgCKgCChx0XtAEVAEFAFFQBFQBCqD\ngBKXynSVKqoIKAKKgCKgCCgCSlz0HlAEFAFFQBFQBBSByiCgxKUyXaWKKgKKgCKgCCgCioASF70H\nFAFFQBFQBBQBRaAyCHy8MpqqooqAItAVBH784x+bPXv2mJ07d5q9e/eat99+2+zYsaOXLuPGjTOH\nHHKIGTp0qBkyZIg59dRTzac//ele6fSEIqAIKAJ5EOinkXPzwKd5FYF6IrBp0yazYcMGs2rVKjNg\nwAAzcuRIc8IJJ5jBgwebgw8+2Hzuc59ravh//dd/mf/8z/807777rtm1a5d58cUX7Qcyc8kll5gv\nf/nLSmKaENMfioAikBUBJS5ZkdN8ikDNEPjtb39rli9fbh566CHbshkzZpgLLrjA/MVf/EWmllLe\nxo0bzfr1682yZcvMjTfeaG644YbM5WVSQjMpAopA7RBQH5fadak2SBFIj8A999xjPv/5z5stW7aY\nJUuWmO3bt5tJkyblIhlME1166aVm6dKl1hqDVocffriZOXOmwUKjoggoAopAFgSUuGRBTfMoAjVB\nAP+Vk046yaxZs8Z+nnzySfPFL36x8NZhtVmwYEGDwFDHihUrCq9HC1QEFIH6I6BTRfXvY22hIhCJ\nAFaW+fPnm3nz5lnrSmSikk5CmKZOnWqOOeYYOz2lTrwlAa3FKgI1REAtLjXsVG2SIhCHAL4nU6ZM\nsRaWzZs3d5y0oBsWl61bt5pBgwbZKapf/OIXcSrrNUVAEVAEGgioxaUBhR4oAvVHANIyZswYc+ih\nh3pj6WDK6JZbbjGQqPBqpfr3iLZQEVAE0iKgcVzSIqbpFYGKIiCkZdiwYdbfxJdm4ATMEuvzzjtP\nyYsvnaJ6KAIeI6DExePOUdUUgaIQ8JW0SPtYfYQoeRFE9FsRU
ARaIaDEpRUyel4RqBECc+fOta1h\nZY+vAnn5zW9+YyZMmGD9X9Rh19eeUr0Uge4ioD4u3cVfa1cESkdAfEh+/vOfVyJ6LY7DH3zwgWFp\ntooioAgoAmEEdFVRGBH9rQjUCAECveH4SpyWqlgwFi1aZPdD0jgvNboRtSmKQIEIqMWlQDC1KEXA\nNwTGjx9vTjzxRDN79mzfVIvVhzgvY8eONVWxEsU2Ri8qAopAoQgocSkUTi1MEfAHgaoP/mwNgPjs\nl+NPb6smikDfQUCJS9/pa21pH0MAaws7M7PcuIrCNBd7G7HrdNaNHqvYbtVZEVAE4hFQ4hKPj15V\nBCqJgFhbGPSrLGp1qXLvqe6KQDkIKHEpB1ctVRHoKgKszBk6dKiZPn16V/XIWzlWF7YHUF+XvEhq\nfkWgPgjoqqL69KW2RBGwCBBsbtmyZYapoqoLU0RsA7Bx48aqN0X1VwQUgYIQUOJSEJBajCLgCwIM\n8pMnT66NXwg+OuvXr/cFXtVDEVAEuoyAEpcud4BWrwgUjQCD/FlnnVV0sV0r74ILLrAWpK4poBUr\nAoqAVwgocfGqO1QZRSA/Ahs2bDCnn356/oI8KYHponPPPdfgcKyiCCgCioASF70HFIEaIYAzK4Jf\nSJ2EHa23bdtWpyZpWxQBRSAjAkpcMgKn2RQBHxFg+fPw4cN9VC2XTieccILZt29frjI0syKgCNQD\nASUu9ehHbYUiYBHAKjFo0KDaoTF48GCzd+/e2rVLG6QIKALpEVDikh4zzaEIeI3AwIEDvdZPlVME\nFAFFIA8CSlzyoKd5FQFFoCMIfP7znzerV6/uSF1aiSKgCPiNgBIXv/tHtVMEFIEAgU9/+tOKgyKg\nCCgCFgElLnojKAKKgCKgCCgCikBlEFDiUpmuUkUVgb6LwC9+8YvaRALuu72oLVcEikFAiUsxOGop\nikDXEWCPogMHDnRdjzIU+M1vflPLZd5lYKVlKgJ1R+DjdW+gts8PBIh6umfPHrNz5067rPXtt982\nO3bsaFKOCKnEIDnkkEPszsYcszOwSmsEICvsnIwcfPDB5vjjj6/lvj4QFxVFQBFQBEBAiYveB6Uh\nsGnTJrNq1SpDCPoBAwaYkSNHGgKJTZgwwQ6y4eiuRH0lgNq7775rdu3aZWbNmmVefPFFu2Hg6NGj\nbX510jRGcKLjICsuuWOAf+edd0rr024VvHv3bjNixIhuVa/1KgKKgEcI9OsJxCN9VJWKI4AFYPny\n5Wb+/PmWrMyYMcOwSR7WlCxCeex2vHLlShvyfeLEieaGG27IXF4WHXzI45KVww8/PLb9/fr1MxCY\nOpG88ePHmyuuuMJceumlPnSH6qAIKAJdREB9XLoIfp2qhmDcc889hngbW7ZsMWvWrDHbt283kyZN\nih1k22HA4Mtg9eSTTzY22WPgnjlzpqHOOgsOqUyxyeaCWFb4tCOBbEj4+uuv1woaIgKfdtpptWqT\nNkYRUASyIaDEJRtumstBgIH1rLPOahAWSIY7feEkzXXIgL1gwQI7nfTBBx9YkrRixYpcZfqWWYgK\n37Q3KVlx2wFxeeWVV9xTlT4GC6Ya2xG2SjdSlVcEFIHECOhUUWKoNGEUArfffru5//77zbx586x1\nJSpNWecY0KZOnWq+8IUvmEWLFlV2aoR2iBRB+LDU4EeExasOwj32P//zP+buu++uQ3O0DYqAIpAT\nASUuOQHsq9mZphkzZoxt/qOPPtq1t2H0wI8Gh9TFixebsMOvj/2DzrISCP2KICvhdp500klm4cKF\n5vzzzw9fqtxvpgbXrVtnPvWpT1WifysHsCqsCFQMAZ0qqliH+aCukJZhw4aZtWvXdo20gAU+MEuX\nLjVMj5x33nkGa4OPgnMtlhU+HMsUUBmkhfZ//etftyu6fMQijU5PP/20JSvca0wVudapNOVoWkVA\nEagPAmpxqU9fdqQlLmnB3
8QnYZCbNm2a2bx5sxdv5mlWAhWNI/3EUmmWl1fZNwTL0Zw5c5pWE0Fe\nyiJ8RfeDlqcIKALFI6DEpXhMa1uiz6RFQO82ecHiI8HS2i1bFp3L+IY0sST997//vbVIlVFH2WXS\nl3Pnzo301VHyUjb6Wr4i4C8CSlz87RvvNJsyZYphNQ+rhnwWnDkJXMc0VidimbjTFywH70SdcfhD\nntCBD/qwNL1qFgpIMivVwtYWt93g7gPerk56rAgoAuUjoMSlfIxrUQMxWh566CGzdevWrg/MSQAl\nYBlbB+D/Uoa4ZMUnUhAezFkuzoqrqvSb9BXkky0h2pFkSBpTYd0mi6K3fisCikD5CChxKR/jytfA\n4MCbLSthqrBqB8B5Y0fnRx55pJCVNZRX9kqgvDcKpCWKREHiTjzxRDN79uy8VXQkP+0YO3asdcRN\n4p+j5KUj3aKVKALeIKDExZuu8FcRBj72iZk+fbq/SkZoxl5JV111lSUcWd7IXbKCo6uvpE30jCIt\nwCKrmO67774mJ9cIyLp+KquuSl663nWqgCLQMQSUuHQM6mpWJA6SVZtqELTTki53JZDPZEXah74Q\nl3akSkicLyuuRH/3m3YQG4ilz1lWrEFeIKhJrDRuvXqsCCgC1UJAiUu1+qvj2kYtR+24EjkqZDAj\nvgvTPK2sLqTxYSVQ2mZCWpAkAzVpf/SjH5mbbrrJm+XibnuFtBx99NG5/JLSYOLWr8eKgCJQHQQ+\nXh1VVdNOI4C1BanyjrxYIthRmh2r3akul6zgC9POYtFp7NvVl9a6QDyXv/3bvzWf/OQnLZHzyfIi\npIU240icRyBxkBc+SQhdnro0ryKgCHQHAa8i5z7xxBOmX79+9sNgUzVx9acdrki7+H711VfdS22P\nTznllAYuS5YsaZu+qAQrV660y1GLKq9b5bBvDzFNcPqUD4MaPiF8WlliuqVvu3ppA/onHZhJL/4v\nkFB8XSBrQkzb1VfmdQiYTA9BporoC8GFslUUAUWgfgh4RVzqB291W8Qb6+rVq83IkSOr2whHc/a5\nYTqoqmRFmiIkJOkATz8SCM8VyMvrr79uowzPnDnT+si41zt1DHFiGo8VRFl8WuL0FGKn5CUOJb2m\nCFQTASUu1ey30rV+4YUXzOTJkwt5Ay5d2TYVQFa+/e1vmw0bNrRJ6e9lplOEtKTRslXIfzChvL17\n99pAbxx3SiBTBDNkewaC47lTeEXqALmDwCh5KRJVLUsR6D4CSlwK7IPLLrvM9PT0ND4FFt3xotiN\nd/To0YXWu2fPHsNU13XXXWf9TtzpMzlmWoxpwjvvvNM899xzDafZvIqcfvrpld10kIGej0z3JMWi\nHdFhUCfAG7trY/VgBVaZBAbyRSBD2kFwQBym07YpadslnZIXQUK/FYEaIRAMtN7I448/3hNAaz+X\nX3651evBBx9snOPaLbfc0rN///6WOm/bts2mkXL4PvLII3so58CBA5H53LTk53PhhRfaesmHJEnj\n6k96V8L5n3322UYdXKM+zkVJsDy0Ub/o46aLajP4oU9WQafgbT1r9qZ8Lp4uDkmO6bs77rijZd81\nVdTmRzBQ9wSDZZtUfl2mD7L0Q9p8wTRaz913390DRuPGjevZuHFjYUCA+Y033tgouxt9QPuC6bHC\n2qQFKQKKQPcQaB5du6eHrdkd+Bl4+UQNbgxmUeSFAS4qvZwj3+7du3u1Uq7zHS5DiEKSNK7+pHfF\nzQ/5cn+7x1KfmzeOuJDezR8+pq60wsASRFpNmy0yfTv9wvq2+g0GUX0eWWmLk8HUV89TTz3V4qp/\np+mHLKSFlmQdpBngH374Ydv/kBgIByQmrR7Uf9tttzWVk7aMMnokKy5l6KJlKgKKQDYEvF0O/dhj\njwVjWLQEA5iNR7Fq1apGAqYgbr755sbvqAPyXXzxxWbXrl2G4GJR0q4M8iRJE1W2nJs3b54
c9vq+\n5pprzAknnGCY2mgnTKWQPk6oa9CgQWbq1KlxyZquMaVzzDHHNJ3L8iOJfknLffPNN63PDWVmlaFD\nhxrugSoIUzas/EnqhOu2qd0UkZs2fEx9kyZNsh98Q8B78eLF1lGbbQO4L7ifBg4cGM5qtmzZYt5/\n/327weW5555r+CxcuLCQLRd6VZbxhPj2lD1FlVE9zaYIKAIJEPDaxyWYPjGBhcT6jATTPIbfIi6x\nYbUIm7KJEHkzmJ6w+QI+ZwKrg1yyA9cDDzzQ+B11QHry8Wk14CdJE1W2nAssEY06AkuNnLbf7K+T\nRNx2uVgxOAfWqkYRYANGSYX8hPjPK3fddVevItCTtvOhj8KfYLrMXqNt9KMrzz//vB1I3XNpjocM\nGWIH1zR5upFWiEcW0hK1iihrG4htg+MsfjD8L3Cfzpo1K5K0UMe1115rl52TlqXN7I10/vnnZ62+\ntHxCXvC5UVEEFIEKIhA8ZLyR8FRLMIA26YavRABx4yM+K+F8pAuLO+3ElJErbpmki5IkacJ6uOW4\n+YNB2b1kj8NTKtI2LkZNFTHl5ZYZxor8tFPSoFtSwdeBTx6hfqmb71bTdO3qCOPCVF5WYZoA/w1f\npQg/DJ0KSd67TMWBuYoioAhUCwFvLS68bR9xxBHBmPf/JfxbrAg7duxoJAoGyMhplq997WuNNFgU\nmA6JEuJKtJMkaeLKuOSSS3pdPvPMM5vOvfvuu02/wz+Y7hLBihHGhqmwK6+8UpKYX/3qV43jdgeY\n/LFO5JEwvsTpGDx4cOoisXi5lhemjOooWVcOuViIpcY9p8etEcCiBO5qeWmNkV5RBHxEwFsfl2OP\nPTYxXu+8804jbZgAyIX+/fvLof0W0tN0MvgRThe+zu8kaaLyybkwyeB82OemlX5SRmDRkEPDFArL\niePkgw8+iLvc61pYn14JUp6I8olIWgT3Ql0JCxgweCJ5th0ocorIKtNH/oA5vjwsDc8yNddHYNJm\nKgJeIeCtxcUrlFSZ3Ajs3LkzUxkQuJdffjlT3ipkkuBoDJx5JFixk3gLgDz11DGvWF6EQNaxjdom\nRaBOCNSCuLCjrEirQc61UJC2aIuC1J/kG4fjsIQtLFFWGTePa/XBETeYoYz9fOc733Gzxx6zaiQ8\n1RObIeJieFUUDsJp91liuuzv//7vm1YC5Z2mi1C1a6eY2oGw5CUtOkWUvwvF2qXkJT+WWoIiUDYC\n3k4VpWk4yzRF8F9hE8PwwBnEppAkBj+YLP4WjQJyHuBDctFFFzWV4hIu9GtHXI4//vhGfvJCfIoi\nY0zrhIleo7IUB6wyYSktQr+wdJsPROszn/mMOfnkk3uVxpQW00L//u//Hjk91GoqsFdBnp8oimx0\nYoqIOl577TXbh9y7iCx75hjiNXz4cA4N/4vcP/z/CRmwFyrwh3bQVj55yWQFmqsqKgKVRaAWxIXY\nLAz2DI7It771LfO9732vQV7Yp8ZdPu06rXaj58KxVdhV2o3HkkQ/iBdOqwzytPsrX/mKWbRoUYOQ\nUSYb6Akmwaoiw5YESaUI4sJeND/84Q8bOkjdbl/IuSTfLJHOQzixImFN6qbgCBqsZiks1D1TRGXE\nJGEKi3uIjTbfe+89S0xYIn/FFVc0SLXUy0CPHgjL27du3WrvRfKNGjXKbh3Bxo5VECEvtL9qxKsK\n+KqOikAhCPi0CMpdThy1LDkYhJuW2PJbJLxsNgCnKa38DghOr/Dxco3vVsuGk6Rx9Se9K27+dsdu\nuygjajk058P1tSqX/GmkyGXDbGMA5q10S3o+sN706rc0bSItkVzzLvNOW6ebnsixLMEtSspY+kxk\nYaImBwO4xStPHbSXKLxBIDpbHthzrgoSWDAL7asqtFl1VASqgkAtfFyCwc8GinMDsnEuLFhlCHBW\n1JRKuPykv4NYJC2TEpit3TSRZMaCQvo4wSqzdu3auCS
9rh1++OH2TbvXhQwnmBJj6TZtBv+0wrQS\nffb9738/d7+xbF6mNNLqkTc9VgmkqLd4yqOfipKnn37anHTSSWbu3Llmzpw51oJCADmxqmSpB+sF\nUXgJRscu0Pv27bM6s9Gi70uQWWGE/uI8naX9mkcRUATKQaAWU0UCDQ6omLOZh3fD6jNg8hCeMGFC\n7sFP6srzze7Hf/mXf2mjjMoyX2Kx3HDDDb18X9rVQ5wT/D7Wr1/ftBUBhOWb3/xmy8i/ceXywMZX\noShzOUTxpptush+ma8SfhwEtLDhaM52DnwQkoyiSyUDJtMfy5cvDVZb+GxxlICyqMqZm8pAK0QPd\nmEp9++23LWEpa0oHXflwjxONd/78+YYI0T5G1hVspM+K+j+QcvVbEVAE8iHQD9NQviI0dx0RwD8G\n8sAgUwfZtGmTgdhGkaUy24cTbtY9h1rpVZRj74oVK+x2GITxv/rqqzsax4T+uOqqq6wPDL5ZPsdQ\nKdovqVW/6nlFQBFIhkBtpoqSNVdTJUUAp0rM+3UQBsmVK1faaYtOtkcIRpGDclFTRBBTplbpY8hp\nkTomwRhLC07KrCIbM2aM11MyYIO1iP5UUQQUge4joBaX7veBtxrgQ4GFoii/jG41lDdm2rB06VIz\nYMCAhhpFTLU0CnMOynxDFzLkVJfqEN0gCgi+T50mLFHKQqLY6b0K91pe/KPar+cUAUUgHQJKXNLh\n1adSEzSOZdHsM1RlYUpk3bp1dpdjtx3hN+gipnSwiAhRcusq4jjvoOkjaRFccA5m+bySF0FEvxUB\nRaAVAkpcWiGj520gLqwuOILisFtVYbXMwoUL2zqC4oTpRjCm7WnaLSuH0uRJimkRZY8fP94GjvPF\n0hJue9XISxFEN4yB/lYEFIH2CChxaY9Rn06BGR+pqtUFawvOn9u3b0/dj5AFSJsIK5xaTZuVsXJI\n6uU7r7UF69mLL77ozfSQ2zb3uCp6ojN9Dkn1YbrNxVCPFYG6I6DEpe49nLN9DN5YHnCkbDVo56yi\ntOxMjfBWjANqEf4slAcOrojTZplv33lJi6zgoZwyrEEuHkUcYxk65JBDrE9SEeWVWYaSlzLR1bIV\ngWgElLhE46JnHQQIGMbg3+mlxI4KmQ6nTJli8+GUW5Zg0XG3ISiCILm65p0iEvLme8wUt81V07ls\na5uLjR4rAoqAMUpc9C5IhAC7Mo8dO7YycV3KtjKI9SVMVLBquJLXEpPX2tIJ8ua2t6hj6T8sXFWY\nislLMIvCTctRBPoCAkpc+kIvF9BG3iohL1V4cy9bVwYpiEuSqTN0yerwm5e0kB+yWZXBP3ybMmVE\nBGeiXldBlLxUoZdUxzogoMSlDr3YoTZUYdUHhII4JcHGfqUMeHkHJ/IncfjNWw+3BAM/W2BUNfox\nGFRtVVsR/dahf2etRhGoLAJKXCrbdd1RXMLE+xhvA9JywQUXmC996UulrIIqw5dBppzc3hSH3/A0\nlJum3XHVrS3SPla19e/fvxQSKnUU/Q15SWqRK7puLU8R6AsIKHHpC71ccBt9jHQqlpbjjz/eXHnl\nlYWsInJhgwjk9Vdxy4s7Djv8ZqmXPqrDXlMy7edaqeKw8+Ua9yMEJsl0oi86qx6KQFUQUOJSlZ7y\nTE+ZNvLB54XBjZ2/R44c2bC0FEk0KCuP9SNN10VNNaT1k2HQJOZM1QMHCm74Vl1//fWmrJ2rpZ6i\nv5W8FI2olqcI/BGBj/1jIAqGIpAWgeOOO84cffTRZurUqebDDz80Z599dtoiCkmPdYJdhm+99VZz\n8803N8rEN2Lv3r3m//7v/zKvSmHgeeuttzpGWlAeR1osLK4cdthh1teDNvFBL9JBcvj87ne/M6QR\neeaZZ8x///d/2xD6cq7q3xs3bjR/8zd/U6lmfOITnzB8uIfoNxVFQBEoBgG1uBSDY58tBWvAtdde\na9s/f/78jg3yDNg
4nb799ttmyZIlLeslHQN9WpN91nx5boSslh0hMlL33XffbX19Jk2aJKcq/U1f\nYPGq2nSRC3rWvnXL0GNFQBH4IwJ/okAoAnkQgBDgqMuyWz74VjDQlCUM0oSF5w2WpbJbt25tSVrQ\ngUixfBg4koron5bsJC0/Kh11Zn0rJ84JA7t8du3aZT71qU/ZkPRRdVXtHP3Hrt5p+tC3NtI3Vdbf\nNzxVn76NgBKXvt3/hbUe6wfTFwgDMIHPCCJWlGDZgRThu8GO1bx9E98jSXAyGdgZOCA+cUI9SKdD\n4xfljwIB2rFjh10K3UniFYdpEdfwX9qzZ08RRXWtDCUvXYNeK64ZAkpcatah3WwOBIHNGAm4NnTo\nUHPjjTcadmaGcEBi2pGGsO4QDawrlIGDJstit2zZYubMmZOJWDBwMLCLRSWqPrHQhK+V+Zt2olsR\nAgEaN25cEUV5VQYrpHbu3OmVTlmUEfKS9n8hS12aRxGoKwLq41LXnvWkXVgwnnjiCWsFWL16tZ3e\nOeaYY+w3RCQsEJP333/f7mRMEDk+Z5xxhjn//PMbSfMO9BAXBg7XIpG3zIZyKQ+ERBVl4cFZmQG+\nqrt5t4KP/sGH6sknn2yVpFLn+b+gz5NYDCvVMFVWEegAAh/vQB1aRR9GAHLghmzngY1FZtu2bZGo\n4OjLdFCcBULeWuPSRBb+0UkGDIgLgyEreJjiylpWXD1JrmEhKbJuptGwTqj4jQD/F0pe/O4j1c5f\nBNTi4m/fqGYxCBRhqaCMV155xVx00UVdefMtw8rDTt5IVcP8t+pyiCaEtqenp1WSSp6HvGB1Kcri\nVkkQVGlFICUC6uOSEjBN7gcCYjXJ6isgxIf9fDjOWk5WNKgz6yqirHVWOV9dp1RkulLuxyr3kequ\nCHQKASUunUJa6ykcAR76spIpTeG85SLylks5DBydHDyKWkWUpt2a1k8E5D7s5P3nJxKqlSKQDAEl\nLslw0lSeIoCPihCRJCoyPcNAIYOF5JE33zRlSd6032VMEaXVoWrpGdTDfVa1NsTpK21T8hKHkl5T\nBP6IgBIXvRMqjQBTCHySPPCFMLSadhBCQ7qyBD11iig9uliohg8fnj5jhXIIeekEea4QLKqqItAL\nAV1V1AsSPVEGAgzYr732mtm/f7+NxUIdsuyZY6LgskxajlkZc/rppzctWbYXI/7wwBdLSsRl67+S\ndOUQpEZWLWXZlTmqfvdc0auI3LI5PvLII8369evDpyv/m5VofUG4l/G3gryIFTBPu/m/+9nPfmZ2\n795t90z64IMPmv7vqE8IIf+D/N8NHjy40JVuefTXvIpAFAK6qigKFT1XCAI8fInhQvyW9957zz4g\nR4wYYYYMGWJXiFCJLAUmLYMTH3nIvvHGGzbfxIkTzahRo5piuUQpKBYV9xoPbgaCLIMAOkFk5E3Y\nLTfLcZR+WcqJy0Mds2bNstswxKWr2rW6rpZq1Q/cs9y7We9b+b8jijIBCfm/g9QeccQRtsrw/x0n\nCVGwb98+w4aW/L/yPzd69Gi763orK2Ur/fW8IlAmAkpcykS3D5bNA5cH39y5c23reWhedtllmR7A\nFAB5ePXVV82iRYvsw5RB+eqrr45cvhx+2PPgR/IQjzzEx1b+0Z8idHHLa3UMBiwbhgBmHfhald3N\n82whwcDLbuR5+rObbUhbN32Z1FJI2U8//bS599577f9MUrLfSifunRdeeMEQ0JD/wW9+85tm8uTJ\nfQb7VrjoeU8QCOIiqCgChSDw1FNP9QSDSk9AVno4Llpef/11WzZ1BDsg9wSDc68qgge9Pc93MC3T\n63qWE9RD3Xkkb/4kdVMHn1NOOaXnX//1X5NkqUwa+lz6VNrZCUx9AKhdO4MXhZ5gmsd+yvi/A/dg\n+w4C6NjvqP87H3BSHfoOAgR0UlEEciHAgy0IzW8fnO0esrkq+igzdUCOeFjz0A7Lw
w8/HElqwunS\n/qbeLA/tsjChXPcj7bntttt6+NRFuL/o6yhx218UUY2qp9vnou4h2sv/AaSuDMISbjP1BVYXWx//\nYyqKQLcQUOLSLeRrUi8PMN7EsIB0WsTCwyAthIIHPMcMdmUI5aYZIEmbJn2cztTtDtSt0sobeKvr\nVTtP//LG307AOQk+7crx9bpLXqLu/U7pjR4QSUiM/N91qm6tRxEAAfVx8WTKrmpqMP+OHwv+LI8/\n/nhmH5a87WYu/qtf/ar5wx/+YIIBzpx99tm2yDJ9Siib9idxnMzjkEs9wWDcgCjNKieWXK9Zs6bh\n/NwopIIH7A6+ZMmS1G0BexHwCCwT8rOy37TpBz/4gXV472b/cv/PmDHD4EDfzf//ynakKp4LASUu\nueDrm5l5aI0ZM8Y2fu3atZGOsp1EhgH+1ltvNc8995xdTSOEIg9paKc/GAQWkNjBNG39UqbUnWew\nvf322w0bLlZ9l2gcTiHI27dvF1gyfbskEOdluUcyFdbFTDNnzjQvvfSSeeSRR8yxxx7bRU3+WDWr\nvdi1e/PmzZXFtOsgqgKpEVDikhqyvp1BSMuwYcO8GRTRieWaPNRXrVrV9BBNSx7S9i7lR1lCGCiR\ndm/55BcpckClfogPFpt2Okj9Pn6zl9QVV1xhLr300sLUCxPEqP4rrLICC+L+fvPNNw0vC7TBl36F\nXE6bNq3p/67AZmtRikAvBJS49IJET7RCwEfSEtZVyAvWEMgMOjOIl/mGHRXvpRVhcokKuks8jXA7\nivgNFkhVrS5gNXbsWGvZKjOOCP0X+GpYrIokj7bAgv64pKVMLLKqq+QlK3KaLwsCSlyyoNZH8/j+\n8JRuET0xXyMMTLydlvnAhxxBkiBILmlxB0V0KZOoUL4r6ER9VTXj49syZ86cQq0tLj5Rx/QhpFfE\nB2sM0zEPPfSQ2bp1a6n3sLQ56zcxX4i35LueWdun+fxBQImLP33htSY8lG655ZbS336LAuG8884z\nwRJtM3v2bFukSyaKqiNcDoMeDpN/9md/ZgYMGGAvd3vgY9BDJyFxYZ19/e2L3i7x7IY1RqxOVSGf\nBApkW4Enn3zS11tL9aoBAkpcatCJZTdB3ty7uYohbRujdC6DvITf0AmVDmnpNmFx8YLEMeUyffp0\n97S3x5AF8MPyUeYUX1oAwn1ddh9TH3XMmzfPTJo0Ka26XUmPzmeddZZdcVQVnbsClFaaCwElLrng\n6xuZcZAM4jY0rBdVaXXYdA2ZQfI6NUKARNy3cJcYdWJ6SnRo940ukBfM+Gy/4LNUaeBzrTF5VoC1\n6g9WhrHXUNWsF2Il4jvv/1orbPR830ZAiUvf7v+2rZeHkDi7ts3gWQJIFxvMibXBJRdJVXUHKPJE\n+alEkSLy4Vfjw8P7vvvuM//0T/9k/u3f/s0rK4bbB5AWltn7tGLN1S/umP53Y+5E3SNx+cPXKK/K\nq8Kq7hge7g/97RkCGodPEYhDgAiZnQgnHqdDnmtE+QyIQ1OETzcCaVTZAUlrisCaJDpoqzKJ5kp5\n3RR0ow1EOQaLbuvTCgui47J1RBK8W5Xhy3kwlw/3QFoBi25Eo06rZ6v0tDkY6nJHjQ52rLblUNaD\nDz7YqjpvzwckPJP+QVC/Rj7a3mkJYkD1iO7XXnttp6tvW1/nEWmrkv8JpEOj/pnirvnfsmYN6xI6\nnv1c3EGAgdEdvHnIyiAjg3wzEvG/yBMn1NcuTVz+rNei6mVA9I28oCfh4+tCWsL9Fb6/wtfDv2XQ\nB5cqC/canzwiz1O+qyiif9RYwTn5QFRc6TZxQRdXh2effdZVr+vHfxIAp1JTBPr162fk88QTT6Ru\nJcHcCOtddZk1a5ZdTuq2Y+fOnXbahKkjBNO+fNIsmxaTvlt2+JjyKJu6mA7phKAXn/CUBTFdcPbE\n52XTpk2dUCW2Dpkeeuedd2xgtTTYxxbs0UWmC
uXekvuAe4EPfRSWu+66ywQDvtdLn8M6R/2+4YYb\nzMKFCzPf82zzQMA9hP9hlc4igD9cQLxspawo9Uq6Tp0qqEAci4671ummBjdaS0bfTpe6vPVJO//q\nr/6q53vf+561fIi1pQgrSNoyqBtsy5Qkdcgmfa4lqkydosoGO6w/ed/Ko8quyjnXGiP3pW8WsTxY\nYu3MMtXMVMWRRx5pn19811HyPJ87hQfTc6KnT1N1anHxikb6o8wLL7xgAvN95d/6QJQ3W94eeKvn\njVeW2Mrbb1bUKZcy0ojUjeNuGYJOvOHziRNC6BMbhCXuWF/K0idKB6wsrJhhiTZOw1WN7BvVtrTn\n6CfuIT4cf//73zfuSrW05fmWnu0aVq5cmVqtYJrC7N+/3+a7/vrrm/JjiRFL8re//W0b9fjOO+9s\nnOMaaYKptqZ87o9XX33VkFfK4XvgwIFt82G5njhxYlM+freyaJ9yyimNtOiESH5XnwkTJth0Ug7f\nrm6SNtzOPXv2yKXGt5vmoosuapznIKrdcfqPGjWqkf/+++9vHHf9oJPMzbVGBA23zlbBzdmkQmCS\najA8mHb4ulsGx2GBqbsskXpa1eXmJY9bdlweN12YhcZdoz6czdw2Us/ll19u5xNdfeTYLQ8syH/h\nhRc2YRTWgfKk3eHv8Fyq1BP+xucgy5tSuBxffvM2i6NxWHjjzWIByZpP6o/yP5FrWb7zlIfVJRg0\nreUjCxZJ9UVHqYv7q8y6kurkW7pgh/MePnUR+pxnEN9pxH3u8cxzxX2+8yx1n4fu844ywuMH5dxx\nxx0tn4/kZ9zZvXu3W6U9Dj+33bo45rkbFrcd8pxO8nx2/UsoWwS93HqlTLnOt1iqSOded3Fzy5Bj\n2hclLr6++Lr8f0SiNC7gHDeO23ABSb7DNwnpXeBdMMOdGb6h6VQ3r9ThfofzpNUPSKJuRoEq7lqW\nGydcntsW99j9p0nyjyH6tvqm7LoNLAzOUW2C1KR9sKadImqFM+WkrTtcFm2SaYbwtaS/KYMpG/qd\n77zlufVSthAWpg6Kws6toy7HOCjXDR/ahKN/UgkPzuF87Z6jrZ6LlJM0L+MIL8Ei4bHHrcM9pnxX\nws9vriV5Pofra1VmeMVPGDt+IxAOV89Wx2H9yesSvXB9XO+GlE5c4kiLgBe+ScI3l7Bm9yYIA+jO\niUq5Ud+U4UoW/Vw9wh3d6lrWG8ctL6o97jlhw0n+MVwMwsetrBPhdGX+dttQVD1x8+1pBos0aZPo\nDt5RhKrsvFHlowdv/JA8LFQcZ2kvbWL5NZYV7lG+s5QTpWOdz4FVN6WM/zvuoTS+VO7zn+dzWNzr\n4EUaGSP4dtvAdXlZDY8RPFvlGnWELSoM2CJumdQnpCb84hvW131+h8cKdJMPRMWVOOLiEgnGTldc\nbFxdXD04L4QmjFd4LKZsV5dwfW7dnTwu9b/EbTAd5HZc3DUAcIHmhnLTA57cqAKW22HhutyO5poM\n8G6Z4Txx11zd3DaF9XavuXnS3DhuvrCOYTLk/qOhC+nlQ3uSCm9HDPLdFPdBIQ+JvPrw8Gz1AMXq\nkcTKwMCelWTE6U+ZSep3yyB9XmuNW174mPuAQQcCw33EmzMERHAMf5OW+wbSw4e0TDeWqWNY5yr/\nhtiBcTeljP877gHuhaTCS6k8t8IvqJQRftaHxwKeF5Kfb3kuhp/pLmkR3dz2u4O0+xx2n+vkCz+H\npSy+4/K5Ooafz2Fd3TJbWVVI42IneobTR+FFW0WfsC7gJNf4Dud3devUcanOuT/60Y+Cdv5RAvJh\npk6dKj+ts2QAbON32PEnWAHSuMbyzfnz5zd+s3HeEUcc0fjNgRsWO1zXTTfd1FjWRdq33nqLL5NH\nP1tAwj84UMmyPrIsW7bMDB482OamHQ888IAJbhz7O7gpTPCPYI/Df8LtwvEquFEbydjcrAgJbnQb\nbbaIsoooI
8oBLUu5YLxly5bIrLIMt91y5YBgtHV8jaygzclgoLfl4lybRMQJV/ROkidtmvPPP99u\n87B9+3br6Mj/IPvQtJL+/fvbZavoBk5Lly61OzuXqWMrXap4PiB45tBDD/VG9aL+73jGpXk2vfba\naw0MjjrqqMZx1EHwEthrLMC52X0u/vKXv7RZ2T5BhGfB6aefLj8b31/72tcaxzyLBYPTTjutcf6a\na64xOMCKAzDP4WDAbnwaCUs6YOwICFGj9Jdffrlx/MMf/rBxfOaZZ9rjXbt2Nc61wuvKK69spPnV\nr37VOOYgPNYyPnRbPl6mAu4NSNj1sLgeywzs/ONy0yHcVAzUkBZEBn46zCVA9mLw5/nnn5dDu69O\n48dHBz/5yU/Cp0we/XoVFnMi6Y0jbQ3fOFJ08OYrh43vk08+uXFc1wNirkQ9ZNK2N/wPGM7Pih8G\n3VYrheKuhcvK8psBnrqpp9UGfhCrwNLSUscs9SbJI7q1wiZJGZomHgHfXhiK+r9j64LVq1fHN965\nykalIocccogcRn5/4QtfiDzvEp5f//rXNg2rCkVkUJff8g35doUxCZk2bZpZvHhx49LNN99sjyEx\nSGCpMZCe8Coee7GEP9QnY+JPf/pTWwMkC7KFQFDk5TiwQNlz/GGcZLVSnLQjmW55ceWUea1Ui4sA\nSwMuvvjipuVdgCdWBmmg3CTyG9YcTuNaYiTdu+++K4f2m2VtSSSvfknqII3b0XLjuEvdOBbSQvpW\nN07SdlFGHsEqEcY9T3l587J0Vt588pbVLr8Qh3C6JIHmwnmy/kYHCSDnliHnlDy4qOhxWQgU9X+H\nNTGNyOCbJk/ZaSEBEEtepqPkscces2MclphOiGv5FCvL+vXrG1WzR1udpVTikhc4iEz4JuYtQKV8\nBNpZJ5Jo4MZbCBO1dr95EIhwD0B8eSh0gsDwhhiOaFrWFJG0MfwdjvcicVbkfDi9/lYEBIGq/t+J\n/u40iJxr9S3WlPB1mR7i/NFHH20vyzc/3OkVe/GjP+5LJqdkBoBjyMt3vvOdxpQQrg583Jc8LDGd\neEZhgZZ6eT5SZ+CThppWXIuSa0XCUuNOa0Ud00bfpVTi4t6AgYNPW8DCgyWMPyycC1tY3JuL9Pv2\n7Qtni/ydV7/IQiNO1vHGiWhmqaf45+ShEHVPFF0xb4hMyYi/S9lTRK30F7+XFStWWP+XtG+urcrV\n84pAUgQ6+X8nOh122GFy2NL6LAmYvgmPB/x2p3UGDRpkk7tT7bSLYGxhCVbCNU5BDCArlOe+aAkx\nwWWBD64AQiLI7LoGNAor4cANzEeQP3GXcKeJqPb4449v1A5hC89sNC4mPOiU5T9OnVKJi+vQlNZS\nQuRAeevmpqAzEG4496bkHMTFvXGifEQAW24+iWCYRz/qTSpF3zhJ682ajnll+efMWkbV82HZwJek\nk1NEYczEn2XSpElWFyFS4XT6WxGoEwKf/exnG81xLSeNk6GDYMVSg7xAMvjtilgfsNq648S3vvWt\nJvJCJF0Zc8gvDqu8ULv5wi/PvJQzLom4L6pyrt132NLTLj3X3eki19UgPE0E+ZKXdPT8yle+0vR8\nZ6x1x0eJ3is6hIkh5XVbSiUu55xzTqN9ODEJYeAkYFx33XUNMkFoZBEY4cyZM+WnXdkwd+7cxm86\nKcyW5SYjEW/mzz33XCM9UwzujSU3clb9GgUnPMh74ySsJjZZmn+MoUOHNvnlxBZc44s4yL7yyiul\nrCJqB1vYn6WV30u7coq4DmHC6nTPPfdYixcPxqgP/7OkYfPG8FRbEXr0hTLS/J9WBQ+mOdNYC90F\nB//xH//RtplYGiAWvJjyLZYHMuInKQMtL7isSBXBx3H48OGNMcgd/CnH3djRtW5AbqQ+6oQQiXCe\nMtMK4yNlhUlDXDnudJGbTsY395zbFvAZMmRIo91sNyDjIwSH7VFccY0OGBD
CMxxu2k4dl7qqCABY\nQilOsHSOeGGHG+gCSx4BkhsBYAGL+TlhxLBld6UQN6h747k3k1uXeyNn1c8tL+kx7aMdiNw4UXmj\nbpyodGnPCfbBGv1eN2ZUWUU8QFk1xttIkZLnnwYr0qCPzMZJdMLicsYZZ9hBOM2DN0nZcWl40LOK\nJ+zPwm8hNGXrQz3sV8U01YsvvmiC+CL2rY03M/d/1W0H+HLfYBFlFQmm+SCui32wq0Oxi1T0MQOe\nG/YhOlX7s7793/EimmYwd1eb8qwkf6v/e8aE999/v4msCEIMsv/8z/8sP+03Uzt79+5tGiuaEgQ/\nGHMISeHW+Y1vfMP6kLikKJyP3xAPN19UGjmHfu3Kk7StviFUssKJNJQpRM3Nw1jH/2ar8Ze0jD1r\n1651s9ljd7HIyJEje13vyomyA8YEBCQ25H/Q6KbAdIHndlOwGwmig55x17geDtpD2e4n6NRGxEPS\nI2n1I0/QwY1yXf3aXSOtq0/4mHLRxxW3LgIBhcUtM/B4b7pMe8N1hIMLNWX46AeBsLodgC5Kr7zn\n0kTwJCAcH6STEV+pK3hQxzYVvQg+V4ZI8EHuG0L/87udPq30oC0SwI4gdkTSzVpWqzrqdJ4+Bae6\nCf2edgdw99lFgDdX3GdeQFzsM51nn/usCz+X3fwcU2Y4T0BY7FgUDPDh5I3flOvqJnVynvEpLO7z\nO6wT6YMX6Sa95fkcHsvC5cpv2iE68B2uQ9LJN3WGA7KiY1w+t71RY5CU3clvHGY7IgDjAgDI3Dhh\nINw0ABoW92bjRgsP9HSMm4Z62nUMdSTVj7RxN2PcNfKmvXHc8sJYUR56y41Lu12J+8dw04WPGRiD\nN/rw6cr/howxECeRMFkJ/05SRpo0DOhp6kibvp0u1M2gWRbBEELEfdUqenE7HfvCdf6X60buIC2Q\nlzTiDtzh55r7zIO4qJSHAGOIjC+MRb5Ix4iLLw1WPZIhwABW1lt9Mg2KT5V0UIgiEK4FpmjN8lhQ\n0DXPQEfdhGOHUKQdXLLggL4QSO6vKJyzlFmnPGnIdVXanaWvsXrwYsr/LN+uFUSJS+d63rXOiDWo\nc7W3rqlU59zgplOpKALMZYYdoCvaFKs2DqP4abQLP49vB3FcwoJPCU6qRa/syRufJY/TLpiQn1Vk\n+POweqlsoT6255gxY4YZO3ZsR5a3l92mIssnwviGDRuKLLKrZfH/RCTctD5O+ImII21gVTfBoNnV\ndvTFyoMXInPvvffapgfWlkS+kZ3CSYlLp5CuWD14puOYWQdhgMbpDOLSTgILRMsVELJEul0ZSa/L\nagtIUR4RJ14hQUnKYknnVVddZR555BGzYMGCtoQuSZlp0kCSWKmE4+95551XOCFMo4svaSHF3Av0\nSdEEuVttxMF74sSJmarHkTZwHbB5w3vZZSpQM6VCALIIaUTchS+pCikpsRKXkoCterFYXBgIeWOq\nupx66qn2je24446zgyUDZpQkCTTHEuk0BCGqHs5RF4NUOwtQq/zh85TFp1Xb3PQsW4YwbN682bCR\nYrcEfdGBtzliUhSBa7fakrVe2kyf8eF/jWXm4BJMo2Ut0qt8ixYtMu4qobTKkR9hZaobTiNtOZo+\nHQJYWyTYZzBd1LE9mJJq2Y9ZpKSJNV3fQkBi6fBGXmV5+umnrcmTQVLEHeAxSwuBYNBoJ0LmkqQN\nl8WbNNMyaU3n4XLiftO2Vps00qcMAligpM1xZXXqGnqtWrXKEhmxIHWq7k7Ww72DVU8kqp+wdK5b\nt65px3tJX6Vv7kOmA932Vkl/1dVfBJS4+Ns3XdeMhywDLAOtT4NcWmBOOukkM2fOHHPppZdGZoVM\nPPXUU434B/i4tCMlPJTTkg/wpK5ODMy8ydNnbjt8JS3SKUJeqn6/SXvk2yXJSe4t7hEIDfnc/pPy\nqvKN9QifnenTp1dFZdWzIggocalIR3V
LTQYTgo5V9eGDtYWoy9u3b28JYZiEJHkrprBwvpYVBBei\niERc+iKuuUSJiLYPPfSQ2bp1q9ck1HdylaRf6GtM7SJpCS756C92aceRuYrC/wbWlrqR0Cr2RR11\nVuJSx14tsE0MfrwlxjmtFlhdoUXx5orvxMKFC1v6ctA+JO7NttVARPnkb2dB4SEeNSVQaGNbFIaO\nWJP+4R/+wfq1tNO1RTEdPY2zLo7Usqqko5VnqAyMGaBFiuhryqScNWvWpLbsiR7d/FZrSzfRr3/d\nSlzq38e5W4iT1o4dOyr39pdE7zRWEwGSPCL/+7//a1iBFTWVJgNaljduKT/vtwyA9913X8upsrx1\nFJ0fMghmrK7ppvNwXLvcewAfqTIIIb4uOKf6biUL4yR6x1k5w3n0tyKQBgElLmnQ6qNpGfywXBB7\noxOxPoqAmYEFUzXfrawpXMtLKsAmyj+GwZdrZQxoafBh6oW9RpYuXZomW9fTyhSfL4M2/ek6mea9\nb5ICjOUiCOBWGesT1kksZlW1FCXtF03XXQSUuHQX/8rUzgMJ0zUm8W4Pxu1AS2JlYCBCWpGadnW4\n16mP8sCFb3aUPuigg8yAAQO6NkWEfjKIxJE3tx2+HXd7ugHcRJI41UraIr+5nyBJVbCY8X8wZswY\n+8JQVZ+4IvtOyyoPASUu5WFbu5J5C542bZrXS1bl4UlskLhl3EVYW9wOFiLEW7nr49DKP8bNW9Zx\ntwf+vO2ijzrp4NnNvorDigCKBAtkiTT3ta8yZcoUa92rqkOxr7iqXr0RUOLSGxM9E4OAz6s+hLQc\neuihsf44RZMW4KJupozaTaW5b/Fl+Uagj1hbqr6qo0zyRZ8V7VQL9mXIv/zLv9ipWlYa+Wjx9Pm5\nUEZ/aJndRUCJS3fxr2Tt8pBavHixNw9RIS0AGhdcTSwjRUwRSedRJvUzoKQhReGBs8jpCPqof//+\nlfGNECzD32J1cf1LwmnS/O4UcUyjU7u0cn/t2bPHS4unPA/i/u/atVGvKwJpEFDikgYtTdtAgIeV\nL5FOebB/9atfNUcffbRdhRG1wkcUT0MsJE/cN5YNN9AbZAR9srwVk88doN0ppzgdwtfQgby0tUiC\nFq6nU78JIBi3pD1OjzCmnXKqjdMpzTX0R6QfZbrWB58X7jN8WhAlLRYG/dMhBD72j4F0qC6tpkYI\nsPkZRIHVRkcddZRhcOmGPPHEE9YP4rLLLrOD2yc+8YmWapRBWhhQDjvssEad1M8Sab7jdGlkcA4g\nQFhd5LN3717zy1/+0hKh3/3ud031ONl6Hb700ktG3s57XazgiU9+8pPm5Zdfbmy4F9cEBtO33nrL\nYsagz3QcJE4wjcvr27UwaUE/9tvif40NCD/88ENz9tlnd0Vt/peIRE0oADZAjHtZ6IqCWmmtEVCL\nS627t/zGYXGYMGGCOeaYY2y0T3kzLLtmBiiWZ2/YsMFaWVgyGmfliBoE8ujIgzvOIlI0SaK9rj9G\n3LQS1rAqRzsO9wt9h6XEtUa5aVyn2jL9htw6yz5ud79yHSsjMn/+/NzL+pO2h/vwu9/9riUr7Bjc\nzqcrabmaThFIhQCbLKooAnkQCMKb99x9991s1tlz44039gQDTJ7iYvNKXQFB6pk8eXIPvxG+g4G9\nZd5gt92W19JcoJ6kZSVNl6Z+SQvGlC8fwYHrAYmLxULKqNK32ybpg6i2V6lNrXSlb5P+D/F/x/9C\n2f936Bo4n9u6xo0bl1i/Vm3U84pAHgTU4pKK5mniOAR4C7zrrrvslE3wILXm7DgrSFxZ4WuUzTJL\n3i6HDx9uZs2a1estU6wSYT+Goqwf6EAdSdtEeqQTViixOnzsYx8zp5xyigkeCmEIK/2bN/s///M/\nN6wyqotVJapDstwz3JPsx4UfEP93WEDD/wNRdSU5R9nLly+3+1yRPquvUZK6NI0ikBSBP0maUNMp\nAu0
Q0WuKgCKgCCgCioAi4BUCSly86g5VRhFQBBQBRUARUATiEFDiEoeOXlME\nFAFFQBFQBBQBrxBQ4uJVd6gyioAioAgoAoqAIhCHgBKXOHT0miKgCCgCioAioAh4hYASF6+6Q5VR\nBBQBRUARUAQUgTgElLjEoaPXFAFFQBFQBBQBRcArBP4fntNQJrCufL0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 27, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review = \"The movie was excellent\"\n", + "\n", + "Image(filename='sentiment_network_pos.png')" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 3).ipynb b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 3).ipynb new file mode 100644 index 0000000..698ca65 --- /dev/null +++ b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 3).ipynb @@ -0,0 +1,4768 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sentiment Classification & How To \"Frame Problems\" for a Neural Network\n", + "\n", + "by Andrew Trask\n", + "\n", + "- **Twitter**: @iamtrask\n", + "- **Blog**: http://iamtrask.github.io" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### What You Should Already Know\n", + "\n", + "- neural networks, forward and back-propagation\n", + "- stochastic gradient descent\n", + "- mean squared error\n", + "- and train/test splits\n", + "\n", + "### Where to Get Help if You Need it\n", + "- Re-watch previous Udacity Lectures\n", + "- Leverage the 
recommended Course Reading Material - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) (40% Off: **traskud17**)\n", + "- Shoot me a tweet @iamtrask\n", + "\n", + "\n", + "### Tutorial Outline:\n", + "\n", + "- Intro: The Importance of \"Framing a Problem\"\n", + "\n", + "\n", + "- Curate a Dataset\n", + "- Developing a \"Predictive Theory\"\n", + "- **PROJECT 1**: Quick Theory Validation\n", + "\n", + "\n", + "- Transforming Text to Numbers\n", + "- **PROJECT 2**: Creating the Input/Output Data\n", + "\n", + "\n", + "- Putting it all together in a Neural Network\n", + "- **PROJECT 3**: Building our Neural Network\n", + "\n", + "\n", + "- Understanding Neural Noise\n", + "- **PROJECT 4**: Making Learning Faster by Reducing Noise\n", + "\n", + "\n", + "- Analyzing Inefficiencies in our Network\n", + "- **PROJECT 5**: Making our Network Train and Run Faster\n", + "\n", + "\n", + "- Further Noise Reduction\n", + "- **PROJECT 6**: Reducing Noise by Strategically Reducing the Vocabulary\n", + "\n", + "\n", + "- Analysis: What's going on in the weights?" 
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbpresent": { + "id": "56bb3cba-260c-4ebe-9ed6-b995b4c72aa3" + } + }, + "source": [ + "# Lesson: Curate a Dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "eba2b193-0419-431e-8db9-60f34dd3fe83" + } + }, + "outputs": [], + "source": [ + "def pretty_print_review_and_label(i):\n", + " print(labels[i] + \"\\t:\\t\" + reviews[i][:80] + \"...\")\n", + "\n", + "g = open('reviews.txt','r') # What we know!\n", + "reviews = list(map(lambda x:x[:-1],g.readlines()))\n", + "g.close()\n", + "\n", + "g = open('labels.txt','r') # What we WANT to know!\n", + "labels = list(map(lambda x:x[:-1].upper(),g.readlines()))\n", + "g.close()" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "25000" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "len(reviews)" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "bb95574b-21a0-4213-ae50-34363cf4f87f" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy . it ran at the same time as some other programs about school life such as teachers . my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers . the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students . when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled . . . . . . . . . at . . . . . . . . . . high . a classic line inspector i m here to sack one of your teachers . student welcome to bromwell high . 
i expect that many adults of my age think that bromwell high is far fetched . what a pity that it isn t '" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "reviews[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e0408810-c424-4ed4-afb9-1735e9ddbd0a" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Lesson: Develop a Predictive Theory" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e67a709f-234f-4493-bae6-4fb192141ee0" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "labels.txt \t : \t reviews.txt\n", + "\n", + "NEGATIVE\t:\tthis movie is terrible but it has some good effects . ...\n", + "POSITIVE\t:\tadrian pasdar is excellent is this film . he makes a fascinating woman . ...\n", + "NEGATIVE\t:\tcomment this movie is impossible . is terrible very improbable bad interpretat...\n", + "POSITIVE\t:\texcellent episode movie ala pulp fiction . days suicides . it doesnt get more...\n", + "NEGATIVE\t:\tif you haven t seen this it s terrible . it is pure trash . 
i saw this about ...\n", + "POSITIVE\t:\tthis schiffer guy is a real genius the movie is of excellent quality and both e...\n" + ] + } + ], + "source": [ + "print(\"labels.txt \\t : \\t reviews.txt\\n\")\n", + "pretty_print_review_and_label(2137)\n", + "pretty_print_review_and_label(12816)\n", + "pretty_print_review_and_label(6267)\n", + "pretty_print_review_and_label(21934)\n", + "pretty_print_review_and_label(5297)\n", + "pretty_print_review_and_label(4998)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 1: Quick Theory Validation" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from collections import Counter\n", + "import numpy as np" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "positive_counts = Counter()\n", + "negative_counts = Counter()\n", + "total_counts = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for i in range(len(reviews)):\n", + " if(labels[i] == 'POSITIVE'):\n", + " for word in reviews[i].split(\" \"):\n", + " positive_counts[word] += 1\n", + " total_counts[word] += 1\n", + " else:\n", + " for word in reviews[i].split(\" \"):\n", + " negative_counts[word] += 1\n", + " total_counts[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('', 550468),\n", + " ('the', 173324),\n", + " ('.', 159654),\n", + " ('and', 89722),\n", + " ('a', 83688),\n", + " ('of', 76855),\n", + " ('to', 66746),\n", + " ('is', 57245),\n", + " ('in', 50215),\n", + " ('br', 49235),\n", + " ('it', 48025),\n", + " ('i', 40743),\n", + " ('that', 35630),\n", + " ('this', 35080),\n", + " ('s', 33815),\n", + " ('as', 26308),\n", + " ('with', 23247),\n", + " 
('for', 22416),\n", + " ('was', 21917),\n", + " ('film', 20937),\n", + " ('but', 20822),\n", + " ('movie', 19074),\n", + " ('his', 17227),\n", + " ('on', 17008),\n", + " ('you', 16681),\n", + " ('he', 16282),\n", + " ('are', 14807),\n", + " ('not', 14272),\n", + " ('t', 13720),\n", + " ('one', 13655),\n", + " ('have', 12587),\n", + " ('be', 12416),\n", + " ('by', 11997),\n", + " ('all', 11942),\n", + " ('who', 11464),\n", + " ('an', 11294),\n", + " ('at', 11234),\n", + " ('from', 10767),\n", + " ('her', 10474),\n", + " ('they', 9895),\n", + " ('has', 9186),\n", + " ('so', 9154),\n", + " ('like', 9038),\n", + " ('about', 8313),\n", + " ('very', 8305),\n", + " ('out', 8134),\n", + " ('there', 8057),\n", + " ('she', 7779),\n", + " ('what', 7737),\n", + " ('or', 7732),\n", + " ('good', 7720),\n", + " ('more', 7521),\n", + " ('when', 7456),\n", + " ('some', 7441),\n", + " ('if', 7285),\n", + " ('just', 7152),\n", + " ('can', 7001),\n", + " ('story', 6780),\n", + " ('time', 6515),\n", + " ('my', 6488),\n", + " ('great', 6419),\n", + " ('well', 6405),\n", + " ('up', 6321),\n", + " ('which', 6267),\n", + " ('their', 6107),\n", + " ('see', 6026),\n", + " ('also', 5550),\n", + " ('we', 5531),\n", + " ('really', 5476),\n", + " ('would', 5400),\n", + " ('will', 5218),\n", + " ('me', 5167),\n", + " ('had', 5148),\n", + " ('only', 5137),\n", + " ('him', 5018),\n", + " ('even', 4964),\n", + " ('most', 4864),\n", + " ('other', 4858),\n", + " ('were', 4782),\n", + " ('first', 4755),\n", + " ('than', 4736),\n", + " ('much', 4685),\n", + " ('its', 4622),\n", + " ('no', 4574),\n", + " ('into', 4544),\n", + " ('people', 4479),\n", + " ('best', 4319),\n", + " ('love', 4301),\n", + " ('get', 4272),\n", + " ('how', 4213),\n", + " ('life', 4199),\n", + " ('been', 4189),\n", + " ('because', 4079),\n", + " ('way', 4036),\n", + " ('do', 3941),\n", + " ('made', 3823),\n", + " ('films', 3813),\n", + " ('them', 3805),\n", + " ('after', 3800),\n", + " ('many', 3766),\n", + " ('two', 3733),\n", + 
" ('too', 3659),\n", + " ('think', 3655),\n", + " ('movies', 3586),\n", + " ('characters', 3560),\n", + " ('character', 3514),\n", + " ('don', 3468),\n", + " ('man', 3460),\n", + " ('show', 3432),\n", + " ('watch', 3424),\n", + " ('seen', 3414),\n", + " ('then', 3358),\n", + " ('little', 3341),\n", + " ('still', 3340),\n", + " ('make', 3303),\n", + " ('could', 3237),\n", + " ('never', 3226),\n", + " ('being', 3217),\n", + " ('where', 3173),\n", + " ('does', 3069),\n", + " ('over', 3017),\n", + " ('any', 3002),\n", + " ('while', 2899),\n", + " ('know', 2833),\n", + " ('did', 2790),\n", + " ('years', 2758),\n", + " ('here', 2740),\n", + " ('ever', 2734),\n", + " ('end', 2696),\n", + " ('these', 2694),\n", + " ('such', 2590),\n", + " ('real', 2568),\n", + " ('scene', 2567),\n", + " ('back', 2547),\n", + " ('those', 2485),\n", + " ('though', 2475),\n", + " ('off', 2463),\n", + " ('new', 2458),\n", + " ('your', 2453),\n", + " ('go', 2440),\n", + " ('acting', 2437),\n", + " ('plot', 2432),\n", + " ('world', 2429),\n", + " ('scenes', 2427),\n", + " ('say', 2414),\n", + " ('through', 2409),\n", + " ('makes', 2390),\n", + " ('better', 2381),\n", + " ('now', 2368),\n", + " ('work', 2346),\n", + " ('young', 2343),\n", + " ('old', 2311),\n", + " ('ve', 2307),\n", + " ('find', 2272),\n", + " ('both', 2248),\n", + " ('before', 2177),\n", + " ('us', 2162),\n", + " ('again', 2158),\n", + " ('series', 2153),\n", + " ('quite', 2143),\n", + " ('something', 2135),\n", + " ('cast', 2133),\n", + " ('should', 2121),\n", + " ('part', 2098),\n", + " ('always', 2088),\n", + " ('lot', 2087),\n", + " ('another', 2075),\n", + " ('actors', 2047),\n", + " ('director', 2040),\n", + " ('family', 2032),\n", + " ('between', 2016),\n", + " ('own', 2016),\n", + " ('m', 1998),\n", + " ('may', 1997),\n", + " ('same', 1972),\n", + " ('role', 1967),\n", + " ('watching', 1966),\n", + " ('every', 1954),\n", + " ('funny', 1953),\n", + " ('doesn', 1935),\n", + " ('performance', 1928),\n", + " ('few', 
1918),\n", + " ('bad', 1907),\n", + " ('look', 1900),\n", + " ('re', 1884),\n", + " ('why', 1855),\n", + " ('things', 1849),\n", + " ('times', 1832),\n", + " ('big', 1815),\n", + " ('however', 1795),\n", + " ('actually', 1790),\n", + " ('action', 1789),\n", + " ('going', 1783),\n", + " ('bit', 1757),\n", + " ('comedy', 1742),\n", + " ('down', 1740),\n", + " ('music', 1738),\n", + " ('must', 1728),\n", + " ('take', 1709),\n", + " ('saw', 1692),\n", + " ('long', 1690),\n", + " ('right', 1688),\n", + " ('fun', 1686),\n", + " ('fact', 1684),\n", + " ('excellent', 1683),\n", + " ('around', 1674),\n", + " ('didn', 1672),\n", + " ('without', 1671),\n", + " ('thing', 1662),\n", + " ('thought', 1639),\n", + " ('got', 1635),\n", + " ('each', 1630),\n", + " ('day', 1614),\n", + " ('feel', 1597),\n", + " ('seems', 1596),\n", + " ('come', 1594),\n", + " ('done', 1586),\n", + " ('beautiful', 1580),\n", + " ('especially', 1572),\n", + " ('played', 1571),\n", + " ('almost', 1566),\n", + " ('want', 1562),\n", + " ('yet', 1556),\n", + " ('give', 1553),\n", + " ('pretty', 1549),\n", + " ('last', 1543),\n", + " ('since', 1519),\n", + " ('different', 1504),\n", + " ('although', 1501),\n", + " ('gets', 1490),\n", + " ('true', 1487),\n", + " ('interesting', 1481),\n", + " ('job', 1470),\n", + " ('enough', 1455),\n", + " ('our', 1454),\n", + " ('shows', 1447),\n", + " ('horror', 1441),\n", + " ('woman', 1439),\n", + " ('tv', 1400),\n", + " ('probably', 1398),\n", + " ('father', 1395),\n", + " ('original', 1393),\n", + " ('girl', 1390),\n", + " ('point', 1379),\n", + " ('plays', 1378),\n", + " ('wonderful', 1372),\n", + " ('far', 1358),\n", + " ('course', 1358),\n", + " ('john', 1350),\n", + " ('rather', 1340),\n", + " ('isn', 1328),\n", + " ('ll', 1326),\n", + " ('later', 1324),\n", + " ('dvd', 1324),\n", + " ('war', 1310),\n", + " ('whole', 1310),\n", + " ('d', 1307),\n", + " ('away', 1306),\n", + " ('found', 1306),\n", + " ('screen', 1305),\n", + " ('nothing', 1300),\n", + " ('year', 
1297),\n", + " ('once', 1296),\n", + " ('hard', 1294),\n", + " ('together', 1280),\n", + " ('am', 1277),\n", + " ('set', 1277),\n", + " ('having', 1266),\n", + " ('making', 1265),\n", + " ('place', 1263),\n", + " ('comes', 1260),\n", + " ('might', 1260),\n", + " ('sure', 1253),\n", + " ('american', 1248),\n", + " ('play', 1245),\n", + " ('kind', 1244),\n", + " ('takes', 1242),\n", + " ('perfect', 1242),\n", + " ('performances', 1237),\n", + " ('himself', 1230),\n", + " ('worth', 1221),\n", + " ('everyone', 1221),\n", + " ('anyone', 1214),\n", + " ('actor', 1203),\n", + " ('three', 1201),\n", + " ('wife', 1196),\n", + " ('classic', 1192),\n", + " ('goes', 1186),\n", + " ('ending', 1178),\n", + " ('version', 1168),\n", + " ('star', 1149),\n", + " ('enjoy', 1146),\n", + " ('book', 1142),\n", + " ('nice', 1132),\n", + " ('everything', 1128),\n", + " ('during', 1124),\n", + " ('put', 1118),\n", + " ('seeing', 1111),\n", + " ('least', 1102),\n", + " ('house', 1100),\n", + " ('high', 1095),\n", + " ('watched', 1094),\n", + " ('men', 1087),\n", + " ('loved', 1087),\n", + " ('night', 1082),\n", + " ('anything', 1075),\n", + " ('guy', 1071),\n", + " ('believe', 1071),\n", + " ('top', 1063),\n", + " ('amazing', 1058),\n", + " ('hollywood', 1056),\n", + " ('looking', 1053),\n", + " ('main', 1044),\n", + " ('definitely', 1043),\n", + " ('gives', 1031),\n", + " ('home', 1029),\n", + " ('seem', 1028),\n", + " ('episode', 1023),\n", + " ('sense', 1020),\n", + " ('audience', 1020),\n", + " ('truly', 1017),\n", + " ('special', 1011),\n", + " ('fan', 1009),\n", + " ('second', 1009),\n", + " ('short', 1009),\n", + " ('mind', 1005),\n", + " ('human', 1001),\n", + " ('recommend', 999),\n", + " ('full', 996),\n", + " ('black', 995),\n", + " ('help', 991),\n", + " ('along', 989),\n", + " ('trying', 987),\n", + " ('small', 986),\n", + " ('death', 985),\n", + " ('friends', 981),\n", + " ('remember', 974),\n", + " ('often', 970),\n", + " ('said', 966),\n", + " ('favorite', 962),\n", + " 
('heart', 959),\n", + " ('early', 957),\n", + " ('left', 956),\n", + " ('until', 955),\n", + " ('let', 954),\n", + " ('script', 954),\n", + " ('maybe', 937),\n", + " ('today', 936),\n", + " ('live', 934),\n", + " ('less', 934),\n", + " ('moments', 933),\n", + " ('others', 929),\n", + " ('brilliant', 926),\n", + " ('shot', 925),\n", + " ('liked', 923),\n", + " ('become', 916),\n", + " ('won', 915),\n", + " ('used', 910),\n", + " ('style', 907),\n", + " ('mother', 895),\n", + " ('lives', 894),\n", + " ('came', 893),\n", + " ('stars', 890),\n", + " ('cinema', 889),\n", + " ('looks', 885),\n", + " ('perhaps', 884),\n", + " ('read', 882),\n", + " ('enjoyed', 879),\n", + " ('boy', 875),\n", + " ('drama', 873),\n", + " ('highly', 871),\n", + " ('given', 870),\n", + " ('playing', 867),\n", + " ('use', 864),\n", + " ('next', 859),\n", + " ('women', 858),\n", + " ('fine', 857),\n", + " ('effects', 856),\n", + " ('kids', 854),\n", + " ('entertaining', 853),\n", + " ('need', 852),\n", + " ('line', 850),\n", + " ('works', 848),\n", + " ('someone', 847),\n", + " ('mr', 836),\n", + " ('simply', 835),\n", + " ('children', 833),\n", + " ('picture', 833),\n", + " ('face', 831),\n", + " ('friend', 831),\n", + " ('keep', 831),\n", + " ('dark', 830),\n", + " ('overall', 828),\n", + " ('certainly', 828),\n", + " ('minutes', 827),\n", + " ('wasn', 824),\n", + " ('history', 822),\n", + " ('finally', 820),\n", + " ('couple', 816),\n", + " ('against', 815),\n", + " ('son', 809),\n", + " ('understand', 808),\n", + " ('lost', 807),\n", + " ('michael', 805),\n", + " ('else', 801),\n", + " ('throughout', 798),\n", + " ('fans', 797),\n", + " ('city', 792),\n", + " ('reason', 789),\n", + " ('written', 787),\n", + " ('production', 787),\n", + " ('several', 784),\n", + " ('school', 783),\n", + " ('rest', 781),\n", + " ('based', 781),\n", + " ('try', 780),\n", + " ('dead', 776),\n", + " ('hope', 775),\n", + " ('strong', 768),\n", + " ('white', 765),\n", + " ('tell', 759),\n", + " ('itself', 
758),\n", + " ('half', 753),\n", + " ('person', 749),\n", + " ('sometimes', 746),\n", + " ('past', 744),\n", + " ('start', 744),\n", + " ('genre', 743),\n", + " ('final', 739),\n", + " ('beginning', 739),\n", + " ('town', 738),\n", + " ('art', 734),\n", + " ('game', 732),\n", + " ('humor', 732),\n", + " ('yes', 731),\n", + " ('idea', 731),\n", + " ('late', 730),\n", + " ('becomes', 729),\n", + " ('despite', 729),\n", + " ('able', 726),\n", + " ('case', 726),\n", + " ('money', 723),\n", + " ('child', 721),\n", + " ('completely', 721),\n", + " ('side', 719),\n", + " ('camera', 716),\n", + " ('getting', 714),\n", + " ('instead', 712),\n", + " ('soon', 702),\n", + " ('under', 700),\n", + " ('viewer', 699),\n", + " ('age', 697),\n", + " ('days', 696),\n", + " ('stories', 696),\n", + " ('felt', 694),\n", + " ('simple', 694),\n", + " ('roles', 693),\n", + " ('video', 688),\n", + " ('name', 683),\n", + " ('either', 683),\n", + " ('doing', 677),\n", + " ('turns', 674),\n", + " ('wants', 671),\n", + " ('close', 671),\n", + " ('title', 669),\n", + " ('wrong', 668),\n", + " ('went', 666),\n", + " ('james', 665),\n", + " ('evil', 659),\n", + " ('budget', 657),\n", + " ('episodes', 657),\n", + " ('relationship', 655),\n", + " ('piece', 653),\n", + " ('fantastic', 653),\n", + " ('david', 651),\n", + " ('turn', 648),\n", + " ('murder', 646),\n", + " ('parts', 645),\n", + " ('brother', 644),\n", + " ('head', 643),\n", + " ('absolutely', 643),\n", + " ('experience', 642),\n", + " ('eyes', 641),\n", + " ('sex', 638),\n", + " ('direction', 637),\n", + " ('called', 637),\n", + " ('directed', 636),\n", + " ('lines', 634),\n", + " ('behind', 633),\n", + " ('sort', 632),\n", + " ('actress', 631),\n", + " ('lead', 630),\n", + " ('oscar', 628),\n", + " ('example', 627),\n", + " ('including', 627),\n", + " ('known', 625),\n", + " ('musical', 625),\n", + " ('chance', 621),\n", + " ('score', 620),\n", + " ('feeling', 619),\n", + " ('already', 619),\n", + " ('hit', 619),\n", + " ('voice', 
615),\n", + " ('moment', 612),\n", + " ('living', 612),\n", + " ('low', 610),\n", + " ('supporting', 610),\n", + " ('ago', 609),\n", + " ('themselves', 608),\n", + " ('hilarious', 605),\n", + " ('reality', 605),\n", + " ('jack', 604),\n", + " ('told', 603),\n", + " ('hand', 601),\n", + " ('moving', 600),\n", + " ('dialogue', 600),\n", + " ('quality', 600),\n", + " ('song', 599),\n", + " ('happy', 599),\n", + " ('paul', 598),\n", + " ('matter', 598),\n", + " ('light', 594),\n", + " ('future', 593),\n", + " ('entire', 592),\n", + " ('finds', 591),\n", + " ('gave', 589),\n", + " ('laugh', 587),\n", + " ('released', 586),\n", + " ('expect', 584),\n", + " ('fight', 581),\n", + " ('particularly', 580),\n", + " ('cinematography', 579),\n", + " ('police', 579),\n", + " ('whose', 578),\n", + " ('type', 578),\n", + " ('sound', 578),\n", + " ('enjoyable', 573),\n", + " ('view', 573),\n", + " ('husband', 572),\n", + " ('romantic', 572),\n", + " ('number', 572),\n", + " ('daughter', 572),\n", + " ('documentary', 571),\n", + " ('self', 570),\n", + " ('modern', 569),\n", + " ('robert', 569),\n", + " ('took', 569),\n", + " ('superb', 569),\n", + " ('mean', 566),\n", + " ('shown', 563),\n", + " ('coming', 561),\n", + " ('important', 560),\n", + " ('king', 559),\n", + " ('leave', 559),\n", + " ('change', 558),\n", + " ('wanted', 555),\n", + " ('somewhat', 555),\n", + " ('tells', 554),\n", + " ('run', 552),\n", + " ('events', 552),\n", + " ('country', 552),\n", + " ('career', 552),\n", + " ('heard', 550),\n", + " ('season', 550),\n", + " ('girls', 549),\n", + " ('greatest', 549),\n", + " ('etc', 547),\n", + " ('care', 546),\n", + " ('starts', 545),\n", + " ('english', 542),\n", + " ('killer', 541),\n", + " ('animation', 540),\n", + " ('guys', 540),\n", + " ('totally', 540),\n", + " ('tale', 540),\n", + " ('usual', 539),\n", + " ('opinion', 535),\n", + " ('miss', 535),\n", + " ('violence', 531),\n", + " ('easy', 531),\n", + " ('songs', 530),\n", + " ('british', 528),\n", + " ('says', 
526),\n", + " ('realistic', 525),\n", + " ('writing', 524),\n", + " ('act', 522),\n", + " ('writer', 522),\n", + " ('comic', 521),\n", + " ('thriller', 519),\n", + " ('television', 517),\n", + " ('power', 516),\n", + " ('ones', 515),\n", + " ('kid', 514),\n", + " ('novel', 513),\n", + " ('york', 513),\n", + " ('problem', 512),\n", + " ('alone', 512),\n", + " ('attention', 509),\n", + " ('involved', 508),\n", + " ('kill', 507),\n", + " ('extremely', 507),\n", + " ('seemed', 506),\n", + " ('hero', 505),\n", + " ('french', 505),\n", + " ('rock', 504),\n", + " ('stuff', 501),\n", + " ('wish', 499),\n", + " ('begins', 498),\n", + " ('taken', 497),\n", + " ('sad', 497),\n", + " ('ways', 496),\n", + " ('richard', 495),\n", + " ('knows', 494),\n", + " ('atmosphere', 493),\n", + " ('surprised', 491),\n", + " ('similar', 491),\n", + " ('taking', 491),\n", + " ('car', 491),\n", + " ('george', 490),\n", + " ('perfectly', 490),\n", + " ('across', 489),\n", + " ('sequence', 489),\n", + " ('eye', 489),\n", + " ('team', 489),\n", + " ('serious', 488),\n", + " ('powerful', 488),\n", + " ('room', 488),\n", + " ('due', 488),\n", + " ('among', 488),\n", + " ('order', 487),\n", + " ('b', 487),\n", + " ('cannot', 487),\n", + " ('strange', 487),\n", + " ('beauty', 486),\n", + " ('famous', 485),\n", + " ('tries', 484),\n", + " ('myself', 484),\n", + " ('happened', 484),\n", + " ('herself', 484),\n", + " ('class', 483),\n", + " ('four', 482),\n", + " ('cool', 481),\n", + " ('release', 479),\n", + " ('anyway', 479),\n", + " ('theme', 479),\n", + " ('opening', 478),\n", + " ('entertainment', 477),\n", + " ('unique', 475),\n", + " ('ends', 475),\n", + " ('slow', 475),\n", + " ('exactly', 475),\n", + " ('red', 474),\n", + " ('o', 474),\n", + " ('level', 474),\n", + " ('easily', 474),\n", + " ('interest', 472),\n", + " ('happen', 471),\n", + " ('crime', 470),\n", + " ('viewing', 468),\n", + " ('memorable', 467),\n", + " ('sets', 467),\n", + " ('group', 466),\n", + " ('stop', 466),\n", + " 
('dance', 463),\n", + " ('message', 463),\n", + " ('sister', 463),\n", + " ('working', 463),\n", + " ('problems', 463),\n", + " ('knew', 462),\n", + " ('mystery', 461),\n", + " ('nature', 461),\n", + " ('bring', 460),\n", + " ('believable', 459),\n", + " ('thinking', 459),\n", + " ('brought', 459),\n", + " ('mostly', 458),\n", + " ('couldn', 457),\n", + " ('disney', 457),\n", + " ('society', 456),\n", + " ('within', 455),\n", + " ('lady', 455),\n", + " ('blood', 454),\n", + " ('upon', 453),\n", + " ('viewers', 453),\n", + " ('parents', 453),\n", + " ('meets', 452),\n", + " ('form', 452),\n", + " ('soundtrack', 452),\n", + " ('usually', 452),\n", + " ('tom', 452),\n", + " ('peter', 452),\n", + " ('local', 450),\n", + " ('certain', 448),\n", + " ('follow', 448),\n", + " ('whether', 447),\n", + " ('possible', 446),\n", + " ('emotional', 445),\n", + " ('killed', 444),\n", + " ('de', 444),\n", + " ('above', 444),\n", + " ('middle', 443),\n", + " ('god', 443),\n", + " ('happens', 442),\n", + " ('flick', 442),\n", + " ('needs', 442),\n", + " ('masterpiece', 441),\n", + " ('major', 440),\n", + " ('period', 440),\n", + " ('haven', 439),\n", + " ('named', 439),\n", + " ('th', 438),\n", + " ('particular', 438),\n", + " ('earth', 437),\n", + " ('feature', 437),\n", + " ('stand', 436),\n", + " ('words', 435),\n", + " ('typical', 435),\n", + " ('obviously', 433),\n", + " ('elements', 433),\n", + " ('romance', 431),\n", + " ('jane', 430),\n", + " ('yourself', 427),\n", + " ('showing', 427),\n", + " ('fantasy', 426),\n", + " ('brings', 426),\n", + " ('america', 423),\n", + " ('guess', 423),\n", + " ('huge', 422),\n", + " ('unfortunately', 422),\n", + " ('indeed', 421),\n", + " ('running', 421),\n", + " ('talent', 420),\n", + " ('stage', 419),\n", + " ('started', 418),\n", + " ('sweet', 417),\n", + " ('leads', 417),\n", + " ('japanese', 417),\n", + " ('poor', 416),\n", + " ('deal', 416),\n", + " ('personal', 413),\n", + " ('incredible', 413),\n", + " ('fast', 412),\n", + " 
('became', 410),\n", + " ('deep', 410),\n", + " ('hours', 409),\n", + " ('nearly', 408),\n", + " ('dream', 408),\n", + " ('giving', 408),\n", + " ('turned', 407),\n", + " ('clearly', 407),\n", + " ('near', 406),\n", + " ('obvious', 406),\n", + " ('cut', 405),\n", + " ('surprise', 405),\n", + " ('body', 404),\n", + " ('era', 404),\n", + " ('female', 403),\n", + " ('hour', 403),\n", + " ('five', 403),\n", + " ('note', 399),\n", + " ('learn', 398),\n", + " ('truth', 398),\n", + " ('match', 397),\n", + " ('feels', 397),\n", + " ('except', 397),\n", + " ('tony', 397),\n", + " ('filmed', 394),\n", + " ('complete', 394),\n", + " ('clear', 394),\n", + " ('older', 393),\n", + " ('street', 393),\n", + " ('lots', 393),\n", + " ('eventually', 393),\n", + " ('keeps', 393),\n", + " ('buy', 392),\n", + " ('stewart', 391),\n", + " ('william', 391),\n", + " ('joe', 390),\n", + " ('meet', 390),\n", + " ('fall', 390),\n", + " ('shots', 389),\n", + " ('talking', 389),\n", + " ('difficult', 389),\n", + " ('unlike', 389),\n", + " ('rating', 389),\n", + " ('means', 388),\n", + " ('dramatic', 388),\n", + " ('appears', 386),\n", + " ('subject', 386),\n", + " ('wonder', 386),\n", + " ('present', 386),\n", + " ('situation', 386),\n", + " ('comments', 385),\n", + " ('sequences', 383),\n", + " ('general', 383),\n", + " ('lee', 383),\n", + " ('earlier', 382),\n", + " ('points', 382),\n", + " ('check', 379),\n", + " ('gone', 379),\n", + " ('ten', 378),\n", + " ('suspense', 378),\n", + " ('recommended', 378),\n", + " ('business', 377),\n", + " ('third', 377),\n", + " ('talk', 375),\n", + " ('leaves', 375),\n", + " ('beyond', 375),\n", + " ('portrayal', 374),\n", + " ('beautifully', 373),\n", + " ('single', 372),\n", + " ('bill', 372),\n", + " ('word', 371),\n", + " ('plenty', 371),\n", + " ('falls', 370),\n", + " ('whom', 370),\n", + " ('figure', 369),\n", + " ('battle', 369),\n", + " ('scary', 369),\n", + " ('non', 369),\n", + " ('return', 368),\n", + " ('using', 368),\n", + " ('doubt', 
367),\n", + " ('add', 367),\n", + " ('hear', 366),\n", + " ('solid', 366),\n", + " ('success', 366),\n", + " ('touching', 365),\n", + " ('political', 365),\n", + " ('oh', 365),\n", + " ('jokes', 365),\n", + " ('awesome', 364),\n", + " ('hell', 364),\n", + " ('boys', 364),\n", + " ('dog', 362),\n", + " ('recently', 362),\n", + " ('sexual', 362),\n", + " ('please', 361),\n", + " ('wouldn', 361),\n", + " ('features', 361),\n", + " ('straight', 361),\n", + " ('lack', 360),\n", + " ('forget', 360),\n", + " ('setting', 360),\n", + " ('mark', 359),\n", + " ('married', 359),\n", + " ('social', 357),\n", + " ('adventure', 356),\n", + " ('interested', 356),\n", + " ('brothers', 355),\n", + " ('sees', 355),\n", + " ('actual', 355),\n", + " ('terrific', 355),\n", + " ('move', 354),\n", + " ('call', 354),\n", + " ('various', 353),\n", + " ('dr', 353),\n", + " ('theater', 353),\n", + " ('animated', 352),\n", + " ('western', 351),\n", + " ('space', 350),\n", + " ('baby', 350),\n", + " ('leading', 348),\n", + " ('disappointed', 348),\n", + " ('portrayed', 346),\n", + " ('aren', 346),\n", + " ('screenplay', 345),\n", + " ('smith', 345),\n", + " ('hate', 344),\n", + " ('towards', 344),\n", + " ('noir', 343),\n", + " ('outstanding', 342),\n", + " ('decent', 342),\n", + " ('kelly', 342),\n", + " ('directors', 341),\n", + " ('journey', 341),\n", + " ('none', 340),\n", + " ('effective', 340),\n", + " ('looked', 340),\n", + " ('caught', 339),\n", + " ('cold', 339),\n", + " ('storyline', 339),\n", + " ('fi', 339),\n", + " ('sci', 339),\n", + " ('mary', 339),\n", + " ('rich', 338),\n", + " ('charming', 338),\n", + " ('harry', 337),\n", + " ('popular', 337),\n", + " ('manages', 337),\n", + " ('rare', 337),\n", + " ('spirit', 336),\n", + " ('open', 335),\n", + " ('appreciate', 335),\n", + " ('basically', 334),\n", + " ('moves', 334),\n", + " ('acted', 334),\n", + " ('deserves', 333),\n", + " ('subtle', 333),\n", + " ('mention', 333),\n", + " ('inside', 333),\n", + " ('pace', 333),\n", + " 
('century', 333),\n", + " ('boring', 333),\n", + " ('familiar', 332),\n", + " ('background', 332),\n", + " ('ben', 331),\n", + " ('creepy', 330),\n", + " ('supposed', 330),\n", + " ('secret', 329),\n", + " ('jim', 328),\n", + " ('die', 328),\n", + " ('question', 327),\n", + " ('effect', 327),\n", + " ('natural', 327),\n", + " ('rate', 326),\n", + " ('language', 326),\n", + " ('impressive', 326),\n", + " ('intelligent', 325),\n", + " ('saying', 325),\n", + " ('material', 324),\n", + " ('realize', 324),\n", + " ('telling', 324),\n", + " ('scott', 324),\n", + " ('singing', 323),\n", + " ('dancing', 322),\n", + " ('adult', 321),\n", + " ('imagine', 321),\n", + " ('visual', 321),\n", + " ('kept', 320),\n", + " ('office', 320),\n", + " ('uses', 319),\n", + " ('pure', 318),\n", + " ('wait', 318),\n", + " ('stunning', 318),\n", + " ('copy', 317),\n", + " ('review', 317),\n", + " ('previous', 317),\n", + " ('seriously', 317),\n", + " ('somehow', 316),\n", + " ('created', 316),\n", + " ('magic', 316),\n", + " ('create', 316),\n", + " ('hot', 316),\n", + " ('reading', 316),\n", + " ('crazy', 315),\n", + " ('air', 315),\n", + " ('frank', 315),\n", + " ('stay', 315),\n", + " ('escape', 315),\n", + " ('attempt', 315),\n", + " ('hands', 314),\n", + " ('filled', 313),\n", + " ('surprisingly', 312),\n", + " ('expected', 312),\n", + " ('average', 312),\n", + " ('complex', 311),\n", + " ('studio', 310),\n", + " ('successful', 310),\n", + " ('quickly', 310),\n", + " ('male', 309),\n", + " ('plus', 309),\n", + " ('co', 307),\n", + " ('minute', 306),\n", + " ('images', 306),\n", + " ('casting', 306),\n", + " ('exciting', 306),\n", + " ('following', 306),\n", + " ('members', 305),\n", + " ('german', 305),\n", + " ('e', 305),\n", + " ('reasons', 305),\n", + " ('follows', 305),\n", + " ('themes', 305),\n", + " ('touch', 304),\n", + " ('genius', 304),\n", + " ('free', 304),\n", + " ('edge', 304),\n", + " ('cute', 304),\n", + " ('outside', 303),\n", + " ('ok', 302),\n", + " ('admit', 
302),\n", + " ('younger', 302),\n", + " ('reviews', 302),\n", + " ('odd', 301),\n", + " ('fighting', 301),\n", + " ('master', 301),\n", + " ('break', 300),\n", + " ('thanks', 300),\n", + " ('recent', 300),\n", + " ('comment', 300),\n", + " ('apart', 299),\n", + " ('lovely', 298),\n", + " ('begin', 298),\n", + " ('emotions', 298),\n", + " ('doctor', 297),\n", + " ('italian', 297),\n", + " ('party', 297),\n", + " ('la', 296),\n", + " ('missed', 296),\n", + " ...]" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "positive_counts.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "pos_neg_ratios = Counter()\n", + "\n", + "for term,cnt in list(total_counts.most_common()):\n", + " if(cnt > 100):\n", + " pos_neg_ratio = positive_counts[term] / float(negative_counts[term]+1)\n", + " pos_neg_ratios[term] = pos_neg_ratio\n", + "\n", + "for word,ratio in pos_neg_ratios.most_common():\n", + " if(ratio > 1):\n", + " pos_neg_ratios[word] = np.log(ratio)\n", + " else:\n", + " pos_neg_ratios[word] = -np.log((1 / (ratio+0.01)))" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('edie', 4.6913478822291435),\n", + " ('paulie', 4.0775374439057197),\n", + " ('felix', 3.1527360223636558),\n", + " ('polanski', 2.8233610476132043),\n", + " ('matthau', 2.8067217286092401),\n", + " ('victoria', 2.6810215287142909),\n", + " ('mildred', 2.6026896854443837),\n", + " ('gandhi', 2.5389738710582761),\n", + " ('flawless', 2.451005098112319),\n", + " ('superbly', 2.2600254785752498),\n", + " ('perfection', 2.1594842493533721),\n", + " ('astaire', 2.1400661634962708),\n", + " ('captures', 2.0386195471595809),\n", + " ('voight', 2.0301704926730531),\n", + " ('wonderfully', 2.0218960560332353),\n", + " ('powell', 1.9783454248084671),\n", 
+ " ('brosnan', 1.9547990964725592),\n", + " ('lily', 1.9203768470501485),\n", + " ('bakshi', 1.9029851043382795),\n", + " ('lincoln', 1.9014583864844796),\n", + " ('refreshing', 1.8551812956655511),\n", + " ('breathtaking', 1.8481124057791867),\n", + " ('bourne', 1.8478489358790986),\n", + " ('lemmon', 1.8458266904983307),\n", + " ('delightful', 1.8002701588959635),\n", + " ('flynn', 1.7996646487351682),\n", + " ('andrews', 1.7764919970972666),\n", + " ('homer', 1.7692866133759964),\n", + " ('beautifully', 1.7626953362841438),\n", + " ('soccer', 1.7578579175523736),\n", + " ('elvira', 1.7397031072720019),\n", + " ('underrated', 1.7197859696029656),\n", + " ('gripping', 1.7165360479904674),\n", + " ('superb', 1.7091514458966952),\n", + " ('delight', 1.6714733033535532),\n", + " ('welles', 1.6677068205580761),\n", + " ('sadness', 1.663505133704376),\n", + " ('sinatra', 1.6389967146756448),\n", + " ('touching', 1.637217476541176),\n", + " ('timeless', 1.62924053973028),\n", + " ('macy', 1.6211339521972916),\n", + " ('unforgettable', 1.6177367152487956),\n", + " ('favorites', 1.6158688027643908),\n", + " ('stewart', 1.6119987332957739),\n", + " ('hartley', 1.6094379124341003),\n", + " ('sullivan', 1.6094379124341003),\n", + " ('extraordinary', 1.6094379124341003),\n", + " ('brilliantly', 1.5950491749820008),\n", + " ('friendship', 1.5677652160335325),\n", + " ('wonderful', 1.5645425925262093),\n", + " ('palma', 1.5553706911638245),\n", + " ('magnificent', 1.54663701119507),\n", + " ('finest', 1.5462590108125689),\n", + " ('jackie', 1.5439233053234738),\n", + " ('ritter', 1.5404450409471491),\n", + " ('tremendous', 1.5184661342283736),\n", + " ('freedom', 1.5091151908062312),\n", + " ('fantastic', 1.5048433868558566),\n", + " ('terrific', 1.5026699370083942),\n", + " ('noir', 1.493925025312256),\n", + " ('sidney', 1.493925025312256),\n", + " ('outstanding', 1.4910053152089213),\n", + " ('mann', 1.4894785973551214),\n", + " ('pleasantly', 1.4894785973551214),\n", + " 
('nancy', 1.488077055429833),\n", + " ('marie', 1.4825711915553104),\n", + " ('marvelous', 1.4739999415389962),\n", + " ('excellent', 1.4647538505723599),\n", + " ('ruth', 1.4596256342054401),\n", + " ('stanwyck', 1.4412101187160054),\n", + " ('widmark', 1.4350845252893227),\n", + " ('splendid', 1.4271163556401458),\n", + " ('chan', 1.423108334242607),\n", + " ('exceptional', 1.4201959127955721),\n", + " ('tender', 1.410986973710262),\n", + " ('gentle', 1.4078005663408544),\n", + " ('poignant', 1.4022947024663317),\n", + " ('gem', 1.3932148039644643),\n", + " ('amazing', 1.3919815802404802),\n", + " ('chilling', 1.3862943611198906),\n", + " ('captivating', 1.3862943611198906),\n", + " ('fisher', 1.3862943611198906),\n", + " ('davies', 1.3862943611198906),\n", + " ('darker', 1.3652409519220583),\n", + " ('april', 1.3499267169490159),\n", + " ('kelly', 1.3461743673304654),\n", + " ('blake', 1.3418425985490567),\n", + " ('overlooked', 1.329135947279942),\n", + " ('ralph', 1.32818673031261),\n", + " ('bette', 1.3156767939059373),\n", + " ('hoffman', 1.3150668518315229),\n", + " ('cole', 1.3121863889661687),\n", + " ('shines', 1.3049487216659381),\n", + " ('powerful', 1.2999662776313934),\n", + " ('notch', 1.2950456896547455),\n", + " ('remarkable', 1.2883688239495823),\n", + " ('pitt', 1.286210902562908),\n", + " ('winters', 1.2833463918674481),\n", + " ('vivid', 1.2762934659055623),\n", + " ('gritty', 1.2757524867200667),\n", + " ('giallo', 1.2745029551317739),\n", + " ('portrait', 1.2704625455947689),\n", + " ('innocence', 1.2694300209805796),\n", + " ('psychiatrist', 1.2685113254635072),\n", + " ('favorite', 1.2668956297860055),\n", + " ('ensemble', 1.2656663733312759),\n", + " ('stunning', 1.2622417124499117),\n", + " ('burns', 1.259880436264232),\n", + " ('garbo', 1.258954938743289),\n", + " ('barbara', 1.2580400255962119),\n", + " ('panic', 1.2527629684953681),\n", + " ('holly', 1.2527629684953681),\n", + " ('philip', 1.2527629684953681),\n", + " ('carol', 
1.2481440226390734),\n", + " ('perfect', 1.246742480713785),\n", + " ('appreciated', 1.2462482874741743),\n", + " ('favourite', 1.2411123512753928),\n", + " ('journey', 1.2367626271489269),\n", + " ('rural', 1.235471471385307),\n", + " ('bond', 1.2321436812926323),\n", + " ('builds', 1.2305398317106577),\n", + " ('brilliant', 1.2287554137664785),\n", + " ('brooklyn', 1.2286654169163074),\n", + " ('von', 1.225175011976539),\n", + " ('unfolds', 1.2163953243244932),\n", + " ('recommended', 1.2163953243244932),\n", + " ('daniel', 1.20215296760895),\n", + " ('perfectly', 1.1971931173405572),\n", + " ('crafted', 1.1962507582320256),\n", + " ('prince', 1.1939224684724346),\n", + " ('troubled', 1.192138346678933),\n", + " ('consequences', 1.1865810616140668),\n", + " ('haunting', 1.1814999484738773),\n", + " ('cinderella', 1.180052620608284),\n", + " ('alexander', 1.1759989522835299),\n", + " ('emotions', 1.1753049094563641),\n", + " ('boxing', 1.1735135968412274),\n", + " ('subtle', 1.1734135017508081),\n", + " ('curtis', 1.1649873576129823),\n", + " ('rare', 1.1566438362402944),\n", + " ('loved', 1.1563661500586044),\n", + " ('daughters', 1.1526795099383853),\n", + " ('courage', 1.1438688802562305),\n", + " ('dentist', 1.1426722784621401),\n", + " ('highly', 1.1420208631618658),\n", + " ('nominated', 1.1409146683587992),\n", + " ('tony', 1.1397491942285991),\n", + " ('draws', 1.1325138403437911),\n", + " ('everyday', 1.1306150197542835),\n", + " ('contrast', 1.1284652518177909),\n", + " ('cried', 1.1213405397456659),\n", + " ('fabulous', 1.1210851445201684),\n", + " ('ned', 1.120591195386885),\n", + " ('fay', 1.120591195386885),\n", + " ('emma', 1.1184149159642893),\n", + " ('sensitive', 1.113318436057805),\n", + " ('smooth', 1.1089750757036563),\n", + " ('dramas', 1.1080910326226534),\n", + " ('today', 1.1050431789984001),\n", + " ('helps', 1.1023091505494358),\n", + " ('inspiring', 1.0986122886681098),\n", + " ('jimmy', 1.0937696641923216),\n", + " ('awesome', 
1.0931328229034842),\n", + " ('unique', 1.0881409888008142),\n", + " ('tragic', 1.0871835928444868),\n", + " ('intense', 1.0870514662670339),\n", + " ('stellar', 1.0857088838322018),\n", + " ('rival', 1.0822184788924332),\n", + " ('provides', 1.0797081340289569),\n", + " ('depression', 1.0782034170369026),\n", + " ('shy', 1.0775588794702773),\n", + " ('carrie', 1.076139432816051),\n", + " ('blend', 1.0753554265038423),\n", + " ('hank', 1.0736109864626924),\n", + " ('diana', 1.0726368022648489),\n", + " ('adorable', 1.0726368022648489),\n", + " ('unexpected', 1.0722255334949147),\n", + " ('achievement', 1.0668635903535293),\n", + " ('bettie', 1.0663514264498881),\n", + " ('happiness', 1.0632729222228008),\n", + " ('glorious', 1.0608719606852626),\n", + " ('davis', 1.0541605260972757),\n", + " ('terrifying', 1.0525211814678428),\n", + " ('beauty', 1.050410186850232),\n", + " ('ideal', 1.0479685558493548),\n", + " ('fears', 1.0467872208035236),\n", + " ('hong', 1.0438040521731147),\n", + " ('seasons', 1.0433496099930604),\n", + " ('fascinating', 1.0414538748281612),\n", + " ('carries', 1.0345904299031787),\n", + " ('satisfying', 1.0321225473992768),\n", + " ('definite', 1.0319209141694374),\n", + " ('touched', 1.0296194171811581),\n", + " ('greatest', 1.0248947127715422),\n", + " ('creates', 1.0241097613701886),\n", + " ('aunt', 1.023388867430522),\n", + " ('walter', 1.022328983918479),\n", + " ('spectacular', 1.0198314108149955),\n", + " ('portrayal', 1.0189810189761024),\n", + " ('ann', 1.0127808528183286),\n", + " ('enterprise', 1.0116009116784799),\n", + " ('musicals', 1.0096648026516135),\n", + " ('deeply', 1.0094845087721023),\n", + " ('incredible', 1.0061677561461084),\n", + " ('mature', 1.0060195018402847),\n", + " ('triumph', 0.99682959435816731),\n", + " ('margaret', 0.99682959435816731),\n", + " ('navy', 0.99493385919326827),\n", + " ('harry', 0.99176919305006062),\n", + " ('lucas', 0.990398704027877),\n", + " ('sweet', 0.98966110487955483),\n", + " 
('joey', 0.98794672078059009),\n", + " ('oscar', 0.98721905111049713),\n", + " ('balance', 0.98649499054740353),\n", + " ('warm', 0.98485340331145166),\n", + " ('ages', 0.98449898190068863),\n", + " ('glover', 0.98082925301172619),\n", + " ('guilt', 0.98082925301172619),\n", + " ('carrey', 0.98082925301172619),\n", + " ('learns', 0.97881108885548895),\n", + " ('unusual', 0.97788374278196932),\n", + " ('sons', 0.97777581552483595),\n", + " ('complex', 0.97761897738147796),\n", + " ('essence', 0.97753435711487369),\n", + " ('brazil', 0.9769153536905899),\n", + " ('widow', 0.97650959186720987),\n", + " ('solid', 0.97537964824416146),\n", + " ('beautiful', 0.97326301262841053),\n", + " ('holmes', 0.97246100334120955),\n", + " ('awe', 0.97186058302896583),\n", + " ('vhs', 0.97116734209998934),\n", + " ('eerie', 0.97116734209998934),\n", + " ('lonely', 0.96873720724669754),\n", + " ('grim', 0.96873720724669754),\n", + " ('sport', 0.96825047080486615),\n", + " ('debut', 0.96508089604358704),\n", + " ('destiny', 0.96343751029985703),\n", + " ('thrillers', 0.96281074750904794),\n", + " ('tears', 0.95977584381389391),\n", + " ('rose', 0.95664202739772253),\n", + " ('feelings', 0.95551144502743635),\n", + " ('ginger', 0.95551144502743635),\n", + " ('winning', 0.95471810900804055),\n", + " ('stanley', 0.95387344302319799),\n", + " ('cox', 0.95343027882361187),\n", + " ('paris', 0.95278479030472663),\n", + " ('heart', 0.95238806924516806),\n", + " ('hooked', 0.95155887071161305),\n", + " ('comfortable', 0.94803943018873538),\n", + " ('mgm', 0.94446160884085151),\n", + " ('masterpiece', 0.94155039863339296),\n", + " ('themes', 0.94118828349588235),\n", + " ('danny', 0.93967118051821874),\n", + " ('anime', 0.93378388932167222),\n", + " ('perry', 0.93328830824272613),\n", + " ('joy', 0.93301752567946861),\n", + " ('lovable', 0.93081883243706487),\n", + " ('hal', 0.92953595862417571),\n", + " ('mysteries', 0.92953595862417571),\n", + " ('louis', 0.92871325187271225),\n", + " 
('charming', 0.92520609553210742),\n", + " ('urban', 0.92367083917177761),\n", + " ('allows', 0.92183091224977043),\n", + " ('impact', 0.91815814604895041),\n", + " ('gradually', 0.91629073187415511),\n", + " ('lifestyle', 0.91629073187415511),\n", + " ('italy', 0.91629073187415511),\n", + " ('spy', 0.91289514287301687),\n", + " ('treat', 0.91193342650519937),\n", + " ('subsequent', 0.91056005716517008),\n", + " ('kennedy', 0.90981821736853763),\n", + " ('loving', 0.90967549275543591),\n", + " ('surprising', 0.90937028902958128),\n", + " ('quiet', 0.90648673177753425),\n", + " ('winter', 0.90624039602065365),\n", + " ('reveals', 0.90490540964902977),\n", + " ('raw', 0.90445627422715225),\n", + " ('funniest', 0.90078654533818991),\n", + " ('pleased', 0.89994159387262562),\n", + " ('norman', 0.89994159387262562),\n", + " ('thief', 0.89874642222324552),\n", + " ('season', 0.89827222637147675),\n", + " ('secrets', 0.89794159320595857),\n", + " ('colorful', 0.89705936994626756),\n", + " ('highest', 0.8967461358011849),\n", + " ('compelling', 0.89462923509297576),\n", + " ('danes', 0.89248008318043659),\n", + " ('castle', 0.88967708335606499),\n", + " ('kudos', 0.88889175768604067),\n", + " ('great', 0.88810470901464589),\n", + " ('baseball', 0.88730319500090271),\n", + " ('subtitles', 0.88730319500090271),\n", + " ('bleak', 0.88730319500090271),\n", + " ('winner', 0.88643776872447388),\n", + " ('tragedy', 0.88563699078315261),\n", + " ('todd', 0.88551907320740142),\n", + " ('nicely', 0.87924946019380601),\n", + " ('arthur', 0.87546873735389985),\n", + " ('essential', 0.87373111745535925),\n", + " ('gorgeous', 0.8731725250935497),\n", + " ('fonda', 0.87294029100054127),\n", + " ('eastwood', 0.87139541196626402),\n", + " ('focuses', 0.87082835779739776),\n", + " ('enjoyed', 0.87070195951624607),\n", + " ('natural', 0.86997924506912838),\n", + " ('intensity', 0.86835126958503595),\n", + " ('witty', 0.86824103423244681),\n", + " ('rob', 0.8642954367557748),\n", + " 
('worlds', 0.86377269759070874),\n", + " ('health', 0.86113891179907498),\n", + " ('magical', 0.85953791528170564),\n", + " ('deeper', 0.85802182375017932),\n", + " ('lucy', 0.85618680780444956),\n", + " ('moving', 0.85566611005772031),\n", + " ('lovely', 0.85290640004681306),\n", + " ('purple', 0.8513711857748395),\n", + " ('memorable', 0.84801189112086062),\n", + " ('sings', 0.84729786038720367),\n", + " ('craig', 0.84342938360928321),\n", + " ('modesty', 0.84342938360928321),\n", + " ('relate', 0.84326559685926517),\n", + " ('episodes', 0.84223712084137292),\n", + " ('strong', 0.84167135777060931),\n", + " ('smith', 0.83959811108590054),\n", + " ('tear', 0.83704136022001441),\n", + " ('apartment', 0.83333115290549531),\n", + " ('princess', 0.83290912293510388),\n", + " ('disagree', 0.83290912293510388),\n", + " ('kung', 0.83173334384609199),\n", + " ('adventure', 0.83150561393278388),\n", + " ('columbo', 0.82667857318446791),\n", + " ('jake', 0.82667857318446791),\n", + " ('adds', 0.82485652591452319),\n", + " ('hart', 0.82472353834866463),\n", + " ('strength', 0.82417544296634937),\n", + " ('realizes', 0.82360006895738058),\n", + " ('dave', 0.8232003088081431),\n", + " ('childhood', 0.82208086393583857),\n", + " ('forbidden', 0.81989888619908913),\n", + " ('tight', 0.81883539572344199),\n", + " ('surreal', 0.8178506590609026),\n", + " ('manager', 0.81770990320170756),\n", + " ('dancer', 0.81574950265227764),\n", + " ('con', 0.81093021621632877),\n", + " ('studios', 0.81093021621632877),\n", + " ('miike', 0.80821651034473263),\n", + " ('realistic', 0.80807714723392232),\n", + " ('explicit', 0.80792269515237358),\n", + " ('kurt', 0.8060875917405409),\n", + " ('traditional', 0.80535917116687328),\n", + " ('deals', 0.80535917116687328),\n", + " ('holds', 0.80493858654806194),\n", + " ('carl', 0.80437281567016972),\n", + " ('touches', 0.80396154690023547),\n", + " ('gene', 0.80314807577427383),\n", + " ('albert', 0.8027669055771679),\n", + " ('abc', 
0.80234647252493729),\n", + " ('cry', 0.80011930011211307),\n", + " ('sides', 0.7995275841185171),\n", + " ('develops', 0.79850769621777162),\n", + " ('eyre', 0.79850769621777162),\n", + " ('dances', 0.79694397424158891),\n", + " ('oscars', 0.79633141679517616),\n", + " ('legendary', 0.79600456599965308),\n", + " ('importance', 0.79492987486988764),\n", + " ('hearted', 0.79492987486988764),\n", + " ('portraying', 0.79356592830699269),\n", + " ('impressed', 0.79258107754813223),\n", + " ('waters', 0.79112758892014912),\n", + " ('empire', 0.79078565012386137),\n", + " ('edge', 0.789774016249017),\n", + " ('environment', 0.78845736036427028),\n", + " ('jean', 0.78845736036427028),\n", + " ('sentimental', 0.7864791203521645),\n", + " ('captured', 0.78623760362595729),\n", + " ('styles', 0.78592891401091158),\n", + " ('daring', 0.78592891401091158),\n", + " ('backgrounds', 0.78275933924963248),\n", + " ('frank', 0.78275933924963248),\n", + " ('matches', 0.78275933924963248),\n", + " ('tense', 0.78275933924963248),\n", + " ('gothic', 0.78209466657644144),\n", + " ('sharp', 0.7814397877056235),\n", + " ('achieved', 0.78015855754957497),\n", + " ('court', 0.77947526404844247),\n", + " ('steals', 0.7789140023173704),\n", + " ('rules', 0.77844476107184035),\n", + " ('colors', 0.77684619943659217),\n", + " ('reunion', 0.77318988823348167),\n", + " ('covers', 0.77139937745969345),\n", + " ('tale', 0.77010822169607374),\n", + " ('rain', 0.7683706017975328),\n", + " ('denzel', 0.76804848873306297),\n", + " ('stays', 0.76787072675588186),\n", + " ('blob', 0.76725515271366718),\n", + " ('conventional', 0.76214005204689672),\n", + " ('maria', 0.76214005204689672),\n", + " ('fresh', 0.76158434211317383),\n", + " ('midnight', 0.76096977689870637),\n", + " ('landscape', 0.75852993982279704),\n", + " ('animated', 0.75768570169751648),\n", + " ('titanic', 0.75666058628227129),\n", + " ('sunday', 0.75666058628227129),\n", + " ('spring', 0.7537718023763802),\n", + " ('cagney', 
0.7537718023763802),\n", + " ('enjoyable', 0.75246375771636476),\n", + " ('immensely', 0.75198768058287868),\n", + " ('sir', 0.7507762933965817),\n", + " ('nevertheless', 0.75067102469813185),\n", + " ('driven', 0.74994477895307854),\n", + " ('performances', 0.74883252516063137),\n", + " ('memories', 0.74721440183022114),\n", + " ('nowadays', 0.74721440183022114),\n", + " ('simple', 0.74641420974143258),\n", + " ('golden', 0.74533293373051557),\n", + " ('leslie', 0.74533293373051557),\n", + " ('lovers', 0.74497224842453125),\n", + " ('relationship', 0.74484232345601786),\n", + " ('supporting', 0.74357803418683721),\n", + " ('che', 0.74262723782331497),\n", + " ('packed', 0.7410032017375805),\n", + " ('trek', 0.74021469141793106),\n", + " ('provoking', 0.73840377214806618),\n", + " ('strikes', 0.73759894313077912),\n", + " ('depiction', 0.73682224406260699),\n", + " ('emotional', 0.73678211645681524),\n", + " ('secretary', 0.7366322924996842),\n", + " ('influenced', 0.73511137965897755),\n", + " ('florida', 0.73511137965897755),\n", + " ('germany', 0.73288750920945944),\n", + " ('brings', 0.73142936713096229),\n", + " ('lewis', 0.73129894652432159),\n", + " ('elderly', 0.73088750854279239),\n", + " ('owner', 0.72743625403857748),\n", + " ('streets', 0.72666987259858895),\n", + " ('henry', 0.72642196944481741),\n", + " ('portrays', 0.72593700338293632),\n", + " ('bears', 0.7252354951114458),\n", + " ('china', 0.72489587887452556),\n", + " ('anger', 0.72439972406404984),\n", + " ('society', 0.72433010799663333),\n", + " ('available', 0.72415741730250549),\n", + " ('best', 0.72347034060446314),\n", + " ('bugs', 0.72270598280148979),\n", + " ('magic', 0.71878961117328299),\n", + " ('verhoeven', 0.71846498854423513),\n", + " ('delivers', 0.71846498854423513),\n", + " ('jim', 0.71783979315031676),\n", + " ('donald', 0.71667767797013937),\n", + " ('endearing', 0.71465338578090898),\n", + " ('relationships', 0.71393795022901896),\n", + " ('greatly', 
0.71256526641704687),\n", + " ('charlie', 0.71024161391924534),\n", + " ('brad', 0.71024161391924534),\n", + " ('simon', 0.70967648251115578),\n", + " ('effectively', 0.70914752190638641),\n", + " ('march', 0.70774597998109789),\n", + " ('atmosphere', 0.70744773070214162),\n", + " ('influence', 0.70733181555190172),\n", + " ('genius', 0.706392407309966),\n", + " ('emotionally', 0.70556970055850243),\n", + " ('ken', 0.70526854109229009),\n", + " ('identity', 0.70484322032313651),\n", + " ('sophisticated', 0.70470800296102132),\n", + " ('dan', 0.70457587638356811),\n", + " ('andrew', 0.70329955202396321),\n", + " ('india', 0.70144598337464037),\n", + " ('roy', 0.69970458110610434),\n", + " ('surprisingly', 0.6995780708902356),\n", + " ('sky', 0.69780919366575667),\n", + " ('romantic', 0.69664981111114743),\n", + " ('match', 0.69566924999265523),\n", + " ('britain', 0.69314718055994529),\n", + " ('beatty', 0.69314718055994529),\n", + " ('affected', 0.69314718055994529),\n", + " ('cowboy', 0.69314718055994529),\n", + " ('wave', 0.69314718055994529),\n", + " ('stylish', 0.69314718055994529),\n", + " ('bitter', 0.69314718055994529),\n", + " ('patient', 0.69314718055994529),\n", + " ('meets', 0.69314718055994529),\n", + " ('love', 0.69198533541937324),\n", + " ('paul', 0.68980827929443067),\n", + " ('andy', 0.68846333124751902),\n", + " ('performance', 0.68797386327972465),\n", + " ('patrick', 0.68645819240914863),\n", + " ('unlike', 0.68546468438792907),\n", + " ('brooks', 0.68433655087779044),\n", + " ('refuses', 0.68348526964820844),\n", + " ('award', 0.6824518914431974),\n", + " ('complaint', 0.6824518914431974),\n", + " ('ride', 0.68229716453587952),\n", + " ('dawson', 0.68171848473632257),\n", + " ('luke', 0.68158635815886937),\n", + " ('wells', 0.68087708796813096),\n", + " ('france', 0.6804081547825156),\n", + " ('handsome', 0.68007509899259255),\n", + " ('sports', 0.68007509899259255),\n", + " ('rebel', 0.67875844310784572),\n", + " ('directs', 
0.67875844310784572),\n", + " ('greater', 0.67605274720064523),\n", + " ('dreams', 0.67599410133369586),\n", + " ('effective', 0.67565402311242806),\n", + " ('interpretation', 0.67479804189174875),\n", + " ('works', 0.67445504754779284),\n", + " ('brando', 0.67445504754779284),\n", + " ('noble', 0.6737290947028437),\n", + " ('paced', 0.67314651385327573),\n", + " ('le', 0.67067432470788668),\n", + " ('master', 0.67015766233524654),\n", + " ('h', 0.6696166831497512),\n", + " ('rings', 0.66904962898088483),\n", + " ('easy', 0.66895995494594152),\n", + " ('city', 0.66820823221269321),\n", + " ('sunshine', 0.66782937257565544),\n", + " ('succeeds', 0.66647893347778397),\n", + " ('relations', 0.664159643686693),\n", + " ('england', 0.66387679825983203),\n", + " ('glimpse', 0.66329421741026418),\n", + " ('aired', 0.66268797307523675),\n", + " ('sees', 0.66263163663399482),\n", + " ('both', 0.66248336767382998),\n", + " ('definitely', 0.66199789483898808),\n", + " ('imaginative', 0.66139848224536502),\n", + " ('appreciate', 0.66083893732728749),\n", + " ('tricks', 0.66071190480679143),\n", + " ('striking', 0.66071190480679143),\n", + " ('carefully', 0.65999497324304479),\n", + " ('complicated', 0.65981076029235353),\n", + " ('perspective', 0.65962448852130173),\n", + " ('trilogy', 0.65877953705573755),\n", + " ('future', 0.65834665141052828),\n", + " ('lion', 0.65742909795786608),\n", + " ('victor', 0.65540685257709819),\n", + " ('douglas', 0.65540685257709819),\n", + " ('inspired', 0.65459851044271034),\n", + " ('marriage', 0.65392646740666405),\n", + " ('demands', 0.65392646740666405),\n", + " ('father', 0.65172321672194655),\n", + " ('page', 0.65123628494430852),\n", + " ('instant', 0.65058756614114943),\n", + " ('era', 0.6495567444850836),\n", + " ('ruthless', 0.64934455790155243),\n", + " ('saga', 0.64934455790155243),\n", + " ('joan', 0.64891392558311978),\n", + " ('joseph', 0.64841128671855386),\n", + " ('workers', 0.64829661439459352),\n", + " ('fantasy', 
0.64726757480925168),\n", + " ('accomplished', 0.64551913157069074),\n", + " ('distant', 0.64551913157069074),\n", + " ('manhattan', 0.64435701639051324),\n", + " ('personal', 0.64355023942057321),\n", + " ('pushing', 0.64313675998528386),\n", + " ('meeting', 0.64313675998528386),\n", + " ('individual', 0.64313675998528386),\n", + " ('pleasant', 0.64250344774119039),\n", + " ('brave', 0.64185388617239469),\n", + " ('william', 0.64083139119578469),\n", + " ('hudson', 0.64077919504262937),\n", + " ('friendly', 0.63949446706762514),\n", + " ('eccentric', 0.63907995928966954),\n", + " ('awards', 0.63875310849414646),\n", + " ('jack', 0.63838309514997038),\n", + " ('seeking', 0.63808740337691783),\n", + " ('colonel', 0.63757732940513456),\n", + " ('divorce', 0.63757732940513456),\n", + " ('jane', 0.63443957973316734),\n", + " ('keeping', 0.63414883979798953),\n", + " ('gives', 0.63383568159497883),\n", + " ('ted', 0.63342794585832296),\n", + " ('animation', 0.63208692379869902),\n", + " ('progress', 0.6317782341836532),\n", + " ('concert', 0.63127177684185776),\n", + " ('larger', 0.63127177684185776),\n", + " ('nation', 0.6296337748376194),\n", + " ('albeit', 0.62739580299716491),\n", + " ('adapted', 0.62613647027698516),\n", + " ('discovers', 0.62542900650499444),\n", + " ('classic', 0.62504956428050518),\n", + " ('segment', 0.62335141862440335),\n", + " ('morgan', 0.62303761437291871),\n", + " ('mouse', 0.62294292188669675),\n", + " ('impressive', 0.62211140744319349),\n", + " ('artist', 0.62168821657780038),\n", + " ('ultimate', 0.62168821657780038),\n", + " ('griffith', 0.62117368093485603),\n", + " ('emily', 0.62082651898031915),\n", + " ('drew', 0.62082651898031915),\n", + " ('moved', 0.6197197120051281),\n", + " ('profound', 0.61903920840622351),\n", + " ('families', 0.61903920840622351),\n", + " ('innocent', 0.61851219917136446),\n", + " ('versions', 0.61730910416844087),\n", + " ('eddie', 0.61691981517206107),\n", + " ('criticism', 0.61651395453902935),\n", + " 
('nature', 0.61594514653194088),\n", + " ('recognized', 0.61518563909023349),\n", + " ('sexuality', 0.61467556511845012),\n", + " ('contract', 0.61400986000122149),\n", + " ('brian', 0.61344043794920278),\n", + " ('remembered', 0.6131044728864089),\n", + " ('determined', 0.6123858239154869),\n", + " ('offers', 0.61207935747116349),\n", + " ('pleasure', 0.61195702582993206),\n", + " ('washington', 0.61180154110599294),\n", + " ('images', 0.61159731359583758),\n", + " ('games', 0.61067095873570676),\n", + " ('academy', 0.60872983874736208),\n", + " ('fashioned', 0.60798937221963845),\n", + " ('melodrama', 0.60749173598145145),\n", + " ('peoples', 0.60613580357031549),\n", + " ('charismatic', 0.60613580357031549),\n", + " ('rough', 0.60613580357031549),\n", + " ('dealing', 0.60517840761398811),\n", + " ('fine', 0.60496962268013299),\n", + " ('tap', 0.60391604683200273),\n", + " ('trio', 0.60157998703445481),\n", + " ('russell', 0.60120968523425966),\n", + " ('figures', 0.60077386042893011),\n", + " ('ward', 0.60005675749393339),\n", + " ('shine', 0.59911823091166894),\n", + " ('brady', 0.59911823091166894),\n", + " ('job', 0.59845562125168661),\n", + " ('satisfied', 0.59652034487087369),\n", + " ('river', 0.59637962862495086),\n", + " ('brown', 0.595773016534769),\n", + " ('believable', 0.59566072133302495),\n", + " ('bound', 0.59470710774669278),\n", + " ('always', 0.59470710774669278),\n", + " ('hall', 0.5933967777928858),\n", + " ('cook', 0.5916777203950857),\n", + " ('claire', 0.59136448625000293),\n", + " ('broadway', 0.59033768669372433),\n", + " ('anna', 0.58778666490211906),\n", + " ('peace', 0.58628403501758408),\n", + " ('visually', 0.58539431926349916),\n", + " ('falk', 0.58525821854876026),\n", + " ('morality', 0.58525821854876026),\n", + " ('growing', 0.58466653756587539),\n", + " ('experiences', 0.58314628534561685),\n", + " ('stood', 0.58314628534561685),\n", + " ('touch', 0.58122926435596001),\n", + " ('lives', 0.5810976767513224),\n", + " ('kubrick', 
0.58066919713325493),\n", + " ('timing', 0.58047401805583243),\n", + " ('struggles', 0.57981849525294216),\n", + " ('expressions', 0.57981849525294216),\n", + " ('authentic', 0.57848427223980559),\n", + " ('helen', 0.57763429343810091),\n", + " ('pre', 0.57700753064729182),\n", + " ('quirky', 0.5753641449035618),\n", + " ('young', 0.57531672344534313),\n", + " ('inner', 0.57454143815209846),\n", + " ('mexico', 0.57443087372056334),\n", + " ('clint', 0.57380042292737909),\n", + " ('sisters', 0.57286101468544337),\n", + " ('realism', 0.57226528899949558),\n", + " ('personalities', 0.5720692490067093),\n", + " ('french', 0.5720692490067093),\n", + " ('surprises', 0.57113222999698177),\n", + " ('adventures', 0.57113222999698177),\n", + " ('overcome', 0.5697681593994407),\n", + " ('timothy', 0.56953322459276867),\n", + " ('tales', 0.56909453188996639),\n", + " ('war', 0.56843317302781682),\n", + " ('civil', 0.5679840376059393),\n", + " ('countries', 0.56737779327091187),\n", + " ('streep', 0.56710645966458029),\n", + " ('tradition', 0.56685345523565323),\n", + " ('oliver', 0.56673325570428668),\n", + " ('australia', 0.56580775818334383),\n", + " ('understanding', 0.56531380905006046),\n", + " ('players', 0.56509525370004821),\n", + " ('knowing', 0.56489284503626647),\n", + " ('rogers', 0.56421349718405212),\n", + " ('suspenseful', 0.56368911332305849),\n", + " ('variety', 0.56368911332305849),\n", + " ('true', 0.56281525180810066),\n", + " ('jr', 0.56220982311246936),\n", + " ('psychological', 0.56108745854687891),\n", + " ('branagh', 0.55961578793542266),\n", + " ('wealth', 0.55961578793542266),\n", + " ('performing', 0.55961578793542266),\n", + " ('odds', 0.55961578793542266),\n", + " ('sent', 0.55961578793542266),\n", + " ('reminiscent', 0.55961578793542266),\n", + " ('grand', 0.55961578793542266),\n", + " ('overwhelming', 0.55961578793542266),\n", + " ('brothers', 0.55891181043362848),\n", + " ('howard', 0.55811089675600245),\n", + " ('david', 
0.55693122256475369),\n", + " ('generation', 0.55628799784274796),\n", + " ('grow', 0.55612538299565417),\n", + " ('survival', 0.55594605904646033),\n", + " ('mainstream', 0.55574731115750231),\n", + " ('dick', 0.55431073570572953),\n", + " ('charm', 0.55288175575407861),\n", + " ('kirk', 0.55278982286502287),\n", + " ('twists', 0.55244729845681018),\n", + " ('gangster', 0.55206858230003986),\n", + " ('jeff', 0.55179306225421365),\n", + " ('family', 0.55116244510065526),\n", + " ('tend', 0.55053307336110335),\n", + " ('thanks', 0.55049088015842218),\n", + " ('world', 0.54744234723432639),\n", + " ('sutherland', 0.54743536937855164),\n", + " ('life', 0.54695514434959924),\n", + " ('disc', 0.54654370636806993),\n", + " ('bug', 0.54654370636806993),\n", + " ('tribute', 0.5455111817538808),\n", + " ('europe', 0.54522705048332309),\n", + " ('sacrifice', 0.54430155296238014),\n", + " ('color', 0.54405127139431109),\n", + " ('superior', 0.54333490233128523),\n", + " ('york', 0.54318235866536513),\n", + " ('pulls', 0.54266622962164945),\n", + " ('hearts', 0.54232429082536171),\n", + " ('jackson', 0.54232429082536171),\n", + " ('enjoy', 0.54124285135906114),\n", + " ('redemption', 0.54056759296472823),\n", + " ('madness', 0.540384426007535),\n", + " ('hamilton', 0.5389965007326869),\n", + " ('stands', 0.5389965007326869),\n", + " ('trial', 0.5389965007326869),\n", + " ('greek', 0.5389965007326869),\n", + " ('each', 0.5388212312554177),\n", + " ('faithful', 0.53773307668591508),\n", + " ('received', 0.5372768098531604),\n", + " ('jealous', 0.53714293208336406),\n", + " ('documentaries', 0.53714293208336406),\n", + " ('different', 0.53709860682460819),\n", + " ('describes', 0.53680111016925136),\n", + " ('shorts', 0.53596159703753288),\n", + " ('brilliance', 0.53551823635636209),\n", + " ('mountains', 0.53492317534505118),\n", + " ('share', 0.53408248593025787),\n", + " ('dealt', 0.53408248593025787),\n", + " ('providing', 0.53329847961804933),\n", + " ('explore', 
0.53329847961804933),\n", + " ('series', 0.5325809226575603),\n", + " ('fellow', 0.5323318289869543),\n", + " ('loves', 0.53062825106217038),\n", + " ('olivier', 0.53062825106217038),\n", + " ('revolution', 0.53062825106217038),\n", + " ('roman', 0.53062825106217038),\n", + " ('century', 0.53002783074992665),\n", + " ('musical', 0.52966871156747064),\n", + " ('heroic', 0.52925932545482868),\n", + " ('ironically', 0.52806743020049673),\n", + " ('approach', 0.52806743020049673),\n", + " ('temple', 0.52806743020049673),\n", + " ('moves', 0.5279372642387119),\n", + " ('gift', 0.52702030968597136),\n", + " ('julie', 0.52609309589677911),\n", + " ('tells', 0.52415107836314001),\n", + " ('radio', 0.52394671172868779),\n", + " ('uncle', 0.52354439617376536),\n", + " ('union', 0.52324814376454787),\n", + " ('deep', 0.52309571635780505),\n", + " ('reminds', 0.52157841554225237),\n", + " ('famous', 0.52118841080153722),\n", + " ('jazz', 0.52053443789295151),\n", + " ('dennis', 0.51987545928590861),\n", + " ('epic', 0.51919387343650736),\n", + " ('adult', 0.519167695083386),\n", + " ('shows', 0.51915322220375304),\n", + " ('performed', 0.5191244265806858),\n", + " ('demons', 0.5191244265806858),\n", + " ('eric', 0.51879379341516751),\n", + " ('discovered', 0.51879379341516751),\n", + " ('youth', 0.5185626062681431),\n", + " ('human', 0.51851411224987087),\n", + " ('tarzan', 0.51813827061227724),\n", + " ('ourselves', 0.51794309153485463),\n", + " ('wwii', 0.51758240622887042),\n", + " ('passion', 0.5162164724008671),\n", + " ('desire', 0.51607497965213445),\n", + " ('pays', 0.51581316527702981),\n", + " ('fox', 0.51557622652458857),\n", + " ('dirty', 0.51557622652458857),\n", + " ('symbolism', 0.51546600332249293),\n", + " ('sympathetic', 0.51546600332249293),\n", + " ('attitude', 0.51530993621331933),\n", + " ('appearances', 0.51466440007315639),\n", + " ('jeremy', 0.51466440007315639),\n", + " ('fun', 0.51439068993048687),\n", + " ('south', 0.51420972175023116),\n", + " 
('arrives', 0.51409894911095988),\n", + " ('present', 0.51341965894303732),\n", + " ('com', 0.51326167856387173),\n", + " ('smile', 0.51265880484765169),\n", + " ('fits', 0.51082562376599072),\n", + " ('provided', 0.51082562376599072),\n", + " ('carter', 0.51082562376599072),\n", + " ('ring', 0.51082562376599072),\n", + " ('aging', 0.51082562376599072),\n", + " ('countryside', 0.51082562376599072),\n", + " ('alan', 0.51082562376599072),\n", + " ('visit', 0.51082562376599072),\n", + " ('begins', 0.51015650363396647),\n", + " ('success', 0.50900578704900468),\n", + " ('japan', 0.50900578704900468),\n", + " ('accurate', 0.50895471583017893),\n", + " ('proud', 0.50800474742434931),\n", + " ('daily', 0.5075946031845443),\n", + " ('atmospheric', 0.50724780241810674),\n", + " ('karloff', 0.50724780241810674),\n", + " ('recently', 0.50714914903668207),\n", + " ('fu', 0.50704490092608467),\n", + " ('horrors', 0.50656122497953315),\n", + " ('finding', 0.50637127341661037),\n", + " ('lust', 0.5059356384717989),\n", + " ('hitchcock', 0.50574947073413001),\n", + " ('among', 0.50334004951332734),\n", + " ('viewing', 0.50302139827440906),\n", + " ('shining', 0.50262885656181222),\n", + " ('investigation', 0.50262885656181222),\n", + " ('duo', 0.5020919437972361),\n", + " ('cameron', 0.5020919437972361),\n", + " ('finds', 0.50128303100539795),\n", + " ('contemporary', 0.50077528791248915),\n", + " ('genuine', 0.50046283673044401),\n", + " ('frightening', 0.49995595152908684),\n", + " ('plays', 0.49975983848890226),\n", + " ('age', 0.49941323171424595),\n", + " ('position', 0.49899116611898781),\n", + " ('continues', 0.49863035067217237),\n", + " ('roles', 0.49839716550752178),\n", + " ('james', 0.49837216269470402),\n", + " ('individuals', 0.49824684155913052),\n", + " ('brought', 0.49783842823917956),\n", + " ('hilarious', 0.49714551986191058),\n", + " ('brutal', 0.49681488669639234),\n", + " ('appropriate', 0.49643688631389105),\n", + " ('dance', 0.49581998314812048),\n", + " 
('league', 0.49578774640145024),\n", + " ('helping', 0.49578774640145024),\n", + " ('answers', 0.49578774640145024),\n", + " ('stunts', 0.49561620510246196),\n", + " ('traveling', 0.49532143723002542),\n", + " ('thoroughly', 0.49414593456733524),\n", + " ('depicted', 0.49317068852726992),\n", + " ('honor', 0.49247648509779424),\n", + " ('combination', 0.49247648509779424),\n", + " ('differences', 0.49247648509779424),\n", + " ('fully', 0.49213349075383811),\n", + " ('tracy', 0.49159426183810306),\n", + " ('battles', 0.49140753790888908),\n", + " ('possibility', 0.49112055268665822),\n", + " ('romance', 0.4901589869574316),\n", + " ('initially', 0.49002249613622745),\n", + " ('happy', 0.4898997500608791),\n", + " ('crime', 0.48977221456815834),\n", + " ('singing', 0.4893852925281213),\n", + " ('especially', 0.48901267837860624),\n", + " ('shakespeare', 0.48754793889664511),\n", + " ('hugh', 0.48729512635579658),\n", + " ('detail', 0.48609484250827351),\n", + " ('guide', 0.48550781578170082),\n", + " ('companion', 0.48550781578170082),\n", + " ('julia', 0.48550781578170082),\n", + " ('san', 0.48550781578170082),\n", + " ('desperation', 0.48550781578170082),\n", + " ('strongly', 0.48460242866688824),\n", + " ('necessary', 0.48302334245403883),\n", + " ('humanity', 0.48265474679929443),\n", + " ('drama', 0.48221998493060503),\n", + " ('warming', 0.48183808689273838),\n", + " ('intrigue', 0.48183808689273838),\n", + " ('nonetheless', 0.48183808689273838),\n", + " ('cuba', 0.48183808689273838),\n", + " ('planned', 0.47957308026188628),\n", + " ('pictures', 0.47929937011921681),\n", + " ('broadcast', 0.47849024312305422),\n", + " ('nine', 0.47803580094299974),\n", + " ('settings', 0.47743860773325364),\n", + " ('history', 0.47732966933780852),\n", + " ('ordinary', 0.47725880012690741),\n", + " ('trade', 0.47692407209030935),\n", + " ('primary', 0.47608267532211779),\n", + " ('official', 0.47608267532211779),\n", + " ('episode', 0.47529620261150429),\n", + " ('role', 
0.47520268270188676),\n", + " ('spirit', 0.47477690799839323),\n", + " ('grey', 0.47409361449726067),\n", + " ('ways', 0.47323464982718205),\n", + " ('cup', 0.47260441094579297),\n", + " ('piano', 0.47260441094579297),\n", + " ('familiar', 0.47241617565111949),\n", + " ('sinister', 0.47198579044972683),\n", + " ('reveal', 0.47171449364936496),\n", + " ('max', 0.47150852042515579),\n", + " ('dated', 0.47121648567094482),\n", + " ('discovery', 0.47000362924573563),\n", + " ('vicious', 0.47000362924573563),\n", + " ('losing', 0.47000362924573563),\n", + " ('genuinely', 0.46871413841586385),\n", + " ('hatred', 0.46734051182625186),\n", + " ('mistaken', 0.46702300110759781),\n", + " ('dream', 0.46608972992459924),\n", + " ('challenge', 0.46608972992459924),\n", + " ('crisis', 0.46575733836428446),\n", + " ('photographed', 0.46488852857896512),\n", + " ('machines', 0.46430560813109778),\n", + " ('critics', 0.46430560813109778),\n", + " ('bird', 0.46430560813109778),\n", + " ('born', 0.46411383518967209),\n", + " ('detective', 0.4636633473511525),\n", + " ('higher', 0.46328467899699055),\n", + " ('remains', 0.46262352194811296),\n", + " ('inevitable', 0.46262352194811296),\n", + " ('soviet', 0.4618180446592961),\n", + " ('ryan', 0.46134556650262099),\n", + " ('african', 0.46112595521371813),\n", + " ('smaller', 0.46081520319132935),\n", + " ('techniques', 0.46052488529119184),\n", + " ('information', 0.46034171833399862),\n", + " ('deserved', 0.45999798712841444),\n", + " ('cynical', 0.45953232937844013),\n", + " ('lynch', 0.45953232937844013),\n", + " ('francisco', 0.45953232937844013),\n", + " ('tour', 0.45953232937844013),\n", + " ('spielberg', 0.45953232937844013),\n", + " ('struggle', 0.45911782160048453),\n", + " ('language', 0.45902121257712653),\n", + " ('visual', 0.45823514408822852),\n", + " ('warner', 0.45724137763188427),\n", + " ('social', 0.45720078250735313),\n", + " ('reality', 0.45719346885019546),\n", + " ('hidden', 0.45675840249571492),\n", + " 
('breaking', 0.45601738727099561),\n", + " ('sometimes', 0.45563021171182794),\n", + " ('modern', 0.45500247579345005),\n", + " ('surfing', 0.45425527227759638),\n", + " ('popular', 0.45410691533051023),\n", + " ('surprised', 0.4534409399850382),\n", + " ('follows', 0.45245361754408348),\n", + " ('keeps', 0.45234869400701483),\n", + " ('john', 0.4520909494482197),\n", + " ('defeat', 0.45198512374305722),\n", + " ('mixed', 0.45198512374305722),\n", + " ('justice', 0.45142724367280018),\n", + " ('treasure', 0.45083371313801535),\n", + " ('presents', 0.44973793178615257),\n", + " ('years', 0.44919197032104968),\n", + " ('chief', 0.44895022004790319),\n", + " ('shadows', 0.44802472252696035),\n", + " ('closely', 0.44701411102103689),\n", + " ('segments', 0.44701411102103689),\n", + " ('lose', 0.44658335503763702),\n", + " ('caine', 0.44628710262841953),\n", + " ('caught', 0.44610275383999071),\n", + " ('hamlet', 0.44558510189758965),\n", + " ('chinese', 0.44507424620321018),\n", + " ('welcome', 0.44438052435783792),\n", + " ('birth', 0.44368632092836219),\n", + " ('represents', 0.44320543609101143),\n", + " ('puts', 0.44279106572085081),\n", + " ('fame', 0.44183275227903923),\n", + " ('closer', 0.44183275227903923),\n", + " ('visuals', 0.44183275227903923),\n", + " ('web', 0.44183275227903923),\n", + " ('criminal', 0.4412745608048752),\n", + " ('minor', 0.4409224199448939),\n", + " ('jon', 0.44086703515908027),\n", + " ('liked', 0.44074991514020723),\n", + " ('restaurant', 0.44031183943833246),\n", + " ('flaws', 0.43983275161237217),\n", + " ('de', 0.43983275161237217),\n", + " ('searching', 0.4393666597838457),\n", + " ('rap', 0.43891304217570443),\n", + " ('light', 0.43884433018199892),\n", + " ('elizabeth', 0.43872232986464677),\n", + " ('marry', 0.43861731542506488),\n", + " ('oz', 0.43825493093115531),\n", + " ('controversial', 0.43825493093115531),\n", + " ('learned', 0.43825493093115531),\n", + " ('slowly', 0.43785660389939979),\n", + " ('bridge', 
0.43721380642274466),\n", + " ('thrilling', 0.43721380642274466),\n", + " ('wayne', 0.43721380642274466),\n", + " ('comedic', 0.43721380642274466),\n", + " ('married', 0.43658501682196887),\n", + " ('nazi', 0.4361020775700542),\n", + " ('murder', 0.4353180712578455),\n", + " ('physical', 0.4353180712578455),\n", + " ('johnny', 0.43483971678806865),\n", + " ('michelle', 0.43445264498141672),\n", + " ('wallace', 0.43403848055222038),\n", + " ('silent', 0.43395706390247063),\n", + " ('comedies', 0.43395706390247063),\n", + " ('played', 0.43387244114515305),\n", + " ('international', 0.43363598507486073),\n", + " ('vision', 0.43286408229627887),\n", + " ('intelligent', 0.43196704885367099),\n", + " ('shop', 0.43078291609245434),\n", + " ('also', 0.43036720209769169),\n", + " ('levels', 0.4302451371066513),\n", + " ('miss', 0.43006426712153217),\n", + " ('ocean', 0.4295626596872249),\n", + " ...]" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"POSITIVE\" label\n", + "pos_neg_ratios.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('boll', -4.0778152602708904),\n", + " ('uwe', -3.9218753018711578),\n", + " ('seagal', -3.3202501058581921),\n", + " ('unwatchable', -3.0269848170580955),\n", + " ('stinker', -2.9876839403711624),\n", + " ('mst', -2.7753833211707968),\n", + " ('incoherent', -2.7641396677532537),\n", + " ('unfunny', -2.5545257844967644),\n", + " ('waste', -2.4907515123361046),\n", + " ('blah', -2.4475792789485005),\n", + " ('horrid', -2.3715779644809971),\n", + " ('pointless', -2.3451073877136341),\n", + " ('atrocious', -2.3187369339642556),\n", + " ('redeeming', -2.2667790015910296),\n", + " ('prom', -2.2601040980178784),\n", + " ('drivel', -2.2476029585766928),\n", + " ('lousy', -2.2118080125207054),\n", + 
" ('worst', -2.1930856334332267),\n", + " ('laughable', -2.172468615469592),\n", + " ('awful', -2.1385076866397488),\n", + " ('poorly', -2.1326133844207011),\n", + " ('wasting', -2.1178155545614512),\n", + " ('remotely', -2.111046881095167),\n", + " ('existent', -2.0024805005437076),\n", + " ('boredom', -1.9241486572738005),\n", + " ('miserably', -1.9216610938019989),\n", + " ('sucks', -1.9166645809588516),\n", + " ('uninspired', -1.9131499212248517),\n", + " ('lame', -1.9117232884159072),\n", + " ('insult', -1.9085323769376259)]" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"NEGATIVE\" label\n", + "list(reversed(pos_neg_ratios.most_common()))[0:30]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Transforming Text into Numbers" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"<base64-encoded PNG figure data omitted>"
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
ITBiD8G9077fkJiJ5O56gR2HhXNgC4ltRSO/m48J5w7/z6hcuL+53Ezo7TveizvMGkFXy5M1S\nJ29wWB6cv0jZUzJxOjq/keXLl1v/kbRvlnHl6vzgIJDnfydP3jwI77XXXp3s3aYCOokSHhxxxBGd\nlElfaCEjzn+PzFE+ZlHnSEvAQCcXXnjhMP8LXrIdSXFpivomAJoTLCG+fv60DGkOOuggl9RgPUGv\nrJJmmixNHbmJSJaOdwoSzc2Zl7gRcKRBYJXOzOTSQkT8myXKx4JpDRcxDmchJI9+ru4k30V2dpL6\n8qZhXjYc8S9vmf4/Zdqy8uRNW5dLj+UBX4x+Tsm4ut238wc577zzrC6OGLnr+hYCvRDI87+TJ28v\nvbpdf9/73te5/OMf/7hznPfAd+TEZ8ONA5TLy60fNZWoq05cBFF+48fn5+PY+fa59FHfLKZwgzzT\nzZ/61KeikkWe86f2IxOETrrpGXfa6Rc1LYP/iHshZ2xFL//Zzzjsj53hKKtEq3biynG/C/sOzDK5\nhKU+gTKdj7+cNWj0sNgSQSM6dYWXDJEvvO6a374EJshOPdTpL/UML99lyRKSVT90de3y20SZcdf8\n8/7yXadHcJN0ykQvJ36+cJtJQ/1OFzDwxZ3nO6ynny587JZlhs+n/e0v7UIH+oG+TSqk9dtHGZSZ\nVVgemWZZcvDgGPrGN76Rtbpc+aKW87Jcl/P9FnAAO+4Lt0QXHMMfAqGRJvBRqETPfuNSdH0O37zl\n1u3/jvs2sOYlbpb/P8+zMiz+czvueeA/+xhrnPjPUz9N+Nh/BpM/fL3bb1dfeNzplsevD12j0ro0\nfvtJFyU+hq4sfzmvnydcnksf/ga7sPjjlj/mhtPl+R3dwpQlZul4n1TQUDd4+f9gYVCS3izhzsii\nn58nPMDHXcva2X55eYiIu6nczdytG4t6IEb9M6AHDxf6kutRH66Rxunsf4fx7taO8DUCbKXZYI3B\nl4G/34N/N8LRL32oB7yIawH+fDuiAS5RH9Jz70BQyJMnIFq47wbhN5imIcpxmNTt/y5tu8LPcvf8\nd+31n6VpiQhlxz1b3HMm6hnj1+nSue8w4XBEhG9/oHbp+WaMYyxy5yjDlygd3bM7rIufzx2DmSvb\nfXcjCnH3jMvLOOTaFVdHuJ9curzfhRCRtB2PtcI1nm//puh2jVopbAYAACaLSURBVMaGO8gvh2M6\nNwxWWv2oxycHvn69rmXpbL+utESk282MrnGS9sERVw5YR+kQ7pekv6P6L67uqPNpIjz6Az549Euo\nCwtEN0E3yEoZgjXDEQmIB7976ROnB21xAdEgJRCVrGXF1dGm8/QpOOWVuv3fpX0BoP3+cyM8gPrP\n+bRExGHLs9ivg2cQ5CDqGevycI363POKZzO6cN6d49sfYxizfMLh8lBmHIHhWjgf5VIX4ref83Hi\nt89/oY9LT51hndA3PMa5/PSLa3dcP7i0eb7jW5ih1KQd74MHCGEJW0vCLA0w/TQA1Q1MV35S/UhP\nea4Dwp3U7Rp503a2X17UPwn1O11oty/dbmY/XfiYgS6NKTWc3/+NDn6fOl3TflMGZeURBlgG1iQS\nJh/h30nKSJPGTX8kzZM2fa9yaR+DYFmEwREc7iusJpJoBPi/KIKs1en/DkILGUkj/mAbfq6lKacf\naf1nMAP+oIhPsBxJKqPthRKRMhRUmeUhwIBU5Fs3/6w+qUpKRMgTJntZW530IR9FOnwLSdb64/Ll\nsXCga56Bi7oJvw1BSDtYxLWn23n0hRByf0Xh3C3vIFxLQ5aT4FGH/7ssfY1VwU1rJHmbT4JF1jT+\nixTPI//ll5dDpyfPF9IOgtA/7hkOJmXKKAoPKpMMIAJXXHGFjXzKio0iBe/05557zgZ5Y437L37x\ni2HF/8Vf/IUhWixLnj/ykY8MW/I2LGHKHxs2bLDr93utBHDxQ4KBe
UQNxPLgfJEhy6MCl42ouMeJ\nrGWAybnnnmumT59u5s+fX2i7eqhsWJIcvOkaljWyZYPk9wjcfPPNNvzAjTfeWCgkVf3f8f9EAL6A\n8KZuDytSXETSgFCVGnujm3K+Ht3ScS2wDGSKl9Sr3Lpd9zEpvc1lshyVXW8E2KiqqA24qm4p0wKz\nZ8+2/gq9dOn1lt7req/y/etYnPJYM/yy0lpV8N3ACpJ0qsqvq6hjdOYe41MUDkXpVkU5YMAqrdGj\nR7cGjyz+IT72zhpR9lu3X2fUse8bEpCCjjXAP677FFJUu7Kc861V/bAAaWomSy+1JA8PRQYqBoum\nC235q7/6K/uQh0jEkYm48377KauIKSvqoqwihfKStIE5ewb/ItpRhP7oU/RUYBF69aMM+oA+4+P6\nAyyqJIhFtjtvW/B1cYN9UVO0WduHH4TvF+H0wsEzyn8vaz11z0c/0HampPxpqrL01tRMgPYgC9Mz\nSNFm4n5j+vDDD5tbbrllWKRDoqU6IQCQm26JmpJx6dx3t+kblybu2wUpK3O/mG6b5tGnRHRcu3Zt\np81xuvbzPHqxWRtTZ20OY8+9409TRG1uyLTVI488MmxH8X72RVF1cR9OnTp1WHuLKlvlDA4CIiKD\n09eRLXXzu8GbWq0GrUhlu5w89NBDrQ/EaaedFpkKchBMRdldKknAXjO9CEmWsO/gSV39GGij/Ebq\nSkJcpzgy0vT7zbXHffukN8m9xT0CQSFfr/vQ1VHH79NPP90cffTR5tJLL62jetKpIQiIiDSko8pU\nk8GBEL9NfZhgDbnmmmvM5s2bY2EKk4okb60UFs4XW0FwIYoYdEtfxDWf+OAEedddd5mNGzfWmlTW\nnSwl6Rf6Opgm6yTNYv2iv9gjhNDgTRT+N7CGtI1UNrEvmq6ziEjTe7AA/RnMeIvDnNy0tzPeLI86\n6iizcOFCc/zxx0eiQfuQbm2LG1gon/y9LBw8lKNM8JEKFXwSHbH2/PM//7N5+umne+pacPWZimP3\n0MCHpTGracCYAddJEX1NmZSzZs0au+rEld2Ub1lDmtJT9ddTRKT+fdQXDdnxeMuWLY17O0uidxqr\nhgObPE7+93//13z0ox+NtDK4ASrLG7ErP++3G9BuvfVWEzc1lbeOovND7sDs3nvvjSWQRdeZtjz/\nHsDHqBcZTVs+6fEVWbRoUe2tWOG2Ob27WSHDefRbCMQhICISh8yAnWcww7IwZ84cU3RckbKgZKDA\nNMx3nLWDa3lJAthE+ZcwmHKtjAEqDWZMdbCV+tKlS9Nkqzytm1Kry1QS/ek7mea9b5ICjGUhWHnS\nGOsQ1kMsWk215CTtF6XrHwIiIv3DuvY18YDBVIwJuurBtRdYSawADCxIHEnpVYd/nfooD1z4JmDb\nu9/9bhPEg6hsSgb93KDQjYz57ajbcdXmfXBzksTJ1KUt8pv7CdLTBIsW/wdTpkyxLwBN9Skrsu9U\nVjEIiIgUg2NrSuEt9ZJLLqn1Ekv3MAwCIHVddlyENcTvWEdseGv2fQTi/Ev8vGUdVz2Q520XfdRP\nh8cq+6obVi4CLkt6ua/rKjNnzrTWt6Y62NYV10HXS0Rk0O+AiPbXeVWDIyF77rlnV3+WokkIMFE3\nUzS9pq78t+yyfAvQx1lDmr5qoUwyRZ8V7WQK9mXIv//7v9upUVbS1NEiWefnQhn9oTL7h4CISP+w\nblRN7qGzePHi2jwUHQkByG7BupzloogpGddplEn9DBBpSE54ICzS/E8fsV9P0/dxcVYR3z/D4Z7l\nu19EMItucXnc/bV169ZaWiTd86Db/11c23ReCPRCQESkF0IDfJ2HT10iYfKg/vSnP232228/u8rA\nRUmN6p40RCEqf/gclgfqc8QGcoE+Wd5ayecPuP4UT7jebr/Rgby01enVLX3drxGQrtsS7G76hzHt\nl5NpN53SXEN/xPWjmx6tg88I9
xk+IYhIiIVBf0pA4E//XyAllKsiW4BAsNmRHfhZTbPvvvsaBosq\n5MEHH7R+BGeccYYdrN75znfGqlEGCWGA2GuvvTp1Uj9Levnupksng3cAocEq4j7btm0zP/7xjy2x\n+fWvfz2sHi/biMNvf/vbxr09j7jYwBPvete7zLPPPmu453oJg+Mrr7xiMWMQZ/oLUuYw7ZW/TtfD\nJATdDjzwQPu/NmvWLPPb3/7WfPzjH69EZf6XWA7O0vXbb789cvl6JYqp0tYhIItI67q0+AZhETjz\nzDPN/vvvb4gG6d7ciq9peIkMOCwnfuyxx6wVhCWO3awQUQ/14SWm+8WDuJvFomjSQ3t9f4Zu0zhY\nq5ocDTfcE/QdlgzfWuSn8Z1My/S78ess+7jX/cp1rIDIggULci9DT9oe7sOvfvWrlnxcd911PX2i\nkpardEIgFoGydtNTue1CgF1fb7rpJrsjIzupBgNGaQ10dQWEZ2jGjBmdHWw5HwzUsfUm2ZU2NrN3\ngXqSlpU0nVd84kMwpnz3QS8n7HjaDQuXrknffptcH0S1vUltitOVvk36P8T/Hf8LZf/foWvgjG3r\nmjZtWmL94tqo80IgKQImaUKlEwIgwMOTB2LAbO13kYMhZbuHLg/CqEHeDVDh3ohKG06T5Dc6pGkT\n6fn0Q9CLdr700ksW/37U2c86zj333KGrrrrKtjFNH/RTxyLqynLPcN/7/3dF3e+0h7L5v4MI8lm/\nfn0RzVQZQiAxAn8SayrRBSEQgQDTMjfeeGPHhE6ERT5M2TBVkVYwuRMumjKYiti+fbuN2Eicgiin\nQ3wsOE9dmJARTNjkzSvognSb/gnXAR7o4XQJXy/yN3rR9t/97ncmIGpFFl2Lsg4//HDzZ3/2Z7aN\nafqgFsonVKLXdExcMdz37v+OKTn8R/DZYouDLP936IFTLHFBmOrC52bJkiV248i4PZvidNN5IZAX\nAfmI5EVQ+Q3BmPDjCN6k7H41DJJjx461PgzAwxJTHnY7duywaO3atcume/755+3vyZMnm1NOOcVM\nmjQplUMcxIEHdPCGGUlabOEJ//Aw7+YP0qsY8kcRp175slyHuL366qtdg7llKbfqPGCIL0Rbg2Vl\nJSFx/cL/HRF+2eiQD5sIsqpswoQJnSzjx483r7/+uv0d9393xBFH9M3vq6OYDoSAh4CIiAeGDvMj\ngGUgMKvbFR08+BCsHOyF4j8gJ06caK0YWBTyyDe/+U3zvve9L5UVw6/P6ZuXRFAOA00/3uSxPiFt\nC7HdZiJSNAnx72GO3X3MSqpu/3cQk7333rsv92lYR/0WAnEIiIjEIaPztUfAPdyxikB+0pIJ8vMA\nL4o8YKGBWKFPmdJWIkJfYDkLJpbLhK/vZbv7NC/p7rviqlAI9AkB+Yj0CWhVUzwCTMm4gR8Swhs1\ng1kSyeIP0qtcCA2ESJINgbIJXDat8uVy95lISD4clbvdCIiItLt/W9u6KJ8MyAhvn+4NNK7x5GVg\nKGNwcIQorm6dHxwEnIWsjPtscFBUSwcBARGRQejllrURohG3SsZNs7g3Ub/pWEscgSnz7RvdepEh\nXy8d/x4BMGvLoO1ISJn3me4bIdAWBERE2tKTA9QONyUT12Rn7YB0OGGQ45PWj8TlT/NN/ZCepNNE\nacpuc1r6FSfmpotISNN7UPr3G4F39LtC1ddeBBh48ZFgWa5bKhjVWre0Fw9+9tVI8xbsLBpR5frn\neBN10yR/+qd/av76r/+6MKdUv564YywzSXWNKyPuPMuhN27cGHe5seeDwFqN1d0pLhLikNC3EEiO\ngIhIcqyUMgIBrAxPPvmkDUrmYhkcdthhNobI3LlzI3IYu7SXJb2LFy82q1atMkE0Rxugi03t3NRK\nVEbqipuSiUrPOVZh/OY3v4m7XOp54pIwMHVrUxYFxo0bZ/HOkrfOeYh3wb3QVBEJaWrPSe+qEdD
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
uAvnnHFNwOHMTNt4N/jZBhX+cHnJMrbATVLUQSIlAIywivKHx0GOKglgi\nZ5xxho1t4Jw4Gdj9wd3HgAdm1dJNPxe3oZuOZbSXYFXEicgrrFBieqxICTvzpikbcsVbe90Fnw8G\nTawQzockqc6kdzsJJ83jpyMv8XUWLlzYCUgWhHzvGQsEAuULRCZYWmwee+wxax1Br9mzZ9vYJH66\nfh2LhPQLadUjBApGoMi1wEWWxV4sQVOHfQIi0qkiICcjQrSH0xOvwxc/fodflp+GY78cYntECfld\nunA97jzf4RDx/rVwvrg4ItSfpb1RertzxFQI3hrdz9Z8VxniPQuIgb9Q5vDqafdKcTFW6HdiyBQZ\nV4N2VLnXjOKEZLn7lEcI1AOB2k7N4FcRDNSxsS7wq1i3bp1NE+wZE4zvfxQiofKWniUK6h9LKeaI\nMOa8fWLNcYK+AdFKpV/R7XWm6zwrOVx76vS9atUqc+CBB9ZJpa664DdCX6R1YiUfH2cF6FYJlgsc\ngKdOnWqj6bL65tJLL+1pAelWZvgauhCldfPmzTZ0/DXXXGOnbfpxf7k63D0d1k2/hYAQqDcCo+BD\n9VZR2pWFAL4BbNde123a07Z7w4YNJtiAzQ6GafPWIT1kJK0Ta68pGq5DyPfff3/ry9HPwRrfkc9/\n/vMmsL5Y4lMGxpAQ2gQRkggBIdBMBGprEWkmnM3SGufD5cuXN0vpLto+99xz1uehS5JaX8rixNpt\nSS99ixVkzpw5dk+YfpIQgMbqgvVlzZo11iG2CAdbvwNFQnw0dCwEmouALCLN7bvcmjMw4BgazK8X\naqbPrVjGAljaysZtaZ0/M1ZXWjamW+ibpO1w0zM+0bjiiivMypUra4EHbYEMvfnmm2bt2rWFWC9E\nQkq7/VSwEOg7ArKI9B3y+lSIOZupjGXLltVHqYyasAx19OjRiQfvjNX0JRuEgk9SvxHSMtjzQSAh\nrCjDGpGUzJTZMO4zdvjFT2rKlCkdPbPWKRKSFTnlEwL1REAWkXr2S9+0YrDDfM+g1eR59g984APm\nkksuMTNmzOgbdv2oiP5J6jdCWhyjISFFWR6KbqMjSVn1EwkpukdUnhCoHgFZRKrvg0o1wMdg4sSJ\n5u67765UjzyVYw3B55qYJgzG/idPuXXIm8ZvhGBuTMd8/etfry2pvPHGG81+++1nzj///NTwioSk\nhkwZhEAjEJBFpBHdVK6SDNxYRfj2/QzKrbW40g899FC7ZJTlo2GhTb7gE1OH6QpfpyTHvfxGGKSx\nnNRlOqZbm5hCYorm2GOPNfPmzeuWtHNNJKQDhQ6EQOsQEBFpXZdmaxAm87ffftvO5WcroZpcLBFl\nVQZOqkmEQZDB2pekUx9+niqOne5YSXzhPMuwcQhtylJsiAXh4em7cHv8tnEsEhJGRL+FQLsQEBFp\nV39mbg2DGQPyrbfeWlmI7rTKF2UFoJzwjsi9Bse0uhaZHiuPT54IVrZlyxa7RLfIesoui+XFixYt\n6hr3JdzWsnVS+UJACPQfARGR/mNe2xp56DNF04QlsGVbAcJTOiwNrtO0FeTJORejW1OXYGMV4Z4j\n5khY6IM6E8KwvvotBIRANgRERLLh1tpcTHXcddddZuPGjZ2Bro6NZQBjOSjOj/0QfDQY7H3xrRL+\n+X4do9MXv/hFs++++yb2teiXbknrceQ3vGpLJCQpgkonBJqPgIhI8/uw8BbkXWJZuEKhAuuiX3hK\np9+OsBARrCHoccABB4RQas7P008/3e6B46wiIiHN6TtpKgSKQEBEpAgUW1hGXQZ7H1qmY9hMra5x\nMtAv7Ahb5pQOfYT0yyrk90WRxxAP9sNhwzyRkCKRVVlCoBkIiIg0o58q0ZKBbv369TZIVtVLXhnk\nWfKJZA2GVQWIUVM6Rfk9QHKa4M+TBHeWYF9wwQXm4osvTpJcaYSAEGgRAu9oUVvUlIIR4
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "\n", + "review = \"This was a horrible, terrible movie.\"\n", + "\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAi4AAAECCAYAAADZzFwPAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQVdV5/xdNZjIxjRgrM52qFI01ERQVExWNeMMLQy0YiEiNEgOYaJAO\nitIaGYo2TFGQeElQAREjRa0oDEG8AKagosYYkEuSjjUEbP+orZFc/KMzmfe3Pys+57fOfvfZZ1/P\nWXu/zzNz3rPP3uvyrO/a717f/axnPatfTyBGRRFQBBQBRUARUAQUgQog8CcV0FFVVAQUAUVAEVAE\nFAFFwCKgxEVvBEVAEVAEFAFFQBGoDAJKXCrTVaqoIqAIKAKKgCKgCChx0XtAEVAEFAFFQBFQBCqD\ngBKXynSVKqoIKAKKgCKgCCgCSlz0HlAEFAFFQBFQBBSByiCgxKUyXaWKKgKKgCKgCCgCioASF70H\nFAFFQBFQBBQBRaAyCHy8MpqqooqAItAVBH784x+bPXv2mJ07d5q9e/eat99+2+zYsaOXLuPGjTOH\nHHKIGTp0qBkyZIg59dRTzac//ele6fSEIqAIKAJ5EOinkXPzwKd5FYF6IrBp0yazYcMGs2rVKjNg\nwAAzcuRIc8IJJ5jBgwebgw8+2Hzuc59ravh//dd/mf/8z/807777rtm1a5d58cUX7Qcyc8kll5gv\nf/nLSmKaENMfioAikBUBJS5ZkdN8ikDNEPjtb39rli9fbh566CHbshkzZpgLLrjA/MVf/EWmllLe\nxo0bzfr1682yZcvMjTfeaG644YbM5WVSQjMpAopA7RBQH5fadak2SBFIj8A999xjPv/5z5stW7aY\nJUuWmO3bt5tJkyblIhlME1166aVm6dKl1hqDVocffriZOXOmwUKjoggoAopAFgSUuGRBTfMoAjVB\nAP+Vk046yaxZs8Z+nnzySfPFL36x8NZhtVmwYEGDwFDHihUrCq9HC1QEFIH6I6BTRfXvY22hIhCJ\nAFaW+fPnm3nz5lnrSmSikk5CmKZOnWqOOeYYOz2lTrwlAa3FKgI1REAtLjXsVG2SIhCHAL4nU6ZM\nsRaWzZs3d5y0oBsWl61bt5pBgwbZKapf/OIXcSrrNUVAEVAEGgioxaUBhR4oAvVHANIyZswYc+ih\nh3pj6WDK6JZbbjGQqPBqpfr3iLZQEVAE0iKgcVzSIqbpFYGKIiCkZdiwYdbfxJdm4ATMEuvzzjtP\nyYsvnaJ6KAIeI6DExePOUdUUgaIQ8JW0SPtYfYQoeRFE9FsRU
ARaIaDEpRUyel4RqBECc+fOta1h\nZY+vAnn5zW9+YyZMmGD9X9Rh19eeUr0Uge4ioD4u3cVfa1cESkdAfEh+/vOfVyJ6LY7DH3zwgWFp\ntooioAgoAmEEdFVRGBH9rQjUCAECveH4SpyWqlgwFi1aZPdD0jgvNboRtSmKQIEIqMWlQDC1KEXA\nNwTGjx9vTjzxRDN79mzfVIvVhzgvY8eONVWxEsU2Ri8qAopAoQgocSkUTi1MEfAHgaoP/mwNgPjs\nl+NPb6smikDfQUCJS9/pa21pH0MAaws7M7PcuIrCNBd7G7HrdNaNHqvYbtVZEVAE4hFQ4hKPj15V\nBCqJgFhbGPSrLGp1qXLvqe6KQDkIKHEpB1ctVRHoKgKszBk6dKiZPn16V/XIWzlWF7YHUF+XvEhq\nfkWgPgjoqqL69KW2RBGwCBBsbtmyZYapoqoLU0RsA7Bx48aqN0X1VwQUgYIQUOJSEJBajCLgCwIM\n8pMnT66NXwg+OuvXr/cFXtVDEVAEuoyAEpcud4BWrwgUjQCD/FlnnVV0sV0r74ILLrAWpK4poBUr\nAoqAVwgocfGqO1QZRSA/Ahs2bDCnn356/oI8KYHponPPPdfgcKyiCCgCioASF70HFIEaIYAzK4Jf\nSJ2EHa23bdtWpyZpWxQBRSAjAkpcMgKn2RQBHxFg+fPw4cN9VC2XTieccILZt29frjI0syKgCNQD\nASUu9ehHbYUiYBHAKjFo0KDaoTF48GCzd+/e2rVLG6QIKALpEVDikh4zzaEIeI3AwIEDvdZPlVME\nFAFFIA8CSlzyoKd5FQFFoCMIfP7znzerV6/uSF1aiSKgCPiNgBIXv/tHtVMEFIEAgU9/+tOKgyKg\nCCgCFgElLnojKAKKgCKgCCgCikBlEFDiUpmuUkUVgb6LwC9+8YvaRALuu72oLVcEikFAiUsxOGop\nikDXEWCPogMHDnRdjzIU+M1vflPLZd5lYKVlKgJ1R+DjdW+gts8PBIh6umfPHrNz5067rPXtt982\nO3bsaFKOCKnEIDnkkEPszsYcszOwSmsEICvsnIwcfPDB5vjjj6/lvj4QFxVFQBFQBEBAiYveB6Uh\nsGnTJrNq1SpDCPoBAwaYkSNHGgKJTZgwwQ6y4eiuRH0lgNq7775rdu3aZWbNmmVefPFFu2Hg6NGj\nbX510jRGcKLjICsuuWOAf+edd0rr024VvHv3bjNixIhuVa/1KgKKgEcI9OsJxCN9VJWKI4AFYPny\n5Wb+/PmWrMyYMcOwSR7WlCxCeex2vHLlShvyfeLEieaGG27IXF4WHXzI45KVww8/PLb9/fr1MxCY\nOpG88ePHmyuuuMJceumlPnSH6qAIKAJdREB9XLoIfp2qhmDcc889hngbW7ZsMWvWrDHbt283kyZN\nih1k22HA4Mtg9eSTTzY22WPgnjlzpqHOOgsOqUyxyeaCWFb4tCOBbEj4+uuv1woaIgKfdtpptWqT\nNkYRUASyIaDEJRtumstBgIH1rLPOahAWSIY7feEkzXXIgL1gwQI7nfTBBx9YkrRixYpcZfqWWYgK\n37Q3KVlx2wFxeeWVV9xTlT4GC6Ya2xG2SjdSlVcEFIHECOhUUWKoNGEUArfffru5//77zbx586x1\nJSpNWecY0KZOnWq+8IUvmEWLFlV2aoR2iBRB+LDU4EeExasOwj32P//zP+buu++uQ3O0DYqAIpAT\nASUuOQHsq9mZphkzZoxt/qOPPtq1t2H0wI8Gh9TFixebsMOvj/2DzrISCP2KICvhdp500klm4cKF\n5vzzzw9fqtxvpgbXrVtnPvWpT1WifysHsCqsCFQMAZ0qqliH+aCukJZhw4aZtWvXdo20gAU+MEuX\nLjVMj5x33nkGa4OPgnMtlhU+HMsUUBmkhfZ//etftyu6fMQijU5PP/20JSvca0wVudapNOVoWkVA\nEagPAmpxqU9fdqQlLmnB3
8QnYZCbNm2a2bx5sxdv5mlWAhWNI/3EUmmWl1fZNwTL0Zw5c5pWE0Fe\nyiJ8RfeDlqcIKALFI6DEpXhMa1uiz6RFQO82ecHiI8HS2i1bFp3L+IY0sST997//vbVIlVFH2WXS\nl3Pnzo301VHyUjb6Wr4i4C8CSlz87RvvNJsyZYphNQ+rhnwWnDkJXMc0VidimbjTFywH70SdcfhD\nntCBD/qwNL1qFgpIMivVwtYWt93g7gPerk56rAgoAuUjoMSlfIxrUQMxWh566CGzdevWrg/MSQAl\nYBlbB+D/Uoa4ZMUnUhAezFkuzoqrqvSb9BXkky0h2pFkSBpTYd0mi6K3fisCikD5CChxKR/jytfA\n4MCbLSthqrBqB8B5Y0fnRx55pJCVNZRX9kqgvDcKpCWKREHiTjzxRDN79uy8VXQkP+0YO3asdcRN\n4p+j5KUj3aKVKALeIKDExZuu8FcRBj72iZk+fbq/SkZoxl5JV111lSUcWd7IXbKCo6uvpE30jCIt\nwCKrmO67774mJ9cIyLp+KquuSl663nWqgCLQMQSUuHQM6mpWJA6SVZtqELTTki53JZDPZEXah74Q\nl3akSkicLyuuRH/3m3YQG4ilz1lWrEFeIKhJrDRuvXqsCCgC1UJAiUu1+qvj2kYtR+24EjkqZDAj\nvgvTPK2sLqTxYSVQ2mZCWpAkAzVpf/SjH5mbbrrJm+XibnuFtBx99NG5/JLSYOLWr8eKgCJQHQQ+\nXh1VVdNOI4C1BanyjrxYIthRmh2r3akul6zgC9POYtFp7NvVl9a6QDyXv/3bvzWf/OQnLZHzyfIi\npIU240icRyBxkBc+SQhdnro0ryKgCHQHAa8i5z7xxBOmX79+9sNgUzVx9acdrki7+H711VfdS22P\nTznllAYuS5YsaZu+qAQrV660y1GLKq9b5bBvDzFNcPqUD4MaPiF8WlliuqVvu3ppA/onHZhJL/4v\nkFB8XSBrQkzb1VfmdQiYTA9BporoC8GFslUUAUWgfgh4RVzqB291W8Qb6+rVq83IkSOr2whHc/a5\nYTqoqmRFmiIkJOkATz8SCM8VyMvrr79uowzPnDnT+si41zt1DHFiGo8VRFl8WuL0FGKn5CUOJb2m\nCFQTASUu1ey30rV+4YUXzOTJkwt5Ay5d2TYVQFa+/e1vmw0bNrRJ6e9lplOEtKTRslXIfzChvL17\n99pAbxx3SiBTBDNkewaC47lTeEXqALmDwCh5KRJVLUsR6D4CSlwK7IPLLrvM9PT0ND4FFt3xotiN\nd/To0YXWu2fPHsNU13XXXWf9TtzpMzlmWoxpwjvvvNM899xzDafZvIqcfvrpld10kIGej0z3JMWi\nHdFhUCfAG7trY/VgBVaZBAbyRSBD2kFwQBym07YpadslnZIXQUK/FYEaIRAMtN7I448/3hNAaz+X\nX3651evBBx9snOPaLbfc0rN///6WOm/bts2mkXL4PvLII3so58CBA5H53LTk53PhhRfaesmHJEnj\n6k96V8L5n3322UYdXKM+zkVJsDy0Ub/o46aLajP4oU9WQafgbT1r9qZ8Lp4uDkmO6bs77rijZd81\nVdTmRzBQ9wSDZZtUfl2mD7L0Q9p8wTRaz913390DRuPGjevZuHFjYUCA+Y033tgouxt9QPuC6bHC\n2qQFKQKKQPcQaB5du6eHrdkd+Bl4+UQNbgxmUeSFAS4qvZwj3+7du3u1Uq7zHS5DiEKSNK7+pHfF\nzQ/5cn+7x1KfmzeOuJDezR8+pq60wsASRFpNmy0yfTv9wvq2+g0GUX0eWWmLk8HUV89TTz3V4qp/\np+mHLKSFlmQdpBngH374Ydv/kBgIByQmrR7Uf9tttzWVk7aMMnokKy5l6KJlKgKKQDYEvF0O/dhj\njwVjWLQEA5iNR7Fq1apGAqYgbr755sbvqAPyXXzxxWbXrl2G4GJR0q4M8iRJE1W2nJs3b54
c9vq+\n5pprzAknnGCY2mgnTKWQPk6oa9CgQWbq1KlxyZquMaVzzDHHNJ3L8iOJfknLffPNN63PDWVmlaFD\nhxrugSoIUzas/EnqhOu2qd0UkZs2fEx9kyZNsh98Q8B78eLF1lGbbQO4L7ifBg4cGM5qtmzZYt5/\n/327weW5555r+CxcuLCQLRd6VZbxhPj2lD1FlVE9zaYIKAIJEPDaxyWYPjGBhcT6jATTPIbfIi6x\nYbUIm7KJEHkzmJ6w+QI+ZwKrg1yyA9cDDzzQ+B11QHry8Wk14CdJE1W2nAssEY06AkuNnLbf7K+T\nRNx2uVgxOAfWqkYRYANGSYX8hPjPK3fddVevItCTtvOhj8KfYLrMXqNt9KMrzz//vB1I3XNpjocM\nGWIH1zR5upFWiEcW0hK1iihrG4htg+MsfjD8L3Cfzpo1K5K0UMe1115rl52TlqXN7I10/vnnZ62+\ntHxCXvC5UVEEFIEKIhA8ZLyR8FRLMIA26YavRABx4yM+K+F8pAuLO+3ElJErbpmki5IkacJ6uOW4\n+YNB2b1kj8NTKtI2LkZNFTHl5ZYZxor8tFPSoFtSwdeBTx6hfqmb71bTdO3qCOPCVF5WYZoA/w1f\npQg/DJ0KSd67TMWBuYoioAhUCwFvLS68bR9xxBHBmPf/JfxbrAg7duxoJAoGyMhplq997WuNNFgU\nmA6JEuJKtJMkaeLKuOSSS3pdPvPMM5vOvfvuu02/wz+Y7hLBihHGhqmwK6+8UpKYX/3qV43jdgeY\n/LFO5JEwvsTpGDx4cOoisXi5lhemjOooWVcOuViIpcY9p8etEcCiBO5qeWmNkV5RBHxEwFsfl2OP\nPTYxXu+8804jbZgAyIX+/fvLof0W0tN0MvgRThe+zu8kaaLyybkwyeB82OemlX5SRmDRkEPDFArL\niePkgw8+iLvc61pYn14JUp6I8olIWgT3Ql0JCxgweCJ5th0ocorIKtNH/oA5vjwsDc8yNddHYNJm\nKgJeIeCtxcUrlFSZ3Ajs3LkzUxkQuJdffjlT3ipkkuBoDJx5JFixk3gLgDz11DGvWF6EQNaxjdom\nRaBOCNSCuLCjrEirQc61UJC2aIuC1J/kG4fjsIQtLFFWGTePa/XBETeYoYz9fOc733Gzxx6zaiQ8\n1RObIeJieFUUDsJp91liuuzv//7vm1YC5Z2mi1C1a6eY2oGw5CUtOkWUvwvF2qXkJT+WWoIiUDYC\n3k4VpWk4yzRF8F9hE8PwwBnEppAkBj+YLP4WjQJyHuBDctFFFzWV4hIu9GtHXI4//vhGfvJCfIoi\nY0zrhIleo7IUB6wyYSktQr+wdJsPROszn/mMOfnkk3uVxpQW00L//u//Hjk91GoqsFdBnp8oimx0\nYoqIOl577TXbh9y7iCx75hjiNXz4cA4N/4vcP/z/CRmwFyrwh3bQVj55yWQFmqsqKgKVRaAWxIXY\nLAz2DI7It771LfO9732vQV7Yp8ZdPu06rXaj58KxVdhV2o3HkkQ/iBdOqwzytPsrX/mKWbRoUYOQ\nUSYb6Akmwaoiw5YESaUI4sJeND/84Q8bOkjdbl/IuSTfLJHOQzixImFN6qbgCBqsZiks1D1TRGXE\nJGEKi3uIjTbfe+89S0xYIn/FFVc0SLXUy0CPHgjL27du3WrvRfKNGjXKbh3Bxo5VECEvtL9qxKsK\n+KqOikAhCPi0CMpdThy1LDkYhJuW2PJbJLxsNgCnKa38DghOr/Dxco3vVsuGk6Rx9Se9K27+dsdu\nuygjajk058P1tSqX/GmkyGXDbGMA5q10S3o+sN706rc0bSItkVzzLvNOW6ebnsixLMEtSspY+kxk\nYaImBwO4xStPHbSXKLxBIDpbHthzrgoSWDAL7asqtFl1VASqgkAtfFyCwc8GinMDsnEuLFhlCHBW\n1JRKuPykv4NYJC2TEpit3TSRZMaCQvo4wSqzdu3auCS
Q0WuKgCKgCCgCioAi4BUCSly86g5VRhFQBBQBRUARUATiEFDiEoeOXlME\nFAFFQBFQBBQBrxBQ4uJVd6gyioAioAgoAoqAIhCHgBKXOHT0miKgCCgCioAioAh4hYASF6+6Q5VR\nBBQBRUARUAQUgTgElLjEoaPXFAFFQBFQBBQBRcArBP4fntNQJrCufL0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 27, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review = \"The movie was excellent\"\n", + "\n", + "Image(filename='sentiment_network_pos.png')" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "# Project 2: Creating the Input/Output Data" + ] + }, + { + "cell_type": "code", + "execution_count": 74, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "74074\n" + ] + } + ], + "source": [ + "vocab = set(total_counts.keys())\n", + "vocab_size = len(vocab)\n", + "print(vocab_size)" + ] + }, + { + "cell_type": "code", + "execution_count": 75, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['',\n", + " 'inhabitants',\n", + " 'goku',\n", + " 'stunts',\n", + " 'catepillar',\n", + " 'kristensen',\n", + " 'senegal',\n", + " 'goddess',\n", + " 'distroy',\n", + " 'unexplainably',\n", + " 'concoctions',\n", + " 'petite',\n", + " 'scribe',\n", + " 'stevson',\n", + " 'sctv',\n", + " 'soundscape',\n", + " 'rana',\n", + " 'metamorphose',\n", + " 'immortalizer',\n", + " 'henstridge',\n", + " 'planning',\n", + " 'akiva',\n", + " 'plod',\n", + " 'eko',\n", + " 'orderly',\n", + " 'zeleznice',\n", + " 'verbose',\n", + " 'amplify',\n", + " 'resonation',\n", + " 'critize',\n", + " 'jefferies',\n", + " 'mountainbillies',\n", + " 'steinbichler',\n", + " 'vowel',\n", + " 'rafe',\n", + " 'bonbons',\n", + " 'tulipe',\n", + " 'clot',\n", + " 'distended',\n", + " 'his',\n", + " 'impatiently',\n", + " 'unfortuntly',\n", + " 'lung',\n", + " 'scapegoats',\n", + " 'muzzle',\n", + " 'pscychosexual',\n", + " 'outbid',\n", + " 'obit',\n", + " 
'sideshows',\n", + " 'jugde',\n", + " 'particolare',\n", + " 'kevloun',\n", + " 'masterful',\n", + " 'quartier',\n", + " 'unravelling',\n", + " 'necessarily',\n", + " 'antiques',\n", + " 'strutts',\n", + " 'tilts',\n", + " 'disconcert',\n", + " 'dossiers',\n", + " 'sorriest',\n", + " 'blart',\n", + " 'iberia',\n", + " 'situations',\n", + " 'frmann',\n", + " 'daniell',\n", + " 'rays',\n", + " 'pried',\n", + " 'khoobsurat',\n", + " 'leavitt',\n", + " 'caiano',\n", + " 'sagan',\n", + " 'attractiveness',\n", + " 'kitaparaporn',\n", + " 'hamilton',\n", + " 'massages',\n", + " 'reasonably',\n", + " 'horgan',\n", + " 'chemist',\n", + " 'audrey',\n", + " 'jana',\n", + " 'dutch',\n", + " 'override',\n", + " 'spasms',\n", + " 'resumed',\n", + " 'stinson',\n", + " 'widows',\n", + " 'stonewall',\n", + " 'palatial',\n", + " 'neuman',\n", + " 'abandon',\n", + " 'anglophile',\n", + " 'marathon',\n", + " 'chevette',\n", + " 'unscary',\n", + " 'eponymously',\n", + " 'spoilerific',\n", + " 'fleashens',\n", + " 'brigand',\n", + " 'politeness',\n", + " 'clued',\n", + " 'dermatonecrotic',\n", + " 'grady',\n", + " 'mulligan',\n", + " 'ol',\n", + " 'bertolucci',\n", + " 'incubation',\n", + " 'oldboy',\n", + " 'snden',\n", + " 'plaintiffs',\n", + " 'fk',\n", + " 'deply',\n", + " 'franchot',\n", + " 'cyhper',\n", + " 'glorifying',\n", + " 'mazovia',\n", + " 'elizabeth',\n", + " 'palestine',\n", + " 'robby',\n", + " 'wongo',\n", + " 'moshing',\n", + " 'eeeee',\n", + " 'doltish',\n", + " 'bree',\n", + " 'postponed',\n", + " 'gunslinger',\n", + " 'debacles',\n", + " 'kamm',\n", + " 'herman',\n", + " 'rapture',\n", + " 'rolando',\n", + " 'tetsuothe',\n", + " 'premises',\n", + " 'bruck',\n", + " 'loosely',\n", + " 'boylen',\n", + " 'proportions',\n", + " 'grecianized',\n", + " 'wodehousian',\n", + " 'encapsuling',\n", + " 'partly',\n", + " 'posative',\n", + " 'calms',\n", + " 'stadling',\n", + " 'austrailia',\n", + " 'shortland',\n", + " 'wheeling',\n", + " 'darkie',\n", + " 'mckellar',\n", + " 
'cushy',\n", + " 'ooookkkk',\n", + " 'milky',\n", + " 'unfolded',\n", + " 'degrades',\n", + " 'authenticating',\n", + " 'rotheroe',\n", + " 'beart',\n", + " 'neath',\n", + " 'grispin',\n", + " 'intoxicants',\n", + " 'nnette',\n", + " 'slinging',\n", + " 'tsukamoto',\n", + " 'stows',\n", + " 'suddenness',\n", + " 'waqt',\n", + " 'degrading',\n", + " 'camazotz',\n", + " 'blarney',\n", + " 'shakher',\n", + " 'delinquency',\n", + " 'tomreynolds',\n", + " 'insecticide',\n", + " 'charlton',\n", + " 'hare',\n", + " 'wayland',\n", + " 'nakada',\n", + " 'urbane',\n", + " 'sadomasochistic',\n", + " 'larnia',\n", + " 'hyping',\n", + " 'yr',\n", + " 'hebert',\n", + " 'accentuating',\n", + " 'deathrow',\n", + " 'galligan',\n", + " 'unmediated',\n", + " 'treble',\n", + " 'alphabet',\n", + " 'soad',\n", + " 'donen',\n", + " 'lord',\n", + " 'recess',\n", + " 'handsome',\n", + " 'center',\n", + " 'vignettes',\n", + " 'rescuers',\n", + " 'pairings',\n", + " 'uselful',\n", + " 'sanders',\n", + " 'nots',\n", + " 'hatsumomo',\n", + " 'appleby',\n", + " 'tampax',\n", + " 'sprinkling',\n", + " 'defacing',\n", + " 'lofty',\n", + " 'opaque',\n", + " 'tlc',\n", + " 'romagna',\n", + " 'tablespoons',\n", + " 'bernhard',\n", + " 'verger',\n", + " 'acumen',\n", + " 'percentages',\n", + " 'wendingo',\n", + " 'resonating',\n", + " 'vntoarea',\n", + " 'redundancies',\n", + " 'red',\n", + " 'pitied',\n", + " 'belying',\n", + " 'gleefulness',\n", + " 'bibbidi',\n", + " 'heiligt',\n", + " 'gitane',\n", + " 'journalist',\n", + " 'focusing',\n", + " 'plethora',\n", + " 'citizen',\n", + " 'coster',\n", + " 'clunkers',\n", + " 'deplorable',\n", + " 'forgive',\n", + " 'proplems',\n", + " 'magwood',\n", + " 'bankers',\n", + " 'aqua',\n", + " 'donated',\n", + " 'disbelieving',\n", + " 'acomplication',\n", + " 'immediately',\n", + " 'contrasted',\n", + " 'reidelsheimer',\n", + " 'fox',\n", + " 'springs',\n", + " 'toolbox',\n", + " 'contacting',\n", + " 'ace',\n", + " 'washrooms',\n", + " 'raving',\n", + " 
'dynamism',\n", + " 'mae',\n", + " 'sky',\n", + " 'disharmony',\n", + " 'untutored',\n", + " 'icarus',\n", + " 'taint',\n", + " 'kargil',\n", + " 'captain',\n", + " 'paucity',\n", + " 'fits',\n", + " 'tumbles',\n", + " 'amer',\n", + " 'bueller',\n", + " 'redubbed',\n", + " 'cleansed',\n", + " 'kollos',\n", + " 'shara',\n", + " 'humma',\n", + " 'felichy',\n", + " 'outa',\n", + " 'piglets',\n", + " 'gombell',\n", + " 'supermen',\n", + " 'superlow',\n", + " 'enhance',\n", + " 'goode',\n", + " 'shalt',\n", + " 'kubanskie',\n", + " 'zenith',\n", + " 'ananda',\n", + " 'ocd',\n", + " 'matlin',\n", + " 'nosed',\n", + " 'presumptuous',\n", + " 'rerun',\n", + " 'toyko',\n", + " 'mazar',\n", + " 'sundry',\n", + " 'bilb',\n", + " 'fugly',\n", + " 'orchestrating',\n", + " 'prosaically',\n", + " 'maricarmen',\n", + " 'moveis',\n", + " 'conelly',\n", + " 'estrange',\n", + " 'lusciously',\n", + " 'seasonings',\n", + " 'sums',\n", + " 'delirious',\n", + " 'quincey',\n", + " 'flesh',\n", + " 'tootsie',\n", + " 'ai',\n", + " 'tenma',\n", + " 'appropriations',\n", + " 'chainsaw',\n", + " 'ides',\n", + " 'surrogacy',\n", + " 'pungent',\n", + " 'gallon',\n", + " 'damaso',\n", + " 'caribou',\n", + " 'perico',\n", + " 'supplying',\n", + " 'ro',\n", + " 'yuy',\n", + " 'valium',\n", + " 'debuted',\n", + " 'robbin',\n", + " 'mounts',\n", + " 'interpolated',\n", + " 'aetv',\n", + " 'plummer',\n", + " 'competence',\n", + " 'toadies',\n", + " 'dubiel',\n", + " 'clavichord',\n", + " 'asunder',\n", + " 'sublety',\n", + " 'airfix',\n", + " 'stoltzfus',\n", + " 'ruth',\n", + " 'fluorescent',\n", + " 'improves',\n", + " 'rebenga',\n", + " 'russells',\n", + " 'deliberation',\n", + " 'zsa',\n", + " 'dardino',\n", + " 'macs',\n", + " 'servile',\n", + " 'jlb',\n", + " 'apallonia',\n", + " 'crossbows',\n", + " 'locus',\n", + " 'mislead',\n", + " 'corey',\n", + " 'blundered',\n", + " 'jeopardizes',\n", + " 'disorganized',\n", + " 'discuss',\n", + " 'longish',\n", + " 'tieing',\n", + " 'ledger',\n", + " 
'speechifying',\n", + " 'amitabhz',\n", + " 'bbc',\n", + " 'chimayo',\n", + " 'pranked',\n", + " 'superman',\n", + " 'aggravated',\n", + " 'rifleman',\n", + " 'yvone',\n", + " 'radiant',\n", + " 'galico',\n", + " 'debris',\n", + " 'waking',\n", + " 'btw',\n", + " 'havnt',\n", + " 'francen',\n", + " 'chattered',\n", + " 'scathed',\n", + " 'pic',\n", + " 'ceremonies',\n", + " 'watergate',\n", + " 'betsy',\n", + " 'majorca',\n", + " 'meercat',\n", + " 'noirs',\n", + " 'grunts',\n", + " 'drecky',\n", + " 'tribulations',\n", + " 'avery',\n", + " 'talladega',\n", + " 'eights',\n", + " 'dumbing',\n", + " 'alloimono',\n", + " 'scrutinising',\n", + " 'geta',\n", + " 'beltrami',\n", + " 'pvc',\n", + " 'horse',\n", + " 'tiburon',\n", + " 'huitime',\n", + " 'ripple',\n", + " 'loitering',\n", + " 'forensics',\n", + " 'nearly',\n", + " 'elizabethan',\n", + " 'ellington',\n", + " 'uzi',\n", + " 'sicily',\n", + " 'camion',\n", + " 'motivated',\n", + " 'rung',\n", + " 'gao',\n", + " 'licitates',\n", + " 'protocol',\n", + " 'smirker',\n", + " 'torin',\n", + " 'newlywed',\n", + " 'rich',\n", + " 'dismay',\n", + " 'skyler',\n", + " 'moonwalks',\n", + " 'haranguing',\n", + " 'sunburst',\n", + " 'grifter',\n", + " 'undersold',\n", + " 'chearator',\n", + " 'marino',\n", + " 'scala',\n", + " 'conditioner',\n", + " 'ulysses',\n", + " 'lamarre',\n", + " 'figueroa',\n", + " 'flane',\n", + " 'allllllll',\n", + " 'slide',\n", + " 'lateness',\n", + " 'selbst',\n", + " 'gandhis',\n", + " 'dramatizing',\n", + " 'catchphrase',\n", + " 'doable',\n", + " 'stadiums',\n", + " 'alexanderplatz',\n", + " 'pandemonium',\n", + " 'misrepresents',\n", + " 'earth',\n", + " 'mounties',\n", + " 'seeker',\n", + " 'cheat',\n", + " 'outbreaks',\n", + " 'snowstorm',\n", + " 'baur',\n", + " 'schedules',\n", + " 'bathetic',\n", + " 'incorrect',\n", + " 'johnathon',\n", + " 'rosanne',\n", + " 'mundanely',\n", + " 'cauldrons',\n", + " 'forrest',\n", + " 'poky',\n", + " 'legislation',\n", + " 'womanness',\n", + " 
'spender',\n", + " 'crazy',\n", + " 'rational',\n", + " 'terrell',\n", + " 'zero',\n", + " 'coincides',\n", + " 'thoughout',\n", + " 'mathew',\n", + " 'narnia',\n", + " 'naseeruddin',\n", + " 'bucks',\n", + " 'affronts',\n", + " 'topple',\n", + " 'degree',\n", + " 'preyed',\n", + " 'passionately',\n", + " 'defeats',\n", + " 'torchwood',\n", + " 'sources',\n", + " 'botticelli',\n", + " 'compactor',\n", + " 'kosturica',\n", + " 'waiving',\n", + " 'gunnar',\n", + " 'stiffler',\n", + " 'fwd',\n", + " 'kawajiri',\n", + " 'eleanor',\n", + " 'sistahs',\n", + " 'soulhunter',\n", + " 'belies',\n", + " 'wrathful',\n", + " 'americans',\n", + " 'ferdinandvongalitzien',\n", + " 'kendra',\n", + " 'weirdy',\n", + " 'unforgivably',\n", + " 'chepart',\n", + " 'tatta',\n", + " 'departmentthe',\n", + " 'dig',\n", + " 'blatty',\n", + " 'marionettes',\n", + " 'atop',\n", + " 'chim',\n", + " 'saurian',\n", + " 'woes',\n", + " 'cloudscape',\n", + " 'resignedly',\n", + " 'unrooted',\n", + " 'keuck',\n", + " 'hitlerian',\n", + " 'stylings',\n", + " 'crewed',\n", + " 'bedeviled',\n", + " 'unfurnished',\n", + " 'reedus',\n", + " 'circumstances',\n", + " 'grasped',\n", + " 'smurfettes',\n", + " 'fn',\n", + " 'dishwashers',\n", + " 'roadie',\n", + " 'ruthlessness',\n", + " 'refrains',\n", + " 'lampooning',\n", + " 'semblance',\n", + " 'richart',\n", + " 'legions',\n", + " 'gwenneth',\n", + " 'enmity',\n", + " 'assess',\n", + " 'manufacturer',\n", + " 'bullosa',\n", + " 'outrun',\n", + " 'hogan',\n", + " 'chekov',\n", + " 'blithe',\n", + " 'code',\n", + " 'drillings',\n", + " 'revolvers',\n", + " 'aredavid',\n", + " 'robespierre',\n", + " 'achcha',\n", + " 'boyfriendhe',\n", + " 'wallow',\n", + " 'toga',\n", + " 'graphed',\n", + " 'tonking',\n", + " 'going',\n", + " 'bosnians',\n", + " 'willy',\n", + " 'rohauer',\n", + " 'fim',\n", + " 'forbidding',\n", + " 'yew',\n", + " 'rationalised',\n", + " 'shimomo',\n", + " 'opposition',\n", + " 'landis',\n", + " 'minded',\n", + " 'despicableness',\n", + 
" 'easting',\n", + " 'arghhhhh',\n", + " 'ebb',\n", + " 'trialat',\n", + " 'protected',\n", + " 'negras',\n", + " 'rick',\n", + " 'muti',\n", + " 'tracker',\n", + " 'shawl',\n", + " 'differentiates',\n", + " 'sweetheart',\n", + " 'deepened',\n", + " 'manmohan',\n", + " 'trevethyn',\n", + " 'brain',\n", + " 'incomprehensibly',\n", + " 'piercing',\n", + " 'pasadena',\n", + " 'shtick',\n", + " 'ute',\n", + " 'viggo',\n", + " 'supersedes',\n", + " 'ack',\n", + " 'cites',\n", + " 'taurus',\n", + " 'relevent',\n", + " 'minidress',\n", + " 'philosopher',\n", + " 'bel',\n", + " 'mahattan',\n", + " 'moden',\n", + " 'compiling',\n", + " 'advertising',\n", + " 'rogues',\n", + " 'unimaginative',\n", + " 'subpaar',\n", + " 'ademir',\n", + " 'darkly',\n", + " 'saturate',\n", + " 'fledgling',\n", + " 'breaths',\n", + " 'padre',\n", + " 'aszombi',\n", + " 'pachabel',\n", + " 'incalculable',\n", + " 'ozone',\n", + " 'sped',\n", + " 'mpho',\n", + " 'rawail',\n", + " 'forbid',\n", + " 'synth',\n", + " 'guttersnipe',\n", + " 'reputedly',\n", + " 'holiness',\n", + " 'unessential',\n", + " 'hampden',\n", + " 'asylum',\n", + " 'bolye',\n", + " 'strangers',\n", + " 'rantzen',\n", + " 'farrellys',\n", + " 'vigourous',\n", + " 'cantinflas',\n", + " 'enshrined',\n", + " 'boris',\n", + " 'expetations',\n", + " 'replaying',\n", + " 'prestige',\n", + " 'bukater',\n", + " 'overpaid',\n", + " 'exhude',\n", + " 'backsides',\n", + " 'topless',\n", + " 'sufferings',\n", + " 'nitwits',\n", + " 'cordova',\n", + " 'incensed',\n", + " 'danira',\n", + " 'unrelenting',\n", + " 'disabling',\n", + " 'ferdy',\n", + " 'gerard',\n", + " 'drewitt',\n", + " 'mero',\n", + " 'monsters',\n", + " 'precautions',\n", + " 'lamping',\n", + " 'relinquish',\n", + " 'demy',\n", + " 'drink',\n", + " 'chamberlin',\n", + " 'unjustifiably',\n", + " 'cove',\n", + " 'floodwaters',\n", + " 'searing',\n", + " 'isral',\n", + " 'ling',\n", + " 'grossness',\n", + " 'pickier',\n", + " 'pax',\n", + " 'wierd',\n", + " 'tereasa',\n", + " 
'smog',\n", + " 'girotti',\n", + " 'spat',\n", + " 'sera',\n", + " 'noxious',\n", + " 'misbehaving',\n", + " 'scouts',\n", + " 'refreshments',\n", + " 'autobiographic',\n", + " 'shi',\n", + " 'toyomichi',\n", + " 'bits',\n", + " 'psychotics',\n", + " 'barzell',\n", + " 'colt',\n", + " 'shivering',\n", + " 'pugilist',\n", + " 'gladiator',\n", + " 'dryer',\n", + " 'reissues',\n", + " 'scrivener',\n", + " 'predicable',\n", + " 'objection',\n", + " 'marmalade',\n", + " 'seems',\n", + " 'spellbind',\n", + " 'trifecta',\n", + " 'innovator',\n", + " 'shriekfest',\n", + " 'inthused',\n", + " 'contestants',\n", + " 'goody',\n", + " 'samotri',\n", + " 'serviced',\n", + " 'nozires',\n", + " 'ins',\n", + " 'mutilating',\n", + " 'dupes',\n", + " 'launius',\n", + " 'widescreen',\n", + " 'joo',\n", + " 'discretionary',\n", + " 'enlivens',\n", + " 'bushes',\n", + " 'chills',\n", + " 'header',\n", + " 'activist',\n", + " 'gethsemane',\n", + " 'phoenixs',\n", + " 'wreathed',\n", + " 'sacrine',\n", + " 'electrifyingly',\n", + " 'basely',\n", + " 'ghidora',\n", + " 'binder',\n", + " 'dogfights',\n", + " 'sugar',\n", + " 'doddsville',\n", + " 'porkys',\n", + " 'scattershot',\n", + " 'refunded',\n", + " 'rudely',\n", + " 'insteadit',\n", + " 'zatichi',\n", + " 'eurotrash',\n", + " 'radioraptus',\n", + " 'hurls',\n", + " 'boogeman',\n", + " 'weighs',\n", + " 'danniele',\n", + " 'converging',\n", + " 'hypothermia',\n", + " 'glorfindel',\n", + " 'birthdays',\n", + " 'attentive',\n", + " 'mallepa',\n", + " 'spacewalk',\n", + " 'manoy',\n", + " 'bombshells',\n", + " 'farts',\n", + " 'lyoko',\n", + " 'southron',\n", + " 'destruction',\n", + " 'flemming',\n", + " 'manhole',\n", + " 'elainor',\n", + " 'bowersock',\n", + " 'lowly',\n", + " 'wfst',\n", + " 'limousines',\n", + " 'skolimowski',\n", + " 'saban',\n", + " 'koen',\n", + " 'malaysia',\n", + " 'uwi',\n", + " 'cyd',\n", + " 'apeing',\n", + " 'bonecrushing',\n", + " 'dini',\n", + " 'merest',\n", + " 'janina',\n", + " 'chemotrodes',\n", + " 
'trials',\n", + " 'authorize',\n", + " 'whilhelm',\n", + " 'asthmatic',\n", + " 'broads',\n", + " 'missteps',\n", + " 'embittered',\n", + " 'chandeliers',\n", + " 'seeming',\n", + " 'miscalculate',\n", + " 'recommeded',\n", + " 'schoolwork',\n", + " 'coy',\n", + " 'mcconaughey',\n", + " 'philosophically',\n", + " 'waver',\n", + " 'fanny',\n", + " 'mestressat',\n", + " 'unwatchably',\n", + " 'saggy',\n", + " 'topness',\n", + " 'dwellings',\n", + " 'breakup',\n", + " 'hasselhoff',\n", + " 'superstars',\n", + " 'replay',\n", + " 'aggravates',\n", + " 'balances',\n", + " 'urging',\n", + " 'snidely',\n", + " 'aleksandar',\n", + " 'hildy',\n", + " 'kazuhiro',\n", + " 'slayer',\n", + " 'tangy',\n", + " 'brussels',\n", + " 'horne',\n", + " 'masayuki',\n", + " 'molden',\n", + " 'unravel',\n", + " 'goodtime',\n", + " 'interrogates',\n", + " 'bismillahhirrahmannirrahim',\n", + " 'rowboat',\n", + " 'dumann',\n", + " 'datedness',\n", + " 'astrotheology',\n", + " 'dekhiye',\n", + " 'valga',\n", + " 'kata',\n", + " 'wipes',\n", + " 'hostilities',\n", + " 'sentimentalising',\n", + " 'documentary',\n", + " 'salesman',\n", + " 'virtue',\n", + " 'unreasonably',\n", + " 'haver',\n", + " 'cei',\n", + " 'unglamorised',\n", + " 'balky',\n", + " 'complementary',\n", + " 'paychecks',\n", + " 'mnica',\n", + " 'wada',\n", + " 'ily',\n", + " 'prc',\n", + " 'ennobling',\n", + " 'functionality',\n", + " 'dissociated',\n", + " 'elk',\n", + " 'throbbing',\n", + " 'tempe',\n", + " 'linoleum',\n", + " 'photogrsphed',\n", + " 'bottacin',\n", + " 'hipper',\n", + " 'titillating',\n", + " 'barging',\n", + " 'untie',\n", + " 'sacchetti',\n", + " 'gnat',\n", + " 'roedel',\n", + " 'cohabitation',\n", + " 'performs',\n", + " 'sales',\n", + " 'migrs',\n", + " 'teachs',\n", + " 'nanavati',\n", + " 'fresco',\n", + " 'davison',\n", + " 'obstinate',\n", + " 'burglar',\n", + " 'masue',\n", + " 'dickory',\n", + " 'grills',\n", + " 'appelagate',\n", + " 'linkage',\n", + " 'enables',\n", + " 'loesser',\n", + " 
'patties',\n", + " 'prudent',\n", + " 'mallorquins',\n", + " 'nativetex',\n", + " 'suprise',\n", + " 'drippy',\n", + " 'quill',\n", + " 'speeded',\n", + " 'farscape',\n", + " 'saddening',\n", + " 'centuries',\n", + " 'mos',\n", + " 'improvisationally',\n", + " 'neccessarily',\n", + " 'transmitter',\n", + " 'tankers',\n", + " 'latte',\n", + " 'mechanisation',\n", + " 'faracy',\n", + " 'synthetically',\n", + " 'thoughtless',\n", + " 'rake',\n", + " 'ropes',\n", + " 'desirable',\n", + " 'whitewashed',\n", + " 'donal',\n", + " 'crabby',\n", + " 'lifeless',\n", + " 'perfidy',\n", + " 'teresa',\n", + " 'bulldog',\n", + " 'cockamamie',\n", + " 'rasberries',\n", + " 'notethe',\n", + " 'captivity',\n", + " 'chiseling',\n", + " 'smaller',\n", + " 'clampets',\n", + " 'alerts',\n", + " 'tough',\n", + " 'wellingtonian',\n", + " 'aaaahhhhhhh',\n", + " 'dither',\n", + " 'incertitude',\n", + " 'florentine',\n", + " 'imperioli',\n", + " 'licking',\n", + " 'disparagement',\n", + " 'artfully',\n", + " 'feds',\n", + " 'fumiya',\n", + " 'tearfully',\n", + " 'lanchester',\n", + " 'undertaken',\n", + " 'longlost',\n", + " 'netted',\n", + " 'carrell',\n", + " 'uncompelling',\n", + " 'reliefs',\n", + " 'leona',\n", + " 'autorenfilm',\n", + " 'unfriendly',\n", + " 'typewriter',\n", + " 'shifted',\n", + " 'bertrand',\n", + " 'blesses',\n", + " 'tricking',\n", + " 'fireflies',\n", + " 'zanes',\n", + " 'unknowingly',\n", + " 'unnerve',\n", + " 'caning',\n", + " 'flat',\n", + " 'recluse',\n", + " 'dcreasy',\n", + " 'chipmunk',\n", + " 'dipper',\n", + " 'musee',\n", + " 'cousin',\n", + " 'shys',\n", + " 'berserkers',\n", + " 'eve',\n", + " 'conflagration',\n", + " 'irks',\n", + " 'restricts',\n", + " 'parsing',\n", + " 'positronic',\n", + " 'copout',\n", + " 'khala',\n", + " 'swiftness',\n", + " 'higginson',\n", + " 'imprint',\n", + " 'walter',\n", + " 'sundance',\n", + " 'whispering',\n", + " 'thematically',\n", + " 'underimpressed',\n", + " 'uno',\n", + " 'expressly',\n", + " 'russkies',\n", + 
" 'discos',\n", + " 'shaping',\n", + " 'verson',\n", + " 'prototype',\n", + " 'chapman',\n", + " 'trafficker',\n", + " 'semetary',\n", + " 'unrealistically',\n", + " 'lifewell',\n", + " 'rivas',\n", + " 'consequent',\n", + " 'katsu',\n", + " 'titantic',\n", + " 'jalees',\n", + " 'ranee',\n", + " 'shipbuilding',\n", + " 'gambles',\n", + " 'dispenses',\n", + " 'disfigurement',\n", + " 'bright',\n", + " 'cristian',\n", + " 'puertorricans',\n", + " 'constituent',\n", + " 'capta',\n", + " 'jewel',\n", + " 'erect',\n", + " 'farah',\n", + " 'despondently',\n", + " 'avoide',\n", + " 'inconnu',\n", + " 'headquarters',\n", + " 'sanguisga',\n", + " ...]" + ] + }, + "execution_count": 75, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "list(vocab)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 0., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 46, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "import numpy as np\n", + "\n", + "layer_0 = np.zeros((1,vocab_size))\n", + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
ITBiD8G9077fkJiJ5O56gR2HhXNgC4ltRSO/m48J5w7/z6hcuL+53Ezo7TveizvMGkFXy5M1S\nJ29wWB6cv0jZUzJxOjq/keXLl1v/kbRvlnHl6vzgIJDnfydP3jwI77XXXp3s3aYCOokSHhxxxBGd\nlElfaCEjzn+PzFE+ZlHnSEvAQCcXXnjhMP8LXrIdSXFpivomAJoTLCG+fv60DGkOOuggl9RgPUGv\nrJJmmixNHbmJSJaOdwoSzc2Zl7gRcKRBYJXOzOTSQkT8myXKx4JpDRcxDmchJI9+ru4k30V2dpL6\n8qZhXjYc8S9vmf4/Zdqy8uRNW5dLj+UBX4x+Tsm4ut238wc577zzrC6OGLnr+hYCvRDI87+TJ28v\nvbpdf9/73te5/OMf/7hznPfAd+TEZ8ONA5TLy60fNZWoq05cBFF+48fn5+PY+fa59FHfLKZwgzzT\nzZ/61KeikkWe86f2IxOETrrpGXfa6Rc1LYP/iHshZ2xFL//Zzzjsj53hKKtEq3biynG/C/sOzDK5\nhKU+gTKdj7+cNWj0sNgSQSM6dYWXDJEvvO6a374EJshOPdTpL/UML99lyRKSVT90de3y20SZcdf8\n8/7yXadHcJN0ykQvJ36+cJtJQ/1OFzDwxZ3nO6ynny587JZlhs+n/e0v7UIH+oG+TSqk9dtHGZSZ\nVVgemWZZcvDgGPrGN76Rtbpc+aKW87Jcl/P9FnAAO+4Lt0QXHMMfAqGRJvBRqETPfuNSdH0O37zl\n1u3/jvs2sOYlbpb/P8+zMiz+czvueeA/+xhrnPjPUz9N+Nh/BpM/fL3bb1dfeNzplsevD12j0ro0\nfvtJFyU+hq4sfzmvnydcnksf/ga7sPjjlj/mhtPl+R3dwpQlZul4n1TQUDd4+f9gYVCS3izhzsii\nn58nPMDHXcva2X55eYiIu6nczdytG4t6IEb9M6AHDxf6kutRH66Rxunsf4fx7taO8DUCbKXZYI3B\nl4G/34N/N8LRL32oB7yIawH+fDuiAS5RH9Jz70BQyJMnIFq47wbhN5imIcpxmNTt/y5tu8LPcvf8\nd+31n6VpiQhlxz1b3HMm6hnj1+nSue8w4XBEhG9/oHbp+WaMYyxy5yjDlygd3bM7rIufzx2DmSvb\nfXcjCnH3jMvLOOTaFVdHuJ9curzfhRCRtB2PtcI1nm//puh2jVopbAYAACaLSURBVMaGO8gvh2M6\nNwxWWv2oxycHvn69rmXpbL+utESk282MrnGS9sERVw5YR+kQ7pekv6P6L67uqPNpIjz6Az549Euo\nCwtEN0E3yEoZgjXDEQmIB7976ROnB21xAdEgJRCVrGXF1dGm8/QpOOWVuv3fpX0BoP3+cyM8gPrP\n+bRExGHLs9ivg2cQ5CDqGevycI363POKZzO6cN6d49sfYxizfMLh8lBmHIHhWjgf5VIX4ref83Hi\nt89/oY9LT51hndA3PMa5/PSLa3dcP7i0eb7jW5ih1KQd74MHCGEJW0vCLA0w/TQA1Q1MV35S/UhP\nea4Dwp3U7Rp503a2X17UPwn1O11oty/dbmY/XfiYgS6NKTWc3/+NDn6fOl3TflMGZeURBlgG1iQS\nJh/h30nKSJPGTX8kzZM2fa9yaR+DYFmEwREc7iusJpJoBPi/KIKs1en/DkILGUkj/mAbfq6lKacf\naf1nMAP+oIhPsBxJKqPthRKRMhRUmeUhwIBU5Fs3/6w+qUpKRMgTJntZW530IR9FOnwLSdb64/Ll\nsXCga56Bi7oJvw1BSDtYxLWn23n0hRByf0Xh3C3vIFxLQ5aT4FGH/7ssfY1VwU1rJHmbT4JF1jT+\nixTPI//ll5dDpyfPF9IOgtA/7hkOJmXKKAoPKpMMIAJXXHGFjXzKio0iBe/05557zgZ5Y437L37x\ni2HF/8Vf/IUhWixLnj/ykY8MW/I2LGHKHxs2bLDr93utBHDxQ4KBe
UQNxPLgfJEhy6MCl42ouMeJ\nrGWAybnnnmumT59u5s+fX2i7eqhsWJIcvOkaljWyZYPk9wjcfPPNNvzAjTfeWCgkVf3f8f9EAL6A\n8KZuDytSXETSgFCVGnujm3K+Ht3ScS2wDGSKl9Sr3Lpd9zEpvc1lshyVXW8E2KiqqA24qm4p0wKz\nZ8+2/gq9dOn1lt7req/y/etYnPJYM/yy0lpV8N3ACpJ0qsqvq6hjdOYe41MUDkXpVkU5YMAqrdGj\nR7cGjyz+IT72zhpR9lu3X2fUse8bEpCCjjXAP677FFJUu7Kc861V/bAAaWomSy+1JA8PRQYqBoum\nC235q7/6K/uQh0jEkYm48377KauIKSvqoqwihfKStIE5ewb/ItpRhP7oU/RUYBF69aMM+oA+4+P6\nAyyqJIhFtjtvW/B1cYN9UVO0WduHH4TvF+H0wsEzyn8vaz11z0c/0HampPxpqrL01tRMgPYgC9Mz\nSNFm4n5j+vDDD5tbbrllWKRDoqU6IQCQm26JmpJx6dx3t+kblybu2wUpK3O/mG6b5tGnRHRcu3Zt\np81xuvbzPHqxWRtTZ20OY8+9409TRG1uyLTVI488MmxH8X72RVF1cR9OnTp1WHuLKlvlDA4CIiKD\n09eRLXXzu8GbWq0GrUhlu5w89NBDrQ/EaaedFpkKchBMRdldKknAXjO9CEmWsO/gSV39GGij/Ebq\nSkJcpzgy0vT7zbXHffukN8m9xT0CQSFfr/vQ1VHH79NPP90cffTR5tJLL62jetKpIQiIiDSko8pU\nk8GBEL9NfZhgDbnmmmvM5s2bY2EKk4okb60UFs4XW0FwIYoYdEtfxDWf+OAEedddd5mNGzfWmlTW\nnSwl6Rf6Opgm6yTNYv2iv9gjhNDgTRT+N7CGtI1UNrEvmq6ziEjTe7AA/RnMeIvDnNy0tzPeLI86\n6iizcOFCc/zxx0eiQfuQbm2LG1gon/y9LBw8lKNM8JEKFXwSHbH2/PM//7N5+umne+pacPWZimP3\n0MCHpTGracCYAddJEX1NmZSzZs0au+rEld2Ub1lDmtJT9ddTRKT+fdQXDdnxeMuWLY17O0uidxqr\nhgObPE7+93//13z0ox+NtDK4ASrLG7ErP++3G9BuvfVWEzc1lbeOovND7sDs3nvvjSWQRdeZtjz/\nHsDHqBcZTVs+6fEVWbRoUe2tWOG2Ob27WSHDefRbCMQhICISh8yAnWcww7IwZ84cU3RckbKgZKDA\nNMx3nLWDa3lJAthE+ZcwmHKtjAEqDWZMdbCV+tKlS9Nkqzytm1Kry1QS/ek7mea9b5ICjGUhWHnS\nGOsQ1kMsWk215CTtF6XrHwIiIv3DuvY18YDBVIwJuurBtRdYSawADCxIHEnpVYd/nfooD1z4JmDb\nu9/9bhPEg6hsSgb93KDQjYz57ajbcdXmfXBzksTJ1KUt8pv7CdLTBIsW/wdTpkyxLwBN9Skrsu9U\nVjEIiIgUg2NrSuEt9ZJLLqn1Ekv3MAwCIHVddlyENcTvWEdseGv2fQTi/Ev8vGUdVz2Q520XfdRP\nh8cq+6obVi4CLkt6ua/rKjNnzrTWt6Y62NYV10HXS0Rk0O+AiPbXeVWDIyF77rlnV3+WokkIMFE3\nUzS9pq78t+yyfAvQx1lDmr5qoUwyRZ8V7WQK9mXIv//7v9upUVbS1NEiWefnQhn9oTL7h4CISP+w\nblRN7qGzePHi2jwUHQkByG7BupzloogpGddplEn9DBBpSE54ICzS/E8fsV9P0/dxcVYR3z/D4Z7l\nu19EMItucXnc/bV169ZaWiTd86Db/11c23ReCPRCQESkF0IDfJ2HT10iYfKg/vSnP232228/u8rA\nRUmN6p40RCEqf/gclgfqc8QGcoE+Wd5ayecPuP4UT7jebr/Rgby01enVLX3drxGQrtsS7G76hzHt\nl5NpN53SXEN/xPWjmx6tg88I9
xk+IYhIiIVBf0pA4E//XyAllKsiW4BAsNmRHfhZTbPvvvsaBosq\n5MEHH7R+BGeccYYdrN75znfGqlEGCWGA2GuvvTp1Uj9Levnupksng3cAocEq4j7btm0zP/7xjy2x\n+fWvfz2sHi/biMNvf/vbxr09j7jYwBPvete7zLPPPmu453oJg+Mrr7xiMWMQZ/oLUuYw7ZW/TtfD\nJATdDjzwQPu/NmvWLPPb3/7WfPzjH69EZf6XWA7O0vXbb789cvl6JYqp0tYhIItI67q0+AZhETjz\nzDPN/vvvb4gG6d7ciq9peIkMOCwnfuyxx6wVhCWO3awQUQ/14SWm+8WDuJvFomjSQ3t9f4Zu0zhY\nq5ocDTfcE/QdlgzfWuSn8Z1My/S78ess+7jX/cp1rIDIggULci9DT9oe7sOvfvWrlnxcd911PX2i\nkpardEIgFoGydtNTue1CgF1fb7rpJrsjIzupBgNGaQ10dQWEZ2jGjBmdHWw5HwzUsfUm2ZU2NrN3\ngXqSlpU0nVd84kMwpnz3QS8n7HjaDQuXrknffptcH0S1vUltitOVvk36P8T/Hf8LZf/foWvgjG3r\nmjZtWmL94tqo80IgKQImaUKlEwIgwMOTB2LAbO13kYMhZbuHLg/CqEHeDVDh3ohKG06T5Dc6pGkT\n6fn0Q9CLdr700ksW/37U2c86zj333KGrrrrKtjFNH/RTxyLqynLPcN/7/3dF3e+0h7L5v4MI8lm/\nfn0RzVQZQiAxAn8SayrRBSEQgQDTMjfeeGPHhE6ERT5M2TBVkVYwuRMumjKYiti+fbuN2Eicgiin\nQ3wsOE9dmJARTNjkzSvognSb/gnXAR7o4XQJXy/yN3rR9t/97ncmIGpFFl2Lsg4//HDzZ3/2Z7aN\nafqgFsonVKLXdExcMdz37v+OKTn8R/DZYouDLP936IFTLHFBmOrC52bJkiV248i4PZvidNN5IZAX\nAfmI5EVQ+Q3BmPDjCN6k7H41DJJjx461PgzAwxJTHnY7duywaO3atcume/755+3vyZMnm1NOOcVM\nmjQplUMcxIEHdPCGGUlabOEJ//Aw7+YP0qsY8kcRp175slyHuL366qtdg7llKbfqPGCIL0Rbg2Vl\nJSFx/cL/HRF+2eiQD5sIsqpswoQJnSzjx483r7/+uv0d9393xBFH9M3vq6OYDoSAh4CIiAeGDvMj\ngGUgMKvbFR08+BCsHOyF4j8gJ06caK0YWBTyyDe/+U3zvve9L5UVw6/P6ZuXRFAOA00/3uSxPiFt\nC7HdZiJSNAnx72GO3X3MSqpu/3cQk7333rsv92lYR/0WAnEIiIjEIaPztUfAPdyxikB+0pIJ8vMA\nL4o8YKGBWKFPmdJWIkJfYDkLJpbLhK/vZbv7NC/p7rviqlAI9AkB+Yj0CWhVUzwCTMm4gR8Swhs1\ng1kSyeIP0qtcCA2ESJINgbIJXDat8uVy95lISD4clbvdCIiItLt/W9u6KJ8MyAhvn+4NNK7x5GVg\nKGNwcIQorm6dHxwEnIWsjPtscFBUSwcBARGRQejllrURohG3SsZNs7g3Ub/pWEscgSnz7RvdepEh\nXy8d/x4BMGvLoO1ISJn3me4bIdAWBERE2tKTA9QONyUT12Rn7YB0OGGQ45PWj8TlT/NN/ZCepNNE\nacpuc1r6FSfmpotISNN7UPr3G4F39LtC1ddeBBh48ZFgWa5bKhjVWre0Fw9+9tVI8xbsLBpR5frn\neBN10yR/+qd/av76r/+6MKdUv564YywzSXWNKyPuPMuhN27cGHe5seeDwFqN1d0pLhLikNC3EEiO\ngIhIcqyUMgIBrAxPPvmkDUrmYhkcdthhNobI3LlzI3IYu7SXJb2LFy82q1atMkE0Rxugi03t3NRK\nVEbqipuSiUrPOVZh/OY3v4m7XOp54pIwMHVrUxYFxo0bZ/HOkrfOeYh3wb3QVBEJaWrPSe+qEdD
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
uAvnnHFNwOHMTNt4N/jZBhX+cHnJMrbATVLUQSIlAIywivKHx0GOKglgi\nZ5xxho1t4Jw4Gdj9wd3HgAdm1dJNPxe3oZuOZbSXYFXEicgrrFBieqxICTvzpikbcsVbe90Fnw8G\nTawQzockqc6kdzsJJ83jpyMv8XUWLlzYCUgWhHzvGQsEAuULRCZYWmwee+wxax1Br9mzZ9vYJH66\nfh2LhPQLadUjBApGoMi1wEWWxV4sQVOHfQIi0qkiICcjQrSH0xOvwxc/fodflp+GY78cYntECfld\nunA97jzf4RDx/rVwvrg4ItSfpb1RertzxFQI3hrdz9Z8VxniPQuIgb9Q5vDqafdKcTFW6HdiyBQZ\nV4N2VLnXjOKEZLn7lEcI1AOB2k7N4FcRDNSxsS7wq1i3bp1NE+wZE4zvfxQiofKWniUK6h9LKeaI\nMOa8fWLNcYK+AdFKpV/R7XWm6zwrOVx76vS9atUqc+CBB9ZJpa664DdCX6R1YiUfH2cF6FYJlgsc\ngKdOnWqj6bL65tJLL+1pAelWZvgauhCldfPmzTZ0/DXXXGOnbfpxf7k63D0d1k2/hYAQqDcCo+BD\n9VZR2pWFAL4BbNde123a07Z7w4YNJtiAzQ6GafPWIT1kJK0Ta68pGq5DyPfff3/ry9HPwRrfkc9/\n/vMmsL5Y4lMGxpAQ2gQRkggBIdBMBGprEWkmnM3SGufD5cuXN0vpLto+99xz1uehS5JaX8rixNpt\nSS99ixVkzpw5dk+YfpIQgMbqgvVlzZo11iG2CAdbvwNFQnw0dCwEmouALCLN7bvcmjMw4BgazK8X\naqbPrVjGAljaysZtaZ0/M1ZXWjamW+ibpO1w0zM+0bjiiivMypUra4EHbYEMvfnmm2bt2rWFWC9E\nQkq7/VSwEOg7ArKI9B3y+lSIOZupjGXLltVHqYyasAx19OjRiQfvjNX0JRuEgk9SvxHSMtjzQSAh\nrCjDGpGUzJTZMO4zdvjFT2rKlCkdPbPWKRKSFTnlEwL1REAWkXr2S9+0YrDDfM+g1eR59g984APm\nkksuMTNmzOgbdv2oiP5J6jdCWhyjISFFWR6KbqMjSVn1EwkpukdUnhCoHgFZRKrvg0o1wMdg4sSJ\n5u67765UjzyVYw3B55qYJgzG/idPuXXIm8ZvhGBuTMd8/etfry2pvPHGG81+++1nzj///NTwioSk\nhkwZhEAjEJBFpBHdVK6SDNxYRfj2/QzKrbW40g899FC7ZJTlo2GhTb7gE1OH6QpfpyTHvfxGGKSx\nnNRlOqZbm5hCYorm2GOPNfPmzeuWtHNNJKQDhQ6EQOsQEBFpXZdmaxAm87ffftvO5WcroZpcLBFl\nVQZOqkmEQZDB2pekUx9+niqOne5YSXzhPMuwcQhtylJsiAXh4em7cHv8tnEsEhJGRL+FQLsQEBFp\nV39mbg2DGQPyrbfeWlmI7rTKF2UFoJzwjsi9Bse0uhaZHiuPT54IVrZlyxa7RLfIesoui+XFixYt\n6hr3JdzWsnVS+UJACPQfARGR/mNe2xp56DNF04QlsGVbAcJTOiwNrtO0FeTJORejW1OXYGMV4Z4j\n5khY6IM6E8KwvvotBIRANgRERLLh1tpcTHXcddddZuPGjZ2Bro6NZQBjOSjOj/0QfDQY7H3xrRL+\n+X4do9MXv/hFs++++yb2teiXbknrceQ3vGpLJCQpgkonBJqPgIhI8/uw8BbkXWJZuEKhAuuiX3hK\np9+OsBARrCHoccABB4RQas7P008/3e6B46wiIiHN6TtpKgSKQEBEpAgUW1hGXQZ7H1qmY9hMra5x\nMtAv7Ahb5pQOfYT0yyrk90WRxxAP9sNhwzyRkCKRVVlCoBkIiIg0o58q0ZKBbv369TZIVtVLXhnk\nWfKJZA2GVQWIUVM6Rfk9QHKa4M+TBHeWYF9wwQXm4osvTpJcaYSAEGgRAu9oUVvUlIIR4
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 47, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "{'': 0,\n", + " 'inhabitants': 1,\n", + " 'goku': 2,\n", + " 'stunts': 3,\n", + " 'catepillar': 4,\n", + " 'kristensen': 5,\n", + " 'goddess': 7,\n", + " 'offing': 49797,\n", + " 'distroy': 8,\n", + " 'unexplainably': 9,\n", + " 'concoctions': 10,\n", + " 'petite': 11,\n", + " 'paramilitary': 24759,\n", + " 'scribe': 12,\n", + " 'stevson': 13,\n", + " 'senegal': 6,\n", + " 'sctv': 14,\n", + " 'soundscape': 15,\n", + " 'rana': 16,\n", + " 'immortalizer': 18,\n", + " 'rene': 67354,\n", + " 'eko': 23,\n", + " 'planning': 20,\n", + " 'akiva': 21,\n", + " 'plod': 22,\n", + " 'orderly': 24,\n", + " 'zeleznice': 25,\n", + " 'critize': 29,\n", + " 'baguettes': 25649,\n", + " 'jefferies': 30,\n", + " 'uncertainties': 61695,\n", + " 'mountainbillies': 31,\n", + " 'steinbichler': 32,\n", + " 'vowel': 33,\n", + " 'rafe': 34,\n", + " 'donig': 68719,\n", + " 
'tulipe': 36,\n", + " 'clot': 37,\n", + " 'hack': 12526,\n", + " 'distended': 38,\n", + " 'cornered': 37116,\n", + " 'impatiently': 40,\n", + " 'batrice': 12525,\n", + " 'unfortuntly': 41,\n", + " 'lung': 42,\n", + " 'scapegoats': 43,\n", + " 'pscychosexual': 45,\n", + " 'outbid': 46,\n", + " 'obit': 47,\n", + " 'sideshows': 48,\n", + " 'jugde': 49,\n", + " 'kevloun': 51,\n", + " 'quartier': 53,\n", + " 'harp': 61948,\n", + " 'unravelling': 54,\n", + " 'antiques': 56,\n", + " 'strutts': 57,\n", + " 'tilts': 58,\n", + " 'disconcert': 59,\n", + " 'dossiers': 60,\n", + " 'sorriest': 61,\n", + " 'craftsman': 49412,\n", + " 'blart': 62,\n", + " 'dependence': 37120,\n", + " 'sated': 61698,\n", + " 'iberia': 63,\n", + " 'sagan': 72,\n", + " 'frmann': 65,\n", + " 'daniell': 66,\n", + " 'rays': 67,\n", + " 'pried': 68,\n", + " 'khoobsurat': 69,\n", + " 'leavitt': 70,\n", + " 'caiano': 71,\n", + " 'attractiveness': 73,\n", + " 'kitaparaporn': 74,\n", + " 'hamilton': 75,\n", + " 'massages': 76,\n", + " 'horgan': 78,\n", + " 'chemist': 79,\n", + " 'audrey': 80,\n", + " 'yeow': 55655,\n", + " 'jana': 81,\n", + " 'dutch': 82,\n", + " 'pinchot': 24773,\n", + " 'override': 83,\n", + " 'dwervick': 63223,\n", + " 'spasms': 84,\n", + " 'resumed': 85,\n", + " 'tamale': 66259,\n", + " 'calibanian': 49636,\n", + " 'stinson': 86,\n", + " 'widows': 87,\n", + " 'stonewall': 88,\n", + " 'palatial': 89,\n", + " 'neuman': 90,\n", + " 'abandon': 91,\n", + " 'lemmings': 65314,\n", + " 'anglophile': 92,\n", + " 'ertha': 61706,\n", + " 'chevette': 94,\n", + " 'unscary': 95,\n", + " 'spoilerific': 97,\n", + " 'neworleans': 67639,\n", + " 'metamorphose': 17,\n", + " 'brigand': 99,\n", + " 'cheating': 41603,\n", + " 'clued': 101,\n", + " 'dermatonecrotic': 102,\n", + " 'grady': 103,\n", + " 'mulligan': 104,\n", + " 'ol': 105,\n", + " 'incubation': 107,\n", + " 'plaintiffs': 110,\n", + " 'snden': 109,\n", + " 'fk': 111,\n", + " 'deply': 112,\n", + " 'franchot': 113,\n", + " 'henstridge': 19,\n", + " 
'cyhper': 114,\n", + " 'verbose': 26,\n", + " 'mazovia': 116,\n", + " 'elizabeth': 117,\n", + " 'palestine': 118,\n", + " 'robby': 119,\n", + " 'wongo': 120,\n", + " 'moshing': 121,\n", + " 'mstified': 12543,\n", + " 'eeeee': 122,\n", + " 'doltish': 123,\n", + " 'bree': 124,\n", + " 'postponed': 125,\n", + " 'debacles': 127,\n", + " 'amplify': 27,\n", + " 'kamm': 128,\n", + " 'phantom': 18893,\n", + " 'boylen': 136,\n", + " 'rolando': 131,\n", + " 'premises': 133,\n", + " 'bruck': 134,\n", + " 'loosely': 135,\n", + " 'wodehousian': 139,\n", + " 'onishi': 70389,\n", + " 'encapsuling': 140,\n", + " 'partly': 141,\n", + " 'stadling': 144,\n", + " 'calms': 143,\n", + " 'darkie': 148,\n", + " 'wheeling': 147,\n", + " 'ursla': 15875,\n", + " 'subsidized': 49420,\n", + " 'mckellar': 149,\n", + " 'ooookkkk': 151,\n", + " 'milky': 152,\n", + " 'unfolded': 153,\n", + " 'degrades': 154,\n", + " 'authenticating': 155,\n", + " 'writeup': 12548,\n", + " 'rotheroe': 156,\n", + " 'beart': 157,\n", + " 'intoxicants': 160,\n", + " 'grispin': 159,\n", + " 'cannes': 61718,\n", + " 'antithetical': 70398,\n", + " 'nnette': 161,\n", + " 'tsukamoto': 163,\n", + " 'antwones': 44205,\n", + " 'stows': 164,\n", + " 'suddenness': 165,\n", + " 'vol': 61720,\n", + " 'waqt': 166,\n", + " 'camazotz': 168,\n", + " 'paps': 55042,\n", + " 'shakher': 170,\n", + " 'terminate': 63868,\n", + " 'kotex': 56419,\n", + " 'delinquency': 171,\n", + " 'bromwell': 25214,\n", + " 'insecticide': 173,\n", + " 'charlton': 174,\n", + " 'nakada': 177,\n", + " 'titted': 24791,\n", + " 'urbane': 178,\n", + " 'depicted': 54491,\n", + " 'sadomasochistic': 179,\n", + " 'hyping': 181,\n", + " 'yr': 182,\n", + " 'hebert': 183,\n", + " 'waxwork': 12990,\n", + " 'deathrow': 185,\n", + " 'nourishes': 24792,\n", + " 'unmediated': 187,\n", + " 'tamper': 37143,\n", + " 'soad': 190,\n", + " 'alphabet': 189,\n", + " 'donen': 191,\n", + " 'lord': 192,\n", + " 'recess': 193,\n", + " 'watchably': 61023,\n", + " 'handsome': 194,\n", + " 
'vignettes': 196,\n", + " 'pairings': 198,\n", + " 'uselful': 199,\n", + " 'sanders': 200,\n", + " 'outbursts': 72891,\n", + " 'nots': 201,\n", + " 'hatsumomo': 202,\n", + " 'actioned': 18292,\n", + " 'krimi': 24797,\n", + " 'appleby': 203,\n", + " 'tampax': 204,\n", + " 'sprinkling': 205,\n", + " 'defacing': 206,\n", + " 'lofty': 207,\n", + " 'verger': 213,\n", + " 'tablespoons': 211,\n", + " 'bernhard': 212,\n", + " 'goosebump': 64565,\n", + " 'acumen': 214,\n", + " 'percentages': 215,\n", + " 'wendingo': 216,\n", + " 'resonating': 217,\n", + " 'vntoarea': 218,\n", + " 'redundancies': 219,\n", + " 'strictly': 57081,\n", + " 'pitied': 221,\n", + " 'belying': 222,\n", + " 'michelangelo': 53153,\n", + " 'gleefulness': 223,\n", + " 'environmentalist': 24803,\n", + " 'gitane': 226,\n", + " 'corrected': 66547,\n", + " 'journalist': 227,\n", + " 'focusing': 228,\n", + " 'plethora': 229,\n", + " 'his': 39,\n", + " 'citizen': 230,\n", + " 'south': 55579,\n", + " 'clunkers': 232,\n", + " 'pendulous': 55991,\n", + " 'mounds': 24805,\n", + " 'deplorable': 233,\n", + " 'forgive': 234,\n", + " 'proplems': 235,\n", + " 'bankers': 237,\n", + " 'aqua': 238,\n", + " 'donated': 239,\n", + " 'disbelieving': 240,\n", + " 'acomplication': 241,\n", + " 'contrasted': 243,\n", + " 'muzzle': 44,\n", + " 'amphibians': 72141,\n", + " 'springs': 246,\n", + " 'reformatted': 49443,\n", + " 'toolbox': 247,\n", + " 'contacting': 248,\n", + " 'washrooms': 250,\n", + " 'raving': 251,\n", + " 'dynamism': 252,\n", + " 'mae': 253,\n", + " 'disharmony': 255,\n", + " 'molls': 72979,\n", + " 'dewaere': 12569,\n", + " 'untutored': 256,\n", + " 'icarus': 257,\n", + " 'taint': 258,\n", + " 'kargil': 259,\n", + " 'captain': 260,\n", + " 'paucity': 261,\n", + " 'fits': 262,\n", + " 'tumbles': 263,\n", + " 'amer': 264,\n", + " 'bueller': 265,\n", + " 'cleansed': 267,\n", + " 'shara': 269,\n", + " 'humma': 270,\n", + " 'outa': 272,\n", + " 'piglets': 273,\n", + " 'gombell': 274,\n", + " 'supermen': 275,\n", + 
" 'superlow': 276,\n", + " 'kubanskie': 280,\n", + " 'goode': 278,\n", + " 'disorganised': 45570,\n", + " 'zenith': 281,\n", + " 'ananda': 282,\n", + " 'matlin': 284,\n", + " 'particolare': 50,\n", + " 'presumptuous': 286,\n", + " 'rerun': 287,\n", + " 'toyko': 288,\n", + " 'bilb': 291,\n", + " 'sundry': 290,\n", + " 'fugly': 292,\n", + " 'orchestrating': 293,\n", + " 'prosaically': 294,\n", + " 'moveis': 296,\n", + " 'conelly': 297,\n", + " 'estrange': 298,\n", + " 'elfriede': 49455,\n", + " 'masterful': 52,\n", + " 'seasonings': 300,\n", + " 'quincey': 303,\n", + " 'frowning': 49456,\n", + " 'painkillers': 53444,\n", + " 'high': 25515,\n", + " 'flesh': 304,\n", + " 'tootsie': 305,\n", + " 'ai': 306,\n", + " 'tenma': 307,\n", + " 'duguay': 71257,\n", + " 'appropriations': 308,\n", + " 'ides': 310,\n", + " 'rui': 61734,\n", + " 'surrogacy': 311,\n", + " 'pungent': 312,\n", + " 'damaso': 314,\n", + " 'authoritarian': 61736,\n", + " 'caribou': 315,\n", + " 'ro': 318,\n", + " 'supplying': 317,\n", + " 'yuy': 319,\n", + " 'debuted': 321,\n", + " 'mounts': 323,\n", + " 'interpolated': 324,\n", + " 'aetv': 325,\n", + " 'plummer': 326,\n", + " 'asunder': 331,\n", + " 'airfix': 333,\n", + " 'dubiel': 329,\n", + " 'clavichord': 330,\n", + " 'crafty': 50465,\n", + " 'sublety': 332,\n", + " 'stoltzfus': 334,\n", + " 'ruth': 335,\n", + " 'fluorescent': 336,\n", + " 'improves': 337,\n", + " 'russells': 339,\n", + " 'tick': 43838,\n", + " 'zsa': 341,\n", + " 'macs': 343,\n", + " 'jlb': 345,\n", + " 'locus': 348,\n", + " 'mislead': 349,\n", + " 'merly': 49461,\n", + " 'corey': 350,\n", + " 'blundered': 351,\n", + " 'humourless': 3568,\n", + " 'disorganized': 353,\n", + " 'discuss': 354,\n", + " 'sharifi': 45391,\n", + " 'tieing': 356,\n", + " 'kats': 34784,\n", + " 'bbc': 360,\n", + " 'pranked': 362,\n", + " 'superman': 363,\n", + " 'holroyd': 9223,\n", + " 'aggravated': 364,\n", + " 'rifleman': 365,\n", + " 'yvone': 366,\n", + " 'vaugier': 24820,\n", + " 'radiant': 367,\n", + " 
'galico': 368,\n", + " 'debris': 369,\n", + " 'btw': 371,\n", + " 'denote': 24822,\n", + " 'havnt': 372,\n", + " 'francen': 373,\n", + " 'chattered': 374,\n", + " 'scathed': 375,\n", + " 'pic': 376,\n", + " 'ceremonies': 377,\n", + " 'everyplace': 65309,\n", + " 'betsy': 379,\n", + " 'finster': 37176,\n", + " 'meercat': 381,\n", + " 'noirs': 382,\n", + " 'grunts': 383,\n", + " 'tribulations': 385,\n", + " 'apparatus': 47673,\n", + " 'martnez': 25825,\n", + " 'telethons': 24825,\n", + " 'talladega': 387,\n", + " 'alloimono': 390,\n", + " 'situations': 64,\n", + " 'scrutinising': 391,\n", + " 'geta': 392,\n", + " 'beltrami': 393,\n", + " 'pvc': 394,\n", + " 'horse': 395,\n", + " 'tiburon': 396,\n", + " 'huitime': 397,\n", + " 'ripple': 398,\n", + " 'exceed': 61748,\n", + " 'loitering': 399,\n", + " 'forensics': 400,\n", + " 'nearly': 401,\n", + " 'ellington': 403,\n", + " 'uzi': 404,\n", + " 'rung': 408,\n", + " 'pillaged': 24829,\n", + " 'gao': 409,\n", + " 'licitates': 410,\n", + " 'protocol': 411,\n", + " 'smirker': 412,\n", + " 'torin': 413,\n", + " 'vizier': 31853,\n", + " 'newlywed': 414,\n", + " 'dismay': 416,\n", + " 'moonwalks': 418,\n", + " 'skyler': 417,\n", + " 'invested': 18455,\n", + " 'grifter': 421,\n", + " 'undersold': 422,\n", + " 'chearator': 423,\n", + " 'marino': 424,\n", + " 'scala': 425,\n", + " 'conditioner': 426,\n", + " 'lamarre': 428,\n", + " 'figueroa': 429,\n", + " 'mcinnerny': 61753,\n", + " 'allllllll': 431,\n", + " 'slide': 432,\n", + " 'lateness': 433,\n", + " 'selbst': 434,\n", + " 'dramatizing': 436,\n", + " 'doable': 438,\n", + " 'hollywoodize': 27207,\n", + " 'alexanderplatz': 440,\n", + " 'wholesome': 45745,\n", + " 'pandemonium': 441,\n", + " 'earth': 443,\n", + " 'mounties': 444,\n", + " 'seeker': 445,\n", + " 'cheat': 446,\n", + " 'outbreaks': 447,\n", + " 'savagely': 61759,\n", + " 'snowstorm': 448,\n", + " 'baur': 449,\n", + " 'schedules': 450,\n", + " 'bathetic': 451,\n", + " 'johnathon': 453,\n", + " 'origonal': 57843,\n", 
+ " 'rosanne': 454,\n", + " 'cauldrons': 456,\n", + " 'forrest': 457,\n", + " 'poky': 458,\n", + " 'aristos': 54856,\n", + " 'womanness': 460,\n", + " 'spender': 461,\n", + " 'pagliai': 37108,\n", + " 'rational': 463,\n", + " 'terrell': 464,\n", + " 'affronts': 472,\n", + " 'concise': 49476,\n", + " 'mathew': 468,\n", + " 'narnia': 469,\n", + " 'naseeruddin': 470,\n", + " 'bucks': 471,\n", + " 'proceeds': 69809,\n", + " 'topple': 473,\n", + " 'degree': 474,\n", + " 'passionately': 476,\n", + " 'defeats': 477,\n", + " 'gras': 49477,\n", + " 'sources': 479,\n", + " 'pflug': 49976,\n", + " 'botticelli': 480,\n", + " 'fwd': 486,\n", + " 'waiving': 483,\n", + " 'gunnar': 484,\n", + " 'stiffler': 485,\n", + " 'unwise': 49480,\n", + " 'kawajiri': 487,\n", + " 'sistahs': 489,\n", + " 'swallowed': 30511,\n", + " 'soulhunter': 490,\n", + " 'belies': 491,\n", + " 'wrathful': 492,\n", + " 'badmouth': 16696,\n", + " 'floradora': 61766,\n", + " 'unforgivably': 497,\n", + " 'weirdy': 496,\n", + " 'violation': 63309,\n", + " 'chepart': 498,\n", + " 'departmentthe': 500,\n", + " 'posehn': 49483,\n", + " 'peyote': 37188,\n", + " 'psychiatrically': 24846,\n", + " 'marionettes': 503,\n", + " 'blatty': 502,\n", + " 'atop': 504,\n", + " 'debases': 25135,\n", + " 'henze': 24845,\n", + " 'unrooted': 510,\n", + " 'cloudscape': 508,\n", + " 'resignedly': 509,\n", + " 'begin': 49917,\n", + " 'hitlerian': 512,\n", + " 'reedus': 517,\n", + " 'crewed': 514,\n", + " 'bedeviled': 515,\n", + " 'unfurnished': 516,\n", + " 'herrmann': 12602,\n", + " 'circumstances': 518,\n", + " 'grasped': 519,\n", + " 'fn': 521,\n", + " 'beefed': 22200,\n", + " 'scwatch': 64018,\n", + " 'dishwashers': 522,\n", + " 'roadie': 523,\n", + " 'ruthlessness': 524,\n", + " 'migrant': 12605,\n", + " 'refrains': 525,\n", + " 'preponderance': 44377,\n", + " 'lampooning': 526,\n", + " 'richart': 528,\n", + " 'gwenneth': 530,\n", + " 'enmity': 531,\n", + " 'vortex': 61772,\n", + " 'assess': 532,\n", + " 'manufacturer': 533,\n", 
+ " 'bullosa': 534,\n", + " 'citizenship': 61774,\n", + " 'chekov': 537,\n", + " 'hogan': 536,\n", + " 'blithe': 538,\n", + " 'aredavid': 542,\n", + " 'drillings': 540,\n", + " 'revolvers': 541,\n", + " 'boyfriendhe': 545,\n", + " 'achcha': 544,\n", + " 'wallow': 546,\n", + " 'toga': 547,\n", + " 'bosnians': 551,\n", + " 'going': 550,\n", + " 'willy': 552,\n", + " 'fim': 554,\n", + " 'forbidding': 555,\n", + " 'delete': 56779,\n", + " 'rationalised': 557,\n", + " 'shimomo': 558,\n", + " 'opposition': 559,\n", + " 'landis': 560,\n", + " 'minded': 561,\n", + " 'arghhhhh': 564,\n", + " 'trialat': 566,\n", + " 'protected': 567,\n", + " 'negras': 568,\n", + " 'tracker': 571,\n", + " 'muti': 570,\n", + " 'dinky': 49489,\n", + " 'shawl': 572,\n", + " 'differentiates': 573,\n", + " 'dipaolo': 61779,\n", + " 'sweetheart': 574,\n", + " 'manmohan': 576,\n", + " 'enamored': 66265,\n", + " 'trevethyn': 577,\n", + " 'brain': 578,\n", + " 'incomprehensibly': 579,\n", + " 'pasadena': 581,\n", + " 'bruton': 59142,\n", + " 'shtick': 582,\n", + " 'ute': 583,\n", + " 'viggo': 584,\n", + " 'relevent': 589,\n", + " 'cites': 587,\n", + " 'greenaways': 61781,\n", + " 'minidress': 590,\n", + " 'philosopher': 591,\n", + " 'mahattan': 593,\n", + " 'moden': 594,\n", + " 'compiling': 595,\n", + " 'unimaginative': 598,\n", + " 'rogues': 597,\n", + " 'subpaar': 599,\n", + " 'darkly': 601,\n", + " 'saturate': 602,\n", + " 'fledgling': 603,\n", + " 'breaths': 604,\n", + " 'sceam': 37206,\n", + " 'empathized': 58870,\n", + " 'aszombi': 606,\n", + " 'incalculable': 608,\n", + " 'formations': 28596,\n", + " 'hampden': 619,\n", + " 'rawail': 612,\n", + " 'forbid': 613,\n", + " 'holiness': 617,\n", + " 'unessential': 618,\n", + " 'reputedly': 616,\n", + " 'wage': 63181,\n", + " 'kewpie': 24860,\n", + " 'asylum': 620,\n", + " 'bolye': 621,\n", + " 'celticism': 63189,\n", + " 'strangers': 622,\n", + " 'rantzen': 623,\n", + " 'farrellys': 624,\n", + " 'marathon': 93,\n", + " 'cantinflas': 626,\n", + " 
'disproportionately': 12617,\n", + " 'bared': 67212,\n", + " 'enshrined': 627,\n", + " 'expetations': 629,\n", + " 'replaying': 630,\n", + " 'topless': 636,\n", + " 'bukater': 632,\n", + " 'overpaid': 633,\n", + " 'exhude': 634,\n", + " 'nitwits': 638,\n", + " 'tsst': 51554,\n", + " 'sufferings': 637,\n", + " 'ci': 24693,\n", + " 'eponymously': 96,\n", + " 'ferdy': 644,\n", + " 'danira': 641,\n", + " 'unrelenting': 642,\n", + " 'disabling': 643,\n", + " 'gerard': 645,\n", + " 'drewitt': 646,\n", + " 'lamping': 650,\n", + " 'demy': 652,\n", + " 'wicklow': 37214,\n", + " 'relinquish': 651,\n", + " 'feminized': 64196,\n", + " 'drink': 653,\n", + " 'chamberlin': 654,\n", + " 'floodwaters': 657,\n", + " 'searing': 658,\n", + " 'isral': 659,\n", + " 'ling': 660,\n", + " 'grossness': 661,\n", + " 'sassier': 24865,\n", + " 'pickier': 662,\n", + " 'pax': 663,\n", + " 'fleashens': 98,\n", + " 'wierd': 664,\n", + " 'tereasa': 665,\n", + " 'smog': 666,\n", + " 'girotti': 667,\n", + " 'zooey': 64814,\n", + " 'spat': 668,\n", + " 'sera': 669,\n", + " 'misbehaving': 671,\n", + " 'scouts': 672,\n", + " 'refreshments': 673,\n", + " 'itll': 39668,\n", + " 'toyomichi': 676,\n", + " 'politeness': 100,\n", + " 'bits': 677,\n", + " 'psychotics': 678,\n", + " 'optimistic': 61796,\n", + " 'barzell': 679,\n", + " 'colt': 680,\n", + " 'anita': 49501,\n", + " 'shivering': 681,\n", + " 'utah': 59297,\n", + " 'scrivener': 686,\n", + " 'predicable': 687,\n", + " 'dryer': 684,\n", + " 'reissues': 685,\n", + " 'sexier': 26115,\n", + " 'spellbind': 691,\n", + " 'marmalade': 689,\n", + " 'seems': 690,\n", + " 'wyke': 37223,\n", + " 'innovator': 693,\n", + " 'inthused': 695,\n", + " 'scatman': 6309,\n", + " 'contestants': 696,\n", + " 'bertolucci': 106,\n", + " 'serviced': 699,\n", + " 'nozires': 700,\n", + " 'ins': 701,\n", + " 'mutilating': 702,\n", + " 'dupes': 703,\n", + " 'launius': 704,\n", + " 'widescreen': 705,\n", + " 'joo': 706,\n", + " 'discretionary': 707,\n", + " 'enlivens': 708,\n", + 
" 'manos': 55596,\n", + " 'bushes': 709,\n", + " 'header': 711,\n", + " 'activist': 712,\n", + " 'gethsemane': 713,\n", + " 'phoenixs': 714,\n", + " 'wreathed': 715,\n", + " 'oldboy': 108,\n", + " 'electrifyingly': 717,\n", + " 'inseparability': 24874,\n", + " 'ghidora': 719,\n", + " 'binder': 720,\n", + " 'tibet': 51530,\n", + " 'doddsville': 723,\n", + " 'sugar': 722,\n", + " 'porkys': 724,\n", + " 'hopefully': 37226,\n", + " 'scattershot': 725,\n", + " 'refunded': 726,\n", + " 'rudely': 727,\n", + " 'enacts': 67435,\n", + " 'insteadit': 728,\n", + " 'nightwatch': 61803,\n", + " 'eurotrash': 730,\n", + " 'radioraptus': 731,\n", + " 'unreservedly': 73710,\n", + " 'vall': 49508,\n", + " 'boogeman': 733,\n", + " 'flunked': 24880,\n", + " 'weighs': 734,\n", + " 'glorfindel': 738,\n", + " 'hypothermia': 737,\n", + " 'misled': 64919,\n", + " 'toiletries': 71501,\n", + " 'birthdays': 739,\n", + " 'attentive': 740,\n", + " 'mallepa': 741,\n", + " 'manoy': 743,\n", + " 'bombshells': 744,\n", + " 'glorifying': 115,\n", + " 'southron': 747,\n", + " 'destruction': 748,\n", + " 'manhole': 750,\n", + " 'elainor': 751,\n", + " 'bounder': 13003,\n", + " 'bowersock': 752,\n", + " 'lowly': 753,\n", + " 'wfst': 754,\n", + " 'limousines': 755,\n", + " 'skolimowski': 756,\n", + " 'saban': 757,\n", + " 'malaysia': 759,\n", + " 'cyd': 761,\n", + " 'bonecrushing': 763,\n", + " 'merest': 765,\n", + " 'janina': 766,\n", + " 'chemotrodes': 767,\n", + " 'trials': 768,\n", + " 'whilhelm': 770,\n", + " 'asthmatic': 771,\n", + " 'missteps': 773,\n", + " 'melyvn': 24885,\n", + " 'embittered': 774,\n", + " 'profit': 37234,\n", + " 'seeming': 776,\n", + " 'miscalculate': 777,\n", + " 'recommeded': 778,\n", + " 'mankin': 37235,\n", + " 'schoolwork': 779,\n", + " 'coy': 780,\n", + " 'mcconaughey': 781,\n", + " 'waver': 783,\n", + " 'unwatchably': 786,\n", + " 'saggy': 787,\n", + " 'breakup': 790,\n", + " 'pufnstuf': 37237,\n", + " 'superstars': 792,\n", + " 'replay': 793,\n", + " 'aggravates': 
794,\n", + " 'urging': 796,\n", + " 'snidely': 797,\n", + " 'aleksandar': 798,\n", + " 'hildy': 799,\n", + " 'kazuhiro': 800,\n", + " 'slayer': 801,\n", + " 'tangy': 802,\n", + " 'horne': 804,\n", + " 'masayuki': 805,\n", + " 'molden': 806,\n", + " 'unravel': 807,\n", + " 'goodtime': 808,\n", + " 'rowboat': 811,\n", + " 'dekhiye': 815,\n", + " 'datedness': 813,\n", + " 'astrotheology': 814,\n", + " 'suriani': 59610,\n", + " 'hostilities': 819,\n", + " 'wipes': 818,\n", + " 'sentimentalising': 820,\n", + " 'documentary': 821,\n", + " 'virtue': 823,\n", + " 'unreasonably': 824,\n", + " 'cei': 826,\n", + " 'hobbled': 37240,\n", + " 'unglamorised': 827,\n", + " 'balky': 828,\n", + " 'complementary': 829,\n", + " 'paychecks': 830,\n", + " 'tughlaq': 45551,\n", + " 'functionality': 836,\n", + " 'ily': 833,\n", + " 'prc': 834,\n", + " 'ennobling': 835,\n", + " 'dissociated': 837,\n", + " 'elk': 838,\n", + " 'throbbing': 839,\n", + " 'tempe': 840,\n", + " 'linoleum': 841,\n", + " 'bottacin': 843,\n", + " 'hipper': 844,\n", + " 'barging': 846,\n", + " 'untie': 847,\n", + " 'sacchetti': 848,\n", + " 'gnat': 849,\n", + " 'roedel': 850,\n", + " 'performs': 852,\n", + " 'nanavati': 856,\n", + " 'migrs': 854,\n", + " 'teachs': 855,\n", + " 'gunslinger': 126,\n", + " 'fresco': 857,\n", + " 'davison': 858,\n", + " 'jet': 59446,\n", + " 'burglar': 860,\n", + " 'jerker': 69267,\n", + " 'masue': 861,\n", + " 'dickory': 862,\n", + " 'muggy': 46634,\n", + " 'grills': 863,\n", + " 'figment': 28693,\n", + " 'monogamistic': 49527,\n", + " 'appelagate': 864,\n", + " 'linkage': 865,\n", + " 'loesser': 867,\n", + " 'patties': 868,\n", + " 'prudent': 869,\n", + " 'mallorquins': 870,\n", + " 'nativetex': 871,\n", + " 'suprise': 872,\n", + " 'quill': 874,\n", + " 'angsty': 71451,\n", + " 'speeded': 875,\n", + " 'farscape': 876,\n", + " 'herman': 129,\n", + " 'saddening': 877,\n", + " 'centuries': 878,\n", + " 'mos': 879,\n", + " 'neccessarily': 881,\n", + " 'tankers': 883,\n", + " 'latte': 
884,\n", + " 'faracy': 886,\n", + " 'stilts': 24897,\n", + " 'synthetically': 887,\n", + " 'thoughtless': 888,\n", + " 'authoring': 62813,\n", + " 'rake': 889,\n", + " 'ropes': 890,\n", + " 'whitewashed': 892,\n", + " 'donal': 893,\n", + " 'arching': 4910,\n", + " 'cockamamie': 899,\n", + " 'lifeless': 895,\n", + " 'perfidy': 896,\n", + " 'teresa': 897,\n", + " 'bulldog': 898,\n", + " 'vingh': 73726,\n", + " 'evacuees': 65858,\n", + " 'rasberries': 900,\n", + " 'chiseling': 903,\n", + " 'clampets': 905,\n", + " 'grecianized': 138,\n", + " 'smaller': 904,\n", + " 'kluznick': 62184,\n", + " 'alerts': 906,\n", + " 'aaaahhhhhhh': 909,\n", + " 'wellingtonian': 908,\n", + " 'dither': 910,\n", + " 'incertitude': 911,\n", + " 'florentine': 912,\n", + " 'imperioli': 913,\n", + " 'licking': 914,\n", + " 'disparagement': 915,\n", + " 'artfully': 916,\n", + " 'feds': 917,\n", + " 'fumiya': 918,\n", + " 'jbl': 52774,\n", + " 'tearfully': 919,\n", + " 'welfare': 24905,\n", + " 'idyllically': 49534,\n", + " 'isha': 43702,\n", + " 'lanchester': 920,\n", + " 'undertaken': 921,\n", + " 'longlost': 922,\n", + " 'netted': 923,\n", + " 'carrell': 924,\n", + " 'uncompelling': 925,\n", + " 'stems': 37258,\n", + " 'reliefs': 926,\n", + " 'leona': 927,\n", + " 'autorenfilm': 928,\n", + " 'unfriendly': 929,\n", + " 'typewriter': 930,\n", + " 'shifted': 931,\n", + " 'bertrand': 932,\n", + " 'blesses': 933,\n", + " 'leukemia': 12666,\n", + " 'posative': 142,\n", + " 'tricking': 934,\n", + " 'zanes': 936,\n", + " 'dashboard': 12667,\n", + " 'unknowingly': 937,\n", + " 'flatmates': 51897,\n", + " 'unnerve': 938,\n", + " 'caning': 939,\n", + " 'shortland': 146,\n", + " 'recluse': 941,\n", + " 'dcreasy': 942,\n", + " 'scratchiness': 24911,\n", + " 'pms': 30930,\n", + " 'chipmunk': 943,\n", + " 'tkachenko': 49537,\n", + " 'dipper': 944,\n", + " 'europeans': 61601,\n", + " 'berserkers': 948,\n", + " 'shys': 947,\n", + " 'monte': 68505,\n", + " 'eve': 949,\n", + " 'luxury': 61828,\n", + " 
'conflagration': 950,\n", + " 'water': 46389,\n", + " 'irks': 951,\n", + " 'positronic': 954,\n", + " 'cushy': 150,\n", + " 'swiftness': 957,\n", + " 'underimpressed': 964,\n", + " 'imprint': 959,\n", + " 'sundance': 961,\n", + " 'aida': 31951,\n", + " 'thematically': 963,\n", + " 'uno': 965,\n", + " 'expressly': 966,\n", + " 'russkies': 967,\n", + " 'discos': 968,\n", + " 'shaping': 969,\n", + " 'verson': 970,\n", + " 'blushed': 61831,\n", + " 'prototype': 971,\n", + " 'lifewell': 976,\n", + " 'trafficker': 973,\n", + " 'crucifixions': 62188,\n", + " 'unrealistically': 975,\n", + " 'rivas': 977,\n", + " 'consequent': 978,\n", + " 'katsu': 979,\n", + " 'titantic': 980,\n", + " 'jalees': 981,\n", + " 'ranee': 982,\n", + " 'gambles': 984,\n", + " 'dispenses': 985,\n", + " 'disfigurement': 986,\n", + " 'bright': 987,\n", + " 'cristian': 988,\n", + " 'subculture': 37268,\n", + " 'capta': 991,\n", + " 'jewel': 992,\n", + " 'erect': 993,\n", + " 'avoide': 996,\n", + " 'inconnu': 997,\n", + " 'headquarters': 998,\n", + " 'babbling': 1000,\n", + " 'pac': 1001,\n", + " 'performace': 1003,\n", + " 'dorrit': 1004,\n", + " 'runners': 1005,\n", + " 'sentimentality': 1006,\n", + " 'marred': 1007,\n", + " 'commemorative': 1008,\n", + " 'helpers': 1012,\n", + " 'chiles': 1011,\n", + " 'snowy': 1013,\n", + " 'cheddar': 1014,\n", + " 'neath': 158,\n", + " 'outshine': 1016,\n", + " 'nadu': 1019,\n", + " 'wellbeing': 1020,\n", + " 'envisioned': 43779,\n", + " 'fanaticism': 1021,\n", + " 'morrisette': 12687,\n", + " 'sesame': 1024,\n", + " 'gran': 1023,\n", + " 'marlina': 1025,\n", + " 'artificiality': 1030,\n", + " 'coinsidence': 1027,\n", + " 'founders': 1028,\n", + " 'dismissably': 1029,\n", + " 'dracht': 66299,\n", + " 'scavengers': 1031,\n", + " 'neese': 12685,\n", + " 'pangborn': 1034,\n", + " 'elmore': 1039,\n", + " 'bristol': 71162,\n", + " 'lillies': 1035,\n", + " 'parkers': 1036,\n", + " 'skipped': 1038,\n", + " 'clipboard': 1042,\n", + " 'jucier': 1041,\n", + " 'haifa': 
1043,\n", + " ...}" + ] + }, + "execution_count": 48, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "word2index = {}\n", + "\n", + "for i,word in enumerate(vocab):\n", + " word2index[word] = i\n", + "word2index" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_target_for_label(label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 54, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "1" + ] + }, + "execution_count": 52, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 55, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'NEGATIVE'" + 
] + }, + "execution_count": 55, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[1]" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 53, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[1])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 3: Building a Neural Network" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "- Start with your neural network from the last chapter\n", + "- 3 layer neural network\n", + "- no non-linearity in hidden layer\n", + "- use our functions to create the training data\n", + "- create a \"pre_process_data\" function to create vocabulary for our training data generating functions\n", + "- modify \"train\" to train over the entire corpus" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Where to Get Help if You Need it\n", + "- Re-watch previous week's Udacity Lectures\n", + "- Chapters 3-5 - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) - (40% Off: **traskud17**)" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 4).ipynb b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 4).ipynb new file mode 
100644 index 0000000..16304b4 --- /dev/null +++ b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 4).ipynb @@ -0,0 +1,5313 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sentiment Classification & How To \"Frame Problems\" for a Neural Network\n", + "\n", + "by Andrew Trask\n", + "\n", + "- **Twitter**: @iamtrask\n", + "- **Blog**: http://iamtrask.github.io" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### What You Should Already Know\n", + "\n", + "- neural networks, forward and back-propagation\n", + "- stochastic gradient descent\n", + "- mean squared error\n", + "- and train/test splits\n", + "\n", + "### Where to Get Help if You Need it\n", + "- Re-watch previous Udacity Lectures\n", + "- Leverage the recommended Course Reading Material - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) (40% Off: **traskud17**)\n", + "- Shoot me a tweet @iamtrask\n", + "\n", + "\n", + "### Tutorial Outline:\n", + "\n", + "- Intro: The Importance of \"Framing a Problem\"\n", + "\n", + "\n", + "- Curate a Dataset\n", + "- Developing a \"Predictive Theory\"\n", + "- **PROJECT 1**: Quick Theory Validation\n", + "\n", + "\n", + "- Transforming Text to Numbers\n", + "- **PROJECT 2**: Creating the Input/Output Data\n", + "\n", + "\n", + "- Putting it all together in a Neural Network\n", + "- **PROJECT 3**: Building our Neural Network\n", + "\n", + "\n", + "- Understanding Neural Noise\n", + "- **PROJECT 4**: Making Learning Faster by Reducing Noise\n", + "\n", + "\n", + "- Analyzing Inefficiencies in our Network\n", + "- **PROJECT 5**: Making our Network Train and Run Faster\n", + "\n", + "\n", + "- Further Noise Reduction\n", + "- **PROJECT 6**: Reducing Noise by Strategically Reducing the Vocabulary\n", + "\n", + "\n", + "- Analysis: What's going on in the weights?" 
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbpresent": { + "id": "56bb3cba-260c-4ebe-9ed6-b995b4c72aa3" + } + }, + "source": [ + "# Lesson: Curate a Dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "eba2b193-0419-431e-8db9-60f34dd3fe83" + } + }, + "outputs": [], + "source": [ + "def pretty_print_review_and_label(i):\n", + " print(labels[i] + \"\\t:\\t\" + reviews[i][:80] + \"...\")\n", + "\n", + "g = open('reviews.txt','r') # What we know!\n", + "reviews = list(map(lambda x:x[:-1],g.readlines()))\n", + "g.close()\n", + "\n", + "g = open('labels.txt','r') # What we WANT to know!\n", + "labels = list(map(lambda x:x[:-1].upper(),g.readlines()))\n", + "g.close()" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "25000" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "len(reviews)" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "bb95574b-21a0-4213-ae50-34363cf4f87f" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy . it ran at the same time as some other programs about school life such as teachers . my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers . the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students . when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled . . . . . . . . . at . . . . . . . . . . high . a classic line inspector i m here to sack one of your teachers . student welcome to bromwell high . 
i expect that many adults of my age think that bromwell high is far fetched . what a pity that it isn t '" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "reviews[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e0408810-c424-4ed4-afb9-1735e9ddbd0a" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Lesson: Develop a Predictive Theory" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e67a709f-234f-4493-bae6-4fb192141ee0" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "labels.txt \t : \t reviews.txt\n", + "\n", + "NEGATIVE\t:\tthis movie is terrible but it has some good effects . ...\n", + "POSITIVE\t:\tadrian pasdar is excellent is this film . he makes a fascinating woman . ...\n", + "NEGATIVE\t:\tcomment this movie is impossible . is terrible very improbable bad interpretat...\n", + "POSITIVE\t:\texcellent episode movie ala pulp fiction . days suicides . it doesnt get more...\n", + "NEGATIVE\t:\tif you haven t seen this it s terrible . it is pure trash . 
i saw this about ...\n", + "POSITIVE\t:\tthis schiffer guy is a real genius the movie is of excellent quality and both e...\n" + ] + } + ], + "source": [ + "print(\"labels.txt \\t : \\t reviews.txt\\n\")\n", + "pretty_print_review_and_label(2137)\n", + "pretty_print_review_and_label(12816)\n", + "pretty_print_review_and_label(6267)\n", + "pretty_print_review_and_label(21934)\n", + "pretty_print_review_and_label(5297)\n", + "pretty_print_review_and_label(4998)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 1: Quick Theory Validation" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from collections import Counter\n", + "import numpy as np" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "positive_counts = Counter()\n", + "negative_counts = Counter()\n", + "total_counts = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for i in range(len(reviews)):\n", + " if(labels[i] == 'POSITIVE'):\n", + " for word in reviews[i].split(\" \"):\n", + " positive_counts[word] += 1\n", + " total_counts[word] += 1\n", + " else:\n", + " for word in reviews[i].split(\" \"):\n", + " negative_counts[word] += 1\n", + " total_counts[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('', 550468),\n", + " ('the', 173324),\n", + " ('.', 159654),\n", + " ('and', 89722),\n", + " ('a', 83688),\n", + " ('of', 76855),\n", + " ('to', 66746),\n", + " ('is', 57245),\n", + " ('in', 50215),\n", + " ('br', 49235),\n", + " ('it', 48025),\n", + " ('i', 40743),\n", + " ('that', 35630),\n", + " ('this', 35080),\n", + " ('s', 33815),\n", + " ('as', 26308),\n", + " ('with', 23247),\n", + " 
('for', 22416),\n", + " ('was', 21917),\n", + " ('film', 20937),\n", + " ('but', 20822),\n", + " ('movie', 19074),\n", + " ('his', 17227),\n", + " ('on', 17008),\n", + " ('you', 16681),\n", + " ('he', 16282),\n", + " ('are', 14807),\n", + " ('not', 14272),\n", + " ('t', 13720),\n", + " ('one', 13655),\n", + " ('have', 12587),\n", + " ('be', 12416),\n", + " ('by', 11997),\n", + " ('all', 11942),\n", + " ('who', 11464),\n", + " ('an', 11294),\n", + " ('at', 11234),\n", + " ('from', 10767),\n", + " ('her', 10474),\n", + " ('they', 9895),\n", + " ('has', 9186),\n", + " ('so', 9154),\n", + " ('like', 9038),\n", + " ('about', 8313),\n", + " ('very', 8305),\n", + " ('out', 8134),\n", + " ('there', 8057),\n", + " ('she', 7779),\n", + " ('what', 7737),\n", + " ('or', 7732),\n", + " ('good', 7720),\n", + " ('more', 7521),\n", + " ('when', 7456),\n", + " ('some', 7441),\n", + " ('if', 7285),\n", + " ('just', 7152),\n", + " ('can', 7001),\n", + " ('story', 6780),\n", + " ('time', 6515),\n", + " ('my', 6488),\n", + " ('great', 6419),\n", + " ('well', 6405),\n", + " ('up', 6321),\n", + " ('which', 6267),\n", + " ('their', 6107),\n", + " ('see', 6026),\n", + " ('also', 5550),\n", + " ('we', 5531),\n", + " ('really', 5476),\n", + " ('would', 5400),\n", + " ('will', 5218),\n", + " ('me', 5167),\n", + " ('had', 5148),\n", + " ('only', 5137),\n", + " ('him', 5018),\n", + " ('even', 4964),\n", + " ('most', 4864),\n", + " ('other', 4858),\n", + " ('were', 4782),\n", + " ('first', 4755),\n", + " ('than', 4736),\n", + " ('much', 4685),\n", + " ('its', 4622),\n", + " ('no', 4574),\n", + " ('into', 4544),\n", + " ('people', 4479),\n", + " ('best', 4319),\n", + " ('love', 4301),\n", + " ('get', 4272),\n", + " ('how', 4213),\n", + " ('life', 4199),\n", + " ('been', 4189),\n", + " ('because', 4079),\n", + " ('way', 4036),\n", + " ('do', 3941),\n", + " ('made', 3823),\n", + " ('films', 3813),\n", + " ('them', 3805),\n", + " ('after', 3800),\n", + " ('many', 3766),\n", + " ('two', 3733),\n", + 
" ('too', 3659),\n", + " ('think', 3655),\n", + " ('movies', 3586),\n", + " ('characters', 3560),\n", + " ('character', 3514),\n", + " ('don', 3468),\n", + " ('man', 3460),\n", + " ('show', 3432),\n", + " ('watch', 3424),\n", + " ('seen', 3414),\n", + " ('then', 3358),\n", + " ('little', 3341),\n", + " ('still', 3340),\n", + " ('make', 3303),\n", + " ('could', 3237),\n", + " ('never', 3226),\n", + " ('being', 3217),\n", + " ('where', 3173),\n", + " ('does', 3069),\n", + " ('over', 3017),\n", + " ('any', 3002),\n", + " ('while', 2899),\n", + " ('know', 2833),\n", + " ('did', 2790),\n", + " ('years', 2758),\n", + " ('here', 2740),\n", + " ('ever', 2734),\n", + " ('end', 2696),\n", + " ('these', 2694),\n", + " ('such', 2590),\n", + " ('real', 2568),\n", + " ('scene', 2567),\n", + " ('back', 2547),\n", + " ('those', 2485),\n", + " ('though', 2475),\n", + " ('off', 2463),\n", + " ('new', 2458),\n", + " ('your', 2453),\n", + " ('go', 2440),\n", + " ('acting', 2437),\n", + " ('plot', 2432),\n", + " ('world', 2429),\n", + " ('scenes', 2427),\n", + " ('say', 2414),\n", + " ('through', 2409),\n", + " ('makes', 2390),\n", + " ('better', 2381),\n", + " ('now', 2368),\n", + " ('work', 2346),\n", + " ('young', 2343),\n", + " ('old', 2311),\n", + " ('ve', 2307),\n", + " ('find', 2272),\n", + " ('both', 2248),\n", + " ('before', 2177),\n", + " ('us', 2162),\n", + " ('again', 2158),\n", + " ('series', 2153),\n", + " ('quite', 2143),\n", + " ('something', 2135),\n", + " ('cast', 2133),\n", + " ('should', 2121),\n", + " ('part', 2098),\n", + " ('always', 2088),\n", + " ('lot', 2087),\n", + " ('another', 2075),\n", + " ('actors', 2047),\n", + " ('director', 2040),\n", + " ('family', 2032),\n", + " ('between', 2016),\n", + " ('own', 2016),\n", + " ('m', 1998),\n", + " ('may', 1997),\n", + " ('same', 1972),\n", + " ('role', 1967),\n", + " ('watching', 1966),\n", + " ('every', 1954),\n", + " ('funny', 1953),\n", + " ('doesn', 1935),\n", + " ('performance', 1928),\n", + " ('few', 
1918),\n", + " ('bad', 1907),\n", + " ('look', 1900),\n", + " ('re', 1884),\n", + " ('why', 1855),\n", + " ('things', 1849),\n", + " ('times', 1832),\n", + " ('big', 1815),\n", + " ('however', 1795),\n", + " ('actually', 1790),\n", + " ('action', 1789),\n", + " ('going', 1783),\n", + " ('bit', 1757),\n", + " ('comedy', 1742),\n", + " ('down', 1740),\n", + " ('music', 1738),\n", + " ('must', 1728),\n", + " ('take', 1709),\n", + " ('saw', 1692),\n", + " ('long', 1690),\n", + " ('right', 1688),\n", + " ('fun', 1686),\n", + " ('fact', 1684),\n", + " ('excellent', 1683),\n", + " ('around', 1674),\n", + " ('didn', 1672),\n", + " ('without', 1671),\n", + " ('thing', 1662),\n", + " ('thought', 1639),\n", + " ('got', 1635),\n", + " ('each', 1630),\n", + " ('day', 1614),\n", + " ('feel', 1597),\n", + " ('seems', 1596),\n", + " ('come', 1594),\n", + " ('done', 1586),\n", + " ('beautiful', 1580),\n", + " ('especially', 1572),\n", + " ('played', 1571),\n", + " ('almost', 1566),\n", + " ('want', 1562),\n", + " ('yet', 1556),\n", + " ('give', 1553),\n", + " ('pretty', 1549),\n", + " ('last', 1543),\n", + " ('since', 1519),\n", + " ('different', 1504),\n", + " ('although', 1501),\n", + " ('gets', 1490),\n", + " ('true', 1487),\n", + " ('interesting', 1481),\n", + " ('job', 1470),\n", + " ('enough', 1455),\n", + " ('our', 1454),\n", + " ('shows', 1447),\n", + " ('horror', 1441),\n", + " ('woman', 1439),\n", + " ('tv', 1400),\n", + " ('probably', 1398),\n", + " ('father', 1395),\n", + " ('original', 1393),\n", + " ('girl', 1390),\n", + " ('point', 1379),\n", + " ('plays', 1378),\n", + " ('wonderful', 1372),\n", + " ('far', 1358),\n", + " ('course', 1358),\n", + " ('john', 1350),\n", + " ('rather', 1340),\n", + " ('isn', 1328),\n", + " ('ll', 1326),\n", + " ('later', 1324),\n", + " ('dvd', 1324),\n", + " ('war', 1310),\n", + " ('whole', 1310),\n", + " ('d', 1307),\n", + " ('away', 1306),\n", + " ('found', 1306),\n", + " ('screen', 1305),\n", + " ('nothing', 1300),\n", + " ('year', 
1297),\n", + " ('once', 1296),\n", + " ('hard', 1294),\n", + " ('together', 1280),\n", + " ('am', 1277),\n", + " ('set', 1277),\n", + " ('having', 1266),\n", + " ('making', 1265),\n", + " ('place', 1263),\n", + " ('comes', 1260),\n", + " ('might', 1260),\n", + " ('sure', 1253),\n", + " ('american', 1248),\n", + " ('play', 1245),\n", + " ('kind', 1244),\n", + " ('takes', 1242),\n", + " ('perfect', 1242),\n", + " ('performances', 1237),\n", + " ('himself', 1230),\n", + " ('worth', 1221),\n", + " ('everyone', 1221),\n", + " ('anyone', 1214),\n", + " ('actor', 1203),\n", + " ('three', 1201),\n", + " ('wife', 1196),\n", + " ('classic', 1192),\n", + " ('goes', 1186),\n", + " ('ending', 1178),\n", + " ('version', 1168),\n", + " ('star', 1149),\n", + " ('enjoy', 1146),\n", + " ('book', 1142),\n", + " ('nice', 1132),\n", + " ('everything', 1128),\n", + " ('during', 1124),\n", + " ('put', 1118),\n", + " ('seeing', 1111),\n", + " ('least', 1102),\n", + " ('house', 1100),\n", + " ('high', 1095),\n", + " ('watched', 1094),\n", + " ('men', 1087),\n", + " ('loved', 1087),\n", + " ('night', 1082),\n", + " ('anything', 1075),\n", + " ('guy', 1071),\n", + " ('believe', 1071),\n", + " ('top', 1063),\n", + " ('amazing', 1058),\n", + " ('hollywood', 1056),\n", + " ('looking', 1053),\n", + " ('main', 1044),\n", + " ('definitely', 1043),\n", + " ('gives', 1031),\n", + " ('home', 1029),\n", + " ('seem', 1028),\n", + " ('episode', 1023),\n", + " ('sense', 1020),\n", + " ('audience', 1020),\n", + " ('truly', 1017),\n", + " ('special', 1011),\n", + " ('fan', 1009),\n", + " ('second', 1009),\n", + " ('short', 1009),\n", + " ('mind', 1005),\n", + " ('human', 1001),\n", + " ('recommend', 999),\n", + " ('full', 996),\n", + " ('black', 995),\n", + " ('help', 991),\n", + " ('along', 989),\n", + " ('trying', 987),\n", + " ('small', 986),\n", + " ('death', 985),\n", + " ('friends', 981),\n", + " ('remember', 974),\n", + " ('often', 970),\n", + " ('said', 966),\n", + " ('favorite', 962),\n", + " 
('heart', 959),\n", + " ('early', 957),\n", + " ('left', 956),\n", + " ('until', 955),\n", + " ('let', 954),\n", + " ('script', 954),\n", + " ('maybe', 937),\n", + " ('today', 936),\n", + " ('live', 934),\n", + " ('less', 934),\n", + " ('moments', 933),\n", + " ('others', 929),\n", + " ('brilliant', 926),\n", + " ('shot', 925),\n", + " ('liked', 923),\n", + " ('become', 916),\n", + " ('won', 915),\n", + " ('used', 910),\n", + " ('style', 907),\n", + " ('mother', 895),\n", + " ('lives', 894),\n", + " ('came', 893),\n", + " ('stars', 890),\n", + " ('cinema', 889),\n", + " ('looks', 885),\n", + " ('perhaps', 884),\n", + " ('read', 882),\n", + " ('enjoyed', 879),\n", + " ('boy', 875),\n", + " ('drama', 873),\n", + " ('highly', 871),\n", + " ('given', 870),\n", + " ('playing', 867),\n", + " ('use', 864),\n", + " ('next', 859),\n", + " ('women', 858),\n", + " ('fine', 857),\n", + " ('effects', 856),\n", + " ('kids', 854),\n", + " ('entertaining', 853),\n", + " ('need', 852),\n", + " ('line', 850),\n", + " ('works', 848),\n", + " ('someone', 847),\n", + " ('mr', 836),\n", + " ('simply', 835),\n", + " ('children', 833),\n", + " ('picture', 833),\n", + " ('face', 831),\n", + " ('friend', 831),\n", + " ('keep', 831),\n", + " ('dark', 830),\n", + " ('overall', 828),\n", + " ('certainly', 828),\n", + " ('minutes', 827),\n", + " ('wasn', 824),\n", + " ('history', 822),\n", + " ('finally', 820),\n", + " ('couple', 816),\n", + " ('against', 815),\n", + " ('son', 809),\n", + " ('understand', 808),\n", + " ('lost', 807),\n", + " ('michael', 805),\n", + " ('else', 801),\n", + " ('throughout', 798),\n", + " ('fans', 797),\n", + " ('city', 792),\n", + " ('reason', 789),\n", + " ('written', 787),\n", + " ('production', 787),\n", + " ('several', 784),\n", + " ('school', 783),\n", + " ('rest', 781),\n", + " ('based', 781),\n", + " ('try', 780),\n", + " ('dead', 776),\n", + " ('hope', 775),\n", + " ('strong', 768),\n", + " ('white', 765),\n", + " ('tell', 759),\n", + " ('itself', 
758),\n", + " ('half', 753),\n", + " ('person', 749),\n", + " ('sometimes', 746),\n", + " ('past', 744),\n", + " ('start', 744),\n", + " ('genre', 743),\n", + " ('final', 739),\n", + " ('beginning', 739),\n", + " ('town', 738),\n", + " ('art', 734),\n", + " ('game', 732),\n", + " ('humor', 732),\n", + " ('yes', 731),\n", + " ('idea', 731),\n", + " ('late', 730),\n", + " ('becomes', 729),\n", + " ('despite', 729),\n", + " ('able', 726),\n", + " ('case', 726),\n", + " ('money', 723),\n", + " ('child', 721),\n", + " ('completely', 721),\n", + " ('side', 719),\n", + " ('camera', 716),\n", + " ('getting', 714),\n", + " ('instead', 712),\n", + " ('soon', 702),\n", + " ('under', 700),\n", + " ('viewer', 699),\n", + " ('age', 697),\n", + " ('days', 696),\n", + " ('stories', 696),\n", + " ('felt', 694),\n", + " ('simple', 694),\n", + " ('roles', 693),\n", + " ('video', 688),\n", + " ('name', 683),\n", + " ('either', 683),\n", + " ('doing', 677),\n", + " ('turns', 674),\n", + " ('wants', 671),\n", + " ('close', 671),\n", + " ('title', 669),\n", + " ('wrong', 668),\n", + " ('went', 666),\n", + " ('james', 665),\n", + " ('evil', 659),\n", + " ('budget', 657),\n", + " ('episodes', 657),\n", + " ('relationship', 655),\n", + " ('piece', 653),\n", + " ('fantastic', 653),\n", + " ('david', 651),\n", + " ('turn', 648),\n", + " ('murder', 646),\n", + " ('parts', 645),\n", + " ('brother', 644),\n", + " ('head', 643),\n", + " ('absolutely', 643),\n", + " ('experience', 642),\n", + " ('eyes', 641),\n", + " ('sex', 638),\n", + " ('direction', 637),\n", + " ('called', 637),\n", + " ('directed', 636),\n", + " ('lines', 634),\n", + " ('behind', 633),\n", + " ('sort', 632),\n", + " ('actress', 631),\n", + " ('lead', 630),\n", + " ('oscar', 628),\n", + " ('example', 627),\n", + " ('including', 627),\n", + " ('known', 625),\n", + " ('musical', 625),\n", + " ('chance', 621),\n", + " ('score', 620),\n", + " ('feeling', 619),\n", + " ('already', 619),\n", + " ('hit', 619),\n", + " ('voice', 
615),\n", + " ('moment', 612),\n", + " ('living', 612),\n", + " ('low', 610),\n", + " ('supporting', 610),\n", + " ('ago', 609),\n", + " ('themselves', 608),\n", + " ('hilarious', 605),\n", + " ('reality', 605),\n", + " ('jack', 604),\n", + " ('told', 603),\n", + " ('hand', 601),\n", + " ('moving', 600),\n", + " ('dialogue', 600),\n", + " ('quality', 600),\n", + " ('song', 599),\n", + " ('happy', 599),\n", + " ('paul', 598),\n", + " ('matter', 598),\n", + " ('light', 594),\n", + " ('future', 593),\n", + " ('entire', 592),\n", + " ('finds', 591),\n", + " ('gave', 589),\n", + " ('laugh', 587),\n", + " ('released', 586),\n", + " ('expect', 584),\n", + " ('fight', 581),\n", + " ('particularly', 580),\n", + " ('cinematography', 579),\n", + " ('police', 579),\n", + " ('whose', 578),\n", + " ('type', 578),\n", + " ('sound', 578),\n", + " ('enjoyable', 573),\n", + " ('view', 573),\n", + " ('husband', 572),\n", + " ('romantic', 572),\n", + " ('number', 572),\n", + " ('daughter', 572),\n", + " ('documentary', 571),\n", + " ('self', 570),\n", + " ('modern', 569),\n", + " ('robert', 569),\n", + " ('took', 569),\n", + " ('superb', 569),\n", + " ('mean', 566),\n", + " ('shown', 563),\n", + " ('coming', 561),\n", + " ('important', 560),\n", + " ('king', 559),\n", + " ('leave', 559),\n", + " ('change', 558),\n", + " ('wanted', 555),\n", + " ('somewhat', 555),\n", + " ('tells', 554),\n", + " ('run', 552),\n", + " ('events', 552),\n", + " ('country', 552),\n", + " ('career', 552),\n", + " ('heard', 550),\n", + " ('season', 550),\n", + " ('girls', 549),\n", + " ('greatest', 549),\n", + " ('etc', 547),\n", + " ('care', 546),\n", + " ('starts', 545),\n", + " ('english', 542),\n", + " ('killer', 541),\n", + " ('animation', 540),\n", + " ('guys', 540),\n", + " ('totally', 540),\n", + " ('tale', 540),\n", + " ('usual', 539),\n", + " ('opinion', 535),\n", + " ('miss', 535),\n", + " ('violence', 531),\n", + " ('easy', 531),\n", + " ('songs', 530),\n", + " ('british', 528),\n", + " ('says', 
526),\n", + " ('realistic', 525),\n", + " ('writing', 524),\n", + " ('act', 522),\n", + " ('writer', 522),\n", + " ('comic', 521),\n", + " ('thriller', 519),\n", + " ('television', 517),\n", + " ('power', 516),\n", + " ('ones', 515),\n", + " ('kid', 514),\n", + " ('novel', 513),\n", + " ('york', 513),\n", + " ('problem', 512),\n", + " ('alone', 512),\n", + " ('attention', 509),\n", + " ('involved', 508),\n", + " ('kill', 507),\n", + " ('extremely', 507),\n", + " ('seemed', 506),\n", + " ('hero', 505),\n", + " ('french', 505),\n", + " ('rock', 504),\n", + " ('stuff', 501),\n", + " ('wish', 499),\n", + " ('begins', 498),\n", + " ('taken', 497),\n", + " ('sad', 497),\n", + " ('ways', 496),\n", + " ('richard', 495),\n", + " ('knows', 494),\n", + " ('atmosphere', 493),\n", + " ('surprised', 491),\n", + " ('similar', 491),\n", + " ('taking', 491),\n", + " ('car', 491),\n", + " ('george', 490),\n", + " ('perfectly', 490),\n", + " ('across', 489),\n", + " ('sequence', 489),\n", + " ('eye', 489),\n", + " ('team', 489),\n", + " ('serious', 488),\n", + " ('powerful', 488),\n", + " ('room', 488),\n", + " ('due', 488),\n", + " ('among', 488),\n", + " ('order', 487),\n", + " ('b', 487),\n", + " ('cannot', 487),\n", + " ('strange', 487),\n", + " ('beauty', 486),\n", + " ('famous', 485),\n", + " ('tries', 484),\n", + " ('myself', 484),\n", + " ('happened', 484),\n", + " ('herself', 484),\n", + " ('class', 483),\n", + " ('four', 482),\n", + " ('cool', 481),\n", + " ('release', 479),\n", + " ('anyway', 479),\n", + " ('theme', 479),\n", + " ('opening', 478),\n", + " ('entertainment', 477),\n", + " ('unique', 475),\n", + " ('ends', 475),\n", + " ('slow', 475),\n", + " ('exactly', 475),\n", + " ('red', 474),\n", + " ('o', 474),\n", + " ('level', 474),\n", + " ('easily', 474),\n", + " ('interest', 472),\n", + " ('happen', 471),\n", + " ('crime', 470),\n", + " ('viewing', 468),\n", + " ('memorable', 467),\n", + " ('sets', 467),\n", + " ('group', 466),\n", + " ('stop', 466),\n", + " 
('dance', 463),\n", + " ('message', 463),\n", + " ('sister', 463),\n", + " ('working', 463),\n", + " ('problems', 463),\n", + " ('knew', 462),\n", + " ('mystery', 461),\n", + " ('nature', 461),\n", + " ('bring', 460),\n", + " ('believable', 459),\n", + " ('thinking', 459),\n", + " ('brought', 459),\n", + " ('mostly', 458),\n", + " ('couldn', 457),\n", + " ('disney', 457),\n", + " ('society', 456),\n", + " ('within', 455),\n", + " ('lady', 455),\n", + " ('blood', 454),\n", + " ('upon', 453),\n", + " ('viewers', 453),\n", + " ('parents', 453),\n", + " ('meets', 452),\n", + " ('form', 452),\n", + " ('soundtrack', 452),\n", + " ('usually', 452),\n", + " ('tom', 452),\n", + " ('peter', 452),\n", + " ('local', 450),\n", + " ('certain', 448),\n", + " ('follow', 448),\n", + " ('whether', 447),\n", + " ('possible', 446),\n", + " ('emotional', 445),\n", + " ('killed', 444),\n", + " ('de', 444),\n", + " ('above', 444),\n", + " ('middle', 443),\n", + " ('god', 443),\n", + " ('happens', 442),\n", + " ('flick', 442),\n", + " ('needs', 442),\n", + " ('masterpiece', 441),\n", + " ('major', 440),\n", + " ('period', 440),\n", + " ('haven', 439),\n", + " ('named', 439),\n", + " ('th', 438),\n", + " ('particular', 438),\n", + " ('earth', 437),\n", + " ('feature', 437),\n", + " ('stand', 436),\n", + " ('words', 435),\n", + " ('typical', 435),\n", + " ('obviously', 433),\n", + " ('elements', 433),\n", + " ('romance', 431),\n", + " ('jane', 430),\n", + " ('yourself', 427),\n", + " ('showing', 427),\n", + " ('fantasy', 426),\n", + " ('brings', 426),\n", + " ('america', 423),\n", + " ('guess', 423),\n", + " ('huge', 422),\n", + " ('unfortunately', 422),\n", + " ('indeed', 421),\n", + " ('running', 421),\n", + " ('talent', 420),\n", + " ('stage', 419),\n", + " ('started', 418),\n", + " ('sweet', 417),\n", + " ('leads', 417),\n", + " ('japanese', 417),\n", + " ('poor', 416),\n", + " ('deal', 416),\n", + " ('personal', 413),\n", + " ('incredible', 413),\n", + " ('fast', 412),\n", + " 
('became', 410),\n", + " ('deep', 410),\n", + " ('hours', 409),\n", + " ('nearly', 408),\n", + " ('dream', 408),\n", + " ('giving', 408),\n", + " ('turned', 407),\n", + " ('clearly', 407),\n", + " ('near', 406),\n", + " ('obvious', 406),\n", + " ('cut', 405),\n", + " ('surprise', 405),\n", + " ('body', 404),\n", + " ('era', 404),\n", + " ('female', 403),\n", + " ('hour', 403),\n", + " ('five', 403),\n", + " ('note', 399),\n", + " ('learn', 398),\n", + " ('truth', 398),\n", + " ('match', 397),\n", + " ('feels', 397),\n", + " ('except', 397),\n", + " ('tony', 397),\n", + " ('filmed', 394),\n", + " ('complete', 394),\n", + " ('clear', 394),\n", + " ('older', 393),\n", + " ('street', 393),\n", + " ('lots', 393),\n", + " ('eventually', 393),\n", + " ('keeps', 393),\n", + " ('buy', 392),\n", + " ('stewart', 391),\n", + " ('william', 391),\n", + " ('joe', 390),\n", + " ('meet', 390),\n", + " ('fall', 390),\n", + " ('shots', 389),\n", + " ('talking', 389),\n", + " ('difficult', 389),\n", + " ('unlike', 389),\n", + " ('rating', 389),\n", + " ('means', 388),\n", + " ('dramatic', 388),\n", + " ('appears', 386),\n", + " ('subject', 386),\n", + " ('wonder', 386),\n", + " ('present', 386),\n", + " ('situation', 386),\n", + " ('comments', 385),\n", + " ('sequences', 383),\n", + " ('general', 383),\n", + " ('lee', 383),\n", + " ('earlier', 382),\n", + " ('points', 382),\n", + " ('check', 379),\n", + " ('gone', 379),\n", + " ('ten', 378),\n", + " ('suspense', 378),\n", + " ('recommended', 378),\n", + " ('business', 377),\n", + " ('third', 377),\n", + " ('talk', 375),\n", + " ('leaves', 375),\n", + " ('beyond', 375),\n", + " ('portrayal', 374),\n", + " ('beautifully', 373),\n", + " ('single', 372),\n", + " ('bill', 372),\n", + " ('word', 371),\n", + " ('plenty', 371),\n", + " ('falls', 370),\n", + " ('whom', 370),\n", + " ('figure', 369),\n", + " ('battle', 369),\n", + " ('scary', 369),\n", + " ('non', 369),\n", + " ('return', 368),\n", + " ('using', 368),\n", + " ('doubt', 
367),\n", + " ('add', 367),\n", + " ('hear', 366),\n", + " ('solid', 366),\n", + " ('success', 366),\n", + " ('touching', 365),\n", + " ('political', 365),\n", + " ('oh', 365),\n", + " ('jokes', 365),\n", + " ('awesome', 364),\n", + " ('hell', 364),\n", + " ('boys', 364),\n", + " ('dog', 362),\n", + " ('recently', 362),\n", + " ('sexual', 362),\n", + " ('please', 361),\n", + " ('wouldn', 361),\n", + " ('features', 361),\n", + " ('straight', 361),\n", + " ('lack', 360),\n", + " ('forget', 360),\n", + " ('setting', 360),\n", + " ('mark', 359),\n", + " ('married', 359),\n", + " ('social', 357),\n", + " ('adventure', 356),\n", + " ('interested', 356),\n", + " ('brothers', 355),\n", + " ('sees', 355),\n", + " ('actual', 355),\n", + " ('terrific', 355),\n", + " ('move', 354),\n", + " ('call', 354),\n", + " ('various', 353),\n", + " ('dr', 353),\n", + " ('theater', 353),\n", + " ('animated', 352),\n", + " ('western', 351),\n", + " ('space', 350),\n", + " ('baby', 350),\n", + " ('leading', 348),\n", + " ('disappointed', 348),\n", + " ('portrayed', 346),\n", + " ('aren', 346),\n", + " ('screenplay', 345),\n", + " ('smith', 345),\n", + " ('hate', 344),\n", + " ('towards', 344),\n", + " ('noir', 343),\n", + " ('outstanding', 342),\n", + " ('decent', 342),\n", + " ('kelly', 342),\n", + " ('directors', 341),\n", + " ('journey', 341),\n", + " ('none', 340),\n", + " ('effective', 340),\n", + " ('looked', 340),\n", + " ('caught', 339),\n", + " ('cold', 339),\n", + " ('storyline', 339),\n", + " ('fi', 339),\n", + " ('sci', 339),\n", + " ('mary', 339),\n", + " ('rich', 338),\n", + " ('charming', 338),\n", + " ('harry', 337),\n", + " ('popular', 337),\n", + " ('manages', 337),\n", + " ('rare', 337),\n", + " ('spirit', 336),\n", + " ('open', 335),\n", + " ('appreciate', 335),\n", + " ('basically', 334),\n", + " ('moves', 334),\n", + " ('acted', 334),\n", + " ('deserves', 333),\n", + " ('subtle', 333),\n", + " ('mention', 333),\n", + " ('inside', 333),\n", + " ('pace', 333),\n", + " 
('century', 333),\n", + " ('boring', 333),\n", + " ('familiar', 332),\n", + " ('background', 332),\n", + " ('ben', 331),\n", + " ('creepy', 330),\n", + " ('supposed', 330),\n", + " ('secret', 329),\n", + " ('jim', 328),\n", + " ('die', 328),\n", + " ('question', 327),\n", + " ('effect', 327),\n", + " ('natural', 327),\n", + " ('rate', 326),\n", + " ('language', 326),\n", + " ('impressive', 326),\n", + " ('intelligent', 325),\n", + " ('saying', 325),\n", + " ('material', 324),\n", + " ('realize', 324),\n", + " ('telling', 324),\n", + " ('scott', 324),\n", + " ('singing', 323),\n", + " ('dancing', 322),\n", + " ('adult', 321),\n", + " ('imagine', 321),\n", + " ('visual', 321),\n", + " ('kept', 320),\n", + " ('office', 320),\n", + " ('uses', 319),\n", + " ('pure', 318),\n", + " ('wait', 318),\n", + " ('stunning', 318),\n", + " ('copy', 317),\n", + " ('review', 317),\n", + " ('previous', 317),\n", + " ('seriously', 317),\n", + " ('somehow', 316),\n", + " ('created', 316),\n", + " ('magic', 316),\n", + " ('create', 316),\n", + " ('hot', 316),\n", + " ('reading', 316),\n", + " ('crazy', 315),\n", + " ('air', 315),\n", + " ('frank', 315),\n", + " ('stay', 315),\n", + " ('escape', 315),\n", + " ('attempt', 315),\n", + " ('hands', 314),\n", + " ('filled', 313),\n", + " ('surprisingly', 312),\n", + " ('expected', 312),\n", + " ('average', 312),\n", + " ('complex', 311),\n", + " ('studio', 310),\n", + " ('successful', 310),\n", + " ('quickly', 310),\n", + " ('male', 309),\n", + " ('plus', 309),\n", + " ('co', 307),\n", + " ('minute', 306),\n", + " ('images', 306),\n", + " ('casting', 306),\n", + " ('exciting', 306),\n", + " ('following', 306),\n", + " ('members', 305),\n", + " ('german', 305),\n", + " ('e', 305),\n", + " ('reasons', 305),\n", + " ('follows', 305),\n", + " ('themes', 305),\n", + " ('touch', 304),\n", + " ('genius', 304),\n", + " ('free', 304),\n", + " ('edge', 304),\n", + " ('cute', 304),\n", + " ('outside', 303),\n", + " ('ok', 302),\n", + " ('admit', 
302),\n", + " ('younger', 302),\n", + " ('reviews', 302),\n", + " ('odd', 301),\n", + " ('fighting', 301),\n", + " ('master', 301),\n", + " ('break', 300),\n", + " ('thanks', 300),\n", + " ('recent', 300),\n", + " ('comment', 300),\n", + " ('apart', 299),\n", + " ('lovely', 298),\n", + " ('begin', 298),\n", + " ('emotions', 298),\n", + " ('doctor', 297),\n", + " ('italian', 297),\n", + " ('party', 297),\n", + " ('la', 296),\n", + " ('missed', 296),\n", + " ...]" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "positive_counts.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "pos_neg_ratios = Counter()\n", + "\n", + "for term,cnt in list(total_counts.most_common()):\n", + " if(cnt > 100):\n", + " pos_neg_ratio = positive_counts[term] / float(negative_counts[term]+1)\n", + " pos_neg_ratios[term] = pos_neg_ratio\n", + "\n", + "for word,ratio in pos_neg_ratios.most_common():\n", + " if(ratio > 1):\n", + " pos_neg_ratios[word] = np.log(ratio)\n", + " else:\n", + " pos_neg_ratios[word] = -np.log((1 / (ratio+0.01)))" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('edie', 4.6913478822291435),\n", + " ('paulie', 4.0775374439057197),\n", + " ('felix', 3.1527360223636558),\n", + " ('polanski', 2.8233610476132043),\n", + " ('matthau', 2.8067217286092401),\n", + " ('victoria', 2.6810215287142909),\n", + " ('mildred', 2.6026896854443837),\n", + " ('gandhi', 2.5389738710582761),\n", + " ('flawless', 2.451005098112319),\n", + " ('superbly', 2.2600254785752498),\n", + " ('perfection', 2.1594842493533721),\n", + " ('astaire', 2.1400661634962708),\n", + " ('captures', 2.0386195471595809),\n", + " ('voight', 2.0301704926730531),\n", + " ('wonderfully', 2.0218960560332353),\n", + " ('powell', 1.9783454248084671),\n", 
+ " ('brosnan', 1.9547990964725592),\n", + " ('lily', 1.9203768470501485),\n", + " ('bakshi', 1.9029851043382795),\n", + " ('lincoln', 1.9014583864844796),\n", + " ('refreshing', 1.8551812956655511),\n", + " ('breathtaking', 1.8481124057791867),\n", + " ('bourne', 1.8478489358790986),\n", + " ('lemmon', 1.8458266904983307),\n", + " ('delightful', 1.8002701588959635),\n", + " ('flynn', 1.7996646487351682),\n", + " ('andrews', 1.7764919970972666),\n", + " ('homer', 1.7692866133759964),\n", + " ('beautifully', 1.7626953362841438),\n", + " ('soccer', 1.7578579175523736),\n", + " ('elvira', 1.7397031072720019),\n", + " ('underrated', 1.7197859696029656),\n", + " ('gripping', 1.7165360479904674),\n", + " ('superb', 1.7091514458966952),\n", + " ('delight', 1.6714733033535532),\n", + " ('welles', 1.6677068205580761),\n", + " ('sadness', 1.663505133704376),\n", + " ('sinatra', 1.6389967146756448),\n", + " ('touching', 1.637217476541176),\n", + " ('timeless', 1.62924053973028),\n", + " ('macy', 1.6211339521972916),\n", + " ('unforgettable', 1.6177367152487956),\n", + " ('favorites', 1.6158688027643908),\n", + " ('stewart', 1.6119987332957739),\n", + " ('hartley', 1.6094379124341003),\n", + " ('sullivan', 1.6094379124341003),\n", + " ('extraordinary', 1.6094379124341003),\n", + " ('brilliantly', 1.5950491749820008),\n", + " ('friendship', 1.5677652160335325),\n", + " ('wonderful', 1.5645425925262093),\n", + " ('palma', 1.5553706911638245),\n", + " ('magnificent', 1.54663701119507),\n", + " ('finest', 1.5462590108125689),\n", + " ('jackie', 1.5439233053234738),\n", + " ('ritter', 1.5404450409471491),\n", + " ('tremendous', 1.5184661342283736),\n", + " ('freedom', 1.5091151908062312),\n", + " ('fantastic', 1.5048433868558566),\n", + " ('terrific', 1.5026699370083942),\n", + " ('noir', 1.493925025312256),\n", + " ('sidney', 1.493925025312256),\n", + " ('outstanding', 1.4910053152089213),\n", + " ('mann', 1.4894785973551214),\n", + " ('pleasantly', 1.4894785973551214),\n", + " 
('nancy', 1.488077055429833),\n", + " ('marie', 1.4825711915553104),\n", + " ('marvelous', 1.4739999415389962),\n", + " ('excellent', 1.4647538505723599),\n", + " ('ruth', 1.4596256342054401),\n", + " ('stanwyck', 1.4412101187160054),\n", + " ('widmark', 1.4350845252893227),\n", + " ('splendid', 1.4271163556401458),\n", + " ('chan', 1.423108334242607),\n", + " ('exceptional', 1.4201959127955721),\n", + " ('tender', 1.410986973710262),\n", + " ('gentle', 1.4078005663408544),\n", + " ('poignant', 1.4022947024663317),\n", + " ('gem', 1.3932148039644643),\n", + " ('amazing', 1.3919815802404802),\n", + " ('chilling', 1.3862943611198906),\n", + " ('captivating', 1.3862943611198906),\n", + " ('fisher', 1.3862943611198906),\n", + " ('davies', 1.3862943611198906),\n", + " ('darker', 1.3652409519220583),\n", + " ('april', 1.3499267169490159),\n", + " ('kelly', 1.3461743673304654),\n", + " ('blake', 1.3418425985490567),\n", + " ('overlooked', 1.329135947279942),\n", + " ('ralph', 1.32818673031261),\n", + " ('bette', 1.3156767939059373),\n", + " ('hoffman', 1.3150668518315229),\n", + " ('cole', 1.3121863889661687),\n", + " ('shines', 1.3049487216659381),\n", + " ('powerful', 1.2999662776313934),\n", + " ('notch', 1.2950456896547455),\n", + " ('remarkable', 1.2883688239495823),\n", + " ('pitt', 1.286210902562908),\n", + " ('winters', 1.2833463918674481),\n", + " ('vivid', 1.2762934659055623),\n", + " ('gritty', 1.2757524867200667),\n", + " ('giallo', 1.2745029551317739),\n", + " ('portrait', 1.2704625455947689),\n", + " ('innocence', 1.2694300209805796),\n", + " ('psychiatrist', 1.2685113254635072),\n", + " ('favorite', 1.2668956297860055),\n", + " ('ensemble', 1.2656663733312759),\n", + " ('stunning', 1.2622417124499117),\n", + " ('burns', 1.259880436264232),\n", + " ('garbo', 1.258954938743289),\n", + " ('barbara', 1.2580400255962119),\n", + " ('panic', 1.2527629684953681),\n", + " ('holly', 1.2527629684953681),\n", + " ('philip', 1.2527629684953681),\n", + " ('carol', 
1.2481440226390734),\n", + " ('perfect', 1.246742480713785),\n", + " ('appreciated', 1.2462482874741743),\n", + " ('favourite', 1.2411123512753928),\n", + " ('journey', 1.2367626271489269),\n", + " ('rural', 1.235471471385307),\n", + " ('bond', 1.2321436812926323),\n", + " ('builds', 1.2305398317106577),\n", + " ('brilliant', 1.2287554137664785),\n", + " ('brooklyn', 1.2286654169163074),\n", + " ('von', 1.225175011976539),\n", + " ('unfolds', 1.2163953243244932),\n", + " ('recommended', 1.2163953243244932),\n", + " ('daniel', 1.20215296760895),\n", + " ('perfectly', 1.1971931173405572),\n", + " ('crafted', 1.1962507582320256),\n", + " ('prince', 1.1939224684724346),\n", + " ('troubled', 1.192138346678933),\n", + " ('consequences', 1.1865810616140668),\n", + " ('haunting', 1.1814999484738773),\n", + " ('cinderella', 1.180052620608284),\n", + " ('alexander', 1.1759989522835299),\n", + " ('emotions', 1.1753049094563641),\n", + " ('boxing', 1.1735135968412274),\n", + " ('subtle', 1.1734135017508081),\n", + " ('curtis', 1.1649873576129823),\n", + " ('rare', 1.1566438362402944),\n", + " ('loved', 1.1563661500586044),\n", + " ('daughters', 1.1526795099383853),\n", + " ('courage', 1.1438688802562305),\n", + " ('dentist', 1.1426722784621401),\n", + " ('highly', 1.1420208631618658),\n", + " ('nominated', 1.1409146683587992),\n", + " ('tony', 1.1397491942285991),\n", + " ('draws', 1.1325138403437911),\n", + " ('everyday', 1.1306150197542835),\n", + " ('contrast', 1.1284652518177909),\n", + " ('cried', 1.1213405397456659),\n", + " ('fabulous', 1.1210851445201684),\n", + " ('ned', 1.120591195386885),\n", + " ('fay', 1.120591195386885),\n", + " ('emma', 1.1184149159642893),\n", + " ('sensitive', 1.113318436057805),\n", + " ('smooth', 1.1089750757036563),\n", + " ('dramas', 1.1080910326226534),\n", + " ('today', 1.1050431789984001),\n", + " ('helps', 1.1023091505494358),\n", + " ('inspiring', 1.0986122886681098),\n", + " ('jimmy', 1.0937696641923216),\n", + " ('awesome', 
1.0931328229034842),\n", + " ('unique', 1.0881409888008142),\n", + " ('tragic', 1.0871835928444868),\n", + " ('intense', 1.0870514662670339),\n", + " ('stellar', 1.0857088838322018),\n", + " ('rival', 1.0822184788924332),\n", + " ('provides', 1.0797081340289569),\n", + " ('depression', 1.0782034170369026),\n", + " ('shy', 1.0775588794702773),\n", + " ('carrie', 1.076139432816051),\n", + " ('blend', 1.0753554265038423),\n", + " ('hank', 1.0736109864626924),\n", + " ('diana', 1.0726368022648489),\n", + " ('adorable', 1.0726368022648489),\n", + " ('unexpected', 1.0722255334949147),\n", + " ('achievement', 1.0668635903535293),\n", + " ('bettie', 1.0663514264498881),\n", + " ('happiness', 1.0632729222228008),\n", + " ('glorious', 1.0608719606852626),\n", + " ('davis', 1.0541605260972757),\n", + " ('terrifying', 1.0525211814678428),\n", + " ('beauty', 1.050410186850232),\n", + " ('ideal', 1.0479685558493548),\n", + " ('fears', 1.0467872208035236),\n", + " ('hong', 1.0438040521731147),\n", + " ('seasons', 1.0433496099930604),\n", + " ('fascinating', 1.0414538748281612),\n", + " ('carries', 1.0345904299031787),\n", + " ('satisfying', 1.0321225473992768),\n", + " ('definite', 1.0319209141694374),\n", + " ('touched', 1.0296194171811581),\n", + " ('greatest', 1.0248947127715422),\n", + " ('creates', 1.0241097613701886),\n", + " ('aunt', 1.023388867430522),\n", + " ('walter', 1.022328983918479),\n", + " ('spectacular', 1.0198314108149955),\n", + " ('portrayal', 1.0189810189761024),\n", + " ('ann', 1.0127808528183286),\n", + " ('enterprise', 1.0116009116784799),\n", + " ('musicals', 1.0096648026516135),\n", + " ('deeply', 1.0094845087721023),\n", + " ('incredible', 1.0061677561461084),\n", + " ('mature', 1.0060195018402847),\n", + " ('triumph', 0.99682959435816731),\n", + " ('margaret', 0.99682959435816731),\n", + " ('navy', 0.99493385919326827),\n", + " ('harry', 0.99176919305006062),\n", + " ('lucas', 0.990398704027877),\n", + " ('sweet', 0.98966110487955483),\n", + " 
('joey', 0.98794672078059009),\n", + " ('oscar', 0.98721905111049713),\n", + " ('balance', 0.98649499054740353),\n", + " ('warm', 0.98485340331145166),\n", + " ('ages', 0.98449898190068863),\n", + " ('glover', 0.98082925301172619),\n", + " ('guilt', 0.98082925301172619),\n", + " ('carrey', 0.98082925301172619),\n", + " ('learns', 0.97881108885548895),\n", + " ('unusual', 0.97788374278196932),\n", + " ('sons', 0.97777581552483595),\n", + " ('complex', 0.97761897738147796),\n", + " ('essence', 0.97753435711487369),\n", + " ('brazil', 0.9769153536905899),\n", + " ('widow', 0.97650959186720987),\n", + " ('solid', 0.97537964824416146),\n", + " ('beautiful', 0.97326301262841053),\n", + " ('holmes', 0.97246100334120955),\n", + " ('awe', 0.97186058302896583),\n", + " ('vhs', 0.97116734209998934),\n", + " ('eerie', 0.97116734209998934),\n", + " ('lonely', 0.96873720724669754),\n", + " ('grim', 0.96873720724669754),\n", + " ('sport', 0.96825047080486615),\n", + " ('debut', 0.96508089604358704),\n", + " ('destiny', 0.96343751029985703),\n", + " ('thrillers', 0.96281074750904794),\n", + " ('tears', 0.95977584381389391),\n", + " ('rose', 0.95664202739772253),\n", + " ('feelings', 0.95551144502743635),\n", + " ('ginger', 0.95551144502743635),\n", + " ('winning', 0.95471810900804055),\n", + " ('stanley', 0.95387344302319799),\n", + " ('cox', 0.95343027882361187),\n", + " ('paris', 0.95278479030472663),\n", + " ('heart', 0.95238806924516806),\n", + " ('hooked', 0.95155887071161305),\n", + " ('comfortable', 0.94803943018873538),\n", + " ('mgm', 0.94446160884085151),\n", + " ('masterpiece', 0.94155039863339296),\n", + " ('themes', 0.94118828349588235),\n", + " ('danny', 0.93967118051821874),\n", + " ('anime', 0.93378388932167222),\n", + " ('perry', 0.93328830824272613),\n", + " ('joy', 0.93301752567946861),\n", + " ('lovable', 0.93081883243706487),\n", + " ('hal', 0.92953595862417571),\n", + " ('mysteries', 0.92953595862417571),\n", + " ('louis', 0.92871325187271225),\n", + " 
('charming', 0.92520609553210742),\n", + " ('urban', 0.92367083917177761),\n", + " ('allows', 0.92183091224977043),\n", + " ('impact', 0.91815814604895041),\n", + " ('gradually', 0.91629073187415511),\n", + " ('lifestyle', 0.91629073187415511),\n", + " ('italy', 0.91629073187415511),\n", + " ('spy', 0.91289514287301687),\n", + " ('treat', 0.91193342650519937),\n", + " ('subsequent', 0.91056005716517008),\n", + " ('kennedy', 0.90981821736853763),\n", + " ('loving', 0.90967549275543591),\n", + " ('surprising', 0.90937028902958128),\n", + " ('quiet', 0.90648673177753425),\n", + " ('winter', 0.90624039602065365),\n", + " ('reveals', 0.90490540964902977),\n", + " ('raw', 0.90445627422715225),\n", + " ('funniest', 0.90078654533818991),\n", + " ('pleased', 0.89994159387262562),\n", + " ('norman', 0.89994159387262562),\n", + " ('thief', 0.89874642222324552),\n", + " ('season', 0.89827222637147675),\n", + " ('secrets', 0.89794159320595857),\n", + " ('colorful', 0.89705936994626756),\n", + " ('highest', 0.8967461358011849),\n", + " ('compelling', 0.89462923509297576),\n", + " ('danes', 0.89248008318043659),\n", + " ('castle', 0.88967708335606499),\n", + " ('kudos', 0.88889175768604067),\n", + " ('great', 0.88810470901464589),\n", + " ('baseball', 0.88730319500090271),\n", + " ('subtitles', 0.88730319500090271),\n", + " ('bleak', 0.88730319500090271),\n", + " ('winner', 0.88643776872447388),\n", + " ('tragedy', 0.88563699078315261),\n", + " ('todd', 0.88551907320740142),\n", + " ('nicely', 0.87924946019380601),\n", + " ('arthur', 0.87546873735389985),\n", + " ('essential', 0.87373111745535925),\n", + " ('gorgeous', 0.8731725250935497),\n", + " ('fonda', 0.87294029100054127),\n", + " ('eastwood', 0.87139541196626402),\n", + " ('focuses', 0.87082835779739776),\n", + " ('enjoyed', 0.87070195951624607),\n", + " ('natural', 0.86997924506912838),\n", + " ('intensity', 0.86835126958503595),\n", + " ('witty', 0.86824103423244681),\n", + " ('rob', 0.8642954367557748),\n", + " 
('worlds', 0.86377269759070874),\n", + " ('health', 0.86113891179907498),\n", + " ('magical', 0.85953791528170564),\n", + " ('deeper', 0.85802182375017932),\n", + " ('lucy', 0.85618680780444956),\n", + " ('moving', 0.85566611005772031),\n", + " ('lovely', 0.85290640004681306),\n", + " ('purple', 0.8513711857748395),\n", + " ('memorable', 0.84801189112086062),\n", + " ('sings', 0.84729786038720367),\n", + " ('craig', 0.84342938360928321),\n", + " ('modesty', 0.84342938360928321),\n", + " ('relate', 0.84326559685926517),\n", + " ('episodes', 0.84223712084137292),\n", + " ('strong', 0.84167135777060931),\n", + " ('smith', 0.83959811108590054),\n", + " ('tear', 0.83704136022001441),\n", + " ('apartment', 0.83333115290549531),\n", + " ('princess', 0.83290912293510388),\n", + " ('disagree', 0.83290912293510388),\n", + " ('kung', 0.83173334384609199),\n", + " ('adventure', 0.83150561393278388),\n", + " ('columbo', 0.82667857318446791),\n", + " ('jake', 0.82667857318446791),\n", + " ('adds', 0.82485652591452319),\n", + " ('hart', 0.82472353834866463),\n", + " ('strength', 0.82417544296634937),\n", + " ('realizes', 0.82360006895738058),\n", + " ('dave', 0.8232003088081431),\n", + " ('childhood', 0.82208086393583857),\n", + " ('forbidden', 0.81989888619908913),\n", + " ('tight', 0.81883539572344199),\n", + " ('surreal', 0.8178506590609026),\n", + " ('manager', 0.81770990320170756),\n", + " ('dancer', 0.81574950265227764),\n", + " ('con', 0.81093021621632877),\n", + " ('studios', 0.81093021621632877),\n", + " ('miike', 0.80821651034473263),\n", + " ('realistic', 0.80807714723392232),\n", + " ('explicit', 0.80792269515237358),\n", + " ('kurt', 0.8060875917405409),\n", + " ('traditional', 0.80535917116687328),\n", + " ('deals', 0.80535917116687328),\n", + " ('holds', 0.80493858654806194),\n", + " ('carl', 0.80437281567016972),\n", + " ('touches', 0.80396154690023547),\n", + " ('gene', 0.80314807577427383),\n", + " ('albert', 0.8027669055771679),\n", + " ('abc', 
0.80234647252493729),\n", + " ('cry', 0.80011930011211307),\n", + " ('sides', 0.7995275841185171),\n", + " ('develops', 0.79850769621777162),\n", + " ('eyre', 0.79850769621777162),\n", + " ('dances', 0.79694397424158891),\n", + " ('oscars', 0.79633141679517616),\n", + " ('legendary', 0.79600456599965308),\n", + " ('importance', 0.79492987486988764),\n", + " ('hearted', 0.79492987486988764),\n", + " ('portraying', 0.79356592830699269),\n", + " ('impressed', 0.79258107754813223),\n", + " ('waters', 0.79112758892014912),\n", + " ('empire', 0.79078565012386137),\n", + " ('edge', 0.789774016249017),\n", + " ('environment', 0.78845736036427028),\n", + " ('jean', 0.78845736036427028),\n", + " ('sentimental', 0.7864791203521645),\n", + " ('captured', 0.78623760362595729),\n", + " ('styles', 0.78592891401091158),\n", + " ('daring', 0.78592891401091158),\n", + " ('backgrounds', 0.78275933924963248),\n", + " ('frank', 0.78275933924963248),\n", + " ('matches', 0.78275933924963248),\n", + " ('tense', 0.78275933924963248),\n", + " ('gothic', 0.78209466657644144),\n", + " ('sharp', 0.7814397877056235),\n", + " ('achieved', 0.78015855754957497),\n", + " ('court', 0.77947526404844247),\n", + " ('steals', 0.7789140023173704),\n", + " ('rules', 0.77844476107184035),\n", + " ('colors', 0.77684619943659217),\n", + " ('reunion', 0.77318988823348167),\n", + " ('covers', 0.77139937745969345),\n", + " ('tale', 0.77010822169607374),\n", + " ('rain', 0.7683706017975328),\n", + " ('denzel', 0.76804848873306297),\n", + " ('stays', 0.76787072675588186),\n", + " ('blob', 0.76725515271366718),\n", + " ('conventional', 0.76214005204689672),\n", + " ('maria', 0.76214005204689672),\n", + " ('fresh', 0.76158434211317383),\n", + " ('midnight', 0.76096977689870637),\n", + " ('landscape', 0.75852993982279704),\n", + " ('animated', 0.75768570169751648),\n", + " ('titanic', 0.75666058628227129),\n", + " ('sunday', 0.75666058628227129),\n", + " ('spring', 0.7537718023763802),\n", + " ('cagney', 
0.7537718023763802),\n", + " ('enjoyable', 0.75246375771636476),\n", + " ('immensely', 0.75198768058287868),\n", + " ('sir', 0.7507762933965817),\n", + " ('nevertheless', 0.75067102469813185),\n", + " ('driven', 0.74994477895307854),\n", + " ('performances', 0.74883252516063137),\n", + " ('memories', 0.74721440183022114),\n", + " ('nowadays', 0.74721440183022114),\n", + " ('simple', 0.74641420974143258),\n", + " ('golden', 0.74533293373051557),\n", + " ('leslie', 0.74533293373051557),\n", + " ('lovers', 0.74497224842453125),\n", + " ('relationship', 0.74484232345601786),\n", + " ('supporting', 0.74357803418683721),\n", + " ('che', 0.74262723782331497),\n", + " ('packed', 0.7410032017375805),\n", + " ('trek', 0.74021469141793106),\n", + " ('provoking', 0.73840377214806618),\n", + " ('strikes', 0.73759894313077912),\n", + " ('depiction', 0.73682224406260699),\n", + " ('emotional', 0.73678211645681524),\n", + " ('secretary', 0.7366322924996842),\n", + " ('influenced', 0.73511137965897755),\n", + " ('florida', 0.73511137965897755),\n", + " ('germany', 0.73288750920945944),\n", + " ('brings', 0.73142936713096229),\n", + " ('lewis', 0.73129894652432159),\n", + " ('elderly', 0.73088750854279239),\n", + " ('owner', 0.72743625403857748),\n", + " ('streets', 0.72666987259858895),\n", + " ('henry', 0.72642196944481741),\n", + " ('portrays', 0.72593700338293632),\n", + " ('bears', 0.7252354951114458),\n", + " ('china', 0.72489587887452556),\n", + " ('anger', 0.72439972406404984),\n", + " ('society', 0.72433010799663333),\n", + " ('available', 0.72415741730250549),\n", + " ('best', 0.72347034060446314),\n", + " ('bugs', 0.72270598280148979),\n", + " ('magic', 0.71878961117328299),\n", + " ('verhoeven', 0.71846498854423513),\n", + " ('delivers', 0.71846498854423513),\n", + " ('jim', 0.71783979315031676),\n", + " ('donald', 0.71667767797013937),\n", + " ('endearing', 0.71465338578090898),\n", + " ('relationships', 0.71393795022901896),\n", + " ('greatly', 
0.71256526641704687),\n", + " ('charlie', 0.71024161391924534),\n", + " ('brad', 0.71024161391924534),\n", + " ('simon', 0.70967648251115578),\n", + " ('effectively', 0.70914752190638641),\n", + " ('march', 0.70774597998109789),\n", + " ('atmosphere', 0.70744773070214162),\n", + " ('influence', 0.70733181555190172),\n", + " ('genius', 0.706392407309966),\n", + " ('emotionally', 0.70556970055850243),\n", + " ('ken', 0.70526854109229009),\n", + " ('identity', 0.70484322032313651),\n", + " ('sophisticated', 0.70470800296102132),\n", + " ('dan', 0.70457587638356811),\n", + " ('andrew', 0.70329955202396321),\n", + " ('india', 0.70144598337464037),\n", + " ('roy', 0.69970458110610434),\n", + " ('surprisingly', 0.6995780708902356),\n", + " ('sky', 0.69780919366575667),\n", + " ('romantic', 0.69664981111114743),\n", + " ('match', 0.69566924999265523),\n", + " ('britain', 0.69314718055994529),\n", + " ('beatty', 0.69314718055994529),\n", + " ('affected', 0.69314718055994529),\n", + " ('cowboy', 0.69314718055994529),\n", + " ('wave', 0.69314718055994529),\n", + " ('stylish', 0.69314718055994529),\n", + " ('bitter', 0.69314718055994529),\n", + " ('patient', 0.69314718055994529),\n", + " ('meets', 0.69314718055994529),\n", + " ('love', 0.69198533541937324),\n", + " ('paul', 0.68980827929443067),\n", + " ('andy', 0.68846333124751902),\n", + " ('performance', 0.68797386327972465),\n", + " ('patrick', 0.68645819240914863),\n", + " ('unlike', 0.68546468438792907),\n", + " ('brooks', 0.68433655087779044),\n", + " ('refuses', 0.68348526964820844),\n", + " ('award', 0.6824518914431974),\n", + " ('complaint', 0.6824518914431974),\n", + " ('ride', 0.68229716453587952),\n", + " ('dawson', 0.68171848473632257),\n", + " ('luke', 0.68158635815886937),\n", + " ('wells', 0.68087708796813096),\n", + " ('france', 0.6804081547825156),\n", + " ('handsome', 0.68007509899259255),\n", + " ('sports', 0.68007509899259255),\n", + " ('rebel', 0.67875844310784572),\n", + " ('directs', 
0.67875844310784572),\n", + " ('greater', 0.67605274720064523),\n", + " ('dreams', 0.67599410133369586),\n", + " ('effective', 0.67565402311242806),\n", + " ('interpretation', 0.67479804189174875),\n", + " ('works', 0.67445504754779284),\n", + " ('brando', 0.67445504754779284),\n", + " ('noble', 0.6737290947028437),\n", + " ('paced', 0.67314651385327573),\n", + " ('le', 0.67067432470788668),\n", + " ('master', 0.67015766233524654),\n", + " ('h', 0.6696166831497512),\n", + " ('rings', 0.66904962898088483),\n", + " ('easy', 0.66895995494594152),\n", + " ('city', 0.66820823221269321),\n", + " ('sunshine', 0.66782937257565544),\n", + " ('succeeds', 0.66647893347778397),\n", + " ('relations', 0.664159643686693),\n", + " ('england', 0.66387679825983203),\n", + " ('glimpse', 0.66329421741026418),\n", + " ('aired', 0.66268797307523675),\n", + " ('sees', 0.66263163663399482),\n", + " ('both', 0.66248336767382998),\n", + " ('definitely', 0.66199789483898808),\n", + " ('imaginative', 0.66139848224536502),\n", + " ('appreciate', 0.66083893732728749),\n", + " ('tricks', 0.66071190480679143),\n", + " ('striking', 0.66071190480679143),\n", + " ('carefully', 0.65999497324304479),\n", + " ('complicated', 0.65981076029235353),\n", + " ('perspective', 0.65962448852130173),\n", + " ('trilogy', 0.65877953705573755),\n", + " ('future', 0.65834665141052828),\n", + " ('lion', 0.65742909795786608),\n", + " ('victor', 0.65540685257709819),\n", + " ('douglas', 0.65540685257709819),\n", + " ('inspired', 0.65459851044271034),\n", + " ('marriage', 0.65392646740666405),\n", + " ('demands', 0.65392646740666405),\n", + " ('father', 0.65172321672194655),\n", + " ('page', 0.65123628494430852),\n", + " ('instant', 0.65058756614114943),\n", + " ('era', 0.6495567444850836),\n", + " ('ruthless', 0.64934455790155243),\n", + " ('saga', 0.64934455790155243),\n", + " ('joan', 0.64891392558311978),\n", + " ('joseph', 0.64841128671855386),\n", + " ('workers', 0.64829661439459352),\n", + " ('fantasy', 
0.64726757480925168),\n", + " ('accomplished', 0.64551913157069074),\n", + " ('distant', 0.64551913157069074),\n", + " ('manhattan', 0.64435701639051324),\n", + " ('personal', 0.64355023942057321),\n", + " ('pushing', 0.64313675998528386),\n", + " ('meeting', 0.64313675998528386),\n", + " ('individual', 0.64313675998528386),\n", + " ('pleasant', 0.64250344774119039),\n", + " ('brave', 0.64185388617239469),\n", + " ('william', 0.64083139119578469),\n", + " ('hudson', 0.64077919504262937),\n", + " ('friendly', 0.63949446706762514),\n", + " ('eccentric', 0.63907995928966954),\n", + " ('awards', 0.63875310849414646),\n", + " ('jack', 0.63838309514997038),\n", + " ('seeking', 0.63808740337691783),\n", + " ('colonel', 0.63757732940513456),\n", + " ('divorce', 0.63757732940513456),\n", + " ('jane', 0.63443957973316734),\n", + " ('keeping', 0.63414883979798953),\n", + " ('gives', 0.63383568159497883),\n", + " ('ted', 0.63342794585832296),\n", + " ('animation', 0.63208692379869902),\n", + " ('progress', 0.6317782341836532),\n", + " ('concert', 0.63127177684185776),\n", + " ('larger', 0.63127177684185776),\n", + " ('nation', 0.6296337748376194),\n", + " ('albeit', 0.62739580299716491),\n", + " ('adapted', 0.62613647027698516),\n", + " ('discovers', 0.62542900650499444),\n", + " ('classic', 0.62504956428050518),\n", + " ('segment', 0.62335141862440335),\n", + " ('morgan', 0.62303761437291871),\n", + " ('mouse', 0.62294292188669675),\n", + " ('impressive', 0.62211140744319349),\n", + " ('artist', 0.62168821657780038),\n", + " ('ultimate', 0.62168821657780038),\n", + " ('griffith', 0.62117368093485603),\n", + " ('emily', 0.62082651898031915),\n", + " ('drew', 0.62082651898031915),\n", + " ('moved', 0.6197197120051281),\n", + " ('profound', 0.61903920840622351),\n", + " ('families', 0.61903920840622351),\n", + " ('innocent', 0.61851219917136446),\n", + " ('versions', 0.61730910416844087),\n", + " ('eddie', 0.61691981517206107),\n", + " ('criticism', 0.61651395453902935),\n", + " 
('nature', 0.61594514653194088),\n", + " ('recognized', 0.61518563909023349),\n", + " ('sexuality', 0.61467556511845012),\n", + " ('contract', 0.61400986000122149),\n", + " ('brian', 0.61344043794920278),\n", + " ('remembered', 0.6131044728864089),\n", + " ('determined', 0.6123858239154869),\n", + " ('offers', 0.61207935747116349),\n", + " ('pleasure', 0.61195702582993206),\n", + " ('washington', 0.61180154110599294),\n", + " ('images', 0.61159731359583758),\n", + " ('games', 0.61067095873570676),\n", + " ('academy', 0.60872983874736208),\n", + " ('fashioned', 0.60798937221963845),\n", + " ('melodrama', 0.60749173598145145),\n", + " ('peoples', 0.60613580357031549),\n", + " ('charismatic', 0.60613580357031549),\n", + " ('rough', 0.60613580357031549),\n", + " ('dealing', 0.60517840761398811),\n", + " ('fine', 0.60496962268013299),\n", + " ('tap', 0.60391604683200273),\n", + " ('trio', 0.60157998703445481),\n", + " ('russell', 0.60120968523425966),\n", + " ('figures', 0.60077386042893011),\n", + " ('ward', 0.60005675749393339),\n", + " ('shine', 0.59911823091166894),\n", + " ('brady', 0.59911823091166894),\n", + " ('job', 0.59845562125168661),\n", + " ('satisfied', 0.59652034487087369),\n", + " ('river', 0.59637962862495086),\n", + " ('brown', 0.595773016534769),\n", + " ('believable', 0.59566072133302495),\n", + " ('bound', 0.59470710774669278),\n", + " ('always', 0.59470710774669278),\n", + " ('hall', 0.5933967777928858),\n", + " ('cook', 0.5916777203950857),\n", + " ('claire', 0.59136448625000293),\n", + " ('broadway', 0.59033768669372433),\n", + " ('anna', 0.58778666490211906),\n", + " ('peace', 0.58628403501758408),\n", + " ('visually', 0.58539431926349916),\n", + " ('falk', 0.58525821854876026),\n", + " ('morality', 0.58525821854876026),\n", + " ('growing', 0.58466653756587539),\n", + " ('experiences', 0.58314628534561685),\n", + " ('stood', 0.58314628534561685),\n", + " ('touch', 0.58122926435596001),\n", + " ('lives', 0.5810976767513224),\n", + " ('kubrick', 
0.58066919713325493),\n", + " ('timing', 0.58047401805583243),\n", + " ('struggles', 0.57981849525294216),\n", + " ('expressions', 0.57981849525294216),\n", + " ('authentic', 0.57848427223980559),\n", + " ('helen', 0.57763429343810091),\n", + " ('pre', 0.57700753064729182),\n", + " ('quirky', 0.5753641449035618),\n", + " ('young', 0.57531672344534313),\n", + " ('inner', 0.57454143815209846),\n", + " ('mexico', 0.57443087372056334),\n", + " ('clint', 0.57380042292737909),\n", + " ('sisters', 0.57286101468544337),\n", + " ('realism', 0.57226528899949558),\n", + " ('personalities', 0.5720692490067093),\n", + " ('french', 0.5720692490067093),\n", + " ('surprises', 0.57113222999698177),\n", + " ('adventures', 0.57113222999698177),\n", + " ('overcome', 0.5697681593994407),\n", + " ('timothy', 0.56953322459276867),\n", + " ('tales', 0.56909453188996639),\n", + " ('war', 0.56843317302781682),\n", + " ('civil', 0.5679840376059393),\n", + " ('countries', 0.56737779327091187),\n", + " ('streep', 0.56710645966458029),\n", + " ('tradition', 0.56685345523565323),\n", + " ('oliver', 0.56673325570428668),\n", + " ('australia', 0.56580775818334383),\n", + " ('understanding', 0.56531380905006046),\n", + " ('players', 0.56509525370004821),\n", + " ('knowing', 0.56489284503626647),\n", + " ('rogers', 0.56421349718405212),\n", + " ('suspenseful', 0.56368911332305849),\n", + " ('variety', 0.56368911332305849),\n", + " ('true', 0.56281525180810066),\n", + " ('jr', 0.56220982311246936),\n", + " ('psychological', 0.56108745854687891),\n", + " ('branagh', 0.55961578793542266),\n", + " ('wealth', 0.55961578793542266),\n", + " ('performing', 0.55961578793542266),\n", + " ('odds', 0.55961578793542266),\n", + " ('sent', 0.55961578793542266),\n", + " ('reminiscent', 0.55961578793542266),\n", + " ('grand', 0.55961578793542266),\n", + " ('overwhelming', 0.55961578793542266),\n", + " ('brothers', 0.55891181043362848),\n", + " ('howard', 0.55811089675600245),\n", + " ('david', 
0.55693122256475369),\n", + " ('generation', 0.55628799784274796),\n", + " ('grow', 0.55612538299565417),\n", + " ('survival', 0.55594605904646033),\n", + " ('mainstream', 0.55574731115750231),\n", + " ('dick', 0.55431073570572953),\n", + " ('charm', 0.55288175575407861),\n", + " ('kirk', 0.55278982286502287),\n", + " ('twists', 0.55244729845681018),\n", + " ('gangster', 0.55206858230003986),\n", + " ('jeff', 0.55179306225421365),\n", + " ('family', 0.55116244510065526),\n", + " ('tend', 0.55053307336110335),\n", + " ('thanks', 0.55049088015842218),\n", + " ('world', 0.54744234723432639),\n", + " ('sutherland', 0.54743536937855164),\n", + " ('life', 0.54695514434959924),\n", + " ('disc', 0.54654370636806993),\n", + " ('bug', 0.54654370636806993),\n", + " ('tribute', 0.5455111817538808),\n", + " ('europe', 0.54522705048332309),\n", + " ('sacrifice', 0.54430155296238014),\n", + " ('color', 0.54405127139431109),\n", + " ('superior', 0.54333490233128523),\n", + " ('york', 0.54318235866536513),\n", + " ('pulls', 0.54266622962164945),\n", + " ('hearts', 0.54232429082536171),\n", + " ('jackson', 0.54232429082536171),\n", + " ('enjoy', 0.54124285135906114),\n", + " ('redemption', 0.54056759296472823),\n", + " ('madness', 0.540384426007535),\n", + " ('hamilton', 0.5389965007326869),\n", + " ('stands', 0.5389965007326869),\n", + " ('trial', 0.5389965007326869),\n", + " ('greek', 0.5389965007326869),\n", + " ('each', 0.5388212312554177),\n", + " ('faithful', 0.53773307668591508),\n", + " ('received', 0.5372768098531604),\n", + " ('jealous', 0.53714293208336406),\n", + " ('documentaries', 0.53714293208336406),\n", + " ('different', 0.53709860682460819),\n", + " ('describes', 0.53680111016925136),\n", + " ('shorts', 0.53596159703753288),\n", + " ('brilliance', 0.53551823635636209),\n", + " ('mountains', 0.53492317534505118),\n", + " ('share', 0.53408248593025787),\n", + " ('dealt', 0.53408248593025787),\n", + " ('providing', 0.53329847961804933),\n", + " ('explore', 
0.53329847961804933),\n", + " ('series', 0.5325809226575603),\n", + " ('fellow', 0.5323318289869543),\n", + " ('loves', 0.53062825106217038),\n", + " ('olivier', 0.53062825106217038),\n", + " ('revolution', 0.53062825106217038),\n", + " ('roman', 0.53062825106217038),\n", + " ('century', 0.53002783074992665),\n", + " ('musical', 0.52966871156747064),\n", + " ('heroic', 0.52925932545482868),\n", + " ('ironically', 0.52806743020049673),\n", + " ('approach', 0.52806743020049673),\n", + " ('temple', 0.52806743020049673),\n", + " ('moves', 0.5279372642387119),\n", + " ('gift', 0.52702030968597136),\n", + " ('julie', 0.52609309589677911),\n", + " ('tells', 0.52415107836314001),\n", + " ('radio', 0.52394671172868779),\n", + " ('uncle', 0.52354439617376536),\n", + " ('union', 0.52324814376454787),\n", + " ('deep', 0.52309571635780505),\n", + " ('reminds', 0.52157841554225237),\n", + " ('famous', 0.52118841080153722),\n", + " ('jazz', 0.52053443789295151),\n", + " ('dennis', 0.51987545928590861),\n", + " ('epic', 0.51919387343650736),\n", + " ('adult', 0.519167695083386),\n", + " ('shows', 0.51915322220375304),\n", + " ('performed', 0.5191244265806858),\n", + " ('demons', 0.5191244265806858),\n", + " ('eric', 0.51879379341516751),\n", + " ('discovered', 0.51879379341516751),\n", + " ('youth', 0.5185626062681431),\n", + " ('human', 0.51851411224987087),\n", + " ('tarzan', 0.51813827061227724),\n", + " ('ourselves', 0.51794309153485463),\n", + " ('wwii', 0.51758240622887042),\n", + " ('passion', 0.5162164724008671),\n", + " ('desire', 0.51607497965213445),\n", + " ('pays', 0.51581316527702981),\n", + " ('fox', 0.51557622652458857),\n", + " ('dirty', 0.51557622652458857),\n", + " ('symbolism', 0.51546600332249293),\n", + " ('sympathetic', 0.51546600332249293),\n", + " ('attitude', 0.51530993621331933),\n", + " ('appearances', 0.51466440007315639),\n", + " ('jeremy', 0.51466440007315639),\n", + " ('fun', 0.51439068993048687),\n", + " ('south', 0.51420972175023116),\n", + " 
('arrives', 0.51409894911095988),\n", + " ('present', 0.51341965894303732),\n", + " ('com', 0.51326167856387173),\n", + " ('smile', 0.51265880484765169),\n", + " ('fits', 0.51082562376599072),\n", + " ('provided', 0.51082562376599072),\n", + " ('carter', 0.51082562376599072),\n", + " ('ring', 0.51082562376599072),\n", + " ('aging', 0.51082562376599072),\n", + " ('countryside', 0.51082562376599072),\n", + " ('alan', 0.51082562376599072),\n", + " ('visit', 0.51082562376599072),\n", + " ('begins', 0.51015650363396647),\n", + " ('success', 0.50900578704900468),\n", + " ('japan', 0.50900578704900468),\n", + " ('accurate', 0.50895471583017893),\n", + " ('proud', 0.50800474742434931),\n", + " ('daily', 0.5075946031845443),\n", + " ('atmospheric', 0.50724780241810674),\n", + " ('karloff', 0.50724780241810674),\n", + " ('recently', 0.50714914903668207),\n", + " ('fu', 0.50704490092608467),\n", + " ('horrors', 0.50656122497953315),\n", + " ('finding', 0.50637127341661037),\n", + " ('lust', 0.5059356384717989),\n", + " ('hitchcock', 0.50574947073413001),\n", + " ('among', 0.50334004951332734),\n", + " ('viewing', 0.50302139827440906),\n", + " ('shining', 0.50262885656181222),\n", + " ('investigation', 0.50262885656181222),\n", + " ('duo', 0.5020919437972361),\n", + " ('cameron', 0.5020919437972361),\n", + " ('finds', 0.50128303100539795),\n", + " ('contemporary', 0.50077528791248915),\n", + " ('genuine', 0.50046283673044401),\n", + " ('frightening', 0.49995595152908684),\n", + " ('plays', 0.49975983848890226),\n", + " ('age', 0.49941323171424595),\n", + " ('position', 0.49899116611898781),\n", + " ('continues', 0.49863035067217237),\n", + " ('roles', 0.49839716550752178),\n", + " ('james', 0.49837216269470402),\n", + " ('individuals', 0.49824684155913052),\n", + " ('brought', 0.49783842823917956),\n", + " ('hilarious', 0.49714551986191058),\n", + " ('brutal', 0.49681488669639234),\n", + " ('appropriate', 0.49643688631389105),\n", + " ('dance', 0.49581998314812048),\n", + " 
('league', 0.49578774640145024),\n", + " ('helping', 0.49578774640145024),\n", + " ('answers', 0.49578774640145024),\n", + " ('stunts', 0.49561620510246196),\n", + " ('traveling', 0.49532143723002542),\n", + " ('thoroughly', 0.49414593456733524),\n", + " ('depicted', 0.49317068852726992),\n", + " ('honor', 0.49247648509779424),\n", + " ('combination', 0.49247648509779424),\n", + " ('differences', 0.49247648509779424),\n", + " ('fully', 0.49213349075383811),\n", + " ('tracy', 0.49159426183810306),\n", + " ('battles', 0.49140753790888908),\n", + " ('possibility', 0.49112055268665822),\n", + " ('romance', 0.4901589869574316),\n", + " ('initially', 0.49002249613622745),\n", + " ('happy', 0.4898997500608791),\n", + " ('crime', 0.48977221456815834),\n", + " ('singing', 0.4893852925281213),\n", + " ('especially', 0.48901267837860624),\n", + " ('shakespeare', 0.48754793889664511),\n", + " ('hugh', 0.48729512635579658),\n", + " ('detail', 0.48609484250827351),\n", + " ('guide', 0.48550781578170082),\n", + " ('companion', 0.48550781578170082),\n", + " ('julia', 0.48550781578170082),\n", + " ('san', 0.48550781578170082),\n", + " ('desperation', 0.48550781578170082),\n", + " ('strongly', 0.48460242866688824),\n", + " ('necessary', 0.48302334245403883),\n", + " ('humanity', 0.48265474679929443),\n", + " ('drama', 0.48221998493060503),\n", + " ('warming', 0.48183808689273838),\n", + " ('intrigue', 0.48183808689273838),\n", + " ('nonetheless', 0.48183808689273838),\n", + " ('cuba', 0.48183808689273838),\n", + " ('planned', 0.47957308026188628),\n", + " ('pictures', 0.47929937011921681),\n", + " ('broadcast', 0.47849024312305422),\n", + " ('nine', 0.47803580094299974),\n", + " ('settings', 0.47743860773325364),\n", + " ('history', 0.47732966933780852),\n", + " ('ordinary', 0.47725880012690741),\n", + " ('trade', 0.47692407209030935),\n", + " ('primary', 0.47608267532211779),\n", + " ('official', 0.47608267532211779),\n", + " ('episode', 0.47529620261150429),\n", + " ('role', 
0.47520268270188676),\n", + " ('spirit', 0.47477690799839323),\n", + " ('grey', 0.47409361449726067),\n", + " ('ways', 0.47323464982718205),\n", + " ('cup', 0.47260441094579297),\n", + " ('piano', 0.47260441094579297),\n", + " ('familiar', 0.47241617565111949),\n", + " ('sinister', 0.47198579044972683),\n", + " ('reveal', 0.47171449364936496),\n", + " ('max', 0.47150852042515579),\n", + " ('dated', 0.47121648567094482),\n", + " ('discovery', 0.47000362924573563),\n", + " ('vicious', 0.47000362924573563),\n", + " ('losing', 0.47000362924573563),\n", + " ('genuinely', 0.46871413841586385),\n", + " ('hatred', 0.46734051182625186),\n", + " ('mistaken', 0.46702300110759781),\n", + " ('dream', 0.46608972992459924),\n", + " ('challenge', 0.46608972992459924),\n", + " ('crisis', 0.46575733836428446),\n", + " ('photographed', 0.46488852857896512),\n", + " ('machines', 0.46430560813109778),\n", + " ('critics', 0.46430560813109778),\n", + " ('bird', 0.46430560813109778),\n", + " ('born', 0.46411383518967209),\n", + " ('detective', 0.4636633473511525),\n", + " ('higher', 0.46328467899699055),\n", + " ('remains', 0.46262352194811296),\n", + " ('inevitable', 0.46262352194811296),\n", + " ('soviet', 0.4618180446592961),\n", + " ('ryan', 0.46134556650262099),\n", + " ('african', 0.46112595521371813),\n", + " ('smaller', 0.46081520319132935),\n", + " ('techniques', 0.46052488529119184),\n", + " ('information', 0.46034171833399862),\n", + " ('deserved', 0.45999798712841444),\n", + " ('cynical', 0.45953232937844013),\n", + " ('lynch', 0.45953232937844013),\n", + " ('francisco', 0.45953232937844013),\n", + " ('tour', 0.45953232937844013),\n", + " ('spielberg', 0.45953232937844013),\n", + " ('struggle', 0.45911782160048453),\n", + " ('language', 0.45902121257712653),\n", + " ('visual', 0.45823514408822852),\n", + " ('warner', 0.45724137763188427),\n", + " ('social', 0.45720078250735313),\n", + " ('reality', 0.45719346885019546),\n", + " ('hidden', 0.45675840249571492),\n", + " 
('breaking', 0.45601738727099561),\n", + " ('sometimes', 0.45563021171182794),\n", + " ('modern', 0.45500247579345005),\n", + " ('surfing', 0.45425527227759638),\n", + " ('popular', 0.45410691533051023),\n", + " ('surprised', 0.4534409399850382),\n", + " ('follows', 0.45245361754408348),\n", + " ('keeps', 0.45234869400701483),\n", + " ('john', 0.4520909494482197),\n", + " ('defeat', 0.45198512374305722),\n", + " ('mixed', 0.45198512374305722),\n", + " ('justice', 0.45142724367280018),\n", + " ('treasure', 0.45083371313801535),\n", + " ('presents', 0.44973793178615257),\n", + " ('years', 0.44919197032104968),\n", + " ('chief', 0.44895022004790319),\n", + " ('shadows', 0.44802472252696035),\n", + " ('closely', 0.44701411102103689),\n", + " ('segments', 0.44701411102103689),\n", + " ('lose', 0.44658335503763702),\n", + " ('caine', 0.44628710262841953),\n", + " ('caught', 0.44610275383999071),\n", + " ('hamlet', 0.44558510189758965),\n", + " ('chinese', 0.44507424620321018),\n", + " ('welcome', 0.44438052435783792),\n", + " ('birth', 0.44368632092836219),\n", + " ('represents', 0.44320543609101143),\n", + " ('puts', 0.44279106572085081),\n", + " ('fame', 0.44183275227903923),\n", + " ('closer', 0.44183275227903923),\n", + " ('visuals', 0.44183275227903923),\n", + " ('web', 0.44183275227903923),\n", + " ('criminal', 0.4412745608048752),\n", + " ('minor', 0.4409224199448939),\n", + " ('jon', 0.44086703515908027),\n", + " ('liked', 0.44074991514020723),\n", + " ('restaurant', 0.44031183943833246),\n", + " ('flaws', 0.43983275161237217),\n", + " ('de', 0.43983275161237217),\n", + " ('searching', 0.4393666597838457),\n", + " ('rap', 0.43891304217570443),\n", + " ('light', 0.43884433018199892),\n", + " ('elizabeth', 0.43872232986464677),\n", + " ('marry', 0.43861731542506488),\n", + " ('oz', 0.43825493093115531),\n", + " ('controversial', 0.43825493093115531),\n", + " ('learned', 0.43825493093115531),\n", + " ('slowly', 0.43785660389939979),\n", + " ('bridge', 
0.43721380642274466),\n", + " ('thrilling', 0.43721380642274466),\n", + " ('wayne', 0.43721380642274466),\n", + " ('comedic', 0.43721380642274466),\n", + " ('married', 0.43658501682196887),\n", + " ('nazi', 0.4361020775700542),\n", + " ('murder', 0.4353180712578455),\n", + " ('physical', 0.4353180712578455),\n", + " ('johnny', 0.43483971678806865),\n", + " ('michelle', 0.43445264498141672),\n", + " ('wallace', 0.43403848055222038),\n", + " ('silent', 0.43395706390247063),\n", + " ('comedies', 0.43395706390247063),\n", + " ('played', 0.43387244114515305),\n", + " ('international', 0.43363598507486073),\n", + " ('vision', 0.43286408229627887),\n", + " ('intelligent', 0.43196704885367099),\n", + " ('shop', 0.43078291609245434),\n", + " ('also', 0.43036720209769169),\n", + " ('levels', 0.4302451371066513),\n", + " ('miss', 0.43006426712153217),\n", + " ('ocean', 0.4295626596872249),\n", + " ...]" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"POSITIVE\" label\n", + "pos_neg_ratios.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('boll', -4.0778152602708904),\n", + " ('uwe', -3.9218753018711578),\n", + " ('seagal', -3.3202501058581921),\n", + " ('unwatchable', -3.0269848170580955),\n", + " ('stinker', -2.9876839403711624),\n", + " ('mst', -2.7753833211707968),\n", + " ('incoherent', -2.7641396677532537),\n", + " ('unfunny', -2.5545257844967644),\n", + " ('waste', -2.4907515123361046),\n", + " ('blah', -2.4475792789485005),\n", + " ('horrid', -2.3715779644809971),\n", + " ('pointless', -2.3451073877136341),\n", + " ('atrocious', -2.3187369339642556),\n", + " ('redeeming', -2.2667790015910296),\n", + " ('prom', -2.2601040980178784),\n", + " ('drivel', -2.2476029585766928),\n", + " ('lousy', -2.2118080125207054),\n", + 
" ('worst', -2.1930856334332267),\n", + " ('laughable', -2.172468615469592),\n", + " ('awful', -2.1385076866397488),\n", + " ('poorly', -2.1326133844207011),\n", + " ('wasting', -2.1178155545614512),\n", + " ('remotely', -2.111046881095167),\n", + " ('existent', -2.0024805005437076),\n", + " ('boredom', -1.9241486572738005),\n", + " ('miserably', -1.9216610938019989),\n", + " ('sucks', -1.9166645809588516),\n", + " ('uninspired', -1.9131499212248517),\n", + " ('lame', -1.9117232884159072),\n", + " ('insult', -1.9085323769376259)]" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"NEGATIVE\" label\n", + "list(reversed(pos_neg_ratios.most_common()))[0:30]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Transforming Text into Numbers" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
[... base64-encoded PNG data elided ...]\n",
+      "text/plain": [
+       "<IPython.core.display.Image object>"
+      ]
+     },
+     "execution_count": 26,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "from IPython.display import Image\n",
+    "\n",
+    "review = \"This was a horrible, terrible movie.\"\n",
+    "\n",
+    "Image(filename='sentiment_network.png')"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 27,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [
+    {
+     "data": {
+      "image/png": 
"iVBORw0KGgo[... base64-encoded PNG data elided ...]
ARaIaDEpRUyel4RqBECc+fOta1h\nZY+vAnn5zW9+YyZMmGD9X9Rh19eeUr0Uge4ioD4u3cVfa1cESkdAfEh+/vOfVyJ6LY7DH3zwgWFp\ntooioAgoAmEEdFVRGBH9rQjUCAECveH4SpyWqlgwFi1aZPdD0jgvNboRtSmKQIEIqMWlQDC1KEXA\nNwTGjx9vTjzxRDN79mzfVIvVhzgvY8eONVWxEsU2Ri8qAopAoQgocSkUTi1MEfAHgaoP/mwNgPjs\nl+NPb6smikDfQUCJS9/pa21pH0MAaws7M7PcuIrCNBd7G7HrdNaNHqvYbtVZEVAE4hFQ4hKPj15V\nBCqJgFhbGPSrLGp1qXLvqe6KQDkIKHEpB1ctVRHoKgKszBk6dKiZPn16V/XIWzlWF7YHUF+XvEhq\nfkWgPgjoqqL69KW2RBGwCBBsbtmyZYapoqoLU0RsA7Bx48aqN0X1VwQUgYIQUOJSEJBajCLgCwIM\n8pMnT66NXwg+OuvXr/cFXtVDEVAEuoyAEpcud4BWrwgUjQCD/FlnnVV0sV0r74ILLrAWpK4poBUr\nAoqAVwgocfGqO1QZRSA/Ahs2bDCnn356/oI8KYHponPPPdfgcKyiCCgCioASF70HFIEaIYAzK4Jf\nSJ2EHa23bdtWpyZpWxQBRSAjAkpcMgKn2RQBHxFg+fPw4cN9VC2XTieccILZt29frjI0syKgCNQD\nASUu9ehHbYUiYBHAKjFo0KDaoTF48GCzd+/e2rVLG6QIKALpEVDikh4zzaEIeI3AwIEDvdZPlVME\nFAFFIA8CSlzyoKd5FQFFoCMIfP7znzerV6/uSF1aiSKgCPiNgBIXv/tHtVMEFIEAgU9/+tOKgyKg\nCCgCFgElLnojKAKKgCKgCCgCikBlEFDiUpmuUkUVgb6LwC9+8YvaRALuu72oLVcEikFAiUsxOGop\nikDXEWCPogMHDnRdjzIU+M1vflPLZd5lYKVlKgJ1R+DjdW+gts8PBIh6umfPHrNz5067rPXtt982\nO3bsaFKOCKnEIDnkkEPszsYcszOwSmsEICvsnIwcfPDB5vjjj6/lvj4QFxVFQBFQBEBAiYveB6Uh\nsGnTJrNq1SpDCPoBAwaYkSNHGgKJTZgwwQ6y4eiuRH0lgNq7775rdu3aZWbNmmVefPFFu2Hg6NGj\nbX510jRGcKLjICsuuWOAf+edd0rr024VvHv3bjNixIhuVa/1KgKKgEcI9OsJxCN9VJWKI4AFYPny\n5Wb+/PmWrMyYMcOwSR7WlCxCeex2vHLlShvyfeLEieaGG27IXF4WHXzI45KVww8/PLb9/fr1MxCY\nOpG88ePHmyuuuMJceumlPnSH6qAIKAJdREB9XLoIfp2qhmDcc889hngbW7ZsMWvWrDHbt283kyZN\nih1k22HA4Mtg9eSTTzY22WPgnjlzpqHOOgsOqUyxyeaCWFb4tCOBbEj4+uuv1woaIgKfdtpptWqT\nNkYRUASyIaDEJRtumstBgIH1rLPOahAWSIY7feEkzXXIgL1gwQI7nfTBBx9YkrRixYpcZfqWWYgK\n37Q3KVlx2wFxeeWVV9xTlT4GC6Ya2xG2SjdSlVcEFIHECOhUUWKoNGEUArfffru5//77zbx586x1\nJSpNWecY0KZOnWq+8IUvmEWLFlV2aoR2iBRB+LDU4EeExasOwj32P//zP+buu++uQ3O0DYqAIpAT\nASUuOQHsq9mZphkzZoxt/qOPPtq1t2H0wI8Gh9TFixebsMOvj/2DzrISCP2KICvhdp500klm4cKF\n5vzzzw9fqtxvpgbXrVtnPvWpT1WifysHsCqsCFQMAZ0qqliH+aCukJZhw4aZtWvXdo20gAU+MEuX\nLjVMj5x33nkGa4OPgnMtlhU+HMsUUBmkhfZ//etftyu6fMQijU5PP/20JSvca0wVudapNOVoWkVA\nEagPAmpxqU9fdqQlLmnB3
8QnYZCbNm2a2bx5sxdv5mlWAhWNI/3EUmmWl1fZNwTL0Zw5c5pWE0Fe\nyiJ8RfeDlqcIKALFI6DEpXhMa1uiz6RFQO82ecHiI8HS2i1bFp3L+IY0sST997//vbVIlVFH2WXS\nl3Pnzo301VHyUjb6Wr4i4C8CSlz87RvvNJsyZYphNQ+rhnwWnDkJXMc0VidimbjTFywH70SdcfhD\nntCBD/qwNL1qFgpIMivVwtYWt93g7gPerk56rAgoAuUjoMSlfIxrUQMxWh566CGzdevWrg/MSQAl\nYBlbB+D/Uoa4ZMUnUhAezFkuzoqrqvSb9BXkky0h2pFkSBpTYd0mi6K3fisCikD5CChxKR/jytfA\n4MCbLSthqrBqB8B5Y0fnRx55pJCVNZRX9kqgvDcKpCWKREHiTjzxRDN79uy8VXQkP+0YO3asdcRN\n4p+j5KUj3aKVKALeIKDExZuu8FcRBj72iZk+fbq/SkZoxl5JV111lSUcWd7IXbKCo6uvpE30jCIt\nwCKrmO67774mJ9cIyLp+KquuSl663nWqgCLQMQSUuHQM6mpWJA6SVZtqELTTki53JZDPZEXah74Q\nl3akSkicLyuuRH/3m3YQG4ilz1lWrEFeIKhJrDRuvXqsCCgC1UJAiUu1+qvj2kYtR+24EjkqZDAj\nvgvTPK2sLqTxYSVQ2mZCWpAkAzVpf/SjH5mbbrrJm+XibnuFtBx99NG5/JLSYOLWr8eKgCJQHQQ+\nXh1VVdNOI4C1BanyjrxYIthRmh2r3akul6zgC9POYtFp7NvVl9a6QDyXv/3bvzWf/OQnLZHzyfIi\npIU240icRyBxkBc+SQhdnro0ryKgCHQHAa8i5z7xxBOmX79+9sNgUzVx9acdrki7+H711VfdS22P\nTznllAYuS5YsaZu+qAQrV660y1GLKq9b5bBvDzFNcPqUD4MaPiF8WlliuqVvu3ppA/onHZhJL/4v\nkFB8XSBrQkzb1VfmdQiYTA9BporoC8GFslUUAUWgfgh4RVzqB291W8Qb6+rVq83IkSOr2whHc/a5\nYTqoqmRFmiIkJOkATz8SCM8VyMvrr79uowzPnDnT+si41zt1DHFiGo8VRFl8WuL0FGKn5CUOJb2m\nCFQTASUu1ey30rV+4YUXzOTJkwt5Ay5d2TYVQFa+/e1vmw0bNrRJ6e9lplOEtKTRslXIfzChvL17\n99pAbxx3SiBTBDNkewaC47lTeEXqALmDwCh5KRJVLUsR6D4CSlwK7IPLLrvM9PT0ND4FFt3xotiN\nd/To0YXWu2fPHsNU13XXXWf9TtzpMzlmWoxpwjvvvNM899xzDafZvIqcfvrpld10kIGej0z3JMWi\nHdFhUCfAG7trY/VgBVaZBAbyRSBD2kFwQBym07YpadslnZIXQUK/FYEaIRAMtN7I448/3hNAaz+X\nX3651evBBx9snOPaLbfc0rN///6WOm/bts2mkXL4PvLII3so58CBA5H53LTk53PhhRfaesmHJEnj\n6k96V8L5n3322UYdXKM+zkVJsDy0Ub/o46aLajP4oU9WQafgbT1r9qZ8Lp4uDkmO6bs77rijZd81\nVdTmRzBQ9wSDZZtUfl2mD7L0Q9p8wTRaz913390DRuPGjevZuHFjYUCA+Y033tgouxt9QPuC6bHC\n2qQFKQKKQPcQaB5du6eHrdkd+Bl4+UQNbgxmUeSFAS4qvZwj3+7du3u1Uq7zHS5DiEKSNK7+pHfF\nzQ/5cn+7x1KfmzeOuJDezR8+pq60wsASRFpNmy0yfTv9wvq2+g0GUX0eWWmLk8HUV89TTz3V4qp/\np+mHLKSFlmQdpBngH374Ydv/kBgIByQmrR7Uf9tttzWVk7aMMnokKy5l6KJlKgKKQDYEvF0O/dhj\njwVjWLQEA5iNR7Fq1apGAqYgbr755sbvqAPyXXzxxWbXrl2G4GJR0q4M8iRJE1W2nJs3b54
c9vq+\n5pprzAknnGCY2mgnTKWQPk6oa9CgQWbq1KlxyZquMaVzzDHHNJ3L8iOJfknLffPNN63PDWVmlaFD\nhxrugSoIUzas/EnqhOu2qd0UkZs2fEx9kyZNsh98Q8B78eLF1lGbbQO4L7ifBg4cGM5qtmzZYt5/\n/327weW5555r+CxcuLCQLRd6VZbxhPj2lD1FlVE9zaYIKAIJEPDaxyWYPjGBhcT6jATTPIbfIi6x\nYbUIm7KJEHkzmJ6w+QI+ZwKrg1yyA9cDDzzQ+B11QHry8Wk14CdJE1W2nAssEY06AkuNnLbf7K+T\nRNx2uVgxOAfWqkYRYANGSYX8hPjPK3fddVevItCTtvOhj8KfYLrMXqNt9KMrzz//vB1I3XNpjocM\nGWIH1zR5upFWiEcW0hK1iihrG4htg+MsfjD8L3Cfzpo1K5K0UMe1115rl52TlqXN7I10/vnnZ62+\ntHxCXvC5UVEEFIEKIhA8ZLyR8FRLMIA26YavRABx4yM+K+F8pAuLO+3ElJErbpmki5IkacJ6uOW4\n+YNB2b1kj8NTKtI2LkZNFTHl5ZYZxor8tFPSoFtSwdeBTx6hfqmb71bTdO3qCOPCVF5WYZoA/w1f\npQg/DJ0KSd67TMWBuYoioAhUCwFvLS68bR9xxBHBmPf/JfxbrAg7duxoJAoGyMhplq997WuNNFgU\nmA6JEuJKtJMkaeLKuOSSS3pdPvPMM5vOvfvuu02/wz+Y7hLBihHGhqmwK6+8UpKYX/3qV43jdgeY\n/LFO5JEwvsTpGDx4cOoisXi5lhemjOooWVcOuViIpcY9p8etEcCiBO5qeWmNkV5RBHxEwFsfl2OP\nPTYxXu+8804jbZgAyIX+/fvLof0W0tN0MvgRThe+zu8kaaLyybkwyeB82OemlX5SRmDRkEPDFArL\niePkgw8+iLvc61pYn14JUp6I8olIWgT3Ql0JCxgweCJ5th0ocorIKtNH/oA5vjwsDc8yNddHYNJm\nKgJeIeCtxcUrlFSZ3Ajs3LkzUxkQuJdffjlT3ipkkuBoDJx5JFixk3gLgDz11DGvWF6EQNaxjdom\nRaBOCNSCuLCjrEirQc61UJC2aIuC1J/kG4fjsIQtLFFWGTePa/XBETeYoYz9fOc733Gzxx6zaiQ8\n1RObIeJieFUUDsJp91liuuzv//7vm1YC5Z2mi1C1a6eY2oGw5CUtOkWUvwvF2qXkJT+WWoIiUDYC\n3k4VpWk4yzRF8F9hE8PwwBnEppAkBj+YLP4WjQJyHuBDctFFFzWV4hIu9GtHXI4//vhGfvJCfIoi\nY0zrhIleo7IUB6wyYSktQr+wdJsPROszn/mMOfnkk3uVxpQW00L//u//Hjk91GoqsFdBnp8oimx0\nYoqIOl577TXbh9y7iCx75hjiNXz4cA4N/4vcP/z/CRmwFyrwh3bQVj55yWQFmqsqKgKVRaAWxIXY\nLAz2DI7It771LfO9732vQV7Yp8ZdPu06rXaj58KxVdhV2o3HkkQ/iBdOqwzytPsrX/mKWbRoUYOQ\nUSYb6Akmwaoiw5YESaUI4sJeND/84Q8bOkjdbl/IuSTfLJHOQzixImFN6qbgCBqsZiks1D1TRGXE\nJGEKi3uIjTbfe+89S0xYIn/FFVc0SLXUy0CPHgjL27du3WrvRfKNGjXKbh3Bxo5VECEvtL9qxKsK\n+KqOikAhCPi0CMpdThy1LDkYhJuW2PJbJLxsNgCnKa38DghOr/Dxco3vVsuGk6Rx9Se9K27+dsdu\nuygjajk058P1tSqX/GmkyGXDbGMA5q10S3o+sN706rc0bSItkVzzLvNOW6ebnsixLMEtSspY+kxk\nYaImBwO4xStPHbSXKLxBIDpbHthzrgoSWDAL7asqtFl1VASqgkAtfFyCwc8GinMDsnEuLFhlCHBW\n1JRKuPykv4NYJC2TEpit3TSRZMaCQvo4wSqzdu3auCS
9rh1++OH2TbvXhQwnmBJj6TZtBv+0wrQS\nffb9738/d7+xbF6mNNLqkTc9VgmkqLd4yqOfipKnn37anHTSSWbu3Llmzpw51oJCADmxqmSpB+sF\nUXgJRscu0Pv27bM6s9Gi70uQWWGE/uI8naX9mkcRUATKQaAWU0UCDQ6omLOZh3fD6jNg8hCeMGFC\n7sFP6srzze7Hf/mXf2mjjMoyX2Kx3HDDDb18X9rVQ5wT/D7Wr1/ftBUBhOWb3/xmy8i/ceXywMZX\noShzOUTxpptush+ma8SfhwEtLDhaM52DnwQkoyiSyUDJtMfy5cvDVZb+GxxlICyqMqZm8pAK0QPd\nmEp9++23LWEpa0oHXflwjxONd/78+YYI0T5G1hVspM+K+j+QcvVbEVAE8iHQD9NQviI0dx0RwD8G\n8sAgUwfZtGmTgdhGkaUy24cTbtY9h1rpVZRj74oVK+x2GITxv/rqqzsax4T+uOqqq6wPDL5ZPsdQ\nKdovqVW/6nlFQBFIhkBtpoqSNVdTJUUAp0rM+3UQBsmVK1faaYtOtkcIRpGDclFTRBBTplbpY8hp\nkTomwRhLC07KrCIbM2aM11MyYIO1iP5UUQQUge4joBaX7veBtxrgQ4GFoii/jG41lDdm2rB06VIz\nYMCAhhpFTLU0CnMOynxDFzLkVJfqEN0gCgi+T50mLFHKQqLY6b0K91pe/KPar+cUAUUgHQJKXNLh\n1adSEzSOZdHsM1RlYUpk3bp1dpdjtx3hN+gipnSwiAhRcusq4jjvoOkjaRFccA5m+bySF0FEvxUB\nRaAVAkpcWiGj520gLqwuOILisFtVYbXMwoUL2zqC4oTpRjCm7WnaLSuH0uRJimkRZY8fP94GjvPF\n0hJue9XISxFEN4yB/lYEFIH2CChxaY9Rn06BGR+pqtUFawvOn9u3b0/dj5AFSJsIK5xaTZuVsXJI\n6uU7r7UF69mLL77ozfSQ2zb3uCp6ojN9Dkn1YbrNxVCPFYG6I6DEpe49nLN9DN5YHnCkbDVo56yi\ntOxMjfBWjANqEf4slAcOrojTZplv33lJi6zgoZwyrEEuHkUcYxk65JBDrE9SEeWVWYaSlzLR1bIV\ngWgElLhE46JnHQQIGMbg3+mlxI4KmQ6nTJli8+GUW5Zg0XG3ISiCILm65p0iEvLme8wUt81V07ls\na5uLjR4rAoqAMUpc9C5IhAC7Mo8dO7YycV3KtjKI9SVMVLBquJLXEpPX2tIJ8ua2t6hj6T8sXFWY\nislLMIvCTctRBPoCAkpc+kIvF9BG3iohL1V4cy9bVwYpiEuSqTN0yerwm5e0kB+yWZXBP3ybMmVE\nBGeiXldBlLxUoZdUxzogoMSlDr3YoTZUYdUHhII4JcHGfqUMeHkHJ/IncfjNWw+3BAM/W2BUNfox\nGFRtVVsR/dahf2etRhGoLAJKXCrbdd1RXMLE+xhvA9JywQUXmC996UulrIIqw5dBppzc3hSH3/A0\nlJum3XHVrS3SPla19e/fvxQSKnUU/Q15SWqRK7puLU8R6AsIKHHpC71ccBt9jHQqlpbjjz/eXHnl\nlYWsInJhgwjk9Vdxy4s7Djv8ZqmXPqrDXlMy7edaqeKw8+Ua9yMEJsl0oi86qx6KQFUQUOJSlZ7y\nTE+ZNvLB54XBjZ2/R44c2bC0FEk0KCuP9SNN10VNNaT1k2HQJOZM1QMHCm74Vl1//fWmrJ2rpZ6i\nv5W8FI2olqcI/BGBj/1jIAqGIpAWgeOOO84cffTRZurUqebDDz80Z599dtoiCkmPdYJdhm+99VZz\n8803N8rEN2Lv3r3m//7v/zKvSmHgeeuttzpGWlAeR1osLK4cdthh1teDNvFBL9JBcvj87ne/M6QR\neeaZZ8x///d/2xD6cq7q3xs3bjR/8zd/U6lmfOITnzB8uIfoNxVFQBEoBgG1uBSDY58tBWvAtdde\na9s/f/78jg3yDNg
4nb799ttmyZIlLeslHQN9WpN91nx5boSslh0hMlL33XffbX19Jk2aJKcq/U1f\nYPGq2nSRC3rWvnXL0GNFQBH4IwJ/okAoAnkQgBDgqMuyWz74VjDQlCUM0oSF5w2WpbJbt25tSVrQ\ngUixfBg4koron5bsJC0/Kh11Zn0rJ84JA7t8du3aZT71qU/ZkPRRdVXtHP3Hrt5p+tC3NtI3Vdbf\nNzxVn76NgBKXvt3/hbUe6wfTFwgDMIHPCCJWlGDZgRThu8GO1bx9E98jSXAyGdgZOCA+cUI9SKdD\n4xfljwIB2rFjh10K3UniFYdpEdfwX9qzZ08RRXWtDCUvXYNeK64ZAkpcatah3WwOBIHNGAm4NnTo\nUHPjjTcadmaGcEBi2pGGsO4QDawrlIGDJstit2zZYubMmZOJWDBwMLCLRSWqPrHQhK+V+Zt2olsR\nAgEaN25cEUV5VQYrpHbu3OmVTlmUEfKS9n8hS12aRxGoKwLq41LXnvWkXVgwnnjiCWsFWL16tZ3e\nOeaYY+w3RCQsEJP333/f7mRMEDk+Z5xxhjn//PMbSfMO9BAXBg7XIpG3zIZyKQ+ERBVl4cFZmQG+\nqrt5t4KP/sGH6sknn2yVpFLn+b+gz5NYDCvVMFVWEegAAh/vQB1aRR9GAHLghmzngY1FZtu2bZGo\n4OjLdFCcBULeWuPSRBb+0UkGDIgLgyEreJjiylpWXD1JrmEhKbJuptGwTqj4jQD/F0pe/O4j1c5f\nBNTi4m/fqGYxCBRhqaCMV155xVx00UVdefMtw8rDTt5IVcP8t+pyiCaEtqenp1WSSp6HvGB1Kcri\nVkkQVGlFICUC6uOSEjBN7gcCYjXJ6isgxIf9fDjOWk5WNKgz6yqirHVWOV9dp1RkulLuxyr3kequ\nCHQKASUunUJa6ykcAR76spIpTeG85SLylks5DBydHDyKWkWUpt2a1k8E5D7s5P3nJxKqlSKQDAEl\nLslw0lSeIoCPihCRJCoyPcNAIYOF5JE33zRlSd6032VMEaXVoWrpGdTDfVa1NsTpK21T8hKHkl5T\nBP6IgBIXvRMqjQBTCHySPPCFMLSadhBCQ7qyBD11iig9uliohg8fnj5jhXIIeekEea4QLKqqItAL\nAV1V1AsSPVEGAgzYr732mtm/f7+NxUIdsuyZY6LgskxajlkZc/rppzctWbYXI/7wwBdLSsRl67+S\ndOUQpEZWLWXZlTmqfvdc0auI3LI5PvLII8369evDpyv/m5VofUG4l/G3gryIFTBPu/m/+9nPfmZ2\n795t90z64IMPmv7vqE8IIf+D/N8NHjy40JVuefTXvIpAFAK6qigKFT1XCAI8fInhQvyW9957zz4g\nR4wYYYYMGWJXiFCJLAUmLYMTH3nIvvHGGzbfxIkTzahRo5piuUQpKBYV9xoPbgaCLIMAOkFk5E3Y\nLTfLcZR+WcqJy0Mds2bNstswxKWr2rW6rpZq1Q/cs9y7We9b+b8jijIBCfm/g9QeccQRtsrw/x0n\nCVGwb98+w4aW/L/yPzd69Gi763orK2Ur/fW8IlAmAkpcykS3D5bNA5cH39y5c23reWhedtllmR7A\nFAB5ePXVV82iRYvsw5RB+eqrr45cvhx+2PPgR/IQjzzEx1b+0Z8idHHLa3UMBiwbhgBmHfhald3N\n82whwcDLbuR5+rObbUhbN32Z1FJI2U8//bS599577f9MUrLfSifunRdeeMEQ0JD/wW9+85tm8uTJ\nfQb7VrjoeU8QCOIiqCgChSDw1FNP9QSDSk9AVno4Llpef/11WzZ1BDsg9wSDc68qgge9Pc93MC3T\n63qWE9RD3Xkkb/4kdVMHn1NOOaXnX//1X5NkqUwa+lz6VNrZCUx9AKhdO4MXhZ5gmsd+yvi/A/dg\n+w4C6NjvqP87H3BSHfoOAgR0UlEEciHAgy0IzW8fnO0esrkq+igzdUCOeFjz0A7Lw
w8/HElqwunS\n/qbeLA/tsjChXPcj7bntttt6+NRFuL/o6yhx218UUY2qp9vnou4h2sv/AaSuDMISbjP1BVYXWx//\nYyqKQLcQUOLSLeRrUi8PMN7EsIB0WsTCwyAthIIHPMcMdmUI5aYZIEmbJn2cztTtDtSt0sobeKvr\nVTtP//LG307AOQk+7crx9bpLXqLu/U7pjR4QSUiM/N91qm6tRxEAAfVx8WTKrmpqMP+OHwv+LI8/\n/nhmH5a87WYu/qtf/ar5wx/+YIIBzpx99tm2yDJ9Siib9idxnMzjkEs9wWDcgCjNKieWXK9Zs6bh\n/NwopIIH7A6+ZMmS1G0BexHwCCwT8rOy37TpBz/4gXV472b/cv/PmDHD4EDfzf//ynakKp4LASUu\nueDrm5l5aI0ZM8Y2fu3atZGOsp1EhgH+1ltvNc8995xdTSOEIg9paKc/GAQWkNjBNG39UqbUnWew\nvf322w0bLlZ9l2gcTiHI27dvF1gyfbskEOdluUcyFdbFTDNnzjQvvfSSeeSRR8yxxx7bRU3+WDWr\nvdi1e/PmzZXFtOsgqgKpEVDikhqyvp1BSMuwYcO8GRTRieWaPNRXrVrV9BBNSx7S9i7lR1lCGCiR\ndm/55BcpckClfogPFpt2Okj9Pn6zl9QVV1xhLr300sLUCxPEqP4rrLICC+L+fvPNNw0vC7TBl36F\nXE6bNq3p/67AZmtRikAvBJS49IJET7RCwEfSEtZVyAvWEMgMOjOIl/mGHRXvpRVhcokKuks8jXA7\nivgNFkhVrS5gNXbsWGvZKjOOCP0X+GpYrIokj7bAgv64pKVMLLKqq+QlK3KaLwsCSlyyoNZH8/j+\n8JRuET0xXyMMTLydlvnAhxxBkiBILmlxB0V0KZOoUL4r6ER9VTXj49syZ86cQq0tLj5Rx/QhpFfE\nB2sM0zEPPfSQ2bp1a6n3sLQ56zcxX4i35LueWdun+fxBQImLP33htSY8lG655ZbS336LAuG8884z\nwRJtM3v2bFukSyaKqiNcDoMeDpN/9md/ZgYMGGAvd3vgY9BDJyFxYZ19/e2L3i7x7IY1RqxOVSGf\nBApkW4Enn3zS11tL9aoBAkpcatCJZTdB3ty7uYohbRujdC6DvITf0AmVDmnpNmFx8YLEMeUyffp0\n97S3x5AF8MPyUeYUX1oAwn1ddh9TH3XMmzfPTJo0Ka26XUmPzmeddZZdcVQVnbsClFaaCwElLrng\n6xuZcZAM4jY0rBdVaXXYdA2ZQfI6NUKARNy3cJcYdWJ6SnRo940ukBfM+Gy/4LNUaeBzrTF5VoC1\n6g9WhrHXUNWsF2Il4jvv/1orbPR830ZAiUvf7v+2rZeHkDi7ts3gWQJIFxvMibXBJRdJVXUHKPJE\n+alEkSLy4Vfjw8P7vvvuM//0T/9k/u3f/s0rK4bbB5AWltn7tGLN1S/umP53Y+5E3SNx+cPXKK/K\nq8Kq7hge7g/97RkCGodPEYhDgAiZnQgnHqdDnmtE+QyIQ1OETzcCaVTZAUlrisCaJDpoqzKJ5kp5\n3RR0ow1EOQaLbuvTCgui47J1RBK8W5Xhy3kwlw/3QFoBi25Eo06rZ6v0tDkY6nJHjQ52rLblUNaD\nDz7YqjpvzwckPJP+QVC/Rj7a3mkJYkD1iO7XXnttp6tvW1/nEWmrkv8JpEOj/pnirvnfsmYN6xI6\nnv1c3EGAgdEdvHnIyiAjg3wzEvG/yBMn1NcuTVz+rNei6mVA9I28oCfh4+tCWsL9Fb6/wtfDv2XQ\nB5cqC/canzwiz1O+qyiif9RYwTn5QFRc6TZxQRdXh2effdZVr+vHfxIAp1JTBPr162fk88QTT6Ru\nJcHcCOtddZk1a5ZdTuq2Y+fOnXbahKkjBNO+fNIsmxaTvlt2+JjyKJu6mA7phKAXn/CUBTFdcPbE\n52XTpk2dUCW2Dpkeeuedd2xgtTTYxxbs0UWmC
uXekvuAe4EPfRSWu+66ywQDvtdLn8M6R/2+4YYb\nzMKFCzPf82zzQMA9hP9hlc4igD9cQLxspawo9Uq6Tp0qqEAci4671ummBjdaS0bfTpe6vPVJO//q\nr/6q53vf+561fIi1pQgrSNoyqBtsy5Qkdcgmfa4lqkydosoGO6w/ed/Ko8quyjnXGiP3pW8WsTxY\nYu3MMtXMVMWRRx5pn19811HyPJ87hQfTc6KnT1N1anHxikb6o8wLL7xgAvN95d/6QJQ3W94eeKvn\njVeW2Mrbb1bUKZcy0ojUjeNuGYJOvOHziRNC6BMbhCXuWF/K0idKB6wsrJhhiTZOw1WN7BvVtrTn\n6CfuIT4cf//73zfuSrW05fmWnu0aVq5cmVqtYJrC7N+/3+a7/vrrm/JjiRFL8re//W0b9fjOO+9s\nnOMaaYKptqZ87o9XX33VkFfK4XvgwIFt82G5njhxYlM+freyaJ9yyimNtOiESH5XnwkTJth0Ug7f\nrm6SNtzOPXv2yKXGt5vmoosuapznIKrdcfqPGjWqkf/+++9vHHf9oJPMzbVGBA23zlbBzdmkQmCS\najA8mHb4ulsGx2GBqbsskXpa1eXmJY9bdlweN12YhcZdoz6czdw2Us/ll19u5xNdfeTYLQ8syH/h\nhRc2YRTWgfKk3eHv8Fyq1BP+xucgy5tSuBxffvM2i6NxWHjjzWIByZpP6o/yP5FrWb7zlIfVJRg0\nreUjCxZJ9UVHqYv7q8y6kurkW7pgh/MePnUR+pxnEN9pxH3u8cxzxX2+8yx1n4fu844ywuMH5dxx\nxx0tn4/kZ9zZvXu3W6U9Dj+33bo45rkbFrcd8pxO8nx2/UsoWwS93HqlTLnOt1iqSOded3Fzy5Bj\n2hclLr6++Lr8f0SiNC7gHDeO23ABSb7DNwnpXeBdMMOdGb6h6VQ3r9ThfofzpNUPSKJuRoEq7lqW\nGydcntsW99j9p0nyjyH6tvqm7LoNLAzOUW2C1KR9sKadImqFM+WkrTtcFm2SaYbwtaS/KYMpG/qd\n77zlufVSthAWpg6Kws6toy7HOCjXDR/ahKN/UgkPzuF87Z6jrZ6LlJM0L+MIL8Ei4bHHrcM9pnxX\nws9vriV5Pofra1VmeMVPGDt+IxAOV89Wx2H9yesSvXB9XO+GlE5c4kiLgBe+ScI3l7Bm9yYIA+jO\niUq5Ud+U4UoW/Vw9wh3d6lrWG8ctL6o97jlhw0n+MVwMwsetrBPhdGX+dttQVD1x8+1pBos0aZPo\nDt5RhKrsvFHlowdv/JA8LFQcZ2kvbWL5NZYV7lG+s5QTpWOdz4FVN6WM/zvuoTS+VO7zn+dzWNzr\n4EUaGSP4dtvAdXlZDY8RPFvlGnWELSoM2CJumdQnpCb84hvW131+h8cKdJMPRMWVOOLiEgnGTldc\nbFxdXD04L4QmjFd4LKZsV5dwfW7dnTwu9b/EbTAd5HZc3DUAcIHmhnLTA57cqAKW22HhutyO5poM\n8G6Z4Txx11zd3DaF9XavuXnS3DhuvrCOYTLk/qOhC+nlQ3uSCm9HDPLdFPdBIQ+JvPrw8Gz1AMXq\nkcTKwMCelWTE6U+ZSep3yyB9XmuNW174mPuAQQcCw33EmzMERHAMf5OW+wbSw4e0TDeWqWNY5yr/\nhtiBcTeljP877gHuhaTCS6k8t8IvqJQRftaHxwKeF5Kfb3kuhp/pLmkR3dz2u4O0+xx2n+vkCz+H\npSy+4/K5Ooafz2Fd3TJbWVVI42IneobTR+FFW0WfsC7gJNf4Dud3devUcanOuT/60Y+Cdv5RAvJh\npk6dKj+ts2QAbON32PEnWAHSuMbyzfnz5zd+s3HeEUcc0fjNgRsWO1zXTTfd1FjWRdq33nqLL5NH\nP1tAwj84UMmyPrIsW7bMDB482OamHQ888IAJbhz7O7gpTPCPYI/Df8LtwvEquFEbydjcrAgJbnQb\nbbaIsoooI
8oBLUu5YLxly5bIrLIMt91y5YBgtHV8jaygzclgoLfl4lybRMQJV/ROkidtmvPPP99u\n87B9+3br6Mj/IPvQtJL+/fvbZavoBk5Lly61OzuXqWMrXap4PiB45tBDD/VG9aL+73jGpXk2vfba\naw0MjjrqqMZx1EHwEthrLMC52X0u/vKXv7RZ2T5BhGfB6aefLj8b31/72tcaxzyLBYPTTjutcf6a\na64xOMCKAzDP4WDAbnwaCUs6YOwICFGj9Jdffrlx/MMf/rBxfOaZZ9rjXbt2Nc61wuvKK69spPnV\nr37VOOYgPNYyPnRbPl6mAu4NSNj1sLgeywzs/ONy0yHcVAzUkBZEBn46zCVA9mLw5/nnn5dDu69O\n48dHBz/5yU/Cp0we/XoVFnMi6Y0jbQ3fOFJ08OYrh43vk08+uXFc1wNirkQ9ZNK2N/wPGM7Pih8G\n3VYrheKuhcvK8psBnrqpp9UGfhCrwNLSUscs9SbJI7q1wiZJGZomHgHfXhiK+r9j64LVq1fHN965\nykalIocccogcRn5/4QtfiDzvEp5f//rXNg2rCkVkUJff8g35doUxCZk2bZpZvHhx49LNN99sjyEx\nSGCpMZCe8Coee7GEP9QnY+JPf/pTWwMkC7KFQFDk5TiwQNlz/GGcZLVSnLQjmW55ceWUea1Ui4sA\nSwMuvvjipuVdgCdWBmmg3CTyG9YcTuNaYiTdu+++K4f2m2VtSSSvfknqII3b0XLjuEvdOBbSQvpW\nN07SdlFGHsEqEcY9T3l587J0Vt588pbVLr8Qh3C6JIHmwnmy/kYHCSDnliHnlDy4qOhxWQgU9X+H\nNTGNyOCbJk/ZaSEBEEtepqPkscces2MclphOiGv5FCvL+vXrG1WzR1udpVTikhc4iEz4JuYtQKV8\nBNpZJ5Jo4MZbCBO1dr95EIhwD0B8eSh0gsDwhhiOaFrWFJG0MfwdjvcicVbkfDi9/lYEBIGq/t+J\n/u40iJxr9S3WlPB1mR7i/NFHH20vyzc/3OkVe/GjP+5LJqdkBoBjyMt3vvOdxpQQrg583Jc8LDGd\neEZhgZZ6eT5SZ+CThppWXIuSa0XCUuNOa0Ud00bfpVTi4t6AgYNPW8DCgyWMPyycC1tY3JuL9Pv2\n7Qtni/ydV7/IQiNO1vHGiWhmqaf45+ShEHVPFF0xb4hMyYi/S9lTRK30F7+XFStWWP+XtG+urcrV\n84pAUgQ6+X8nOh122GFy2NL6LAmYvgmPB/x2p3UGDRpkk7tT7bSLYGxhCVbCNU5BDCArlOe+aAkx\nwWWBD64AQiLI7LoGNAor4cANzEeQP3GXcKeJqPb4449v1A5hC89sNC4mPOiU5T9OnVKJi+vQlNZS\nQuRAeevmpqAzEG4496bkHMTFvXGifEQAW24+iWCYRz/qTSpF3zhJ682ajnll+efMWkbV82HZwJek\nk1NEYczEn2XSpElWFyFS4XT6WxGoEwKf/exnG81xLSeNk6GDYMVSg7xAMvjtilgfsNq648S3vvWt\nJvJCJF0Zc8gvDqu8ULv5wi/PvJQzLom4L6pyrt132NLTLj3X3eki19UgPE0E+ZKXdPT8yle+0vR8\nZ6x1x0eJ3is6hIkh5XVbSiUu55xzTqN9ODEJYeAkYFx33XUNMkFoZBEY4cyZM+WnXdkwd+7cxm86\nKcyW5SYjEW/mzz33XCM9UwzujSU3clb9GgUnPMh74ySsJjZZmn+MoUOHNvnlxBZc44s4yL7yyiul\nrCJqB1vYn6WV30u7coq4DmHC6nTPPfdYixcPxqgP/7OkYfPG8FRbEXr0hTLS/J9WBQ+mOdNYC90F\nB//xH//RtplYGiAWvJjyLZYHMuInKQMtL7isSBXBx3H48OGNMcgd/CnH3djRtW5AbqQ+6oQQiXCe\nMtMK4yNlhUlDXDnudJGbTsY395zbFvAZMmRIo91sNyDjIwSH7VFccY0OGBD
CMxxu2k4dl7qqCABY\nQilOsHSOeGGHG+gCSx4BkhsBYAGL+TlhxLBld6UQN6h747k3k1uXeyNn1c8tL+kx7aMdiNw4UXmj\nbpyodGnPCfbBGv1eN2ZUWUU8QFk1xttIkZLnnwYr0qCPzMZJdMLicsYZZ9hBOM2DN0nZcWl40LOK\nJ+zPwm8hNGXrQz3sV8U01YsvvmiC+CL2rY03M/d/1W0H+HLfYBFlFQmm+SCui32wq0Oxi1T0MQOe\nG/YhOlX7s7793/EimmYwd1eb8qwkf6v/e8aE999/v4msCEIMsv/8z/8sP+03Uzt79+5tGiuaEgQ/\nGHMISeHW+Y1vfMP6kLikKJyP3xAPN19UGjmHfu3Kk7StviFUssKJNJQpRM3Nw1jH/2ar8Ze0jD1r\n1651s9ljd7HIyJEje13vyomyA8YEBCQ25H/Q6KbAdIHndlOwGwmig55x17geDtpD2e4n6NRGxEPS\nI2n1I0/QwY1yXf3aXSOtq0/4mHLRxxW3LgIBhcUtM/B4b7pMe8N1hIMLNWX46AeBsLodgC5Kr7zn\n0kTwJCAcH6STEV+pK3hQxzYVvQg+V4ZI8EHuG0L/87udPq30oC0SwI4gdkTSzVpWqzrqdJ4+Bae6\nCf2edgdw99lFgDdX3GdeQFzsM51nn/usCz+X3fwcU2Y4T0BY7FgUDPDh5I3flOvqJnVynvEpLO7z\nO6wT6YMX6Sa95fkcHsvC5cpv2iE68B2uQ9LJN3WGA7KiY1w+t71RY5CU3clvHGY7IgDjAgDI3Dhh\nINw0ABoW92bjRgsP9HSMm4Z62nUMdSTVj7RxN2PcNfKmvXHc8sJYUR56y41Lu12J+8dw04WPGRiD\nN/rw6cr/howxECeRMFkJ/05SRpo0DOhp6kibvp0u1M2gWRbBEELEfdUqenE7HfvCdf6X60buIC2Q\nlzTiDtzh55r7zIO4qJSHAGOIjC+MRb5Ix4iLLw1WPZIhwABW1lt9Mg2KT5V0UIgiEK4FpmjN8lhQ\n0DXPQEfdhGOHUKQdXLLggL4QSO6vKJyzlFmnPGnIdVXanaWvsXrwYsr/LN+uFUSJS+d63rXOiDWo\nc7W3rqlU59zgplOpKALMZYYdoCvaFKs2DqP4abQLP49vB3FcwoJPCU6qRa/syRufJY/TLpiQn1Vk\n+POweqlsoT6255gxY4YZO3ZsR5a3l92mIssnwviGDRuKLLKrZfH/RCTctD5O+ImII21gVTfBoNnV\ndvTFyoMXInPvvffapgfWlkS+kZ3CSYlLp5CuWD14puOYWQdhgMbpDOLSTgILRMsVELJEul0ZSa/L\nagtIUR4RJ14hQUnKYknnVVddZR555BGzYMGCtoQuSZlp0kCSWKmE4+95551XOCFMo4svaSHF3Av0\nSdEEuVttxMF74sSJmarHkTZwHbB5w3vZZSpQM6VCALIIaUTchS+pCikpsRKXkoCterFYXBgIeWOq\nupx66qn2je24446zgyUDZpQkCTTHEuk0BCGqHs5RF4NUOwtQq/zh85TFp1Xb3PQsW4YwbN682bCR\nYrcEfdGBtzliUhSBa7fakrVe2kyf8eF/jWXm4BJMo2Ut0qt8ixYtMu4qobTKkR9hZaobTiNtOZo+\nHQJYWyTYZzBd1LE9mJJq2Y9ZpKSJNV3fQkBi6fBGXmV5+umnrcmTQVLEHeAxSwuBYNBoJ0LmkqQN\nl8WbNNMyaU3n4XLiftO2Vps00qcMAligpM1xZXXqGnqtWrXKEhmxIHWq7k7Ww72DVU8kqp+wdK5b\nt65px3tJX6Vv7kOmA932Vkl/1dVfBJS4+Ns3XdeMhywDLAOtT4NcWmBOOukkM2fOHHPppZdGZoVM\nPPXUU434B/i4tCMlPJTTkg/wpK5ODMy8ydNnbjt8JS3SKUJeqn6/SXvk2yXJSe4t7hEIDfnc/pPy\nqvKN9QifnenTp1dFZdWzIggocalIR3V
LTQYTgo5V9eGDtYWoy9u3b28JYZiEJHkrprBwvpYVBBei\niERc+iKuuUSJiLYPPfSQ2bp1q9ck1HdylaRf6GtM7SJpCS756C92aceRuYrC/wbWlrqR0Cr2RR11\nVuJSx14tsE0MfrwlxjmtFlhdoUXx5orvxMKFC1v6ctA+JO7NttVARPnkb2dB4SEeNSVQaGNbFIaO\nWJP+4R/+wfq1tNO1RTEdPY2zLo7Usqqko5VnqAyMGaBFiuhryqScNWvWpLbsiR7d/FZrSzfRr3/d\nSlzq38e5W4iT1o4dOyr39pdE7zRWEwGSPCL/+7//a1iBFTWVJgNaljduKT/vtwyA9913X8upsrx1\nFJ0fMghmrK7ppvNwXLvcewAfqTIIIb4uOKf6biUL4yR6x1k5w3n0tyKQBgElLmnQ6qNpGfywXBB7\noxOxPoqAmYEFUzXfrawpXMtLKsAmyj+GwZdrZQxoafBh6oW9RpYuXZomW9fTyhSfL4M2/ek6mea9\nb5ICjOUiCOBWGesT1kksZlW1FCXtF03XXQSUuHQX/8rUzgMJ0zUm8W4Pxu1AS2JlYCBCWpGadnW4\n16mP8sCFb3aUPuigg8yAAQO6NkWEfjKIxJE3tx2+HXd7ugHcRJI41UraIr+5nyBJVbCY8X8wZswY\n+8JQVZ+4IvtOyyoPASUu5WFbu5J5C542bZrXS1bl4UlskLhl3EVYW9wOFiLEW7nr49DKP8bNW9Zx\ntwf+vO2ijzrp4NnNvorDigCKBAtkiTT3ta8yZcoUa92rqkOxr7iqXr0RUOLSGxM9E4OAz6s+hLQc\neuihsf44RZMW4KJupozaTaW5b/Fl+Uagj1hbqr6qo0zyRZ8V7VQL9mXIv/zLv9ipWlYa+Wjx9Pm5\nUEZ/aJndRUCJS3fxr2Tt8pBavHixNw9RIS0AGhdcTSwjRUwRSedRJvUzoKQhReGBs8jpCPqof//+\nlfGNECzD32J1cf1LwmnS/O4UcUyjU7u0cn/t2bPHS4unPA/i/u/atVGvKwJpEFDikgYtTdtAgIeV\nL5FOebB/9atfNUcffbRdhRG1wkcUT0MsJE/cN5YNN9AbZAR9srwVk88doN0ppzgdwtfQgby0tUiC\nFq6nU78JIBi3pD1OjzCmnXKqjdMpzTX0R6QfZbrWB58X7jN8WhAlLRYG/dMhBD72j4F0qC6tpkYI\nsPkZRIHVRkcddZRhcOmGPPHEE9YP4rLLLrOD2yc+8YmWapRBWhhQDjvssEad1M8Sab7jdGlkcA4g\nQFhd5LN3717zy1/+0hKh3/3ud031ONl6Hb700ktG3s57XazgiU9+8pPm5Zdfbmy4F9cEBtO33nrL\nYsagz3QcJE4wjcvr27UwaUE/9tvif40NCD/88ENz9tlnd0Vt/peIRE0oADZAjHtZ6IqCWmmtEVCL\nS627t/zGYXGYMGGCOeaYY2y0T3kzLLtmBiiWZ2/YsMFaWVgyGmfliBoE8ujIgzvOIlI0SaK9rj9G\n3LQS1rAqRzsO9wt9h6XEtUa5aVyn2jL9htw6yz5ud79yHSsjMn/+/NzL+pO2h/vwu9/9riUr7Bjc\nzqcrabmaThFIhQCbLKooAnkQCMKb99x9991s1tlz44039gQDTJ7iYvNKXQFB6pk8eXIPvxG+g4G9\nZd5gt92W19JcoJ6kZSVNl6Z+SQvGlC8fwYHrAYmLxULKqNK32ybpg6i2V6lNrXSlb5P+D/F/x/9C\n2f936Bo4n9u6xo0bl1i/Vm3U84pAHgTU4pKK5mniOAR4C7zrrrvslE3wILXm7DgrSFxZ4WuUzTJL\n3i6HDx9uZs2a1estU6wSYT+Goqwf6EAdSdtEeqQTViixOnzsYx8zp5xyigkeCmEIK/2bN/s///M/\nN6wyqotVJapDstwz3JPsx4UfEP93WEDD/wNRdSU5R9nLly+3+1yRPquvUZK6NI0ikBSBP0maUNMp\nAu0
QYIAmdkrwtmiTEkGTDxvGQR7SCoMx4cMpg6mRffv22YicEJioBzPz7JynLh64CAMBefMKuiBJ\nSQtpwQM9RBfOlSXoRdv/8Ic/mOCNuKxqulYuZOxP//RPbRvT9EHXFM5QcRbSQjXc9/J/xxQh/i/4\nwbDlRZb/O/TACZi4LJBEfIaWLFliNyr1dQuGDHBrlgojoBaXCndeFVQneBZ+KBs3brT7HTGoDho0\nyPpgoD9Ldnk47t+/3zbnwIEDNt22bdvs71GjRpnRo0ebkSNHpnIAhGjwQIdERZEcW3jCPzz84/xZ\n2hVD/rw6tKtDrkP0du7cGRt8T9JW6RsMsbbVNbhZVtLSqg/5vyOC84svvmg/bFqJM/3QoUMbWYYM\nGWJ2795tf7f6vzvttNM6YjFsKKUHikACBJS4JABJkxSDAJYHHExZ8cKDEsGKwl467gOVqaA459Ok\n2jzzzDPms5/9bCoriVu26JuXdFAOA1MnLAVYt5C6hVyvM3EpmrS49zDHch+3+7+DyBxxxBEduU/D\nOupvRSANAkpc0qClaSuDgAwGWF0gS2nJB/l54BdFNrAAMXWEPmVKXYkLfYFlrm6+O3KfdsIPqsz7\nTstWBDqJgPq4dBJtratjCDBFJEQB0sIbO4NfEsniz9KuXAiQu5y5XXq93oxA2YSvubbO/JL7TElL\nZ/DWWuqDgBKX+vSltuQjBKJ8SiAvvN3KG24rsMjLQFLGYCIEqlXder7vICAWuDLus76Dora0ryKg\nxKWv9nxN2w0xabWKSKZ95E3XhQBrjBCeMt/u0a0deXL10uM/IgBmdRnkhbSUeZ/pfaMI1BkBJS51\n7t0+2DaZImrVdLGmQFJEGBT5pPWDkfxpvqkfkpR02ipN2XVOS7/itF11UdJS9R5U/X1A4OM+KKE6\n1B8BBmp8PFjmLEsvo1otS6VZ4cC+LGnessViElWue443XZm2IWAbgc3EGuOmK+uYupLqmlYHlpdv\n3bo1bTbv0wfRcr3XsZ2CSlraIaTXFYFkCChxSYaTpsqAAFaMF154wQaRI54EsSSGDRtmY7gQ+TZK\nWLLJEunFixeb1atXG/YgIvYLmyjGkQvqajVFFFUP51il8vvf/77V5VLPExeGgSyuTVkUGDx4sMU7\nS16f8xBvhHuhqqKkpao9p3r7iIAuh/axVyquE1E3V65caa0rE7JufpwAABmPSURBVCdONASRO/XU\nUzMtBZZAWuxAO2DAADNnzpzIYHRpLRikl6BykB4sQkWTiHbdSL1IGqtSuzJpRx2XDRPFlUCE7Ehc\nNVHSUrUeU319R0CJi+89VCH9IBnslYK0Ihh5muMSovvuu68xiKUhLTJlFfZnaXU+j75J8qbRPUl5\npCHcOyHaw21Mmt/HdFjTNm/e3HFymRcLJS15EdT8ikBvBJS49MZEz6REAMsBkVrxX3EJRcpiEidn\nsGc/lkMPPdTMnDnTDtRJrBZJLCtlEIl2DSu6TjDB12X27Nntqq7EdQZ/9qvCQbdKoqSlSr2lulYJ\nAV1VVKXe8lBXrCC82eN/gPNtJ0z51Ld9+3YzduxYO32QZP8aBhGk3XQQZUMksMB0SopcIg05Y0+a\nH/zgB7YdnWpDmfU88cQThinHKgn3EGRalzxXqddU16ogoBaXqvSUh3ryZr9q1Sq7Y3O3piUgJNde\ne621vixfvjxyoGAQEX+WpDBSLoNOEktO0jLj0mV9Oyefu+IGEoTOVZ1aicKIqa+FCxeaquxMXLQF\nLQoTPacI9GUElLj05d7P2HasEVdffbV5//33zaOPPtqxwb2VuugzY8YM884775i1a9c2yEtev5Uk\nU0utdMpyPsmAFyYqrQjZ7bffbpedL1iwIIsq3uQRvyksbFWQJH1YhXaojoqAzwgocfG5dzzUDTIw\nZswYq5lLEnxQFQvQm2++ackLevJpNzXUTu+85Kdd+e516oIsuTozE
LrSiqi4aTimHKwu7QLyhfP5\n9nv8+PFmxIgRldjtWkmLb3eP6lNXBJS41LVnS2oXy1LDlo2SqspULOTlpZdeMo888og59thjM5UR\nlYlBKSlpiMqf9Nwzzzxjk7L0G8kzBQcWSFWtLmCOHxO+U777iihpsbea/lEEOoKAEpeOwFyPSph+\nIJCcb5YWF12mFiAtBJZL4rTr5m13XLTfi1hz3HrFOTgPYZHyqm51YSURxIUVaz6Lkhafe0d1qyMC\nSlzq2KsltAlCcNVVV9mVKp1yWM3aDAgB01llDHp5/F7CRIVAce60kNveogbDe+65x2zZsqVwEufq\nWsbxihUrzKJFi+zqsTLKL6rMovqpKH20HEWgLyCgxKUv9HLONjLgMk2CJaMqKzuwjvDGvmbNmlzT\nLVHQCQFpZxWRdFJGHFGRNPJN3rC/i1xL8005Z511lnVenjRpUpqsXUtbZt8V2SglLUWiqWUpAskR\nUOKSHKs+mxK/FmTp0qWVwqDst3YGLtfvBaLhBklLQ1SigGUALyIWCOWgJ74irSw8UfV34xxEqyxr\nWZHtUdJSJJpaliKQDgElLunw6nOpeUBXxUEyqnOwumBpKMPaAFF55ZVXzEEHHWT3UZIYKlF6ZD1X\n1ABJoMBp06Z5HzYfkvzBBx94PbVVVJ9kvSc0nyLQ1xFQ4tLX74A27a/SctSophRJvLBcRAV7y+P3\nEqWze66oKSPKdJeL+7hKx3f96AusVu2mCN3+02NFQBEoHgElLsVjWpsSixz0uwkK5OuSSy5JbXUJ\nExV3WijcnjIHNYgRUoRTtJADHwIHuhiKXr6uWCuSQLrt1mNFQBFIj4DuVZQes7Y5TjnlFNOvXz/7\nYZdeV+Q836+++qp7KfaY/VrcvLGJC7r4wAMPmFmzZnkfQ6Ndc9kSgBUq7QSi5n4gCrxdyyfOSsE1\n0pGfQa5IQQ/XdyZP2cR0GTZsmNUVYtZtASuIpQQOjMO4W7oqaekW8lqvIhCNQCWJC2RABnFIgkrx\nCPCwXrZsmR1Uii+9syXKSihIhSsuSeFYCIp8ZxlEyYuFRKwkbn15joUU5SlD8kJe2MUbCxIOzN0S\niBMrng455BBvYwMpaenW3aH1KgKtEfh460t6pS8jsHHjRjN58uRCpid8wPGv//qvLRFzdYEMlCGs\n3BHyUsT0jugI0WCwL2JlELt4v/7662bq1Klm3bp1hngvReoqOkd9QwbYEPPv/u7vzMMPP5x6Ci+q\nzDLOKWkpA1UtUxHIj0AlLS75m11uCT/5yU9MT0+P/TAwVFHWr19v34arqHtYZ4LnfelLX7JTc2JN\nKYu0SN1CAoqcjhELEANqEQIGW7duNSeeeKLd1wjyUlTZrfRjdRNWFoLi4ehaxmqvVnWnOa+kJQ1a\nmlYR6DACwQBbGdm2bVtPAE/kJ5i379WOBx98sIfzbh7O7d+/PzKtpLv88svt9TvuuKORVzJIGr7R\nh8+FF15o01E24tYp51rlf/bZZxv5KZOyOBeWxx9/vKEL6aIkTXuj8rvngoG3J/CrcE9V/rgbbQpW\nIfUElo1CsSu6PJQLSETPuHHjesDotttuK7TvweCpp57qCQiS/XDss6AveKgoAoqAnwjUcqro3Xff\ntdMczz//fDDGN8s111xjjjzySBOQAzN48ODmi86viy66yETld5LYt9Wbb77ZPZXqGBP9vHnzmvJQ\nJ5+AhFgzftPFFj+KaK9btFgJxGrgXstzjJ7EPdmxY4fdqPHll182AYlsKpK+OfPMM83RRx9tLQFn\nnHGGOeKII5rSZP0xfPhw87Of/axjUyLo6Trtxq1KStMmLCXik5MmX1xapp/Y24m+x4eMmDTnnnuu\ntYicfvrpqaensFgw3Ugfr1q1yoD9nDlzDFNUPotaWnzuHdVNEfgjArUkLvhmxJEOBsuLL77Y7Nq1\nyxDdNCyPPfZY+FTk7zykhQLDp
MWtBIJ1wgknGAaNdpK3veHyIRgMNEUJ5dHWxYsXty2Svgnjz6qg\nW265JTeBGTFihNm9e3dXti2AbEAKIDJFEEKIBX40RZTldgoEBuddSAbEgylDsEe4J8AQGTJkSNP/\nzp49e8yBAwfMW2+9Zd544w1LTgMLjl2GfsMNNxSup1Wi4D9KWgoGVItTBEpCoFI+LgzigeHKWiME\nD5Z2cg6/EoRlwy5pwXLBdT7BdItks2/67u/GhY8OKDeYBmrkDV+X3zzUpfws/iyt9KN89gZqJ0W1\n162HwV0GKPd8luPnnnvOYDVJQlpalU9eyqCsPII1h4G1WyJOtWLRyqMHhKWoJdJRekCwsI6wzQP1\nbN682UAgEQgKfTJ//vzGZ+fOnfba6NGjrcWG/wksOPiwFE2ubEUF/xFnaumjgovX4hQBRaBIBIIH\nTOUEX44AA/vBn8SV4OHauBaQCveSPW6V1z1P2fiuRInUy7f4woTTJfVxaacfdQSDhC2+lY9L1vaG\ndXZ/33333T188gq+RAFZaPSHi12WY8qizKyCbwh+HN2WIv1eyvB36TY+na4fX666+XN1GkOtTxHo\nJAK1myp67bXXgjHxjxJlNRg1apRctkGvgkGkyeQtF5NM0bAPTh4hmmtY8O9whWmWOF+cotrr1olV\ngjfnvIJvA1M/rmDJCgifjd3BVFiU8PbOfjVMVbjWM8qizJtuuikqW2XOFen3UuQS6coAWKCiEm+n\nClahAputRSkClUagdsSFCJwi+LG0kyjiwuCaRPr3758kWcs0UU6nYZ8b9IuTItobLh/SEKVbOF27\n3xAPV5iau+yyy9xTkcdCGiEoRBd2/W0os+rERRpdhN8LJAjBP0OOpXz9jkdASUs8PnpVEfAVgUr5\nuPgKouoVjYBrLcEXKAlpCZcEiSGviFumnKvyt/hU5PF7oQxioqgkR0BJS3KsNKUi4BsCtSMurrXE\nda4N5t8aTrTucRGWhaydihNsWMIWlnb6ldFeQrAzRVWkDBo0KHNxefJmrrSDGZmm4BPekiCNCrJE\nOk2evppWSUtf7Xltd10QqN1U0WmnnWZ9V+ggfCVk2sHHDiN6KPFiXCHuhQirYNoRlzLaO3To0F6+\nKaJT1m9WpWRZdUV95K27FOH3UtYSabDHIsSSZ/yM9u3bZ/bu3dvUJZBd7humT/HJgkj5KEpafOwV\n1UkRSIdA5S0u7733XlOLzznnnMZvYqG4uzNjRbjuuuu82aCR2CaufixtRmeRK6+8Ug5bfpfVXpa8\n5hWccEWIzXLnnXeasEVJrkd9kxZ83LgubplReeLOMfAywPosDPgMrjLAptEVqw2+LnzyCmUQnn/K\nlCk2GN2ECRPMypUrbbEDBw60u4azc7h8xJmblwXOsQkqzutsI5ClLXn1j8oveqgjbhQ6ek4RqA4C\nlbe48AbIQ5IpE2K54EdBfAlxWoUIuGTA7RoesN2WOP0kbkacjmW0l+BieeKuiL4MXC7pIGAfn2Bb\nA3PooYfagU3Sut9YWN5///2mFUVyPc9KLsgYVgHfBZ8VBlmsHOIDk1Rn0ueJqktenKgXLlxoJIBc\nsAVA21gsYQsLxCdYqm02bNhgrS/odf3113ctcq6SlqR3kKZTBCqAQCfXXhdVF3v5BNA2fQLi0ig+\nIDNN+/+E0/KbuC2uuHFc3LLcNBy7ZRFbJUrIL+nC9ch5vt29kNzzHIfztYrjQv1Z2hult5wjpkXw\nVio/M38Tg0b2cQq3L8tvypK4NlmUIoZLsCopS9au5AksTpn2zMmST2Lc0O/E8Ckyrgn6dHOvIo3T\n0pXbVytVBEpDAIfVSgoDuxvcLIpskCY8cBL0LSq4HGllMI0qS0CSNHznJS4QDspwiQ765tlkMWl7\npT2tvhnAithoLnBA7tUHLoZJj2kXZeUR6ipyQM6jS9K8DPpZgswlHawpn00VhbDwu0wRAhPsg1T
I\n/dVOV+7hqvV5uzbpdUWgryNQWeLS1zuu7PYH+x/1PPzww4VVAzF0CVpSwkIe8uYVLC3sTlxVgbyk\nJRXtCA/XwQRLVKcHd6w63ANFRGhu1aeQlrSYtSpLzysCioA/CPRDleABoqIINCGAY+a9995b+Ioe\nHKTZIRp/k5/+9Kfm17/+dVO9n/nMZ8zJJ59sV6cUuTP07bffbuuZPXt2U31V+oHPC6uPAutIYrVb\n+busWLHCxsfBQZz9hLohtAc/LnYCX7RoUaEB9CgbnDQoXzd6VutUBMpFQIlLufhWtnScK4niG7yJ\npxoofW0wS4Vx+k3r7Opbe3AypW+StiPKKXXmzJl26wQf8KAtM2bMMO+8845Zu3ZtIURDSYtvd63q\nowgUi0Dll0MXC4eWJgjwpnrjjTeaZcuWyanKfmM9GjBgQOLB3ueGYkXgkzRYHWkhB3wQSAsr7oi0\nm5T8lIkH9xk7UAdTgmbMmDENPbPWqaQlK3KaTxGoDgJqcalOX3VcUwbHsWPH2kGuyiZ3pp6mTZtm\nAr+djmNYZoX0D5ssJukb0gaO4Ja0FGXZKLptQqqy6qekpege0fIUAT8RUIuLn/3ihVbE5mCDw+XL\nl3uhTxYlsLbgxsWu4Aze7idLeT7lSROsjuB77Kz96KOPJiI63WjnggULrL/L1Vdfnbp6JS2pIdMM\nikBlEVCLS2W7rjOKM9BjdeGbaYeqyUknnWTmzJkTGfiMNrmCT48P0yeuTkmO2/m9MKhjmfFleiiu\nTUxpMWUULJc2SR2plbTEIarXFIH6IaDEpX59WniLMOF/8MEH1heh8MJLLJBw82vWrEm8MopBk8Hd\nlaRTMW6ebhyL7lERbM866yzrANut1UNp8YCIECGZvgu3J1yWkpYwIvpbEag/Akpc6t/HuVvIoMgA\nft9990VaLnJXUEIBRVkZKCeIBdKkYbvBtClxh39gRXLJFsvAd+zYYZ588skOa5KvOpZrs0R6+/bt\nLQsKt7VlQr2gCCgCtUJAiUuturO8xjBIMGXkwxLadq2EaJVpZQhPMbHU2qdpNMiWOOyiW1WXtGN1\n4Z6bPn16ry6nD3wmkL0U1hOKgCJQGAJKXAqDsv4FMfXy0EMPma1btzYGRh9bzYDH8lqcPTsh+JhA\nDlxxrR7u+U4do9Ott95qjjrqqMS+Ip3SLWk9QpaZvhMiRl4lLUkR1HSKQD0RUOJSz34trVV5l6yW\npthHBfuiX3iKqdOOvxAXrC3oceyxx5YNe2nljx8/3owYMaJhdVHSUhrUWrAiUBkElLhUpqv8UdQX\ncuAiwvTQ3LlzvY1TIs6zrs5lTjHRR0inrE5uu4o8hqhMnTrV+rooaSkSWS1LEaguAkpcqtt3XdWc\ngTHYuNAGNev2EmJIAUtokazBy7oBZtQUU1F+G5CiKvgjJcGdJe3XXHONue6665Ik1zSKgCJQcwQ+\nXvP2afNKQoA3eXxe8Cfp5moj3sJx4Jw4caKN1+L6QpTU9MKKxaE37NRLe1zJMsW0adMmG4+m24TS\nbUee42D3aruXUZ4yNK8ioAjUBwG1uNSnL7vSEiEORKa97bbbeg3EZSmFleW73/2uuf/++003dzgu\nq31SbtQUUzvHX6xh/fv3r6xTrrRdvvHTgSCHHaDlun4rAopA30JAiUvf6u9SWsvgin8JIeVnzZpl\nCNlepuWDMP7Ud8wxx1irT9hqUUojPSo07PiLau4UE1MrS5YsaTrnkfqZVKnT1FcmADSTIqAINBBQ\n4tKAQg/yIoD1Zf78+Wbbtm2WwLAipChSATl66qmnbFAy9Fy4cKE5//zz86pcm/wyxfThhx+ac845\np7KxW1p1yJQpU2xsnqpE/23VDj2vCCgC+RHQTRbzY6glfIQAb/1EaCVU+759++xyXAYcoqDiiJpW\nICtYVygDX49169ZZwkI0VSUtzWiCPZ+DDjrI4BOCYJmpiww
dOtTeU3Vpj7ZDEVAEsiOgFpfs2GnO\nNghAPFh5tH79erNhwwYzYMAAO71DXA6EnaddYQfjAwcOmLfeesu88cYbNlQ9g/All1xiLrjggsKs\nN26ddTuGJBIgcOnSpbVqGg7HixcvrtzWBbXqBG2MIuAJArqqyJOOqKMa+Llceumljf2NsABATvbv\n328JCtNKrgwaNMgMHDjQjB492nzjG9+olY+G284yjyF+WCfqJljcVBQBRUARAAElLnofdAwBlufW\nZYlux0DTiiwCOOfiO6WiCCgCioD6uOg9oAgoAt4jgJN3Fj8p7xumCioCikBqBJS4pIZMMygCioAi\noAgoAopAtxBQ4tIt5LVeRUARSIyAWlsSQ6UJFYHaI6DEpfZdrA1UBKqPAFFzZZl39VujLVAEFIE8\nCChxyYOe5lUEPEOAUP/E0FFRBBQBRaCuCChxqWvParv6JAKDBw82e/furV3bWVF04okn1q5d2iBF\nQBFIj4ASl/SYaQ5FwFsE2IBx9erV3uqXVTGCEhLjR0URUAQUASUueg8oAjVCgKB/WCbqFO6f7iGS\n8umnn16jntKmKAKKQFYElLhkRU7zKQKeIjBy5Ejz3HPPeapderUgYe+9954GL0wPneZQBGqJgBKX\nWnarNqovIzBq1Ci70WVdMHj11VfNxIkT69IcbYcioAjkRECJS04ANbsi4BsC7JyNlaIusU8WLVpk\nIGMqioAioAiAgBIXvQ8UgRoigIXirrvuqnzLfvzjH9tpIsiYiiKgCCgCINCvJxCFQhFQBOqFANaW\nL37xi+bnP/+5wWG3qjJ+/HgzYsQIM3369Ko2QfVWBBSBghFQi0vBgGpxioAPCLAp4fDhw83y5ct9\nUCeTDlhbiN9y9dVXZ8qvmRQBRaCeCKjFpZ79qq1SBKyfC3FdCJcPkamaqLWlaj2m+ioCnUFALS6d\nwVlrUQQ6jsDnPvc5c9ttt1VymmXFihXm7bffVmtLx+8arVAR8B8Btbj430eqoSKQGYHf/va3BqvL\nvHnzzKRJkzKX08mM4p+zZs0a66fTybq1LkVAEfAfASUu/veRaqgI5EIAX5GxY8eazZs3VyKI23nn\nnWfOPfdcM3v27Fzt1syKgCJQTwR0qqie/aqtUgQaCLC6aNasWWbChAkGC4zPMnPmTKuekhafe0l1\nUwS6i4BaXLqLv9auCHQMAUjBm2++adauXevlEmn027hxo9m6dauX+nWso7QiRUARiEVALS6x8OhF\nRaA+CCxYsMAMGzbMjBkzxjvLC6Rl1apV5vHHH1fSUp9bTluiCJSCgBKXUmDVQhUBPxFwyYsPO0gz\ndSWWoKr44PjZs6qVItB3EFDi0nf6WluqCFgEIC84v+IEu2nTpq6hwuohrD8yfcXybRVFQBFQBNoh\noMSlHUJ6XRGoIQI4vz7yyCPmqquuMlOmTOn41BFxWnAahkBhaanytgQ1vD20SYqA1wgocfG6e1Q5\nRaA8BNi4kL2MEGK93HPPPeVV9lHJLM3G0sOOz8Rp0dVDpUOuFSgCtUNAVxXVrku1QYpAegQgFPPn\nz7fRar/+9a/biLVFWkGefvpps3LlSrv3UJWC4aVHUnMoAopA2QgocSkbYS1fEagQAhCYBx54wCxb\ntsxMnjzZjB492owcOTLTVA5lPfvss+b+++83AwYMMDNmzDBf/vKXM5VVIQhVVUVAESgZASUuJQOs\nxSsCVUQAx9kXXnjBrFu3zqxevdr6orCUeuDAgWbIkCHm4IMP7tUsdnI+cOCA2bFjh81z4oknmnHj\nxpnLLrusEhF7ezVITygCioCXCChx8bJbVClFwC8EsJ7s2bPHEpMtW7ZEKjdo0KAGsTnuuOMquSN1\nZMP0pCKgCHiFgBIXr7pDlVEEFAFFQBFQBBSBOAR0VVEcOnpNEVAEFAFFQBFQBLxCQImLV92hyigC\nioAioAgoAopAHAJKXOL
Q0WuKgCKgCCgCioAi4BUCSly86g5VRhFQBBQBRUARUATiEFDiEoeOXlME\nFAFFQBFQBBQBrxBQ4uJVd6gyioAioAgoAoqAIhCHgBKXOHT0miKgCCgCioAioAh4hYASF6+6Q5VR\nBBQBRUARUAQUgTgElLjEoaPXFAFFQBFQBBQBRcArBP4fntNQJrCufL0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 27, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review = \"The movie was excellent\"\n", + "\n", + "Image(filename='sentiment_network_pos.png')" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "# Project 2: Creating the Input/Output Data" + ] + }, + { + "cell_type": "code", + "execution_count": 74, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "74074\n" + ] + } + ], + "source": [ + "vocab = set(total_counts.keys())\n", + "vocab_size = len(vocab)\n", + "print(vocab_size)" + ] + }, + { + "cell_type": "code", + "execution_count": 75, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['',\n", + " 'inhabitants',\n", + " 'goku',\n", + " 'stunts',\n", + " 'catepillar',\n", + " 'kristensen',\n", + " 'senegal',\n", + " 'goddess',\n", + " 'distroy',\n", + " 'unexplainably',\n", + " 'concoctions',\n", + " 'petite',\n", + " 'scribe',\n", + " 'stevson',\n", + " 'sctv',\n", + " 'soundscape',\n", + " 'rana',\n", + " 'metamorphose',\n", + " 'immortalizer',\n", + " 'henstridge',\n", + " 'planning',\n", + " 'akiva',\n", + " 'plod',\n", + " 'eko',\n", + " 'orderly',\n", + " 'zeleznice',\n", + " 'verbose',\n", + " 'amplify',\n", + " 'resonation',\n", + " 'critize',\n", + " 'jefferies',\n", + " 'mountainbillies',\n", + " 'steinbichler',\n", + " 'vowel',\n", + " 'rafe',\n", + " 'bonbons',\n", + " 'tulipe',\n", + " 'clot',\n", + " 'distended',\n", + " 'his',\n", + " 'impatiently',\n", + " 'unfortuntly',\n", + " 'lung',\n", + " 'scapegoats',\n", + " 'muzzle',\n", + " 'pscychosexual',\n", + " 'outbid',\n", + " 'obit',\n", + " 
'sideshows',\n", + " 'jugde',\n", + " 'particolare',\n", + " 'kevloun',\n", + " 'masterful',\n", + " 'quartier',\n", + " 'unravelling',\n", + " 'necessarily',\n", + " 'antiques',\n", + " 'strutts',\n", + " 'tilts',\n", + " 'disconcert',\n", + " 'dossiers',\n", + " 'sorriest',\n", + " 'blart',\n", + " 'iberia',\n", + " 'situations',\n", + " 'frmann',\n", + " 'daniell',\n", + " 'rays',\n", + " 'pried',\n", + " 'khoobsurat',\n", + " 'leavitt',\n", + " 'caiano',\n", + " 'sagan',\n", + " 'attractiveness',\n", + " 'kitaparaporn',\n", + " 'hamilton',\n", + " 'massages',\n", + " 'reasonably',\n", + " 'horgan',\n", + " 'chemist',\n", + " 'audrey',\n", + " 'jana',\n", + " 'dutch',\n", + " 'override',\n", + " 'spasms',\n", + " 'resumed',\n", + " 'stinson',\n", + " 'widows',\n", + " 'stonewall',\n", + " 'palatial',\n", + " 'neuman',\n", + " 'abandon',\n", + " 'anglophile',\n", + " 'marathon',\n", + " 'chevette',\n", + " 'unscary',\n", + " 'eponymously',\n", + " 'spoilerific',\n", + " 'fleashens',\n", + " 'brigand',\n", + " 'politeness',\n", + " 'clued',\n", + " 'dermatonecrotic',\n", + " 'grady',\n", + " 'mulligan',\n", + " 'ol',\n", + " 'bertolucci',\n", + " 'incubation',\n", + " 'oldboy',\n", + " 'snden',\n", + " 'plaintiffs',\n", + " 'fk',\n", + " 'deply',\n", + " 'franchot',\n", + " 'cyhper',\n", + " 'glorifying',\n", + " 'mazovia',\n", + " 'elizabeth',\n", + " 'palestine',\n", + " 'robby',\n", + " 'wongo',\n", + " 'moshing',\n", + " 'eeeee',\n", + " 'doltish',\n", + " 'bree',\n", + " 'postponed',\n", + " 'gunslinger',\n", + " 'debacles',\n", + " 'kamm',\n", + " 'herman',\n", + " 'rapture',\n", + " 'rolando',\n", + " 'tetsuothe',\n", + " 'premises',\n", + " 'bruck',\n", + " 'loosely',\n", + " 'boylen',\n", + " 'proportions',\n", + " 'grecianized',\n", + " 'wodehousian',\n", + " 'encapsuling',\n", + " 'partly',\n", + " 'posative',\n", + " 'calms',\n", + " 'stadling',\n", + " 'austrailia',\n", + " 'shortland',\n", + " 'wheeling',\n", + " 'darkie',\n", + " 'mckellar',\n", + " 
'cushy',\n", + " 'ooookkkk',\n", + " 'milky',\n", + " 'unfolded',\n", + " 'degrades',\n", + " 'authenticating',\n", + " 'rotheroe',\n", + " 'beart',\n", + " 'neath',\n", + " 'grispin',\n", + " 'intoxicants',\n", + " 'nnette',\n", + " 'slinging',\n", + " 'tsukamoto',\n", + " 'stows',\n", + " 'suddenness',\n", + " 'waqt',\n", + " 'degrading',\n", + " 'camazotz',\n", + " 'blarney',\n", + " 'shakher',\n", + " 'delinquency',\n", + " 'tomreynolds',\n", + " 'insecticide',\n", + " 'charlton',\n", + " 'hare',\n", + " 'wayland',\n", + " 'nakada',\n", + " 'urbane',\n", + " 'sadomasochistic',\n", + " 'larnia',\n", + " 'hyping',\n", + " 'yr',\n", + " 'hebert',\n", + " 'accentuating',\n", + " 'deathrow',\n", + " 'galligan',\n", + " 'unmediated',\n", + " 'treble',\n", + " 'alphabet',\n", + " 'soad',\n", + " 'donen',\n", + " 'lord',\n", + " 'recess',\n", + " 'handsome',\n", + " 'center',\n", + " 'vignettes',\n", + " 'rescuers',\n", + " 'pairings',\n", + " 'uselful',\n", + " 'sanders',\n", + " 'nots',\n", + " 'hatsumomo',\n", + " 'appleby',\n", + " 'tampax',\n", + " 'sprinkling',\n", + " 'defacing',\n", + " 'lofty',\n", + " 'opaque',\n", + " 'tlc',\n", + " 'romagna',\n", + " 'tablespoons',\n", + " 'bernhard',\n", + " 'verger',\n", + " 'acumen',\n", + " 'percentages',\n", + " 'wendingo',\n", + " 'resonating',\n", + " 'vntoarea',\n", + " 'redundancies',\n", + " 'red',\n", + " 'pitied',\n", + " 'belying',\n", + " 'gleefulness',\n", + " 'bibbidi',\n", + " 'heiligt',\n", + " 'gitane',\n", + " 'journalist',\n", + " 'focusing',\n", + " 'plethora',\n", + " 'citizen',\n", + " 'coster',\n", + " 'clunkers',\n", + " 'deplorable',\n", + " 'forgive',\n", + " 'proplems',\n", + " 'magwood',\n", + " 'bankers',\n", + " 'aqua',\n", + " 'donated',\n", + " 'disbelieving',\n", + " 'acomplication',\n", + " 'immediately',\n", + " 'contrasted',\n", + " 'reidelsheimer',\n", + " 'fox',\n", + " 'springs',\n", + " 'toolbox',\n", + " 'contacting',\n", + " 'ace',\n", + " 'washrooms',\n", + " 'raving',\n", + " 
'dynamism',\n", + " 'mae',\n", + " 'sky',\n", + " 'disharmony',\n", + " 'untutored',\n", + " 'icarus',\n", + " 'taint',\n", + " 'kargil',\n", + " 'captain',\n", + " 'paucity',\n", + " 'fits',\n", + " 'tumbles',\n", + " 'amer',\n", + " 'bueller',\n", + " 'redubbed',\n", + " 'cleansed',\n", + " 'kollos',\n", + " 'shara',\n", + " 'humma',\n", + " 'felichy',\n", + " 'outa',\n", + " 'piglets',\n", + " 'gombell',\n", + " 'supermen',\n", + " 'superlow',\n", + " 'enhance',\n", + " 'goode',\n", + " 'shalt',\n", + " 'kubanskie',\n", + " 'zenith',\n", + " 'ananda',\n", + " 'ocd',\n", + " 'matlin',\n", + " 'nosed',\n", + " 'presumptuous',\n", + " 'rerun',\n", + " 'toyko',\n", + " 'mazar',\n", + " 'sundry',\n", + " 'bilb',\n", + " 'fugly',\n", + " 'orchestrating',\n", + " 'prosaically',\n", + " 'maricarmen',\n", + " 'moveis',\n", + " 'conelly',\n", + " 'estrange',\n", + " 'lusciously',\n", + " 'seasonings',\n", + " 'sums',\n", + " 'delirious',\n", + " 'quincey',\n", + " 'flesh',\n", + " 'tootsie',\n", + " 'ai',\n", + " 'tenma',\n", + " 'appropriations',\n", + " 'chainsaw',\n", + " 'ides',\n", + " 'surrogacy',\n", + " 'pungent',\n", + " 'gallon',\n", + " 'damaso',\n", + " 'caribou',\n", + " 'perico',\n", + " 'supplying',\n", + " 'ro',\n", + " 'yuy',\n", + " 'valium',\n", + " 'debuted',\n", + " 'robbin',\n", + " 'mounts',\n", + " 'interpolated',\n", + " 'aetv',\n", + " 'plummer',\n", + " 'competence',\n", + " 'toadies',\n", + " 'dubiel',\n", + " 'clavichord',\n", + " 'asunder',\n", + " 'sublety',\n", + " 'airfix',\n", + " 'stoltzfus',\n", + " 'ruth',\n", + " 'fluorescent',\n", + " 'improves',\n", + " 'rebenga',\n", + " 'russells',\n", + " 'deliberation',\n", + " 'zsa',\n", + " 'dardino',\n", + " 'macs',\n", + " 'servile',\n", + " 'jlb',\n", + " 'apallonia',\n", + " 'crossbows',\n", + " 'locus',\n", + " 'mislead',\n", + " 'corey',\n", + " 'blundered',\n", + " 'jeopardizes',\n", + " 'disorganized',\n", + " 'discuss',\n", + " 'longish',\n", + " 'tieing',\n", + " 'ledger',\n", + " 
'speechifying',\n", + " 'amitabhz',\n", + " 'bbc',\n", + " 'chimayo',\n", + " 'pranked',\n", + " 'superman',\n", + " 'aggravated',\n", + " 'rifleman',\n", + " 'yvone',\n", + " 'radiant',\n", + " 'galico',\n", + " 'debris',\n", + " 'waking',\n", + " 'btw',\n", + " 'havnt',\n", + " 'francen',\n", + " 'chattered',\n", + " 'scathed',\n", + " 'pic',\n", + " 'ceremonies',\n", + " 'watergate',\n", + " 'betsy',\n", + " 'majorca',\n", + " 'meercat',\n", + " 'noirs',\n", + " 'grunts',\n", + " 'drecky',\n", + " 'tribulations',\n", + " 'avery',\n", + " 'talladega',\n", + " 'eights',\n", + " 'dumbing',\n", + " 'alloimono',\n", + " 'scrutinising',\n", + " 'geta',\n", + " 'beltrami',\n", + " 'pvc',\n", + " 'horse',\n", + " 'tiburon',\n", + " 'huitime',\n", + " 'ripple',\n", + " 'loitering',\n", + " 'forensics',\n", + " 'nearly',\n", + " 'elizabethan',\n", + " 'ellington',\n", + " 'uzi',\n", + " 'sicily',\n", + " 'camion',\n", + " 'motivated',\n", + " 'rung',\n", + " 'gao',\n", + " 'licitates',\n", + " 'protocol',\n", + " 'smirker',\n", + " 'torin',\n", + " 'newlywed',\n", + " 'rich',\n", + " 'dismay',\n", + " 'skyler',\n", + " 'moonwalks',\n", + " 'haranguing',\n", + " 'sunburst',\n", + " 'grifter',\n", + " 'undersold',\n", + " 'chearator',\n", + " 'marino',\n", + " 'scala',\n", + " 'conditioner',\n", + " 'ulysses',\n", + " 'lamarre',\n", + " 'figueroa',\n", + " 'flane',\n", + " 'allllllll',\n", + " 'slide',\n", + " 'lateness',\n", + " 'selbst',\n", + " 'gandhis',\n", + " 'dramatizing',\n", + " 'catchphrase',\n", + " 'doable',\n", + " 'stadiums',\n", + " 'alexanderplatz',\n", + " 'pandemonium',\n", + " 'misrepresents',\n", + " 'earth',\n", + " 'mounties',\n", + " 'seeker',\n", + " 'cheat',\n", + " 'outbreaks',\n", + " 'snowstorm',\n", + " 'baur',\n", + " 'schedules',\n", + " 'bathetic',\n", + " 'incorrect',\n", + " 'johnathon',\n", + " 'rosanne',\n", + " 'mundanely',\n", + " 'cauldrons',\n", + " 'forrest',\n", + " 'poky',\n", + " 'legislation',\n", + " 'womanness',\n", + " 
'spender',\n", + " 'crazy',\n", + " 'rational',\n", + " 'terrell',\n", + " 'zero',\n", + " 'coincides',\n", + " 'thoughout',\n", + " 'mathew',\n", + " 'narnia',\n", + " 'naseeruddin',\n", + " 'bucks',\n", + " 'affronts',\n", + " 'topple',\n", + " 'degree',\n", + " 'preyed',\n", + " 'passionately',\n", + " 'defeats',\n", + " 'torchwood',\n", + " 'sources',\n", + " 'botticelli',\n", + " 'compactor',\n", + " 'kosturica',\n", + " 'waiving',\n", + " 'gunnar',\n", + " 'stiffler',\n", + " 'fwd',\n", + " 'kawajiri',\n", + " 'eleanor',\n", + " 'sistahs',\n", + " 'soulhunter',\n", + " 'belies',\n", + " 'wrathful',\n", + " 'americans',\n", + " 'ferdinandvongalitzien',\n", + " 'kendra',\n", + " 'weirdy',\n", + " 'unforgivably',\n", + " 'chepart',\n", + " 'tatta',\n", + " 'departmentthe',\n", + " 'dig',\n", + " 'blatty',\n", + " 'marionettes',\n", + " 'atop',\n", + " 'chim',\n", + " 'saurian',\n", + " 'woes',\n", + " 'cloudscape',\n", + " 'resignedly',\n", + " 'unrooted',\n", + " 'keuck',\n", + " 'hitlerian',\n", + " 'stylings',\n", + " 'crewed',\n", + " 'bedeviled',\n", + " 'unfurnished',\n", + " 'reedus',\n", + " 'circumstances',\n", + " 'grasped',\n", + " 'smurfettes',\n", + " 'fn',\n", + " 'dishwashers',\n", + " 'roadie',\n", + " 'ruthlessness',\n", + " 'refrains',\n", + " 'lampooning',\n", + " 'semblance',\n", + " 'richart',\n", + " 'legions',\n", + " 'gwenneth',\n", + " 'enmity',\n", + " 'assess',\n", + " 'manufacturer',\n", + " 'bullosa',\n", + " 'outrun',\n", + " 'hogan',\n", + " 'chekov',\n", + " 'blithe',\n", + " 'code',\n", + " 'drillings',\n", + " 'revolvers',\n", + " 'aredavid',\n", + " 'robespierre',\n", + " 'achcha',\n", + " 'boyfriendhe',\n", + " 'wallow',\n", + " 'toga',\n", + " 'graphed',\n", + " 'tonking',\n", + " 'going',\n", + " 'bosnians',\n", + " 'willy',\n", + " 'rohauer',\n", + " 'fim',\n", + " 'forbidding',\n", + " 'yew',\n", + " 'rationalised',\n", + " 'shimomo',\n", + " 'opposition',\n", + " 'landis',\n", + " 'minded',\n", + " 'despicableness',\n", + 
" 'easting',\n", + " 'arghhhhh',\n", + " 'ebb',\n", + " 'trialat',\n", + " 'protected',\n", + " 'negras',\n", + " 'rick',\n", + " 'muti',\n", + " 'tracker',\n", + " 'shawl',\n", + " 'differentiates',\n", + " 'sweetheart',\n", + " 'deepened',\n", + " 'manmohan',\n", + " 'trevethyn',\n", + " 'brain',\n", + " 'incomprehensibly',\n", + " 'piercing',\n", + " 'pasadena',\n", + " 'shtick',\n", + " 'ute',\n", + " 'viggo',\n", + " 'supersedes',\n", + " 'ack',\n", + " 'cites',\n", + " 'taurus',\n", + " 'relevent',\n", + " 'minidress',\n", + " 'philosopher',\n", + " 'bel',\n", + " 'mahattan',\n", + " 'moden',\n", + " 'compiling',\n", + " 'advertising',\n", + " 'rogues',\n", + " 'unimaginative',\n", + " 'subpaar',\n", + " 'ademir',\n", + " 'darkly',\n", + " 'saturate',\n", + " 'fledgling',\n", + " 'breaths',\n", + " 'padre',\n", + " 'aszombi',\n", + " 'pachabel',\n", + " 'incalculable',\n", + " 'ozone',\n", + " 'sped',\n", + " 'mpho',\n", + " 'rawail',\n", + " 'forbid',\n", + " 'synth',\n", + " 'guttersnipe',\n", + " 'reputedly',\n", + " 'holiness',\n", + " 'unessential',\n", + " 'hampden',\n", + " 'asylum',\n", + " 'bolye',\n", + " 'strangers',\n", + " 'rantzen',\n", + " 'farrellys',\n", + " 'vigourous',\n", + " 'cantinflas',\n", + " 'enshrined',\n", + " 'boris',\n", + " 'expetations',\n", + " 'replaying',\n", + " 'prestige',\n", + " 'bukater',\n", + " 'overpaid',\n", + " 'exhude',\n", + " 'backsides',\n", + " 'topless',\n", + " 'sufferings',\n", + " 'nitwits',\n", + " 'cordova',\n", + " 'incensed',\n", + " 'danira',\n", + " 'unrelenting',\n", + " 'disabling',\n", + " 'ferdy',\n", + " 'gerard',\n", + " 'drewitt',\n", + " 'mero',\n", + " 'monsters',\n", + " 'precautions',\n", + " 'lamping',\n", + " 'relinquish',\n", + " 'demy',\n", + " 'drink',\n", + " 'chamberlin',\n", + " 'unjustifiably',\n", + " 'cove',\n", + " 'floodwaters',\n", + " 'searing',\n", + " 'isral',\n", + " 'ling',\n", + " 'grossness',\n", + " 'pickier',\n", + " 'pax',\n", + " 'wierd',\n", + " 'tereasa',\n", + " 
'smog',\n", + " 'girotti',\n", + " 'spat',\n", + " 'sera',\n", + " 'noxious',\n", + " 'misbehaving',\n", + " 'scouts',\n", + " 'refreshments',\n", + " 'autobiographic',\n", + " 'shi',\n", + " 'toyomichi',\n", + " 'bits',\n", + " 'psychotics',\n", + " 'barzell',\n", + " 'colt',\n", + " 'shivering',\n", + " 'pugilist',\n", + " 'gladiator',\n", + " 'dryer',\n", + " 'reissues',\n", + " 'scrivener',\n", + " 'predicable',\n", + " 'objection',\n", + " 'marmalade',\n", + " 'seems',\n", + " 'spellbind',\n", + " 'trifecta',\n", + " 'innovator',\n", + " 'shriekfest',\n", + " 'inthused',\n", + " 'contestants',\n", + " 'goody',\n", + " 'samotri',\n", + " 'serviced',\n", + " 'nozires',\n", + " 'ins',\n", + " 'mutilating',\n", + " 'dupes',\n", + " 'launius',\n", + " 'widescreen',\n", + " 'joo',\n", + " 'discretionary',\n", + " 'enlivens',\n", + " 'bushes',\n", + " 'chills',\n", + " 'header',\n", + " 'activist',\n", + " 'gethsemane',\n", + " 'phoenixs',\n", + " 'wreathed',\n", + " 'sacrine',\n", + " 'electrifyingly',\n", + " 'basely',\n", + " 'ghidora',\n", + " 'binder',\n", + " 'dogfights',\n", + " 'sugar',\n", + " 'doddsville',\n", + " 'porkys',\n", + " 'scattershot',\n", + " 'refunded',\n", + " 'rudely',\n", + " 'insteadit',\n", + " 'zatichi',\n", + " 'eurotrash',\n", + " 'radioraptus',\n", + " 'hurls',\n", + " 'boogeman',\n", + " 'weighs',\n", + " 'danniele',\n", + " 'converging',\n", + " 'hypothermia',\n", + " 'glorfindel',\n", + " 'birthdays',\n", + " 'attentive',\n", + " 'mallepa',\n", + " 'spacewalk',\n", + " 'manoy',\n", + " 'bombshells',\n", + " 'farts',\n", + " 'lyoko',\n", + " 'southron',\n", + " 'destruction',\n", + " 'flemming',\n", + " 'manhole',\n", + " 'elainor',\n", + " 'bowersock',\n", + " 'lowly',\n", + " 'wfst',\n", + " 'limousines',\n", + " 'skolimowski',\n", + " 'saban',\n", + " 'koen',\n", + " 'malaysia',\n", + " 'uwi',\n", + " 'cyd',\n", + " 'apeing',\n", + " 'bonecrushing',\n", + " 'dini',\n", + " 'merest',\n", + " 'janina',\n", + " 'chemotrodes',\n", + " 
'trials',\n", + " 'authorize',\n", + " 'whilhelm',\n", + " 'asthmatic',\n", + " 'broads',\n", + " 'missteps',\n", + " 'embittered',\n", + " 'chandeliers',\n", + " 'seeming',\n", + " 'miscalculate',\n", + " 'recommeded',\n", + " 'schoolwork',\n", + " 'coy',\n", + " 'mcconaughey',\n", + " 'philosophically',\n", + " 'waver',\n", + " 'fanny',\n", + " 'mestressat',\n", + " 'unwatchably',\n", + " 'saggy',\n", + " 'topness',\n", + " 'dwellings',\n", + " 'breakup',\n", + " 'hasselhoff',\n", + " 'superstars',\n", + " 'replay',\n", + " 'aggravates',\n", + " 'balances',\n", + " 'urging',\n", + " 'snidely',\n", + " 'aleksandar',\n", + " 'hildy',\n", + " 'kazuhiro',\n", + " 'slayer',\n", + " 'tangy',\n", + " 'brussels',\n", + " 'horne',\n", + " 'masayuki',\n", + " 'molden',\n", + " 'unravel',\n", + " 'goodtime',\n", + " 'interrogates',\n", + " 'bismillahhirrahmannirrahim',\n", + " 'rowboat',\n", + " 'dumann',\n", + " 'datedness',\n", + " 'astrotheology',\n", + " 'dekhiye',\n", + " 'valga',\n", + " 'kata',\n", + " 'wipes',\n", + " 'hostilities',\n", + " 'sentimentalising',\n", + " 'documentary',\n", + " 'salesman',\n", + " 'virtue',\n", + " 'unreasonably',\n", + " 'haver',\n", + " 'cei',\n", + " 'unglamorised',\n", + " 'balky',\n", + " 'complementary',\n", + " 'paychecks',\n", + " 'mnica',\n", + " 'wada',\n", + " 'ily',\n", + " 'prc',\n", + " 'ennobling',\n", + " 'functionality',\n", + " 'dissociated',\n", + " 'elk',\n", + " 'throbbing',\n", + " 'tempe',\n", + " 'linoleum',\n", + " 'photogrsphed',\n", + " 'bottacin',\n", + " 'hipper',\n", + " 'titillating',\n", + " 'barging',\n", + " 'untie',\n", + " 'sacchetti',\n", + " 'gnat',\n", + " 'roedel',\n", + " 'cohabitation',\n", + " 'performs',\n", + " 'sales',\n", + " 'migrs',\n", + " 'teachs',\n", + " 'nanavati',\n", + " 'fresco',\n", + " 'davison',\n", + " 'obstinate',\n", + " 'burglar',\n", + " 'masue',\n", + " 'dickory',\n", + " 'grills',\n", + " 'appelagate',\n", + " 'linkage',\n", + " 'enables',\n", + " 'loesser',\n", + " 
'patties',\n", + " 'prudent',\n", + " 'mallorquins',\n", + " 'nativetex',\n", + " 'suprise',\n", + " 'drippy',\n", + " 'quill',\n", + " 'speeded',\n", + " 'farscape',\n", + " 'saddening',\n", + " 'centuries',\n", + " 'mos',\n", + " 'improvisationally',\n", + " 'neccessarily',\n", + " 'transmitter',\n", + " 'tankers',\n", + " 'latte',\n", + " 'mechanisation',\n", + " 'faracy',\n", + " 'synthetically',\n", + " 'thoughtless',\n", + " 'rake',\n", + " 'ropes',\n", + " 'desirable',\n", + " 'whitewashed',\n", + " 'donal',\n", + " 'crabby',\n", + " 'lifeless',\n", + " 'perfidy',\n", + " 'teresa',\n", + " 'bulldog',\n", + " 'cockamamie',\n", + " 'rasberries',\n", + " 'notethe',\n", + " 'captivity',\n", + " 'chiseling',\n", + " 'smaller',\n", + " 'clampets',\n", + " 'alerts',\n", + " 'tough',\n", + " 'wellingtonian',\n", + " 'aaaahhhhhhh',\n", + " 'dither',\n", + " 'incertitude',\n", + " 'florentine',\n", + " 'imperioli',\n", + " 'licking',\n", + " 'disparagement',\n", + " 'artfully',\n", + " 'feds',\n", + " 'fumiya',\n", + " 'tearfully',\n", + " 'lanchester',\n", + " 'undertaken',\n", + " 'longlost',\n", + " 'netted',\n", + " 'carrell',\n", + " 'uncompelling',\n", + " 'reliefs',\n", + " 'leona',\n", + " 'autorenfilm',\n", + " 'unfriendly',\n", + " 'typewriter',\n", + " 'shifted',\n", + " 'bertrand',\n", + " 'blesses',\n", + " 'tricking',\n", + " 'fireflies',\n", + " 'zanes',\n", + " 'unknowingly',\n", + " 'unnerve',\n", + " 'caning',\n", + " 'flat',\n", + " 'recluse',\n", + " 'dcreasy',\n", + " 'chipmunk',\n", + " 'dipper',\n", + " 'musee',\n", + " 'cousin',\n", + " 'shys',\n", + " 'berserkers',\n", + " 'eve',\n", + " 'conflagration',\n", + " 'irks',\n", + " 'restricts',\n", + " 'parsing',\n", + " 'positronic',\n", + " 'copout',\n", + " 'khala',\n", + " 'swiftness',\n", + " 'higginson',\n", + " 'imprint',\n", + " 'walter',\n", + " 'sundance',\n", + " 'whispering',\n", + " 'thematically',\n", + " 'underimpressed',\n", + " 'uno',\n", + " 'expressly',\n", + " 'russkies',\n", + 
" 'discos',\n", + " 'shaping',\n", + " 'verson',\n", + " 'prototype',\n", + " 'chapman',\n", + " 'trafficker',\n", + " 'semetary',\n", + " 'unrealistically',\n", + " 'lifewell',\n", + " 'rivas',\n", + " 'consequent',\n", + " 'katsu',\n", + " 'titantic',\n", + " 'jalees',\n", + " 'ranee',\n", + " 'shipbuilding',\n", + " 'gambles',\n", + " 'dispenses',\n", + " 'disfigurement',\n", + " 'bright',\n", + " 'cristian',\n", + " 'puertorricans',\n", + " 'constituent',\n", + " 'capta',\n", + " 'jewel',\n", + " 'erect',\n", + " 'farah',\n", + " 'despondently',\n", + " 'avoide',\n", + " 'inconnu',\n", + " 'headquarters',\n", + " 'sanguisga',\n", + " ...]" + ] + }, + "execution_count": 75, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "list(vocab)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 0., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 46, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "import numpy as np\n", + "\n", + "layer_0 = np.zeros((1,vocab_size))\n", + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
[base64 "image/png" data elided — rendered output of the cell `Image(filename='sentiment_network.png')` below]
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 47, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "{'': 0,\n", + " 'inhabitants': 1,\n", + " 'goku': 2,\n", + " 'stunts': 3,\n", + " 'catepillar': 4,\n", + " 'kristensen': 5,\n", + " 'goddess': 7,\n", + " 'offing': 49797,\n", + " 'distroy': 8,\n", + " 'unexplainably': 9,\n", + " 'concoctions': 10,\n", + " 'petite': 11,\n", + " 'paramilitary': 24759,\n", + " 'scribe': 12,\n", + " 'stevson': 13,\n", + " 'senegal': 6,\n", + " 'sctv': 14,\n", + " 'soundscape': 15,\n", + " 'rana': 16,\n", + " 'immortalizer': 18,\n", + " 'rene': 67354,\n", + " 'eko': 23,\n", + " 'planning': 20,\n", + " 'akiva': 21,\n", + " 'plod': 22,\n", + " 'orderly': 24,\n", + " 'zeleznice': 25,\n", + " 'critize': 29,\n", + " 'baguettes': 25649,\n", + " 'jefferies': 30,\n", + " 'uncertainties': 61695,\n", + " 'mountainbillies': 31,\n", + " 'steinbichler': 32,\n", + " 'vowel': 33,\n", + " 'rafe': 34,\n", + " 'donig': 68719,\n", + " 
'tulipe': 36,\n", + " 'clot': 37,\n", + " 'hack': 12526,\n", + " 'distended': 38,\n", + " 'cornered': 37116,\n", + " 'impatiently': 40,\n", + " 'batrice': 12525,\n", + " 'unfortuntly': 41,\n", + " 'lung': 42,\n", + " 'scapegoats': 43,\n", + " 'pscychosexual': 45,\n", + " 'outbid': 46,\n", + " 'obit': 47,\n", + " 'sideshows': 48,\n", + " 'jugde': 49,\n", + " 'kevloun': 51,\n", + " 'quartier': 53,\n", + " 'harp': 61948,\n", + " 'unravelling': 54,\n", + " 'antiques': 56,\n", + " 'strutts': 57,\n", + " 'tilts': 58,\n", + " 'disconcert': 59,\n", + " 'dossiers': 60,\n", + " 'sorriest': 61,\n", + " 'craftsman': 49412,\n", + " 'blart': 62,\n", + " 'dependence': 37120,\n", + " 'sated': 61698,\n", + " 'iberia': 63,\n", + " 'sagan': 72,\n", + " 'frmann': 65,\n", + " 'daniell': 66,\n", + " 'rays': 67,\n", + " 'pried': 68,\n", + " 'khoobsurat': 69,\n", + " 'leavitt': 70,\n", + " 'caiano': 71,\n", + " 'attractiveness': 73,\n", + " 'kitaparaporn': 74,\n", + " 'hamilton': 75,\n", + " 'massages': 76,\n", + " 'horgan': 78,\n", + " 'chemist': 79,\n", + " 'audrey': 80,\n", + " 'yeow': 55655,\n", + " 'jana': 81,\n", + " 'dutch': 82,\n", + " 'pinchot': 24773,\n", + " 'override': 83,\n", + " 'dwervick': 63223,\n", + " 'spasms': 84,\n", + " 'resumed': 85,\n", + " 'tamale': 66259,\n", + " 'calibanian': 49636,\n", + " 'stinson': 86,\n", + " 'widows': 87,\n", + " 'stonewall': 88,\n", + " 'palatial': 89,\n", + " 'neuman': 90,\n", + " 'abandon': 91,\n", + " 'lemmings': 65314,\n", + " 'anglophile': 92,\n", + " 'ertha': 61706,\n", + " 'chevette': 94,\n", + " 'unscary': 95,\n", + " 'spoilerific': 97,\n", + " 'neworleans': 67639,\n", + " 'metamorphose': 17,\n", + " 'brigand': 99,\n", + " 'cheating': 41603,\n", + " 'clued': 101,\n", + " 'dermatonecrotic': 102,\n", + " 'grady': 103,\n", + " 'mulligan': 104,\n", + " 'ol': 105,\n", + " 'incubation': 107,\n", + " 'plaintiffs': 110,\n", + " 'snden': 109,\n", + " 'fk': 111,\n", + " 'deply': 112,\n", + " 'franchot': 113,\n", + " 'henstridge': 19,\n", + " 
'cyhper': 114,\n", + " 'verbose': 26,\n", + " 'mazovia': 116,\n", + " 'elizabeth': 117,\n", + " 'palestine': 118,\n", + " 'robby': 119,\n", + " 'wongo': 120,\n", + " 'moshing': 121,\n", + " 'mstified': 12543,\n", + " 'eeeee': 122,\n", + " 'doltish': 123,\n", + " 'bree': 124,\n", + " 'postponed': 125,\n", + " 'debacles': 127,\n", + " 'amplify': 27,\n", + " 'kamm': 128,\n", + " 'phantom': 18893,\n", + " 'boylen': 136,\n", + " 'rolando': 131,\n", + " 'premises': 133,\n", + " 'bruck': 134,\n", + " 'loosely': 135,\n", + " 'wodehousian': 139,\n", + " 'onishi': 70389,\n", + " 'encapsuling': 140,\n", + " 'partly': 141,\n", + " 'stadling': 144,\n", + " 'calms': 143,\n", + " 'darkie': 148,\n", + " 'wheeling': 147,\n", + " 'ursla': 15875,\n", + " 'subsidized': 49420,\n", + " 'mckellar': 149,\n", + " 'ooookkkk': 151,\n", + " 'milky': 152,\n", + " 'unfolded': 153,\n", + " 'degrades': 154,\n", + " 'authenticating': 155,\n", + " 'writeup': 12548,\n", + " 'rotheroe': 156,\n", + " 'beart': 157,\n", + " 'intoxicants': 160,\n", + " 'grispin': 159,\n", + " 'cannes': 61718,\n", + " 'antithetical': 70398,\n", + " 'nnette': 161,\n", + " 'tsukamoto': 163,\n", + " 'antwones': 44205,\n", + " 'stows': 164,\n", + " 'suddenness': 165,\n", + " 'vol': 61720,\n", + " 'waqt': 166,\n", + " 'camazotz': 168,\n", + " 'paps': 55042,\n", + " 'shakher': 170,\n", + " 'terminate': 63868,\n", + " 'kotex': 56419,\n", + " 'delinquency': 171,\n", + " 'bromwell': 25214,\n", + " 'insecticide': 173,\n", + " 'charlton': 174,\n", + " 'nakada': 177,\n", + " 'titted': 24791,\n", + " 'urbane': 178,\n", + " 'depicted': 54491,\n", + " 'sadomasochistic': 179,\n", + " 'hyping': 181,\n", + " 'yr': 182,\n", + " 'hebert': 183,\n", + " 'waxwork': 12990,\n", + " 'deathrow': 185,\n", + " 'nourishes': 24792,\n", + " 'unmediated': 187,\n", + " 'tamper': 37143,\n", + " 'soad': 190,\n", + " 'alphabet': 189,\n", + " 'donen': 191,\n", + " 'lord': 192,\n", + " 'recess': 193,\n", + " 'watchably': 61023,\n", + " 'handsome': 194,\n", + " 
'vignettes': 196,\n", + " 'pairings': 198,\n", + " 'uselful': 199,\n", + " 'sanders': 200,\n", + " 'outbursts': 72891,\n", + " 'nots': 201,\n", + " 'hatsumomo': 202,\n", + " 'actioned': 18292,\n", + " 'krimi': 24797,\n", + " 'appleby': 203,\n", + " 'tampax': 204,\n", + " 'sprinkling': 205,\n", + " 'defacing': 206,\n", + " 'lofty': 207,\n", + " 'verger': 213,\n", + " 'tablespoons': 211,\n", + " 'bernhard': 212,\n", + " 'goosebump': 64565,\n", + " 'acumen': 214,\n", + " 'percentages': 215,\n", + " 'wendingo': 216,\n", + " 'resonating': 217,\n", + " 'vntoarea': 218,\n", + " 'redundancies': 219,\n", + " 'strictly': 57081,\n", + " 'pitied': 221,\n", + " 'belying': 222,\n", + " 'michelangelo': 53153,\n", + " 'gleefulness': 223,\n", + " 'environmentalist': 24803,\n", + " 'gitane': 226,\n", + " 'corrected': 66547,\n", + " 'journalist': 227,\n", + " 'focusing': 228,\n", + " 'plethora': 229,\n", + " 'his': 39,\n", + " 'citizen': 230,\n", + " 'south': 55579,\n", + " 'clunkers': 232,\n", + " 'pendulous': 55991,\n", + " 'mounds': 24805,\n", + " 'deplorable': 233,\n", + " 'forgive': 234,\n", + " 'proplems': 235,\n", + " 'bankers': 237,\n", + " 'aqua': 238,\n", + " 'donated': 239,\n", + " 'disbelieving': 240,\n", + " 'acomplication': 241,\n", + " 'contrasted': 243,\n", + " 'muzzle': 44,\n", + " 'amphibians': 72141,\n", + " 'springs': 246,\n", + " 'reformatted': 49443,\n", + " 'toolbox': 247,\n", + " 'contacting': 248,\n", + " 'washrooms': 250,\n", + " 'raving': 251,\n", + " 'dynamism': 252,\n", + " 'mae': 253,\n", + " 'disharmony': 255,\n", + " 'molls': 72979,\n", + " 'dewaere': 12569,\n", + " 'untutored': 256,\n", + " 'icarus': 257,\n", + " 'taint': 258,\n", + " 'kargil': 259,\n", + " 'captain': 260,\n", + " 'paucity': 261,\n", + " 'fits': 262,\n", + " 'tumbles': 263,\n", + " 'amer': 264,\n", + " 'bueller': 265,\n", + " 'cleansed': 267,\n", + " 'shara': 269,\n", + " 'humma': 270,\n", + " 'outa': 272,\n", + " 'piglets': 273,\n", + " 'gombell': 274,\n", + " 'supermen': 275,\n", + 
" 'superlow': 276,\n", + " 'kubanskie': 280,\n", + " 'goode': 278,\n", + " 'disorganised': 45570,\n", + " 'zenith': 281,\n", + " 'ananda': 282,\n", + " 'matlin': 284,\n", + " 'particolare': 50,\n", + " 'presumptuous': 286,\n", + " 'rerun': 287,\n", + " 'toyko': 288,\n", + " 'bilb': 291,\n", + " 'sundry': 290,\n", + " 'fugly': 292,\n", + " 'orchestrating': 293,\n", + " 'prosaically': 294,\n", + " 'moveis': 296,\n", + " 'conelly': 297,\n", + " 'estrange': 298,\n", + " 'elfriede': 49455,\n", + " 'masterful': 52,\n", + " 'seasonings': 300,\n", + " 'quincey': 303,\n", + " 'frowning': 49456,\n", + " 'painkillers': 53444,\n", + " 'high': 25515,\n", + " 'flesh': 304,\n", + " 'tootsie': 305,\n", + " 'ai': 306,\n", + " 'tenma': 307,\n", + " 'duguay': 71257,\n", + " 'appropriations': 308,\n", + " 'ides': 310,\n", + " 'rui': 61734,\n", + " 'surrogacy': 311,\n", + " 'pungent': 312,\n", + " 'damaso': 314,\n", + " 'authoritarian': 61736,\n", + " 'caribou': 315,\n", + " 'ro': 318,\n", + " 'supplying': 317,\n", + " 'yuy': 319,\n", + " 'debuted': 321,\n", + " 'mounts': 323,\n", + " 'interpolated': 324,\n", + " 'aetv': 325,\n", + " 'plummer': 326,\n", + " 'asunder': 331,\n", + " 'airfix': 333,\n", + " 'dubiel': 329,\n", + " 'clavichord': 330,\n", + " 'crafty': 50465,\n", + " 'sublety': 332,\n", + " 'stoltzfus': 334,\n", + " 'ruth': 335,\n", + " 'fluorescent': 336,\n", + " 'improves': 337,\n", + " 'russells': 339,\n", + " 'tick': 43838,\n", + " 'zsa': 341,\n", + " 'macs': 343,\n", + " 'jlb': 345,\n", + " 'locus': 348,\n", + " 'mislead': 349,\n", + " 'merly': 49461,\n", + " 'corey': 350,\n", + " 'blundered': 351,\n", + " 'humourless': 3568,\n", + " 'disorganized': 353,\n", + " 'discuss': 354,\n", + " 'sharifi': 45391,\n", + " 'tieing': 356,\n", + " 'kats': 34784,\n", + " 'bbc': 360,\n", + " 'pranked': 362,\n", + " 'superman': 363,\n", + " 'holroyd': 9223,\n", + " 'aggravated': 364,\n", + " 'rifleman': 365,\n", + " 'yvone': 366,\n", + " 'vaugier': 24820,\n", + " 'radiant': 367,\n", + " 
'galico': 368,\n", + " 'debris': 369,\n", + " 'btw': 371,\n", + " 'denote': 24822,\n", + " 'havnt': 372,\n", + " 'francen': 373,\n", + " 'chattered': 374,\n", + " 'scathed': 375,\n", + " 'pic': 376,\n", + " 'ceremonies': 377,\n", + " 'everyplace': 65309,\n", + " 'betsy': 379,\n", + " 'finster': 37176,\n", + " 'meercat': 381,\n", + " 'noirs': 382,\n", + " 'grunts': 383,\n", + " 'tribulations': 385,\n", + " 'apparatus': 47673,\n", + " 'martnez': 25825,\n", + " 'telethons': 24825,\n", + " 'talladega': 387,\n", + " 'alloimono': 390,\n", + " 'situations': 64,\n", + " 'scrutinising': 391,\n", + " 'geta': 392,\n", + " 'beltrami': 393,\n", + " 'pvc': 394,\n", + " 'horse': 395,\n", + " 'tiburon': 396,\n", + " 'huitime': 397,\n", + " 'ripple': 398,\n", + " 'exceed': 61748,\n", + " 'loitering': 399,\n", + " 'forensics': 400,\n", + " 'nearly': 401,\n", + " 'ellington': 403,\n", + " 'uzi': 404,\n", + " 'rung': 408,\n", + " 'pillaged': 24829,\n", + " 'gao': 409,\n", + " 'licitates': 410,\n", + " 'protocol': 411,\n", + " 'smirker': 412,\n", + " 'torin': 413,\n", + " 'vizier': 31853,\n", + " 'newlywed': 414,\n", + " 'dismay': 416,\n", + " 'moonwalks': 418,\n", + " 'skyler': 417,\n", + " 'invested': 18455,\n", + " 'grifter': 421,\n", + " 'undersold': 422,\n", + " 'chearator': 423,\n", + " 'marino': 424,\n", + " 'scala': 425,\n", + " 'conditioner': 426,\n", + " 'lamarre': 428,\n", + " 'figueroa': 429,\n", + " 'mcinnerny': 61753,\n", + " 'allllllll': 431,\n", + " 'slide': 432,\n", + " 'lateness': 433,\n", + " 'selbst': 434,\n", + " 'dramatizing': 436,\n", + " 'doable': 438,\n", + " 'hollywoodize': 27207,\n", + " 'alexanderplatz': 440,\n", + " 'wholesome': 45745,\n", + " 'pandemonium': 441,\n", + " 'earth': 443,\n", + " 'mounties': 444,\n", + " 'seeker': 445,\n", + " 'cheat': 446,\n", + " 'outbreaks': 447,\n", + " 'savagely': 61759,\n", + " 'snowstorm': 448,\n", + " 'baur': 449,\n", + " 'schedules': 450,\n", + " 'bathetic': 451,\n", + " 'johnathon': 453,\n", + " 'origonal': 57843,\n", 
+ " 'rosanne': 454,\n", + " 'cauldrons': 456,\n", + " 'forrest': 457,\n", + " 'haifa': 
1043,\n", + " ...}" + ] + }, + "execution_count": 48, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "word2index = {}\n", + "\n", + "for i,word in enumerate(vocab):\n", + " word2index[word] = i\n", + "word2index" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_target_for_label(label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 54, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "1" + ] + }, + "execution_count": 52, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 55, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'NEGATIVE'" + 
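The cells above build a `word2index` lookup, a bag-of-words `update_input_layer`, and a `get_target_for_label` mapping. Lifted out of the notebook JSON, the same logic can be sketched as self-contained Python — the tiny `vocab`, `reviews`, and `labels` below are hypothetical stand-ins for the IMDB data loaded in earlier cells, not the real dataset:

```python
import numpy as np

# Hypothetical stand-ins for the notebook's `vocab`, `reviews`, `labels`.
vocab = ["the", "movie", "was", "great", "terrible"]
reviews = ["the movie was great", "the movie was terrible"]
labels = ["POSITIVE", "NEGATIVE"]

# Map each vocabulary word to a column index, as in the word2index cell.
word2index = {word: i for i, word in enumerate(vocab)}

def update_input_layer(review, layer_0):
    """Reset layer_0 to zeros, then count each word of one review."""
    layer_0 *= 0
    for word in review.split(" "):
        if word in word2index:
            layer_0[0][word2index[word]] += 1

def get_target_for_label(label):
    """POSITIVE -> 1, anything else -> 0."""
    return 1 if label == "POSITIVE" else 0

layer_0 = np.zeros((1, len(vocab)))
update_input_layer(reviews[0], layer_0)
print(layer_0)                           # word counts for the first review
print(get_target_for_label(labels[0]))   # 1
```

The guard `if word in word2index` is added here so unseen words are skipped; the notebook's first version of `update_input_layer` omits it and only the class version below includes it.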
] + }, + "execution_count": 55, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[1]" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 53, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[1])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 3: Building a Neural Network" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "- Start with your neural network from the last chapter\n", + "- 3 layer neural network\n", + "- no non-linearity in hidden layer\n", + "- use our functions to create the training data\n", + "- create a \"pre_process_data\" function to create vocabulary for our training data generating functions\n", + "- modify \"train\" to train over the entire corpus" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Where to Get Help if You Need it\n", + "- Re-watch previous week's Udacity Lectures\n", + "- Chapters 3-5 - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) - (40% Off: **traskud17**)" + ] + }, + { + "cell_type": "code", + "execution_count": 86, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " # set our random number generator \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews, labels)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self, reviews, labels):\n", + " \n", + " review_vocab = 
set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " \n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " self.layer_0[0][self.word2index[word]] += 1\n", + " \n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def train(self, training_reviews, 
training_labels):\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + " self.update_input_layer(review)\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward pass ###\n", + "\n", + " # TODO: Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error is the difference between desired target and actual output.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # TODO: Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # TODO: Update the weights\n", + " self.weights_1_2 -= layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " self.weights_0_1 -= self.layer_0.T.dot(layer_1_delta) * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / 
float(i+1))[:4] + \"%\")\n", + " if(i % 2500 == 0):\n", + " print(\"\")\n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \" #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + " self.update_input_layer(review.lower())\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] > 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 87, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 61, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):587.5 #Correct:500 #Tested:1000 Testing Accuracy:50.0%" + ] + } + ], + "source": [ + "# evaluate our model before training (just to show how horrible it is)\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "code", + "execution_count": 62, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% 
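The `train` method of `SentimentNetwork` above performs one plain gradient-descent step per review: a linear hidden layer, a sigmoid output, and error deltas pushed back through both weight matrices. Stripped of the progress bookkeeping, a single forward/backward step can be sketched as below — the dimensions and the fake count vector are toy values for illustration, not the notebook's real vocabulary size:

```python
import numpy as np

np.random.seed(1)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

input_nodes, hidden_nodes, output_nodes = 6, 4, 1
learning_rate = 0.1

# Same initialization scheme as the class: zeros into the hidden layer,
# small random values into the output layer.
weights_0_1 = np.zeros((input_nodes, hidden_nodes))
weights_1_2 = np.random.normal(0.0, output_nodes ** -0.5,
                               (hidden_nodes, output_nodes))

layer_0 = np.array([[1., 0., 2., 0., 1., 0.]])  # fake bag-of-words counts
target = 1                                      # a POSITIVE review

# Forward pass: the hidden layer has no nonlinearity.
layer_1 = layer_0.dot(weights_0_1)
layer_2 = sigmoid(layer_1.dot(weights_1_2))

# Backward pass: sigmoid derivative on the output, identity on the hidden layer.
layer_2_error = layer_2 - target
layer_2_delta = layer_2_error * layer_2 * (1 - layer_2)
layer_1_delta = layer_2_delta.dot(weights_1_2.T)

# Gradient descent updates, matching train() in the class above.
weights_1_2 -= layer_1.T.dot(layer_2_delta) * learning_rate
weights_0_1 -= layer_0.T.dot(layer_1_delta) * learning_rate
```

Because `weights_0_1` starts at zero, the first forward pass always yields `layer_2 = 0.5`; only the input-to-hidden weights move on the first step, which is one reason early training accuracy hovers near 50% in the runs below.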
Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):89.58 #Correct:1250 #Trained:2501 Training Accuracy:49.9%\n", + "Progress:20.8% Speed(reviews/sec):95.03 #Correct:2500 #Trained:5001 Training Accuracy:49.9%\n", + "Progress:27.4% Speed(reviews/sec):95.46 #Correct:3295 #Trained:6592 Training Accuracy:49.9%" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output 
weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 63, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.01)" + ] + }, + { + "cell_type": "code", + "execution_count": 64, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):96.39 #Correct:1247 #Trained:2501 Training Accuracy:49.8%\n", + "Progress:20.8% Speed(reviews/sec):99.31 #Correct:2497 #Trained:5001 Training Accuracy:49.9%\n", + "Progress:22.8% Speed(reviews/sec):99.02 #Correct:2735 #Trained:5476 Training Accuracy:49.9%" + ] + }, + { + "ename": 
"KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m \u001b[0;34m-=\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 65, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.001)" + ] + }, + { + "cell_type": "code", + "execution_count": 66, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):98.77 #Correct:1267 #Trained:2501 Training Accuracy:50.6%\n", + "Progress:20.8% Speed(reviews/sec):98.79 #Correct:2640 #Trained:5001 Training Accuracy:52.7%\n", + "Progress:31.2% Speed(reviews/sec):98.58 #Correct:4109 #Trained:7501 Training Accuracy:54.7%\n", + "Progress:41.6% Speed(reviews/sec):93.78 #Correct:5638 #Trained:10001 Training Accuracy:56.3%\n", + "Progress:52.0% Speed(reviews/sec):91.76 #Correct:7246 #Trained:12501 Training Accuracy:57.9%\n", + "Progress:62.5% 
Speed(reviews/sec):92.42 #Correct:8841 #Trained:15001 Training Accuracy:58.9%\n", + "Progress:69.4% Speed(reviews/sec):92.58 #Correct:9934 #Trained:16668 Training Accuracy:59.5%" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m 
\u001b[0;34m-=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Understanding Neural Noise" + ] + }, + { + "cell_type": "code", + "execution_count": 67, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
ITBiD8G9077fkJiJ5O56gR2HhXNgC4ltRSO/m48J5w7/z6hcuL+53Ezo7TveizvMGkFXy5M1S\nJ29wWB6cv0jZUzJxOjq/keXLl1v/kbRvlnHl6vzgIJDnfydP3jwI77XXXp3s3aYCOokSHhxxxBGd\nlElfaCEjzn+PzFE+ZlHnSEvAQCcXXnjhMP8LXrIdSXFpivomAJoTLCG+fv60DGkOOuggl9RgPUGv\nrJJmmixNHbmJSJaOdwoSzc2Zl7gRcKRBYJXOzOTSQkT8myXKx4JpDRcxDmchJI9+ru4k30V2dpL6\n8qZhXjYc8S9vmf4/Zdqy8uRNW5dLj+UBX4x+Tsm4ut238wc577zzrC6OGLnr+hYCvRDI87+TJ28v\nvbpdf9/73te5/OMf/7hznPfAd+TEZ8ONA5TLy60fNZWoq05cBFF+48fn5+PY+fa59FHfLKZwgzzT\nzZ/61KeikkWe86f2IxOETrrpGXfa6Rc1LYP/iHshZ2xFL//Zzzjsj53hKKtEq3biynG/C/sOzDK5\nhKU+gTKdj7+cNWj0sNgSQSM6dYWXDJEvvO6a374EJshOPdTpL/UML99lyRKSVT90de3y20SZcdf8\n8/7yXadHcJN0ykQvJ36+cJtJQ/1OFzDwxZ3nO6ynny587JZlhs+n/e0v7UIH+oG+TSqk9dtHGZSZ\nVVgemWZZcvDgGPrGN76Rtbpc+aKW87Jcl/P9FnAAO+4Lt0QXHMMfAqGRJvBRqETPfuNSdH0O37zl\n1u3/jvs2sOYlbpb/P8+zMiz+czvueeA/+xhrnPjPUz9N+Nh/BpM/fL3bb1dfeNzplsevD12j0ro0\nfvtJFyU+hq4sfzmvnydcnksf/ga7sPjjlj/mhtPl+R3dwpQlZul4n1TQUDd4+f9gYVCS3izhzsii\nn58nPMDHXcva2X55eYiIu6nczdytG4t6IEb9M6AHDxf6kutRH66Rxunsf4fx7taO8DUCbKXZYI3B\nl4G/34N/N8LRL32oB7yIawH+fDuiAS5RH9Jz70BQyJMnIFq47wbhN5imIcpxmNTt/y5tu8LPcvf8\nd+31n6VpiQhlxz1b3HMm6hnj1+nSue8w4XBEhG9/oHbp+WaMYyxy5yjDlygd3bM7rIufzx2DmSvb\nfXcjCnH3jMvLOOTaFVdHuJ9curzfhRCRtB2PtcI1nm//puh2jVopbAYAACaLSURBVMaGO8gvh2M6\nNwxWWv2oxycHvn69rmXpbL+utESk282MrnGS9sERVw5YR+kQ7pekv6P6L67uqPNpIjz6Az549Euo\nCwtEN0E3yEoZgjXDEQmIB7976ROnB21xAdEgJRCVrGXF1dGm8/QpOOWVuv3fpX0BoP3+cyM8gPrP\n+bRExGHLs9ivg2cQ5CDqGevycI363POKZzO6cN6d49sfYxizfMLh8lBmHIHhWjgf5VIX4ref83Hi\nt89/oY9LT51hndA3PMa5/PSLa3dcP7i0eb7jW5ih1KQd74MHCGEJW0vCLA0w/TQA1Q1MV35S/UhP\nea4Dwp3U7Rp503a2X17UPwn1O11oty/dbmY/XfiYgS6NKTWc3/+NDn6fOl3TflMGZeURBlgG1iQS\nJh/h30nKSJPGTX8kzZM2fa9yaR+DYFmEwREc7iusJpJoBPi/KIKs1en/DkILGUkj/mAbfq6lKacf\naf1nMAP+oIhPsBxJKqPthRKRMhRUmeUhwIBU5Fs3/6w+qUpKRMgTJntZW530IR9FOnwLSdb64/Ll\nsXCga56Bi7oJvw1BSDtYxLWn23n0hRByf0Xh3C3vIFxLQ5aT4FGH/7ssfY1VwU1rJHmbT4JF1jT+\nixTPI//ll5dDpyfPF9IOgtA/7hkOJmXKKAoPKpMMIAJXXHGFjXzKio0iBe/05557zgZ5Y437L37x\ni2HF/8Vf/IUhWixLnj/ykY8MW/I2LGHKHxs2bLDr93utBHDxQ4KBe
UQNxPLgfJEhy6MCl42ouMeJ\nrGWAybnnnmumT59u5s+fX2i7eqhsWJIcvOkaljWyZYPk9wjcfPPNNvzAjTfeWCgkVf3f8f9EAL6A\n8KZuDytSXETSgFCVGnujm3K+Ht3ScS2wDGSKl9Sr3Lpd9zEpvc1lshyVXW8E2KiqqA24qm4p0wKz\nZ8+2/gq9dOn1lt7req/y/etYnPJYM/yy0lpV8N3ACpJ0qsqvq6hjdOYe41MUDkXpVkU5YMAqrdGj\nR7cGjyz+IT72zhpR9lu3X2fUse8bEpCCjjXAP677FFJUu7Kc861V/bAAaWomSy+1JA8PRQYqBoum\nC235q7/6K/uQh0jEkYm48377KauIKSvqoqwihfKStIE5ewb/ItpRhP7oU/RUYBF69aMM+oA+4+P6\nAyyqJIhFtjtvW/B1cYN9UVO0WduHH4TvF+H0wsEzyn8vaz11z0c/0HampPxpqrL01tRMgPYgC9Mz\nSNFm4n5j+vDDD5tbbrllWKRDoqU6IQCQm26JmpJx6dx3t+kblybu2wUpK3O/mG6b5tGnRHRcu3Zt\np81xuvbzPHqxWRtTZ20OY8+9409TRG1uyLTVI488MmxH8X72RVF1cR9OnTp1WHuLKlvlDA4CIiKD\n09eRLXXzu8GbWq0GrUhlu5w89NBDrQ/EaaedFpkKchBMRdldKknAXjO9CEmWsO/gSV39GGij/Ebq\nSkJcpzgy0vT7zbXHffukN8m9xT0CQSFfr/vQ1VHH79NPP90cffTR5tJLL62jetKpIQiIiDSko8pU\nk8GBEL9NfZhgDbnmmmvM5s2bY2EKk4okb60UFs4XW0FwIYoYdEtfxDWf+OAEedddd5mNGzfWmlTW\nnSwl6Rf6Opgm6yTNYv2iv9gjhNDgTRT+N7CGtI1UNrEvmq6ziEjTe7AA/RnMeIvDnNy0tzPeLI86\n6iizcOFCc/zxx0eiQfuQbm2LG1gon/y9LBw8lKNM8JEKFXwSHbH2/PM//7N5+umne+pacPWZimP3\n0MCHpTGracCYAddJEX1NmZSzZs0au+rEld2Ub1lDmtJT9ddTRKT+fdQXDdnxeMuWLY17O0uidxqr\nhgObPE7+93//13z0ox+NtDK4ASrLG7ErP++3G9BuvfVWEzc1lbeOovND7sDs3nvvjSWQRdeZtjz/\nHsDHqBcZTVs+6fEVWbRoUe2tWOG2Ob27WSHDefRbCMQhICISh8yAnWcww7IwZ84cU3RckbKgZKDA\nNMx3nLWDa3lJAthE+ZcwmHKtjAEqDWZMdbCV+tKlS9Nkqzytm1Kry1QS/ek7mea9b5ICjGUhWHnS\nGOsQ1kMsWk215CTtF6XrHwIiIv3DuvY18YDBVIwJuurBtRdYSawADCxIHEnpVYd/nfooD1z4JmDb\nu9/9bhPEg6hsSgb93KDQjYz57ajbcdXmfXBzksTJ1KUt8pv7CdLTBIsW/wdTpkyxLwBN9Skrsu9U\nVjEIiIgUg2NrSuEt9ZJLLqn1Ekv3MAwCIHVddlyENcTvWEdseGv2fQTi/Ev8vGUdVz2Q520XfdRP\nh8cq+6obVi4CLkt6ua/rKjNnzrTWt6Y62NYV10HXS0Rk0O+AiPbXeVWDIyF77rlnV3+WokkIMFE3\nUzS9pq78t+yyfAvQx1lDmr5qoUwyRZ8V7WQK9mXIv//7v9upUVbS1NEiWefnQhn9oTL7h4CISP+w\nblRN7qGzePHi2jwUHQkByG7BupzloogpGddplEn9DBBpSE54ICzS/E8fsV9P0/dxcVYR3z/D4Z7l\nu19EMItucXnc/bV169ZaWiTd86Db/11c23ReCPRCQESkF0IDfJ2HT10iYfKg/vSnP232228/u8rA\nRUmN6p40RCEqf/gclgfqc8QGcoE+Wd5ayecPuP4UT7jebr/Rgby01enVLX3drxGQrtsS7G76hzHt\nl5NpN53SXEN/xPWjmx6tg88I9
xk+IYhIiIVBf0pA4E//XyAllKsiW4BAsNmRHfhZTbPvvvsaBosq\n5MEHH7R+BGeccYYdrN75znfGqlEGCWGA2GuvvTp1Uj9Levnupksng3cAocEq4j7btm0zP/7xjy2x\n+fWvfz2sHi/biMNvf/vbxr09j7jYwBPvete7zLPPPmu453oJg+Mrr7xiMWMQZ/oLUuYw7ZW/TtfD\nJATdDjzwQPu/NmvWLPPb3/7WfPzjH69EZf6XWA7O0vXbb789cvl6JYqp0tYhIItI67q0+AZhETjz\nzDPN/vvvb4gG6d7ciq9peIkMOCwnfuyxx6wVhCWO3awQUQ/14SWm+8WDuJvFomjSQ3t9f4Zu0zhY\nq5ocDTfcE/QdlgzfWuSn8Z1My/S78ess+7jX/cp1rIDIggULci9DT9oe7sOvfvWrlnxcd911PX2i\nkpardEIgFoGydtNTue1CgF1fb7rpJrsjIzupBgNGaQ10dQWEZ2jGjBmdHWw5HwzUsfUm2ZU2NrN3\ngXqSlpU0nVd84kMwpnz3QS8n7HjaDQuXrknffptcH0S1vUltitOVvk36P8T/Hf8LZf/foWvgjG3r\nmjZtWmL94tqo80IgKQImaUKlEwIgwMOTB2LAbO13kYMhZbuHLg/CqEHeDVDh3ohKG06T5Dc6pGkT\n6fn0Q9CLdr700ksW/37U2c86zj333KGrrrrKtjFNH/RTxyLqynLPcN/7/3dF3e+0h7L5v4MI8lm/\nfn0RzVQZQiAxAn8SayrRBSEQgQDTMjfeeGPHhE6ERT5M2TBVkVYwuRMumjKYiti+fbuN2Eicgiin\nQ3wsOE9dmJARTNjkzSvognSb/gnXAR7o4XQJXy/yN3rR9t/97ncmIGpFFl2Lsg4//HDzZ3/2Z7aN\nafqgFsonVKLXdExcMdz37v+OKTn8R/DZYouDLP936IFTLHFBmOrC52bJkiV248i4PZvidNN5IZAX\nAfmI5EVQ+Q3BmPDjCN6k7H41DJJjx461PgzAwxJTHnY7duywaO3atcume/755+3vyZMnm1NOOcVM\nmjQplUMcxIEHdPCGGUlabOEJ//Aw7+YP0qsY8kcRp175slyHuL366qtdg7llKbfqPGCIL0Rbg2Vl\nJSFx/cL/HRF+2eiQD5sIsqpswoQJnSzjx483r7/+uv0d9393xBFH9M3vq6OYDoSAh4CIiAeGDvMj\ngGUgMKvbFR08+BCsHOyF4j8gJ06caK0YWBTyyDe/+U3zvve9L5UVw6/P6ZuXRFAOA00/3uSxPiFt\nC7HdZiJSNAnx72GO3X3MSqpu/3cQk7333rsv92lYR/0WAnEIiIjEIaPztUfAPdyxikB+0pIJ8vMA\nL4o8YKGBWKFPmdJWIkJfYDkLJpbLhK/vZbv7NC/p7rviqlAI9AkB+Yj0CWhVUzwCTMm4gR8Swhs1\ng1kSyeIP0qtcCA2ESJINgbIJXDat8uVy95lISD4clbvdCIiItLt/W9u6KJ8MyAhvn+4NNK7x5GVg\nKGNwcIQorm6dHxwEnIWsjPtscFBUSwcBARGRQejllrURohG3SsZNs7g3Ub/pWEscgSnz7RvdepEh\nXy8d/x4BMGvLoO1ISJn3me4bIdAWBERE2tKTA9QONyUT12Rn7YB0OGGQ45PWj8TlT/NN/ZCepNNE\nacpuc1r6FSfmpotISNN7UPr3G4F39LtC1ddeBBh48ZFgWa5bKhjVWre0Fw9+9tVI8xbsLBpR5frn\neBN10yR/+qd/av76r/+6MKdUv564YywzSXWNKyPuPMuhN27cGHe5seeDwFqN1d0pLhLikNC3EEiO\ngIhIcqyUMgIBrAxPPvmkDUrmYhkcdthhNobI3LlzI3IYu7SXJb2LFy82q1atMkE0Rxugi03t3NRK\nVEbqipuSiUrPOVZh/OY3v4m7XOp54pIwMHVrUxYFxo0bZ/HOkrfOeYh3wb3QVBEJaWrPSe+qEdD
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
uAvnnHFNwOHMTNt4N/jZBhX+cHnJMrbATVLUQSIlAIywivKHx0GOKglgi\nZ5xxho1t4Jw4Gdj9wd3HgAdm1dJNPxe3oZuOZbSXYFXEicgrrFBieqxICTvzpikbcsVbe90Fnw8G\nTawQzockqc6kdzsJJ83jpyMv8XUWLlzYCUgWhHzvGQsEAuULRCZYWmwee+wxax1Br9mzZ9vYJH66\nfh2LhPQLadUjBApGoMi1wEWWxV4sQVOHfQIi0qkiICcjQrSH0xOvwxc/fodflp+GY78cYntECfld\nunA97jzf4RDx/rVwvrg4ItSfpb1RertzxFQI3hrdz9Z8VxniPQuIgb9Q5vDqafdKcTFW6HdiyBQZ\nV4N2VLnXjOKEZLn7lEcI1AOB2k7N4FcRDNSxsS7wq1i3bp1NE+wZE4zvfxQiofKWniUK6h9LKeaI\nMOa8fWLNcYK+AdFKpV/R7XWm6zwrOVx76vS9atUqc+CBB9ZJpa664DdCX6R1YiUfH2cF6FYJlgsc\ngKdOnWqj6bL65tJLL+1pAelWZvgauhCldfPmzTZ0/DXXXGOnbfpxf7k63D0d1k2/hYAQqDcCo+BD\n9VZR2pWFAL4BbNde123a07Z7w4YNJtiAzQ6GafPWIT1kJK0Ta68pGq5DyPfff3/ry9HPwRrfkc9/\n/vMmsL5Y4lMGxpAQ2gQRkggBIdBMBGprEWkmnM3SGufD5cuXN0vpLto+99xz1uehS5JaX8rixNpt\nSS99ixVkzpw5dk+YfpIQgMbqgvVlzZo11iG2CAdbvwNFQnw0dCwEmouALCLN7bvcmjMw4BgazK8X\naqbPrVjGAljaysZtaZ0/M1ZXWjamW+ibpO1w0zM+0bjiiivMypUra4EHbYEMvfnmm2bt2rWFWC9E\nQkq7/VSwEOg7ArKI9B3y+lSIOZupjGXLltVHqYyasAx19OjRiQfvjNX0JRuEgk9SvxHSMtjzQSAh\nrCjDGpGUzJTZMO4zdvjFT2rKlCkdPbPWKRKSFTnlEwL1REAWkXr2S9+0YrDDfM+g1eR59g984APm\nkksuMTNmzOgbdv2oiP5J6jdCWhyjISFFWR6KbqMjSVn1EwkpukdUnhCoHgFZRKrvg0o1wMdg4sSJ\n5u67765UjzyVYw3B55qYJgzG/idPuXXIm8ZvhGBuTMd8/etfry2pvPHGG81+++1nzj///NTwioSk\nhkwZhEAjEJBFpBHdVK6SDNxYRfj2/QzKrbW40g899FC7ZJTlo2GhTb7gE1OH6QpfpyTHvfxGGKSx\nnNRlOqZbm5hCYorm2GOPNfPmzeuWtHNNJKQDhQ6EQOsQEBFpXZdmaxAm87ffftvO5WcroZpcLBFl\nVQZOqkmEQZDB2pekUx9+niqOne5YSXzhPMuwcQhtylJsiAXh4em7cHv8tnEsEhJGRL+FQLsQEBFp\nV39mbg2DGQPyrbfeWlmI7rTKF2UFoJzwjsi9Bse0uhaZHiuPT54IVrZlyxa7RLfIesoui+XFixYt\n6hr3JdzWsnVS+UJACPQfARGR/mNe2xp56DNF04QlsGVbAcJTOiwNrtO0FeTJORejW1OXYGMV4Z4j\n5khY6IM6E8KwvvotBIRANgRERLLh1tpcTHXcddddZuPGjZ2Bro6NZQBjOSjOj/0QfDQY7H3xrRL+\n+X4do9MXv/hFs++++yb2teiXbknrceQ3vGpLJCQpgkonBJqPgIhI8/uw8BbkXWJZuEKhAuuiX3hK\np9+OsBARrCHoccABB4RQas7P008/3e6B46wiIiHN6TtpKgSKQEBEpAgUW1hGXQZ7H1qmY9hMra5x\nMtAv7Ahb5pQOfYT0yyrk90WRxxAP9sNhwzyRkCKRVVlCoBkIiIg0o58q0ZKBbv369TZIVtVLXhnk\nWfKJZA2GVQWIUVM6Rfk9QHKa4M+TBHeWYF9wwQXm4osvTpJcaYSAEGgRAu9oUVvUlIIR4
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 67, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 70, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 71, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 71, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 79, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "review_counter = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 80, + "metadata": { + "collapsed": true + }, + 
"outputs": [], + "source": [ + "for word in reviews[0].split(\" \"):\n", + " review_counter[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 81, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('.', 27),\n", + " ('', 18),\n", + " ('the', 9),\n", + " ('to', 6),\n", + " ('i', 5),\n", + " ('high', 5),\n", + " ('is', 4),\n", + " ('of', 4),\n", + " ('a', 4),\n", + " ('bromwell', 4),\n", + " ('teachers', 4),\n", + " ('that', 4),\n", + " ('their', 2),\n", + " ('my', 2),\n", + " ('at', 2),\n", + " ('as', 2),\n", + " ('me', 2),\n", + " ('in', 2),\n", + " ('students', 2),\n", + " ('it', 2),\n", + " ('student', 2),\n", + " ('school', 2),\n", + " ('through', 1),\n", + " ('insightful', 1),\n", + " ('ran', 1),\n", + " ('years', 1),\n", + " ('here', 1),\n", + " ('episode', 1),\n", + " ('reality', 1),\n", + " ('what', 1),\n", + " ('far', 1),\n", + " ('t', 1),\n", + " ('saw', 1),\n", + " ('s', 1),\n", + " ('repeatedly', 1),\n", + " ('isn', 1),\n", + " ('closer', 1),\n", + " ('and', 1),\n", + " ('fetched', 1),\n", + " ('remind', 1),\n", + " ('can', 1),\n", + " ('welcome', 1),\n", + " ('line', 1),\n", + " ('your', 1),\n", + " ('survive', 1),\n", + " ('teaching', 1),\n", + " ('satire', 1),\n", + " ('classic', 1),\n", + " ('who', 1),\n", + " ('age', 1),\n", + " ('knew', 1),\n", + " ('schools', 1),\n", + " ('inspector', 1),\n", + " ('comedy', 1),\n", + " ('down', 1),\n", + " ('about', 1),\n", + " ('pity', 1),\n", + " ('m', 1),\n", + " ('all', 1),\n", + " ('adults', 1),\n", + " ('see', 1),\n", + " ('think', 1),\n", + " ('situation', 1),\n", + " ('time', 1),\n", + " ('pomp', 1),\n", + " ('lead', 1),\n", + " ('other', 1),\n", + " ('much', 1),\n", + " ('many', 1),\n", + " ('which', 1),\n", + " ('one', 1),\n", + " ('profession', 1),\n", + " ('programs', 1),\n", + " ('same', 1),\n", + " ('some', 1),\n", + " ('such', 1),\n", + " ('pettiness', 1),\n", + " ('immediately', 1),\n", + " ('expect', 1),\n", + " 
('financially', 1),\n", + " ('recalled', 1),\n", + " ('tried', 1),\n", + " ('whole', 1),\n", + " ('right', 1),\n", + " ('life', 1),\n", + " ('cartoon', 1),\n", + " ('scramble', 1),\n", + " ('sack', 1),\n", + " ('believe', 1),\n", + " ('when', 1),\n", + " ('than', 1),\n", + " ('burn', 1),\n", + " ('pathetic', 1)]" + ] + }, + "execution_count": 81, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review_counter.most_common()" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 5).ipynb b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 5).ipynb new file mode 100644 index 0000000..54f7333 --- /dev/null +++ b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 5).ipynb @@ -0,0 +1,5762 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sentiment Classification & How To \"Frame Problems\" for a Neural Network\n", + "\n", + "by Andrew Trask\n", + "\n", + "- **Twitter**: @iamtrask\n", + "- **Blog**: http://iamtrask.github.io" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### What You Should Already Know\n", + "\n", + "- neural networks, forward and back-propagation\n", + "- stochastic gradient descent\n", + "- mean squared error\n", + "- and train/test splits\n", + "\n", + "### Where to Get Help if You Need it\n", + "- Re-watch previous Udacity Lectures\n", + "- 
Leverage the recommended Course Reading Material - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) (40% Off: **traskud17**)\n", + "- Shoot me a tweet @iamtrask\n", + "\n", + "\n", + "### Tutorial Outline:\n", + "\n", + "- Intro: The Importance of \"Framing a Problem\"\n", + "\n", + "\n", + "- Curate a Dataset\n", + "- Developing a \"Predictive Theory\"\n", + "- **PROJECT 1**: Quick Theory Validation\n", + "\n", + "\n", + "- Transforming Text to Numbers\n", + "- **PROJECT 2**: Creating the Input/Output Data\n", + "\n", + "\n", + "- Putting it all together in a Neural Network\n", + "- **PROJECT 3**: Building our Neural Network\n", + "\n", + "\n", + "- Understanding Neural Noise\n", + "- **PROJECT 4**: Making Learning Faster by Reducing Noise\n", + "\n", + "\n", + "- Analyzing Inefficiencies in our Network\n", + "- **PROJECT 5**: Making our Network Train and Run Faster\n", + "\n", + "\n", + "- Further Noise Reduction\n", + "- **PROJECT 6**: Reducing Noise by Strategically Reducing the Vocabulary\n", + "\n", + "\n", + "- Analysis: What's going on in the weights?" 
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbpresent": { + "id": "56bb3cba-260c-4ebe-9ed6-b995b4c72aa3" + } + }, + "source": [ + "# Lesson: Curate a Dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "eba2b193-0419-431e-8db9-60f34dd3fe83" + } + }, + "outputs": [], + "source": [ + "def pretty_print_review_and_label(i):\n", + " print(labels[i] + \"\\t:\\t\" + reviews[i][:80] + \"...\")\n", + "\n", + "g = open('reviews.txt','r') # What we know!\n", + "reviews = list(map(lambda x:x[:-1],g.readlines()))\n", + "g.close()\n", + "\n", + "g = open('labels.txt','r') # What we WANT to know!\n", + "labels = list(map(lambda x:x[:-1].upper(),g.readlines()))\n", + "g.close()" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "25000" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "len(reviews)" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "bb95574b-21a0-4213-ae50-34363cf4f87f" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy . it ran at the same time as some other programs about school life such as teachers . my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers . the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students . when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled . . . . . . . . . at . . . . . . . . . . high . a classic line inspector i m here to sack one of your teachers . student welcome to bromwell high . 
i expect that many adults of my age think that bromwell high is far fetched . what a pity that it isn t '" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "reviews[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e0408810-c424-4ed4-afb9-1735e9ddbd0a" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Lesson: Develop a Predictive Theory" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e67a709f-234f-4493-bae6-4fb192141ee0" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "labels.txt \t : \t reviews.txt\n", + "\n", + "NEGATIVE\t:\tthis movie is terrible but it has some good effects . ...\n", + "POSITIVE\t:\tadrian pasdar is excellent is this film . he makes a fascinating woman . ...\n", + "NEGATIVE\t:\tcomment this movie is impossible . is terrible very improbable bad interpretat...\n", + "POSITIVE\t:\texcellent episode movie ala pulp fiction . days suicides . it doesnt get more...\n", + "NEGATIVE\t:\tif you haven t seen this it s terrible . it is pure trash . 
i saw this about ...\n", + "POSITIVE\t:\tthis schiffer guy is a real genius the movie is of excellent quality and both e...\n" + ] + } + ], + "source": [ + "print(\"labels.txt \\t : \\t reviews.txt\\n\")\n", + "pretty_print_review_and_label(2137)\n", + "pretty_print_review_and_label(12816)\n", + "pretty_print_review_and_label(6267)\n", + "pretty_print_review_and_label(21934)\n", + "pretty_print_review_and_label(5297)\n", + "pretty_print_review_and_label(4998)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 1: Quick Theory Validation" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from collections import Counter\n", + "import numpy as np" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "positive_counts = Counter()\n", + "negative_counts = Counter()\n", + "total_counts = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for i in range(len(reviews)):\n", + " if(labels[i] == 'POSITIVE'):\n", + " for word in reviews[i].split(\" \"):\n", + " positive_counts[word] += 1\n", + " total_counts[word] += 1\n", + " else:\n", + " for word in reviews[i].split(\" \"):\n", + " negative_counts[word] += 1\n", + " total_counts[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('', 550468),\n", + " ('the', 173324),\n", + " ('.', 159654),\n", + " ('and', 89722),\n", + " ('a', 83688),\n", + " ('of', 76855),\n", + " ('to', 66746),\n", + " ('is', 57245),\n", + " ('in', 50215),\n", + " ('br', 49235),\n", + " ('it', 48025),\n", + " ('i', 40743),\n", + " ('that', 35630),\n", + " ('this', 35080),\n", + " ('s', 33815),\n", + " ('as', 26308),\n", + " ('with', 23247),\n", + " 
('for', 22416),\n", + " ('was', 21917),\n", + " ('film', 20937),\n", + " ('but', 20822),\n", + " ('movie', 19074),\n", + " ('his', 17227),\n", + " ('on', 17008),\n", + " ('you', 16681),\n", + " ('he', 16282),\n", + " ('are', 14807),\n", + " ('not', 14272),\n", + " ('t', 13720),\n", + " ('one', 13655),\n", + " ('have', 12587),\n", + " ('be', 12416),\n", + " ('by', 11997),\n", + " ('all', 11942),\n", + " ('who', 11464),\n", + " ('an', 11294),\n", + " ('at', 11234),\n", + " ('from', 10767),\n", + " ('her', 10474),\n", + " ('they', 9895),\n", + " ('has', 9186),\n", + " ('so', 9154),\n", + " ('like', 9038),\n", + " ('about', 8313),\n", + " ('very', 8305),\n", + " ('out', 8134),\n", + " ('there', 8057),\n", + " ('she', 7779),\n", + " ('what', 7737),\n", + " ('or', 7732),\n", + " ('good', 7720),\n", + " ('more', 7521),\n", + " ('when', 7456),\n", + " ('some', 7441),\n", + " ('if', 7285),\n", + " ('just', 7152),\n", + " ('can', 7001),\n", + " ('story', 6780),\n", + " ('time', 6515),\n", + " ('my', 6488),\n", + " ('great', 6419),\n", + " ('well', 6405),\n", + " ('up', 6321),\n", + " ('which', 6267),\n", + " ('their', 6107),\n", + " ('see', 6026),\n", + " ('also', 5550),\n", + " ('we', 5531),\n", + " ('really', 5476),\n", + " ('would', 5400),\n", + " ('will', 5218),\n", + " ('me', 5167),\n", + " ('had', 5148),\n", + " ('only', 5137),\n", + " ('him', 5018),\n", + " ('even', 4964),\n", + " ('most', 4864),\n", + " ('other', 4858),\n", + " ('were', 4782),\n", + " ('first', 4755),\n", + " ('than', 4736),\n", + " ('much', 4685),\n", + " ('its', 4622),\n", + " ('no', 4574),\n", + " ('into', 4544),\n", + " ('people', 4479),\n", + " ('best', 4319),\n", + " ('love', 4301),\n", + " ('get', 4272),\n", + " ('how', 4213),\n", + " ('life', 4199),\n", + " ('been', 4189),\n", + " ('because', 4079),\n", + " ('way', 4036),\n", + " ('do', 3941),\n", + " ('made', 3823),\n", + " ('films', 3813),\n", + " ('them', 3805),\n", + " ('after', 3800),\n", + " ('many', 3766),\n", + " ('two', 3733),\n", + 
" ('too', 3659),\n", + " ('think', 3655),\n", + " ('movies', 3586),\n", + " ('characters', 3560),\n", + " ('character', 3514),\n", + " ('don', 3468),\n", + " ('man', 3460),\n", + " ('show', 3432),\n", + " ('watch', 3424),\n", + " ('seen', 3414),\n", + " ('then', 3358),\n", + " ('little', 3341),\n", + " ('still', 3340),\n", + " ('make', 3303),\n", + " ('could', 3237),\n", + " ('never', 3226),\n", + " ('being', 3217),\n", + " ('where', 3173),\n", + " ('does', 3069),\n", + " ('over', 3017),\n", + " ('any', 3002),\n", + " ('while', 2899),\n", + " ('know', 2833),\n", + " ('did', 2790),\n", + " ('years', 2758),\n", + " ('here', 2740),\n", + " ('ever', 2734),\n", + " ('end', 2696),\n", + " ('these', 2694),\n", + " ('such', 2590),\n", + " ('real', 2568),\n", + " ('scene', 2567),\n", + " ('back', 2547),\n", + " ('those', 2485),\n", + " ('though', 2475),\n", + " ('off', 2463),\n", + " ('new', 2458),\n", + " ('your', 2453),\n", + " ('go', 2440),\n", + " ('acting', 2437),\n", + " ('plot', 2432),\n", + " ('world', 2429),\n", + " ('scenes', 2427),\n", + " ('say', 2414),\n", + " ('through', 2409),\n", + " ('makes', 2390),\n", + " ('better', 2381),\n", + " ('now', 2368),\n", + " ('work', 2346),\n", + " ('young', 2343),\n", + " ('old', 2311),\n", + " ('ve', 2307),\n", + " ('find', 2272),\n", + " ('both', 2248),\n", + " ('before', 2177),\n", + " ('us', 2162),\n", + " ('again', 2158),\n", + " ('series', 2153),\n", + " ('quite', 2143),\n", + " ('something', 2135),\n", + " ('cast', 2133),\n", + " ('should', 2121),\n", + " ('part', 2098),\n", + " ('always', 2088),\n", + " ('lot', 2087),\n", + " ('another', 2075),\n", + " ('actors', 2047),\n", + " ('director', 2040),\n", + " ('family', 2032),\n", + " ('between', 2016),\n", + " ('own', 2016),\n", + " ('m', 1998),\n", + " ('may', 1997),\n", + " ('same', 1972),\n", + " ('role', 1967),\n", + " ('watching', 1966),\n", + " ('every', 1954),\n", + " ('funny', 1953),\n", + " ('doesn', 1935),\n", + " ('performance', 1928),\n", + " ('few', 
1918),\n", + " ('bad', 1907),\n", + " ('look', 1900),\n", + " ('re', 1884),\n", + " ('why', 1855),\n", + " ('things', 1849),\n", + " ('times', 1832),\n", + " ('big', 1815),\n", + " ('however', 1795),\n", + " ('actually', 1790),\n", + " ('action', 1789),\n", + " ('going', 1783),\n", + " ('bit', 1757),\n", + " ('comedy', 1742),\n", + " ('down', 1740),\n", + " ('music', 1738),\n", + " ('must', 1728),\n", + " ('take', 1709),\n", + " ('saw', 1692),\n", + " ('long', 1690),\n", + " ('right', 1688),\n", + " ('fun', 1686),\n", + " ('fact', 1684),\n", + " ('excellent', 1683),\n", + " ('around', 1674),\n", + " ('didn', 1672),\n", + " ('without', 1671),\n", + " ('thing', 1662),\n", + " ('thought', 1639),\n", + " ('got', 1635),\n", + " ('each', 1630),\n", + " ('day', 1614),\n", + " ('feel', 1597),\n", + " ('seems', 1596),\n", + " ('come', 1594),\n", + " ('done', 1586),\n", + " ('beautiful', 1580),\n", + " ('especially', 1572),\n", + " ('played', 1571),\n", + " ('almost', 1566),\n", + " ('want', 1562),\n", + " ('yet', 1556),\n", + " ('give', 1553),\n", + " ('pretty', 1549),\n", + " ('last', 1543),\n", + " ('since', 1519),\n", + " ('different', 1504),\n", + " ('although', 1501),\n", + " ('gets', 1490),\n", + " ('true', 1487),\n", + " ('interesting', 1481),\n", + " ('job', 1470),\n", + " ('enough', 1455),\n", + " ('our', 1454),\n", + " ('shows', 1447),\n", + " ('horror', 1441),\n", + " ('woman', 1439),\n", + " ('tv', 1400),\n", + " ('probably', 1398),\n", + " ('father', 1395),\n", + " ('original', 1393),\n", + " ('girl', 1390),\n", + " ('point', 1379),\n", + " ('plays', 1378),\n", + " ('wonderful', 1372),\n", + " ('far', 1358),\n", + " ('course', 1358),\n", + " ('john', 1350),\n", + " ('rather', 1340),\n", + " ('isn', 1328),\n", + " ('ll', 1326),\n", + " ('later', 1324),\n", + " ('dvd', 1324),\n", + " ('war', 1310),\n", + " ('whole', 1310),\n", + " ('d', 1307),\n", + " ('away', 1306),\n", + " ('found', 1306),\n", + " ('screen', 1305),\n", + " ('nothing', 1300),\n", + " ('year', 
1297),\n", + " ('once', 1296),\n", + " ('hard', 1294),\n", + " ('together', 1280),\n", + " ('am', 1277),\n", + " ('set', 1277),\n", + " ('having', 1266),\n", + " ('making', 1265),\n", + " ('place', 1263),\n", + " ('comes', 1260),\n", + " ('might', 1260),\n", + " ('sure', 1253),\n", + " ('american', 1248),\n", + " ('play', 1245),\n", + " ('kind', 1244),\n", + " ('takes', 1242),\n", + " ('perfect', 1242),\n", + " ('performances', 1237),\n", + " ('himself', 1230),\n", + " ('worth', 1221),\n", + " ('everyone', 1221),\n", + " ('anyone', 1214),\n", + " ('actor', 1203),\n", + " ('three', 1201),\n", + " ('wife', 1196),\n", + " ('classic', 1192),\n", + " ('goes', 1186),\n", + " ('ending', 1178),\n", + " ('version', 1168),\n", + " ('star', 1149),\n", + " ('enjoy', 1146),\n", + " ('book', 1142),\n", + " ('nice', 1132),\n", + " ('everything', 1128),\n", + " ('during', 1124),\n", + " ('put', 1118),\n", + " ('seeing', 1111),\n", + " ('least', 1102),\n", + " ('house', 1100),\n", + " ('high', 1095),\n", + " ('watched', 1094),\n", + " ('men', 1087),\n", + " ('loved', 1087),\n", + " ('night', 1082),\n", + " ('anything', 1075),\n", + " ('guy', 1071),\n", + " ('believe', 1071),\n", + " ('top', 1063),\n", + " ('amazing', 1058),\n", + " ('hollywood', 1056),\n", + " ('looking', 1053),\n", + " ('main', 1044),\n", + " ('definitely', 1043),\n", + " ('gives', 1031),\n", + " ('home', 1029),\n", + " ('seem', 1028),\n", + " ('episode', 1023),\n", + " ('sense', 1020),\n", + " ('audience', 1020),\n", + " ('truly', 1017),\n", + " ('special', 1011),\n", + " ('fan', 1009),\n", + " ('second', 1009),\n", + " ('short', 1009),\n", + " ('mind', 1005),\n", + " ('human', 1001),\n", + " ('recommend', 999),\n", + " ('full', 996),\n", + " ('black', 995),\n", + " ('help', 991),\n", + " ('along', 989),\n", + " ('trying', 987),\n", + " ('small', 986),\n", + " ('death', 985),\n", + " ('friends', 981),\n", + " ('remember', 974),\n", + " ('often', 970),\n", + " ('said', 966),\n", + " ('favorite', 962),\n", + " 
('heart', 959),\n", + " ('early', 957),\n", + " ('left', 956),\n", + " ('until', 955),\n", + " ('let', 954),\n", + " ('script', 954),\n", + " ('maybe', 937),\n", + " ('today', 936),\n", + " ('live', 934),\n", + " ('less', 934),\n", + " ('moments', 933),\n", + " ('others', 929),\n", + " ('brilliant', 926),\n", + " ('shot', 925),\n", + " ('liked', 923),\n", + " ('become', 916),\n", + " ('won', 915),\n", + " ('used', 910),\n", + " ('style', 907),\n", + " ('mother', 895),\n", + " ('lives', 894),\n", + " ('came', 893),\n", + " ('stars', 890),\n", + " ('cinema', 889),\n", + " ('looks', 885),\n", + " ('perhaps', 884),\n", + " ('read', 882),\n", + " ('enjoyed', 879),\n", + " ('boy', 875),\n", + " ('drama', 873),\n", + " ('highly', 871),\n", + " ('given', 870),\n", + " ('playing', 867),\n", + " ('use', 864),\n", + " ('next', 859),\n", + " ('women', 858),\n", + " ('fine', 857),\n", + " ('effects', 856),\n", + " ('kids', 854),\n", + " ('entertaining', 853),\n", + " ('need', 852),\n", + " ('line', 850),\n", + " ('works', 848),\n", + " ('someone', 847),\n", + " ('mr', 836),\n", + " ('simply', 835),\n", + " ('children', 833),\n", + " ('picture', 833),\n", + " ('face', 831),\n", + " ('friend', 831),\n", + " ('keep', 831),\n", + " ('dark', 830),\n", + " ('overall', 828),\n", + " ('certainly', 828),\n", + " ('minutes', 827),\n", + " ('wasn', 824),\n", + " ('history', 822),\n", + " ('finally', 820),\n", + " ('couple', 816),\n", + " ('against', 815),\n", + " ('son', 809),\n", + " ('understand', 808),\n", + " ('lost', 807),\n", + " ('michael', 805),\n", + " ('else', 801),\n", + " ('throughout', 798),\n", + " ('fans', 797),\n", + " ('city', 792),\n", + " ('reason', 789),\n", + " ('written', 787),\n", + " ('production', 787),\n", + " ('several', 784),\n", + " ('school', 783),\n", + " ('rest', 781),\n", + " ('based', 781),\n", + " ('try', 780),\n", + " ('dead', 776),\n", + " ('hope', 775),\n", + " ('strong', 768),\n", + " ('white', 765),\n", + " ('tell', 759),\n", + " ('itself', 
758),\n", + " ('half', 753),\n", + " ('person', 749),\n", + " ('sometimes', 746),\n", + " ('past', 744),\n", + " ('start', 744),\n", + " ('genre', 743),\n", + " ('final', 739),\n", + " ('beginning', 739),\n", + " ('town', 738),\n", + " ('art', 734),\n", + " ('game', 732),\n", + " ('humor', 732),\n", + " ('yes', 731),\n", + " ('idea', 731),\n", + " ('late', 730),\n", + " ('becomes', 729),\n", + " ('despite', 729),\n", + " ('able', 726),\n", + " ('case', 726),\n", + " ('money', 723),\n", + " ('child', 721),\n", + " ('completely', 721),\n", + " ('side', 719),\n", + " ('camera', 716),\n", + " ('getting', 714),\n", + " ('instead', 712),\n", + " ('soon', 702),\n", + " ('under', 700),\n", + " ('viewer', 699),\n", + " ('age', 697),\n", + " ('days', 696),\n", + " ('stories', 696),\n", + " ('felt', 694),\n", + " ('simple', 694),\n", + " ('roles', 693),\n", + " ('video', 688),\n", + " ('name', 683),\n", + " ('either', 683),\n", + " ('doing', 677),\n", + " ('turns', 674),\n", + " ('wants', 671),\n", + " ('close', 671),\n", + " ('title', 669),\n", + " ('wrong', 668),\n", + " ('went', 666),\n", + " ('james', 665),\n", + " ('evil', 659),\n", + " ('budget', 657),\n", + " ('episodes', 657),\n", + " ('relationship', 655),\n", + " ('piece', 653),\n", + " ('fantastic', 653),\n", + " ('david', 651),\n", + " ('turn', 648),\n", + " ('murder', 646),\n", + " ('parts', 645),\n", + " ('brother', 644),\n", + " ('head', 643),\n", + " ('absolutely', 643),\n", + " ('experience', 642),\n", + " ('eyes', 641),\n", + " ('sex', 638),\n", + " ('direction', 637),\n", + " ('called', 637),\n", + " ('directed', 636),\n", + " ('lines', 634),\n", + " ('behind', 633),\n", + " ('sort', 632),\n", + " ('actress', 631),\n", + " ('lead', 630),\n", + " ('oscar', 628),\n", + " ('example', 627),\n", + " ('including', 627),\n", + " ('known', 625),\n", + " ('musical', 625),\n", + " ('chance', 621),\n", + " ('score', 620),\n", + " ('feeling', 619),\n", + " ('already', 619),\n", + " ('hit', 619),\n", + " ('voice', 
615),\n", + " ('moment', 612),\n", + " ('living', 612),\n", + " ('low', 610),\n", + " ('supporting', 610),\n", + " ('ago', 609),\n", + " ('themselves', 608),\n", + " ('hilarious', 605),\n", + " ('reality', 605),\n", + " ('jack', 604),\n", + " ('told', 603),\n", + " ('hand', 601),\n", + " ('moving', 600),\n", + " ('dialogue', 600),\n", + " ('quality', 600),\n", + " ('song', 599),\n", + " ('happy', 599),\n", + " ('paul', 598),\n", + " ('matter', 598),\n", + " ('light', 594),\n", + " ('future', 593),\n", + " ('entire', 592),\n", + " ('finds', 591),\n", + " ('gave', 589),\n", + " ('laugh', 587),\n", + " ('released', 586),\n", + " ('expect', 584),\n", + " ('fight', 581),\n", + " ('particularly', 580),\n", + " ('cinematography', 579),\n", + " ('police', 579),\n", + " ('whose', 578),\n", + " ('type', 578),\n", + " ('sound', 578),\n", + " ('enjoyable', 573),\n", + " ('view', 573),\n", + " ('husband', 572),\n", + " ('romantic', 572),\n", + " ('number', 572),\n", + " ('daughter', 572),\n", + " ('documentary', 571),\n", + " ('self', 570),\n", + " ('modern', 569),\n", + " ('robert', 569),\n", + " ('took', 569),\n", + " ('superb', 569),\n", + " ('mean', 566),\n", + " ('shown', 563),\n", + " ('coming', 561),\n", + " ('important', 560),\n", + " ('king', 559),\n", + " ('leave', 559),\n", + " ('change', 558),\n", + " ('wanted', 555),\n", + " ('somewhat', 555),\n", + " ('tells', 554),\n", + " ('run', 552),\n", + " ('events', 552),\n", + " ('country', 552),\n", + " ('career', 552),\n", + " ('heard', 550),\n", + " ('season', 550),\n", + " ('girls', 549),\n", + " ('greatest', 549),\n", + " ('etc', 547),\n", + " ('care', 546),\n", + " ('starts', 545),\n", + " ('english', 542),\n", + " ('killer', 541),\n", + " ('animation', 540),\n", + " ('guys', 540),\n", + " ('totally', 540),\n", + " ('tale', 540),\n", + " ('usual', 539),\n", + " ('opinion', 535),\n", + " ('miss', 535),\n", + " ('violence', 531),\n", + " ('easy', 531),\n", + " ('songs', 530),\n", + " ('british', 528),\n", + " ('says', 
526),\n", + " ('realistic', 525),\n", + " ('writing', 524),\n", + " ('act', 522),\n", + " ('writer', 522),\n", + " ('comic', 521),\n", + " ('thriller', 519),\n", + " ('television', 517),\n", + " ('power', 516),\n", + " ('ones', 515),\n", + " ('kid', 514),\n", + " ('novel', 513),\n", + " ('york', 513),\n", + " ('problem', 512),\n", + " ('alone', 512),\n", + " ('attention', 509),\n", + " ('involved', 508),\n", + " ('kill', 507),\n", + " ('extremely', 507),\n", + " ('seemed', 506),\n", + " ('hero', 505),\n", + " ('french', 505),\n", + " ('rock', 504),\n", + " ('stuff', 501),\n", + " ('wish', 499),\n", + " ('begins', 498),\n", + " ('taken', 497),\n", + " ('sad', 497),\n", + " ('ways', 496),\n", + " ('richard', 495),\n", + " ('knows', 494),\n", + " ('atmosphere', 493),\n", + " ('surprised', 491),\n", + " ('similar', 491),\n", + " ('taking', 491),\n", + " ('car', 491),\n", + " ('george', 490),\n", + " ('perfectly', 490),\n", + " ('across', 489),\n", + " ('sequence', 489),\n", + " ('eye', 489),\n", + " ('team', 489),\n", + " ('serious', 488),\n", + " ('powerful', 488),\n", + " ('room', 488),\n", + " ('due', 488),\n", + " ('among', 488),\n", + " ('order', 487),\n", + " ('b', 487),\n", + " ('cannot', 487),\n", + " ('strange', 487),\n", + " ('beauty', 486),\n", + " ('famous', 485),\n", + " ('tries', 484),\n", + " ('myself', 484),\n", + " ('happened', 484),\n", + " ('herself', 484),\n", + " ('class', 483),\n", + " ('four', 482),\n", + " ('cool', 481),\n", + " ('release', 479),\n", + " ('anyway', 479),\n", + " ('theme', 479),\n", + " ('opening', 478),\n", + " ('entertainment', 477),\n", + " ('unique', 475),\n", + " ('ends', 475),\n", + " ('slow', 475),\n", + " ('exactly', 475),\n", + " ('red', 474),\n", + " ('o', 474),\n", + " ('level', 474),\n", + " ('easily', 474),\n", + " ('interest', 472),\n", + " ('happen', 471),\n", + " ('crime', 470),\n", + " ('viewing', 468),\n", + " ('memorable', 467),\n", + " ('sets', 467),\n", + " ('group', 466),\n", + " ('stop', 466),\n", + " 
('dance', 463),\n", + " ('message', 463),\n", + " ('sister', 463),\n", + " ('working', 463),\n", + " ('problems', 463),\n", + " ('knew', 462),\n", + " ('mystery', 461),\n", + " ('nature', 461),\n", + " ('bring', 460),\n", + " ('believable', 459),\n", + " ('thinking', 459),\n", + " ('brought', 459),\n", + " ('mostly', 458),\n", + " ('couldn', 457),\n", + " ('disney', 457),\n", + " ('society', 456),\n", + " ('within', 455),\n", + " ('lady', 455),\n", + " ('blood', 454),\n", + " ('upon', 453),\n", + " ('viewers', 453),\n", + " ('parents', 453),\n", + " ('meets', 452),\n", + " ('form', 452),\n", + " ('soundtrack', 452),\n", + " ('usually', 452),\n", + " ('tom', 452),\n", + " ('peter', 452),\n", + " ('local', 450),\n", + " ('certain', 448),\n", + " ('follow', 448),\n", + " ('whether', 447),\n", + " ('possible', 446),\n", + " ('emotional', 445),\n", + " ('killed', 444),\n", + " ('de', 444),\n", + " ('above', 444),\n", + " ('middle', 443),\n", + " ('god', 443),\n", + " ('happens', 442),\n", + " ('flick', 442),\n", + " ('needs', 442),\n", + " ('masterpiece', 441),\n", + " ('major', 440),\n", + " ('period', 440),\n", + " ('haven', 439),\n", + " ('named', 439),\n", + " ('th', 438),\n", + " ('particular', 438),\n", + " ('earth', 437),\n", + " ('feature', 437),\n", + " ('stand', 436),\n", + " ('words', 435),\n", + " ('typical', 435),\n", + " ('obviously', 433),\n", + " ('elements', 433),\n", + " ('romance', 431),\n", + " ('jane', 430),\n", + " ('yourself', 427),\n", + " ('showing', 427),\n", + " ('fantasy', 426),\n", + " ('brings', 426),\n", + " ('america', 423),\n", + " ('guess', 423),\n", + " ('huge', 422),\n", + " ('unfortunately', 422),\n", + " ('indeed', 421),\n", + " ('running', 421),\n", + " ('talent', 420),\n", + " ('stage', 419),\n", + " ('started', 418),\n", + " ('sweet', 417),\n", + " ('leads', 417),\n", + " ('japanese', 417),\n", + " ('poor', 416),\n", + " ('deal', 416),\n", + " ('personal', 413),\n", + " ('incredible', 413),\n", + " ('fast', 412),\n", + " 
('became', 410),\n", + " ('deep', 410),\n", + " ('hours', 409),\n", + " ('nearly', 408),\n", + " ('dream', 408),\n", + " ('giving', 408),\n", + " ('turned', 407),\n", + " ('clearly', 407),\n", + " ('near', 406),\n", + " ('obvious', 406),\n", + " ('cut', 405),\n", + " ('surprise', 405),\n", + " ('body', 404),\n", + " ('era', 404),\n", + " ('female', 403),\n", + " ('hour', 403),\n", + " ('five', 403),\n", + " ('note', 399),\n", + " ('learn', 398),\n", + " ('truth', 398),\n", + " ('match', 397),\n", + " ('feels', 397),\n", + " ('except', 397),\n", + " ('tony', 397),\n", + " ('filmed', 394),\n", + " ('complete', 394),\n", + " ('clear', 394),\n", + " ('older', 393),\n", + " ('street', 393),\n", + " ('lots', 393),\n", + " ('eventually', 393),\n", + " ('keeps', 393),\n", + " ('buy', 392),\n", + " ('stewart', 391),\n", + " ('william', 391),\n", + " ('joe', 390),\n", + " ('meet', 390),\n", + " ('fall', 390),\n", + " ('shots', 389),\n", + " ('talking', 389),\n", + " ('difficult', 389),\n", + " ('unlike', 389),\n", + " ('rating', 389),\n", + " ('means', 388),\n", + " ('dramatic', 388),\n", + " ('appears', 386),\n", + " ('subject', 386),\n", + " ('wonder', 386),\n", + " ('present', 386),\n", + " ('situation', 386),\n", + " ('comments', 385),\n", + " ('sequences', 383),\n", + " ('general', 383),\n", + " ('lee', 383),\n", + " ('earlier', 382),\n", + " ('points', 382),\n", + " ('check', 379),\n", + " ('gone', 379),\n", + " ('ten', 378),\n", + " ('suspense', 378),\n", + " ('recommended', 378),\n", + " ('business', 377),\n", + " ('third', 377),\n", + " ('talk', 375),\n", + " ('leaves', 375),\n", + " ('beyond', 375),\n", + " ('portrayal', 374),\n", + " ('beautifully', 373),\n", + " ('single', 372),\n", + " ('bill', 372),\n", + " ('word', 371),\n", + " ('plenty', 371),\n", + " ('falls', 370),\n", + " ('whom', 370),\n", + " ('figure', 369),\n", + " ('battle', 369),\n", + " ('scary', 369),\n", + " ('non', 369),\n", + " ('return', 368),\n", + " ('using', 368),\n", + " ('doubt', 
367),\n", + " ('add', 367),\n", + " ('hear', 366),\n", + " ('solid', 366),\n", + " ('success', 366),\n", + " ('touching', 365),\n", + " ('political', 365),\n", + " ('oh', 365),\n", + " ('jokes', 365),\n", + " ('awesome', 364),\n", + " ('hell', 364),\n", + " ('boys', 364),\n", + " ('dog', 362),\n", + " ('recently', 362),\n", + " ('sexual', 362),\n", + " ('please', 361),\n", + " ('wouldn', 361),\n", + " ('features', 361),\n", + " ('straight', 361),\n", + " ('lack', 360),\n", + " ('forget', 360),\n", + " ('setting', 360),\n", + " ('mark', 359),\n", + " ('married', 359),\n", + " ('social', 357),\n", + " ('adventure', 356),\n", + " ('interested', 356),\n", + " ('brothers', 355),\n", + " ('sees', 355),\n", + " ('actual', 355),\n", + " ('terrific', 355),\n", + " ('move', 354),\n", + " ('call', 354),\n", + " ('various', 353),\n", + " ('dr', 353),\n", + " ('theater', 353),\n", + " ('animated', 352),\n", + " ('western', 351),\n", + " ('space', 350),\n", + " ('baby', 350),\n", + " ('leading', 348),\n", + " ('disappointed', 348),\n", + " ('portrayed', 346),\n", + " ('aren', 346),\n", + " ('screenplay', 345),\n", + " ('smith', 345),\n", + " ('hate', 344),\n", + " ('towards', 344),\n", + " ('noir', 343),\n", + " ('outstanding', 342),\n", + " ('decent', 342),\n", + " ('kelly', 342),\n", + " ('directors', 341),\n", + " ('journey', 341),\n", + " ('none', 340),\n", + " ('effective', 340),\n", + " ('looked', 340),\n", + " ('caught', 339),\n", + " ('cold', 339),\n", + " ('storyline', 339),\n", + " ('fi', 339),\n", + " ('sci', 339),\n", + " ('mary', 339),\n", + " ('rich', 338),\n", + " ('charming', 338),\n", + " ('harry', 337),\n", + " ('popular', 337),\n", + " ('manages', 337),\n", + " ('rare', 337),\n", + " ('spirit', 336),\n", + " ('open', 335),\n", + " ('appreciate', 335),\n", + " ('basically', 334),\n", + " ('moves', 334),\n", + " ('acted', 334),\n", + " ('deserves', 333),\n", + " ('subtle', 333),\n", + " ('mention', 333),\n", + " ('inside', 333),\n", + " ('pace', 333),\n", + " 
('century', 333),\n", + " ('boring', 333),\n", + " ('familiar', 332),\n", + " ('background', 332),\n", + " ('ben', 331),\n", + " ('creepy', 330),\n", + " ('supposed', 330),\n", + " ('secret', 329),\n", + " ('jim', 328),\n", + " ('die', 328),\n", + " ('question', 327),\n", + " ('effect', 327),\n", + " ('natural', 327),\n", + " ('rate', 326),\n", + " ('language', 326),\n", + " ('impressive', 326),\n", + " ('intelligent', 325),\n", + " ('saying', 325),\n", + " ('material', 324),\n", + " ('realize', 324),\n", + " ('telling', 324),\n", + " ('scott', 324),\n", + " ('singing', 323),\n", + " ('dancing', 322),\n", + " ('adult', 321),\n", + " ('imagine', 321),\n", + " ('visual', 321),\n", + " ('kept', 320),\n", + " ('office', 320),\n", + " ('uses', 319),\n", + " ('pure', 318),\n", + " ('wait', 318),\n", + " ('stunning', 318),\n", + " ('copy', 317),\n", + " ('review', 317),\n", + " ('previous', 317),\n", + " ('seriously', 317),\n", + " ('somehow', 316),\n", + " ('created', 316),\n", + " ('magic', 316),\n", + " ('create', 316),\n", + " ('hot', 316),\n", + " ('reading', 316),\n", + " ('crazy', 315),\n", + " ('air', 315),\n", + " ('frank', 315),\n", + " ('stay', 315),\n", + " ('escape', 315),\n", + " ('attempt', 315),\n", + " ('hands', 314),\n", + " ('filled', 313),\n", + " ('surprisingly', 312),\n", + " ('expected', 312),\n", + " ('average', 312),\n", + " ('complex', 311),\n", + " ('studio', 310),\n", + " ('successful', 310),\n", + " ('quickly', 310),\n", + " ('male', 309),\n", + " ('plus', 309),\n", + " ('co', 307),\n", + " ('minute', 306),\n", + " ('images', 306),\n", + " ('casting', 306),\n", + " ('exciting', 306),\n", + " ('following', 306),\n", + " ('members', 305),\n", + " ('german', 305),\n", + " ('e', 305),\n", + " ('reasons', 305),\n", + " ('follows', 305),\n", + " ('themes', 305),\n", + " ('touch', 304),\n", + " ('genius', 304),\n", + " ('free', 304),\n", + " ('edge', 304),\n", + " ('cute', 304),\n", + " ('outside', 303),\n", + " ('ok', 302),\n", + " ('admit', 
302),\n", + " ('younger', 302),\n", + " ('reviews', 302),\n", + " ('odd', 301),\n", + " ('fighting', 301),\n", + " ('master', 301),\n", + " ('break', 300),\n", + " ('thanks', 300),\n", + " ('recent', 300),\n", + " ('comment', 300),\n", + " ('apart', 299),\n", + " ('lovely', 298),\n", + " ('begin', 298),\n", + " ('emotions', 298),\n", + " ('doctor', 297),\n", + " ('italian', 297),\n", + " ('party', 297),\n", + " ('la', 296),\n", + " ('missed', 296),\n", + " ...]" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "positive_counts.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "pos_neg_ratios = Counter()\n", + "\n", + "for term,cnt in list(total_counts.most_common()):\n", + " if(cnt > 100):\n", + " pos_neg_ratio = positive_counts[term] / float(negative_counts[term]+1)\n", + " pos_neg_ratios[term] = pos_neg_ratio\n", + "\n", + "for word,ratio in pos_neg_ratios.most_common():\n", + " if(ratio > 1):\n", + " pos_neg_ratios[word] = np.log(ratio)\n", + " else:\n", + " pos_neg_ratios[word] = -np.log((1 / (ratio+0.01)))" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('edie', 4.6913478822291435),\n", + " ('paulie', 4.0775374439057197),\n", + " ('felix', 3.1527360223636558),\n", + " ('polanski', 2.8233610476132043),\n", + " ('matthau', 2.8067217286092401),\n", + " ('victoria', 2.6810215287142909),\n", + " ('mildred', 2.6026896854443837),\n", + " ('gandhi', 2.5389738710582761),\n", + " ('flawless', 2.451005098112319),\n", + " ('superbly', 2.2600254785752498),\n", + " ('perfection', 2.1594842493533721),\n", + " ('astaire', 2.1400661634962708),\n", + " ('captures', 2.0386195471595809),\n", + " ('voight', 2.0301704926730531),\n", + " ('wonderfully', 2.0218960560332353),\n", + " ('powell', 1.9783454248084671),\n", 
+ " ('brosnan', 1.9547990964725592),\n", + " ('lily', 1.9203768470501485),\n", + " ('bakshi', 1.9029851043382795),\n", + " ('lincoln', 1.9014583864844796),\n", + " ('refreshing', 1.8551812956655511),\n", + " ('breathtaking', 1.8481124057791867),\n", + " ('bourne', 1.8478489358790986),\n", + " ('lemmon', 1.8458266904983307),\n", + " ('delightful', 1.8002701588959635),\n", + " ('flynn', 1.7996646487351682),\n", + " ('andrews', 1.7764919970972666),\n", + " ('homer', 1.7692866133759964),\n", + " ('beautifully', 1.7626953362841438),\n", + " ('soccer', 1.7578579175523736),\n", + " ('elvira', 1.7397031072720019),\n", + " ('underrated', 1.7197859696029656),\n", + " ('gripping', 1.7165360479904674),\n", + " ('superb', 1.7091514458966952),\n", + " ('delight', 1.6714733033535532),\n", + " ('welles', 1.6677068205580761),\n", + " ('sadness', 1.663505133704376),\n", + " ('sinatra', 1.6389967146756448),\n", + " ('touching', 1.637217476541176),\n", + " ('timeless', 1.62924053973028),\n", + " ('macy', 1.6211339521972916),\n", + " ('unforgettable', 1.6177367152487956),\n", + " ('favorites', 1.6158688027643908),\n", + " ('stewart', 1.6119987332957739),\n", + " ('hartley', 1.6094379124341003),\n", + " ('sullivan', 1.6094379124341003),\n", + " ('extraordinary', 1.6094379124341003),\n", + " ('brilliantly', 1.5950491749820008),\n", + " ('friendship', 1.5677652160335325),\n", + " ('wonderful', 1.5645425925262093),\n", + " ('palma', 1.5553706911638245),\n", + " ('magnificent', 1.54663701119507),\n", + " ('finest', 1.5462590108125689),\n", + " ('jackie', 1.5439233053234738),\n", + " ('ritter', 1.5404450409471491),\n", + " ('tremendous', 1.5184661342283736),\n", + " ('freedom', 1.5091151908062312),\n", + " ('fantastic', 1.5048433868558566),\n", + " ('terrific', 1.5026699370083942),\n", + " ('noir', 1.493925025312256),\n", + " ('sidney', 1.493925025312256),\n", + " ('outstanding', 1.4910053152089213),\n", + " ('mann', 1.4894785973551214),\n", + " ('pleasantly', 1.4894785973551214),\n", + " 
('nancy', 1.488077055429833),\n", + " ('marie', 1.4825711915553104),\n", + " ('marvelous', 1.4739999415389962),\n", + " ('excellent', 1.4647538505723599),\n", + " ('ruth', 1.4596256342054401),\n", + " ('stanwyck', 1.4412101187160054),\n", + " ('widmark', 1.4350845252893227),\n", + " ('splendid', 1.4271163556401458),\n", + " ('chan', 1.423108334242607),\n", + " ('exceptional', 1.4201959127955721),\n", + " ('tender', 1.410986973710262),\n", + " ('gentle', 1.4078005663408544),\n", + " ('poignant', 1.4022947024663317),\n", + " ('gem', 1.3932148039644643),\n", + " ('amazing', 1.3919815802404802),\n", + " ('chilling', 1.3862943611198906),\n", + " ('captivating', 1.3862943611198906),\n", + " ('fisher', 1.3862943611198906),\n", + " ('davies', 1.3862943611198906),\n", + " ('darker', 1.3652409519220583),\n", + " ('april', 1.3499267169490159),\n", + " ('kelly', 1.3461743673304654),\n", + " ('blake', 1.3418425985490567),\n", + " ('overlooked', 1.329135947279942),\n", + " ('ralph', 1.32818673031261),\n", + " ('bette', 1.3156767939059373),\n", + " ('hoffman', 1.3150668518315229),\n", + " ('cole', 1.3121863889661687),\n", + " ('shines', 1.3049487216659381),\n", + " ('powerful', 1.2999662776313934),\n", + " ('notch', 1.2950456896547455),\n", + " ('remarkable', 1.2883688239495823),\n", + " ('pitt', 1.286210902562908),\n", + " ('winters', 1.2833463918674481),\n", + " ('vivid', 1.2762934659055623),\n", + " ('gritty', 1.2757524867200667),\n", + " ('giallo', 1.2745029551317739),\n", + " ('portrait', 1.2704625455947689),\n", + " ('innocence', 1.2694300209805796),\n", + " ('psychiatrist', 1.2685113254635072),\n", + " ('favorite', 1.2668956297860055),\n", + " ('ensemble', 1.2656663733312759),\n", + " ('stunning', 1.2622417124499117),\n", + " ('burns', 1.259880436264232),\n", + " ('garbo', 1.258954938743289),\n", + " ('barbara', 1.2580400255962119),\n", + " ('panic', 1.2527629684953681),\n", + " ('holly', 1.2527629684953681),\n", + " ('philip', 1.2527629684953681),\n", + " ('carol', 
1.2481440226390734),\n", + " ('perfect', 1.246742480713785),\n", + " ('appreciated', 1.2462482874741743),\n", + " ('favourite', 1.2411123512753928),\n", + " ('journey', 1.2367626271489269),\n", + " ('rural', 1.235471471385307),\n", + " ('bond', 1.2321436812926323),\n", + " ('builds', 1.2305398317106577),\n", + " ('brilliant', 1.2287554137664785),\n", + " ('brooklyn', 1.2286654169163074),\n", + " ('von', 1.225175011976539),\n", + " ('unfolds', 1.2163953243244932),\n", + " ('recommended', 1.2163953243244932),\n", + " ('daniel', 1.20215296760895),\n", + " ('perfectly', 1.1971931173405572),\n", + " ('crafted', 1.1962507582320256),\n", + " ('prince', 1.1939224684724346),\n", + " ('troubled', 1.192138346678933),\n", + " ('consequences', 1.1865810616140668),\n", + " ('haunting', 1.1814999484738773),\n", + " ('cinderella', 1.180052620608284),\n", + " ('alexander', 1.1759989522835299),\n", + " ('emotions', 1.1753049094563641),\n", + " ('boxing', 1.1735135968412274),\n", + " ('subtle', 1.1734135017508081),\n", + " ('curtis', 1.1649873576129823),\n", + " ('rare', 1.1566438362402944),\n", + " ('loved', 1.1563661500586044),\n", + " ('daughters', 1.1526795099383853),\n", + " ('courage', 1.1438688802562305),\n", + " ('dentist', 1.1426722784621401),\n", + " ('highly', 1.1420208631618658),\n", + " ('nominated', 1.1409146683587992),\n", + " ('tony', 1.1397491942285991),\n", + " ('draws', 1.1325138403437911),\n", + " ('everyday', 1.1306150197542835),\n", + " ('contrast', 1.1284652518177909),\n", + " ('cried', 1.1213405397456659),\n", + " ('fabulous', 1.1210851445201684),\n", + " ('ned', 1.120591195386885),\n", + " ('fay', 1.120591195386885),\n", + " ('emma', 1.1184149159642893),\n", + " ('sensitive', 1.113318436057805),\n", + " ('smooth', 1.1089750757036563),\n", + " ('dramas', 1.1080910326226534),\n", + " ('today', 1.1050431789984001),\n", + " ('helps', 1.1023091505494358),\n", + " ('inspiring', 1.0986122886681098),\n", + " ('jimmy', 1.0937696641923216),\n", + " ('awesome', 
1.0931328229034842),\n", + " ('unique', 1.0881409888008142),\n", + " ('tragic', 1.0871835928444868),\n", + " ('intense', 1.0870514662670339),\n", + " ('stellar', 1.0857088838322018),\n", + " ('rival', 1.0822184788924332),\n", + " ('provides', 1.0797081340289569),\n", + " ('depression', 1.0782034170369026),\n", + " ('shy', 1.0775588794702773),\n", + " ('carrie', 1.076139432816051),\n", + " ('blend', 1.0753554265038423),\n", + " ('hank', 1.0736109864626924),\n", + " ('diana', 1.0726368022648489),\n", + " ('adorable', 1.0726368022648489),\n", + " ('unexpected', 1.0722255334949147),\n", + " ('achievement', 1.0668635903535293),\n", + " ('bettie', 1.0663514264498881),\n", + " ('happiness', 1.0632729222228008),\n", + " ('glorious', 1.0608719606852626),\n", + " ('davis', 1.0541605260972757),\n", + " ('terrifying', 1.0525211814678428),\n", + " ('beauty', 1.050410186850232),\n", + " ('ideal', 1.0479685558493548),\n", + " ('fears', 1.0467872208035236),\n", + " ('hong', 1.0438040521731147),\n", + " ('seasons', 1.0433496099930604),\n", + " ('fascinating', 1.0414538748281612),\n", + " ('carries', 1.0345904299031787),\n", + " ('satisfying', 1.0321225473992768),\n", + " ('definite', 1.0319209141694374),\n", + " ('touched', 1.0296194171811581),\n", + " ('greatest', 1.0248947127715422),\n", + " ('creates', 1.0241097613701886),\n", + " ('aunt', 1.023388867430522),\n", + " ('walter', 1.022328983918479),\n", + " ('spectacular', 1.0198314108149955),\n", + " ('portrayal', 1.0189810189761024),\n", + " ('ann', 1.0127808528183286),\n", + " ('enterprise', 1.0116009116784799),\n", + " ('musicals', 1.0096648026516135),\n", + " ('deeply', 1.0094845087721023),\n", + " ('incredible', 1.0061677561461084),\n", + " ('mature', 1.0060195018402847),\n", + " ('triumph', 0.99682959435816731),\n", + " ('margaret', 0.99682959435816731),\n", + " ('navy', 0.99493385919326827),\n", + " ('harry', 0.99176919305006062),\n", + " ('lucas', 0.990398704027877),\n", + " ('sweet', 0.98966110487955483),\n", + " 
('joey', 0.98794672078059009),\n", + " ('oscar', 0.98721905111049713),\n", + " ('balance', 0.98649499054740353),\n", + " ('warm', 0.98485340331145166),\n", + " ('ages', 0.98449898190068863),\n", + " ('glover', 0.98082925301172619),\n", + " ('guilt', 0.98082925301172619),\n", + " ('carrey', 0.98082925301172619),\n", + " ('learns', 0.97881108885548895),\n", + " ('unusual', 0.97788374278196932),\n", + " ('sons', 0.97777581552483595),\n", + " ('complex', 0.97761897738147796),\n", + " ('essence', 0.97753435711487369),\n", + " ('brazil', 0.9769153536905899),\n", + " ('widow', 0.97650959186720987),\n", + " ('solid', 0.97537964824416146),\n", + " ('beautiful', 0.97326301262841053),\n", + " ('holmes', 0.97246100334120955),\n", + " ('awe', 0.97186058302896583),\n", + " ('vhs', 0.97116734209998934),\n", + " ('eerie', 0.97116734209998934),\n", + " ('lonely', 0.96873720724669754),\n", + " ('grim', 0.96873720724669754),\n", + " ('sport', 0.96825047080486615),\n", + " ('debut', 0.96508089604358704),\n", + " ('destiny', 0.96343751029985703),\n", + " ('thrillers', 0.96281074750904794),\n", + " ('tears', 0.95977584381389391),\n", + " ('rose', 0.95664202739772253),\n", + " ('feelings', 0.95551144502743635),\n", + " ('ginger', 0.95551144502743635),\n", + " ('winning', 0.95471810900804055),\n", + " ('stanley', 0.95387344302319799),\n", + " ('cox', 0.95343027882361187),\n", + " ('paris', 0.95278479030472663),\n", + " ('heart', 0.95238806924516806),\n", + " ('hooked', 0.95155887071161305),\n", + " ('comfortable', 0.94803943018873538),\n", + " ('mgm', 0.94446160884085151),\n", + " ('masterpiece', 0.94155039863339296),\n", + " ('themes', 0.94118828349588235),\n", + " ('danny', 0.93967118051821874),\n", + " ('anime', 0.93378388932167222),\n", + " ('perry', 0.93328830824272613),\n", + " ('joy', 0.93301752567946861),\n", + " ('lovable', 0.93081883243706487),\n", + " ('hal', 0.92953595862417571),\n", + " ('mysteries', 0.92953595862417571),\n", + " ('louis', 0.92871325187271225),\n", + " 
('charming', 0.92520609553210742),\n", + " ('urban', 0.92367083917177761),\n", + " ('allows', 0.92183091224977043),\n", + " ('impact', 0.91815814604895041),\n", + " ('gradually', 0.91629073187415511),\n", + " ('lifestyle', 0.91629073187415511),\n", + " ('italy', 0.91629073187415511),\n", + " ('spy', 0.91289514287301687),\n", + " ('treat', 0.91193342650519937),\n", + " ('subsequent', 0.91056005716517008),\n", + " ('kennedy', 0.90981821736853763),\n", + " ('loving', 0.90967549275543591),\n", + " ('surprising', 0.90937028902958128),\n", + " ('quiet', 0.90648673177753425),\n", + " ('winter', 0.90624039602065365),\n", + " ('reveals', 0.90490540964902977),\n", + " ('raw', 0.90445627422715225),\n", + " ('funniest', 0.90078654533818991),\n", + " ('pleased', 0.89994159387262562),\n", + " ('norman', 0.89994159387262562),\n", + " ('thief', 0.89874642222324552),\n", + " ('season', 0.89827222637147675),\n", + " ('secrets', 0.89794159320595857),\n", + " ('colorful', 0.89705936994626756),\n", + " ('highest', 0.8967461358011849),\n", + " ('compelling', 0.89462923509297576),\n", + " ('danes', 0.89248008318043659),\n", + " ('castle', 0.88967708335606499),\n", + " ('kudos', 0.88889175768604067),\n", + " ('great', 0.88810470901464589),\n", + " ('baseball', 0.88730319500090271),\n", + " ('subtitles', 0.88730319500090271),\n", + " ('bleak', 0.88730319500090271),\n", + " ('winner', 0.88643776872447388),\n", + " ('tragedy', 0.88563699078315261),\n", + " ('todd', 0.88551907320740142),\n", + " ('nicely', 0.87924946019380601),\n", + " ('arthur', 0.87546873735389985),\n", + " ('essential', 0.87373111745535925),\n", + " ('gorgeous', 0.8731725250935497),\n", + " ('fonda', 0.87294029100054127),\n", + " ('eastwood', 0.87139541196626402),\n", + " ('focuses', 0.87082835779739776),\n", + " ('enjoyed', 0.87070195951624607),\n", + " ('natural', 0.86997924506912838),\n", + " ('intensity', 0.86835126958503595),\n", + " ('witty', 0.86824103423244681),\n", + " ('rob', 0.8642954367557748),\n", + " 
('worlds', 0.86377269759070874),\n", + " ('health', 0.86113891179907498),\n", + " ('magical', 0.85953791528170564),\n", + " ('deeper', 0.85802182375017932),\n", + " ('lucy', 0.85618680780444956),\n", + " ('moving', 0.85566611005772031),\n", + " ('lovely', 0.85290640004681306),\n", + " ('purple', 0.8513711857748395),\n", + " ('memorable', 0.84801189112086062),\n", + " ('sings', 0.84729786038720367),\n", + " ('craig', 0.84342938360928321),\n", + " ('modesty', 0.84342938360928321),\n", + " ('relate', 0.84326559685926517),\n", + " ('episodes', 0.84223712084137292),\n", + " ('strong', 0.84167135777060931),\n", + " ('smith', 0.83959811108590054),\n", + " ('tear', 0.83704136022001441),\n", + " ('apartment', 0.83333115290549531),\n", + " ('princess', 0.83290912293510388),\n", + " ('disagree', 0.83290912293510388),\n", + " ('kung', 0.83173334384609199),\n", + " ('adventure', 0.83150561393278388),\n", + " ('columbo', 0.82667857318446791),\n", + " ('jake', 0.82667857318446791),\n", + " ('adds', 0.82485652591452319),\n", + " ('hart', 0.82472353834866463),\n", + " ('strength', 0.82417544296634937),\n", + " ('realizes', 0.82360006895738058),\n", + " ('dave', 0.8232003088081431),\n", + " ('childhood', 0.82208086393583857),\n", + " ('forbidden', 0.81989888619908913),\n", + " ('tight', 0.81883539572344199),\n", + " ('surreal', 0.8178506590609026),\n", + " ('manager', 0.81770990320170756),\n", + " ('dancer', 0.81574950265227764),\n", + " ('con', 0.81093021621632877),\n", + " ('studios', 0.81093021621632877),\n", + " ('miike', 0.80821651034473263),\n", + " ('realistic', 0.80807714723392232),\n", + " ('explicit', 0.80792269515237358),\n", + " ('kurt', 0.8060875917405409),\n", + " ('traditional', 0.80535917116687328),\n", + " ('deals', 0.80535917116687328),\n", + " ('holds', 0.80493858654806194),\n", + " ('carl', 0.80437281567016972),\n", + " ('touches', 0.80396154690023547),\n", + " ('gene', 0.80314807577427383),\n", + " ('albert', 0.8027669055771679),\n", + " ('abc', 
0.80234647252493729),\n", + " ('cry', 0.80011930011211307),\n", + " ('sides', 0.7995275841185171),\n", + " ('develops', 0.79850769621777162),\n", + " ('eyre', 0.79850769621777162),\n", + " ('dances', 0.79694397424158891),\n", + " ('oscars', 0.79633141679517616),\n", + " ('legendary', 0.79600456599965308),\n", + " ('importance', 0.79492987486988764),\n", + " ('hearted', 0.79492987486988764),\n", + " ('portraying', 0.79356592830699269),\n", + " ('impressed', 0.79258107754813223),\n", + " ('waters', 0.79112758892014912),\n", + " ('empire', 0.79078565012386137),\n", + " ('edge', 0.789774016249017),\n", + " ('environment', 0.78845736036427028),\n", + " ('jean', 0.78845736036427028),\n", + " ('sentimental', 0.7864791203521645),\n", + " ('captured', 0.78623760362595729),\n", + " ('styles', 0.78592891401091158),\n", + " ('daring', 0.78592891401091158),\n", + " ('backgrounds', 0.78275933924963248),\n", + " ('frank', 0.78275933924963248),\n", + " ('matches', 0.78275933924963248),\n", + " ('tense', 0.78275933924963248),\n", + " ('gothic', 0.78209466657644144),\n", + " ('sharp', 0.7814397877056235),\n", + " ('achieved', 0.78015855754957497),\n", + " ('court', 0.77947526404844247),\n", + " ('steals', 0.7789140023173704),\n", + " ('rules', 0.77844476107184035),\n", + " ('colors', 0.77684619943659217),\n", + " ('reunion', 0.77318988823348167),\n", + " ('covers', 0.77139937745969345),\n", + " ('tale', 0.77010822169607374),\n", + " ('rain', 0.7683706017975328),\n", + " ('denzel', 0.76804848873306297),\n", + " ('stays', 0.76787072675588186),\n", + " ('blob', 0.76725515271366718),\n", + " ('conventional', 0.76214005204689672),\n", + " ('maria', 0.76214005204689672),\n", + " ('fresh', 0.76158434211317383),\n", + " ('midnight', 0.76096977689870637),\n", + " ('landscape', 0.75852993982279704),\n", + " ('animated', 0.75768570169751648),\n", + " ('titanic', 0.75666058628227129),\n", + " ('sunday', 0.75666058628227129),\n", + " ('spring', 0.7537718023763802),\n", + " ('cagney', 
0.7537718023763802),\n", + " ('enjoyable', 0.75246375771636476),\n", + " ('immensely', 0.75198768058287868),\n", + " ('sir', 0.7507762933965817),\n", + " ('nevertheless', 0.75067102469813185),\n", + " ('driven', 0.74994477895307854),\n", + " ('performances', 0.74883252516063137),\n", + " ('memories', 0.74721440183022114),\n", + " ('nowadays', 0.74721440183022114),\n", + " ('simple', 0.74641420974143258),\n", + " ('golden', 0.74533293373051557),\n", + " ('leslie', 0.74533293373051557),\n", + " ('lovers', 0.74497224842453125),\n", + " ('relationship', 0.74484232345601786),\n", + " ('supporting', 0.74357803418683721),\n", + " ('che', 0.74262723782331497),\n", + " ('packed', 0.7410032017375805),\n", + " ('trek', 0.74021469141793106),\n", + " ('provoking', 0.73840377214806618),\n", + " ('strikes', 0.73759894313077912),\n", + " ('depiction', 0.73682224406260699),\n", + " ('emotional', 0.73678211645681524),\n", + " ('secretary', 0.7366322924996842),\n", + " ('influenced', 0.73511137965897755),\n", + " ('florida', 0.73511137965897755),\n", + " ('germany', 0.73288750920945944),\n", + " ('brings', 0.73142936713096229),\n", + " ('lewis', 0.73129894652432159),\n", + " ('elderly', 0.73088750854279239),\n", + " ('owner', 0.72743625403857748),\n", + " ('streets', 0.72666987259858895),\n", + " ('henry', 0.72642196944481741),\n", + " ('portrays', 0.72593700338293632),\n", + " ('bears', 0.7252354951114458),\n", + " ('china', 0.72489587887452556),\n", + " ('anger', 0.72439972406404984),\n", + " ('society', 0.72433010799663333),\n", + " ('available', 0.72415741730250549),\n", + " ('best', 0.72347034060446314),\n", + " ('bugs', 0.72270598280148979),\n", + " ('magic', 0.71878961117328299),\n", + " ('verhoeven', 0.71846498854423513),\n", + " ('delivers', 0.71846498854423513),\n", + " ('jim', 0.71783979315031676),\n", + " ('donald', 0.71667767797013937),\n", + " ('endearing', 0.71465338578090898),\n", + " ('relationships', 0.71393795022901896),\n", + " ('greatly', 
0.71256526641704687),\n", + " ('charlie', 0.71024161391924534),\n", + " ('brad', 0.71024161391924534),\n", + " ('simon', 0.70967648251115578),\n", + " ('effectively', 0.70914752190638641),\n", + " ('march', 0.70774597998109789),\n", + " ('atmosphere', 0.70744773070214162),\n", + " ('influence', 0.70733181555190172),\n", + " ('genius', 0.706392407309966),\n", + " ('emotionally', 0.70556970055850243),\n", + " ('ken', 0.70526854109229009),\n", + " ('identity', 0.70484322032313651),\n", + " ('sophisticated', 0.70470800296102132),\n", + " ('dan', 0.70457587638356811),\n", + " ('andrew', 0.70329955202396321),\n", + " ('india', 0.70144598337464037),\n", + " ('roy', 0.69970458110610434),\n", + " ('surprisingly', 0.6995780708902356),\n", + " ('sky', 0.69780919366575667),\n", + " ('romantic', 0.69664981111114743),\n", + " ('match', 0.69566924999265523),\n", + " ('britain', 0.69314718055994529),\n", + " ('beatty', 0.69314718055994529),\n", + " ('affected', 0.69314718055994529),\n", + " ('cowboy', 0.69314718055994529),\n", + " ('wave', 0.69314718055994529),\n", + " ('stylish', 0.69314718055994529),\n", + " ('bitter', 0.69314718055994529),\n", + " ('patient', 0.69314718055994529),\n", + " ('meets', 0.69314718055994529),\n", + " ('love', 0.69198533541937324),\n", + " ('paul', 0.68980827929443067),\n", + " ('andy', 0.68846333124751902),\n", + " ('performance', 0.68797386327972465),\n", + " ('patrick', 0.68645819240914863),\n", + " ('unlike', 0.68546468438792907),\n", + " ('brooks', 0.68433655087779044),\n", + " ('refuses', 0.68348526964820844),\n", + " ('award', 0.6824518914431974),\n", + " ('complaint', 0.6824518914431974),\n", + " ('ride', 0.68229716453587952),\n", + " ('dawson', 0.68171848473632257),\n", + " ('luke', 0.68158635815886937),\n", + " ('wells', 0.68087708796813096),\n", + " ('france', 0.6804081547825156),\n", + " ('handsome', 0.68007509899259255),\n", + " ('sports', 0.68007509899259255),\n", + " ('rebel', 0.67875844310784572),\n", + " ('directs', 
0.67875844310784572),\n", + " ('greater', 0.67605274720064523),\n", + " ('dreams', 0.67599410133369586),\n", + " ('effective', 0.67565402311242806),\n", + " ('interpretation', 0.67479804189174875),\n", + " ('works', 0.67445504754779284),\n", + " ('brando', 0.67445504754779284),\n", + " ('noble', 0.6737290947028437),\n", + " ('paced', 0.67314651385327573),\n", + " ('le', 0.67067432470788668),\n", + " ('master', 0.67015766233524654),\n", + " ('h', 0.6696166831497512),\n", + " ('rings', 0.66904962898088483),\n", + " ('easy', 0.66895995494594152),\n", + " ('city', 0.66820823221269321),\n", + " ('sunshine', 0.66782937257565544),\n", + " ('succeeds', 0.66647893347778397),\n", + " ('relations', 0.664159643686693),\n", + " ('england', 0.66387679825983203),\n", + " ('glimpse', 0.66329421741026418),\n", + " ('aired', 0.66268797307523675),\n", + " ('sees', 0.66263163663399482),\n", + " ('both', 0.66248336767382998),\n", + " ('definitely', 0.66199789483898808),\n", + " ('imaginative', 0.66139848224536502),\n", + " ('appreciate', 0.66083893732728749),\n", + " ('tricks', 0.66071190480679143),\n", + " ('striking', 0.66071190480679143),\n", + " ('carefully', 0.65999497324304479),\n", + " ('complicated', 0.65981076029235353),\n", + " ('perspective', 0.65962448852130173),\n", + " ('trilogy', 0.65877953705573755),\n", + " ('future', 0.65834665141052828),\n", + " ('lion', 0.65742909795786608),\n", + " ('victor', 0.65540685257709819),\n", + " ('douglas', 0.65540685257709819),\n", + " ('inspired', 0.65459851044271034),\n", + " ('marriage', 0.65392646740666405),\n", + " ('demands', 0.65392646740666405),\n", + " ('father', 0.65172321672194655),\n", + " ('page', 0.65123628494430852),\n", + " ('instant', 0.65058756614114943),\n", + " ('era', 0.6495567444850836),\n", + " ('ruthless', 0.64934455790155243),\n", + " ('saga', 0.64934455790155243),\n", + " ('joan', 0.64891392558311978),\n", + " ('joseph', 0.64841128671855386),\n", + " ('workers', 0.64829661439459352),\n", + " ('fantasy', 
0.64726757480925168),\n", + " ('accomplished', 0.64551913157069074),\n", + " ('distant', 0.64551913157069074),\n", + " ('manhattan', 0.64435701639051324),\n", + " ('personal', 0.64355023942057321),\n", + " ('pushing', 0.64313675998528386),\n", + " ('meeting', 0.64313675998528386),\n", + " ('individual', 0.64313675998528386),\n", + " ('pleasant', 0.64250344774119039),\n", + " ('brave', 0.64185388617239469),\n", + " ('william', 0.64083139119578469),\n", + " ('hudson', 0.64077919504262937),\n", + " ('friendly', 0.63949446706762514),\n", + " ('eccentric', 0.63907995928966954),\n", + " ('awards', 0.63875310849414646),\n", + " ('jack', 0.63838309514997038),\n", + " ('seeking', 0.63808740337691783),\n", + " ('colonel', 0.63757732940513456),\n", + " ('divorce', 0.63757732940513456),\n", + " ('jane', 0.63443957973316734),\n", + " ('keeping', 0.63414883979798953),\n", + " ('gives', 0.63383568159497883),\n", + " ('ted', 0.63342794585832296),\n", + " ('animation', 0.63208692379869902),\n", + " ('progress', 0.6317782341836532),\n", + " ('concert', 0.63127177684185776),\n", + " ('larger', 0.63127177684185776),\n", + " ('nation', 0.6296337748376194),\n", + " ('albeit', 0.62739580299716491),\n", + " ('adapted', 0.62613647027698516),\n", + " ('discovers', 0.62542900650499444),\n", + " ('classic', 0.62504956428050518),\n", + " ('segment', 0.62335141862440335),\n", + " ('morgan', 0.62303761437291871),\n", + " ('mouse', 0.62294292188669675),\n", + " ('impressive', 0.62211140744319349),\n", + " ('artist', 0.62168821657780038),\n", + " ('ultimate', 0.62168821657780038),\n", + " ('griffith', 0.62117368093485603),\n", + " ('emily', 0.62082651898031915),\n", + " ('drew', 0.62082651898031915),\n", + " ('moved', 0.6197197120051281),\n", + " ('profound', 0.61903920840622351),\n", + " ('families', 0.61903920840622351),\n", + " ('innocent', 0.61851219917136446),\n", + " ('versions', 0.61730910416844087),\n", + " ('eddie', 0.61691981517206107),\n", + " ('criticism', 0.61651395453902935),\n", + " 
('nature', 0.61594514653194088),\n", + " ('recognized', 0.61518563909023349),\n", + " ('sexuality', 0.61467556511845012),\n", + " ('contract', 0.61400986000122149),\n", + " ('brian', 0.61344043794920278),\n", + " ('remembered', 0.6131044728864089),\n", + " ('determined', 0.6123858239154869),\n", + " ('offers', 0.61207935747116349),\n", + " ('pleasure', 0.61195702582993206),\n", + " ('washington', 0.61180154110599294),\n", + " ('images', 0.61159731359583758),\n", + " ('games', 0.61067095873570676),\n", + " ('academy', 0.60872983874736208),\n", + " ('fashioned', 0.60798937221963845),\n", + " ('melodrama', 0.60749173598145145),\n", + " ('peoples', 0.60613580357031549),\n", + " ('charismatic', 0.60613580357031549),\n", + " ('rough', 0.60613580357031549),\n", + " ('dealing', 0.60517840761398811),\n", + " ('fine', 0.60496962268013299),\n", + " ('tap', 0.60391604683200273),\n", + " ('trio', 0.60157998703445481),\n", + " ('russell', 0.60120968523425966),\n", + " ('figures', 0.60077386042893011),\n", + " ('ward', 0.60005675749393339),\n", + " ('shine', 0.59911823091166894),\n", + " ('brady', 0.59911823091166894),\n", + " ('job', 0.59845562125168661),\n", + " ('satisfied', 0.59652034487087369),\n", + " ('river', 0.59637962862495086),\n", + " ('brown', 0.595773016534769),\n", + " ('believable', 0.59566072133302495),\n", + " ('bound', 0.59470710774669278),\n", + " ('always', 0.59470710774669278),\n", + " ('hall', 0.5933967777928858),\n", + " ('cook', 0.5916777203950857),\n", + " ('claire', 0.59136448625000293),\n", + " ('broadway', 0.59033768669372433),\n", + " ('anna', 0.58778666490211906),\n", + " ('peace', 0.58628403501758408),\n", + " ('visually', 0.58539431926349916),\n", + " ('falk', 0.58525821854876026),\n", + " ('morality', 0.58525821854876026),\n", + " ('growing', 0.58466653756587539),\n", + " ('experiences', 0.58314628534561685),\n", + " ('stood', 0.58314628534561685),\n", + " ('touch', 0.58122926435596001),\n", + " ('lives', 0.5810976767513224),\n", + " ('kubrick', 
0.58066919713325493),\n", + " ('timing', 0.58047401805583243),\n", + " ('struggles', 0.57981849525294216),\n", + " ('expressions', 0.57981849525294216),\n", + " ('authentic', 0.57848427223980559),\n", + " ('helen', 0.57763429343810091),\n", + " ('pre', 0.57700753064729182),\n", + " ('quirky', 0.5753641449035618),\n", + " ('young', 0.57531672344534313),\n", + " ('inner', 0.57454143815209846),\n", + " ('mexico', 0.57443087372056334),\n", + " ('clint', 0.57380042292737909),\n", + " ('sisters', 0.57286101468544337),\n", + " ('realism', 0.57226528899949558),\n", + " ('personalities', 0.5720692490067093),\n", + " ('french', 0.5720692490067093),\n", + " ('surprises', 0.57113222999698177),\n", + " ('adventures', 0.57113222999698177),\n", + " ('overcome', 0.5697681593994407),\n", + " ('timothy', 0.56953322459276867),\n", + " ('tales', 0.56909453188996639),\n", + " ('war', 0.56843317302781682),\n", + " ('civil', 0.5679840376059393),\n", + " ('countries', 0.56737779327091187),\n", + " ('streep', 0.56710645966458029),\n", + " ('tradition', 0.56685345523565323),\n", + " ('oliver', 0.56673325570428668),\n", + " ('australia', 0.56580775818334383),\n", + " ('understanding', 0.56531380905006046),\n", + " ('players', 0.56509525370004821),\n", + " ('knowing', 0.56489284503626647),\n", + " ('rogers', 0.56421349718405212),\n", + " ('suspenseful', 0.56368911332305849),\n", + " ('variety', 0.56368911332305849),\n", + " ('true', 0.56281525180810066),\n", + " ('jr', 0.56220982311246936),\n", + " ('psychological', 0.56108745854687891),\n", + " ('branagh', 0.55961578793542266),\n", + " ('wealth', 0.55961578793542266),\n", + " ('performing', 0.55961578793542266),\n", + " ('odds', 0.55961578793542266),\n", + " ('sent', 0.55961578793542266),\n", + " ('reminiscent', 0.55961578793542266),\n", + " ('grand', 0.55961578793542266),\n", + " ('overwhelming', 0.55961578793542266),\n", + " ('brothers', 0.55891181043362848),\n", + " ('howard', 0.55811089675600245),\n", + " ('david', 
0.55693122256475369),\n", + " ('generation', 0.55628799784274796),\n", + " ('grow', 0.55612538299565417),\n", + " ('survival', 0.55594605904646033),\n", + " ('mainstream', 0.55574731115750231),\n", + " ('dick', 0.55431073570572953),\n", + " ('charm', 0.55288175575407861),\n", + " ('kirk', 0.55278982286502287),\n", + " ('twists', 0.55244729845681018),\n", + " ('gangster', 0.55206858230003986),\n", + " ('jeff', 0.55179306225421365),\n", + " ('family', 0.55116244510065526),\n", + " ('tend', 0.55053307336110335),\n", + " ('thanks', 0.55049088015842218),\n", + " ('world', 0.54744234723432639),\n", + " ('sutherland', 0.54743536937855164),\n", + " ('life', 0.54695514434959924),\n", + " ('disc', 0.54654370636806993),\n", + " ('bug', 0.54654370636806993),\n", + " ('tribute', 0.5455111817538808),\n", + " ('europe', 0.54522705048332309),\n", + " ('sacrifice', 0.54430155296238014),\n", + " ('color', 0.54405127139431109),\n", + " ('superior', 0.54333490233128523),\n", + " ('york', 0.54318235866536513),\n", + " ('pulls', 0.54266622962164945),\n", + " ('hearts', 0.54232429082536171),\n", + " ('jackson', 0.54232429082536171),\n", + " ('enjoy', 0.54124285135906114),\n", + " ('redemption', 0.54056759296472823),\n", + " ('madness', 0.540384426007535),\n", + " ('hamilton', 0.5389965007326869),\n", + " ('stands', 0.5389965007326869),\n", + " ('trial', 0.5389965007326869),\n", + " ('greek', 0.5389965007326869),\n", + " ('each', 0.5388212312554177),\n", + " ('faithful', 0.53773307668591508),\n", + " ('received', 0.5372768098531604),\n", + " ('jealous', 0.53714293208336406),\n", + " ('documentaries', 0.53714293208336406),\n", + " ('different', 0.53709860682460819),\n", + " ('describes', 0.53680111016925136),\n", + " ('shorts', 0.53596159703753288),\n", + " ('brilliance', 0.53551823635636209),\n", + " ('mountains', 0.53492317534505118),\n", + " ('share', 0.53408248593025787),\n", + " ('dealt', 0.53408248593025787),\n", + " ('providing', 0.53329847961804933),\n", + " ('explore', 
0.53329847961804933),\n", + " ('series', 0.5325809226575603),\n", + " ('fellow', 0.5323318289869543),\n", + " ('loves', 0.53062825106217038),\n", + " ('olivier', 0.53062825106217038),\n", + " ('revolution', 0.53062825106217038),\n", + " ('roman', 0.53062825106217038),\n", + " ('century', 0.53002783074992665),\n", + " ('musical', 0.52966871156747064),\n", + " ('heroic', 0.52925932545482868),\n", + " ('ironically', 0.52806743020049673),\n", + " ('approach', 0.52806743020049673),\n", + " ('temple', 0.52806743020049673),\n", + " ('moves', 0.5279372642387119),\n", + " ('gift', 0.52702030968597136),\n", + " ('julie', 0.52609309589677911),\n", + " ('tells', 0.52415107836314001),\n", + " ('radio', 0.52394671172868779),\n", + " ('uncle', 0.52354439617376536),\n", + " ('union', 0.52324814376454787),\n", + " ('deep', 0.52309571635780505),\n", + " ('reminds', 0.52157841554225237),\n", + " ('famous', 0.52118841080153722),\n", + " ('jazz', 0.52053443789295151),\n", + " ('dennis', 0.51987545928590861),\n", + " ('epic', 0.51919387343650736),\n", + " ('adult', 0.519167695083386),\n", + " ('shows', 0.51915322220375304),\n", + " ('performed', 0.5191244265806858),\n", + " ('demons', 0.5191244265806858),\n", + " ('eric', 0.51879379341516751),\n", + " ('discovered', 0.51879379341516751),\n", + " ('youth', 0.5185626062681431),\n", + " ('human', 0.51851411224987087),\n", + " ('tarzan', 0.51813827061227724),\n", + " ('ourselves', 0.51794309153485463),\n", + " ('wwii', 0.51758240622887042),\n", + " ('passion', 0.5162164724008671),\n", + " ('desire', 0.51607497965213445),\n", + " ('pays', 0.51581316527702981),\n", + " ('fox', 0.51557622652458857),\n", + " ('dirty', 0.51557622652458857),\n", + " ('symbolism', 0.51546600332249293),\n", + " ('sympathetic', 0.51546600332249293),\n", + " ('attitude', 0.51530993621331933),\n", + " ('appearances', 0.51466440007315639),\n", + " ('jeremy', 0.51466440007315639),\n", + " ('fun', 0.51439068993048687),\n", + " ('south', 0.51420972175023116),\n", + " 
('arrives', 0.51409894911095988),\n", + " ('present', 0.51341965894303732),\n", + " ('com', 0.51326167856387173),\n", + " ('smile', 0.51265880484765169),\n", + " ('fits', 0.51082562376599072),\n", + " ('provided', 0.51082562376599072),\n", + " ('carter', 0.51082562376599072),\n", + " ('ring', 0.51082562376599072),\n", + " ('aging', 0.51082562376599072),\n", + " ('countryside', 0.51082562376599072),\n", + " ('alan', 0.51082562376599072),\n", + " ('visit', 0.51082562376599072),\n", + " ('begins', 0.51015650363396647),\n", + " ('success', 0.50900578704900468),\n", + " ('japan', 0.50900578704900468),\n", + " ('accurate', 0.50895471583017893),\n", + " ('proud', 0.50800474742434931),\n", + " ('daily', 0.5075946031845443),\n", + " ('atmospheric', 0.50724780241810674),\n", + " ('karloff', 0.50724780241810674),\n", + " ('recently', 0.50714914903668207),\n", + " ('fu', 0.50704490092608467),\n", + " ('horrors', 0.50656122497953315),\n", + " ('finding', 0.50637127341661037),\n", + " ('lust', 0.5059356384717989),\n", + " ('hitchcock', 0.50574947073413001),\n", + " ('among', 0.50334004951332734),\n", + " ('viewing', 0.50302139827440906),\n", + " ('shining', 0.50262885656181222),\n", + " ('investigation', 0.50262885656181222),\n", + " ('duo', 0.5020919437972361),\n", + " ('cameron', 0.5020919437972361),\n", + " ('finds', 0.50128303100539795),\n", + " ('contemporary', 0.50077528791248915),\n", + " ('genuine', 0.50046283673044401),\n", + " ('frightening', 0.49995595152908684),\n", + " ('plays', 0.49975983848890226),\n", + " ('age', 0.49941323171424595),\n", + " ('position', 0.49899116611898781),\n", + " ('continues', 0.49863035067217237),\n", + " ('roles', 0.49839716550752178),\n", + " ('james', 0.49837216269470402),\n", + " ('individuals', 0.49824684155913052),\n", + " ('brought', 0.49783842823917956),\n", + " ('hilarious', 0.49714551986191058),\n", + " ('brutal', 0.49681488669639234),\n", + " ('appropriate', 0.49643688631389105),\n", + " ('dance', 0.49581998314812048),\n", + " 
('league', 0.49578774640145024),\n", + " ('helping', 0.49578774640145024),\n", + " ('answers', 0.49578774640145024),\n", + " ('stunts', 0.49561620510246196),\n", + " ('traveling', 0.49532143723002542),\n", + " ('thoroughly', 0.49414593456733524),\n", + " ('depicted', 0.49317068852726992),\n", + " ('honor', 0.49247648509779424),\n", + " ('combination', 0.49247648509779424),\n", + " ('differences', 0.49247648509779424),\n", + " ('fully', 0.49213349075383811),\n", + " ('tracy', 0.49159426183810306),\n", + " ('battles', 0.49140753790888908),\n", + " ('possibility', 0.49112055268665822),\n", + " ('romance', 0.4901589869574316),\n", + " ('initially', 0.49002249613622745),\n", + " ('happy', 0.4898997500608791),\n", + " ('crime', 0.48977221456815834),\n", + " ('singing', 0.4893852925281213),\n", + " ('especially', 0.48901267837860624),\n", + " ('shakespeare', 0.48754793889664511),\n", + " ('hugh', 0.48729512635579658),\n", + " ('detail', 0.48609484250827351),\n", + " ('guide', 0.48550781578170082),\n", + " ('companion', 0.48550781578170082),\n", + " ('julia', 0.48550781578170082),\n", + " ('san', 0.48550781578170082),\n", + " ('desperation', 0.48550781578170082),\n", + " ('strongly', 0.48460242866688824),\n", + " ('necessary', 0.48302334245403883),\n", + " ('humanity', 0.48265474679929443),\n", + " ('drama', 0.48221998493060503),\n", + " ('warming', 0.48183808689273838),\n", + " ('intrigue', 0.48183808689273838),\n", + " ('nonetheless', 0.48183808689273838),\n", + " ('cuba', 0.48183808689273838),\n", + " ('planned', 0.47957308026188628),\n", + " ('pictures', 0.47929937011921681),\n", + " ('broadcast', 0.47849024312305422),\n", + " ('nine', 0.47803580094299974),\n", + " ('settings', 0.47743860773325364),\n", + " ('history', 0.47732966933780852),\n", + " ('ordinary', 0.47725880012690741),\n", + " ('trade', 0.47692407209030935),\n", + " ('primary', 0.47608267532211779),\n", + " ('official', 0.47608267532211779),\n", + " ('episode', 0.47529620261150429),\n", + " ('role', 
0.47520268270188676),\n", + " ('spirit', 0.47477690799839323),\n", + " ('grey', 0.47409361449726067),\n", + " ('ways', 0.47323464982718205),\n", + " ('cup', 0.47260441094579297),\n", + " ('piano', 0.47260441094579297),\n", + " ('familiar', 0.47241617565111949),\n", + " ('sinister', 0.47198579044972683),\n", + " ('reveal', 0.47171449364936496),\n", + " ('max', 0.47150852042515579),\n", + " ('dated', 0.47121648567094482),\n", + " ('discovery', 0.47000362924573563),\n", + " ('vicious', 0.47000362924573563),\n", + " ('losing', 0.47000362924573563),\n", + " ('genuinely', 0.46871413841586385),\n", + " ('hatred', 0.46734051182625186),\n", + " ('mistaken', 0.46702300110759781),\n", + " ('dream', 0.46608972992459924),\n", + " ('challenge', 0.46608972992459924),\n", + " ('crisis', 0.46575733836428446),\n", + " ('photographed', 0.46488852857896512),\n", + " ('machines', 0.46430560813109778),\n", + " ('critics', 0.46430560813109778),\n", + " ('bird', 0.46430560813109778),\n", + " ('born', 0.46411383518967209),\n", + " ('detective', 0.4636633473511525),\n", + " ('higher', 0.46328467899699055),\n", + " ('remains', 0.46262352194811296),\n", + " ('inevitable', 0.46262352194811296),\n", + " ('soviet', 0.4618180446592961),\n", + " ('ryan', 0.46134556650262099),\n", + " ('african', 0.46112595521371813),\n", + " ('smaller', 0.46081520319132935),\n", + " ('techniques', 0.46052488529119184),\n", + " ('information', 0.46034171833399862),\n", + " ('deserved', 0.45999798712841444),\n", + " ('cynical', 0.45953232937844013),\n", + " ('lynch', 0.45953232937844013),\n", + " ('francisco', 0.45953232937844013),\n", + " ('tour', 0.45953232937844013),\n", + " ('spielberg', 0.45953232937844013),\n", + " ('struggle', 0.45911782160048453),\n", + " ('language', 0.45902121257712653),\n", + " ('visual', 0.45823514408822852),\n", + " ('warner', 0.45724137763188427),\n", + " ('social', 0.45720078250735313),\n", + " ('reality', 0.45719346885019546),\n", + " ('hidden', 0.45675840249571492),\n", + " 
('breaking', 0.45601738727099561),\n", + " ('sometimes', 0.45563021171182794),\n", + " ('modern', 0.45500247579345005),\n", + " ('surfing', 0.45425527227759638),\n", + " ('popular', 0.45410691533051023),\n", + " ('surprised', 0.4534409399850382),\n", + " ('follows', 0.45245361754408348),\n", + " ('keeps', 0.45234869400701483),\n", + " ('john', 0.4520909494482197),\n", + " ('defeat', 0.45198512374305722),\n", + " ('mixed', 0.45198512374305722),\n", + " ('justice', 0.45142724367280018),\n", + " ('treasure', 0.45083371313801535),\n", + " ('presents', 0.44973793178615257),\n", + " ('years', 0.44919197032104968),\n", + " ('chief', 0.44895022004790319),\n", + " ('shadows', 0.44802472252696035),\n", + " ('closely', 0.44701411102103689),\n", + " ('segments', 0.44701411102103689),\n", + " ('lose', 0.44658335503763702),\n", + " ('caine', 0.44628710262841953),\n", + " ('caught', 0.44610275383999071),\n", + " ('hamlet', 0.44558510189758965),\n", + " ('chinese', 0.44507424620321018),\n", + " ('welcome', 0.44438052435783792),\n", + " ('birth', 0.44368632092836219),\n", + " ('represents', 0.44320543609101143),\n", + " ('puts', 0.44279106572085081),\n", + " ('fame', 0.44183275227903923),\n", + " ('closer', 0.44183275227903923),\n", + " ('visuals', 0.44183275227903923),\n", + " ('web', 0.44183275227903923),\n", + " ('criminal', 0.4412745608048752),\n", + " ('minor', 0.4409224199448939),\n", + " ('jon', 0.44086703515908027),\n", + " ('liked', 0.44074991514020723),\n", + " ('restaurant', 0.44031183943833246),\n", + " ('flaws', 0.43983275161237217),\n", + " ('de', 0.43983275161237217),\n", + " ('searching', 0.4393666597838457),\n", + " ('rap', 0.43891304217570443),\n", + " ('light', 0.43884433018199892),\n", + " ('elizabeth', 0.43872232986464677),\n", + " ('marry', 0.43861731542506488),\n", + " ('oz', 0.43825493093115531),\n", + " ('controversial', 0.43825493093115531),\n", + " ('learned', 0.43825493093115531),\n", + " ('slowly', 0.43785660389939979),\n", + " ('bridge', 
0.43721380642274466),\n", + " ('thrilling', 0.43721380642274466),\n", + " ('wayne', 0.43721380642274466),\n", + " ('comedic', 0.43721380642274466),\n", + " ('married', 0.43658501682196887),\n", + " ('nazi', 0.4361020775700542),\n", + " ('murder', 0.4353180712578455),\n", + " ('physical', 0.4353180712578455),\n", + " ('johnny', 0.43483971678806865),\n", + " ('michelle', 0.43445264498141672),\n", + " ('wallace', 0.43403848055222038),\n", + " ('silent', 0.43395706390247063),\n", + " ('comedies', 0.43395706390247063),\n", + " ('played', 0.43387244114515305),\n", + " ('international', 0.43363598507486073),\n", + " ('vision', 0.43286408229627887),\n", + " ('intelligent', 0.43196704885367099),\n", + " ('shop', 0.43078291609245434),\n", + " ('also', 0.43036720209769169),\n", + " ('levels', 0.4302451371066513),\n", + " ('miss', 0.43006426712153217),\n", + " ('ocean', 0.4295626596872249),\n", + " ...]" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"POSITIVE\" label\n", + "pos_neg_ratios.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('boll', -4.0778152602708904),\n", + " ('uwe', -3.9218753018711578),\n", + " ('seagal', -3.3202501058581921),\n", + " ('unwatchable', -3.0269848170580955),\n", + " ('stinker', -2.9876839403711624),\n", + " ('mst', -2.7753833211707968),\n", + " ('incoherent', -2.7641396677532537),\n", + " ('unfunny', -2.5545257844967644),\n", + " ('waste', -2.4907515123361046),\n", + " ('blah', -2.4475792789485005),\n", + " ('horrid', -2.3715779644809971),\n", + " ('pointless', -2.3451073877136341),\n", + " ('atrocious', -2.3187369339642556),\n", + " ('redeeming', -2.2667790015910296),\n", + " ('prom', -2.2601040980178784),\n", + " ('drivel', -2.2476029585766928),\n", + " ('lousy', -2.2118080125207054),\n", + 
" ('worst', -2.1930856334332267),\n", + " ('laughable', -2.172468615469592),\n", + " ('awful', -2.1385076866397488),\n", + " ('poorly', -2.1326133844207011),\n", + " ('wasting', -2.1178155545614512),\n", + " ('remotely', -2.111046881095167),\n", + " ('existent', -2.0024805005437076),\n", + " ('boredom', -1.9241486572738005),\n", + " ('miserably', -1.9216610938019989),\n", + " ('sucks', -1.9166645809588516),\n", + " ('uninspired', -1.9131499212248517),\n", + " ('lame', -1.9117232884159072),\n", + " ('insult', -1.9085323769376259)]" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"NEGATIVE\" label\n", + "list(reversed(pos_neg_ratios.most_common()))[0:30]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Transforming Text into Numbers" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
ITBiD8G9077fkJiJ5O56gR2HhXNgC4ltRSO/m48J5w7/z6hcuL+53Ezo7TveizvMGkFXy5M1S\nJ29wWB6cv0jZUzJxOjq/keXLl1v/kbRvlnHl6vzgIJDnfydP3jwI77XXXp3s3aYCOokSHhxxxBGd\nlElfaCEjzn+PzFE+ZlHnSEvAQCcXXnjhMP8LXrIdSXFpivomAJoTLCG+fv60DGkOOuggl9RgPUGv\nrJJmmixNHbmJSJaOdwoSzc2Zl7gRcKRBYJXOzOTSQkT8myXKx4JpDRcxDmchJI9+ru4k30V2dpL6\n8qZhXjYc8S9vmf4/Zdqy8uRNW5dLj+UBX4x+Tsm4ut238wc577zzrC6OGLnr+hYCvRDI87+TJ28v\nvbpdf9/73te5/OMf/7hznPfAd+TEZ8ONA5TLy60fNZWoq05cBFF+48fn5+PY+fa59FHfLKZwgzzT\nzZ/61KeikkWe86f2IxOETrrpGXfa6Rc1LYP/iHshZ2xFL//Zzzjsj53hKKtEq3biynG/C/sOzDK5\nhKU+gTKdj7+cNWj0sNgSQSM6dYWXDJEvvO6a374EJshOPdTpL/UML99lyRKSVT90de3y20SZcdf8\n8/7yXadHcJN0ykQvJ36+cJtJQ/1OFzDwxZ3nO6ynny587JZlhs+n/e0v7UIH+oG+TSqk9dtHGZSZ\nVVgemWZZcvDgGPrGN76Rtbpc+aKW87Jcl/P9FnAAO+4Lt0QXHMMfAqGRJvBRqETPfuNSdH0O37zl\n1u3/jvs2sOYlbpb/P8+zMiz+czvueeA/+xhrnPjPUz9N+Nh/BpM/fL3bb1dfeNzplsevD12j0ro0\nfvtJFyU+hq4sfzmvnydcnksf/ga7sPjjlj/mhtPl+R3dwpQlZul4n1TQUDd4+f9gYVCS3izhzsii\nn58nPMDHXcva2X55eYiIu6nczdytG4t6IEb9M6AHDxf6kutRH66Rxunsf4fx7taO8DUCbKXZYI3B\nl4G/34N/N8LRL32oB7yIawH+fDuiAS5RH9Jz70BQyJMnIFq47wbhN5imIcpxmNTt/y5tu8LPcvf8\nd+31n6VpiQhlxz1b3HMm6hnj1+nSue8w4XBEhG9/oHbp+WaMYyxy5yjDlygd3bM7rIufzx2DmSvb\nfXcjCnH3jMvLOOTaFVdHuJ9curzfhRCRtB2PtcI1nm//puh2jVopbAYAACaLSURBVMaGO8gvh2M6\nNwxWWv2oxycHvn69rmXpbL+utESk282MrnGS9sERVw5YR+kQ7pekv6P6L67uqPNpIjz6Az549Euo\nCwtEN0E3yEoZgjXDEQmIB7976ROnB21xAdEgJRCVrGXF1dGm8/QpOOWVuv3fpX0BoP3+cyM8gPrP\n+bRExGHLs9ivg2cQ5CDqGevycI363POKZzO6cN6d49sfYxizfMLh8lBmHIHhWjgf5VIX4ref83Hi\nt89/oY9LT51hndA3PMa5/PSLa3dcP7i0eb7jW5ih1KQd74MHCGEJW0vCLA0w/TQA1Q1MV35S/UhP\nea4Dwp3U7Rp503a2X17UPwn1O11oty/dbmY/XfiYgS6NKTWc3/+NDn6fOl3TflMGZeURBlgG1iQS\nJh/h30nKSJPGTX8kzZM2fa9yaR+DYFmEwREc7iusJpJoBPi/KIKs1en/DkILGUkj/mAbfq6lKacf\naf1nMAP+oIhPsBxJKqPthRKRMhRUmeUhwIBU5Fs3/6w+qUpKRMgTJntZW530IR9FOnwLSdb64/Ll\nsXCga56Bi7oJvw1BSDtYxLWn23n0hRByf0Xh3C3vIFxLQ5aT4FGH/7ssfY1VwU1rJHmbT4JF1jT+\nixTPI//ll5dDpyfPF9IOgtA/7hkOJmXKKAoPKpMMIAJXXHGFjXzKio0iBe/05557zgZ5Y437L37x\ni2HF/8Vf/IUhWixLnj/ykY8MW/I2LGHKHxs2bLDr93utBHDxQ4KBe
UQNxPLgfJEhy6MCl42ouMeJ\nrGWAybnnnmumT59u5s+fX2i7eqhsWJIcvOkaljWyZYPk9wjcfPPNNvzAjTfeWCgkVf3f8f9EAL6A\n8KZuDytSXETSgFCVGnujm3K+Ht3ScS2wDGSKl9Sr3Lpd9zEpvc1lshyVXW8E2KiqqA24qm4p0wKz\nZ8+2/gq9dOn1lt7req/y/etYnPJYM/yy0lpV8N3ACpJ0qsqvq6hjdOYe41MUDkXpVkU5YMAqrdGj\nR7cGjyz+IT72zhpR9lu3X2fUse8bEpCCjjXAP677FFJUu7Kc861V/bAAaWomSy+1JA8PRQYqBoum\nC235q7/6K/uQh0jEkYm48377KauIKSvqoqwihfKStIE5ewb/ItpRhP7oU/RUYBF69aMM+oA+4+P6\nAyyqJIhFtjtvW/B1cYN9UVO0WduHH4TvF+H0wsEzyn8vaz11z0c/0HampPxpqrL01tRMgPYgC9Mz\nSNFm4n5j+vDDD5tbbrllWKRDoqU6IQCQm26JmpJx6dx3t+kblybu2wUpK3O/mG6b5tGnRHRcu3Zt\np81xuvbzPHqxWRtTZ20OY8+9409TRG1uyLTVI488MmxH8X72RVF1cR9OnTp1WHuLKlvlDA4CIiKD\n09eRLXXzu8GbWq0GrUhlu5w89NBDrQ/EaaedFpkKchBMRdldKknAXjO9CEmWsO/gSV39GGij/Ebq\nSkJcpzgy0vT7zbXHffukN8m9xT0CQSFfr/vQ1VHH79NPP90cffTR5tJLL62jetKpIQiIiDSko8pU\nk8GBEL9NfZhgDbnmmmvM5s2bY2EKk4okb60UFs4XW0FwIYoYdEtfxDWf+OAEedddd5mNGzfWmlTW\nnSwl6Rf6Opgm6yTNYv2iv9gjhNDgTRT+N7CGtI1UNrEvmq6ziEjTe7AA/RnMeIvDnNy0tzPeLI86\n6iizcOFCc/zxx0eiQfuQbm2LG1gon/y9LBw8lKNM8JEKFXwSHbH2/PM//7N5+umne+pacPWZimP3\n0MCHpTGracCYAddJEX1NmZSzZs0au+rEld2Ub1lDmtJT9ddTRKT+fdQXDdnxeMuWLY17O0uidxqr\nhgObPE7+93//13z0ox+NtDK4ASrLG7ErP++3G9BuvfVWEzc1lbeOovND7sDs3nvvjSWQRdeZtjz/\nHsDHqBcZTVs+6fEVWbRoUe2tWOG2Ob27WSHDefRbCMQhICISh8yAnWcww7IwZ84cU3RckbKgZKDA\nNMx3nLWDa3lJAthE+ZcwmHKtjAEqDWZMdbCV+tKlS9Nkqzytm1Kry1QS/ek7mea9b5ICjGUhWHnS\nGOsQ1kMsWk215CTtF6XrHwIiIv3DuvY18YDBVIwJuurBtRdYSawADCxIHEnpVYd/nfooD1z4JmDb\nu9/9bhPEg6hsSgb93KDQjYz57ajbcdXmfXBzksTJ1KUt8pv7CdLTBIsW/wdTpkyxLwBN9Skrsu9U\nVjEIiIgUg2NrSuEt9ZJLLqn1Ekv3MAwCIHVddlyENcTvWEdseGv2fQTi/Ev8vGUdVz2Q520XfdRP\nh8cq+6obVi4CLkt6ua/rKjNnzrTWt6Y62NYV10HXS0Rk0O+AiPbXeVWDIyF77rlnV3+WokkIMFE3\nUzS9pq78t+yyfAvQx1lDmr5qoUwyRZ8V7WQK9mXIv//7v9upUVbS1NEiWefnQhn9oTL7h4CISP+w\nblRN7qGzePHi2jwUHQkByG7BupzloogpGddplEn9DBBpSE54ICzS/E8fsV9P0/dxcVYR3z/D4Z7l\nu19EMItucXnc/bV169ZaWiTd86Db/11c23ReCPRCQESkF0IDfJ2HT10iYfKg/vSnP232228/u8rA\nRUmN6p40RCEqf/gclgfqc8QGcoE+Wd5ayecPuP4UT7jebr/Rgby01enVLX3drxGQrtsS7G76hzHt\nl5NpN53SXEN/xPWjmx6tg88I9
xk+IYhIiIVBf0pA4E//XyAllKsiW4BAsNmRHfhZTbPvvvsaBosq\n5MEHH7R+BGeccYYdrN75znfGqlEGCWGA2GuvvTp1Uj9Levnupksng3cAocEq4j7btm0zP/7xjy2x\n+fWvfz2sHi/biMNvf/vbxr09j7jYwBPvete7zLPPPmu453oJg+Mrr7xiMWMQZ/oLUuYw7ZW/TtfD\nJATdDjzwQPu/NmvWLPPb3/7WfPzjH69EZf6XWA7O0vXbb789cvl6JYqp0tYhIItI67q0+AZhETjz\nzDPN/vvvb4gG6d7ciq9peIkMOCwnfuyxx6wVhCWO3awQUQ/14SWm+8WDuJvFomjSQ3t9f4Zu0zhY\nq5ocDTfcE/QdlgzfWuSn8Z1My/S78ess+7jX/cp1rIDIggULci9DT9oe7sOvfvWrlnxcd911PX2i\nkpardEIgFoGydtNTue1CgF1fb7rpJrsjIzupBgNGaQ10dQWEZ2jGjBmdHWw5HwzUsfUm2ZU2NrN3\ngXqSlpU0nVd84kMwpnz3QS8n7HjaDQuXrknffptcH0S1vUltitOVvk36P8T/Hf8LZf/foWvgjG3r\nmjZtWmL94tqo80IgKQImaUKlEwIgwMOTB2LAbO13kYMhZbuHLg/CqEHeDVDh3ohKG06T5Dc6pGkT\n6fn0Q9CLdr700ksW/37U2c86zj333KGrrrrKtjFNH/RTxyLqynLPcN/7/3dF3e+0h7L5v4MI8lm/\nfn0RzVQZQiAxAn8SayrRBSEQgQDTMjfeeGPHhE6ERT5M2TBVkVYwuRMumjKYiti+fbuN2Eicgiin\nQ3wsOE9dmJARTNjkzSvognSb/gnXAR7o4XQJXy/yN3rR9t/97ncmIGpFFl2Lsg4//HDzZ3/2Z7aN\nafqgFsonVKLXdExcMdz37v+OKTn8R/DZYouDLP936IFTLHFBmOrC52bJkiV248i4PZvidNN5IZAX\nAfmI5EVQ+Q3BmPDjCN6k7H41DJJjx461PgzAwxJTHnY7duywaO3atcume/755+3vyZMnm1NOOcVM\nmjQplUMcxIEHdPCGGUlabOEJ//Aw7+YP0qsY8kcRp175slyHuL366qtdg7llKbfqPGCIL0Rbg2Vl\nJSFx/cL/HRF+2eiQD5sIsqpswoQJnSzjx483r7/+uv0d9393xBFH9M3vq6OYDoSAh4CIiAeGDvMj\ngGUgMKvbFR08+BCsHOyF4j8gJ06caK0YWBTyyDe/+U3zvve9L5UVw6/P6ZuXRFAOA00/3uSxPiFt\nC7HdZiJSNAnx72GO3X3MSqpu/3cQk7333rsv92lYR/0WAnEIiIjEIaPztUfAPdyxikB+0pIJ8vMA\nL4o8YKGBWKFPmdJWIkJfYDkLJpbLhK/vZbv7NC/p7rviqlAI9AkB+Yj0CWhVUzwCTMm4gR8Swhs1\ng1kSyeIP0qtcCA2ESJINgbIJXDat8uVy95lISD4clbvdCIiItLt/W9u6KJ8MyAhvn+4NNK7x5GVg\nKGNwcIQorm6dHxwEnIWsjPtscFBUSwcBARGRQejllrURohG3SsZNs7g3Ub/pWEscgSnz7RvdepEh\nXy8d/x4BMGvLoO1ISJn3me4bIdAWBERE2tKTA9QONyUT12Rn7YB0OGGQ45PWj8TlT/NN/ZCepNNE\nacpuc1r6FSfmpotISNN7UPr3G4F39LtC1ddeBBh48ZFgWa5bKhjVWre0Fw9+9tVI8xbsLBpR5frn\neBN10yR/+qd/av76r/+6MKdUv564YywzSXWNKyPuPMuhN27cGHe5seeDwFqN1d0pLhLikNC3EEiO\ngIhIcqyUMgIBrAxPPvmkDUrmYhkcdthhNobI3LlzI3IYu7SXJb2LFy82q1atMkE0Rxugi03t3NRK\nVEbqipuSiUrPOVZh/OY3v4m7XOp54pIwMHVrUxYFxo0bZ/HOkrfOeYh3wb3QVBEJaWrPSe+qEdD
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
uAvnnHFNwOHMTNt4N/jZBhX+cHnJMrbATVLUQSIlAIywivKHx0GOKglgi\nZ5xxho1t4Jw4Gdj9wd3HgAdm1dJNPxe3oZuOZbSXYFXEicgrrFBieqxICTvzpikbcsVbe90Fnw8G\nTawQzockqc6kdzsJJ83jpyMv8XUWLlzYCUgWhHzvGQsEAuULRCZYWmwee+wxax1Br9mzZ9vYJH66\nfh2LhPQLadUjBApGoMi1wEWWxV4sQVOHfQIi0qkiICcjQrSH0xOvwxc/fodflp+GY78cYntECfld\nunA97jzf4RDx/rVwvrg4ItSfpb1RertzxFQI3hrdz9Z8VxniPQuIgb9Q5vDqafdKcTFW6HdiyBQZ\nV4N2VLnXjOKEZLn7lEcI1AOB2k7N4FcRDNSxsS7wq1i3bp1NE+wZE4zvfxQiofKWniUK6h9LKeaI\nMOa8fWLNcYK+AdFKpV/R7XWm6zwrOVx76vS9atUqc+CBB9ZJpa664DdCX6R1YiUfH2cF6FYJlgsc\ngKdOnWqj6bL65tJLL+1pAelWZvgauhCldfPmzTZ0/DXXXGOnbfpxf7k63D0d1k2/hYAQqDcCo+BD\n9VZR2pWFAL4BbNde123a07Z7w4YNJtiAzQ6GafPWIT1kJK0Ta68pGq5DyPfff3/ry9HPwRrfkc9/\n/vMmsL5Y4lMGxpAQ2gQRkggBIdBMBGprEWkmnM3SGufD5cuXN0vpLto+99xz1uehS5JaX8rixNpt\nSS99ixVkzpw5dk+YfpIQgMbqgvVlzZo11iG2CAdbvwNFQnw0dCwEmouALCLN7bvcmjMw4BgazK8X\naqbPrVjGAljaysZtaZ0/M1ZXWjamW+ibpO1w0zM+0bjiiivMypUra4EHbYEMvfnmm2bt2rWFWC9E\nQkq7/VSwEOg7ArKI9B3y+lSIOZupjGXLltVHqYyasAx19OjRiQfvjNX0JRuEgk9SvxHSMtjzQSAh\nrCjDGpGUzJTZMO4zdvjFT2rKlCkdPbPWKRKSFTnlEwL1REAWkXr2S9+0YrDDfM+g1eR59g984APm\nkksuMTNmzOgbdv2oiP5J6jdCWhyjISFFWR6KbqMjSVn1EwkpukdUnhCoHgFZRKrvg0o1wMdg4sSJ\n5u67765UjzyVYw3B55qYJgzG/idPuXXIm8ZvhGBuTMd8/etfry2pvPHGG81+++1nzj///NTwioSk\nhkwZhEAjEJBFpBHdVK6SDNxYRfj2/QzKrbW40g899FC7ZJTlo2GhTb7gE1OH6QpfpyTHvfxGGKSx\nnNRlOqZbm5hCYorm2GOPNfPmzeuWtHNNJKQDhQ6EQOsQEBFpXZdmaxAm87ffftvO5WcroZpcLBFl\nVQZOqkmEQZDB2pekUx9+niqOne5YSXzhPMuwcQhtylJsiAXh4em7cHv8tnEsEhJGRL+FQLsQEBFp\nV39mbg2DGQPyrbfeWlmI7rTKF2UFoJzwjsi9Bse0uhaZHiuPT54IVrZlyxa7RLfIesoui+XFixYt\n6hr3JdzWsnVS+UJACPQfARGR/mNe2xp56DNF04QlsGVbAcJTOiwNrtO0FeTJORejW1OXYGMV4Z4j\n5khY6IM6E8KwvvotBIRANgRERLLh1tpcTHXcddddZuPGjZ2Bro6NZQBjOSjOj/0QfDQY7H3xrRL+\n+X4do9MXv/hFs++++yb2teiXbknrceQ3vGpLJCQpgkonBJqPgIhI8/uw8BbkXWJZuEKhAuuiX3hK\np9+OsBARrCHoccABB4RQas7P008/3e6B46wiIiHN6TtpKgSKQEBEpAgUW1hGXQZ7H1qmY9hMra5x\nMtAv7Ahb5pQOfYT0yyrk90WRxxAP9sNhwzyRkCKRVVlCoBkIiIg0o58q0ZKBbv369TZIVtVLXhnk\nWfKJZA2GVQWIUVM6Rfk9QHKa4M+TBHeWYF9wwQXm4osvTpJcaYSAEGgRAu9oUVvUlIIR4
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "\n", + "review = \"This was a horrible, terrible movie.\"\n", + "\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAi4AAAECCAYAAADZzFwPAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQVdV5/xdNZjIxjRgrM52qFI01ERQVExWNeMMLQy0YiEiNEgOYaJAO\nitIaGYo2TFGQeElQAREjRa0oDEG8AKagosYYkEuSjjUEbP+orZFc/KMzmfe3Pys+57fOfvfZZ1/P\nWXu/zzNz3rPP3uvyrO/a717f/axnPatfTyBGRRFQBBQBRUARUAQUgQog8CcV0FFVVAQUAUVAEVAE\nFAFFwCKgxEVvBEVAEVAEFAFFQBGoDAJKXCrTVaqoIqAIKAKKgCKgCChx0XtAEVAEFAFFQBFQBCqD\ngBKXynSVKqoIKAKKgCKgCCgCSlz0HlAEFAFFQBFQBBSByiCgxKUyXaWKKgKKgCKgCCgCioASF70H\nFAFFQBFQBBQBRaAyCHy8MpqqooqAItAVBH784x+bPXv2mJ07d5q9e/eat99+2+zYsaOXLuPGjTOH\nHHKIGTp0qBkyZIg59dRTzac//ele6fSEIqAIKAJ5EOinkXPzwKd5FYF6IrBp0yazYcMGs2rVKjNg\nwAAzcuRIc8IJJ5jBgwebgw8+2Hzuc59ravh//dd/mf/8z/807777rtm1a5d58cUX7Qcyc8kll5gv\nf/nLSmKaENMfioAikBUBJS5ZkdN8ikDNEPjtb39rli9fbh566CHbshkzZpgLLrjA/MVf/EWmllLe\nxo0bzfr1682yZcvMjTfeaG644YbM5WVSQjMpAopA7RBQH5fadak2SBFIj8A999xjPv/5z5stW7aY\nJUuWmO3bt5tJkyblIhlME1166aVm6dKl1hqDVocffriZOXOmwUKjoggoAopAFgSUuGRBTfMoAjVB\nAP+Vk046yaxZs8Z+nnzySfPFL36x8NZhtVmwYEGDwFDHihUrCq9HC1QEFIH6I6BTRfXvY22hIhCJ\nAFaW+fPnm3nz5lnrSmSikk5CmKZOnWqOOeYYOz2lTrwlAa3FKgI1REAtLjXsVG2SIhCHAL4nU6ZM\nsRaWzZs3d5y0oBsWl61bt5pBgwbZKapf/OIXcSrrNUVAEVAEGgioxaUBhR4oAvVHANIyZswYc+ih\nh3pj6WDK6JZbbjGQqPBqpfr3iLZQEVAE0iKgcVzSIqbpFYGKIiCkZdiwYdbfxJdm4ATMEuvzzjtP\nyYsvnaJ6KAIeI6DExePOUdUUgaIQ8JW0SPtYfYQoeRFE9FsRU
ARaIaDEpRUyel4RqBECc+fOta1h\nZY+vAnn5zW9+YyZMmGD9X9Rh19eeUr0Uge4ioD4u3cVfa1cESkdAfEh+/vOfVyJ6LY7DH3zwgWFp\ntooioAgoAmEEdFVRGBH9rQjUCAECveH4SpyWqlgwFi1aZPdD0jgvNboRtSmKQIEIqMWlQDC1KEXA\nNwTGjx9vTjzxRDN79mzfVIvVhzgvY8eONVWxEsU2Ri8qAopAoQgocSkUTi1MEfAHgaoP/mwNgPjs\nl+NPb6smikDfQUCJS9/pa21pH0MAaws7M7PcuIrCNBd7G7HrdNaNHqvYbtVZEVAE4hFQ4hKPj15V\nBCqJgFhbGPSrLGp1qXLvqe6KQDkIKHEpB1ctVRHoKgKszBk6dKiZPn16V/XIWzlWF7YHUF+XvEhq\nfkWgPgjoqqL69KW2RBGwCBBsbtmyZYapoqoLU0RsA7Bx48aqN0X1VwQUgYIQUOJSEJBajCLgCwIM\n8pMnT66NXwg+OuvXr/cFXtVDEVAEuoyAEpcud4BWrwgUjQCD/FlnnVV0sV0r74ILLrAWpK4poBUr\nAoqAVwgocfGqO1QZRSA/Ahs2bDCnn356/oI8KYHponPPPdfgcKyiCCgCioASF70HFIEaIYAzK4Jf\nSJ2EHa23bdtWpyZpWxQBRSAjAkpcMgKn2RQBHxFg+fPw4cN9VC2XTieccILZt29frjI0syKgCNQD\nASUu9ehHbYUiYBHAKjFo0KDaoTF48GCzd+/e2rVLG6QIKALpEVDikh4zzaEIeI3AwIEDvdZPlVME\nFAFFIA8CSlzyoKd5FQFFoCMIfP7znzerV6/uSF1aiSKgCPiNgBIXv/tHtVMEFIEAgU9/+tOKgyKg\nCCgCFgElLnojKAKKgCKgCCgCikBlEFDiUpmuUkUVgb6LwC9+8YvaRALuu72oLVcEikFAiUsxOGop\nikDXEWCPogMHDnRdjzIU+M1vflPLZd5lYKVlKgJ1R+DjdW+gts8PBIh6umfPHrNz5067rPXtt982\nO3bsaFKOCKnEIDnkkEPszsYcszOwSmsEICvsnIwcfPDB5vjjj6/lvj4QFxVFQBFQBEBAiYveB6Uh\nsGnTJrNq1SpDCPoBAwaYkSNHGgKJTZgwwQ6y4eiuRH0lgNq7775rdu3aZWbNmmVefPFFu2Hg6NGj\nbX510jRGcKLjICsuuWOAf+edd0rr024VvHv3bjNixIhuVa/1KgKKgEcI9OsJxCN9VJWKI4AFYPny\n5Wb+/PmWrMyYMcOwSR7WlCxCeex2vHLlShvyfeLEieaGG27IXF4WHXzI45KVww8/PLb9/fr1MxCY\nOpG88ePHmyuuuMJceumlPnSH6qAIKAJdREB9XLoIfp2qhmDcc889hngbW7ZsMWvWrDHbt283kyZN\nih1k22HA4Mtg9eSTTzY22WPgnjlzpqHOOgsOqUyxyeaCWFb4tCOBbEj4+uuv1woaIgKfdtpptWqT\nNkYRUASyIaDEJRtumstBgIH1rLPOahAWSIY7feEkzXXIgL1gwQI7nfTBBx9YkrRixYpcZfqWWYgK\n37Q3KVlx2wFxeeWVV9xTlT4GC6Ya2xG2SjdSlVcEFIHECOhUUWKoNGEUArfffru5//77zbx586x1\nJSpNWecY0KZOnWq+8IUvmEWLFlV2aoR2iBRB+LDU4EeExasOwj32P//zP+buu++uQ3O0DYqAIpAT\nASUuOQHsq9mZphkzZoxt/qOPPtq1t2H0wI8Gh9TFixebsMOvj/2DzrISCP2KICvhdp500klm4cKF\n5vzzzw9fqtxvpgbXrVtnPvWpT1WifysHsCqsCFQMAZ0qqliH+aCukJZhw4aZtWvXdo20gAU+MEuX\nLjVMj5x33nkGa4OPgnMtlhU+HMsUUBmkhfZ//etftyu6fMQijU5PP/20JSvca0wVudapNOVoWkVA\nEagPAmpxqU9fdqQlLmnB3
8QnYZCbNm2a2bx5sxdv5mlWAhWNI/3EUmmWl1fZNwTL0Zw5c5pWE0Fe\nyiJ8RfeDlqcIKALFI6DEpXhMa1uiz6RFQO82ecHiI8HS2i1bFp3L+IY0sST997//vbVIlVFH2WXS\nl3Pnzo301VHyUjb6Wr4i4C8CSlz87RvvNJsyZYphNQ+rhnwWnDkJXMc0VidimbjTFywH70SdcfhD\nntCBD/qwNL1qFgpIMivVwtYWt93g7gPerk56rAgoAuUjoMSlfIxrUQMxWh566CGzdevWrg/MSQAl\nYBlbB+D/Uoa4ZMUnUhAezFkuzoqrqvSb9BXkky0h2pFkSBpTYd0mi6K3fisCikD5CChxKR/jytfA\n4MCbLSthqrBqB8B5Y0fnRx55pJCVNZRX9kqgvDcKpCWKREHiTjzxRDN79uy8VXQkP+0YO3asdcRN\n4p+j5KUj3aKVKALeIKDExZuu8FcRBj72iZk+fbq/SkZoxl5JV111lSUcWd7IXbKCo6uvpE30jCIt\nwCKrmO67774mJ9cIyLp+KquuSl663nWqgCLQMQSUuHQM6mpWJA6SVZtqELTTki53JZDPZEXah74Q\nl3akSkicLyuuRH/3m3YQG4ilz1lWrEFeIKhJrDRuvXqsCCgC1UJAiUu1+qvj2kYtR+24EjkqZDAj\nvgvTPK2sLqTxYSVQ2mZCWpAkAzVpf/SjH5mbbrrJm+XibnuFtBx99NG5/JLSYOLWr8eKgCJQHQQ+\nXh1VVdNOI4C1BanyjrxYIthRmh2r3akul6zgC9POYtFp7NvVl9a6QDyXv/3bvzWf/OQnLZHzyfIi\npIU240icRyBxkBc+SQhdnro0ryKgCHQHAa8i5z7xxBOmX79+9sNgUzVx9acdrki7+H711VfdS22P\nTznllAYuS5YsaZu+qAQrV660y1GLKq9b5bBvDzFNcPqUD4MaPiF8WlliuqVvu3ppA/onHZhJL/4v\nkFB8XSBrQkzb1VfmdQiYTA9BporoC8GFslUUAUWgfgh4RVzqB291W8Qb6+rVq83IkSOr2whHc/a5\nYTqoqmRFmiIkJOkATz8SCM8VyMvrr79uowzPnDnT+si41zt1DHFiGo8VRFl8WuL0FGKn5CUOJb2m\nCFQTASUu1ey30rV+4YUXzOTJkwt5Ay5d2TYVQFa+/e1vmw0bNrRJ6e9lplOEtKTRslXIfzChvL17\n99pAbxx3SiBTBDNkewaC47lTeEXqALmDwCh5KRJVLUsR6D4CSlwK7IPLLrvM9PT0ND4FFt3xotiN\nd/To0YXWu2fPHsNU13XXXWf9TtzpMzlmWoxpwjvvvNM899xzDafZvIqcfvrpld10kIGej0z3JMWi\nHdFhUCfAG7trY/VgBVaZBAbyRSBD2kFwQBym07YpadslnZIXQUK/FYEaIRAMtN7I448/3hNAaz+X\nX3651evBBx9snOPaLbfc0rN///6WOm/bts2mkXL4PvLII3so58CBA5H53LTk53PhhRfaesmHJEnj\n6k96V8L5n3322UYdXKM+zkVJsDy0Ub/o46aLajP4oU9WQafgbT1r9qZ8Lp4uDkmO6bs77rijZd81\nVdTmRzBQ9wSDZZtUfl2mD7L0Q9p8wTRaz913390DRuPGjevZuHFjYUCA+Y033tgouxt9QPuC6bHC\n2qQFKQKKQPcQaB5du6eHrdkd+Bl4+UQNbgxmUeSFAS4qvZwj3+7du3u1Uq7zHS5DiEKSNK7+pHfF\nzQ/5cn+7x1KfmzeOuJDezR8+pq60wsASRFpNmy0yfTv9wvq2+g0GUX0eWWmLk8HUV89TTz3V4qp/\np+mHLKSFlmQdpBngH374Ydv/kBgIByQmrR7Uf9tttzWVk7aMMnokKy5l6KJlKgKKQDYEvF0O/dhj\njwVjWLQEA5iNR7Fq1apGAqYgbr755sbvqAPyXXzxxWbXrl2G4GJR0q4M8iRJE1W2nJs3b54
c9vq+\n5pprzAknnGCY2mgnTKWQPk6oa9CgQWbq1KlxyZquMaVzzDHHNJ3L8iOJfknLffPNN63PDWVmlaFD\nhxrugSoIUzas/EnqhOu2qd0UkZs2fEx9kyZNsh98Q8B78eLF1lGbbQO4L7ifBg4cGM5qtmzZYt5/\n/327weW5555r+CxcuLCQLRd6VZbxhPj2lD1FlVE9zaYIKAIJEPDaxyWYPjGBhcT6jATTPIbfIi6x\nYbUIm7KJEHkzmJ6w+QI+ZwKrg1yyA9cDDzzQ+B11QHry8Wk14CdJE1W2nAssEY06AkuNnLbf7K+T\nRNx2uVgxOAfWqkYRYANGSYX8hPjPK3fddVevItCTtvOhj8KfYLrMXqNt9KMrzz//vB1I3XNpjocM\nGWIH1zR5upFWiEcW0hK1iihrG4htg+MsfjD8L3Cfzpo1K5K0UMe1115rl52TlqXN7I10/vnnZ62+\ntHxCXvC5UVEEFIEKIhA8ZLyR8FRLMIA26YavRABx4yM+K+F8pAuLO+3ElJErbpmki5IkacJ6uOW4\n+YNB2b1kj8NTKtI2LkZNFTHl5ZYZxor8tFPSoFtSwdeBTx6hfqmb71bTdO3qCOPCVF5WYZoA/w1f\npQg/DJ0KSd67TMWBuYoioAhUCwFvLS68bR9xxBHBmPf/JfxbrAg7duxoJAoGyMhplq997WuNNFgU\nmA6JEuJKtJMkaeLKuOSSS3pdPvPMM5vOvfvuu02/wz+Y7hLBihHGhqmwK6+8UpKYX/3qV43jdgeY\n/LFO5JEwvsTpGDx4cOoisXi5lhemjOooWVcOuViIpcY9p8etEcCiBO5qeWmNkV5RBHxEwFsfl2OP\nPTYxXu+8804jbZgAyIX+/fvLof0W0tN0MvgRThe+zu8kaaLyybkwyeB82OemlX5SRmDRkEPDFArL\niePkgw8+iLvc61pYn14JUp6I8olIWgT3Ql0JCxgweCJ5th0ocorIKtNH/oA5vjwsDc8yNddHYNJm\nKgJeIeCtxcUrlFSZ3Ajs3LkzUxkQuJdffjlT3ipkkuBoDJx5JFixk3gLgDz11DGvWF6EQNaxjdom\nRaBOCNSCuLCjrEirQc61UJC2aIuC1J/kG4fjsIQtLFFWGTePa/XBETeYoYz9fOc733Gzxx6zaiQ8\n1RObIeJieFUUDsJp91liuuzv//7vm1YC5Z2mi1C1a6eY2oGw5CUtOkWUvwvF2qXkJT+WWoIiUDYC\n3k4VpWk4yzRF8F9hE8PwwBnEppAkBj+YLP4WjQJyHuBDctFFFzWV4hIu9GtHXI4//vhGfvJCfIoi\nY0zrhIleo7IUB6wyYSktQr+wdJsPROszn/mMOfnkk3uVxpQW00L//u//Hjk91GoqsFdBnp8oimx0\nYoqIOl577TXbh9y7iCx75hjiNXz4cA4N/4vcP/z/CRmwFyrwh3bQVj55yWQFmqsqKgKVRaAWxIXY\nLAz2DI7It771LfO9732vQV7Yp8ZdPu06rXaj58KxVdhV2o3HkkQ/iBdOqwzytPsrX/mKWbRoUYOQ\nUSYb6Akmwaoiw5YESaUI4sJeND/84Q8bOkjdbl/IuSTfLJHOQzixImFN6qbgCBqsZiks1D1TRGXE\nJGEKi3uIjTbfe+89S0xYIn/FFVc0SLXUy0CPHgjL27du3WrvRfKNGjXKbh3Bxo5VECEvtL9qxKsK\n+KqOikAhCPi0CMpdThy1LDkYhJuW2PJbJLxsNgCnKa38DghOr/Dxco3vVsuGk6Rx9Se9K27+dsdu\nuygjajk058P1tSqX/GmkyGXDbGMA5q10S3o+sN706rc0bSItkVzzLvNOW6ebnsixLMEtSspY+kxk\nYaImBwO4xStPHbSXKLxBIDpbHthzrgoSWDAL7asqtFl1VASqgkAtfFyCwc8GinMDsnEuLFhlCHBW\n1JRKuPykv4NYJC2TEpit3TSRZMaCQvo4wSqzdu3auCS
9rh1++OH2TbvXhQwnmBJj6TZtBv+0wrQS\nffb9738/d7+xbF6mNNLqkTc9VgmkqLd4yqOfipKnn37anHTSSWbu3Llmzpw51oJCADmxqmSpB+sF\nUXgJRscu0Pv27bM6s9Gi70uQWWGE/uI8naX9mkcRUATKQaAWU0UCDQ6omLOZh3fD6jNg8hCeMGFC\n7sFP6srzze7Hf/mXf2mjjMoyX2Kx3HDDDb18X9rVQ5wT/D7Wr1/ftBUBhOWb3/xmy8i/ceXywMZX\noShzOUTxpptush+ma8SfhwEtLDhaM52DnwQkoyiSyUDJtMfy5cvDVZb+GxxlICyqMqZm8pAK0QPd\nmEp9++23LWEpa0oHXflwjxONd/78+YYI0T5G1hVspM+K+j+QcvVbEVAE8iHQD9NQviI0dx0RwD8G\n8sAgUwfZtGmTgdhGkaUy24cTbtY9h1rpVZRj74oVK+x2GITxv/rqqzsax4T+uOqqq6wPDL5ZPsdQ\nKdovqVW/6nlFQBFIhkBtpoqSNVdTJUUAp0rM+3UQBsmVK1faaYtOtkcIRpGDclFTRBBTplbpY8hp\nkTomwRhLC07KrCIbM2aM11MyYIO1iP5UUQQUge4joBaX7veBtxrgQ4GFoii/jG41lDdm2rB06VIz\nYMCAhhpFTLU0CnMOynxDFzLkVJfqEN0gCgi+T50mLFHKQqLY6b0K91pe/KPar+cUAUUgHQJKXNLh\n1adSEzSOZdHsM1RlYUpk3bp1dpdjtx3hN+gipnSwiAhRcusq4jjvoOkjaRFccA5m+bySF0FEvxUB\nRaAVAkpcWiGj520gLqwuOILisFtVYbXMwoUL2zqC4oTpRjCm7WnaLSuH0uRJimkRZY8fP94GjvPF\n0hJue9XISxFEN4yB/lYEFIH2CChxaY9Rn06BGR+pqtUFawvOn9u3b0/dj5AFSJsIK5xaTZuVsXJI\n6uU7r7UF69mLL77ozfSQ2zb3uCp6ojN9Dkn1YbrNxVCPFYG6I6DEpe49nLN9DN5YHnCkbDVo56yi\ntOxMjfBWjANqEf4slAcOrojTZplv33lJi6zgoZwyrEEuHkUcYxk65JBDrE9SEeWVWYaSlzLR1bIV\ngWgElLhE46JnHQQIGMbg3+mlxI4KmQ6nTJli8+GUW5Zg0XG3ISiCILm65p0iEvLme8wUt81V07ls\na5uLjR4rAoqAMUpc9C5IhAC7Mo8dO7YycV3KtjKI9SVMVLBquJLXEpPX2tIJ8ua2t6hj6T8sXFWY\nislLMIvCTctRBPoCAkpc+kIvF9BG3iohL1V4cy9bVwYpiEuSqTN0yerwm5e0kB+yWZXBP3ybMmVE\nBGeiXldBlLxUoZdUxzogoMSlDr3YoTZUYdUHhII4JcHGfqUMeHkHJ/IncfjNWw+3BAM/W2BUNfox\nGFRtVVsR/dahf2etRhGoLAJKXCrbdd1RXMLE+xhvA9JywQUXmC996UulrIIqw5dBppzc3hSH3/A0\nlJum3XHVrS3SPla19e/fvxQSKnUU/Q15SWqRK7puLU8R6AsIKHHpC71ccBt9jHQqlpbjjz/eXHnl\nlYWsInJhgwjk9Vdxy4s7Djv8ZqmXPqrDXlMy7edaqeKw8+Ua9yMEJsl0oi86qx6KQFUQUOJSlZ7y\nTE+ZNvLB54XBjZ2/R44c2bC0FEk0KCuP9SNN10VNNaT1k2HQJOZM1QMHCm74Vl1//fWmrJ2rpZ6i\nv5W8FI2olqcI/BGBj/1jIAqGIpAWgeOOO84cffTRZurUqebDDz80Z599dtoiCkmPdYJdhm+99VZz\n8803N8rEN2Lv3r3m//7v/zKvSmHgeeuttzpGWlAeR1osLK4cdthh1teDNvFBL9JBcvj87ne/M6QR\neeaZZ8x///d/2xD6cq7q3xs3bjR/8zd/U6lmfOITnzB8uIfoNxVFQBEoBgG1uBSDY58tBWvAtdde\na9s/f/78jg3yDNg
", + "text/plain": [ + "" + ] + }, + "execution_count": 27, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review = \"The movie was excellent\"\n", + "\n", + "Image(filename='sentiment_network_pos.png')" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "# Project 2: Creating the Input/Output Data" + ] + }, + { + "cell_type": "code", + "execution_count": 74, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "74074\n" + ] + } + ], + "source": [ + "vocab = set(total_counts.keys())\n", + "vocab_size = len(vocab)\n", + "print(vocab_size)" + ] + }, + { + "cell_type": "code", + "execution_count": 75, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['',\n", + " 'inhabitants',\n", + " 'goku',\n", + " 'stunts',\n", + " 'catepillar',\n", + " 'kristensen',\n", + " 'senegal',\n", + " 'goddess',\n", + " 'distroy',\n", + " 'unexplainably',\n", + " 'concoctions',\n", + " 'petite',\n", + " 'scribe',\n", + " 'stevson',\n", + " 'sctv',\n", + " 'soundscape',\n", + " 'rana',\n", + " 'metamorphose',\n", + " 'immortalizer',\n", + " 'henstridge',\n", + " 'planning',\n", + " 'akiva',\n", + " 'plod',\n", + " 'eko',\n", + " 'orderly',\n", + " 'zeleznice',\n", + " 'verbose',\n", + " 'amplify',\n", + " 'resonation',\n", + " 'critize',\n", + " 'jefferies',\n", + " 'mountainbillies',\n", + " 'steinbichler',\n", + " 'vowel',\n", + " 'rafe',\n", + " 'bonbons',\n", + " 'tulipe',\n", + " 'clot',\n", + " 'distended',\n", + " 'his',\n", + " 'impatiently',\n", + " 'unfortuntly',\n", + " 'lung',\n", + " 'scapegoats',\n", + " 'muzzle',\n", + " 'pscychosexual',\n", + " 'outbid',\n", + " 'obit',\n", + " 
'sideshows',\n", + " 'jugde',\n", + " 'particolare',\n", + " 'kevloun',\n", + " 'masterful',\n", + " 'quartier',\n", + " 'unravelling',\n", + " 'necessarily',\n", + " 'antiques',\n", + " 'strutts',\n", + " 'tilts',\n", + " 'disconcert',\n", + " 'dossiers',\n", + " 'sorriest',\n", + " 'blart',\n", + " 'iberia',\n", + " 'situations',\n", + " 'frmann',\n", + " 'daniell',\n", + " 'rays',\n", + " 'pried',\n", + " 'khoobsurat',\n", + " 'leavitt',\n", + " 'caiano',\n", + " 'sagan',\n", + " 'attractiveness',\n", + " 'kitaparaporn',\n", + " 'hamilton',\n", + " 'massages',\n", + " 'reasonably',\n", + " 'horgan',\n", + " 'chemist',\n", + " 'audrey',\n", + " 'jana',\n", + " 'dutch',\n", + " 'override',\n", + " 'spasms',\n", + " 'resumed',\n", + " 'stinson',\n", + " 'widows',\n", + " 'stonewall',\n", + " 'palatial',\n", + " 'neuman',\n", + " 'abandon',\n", + " 'anglophile',\n", + " 'marathon',\n", + " 'chevette',\n", + " 'unscary',\n", + " 'eponymously',\n", + " 'spoilerific',\n", + " 'fleashens',\n", + " 'brigand',\n", + " 'politeness',\n", + " 'clued',\n", + " 'dermatonecrotic',\n", + " 'grady',\n", + " 'mulligan',\n", + " 'ol',\n", + " 'bertolucci',\n", + " 'incubation',\n", + " 'oldboy',\n", + " 'snden',\n", + " 'plaintiffs',\n", + " 'fk',\n", + " 'deply',\n", + " 'franchot',\n", + " 'cyhper',\n", + " 'glorifying',\n", + " 'mazovia',\n", + " 'elizabeth',\n", + " 'palestine',\n", + " 'robby',\n", + " 'wongo',\n", + " 'moshing',\n", + " 'eeeee',\n", + " 'doltish',\n", + " 'bree',\n", + " 'postponed',\n", + " 'gunslinger',\n", + " 'debacles',\n", + " 'kamm',\n", + " 'herman',\n", + " 'rapture',\n", + " 'rolando',\n", + " 'tetsuothe',\n", + " 'premises',\n", + " 'bruck',\n", + " 'loosely',\n", + " 'boylen',\n", + " 'proportions',\n", + " 'grecianized',\n", + " 'wodehousian',\n", + " 'encapsuling',\n", + " 'partly',\n", + " 'posative',\n", + " 'calms',\n", + " 'stadling',\n", + " 'austrailia',\n", + " 'shortland',\n", + " 'wheeling',\n", + " 'darkie',\n", + " 'mckellar',\n", + " 
'cushy',\n", + " 'ooookkkk',\n", + " 'milky',\n", + " 'unfolded',\n", + " 'degrades',\n", + " 'authenticating',\n", + " 'rotheroe',\n", + " 'beart',\n", + " 'neath',\n", + " 'grispin',\n", + " 'intoxicants',\n", + " 'nnette',\n", + " 'slinging',\n", + " 'tsukamoto',\n", + " 'stows',\n", + " 'suddenness',\n", + " 'waqt',\n", + " 'degrading',\n", + " 'camazotz',\n", + " 'blarney',\n", + " 'shakher',\n", + " 'delinquency',\n", + " 'tomreynolds',\n", + " 'insecticide',\n", + " 'charlton',\n", + " 'hare',\n", + " 'wayland',\n", + " 'nakada',\n", + " 'urbane',\n", + " 'sadomasochistic',\n", + " 'larnia',\n", + " 'hyping',\n", + " 'yr',\n", + " 'hebert',\n", + " 'accentuating',\n", + " 'deathrow',\n", + " 'galligan',\n", + " 'unmediated',\n", + " 'treble',\n", + " 'alphabet',\n", + " 'soad',\n", + " 'donen',\n", + " 'lord',\n", + " 'recess',\n", + " 'handsome',\n", + " 'center',\n", + " 'vignettes',\n", + " 'rescuers',\n", + " 'pairings',\n", + " 'uselful',\n", + " 'sanders',\n", + " 'nots',\n", + " 'hatsumomo',\n", + " 'appleby',\n", + " 'tampax',\n", + " 'sprinkling',\n", + " 'defacing',\n", + " 'lofty',\n", + " 'opaque',\n", + " 'tlc',\n", + " 'romagna',\n", + " 'tablespoons',\n", + " 'bernhard',\n", + " 'verger',\n", + " 'acumen',\n", + " 'percentages',\n", + " 'wendingo',\n", + " 'resonating',\n", + " 'vntoarea',\n", + " 'redundancies',\n", + " 'red',\n", + " 'pitied',\n", + " 'belying',\n", + " 'gleefulness',\n", + " 'bibbidi',\n", + " 'heiligt',\n", + " 'gitane',\n", + " 'journalist',\n", + " 'focusing',\n", + " 'plethora',\n", + " 'citizen',\n", + " 'coster',\n", + " 'clunkers',\n", + " 'deplorable',\n", + " 'forgive',\n", + " 'proplems',\n", + " 'magwood',\n", + " 'bankers',\n", + " 'aqua',\n", + " 'donated',\n", + " 'disbelieving',\n", + " 'acomplication',\n", + " 'immediately',\n", + " 'contrasted',\n", + " 'reidelsheimer',\n", + " 'fox',\n", + " 'springs',\n", + " 'toolbox',\n", + " 'contacting',\n", + " 'ace',\n", + " 'washrooms',\n", + " 'raving',\n", + " 
'dynamism',\n", + " 'mae',\n", + " 'sky',\n", + " 'disharmony',\n", + " 'untutored',\n", + " 'icarus',\n", + " 'taint',\n", + " 'kargil',\n", + " 'captain',\n", + " 'paucity',\n", + " 'fits',\n", + " 'tumbles',\n", + " 'amer',\n", + " 'bueller',\n", + " 'redubbed',\n", + " 'cleansed',\n", + " 'kollos',\n", + " 'shara',\n", + " 'humma',\n", + " 'felichy',\n", + " 'outa',\n", + " 'piglets',\n", + " 'gombell',\n", + " 'supermen',\n", + " 'superlow',\n", + " 'enhance',\n", + " 'goode',\n", + " 'shalt',\n", + " 'kubanskie',\n", + " 'zenith',\n", + " 'ananda',\n", + " 'ocd',\n", + " 'matlin',\n", + " 'nosed',\n", + " 'presumptuous',\n", + " 'rerun',\n", + " 'toyko',\n", + " 'mazar',\n", + " 'sundry',\n", + " 'bilb',\n", + " 'fugly',\n", + " 'orchestrating',\n", + " 'prosaically',\n", + " 'maricarmen',\n", + " 'moveis',\n", + " 'conelly',\n", + " 'estrange',\n", + " 'lusciously',\n", + " 'seasonings',\n", + " 'sums',\n", + " 'delirious',\n", + " 'quincey',\n", + " 'flesh',\n", + " 'tootsie',\n", + " 'ai',\n", + " 'tenma',\n", + " 'appropriations',\n", + " 'chainsaw',\n", + " 'ides',\n", + " 'surrogacy',\n", + " 'pungent',\n", + " 'gallon',\n", + " 'damaso',\n", + " 'caribou',\n", + " 'perico',\n", + " 'supplying',\n", + " 'ro',\n", + " 'yuy',\n", + " 'valium',\n", + " 'debuted',\n", + " 'robbin',\n", + " 'mounts',\n", + " 'interpolated',\n", + " 'aetv',\n", + " 'plummer',\n", + " 'competence',\n", + " 'toadies',\n", + " 'dubiel',\n", + " 'clavichord',\n", + " 'asunder',\n", + " 'sublety',\n", + " 'airfix',\n", + " 'stoltzfus',\n", + " 'ruth',\n", + " 'fluorescent',\n", + " 'improves',\n", + " 'rebenga',\n", + " 'russells',\n", + " 'deliberation',\n", + " 'zsa',\n", + " 'dardino',\n", + " 'macs',\n", + " 'servile',\n", + " 'jlb',\n", + " 'apallonia',\n", + " 'crossbows',\n", + " 'locus',\n", + " 'mislead',\n", + " 'corey',\n", + " 'blundered',\n", + " 'jeopardizes',\n", + " 'disorganized',\n", + " 'discuss',\n", + " 'longish',\n", + " 'tieing',\n", + " 'ledger',\n", + " 
'speechifying',\n", + " 'amitabhz',\n", + " 'bbc',\n", + " 'chimayo',\n", + " 'pranked',\n", + " 'superman',\n", + " 'aggravated',\n", + " 'rifleman',\n", + " 'yvone',\n", + " 'radiant',\n", + " 'galico',\n", + " 'debris',\n", + " 'waking',\n", + " 'btw',\n", + " 'havnt',\n", + " 'francen',\n", + " 'chattered',\n", + " 'scathed',\n", + " 'pic',\n", + " 'ceremonies',\n", + " 'watergate',\n", + " 'betsy',\n", + " 'majorca',\n", + " 'meercat',\n", + " 'noirs',\n", + " 'grunts',\n", + " 'drecky',\n", + " 'tribulations',\n", + " 'avery',\n", + " 'talladega',\n", + " 'eights',\n", + " 'dumbing',\n", + " 'alloimono',\n", + " 'scrutinising',\n", + " 'geta',\n", + " 'beltrami',\n", + " 'pvc',\n", + " 'horse',\n", + " 'tiburon',\n", + " 'huitime',\n", + " 'ripple',\n", + " 'loitering',\n", + " 'forensics',\n", + " 'nearly',\n", + " 'elizabethan',\n", + " 'ellington',\n", + " 'uzi',\n", + " 'sicily',\n", + " 'camion',\n", + " 'motivated',\n", + " 'rung',\n", + " 'gao',\n", + " 'licitates',\n", + " 'protocol',\n", + " 'smirker',\n", + " 'torin',\n", + " 'newlywed',\n", + " 'rich',\n", + " 'dismay',\n", + " 'skyler',\n", + " 'moonwalks',\n", + " 'haranguing',\n", + " 'sunburst',\n", + " 'grifter',\n", + " 'undersold',\n", + " 'chearator',\n", + " 'marino',\n", + " 'scala',\n", + " 'conditioner',\n", + " 'ulysses',\n", + " 'lamarre',\n", + " 'figueroa',\n", + " 'flane',\n", + " 'allllllll',\n", + " 'slide',\n", + " 'lateness',\n", + " 'selbst',\n", + " 'gandhis',\n", + " 'dramatizing',\n", + " 'catchphrase',\n", + " 'doable',\n", + " 'stadiums',\n", + " 'alexanderplatz',\n", + " 'pandemonium',\n", + " 'misrepresents',\n", + " 'earth',\n", + " 'mounties',\n", + " 'seeker',\n", + " 'cheat',\n", + " 'outbreaks',\n", + " 'snowstorm',\n", + " 'baur',\n", + " 'schedules',\n", + " 'bathetic',\n", + " 'incorrect',\n", + " 'johnathon',\n", + " 'rosanne',\n", + " 'mundanely',\n", + " 'cauldrons',\n", + " 'forrest',\n", + " 'poky',\n", + " 'legislation',\n", + " 'womanness',\n", + " 
'spender',\n", + " 'crazy',\n", + " 'rational',\n", + " 'terrell',\n", + " 'zero',\n", + " 'coincides',\n", + " 'thoughout',\n", + " 'mathew',\n", + " 'narnia',\n", + " 'naseeruddin',\n", + " 'bucks',\n", + " 'affronts',\n", + " 'topple',\n", + " 'degree',\n", + " 'preyed',\n", + " 'passionately',\n", + " 'defeats',\n", + " 'torchwood',\n", + " 'sources',\n", + " 'botticelli',\n", + " 'compactor',\n", + " 'kosturica',\n", + " 'waiving',\n", + " 'gunnar',\n", + " 'stiffler',\n", + " 'fwd',\n", + " 'kawajiri',\n", + " 'eleanor',\n", + " 'sistahs',\n", + " 'soulhunter',\n", + " 'belies',\n", + " 'wrathful',\n", + " 'americans',\n", + " 'ferdinandvongalitzien',\n", + " 'kendra',\n", + " 'weirdy',\n", + " 'unforgivably',\n", + " 'chepart',\n", + " 'tatta',\n", + " 'departmentthe',\n", + " 'dig',\n", + " 'blatty',\n", + " 'marionettes',\n", + " 'atop',\n", + " 'chim',\n", + " 'saurian',\n", + " 'woes',\n", + " 'cloudscape',\n", + " 'resignedly',\n", + " 'unrooted',\n", + " 'keuck',\n", + " 'hitlerian',\n", + " 'stylings',\n", + " 'crewed',\n", + " 'bedeviled',\n", + " 'unfurnished',\n", + " 'reedus',\n", + " 'circumstances',\n", + " 'grasped',\n", + " 'smurfettes',\n", + " 'fn',\n", + " 'dishwashers',\n", + " 'roadie',\n", + " 'ruthlessness',\n", + " 'refrains',\n", + " 'lampooning',\n", + " 'semblance',\n", + " 'richart',\n", + " 'legions',\n", + " 'gwenneth',\n", + " 'enmity',\n", + " 'assess',\n", + " 'manufacturer',\n", + " 'bullosa',\n", + " 'outrun',\n", + " 'hogan',\n", + " 'chekov',\n", + " 'blithe',\n", + " 'code',\n", + " 'drillings',\n", + " 'revolvers',\n", + " 'aredavid',\n", + " 'robespierre',\n", + " 'achcha',\n", + " 'boyfriendhe',\n", + " 'wallow',\n", + " 'toga',\n", + " 'graphed',\n", + " 'tonking',\n", + " 'going',\n", + " 'bosnians',\n", + " 'willy',\n", + " 'rohauer',\n", + " 'fim',\n", + " 'forbidding',\n", + " 'yew',\n", + " 'rationalised',\n", + " 'shimomo',\n", + " 'opposition',\n", + " 'landis',\n", + " 'minded',\n", + " 'despicableness',\n", + 
" 'easting',\n", + " 'arghhhhh',\n", + " 'ebb',\n", + " 'trialat',\n", + " 'protected',\n", + " 'negras',\n", + " 'rick',\n", + " 'muti',\n", + " 'tracker',\n", + " 'shawl',\n", + " 'differentiates',\n", + " 'sweetheart',\n", + " 'deepened',\n", + " 'manmohan',\n", + " 'trevethyn',\n", + " 'brain',\n", + " 'incomprehensibly',\n", + " 'piercing',\n", + " 'pasadena',\n", + " 'shtick',\n", + " 'ute',\n", + " 'viggo',\n", + " 'supersedes',\n", + " 'ack',\n", + " 'cites',\n", + " 'taurus',\n", + " 'relevent',\n", + " 'minidress',\n", + " 'philosopher',\n", + " 'bel',\n", + " 'mahattan',\n", + " 'moden',\n", + " 'compiling',\n", + " 'advertising',\n", + " 'rogues',\n", + " 'unimaginative',\n", + " 'subpaar',\n", + " 'ademir',\n", + " 'darkly',\n", + " 'saturate',\n", + " 'fledgling',\n", + " 'breaths',\n", + " 'padre',\n", + " 'aszombi',\n", + " 'pachabel',\n", + " 'incalculable',\n", + " 'ozone',\n", + " 'sped',\n", + " 'mpho',\n", + " 'rawail',\n", + " 'forbid',\n", + " 'synth',\n", + " 'guttersnipe',\n", + " 'reputedly',\n", + " 'holiness',\n", + " 'unessential',\n", + " 'hampden',\n", + " 'asylum',\n", + " 'bolye',\n", + " 'strangers',\n", + " 'rantzen',\n", + " 'farrellys',\n", + " 'vigourous',\n", + " 'cantinflas',\n", + " 'enshrined',\n", + " 'boris',\n", + " 'expetations',\n", + " 'replaying',\n", + " 'prestige',\n", + " 'bukater',\n", + " 'overpaid',\n", + " 'exhude',\n", + " 'backsides',\n", + " 'topless',\n", + " 'sufferings',\n", + " 'nitwits',\n", + " 'cordova',\n", + " 'incensed',\n", + " 'danira',\n", + " 'unrelenting',\n", + " 'disabling',\n", + " 'ferdy',\n", + " 'gerard',\n", + " 'drewitt',\n", + " 'mero',\n", + " 'monsters',\n", + " 'precautions',\n", + " 'lamping',\n", + " 'relinquish',\n", + " 'demy',\n", + " 'drink',\n", + " 'chamberlin',\n", + " 'unjustifiably',\n", + " 'cove',\n", + " 'floodwaters',\n", + " 'searing',\n", + " 'isral',\n", + " 'ling',\n", + " 'grossness',\n", + " 'pickier',\n", + " 'pax',\n", + " 'wierd',\n", + " 'tereasa',\n", + " 
'smog',\n", + " 'girotti',\n", + " 'spat',\n", + " 'sera',\n", + " 'noxious',\n", + " 'misbehaving',\n", + " 'scouts',\n", + " 'refreshments',\n", + " 'autobiographic',\n", + " 'shi',\n", + " 'toyomichi',\n", + " 'bits',\n", + " 'psychotics',\n", + " 'barzell',\n", + " 'colt',\n", + " 'shivering',\n", + " 'pugilist',\n", + " 'gladiator',\n", + " 'dryer',\n", + " 'reissues',\n", + " 'scrivener',\n", + " 'predicable',\n", + " 'objection',\n", + " 'marmalade',\n", + " 'seems',\n", + " 'spellbind',\n", + " 'trifecta',\n", + " 'innovator',\n", + " 'shriekfest',\n", + " 'inthused',\n", + " 'contestants',\n", + " 'goody',\n", + " 'samotri',\n", + " 'serviced',\n", + " 'nozires',\n", + " 'ins',\n", + " 'mutilating',\n", + " 'dupes',\n", + " 'launius',\n", + " 'widescreen',\n", + " 'joo',\n", + " 'discretionary',\n", + " 'enlivens',\n", + " 'bushes',\n", + " 'chills',\n", + " 'header',\n", + " 'activist',\n", + " 'gethsemane',\n", + " 'phoenixs',\n", + " 'wreathed',\n", + " 'sacrine',\n", + " 'electrifyingly',\n", + " 'basely',\n", + " 'ghidora',\n", + " 'binder',\n", + " 'dogfights',\n", + " 'sugar',\n", + " 'doddsville',\n", + " 'porkys',\n", + " 'scattershot',\n", + " 'refunded',\n", + " 'rudely',\n", + " 'insteadit',\n", + " 'zatichi',\n", + " 'eurotrash',\n", + " 'radioraptus',\n", + " 'hurls',\n", + " 'boogeman',\n", + " 'weighs',\n", + " 'danniele',\n", + " 'converging',\n", + " 'hypothermia',\n", + " 'glorfindel',\n", + " 'birthdays',\n", + " 'attentive',\n", + " 'mallepa',\n", + " 'spacewalk',\n", + " 'manoy',\n", + " 'bombshells',\n", + " 'farts',\n", + " 'lyoko',\n", + " 'southron',\n", + " 'destruction',\n", + " 'flemming',\n", + " 'manhole',\n", + " 'elainor',\n", + " 'bowersock',\n", + " 'lowly',\n", + " 'wfst',\n", + " 'limousines',\n", + " 'skolimowski',\n", + " 'saban',\n", + " 'koen',\n", + " 'malaysia',\n", + " 'uwi',\n", + " 'cyd',\n", + " 'apeing',\n", + " 'bonecrushing',\n", + " 'dini',\n", + " 'merest',\n", + " 'janina',\n", + " 'chemotrodes',\n", + " 
'trials',\n", + " 'authorize',\n", + " 'whilhelm',\n", + " 'asthmatic',\n", + " 'broads',\n", + " 'missteps',\n", + " 'embittered',\n", + " 'chandeliers',\n", + " 'seeming',\n", + " 'miscalculate',\n", + " 'recommeded',\n", + " 'schoolwork',\n", + " 'coy',\n", + " 'mcconaughey',\n", + " 'philosophically',\n", + " 'waver',\n", + " 'fanny',\n", + " 'mestressat',\n", + " 'unwatchably',\n", + " 'saggy',\n", + " 'topness',\n", + " 'dwellings',\n", + " 'breakup',\n", + " 'hasselhoff',\n", + " 'superstars',\n", + " 'replay',\n", + " 'aggravates',\n", + " 'balances',\n", + " 'urging',\n", + " 'snidely',\n", + " 'aleksandar',\n", + " 'hildy',\n", + " 'kazuhiro',\n", + " 'slayer',\n", + " 'tangy',\n", + " 'brussels',\n", + " 'horne',\n", + " 'masayuki',\n", + " 'molden',\n", + " 'unravel',\n", + " 'goodtime',\n", + " 'interrogates',\n", + " 'bismillahhirrahmannirrahim',\n", + " 'rowboat',\n", + " 'dumann',\n", + " 'datedness',\n", + " 'astrotheology',\n", + " 'dekhiye',\n", + " 'valga',\n", + " 'kata',\n", + " 'wipes',\n", + " 'hostilities',\n", + " 'sentimentalising',\n", + " 'documentary',\n", + " 'salesman',\n", + " 'virtue',\n", + " 'unreasonably',\n", + " 'haver',\n", + " 'cei',\n", + " 'unglamorised',\n", + " 'balky',\n", + " 'complementary',\n", + " 'paychecks',\n", + " 'mnica',\n", + " 'wada',\n", + " 'ily',\n", + " 'prc',\n", + " 'ennobling',\n", + " 'functionality',\n", + " 'dissociated',\n", + " 'elk',\n", + " 'throbbing',\n", + " 'tempe',\n", + " 'linoleum',\n", + " 'photogrsphed',\n", + " 'bottacin',\n", + " 'hipper',\n", + " 'titillating',\n", + " 'barging',\n", + " 'untie',\n", + " 'sacchetti',\n", + " 'gnat',\n", + " 'roedel',\n", + " 'cohabitation',\n", + " 'performs',\n", + " 'sales',\n", + " 'migrs',\n", + " 'teachs',\n", + " 'nanavati',\n", + " 'fresco',\n", + " 'davison',\n", + " 'obstinate',\n", + " 'burglar',\n", + " 'masue',\n", + " 'dickory',\n", + " 'grills',\n", + " 'appelagate',\n", + " 'linkage',\n", + " 'enables',\n", + " 'loesser',\n", + " 
'patties',\n", + " 'prudent',\n", + " 'mallorquins',\n", + " 'nativetex',\n", + " 'suprise',\n", + " 'drippy',\n", + " 'quill',\n", + " 'speeded',\n", + " 'farscape',\n", + " 'saddening',\n", + " 'centuries',\n", + " 'mos',\n", + " 'improvisationally',\n", + " 'neccessarily',\n", + " 'transmitter',\n", + " 'tankers',\n", + " 'latte',\n", + " 'mechanisation',\n", + " 'faracy',\n", + " 'synthetically',\n", + " 'thoughtless',\n", + " 'rake',\n", + " 'ropes',\n", + " 'desirable',\n", + " 'whitewashed',\n", + " 'donal',\n", + " 'crabby',\n", + " 'lifeless',\n", + " 'perfidy',\n", + " 'teresa',\n", + " 'bulldog',\n", + " 'cockamamie',\n", + " 'rasberries',\n", + " 'notethe',\n", + " 'captivity',\n", + " 'chiseling',\n", + " 'smaller',\n", + " 'clampets',\n", + " 'alerts',\n", + " 'tough',\n", + " 'wellingtonian',\n", + " 'aaaahhhhhhh',\n", + " 'dither',\n", + " 'incertitude',\n", + " 'florentine',\n", + " 'imperioli',\n", + " 'licking',\n", + " 'disparagement',\n", + " 'artfully',\n", + " 'feds',\n", + " 'fumiya',\n", + " 'tearfully',\n", + " 'lanchester',\n", + " 'undertaken',\n", + " 'longlost',\n", + " 'netted',\n", + " 'carrell',\n", + " 'uncompelling',\n", + " 'reliefs',\n", + " 'leona',\n", + " 'autorenfilm',\n", + " 'unfriendly',\n", + " 'typewriter',\n", + " 'shifted',\n", + " 'bertrand',\n", + " 'blesses',\n", + " 'tricking',\n", + " 'fireflies',\n", + " 'zanes',\n", + " 'unknowingly',\n", + " 'unnerve',\n", + " 'caning',\n", + " 'flat',\n", + " 'recluse',\n", + " 'dcreasy',\n", + " 'chipmunk',\n", + " 'dipper',\n", + " 'musee',\n", + " 'cousin',\n", + " 'shys',\n", + " 'berserkers',\n", + " 'eve',\n", + " 'conflagration',\n", + " 'irks',\n", + " 'restricts',\n", + " 'parsing',\n", + " 'positronic',\n", + " 'copout',\n", + " 'khala',\n", + " 'swiftness',\n", + " 'higginson',\n", + " 'imprint',\n", + " 'walter',\n", + " 'sundance',\n", + " 'whispering',\n", + " 'thematically',\n", + " 'underimpressed',\n", + " 'uno',\n", + " 'expressly',\n", + " 'russkies',\n", + 
" 'discos',\n", + " 'shaping',\n", + " 'verson',\n", + " 'prototype',\n", + " 'chapman',\n", + " 'trafficker',\n", + " 'semetary',\n", + " 'unrealistically',\n", + " 'lifewell',\n", + " 'rivas',\n", + " 'consequent',\n", + " 'katsu',\n", + " 'titantic',\n", + " 'jalees',\n", + " 'ranee',\n", + " 'shipbuilding',\n", + " 'gambles',\n", + " 'dispenses',\n", + " 'disfigurement',\n", + " 'bright',\n", + " 'cristian',\n", + " 'puertorricans',\n", + " 'constituent',\n", + " 'capta',\n", + " 'jewel',\n", + " 'erect',\n", + " 'farah',\n", + " 'despondently',\n", + " 'avoide',\n", + " 'inconnu',\n", + " 'headquarters',\n", + " 'sanguisga',\n", + " ...]" + ] + }, + "execution_count": 75, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "list(vocab)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 0., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 46, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "import numpy as np\n", + "\n", + "layer_0 = np.zeros((1,vocab_size))\n", + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 47, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "{'': 0,\n", + " 'inhabitants': 1,\n", + " 'goku': 2,\n", + " 'stunts': 3,\n", + " 'catepillar': 4,\n", + " 'kristensen': 5,\n", + " 'goddess': 7,\n", + " 'offing': 49797,\n", + " 'distroy': 8,\n", + " 'unexplainably': 9,\n", + " 'concoctions': 10,\n", + " 'petite': 11,\n", + " 'paramilitary': 24759,\n", + " 'scribe': 12,\n", + " 'stevson': 13,\n", + " 'senegal': 6,\n", + " 'sctv': 14,\n", + " 'soundscape': 15,\n", + " 'rana': 16,\n", + " 'immortalizer': 18,\n", + " 'rene': 67354,\n", + " 'eko': 23,\n", + " 'planning': 20,\n", + " 'akiva': 21,\n", + " 'plod': 22,\n", + " 'orderly': 24,\n", + " 'zeleznice': 25,\n", + " 'critize': 29,\n", + " 'baguettes': 25649,\n", + " 'jefferies': 30,\n", + " 'uncertainties': 61695,\n", + " 'mountainbillies': 31,\n", + " 'steinbichler': 32,\n", + " 'vowel': 33,\n", + " 'rafe': 34,\n", + " 'donig': 68719,\n", + " 
'tulipe': 36,\n", + " 'clot': 37,\n", + " 'hack': 12526,\n", + " 'distended': 38,\n", + " 'cornered': 37116,\n", + " 'impatiently': 40,\n", + " 'batrice': 12525,\n", + " 'unfortuntly': 41,\n", + " 'lung': 42,\n", + " 'scapegoats': 43,\n", + " 'pscychosexual': 45,\n", + " 'outbid': 46,\n", + " 'obit': 47,\n", + " 'sideshows': 48,\n", + " 'jugde': 49,\n", + " 'kevloun': 51,\n", + " 'quartier': 53,\n", + " 'harp': 61948,\n", + " 'unravelling': 54,\n", + " 'antiques': 56,\n", + " 'strutts': 57,\n", + " 'tilts': 58,\n", + " 'disconcert': 59,\n", + " 'dossiers': 60,\n", + " 'sorriest': 61,\n", + " 'craftsman': 49412,\n", + " 'blart': 62,\n", + " 'dependence': 37120,\n", + " 'sated': 61698,\n", + " 'iberia': 63,\n", + " 'sagan': 72,\n", + " 'frmann': 65,\n", + " 'daniell': 66,\n", + " 'rays': 67,\n", + " 'pried': 68,\n", + " 'khoobsurat': 69,\n", + " 'leavitt': 70,\n", + " 'caiano': 71,\n", + " 'attractiveness': 73,\n", + " 'kitaparaporn': 74,\n", + " 'hamilton': 75,\n", + " 'massages': 76,\n", + " 'horgan': 78,\n", + " 'chemist': 79,\n", + " 'audrey': 80,\n", + " 'yeow': 55655,\n", + " 'jana': 81,\n", + " 'dutch': 82,\n", + " 'pinchot': 24773,\n", + " 'override': 83,\n", + " 'dwervick': 63223,\n", + " 'spasms': 84,\n", + " 'resumed': 85,\n", + " 'tamale': 66259,\n", + " 'calibanian': 49636,\n", + " 'stinson': 86,\n", + " 'widows': 87,\n", + " 'stonewall': 88,\n", + " 'palatial': 89,\n", + " 'neuman': 90,\n", + " 'abandon': 91,\n", + " 'lemmings': 65314,\n", + " 'anglophile': 92,\n", + " 'ertha': 61706,\n", + " 'chevette': 94,\n", + " 'unscary': 95,\n", + " 'spoilerific': 97,\n", + " 'neworleans': 67639,\n", + " 'metamorphose': 17,\n", + " 'brigand': 99,\n", + " 'cheating': 41603,\n", + " 'clued': 101,\n", + " 'dermatonecrotic': 102,\n", + " 'grady': 103,\n", + " 'mulligan': 104,\n", + " 'ol': 105,\n", + " 'incubation': 107,\n", + " 'plaintiffs': 110,\n", + " 'snden': 109,\n", + " 'fk': 111,\n", + " 'deply': 112,\n", + " 'franchot': 113,\n", + " 'henstridge': 19,\n", + " 
'cyhper': 114,\n", + " 'verbose': 26,\n", + " 'mazovia': 116,\n", + " 'elizabeth': 117,\n", + " 'palestine': 118,\n", + " 'robby': 119,\n", + " 'wongo': 120,\n", + " 'moshing': 121,\n", + " 'mstified': 12543,\n", + " 'eeeee': 122,\n", + " 'doltish': 123,\n", + " 'bree': 124,\n", + " 'postponed': 125,\n", + " 'debacles': 127,\n", + " 'amplify': 27,\n", + " 'kamm': 128,\n", + " 'phantom': 18893,\n", + " 'boylen': 136,\n", + " 'rolando': 131,\n", + " 'premises': 133,\n", + " 'bruck': 134,\n", + " 'loosely': 135,\n", + " 'wodehousian': 139,\n", + " 'onishi': 70389,\n", + " 'encapsuling': 140,\n", + " 'partly': 141,\n", + " 'stadling': 144,\n", + " 'calms': 143,\n", + " 'darkie': 148,\n", + " 'wheeling': 147,\n", + " 'ursla': 15875,\n", + " 'subsidized': 49420,\n", + " 'mckellar': 149,\n", + " 'ooookkkk': 151,\n", + " 'milky': 152,\n", + " 'unfolded': 153,\n", + " 'degrades': 154,\n", + " 'authenticating': 155,\n", + " 'writeup': 12548,\n", + " 'rotheroe': 156,\n", + " 'beart': 157,\n", + " 'intoxicants': 160,\n", + " 'grispin': 159,\n", + " 'cannes': 61718,\n", + " 'antithetical': 70398,\n", + " 'nnette': 161,\n", + " 'tsukamoto': 163,\n", + " 'antwones': 44205,\n", + " 'stows': 164,\n", + " 'suddenness': 165,\n", + " 'vol': 61720,\n", + " 'waqt': 166,\n", + " 'camazotz': 168,\n", + " 'paps': 55042,\n", + " 'shakher': 170,\n", + " 'terminate': 63868,\n", + " 'kotex': 56419,\n", + " 'delinquency': 171,\n", + " 'bromwell': 25214,\n", + " 'insecticide': 173,\n", + " 'charlton': 174,\n", + " 'nakada': 177,\n", + " 'titted': 24791,\n", + " 'urbane': 178,\n", + " 'depicted': 54491,\n", + " 'sadomasochistic': 179,\n", + " 'hyping': 181,\n", + " 'yr': 182,\n", + " 'hebert': 183,\n", + " 'waxwork': 12990,\n", + " 'deathrow': 185,\n", + " 'nourishes': 24792,\n", + " 'unmediated': 187,\n", + " 'tamper': 37143,\n", + " 'soad': 190,\n", + " 'alphabet': 189,\n", + " 'donen': 191,\n", + " 'lord': 192,\n", + " 'recess': 193,\n", + " 'watchably': 61023,\n", + " 'handsome': 194,\n", + " 
'vignettes': 196,\n", + " 'pairings': 198,\n", + " 'uselful': 199,\n", + " 'sanders': 200,\n", + " 'outbursts': 72891,\n", + " 'nots': 201,\n", + " 'hatsumomo': 202,\n", + " 'actioned': 18292,\n", + " 'krimi': 24797,\n", + " 'appleby': 203,\n", + " 'tampax': 204,\n", + " 'sprinkling': 205,\n", + " 'defacing': 206,\n", + " 'lofty': 207,\n", + " 'verger': 213,\n", + " 'tablespoons': 211,\n", + " 'bernhard': 212,\n", + " 'goosebump': 64565,\n", + " 'acumen': 214,\n", + " 'percentages': 215,\n", + " 'wendingo': 216,\n", + " 'resonating': 217,\n", + " 'vntoarea': 218,\n", + " 'redundancies': 219,\n", + " 'strictly': 57081,\n", + " 'pitied': 221,\n", + " 'belying': 222,\n", + " 'michelangelo': 53153,\n", + " 'gleefulness': 223,\n", + " 'environmentalist': 24803,\n", + " 'gitane': 226,\n", + " 'corrected': 66547,\n", + " 'journalist': 227,\n", + " 'focusing': 228,\n", + " 'plethora': 229,\n", + " 'his': 39,\n", + " 'citizen': 230,\n", + " 'south': 55579,\n", + " 'clunkers': 232,\n", + " 'pendulous': 55991,\n", + " 'mounds': 24805,\n", + " 'deplorable': 233,\n", + " 'forgive': 234,\n", + " 'proplems': 235,\n", + " 'bankers': 237,\n", + " 'aqua': 238,\n", + " 'donated': 239,\n", + " 'disbelieving': 240,\n", + " 'acomplication': 241,\n", + " 'contrasted': 243,\n", + " 'muzzle': 44,\n", + " 'amphibians': 72141,\n", + " 'springs': 246,\n", + " 'reformatted': 49443,\n", + " 'toolbox': 247,\n", + " 'contacting': 248,\n", + " 'washrooms': 250,\n", + " 'raving': 251,\n", + " 'dynamism': 252,\n", + " 'mae': 253,\n", + " 'disharmony': 255,\n", + " 'molls': 72979,\n", + " 'dewaere': 12569,\n", + " 'untutored': 256,\n", + " 'icarus': 257,\n", + " 'taint': 258,\n", + " 'kargil': 259,\n", + " 'captain': 260,\n", + " 'paucity': 261,\n", + " 'fits': 262,\n", + " 'tumbles': 263,\n", + " 'amer': 264,\n", + " 'bueller': 265,\n", + " 'cleansed': 267,\n", + " 'shara': 269,\n", + " 'humma': 270,\n", + " 'outa': 272,\n", + " 'piglets': 273,\n", + " 'gombell': 274,\n", + " 'supermen': 275,\n", + 
" 'superlow': 276,\n", + " 'kubanskie': 280,\n", + " 'goode': 278,\n", + " 'disorganised': 45570,\n", + " 'zenith': 281,\n", + " 'ananda': 282,\n", + " 'matlin': 284,\n", + " 'particolare': 50,\n", + " 'presumptuous': 286,\n", + " 'rerun': 287,\n", + " 'toyko': 288,\n", + " 'bilb': 291,\n", + " 'sundry': 290,\n", + " 'fugly': 292,\n", + " 'orchestrating': 293,\n", + " 'prosaically': 294,\n", + " 'moveis': 296,\n", + " 'conelly': 297,\n", + " 'estrange': 298,\n", + " 'elfriede': 49455,\n", + " 'masterful': 52,\n", + " 'seasonings': 300,\n", + " 'quincey': 303,\n", + " 'frowning': 49456,\n", + " 'painkillers': 53444,\n", + " 'high': 25515,\n", + " 'flesh': 304,\n", + " 'tootsie': 305,\n", + " 'ai': 306,\n", + " 'tenma': 307,\n", + " 'duguay': 71257,\n", + " 'appropriations': 308,\n", + " 'ides': 310,\n", + " 'rui': 61734,\n", + " 'surrogacy': 311,\n", + " 'pungent': 312,\n", + " 'damaso': 314,\n", + " 'authoritarian': 61736,\n", + " 'caribou': 315,\n", + " 'ro': 318,\n", + " 'supplying': 317,\n", + " 'yuy': 319,\n", + " 'debuted': 321,\n", + " 'mounts': 323,\n", + " 'interpolated': 324,\n", + " 'aetv': 325,\n", + " 'plummer': 326,\n", + " 'asunder': 331,\n", + " 'airfix': 333,\n", + " 'dubiel': 329,\n", + " 'clavichord': 330,\n", + " 'crafty': 50465,\n", + " 'sublety': 332,\n", + " 'stoltzfus': 334,\n", + " 'ruth': 335,\n", + " 'fluorescent': 336,\n", + " 'improves': 337,\n", + " 'russells': 339,\n", + " 'tick': 43838,\n", + " 'zsa': 341,\n", + " 'macs': 343,\n", + " 'jlb': 345,\n", + " 'locus': 348,\n", + " 'mislead': 349,\n", + " 'merly': 49461,\n", + " 'corey': 350,\n", + " 'blundered': 351,\n", + " 'humourless': 3568,\n", + " 'disorganized': 353,\n", + " 'discuss': 354,\n", + " 'sharifi': 45391,\n", + " 'tieing': 356,\n", + " 'kats': 34784,\n", + " 'bbc': 360,\n", + " 'pranked': 362,\n", + " 'superman': 363,\n", + " 'holroyd': 9223,\n", + " 'aggravated': 364,\n", + " 'rifleman': 365,\n", + " 'yvone': 366,\n", + " 'vaugier': 24820,\n", + " 'radiant': 367,\n", + " 
'galico': 368,\n", + " 'debris': 369,\n", + " 'btw': 371,\n", + " 'denote': 24822,\n", + " 'havnt': 372,\n", + " 'francen': 373,\n", + " 'chattered': 374,\n", + " 'scathed': 375,\n", + " 'pic': 376,\n", + " 'ceremonies': 377,\n", + " 'everyplace': 65309,\n", + " 'betsy': 379,\n", + " 'finster': 37176,\n", + " 'meercat': 381,\n", + " 'noirs': 382,\n", + " 'grunts': 383,\n", + " 'tribulations': 385,\n", + " 'apparatus': 47673,\n", + " 'martnez': 25825,\n", + " 'telethons': 24825,\n", + " 'talladega': 387,\n", + " 'alloimono': 390,\n", + " 'situations': 64,\n", + " 'scrutinising': 391,\n", + " 'geta': 392,\n", + " 'beltrami': 393,\n", + " 'pvc': 394,\n", + " 'horse': 395,\n", + " 'tiburon': 396,\n", + " 'huitime': 397,\n", + " 'ripple': 398,\n", + " 'exceed': 61748,\n", + " 'loitering': 399,\n", + " 'forensics': 400,\n", + " 'nearly': 401,\n", + " 'ellington': 403,\n", + " 'uzi': 404,\n", + " 'rung': 408,\n", + " 'pillaged': 24829,\n", + " 'gao': 409,\n", + " 'licitates': 410,\n", + " 'protocol': 411,\n", + " 'smirker': 412,\n", + " 'torin': 413,\n", + " 'vizier': 31853,\n", + " 'newlywed': 414,\n", + " 'dismay': 416,\n", + " 'moonwalks': 418,\n", + " 'skyler': 417,\n", + " 'invested': 18455,\n", + " 'grifter': 421,\n", + " 'undersold': 422,\n", + " 'chearator': 423,\n", + " 'marino': 424,\n", + " 'scala': 425,\n", + " 'conditioner': 426,\n", + " 'lamarre': 428,\n", + " 'figueroa': 429,\n", + " 'mcinnerny': 61753,\n", + " 'allllllll': 431,\n", + " 'slide': 432,\n", + " 'lateness': 433,\n", + " 'selbst': 434,\n", + " 'dramatizing': 436,\n", + " 'doable': 438,\n", + " 'hollywoodize': 27207,\n", + " 'alexanderplatz': 440,\n", + " 'wholesome': 45745,\n", + " 'pandemonium': 441,\n", + " 'earth': 443,\n", + " 'mounties': 444,\n", + " 'seeker': 445,\n", + " 'cheat': 446,\n", + " 'outbreaks': 447,\n", + " 'savagely': 61759,\n", + " 'snowstorm': 448,\n", + " 'baur': 449,\n", + " 'schedules': 450,\n", + " 'bathetic': 451,\n", + " 'johnathon': 453,\n", + " 'origonal': 57843,\n", 
+ " 'rosanne': 454,\n", + " 'cauldrons': 456,\n", + " 'forrest': 457,\n", + " 'poky': 458,\n", + " 'aristos': 54856,\n", + " 'womanness': 460,\n", + " 'spender': 461,\n", + " 'pagliai': 37108,\n", + " 'rational': 463,\n", + " 'terrell': 464,\n", + " 'affronts': 472,\n", + " 'concise': 49476,\n", + " 'mathew': 468,\n", + " 'narnia': 469,\n", + " 'naseeruddin': 470,\n", + " 'bucks': 471,\n", + " 'proceeds': 69809,\n", + " 'topple': 473,\n", + " 'degree': 474,\n", + " 'passionately': 476,\n", + " 'defeats': 477,\n", + " 'gras': 49477,\n", + " 'sources': 479,\n", + " 'pflug': 49976,\n", + " 'botticelli': 480,\n", + " 'fwd': 486,\n", + " 'waiving': 483,\n", + " 'gunnar': 484,\n", + " 'stiffler': 485,\n", + " 'unwise': 49480,\n", + " 'kawajiri': 487,\n", + " 'sistahs': 489,\n", + " 'swallowed': 30511,\n", + " 'soulhunter': 490,\n", + " 'belies': 491,\n", + " 'wrathful': 492,\n", + " 'badmouth': 16696,\n", + " 'floradora': 61766,\n", + " 'unforgivably': 497,\n", + " 'weirdy': 496,\n", + " 'violation': 63309,\n", + " 'chepart': 498,\n", + " 'departmentthe': 500,\n", + " 'posehn': 49483,\n", + " 'peyote': 37188,\n", + " 'psychiatrically': 24846,\n", + " 'marionettes': 503,\n", + " 'blatty': 502,\n", + " 'atop': 504,\n", + " 'debases': 25135,\n", + " 'henze': 24845,\n", + " 'unrooted': 510,\n", + " 'cloudscape': 508,\n", + " 'resignedly': 509,\n", + " 'begin': 49917,\n", + " 'hitlerian': 512,\n", + " 'reedus': 517,\n", + " 'crewed': 514,\n", + " 'bedeviled': 515,\n", + " 'unfurnished': 516,\n", + " 'herrmann': 12602,\n", + " 'circumstances': 518,\n", + " 'grasped': 519,\n", + " 'fn': 521,\n", + " 'beefed': 22200,\n", + " 'scwatch': 64018,\n", + " 'dishwashers': 522,\n", + " 'roadie': 523,\n", + " 'ruthlessness': 524,\n", + " 'migrant': 12605,\n", + " 'refrains': 525,\n", + " 'preponderance': 44377,\n", + " 'lampooning': 526,\n", + " 'richart': 528,\n", + " 'gwenneth': 530,\n", + " 'enmity': 531,\n", + " 'vortex': 61772,\n", + " 'assess': 532,\n", + " 'manufacturer': 533,\n", 
+ " 'bullosa': 534,\n", + " 'citizenship': 61774,\n", + " 'chekov': 537,\n", + " 'hogan': 536,\n", + " 'blithe': 538,\n", + " 'aredavid': 542,\n", + " 'drillings': 540,\n", + " 'revolvers': 541,\n", + " 'boyfriendhe': 545,\n", + " 'achcha': 544,\n", + " 'wallow': 546,\n", + " 'toga': 547,\n", + " 'bosnians': 551,\n", + " 'going': 550,\n", + " 'willy': 552,\n", + " 'fim': 554,\n", + " 'forbidding': 555,\n", + " 'delete': 56779,\n", + " 'rationalised': 557,\n", + " 'shimomo': 558,\n", + " 'opposition': 559,\n", + " 'landis': 560,\n", + " 'minded': 561,\n", + " 'arghhhhh': 564,\n", + " 'trialat': 566,\n", + " 'protected': 567,\n", + " 'negras': 568,\n", + " 'tracker': 571,\n", + " 'muti': 570,\n", + " 'dinky': 49489,\n", + " 'shawl': 572,\n", + " 'differentiates': 573,\n", + " 'dipaolo': 61779,\n", + " 'sweetheart': 574,\n", + " 'manmohan': 576,\n", + " 'enamored': 66265,\n", + " 'trevethyn': 577,\n", + " 'brain': 578,\n", + " 'incomprehensibly': 579,\n", + " 'pasadena': 581,\n", + " 'bruton': 59142,\n", + " 'shtick': 582,\n", + " 'ute': 583,\n", + " 'viggo': 584,\n", + " 'relevent': 589,\n", + " 'cites': 587,\n", + " 'greenaways': 61781,\n", + " 'minidress': 590,\n", + " 'philosopher': 591,\n", + " 'mahattan': 593,\n", + " 'moden': 594,\n", + " 'compiling': 595,\n", + " 'unimaginative': 598,\n", + " 'rogues': 597,\n", + " 'subpaar': 599,\n", + " 'darkly': 601,\n", + " 'saturate': 602,\n", + " 'fledgling': 603,\n", + " 'breaths': 604,\n", + " 'sceam': 37206,\n", + " 'empathized': 58870,\n", + " 'aszombi': 606,\n", + " 'incalculable': 608,\n", + " 'formations': 28596,\n", + " 'hampden': 619,\n", + " 'rawail': 612,\n", + " 'forbid': 613,\n", + " 'holiness': 617,\n", + " 'unessential': 618,\n", + " 'reputedly': 616,\n", + " 'wage': 63181,\n", + " 'kewpie': 24860,\n", + " 'asylum': 620,\n", + " 'bolye': 621,\n", + " 'celticism': 63189,\n", + " 'strangers': 622,\n", + " 'rantzen': 623,\n", + " 'farrellys': 624,\n", + " 'marathon': 93,\n", + " 'cantinflas': 626,\n", + " 
'disproportionately': 12617,\n", + " 'bared': 67212,\n", + " 'enshrined': 627,\n", + " 'expetations': 629,\n", + " 'replaying': 630,\n", + " 'topless': 636,\n", + " 'bukater': 632,\n", + " 'overpaid': 633,\n", + " 'exhude': 634,\n", + " 'nitwits': 638,\n", + " 'tsst': 51554,\n", + " 'sufferings': 637,\n", + " 'ci': 24693,\n", + " 'eponymously': 96,\n", + " 'ferdy': 644,\n", + " 'danira': 641,\n", + " 'unrelenting': 642,\n", + " 'disabling': 643,\n", + " 'gerard': 645,\n", + " 'drewitt': 646,\n", + " 'lamping': 650,\n", + " 'demy': 652,\n", + " 'wicklow': 37214,\n", + " 'relinquish': 651,\n", + " 'feminized': 64196,\n", + " 'drink': 653,\n", + " 'chamberlin': 654,\n", + " 'floodwaters': 657,\n", + " 'searing': 658,\n", + " 'isral': 659,\n", + " 'ling': 660,\n", + " 'grossness': 661,\n", + " 'sassier': 24865,\n", + " 'pickier': 662,\n", + " 'pax': 663,\n", + " 'fleashens': 98,\n", + " 'wierd': 664,\n", + " 'tereasa': 665,\n", + " 'smog': 666,\n", + " 'girotti': 667,\n", + " 'zooey': 64814,\n", + " 'spat': 668,\n", + " 'sera': 669,\n", + " 'misbehaving': 671,\n", + " 'scouts': 672,\n", + " 'refreshments': 673,\n", + " 'itll': 39668,\n", + " 'toyomichi': 676,\n", + " 'politeness': 100,\n", + " 'bits': 677,\n", + " 'psychotics': 678,\n", + " 'optimistic': 61796,\n", + " 'barzell': 679,\n", + " 'colt': 680,\n", + " 'anita': 49501,\n", + " 'shivering': 681,\n", + " 'utah': 59297,\n", + " 'scrivener': 686,\n", + " 'predicable': 687,\n", + " 'dryer': 684,\n", + " 'reissues': 685,\n", + " 'sexier': 26115,\n", + " 'spellbind': 691,\n", + " 'marmalade': 689,\n", + " 'seems': 690,\n", + " 'wyke': 37223,\n", + " 'innovator': 693,\n", + " 'inthused': 695,\n", + " 'scatman': 6309,\n", + " 'contestants': 696,\n", + " 'bertolucci': 106,\n", + " 'serviced': 699,\n", + " 'nozires': 700,\n", + " 'ins': 701,\n", + " 'mutilating': 702,\n", + " 'dupes': 703,\n", + " 'launius': 704,\n", + " 'widescreen': 705,\n", + " 'joo': 706,\n", + " 'discretionary': 707,\n", + " 'enlivens': 708,\n", + 
" 'manos': 55596,\n", + " 'bushes': 709,\n", + " 'haifa': 
1043,\n", + " ...}" + ] + }, + "execution_count": 48, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "word2index = {}\n", + "\n", + "for i,word in enumerate(vocab):\n", + " word2index[word] = i\n", + "word2index" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_target_for_label(label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 54, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "1" + ] + }, + "execution_count": 52, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 55, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'NEGATIVE'" + 
] + }, + "execution_count": 55, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[1]" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 53, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[1])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 3: Building a Neural Network" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "- Start with your neural network from the last chapter\n", + "- 3 layer neural network\n", + "- no non-linearity in hidden layer\n", + "- use our functions to create the training data\n", + "- create a \"pre_process_data\" function to create vocabulary for our training data generating functions\n", + "- modify \"train\" to train over the entire corpus" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Where to Get Help if You Need it\n", + "- Re-watch previous week's Udacity Lectures\n", + "- Chapters 3-5 - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) - (40% Off: **traskud17**)" + ] + }, + { + "cell_type": "code", + "execution_count": 86, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " # set our random number generator \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews, labels)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self, reviews, labels):\n", + " \n", + " review_vocab = 
set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " \n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " self.layer_0[0][self.word2index[word]] += 1\n", + " \n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def train(self, training_reviews, 
training_labels):\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + " self.update_input_layer(review)\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward pass ###\n", + "\n", + " # TODO: Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error: the network's actual output minus the desired target.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # TODO: Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity, so the delta equals the error\n", + "\n", + " # TODO: Update the weights\n", + " self.weights_1_2 -= layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " self.weights_0_1 -= self.layer_0.T.dot(layer_1_delta) * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / 
float(i+1))[:4] + \"%\")\n", + " if(i % 2500 == 0):\n", + " print(\"\")\n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \" #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + " self.update_input_layer(review.lower())\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] > 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 87, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 61, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):587.5 #Correct:500 #Tested:1000 Testing Accuracy:50.0%" + ] + } + ], + "source": [ + "# evaluate our model before training (just to show how horrible it is)\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "code", + "execution_count": 62, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% 
Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):89.58 #Correct:1250 #Trained:2501 Training Accuracy:49.9%\n", + "Progress:20.8% Speed(reviews/sec):95.03 #Correct:2500 #Trained:5001 Training Accuracy:49.9%\n", + "Progress:27.4% Speed(reviews/sec):95.46 #Correct:3295 #Trained:6592 Training Accuracy:49.9%" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output 
weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 63, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.01)" + ] + }, + { + "cell_type": "code", + "execution_count": 64, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):96.39 #Correct:1247 #Trained:2501 Training Accuracy:49.8%\n", + "Progress:20.8% Speed(reviews/sec):99.31 #Correct:2497 #Trained:5001 Training Accuracy:49.9%\n", + "Progress:22.8% Speed(reviews/sec):99.02 #Correct:2735 #Trained:5476 Training Accuracy:49.9%" + ] + }, + { + "ename": 
"KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m \u001b[0;34m-=\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 65, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.001)" + ] + }, + { + "cell_type": "code", + "execution_count": 66, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):98.77 #Correct:1267 #Trained:2501 Training Accuracy:50.6%\n", + "Progress:20.8% Speed(reviews/sec):98.79 #Correct:2640 #Trained:5001 Training Accuracy:52.7%\n", + "Progress:31.2% Speed(reviews/sec):98.58 #Correct:4109 #Trained:7501 Training Accuracy:54.7%\n", + "Progress:41.6% Speed(reviews/sec):93.78 #Correct:5638 #Trained:10001 Training Accuracy:56.3%\n", + "Progress:52.0% Speed(reviews/sec):91.76 #Correct:7246 #Trained:12501 Training Accuracy:57.9%\n", + "Progress:62.5% 
Speed(reviews/sec):92.42 #Correct:8841 #Trained:15001 Training Accuracy:58.9%\n", + "Progress:69.4% Speed(reviews/sec):92.58 #Correct:9934 #Trained:16668 Training Accuracy:59.5%" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m 
\u001b[0;34m-=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Understanding Neural Noise" + ] + }, + { + "cell_type": "code", + "execution_count": 67, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
[base64-encoded PNG image data elided]\n", + "text/plain": [ + "" + ] + }, + "execution_count": 67, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 70, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 71, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 71, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 79, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "review_counter = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 80, + "metadata": { + "collapsed": true + }, + 
"outputs": [], + "source": [ + "for word in reviews[0].split(\" \"):\n", + " review_counter[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 81, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('.', 27),\n", + " ('', 18),\n", + " ('the', 9),\n", + " ('to', 6),\n", + " ('i', 5),\n", + " ('high', 5),\n", + " ('is', 4),\n", + " ('of', 4),\n", + " ('a', 4),\n", + " ('bromwell', 4),\n", + " ('teachers', 4),\n", + " ('that', 4),\n", + " ('their', 2),\n", + " ('my', 2),\n", + " ('at', 2),\n", + " ('as', 2),\n", + " ('me', 2),\n", + " ('in', 2),\n", + " ('students', 2),\n", + " ('it', 2),\n", + " ('student', 2),\n", + " ('school', 2),\n", + " ('through', 1),\n", + " ('insightful', 1),\n", + " ('ran', 1),\n", + " ('years', 1),\n", + " ('here', 1),\n", + " ('episode', 1),\n", + " ('reality', 1),\n", + " ('what', 1),\n", + " ('far', 1),\n", + " ('t', 1),\n", + " ('saw', 1),\n", + " ('s', 1),\n", + " ('repeatedly', 1),\n", + " ('isn', 1),\n", + " ('closer', 1),\n", + " ('and', 1),\n", + " ('fetched', 1),\n", + " ('remind', 1),\n", + " ('can', 1),\n", + " ('welcome', 1),\n", + " ('line', 1),\n", + " ('your', 1),\n", + " ('survive', 1),\n", + " ('teaching', 1),\n", + " ('satire', 1),\n", + " ('classic', 1),\n", + " ('who', 1),\n", + " ('age', 1),\n", + " ('knew', 1),\n", + " ('schools', 1),\n", + " ('inspector', 1),\n", + " ('comedy', 1),\n", + " ('down', 1),\n", + " ('about', 1),\n", + " ('pity', 1),\n", + " ('m', 1),\n", + " ('all', 1),\n", + " ('adults', 1),\n", + " ('see', 1),\n", + " ('think', 1),\n", + " ('situation', 1),\n", + " ('time', 1),\n", + " ('pomp', 1),\n", + " ('lead', 1),\n", + " ('other', 1),\n", + " ('much', 1),\n", + " ('many', 1),\n", + " ('which', 1),\n", + " ('one', 1),\n", + " ('profession', 1),\n", + " ('programs', 1),\n", + " ('same', 1),\n", + " ('some', 1),\n", + " ('such', 1),\n", + " ('pettiness', 1),\n", + " ('immediately', 1),\n", + " ('expect', 1),\n", + " 
('financially', 1),\n", + " ('recalled', 1),\n", + " ('tried', 1),\n", + " ('whole', 1),\n", + " ('right', 1),\n", + " ('life', 1),\n", + " ('cartoon', 1),\n", + " ('scramble', 1),\n", + " ('sack', 1),\n", + " ('believe', 1),\n", + " ('when', 1),\n", + " ('than', 1),\n", + " ('burn', 1),\n", + " ('pathetic', 1)]" + ] + }, + "execution_count": 81, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review_counter.most_common()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 4: Reducing Noise in our Input Data" + ] + }, + { + "cell_type": "code", + "execution_count": 82, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " # set our random number generator \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews, labels)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self, reviews, labels):\n", + " \n", + " review_vocab = set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def 
init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " \n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " self.layer_0[0][self.word2index[word]] = 1\n", + " \n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def train(self, training_reviews, training_labels):\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + " self.update_input_layer(review)\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward 
pass ###\n", + "\n", + " # TODO: Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error: the network's output minus the desired target.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # TODO: Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # TODO: Update the weights\n", + " self.weights_1_2 -= layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " self.weights_0_1 -= self.layer_0.T.dot(layer_1_delta) * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / float(i+1))[:4] + \"%\")\n", + " if(i % 2500 == 0):\n", + " print(\"\")\n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \" #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 
100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + " self.update_input_layer(review.lower())\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] > 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 83, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 84, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):91.50 #Correct:1795 #Trained:2501 Training Accuracy:71.7%\n", + "Progress:20.8% Speed(reviews/sec):95.25 #Correct:3811 #Trained:5001 Training Accuracy:76.2%\n", + "Progress:31.2% Speed(reviews/sec):93.74 #Correct:5898 #Trained:7501 Training Accuracy:78.6%\n", + "Progress:41.6% Speed(reviews/sec):93.69 #Correct:8042 #Trained:10001 Training Accuracy:80.4%\n", + "Progress:52.0% Speed(reviews/sec):95.27 #Correct:10186 #Trained:12501 Training Accuracy:81.4%\n", + "Progress:62.5% Speed(reviews/sec):98.19 #Correct:12317 #Trained:15001 Training Accuracy:82.1%\n", + "Progress:72.9% Speed(reviews/sec):98.56 #Correct:14440 #Trained:17501 Training Accuracy:82.5%\n", + "Progress:83.3% Speed(reviews/sec):99.74 #Correct:16613 #Trained:20001 Training Accuracy:83.0%\n", + "Progress:93.7% Speed(reviews/sec):100.7 #Correct:18794 #Trained:22501 Training Accuracy:83.5%\n", + "Progress:99.9% Speed(reviews/sec):101.9 #Correct:20115 #Trained:24000 Training Accuracy:83.8%" + ] + } + ], + "source": [ + 
"mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 85, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):832.7 #Correct:851 #Tested:1000 Testing Accuracy:85.1%" + ] + } + ], + "source": [ + "# evaluate the trained model on the 1,000 held-out reviews\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Analyzing Inefficiencies in our Network" + ] + }, + { + "cell_type": "code", + "execution_count": 88, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAl4AAAEoCAYAACJsv/HAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHsvQv8HdPV/7+pKnVrUCSoS5A0
YAoaAES+7BwwBQ8AQMAQMAUPAEKgRAka8agS0VWMIGAKGgCFgCBgChoARL7sH\nDAFDwBAwBAwBQ8AQqBECRrxqBLRVYwg0GgLvvfee22CDDdzWW2+diqahZ+fOnVOhqylpCBgCjYuA\nEa/G7VtrmSFgCPwbgT59+jiIookhYAgYAvVGwIhXvXvA6jcEGgiB2267zbVt29ZbwrCGDRo0KNs6\nOf/kk09mz2GFIh3nIEYQJH6LJW38+PHrpZ06daq3so0cOTJ7LdcBaSgLvUwMAUPAEEgCAka8ktAL\npoMh0AAIQJwgWi+99FK2NZAkyBTSo0cPv1+wYEF2T57999/fbz179mxBkLgGcdLkjYyc41oxMm7c\nOJfJZNysWbOKSW5pDAFDwBCoOgJGvKoOsVVgCDQHAliVIEQDBw70ZGft2rWuVatWTojWiSee6IGQ\n32L54jwEjd+QM4gS2xNPPOF23333FmSMAiQNpMrEEDAEDIG0IWDEK209ZvoaAglFQAgXZAmrVNgy\nBWGCiAnhEgIG8RIrGefE1UggPOchc5KHpp999tkJRcDUMgQMAUOgMAJGvApjZCkMAUOgCAQgScRs\nCaGCgEG0tECyIFKkgUzhZiSdyJQpU7IWL7F8sSediSFgCBgCjYCAEa9G6EVrgyGQAARwF0KqIFe4\nASFU/NYicV7ylSFpESFflCHWL53Pjg0BQ8AQaBQEjHg1Sk9aOwyBOiMg1i2C4XEXQq7knKgG0eKc\nEDIhXrgpsWphBZOvH7XLUfLb3hAwBAyBtCNgxCvtPWj6GwIJQQDyJBYtyBVuQ7F66ekchGyRVixd\nNGH+/Pk+MF83hzI5b2IIGAKGQKMgsEEQP5FplMZYOwwBQyD5CDA3F4H3uCMtUD75/WUaGgKGQLwI\nmMUrXjytNEPAEMiBAG5E3IeQLixiWLMqEaxo4o6M2lOPiSFgCBgCSUNgo6QpZPoYAoZA4yOApSsc\n/1Vqq3FZmsG+VNQsvSFgCNQbAXM11rsHrH5DwBAwBAwBQ8AQaBoEzNXYNF1tDTUEDAFDwBAwBAyB\neiNgxKvePWD1GwKGgCFgCBgChkDTIGDEq2m62hpqCBgChoAhYAgYAvVGwIhXvXvA6jcEDAFDwBAw\nBAyBpkHAiFfTdLU11BAwBAwBQ8AQMATqjYARr3r3gNVvCBgChoAhYAgYAk2DgBGvpulqa6ghYAgY\nAoaAIWAI1BsBm0C13j1g9RsCTYTAgw8+6F555RW3/Nnn3RtvvOHWrH7DPfjgovUQ+MFJp7gtttjC\ndezYwX1t553c4Ycf7r7yla+sl85OGAKGgCGQNgRsAtW09ZjpawikDIG5c+e6hx9Z4u6+607Xuk0b\n981993e77b6b23PPPd1WW27puh7cpUWLnn1uhXv1tdfc6tVr3EsvveSeefpP7q475zrI2JHf6eF6\n9eplJKwFYvbDEDAE0oSAEa809ZbpagikBIH/9//+n5tx401uzuzZXuPeJ3zPHdG9u+vYoX1ZLVi9\n5k0379773WPLlrr/mzrZ/XzE2e4npw92u+66a1nlWSZDwBAwBOqFgBGveiFv9RoCDYrAlVde5a65\n+mq3X+cD3Kl9+7qjj+wZa0uxiP3619e5yydeagQsVmStMEPAEKgFAka8aoGy1WEINAECxG9dcMEv\n3RZbbuVOO/302AlXGEIhYPPuvsudM+oc169fv3AS+20IGAKGQOIQMOKVuC4xhQyB9CFw5VWT3DWT\nJrnThw5zw4acXtMGzLtvvrtk3NggfmzHwNJ2lcV/1RR9q8wQMARKRcCmkygVMUtfFgKdO3d2G2yw\ngXvyySfLyl+LTFOnTnVbb7211xNdx48fX4tqU10HsVyDBp/uFix4wF1/w/Saky7Aw5V58y23uO23\n38Ed1OUg98c//jHVmJryhoAh0NgI2HQSjd2/1roiEYAQDho0qEXqkSNH+t9nn312i
/P243MEIF19\n+w1wm2++uZs8+VrXpvUOdYOGuideNsF/Lfn9//6+m3nrTPfNb36zbvpYxYaAIWAI5ELALF65kLHz\nVUWAaQJ69uyZtS5xzDmkT58+/ry2OElaOcceq5RskKb33ntvvfxY2tgKCdYu5MQTT3SZTMaNGzfO\n/16wYIHf25+WCAjpatt2D3fLzTfWlXRpzXBzjhg5ykG+zPKlkbFjQ8AQSAoCRryS0hNNpgfkSpMa\njiFXSI8ePfxeX8ci1apVKzdw4EBvmRJrlE8Y/IE4SX45Bzkr1rUp6SBeiOzlvJRpe+c06cLKlDT5\n0YC+Rr6S1immjyFgCGQRMOKVhcIOaoUAli0Izf777++tS1iYOJbzkCtIlpAeCBjWLNKwh2RxvGrV\nKp9/7dq1nqyRXpO13Xff3XHtiSeeKNg0sZZRLyJ7zsu1goU0SYIxYz+PfUsi6ZIugHwR6D98+Jme\nKMp52xsChoAhUG8ELMar3j3QhPVDiCBbECixXImbUeCAWEGi2ISAYYWSY/Zt27aV5Nm9XOcE6YVA\nZRPYQUUITJ8+3d05d45bGEwdkXTB7bj8mWfc6T8Z6t2hSdfX9DMEDIHmQMAsXs3Rz4lrJXFXEq+F\ncuJeFEW1qw/yBYGSc6TBKgZ5C2/lBsILQRMCKFYuzss10a1Z93/5y1/c2DFj3cRggtR6BtKXgv/o\n0ef79SAhjCaGgCFgCCQBASNeSeiFJtPhtttu85YryBZB7JAobakCDrFWQc4gXljAIEDsEcpgi0uk\nXOpCpGw5H1c9aS5n1Lm/cCf0+X7VJ0aNEyMI4lkjz/GEkdg0E0PAEDAE6o2AEa9690AT1i8WJFyN\nfJWIy1AsTAKHkCw5L9Yu3JQQNc7L14/yZSNzcEl6KafYPWUiEC7KExdo2BJXbHmNlo5Z6f/wxON+\nfcS0tY15vo4+5rtOYtPSpr/pawgYAo2FgBGvxurPVLQGMqNdghxr4iONELIFCZPrXJsyZYq3lAmB\n4xxlzp8/v2y3IJYtytVlYo3TelJPs8rU/7vOB6unxcUY7qcf//hHbsIl4xzuUhNDwBAwBOqJgC0Z\nVE/0re68COB+JBZMSFXexHaxaghg7Ro8aLBbsXJF1eqoRcHDzxzhvvjFjdwl48fWojqrwxAwBAyB\nSATM4hUJi52sNwK4DWXiU23tKkcv3I/ijozaSz3llN0MeW75za2u74Afpb6pWL34ItNivVLfldYA\nQyDVCJjFK9Xd17jKS7wW7sZZs2Y1bkMT3jJICu7X5c8+7zp2aJ9wbQurd2yv3u743r1c//79Cye2\nFIaAIWAIVAEBs3hVAVQrsnIEmPiUqSKMdFWOZSUlzJ071/3PwMENQbrAoUewOsKSpY9VAonlNQQM\nAUOgIgSMeFUEn2U2BBobAUjK3p06NUwjj+je3f3f1MkN0x5riCFgCKQPASNe6esz09gQqBkCix9c\n5Dr/e+60mlVaxYpwl3732OMcHwyYGAKGgCFQDwSMeNUDdavTEEgBAjL1QteDu6RA2+JVbNt2D/f0\n088Un8FSGgKGgCEQIwJGvGIE04oyBBoJAYjXfp0PaKQm+bbstvtu7rXX32i4dlmDDAFDIB0IGPFK\nRz+ZloZAzRHAHbf99jvUvN5qV7jnnnu6N94w4lVtnK18Q8AQiEZgo+jTdtYQMASaHYFWW2/jNt5k\ns2aHwdpvCBgChkCsCJjFK1Y4rTBDoHEQ+Ne//tU4jVEt+cY+ndxvbrlJnbFDQ8AQMARqh4ARr9ph\nbTUZAoZAAhBI63qTCYDOVDAEDIEYEDDiFQOIVoQhYAgYAoaAIWAIGALFIGDEqxiULI0hYAg0DAJM\nCttur3YN0x5riCFgCKQLASNe6eov09YQqAkCrNG4ukG//PvbunUNOU1GTW4Mq8QQMAQqRsC+aqwY\nQiug1ggwzcEf//hHvzHXFNsrr7zSQo2tttrKf
fOb33Rf+cpX/P7www93bCa5EYBsgSsCbh07dmjI\ndQ3ff//93CDYFUPAEDAEqoyAEa8qA2zFx4MAizXfcMMNfqkXSAEkCmLVv3//LLnSNQkhYw+Z+OlP\nf+r+9Kc/uV69ernjjjvOb5TT7CI4gYPgKphAxGbPuUN+Nsz+xRdXuS5dDmyY9lhDDAFDIF0IbJAJ\nJF0qm7bNggAD/+WXX+43SAHkCdK06667lgUB5QmBg4xR1ujRo8surywlEpBJky2wzIfnBhts4N5Y\nvcY10peAJ518qvtOzyM8aU9Ad5gKhoAh0GQIWIxXk3V4GpoLQRJChFsRsgRZgHjlIwmF2gZ5w0Im\nrkrS77bbbv4cdTayQDRpNxuCxZCtEJ4sKP3Io0t8nkb584cnHvdtb5T2WDsMAUMgXQgY8UpXfzW8\nthADXIjsIVzsIQhxC4QD1+XLL7/sIF38xrrWSAJ2stE+cGTjuFjpcuCBgYv26WKTJz7dvPvmu9Zt\n2pSEQeIbZQoaAoZAqhCwGK9UdVdjK4tFCzKEtYvjWggkRAieWMPQIa3xXxAtEUhWpXLMMUe74cPP\nrLSYxOR/5JFHXZsdd0qMPqaIIWAINB8CFuPVfH2euBZjcRKSAAkqxSITZ2PQA/KFWxPyheUt6YLO\n8iUiugqOcerdrVt3d9pPhrg+3zs+zmLrUlb7du3d5CmTIz/IqItCVqkhYAg0HQIbNl2LrcGJQkBI\nl7gX60W6AAUrF8QP8sKmCU2SQIMYiguRY9GXfTWk9/HHuwXz51ej6JqWed20GW6v9l/3eHGfaetg\nTRWxygwBQ6CpETCLV1N3f30br0kXFqYkCfrg7mRwToLlC4LFhkAa2GohkE8I3d/+9je3/NnnXccO\n7WtRbVXqwHI3dOhQd/zxvbPl07+0z8QQMAQMgVohYMSrVkhbPS0QSDLpEkXrTb4gPeCE1JJsSfsh\nJUy5AenaY489XbfuR7ipU66Vy6naY+26acYNbtGihevpbeRrPUjshCFgCFQRASNeVQTXis6NAAM6\npIJBL8kiVi/0rEXAvcYDS1st6ozCH9I5YMCAFpd23mlnN+XX17mjj+zZ4nzSf6xe86Y7+aST1rN2\nab3BvZ54a13s2BAwBBobASNejd2/iWwdXy0ysGPRqRexKAUYXFES/1VKvmLTarKVBLcXZPOKK67I\nqs/yS8S+4eqcPn2Gu/mWW1I1oeq5vxjtXn5plbvl5huzbYo64H7EspiGezJKfztnCBgC6UDAiFc6\n+qlhtGRw23fffd1TTz2ViNipYoDFMseADFnEUlepUB44iCSBbKELekG6pk+fLqq5XXbZxZOub3zj\nG45FLpj1ve0ee7qLLxydTZPkA+btGj5sqLv3vnt9HxbS1chXIYTsuiFgCFSKgBGvShG0/CUhAMlg\nw+qVJsHiI1NNlGMR0WSL/EkI2Nf4ox/9wnqWIpCtRYsWeQvQv/71L/fPf/7Tvfjii8F6l8e5kaPO\ncz8a0FeSJnL/7HMr3Am9j3Njxo5tEVBfSFkjX4UQsuuGgCFQCQJGvCpBz/KWhAAWIwgXA1s55KWk\nyqqQGGLCVixpxDXHhiSRbHnFgj/0B5a8V155RU65fv36uYkTJ3q9IVxs//jHP9ynn37qZs2a5X71\nq8vc9Bk3uq4Hd8nmSdIBcV2DB5/m2rdv7y4ZP7Zk1eQexdJpYggYAoZAnAgY8YoTTSsrLwIMYpAW\nLEdpFAZjiBdkKhdxJA3WI4T2siVZiC+TLxdFzzPOOMOTLn4L6YJwffTRR+7jjz92K1eudA8uetDN\nvHWmu/GmWxJHvoR07RgsDXTttVdLs0reC2lOeh+W3DDLYAgYAnVFwIhXXeFvnsrF2iWDWVpbDmlk\nINZWL0220vRlHH0S/nJx2rRp3tpFPBfuRbFyQbr+/ve/uw8//NDde++97pBDDnG/u+tud+usZJEv\nIV3cXzOmT
8tJkIu9/+R+NfJVLGKWzhAwBAohsGGhBGm9PnXqVLfBBhv4rWfPnqlrhtafdmiRdrFf\nsGCBvlTwuG3btllcxo8fXzB9XAmEeMVVXr3KgXixmDaWItkYlLGEseWyhNVL31z1EkSvSRdfLhLP\nhYsR0oWlCyvXJ5984gnX+++/7+fzItaNjyPWrVvnDu92mPveCSe6Q7oe5Jgnq96yZOljWffinXfM\niaUv6FsEcm1iCBgChkAcCGwURyFWhiGQDwGsBg899JD/Oi5furRcY0JR3IlxfOFY6zajd74vFyWI\n/rPPPsuSLqxcEK9nnnnGbbPNNt7qBTnbeOON3dH/31Fu+x22c2MuusAtD66PGPGzukw1AfGbMG6M\nO33IEDds6JBYYYV8gRvkK2kfRcTaUCvMEDAEaoJAw1q8aoKeVVIUAlhJevXqFYsFoqgKq5gIqxZB\n57QpbQJ5QH89XQRfLjK1B3shXbgXcS1+8MEHnnBBNP2SQcuXu+22285bwkiLxXWjjTZyPXr0cNdf\nf737y19edid9/weOKRxqJXy5OHDQaZ50sfh13KRL2oElEwJmli9BxPaGgCFQLgJGvMpFrsr5Bg4c\n6F0+WBbY0iyQlLitQ08++aTDVdqnTx+HK1m7X+UYtyrXRo4c6W677Tb33nvvxQIj5CVtxEusNXq6\nCNyKMl0ErkWxcgnpwp0opOuuu+5yXbp08WkAEWvXJpts4rdNN93U7bXXXu6GadcF83yd5OfNYr6v\nahIwYrnGjJvgp4uAFC17bJknlbF0cI5CjHzlAMZOGwKGQEkINBXxYqCWQZn9oEGD3EsvvZQTMOKn\nSKPzbL311n7AzzWI67TkZ+vcubMvQ2KqikmTL8YrrDCkQuqgbI45V45EtRnygj7lCm5GyEocInjS\nRiFUnIsS+pZrQtAgYuTJ1XdRZUSdE3dTWqwfxKKBv54ugi8XCaSHTIS/XMStqEnX8sDS1bp1a5/u\nC1/4QpZsbb755o4N4vWlL33JffGLXwzixvq7JUuXBCTtwCwBm/Xb2VEwlnWOOK7hZ45w3YP2LH/m\naf9lJdNF0I5aiJAvMDUxBAwBQ6AsBAJrSkPKlClTMBP5LXCFZNjkt963atUqs2rVqvUwOPvssyPT\nS17yPfHEE+vlk+vsw2WMGzfOpy8mjdaf9Fp0/sAyllNPqU/n3X333bPpw9f5rcsOH1NXqRK4sTLB\n7OelZotMX0i/sL65foNBVJ9HVprjZOA6zQTEJcfV5JxGxzAOnAtchZmAcGUCt2Im+FoxE7ghM2vW\nrPG4BIQy8/DDD2fmzZuXmT17dmbw4MGZ3/zmN5mAzGcCy1dm4cKFmccffzzz/PPPZ1577bXMu+++\nmwniwDJBIH4msJr5skEgILiZ4KOKzOGHd8u026td5qfDf5659bbbM8uffb4kgO659/7MqPPOz5Yz\n4qyRmZdffrmkMqqROLAWVqNYK9MQMAQaHIGmCK7PZREJBiRv/cCqNX/+f+JSsJCIdYo0UYLVBEtQ\nMIC7gIRFJSlYBpkK1RNZsDqZzxKFdWf//ff3MTgqS+QhFjLS5xPqCkiLCwhlvmQtrmEVIjamUilG\nv2LrwBKGizIgzsVmWS8dVi8+Gkiy5Fpz8bDDDst+uainiyCmS+K6CKjH5Xj//fd7axnWLCxbm222\nmdtiiy38xrG2dmENE2suuGAdwp3Jxn2w+OFH3Nw5c9x/n3hCUGY317rNjm777XdwXw3ixsKCNQtd\n7rpzrvvusce5Aw88wJ1xxrDYXdbhekv5jRVRrIml5LO0hoAh0NwIbNgszYeAMNAGRNqtXbvWExJp\nuyZmECpNhiAakDLysRF7JRJOK+f1Xsdq5SIsxaTRZYaPA0tQVr/AUtbicj5iphNq0qWxglgSPC0C\nNrS7WIGcMEBVKrpPpCz0pO1s9FF4Y4Z1roEv/aiFGLFy3bGUA/FKqruJIHq
mvdALXbPmIvpCugiM\nJ54L0sV0EfLVorgX2XPuz3/+s9ttt918PNeXv/xlT7aYdoINFyPniPMi3itMujTWgheB7yxUzXM0\nceJlbuD//MhtteVmbvMvb+I23WRjv20WHH/68YeuT0DOzhx+hk/L1BDnnTsqUaRL2ifkC8xNDAFD\nwBAoCoHgJdiQEnbVhV1LwSDdwgUTkDGPQzgf6cKi3Za4HLUEoGfLJV2UFJMmrIcuR+cPSIW+5I8D\nspHVgbTSNi7iZpP8pENwmco59mGsyE87JQ26FSvnn39+hq0SoX6pm30uN2+hOsK44AouV3AzBSSm\n3OxVyxeQ4kzwhWILvPgNhrgXcQXiEgysSZl33nkn8+qrr3qX4e9///vMAw88kLnzzjszAWH1rkVc\njLgagwlTM4888kgmCMz398abb76ZCYLuM4FFzLsqcVlSdjMLLnWwNzEEDAFDoBACTWHxwtoRtniE\nf4sVB0uICC5Ebe2R81hQRMin88h59lF59fVi04Tz6N8nnnii/umPw/Xm+4CADFp/rEhhbMBB11Oo\nPK0QVhYJRtfnSznW+pEPKxZ6lipYHHXbwuWWWl7S0uPOA+tyv1wkqJ4lgQi2X7x4sTvqqKO8ZWvL\nLbf0bkMsXbgZsXQRTM9UEoUsXUnDqFr6iOvZLF/VQtjKNQQaB4GmiPHSg22hrtOkItfgDhHRIqRN\nn+M4nC58vdg0UfnkXFTbwvXm0k/K0NchI8Tp5BOdPl86uRb3F2dRbZa6Cu3Jq/u4UPq0XIfglrLm\nIq5Eiediz3JAzFSPGzIImPeLS8tXi5Atjonpkq8XIV0bbvj5/22F7pe0YFipnpAviWmM+56vVDfL\nbwgYAslBoCksXsmB2zSJA4FyLVUQxnLzxqF3tcpgOaZu3br5ObekjuDLRT/Ra2Dy9hYs4rmwZgnh\nCsdzEesF6YJQvfXWW65Tp04+lgsrFxYviJfEcwnp0oH0Um+z78XylfQPL5q9n6z9hkA9ETDiFUJf\nW1NyDdJhi0/YwhQqsqo/o3QM66fbFKWM1h83JYN1vi2I8YoqJvIcXzRiBahEwq5TAu2jgu3z1YGV\ni69XNTbhcvPlT+q1ctdcZGJUXItYuiBd9DdfLgaxXu673/1uC9KFpSuKdCUVk3rrBflCjHzVuyes\nfkMgmQg0hauxFOi1e5FBmi8ewwO0/lIQ0qLzlFJXHGnRT8dfUab+ShP9ChEvrT9EjnZrMlaJnhCv\nOOJeaKN8hYh+fIXJRt/k6gPS0R5IV5R7MdyvlbSz1nnBtNw1FyFcuBexgPF1I5YrSJe2dEksl54u\nAtdipVYuyAhu0b+te9899tjvPWzowrQRSDDfl9uv8wH+uH27doG1bXPHl4NCZvyFFPzhvqetbByb\nGAKGgCEgCBjxEiT+vWeAZ0Bn0EawkmDhkUGa35rYhEnPv4up2S48txa/9dQQxegH8ZLYJ9rN/GS0\nWQiZlCmYcE1/YFCosXEQLwLjwV10kDqlL4SUyflCe/SX9hVKG3VdYnmirlX7HHhCRnQQPWstBl9a\n+iB4XIYEyIt7EauWTBkB6eJYgughUkwHQcB88OWjO+SQQ7LxXJAurlXqWgSr3919j3sg6L81q1d7\nYrV3p33cET16ujZtWnu4mDICYe3FV4MYM+Spp/7onnt+pbvjjjt9vm8Hc391PbiLnyrDJ0j4HyFf\ntD9txDHh0Jp6hkCqETDiFeo+rCcM8kJesJRARKKEtHxhV29BV9E3rAttKUZIB6lEsBKxJE+UQNBK\nIV0QhNGjR0cVVdI5SBKEL+wuLKmQfyeGjFbab/WyZDCIE0Svl/9hglIW7iagG8LFRqC8zNGFRYlN\nSBfXSIMFi2B5SBdz3FFu1PxcfLmIQNJKkdmz57irrrrKk6YT+nzfnTXyHHf0kdHPkpTbsUN7x4bo\ntBCyBxYudLPn3OHGjR3nTh8yxB373f9
ykJskC/pBlI18JbmXTDdDoLYIWIxXBN6QkELkAtLFhJ3s\n6yn5iBXkopCbUXSnvYXICGXR5lKEgYe1GuMQCBMTutLmcnDHasmkqmzl5NdtYCCFVNZScNFRpyZd\npay5CPmCjEG6IFPEbUG0sHRhMZOvF7WlqxzSBeHq1q27J12n9O3vVqxc4S6+cHQLIlUqbpCxYUNO\nd1jGJl55lXv55Vf85K5XXjUpFld2qfqUkh5CzHPAPWNiCBgChoBZvHLcA+JexJUVjukSYlbp4J2j\n6pJOQ5ggRASbSxwT1iF0LMbNqCsjD+QEt50OXqd86uF6qcKAw6zpcf3HD+YQRDYsc+JqlL3WD71l\no11x9RcWDMhkLd1HfLk4YMAA3Ty/yDXWLgLjcS/q5X9wL2Lhkngulv/B0kVaXIeQLln+54UXXvAu\nRpmfi3gvCJfEdLWoNM8P+viSCb8KLFxvOAjXjwb0zZO6/EtYwth+/OMfuYsvvthdM2mSuyrYevbs\nUX6hVc6pyVct75sqN8uKNwQMgTIQ2CB4ETPLtYkhUDUE+gfL10DA4nA5Vk3JEgqeO3eub0utLBhx\nrLkI6ULCay5CXo855pi8ay4WA8306dPd2DFjXd8BP3L9+53q2rTeoZhssaSZ9dvZ7n+DJYW6dT/C\njR17sXe5xlJwFQoRt2OtraVVaIoVaQgYAmUiYK7GMoGzbMUjQOwQZKURREgXZLLawiBNPZWuuQjp\n0kH0uBSZn4sPFfbee++S1lyMavNZZ5/jSRcuwFEjR9SUdKFPn+8d7xb6LyXXub79BiTapYflC9KF\n29jEEDAEmhMBs3g1Z7/XvNUMOAw2aXezQIZwWTJBKVY8kbgtGNRDmXF/uUhMl8RyPf744+7oo48u\n+8tFdIToIJMnX1tzwiXY6/3wM0e4eXff5WbeOjPx9xrPQ9z3jcbCjg0BQyCZCBjxSma/NJxWuMsY\nqG8IYpXSLJdffrm33oUtFuHfEEzIZjmCC7MaXy5CumQWeiZKZS1GposgnqvUIPokki7B+rppM9yE\ncWOMfAkgtjcEDIFEIWDEK1Hd0bjKMP3CbrvtFnyN9nILS1HaWoyVC/IFMconkCfIiQj52AoJBI6y\nmVlehC8XmS4C4YtEJj3FfaiXAJIger3mImSK6SIIoteWrr/+9a/+/J577pmdo4uyS5ku4qSTT/VT\nVCTF0oX+WtJGvioh6rrddmwIGALJR8CIV/L7qGE0JF4JSavVC8LFBoksVcij82ENC7tdwaVaXy5C\nvNj4cnHp0qV+brpyvlyk3RddPCZYWujxxLgXc/XFub8Y7Z55+k9uxvRpZVsfc5Ud93mIOsS8XCtp\n3PpYeYaAIVA9BIx4VQ9bKzmEAMQDq9dTTz21HukIJU3cT6xXDIwE18cRl0N5+qtIXLE6novgd+o6\n7LDD/BQQWLr0dBHhSVFlugiAC3+5SEwXVi/m59KkCwtXKVYuyp4/f4EbGkxeevucudmJTjmfVMEy\nt1WwyPe1116dVBWzehn5ykJhB4ZAQyNgxKuhuzd5jWNKCQiFJh3J03J9jcS1iO5xCgQsvOYi5eNa\nxMUoy//gXsS1qJf/EfIVXnMRgiWuRUgXVi7OrVmzxpMy5jYrh3Sh60FdDnK/DCxefEmYBlm95k3X\nPfhIIenzfAmWPBdYvSD5JoaAIdCYCBjxasx+TXSrcLFBZNIyrxdkCzepWCTiAhcig/VMW7r0mous\nvSgxXVi72rZt6yc1lYlRIWFRay4K6WIvli6C6B955BHXvXv3skgXbR40+HRvfZs65dq4IKhJOTLP\n17LHlqXClScuaSNfNbk9rBJDoOYIGPGqOeRWIQQGwkFMk1iSkopKtXSlXNqul//JteaiWLpYT/Gt\nt97yVi+W/sEyss0227RYcxGyJV8uYulihnpI18MPP+yOOOIID3Op7kUyEfQ/eNBgP19WLSdHjeu+\nwOX
YsUMHd+6558RVZFXLMfJVVXitcEOgrggY8aor/M1bOaQLFxsDejjIPCmoiEUKkkhQfVxCmyFd\npXy5KF8t4l788MMP/VeNb7/9tnv33Xc9sYJgffWrX3UsFyWWLr5oJN7r9ddf9+QMC0o5pIt2Q1z2\n7rSPnyA1LhxqWQ6LbO/d8eup+qrWyFct7xCryxCoHQJGvGqHtdUUQgAyg7sxieQL0sVEqVihIIlx\nCWWV8uUiJEtci5AuHUTPV4nEbsmai6z+xWANCYNwsSZjt27d3OLFi/2+3DbQP2m2dkm7mVx12222\nTo3VC73pT+7FpP5zItja3hAwBIpHwIhX8VhZyiogQOwUMVRJIl9i6cJChFUOi1ccQll6+Z9ivlwU\n0gUBg3QR64VArCBYEs+lv1wUSxfE7IILLmhBuhjAS52yYMRZI12rrbdJrbVL+m7J0sfcD/v3cytW\nrpBTqdhzP0LAjHylortMSUOgIAJGvApCZAmqjQBWIEgJ+3rHfEnslcSg0XYhhaUSFsGNgZP2sZC0\nyC677OIJJ8H0xXy5COki0B4hZkt/uSiuRc4J6dpwww19/BiuRQikCO1DHxGu6etyXvakxfK3/Nnn\nUzF9hOida39sr97u+N69HIQ/TWLkK029ZboaAvkR+ELg6hmdP4ldNQSqiwD/ye+www5u8ODB7s03\n3/RL2VS3xujScX0yII8cOdKNGzcumwhismLFCv8FYankiwETEnffffdly4NsMZ8W5UK6mCoCS5YE\n0eNSXLdund841l8uykz0WLgIomcP8SKQXpMuCBdfS4atJOBMvbKhH2QMiwobv0kjMnPmTPevzAbu\njGFD5FSq9399d6178sk/uO9+979S1Q6sm2zLli3zfZcq5U1ZQ8AQaIGAWbxawGE/6okABEAsEZAg\nCEstBMJBveyxuuWqV4hJmMzk0lGsZ6V8uQjREvci00Xw9SLkDAsWxAqCpd2L+stFXItsyEMPPZSz\nHbn05bwQMUlz2cQrXI+ePd2wIafLqVTvCbI/ofdxqXM3atCxwOa6R3U6OzYEDIFkIrBhMtUyrZoR\nAQiNkBVcjkKGqoUFJAMXILPpS935BjSxEjHwFRIZHDXpYkLUadM+X75G5ueCWOFGhHDxlSMb1i6x\ndEG6IFMSz4WVi9gwmTIC9yKuRwLphXRRJ7qWI1j0wEC299f9zXUOvpRsFOnYob1r3aaNK6YPk9pm\n+ibN+icVV9PLEKgVAka8aoW01VM0Ani/sS4hkCJIWJwzxotljdglyBcLd2NhK8aNKMSEgY+8UYLV\njK8J9XQREC5mo+fLQ1n+B9KFVQsLl5Au9pAurpEWMgW5wqUI4dKkCzImpAuLGO5FNrArl3jp9lDO\ngw8ucl0P7qJPp/74m/vu32L+tDQ2yMhXGnvNdDYEPkfAiJfdCYlEAIIDgXnvvfe8NQrLFGQCKxgk\nLBfpydUYiJKUwaBF+XPmzPGEqxySQhkQEzYt1KGni4AoMQM901II6fr00089sQqTLggY57iO8OUi\nrsQw6WL6iCjSRR7aiW5xCG37wUmnxFFUosr46nbb+Y8FEqVUGcrQz/R3qc9CGVVZFkPAEIgRAYvx\nihFMK6q6CGCpgoyxJ4aJLwMhTbgJo6xVMigRZE5AOwMVm/5yslKiAjlh4EMPSFetv1wUKxfICwlE\nlzgEK+Arr77hJl42IY7iElPGvPvmuxtnzHC33HxjYnSqRBGeB/o86hmopFzLawgYAtVBYKPqFGul\nGgLxIwDBggyIMOBAeiBPUQIRYjCCbOUSrlVCvhjwIDy4LbXoNRf1l4viXtRB9MzRxXlckBAp3Ic6\niF5PFxHlWpR60SNfWyVdsfsNNvyCwzpkkmwEJD7RyFey+8m0MwQEAbN4CRK2b1oEKrEUQf6woOkg\n+kJrLmrSVcmXi5A0kUrIo5QR3k+8/Ar30cefpn7i1HC7Vq950+3Yp
rV3/Yavpfk39yL/aEDATAwB\nQyC5CGyYXNVMM0OgNggwUGE5KzVWRsiOJl2HHXaYD6JnAKzml4uadEEcbbAt/l5J4yLfxbQOyxci\n/0gUk8fSGAKGQO0RMOJVe8ytxgQiIO6aYlUj1izqy0UC6flK8qWXXvKTooprMe4vF7WeRrw0Gs19\nLATcyFdz3wfW+mQjYMQr2f1j2tUQgWLJV6EvFzt16uS/THziiSfWmy4iji8XNSRiddPn7Dg/Akyi\n2m6vdvkTpfiqka8Ud56p3hQIGPFqim62RhaDAO5BtlzWAlyRTGehF7rmy0rIDy5GHUS//fbbu222\n2cYtWLAgOykqpItAelnoWoLow5Oihmej118u6nagpwyy+nxcxzIha1zlJaWcV197ze3X+YCkqFMV\nPeS+IO7LxBAwBJKFgH3VmKz+MG2KQADCAdmRPVkgRUwbgcg0ExxjxWIQ4ms/iYHhfC4hLWWz10L5\nlCF1cK3Ql4sQpnbt2rnFixe71q1b+9nlK/1yUetE+9GpWrLlFpu75cufrVbxdSsXAtwMwj3MfQv5\nKubeL4QJ9xtlyUbZ+rljzjqpR5479tW8RwvpbNcNgSQiYF81JrFXTKf1EOBlT1yVTJ4qL3T2WKkQ\necGTVgYFGSTkHF8gyrZeJeoE5EuXV+mXiytXrvQz0GMJK2XNRR1Er9Tz5FD00+fjPAaDSy651N1z\nz+/iLLbuZY0ZN8Ft9uVNgoW/h9Zdl1oowLMAaeJZKVX0c8dHJFh2ue8gdWyI3IfyjHGOe4c62VM/\naeS5k+eVdCaGQDMiYMSrGXs9JW3mhQ3RYgkhhBc3rr5yBhDyMzAwEDAXGGUTqyVzfXFdiwxW7KlX\nL//Dmoss/4PIl4vMNv/xxx97VyIWFaaMYMO1yDVmrV+7dq13M+69997Zha5lji7IGDPVs+Yiy/8g\nuUgXAxoiA5//EdMfwQic7rjjDl8qujeSDBx0mnv7rTVlr1qQRiy4j+lbCFAxwj85PCfca0KY2Jcj\nlMFzTJkc8wzz3FXj/i1HP8tjCNQaASNetUbc6isKAV7SvJwhWbyo2eIUiAWEjsEoFwHjvI7non7W\nXJTlf4jpIl4LYsVC15AsSJcQL87J8j+y5iIkZvXq1d5yAOkinktIF2lkzcV8bUX3YgfQfOVwjYGQ\n8tgYHDXB5Pp+++7nLho7zh19ZE9+NoS0b9fezbx15nrxfHFhmmSQ6Od87eQe4L5HeD7ifu543iB0\nrPDAc8SxWcCSfMeYbtVAwIhXNVC1MstGgBczL3v+Q4d85Rskyq5EZWQgYoCBgDAIyH/1YdJF/AqD\nEq4WWXNRky49KSoEDNIlQfRYslhbEaLFuotsTz/9tNt///3ddsHM8FyHdOUKolfqeoJUCSbgSptp\nC3s9B5muR44hXkcd81138YWj5VSq90uWPubOHXVOsH7mwvXaAR4iWGMa1SJDO8P3EPc/z50QI46r\nKdTHM4YuPH9C9qpZp5VtCCQFASNeSekJ08MTn+HDh7vzzz/fv4xrCQkkj5c/xAtyIm420eGpp57y\nwfRYucS9iGuRmefF0iXuRUgXaRC+XNx0002zpEtci5wj7mvbbbd1u+22W1Gki8EKKZUQCMlikNMf\nB/jCIv4ce+yxfmBmcGY+skmTro4kKhFZE3/q3F+Mdv/49BN3yfixeXUFa8GbhGGikjdzCi5yL0ib\n5N6HbEGCammBQg/qxbKNHrWsOwXdZCo2KAJGvBq0Y9PULIiO/PcLSSg3hqvSNvPf/r777tuiGL5c\nxL3IgPC1r33NffbZZ96SJROjhi1dpa65+Oqrr3r3XrjeFkr8+4ceLKOuyznSycZi4oXk29/+th+E\nGYhlWgysekIyv/GNb7orA/LVCO7Gbt26B8T+f7OkoxA2ch08RSC+pZJfyZukPW3Cysue547+r4fw\n/EO+eP7q+fzXo+1WZ3MisFFzN
ttanRQEeOnKC58Xbz3/46VuXIoS56Sni1i4cKFr06aNj9kSS5cm\nXeWuuYi1i/oY/ASHqL7Jdx3cuC6b6B9VDudol3ydxp42i/tUiCMWO7HsHXPMMe7+++5PPfG6btoM\nD0k+nHNhpvNgCQNrhHumXv8oeAUq+IOFCcsu1tx6tgEMIVxY28AZbOupTwWQWlZDoCgEzOJVFEyW\nqBoICOniJcsgkATRVi8ICUsAMRM9li7IV+fOnddzLeovF4nVIlh+s802W8+9KEH0ub5clAEnTD7F\n5SVWFhn4Sc+AVYhoMa+ZEK1evXpliRYWLbFqaaJFW/X2wgsvOMjX8mefdx07tE9CN5Wlw0knn+q+\nd8Lx7vjje5eVPyoT9zD3jAj3crj/5FqS9mJh4h6iDXJv1VtH3gNi/TbyVe/esPqrhYARr2oha+Xm\nRSCJpEsUhthAVhicsAgQi0Vw/FtvveX+/Oc/u5133tmtW7fOTxcR9eUipIsAeuK5Sv1yUax+eiCE\nXCHsGSgLBcRDGCFaxKthQcBFKq7DMNnSBItjPgjQe7k+PpjPa/fd27qJl00QmFK1n3fffDc8mLdr\n2WPLqkqM6D/ubSSp1jBNupJIEo18perRMmXLQMCIVxmgWZbKEUj6y19cbwMGDPAWDZb+wbXI+oss\n8YPgXsz35SIEjCB6sXQV++UixI/BhwE8PJ1FLuR1QPw+++zjiZaQLbFmyV7IVJhghUkXv8Xd+OKL\nL7pLJ1zqZs66zXU9uEsuNRJ7ntiuoUOHxmrtKtRY+i9p1jDcedxbQvALtaFe14k9Y0u6nvXCx+pN\nNwJGvNLdf6nUnhcqAwAEI4n/cQOquOCYh6tLly5+Y+JULF2PPvqo23PPPbNzdOX7clFIl8zPlWtS\nVCxZssUREI/+ECtNtoRIRREsIWOSRvIKDmAybdp0P8/Y7353Jz9TI8xUv2zpEnfnHXPqqnO9rWHc\nX1hB2afBjSdfGKOviSHQSAgY8Wqk3kxBWyBbvPRxm+EGS6JgKWKDhBBsvmLFCtejR49g+ZxLPOEi\npusPf/iD69ixo59pnklQZY4uPV0EhEziucJzdDEIM6DIVihOK19AvJAjTbI4FoKlLVv6nD4vedlT\nnmAgRBFrHeSRJYT+69hebtTIEUnsuvV0Yt6uQ7oeVPcA8rBitbaGUR/ua/7hScucWejMuwJ906Jz\nuJ/ttyEQhYARryhU7FzVEIBs8TLF6pVUERcdJIUYLojWpEmT/GzbV199tSdTb775pp/0lPgpIV3E\ndUHCiAeDdEFW2BDisoRksS8Up0Wevn37enIqMVvsIUVRREssVrIXghXey3X2QraEaFGnCCSLTdog\nk7w+++yz7rLLJrpLLv2V6/O94yV5Iver17zpTj7pJNe/fz8/S3oilfy3UtoaBkFii1MgLkL24yy3\n2mXxrGD5Qve4Mam27la+IZALASNeuZCx87EjIC/RJLsYaTTESyxGQrxYBuiUU07xXziefPLJHpvn\nnnvOde3aNRtET0yXuBaJB1u8eLGjzQToFyJaQq6EmELoCPAXosXAA7HbcccdW3xxCIHKRa7kvCZb\nUh57LUK02GOli9pkLclLL73Uu1tvvOmWxMZ7QboGDz7NtW/fvuBkqRqHJBzzfLCJcE9UIpTFtCUv\nv/xyKslL/+AjF4TYNBNDoBEQMOLVCL2YkjbwHyuuDnmRJlVtbfGSObuwehF7xcz6t99+u5+SAZL1\nzDPPuG7dunlL19KlS93dd9/tHn74Ybd8+fKCzWPiUvnyUAfEM23Ft771raxFChIIeWLgfO+997y7\nU8iUkCu9l/Sk4VjIllYIF6K2aIWJlpAsvcf6RXwbU2rcc888N3nyZDd9xo2JJF/DzxzhVq160c2Y\n/vnkt7rtaTuGvIvwDLGVIjxv5OHZS6PERRz5QKZnz54egnHjxrmzzz47VXC0bdvWrySB0qXoP3X
q\nVDdo0KBsW3m/1VJ4Z/HOYBWME0880c2aNauW1SeyLiNede6WfA9Tvmt1Vrvk6onpwt3BSzTpwouJ\nDTJDcD3kSzasXQcccIAbNmyYw+LFS+TWW2/16Qu1C3IlRIvpHiBEQvKEIDE4COmCOKED14RYrV27\n1tdLWZp8CdmSctgjlC/xZZpoQaIgW+JC1ARLSJg+h8UPMnnEEUdk3Y+jRp3r5t1zj7v+humJIV9Y\nukaPvsDhCm4E0hW+p3h+9DNUyBpGWqxdDH5J/ZAl3Mao3/LPWiVWL3mf7r777gEpXxVVTaLPif4o\nGSZeEovJtSlTpriBAwdy6KXexAsltA68MyFgzSwbNXPjre3FI5DvwS6mFF6YaQmQpa0QFiEqxGtx\nDvchZOSaa67xW6F25wuIj5ohnsFg66239hOiCunSe44hVAwcuDGxYhBPJmQLnYVooRvkStogREvI\nVnjPdSFaco1zbK+99pqDeDGJKuWBBfvLLvtVMAv+Pu6HQQzVLy8eU/eYL3Ev0vZGJF20iz5nEylk\nDcPK1a9fv1STLtpKOyCQxIaWQyDHjx+ftRalzdIlfZ3mPURQ+mDkyJFGvNLcmaZ7OhDgv27inCr5\nb7XWLYVcsEFCEIgGE6guWbIkpyrEZR0exOOwYdEiRgsiJK4+rGaQJDaxVuk9S7cceuih3joRJlyS\nTvLz3y+uR+LKvvrVr2Z11EQLIqUJVZhYaYIl14RsyZ5FtSGDHTp08BhI48EG6R+4sbbccis36pxz\n3Isvrqrb144yQeqxx/VOXUyXYFrOnntNhOdMiBjkRL4elnOSLo17yCbPFJZz7rlSBGsfgz7SqlWr\nFtagUsqpd9pyrXSQHm0Bq1c70AHShcuR/mhmArxhvTrB6m0eBHhZslRNOf+p1hMlIR9YvNgOPvhg\nt/3222dVgvRgBfrVr37lXRcspn3dddc53JGs64hVi+B8JlqVdR2ZB4ypI/hUno1BgW3OnDl+oORY\nrpGODWsTMWYySz7kC0KH5Qsd33jjDR9jxteVTO5KoD5Ys2eg4Vjv5VjSkI7AfdrDV5ky6Sskk/nK\nIHnUI2RUSJcAwRI8M2+d6efKOrZXb8cUDrUSrFzn/mK0n5V+zNixTUW6whhDTiBibBxjYeb+gYA1\ngkC4yvnnDTcXzxUSJiAQALmviYMiHeRAzrEnjeSPwpHwAPLqPPyzUigfehFzpvPxm/NRwnMoaSkb\nkfw6vegi5bCXfOxFwu188skn5VJ2r9MQp6Ulqt359NfuRdFNl9dMx6kkXtx0+ibkZuIcTFqL3IBc\n50EIX9dlcBwWHjbK1Tdtrrp03mL103nKOS71xtftBQvy8zBJ++RloXUp5sHW6aOO+Y+b2KY0CZgg\nYkHCIsTGi/vUU0/11yBRd955p4/3YhkhvjhkSSH9JWSYaAnZ0nssXZAgzjFQkgeiBmEjxgxrF1Yz\ndIIAQQIhRxAl+nSPPfbw9zZlCMmCXNGf8luuQbIgZ0K0KEeIFuXSRogeHwjw0QDlCBa+0Tn+MLgz\nQWmPHkd41yPB7c8+tyJH6spPQ7iYGLV7QDLe+evb7t777q3prPSVt6C6JdDfCJP+NorwDuEDF56T\nUkQP8szHl0943/H+1gL5kOBwfZ5jrkWRDSFwPJ9RhIaxiY13sBZ5p1NmtUUTIeoK68I5jZ1OD0ZR\n7Rb9aVtY+Edx//3396cZf2677bZwkqb5nSriRWfxAHCzh0mUPBz6JseUycCBCImSnuWG0mUQkKiF\ncnhoKDcsnONa+EYtVb9wuaX8LufG1+Vz0/PgaLzkZRH3Q4+bkf/C0yZCSNlDwNh++ctfuhkzZnhr\nElNEiBuR4HesYbgjV69e7ckTJEoTLPBlk3NcZ+PLyG233dZbtbCSUVYuogVhgjyxCZmC9HXv3t2T\nPoiZkC1JI0QLi5i2aPFVJmSLPGy0jzaxHV5mfw0bOsSToI0
2+oLbu+PXHQQsTgsYZE4I1/JnnnaT\np0x2UyZf43YNLDwmLRFI4z88LVvQ8hf3NXGTtKtY4f2m3/P5iBdjgn4f6jooI0wmeAeHSZrOwzHP\nO+9T9iLUowmNnNd7xpZCZev05RxDgiBDIuG281vrLdgxdkSNi1IOe9oXpb+UQRojXqCQAunTp0/O\nBwP1wzc5N5X2I3MzyEOobwqYvL4hwuXkgib8IJaqX65yC52v5MaXsvM9ODz0cT0UzD9FrFM9B0Ze\nfEKipP2V7rHwMADg8sMiJV8/EgD8+OOPe+IlBEvvtUUL9+G9997r5wLDfYiIRYugeT0jvpAocRNG\nWbOOPPJIT+SoD5LFJu5DyhOiRWyXEC2NC32FVOqaoq8nXDLOx6Btu83W3gLWrVt37xIkFqtUgbhd\nOekaN3DQaZ7MvfKXlz3huuXmG8smiKXqkMb0xOeVS6DjaG81njvaI/dpMTrqf47F2pIvH2mIpeK5\nZq/HBcqS8hgj9BjCWDN//nyfj7zapRlOq9+t1MeXylKf1lGny6Wz1KmvY0QI66Cv62NtxZK2yXX9\nG71ENz12cI71a0V/jRf40HYtmujp8nWaZjjeMC2NhDRpRs7ntHQ2m7ZW0dH6vwmIl+5sbgZNwBjI\nKEsL1/UNo+vSRA4SJw9Hufrpeos9ruTG13XodoXnVhGsK32wCfitJ+nS7eVY92v4Wim/GQBoG5Yp\nSJMQL8jUTjvt5O9VsWixj4rTIjieexP3HqQoF9ESsqX3EDH5zTFWLYjWQQcd5F2HuDzzES0IlxYG\nM/qJLS6hrHPPPcetWLkicHkNc5tusrG7ZNxYT4KJBYNIYb2K2ojbOunkU137du09cVseWAVZnJv+\nw8JVT0IRFz7VLId/CrAOJUXieu74p6AU4iXvMXDQ40AuXHgPSjr24feikAXe+7pNjEGadIR/6zFJ\n/vlHB4gPzzFCfXp80br7BFX4o4kX7dF16mNJxzmtP/gIIRO8pD2UJ+OjqC7Y8pvruixJ0wz71BAv\nueHpFP6b0DcovzV50jc56TUx45r+TyVMzEivb5ZwXdSjbx65OSvRjzqLlUpvfKkn3C4eLHm4SKNf\nKpKnnD0vySQNktJf5bRF58GKx+AG6ZIvEPlqkfguXHbs//rXv64XEM81LE64+N5++2239957xxoQ\nT7nEfHGPEqclFq0w0dJtoR2QJIkL0tfiOiY+57xzR7lFixb6e+vM4We4Dl9v5zbfLIgxCwhZePvi\nF4Ln/H9+5N2WELepU651/YPg6mrqGFdbk1AOVs8kYRXXc8d9StuKFV2vfm9H5YdAhNNAIvR7UYiC\nLpc0mnRJ2bxjRTSpEaLCNf6JZhPrEHWJQYF9tSXcZj2O6WNpn253OC+65sJL2hHGV5cnaZph//m3\n8iloqe4guQm02tywYgni4eBGF+ZNeh4CIWTy8HATaAIn5em69EMi16M+69V5StVPyi1mr+vJd+OH\n2xouO6pdnNOkM5ynEX6DX1T/lNo2BgARsXoJCWPPdYLmmYYBkTgqSBfbI4884r/0xNrFb70nrfyW\n9JIf4sbG7zCp0uSKe//wwCoHqcJKEDUIM4DVgxijC7qxmVQHgXr0a76WxPncEWBfrOh/IGU8yJU3\n6p1IWk0WpDwZQ7ieK1+4PsnLmKPfs2IIkPFLxitN+KinWkI9ooOML+xFX9onbZRz6EIa/c6J0k+n\nL+d6VJ5GOJcai5e+0Yml0oMOxwS7awl3ODd7+EHQljDJq+vhnH7oJE3UXucrR7+oMqPO6XbJjR/G\nQkgX+XV6XV6x7dJ5yjkWa0o5eauRhxeMvFziLF8TInEdMsP9iy++6L8gxB0ocVoQnn333dffj5AQ\n7kvZS5pSAuKl/6PaA7nBJcqmRc4Z+dGo2HG1EIjrudP/8BSja4eXj40AADlaSURBVK73XzF5q5UG\nEkNclLaI6bqwNDGGCBH
AAggggIBZAYKX2dZROAIIIIAA\nAghYEyB4WesY9SKAAAIIIICAWYFnu/zIiInRwogAAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 88, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "Image(filename='sentiment_network_sparse.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 89, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "layer_0 = np.zeros(10)" + ] + }, + { + "cell_type": "code", + "execution_count": 90, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])" + ] + }, + "execution_count": 90, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 91, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "layer_0[4] = 1\n", + "layer_0[9] = 1" + ] + }, + { + "cell_type": "code", + "execution_count": 92, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([ 0., 0., 0., 0., 1., 0., 0., 0., 0., 1.])" + ] + }, + "execution_count": 92, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 93, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "weights_0_1 = np.random.randn(10,5)" + ] + }, + { + "cell_type": "code", + "execution_count": 94, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([-0.10503756, 0.44222989, 0.24392938, -0.55961832, 0.21389503])" + ] + }, + "execution_count": 94, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0.dot(weights_0_1)" + ] + }, + { + "cell_type": "code", + "execution_count": 101, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "indices = [4,9]" + ] + }, + { + "cell_type": "code", + 
"execution_count": 102, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "layer_1 = np.zeros(5)" + ] + }, + { + "cell_type": "code", + "execution_count": 103, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for index in indices:\n", + " layer_1 += (weights_0_1[index])" + ] + }, + { + "cell_type": "code", + "execution_count": 104, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([-0.10503756, 0.44222989, 0.24392938, -0.55961832, 0.21389503])" + ] + }, + "execution_count": 104, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_1" + ] + }, + { + "cell_type": "code", + "execution_count": 100, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjsAAAEpCAYAAAB1IONWAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHsnQe8VcW1/yc+TUxssQuWWGJoGhOjUmwUMVYUC3YQTeyCDQVRwYJYElGa\nggUBJVixRLFiV4gmxoKiiaKiYIr6NJr23vvf//6O/k7mbvc597R7zt7nrvl89tlt9syatct8z5o1\nM99oioKzYBowDZgGTAOmAdOAaaBBNbBcg5bLimUaMA2YBkwDpgHTgGnAa8Bgxx4E04BpwDRgGjAN\nmAYaWgMGOw19e61wpgHTgGnANGAaMA0Y7NgzYBowDZgGTAOmAdNAQ2vAYKehb68VzjRgGjANmAZM\nA6YBgx17BkwDpgHTgGnANGAaaGgNGOw09O21wpkGTAOmAdOAacA0YLBjz4BpwDRgGjANmAZMAw2t\nAYOdhr69
x2\nMnbDsiIuH3JghwB4YF2pRaCJgHxZA13xfIEMrDqnn366FweAABgYDyVubQkBSPCDzw+ws9VWWxUF\nNknQEx7TttJnTZAsDBz40EMP+WM0mWHV0TniYq3Btwh/HZb58+f7gR6x/KCDMGy3bVd32MCBbshJ\nJ4SHM7uNo/L+/ffNXFNWqHCafOPPaHjetk0DpoHqaMBgpzp6tFTyaADLCrBDExLbG7fSeCOADYB1\n1VVXeesNeWnQvlA0AAHAABz23ntv79jLeYCHnk6hVUWWFc4DGMAD11MG1lqw0GgRvIRWHI5p0XHF\nC9faVlqMTn3SSSeRfTQj+Rmuf//+fhtZVA45UwM6QA9pcH6FFVZw3/nOd9z777/v9fL8889HY/18\n4a67/gbXo3tXn04j/PTq1duNGnVepoHBgKcRnkQrQ9o1YLCT9jvUAPJhsqcpiRnE99nnSx8L4Kca\nAcDReDSkl9TbSvkITgACIIGeSYcddpgHAuKMHTvW/exnP3PLL7+8hwXW8qMhH3p0ATphIE0FIEV5\nCGoELtonby06Fl/rvLqZM9v3nXfemQMc4iM/C93j1UWefQKgg5/OSiut5OhqzvrPf/6z23PPPX0a\nkrcR1vTK+tFWW7hBgwZlujgGPJm+fSZ8BjSwXAZkNBEzrgEsLFhePvnkE+80C/gADTQ30TMKGCol\nUDEoDZoAwi7fQImCwCNcAwpa8GWhiYgmKcKIESPcgQce6Ltvh5YSrEDkEeajPNSkxFpgBCTRA4r5\nsVgADxYsLQIQQQj78YVzv/jFL5SFu/XWW/313/rWt3y65EOZaMJCTrqbh6M9c5wFaFIz1xtvvOEG\nHHRILs1G2Vh7nXW8w3XWy8NzzHNd6ruQ9XKb/KaBWmnALDu10rTl00wDQAkAxPqJJ57wIAEA0YMo\nqflJFQG9qYATKgcWWYiAH5qwVo0miqS5iTQEOWHGHMMCQpMPFhFNC3HOOee4O+64w0dt3769mzt3\nrsOigg9M3759vbUHyBDchGkW2iY/BbYBLQIgon1ZcjjHNuP27Lbbbj4eTYA0twle5J+D3IylQzdz\nFo5zPc1wQJHgCthif/Lkya79+hu5cVdc7tNtlJ+5Dz7sZkYO57NuntkQReJ94D1IegcaooBWCNNA\nnTRgsFMnxVu2zTXAR55/tUBNUhAEAThJgWs5BwzRHZxu4YIJ1kCKAkARQgOWEfZvv/32aBqFixXN\n+/4MHz7cW2ew1KhZC6AghGnmLipiA3kIrLXI2sSacXvefvtt39sLAOOYLDSy5CAzozADPIAP55EH\nGYEbWYEk93XXT3OdOnfJ/Pg6cfU2GuxQPgOe+F22fdNA5Row2Klch5ZCSjRAJbHzzju7zz77zF1y\nySXu5JNPzjVZIaKsMsADwIOFR01AAAPAg1XlxBNPzJXo3HPPdccff3wOIAQ8WHmUZi5ymRsh/Iwa\nNcpddNFF3jKD/xHj4yArMCNLFIAD6LAI1EgDmYAbrDqCHFmjrplyrftBh44NBzvMhL5++3YeGstU\nfyov41nGuoOVx4JpwDRQuQa+/ItaeTqWgmmg7hqgeQtYIGCRwQFZzrtYRNgGaIAHAvCDMy8Aw4LF\nhhGWx48f748T58ILL/ROy0BF6MdDGrLKEK+SIAijuzigQ7j55pvdOpE/SmilkawAjBaghqYq/H5o\nwqOCZKEcrDmGDxAA1IghixOZFnMfsGQS3okNH1DMtRbHNGAa+LoGDHa+rhM7kmENMGig/F3oaUUv\nJKw2wEoILFh34s0+TMYJ9NC7C6dkBvMj3H///d5vhxGLBU1YhUijWsBDPkdGDtsEeqzRzRz5ADDW\nCspPlhwACNABbugtxiLgAXSwDLEQx0K2NCCrjgFPtu6bSZtODRjspPO+mFRlagAIuOGGG/zVjDFD\nD6vQz0XNVTQLARLAAtaTBQsW+KkUGGSQYwAGgw8eddRRPq
1Fixb5uaeeeeaZXM8nWYmqAT2jo3GB\n8DcCWm6MHLcJlIW0WeSzg3UK0MKyhPzIHlp1BDsCHc6xrPitFX2ajfbDwIIdftCh0YqVK48BT04V\ntmEaqEgDBjsVqc8uTpsGgBQsG3PmzPGi3X333e62227L+bsAO8CPLDMAA1Mt7LHHHo5eWHQPZwEi\naCrC2kKzFgG4weJCExPpqFkMEFHTGIBSasA/g5GSCYAO8pMOC+kiK3mTnxYAiLICZjRjIXPYnZ1t\nHfPrVRrTsvPekiVu6222LVXlmYov4OE5sWAaMA2Up4Hly7vMrjINVEcD70Q+CY9HPbC0JlV6VmnC\nTsa20ccePwa2e/bsWXDWaACmV69ebsCAAX6MGpyMO3To4DbaaCMPD0AEcXDwff31190uu+ziC8Mx\n+cKwrSYr8mVOqn79+vl4TDWBI/Oll17qrS74zbAQuJ70w6Ynf6LAD0BFoPlKXenZj1t0kCcENfKS\nzw4+OQAa4MMx5JcMpLPWmmu43zz/W5JtqMA9bAuB5573AuCRP08l5eZ9Iy0tpB2+d1gYlY/eO9Y9\no3fPgmkgixqw3lhZvGsZl5kPLBYMDSiojyhrrBoEfVSJqw+xPsw6BhhokUoECFhAsL5svvnmvncW\nMPDII4/4aMDAn/70J28x6dGjR86Kw0lZUbhWViDS4jiBZq0//vGPfrt79+5uypQpfjweIAMrixyd\nAQ3Bho+c54fmK6w6VC5UQLLqqBzADb5GGlOHbSxJpE05ZL2hqUrAgwzx/JkOY9y4qyJouyuPJNk8\nfPEll7uVvrOiGzrk5GwWoESpeRd4TnhXSg3he/fuu+/6nou8Z4AUCyH+3nHs8eDPCPkTR++d3lfi\nWTANpFkDBjtpvjsNJBsfSeCGyp3AxxKLRjkfba7nw81HmEH3SJtBBVmABpp+aPYBFGiiYlA+AoMD\nYuVZEjV9YAVhBGX1VFKzFVYZwAbA4XpBj4CH84Ca/IK47sEHH3SdOnXyaQI8LFhWQuuKFyD2Qxk0\n1QXNbuiE9Fnko0P+DBoo2KFcnAdogBvkVxkEXEn5oiP8ebi2kcIxxx7v/vynZf45UIXdSOVLKgv3\nkmcH6Cgm8LzyngBJghTW5QTS4D0mTbZ5hzWaeTnp2TWmgVppwGCnVppuw/nwYeSDCNjwcWSpZgB6\ngCgqAPJhfB0AADAAFiZNmuQuuOACnyWWGSqJ73//+x4WsIgQF1AAXAAFQtzCQzoAD2lidSGvIUOG\n+Lj8XH/99W733XfPpQOM0Myk9JKsPOiD5jqar6hACMBICGuy6gA7wBfnSBd5JTvWHcAHSw/n4nmh\nH8LoUee7s84+2+3+075+vxF+OkZjB82+Zba3iFH5KnCPGz1wXwuVk2eK94HA+3Fkld873gEgijnv\nmJuMbbP0NPpTl93yGexk996lXnI+hnxg+ScK8BT6MFejMIIeKr1f/vKXfuJLWWfwy6FrOQH/Gz7K\nagYSNAAQHAMWBB2y8CgdoAfgATrIZ+DAgTnRmaFcIy4DTlh4gA8WQgghVD6t1XzF9BthkN4vGnOx\n+8c//+3GXDg6PJ3Z7WefW+BGnj0imrF+3tfKIMDjBBYflkYMScDDc8l7JxhhuzUD+QFVyMJzLcBq\nzTwtbdNAqRow2ClVYxa/KA3wL+/UU0/1g/zxAaxlmDZtms8bEKHrOeAxa9Ysb/FBDvx4sMRgdeFc\n6PfCvoAHC05oZRHwsFazFiB32mmn5fx48AG65ZZbchaeJD8eKqFqNl+9+eabvpkLmGIR3MR1zj/9\nq64anwgH8bhZ2B957mj3P//+l7vs0rEFxaUyZlHIpx+dz9o6BB7+VAAbAA7vXS0tLchBvlgskaOW\neWftnpm8tdeAwU7tdd7QOVL5618elWu5PjmVKompFugmzkzrzHe16667+sEB+RgT6FFF8xFWF5qA\nsO5oAXjkaCxHYaw5AA
6WHZYQeLACMQmpJhLlesbtadeunYcp4EnNWsAIoFNJ8xU6/vjjj3NAxeCH\na665ZjPLkS9kwg/NPuPGT2iIpqxevXpHMH1eXrhLKL4/RKWsgMWHJeuBMvEHgzXvXb2AjmeTdwyg\nr+f7n/X7afJXXwMGO9XXaZtNkQ+dPrJ8dOv9zw7gwRGTnif33Xef99MZNmyYmzlzpr9HM6LZsqno\ngJEQeLD0ACzyfwFmcBhOclwGejgOFFHm8847L3f/77zzTkePLTWPATxMP8GUEAz6h1zoiPQFVaQn\nPx0ck9lGr/QAw0qEXFim6EqPzAIzWXVymefZGDNmrPvrRx9nfvbz66fNcDfNuLFiK9U7gdWHe1Ev\nOM9zu4o+DGDgO/Piiy+mogxYlQRfWdVp0cq3iJnQgMFOJm5T+oUU6MiEnQaJkYneWVQE/Mv89a9/\n7TbbbDMPPVhngBy6owMKQAOQI+tO3OFXTVpyXAZKQiuP/Hj4Rxs6Lp9zzjl+IlGAB58hZmQnAELq\nESOYIg3SBHLoKi6QwoGakZ2RiW0WtklTPb8oQzGByn2TTTZxry583XXp3LGYS1IZ59DDjnAH7L+f\n22+//lWTj+eF+6fAs1xvYJcshdaypADblAGAT0NQkxpyGfCk4Y60bRkMdtr2/a9K6dMIOioYIMFC\nhcBoyvzzZRoJZkcn4LiMNQYrDsAj2Al7OKlHFengw6Nu4XHgoZmLc+iDXl9//etffR7y4+nWrZtj\nfi3m7rr33ntzPbUAKebiAnaUZufOncvufeUzLfAz7MzhkZz/l1nrztwHH3anRuPqzF8wv1VhBPDh\nXhLSavUJQSeNYGbAU+BFtFM11YDBTk3V3ZiZpf2DC6QAFMiJrwzWnLA7Ot3SaX6jmQmLCcATWk/k\nb8PdiwNP6MeDVQZgwfpDPHpbATHxQPMV0MPov4AUsnXt2vVrzVeAExYbLFChEzUyltp8FcqAdWe3\nn+7mbrhxuuvRvWt4KhPb+OowvEA1rTotFRzoSZvVh6YiYAK50gg60inNWSxpl1Py2roxNWCw05j3\ntWal4iPGR5cKNM0fXEEKzrxYWrDmMMjgwoULva7ydUcHLORgHFp4ABT58WCNkUVG2/Ljofnsiiuu\nyN0Ppr+45JJLPNysvfba/jjWIqCJ5istQBMyC8Aqbb7KCfDVxvgJEyPoe9Tdc/eXc4jFz6d1nxGT\n5z/3bN3lrrfVh6YhmkGz0kSErAAj8lowDdRDAwY79dB6g+QJ4NAWX8/eH8WqEnAgvP32227rrbd2\nN910k/dd2XLLLf1xwIPeVGF3dFl45GAsh2V/QfQDpLAANsCKgAcLD9NR/OY3v/HnQ6dlrmUU5+OO\nO86DDJYboEkWIgYPZJt0yY+8JUfYtBaXRTKVsu63T3/XrXsPd/bwYaVcVre4jKuzfY9uqXHClSJq\nbfUhP/xy+JORlTFtkJlvBfJmRWbdX1s3hgYMdhrjPtalFDT98AHDupOFAPCwjBs3zs9kPm/ePPfq\nq6/mHIWPPvpoPxIsIIFFR01HrOUMLMgQPGHhEfA89NBDOeChmQmn4rOjEYs5Hg+Mtjxx4kTfTAXs\nhP46pAd0Vbv5Ki4D1ol+e/dzvxh3pRtwwH7x06naX7rsQ3fYoYdGTZGD/D1KlXAxYUKrD1DCUs0A\nLJBH1qwkyIuFB9mrrZNq6tfSakwNGOw05n1t9VLpw5X25qu4IoAUAIVZ0bfffnv/L7OY7ujAD8BD\nsxIggkWGjzbj+JAeC+kBLbLyPPfcc+6QQw7xIjDwIB96Blr87W+/nH183XXX9QMQrrLKKv46riUd\nAr2sBFuh/1Cpva98Ygk//NOm0pz36Dy3wjdXcDNvmpVa/x1A57jjjvfw2NIAgglFresh3g8WBf4g\nVBJIi950DKuQRWDAb46Ar5EF00AtNWCwU0ttN1BefGgxo+vjlaWiATxYdfbbbz/38ssv
e7DYdttt\n3dKlS30xnnzySQ8zYXd0wOONN97wXcMBHmCHwQHxUyI9QZSsNAAPlh0G/0NXs2fPzo3HwwjP+tgD\nL1iaGDsH0CFd5Ss/HZqx5Dsky1Il+qbCBLxw1ib07burey9ymk6rw/Kppw1zb731Rzdj+rRU+4UV\nc09CawzPBUspgfeNa3j3shiqBWt0MsDnjoAP3FlnnZVFdVRV5qlTp7pjjz02lybfpFqGSy+91E+X\nQ54vvPCCwz8yTcFgp4p3gzFc8AkhxF/AQueqKEJNksJHB6sAH64sBsEJ1p0ddtjBT/fAGDg77bST\nL044O/qyZcty3dFxbGZUZABF0AGcEJQmwEIzFM1XckzGdwerkHprYcHhYxB+oOldhDwCHaw9jBHE\nOrQqkZ/y9BmX+COL3KeffurTZ5+mSLqj33v3XakCHiw6o0ef7z788MOGAJ34reL9Cd+hlqw+xMWq\ngzUxzZ0B4uWM7+sPkoA/fr6YfX1PN9100wiE3yrmkoaIE777U6ZMccccc0yuXPWGHQTRfQF0+Mal\nKSyXJmFMltpoQBUma16QUgMfqSw7Gar8/DvGURnfGEYkHjVqlFfFww8/7LuNAy0ADt3CGSMH6ABU\nsN6ouUm6U5pA0CuvvJIDnRtuuMFtsMEGvkmK67EKAUadOnVy1113nS53EyZMcFdffbW3/nCQdARV\nYdMV+ZQbuG8AFaCz1VZb+WY4QAd5aB4686wz3VGRT8ytt99ZbhZVu05NV40KOihq48hCA+BoATy1\nhBAkpfK8Mrt4lkGHslAORnumKbWcgAVBfyrDPwzlpGXXVFcDuh801ZdTt1RXmuapGew014fttaAB\nPsIMzqd/Zy1ET+1poIFK5r333vOmV/xrgBr8bgiMj0OlomYprDL0tqJ5imOAEMADKCgIRHB0JuCE\nfNBBB3lIoimKpjAsN4AM12Ht4YMADBGALHx6gBECceQfpLT9iTJ+uF+DBw/2V1JhUqlS2ZIHC+U5\n7LDD3HkR8I0cMdzRxbtegUEDe0f3hmZAusZnvXIvVo+CHtYEgQ++YQRZVP1Ohn947hjUk/KUGrBq\nATuE1VdfvZllo9S0Gi0+Vh69z6zrEQ488EB/X8h7+PDh3gpZDzmS8lwu6aAdMw3k0wAfKCbQbIQK\n6IknnvD/lOnuTdMV1ptrrrkmV3RBi6aIiAOPYCf8sDCQIH5AzH2F1QirDJYjIIdFY/aQiZq8+De0\n5557+nxxPB0wYIB7J4JKAASwygdXOUELbKjLL/+kCfgHYeHh/unDiByCut12+2k04OJE9+Tjj7m9\n9urn6O5dq4A1h5nMGR354rFjW5zNvFZy1SMfgEDwwzaWVCAYS1wjBOCb57DUwJ8DgIcQNuGwzzn+\nFLDId4UKV8dY826pgwDXxAMgRVNMeE1oSYrHZ5/0SFfXrLHGGl4WrE86xlrWKKXBflw+4nEsLiPf\nJ86FgTJyTBaUsPyKi67Y1oKc8RCPc9tttzWLgn+U8lI6yKN8w8gAKMBDIN14WmHcmm9HH7y6hsi3\npSlqdwVDcwvHonbYZnJFD3bufKTQr50P02A7HiJH0SbSDfNhOymv8Npi5eOaUAauC0Ohc8SL/tU3\nhWVEtmgqg6aoXTZMJrcdpoeuuD5qJ82VDx3FZSC9ePm1ny+fXIZfbUSg0xQ52MYPZ3b/d7/7XVM0\n0F9T1DzVFEFP01/+8pemCOhyejrggAOaIoflpmeeeaYp+gA1LVq0qGnJkiVNPE/RAIBNEQg1RbDg\nyx9NRZG7bs6cOf54BBFNkTWo6bPPPmuK/H+aIifnpsiK1BTN09UUwVBT9MFoirqgN02ePLnpzDPP\nzF3PfYnAy+f10Ucf+bxIh/TIT3kWUjzyRH4/Pk3WyKTA9aRFuSlH9GFqisYG8vlFH+GmaOLRpmHD\nzvLP9CmnntEUzaWlS6u+/mDpsqYxYy9r6vCDDk3H
HHt80+LFi6ueR9YTHDp0aBNLowSeN55x1qWE\n8LvHNy8MfMP0PeNbGn4PdVzr+LV8QwvF53sa+aCE2flt0lGa8XX0J6bZubBOI614/Ph++P0u5tsd\nlp+0FCL4yOVFOeKBfJQ35/m2KYTnFCdco+d4uPXWW3PpodO0hP9opMYSlfpwEZ8bIUWHSo7f5PiD\nzIMVXqs0wnX8mlLlQ33hixg+qC2dK+eBiucVliXc5iVRKOaFUdx8a9KudmWErrmH3FNkTLpXHOMc\ncYjLNdUKgACVe9RM5aEk8hNpihyGc8/a+PHjPfAAKVGTQtMf/vCHpqjnVlNkNfHXCHgiPxh/DUCo\nEFlnPFBE/8r9NcDO/Pnzm+6///6mW265pSn6d9t07bXXNl1//fVN0WzsTZGPTy5fdH3iiSc2RXN5\nNUXzbDVF00s0y68Q8ACkAh3kAnwIAiVBGIDHx43yADmvv/56U+Rz1BRZp5qiMYg8RA8cNNjLBPQ8\n8+x8Fa3iNQAlyDnk0MOboslPK06zURPgHlZbP/V+7yhTCOAt3bs4IMTjx+uB8DsY345XwoVAR9fy\nDQpBgO2kb5Xix9fhNyv8fsfjhfvKr5hvd7z80k/8eBzaQhhiWyGEllCm+Ha8rkPmME48P6Vf63Xd\nYKechysOBXp4wgcnvFkos9gHkjTCUI58oRzxByDfuXIfqDC98MFK2iYPQjEvTKiD+DYVJlaQagXu\nX/iiJcle6BjX6hmoRKbIf8BDBtASNVX5f5vR3FVN7du3z720WHeeeuqppgULFngYAAywhGCxweIS\njYrs4wIY+rcq64nSxCIUTU/hLTsPPvig/9Bzb6Ju6R58br/99qbIH8pvR6b0XN6Rk7TPE6sT+ZEe\nsgJSScCDBUB6A7xCeYjPtYAd8AREAVMAHJCD9SrqPdb0/PPPe7CTJYsP1siR53jrS8+evZrOPmdU\n0/0PPFSy2oGlqyZMavr5Mcd5GbHkVLsSL1moDFzA/axWSMt7x3M6atSooosVfv/jsEIi8UqdOKpo\nqQfi3z+BRPy68Ntd6FwoD/cnvC5u1eG8vlVxa1B4Xfycvt1Skt5r1sgWhrisOheHjzA/4oTAFuYX\n1jGhLilHqMs4BJJmeG08P87XI9TFZ4e2vrBNMlIGb7JfohsW3ccvQ/SRbtYuiG9DpESd9o5qpBVV\nPP5YpHTf5TsXIdrgPOkohHmRngJpqH2xXPmUVilr2mcVogfKd9dDF9ED5Wfk1jnajcNy6LjWYbmi\nB1aH/Vq6jl4kr+PwJPomv8hiEh5O3MaPZOPIf6AaAV1vs802OZ2Xk2Y10iBffCOYnBPHYRb52Dzw\nwAM5sU4//XSvp8gi4h2V5aysbuQXXnihjxtZVJr5w8jvJgIM39OK6zkWTkuhbuYRKHkn5rXWWsv3\n1Np///19ms8++6zXFZOH4jeEkzTpkY7eGyLin0NZrrrqKn9dVJl4J9DQP0fyIDdl+Pvf/+7n42LN\nwjHSjqDIt/MjJwvv3dlnj3CvLnzVDYl8ar694jfdZZeM9XGYdiKCFu/UjGNzfMEP59DDjnAdO3T0\nvb1ejXqrMQEpz/OUayZ7mb3A9pOoARyVIytI4rlSD1bjnalGGsiN/5Gcr4sph75jxA3rgXzX8h3k\nm0pIqhv0PcUnRYHvYFgvsM+3VYG6QQE9KMSv45p839SwHMgV5hdBRLOySUblU86aPKI/hrlLw/zZ\nVh7EI38Cx1Wvsh/qEt2zT3wC14e64Fh4f8J0OFevUBfYKffhQkkhDPHgyTOfc3EY4lh4E5IeyPCm\n6CGoRD7yLDZU+kApn3i5eLDDh1sPs+KXu+bD1DOqTCsNPPw4vFVDLtIgrUpeKACOCoUA7NA9HOBZ\nb731vEMvxyNLh8OhGVgABoACAc+h0TQGBJyMGaxPAWAAbgALAEVLCDuADjDChwPYwbFZXdSBFWZk\nJ3AtlUPkO9QM
eEiffCKrm+/hgoykA3RpGg8BkWQnLaBJk46yZv8t+iwAAEAASURBVB85iUOQHnCw\nRh8sHCNQxnNGnu0ee2yev4ennTrUde7Uwa280rc9BAFC4bJCdNkxPz/aPfDgA27RG4vc1ClXuyMj\nB9VGcHL3CmnlHyC2GrpK43tH2YoN+j4TP/xuJ13P+XgcgU88fpiuKvswTvgtRYd8c1haui4pLdKl\nntI7GVldfFbUOdRlOBBX8i0L5Q63Q1nC+i3cJo4AJiwbeovrknhxvYT5hfHDtMI4td5evtYZkl9Y\n+PAmSBaUKIuHHi7dBOJTuYuw9WCg3JCQlVaYV9LDjgUlHsJrSpUvnlah/TCfQg9UvKzxNJPKxbEQ\n9OLX1HOf8sRBh/vHfec+J5UHedEX1/GChrrjGGmG/8BKKZ+sVerBIOsOH6SDDz7Yd0OPHIr9BJ6a\nHR0wABCw6GAV4trI7yZnEeHaOFzIasI5AQQ9tDQNBWkALoIjoAq4nDFjhhs4cKAvEqM+n3POOe74\n44/3cYGy++67z9FzDGjZaKON/NAAGj+Hi0gTWQReyIHsLAI2zhFUdsmlHmQcl5XHR/zqh0oYGVks\ntI4GqvUnI43vHXBebAi/GaoP8l0bVrb54ui46hD2k3orKZ7WoRw6lvTNKiSDvlmq55ROa635tvKn\nkEDefD+ROfyOhvASlpE4+jbmky+MH49T6Fw8bmvu1wV2ynm4wocbqKEiD5UYWnyksDAfjhV6+HQN\n6/C6Yh/+UL4wrULbofyVPFDFlquQLMWcw/rBP/JKQ/hvgrS4d/lMvmFeIXiSBt0fFeJp6nipa15q\nKnUqd6waVPZAFOkDBjQtMQYPIEIX88ip2GdBRcJYOkAD1wp0BC6hVYc8iAPkMPYOiwYOJF2uAYYE\nIhtHlicg66ijjnKRj4276KKL/HQXkYOzY0b1SZMmeRm6dOnimOqCZxGgIiBHKEscdMiL8wRkAp6w\nLGlBRll30Auyt/Th84nZT+o0EH9H6v3e8VyXEsLvZSnXpS0u5aAJP6xnkJFvIN9yviXxc5WWgW8C\n3089A6zJS3+Idb7SfNJ8/XJpFi6fbDws8Qc/JNR819nxyjVQ6gcqKcfwXvGCFwM68XRk4dPxME0d\nq2RNxQ5wqDmLua0IwEjUO8vDhORm9OXddtvNQwqwo0WAA2CwcC1WFtIGogAKQEdrtsP5sNgHNpCB\nj9Gdd97p+vTp4+XAj2fDDTfMgU7//v19UxvNYuQhaw4gA9CQv/xz1GyFfAIdgCaEL8mFnJwDhJDb\nQnY1EL4jaX3vCmm3tf7UhenKr5E/C/mWML7kDXWrY/mAJYQZ0qJ1gbyAz6TWCaVX6Tr8s4i8Ah/S\n5RzfGIVwm3P5dKHjScYGpZWWdV2+XuHDUs7DlWT6Sxr4KbxhKDzfwxe/GZXKF08v334oX6M8UPnK\nmu94qOt8cfIdr+TafGlyXNaLEHi6d+/umL+KEPWaauabw4suoAkBh201FQEcAh3gAYgALljYDhfg\nBysRi2Y8B3iQJ+q94icw9YJ89cMgj9E4Pd6vh3yAKlmIkAsZkkAHeSgraQt0lC8yCHSAPvKWXsK8\nbTubGqjk3ank2kq0FX4v4392K0k3bIJKgpaktNFBKE8IDoqfdIxz4XGgM9Qn5Sq2nlI+xa7154z4\nWHRCOcImLM7HdVKJvsPykXa9Ql1gJ67IUgoPFesm8bApLW5G6KxMmpwPH8ikh4jRLvUR1/VKkzSK\nffiJW2qI51PJA1Vq3uXExz+jlN4TxeShe1lM3HicSq6NpxXf55mggseiAZwABCNGjHCdO3f2UeVY\nSC8twEA+MFoLMliHFhQ1FQl0SJdFPjzKS/CBhUUL8BF1f/cWnlBepu8AdtSjSjJoX47I7CMPQMQ/\nMspH3oIrwArYCUEHefV+hHnadrY1UMm7U8m1lWgtrDSTvuXlph1aPPgjrXqA9M
gHVwa9A4yurBAC\nAvVSeB3bHGsphO4Y6DVsmm/p2lLrC+rCsKySL36cfKmbpG/yQa6wLuTasO5UWpI5vD9hPafz9VjX\nBXZChZfycKH00KqDyS90SkXh8RcxfCB5ANVGibJJK3xgJJfWihM+xIUe/lJvYKUPVKn5JcUPy590\nPjyG02spvSfCa8PtUL/cr/h9COMmbSMz9yS812GaSdeUe4zKXs1ZwAZ+MmHAqoIVRXDD1BMsgIWg\nI7TqABdJoCOoCPMDOgAdAIQ1eY8cOTKXPc1pyEbAUfpnP/uZi8bO8fmzBnIkD2vkQVYC+VCeMH3y\nQzbBl2TiQ58UeBYej/y4xo+fEPUau8h3L6eLeXxhRvXxEyb6bvDvRMMXWChdA4343vHHiZ6DxYaw\n0gwr02Kvzxcvbl3hexTCTVhnhM1M4TZph9exnS+E5QAgBA1xoMh3vY4rvzho6HzSOuk7ybHQKKDr\nwvIhJ35G0kvYmxYoCq1GXB+CUVhepV2PdV0clFEMlZUeWG5avocjVDhxVDlzc0hHVKqKj5sQ9rDi\n+vBhyOdwDBTpppQrXzk3EPnkJa8HKimdpAcqKV6px6T7Yp0Vq/XR1f1CXp4FFvSv+5lUDq7h/ocv\nkuIlvcQ619Kaj+7GCc6SvNiygAh4mJk8DFhV+vXr560lNAsRD0jAFwbfnRB0wuYrQENQgYWFIKiI\n73Oc3lZz58718X74wx+6yy67zIMKztKnnXaa1wnnmaWdMTDowo7skgNZ1GylsgA2AI4gB5kkf1wG\nn3H0A6w8/vgT7s45d7l777nL7d1v32guoe+7tddZxx3xVY8xxdU6GrAwgq4v3K233eHwLYoGJXR9\nog/sDtv3iLZ7Kpqt82gAHY0ePTrP2eIP846k6b3jW8IfqGID32jVE3wD+BYkVdLFphfGw52CuiHp\n26J48bFz+Cbz3dT3W/G05tvOdy0eqF+ok1SXhec5x3EBlupIxeEbWUhGxcu3Jn3pUHFCg4COsZYs\n8fhhHHSA7sKA/GHZKvk2h+lWvB19EOsSIiApOBdJVLBmI1IyEibHtEQPXk7uQueIFD2Quet0fbiO\nHqBmw4BzTanycU1043P5hPK1dI64oTzxbdJFnjCEeUUPW3jKb4dpRg9ts/OUN54HOmopMNItow1X\nGhjRM0mGuEzF7ifdv1JkbGkk1wgS/DxSTPOQJBOjHkfQ4UcCjiwdTVF32ibWHNPxp59+uol5uN58\n800/NUP0MfAjIUcQkjgKMnlGoOLn6oocoHP5RmDj59eKAM2PxMzIzo9F92XQoEG5OBFU+fm2yJtn\ng4Vt5GLKi+hj6aeFiMDFjwKNLJEVyE9rkU8ehvVnSgfKz7QRt9x2RxNzWpUTGHmZEZgZiZnlxmjK\nDGSwkKyBao1cnrb3LpqU1j+3yaVOPhp+NyKobxYp/M5HFWyzc9oJ39/4N5U4fDfDPIjP9zPpG6s0\nOUd+SptvM7JwXMdYo38F6qwIMnLndQ3nI0jKHee6UM74dZzXtzssP8fzhbB8ESw2kyvpGvKMy4S8\n8TpO13JfyJ8l331Q3Fqu82ukRlIU+3CFNwhFxwMPpBTMDQwfEOJyw8I4xC10w5R+sfIRn/QkQ/xB\nKHSOa0t9oML0kl5E8pcscdgp9MIgS77AnFiR2Tnf6ZKOI0N4TyVrqWvSIK1KAgDX0hw9wEfUtdvr\nlPhMsRBZRPw+cPHQQw81RePd+Ak+77333qaoq3gT68ja0jRv3jw/zQRTRbz33nt+igbgImpSSgQd\nlQU4iiw0Po/ICtMUjbeTm94BaBLwADLMtTVmzJjcPUePQ4YM8eVCrugfvZ/uAtBhCohobKCmP//5\nz03M2RU1b+WdfgKQEpQwzUO5gKMyxddAExDFJKBXXTU+ftr2v9IA97MaQJim9w5AB3hKCWGFHv+u\nlZJOLeKG32DqpLYSQogTiKWh7HWHnTQowW
QoXgPMjaVJJYu/Kn9MPgghuBULO1wTB8r8uRQ+U0xF\nEo1n40Ei8nHJTcwZzo7OHFQAE/NczZo1yy/MdcXs5lh50BkAziSjmk8Lyw0QlRTCiTyjZis/V1Xk\nF+Tns2IWdObZAlqYwwqYAq7ImxnUQx1GfgB+FneAGKsOwAW0Ms8WoEOagq5QFuIwb5WHkAhyWjtg\n7QF6ACsAy0JzDRQD5M2vKLyXhveOb0mp9xrrCODAM16MVaKwFio7G/5Z43sU/sHmfZOcyErcthC4\nP/r+oJM0hW8gTCScBdNAURo4MhpUkHb2U045paj4xUaibTr0yYn+xTa7NPpwNPPpiV6kZufL3YmA\nxftD4LeTL3Duxz/+sT9Nt/O99trL97DCCRnHYHpCEfCr2GSTTbyfDj4v+MRo3iucEHHGVFdy/HeI\nIz8dn8BXP+hW81vhAK35tvC/YVGXdhyQI2DxSwRQ3jkZmfDPiaw8jrm0CMh0xRVXuA022MD78mhK\nCvkM4adDkCwPP/yIO/mkk9zue+7thg073bVvt54/X4uf8RMnu8kTxrvDI/8fpqSw8KUGeLbwl4qa\n/Kqqknq9d5Sl3A4P+MHIjySCtlYdm6aQskM5CsXjXGTh+JoTb0vXZPF8qJO0ldlgJ4tPVB1l5mPL\nnEuF4KCO4pWUNaBDeXBO1jxSSQnwUX7ppZcc4BFZbzxw0KsJ2AAy9thjD/fGG2/4S4EKIAInZXo6\nAThM7Ans0HUf2AGCAAzBhfLEYZN5pzSEPnNjSS7+k0SWF583Ts9AjWCH60LY4TyBiUyZ5oKATPTm\nYpRlAZfkALrkkDxmzFg3c8Z0d8GYi92AA/bz19b6Z+Fri3w3f/KdMf3LiVVrLUOa8uP+8pyeeuqp\n3vGzGvNk1bt8+oZQrnICPYNw1OVPT2RRKSeJqlxDD6rQ6Tsp0ai5zcNO0rlGOsYfVLrms44sWX5S\n6zSV78tuIGmSyGRJtQaojPlXxpL1QC8XelMBO1FTk1/iZeIfNaADtOjDLDgAaNgOe2hFPgi+FxRg\nwiLDqa5hDeTouPIDHpEH0CGvpIk8SQ+rDaDFmgVLD4E0kQeLEWDDQs8n5tEiAEDsR/49/npZiSQj\n8tBFfMFvfuNuuHF63UAHWbt07ujuuXuO7+XVv/9+DQHWlKuUwPPw+FfPJO8a1r6o2ccfKyWdtMYF\ndsJJc0uVE6sBActUUo+nUtMrN37UXOVBJqlHE72xdL7c9LN0nXqYYYWnR2jagll20nZHMiAPTVkE\nVf5+J4M/yK9/mBKfCkaBSmbw4MF+F4sOH2dZWAAOrCtYVKJ2at8tXGBx0EEH+RnI6dLNiy/LDtYd\nxsxRF29ZdrAwoVOapKjQ2MeaJCACSIAT4AZo0fg9WHZYkENj6AiAuAawAn4YA+iwww5TsdwJJ5zg\nLSfIhyzEOfvscxxdxK+Zck1Nm61yQuXZOPW0YW7uffe62bfMLqmbcp7kUnuYZ41Fgfsft+DwrPJs\nhM+o4mdpjfy8S1isLJgGaqUBg51aabqB8uGjzMeYdfyDnKViYtHBciN4i8vOeWY0x9JCJcM+MCK/\nGSCDwfuAnchp2PvFRL2yfDLnn3++N7HjH4OO1IyFD0/YfEQ8FsJWW23lKzLiC3RkgQGuAB0NXkje\nbOO/w3HicQ2LIMonGv2wj5xnnnlmzuTPeDz8O15vvfUiHVzgyzll6pRUgY7kF/DMXzA/08+byqN1\nCC08WyyFAnBAHKw+LcUtlE69z2HBZOHds2AaqJUGDHZqpekGywdA4IOb1Q8WVh1kB9iSAueAEEBH\nUBf1UHIsgAWAoaHj5STMKMWHHnqoBxCsJTfddJMfsA/gIR3W+MvgywOsnHHGGblZ06NuuA6ZCIIW\nWXTIC6gR6GDFiUMOTVha1FSm67H2sE1g1OU77rjDb2PVOfron7mFry50s2b/KpWg4wWNfgCet976\nY6Z9eI
CU0JpBhV9q4LkEkkJQKjWNesZHbjWFZ/mPUj11aHmXpwGDnfL01uavAgDo5UPln7V/mVQ4\nWKby+Q1QKan3Vdh8BYSETUm/ifxboq7kHlwAkU6dOrloHB134okn+udjxx13zDUX0XwF6GDZica3\ncQcffLBvNiLiDTfckGsuE+gAVOSFRYe046DDcVlxNCKymqTUuwrAIZ7AiG2OUeFEXem9jDh4zrxp\nluvRvavfT/NPv336u+222zYzvbR4zniWFJKapnSu2LWsO1gay4GlYvNprXjIzAK0WTAN1FIDBju1\n1HaD5YXTJB9zKs8shZbkplJS7ysqFQJgIYsO4IFlBksOzUNYWrC+0DuEeFhOosHb/HW//OUv3dZb\nb+0dhrHoROPi5LqgAifRyMq5aUq4ILTGkGYIOmwDLkALQT45NIux4IODRSmEnTANrmefcnDf6J4+\nZuxl7ujBA316af+hl9b+/fd1l1x6SUXOra1ZzvBdwHLBs1TtAKTL1yxL1hHJzR8lC6aBWmvAYKfW\nGm+g/GQhAR5YshCojDCjU9knWaSSmq8AGCAES0sIOnIOlpWFZiRAAwih59OyZcu8SugyTKUXDern\nrrnmGn+sXbt27sEHH3Q/+MEPfPMT14SgA9SwyBlZQAWoEMiLHleCHNbAEwvnCMgN3ITpCJguvHCM\n22DDDd3UKc3n+vIXpvjn+mkz3E0zboyGALgzFf47VNwsClgtahHIh2cKgMhC4H1D5qxapLKgY5Ox\nsAYMdgrrx862oAE+YjT5RCMEt8q/2BayL+m0mgCoII78qkdZmIDKwrFCzVdADlYd1sAEkALkABoA\nCNYV8sIJmLD22mt7B2biKQBdW2yxRa43FE7EnAecSFOQA5ywcFygQ14h6GDREewItkiP+Cxx4Inm\n+HKjR5/v7rt/ru/mLZmysmZW9W7durohJ59UF5G5dwoAM0utA4Al2El6lmstT6H8eBcAHf5kjLbm\nq0KqsnOtqAGDnVZUbltJGsdaLDtUAq1htq+GHvXBRT7kTQqcK7b5CtgBQoAJLCmADs1UgAfAAWww\nxgazlYeB8ThGjBjhormtPMBwDeBCZSAwSQIdrDSkCUgRn3y0sM/COSxELITQIgUsYeFB5p///Fi3\n48493dnDh4WiZWZ77oMPu1OHnOxq1TsLCOb5UeBepSFgJQF00m4tUTfzxwNITIP+TIa2pQGDnbZ1\nv1uttHx0qRT4oKXNj0Cgg1z5Prj844z3vgphAUiQn46ar2jWAkAADaAFJ2QABOgg4J/D6MoK++23\nn4dC4qv5ifjs47tDfqTJObq4AydACgGAAaLC62TN4frQooNMBNIjyMJDWgsWLHDHHXe8e+LJJ1Pd\n+8oLXuCnNa07PC88ywpAcNqeacmGlZJm0rRaVtP8XZAObd02NGCw0zbuc01KqQ8blpO0WHgEOijg\n8TwgVm7zFTABZAAs9LRiAUCAHXRw8sknf03vjJAsQAqbvYhIMxZNTn/961/da6+95iGF4+uvv77b\naKONcqAj4AFyWLAsyU9HoMN1BAEPaQM9Z5xxpltlte+6MReO9uez+iPrzqI3FlWlCDwbCoBNWp5f\nyZS0pilLYJZGy6q+B/neu6Qy2THTQGtpwGCntTTbRtPlA4dZnQ9cvSsMKgNM6BtHPhXAR75/58hZ\nbvMV4KFu5Vh32B82bFhuCokddtjBD+bXr18//0Qwl87ZZ5/tgQdrDWAEqAApgImsMKw5xjkGLGQR\nHDHvDH5AgFYh0AkfQdJm8MPte2zv7phzVyZ9dcLysN2rV283dOiQsnpm8WywKKSlaUrytLSW7Dzb\nBJ5vgCefP5qPVKOfYv5g1EgUy8Y0kNPAf0Xm+9G5PdswDVSoAeCCUXl33313DxfdunWrMMXyLuej\njwyMZ0NFAIQkBR5/Jshk0D8AjXiAgSwhWFrUhIUvjfx0ABWsKlh15KvD
OaAG52bC8ccf7/Ned911\nHb2vGF2ZuXyooNinmYqFPOREDOSQt6w/yLPOOuu4zTff3PfcYk1FRzrvv/++n64CfcctOvGycp7e\nX0uWfOCGDqmPY29cpkr3P/v8C/fyy6+4n+7at8WkqIBxzEZ3LLLecC9YshSQnxDKDbB37NgxaqI8\nzo/9tNtuu/k4tf7hHSJv3vvZs2fn/YNRa7ksP9OAWXbsGWgVDdA0FFpVwg9zq2T4VaJUaliX+OgC\nMvxjz2dhqmbzFbOeU9GwZqRkYOuII47wlhogCT+fAQMGuGeffdZLOnPmTG+pUTMTVhoWWW+AHBaB\nlPaxBMmiA8AwenPoX4Ke8+maiT5XX2PNzDomx58bjbuTrykLvfA8EAQ38TSytp8EOmEZOM97R+AZ\n5PmvRUDPvG/8sWCdlaEoaqEbyyMdGviy20Y6ZDEpGkgDAAaVDR9bRloGQPShbo1i6mOrip68+OCy\nH8JAmDcyEfbZZ59cBcE+lhUchWVtwWLDgoMv52TVEYDcf//9btddd/Wgg28NfjlMIEo8FpqaAJSr\nrrqK5H0YOnSoTxMIYqF3F1Ye0ie+eneFjs8cC5u9gB0qcXSshcQBPS2q7Dn+wvO/cT133onNhgjM\njt6uffvc/aWsKjdr7r30kg94s6QIvT+UK1/gHM87wMPCM67r8l1T6XEAR/mSt4FOpRq161tDA2bZ\naQ2tWprNNMDHln9706dPd8wBxcewWpUPafOx5V8saZIPFVwYqAQFXjpOvEp7X+GQfPnll+f8c7bc\ncksPOsx0jsWGRdBETy6sMPS6Ouqoo7wYvXv3dgcccIDfpkkM3x+sQlzPmqklOAZUyZoDCBFaarby\nkaIfyi3g6dWrl5dJ5xphfcyxx7s333jd3/dGsd4k3RcBSyHQiV/Hfedd03sH+MTfjfg1xe6TNu8c\n7x6BbVmU/AH7MQ2kTAPLpUweE6cBNcAHmo8i82gR+OAKTPgHXmqgAhfcYDWiIpBTdNLHXNYP8hL4\naKZx5OK84ASfGSw4surIr4bjBGADMMHSQ7PV1Vd/OQLx4Ycf7p2cQ9DBSiMrEdBDGvhV7LXXXj6t\nefPmeWsQcdScJWsQFhxZcgAdwQ4XFgs6xEXP0skhhx7OoYYKG2+yqevdexdfxmoBdNoUVA7oUAae\na713vIPADmsAqJz3DjlID6jhOec9ZJ/jBjppe2pMnrgGzLIT14jt10QDwIkA5d1333U777yz/xDz\nMSbwoWbhQ0oQpPCBJVCB84FlIV6xgeu5FisLzVfIQAA2gBEgB5DRmDrxwQOxsvzlL39xwA29mwjX\nXnutHzwQCAmhCcABnOSzwzxan332mV+oeOhiTmAKCVl2KMuaa66ZmyUdyw7QIwjyF5TxAxy++94H\nbtwVl5dxdXovoQv6zBkz3KybZ6ZXyAok0/Ov96KCpPyleueAHXogbrXVVv69C0GRbb1n+d473qFq\nyVRpmex600AxGjDYKUZLFqdVNRB+UNkm8JFnO/wI6wNbyUdWzVfk8cknn3hQAlBkgQlBJ2nwQGY6\nHzJkCJd7AGG+q2222cbvAzukAzSp+Yr0gB3giUU+OoAOwEPo0qWLGzlypO/ZRdMVwEPvMDVjAUJY\ndki/FKuOT/yrn/ETJrrPv/hHwzgnq2yNDDvVBh3pTOti3zveQd658F1UGrY2DWRFAwY7WblTJmfF\nGuDfKvN4EaZNm+Y/4FiUgB3Biaww4dxXnAc26KI+fvx4fz0jHD/wwAO+S7ggBMgR6Kj5i/QEO+pm\njrWH/GjGuuKKK3x6jM2DLHRlx5oj2NHYPaFjsr+gxJ9xV17l/vHPfzcc7Cxd9qFbv327XDNgiWpJ\nbfTWBp3UFtwEMw20kgaWa6V0LVnTQOo0IEsKzVdsYynCnE9zlGAHIMEawxL2vjrmmGNyoIPPDV3I\nv//97+csLQKdeDOYrENKC58fjbjM
mDw4KRNwdAaKCFiHJAfpIRvph749PqL9ZHrKi3y3T01IWFMs\nmAZMA9XRgMFOdfRoqaRcAzRf4aOAxQSnSgIWm5122skP0PfWW295wAA4AB0gA7jAx2bHHXd0Cxcu\n9NcwieesWbPcWmut5UEnbAIToKipSk1XHAdW8LtRl3LkwMkTuThGOOGEE7wDNHGBI0EX17PPcfJj\nsdCYGgB0gBwDnca8v1aq+mnAYKd+ureca6QBKpBCva86d+7sYYLJFAELwQnXaZoHRJ08ebKf+oEm\nJcBFoCNrjpqrBDnsc454xOc6HJzpsg7ssGy44YaOAQYJOD5PnDjRx+c65AjhCwsPAEYw4PFqcAws\n2OEHHb7cyfivQKcUh/uMF9nENw3UTAMGOzVTtWVULw2EzVdhF1nAQc1XTMnAfFPPPPOMB58bbrgh\nmndpqBcZB+F77rnHj4As3xlOcH0IJVh05OujZjDiqbs6/jeADj45Wtjffvvtc5OG3n777b4nDFYc\npU1asu6oSYt0Swkan6eUa7IQ970lS9zW22ybBVELymigU1A9dtI0ULEGlq84BUvANFCBBtQjBN8Z\nbSclJ9M+PULUOyQpXvxYvuYrQEXNRbKgADIMDMjknbKcbLrppg4AYeZxzuOozDmuBTy4FhjBAsPC\nPpDCeQKQQTMVFh18dVjYJi0cm0mLOMOHD3d33HGHW7p0qe/tBVzRzEV6nNeChUgO0dqOlzlp///9\n3/+6dxa/nXQq08fozp/1YKCT9Tto8mdBAwY7WbhLDSYjPU0Y7+PGyHdGY30IYICTpECFwHWMF8N0\nDPSGwkpzZORonK9LLNcUar7CD0bWE6DiT3/6k9tjjz1yoIMDMlNBYH0R6CBbCDqCHPnXyBGZeFwT\nBx32sRQJXoAd4AX4onfXD3/4Qy51F154ofvlL3+Z891RfK1D0OH6lgI6mvfYEy1Fy9z5P/7xLdex\nQ3absQx0MvfImcAZ1YB1Pc/ojcui2MANCx94DQjYs2fPkgYFVLk1OBrp4eMAJMUHGKSCB6aKGTyQ\naRx+/vOfK3l34okn+vSw3jCODtYYQAM4AWiAI4EOa6CJ44IXgU5ozZFFJxwNmQzVlEY69913n59X\ni+Onn366l534XEvTF+BFcxjQRB7IVAzsYDXT6M6k3SiB6SJ6dO/qoTdrZTLQydodM3mzrAGDnSzf\nvYzIDphocsAkKKm0GAAPC5UHlp8jI2sP+RQ799XNN9+cswAhCyMa9+jRw8MFIPLmm286oIwQBx35\n0xCPAHzEQQfgwZoji46sMkAKcCTfIQAK52ag69e//rVPj+YsYI5rSUc+P2wDPICQ0vMXFPjp1au3\nO3P4CLf7T/sWiJWtUx07dHSzb5md17qX1tIY6KT1zphcjaoBg51GvbMpKBfNToBHCCGtKRZ+P+QH\nHGDRIcyZM8dbaIAKltCKgkPxGWec4X1lJBeWlXbt2uWsKEAG8ILlp1u3bs0sOoBO3D9HUAKMYI1h\nEZSoCYq8QmsMctE0BkiRJsCz+eabe8sReT/66KM+PunIyZm1IArgIb0wTZVHayw7hxxymHfmHXPh\naB3O9PrZ5xa4o44c5Ba9sShT5TDQydTtMmEbRAPWG6tBbmTaioGlhWYkFkFPa8uI9YW81OOKeX+0\nTd6yoAAo+OccfPDBOdDZOBrb5KmnnnL0ygIqWAQnXLfddtv5EY+XLVvmp3ygyQlLjByRgRLABggJ\nF45xLmy6SoISrDPEAZa4Zvbs2V5dABCjNgNEoVWJvCkH8IZ8BOIoADeyqHEPaMJ64IH73ROPPaoo\nmV/fd/9c12/f/pkqB0DOs2bdyzN120zYBtCAWXYa4CamrQhYV6hoWdT8U2sZ+fcsKw/WnVVXXdWD\nAZaT1157ze2yyy7egoJcgwcP9ousMvjGCFKAEIACuOBa0gWEGFQQuABcgBmOcQ3WFjUxkZ4gpyXL\n
C2mFMAZMXXTRRW7ChAledQAP0CKokv+OLEjvv/++e+WVV7zzNhWqLFuh3oE/xvK57fY7vZ9LeC6L\n2zTLDR06pBnQprkc3Jd6vQ9p1ovJZhqohQYMdmqh5TaSB9YELCmyKvAPtp4BOQCedyJrzyOPPOIt\nLnPnznUHHHBATqwLLrjAz0mFFQdwkFUGqBDoYEEBdFiAniXR2C50ee4Q9QISfAhyAB7Ah+OAjvxp\nkqw5OSG+2gB41NOLvAAeoAw4I4T+O3/+85/dCy+84J5//nm3YMGC3AzsXyXlV8ANlasWrAkXXXSx\n++jjTzI/+/mtEbBdPWmie+yxeWGRU7ttoJPaW2OCtRENGOy0kRvd2sUELKhUCXzY02SmHzRokLfI\nHHjgge7cc8/1MvLzq1/9ym2wwQbeOiOrDtDCNpCCpUWgA3iwrWYr9hcvXuwHBJR1JQQdNYGRTzGg\nQzw1Q5GH8mXcHcb+UWDcn7ffTh4vB6fqPn36+MlOe/Xq5e+B0mTNgsz4A7268HXXpXNHJZu59aGH\nHeG6dd0uGpPo5NTLbqCT+ltkArYBDRjstIGbXIsiYtHBgpI20KGCB1qwsigABWPGjPGWHM4BJnFQ\nEehgyWHBX0agE/rWvPzyy27rrbfO+fqo2QpYIhQLOpJNUILV5qGHHnJYoph0NCnQNLfnnnvmFixK\nyp/4SouyaKEMZww7K7JgrZRZ687cBx92p0aQM3/B/FRBddI9MtBJ0oodMw3UXgMGO7XXecPlSLdy\nPuosabLooGgqfEYmxqqjQFdyLDMADEFNToACkMI1nMO6ItDhGOBC3NAKhFXnjTfe8D48DELI9Syl\nQg6+QNLhY4895icglbzx9VFHHeWb55ADSMN/J+ydRd4syAzcaAF42MYyxBQVWbXu9Nunv9ulT+/U\nW3W4nz2/snbG76HtmwZMA7XVgMFObfXdcLnhhHzkV93L6+2jk6RcVfgjR4507du3d1OnTnV9+/Z1\nxx57rHc8Fhhg3QkBAcgBdgREAAwwBFzE/XOAjg8++MB9+umnvgmJdFoKAhvWgA7XhgGrDbOtAyXd\nu3f3TU9hd/SHH37YRxfwADuyTgm2KLsAB8jRNutJk652yz780N0ye1aYbeq3x0+c7ObccXvqfXUM\ndFL/KJmAbUwDBjtt7IZXs7j46QA4N0bdzMMu3tXMo9K0VOFrfJ3f/va3fibzKVOm+JGRqfiJIygi\nHoDDAiAQAKHQEVnAA2iwyD8HfdALKunfPJWfFqa7iAemv6C3Fdey4FwsOAG6sES9+uqrrnfv3v5S\n1synBVgJwgQ7yCrgiUMO+ywfffSRO+vMs9ygo452Q046IS5OKvc1rs41U67xOkqlkJFQ3GfuoQXT\ngGkgPRow2EnPvcicJAIcrDtpDQIZIEYgQzduYIcZzsPjQIXiABoEQCberRyoAHLkH0Mcgiw66APw\nwWLDkg9uqBC1AI3IqiD4AkyQi95ZDDaI3JdccomPdvHFF7tOnTr5fJFRcqpZTvIImkiLdAV4L774\nop9t/Zln56e+K/rSZR+64447Puqd1scNOfkkqSl1awOd1N0SE8g04DVgsGMPQlka4KMup+S0+enE\nCxSCg2CG5iH8eAYOHJjrUi7YEegADQIINV2xH4IOFhTABqBBJyxJY9xguRHYsE6CG0GI4ERr+Q8B\nO59//rkbMGCA+8Mf/uCLCfzgswN4ydIEjMm6g3yCKK2BIC1z5z7gbr75JjfzplmpBh7mwKLss26e\nGb+9qdnn3nNvLZgGTAPp04DBTvruSSYk4qPOMjrPLOVpKgSVvIAHgABqcAI+4ogjPKQABlhOgApg\nCBAAHgAbQQ4AIYhgAD9GWwZqgJwkuKEZCqChaQoHboBQsIFuQrAJZRPgaI08su7gR0RzFqM47733\n3l7FG220kRs1apRPGwsTwCNZ41DGecrGOlxGjb7A/TNKd+q1U1
37duul6dZ5WU49bZh7660/uhnT\np6XOAR4BZcUz0Endo2MCmQZyGjDYyanCNorVAP9gs2LVoUyCDEGFLCV77LGHH5fmoIMOyvW6EgwA\nCgIdppag+zcLkIMzcjwkDeBHfqoId9ppJy+HIAeAEdDE15yLL7JIAWUsQBYTnRL22msvt9tuuzVr\nctPgiIAPZVHTFpCDtSeEHbbPG3W++0s0UOGll12WqvF3sgA670RDLgC1FkwDpoH0asBgp4b3ZrPN\nNssNCIffxVlnnZXLnUpWgZ42jJxbTKB3ET2LFFSxa7811oAOH/csWHXC8oewg5WEaSQYZPDBBx/0\nFh2gAxBYuHChW7RokZs/f75fGC05HnbeeWfHP3kWdBFabshHC2myACddunRxq6yyigcZAU4IPSHg\nJJ0X8CA7wDNp0iQvO7JRDnqbqdlNs6PTxAW0ATsCnhB2wu1zzjnPPfLwQ+6GG6fXvUkLH50zzhjm\nm67SbNEx0Im/GbZvGkinBv4z0lo65StJqksvvdT3UOEiRpp96623SrreIresASwVd999t7vyyitb\njpzCGEClKniags477zx3Y9SbDH8QmrYYMycp7LDDDr4nFHDD6MQEgaUgirXgJg4rDDzIAIRACFAS\nnhfk6Fi4ZjsEJ/JV76uTTz7ZYWUDfi688EI3bdo0b7EJHacBHCw7svCE52TdQR/o5corr4jmM7vb\nbd+jm7tqwqS69dKi19XIs0e4jh07ucmTJqS26cpAh6fRgmkgGxpoKNjJhsqzLSU9jfbZZx+3ceSP\nksVApc6CtebQQw91+N/84he/+FpRgJpdd93Vd09nCgauUQgBBFAR5AhaWAtYwu1NNtnEvffee45B\nDddbb71cUxVxw0Vww5oAjAjQJD/xaY7Dssd0GITrrrsumhhzaM45WU1W8j/C6sOitAQ5SpM0Djhg\nfw99559/gZv/3HOO8YlqNa0E1pwbp890I0ec6QFU5UKuNAWA30AnTXfEZDENtKwBg52WdVS1GI1g\naQJ2AIEsBip1AIJKfo011nBPP/10rhhYbrDYMH7NNttsk+tWTlzBjNYhmLQEOCHssL3aaqt5wKLb\nNyMukxbpshAEHuSrRdCitYTm2i222MJbp5jQlK70DERIWYhLWqTBNgsWHsCHhePKT+lp3TO6vzTN\nMfDgFl06uTFjL3NHDjqiVZ2Xr582w90040bXrv36Dt2k1QfGQEdPia1NA9nSwJdfvGzJ/DVpmdGa\nDzuDrCkwJD7H8JOJB5q7OK6KhTXHkiZY5LjiMfIuQYO5cVxBcVgjDwuVJvukQQjz1DFdH1/fdttt\nuetJg7Q4Vk4grzBvyZRU3pbSp9lE4+u0FDeN5yk7C5X/zTff7AGB0Yrpwo0PVdeuXT0MEIcgZ2b5\nydAbii7gX3zxhW/6Yt3SQhMZcbiO6wGttdZay89YDuRIHpqc1ANMDsb43IQLzWDIywI44St0yCGH\nuG7dunl58QVDVsEOQCTgYjsMKmN4TNukO3LkCA8er77ysusdAdDIc0e7ha8tUpSK11hygJxevXp7\n0KFZjq7lBjoVq9YSMA2YBmIaaFOWHSp3xihhFN14AGCAApyDf/KTn8RP5/aBjqTrcxGiDUCnJZgJ\n48e3gRqaJ8JAnsged2wO48S3q1HeME0GyCNsnNEmLJVFVo3999/fH6Jyfe2117ylRVYWwIAu6oIF\ndQEXOLAutM050uJ6pRnmD6hst912ftBBBgZkXxYYWWO01nGtBUeki1wskydP9hOSksdxxx3nbr/9\ndp835ygHACR/HdIV6Ggt2eJrdAOAcO9vnjXbW3r27rev2yUC/22i96RH967xSwruAzhzH3jIvfrK\nK27uffe6rbfZNmqGG+iOjKYcSXMwi06a747JZhpoWQNtCnbygY7U9Mknn/h5k2huWn311XU4twZi\nigmVgA7px0EnzBMoA8aK6a
1VaXnDfNlOg58C1jXdByr7UgOVO9cJeLie5iumYsAXiXMCGaw6LAIK\n1oUAJwQbyUZ+LOQXwou26ZKOUzTxGTMHoNE5wY32WWtbkCJZN9hgA9+7rH///u4vf/mLmzlzph8w\nEdDRNUpPMrFfbAB6WEaePTxyYr7L4UQ8ecJ4fznAssWWP/Tb3//+Zr7HmdJl8MPPP//CvbP4bfeH\nN99wjz/+mDvk0MPdT3fdxQ0dcqLbOAPgbKCju2lr00B2NdAQzVhU/FQWGkaf20FvLI7JTwaACC0y\nxOU8C00YCgBPIdggXaw/ulbXxdfHHHNMLk7YxTweL99+PvmIX0g+pVet8io91vy77xk1Z6QlcK/K\nCarstWZ0Y/xECAALi6whdPFWsxVrbYfNUsQBimTNIV2sKGGzVNgUFd/GWkj38MWLF/smq7DbuJqz\nOB8OFqhu5BpDh3NMGMpAiQSclblfyERZkFGLZJW8/oIif2jewgozdcrVbtEbi9zsW2a7AQfu71Ze\n6dvuf//9L3dX1J1/5owZuWVJ5JC90ndW9BagUaPO8+8EliKcj7MAOgB+GiC/yNtj0UwDpoE8Gmgz\nlh1ZA9ADIBICCPtUnPL5ARTC86HuAKOWrCqcDwEqvL6Y7Zbko5kLeZOsT0q/WuVVemlcA68t3Yti\n5KbS5d+7QAcYECDQ/MO2LDyh9Ya0ARtZXLSWBYV1aFXJt008mrLoIfbCCy94oCSu0matEN8Gurke\nuMLfB0h+9NFH3dKlS92QIUPck08+6S1T8uMJm7LUM0vlUB6lrGXxKeWarMQFcgiU0YJpwDSQbQ00\nhGWnmFsQWnWSKkjmSVLA1yWf1SDpWl2ndTFxFDdpHcqi8/E0W3IurlZ5lT9rLAWAQVpCWMZqyAQ4\nYO2guUrAA+gIdmQJATiABqwqoUMxFpvQKhO34IT7ioflRlabddZZxzejMlIzTs3kIeghzxB0wvIS\nJ5Rn9uzZudOnn366t6ZQJoAH644ATs1yLVkpc4m1oQ2BTpqe9zakfiuqaaDqGmgzsBPCAb4sqjy0\njvfaSoIdmrCKCYUsLsVcn5RPPM0k+cK0q1HeMD22sX6k6eOPb1S1gSdeZp4PLCdqkqK5CDgBUgQ3\ngIvgJQSa+LbicL0AB1iKdwnHh+jdd991qnDjMoX7en5l3SGtDh06uHHjxvlozz//vB+9Wc1Z9AaL\nAw/WKgv/0YD0nqZn/T/S2ZZpwDRQjgbaDOyUoxy7pnU0AKSoki51HTbPAXw4LNP8WA3oQRY1NSXB\nTTmAA/DIeiOwAUiAE5bQchNqW00nWNNaCtIhacnaxHxfe+65p7+UqSQAVSw5WKlC4JF1R81zLeXV\n6OcNdBr9Dlv52qoG2gzshNaS0MFYJvz4Ooxf64cjqeKOW3Jaki88X63y4pyqyqDWOsmXH3oBnuRv\nlS9eMccFO2oSiltxZMmJW2wENFrH4QZwaglukuTDssDyeDS2UUsB2ZUH+SE7/jusCUcffbRfFwIe\nvQM+Yhv80bONzi2YBkwDjaWBNuOgTHdtNe0AE3EfmDTdVqwXcb+d0KJBk1YIM0myt0Z5sTaoQkjK\nM8vHBDoAgwKWEsABCCCwr0VWGda6lrUW4rNdaQAwe/bs6YEH/bNfKIQyMyUF/jsMAkl39PHjx3un\nZaw7xAPqBEgqA/uUt1jZsTyxfPa3z92SJe9/bUZ4Jj7t2LGDizr8e0dfypLGoOfaQCeNd8dkMg1U\nroGGtezELSEh3GAFCMfCAYJCP564/07lai4tBXqDhfKxH1ou4iCUlHprlZfmkEYLL730kocIKnhV\n/jQH/f/2zj3IiurO4z9SIJYVUu7GZAOuFi5UGNkqERMZhFpFYnzk4Yih3IqRAZJSUUkibtCMZA1b\nAg6iUjKWQXwOJghEHIeYVWMSJCKPUgOl4aEbE0UdrazlHz7WRLOV3G+THzlc7jzune57u/t+
[… base64-encoded PNG image data elided …]\n", + "text/plain": [ + "" + ] + }, + "execution_count": 100, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "Image(filename='sentiment_network_sparse_2.png')" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 6).ipynb b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 6).ipynb new file mode 100644 index 0000000..6502cfe --- /dev/null +++ b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural 
Network (Lesson 6).ipynb @@ -0,0 +1,7478 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sentiment Classification & How To \"Frame Problems\" for a Neural Network\n", + "\n", + "by Andrew Trask\n", + "\n", + "- **Twitter**: @iamtrask\n", + "- **Blog**: http://iamtrask.github.io" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### What You Should Already Know\n", + "\n", + "- neural networks, forward and back-propagation\n", + "- stochastic gradient descent\n", + "- mean squared error\n", + "- and train/test splits\n", + "\n", + "### Where to Get Help if You Need it\n", + "- Re-watch previous Udacity Lectures\n", + "- Leverage the recommended Course Reading Material - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) (40% Off: **traskud17**)\n", + "- Shoot me a tweet @iamtrask\n", + "\n", + "\n", + "### Tutorial Outline:\n", + "\n", + "- Intro: The Importance of \"Framing a Problem\"\n", + "\n", + "\n", + "- Curate a Dataset\n", + "- Developing a \"Predictive Theory\"\n", + "- **PROJECT 1**: Quick Theory Validation\n", + "\n", + "\n", + "- Transforming Text to Numbers\n", + "- **PROJECT 2**: Creating the Input/Output Data\n", + "\n", + "\n", + "- Putting it all together in a Neural Network\n", + "- **PROJECT 3**: Building our Neural Network\n", + "\n", + "\n", + "- Understanding Neural Noise\n", + "- **PROJECT 4**: Making Learning Faster by Reducing Noise\n", + "\n", + "\n", + "- Analyzing Inefficiencies in our Network\n", + "- **PROJECT 5**: Making our Network Train and Run Faster\n", + "\n", + "\n", + "- Further Noise Reduction\n", + "- **PROJECT 6**: Reducing Noise by Strategically Reducing the Vocabulary\n", + "\n", + "\n", + "- Analysis: What's going on in the weights?" 
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbpresent": { + "id": "56bb3cba-260c-4ebe-9ed6-b995b4c72aa3" + } + }, + "source": [ + "# Lesson: Curate a Dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "eba2b193-0419-431e-8db9-60f34dd3fe83" + } + }, + "outputs": [], + "source": [ + "def pretty_print_review_and_label(i):\n", + " print(labels[i] + \"\\t:\\t\" + reviews[i][:80] + \"...\")\n", + "\n", + "g = open('reviews.txt','r') # What we know!\n", + "reviews = list(map(lambda x:x[:-1],g.readlines()))\n", + "g.close()\n", + "\n", + "g = open('labels.txt','r') # What we WANT to know!\n", + "labels = list(map(lambda x:x[:-1].upper(),g.readlines()))\n", + "g.close()" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "25000" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "len(reviews)" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "bb95574b-21a0-4213-ae50-34363cf4f87f" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy . it ran at the same time as some other programs about school life such as teachers . my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers . the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students . when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled . . . . . . . . . at . . . . . . . . . . high . a classic line inspector i m here to sack one of your teachers . student welcome to bromwell high . 
i expect that many adults of my age think that bromwell high is far fetched . what a pity that it isn t '" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "reviews[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e0408810-c424-4ed4-afb9-1735e9ddbd0a" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Lesson: Develop a Predictive Theory" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e67a709f-234f-4493-bae6-4fb192141ee0" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "labels.txt \t : \t reviews.txt\n", + "\n", + "NEGATIVE\t:\tthis movie is terrible but it has some good effects . ...\n", + "POSITIVE\t:\tadrian pasdar is excellent is this film . he makes a fascinating woman . ...\n", + "NEGATIVE\t:\tcomment this movie is impossible . is terrible very improbable bad interpretat...\n", + "POSITIVE\t:\texcellent episode movie ala pulp fiction . days suicides . it doesnt get more...\n", + "NEGATIVE\t:\tif you haven t seen this it s terrible . it is pure trash . 
i saw this about ...\n", + "POSITIVE\t:\tthis schiffer guy is a real genius the movie is of excellent quality and both e...\n" + ] + } + ], + "source": [ + "print(\"labels.txt \\t : \\t reviews.txt\\n\")\n", + "pretty_print_review_and_label(2137)\n", + "pretty_print_review_and_label(12816)\n", + "pretty_print_review_and_label(6267)\n", + "pretty_print_review_and_label(21934)\n", + "pretty_print_review_and_label(5297)\n", + "pretty_print_review_and_label(4998)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 1: Quick Theory Validation" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from collections import Counter\n", + "import numpy as np" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "positive_counts = Counter()\n", + "negative_counts = Counter()\n", + "total_counts = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for i in range(len(reviews)):\n", + " if(labels[i] == 'POSITIVE'):\n", + " for word in reviews[i].split(\" \"):\n", + " positive_counts[word] += 1\n", + " total_counts[word] += 1\n", + " else:\n", + " for word in reviews[i].split(\" \"):\n", + " negative_counts[word] += 1\n", + " total_counts[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('', 550468),\n", + " ('the', 173324),\n", + " ('.', 159654),\n", + " ('and', 89722),\n", + " ('a', 83688),\n", + " ('of', 76855),\n", + " ('to', 66746),\n", + " ('is', 57245),\n", + " ('in', 50215),\n", + " ('br', 49235),\n", + " ('it', 48025),\n", + " ('i', 40743),\n", + " ('that', 35630),\n", + " ('this', 35080),\n", + " ('s', 33815),\n", + " ('as', 26308),\n", + " ('with', 23247),\n", + " 
('for', 22416),\n", + " ('was', 21917),\n", + " ('film', 20937),\n", + " ('but', 20822),\n", + " ('movie', 19074),\n", + " ('his', 17227),\n", + " ('on', 17008),\n", + " ('you', 16681),\n", + " ('he', 16282),\n", + " ('are', 14807),\n", + " ('not', 14272),\n", + " ('t', 13720),\n", + " ('one', 13655),\n", + " ('have', 12587),\n", + " ('be', 12416),\n", + " ('by', 11997),\n", + " ('all', 11942),\n", + " ('who', 11464),\n", + " ('an', 11294),\n", + " ('at', 11234),\n", + " ('from', 10767),\n", + " ('her', 10474),\n", + " ('they', 9895),\n", + " ('has', 9186),\n", + " ('so', 9154),\n", + " ('like', 9038),\n", + " ('about', 8313),\n", + " ('very', 8305),\n", + " ('out', 8134),\n", + " ('there', 8057),\n", + " ('she', 7779),\n", + " ('what', 7737),\n", + " ('or', 7732),\n", + " ('good', 7720),\n", + " ('more', 7521),\n", + " ('when', 7456),\n", + " ('some', 7441),\n", + " ('if', 7285),\n", + " ('just', 7152),\n", + " ('can', 7001),\n", + " ('story', 6780),\n", + " ('time', 6515),\n", + " ('my', 6488),\n", + " ('great', 6419),\n", + " ('well', 6405),\n", + " ('up', 6321),\n", + " ('which', 6267),\n", + " ('their', 6107),\n", + " ('see', 6026),\n", + " ('also', 5550),\n", + " ('we', 5531),\n", + " ('really', 5476),\n", + " ('would', 5400),\n", + " ('will', 5218),\n", + " ('me', 5167),\n", + " ('had', 5148),\n", + " ('only', 5137),\n", + " ('him', 5018),\n", + " ('even', 4964),\n", + " ('most', 4864),\n", + " ('other', 4858),\n", + " ('were', 4782),\n", + " ('first', 4755),\n", + " ('than', 4736),\n", + " ('much', 4685),\n", + " ('its', 4622),\n", + " ('no', 4574),\n", + " ('into', 4544),\n", + " ('people', 4479),\n", + " ('best', 4319),\n", + " ('love', 4301),\n", + " ('get', 4272),\n", + " ('how', 4213),\n", + " ('life', 4199),\n", + " ('been', 4189),\n", + " ('because', 4079),\n", + " ('way', 4036),\n", + " ('do', 3941),\n", + " ('made', 3823),\n", + " ('films', 3813),\n", + " ('them', 3805),\n", + " ('after', 3800),\n", + " ('many', 3766),\n", + " ('two', 3733),\n", + 
" ('too', 3659),\n", + " ('think', 3655),\n", + " ('movies', 3586),\n", + " ('characters', 3560),\n", + " ('character', 3514),\n", + " ('don', 3468),\n", + " ('man', 3460),\n", + " ('show', 3432),\n", + " ('watch', 3424),\n", + " ('seen', 3414),\n", + " ('then', 3358),\n", + " ('little', 3341),\n", + " ('still', 3340),\n", + " ('make', 3303),\n", + " ('could', 3237),\n", + " ('never', 3226),\n", + " ('being', 3217),\n", + " ('where', 3173),\n", + " ('does', 3069),\n", + " ('over', 3017),\n", + " ('any', 3002),\n", + " ('while', 2899),\n", + " ('know', 2833),\n", + " ('did', 2790),\n", + " ('years', 2758),\n", + " ('here', 2740),\n", + " ('ever', 2734),\n", + " ('end', 2696),\n", + " ('these', 2694),\n", + " ('such', 2590),\n", + " ('real', 2568),\n", + " ('scene', 2567),\n", + " ('back', 2547),\n", + " ('those', 2485),\n", + " ('though', 2475),\n", + " ('off', 2463),\n", + " ('new', 2458),\n", + " ('your', 2453),\n", + " ('go', 2440),\n", + " ('acting', 2437),\n", + " ('plot', 2432),\n", + " ('world', 2429),\n", + " ('scenes', 2427),\n", + " ('say', 2414),\n", + " ('through', 2409),\n", + " ('makes', 2390),\n", + " ('better', 2381),\n", + " ('now', 2368),\n", + " ('work', 2346),\n", + " ('young', 2343),\n", + " ('old', 2311),\n", + " ('ve', 2307),\n", + " ('find', 2272),\n", + " ('both', 2248),\n", + " ('before', 2177),\n", + " ('us', 2162),\n", + " ('again', 2158),\n", + " ('series', 2153),\n", + " ('quite', 2143),\n", + " ('something', 2135),\n", + " ('cast', 2133),\n", + " ('should', 2121),\n", + " ('part', 2098),\n", + " ('always', 2088),\n", + " ('lot', 2087),\n", + " ('another', 2075),\n", + " ('actors', 2047),\n", + " ('director', 2040),\n", + " ('family', 2032),\n", + " ('between', 2016),\n", + " ('own', 2016),\n", + " ('m', 1998),\n", + " ('may', 1997),\n", + " ('same', 1972),\n", + " ('role', 1967),\n", + " ('watching', 1966),\n", + " ('every', 1954),\n", + " ('funny', 1953),\n", + " ('doesn', 1935),\n", + " ('performance', 1928),\n", + " ('few', 
1918),\n", + " ('bad', 1907),\n", + " ('look', 1900),\n", + " ('re', 1884),\n", + " ('why', 1855),\n", + " ('things', 1849),\n", + " ('times', 1832),\n", + " ('big', 1815),\n", + " ('however', 1795),\n", + " ('actually', 1790),\n", + " ('action', 1789),\n", + " ('going', 1783),\n", + " ('bit', 1757),\n", + " ('comedy', 1742),\n", + " ('down', 1740),\n", + " ('music', 1738),\n", + " ('must', 1728),\n", + " ('take', 1709),\n", + " ('saw', 1692),\n", + " ('long', 1690),\n", + " ('right', 1688),\n", + " ('fun', 1686),\n", + " ('fact', 1684),\n", + " ('excellent', 1683),\n", + " ('around', 1674),\n", + " ('didn', 1672),\n", + " ('without', 1671),\n", + " ('thing', 1662),\n", + " ('thought', 1639),\n", + " ('got', 1635),\n", + " ('each', 1630),\n", + " ('day', 1614),\n", + " ('feel', 1597),\n", + " ('seems', 1596),\n", + " ('come', 1594),\n", + " ('done', 1586),\n", + " ('beautiful', 1580),\n", + " ('especially', 1572),\n", + " ('played', 1571),\n", + " ('almost', 1566),\n", + " ('want', 1562),\n", + " ('yet', 1556),\n", + " ('give', 1553),\n", + " ('pretty', 1549),\n", + " ('last', 1543),\n", + " ('since', 1519),\n", + " ('different', 1504),\n", + " ('although', 1501),\n", + " ('gets', 1490),\n", + " ('true', 1487),\n", + " ('interesting', 1481),\n", + " ('job', 1470),\n", + " ('enough', 1455),\n", + " ('our', 1454),\n", + " ('shows', 1447),\n", + " ('horror', 1441),\n", + " ('woman', 1439),\n", + " ('tv', 1400),\n", + " ('probably', 1398),\n", + " ('father', 1395),\n", + " ('original', 1393),\n", + " ('girl', 1390),\n", + " ('point', 1379),\n", + " ('plays', 1378),\n", + " ('wonderful', 1372),\n", + " ('far', 1358),\n", + " ('course', 1358),\n", + " ('john', 1350),\n", + " ('rather', 1340),\n", + " ('isn', 1328),\n", + " ('ll', 1326),\n", + " ('later', 1324),\n", + " ('dvd', 1324),\n", + " ('war', 1310),\n", + " ('whole', 1310),\n", + " ('d', 1307),\n", + " ('away', 1306),\n", + " ('found', 1306),\n", + " ('screen', 1305),\n", + " ('nothing', 1300),\n", + " ('year', 
1297),\n", + " ('once', 1296),\n", + " ('hard', 1294),\n", + " ('together', 1280),\n", + " ('am', 1277),\n", + " ('set', 1277),\n", + " ('having', 1266),\n", + " ('making', 1265),\n", + " ('place', 1263),\n", + " ('comes', 1260),\n", + " ('might', 1260),\n", + " ('sure', 1253),\n", + " ('american', 1248),\n", + " ('play', 1245),\n", + " ('kind', 1244),\n", + " ('takes', 1242),\n", + " ('perfect', 1242),\n", + " ('performances', 1237),\n", + " ('himself', 1230),\n", + " ('worth', 1221),\n", + " ('everyone', 1221),\n", + " ('anyone', 1214),\n", + " ('actor', 1203),\n", + " ('three', 1201),\n", + " ('wife', 1196),\n", + " ('classic', 1192),\n", + " ('goes', 1186),\n", + " ('ending', 1178),\n", + " ('version', 1168),\n", + " ('star', 1149),\n", + " ('enjoy', 1146),\n", + " ('book', 1142),\n", + " ('nice', 1132),\n", + " ('everything', 1128),\n", + " ('during', 1124),\n", + " ('put', 1118),\n", + " ('seeing', 1111),\n", + " ('least', 1102),\n", + " ('house', 1100),\n", + " ('high', 1095),\n", + " ('watched', 1094),\n", + " ('men', 1087),\n", + " ('loved', 1087),\n", + " ('night', 1082),\n", + " ('anything', 1075),\n", + " ('guy', 1071),\n", + " ('believe', 1071),\n", + " ('top', 1063),\n", + " ('amazing', 1058),\n", + " ('hollywood', 1056),\n", + " ('looking', 1053),\n", + " ('main', 1044),\n", + " ('definitely', 1043),\n", + " ('gives', 1031),\n", + " ('home', 1029),\n", + " ('seem', 1028),\n", + " ('episode', 1023),\n", + " ('sense', 1020),\n", + " ('audience', 1020),\n", + " ('truly', 1017),\n", + " ('special', 1011),\n", + " ('fan', 1009),\n", + " ('second', 1009),\n", + " ('short', 1009),\n", + " ('mind', 1005),\n", + " ('human', 1001),\n", + " ('recommend', 999),\n", + " ('full', 996),\n", + " ('black', 995),\n", + " ('help', 991),\n", + " ('along', 989),\n", + " ('trying', 987),\n", + " ('small', 986),\n", + " ('death', 985),\n", + " ('friends', 981),\n", + " ('remember', 974),\n", + " ('often', 970),\n", + " ('said', 966),\n", + " ('favorite', 962),\n", + " 
('heart', 959),\n", + " ('early', 957),\n", + " ('left', 956),\n", + " ('until', 955),\n", + " ('let', 954),\n", + " ('script', 954),\n", + " ('maybe', 937),\n", + " ('today', 936),\n", + " ('live', 934),\n", + " ('less', 934),\n", + " ('moments', 933),\n", + " ('others', 929),\n", + " ('brilliant', 926),\n", + " ('shot', 925),\n", + " ('liked', 923),\n", + " ('become', 916),\n", + " ('won', 915),\n", + " ('used', 910),\n", + " ('style', 907),\n", + " ('mother', 895),\n", + " ('lives', 894),\n", + " ('came', 893),\n", + " ('stars', 890),\n", + " ('cinema', 889),\n", + " ('looks', 885),\n", + " ('perhaps', 884),\n", + " ('read', 882),\n", + " ('enjoyed', 879),\n", + " ('boy', 875),\n", + " ('drama', 873),\n", + " ('highly', 871),\n", + " ('given', 870),\n", + " ('playing', 867),\n", + " ('use', 864),\n", + " ('next', 859),\n", + " ('women', 858),\n", + " ('fine', 857),\n", + " ('effects', 856),\n", + " ('kids', 854),\n", + " ('entertaining', 853),\n", + " ('need', 852),\n", + " ('line', 850),\n", + " ('works', 848),\n", + " ('someone', 847),\n", + " ('mr', 836),\n", + " ('simply', 835),\n", + " ('children', 833),\n", + " ('picture', 833),\n", + " ('face', 831),\n", + " ('friend', 831),\n", + " ('keep', 831),\n", + " ('dark', 830),\n", + " ('overall', 828),\n", + " ('certainly', 828),\n", + " ('minutes', 827),\n", + " ('wasn', 824),\n", + " ('history', 822),\n", + " ('finally', 820),\n", + " ('couple', 816),\n", + " ('against', 815),\n", + " ('son', 809),\n", + " ('understand', 808),\n", + " ('lost', 807),\n", + " ('michael', 805),\n", + " ('else', 801),\n", + " ('throughout', 798),\n", + " ('fans', 797),\n", + " ('city', 792),\n", + " ('reason', 789),\n", + " ('written', 787),\n", + " ('production', 787),\n", + " ('several', 784),\n", + " ('school', 783),\n", + " ('rest', 781),\n", + " ('based', 781),\n", + " ('try', 780),\n", + " ('dead', 776),\n", + " ('hope', 775),\n", + " ('strong', 768),\n", + " ('white', 765),\n", + " ('tell', 759),\n", + " ('itself', 
758),\n", + " ('half', 753),\n", + " ('person', 749),\n", + " ('sometimes', 746),\n", + " ('past', 744),\n", + " ('start', 744),\n", + " ('genre', 743),\n", + " ('final', 739),\n", + " ('beginning', 739),\n", + " ('town', 738),\n", + " ('art', 734),\n", + " ('game', 732),\n", + " ('humor', 732),\n", + " ('yes', 731),\n", + " ('idea', 731),\n", + " ('late', 730),\n", + " ('becomes', 729),\n", + " ('despite', 729),\n", + " ('able', 726),\n", + " ('case', 726),\n", + " ('money', 723),\n", + " ('child', 721),\n", + " ('completely', 721),\n", + " ('side', 719),\n", + " ('camera', 716),\n", + " ('getting', 714),\n", + " ('instead', 712),\n", + " ('soon', 702),\n", + " ('under', 700),\n", + " ('viewer', 699),\n", + " ('age', 697),\n", + " ('days', 696),\n", + " ('stories', 696),\n", + " ('felt', 694),\n", + " ('simple', 694),\n", + " ('roles', 693),\n", + " ('video', 688),\n", + " ('name', 683),\n", + " ('either', 683),\n", + " ('doing', 677),\n", + " ('turns', 674),\n", + " ('wants', 671),\n", + " ('close', 671),\n", + " ('title', 669),\n", + " ('wrong', 668),\n", + " ('went', 666),\n", + " ('james', 665),\n", + " ('evil', 659),\n", + " ('budget', 657),\n", + " ('episodes', 657),\n", + " ('relationship', 655),\n", + " ('piece', 653),\n", + " ('fantastic', 653),\n", + " ('david', 651),\n", + " ('turn', 648),\n", + " ('murder', 646),\n", + " ('parts', 645),\n", + " ('brother', 644),\n", + " ('head', 643),\n", + " ('absolutely', 643),\n", + " ('experience', 642),\n", + " ('eyes', 641),\n", + " ('sex', 638),\n", + " ('direction', 637),\n", + " ('called', 637),\n", + " ('directed', 636),\n", + " ('lines', 634),\n", + " ('behind', 633),\n", + " ('sort', 632),\n", + " ('actress', 631),\n", + " ('lead', 630),\n", + " ('oscar', 628),\n", + " ('example', 627),\n", + " ('including', 627),\n", + " ('known', 625),\n", + " ('musical', 625),\n", + " ('chance', 621),\n", + " ('score', 620),\n", + " ('feeling', 619),\n", + " ('already', 619),\n", + " ('hit', 619),\n", + " ('voice', 
615),\n", + " ('moment', 612),\n", + " ('living', 612),\n", + " ('low', 610),\n", + " ('supporting', 610),\n", + " ('ago', 609),\n", + " ('themselves', 608),\n", + " ('hilarious', 605),\n", + " ('reality', 605),\n", + " ('jack', 604),\n", + " ('told', 603),\n", + " ('hand', 601),\n", + " ('moving', 600),\n", + " ('dialogue', 600),\n", + " ('quality', 600),\n", + " ('song', 599),\n", + " ('happy', 599),\n", + " ('paul', 598),\n", + " ('matter', 598),\n", + " ('light', 594),\n", + " ('future', 593),\n", + " ('entire', 592),\n", + " ('finds', 591),\n", + " ('gave', 589),\n", + " ('laugh', 587),\n", + " ('released', 586),\n", + " ('expect', 584),\n", + " ('fight', 581),\n", + " ('particularly', 580),\n", + " ('cinematography', 579),\n", + " ('police', 579),\n", + " ('whose', 578),\n", + " ('type', 578),\n", + " ('sound', 578),\n", + " ('enjoyable', 573),\n", + " ('view', 573),\n", + " ('husband', 572),\n", + " ('romantic', 572),\n", + " ('number', 572),\n", + " ('daughter', 572),\n", + " ('documentary', 571),\n", + " ('self', 570),\n", + " ('modern', 569),\n", + " ('robert', 569),\n", + " ('took', 569),\n", + " ('superb', 569),\n", + " ('mean', 566),\n", + " ('shown', 563),\n", + " ('coming', 561),\n", + " ('important', 560),\n", + " ('king', 559),\n", + " ('leave', 559),\n", + " ('change', 558),\n", + " ('wanted', 555),\n", + " ('somewhat', 555),\n", + " ('tells', 554),\n", + " ('run', 552),\n", + " ('events', 552),\n", + " ('country', 552),\n", + " ('career', 552),\n", + " ('heard', 550),\n", + " ('season', 550),\n", + " ('girls', 549),\n", + " ('greatest', 549),\n", + " ('etc', 547),\n", + " ('care', 546),\n", + " ('starts', 545),\n", + " ('english', 542),\n", + " ('killer', 541),\n", + " ('animation', 540),\n", + " ('guys', 540),\n", + " ('totally', 540),\n", + " ('tale', 540),\n", + " ('usual', 539),\n", + " ('opinion', 535),\n", + " ('miss', 535),\n", + " ('violence', 531),\n", + " ('easy', 531),\n", + " ('songs', 530),\n", + " ('british', 528),\n", + " ('says', 
526),\n", + " ('realistic', 525),\n", + " ('writing', 524),\n", + " ('act', 522),\n", + " ('writer', 522),\n", + " ('comic', 521),\n", + " ('thriller', 519),\n", + " ('television', 517),\n", + " ('power', 516),\n", + " ('ones', 515),\n", + " ('kid', 514),\n", + " ('novel', 513),\n", + " ('york', 513),\n", + " ('problem', 512),\n", + " ('alone', 512),\n", + " ('attention', 509),\n", + " ('involved', 508),\n", + " ('kill', 507),\n", + " ('extremely', 507),\n", + " ('seemed', 506),\n", + " ('hero', 505),\n", + " ('french', 505),\n", + " ('rock', 504),\n", + " ('stuff', 501),\n", + " ('wish', 499),\n", + " ('begins', 498),\n", + " ('taken', 497),\n", + " ('sad', 497),\n", + " ('ways', 496),\n", + " ('richard', 495),\n", + " ('knows', 494),\n", + " ('atmosphere', 493),\n", + " ('surprised', 491),\n", + " ('similar', 491),\n", + " ('taking', 491),\n", + " ('car', 491),\n", + " ('george', 490),\n", + " ('perfectly', 490),\n", + " ('across', 489),\n", + " ('sequence', 489),\n", + " ('eye', 489),\n", + " ('team', 489),\n", + " ('serious', 488),\n", + " ('powerful', 488),\n", + " ('room', 488),\n", + " ('due', 488),\n", + " ('among', 488),\n", + " ('order', 487),\n", + " ('b', 487),\n", + " ('cannot', 487),\n", + " ('strange', 487),\n", + " ('beauty', 486),\n", + " ('famous', 485),\n", + " ('tries', 484),\n", + " ('myself', 484),\n", + " ('happened', 484),\n", + " ('herself', 484),\n", + " ('class', 483),\n", + " ('four', 482),\n", + " ('cool', 481),\n", + " ('release', 479),\n", + " ('anyway', 479),\n", + " ('theme', 479),\n", + " ('opening', 478),\n", + " ('entertainment', 477),\n", + " ('unique', 475),\n", + " ('ends', 475),\n", + " ('slow', 475),\n", + " ('exactly', 475),\n", + " ('red', 474),\n", + " ('o', 474),\n", + " ('level', 474),\n", + " ('easily', 474),\n", + " ('interest', 472),\n", + " ('happen', 471),\n", + " ('crime', 470),\n", + " ('viewing', 468),\n", + " ('memorable', 467),\n", + " ('sets', 467),\n", + " ('group', 466),\n", + " ('stop', 466),\n", + " 
('dance', 463),\n", + " ('message', 463),\n", + " ('sister', 463),\n", + " ('working', 463),\n", + " ('problems', 463),\n", + " ('knew', 462),\n", + " ('mystery', 461),\n", + " ('nature', 461),\n", + " ('bring', 460),\n", + " ('believable', 459),\n", + " ('thinking', 459),\n", + " ('brought', 459),\n", + " ('mostly', 458),\n", + " ('couldn', 457),\n", + " ('disney', 457),\n", + " ('society', 456),\n", + " ('within', 455),\n", + " ('lady', 455),\n", + " ('blood', 454),\n", + " ('upon', 453),\n", + " ('viewers', 453),\n", + " ('parents', 453),\n", + " ('meets', 452),\n", + " ('form', 452),\n", + " ('soundtrack', 452),\n", + " ('usually', 452),\n", + " ('tom', 452),\n", + " ('peter', 452),\n", + " ('local', 450),\n", + " ('certain', 448),\n", + " ('follow', 448),\n", + " ('whether', 447),\n", + " ('possible', 446),\n", + " ('emotional', 445),\n", + " ('killed', 444),\n", + " ('de', 444),\n", + " ('above', 444),\n", + " ('middle', 443),\n", + " ('god', 443),\n", + " ('happens', 442),\n", + " ('flick', 442),\n", + " ('needs', 442),\n", + " ('masterpiece', 441),\n", + " ('major', 440),\n", + " ('period', 440),\n", + " ('haven', 439),\n", + " ('named', 439),\n", + " ('th', 438),\n", + " ('particular', 438),\n", + " ('earth', 437),\n", + " ('feature', 437),\n", + " ('stand', 436),\n", + " ('words', 435),\n", + " ('typical', 435),\n", + " ('obviously', 433),\n", + " ('elements', 433),\n", + " ('romance', 431),\n", + " ('jane', 430),\n", + " ('yourself', 427),\n", + " ('showing', 427),\n", + " ('fantasy', 426),\n", + " ('brings', 426),\n", + " ('america', 423),\n", + " ('guess', 423),\n", + " ('huge', 422),\n", + " ('unfortunately', 422),\n", + " ('indeed', 421),\n", + " ('running', 421),\n", + " ('talent', 420),\n", + " ('stage', 419),\n", + " ('started', 418),\n", + " ('sweet', 417),\n", + " ('leads', 417),\n", + " ('japanese', 417),\n", + " ('poor', 416),\n", + " ('deal', 416),\n", + " ('personal', 413),\n", + " ('incredible', 413),\n", + " ('fast', 412),\n", + " 
('became', 410),\n", + " ('deep', 410),\n", + " ('hours', 409),\n", + " ('nearly', 408),\n", + " ('dream', 408),\n", + " ('giving', 408),\n", + " ('turned', 407),\n", + " ('clearly', 407),\n", + " ('near', 406),\n", + " ('obvious', 406),\n", + " ('cut', 405),\n", + " ('surprise', 405),\n", + " ('body', 404),\n", + " ('era', 404),\n", + " ('female', 403),\n", + " ('hour', 403),\n", + " ('five', 403),\n", + " ('note', 399),\n", + " ('learn', 398),\n", + " ('truth', 398),\n", + " ('match', 397),\n", + " ('feels', 397),\n", + " ('except', 397),\n", + " ('tony', 397),\n", + " ('filmed', 394),\n", + " ('complete', 394),\n", + " ('clear', 394),\n", + " ('older', 393),\n", + " ('street', 393),\n", + " ('lots', 393),\n", + " ('eventually', 393),\n", + " ('keeps', 393),\n", + " ('buy', 392),\n", + " ('stewart', 391),\n", + " ('william', 391),\n", + " ('joe', 390),\n", + " ('meet', 390),\n", + " ('fall', 390),\n", + " ('shots', 389),\n", + " ('talking', 389),\n", + " ('difficult', 389),\n", + " ('unlike', 389),\n", + " ('rating', 389),\n", + " ('means', 388),\n", + " ('dramatic', 388),\n", + " ('appears', 386),\n", + " ('subject', 386),\n", + " ('wonder', 386),\n", + " ('present', 386),\n", + " ('situation', 386),\n", + " ('comments', 385),\n", + " ('sequences', 383),\n", + " ('general', 383),\n", + " ('lee', 383),\n", + " ('earlier', 382),\n", + " ('points', 382),\n", + " ('check', 379),\n", + " ('gone', 379),\n", + " ('ten', 378),\n", + " ('suspense', 378),\n", + " ('recommended', 378),\n", + " ('business', 377),\n", + " ('third', 377),\n", + " ('talk', 375),\n", + " ('leaves', 375),\n", + " ('beyond', 375),\n", + " ('portrayal', 374),\n", + " ('beautifully', 373),\n", + " ('single', 372),\n", + " ('bill', 372),\n", + " ('word', 371),\n", + " ('plenty', 371),\n", + " ('falls', 370),\n", + " ('whom', 370),\n", + " ('figure', 369),\n", + " ('battle', 369),\n", + " ('scary', 369),\n", + " ('non', 369),\n", + " ('return', 368),\n", + " ('using', 368),\n", + " ('doubt', 
367),\n", + " ('add', 367),\n", + " ('hear', 366),\n", + " ('solid', 366),\n", + " ('success', 366),\n", + " ('touching', 365),\n", + " ('political', 365),\n", + " ('oh', 365),\n", + " ('jokes', 365),\n", + " ('awesome', 364),\n", + " ('hell', 364),\n", + " ('boys', 364),\n", + " ('dog', 362),\n", + " ('recently', 362),\n", + " ('sexual', 362),\n", + " ('please', 361),\n", + " ('wouldn', 361),\n", + " ('features', 361),\n", + " ('straight', 361),\n", + " ('lack', 360),\n", + " ('forget', 360),\n", + " ('setting', 360),\n", + " ('mark', 359),\n", + " ('married', 359),\n", + " ('social', 357),\n", + " ('adventure', 356),\n", + " ('interested', 356),\n", + " ('brothers', 355),\n", + " ('sees', 355),\n", + " ('actual', 355),\n", + " ('terrific', 355),\n", + " ('move', 354),\n", + " ('call', 354),\n", + " ('various', 353),\n", + " ('dr', 353),\n", + " ('theater', 353),\n", + " ('animated', 352),\n", + " ('western', 351),\n", + " ('space', 350),\n", + " ('baby', 350),\n", + " ('leading', 348),\n", + " ('disappointed', 348),\n", + " ('portrayed', 346),\n", + " ('aren', 346),\n", + " ('screenplay', 345),\n", + " ('smith', 345),\n", + " ('hate', 344),\n", + " ('towards', 344),\n", + " ('noir', 343),\n", + " ('outstanding', 342),\n", + " ('decent', 342),\n", + " ('kelly', 342),\n", + " ('directors', 341),\n", + " ('journey', 341),\n", + " ('none', 340),\n", + " ('effective', 340),\n", + " ('looked', 340),\n", + " ('caught', 339),\n", + " ('cold', 339),\n", + " ('storyline', 339),\n", + " ('fi', 339),\n", + " ('sci', 339),\n", + " ('mary', 339),\n", + " ('rich', 338),\n", + " ('charming', 338),\n", + " ('harry', 337),\n", + " ('popular', 337),\n", + " ('manages', 337),\n", + " ('rare', 337),\n", + " ('spirit', 336),\n", + " ('open', 335),\n", + " ('appreciate', 335),\n", + " ('basically', 334),\n", + " ('moves', 334),\n", + " ('acted', 334),\n", + " ('deserves', 333),\n", + " ('subtle', 333),\n", + " ('mention', 333),\n", + " ('inside', 333),\n", + " ('pace', 333),\n", + " 
('century', 333),\n", + " ('boring', 333),\n", + " ('familiar', 332),\n", + " ('background', 332),\n", + " ('ben', 331),\n", + " ('creepy', 330),\n", + " ('supposed', 330),\n", + " ('secret', 329),\n", + " ('jim', 328),\n", + " ('die', 328),\n", + " ('question', 327),\n", + " ('effect', 327),\n", + " ('natural', 327),\n", + " ('rate', 326),\n", + " ('language', 326),\n", + " ('impressive', 326),\n", + " ('intelligent', 325),\n", + " ('saying', 325),\n", + " ('material', 324),\n", + " ('realize', 324),\n", + " ('telling', 324),\n", + " ('scott', 324),\n", + " ('singing', 323),\n", + " ('dancing', 322),\n", + " ('adult', 321),\n", + " ('imagine', 321),\n", + " ('visual', 321),\n", + " ('kept', 320),\n", + " ('office', 320),\n", + " ('uses', 319),\n", + " ('pure', 318),\n", + " ('wait', 318),\n", + " ('stunning', 318),\n", + " ('copy', 317),\n", + " ('review', 317),\n", + " ('previous', 317),\n", + " ('seriously', 317),\n", + " ('somehow', 316),\n", + " ('created', 316),\n", + " ('magic', 316),\n", + " ('create', 316),\n", + " ('hot', 316),\n", + " ('reading', 316),\n", + " ('crazy', 315),\n", + " ('air', 315),\n", + " ('frank', 315),\n", + " ('stay', 315),\n", + " ('escape', 315),\n", + " ('attempt', 315),\n", + " ('hands', 314),\n", + " ('filled', 313),\n", + " ('surprisingly', 312),\n", + " ('expected', 312),\n", + " ('average', 312),\n", + " ('complex', 311),\n", + " ('studio', 310),\n", + " ('successful', 310),\n", + " ('quickly', 310),\n", + " ('male', 309),\n", + " ('plus', 309),\n", + " ('co', 307),\n", + " ('minute', 306),\n", + " ('images', 306),\n", + " ('casting', 306),\n", + " ('exciting', 306),\n", + " ('following', 306),\n", + " ('members', 305),\n", + " ('german', 305),\n", + " ('e', 305),\n", + " ('reasons', 305),\n", + " ('follows', 305),\n", + " ('themes', 305),\n", + " ('touch', 304),\n", + " ('genius', 304),\n", + " ('free', 304),\n", + " ('edge', 304),\n", + " ('cute', 304),\n", + " ('outside', 303),\n", + " ('ok', 302),\n", + " ('admit', 
302),\n", + " ('younger', 302),\n", + " ('reviews', 302),\n", + " ('odd', 301),\n", + " ('fighting', 301),\n", + " ('master', 301),\n", + " ('break', 300),\n", + " ('thanks', 300),\n", + " ('recent', 300),\n", + " ('comment', 300),\n", + " ('apart', 299),\n", + " ('lovely', 298),\n", + " ('begin', 298),\n", + " ('emotions', 298),\n", + " ('doctor', 297),\n", + " ('italian', 297),\n", + " ('party', 297),\n", + " ('la', 296),\n", + " ('missed', 296),\n", + " ...]" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "positive_counts.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "pos_neg_ratios = Counter()\n", + "\n", + "# Only consider words that appear frequently (more than 100 times)\n", + "for term, cnt in list(total_counts.most_common()):\n", + "    if cnt > 100:\n", + "        # Ratio of positive to negative occurrences; the +1 avoids division by zero\n", + "        pos_neg_ratio = positive_counts[term] / float(negative_counts[term] + 1)\n", + "        pos_neg_ratios[term] = pos_neg_ratio\n", + "\n", + "# Take logs so positive and negative words are symmetric around zero;\n", + "# the +0.01 avoids taking the log of zero\n", + "for word, ratio in pos_neg_ratios.most_common():\n", + "    if ratio > 1:\n", + "        pos_neg_ratios[word] = np.log(ratio)\n", + "    else:\n", + "        pos_neg_ratios[word] = -np.log(1 / (ratio + 0.01))" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('edie', 4.6913478822291435),\n", + " ('paulie', 4.0775374439057197),\n", + " ('felix', 3.1527360223636558),\n", + " ('polanski', 2.8233610476132043),\n", + " ('matthau', 2.8067217286092401),\n", + " ('victoria', 2.6810215287142909),\n", + " ('mildred', 2.6026896854443837),\n", + " ('gandhi', 2.5389738710582761),\n", + " ('flawless', 2.451005098112319),\n", + " ('superbly', 2.2600254785752498),\n", + " ('perfection', 2.1594842493533721),\n", + " ('astaire', 2.1400661634962708),\n", + " ('captures', 2.0386195471595809),\n", + " ('voight', 2.0301704926730531),\n", + " ('wonderfully', 2.0218960560332353),\n", + " ('powell', 1.9783454248084671),\n", 
+ " ('brosnan', 1.9547990964725592),\n", + " ('lily', 1.9203768470501485),\n", + " ('bakshi', 1.9029851043382795),\n", + " ('lincoln', 1.9014583864844796),\n", + " ('refreshing', 1.8551812956655511),\n", + " ('breathtaking', 1.8481124057791867),\n", + " ('bourne', 1.8478489358790986),\n", + " ('lemmon', 1.8458266904983307),\n", + " ('delightful', 1.8002701588959635),\n", + " ('flynn', 1.7996646487351682),\n", + " ('andrews', 1.7764919970972666),\n", + " ('homer', 1.7692866133759964),\n", + " ('beautifully', 1.7626953362841438),\n", + " ('soccer', 1.7578579175523736),\n", + " ('elvira', 1.7397031072720019),\n", + " ('underrated', 1.7197859696029656),\n", + " ('gripping', 1.7165360479904674),\n", + " ('superb', 1.7091514458966952),\n", + " ('delight', 1.6714733033535532),\n", + " ('welles', 1.6677068205580761),\n", + " ('sadness', 1.663505133704376),\n", + " ('sinatra', 1.6389967146756448),\n", + " ('touching', 1.637217476541176),\n", + " ('timeless', 1.62924053973028),\n", + " ('macy', 1.6211339521972916),\n", + " ('unforgettable', 1.6177367152487956),\n", + " ('favorites', 1.6158688027643908),\n", + " ('stewart', 1.6119987332957739),\n", + " ('hartley', 1.6094379124341003),\n", + " ('sullivan', 1.6094379124341003),\n", + " ('extraordinary', 1.6094379124341003),\n", + " ('brilliantly', 1.5950491749820008),\n", + " ('friendship', 1.5677652160335325),\n", + " ('wonderful', 1.5645425925262093),\n", + " ('palma', 1.5553706911638245),\n", + " ('magnificent', 1.54663701119507),\n", + " ('finest', 1.5462590108125689),\n", + " ('jackie', 1.5439233053234738),\n", + " ('ritter', 1.5404450409471491),\n", + " ('tremendous', 1.5184661342283736),\n", + " ('freedom', 1.5091151908062312),\n", + " ('fantastic', 1.5048433868558566),\n", + " ('terrific', 1.5026699370083942),\n", + " ('noir', 1.493925025312256),\n", + " ('sidney', 1.493925025312256),\n", + " ('outstanding', 1.4910053152089213),\n", + " ('mann', 1.4894785973551214),\n", + " ('pleasantly', 1.4894785973551214),\n", + " 
('nancy', 1.488077055429833),\n", + " ('marie', 1.4825711915553104),\n", + " ('marvelous', 1.4739999415389962),\n", + " ('excellent', 1.4647538505723599),\n", + " ('ruth', 1.4596256342054401),\n", + " ('stanwyck', 1.4412101187160054),\n", + " ('widmark', 1.4350845252893227),\n", + " ('splendid', 1.4271163556401458),\n", + " ('chan', 1.423108334242607),\n", + " ('exceptional', 1.4201959127955721),\n", + " ('tender', 1.410986973710262),\n", + " ('gentle', 1.4078005663408544),\n", + " ('poignant', 1.4022947024663317),\n", + " ('gem', 1.3932148039644643),\n", + " ('amazing', 1.3919815802404802),\n", + " ('chilling', 1.3862943611198906),\n", + " ('captivating', 1.3862943611198906),\n", + " ('fisher', 1.3862943611198906),\n", + " ('davies', 1.3862943611198906),\n", + " ('darker', 1.3652409519220583),\n", + " ('april', 1.3499267169490159),\n", + " ('kelly', 1.3461743673304654),\n", + " ('blake', 1.3418425985490567),\n", + " ('overlooked', 1.329135947279942),\n", + " ('ralph', 1.32818673031261),\n", + " ('bette', 1.3156767939059373),\n", + " ('hoffman', 1.3150668518315229),\n", + " ('cole', 1.3121863889661687),\n", + " ('shines', 1.3049487216659381),\n", + " ('powerful', 1.2999662776313934),\n", + " ('notch', 1.2950456896547455),\n", + " ('remarkable', 1.2883688239495823),\n", + " ('pitt', 1.286210902562908),\n", + " ('winters', 1.2833463918674481),\n", + " ('vivid', 1.2762934659055623),\n", + " ('gritty', 1.2757524867200667),\n", + " ('giallo', 1.2745029551317739),\n", + " ('portrait', 1.2704625455947689),\n", + " ('innocence', 1.2694300209805796),\n", + " ('psychiatrist', 1.2685113254635072),\n", + " ('favorite', 1.2668956297860055),\n", + " ('ensemble', 1.2656663733312759),\n", + " ('stunning', 1.2622417124499117),\n", + " ('burns', 1.259880436264232),\n", + " ('garbo', 1.258954938743289),\n", + " ('barbara', 1.2580400255962119),\n", + " ('panic', 1.2527629684953681),\n", + " ('holly', 1.2527629684953681),\n", + " ('philip', 1.2527629684953681),\n", + " ('carol', 
1.2481440226390734),\n", + " ('perfect', 1.246742480713785),\n", + " ('appreciated', 1.2462482874741743),\n", + " ('favourite', 1.2411123512753928),\n", + " ('journey', 1.2367626271489269),\n", + " ('rural', 1.235471471385307),\n", + " ('bond', 1.2321436812926323),\n", + " ('builds', 1.2305398317106577),\n", + " ('brilliant', 1.2287554137664785),\n", + " ('brooklyn', 1.2286654169163074),\n", + " ('von', 1.225175011976539),\n", + " ('unfolds', 1.2163953243244932),\n", + " ('recommended', 1.2163953243244932),\n", + " ('daniel', 1.20215296760895),\n", + " ('perfectly', 1.1971931173405572),\n", + " ('crafted', 1.1962507582320256),\n", + " ('prince', 1.1939224684724346),\n", + " ('troubled', 1.192138346678933),\n", + " ('consequences', 1.1865810616140668),\n", + " ('haunting', 1.1814999484738773),\n", + " ('cinderella', 1.180052620608284),\n", + " ('alexander', 1.1759989522835299),\n", + " ('emotions', 1.1753049094563641),\n", + " ('boxing', 1.1735135968412274),\n", + " ('subtle', 1.1734135017508081),\n", + " ('curtis', 1.1649873576129823),\n", + " ('rare', 1.1566438362402944),\n", + " ('loved', 1.1563661500586044),\n", + " ('daughters', 1.1526795099383853),\n", + " ('courage', 1.1438688802562305),\n", + " ('dentist', 1.1426722784621401),\n", + " ('highly', 1.1420208631618658),\n", + " ('nominated', 1.1409146683587992),\n", + " ('tony', 1.1397491942285991),\n", + " ('draws', 1.1325138403437911),\n", + " ('everyday', 1.1306150197542835),\n", + " ('contrast', 1.1284652518177909),\n", + " ('cried', 1.1213405397456659),\n", + " ('fabulous', 1.1210851445201684),\n", + " ('ned', 1.120591195386885),\n", + " ('fay', 1.120591195386885),\n", + " ('emma', 1.1184149159642893),\n", + " ('sensitive', 1.113318436057805),\n", + " ('smooth', 1.1089750757036563),\n", + " ('dramas', 1.1080910326226534),\n", + " ('today', 1.1050431789984001),\n", + " ('helps', 1.1023091505494358),\n", + " ('inspiring', 1.0986122886681098),\n", + " ('jimmy', 1.0937696641923216),\n", + " ('awesome', 
1.0931328229034842),\n", + " ('unique', 1.0881409888008142),\n", + " ('tragic', 1.0871835928444868),\n", + " ('intense', 1.0870514662670339),\n", + " ('stellar', 1.0857088838322018),\n", + " ('rival', 1.0822184788924332),\n", + " ('provides', 1.0797081340289569),\n", + " ('depression', 1.0782034170369026),\n", + " ('shy', 1.0775588794702773),\n", + " ('carrie', 1.076139432816051),\n", + " ('blend', 1.0753554265038423),\n", + " ('hank', 1.0736109864626924),\n", + " ('diana', 1.0726368022648489),\n", + " ('adorable', 1.0726368022648489),\n", + " ('unexpected', 1.0722255334949147),\n", + " ('achievement', 1.0668635903535293),\n", + " ('bettie', 1.0663514264498881),\n", + " ('happiness', 1.0632729222228008),\n", + " ('glorious', 1.0608719606852626),\n", + " ('davis', 1.0541605260972757),\n", + " ('terrifying', 1.0525211814678428),\n", + " ('beauty', 1.050410186850232),\n", + " ('ideal', 1.0479685558493548),\n", + " ('fears', 1.0467872208035236),\n", + " ('hong', 1.0438040521731147),\n", + " ('seasons', 1.0433496099930604),\n", + " ('fascinating', 1.0414538748281612),\n", + " ('carries', 1.0345904299031787),\n", + " ('satisfying', 1.0321225473992768),\n", + " ('definite', 1.0319209141694374),\n", + " ('touched', 1.0296194171811581),\n", + " ('greatest', 1.0248947127715422),\n", + " ('creates', 1.0241097613701886),\n", + " ('aunt', 1.023388867430522),\n", + " ('walter', 1.022328983918479),\n", + " ('spectacular', 1.0198314108149955),\n", + " ('portrayal', 1.0189810189761024),\n", + " ('ann', 1.0127808528183286),\n", + " ('enterprise', 1.0116009116784799),\n", + " ('musicals', 1.0096648026516135),\n", + " ('deeply', 1.0094845087721023),\n", + " ('incredible', 1.0061677561461084),\n", + " ('mature', 1.0060195018402847),\n", + " ('triumph', 0.99682959435816731),\n", + " ('margaret', 0.99682959435816731),\n", + " ('navy', 0.99493385919326827),\n", + " ('harry', 0.99176919305006062),\n", + " ('lucas', 0.990398704027877),\n", + " ('sweet', 0.98966110487955483),\n", + " 
('joey', 0.98794672078059009),\n", + " ('oscar', 0.98721905111049713),\n", + " ('balance', 0.98649499054740353),\n", + " ('warm', 0.98485340331145166),\n", + " ('ages', 0.98449898190068863),\n", + " ('glover', 0.98082925301172619),\n", + " ('guilt', 0.98082925301172619),\n", + " ('carrey', 0.98082925301172619),\n", + " ('learns', 0.97881108885548895),\n", + " ('unusual', 0.97788374278196932),\n", + " ('sons', 0.97777581552483595),\n", + " ('complex', 0.97761897738147796),\n", + " ('essence', 0.97753435711487369),\n", + " ('brazil', 0.9769153536905899),\n", + " ('widow', 0.97650959186720987),\n", + " ('solid', 0.97537964824416146),\n", + " ('beautiful', 0.97326301262841053),\n", + " ('holmes', 0.97246100334120955),\n", + " ('awe', 0.97186058302896583),\n", + " ('vhs', 0.97116734209998934),\n", + " ('eerie', 0.97116734209998934),\n", + " ('lonely', 0.96873720724669754),\n", + " ('grim', 0.96873720724669754),\n", + " ('sport', 0.96825047080486615),\n", + " ('debut', 0.96508089604358704),\n", + " ('destiny', 0.96343751029985703),\n", + " ('thrillers', 0.96281074750904794),\n", + " ('tears', 0.95977584381389391),\n", + " ('rose', 0.95664202739772253),\n", + " ('feelings', 0.95551144502743635),\n", + " ('ginger', 0.95551144502743635),\n", + " ('winning', 0.95471810900804055),\n", + " ('stanley', 0.95387344302319799),\n", + " ('cox', 0.95343027882361187),\n", + " ('paris', 0.95278479030472663),\n", + " ('heart', 0.95238806924516806),\n", + " ('hooked', 0.95155887071161305),\n", + " ('comfortable', 0.94803943018873538),\n", + " ('mgm', 0.94446160884085151),\n", + " ('masterpiece', 0.94155039863339296),\n", + " ('themes', 0.94118828349588235),\n", + " ('danny', 0.93967118051821874),\n", + " ('anime', 0.93378388932167222),\n", + " ('perry', 0.93328830824272613),\n", + " ('joy', 0.93301752567946861),\n", + " ('lovable', 0.93081883243706487),\n", + " ('hal', 0.92953595862417571),\n", + " ('mysteries', 0.92953595862417571),\n", + " ('louis', 0.92871325187271225),\n", + " 
('charming', 0.92520609553210742),\n", + " ('urban', 0.92367083917177761),\n", + " ('allows', 0.92183091224977043),\n", + " ('impact', 0.91815814604895041),\n", + " ('gradually', 0.91629073187415511),\n", + " ('lifestyle', 0.91629073187415511),\n", + " ('italy', 0.91629073187415511),\n", + " ('spy', 0.91289514287301687),\n", + " ('treat', 0.91193342650519937),\n", + " ('subsequent', 0.91056005716517008),\n", + " ('kennedy', 0.90981821736853763),\n", + " ('loving', 0.90967549275543591),\n", + " ('surprising', 0.90937028902958128),\n", + " ('quiet', 0.90648673177753425),\n", + " ('winter', 0.90624039602065365),\n", + " ('reveals', 0.90490540964902977),\n", + " ('raw', 0.90445627422715225),\n", + " ('funniest', 0.90078654533818991),\n", + " ('pleased', 0.89994159387262562),\n", + " ('norman', 0.89994159387262562),\n", + " ('thief', 0.89874642222324552),\n", + " ('season', 0.89827222637147675),\n", + " ('secrets', 0.89794159320595857),\n", + " ('colorful', 0.89705936994626756),\n", + " ('highest', 0.8967461358011849),\n", + " ('compelling', 0.89462923509297576),\n", + " ('danes', 0.89248008318043659),\n", + " ('castle', 0.88967708335606499),\n", + " ('kudos', 0.88889175768604067),\n", + " ('great', 0.88810470901464589),\n", + " ('baseball', 0.88730319500090271),\n", + " ('subtitles', 0.88730319500090271),\n", + " ('bleak', 0.88730319500090271),\n", + " ('winner', 0.88643776872447388),\n", + " ('tragedy', 0.88563699078315261),\n", + " ('todd', 0.88551907320740142),\n", + " ('nicely', 0.87924946019380601),\n", + " ('arthur', 0.87546873735389985),\n", + " ('essential', 0.87373111745535925),\n", + " ('gorgeous', 0.8731725250935497),\n", + " ('fonda', 0.87294029100054127),\n", + " ('eastwood', 0.87139541196626402),\n", + " ('focuses', 0.87082835779739776),\n", + " ('enjoyed', 0.87070195951624607),\n", + " ('natural', 0.86997924506912838),\n", + " ('intensity', 0.86835126958503595),\n", + " ('witty', 0.86824103423244681),\n", + " ('rob', 0.8642954367557748),\n", + " 
('worlds', 0.86377269759070874),\n", + " ('health', 0.86113891179907498),\n", + " ('magical', 0.85953791528170564),\n", + " ('deeper', 0.85802182375017932),\n", + " ('lucy', 0.85618680780444956),\n", + " ('moving', 0.85566611005772031),\n", + " ('lovely', 0.85290640004681306),\n", + " ('purple', 0.8513711857748395),\n", + " ('memorable', 0.84801189112086062),\n", + " ('sings', 0.84729786038720367),\n", + " ('craig', 0.84342938360928321),\n", + " ('modesty', 0.84342938360928321),\n", + " ('relate', 0.84326559685926517),\n", + " ('episodes', 0.84223712084137292),\n", + " ('strong', 0.84167135777060931),\n", + " ('smith', 0.83959811108590054),\n", + " ('tear', 0.83704136022001441),\n", + " ('apartment', 0.83333115290549531),\n", + " ('princess', 0.83290912293510388),\n", + " ('disagree', 0.83290912293510388),\n", + " ('kung', 0.83173334384609199),\n", + " ('adventure', 0.83150561393278388),\n", + " ('columbo', 0.82667857318446791),\n", + " ('jake', 0.82667857318446791),\n", + " ('adds', 0.82485652591452319),\n", + " ('hart', 0.82472353834866463),\n", + " ('strength', 0.82417544296634937),\n", + " ('realizes', 0.82360006895738058),\n", + " ('dave', 0.8232003088081431),\n", + " ('childhood', 0.82208086393583857),\n", + " ('forbidden', 0.81989888619908913),\n", + " ('tight', 0.81883539572344199),\n", + " ('surreal', 0.8178506590609026),\n", + " ('manager', 0.81770990320170756),\n", + " ('dancer', 0.81574950265227764),\n", + " ('con', 0.81093021621632877),\n", + " ('studios', 0.81093021621632877),\n", + " ('miike', 0.80821651034473263),\n", + " ('realistic', 0.80807714723392232),\n", + " ('explicit', 0.80792269515237358),\n", + " ('kurt', 0.8060875917405409),\n", + " ('traditional', 0.80535917116687328),\n", + " ('deals', 0.80535917116687328),\n", + " ('holds', 0.80493858654806194),\n", + " ('carl', 0.80437281567016972),\n", + " ('touches', 0.80396154690023547),\n", + " ('gene', 0.80314807577427383),\n", + " ('albert', 0.8027669055771679),\n", + " ('abc', 
0.80234647252493729),\n", + " ('cry', 0.80011930011211307),\n", + " ('sides', 0.7995275841185171),\n", + " ('develops', 0.79850769621777162),\n", + " ('eyre', 0.79850769621777162),\n", + " ('dances', 0.79694397424158891),\n", + " ('oscars', 0.79633141679517616),\n", + " ('legendary', 0.79600456599965308),\n", + " ('importance', 0.79492987486988764),\n", + " ('hearted', 0.79492987486988764),\n", + " ('portraying', 0.79356592830699269),\n", + " ('impressed', 0.79258107754813223),\n", + " ('waters', 0.79112758892014912),\n", + " ('empire', 0.79078565012386137),\n", + " ('edge', 0.789774016249017),\n", + " ('environment', 0.78845736036427028),\n", + " ('jean', 0.78845736036427028),\n", + " ('sentimental', 0.7864791203521645),\n", + " ('captured', 0.78623760362595729),\n", + " ('styles', 0.78592891401091158),\n", + " ('daring', 0.78592891401091158),\n", + " ('backgrounds', 0.78275933924963248),\n", + " ('frank', 0.78275933924963248),\n", + " ('matches', 0.78275933924963248),\n", + " ('tense', 0.78275933924963248),\n", + " ('gothic', 0.78209466657644144),\n", + " ('sharp', 0.7814397877056235),\n", + " ('achieved', 0.78015855754957497),\n", + " ('court', 0.77947526404844247),\n", + " ('steals', 0.7789140023173704),\n", + " ('rules', 0.77844476107184035),\n", + " ('colors', 0.77684619943659217),\n", + " ('reunion', 0.77318988823348167),\n", + " ('covers', 0.77139937745969345),\n", + " ('tale', 0.77010822169607374),\n", + " ('rain', 0.7683706017975328),\n", + " ('denzel', 0.76804848873306297),\n", + " ('stays', 0.76787072675588186),\n", + " ('blob', 0.76725515271366718),\n", + " ('conventional', 0.76214005204689672),\n", + " ('maria', 0.76214005204689672),\n", + " ('fresh', 0.76158434211317383),\n", + " ('midnight', 0.76096977689870637),\n", + " ('landscape', 0.75852993982279704),\n", + " ('animated', 0.75768570169751648),\n", + " ('titanic', 0.75666058628227129),\n", + " ('sunday', 0.75666058628227129),\n", + " ('spring', 0.7537718023763802),\n", + " ('cagney', 
0.7537718023763802),\n", + " ('enjoyable', 0.75246375771636476),\n", + " ('immensely', 0.75198768058287868),\n", + " ('sir', 0.7507762933965817),\n", + " ('nevertheless', 0.75067102469813185),\n", + " ('driven', 0.74994477895307854),\n", + " ('performances', 0.74883252516063137),\n", + " ('memories', 0.74721440183022114),\n", + " ('nowadays', 0.74721440183022114),\n", + " ('simple', 0.74641420974143258),\n", + " ('golden', 0.74533293373051557),\n", + " ('leslie', 0.74533293373051557),\n", + " ('lovers', 0.74497224842453125),\n", + " ('relationship', 0.74484232345601786),\n", + " ('supporting', 0.74357803418683721),\n", + " ('che', 0.74262723782331497),\n", + " ('packed', 0.7410032017375805),\n", + " ('trek', 0.74021469141793106),\n", + " ('provoking', 0.73840377214806618),\n", + " ('strikes', 0.73759894313077912),\n", + " ('depiction', 0.73682224406260699),\n", + " ('emotional', 0.73678211645681524),\n", + " ('secretary', 0.7366322924996842),\n", + " ('influenced', 0.73511137965897755),\n", + " ('florida', 0.73511137965897755),\n", + " ('germany', 0.73288750920945944),\n", + " ('brings', 0.73142936713096229),\n", + " ('lewis', 0.73129894652432159),\n", + " ('elderly', 0.73088750854279239),\n", + " ('owner', 0.72743625403857748),\n", + " ('streets', 0.72666987259858895),\n", + " ('henry', 0.72642196944481741),\n", + " ('portrays', 0.72593700338293632),\n", + " ('bears', 0.7252354951114458),\n", + " ('china', 0.72489587887452556),\n", + " ('anger', 0.72439972406404984),\n", + " ('society', 0.72433010799663333),\n", + " ('available', 0.72415741730250549),\n", + " ('best', 0.72347034060446314),\n", + " ('bugs', 0.72270598280148979),\n", + " ('magic', 0.71878961117328299),\n", + " ('verhoeven', 0.71846498854423513),\n", + " ('delivers', 0.71846498854423513),\n", + " ('jim', 0.71783979315031676),\n", + " ('donald', 0.71667767797013937),\n", + " ('endearing', 0.71465338578090898),\n", + " ('relationships', 0.71393795022901896),\n", + " ('greatly', 
0.71256526641704687),\n", + " ('charlie', 0.71024161391924534),\n", + " ('brad', 0.71024161391924534),\n", + " ('simon', 0.70967648251115578),\n", + " ('effectively', 0.70914752190638641),\n", + " ('march', 0.70774597998109789),\n", + " ('atmosphere', 0.70744773070214162),\n", + " ('influence', 0.70733181555190172),\n", + " ('genius', 0.706392407309966),\n", + " ('emotionally', 0.70556970055850243),\n", + " ('ken', 0.70526854109229009),\n", + " ('identity', 0.70484322032313651),\n", + " ('sophisticated', 0.70470800296102132),\n", + " ('dan', 0.70457587638356811),\n", + " ('andrew', 0.70329955202396321),\n", + " ('india', 0.70144598337464037),\n", + " ('roy', 0.69970458110610434),\n", + " ('surprisingly', 0.6995780708902356),\n", + " ('sky', 0.69780919366575667),\n", + " ('romantic', 0.69664981111114743),\n", + " ('match', 0.69566924999265523),\n", + " ('britain', 0.69314718055994529),\n", + " ('beatty', 0.69314718055994529),\n", + " ('affected', 0.69314718055994529),\n", + " ('cowboy', 0.69314718055994529),\n", + " ('wave', 0.69314718055994529),\n", + " ('stylish', 0.69314718055994529),\n", + " ('bitter', 0.69314718055994529),\n", + " ('patient', 0.69314718055994529),\n", + " ('meets', 0.69314718055994529),\n", + " ('love', 0.69198533541937324),\n", + " ('paul', 0.68980827929443067),\n", + " ('andy', 0.68846333124751902),\n", + " ('performance', 0.68797386327972465),\n", + " ('patrick', 0.68645819240914863),\n", + " ('unlike', 0.68546468438792907),\n", + " ('brooks', 0.68433655087779044),\n", + " ('refuses', 0.68348526964820844),\n", + " ('award', 0.6824518914431974),\n", + " ('complaint', 0.6824518914431974),\n", + " ('ride', 0.68229716453587952),\n", + " ('dawson', 0.68171848473632257),\n", + " ('luke', 0.68158635815886937),\n", + " ('wells', 0.68087708796813096),\n", + " ('france', 0.6804081547825156),\n", + " ('handsome', 0.68007509899259255),\n", + " ('sports', 0.68007509899259255),\n", + " ('rebel', 0.67875844310784572),\n", + " ('directs', 
0.67875844310784572),\n", + " ('greater', 0.67605274720064523),\n", + " ('dreams', 0.67599410133369586),\n", + " ('effective', 0.67565402311242806),\n", + " ('interpretation', 0.67479804189174875),\n", + " ('works', 0.67445504754779284),\n", + " ('brando', 0.67445504754779284),\n", + " ('noble', 0.6737290947028437),\n", + " ('paced', 0.67314651385327573),\n", + " ('le', 0.67067432470788668),\n", + " ('master', 0.67015766233524654),\n", + " ('h', 0.6696166831497512),\n", + " ('rings', 0.66904962898088483),\n", + " ('easy', 0.66895995494594152),\n", + " ('city', 0.66820823221269321),\n", + " ('sunshine', 0.66782937257565544),\n", + " ('succeeds', 0.66647893347778397),\n", + " ('relations', 0.664159643686693),\n", + " ('england', 0.66387679825983203),\n", + " ('glimpse', 0.66329421741026418),\n", + " ('aired', 0.66268797307523675),\n", + " ('sees', 0.66263163663399482),\n", + " ('both', 0.66248336767382998),\n", + " ('definitely', 0.66199789483898808),\n", + " ('imaginative', 0.66139848224536502),\n", + " ('appreciate', 0.66083893732728749),\n", + " ('tricks', 0.66071190480679143),\n", + " ('striking', 0.66071190480679143),\n", + " ('carefully', 0.65999497324304479),\n", + " ('complicated', 0.65981076029235353),\n", + " ('perspective', 0.65962448852130173),\n", + " ('trilogy', 0.65877953705573755),\n", + " ('future', 0.65834665141052828),\n", + " ('lion', 0.65742909795786608),\n", + " ('victor', 0.65540685257709819),\n", + " ('douglas', 0.65540685257709819),\n", + " ('inspired', 0.65459851044271034),\n", + " ('marriage', 0.65392646740666405),\n", + " ('demands', 0.65392646740666405),\n", + " ('father', 0.65172321672194655),\n", + " ('page', 0.65123628494430852),\n", + " ('instant', 0.65058756614114943),\n", + " ('era', 0.6495567444850836),\n", + " ('ruthless', 0.64934455790155243),\n", + " ('saga', 0.64934455790155243),\n", + " ('joan', 0.64891392558311978),\n", + " ('joseph', 0.64841128671855386),\n", + " ('workers', 0.64829661439459352),\n", + " ('fantasy', 
0.64726757480925168),\n", + " ('accomplished', 0.64551913157069074),\n", + " ('distant', 0.64551913157069074),\n", + " ('manhattan', 0.64435701639051324),\n", + " ('personal', 0.64355023942057321),\n", + " ('pushing', 0.64313675998528386),\n", + " ('meeting', 0.64313675998528386),\n", + " ('individual', 0.64313675998528386),\n", + " ('pleasant', 0.64250344774119039),\n", + " ('brave', 0.64185388617239469),\n", + " ('william', 0.64083139119578469),\n", + " ('hudson', 0.64077919504262937),\n", + " ('friendly', 0.63949446706762514),\n", + " ('eccentric', 0.63907995928966954),\n", + " ('awards', 0.63875310849414646),\n", + " ('jack', 0.63838309514997038),\n", + " ('seeking', 0.63808740337691783),\n", + " ('colonel', 0.63757732940513456),\n", + " ('divorce', 0.63757732940513456),\n", + " ('jane', 0.63443957973316734),\n", + " ('keeping', 0.63414883979798953),\n", + " ('gives', 0.63383568159497883),\n", + " ('ted', 0.63342794585832296),\n", + " ('animation', 0.63208692379869902),\n", + " ('progress', 0.6317782341836532),\n", + " ('concert', 0.63127177684185776),\n", + " ('larger', 0.63127177684185776),\n", + " ('nation', 0.6296337748376194),\n", + " ('albeit', 0.62739580299716491),\n", + " ('adapted', 0.62613647027698516),\n", + " ('discovers', 0.62542900650499444),\n", + " ('classic', 0.62504956428050518),\n", + " ('segment', 0.62335141862440335),\n", + " ('morgan', 0.62303761437291871),\n", + " ('mouse', 0.62294292188669675),\n", + " ('impressive', 0.62211140744319349),\n", + " ('artist', 0.62168821657780038),\n", + " ('ultimate', 0.62168821657780038),\n", + " ('griffith', 0.62117368093485603),\n", + " ('emily', 0.62082651898031915),\n", + " ('drew', 0.62082651898031915),\n", + " ('moved', 0.6197197120051281),\n", + " ('profound', 0.61903920840622351),\n", + " ('families', 0.61903920840622351),\n", + " ('innocent', 0.61851219917136446),\n", + " ('versions', 0.61730910416844087),\n", + " ('eddie', 0.61691981517206107),\n", + " ('criticism', 0.61651395453902935),\n", + " 
('nature', 0.61594514653194088),\n", + " ('recognized', 0.61518563909023349),\n", + " ('sexuality', 0.61467556511845012),\n", + " ('contract', 0.61400986000122149),\n", + " ('brian', 0.61344043794920278),\n", + " ('remembered', 0.6131044728864089),\n", + " ('determined', 0.6123858239154869),\n", + " ('offers', 0.61207935747116349),\n", + " ('pleasure', 0.61195702582993206),\n", + " ('washington', 0.61180154110599294),\n", + " ('images', 0.61159731359583758),\n", + " ('games', 0.61067095873570676),\n", + " ('academy', 0.60872983874736208),\n", + " ('fashioned', 0.60798937221963845),\n", + " ('melodrama', 0.60749173598145145),\n", + " ('peoples', 0.60613580357031549),\n", + " ('charismatic', 0.60613580357031549),\n", + " ('rough', 0.60613580357031549),\n", + " ('dealing', 0.60517840761398811),\n", + " ('fine', 0.60496962268013299),\n", + " ('tap', 0.60391604683200273),\n", + " ('trio', 0.60157998703445481),\n", + " ('russell', 0.60120968523425966),\n", + " ('figures', 0.60077386042893011),\n", + " ('ward', 0.60005675749393339),\n", + " ('shine', 0.59911823091166894),\n", + " ('brady', 0.59911823091166894),\n", + " ('job', 0.59845562125168661),\n", + " ('satisfied', 0.59652034487087369),\n", + " ('river', 0.59637962862495086),\n", + " ('brown', 0.595773016534769),\n", + " ('believable', 0.59566072133302495),\n", + " ('bound', 0.59470710774669278),\n", + " ('always', 0.59470710774669278),\n", + " ('hall', 0.5933967777928858),\n", + " ('cook', 0.5916777203950857),\n", + " ('claire', 0.59136448625000293),\n", + " ('broadway', 0.59033768669372433),\n", + " ('anna', 0.58778666490211906),\n", + " ('peace', 0.58628403501758408),\n", + " ('visually', 0.58539431926349916),\n", + " ('falk', 0.58525821854876026),\n", + " ('morality', 0.58525821854876026),\n", + " ('growing', 0.58466653756587539),\n", + " ('experiences', 0.58314628534561685),\n", + " ('stood', 0.58314628534561685),\n", + " ('touch', 0.58122926435596001),\n", + " ('lives', 0.5810976767513224),\n", + " ('kubrick', 
0.58066919713325493),\n", + " ('timing', 0.58047401805583243),\n", + " ('struggles', 0.57981849525294216),\n", + " ('expressions', 0.57981849525294216),\n", + " ('authentic', 0.57848427223980559),\n", + " ('helen', 0.57763429343810091),\n", + " ('pre', 0.57700753064729182),\n", + " ('quirky', 0.5753641449035618),\n", + " ('young', 0.57531672344534313),\n", + " ('inner', 0.57454143815209846),\n", + " ('mexico', 0.57443087372056334),\n", + " ('clint', 0.57380042292737909),\n", + " ('sisters', 0.57286101468544337),\n", + " ('realism', 0.57226528899949558),\n", + " ('personalities', 0.5720692490067093),\n", + " ('french', 0.5720692490067093),\n", + " ('surprises', 0.57113222999698177),\n", + " ('adventures', 0.57113222999698177),\n", + " ('overcome', 0.5697681593994407),\n", + " ('timothy', 0.56953322459276867),\n", + " ('tales', 0.56909453188996639),\n", + " ('war', 0.56843317302781682),\n", + " ('civil', 0.5679840376059393),\n", + " ('countries', 0.56737779327091187),\n", + " ('streep', 0.56710645966458029),\n", + " ('tradition', 0.56685345523565323),\n", + " ('oliver', 0.56673325570428668),\n", + " ('australia', 0.56580775818334383),\n", + " ('understanding', 0.56531380905006046),\n", + " ('players', 0.56509525370004821),\n", + " ('knowing', 0.56489284503626647),\n", + " ('rogers', 0.56421349718405212),\n", + " ('suspenseful', 0.56368911332305849),\n", + " ('variety', 0.56368911332305849),\n", + " ('true', 0.56281525180810066),\n", + " ('jr', 0.56220982311246936),\n", + " ('psychological', 0.56108745854687891),\n", + " ('branagh', 0.55961578793542266),\n", + " ('wealth', 0.55961578793542266),\n", + " ('performing', 0.55961578793542266),\n", + " ('odds', 0.55961578793542266),\n", + " ('sent', 0.55961578793542266),\n", + " ('reminiscent', 0.55961578793542266),\n", + " ('grand', 0.55961578793542266),\n", + " ('overwhelming', 0.55961578793542266),\n", + " ('brothers', 0.55891181043362848),\n", + " ('howard', 0.55811089675600245),\n", + " ('david', 
0.55693122256475369),\n", + " ('generation', 0.55628799784274796),\n", + " ('grow', 0.55612538299565417),\n", + " ('survival', 0.55594605904646033),\n", + " ('mainstream', 0.55574731115750231),\n", + " ('dick', 0.55431073570572953),\n", + " ('charm', 0.55288175575407861),\n", + " ('kirk', 0.55278982286502287),\n", + " ('twists', 0.55244729845681018),\n", + " ('gangster', 0.55206858230003986),\n", + " ('jeff', 0.55179306225421365),\n", + " ('family', 0.55116244510065526),\n", + " ('tend', 0.55053307336110335),\n", + " ('thanks', 0.55049088015842218),\n", + " ('world', 0.54744234723432639),\n", + " ('sutherland', 0.54743536937855164),\n", + " ('life', 0.54695514434959924),\n", + " ('disc', 0.54654370636806993),\n", + " ('bug', 0.54654370636806993),\n", + " ('tribute', 0.5455111817538808),\n", + " ('europe', 0.54522705048332309),\n", + " ('sacrifice', 0.54430155296238014),\n", + " ('color', 0.54405127139431109),\n", + " ('superior', 0.54333490233128523),\n", + " ('york', 0.54318235866536513),\n", + " ('pulls', 0.54266622962164945),\n", + " ('hearts', 0.54232429082536171),\n", + " ('jackson', 0.54232429082536171),\n", + " ('enjoy', 0.54124285135906114),\n", + " ('redemption', 0.54056759296472823),\n", + " ('madness', 0.540384426007535),\n", + " ('hamilton', 0.5389965007326869),\n", + " ('stands', 0.5389965007326869),\n", + " ('trial', 0.5389965007326869),\n", + " ('greek', 0.5389965007326869),\n", + " ('each', 0.5388212312554177),\n", + " ('faithful', 0.53773307668591508),\n", + " ('received', 0.5372768098531604),\n", + " ('jealous', 0.53714293208336406),\n", + " ('documentaries', 0.53714293208336406),\n", + " ('different', 0.53709860682460819),\n", + " ('describes', 0.53680111016925136),\n", + " ('shorts', 0.53596159703753288),\n", + " ('brilliance', 0.53551823635636209),\n", + " ('mountains', 0.53492317534505118),\n", + " ('share', 0.53408248593025787),\n", + " ('dealt', 0.53408248593025787),\n", + " ('providing', 0.53329847961804933),\n", + " ('explore', 
0.53329847961804933),\n", + " ('series', 0.5325809226575603),\n", + " ('fellow', 0.5323318289869543),\n", + " ('loves', 0.53062825106217038),\n", + " ('olivier', 0.53062825106217038),\n", + " ('revolution', 0.53062825106217038),\n", + " ('roman', 0.53062825106217038),\n", + " ('century', 0.53002783074992665),\n", + " ('musical', 0.52966871156747064),\n", + " ('heroic', 0.52925932545482868),\n", + " ('ironically', 0.52806743020049673),\n", + " ('approach', 0.52806743020049673),\n", + " ('temple', 0.52806743020049673),\n", + " ('moves', 0.5279372642387119),\n", + " ('gift', 0.52702030968597136),\n", + " ('julie', 0.52609309589677911),\n", + " ('tells', 0.52415107836314001),\n", + " ('radio', 0.52394671172868779),\n", + " ('uncle', 0.52354439617376536),\n", + " ('union', 0.52324814376454787),\n", + " ('deep', 0.52309571635780505),\n", + " ('reminds', 0.52157841554225237),\n", + " ('famous', 0.52118841080153722),\n", + " ('jazz', 0.52053443789295151),\n", + " ('dennis', 0.51987545928590861),\n", + " ('epic', 0.51919387343650736),\n", + " ('adult', 0.519167695083386),\n", + " ('shows', 0.51915322220375304),\n", + " ('performed', 0.5191244265806858),\n", + " ('demons', 0.5191244265806858),\n", + " ('eric', 0.51879379341516751),\n", + " ('discovered', 0.51879379341516751),\n", + " ('youth', 0.5185626062681431),\n", + " ('human', 0.51851411224987087),\n", + " ('tarzan', 0.51813827061227724),\n", + " ('ourselves', 0.51794309153485463),\n", + " ('wwii', 0.51758240622887042),\n", + " ('passion', 0.5162164724008671),\n", + " ('desire', 0.51607497965213445),\n", + " ('pays', 0.51581316527702981),\n", + " ('fox', 0.51557622652458857),\n", + " ('dirty', 0.51557622652458857),\n", + " ('symbolism', 0.51546600332249293),\n", + " ('sympathetic', 0.51546600332249293),\n", + " ('attitude', 0.51530993621331933),\n", + " ('appearances', 0.51466440007315639),\n", + " ('jeremy', 0.51466440007315639),\n", + " ('fun', 0.51439068993048687),\n", + " ('south', 0.51420972175023116),\n", + " 
('arrives', 0.51409894911095988),\n", + " ('present', 0.51341965894303732),\n", + " ('com', 0.51326167856387173),\n", + " ('smile', 0.51265880484765169),\n", + " ('fits', 0.51082562376599072),\n", + " ('provided', 0.51082562376599072),\n", + " ('carter', 0.51082562376599072),\n", + " ('ring', 0.51082562376599072),\n", + " ('aging', 0.51082562376599072),\n", + " ('countryside', 0.51082562376599072),\n", + " ('alan', 0.51082562376599072),\n", + " ('visit', 0.51082562376599072),\n", + " ('begins', 0.51015650363396647),\n", + " ('success', 0.50900578704900468),\n", + " ('japan', 0.50900578704900468),\n", + " ('accurate', 0.50895471583017893),\n", + " ('proud', 0.50800474742434931),\n", + " ('daily', 0.5075946031845443),\n", + " ('atmospheric', 0.50724780241810674),\n", + " ('karloff', 0.50724780241810674),\n", + " ('recently', 0.50714914903668207),\n", + " ('fu', 0.50704490092608467),\n", + " ('horrors', 0.50656122497953315),\n", + " ('finding', 0.50637127341661037),\n", + " ('lust', 0.5059356384717989),\n", + " ('hitchcock', 0.50574947073413001),\n", + " ('among', 0.50334004951332734),\n", + " ('viewing', 0.50302139827440906),\n", + " ('shining', 0.50262885656181222),\n", + " ('investigation', 0.50262885656181222),\n", + " ('duo', 0.5020919437972361),\n", + " ('cameron', 0.5020919437972361),\n", + " ('finds', 0.50128303100539795),\n", + " ('contemporary', 0.50077528791248915),\n", + " ('genuine', 0.50046283673044401),\n", + " ('frightening', 0.49995595152908684),\n", + " ('plays', 0.49975983848890226),\n", + " ('age', 0.49941323171424595),\n", + " ('position', 0.49899116611898781),\n", + " ('continues', 0.49863035067217237),\n", + " ('roles', 0.49839716550752178),\n", + " ('james', 0.49837216269470402),\n", + " ('individuals', 0.49824684155913052),\n", + " ('brought', 0.49783842823917956),\n", + " ('hilarious', 0.49714551986191058),\n", + " ('brutal', 0.49681488669639234),\n", + " ('appropriate', 0.49643688631389105),\n", + " ('dance', 0.49581998314812048),\n", + " 
('league', 0.49578774640145024),\n", + " ('helping', 0.49578774640145024),\n", + " ('answers', 0.49578774640145024),\n", + " ('stunts', 0.49561620510246196),\n", + " ('traveling', 0.49532143723002542),\n", + " ('thoroughly', 0.49414593456733524),\n", + " ('depicted', 0.49317068852726992),\n", + " ('honor', 0.49247648509779424),\n", + " ('combination', 0.49247648509779424),\n", + " ('differences', 0.49247648509779424),\n", + " ('fully', 0.49213349075383811),\n", + " ('tracy', 0.49159426183810306),\n", + " ('battles', 0.49140753790888908),\n", + " ('possibility', 0.49112055268665822),\n", + " ('romance', 0.4901589869574316),\n", + " ('initially', 0.49002249613622745),\n", + " ('happy', 0.4898997500608791),\n", + " ('crime', 0.48977221456815834),\n", + " ('singing', 0.4893852925281213),\n", + " ('especially', 0.48901267837860624),\n", + " ('shakespeare', 0.48754793889664511),\n", + " ('hugh', 0.48729512635579658),\n", + " ('detail', 0.48609484250827351),\n", + " ('guide', 0.48550781578170082),\n", + " ('companion', 0.48550781578170082),\n", + " ('julia', 0.48550781578170082),\n", + " ('san', 0.48550781578170082),\n", + " ('desperation', 0.48550781578170082),\n", + " ('strongly', 0.48460242866688824),\n", + " ('necessary', 0.48302334245403883),\n", + " ('humanity', 0.48265474679929443),\n", + " ('drama', 0.48221998493060503),\n", + " ('warming', 0.48183808689273838),\n", + " ('intrigue', 0.48183808689273838),\n", + " ('nonetheless', 0.48183808689273838),\n", + " ('cuba', 0.48183808689273838),\n", + " ('planned', 0.47957308026188628),\n", + " ('pictures', 0.47929937011921681),\n", + " ('broadcast', 0.47849024312305422),\n", + " ('nine', 0.47803580094299974),\n", + " ('settings', 0.47743860773325364),\n", + " ('history', 0.47732966933780852),\n", + " ('ordinary', 0.47725880012690741),\n", + " ('trade', 0.47692407209030935),\n", + " ('primary', 0.47608267532211779),\n", + " ('official', 0.47608267532211779),\n", + " ('episode', 0.47529620261150429),\n", + " ('role', 
0.47520268270188676),\n", + " ('spirit', 0.47477690799839323),\n", + " ('grey', 0.47409361449726067),\n", + " ('ways', 0.47323464982718205),\n", + " ('cup', 0.47260441094579297),\n", + " ('piano', 0.47260441094579297),\n", + " ('familiar', 0.47241617565111949),\n", + " ('sinister', 0.47198579044972683),\n", + " ('reveal', 0.47171449364936496),\n", + " ('max', 0.47150852042515579),\n", + " ('dated', 0.47121648567094482),\n", + " ('discovery', 0.47000362924573563),\n", + " ('vicious', 0.47000362924573563),\n", + " ('losing', 0.47000362924573563),\n", + " ('genuinely', 0.46871413841586385),\n", + " ('hatred', 0.46734051182625186),\n", + " ('mistaken', 0.46702300110759781),\n", + " ('dream', 0.46608972992459924),\n", + " ('challenge', 0.46608972992459924),\n", + " ('crisis', 0.46575733836428446),\n", + " ('photographed', 0.46488852857896512),\n", + " ('machines', 0.46430560813109778),\n", + " ('critics', 0.46430560813109778),\n", + " ('bird', 0.46430560813109778),\n", + " ('born', 0.46411383518967209),\n", + " ('detective', 0.4636633473511525),\n", + " ('higher', 0.46328467899699055),\n", + " ('remains', 0.46262352194811296),\n", + " ('inevitable', 0.46262352194811296),\n", + " ('soviet', 0.4618180446592961),\n", + " ('ryan', 0.46134556650262099),\n", + " ('african', 0.46112595521371813),\n", + " ('smaller', 0.46081520319132935),\n", + " ('techniques', 0.46052488529119184),\n", + " ('information', 0.46034171833399862),\n", + " ('deserved', 0.45999798712841444),\n", + " ('cynical', 0.45953232937844013),\n", + " ('lynch', 0.45953232937844013),\n", + " ('francisco', 0.45953232937844013),\n", + " ('tour', 0.45953232937844013),\n", + " ('spielberg', 0.45953232937844013),\n", + " ('struggle', 0.45911782160048453),\n", + " ('language', 0.45902121257712653),\n", + " ('visual', 0.45823514408822852),\n", + " ('warner', 0.45724137763188427),\n", + " ('social', 0.45720078250735313),\n", + " ('reality', 0.45719346885019546),\n", + " ('hidden', 0.45675840249571492),\n", + " 
('breaking', 0.45601738727099561),\n", + " ('sometimes', 0.45563021171182794),\n", + " ('modern', 0.45500247579345005),\n", + " ('surfing', 0.45425527227759638),\n", + " ('popular', 0.45410691533051023),\n", + " ('surprised', 0.4534409399850382),\n", + " ('follows', 0.45245361754408348),\n", + " ('keeps', 0.45234869400701483),\n", + " ('john', 0.4520909494482197),\n", + " ('defeat', 0.45198512374305722),\n", + " ('mixed', 0.45198512374305722),\n", + " ('justice', 0.45142724367280018),\n", + " ('treasure', 0.45083371313801535),\n", + " ('presents', 0.44973793178615257),\n", + " ('years', 0.44919197032104968),\n", + " ('chief', 0.44895022004790319),\n", + " ('shadows', 0.44802472252696035),\n", + " ('closely', 0.44701411102103689),\n", + " ('segments', 0.44701411102103689),\n", + " ('lose', 0.44658335503763702),\n", + " ('caine', 0.44628710262841953),\n", + " ('caught', 0.44610275383999071),\n", + " ('hamlet', 0.44558510189758965),\n", + " ('chinese', 0.44507424620321018),\n", + " ('welcome', 0.44438052435783792),\n", + " ('birth', 0.44368632092836219),\n", + " ('represents', 0.44320543609101143),\n", + " ('puts', 0.44279106572085081),\n", + " ('fame', 0.44183275227903923),\n", + " ('closer', 0.44183275227903923),\n", + " ('visuals', 0.44183275227903923),\n", + " ('web', 0.44183275227903923),\n", + " ('criminal', 0.4412745608048752),\n", + " ('minor', 0.4409224199448939),\n", + " ('jon', 0.44086703515908027),\n", + " ('liked', 0.44074991514020723),\n", + " ('restaurant', 0.44031183943833246),\n", + " ('flaws', 0.43983275161237217),\n", + " ('de', 0.43983275161237217),\n", + " ('searching', 0.4393666597838457),\n", + " ('rap', 0.43891304217570443),\n", + " ('light', 0.43884433018199892),\n", + " ('elizabeth', 0.43872232986464677),\n", + " ('marry', 0.43861731542506488),\n", + " ('oz', 0.43825493093115531),\n", + " ('controversial', 0.43825493093115531),\n", + " ('learned', 0.43825493093115531),\n", + " ('slowly', 0.43785660389939979),\n", + " ('bridge', 
0.43721380642274466),\n", + " ('thrilling', 0.43721380642274466),\n", + " ('wayne', 0.43721380642274466),\n", + " ('comedic', 0.43721380642274466),\n", + " ('married', 0.43658501682196887),\n", + " ('nazi', 0.4361020775700542),\n", + " ('murder', 0.4353180712578455),\n", + " ('physical', 0.4353180712578455),\n", + " ('johnny', 0.43483971678806865),\n", + " ('michelle', 0.43445264498141672),\n", + " ('wallace', 0.43403848055222038),\n", + " ('silent', 0.43395706390247063),\n", + " ('comedies', 0.43395706390247063),\n", + " ('played', 0.43387244114515305),\n", + " ('international', 0.43363598507486073),\n", + " ('vision', 0.43286408229627887),\n", + " ('intelligent', 0.43196704885367099),\n", + " ('shop', 0.43078291609245434),\n", + " ('also', 0.43036720209769169),\n", + " ('levels', 0.4302451371066513),\n", + " ('miss', 0.43006426712153217),\n", + " ('ocean', 0.4295626596872249),\n", + " ...]" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"POSITIVE\" label\n", + "pos_neg_ratios.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('boll', -4.0778152602708904),\n", + " ('uwe', -3.9218753018711578),\n", + " ('seagal', -3.3202501058581921),\n", + " ('unwatchable', -3.0269848170580955),\n", + " ('stinker', -2.9876839403711624),\n", + " ('mst', -2.7753833211707968),\n", + " ('incoherent', -2.7641396677532537),\n", + " ('unfunny', -2.5545257844967644),\n", + " ('waste', -2.4907515123361046),\n", + " ('blah', -2.4475792789485005),\n", + " ('horrid', -2.3715779644809971),\n", + " ('pointless', -2.3451073877136341),\n", + " ('atrocious', -2.3187369339642556),\n", + " ('redeeming', -2.2667790015910296),\n", + " ('prom', -2.2601040980178784),\n", + " ('drivel', -2.2476029585766928),\n", + " ('lousy', -2.2118080125207054),\n", + 
" ('worst', -2.1930856334332267),\n", + " ('laughable', -2.172468615469592),\n", + " ('awful', -2.1385076866397488),\n", + " ('poorly', -2.1326133844207011),\n", + " ('wasting', -2.1178155545614512),\n", + " ('remotely', -2.111046881095167),\n", + " ('existent', -2.0024805005437076),\n", + " ('boredom', -1.9241486572738005),\n", + " ('miserably', -1.9216610938019989),\n", + " ('sucks', -1.9166645809588516),\n", + " ('uninspired', -1.9131499212248517),\n", + " ('lame', -1.9117232884159072),\n", + " ('insult', -1.9085323769376259)]" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"NEGATIVE\" label\n", + "list(reversed(pos_neg_ratios.most_common()))[0:30]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Transforming Text into Numbers" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
ITBiD8G9077fkJiJ5O56gR2HhXNgC4ltRSO/m48J5w7/z6hcuL+53Ezo7TveizvMGkFXy5M1S\nJ29wWB6cv0jZUzJxOjq/keXLl1v/kbRvlnHl6vzgIJDnfydP3jwI77XXXp3s3aYCOokSHhxxxBGd\nlElfaCEjzn+PzFE+ZlHnSEvAQCcXXnjhMP8LXrIdSXFpivomAJoTLCG+fv60DGkOOuggl9RgPUGv\nrJJmmixNHbmJSJaOdwoSzc2Zl7gRcKRBYJXOzOTSQkT8myXKx4JpDRcxDmchJI9+ru4k30V2dpL6\n8qZhXjYc8S9vmf4/Zdqy8uRNW5dLj+UBX4x+Tsm4ut238wc577zzrC6OGLnr+hYCvRDI87+TJ28v\nvbpdf9/73te5/OMf/7hznPfAd+TEZ8ONA5TLy60fNZWoq05cBFF+48fn5+PY+fa59FHfLKZwgzzT\nzZ/61KeikkWe86f2IxOETrrpGXfa6Rc1LYP/iHshZ2xFL//Zzzjsj53hKKtEq3biynG/C/sOzDK5\nhKU+gTKdj7+cNWj0sNgSQSM6dYWXDJEvvO6a374EJshOPdTpL/UML99lyRKSVT90de3y20SZcdf8\n8/7yXadHcJN0ykQvJ36+cJtJQ/1OFzDwxZ3nO6ynny587JZlhs+n/e0v7UIH+oG+TSqk9dtHGZSZ\nVVgemWZZcvDgGPrGN76Rtbpc+aKW87Jcl/P9FnAAO+4Lt0QXHMMfAqGRJvBRqETPfuNSdH0O37zl\n1u3/jvs2sOYlbpb/P8+zMiz+czvueeA/+xhrnPjPUz9N+Nh/BpM/fL3bb1dfeNzplsevD12j0ro0\nfvtJFyU+hq4sfzmvnydcnksf/ga7sPjjlj/mhtPl+R3dwpQlZul4n1TQUDd4+f9gYVCS3izhzsii\nn58nPMDHXcva2X55eYiIu6nczdytG4t6IEb9M6AHDxf6kutRH66Rxunsf4fx7taO8DUCbKXZYI3B\nl4G/34N/N8LRL32oB7yIawH+fDuiAS5RH9Jz70BQyJMnIFq47wbhN5imIcpxmNTt/y5tu8LPcvf8\nd+31n6VpiQhlxz1b3HMm6hnj1+nSue8w4XBEhG9/oHbp+WaMYyxy5yjDlygd3bM7rIufzx2DmSvb\nfXcjCnH3jMvLOOTaFVdHuJ9curzfhRCRtB2PtcI1nm//puh2jVopbAYAACaLSURBVMaGO8gvh2M6\nNwxWWv2oxycHvn69rmXpbL+utESk282MrnGS9sERVw5YR+kQ7pekv6P6L67uqPNpIjz6Az549Euo\nCwtEN0E3yEoZgjXDEQmIB7976ROnB21xAdEgJRCVrGXF1dGm8/QpOOWVuv3fpX0BoP3+cyM8gPrP\n+bRExGHLs9ivg2cQ5CDqGevycI363POKZzO6cN6d49sfYxizfMLh8lBmHIHhWjgf5VIX4ref83Hi\nt89/oY9LT51hndA3PMa5/PSLa3dcP7i0eb7jW5ih1KQd74MHCGEJW0vCLA0w/TQA1Q1MV35S/UhP\nea4Dwp3U7Rp503a2X17UPwn1O11oty/dbmY/XfiYgS6NKTWc3/+NDn6fOl3TflMGZeURBlgG1iQS\nJh/h30nKSJPGTX8kzZM2fa9yaR+DYFmEwREc7iusJpJoBPi/KIKs1en/DkILGUkj/mAbfq6lKacf\naf1nMAP+oIhPsBxJKqPthRKRMhRUmeUhwIBU5Fs3/6w+qUpKRMgTJntZW530IR9FOnwLSdb64/Ll\nsXCga56Bi7oJvw1BSDtYxLWn23n0hRByf0Xh3C3vIFxLQ5aT4FGH/7ssfY1VwU1rJHmbT4JF1jT+\nixTPI//ll5dDpyfPF9IOgtA/7hkOJmXKKAoPKpMMIAJXXHGFjXzKio0iBe/05557zgZ5Y437L37x\ni2HF/8Vf/IUhWixLnj/ykY8MW/I2LGHKHxs2bLDr93utBHDxQ4KBe
UQNxPLgfJEhy6MCl42ouMeJ\nrGWAybnnnmumT59u5s+fX2i7eqhsWJIcvOkaljWyZYPk9wjcfPPNNvzAjTfeWCgkVf3f8f9EAL6A\n8KZuDytSXETSgFCVGnujm3K+Ht3ScS2wDGSKl9Sr3Lpd9zEpvc1lshyVXW8E2KiqqA24qm4p0wKz\nZ8+2/gq9dOn1lt7req/y/etYnPJYM/yy0lpV8N3ACpJ0qsqvq6hjdOYe41MUDkXpVkU5YMAqrdGj\nR7cGjyz+IT72zhpR9lu3X2fUse8bEpCCjjXAP677FFJUu7Kc861V/bAAaWomSy+1JA8PRQYqBoum\nC235q7/6K/uQh0jEkYm48377KauIKSvqoqwihfKStIE5ewb/ItpRhP7oU/RUYBF69aMM+oA+4+P6\nAyyqJIhFtjtvW/B1cYN9UVO0WduHH4TvF+H0wsEzyn8vaz11z0c/0HampPxpqrL01tRMgPYgC9Mz\nSNFm4n5j+vDDD5tbbrllWKRDoqU6IQCQm26JmpJx6dx3t+kblybu2wUpK3O/mG6b5tGnRHRcu3Zt\np81xuvbzPHqxWRtTZ20OY8+9409TRG1uyLTVI488MmxH8X72RVF1cR9OnTp1WHuLKlvlDA4CIiKD\n09eRLXXzu8GbWq0GrUhlu5w89NBDrQ/EaaedFpkKchBMRdldKknAXjO9CEmWsO/gSV39GGij/Ebq\nSkJcpzgy0vT7zbXHffukN8m9xT0CQSFfr/vQ1VHH79NPP90cffTR5tJLL62jetKpIQiIiDSko8pU\nk8GBEL9NfZhgDbnmmmvM5s2bY2EKk4okb60UFs4XW0FwIYoYdEtfxDWf+OAEedddd5mNGzfWmlTW\nnSwl6Rf6Opgm6yTNYv2iv9gjhNDgTRT+N7CGtI1UNrEvmq6ziEjTe7AA/RnMeIvDnNy0tzPeLI86\n6iizcOFCc/zxx0eiQfuQbm2LG1gon/y9LBw8lKNM8JEKFXwSHbH2/PM//7N5+umne+pacPWZimP3\n0MCHpTGracCYAddJEX1NmZSzZs0au+rEld2Ub1lDmtJT9ddTRKT+fdQXDdnxeMuWLY17O0uidxqr\nhgObPE7+93//13z0ox+NtDK4ASrLG7ErP++3G9BuvfVWEzc1lbeOovND7sDs3nvvjSWQRdeZtjz/\nHsDHqBcZTVs+6fEVWbRoUe2tWOG2Ob27WSHDefRbCMQhICISh8yAnWcww7IwZ84cU3RckbKgZKDA\nNMx3nLWDa3lJAthE+ZcwmHKtjAEqDWZMdbCV+tKlS9Nkqzytm1Kry1QS/ek7mea9b5ICjGUhWHnS\nGOsQ1kMsWk215CTtF6XrHwIiIv3DuvY18YDBVIwJuurBtRdYSawADCxIHEnpVYd/nfooD1z4JmDb\nu9/9bhPEg6hsSgb93KDQjYz57ajbcdXmfXBzksTJ1KUt8pv7CdLTBIsW/wdTpkyxLwBN9Skrsu9U\nVjEIiIgUg2NrSuEt9ZJLLqn1Ekv3MAwCIHVddlyENcTvWEdseGv2fQTi/Ev8vGUdVz2Q520XfdRP\nh8cq+6obVi4CLkt6ua/rKjNnzrTWt6Y62NYV10HXS0Rk0O+AiPbXeVWDIyF77rlnV3+WokkIMFE3\nUzS9pq78t+yyfAvQx1lDmr5qoUwyRZ8V7WQK9mXIv//7v9upUVbS1NEiWefnQhn9oTL7h4CISP+w\nblRN7qGzePHi2jwUHQkByG7BupzloogpGddplEn9DBBpSE54ICzS/E8fsV9P0/dxcVYR3z/D4Z7l\nu19EMItucXnc/bV169ZaWiTd86Db/11c23ReCPRCQESkF0IDfJ2HT10iYfKg/vSnP232228/u8rA\nRUmN6p40RCEqf/gclgfqc8QGcoE+Wd5ayecPuP4UT7jebr/Rgby01enVLX3drxGQrtsS7G76hzHt\nl5NpN53SXEN/xPWjmx6tg88I9
xk+IYhIiIVBf0pA4E//XyAllKsiW4BAsNmRHfhZTbPvvvsaBosq\n5MEHH7R+BGeccYYdrN75znfGqlEGCWGA2GuvvTp1Uj9Levnupksng3cAocEq4j7btm0zP/7xjy2x\n+fWvfz2sHi/biMNvf/vbxr09j7jYwBPvete7zLPPPmu453oJg+Mrr7xiMWMQZ/oLUuYw7ZW/TtfD\nJATdDjzwQPu/NmvWLPPb3/7WfPzjH69EZf6XWA7O0vXbb789cvl6JYqp0tYhIItI67q0+AZhETjz\nzDPN/vvvb4gG6d7ciq9peIkMOCwnfuyxx6wVhCWO3awQUQ/14SWm+8WDuJvFomjSQ3t9f4Zu0zhY\nq5ocDTfcE/QdlgzfWuSn8Z1My/S78ess+7jX/cp1rIDIggULci9DT9oe7sOvfvWrlnxcd911PX2i\nkpardEIgFoGydtNTue1CgF1fb7rpJrsjIzupBgNGaQ10dQWEZ2jGjBmdHWw5HwzUsfUm2ZU2NrN3\ngXqSlpU0nVd84kMwpnz3QS8n7HjaDQuXrknffptcH0S1vUltitOVvk36P8T/Hf8LZf/foWvgjG3r\nmjZtWmL94tqo80IgKQImaUKlEwIgwMOTB2LAbO13kYMhZbuHLg/CqEHeDVDh3ohKG06T5Dc6pGkT\n6fn0Q9CLdr700ksW/37U2c86zj333KGrrrrKtjFNH/RTxyLqynLPcN/7/3dF3e+0h7L5v4MI8lm/\nfn0RzVQZQiAxAn8SayrRBSEQgQDTMjfeeGPHhE6ERT5M2TBVkVYwuRMumjKYiti+fbuN2Eicgiin\nQ3wsOE9dmJARTNjkzSvognSb/gnXAR7o4XQJXy/yN3rR9t/97ncmIGpFFl2Lsg4//HDzZ3/2Z7aN\nafqgFsonVKLXdExcMdz37v+OKTn8R/DZYouDLP936IFTLHFBmOrC52bJkiV248i4PZvidNN5IZAX\nAfmI5EVQ+Q3BmPDjCN6k7H41DJJjx461PgzAwxJTHnY7duywaO3atcume/755+3vyZMnm1NOOcVM\nmjQplUMcxIEHdPCGGUlabOEJ//Aw7+YP0qsY8kcRp175slyHuL366qtdg7llKbfqPGCIL0Rbg2Vl\nJSFx/cL/HRF+2eiQD5sIsqpswoQJnSzjx483r7/+uv0d9393xBFH9M3vq6OYDoSAh4CIiAeGDvMj\ngGUgMKvbFR08+BCsHOyF4j8gJ06caK0YWBTyyDe/+U3zvve9L5UVw6/P6ZuXRFAOA00/3uSxPiFt\nC7HdZiJSNAnx72GO3X3MSqpu/3cQk7333rsv92lYR/0WAnEIiIjEIaPztUfAPdyxikB+0pIJ8vMA\nL4o8YKGBWKFPmdJWIkJfYDkLJpbLhK/vZbv7NC/p7rviqlAI9AkB+Yj0CWhVUzwCTMm4gR8Swhs1\ng1kSyeIP0qtcCA2ESJINgbIJXDat8uVy95lISD4clbvdCIiItLt/W9u6KJ8MyAhvn+4NNK7x5GVg\nKGNwcIQorm6dHxwEnIWsjPtscFBUSwcBARGRQejllrURohG3SsZNs7g3Ub/pWEscgSnz7RvdepEh\nXy8d/x4BMGvLoO1ISJn3me4bIdAWBERE2tKTA9QONyUT12Rn7YB0OGGQ45PWj8TlT/NN/ZCepNNE\nacpuc1r6FSfmpotISNN7UPr3G4F39LtC1ddeBBh48ZFgWa5bKhjVWre0Fw9+9tVI8xbsLBpR5frn\neBN10yR/+qd/av76r/+6MKdUv564YywzSXWNKyPuPMuhN27cGHe5seeDwFqN1d0pLhLikNC3EEiO\ngIhIcqyUMgIBrAxPPvmkDUrmYhkcdthhNobI3LlzI3IYu7SXJb2LFy82q1atMkE0Rxugi03t3NRK\nVEbqipuSiUrPOVZh/OY3v4m7XOp54pIwMHVrUxYFxo0bZ/HOkrfOeYh3wb3QVBEJaWrPSe+qEdD
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
uAvnnHFNwOHMTNt4N/jZBhX+cHnJMrbATVLUQSIlAIywivKHx0GOKglgi\nZ5xxho1t4Jw4Gdj9wd3HgAdm1dJNPxe3oZuOZbSXYFXEicgrrFBieqxICTvzpikbcsVbe90Fnw8G\nTawQzockqc6kdzsJJ83jpyMv8XUWLlzYCUgWhHzvGQsEAuULRCZYWmwee+wxax1Br9mzZ9vYJH66\nfh2LhPQLadUjBApGoMi1wEWWxV4sQVOHfQIi0qkiICcjQrSH0xOvwxc/fodflp+GY78cYntECfld\nunA97jzf4RDx/rVwvrg4ItSfpb1RertzxFQI3hrdz9Z8VxniPQuIgb9Q5vDqafdKcTFW6HdiyBQZ\nV4N2VLnXjOKEZLn7lEcI1AOB2k7N4FcRDNSxsS7wq1i3bp1NE+wZE4zvfxQiofKWniUK6h9LKeaI\nMOa8fWLNcYK+AdFKpV/R7XWm6zwrOVx76vS9atUqc+CBB9ZJpa664DdCX6R1YiUfH2cF6FYJlgsc\ngKdOnWqj6bL65tJLL+1pAelWZvgauhCldfPmzTZ0/DXXXGOnbfpxf7k63D0d1k2/hYAQqDcCo+BD\n9VZR2pWFAL4BbNde123a07Z7w4YNJtiAzQ6GafPWIT1kJK0Ta68pGq5DyPfff3/ry9HPwRrfkc9/\n/vMmsL5Y4lMGxpAQ2gQRkggBIdBMBGprEWkmnM3SGufD5cuXN0vpLto+99xz1uehS5JaX8rixNpt\nSS99ixVkzpw5dk+YfpIQgMbqgvVlzZo11iG2CAdbvwNFQnw0dCwEmouALCLN7bvcmjMw4BgazK8X\naqbPrVjGAljaysZtaZ0/M1ZXWjamW+ibpO1w0zM+0bjiiivMypUra4EHbYEMvfnmm2bt2rWFWC9E\nQkq7/VSwEOg7ArKI9B3y+lSIOZupjGXLltVHqYyasAx19OjRiQfvjNX0JRuEgk9SvxHSMtjzQSAh\nrCjDGpGUzJTZMO4zdvjFT2rKlCkdPbPWKRKSFTnlEwL1REAWkXr2S9+0YrDDfM+g1eR59g984APm\nkksuMTNmzOgbdv2oiP5J6jdCWhyjISFFWR6KbqMjSVn1EwkpukdUnhCoHgFZRKrvg0o1wMdg4sSJ\n5u67765UjzyVYw3B55qYJgzG/idPuXXIm8ZvhGBuTMd8/etfry2pvPHGG81+++1nzj///NTwioSk\nhkwZhEAjEJBFpBHdVK6SDNxYRfj2/QzKrbW40g899FC7ZJTlo2GhTb7gE1OH6QpfpyTHvfxGGKSx\nnNRlOqZbm5hCYorm2GOPNfPmzeuWtHNNJKQDhQ6EQOsQEBFpXZdmaxAm87ffftvO5WcroZpcLBFl\nVQZOqkmEQZDB2pekUx9+niqOne5YSXzhPMuwcQhtylJsiAXh4em7cHv8tnEsEhJGRL+FQLsQEBFp\nV39mbg2DGQPyrbfeWlmI7rTKF2UFoJzwjsi9Bse0uhaZHiuPT54IVrZlyxa7RLfIesoui+XFixYt\n6hr3JdzWsnVS+UJACPQfARGR/mNe2xp56DNF04QlsGVbAcJTOiwNrtO0FeTJORejW1OXYGMV4Z4j\n5khY6IM6E8KwvvotBIRANgRERLLh1tpcTHXcddddZuPGjZ2Bro6NZQBjOSjOj/0QfDQY7H3xrRL+\n+X4do9MXv/hFs++++yb2teiXbknrceQ3vGpLJCQpgkonBJqPgIhI8/uw8BbkXWJZuEKhAuuiX3hK\np9+OsBARrCHoccABB4RQas7P008/3e6B46wiIiHN6TtpKgSKQEBEpAgUW1hGXQZ7H1qmY9hMra5x\nMtAv7Ahb5pQOfYT0yyrk90WRxxAP9sNhwzyRkCKRVVlCoBkIiIg0o58q0ZKBbv369TZIVtVLXhnk\nWfKJZA2GVQWIUVM6Rfk9QHKa4M+TBHeWYF9wwQXm4osvTpJcaYSAEGgRAu9oUVvUlIIR4
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "\n", + "review = \"This was a horrible, terrible movie.\"\n", + "\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAi4AAAECCAYAAADZzFwPAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQVdV5/xdNZjIxjRgrM52qFI01ERQVExWNeMMLQy0YiEiNEgOYaJAO\nitIaGYo2TFGQeElQAREjRa0oDEG8AKagosYYkEuSjjUEbP+orZFc/KMzmfe3Pys+57fOfvfZZ1/P\nWXu/zzNz3rPP3uvyrO/a717f/axnPatfTyBGRRFQBBQBRUARUAQUgQog8CcV0FFVVAQUAUVAEVAE\nFAFFwCKgxEVvBEVAEVAEFAFFQBGoDAJKXCrTVaqoIqAIKAKKgCKgCChx0XtAEVAEFAFFQBFQBCqD\ngBKXynSVKqoIKAKKgCKgCCgCSlz0HlAEFAFFQBFQBBSByiCgxKUyXaWKKgKKgCKgCCgCioASF70H\nFAFFQBFQBBQBRaAyCHy8MpqqooqAItAVBH784x+bPXv2mJ07d5q9e/eat99+2+zYsaOXLuPGjTOH\nHHKIGTp0qBkyZIg59dRTzac//ele6fSEIqAIKAJ5EOinkXPzwKd5FYF6IrBp0yazYcMGs2rVKjNg\nwAAzcuRIc8IJJ5jBgwebgw8+2Hzuc59ravh//dd/mf/8z/807777rtm1a5d58cUX7Qcyc8kll5gv\nf/nLSmKaENMfioAikBUBJS5ZkdN8ikDNEPjtb39rli9fbh566CHbshkzZpgLLrjA/MVf/EWmllLe\nxo0bzfr1682yZcvMjTfeaG644YbM5WVSQjMpAopA7RBQH5fadak2SBFIj8A999xjPv/5z5stW7aY\nJUuWmO3bt5tJkyblIhlME1166aVm6dKl1hqDVocffriZOXOmwUKjoggoAopAFgSUuGRBTfMoAjVB\nAP+Vk046yaxZs8Z+nnzySfPFL36x8NZhtVmwYEGDwFDHihUrCq9HC1QEFIH6I6BTRfXvY22hIhCJ\nAFaW+fPnm3nz5lnrSmSikk5CmKZOnWqOOeYYOz2lTrwlAa3FKgI1REAtLjXsVG2SIhCHAL4nU6ZM\nsRaWzZs3d5y0oBsWl61bt5pBgwbZKapf/OIXcSrrNUVAEVAEGgioxaUBhR4oAvVHANIyZswYc+ih\nh3pj6WDK6JZbbjGQqPBqpfr3iLZQEVAE0iKgcVzSIqbpFYGKIiCkZdiwYdbfxJdm4ATMEuvzzjtP\nyYsvnaJ6KAIeI6DExePOUdUUgaIQ8JW0SPtYfYQoeRFE9FsRU
ARaIaDEpRUyel4RqBECc+fOta1h\nZY+vAnn5zW9+YyZMmGD9X9Rh19eeUr0Uge4ioD4u3cVfa1cESkdAfEh+/vOfVyJ6LY7DH3zwgWFp\ntooioAgoAmEEdFVRGBH9rQjUCAECveH4SpyWqlgwFi1aZPdD0jgvNboRtSmKQIEIqMWlQDC1KEXA\nNwTGjx9vTjzxRDN79mzfVIvVhzgvY8eONVWxEsU2Ri8qAopAoQgocSkUTi1MEfAHgaoP/mwNgPjs\nl+NPb6smikDfQUCJS9/pa21pH0MAaws7M7PcuIrCNBd7G7HrdNaNHqvYbtVZEVAE4hFQ4hKPj15V\nBCqJgFhbGPSrLGp1qXLvqe6KQDkIKHEpB1ctVRHoKgKszBk6dKiZPn16V/XIWzlWF7YHUF+XvEhq\nfkWgPgjoqqL69KW2RBGwCBBsbtmyZYapoqoLU0RsA7Bx48aqN0X1VwQUgYIQUOJSEJBajCLgCwIM\n8pMnT66NXwg+OuvXr/cFXtVDEVAEuoyAEpcud4BWrwgUjQCD/FlnnVV0sV0r74ILLrAWpK4poBUr\nAoqAVwgocfGqO1QZRSA/Ahs2bDCnn356/oI8KYHponPPPdfgcKyiCCgCioASF70HFIEaIYAzK4Jf\nSJ2EHa23bdtWpyZpWxQBRSAjAkpcMgKn2RQBHxFg+fPw4cN9VC2XTieccILZt29frjI0syKgCNQD\nASUu9ehHbYUiYBHAKjFo0KDaoTF48GCzd+/e2rVLG6QIKALpEVDikh4zzaEIeI3AwIEDvdZPlVME\nFAFFIA8CSlzyoKd5FQFFoCMIfP7znzerV6/uSF1aiSKgCPiNgBIXv/tHtVMEFIEAgU9/+tOKgyKg\nCCgCFgElLnojKAKKgCKgCCgCikBlEFDiUpmuUkUVgb6LwC9+8YvaRALuu72oLVcEikFAiUsxOGop\nikDXEWCPogMHDnRdjzIU+M1vflPLZd5lYKVlKgJ1R+DjdW+gts8PBIh6umfPHrNz5067rPXtt982\nO3bsaFKOCKnEIDnkkEPszsYcszOwSmsEICvsnIwcfPDB5vjjj6/lvj4QFxVFQBFQBEBAiYveB6Uh\nsGnTJrNq1SpDCPoBAwaYkSNHGgKJTZgwwQ6y4eiuRH0lgNq7775rdu3aZWbNmmVefPFFu2Hg6NGj\nbX510jRGcKLjICsuuWOAf+edd0rr024VvHv3bjNixIhuVa/1KgKKgEcI9OsJxCN9VJWKI4AFYPny\n5Wb+/PmWrMyYMcOwSR7WlCxCeex2vHLlShvyfeLEieaGG27IXF4WHXzI45KVww8/PLb9/fr1MxCY\nOpG88ePHmyuuuMJceumlPnSH6qAIKAJdREB9XLoIfp2qhmDcc889hngbW7ZsMWvWrDHbt283kyZN\nih1k22HA4Mtg9eSTTzY22WPgnjlzpqHOOgsOqUyxyeaCWFb4tCOBbEj4+uuv1woaIgKfdtpptWqT\nNkYRUASyIaDEJRtumstBgIH1rLPOahAWSIY7feEkzXXIgL1gwQI7nfTBBx9YkrRixYpcZfqWWYgK\n37Q3KVlx2wFxeeWVV9xTlT4GC6Ya2xG2SjdSlVcEFIHECOhUUWKoNGEUArfffru5//77zbx586x1\nJSpNWecY0KZOnWq+8IUvmEWLFlV2aoR2iBRB+LDU4EeExasOwj32P//zP+buu++uQ3O0DYqAIpAT\nASUuOQHsq9mZphkzZoxt/qOPPtq1t2H0wI8Gh9TFixebsMOvj/2DzrISCP2KICvhdp500klm4cKF\n5vzzzw9fqtxvpgbXrVtnPvWpT1WifysHsCqsCFQMAZ0qqliH+aCukJZhw4aZtWvXdo20gAU+MEuX\nLjVMj5x33nkGa4OPgnMtlhU+HMsUUBmkhfZ//etftyu6fMQijU5PP/20JSvca0wVudapNOVoWkVA\nEagPAmpxqU9fdqQlLmnB3
8QnYZCbNm2a2bx5sxdv5mlWAhWNI/3EUmmWl1fZNwTL0Zw5c5pWE0Fe\nyiJ8RfeDlqcIKALFI6DEpXhMa1uiz6RFQO82ecHiI8HS2i1bFp3L+IY0sST997//vbVIlVFH2WXS\nl3Pnzo301VHyUjb6Wr4i4C8CSlz87RvvNJsyZYphNQ+rhnwWnDkJXMc0VidimbjTFywH70SdcfhD\nntCBD/qwNL1qFgpIMivVwtYWt93g7gPerk56rAgoAuUjoMSlfIxrUQMxWh566CGzdevWrg/MSQAl\nYBlbB+D/Uoa4ZMUnUhAezFkuzoqrqvSb9BXkky0h2pFkSBpTYd0mi6K3fisCikD5CChxKR/jytfA\n4MCbLSthqrBqB8B5Y0fnRx55pJCVNZRX9kqgvDcKpCWKREHiTjzxRDN79uy8VXQkP+0YO3asdcRN\n4p+j5KUj3aKVKALeIKDExZuu8FcRBj72iZk+fbq/SkZoxl5JV111lSUcWd7IXbKCo6uvpE30jCIt\nwCKrmO67774mJ9cIyLp+KquuSl663nWqgCLQMQSUuHQM6mpWJA6SVZtqELTTki53JZDPZEXah74Q\nl3akSkicLyuuRH/3m3YQG4ilz1lWrEFeIKhJrDRuvXqsCCgC1UJAiUu1+qvj2kYtR+24EjkqZDAj\nvgvTPK2sLqTxYSVQ2mZCWpAkAzVpf/SjH5mbbrrJm+XibnuFtBx99NG5/JLSYOLWr8eKgCJQHQQ+\nXh1VVdNOI4C1BanyjrxYIthRmh2r3akul6zgC9POYtFp7NvVl9a6QDyXv/3bvzWf/OQnLZHzyfIi\npIU240icRyBxkBc+SQhdnro0ryKgCHQHAa8i5z7xxBOmX79+9sNgUzVx9acdrki7+H711VfdS22P\nTznllAYuS5YsaZu+qAQrV660y1GLKq9b5bBvDzFNcPqUD4MaPiF8WlliuqVvu3ppA/onHZhJL/4v\nkFB8XSBrQkzb1VfmdQiYTA9BporoC8GFslUUAUWgfgh4RVzqB291W8Qb6+rVq83IkSOr2whHc/a5\nYTqoqmRFmiIkJOkATz8SCM8VyMvrr79uowzPnDnT+si41zt1DHFiGo8VRFl8WuL0FGKn5CUOJb2m\nCFQTASUu1ey30rV+4YUXzOTJkwt5Ay5d2TYVQFa+/e1vmw0bNrRJ6e9lplOEtKTRslXIfzChvL17\n99pAbxx3SiBTBDNkewaC47lTeEXqALmDwCh5KRJVLUsR6D4CSlwK7IPLLrvM9PT0ND4FFt3xotiN\nd/To0YXWu2fPHsNU13XXXWf9TtzpMzlmWoxpwjvvvNM899xzDafZvIqcfvrpld10kIGej0z3JMWi\nHdFhUCfAG7trY/VgBVaZBAbyRSBD2kFwQBym07YpadslnZIXQUK/FYEaIRAMtN7I448/3hNAaz+X\nX3651evBBx9snOPaLbfc0rN///6WOm/bts2mkXL4PvLII3so58CBA5H53LTk53PhhRfaesmHJEnj\n6k96V8L5n3322UYdXKM+zkVJsDy0Ub/o46aLajP4oU9WQafgbT1r9qZ8Lp4uDkmO6bs77rijZd81\nVdTmRzBQ9wSDZZtUfl2mD7L0Q9p8wTRaz913390DRuPGjevZuHFjYUCA+Y033tgouxt9QPuC6bHC\n2qQFKQKKQPcQaB5du6eHrdkd+Bl4+UQNbgxmUeSFAS4qvZwj3+7du3u1Uq7zHS5DiEKSNK7+pHfF\nzQ/5cn+7x1KfmzeOuJDezR8+pq60wsASRFpNmy0yfTv9wvq2+g0GUX0eWWmLk8HUV89TTz3V4qp/\np+mHLKSFlmQdpBngH374Ydv/kBgIByQmrR7Uf9tttzWVk7aMMnokKy5l6KJlKgKKQDYEvF0O/dhj\njwVjWLQEA5iNR7Fq1apGAqYgbr755sbvqAPyXXzxxWbXrl2G4GJR0q4M8iRJE1W2nJs3b54
c9vq+\n5pprzAknnGCY2mgnTKWQPk6oa9CgQWbq1KlxyZquMaVzzDHHNJ3L8iOJfknLffPNN63PDWVmlaFD\nhxrugSoIUzas/EnqhOu2qd0UkZs2fEx9kyZNsh98Q8B78eLF1lGbbQO4L7ifBg4cGM5qtmzZYt5/\n/327weW5555r+CxcuLCQLRd6VZbxhPj2lD1FlVE9zaYIKAIJEPDaxyWYPjGBhcT6jATTPIbfIi6x\nYbUIm7KJEHkzmJ6w+QI+ZwKrg1yyA9cDDzzQ+B11QHry8Wk14CdJE1W2nAssEY06AkuNnLbf7K+T\nRNx2uVgxOAfWqkYRYANGSYX8hPjPK3fddVevItCTtvOhj8KfYLrMXqNt9KMrzz//vB1I3XNpjocM\nGWIH1zR5upFWiEcW0hK1iihrG4htg+MsfjD8L3Cfzpo1K5K0UMe1115rl52TlqXN7I10/vnnZ62+\ntHxCXvC5UVEEFIEKIhA8ZLyR8FRLMIA26YavRABx4yM+K+F8pAuLO+3ElJErbpmki5IkacJ6uOW4\n+YNB2b1kj8NTKtI2LkZNFTHl5ZYZxor8tFPSoFtSwdeBTx6hfqmb71bTdO3qCOPCVF5WYZoA/w1f\npQg/DJ0KSd67TMWBuYoioAhUCwFvLS68bR9xxBHBmPf/JfxbrAg7duxoJAoGyMhplq997WuNNFgU\nmA6JEuJKtJMkaeLKuOSSS3pdPvPMM5vOvfvuu02/wz+Y7hLBihHGhqmwK6+8UpKYX/3qV43jdgeY\n/LFO5JEwvsTpGDx4cOoisXi5lhemjOooWVcOuViIpcY9p8etEcCiBO5qeWmNkV5RBHxEwFsfl2OP\nPTYxXu+8804jbZgAyIX+/fvLof0W0tN0MvgRThe+zu8kaaLyybkwyeB82OemlX5SRmDRkEPDFArL\niePkgw8+iLvc61pYn14JUp6I8olIWgT3Ql0JCxgweCJ5th0ocorIKtNH/oA5vjwsDc8yNddHYNJm\nKgJeIeCtxcUrlFSZ3Ajs3LkzUxkQuJdffjlT3ipkkuBoDJx5JFixk3gLgDz11DGvWF6EQNaxjdom\nRaBOCNSCuLCjrEirQc61UJC2aIuC1J/kG4fjsIQtLFFWGTePa/XBETeYoYz9fOc733Gzxx6zaiQ8\n1RObIeJieFUUDsJp91liuuzv//7vm1YC5Z2mi1C1a6eY2oGw5CUtOkWUvwvF2qXkJT+WWoIiUDYC\n3k4VpWk4yzRF8F9hE8PwwBnEppAkBj+YLP4WjQJyHuBDctFFFzWV4hIu9GtHXI4//vhGfvJCfIoi\nY0zrhIleo7IUB6wyYSktQr+wdJsPROszn/mMOfnkk3uVxpQW00L//u//Hjk91GoqsFdBnp8oimx0\nYoqIOl577TXbh9y7iCx75hjiNXz4cA4N/4vcP/z/CRmwFyrwh3bQVj55yWQFmqsqKgKVRaAWxIXY\nLAz2DI7It771LfO9732vQV7Yp8ZdPu06rXaj58KxVdhV2o3HkkQ/iBdOqwzytPsrX/mKWbRoUYOQ\nUSYb6Akmwaoiw5YESaUI4sJeND/84Q8bOkjdbl/IuSTfLJHOQzixImFN6qbgCBqsZiks1D1TRGXE\nJGEKi3uIjTbfe+89S0xYIn/FFVc0SLXUy0CPHgjL27du3WrvRfKNGjXKbh3Bxo5VECEvtL9qxKsK\n+KqOikAhCPi0CMpdThy1LDkYhJuW2PJbJLxsNgCnKa38DghOr/Dxco3vVsuGk6Rx9Se9K27+dsdu\nuygjajk058P1tSqX/GmkyGXDbGMA5q10S3o+sN706rc0bSItkVzzLvNOW6ebnsixLMEtSspY+kxk\nYaImBwO4xStPHbSXKLxBIDpbHthzrgoSWDAL7asqtFl1VASqgkAtfFyCwc8GinMDsnEuLFhlCHBW\n1JRKuPykv4NYJC2TEpit3TSRZMaCQvo4wSqzdu3auCS
9rh1++OH2TbvXhQwnmBJj6TZtBv+0wrQS\nffb9738/d7+xbF6mNNLqkTc9VgmkqLd4yqOfipKnn37anHTSSWbu3Llmzpw51oJCADmxqmSpB+sF\nUXgJRscu0Pv27bM6s9Gi70uQWWGE/uI8naX9mkcRUATKQaAWU0UCDQ6omLOZh3fD6jNg8hCeMGFC\n7sFP6srzze7Hf/mXf2mjjMoyX2Kx3HDDDb18X9rVQ5wT/D7Wr1/ftBUBhOWb3/xmy8i/ceXywMZX\noShzOUTxpptush+ma8SfhwEtLDhaM52DnwQkoyiSyUDJtMfy5cvDVZb+GxxlICyqMqZm8pAK0QPd\nmEp9++23LWEpa0oHXflwjxONd/78+YYI0T5G1hVspM+K+j+QcvVbEVAE8iHQD9NQviI0dx0RwD8G\n8sAgUwfZtGmTgdhGkaUy24cTbtY9h1rpVZRj74oVK+x2GITxv/rqqzsax4T+uOqqq6wPDL5ZPsdQ\nKdovqVW/6nlFQBFIhkBtpoqSNVdTJUUAp0rM+3UQBsmVK1faaYtOtkcIRpGDclFTRBBTplbpY8hp\nkTomwRhLC07KrCIbM2aM11MyYIO1iP5UUQQUge4joBaX7veBtxrgQ4GFoii/jG41lDdm2rB06VIz\nYMCAhhpFTLU0CnMOynxDFzLkVJfqEN0gCgi+T50mLFHKQqLY6b0K91pe/KPar+cUAUUgHQJKXNLh\n1adSEzSOZdHsM1RlYUpk3bp1dpdjtx3hN+gipnSwiAhRcusq4jjvoOkjaRFccA5m+bySF0FEvxUB\nRaAVAkpcWiGj520gLqwuOILisFtVYbXMwoUL2zqC4oTpRjCm7WnaLSuH0uRJimkRZY8fP94GjvPF\n0hJue9XISxFEN4yB/lYEFIH2CChxaY9Rn06BGR+pqtUFawvOn9u3b0/dj5AFSJsIK5xaTZuVsXJI\n6uU7r7UF69mLL77ozfSQ2zb3uCp6ojN9Dkn1YbrNxVCPFYG6I6DEpe49nLN9DN5YHnCkbDVo56yi\ntOxMjfBWjANqEf4slAcOrojTZplv33lJi6zgoZwyrEEuHkUcYxk65JBDrE9SEeWVWYaSlzLR1bIV\ngWgElLhE46JnHQQIGMbg3+mlxI4KmQ6nTJli8+GUW5Zg0XG3ISiCILm65p0iEvLme8wUt81V07ls\na5uLjR4rAoqAMUpc9C5IhAC7Mo8dO7YycV3KtjKI9SVMVLBquJLXEpPX2tIJ8ua2t6hj6T8sXFWY\nislLMIvCTctRBPoCAkpc+kIvF9BG3iohL1V4cy9bVwYpiEuSqTN0yerwm5e0kB+yWZXBP3ybMmVE\nBGeiXldBlLxUoZdUxzogoMSlDr3YoTZUYdUHhII4JcHGfqUMeHkHJ/IncfjNWw+3BAM/W2BUNfox\nGFRtVVsR/dahf2etRhGoLAJKXCrbdd1RXMLE+xhvA9JywQUXmC996UulrIIqw5dBppzc3hSH3/A0\nlJum3XHVrS3SPla19e/fvxQSKnUU/Q15SWqRK7puLU8R6AsIKHHpC71ccBt9jHQqlpbjjz/eXHnl\nlYWsInJhgwjk9Vdxy4s7Djv8ZqmXPqrDXlMy7edaqeKw8+Ua9yMEJsl0oi86qx6KQFUQUOJSlZ7y\nTE+ZNvLB54XBjZ2/R44c2bC0FEk0KCuP9SNN10VNNaT1k2HQJOZM1QMHCm74Vl1//fWmrJ2rpZ6i\nv5W8FI2olqcI/BGBj/1jIAqGIpAWgeOOO84cffTRZurUqebDDz80Z599dtoiCkmPdYJdhm+99VZz\n8803N8rEN2Lv3r3m//7v/zKvSmHgeeuttzpGWlAeR1osLK4cdthh1teDNvFBL9JBcvj87ne/M6QR\neeaZZ8x///d/2xD6cq7q3xs3bjR/8zd/U6lmfOITnzB8uIfoNxVFQBEoBgG1uBSDY58tBWvAtdde\na9s/f/78jg3yDNg
4nb799ttmyZIlLeslHQN9WpN91nx5boSslh0hMlL33XffbX19Jk2aJKcq/U1f\nYPGq2nSRC3rWvnXL0GNFQBH4IwJ/okAoAnkQgBDgqMuyWz74VjDQlCUM0oSF5w2WpbJbt25tSVrQ\ngUixfBg4koron5bsJC0/Kh11Zn0rJ84JA7t8du3aZT71qU/ZkPRRdVXtHP3Hrt5p+tC3NtI3Vdbf\nNzxVn76NgBKXvt3/hbUe6wfTFwgDMIHPCCJWlGDZgRThu8GO1bx9E98jSXAyGdgZOCA+cUI9SKdD\n4xfljwIB2rFjh10K3UniFYdpEdfwX9qzZ08RRXWtDCUvXYNeK64ZAkpcatah3WwOBIHNGAm4NnTo\nUHPjjTcadmaGcEBi2pGGsO4QDawrlIGDJstit2zZYubMmZOJWDBwMLCLRSWqPrHQhK+V+Zt2olsR\nAgEaN25cEUV5VQYrpHbu3OmVTlmUEfKS9n8hS12aRxGoKwLq41LXnvWkXVgwnnjiCWsFWL16tZ3e\nOeaYY+w3RCQsEJP333/f7mRMEDk+Z5xxhjn//PMbSfMO9BAXBg7XIpG3zIZyKQ+ERBVl4cFZmQG+\nqrt5t4KP/sGH6sknn2yVpFLn+b+gz5NYDCvVMFVWEegAAh/vQB1aRR9GAHLghmzngY1FZtu2bZGo\n4OjLdFCcBULeWuPSRBb+0UkGDIgLgyEreJjiylpWXD1JrmEhKbJuptGwTqj4jQD/F0pe/O4j1c5f\nBNTi4m/fqGYxCBRhqaCMV155xVx00UVdefMtw8rDTt5IVcP8t+pyiCaEtqenp1WSSp6HvGB1Kcri\nVkkQVGlFICUC6uOSEjBN7gcCYjXJ6isgxIf9fDjOWk5WNKgz6yqirHVWOV9dp1RkulLuxyr3kequ\nCHQKASUunUJa6ykcAR76spIpTeG85SLylks5DBydHDyKWkWUpt2a1k8E5D7s5P3nJxKqlSKQDAEl\nLslw0lSeIoCPihCRJCoyPcNAIYOF5JE33zRlSd6032VMEaXVoWrpGdTDfVa1NsTpK21T8hKHkl5T\nBP6IgBIXvRMqjQBTCHySPPCFMLSadhBCQ7qyBD11iig9uliohg8fnj5jhXIIeekEea4QLKqqItAL\nAV1V1AsSPVEGAgzYr732mtm/f7+NxUIdsuyZY6LgskxajlkZc/rppzctWbYXI/7wwBdLSsRl67+S\ndOUQpEZWLWXZlTmqfvdc0auI3LI5PvLII8369evDpyv/m5VofUG4l/G3gryIFTBPu/m/+9nPfmZ2\n795t90z64IMPmv7vqE8IIf+D/N8NHjy40JVuefTXvIpAFAK6qigKFT1XCAI8fInhQvyW9957zz4g\nR4wYYYYMGWJXiFCJLAUmLYMTH3nIvvHGGzbfxIkTzahRo5piuUQpKBYV9xoPbgaCLIMAOkFk5E3Y\nLTfLcZR+WcqJy0Mds2bNstswxKWr2rW6rpZq1Q/cs9y7We9b+b8jijIBCfm/g9QeccQRtsrw/x0n\nCVGwb98+w4aW/L/yPzd69Gi763orK2Ur/fW8IlAmAkpcykS3D5bNA5cH39y5c23reWhedtllmR7A\nFAB5ePXVV82iRYvsw5RB+eqrr45cvhx+2PPgR/IQjzzEx1b+0Z8idHHLa3UMBiwbhgBmHfhald3N\n82whwcDLbuR5+rObbUhbN32Z1FJI2U8//bS599577f9MUrLfSifunRdeeMEQ0JD/wW9+85tm8uTJ\nfQb7VrjoeU8QCOIiqCgChSDw1FNP9QSDSk9AVno4Llpef/11WzZ1BDsg9wSDc68qgge9Pc93MC3T\n63qWE9RD3Xkkb/4kdVMHn1NOOaXnX//1X5NkqUwa+lz6VNrZCUx9AKhdO4MXhZ5gmsd+yvi/A/dg\n+w4C6NjvqP87H3BSHfoOAgR0UlEEciHAgy0IzW8fnO0esrkq+igzdUCOeFjz0A7Lw
w8/HElqwunS\n/qbeLA/tsjChXPcj7bntttt6+NRFuL/o6yhx218UUY2qp9vnou4h2sv/AaSuDMISbjP1BVYXWx//\nYyqKQLcQUOLSLeRrUi8PMN7EsIB0WsTCwyAthIIHPMcMdmUI5aYZIEmbJn2cztTtDtSt0sobeKvr\nVTtP//LG307AOQk+7crx9bpLXqLu/U7pjR4QSUiM/N91qm6tRxEAAfVx8WTKrmpqMP+OHwv+LI8/\n/nhmH5a87WYu/qtf/ar5wx/+YIIBzpx99tm2yDJ9Siib9idxnMzjkEs9wWDcgCjNKieWXK9Zs6bh\n/NwopIIH7A6+ZMmS1G0BexHwCCwT8rOy37TpBz/4gXV472b/cv/PmDHD4EDfzf//ynakKp4LASUu\nueDrm5l5aI0ZM8Y2fu3atZGOsp1EhgH+1ltvNc8995xdTSOEIg9paKc/GAQWkNjBNG39UqbUnWew\nvf322w0bLlZ9l2gcTiHI27dvF1gyfbskEOdluUcyFdbFTDNnzjQvvfSSeeSRR8yxxx7bRU3+WDWr\nvdi1e/PmzZXFtOsgqgKpEVDikhqyvp1BSMuwYcO8GRTRieWaPNRXrVrV9BBNSx7S9i7lR1lCGCiR\ndm/55BcpckClfogPFpt2Okj9Pn6zl9QVV1xhLr300sLUCxPEqP4rrLICC+L+fvPNNw0vC7TBl36F\nXE6bNq3p/67AZmtRikAvBJS49IJET7RCwEfSEtZVyAvWEMgMOjOIl/mGHRXvpRVhcokKuks8jXA7\nivgNFkhVrS5gNXbsWGvZKjOOCP0X+GpYrIokj7bAgv64pKVMLLKqq+QlK3KaLwsCSlyyoNZH8/j+\n8JRuET0xXyMMTLydlvnAhxxBkiBILmlxB0V0KZOoUL4r6ER9VTXj49syZ86cQq0tLj5Rx/QhpFfE\nB2sM0zEPPfSQ2bp1a6n3sLQ56zcxX4i35LueWdun+fxBQImLP33htSY8lG655ZbS336LAuG8884z\nwRJtM3v2bFukSyaKqiNcDoMeDpN/9md/ZgYMGGAvd3vgY9BDJyFxYZ19/e2L3i7x7IY1RqxOVSGf\nBApkW4Enn3zS11tL9aoBAkpcatCJZTdB3ty7uYohbRujdC6DvITf0AmVDmnpNmFx8YLEMeUyffp0\n97S3x5AF8MPyUeYUX1oAwn1ddh9TH3XMmzfPTJo0Ka26XUmPzmeddZZdcVQVnbsClFaaCwElLrng\n6xuZcZAM4jY0rBdVaXXYdA2ZQfI6NUKARNy3cJcYdWJ6SnRo940ukBfM+Gy/4LNUaeBzrTF5VoC1\n6g9WhrHXUNWsF2Il4jvv/1orbPR830ZAiUvf7v+2rZeHkDi7ts3gWQJIFxvMibXBJRdJVXUHKPJE\n+alEkSLy4Vfjw8P7vvvuM//0T/9k/u3f/s0rK4bbB5AWltn7tGLN1S/umP53Y+5E3SNx+cPXKK/K\nq8Kq7hge7g/97RkCGodPEYhDgAiZnQgnHqdDnmtE+QyIQ1OETzcCaVTZAUlrisCaJDpoqzKJ5kp5\n3RR0ow1EOQaLbuvTCgui47J1RBK8W5Xhy3kwlw/3QFoBi25Eo06rZ6v0tDkY6nJHjQ52rLblUNaD\nDz7YqjpvzwckPJP+QVC/Rj7a3mkJYkD1iO7XXnttp6tvW1/nEWmrkv8JpEOj/pnirvnfsmYN6xI6\nnv1c3EGAgdEdvHnIyiAjg3wzEvG/yBMn1NcuTVz+rNei6mVA9I28oCfh4+tCWsL9Fb6/wtfDv2XQ\nB5cqC/canzwiz1O+qyiif9RYwTn5QFRc6TZxQRdXh2effdZVr+vHfxIAp1JTBPr162fk88QTT6Ru\nJcHcCOtddZk1a5ZdTuq2Y+fOnXbahKkjBNO+fNIsmxaTvlt2+JjyKJu6mA7phKAXn/CUBTFdcPbE\n52XTpk2dUCW2Dpkeeuedd2xgtTTYxxbs0UWmC
uXekvuAe4EPfRSWu+66ywQDvtdLn8M6R/2+4YYb\nzMKFCzPf82zzQMA9hP9hlc4igD9cQLxspawo9Uq6Tp0qqEAci4671ummBjdaS0bfTpe6vPVJO//q\nr/6q53vf+561fIi1pQgrSNoyqBtsy5Qkdcgmfa4lqkydosoGO6w/ed/Ko8quyjnXGiP3pW8WsTxY\nYu3MMtXMVMWRRx5pn19811HyPJ87hQfTc6KnT1N1anHxikb6o8wLL7xgAvN95d/6QJQ3W94eeKvn\njVeW2Mrbb1bUKZcy0ojUjeNuGYJOvOHziRNC6BMbhCXuWF/K0idKB6wsrJhhiTZOw1WN7BvVtrTn\n6CfuIT4cf//73zfuSrW05fmWnu0aVq5cmVqtYJrC7N+/3+a7/vrrm/JjiRFL8re//W0b9fjOO+9s\nnOMaaYKptqZ87o9XX33VkFfK4XvgwIFt82G5njhxYlM+freyaJ9yyimNtOiESH5XnwkTJth0Ug7f\nrm6SNtzOPXv2yKXGt5vmoosuapznIKrdcfqPGjWqkf/+++9vHHf9oJPMzbVGBA23zlbBzdmkQmCS\najA8mHb4ulsGx2GBqbsskXpa1eXmJY9bdlweN12YhcZdoz6czdw2Us/ll19u5xNdfeTYLQ8syH/h\nhRc2YRTWgfKk3eHv8Fyq1BP+xucgy5tSuBxffvM2i6NxWHjjzWIByZpP6o/yP5FrWb7zlIfVJRg0\nreUjCxZJ9UVHqYv7q8y6kurkW7pgh/MePnUR+pxnEN9pxH3u8cxzxX2+8yx1n4fu844ywuMH5dxx\nxx0tn4/kZ9zZvXu3W6U9Dj+33bo45rkbFrcd8pxO8nx2/UsoWwS93HqlTLnOt1iqSOded3Fzy5Bj\n2hclLr6++Lr8f0SiNC7gHDeO23ABSb7DNwnpXeBdMMOdGb6h6VQ3r9ThfofzpNUPSKJuRoEq7lqW\nGydcntsW99j9p0nyjyH6tvqm7LoNLAzOUW2C1KR9sKadImqFM+WkrTtcFm2SaYbwtaS/KYMpG/qd\n77zlufVSthAWpg6Kws6toy7HOCjXDR/ahKN/UgkPzuF87Z6jrZ6LlJM0L+MIL8Ei4bHHrcM9pnxX\nws9vriV5Pofra1VmeMVPGDt+IxAOV89Wx2H9yesSvXB9XO+GlE5c4kiLgBe+ScI3l7Bm9yYIA+jO\niUq5Ud+U4UoW/Vw9wh3d6lrWG8ctL6o97jlhw0n+MVwMwsetrBPhdGX+dttQVD1x8+1pBos0aZPo\nDt5RhKrsvFHlowdv/JA8LFQcZ2kvbWL5NZYV7lG+s5QTpWOdz4FVN6WM/zvuoTS+VO7zn+dzWNzr\n4EUaGSP4dtvAdXlZDY8RPFvlGnWELSoM2CJumdQnpCb84hvW131+h8cKdJMPRMWVOOLiEgnGTldc\nbFxdXD04L4QmjFd4LKZsV5dwfW7dnTwu9b/EbTAd5HZc3DUAcIHmhnLTA57cqAKW22HhutyO5poM\n8G6Z4Txx11zd3DaF9XavuXnS3DhuvrCOYTLk/qOhC+nlQ3uSCm9HDPLdFPdBIQ+JvPrw8Gz1AMXq\nkcTKwMCelWTE6U+ZSep3yyB9XmuNW174mPuAQQcCw33EmzMERHAMf5OW+wbSw4e0TDeWqWNY5yr/\nhtiBcTeljP877gHuhaTCS6k8t8IvqJQRftaHxwKeF5Kfb3kuhp/pLmkR3dz2u4O0+xx2n+vkCz+H\npSy+4/K5Ooafz2Fd3TJbWVVI42IneobTR+FFW0WfsC7gJNf4Dud3devUcanOuT/60Y+Cdv5RAvJh\npk6dKj+ts2QAbON32PEnWAHSuMbyzfnz5zd+s3HeEUcc0fjNgRsWO1zXTTfd1FjWRdq33nqLL5NH\nP1tAwj84UMmyPrIsW7bMDB482OamHQ888IAJbhz7O7gpTPCPYI/Df8LtwvEquFEbydjcrAgJbnQb\nbbaIsoooI
8oBLUu5YLxly5bIrLIMt91y5YBgtHV8jaygzclgoLfl4lybRMQJV/ROkidtmvPPP99u\n87B9+3br6Mj/IPvQtJL+/fvbZavoBk5Lly61OzuXqWMrXap4PiB45tBDD/VG9aL+73jGpXk2vfba\naw0MjjrqqMZx1EHwEthrLMC52X0u/vKXv7RZ2T5BhGfB6aefLj8b31/72tcaxzyLBYPTTjutcf6a\na64xOMCKAzDP4WDAbnwaCUs6YOwICFGj9Jdffrlx/MMf/rBxfOaZZ9rjXbt2Nc61wuvKK69spPnV\nr37VOOYgPNYyPnRbPl6mAu4NSNj1sLgeywzs/ONy0yHcVAzUkBZEBn46zCVA9mLw5/nnn5dDu69O\n48dHBz/5yU/Cp0we/XoVFnMi6Y0jbQ3fOFJ08OYrh43vk08+uXFc1wNirkQ9ZNK2N/wPGM7Pih8G\n3VYrheKuhcvK8psBnrqpp9UGfhCrwNLSUscs9SbJI7q1wiZJGZomHgHfXhiK+r9j64LVq1fHN965\nykalIocccogcRn5/4QtfiDzvEp5f//rXNg2rCkVkUJff8g35doUxCZk2bZpZvHhx49LNN99sjyEx\nSGCpMZCe8Coee7GEP9QnY+JPf/pTWwMkC7KFQFDk5TiwQNlz/GGcZLVSnLQjmW55ceWUea1Ui4sA\nSwMuvvjipuVdgCdWBmmg3CTyG9YcTuNaYiTdu+++K4f2m2VtSSSvfknqII3b0XLjuEvdOBbSQvpW\nN07SdlFGHsEqEcY9T3l587J0Vt588pbVLr8Qh3C6JIHmwnmy/kYHCSDnliHnlDy4qOhxWQgU9X+H\nNTGNyOCbJk/ZaSEBEEtepqPkscces2MclphOiGv5FCvL+vXrG1WzR1udpVTikhc4iEz4JuYtQKV8\nBNpZJ5Jo4MZbCBO1dr95EIhwD0B8eSh0gsDwhhiOaFrWFJG0MfwdjvcicVbkfDi9/lYEBIGq/t+J\n/u40iJxr9S3WlPB1mR7i/NFHH20vyzc/3OkVe/GjP+5LJqdkBoBjyMt3vvOdxpQQrg583Jc8LDGd\neEZhgZZ6eT5SZ+CThppWXIuSa0XCUuNOa0Ud00bfpVTi4t6AgYNPW8DCgyWMPyycC1tY3JuL9Pv2\n7Qtni/ydV7/IQiNO1vHGiWhmqaf45+ShEHVPFF0xb4hMyYi/S9lTRK30F7+XFStWWP+XtG+urcrV\n84pAUgQ6+X8nOh122GFy2NL6LAmYvgmPB/x2p3UGDRpkk7tT7bSLYGxhCVbCNU5BDCArlOe+aAkx\nwWWBD64AQiLI7LoGNAor4cANzEeQP3GXcKeJqPb4449v1A5hC89sNC4mPOiU5T9OnVKJi+vQlNZS\nQuRAeevmpqAzEG4496bkHMTFvXGifEQAW24+iWCYRz/qTSpF3zhJ682ajnll+efMWkbV82HZwJek\nk1NEYczEn2XSpElWFyFS4XT6WxGoEwKf/exnG81xLSeNk6GDYMVSg7xAMvjtilgfsNq648S3vvWt\nJvJCJF0Zc8gvDqu8ULv5wi/PvJQzLom4L6pyrt132NLTLj3X3eki19UgPE0E+ZKXdPT8yle+0vR8\nZ6x1x0eJ3is6hIkh5XVbSiUu55xzTqN9ODEJYeAkYFx33XUNMkFoZBEY4cyZM+WnXdkwd+7cxm86\nKcyW5SYjEW/mzz33XCM9UwzujSU3clb9GgUnPMh74ySsJjZZmn+MoUOHNvnlxBZc44s4yL7yyiul\nrCJqB1vYn6WV30u7coq4DmHC6nTPPfdYixcPxqgP/7OkYfPG8FRbEXr0hTLS/J9WBQ+mOdNYC90F\nB//xH//RtplYGiAWvJjyLZYHMuInKQMtL7isSBXBx3H48OGNMcgd/CnH3djRtW5AbqQ+6oQQiXCe\nMtMK4yNlhUlDXDnudJGbTsY395zbFvAZMmRIo91sNyDjIwSH7VFccY0OGBD
CMxxu2k4dl7qqCABY\nQilOsHSOeGGHG+gCSx4BkhsBYAGL+TlhxLBld6UQN6h747k3k1uXeyNn1c8tL+kx7aMdiNw4UXmj\nbpyodGnPCfbBGv1eN2ZUWUU8QFk1xttIkZLnnwYr0qCPzMZJdMLicsYZZ9hBOM2DN0nZcWl40LOK\nJ+zPwm8hNGXrQz3sV8U01YsvvmiC+CL2rY03M/d/1W0H+HLfYBFlFQmm+SCui32wq0Oxi1T0MQOe\nG/YhOlX7s7793/EimmYwd1eb8qwkf6v/e8aE999/v4msCEIMsv/8z/8sP+03Uzt79+5tGiuaEgQ/\nGHMISeHW+Y1vfMP6kLikKJyP3xAPN19UGjmHfu3Kk7StviFUssKJNJQpRM3Nw1jH/2ar8Ze0jD1r\n1651s9ljd7HIyJEje13vyomyA8YEBCQ25H/Q6KbAdIHndlOwGwmig55x17geDtpD2e4n6NRGxEPS\nI2n1I0/QwY1yXf3aXSOtq0/4mHLRxxW3LgIBhcUtM/B4b7pMe8N1hIMLNWX46AeBsLodgC5Kr7zn\n0kTwJCAcH6STEV+pK3hQxzYVvQg+V4ZI8EHuG0L/87udPq30oC0SwI4gdkTSzVpWqzrqdJ4+Bae6\nCf2edgdw99lFgDdX3GdeQFzsM51nn/usCz+X3fwcU2Y4T0BY7FgUDPDh5I3flOvqJnVynvEpLO7z\nO6wT6YMX6Sa95fkcHsvC5cpv2iE68B2uQ9LJN3WGA7KiY1w+t71RY5CU3clvHGY7IgDjAgDI3Dhh\nINw0ABoW92bjRgsP9HSMm4Z62nUMdSTVj7RxN2PcNfKmvXHc8sJYUR56y41Lu12J+8dw04WPGRiD\nN/rw6cr/howxECeRMFkJ/05SRpo0DOhp6kibvp0u1M2gWRbBEELEfdUqenE7HfvCdf6X60buIC2Q\nlzTiDtzh55r7zIO4qJSHAGOIjC+MRb5Ix4iLLw1WPZIhwABW1lt9Mg2KT5V0UIgiEK4FpmjN8lhQ\n0DXPQEfdhGOHUKQdXLLggL4QSO6vKJyzlFmnPGnIdVXanaWvsXrwYsr/LN+uFUSJS+d63rXOiDWo\nc7W3rqlU59zgplOpKALMZYYdoCvaFKs2DqP4abQLP49vB3FcwoJPCU6qRa/syRufJY/TLpiQn1Vk\n+POweqlsoT6255gxY4YZO3ZsR5a3l92mIssnwviGDRuKLLKrZfH/RCTctD5O+ImII21gVTfBoNnV\ndvTFyoMXInPvvffapgfWlkS+kZ3CSYlLp5CuWD14puOYWQdhgMbpDOLSTgILRMsVELJEul0ZSa/L\nagtIUR4RJ14hQUnKYknnVVddZR555BGzYMGCtoQuSZlp0kCSWKmE4+95551XOCFMo4svaSHF3Av0\nSdEEuVttxMF74sSJmarHkTZwHbB5w3vZZSpQM6VCALIIaUTchS+pCikpsRKXkoCterFYXBgIeWOq\nupx66qn2je24446zgyUDZpQkCTTHEuk0BCGqHs5RF4NUOwtQq/zh85TFp1Xb3PQsW4YwbN682bCR\nYrcEfdGBtzliUhSBa7fakrVe2kyf8eF/jWXm4BJMo2Ut0qt8ixYtMu4qobTKkR9hZaobTiNtOZo+\nHQJYWyTYZzBd1LE9mJJq2Y9ZpKSJNV3fQkBi6fBGXmV5+umnrcmTQVLEHeAxSwuBYNBoJ0LmkqQN\nl8WbNNMyaU3n4XLiftO2Vps00qcMAligpM1xZXXqGnqtWrXKEhmxIHWq7k7Ww72DVU8kqp+wdK5b\nt65px3tJX6Vv7kOmA932Vkl/1dVfBJS4+Ns3XdeMhywDLAOtT4NcWmBOOukkM2fOHHPppZdGZoVM\nPPXUU434B/i4tCMlPJTTkg/wpK5ODMy8ydNnbjt8JS3SKUJeqn6/SXvk2yXJSe4t7hEIDfnc/pPy\nqvKN9QifnenTp1dFZdWzIggocalIR3V
LTQYTgo5V9eGDtYWoy9u3b28JYZiEJHkrprBwvpYVBBei\niERc+iKuuUSJiLYPPfSQ2bp1q9ck1HdylaRf6GtM7SJpCS756C92aceRuYrC/wbWlrqR0Cr2RR11\nVuJSx14tsE0MfrwlxjmtFlhdoUXx5orvxMKFC1v6ctA+JO7NttVARPnkb2dB4SEeNSVQaGNbFIaO\nWJP+4R/+wfq1tNO1RTEdPY2zLo7Usqqko5VnqAyMGaBFiuhryqScNWvWpLbsiR7d/FZrSzfRr3/d\nSlzq38e5W4iT1o4dOyr39pdE7zRWEwGSPCL/+7//a1iBFTWVJgNaljduKT/vtwyA9913X8upsrx1\nFJ0fMghmrK7ppvNwXLvcewAfqTIIIb4uOKf6biUL4yR6x1k5w3n0tyKQBgElLmnQ6qNpGfywXBB7\noxOxPoqAmYEFUzXfrawpXMtLKsAmyj+GwZdrZQxoafBh6oW9RpYuXZomW9fTyhSfL4M2/ek6mea9\nb5ICjOUiCOBWGesT1kksZlW1FCXtF03XXQSUuHQX/8rUzgMJ0zUm8W4Pxu1AS2JlYCBCWpGadnW4\n16mP8sCFb3aUPuigg8yAAQO6NkWEfjKIxJE3tx2+HXd7ugHcRJI41UraIr+5nyBJVbCY8X8wZswY\n+8JQVZ+4IvtOyyoPASUu5WFbu5J5C542bZrXS1bl4UlskLhl3EVYW9wOFiLEW7nr49DKP8bNW9Zx\ntwf+vO2ijzrp4NnNvorDigCKBAtkiTT3ta8yZcoUa92rqkOxr7iqXr0RUOLSGxM9E4OAz6s+hLQc\neuihsf44RZMW4KJupozaTaW5b/Fl+Uagj1hbqr6qo0zyRZ8V7VQL9mXIv/zLv9ipWlYa+Wjx9Pm5\nUEZ/aJndRUCJS3fxr2Tt8pBavHixNw9RIS0AGhdcTSwjRUwRSedRJvUzoKQhReGBs8jpCPqof//+\nlfGNECzD32J1cf1LwmnS/O4UcUyjU7u0cn/t2bPHS4unPA/i/u/atVGvKwJpEFDikgYtTdtAgIeV\nL5FOebB/9atfNUcffbRdhRG1wkcUT0MsJE/cN5YNN9AbZAR9srwVk88doN0ppzgdwtfQgby0tUiC\nFq6nU78JIBi3pD1OjzCmnXKqjdMpzTX0R6QfZbrWB58X7jN8WhAlLRYG/dMhBD72j4F0qC6tpkYI\nsPkZRIHVRkcddZRhcOmGPPHEE9YP4rLLLrOD2yc+8YmWapRBWhhQDjvssEad1M8Sab7jdGlkcA4g\nQFhd5LN3717zy1/+0hKh3/3ud031ONl6Hb700ktG3s57XazgiU9+8pPm5Zdfbmy4F9cEBtO33nrL\nYsagz3QcJE4wjcvr27UwaUE/9tvif40NCD/88ENz9tlnd0Vt/peIRE0oADZAjHtZ6IqCWmmtEVCL\nS627t/zGYXGYMGGCOeaYY2y0T3kzLLtmBiiWZ2/YsMFaWVgyGmfliBoE8ujIgzvOIlI0SaK9rj9G\n3LQS1rAqRzsO9wt9h6XEtUa5aVyn2jL9htw6yz5ud79yHSsjMn/+/NzL+pO2h/vwu9/9riUr7Bjc\nzqcrabmaThFIhQCbLKooAnkQCMKb99x9991s1tlz44039gQDTJ7iYvNKXQFB6pk8eXIPvxG+g4G9\nZd5gt92W19JcoJ6kZSVNl6Z+SQvGlC8fwYHrAYmLxULKqNK32ybpg6i2V6lNrXSlb5P+D/F/x/9C\n2f936Bo4n9u6xo0bl1i/Vm3U84pAHgTU4pKK5mniOAR4C7zrrrvslE3wILXm7DgrSFxZ4WuUzTJL\n3i6HDx9uZs2a1estU6wSYT+Goqwf6EAdSdtEeqQTViixOnzsYx8zp5xyigkeCmEIK/2bN/s///M/\nN6wyqotVJapDstwz3JPsx4UfEP93WEDD/wNRdSU5R9nLly+3+1yRPquvUZK6NI0ikBSBP0maUNMp\nAu0
QYIAmdkrwtmiTEkGTDxvGQR7SCoMx4cMpg6mRffv22YicEJioBzPz7JynLh64CAMBefMKuiBJ\nSQtpwQM9RBfOlSXoRdv/8Ic/mOCNuKxqulYuZOxP//RPbRvT9EHXFM5QcRbSQjXc9/J/xxQh/i/4\nwbDlRZb/O/TACZi4LJBEfIaWLFliNyr1dQuGDHBrlgojoBaXCndeFVQneBZ+KBs3brT7HTGoDho0\nyPpgoD9Ldnk47t+/3zbnwIEDNt22bdvs71GjRpnRo0ebkSNHpnIAhGjwQIdERZEcW3jCPzz84/xZ\n2hVD/rw6tKtDrkP0du7cGRt8T9JW6RsMsbbVNbhZVtLSqg/5vyOC84svvmg/bFqJM/3QoUMbWYYM\nGWJ2795tf7f6vzvttNM6YjFsKKUHikACBJS4JABJkxSDAJYHHExZ8cKDEsGKwl467gOVqaA459Ok\n2jzzzDPms5/9bCoriVu26JuXdFAOA1MnLAVYt5C6hVyvM3EpmrS49zDHch+3+7+DyBxxxBEduU/D\nOupvRSANAkpc0qClaSuDgAwGWF0gS2nJB/l54BdFNrAAMXWEPmVKXYkLfYFlrm6+O3KfdsIPqsz7\nTstWBDqJgPq4dBJtratjCDBFJEQB0sIbO4NfEsniz9KuXAiQu5y5XXq93oxA2YSvubbO/JL7TElL\nZ/DWWuqDgBKX+vSltuQjBKJ8SiAvvN3KG24rsMjLQFLGYCIEqlXder7vICAWuDLus76Dora0ryKg\nxKWv9nxN2w0xabWKSKZ95E3XhQBrjBCeMt/u0a0deXL10uM/IgBmdRnkhbSUeZ/pfaMI1BkBJS51\n7t0+2DaZImrVdLGmQFJEGBT5pPWDkfxpvqkfkpR02ipN2XVOS7/itF11UdJS9R5U/X1A4OM+KKE6\n1B8BBmp8PFjmLEsvo1otS6VZ4cC+LGnessViElWue443XZm2IWAbgc3EGuOmK+uYupLqmlYHlpdv\n3bo1bTbv0wfRcr3XsZ2CSlraIaTXFYFkCChxSYaTpsqAAFaMF154wQaRI54EsSSGDRtmY7gQ+TZK\nWLLJEunFixeb1atXG/YgIvYLmyjGkQvqajVFFFUP51il8vvf/77V5VLPExeGgSyuTVkUGDx4sMU7\nS16f8xBvhHuhqqKkpao9p3r7iIAuh/axVyquE1E3V65caa0rE7JufpwAABmPSURBVCdONASRO/XU\nUzMtBZZAWuxAO2DAADNnzpzIYHRpLRikl6BykB4sQkWTiHbdSL1IGqtSuzJpRx2XDRPFlUCE7Ehc\nNVHSUrUeU319R0CJi+89VCH9IBnslYK0Ihh5muMSovvuu68xiKUhLTJlFfZnaXU+j75J8qbRPUl5\npCHcOyHaw21Mmt/HdFjTNm/e3HFymRcLJS15EdT8ikBvBJS49MZEz6REAMsBkVrxX3EJRcpiEidn\nsGc/lkMPPdTMnDnTDtRJrBZJLCtlEIl2DSu6TjDB12X27Nntqq7EdQZ/9qvCQbdKoqSlSr2lulYJ\nAV1VVKXe8lBXrCC82eN/gPNtJ0z51Ld9+3YzduxYO32QZP8aBhGk3XQQZUMksMB0SopcIg05Y0+a\nH/zgB7YdnWpDmfU88cQThinHKgn3EGRalzxXqddU16ogoBaXqvSUh3ryZr9q1Sq7Y3O3piUgJNde\ne621vixfvjxyoGAQEX+WpDBSLoNOEktO0jLj0mV9Oyefu+IGEoTOVZ1aicKIqa+FCxeaquxMXLQF\nLQoTPacI9GUElLj05d7P2HasEVdffbV5//33zaOPPtqxwb2VuugzY8YM884775i1a9c2yEtev5Uk\nU0utdMpyPsmAFyYqrQjZ7bffbpedL1iwIIsq3uQRvyksbFWQJH1YhXaojoqAzwgocfG5dzzUDTIw\nZswYq5lLEnxQFQvQm2++ackLevJpNzXUTu+85Kdd+e516oIsuTozE
LrSiqi4aTimHKwu7QLyhfP5\n9nv8+PFmxIgRldjtWkmLb3eP6lNXBJS41LVnS2oXy1LDlo2SqspULOTlpZdeMo888og59thjM5UR\nlYlBKSlpiMqf9Nwzzzxjk7L0G8kzBQcWSFWtLmCOHxO+U777iihpsbea/lEEOoKAEpeOwFyPSph+\nIJCcb5YWF12mFiAtBJZL4rTr5m13XLTfi1hz3HrFOTgPYZHyqm51YSURxIUVaz6Lkhafe0d1qyMC\nSlzq2KsltAlCcNVVV9mVKp1yWM3aDAgB01llDHp5/F7CRIVAce60kNveogbDe+65x2zZsqVwEufq\nWsbxihUrzKJFi+zqsTLKL6rMovqpKH20HEWgLyCgxKUv9HLONjLgMk2CJaMqKzuwjvDGvmbNmlzT\nLVHQCQFpZxWRdFJGHFGRNPJN3rC/i1xL8005Z511lnVenjRpUpqsXUtbZt8V2SglLUWiqWUpAskR\nUOKSHKs+mxK/FmTp0qWVwqDst3YGLtfvBaLhBklLQ1SigGUALyIWCOWgJ74irSw8UfV34xxEqyxr\nWZHtUdJSJJpaliKQDgElLunw6nOpeUBXxUEyqnOwumBpKMPaAFF55ZVXzEEHHWT3UZIYKlF6ZD1X\n1ABJoMBp06Z5HzYfkvzBBx94PbVVVJ9kvSc0nyLQ1xFQ4tLX74A27a/SctSophRJvLBcRAV7y+P3\nEqWze66oKSPKdJeL+7hKx3f96AusVu2mCN3+02NFQBEoHgElLsVjWpsSixz0uwkK5OuSSy5JbXUJ\nExV3WijcnjIHNYgRUoRTtJADHwIHuhiKXr6uWCuSQLrt1mNFQBFIj4DuVZQes7Y5TjnlFNOvXz/7\nYZdeV+Q836+++qp7KfaY/VrcvLGJC7r4wAMPmFmzZnkfQ6Ndc9kSgBUq7QSi5n4gCrxdyyfOSsE1\n0pGfQa5IQQ/XdyZP2cR0GTZsmNUVYtZtASuIpQQOjMO4W7oqaekW8lqvIhCNQCWJC2RABnFIgkrx\nCPCwXrZsmR1Uii+9syXKSihIhSsuSeFYCIp8ZxlEyYuFRKwkbn15joUU5SlD8kJe2MUbCxIOzN0S\niBMrng455BBvYwMpaenW3aH1KgKtEfh460t6pS8jsHHjRjN58uRCpid8wPGv//qvLRFzdYEMlCGs\n3BHyUsT0jugI0WCwL2JlELt4v/7662bq1Klm3bp1hngvReoqOkd9QwbYEPPv/u7vzMMPP5x6Ci+q\nzDLOKWkpA1UtUxHIj0AlLS75m11uCT/5yU9MT0+P/TAwVFHWr19v34arqHtYZ4LnfelLX7JTc2JN\nKYu0SN1CAoqcjhELEANqEQIGW7duNSeeeKLd1wjyUlTZrfRjdRNWFoLi4ehaxmqvVnWnOa+kJQ1a\nmlYR6DACwQBbGdm2bVtPAE/kJ5i379WOBx98sIfzbh7O7d+/PzKtpLv88svt9TvuuKORVzJIGr7R\nh8+FF15o01E24tYp51rlf/bZZxv5KZOyOBeWxx9/vKEL6aIkTXuj8rvngoG3J/CrcE9V/rgbbQpW\nIfUElo1CsSu6PJQLSETPuHHjesDotttuK7TvweCpp57qCQiS/XDss6AveKgoAoqAnwjUcqro3Xff\ntdMczz//fDDGN8s111xjjjzySBOQAzN48ODmi86viy66yETld5LYt9Wbb77ZPZXqGBP9vHnzmvJQ\nJ5+AhFgzftPFFj+KaK9btFgJxGrgXstzjJ7EPdmxY4fdqPHll182AYlsKpK+OfPMM83RRx9tLQFn\nnHGGOeKII5rSZP0xfPhw87Of/axjUyLo6Trtxq1KStMmLCXik5MmX1xapp/Y24m+x4eMmDTnnnuu\ntYicfvrpqaensFgw3Ugfr1q1yoD9nDlzDFNUPotaWnzuHdVNEfgjArUkLvhmxJEOBsuLL77Y7Nq1\nyxDdNCyPPfZY+FTk7zykhQLDp
MWtBIJ1wgknGAaNdpK3veHyIRgMNEUJ5dHWxYsXty2Svgnjz6qg\nW265JTeBGTFihNm9e3dXti2AbEAKIDJFEEKIBX40RZTldgoEBuddSAbEgylDsEe4J8AQGTJkSNP/\nzp49e8yBAwfMW2+9Zd544w1LTgMLjl2GfsMNNxSup1Wi4D9KWgoGVItTBEpCoFI+LgzigeHKWiME\nD5Z2cg6/EoRlwy5pwXLBdT7BdItks2/67u/GhY8OKDeYBmrkDV+X3zzUpfws/iyt9KN89gZqJ0W1\n162HwV0GKPd8luPnnnvOYDVJQlpalU9eyqCsPII1h4G1WyJOtWLRyqMHhKWoJdJRekCwsI6wzQP1\nbN682UAgEQgKfTJ//vzGZ+fOnfba6NGjrcWG/wksOPiwFE2ubEUF/xFnaumjgovX4hQBRaBIBIIH\nTOUEX44AA/vBn8SV4OHauBaQCveSPW6V1z1P2fiuRInUy7f4woTTJfVxaacfdQSDhC2+lY9L1vaG\ndXZ/33333T188gq+RAFZaPSHi12WY8qizKyCbwh+HN2WIv1eyvB36TY+na4fX666+XN1GkOtTxHo\nJAK1myp67bXXgjHxjxJlNRg1apRctkGvgkGkyeQtF5NM0bAPTh4hmmtY8O9whWmWOF+cotrr1olV\ngjfnvIJvA1M/rmDJCgifjd3BVFiU8PbOfjVMVbjWM8qizJtuuikqW2XOFen3UuQS6coAWKCiEm+n\nClahAputRSkClUagdsSFCJwi+LG0kyjiwuCaRPr3758kWcs0UU6nYZ8b9IuTItobLh/SEKVbOF27\n3xAPV5iau+yyy9xTkcdCGiEoRBd2/W0os+rERRpdhN8LJAjBP0OOpXz9jkdASUs8PnpVEfAVgUr5\nuPgKouoVjYBrLcEXKAlpCZcEiSGviFumnKvyt/hU5PF7oQxioqgkR0BJS3KsNKUi4BsCtSMurrXE\nda4N5t8aTrTucRGWhaydihNsWMIWlnb6ldFeQrAzRVWkDBo0KHNxefJmrrSDGZmm4BPekiCNCrJE\nOk2evppWSUtf7Xltd10QqN1U0WmnnWZ9V+ggfCVk2sHHDiN6KPFiXCHuhQirYNoRlzLaO3To0F6+\nKaJT1m9WpWRZdUV95K27FOH3UtYSabDHIsSSZ/yM9u3bZ/bu3dvUJZBd7humT/HJgkj5KEpafOwV\n1UkRSIdA5S0u7733XlOLzznnnMZvYqG4uzNjRbjuuuu82aCR2CaufixtRmeRK6+8Ug5bfpfVXpa8\n5hWccEWIzXLnnXeasEVJrkd9kxZ83LgubplReeLOMfAywPosDPgMrjLAptEVqw2+LnzyCmUQnn/K\nlCk2GN2ECRPMypUrbbEDBw60u4azc7h8xJmblwXOsQkqzutsI5ClLXn1j8oveqgjbhQ6ek4RqA4C\nlbe48AbIQ5IpE2K54EdBfAlxWoUIuGTA7RoesN2WOP0kbkacjmW0l+BieeKuiL4MXC7pIGAfn2Bb\nA3PooYfagU3Sut9YWN5///2mFUVyPc9KLsgYVgHfBZ8VBlmsHOIDk1Rn0ueJqktenKgXLlxoJIBc\nsAVA21gsYQsLxCdYqm02bNhgrS/odf3113ctcq6SlqR3kKZTBCqAQCfXXhdVF3v5BNA2fQLi0ig+\nIDNN+/+E0/KbuC2uuHFc3LLcNBy7ZRFbJUrIL+nC9ch5vt29kNzzHIfztYrjQv1Z2hult5wjpkXw\nVio/M38Tg0b2cQq3L8tvypK4NlmUIoZLsCopS9au5AksTpn2zMmST2Lc0O/E8Ckyrgn6dHOvIo3T\n0pXbVytVBEpDAIfVSgoDuxvcLIpskCY8cBL0LSq4HGllMI0qS0CSNHznJS4QDspwiQ765tlkMWl7\npT2tvhnAithoLnBA7tUHLoZJj2kXZeUR6ipyQM6jS9K8DPpZgswlHawpn00VhbDwu0wRAhPsg1T
I\n/dVOV+7hqvV5uzbpdUWgryNQWeLS1zuu7PYH+x/1PPzww4VVAzF0CVpSwkIe8uYVLC3sTlxVgbyk\nJRXtCA/XwQRLVKcHd6w63ANFRGhu1aeQlrSYtSpLzysCioA/CPRDleABoqIINCGAY+a9995b+Ioe\nHKTZIRp/k5/+9Kfm17/+dVO9n/nMZ8zJJ59sV6cUuTP07bffbuuZPXt2U31V+oHPC6uPAutIYrVb\n+busWLHCxsfBQZz9hLohtAc/LnYCX7RoUaEB9CgbnDQoXzd6VutUBMpFQIlLufhWtnScK4niG7yJ\npxoofW0wS4Vx+k3r7Opbe3AypW+StiPKKXXmzJl26wQf8KAtM2bMMO+8845Zu3ZtIURDSYtvd63q\nowgUi0Dll0MXC4eWJgjwpnrjjTeaZcuWyanKfmM9GjBgQOLB3ueGYkXgkzRYHWkhB3wQSAsr7oi0\nm5T8lIkH9xk7UAdTgmbMmDENPbPWqaQlK3KaTxGoDgJqcalOX3VcUwbHsWPH2kGuyiZ3pp6mTZtm\nAr+djmNYZoX0D5ssJukb0gaO4Ja0FGXZKLptQqqy6qekpege0fIUAT8RUIuLn/3ihVbE5mCDw+XL\nl3uhTxYlsLbgxsWu4Aze7idLeT7lSROsjuB77Kz96KOPJiI63WjnggULrL/L1Vdfnbp6JS2pIdMM\nikBlEVCLS2W7rjOKM9BjdeGbaYeqyUknnWTmzJkTGfiMNrmCT48P0yeuTkmO2/m9MKhjmfFleiiu\nTUxpMWUULJc2SR2plbTEIarXFIH6IaDEpX59WniLMOF/8MEH1heh8MJLLJBw82vWrEm8MopBk8Hd\nlaRTMW6ebhyL7lERbM866yzrANut1UNp8YCIECGZvgu3J1yWkpYwIvpbEag/Akpc6t/HuVvIoMgA\nft9990VaLnJXUEIBRVkZKCeIBdKkYbvBtClxh39gRXLJFsvAd+zYYZ588skOa5KvOpZrs0R6+/bt\nLQsKt7VlQr2gCCgCtUJAiUuturO8xjBIMGXkwxLadq2EaJVpZQhPMbHU2qdpNMiWOOyiW1WXtGN1\n4Z6bPn16ry6nD3wmkL0U1hOKgCJQGAJKXAqDsv4FMfXy0EMPma1btzYGRh9bzYDH8lqcPTsh+JhA\nDlxxrR7u+U4do9Ott95qjjrqqMS+Ip3SLWk9QpaZvhMiRl4lLUkR1HSKQD0RUOJSz34trVV5l6yW\npthHBfuiX3iKqdOOvxAXrC3oceyxx5YNe2nljx8/3owYMaJhdVHSUhrUWrAiUBkElLhUpqv8UdQX\ncuAiwvTQ3LlzvY1TIs6zrs5lTjHRR0inrE5uu4o8hqhMnTrV+rooaSkSWS1LEaguAkpcqtt3XdWc\ngTHYuNAGNev2EmJIAUtokazBy7oBZtQUU1F+G5CiKvgjJcGdJe3XXHONue6665Ik1zSKgCJQcwQ+\nXvP2afNKQoA3eXxe8Cfp5moj3sJx4Jw4caKN1+L6QpTU9MKKxaE37NRLe1zJMsW0adMmG4+m24TS\nbUee42D3aruXUZ4yNK8ioAjUBwG1uNSnL7vSEiEORKa97bbbeg3EZSmFleW73/2uuf/++003dzgu\nq31SbtQUUzvHX6xh/fv3r6xTrrRdvvHTgSCHHaDlun4rAopA30JAiUvf6u9SWsvgin8JIeVnzZpl\nCNlepuWDMP7Ud8wxx1irT9hqUUojPSo07PiLau4UE1MrS5YsaTrnkfqZVKnT1FcmADSTIqAINBBQ\n4tKAQg/yIoD1Zf78+Wbbtm2WwLAipChSATl66qmnbFAy9Fy4cKE5//zz86pcm/wyxfThhx+ac845\np7KxW1p1yJQpU2xsnqpE/23VDj2vCCgC+RHQTRbzY6glfIQAb/1EaCVU+759++xyXAYcoqDiiJpW\nICtYVygDX49169ZZwkI0VSUtzWiCPZ+DDjrI4BOCYJmpiww
dOtTeU3Vpj7ZDEVAEsiOgFpfs2GnO\nNghAPFh5tH79erNhwwYzYMAAO71DXA6EnaddYQfjAwcOmLfeesu88cYbNlQ9g/All1xiLrjggsKs\nN26ddTuGJBIgcOnSpbVqGg7HixcvrtzWBbXqBG2MIuAJArqqyJOOqKMa+Llceumljf2NsABATvbv\n328JCtNKrgwaNMgMHDjQjB492nzjG9+olY+G284yjyF+WCfqJljcVBQBRUARAAElLnofdAwBlufW\nZYlux0DTiiwCOOfiO6WiCCgCioD6uOg9oAgoAt4jgJN3Fj8p7xumCioCikBqBJS4pIZMMygCioAi\noAgoAopAtxBQ4tIt5LVeRUARSIyAWlsSQ6UJFYHaI6DEpfZdrA1UBKqPAFFzZZl39VujLVAEFIE8\nCChxyYOe5lUEPEOAUP/E0FFRBBQBRaCuCChxqWvParv6JAKDBw82e/furV3bWVF04okn1q5d2iBF\nQBFIj4ASl/SYaQ5FwFsE2IBx9erV3uqXVTGCEhLjR0URUAQUASUueg8oAjVCgKB/WCbqFO6f7iGS\n8umnn16jntKmKAKKQFYElLhkRU7zKQKeIjBy5Ejz3HPPeapderUgYe+9954GL0wPneZQBGqJgBKX\nWnarNqovIzBq1Ci70WVdMHj11VfNxIkT69IcbYcioAjkRECJS04ANbsi4BsC7JyNlaIusU8WLVpk\nIGMqioAioAiAgBIXvQ8UgRoigIXirrvuqnzLfvzjH9tpIsiYiiKgCCgCINCvJxCFQhFQBOqFANaW\nL37xi+bnP/+5wWG3qjJ+/HgzYsQIM3369Ko2QfVWBBSBghFQi0vBgGpxioAPCLAp4fDhw83y5ct9\nUCeTDlhbiN9y9dVXZ8qvmRQBRaCeCKjFpZ79qq1SBKyfC3FdCJcPkamaqLWlaj2m+ioCnUFALS6d\nwVlrUQQ6jsDnPvc5c9ttt1VymmXFihXm7bffVmtLx+8arVAR8B8Btbj430eqoSKQGYHf/va3BqvL\nvHnzzKRJkzKX08mM4p+zZs0a66fTybq1LkVAEfAfASUu/veRaqgI5EIAX5GxY8eazZs3VyKI23nn\nnWfOPfdcM3v27Fzt1syKgCJQTwR0qqie/aqtUgQaCLC6aNasWWbChAkGC4zPMnPmTKuekhafe0l1\nUwS6i4BaXLqLv9auCHQMAUjBm2++adauXevlEmn027hxo9m6dauX+nWso7QiRUARiEVALS6x8OhF\nRaA+CCxYsMAMGzbMjBkzxjvLC6Rl1apV5vHHH1fSUp9bTluiCJSCgBKXUmDVQhUBPxFwyYsPO0gz\ndSWWoKr44PjZs6qVItB3EFDi0nf6WluqCFgEIC84v+IEu2nTpq6hwuohrD8yfcXybRVFQBFQBNoh\noMSlHUJ6XRGoIQI4vz7yyCPmqquuMlOmTOn41BFxWnAahkBhaanytgQ1vD20SYqA1wgocfG6e1Q5\nRaA8BNi4kL2MEGK93HPPPeVV9lHJLM3G0sOOz8Rp0dVDpUOuFSgCtUNAVxXVrku1QYpAegQgFPPn\nz7fRar/+9a/biLVFWkGefvpps3LlSrv3UJWC4aVHUnMoAopA2QgocSkbYS1fEagQAhCYBx54wCxb\ntsxMnjzZjB492owcOTLTVA5lPfvss+b+++83AwYMMDNmzDBf/vKXM5VVIQhVVUVAESgZASUuJQOs\nxSsCVUQAx9kXXnjBrFu3zqxevdr6orCUeuDAgWbIkCHm4IMP7tUsdnI+cOCA2bFjh81z4oknmnHj\nxpnLLrusEhF7ezVITygCioCXCChx8bJbVClFwC8EsJ7s2bPHEpMtW7ZEKjdo0KAGsTnuuOMquSN1\nZMP0pCKgCHiFgBIXr7pDlVEEFAFFQBFQBBSBOAR0VVEcOnpNEVAEFAFFQBFQBLxCQImLV92hyigC\nioAioAgoAopAHAJKXOL
Q0WuKgCKgCCgCioAi4BUCSly86g5VRhFQBBQBRUARUATiEFDiEoeOXlME\nFAFFQBFQBBQBrxBQ4uJVd6gyioAioAgoAoqAIhCHgBKXOHT0miKgCCgCioAioAh4hYASF6+6Q5VR\nBBQBRUARUAQUgTgElLjEoaPXFAFFQBFQBBQBRcArBP4fntNQJrCufL0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 27, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review = \"The movie was excellent\"\n", + "\n", + "Image(filename='sentiment_network_pos.png')" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "# Project 2: Creating the Input/Output Data" + ] + }, + { + "cell_type": "code", + "execution_count": 74, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "74074\n" + ] + } + ], + "source": [ + "vocab = set(total_counts.keys())\n", + "vocab_size = len(vocab)\n", + "print(vocab_size)" + ] + }, + { + "cell_type": "code", + "execution_count": 75, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['',\n", + " 'inhabitants',\n", + " 'goku',\n", + " 'stunts',\n", + " 'catepillar',\n", + " 'kristensen',\n", + " 'senegal',\n", + " 'goddess',\n", + " 'distroy',\n", + " 'unexplainably',\n", + " 'concoctions',\n", + " 'petite',\n", + " 'scribe',\n", + " 'stevson',\n", + " 'sctv',\n", + " 'soundscape',\n", + " 'rana',\n", + " 'metamorphose',\n", + " 'immortalizer',\n", + " 'henstridge',\n", + " 'planning',\n", + " 'akiva',\n", + " 'plod',\n", + " 'eko',\n", + " 'orderly',\n", + " 'zeleznice',\n", + " 'verbose',\n", + " 'amplify',\n", + " 'resonation',\n", + " 'critize',\n", + " 'jefferies',\n", + " 'mountainbillies',\n", + " 'steinbichler',\n", + " 'vowel',\n", + " 'rafe',\n", + " 'bonbons',\n", + " 'tulipe',\n", + " 'clot',\n", + " 'distended',\n", + " 'his',\n", + " 'impatiently',\n", + " 'unfortuntly',\n", + " 'lung',\n", + " 'scapegoats',\n", + " 'muzzle',\n", + " 'pscychosexual',\n", + " 'outbid',\n", + " 'obit',\n", + " 
'sideshows',\n", + " 'jugde',\n", + " 'particolare',\n", + " 'kevloun',\n", + " 'masterful',\n", + " 'quartier',\n", + " 'unravelling',\n", + " 'necessarily',\n", + " 'antiques',\n", + " 'strutts',\n", + " 'tilts',\n", + " 'disconcert',\n", + " 'dossiers',\n", + " 'sorriest',\n", + " 'blart',\n", + " 'iberia',\n", + " 'situations',\n", + " 'frmann',\n", + " 'daniell',\n", + " 'rays',\n", + " 'pried',\n", + " 'khoobsurat',\n", + " 'leavitt',\n", + " 'caiano',\n", + " 'sagan',\n", + " 'attractiveness',\n", + " 'kitaparaporn',\n", + " 'hamilton',\n", + " 'massages',\n", + " 'reasonably',\n", + " 'horgan',\n", + " 'chemist',\n", + " 'audrey',\n", + " 'jana',\n", + " 'dutch',\n", + " 'override',\n", + " 'spasms',\n", + " 'resumed',\n", + " 'stinson',\n", + " 'widows',\n", + " 'stonewall',\n", + " 'palatial',\n", + " 'neuman',\n", + " 'abandon',\n", + " 'anglophile',\n", + " 'marathon',\n", + " 'chevette',\n", + " 'unscary',\n", + " 'eponymously',\n", + " 'spoilerific',\n", + " 'fleashens',\n", + " 'brigand',\n", + " 'politeness',\n", + " 'clued',\n", + " 'dermatonecrotic',\n", + " 'grady',\n", + " 'mulligan',\n", + " 'ol',\n", + " 'bertolucci',\n", + " 'incubation',\n", + " 'oldboy',\n", + " 'snden',\n", + " 'plaintiffs',\n", + " 'fk',\n", + " 'deply',\n", + " 'franchot',\n", + " 'cyhper',\n", + " 'glorifying',\n", + " 'mazovia',\n", + " 'elizabeth',\n", + " 'palestine',\n", + " 'robby',\n", + " 'wongo',\n", + " 'moshing',\n", + " 'eeeee',\n", + " 'doltish',\n", + " 'bree',\n", + " 'postponed',\n", + " 'gunslinger',\n", + " 'debacles',\n", + " 'kamm',\n", + " 'herman',\n", + " 'rapture',\n", + " 'rolando',\n", + " 'tetsuothe',\n", + " 'premises',\n", + " 'bruck',\n", + " 'loosely',\n", + " 'boylen',\n", + " 'proportions',\n", + " 'grecianized',\n", + " 'wodehousian',\n", + " 'encapsuling',\n", + " 'partly',\n", + " 'posative',\n", + " 'calms',\n", + " 'stadling',\n", + " 'austrailia',\n", + " 'shortland',\n", + " 'wheeling',\n", + " 'darkie',\n", + " 'mckellar',\n", + " 
'cushy',\n", + " 'ooookkkk',\n", + " 'milky',\n", + " 'unfolded',\n", + " 'degrades',\n", + " 'authenticating',\n", + " 'rotheroe',\n", + " 'beart',\n", + " 'neath',\n", + " 'grispin',\n", + " 'intoxicants',\n", + " 'nnette',\n", + " 'slinging',\n", + " 'tsukamoto',\n", + " 'stows',\n", + " 'suddenness',\n", + " 'waqt',\n", + " 'degrading',\n", + " 'camazotz',\n", + " 'blarney',\n", + " 'shakher',\n", + " 'delinquency',\n", + " 'tomreynolds',\n", + " 'insecticide',\n", + " 'charlton',\n", + " 'hare',\n", + " 'wayland',\n", + " 'nakada',\n", + " 'urbane',\n", + " 'sadomasochistic',\n", + " 'larnia',\n", + " 'hyping',\n", + " 'yr',\n", + " 'hebert',\n", + " 'accentuating',\n", + " 'deathrow',\n", + " 'galligan',\n", + " 'unmediated',\n", + " 'treble',\n", + " 'alphabet',\n", + " 'soad',\n", + " 'donen',\n", + " 'lord',\n", + " 'recess',\n", + " 'handsome',\n", + " 'center',\n", + " 'vignettes',\n", + " 'rescuers',\n", + " 'pairings',\n", + " 'uselful',\n", + " 'sanders',\n", + " 'nots',\n", + " 'hatsumomo',\n", + " 'appleby',\n", + " 'tampax',\n", + " 'sprinkling',\n", + " 'defacing',\n", + " 'lofty',\n", + " 'opaque',\n", + " 'tlc',\n", + " 'romagna',\n", + " 'tablespoons',\n", + " 'bernhard',\n", + " 'verger',\n", + " 'acumen',\n", + " 'percentages',\n", + " 'wendingo',\n", + " 'resonating',\n", + " 'vntoarea',\n", + " 'redundancies',\n", + " 'red',\n", + " 'pitied',\n", + " 'belying',\n", + " 'gleefulness',\n", + " 'bibbidi',\n", + " 'heiligt',\n", + " 'gitane',\n", + " 'journalist',\n", + " 'focusing',\n", + " 'plethora',\n", + " 'citizen',\n", + " 'coster',\n", + " 'clunkers',\n", + " 'deplorable',\n", + " 'forgive',\n", + " 'proplems',\n", + " 'magwood',\n", + " 'bankers',\n", + " 'aqua',\n", + " 'donated',\n", + " 'disbelieving',\n", + " 'acomplication',\n", + " 'immediately',\n", + " 'contrasted',\n", + " 'reidelsheimer',\n", + " 'fox',\n", + " 'springs',\n", + " 'toolbox',\n", + " 'contacting',\n", + " 'ace',\n", + " 'washrooms',\n", + " 'raving',\n", + " 
'dynamism',\n", + " 'mae',\n", + " 'sky',\n", + " 'disharmony',\n", + " 'untutored',\n", + " 'icarus',\n", + " 'taint',\n", + " 'kargil',\n", + " 'captain',\n", + " 'paucity',\n", + " 'fits',\n", + " 'tumbles',\n", + " 'amer',\n", + " 'bueller',\n", + " 'redubbed',\n", + " 'cleansed',\n", + " 'kollos',\n", + " 'shara',\n", + " 'humma',\n", + " 'felichy',\n", + " 'outa',\n", + " 'piglets',\n", + " 'gombell',\n", + " 'supermen',\n", + " 'superlow',\n", + " 'enhance',\n", + " 'goode',\n", + " 'shalt',\n", + " 'kubanskie',\n", + " 'zenith',\n", + " 'ananda',\n", + " 'ocd',\n", + " 'matlin',\n", + " 'nosed',\n", + " 'presumptuous',\n", + " 'rerun',\n", + " 'toyko',\n", + " 'mazar',\n", + " 'sundry',\n", + " 'bilb',\n", + " 'fugly',\n", + " 'orchestrating',\n", + " 'prosaically',\n", + " 'maricarmen',\n", + " 'moveis',\n", + " 'conelly',\n", + " 'estrange',\n", + " 'lusciously',\n", + " 'seasonings',\n", + " 'sums',\n", + " 'delirious',\n", + " 'quincey',\n", + " 'flesh',\n", + " 'tootsie',\n", + " 'ai',\n", + " 'tenma',\n", + " 'appropriations',\n", + " 'chainsaw',\n", + " 'ides',\n", + " 'surrogacy',\n", + " 'pungent',\n", + " 'gallon',\n", + " 'damaso',\n", + " 'caribou',\n", + " 'perico',\n", + " 'supplying',\n", + " 'ro',\n", + " 'yuy',\n", + " 'valium',\n", + " 'debuted',\n", + " 'robbin',\n", + " 'mounts',\n", + " 'interpolated',\n", + " 'aetv',\n", + " 'plummer',\n", + " 'competence',\n", + " 'toadies',\n", + " 'dubiel',\n", + " 'clavichord',\n", + " 'asunder',\n", + " 'sublety',\n", + " 'airfix',\n", + " 'stoltzfus',\n", + " 'ruth',\n", + " 'fluorescent',\n", + " 'improves',\n", + " 'rebenga',\n", + " 'russells',\n", + " 'deliberation',\n", + " 'zsa',\n", + " 'dardino',\n", + " 'macs',\n", + " 'servile',\n", + " 'jlb',\n", + " 'apallonia',\n", + " 'crossbows',\n", + " 'locus',\n", + " 'mislead',\n", + " 'corey',\n", + " 'blundered',\n", + " 'jeopardizes',\n", + " 'disorganized',\n", + " 'discuss',\n", + " 'longish',\n", + " 'tieing',\n", + " 'ledger',\n", + " 
'speechifying',\n", + " 'amitabhz',\n", + " 'bbc',\n", + " 'chimayo',\n", + " 'pranked',\n", + " 'superman',\n", + " 'aggravated',\n", + " 'rifleman',\n", + " 'yvone',\n", + " 'radiant',\n", + " 'galico',\n", + " 'debris',\n", + " 'waking',\n", + " 'btw',\n", + " 'havnt',\n", + " 'francen',\n", + " 'chattered',\n", + " 'scathed',\n", + " 'pic',\n", + " 'ceremonies',\n", + " 'watergate',\n", + " 'betsy',\n", + " 'majorca',\n", + " 'meercat',\n", + " 'noirs',\n", + " 'grunts',\n", + " 'drecky',\n", + " 'tribulations',\n", + " 'avery',\n", + " 'talladega',\n", + " 'eights',\n", + " 'dumbing',\n", + " 'alloimono',\n", + " 'scrutinising',\n", + " 'geta',\n", + " 'beltrami',\n", + " 'pvc',\n", + " 'horse',\n", + " 'tiburon',\n", + " 'huitime',\n", + " 'ripple',\n", + " 'loitering',\n", + " 'forensics',\n", + " 'nearly',\n", + " 'elizabethan',\n", + " 'ellington',\n", + " 'uzi',\n", + " 'sicily',\n", + " 'camion',\n", + " 'motivated',\n", + " 'rung',\n", + " 'gao',\n", + " 'licitates',\n", + " 'protocol',\n", + " 'smirker',\n", + " 'torin',\n", + " 'newlywed',\n", + " 'rich',\n", + " 'dismay',\n", + " 'skyler',\n", + " 'moonwalks',\n", + " 'haranguing',\n", + " 'sunburst',\n", + " 'grifter',\n", + " 'undersold',\n", + " 'chearator',\n", + " 'marino',\n", + " 'scala',\n", + " 'conditioner',\n", + " 'ulysses',\n", + " 'lamarre',\n", + " 'figueroa',\n", + " 'flane',\n", + " 'allllllll',\n", + " 'slide',\n", + " 'lateness',\n", + " 'selbst',\n", + " 'gandhis',\n", + " 'dramatizing',\n", + " 'catchphrase',\n", + " 'doable',\n", + " 'stadiums',\n", + " 'alexanderplatz',\n", + " 'pandemonium',\n", + " 'misrepresents',\n", + " 'earth',\n", + " 'mounties',\n", + " 'seeker',\n", + " 'cheat',\n", + " 'outbreaks',\n", + " 'snowstorm',\n", + " 'baur',\n", + " 'schedules',\n", + " 'bathetic',\n", + " 'incorrect',\n", + " 'johnathon',\n", + " 'rosanne',\n", + " 'mundanely',\n", + " 'cauldrons',\n", + " 'forrest',\n", + " 'poky',\n", + " 'legislation',\n", + " 'womanness',\n", + " 
'spender',\n", + " 'crazy',\n", + " 'rational',\n", + " 'terrell',\n", + " 'zero',\n", + " 'coincides',\n", + " 'thoughout',\n", + " 'mathew',\n", + " 'narnia',\n", + " 'naseeruddin',\n", + " 'bucks',\n", + " 'affronts',\n", + " 'topple',\n", + " 'degree',\n", + " 'preyed',\n", + " 'passionately',\n", + " 'defeats',\n", + " 'torchwood',\n", + " 'sources',\n", + " 'botticelli',\n", + " 'compactor',\n", + " 'kosturica',\n", + " 'waiving',\n", + " 'gunnar',\n", + " 'stiffler',\n", + " 'fwd',\n", + " 'kawajiri',\n", + " 'eleanor',\n", + " 'sistahs',\n", + " 'soulhunter',\n", + " 'belies',\n", + " 'wrathful',\n", + " 'americans',\n", + " 'ferdinandvongalitzien',\n", + " 'kendra',\n", + " 'weirdy',\n", + " 'unforgivably',\n", + " 'chepart',\n", + " 'tatta',\n", + " 'departmentthe',\n", + " 'dig',\n", + " 'blatty',\n", + " 'marionettes',\n", + " 'atop',\n", + " 'chim',\n", + " 'saurian',\n", + " 'woes',\n", + " 'cloudscape',\n", + " 'resignedly',\n", + " 'unrooted',\n", + " 'keuck',\n", + " 'hitlerian',\n", + " 'stylings',\n", + " 'crewed',\n", + " 'bedeviled',\n", + " 'unfurnished',\n", + " 'reedus',\n", + " 'circumstances',\n", + " 'grasped',\n", + " 'smurfettes',\n", + " 'fn',\n", + " 'dishwashers',\n", + " 'roadie',\n", + " 'ruthlessness',\n", + " 'refrains',\n", + " 'lampooning',\n", + " 'semblance',\n", + " 'richart',\n", + " 'legions',\n", + " 'gwenneth',\n", + " 'enmity',\n", + " 'assess',\n", + " 'manufacturer',\n", + " 'bullosa',\n", + " 'outrun',\n", + " 'hogan',\n", + " 'chekov',\n", + " 'blithe',\n", + " 'code',\n", + " 'drillings',\n", + " 'revolvers',\n", + " 'aredavid',\n", + " 'robespierre',\n", + " 'achcha',\n", + " 'boyfriendhe',\n", + " 'wallow',\n", + " 'toga',\n", + " 'graphed',\n", + " 'tonking',\n", + " 'going',\n", + " 'bosnians',\n", + " 'willy',\n", + " 'rohauer',\n", + " 'fim',\n", + " 'forbidding',\n", + " 'yew',\n", + " 'rationalised',\n", + " 'shimomo',\n", + " 'opposition',\n", + " 'landis',\n", + " 'minded',\n", + " 'despicableness',\n", + 
" 'easting',\n", + " 'arghhhhh',\n", + " 'ebb',\n", + " 'trialat',\n", + " 'protected',\n", + " 'negras',\n", + " 'rick',\n", + " 'muti',\n", + " 'tracker',\n", + " 'shawl',\n", + " 'differentiates',\n", + " 'sweetheart',\n", + " 'deepened',\n", + " 'manmohan',\n", + " 'trevethyn',\n", + " 'brain',\n", + " 'incomprehensibly',\n", + " 'piercing',\n", + " 'pasadena',\n", + " 'shtick',\n", + " 'ute',\n", + " 'viggo',\n", + " 'supersedes',\n", + " 'ack',\n", + " 'cites',\n", + " 'taurus',\n", + " 'relevent',\n", + " 'minidress',\n", + " 'philosopher',\n", + " 'bel',\n", + " 'mahattan',\n", + " 'moden',\n", + " 'compiling',\n", + " 'advertising',\n", + " 'rogues',\n", + " 'unimaginative',\n", + " 'subpaar',\n", + " 'ademir',\n", + " 'darkly',\n", + " 'saturate',\n", + " 'fledgling',\n", + " 'breaths',\n", + " 'padre',\n", + " 'aszombi',\n", + " 'pachabel',\n", + " 'incalculable',\n", + " 'ozone',\n", + " 'sped',\n", + " 'mpho',\n", + " 'rawail',\n", + " 'forbid',\n", + " 'synth',\n", + " 'guttersnipe',\n", + " 'reputedly',\n", + " 'holiness',\n", + " 'unessential',\n", + " 'hampden',\n", + " 'asylum',\n", + " 'bolye',\n", + " 'strangers',\n", + " 'rantzen',\n", + " 'farrellys',\n", + " 'vigourous',\n", + " 'cantinflas',\n", + " 'enshrined',\n", + " 'boris',\n", + " 'expetations',\n", + " 'replaying',\n", + " 'prestige',\n", + " 'bukater',\n", + " 'overpaid',\n", + " 'exhude',\n", + " 'backsides',\n", + " 'topless',\n", + " 'sufferings',\n", + " 'nitwits',\n", + " 'cordova',\n", + " 'incensed',\n", + " 'danira',\n", + " 'unrelenting',\n", + " 'disabling',\n", + " 'ferdy',\n", + " 'gerard',\n", + " 'drewitt',\n", + " 'mero',\n", + " 'monsters',\n", + " 'precautions',\n", + " 'lamping',\n", + " 'relinquish',\n", + " 'demy',\n", + " 'drink',\n", + " 'chamberlin',\n", + " 'unjustifiably',\n", + " 'cove',\n", + " 'floodwaters',\n", + " 'searing',\n", + " 'isral',\n", + " 'ling',\n", + " 'grossness',\n", + " 'pickier',\n", + " 'pax',\n", + " 'wierd',\n", + " 'tereasa',\n", + " 
'smog',\n", + " 'girotti',\n", + " 'spat',\n", + " 'sera',\n", + " 'noxious',\n", + " 'misbehaving',\n", + " 'scouts',\n", + " 'refreshments',\n", + " 'autobiographic',\n", + " 'shi',\n", + " 'toyomichi',\n", + " 'bits',\n", + " 'psychotics',\n", + " 'barzell',\n", + " 'colt',\n", + " 'shivering',\n", + " 'pugilist',\n", + " 'gladiator',\n", + " 'dryer',\n", + " 'reissues',\n", + " 'scrivener',\n", + " 'predicable',\n", + " 'objection',\n", + " 'marmalade',\n", + " 'seems',\n", + " 'spellbind',\n", + " 'trifecta',\n", + " 'innovator',\n", + " 'shriekfest',\n", + " 'inthused',\n", + " 'contestants',\n", + " 'goody',\n", + " 'samotri',\n", + " 'serviced',\n", + " 'nozires',\n", + " 'ins',\n", + " 'mutilating',\n", + " 'dupes',\n", + " 'launius',\n", + " 'widescreen',\n", + " 'joo',\n", + " 'discretionary',\n", + " 'enlivens',\n", + " 'bushes',\n", + " 'chills',\n", + " 'header',\n", + " 'activist',\n", + " 'gethsemane',\n", + " 'phoenixs',\n", + " 'wreathed',\n", + " 'sacrine',\n", + " 'electrifyingly',\n", + " 'basely',\n", + " 'ghidora',\n", + " 'binder',\n", + " 'dogfights',\n", + " 'sugar',\n", + " 'doddsville',\n", + " 'porkys',\n", + " 'scattershot',\n", + " 'refunded',\n", + " 'rudely',\n", + " 'insteadit',\n", + " 'zatichi',\n", + " 'eurotrash',\n", + " 'radioraptus',\n", + " 'hurls',\n", + " 'boogeman',\n", + " 'weighs',\n", + " 'danniele',\n", + " 'converging',\n", + " 'hypothermia',\n", + " 'glorfindel',\n", + " 'birthdays',\n", + " 'attentive',\n", + " 'mallepa',\n", + " 'spacewalk',\n", + " 'manoy',\n", + " 'bombshells',\n", + " 'farts',\n", + " 'lyoko',\n", + " 'southron',\n", + " 'destruction',\n", + " 'flemming',\n", + " 'manhole',\n", + " 'elainor',\n", + " 'bowersock',\n", + " 'lowly',\n", + " 'wfst',\n", + " 'limousines',\n", + " 'skolimowski',\n", + " 'saban',\n", + " 'koen',\n", + " 'malaysia',\n", + " 'uwi',\n", + " 'cyd',\n", + " 'apeing',\n", + " 'bonecrushing',\n", + " 'dini',\n", + " 'merest',\n", + " 'janina',\n", + " 'chemotrodes',\n", + " 
'trials',\n", + " 'authorize',\n", + " 'whilhelm',\n", + " 'asthmatic',\n", + " 'broads',\n", + " 'missteps',\n", + " 'embittered',\n", + " 'chandeliers',\n", + " 'seeming',\n", + " 'miscalculate',\n", + " 'recommeded',\n", + " 'schoolwork',\n", + " 'coy',\n", + " 'mcconaughey',\n", + " 'philosophically',\n", + " 'waver',\n", + " 'fanny',\n", + " 'mestressat',\n", + " 'unwatchably',\n", + " 'saggy',\n", + " 'topness',\n", + " 'dwellings',\n", + " 'breakup',\n", + " 'hasselhoff',\n", + " 'superstars',\n", + " 'replay',\n", + " 'aggravates',\n", + " 'balances',\n", + " 'urging',\n", + " 'snidely',\n", + " 'aleksandar',\n", + " 'hildy',\n", + " 'kazuhiro',\n", + " 'slayer',\n", + " 'tangy',\n", + " 'brussels',\n", + " 'horne',\n", + " 'masayuki',\n", + " 'molden',\n", + " 'unravel',\n", + " 'goodtime',\n", + " 'interrogates',\n", + " 'bismillahhirrahmannirrahim',\n", + " 'rowboat',\n", + " 'dumann',\n", + " 'datedness',\n", + " 'astrotheology',\n", + " 'dekhiye',\n", + " 'valga',\n", + " 'kata',\n", + " 'wipes',\n", + " 'hostilities',\n", + " 'sentimentalising',\n", + " 'documentary',\n", + " 'salesman',\n", + " 'virtue',\n", + " 'unreasonably',\n", + " 'haver',\n", + " 'cei',\n", + " 'unglamorised',\n", + " 'balky',\n", + " 'complementary',\n", + " 'paychecks',\n", + " 'mnica',\n", + " 'wada',\n", + " 'ily',\n", + " 'prc',\n", + " 'ennobling',\n", + " 'functionality',\n", + " 'dissociated',\n", + " 'elk',\n", + " 'throbbing',\n", + " 'tempe',\n", + " 'linoleum',\n", + " 'photogrsphed',\n", + " 'bottacin',\n", + " 'hipper',\n", + " 'titillating',\n", + " 'barging',\n", + " 'untie',\n", + " 'sacchetti',\n", + " 'gnat',\n", + " 'roedel',\n", + " 'cohabitation',\n", + " 'performs',\n", + " 'sales',\n", + " 'migrs',\n", + " 'teachs',\n", + " 'nanavati',\n", + " 'fresco',\n", + " 'davison',\n", + " 'obstinate',\n", + " 'burglar',\n", + " 'masue',\n", + " 'dickory',\n", + " 'grills',\n", + " 'appelagate',\n", + " 'linkage',\n", + " 'enables',\n", + " 'loesser',\n", + " 
'patties',\n", + " 'prudent',\n", + " 'mallorquins',\n", + " 'nativetex',\n", + " 'suprise',\n", + " 'drippy',\n", + " 'quill',\n", + " 'speeded',\n", + " 'farscape',\n", + " 'saddening',\n", + " 'centuries',\n", + " 'mos',\n", + " 'improvisationally',\n", + " 'neccessarily',\n", + " 'transmitter',\n", + " 'tankers',\n", + " 'latte',\n", + " 'mechanisation',\n", + " 'faracy',\n", + " 'synthetically',\n", + " 'thoughtless',\n", + " 'rake',\n", + " 'ropes',\n", + " 'desirable',\n", + " 'whitewashed',\n", + " 'donal',\n", + " 'crabby',\n", + " 'lifeless',\n", + " 'perfidy',\n", + " 'teresa',\n", + " 'bulldog',\n", + " 'cockamamie',\n", + " 'rasberries',\n", + " 'notethe',\n", + " 'captivity',\n", + " 'chiseling',\n", + " 'smaller',\n", + " 'clampets',\n", + " 'alerts',\n", + " 'tough',\n", + " 'wellingtonian',\n", + " 'aaaahhhhhhh',\n", + " 'dither',\n", + " 'incertitude',\n", + " 'florentine',\n", + " 'imperioli',\n", + " 'licking',\n", + " 'disparagement',\n", + " 'artfully',\n", + " 'feds',\n", + " 'fumiya',\n", + " 'tearfully',\n", + " 'lanchester',\n", + " 'undertaken',\n", + " 'longlost',\n", + " 'netted',\n", + " 'carrell',\n", + " 'uncompelling',\n", + " 'reliefs',\n", + " 'leona',\n", + " 'autorenfilm',\n", + " 'unfriendly',\n", + " 'typewriter',\n", + " 'shifted',\n", + " 'bertrand',\n", + " 'blesses',\n", + " 'tricking',\n", + " 'fireflies',\n", + " 'zanes',\n", + " 'unknowingly',\n", + " 'unnerve',\n", + " 'caning',\n", + " 'flat',\n", + " 'recluse',\n", + " 'dcreasy',\n", + " 'chipmunk',\n", + " 'dipper',\n", + " 'musee',\n", + " 'cousin',\n", + " 'shys',\n", + " 'berserkers',\n", + " 'eve',\n", + " 'conflagration',\n", + " 'irks',\n", + " 'restricts',\n", + " 'parsing',\n", + " 'positronic',\n", + " 'copout',\n", + " 'khala',\n", + " 'swiftness',\n", + " 'higginson',\n", + " 'imprint',\n", + " 'walter',\n", + " 'sundance',\n", + " 'whispering',\n", + " 'thematically',\n", + " 'underimpressed',\n", + " 'uno',\n", + " 'expressly',\n", + " 'russkies',\n", + 
" 'discos',\n", + " 'shaping',\n", + " 'verson',\n", + " 'prototype',\n", + " 'chapman',\n", + " 'trafficker',\n", + " 'semetary',\n", + " 'unrealistically',\n", + " 'lifewell',\n", + " 'rivas',\n", + " 'consequent',\n", + " 'katsu',\n", + " 'titantic',\n", + " 'jalees',\n", + " 'ranee',\n", + " 'shipbuilding',\n", + " 'gambles',\n", + " 'dispenses',\n", + " 'disfigurement',\n", + " 'bright',\n", + " 'cristian',\n", + " 'puertorricans',\n", + " 'constituent',\n", + " 'capta',\n", + " 'jewel',\n", + " 'erect',\n", + " 'farah',\n", + " 'despondently',\n", + " 'avoide',\n", + " 'inconnu',\n", + " 'headquarters',\n", + " 'sanguisga',\n", + " ...]" + ] + }, + "execution_count": 75, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "list(vocab)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 0., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 46, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "import numpy as np\n", + "\n", + "layer_0 = np.zeros((1,vocab_size))\n", + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUg… [base64-encoded PNG image data omitted]
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
uAvnnHFNwOHMTNt4N/jZBhX+cHnJMrbATVLUQSIlAIywivKHx0GOKglgi\nZ5xxho1t4Jw4Gdj9wd3HgAdm1dJNPxe3oZuOZbSXYFXEicgrrFBieqxICTvzpikbcsVbe90Fnw8G\nTawQzockqc6kdzsJJ83jpyMv8XUWLlzYCUgWhHzvGQsEAuULRCZYWmwee+wxax1Br9mzZ9vYJH66\nfh2LhPQLadUjBApGoMi1wEWWxV4sQVOHfQIi0qkiICcjQrSH0xOvwxc/fodflp+GY78cYntECfld\nunA97jzf4RDx/rVwvrg4ItSfpb1RertzxFQI3hrdz9Z8VxniPQuIgb9Q5vDqafdKcTFW6HdiyBQZ\nV4N2VLnXjOKEZLn7lEcI1AOB2k7N4FcRDNSxsS7wq1i3bp1NE+wZE4zvfxQiofKWniUK6h9LKeaI\nMOa8fWLNcYK+AdFKpV/R7XWm6zwrOVx76vS9atUqc+CBB9ZJpa664DdCX6R1YiUfH2cF6FYJlgsc\ngKdOnWqj6bL65tJLL+1pAelWZvgauhCldfPmzTZ0/DXXXGOnbfpxf7k63D0d1k2/hYAQqDcCo+BD\n9VZR2pWFAL4BbNde123a07Z7w4YNJtiAzQ6GafPWIT1kJK0Ta68pGq5DyPfff3/ry9HPwRrfkc9/\n/vMmsL5Y4lMGxpAQ2gQRkggBIdBMBGprEWkmnM3SGufD5cuXN0vpLto+99xz1uehS5JaX8rixNpt\nSS99ixVkzpw5dk+YfpIQgMbqgvVlzZo11iG2CAdbvwNFQnw0dCwEmouALCLN7bvcmjMw4BgazK8X\naqbPrVjGAljaysZtaZ0/M1ZXWjamW+ibpO1w0zM+0bjiiivMypUra4EHbYEMvfnmm2bt2rWFWC9E\nQkq7/VSwEOg7ArKI9B3y+lSIOZupjGXLltVHqYyasAx19OjRiQfvjNX0JRuEgk9SvxHSMtjzQSAh\nrCjDGpGUzJTZMO4zdvjFT2rKlCkdPbPWKRKSFTnlEwL1REAWkXr2S9+0YrDDfM+g1eR59g984APm\nkksuMTNmzOgbdv2oiP5J6jdCWhyjISFFWR6KbqMjSVn1EwkpukdUnhCoHgFZRKrvg0o1wMdg4sSJ\n5u67765UjzyVYw3B55qYJgzG/idPuXXIm8ZvhGBuTMd8/etfry2pvPHGG81+++1nzj///NTwioSk\nhkwZhEAjEJBFpBHdVK6SDNxYRfj2/QzKrbW40g899FC7ZJTlo2GhTb7gE1OH6QpfpyTHvfxGGKSx\nnNRlOqZbm5hCYorm2GOPNfPmzeuWtHNNJKQDhQ6EQOsQEBFpXZdmaxAm87ffftvO5WcroZpcLBFl\nVQZOqkmEQZDB2pekUx9+niqOne5YSXzhPMuwcQhtylJsiAXh4em7cHv8tnEsEhJGRL+FQLsQEBFp\nV39mbg2DGQPyrbfeWlmI7rTKF2UFoJzwjsi9Bse0uhaZHiuPT54IVrZlyxa7RLfIesoui+XFixYt\n6hr3JdzWsnVS+UJACPQfARGR/mNe2xp56DNF04QlsGVbAcJTOiwNrtO0FeTJORejW1OXYGMV4Z4j\n5khY6IM6E8KwvvotBIRANgRERLLh1tpcTHXcddddZuPGjZ2Bro6NZQBjOSjOj/0QfDQY7H3xrRL+\n+X4do9MXv/hFs++++yb2teiXbknrceQ3vGpLJCQpgkonBJqPgIhI8/uw8BbkXWJZuEKhAuuiX3hK\np9+OsBARrCHoccABB4RQas7P008/3e6B46wiIiHN6TtpKgSKQEBEpAgUW1hGXQZ7H1qmY9hMra5x\nMtAv7Ahb5pQOfYT0yyrk90WRxxAP9sNhwzyRkCKRVVlCoBkIiIg0o58q0ZKBbv369TZIVtVLXhnk\nWfKJZA2GVQWIUVM6Rfk9QHKa4M+TBHeWYF9wwQXm4osvTpJcaYSAEGgRAu9oUVvUlIIR4
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 47, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "{'': 0,\n", + " 'inhabitants': 1,\n", + " 'goku': 2,\n", + " 'stunts': 3,\n", + " 'catepillar': 4,\n", + " 'kristensen': 5,\n", + " 'goddess': 7,\n", + " 'offing': 49797,\n", + " 'distroy': 8,\n", + " 'unexplainably': 9,\n", + " 'concoctions': 10,\n", + " 'petite': 11,\n", + " 'paramilitary': 24759,\n", + " 'scribe': 12,\n", + " 'stevson': 13,\n", + " 'senegal': 6,\n", + " 'sctv': 14,\n", + " 'soundscape': 15,\n", + " 'rana': 16,\n", + " 'immortalizer': 18,\n", + " 'rene': 67354,\n", + " 'eko': 23,\n", + " 'planning': 20,\n", + " 'akiva': 21,\n", + " 'plod': 22,\n", + " 'orderly': 24,\n", + " 'zeleznice': 25,\n", + " 'critize': 29,\n", + " 'baguettes': 25649,\n", + " 'jefferies': 30,\n", + " 'uncertainties': 61695,\n", + " 'mountainbillies': 31,\n", + " 'steinbichler': 32,\n", + " 'vowel': 33,\n", + " 'rafe': 34,\n", + " 'donig': 68719,\n", + " 
'tulipe': 36,\n", + " 'clot': 37,\n", + " 'hack': 12526,\n", + " 'distended': 38,\n", + " 'cornered': 37116,\n", + " 'impatiently': 40,\n", + " 'batrice': 12525,\n", + " 'unfortuntly': 41,\n", + " 'lung': 42,\n", + " 'scapegoats': 43,\n", + " 'pscychosexual': 45,\n", + " 'outbid': 46,\n", + " 'obit': 47,\n", + " 'sideshows': 48,\n", + " 'jugde': 49,\n", + " 'kevloun': 51,\n", + " 'quartier': 53,\n", + " 'harp': 61948,\n", + " 'unravelling': 54,\n", + " 'antiques': 56,\n", + " 'strutts': 57,\n", + " 'tilts': 58,\n", + " 'disconcert': 59,\n", + " 'dossiers': 60,\n", + " 'sorriest': 61,\n", + " 'craftsman': 49412,\n", + " 'blart': 62,\n", + " 'dependence': 37120,\n", + " 'sated': 61698,\n", + " 'iberia': 63,\n", + " 'sagan': 72,\n", + " 'frmann': 65,\n", + " 'daniell': 66,\n", + " 'rays': 67,\n", + " 'pried': 68,\n", + " 'khoobsurat': 69,\n", + " 'leavitt': 70,\n", + " 'caiano': 71,\n", + " 'attractiveness': 73,\n", + " 'kitaparaporn': 74,\n", + " 'hamilton': 75,\n", + " 'massages': 76,\n", + " 'horgan': 78,\n", + " 'chemist': 79,\n", + " 'audrey': 80,\n", + " 'yeow': 55655,\n", + " 'jana': 81,\n", + " 'dutch': 82,\n", + " 'pinchot': 24773,\n", + " 'override': 83,\n", + " 'dwervick': 63223,\n", + " 'spasms': 84,\n", + " 'resumed': 85,\n", + " 'tamale': 66259,\n", + " 'calibanian': 49636,\n", + " 'stinson': 86,\n", + " 'widows': 87,\n", + " 'stonewall': 88,\n", + " 'palatial': 89,\n", + " 'neuman': 90,\n", + " 'abandon': 91,\n", + " 'lemmings': 65314,\n", + " 'anglophile': 92,\n", + " 'ertha': 61706,\n", + " 'chevette': 94,\n", + " 'unscary': 95,\n", + " 'spoilerific': 97,\n", + " 'neworleans': 67639,\n", + " 'metamorphose': 17,\n", + " 'brigand': 99,\n", + " 'cheating': 41603,\n", + " 'clued': 101,\n", + " 'dermatonecrotic': 102,\n", + " 'grady': 103,\n", + " 'mulligan': 104,\n", + " 'ol': 105,\n", + " 'incubation': 107,\n", + " 'plaintiffs': 110,\n", + " 'snden': 109,\n", + " 'fk': 111,\n", + " 'deply': 112,\n", + " 'franchot': 113,\n", + " 'henstridge': 19,\n", + " 
'cyhper': 114,\n", + " 'verbose': 26,\n", + " 'mazovia': 116,\n", + " 'elizabeth': 117,\n", + " 'palestine': 118,\n", + " 'robby': 119,\n", + " 'wongo': 120,\n", + " 'moshing': 121,\n", + " 'mstified': 12543,\n", + " 'eeeee': 122,\n", + " 'doltish': 123,\n", + " 'bree': 124,\n", + " 'postponed': 125,\n", + " 'debacles': 127,\n", + " 'amplify': 27,\n", + " 'kamm': 128,\n", + " 'phantom': 18893,\n", + " 'boylen': 136,\n", + " 'rolando': 131,\n", + " 'premises': 133,\n", + " 'bruck': 134,\n", + " 'loosely': 135,\n", + " 'wodehousian': 139,\n", + " 'onishi': 70389,\n", + " 'encapsuling': 140,\n", + " 'partly': 141,\n", + " 'stadling': 144,\n", + " 'calms': 143,\n", + " 'darkie': 148,\n", + " 'wheeling': 147,\n", + " 'ursla': 15875,\n", + " 'subsidized': 49420,\n", + " 'mckellar': 149,\n", + " 'ooookkkk': 151,\n", + " 'milky': 152,\n", + " 'unfolded': 153,\n", + " 'degrades': 154,\n", + " 'authenticating': 155,\n", + " 'writeup': 12548,\n", + " 'rotheroe': 156,\n", + " 'beart': 157,\n", + " 'intoxicants': 160,\n", + " 'grispin': 159,\n", + " 'cannes': 61718,\n", + " 'antithetical': 70398,\n", + " 'nnette': 161,\n", + " 'tsukamoto': 163,\n", + " 'antwones': 44205,\n", + " 'stows': 164,\n", + " 'suddenness': 165,\n", + " 'vol': 61720,\n", + " 'waqt': 166,\n", + " 'camazotz': 168,\n", + " 'paps': 55042,\n", + " 'shakher': 170,\n", + " 'terminate': 63868,\n", + " 'kotex': 56419,\n", + " 'delinquency': 171,\n", + " 'bromwell': 25214,\n", + " 'insecticide': 173,\n", + " 'charlton': 174,\n", + " 'nakada': 177,\n", + " 'titted': 24791,\n", + " 'urbane': 178,\n", + " 'depicted': 54491,\n", + " 'sadomasochistic': 179,\n", + " 'hyping': 181,\n", + " 'yr': 182,\n", + " 'hebert': 183,\n", + " 'waxwork': 12990,\n", + " 'deathrow': 185,\n", + " 'nourishes': 24792,\n", + " 'unmediated': 187,\n", + " 'tamper': 37143,\n", + " 'soad': 190,\n", + " 'alphabet': 189,\n", + " 'donen': 191,\n", + " 'lord': 192,\n", + " 'recess': 193,\n", + " 'watchably': 61023,\n", + " 'handsome': 194,\n", + " 
'vignettes': 196,\n", + " 'pairings': 198,\n", + " 'uselful': 199,\n", + " 'sanders': 200,\n", + " 'outbursts': 72891,\n", + " 'nots': 201,\n", + " 'hatsumomo': 202,\n", + " 'actioned': 18292,\n", + " 'krimi': 24797,\n", + " 'appleby': 203,\n", + " 'tampax': 204,\n", + " 'sprinkling': 205,\n", + " 'defacing': 206,\n", + " 'lofty': 207,\n", + " 'verger': 213,\n", + " 'tablespoons': 211,\n", + " 'bernhard': 212,\n", + " 'goosebump': 64565,\n", + " 'acumen': 214,\n", + " 'percentages': 215,\n", + " 'wendingo': 216,\n", + " 'resonating': 217,\n", + " 'vntoarea': 218,\n", + " 'redundancies': 219,\n", + " 'strictly': 57081,\n", + " 'pitied': 221,\n", + " 'belying': 222,\n", + " 'michelangelo': 53153,\n", + " 'gleefulness': 223,\n", + " 'environmentalist': 24803,\n", + " 'gitane': 226,\n", + " 'corrected': 66547,\n", + " 'journalist': 227,\n", + " 'focusing': 228,\n", + " 'plethora': 229,\n", + " 'his': 39,\n", + " 'citizen': 230,\n", + " 'south': 55579,\n", + " 'clunkers': 232,\n", + " 'pendulous': 55991,\n", + " 'mounds': 24805,\n", + " 'deplorable': 233,\n", + " 'forgive': 234,\n", + " 'proplems': 235,\n", + " 'bankers': 237,\n", + " 'aqua': 238,\n", + " 'donated': 239,\n", + " 'disbelieving': 240,\n", + " 'acomplication': 241,\n", + " 'contrasted': 243,\n", + " 'muzzle': 44,\n", + " 'amphibians': 72141,\n", + " 'springs': 246,\n", + " 'reformatted': 49443,\n", + " 'toolbox': 247,\n", + " 'contacting': 248,\n", + " 'washrooms': 250,\n", + " 'raving': 251,\n", + " 'dynamism': 252,\n", + " 'mae': 253,\n", + " 'disharmony': 255,\n", + " 'molls': 72979,\n", + " 'dewaere': 12569,\n", + " 'untutored': 256,\n", + " 'icarus': 257,\n", + " 'taint': 258,\n", + " 'kargil': 259,\n", + " 'captain': 260,\n", + " 'paucity': 261,\n", + " 'fits': 262,\n", + " 'tumbles': 263,\n", + " 'amer': 264,\n", + " 'bueller': 265,\n", + " 'cleansed': 267,\n", + " 'shara': 269,\n", + " 'humma': 270,\n", + " 'outa': 272,\n", + " 'piglets': 273,\n", + " 'gombell': 274,\n", + " 'supermen': 275,\n", + 
" 'superlow': 276,\n", + " 'kubanskie': 280,\n", + " 'goode': 278,\n", + " 'disorganised': 45570,\n", + " 'zenith': 281,\n", + " 'ananda': 282,\n", + " 'matlin': 284,\n", + " 'particolare': 50,\n", + " 'presumptuous': 286,\n", + " 'rerun': 287,\n", + " 'toyko': 288,\n", + " 'bilb': 291,\n", + " 'sundry': 290,\n", + " 'fugly': 292,\n", + " 'orchestrating': 293,\n", + " 'prosaically': 294,\n", + " 'moveis': 296,\n", + " 'conelly': 297,\n", + " 'estrange': 298,\n", + " 'elfriede': 49455,\n", + " 'masterful': 52,\n", + " 'seasonings': 300,\n", + " 'quincey': 303,\n", + " 'frowning': 49456,\n", + " 'painkillers': 53444,\n", + " 'high': 25515,\n", + " 'flesh': 304,\n", + " 'tootsie': 305,\n", + " 'ai': 306,\n", + " 'tenma': 307,\n", + " 'duguay': 71257,\n", + " 'appropriations': 308,\n", + " 'ides': 310,\n", + " 'rui': 61734,\n", + " 'surrogacy': 311,\n", + " 'pungent': 312,\n", + " 'damaso': 314,\n", + " 'authoritarian': 61736,\n", + " 'caribou': 315,\n", + " 'ro': 318,\n", + " 'supplying': 317,\n", + " 'yuy': 319,\n", + " 'debuted': 321,\n", + " 'mounts': 323,\n", + " 'interpolated': 324,\n", + " 'aetv': 325,\n", + " 'plummer': 326,\n", + " 'asunder': 331,\n", + " 'airfix': 333,\n", + " 'dubiel': 329,\n", + " 'clavichord': 330,\n", + " 'crafty': 50465,\n", + " 'sublety': 332,\n", + " 'stoltzfus': 334,\n", + " 'ruth': 335,\n", + " 'fluorescent': 336,\n", + " 'improves': 337,\n", + " 'russells': 339,\n", + " 'tick': 43838,\n", + " 'zsa': 341,\n", + " 'macs': 343,\n", + " 'jlb': 345,\n", + " 'locus': 348,\n", + " 'mislead': 349,\n", + " 'merly': 49461,\n", + " 'corey': 350,\n", + " 'blundered': 351,\n", + " 'humourless': 3568,\n", + " 'disorganized': 353,\n", + " 'discuss': 354,\n", + " 'sharifi': 45391,\n", + " 'tieing': 356,\n", + " 'kats': 34784,\n", + " 'bbc': 360,\n", + " 'pranked': 362,\n", + " 'superman': 363,\n", + " 'holroyd': 9223,\n", + " 'aggravated': 364,\n", + " 'rifleman': 365,\n", + " 'yvone': 366,\n", + " 'vaugier': 24820,\n", + " 'radiant': 367,\n", + " 
'galico': 368,\n", + " 'debris': 369,\n", + " 'btw': 371,\n", + " 'denote': 24822,\n", + " 'havnt': 372,\n", + " 'francen': 373,\n", + " 'chattered': 374,\n", + " 'scathed': 375,\n", + " 'pic': 376,\n", + " 'ceremonies': 377,\n", + " 'everyplace': 65309,\n", + " 'betsy': 379,\n", + " 'finster': 37176,\n", + " 'meercat': 381,\n", + " 'noirs': 382,\n", + " 'grunts': 383,\n", + " 'tribulations': 385,\n", + " 'apparatus': 47673,\n", + " 'martnez': 25825,\n", + " 'telethons': 24825,\n", + " 'talladega': 387,\n", + " 'alloimono': 390,\n", + " 'situations': 64,\n", + " 'scrutinising': 391,\n", + " 'geta': 392,\n", + " 'beltrami': 393,\n", + " 'pvc': 394,\n", + " 'horse': 395,\n", + " 'tiburon': 396,\n", + " 'huitime': 397,\n", + " 'ripple': 398,\n", + " 'exceed': 61748,\n", + " 'loitering': 399,\n", + " 'forensics': 400,\n", + " 'nearly': 401,\n", + " 'ellington': 403,\n", + " 'uzi': 404,\n", + " 'rung': 408,\n", + " 'pillaged': 24829,\n", + " 'gao': 409,\n", + " 'licitates': 410,\n", + " 'protocol': 411,\n", + " 'smirker': 412,\n", + " 'torin': 413,\n", + " 'vizier': 31853,\n", + " 'newlywed': 414,\n", + " 'dismay': 416,\n", + " 'moonwalks': 418,\n", + " 'skyler': 417,\n", + " 'invested': 18455,\n", + " 'grifter': 421,\n", + " 'undersold': 422,\n", + " 'chearator': 423,\n", + " 'marino': 424,\n", + " 'scala': 425,\n", + " 'conditioner': 426,\n", + " 'lamarre': 428,\n", + " 'figueroa': 429,\n", + " 'mcinnerny': 61753,\n", + " 'allllllll': 431,\n", + " 'slide': 432,\n", + " 'lateness': 433,\n", + " 'selbst': 434,\n", + " 'dramatizing': 436,\n", + " 'doable': 438,\n", + " 'hollywoodize': 27207,\n", + " 'alexanderplatz': 440,\n", + " 'wholesome': 45745,\n", + " 'pandemonium': 441,\n", + " 'earth': 443,\n", + " 'mounties': 444,\n", + " 'seeker': 445,\n", + " 'cheat': 446,\n", + " 'outbreaks': 447,\n", + " 'savagely': 61759,\n", + " 'snowstorm': 448,\n", + " 'baur': 449,\n", + " 'schedules': 450,\n", + " 'bathetic': 451,\n", + " 'johnathon': 453,\n", + " 'origonal': 57843,\n", 
+ " 'rosanne': 454,\n", + " 'cauldrons': 456,\n", + " 'forrest': 457,\n", + " 'poky': 458,\n", + " 'aristos': 54856,\n", + " 'womanness': 460,\n", + " 'spender': 461,\n", + " 'pagliai': 37108,\n", + " 'rational': 463,\n", + " 'terrell': 464,\n", + " 'affronts': 472,\n", + " 'concise': 49476,\n", + " 'mathew': 468,\n", + " 'narnia': 469,\n", + " 'naseeruddin': 470,\n", + " 'bucks': 471,\n", + " 'proceeds': 69809,\n", + " 'topple': 473,\n", + " 'degree': 474,\n", + " 'passionately': 476,\n", + " 'defeats': 477,\n", + " 'gras': 49477,\n", + " 'sources': 479,\n", + " 'pflug': 49976,\n", + " 'botticelli': 480,\n", + " 'fwd': 486,\n", + " 'waiving': 483,\n", + " 'gunnar': 484,\n", + " 'stiffler': 485,\n", + " 'unwise': 49480,\n", + " 'kawajiri': 487,\n", + " 'sistahs': 489,\n", + " 'swallowed': 30511,\n", + " 'soulhunter': 490,\n", + " 'belies': 491,\n", + " 'wrathful': 492,\n", + " 'badmouth': 16696,\n", + " 'floradora': 61766,\n", + " 'unforgivably': 497,\n", + " 'weirdy': 496,\n", + " 'violation': 63309,\n", + " 'chepart': 498,\n", + " 'departmentthe': 500,\n", + " 'posehn': 49483,\n", + " 'peyote': 37188,\n", + " 'psychiatrically': 24846,\n", + " 'marionettes': 503,\n", + " 'blatty': 502,\n", + " 'atop': 504,\n", + " 'debases': 25135,\n", + " 'henze': 24845,\n", + " 'unrooted': 510,\n", + " 'cloudscape': 508,\n", + " 'resignedly': 509,\n", + " 'begin': 49917,\n", + " 'hitlerian': 512,\n", + " 'reedus': 517,\n", + " 'crewed': 514,\n", + " 'bedeviled': 515,\n", + " 'unfurnished': 516,\n", + " 'herrmann': 12602,\n", + " 'circumstances': 518,\n", + " 'grasped': 519,\n", + " 'fn': 521,\n", + " 'beefed': 22200,\n", + " 'scwatch': 64018,\n", + " 'dishwashers': 522,\n", + " 'roadie': 523,\n", + " 'ruthlessness': 524,\n", + " 'migrant': 12605,\n", + " 'refrains': 525,\n", + " 'preponderance': 44377,\n", + " 'lampooning': 526,\n", + " 'richart': 528,\n", + " 'gwenneth': 530,\n", + " 'enmity': 531,\n", + " 'vortex': 61772,\n", + " 'assess': 532,\n", + " 'manufacturer': 533,\n", 
+ " 'bullosa': 534,\n", + " 'citizenship': 61774,\n", + " 'chekov': 537,\n", + " 'hogan': 536,\n", + " 'blithe': 538,\n", + " 'aredavid': 542,\n", + " 'drillings': 540,\n", + " 'revolvers': 541,\n", + " 'boyfriendhe': 545,\n", + " 'achcha': 544,\n", + " 'wallow': 546,\n", + " 'toga': 547,\n", + " 'bosnians': 551,\n", + " 'going': 550,\n", + " 'willy': 552,\n", + " 'fim': 554,\n", + " 'forbidding': 555,\n", + " 'delete': 56779,\n", + " 'rationalised': 557,\n", + " 'shimomo': 558,\n", + " 'opposition': 559,\n", + " 'landis': 560,\n", + " 'minded': 561,\n", + " 'arghhhhh': 564,\n", + " 'trialat': 566,\n", + " 'protected': 567,\n", + " 'negras': 568,\n", + " 'tracker': 571,\n", + " 'muti': 570,\n", + " 'dinky': 49489,\n", + " 'shawl': 572,\n", + " 'differentiates': 573,\n", + " 'dipaolo': 61779,\n", + " 'sweetheart': 574,\n", + " 'manmohan': 576,\n", + " 'enamored': 66265,\n", + " 'trevethyn': 577,\n", + " 'brain': 578,\n", + " 'incomprehensibly': 579,\n", + " 'pasadena': 581,\n", + " 'bruton': 59142,\n", + " 'shtick': 582,\n", + " 'ute': 583,\n", + " 'viggo': 584,\n", + " 'relevent': 589,\n", + " 'cites': 587,\n", + " 'greenaways': 61781,\n", + " 'minidress': 590,\n", + " 'philosopher': 591,\n", + " 'mahattan': 593,\n", + " 'moden': 594,\n", + " 'compiling': 595,\n", + " 'unimaginative': 598,\n", + " 'rogues': 597,\n", + " 'subpaar': 599,\n", + " 'darkly': 601,\n", + " 'saturate': 602,\n", + " 'fledgling': 603,\n", + " 'breaths': 604,\n", + " 'sceam': 37206,\n", + " 'empathized': 58870,\n", + " 'aszombi': 606,\n", + " 'incalculable': 608,\n", + " 'formations': 28596,\n", + " 'hampden': 619,\n", + " 'rawail': 612,\n", + " 'forbid': 613,\n", + " 'holiness': 617,\n", + " 'unessential': 618,\n", + " 'reputedly': 616,\n", + " 'wage': 63181,\n", + " 'kewpie': 24860,\n", + " 'asylum': 620,\n", + " 'bolye': 621,\n", + " 'celticism': 63189,\n", + " 'strangers': 622,\n", + " 'rantzen': 623,\n", + " 'farrellys': 624,\n", + " 'marathon': 93,\n", + " 'cantinflas': 626,\n", + " 
'disproportionately': 12617,\n", + " 'bared': 67212,\n", + " 'enshrined': 627,\n", + " 'expetations': 629,\n", + " 'replaying': 630,\n", + " 'topless': 636,\n", + " 'bukater': 632,\n", + " 'overpaid': 633,\n", + " 'exhude': 634,\n", + " 'nitwits': 638,\n", + " 'tsst': 51554,\n", + " 'sufferings': 637,\n", + " 'ci': 24693,\n", + " 'eponymously': 96,\n", + " 'ferdy': 644,\n", + " 'danira': 641,\n", + " 'unrelenting': 642,\n", + " 'disabling': 643,\n", + " 'gerard': 645,\n", + " 'drewitt': 646,\n", + " 'lamping': 650,\n", + " 'demy': 652,\n", + " 'wicklow': 37214,\n", + " 'relinquish': 651,\n", + " 'feminized': 64196,\n", + " 'drink': 653,\n", + " 'chamberlin': 654,\n", + " 'floodwaters': 657,\n", + " 'searing': 658,\n", + " 'isral': 659,\n", + " 'ling': 660,\n", + " 'grossness': 661,\n", + " 'sassier': 24865,\n", + " 'pickier': 662,\n", + " 'pax': 663,\n", + " 'fleashens': 98,\n", + " 'wierd': 664,\n", + " 'tereasa': 665,\n", + " 'smog': 666,\n", + " 'girotti': 667,\n", + " 'zooey': 64814,\n", + " 'spat': 668,\n", + " 'sera': 669,\n", + " 'misbehaving': 671,\n", + " 'scouts': 672,\n", + " 'refreshments': 673,\n", + " 'itll': 39668,\n", + " 'toyomichi': 676,\n", + " 'politeness': 100,\n", + " 'bits': 677,\n", + " 'psychotics': 678,\n", + " 'optimistic': 61796,\n", + " 'barzell': 679,\n", + " 'colt': 680,\n", + " 'anita': 49501,\n", + " 'shivering': 681,\n", + " 'utah': 59297,\n", + " 'scrivener': 686,\n", + " 'predicable': 687,\n", + " 'dryer': 684,\n", + " 'reissues': 685,\n", + " 'sexier': 26115,\n", + " 'spellbind': 691,\n", + " 'marmalade': 689,\n", + " 'seems': 690,\n", + " 'wyke': 37223,\n", + " 'innovator': 693,\n", + " 'inthused': 695,\n", + " 'scatman': 6309,\n", + " 'contestants': 696,\n", + " 'bertolucci': 106,\n", + " 'serviced': 699,\n", + " 'nozires': 700,\n", + " 'ins': 701,\n", + " 'mutilating': 702,\n", + " 'dupes': 703,\n", + " 'launius': 704,\n", + " 'widescreen': 705,\n", + " 'joo': 706,\n", + " 'discretionary': 707,\n", + " 'enlivens': 708,\n", + 
" 'manos': 55596,\n", + " 'bushes': 709,\n", + " 'header': 711,\n", + " 'activist': 712,\n", + " 'gethsemane': 713,\n", + " 'phoenixs': 714,\n", + " 'wreathed': 715,\n", + " 'oldboy': 108,\n", + " 'electrifyingly': 717,\n", + " 'inseparability': 24874,\n", + " 'ghidora': 719,\n", + " 'binder': 720,\n", + " 'tibet': 51530,\n", + " 'doddsville': 723,\n", + " 'sugar': 722,\n", + " 'porkys': 724,\n", + " 'hopefully': 37226,\n", + " 'scattershot': 725,\n", + " 'refunded': 726,\n", + " 'rudely': 727,\n", + " 'enacts': 67435,\n", + " 'insteadit': 728,\n", + " 'nightwatch': 61803,\n", + " 'eurotrash': 730,\n", + " 'radioraptus': 731,\n", + " 'unreservedly': 73710,\n", + " 'vall': 49508,\n", + " 'boogeman': 733,\n", + " 'flunked': 24880,\n", + " 'weighs': 734,\n", + " 'glorfindel': 738,\n", + " 'hypothermia': 737,\n", + " 'misled': 64919,\n", + " 'toiletries': 71501,\n", + " 'birthdays': 739,\n", + " 'attentive': 740,\n", + " 'mallepa': 741,\n", + " 'manoy': 743,\n", + " 'bombshells': 744,\n", + " 'glorifying': 115,\n", + " 'southron': 747,\n", + " 'destruction': 748,\n", + " 'manhole': 750,\n", + " 'elainor': 751,\n", + " 'bounder': 13003,\n", + " 'bowersock': 752,\n", + " 'lowly': 753,\n", + " 'wfst': 754,\n", + " 'limousines': 755,\n", + " 'skolimowski': 756,\n", + " 'saban': 757,\n", + " 'malaysia': 759,\n", + " 'cyd': 761,\n", + " 'bonecrushing': 763,\n", + " 'merest': 765,\n", + " 'janina': 766,\n", + " 'chemotrodes': 767,\n", + " 'trials': 768,\n", + " 'whilhelm': 770,\n", + " 'asthmatic': 771,\n", + " 'missteps': 773,\n", + " 'melyvn': 24885,\n", + " 'embittered': 774,\n", + " 'profit': 37234,\n", + " 'seeming': 776,\n", + " 'miscalculate': 777,\n", + " 'recommeded': 778,\n", + " 'mankin': 37235,\n", + " 'schoolwork': 779,\n", + " 'coy': 780,\n", + " 'mcconaughey': 781,\n", + " 'waver': 783,\n", + " 'unwatchably': 786,\n", + " 'saggy': 787,\n", + " 'breakup': 790,\n", + " 'pufnstuf': 37237,\n", + " 'superstars': 792,\n", + " 'replay': 793,\n", + " 'aggravates': 
794,\n", + " 'urging': 796,\n", + " 'snidely': 797,\n", + " 'aleksandar': 798,\n", + " 'hildy': 799,\n", + " 'kazuhiro': 800,\n", + " 'slayer': 801,\n", + " 'tangy': 802,\n", + " 'horne': 804,\n", + " 'masayuki': 805,\n", + " 'molden': 806,\n", + " 'unravel': 807,\n", + " 'goodtime': 808,\n", + " 'rowboat': 811,\n", + " 'dekhiye': 815,\n", + " 'datedness': 813,\n", + " 'astrotheology': 814,\n", + " 'suriani': 59610,\n", + " 'hostilities': 819,\n", + " 'wipes': 818,\n", + " 'sentimentalising': 820,\n", + " 'documentary': 821,\n", + " 'virtue': 823,\n", + " 'unreasonably': 824,\n", + " 'cei': 826,\n", + " 'hobbled': 37240,\n", + " 'unglamorised': 827,\n", + " 'balky': 828,\n", + " 'complementary': 829,\n", + " 'paychecks': 830,\n", + " 'tughlaq': 45551,\n", + " 'functionality': 836,\n", + " 'ily': 833,\n", + " 'prc': 834,\n", + " 'ennobling': 835,\n", + " 'dissociated': 837,\n", + " 'elk': 838,\n", + " 'throbbing': 839,\n", + " 'tempe': 840,\n", + " 'linoleum': 841,\n", + " 'bottacin': 843,\n", + " 'hipper': 844,\n", + " 'barging': 846,\n", + " 'untie': 847,\n", + " 'sacchetti': 848,\n", + " 'gnat': 849,\n", + " 'roedel': 850,\n", + " 'performs': 852,\n", + " 'nanavati': 856,\n", + " 'migrs': 854,\n", + " 'teachs': 855,\n", + " 'gunslinger': 126,\n", + " 'fresco': 857,\n", + " 'davison': 858,\n", + " 'jet': 59446,\n", + " 'burglar': 860,\n", + " 'jerker': 69267,\n", + " 'masue': 861,\n", + " 'dickory': 862,\n", + " 'muggy': 46634,\n", + " 'grills': 863,\n", + " 'figment': 28693,\n", + " 'monogamistic': 49527,\n", + " 'appelagate': 864,\n", + " 'linkage': 865,\n", + " 'loesser': 867,\n", + " 'patties': 868,\n", + " 'prudent': 869,\n", + " 'mallorquins': 870,\n", + " 'nativetex': 871,\n", + " 'suprise': 872,\n", + " 'quill': 874,\n", + " 'angsty': 71451,\n", + " 'speeded': 875,\n", + " 'farscape': 876,\n", + " 'herman': 129,\n", + " 'saddening': 877,\n", + " 'centuries': 878,\n", + " 'mos': 879,\n", + " 'neccessarily': 881,\n", + " 'tankers': 883,\n", + " 'latte': 
884,\n", + " 'faracy': 886,\n", + " 'stilts': 24897,\n", + " 'synthetically': 887,\n", + " 'thoughtless': 888,\n", + " 'authoring': 62813,\n", + " 'rake': 889,\n", + " 'ropes': 890,\n", + " 'whitewashed': 892,\n", + " 'donal': 893,\n", + " 'arching': 4910,\n", + " 'cockamamie': 899,\n", + " 'lifeless': 895,\n", + " 'perfidy': 896,\n", + " 'teresa': 897,\n", + " 'bulldog': 898,\n", + " 'vingh': 73726,\n", + " 'evacuees': 65858,\n", + " 'rasberries': 900,\n", + " 'chiseling': 903,\n", + " 'clampets': 905,\n", + " 'grecianized': 138,\n", + " 'smaller': 904,\n", + " 'kluznick': 62184,\n", + " 'alerts': 906,\n", + " 'aaaahhhhhhh': 909,\n", + " 'wellingtonian': 908,\n", + " 'dither': 910,\n", + " 'incertitude': 911,\n", + " 'florentine': 912,\n", + " 'imperioli': 913,\n", + " 'licking': 914,\n", + " 'disparagement': 915,\n", + " 'artfully': 916,\n", + " 'feds': 917,\n", + " 'fumiya': 918,\n", + " 'jbl': 52774,\n", + " 'tearfully': 919,\n", + " 'welfare': 24905,\n", + " 'idyllically': 49534,\n", + " 'isha': 43702,\n", + " 'lanchester': 920,\n", + " 'undertaken': 921,\n", + " 'longlost': 922,\n", + " 'netted': 923,\n", + " 'carrell': 924,\n", + " 'uncompelling': 925,\n", + " 'stems': 37258,\n", + " 'reliefs': 926,\n", + " 'leona': 927,\n", + " 'autorenfilm': 928,\n", + " 'unfriendly': 929,\n", + " 'typewriter': 930,\n", + " 'shifted': 931,\n", + " 'bertrand': 932,\n", + " 'blesses': 933,\n", + " 'leukemia': 12666,\n", + " 'posative': 142,\n", + " 'tricking': 934,\n", + " 'zanes': 936,\n", + " 'dashboard': 12667,\n", + " 'unknowingly': 937,\n", + " 'flatmates': 51897,\n", + " 'unnerve': 938,\n", + " 'caning': 939,\n", + " 'shortland': 146,\n", + " 'recluse': 941,\n", + " 'dcreasy': 942,\n", + " 'scratchiness': 24911,\n", + " 'pms': 30930,\n", + " 'chipmunk': 943,\n", + " 'tkachenko': 49537,\n", + " 'dipper': 944,\n", + " 'europeans': 61601,\n", + " 'berserkers': 948,\n", + " 'shys': 947,\n", + " 'monte': 68505,\n", + " 'eve': 949,\n", + " 'luxury': 61828,\n", + " 
'conflagration': 950,\n", + " 'water': 46389,\n", + " 'irks': 951,\n", + " 'positronic': 954,\n", + " 'cushy': 150,\n", + " 'swiftness': 957,\n", + " 'underimpressed': 964,\n", + " 'imprint': 959,\n", + " 'sundance': 961,\n", + " 'aida': 31951,\n", + " 'thematically': 963,\n", + " 'uno': 965,\n", + " 'expressly': 966,\n", + " 'russkies': 967,\n", + " 'discos': 968,\n", + " 'shaping': 969,\n", + " 'verson': 970,\n", + " 'blushed': 61831,\n", + " 'prototype': 971,\n", + " 'lifewell': 976,\n", + " 'trafficker': 973,\n", + " 'crucifixions': 62188,\n", + " 'unrealistically': 975,\n", + " 'rivas': 977,\n", + " 'consequent': 978,\n", + " 'katsu': 979,\n", + " 'titantic': 980,\n", + " 'jalees': 981,\n", + " 'ranee': 982,\n", + " 'gambles': 984,\n", + " 'dispenses': 985,\n", + " 'disfigurement': 986,\n", + " 'bright': 987,\n", + " 'cristian': 988,\n", + " 'subculture': 37268,\n", + " 'capta': 991,\n", + " 'jewel': 992,\n", + " 'erect': 993,\n", + " 'avoide': 996,\n", + " 'inconnu': 997,\n", + " 'headquarters': 998,\n", + " 'babbling': 1000,\n", + " 'pac': 1001,\n", + " 'performace': 1003,\n", + " 'dorrit': 1004,\n", + " 'runners': 1005,\n", + " 'sentimentality': 1006,\n", + " 'marred': 1007,\n", + " 'commemorative': 1008,\n", + " 'helpers': 1012,\n", + " 'chiles': 1011,\n", + " 'snowy': 1013,\n", + " 'cheddar': 1014,\n", + " 'neath': 158,\n", + " 'outshine': 1016,\n", + " 'nadu': 1019,\n", + " 'wellbeing': 1020,\n", + " 'envisioned': 43779,\n", + " 'fanaticism': 1021,\n", + " 'morrisette': 12687,\n", + " 'sesame': 1024,\n", + " 'gran': 1023,\n", + " 'marlina': 1025,\n", + " 'artificiality': 1030,\n", + " 'coinsidence': 1027,\n", + " 'founders': 1028,\n", + " 'dismissably': 1029,\n", + " 'dracht': 66299,\n", + " 'scavengers': 1031,\n", + " 'neese': 12685,\n", + " 'pangborn': 1034,\n", + " 'elmore': 1039,\n", + " 'bristol': 71162,\n", + " 'lillies': 1035,\n", + " 'parkers': 1036,\n", + " 'skipped': 1038,\n", + " 'clipboard': 1042,\n", + " 'jucier': 1041,\n", + " 'haifa': 
1043,\n", + " ...}" + ] + }, + "execution_count": 48, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "word2index = {}\n", + "\n", + "for i,word in enumerate(vocab):\n", + " word2index[word] = i\n", + "word2index" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_target_for_label(label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 54, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "1" + ] + }, + "execution_count": 52, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 55, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'NEGATIVE'" + 
] + }, + "execution_count": 55, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[1]" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 53, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[1])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 3: Building a Neural Network" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "- Start with your neural network from the last chapter\n", + "- 3-layer neural network\n", + "- no non-linearity in the hidden layer\n", + "- use our functions to create the training data\n", + "- create a \"pre_process_data\" function to build the vocabulary for our training-data-generating functions\n", + "- modify \"train\" to train over the entire corpus" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Where to Get Help if You Need It\n", + "- Re-watch the previous week's Udacity lectures\n", + "- Chapters 3-5 - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) - (40% off: **traskud17**)" + ] + }, + { + "cell_type": "code", + "execution_count": 86, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " # seed the random number generator so results are reproducible\n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews, labels)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self, reviews, labels):\n", + " \n", + " review_vocab = 
set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " \n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " self.layer_0[0][self.word2index[word]] += 1\n", + " \n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def train(self, training_reviews, 
training_labels):\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + " self.update_input_layer(review)\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward pass ###\n", + "\n", + " # TODO: Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error is the difference between desired target and actual output.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # TODO: Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # TODO: Update the weights\n", + " self.weights_1_2 -= layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " self.weights_0_1 -= self.layer_0.T.dot(layer_1_delta) * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / 
float(i+1))[:4] + \"%\")\n", + " if(i % 2500 == 0):\n", + " print(\"\")\n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \"% #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + " self.update_input_layer(review.lower())\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] > 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 87, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 61, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):587.5% #Correct:500 #Tested:1000 Testing Accuracy:50.0%" + ] + } + ], + "source": [ + "# evaluate our model before training (just to show how horrible it is)\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "code", + "execution_count": 62, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% 
Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):89.58 #Correct:1250 #Trained:2501 Training Accuracy:49.9%\n", + "Progress:20.8% Speed(reviews/sec):95.03 #Correct:2500 #Trained:5001 Training Accuracy:49.9%\n", + "Progress:27.4% Speed(reviews/sec):95.46 #Correct:3295 #Trained:6592 Training Accuracy:49.9%" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output 
weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 63, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.01)" + ] + }, + { + "cell_type": "code", + "execution_count": 64, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):96.39 #Correct:1247 #Trained:2501 Training Accuracy:49.8%\n", + "Progress:20.8% Speed(reviews/sec):99.31 #Correct:2497 #Trained:5001 Training Accuracy:49.9%\n", + "Progress:22.8% Speed(reviews/sec):99.02 #Correct:2735 #Trained:5476 Training Accuracy:49.9%" + ] + }, + { + "ename": 
"KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m \u001b[0;34m-=\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 65, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.001)" + ] + }, + { + "cell_type": "code", + "execution_count": 66, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):98.77 #Correct:1267 #Trained:2501 Training Accuracy:50.6%\n", + "Progress:20.8% Speed(reviews/sec):98.79 #Correct:2640 #Trained:5001 Training Accuracy:52.7%\n", + "Progress:31.2% Speed(reviews/sec):98.58 #Correct:4109 #Trained:7501 Training Accuracy:54.7%\n", + "Progress:41.6% Speed(reviews/sec):93.78 #Correct:5638 #Trained:10001 Training Accuracy:56.3%\n", + "Progress:52.0% Speed(reviews/sec):91.76 #Correct:7246 #Trained:12501 Training Accuracy:57.9%\n", + "Progress:62.5% 
Speed(reviews/sec):92.42 #Correct:8841 #Trained:15001 Training Accuracy:58.9%\n", + "Progress:69.4% Speed(reviews/sec):92.58 #Correct:9934 #Trained:16668 Training Accuracy:59.5%" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m 
\u001b[0;34m-=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Understanding Neural Noise" + ] + }, + { + "cell_type": "code", + "execution_count": 67, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
ITBiD8G9077fkJiJ5O56gR2HhXNgC4ltRSO/m48J5w7/z6hcuL+53Ezo7TveizvMGkFXy5M1S\nJ29wWB6cv0jZUzJxOjq/keXLl1v/kbRvlnHl6vzgIJDnfydP3jwI77XXXp3s3aYCOokSHhxxxBGd\nlElfaCEjzn+PzFE+ZlHnSEvAQCcXXnjhMP8LXrIdSXFpivomAJoTLCG+fv60DGkOOuggl9RgPUGv\nrJJmmixNHbmJSJaOdwoSzc2Zl7gRcKRBYJXOzOTSQkT8myXKx4JpDRcxDmchJI9+ru4k30V2dpL6\n8qZhXjYc8S9vmf4/Zdqy8uRNW5dLj+UBX4x+Tsm4ut238wc577zzrC6OGLnr+hYCvRDI87+TJ28v\nvbpdf9/73te5/OMf/7hznPfAd+TEZ8ONA5TLy60fNZWoq05cBFF+48fn5+PY+fa59FHfLKZwgzzT\nzZ/61KeikkWe86f2IxOETrrpGXfa6Rc1LYP/iHshZ2xFL//Zzzjsj53hKKtEq3biynG/C/sOzDK5\nhKU+gTKdj7+cNWj0sNgSQSM6dYWXDJEvvO6a374EJshOPdTpL/UML99lyRKSVT90de3y20SZcdf8\n8/7yXadHcJN0ykQvJ36+cJtJQ/1OFzDwxZ3nO6ynny587JZlhs+n/e0v7UIH+oG+TSqk9dtHGZSZ\nVVgemWZZcvDgGPrGN76Rtbpc+aKW87Jcl/P9FnAAO+4Lt0QXHMMfAqGRJvBRqETPfuNSdH0O37zl\n1u3/jvs2sOYlbpb/P8+zMiz+czvueeA/+xhrnPjPUz9N+Nh/BpM/fL3bb1dfeNzplsevD12j0ro0\nfvtJFyU+hq4sfzmvnydcnksf/ga7sPjjlj/mhtPl+R3dwpQlZul4n1TQUDd4+f9gYVCS3izhzsii\nn58nPMDHXcva2X55eYiIu6nczdytG4t6IEb9M6AHDxf6kutRH66Rxunsf4fx7taO8DUCbKXZYI3B\nl4G/34N/N8LRL32oB7yIawH+fDuiAS5RH9Jz70BQyJMnIFq47wbhN5imIcpxmNTt/y5tu8LPcvf8\nd+31n6VpiQhlxz1b3HMm6hnj1+nSue8w4XBEhG9/oHbp+WaMYyxy5yjDlygd3bM7rIufzx2DmSvb\nfXcjCnH3jMvLOOTaFVdHuJ9curzfhRCRtB2PtcI1nm//puh2jVopbAYAACaLSURBVMaGO8gvh2M6\nNwxWWv2oxycHvn69rmXpbL+utESk282MrnGS9sERVw5YR+kQ7pekv6P6L67uqPNpIjz6Az549Euo\nCwtEN0E3yEoZgjXDEQmIB7976ROnB21xAdEgJRCVrGXF1dGm8/QpOOWVuv3fpX0BoP3+cyM8gPrP\n+bRExGHLs9ivg2cQ5CDqGevycI363POKZzO6cN6d49sfYxizfMLh8lBmHIHhWjgf5VIX4ref83Hi\nt89/oY9LT51hndA3PMa5/PSLa3dcP7i0eb7jW5ih1KQd74MHCGEJW0vCLA0w/TQA1Q1MV35S/UhP\nea4Dwp3U7Rp503a2X17UPwn1O11oty/dbmY/XfiYgS6NKTWc3/+NDn6fOl3TflMGZeURBlgG1iQS\nJh/h30nKSJPGTX8kzZM2fa9yaR+DYFmEwREc7iusJpJoBPi/KIKs1en/DkILGUkj/mAbfq6lKacf\naf1nMAP+oIhPsBxJKqPthRKRMhRUmeUhwIBU5Fs3/6w+qUpKRMgTJntZW530IR9FOnwLSdb64/Ll\nsXCga56Bi7oJvw1BSDtYxLWn23n0hRByf0Xh3C3vIFxLQ5aT4FGH/7ssfY1VwU1rJHmbT4JF1jT+\nixTPI//ll5dDpyfPF9IOgtA/7hkOJmXKKAoPKpMMIAJXXHGFjXzKio0iBe/05557zgZ5Y437L37x\ni2HF/8Vf/IUhWixLnj/ykY8MW/I2LGHKHxs2bLDr93utBHDxQ4KBe
UQNxPLgfJEhy6MCl42ouMeJ\nrGWAybnnnmumT59u5s+fX2i7eqhsWJIcvOkaljWyZYPk9wjcfPPNNvzAjTfeWCgkVf3f8f9EAL6A\n8KZuDytSXETSgFCVGnujm3K+Ht3ScS2wDGSKl9Sr3Lpd9zEpvc1lshyVXW8E2KiqqA24qm4p0wKz\nZ8+2/gq9dOn1lt7req/y/etYnPJYM/yy0lpV8N3ACpJ0qsqvq6hjdOYe41MUDkXpVkU5YMAqrdGj\nR7cGjyz+IT72zhpR9lu3X2fUse8bEpCCjjXAP677FFJUu7Kc861V/bAAaWomSy+1JA8PRQYqBoum\nC235q7/6K/uQh0jEkYm48377KauIKSvqoqwihfKStIE5ewb/ItpRhP7oU/RUYBF69aMM+oA+4+P6\nAyyqJIhFtjtvW/B1cYN9UVO0WduHH4TvF+H0wsEzyn8vaz11z0c/0HampPxpqrL01tRMgPYgC9Mz\nSNFm4n5j+vDDD5tbbrllWKRDoqU6IQCQm26JmpJx6dx3t+kblybu2wUpK3O/mG6b5tGnRHRcu3Zt\np81xuvbzPHqxWRtTZ20OY8+9409TRG1uyLTVI488MmxH8X72RVF1cR9OnTp1WHuLKlvlDA4CIiKD\n09eRLXXzu8GbWq0GrUhlu5w89NBDrQ/EaaedFpkKchBMRdldKknAXjO9CEmWsO/gSV39GGij/Ebq\nSkJcpzgy0vT7zbXHffukN8m9xT0CQSFfr/vQ1VHH79NPP90cffTR5tJLL62jetKpIQiIiDSko8pU\nk8GBEL9NfZhgDbnmmmvM5s2bY2EKk4okb60UFs4XW0FwIYoYdEtfxDWf+OAEedddd5mNGzfWmlTW\nnSwl6Rf6Opgm6yTNYv2iv9gjhNDgTRT+N7CGtI1UNrEvmq6ziEjTe7AA/RnMeIvDnNy0tzPeLI86\n6iizcOFCc/zxx0eiQfuQbm2LG1gon/y9LBw8lKNM8JEKFXwSHbH2/PM//7N5+umne+pacPWZimP3\n0MCHpTGracCYAddJEX1NmZSzZs0au+rEld2Ub1lDmtJT9ddTRKT+fdQXDdnxeMuWLY17O0uidxqr\nhgObPE7+93//13z0ox+NtDK4ASrLG7ErP++3G9BuvfVWEzc1lbeOovND7sDs3nvvjSWQRdeZtjz/\nHsDHqBcZTVs+6fEVWbRoUe2tWOG2Ob27WSHDefRbCMQhICISh8yAnWcww7IwZ84cU3RckbKgZKDA\nNMx3nLWDa3lJAthE+ZcwmHKtjAEqDWZMdbCV+tKlS9Nkqzytm1Kry1QS/ek7mea9b5ICjGUhWHnS\nGOsQ1kMsWk215CTtF6XrHwIiIv3DuvY18YDBVIwJuurBtRdYSawADCxIHEnpVYd/nfooD1z4JmDb\nu9/9bhPEg6hsSgb93KDQjYz57ajbcdXmfXBzksTJ1KUt8pv7CdLTBIsW/wdTpkyxLwBN9Skrsu9U\nVjEIiIgUg2NrSuEt9ZJLLqn1Ekv3MAwCIHVddlyENcTvWEdseGv2fQTi/Ev8vGUdVz2Q520XfdRP\nh8cq+6obVi4CLkt6ua/rKjNnzrTWt6Y62NYV10HXS0Rk0O+AiPbXeVWDIyF77rlnV3+WokkIMFE3\nUzS9pq78t+yyfAvQx1lDmr5qoUwyRZ8V7WQK9mXIv//7v9upUVbS1NEiWefnQhn9oTL7h4CISP+w\nblRN7qGzePHi2jwUHQkByG7BupzloogpGddplEn9DBBpSE54ICzS/E8fsV9P0/dxcVYR3z/D4Z7l\nu19EMItucXnc/bV169ZaWiTd86Db/11c23ReCPRCQESkF0IDfJ2HT10iYfKg/vSnP232228/u8rA\nRUmN6p40RCEqf/gclgfqc8QGcoE+Wd5ayecPuP4UT7jebr/Rgby01enVLX3drxGQrtsS7G76hzHt\nl5NpN53SXEN/xPWjmx6tg88I9
xk+IYhIiIVBf0pA4E//XyAllKsiW4BAsNmRHfhZTbPvvvsaBosq\n5MEHH7R+BGeccYYdrN75znfGqlEGCWGA2GuvvTp1Uj9Levnupksng3cAocEq4j7btm0zP/7xjy2x\n+fWvfz2sHi/biMNvf/vbxr09j7jYwBPvete7zLPPPmu453oJg+Mrr7xiMWMQZ/oLUuYw7ZW/TtfD\nJATdDjzwQPu/NmvWLPPb3/7WfPzjH69EZf6XWA7O0vXbb789cvl6JYqp0tYhIItI67q0+AZhETjz\nzDPN/vvvb4gG6d7ciq9peIkMOCwnfuyxx6wVhCWO3awQUQ/14SWm+8WDuJvFomjSQ3t9f4Zu0zhY\nq5ocDTfcE/QdlgzfWuSn8Z1My/S78ess+7jX/cp1rIDIggULci9DT9oe7sOvfvWrlnxcd911PX2i\nkpardEIgFoGydtNTue1CgF1fb7rpJrsjIzupBgNGaQ10dQWEZ2jGjBmdHWw5HwzUsfUm2ZU2NrN3\ngXqSlpU0nVd84kMwpnz3QS8n7HjaDQuXrknffptcH0S1vUltitOVvk36P8T/Hf8LZf/foWvgjG3r\nmjZtWmL94tqo80IgKQImaUKlEwIgwMOTB2LAbO13kYMhZbuHLg/CqEHeDVDh3ohKG06T5Dc6pGkT\n6fn0Q9CLdr700ksW/37U2c86zj333KGrrrrKtjFNH/RTxyLqynLPcN/7/3dF3e+0h7L5v4MI8lm/\nfn0RzVQZQiAxAn8SayrRBSEQgQDTMjfeeGPHhE6ERT5M2TBVkVYwuRMumjKYiti+fbuN2Eicgiin\nQ3wsOE9dmJARTNjkzSvognSb/gnXAR7o4XQJXy/yN3rR9t/97ncmIGpFFl2Lsg4//HDzZ3/2Z7aN\nafqgFsonVKLXdExcMdz37v+OKTn8R/DZYouDLP936IFTLHFBmOrC52bJkiV248i4PZvidNN5IZAX\nAfmI5EVQ+Q3BmPDjCN6k7H41DJJjx461PgzAwxJTHnY7duywaO3atcume/755+3vyZMnm1NOOcVM\nmjQplUMcxIEHdPCGGUlabOEJ//Aw7+YP0qsY8kcRp175slyHuL366qtdg7llKbfqPGCIL0Rbg2Vl\nJSFx/cL/HRF+2eiQD5sIsqpswoQJnSzjx483r7/+uv0d9393xBFH9M3vq6OYDoSAh4CIiAeGDvMj\ngGUgMKvbFR08+BCsHOyF4j8gJ06caK0YWBTyyDe/+U3zvve9L5UVw6/P6ZuXRFAOA00/3uSxPiFt\nC7HdZiJSNAnx72GO3X3MSqpu/3cQk7333rsv92lYR/0WAnEIiIjEIaPztUfAPdyxikB+0pIJ8vMA\nL4o8YKGBWKFPmdJWIkJfYDkLJpbLhK/vZbv7NC/p7rviqlAI9AkB+Yj0CWhVUzwCTMm4gR8Swhs1\ng1kSyeIP0qtcCA2ESJINgbIJXDat8uVy95lISD4clbvdCIiItLt/W9u6KJ8MyAhvn+4NNK7x5GVg\nKGNwcIQorm6dHxwEnIWsjPtscFBUSwcBARGRQejllrURohG3SsZNs7g3Ub/pWEscgSnz7RvdepEh\nXy8d/x4BMGvLoO1ISJn3me4bIdAWBERE2tKTA9QONyUT12Rn7YB0OGGQ45PWj8TlT/NN/ZCepNNE\nacpuc1r6FSfmpotISNN7UPr3G4F39LtC1ddeBBh48ZFgWa5bKhjVWre0Fw9+9tVI8xbsLBpR5frn\neBN10yR/+qd/av76r/+6MKdUv564YywzSXWNKyPuPMuhN27cGHe5seeDwFqN1d0pLhLikNC3EEiO\ngIhIcqyUMgIBrAxPPvmkDUrmYhkcdthhNobI3LlzI3IYu7SXJb2LFy82q1atMkE0Rxugi03t3NRK\nVEbqipuSiUrPOVZh/OY3v4m7XOp54pIwMHVrUxYFxo0bZ/HOkrfOeYh3wb3QVBEJaWrPSe+qEdD
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
uAvnnHFNwOHMTNt4N/jZBhX+cHnJMrbATVLUQSIlAIywivKHx0GOKglgi\nZ5xxho1t4Jw4Gdj9wd3HgAdm1dJNPxe3oZuOZbSXYFXEicgrrFBieqxICTvzpikbcsVbe90Fnw8G\nTawQzockqc6kdzsJJ83jpyMv8XUWLlzYCUgWhHzvGQsEAuULRCZYWmwee+wxax1Br9mzZ9vYJH66\nfh2LhPQLadUjBApGoMi1wEWWxV4sQVOHfQIi0qkiICcjQrSH0xOvwxc/fodflp+GY78cYntECfld\nunA97jzf4RDx/rVwvrg4ItSfpb1RertzxFQI3hrdz9Z8VxniPQuIgb9Q5vDqafdKcTFW6HdiyBQZ\nV4N2VLnXjOKEZLn7lEcI1AOB2k7N4FcRDNSxsS7wq1i3bp1NE+wZE4zvfxQiofKWniUK6h9LKeaI\nMOa8fWLNcYK+AdFKpV/R7XWm6zwrOVx76vS9atUqc+CBB9ZJpa664DdCX6R1YiUfH2cF6FYJlgsc\ngKdOnWqj6bL65tJLL+1pAelWZvgauhCldfPmzTZ0/DXXXGOnbfpxf7k63D0d1k2/hYAQqDcCo+BD\n9VZR2pWFAL4BbNde123a07Z7w4YNJtiAzQ6GafPWIT1kJK0Ta68pGq5DyPfff3/ry9HPwRrfkc9/\n/vMmsL5Y4lMGxpAQ2gQRkggBIdBMBGprEWkmnM3SGufD5cuXN0vpLto+99xz1uehS5JaX8rixNpt\nSS99ixVkzpw5dk+YfpIQgMbqgvVlzZo11iG2CAdbvwNFQnw0dCwEmouALCLN7bvcmjMw4BgazK8X\naqbPrVjGAljaysZtaZ0/M1ZXWjamW+ibpO1w0zM+0bjiiivMypUra4EHbYEMvfnmm2bt2rWFWC9E\nQkq7/VSwEOg7ArKI9B3y+lSIOZupjGXLltVHqYyasAx19OjRiQfvjNX0JRuEgk9SvxHSMtjzQSAh\nrCjDGpGUzJTZMO4zdvjFT2rKlCkdPbPWKRKSFTnlEwL1REAWkXr2S9+0YrDDfM+g1eR59g984APm\nkksuMTNmzOgbdv2oiP5J6jdCWhyjISFFWR6KbqMjSVn1EwkpukdUnhCoHgFZRKrvg0o1wMdg4sSJ\n5u67765UjzyVYw3B55qYJgzG/idPuXXIm8ZvhGBuTMd8/etfry2pvPHGG81+++1nzj///NTwioSk\nhkwZhEAjEJBFpBHdVK6SDNxYRfj2/QzKrbW40g899FC7ZJTlo2GhTb7gE1OH6QpfpyTHvfxGGKSx\nnNRlOqZbm5hCYorm2GOPNfPmzeuWtHNNJKQDhQ6EQOsQEBFpXZdmaxAm87ffftvO5WcroZpcLBFl\nVQZOqkmEQZDB2pekUx9+niqOne5YSXzhPMuwcQhtylJsiAXh4em7cHv8tnEsEhJGRL+FQLsQEBFp\nV39mbg2DGQPyrbfeWlmI7rTKF2UFoJzwjsi9Bse0uhaZHiuPT54IVrZlyxa7RLfIesoui+XFixYt\n6hr3JdzWsnVS+UJACPQfARGR/mNe2xp56DNF04QlsGVbAcJTOiwNrtO0FeTJORejW1OXYGMV4Z4j\n5khY6IM6E8KwvvotBIRANgRERLLh1tpcTHXcddddZuPGjZ2Bro6NZQBjOSjOj/0QfDQY7H3xrRL+\n+X4do9MXv/hFs++++yb2teiXbknrceQ3vGpLJCQpgkonBJqPgIhI8/uw8BbkXWJZuEKhAuuiX3hK\np9+OsBARrCHoccABB4RQas7P008/3e6B46wiIiHN6TtpKgSKQEBEpAgUW1hGXQZ7H1qmY9hMra5x\nMtAv7Ahb5pQOfYT0yyrk90WRxxAP9sNhwzyRkCKRVVlCoBkIiIg0o58q0ZKBbv369TZIVtVLXhnk\nWfKJZA2GVQWIUVM6Rfk9QHKa4M+TBHeWYF9wwQXm4osvTpJcaYSAEGgRAu9oUVvUlIIR4
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 67, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 70, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 71, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 71, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 79, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "review_counter = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 80, + "metadata": { + "collapsed": true + }, + 
"outputs": [], + "source": [ + "for word in reviews[0].split(\" \"):\n", + " review_counter[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 81, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('.', 27),\n", + " ('', 18),\n", + " ('the', 9),\n", + " ('to', 6),\n", + " ('i', 5),\n", + " ('high', 5),\n", + " ('is', 4),\n", + " ('of', 4),\n", + " ('a', 4),\n", + " ('bromwell', 4),\n", + " ('teachers', 4),\n", + " ('that', 4),\n", + " ('their', 2),\n", + " ('my', 2),\n", + " ('at', 2),\n", + " ('as', 2),\n", + " ('me', 2),\n", + " ('in', 2),\n", + " ('students', 2),\n", + " ('it', 2),\n", + " ('student', 2),\n", + " ('school', 2),\n", + " ('through', 1),\n", + " ('insightful', 1),\n", + " ('ran', 1),\n", + " ('years', 1),\n", + " ('here', 1),\n", + " ('episode', 1),\n", + " ('reality', 1),\n", + " ('what', 1),\n", + " ('far', 1),\n", + " ('t', 1),\n", + " ('saw', 1),\n", + " ('s', 1),\n", + " ('repeatedly', 1),\n", + " ('isn', 1),\n", + " ('closer', 1),\n", + " ('and', 1),\n", + " ('fetched', 1),\n", + " ('remind', 1),\n", + " ('can', 1),\n", + " ('welcome', 1),\n", + " ('line', 1),\n", + " ('your', 1),\n", + " ('survive', 1),\n", + " ('teaching', 1),\n", + " ('satire', 1),\n", + " ('classic', 1),\n", + " ('who', 1),\n", + " ('age', 1),\n", + " ('knew', 1),\n", + " ('schools', 1),\n", + " ('inspector', 1),\n", + " ('comedy', 1),\n", + " ('down', 1),\n", + " ('about', 1),\n", + " ('pity', 1),\n", + " ('m', 1),\n", + " ('all', 1),\n", + " ('adults', 1),\n", + " ('see', 1),\n", + " ('think', 1),\n", + " ('situation', 1),\n", + " ('time', 1),\n", + " ('pomp', 1),\n", + " ('lead', 1),\n", + " ('other', 1),\n", + " ('much', 1),\n", + " ('many', 1),\n", + " ('which', 1),\n", + " ('one', 1),\n", + " ('profession', 1),\n", + " ('programs', 1),\n", + " ('same', 1),\n", + " ('some', 1),\n", + " ('such', 1),\n", + " ('pettiness', 1),\n", + " ('immediately', 1),\n", + " ('expect', 1),\n", + " 
('financially', 1),\n", + " ('recalled', 1),\n", + " ('tried', 1),\n", + " ('whole', 1),\n", + " ('right', 1),\n", + " ('life', 1),\n", + " ('cartoon', 1),\n", + " ('scramble', 1),\n", + " ('sack', 1),\n", + " ('believe', 1),\n", + " ('when', 1),\n", + " ('than', 1),\n", + " ('burn', 1),\n", + " ('pathetic', 1)]" + ] + }, + "execution_count": 81, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review_counter.most_common()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 4: Reducing Noise in our Input Data" + ] + }, + { + "cell_type": "code", + "execution_count": 82, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " # set our random number generator \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews, labels)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self, reviews, labels):\n", + " \n", + " review_vocab = set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def 
init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " \n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " self.layer_0[0][self.word2index[word]] = 1\n", + " \n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def train(self, training_reviews, training_labels):\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + " self.update_input_layer(review)\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward 
pass ###\n", + "\n", + " # TODO: Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error: the network's output minus the desired target.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # TODO: Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # TODO: Update the weights\n", + " self.weights_1_2 -= layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " self.weights_0_1 -= self.layer_0.T.dot(layer_1_delta) * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / float(i+1))[:4] + \"%\")\n", + " if(i % 2500 == 0):\n", + " print(\"\")\n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \" #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 
100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + " self.update_input_layer(review.lower())\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] > 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 83, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 84, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):91.50 #Correct:1795 #Trained:2501 Training Accuracy:71.7%\n", + "Progress:20.8% Speed(reviews/sec):95.25 #Correct:3811 #Trained:5001 Training Accuracy:76.2%\n", + "Progress:31.2% Speed(reviews/sec):93.74 #Correct:5898 #Trained:7501 Training Accuracy:78.6%\n", + "Progress:41.6% Speed(reviews/sec):93.69 #Correct:8042 #Trained:10001 Training Accuracy:80.4%\n", + "Progress:52.0% Speed(reviews/sec):95.27 #Correct:10186 #Trained:12501 Training Accuracy:81.4%\n", + "Progress:62.5% Speed(reviews/sec):98.19 #Correct:12317 #Trained:15001 Training Accuracy:82.1%\n", + "Progress:72.9% Speed(reviews/sec):98.56 #Correct:14440 #Trained:17501 Training Accuracy:82.5%\n", + "Progress:83.3% Speed(reviews/sec):99.74 #Correct:16613 #Trained:20001 Training Accuracy:83.0%\n", + "Progress:93.7% Speed(reviews/sec):100.7 #Correct:18794 #Trained:22501 Training Accuracy:83.5%\n", + "Progress:99.9% Speed(reviews/sec):101.9 #Correct:20115 #Trained:24000 Training Accuracy:83.8%" + ] + } + ], + "source": [ + 
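Pulled out of the class, the forward pass and weight-update rule above can be exercised on a toy example. A minimal standalone sketch (hypothetical shapes and data, not the notebook's SentimentNetwork — a linear hidden layer, a sigmoid output, and plain gradient-descent steps on a single input):

```python
import numpy as np

np.random.seed(1)
x = np.array([[1.0, 0.0, 1.0]])   # a 3-word "review" as a binary input vector
target = 1.0                      # POSITIVE label

weights_0_1 = np.random.normal(0.0, 0.1, (3, 2))   # input-to-hidden
weights_1_2 = np.random.normal(0.0, 0.1, (2, 1))   # hidden-to-output
lr = 0.5

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

errors = []
for _ in range(20):
    layer_1 = x.dot(weights_0_1)                      # hidden layer, no nonlinearity
    layer_2 = sigmoid(layer_1.dot(weights_1_2))       # sigmoid output
    layer_2_error = layer_2 - target                  # output minus target
    layer_2_delta = layer_2_error * layer_2 * (1 - layer_2)
    layer_1_delta = layer_2_delta.dot(weights_1_2.T)  # linear hidden layer: delta == backpropagated error
    weights_1_2 -= layer_1.T.dot(layer_2_delta) * lr  # gradient descent step
    weights_0_1 -= x.T.dot(layer_1_delta) * lr
    errors.append(float(abs(layer_2_error[0, 0])))

print(errors[0] > errors[-1])   # the output error shrinks as training proceeds
```

The same `layer_2_error`, `layer_2_delta`, and `-= ... * learning_rate` pattern appears in the class's `train` method, just batched over all 24,000 reviews.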
"mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 85, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):832.7% #Correct:851 #Tested:1000 Testing Accuracy:85.1%" + ] + } + ], + "source": [ + "# evaluate the trained model on the 1,000 held-out test reviews\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Analyzing Inefficiencies in our Network" + ] + }, + { + "cell_type": "code", + "execution_count": 88, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": "[base64-encoded PNG output omitted]
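The inefficiency the "Analyzing Inefficiencies" section points at can be made concrete: `layer_0` is almost entirely zeros, so the full matrix product spends nearly all its time multiplying by zero. Summing just the weight rows of the words actually present gives an identical hidden layer. A minimal sketch (toy vocabulary size, hypothetical numbers):

```python
import numpy as np

np.random.seed(1)
vocab_size, hidden = 6, 3
weights_0_1 = np.random.normal(0.0, 1.0, (vocab_size, hidden))

word_indices = [0, 2, 5]                 # the words appearing in one review
layer_0 = np.zeros((1, vocab_size))
layer_0[0, word_indices] = 1             # binary bag-of-words input

full = layer_0.dot(weights_0_1)                  # dense multiply over the whole vocab
sparse = weights_0_1[word_indices].sum(axis=0)   # only touch the rows we need

print(np.allclose(full[0], sparse))      # True: identical hidden activations
```

With a real vocabulary of ~74,000 words and only a few hundred distinct words per review, the index-based sum skips the overwhelming majority of the arithmetic.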
6/5dLgd+U1aCBl7IXS6XtEL8iWbkC9tFZP5w6Liw2gv1jZpgy4/zccDB53m/rzy\ned/vjWLViuqPUkiX5Of+51mT545/RHge4hD5R4dnDxGS53/YH0OgiRAw4tVEnZ3GpgoBIxaF/4r5\nb5yBoNTBAKsGpImXPqQKMpdvUOH6jBkzspBBZIYOHer69u3r15sUMiOWJX7nIl/ZQoIDSAy6bLXV\nVu5rX/ua/y3ESfZCqMLWL6xV2223nSMuS5MyIWGSj3LYtAhJ1HpDxPgthCwcH/aNb3zD7blnO3fb\nbbfqolJ/PGbcBPfZp5+4//3f81LfllwNKId06bLknxMhSTx38uzpdIWOKYfnjucXVz4frUDsSn1+\nC9Vj1w2BNCFgxCtNvdXkuvLyZuNFjjuQ4HbIGBuC9YJNBh1xI4kriZe9DCCkyycQl+uuu84NHDiw\nRTLyX3HFFVnCsvHGGzs2IWBCcFpkUj/QHSsb9WtLEsfUKXvIlN6EhGGhYqoJ+R2155xsQsKiiJi4\nJSFf6K8tYZCxSZMmua232c5NvGyCakH6D5lW4saAVN9y843pb0xEC+T+l+ciIklJp+SZ497lnxYI\nOWXLF7EUxrE8Z7meO56/uHQqqQGW2BBIGAJGvBLWIaZOcQjolzvHCAMOx3pAkJd9KS98yA8bVqXf\n//73fsoIrVXbtm3djTfe6K1PX/rSl7wFir1YjiA0iHY9ir7ok0uoU0STMCFPELF3333Xzx/Wvn17\nT8y05SsXCRPXpBA5KZv6RMcwCYOMzZhxk+uwd6eGCawXbBuZePEMIKXc7z5DkX/kPoZk5XvueAbR\nQT+LRVZhyQyBhkfAiFfDd7E1sFQEICSQE5kUlfUTTz755PWK+fWvf+0OPvhgt9lmm7kvf/nLjmB0\nrF/iziMDxIbBMEwI1yss4oQQMfayQZ4Y9HbeeWf/FaRYtjgvJCyKgMk1IWGk0USM6qlD3KUQsZkz\nZ7lv7rd/wxGv1WvedDu2ae3bGwF7ak9Vm3SlFhhT3BBIGAKf/2ueMKVMHUOg3ggI6YGAQVBw8bVr\n166FWj/+8Y8dMTB/+9vf/GzzTHjKrPVCbiiDhcCRcv7z1yRIyBzE7oADDnAszE2sF6SPaSi22GIL\nt+WWW/rYMdyYrVq18u7MqD3xZWzkIS+kUSx2EC70ps20vRElzcse5eoPcfNVy9KVq147bwgYAqUj\nsFHpWSyHIdD4CAjpWbx4sZ86ggWwIVkXXnihwwImMmHCBLdq1Sp33nnneaKi3XgSjwX5iUPQCWHP\nrPPE3OC6lDplL5Ys2Yu1qxhLmKQlr9QXh+5WRvUQgHRBuArFLVZPAyvZEDAESkHAiFcpaFnapkEA\n0kEAP/FcEnjO/mc/+5lr3bq1D7wXMJhqgnm2cD3iAiQOa+XKle6oo47ycV8Qoqi4L8lfzh79mMAV\nHXcNBl2sVFjFECFg7NmEgLEX16TeC9kK7zf64hfLUS3xeZhEtd1eLa2XiVc6h4JGunIAY6cNgQQj\nYMQrwZ1jqtUHAbH0sGA1k5t+9NFH3hUn0zj06dPHEyzm/xKBAPXs2dONHz/eL/1z6KGH+nwQHx33\nBUGS8iVvuXsIFwMv8WPa2gEBEyLGXjaIVxQR43yYdPF7y8AV2YjyajAn2n6dD0h904x0pb4LrQFN\nioARrybt+DQ3W76swtUmx1HtgZiwEV8lX1lFpYs6R9m487AMQZyEtEBcIDIE1eN67B9MMKnl7LPP\ndmPGjPETo0oe0lMGErfli3ahKy5HLULu2FM/IoRM2hAmYeirSdjGG2/k/vLyS7rYhjjGbZx2MdKV\n9h40/ZsZASNezdz7KWo7X2wxnxBkR+YSEjKlLU+6SQxO5GOG7Iceesjtsssufh4vyBJ5cwl5IGyQ\nJMiKkCZIDJuc5xqTQp5zzjnuueeeyxY3atQoH/c1ZMiQrJsPkkM5MmkpiYUcZTO
WeUBbaGuuNul6\npA1SlSZhEDQhX+yZJ21asEB2o8mLL65y7UMfSqSpjUa60tRbpqshsD4CNp3E+pjYmQQhANFiY7CR\nyU+x7mjXWrHqykSQlEd+CBtlhssSC5JYioSM4H775JNPHF8vMsv7Bx984L9mZLmdZcuW+S8ftS6Q\nt9/+9rf+60G+PsRVKV8PQtritH5BFhHqLFWkneTTRIwljYhn09dLLTuJ6VkyqOvBXVz/kLUyibqG\ndTLSFUbEfhsC6UPAiFf6+qwpNIYksbQIkosgVQKEJnRYxGQQFtIlZQvpELccrkfIF3Ffb7zxhnvk\nkUf8TPIQsaVLl/qvHiWv7O+77z4fEybzfUG+sH5BvtgQbZWSfKXuw7qXml/SS5vZH3FED3fWyHPc\n0Uf2lMup37dv197NvHVmTgthUhtopCupPWN6GQKlIWDEqzS8LHWVEcByAwlikNGEqFrVQlaoD6uX\nrCEXZTWChLBh/YJ88dXim2++GaxluKe3fGH9YluzZo0bMGDAeupec8017lvf+lZ23iyZbJUvJbF8\nhV2A6xVQ5Im4yJdUN+KskW7jL23iLr5wtJxK9X7J0sfcD/v3cytWrkhVO4x0paq7TFlDIC8CNoFq\nXnjsYi0RwApFnBKbELBq14/bkrpwOUKY0CFKhBhhoXr22Wf9LPVdu3b1bkSZkJQJTHfccUcfi8ak\npFpOP/10v8ZjvslWxdKk85V6DGmkPXEI5fzj04/dQ4seiKO4RJRx9z3z3LHH9U6ELsUqAZmmX8Mu\n8WLzWzpDwBBIFgJm8UpWfzStNlidcC+yQYbqIVgVxPqFHlED3aJFizwxZNZ3rF+yrJDEffHFHJYv\nfl977bUtJlulTfvuu6+f74v8Ou4Ly5fEfVXqdizXOkI+vhIVYbBnwzV3/Q3TfVyUXEvrvlu37u6M\nM4Z5op2GNsRtwUxDm01HQ6DRETDi1eg9nPD2MdBjbWIP2WGgr6egB+QLaw+DnpAvzkNMIIVimZK4\nLwm6J+6LWC8hXxL3ddFFF63XpPnz5/v5vnTcl3zxGEfcVzEDdphoYWmU9mqFL754rHvn3bVu4mUT\n9OnUHc/67Wx37dWT3KJFC1OhezF9mIqGmJKGgCHQAgEjXi3gsB+1RAAyI9YtTXJqqUOuuiBfEBP0\nQk+28HQNOu4L8oX1S8iXfPEI+SLu64c//OF6VU2ePNkx0aqslyhxXxCvSskX+kIetc60RUsuoqXT\ncEw5zJK//NnnXccO7cOXU/P7pJNPdQd1OdANGzY08TrTV/JsJF5ZU9AQMARKQsCIV0lwWeI4EcDS\nxaDOIBNlaYmzrnLKgnxNnz7dL3StCYwuS8gX1i+C7iFfLJQN4RLyJUH3p512midmOv+IESPcKaec\n4skX1i8hX1i/Kg26l3g1sSJWMpATZP/ZZ/9MrdVr3n3z3fCAcC17bFki7zV9Txjp0mjYsSHQeAgY\n8Wq8Pk1Fi/iCkAGGLYmkCxBxffJlJYKeuURcjzLfl5CvqLivcePGuSVLlrQoihnyL730Uk++sH4x\n35dMtgr5EgLWIlPoBxYuLHRaIFroXQnhkvLSbvU6tldv1+OI7om3dsXVX9JvtjcEDIHkIWDEK3l9\n0vAaQWjElSfWmKQ2GkIDccE6x3xiuUTIl477wvIlrkcd93Xvvfe6SZMmtSiKWfVZZHunnXbyBAzy\nhfVLFuiGfCESeB8mWpDXXFa5uAbzK6+aFEwU+5i75eYbW+ie9B9XTrrGzbn9t4mP7Yqrn5LeH6af\nIdDsCBjxavY7oMbthzBAtnCDQWbSIBJUD2GEhOUTCBjki03HfeFu1Nvzzz/vfvazn61X1MyZM/06\njxJ0L65HiNszzzzj00O+8hGtcKFgjsUqFzELp8/1m3J69z7e9T7he27YkNNzJUvUeZm3a/KUyQX7\nrp6KG+mqJ/pWtyFQWwSMeNUW76avTcgWJCZ
NgsuRDQJTSHTcl5AvHfclBIygeyx/Ybn44os9+Xrn\nnXcc84Fh/dpuu+1c586ds25HsXyF8+b6DXmE8Fbq1qUcpsR4dMmyxE8vsXrNm27w4NNcjx5HuGFD\nh+SCpu7njXTVvQtMAUOgpggY8aop3M1dGQOMBNRXSgDqgSQWI4iSLGWUTwdxPUbFfQnxYk8QfniR\nbcrdb7/93HXXXZcNutdxX3zxCPEqlXzFNcDPnj3HjQoWBk/63F6syQjGSXaNxtUn+e5Fu2YIGALJ\nQsCIV7L6o6G1wU3Hli9WKskAlEocw+RLz/fFGo+vv/561v0Ytcg2cV+33357lnxh/apkke24XI70\n0Vlnn+NWrFjhJk++1rVpvUPium34mSPcqlUvuhnTp1Vs5atG4+gLcWFXo3wr0xAwBJKLgBGv5PZN\nQ2lWKmlJauMhjljtirF6SRsgYH/4wx/cu+++66ecYJHtdu3aOaaMwCJD/JZMtnrhhRdKtuw+zkW2\nxVWK27FSEfI1duzYRM3vlQbSFUfMXaX9Z/kNAUOgPghsWJ9qrdZiEWjbtm3WrTR+/PgW2cTdxH7B\nggUtruX7MXXq1GyZpbqr8pWb7xrxUZCVNLoYdbtog0wxoc+HjyGasj300ENu9913D2KNeriePXu6\no446yrVp0ya7ziOYsM7jIYcc4qZNmxYuyh155JG+rHXr1nmixpeSkDfmDcOVKTFl62WMOAHhEvIV\ncbmkU5eMH+vat2/vTuh9nCOIvd5CTBeTpCbd0mWkq953itVvCNQXgaYnXpAZITCQHJP4EcCtcscd\nd/j4qPhLr22J8nEApEqLkCzZi1tV9q1atfL3GfFZWLr4WpEvF5m3C9IlC20znQQfHkDMtLDINoRP\nFtnGQkbAPnOGlUq+0Cmsv66rlGPI15jA4nVI14Mc0zbUSyB+J590ktsqwDLJ7kUjXfW6Q6xeQyA5\nCGyUHFVMk0ZFACLRq1cvF4d7KwkYQVzCli/OFRKxLurlgDjHb70xZ9eUKVOC+KnJ7u67784WS7A9\nLkvm+5IpKySOjD1lhOf7ymYOHfChADFGlU4xQbHHH987mCNrkbvggl+6ZUuXunPPPbdmrkesXDdM\nv9Gde85ZfoqSfv36hVqajJ9xxtclo0WmhSFgCJSLgBGvcpGrUb5Vq1bVqKbqVVPM/FfVqz3ekqUt\nWIyKIVvh2iFaQpLE0ip7SJOQJ/ZYufi6Ucd9PfXUU27//fd3999/v9t5552zBEwH3ZOXOig3l4jL\nF0Igx7nSFnMeLCBxV199rdu749fdxWMvcf37nVrVwPvrps1wN824wbVus2PeZZ2K0b+aaYx0VRNd\nK9sQSB8CG6ZP5Xg0JiaKgWnkyJHZAl966SV/LsrliEtSx1uRl3PkCYt2XxLTg1CPDLCSXn6zRx82\n5mqSskmn66TcfHLbbbdl81MGZXGuHCmlvYXKh6SIi65Q2qRfpx39gyklZDAtR1/pd4gWM9OzPJC4\nHrfYYgtPhHA94oLs2rWru/7669er5jvf+Y4DV4n7YnkicTviekTEGrZe5n+fEKtXruulnofAnXvu\nOZ4ELX/madc9IGPn/mK0e/a5FaUWlTM9Fi4IV7du3T3pGjp0qJ8uIg7LXc5KK7gg90lS9augaZbV\nEDAEykSgaYlXsXhBrCAwEKcwyeIc15588sm8xZGmEGmCdEHSCpWVqyIIVp8+fVrkpyzOFapblxlH\ne3V5uLOQXWP4ik6Xi558JDBo0CCP29Zbb50ltkJsOAempCFtuP90eaUex0FaRE8sVGHyJTFf7CXu\ni+kktLDoNot4E/clc4IR98W0FcXGfWGpgsDFKWDD3Fkzb53pPv3kY28BY61EYsDKCcIXssXXipC5\nBxbMd/369fVLAOHmTKoY6Upqz5hehkB9ETBXYwH8w2QmnPy9997zgzsuQQKowwKhKkZKIUdR5UEs\ncgkEEfc
UX9UVkkrbGy4/7mBiyBPtKcaSR9+E8T/xxBMdC1XzlWElAmGBVFZqyYN8IZAvRMiYuB05\nzzEbywmFF9meMGGCJ9sssg3Z0rFfBPFLXqnHVxL6Aymmn+ImxxAwtnNHjfQfDEC6rrnqSl/7fp0P\ncHt32scf77FHW/+Fp6j1wgsvBETyQ/eXl19yL/x5ZUAMF7kfnHSKO/I7PdwZw34Su55Sb5x7I11x\nomllGQKNhcCGjdWc4lsDCcEVw0AmwmDMOYmrgsxoCxRpuc5G8LMIA3w+4kO58+fPz+aVfOH9wIED\ns2nOPvvs8OWCv3PpR8Z8+knBcbVXymMf5ySR4kothnRpHfRxHGVQHiRFrHm6/HKOhRRBsnA9Eq/F\nTPV89Yi7ERceli/ckKNGjXJDhrRc/mbhwoWB662be+WVV7zrkS8emXIC1yNTTkDG5L6N0o+2QLyq\nJejfP3DPTp1yrVuxcoW3hPU58QS3+Wabus8+/cTNnTPH3ThjRnZ77dVX3WZf3sQvSXT++f/rdceC\nRuA8uiZdwJIN0mliCBgChsB6CAQv5KaWgKxkAlD8FhCkFlgE1pHstYAUtbjGj1x59XnKDkjXenk5\nIfWyD4hgZBp0knSUq0XOsy+kH2nWrl3rswekMVsm50XKba/kj9qff/75GbZKJSDDmcCi2EJv3f5S\njymLMsuV4Cu+zGGHHVZu9pz5ApKUCSxXmYA0ZQIClQlIfWb16tWZwAqUCb5ozCxevDgzb968zGWX\nXRaJRfAlZGb58uWZYODPvP3225kgBiwTuB8zgfsxQ9lsuYQ2mVSGwMsvv5xhMzEEDAFDIBcCTWvx\nCgbqgqKtXVFuOtxWIrjAsHxFSVTecLpi0oTz6N9aFzkfLrNQjFNc7ZX62WMVisNKgTUujC+WRCyD\nASH1FkWsiuGNa6TB1aqlkJVSp63lsbgaJe4L6xexXVi7JO4LK1jHjh399AnhuK/Bgwf7OdNkvi+C\n7on7KmayVSw0cVnxaolZUurCyoXEcb/7guyPIWAINCQCFuOVp1s1USH2qZAwmIfjvCAHxUg4XzF5\ndJqoesJlhomLzs9xHO0Nl0msSxwDUThWC1cvrtlCosmnfMAgecJlyvl677XrEV0kTos9hExvxH0R\n8/bcc89l1WYeLUj0L37xixYxXwTwE/dFfkTqkYwyrQR9Jsdyzfb5ETDSlR8fu2oIGAL/QcAsXv/B\nwo4SjIC2xkG4iiFd4eZAwnQ+XWY4bb1/CymCJMmUE8R9MdO9WL4k7uuSSy5xp5xySguVZ8+e7QP/\nJe6Lrx5lpvt8cV9m9WoBY1E/jHQVBZMlMgQMgX8jYMQrz62grUg6OD7w22aDlfWxTp+n2KpciiIR\nYQtXIf309bjai+VEBqa4Gq71LLXMSvKWWlel6cXtiKVLky8JuhcChuvxpGC5HCxcWiBdkE1NvsLr\nPJKee1jL4YfHP8WELr+RjuXejsOq20i4WFsMAUMgNwLmasyNjY8LEvcbxEa7rfJkq8sl3GbhOC/t\nSsPtWIh0EAcVd3uxoMjgFBcwtKucrz6pX2MSlz7VLgcCBvkK77XrkeNDDz3UL7I9YMCAFiqxyPY1\n11zjvvWtb2WnnIBs4XpEyIuIlY1jiAT9xj5uIY6Mbd37H7jXXnvdvfHGGy2qIJ6tfft2Lpjj338Z\nCBFMosh9XQ2Mkthe08kQMATiQcAsXgrHsIVIEy3iaPRcWxAU4r7EKhE1270quuqHBJ9r/fiNziJh\nUibn9b5a7SVmqFLRukGeaFu4v/LVQVrw0cRLl5kvb9Q1iEMt46DkPsP1SJwWQfeyyDaWL3TB8iWT\nreZaZJuZ7t9//31XaJFtyAT9FkffUcYNN9zgBg46zbVv194NH36mu3/+A+6DDz9yrbbexp3at2+L\n7cAuB7mPPv7UvfLqG+6yiVf4Z8xPwHrVJE8Go/qj1ueMdNUacavPEGgcB
MzipfqSwZkBDssQc3kR\nD8RgLVYgBntNZlTWsi0wuoxKjyvVrxrtxeJ1+eWXV9o0b23UpIl+YcNKhzUvF4kiD/0a5YrNlacY\nZSETtK2Wwr2JpUoHx3OO39r6xe9ci2w/8MAD7vbbb89avpjjC5FytfWL9j0YzGpfrsWJvHffc6+7\ndMJ4PwHqQQcf7M4444ySF9Bm5vpHHl3ili5Z6o468ii3V/uvu+N793L9g7nB6iFGuuqButVpCDQQ\nAsELt6ll1qxZ682HFBCvLCbM9RQM7uulCW6B7Lnw/Fr8luu6rGyh/z6QNOyZWytKyC/pwvXIefaB\n6y2bTp/nOJwv1zxe1F9Oe6P0lnPMaRRYZORn2XvmICvUD+F25/tNWTKvWTlKMYfXnDlzyslacR6Z\njysIks988sknfr6vd999N/P6669nVq5cmQlIZiYgPZm77747E8R9Rd4XwSLbmeeffz7z6quvZt55\n551MYAWLnO8rIK2ZYGHuknRmPrBgpvlMu73aZYLFsjPLn32+pPz5Er+xek3m19dPzxx+eDe/3X77\n7HzJY79m83TFDqkVaAg0HQJN72rEBRcQk/WmgQgGbS9Yv5544gmfBuuKFixEBKGXG2+ky6r0GF2w\ncqCvCPoGxLIk/eJuLy4rpNL5obBq0b5wH/jCS/xDGZQVnm6jlGIeeuihmlu8RD9xO2KdIuge16Ne\nZFu7HotZZBvXoyyyHZ7vCxdmsR9IYAW86OIxbvCgwX45oIWBxWvUyBElW7iknVH7Nq13cD8a8Pk6\njaf07e+uuuoqhxuy0vsrqq7wOalD7unwdfttCBgChkAxCGwA1SwmoaUxBMpFgPUMcVf99Kc/LbeI\nFvlwMRLDJi7gFhfz/IBUQlArJcpz5871bRGXU54qq36Jx5cNlyGkiWWCmDaCGC42SBVTSUCs+PKR\nvZYf//jHbujQoX6aCiZjhcARPwahw2UpJK+Qy5HrF1zwS9e6zY6OecQ6dmivq6nq8ZhxE9y555zl\nrrjiSjds2NCq1AXpgnDVMq6vKg2xQg0BQ6DuCBjxqnsXNL4CBFYT5yUWg7haDPHSMVzEcmnBooV1\nS2LAtDVQpyv1WAhkHLFrpdYdlV7+d2KRbDbIV+CC9CQL0sUmVq1gSSF3zz33tCime/fujkW2mSOM\ngH3mC9PkC8saBCwX+Zo+fbobO2asO33oMDdsyOktyq7VDxbgxnLdunVrN37cmFgJkpGuWvWi1WMI\nNAcCRryao5/r2kpcUJCfID7GWw3qqkwMlWP1gITUOrg+n+pCvrB8Qb6CtRmz5AvLl5AvjpctW+Yu\nuuiiFsXhnvztb3/rv4qEgAn5wo2J9QvyhYUPArarmmLirLPPcXfOneOuv2G6X9S6RaE1/kEQ/ujR\nF7g333zTzZg+LRbyZaSrxp1o1RkCTYBA08d4NUEf172JEJV+/frF8nVjvRuD9Y72JIl0gYm4BCXu\nizm6cBtCophmQsd9HXLIIS5YZLsFlKzt2LNnT0fsGscQNSZbxXomcV8QLqyKEGkE0rVixQpHLFfX\ng7u0KK8eP4j/mjrlWte27R6ub78BWT3L1cVIV7nIWT5DwBDIh4BZvPKhY9diQwALEbFeWE3SHCcD\nwTn//PMDy8ro2LCJu6Bw3BduR4n7Etcj+zVr1rjTTjvNEyytw4gRI/wSROJ6hMDJOo8QO8jZvHvv\n96Rr8uRrHYQnaTL8zBHBlDAvlm35MtKVtB41fQyBxkHAiFfj9GXiW0KAPVuSSUs+ELF2oTskEgKp\nhXYlScT1qOO+IF8E12vyxW9io5YsWdJC/eOPP96dd9553mImrkchXzfddFMQRzXe3T5nbk2D6Fso\nWMQPJmylrbfcfGMRqf+TxEjXf7CwI0PAEIgfASNe8WNqJeZAQKxeMrDlSJbY07jaIF5RE3fSNi1J\ncEcK+ZIvHnEZQr5wIWryRdzXzJkzH
YRKyy677OJ+/etfu5133jkbdI9rkaWJHl2yLBHuRa1v+JiY\nr8GDT3NdDjww+NLynPDlyN9yb6bZKhvZMDtpCBgCiUHAiFdiuqI5FIG0ECPElAxpEggXOjMwFyO0\nMZyWuLB6DOgQMMgXmwTdQ74k6F5I2NKlS92FF164XvMgZZ06dfKxXqef/hPX5/s/qNvXi+spV+AE\nXzv+sH8/N3nKZG9tzZfcSFc+dOyaIWAIxIWAEa+4kLRyikIAQoLliKkYoixHRRVS40QMyPvuu68L\nZnCvKKieciQwXZpQKxeljvuCfBE0D/kKux5Xr17twotsoyuLbC9b9nv3j8BqVqrrTtpar/2Vk65x\nc27/rVu0aGFOFbBY1osY51TKLhgChkBDImDEqyG7NdmNEpejDHZJ1haixIDM3F0yf1ec+oZdlJBS\ntmqIuB513BfkS1yPerJVgu4hYWEJlv9JdFxXWF/5zez2PY7oHjnBKn1QKwIs+tjeEDAEmhcBI17N\n2/d1bTmuO4LVsQLVw/1WbOMhXWzoWgshaD8cuB+nJSZMvsT1iOUL16OQL46ZbDVY79E3+8ADurge\nwQLVF184uhYwxF7HvPvmu+HBrPbLHlvW4n4z0hU71FagIWAIFEDAiFcBgOxy9RDA1QjxYvBLIvlK\nin5hFyVYQcbKFSFfMtkqQfcy0z0ETJMvHfcVLFCdyKkjisXhpJNPdQd1OTBr9TLSVSxyls4QMATi\nRMCIV5xoWlklI5AUcqMVx72IWzGppBD90E1LOS5KifuSme7DcV8QMCxf//uL813XQ7/lJl42QVeZ\numOsXpeMG+tjvYx0pa77TGFDoGEQMOLVMF2Z3oZAvhgI+WqwEktOHAhAaiTeB52SaImLameUi1La\nEZVezgn5kiknIF96vi/I13/3+W83c9ZtiZ8+QtqUb9+tW3f3jW/s0xCrKORrp10zBAyB5CKwUXJV\nM82aBQHip4j5gijU82tHiBaz67OhR1pIF/dJlMWL9miJclEyEz/yhS98we/10kPMUn/XXXe5vTvt\n0xCkiwZ2PfTb7h+ffuLban8MAUPAEKgHAmbxqgfqVmckAkJ8hIBBJmohWLkk2J99Nb5erEU7CtUR\n5aKUwP1w3JcE3Q8/8+dup52/ltqg+jAmMq/XipUrwpfstyFgCBgCNUHALF41gdkqKQYBCBcuM4gP\nhIA9WzUtT2Jtg+QRN1UrslcMHnGnAUcw1hIO3IeAHXbYYdlFt1e9+IL7/g9+oLOk+lgW86bd9XZr\npxpIU94QMATKRsAsXmVDZxmriQDWL6xPDJCQL+LA4iJFWH4gXLgTEfa4F00+R2DRokUOArZ27Vp3\n4okn+uNGwoY1HDt8vZ2/rxqpXdYWQ8AQSAcCG6ZDTdOy2RDAMgP5IuAeK9huu+2Wjb3id6kiZAuC\n1apVK18uhIuyjHS1RLNbt26ObZtttgmsXSe3vNgAv3bdbXe3bt0HDdASa4IhYAikEQFzNaax15pI\nZwgYGyQJEsaGJQy3GRYwriGy9z+CP+JCY8/2yiuveBeaBM7HZT2T+hptT5A9uG2xxRaN1jS3xx5t\n3dw5cxquXdYgQ8AQSAcCRrzS0U9NryVEC3cjGyKECouVBMf7C//+A7Fig2jhqgwTM53WjqMR+MJG\nX3RYhxpNGpFMNlofWXsMgUZGwIhXI/duA7eNwGgLjq5uBzOxaiPK13be2f3hiccbsWnWJkPAEEgB\nAhumQEdT0RAwBAyB2BDo2KG9W/nnlbGVZwUZAoaAIVAKAka8SkHL0hoChoAhYAgYAoaAIVABAka8\nKgDPshoChkD6EHj2OZs8NX29ZhobAo2DgBGvxulLa4khECsCsoxQrIUmoLBXX3vN/eCkUxKgialg\nCBgCzYiAEa9m7HVrsyFQBAL/+udn7q9vv11ESktiCBgChoAhUCwCRryKRcrSGQJNhgBfjb711psN\n1
+qnnvqja9+uXcO1yxpkCBgC6UDAiFc6+sm0NARqjgDE6ze33FTzeqtd4V9efsltueXm1a7GyjcE\nDAFDIBIBI16RsNhJQ8AQ+HxR7W5uydLHGgqMF4KpJGxC3YbqUmuMIZAqBIx4paq7TFlDoLYIHHBg\nF/fgQ4trW+n/397d9ER1hmEcv+YDsNRUUDMJLEhdYayg3dD6/sZAurOFIU2qSNGkTaoZSXShKGqi\nCRo7VYkVW0QDTtqmMbY2gcS2GKDMoo3ESGUjfgLWyjNqAqiYyDkz5z7nPyvmDPOc+/nds7hyXp7j\n495ciHwyOcniuz4aMzQCCMwvQPCa34dPEYi0wNo1lRr8+6/QGAyPjGhHojY082EiCCBgT4DgZa9n\nVIxA3gTcsy4fjN1XWNa+yvT16sO1VXnzY0cIIIDAXAGC11wR3iOAwCyBmto6XbrUOWubxTe3bv+e\nO83owiQvBBBAoFACBK9CybNfBIwINO/ZrVu//qLJJ7aXlrja1aXmlhYj6pSJAAJhFSB4hbWzzAsB\njwTi8bhWrvpA31+56tGI+R/GHe36Z3hIDfWsWJ9/ffaIAAIzBWJPp18zN/A3AgggMFcgm82qoqJC\n//53XyveL5/7ceDf7/y0XlVVldq3lyNegW8WBSIQcgGOeIW8wUwPAS8E3GKqR462qa2tzYvh8jpG\nx7nz09d2PeZoV17V2RkCCLxJgOD1Jhm2I4DALIGWL5tzp+tckLHycndjnj/bocOHD8ktCMsLAQQQ\nKLQAwavQHWD/CBgRcMEl/V06F2SsrGafSqX0WUMDK9Ub+Y1RJgJREOAaryh0mTki4KFAR8dZZTIZ\n/djdreIl73k4srdDffX1Nxoff6iff8p4OzCjIYAAAgsQIHgtAI+vIhBVgf0HUhobG1M6/W0gw5cL\nXdnRkemAeJNTjFH9kTJvBAIqwKnGgDaGshAIssDJE8dVXl6upqY9gVvfy4Uut+7YmTOnCV1B/hFR\nGwIRFSB4RbTxTBuBhQrMDF9BuObLLfD68vRiz/UeHoS90AbzfQQQ8EWA4OULK4MiEA0BF74qV6/W\n541J3ei9WbBJu7sX3dE3d01X15XLhK6CdYIdI4DA2wQIXm8T4nMEEJhXoLU1pfYT7TrUelC7duf/\n1KNb3uKTutpcAHQX0rNsxLzt4kMEECiwAMGrwA1g9wiEQcA9eHrw3qBisZg+rq7WsfZTvk/LPQao\nJlGnTF9vbpkLFwB5IYAAAkEX4K7GoHeI+hAwJtDf368LFztzq8Vv2LRFjcl6T+987LzcpT/uPH/2\nYupgSslk0pgQ5SKAQJQFCF5R7j5zR8BHARfAuq9d18ULaX2xq0mVVWu0ZfPGdwph7ujW3bt/qu9G\nj5YUF6tx+pqyRCLBaUUf+8fQCCDgjwDByx9XRkUAgRcCExMTGhgY0O3f7uha9w/aUVOr0tIyLVq8\nWGVlpSoqKnrFanQ0q6mpKT36fzz3nerqj7Ru/Xpt37aVC+df0WIDAghYEiB4WeoWtSIQAgF3JCyb\nzeqpYhoaGn7tjEpKSrRsaYmWL1+WC1rxePy1/8dGBBBAwJoAwctax6gXAQQQQAABBMwKcFej2dZR\nOAIIIIAAAghYEyB4WesY9SKAAAIIIICAWQGCl9nWUTgCCCCAAAIIWBMgeFnrGPUigAACCCCAgFkB\ngpfZ1lE4AggggAACCFgTIHhZ6xj1IoAAAggggIBZAYKX2dZROAIIIIAAAghYEyB4WesY9SKAAAII\nIICAWQGCl9nWUTgCCCCAAAIIWBMgeFnrGPUigAACCCCAgFkBgpfZ1lE4AggggAACCFgTIHhZ6xj1\nIoAAAggggIBZAYKX2dZROAIIIIAAAghYEyB4WesY9SKAAAIIIICAWQGCl9nWUTgCCCCAAAIIWBMg\neFnrGPUigAACCCCAgFkBgpfZ1lE4AggggAACCFgTIHhZ6xj1IoA
AAggggIBZAYKX2dZROAIIIIAA\nAghYEyB4WesY9SKAAAIIIICAWYFnu/zIiInRwogAAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 88, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "Image(filename='sentiment_network_sparse.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 89, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "layer_0 = np.zeros(10)" + ] + }, + { + "cell_type": "code", + "execution_count": 90, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])" + ] + }, + "execution_count": 90, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 91, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "layer_0[4] = 1\n", + "layer_0[9] = 1" + ] + }, + { + "cell_type": "code", + "execution_count": 92, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([ 0., 0., 0., 0., 1., 0., 0., 0., 0., 1.])" + ] + }, + "execution_count": 92, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 93, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "weights_0_1 = np.random.randn(10,5)" + ] + }, + { + "cell_type": "code", + "execution_count": 94, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([-0.10503756, 0.44222989, 0.24392938, -0.55961832, 0.21389503])" + ] + }, + "execution_count": 94, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0.dot(weights_0_1)" + ] + }, + { + "cell_type": "code", + "execution_count": 101, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "indices = [4,9]" + ] + }, + { + "cell_type": "code", + 
"execution_count": 102, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "layer_1 = np.zeros(5)" + ] + }, + { + "cell_type": "code", + "execution_count": 103, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for index in indices:\n", + " layer_1 += (weights_0_1[index])" + ] + }, + { + "cell_type": "code", + "execution_count": 104, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([-0.10503756, 0.44222989, 0.24392938, -0.55961832, 0.21389503])" + ] + }, + "execution_count": 104, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_1" + ] + }, + { + "cell_type": "code", + "execution_count": 100, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": "
1Fixb5uaeeeeaZXM8nWYmqAT2jo3GB\n8DcCWm6MHLcJlIW0WeSzg3UK0MKyhPzIHlp1BDsCHc6xrPitFX2ajfbDwIIdftCh0YqVK48BT04V\ntmEaqEgDBjsVqc8uTpsGgBQsG3PmzPGi3X333e62227L+bsAO8CPLDMAA1Mt7LHHHo5eWHQPZwEi\naCrC2kKzFgG4weJCExPpqFkMEFHTGIBSasA/g5GSCYAO8pMOC+kiK3mTnxYAiLICZjRjIXPYnZ1t\nHfPrVRrTsvPekiVu6222LVXlmYov4OE5sWAaMA2Up4Hly7vMrjINVEcD70Q+CY9HPbC0JlV6VmnC\nTsa20ccePwa2e/bsWXDWaACmV69ebsCAAX6MGpyMO3To4DbaaCMPD0AEcXDwff31190uu+ziC8Mx\n+cKwrSYr8mVOqn79+vl4TDWBI/Oll17qrS74zbAQuJ70w6Ynf6LAD0BFoPlKXenZj1t0kCcENfKS\nzw4+OQAa4MMx5JcMpLPWmmu43zz/W5JtqMA9bAuB5573AuCRP08l5eZ9Iy0tpB2+d1gYlY/eO9Y9\no3fPgmkgixqw3lhZvGsZl5kPLBYMDSiojyhrrBoEfVSJqw+xPsw6BhhokUoECFhAsL5svvnmvncW\nMPDII4/4aMDAn/70J28x6dGjR86Kw0lZUbhWViDS4jiBZq0//vGPfrt79+5uypQpfjweIAMrixyd\nAQ3Bho+c54fmK6w6VC5UQLLqqBzADb5GGlOHbSxJpE05ZL2hqUrAgwzx/JkOY9y4qyJouyuPJNk8\nfPEll7uVvrOiGzrk5GwWoESpeRd4TnhXSg3he/fuu+/6nou8Z4AUCyH+3nHs8eDPCPkTR++d3lfi\nWTANpFkDBjtpvjsNJBsfSeCGyp3AxxKLRjkfba7nw81HmEH3SJtBBVmABpp+aPYBFGiiYlA+AoMD\nYuVZEjV9YAVhBGX1VFKzFVYZwAbA4XpBj4CH84Ca/IK47sEHH3SdOnXyaQI8LFhWQuuKFyD2Qxk0\n1QXNbuiE9Fnko0P+DBoo2KFcnAdogBvkVxkEXEn5oiP8ebi2kcIxxx7v/vynZf45UIXdSOVLKgv3\nkmcH6Cgm8LzyngBJghTW5QTS4D0mTbZ5hzWaeTnp2TWmgVppwGCnVppuw/nwYeSDCNjwcWSpZgB6\ngCgqAPJhfB0AADAAFiZNmuQuuOACnyWWGSqJ73//+x4WsIgQF1AAXAAFQtzCQzoAD2lidSGvIUOG\n+Lj8XH/99W733XfPpQOM0Myk9JKsPOiD5jqar6hACMBICGuy6gA7wBfnSBd5JTvWHcAHSw/n4nmh\nH8LoUee7s84+2+3+075+vxF+OkZjB82+Zba3iFH5KnCPGz1wXwuVk2eK94HA+3Fkld873gEgijnv\nmJuMbbP0NPpTl93yGexk996lXnI+hnxg+ScK8BT6MFejMIIeKr1f/vKXfuJLWWfwy6FrOQH/Gz7K\nagYSNAAQHAMWBB2y8CgdoAfgATrIZ+DAgTnRmaFcIy4DTlh4gA8WQgghVD6t1XzF9BthkN4vGnOx\n+8c//+3GXDg6PJ3Z7WefW+BGnj0imrF+3tfKIMDjBBYflkYMScDDc8l7JxhhuzUD+QFVyMJzLcBq\nzTwtbdNAqRow2ClVYxa/KA3wL+/UU0/1g/zxAaxlmDZtms8bEKHrOeAxa9Ysb/FBDvx4sMRgdeFc\n6PfCvoAHC05oZRHwsFazFiB32mmn5fx48AG65ZZbchaeJD8eKqFqNl+9+eabvpkLmGIR3MR1zj/9\nq64anwgH8bhZ2B957mj3P//+l7vs0rEFxaUyZlHIpx+dz9o6BB7+VAAbAA7vXS0tLchBvlgskaOW\neWftnpm8tdeAwU7tdd7QOVL5618elWu5PjmVKompFugmzkzrzHe16667+sEB+RgT6FFF8xFWF5qA\nsO5oAXjkaCxHYaw5AA
6WHZYQeLACMQmpJhLlesbtadeunYcp4EnNWsAIoFNJ8xU6/vjjj3NAxeCH\na665ZjPLkS9kwg/NPuPGT2iIpqxevXpHMH1eXrhLKL4/RKWsgMWHJeuBMvEHgzXvXb2AjmeTdwyg\nr+f7n/X7afJXXwMGO9XXaZtNkQ+dPrJ8dOv9zw7gwRGTnif33Xef99MZNmyYmzlzpr9HM6LZsqno\ngJEQeLD0ACzyfwFmcBhOclwGejgOFFHm8847L3f/77zzTkePLTWPATxMP8GUEAz6h1zoiPQFVaQn\nPx0ck9lGr/QAw0qEXFim6EqPzAIzWXVymefZGDNmrPvrRx9nfvbz66fNcDfNuLFiK9U7gdWHe1Ev\nOM9zu4o+DGDgO/Piiy+mogxYlQRfWdVp0cq3iJnQgMFOJm5T+oUU6MiEnQaJkYneWVQE/Mv89a9/\n7TbbbDMPPVhngBy6owMKQAOQI+tO3OFXTVpyXAZKQiuP/Hj4Rxs6Lp9zzjl+IlGAB58hZmQnAELq\nESOYIg3SBHLoKi6QwoGakZ2RiW0WtklTPb8oQzGByn2TTTZxry583XXp3LGYS1IZ59DDjnAH7L+f\n22+//lWTj+eF+6fAs1xvYJcshdaypADblAGAT0NQkxpyGfCk4Y60bRkMdtr2/a9K6dMIOioYIMFC\nhcBoyvzzZRoJZkcn4LiMNQYrDsAj2Al7OKlHFengw6Nu4XHgoZmLc+iDXl9//etffR7y4+nWrZtj\nfi3m7rr33ntzPbUAKebiAnaUZufOncvufeUzLfAz7MzhkZz/l1nrztwHH3anRuPqzF8wv1VhBPDh\nXhLSavUJQSeNYGbAU+BFtFM11YDBTk3V3ZiZpf2DC6QAFMiJrwzWnLA7Ot3SaX6jmQmLCcATWk/k\nb8PdiwNP6MeDVQZgwfpDPHpbATHxQPMV0MPov4AUsnXt2vVrzVeAExYbLFChEzUyltp8FcqAdWe3\nn+7mbrhxuuvRvWt4KhPb+OowvEA1rTotFRzoSZvVh6YiYAK50gg60inNWSxpl1Py2roxNWCw05j3\ntWal4iPGR5cKNM0fXEEKzrxYWrDmMMjgwoULva7ydUcHLORgHFp4ABT58WCNkUVG2/Ljofnsiiuu\nyN0Ppr+45JJLPNysvfba/jjWIqCJ5istQBMyC8Aqbb7KCfDVxvgJEyPoe9Tdc/eXc4jFz6d1nxGT\n5z/3bN3lrrfVh6YhmkGz0kSErAAj8lowDdRDAwY79dB6g+QJ4NAWX8/eH8WqEnAgvP32227rrbd2\nN910k/dd2XLLLf1xwIPeVGF3dFl45GAsh2V/QfQDpLAANsCKgAcLD9NR/OY3v/HnQ6dlrmUU5+OO\nO86DDJYboEkWIgYPZJt0yY+8JUfYtBaXRTKVsu63T3/XrXsPd/bwYaVcVre4jKuzfY9uqXHClSJq\nbfUhP/xy+JORlTFtkJlvBfJmRWbdX1s3hgYMdhrjPtalFDT98AHDupOFAPCwjBs3zs9kPm/ePPfq\nq6/mHIWPPvpoPxIsIIFFR01HrOUMLMgQPGHhEfA89NBDOeChmQmn4rOjEYs5Hg+Mtjxx4kTfTAXs\nhP46pAd0Vbv5Ki4D1ol+e/dzvxh3pRtwwH7x06naX7rsQ3fYoYdGTZGD/D1KlXAxYUKrD1DCUs0A\nLJBH1qwkyIuFB9mrrZNq6tfSakwNGOw05n1t9VLpw5X25qu4IoAUAIVZ0bfffnv/L7OY7ujAD8BD\nsxIggkWGjzbj+JAeC+kBLbLyPPfcc+6QQw7xIjDwIB96Blr87W+/nH183XXX9QMQrrLKKv46riUd\nAr2sBFuh/1Cpva98Ygk//NOm0pz36Dy3wjdXcDNvmpVa/x1A57jjjvfw2NIAgglFresh3g8WBf4g\nVBJIi950DKuQRWDAb46Ar5EF00AtNWCwU0ttN1BefGgxo+vjlaWiATxYdfbbbz/38ssv
e7DYdttt\n3dKlS30xnnzySQ8zYXd0wOONN97wXcMBHmCHwQHxUyI9QZSsNAAPlh0G/0NXs2fPzo3HwwjP+tgD\nL1iaGDsH0CFd5Ss/HZqx5Dsky1Il+qbCBLxw1ib07burey9ymk6rw/Kppw1zb731Rzdj+rRU+4UV\nc09CawzPBUspgfeNa3j3shiqBWt0MsDnjoAP3FlnnZVFdVRV5qlTp7pjjz02lybfpFqGSy+91E+X\nQ54vvPCCwz8yTcFgp4p3gzFc8AkhxF/AQueqKEJNksJHB6sAH64sBsEJ1p0ddtjBT/fAGDg77bST\nL044O/qyZcty3dFxbGZUZABF0AGcEJQmwEIzFM1XckzGdwerkHprYcHhYxB+oOldhDwCHaw9jBHE\nOrQqkZ/y9BmX+COL3KeffurTZ5+mSLqj33v3XakCHiw6o0ef7z788MOGAJ34reL9Cd+hlqw+xMWq\ngzUxzZ0B4uWM7+sPkoA/fr6YfX1PN9100wiE3yrmkoaIE777U6ZMccccc0yuXPWGHQTRfQF0+Mal\nKSyXJmFMltpoQBUma16QUgMfqSw7Gar8/DvGURnfGEYkHjVqlFfFww8/7LuNAy0ADt3CGSMH6ABU\nsN6ouUm6U5pA0CuvvJIDnRtuuMFtsMEGvkmK67EKAUadOnVy1113nS53EyZMcFdffbW3/nCQdARV\nYdMV+ZQbuG8AFaCz1VZb+WY4QAd5aB4686wz3VGRT8ytt99ZbhZVu05NV40KOihq48hCA+BoATy1\nhBAkpfK8Mrt4lkGHslAORnumKbWcgAVBfyrDPwzlpGXXVFcDuh801ZdTt1RXmuapGew014fttaAB\nPsIMzqd/Zy1ET+1poIFK5r333vOmV/xrgBr8bgiMj0OlomYprDL0tqJ5imOAEMADKCgIRHB0JuCE\nfNBBB3lIoimKpjAsN4AM12Ht4YMADBGALHx6gBECceQfpLT9iTJ+uF+DBw/2V1JhUqlS2ZIHC+U5\n7LDD3HkR8I0cMdzRxbtegUEDe0f3hmZAusZnvXIvVo+CHtYEgQ++YQRZVP1Ohn947hjUk/KUGrBq\nATuE1VdfvZllo9S0Gi0+Vh69z6zrEQ488EB/X8h7+PDh3gpZDzmS8lwu6aAdMw3k0wAfKCbQbIQK\n6IknnvD/lOnuTdMV1ptrrrkmV3RBi6aIiAOPYCf8sDCQIH5AzH2F1QirDJYjIIdFY/aQiZq8+De0\n5557+nxxPB0wYIB7J4JKAASwygdXOUELbKjLL/+kCfgHYeHh/unDiByCut12+2k04OJE9+Tjj7m9\n9urn6O5dq4A1h5nMGR354rFjW5zNvFZy1SMfgEDwwzaWVCAYS1wjBOCb57DUwJ8DgIcQNuGwzzn+\nFLDId4UKV8dY826pgwDXxAMgRVNMeE1oSYrHZ5/0SFfXrLHGGl4WrE86xlrWKKXBflw+4nEsLiPf\nJ86FgTJyTBaUsPyKi67Y1oKc8RCPc9tttzWLgn+U8lI6yKN8w8gAKMBDIN14WmHcmm9HH7y6hsi3\npSlqdwVDcwvHonbYZnJFD3bufKTQr50P02A7HiJH0SbSDfNhOymv8Npi5eOaUAauC0Ohc8SL/tU3\nhWVEtmgqg6aoXTZMJrcdpoeuuD5qJ82VDx3FZSC9ePm1ny+fXIZfbUSg0xQ52MYPZ3b/d7/7XVM0\n0F9T1DzVFEFP01/+8pemCOhyejrggAOaIoflpmeeeaYp+gA1LVq0qGnJkiVNPE/RAIBNEQg1RbDg\nyx9NRZG7bs6cOf54BBFNkTWo6bPPPmuK/H+aIifnpsiK1BTN09UUwVBT9MFoirqgN02ePLnpzDPP\nzF3PfYnAy+f10Ucf+bxIh/TIT3kWUjzyRH4/Pk3WyKTA9aRFuSlH9GFqisYG8vlFH+GmaOLRpmHD\nzvLP9CmnntEUzaWlS6u+/mDpsqYxYy9r6vCDDk3H
HHt80+LFi6ueR9YTHDp0aBNLowSeN55x1qWE\n8LvHNy8MfMP0PeNbGn4PdVzr+LV8QwvF53sa+aCE2flt0lGa8XX0J6bZubBOI614/Ph++P0u5tsd\nlp+0FCL4yOVFOeKBfJQ35/m2KYTnFCdco+d4uPXWW3PpodO0hP9opMYSlfpwEZ8bIUWHSo7f5PiD\nzIMVXqs0wnX8mlLlQ33hixg+qC2dK+eBiucVliXc5iVRKOaFUdx8a9KudmWErrmH3FNkTLpXHOMc\ncYjLNdUKgACVe9RM5aEk8hNpihyGc8/a+PHjPfAAKVGTQtMf/vCHpqjnVlNkNfHXCHgiPxh/DUCo\nEFlnPFBE/8r9NcDO/Pnzm+6///6mW265pSn6d9t07bXXNl1//fVN0WzsTZGPTy5fdH3iiSc2RXN5\nNUXzbDVF00s0y68Q8ACkAh3kAnwIAiVBGIDHx43yADmvv/56U+Rz1BRZp5qiMYg8RA8cNNjLBPQ8\n8+x8Fa3iNQAlyDnk0MOboslPK06zURPgHlZbP/V+7yhTCOAt3bs4IMTjx+uB8DsY345XwoVAR9fy\nDQpBgO2kb5Xix9fhNyv8fsfjhfvKr5hvd7z80k/8eBzaQhhiWyGEllCm+Ha8rkPmME48P6Vf63Xd\nYKechysOBXp4wgcnvFkos9gHkjTCUI58oRzxByDfuXIfqDC98MFK2iYPQjEvTKiD+DYVJlaQagXu\nX/iiJcle6BjX6hmoRKbIf8BDBtASNVX5f5vR3FVN7du3z720WHeeeuqppgULFngYAAywhGCxweIS\njYrs4wIY+rcq64nSxCIUTU/hLTsPPvig/9Bzb6Ju6R58br/99qbIH8pvR6b0XN6Rk7TPE6sT+ZEe\nsgJSScCDBUB6A7xCeYjPtYAd8AREAVMAHJCD9SrqPdb0/PPPe7CTJYsP1siR53jrS8+evZrOPmdU\n0/0PPFSy2oGlqyZMavr5Mcd5GbHkVLsSL1moDFzA/axWSMt7x3M6atSooosVfv/jsEIi8UqdOKpo\nqQfi3z+BRPy68Ntd6FwoD/cnvC5u1eG8vlVxa1B4Xfycvt1Skt5r1sgWhrisOheHjzA/4oTAFuYX\n1jGhLilHqMs4BJJmeG08P87XI9TFZ4e2vrBNMlIGb7JfohsW3ccvQ/SRbtYuiG9DpESd9o5qpBVV\nPP5YpHTf5TsXIdrgPOkohHmRngJpqH2xXPmUVilr2mcVogfKd9dDF9ED5Wfk1jnajcNy6LjWYbmi\nB1aH/Vq6jl4kr+PwJPomv8hiEh5O3MaPZOPIf6AaAV1vs802OZ2Xk2Y10iBffCOYnBPHYRb52Dzw\nwAM5sU4//XSvp8gi4h2V5aysbuQXXnihjxtZVJr5w8jvJgIM39OK6zkWTkuhbuYRKHkn5rXWWsv3\n1Np///19ms8++6zXFZOH4jeEkzTpkY7eGyLin0NZrrrqKn9dVJl4J9DQP0fyIDdl+Pvf/+7n42LN\nwjHSjqDIt/MjJwvv3dlnj3CvLnzVDYl8ar694jfdZZeM9XGYdiKCFu/UjGNzfMEP59DDjnAdO3T0\nvb1ejXqrMQEpz/OUayZ7mb3A9pOoARyVIytI4rlSD1bjnalGGsiN/5Gcr4sph75jxA3rgXzX8h3k\nm0pIqhv0PcUnRYHvYFgvsM+3VYG6QQE9KMSv45p839SwHMgV5hdBRLOySUblU86aPKI/hrlLw/zZ\nVh7EI38Cx1Wvsh/qEt2zT3wC14e64Fh4f8J0OFevUBfYKffhQkkhDPHgyTOfc3EY4lh4E5IeyPCm\n6CGoRD7yLDZU+kApn3i5eLDDh1sPs+KXu+bD1DOqTCsNPPw4vFVDLtIgrUpeKACOCoUA7NA9HOBZ\nb731vEMvxyNLh8OhGVgABoACAc+h0TQGBJyMGaxPAWAAbgALAEVLCDuADjDChwPYwbFZXdSBFWZk\nJ3AtlUPkO9QM
eEiffCKrm+/hgoykA3RpGg8BkWQnLaBJk46yZv8t+iwAAEAASURBVB85iUOQHnCw\nRh8sHCNQxnNGnu0ee2yev4ennTrUde7Uwa280rc9BAFC4bJCdNkxPz/aPfDgA27RG4vc1ClXuyMj\nB9VGcHL3CmnlHyC2GrpK43tH2YoN+j4TP/xuJ13P+XgcgU88fpiuKvswTvgtRYd8c1haui4pLdKl\nntI7GVldfFbUOdRlOBBX8i0L5Q63Q1nC+i3cJo4AJiwbeovrknhxvYT5hfHDtMI4td5evtYZkl9Y\n+PAmSBaUKIuHHi7dBOJTuYuw9WCg3JCQlVaYV9LDjgUlHsJrSpUvnlah/TCfQg9UvKzxNJPKxbEQ\n9OLX1HOf8sRBh/vHfec+J5UHedEX1/GChrrjGGmG/8BKKZ+sVerBIOsOH6SDDz7Yd0OPHIr9BJ6a\nHR0wABCw6GAV4trI7yZnEeHaOFzIasI5AQQ9tDQNBWkALoIjoAq4nDFjhhs4cKAvEqM+n3POOe74\n44/3cYGy++67z9FzDGjZaKON/NAAGj+Hi0gTWQReyIHsLAI2zhFUdsmlHmQcl5XHR/zqh0oYGVks\ntI4GqvUnI43vHXBebAi/GaoP8l0bVrb54ui46hD2k3orKZ7WoRw6lvTNKiSDvlmq55ROa635tvKn\nkEDefD+ROfyOhvASlpE4+jbmky+MH49T6Fw8bmvu1wV2ynm4wocbqKEiD5UYWnyksDAfjhV6+HQN\n6/C6Yh/+UL4wrULbofyVPFDFlquQLMWcw/rBP/JKQ/hvgrS4d/lMvmFeIXiSBt0fFeJp6nipa15q\nKnUqd6waVPZAFOkDBjQtMQYPIEIX88ip2GdBRcJYOkAD1wp0BC6hVYc8iAPkMPYOiwYOJF2uAYYE\nIhtHlicg66ijjnKRj4276KKL/HQXkYOzY0b1SZMmeRm6dOnimOqCZxGgIiBHKEscdMiL8wRkAp6w\nLGlBRll30Auyt/Th84nZT+o0EH9H6v3e8VyXEsLvZSnXpS0u5aAJP6xnkJFvIN9yviXxc5WWgW8C\n3089A6zJS3+Idb7SfNJ8/XJpFi6fbDws8Qc/JNR819nxyjVQ6gcqKcfwXvGCFwM68XRk4dPxME0d\nq2RNxQ5wqDmLua0IwEjUO8vDhORm9OXddtvNQwqwo0WAA2CwcC1WFtIGogAKQEdrtsP5sNgHNpCB\nj9Gdd97p+vTp4+XAj2fDDTfMgU7//v19UxvNYuQhaw4gA9CQv/xz1GyFfAIdgCaEL8mFnJwDhJDb\nQnY1EL4jaX3vCmm3tf7UhenKr5E/C/mWML7kDXWrY/mAJYQZ0qJ1gbyAz6TWCaVX6Tr8s4i8Ah/S\n5RzfGIVwm3P5dKHjScYGpZWWdV2+XuHDUs7DlWT6Sxr4KbxhKDzfwxe/GZXKF08v334oX6M8UPnK\nmu94qOt8cfIdr+TafGlyXNaLEHi6d+/umL+KEPWaauabw4suoAkBh201FQEcAh3gAYgALljYDhfg\nBysRi2Y8B3iQJ+q94icw9YJ89cMgj9E4Pd6vh3yAKlmIkAsZkkAHeSgraQt0lC8yCHSAPvKWXsK8\nbTubGqjk3ank2kq0FX4v4392K0k3bIJKgpaktNFBKE8IDoqfdIxz4XGgM9Qn5Sq2nlI+xa7154z4\nWHRCOcImLM7HdVKJvsPykXa9Ql1gJ67IUgoPFesm8bApLW5G6KxMmpwPH8ikh4jRLvUR1/VKkzSK\nffiJW2qI51PJA1Vq3uXExz+jlN4TxeShe1lM3HicSq6NpxXf55mggseiAZwABCNGjHCdO3f2UeVY\nSC8twEA+MFoLMliHFhQ1FQl0SJdFPjzKS/CBhUUL8BF1f/cWnlBepu8AdtSjSjJoX47I7CMPQMQ/\nMspH3oIrwArYCUEHefV+hHnadrY1UMm7U8m1lWgtrDSTvuXlph1aPPgjrXqA9M
gHVwa9A4yurBAC\nAvVSeB3bHGsphO4Y6DVsmm/p2lLrC+rCsKySL36cfKmbpG/yQa6wLuTasO5UWpI5vD9hPafz9VjX\nBXZChZfycKH00KqDyS90SkXh8RcxfCB5ANVGibJJK3xgJJfWihM+xIUe/lJvYKUPVKn5JcUPy590\nPjyG02spvSfCa8PtUL/cr/h9COMmbSMz9yS812GaSdeUe4zKXs1ZwAZ+MmHAqoIVRXDD1BMsgIWg\nI7TqABdJoCOoCPMDOgAdAIQ1eY8cOTKXPc1pyEbAUfpnP/uZi8bO8fmzBnIkD2vkQVYC+VCeMH3y\nQzbBl2TiQ58UeBYej/y4xo+fEPUau8h3L6eLeXxhRvXxEyb6bvDvRMMXWChdA4343vHHiZ6DxYaw\n0gwr02Kvzxcvbl3hexTCTVhnhM1M4TZph9exnS+E5QAgBA1xoMh3vY4rvzho6HzSOuk7ybHQKKDr\nwvIhJ35G0kvYmxYoCq1GXB+CUVhepV2PdV0clFEMlZUeWG5avocjVDhxVDlzc0hHVKqKj5sQ9rDi\n+vBhyOdwDBTpppQrXzk3EPnkJa8HKimdpAcqKV6px6T7Yp0Vq/XR1f1CXp4FFvSv+5lUDq7h/ocv\nkuIlvcQ619Kaj+7GCc6SvNiygAh4mJk8DFhV+vXr560lNAsRD0jAFwbfnRB0wuYrQENQgYWFIKiI\n73Oc3lZz58718X74wx+6yy67zIMKztKnnXaa1wnnmaWdMTDowo7skgNZ1GylsgA2AI4gB5kkf1wG\nn3H0A6w8/vgT7s45d7l777nL7d1v32guoe+7tddZxx3xVY8xxdU6GrAwgq4v3K233eHwLYoGJXR9\nog/sDtv3iLZ7Kpqt82gAHY0ePTrP2eIP846k6b3jW8IfqGID32jVE3wD+BYkVdLFphfGw52CuiHp\n26J48bFz+Cbz3dT3W/G05tvOdy0eqF+ok1SXhec5x3EBlupIxeEbWUhGxcu3Jn3pUHFCg4COsZYs\n8fhhHHSA7sKA/GHZKvk2h+lWvB19EOsSIiApOBdJVLBmI1IyEibHtEQPXk7uQueIFD2Quet0fbiO\nHqBmw4BzTanycU1043P5hPK1dI64oTzxbdJFnjCEeUUPW3jKb4dpRg9ts/OUN54HOmopMNItow1X\nGhjRM0mGuEzF7ifdv1JkbGkk1wgS/DxSTPOQJBOjHkfQ4UcCjiwdTVF32ibWHNPxp59+uol5uN58\n800/NUP0MfAjIUcQkjgKMnlGoOLn6oocoHP5RmDj59eKAM2PxMzIzo9F92XQoEG5OBFU+fm2yJtn\ng4Vt5GLKi+hj6aeFiMDFjwKNLJEVyE9rkU8ehvVnSgfKz7QRt9x2RxNzWpUTGHmZEZgZiZnlxmjK\nDGSwkKyBao1cnrb3LpqU1j+3yaVOPhp+NyKobxYp/M5HFWyzc9oJ39/4N5U4fDfDPIjP9zPpG6s0\nOUd+SptvM7JwXMdYo38F6qwIMnLndQ3nI0jKHee6UM74dZzXtzssP8fzhbB8ESw2kyvpGvKMy4S8\n8TpO13JfyJ8l331Q3Fqu82ukRlIU+3CFNwhFxwMPpBTMDQwfEOJyw8I4xC10w5R+sfIRn/QkQ/xB\nKHSOa0t9oML0kl5E8pcscdgp9MIgS77AnFiR2Tnf6ZKOI0N4TyVrqWvSIK1KAgDX0hw9wEfUtdvr\nlPhMsRBZRPw+cPHQQw81RePd+Ak+77333qaoq3gT68ja0jRv3jw/zQRTRbz33nt+igbgImpSSgQd\nlQU4iiw0Po/ICtMUjbeTm94BaBLwADLMtTVmzJjcPUePQ4YM8eVCrugfvZ/uAtBhCohobKCmP//5\nz03M2RU1b+WdfgKQEpQwzUO5gKMyxddAExDFJKBXXTU+ftr2v9IA97MaQJim9w5AB3hKCWGFHv+u\nlZJOLeKG32DqpLYSQogTiKWh7HWHnTQowW
QoXgPMjaVJJYu/Kn9MPgghuBULO1wTB8r8uRQ+U0xF\nEo1n40Ei8nHJTcwZzo7OHFQAE/NczZo1yy/MdcXs5lh50BkAziSjmk8Lyw0QlRTCiTyjZis/V1Xk\nF+Tns2IWdObZAlqYwwqYAq7ImxnUQx1GfgB+FneAGKsOwAW0Ms8WoEOagq5QFuIwb5WHkAhyWjtg\n7QF6ACsAy0JzDRQD5M2vKLyXhveOb0mp9xrrCODAM16MVaKwFio7G/5Z43sU/sHmfZOcyErcthC4\nP/r+oJM0hW8gTCScBdNAURo4MhpUkHb2U045paj4xUaibTr0yYn+xTa7NPpwNPPpiV6kZufL3YmA\nxftD4LeTL3Duxz/+sT9Nt/O99trL97DCCRnHYHpCEfCr2GSTTbyfDj4v+MRo3iucEHHGVFdy/HeI\nIz8dn8BXP+hW81vhAK35tvC/YVGXdhyQI2DxSwRQ3jkZmfDPiaw8jrm0CMh0xRVXuA022MD78mhK\nCvkM4adDkCwPP/yIO/mkk9zue+7thg073bVvt54/X4uf8RMnu8kTxrvDI/8fpqSw8KUGeLbwl4qa\n/Kqqknq9d5Sl3A4P+MHIjySCtlYdm6aQskM5CsXjXGTh+JoTb0vXZPF8qJO0ldlgJ4tPVB1l5mPL\nnEuF4KCO4pWUNaBDeXBO1jxSSQnwUX7ppZcc4BFZbzxw0KsJ2AAy9thjD/fGG2/4S4EKIAInZXo6\nAThM7Ans0HUf2AGCAAzBhfLEYZN5pzSEPnNjSS7+k0SWF583Ts9AjWCH60LY4TyBiUyZ5oKATPTm\nYpRlAZfkALrkkDxmzFg3c8Z0d8GYi92AA/bz19b6Z+Fri3w3f/KdMf3LiVVrLUOa8uP+8pyeeuqp\n3vGzGvNk1bt8+oZQrnICPYNw1OVPT2RRKSeJqlxDD6rQ6Tsp0ai5zcNO0rlGOsYfVLrms44sWX5S\n6zSV78tuIGmSyGRJtQaojPlXxpL1QC8XelMBO1FTk1/iZeIfNaADtOjDLDgAaNgOe2hFPgi+FxRg\nwiLDqa5hDeTouPIDHpEH0CGvpIk8SQ+rDaDFmgVLD4E0kQeLEWDDQs8n5tEiAEDsR/49/npZiSQj\n8tBFfMFvfuNuuHF63UAHWbt07ujuuXuO7+XVv/9+DQHWlKuUwPPw+FfPJO8a1r6o2ccfKyWdtMYF\ndsJJc0uVE6sBActUUo+nUtMrN37UXOVBJqlHE72xdL7c9LN0nXqYYYWnR2jagll20nZHMiAPTVkE\nVf5+J4M/yK9/mBKfCkaBSmbw4MF+F4sOH2dZWAAOrCtYVKJ2at8tXGBx0EEH+RnI6dLNiy/LDtYd\nxsxRF29ZdrAwoVOapKjQ2MeaJCACSIAT4AZo0fg9WHZYkENj6AiAuAawAn4YA+iwww5TsdwJJ5zg\nLSfIhyzEOfvscxxdxK+Zck1Nm61yQuXZOPW0YW7uffe62bfMLqmbcp7kUnuYZ41Fgfsft+DwrPJs\nhM+o4mdpjfy8S1isLJgGaqUBg51aabqB8uGjzMeYdfyDnKViYtHBciN4i8vOeWY0x9JCJcM+MCK/\nGSCDwfuAnchp2PvFRL2yfDLnn3++N7HjH4OO1IyFD0/YfEQ8FsJWW23lKzLiC3RkgQGuAB0NXkje\nbOO/w3HicQ2LIMonGv2wj5xnnnlmzuTPeDz8O15vvfUiHVzgyzll6pRUgY7kF/DMXzA/08+byqN1\nCC08WyyFAnBAHKw+LcUtlE69z2HBZOHds2AaqJUGDHZqpekGywdA4IOb1Q8WVh1kB9iSAueAEEBH\nUBf1UHIsgAWAoaHj5STMKMWHHnqoBxCsJTfddJMfsA/gIR3W+MvgywOsnHHGGblZ06NuuA6ZCIIW\nWXTIC6gR6GDFiUMOTVha1FSm67H2sE1g1OU77rjDb2PVOfron7mFry50s2b/KpWg4wWNfgCet976\nY6Z9eI
CU0JpBhV9q4LkEkkJQKjWNesZHbjWFZ/mPUj11aHmXpwGDnfL01uavAgDo5UPln7V/mVQ4\nWKby+Q1QKan3Vdh8BYSETUm/ifxboq7kHlwAkU6dOrloHB134okn+udjxx13zDUX0XwF6GDZica3\ncQcffLBvNiLiDTfckGsuE+gAVOSFRYe046DDcVlxNCKymqTUuwrAIZ7AiG2OUeFEXem9jDh4zrxp\nluvRvavfT/NPv336u+222zYzvbR4zniWFJKapnSu2LWsO1gay4GlYvNprXjIzAK0WTAN1FIDBju1\n1HaD5YXTJB9zKs8shZbkplJS7ysqFQJgIYsO4IFlBksOzUNYWrC+0DuEeFhOosHb/HW//OUv3dZb\nb+0dhrHoROPi5LqgAifRyMq5aUq4ILTGkGYIOmwDLkALQT45NIux4IODRSmEnTANrmefcnDf6J4+\nZuxl7ujBA316af+hl9b+/fd1l1x6SUXOra1ZzvBdwHLBs1TtAKTL1yxL1hHJzR8lC6aBWmvAYKfW\nGm+g/GQhAR5YshCojDCjU9knWaSSmq8AGCAES0sIOnIOlpWFZiRAAwih59OyZcu8SugyTKUXDern\nrrnmGn+sXbt27sEHH3Q/+MEPfPMT14SgA9SwyBlZQAWoEMiLHleCHNbAEwvnCMgN3ITpCJguvHCM\n22DDDd3UKc3n+vIXpvjn+mkz3E0zboyGALgzFf47VNwsClgtahHIh2cKgMhC4H1D5qxapLKgY5Ox\nsAYMdgrrx862oAE+YjT5RCMEt8q/2BayL+m0mgCoII78qkdZmIDKwrFCzVdADlYd1sAEkALkABoA\nCNYV8sIJmLD22mt7B2biKQBdW2yxRa43FE7EnAecSFOQA5ywcFygQ14h6GDREewItkiP+Cxx4Inm\n+HKjR5/v7rt/ru/mLZmysmZW9W7durohJ59UF5G5dwoAM0utA4Al2El6lmstT6H8eBcAHf5kjLbm\nq0KqsnOtqAGDnVZUbltJGsdaLDtUAq1htq+GHvXBRT7kTQqcK7b5CtgBQoAJLCmADs1UgAfAAWww\nxgazlYeB8ThGjBjhormtPMBwDeBCZSAwSQIdrDSkCUgRn3y0sM/COSxELITQIgUsYeFB5p///Fi3\n48493dnDh4WiZWZ77oMPu1OHnOxq1TsLCOb5UeBepSFgJQF00m4tUTfzxwNITIP+TIa2pQGDnbZ1\nv1uttHx0qRT4oKXNj0Cgg1z5Prj844z3vgphAUiQn46ar2jWAkAADaAFJ2QABOgg4J/D6MoK++23\nn4dC4qv5ifjs47tDfqTJObq4AydACgGAAaLC62TN4frQooNMBNIjyMJDWgsWLHDHHXe8e+LJJ1Pd\n+8oLXuCnNa07PC88ywpAcNqeacmGlZJm0rRaVtP8XZAObd02NGCw0zbuc01KqQ8blpO0WHgEOijg\n8TwgVm7zFTABZAAs9LRiAUCAHXRw8sknf03vjJAsQAqbvYhIMxZNTn/961/da6+95iGF4+uvv77b\naKONcqAj4AFyWLAsyU9HoMN1BAEPaQM9Z5xxpltlte+6MReO9uez+iPrzqI3FlWlCDwbCoBNWp5f\nyZS0pilLYJZGy6q+B/neu6Qy2THTQGtpwGCntTTbRtPlA4dZnQ9cvSsMKgNM6BtHPhXAR75/58hZ\nbvMV4KFu5Vh32B82bFhuCokddtjBD+bXr18//0Qwl87ZZ5/tgQdrDWAEqAApgImsMKw5xjkGLGQR\nHDHvDH5AgFYh0AkfQdJm8MPte2zv7phzVyZ9dcLysN2rV283dOiQsnpm8WywKKSlaUrytLSW7Dzb\nBJ5vgCefP5qPVKOfYv5g1EgUy8Y0kNPAf0Xm+9G5PdswDVSoAeCCUXl33313DxfdunWrMMXyLuej\njwyMZ0NFAIQkBR5/Jshk0D8AjXiAgSwhWFrUhIUvjfx0ABWsKlh15KvD
OaAG52bC8ccf7/Ned911\nHb2vGF2ZuXyooNinmYqFPOREDOSQt6w/yLPOOuu4zTff3PfcYk1FRzrvv/++n64CfcctOvGycp7e\nX0uWfOCGDqmPY29cpkr3P/v8C/fyy6+4n+7at8WkqIBxzEZ3LLLecC9YshSQnxDKDbB37NgxaqI8\nzo/9tNtuu/k4tf7hHSJv3vvZs2fn/YNRa7ksP9OAWXbsGWgVDdA0FFpVwg9zq2T4VaJUaliX+OgC\nMvxjz2dhqmbzFbOeU9GwZqRkYOuII47wlhogCT+fAQMGuGeffdZLOnPmTG+pUTMTVhoWWW+AHBaB\nlPaxBMmiA8AwenPoX4Ke8+maiT5XX2PNzDomx58bjbuTrykLvfA8EAQ38TSytp8EOmEZOM97R+AZ\n5PmvRUDPvG/8sWCdlaEoaqEbyyMdGviy20Y6ZDEpGkgDAAaVDR9bRloGQPShbo1i6mOrip68+OCy\nH8JAmDcyEfbZZ59cBcE+lhUchWVtwWLDgoMv52TVEYDcf//9btddd/Wgg28NfjlMIEo8FpqaAJSr\nrrqK5H0YOnSoTxMIYqF3F1Ye0ie+eneFjs8cC5u9gB0qcXSshcQBPS2q7Dn+wvO/cT133onNhgjM\njt6uffvc/aWsKjdr7r30kg94s6QIvT+UK1/gHM87wMPCM67r8l1T6XEAR/mSt4FOpRq161tDA2bZ\naQ2tWprNNMDHln9706dPd8wBxcewWpUPafOx5V8saZIPFVwYqAQFXjpOvEp7X+GQfPnll+f8c7bc\ncksPOsx0jsWGRdBETy6sMPS6Ouqoo7wYvXv3dgcccIDfpkkM3x+sQlzPmqklOAZUyZoDCBFaarby\nkaIfyi3g6dWrl5dJ5xphfcyxx7s333jd3/dGsd4k3RcBSyHQiV/Hfedd03sH+MTfjfg1xe6TNu8c\n7x6BbVmU/AH7MQ2kTAPLpUweE6cBNcAHmo8i82gR+OAKTPgHXmqgAhfcYDWiIpBTdNLHXNYP8hL4\naKZx5OK84ASfGSw4surIr4bjBGADMMHSQ7PV1Vd/OQLx4Ycf7p2cQ9DBSiMrEdBDGvhV7LXXXj6t\nefPmeWsQcdScJWsQFhxZcgAdwQ4XFgs6xEXP0skhhx7OoYYKG2+yqevdexdfxmoBdNoUVA7oUAae\na713vIPADmsAqJz3DjlID6jhOec9ZJ/jBjppe2pMnrgGzLIT14jt10QDwIkA5d1333U777yz/xDz\nMSbwoWbhQ0oQpPCBJVCB84FlIV6xgeu5FisLzVfIQAA2gBEgB5DRmDrxwQOxsvzlL39xwA29mwjX\nXnutHzwQCAmhCcABnOSzwzxan332mV+oeOhiTmAKCVl2KMuaa66ZmyUdyw7QIwjyF5TxAxy++94H\nbtwVl5dxdXovoQv6zBkz3KybZ6ZXyAok0/Ov96KCpPyleueAHXogbrXVVv69C0GRbb1n+d473qFq\nyVRpmex600AxGjDYKUZLFqdVNRB+UNkm8JFnO/wI6wNbyUdWzVfk8cknn3hQAlBkgQlBJ2nwQGY6\nHzJkCJd7AGG+q2222cbvAzukAzSp+Yr0gB3giUU+OoAOwEPo0qWLGzlypO/ZRdMVwEPvMDVjAUJY\ndki/FKuOT/yrn/ETJrrPv/hHwzgnq2yNDDvVBh3pTOti3zveQd658F1UGrY2DWRFAwY7WblTJmfF\nGuDfKvN4EaZNm+Y/4FiUgB3Biaww4dxXnAc26KI+fvx4fz0jHD/wwAO+S7ggBMgR6Kj5i/QEO+pm\njrWH/GjGuuKKK3x6jM2DLHRlx5oj2NHYPaFjsr+gxJ9xV17l/vHPfzcc7Cxd9qFbv327XDNgiWpJ\nbfTWBp3UFtwEMw20kgaWa6V0LVnTQOo0IEsKzVdsYynCnE9zlGAHIMEawxL2vjrmmGNyoIPPDV3I\nv//97+csLQKdeDOYrENKC58fjbjM
mDw4KRNwdAaKCFiHJAfpIRvph749PqL9ZHrKi3y3T01IWFMs\nmAZMA9XRgMFOdfRoqaRcAzRf4aOAxQSnSgIWm5122skP0PfWW295wAA4AB0gA7jAx2bHHXd0Cxcu\n9NcwieesWbPcWmut5UEnbAIToKipSk1XHAdW8LtRl3LkwMkTuThGOOGEE7wDNHGBI0EX17PPcfJj\nsdCYGgB0gBwDnca8v1aq+mnAYKd+ureca6QBKpBCva86d+7sYYLJFAELwQnXaZoHRJ08ebKf+oEm\nJcBFoCNrjpqrBDnsc454xOc6HJzpsg7ssGy44YaOAQYJOD5PnDjRx+c65AjhCwsPAEYw4PFqcAws\n2OEHHb7cyfivQKcUh/uMF9nENw3UTAMGOzVTtWVULw2EzVdhF1nAQc1XTMnAfFPPPPOMB58bbrgh\nmndpqBcZB+F77rnHj4As3xlOcH0IJVh05OujZjDiqbs6/jeADj45Wtjffvvtc5OG3n777b4nDFYc\npU1asu6oSYt0Swkan6eUa7IQ970lS9zW22ybBVELymigU1A9dtI0ULEGlq84BUvANFCBBtQjBN8Z\nbSclJ9M+PULUOyQpXvxYvuYrQEXNRbKgADIMDMjknbKcbLrppg4AYeZxzuOozDmuBTy4FhjBAsPC\nPpDCeQKQQTMVFh18dVjYJi0cm0mLOMOHD3d33HGHW7p0qe/tBVzRzEV6nNeChUgO0dqOlzlp///9\n3/+6dxa/nXQq08fozp/1YKCT9Tto8mdBAwY7WbhLDSYjPU0Y7+PGyHdGY30IYICTpECFwHWMF8N0\nDPSGwkpzZORonK9LLNcUar7CD0bWE6DiT3/6k9tjjz1yoIMDMlNBYH0R6CBbCDqCHPnXyBGZeFwT\nBx32sRQJXoAd4AX4onfXD3/4Qy51F154ofvlL3+Z891RfK1D0OH6lgI6mvfYEy1Fy9z5P/7xLdex\nQ3absQx0MvfImcAZ1YB1Pc/ojcui2MANCx94DQjYs2fPkgYFVLk1OBrp4eMAJMUHGKSCB6aKGTyQ\naRx+/vOfK3l34okn+vSw3jCODtYYQAM4AWiAI4EOa6CJ44IXgU5ozZFFJxwNmQzVlEY69913n59X\ni+Onn366l534XEvTF+BFcxjQRB7IVAzsYDXT6M6k3SiB6SJ6dO/qoTdrZTLQydodM3mzrAGDnSzf\nvYzIDphocsAkKKm0GAAPC5UHlp8jI2sP+RQ799XNN9+cswAhCyMa9+jRw8MFIPLmm286oIwQBx35\n0xCPAHzEQQfgwZoji46sMkAKcCTfIQAK52ag69e//rVPj+YsYI5rSUc+P2wDPICQ0vMXFPjp1au3\nO3P4CLf7T/sWiJWtUx07dHSzb5md17qX1tIY6KT1zphcjaoBg51GvbMpKBfNToBHCCGtKRZ+P+QH\nHGDRIcyZM8dbaIAKltCKgkPxGWec4X1lJBeWlXbt2uWsKEAG8ILlp1u3bs0sOoBO3D9HUAKMYI1h\nEZSoCYq8QmsMctE0BkiRJsCz+eabe8sReT/66KM+PunIyZm1IArgIb0wTZVHayw7hxxymHfmHXPh\naB3O9PrZ5xa4o44c5Ba9sShT5TDQydTtMmEbRAPWG6tBbmTaioGlhWYkFkFPa8uI9YW81OOKeX+0\nTd6yoAAo+OccfPDBOdDZOBrb5KmnnnL0ygIqWAQnXLfddtv5EY+XLVvmp3ygyQlLjByRgRLABggJ\nF45xLmy6SoISrDPEAZa4Zvbs2V5dABCjNgNEoVWJvCkH8IZ8BOIoADeyqHEPaMJ64IH73ROPPaoo\nmV/fd/9c12/f/pkqB0DOs2bdyzN120zYBtCAWXYa4CamrQhYV6hoWdT8U2sZ+fcsKw/WnVVXXdWD\nAZaT1157ze2yyy7egoJcgwcP9ousMvjGCFKAEIACuOBa0gWEGFQQuABcgBmOcQ3WFjUxkZ4gpyXL\n
C2mFMAZMXXTRRW7ChAledQAP0CKokv+OLEjvv/++e+WVV7zzNhWqLFuh3oE/xvK57fY7vZ9LeC6L\n2zTLDR06pBnQprkc3Jd6vQ9p1ovJZhqohQYMdmqh5TaSB9YELCmyKvAPtp4BOQCedyJrzyOPPOIt\nLnPnznUHHHBATqwLLrjAz0mFFQdwkFUGqBDoYEEBdFiAniXR2C50ee4Q9QISfAhyAB7Ah+OAjvxp\nkqw5OSG+2gB41NOLvAAeoAw4I4T+O3/+85/dCy+84J5//nm3YMGC3AzsXyXlV8ANlasWrAkXXXSx\n++jjTzI/+/mtEbBdPWmie+yxeWGRU7ttoJPaW2OCtRENGOy0kRvd2sUELKhUCXzY02SmHzRokLfI\nHHjgge7cc8/1MvLzq1/9ym2wwQbeOiOrDtDCNpCCpUWgA3iwrWYr9hcvXuwHBJR1JQQdNYGRTzGg\nQzw1Q5GH8mXcHcb+UWDcn7ffTh4vB6fqPn36+MlOe/Xq5e+B0mTNgsz4A7268HXXpXNHJZu59aGH\nHeG6dd0uGpPo5NTLbqCT+ltkArYBDRjstIGbXIsiYtHBgpI20KGCB1qwsigABWPGjPGWHM4BJnFQ\nEehgyWHBX0agE/rWvPzyy27rrbfO+fqo2QpYIhQLOpJNUILV5qGHHnJYoph0NCnQNLfnnnvmFixK\nyp/4SouyaKEMZww7K7JgrZRZ687cBx92p0aQM3/B/FRBddI9MtBJ0oodMw3UXgMGO7XXecPlSLdy\nPuosabLooGgqfEYmxqqjQFdyLDMADEFNToACkMI1nMO6ItDhGOBC3NAKhFXnjTfe8D48DELI9Syl\nQg6+QNLhY4895icglbzx9VFHHeWb55ADSMN/J+ydRd4syAzcaAF42MYyxBQVWbXu9Nunv9ulT+/U\nW3W4nz2/snbG76HtmwZMA7XVgMFObfXdcLnhhHzkV93L6+2jk6RcVfgjR4507du3d1OnTnV9+/Z1\nxx57rHc8Fhhg3QkBAcgBdgREAAwwBFzE/XOAjg8++MB9+umnvgmJdFoKAhvWgA7XhgGrDbOtAyXd\nu3f3TU9hd/SHH37YRxfwADuyTgm2KLsAB8jRNutJk652yz780N0ye1aYbeq3x0+c7ObccXvqfXUM\ndFL/KJmAbUwDBjtt7IZXs7j46QA4N0bdzMMu3tXMo9K0VOFrfJ3f/va3fibzKVOm+JGRqfiJIygi\nHoDDAiAQAKHQEVnAA2iwyD8HfdALKunfPJWfFqa7iAemv6C3Fdey4FwsOAG6sES9+uqrrnfv3v5S\n1synBVgJwgQ7yCrgiUMO+ywfffSRO+vMs9ygo452Q046IS5OKvc1rs41U67xOkqlkJFQ3GfuoQXT\ngGkgPRow2EnPvcicJAIcrDtpDQIZIEYgQzduYIcZzsPjQIXiABoEQCberRyoAHLkH0Mcgiw66APw\nwWLDkg9uqBC1AI3IqiD4AkyQi95ZDDaI3JdccomPdvHFF7tOnTr5fJFRcqpZTvIImkiLdAV4L774\nop9t/Zln56e+K/rSZR+64447Puqd1scNOfkkqSl1awOd1N0SE8g04DVgsGMPQlka4KMup+S0+enE\nCxSCg2CG5iH8eAYOHJjrUi7YEegADQIINV2xH4IOFhTABqBBJyxJY9xguRHYsE6CG0GI4ERr+Q8B\nO59//rkbMGCA+8Mf/uCLCfzgswN4ydIEjMm6g3yCKK2BIC1z5z7gbr75JjfzplmpBh7mwKLss26e\nGb+9qdnn3nNvLZgGTAPp04DBTvruSSYk4qPOMjrPLOVpKgSVvIAHgABqcAI+4ogjPKQABlhOgApg\nCBAAHgAbQQ4AIYhgAD9GWwZqgJwkuKEZCqChaQoHboBQsIFuQrAJZRPgaI08su7gR0RzFqM47733\n3l7FG220kRs1apRPGwsTwCNZ41DGecrGOlxGjb7A/TNKd+q1U1
37duul6dZ5WU49bZh7660/uhnT\np6XOAR4BZcUz0Endo2MCmQZyGjDYyanCNorVAP9gs2LVoUyCDEGFLCV77LGHH5fmoIMOyvW6EgwA\nCgIdppag+zcLkIMzcjwkDeBHfqoId9ppJy+HIAeAEdDE15yLL7JIAWUsQBYTnRL22msvt9tuuzVr\nctPgiIAPZVHTFpCDtSeEHbbPG3W++0s0UOGll12WqvF3sgA670RDLgC1FkwDpoH0asBgp4b3ZrPN\nNssNCIffxVlnnZXLnUpWgZ42jJxbTKB3ET2LFFSxa7811oAOH/csWHXC8oewg5WEaSQYZPDBBx/0\nFh2gAxBYuHChW7RokZs/f75fGC05HnbeeWfHP3kWdBFabshHC2myACddunRxq6yyigcZAU4IPSHg\nJJ0X8CA7wDNp0iQvO7JRDnqbqdlNs6PTxAW0ATsCnhB2wu1zzjnPPfLwQ+6GG6fXvUkLH50zzhjm\nm67SbNEx0Im/GbZvGkinBv4z0lo65StJqksvvdT3UOEiRpp96623SrreIresASwVd999t7vyyitb\njpzCGEClKniags477zx3Y9SbDH8QmrYYMycp7LDDDr4nFHDD6MQEgaUgirXgJg4rDDzIAIRACFAS\nnhfk6Fi4ZjsEJ/JV76uTTz7ZYWUDfi688EI3bdo0b7EJHacBHCw7svCE52TdQR/o5corr4jmM7vb\nbd+jm7tqwqS69dKi19XIs0e4jh07ucmTJqS26cpAh6fRgmkgGxpoKNjJhsqzLSU9jfbZZx+3ceSP\nksVApc6CtebQQw91+N/84he/+FpRgJpdd93Vd09nCgauUQgBBFAR5AhaWAtYwu1NNtnEvffee45B\nDddbb71cUxVxw0Vww5oAjAjQJD/xaY7Dssd0GITrrrsumhhzaM45WU1W8j/C6sOitAQ5SpM0Djhg\nfw99559/gZv/3HOO8YlqNa0E1pwbp890I0ec6QFU5UKuNAWA30AnTXfEZDENtKwBg52WdVS1GI1g\naQJ2AIEsBip1AIJKfo011nBPP/10rhhYbrDYMH7NNttsk+tWTlzBjNYhmLQEOCHssL3aaqt5wKLb\nNyMukxbpshAEHuSrRdCitYTm2i222MJbp5jQlK70DERIWYhLWqTBNgsWHsCHhePKT+lp3TO6vzTN\nMfDgFl06uTFjL3NHDjqiVZ2Xr582w90040bXrv36Dt2k1QfGQEdPia1NA9nSwJdfvGzJ/DVpmdGa\nDzuDrCkwJD7H8JOJB5q7OK6KhTXHkiZY5LjiMfIuQYO5cVxBcVgjDwuVJvukQQjz1DFdH1/fdttt\nuetJg7Q4Vk4grzBvyZRU3pbSp9lE4+u0FDeN5yk7C5X/zTff7AGB0Yrpwo0PVdeuXT0MEIcgZ2b5\nydAbii7gX3zxhW/6Yt3SQhMZcbiO6wGttdZay89YDuRIHpqc1ANMDsb43IQLzWDIywI44St0yCGH\nuG7dunl58QVDVsEOQCTgYjsMKmN4TNukO3LkCA8er77ysusdAdDIc0e7ha8tUpSK11hygJxevXp7\n0KFZjq7lBjoVq9YSMA2YBmIaaFOWHSp3xihhFN14AGCAApyDf/KTn8RP5/aBjqTrcxGiDUCnJZgJ\n48e3gRqaJ8JAnsged2wO48S3q1HeME0GyCNsnNEmLJVFVo3999/fH6Jyfe2117ylRVYWwIAu6oIF\ndQEXOLAutM050uJ6pRnmD6hst912ftBBBgZkXxYYWWO01nGtBUeki1wskydP9hOSksdxxx3nbr/9\ndp835ygHACR/HdIV6Ggt2eJrdAOAcO9vnjXbW3r27rev2yUC/22i96RH967xSwruAzhzH3jIvfrK\nK27uffe6rbfZNmqGG+iOjKYcSXMwi06a747JZhpoWQNtCnbygY7U9Mknn/h5k2huWn311XU4twZi\nigmVgA7px0EnzBMoA8aK6a
1VaXnDfNlOg58C1jXdByr7UgOVO9cJeLie5iumYsAXiXMCGaw6LAIK\n1oUAJwQbyUZ+LOQXwou26ZKOUzTxGTMHoNE5wY32WWtbkCJZN9hgA9+7rH///u4vf/mLmzlzph8w\nEdDRNUpPMrFfbAB6WEaePTxyYr7L4UQ8ecJ4fznAssWWP/Tb3//+Zr7HmdJl8MPPP//CvbP4bfeH\nN99wjz/+mDvk0MPdT3fdxQ0dcqLbOAPgbKCju2lr00B2NdAQzVhU/FQWGkaf20FvLI7JTwaACC0y\nxOU8C00YCgBPIdggXaw/ulbXxdfHHHNMLk7YxTweL99+PvmIX0g+pVet8io91vy77xk1Z6QlcK/K\nCarstWZ0Y/xECAALi6whdPFWsxVrbYfNUsQBimTNIV2sKGGzVNgUFd/GWkj38MWLF/smq7DbuJqz\nOB8OFqhu5BpDh3NMGMpAiQSclblfyERZkFGLZJW8/oIif2jewgozdcrVbtEbi9zsW2a7AQfu71Ze\n6dvuf//9L3dX1J1/5owZuWVJ5JC90ndW9BagUaPO8+8EliKcj7MAOgB+GiC/yNtj0UwDpoE8Gmgz\nlh1ZA9ADIBICCPtUnPL5ARTC86HuAKOWrCqcDwEqvL6Y7Zbko5kLeZOsT0q/WuVVemlcA68t3Yti\n5KbS5d+7QAcYECDQ/MO2LDyh9Ya0ARtZXLSWBYV1aFXJt008mrLoIfbCCy94oCSu0matEN8Gurke\nuMLfB0h+9NFH3dKlS92QIUPck08+6S1T8uMJm7LUM0vlUB6lrGXxKeWarMQFcgiU0YJpwDSQbQ00\nhGWnmFsQWnWSKkjmSVLA1yWf1SDpWl2ndTFxFDdpHcqi8/E0W3IurlZ5lT9rLAWAQVpCWMZqyAQ4\nYO2guUrAA+gIdmQJATiABqwqoUMxFpvQKhO34IT7ioflRlabddZZxzejMlIzTs3kIeghzxB0wvIS\nJ5Rn9uzZudOnn366t6ZQJoAH644ATs1yLVkpc4m1oQ2BTpqe9zakfiuqaaDqGmgzsBPCAb4sqjy0\njvfaSoIdmrCKCYUsLsVcn5RPPM0k+cK0q1HeMD22sX6k6eOPb1S1gSdeZp4PLCdqkqK5CDgBUgQ3\ngIvgJQSa+LbicL0AB1iKdwnHh+jdd991qnDjMoX7en5l3SGtDh06uHHjxvlozz//vB+9Wc1Z9AaL\nAw/WKgv/0YD0nqZn/T/S2ZZpwDRQjgbaDOyUoxy7pnU0AKSoki51HTbPAXw4LNP8WA3oQRY1NSXB\nTTmAA/DIeiOwAUiAE5bQchNqW00nWNNaCtIhacnaxHxfe+65p7+UqSQAVSw5WKlC4JF1R81zLeXV\n6OcNdBr9Dlv52qoG2gzshNaS0MFYJvz4Ooxf64cjqeKOW3Jaki88X63y4pyqyqDWOsmXH3oBnuRv\nlS9eMccFO2oSiltxZMmJW2wENFrH4QZwaglukuTDssDyeDS2UUsB2ZUH+SE7/jusCUcffbRfFwIe\nvQM+Yhv80bONzi2YBkwDjaWBNuOgTHdtNe0AE3EfmDTdVqwXcb+d0KJBk1YIM0myt0Z5sTaoQkjK\nM8vHBDoAgwKWEsABCCCwr0VWGda6lrUW4rNdaQAwe/bs6YEH/bNfKIQyMyUF/jsMAkl39PHjx3un\nZaw7xAPqBEgqA/uUt1jZsTyxfPa3z92SJe9/bUZ4Jj7t2LGDizr8e0dfypLGoOfaQCeNd8dkMg1U\nroGGtezELSEh3GAFCMfCAYJCP564/07lai4tBXqDhfKxH1ou4iCUlHprlZfmkEYLL730kocIKnhV\n/jQH/f/2zj3IiurO4z9SIJYVUu7GZAOuFi5UGNkqERMZhFpFYnzk4Yih3IqRAZJSUUkibtCMZA1b\nAg6iUjKWQXwOJghEHIeYVWMSJCKPUgOl4aEbE0UdrazlHz7WRLOV3G+THzlc7jzune57u/t+
TlXP\n6du3+5zf+Zyr/eV3fuecYs+Ox+oUx9tU6rkph6NEgl7I/lIu9azsd9EiIaNhM01H18rESlpoUMLE\n43d8Krpyn22m73pK6v97ChunXnTxJdYwqsHmzLnCfvbYL+zd9963f/jHj9u05uYDjnGN4+39P35g\nL+99zW5aenNk39lNU2xZ2y09tqUnG+L+zpkidOImS3kQSA+B3Hp2JHb0P355QLTWjqZzS0C4d0fi\nIRQQYZd0N+08vCfp8/7al0R75VmIY7dzibWeVqmuhG1xAHc5ZegFrrbJ26GkvDho14WEck/huV9L\nMnfPmgSLzksl2ST7JdokwiReWlpabN26dd1OR3eBp+e8nX7udWgo7af//YjdsGRxtCjg+IKI0qaj\n5W4SqhWUNz252bZs3mJnnnGmfbrhWDt3SpPNKKzdU4uE0KkFdeqEQPUJ5Ers9Da0o9iV3lYVVpyD\nhEItk8RW6NkJbZF9vbXT74+7vXrB6kXb3yT7+9qG/tbVl+f1Ir/88sujF73fLwHgqdqixustlcv7\nIHHWk+DRc26/PFQSPI888ogdd9y+VY41Hf3GG2+MvDny6ujecEgrFDobN260Fbffab9++ilrnvkN\n+83O3WULnLAdw4Z+ys6bem50zJ37H9HWEe3t91h7+8rIA3XuuVPC2xM9R+gkipfCIZAqArkaxpLH\nQGKgu3/l6wWrRdt0T7FnQQJH4iANXh3ZokUJQ0Ege9euXVuWfXG3Vy9apTgET1RQSv7ohR56Sty7\n4XlKzNxvhuJ21BcSaaWSizOJFh/OUvxOOB29s7Mz8l5p+EqCRzO0wvV33nrrLVuwcJHNunhWtBXE\nLwt1Xf3duf0SOsW2Svh8Y2azbdjwS7ugeYa1tbWZhriq8fvyOvw3XWwbnyEAgXwRGFAIRix/g6F8\nMaA1ZRBQsOukQvyIPCF5SNrnSW3xf+VnrU0SPBJqpQKX9Z+2huM0A0tCRltdXHjhhfbQQw9FzVy/\nfn30nLw/ikPydYC2bdtmN9241IYV9tuaN29erAKnN76LWpfYvJYr7eabFUy9L9aot2fK/V5CRyKn\nFLNyy+J+CEAgGwQQO9nop9RYqeBUxe34v4xTY1iFhrhoiyMWqUIT+v2Y+sK9PcWFSfBoGMs9OBI8\nI0eOjLw5iunR1hLyBCmYWVPmH3yw09TH3/z25fat2ZcWF1eVz9pkVN7XoUOH2uLWRbGKEoROVbqQ\nSiCQOgKIndR1SboNUryIhgm1aaX+dZz1JJHg3pEst0WeKQ+0DtshseOCR1PONWS1adOmaDq67ps6\ndWo0HV1xO62t19vu3busfeW90cadYTnVPlcg8/z5/2VvvPGGrWy/OxbBg9Cpdi9SHwTSQyBXMTvp\nwZpfSyQOtGN1lj0h3jvu1Qnjdfy7rOUSnjok3MLkcUcev6Mhq1LT0RcsWBQNeW0sbBw64aTGsIia\nnCueRzurjxgx0pqnz4yEXH8MQej0hx7PQiD7BPDsZL8Pq94CvVAVuyNvQpbjHjyQd8yYMVHci0SP\n4pGyLn7UP8VxPO7dUfyOByRrLaZdu3bZZz9zov1TIYB5xe0rTCIjbWnOFXMLy0f8tmIPD0InbT2K\nPRCoPgHETvWZ56JGiQId8+fPz2R7FJcyc+bMbm0/+eSTo/ZJNOiQ10TJBVL0IcV/9IIP43gkdpQU\nv+PDWV1dXXbZZbOj9Xce7Fxf1UDkctFpEUPtBL/qR/eW9ShCpyxc3AyB3BJA7OS2a5NtmHt3/GWS\nbG3xly7xIqE2o7CYndqyYcOGKOha+TvvvHNQhcOGDbOxY8dGh3Yl16GUZvFTHMfj8Tu+P9aWLVvs\n9NNPtyc3b03F0NVB0IMLiuGZNesSaxw3rjBDrCX4pvtT/21m2fvYfev4BgIQKIcAYqccWtx7AAEJ\nBQXFavp2lpJEjmzWy1DJvR6apq1Dq2xr/zTNVNq+fXt0
lGrf6NGjI9FzwgknRCLIh7/SJIB8AUJ5\n4ZTUVrXxzTffLOy/dp5NPe/fazbrKjKojD+apfX1GdNt+W3LI69bT48idHqiw3cQqD8CiJ366/PY\nWqwXqTwkClaW8MlC0ktQHhqJGBcn7vGQCNAwjzwf4V5R+l73/6oQvKt9tJ544oloSKW4vVqrRsJH\nXh/Vody9CrUWQGEcj9pz7bULbM/zL5Q9LFTc5mp/XnbLrdax7v5oIcLu6g7b2t09XIcABOqLAGKn\nvvo79tbqxaJgZX/BxF5BjAX61GzNwvKZWCrevR0udBTT4oeu6dA9Eiw6NE377bfftueeey4SQBI/\nO3fuLGmphr9c/LgA0o21ED8SehJfaotW1+7v1g8lG1yFi1pl+bTPTS656KB+h+7FqoIpVAEBCGSE\nAGInIx2VZjM1LKSAX3+ZptVWDzaWrWHSy99FjYsc3z7BPTzy+ihJ6Ggatx/67Nf27t0biR+JIAmg\nV199Naxm//mECRMiz497geQdU6qGAFIcz3e+c6WNOna0Lbx2flRv1v48/OhjNqewuvLWbVv3e87U\nBoRO1noSeyFQPQKIneqxznVNGsaS2NELx4du0tTgnuxzseOBuz41OxQ8EkOeXNx4LuFT6lzXNPTl\nHqBnn3222+GvyZMn7x/6kgfIGcYtgCR2jjnmGHut6/VUTjN3xr3l539tmo1vHLffu4PQ6Y0Y30Og\nvgkgduq7/2NtfU+CItaKyihMQ1casupJiLnYkaDxadkSOi52dE0eHnl3dK8EiB8SOjp3wROKnlLX\nNPwlr48LoO6GvxT8LNGjQ0LI44v6K36uuuq79sGH/29Lb1pSBsX03SrvzvWt10WxOwid9PUPFkEg\nbQQQO2nrkYzbI8Gjl49mO/kLulZNktCZ9LdZSLLJvSXF9kjAhMHJLnjk4Ql3Apfnx+9TGXpOh5KL\nn1D4SOz44SLIcxdCyiV85PVxL1B3w18TJ06M2uOzvyoZ/moY1WB33dOe+qnmEdRe/px66mQbM+a4\nXKzm3UtT+RoCEOgnAcROPwHy+MEEFMOjGVq1nKUlcaPAaR2yozuhI+tdtEjI+Ewsj92RR8fjdvSd\nvD+6zw99drHk5TgRF0AueEKB49ckfkIB5PfI+yPxIxG0efNmL/KAXBtluvBRELQOT6U8QBKgd93d\nbus7O/y2TOfz/nO+ffjBn+z6xddluh0YDwEIJE8AsZM847qswcWGPCsSG+6FSBqGvDkeMK08nHXV\nU90uWNxzEwocFzkSNn6EYsef8Wueu/hRruTix70/Lnhc4IR58fkrr7wSCR+JIAmgnoa/fPaXhJB7\n11TnlVe12KBDBmc2MLm4/3zdnT3P7yn+is8QgAAEDiCA2DkABx/iJODxMvIo+HTvnjws/a1bs6wk\ncCSsdK68r8kFiYSKzl3UKHcx4+f+Ocz9PLzHryn3MmWPzr0+F0ASNzov5eUpFj7uDfKhL+USQdpO\noTiFa/90dq63xUtusLPO+HzxbZn9rGG51WtW7xd1mW0IhkMAAokSQOwkipfCRUBeHokQBQlL9Ciu\npxwh0hNFCSoJG3mPlJRr6KrS5CLEBYlyiZVShwsi/86Fjufh9VLXvGyvSza7+NG5RI4foQjyc89d\nDIVr/2gITJt8FifVlaekPbNGHzuqzx68PLWdtkAAAn0ngNjpOyvu7CcBiR4Jk/b2dmtqaoqCbSVM\nyhU+EjjyFuno7Oy0U045JXrZ9UfkdNe0UBxIvLgwcSET5qGg8XPPdV+p8/Cal+V1eN0ugJS7+PHc\nvTz+2YWPCyFf+6etrc0mTvw3+/GP13TX1ExeX9S6xP5ciNu55prvZdJ+jIYABKpDALFTHc7UEhAI\nxYoEkIa2JHgm/W3mlOf+iDxCeka5jpdffjkSOBI3lYglL7eS3AWIng1FiYSKPheLl1Kf+3rNxY+X\nHdbtAkjiRucublzs
hLnOJXbe/+MHtuK2H1TS7NQ+s/b+B+zBjo7MbXuRWqAYBoGcEhiY03bRrBQT\nkLjRUJYOJRcxvku3hrzCJCGkQ8JGw2DFYii8N+lzCQtPfi4RIrGhpHMXJ2FeSuDo+/C6n3vu34ef\ndc3LVV0KnlZS7vbIFp274NHnwq02/Jh/ie7N058hQ4bkqTm0BQIQSIgAYichsBTbdwK+jUPfn0jX\nnS4yZJWLjNALE4oTFyueFwsZfS73murSM0o610wyJdnix7vv/l90LW9/jj7qKPv100/lrVm0BwIQ\niJkAYidmoBQHAREIBdA+z8rfA4MlSPxwIVRK4Oi78Lqfex5+X3xN5XvZyj/88z4BlLfe+dfRDfb8\nC8/nrVm0BwIQiJkAYidmoBQHgVIEQvHj5xIklQx/hcLGBU9v1wYNHFTKLK5BAAIQqAsCiJ266GYa\nmUYCLnpkm84VYyMBpOSeH8//AVUvAAAG+ElEQVQlasLDxY3nLnrCPDwffOjgqNy8/dm5iwUF89an\ntAcCSRBA7CRBlTIhUCEBF0Ceu/hRcS58lIfCJzyX+AkFkAueIR/9aIUWpfuxvYWVpb96/gXpNhLr\nIACBmhNA7NS8CzAAAt0TcNGjO/xcYseHvyRmXAS56NHnUPDofODAj9j//uEP3VfENxCAAARyTOAj\nOW4bTYNALglI9Pgh0aNj4MCBdsghh9jgwYOjQ9tE6DjssMOiQzumd3W9ljse27fvsIZRo3LXLhoE\nAQjESwCxEy9PSoNA1Qm48FEerq0zaNCgSAAdeuih1tjYaGvX3Fd125Ku8KXf/84+9rF8DtElzY7y\nIVBPBBA79dTbtLVuCBQLoCOOOKKwGOOppp3C85T+pzDtvJaLTOaJJW2BQJ4JIHby3Lu0DQIBgRPH\nNdrjG38VXMn2qYTb611d7Hie7W7EeghUhQBipyqYqQQCtScw4aRG27plc+0NicmCp595xr7cVPkO\n9zGZQTEQgEAGCCB2MtBJmAiBOAhob7EX9uy2vKxN07Hufps4YXwcaCgDAhDIOQHETs47mOZBICRw\n9jlT7I477gwvZfL84Ucfi4awJOBIEIAABHojgNjpjRDfQyBHBC695GJ7+Kc/sa7X38h0q+5dudIu\nnT07023AeAhAoHoEEDvVY01NEKg5geHDh9sJnz3R7mm/t+a2VGqAvDra6bx5GisnV8qQ5yBQbwQG\nFFZb/ft2zPXWetoLgToksGPHDhs7dqz9Zudu067hWUvnf22ajR/faN/6Jp6drPUd9kKgVgTw7NSK\nPPVCoEYEjj/+eLt2wUJbuHBhjSyovNplt9xaiNV5Da9O5Qh5EgJ1SQCxU5fdTqPrncDsyy6NhoIk\nHrKSNIvs1rZl9v3vX2OHH354VszGTghAIAUEEDsp6ARMgEC1CUgsLL9teSQesrKqcktLi13Q3MyK\nydX+sVAfBHJAgJidHHQiTYBApQSWLWuzjo4O+9GqVTZs6KcqLSbx5+ZcMddefPG3tr6zI/G6qAAC\nEMgfAcRO/vqUFkGgLAJXXtVie/bsseXLf5BKwSOhs2P7MwVR9gDDV2X1LDdDAAJOgGEsJ0EOgTol\ncP3i66yhocFmzbokdevvSOhoXaClS29C6NTp75NmQyAOAoidOChSBgQyTiAUPGmI4dGihz50tXrN\najb7zPjvC/MhUGsCiJ1a9wD1QyAlBCR4GseNs6/PmG5r73+gZlZp1pW8TIrRWdl+N0KnZj1BxRDI\nDwHETn76kpZAoN8E5s1rsdbFrXbNvKvtoourP6ylqfBfmXJOJLoUjMwU8353KQVAAAIFAogdfgYQ\ngMABBLS55tZtW23AgAE2edIkW9S65IDvk/igLSDObppi2slcU+IlukgQgAAE4iLAbKy4SFIOBHJI\n4PHHH7cVt98ZrVr8+TPOshnTp8U6Y+vOu1faL36+b6+rlqtbbPr06TmkSJMgAIFaE0
Ds1LoHqB8C\nGSAg0bPqvjV2+4rlePf3AttttnXnX3DRLNe4adMatGzd8rWvfe1rFUlfXpyNGze6a6+93l1+6UVf\nC56s/Xn40cfcnMLqylu3bd3vOVMbEDpZ60nshUD1CCB2qsc71zVpGEtiRy8cH7pJU4N7ss/Fjgfu\n+tTsUPBIDHlyceO5hE+pc13T0Jd7gJ599tluh78mT568f+hLHiBnGLcAktg55phj7LWu11M5zdwZ\n95af/7VpNr5x3H7vDkKnN2J8D4H6JoDYqe/+j7X1PQmKWCsqozANXWnIqich5mJHgsanZUvouNjR\nNXl45N3RvRIgfkjo6NwFTyh6Sl3T8Je8Pi6Auhv+UvCzRI8OCSGPL+qv+Lnqqu/aBx/+vy29aUkZ\nFNN3q7w717deF8XuIHTS1z9YBIG0EUDspK1HMm6PBI9ePprt5C/oWjVJQmfS32YhySb3lhTbIwET\nBie74JGHJ9wJXJ4fv09l6DkdSi5+QuEjseOHiyDPXQgpl/CR18e9QN0Nf02cODFqj8/+qmT4q2FU\ng911T3vqp5pHUHv5c+qpk23MmONysZp3L03lawhAoJ8EEDv9BMjjBxNQDI9maNVylpbEjQKndciO\n7oSOrHfRIiHjM7E8dkceHY/b0Xfy/ug+P/TZxZKX40RcALngCQWOX5P4CQWQ3yPvj8SPRNDmzZu9\nyANybZTpwkdB0Do8lfIASYDedXe7re/s8NsynT//3Pfs4w//ZNcvvi7T7cB4CEAgeQKIneQZ12UN\nLjbkWZHYcC9E0jDkzfGAaeXhrKue6nbB4p6bUOC4yJGw8SMUO/6MX/PcxY9yJRc/7v1xweMCJ8yL\nz1955ZVI+EgESQD1NPzls78khNy7pjqvvKrFBh0yOLOByasVO5k7497y8782zcY3jtvv3UHo9EaM\n7yFQ3wQQO/Xd/7G2vidBEWtFZRSmoSsNWfUkxFzsSND4tGwJHRc7uiYPj7w7ulcCxA8JHZ274AlF\nT6lrGv6S18cFUHfDXwp+lujRISHk8UX9FT9XXfVd++DD/7elNy0pg2L6bpV35/rW66LYHYRO+voH\niyCQNgKInbT1SMbtkeDRy0eznfwFXasmSehM+tssJNnk3pJieyRgwuBkFzzy8IQ7gcvz4/epDD2n\nQ8nFTyh8JHb8cBHkuQsh5RI+8vq4F6i74a+JEydG7fHZX5UMfzWMarC77mlP/VTzCGovf049dbKN\nGXNcLlbz7qWpfA0BCPSTAGKnnwB5/GACiuHRDK1aztKSuFHgtA7Z0Z3QkfUuWiRkfCaWx+7Io+Nx\nO/pO3h/d54c+u1jycpyICyAXPKHA8WsSP6EA8nvk/ZH4kQjavHmzF3lAro0yXfgoCFqHp1IeIAnQ\nu+5ut/WdHX5bpvN5/znfPvzgT3b94usy3Q6MhwAEkieA2EmecV3W4GJDnhWJDfdCJA1D3hwPmFYe\nzrrqqW4XLO65CQWOixwJGz9CsePP+DXPXfwoV3Lx494fFzwucMK8+PyVV16JhI9EkARQT8NfPvtL\nQsi9a6rzyqtabNAhgzMbmFzcf77uzp7n9xR/xWcIQAACBxBA7ByAgw9xEvB4GXkUfLp3Tx6W/tat\nWVYSOBJWOlfe1+SCREJF5y5qlLuY8XP/HOZ+Ht7j15R7mbJH516fCyCJG52X8vIUCx/3BvnQl3KJ\nIG2nUJzCtX86O9fb4iU32FlnfL74tsx+1rDc6jWr94u6zDYEwyEAgUQJIHYSxUvhIiAvj0SIgoQl\nehTXU44Q6YmiBJWEjbxHSso1dFVpchHigkS5xEqpwwWRf+dCx/Pweqlr...RK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 100, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "Image(filename='sentiment_network_sparse_2.png')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 5: Making our Network More Efficient" + ] + }, + { + "cell_type": "code", + "execution_count": 105, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + "    def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + "        \n", + "        np.random.seed(1)\n", + "        \n", + "        self.pre_process_data(reviews, labels)\n", + "        \n", + "        self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + "        \n", + "        \n", + "    def pre_process_data(self, reviews, labels):\n", + "        \n", + "        review_vocab = set()\n", + "        for review in reviews:\n", + "            for word in review.split(\" \"):\n", + " 
review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " self.layer_1 = np.zeros((1,hidden_nodes))\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " self.layer_0[0][self.word2index[word]] = 1\n", + "\n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def train(self, training_reviews_raw, training_labels):\n", + " \n", + " training_reviews = list()\n", + " for review in 
training_reviews_raw:\n", + " indices = set()\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " indices.add(self.word2index[word])\n", + " training_reviews.append(list(indices))\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + "\n", + " # Hidden layer\n", + "# layer_1 = self.layer_0.dot(self.weights_0_1)\n", + " self.layer_1 *= 0\n", + " for index in review:\n", + " self.layer_1 += self.weights_0_1[index]\n", + " \n", + " # Output layer\n", + " layer_2 = self.sigmoid(self.layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward pass ###\n", + "\n", + " # Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error is the difference between desired target and actual output.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # Update the weights\n", + " self.weights_1_2 -= self.layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " \n", + " for index in review:\n", + " self.weights_0_1[index] -= layer_1_delta[0] * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - 
start)\n", + "            \n", + "            sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / float(i+1))[:4] + \"%\")\n", + "    \n", + "    \n", + "    def test(self, testing_reviews, testing_labels):\n", + "        \n", + "        correct = 0\n", + "        \n", + "        start = time.time()\n", + "        \n", + "        for i in range(len(testing_reviews)):\n", + "            pred = self.run(testing_reviews[i])\n", + "            if(pred == testing_labels[i]):\n", + "                correct += 1\n", + "            \n", + "            reviews_per_second = i / float(time.time() - start)\n", + "            \n", + "            sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + "                             + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + "                            + \" #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 100 / float(i+1))[:4] + \"%\")\n", + "    \n", + "    def run(self, review):\n", + "        \n", + "        # Input Layer\n", + "\n", + "\n", + "        # Hidden layer\n", + "        self.layer_1 *= 0\n", + "        unique_indices = set()\n", + "        for word in review.lower().split(\" \"):\n", + "            if word in self.word2index.keys():\n", + "                unique_indices.add(self.word2index[word])\n", + "        for index in unique_indices:\n", + "            self.layer_1 += self.weights_0_1[index]\n", + "        \n", + "        # Output layer\n", + "        layer_2 = self.sigmoid(self.layer_1.dot(self.weights_1_2))\n", + "        \n", + "        if(layer_2[0] > 0.5):\n", + "            return \"POSITIVE\"\n", + "        else:\n", + "            return \"NEGATIVE\"\n", + "        " + ] + }, + { + "cell_type": "code", + "execution_count": 106, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 111, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "mlp.train(reviews[:-1000],labels[:-1000])" + 
] + }, + { + "cell_type": "code", + "execution_count": 109, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):1581.% #Correct:857 #Tested:1000 Testing Accuracy:85.7%" + ] + } + ], + "source": [ + "# evaluate the trained network on the 1000 held-out reviews\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Further Noise Reduction" + ] + }, + { + "cell_type": "code", + "execution_count": 112, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjsAAAEpCAYAAAB1IONWAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHsnQe8VcW1/yc+TUxssQuWWGJoGhOjUmwUMVYUC3YQTeyCDQVRwYJYElGa\nggUBJVixRLFiV4gmxoKiiaKiYIr6NJr23vvf//6O/k7mbvc597R7zt7nrvl89tlt9syatff82rus\nts0sosE0YBowDZgGTAOmAdOAaaBBNbBcg5bLimUaMA2YBkwDpgHTgGnAa8Bgxx4E04BpwDRgGjAN\nmAYaWgMGOw19e61wpgHTgGnANGAaMA0Y7NgzYBowDZgGTAOmAdNAQ2vAYKehb68VzjRgGjANmAZM\nA6YBgx17BkwDpgHTgGnANGAaaGgNGOw09O21wpkGTAOmAdOAacA0YLBjz4BpwDRgGjANmAZMAw2t\nAYOdhr69VjjTgGnANGAaMA2YBgx27BkwDZgGTAOmAdOAaaChNWCw09C31wpnGjANmAZMA6YB04DB\njj0DpgHTgGnANGAaMA00tAYMdhr69lrhTAOmAdOAacA0YBow2LFnwDRgGjANmAZMA6aBhtaAwU5
D\n314rnGnANGAaMA2YBkwDBjv2DJgGTAOmAdOAacA00NAaMNhp6NtrhTMNmAZMA6YB04BpwGDHngHT\ngGnANGAaMA2YBhpaAwY7DX17rXCmAdOAacA0YBowDRjs2DNgGjANmAZMA6YB00BDa8Bgp6FvrxXO\nNGAaMA2YBkwDpgGDHXsGTAOmAdOAacA0YBpoaA0Y7DT07bXCmQZMA6YB04BpwDRgsGPPgGnANGAa\nMA2YBkwDDa0Bg52Gvr1WONOAacA0YBowDZgGDHbsGTANmAZMA6YB04BpoKE1YLDT0LfXCmcaMA2Y\nBkwDpgHTgMGOPQOmAdOAacA0YBowDTS0Bgx2Gvr2WuFMA6YB04BpwDRgGljeVGAaMA2YBsrRwOOP\nP+7effdd9+rC190HH3zgli39wD3++GNfS+qQQw93q6yyiuvSpbPbaMMNXM+ePd13v/vdr8WzA6YB\n04BpoLU08I2mKLRW4pauacA00FgauOuuu9xTTz/r7rv3HteufXv3ox//xG2y6SZu8803d6utuqrr\n0b1rswIvfG2Re2/JErd06TL39ttvu1defsnde89dDgD66a67uH322cfAp5nGbMc0YBpoDQ0Y7LSG\nVi1N00ADaeC///u/3YyZN7k5d97pS9V//wNcn969XZfOHcsq5dJlH7q5DzzkFsx/zl079Rp3xrCz\n3IknHOc23njjstKzi0wDpgHTQEsaMNhpSUN23jTQhjUwfvwEN3nSJLf1Ntu6IwYOdLv/tG9VtYHl\n57rrrndXjvuFQU9VNWuJmQZMA6EGDHZCbdi2acA04DWAP87551/gVll1NXf8CSdUHXLiahb0zL3v\nXjfi7BFu0KBB8Si2bxowDZgGytaAwU7ZqrMLTQONqYHxEya6yRMnuhNOHuKGnHRCTQs598GH3WWX\njI38gdaPLEoTzJ+nptq3zEwDjasB63reuPfWSmYaKEkD+OYce9wJ7pFHHnU33Di95qCDsDST3Txr\nllt33fVct67d3O9///uSymCRTQOmAdNAkgbMspOkFTtmGmhjGgB0Bg4a7FZeeWX3i19c7tq3W6/u\nGhg/cbKbPGG8m33LbPejH/2o7vKYAKYB00B2NWDj7GT33pnkpoGqaECgs9lm33fjrri8KmlWIxGa\n0FZaaWV38EEHG/BUQ6GWhmmgDWvAYKcN33wrumkgraCjO3P04IF+04BHGrG1acA0UI4GrBmrHK3Z\nNaaBBtHAmWeNcIsWLXL33D0n1SWiSWvOHbe7OXPuNKflVN8pE840kE4NGOyk876YVKaBVtfA9OnT\n3diLx7p5UTfzNPjotFTgY4493n3++edu1s0zW4pq500DpgHTQDMNWG+sZuqwHdNA29DAO++840Fn\nXDRoYBZAh7syevQoP/8WkGbBNGAaMA2UogGz7JSiLYtrGmgQDRx62BHRnFabuTEXjs5UiRiH59Qh\nJ7v5C+Zbc1am7pwJaxqorwbMslNf/VvupoGaa4DRkX/3wvN+PqqaZ15hhozDs/uee7uLx15aYUp2\nuWnANNCWNGCWnbZ0t62spoFIA1h1unXvXpdBA6txA5haYosundzixYtt8tBqKNTSMA20AQ0Y7LSB\nm2xFNA1IA1h1jjv2OLfojUU6lMn1qacNcyussLy77NKxmZTfhDYNmAZqqwFrxqqtvi0300BdNTDr\nV7e4gYOPrqsM1cj8Zz872t1z1xzHOEEWTAOmAdNASxow2GlJQ3beNNAgGgAMrp16jdun396ZL1GX\nzh3dDzp2cnfddVfmy2IFMA2YBlpfAwY7ra9jy8E0kAoNAAY/P+Y4Byg0Qtilb1/37HMLGqEoVgbT\ngGmglTVgsNPKCrbkTQNp0QBgsMWWW6ZFnIrl6NO7t7dUVZyQJWAaMA00vAYMdhr+FlsBTQNfauDJ\nxx9z2/zkJw2jDixUe/fb1+F0bcE0YBowDRTSgMFOIe3YOdN
Ag2iAEZMJPbp39etG+WGm9pdffqVR\nimPlMA2YBlpJAwY7raRYS9Y0kCYNADtbb7NtmkSqiiybbLqJW/L+B1VJyxIxDZgGGlcDBjuNe2+t\nZKaBnAZo6ll33fVy+42ysfnmm7sPPjDYaZT7aeUwDbSWBpZvrYQtXdOAaSA9Glh9jTXdN1dcKT0C\nmSSmAdOAaaCGGjDLTg2VbVmZBuqlgf/3//5fvbJu1Xy3+uGW7lezbmrVPCxx04BpIPsaMNjJ/j20\nEpgG2qwG2rdrvKa5NnszreCmgVbUgMFOKyrXkjYNmAZMA6YB04BpoP4aMNip/z0wCUwDpoEyNcBA\niR1+0KHMq+0y04BpoK1owGCnrdxpK2eb0wDTQxx55JHuu9/9rps8aVJDlv/Tzz5ryC71DXmzrFCm\ngTpqwHpj1VH5lrXzo9/+/ve/dyyMBcPy7rvvNlPNaqut5n70ox/5Spt1z549/dIsku34GcABHLqZ\ns/70009zWmH7ncVv5/YbZeNvf/tboxTFymEaMA20ogYMdlpRuZZ0sgaoiG+88UZfKWN1AF6AGFkh\n2A6DIIg1UHTKKae4l156ye2zzz5u33339QvptMXATObok+Xuu+9upoLvfe97XjfolXhXjLuq2flG\n2PnjH99yXbtu1whFsTKYBkwDraiBbzRFoRXTt6RNA14DVLZXXnmlXwATgAVQ2XjjjcvSkCp5oAkA\nIq3Ro0eXnV5ZQtTpIqBPwAj0hWGrrbbyukAfITTymi+33HLug6XLXCP1YDr0sCPcrn37eFAO9WDb\npgHTgGkg1IDBTqgN2666BkLIofIFSLDkVDNQ+ZPu9OnT3aBBg3JAVc086p0WQCcLThLgYL0J4TH8\nD8M24+z077+/OyLSz4AD9qt3caqWf8cOHd0DDz7QJiC3akqzhEwDbVAD5qDcBm96rYqM7wiAIx8S\n1tUGHcqCdQgLz+LFi31zDftYkbIe1GRHeX784x+7888/3zffUS6a8KZNm+Y++eSTXNMezVaAjeDm\n//7v/9y///1v969//cv985//dF26dImufznrasnJP/fBh1279u39/c8dtA3TgGnANJCgAfPZSVCK\nHapcAzRTASBYXNiuRQAKsH4AVVg6WCNDlvx5ZL1hHToY46QNKGK9YVGZBDfoF+sN+0AOC/v/+7//\nm9vfaacd3MUXj41ijiZ65sPTTz/j2q+/QebLYQUwDZgGWl8D1ozV+jpuUznQbCXrDRU2AFKPgBwA\nD01cAE/ov1IPefLliZxAmSAnDjiCG9YKAA1BoAPUsAhytAZ0BDtaHx75uJx+5pkN0ZRFE9Y1U67J\n9dKTfmxtGjANmAbiGjDYiWvE9svWgEAHsKAZSdaHshOswoWyMAEUaQEe9CS4ydeDCrgRNKKGEHBk\nwYkDjoAmDjnh/jXXTHWg0tQpV1dBu/VL4vppM9zdd81x99w9xzdd0uQX6qt+klnOpgHTQBo1YLCT\nxruSQZlC0MGSkqaAPEBPPYFHPaiAnCeeeKKZeuhBRUWNJSoEsjjgyIIjyCkGbmTl0Rofn7322su9\nuvB116Vzx2ZyZGmnV6/e7uSTT3b77dc/Jzb314Anpw7bMA2YBgINmM9OoAzbLE8DaQYdSgREEKgI\nawk8WBvID9gqpgcVMuYDHMFKCDgcC/fDbcUXIJHuN77xDe8H1L379u6qq67KrHUHqw4hBB32dX9Z\nWzANmAZMA6EGzLITasO2y9IATS4ADxV7moOsO8jZWk1sAA5wgwUnPhI0PaioiNGXfJkEN+gNMGFf\nlhsBSxxqtJ8EN5yLAw7j67C8/PLLbs0113Rrr722Gzx4sJs4+Rq3+0/7pvmWfU22pcs+dIcdeujX\nrDphRO4vFrLWusdhXrZtGjANZEMDBjvZuE+plZLeVlTuVPJZqFyADeQERqoVSIsKNh/gADcs0k8S\n4AhSWAtm4us43GhfcMN
aQYDzX//1X47l6aefdj/5yU887Kywwgpu9uxb3MMPPexmzf5VpgYZHHnu\naLf47bfcrJtnqqiJa55HgFI6T4xkB00DpoE2owGDnTZzq6tfUCoUxn958cUXm/maVD+n6qWIBYpK\nEEADQMoNgI2WavagEuAAMoKZ+DZxBDgCJ5qoWAQ34RrQ2WWXXdzyyy/vz7NmOebY412nzl3cmAtH\nl6uGml7HuDqnDjm56EEEDXhqenssM9NAqjVgsJPq25Nu4bCSsIyOrDtZCkAKfjw4DRf7zx9IEtzk\n60GFLkKAEoioaUprYCVcBDOF4EbxSUPp5gMcwcznn3/uXnnlFdenTx8PNzouEFqyZInbp98+btjw\ns93Rgwem+hYufG2R27//vu7isWO/5qtTSHADnkLasXOmgbajAYOdtnOvq1pSLCNADpVJscBQVQEq\nTKwYUFMPKpqo8gEO0FSoB1UccJKABpCJA4/ghnVLgCOIYS2Qef/99x2ws8022+SO6ZyauIAlyjVi\n+Ah3w43TXY/uXSvUautcjp/Occcd7zp27Oguu5RBEUsLekax6FkwDZgG2qYGDHba5n2vuNRUHMAO\nlX0WAxUgwBO37ghwgLl8Pai4rhDgqIkpBBbBjI5pX/Cj41rHAUeAArAkwU14DIsN8TbbbLNmoCNL\nEGsC5aMsAw4c4J588slUAo9AZ/1oWoirr55U9qPGfSUY8Hg12I9poM1pwGCnzd3yygssq44qkMpT\nrE8KgBqVH01PlAkLThxwdt55Z3+eOKoo1YyE1ICNrDdsC1YEM/F9wY3WOq90lLbghrUAJ74W4HBc\n52i2osfVpptu6o8BNqShINBhH2CjvPQSGzhwkDt7RLosPAIdZJ0xfVrFFkQ9r7qPpGvBNGAaaBsa\nyATsTJ061R177LH+juBo+fDDD2fq7oTyI7gqNLbDyodyUb5iA//c3377bR/9kksucWeddVaxl1YU\nD2sAoMCS5aDKPl4GKn/ghkVNdOE9E5gAKnHAEbwIdgQ18TXXaSF/ngOBieBFAKN1EtxwTPGxzmy9\n9dZu9dVXzws4KitWOSYWZc4tIIBy3nnnHLf//vu5626YXncfnmefW+B4psttulI542vKiv9VaJmL\nx7F904BpoPE0sHzjFclK1JoaoLJgBGCcdbMeKIvCoEGDPNwAciHgADnhIpiJr+Nww/n4McGNwCkJ\nbgQuApsQZnRMcVjLAgTo9O3b1xcnBGiVL1zThAfoELBoUV5k6h85AM+bN88dH/nHvBpZiIYNO70u\n3dIZNPDySy52J5x0khty8kmh6BVvY9UBdtCBAU/F6rQETAOZ0YDBTmZuVToEBXKwfAgI0iFVeVIw\nfxeVPRUga0IINmwLUPLBjaBGawFOGF9pkn4+wBHI5IOb8DiAo3To9k7F3bt3b5IvKsgiJwuWLkLO\nHXfc0d3763vdyHPO84P3nRk5L9dq4EF6XDGy85OPP+Yn+AQ8WyPw7HLPDXhaQ7uWpmkgnRow2KnB\nfTnmmGMcSyMEddtuhLJQ6fPvnkqVip4gwAlhJR/ICGxYx+O3BDiCF0EOVhpt61zcgiPAkeWGEZqx\nUvTq1avo20HzFX46NF+FgBdC3frrr++uv26qmz59hhty0olu2+26uiMGDmw16ME358bpM92Made7\nfvv2d/MXzG91mDbgKfqRsYimgYbQwHJZLcWll16a83Pg449Pj/xXksr0yCOP+DjE1bLGGms40mFy\nxKSgeKy5noWuvOxzHaGYOPjshPGS8tKx2267LZcH15Afx8oJSWWmqQN5yg00YbXWP+5iZJIe1WRT\nzDWF4qgpg3/5VPgCm3//+9/uX//6l/vHP/7h/v73v7svvvgit9Clm4VjnCMOC/H/53/+x6dDnsAK\noxV/61vfct/5znf8stJKK7mVV17Zac22Fo4R79vf/rZfVlxxRX8taQiAZNXRVBSSv1AZd
Y4yJjVf\nCfCQneWf//ynXwYMONDNnXt/NGFoZw89hx52hLv19juVXMVr/HJOPW2Y6x3B5quvvOxm3zLbdy2v\nldVQwINjugXTgGmgsTWQSdihohs+fHizO0MFDhgkAQ9xkyp5IIdzOPr+9re/bZZefAdwII1C8YqJ\nE0833AdqBgwY0CwP8uOY4CqMX2ib+EllFgDJ4btQGvFzVJbf+9733MZRE0C9Qz5ALUcu4I2yqdLH\nUqNKH4gJQUeAI8gJAQcQA0q++c1vOkAFcBHUaB2CDscEOCHkAEekQVryyRHkUT5kJZR6H/I1XwF5\n8TJTPhbK8fOfHx11CnjI7bTjDm7yhAmuY4eOHlIAH5qeSgmMgsyUD8xaftSRgyIYXN6PiMz0D6WA\nWyl5FooL8HD/DXgKacnOmQayr4HMNWNRWecLVIBU4mFvLSr9lkCB6wCDt956y/dkSUq/pTS4ppg4\nSWnrWCGLC1DG3EbF9NYCmuIwqDy0Ji+6J5fSg4tKttQKVvlVe10IOkvNi0oWZ2VgJ27ZwcqBlYcF\nIFBzD3kIQMLmJjVHaS2LTHwdXiNrDWsF0k4KVMrIW6r1o1DzFWUG7mTJUlnJH6sSYZVVVnGHH36Y\nO+qowW7hwoXuqaefcXfNmeMOOnD/CBZ6uXbt13frrrueW3uddXz88AerDZawe++5y+3db1+33Xbb\nuqFDh3iH8DBePbcFPKwtmAZMA42ngf98XTNUNir9F154wVdOH3/8sYcAiR/CEBATAgiVOyAkf4rQ\njyYeV+mFa+Lr2nyQUEycMM34Nt1tlceUKVOanS4EQ2HEEHRCXQFzISyhG8pdbAAI0lQZhPe62DIk\nxdtqq638P/uwGUuVv5qzBADcG6AECMD6QpNTviYqWW7CtSw4YROVwEfwVAh00H+poAOk5mu+EuhQ\nPjXHsQbygJ8kwAO26CWFNQZ9jBt3hTsmsv6stmrURPedFd23V4z0Ei0rRdv//ucX0aCF+7vTTh3q\n495z9xx3zsizUwU6eibQLTCJH5QF04BpoLE0kDnLDuq/9dZbvVWCbcYUARCwzChQgXMcC0dYmQMP\nYWXPPs1eqjSBCdJKClwXh494vGLixK8J9wGlEKLYR37Bi8pD2fIFLB5hU16oK2CPfZrtSJeFNMkn\niwG9AK+F9FFMuQQPgkxZb7QPfAhIwjXWmrjFJumYmqK0Fsxo3ZKM6ipdLmgW23wlXx3Ah7JTFtYE\nZEfeJJnV/FSufC2Vv5bnKYMsmHouapm/5WUaMA20jgYyZ9mhwmYJQ3xfgBM2dVAhhqCj68OKnuvC\naxSHddK14fli48SvCfcPPPDAcNdvx/MNQeZrkaMDofxYdeK6QQ9hPi2lF+bBv15VbOHxUrcBU1Wc\npa7DvCgrflpYqEopR5hGuC1ZqNiBGip7+d9gwZF/DWv53oTbOiZLD9YbrmfBEkSahaAhlEXbWNMq\nsagV03wlyJE1B6sWFp9QHwI1ydXIa55xdG4Wnka+y1a2tqaBzFl24pV3oRsWVoBU/EkhbhUQKMXj\nxuPFz7NfTJyk63QsqWzxNPPJpzTC88AAFVahEMYvFE/n0vZvl3ssy1doFZO8pazRVQg5XAvwYOkh\nhOcVj3W4CAoECrrOJ1DiDxUuoVzALLb5iuYqgQ5WHYLgjDU6oIwqm4/Q4D/o3Cw8DX6TrXhtSgOZ\ng502dXessDXTAHCiypzKncAxKvuwKUdgEwIAx3S9BGa/kkBFC1huXEHPtyO/ms4jPngg8IYvDmAj\n0AF2sOhQVvRA+WSVYq3yUq5Ky1aJXmp5rYCn0vtQS5ktL9OAaSBZA5lrxkouRvLR0FISNu+EseOW\njbglJYzb2ttJMsblC8uUJE8oP01gVF6Flpb8kMI8qHiphBs1UIkDLqroaYaSA7KcjEMHY8GArB4C\ngUphgOZCdM1Sbhg9Ov/ggXJKVu8rQAfwUdMVgEf39
9CJGp0AQW0tyKomK1tbK7+V1zTQKBpoaMtO\n2HQFNOCIHPeBCXs4AQrhNbW+ycgX+tOQv5yn2Ua+lmAnlB94otwhAJFOuYHKtxp+DDiBxyGuXJl0\nXUt6UbykteBElTn7ACIVvIJAhjiKz7lwW3ErWQM6PXv2rCQJD6TF9L4CdrQAOgTADYgDdORzpCYt\n6UDCAQDI++lnf3MLFvzGH1YXc3Y6/KCD23qbbf3xjh06uFVXXdmXTQDhT2Tgh+eesrKwbcE0YBrI\nngb+8zXPnuwtSgw44M+hipUxeMIeWeyHMBEHjRYzqHKE+Ng37MsfhayKkQ/YoeLHl4Vy4wxMmQVB\nSlM64VzopN1SkaoBO5KlpbxqdV6+GeQn4MmXd7XhRvlU2uNK6bAup/mKpq0k0FETliAPXf36vvvd\noxGYL1u61MPMFlv+0PXZpa9r376dF4Pu5QQGHHxvyRK//eKLv3evvf6Gu/vue/x1O0Vj8/To3jUn\nq4+U4h8BD+XPGqylWK0mmmmgZhpoaNjBooHTqoABAAi7qIdaJm6+budhvNbeRlbJG8+rWAdc4mmE\nZKw79FhKCkBRKaCDxYHmkUYL+sfeWiDTkr7IH9ip1KJDPtyffHNf5Wu+AnSAmXjzVQg6d945x02c\nONGDyv4DDnbFTBDapXPHaKqJjr744WSiQNCj0ezqd865210y9hI/u3m/vfdKvdUE4BGUGvC09FTb\nedNAujTQ8I3wVPwtVeiATjXGa6n01haCGUCs2KYaytsSuJEWZS4l8LFnbqxGC/xbrwZolKMXQIdQ\njcqTclS7+eqee+51ffrs4kHn8IFHukVvLHJjLhxd0aSgANCQk05wWIDGjZ/gFi9+122yySZu/ISJ\nVWkm9QptpR85K6NrC6YB00B2NNDwsMOtoKmGwfTi0CNrDiMLp6FpBfmQNYQa5EL2QiCU9LgRn1Gm\n49eRNiBEmcN8ktKIHwN2mBurkT70/FMH4KoBG3F9tbQvPaLXaoRym6+w6sT9dF577TV35OCj3aRJ\nkxyQ89hj89zRgwdWQ8xmaWDxGXfF5e7Vha+7+fMXuG5du0UQnn9KmGYX12nHgKdOirdsTQMVaOAb\nkSPml0OkVpCIXdp2NECFSuXcCM1ZwAYOtjfeeGPNAY58ASwqzmoE7gdWndVWW81hLSJdXm11M6fH\nFRN7hrO10/2cpjtAh15mGhSRUbUnRBaXgRHsHDnoCNe+3XrVELGoNJhc9LxoOolevfu4sWPHVE0/\nRWVeYiQ1adXLKliiuBbdNNCmNWCw06Zvf+mFv+uuuzzoyCpRegrpuQIg+PTTT71AjEVDpcXS2lYe\nQId8qhW4Fz/+8Y99cnOiyTn33Xff3HADGk9Hs7cLdsIpIeheD+iwXHDBRe6xeY/65qXQz6ZashaT\nztJlH7ozzhjmweyC80e1+v0oRqZCcap9PwvlZedMA6aB8jRgsFOe3tr0VUACH/jWhoLWVLIcgnHm\njYdVV13Vlw0g0aI4lTgxt5YlgPtAOQA2YJSAVQeHZKAmtOoAO+xzjt5XdC9nDCGg6IwzzvRWnilT\np9TUmiPdxtennjbMzb3vXjf7ltmpf9YMeOJ3z/ZNA+nSwH9F5u/R6RLJpEm7Bj788EMPO1gQshqo\n5Kn0WQCEDtE4MAyktzTqTv23v/3Nvfvuu96XZ/r06b55iCEKXnzxRT8zeLt27TwkUPZi4YemJfTW\nrVu3qqqM1/eWW27xzVdUuJRLzVdx2NFs5hyXnw5WHYDozDOHu29F106Zck0qQAcl7fbTXd23V17V\nnXbKULfDjju49darXXNaqTeJpl30z9qCacA0kD4NmGUnffck9RJRcdN7ZvHixZn+uFMxXXnlld4i\ngtLxb2FhiIJHH33UL1RgH3/88dfuSadOnVzv3r1981GvXr28PoiUBD/oi1DtirBazVennXaG+8Zy\ny7lrrrk6NaDjF
fbVz/XTZrjLL7k4MxaeavpihXqwbdOAaaB8DRjslK+7Nn2lev7g3JvFAOSwACJY\nQkJrCHNE0azDIvgBLJ588km/fPDBB18rMtaeEH7kQ0PzEs1+1QYdBKhG89WkSVd7HUy9dmoqQUeK\nHnnuaPfKyy+5GdOnpdppGXl5Vrjf3HcLpgHTQDo0YLCTjvuQOSmABKw7NO1kzXcH3xkqI5qv8MkJ\nQUcOvTTtsNDkw6KAn8tnn33mXnnlFQ8+Tz31lKObdjzQnERT0eDBg91+++3nsP4oJFl/dK7YNc1X\nlfa+YsDJMWMudndFoxpr8L9i869HvEMPO8KtFvlTXX31pHpkX1KeBjwlqcsimwZaXQMGO62u4sbN\ngAoXYODDnqUgX6O4M698XIAc/FuYNworD8ex8BAAGICHRdusgZ6XX37Zr5977rlEdfTo0cNDjyxA\n+udfKvygb1mOyu19BdQdfPDBbuyll7sBB+yXKG/aDtJLq3cEpxOikZz79t0lbeJ9TR7uU2tZ9b6W\nmR0wDZgGCmrAYKegeuxkSxrAqgM8AD5ZCADOkdFYQfrnjcxYdgAaWXVwWgZ24sBDXMBGSxL04NyM\n1WeNNdbw4MM2IEQvqHiQ3w9WH+AFSxmhJfipRvPVueeOcmusuaabOuXquFip3tc4PPMXzM9EMxEW\nUAKWRAumAdNA/TRgsFM/3TdEzkBDz+jfNr47spiktWD5ZBXsyLIThx0sPVh4sO4QFxhhAXpYC3re\ne+8935OLUa85p+NsL4kmxAR61PxVyO9H8CPrTQg/1Wi+YmTtiy8e656IfJBqOWBgtZ4LmrO6dO7s\nRo4cUa0kWzUdA55WVa8lbhooSgMGO0WpySIV0gCgc8opp/iut2n136HC6RlBGUCGY3IY4j47NF8B\nPFoDO1h9WAAi4mtROoAOUMIUHGETV7gdAhAWIByeBT/5/H769Onj5abpi/S33nprn2W5zVcMHDgs\n6ma+XTQtw9nDh0n8TK2ZSHSLLp0y1RvQgCdTj5gJ24AaMNhpwJtajyIBEFgd6KqdNuDBIZl50AhJ\n3eUFLlhuABqsOACOFh0DdOIL1/7ud7/z49wwb5iCrD4CHNbhtqw+IQwBP1h/WL/++utKKnFN13gs\nQOSPTMgMoGlKiHyDB2LVueCCCzNr1ZEyGHBwrTXXyIx1B7kBHp7FtL0f0qmtTQONrAGDnUa+uzUu\nG74w+MSkCXjU80rTQjB3FDJi5QkD0ADsCHhkyZGDsvYBC+JoDXRsueWWbpVVVsldz3kBFHlgkdEi\n6NE6CXoERbL6AED5nJ47R805O+20k8P5edttt/VA9cUXX3h/I2Qm33Duq4suGus223zzzFp1dM+e\nfW6BO+rIQX4Wdh3LwprnEegx4MnC3TIZG0kDBjuNdDdTUBY1aaXBhwcfHZqtABusTmxreohx48a5\noUOH5jQGnBAEKXELTrgv2Jk3b57bcccdc+ATj0M8LUpX+STBTz7wAXrovk7Ybrvt3JqRYzE9v5L8\nftZee23f1EVlyrLhhhs6RkkGxoAf4IgZxrPQ1dwXuMBPv336u/367+MdzgtES90pA57U3RITqA1o\nwGCnDdzkWhdRwIOlJ+4fUytZ1KwG5OBPRKCSAXBmzJjh94844gg3bdo0DzjAhwLbIZwIWLT+6KOP\nHGPUYFER4HCOba3DbR0L16TPfhiw6JC3LDtq4sLhmfQ6duzo7rnnnpxPENaqxx57zDd9Pfvss346\nijA9ttdaay3XpUsXD2X4FX388X+7e++9Ox4tk/vjJ052r0YgmLUeZSibZ1EO85lUvgltGsiYBgx2\nMnbDsiIuH3JghwB4YF2pRaCJgHxZA13xfIEMrDqnn366FweAABgYDyVubQkBSPCDzw+ws9VWWxUF\nNknQEx7TttJnTZAsDBz40EMP+WM0mWHV0TniYq3Btwh/HZb58+f7gR6x/KCDMGy
3bVd32MCBbshJ\nJ4SHM7uNo/L+/ffNXFNWqHCafOPPaHjetk0DpoHqaMBgpzp6tFTyaADLCrBDExLbG7fSeCOADYB1\n1VVXeesNeWnQvlA0AAHAABz23ntv79jLeYCHnk6hVUWWFc4DGMAD11MG1lqw0GgRvIRWHI5p0XHF\nC9faVlqMTn3SSSeRfTQj+Rmuf//+fhtZVA45UwM6QA9pcH6FFVZw3/nOd9z777/v9fL8889HY/18\n4a67/gbXo3tXn04j/PTq1duNGnVepoHBgKcRnkQrQ9o1YLCT9jvUAPJhsqcpiRnE99nnSx8L4Kca\nAcDReDSkl9TbSvkITgACIIGeSYcddpgHAuKMHTvW/exnP3PLL7+8hwXW8qMhH3p0ATphIE0FIEV5\nCGoELtonby06Fl/rvLqZM9v3nXfemQMc4iM/C93j1UWefQKgg5/OSiut5OhqzvrPf/6z23PPPX0a\nkrcR1vTK+tFWW7hBgwZlujgGPJm+fSZ8BjSwXAZkNBEzrgEsLFhePvnkE+80C/gADTQ30TMKGCol\nUDEoDZoAwi7fQImCwCNcAwpa8GWhiYgmKcKIESPcgQce6Ltvh5YSrEDkEeajPNSkxFpgBCTRA4r5\nsVgADxYsLQIQQQj78YVzv/jFL5SFu/XWW/313/rWt3y65EOZaMJCTrqbh6M9c5wFaFIz1xtvvOEG\nHHRILs1G2Vh7nXW8w3XWy8NzzHNd6ruQ9XKb/KaBWmnALDu10rTl00wDQAkAxPqJJ57wIAEA0YMo\nqflJFQG9qYATKgcWWYiAH5qwVo0miqS5iTQEOWHGHMMCQpMPFhFNC3HOOee4O+64w0dt3769mzt3\nrsOigg9M3759vbUHyBDchGkW2iY/BbYBLQIgon1ZcjjHNuP27Lbbbj4eTYA0twle5J+D3IylQzdz\nFo5zPc1wQJHgCthif/Lkya79+hu5cVdc7tNtlJ+5Dz7sZkYO57NuntkQReJ94D1IegcaooBWCNNA\nnTRgsFMnxVu2zTXAR55/tUBNUhAEAThJgWs5BwzRHZxu4YIJ1kCKAkARQgOWEfZvv/32aBqFixXN\n+/4MHz7cW2ew1KhZC6AghGnmLipiA3kIrLXI2sSacXvefvtt39sLAOOYLDSy5CAzozADPIAP55EH\nGYEbWYEk93XXT3OdOnfJ/Pg6cfU2GuxQPgOe+F22fdNA5Row2Klch5ZCSjRAJbHzzju7zz77zF1y\nySXu5JNPzjVZIaKsMsADwIOFR01AAAPAg1XlxBNPzJXo3HPPdccff3wOIAQ8WHmUZi5ymRsh/Iwa\nNcpddNFF3jKD/xHj4yArMCNLFIAD6LAI1EgDmYAbrDqCHFmjrplyrftBh44NBzvMhL5++3YeGstU\nfyov41nGuoOVx4JpwDRQuQa+/ItaeTqWgmmg7hqgeQtYIGCRwQFZzrtYRNgGaIAHAvCDMy8Aw4LF\nhhGWx48f748T58ILL/ROy0BF6MdDGrLKEK+SIAijuzigQ7j55pvdOpE/SmilkawAjBaghqYq/H5o\nwqOCZKEcrDmGDxAA1IghixOZFnMfsGQS3okNH1DMtRbHNGAa+LoGDHa+rhM7kmENMGig/F3oaUUv\nJKw2wEoILFh34s0+TMYJ9NC7C6dkBvMj3H///d5vhxGLBU1YhUijWsBDPkdGDtsEeqzRzRz5ADDW\nCspPlhwACNABbugtxiLgAXSwDLEQx0K2NCCrjgFPtu6bSZtODRjspPO+mFRlagAIuOGGG/zVjDFD\nD6vQz0XNVTQLARLAAtaTBQsW+KkUGGSQYwAGgw8eddRRPq1Fixb5uaeeeeaZXM8nWYmqAT2jo3GB\n8DcCWm6MHLcJlIW0WeSzg3UK0MKyhPzIHlp1BDsCHc6xrPitFX2ajfbDwIIdftCh0YqVK48BT04V\ntmEaqEgDBjsVqc8uTpsGgBQsG3PmzPGi3X3
33e62227L+bsAO8CPLDMAA1Mt7LHHHo5eWHQPZwEi\naCrC2kKzFgG4weJCExPpqFkMEFHTGIBSasA/g5GSCYAO8pMOC+kiK3mTnxYAiLICZjRjIXPYnZ1t\nHfPrVRrTsvPekiVu6222LVXlmYov4OE5sWAaMA2Up4Hly7vMrjINVEcD70Q+CY9HPbC0JlV6VmnC\nTsa20ccePwa2e/bsWXDWaACmV69ebsCAAX6MGpyMO3To4DbaaCMPD0AEcXDwff31190uu+ziC8Mx\n+cKwrSYr8mVOqn79+vl4TDWBI/Oll17qrS74zbAQuJ70w6Ynf6LAD0BFoPlKXenZj1t0kCcENfKS\nzw4+OQAa4MMx5JcMpLPWmmu43zz/W5JtqMA9bAuB5573AuCRP08l5eZ9Iy0tpB2+d1gYlY/eO9Y9\no3fPgmkgixqw3lhZvGsZl5kPLBYMDSiojyhrrBoEfVSJqw+xPsw6BhhokUoECFhAsL5svvnmvncW\nMPDII4/4aMDAn/70J28x6dGjR86Kw0lZUbhWViDS4jiBZq0//vGPfrt79+5uypQpfjweIAMrixyd\nAQ3Bho+c54fmK6w6VC5UQLLqqBzADb5GGlOHbSxJpE05ZL2hqUrAgwzx/JkOY9y4qyJouyuPJNk8\nfPEll7uVvrOiGzrk5GwWoESpeRd4TnhXSg3he/fuu+/6nou8Z4AUCyH+3nHs8eDPCPkTR++d3lfi\nWTANpFkDBjtpvjsNJBsfSeCGyp3AxxKLRjkfba7nw81HmEH3SJtBBVmABpp+aPYBFGiiYlA+AoMD\nYuVZEjV9YAVhBGX1VFKzFVYZwAbA4XpBj4CH84Ca/IK47sEHH3SdOnXyaQI8LFhWQuuKFyD2Qxk0\n1QXNbuiE9Fnko0P+DBoo2KFcnAdogBvkVxkEXEn5oiP8ebi2kcIxxx7v/vynZf45UIXdSOVLKgv3\nkmcH6Cgm8LzyngBJghTW5QTS4D0mTbZ5hzWaeTnp2TWmgVppwGCnVppuw/nwYeSDCNjwcWSpZgB6\ngCgqAPJhfB0AADAAFiZNmuQuuOACnyWWGSqJ73//+x4WsIgQF1AAXAAFQtzCQzoAD2lidSGvIUOG\n+Lj8XH/99W733XfPpQOM0Myk9JKsPOiD5jqar6hACMBICGuy6gA7wBfnSBd5JTvWHcAHSw/n4nmh\nH8LoUee7s84+2+3+075+vxF+OkZjB82+Zba3iFH5KnCPGz1wXwuVk2eK94HA+3Fkld873gEgijnv\nmJuMbbP0NPpTl93yGexk996lXnI+hnxg+ScK8BT6MFejMIIeKr1f/vKXfuJLWWfwy6FrOQH/Gz7K\nagYSNAAQHAMWBB2y8CgdoAfgATrIZ+DAgTnRmaFcIy4DTlh4gA8WQgghVD6t1XzF9BthkN4vGnOx\n+8c//+3GXDg6PJ3Z7WefW+BGnj0imrF+3tfKIMDjBBYflkYMScDDc8l7JxhhuzUD+QFVyMJzLcBq\nzTwtbdNAqRow2ClVYxa/KA3wL+/UU0/1g/zxAaxlmDZtms8bEKHrOeAxa9Ysb/FBDvx4sMRgdeFc\n6PfCvoAHC05oZRHwsFazFiB32mmn5fx48AG65ZZbchaeJD8eKqFqNl+9+eabvpkLmGIR3MR1zj/9\nq64anwgH8bhZ2B957mj3P//+l7vs0rEFxaUyZlHIpx+dz9o6BB7+VAAbAA7vXS0tLchBvlgskaOW\neWftnpm8tdeAwU7tdd7QOVL5618elWu5PjmVKompFugmzkzrzHe16667+sEB+RgT6FFF8xFWF5qA\nsO5oAXjkaCxHYaw5AA6WHZYQeLACMQmpJhLlesbtadeunYcp4EnNWsAIoFNJ8xU6/vjjj3NAxeCH\na665ZjPLkS9kwg/NPuPGT2iIpqxevXpHMH1eXrhLKL4/RKWsgMWHJeuBMvEHgzXvXb2AjmeTdwyg\nr+f7n/X
7afJXXwMGO9XXaZtNkQ+dPrJ8dOv9zw7gwRGTnif33Xef99MZNmyYmzlzpr9HM6LZsqno\ngJEQeLD0ACzyfwFmcBhOclwGejgOFFHm8847L3f/77zzTkePLTWPATxMP8GUEAz6h1zoiPQFVaQn\nPx0ck9lGr/QAw0qEXFim6EqPzAIzWXVymefZGDNmrPvrRx9nfvbz66fNcDfNuLFiK9U7gdWHe1Ev\nOM9zu4o+DGDgO/Piiy+mogxYlQRfWdVp0cq3iJnQgMFOJm5T+oUU6MiEnQaJkYneWVQE/Mv89a9/\n7TbbbDMPPVhngBy6owMKQAOQI+tO3OFXTVpyXAZKQiuP/Hj4Rxs6Lp9zzjl+IlGAB58hZmQnAELq\nESOYIg3SBHLoKi6QwoGakZ2RiW0WtklTPb8oQzGByn2TTTZxry583XXp3LGYS1IZ59DDjnAH7L+f\n22+//lWTj+eF+6fAs1xvYJcshdaypADblAGAT0NQkxpyGfCk4Y60bRkMdtr2/a9K6dMIOioYIMFC\nhcBoyvzzZRoJZkcn4LiMNQYrDsAj2Al7OKlHFengw6Nu4XHgoZmLc+iDXl9//etffR7y4+nWrZtj\nfi3m7rr33ntzPbUAKebiAnaUZufOncvufeUzLfAz7MzhkZz/l1nrztwHH3anRuPqzF8wv1VhBPDh\nXhLSavUJQSeNYGbAU+BFtFM11YDBTk3V3ZiZpf2DC6QAFMiJrwzWnLA7Ot3SaX6jmQmLCcATWk/k\nb8PdiwNP6MeDVQZgwfpDPHpbATHxQPMV0MPov4AUsnXt2vVrzVeAExYbLFChEzUyltp8FcqAdWe3\nn+7mbrhxuuvRvWt4KhPb+OowvEA1rTotFRzoSZvVh6YiYAK50gg60inNWSxpl1Py2roxNWCw05j3\ntWal4iPGR5cKNM0fXEEKzrxYWrDmMMjgwoULva7ydUcHLORgHFp4ABT58WCNkUVG2/Ljofnsiiuu\nyN0Ppr+45JJLPNysvfba/jjWIqCJ5istQBMyC8Aqbb7KCfDVxvgJEyPoe9Tdc/eXc4jFz6d1nxGT\n5z/3bN3lrrfVh6YhmkGz0kSErAAj8lowDdRDAwY79dB6g+QJ4NAWX8/eH8WqEnAgvP32227rrbd2\nN910k/dd2XLLLf1xwIPeVGF3dFl45GAsh2V/QfQDpLAANsCKgAcLD9NR/OY3v/HnQ6dlrmUU5+OO\nO86DDJYboEkWIgYPZJt0yY+8JUfYtBaXRTKVsu63T3/XrXsPd/bwYaVcVre4jKuzfY9uqXHClSJq\nbfUhP/xy+JORlTFtkJlvBfJmRWbdX1s3hgYMdhrjPtalFDT98AHDupOFAPCwjBs3zs9kPm/ePPfq\nq6/mHIWPPvpoPxIsIIFFR01HrOUMLMgQPGHhEfA89NBDOeChmQmn4rOjEYs5Hg+Mtjxx4kTfTAXs\nhP46pAd0Vbv5Ki4D1ol+e/dzvxh3pRtwwH7x06naX7rsQ3fYoYdGTZGD/D1KlXAxYUKrD1DCUs0A\nLJBH1qwkyIuFB9mrrZNq6tfSakwNGOw05n1t9VLpw5X25qu4IoAUAIVZ0bfffnv/L7OY7ujAD8BD\nsxIggkWGjzbj+JAeC+kBLbLyPPfcc+6QQw7xIjDwIB96Blr87W+/nH183XXX9QMQrrLKKv46riUd\nAr2sBFuh/1Cpva98Ygk//NOm0pz36Dy3wjdXcDNvmpVa/x1A57jjjvfw2NIAgglFresh3g8WBf4g\nVBJIi950DKuQRWDAb46Ar5EF00AtNWCwU0ttN1BefGgxo+vjlaWiATxYdfbbbz/38ssve7DYdttt\n3dKlS30xnnzySQ8zYXd0wOONN97wXcMBHmCHwQHxUyI9QZSsNAAPlh0G/0NXs2fPzo3HwwjP+tgD\nL1iaGDsH0CFd5Ss/HZqx5Dsky1Il+qbCBLxw1ib07burey9ymk6rw/Kpp
w1zb731Rzdj+rRU+4UV\nc09CawzPBUspgfeNa3j3shiqBWt0MsDnjoAP3FlnnZVFdVRV5qlTp7pjjz02lybfpFqGSy+91E+X\nQ54vvPCCwz8yTcFgp4p3gzFc8AkhxF/AQueqKEJNksJHB6sAH64sBsEJ1p0ddtjBT/fAGDg77bST\nL044O/qyZcty3dFxbGZUZABF0AGcEJQmwEIzFM1XckzGdwerkHprYcHhYxB+oOldhDwCHaw9jBHE\nOrQqkZ/y9BmX+COL3KeffurTZ5+mSLqj33v3XakCHiw6o0ef7z788MOGAJ34reL9Cd+hlqw+xMWq\ngzUxzZ0B4uWM7+sPkoA/fr6YfX1PN9100wiE3yrmkoaIE777U6ZMccccc0yuXPWGHQTRfQF0+Mal\nKSyXJmFMltpoQBUma16QUgMfqSw7Gar8/DvGURnfGEYkHjVqlFfFww8/7LuNAy0ADt3CGSMH6ABU\nsN6ouUm6U5pA0CuvvJIDnRtuuMFtsMEGvkmK67EKAUadOnVy1113nS53EyZMcFdffbW3/nCQdARV\nYdMV+ZQbuG8AFaCz1VZb+WY4QAd5aB4686wz3VGRT8ytt99ZbhZVu05NV40KOihq48hCA+BoATy1\nhBAkpfK8Mrt4lkGHslAORnumKbWcgAVBfyrDPwzlpGXXVFcDuh801ZdTt1RXmuapGew014fttaAB\nPsIMzqd/Zy1ET+1poIFK5r333vOmV/xrgBr8bgiMj0OlomYprDL0tqJ5imOAEMADKCgIRHB0JuCE\nfNBBB3lIoimKpjAsN4AM12Ht4YMADBGALHx6gBECceQfpLT9iTJ+uF+DBw/2V1JhUqlS2ZIHC+U5\n7LDD3HkR8I0cMdzRxbtegUEDe0f3hmZAusZnvXIvVo+CHtYEgQ++YQRZVP1Ohn947hjUk/KUGrBq\nATuE1VdfvZllo9S0Gi0+Vh69z6zrEQ488EB/X8h7+PDh3gpZDzmS8lwu6aAdMw3k0wAfKCbQbIQK\n6IknnvD/lOnuTdMV1ptrrrkmV3RBi6aIiAOPYCf8sDCQIH5AzH2F1QirDJYjIIdFY/aQiZq8+De0\n5557+nxxPB0wYIB7J4JKAASwygdXOUELbKjLL/+kCfgHYeHh/unDiByCut12+2k04OJE9+Tjj7m9\n9urn6O5dq4A1h5nMGR354rFjW5zNvFZy1SMfgEDwwzaWVCAYS1wjBOCb57DUwJ8DgIcQNuGwzzn+\nFLDId4UKV8dY826pgwDXxAMgRVNMeE1oSYrHZ5/0SFfXrLHGGl4WrE86xlrWKKXBflw+4nEsLiPf\nJ86FgTJyTBaUsPyKi67Y1oKc8RCPc9tttzWLgn+U8lI6yKN8w8gAKMBDIN14WmHcmm9HH7y6hsi3\npSlqdwVDcwvHonbYZnJFD3bufKTQr50P02A7HiJH0SbSDfNhOymv8Npi5eOaUAauC0Ohc8SL/tU3\nhWVEtmgqg6aoXTZMJrcdpoeuuD5qJ82VDx3FZSC9ePm1ny+fXIZfbUSg0xQ52MYPZ3b/d7/7XVM0\n0F9T1DzVFEFP01/+8pemCOhyejrggAOaIoflpmeeeaYp+gA1LVq0qGnJkiVNPE/RAIBNEQg1RbDg\nyx9NRZG7bs6cOf54BBFNkTWo6bPPPmuK/H+aIifnpsiK1BTN09UUwVBT9MFoirqgN02ePLnpzDPP\nzF3PfYnAy+f10Ucf+bxIh/TIT3kWUjzyRH4/Pk3WyKTA9aRFuSlH9GFqisYG8vlFH+GmaOLRpmHD\nzvLP9CmnntEUzaWlS6u+/mDpsqYxYy9r6vCDDk3HHHt80+LFi6ueR9YTHDp0aBNLowSeN55x1qWE\n8LvHNy8MfMP0PeNbGn4PdVzr+LV8QwvF53sa+aCE2flt0lGa8XX0J6bZubBOI614/Ph++P0u5tsd\nlp+0FCL4yOVFOeKBfJQ35/m2KYTnF
[base64-encoded PNG image data omitted]\n", + "text/plain": [ + "<IPython.core.display.Image object>" + ] + }, + "execution_count": 112, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "Image(filename='sentiment_network_sparse_2.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 113, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('edie', 4.6913478822291435),\n", + " ('paulie', 4.0775374439057197),\n", + " ('felix', 3.1527360223636558),\n", + " ('polanski', 2.8233610476132043),\n", + " ('matthau', 2.8067217286092401),\n", + " ('victoria', 2.6810215287142909),\n", + " ('mildred', 2.6026896854443837),\n", + " ('gandhi', 2.5389738710582761),\n", + " ('flawless', 2.451005098112319),\n", + " ('superbly', 2.2600254785752498),\n", + " ('perfection', 2.1594842493533721),\n", + " ('astaire', 2.1400661634962708),\n", + " ('captures', 2.0386195471595809),\n", + " ('voight', 2.0301704926730531),\n", + " ('wonderfully', 2.0218960560332353),\n", + " ('powell', 1.9783454248084671),\n", + " ('brosnan', 1.9547990964725592),\n", + " ('lily', 1.9203768470501485),\n", + " ('bakshi', 1.9029851043382795),\n", + " ('lincoln', 1.9014583864844796),\n", + " ('refreshing', 
1.8551812956655511),\n", + " ('breathtaking', 1.8481124057791867),\n", + " ('bourne', 1.8478489358790986),\n", + " ('lemmon', 1.8458266904983307),\n", + " ('delightful', 1.8002701588959635),\n", + " ('flynn', 1.7996646487351682),\n", + " ('andrews', 1.7764919970972666),\n", + " ('homer', 1.7692866133759964),\n", + " ('beautifully', 1.7626953362841438),\n", + " ('soccer', 1.7578579175523736),\n", + " ('elvira', 1.7397031072720019),\n", + " ('underrated', 1.7197859696029656),\n", + " ('gripping', 1.7165360479904674),\n", + " ('superb', 1.7091514458966952),\n", + " ('delight', 1.6714733033535532),\n", + " ('welles', 1.6677068205580761),\n", + " ('sadness', 1.663505133704376),\n", + " ('sinatra', 1.6389967146756448),\n", + " ('touching', 1.637217476541176),\n", + " ('timeless', 1.62924053973028),\n", + " ('macy', 1.6211339521972916),\n", + " ('unforgettable', 1.6177367152487956),\n", + " ('favorites', 1.6158688027643908),\n", + " ('stewart', 1.6119987332957739),\n", + " ('hartley', 1.6094379124341003),\n", + " ('sullivan', 1.6094379124341003),\n", + " ('extraordinary', 1.6094379124341003),\n", + " ('brilliantly', 1.5950491749820008),\n", + " ('friendship', 1.5677652160335325),\n", + " ('wonderful', 1.5645425925262093),\n", + " ('palma', 1.5553706911638245),\n", + " ('magnificent', 1.54663701119507),\n", + " ('finest', 1.5462590108125689),\n", + " ('jackie', 1.5439233053234738),\n", + " ('ritter', 1.5404450409471491),\n", + " ('tremendous', 1.5184661342283736),\n", + " ('freedom', 1.5091151908062312),\n", + " ('fantastic', 1.5048433868558566),\n", + " ('terrific', 1.5026699370083942),\n", + " ('noir', 1.493925025312256),\n", + " ('sidney', 1.493925025312256),\n", + " ('outstanding', 1.4910053152089213),\n", + " ('mann', 1.4894785973551214),\n", + " ('pleasantly', 1.4894785973551214),\n", + " ('nancy', 1.488077055429833),\n", + " ('marie', 1.4825711915553104),\n", + " ('marvelous', 1.4739999415389962),\n", + " ('excellent', 1.4647538505723599),\n", + " ('ruth', 
1.4596256342054401),\n", + " ('stanwyck', 1.4412101187160054),\n", + " ('widmark', 1.4350845252893227),\n", + " ('splendid', 1.4271163556401458),\n", + " ('chan', 1.423108334242607),\n", + " ('exceptional', 1.4201959127955721),\n", + " ('tender', 1.410986973710262),\n", + " ('gentle', 1.4078005663408544),\n", + " ('poignant', 1.4022947024663317),\n", + " ('gem', 1.3932148039644643),\n", + " ('amazing', 1.3919815802404802),\n", + " ('chilling', 1.3862943611198906),\n", + " ('captivating', 1.3862943611198906),\n", + " ('fisher', 1.3862943611198906),\n", + " ('davies', 1.3862943611198906),\n", + " ('darker', 1.3652409519220583),\n", + " ('april', 1.3499267169490159),\n", + " ('kelly', 1.3461743673304654),\n", + " ('blake', 1.3418425985490567),\n", + " ('overlooked', 1.329135947279942),\n", + " ('ralph', 1.32818673031261),\n", + " ('bette', 1.3156767939059373),\n", + " ('hoffman', 1.3150668518315229),\n", + " ('cole', 1.3121863889661687),\n", + " ('shines', 1.3049487216659381),\n", + " ('powerful', 1.2999662776313934),\n", + " ('notch', 1.2950456896547455),\n", + " ('remarkable', 1.2883688239495823),\n", + " ('pitt', 1.286210902562908),\n", + " ('winters', 1.2833463918674481),\n", + " ('vivid', 1.2762934659055623),\n", + " ('gritty', 1.2757524867200667),\n", + " ('giallo', 1.2745029551317739),\n", + " ('portrait', 1.2704625455947689),\n", + " ('innocence', 1.2694300209805796),\n", + " ('psychiatrist', 1.2685113254635072),\n", + " ('favorite', 1.2668956297860055),\n", + " ('ensemble', 1.2656663733312759),\n", + " ('stunning', 1.2622417124499117),\n", + " ('burns', 1.259880436264232),\n", + " ('garbo', 1.258954938743289),\n", + " ('barbara', 1.2580400255962119),\n", + " ('panic', 1.2527629684953681),\n", + " ('holly', 1.2527629684953681),\n", + " ('philip', 1.2527629684953681),\n", + " ('carol', 1.2481440226390734),\n", + " ('perfect', 1.246742480713785),\n", + " ('appreciated', 1.2462482874741743),\n", + " ('favourite', 1.2411123512753928),\n", + " ('journey', 
1.2367626271489269),\n", + " ('rural', 1.235471471385307),\n", + " ('bond', 1.2321436812926323),\n", + " ('builds', 1.2305398317106577),\n", + " ('brilliant', 1.2287554137664785),\n", + " ('brooklyn', 1.2286654169163074),\n", + " ('von', 1.225175011976539),\n", + " ('unfolds', 1.2163953243244932),\n", + " ('recommended', 1.2163953243244932),\n", + " ('daniel', 1.20215296760895),\n", + " ('perfectly', 1.1971931173405572),\n", + " ('crafted', 1.1962507582320256),\n", + " ('prince', 1.1939224684724346),\n", + " ('troubled', 1.192138346678933),\n", + " ('consequences', 1.1865810616140668),\n", + " ('haunting', 1.1814999484738773),\n", + " ('cinderella', 1.180052620608284),\n", + " ('alexander', 1.1759989522835299),\n", + " ('emotions', 1.1753049094563641),\n", + " ('boxing', 1.1735135968412274),\n", + " ('subtle', 1.1734135017508081),\n", + " ('curtis', 1.1649873576129823),\n", + " ('rare', 1.1566438362402944),\n", + " ('loved', 1.1563661500586044),\n", + " ('daughters', 1.1526795099383853),\n", + " ('courage', 1.1438688802562305),\n", + " ('dentist', 1.1426722784621401),\n", + " ('highly', 1.1420208631618658),\n", + " ('nominated', 1.1409146683587992),\n", + " ('tony', 1.1397491942285991),\n", + " ('draws', 1.1325138403437911),\n", + " ('everyday', 1.1306150197542835),\n", + " ('contrast', 1.1284652518177909),\n", + " ('cried', 1.1213405397456659),\n", + " ('fabulous', 1.1210851445201684),\n", + " ('ned', 1.120591195386885),\n", + " ('fay', 1.120591195386885),\n", + " ('emma', 1.1184149159642893),\n", + " ('sensitive', 1.113318436057805),\n", + " ('smooth', 1.1089750757036563),\n", + " ('dramas', 1.1080910326226534),\n", + " ('today', 1.1050431789984001),\n", + " ('helps', 1.1023091505494358),\n", + " ('inspiring', 1.0986122886681098),\n", + " ('jimmy', 1.0937696641923216),\n", + " ('awesome', 1.0931328229034842),\n", + " ('unique', 1.0881409888008142),\n", + " ('tragic', 1.0871835928444868),\n", + " ('intense', 1.0870514662670339),\n", + " ('stellar', 
1.0857088838322018),\n", + " ('rival', 1.0822184788924332),\n", + " ('provides', 1.0797081340289569),\n", + " ('depression', 1.0782034170369026),\n", + " ('shy', 1.0775588794702773),\n", + " ('carrie', 1.076139432816051),\n", + " ('blend', 1.0753554265038423),\n", + " ('hank', 1.0736109864626924),\n", + " ('diana', 1.0726368022648489),\n", + " ('adorable', 1.0726368022648489),\n", + " ('unexpected', 1.0722255334949147),\n", + " ('achievement', 1.0668635903535293),\n", + " ('bettie', 1.0663514264498881),\n", + " ('happiness', 1.0632729222228008),\n", + " ('glorious', 1.0608719606852626),\n", + " ('davis', 1.0541605260972757),\n", + " ('terrifying', 1.0525211814678428),\n", + " ('beauty', 1.050410186850232),\n", + " ('ideal', 1.0479685558493548),\n", + " ('fears', 1.0467872208035236),\n", + " ('hong', 1.0438040521731147),\n", + " ('seasons', 1.0433496099930604),\n", + " ('fascinating', 1.0414538748281612),\n", + " ('carries', 1.0345904299031787),\n", + " ('satisfying', 1.0321225473992768),\n", + " ('definite', 1.0319209141694374),\n", + " ('touched', 1.0296194171811581),\n", + " ('greatest', 1.0248947127715422),\n", + " ('creates', 1.0241097613701886),\n", + " ('aunt', 1.023388867430522),\n", + " ('walter', 1.022328983918479),\n", + " ('spectacular', 1.0198314108149955),\n", + " ('portrayal', 1.0189810189761024),\n", + " ('ann', 1.0127808528183286),\n", + " ('enterprise', 1.0116009116784799),\n", + " ('musicals', 1.0096648026516135),\n", + " ('deeply', 1.0094845087721023),\n", + " ('incredible', 1.0061677561461084),\n", + " ('mature', 1.0060195018402847),\n", + " ('triumph', 0.99682959435816731),\n", + " ('margaret', 0.99682959435816731),\n", + " ('navy', 0.99493385919326827),\n", + " ('harry', 0.99176919305006062),\n", + " ('lucas', 0.990398704027877),\n", + " ('sweet', 0.98966110487955483),\n", + " ('joey', 0.98794672078059009),\n", + " ('oscar', 0.98721905111049713),\n", + " ('balance', 0.98649499054740353),\n", + " ('warm', 0.98485340331145166),\n", + " ('ages', 
0.98449898190068863),\n", + " ('glover', 0.98082925301172619),\n", + " ('guilt', 0.98082925301172619),\n", + " ('carrey', 0.98082925301172619),\n", + " ('learns', 0.97881108885548895),\n", + " ('unusual', 0.97788374278196932),\n", + " ('sons', 0.97777581552483595),\n", + " ('complex', 0.97761897738147796),\n", + " ('essence', 0.97753435711487369),\n", + " ('brazil', 0.9769153536905899),\n", + " ('widow', 0.97650959186720987),\n", + " ('solid', 0.97537964824416146),\n", + " ('beautiful', 0.97326301262841053),\n", + " ('holmes', 0.97246100334120955),\n", + " ('awe', 0.97186058302896583),\n", + " ('vhs', 0.97116734209998934),\n", + " ('eerie', 0.97116734209998934),\n", + " ('lonely', 0.96873720724669754),\n", + " ('grim', 0.96873720724669754),\n", + " ('sport', 0.96825047080486615),\n", + " ('debut', 0.96508089604358704),\n", + " ('destiny', 0.96343751029985703),\n", + " ('thrillers', 0.96281074750904794),\n", + " ('tears', 0.95977584381389391),\n", + " ('rose', 0.95664202739772253),\n", + " ('feelings', 0.95551144502743635),\n", + " ('ginger', 0.95551144502743635),\n", + " ('winning', 0.95471810900804055),\n", + " ('stanley', 0.95387344302319799),\n", + " ('cox', 0.95343027882361187),\n", + " ('paris', 0.95278479030472663),\n", + " ('heart', 0.95238806924516806),\n", + " ('hooked', 0.95155887071161305),\n", + " ('comfortable', 0.94803943018873538),\n", + " ('mgm', 0.94446160884085151),\n", + " ('masterpiece', 0.94155039863339296),\n", + " ('themes', 0.94118828349588235),\n", + " ('danny', 0.93967118051821874),\n", + " ('anime', 0.93378388932167222),\n", + " ('perry', 0.93328830824272613),\n", + " ('joy', 0.93301752567946861),\n", + " ('lovable', 0.93081883243706487),\n", + " ('hal', 0.92953595862417571),\n", + " ('mysteries', 0.92953595862417571),\n", + " ('louis', 0.92871325187271225),\n", + " ('charming', 0.92520609553210742),\n", + " ('urban', 0.92367083917177761),\n", + " ('allows', 0.92183091224977043),\n", + " ('impact', 0.91815814604895041),\n", + " 
('gradually', 0.91629073187415511),\n", + " ('lifestyle', 0.91629073187415511),\n", + " ('italy', 0.91629073187415511),\n", + " ('spy', 0.91289514287301687),\n", + " ('treat', 0.91193342650519937),\n", + " ('subsequent', 0.91056005716517008),\n", + " ('kennedy', 0.90981821736853763),\n", + " ('loving', 0.90967549275543591),\n", + " ('surprising', 0.90937028902958128),\n", + " ('quiet', 0.90648673177753425),\n", + " ('winter', 0.90624039602065365),\n", + " ('reveals', 0.90490540964902977),\n", + " ('raw', 0.90445627422715225),\n", + " ('funniest', 0.90078654533818991),\n", + " ('pleased', 0.89994159387262562),\n", + " ('norman', 0.89994159387262562),\n", + " ('thief', 0.89874642222324552),\n", + " ('season', 0.89827222637147675),\n", + " ('secrets', 0.89794159320595857),\n", + " ('colorful', 0.89705936994626756),\n", + " ('highest', 0.8967461358011849),\n", + " ('compelling', 0.89462923509297576),\n", + " ('danes', 0.89248008318043659),\n", + " ('castle', 0.88967708335606499),\n", + " ('kudos', 0.88889175768604067),\n", + " ('great', 0.88810470901464589),\n", + " ('baseball', 0.88730319500090271),\n", + " ('subtitles', 0.88730319500090271),\n", + " ('bleak', 0.88730319500090271),\n", + " ('winner', 0.88643776872447388),\n", + " ('tragedy', 0.88563699078315261),\n", + " ('todd', 0.88551907320740142),\n", + " ('nicely', 0.87924946019380601),\n", + " ('arthur', 0.87546873735389985),\n", + " ('essential', 0.87373111745535925),\n", + " ('gorgeous', 0.8731725250935497),\n", + " ('fonda', 0.87294029100054127),\n", + " ('eastwood', 0.87139541196626402),\n", + " ('focuses', 0.87082835779739776),\n", + " ('enjoyed', 0.87070195951624607),\n", + " ('natural', 0.86997924506912838),\n", + " ('intensity', 0.86835126958503595),\n", + " ('witty', 0.86824103423244681),\n", + " ('rob', 0.8642954367557748),\n", + " ('worlds', 0.86377269759070874),\n", + " ('health', 0.86113891179907498),\n", + " ('magical', 0.85953791528170564),\n", + " ('deeper', 0.85802182375017932),\n", + " ('lucy', 
0.85618680780444956),\n", + " ('moving', 0.85566611005772031),\n", + " ('lovely', 0.85290640004681306),\n", + " ('purple', 0.8513711857748395),\n", + " ('memorable', 0.84801189112086062),\n", + " ('sings', 0.84729786038720367),\n", + " ('craig', 0.84342938360928321),\n", + " ('modesty', 0.84342938360928321),\n", + " ('relate', 0.84326559685926517),\n", + " ('episodes', 0.84223712084137292),\n", + " ('strong', 0.84167135777060931),\n", + " ('smith', 0.83959811108590054),\n", + " ('tear', 0.83704136022001441),\n", + " ('apartment', 0.83333115290549531),\n", + " ('princess', 0.83290912293510388),\n", + " ('disagree', 0.83290912293510388),\n", + " ('kung', 0.83173334384609199),\n", + " ('adventure', 0.83150561393278388),\n", + " ('columbo', 0.82667857318446791),\n", + " ('jake', 0.82667857318446791),\n", + " ('adds', 0.82485652591452319),\n", + " ('hart', 0.82472353834866463),\n", + " ('strength', 0.82417544296634937),\n", + " ('realizes', 0.82360006895738058),\n", + " ('dave', 0.8232003088081431),\n", + " ('childhood', 0.82208086393583857),\n", + " ('forbidden', 0.81989888619908913),\n", + " ('tight', 0.81883539572344199),\n", + " ('surreal', 0.8178506590609026),\n", + " ('manager', 0.81770990320170756),\n", + " ('dancer', 0.81574950265227764),\n", + " ('con', 0.81093021621632877),\n", + " ('studios', 0.81093021621632877),\n", + " ('miike', 0.80821651034473263),\n", + " ('realistic', 0.80807714723392232),\n", + " ('explicit', 0.80792269515237358),\n", + " ('kurt', 0.8060875917405409),\n", + " ('traditional', 0.80535917116687328),\n", + " ('deals', 0.80535917116687328),\n", + " ('holds', 0.80493858654806194),\n", + " ('carl', 0.80437281567016972),\n", + " ('touches', 0.80396154690023547),\n", + " ('gene', 0.80314807577427383),\n", + " ('albert', 0.8027669055771679),\n", + " ('abc', 0.80234647252493729),\n", + " ('cry', 0.80011930011211307),\n", + " ('sides', 0.7995275841185171),\n", + " ('develops', 0.79850769621777162),\n", + " ('eyre', 0.79850769621777162),\n", + " 
('dances', 0.79694397424158891),\n", + " ('oscars', 0.79633141679517616),\n", + " ('legendary', 0.79600456599965308),\n", + " ('importance', 0.79492987486988764),\n", + " ('hearted', 0.79492987486988764),\n", + " ('portraying', 0.79356592830699269),\n", + " ('impressed', 0.79258107754813223),\n", + " ('waters', 0.79112758892014912),\n", + " ('empire', 0.79078565012386137),\n", + " ('edge', 0.789774016249017),\n", + " ('environment', 0.78845736036427028),\n", + " ('jean', 0.78845736036427028),\n", + " ('sentimental', 0.7864791203521645),\n", + " ('captured', 0.78623760362595729),\n", + " ('styles', 0.78592891401091158),\n", + " ('daring', 0.78592891401091158),\n", + " ('backgrounds', 0.78275933924963248),\n", + " ('frank', 0.78275933924963248),\n", + " ('matches', 0.78275933924963248),\n", + " ('tense', 0.78275933924963248),\n", + " ('gothic', 0.78209466657644144),\n", + " ('sharp', 0.7814397877056235),\n", + " ('achieved', 0.78015855754957497),\n", + " ('court', 0.77947526404844247),\n", + " ('steals', 0.7789140023173704),\n", + " ('rules', 0.77844476107184035),\n", + " ('colors', 0.77684619943659217),\n", + " ('reunion', 0.77318988823348167),\n", + " ('covers', 0.77139937745969345),\n", + " ('tale', 0.77010822169607374),\n", + " ('rain', 0.7683706017975328),\n", + " ('denzel', 0.76804848873306297),\n", + " ('stays', 0.76787072675588186),\n", + " ('blob', 0.76725515271366718),\n", + " ('conventional', 0.76214005204689672),\n", + " ('maria', 0.76214005204689672),\n", + " ('fresh', 0.76158434211317383),\n", + " ('midnight', 0.76096977689870637),\n", + " ('landscape', 0.75852993982279704),\n", + " ('animated', 0.75768570169751648),\n", + " ('titanic', 0.75666058628227129),\n", + " ('sunday', 0.75666058628227129),\n", + " ('spring', 0.7537718023763802),\n", + " ('cagney', 0.7537718023763802),\n", + " ('enjoyable', 0.75246375771636476),\n", + " ('immensely', 0.75198768058287868),\n", + " ('sir', 0.7507762933965817),\n", + " ('nevertheless', 0.75067102469813185),\n", + " 
('driven', 0.74994477895307854),\n", + " ('performances', 0.74883252516063137),\n", + " ('memories', 0.74721440183022114),\n", + " ('nowadays', 0.74721440183022114),\n", + " ('simple', 0.74641420974143258),\n", + " ('golden', 0.74533293373051557),\n", + " ('leslie', 0.74533293373051557),\n", + " ('lovers', 0.74497224842453125),\n", + " ('relationship', 0.74484232345601786),\n", + " ('supporting', 0.74357803418683721),\n", + " ('che', 0.74262723782331497),\n", + " ('packed', 0.7410032017375805),\n", + " ('trek', 0.74021469141793106),\n", + " ('provoking', 0.73840377214806618),\n", + " ('strikes', 0.73759894313077912),\n", + " ('depiction', 0.73682224406260699),\n", + " ('emotional', 0.73678211645681524),\n", + " ('secretary', 0.7366322924996842),\n", + " ('influenced', 0.73511137965897755),\n", + " ('florida', 0.73511137965897755),\n", + " ('germany', 0.73288750920945944),\n", + " ('brings', 0.73142936713096229),\n", + " ('lewis', 0.73129894652432159),\n", + " ('elderly', 0.73088750854279239),\n", + " ('owner', 0.72743625403857748),\n", + " ('streets', 0.72666987259858895),\n", + " ('henry', 0.72642196944481741),\n", + " ('portrays', 0.72593700338293632),\n", + " ('bears', 0.7252354951114458),\n", + " ('china', 0.72489587887452556),\n", + " ('anger', 0.72439972406404984),\n", + " ('society', 0.72433010799663333),\n", + " ('available', 0.72415741730250549),\n", + " ('best', 0.72347034060446314),\n", + " ('bugs', 0.72270598280148979),\n", + " ('magic', 0.71878961117328299),\n", + " ('verhoeven', 0.71846498854423513),\n", + " ('delivers', 0.71846498854423513),\n", + " ('jim', 0.71783979315031676),\n", + " ('donald', 0.71667767797013937),\n", + " ('endearing', 0.71465338578090898),\n", + " ('relationships', 0.71393795022901896),\n", + " ('greatly', 0.71256526641704687),\n", + " ('charlie', 0.71024161391924534),\n", + " ('brad', 0.71024161391924534),\n", + " ('simon', 0.70967648251115578),\n", + " ('effectively', 0.70914752190638641),\n", + " ('march', 
0.70774597998109789),\n", + " ('atmosphere', 0.70744773070214162),\n", + " ('influence', 0.70733181555190172),\n", + " ('genius', 0.706392407309966),\n", + " ('emotionally', 0.70556970055850243),\n", + " ('ken', 0.70526854109229009),\n", + " ('identity', 0.70484322032313651),\n", + " ('sophisticated', 0.70470800296102132),\n", + " ('dan', 0.70457587638356811),\n", + " ('andrew', 0.70329955202396321),\n", + " ('india', 0.70144598337464037),\n", + " ('roy', 0.69970458110610434),\n", + " ('surprisingly', 0.6995780708902356),\n", + " ('sky', 0.69780919366575667),\n", + " ('romantic', 0.69664981111114743),\n", + " ('match', 0.69566924999265523),\n", + " ('britain', 0.69314718055994529),\n", + " ('beatty', 0.69314718055994529),\n", + " ('affected', 0.69314718055994529),\n", + " ('cowboy', 0.69314718055994529),\n", + " ('wave', 0.69314718055994529),\n", + " ('stylish', 0.69314718055994529),\n", + " ('bitter', 0.69314718055994529),\n", + " ('patient', 0.69314718055994529),\n", + " ('meets', 0.69314718055994529),\n", + " ('love', 0.69198533541937324),\n", + " ('paul', 0.68980827929443067),\n", + " ('andy', 0.68846333124751902),\n", + " ('performance', 0.68797386327972465),\n", + " ('patrick', 0.68645819240914863),\n", + " ('unlike', 0.68546468438792907),\n", + " ('brooks', 0.68433655087779044),\n", + " ('refuses', 0.68348526964820844),\n", + " ('award', 0.6824518914431974),\n", + " ('complaint', 0.6824518914431974),\n", + " ('ride', 0.68229716453587952),\n", + " ('dawson', 0.68171848473632257),\n", + " ('luke', 0.68158635815886937),\n", + " ('wells', 0.68087708796813096),\n", + " ('france', 0.6804081547825156),\n", + " ('handsome', 0.68007509899259255),\n", + " ('sports', 0.68007509899259255),\n", + " ('rebel', 0.67875844310784572),\n", + " ('directs', 0.67875844310784572),\n", + " ('greater', 0.67605274720064523),\n", + " ('dreams', 0.67599410133369586),\n", + " ('effective', 0.67565402311242806),\n", + " ('interpretation', 0.67479804189174875),\n", + " ('works', 
0.67445504754779284),\n", + " ('brando', 0.67445504754779284),\n", + " ('noble', 0.6737290947028437),\n", + " ('paced', 0.67314651385327573),\n", + " ('le', 0.67067432470788668),\n", + " ('master', 0.67015766233524654),\n", + " ('h', 0.6696166831497512),\n", + " ('rings', 0.66904962898088483),\n", + " ('easy', 0.66895995494594152),\n", + " ('city', 0.66820823221269321),\n", + " ('sunshine', 0.66782937257565544),\n", + " ('succeeds', 0.66647893347778397),\n", + " ('relations', 0.664159643686693),\n", + " ('england', 0.66387679825983203),\n", + " ('glimpse', 0.66329421741026418),\n", + " ('aired', 0.66268797307523675),\n", + " ('sees', 0.66263163663399482),\n", + " ('both', 0.66248336767382998),\n", + " ('definitely', 0.66199789483898808),\n", + " ('imaginative', 0.66139848224536502),\n", + " ('appreciate', 0.66083893732728749),\n", + " ('tricks', 0.66071190480679143),\n", + " ('striking', 0.66071190480679143),\n", + " ('carefully', 0.65999497324304479),\n", + " ('complicated', 0.65981076029235353),\n", + " ('perspective', 0.65962448852130173),\n", + " ('trilogy', 0.65877953705573755),\n", + " ('future', 0.65834665141052828),\n", + " ('lion', 0.65742909795786608),\n", + " ('victor', 0.65540685257709819),\n", + " ('douglas', 0.65540685257709819),\n", + " ('inspired', 0.65459851044271034),\n", + " ('marriage', 0.65392646740666405),\n", + " ('demands', 0.65392646740666405),\n", + " ('father', 0.65172321672194655),\n", + " ('page', 0.65123628494430852),\n", + " ('instant', 0.65058756614114943),\n", + " ('era', 0.6495567444850836),\n", + " ('ruthless', 0.64934455790155243),\n", + " ('saga', 0.64934455790155243),\n", + " ('joan', 0.64891392558311978),\n", + " ('joseph', 0.64841128671855386),\n", + " ('workers', 0.64829661439459352),\n", + " ('fantasy', 0.64726757480925168),\n", + " ('accomplished', 0.64551913157069074),\n", + " ('distant', 0.64551913157069074),\n", + " ('manhattan', 0.64435701639051324),\n", + " ('personal', 0.64355023942057321),\n", + " ('pushing', 
0.64313675998528386),\n", + " ('meeting', 0.64313675998528386),\n", + " ('individual', 0.64313675998528386),\n", + " ('pleasant', 0.64250344774119039),\n", + " ('brave', 0.64185388617239469),\n", + " ('william', 0.64083139119578469),\n", + " ('hudson', 0.64077919504262937),\n", + " ('friendly', 0.63949446706762514),\n", + " ('eccentric', 0.63907995928966954),\n", + " ('awards', 0.63875310849414646),\n", + " ('jack', 0.63838309514997038),\n", + " ('seeking', 0.63808740337691783),\n", + " ('colonel', 0.63757732940513456),\n", + " ('divorce', 0.63757732940513456),\n", + " ('jane', 0.63443957973316734),\n", + " ('keeping', 0.63414883979798953),\n", + " ('gives', 0.63383568159497883),\n", + " ('ted', 0.63342794585832296),\n", + " ('animation', 0.63208692379869902),\n", + " ('progress', 0.6317782341836532),\n", + " ('concert', 0.63127177684185776),\n", + " ('larger', 0.63127177684185776),\n", + " ('nation', 0.6296337748376194),\n", + " ('albeit', 0.62739580299716491),\n", + " ('adapted', 0.62613647027698516),\n", + " ('discovers', 0.62542900650499444),\n", + " ('classic', 0.62504956428050518),\n", + " ('segment', 0.62335141862440335),\n", + " ('morgan', 0.62303761437291871),\n", + " ('mouse', 0.62294292188669675),\n", + " ('impressive', 0.62211140744319349),\n", + " ('artist', 0.62168821657780038),\n", + " ('ultimate', 0.62168821657780038),\n", + " ('griffith', 0.62117368093485603),\n", + " ('emily', 0.62082651898031915),\n", + " ('drew', 0.62082651898031915),\n", + " ('moved', 0.6197197120051281),\n", + " ('profound', 0.61903920840622351),\n", + " ('families', 0.61903920840622351),\n", + " ('innocent', 0.61851219917136446),\n", + " ('versions', 0.61730910416844087),\n", + " ('eddie', 0.61691981517206107),\n", + " ('criticism', 0.61651395453902935),\n", + " ('nature', 0.61594514653194088),\n", + " ('recognized', 0.61518563909023349),\n", + " ('sexuality', 0.61467556511845012),\n", + " ('contract', 0.61400986000122149),\n", + " ('brian', 0.61344043794920278),\n", + " 
('remembered', 0.6131044728864089),\n", + " ('determined', 0.6123858239154869),\n", + " ('offers', 0.61207935747116349),\n", + " ('pleasure', 0.61195702582993206),\n", + " ('washington', 0.61180154110599294),\n", + " ('images', 0.61159731359583758),\n", + " ('games', 0.61067095873570676),\n", + " ('academy', 0.60872983874736208),\n", + " ('fashioned', 0.60798937221963845),\n", + " ('melodrama', 0.60749173598145145),\n", + " ('peoples', 0.60613580357031549),\n", + " ('charismatic', 0.60613580357031549),\n", + " ('rough', 0.60613580357031549),\n", + " ('dealing', 0.60517840761398811),\n", + " ('fine', 0.60496962268013299),\n", + " ('tap', 0.60391604683200273),\n", + " ('trio', 0.60157998703445481),\n", + " ('russell', 0.60120968523425966),\n", + " ('figures', 0.60077386042893011),\n", + " ('ward', 0.60005675749393339),\n", + " ('shine', 0.59911823091166894),\n", + " ('brady', 0.59911823091166894),\n", + " ('job', 0.59845562125168661),\n", + " ('satisfied', 0.59652034487087369),\n", + " ('river', 0.59637962862495086),\n", + " ('brown', 0.595773016534769),\n", + " ('believable', 0.59566072133302495),\n", + " ('bound', 0.59470710774669278),\n", + " ('always', 0.59470710774669278),\n", + " ('hall', 0.5933967777928858),\n", + " ('cook', 0.5916777203950857),\n", + " ('claire', 0.59136448625000293),\n", + " ('broadway', 0.59033768669372433),\n", + " ('anna', 0.58778666490211906),\n", + " ('peace', 0.58628403501758408),\n", + " ('visually', 0.58539431926349916),\n", + " ('falk', 0.58525821854876026),\n", + " ('morality', 0.58525821854876026),\n", + " ('growing', 0.58466653756587539),\n", + " ('experiences', 0.58314628534561685),\n", + " ('stood', 0.58314628534561685),\n", + " ('touch', 0.58122926435596001),\n", + " ('lives', 0.5810976767513224),\n", + " ('kubrick', 0.58066919713325493),\n", + " ('timing', 0.58047401805583243),\n", + " ('struggles', 0.57981849525294216),\n", + " ('expressions', 0.57981849525294216),\n", + " ('authentic', 0.57848427223980559),\n", + " 
('helen', 0.57763429343810091),\n", + " ('pre', 0.57700753064729182),\n", + " ('quirky', 0.5753641449035618),\n", + " ('young', 0.57531672344534313),\n", + " ('inner', 0.57454143815209846),\n", + " ('mexico', 0.57443087372056334),\n", + " ('clint', 0.57380042292737909),\n", + " ('sisters', 0.57286101468544337),\n", + " ('realism', 0.57226528899949558),\n", + " ('personalities', 0.5720692490067093),\n", + " ('french', 0.5720692490067093),\n", + " ('surprises', 0.57113222999698177),\n", + " ('adventures', 0.57113222999698177),\n", + " ('overcome', 0.5697681593994407),\n", + " ('timothy', 0.56953322459276867),\n", + " ('tales', 0.56909453188996639),\n", + " ('war', 0.56843317302781682),\n", + " ('civil', 0.5679840376059393),\n", + " ('countries', 0.56737779327091187),\n", + " ('streep', 0.56710645966458029),\n", + " ('tradition', 0.56685345523565323),\n", + " ('oliver', 0.56673325570428668),\n", + " ('australia', 0.56580775818334383),\n", + " ('understanding', 0.56531380905006046),\n", + " ('players', 0.56509525370004821),\n", + " ('knowing', 0.56489284503626647),\n", + " ('rogers', 0.56421349718405212),\n", + " ('suspenseful', 0.56368911332305849),\n", + " ('variety', 0.56368911332305849),\n", + " ('true', 0.56281525180810066),\n", + " ('jr', 0.56220982311246936),\n", + " ('psychological', 0.56108745854687891),\n", + " ('branagh', 0.55961578793542266),\n", + " ('wealth', 0.55961578793542266),\n", + " ('performing', 0.55961578793542266),\n", + " ('odds', 0.55961578793542266),\n", + " ('sent', 0.55961578793542266),\n", + " ('reminiscent', 0.55961578793542266),\n", + " ('grand', 0.55961578793542266),\n", + " ('overwhelming', 0.55961578793542266),\n", + " ('brothers', 0.55891181043362848),\n", + " ('howard', 0.55811089675600245),\n", + " ('david', 0.55693122256475369),\n", + " ('generation', 0.55628799784274796),\n", + " ('grow', 0.55612538299565417),\n", + " ('survival', 0.55594605904646033),\n", + " ('mainstream', 0.55574731115750231),\n", + " ('dick', 
0.55431073570572953),\n", + " ('charm', 0.55288175575407861),\n", + " ('kirk', 0.55278982286502287),\n", + " ('twists', 0.55244729845681018),\n", + " ('gangster', 0.55206858230003986),\n", + " ('jeff', 0.55179306225421365),\n", + " ('family', 0.55116244510065526),\n", + " ('tend', 0.55053307336110335),\n", + " ('thanks', 0.55049088015842218),\n", + " ('world', 0.54744234723432639),\n", + " ('sutherland', 0.54743536937855164),\n", + " ('life', 0.54695514434959924),\n", + " ('disc', 0.54654370636806993),\n", + " ('bug', 0.54654370636806993),\n", + " ('tribute', 0.5455111817538808),\n", + " ('europe', 0.54522705048332309),\n", + " ('sacrifice', 0.54430155296238014),\n", + " ('color', 0.54405127139431109),\n", + " ('superior', 0.54333490233128523),\n", + " ('york', 0.54318235866536513),\n", + " ('pulls', 0.54266622962164945),\n", + " ('hearts', 0.54232429082536171),\n", + " ('jackson', 0.54232429082536171),\n", + " ('enjoy', 0.54124285135906114),\n", + " ('redemption', 0.54056759296472823),\n", + " ('madness', 0.540384426007535),\n", + " ('hamilton', 0.5389965007326869),\n", + " ('stands', 0.5389965007326869),\n", + " ('trial', 0.5389965007326869),\n", + " ('greek', 0.5389965007326869),\n", + " ('each', 0.5388212312554177),\n", + " ('faithful', 0.53773307668591508),\n", + " ('received', 0.5372768098531604),\n", + " ('jealous', 0.53714293208336406),\n", + " ('documentaries', 0.53714293208336406),\n", + " ('different', 0.53709860682460819),\n", + " ('describes', 0.53680111016925136),\n", + " ('shorts', 0.53596159703753288),\n", + " ('brilliance', 0.53551823635636209),\n", + " ('mountains', 0.53492317534505118),\n", + " ('share', 0.53408248593025787),\n", + " ('dealt', 0.53408248593025787),\n", + " ('providing', 0.53329847961804933),\n", + " ('explore', 0.53329847961804933),\n", + " ('series', 0.5325809226575603),\n", + " ('fellow', 0.5323318289869543),\n", + " ('loves', 0.53062825106217038),\n", + " ('olivier', 0.53062825106217038),\n", + " ('revolution', 
0.53062825106217038),\n", + " ('roman', 0.53062825106217038),\n", + " ('century', 0.53002783074992665),\n", + " ('musical', 0.52966871156747064),\n", + " ('heroic', 0.52925932545482868),\n", + " ('ironically', 0.52806743020049673),\n", + " ('approach', 0.52806743020049673),\n", + " ('temple', 0.52806743020049673),\n", + " ('moves', 0.5279372642387119),\n", + " ('gift', 0.52702030968597136),\n", + " ('julie', 0.52609309589677911),\n", + " ('tells', 0.52415107836314001),\n", + " ('radio', 0.52394671172868779),\n", + " ('uncle', 0.52354439617376536),\n", + " ('union', 0.52324814376454787),\n", + " ('deep', 0.52309571635780505),\n", + " ('reminds', 0.52157841554225237),\n", + " ('famous', 0.52118841080153722),\n", + " ('jazz', 0.52053443789295151),\n", + " ('dennis', 0.51987545928590861),\n", + " ('epic', 0.51919387343650736),\n", + " ('adult', 0.519167695083386),\n", + " ('shows', 0.51915322220375304),\n", + " ('performed', 0.5191244265806858),\n", + " ('demons', 0.5191244265806858),\n", + " ('eric', 0.51879379341516751),\n", + " ('discovered', 0.51879379341516751),\n", + " ('youth', 0.5185626062681431),\n", + " ('human', 0.51851411224987087),\n", + " ('tarzan', 0.51813827061227724),\n", + " ('ourselves', 0.51794309153485463),\n", + " ('wwii', 0.51758240622887042),\n", + " ('passion', 0.5162164724008671),\n", + " ('desire', 0.51607497965213445),\n", + " ('pays', 0.51581316527702981),\n", + " ('fox', 0.51557622652458857),\n", + " ('dirty', 0.51557622652458857),\n", + " ('symbolism', 0.51546600332249293),\n", + " ('sympathetic', 0.51546600332249293),\n", + " ('attitude', 0.51530993621331933),\n", + " ('appearances', 0.51466440007315639),\n", + " ('jeremy', 0.51466440007315639),\n", + " ('fun', 0.51439068993048687),\n", + " ('south', 0.51420972175023116),\n", + " ('arrives', 0.51409894911095988),\n", + " ('present', 0.51341965894303732),\n", + " ('com', 0.51326167856387173),\n", + " ('smile', 0.51265880484765169),\n", + " ('fits', 0.51082562376599072),\n", + " 
('provided', 0.51082562376599072),\n", + " ('carter', 0.51082562376599072),\n", + " ('ring', 0.51082562376599072),\n", + " ('aging', 0.51082562376599072),\n", + " ('countryside', 0.51082562376599072),\n", + " ('alan', 0.51082562376599072),\n", + " ('visit', 0.51082562376599072),\n", + " ('begins', 0.51015650363396647),\n", + " ('success', 0.50900578704900468),\n", + " ('japan', 0.50900578704900468),\n", + " ('accurate', 0.50895471583017893),\n", + " ('proud', 0.50800474742434931),\n", + " ('daily', 0.5075946031845443),\n", + " ('atmospheric', 0.50724780241810674),\n", + " ('karloff', 0.50724780241810674),\n", + " ('recently', 0.50714914903668207),\n", + " ('fu', 0.50704490092608467),\n", + " ('horrors', 0.50656122497953315),\n", + " ('finding', 0.50637127341661037),\n", + " ('lust', 0.5059356384717989),\n", + " ('hitchcock', 0.50574947073413001),\n", + " ('among', 0.50334004951332734),\n", + " ('viewing', 0.50302139827440906),\n", + " ('shining', 0.50262885656181222),\n", + " ('investigation', 0.50262885656181222),\n", + " ('duo', 0.5020919437972361),\n", + " ('cameron', 0.5020919437972361),\n", + " ('finds', 0.50128303100539795),\n", + " ('contemporary', 0.50077528791248915),\n", + " ('genuine', 0.50046283673044401),\n", + " ('frightening', 0.49995595152908684),\n", + " ('plays', 0.49975983848890226),\n", + " ('age', 0.49941323171424595),\n", + " ('position', 0.49899116611898781),\n", + " ('continues', 0.49863035067217237),\n", + " ('roles', 0.49839716550752178),\n", + " ('james', 0.49837216269470402),\n", + " ('individuals', 0.49824684155913052),\n", + " ('brought', 0.49783842823917956),\n", + " ('hilarious', 0.49714551986191058),\n", + " ('brutal', 0.49681488669639234),\n", + " ('appropriate', 0.49643688631389105),\n", + " ('dance', 0.49581998314812048),\n", + " ('league', 0.49578774640145024),\n", + " ('helping', 0.49578774640145024),\n", + " ('answers', 0.49578774640145024),\n", + " ('stunts', 0.49561620510246196),\n", + " ('traveling', 
0.49532143723002542),\n", + " ('thoroughly', 0.49414593456733524),\n", + " ('depicted', 0.49317068852726992),\n", + " ('honor', 0.49247648509779424),\n", + " ('combination', 0.49247648509779424),\n", + " ('differences', 0.49247648509779424),\n", + " ('fully', 0.49213349075383811),\n", + " ('tracy', 0.49159426183810306),\n", + " ('battles', 0.49140753790888908),\n", + " ('possibility', 0.49112055268665822),\n", + " ('romance', 0.4901589869574316),\n", + " ('initially', 0.49002249613622745),\n", + " ('happy', 0.4898997500608791),\n", + " ('crime', 0.48977221456815834),\n", + " ('singing', 0.4893852925281213),\n", + " ('especially', 0.48901267837860624),\n", + " ('shakespeare', 0.48754793889664511),\n", + " ('hugh', 0.48729512635579658),\n", + " ('detail', 0.48609484250827351),\n", + " ('guide', 0.48550781578170082),\n", + " ('companion', 0.48550781578170082),\n", + " ('julia', 0.48550781578170082),\n", + " ('san', 0.48550781578170082),\n", + " ('desperation', 0.48550781578170082),\n", + " ('strongly', 0.48460242866688824),\n", + " ('necessary', 0.48302334245403883),\n", + " ('humanity', 0.48265474679929443),\n", + " ('drama', 0.48221998493060503),\n", + " ('warming', 0.48183808689273838),\n", + " ('intrigue', 0.48183808689273838),\n", + " ('nonetheless', 0.48183808689273838),\n", + " ('cuba', 0.48183808689273838),\n", + " ('planned', 0.47957308026188628),\n", + " ('pictures', 0.47929937011921681),\n", + " ('broadcast', 0.47849024312305422),\n", + " ('nine', 0.47803580094299974),\n", + " ('settings', 0.47743860773325364),\n", + " ('history', 0.47732966933780852),\n", + " ('ordinary', 0.47725880012690741),\n", + " ('trade', 0.47692407209030935),\n", + " ('primary', 0.47608267532211779),\n", + " ('official', 0.47608267532211779),\n", + " ('episode', 0.47529620261150429),\n", + " ('role', 0.47520268270188676),\n", + " ('spirit', 0.47477690799839323),\n", + " ('grey', 0.47409361449726067),\n", + " ('ways', 0.47323464982718205),\n", + " ('cup', 0.47260441094579297),\n", + 
" ('piano', 0.47260441094579297),\n", + " ('familiar', 0.47241617565111949),\n", + " ('sinister', 0.47198579044972683),\n", + " ('reveal', 0.47171449364936496),\n", + " ('max', 0.47150852042515579),\n", + " ('dated', 0.47121648567094482),\n", + " ('discovery', 0.47000362924573563),\n", + " ('vicious', 0.47000362924573563),\n", + " ('losing', 0.47000362924573563),\n", + " ('genuinely', 0.46871413841586385),\n", + " ('hatred', 0.46734051182625186),\n", + " ('mistaken', 0.46702300110759781),\n", + " ('dream', 0.46608972992459924),\n", + " ('challenge', 0.46608972992459924),\n", + " ('crisis', 0.46575733836428446),\n", + " ('photographed', 0.46488852857896512),\n", + " ('machines', 0.46430560813109778),\n", + " ('critics', 0.46430560813109778),\n", + " ('bird', 0.46430560813109778),\n", + " ('born', 0.46411383518967209),\n", + " ('detective', 0.4636633473511525),\n", + " ('higher', 0.46328467899699055),\n", + " ('remains', 0.46262352194811296),\n", + " ('inevitable', 0.46262352194811296),\n", + " ('soviet', 0.4618180446592961),\n", + " ('ryan', 0.46134556650262099),\n", + " ('african', 0.46112595521371813),\n", + " ('smaller', 0.46081520319132935),\n", + " ('techniques', 0.46052488529119184),\n", + " ('information', 0.46034171833399862),\n", + " ('deserved', 0.45999798712841444),\n", + " ('cynical', 0.45953232937844013),\n", + " ('lynch', 0.45953232937844013),\n", + " ('francisco', 0.45953232937844013),\n", + " ('tour', 0.45953232937844013),\n", + " ('spielberg', 0.45953232937844013),\n", + " ('struggle', 0.45911782160048453),\n", + " ('language', 0.45902121257712653),\n", + " ('visual', 0.45823514408822852),\n", + " ('warner', 0.45724137763188427),\n", + " ('social', 0.45720078250735313),\n", + " ('reality', 0.45719346885019546),\n", + " ('hidden', 0.45675840249571492),\n", + " ('breaking', 0.45601738727099561),\n", + " ('sometimes', 0.45563021171182794),\n", + " ('modern', 0.45500247579345005),\n", + " ('surfing', 0.45425527227759638),\n", + " ('popular', 
0.45410691533051023),\n", + " ('surprised', 0.4534409399850382),\n", + " ('follows', 0.45245361754408348),\n", + " ('keeps', 0.45234869400701483),\n", + " ('john', 0.4520909494482197),\n", + " ('defeat', 0.45198512374305722),\n", + " ('mixed', 0.45198512374305722),\n", + " ('justice', 0.45142724367280018),\n", + " ('treasure', 0.45083371313801535),\n", + " ('presents', 0.44973793178615257),\n", + " ('years', 0.44919197032104968),\n", + " ('chief', 0.44895022004790319),\n", + " ('shadows', 0.44802472252696035),\n", + " ('closely', 0.44701411102103689),\n", + " ('segments', 0.44701411102103689),\n", + " ('lose', 0.44658335503763702),\n", + " ('caine', 0.44628710262841953),\n", + " ('caught', 0.44610275383999071),\n", + " ('hamlet', 0.44558510189758965),\n", + " ('chinese', 0.44507424620321018),\n", + " ('welcome', 0.44438052435783792),\n", + " ('birth', 0.44368632092836219),\n", + " ('represents', 0.44320543609101143),\n", + " ('puts', 0.44279106572085081),\n", + " ('fame', 0.44183275227903923),\n", + " ('closer', 0.44183275227903923),\n", + " ('visuals', 0.44183275227903923),\n", + " ('web', 0.44183275227903923),\n", + " ('criminal', 0.4412745608048752),\n", + " ('minor', 0.4409224199448939),\n", + " ('jon', 0.44086703515908027),\n", + " ('liked', 0.44074991514020723),\n", + " ('restaurant', 0.44031183943833246),\n", + " ('flaws', 0.43983275161237217),\n", + " ('de', 0.43983275161237217),\n", + " ('searching', 0.4393666597838457),\n", + " ('rap', 0.43891304217570443),\n", + " ('light', 0.43884433018199892),\n", + " ('elizabeth', 0.43872232986464677),\n", + " ('marry', 0.43861731542506488),\n", + " ('oz', 0.43825493093115531),\n", + " ('controversial', 0.43825493093115531),\n", + " ('learned', 0.43825493093115531),\n", + " ('slowly', 0.43785660389939979),\n", + " ('bridge', 0.43721380642274466),\n", + " ('thrilling', 0.43721380642274466),\n", + " ('wayne', 0.43721380642274466),\n", + " ('comedic', 0.43721380642274466),\n", + " ('married', 0.43658501682196887),\n", + 
" ('nazi', 0.4361020775700542),\n", + " ('murder', 0.4353180712578455),\n", + " ('physical', 0.4353180712578455),\n", + " ('johnny', 0.43483971678806865),\n", + " ('michelle', 0.43445264498141672),\n", + " ('wallace', 0.43403848055222038),\n", + " ('silent', 0.43395706390247063),\n", + " ('comedies', 0.43395706390247063),\n", + " ('played', 0.43387244114515305),\n", + " ('international', 0.43363598507486073),\n", + " ('vision', 0.43286408229627887),\n", + " ('intelligent', 0.43196704885367099),\n", + " ('shop', 0.43078291609245434),\n", + " ('also', 0.43036720209769169),\n", + " ('levels', 0.4302451371066513),\n", + " ('miss', 0.43006426712153217),\n", + " ('ocean', 0.4295626596872249),\n", + " ...]" + ] + }, + "execution_count": 113, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"POSITIVE\" label\n", + "pos_neg_ratios.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 114, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('boll', -4.0778152602708904),\n", + " ('uwe', -3.9218753018711578),\n", + " ('seagal', -3.3202501058581921),\n", + " ('unwatchable', -3.0269848170580955),\n", + " ('stinker', -2.9876839403711624),\n", + " ('mst', -2.7753833211707968),\n", + " ('incoherent', -2.7641396677532537),\n", + " ('unfunny', -2.5545257844967644),\n", + " ('waste', -2.4907515123361046),\n", + " ('blah', -2.4475792789485005),\n", + " ('horrid', -2.3715779644809971),\n", + " ('pointless', -2.3451073877136341),\n", + " ('atrocious', -2.3187369339642556),\n", + " ('redeeming', -2.2667790015910296),\n", + " ('prom', -2.2601040980178784),\n", + " ('drivel', -2.2476029585766928),\n", + " ('lousy', -2.2118080125207054),\n", + " ('worst', -2.1930856334332267),\n", + " ('laughable', -2.172468615469592),\n", + " ('awful', -2.1385076866397488),\n", + " ('poorly', -2.1326133844207011),\n", + " ('wasting', -2.1178155545614512),\n", + " 
('remotely', -2.111046881095167),\n", + " ('existent', -2.0024805005437076),\n", + " ('boredom', -1.9241486572738005),\n", + " ('miserably', -1.9216610938019989),\n", + " ('sucks', -1.9166645809588516),\n", + " ('uninspired', -1.9131499212248517),\n", + " ('lame', -1.9117232884159072),\n", + " ('insult', -1.9085323769376259)]" + ] + }, + "execution_count": 114, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words with the strongest \"NEGATIVE\" affinity (lowest positive-to-negative log ratio)\n", + "list(reversed(pos_neg_ratios.most_common()))[0:30]" + ] + }, + { + "cell_type": "code", + "execution_count": 115, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "
\n", + " \n", + " Loading BokehJS ...\n", + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "application/javascript": [ + "\n", + "(function(global) {\n", + " function now() {\n", + " return new Date();\n", + " }\n", + "\n", + " var force = \"1\";\n", + "\n", + " if (typeof (window._bokeh_onload_callbacks) === \"undefined\" || force !== \"\") {\n", + " window._bokeh_onload_callbacks = [];\n", + " window._bokeh_is_loading = undefined;\n", + " }\n", + "\n", + "\n", + " \n", + " if (typeof (window._bokeh_timeout) === \"undefined\" || force !== \"\") {\n", + " window._bokeh_timeout = Date.now() + 5000;\n", + " window._bokeh_failed_load = false;\n", + " }\n", + "\n", + " var NB_LOAD_WARNING = {'data': {'text/html':\n", + " \"
\\n\"+\n", + " \"

\\n\"+\n", + " \"BokehJS does not appear to have successfully loaded. If loading BokehJS from CDN, this \\n\"+\n", + " \"may be due to a slow or bad network connection. Possible fixes:\\n\"+\n", + " \"

\\n\"+\n", + " \"
    \\n\"+\n", + " \"
  • re-rerun `output_notebook()` to attempt to load from CDN again, or
  • \\n\"+\n", + " \"
  • use INLINE resources instead, as so:
  • \\n\"+\n", + " \"
\\n\"+\n", + " \"\\n\"+\n", + " \"from bokeh.resources import INLINE\\n\"+\n", + " \"output_notebook(resources=INLINE)\\n\"+\n", + " \"\\n\"+\n", + " \"
\"}};\n", + "\n", + " function display_loaded() {\n", + " if (window.Bokeh !== undefined) {\n", + " Bokeh.$(\"#fcba94a8-578e-4e33-ab7d-b09fc2376af8\").text(\"BokehJS successfully loaded.\");\n", + " } else if (Date.now() < window._bokeh_timeout) {\n", + " setTimeout(display_loaded, 100)\n", + " }\n", + " }\n", + "\n", + " function run_callbacks() {\n", + " window._bokeh_onload_callbacks.forEach(function(callback) { callback() });\n", + " delete window._bokeh_onload_callbacks\n", + " console.info(\"Bokeh: all callbacks have finished\");\n", + " }\n", + "\n", + " function load_libs(js_urls, callback) {\n", + " window._bokeh_onload_callbacks.push(callback);\n", + " if (window._bokeh_is_loading > 0) {\n", + " console.log(\"Bokeh: BokehJS is being loaded, scheduling callback at\", now());\n", + " return null;\n", + " }\n", + " if (js_urls == null || js_urls.length === 0) {\n", + " run_callbacks();\n", + " return null;\n", + " }\n", + " console.log(\"Bokeh: BokehJS not loaded, scheduling load and callback at\", now());\n", + " window._bokeh_is_loading = js_urls.length;\n", + " for (var i = 0; i < js_urls.length; i++) {\n", + " var url = js_urls[i];\n", + " var s = document.createElement('script');\n", + " s.src = url;\n", + " s.async = false;\n", + " s.onreadystatechange = s.onload = function() {\n", + " window._bokeh_is_loading--;\n", + " if (window._bokeh_is_loading === 0) {\n", + " console.log(\"Bokeh: all BokehJS libraries loaded\");\n", + " run_callbacks()\n", + " }\n", + " };\n", + " s.onerror = function() {\n", + " console.warn(\"failed to load library \" + url);\n", + " };\n", + " console.log(\"Bokeh: injecting script tag for BokehJS library: \", url);\n", + " document.getElementsByTagName(\"head\")[0].appendChild(s);\n", + " }\n", + " };var element = document.getElementById(\"fcba94a8-578e-4e33-ab7d-b09fc2376af8\");\n", + " if (element == null) {\n", + " console.log(\"Bokeh: ERROR: autoload.js configured with elementid 'fcba94a8-578e-4e33-ab7d-b09fc2376af8' but 
no matching script tag was found. \")\n", + " return false;\n", + " }\n", + "\n", + " var js_urls = ['https://cdn.pydata.org/bokeh/release/bokeh-0.12.2.min.js', 'https://cdn.pydata.org/bokeh/release/bokeh-widgets-0.12.2.min.js', 'https://cdn.pydata.org/bokeh/release/bokeh-compiler-0.12.2.min.js'];\n", + "\n", + " var inline_js = [\n", + " function(Bokeh) {\n", + " Bokeh.set_log_level(\"info\");\n", + " },\n", + " \n", + " function(Bokeh) {\n", + " \n", + " Bokeh.$(\"#fcba94a8-578e-4e33-ab7d-b09fc2376af8\").text(\"BokehJS is loading...\");\n", + " },\n", + " function(Bokeh) {\n", + " console.log(\"Bokeh: injecting CSS: https://cdn.pydata.org/bokeh/release/bokeh-0.12.2.min.css\");\n", + " Bokeh.embed.inject_css(\"https://cdn.pydata.org/bokeh/release/bokeh-0.12.2.min.css\");\n", + " console.log(\"Bokeh: injecting CSS: https://cdn.pydata.org/bokeh/release/bokeh-widgets-0.12.2.min.css\");\n", + " Bokeh.embed.inject_css(\"https://cdn.pydata.org/bokeh/release/bokeh-widgets-0.12.2.min.css\");\n", + " }\n", + " ];\n", + "\n", + " function run_inline_js() {\n", + " \n", + " if ((window.Bokeh !== undefined) || (force === \"1\")) {\n", + " for (var i = 0; i < inline_js.length; i++) {\n", + " inline_js[i](window.Bokeh);\n", + " }if (force === \"1\") {\n", + " display_loaded();\n", + " }} else if (Date.now() < window._bokeh_timeout) {\n", + " setTimeout(run_inline_js, 100);\n", + " } else if (!window._bokeh_failed_load) {\n", + " console.log(\"Bokeh: BokehJS failed to load within specified timeout.\");\n", + " window._bokeh_failed_load = true;\n", + " } else if (!force) {\n", + " var cell = $(\"#fcba94a8-578e-4e33-ab7d-b09fc2376af8\").parents('.cell').data().cell;\n", + " cell.output_area.append_execute_result(NB_LOAD_WARNING)\n", + " }\n", + "\n", + " }\n", + "\n", + " if (window._bokeh_is_loading === 0) {\n", + " console.log(\"Bokeh: BokehJS loaded, going straight to plotting\");\n", + " run_inline_js();\n", + " } else {\n", + " load_libs(js_urls, function() {\n", + " 
console.log(\"Bokeh: BokehJS plotting callback run at\", now());\n", + " run_inline_js();\n", + " });\n", + " }\n", + "}(this));" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "from bokeh.models import ColumnDataSource, LabelSet\n", + "from bokeh.plotting import figure, show, output_file\n", + "from bokeh.io import output_notebook\n", + "output_notebook()" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "ename": "NameError", + "evalue": "name 'np' is not defined", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mNameError\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[0;32m----> 1\u001b[0;31m \u001b[0mhist\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0medges\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mhistogram\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlist\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmap\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;32mlambda\u001b[0m \u001b[0mx\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0mx\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mpos_neg_ratios\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mmost_common\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdensity\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mTrue\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mbins\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;36m100\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mnormed\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mTrue\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 2\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 3\u001b[0m p = figure(tools=\"pan,wheel_zoom,reset,save\",\n\u001b[1;32m 
4\u001b[0m \u001b[0mtoolbar_location\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m\"above\"\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 5\u001b[0m title=\"Word Positive/Negative Affinity Distribution\")\n", + "\u001b[0;31mNameError\u001b[0m: name 'np' is not defined" + ] + } + ], + "source": [ + "import numpy as np\n", + "hist, edges = np.histogram(list(map(lambda x: x[1], pos_neg_ratios.most_common())), density=True, bins=100)\n", + "\n", + "p = figure(tools=\"pan,wheel_zoom,reset,save\",\n", + " toolbar_location=\"above\",\n", + " title=\"Word Positive/Negative Affinity Distribution\")\n", + "p.quad(top=hist, bottom=0, left=edges[:-1], right=edges[1:], line_color=\"#555555\")\n", + "show(p)" + ] + }, + { + "cell_type": "code", + "execution_count": 117, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "frequency_frequency = Counter()\n", + "\n", + "for word, cnt in total_counts.most_common():\n", + " frequency_frequency[cnt] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 118, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "
\n", + "
\n", + "
\n", + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "hist, edges = np.histogram(list(map(lambda x:x[1],frequency_frequency.most_common())), density=True, bins=100, normed=True)\n", + "\n", + "p = figure(tools=\"pan,wheel_zoom,reset,save\",\n", + " toolbar_location=\"above\",\n", + " title=\"The frequency distribution of the words in our corpus\")\n", + "p.quad(top=hist, bottom=0, left=edges[:-1], right=edges[1:], line_color=\"#555555\")\n", + "show(p)" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 7).ipynb b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 7).ipynb new file mode 100644 index 0000000..7443e58 --- /dev/null +++ b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Lesson 7).ipynb @@ -0,0 +1,11300 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sentiment Classification & How To \"Frame Problems\" for a Neural Network\n", + "\n", + "by Andrew Trask\n", + "\n", + "- **Twitter**: @iamtrask\n", + "- **Blog**: http://iamtrask.github.io" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### What You Should Already Know\n", + "\n", + "- neural networks, forward and back-propagation\n", + "- stochastic gradient descent\n", + "- mean squared error\n", + "- and train/test splits\n", + "\n", + "### Where to Get Help if You Need it\n", + "- 
Re-watch previous Udacity Lectures\n", + "- Leverage the recommended Course Reading Material - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) (40% Off: **traskud17**)\n", + "- Shoot me a tweet @iamtrask\n", + "\n", + "\n", + "### Tutorial Outline:\n", + "\n", + "- Intro: The Importance of \"Framing a Problem\"\n", + "\n", + "\n", + "- Curate a Dataset\n", + "- Developing a \"Predictive Theory\"\n", + "- **PROJECT 1**: Quick Theory Validation\n", + "\n", + "\n", + "- Transforming Text to Numbers\n", + "- **PROJECT 2**: Creating the Input/Output Data\n", + "\n", + "\n", + "- Putting it all together in a Neural Network\n", + "- **PROJECT 3**: Building our Neural Network\n", + "\n", + "\n", + "- Understanding Neural Noise\n", + "- **PROJECT 4**: Making Learning Faster by Reducing Noise\n", + "\n", + "\n", + "- Analyzing Inefficiencies in our Network\n", + "- **PROJECT 5**: Making our Network Train and Run Faster\n", + "\n", + "\n", + "- Further Noise Reduction\n", + "- **PROJECT 6**: Reducing Noise by Strategically Reducing the Vocabulary\n", + "\n", + "\n", + "- Analysis: What's going on in the weights?" 
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbpresent": { + "id": "56bb3cba-260c-4ebe-9ed6-b995b4c72aa3" + } + }, + "source": [ + "# Lesson: Curate a Dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "eba2b193-0419-431e-8db9-60f34dd3fe83" + } + }, + "outputs": [], + "source": [ + "def pretty_print_review_and_label(i):\n", + " print(labels[i] + \"\\t:\\t\" + reviews[i][:80] + \"...\")\n", + "\n", + "g = open('reviews.txt','r') # What we know!\n", + "reviews = list(map(lambda x:x[:-1],g.readlines()))\n", + "g.close()\n", + "\n", + "g = open('labels.txt','r') # What we WANT to know!\n", + "labels = list(map(lambda x:x[:-1].upper(),g.readlines()))\n", + "g.close()" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "25000" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "len(reviews)" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "bb95574b-21a0-4213-ae50-34363cf4f87f" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy . it ran at the same time as some other programs about school life such as teachers . my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers . the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students . when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled . . . . . . . . . at . . . . . . . . . . high . a classic line inspector i m here to sack one of your teachers . student welcome to bromwell high . 
i expect that many adults of my age think that bromwell high is far fetched . what a pity that it isn t '" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "reviews[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e0408810-c424-4ed4-afb9-1735e9ddbd0a" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Lesson: Develop a Predictive Theory" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e67a709f-234f-4493-bae6-4fb192141ee0" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "labels.txt \t : \t reviews.txt\n", + "\n", + "NEGATIVE\t:\tthis movie is terrible but it has some good effects . ...\n", + "POSITIVE\t:\tadrian pasdar is excellent is this film . he makes a fascinating woman . ...\n", + "NEGATIVE\t:\tcomment this movie is impossible . is terrible very improbable bad interpretat...\n", + "POSITIVE\t:\texcellent episode movie ala pulp fiction . days suicides . it doesnt get more...\n", + "NEGATIVE\t:\tif you haven t seen this it s terrible . it is pure trash . 
i saw this about ...\n", + "POSITIVE\t:\tthis schiffer guy is a real genius the movie is of excellent quality and both e...\n" + ] + } + ], + "source": [ + "print(\"labels.txt \\t : \\t reviews.txt\\n\")\n", + "pretty_print_review_and_label(2137)\n", + "pretty_print_review_and_label(12816)\n", + "pretty_print_review_and_label(6267)\n", + "pretty_print_review_and_label(21934)\n", + "pretty_print_review_and_label(5297)\n", + "pretty_print_review_and_label(4998)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 1: Quick Theory Validation" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from collections import Counter\n", + "import numpy as np" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "positive_counts = Counter()\n", + "negative_counts = Counter()\n", + "total_counts = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for i in range(len(reviews)):\n", + " if(labels[i] == 'POSITIVE'):\n", + " for word in reviews[i].split(\" \"):\n", + " positive_counts[word] += 1\n", + " total_counts[word] += 1\n", + " else:\n", + " for word in reviews[i].split(\" \"):\n", + " negative_counts[word] += 1\n", + " total_counts[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('', 550468),\n", + " ('the', 173324),\n", + " ('.', 159654),\n", + " ('and', 89722),\n", + " ('a', 83688),\n", + " ('of', 76855),\n", + " ('to', 66746),\n", + " ('is', 57245),\n", + " ('in', 50215),\n", + " ('br', 49235),\n", + " ('it', 48025),\n", + " ('i', 40743),\n", + " ('that', 35630),\n", + " ('this', 35080),\n", + " ('s', 33815),\n", + " ('as', 26308),\n", + " ('with', 23247),\n", + " 
('for', 22416),\n", + " ('was', 21917),\n", + " ('film', 20937),\n", + " ('but', 20822),\n", + " ('movie', 19074),\n", + " ('his', 17227),\n", + " ('on', 17008),\n", + " ('you', 16681),\n", + " ('he', 16282),\n", + " ('are', 14807),\n", + " ('not', 14272),\n", + " ('t', 13720),\n", + " ('one', 13655),\n", + " ('have', 12587),\n", + " ('be', 12416),\n", + " ('by', 11997),\n", + " ('all', 11942),\n", + " ('who', 11464),\n", + " ('an', 11294),\n", + " ('at', 11234),\n", + " ('from', 10767),\n", + " ('her', 10474),\n", + " ('they', 9895),\n", + " ('has', 9186),\n", + " ('so', 9154),\n", + " ('like', 9038),\n", + " ('about', 8313),\n", + " ('very', 8305),\n", + " ('out', 8134),\n", + " ('there', 8057),\n", + " ('she', 7779),\n", + " ('what', 7737),\n", + " ('or', 7732),\n", + " ('good', 7720),\n", + " ('more', 7521),\n", + " ('when', 7456),\n", + " ('some', 7441),\n", + " ('if', 7285),\n", + " ('just', 7152),\n", + " ('can', 7001),\n", + " ('story', 6780),\n", + " ('time', 6515),\n", + " ('my', 6488),\n", + " ('great', 6419),\n", + " ('well', 6405),\n", + " ('up', 6321),\n", + " ('which', 6267),\n", + " ('their', 6107),\n", + " ('see', 6026),\n", + " ('also', 5550),\n", + " ('we', 5531),\n", + " ('really', 5476),\n", + " ('would', 5400),\n", + " ('will', 5218),\n", + " ('me', 5167),\n", + " ('had', 5148),\n", + " ('only', 5137),\n", + " ('him', 5018),\n", + " ('even', 4964),\n", + " ('most', 4864),\n", + " ('other', 4858),\n", + " ('were', 4782),\n", + " ('first', 4755),\n", + " ('than', 4736),\n", + " ('much', 4685),\n", + " ('its', 4622),\n", + " ('no', 4574),\n", + " ('into', 4544),\n", + " ('people', 4479),\n", + " ('best', 4319),\n", + " ('love', 4301),\n", + " ('get', 4272),\n", + " ('how', 4213),\n", + " ('life', 4199),\n", + " ('been', 4189),\n", + " ('because', 4079),\n", + " ('way', 4036),\n", + " ('do', 3941),\n", + " ('made', 3823),\n", + " ('films', 3813),\n", + " ('them', 3805),\n", + " ('after', 3800),\n", + " ('many', 3766),\n", + " ('two', 3733),\n", + 
" ('too', 3659),\n", + " ('think', 3655),\n", + " ('movies', 3586),\n", + " ('characters', 3560),\n", + " ('character', 3514),\n", + " ('don', 3468),\n", + " ('man', 3460),\n", + " ('show', 3432),\n", + " ('watch', 3424),\n", + " ('seen', 3414),\n", + " ('then', 3358),\n", + " ('little', 3341),\n", + " ('still', 3340),\n", + " ('make', 3303),\n", + " ('could', 3237),\n", + " ('never', 3226),\n", + " ('being', 3217),\n", + " ('where', 3173),\n", + " ('does', 3069),\n", + " ('over', 3017),\n", + " ('any', 3002),\n", + " ('while', 2899),\n", + " ('know', 2833),\n", + " ('did', 2790),\n", + " ('years', 2758),\n", + " ('here', 2740),\n", + " ('ever', 2734),\n", + " ('end', 2696),\n", + " ('these', 2694),\n", + " ('such', 2590),\n", + " ('real', 2568),\n", + " ('scene', 2567),\n", + " ('back', 2547),\n", + " ('those', 2485),\n", + " ('though', 2475),\n", + " ('off', 2463),\n", + " ('new', 2458),\n", + " ('your', 2453),\n", + " ('go', 2440),\n", + " ('acting', 2437),\n", + " ('plot', 2432),\n", + " ('world', 2429),\n", + " ('scenes', 2427),\n", + " ('say', 2414),\n", + " ('through', 2409),\n", + " ('makes', 2390),\n", + " ('better', 2381),\n", + " ('now', 2368),\n", + " ('work', 2346),\n", + " ('young', 2343),\n", + " ('old', 2311),\n", + " ('ve', 2307),\n", + " ('find', 2272),\n", + " ('both', 2248),\n", + " ('before', 2177),\n", + " ('us', 2162),\n", + " ('again', 2158),\n", + " ('series', 2153),\n", + " ('quite', 2143),\n", + " ('something', 2135),\n", + " ('cast', 2133),\n", + " ('should', 2121),\n", + " ('part', 2098),\n", + " ('always', 2088),\n", + " ('lot', 2087),\n", + " ('another', 2075),\n", + " ('actors', 2047),\n", + " ('director', 2040),\n", + " ('family', 2032),\n", + " ('between', 2016),\n", + " ('own', 2016),\n", + " ('m', 1998),\n", + " ('may', 1997),\n", + " ('same', 1972),\n", + " ('role', 1967),\n", + " ('watching', 1966),\n", + " ('every', 1954),\n", + " ('funny', 1953),\n", + " ('doesn', 1935),\n", + " ('performance', 1928),\n", + " ('few', 
1918),\n", + " ('bad', 1907),\n", + " ('look', 1900),\n", + " ('re', 1884),\n", + " ('why', 1855),\n", + " ('things', 1849),\n", + " ('times', 1832),\n", + " ('big', 1815),\n", + " ('however', 1795),\n", + " ('actually', 1790),\n", + " ('action', 1789),\n", + " ('going', 1783),\n", + " ('bit', 1757),\n", + " ('comedy', 1742),\n", + " ('down', 1740),\n", + " ('music', 1738),\n", + " ('must', 1728),\n", + " ('take', 1709),\n", + " ('saw', 1692),\n", + " ('long', 1690),\n", + " ('right', 1688),\n", + " ('fun', 1686),\n", + " ('fact', 1684),\n", + " ('excellent', 1683),\n", + " ('around', 1674),\n", + " ('didn', 1672),\n", + " ('without', 1671),\n", + " ('thing', 1662),\n", + " ('thought', 1639),\n", + " ('got', 1635),\n", + " ('each', 1630),\n", + " ('day', 1614),\n", + " ('feel', 1597),\n", + " ('seems', 1596),\n", + " ('come', 1594),\n", + " ('done', 1586),\n", + " ('beautiful', 1580),\n", + " ('especially', 1572),\n", + " ('played', 1571),\n", + " ('almost', 1566),\n", + " ('want', 1562),\n", + " ('yet', 1556),\n", + " ('give', 1553),\n", + " ('pretty', 1549),\n", + " ('last', 1543),\n", + " ('since', 1519),\n", + " ('different', 1504),\n", + " ('although', 1501),\n", + " ('gets', 1490),\n", + " ('true', 1487),\n", + " ('interesting', 1481),\n", + " ('job', 1470),\n", + " ('enough', 1455),\n", + " ('our', 1454),\n", + " ('shows', 1447),\n", + " ('horror', 1441),\n", + " ('woman', 1439),\n", + " ('tv', 1400),\n", + " ('probably', 1398),\n", + " ('father', 1395),\n", + " ('original', 1393),\n", + " ('girl', 1390),\n", + " ('point', 1379),\n", + " ('plays', 1378),\n", + " ('wonderful', 1372),\n", + " ('far', 1358),\n", + " ('course', 1358),\n", + " ('john', 1350),\n", + " ('rather', 1340),\n", + " ('isn', 1328),\n", + " ('ll', 1326),\n", + " ('later', 1324),\n", + " ('dvd', 1324),\n", + " ('war', 1310),\n", + " ('whole', 1310),\n", + " ('d', 1307),\n", + " ('away', 1306),\n", + " ('found', 1306),\n", + " ('screen', 1305),\n", + " ('nothing', 1300),\n", + " ('year', 
1297),\n", + " ('once', 1296),\n", + " ('hard', 1294),\n", + " ('together', 1280),\n", + " ('am', 1277),\n", + " ('set', 1277),\n", + " ('having', 1266),\n", + " ('making', 1265),\n", + " ('place', 1263),\n", + " ('comes', 1260),\n", + " ('might', 1260),\n", + " ('sure', 1253),\n", + " ('american', 1248),\n", + " ('play', 1245),\n", + " ('kind', 1244),\n", + " ('takes', 1242),\n", + " ('perfect', 1242),\n", + " ('performances', 1237),\n", + " ('himself', 1230),\n", + " ('worth', 1221),\n", + " ('everyone', 1221),\n", + " ('anyone', 1214),\n", + " ('actor', 1203),\n", + " ('three', 1201),\n", + " ('wife', 1196),\n", + " ('classic', 1192),\n", + " ('goes', 1186),\n", + " ('ending', 1178),\n", + " ('version', 1168),\n", + " ('star', 1149),\n", + " ('enjoy', 1146),\n", + " ('book', 1142),\n", + " ('nice', 1132),\n", + " ('everything', 1128),\n", + " ('during', 1124),\n", + " ('put', 1118),\n", + " ('seeing', 1111),\n", + " ('least', 1102),\n", + " ('house', 1100),\n", + " ('high', 1095),\n", + " ('watched', 1094),\n", + " ('men', 1087),\n", + " ('loved', 1087),\n", + " ('night', 1082),\n", + " ('anything', 1075),\n", + " ('guy', 1071),\n", + " ('believe', 1071),\n", + " ('top', 1063),\n", + " ('amazing', 1058),\n", + " ('hollywood', 1056),\n", + " ('looking', 1053),\n", + " ('main', 1044),\n", + " ('definitely', 1043),\n", + " ('gives', 1031),\n", + " ('home', 1029),\n", + " ('seem', 1028),\n", + " ('episode', 1023),\n", + " ('sense', 1020),\n", + " ('audience', 1020),\n", + " ('truly', 1017),\n", + " ('special', 1011),\n", + " ('fan', 1009),\n", + " ('second', 1009),\n", + " ('short', 1009),\n", + " ('mind', 1005),\n", + " ('human', 1001),\n", + " ('recommend', 999),\n", + " ('full', 996),\n", + " ('black', 995),\n", + " ('help', 991),\n", + " ('along', 989),\n", + " ('trying', 987),\n", + " ('small', 986),\n", + " ('death', 985),\n", + " ('friends', 981),\n", + " ('remember', 974),\n", + " ('often', 970),\n", + " ('said', 966),\n", + " ('favorite', 962),\n", + " 
('heart', 959),\n", + " ('early', 957),\n", + " ('left', 956),\n", + " ('until', 955),\n", + " ('let', 954),\n", + " ('script', 954),\n", + " ('maybe', 937),\n", + " ('today', 936),\n", + " ('live', 934),\n", + " ('less', 934),\n", + " ('moments', 933),\n", + " ('others', 929),\n", + " ('brilliant', 926),\n", + " ('shot', 925),\n", + " ('liked', 923),\n", + " ('become', 916),\n", + " ('won', 915),\n", + " ('used', 910),\n", + " ('style', 907),\n", + " ('mother', 895),\n", + " ('lives', 894),\n", + " ('came', 893),\n", + " ('stars', 890),\n", + " ('cinema', 889),\n", + " ('looks', 885),\n", + " ('perhaps', 884),\n", + " ('read', 882),\n", + " ('enjoyed', 879),\n", + " ('boy', 875),\n", + " ('drama', 873),\n", + " ('highly', 871),\n", + " ('given', 870),\n", + " ('playing', 867),\n", + " ('use', 864),\n", + " ('next', 859),\n", + " ('women', 858),\n", + " ('fine', 857),\n", + " ('effects', 856),\n", + " ('kids', 854),\n", + " ('entertaining', 853),\n", + " ('need', 852),\n", + " ('line', 850),\n", + " ('works', 848),\n", + " ('someone', 847),\n", + " ('mr', 836),\n", + " ('simply', 835),\n", + " ('children', 833),\n", + " ('picture', 833),\n", + " ('face', 831),\n", + " ('friend', 831),\n", + " ('keep', 831),\n", + " ('dark', 830),\n", + " ('overall', 828),\n", + " ('certainly', 828),\n", + " ('minutes', 827),\n", + " ('wasn', 824),\n", + " ('history', 822),\n", + " ('finally', 820),\n", + " ('couple', 816),\n", + " ('against', 815),\n", + " ('son', 809),\n", + " ('understand', 808),\n", + " ('lost', 807),\n", + " ('michael', 805),\n", + " ('else', 801),\n", + " ('throughout', 798),\n", + " ('fans', 797),\n", + " ('city', 792),\n", + " ('reason', 789),\n", + " ('written', 787),\n", + " ('production', 787),\n", + " ('several', 784),\n", + " ('school', 783),\n", + " ('rest', 781),\n", + " ('based', 781),\n", + " ('try', 780),\n", + " ('dead', 776),\n", + " ('hope', 775),\n", + " ('strong', 768),\n", + " ('white', 765),\n", + " ('tell', 759),\n", + " ('itself', 
758),\n", + " ('half', 753),\n", + " ('person', 749),\n", + " ('sometimes', 746),\n", + " ('past', 744),\n", + " ('start', 744),\n", + " ('genre', 743),\n", + " ('final', 739),\n", + " ('beginning', 739),\n", + " ('town', 738),\n", + " ('art', 734),\n", + " ('game', 732),\n", + " ('humor', 732),\n", + " ('yes', 731),\n", + " ('idea', 731),\n", + " ('late', 730),\n", + " ('becomes', 729),\n", + " ('despite', 729),\n", + " ('able', 726),\n", + " ('case', 726),\n", + " ('money', 723),\n", + " ('child', 721),\n", + " ('completely', 721),\n", + " ('side', 719),\n", + " ('camera', 716),\n", + " ('getting', 714),\n", + " ('instead', 712),\n", + " ('soon', 702),\n", + " ('under', 700),\n", + " ('viewer', 699),\n", + " ('age', 697),\n", + " ('days', 696),\n", + " ('stories', 696),\n", + " ('felt', 694),\n", + " ('simple', 694),\n", + " ('roles', 693),\n", + " ('video', 688),\n", + " ('name', 683),\n", + " ('either', 683),\n", + " ('doing', 677),\n", + " ('turns', 674),\n", + " ('wants', 671),\n", + " ('close', 671),\n", + " ('title', 669),\n", + " ('wrong', 668),\n", + " ('went', 666),\n", + " ('james', 665),\n", + " ('evil', 659),\n", + " ('budget', 657),\n", + " ('episodes', 657),\n", + " ('relationship', 655),\n", + " ('piece', 653),\n", + " ('fantastic', 653),\n", + " ('david', 651),\n", + " ('turn', 648),\n", + " ('murder', 646),\n", + " ('parts', 645),\n", + " ('brother', 644),\n", + " ('head', 643),\n", + " ('absolutely', 643),\n", + " ('experience', 642),\n", + " ('eyes', 641),\n", + " ('sex', 638),\n", + " ('direction', 637),\n", + " ('called', 637),\n", + " ('directed', 636),\n", + " ('lines', 634),\n", + " ('behind', 633),\n", + " ('sort', 632),\n", + " ('actress', 631),\n", + " ('lead', 630),\n", + " ('oscar', 628),\n", + " ('example', 627),\n", + " ('including', 627),\n", + " ('known', 625),\n", + " ('musical', 625),\n", + " ('chance', 621),\n", + " ('score', 620),\n", + " ('feeling', 619),\n", + " ('already', 619),\n", + " ('hit', 619),\n", + " ('voice', 
615),\n", + " ('moment', 612),\n", + " ('living', 612),\n", + " ('low', 610),\n", + " ('supporting', 610),\n", + " ('ago', 609),\n", + " ('themselves', 608),\n", + " ('hilarious', 605),\n", + " ('reality', 605),\n", + " ('jack', 604),\n", + " ('told', 603),\n", + " ('hand', 601),\n", + " ('moving', 600),\n", + " ('dialogue', 600),\n", + " ('quality', 600),\n", + " ('song', 599),\n", + " ('happy', 599),\n", + " ('paul', 598),\n", + " ('matter', 598),\n", + " ('light', 594),\n", + " ('future', 593),\n", + " ('entire', 592),\n", + " ('finds', 591),\n", + " ('gave', 589),\n", + " ('laugh', 587),\n", + " ('released', 586),\n", + " ('expect', 584),\n", + " ('fight', 581),\n", + " ('particularly', 580),\n", + " ('cinematography', 579),\n", + " ('police', 579),\n", + " ('whose', 578),\n", + " ('type', 578),\n", + " ('sound', 578),\n", + " ('enjoyable', 573),\n", + " ('view', 573),\n", + " ('husband', 572),\n", + " ('romantic', 572),\n", + " ('number', 572),\n", + " ('daughter', 572),\n", + " ('documentary', 571),\n", + " ('self', 570),\n", + " ('modern', 569),\n", + " ('robert', 569),\n", + " ('took', 569),\n", + " ('superb', 569),\n", + " ('mean', 566),\n", + " ('shown', 563),\n", + " ('coming', 561),\n", + " ('important', 560),\n", + " ('king', 559),\n", + " ('leave', 559),\n", + " ('change', 558),\n", + " ('wanted', 555),\n", + " ('somewhat', 555),\n", + " ('tells', 554),\n", + " ('run', 552),\n", + " ('events', 552),\n", + " ('country', 552),\n", + " ('career', 552),\n", + " ('heard', 550),\n", + " ('season', 550),\n", + " ('girls', 549),\n", + " ('greatest', 549),\n", + " ('etc', 547),\n", + " ('care', 546),\n", + " ('starts', 545),\n", + " ('english', 542),\n", + " ('killer', 541),\n", + " ('animation', 540),\n", + " ('guys', 540),\n", + " ('totally', 540),\n", + " ('tale', 540),\n", + " ('usual', 539),\n", + " ('opinion', 535),\n", + " ('miss', 535),\n", + " ('violence', 531),\n", + " ('easy', 531),\n", + " ('songs', 530),\n", + " ('british', 528),\n", + " ('says', 
526),\n", + " ('realistic', 525),\n", + " ('writing', 524),\n", + " ('act', 522),\n", + " ('writer', 522),\n", + " ('comic', 521),\n", + " ('thriller', 519),\n", + " ('television', 517),\n", + " ('power', 516),\n", + " ('ones', 515),\n", + " ('kid', 514),\n", + " ('novel', 513),\n", + " ('york', 513),\n", + " ('problem', 512),\n", + " ('alone', 512),\n", + " ('attention', 509),\n", + " ('involved', 508),\n", + " ('kill', 507),\n", + " ('extremely', 507),\n", + " ('seemed', 506),\n", + " ('hero', 505),\n", + " ('french', 505),\n", + " ('rock', 504),\n", + " ('stuff', 501),\n", + " ('wish', 499),\n", + " ('begins', 498),\n", + " ('taken', 497),\n", + " ('sad', 497),\n", + " ('ways', 496),\n", + " ('richard', 495),\n", + " ('knows', 494),\n", + " ('atmosphere', 493),\n", + " ('surprised', 491),\n", + " ('similar', 491),\n", + " ('taking', 491),\n", + " ('car', 491),\n", + " ('george', 490),\n", + " ('perfectly', 490),\n", + " ('across', 489),\n", + " ('sequence', 489),\n", + " ('eye', 489),\n", + " ('team', 489),\n", + " ('serious', 488),\n", + " ('powerful', 488),\n", + " ('room', 488),\n", + " ('due', 488),\n", + " ('among', 488),\n", + " ('order', 487),\n", + " ('b', 487),\n", + " ('cannot', 487),\n", + " ('strange', 487),\n", + " ('beauty', 486),\n", + " ('famous', 485),\n", + " ('tries', 484),\n", + " ('myself', 484),\n", + " ('happened', 484),\n", + " ('herself', 484),\n", + " ('class', 483),\n", + " ('four', 482),\n", + " ('cool', 481),\n", + " ('release', 479),\n", + " ('anyway', 479),\n", + " ('theme', 479),\n", + " ('opening', 478),\n", + " ('entertainment', 477),\n", + " ('unique', 475),\n", + " ('ends', 475),\n", + " ('slow', 475),\n", + " ('exactly', 475),\n", + " ('red', 474),\n", + " ('o', 474),\n", + " ('level', 474),\n", + " ('easily', 474),\n", + " ('interest', 472),\n", + " ('happen', 471),\n", + " ('crime', 470),\n", + " ('viewing', 468),\n", + " ('memorable', 467),\n", + " ('sets', 467),\n", + " ('group', 466),\n", + " ('stop', 466),\n", + " 
('dance', 463),\n", + " ('message', 463),\n", + " ('sister', 463),\n", + " ('working', 463),\n", + " ('problems', 463),\n", + " ('knew', 462),\n", + " ('mystery', 461),\n", + " ('nature', 461),\n", + " ('bring', 460),\n", + " ('believable', 459),\n", + " ('thinking', 459),\n", + " ('brought', 459),\n", + " ('mostly', 458),\n", + " ('couldn', 457),\n", + " ('disney', 457),\n", + " ('society', 456),\n", + " ('within', 455),\n", + " ('lady', 455),\n", + " ('blood', 454),\n", + " ('upon', 453),\n", + " ('viewers', 453),\n", + " ('parents', 453),\n", + " ('meets', 452),\n", + " ('form', 452),\n", + " ('soundtrack', 452),\n", + " ('usually', 452),\n", + " ('tom', 452),\n", + " ('peter', 452),\n", + " ('local', 450),\n", + " ('certain', 448),\n", + " ('follow', 448),\n", + " ('whether', 447),\n", + " ('possible', 446),\n", + " ('emotional', 445),\n", + " ('killed', 444),\n", + " ('de', 444),\n", + " ('above', 444),\n", + " ('middle', 443),\n", + " ('god', 443),\n", + " ('happens', 442),\n", + " ('flick', 442),\n", + " ('needs', 442),\n", + " ('masterpiece', 441),\n", + " ('major', 440),\n", + " ('period', 440),\n", + " ('haven', 439),\n", + " ('named', 439),\n", + " ('th', 438),\n", + " ('particular', 438),\n", + " ('earth', 437),\n", + " ('feature', 437),\n", + " ('stand', 436),\n", + " ('words', 435),\n", + " ('typical', 435),\n", + " ('obviously', 433),\n", + " ('elements', 433),\n", + " ('romance', 431),\n", + " ('jane', 430),\n", + " ('yourself', 427),\n", + " ('showing', 427),\n", + " ('fantasy', 426),\n", + " ('brings', 426),\n", + " ('america', 423),\n", + " ('guess', 423),\n", + " ('huge', 422),\n", + " ('unfortunately', 422),\n", + " ('indeed', 421),\n", + " ('running', 421),\n", + " ('talent', 420),\n", + " ('stage', 419),\n", + " ('started', 418),\n", + " ('sweet', 417),\n", + " ('leads', 417),\n", + " ('japanese', 417),\n", + " ('poor', 416),\n", + " ('deal', 416),\n", + " ('personal', 413),\n", + " ('incredible', 413),\n", + " ('fast', 412),\n", + " 
('became', 410),\n", + " ('deep', 410),\n", + " ('hours', 409),\n", + " ('nearly', 408),\n", + " ('dream', 408),\n", + " ('giving', 408),\n", + " ('turned', 407),\n", + " ('clearly', 407),\n", + " ('near', 406),\n", + " ('obvious', 406),\n", + " ('cut', 405),\n", + " ('surprise', 405),\n", + " ('body', 404),\n", + " ('era', 404),\n", + " ('female', 403),\n", + " ('hour', 403),\n", + " ('five', 403),\n", + " ('note', 399),\n", + " ('learn', 398),\n", + " ('truth', 398),\n", + " ('match', 397),\n", + " ('feels', 397),\n", + " ('except', 397),\n", + " ('tony', 397),\n", + " ('filmed', 394),\n", + " ('complete', 394),\n", + " ('clear', 394),\n", + " ('older', 393),\n", + " ('street', 393),\n", + " ('lots', 393),\n", + " ('eventually', 393),\n", + " ('keeps', 393),\n", + " ('buy', 392),\n", + " ('stewart', 391),\n", + " ('william', 391),\n", + " ('joe', 390),\n", + " ('meet', 390),\n", + " ('fall', 390),\n", + " ('shots', 389),\n", + " ('talking', 389),\n", + " ('difficult', 389),\n", + " ('unlike', 389),\n", + " ('rating', 389),\n", + " ('means', 388),\n", + " ('dramatic', 388),\n", + " ('appears', 386),\n", + " ('subject', 386),\n", + " ('wonder', 386),\n", + " ('present', 386),\n", + " ('situation', 386),\n", + " ('comments', 385),\n", + " ('sequences', 383),\n", + " ('general', 383),\n", + " ('lee', 383),\n", + " ('earlier', 382),\n", + " ('points', 382),\n", + " ('check', 379),\n", + " ('gone', 379),\n", + " ('ten', 378),\n", + " ('suspense', 378),\n", + " ('recommended', 378),\n", + " ('business', 377),\n", + " ('third', 377),\n", + " ('talk', 375),\n", + " ('leaves', 375),\n", + " ('beyond', 375),\n", + " ('portrayal', 374),\n", + " ('beautifully', 373),\n", + " ('single', 372),\n", + " ('bill', 372),\n", + " ('word', 371),\n", + " ('plenty', 371),\n", + " ('falls', 370),\n", + " ('whom', 370),\n", + " ('figure', 369),\n", + " ('battle', 369),\n", + " ('scary', 369),\n", + " ('non', 369),\n", + " ('return', 368),\n", + " ('using', 368),\n", + " ('doubt', 
367),\n", + " ('add', 367),\n", + " ('hear', 366),\n", + " ('solid', 366),\n", + " ('success', 366),\n", + " ('touching', 365),\n", + " ('political', 365),\n", + " ('oh', 365),\n", + " ('jokes', 365),\n", + " ('awesome', 364),\n", + " ('hell', 364),\n", + " ('boys', 364),\n", + " ('dog', 362),\n", + " ('recently', 362),\n", + " ('sexual', 362),\n", + " ('please', 361),\n", + " ('wouldn', 361),\n", + " ('features', 361),\n", + " ('straight', 361),\n", + " ('lack', 360),\n", + " ('forget', 360),\n", + " ('setting', 360),\n", + " ('mark', 359),\n", + " ('married', 359),\n", + " ('social', 357),\n", + " ('adventure', 356),\n", + " ('interested', 356),\n", + " ('brothers', 355),\n", + " ('sees', 355),\n", + " ('actual', 355),\n", + " ('terrific', 355),\n", + " ('move', 354),\n", + " ('call', 354),\n", + " ('various', 353),\n", + " ('dr', 353),\n", + " ('theater', 353),\n", + " ('animated', 352),\n", + " ('western', 351),\n", + " ('space', 350),\n", + " ('baby', 350),\n", + " ('leading', 348),\n", + " ('disappointed', 348),\n", + " ('portrayed', 346),\n", + " ('aren', 346),\n", + " ('screenplay', 345),\n", + " ('smith', 345),\n", + " ('hate', 344),\n", + " ('towards', 344),\n", + " ('noir', 343),\n", + " ('outstanding', 342),\n", + " ('decent', 342),\n", + " ('kelly', 342),\n", + " ('directors', 341),\n", + " ('journey', 341),\n", + " ('none', 340),\n", + " ('effective', 340),\n", + " ('looked', 340),\n", + " ('caught', 339),\n", + " ('cold', 339),\n", + " ('storyline', 339),\n", + " ('fi', 339),\n", + " ('sci', 339),\n", + " ('mary', 339),\n", + " ('rich', 338),\n", + " ('charming', 338),\n", + " ('harry', 337),\n", + " ('popular', 337),\n", + " ('manages', 337),\n", + " ('rare', 337),\n", + " ('spirit', 336),\n", + " ('open', 335),\n", + " ('appreciate', 335),\n", + " ('basically', 334),\n", + " ('moves', 334),\n", + " ('acted', 334),\n", + " ('deserves', 333),\n", + " ('subtle', 333),\n", + " ('mention', 333),\n", + " ('inside', 333),\n", + " ('pace', 333),\n", + " 
('century', 333),\n", + " ('boring', 333),\n", + " ('familiar', 332),\n", + " ('background', 332),\n", + " ('ben', 331),\n", + " ('creepy', 330),\n", + " ('supposed', 330),\n", + " ('secret', 329),\n", + " ('jim', 328),\n", + " ('die', 328),\n", + " ('question', 327),\n", + " ('effect', 327),\n", + " ('natural', 327),\n", + " ('rate', 326),\n", + " ('language', 326),\n", + " ('impressive', 326),\n", + " ('intelligent', 325),\n", + " ('saying', 325),\n", + " ('material', 324),\n", + " ('realize', 324),\n", + " ('telling', 324),\n", + " ('scott', 324),\n", + " ('singing', 323),\n", + " ('dancing', 322),\n", + " ('adult', 321),\n", + " ('imagine', 321),\n", + " ('visual', 321),\n", + " ('kept', 320),\n", + " ('office', 320),\n", + " ('uses', 319),\n", + " ('pure', 318),\n", + " ('wait', 318),\n", + " ('stunning', 318),\n", + " ('copy', 317),\n", + " ('review', 317),\n", + " ('previous', 317),\n", + " ('seriously', 317),\n", + " ('somehow', 316),\n", + " ('created', 316),\n", + " ('magic', 316),\n", + " ('create', 316),\n", + " ('hot', 316),\n", + " ('reading', 316),\n", + " ('crazy', 315),\n", + " ('air', 315),\n", + " ('frank', 315),\n", + " ('stay', 315),\n", + " ('escape', 315),\n", + " ('attempt', 315),\n", + " ('hands', 314),\n", + " ('filled', 313),\n", + " ('surprisingly', 312),\n", + " ('expected', 312),\n", + " ('average', 312),\n", + " ('complex', 311),\n", + " ('studio', 310),\n", + " ('successful', 310),\n", + " ('quickly', 310),\n", + " ('male', 309),\n", + " ('plus', 309),\n", + " ('co', 307),\n", + " ('minute', 306),\n", + " ('images', 306),\n", + " ('casting', 306),\n", + " ('exciting', 306),\n", + " ('following', 306),\n", + " ('members', 305),\n", + " ('german', 305),\n", + " ('e', 305),\n", + " ('reasons', 305),\n", + " ('follows', 305),\n", + " ('themes', 305),\n", + " ('touch', 304),\n", + " ('genius', 304),\n", + " ('free', 304),\n", + " ('edge', 304),\n", + " ('cute', 304),\n", + " ('outside', 303),\n", + " ('ok', 302),\n", + " ('admit', 
302),\n", + " ('younger', 302),\n", + " ('reviews', 302),\n", + " ('odd', 301),\n", + " ('fighting', 301),\n", + " ('master', 301),\n", + " ('break', 300),\n", + " ('thanks', 300),\n", + " ('recent', 300),\n", + " ('comment', 300),\n", + " ('apart', 299),\n", + " ('lovely', 298),\n", + " ('begin', 298),\n", + " ('emotions', 298),\n", + " ('doctor', 297),\n", + " ('italian', 297),\n", + " ('party', 297),\n", + " ('la', 296),\n", + " ('missed', 296),\n", + " ...]" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "positive_counts.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "# ratio of positive to negative occurrences for each word seen more than 100 times\n", + "pos_neg_ratios = Counter()\n", + "\n", + "for term,cnt in list(total_counts.most_common()):\n", + " if(cnt > 100):\n", + " pos_neg_ratio = positive_counts[term] / float(negative_counts[term]+1)\n", + " pos_neg_ratios[term] = pos_neg_ratio\n", + "\n", + "# map the ratios into log space so positive and negative words are symmetric around zero;\n", + "# the +0.01 keeps the argument of the log strictly positive when the ratio is near zero\n", + "for word,ratio in pos_neg_ratios.most_common():\n", + " if(ratio > 1):\n", + " pos_neg_ratios[word] = np.log(ratio)\n", + " else:\n", + " pos_neg_ratios[word] = -np.log((1 / (ratio+0.01)))" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('edie', 4.6913478822291435),\n", + " ('paulie', 4.0775374439057197),\n", + " ('felix', 3.1527360223636558),\n", + " ('polanski', 2.8233610476132043),\n", + " ('matthau', 2.8067217286092401),\n", + " ('victoria', 2.6810215287142909),\n", + " ('mildred', 2.6026896854443837),\n", + " ('gandhi', 2.5389738710582761),\n", + " ('flawless', 2.451005098112319),\n", + " ('superbly', 2.2600254785752498),\n", + " ('perfection', 2.1594842493533721),\n", + " ('astaire', 2.1400661634962708),\n", + " ('captures', 2.0386195471595809),\n", + " ('voight', 2.0301704926730531),\n", + " ('wonderfully', 2.0218960560332353),\n", + " ('powell', 1.9783454248084671),\n",
+ " ('brosnan', 1.9547990964725592),\n", + " ('lily', 1.9203768470501485),\n", + " ('bakshi', 1.9029851043382795),\n", + " ('lincoln', 1.9014583864844796),\n", + " ('refreshing', 1.8551812956655511),\n", + " ('breathtaking', 1.8481124057791867),\n", + " ('bourne', 1.8478489358790986),\n", + " ('lemmon', 1.8458266904983307),\n", + " ('delightful', 1.8002701588959635),\n", + " ('flynn', 1.7996646487351682),\n", + " ('andrews', 1.7764919970972666),\n", + " ('homer', 1.7692866133759964),\n", + " ('beautifully', 1.7626953362841438),\n", + " ('soccer', 1.7578579175523736),\n", + " ('elvira', 1.7397031072720019),\n", + " ('underrated', 1.7197859696029656),\n", + " ('gripping', 1.7165360479904674),\n", + " ('superb', 1.7091514458966952),\n", + " ('delight', 1.6714733033535532),\n", + " ('welles', 1.6677068205580761),\n", + " ('sadness', 1.663505133704376),\n", + " ('sinatra', 1.6389967146756448),\n", + " ('touching', 1.637217476541176),\n", + " ('timeless', 1.62924053973028),\n", + " ('macy', 1.6211339521972916),\n", + " ('unforgettable', 1.6177367152487956),\n", + " ('favorites', 1.6158688027643908),\n", + " ('stewart', 1.6119987332957739),\n", + " ('hartley', 1.6094379124341003),\n", + " ('sullivan', 1.6094379124341003),\n", + " ('extraordinary', 1.6094379124341003),\n", + " ('brilliantly', 1.5950491749820008),\n", + " ('friendship', 1.5677652160335325),\n", + " ('wonderful', 1.5645425925262093),\n", + " ('palma', 1.5553706911638245),\n", + " ('magnificent', 1.54663701119507),\n", + " ('finest', 1.5462590108125689),\n", + " ('jackie', 1.5439233053234738),\n", + " ('ritter', 1.5404450409471491),\n", + " ('tremendous', 1.5184661342283736),\n", + " ('freedom', 1.5091151908062312),\n", + " ('fantastic', 1.5048433868558566),\n", + " ('terrific', 1.5026699370083942),\n", + " ('noir', 1.493925025312256),\n", + " ('sidney', 1.493925025312256),\n", + " ('outstanding', 1.4910053152089213),\n", + " ('mann', 1.4894785973551214),\n", + " ('pleasantly', 1.4894785973551214),\n", + " 
('nancy', 1.488077055429833),\n", + " ('marie', 1.4825711915553104),\n", + " ('marvelous', 1.4739999415389962),\n", + " ('excellent', 1.4647538505723599),\n", + " ('ruth', 1.4596256342054401),\n", + " ('stanwyck', 1.4412101187160054),\n", + " ('widmark', 1.4350845252893227),\n", + " ('splendid', 1.4271163556401458),\n", + " ('chan', 1.423108334242607),\n", + " ('exceptional', 1.4201959127955721),\n", + " ('tender', 1.410986973710262),\n", + " ('gentle', 1.4078005663408544),\n", + " ('poignant', 1.4022947024663317),\n", + " ('gem', 1.3932148039644643),\n", + " ('amazing', 1.3919815802404802),\n", + " ('chilling', 1.3862943611198906),\n", + " ('captivating', 1.3862943611198906),\n", + " ('fisher', 1.3862943611198906),\n", + " ('davies', 1.3862943611198906),\n", + " ('darker', 1.3652409519220583),\n", + " ('april', 1.3499267169490159),\n", + " ('kelly', 1.3461743673304654),\n", + " ('blake', 1.3418425985490567),\n", + " ('overlooked', 1.329135947279942),\n", + " ('ralph', 1.32818673031261),\n", + " ('bette', 1.3156767939059373),\n", + " ('hoffman', 1.3150668518315229),\n", + " ('cole', 1.3121863889661687),\n", + " ('shines', 1.3049487216659381),\n", + " ('powerful', 1.2999662776313934),\n", + " ('notch', 1.2950456896547455),\n", + " ('remarkable', 1.2883688239495823),\n", + " ('pitt', 1.286210902562908),\n", + " ('winters', 1.2833463918674481),\n", + " ('vivid', 1.2762934659055623),\n", + " ('gritty', 1.2757524867200667),\n", + " ('giallo', 1.2745029551317739),\n", + " ('portrait', 1.2704625455947689),\n", + " ('innocence', 1.2694300209805796),\n", + " ('psychiatrist', 1.2685113254635072),\n", + " ('favorite', 1.2668956297860055),\n", + " ('ensemble', 1.2656663733312759),\n", + " ('stunning', 1.2622417124499117),\n", + " ('burns', 1.259880436264232),\n", + " ('garbo', 1.258954938743289),\n", + " ('barbara', 1.2580400255962119),\n", + " ('panic', 1.2527629684953681),\n", + " ('holly', 1.2527629684953681),\n", + " ('philip', 1.2527629684953681),\n", + " ('carol', 
1.2481440226390734),\n", + " ('perfect', 1.246742480713785),\n", + " ('appreciated', 1.2462482874741743),\n", + " ('favourite', 1.2411123512753928),\n", + " ('journey', 1.2367626271489269),\n", + " ('rural', 1.235471471385307),\n", + " ('bond', 1.2321436812926323),\n", + " ('builds', 1.2305398317106577),\n", + " ('brilliant', 1.2287554137664785),\n", + " ('brooklyn', 1.2286654169163074),\n", + " ('von', 1.225175011976539),\n", + " ('unfolds', 1.2163953243244932),\n", + " ('recommended', 1.2163953243244932),\n", + " ('daniel', 1.20215296760895),\n", + " ('perfectly', 1.1971931173405572),\n", + " ('crafted', 1.1962507582320256),\n", + " ('prince', 1.1939224684724346),\n", + " ('troubled', 1.192138346678933),\n", + " ('consequences', 1.1865810616140668),\n", + " ('haunting', 1.1814999484738773),\n", + " ('cinderella', 1.180052620608284),\n", + " ('alexander', 1.1759989522835299),\n", + " ('emotions', 1.1753049094563641),\n", + " ('boxing', 1.1735135968412274),\n", + " ('subtle', 1.1734135017508081),\n", + " ('curtis', 1.1649873576129823),\n", + " ('rare', 1.1566438362402944),\n", + " ('loved', 1.1563661500586044),\n", + " ('daughters', 1.1526795099383853),\n", + " ('courage', 1.1438688802562305),\n", + " ('dentist', 1.1426722784621401),\n", + " ('highly', 1.1420208631618658),\n", + " ('nominated', 1.1409146683587992),\n", + " ('tony', 1.1397491942285991),\n", + " ('draws', 1.1325138403437911),\n", + " ('everyday', 1.1306150197542835),\n", + " ('contrast', 1.1284652518177909),\n", + " ('cried', 1.1213405397456659),\n", + " ('fabulous', 1.1210851445201684),\n", + " ('ned', 1.120591195386885),\n", + " ('fay', 1.120591195386885),\n", + " ('emma', 1.1184149159642893),\n", + " ('sensitive', 1.113318436057805),\n", + " ('smooth', 1.1089750757036563),\n", + " ('dramas', 1.1080910326226534),\n", + " ('today', 1.1050431789984001),\n", + " ('helps', 1.1023091505494358),\n", + " ('inspiring', 1.0986122886681098),\n", + " ('jimmy', 1.0937696641923216),\n", + " ('awesome', 
1.0931328229034842),\n", + " ('unique', 1.0881409888008142),\n", + " ('tragic', 1.0871835928444868),\n", + " ('intense', 1.0870514662670339),\n", + " ('stellar', 1.0857088838322018),\n", + " ('rival', 1.0822184788924332),\n", + " ('provides', 1.0797081340289569),\n", + " ('depression', 1.0782034170369026),\n", + " ('shy', 1.0775588794702773),\n", + " ('carrie', 1.076139432816051),\n", + " ('blend', 1.0753554265038423),\n", + " ('hank', 1.0736109864626924),\n", + " ('diana', 1.0726368022648489),\n", + " ('adorable', 1.0726368022648489),\n", + " ('unexpected', 1.0722255334949147),\n", + " ('achievement', 1.0668635903535293),\n", + " ('bettie', 1.0663514264498881),\n", + " ('happiness', 1.0632729222228008),\n", + " ('glorious', 1.0608719606852626),\n", + " ('davis', 1.0541605260972757),\n", + " ('terrifying', 1.0525211814678428),\n", + " ('beauty', 1.050410186850232),\n", + " ('ideal', 1.0479685558493548),\n", + " ('fears', 1.0467872208035236),\n", + " ('hong', 1.0438040521731147),\n", + " ('seasons', 1.0433496099930604),\n", + " ('fascinating', 1.0414538748281612),\n", + " ('carries', 1.0345904299031787),\n", + " ('satisfying', 1.0321225473992768),\n", + " ('definite', 1.0319209141694374),\n", + " ('touched', 1.0296194171811581),\n", + " ('greatest', 1.0248947127715422),\n", + " ('creates', 1.0241097613701886),\n", + " ('aunt', 1.023388867430522),\n", + " ('walter', 1.022328983918479),\n", + " ('spectacular', 1.0198314108149955),\n", + " ('portrayal', 1.0189810189761024),\n", + " ('ann', 1.0127808528183286),\n", + " ('enterprise', 1.0116009116784799),\n", + " ('musicals', 1.0096648026516135),\n", + " ('deeply', 1.0094845087721023),\n", + " ('incredible', 1.0061677561461084),\n", + " ('mature', 1.0060195018402847),\n", + " ('triumph', 0.99682959435816731),\n", + " ('margaret', 0.99682959435816731),\n", + " ('navy', 0.99493385919326827),\n", + " ('harry', 0.99176919305006062),\n", + " ('lucas', 0.990398704027877),\n", + " ('sweet', 0.98966110487955483),\n", + " 
('joey', 0.98794672078059009),\n", + " ('oscar', 0.98721905111049713),\n", + " ('balance', 0.98649499054740353),\n", + " ('warm', 0.98485340331145166),\n", + " ('ages', 0.98449898190068863),\n", + " ('glover', 0.98082925301172619),\n", + " ('guilt', 0.98082925301172619),\n", + " ('carrey', 0.98082925301172619),\n", + " ('learns', 0.97881108885548895),\n", + " ('unusual', 0.97788374278196932),\n", + " ('sons', 0.97777581552483595),\n", + " ('complex', 0.97761897738147796),\n", + " ('essence', 0.97753435711487369),\n", + " ('brazil', 0.9769153536905899),\n", + " ('widow', 0.97650959186720987),\n", + " ('solid', 0.97537964824416146),\n", + " ('beautiful', 0.97326301262841053),\n", + " ('holmes', 0.97246100334120955),\n", + " ('awe', 0.97186058302896583),\n", + " ('vhs', 0.97116734209998934),\n", + " ('eerie', 0.97116734209998934),\n", + " ('lonely', 0.96873720724669754),\n", + " ('grim', 0.96873720724669754),\n", + " ('sport', 0.96825047080486615),\n", + " ('debut', 0.96508089604358704),\n", + " ('destiny', 0.96343751029985703),\n", + " ('thrillers', 0.96281074750904794),\n", + " ('tears', 0.95977584381389391),\n", + " ('rose', 0.95664202739772253),\n", + " ('feelings', 0.95551144502743635),\n", + " ('ginger', 0.95551144502743635),\n", + " ('winning', 0.95471810900804055),\n", + " ('stanley', 0.95387344302319799),\n", + " ('cox', 0.95343027882361187),\n", + " ('paris', 0.95278479030472663),\n", + " ('heart', 0.95238806924516806),\n", + " ('hooked', 0.95155887071161305),\n", + " ('comfortable', 0.94803943018873538),\n", + " ('mgm', 0.94446160884085151),\n", + " ('masterpiece', 0.94155039863339296),\n", + " ('themes', 0.94118828349588235),\n", + " ('danny', 0.93967118051821874),\n", + " ('anime', 0.93378388932167222),\n", + " ('perry', 0.93328830824272613),\n", + " ('joy', 0.93301752567946861),\n", + " ('lovable', 0.93081883243706487),\n", + " ('hal', 0.92953595862417571),\n", + " ('mysteries', 0.92953595862417571),\n", + " ('louis', 0.92871325187271225),\n", + " 
('charming', 0.92520609553210742),\n", + " ('urban', 0.92367083917177761),\n", + " ('allows', 0.92183091224977043),\n", + " ('impact', 0.91815814604895041),\n", + " ('gradually', 0.91629073187415511),\n", + " ('lifestyle', 0.91629073187415511),\n", + " ('italy', 0.91629073187415511),\n", + " ('spy', 0.91289514287301687),\n", + " ('treat', 0.91193342650519937),\n", + " ('subsequent', 0.91056005716517008),\n", + " ('kennedy', 0.90981821736853763),\n", + " ('loving', 0.90967549275543591),\n", + " ('surprising', 0.90937028902958128),\n", + " ('quiet', 0.90648673177753425),\n", + " ('winter', 0.90624039602065365),\n", + " ('reveals', 0.90490540964902977),\n", + " ('raw', 0.90445627422715225),\n", + " ('funniest', 0.90078654533818991),\n", + " ('pleased', 0.89994159387262562),\n", + " ('norman', 0.89994159387262562),\n", + " ('thief', 0.89874642222324552),\n", + " ('season', 0.89827222637147675),\n", + " ('secrets', 0.89794159320595857),\n", + " ('colorful', 0.89705936994626756),\n", + " ('highest', 0.8967461358011849),\n", + " ('compelling', 0.89462923509297576),\n", + " ('danes', 0.89248008318043659),\n", + " ('castle', 0.88967708335606499),\n", + " ('kudos', 0.88889175768604067),\n", + " ('great', 0.88810470901464589),\n", + " ('baseball', 0.88730319500090271),\n", + " ('subtitles', 0.88730319500090271),\n", + " ('bleak', 0.88730319500090271),\n", + " ('winner', 0.88643776872447388),\n", + " ('tragedy', 0.88563699078315261),\n", + " ('todd', 0.88551907320740142),\n", + " ('nicely', 0.87924946019380601),\n", + " ('arthur', 0.87546873735389985),\n", + " ('essential', 0.87373111745535925),\n", + " ('gorgeous', 0.8731725250935497),\n", + " ('fonda', 0.87294029100054127),\n", + " ('eastwood', 0.87139541196626402),\n", + " ('focuses', 0.87082835779739776),\n", + " ('enjoyed', 0.87070195951624607),\n", + " ('natural', 0.86997924506912838),\n", + " ('intensity', 0.86835126958503595),\n", + " ('witty', 0.86824103423244681),\n", + " ('rob', 0.8642954367557748),\n", + " 
('worlds', 0.86377269759070874),\n", + " ('health', 0.86113891179907498),\n", + " ('magical', 0.85953791528170564),\n", + " ('deeper', 0.85802182375017932),\n", + " ('lucy', 0.85618680780444956),\n", + " ('moving', 0.85566611005772031),\n", + " ('lovely', 0.85290640004681306),\n", + " ('purple', 0.8513711857748395),\n", + " ('memorable', 0.84801189112086062),\n", + " ('sings', 0.84729786038720367),\n", + " ('craig', 0.84342938360928321),\n", + " ('modesty', 0.84342938360928321),\n", + " ('relate', 0.84326559685926517),\n", + " ('episodes', 0.84223712084137292),\n", + " ('strong', 0.84167135777060931),\n", + " ('smith', 0.83959811108590054),\n", + " ('tear', 0.83704136022001441),\n", + " ('apartment', 0.83333115290549531),\n", + " ('princess', 0.83290912293510388),\n", + " ('disagree', 0.83290912293510388),\n", + " ('kung', 0.83173334384609199),\n", + " ('adventure', 0.83150561393278388),\n", + " ('columbo', 0.82667857318446791),\n", + " ('jake', 0.82667857318446791),\n", + " ('adds', 0.82485652591452319),\n", + " ('hart', 0.82472353834866463),\n", + " ('strength', 0.82417544296634937),\n", + " ('realizes', 0.82360006895738058),\n", + " ('dave', 0.8232003088081431),\n", + " ('childhood', 0.82208086393583857),\n", + " ('forbidden', 0.81989888619908913),\n", + " ('tight', 0.81883539572344199),\n", + " ('surreal', 0.8178506590609026),\n", + " ('manager', 0.81770990320170756),\n", + " ('dancer', 0.81574950265227764),\n", + " ('con', 0.81093021621632877),\n", + " ('studios', 0.81093021621632877),\n", + " ('miike', 0.80821651034473263),\n", + " ('realistic', 0.80807714723392232),\n", + " ('explicit', 0.80792269515237358),\n", + " ('kurt', 0.8060875917405409),\n", + " ('traditional', 0.80535917116687328),\n", + " ('deals', 0.80535917116687328),\n", + " ('holds', 0.80493858654806194),\n", + " ('carl', 0.80437281567016972),\n", + " ('touches', 0.80396154690023547),\n", + " ('gene', 0.80314807577427383),\n", + " ('albert', 0.8027669055771679),\n", + " ('abc', 
0.80234647252493729),\n", + " ('cry', 0.80011930011211307),\n", + " ('sides', 0.7995275841185171),\n", + " ('develops', 0.79850769621777162),\n", + " ('eyre', 0.79850769621777162),\n", + " ('dances', 0.79694397424158891),\n", + " ('oscars', 0.79633141679517616),\n", + " ('legendary', 0.79600456599965308),\n", + " ('importance', 0.79492987486988764),\n", + " ('hearted', 0.79492987486988764),\n", + " ('portraying', 0.79356592830699269),\n", + " ('impressed', 0.79258107754813223),\n", + " ('waters', 0.79112758892014912),\n", + " ('empire', 0.79078565012386137),\n", + " ('edge', 0.789774016249017),\n", + " ('environment', 0.78845736036427028),\n", + " ('jean', 0.78845736036427028),\n", + " ('sentimental', 0.7864791203521645),\n", + " ('captured', 0.78623760362595729),\n", + " ('styles', 0.78592891401091158),\n", + " ('daring', 0.78592891401091158),\n", + " ('backgrounds', 0.78275933924963248),\n", + " ('frank', 0.78275933924963248),\n", + " ('matches', 0.78275933924963248),\n", + " ('tense', 0.78275933924963248),\n", + " ('gothic', 0.78209466657644144),\n", + " ('sharp', 0.7814397877056235),\n", + " ('achieved', 0.78015855754957497),\n", + " ('court', 0.77947526404844247),\n", + " ('steals', 0.7789140023173704),\n", + " ('rules', 0.77844476107184035),\n", + " ('colors', 0.77684619943659217),\n", + " ('reunion', 0.77318988823348167),\n", + " ('covers', 0.77139937745969345),\n", + " ('tale', 0.77010822169607374),\n", + " ('rain', 0.7683706017975328),\n", + " ('denzel', 0.76804848873306297),\n", + " ('stays', 0.76787072675588186),\n", + " ('blob', 0.76725515271366718),\n", + " ('conventional', 0.76214005204689672),\n", + " ('maria', 0.76214005204689672),\n", + " ('fresh', 0.76158434211317383),\n", + " ('midnight', 0.76096977689870637),\n", + " ('landscape', 0.75852993982279704),\n", + " ('animated', 0.75768570169751648),\n", + " ('titanic', 0.75666058628227129),\n", + " ('sunday', 0.75666058628227129),\n", + " ('spring', 0.7537718023763802),\n", + " ('cagney', 
0.7537718023763802),\n", + " ('enjoyable', 0.75246375771636476),\n", + " ('immensely', 0.75198768058287868),\n", + " ('sir', 0.7507762933965817),\n", + " ('nevertheless', 0.75067102469813185),\n", + " ('driven', 0.74994477895307854),\n", + " ('performances', 0.74883252516063137),\n", + " ('memories', 0.74721440183022114),\n", + " ('nowadays', 0.74721440183022114),\n", + " ('simple', 0.74641420974143258),\n", + " ('golden', 0.74533293373051557),\n", + " ('leslie', 0.74533293373051557),\n", + " ('lovers', 0.74497224842453125),\n", + " ('relationship', 0.74484232345601786),\n", + " ('supporting', 0.74357803418683721),\n", + " ('che', 0.74262723782331497),\n", + " ('packed', 0.7410032017375805),\n", + " ('trek', 0.74021469141793106),\n", + " ('provoking', 0.73840377214806618),\n", + " ('strikes', 0.73759894313077912),\n", + " ('depiction', 0.73682224406260699),\n", + " ('emotional', 0.73678211645681524),\n", + " ('secretary', 0.7366322924996842),\n", + " ('influenced', 0.73511137965897755),\n", + " ('florida', 0.73511137965897755),\n", + " ('germany', 0.73288750920945944),\n", + " ('brings', 0.73142936713096229),\n", + " ('lewis', 0.73129894652432159),\n", + " ('elderly', 0.73088750854279239),\n", + " ('owner', 0.72743625403857748),\n", + " ('streets', 0.72666987259858895),\n", + " ('henry', 0.72642196944481741),\n", + " ('portrays', 0.72593700338293632),\n", + " ('bears', 0.7252354951114458),\n", + " ('china', 0.72489587887452556),\n", + " ('anger', 0.72439972406404984),\n", + " ('society', 0.72433010799663333),\n", + " ('available', 0.72415741730250549),\n", + " ('best', 0.72347034060446314),\n", + " ('bugs', 0.72270598280148979),\n", + " ('magic', 0.71878961117328299),\n", + " ('verhoeven', 0.71846498854423513),\n", + " ('delivers', 0.71846498854423513),\n", + " ('jim', 0.71783979315031676),\n", + " ('donald', 0.71667767797013937),\n", + " ('endearing', 0.71465338578090898),\n", + " ('relationships', 0.71393795022901896),\n", + " ('greatly', 
0.71256526641704687),\n", + " ('charlie', 0.71024161391924534),\n", + " ('brad', 0.71024161391924534),\n", + " ('simon', 0.70967648251115578),\n", + " ('effectively', 0.70914752190638641),\n", + " ('march', 0.70774597998109789),\n", + " ('atmosphere', 0.70744773070214162),\n", + " ('influence', 0.70733181555190172),\n", + " ('genius', 0.706392407309966),\n", + " ('emotionally', 0.70556970055850243),\n", + " ('ken', 0.70526854109229009),\n", + " ('identity', 0.70484322032313651),\n", + " ('sophisticated', 0.70470800296102132),\n", + " ('dan', 0.70457587638356811),\n", + " ('andrew', 0.70329955202396321),\n", + " ('india', 0.70144598337464037),\n", + " ('roy', 0.69970458110610434),\n", + " ('surprisingly', 0.6995780708902356),\n", + " ('sky', 0.69780919366575667),\n", + " ('romantic', 0.69664981111114743),\n", + " ('match', 0.69566924999265523),\n", + " ('britain', 0.69314718055994529),\n", + " ('beatty', 0.69314718055994529),\n", + " ('affected', 0.69314718055994529),\n", + " ('cowboy', 0.69314718055994529),\n", + " ('wave', 0.69314718055994529),\n", + " ('stylish', 0.69314718055994529),\n", + " ('bitter', 0.69314718055994529),\n", + " ('patient', 0.69314718055994529),\n", + " ('meets', 0.69314718055994529),\n", + " ('love', 0.69198533541937324),\n", + " ('paul', 0.68980827929443067),\n", + " ('andy', 0.68846333124751902),\n", + " ('performance', 0.68797386327972465),\n", + " ('patrick', 0.68645819240914863),\n", + " ('unlike', 0.68546468438792907),\n", + " ('brooks', 0.68433655087779044),\n", + " ('refuses', 0.68348526964820844),\n", + " ('award', 0.6824518914431974),\n", + " ('complaint', 0.6824518914431974),\n", + " ('ride', 0.68229716453587952),\n", + " ('dawson', 0.68171848473632257),\n", + " ('luke', 0.68158635815886937),\n", + " ('wells', 0.68087708796813096),\n", + " ('france', 0.6804081547825156),\n", + " ('handsome', 0.68007509899259255),\n", + " ('sports', 0.68007509899259255),\n", + " ('rebel', 0.67875844310784572),\n", + " ('directs', 
0.67875844310784572),\n", + " ('greater', 0.67605274720064523),\n", + " ('dreams', 0.67599410133369586),\n", + " ('effective', 0.67565402311242806),\n", + " ('interpretation', 0.67479804189174875),\n", + " ('works', 0.67445504754779284),\n", + " ('brando', 0.67445504754779284),\n", + " ('noble', 0.6737290947028437),\n", + " ('paced', 0.67314651385327573),\n", + " ('le', 0.67067432470788668),\n", + " ('master', 0.67015766233524654),\n", + " ('h', 0.6696166831497512),\n", + " ('rings', 0.66904962898088483),\n", + " ('easy', 0.66895995494594152),\n", + " ('city', 0.66820823221269321),\n", + " ('sunshine', 0.66782937257565544),\n", + " ('succeeds', 0.66647893347778397),\n", + " ('relations', 0.664159643686693),\n", + " ('england', 0.66387679825983203),\n", + " ('glimpse', 0.66329421741026418),\n", + " ('aired', 0.66268797307523675),\n", + " ('sees', 0.66263163663399482),\n", + " ('both', 0.66248336767382998),\n", + " ('definitely', 0.66199789483898808),\n", + " ('imaginative', 0.66139848224536502),\n", + " ('appreciate', 0.66083893732728749),\n", + " ('tricks', 0.66071190480679143),\n", + " ('striking', 0.66071190480679143),\n", + " ('carefully', 0.65999497324304479),\n", + " ('complicated', 0.65981076029235353),\n", + " ('perspective', 0.65962448852130173),\n", + " ('trilogy', 0.65877953705573755),\n", + " ('future', 0.65834665141052828),\n", + " ('lion', 0.65742909795786608),\n", + " ('victor', 0.65540685257709819),\n", + " ('douglas', 0.65540685257709819),\n", + " ('inspired', 0.65459851044271034),\n", + " ('marriage', 0.65392646740666405),\n", + " ('demands', 0.65392646740666405),\n", + " ('father', 0.65172321672194655),\n", + " ('page', 0.65123628494430852),\n", + " ('instant', 0.65058756614114943),\n", + " ('era', 0.6495567444850836),\n", + " ('ruthless', 0.64934455790155243),\n", + " ('saga', 0.64934455790155243),\n", + " ('joan', 0.64891392558311978),\n", + " ('joseph', 0.64841128671855386),\n", + " ('workers', 0.64829661439459352),\n", + " ('fantasy', 
0.64726757480925168),\n", + " ('accomplished', 0.64551913157069074),\n", + " ('distant', 0.64551913157069074),\n", + " ('manhattan', 0.64435701639051324),\n", + " ('personal', 0.64355023942057321),\n", + " ('pushing', 0.64313675998528386),\n", + " ('meeting', 0.64313675998528386),\n", + " ('individual', 0.64313675998528386),\n", + " ('pleasant', 0.64250344774119039),\n", + " ('brave', 0.64185388617239469),\n", + " ('william', 0.64083139119578469),\n", + " ('hudson', 0.64077919504262937),\n", + " ('friendly', 0.63949446706762514),\n", + " ('eccentric', 0.63907995928966954),\n", + " ('awards', 0.63875310849414646),\n", + " ('jack', 0.63838309514997038),\n", + " ('seeking', 0.63808740337691783),\n", + " ('colonel', 0.63757732940513456),\n", + " ('divorce', 0.63757732940513456),\n", + " ('jane', 0.63443957973316734),\n", + " ('keeping', 0.63414883979798953),\n", + " ('gives', 0.63383568159497883),\n", + " ('ted', 0.63342794585832296),\n", + " ('animation', 0.63208692379869902),\n", + " ('progress', 0.6317782341836532),\n", + " ('concert', 0.63127177684185776),\n", + " ('larger', 0.63127177684185776),\n", + " ('nation', 0.6296337748376194),\n", + " ('albeit', 0.62739580299716491),\n", + " ('adapted', 0.62613647027698516),\n", + " ('discovers', 0.62542900650499444),\n", + " ('classic', 0.62504956428050518),\n", + " ('segment', 0.62335141862440335),\n", + " ('morgan', 0.62303761437291871),\n", + " ('mouse', 0.62294292188669675),\n", + " ('impressive', 0.62211140744319349),\n", + " ('artist', 0.62168821657780038),\n", + " ('ultimate', 0.62168821657780038),\n", + " ('griffith', 0.62117368093485603),\n", + " ('emily', 0.62082651898031915),\n", + " ('drew', 0.62082651898031915),\n", + " ('moved', 0.6197197120051281),\n", + " ('profound', 0.61903920840622351),\n", + " ('families', 0.61903920840622351),\n", + " ('innocent', 0.61851219917136446),\n", + " ('versions', 0.61730910416844087),\n", + " ('eddie', 0.61691981517206107),\n", + " ('criticism', 0.61651395453902935),\n", + " 
('nature', 0.61594514653194088),\n", + " ('recognized', 0.61518563909023349),\n", + " ('sexuality', 0.61467556511845012),\n", + " ('contract', 0.61400986000122149),\n", + " ('brian', 0.61344043794920278),\n", + " ('remembered', 0.6131044728864089),\n", + " ('determined', 0.6123858239154869),\n", + " ('offers', 0.61207935747116349),\n", + " ('pleasure', 0.61195702582993206),\n", + " ('washington', 0.61180154110599294),\n", + " ('images', 0.61159731359583758),\n", + " ('games', 0.61067095873570676),\n", + " ('academy', 0.60872983874736208),\n", + " ('fashioned', 0.60798937221963845),\n", + " ('melodrama', 0.60749173598145145),\n", + " ('peoples', 0.60613580357031549),\n", + " ('charismatic', 0.60613580357031549),\n", + " ('rough', 0.60613580357031549),\n", + " ('dealing', 0.60517840761398811),\n", + " ('fine', 0.60496962268013299),\n", + " ('tap', 0.60391604683200273),\n", + " ('trio', 0.60157998703445481),\n", + " ('russell', 0.60120968523425966),\n", + " ('figures', 0.60077386042893011),\n", + " ('ward', 0.60005675749393339),\n", + " ('shine', 0.59911823091166894),\n", + " ('brady', 0.59911823091166894),\n", + " ('job', 0.59845562125168661),\n", + " ('satisfied', 0.59652034487087369),\n", + " ('river', 0.59637962862495086),\n", + " ('brown', 0.595773016534769),\n", + " ('believable', 0.59566072133302495),\n", + " ('bound', 0.59470710774669278),\n", + " ('always', 0.59470710774669278),\n", + " ('hall', 0.5933967777928858),\n", + " ('cook', 0.5916777203950857),\n", + " ('claire', 0.59136448625000293),\n", + " ('broadway', 0.59033768669372433),\n", + " ('anna', 0.58778666490211906),\n", + " ('peace', 0.58628403501758408),\n", + " ('visually', 0.58539431926349916),\n", + " ('falk', 0.58525821854876026),\n", + " ('morality', 0.58525821854876026),\n", + " ('growing', 0.58466653756587539),\n", + " ('experiences', 0.58314628534561685),\n", + " ('stood', 0.58314628534561685),\n", + " ('touch', 0.58122926435596001),\n", + " ('lives', 0.5810976767513224),\n", + " ('kubrick', 
0.58066919713325493),\n", + " ('timing', 0.58047401805583243),\n", + " ('struggles', 0.57981849525294216),\n", + " ('expressions', 0.57981849525294216),\n", + " ('authentic', 0.57848427223980559),\n", + " ('helen', 0.57763429343810091),\n", + " ('pre', 0.57700753064729182),\n", + " ('quirky', 0.5753641449035618),\n", + " ('young', 0.57531672344534313),\n", + " ('inner', 0.57454143815209846),\n", + " ('mexico', 0.57443087372056334),\n", + " ('clint', 0.57380042292737909),\n", + " ('sisters', 0.57286101468544337),\n", + " ('realism', 0.57226528899949558),\n", + " ('personalities', 0.5720692490067093),\n", + " ('french', 0.5720692490067093),\n", + " ('surprises', 0.57113222999698177),\n", + " ('adventures', 0.57113222999698177),\n", + " ('overcome', 0.5697681593994407),\n", + " ('timothy', 0.56953322459276867),\n", + " ('tales', 0.56909453188996639),\n", + " ('war', 0.56843317302781682),\n", + " ('civil', 0.5679840376059393),\n", + " ('countries', 0.56737779327091187),\n", + " ('streep', 0.56710645966458029),\n", + " ('tradition', 0.56685345523565323),\n", + " ('oliver', 0.56673325570428668),\n", + " ('australia', 0.56580775818334383),\n", + " ('understanding', 0.56531380905006046),\n", + " ('players', 0.56509525370004821),\n", + " ('knowing', 0.56489284503626647),\n", + " ('rogers', 0.56421349718405212),\n", + " ('suspenseful', 0.56368911332305849),\n", + " ('variety', 0.56368911332305849),\n", + " ('true', 0.56281525180810066),\n", + " ('jr', 0.56220982311246936),\n", + " ('psychological', 0.56108745854687891),\n", + " ('branagh', 0.55961578793542266),\n", + " ('wealth', 0.55961578793542266),\n", + " ('performing', 0.55961578793542266),\n", + " ('odds', 0.55961578793542266),\n", + " ('sent', 0.55961578793542266),\n", + " ('reminiscent', 0.55961578793542266),\n", + " ('grand', 0.55961578793542266),\n", + " ('overwhelming', 0.55961578793542266),\n", + " ('brothers', 0.55891181043362848),\n", + " ('howard', 0.55811089675600245),\n", + " ('david', 
0.55693122256475369),\n", + " ('generation', 0.55628799784274796),\n", + " ('grow', 0.55612538299565417),\n", + " ('survival', 0.55594605904646033),\n", + " ('mainstream', 0.55574731115750231),\n", + " ('dick', 0.55431073570572953),\n", + " ('charm', 0.55288175575407861),\n", + " ('kirk', 0.55278982286502287),\n", + " ('twists', 0.55244729845681018),\n", + " ('gangster', 0.55206858230003986),\n", + " ('jeff', 0.55179306225421365),\n", + " ('family', 0.55116244510065526),\n", + " ('tend', 0.55053307336110335),\n", + " ('thanks', 0.55049088015842218),\n", + " ('world', 0.54744234723432639),\n", + " ('sutherland', 0.54743536937855164),\n", + " ('life', 0.54695514434959924),\n", + " ('disc', 0.54654370636806993),\n", + " ('bug', 0.54654370636806993),\n", + " ('tribute', 0.5455111817538808),\n", + " ('europe', 0.54522705048332309),\n", + " ('sacrifice', 0.54430155296238014),\n", + " ('color', 0.54405127139431109),\n", + " ('superior', 0.54333490233128523),\n", + " ('york', 0.54318235866536513),\n", + " ('pulls', 0.54266622962164945),\n", + " ('hearts', 0.54232429082536171),\n", + " ('jackson', 0.54232429082536171),\n", + " ('enjoy', 0.54124285135906114),\n", + " ('redemption', 0.54056759296472823),\n", + " ('madness', 0.540384426007535),\n", + " ('hamilton', 0.5389965007326869),\n", + " ('stands', 0.5389965007326869),\n", + " ('trial', 0.5389965007326869),\n", + " ('greek', 0.5389965007326869),\n", + " ('each', 0.5388212312554177),\n", + " ('faithful', 0.53773307668591508),\n", + " ('received', 0.5372768098531604),\n", + " ('jealous', 0.53714293208336406),\n", + " ('documentaries', 0.53714293208336406),\n", + " ('different', 0.53709860682460819),\n", + " ('describes', 0.53680111016925136),\n", + " ('shorts', 0.53596159703753288),\n", + " ('brilliance', 0.53551823635636209),\n", + " ('mountains', 0.53492317534505118),\n", + " ('share', 0.53408248593025787),\n", + " ('dealt', 0.53408248593025787),\n", + " ('providing', 0.53329847961804933),\n", + " ('explore', 
0.53329847961804933),\n", + " ('series', 0.5325809226575603),\n", + " ('fellow', 0.5323318289869543),\n", + " ('loves', 0.53062825106217038),\n", + " ('olivier', 0.53062825106217038),\n", + " ('revolution', 0.53062825106217038),\n", + " ('roman', 0.53062825106217038),\n", + " ('century', 0.53002783074992665),\n", + " ('musical', 0.52966871156747064),\n", + " ('heroic', 0.52925932545482868),\n", + " ('ironically', 0.52806743020049673),\n", + " ('approach', 0.52806743020049673),\n", + " ('temple', 0.52806743020049673),\n", + " ('moves', 0.5279372642387119),\n", + " ('gift', 0.52702030968597136),\n", + " ('julie', 0.52609309589677911),\n", + " ('tells', 0.52415107836314001),\n", + " ('radio', 0.52394671172868779),\n", + " ('uncle', 0.52354439617376536),\n", + " ('union', 0.52324814376454787),\n", + " ('deep', 0.52309571635780505),\n", + " ('reminds', 0.52157841554225237),\n", + " ('famous', 0.52118841080153722),\n", + " ('jazz', 0.52053443789295151),\n", + " ('dennis', 0.51987545928590861),\n", + " ('epic', 0.51919387343650736),\n", + " ('adult', 0.519167695083386),\n", + " ('shows', 0.51915322220375304),\n", + " ('performed', 0.5191244265806858),\n", + " ('demons', 0.5191244265806858),\n", + " ('eric', 0.51879379341516751),\n", + " ('discovered', 0.51879379341516751),\n", + " ('youth', 0.5185626062681431),\n", + " ('human', 0.51851411224987087),\n", + " ('tarzan', 0.51813827061227724),\n", + " ('ourselves', 0.51794309153485463),\n", + " ('wwii', 0.51758240622887042),\n", + " ('passion', 0.5162164724008671),\n", + " ('desire', 0.51607497965213445),\n", + " ('pays', 0.51581316527702981),\n", + " ('fox', 0.51557622652458857),\n", + " ('dirty', 0.51557622652458857),\n", + " ('symbolism', 0.51546600332249293),\n", + " ('sympathetic', 0.51546600332249293),\n", + " ('attitude', 0.51530993621331933),\n", + " ('appearances', 0.51466440007315639),\n", + " ('jeremy', 0.51466440007315639),\n", + " ('fun', 0.51439068993048687),\n", + " ('south', 0.51420972175023116),\n", + " 
('arrives', 0.51409894911095988),\n", + " ('present', 0.51341965894303732),\n", + " ('com', 0.51326167856387173),\n", + " ('smile', 0.51265880484765169),\n", + " ('fits', 0.51082562376599072),\n", + " ('provided', 0.51082562376599072),\n", + " ('carter', 0.51082562376599072),\n", + " ('ring', 0.51082562376599072),\n", + " ('aging', 0.51082562376599072),\n", + " ('countryside', 0.51082562376599072),\n", + " ('alan', 0.51082562376599072),\n", + " ('visit', 0.51082562376599072),\n", + " ('begins', 0.51015650363396647),\n", + " ('success', 0.50900578704900468),\n", + " ('japan', 0.50900578704900468),\n", + " ('accurate', 0.50895471583017893),\n", + " ('proud', 0.50800474742434931),\n", + " ('daily', 0.5075946031845443),\n", + " ('atmospheric', 0.50724780241810674),\n", + " ('karloff', 0.50724780241810674),\n", + " ('recently', 0.50714914903668207),\n", + " ('fu', 0.50704490092608467),\n", + " ('horrors', 0.50656122497953315),\n", + " ('finding', 0.50637127341661037),\n", + " ('lust', 0.5059356384717989),\n", + " ('hitchcock', 0.50574947073413001),\n", + " ('among', 0.50334004951332734),\n", + " ('viewing', 0.50302139827440906),\n", + " ('shining', 0.50262885656181222),\n", + " ('investigation', 0.50262885656181222),\n", + " ('duo', 0.5020919437972361),\n", + " ('cameron', 0.5020919437972361),\n", + " ('finds', 0.50128303100539795),\n", + " ('contemporary', 0.50077528791248915),\n", + " ('genuine', 0.50046283673044401),\n", + " ('frightening', 0.49995595152908684),\n", + " ('plays', 0.49975983848890226),\n", + " ('age', 0.49941323171424595),\n", + " ('position', 0.49899116611898781),\n", + " ('continues', 0.49863035067217237),\n", + " ('roles', 0.49839716550752178),\n", + " ('james', 0.49837216269470402),\n", + " ('individuals', 0.49824684155913052),\n", + " ('brought', 0.49783842823917956),\n", + " ('hilarious', 0.49714551986191058),\n", + " ('brutal', 0.49681488669639234),\n", + " ('appropriate', 0.49643688631389105),\n", + " ('dance', 0.49581998314812048),\n", + " 
('league', 0.49578774640145024),\n", + " ('helping', 0.49578774640145024),\n", + " ('answers', 0.49578774640145024),\n", + " ('stunts', 0.49561620510246196),\n", + " ('traveling', 0.49532143723002542),\n", + " ('thoroughly', 0.49414593456733524),\n", + " ('depicted', 0.49317068852726992),\n", + " ('honor', 0.49247648509779424),\n", + " ('combination', 0.49247648509779424),\n", + " ('differences', 0.49247648509779424),\n", + " ('fully', 0.49213349075383811),\n", + " ('tracy', 0.49159426183810306),\n", + " ('battles', 0.49140753790888908),\n", + " ('possibility', 0.49112055268665822),\n", + " ('romance', 0.4901589869574316),\n", + " ('initially', 0.49002249613622745),\n", + " ('happy', 0.4898997500608791),\n", + " ('crime', 0.48977221456815834),\n", + " ('singing', 0.4893852925281213),\n", + " ('especially', 0.48901267837860624),\n", + " ('shakespeare', 0.48754793889664511),\n", + " ('hugh', 0.48729512635579658),\n", + " ('detail', 0.48609484250827351),\n", + " ('guide', 0.48550781578170082),\n", + " ('companion', 0.48550781578170082),\n", + " ('julia', 0.48550781578170082),\n", + " ('san', 0.48550781578170082),\n", + " ('desperation', 0.48550781578170082),\n", + " ('strongly', 0.48460242866688824),\n", + " ('necessary', 0.48302334245403883),\n", + " ('humanity', 0.48265474679929443),\n", + " ('drama', 0.48221998493060503),\n", + " ('warming', 0.48183808689273838),\n", + " ('intrigue', 0.48183808689273838),\n", + " ('nonetheless', 0.48183808689273838),\n", + " ('cuba', 0.48183808689273838),\n", + " ('planned', 0.47957308026188628),\n", + " ('pictures', 0.47929937011921681),\n", + " ('broadcast', 0.47849024312305422),\n", + " ('nine', 0.47803580094299974),\n", + " ('settings', 0.47743860773325364),\n", + " ('history', 0.47732966933780852),\n", + " ('ordinary', 0.47725880012690741),\n", + " ('trade', 0.47692407209030935),\n", + " ('primary', 0.47608267532211779),\n", + " ('official', 0.47608267532211779),\n", + " ('episode', 0.47529620261150429),\n", + " ('role', 
0.47520268270188676),\n", + " ('spirit', 0.47477690799839323),\n", + " ('grey', 0.47409361449726067),\n", + " ('ways', 0.47323464982718205),\n", + " ('cup', 0.47260441094579297),\n", + " ('piano', 0.47260441094579297),\n", + " ('familiar', 0.47241617565111949),\n", + " ('sinister', 0.47198579044972683),\n", + " ('reveal', 0.47171449364936496),\n", + " ('max', 0.47150852042515579),\n", + " ('dated', 0.47121648567094482),\n", + " ('discovery', 0.47000362924573563),\n", + " ('vicious', 0.47000362924573563),\n", + " ('losing', 0.47000362924573563),\n", + " ('genuinely', 0.46871413841586385),\n", + " ('hatred', 0.46734051182625186),\n", + " ('mistaken', 0.46702300110759781),\n", + " ('dream', 0.46608972992459924),\n", + " ('challenge', 0.46608972992459924),\n", + " ('crisis', 0.46575733836428446),\n", + " ('photographed', 0.46488852857896512),\n", + " ('machines', 0.46430560813109778),\n", + " ('critics', 0.46430560813109778),\n", + " ('bird', 0.46430560813109778),\n", + " ('born', 0.46411383518967209),\n", + " ('detective', 0.4636633473511525),\n", + " ('higher', 0.46328467899699055),\n", + " ('remains', 0.46262352194811296),\n", + " ('inevitable', 0.46262352194811296),\n", + " ('soviet', 0.4618180446592961),\n", + " ('ryan', 0.46134556650262099),\n", + " ('african', 0.46112595521371813),\n", + " ('smaller', 0.46081520319132935),\n", + " ('techniques', 0.46052488529119184),\n", + " ('information', 0.46034171833399862),\n", + " ('deserved', 0.45999798712841444),\n", + " ('cynical', 0.45953232937844013),\n", + " ('lynch', 0.45953232937844013),\n", + " ('francisco', 0.45953232937844013),\n", + " ('tour', 0.45953232937844013),\n", + " ('spielberg', 0.45953232937844013),\n", + " ('struggle', 0.45911782160048453),\n", + " ('language', 0.45902121257712653),\n", + " ('visual', 0.45823514408822852),\n", + " ('warner', 0.45724137763188427),\n", + " ('social', 0.45720078250735313),\n", + " ('reality', 0.45719346885019546),\n", + " ('hidden', 0.45675840249571492),\n", + " 
('breaking', 0.45601738727099561),\n", + " ('sometimes', 0.45563021171182794),\n", + " ('modern', 0.45500247579345005),\n", + " ('surfing', 0.45425527227759638),\n", + " ('popular', 0.45410691533051023),\n", + " ('surprised', 0.4534409399850382),\n", + " ('follows', 0.45245361754408348),\n", + " ('keeps', 0.45234869400701483),\n", + " ('john', 0.4520909494482197),\n", + " ('defeat', 0.45198512374305722),\n", + " ('mixed', 0.45198512374305722),\n", + " ('justice', 0.45142724367280018),\n", + " ('treasure', 0.45083371313801535),\n", + " ('presents', 0.44973793178615257),\n", + " ('years', 0.44919197032104968),\n", + " ('chief', 0.44895022004790319),\n", + " ('shadows', 0.44802472252696035),\n", + " ('closely', 0.44701411102103689),\n", + " ('segments', 0.44701411102103689),\n", + " ('lose', 0.44658335503763702),\n", + " ('caine', 0.44628710262841953),\n", + " ('caught', 0.44610275383999071),\n", + " ('hamlet', 0.44558510189758965),\n", + " ('chinese', 0.44507424620321018),\n", + " ('welcome', 0.44438052435783792),\n", + " ('birth', 0.44368632092836219),\n", + " ('represents', 0.44320543609101143),\n", + " ('puts', 0.44279106572085081),\n", + " ('fame', 0.44183275227903923),\n", + " ('closer', 0.44183275227903923),\n", + " ('visuals', 0.44183275227903923),\n", + " ('web', 0.44183275227903923),\n", + " ('criminal', 0.4412745608048752),\n", + " ('minor', 0.4409224199448939),\n", + " ('jon', 0.44086703515908027),\n", + " ('liked', 0.44074991514020723),\n", + " ('restaurant', 0.44031183943833246),\n", + " ('flaws', 0.43983275161237217),\n", + " ('de', 0.43983275161237217),\n", + " ('searching', 0.4393666597838457),\n", + " ('rap', 0.43891304217570443),\n", + " ('light', 0.43884433018199892),\n", + " ('elizabeth', 0.43872232986464677),\n", + " ('marry', 0.43861731542506488),\n", + " ('oz', 0.43825493093115531),\n", + " ('controversial', 0.43825493093115531),\n", + " ('learned', 0.43825493093115531),\n", + " ('slowly', 0.43785660389939979),\n", + " ('bridge', 
0.43721380642274466),\n", + "          ('thrilling', 0.43721380642274466),\n", + "          ('wayne', 0.43721380642274466),\n", + "          ('comedic', 0.43721380642274466),\n", + "          ('married', 0.43658501682196887),\n", + "          ('nazi', 0.4361020775700542),\n", + "          ('murder', 0.4353180712578455),\n", + "          ('physical', 0.4353180712578455),\n", + "          ('johnny', 0.43483971678806865),\n", + "          ('michelle', 0.43445264498141672),\n", + "          ('wallace', 0.43403848055222038),\n", + "          ('silent', 0.43395706390247063),\n", + "          ('comedies', 0.43395706390247063),\n", + "          ('played', 0.43387244114515305),\n", + "          ('international', 0.43363598507486073),\n", + "          ('vision', 0.43286408229627887),\n", + "          ('intelligent', 0.43196704885367099),\n", + "          ('shop', 0.43078291609245434),\n", + "          ('also', 0.43036720209769169),\n", + "          ('levels', 0.4302451371066513),\n", + "          ('miss', 0.43006426712153217),\n", + "          ('ocean', 0.4295626596872249),\n", + "          ...]" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most strongly associated with a \"POSITIVE\" label (highest positive-to-negative ratio)\n", + "pos_neg_ratios.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('boll', -4.0778152602708904),\n", + " ('uwe', -3.9218753018711578),\n", + " ('seagal', -3.3202501058581921),\n", + " ('unwatchable', -3.0269848170580955),\n", + " ('stinker', -2.9876839403711624),\n", + " ('mst', -2.7753833211707968),\n", + " ('incoherent', -2.7641396677532537),\n", + " ('unfunny', -2.5545257844967644),\n", + " ('waste', -2.4907515123361046),\n", + " ('blah', -2.4475792789485005),\n", + " ('horrid', -2.3715779644809971),\n", + " ('pointless', -2.3451073877136341),\n", + " ('atrocious', -2.3187369339642556),\n", + " ('redeeming', -2.2667790015910296),\n", + " ('prom', -2.2601040980178784),\n", + " ('drivel', -2.2476029585766928),\n", + " ('lousy', -2.2118080125207054),\n", + 
" ('worst', -2.1930856334332267),\n", + " ('laughable', -2.172468615469592),\n", + " ('awful', -2.1385076866397488),\n", + " ('poorly', -2.1326133844207011),\n", + " ('wasting', -2.1178155545614512),\n", + " ('remotely', -2.111046881095167),\n", + " ('existent', -2.0024805005437076),\n", + " ('boredom', -1.9241486572738005),\n", + " ('miserably', -1.9216610938019989),\n", + " ('sucks', -1.9166645809588516),\n", + " ('uninspired', -1.9131499212248517),\n", + " ('lame', -1.9117232884159072),\n", + " ('insult', -1.9085323769376259)]" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"NEGATIVE\" label\n", + "list(reversed(pos_neg_ratios.most_common()))[0:30]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Transforming Text into Numbers" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
ITBiD8G9077fkJiJ5O56gR2HhXNgC4ltRSO/m48J5w7/z6hcuL+53Ezo7TveizvMGkFXy5M1S\nJ29wWB6cv0jZUzJxOjq/keXLl1v/kbRvlnHl6vzgIJDnfydP3jwI77XXXp3s3aYCOokSHhxxxBGd\nlElfaCEjzn+PzFE+ZlHnSEvAQCcXXnjhMP8LXrIdSXFpivomAJoTLCG+fv60DGkOOuggl9RgPUGv\nrJJmmixNHbmJSJaOdwoSzc2Zl7gRcKRBYJXOzOTSQkT8myXKx4JpDRcxDmchJI9+ru4k30V2dpL6\n8qZhXjYc8S9vmf4/Zdqy8uRNW5dLj+UBX4x+Tsm4ut238wc577zzrC6OGLnr+hYCvRDI87+TJ28v\nvbpdf9/73te5/OMf/7hznPfAd+TEZ8ONA5TLy60fNZWoq05cBFF+48fn5+PY+fa59FHfLKZwgzzT\nzZ/61KeikkWe86f2IxOETrrpGXfa6Rc1LYP/iHshZ2xFL//Zzzjsj53hKKtEq3biynG/C/sOzDK5\nhKU+gTKdj7+cNWj0sNgSQSM6dYWXDJEvvO6a374EJshOPdTpL/UML99lyRKSVT90de3y20SZcdf8\n8/7yXadHcJN0ykQvJ36+cJtJQ/1OFzDwxZ3nO6ynny587JZlhs+n/e0v7UIH+oG+TSqk9dtHGZSZ\nVVgemWZZcvDgGPrGN76Rtbpc+aKW87Jcl/P9FnAAO+4Lt0QXHMMfAqGRJvBRqETPfuNSdH0O37zl\n1u3/jvs2sOYlbpb/P8+zMiz+czvueeA/+xhrnPjPUz9N+Nh/BpM/fL3bb1dfeNzplsevD12j0ro0\nfvtJFyU+hq4sfzmvnydcnksf/ga7sPjjlj/mhtPl+R3dwpQlZul4n1TQUDd4+f9gYVCS3izhzsii\nn58nPMDHXcva2X55eYiIu6nczdytG4t6IEb9M6AHDxf6kutRH66Rxunsf4fx7taO8DUCbKXZYI3B\nl4G/34N/N8LRL32oB7yIawH+fDuiAS5RH9Jz70BQyJMnIFq47wbhN5imIcpxmNTt/y5tu8LPcvf8\nd+31n6VpiQhlxz1b3HMm6hnj1+nSue8w4XBEhG9/oHbp+WaMYyxy5yjDlygd3bM7rIufzx2DmSvb\nfXcjCnH3jMvLOOTaFVdHuJ9curzfhRCRtB2PtcI1nm//puh2jVopbAYAACaLSURBVMaGO8gvh2M6\nNwxWWv2oxycHvn69rmXpbL+utESk282MrnGS9sERVw5YR+kQ7pekv6P6L67uqPNpIjz6Az549Euo\nCwtEN0E3yEoZgjXDEQmIB7976ROnB21xAdEgJRCVrGXF1dGm8/QpOOWVuv3fpX0BoP3+cyM8gPrP\n+bRExGHLs9ivg2cQ5CDqGevycI363POKZzO6cN6d49sfYxizfMLh8lBmHIHhWjgf5VIX4ref83Hi\nt89/oY9LT51hndA3PMa5/PSLa3dcP7i0eb7jW5ih1KQd74MHCGEJW0vCLA0w/TQA1Q1MV35S/UhP\nea4Dwp3U7Rp503a2X17UPwn1O11oty/dbmY/XfiYgS6NKTWc3/+NDn6fOl3TflMGZeURBlgG1iQS\nJh/h30nKSJPGTX8kzZM2fa9yaR+DYFmEwREc7iusJpJoBPi/KIKs1en/DkILGUkj/mAbfq6lKacf\naf1nMAP+oIhPsBxJKqPthRKRMhRUmeUhwIBU5Fs3/6w+qUpKRMgTJntZW530IR9FOnwLSdb64/Ll\nsXCga56Bi7oJvw1BSDtYxLWn23n0hRByf0Xh3C3vIFxLQ5aT4FGH/7ssfY1VwU1rJHmbT4JF1jT+\nixTPI//ll5dDpyfPF9IOgtA/7hkOJmXKKAoPKpMMIAJXXHGFjXzKio0iBe/05557zgZ5Y437L37x\ni2HF/8Vf/IUhWixLnj/ykY8MW/I2LGHKHxs2bLDr93utBHDxQ4KBe
UQNxPLgfJEhy6MCl42ouMeJ\nrGWAybnnnmumT59u5s+fX2i7eqhsWJIcvOkaljWyZYPk9wjcfPPNNvzAjTfeWCgkVf3f8f9EAL6A\n8KZuDytSXETSgFCVGnujm3K+Ht3ScS2wDGSKl9Sr3Lpd9zEpvc1lshyVXW8E2KiqqA24qm4p0wKz\nZ8+2/gq9dOn1lt7req/y/etYnPJYM/yy0lpV8N3ACpJ0qsqvq6hjdOYe41MUDkXpVkU5YMAqrdGj\nR7cGjyz+IT72zhpR9lu3X2fUse8bEpCCjjXAP677FFJUu7Kc861V/bAAaWomSy+1JA8PRQYqBoum\nC235q7/6K/uQh0jEkYm48377KauIKSvqoqwihfKStIE5ewb/ItpRhP7oU/RUYBF69aMM+oA+4+P6\nAyyqJIhFtjtvW/B1cYN9UVO0WduHH4TvF+H0wsEzyn8vaz11z0c/0HampPxpqrL01tRMgPYgC9Mz\nSNFm4n5j+vDDD5tbbrllWKRDoqU6IQCQm26JmpJx6dx3t+kblybu2wUpK3O/mG6b5tGnRHRcu3Zt\np81xuvbzPHqxWRtTZ20OY8+9409TRG1uyLTVI488MmxH8X72RVF1cR9OnTp1WHuLKlvlDA4CIiKD\n09eRLXXzu8GbWq0GrUhlu5w89NBDrQ/EaaedFpkKchBMRdldKknAXjO9CEmWsO/gSV39GGij/Ebq\nSkJcpzgy0vT7zbXHffukN8m9xT0CQSFfr/vQ1VHH79NPP90cffTR5tJLL62jetKpIQiIiDSko8pU\nk8GBEL9NfZhgDbnmmmvM5s2bY2EKk4okb60UFs4XW0FwIYoYdEtfxDWf+OAEedddd5mNGzfWmlTW\nnSwl6Rf6Opgm6yTNYv2iv9gjhNDgTRT+N7CGtI1UNrEvmq6ziEjTe7AA/RnMeIvDnNy0tzPeLI86\n6iizcOFCc/zxx0eiQfuQbm2LG1gon/y9LBw8lKNM8JEKFXwSHbH2/PM//7N5+umne+pacPWZimP3\n0MCHpTGracCYAddJEX1NmZSzZs0au+rEld2Ub1lDmtJT9ddTRKT+fdQXDdnxeMuWLY17O0uidxqr\nhgObPE7+93//13z0ox+NtDK4ASrLG7ErP++3G9BuvfVWEzc1lbeOovND7sDs3nvvjSWQRdeZtjz/\nHsDHqBcZTVs+6fEVWbRoUe2tWOG2Ob27WSHDefRbCMQhICISh8yAnWcww7IwZ84cU3RckbKgZKDA\nNMx3nLWDa3lJAthE+ZcwmHKtjAEqDWZMdbCV+tKlS9Nkqzytm1Kry1QS/ek7mea9b5ICjGUhWHnS\nGOsQ1kMsWk215CTtF6XrHwIiIv3DuvY18YDBVIwJuurBtRdYSawADCxIHEnpVYd/nfooD1z4JmDb\nu9/9bhPEg6hsSgb93KDQjYz57ajbcdXmfXBzksTJ1KUt8pv7CdLTBIsW/wdTpkyxLwBN9Skrsu9U\nVjEIiIgUg2NrSuEt9ZJLLqn1Ekv3MAwCIHVddlyENcTvWEdseGv2fQTi/Ev8vGUdVz2Q520XfdRP\nh8cq+6obVi4CLkt6ua/rKjNnzrTWt6Y62NYV10HXS0Rk0O+AiPbXeVWDIyF77rlnV3+WokkIMFE3\nUzS9pq78t+yyfAvQx1lDmr5qoUwyRZ8V7WQK9mXIv//7v9upUVbS1NEiWefnQhn9oTL7h4CISP+w\nblRN7qGzePHi2jwUHQkByG7BupzloogpGddplEn9DBBpSE54ICzS/E8fsV9P0/dxcVYR3z/D4Z7l\nu19EMItucXnc/bV169ZaWiTd86Db/11c23ReCPRCQESkF0IDfJ2HT10iYfKg/vSnP232228/u8rA\nRUmN6p40RCEqf/gclgfqc8QGcoE+Wd5ayecPuP4UT7jebr/Rgby01enVLX3drxGQrtsS7G76hzHt\nl5NpN53SXEN/xPWjmx6tg88I9
xk+IYhIiIVBf0pA4E//XyAllKsiW4BAsNmRHfhZTbPvvvsaBosq\n5MEHH7R+BGeccYYdrN75znfGqlEGCWGA2GuvvTp1Uj9Levnupksng3cAocEq4j7btm0zP/7xjy2x\n+fWvfz2sHi/biMNvf/vbxr09j7jYwBPvete7zLPPPmu453oJg+Mrr7xiMWMQZ/oLUuYw7ZW/TtfD\nJATdDjzwQPu/NmvWLPPb3/7WfPzjH69EZf6XWA7O0vXbb789cvl6JYqp0tYhIItI67q0+AZhETjz\nzDPN/vvvb4gG6d7ciq9peIkMOCwnfuyxx6wVhCWO3awQUQ/14SWm+8WDuJvFomjSQ3t9f4Zu0zhY\nq5ocDTfcE/QdlgzfWuSn8Z1My/S78ess+7jX/cp1rIDIggULci9DT9oe7sOvfvWrlnxcd911PX2i\nkpardEIgFoGydtNTue1CgF1fb7rpJrsjIzupBgNGaQ10dQWEZ2jGjBmdHWw5HwzUsfUm2ZU2NrN3\ngXqSlpU0nVd84kMwpnz3QS8n7HjaDQuXrknffptcH0S1vUltitOVvk36P8T/Hf8LZf/foWvgjG3r\nmjZtWmL94tqo80IgKQImaUKlEwIgwMOTB2LAbO13kYMhZbuHLg/CqEHeDVDh3ohKG06T5Dc6pGkT\n6fn0Q9CLdr700ksW/37U2c86zj333KGrrrrKtjFNH/RTxyLqynLPcN/7/3dF3e+0h7L5v4MI8lm/\nfn0RzVQZQiAxAn8SayrRBSEQgQDTMjfeeGPHhE6ERT5M2TBVkVYwuRMumjKYiti+fbuN2Eicgiin\nQ3wsOE9dmJARTNjkzSvognSb/gnXAR7o4XQJXy/yN3rR9t/97ncmIGpFFl2Lsg4//HDzZ3/2Z7aN\nafqgFsonVKLXdExcMdz37v+OKTn8R/DZYouDLP936IFTLHFBmOrC52bJkiV248i4PZvidNN5IZAX\nAfmI5EVQ+Q3BmPDjCN6k7H41DJJjx461PgzAwxJTHnY7duywaO3atcume/755+3vyZMnm1NOOcVM\nmjQplUMcxIEHdPCGGUlabOEJ//Aw7+YP0qsY8kcRp175slyHuL366qtdg7llKbfqPGCIL0Rbg2Vl\nJSFx/cL/HRF+2eiQD5sIsqpswoQJnSzjx483r7/+uv0d9393xBFH9M3vq6OYDoSAh4CIiAeGDvMj\ngGUgMKvbFR08+BCsHOyF4j8gJ06caK0YWBTyyDe/+U3zvve9L5UVw6/P6ZuXRFAOA00/3uSxPiFt\nC7HdZiJSNAnx72GO3X3MSqpu/3cQk7333rsv92lYR/0WAnEIiIjEIaPztUfAPdyxikB+0pIJ8vMA\nL4o8YKGBWKFPmdJWIkJfYDkLJpbLhK/vZbv7NC/p7rviqlAI9AkB+Yj0CWhVUzwCTMm4gR8Swhs1\ng1kSyeIP0qtcCA2ESJINgbIJXDat8uVy95lISD4clbvdCIiItLt/W9u6KJ8MyAhvn+4NNK7x5GVg\nKGNwcIQorm6dHxwEnIWsjPtscFBUSwcBARGRQejllrURohG3SsZNs7g3Ub/pWEscgSnz7RvdepEh\nXy8d/x4BMGvLoO1ISJn3me4bIdAWBERE2tKTA9QONyUT12Rn7YB0OGGQ45PWj8TlT/NN/ZCepNNE\nacpuc1r6FSfmpotISNN7UPr3G4F39LtC1ddeBBh48ZFgWa5bKhjVWre0Fw9+9tVI8xbsLBpR5frn\neBN10yR/+qd/av76r/+6MKdUv564YywzSXWNKyPuPMuhN27cGHe5seeDwFqN1d0pLhLikNC3EEiO\ngIhIcqyUMgIBrAxPPvmkDUrmYhkcdthhNobI3LlzI3IYu7SXJb2LFy82q1atMkE0Rxugi03t3NRK\nVEbqipuSiUrPOVZh/OY3v4m7XOp54pIwMHVrUxYFxo0bZ/HOkrfOeYh3wb3QVBEJaWrPSe+qEdD
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
uAvnnHFNwOHMTNt4N/jZBhX+cHnJMrbATVLUQSIlAIywivKHx0GOKglgi\nZ5xxho1t4Jw4Gdj9wd3HgAdm1dJNPxe3oZuOZbSXYFXEicgrrFBieqxICTvzpikbcsVbe90Fnw8G\nTawQzockqc6kdzsJJ83jpyMv8XUWLlzYCUgWhHzvGQsEAuULRCZYWmwee+wxax1Br9mzZ9vYJH66\nfh2LhPQLadUjBApGoMi1wEWWxV4sQVOHfQIi0qkiICcjQrSH0xOvwxc/fodflp+GY78cYntECfld\nunA97jzf4RDx/rVwvrg4ItSfpb1RertzxFQI3hrdz9Z8VxniPQuIgb9Q5vDqafdKcTFW6HdiyBQZ\nV4N2VLnXjOKEZLn7lEcI1AOB2k7N4FcRDNSxsS7wq1i3bp1NE+wZE4zvfxQiofKWniUK6h9LKeaI\nMOa8fWLNcYK+AdFKpV/R7XWm6zwrOVx76vS9atUqc+CBB9ZJpa664DdCX6R1YiUfH2cF6FYJlgsc\ngKdOnWqj6bL65tJLL+1pAelWZvgauhCldfPmzTZ0/DXXXGOnbfpxf7k63D0d1k2/hYAQqDcCo+BD\n9VZR2pWFAL4BbNde123a07Z7w4YNJtiAzQ6GafPWIT1kJK0Ta68pGq5DyPfff3/ry9HPwRrfkc9/\n/vMmsL5Y4lMGxpAQ2gQRkggBIdBMBGprEWkmnM3SGufD5cuXN0vpLto+99xz1uehS5JaX8rixNpt\nSS99ixVkzpw5dk+YfpIQgMbqgvVlzZo11iG2CAdbvwNFQnw0dCwEmouALCLN7bvcmjMw4BgazK8X\naqbPrVjGAljaysZtaZ0/M1ZXWjamW+ibpO1w0zM+0bjiiivMypUra4EHbYEMvfnmm2bt2rWFWC9E\nQkq7/VSwEOg7ArKI9B3y+lSIOZupjGXLltVHqYyasAx19OjRiQfvjNX0JRuEgk9SvxHSMtjzQSAh\nrCjDGpGUzJTZMO4zdvjFT2rKlCkdPbPWKRKSFTnlEwL1REAWkXr2S9+0YrDDfM+g1eR59g984APm\nkksuMTNmzOgbdv2oiP5J6jdCWhyjISFFWR6KbqMjSVn1EwkpukdUnhCoHgFZRKrvg0o1wMdg4sSJ\n5u67765UjzyVYw3B55qYJgzG/idPuXXIm8ZvhGBuTMd8/etfry2pvPHGG81+++1nzj///NTwioSk\nhkwZhEAjEJBFpBHdVK6SDNxYRfj2/QzKrbW40g899FC7ZJTlo2GhTb7gE1OH6QpfpyTHvfxGGKSx\nnNRlOqZbm5hCYorm2GOPNfPmzeuWtHNNJKQDhQ6EQOsQEBFpXZdmaxAm87ffftvO5WcroZpcLBFl\nVQZOqkmEQZDB2pekUx9+niqOne5YSXzhPMuwcQhtylJsiAXh4em7cHv8tnEsEhJGRL+FQLsQEBFp\nV39mbg2DGQPyrbfeWlmI7rTKF2UFoJzwjsi9Bse0uhaZHiuPT54IVrZlyxa7RLfIesoui+XFixYt\n6hr3JdzWsnVS+UJACPQfARGR/mNe2xp56DNF04QlsGVbAcJTOiwNrtO0FeTJORejW1OXYGMV4Z4j\n5khY6IM6E8KwvvotBIRANgRERLLh1tpcTHXcddddZuPGjZ2Bro6NZQBjOSjOj/0QfDQY7H3xrRL+\n+X4do9MXv/hFs++++yb2teiXbknrceQ3vGpLJCQpgkonBJqPgIhI8/uw8BbkXWJZuEKhAuuiX3hK\np9+OsBARrCHoccABB4RQas7P008/3e6B46wiIiHN6TtpKgSKQEBEpAgUW1hGXQZ7H1qmY9hMra5x\nMtAv7Ahb5pQOfYT0yyrk90WRxxAP9sNhwzyRkCKRVVlCoBkIiIg0o58q0ZKBbv369TZIVtVLXhnk\nWfKJZA2GVQWIUVM6Rfk9QHKa4M+TBHeWYF9wwQXm4osvTpJcaYSAEGgRAu9oUVvUlIIR4
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "\n", + "review = \"This was a horrible, terrible movie.\"\n", + "\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAi4AAAECCAYAAADZzFwPAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQVdV5/xdNZjIxjRgrM52qFI01ERQVExWNeMMLQy0YiEiNEgOYaJAO\nitIaGYo2TFGQeElQAREjRa0oDEG8AKagosYYkEuSjjUEbP+orZFc/KMzmfe3Pys+57fOfvfZZ1/P\nWXu/zzNz3rPP3uvyrO/a717f/axnPatfTyBGRRFQBBQBRUARUAQUgQog8CcV0FFVVAQUAUVAEVAE\nFAFFwCKgxEVvBEVAEVAEFAFFQBGoDAJKXCrTVaqoIqAIKAKKgCKgCChx0XtAEVAEFAFFQBFQBCqD\ngBKXynSVKqoIKAKKgCKgCCgCSlz0HlAEFAFFQBFQBBSByiCgxKUyXaWKKgKKgCKgCCgCioASF70H\nFAFFQBFQBBQBRaAyCHy8MpqqooqAItAVBH784x+bPXv2mJ07d5q9e/eat99+2+zYsaOXLuPGjTOH\nHHKIGTp0qBkyZIg59dRTzac//ele6fSEIqAIKAJ5EOinkXPzwKd5FYF6IrBp0yazYcMGs2rVKjNg\nwAAzcuRIc8IJJ5jBgwebgw8+2Hzuc59ravh//dd/mf/8z/807777rtm1a5d58cUX7Qcyc8kll5gv\nf/nLSmKaENMfioAikBUBJS5ZkdN8ikDNEPjtb39rli9fbh566CHbshkzZpgLLrjA/MVf/EWmllLe\nxo0bzfr1682yZcvMjTfeaG644YbM5WVSQjMpAopA7RBQH5fadak2SBFIj8A999xjPv/5z5stW7aY\nJUuWmO3bt5tJkyblIhlME1166aVm6dKl1hqDVocffriZOXOmwUKjoggoAopAFgSUuGRBTfMoAjVB\nAP+Vk046yaxZs8Z+nnzySfPFL36x8NZhtVmwYEGDwFDHihUrCq9HC1QEFIH6I6BTRfXvY22hIhCJ\nAFaW+fPnm3nz5lnrSmSikk5CmKZOnWqOOeYYOz2lTrwlAa3FKgI1REAtLjXsVG2SIhCHAL4nU6ZM\nsRaWzZs3d5y0oBsWl61bt5pBgwbZKapf/OIXcSrrNUVAEVAEGgioxaUBhR4oAvVHANIyZswYc+ih\nh3pj6WDK6JZbbjGQqPBqpfr3iLZQEVAE0iKgcVzSIqbpFYGKIiCkZdiwYdbfxJdm4ATMEuvzzjtP\nyYsvnaJ6KAIeI6DExePOUdUUgaIQ8JW0SPtYfYQoeRFE9FsRU
ARaIaDEpRUyel4RqBECc+fOta1h\nZY+vAnn5zW9+YyZMmGD9X9Rh19eeUr0Uge4ioD4u3cVfa1cESkdAfEh+/vOfVyJ6LY7DH3zwgWFp\ntooioAgoAmEEdFVRGBH9rQjUCAECveH4SpyWqlgwFi1aZPdD0jgvNboRtSmKQIEIqMWlQDC1KEXA\nNwTGjx9vTjzxRDN79mzfVIvVhzgvY8eONVWxEsU2Ri8qAopAoQgocSkUTi1MEfAHgaoP/mwNgPjs\nl+NPb6smikDfQUCJS9/pa21pH0MAaws7M7PcuIrCNBd7G7HrdNaNHqvYbtVZEVAE4hFQ4hKPj15V\nBCqJgFhbGPSrLGp1qXLvqe6KQDkIKHEpB1ctVRHoKgKszBk6dKiZPn16V/XIWzlWF7YHUF+XvEhq\nfkWgPgjoqqL69KW2RBGwCBBsbtmyZYapoqoLU0RsA7Bx48aqN0X1VwQUgYIQUOJSEJBajCLgCwIM\n8pMnT66NXwg+OuvXr/cFXtVDEVAEuoyAEpcud4BWrwgUjQCD/FlnnVV0sV0r74ILLrAWpK4poBUr\nAoqAVwgocfGqO1QZRSA/Ahs2bDCnn356/oI8KYHponPPPdfgcKyiCCgCioASF70HFIEaIYAzK4Jf\nSJ2EHa23bdtWpyZpWxQBRSAjAkpcMgKn2RQBHxFg+fPw4cN9VC2XTieccILZt29frjI0syKgCNQD\nASUu9ehHbYUiYBHAKjFo0KDaoTF48GCzd+/e2rVLG6QIKALpEVDikh4zzaEIeI3AwIEDvdZPlVME\nFAFFIA8CSlzyoKd5FQFFoCMIfP7znzerV6/uSF1aiSKgCPiNgBIXv/tHtVMEFIEAgU9/+tOKgyKg\nCCgCFgElLnojKAKKgCKgCCgCikBlEFDiUpmuUkUVgb6LwC9+8YvaRALuu72oLVcEikFAiUsxOGop\nikDXEWCPogMHDnRdjzIU+M1vflPLZd5lYKVlKgJ1R+DjdW+gts8PBIh6umfPHrNz5067rPXtt982\nO3bsaFKOCKnEIDnkkEPszsYcszOwSmsEICvsnIwcfPDB5vjjj6/lvj4QFxVFQBFQBEBAiYveB6Uh\nsGnTJrNq1SpDCPoBAwaYkSNHGgKJTZgwwQ6y4eiuRH0lgNq7775rdu3aZWbNmmVefPFFu2Hg6NGj\nbX510jRGcKLjICsuuWOAf+edd0rr024VvHv3bjNixIhuVa/1KgKKgEcI9OsJxCN9VJWKI4AFYPny\n5Wb+/PmWrMyYMcOwSR7WlCxCeex2vHLlShvyfeLEieaGG27IXF4WHXzI45KVww8/PLb9/fr1MxCY\nOpG88ePHmyuuuMJceumlPnSH6qAIKAJdREB9XLoIfp2qhmDcc889hngbW7ZsMWvWrDHbt283kyZN\nih1k22HA4Mtg9eSTTzY22WPgnjlzpqHOOgsOqUyxyeaCWFb4tCOBbEj4+uuv1woaIgKfdtpptWqT\nNkYRUASyIaDEJRtumstBgIH1rLPOahAWSIY7feEkzXXIgL1gwQI7nfTBBx9YkrRixYpcZfqWWYgK\n37Q3KVlx2wFxeeWVV9xTlT4GC6Ya2xG2SjdSlVcEFIHECOhUUWKoNGEUArfffru5//77zbx586x1\nJSpNWecY0KZOnWq+8IUvmEWLFlV2aoR2iBRB+LDU4EeExasOwj32P//zP+buu++uQ3O0DYqAIpAT\nASUuOQHsq9mZphkzZoxt/qOPPtq1t2H0wI8Gh9TFixebsMOvj/2DzrISCP2KICvhdp500klm4cKF\n5vzzzw9fqtxvpgbXrVtnPvWpT1WifysHsCqsCFQMAZ0qqliH+aCukJZhw4aZtWvXdo20gAU+MEuX\nLjVMj5x33nkGa4OPgnMtlhU+HMsUUBmkhfZ//etftyu6fMQijU5PP/20JSvca0wVudapNOVoWkVA\nEagPAmpxqU9fdqQlLmnB3
8QnYZCbNm2a2bx5sxdv5mlWAhWNI/3EUmmWl1fZNwTL0Zw5c5pWE0Fe\nyiJ8RfeDlqcIKALFI6DEpXhMa1uiz6RFQO82ecHiI8HS2i1bFp3L+IY0sST997//vbVIlVFH2WXS\nl3Pnzo301VHyUjb6Wr4i4C8CSlz87RvvNJsyZYphNQ+rhnwWnDkJXMc0VidimbjTFywH70SdcfhD\nntCBD/qwNL1qFgpIMivVwtYWt93g7gPerk56rAgoAuUjoMSlfIxrUQMxWh566CGzdevWrg/MSQAl\nYBlbB+D/Uoa4ZMUnUhAezFkuzoqrqvSb9BXkky0h2pFkSBpTYd0mi6K3fisCikD5CChxKR/jytfA\n4MCbLSthqrBqB8B5Y0fnRx55pJCVNZRX9kqgvDcKpCWKREHiTjzxRDN79uy8VXQkP+0YO3asdcRN\n4p+j5KUj3aKVKALeIKDExZuu8FcRBj72iZk+fbq/SkZoxl5JV111lSUcWd7IXbKCo6uvpE30jCIt\nwCKrmO67774mJ9cIyLp+KquuSl663nWqgCLQMQSUuHQM6mpWJA6SVZtqELTTki53JZDPZEXah74Q\nl3akSkicLyuuRH/3m3YQG4ilz1lWrEFeIKhJrDRuvXqsCCgC1UJAiUu1+qvj2kYtR+24EjkqZDAj\nvgvTPK2sLqTxYSVQ2mZCWpAkAzVpf/SjH5mbbrrJm+XibnuFtBx99NG5/JLSYOLWr8eKgCJQHQQ+\nXh1VVdNOI4C1BanyjrxYIthRmh2r3akul6zgC9POYtFp7NvVl9a6QDyXv/3bvzWf/OQnLZHzyfIi\npIU240icRyBxkBc+SQhdnro0ryKgCHQHAa8i5z7xxBOmX79+9sNgUzVx9acdrki7+H711VfdS22P\nTznllAYuS5YsaZu+qAQrV660y1GLKq9b5bBvDzFNcPqUD4MaPiF8WlliuqVvu3ppA/onHZhJL/4v\nkFB8XSBrQkzb1VfmdQiYTA9BporoC8GFslUUAUWgfgh4RVzqB291W8Qb6+rVq83IkSOr2whHc/a5\nYTqoqmRFmiIkJOkATz8SCM8VyMvrr79uowzPnDnT+si41zt1DHFiGo8VRFl8WuL0FGKn5CUOJb2m\nCFQTASUu1ey30rV+4YUXzOTJkwt5Ay5d2TYVQFa+/e1vmw0bNrRJ6e9lplOEtKTRslXIfzChvL17\n99pAbxx3SiBTBDNkewaC47lTeEXqALmDwCh5KRJVLUsR6D4CSlwK7IPLLrvM9PT0ND4FFt3xotiN\nd/To0YXWu2fPHsNU13XXXWf9TtzpMzlmWoxpwjvvvNM899xzDafZvIqcfvrpld10kIGej0z3JMWi\nHdFhUCfAG7trY/VgBVaZBAbyRSBD2kFwQBym07YpadslnZIXQUK/FYEaIRAMtN7I448/3hNAaz+X\nX3651evBBx9snOPaLbfc0rN///6WOm/bts2mkXL4PvLII3so58CBA5H53LTk53PhhRfaesmHJEnj\n6k96V8L5n3322UYdXKM+zkVJsDy0Ub/o46aLajP4oU9WQafgbT1r9qZ8Lp4uDkmO6bs77rijZd81\nVdTmRzBQ9wSDZZtUfl2mD7L0Q9p8wTRaz913390DRuPGjevZuHFjYUCA+Y033tgouxt9QPuC6bHC\n2qQFKQKKQPcQaB5du6eHrdkd+Bl4+UQNbgxmUeSFAS4qvZwj3+7du3u1Uq7zHS5DiEKSNK7+pHfF\nzQ/5cn+7x1KfmzeOuJDezR8+pq60wsASRFpNmy0yfTv9wvq2+g0GUX0eWWmLk8HUV89TTz3V4qp/\np+mHLKSFlmQdpBngH374Ydv/kBgIByQmrR7Uf9tttzWVk7aMMnokKy5l6KJlKgKKQDYEvF0O/dhj\njwVjWLQEA5iNR7Fq1apGAqYgbr755sbvqAPyXXzxxWbXrl2G4GJR0q4M8iRJE1W2nJs3b54
c9vq+\n5pprzAknnGCY2mgnTKWQPk6oa9CgQWbq1KlxyZquMaVzzDHHNJ3L8iOJfknLffPNN63PDWVmlaFD\nhxrugSoIUzas/EnqhOu2qd0UkZs2fEx9kyZNsh98Q8B78eLF1lGbbQO4L7ifBg4cGM5qtmzZYt5/\n/327weW5555r+CxcuLCQLRd6VZbxhPj2lD1FlVE9zaYIKAIJEPDaxyWYPjGBhcT6jATTPIbfIi6x\nYbUIm7KJEHkzmJ6w+QI+ZwKrg1yyA9cDDzzQ+B11QHry8Wk14CdJE1W2nAssEY06AkuNnLbf7K+T\nRNx2uVgxOAfWqkYRYANGSYX8hPjPK3fddVevItCTtvOhj8KfYLrMXqNt9KMrzz//vB1I3XNpjocM\nGWIH1zR5upFWiEcW0hK1iihrG4htg+MsfjD8L3Cfzpo1K5K0UMe1115rl52TlqXN7I10/vnnZ62+\ntHxCXvC5UVEEFIEKIhA8ZLyR8FRLMIA26YavRABx4yM+K+F8pAuLO+3ElJErbpmki5IkacJ6uOW4\n+YNB2b1kj8NTKtI2LkZNFTHl5ZYZxor8tFPSoFtSwdeBTx6hfqmb71bTdO3qCOPCVF5WYZoA/w1f\npQg/DJ0KSd67TMWBuYoioAhUCwFvLS68bR9xxBHBmPf/JfxbrAg7duxoJAoGyMhplq997WuNNFgU\nmA6JEuJKtJMkaeLKuOSSS3pdPvPMM5vOvfvuu02/wz+Y7hLBihHGhqmwK6+8UpKYX/3qV43jdgeY\n/LFO5JEwvsTpGDx4cOoisXi5lhemjOooWVcOuViIpcY9p8etEcCiBO5qeWmNkV5RBHxEwFsfl2OP\nPTYxXu+8804jbZgAyIX+/fvLof0W0tN0MvgRThe+zu8kaaLyybkwyeB82OemlX5SRmDRkEPDFArL\niePkgw8+iLvc61pYn14JUp6I8olIWgT3Ql0JCxgweCJ5th0ocorIKtNH/oA5vjwsDc8yNddHYNJm\nKgJeIeCtxcUrlFSZ3Ajs3LkzUxkQuJdffjlT3ipkkuBoDJx5JFixk3gLgDz11DGvWF6EQNaxjdom\nRaBOCNSCuLCjrEirQc61UJC2aIuC1J/kG4fjsIQtLFFWGTePa/XBETeYoYz9fOc733Gzxx6zaiQ8\n1RObIeJieFUUDsJp91liuuzv//7vm1YC5Z2mi1C1a6eY2oGw5CUtOkWUvwvF2qXkJT+WWoIiUDYC\n3k4VpWk4yzRF8F9hE8PwwBnEppAkBj+YLP4WjQJyHuBDctFFFzWV4hIu9GtHXI4//vhGfvJCfIoi\nY0zrhIleo7IUB6wyYSktQr+wdJsPROszn/mMOfnkk3uVxpQW00L//u//Hjk91GoqsFdBnp8oimx0\nYoqIOl577TXbh9y7iCx75hjiNXz4cA4N/4vcP/z/CRmwFyrwh3bQVj55yWQFmqsqKgKVRaAWxIXY\nLAz2DI7It771LfO9732vQV7Yp8ZdPu06rXaj58KxVdhV2o3HkkQ/iBdOqwzytPsrX/mKWbRoUYOQ\nUSYb6Akmwaoiw5YESaUI4sJeND/84Q8bOkjdbl/IuSTfLJHOQzixImFN6qbgCBqsZiks1D1TRGXE\nJGEKi3uIjTbfe+89S0xYIn/FFVc0SLXUy0CPHgjL27du3WrvRfKNGjXKbh3Bxo5VECEvtL9qxKsK\n+KqOikAhCPi0CMpdThy1LDkYhJuW2PJbJLxsNgCnKa38DghOr/Dxco3vVsuGk6Rx9Se9K27+dsdu\nuygjajk058P1tSqX/GmkyGXDbGMA5q10S3o+sN706rc0bSItkVzzLvNOW6ebnsixLMEtSspY+kxk\nYaImBwO4xStPHbSXKLxBIDpbHthzrgoSWDAL7asqtFl1VASqgkAtfFyCwc8GinMDsnEuLFhlCHBW\n1JRKuPykv4NYJC2TEpit3TSRZMaCQvo4wSqzdu3auCS
[... remaining base64-encoded PNG image data elided ...]\n", + "text/plain": [ + "" + ] + }, + "execution_count": 27, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review = \"The movie was excellent\"\n", + "\n", + "Image(filename='sentiment_network_pos.png')" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "# Project 2: Creating the Input/Output Data" + ] + }, + { + "cell_type": "code", + "execution_count": 74, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "74074\n" + ] + } + ], + "source": [ + "vocab = set(total_counts.keys())\n", + "vocab_size = len(vocab)\n", + "print(vocab_size)" + ] + }, + { + "cell_type": "code", + "execution_count": 75, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['',\n", + " 'inhabitants',\n", + " 'goku',\n", + " 'stunts',\n", + " 'catepillar',\n", + " 'kristensen',\n", + " 'senegal',\n", + " 'goddess',\n", + " 'distroy',\n", + " 'unexplainably',\n", + " 'concoctions',\n", + " 'petite',\n", + " 'scribe',\n", + " 'stevson',\n", + " 'sctv',\n", + " 'soundscape',\n", + " 'rana',\n", + " 'metamorphose',\n", + " 'immortalizer',\n", + " 'henstridge',\n", + " 'planning',\n", + " 'akiva',\n", + " 'plod',\n", + " 'eko',\n", + " 'orderly',\n", + " 'zeleznice',\n", + " 'verbose',\n", + " 'amplify',\n", + " 'resonation',\n", + " 'critize',\n", + " 'jefferies',\n", + " 'mountainbillies',\n", + " 'steinbichler',\n", + " 'vowel',\n", + " 'rafe',\n", + " 'bonbons',\n", + " 'tulipe',\n", + " 'clot',\n", + " 'distended',\n", + " 'his',\n", + " 'impatiently',\n", + " 'unfortuntly',\n", + " 'lung',\n", + " 'scapegoats',\n", + " 'muzzle',\n", + " 'pscychosexual',\n", + " 'outbid',\n", + " 'obit',\n", + " 
'sideshows',\n", + " 'jugde',\n", + " 'particolare',\n", + " 'kevloun',\n", + " 'masterful',\n", + " 'quartier',\n", + " 'unravelling',\n", + " 'necessarily',\n", + " 'antiques',\n", + " 'strutts',\n", + " 'tilts',\n", + " 'disconcert',\n", + " 'dossiers',\n", + " 'sorriest',\n", + " 'blart',\n", + " 'iberia',\n", + " 'situations',\n", + " 'frmann',\n", + " 'daniell',\n", + " 'rays',\n", + " 'pried',\n", + " 'khoobsurat',\n", + " 'leavitt',\n", + " 'caiano',\n", + " 'sagan',\n", + " 'attractiveness',\n", + " 'kitaparaporn',\n", + " 'hamilton',\n", + " 'massages',\n", + " 'reasonably',\n", + " 'horgan',\n", + " 'chemist',\n", + " 'audrey',\n", + " 'jana',\n", + " 'dutch',\n", + " 'override',\n", + " 'spasms',\n", + " 'resumed',\n", + " 'stinson',\n", + " 'widows',\n", + " 'stonewall',\n", + " 'palatial',\n", + " 'neuman',\n", + " 'abandon',\n", + " 'anglophile',\n", + " 'marathon',\n", + " 'chevette',\n", + " 'unscary',\n", + " 'eponymously',\n", + " 'spoilerific',\n", + " 'fleashens',\n", + " 'brigand',\n", + " 'politeness',\n", + " 'clued',\n", + " 'dermatonecrotic',\n", + " 'grady',\n", + " 'mulligan',\n", + " 'ol',\n", + " 'bertolucci',\n", + " 'incubation',\n", + " 'oldboy',\n", + " 'snden',\n", + " 'plaintiffs',\n", + " 'fk',\n", + " 'deply',\n", + " 'franchot',\n", + " 'cyhper',\n", + " 'glorifying',\n", + " 'mazovia',\n", + " 'elizabeth',\n", + " 'palestine',\n", + " 'robby',\n", + " 'wongo',\n", + " 'moshing',\n", + " 'eeeee',\n", + " 'doltish',\n", + " 'bree',\n", + " 'postponed',\n", + " 'gunslinger',\n", + " 'debacles',\n", + " 'kamm',\n", + " 'herman',\n", + " 'rapture',\n", + " 'rolando',\n", + " 'tetsuothe',\n", + " 'premises',\n", + " 'bruck',\n", + " 'loosely',\n", + " 'boylen',\n", + " 'proportions',\n", + " 'grecianized',\n", + " 'wodehousian',\n", + " 'encapsuling',\n", + " 'partly',\n", + " 'posative',\n", + " 'calms',\n", + " 'stadling',\n", + " 'austrailia',\n", + " 'shortland',\n", + " 'wheeling',\n", + " 'darkie',\n", + " 'mckellar',\n", + " 
'cushy',\n", + " 'ooookkkk',\n", + " 'milky',\n", + " 'unfolded',\n", + " 'degrades',\n", + " 'authenticating',\n", + " 'rotheroe',\n", + " 'beart',\n", + " 'neath',\n", + " 'grispin',\n", + " 'intoxicants',\n", + " 'nnette',\n", + " 'slinging',\n", + " 'tsukamoto',\n", + " 'stows',\n", + " 'suddenness',\n", + " 'waqt',\n", + " 'degrading',\n", + " 'camazotz',\n", + " 'blarney',\n", + " 'shakher',\n", + " 'delinquency',\n", + " 'tomreynolds',\n", + " 'insecticide',\n", + " 'charlton',\n", + " 'hare',\n", + " 'wayland',\n", + " 'nakada',\n", + " 'urbane',\n", + " 'sadomasochistic',\n", + " 'larnia',\n", + " 'hyping',\n", + " 'yr',\n", + " 'hebert',\n", + " 'accentuating',\n", + " 'deathrow',\n", + " 'galligan',\n", + " 'unmediated',\n", + " 'treble',\n", + " 'alphabet',\n", + " 'soad',\n", + " 'donen',\n", + " 'lord',\n", + " 'recess',\n", + " 'handsome',\n", + " 'center',\n", + " 'vignettes',\n", + " 'rescuers',\n", + " 'pairings',\n", + " 'uselful',\n", + " 'sanders',\n", + " 'nots',\n", + " 'hatsumomo',\n", + " 'appleby',\n", + " 'tampax',\n", + " 'sprinkling',\n", + " 'defacing',\n", + " 'lofty',\n", + " 'opaque',\n", + " 'tlc',\n", + " 'romagna',\n", + " 'tablespoons',\n", + " 'bernhard',\n", + " 'verger',\n", + " 'acumen',\n", + " 'percentages',\n", + " 'wendingo',\n", + " 'resonating',\n", + " 'vntoarea',\n", + " 'redundancies',\n", + " 'red',\n", + " 'pitied',\n", + " 'belying',\n", + " 'gleefulness',\n", + " 'bibbidi',\n", + " 'heiligt',\n", + " 'gitane',\n", + " 'journalist',\n", + " 'focusing',\n", + " 'plethora',\n", + " 'citizen',\n", + " 'coster',\n", + " 'clunkers',\n", + " 'deplorable',\n", + " 'forgive',\n", + " 'proplems',\n", + " 'magwood',\n", + " 'bankers',\n", + " 'aqua',\n", + " 'donated',\n", + " 'disbelieving',\n", + " 'acomplication',\n", + " 'immediately',\n", + " 'contrasted',\n", + " 'reidelsheimer',\n", + " 'fox',\n", + " 'springs',\n", + " 'toolbox',\n", + " 'contacting',\n", + " 'ace',\n", + " 'washrooms',\n", + " 'raving',\n", + " 
'dynamism',\n", + " 'mae',\n", + " 'sky',\n", + " 'disharmony',\n", + " 'untutored',\n", + " 'icarus',\n", + " 'taint',\n", + " 'kargil',\n", + " 'captain',\n", + " 'paucity',\n", + " 'fits',\n", + " 'tumbles',\n", + " 'amer',\n", + " 'bueller',\n", + " 'redubbed',\n", + " 'cleansed',\n", + " 'kollos',\n", + " 'shara',\n", + " 'humma',\n", + " 'felichy',\n", + " 'outa',\n", + " 'piglets',\n", + " 'gombell',\n", + " 'supermen',\n", + " 'superlow',\n", + " 'enhance',\n", + " 'goode',\n", + " 'shalt',\n", + " 'kubanskie',\n", + " 'zenith',\n", + " 'ananda',\n", + " 'ocd',\n", + " 'matlin',\n", + " 'nosed',\n", + " 'presumptuous',\n", + " 'rerun',\n", + " 'toyko',\n", + " 'mazar',\n", + " 'sundry',\n", + " 'bilb',\n", + " 'fugly',\n", + " 'orchestrating',\n", + " 'prosaically',\n", + " 'maricarmen',\n", + " 'moveis',\n", + " 'conelly',\n", + " 'estrange',\n", + " 'lusciously',\n", + " 'seasonings',\n", + " 'sums',\n", + " 'delirious',\n", + " 'quincey',\n", + " 'flesh',\n", + " 'tootsie',\n", + " 'ai',\n", + " 'tenma',\n", + " 'appropriations',\n", + " 'chainsaw',\n", + " 'ides',\n", + " 'surrogacy',\n", + " 'pungent',\n", + " 'gallon',\n", + " 'damaso',\n", + " 'caribou',\n", + " 'perico',\n", + " 'supplying',\n", + " 'ro',\n", + " 'yuy',\n", + " 'valium',\n", + " 'debuted',\n", + " 'robbin',\n", + " 'mounts',\n", + " 'interpolated',\n", + " 'aetv',\n", + " 'plummer',\n", + " 'competence',\n", + " 'toadies',\n", + " 'dubiel',\n", + " 'clavichord',\n", + " 'asunder',\n", + " 'sublety',\n", + " 'airfix',\n", + " 'stoltzfus',\n", + " 'ruth',\n", + " 'fluorescent',\n", + " 'improves',\n", + " 'rebenga',\n", + " 'russells',\n", + " 'deliberation',\n", + " 'zsa',\n", + " 'dardino',\n", + " 'macs',\n", + " 'servile',\n", + " 'jlb',\n", + " 'apallonia',\n", + " 'crossbows',\n", + " 'locus',\n", + " 'mislead',\n", + " 'corey',\n", + " 'blundered',\n", + " 'jeopardizes',\n", + " 'disorganized',\n", + " 'discuss',\n", + " 'longish',\n", + " 'tieing',\n", + " 'ledger',\n", + " 
'speechifying',\n", + " 'amitabhz',\n", + " 'bbc',\n", + " 'chimayo',\n", + " 'pranked',\n", + " 'superman',\n", + " 'aggravated',\n", + " 'rifleman',\n", + " 'yvone',\n", + " 'radiant',\n", + " 'galico',\n", + " 'debris',\n", + " 'waking',\n", + " 'btw',\n", + " 'havnt',\n", + " 'francen',\n", + " 'chattered',\n", + " 'scathed',\n", + " 'pic',\n", + " 'ceremonies',\n", + " 'watergate',\n", + " 'betsy',\n", + " 'majorca',\n", + " 'meercat',\n", + " 'noirs',\n", + " 'grunts',\n", + " 'drecky',\n", + " 'tribulations',\n", + " 'avery',\n", + " 'talladega',\n", + " 'eights',\n", + " 'dumbing',\n", + " 'alloimono',\n", + " 'scrutinising',\n", + " 'geta',\n", + " 'beltrami',\n", + " 'pvc',\n", + " 'horse',\n", + " 'tiburon',\n", + " 'huitime',\n", + " 'ripple',\n", + " 'loitering',\n", + " 'forensics',\n", + " 'nearly',\n", + " 'elizabethan',\n", + " 'ellington',\n", + " 'uzi',\n", + " 'sicily',\n", + " 'camion',\n", + " 'motivated',\n", + " 'rung',\n", + " 'gao',\n", + " 'licitates',\n", + " 'protocol',\n", + " 'smirker',\n", + " 'torin',\n", + " 'newlywed',\n", + " 'rich',\n", + " 'dismay',\n", + " 'skyler',\n", + " 'moonwalks',\n", + " 'haranguing',\n", + " 'sunburst',\n", + " 'grifter',\n", + " 'undersold',\n", + " 'chearator',\n", + " 'marino',\n", + " 'scala',\n", + " 'conditioner',\n", + " 'ulysses',\n", + " 'lamarre',\n", + " 'figueroa',\n", + " 'flane',\n", + " 'allllllll',\n", + " 'slide',\n", + " 'lateness',\n", + " 'selbst',\n", + " 'gandhis',\n", + " 'dramatizing',\n", + " 'catchphrase',\n", + " 'doable',\n", + " 'stadiums',\n", + " 'alexanderplatz',\n", + " 'pandemonium',\n", + " 'misrepresents',\n", + " 'earth',\n", + " 'mounties',\n", + " 'seeker',\n", + " 'cheat',\n", + " 'outbreaks',\n", + " 'snowstorm',\n", + " 'baur',\n", + " 'schedules',\n", + " 'bathetic',\n", + " 'incorrect',\n", + " 'johnathon',\n", + " 'rosanne',\n", + " 'mundanely',\n", + " 'cauldrons',\n", + " 'forrest',\n", + " 'poky',\n", + " 'legislation',\n", + " 'womanness',\n", + " 
'spender',\n", + " 'crazy',\n", + " 'rational',\n", + " 'terrell',\n", + " 'zero',\n", + " 'coincides',\n", + " 'thoughout',\n", + " 'mathew',\n", + " 'narnia',\n", + " 'naseeruddin',\n", + " 'bucks',\n", + " 'affronts',\n", + " 'topple',\n", + " 'degree',\n", + " 'preyed',\n", + " 'passionately',\n", + " 'defeats',\n", + " 'torchwood',\n", + " 'sources',\n", + " 'botticelli',\n", + " 'compactor',\n", + " 'kosturica',\n", + " 'waiving',\n", + " 'gunnar',\n", + " 'stiffler',\n", + " 'fwd',\n", + " 'kawajiri',\n", + " 'eleanor',\n", + " 'sistahs',\n", + " 'soulhunter',\n", + " 'belies',\n", + " 'wrathful',\n", + " 'americans',\n", + " 'ferdinandvongalitzien',\n", + " 'kendra',\n", + " 'weirdy',\n", + " 'unforgivably',\n", + " 'chepart',\n", + " 'tatta',\n", + " 'departmentthe',\n", + " 'dig',\n", + " 'blatty',\n", + " 'marionettes',\n", + " 'atop',\n", + " 'chim',\n", + " 'saurian',\n", + " 'woes',\n", + " 'cloudscape',\n", + " 'resignedly',\n", + " 'unrooted',\n", + " 'keuck',\n", + " 'hitlerian',\n", + " 'stylings',\n", + " 'crewed',\n", + " 'bedeviled',\n", + " 'unfurnished',\n", + " 'reedus',\n", + " 'circumstances',\n", + " 'grasped',\n", + " 'smurfettes',\n", + " 'fn',\n", + " 'dishwashers',\n", + " 'roadie',\n", + " 'ruthlessness',\n", + " 'refrains',\n", + " 'lampooning',\n", + " 'semblance',\n", + " 'richart',\n", + " 'legions',\n", + " 'gwenneth',\n", + " 'enmity',\n", + " 'assess',\n", + " 'manufacturer',\n", + " 'bullosa',\n", + " 'outrun',\n", + " 'hogan',\n", + " 'chekov',\n", + " 'blithe',\n", + " 'code',\n", + " 'drillings',\n", + " 'revolvers',\n", + " 'aredavid',\n", + " 'robespierre',\n", + " 'achcha',\n", + " 'boyfriendhe',\n", + " 'wallow',\n", + " 'toga',\n", + " 'graphed',\n", + " 'tonking',\n", + " 'going',\n", + " 'bosnians',\n", + " 'willy',\n", + " 'rohauer',\n", + " 'fim',\n", + " 'forbidding',\n", + " 'yew',\n", + " 'rationalised',\n", + " 'shimomo',\n", + " 'opposition',\n", + " 'landis',\n", + " 'minded',\n", + " 'despicableness',\n", + 
" 'easting',\n", + " 'arghhhhh',\n", + " 'ebb',\n", + " 'trialat',\n", + " 'protected',\n", + " 'negras',\n", + " 'rick',\n", + " 'muti',\n", + " 'tracker',\n", + " 'shawl',\n", + " 'differentiates',\n", + " 'sweetheart',\n", + " 'deepened',\n", + " 'manmohan',\n", + " 'trevethyn',\n", + " 'brain',\n", + " 'incomprehensibly',\n", + " 'piercing',\n", + " 'pasadena',\n", + " 'shtick',\n", + " 'ute',\n", + " 'viggo',\n", + " 'supersedes',\n", + " 'ack',\n", + " 'cites',\n", + " 'taurus',\n", + " 'relevent',\n", + " 'minidress',\n", + " 'philosopher',\n", + " 'bel',\n", + " 'mahattan',\n", + " 'moden',\n", + " 'compiling',\n", + " 'advertising',\n", + " 'rogues',\n", + " 'unimaginative',\n", + " 'subpaar',\n", + " 'ademir',\n", + " 'darkly',\n", + " 'saturate',\n", + " 'fledgling',\n", + " 'breaths',\n", + " 'padre',\n", + " 'aszombi',\n", + " 'pachabel',\n", + " 'incalculable',\n", + " 'ozone',\n", + " 'sped',\n", + " 'mpho',\n", + " 'rawail',\n", + " 'forbid',\n", + " 'synth',\n", + " 'guttersnipe',\n", + " 'reputedly',\n", + " 'holiness',\n", + " 'unessential',\n", + " 'hampden',\n", + " 'asylum',\n", + " 'bolye',\n", + " 'strangers',\n", + " 'rantzen',\n", + " 'farrellys',\n", + " 'vigourous',\n", + " 'cantinflas',\n", + " 'enshrined',\n", + " 'boris',\n", + " 'expetations',\n", + " 'replaying',\n", + " 'prestige',\n", + " 'bukater',\n", + " 'overpaid',\n", + " 'exhude',\n", + " 'backsides',\n", + " 'topless',\n", + " 'sufferings',\n", + " 'nitwits',\n", + " 'cordova',\n", + " 'incensed',\n", + " 'danira',\n", + " 'unrelenting',\n", + " 'disabling',\n", + " 'ferdy',\n", + " 'gerard',\n", + " 'drewitt',\n", + " 'mero',\n", + " 'monsters',\n", + " 'precautions',\n", + " 'lamping',\n", + " 'relinquish',\n", + " 'demy',\n", + " 'drink',\n", + " 'chamberlin',\n", + " 'unjustifiably',\n", + " 'cove',\n", + " 'floodwaters',\n", + " 'searing',\n", + " 'isral',\n", + " 'ling',\n", + " 'grossness',\n", + " 'pickier',\n", + " 'pax',\n", + " 'wierd',\n", + " 'tereasa',\n", + " 
'smog',\n", + " 'girotti',\n", + " 'spat',\n", + " 'sera',\n", + " 'noxious',\n", + " 'misbehaving',\n", + " 'scouts',\n", + " 'refreshments',\n", + " 'autobiographic',\n", + " 'shi',\n", + " 'toyomichi',\n", + " 'bits',\n", + " 'psychotics',\n", + " 'barzell',\n", + " 'colt',\n", + " 'shivering',\n", + " 'pugilist',\n", + " 'gladiator',\n", + " 'dryer',\n", + " 'reissues',\n", + " 'scrivener',\n", + " 'predicable',\n", + " 'objection',\n", + " 'marmalade',\n", + " 'seems',\n", + " 'spellbind',\n", + " 'trifecta',\n", + " 'innovator',\n", + " 'shriekfest',\n", + " 'inthused',\n", + " 'contestants',\n", + " 'goody',\n", + " 'samotri',\n", + " 'serviced',\n", + " 'nozires',\n", + " 'ins',\n", + " 'mutilating',\n", + " 'dupes',\n", + " 'launius',\n", + " 'widescreen',\n", + " 'joo',\n", + " 'discretionary',\n", + " 'enlivens',\n", + " 'bushes',\n", + " 'chills',\n", + " 'header',\n", + " 'activist',\n", + " 'gethsemane',\n", + " 'phoenixs',\n", + " 'wreathed',\n", + " 'sacrine',\n", + " 'electrifyingly',\n", + " 'basely',\n", + " 'ghidora',\n", + " 'binder',\n", + " 'dogfights',\n", + " 'sugar',\n", + " 'doddsville',\n", + " 'porkys',\n", + " 'scattershot',\n", + " 'refunded',\n", + " 'rudely',\n", + " 'insteadit',\n", + " 'zatichi',\n", + " 'eurotrash',\n", + " 'radioraptus',\n", + " 'hurls',\n", + " 'boogeman',\n", + " 'weighs',\n", + " 'danniele',\n", + " 'converging',\n", + " 'hypothermia',\n", + " 'glorfindel',\n", + " 'birthdays',\n", + " 'attentive',\n", + " 'mallepa',\n", + " 'spacewalk',\n", + " 'manoy',\n", + " 'bombshells',\n", + " 'farts',\n", + " 'lyoko',\n", + " 'southron',\n", + " 'destruction',\n", + " 'flemming',\n", + " 'manhole',\n", + " 'elainor',\n", + " 'bowersock',\n", + " 'lowly',\n", + " 'wfst',\n", + " 'limousines',\n", + " 'skolimowski',\n", + " 'saban',\n", + " 'koen',\n", + " 'malaysia',\n", + " 'uwi',\n", + " 'cyd',\n", + " 'apeing',\n", + " 'bonecrushing',\n", + " 'dini',\n", + " 'merest',\n", + " 'janina',\n", + " 'chemotrodes',\n", + " 
'trials',\n", + " 'authorize',\n", + " 'whilhelm',\n", + " 'asthmatic',\n", + " 'broads',\n", + " 'missteps',\n", + " 'embittered',\n", + " 'chandeliers',\n", + " 'seeming',\n", + " 'miscalculate',\n", + " 'recommeded',\n", + " 'schoolwork',\n", + " 'coy',\n", + " 'mcconaughey',\n", + " 'philosophically',\n", + " 'waver',\n", + " 'fanny',\n", + " 'mestressat',\n", + " 'unwatchably',\n", + " 'saggy',\n", + " 'topness',\n", + " 'dwellings',\n", + " 'breakup',\n", + " 'hasselhoff',\n", + " 'superstars',\n", + " 'replay',\n", + " 'aggravates',\n", + " 'balances',\n", + " 'urging',\n", + " 'snidely',\n", + " 'aleksandar',\n", + " 'hildy',\n", + " 'kazuhiro',\n", + " 'slayer',\n", + " 'tangy',\n", + " 'brussels',\n", + " 'horne',\n", + " 'masayuki',\n", + " 'molden',\n", + " 'unravel',\n", + " 'goodtime',\n", + " 'interrogates',\n", + " 'bismillahhirrahmannirrahim',\n", + " 'rowboat',\n", + " 'dumann',\n", + " 'datedness',\n", + " 'astrotheology',\n", + " 'dekhiye',\n", + " 'valga',\n", + " 'kata',\n", + " 'wipes',\n", + " 'hostilities',\n", + " 'sentimentalising',\n", + " 'documentary',\n", + " 'salesman',\n", + " 'virtue',\n", + " 'unreasonably',\n", + " 'haver',\n", + " 'cei',\n", + " 'unglamorised',\n", + " 'balky',\n", + " 'complementary',\n", + " 'paychecks',\n", + " 'mnica',\n", + " 'wada',\n", + " 'ily',\n", + " 'prc',\n", + " 'ennobling',\n", + " 'functionality',\n", + " 'dissociated',\n", + " 'elk',\n", + " 'throbbing',\n", + " 'tempe',\n", + " 'linoleum',\n", + " 'photogrsphed',\n", + " 'bottacin',\n", + " 'hipper',\n", + " 'titillating',\n", + " 'barging',\n", + " 'untie',\n", + " 'sacchetti',\n", + " 'gnat',\n", + " 'roedel',\n", + " 'cohabitation',\n", + " 'performs',\n", + " 'sales',\n", + " 'migrs',\n", + " 'teachs',\n", + " 'nanavati',\n", + " 'fresco',\n", + " 'davison',\n", + " 'obstinate',\n", + " 'burglar',\n", + " 'masue',\n", + " 'dickory',\n", + " 'grills',\n", + " 'appelagate',\n", + " 'linkage',\n", + " 'enables',\n", + " 'loesser',\n", + " 
'patties',\n", + " 'prudent',\n", + " 'mallorquins',\n", + " 'nativetex',\n", + " 'suprise',\n", + " 'drippy',\n", + " 'quill',\n", + " 'speeded',\n", + " 'farscape',\n", + " 'saddening',\n", + " 'centuries',\n", + " 'mos',\n", + " 'improvisationally',\n", + " 'neccessarily',\n", + " 'transmitter',\n", + " 'tankers',\n", + " 'latte',\n", + " 'mechanisation',\n", + " 'faracy',\n", + " 'synthetically',\n", + " 'thoughtless',\n", + " 'rake',\n", + " 'ropes',\n", + " 'desirable',\n", + " 'whitewashed',\n", + " 'donal',\n", + " 'crabby',\n", + " 'lifeless',\n", + " 'perfidy',\n", + " 'teresa',\n", + " 'bulldog',\n", + " 'cockamamie',\n", + " 'rasberries',\n", + " 'notethe',\n", + " 'captivity',\n", + " 'chiseling',\n", + " 'smaller',\n", + " 'clampets',\n", + " 'alerts',\n", + " 'tough',\n", + " 'wellingtonian',\n", + " 'aaaahhhhhhh',\n", + " 'dither',\n", + " 'incertitude',\n", + " 'florentine',\n", + " 'imperioli',\n", + " 'licking',\n", + " 'disparagement',\n", + " 'artfully',\n", + " 'feds',\n", + " 'fumiya',\n", + " 'tearfully',\n", + " 'lanchester',\n", + " 'undertaken',\n", + " 'longlost',\n", + " 'netted',\n", + " 'carrell',\n", + " 'uncompelling',\n", + " 'reliefs',\n", + " 'leona',\n", + " 'autorenfilm',\n", + " 'unfriendly',\n", + " 'typewriter',\n", + " 'shifted',\n", + " 'bertrand',\n", + " 'blesses',\n", + " 'tricking',\n", + " 'fireflies',\n", + " 'zanes',\n", + " 'unknowingly',\n", + " 'unnerve',\n", + " 'caning',\n", + " 'flat',\n", + " 'recluse',\n", + " 'dcreasy',\n", + " 'chipmunk',\n", + " 'dipper',\n", + " 'musee',\n", + " 'cousin',\n", + " 'shys',\n", + " 'berserkers',\n", + " 'eve',\n", + " 'conflagration',\n", + " 'irks',\n", + " 'restricts',\n", + " 'parsing',\n", + " 'positronic',\n", + " 'copout',\n", + " 'khala',\n", + " 'swiftness',\n", + " 'higginson',\n", + " 'imprint',\n", + " 'walter',\n", + " 'sundance',\n", + " 'whispering',\n", + " 'thematically',\n", + " 'underimpressed',\n", + " 'uno',\n", + " 'expressly',\n", + " 'russkies',\n", + 
" 'discos',\n", + " 'shaping',\n", + " 'verson',\n", + " 'prototype',\n", + " 'chapman',\n", + " 'trafficker',\n", + " 'semetary',\n", + " 'unrealistically',\n", + " 'lifewell',\n", + " 'rivas',\n", + " 'consequent',\n", + " 'katsu',\n", + " 'titantic',\n", + " 'jalees',\n", + " 'ranee',\n", + " 'shipbuilding',\n", + " 'gambles',\n", + " 'dispenses',\n", + " 'disfigurement',\n", + " 'bright',\n", + " 'cristian',\n", + " 'puertorricans',\n", + " 'constituent',\n", + " 'capta',\n", + " 'jewel',\n", + " 'erect',\n", + " 'farah',\n", + " 'despondently',\n", + " 'avoide',\n", + " 'inconnu',\n", + " 'headquarters',\n", + " 'sanguisga',\n", + " ...]" + ] + }, + "execution_count": 75, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "list(vocab)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 0., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 46, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "import numpy as np\n", + "\n", + "layer_0 = np.zeros((1,vocab_size))\n", + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 47, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "{'': 0,\n", + " 'inhabitants': 1,\n", + " 'goku': 2,\n", + " 'stunts': 3,\n", + " 'catepillar': 4,\n", + " 'kristensen': 5,\n", + " 'goddess': 7,\n", + " 'offing': 49797,\n", + " 'distroy': 8,\n", + " 'unexplainably': 9,\n", + " 'concoctions': 10,\n", + " 'petite': 11,\n", + " 'paramilitary': 24759,\n", + " 'scribe': 12,\n", + " 'stevson': 13,\n", + " 'senegal': 6,\n", + " 'sctv': 14,\n", + " 'soundscape': 15,\n", + " 'rana': 16,\n", + " 'immortalizer': 18,\n", + " 'rene': 67354,\n", + " 'eko': 23,\n", + " 'planning': 20,\n", + " 'akiva': 21,\n", + " 'plod': 22,\n", + " 'orderly': 24,\n", + " 'zeleznice': 25,\n", + " 'critize': 29,\n", + " 'baguettes': 25649,\n", + " 'jefferies': 30,\n", + " 'uncertainties': 61695,\n", + " 'mountainbillies': 31,\n", + " 'steinbichler': 32,\n", + " 'vowel': 33,\n", + " 'rafe': 34,\n", + " 'donig': 68719,\n", + " 
'tulipe': 36,\n", + " 'clot': 37,\n", + " 'hack': 12526,\n", + " 'distended': 38,\n", + " 'cornered': 37116,\n", + " 'impatiently': 40,\n", + " 'batrice': 12525,\n", + " 'unfortuntly': 41,\n", + " 'lung': 42,\n", + " 'scapegoats': 43,\n", + " 'pscychosexual': 45,\n", + " 'outbid': 46,\n", + " 'obit': 47,\n", + " 'sideshows': 48,\n", + " 'jugde': 49,\n", + " 'kevloun': 51,\n", + " 'quartier': 53,\n", + " 'harp': 61948,\n", + " 'unravelling': 54,\n", + " 'antiques': 56,\n", + " 'strutts': 57,\n", + " 'tilts': 58,\n", + " 'disconcert': 59,\n", + " 'dossiers': 60,\n", + " 'sorriest': 61,\n", + " 'craftsman': 49412,\n", + " 'blart': 62,\n", + " 'dependence': 37120,\n", + " 'sated': 61698,\n", + " 'iberia': 63,\n", + " 'sagan': 72,\n", + " 'frmann': 65,\n", + " 'daniell': 66,\n", + " 'rays': 67,\n", + " 'pried': 68,\n", + " 'khoobsurat': 69,\n", + " 'leavitt': 70,\n", + " 'caiano': 71,\n", + " 'attractiveness': 73,\n", + " 'kitaparaporn': 74,\n", + " 'hamilton': 75,\n", + " 'massages': 76,\n", + " 'horgan': 78,\n", + " 'chemist': 79,\n", + " 'audrey': 80,\n", + " 'yeow': 55655,\n", + " 'jana': 81,\n", + " 'dutch': 82,\n", + " 'pinchot': 24773,\n", + " 'override': 83,\n", + " 'dwervick': 63223,\n", + " 'spasms': 84,\n", + " 'resumed': 85,\n", + " 'tamale': 66259,\n", + " 'calibanian': 49636,\n", + " 'stinson': 86,\n", + " 'widows': 87,\n", + " 'stonewall': 88,\n", + " 'palatial': 89,\n", + " 'neuman': 90,\n", + " 'abandon': 91,\n", + " 'lemmings': 65314,\n", + " 'anglophile': 92,\n", + " 'ertha': 61706,\n", + " 'chevette': 94,\n", + " 'unscary': 95,\n", + " 'spoilerific': 97,\n", + " 'neworleans': 67639,\n", + " 'metamorphose': 17,\n", + " 'brigand': 99,\n", + " 'cheating': 41603,\n", + " 'clued': 101,\n", + " 'dermatonecrotic': 102,\n", + " 'grady': 103,\n", + " 'mulligan': 104,\n", + " 'ol': 105,\n", + " 'incubation': 107,\n", + " 'plaintiffs': 110,\n", + " 'snden': 109,\n", + " 'fk': 111,\n", + " 'deply': 112,\n", + " 'franchot': 113,\n", + " 'henstridge': 19,\n", + " 
'cyhper': 114,\n", + " 'verbose': 26,\n", + " 'mazovia': 116,\n", + " 'elizabeth': 117,\n", + " 'palestine': 118,\n", + " 'robby': 119,\n", + " 'wongo': 120,\n", + " 'moshing': 121,\n", + " 'mstified': 12543,\n", + " 'eeeee': 122,\n", + " 'doltish': 123,\n", + " 'bree': 124,\n", + " 'postponed': 125,\n", + " 'debacles': 127,\n", + " 'amplify': 27,\n", + " 'kamm': 128,\n", + " 'phantom': 18893,\n", + " 'boylen': 136,\n", + " 'rolando': 131,\n", + " 'premises': 133,\n", + " 'bruck': 134,\n", + " 'loosely': 135,\n", + " 'wodehousian': 139,\n", + " 'onishi': 70389,\n", + " 'encapsuling': 140,\n", + " 'partly': 141,\n", + " 'stadling': 144,\n", + " 'calms': 143,\n", + " 'darkie': 148,\n", + " 'wheeling': 147,\n", + " 'ursla': 15875,\n", + " 'subsidized': 49420,\n", + " 'mckellar': 149,\n", + " 'ooookkkk': 151,\n", + " 'milky': 152,\n", + " 'unfolded': 153,\n", + " 'degrades': 154,\n", + " 'authenticating': 155,\n", + " 'writeup': 12548,\n", + " 'rotheroe': 156,\n", + " 'beart': 157,\n", + " 'intoxicants': 160,\n", + " 'grispin': 159,\n", + " 'cannes': 61718,\n", + " 'antithetical': 70398,\n", + " 'nnette': 161,\n", + " 'tsukamoto': 163,\n", + " 'antwones': 44205,\n", + " 'stows': 164,\n", + " 'suddenness': 165,\n", + " 'vol': 61720,\n", + " 'waqt': 166,\n", + " 'camazotz': 168,\n", + " 'paps': 55042,\n", + " 'shakher': 170,\n", + " 'terminate': 63868,\n", + " 'kotex': 56419,\n", + " 'delinquency': 171,\n", + " 'bromwell': 25214,\n", + " 'insecticide': 173,\n", + " 'charlton': 174,\n", + " 'nakada': 177,\n", + " 'titted': 24791,\n", + " 'urbane': 178,\n", + " 'depicted': 54491,\n", + " 'sadomasochistic': 179,\n", + " 'hyping': 181,\n", + " 'yr': 182,\n", + " 'hebert': 183,\n", + " 'waxwork': 12990,\n", + " 'deathrow': 185,\n", + " 'nourishes': 24792,\n", + " 'unmediated': 187,\n", + " 'tamper': 37143,\n", + " 'soad': 190,\n", + " 'alphabet': 189,\n", + " 'donen': 191,\n", + " 'lord': 192,\n", + " 'recess': 193,\n", + " 'watchably': 61023,\n", + " 'handsome': 194,\n", + " 
'vignettes': 196,\n", + " 'pairings': 198,\n", + " 'uselful': 199,\n", + " 'sanders': 200,\n", + " 'outbursts': 72891,\n", + " 'nots': 201,\n", + " 'hatsumomo': 202,\n", + " 'actioned': 18292,\n", + " 'krimi': 24797,\n", + " 'appleby': 203,\n", + " 'tampax': 204,\n", + " 'sprinkling': 205,\n", + " 'defacing': 206,\n", + " 'lofty': 207,\n", + " 'verger': 213,\n", + " 'tablespoons': 211,\n", + " 'bernhard': 212,\n", + " 'goosebump': 64565,\n", + " 'acumen': 214,\n", + " 'percentages': 215,\n", + " 'wendingo': 216,\n", + " 'resonating': 217,\n", + " 'vntoarea': 218,\n", + " 'redundancies': 219,\n", + " 'strictly': 57081,\n", + " 'pitied': 221,\n", + " 'belying': 222,\n", + " 'michelangelo': 53153,\n", + " 'gleefulness': 223,\n", + " 'environmentalist': 24803,\n", + " 'gitane': 226,\n", + " 'corrected': 66547,\n", + " 'journalist': 227,\n", + " 'focusing': 228,\n", + " 'plethora': 229,\n", + " 'his': 39,\n", + " 'citizen': 230,\n", + " 'south': 55579,\n", + " 'clunkers': 232,\n", + " 'pendulous': 55991,\n", + " 'mounds': 24805,\n", + " 'deplorable': 233,\n", + " 'forgive': 234,\n", + " 'proplems': 235,\n", + " 'bankers': 237,\n", + " 'aqua': 238,\n", + " 'donated': 239,\n", + " 'disbelieving': 240,\n", + " 'acomplication': 241,\n", + " 'contrasted': 243,\n", + " 'muzzle': 44,\n", + " 'amphibians': 72141,\n", + " 'springs': 246,\n", + " 'reformatted': 49443,\n", + " 'toolbox': 247,\n", + " 'contacting': 248,\n", + " 'washrooms': 250,\n", + " 'raving': 251,\n", + " 'dynamism': 252,\n", + " 'mae': 253,\n", + " 'disharmony': 255,\n", + " 'molls': 72979,\n", + " 'dewaere': 12569,\n", + " 'untutored': 256,\n", + " 'icarus': 257,\n", + " 'taint': 258,\n", + " 'kargil': 259,\n", + " 'captain': 260,\n", + " 'paucity': 261,\n", + " 'fits': 262,\n", + " 'tumbles': 263,\n", + " 'amer': 264,\n", + " 'bueller': 265,\n", + " 'cleansed': 267,\n", + " 'shara': 269,\n", + " 'humma': 270,\n", + " 'outa': 272,\n", + " 'piglets': 273,\n", + " 'gombell': 274,\n", + " 'supermen': 275,\n", + 
" 'superlow': 276,\n", + " 'kubanskie': 280,\n", + " 'goode': 278,\n", + " 'disorganised': 45570,\n", + " 'zenith': 281,\n", + " 'ananda': 282,\n", + " 'matlin': 284,\n", + " 'particolare': 50,\n", + " 'presumptuous': 286,\n", + " 'rerun': 287,\n", + " 'toyko': 288,\n", + " 'bilb': 291,\n", + " 'sundry': 290,\n", + " 'fugly': 292,\n", + " 'orchestrating': 293,\n", + " 'prosaically': 294,\n", + " 'moveis': 296,\n", + " 'conelly': 297,\n", + " 'estrange': 298,\n", + " 'elfriede': 49455,\n", + " 'masterful': 52,\n", + " 'seasonings': 300,\n", + " 'quincey': 303,\n", + " 'frowning': 49456,\n", + " 'painkillers': 53444,\n", + " 'high': 25515,\n", + " 'flesh': 304,\n", + " 'tootsie': 305,\n", + " 'ai': 306,\n", + " 'tenma': 307,\n", + " 'duguay': 71257,\n", + " 'appropriations': 308,\n", + " 'ides': 310,\n", + " 'rui': 61734,\n", + " 'surrogacy': 311,\n", + " 'pungent': 312,\n", + " 'damaso': 314,\n", + " 'authoritarian': 61736,\n", + " 'caribou': 315,\n", + " 'ro': 318,\n", + " 'supplying': 317,\n", + " 'yuy': 319,\n", + " 'debuted': 321,\n", + " 'mounts': 323,\n", + " 'interpolated': 324,\n", + " 'aetv': 325,\n", + " 'plummer': 326,\n", + " 'asunder': 331,\n", + " 'airfix': 333,\n", + " 'dubiel': 329,\n", + " 'clavichord': 330,\n", + " 'crafty': 50465,\n", + " 'sublety': 332,\n", + " 'stoltzfus': 334,\n", + " 'ruth': 335,\n", + " 'fluorescent': 336,\n", + " 'improves': 337,\n", + " 'russells': 339,\n", + " 'tick': 43838,\n", + " 'zsa': 341,\n", + " 'macs': 343,\n", + " 'jlb': 345,\n", + " 'locus': 348,\n", + " 'mislead': 349,\n", + " 'merly': 49461,\n", + " 'corey': 350,\n", + " 'blundered': 351,\n", + " 'humourless': 3568,\n", + " 'disorganized': 353,\n", + " 'discuss': 354,\n", + " 'sharifi': 45391,\n", + " 'tieing': 356,\n", + " 'kats': 34784,\n", + " 'bbc': 360,\n", + " 'pranked': 362,\n", + " 'superman': 363,\n", + " 'holroyd': 9223,\n", + " 'aggravated': 364,\n", + " 'rifleman': 365,\n", + " 'yvone': 366,\n", + " 'vaugier': 24820,\n", + " 'radiant': 367,\n", + " 
'galico': 368,\n", + " 'debris': 369,\n", + " 'btw': 371,\n", + " 'denote': 24822,\n", + " 'havnt': 372,\n", + " 'francen': 373,\n", + " 'chattered': 374,\n", + " 'scathed': 375,\n", + " 'pic': 376,\n", + " 'ceremonies': 377,\n", + " 'everyplace': 65309,\n", + " 'betsy': 379,\n", + " 'finster': 37176,\n", + " 'meercat': 381,\n", + " 'noirs': 382,\n", + " 'grunts': 383,\n", + " 'tribulations': 385,\n", + " 'apparatus': 47673,\n", + " 'martnez': 25825,\n", + " 'telethons': 24825,\n", + " 'talladega': 387,\n", + " 'alloimono': 390,\n", + " 'situations': 64,\n", + " 'scrutinising': 391,\n", + " 'geta': 392,\n", + " 'beltrami': 393,\n", + " 'pvc': 394,\n", + " 'horse': 395,\n", + " 'tiburon': 396,\n", + " 'huitime': 397,\n", + " 'ripple': 398,\n", + " 'exceed': 61748,\n", + " 'loitering': 399,\n", + " 'forensics': 400,\n", + " 'nearly': 401,\n", + " 'ellington': 403,\n", + " 'uzi': 404,\n", + " 'rung': 408,\n", + " 'pillaged': 24829,\n", + " 'gao': 409,\n", + " 'licitates': 410,\n", + " 'protocol': 411,\n", + " 'smirker': 412,\n", + " 'torin': 413,\n", + " 'vizier': 31853,\n", + " 'newlywed': 414,\n", + " 'dismay': 416,\n", + " 'moonwalks': 418,\n", + " 'skyler': 417,\n", + " 'invested': 18455,\n", + " 'grifter': 421,\n", + " 'undersold': 422,\n", + " 'chearator': 423,\n", + " 'marino': 424,\n", + " 'scala': 425,\n", + " 'conditioner': 426,\n", + " 'lamarre': 428,\n", + " 'figueroa': 429,\n", + " 'mcinnerny': 61753,\n", + " 'allllllll': 431,\n", + " 'slide': 432,\n", + " 'lateness': 433,\n", + " 'selbst': 434,\n", + " 'dramatizing': 436,\n", + " 'doable': 438,\n", + " 'hollywoodize': 27207,\n", + " 'alexanderplatz': 440,\n", + " 'wholesome': 45745,\n", + " 'pandemonium': 441,\n", + " 'earth': 443,\n", + " 'mounties': 444,\n", + " 'seeker': 445,\n", + " 'cheat': 446,\n", + " 'outbreaks': 447,\n", + " 'savagely': 61759,\n", + " 'snowstorm': 448,\n", + " 'baur': 449,\n", + " 'schedules': 450,\n", + " 'bathetic': 451,\n", + " 'johnathon': 453,\n", + " 'origonal': 57843,\n", 
+ " 'rosanne': 454,\n", + " 'cauldrons': 456,\n", + " 'forrest': 457,\n", + " 'poky': 458,\n", + " 'aristos': 54856,\n", + " 'womanness': 460,\n", + " 'spender': 461,\n", + " 'pagliai': 37108,\n", + " 'rational': 463,\n", + " 'terrell': 464,\n", + " 'affronts': 472,\n", + " 'concise': 49476,\n", + " 'mathew': 468,\n", + " 'narnia': 469,\n", + " 'naseeruddin': 470,\n", + " 'bucks': 471,\n", + " 'proceeds': 69809,\n", + " 'topple': 473,\n", + " 'degree': 474,\n", + " 'passionately': 476,\n", + " 'defeats': 477,\n", + " 'gras': 49477,\n", + " 'sources': 479,\n", + " 'pflug': 49976,\n", + " 'botticelli': 480,\n", + " 'fwd': 486,\n", + " 'waiving': 483,\n", + " 'gunnar': 484,\n", + " 'stiffler': 485,\n", + " 'unwise': 49480,\n", + " 'kawajiri': 487,\n", + " 'sistahs': 489,\n", + " 'swallowed': 30511,\n", + " 'soulhunter': 490,\n", + " 'belies': 491,\n", + " 'wrathful': 492,\n", + " 'badmouth': 16696,\n", + " 'floradora': 61766,\n", + " 'unforgivably': 497,\n", + " 'weirdy': 496,\n", + " 'violation': 63309,\n", + " 'chepart': 498,\n", + " 'departmentthe': 500,\n", + " 'posehn': 49483,\n", + " 'peyote': 37188,\n", + " 'psychiatrically': 24846,\n", + " 'marionettes': 503,\n", + " 'blatty': 502,\n", + " 'atop': 504,\n", + " 'debases': 25135,\n", + " 'henze': 24845,\n", + " 'unrooted': 510,\n", + " 'cloudscape': 508,\n", + " 'resignedly': 509,\n", + " 'begin': 49917,\n", + " 'hitlerian': 512,\n", + " 'reedus': 517,\n", + " 'crewed': 514,\n", + " 'bedeviled': 515,\n", + " 'unfurnished': 516,\n", + " 'herrmann': 12602,\n", + " 'circumstances': 518,\n", + " 'grasped': 519,\n", + " 'fn': 521,\n", + " 'beefed': 22200,\n", + " 'scwatch': 64018,\n", + " 'dishwashers': 522,\n", + " 'roadie': 523,\n", + " 'ruthlessness': 524,\n", + " 'migrant': 12605,\n", + " 'refrains': 525,\n", + " 'preponderance': 44377,\n", + " 'lampooning': 526,\n", + " 'richart': 528,\n", + " 'gwenneth': 530,\n", + " 'enmity': 531,\n", + " 'vortex': 61772,\n", + " 'assess': 532,\n", + " 'manufacturer': 533,\n", 
+ " 'bullosa': 534,\n", + " 'citizenship': 61774,\n", + " 'chekov': 537,\n", + " 'hogan': 536,\n", + " 'blithe': 538,\n", + " 'aredavid': 542,\n", + " 'drillings': 540,\n", + " 'revolvers': 541,\n", + " 'boyfriendhe': 545,\n", + " 'achcha': 544,\n", + " 'wallow': 546,\n", + " 'toga': 547,\n", + " 'bosnians': 551,\n", + " 'going': 550,\n", + " 'willy': 552,\n", + " 'fim': 554,\n", + " 'forbidding': 555,\n", + " 'delete': 56779,\n", + " 'rationalised': 557,\n", + " 'shimomo': 558,\n", + " 'opposition': 559,\n", + " 'landis': 560,\n", + " 'minded': 561,\n", + " 'arghhhhh': 564,\n", + " 'trialat': 566,\n", + " 'protected': 567,\n", + " 'negras': 568,\n", + " 'tracker': 571,\n", + " 'muti': 570,\n", + " 'dinky': 49489,\n", + " 'shawl': 572,\n", + " 'differentiates': 573,\n", + " 'dipaolo': 61779,\n", + " 'sweetheart': 574,\n", + " 'manmohan': 576,\n", + " 'enamored': 66265,\n", + " 'trevethyn': 577,\n", + " 'brain': 578,\n", + " 'incomprehensibly': 579,\n", + " 'pasadena': 581,\n", + " 'bruton': 59142,\n", + " 'shtick': 582,\n", + " 'ute': 583,\n", + " 'viggo': 584,\n", + " 'relevent': 589,\n", + " 'cites': 587,\n", + " 'greenaways': 61781,\n", + " 'minidress': 590,\n", + " 'philosopher': 591,\n", + " 'mahattan': 593,\n", + " 'moden': 594,\n", + " 'compiling': 595,\n", + " 'unimaginative': 598,\n", + " 'rogues': 597,\n", + " 'subpaar': 599,\n", + " 'darkly': 601,\n", + " 'saturate': 602,\n", + " 'fledgling': 603,\n", + " 'breaths': 604,\n", + " 'sceam': 37206,\n", + " 'empathized': 58870,\n", + " 'aszombi': 606,\n", + " 'incalculable': 608,\n", + " 'formations': 28596,\n", + " 'hampden': 619,\n", + " 'rawail': 612,\n", + " 'forbid': 613,\n", + " 'holiness': 617,\n", + " 'unessential': 618,\n", + " 'reputedly': 616,\n", + " 'wage': 63181,\n", + " 'kewpie': 24860,\n", + " 'asylum': 620,\n", + " 'bolye': 621,\n", + " 'celticism': 63189,\n", + " 'strangers': 622,\n", + " 'rantzen': 623,\n", + " 'farrellys': 624,\n", + " 'marathon': 93,\n", + " 'cantinflas': 626,\n", + " 
'disproportionately': 12617,\n", + " 'bared': 67212,\n", + " 'enshrined': 627,\n", + " 'replaying': 630,\n", + " 'topless': 636,\n", + " 'haifa': 
1043,\n", + " ...}" + ] + }, + "execution_count": 48, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "word2index = {}\n", + "\n", + "for i,word in enumerate(vocab):\n", + " word2index[word] = i\n", + "word2index" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_target_for_label(label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 54, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "1" + ] + }, + "execution_count": 52, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 55, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'NEGATIVE'" + 
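The helper cells above build a bag-of-words input layer (`update_input_layer`) and a binary label encoder (`get_target_for_label`). A self-contained sketch of the same idea follows; the five-word `vocab` here is a stand-in for the notebook's full review vocabulary, and `make_input_layer` is a hypothetical name for a version that returns the vector instead of mutating a global `layer_0`.

```python
import numpy as np

# Stand-in vocabulary; the notebook builds its vocab from the review corpus.
vocab = ["the", "movie", "was", "great", "terrible"]
word2index = {word: i for i, word in enumerate(vocab)}

def make_input_layer(review):
    """Bag-of-words vector: one count per vocabulary word."""
    layer_0 = np.zeros((1, len(vocab)))
    for word in review.split(" "):
        if word in word2index:  # skip out-of-vocabulary words
            layer_0[0][word2index[word]] += 1
    return layer_0

def get_target_for_label(label):
    """'POSITIVE' -> 1, anything else -> 0."""
    return 1 if label == "POSITIVE" else 0
```

For example, `make_input_layer("the movie was great")` yields a count of 1 at the indices of the four known words and 0 elsewhere, and `get_target_for_label("POSITIVE")` returns 1.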
] + }, + "execution_count": 55, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[1]" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 53, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[1])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 3: Building a Neural Network" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "- Start with your neural network from the last chapter\n", + "- 3 layer neural network\n", + "- no non-linearity in hidden layer\n", + "- use our functions to create the training data\n", + "- create a \"pre_process_data\" function to create vocabulary for our training data generating functions\n", + "- modify \"train\" to train over the entire corpus" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Where to Get Help if You Need it\n", + "- Re-watch previous week's Udacity Lectures\n", + "- Chapters 3-5 - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) - (40% Off: **traskud17**)" + ] + }, + { + "cell_type": "code", + "execution_count": 86, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " # set our random number generator \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews, labels)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self, reviews, labels):\n", + " \n", + " review_vocab = 
set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " \n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " self.layer_0[0][self.word2index[word]] += 1\n", + " \n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def train(self, training_reviews, 
training_labels):\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + " self.update_input_layer(review)\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward pass ###\n", + "\n", + " # TODO: Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error is the difference between desired target and actual output.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # TODO: Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # TODO: Update the weights\n", + " self.weights_1_2 -= layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " self.weights_0_1 -= self.layer_0.T.dot(layer_1_delta) * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / 
float(i+1))[:4] + \"%\")\n", + " if(i % 2500 == 0):\n", + " print(\"\")\n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \"% #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + " self.update_input_layer(review.lower())\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] > 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 87, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 61, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):587.5% #Correct:500 #Tested:1000 Testing Accuracy:50.0%" + ] + } + ], + "source": [ + "# evaluate our model before training (just to show how horrible it is)\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "code", + "execution_count": 62, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% 
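The `train` method above performs one forward and one backward pass per review. A condensed standalone sketch of that single step, under the same design choices as `SentimentNetwork` (zero-initialized input-to-hidden weights, no non-linearity in the hidden layer, sigmoid output); the toy dimensions and input vector are illustrative, not taken from the notebook.

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

np.random.seed(1)
n_input, n_hidden = 5, 3
weights_0_1 = np.zeros((n_input, n_hidden))                 # input -> hidden, starts at zero as in the notebook
weights_1_2 = np.random.normal(0.0, 1.0, (n_hidden, 1))     # hidden -> output
learning_rate = 0.1

layer_0 = np.array([[1.0, 0.0, 1.0, 0.0, 0.0]])             # toy bag-of-words input
target = 1                                                  # a POSITIVE review

# Forward pass: linear hidden layer, sigmoid on the output.
layer_1 = layer_0.dot(weights_0_1)
layer_2 = sigmoid(layer_1.dot(weights_1_2))

# Backward pass: output error, then propagate to the hidden layer.
layer_2_error = layer_2 - target
layer_2_delta = layer_2_error * layer_2 * (1 - layer_2)     # sigmoid derivative is output * (1 - output)
layer_1_delta = layer_2_delta.dot(weights_1_2.T)            # hidden layer is linear, so delta == error

# Gradient descent updates.
weights_1_2 -= layer_1.T.dot(layer_2_delta) * learning_rate
weights_0_1 -= layer_0.T.dot(layer_1_delta) * learning_rate
```

Because the input-to-hidden weights start at zero, the first prediction is sigmoid(0) = 0.5 regardless of the review, and only the weight rows for words present in the input receive an update.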
Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):89.58 #Correct:1250 #Trained:2501 Training Accuracy:49.9%\n", + "Progress:20.8% Speed(reviews/sec):95.03 #Correct:2500 #Trained:5001 Training Accuracy:49.9%\n", + "Progress:27.4% Speed(reviews/sec):95.46 #Correct:3295 #Trained:6592 Training Accuracy:49.9%" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output 
weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 63, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.01)" + ] + }, + { + "cell_type": "code", + "execution_count": 64, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):96.39 #Correct:1247 #Trained:2501 Training Accuracy:49.8%\n", + "Progress:20.8% Speed(reviews/sec):99.31 #Correct:2497 #Trained:5001 Training Accuracy:49.9%\n", + "Progress:22.8% Speed(reviews/sec):99.02 #Correct:2735 #Trained:5476 Training Accuracy:49.9%" + ] + }, + { + "ename": 
"KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m \u001b[0;34m-=\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 65, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.001)" + ] + }, + { + "cell_type": "code", + "execution_count": 66, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):98.77 #Correct:1267 #Trained:2501 Training Accuracy:50.6%\n", + "Progress:20.8% Speed(reviews/sec):98.79 #Correct:2640 #Trained:5001 Training Accuracy:52.7%\n", + "Progress:31.2% Speed(reviews/sec):98.58 #Correct:4109 #Trained:7501 Training Accuracy:54.7%\n", + "Progress:41.6% Speed(reviews/sec):93.78 #Correct:5638 #Trained:10001 Training Accuracy:56.3%\n", + "Progress:52.0% Speed(reviews/sec):91.76 #Correct:7246 #Trained:12501 Training Accuracy:57.9%\n", + "Progress:62.5% 
Speed(reviews/sec):92.42 #Correct:8841 #Trained:15001 Training Accuracy:58.9%\n", + "Progress:69.4% Speed(reviews/sec):92.58 #Correct:9934 #Trained:16668 Training Accuracy:59.5%" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m 
\u001b[0;34m-=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Understanding Neural Noise" + ] + }, + { + "cell_type": "code", + "execution_count": 67, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"<matplotlib figure for the 'Understanding Neural Noise' section: base64-encoded PNG output omitted>
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
ITBiD8G9077fkJiJ5O56gR2HhXNgC4ltRSO/m48J5w7/z6hcuL+53Ezo7TveizvMGkFXy5M1S\nJ29wWB6cv0jZUzJxOjq/keXLl1v/kbRvlnHl6vzgIJDnfydP3jwI77XXXp3s3aYCOokSHhxxxBGd\nlElfaCEjzn+PzFE+ZlHnSEvAQCcXXnjhMP8LXrIdSXFpivomAJoTLCG+fv60DGkOOuggl9RgPUGv\nrJJmmixNHbmJSJaOdwoSzc2Zl7gRcKRBYJXOzOTSQkT8myXKx4JpDRcxDmchJI9+ru4k30V2dpL6\n8qZhXjYc8S9vmf4/Zdqy8uRNW5dLj+UBX4x+Tsm4ut238wc577zzrC6OGLnr+hYCvRDI87+TJ28v\nvbpdf9/73te5/OMf/7hznPfAd+TEZ8ONA5TLy60fNZWoq05cBFF+48fn5+PY+fa59FHfLKZwgzzT\nzZ/61KeikkWe86f2IxOETrrpGXfa6Rc1LYP/iHshZ2xFL//Zzzjsj53hKKtEq3biynG/C/sOzDK5\nhKU+gTKdj7+cNWj0sNgSQSM6dYWXDJEvvO6a374EJshOPdTpL/UML99lyRKSVT90de3y20SZcdf8\n8/7yXadHcJN0ykQvJ36+cJtJQ/1OFzDwxZ3nO6ynny587JZlhs+n/e0v7UIH+oG+TSqk9dtHGZSZ\nVVgemWZZcvDgGPrGN76Rtbpc+aKW87Jcl/P9FnAAO+4Lt0QXHMMfAqGRJvBRqETPfuNSdH0O37zl\n1u3/jvs2sOYlbpb/P8+zMiz+czvueeA/+xhrnPjPUz9N+Nh/BpM/fL3bb1dfeNzplsevD12j0ro0\nfvtJFyU+hq4sfzmvnydcnksf/ga7sPjjlj/mhtPl+R3dwpQlZul4n1TQUDd4+f9gYVCS3izhzsii\nn58nPMDHXcva2X55eYiIu6nczdytG4t6IEb9M6AHDxf6kutRH66Rxunsf4fx7taO8DUCbKXZYI3B\nl4G/34N/N8LRL32oB7yIawH+fDuiAS5RH9Jz70BQyJMnIFq47wbhN5imIcpxmNTt/y5tu8LPcvf8\nd+31n6VpiQhlxz1b3HMm6hnj1+nSue8w4XBEhG9/oHbp+WaMYyxy5yjDlygd3bM7rIufzx2DmSvb\nfXcjCnH3jMvLOOTaFVdHuJ9curzfhRCRtB2PtcI1nm//puh2jVopbAYAACaLSURBVMaGO8gvh2M6\nNwxWWv2oxycHvn69rmXpbL+utESk282MrnGS9sERVw5YR+kQ7pekv6P6L67uqPNpIjz6Az549Euo\nCwtEN0E3yEoZgjXDEQmIB7976ROnB21xAdEgJRCVrGXF1dGm8/QpOOWVuv3fpX0BoP3+cyM8gPrP\n+bRExGHLs9ivg2cQ5CDqGevycI363POKZzO6cN6d49sfYxizfMLh8lBmHIHhWjgf5VIX4ref83Hi\nt89/oY9LT51hndA3PMa5/PSLa3dcP7i0eb7jW5ih1KQd74MHCGEJW0vCLA0w/TQA1Q1MV35S/UhP\nea4Dwp3U7Rp503a2X17UPwn1O11oty/dbmY/XfiYgS6NKTWc3/+NDn6fOl3TflMGZeURBlgG1iQS\nJh/h30nKSJPGTX8kzZM2fa9yaR+DYFmEwREc7iusJpJoBPi/KIKs1en/DkILGUkj/mAbfq6lKacf\naf1nMAP+oIhPsBxJKqPthRKRMhRUmeUhwIBU5Fs3/6w+qUpKRMgTJntZW530IR9FOnwLSdb64/Ll\nsXCga56Bi7oJvw1BSDtYxLWn23n0hRByf0Xh3C3vIFxLQ5aT4FGH/7ssfY1VwU1rJHmbT4JF1jT+\nixTPI//ll5dDpyfPF9IOgtA/7hkOJmXKKAoPKpMMIAJXXHGFjXzKio0iBe/05557zgZ5Y437L37x\ni2HF/8Vf/IUhWixLnj/ykY8MW/I2LGHKHxs2bLDr93utBHDxQ4KBe
UQNxPLgfJEhy6MCl42ouMeJ\nrGWAybnnnmumT59u5s+fX2i7eqhsWJIcvOkaljWyZYPk9wjcfPPNNvzAjTfeWCgkVf3f8f9EAL6A\n8KZuDytSXETSgFCVGnujm3K+Ht3ScS2wDGSKl9Sr3Lpd9zEpvc1lshyVXW8E2KiqqA24qm4p0wKz\nZ8+2/gq9dOn1lt7req/y/etYnPJYM/yy0lpV8N3ACpJ0qsqvq6hjdOYe41MUDkXpVkU5YMAqrdGj\nR7cGjyz+IT72zhpR9lu3X2fUse8bEpCCjjXAP677FFJUu7Kc861V/bAAaWomSy+1JA8PRQYqBoum\nC235q7/6K/uQh0jEkYm48377KauIKSvqoqwihfKStIE5ewb/ItpRhP7oU/RUYBF69aMM+oA+4+P6\nAyyqJIhFtjtvW/B1cYN9UVO0WduHH4TvF+H0wsEzyn8vaz11z0c/0HampPxpqrL01tRMgPYgC9Mz\nSNFm4n5j+vDDD5tbbrllWKRDoqU6IQCQm26JmpJx6dx3t+kblybu2wUpK3O/mG6b5tGnRHRcu3Zt\np81xuvbzPHqxWRtTZ20OY8+9409TRG1uyLTVI488MmxH8X72RVF1cR9OnTp1WHuLKlvlDA4CIiKD\n09eRLXXzu8GbWq0GrUhlu5w89NBDrQ/EaaedFpkKchBMRdldKknAXjO9CEmWsO/gSV39GGij/Ebq\nSkJcpzgy0vT7zbXHffukN8m9xT0CQSFfr/vQ1VHH79NPP90cffTR5tJLL62jetKpIQiIiDSko8pU\nk8GBEL9NfZhgDbnmmmvM5s2bY2EKk4okb60UFs4XW0FwIYoYdEtfxDWf+OAEedddd5mNGzfWmlTW\nnSwl6Rf6Opgm6yTNYv2iv9gjhNDgTRT+N7CGtI1UNrEvmq6ziEjTe7AA/RnMeIvDnNy0tzPeLI86\n6iizcOFCc/zxx0eiQfuQbm2LG1gon/y9LBw8lKNM8JEKFXwSHbH2/PM//7N5+umne+pacPWZimP3\n0MCHpTGracCYAddJEX1NmZSzZs0au+rEld2Ub1lDmtJT9ddTRKT+fdQXDdnxeMuWLY17O0uidxqr\nhgObPE7+93//13z0ox+NtDK4ASrLG7ErP++3G9BuvfVWEzc1lbeOovND7sDs3nvvjSWQRdeZtjz/\nHsDHqBcZTVs+6fEVWbRoUe2tWOG2Ob27WSHDefRbCMQhICISh8yAnWcww7IwZ84cU3RckbKgZKDA\nNMx3nLWDa3lJAthE+ZcwmHKtjAEqDWZMdbCV+tKlS9Nkqzytm1Kry1QS/ek7mea9b5ICjGUhWHnS\nGOsQ1kMsWk215CTtF6XrHwIiIv3DuvY18YDBVIwJuurBtRdYSawADCxIHEnpVYd/nfooD1z4JmDb\nu9/9bhPEg6hsSgb93KDQjYz57ajbcdXmfXBzksTJ1KUt8pv7CdLTBIsW/wdTpkyxLwBN9Skrsu9U\nVjEIiIgUg2NrSuEt9ZJLLqn1Ekv3MAwCIHVddlyENcTvWEdseGv2fQTi/Ev8vGUdVz2Q520XfdRP\nh8cq+6obVi4CLkt6ua/rKjNnzrTWt6Y62NYV10HXS0Rk0O+AiPbXeVWDIyF77rlnV3+WokkIMFE3\nUzS9pq78t+yyfAvQx1lDmr5qoUwyRZ8V7WQK9mXIv//7v9upUVbS1NEiWefnQhn9oTL7h4CISP+w\nblRN7qGzePHi2jwUHQkByG7BupzloogpGddplEn9DBBpSE54ICzS/E8fsV9P0/dxcVYR3z/D4Z7l\nu19EMItucXnc/bV169ZaWiTd86Db/11c23ReCPRCQESkF0IDfJ2HT10iYfKg/vSnP232228/u8rA\nRUmN6p40RCEqf/gclgfqc8QGcoE+Wd5ayecPuP4UT7jebr/Rgby01enVLX3drxGQrtsS7G76hzHt\nl5NpN53SXEN/xPWjmx6tg88I9
xk+IYhIiIVBf0pA4E//XyAllKsiW4BAsNmRHfhZTbPvvvsaBosq\n5MEHH7R+BGeccYYdrN75znfGqlEGCWGA2GuvvTp1Uj9Levnupksng3cAocEq4j7btm0zP/7xjy2x\n+fWvfz2sHi/biMNvf/vbxr09j7jYwBPvete7zLPPPmu453oJg+Mrr7xiMWMQZ/oLUuYw7ZW/TtfD\nJATdDjzwQPu/NmvWLPPb3/7WfPzjH69EZf6XWA7O0vXbb789cvl6JYqp0tYhIItI67q0+AZhETjz\nzDPN/vvvb4gG6d7ciq9peIkMOCwnfuyxx6wVhCWO3awQUQ/14SWm+8WDuJvFomjSQ3t9f4Zu0zhY\nq5ocDTfcE/QdlgzfWuSn8Z1My/S78ess+7jX/cp1rIDIggULci9DT9oe7sOvfvWrlnxcd911PX2i\nkpardEIgFoGydtNTue1CgF1fb7rpJrsjIzupBgNGaQ10dQWEZ2jGjBmdHWw5HwzUsfUm2ZU2NrN3\ngXqSlpU0nVd84kMwpnz3QS8n7HjaDQuXrknffptcH0S1vUltitOVvk36P8T/Hf8LZf/foWvgjG3r\nmjZtWmL94tqo80IgKQImaUKlEwIgwMOTB2LAbO13kYMhZbuHLg/CqEHeDVDh3ohKG06T5Dc6pGkT\n6fn0Q9CLdr700ksW/37U2c86zj333KGrrrrKtjFNH/RTxyLqynLPcN/7/3dF3e+0h7L5v4MI8lm/\nfn0RzVQZQiAxAn8SayrRBSEQgQDTMjfeeGPHhE6ERT5M2TBVkVYwuRMumjKYiti+fbuN2Eicgiin\nQ3wsOE9dmJARTNjkzSvognSb/gnXAR7o4XQJXy/yN3rR9t/97ncmIGpFFl2Lsg4//HDzZ3/2Z7aN\nafqgFsonVKLXdExcMdz37v+OKTn8R/DZYouDLP936IFTLHFBmOrC52bJkiV248i4PZvidNN5IZAX\nAfmI5EVQ+Q3BmPDjCN6k7H41DJJjx461PgzAwxJTHnY7duywaO3atcume/755+3vyZMnm1NOOcVM\nmjQplUMcxIEHdPCGGUlabOEJ//Aw7+YP0qsY8kcRp175slyHuL366qtdg7llKbfqPGCIL0Rbg2Vl\nJSFx/cL/HRF+2eiQD5sIsqpswoQJnSzjx483r7/+uv0d9393xBFH9M3vq6OYDoSAh4CIiAeGDvMj\ngGUgMKvbFR08+BCsHOyF4j8gJ06caK0YWBTyyDe/+U3zvve9L5UVw6/P6ZuXRFAOA00/3uSxPiFt\nC7HdZiJSNAnx72GO3X3MSqpu/3cQk7333rsv92lYR/0WAnEIiIjEIaPztUfAPdyxikB+0pIJ8vMA\nL4o8YKGBWKFPmdJWIkJfYDkLJpbLhK/vZbv7NC/p7rviqlAI9AkB+Yj0CWhVUzwCTMm4gR8Swhs1\ng1kSyeIP0qtcCA2ESJINgbIJXDat8uVy95lISD4clbvdCIiItLt/W9u6KJ8MyAhvn+4NNK7x5GVg\nKGNwcIQorm6dHxwEnIWsjPtscFBUSwcBARGRQejllrURohG3SsZNs7g3Ub/pWEscgSnz7RvdepEh\nXy8d/x4BMGvLoO1ISJn3me4bIdAWBERE2tKTA9QONyUT12Rn7YB0OGGQ45PWj8TlT/NN/ZCepNNE\nacpuc1r6FSfmpotISNN7UPr3G4F39LtC1ddeBBh48ZFgWa5bKhjVWre0Fw9+9tVI8xbsLBpR5frn\neBN10yR/+qd/av76r/+6MKdUv564YywzSXWNKyPuPMuhN27cGHe5seeDwFqN1d0pLhLikNC3EEiO\ngIhIcqyUMgIBrAxPPvmkDUrmYhkcdthhNobI3LlzI3IYu7SXJb2LFy82q1atMkE0Rxugi03t3NRK\nVEbqipuSiUrPOVZh/OY3v4m7XOp54pIwMHVrUxYFxo0bZ/HOkrfOeYh3wb3QVBEJaWrPSe+qEdD
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
uAvnnHFNwOHMTNt4N/jZBhX+cHnJMrbATVLUQSIlAIywivKHx0GOKglgi\nZ5xxho1t4Jw4Gdj9wd3HgAdm1dJNPxe3oZuOZbSXYFXEicgrrFBieqxICTvzpikbcsVbe90Fnw8G\nTawQzockqc6kdzsJJ83jpyMv8XUWLlzYCUgWhHzvGQsEAuULRCZYWmwee+wxax1Br9mzZ9vYJH66\nfh2LhPQLadUjBApGoMi1wEWWxV4sQVOHfQIi0qkiICcjQrSH0xOvwxc/fodflp+GY78cYntECfld\nunA97jzf4RDx/rVwvrg4ItSfpb1RertzxFQI3hrdz9Z8VxniPQuIgb9Q5vDqafdKcTFW6HdiyBQZ\nV4N2VLnXjOKEZLn7lEcI1AOB2k7N4FcRDNSxsS7wq1i3bp1NE+wZE4zvfxQiofKWniUK6h9LKeaI\nMOa8fWLNcYK+AdFKpV/R7XWm6zwrOVx76vS9atUqc+CBB9ZJpa664DdCX6R1YiUfH2cF6FYJlgsc\ngKdOnWqj6bL65tJLL+1pAelWZvgauhCldfPmzTZ0/DXXXGOnbfpxf7k63D0d1k2/hYAQqDcCo+BD\n9VZR2pWFAL4BbNde123a07Z7w4YNJtiAzQ6GafPWIT1kJK0Ta68pGq5DyPfff3/ry9HPwRrfkc9/\n/vMmsL5Y4lMGxpAQ2gQRkggBIdBMBGprEWkmnM3SGufD5cuXN0vpLto+99xz1uehS5JaX8rixNpt\nSS99ixVkzpw5dk+YfpIQgMbqgvVlzZo11iG2CAdbvwNFQnw0dCwEmouALCLN7bvcmjMw4BgazK8X\naqbPrVjGAljaysZtaZ0/M1ZXWjamW+ibpO1w0zM+0bjiiivMypUra4EHbYEMvfnmm2bt2rWFWC9E\nQkq7/VSwEOg7ArKI9B3y+lSIOZupjGXLltVHqYyasAx19OjRiQfvjNX0JRuEgk9SvxHSMtjzQSAh\nrCjDGpGUzJTZMO4zdvjFT2rKlCkdPbPWKRKSFTnlEwL1REAWkXr2S9+0YrDDfM+g1eR59g984APm\nkksuMTNmzOgbdv2oiP5J6jdCWhyjISFFWR6KbqMjSVn1EwkpukdUnhCoHgFZRKrvg0o1wMdg4sSJ\n5u67765UjzyVYw3B55qYJgzG/idPuXXIm8ZvhGBuTMd8/etfry2pvPHGG81+++1nzj///NTwioSk\nhkwZhEAjEJBFpBHdVK6SDNxYRfj2/QzKrbW40g899FC7ZJTlo2GhTb7gE1OH6QpfpyTHvfxGGKSx\nnNRlOqZbm5hCYorm2GOPNfPmzeuWtHNNJKQDhQ6EQOsQEBFpXZdmaxAm87ffftvO5WcroZpcLBFl\nVQZOqkmEQZDB2pekUx9+niqOne5YSXzhPMuwcQhtylJsiAXh4em7cHv8tnEsEhJGRL+FQLsQEBFp\nV39mbg2DGQPyrbfeWlmI7rTKF2UFoJzwjsi9Bse0uhaZHiuPT54IVrZlyxa7RLfIesoui+XFixYt\n6hr3JdzWsnVS+UJACPQfARGR/mNe2xp56DNF04QlsGVbAcJTOiwNrtO0FeTJORejW1OXYGMV4Z4j\n5khY6IM6E8KwvvotBIRANgRERLLh1tpcTHXcddddZuPGjZ2Bro6NZQBjOSjOj/0QfDQY7H3xrRL+\n+X4do9MXv/hFs++++yb2teiXbknrceQ3vGpLJCQpgkonBJqPgIhI8/uw8BbkXWJZuEKhAuuiX3hK\np9+OsBARrCHoccABB4RQas7P008/3e6B46wiIiHN6TtpKgSKQEBEpAgUW1hGXQZ7H1qmY9hMra5x\nMtAv7Ahb5pQOfYT0yyrk90WRxxAP9sNhwzyRkCKRVVlCoBkIiIg0o58q0ZKBbv369TZIVtVLXhnk\nWfKJZA2GVQWIUVM6Rfk9QHKa4M+TBHeWYF9wwQXm4osvTpJcaYSAEGgRAu9oUVvUlIIR4
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 67, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 70, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 71, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 71, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 79, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "review_counter = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 80, + "metadata": { + "collapsed": true + }, + 
"outputs": [], + "source": [ + "for word in reviews[0].split(\" \"):\n", + " review_counter[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 81, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('.', 27),\n", + " ('', 18),\n", + " ('the', 9),\n", + " ('to', 6),\n", + " ('i', 5),\n", + " ('high', 5),\n", + " ('is', 4),\n", + " ('of', 4),\n", + " ('a', 4),\n", + " ('bromwell', 4),\n", + " ('teachers', 4),\n", + " ('that', 4),\n", + " ('their', 2),\n", + " ('my', 2),\n", + " ('at', 2),\n", + " ('as', 2),\n", + " ('me', 2),\n", + " ('in', 2),\n", + " ('students', 2),\n", + " ('it', 2),\n", + " ('student', 2),\n", + " ('school', 2),\n", + " ('through', 1),\n", + " ('insightful', 1),\n", + " ('ran', 1),\n", + " ('years', 1),\n", + " ('here', 1),\n", + " ('episode', 1),\n", + " ('reality', 1),\n", + " ('what', 1),\n", + " ('far', 1),\n", + " ('t', 1),\n", + " ('saw', 1),\n", + " ('s', 1),\n", + " ('repeatedly', 1),\n", + " ('isn', 1),\n", + " ('closer', 1),\n", + " ('and', 1),\n", + " ('fetched', 1),\n", + " ('remind', 1),\n", + " ('can', 1),\n", + " ('welcome', 1),\n", + " ('line', 1),\n", + " ('your', 1),\n", + " ('survive', 1),\n", + " ('teaching', 1),\n", + " ('satire', 1),\n", + " ('classic', 1),\n", + " ('who', 1),\n", + " ('age', 1),\n", + " ('knew', 1),\n", + " ('schools', 1),\n", + " ('inspector', 1),\n", + " ('comedy', 1),\n", + " ('down', 1),\n", + " ('about', 1),\n", + " ('pity', 1),\n", + " ('m', 1),\n", + " ('all', 1),\n", + " ('adults', 1),\n", + " ('see', 1),\n", + " ('think', 1),\n", + " ('situation', 1),\n", + " ('time', 1),\n", + " ('pomp', 1),\n", + " ('lead', 1),\n", + " ('other', 1),\n", + " ('much', 1),\n", + " ('many', 1),\n", + " ('which', 1),\n", + " ('one', 1),\n", + " ('profession', 1),\n", + " ('programs', 1),\n", + " ('same', 1),\n", + " ('some', 1),\n", + " ('such', 1),\n", + " ('pettiness', 1),\n", + " ('immediately', 1),\n", + " ('expect', 1),\n", + " 
('financially', 1),\n", + " ('recalled', 1),\n", + " ('tried', 1),\n", + " ('whole', 1),\n", + " ('right', 1),\n", + " ('life', 1),\n", + " ('cartoon', 1),\n", + " ('scramble', 1),\n", + " ('sack', 1),\n", + " ('believe', 1),\n", + " ('when', 1),\n", + " ('than', 1),\n", + " ('burn', 1),\n", + " ('pathetic', 1)]" + ] + }, + "execution_count": 81, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review_counter.most_common()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 4: Reducing Noise in our Input Data" + ] + }, + { + "cell_type": "code", + "execution_count": 82, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " # set our random number generator \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews, labels)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self, reviews, labels):\n", + " \n", + " review_vocab = set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def 
init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " \n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " self.layer_0[0][self.word2index[word]] = 1\n", + " \n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def train(self, training_reviews, training_labels):\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + " self.update_input_layer(review)\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward 
pass ###\n", + "\n", + " # TODO: Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error is the difference between desired target and actual output.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # TODO: Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # TODO: Update the weights\n", + " self.weights_1_2 -= layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " self.weights_0_1 -= self.layer_0.T.dot(layer_1_delta) * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / float(i+1))[:4] + \"%\")\n", + " if(i % 2500 == 0):\n", + " print(\"\")\n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \"% #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 
100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + " self.update_input_layer(review.lower())\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] > 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 83, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 84, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):91.50 #Correct:1795 #Trained:2501 Training Accuracy:71.7%\n", + "Progress:20.8% Speed(reviews/sec):95.25 #Correct:3811 #Trained:5001 Training Accuracy:76.2%\n", + "Progress:31.2% Speed(reviews/sec):93.74 #Correct:5898 #Trained:7501 Training Accuracy:78.6%\n", + "Progress:41.6% Speed(reviews/sec):93.69 #Correct:8042 #Trained:10001 Training Accuracy:80.4%\n", + "Progress:52.0% Speed(reviews/sec):95.27 #Correct:10186 #Trained:12501 Training Accuracy:81.4%\n", + "Progress:62.5% Speed(reviews/sec):98.19 #Correct:12317 #Trained:15001 Training Accuracy:82.1%\n", + "Progress:72.9% Speed(reviews/sec):98.56 #Correct:14440 #Trained:17501 Training Accuracy:82.5%\n", + "Progress:83.3% Speed(reviews/sec):99.74 #Correct:16613 #Trained:20001 Training Accuracy:83.0%\n", + "Progress:93.7% Speed(reviews/sec):100.7 #Correct:18794 #Trained:22501 Training Accuracy:83.5%\n", + "Progress:99.9% Speed(reviews/sec):101.9 #Correct:20115 #Trained:24000 Training Accuracy:83.8%" + ] + } + ], + "source": [ + 
"mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 85, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):832.7% #Correct:851 #Tested:1000 Testing Accuracy:85.1%" + ] + } + ], + "source": [ + "# evaluate our model before training (just to show how horrible it is)\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Analyzing Inefficiencies in our Network" + ] + }, + { + "cell_type": "code", + "execution_count": 88, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAl4AAAEoCAYAAACJsv/HAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHsvQv8HdPV/7+pKnVrUCSoS5A0ES1CCC2JpPh7KkLlaV1yafskoUk02hCh\njyhyERWXIMlTInFpRCVBCRIJQRJFtQRJCXVLUOLXoC7Vnv+8t67T9Z3Muc85Z+actV6v+c6cmX1Z\n+7NnZn++a63Ze4NMIM7EEDAEDAFDwBAwBAwBQ6DqCGxY9RqsAkPAEDAEDAFDwBAwBAwBj4ARL7sR\nDAFDwBAwBAwBQ8AQqBECRrxqBLRVYwgYAoaAIWAIGAKGgBEvuwcMAUPAEDAEDAFDwBCoEQJGvGoE\ntFVjCBgChoAhYAgYAoaAES+7BwwBQ8AQMAQMAUPAEKgRAka8agS0VWMIGAKGgCFgCBgChoARL7sH\nDAFDwBAwBAwBQ8AQqBECRrxqBLRVYwgYAoaAIWAIGAKGgBEvuwcMAUPAEDAEDAFDwBCoEQJGvGoE\ntFVjCBgChoAhYAg
YAoaAES+7BwwBQ8AQMAQMAUPAEKgRAka8agS0VWMIGAKGgCFgCBgChoARL7sH\nDAFDwBAwBAwBQ8AQqBECRrxqBLRVYwg0GgLvvfee22CDDdzWW2+diqahZ+fOnVOhqylpCBgCjYuA\nEa/G7VtrmSFgCPwbgT59+jiIookhYAgYAvVGwIhXvXvA6jcEGgiB2267zbVt29ZbwrCGDRo0KNs6\nOf/kk09mz2GFIh3nIEYQJH6LJW38+PHrpZ06daq3so0cOTJ7LdcBaSgLvUwMAUPAEEgCAka8ktAL\npoMh0AAIQJwgWi+99FK2NZAkyBTSo0cPv1+wYEF2T57999/fbz179mxBkLgGcdLkjYyc41oxMm7c\nOJfJZNysWbOKSW5pDAFDwBCoOgJGvKoOsVVgCDQHAliVIEQDBw70ZGft2rWuVatWTojWiSee6IGQ\n32L54jwEjd+QM4gS2xNPPOF23333FmSMAiQNpMrEEDAEDIG0IWDEK209ZvoaAglFQAgXZAmrVNgy\nBWGCiAnhEgIG8RIrGefE1UggPOchc5KHpp999tkJRcDUMgQMAUOgMAJGvApjZCkMAUOgCAQgScRs\nCaGCgEG0tECyIFKkgUzhZiSdyJQpU7IWL7F8sSediSFgCBgCjYCAEa9G6EVrgyGQAARwF0KqIFe4\nASFU/NYicV7ylSFpESFflCHWL53Pjg0BQ8AQaBQEjHg1Sk9aOwyBOiMg1i2C4XEXQq7knKgG0eKc\nEDIhXrgpsWphBZOvH7XLUfLb3hAwBAyBtCNgxCvtPWj6GwIJQQDyJBYtyBVuQ7F66ekchGyRVixd\nNGH+/Pk+MF83hzI5b2IIGAKGQKMgsEEQP5FplMZYOwwBQyD5CDA3F4H3uCMtUD75/WUaGgKGQLwI\nmMUrXjytNEPAEMiBAG5E3IeQLixiWLMqEaxo4o6M2lOPiSFgCBgCSUNgo6QpZPoYAoZA4yOApSsc\n/1Vqq3FZmsG+VNQsvSFgCNQbAXM11rsHrH5DwBAwBAwBQ8AQaBoEzNXYNF1tDTUEDAFDwBAwBAyB\neiNgxKvePWD1GwKGgCFgCBgChkDTIGDEq2m62hpqCBgChoAhYAgYAvVGwIhXvXvA6jcEDAFDwBAw\nBAyBpkHAiFfTdLU11BAwBAwBQ8AQMATqjYARr3r3gNVvCBgChoAhYAgYAk2DgBGvpulqa6ghYAgY\nAoaAIWAI1BsBm0C13j1g9RsCTYTAgw8+6F555RW3/Nnn3RtvvOHWrH7DPfjgovUQ+MFJp7gtttjC\ndezYwX1t553c4Ycf7r7yla+sl85OGAKGgCGQNgRsAtW09ZjpawikDIG5c+e6hx9Z4u6+607Xuk0b\n981993e77b6b23PPPd1WW27puh7cpUWLnn1uhXv1tdfc6tVr3EsvveSeefpP7q475zrI2JHf6eF6\n9eplJKwFYvbDEDAE0oSAEa809ZbpagikBIH/9//+n5tx401uzuzZXuPeJ3zPHdG9u+vYoX1ZLVi9\n5k0379773WPLlrr/mzrZ/XzE2e4npw92u+66a1nlWSZDwBAwBOqFgBGveiFv9RoCDYrAlVde5a65\n+mq3X+cD3Kl9+7qjj+wZa0uxiP3619e5yydeagQsVmStMEPAEKgFAka8aoGy1WEINAECxG9dcMEv\n3RZbbuVOO/302AlXGEIhYPPuvsudM+oc169fv3AS+20IGAKGQOIQMOKVuC4xhQyB9CFw5VWT3DWT\nJrnThw5zw4acXtMGzLtvvrtk3NggfmzHwNJ2lcV/1RR9q8wQMARKRcCmkygVMUtfFgKdO3d2G2yw\ngXvyySfLyl+LTFOnTnVbb7211xNdx48fX4tqU10HsVyDBp/uFix4wF1/w/Saky7Aw5V58y23uO23\n38Ed1OUg98c//jHVmJryhoAh0NgI2HQSjd2/1roiEYAQDho0qEXqkSNH+t9nn312i
/P243MEIF19\n+w1wm2++uZs8+VrXpvUOdYOGuideNsF/Lfn9//6+m3nrTPfNb36zbvpYxYaAIWAI5ELALF65kLHz\nVUWAaQJ69uyZtS5xzDmkT58+/ry2OElaOcceq5RskKb33ntvvfxY2tgKCdYu5MQTT3SZTMaNGzfO\n/16wYIHf25+WCAjpatt2D3fLzTfWlXRpzXBzjhg5ykG+zPKlkbFjQ8AQSAoCRryS0hNNpgfkSpMa\njiFXSI8ePfxeX8ci1apVKzdw4EBvmRJrlE8Y/IE4SX45Bzkr1rUp6SBeiOzlvJRpe+c06cLKlDT5\n0YC+Rr6S1immjyFgCGQRMOKVhcIOaoUAli0Izf777++tS1iYOJbzkCtIlpAeCBjWLNKwh2RxvGrV\nKp9/7dq1nqyRXpO13Xff3XHtiSeeKNg0sZZRLyJ7zsu1goU0SYIxYz+PfUsi6ZIugHwR6D98+Jme\nKMp52xsChoAhUG8ELMar3j3QhPVDiCBbECixXImbUeCAWEGi2ISAYYWSY/Zt27aV5Nm9XOcE6YVA\nZRPYQUUITJ8+3d05d45bGEwdkXTB7bj8mWfc6T8Z6t2hSdfX9DMEDIHmQMAsXs3Rz4lrJXFXEq+F\ncuJeFEW1qw/yBYGSc6TBKgZ5C2/lBsILQRMCKFYuzss10a1Z93/5y1/c2DFj3cRggtR6BtKXgv/o\n0ef79SAhjCaGgCFgCCQBASNeSeiFJtPhtttu85YryBZB7JAobakCDrFWQc4gXljAIEDsEcpgi0uk\nXOpCpGw5H1c9aS5n1Lm/cCf0+X7VJ0aNEyMI4lkjz/GEkdg0E0PAEDAE6o2AEa9690AT1i8WJFyN\nfJWIy1AsTAKHkCw5L9Yu3JQQNc7L14/yZSNzcEl6KafYPWUiEC7KExdo2BJXbHmNlo5Z6f/wxON+\nfcS0tY15vo4+5rtOYtPSpr/pawgYAo2FgBGvxurPVLQGMqNdghxr4iONELIFCZPrXJsyZYq3lAmB\n4xxlzp8/v2y3IJYtytVlYo3TelJPs8rU/7vOB6unxcUY7qcf//hHbsIl4xzuUhNDwBAwBOqJgC0Z\nVE/0re68COB+JBZMSFXexHaxaghg7Ro8aLBbsXJF1eqoRcHDzxzhvvjFjdwl48fWojqrwxAwBAyB\nSATM4hUJi52sNwK4DWXiU23tKkcv3I/ijozaSz3llN0MeW75za2u74Afpb6pWL34ItNivVLfldYA\nQyDVCJjFK9Xd17jKS7wW7sZZs2Y1bkMT3jJICu7X5c8+7zp2aJ9wbQurd2yv3u743r1c//79Cye2\nFIaAIWAIVAEBs3hVAVQrsnIEmPiUqSKMdFWOZSUlzJ071/3PwMENQbrAoUewOsKSpY9VAonlNQQM\nAUOgIgSMeFUEn2U2BBobAUjK3p06NUwjj+je3f3f1MkN0x5riCFgCKQPASNe6esz09gQqBkCix9c\n5Dr/e+60mlVaxYpwl3732OMcHwyYGAKGgCFQDwSMeNUDdavTEEgBAjL1QteDu6RA2+JVbNt2D/f0\n088Un8FSGgKGgCEQIwJGvGIE04oyBBoJAYjXfp0PaKQm+bbstvtu7rXX32i4dlmDDAFDIB0IGPFK\nRz+ZloZAzRHAHbf99jvUvN5qV7jnnnu6N94w4lVtnK18Q8AQiEZgo+jTdtYQMASaHYFWW2/jNt5k\ns2aHwdpvCBgChkCsCJjFK1Y4rTBDoHEQ+Ne//tU4jVEt+cY+ndxvbrlJnbFDQ8AQMARqh4ARr9ph\nbTUZAoZAAhBI63qTCYDOVDAEDIEYEDDiFQOIVoQhYAgYAoaAIWAIGALFIGDEqxiULI0hYAg0DAJM\nCttur3YN0x5riCFgCKQLASNe6eov09YQqAkCrNG4ukG//PvbunUNOU1GTW4Mq8QQMAQqRsC+aqwY\nQiug1ggwzcEf//hHvzHXFNsrr7zSQo2tttrKf
fOb33Rf+cpX/P7www93bCa5EYBsgSsCbh07dmjI\ndQ3ff//93CDYFUPAEDAEqoyAEa8qA2zFx4MAizXfcMMNfqkXSAEkCmLVv3//LLnSNQkhYw+Z+OlP\nf+r+9Kc/uV69ernjjjvOb5TT7CI4gYPgKphAxGbPuUN+Nsz+xRdXuS5dDmyY9lhDDAFDIF0IbJAJ\nJF0qm7bNggAD/+WXX+43SAHkCdK06667lgUB5QmBg4xR1ujRo8surywlEpBJky2wzIfnBhts4N5Y\nvcY10peAJ518qvtOzyM8aU9Ad5gKhoAh0GQIWIxXk3V4GpoLQRJChFsRsgRZgHjlIwmF2gZ5w0Im\nrkrS77bbbv4cdTayQDRpNxuCxZCtEJ4sKP3Io0t8nkb584cnHvdtb5T2WDsMAUMgXQgY8UpXfzW8\nthADXIjsIVzsIQhxC4QD1+XLL7/sIF38xrrWSAJ2stE+cGTjuFjpcuCBgYv26WKTJz7dvPvmu9Zt\n2pSEQeIbZQoaAoZAqhCwGK9UdVdjK4tFCzKEtYvjWggkRAieWMPQIa3xXxAtEUhWpXLMMUe74cPP\nrLSYxOR/5JFHXZsdd0qMPqaIIWAINB8CFuPVfH2euBZjcRKSAAkqxSITZ2PQA/KFWxPyheUt6YLO\n8iUiugqOcerdrVt3d9pPhrg+3zs+zmLrUlb7du3d5CmTIz/IqItCVqkhYAg0HQIbNl2LrcGJQkBI\nl7gX60W6AAUrF8QP8sKmCU2SQIMYiguRY9GXfTWk9/HHuwXz51ej6JqWed20GW6v9l/3eHGfaetg\nTRWxygwBQ6CpETCLV1N3f30br0kXFqYkCfrg7mRwToLlC4LFhkAa2GohkE8I3d/+9je3/NnnXccO\n7WtRbVXqwHI3dOhQd/zxvbPl07+0z8QQMAQMgVohYMSrVkhbPS0QSDLpEkXrTb4gPeCE1JJsSfsh\nJUy5AenaY489XbfuR7ipU66Vy6naY+26acYNbtGihevpbeRrPUjshCFgCFQRASNeVQTXis6NAAM6\npIJBL8kiVi/0rEXAvcYDS1st6ozCH9I5YMCAFpd23mlnN+XX17mjj+zZ4nzSf6xe86Y7+aST1rN2\nab3BvZ54a13s2BAwBBobASNejd2/iWwdXy0ysGPRqRexKAUYXFES/1VKvmLTarKVBLcXZPOKK67I\nqs/yS8S+4eqcPn2Gu/mWW1I1oeq5vxjtXn5plbvl5huzbYo64H7EspiGezJKfztnCBgC6UDAiFc6\n+qlhtGRw23fffd1TTz2ViNipYoDFMseADFnEUlepUB44iCSBbKELekG6pk+fLqq5XXbZxZOub3zj\nG45FLpj1ve0ee7qLLxydTZPkA+btGj5sqLv3vnt9HxbS1chXIYTsuiFgCFSKgBGvShG0/CUhAMlg\nw+qVJsHiI1NNlGMR0WSL/EkI2Nf4ox/9wnqWIpCtRYsWeQvQv/71L/fPf/7Tvfjii8F6l8e5kaPO\ncz8a0FeSJnL/7HMr3Am9j3Njxo5tEVBfSFkjX4UQsuuGgCFQCQJGvCpBz/KWhAAWIwgXA1s55KWk\nyqqQGGLCVixpxDXHhiSRbHnFgj/0B5a8V155RU65fv36uYkTJ3q9IVxs//jHP9ynn37qZs2a5X71\nq8vc9Bk3uq4Hd8nmSdIBcV2DB5/m2rdv7y4ZP7Zk1eQexdJpYggYAoZAnAgY8YoTTSsrLwIMYpAW\nLEdpFAZjiBdkKhdxJA3WI4T2siVZiC+TLxdFzzPOOMOTLn4L6YJwffTRR+7jjz92K1eudA8uetDN\nvHWmu/GmWxJHvoR07RgsDXTttVdLs0reC2lOeh+W3DDLYAgYAnVFwIhXXeFvnsrF2iWDWVpbDmlk\nINZWL0220vRlHH0S/nJx2rRp3tpFPBfuRbFyQbr+/ve/uw8//NDde++97pBDDnG/u+tud+usZJEv\nIV3cXzOmT
8tJkIu9/+R+NfJVLGKWzhAwBAohsGGhBGm9PnXqVLfBBhv4rWfPnqlrhtafdmiRdrFf\nsGCBvlTwuG3btllcxo8fXzB9XAmEeMVVXr3KgXixmDaWItkYlLGEseWyhNVL31z1EkSvSRdfLhLP\nhYsR0oWlCyvXJ5984gnX+++/7+fzItaNjyPWrVvnDu92mPveCSe6Q7oe5Jgnq96yZOljWffinXfM\niaUv6FsEcm1iCBgChkAcCGwURyFWhiGQDwGsBg899JD/Oi5furRcY0JR3IlxfOFY6zajd74vFyWI\n/rPPPsuSLqxcEK9nnnnGbbPNNt7qBTnbeOON3dH/31Fu+x22c2MuusAtD66PGPGzukw1AfGbMG6M\nO33IEDds6JBYYYV8gRvkK2kfRcTaUCvMEDAEaoJAw1q8aoKeVVIUAlhJevXqFYsFoqgKq5gIqxZB\n57QpbQJ5QH89XQRfLjK1B3shXbgXcS1+8MEHnnBBNP2SQcuXu+22285bwkiLxXWjjTZyPXr0cNdf\nf737y19edid9/weOKRxqJXy5OHDQaZ50sfh13KRL2oElEwJmli9BxPaGgCFQLgJGvMpFrsr5Bg4c\n6F0+WBbY0iyQlLitQ08++aTDVdqnTx+HK1m7X+UYtyrXRo4c6W677Tb33nvvxQIj5CVtxEusNXq6\nCNyKMl0ErkWxcgnpwp0opOuuu+5yXbp08WkAEWvXJpts4rdNN93U7bXXXu6GadcF83yd5OfNYr6v\nahIwYrnGjJvgp4uAFC17bJknlbF0cI5CjHzlAMZOGwKGQEkINBXxYqCWQZn9oEGD3EsvvZQTMOKn\nSKPzbL311n7AzzWI67TkZ+vcubMvQ2KqikmTL8YrrDCkQuqgbI45V45EtRnygj7lCm5GyEocInjS\nRiFUnIsS+pZrQtAgYuTJ1XdRZUSdE3dTWqwfxKKBv54ugi8XCaSHTIS/XMStqEnX8sDS1bp1a5/u\nC1/4QpZsbb755o4N4vWlL33JffGLXwzixvq7JUuXBCTtwCwBm/Xb2VEwlnWOOK7hZ45w3YP2LH/m\naf9lJdNF0I5aiJAvMDUxBAwBQ6AsBAJrSkPKlClTMBP5LXCFZNjkt963atUqs2rVqvUwOPvssyPT\nS17yPfHEE+vlk+vsw2WMGzfOpy8mjdaf9Fp0/sAyllNPqU/n3X333bPpw9f5rcsOH1NXqRK4sTLB\n7OelZotMX0i/sL65foNBVJ9HVprjZOA6zQTEJcfV5JxGxzAOnAtchZmAcGUCt2Im+FoxE7ghM2vW\nrPG4BIQy8/DDD2fmzZuXmT17dmbw4MGZ3/zmN5mAzGcCy1dm4cKFmccffzzz/PPPZ1577bXMu+++\nmwniwDJBIH4msJr5skEgILiZ4KOKzOGHd8u026td5qfDf5659bbbM8uffb4kgO659/7MqPPOz5Yz\n4qyRmZdffrmkMqqROLAWVqNYK9MQMAQaHIGmCK7PZREJBiRv/cCqNX/+f+JSsJCIdYo0UYLVBEtQ\nMIC7gIRFJSlYBpkK1RNZsDqZzxKFdWf//ff3MTgqS+QhFjLS5xPqCkiLCwhlvmQtrmEVIjamUilG\nv2LrwBKGizIgzsVmWS8dVi8+Gkiy5Fpz8bDDDst+uainiyCmS+K6CKjH5Xj//fd7axnWLCxbm222\nmdtiiy38xrG2dmENE2suuGAdwp3Jxn2w+OFH3Nw5c9x/n3hCUGY317rNjm777XdwXw3ixsKCNQtd\n7rpzrvvusce5Aw88wJ1xxrDYXdbhekv5jRVRrIml5LO0hoAh0NwIbNgszYeAMNAGRNqtXbvWExJp\nuyZmECpNhiAakDLysRF7JRJOK+f1Xsdq5SIsxaTRZYaPA0tQVr/AUtbicj5iphNq0qWxglgSPC0C\nNrS7WIGcMEBVKrpPpCz0pO1s9FF4Y4Z1roEv/aiFGLFy3bGUA/FKqruJIHq
mvdALXbPmIvpCugiM\nJ54L0sV0EfLVorgX2XPuz3/+s9ttt918PNeXv/xlT7aYdoINFyPniPMi3itMujTWgheB7yxUzXM0\nceJlbuD//MhtteVmbvMvb+I23WRjv20WHH/68YeuT0DOzhx+hk/L1BDnnTsqUaRL2ifkC8xNDAFD\nwBAoCoHgJdiQEnbVhV1LwSDdwgUTkDGPQzgf6cKi3Za4HLUEoGfLJV2UFJMmrIcuR+cPSIW+5I8D\nspHVgbTSNi7iZpP8pENwmco59mGsyE87JQ26FSvnn39+hq0SoX6pm30uN2+hOsK44AouV3AzBSSm\n3OxVyxeQ4kzwhWILvPgNhrgXcQXiEgysSZl33nkn8+qrr3qX4e9///vMAw88kLnzzjszAWH1rkVc\njLgagwlTM4888kgmCMz398abb76ZCYLuM4FFzLsqcVlSdjMLLnWwNzEEDAFDoBACTWHxwtoRtniE\nf4sVB0uICC5Ebe2R81hQRMin88h59lF59fVi04Tz6N8nnnii/umPw/Xm+4CADFp/rEhhbMBB11Oo\nPK0QVhYJRtfnSznW+pEPKxZ6lipYHHXbwuWWWl7S0uPOA+tyv1wkqJ4lgQi2X7x4sTvqqKO8ZWvL\nLbf0bkMsXbgZsXQRTM9UEoUsXUnDqFr6iOvZLF/VQtjKNQQaB4GmiPHSg22hrtOkItfgDhHRIqRN\nn+M4nC58vdg0UfnkXFTbwvXm0k/K0NchI8Tp5BOdPl86uRb3F2dRbZa6Cu3Jq/u4UPq0XIfglrLm\nIq5Eiediz3JAzFSPGzIImPeLS8tXi5Atjonpkq8XIV0bbvj5/22F7pe0YFipnpAviWmM+56vVDfL\nbwgYAslBoCksXsmB2zSJA4FyLVUQxnLzxqF3tcpgOaZu3br5ObekjuDLRT/Ra2Dy9hYs4rmwZgnh\nCsdzEesF6YJQvfXWW65Tp04+lgsrFxYviJfEcwnp0oH0Um+z78XylfQPL5q9n6z9hkA9ETDiFUJf\nW1NyDdJhi0/YwhQqsqo/o3QM66fbFKWM1h83JYN1vi2I8YoqJvIcXzRiBahEwq5TAu2jgu3z1YGV\ni69XNTbhcvPlT+q1ctdcZGJUXItYuiBd9DdfLgaxXu673/1uC9KFpSuKdCUVk3rrBflCjHzVuyes\nfkMgmQg0hauxFOi1e5FBmi8ewwO0/lIQ0qLzlFJXHGnRT8dfUab+ShP9ChEvrT9EjnZrMlaJnhCv\nOOJeaKN8hYh+fIXJRt/k6gPS0R5IV5R7MdyvlbSz1nnBtNw1FyFcuBexgPF1I5YrSJe2dEksl54u\nAtdipVYuyAhu0b+te9899tjvPWzowrQRSDDfl9uv8wH+uH27doG1bXPHl4NCZvyFFPzhvqetbByb\nGAKGgCEgCBjxEiT+vWeAZ0Bn0EawkmDhkUGa35rYhEnPv4up2S48txa/9dQQxegH8ZLYJ9rN/GS0\nWQiZlCmYcE1/YFCosXEQLwLjwV10kDqlL4SUyflCe/SX9hVKG3VdYnmirlX7HHhCRnQQPWstBl9a\n+iB4XIYEyIt7EauWTBkB6eJYgughUkwHQcB88OWjO+SQQ7LxXJAurlXqWgSr3919j3sg6L81q1d7\nYrV3p33cET16ujZtWnu4mDICYe3FV4MYM+Spp/7onnt+pbvjjjt9vm8Hc391PbiLnyrDJ0j4HyFf\ntD9txDHh0Jp6hkCqETDiFeo+rCcM8kJesJRARKKEtHxhV29BV9E3rAttKUZIB6lEsBKxJE+UQNBK\nIV0QhNGjR0cVVdI5SBKEL+wuLKmQfyeGjFbab/WyZDCIE0Svl/9hglIW7iagG8LFRqC8zNGFRYlN\nSBfXSIMFi2B5SBdz3FFu1PxcfLmIQNJKkdmz57irrrrKk6YT+nzfnTXyHHf0kdHPkpTbsUN7x4bo\ntBCyBxYudLPn3OHGjR3nTh8yxB373f9
ykJskC/pBlI18JbmXTDdDoLYIWIxXBN6QkELkAtLFhJ3s\n6yn5iBXkopCbUXSnvYXICGXR5lKEgYe1GuMQCBMTutLmcnDHasmkqmzl5NdtYCCFVNZScNFRpyZd\npay5CPmCjEG6IFPEbUG0sHRhMZOvF7WlqxzSBeHq1q27J12n9O3vVqxc4S6+cHQLIlUqbpCxYUNO\nd1jGJl55lXv55Vf85K5XXjUpFld2qfqUkh5CzHPAPWNiCBgChoBZvHLcA+JexJUVjukSYlbp4J2j\n6pJOQ5ggRASbSxwT1iF0LMbNqCsjD+QEt50OXqd86uF6qcKAw6zpcf3HD+YQRDYsc+JqlL3WD71l\no11x9RcWDMhkLd1HfLk4YMAA3Ty/yDXWLgLjcS/q5X9wL2Lhkngulv/B0kVaXIeQLln+54UXXvAu\nRpmfi3gvCJfEdLWoNM8P+viSCb8KLFxvOAjXjwb0zZO6/EtYwth+/OMfuYsvvthdM2mSuyrYevbs\nUX6hVc6pyVct75sqN8uKNwQMgTIQ2CB4ETPLtYkhUDUE+gfL10DA4nA5Vk3JEgqeO3eub0utLBhx\nrLkI6ULCay5CXo855pi8ay4WA8306dPd2DFjXd8BP3L9+53q2rTeoZhssaSZ9dvZ7n+DJYW6dT/C\njR17sXe5xlJwFQoRt2OtraVVaIoVaQgYAmUiYK7GMoGzbMUjQOwQZKURREgXZLLawiBNPZWuuQjp\n0kH0uBSZn4sPFfbee++S1lyMavNZZ5/jSRcuwFEjR9SUdKFPn+8d7xb6LyXXub79BiTapYflC9KF\n29jEEDAEmhMBs3g1Z7/XvNUMOAw2aXezQIZwWTJBKVY8kbgtGNRDmXF/uUhMl8RyPf744+7oo48u\n+8tFdIToIJMnX1tzwiXY6/3wM0e4eXff5WbeOjPx9xrPQ9z3jcbCjg0BQyCZCBjxSma/NJxWuMsY\nqG8IYpXSLJdffrm33oUtFuHfEEzIZjmCC7MaXy5CumQWeiZKZS1GposgnqvUIPokki7B+rppM9yE\ncWOMfAkgtjcEDIFEIWDEK1Hd0bjKMP3CbrvtFnyN9nILS1HaWoyVC/IFMconkCfIiQj52AoJBI6y\nmVlehC8XmS4C4YtEJj3FfaiXAJIger3mImSK6SIIoteWrr/+9a/+/J577pmdo4uyS5ku4qSTT/VT\nVCTF0oX+WtJGvioh6rrddmwIGALJR8CIV/L7qGE0JF4JSavVC8LFBoksVcij82ENC7tdwaVaXy5C\nvNj4cnHp0qV+brpyvlyk3RddPCZYWujxxLgXc/XFub8Y7Z55+k9uxvRpZVsfc5Ud93mIOsS8XCtp\n3PpYeYaAIVA9BIx4VQ9bKzmEAMQDq9dTTz21HukIJU3cT6xXDIwE18cRl0N5+qtIXLE6novgd+o6\n7LDD/BQQWLr0dBHhSVFlugiAC3+5SEwXVi/m59KkCwtXKVYuyp4/f4EbGkxeevucudmJTjmfVMEy\nt1WwyPe1116dVBWzehn5ykJhB4ZAQyNgxKuhuzd5jWNKCQiFJh3J03J9jcS1iO5xCgQsvOYi5eNa\nxMUoy//gXsS1qJf/EfIVXnMRgiWuRUgXVi7OrVmzxpMy5jYrh3Sh60FdDnK/DCxefEmYBlm95k3X\nPfhIIenzfAmWPBdYvSD5JoaAIdCYCBjxasx+TXSrcLFBZNIyrxdkCzepWCTiAhcig/VMW7r0mous\nvSgxXVi72rZt6yc1lYlRIWFRay4K6WIvli6C6B955BHXvXv3skgXbR40+HRvfZs65dq4IKhJOTLP\n17LHlqXClScuaSNfNbk9rBJDoOYIGPGqOeRWIQQGwkFMk1iSkopKtXSlXNqul//JteaiWLpYT/Gt\nt97yVi+W/sEyss0227RYcxGyJV8uYulihnpI18MPP+yOOOIID3Op7kUyEfQ/eNBgP19WLSdHjeu+\nwOX
YsUMHd+6558RVZFXLMfJVVXitcEOgrggY8aor/M1bOaQLFxsDejjIPCmoiEUKkkhQfVxCmyFd\npXy5KF8t4l788MMP/VeNb7/9tnv33Xc9sYJgffWrX3UsFyWWLr5oJN7r9ddf9+QMC0o5pIt2Q1z2\n7rSPnyA1LhxqWQ6LbO/d8eup+qrWyFct7xCryxCoHQJGvGqHtdUUQgAyg7sxieQL0sVEqVihIIlx\nCWWV8uUiJEtci5AuHUTPV4nEbsmai6z+xWANCYNwsSZjt27d3OLFi/2+3DbQP2m2dkm7mVx12222\nTo3VC73pT+7FpP5zItja3hAwBIpHwIhX8VhZyiogQOwUMVRJIl9i6cJChFUOi1ccQll6+Z9ivlwU\n0gUBg3QR64VArCBYEs+lv1wUSxfE7IILLmhBuhjAS52yYMRZI12rrbdJrbVL+m7J0sfcD/v3cytW\nrpBTqdhzP0LAjHylortMSUOgIAJGvApCZAmqjQBWIEgJ+3rHfEnslcSg0XYhhaUSFsGNgZP2sZC0\nyC677OIJJ8H0xXy5COki0B4hZkt/uSiuRc4J6dpwww19/BiuRQikCO1DHxGu6etyXvakxfK3/Nnn\nUzF9hOida39sr97u+N69HIQ/TWLkK029ZboaAvkR+ELg6hmdP4ldNQSqiwD/ye+www5u8ODB7s03\n3/RL2VS3xujScX0yII8cOdKNGzcumwhismLFCv8FYankiwETEnffffdly4NsMZ8W5UK6mCoCS5YE\n0eNSXLdund841l8uykz0WLgIomcP8SKQXpMuCBdfS4atJOBMvbKhH2QMiwobv0kjMnPmTPevzAbu\njGFD5FSq9399d6178sk/uO9+979S1Q6sm2zLli3zfZcq5U1ZQ8AQaIGAWbxawGE/6okABEAsEZAg\nCEstBMJBveyxuuWqV4hJmMzk0lGsZ6V8uQjREvci00Xw9SLkDAsWxAqCpd2L+stFXItsyEMPPZSz\nHbn05bwQMUlz2cQrXI+ePd2wIafLqVTvCbI/ofdxqXM3atCxwOa6R3U6OzYEDIFkIrBhMtUyrZoR\nAQiNkBVcjkKGqoUFJAMXILPpS935BjSxEjHwFRIZHDXpYkLUadM+X75G5ueCWOFGhHDxlSMb1i6x\ndEG6IFMSz4WVi9gwmTIC9yKuRwLphXRRJ7qWI1j0wEC299f9zXUOvpRsFOnYob1r3aaNK6YPk9pm\n+ibN+icVV9PLEKgVAka8aoW01VM0Ani/sS4hkCJIWJwzxotljdglyBcLd2NhK8aNKMSEgY+8UYLV\njK8J9XQREC5mo+fLQ1n+B9KFVQsLl5Au9pAurpEWMgW5wqUI4dKkCzImpAuLGO5FNrArl3jp9lDO\ngw8ucl0P7qJPp/74m/vu32L+tDQ2yMhXGnvNdDYEPkfAiJfdCYlEAIIDgXnvvfe8NQrLFGQCKxgk\nLBfpydUYiJKUwaBF+XPmzPGEqxySQhkQEzYt1KGni4AoMQM901II6fr00089sQqTLggY57iO8OUi\nrsQw6WL6iCjSRR7aiW5xCG37wUmnxFFUosr46nbb+Y8FEqVUGcrQz/R3qc9CGVVZFkPAEIgRAYvx\nihFMK6q6CGCpgoyxJ4aJLwMhTbgJo6xVMigRZE5AOwMVm/5yslKiAjlh4EMPSFetv1wUKxfICwlE\nlzgEK+Arr77hJl42IY7iElPGvPvmuxtnzHC33HxjYnSqRBGeB/o86hmopFzLawgYAtVBYKPqFGul\nGgLxIwDBggyIMOBAeiBPUQIRYjCCbOUSrlVCvhjwIDy4LbXoNRf1l4viXtRB9MzRxXlckBAp3Ic6\niF5PFxHlWpR60SNfWyVdsfsNNvyCwzpkkmwEJD7RyFey+8m0MwQEAbN4CRK2b1oEKrEUQf6woOkg\n+kJrLmrSVcmXi5A0kUrIo5QR3k+8/Ar30cefpn7i1HC7Vq950+3Yp
rV3/Yavpfk39yL/aEDATAwB\nQyC5CGyYXNVMM0OgNggwUGE5KzVWRsiOJl2HHXaYD6JnAKzml4uadEEcbbAt/l5J4yLfxbQOyxci\n/0gUk8fSGAKGQO0RMOJVe8ytxgQiIO6aYlUj1izqy0UC6flK8qWXXvKTooprMe4vF7WeRrw0Gs19\nLATcyFdz3wfW+mQjYMQr2f1j2tUQgWLJV6EvFzt16uS/THziiSfWmy4iji8XNSRiddPn7Dg/Akyi\n2m6vdvkTpfiqka8Ud56p3hQIGPFqim62RhaDAO5BtlzWAlyRTGehF7rmy0rIDy5GHUS//fbbu222\n2cYtWLAgOykqpItAelnoWoLow5Oihmej118u6nagpwyy+nxcxzIha1zlJaWcV197ze3X+YCkqFMV\nPeS+IO7LxBAwBJKFgH3VmKz+MG2KQADCAdmRPVkgRUwbgcg0ExxjxWIQ4ms/iYHhfC4hLWWz10L5\nlCF1cK3Ql4sQpnbt2rnFixe71q1b+9nlK/1yUetE+9GpWrLlFpu75cufrVbxdSsXAtwMwj3MfQv5\nKubeL4QJ9xtlyUbZ+rljzjqpR5479tW8RwvpbNcNgSQiYF81JrFXTKf1EOBlT1yVTJ4qL3T2WKkQ\necGTVgYFGSTkHF8gyrZeJeoE5EuXV+mXiytXrvQz0GMJK2XNRR1Er9Tz5FD00+fjPAaDSy651N1z\nz+/iLLbuZY0ZN8Ft9uVNgoW/h9Zdl1oowLMAaeJZKVX0c8dHJFh2ue8gdWyI3IfyjHGOe4c62VM/\naeS5k+eVdCaGQDMiYMSrGXs9JW3mhQ3RYgkhhBc3rr5yBhDyMzAwEDAXGGUTqyVzfXFdiwxW7KlX\nL//Dmoss/4PIl4vMNv/xxx97VyIWFaaMYMO1yDVmrV+7dq13M+69997Zha5lji7IGDPVs+Yiy/8g\nuUgXAxoiA5//EdMfwQic7rjjDl8qujeSDBx0mnv7rTVlr1qQRiy4j+lbCFAxwj85PCfca0KY2Jcj\nlMFzTJkc8wzz3FXj/i1HP8tjCNQaASNetUbc6isKAV7SvJwhWbyo2eIUiAWEjsEoFwHjvI7non7W\nXJTlf4jpIl4LYsVC15AsSJcQL87J8j+y5iIkZvXq1d5yAOkinktIF2lkzcV8bUX3YgfQfOVwjYGQ\n8tgYHDXB5Pp+++7nLho7zh19ZE9+NoS0b9fezbx15nrxfHFhmmSQ6Od87eQe4L5HeD7ifu543iB0\nrPDAc8SxWcCSfMeYbtVAwIhXNVC1MstGgBczL3v+Q4d85Rskyq5EZWQgYoCBgDAIyH/1YdJF/AqD\nEq4WWXNRky49KSoEDNIlQfRYslhbEaLFuotsTz/9tNt///3ddsHM8FyHdOUKolfqeoJUCSbgSptp\nC3s9B5muR44hXkcd81138YWj5VSq90uWPubOHXVOsH7mwvXaAR4iWGMa1SJDO8P3EPc/z50QI46r\nKdTHM4YuPH9C9qpZp5VtCCQFASNeSekJ08MTn+HDh7vzzz/fv4xrCQkkj5c/xAtyIm420eGpp57y\nwfRYucS9iGuRmefF0iXuRUgXaRC+XNx0002zpEtci5wj7mvbbbd1u+22W1Gki8EKKZUQCMlikNMf\nB/jCIv4ce+yxfmBmcGY+skmTro4kKhFZE3/q3F+Mdv/49BN3yfixeXUFa8GbhGGikjdzCi5yL0ib\n5N6HbEGCammBQg/qxbKNHrWsOwXdZCo2KAJGvBq0Y9PULIiO/PcLSSg3hqvSNvPf/r777tuiGL5c\nxL3IgPC1r33NffbZZ96SJROjhi1dpa65+Oqrr3r3XrjeFkr8+4ceLKOuyznSycZi4oXk29/+th+E\nGYhlWgysekIyv/GNb7orA/LVCO7Gbt26B8T+f7OkoxA2ch08RSC+pZJfyZukPW3Cysue547+r4fw\n/EO+eP7q+fzXo+1WZ3MisFFzN
ttanRQEeOnKC58Xbz3/46VuXIoS56Sni1i4cKFr06aNj9kSS5cm\nXeWuuYi1i/oY/ASHqL7Jdx3cuC6b6B9VDudol3ydxp42i/tUiCMWO7HsHXPMMe7+++5PPfG6btoM\nD0k+nHNhpvNgCQNrhHumXv8oeAUq+IOFCcsu1tx6tgEMIVxY28AZbOupTwWQWlZDoCgEzOJVFEyW\nqBoICOniJcsgkATRVi8ICUsAMRM9li7IV+fOnddzLeovF4nVIlh+s802W8+9KEH0ub5clAEnTD7F\n5SVWFhn4Sc+AVYhoMa+ZEK1evXpliRYWLbFqaaJFW/X2wgsvOMjX8mefdx07tE9CN5Wlw0knn+q+\nd8Lx7vjje5eVPyoT9zD3jAj3crj/5FqS9mJh4h6iDXJv1VtH3gNi/TbyVe/esPqrhYARr2oha+Xm\nRSCJpEsUhthAVhicsAgQi0Vw/FtvveX+/Oc/u5133tmtW7fOTxcR9eUipIsAeuK5Sv1yUax+eiCE\nXCHsGSgLBcRDGCFaxKthQcBFKq7DMNnSBItjPgjQe7k+PpjPa/fd27qJl00QmFK1n3fffDc8mLdr\n2WPLqkqM6D/ubSSp1jBNupJIEo18perRMmXLQMCIVxmgWZbKEUj6y19cbwMGDPAWDZb+wbXI+oss\n8YPgXsz35SIEjCB6sXQV++UixI/BhwE8PJ1FLuR1QPw+++zjiZaQLbFmyV7IVJhghUkXv8Xd+OKL\nL7pLJ1zqZs66zXU9uEsuNRJ7ntiuoUOHxmrtKtRY+i9p1jDcedxbQvALtaFe14k9Y0u6nvXCx+pN\nNwJGvNLdf6nUnhcqAwAEI4n/cQOquOCYh6tLly5+Y+JULF2PPvqo23PPPbNzdOX7clFIl8zPlWtS\nVCxZssUREI/+ECtNtoRIRREsIWOSRvIKDmAybdp0P8/Y7353Jz9TI8xUv2zpEnfnHXPqqnO9rWHc\nX1hB2afBjSdfGKOviSHQSAgY8Wqk3kxBWyBbvPRxm+EGS6JgKWKDhBBsvmLFCtejR49g+ZxLPOEi\npusPf/iD69ixo59pnklQZY4uPV0EhEziucJzdDEIM6DIVihOK19AvJAjTbI4FoKlLVv6nD4vedlT\nnmAgRBFrHeSRJYT+69hebtTIEUnsuvV0Yt6uQ7oeVPcA8rBitbaGUR/ua/7hScucWejMuwJ906Jz\nuJ/ttyEQhYARryhU7FzVEIBs8TLF6pVUERcdJIUYLojWpEmT/GzbV199tSdTb775pp/0lPgpIV3E\ndUHCiAeDdEFW2BDisoRksS8Up0Wevn37enIqMVvsIUVRREssVrIXghXey3X2QraEaFGnCCSLTdog\nk7w+++yz7rLLJrpLLv2V6/O94yV5Iver17zpTj7pJNe/fz8/S3oilfy3UtoaBkFii1MgLkL24yy3\n2mXxrGD5Qve4Mam27la+IZALASNeuZCx87EjIC/RJLsYaTTESyxGQrxYBuiUU07xXziefPLJHpvn\nnnvOde3aNRtET0yXuBaJB1u8eLGjzQToFyJaQq6EmELoCPAXosXAA7HbcccdW3xxCIHKRa7kvCZb\nUh57LUK02GOli9pkLclLL73Uu1tvvOmWxMZ7QboGDz7NtW/fvuBkqRqHJBzzfLCJcE9UIpTFtCUv\nv/xyKslL/+AjF4TYNBNDoBEQMOLVCL2YkjbwHyuuDnmRJlVtbfGSObuwehF7xcz6t99+u5+SAZL1\nzDPPuG7dunlL19KlS93dd9/tHn74Ybd8+fKCzWPiUvnyUAfEM23Ft771raxFChIIeWLgfO+997y7\nU8iUkCu9l/Sk4VjIllYIF6K2aIWJlpAsvcf6RXwbU2rcc888N3nyZDd9xo2JJF/DzxzhVq160c2Y\n/vnkt7rtaTuGvIvwDLGVIjxv5OHZS6PERRz5QKZnz54egnHjxrmzzz47VXC0bdvWrySB0qXoP3X
AAggggIBZAYKX2dZROAIIIIAA\nAghYEyB4WesY9SKAAAIIIICAWYFnu/zIiInRwogAAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 88, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "Image(filename='sentiment_network_sparse.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 89, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "layer_0 = np.zeros(10)" + ] + }, + { + "cell_type": "code", + "execution_count": 90, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])" + ] + }, + "execution_count": 90, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 91, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "layer_0[4] = 1\n", + "layer_0[9] = 1" + ] + }, + { + "cell_type": "code", + "execution_count": 92, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([ 0., 0., 0., 0., 1., 0., 0., 0., 0., 1.])" + ] + }, + "execution_count": 92, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 93, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "weights_0_1 = np.random.randn(10,5)" + ] + }, + { + "cell_type": "code", + "execution_count": 94, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([-0.10503756, 0.44222989, 0.24392938, -0.55961832, 0.21389503])" + ] + }, + "execution_count": 94, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0.dot(weights_0_1)" + ] + }, + { + "cell_type": "code", + "execution_count": 101, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "indices = [4,9]" + ] + }, + { + "cell_type": "code", + 
"execution_count": 102, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "layer_1 = np.zeros(5)" + ] + }, + { + "cell_type": "code", + "execution_count": 103, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for index in indices:\n", + " layer_1 += (weights_0_1[index])" + ] + }, + { + "cell_type": "code", + "execution_count": 104, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([-0.10503756, 0.44222989, 0.24392938, -0.55961832, 0.21389503])" + ] + }, + "execution_count": 104, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_1" + ] + }, + { + "cell_type": "code", + "execution_count": 100, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjsAAAEpCAYAAAB1IONWAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHsnQe8VcW1/yc+TUxssQuWWGJoGhOjUmwUMVYUC3YQTeyCDQVRwYJYElGa\nggUBJVixRLFiV4gmxoKiiaKiYIr6NJr23vvf//6O/k7mbvc597R7zt7nrvl89tlt9syatct8z5o1\nM99oioKzYBowDZgGTAOmAdOAaaBBNbBcg5bLimUaMA2YBkwDpgHTgGnAa8Bgxx4E04BpwDRgGjAN\nmAYaWgMGOw19e61wpgHTgGnANGAaMA0Y7NgzYBowDZgGTAOmAdNAQ2vAYKehb68VzjRgGjANmAZM\nA6YBgx17BkwDpgHTgGnANGAaaGgNGOw09O21wpkGTAOmAdOAacA0YLBjz4BpwDRgGjANmAZMAw2t\nAYOdhr69
A5lrxkouRvLR0FISNu+EseOW\njbglJYzb2ttJMsblC8uUJE8oP01gVF6Flpb8kMI8qHiphBs1UIkDLqroaYaSA7KcjEMHY8GArB4C\ngUphgOZCdM1Sbhg9Ov/ggXJKVu8rQAfwUdMVgEf399CJGp0AQW0tyKomK1tbK7+V1zTQKBpoaMtO\n2HQFNOCIHPeBCXs4AQrhNbW+ycgX+tOQv5yn2Ua+lmAnlB94otwhAJFOuYHKtxp+DDiBxyGuXJl0\nXUt6UbykteBElTn7ACIVvIJAhjiKz7lwW3ErWQM6PXv2rCQJD6TF9L4CdrQAOgTADYgDdORzpCYt\n6UDCAQDI++lnf3MLFvzGH1YXc3Y6/KCD23qbbf3xjh06uFVXXdmXTQDhT2Tgh+eesrKwbcE0YBrI\nngb+8zXPnuwtSgw44M+hipUxeMIeWeyHMBEHjRYzqHKE+Ng37MsfhayKkQ/YoeLHl4Vy4wxMmQVB\nSlM64VzopN1SkaoBO5KlpbxqdV6+GeQn4MmXd7XhRvlU2uNK6bAup/mKpq0k0FETliAPXf36vvvd\noxGYL1u61MPMFlv+0PXZpa9r376dF4Pu5QQGHHxvyRK//eKLv3evvf6Gu/vue/x1O0Vj8/To3jUn\nq4+U4h8BD+XPGqylWK0mmmmgZhpoaNjBooHTqoABAAi7qIdaJm6+budhvNbeRlbJG8+rWAdc4mmE\nZKw79FhKCkBRKaCDxYHmkUYL+sfeWiDTkr7IH9ip1KJDPtyffHNf5Wu+AnSAmXjzVQg6d945x02c\nONGDyv4DDnbFTBDapXPHaKqJjr744WSiQNCj0ezqd865210y9hI/u3m/vfdKvdUE4BGUGvC09FTb\nedNAujTQ8I3wVPwtVeiATjXGa6n01haCGUCs2KYaytsSuJEWZS4l8LFnbqxGC/xbrwZolKMXQIdQ\njcqTclS7+eqee+51ffrs4kHn8IFHukVvLHJjLhxd0aSgANCQk05wWIDGjZ/gFi9+122yySZu/ISJ\nVWkm9QptpR85K6NrC6YB00B2NNDwsMOtoKmGwfTi0CNrDiMLp6FpBfmQNYQa5EL2QiCU9LgRn1Gm\n49eRNiBEmcN8ktKIHwN2mBurkT70/FMH4KoBG3F9tbQvPaLXaoRym6+w6sT9dF577TV35OCj3aRJ\nkxyQ89hj89zRgwdWQ8xmaWDxGXfF5e7Vha+7+fMXuG5du0UQnn9KmGYX12nHgKdOirdsTQMVaOAb\nkSPml0OkVpCIXdp2NECFSuXcCM1ZwAYOtjfeeGPNAY58ASwqzmoE7gdWndVWW81hLSJdXm11M6fH\nFRN7hrO10/2cpjtAh15mGhSRUbUnRBaXgRHsHDnoCNe+3XrVELGoNJhc9LxoOolevfu4sWPHVE0/\nRWVeYiQ1adXLKliiuBbdNNCmNWCw06Zvf+mFv+uuuzzoyCpRegrpuQIg+PTTT71AjEVDpcXS2lYe\nQId8qhW4Fz/+8Y99cnOiyTn33Xff3HADGk9Hs7cLdsIpIeheD+iwXHDBRe6xeY/65qXQz6ZashaT\nztJlH7ozzhjmweyC80e1+v0oRqZCcap9PwvlZedMA6aB8jRgsFOe3tr0VUACH/jWhoLWVLIcgnHm\njYdVV13Vlw0g0aI4lTgxt5YlgPtAOQA2YJSAVQeHZKAmtOoAO+xzjt5XdC9nDCGg6IwzzvRWnilT\np9TUmiPdxtennjbMzb3vXjf7ltmpf9YMeOJ3z/ZNA+nSwH9F5u/R6RLJpEm7Bj788EMPO1gQshqo\n5Kn0WQCEDtE4MAyktzTqTv23v/3Nvfvuu96XZ/r06b55iCEKXnzxRT8zeLt27TwkUPZi4YemJfTW\nrVu3qqqM1/eWW27xzVdUuJRLzVdx2NFs5hyXnw5WHYDozDOHu29F106Zck0qQAcl7fbTXd23V17V\nnXbKULfDjju49darXXNaqTeJ
pl30z9qCacA0kD4NmGUnffck9RJRcdN7ZvHixZn+uFMxXXnlld4i\ngtLxb2FhiIJHH33UL1RgH3/88dfuSadOnVzv3r1981GvXr28PoiUBD/oi1DtirBazVennXaG+8Zy\ny7lrrrk6NaDjFfbVz/XTZrjLL7k4MxaeavpihXqwbdOAaaB8DRjslK+7Nn2lev7g3JvFAOSwACJY\nQkJrCHNE0azDIvgBLJ588km/fPDBB18rMtaeEH7kQ0PzEs1+1QYdBKhG89WkSVd7HUy9dmoqQUeK\nHnnuaPfKyy+5GdOnpdppGXl5Vrjf3HcLpgHTQDo0YLCTjvuQOSmABKw7NO1kzXcH3xkqI5qv8MkJ\nQUcOvTTtsNDkw6KAn8tnn33mXnnlFQ8+Tz31lKObdjzQnERT0eDBg91+++3nsP4oJFl/dK7YNc1X\nlfa+YsDJMWMudndFoxpr8L9i869HvEMPO8KtFvlTXX31pHpkX1KeBjwlqcsimwZaXQMGO62u4sbN\ngAoXYODDnqUgX6O4M698XIAc/FuYNworD8ex8BAAGICHRdusgZ6XX37Zr5977rlEdfTo0cNDjyxA\n+udfKvygb1mOyu19BdQdfPDBbuyll7sBB+yXKG/aDtJLq3cEpxOikZz79t0lbeJ9TR7uU2tZ9b6W\nmR0wDZgGCmrAYKegeuxkSxrAqgM8AD5ZCADOkdFYQfrnjcxYdgAaWXVwWgZ24sBDXMBGSxL04NyM\n1WeNNdbw4MM2IEQvqHiQ3w9WH+AFSxmhJfipRvPVueeOcmusuaabOuXquFip3tc4PPMXzM9EMxEW\nUAKWRAumAdNA/TRgsFM/3TdEzkBDz+jfNr47spiktWD5ZBXsyLIThx0sPVh4sO4QFxhhAXpYC3re\ne+8935OLUa85p+NsL4kmxAR61PxVyO9H8CPrTQg/1Wi+YmTtiy8e656IfJBqOWBgtZ4LmrO6dO7s\nRo4cUa0kWzUdA55WVa8lbhooSgMGO0WpySIV0gCgc8opp/iut2n136HC6RlBGUCGY3IY4j47NF8B\nPFoDO1h9WAAi4mtROoAOUMIUHGETV7gdAhAWIByeBT/5/H769Onj5abpi/S33nprn2W5zVcMHDgs\n6ma+XTQtw9nDh0n8TK2ZSHSLLp0y1RvQgCdTj5gJ24AaMNhpwJtajyIBEFgd6KqdNuDBIZl50AhJ\n3eUFLlhuABqsOACOFh0DdOIL1/7ud7/z49wwb5iCrD4CHNbhtqw+IQwBP1h/WL/++utKKnFN13gs\nQOSPTMgMoGlKiHyDB2LVueCCCzNr1ZEyGHBwrTXXyIx1B7kBHp7FtL0f0qmtTQONrAGDnUa+uzUu\nG74w+MSkCXjU80rTQjB3FDJi5QkD0ADsCHhkyZGDsvYBC+JoDXRsueWWbpVVVsldz3kBFHlgkdEi\n6NE6CXoERbL6AED5nJ47R805O+20k8P5edttt/VA9cUXX3h/I2Qm33Duq4suGus223zzzFp1dM+e\nfW6BO+rIQX4Wdh3LwprnEegx4MnC3TIZG0kDBjuNdDdTUBY1aaXBhwcfHZqtABusTmxreohx48a5\noUOH5jQGnBAEKXELTrgv2Jk3b57bcccdc+ATj0M8LUpX+STBTz7wAXrovk7Ybrvt3JqRYzE9v5L8\nftZee23f1EVlyrLhhhs6RkkGxoAf4IgZxrPQ1dwXuMBPv336u/367+MdzgtES90pA57U3RITqA1o\nwGCnDdzkWhdRwIOlJ+4fUytZ1KwG5OBPRKCSAXBmzJjh94844gg3bdo0DzjAhwLbIZwIWLT+6KOP\nHGPUYFER4HCOba3DbR0L16TPfhiw6JC3LDtq4sLhmfQ6duzo7rnnnpxPENaqxx57zDd9Pfvss346\nijA9ttdaay3XpUsXD2X4FX388X+7e++9Ox4tk/vjJ052r0YgmLUeZSibZ1EO85lUvgltGsiYBg
x2\nMnbDsiIuH3JghwB4YF2pRaCJgHxZA13xfIEMrDqnn366FweAABgYDyVubQkBSPCDzw+ws9VWWxUF\nNknQEx7TttJnTZAsDBz40EMP+WM0mWHV0TniYq3Btwh/HZb58+f7gR6x/KCDMGy3bVd32MCBbshJ\nJ4SHM7uNo/L+/ffNXFNWqHCafOPPaHjetk0DpoHqaMBgpzp6tFTyaADLCrBDExLbG7fSeCOADYB1\n1VVXeesNeWnQvlA0AAHAABz23ntv79jLeYCHnk6hVUWWFc4DGMAD11MG1lqw0GgRvIRWHI5p0XHF\nC9faVlqMTn3SSSeRfTQj+Rmuf//+fhtZVA45UwM6QA9pcH6FFVZw3/nOd9z777/v9fL8889HY/18\n4a67/gbXo3tXn04j/PTq1duNGnVepoHBgKcRnkQrQ9o1YLCT9jvUAPJhsqcpiRnE99nnSx8L4Kca\nAcDReDSkl9TbSvkITgACIIGeSYcddpgHAuKMHTvW/exnP3PLL7+8hwXW8qMhH3p0ATphIE0FIEV5\nCGoELtonby06Fl/rvLqZM9v3nXfemQMc4iM/C93j1UWefQKgg5/OSiut5OhqzvrPf/6z23PPPX0a\nkrcR1vTK+tFWW7hBgwZlujgGPJm+fSZ8BjSwXAZkNBEzrgEsLFhePvnkE+80C/gADTQ30TMKGCol\nUDEoDZoAwi7fQImCwCNcAwpa8GWhiYgmKcKIESPcgQce6Ltvh5YSrEDkEeajPNSkxFpgBCTRA4r5\nsVgADxYsLQIQQQj78YVzv/jFL5SFu/XWW/313/rWt3y65EOZaMJCTrqbh6M9c5wFaFIz1xtvvOEG\nHHRILs1G2Vh7nXW8w3XWy8NzzHNd6ruQ9XKb/KaBWmnALDu10rTl00wDQAkAxPqJJ57wIAEA0YMo\nqflJFQG9qYATKgcWWYiAH5qwVo0miqS5iTQEOWHGHMMCQpMPFhFNC3HOOee4O+64w0dt3769mzt3\nrsOigg9M3759vbUHyBDchGkW2iY/BbYBLQIgon1ZcjjHNuP27Lbbbj4eTYA0twle5J+D3IylQzdz\nFo5zPc1wQJHgCthif/Lkya79+hu5cVdc7tNtlJ+5Dz7sZkYO57NuntkQReJ94D1IegcaooBWCNNA\nnTRgsFMnxVu2zTXAR55/tUBNUhAEAThJgWs5BwzRHZxu4YIJ1kCKAkARQgOWEfZvv/32aBqFixXN\n+/4MHz7cW2ew1KhZC6AghGnmLipiA3kIrLXI2sSacXvefvtt39sLAOOYLDSy5CAzozADPIAP55EH\nGYEbWYEk93XXT3OdOnfJ/Pg6cfU2GuxQPgOe+F22fdNA5Row2Klch5ZCSjRAJbHzzju7zz77zF1y\nySXu5JNPzjVZIaKsMsADwIOFR01AAAPAg1XlxBNPzJXo3HPPdccff3wOIAQ8WHmUZi5ymRsh/Iwa\nNcpddNFF3jKD/xHj4yArMCNLFIAD6LAI1EgDmYAbrDqCHFmjrplyrftBh44NBzvMhL5++3YeGstU\nfyov41nGuoOVx4JpwDRQuQa+/ItaeTqWgmmg7hqgeQtYIGCRwQFZzrtYRNgGaIAHAvCDMy8Aw4LF\nhhGWx48f748T58ILL/ROy0BF6MdDGrLKEK+SIAijuzigQ7j55pvdOpE/SmilkawAjBaghqYq/H5o\nwqOCZKEcrDmGDxAA1IghixOZFnMfsGQS3okNH1DMtRbHNGAa+LoGDHa+rhM7kmENMGig/F3oaUUv\nJKw2wEoILFh34s0+TMYJ9NC7C6dkBvMj3H///d5vhxGLBU1YhUijWsBDPkdGDtsEeqzRzRz5ADDW\nCspPlhwACNABbugtxiLgAXSwDLEQx0K2NCCrjgFPtu6bSZtODRjspPO+mFRlagAIuOGGG/zVjDFD\nD6vQz0XNVTQLARLAAtaTBQsW+KkUGGSQYwAGgw8eddRRPq
1Fixb5uaeeeeaZXM8nWYmqAT2jo3GB\n8DcCWm6MHLcJlIW0WeSzg3UK0MKyhPzIHlp1BDsCHc6xrPitFX2ajfbDwIIdftCh0YqVK48BT04V\ntmEaqEgDBjsVqc8uTpsGgBQsG3PmzPGi3X333e62227L+bsAO8CPLDMAA1Mt7LHHHo5eWHQPZwEi\naCrC2kKzFgG4weJCExPpqFkMEFHTGIBSasA/g5GSCYAO8pMOC+kiK3mTnxYAiLICZjRjIXPYnZ1t\nHfPrVRrTsvPekiVu6222LVXlmYov4OE5sWAaMA2Up4Hly7vMrjINVEcD70Q+CY9HPbC0JlV6VmnC\nTsa20ccePwa2e/bsWXDWaACmV69ebsCAAX6MGpyMO3To4DbaaCMPD0AEcXDwff31190uu+ziC8Mx\n+cKwrSYr8mVOqn79+vl4TDWBI/Oll17qrS74zbAQuJ70w6Ynf6LAD0BFoPlKXenZj1t0kCcENfKS\nzw4+OQAa4MMx5JcMpLPWmmu43zz/W5JtqMA9bAuB5573AuCRP08l5eZ9Iy0tpB2+d1gYlY/eO9Y9\no3fPgmkgixqw3lhZvGsZl5kPLBYMDSiojyhrrBoEfVSJqw+xPsw6BhhokUoECFhAsL5svvnmvncW\nMPDII4/4aMDAn/70J28x6dGjR86Kw0lZUbhWViDS4jiBZq0//vGPfrt79+5uypQpfjweIAMrixyd\nAQ3Bho+c54fmK6w6VC5UQLLqqBzADb5GGlOHbSxJpE05ZL2hqUrAgwzx/JkOY9y4qyJouyuPJNk8\nfPEll7uVvrOiGzrk5GwWoESpeRd4TnhXSg3he/fuu+/6nou8Z4AUCyH+3nHs8eDPCPkTR++d3lfi\nWTANpFkDBjtpvjsNJBsfSeCGyp3AxxKLRjkfba7nw81HmEH3SJtBBVmABpp+aPYBFGiiYlA+AoMD\nYuVZEjV9YAVhBGX1VFKzFVYZwAbA4XpBj4CH84Ca/IK47sEHH3SdOnXyaQI8LFhWQuuKFyD2Qxk0\n1QXNbuiE9Fnko0P+DBoo2KFcnAdogBvkVxkEXEn5oiP8ebi2kcIxxx7v/vynZf45UIXdSOVLKgv3\nkmcH6Cgm8LzyngBJghTW5QTS4D0mTbZ5hzWaeTnp2TWmgVppwGCnVppuw/nwYeSDCNjwcWSpZgB6\ngCgqAPJhfB0AADAAFiZNmuQuuOACnyWWGSqJ73//+x4WsIgQF1AAXAAFQtzCQzoAD2lidSGvIUOG\n+Lj8XH/99W733XfPpQOM0Myk9JKsPOiD5jqar6hACMBICGuy6gA7wBfnSBd5JTvWHcAHSw/n4nmh\nH8LoUee7s84+2+3+075+vxF+OkZjB82+Zba3iFH5KnCPGz1wXwuVk2eK94HA+3Fkld873gEgijnv\nmJuMbbP0NPpTl93yGexk996lXnI+hnxg+ScK8BT6MFejMIIeKr1f/vKXfuJLWWfwy6FrOQH/Gz7K\nagYSNAAQHAMWBB2y8CgdoAfgATrIZ+DAgTnRmaFcIy4DTlh4gA8WQgghVD6t1XzF9BthkN4vGnOx\n+8c//+3GXDg6PJ3Z7WefW+BGnj0imrF+3tfKIMDjBBYflkYMScDDc8l7JxhhuzUD+QFVyMJzLcBq\nzTwtbdNAqRow2ClVYxa/KA3wL+/UU0/1g/zxAaxlmDZtms8bEKHrOeAxa9Ysb/FBDvx4sMRgdeFc\n6PfCvoAHC05oZRHwsFazFiB32mmn5fx48AG65ZZbchaeJD8eKqFqNl+9+eabvpkLmGIR3MR1zj/9\nq64anwgH8bhZ2B957mj3P//+l7vs0rEFxaUyZlHIpx+dz9o6BB7+VAAbAA7vXS0tLchBvlgskaOW\neWftnpm8tdeAwU7tdd7QOVL5618elWu5PjmVKompFugmzkzrzHe16667+sEB+RgT6FFF8xFWF5qA\nsO5oAXjkaCxHYaw5AA
6WHZYQeLACMQmpJhLlesbtadeunYcp4EnNWsAIoFNJ8xU6/vjjj3NAxeCH\na665ZjPLkS9kwg/NPuPGT2iIpqxevXpHMH1eXrhLKL4/RKWsgMWHJeuBMvEHgzXvXb2AjmeTdwyg\nr+f7n/X7afJXXwMGO9XXaZtNkQ+dPrJ8dOv9zw7gwRGTnif33Xef99MZNmyYmzlzpr9HM6LZsqno\ngJEQeLD0ACzyfwFmcBhOclwGejgOFFHm8847L3f/77zzTkePLTWPATxMP8GUEAz6h1zoiPQFVaQn\nPx0ck9lGr/QAw0qEXFim6EqPzAIzWXVymefZGDNmrPvrRx9nfvbz66fNcDfNuLFiK9U7gdWHe1Ev\nOM9zu4o+DGDgO/Piiy+mogxYlQRfWdVp0cq3iJnQgMFOJm5T+oUU6MiEnQaJkYneWVQE/Mv89a9/\n7TbbbDMPPVhngBy6owMKQAOQI+tO3OFXTVpyXAZKQiuP/Hj4Rxs6Lp9zzjl+IlGAB58hZmQnAELq\nESOYIg3SBHLoKi6QwoGakZ2RiW0WtklTPb8oQzGByn2TTTZxry583XXp3LGYS1IZ59DDjnAH7L+f\n22+//lWTj+eF+6fAs1xvYJcshdaypADblAGAT0NQkxpyGfCk4Y60bRkMdtr2/a9K6dMIOioYIMFC\nhcBoyvzzZRoJZkcn4LiMNQYrDsAj2Al7OKlHFengw6Nu4XHgoZmLc+iDXl9//etffR7y4+nWrZtj\nfi3m7rr33ntzPbUAKebiAnaUZufOncvufeUzLfAz7MzhkZz/l1nrztwHH3anRuPqzF8wv1VhBPDh\nXhLSavUJQSeNYGbAU+BFtFM11YDBTk3V3ZiZpf2DC6QAFMiJrwzWnLA7Ot3SaX6jmQmLCcATWk/k\nb8PdiwNP6MeDVQZgwfpDPHpbATHxQPMV0MPov4AUsnXt2vVrzVeAExYbLFChEzUyltp8FcqAdWe3\nn+7mbrhxuuvRvWt4KhPb+OowvEA1rTotFRzoSZvVh6YiYAK50gg60inNWSxpl1Py2roxNWCw05j3\ntWal4iPGR5cKNM0fXEEKzrxYWrDmMMjgwoULva7ydUcHLORgHFp4ABT58WCNkUVG2/Ljofnsiiuu\nyN0Ppr+45JJLPNysvfba/jjWIqCJ5istQBMyC8Aqbb7KCfDVxvgJEyPoe9Tdc/eXc4jFz6d1nxGT\n5z/3bN3lrrfVh6YhmkGz0kSErAAj8lowDdRDAwY79dB6g+QJ4NAWX8/eH8WqEnAgvP32227rrbd2\nN910k/dd2XLLLf1xwIPeVGF3dFl45GAsh2V/QfQDpLAANsCKgAcLD9NR/OY3v/HnQ6dlrmUU5+OO\nO86DDJYboEkWIgYPZJt0yY+8JUfYtBaXRTKVsu63T3/XrXsPd/bwYaVcVre4jKuzfY9uqXHClSJq\nbfUhP/xy+JORlTFtkJlvBfJmRWbdX1s3hgYMdhrjPtalFDT98AHDupOFAPCwjBs3zs9kPm/ePPfq\nq6/mHIWPPvpoPxIsIIFFR01HrOUMLMgQPGHhEfA89NBDOeChmQmn4rOjEYs5Hg+Mtjxx4kTfTAXs\nhP46pAd0Vbv5Ki4D1ol+e/dzvxh3pRtwwH7x06naX7rsQ3fYoYdGTZGD/D1KlXAxYUKrD1DCUs0A\nLJBH1qwkyIuFB9mrrZNq6tfSakwNGOw05n1t9VLpw5X25qu4IoAUAIVZ0bfffnv/L7OY7ujAD8BD\nsxIggkWGjzbj+JAeC+kBLbLyPPfcc+6QQw7xIjDwIB96Blr87W+/nH183XXX9QMQrrLKKv46riUd\nAr2sBFuh/1Cpva98Ygk//NOm0pz36Dy3wjdXcDNvmpVa/x1A57jjjvfw2NIAgglFresh3g8WBf4g\nVBJIi950DKuQRWDAb46Ar5EF00AtNWCwU0ttN1BefGgxo+vjlaWiATxYdfbbbz/38ssv
e7DYdttt\n3dKlS30xnnzySQ8zYXd0wOONN97wXcMBHmCHwQHxUyI9QZSsNAAPlh0G/0NXs2fPzo3HwwjP+tgD\nL1iaGDsH0CFd5Ss/HZqx5Dsky1Il+qbCBLxw1ib07burey9ymk6rw/Kppw1zb731Rzdj+rRU+4UV\nc09CawzPBUspgfeNa3j3shiqBWt0MsDnjoAP3FlnnZVFdVRV5qlTp7pjjz02lybfpFqGSy+91E+X\nQ54vvPCCwz8yTcFgp4p3gzFc8AkhxF/AQueqKEJNksJHB6sAH64sBsEJ1p0ddtjBT/fAGDg77bST\nL044O/qyZcty3dFxbGZUZABF0AGcEJQmwEIzFM1XckzGdwerkHprYcHhYxB+oOldhDwCHaw9jBHE\nOrQqkZ/y9BmX+COL3KeffurTZ5+mSLqj33v3XakCHiw6o0ef7z788MOGAJ34reL9Cd+hlqw+xMWq\ngzUxzZ0B4uWM7+sPkoA/fr6YfX1PN9100wiE3yrmkoaIE777U6ZMccccc0yuXPWGHQTRfQF0+Mal\nKSyXJmFMltpoQBUma16QUgMfqSw7Gar8/DvGURnfGEYkHjVqlFfFww8/7LuNAy0ADt3CGSMH6ABU\nsN6ouUm6U5pA0CuvvJIDnRtuuMFtsMEGvkmK67EKAUadOnVy1113nS53EyZMcFdffbW3/nCQdARV\nYdMV+ZQbuG8AFaCz1VZb+WY4QAd5aB4686wz3VGRT8ytt99ZbhZVu05NV40KOihq48hCA+BoATy1\nhBAkpfK8Mrt4lkGHslAORnumKbWcgAVBfyrDPwzlpGXXVFcDuh801ZdTt1RXmuapGew014fttaAB\nPsIMzqd/Zy1ET+1poIFK5r333vOmV/xrgBr8bgiMj0OlomYprDL0tqJ5imOAEMADKCgIRHB0JuCE\nfNBBB3lIoimKpjAsN4AM12Ht4YMADBGALHx6gBECceQfpLT9iTJ+uF+DBw/2V1JhUqlS2ZIHC+U5\n7LDD3HkR8I0cMdzRxbtegUEDe0f3hmZAusZnvXIvVo+CHtYEgQ++YQRZVP1Ohn947hjUk/KUGrBq\nATuE1VdfvZllo9S0Gi0+Vh69z6zrEQ488EB/X8h7+PDh3gpZDzmS8lwu6aAdMw3k0wAfKCbQbIQK\n6IknnvD/lOnuTdMV1ptrrrkmV3RBi6aIiAOPYCf8sDCQIH5AzH2F1QirDJYjIIdFY/aQiZq8+De0\n5557+nxxPB0wYIB7J4JKAASwygdXOUELbKjLL/+kCfgHYeHh/unDiByCut12+2k04OJE9+Tjj7m9\n9urn6O5dq4A1h5nMGR354rFjW5zNvFZy1SMfgEDwwzaWVCAYS1wjBOCb57DUwJ8DgIcQNuGwzzn+\nFLDId4UKV8dY826pgwDXxAMgRVNMeE1oSYrHZ5/0SFfXrLHGGl4WrE86xlrWKKXBflw+4nEsLiPf\nJ86FgTJyTBaUsPyKi67Y1oKc8RCPc9tttzWLgn+U8lI6yKN8w8gAKMBDIN14WmHcmm9HH7y6hsi3\npSlqdwVDcwvHonbYZnJFD3bufKTQr50P02A7HiJH0SbSDfNhOymv8Npi5eOaUAauC0Ohc8SL/tU3\nhWVEtmgqg6aoXTZMJrcdpoeuuD5qJ82VDx3FZSC9ePm1ny+fXIZfbUSg0xQ52MYPZ3b/d7/7XVM0\n0F9T1DzVFEFP01/+8pemCOhyejrggAOaIoflpmeeeaYp+gA1LVq0qGnJkiVNPE/RAIBNEQg1RbDg\nyx9NRZG7bs6cOf54BBFNkTWo6bPPPmuK/H+aIifnpsiK1BTN09UUwVBT9MFoirqgN02ePLnpzDPP\nzF3PfYnAy+f10Ucf+bxIh/TIT3kWUjzyRH4/Pk3WyKTA9aRFuSlH9GFqisYG8vlFH+GmaOLRpmHD\nzvLP9CmnntEUzaWlS6u+/mDpsqYxYy9r6vCDDk3H
HHt80+LFi6ueR9YTHDp0aBNLowSeN55x1qWE\n8LvHNy8MfMP0PeNbGn4PdVzr+LV8QwvF53sa+aCE2flt0lGa8XX0J6bZubBOI614/Ph++P0u5tsd\nlp+0FCL4yOVFOeKBfJQ35/m2KYTnFCdco+d4uPXWW3PpodO0hP9opMYSlfpwEZ8bIUWHSo7f5PiD\nzIMVXqs0wnX8mlLlQ33hixg+qC2dK+eBiucVliXc5iVRKOaFUdx8a9KudmWErrmH3FNkTLpXHOMc\ncYjLNdUKgACVe9RM5aEk8hNpihyGc8/a+PHjPfAAKVGTQtMf/vCHpqjnVlNkNfHXCHgiPxh/DUCo\nEFlnPFBE/8r9NcDO/Pnzm+6///6mW265pSn6d9t07bXXNl1//fVN0WzsTZGPTy5fdH3iiSc2RXN5\nNUXzbDVF00s0y68Q8ACkAh3kAnwIAiVBGIDHx43yADmvv/56U+Rz1BRZp5qiMYg8RA8cNNjLBPQ8\n8+x8Fa3iNQAlyDnk0MOboslPK06zURPgHlZbP/V+7yhTCOAt3bs4IMTjx+uB8DsY345XwoVAR9fy\nDQpBgO2kb5Xix9fhNyv8fsfjhfvKr5hvd7z80k/8eBzaQhhiWyGEllCm+Ha8rkPmME48P6Vf63Xd\nYKechysOBXp4wgcnvFkos9gHkjTCUI58oRzxByDfuXIfqDC98MFK2iYPQjEvTKiD+DYVJlaQagXu\nX/iiJcle6BjX6hmoRKbIf8BDBtASNVX5f5vR3FVN7du3z720WHeeeuqppgULFngYAAywhGCxweIS\njYrs4wIY+rcq64nSxCIUTU/hLTsPPvig/9Bzb6Ju6R58br/99qbIH8pvR6b0XN6Rk7TPE6sT+ZEe\nsgJSScCDBUB6A7xCeYjPtYAd8AREAVMAHJCD9SrqPdb0/PPPe7CTJYsP1siR53jrS8+evZrOPmdU\n0/0PPFSy2oGlqyZMavr5Mcd5GbHkVLsSL1moDFzA/axWSMt7x3M6atSooosVfv/jsEIi8UqdOKpo\nqQfi3z+BRPy68Ntd6FwoD/cnvC5u1eG8vlVxa1B4Xfycvt1Skt5r1sgWhrisOheHjzA/4oTAFuYX\n1jGhLilHqMs4BJJmeG08P87XI9TFZ4e2vrBNMlIGb7JfohsW3ccvQ/SRbtYuiG9DpESd9o5qpBVV\nPP5YpHTf5TsXIdrgPOkohHmRngJpqH2xXPmUVilr2mcVogfKd9dDF9ED5Wfk1jnajcNy6LjWYbmi\nB1aH/Vq6jl4kr+PwJPomv8hiEh5O3MaPZOPIf6AaAV1vs802OZ2Xk2Y10iBffCOYnBPHYRb52Dzw\nwAM5sU4//XSvp8gi4h2V5aysbuQXXnihjxtZVJr5w8jvJgIM39OK6zkWTkuhbuYRKHkn5rXWWsv3\n1Np///19ms8++6zXFZOH4jeEkzTpkY7eGyLin0NZrrrqKn9dVJl4J9DQP0fyIDdl+Pvf/+7n42LN\nwjHSjqDIt/MjJwvv3dlnj3CvLnzVDYl8ar694jfdZZeM9XGYdiKCFu/UjGNzfMEP59DDjnAdO3T0\nvb1ejXqrMQEpz/OUayZ7mb3A9pOoARyVIytI4rlSD1bjnalGGsiN/5Gcr4sph75jxA3rgXzX8h3k\nm0pIqhv0PcUnRYHvYFgvsM+3VYG6QQE9KMSv45p839SwHMgV5hdBRLOySUblU86aPKI/hrlLw/zZ\nVh7EI38Cx1Wvsh/qEt2zT3wC14e64Fh4f8J0OFevUBfYKffhQkkhDPHgyTOfc3EY4lh4E5IeyPCm\n6CGoRD7yLDZU+kApn3i5eLDDh1sPs+KXu+bD1DOqTCsNPPw4vFVDLtIgrUpeKACOCoUA7NA9HOBZ\nb731vEMvxyNLh8OhGVgABoACAc+h0TQGBJyMGaxPAWAAbgALAEVLCDuADjDChwPYwbFZXdSBFWZk\nJ3AtlUPkO9QM
eEiffCKrm+/hgoykA3RpGg8BkWQnLaBJk46yZv8t+iwAAEAASURBVB85iUOQHnCw\nRh8sHCNQxnNGnu0ee2yev4ennTrUde7Uwa280rc9BAFC4bJCdNkxPz/aPfDgA27RG4vc1ClXuyMj\nB9VGcHL3CmnlHyC2GrpK43tH2YoN+j4TP/xuJ13P+XgcgU88fpiuKvswTvgtRYd8c1haui4pLdKl\nntI7GVldfFbUOdRlOBBX8i0L5Q63Q1nC+i3cJo4AJiwbeovrknhxvYT5hfHDtMI4td5evtYZkl9Y\n+PAmSBaUKIuHHi7dBOJTuYuw9WCg3JCQlVaYV9LDjgUlHsJrSpUvnlah/TCfQg9UvKzxNJPKxbEQ\n9OLX1HOf8sRBh/vHfec+J5UHedEX1/GChrrjGGmG/8BKKZ+sVerBIOsOH6SDDz7Yd0OPHIr9BJ6a\nHR0wABCw6GAV4trI7yZnEeHaOFzIasI5AQQ9tDQNBWkALoIjoAq4nDFjhhs4cKAvEqM+n3POOe74\n44/3cYGy++67z9FzDGjZaKON/NAAGj+Hi0gTWQReyIHsLAI2zhFUdsmlHmQcl5XHR/zqh0oYGVks\ntI4GqvUnI43vHXBebAi/GaoP8l0bVrb54ui46hD2k3orKZ7WoRw6lvTNKiSDvlmq55ROa635tvKn\nkEDefD+ROfyOhvASlpE4+jbmky+MH49T6Fw8bmvu1wV2ynm4wocbqKEiD5UYWnyksDAfjhV6+HQN\n6/C6Yh/+UL4wrULbofyVPFDFlquQLMWcw/rBP/JKQ/hvgrS4d/lMvmFeIXiSBt0fFeJp6nipa15q\nKnUqd6waVPZAFOkDBjQtMQYPIEIX88ip2GdBRcJYOkAD1wp0BC6hVYc8iAPkMPYOiwYOJF2uAYYE\nIhtHlicg66ijjnKRj4276KKL/HQXkYOzY0b1SZMmeRm6dOnimOqCZxGgIiBHKEscdMiL8wRkAp6w\nLGlBRll30Auyt/Th84nZT+o0EH9H6v3e8VyXEsLvZSnXpS0u5aAJP6xnkJFvIN9yviXxc5WWgW8C\n3089A6zJS3+Idb7SfNJ8/XJpFi6fbDws8Qc/JNR819nxyjVQ6gcqKcfwXvGCFwM68XRk4dPxME0d\nq2RNxQ5wqDmLua0IwEjUO8vDhORm9OXddtvNQwqwo0WAA2CwcC1WFtIGogAKQEdrtsP5sNgHNpCB\nj9Gdd97p+vTp4+XAj2fDDTfMgU7//v19UxvNYuQhaw4gA9CQv/xz1GyFfAIdgCaEL8mFnJwDhJDb\nQnY1EL4jaX3vCmm3tf7UhenKr5E/C/mWML7kDXWrY/mAJYQZ0qJ1gbyAz6TWCaVX6Tr8s4i8Ah/S\n5RzfGIVwm3P5dKHjScYGpZWWdV2+XuHDUs7DlWT6Sxr4KbxhKDzfwxe/GZXKF08v334oX6M8UPnK\nmu94qOt8cfIdr+TafGlyXNaLEHi6d+/umL+KEPWaauabw4suoAkBh201FQEcAh3gAYgALljYDhfg\nBysRi2Y8B3iQJ+q94icw9YJ89cMgj9E4Pd6vh3yAKlmIkAsZkkAHeSgraQt0lC8yCHSAPvKWXsK8\nbTubGqjk3ank2kq0FX4v4392K0k3bIJKgpaktNFBKE8IDoqfdIxz4XGgM9Qn5Sq2nlI+xa7154z4\nWHRCOcImLM7HdVKJvsPykXa9Ql1gJ67IUgoPFesm8bApLW5G6KxMmpwPH8ikh4jRLvUR1/VKkzSK\nffiJW2qI51PJA1Vq3uXExz+jlN4TxeShe1lM3HicSq6NpxXf55mggseiAZwABCNGjHCdO3f2UeVY\nSC8twEA+MFoLMliHFhQ1FQl0SJdFPjzKS/CBhUUL8BF1f/cWnlBepu8AdtSjSjJoX47I7CMPQMQ/\nMspH3oIrwArYCUEHefV+hHnadrY1UMm7U8m1lWgtrDSTvuXlph1aPPgjrXqA9M
gHVwa9A4yurBAC\nAvVSeB3bHGsphO4Y6DVsmm/p2lLrC+rCsKySL36cfKmbpG/yQa6wLuTasO5UWpI5vD9hPafz9VjX\nBXZChZfycKH00KqDyS90SkXh8RcxfCB5ANVGibJJK3xgJJfWihM+xIUe/lJvYKUPVKn5JcUPy590\nPjyG02spvSfCa8PtUL/cr/h9COMmbSMz9yS812GaSdeUe4zKXs1ZwAZ+MmHAqoIVRXDD1BMsgIWg\nI7TqABdJoCOoCPMDOgAdAIQ1eY8cOTKXPc1pyEbAUfpnP/uZi8bO8fmzBnIkD2vkQVYC+VCeMH3y\nQzbBl2TiQ58UeBYej/y4xo+fEPUau8h3L6eLeXxhRvXxEyb6bvDvRMMXWChdA4343vHHiZ6DxYaw\n0gwr02Kvzxcvbl3hexTCTVhnhM1M4TZph9exnS+E5QAgBA1xoMh3vY4rvzho6HzSOuk7ybHQKKDr\nwvIhJ35G0kvYmxYoCq1GXB+CUVhepV2PdV0clFEMlZUeWG5avocjVDhxVDlzc0hHVKqKj5sQ9rDi\n+vBhyOdwDBTpppQrXzk3EPnkJa8HKimdpAcqKV6px6T7Yp0Vq/XR1f1CXp4FFvSv+5lUDq7h/ocv\nkuIlvcQ619Kaj+7GCc6SvNiygAh4mJk8DFhV+vXr560lNAsRD0jAFwbfnRB0wuYrQENQgYWFIKiI\n73Oc3lZz58718X74wx+6yy67zIMKztKnnXaa1wnnmaWdMTDowo7skgNZ1GylsgA2AI4gB5kkf1wG\nn3H0A6w8/vgT7s45d7l777nL7d1v32guoe+7tddZxx3xVY8xxdU6GrAwgq4v3K233eHwLYoGJXR9\nog/sDtv3iLZ7Kpqt82gAHY0ePTrP2eIP846k6b3jW8IfqGID32jVE3wD+BYkVdLFphfGw52CuiHp\n26J48bFz+Cbz3dT3W/G05tvOdy0eqF+ok1SXhec5x3EBlupIxeEbWUhGxcu3Jn3pUHFCg4COsZYs\n8fhhHHSA7sKA/GHZKvk2h+lWvB19EOsSIiApOBdJVLBmI1IyEibHtEQPXk7uQueIFD2Quet0fbiO\nHqBmw4BzTanycU1043P5hPK1dI64oTzxbdJFnjCEeUUPW3jKb4dpRg9ts/OUN54HOmopMNItow1X\nGhjRM0mGuEzF7ifdv1JkbGkk1wgS/DxSTPOQJBOjHkfQ4UcCjiwdTVF32ibWHNPxp59+uol5uN58\n800/NUP0MfAjIUcQkjgKMnlGoOLn6oocoHP5RmDj59eKAM2PxMzIzo9F92XQoEG5OBFU+fm2yJtn\ng4Vt5GLKi+hj6aeFiMDFjwKNLJEVyE9rkU8ehvVnSgfKz7QRt9x2RxNzWpUTGHmZEZgZiZnlxmjK\nDGSwkKyBao1cnrb3LpqU1j+3yaVOPhp+NyKobxYp/M5HFWyzc9oJ39/4N5U4fDfDPIjP9zPpG6s0\nOUd+SptvM7JwXMdYo38F6qwIMnLndQ3nI0jKHee6UM74dZzXtzssP8fzhbB8ESw2kyvpGvKMy4S8\n8TpO13JfyJ8l331Q3Fqu82ukRlIU+3CFNwhFxwMPpBTMDQwfEOJyw8I4xC10w5R+sfIRn/QkQ/xB\nKHSOa0t9oML0kl5E8pcscdgp9MIgS77AnFiR2Tnf6ZKOI0N4TyVrqWvSIK1KAgDX0hw9wEfUtdvr\nlPhMsRBZRPw+cPHQQw81RePd+Ak+77333qaoq3gT68ja0jRv3jw/zQRTRbz33nt+igbgImpSSgQd\nlQU4iiw0Po/ICtMUjbeTm94BaBLwADLMtTVmzJjcPUePQ4YM8eVCrugfvZ/uAtBhCohobKCmP//5\nz03M2RU1b+WdfgKQEpQwzUO5gKMyxddAExDFJKBXXTU+ftr2v9IA97MaQJim9w5AB3hKCWGFHv+u\nlZJOLeKG32DqpLYSQogTiKWh7HWHnTQowW
QoXgPMjaVJJYu/Kn9MPgghuBULO1wTB8r8uRQ+U0xF\nEo1n40Ei8nHJTcwZzo7OHFQAE/NczZo1yy/MdcXs5lh50BkAziSjmk8Lyw0QlRTCiTyjZis/V1Xk\nF+Tns2IWdObZAlqYwwqYAq7ImxnUQx1GfgB+FneAGKsOwAW0Ms8WoEOagq5QFuIwb5WHkAhyWjtg\n7QF6ACsAy0JzDRQD5M2vKLyXhveOb0mp9xrrCODAM16MVaKwFio7G/5Z43sU/sHmfZOcyErcthC4\nP/r+oJM0hW8gTCScBdNAURo4MhpUkHb2U045paj4xUaibTr0yYn+xTa7NPpwNPPpiV6kZufL3YmA\nxftD4LeTL3Duxz/+sT9Nt/O99trL97DCCRnHYHpCEfCr2GSTTbyfDj4v+MRo3iucEHHGVFdy/HeI\nIz8dn8BXP+hW81vhAK35tvC/YVGXdhyQI2DxSwRQ3jkZmfDPiaw8jrm0CMh0xRVXuA022MD78mhK\nCvkM4adDkCwPP/yIO/mkk9zue+7thg073bVvt54/X4uf8RMnu8kTxrvDI/8fpqSw8KUGeLbwl4qa\n/Kqqknq9d5Sl3A4P+MHIjySCtlYdm6aQskM5CsXjXGTh+JoTb0vXZPF8qJO0ldlgJ4tPVB1l5mPL\nnEuF4KCO4pWUNaBDeXBO1jxSSQnwUX7ppZcc4BFZbzxw0KsJ2AAy9thjD/fGG2/4S4EKIAInZXo6\nAThM7Ans0HUf2AGCAAzBhfLEYZN5pzSEPnNjSS7+k0SWF583Ts9AjWCH60LY4TyBiUyZ5oKATPTm\nYpRlAZfkALrkkDxmzFg3c8Z0d8GYi92AA/bz19b6Z+Fri3w3f/KdMf3LiVVrLUOa8uP+8pyeeuqp\n3vGzGvNk1bt8+oZQrnICPYNw1OVPT2RRKSeJqlxDD6rQ6Tsp0ai5zcNO0rlGOsYfVLrms44sWX5S\n6zSV78tuIGmSyGRJtQaojPlXxpL1QC8XelMBO1FTk1/iZeIfNaADtOjDLDgAaNgOe2hFPgi+FxRg\nwiLDqa5hDeTouPIDHpEH0CGvpIk8SQ+rDaDFmgVLD4E0kQeLEWDDQs8n5tEiAEDsR/49/npZiSQj\n8tBFfMFvfuNuuHF63UAHWbt07ujuuXuO7+XVv/9+DQHWlKuUwPPw+FfPJO8a1r6o2ccfKyWdtMYF\ndsJJc0uVE6sBActUUo+nUtMrN37UXOVBJqlHE72xdL7c9LN0nXqYYYWnR2jagll20nZHMiAPTVkE\nVf5+J4M/yK9/mBKfCkaBSmbw4MF+F4sOH2dZWAAOrCtYVKJ2at8tXGBx0EEH+RnI6dLNiy/LDtYd\nxsxRF29ZdrAwoVOapKjQ2MeaJCACSIAT4AZo0fg9WHZYkENj6AiAuAawAn4YA+iwww5TsdwJJ5zg\nLSfIhyzEOfvscxxdxK+Zck1Nm61yQuXZOPW0YW7uffe62bfMLqmbcp7kUnuYZ41Fgfsft+DwrPJs\nhM+o4mdpjfy8S1isLJgGaqUBg51aabqB8uGjzMeYdfyDnKViYtHBciN4i8vOeWY0x9JCJcM+MCK/\nGSCDwfuAnchp2PvFRL2yfDLnn3++N7HjH4OO1IyFD0/YfEQ8FsJWW23lKzLiC3RkgQGuAB0NXkje\nbOO/w3HicQ2LIMonGv2wj5xnnnlmzuTPeDz8O15vvfUiHVzgyzll6pRUgY7kF/DMXzA/08+byqN1\nCC08WyyFAnBAHKw+LcUtlE69z2HBZOHds2AaqJUGDHZqpekGywdA4IOb1Q8WVh1kB9iSAueAEEBH\nUBf1UHIsgAWAoaHj5STMKMWHHnqoBxCsJTfddJMfsA/gIR3W+MvgywOsnHHGGblZ06NuuA6ZCIIW\nWXTIC6gR6GDFiUMOTVha1FSm67H2sE1g1OU77rjDb2PVOfron7mFry50s2b/KpWg4wWNfgCet976\nY6Z9eI
CU0JpBhV9q4LkEkkJQKjWNesZHbjWFZ/mPUj11aHmXpwGDnfL01uavAgDo5UPln7V/mVQ4\nWKby+Q1QKan3Vdh8BYSETUm/ifxboq7kHlwAkU6dOrloHB134okn+udjxx13zDUX0XwF6GDZica3\ncQcffLBvNiLiDTfckGsuE+gAVOSFRYe046DDcVlxNCKymqTUuwrAIZ7AiG2OUeFEXem9jDh4zrxp\nluvRvavfT/NPv336u+222zYzvbR4zniWFJKapnSu2LWsO1gay4GlYvNprXjIzAK0WTAN1FIDBju1\n1HaD5YXTJB9zKs8shZbkplJS7ysqFQJgIYsO4IFlBksOzUNYWrC+0DuEeFhOosHb/HW//OUv3dZb\nb+0dhrHoROPi5LqgAifRyMq5aUq4ILTGkGYIOmwDLkALQT45NIux4IODRSmEnTANrmefcnDf6J4+\nZuxl7ujBA316af+hl9b+/fd1l1x6SUXOra1ZzvBdwHLBs1TtAKTL1yxL1hHJzR8lC6aBWmvAYKfW\nGm+g/GQhAR5YshCojDCjU9knWaSSmq8AGCAES0sIOnIOlpWFZiRAAwih59OyZcu8SugyTKUXDern\nrrnmGn+sXbt27sEHH3Q/+MEPfPMT14SgA9SwyBlZQAWoEMiLHleCHNbAEwvnCMgN3ITpCJguvHCM\n22DDDd3UKc3n+vIXpvjn+mkz3E0zboyGALgzFf47VNwsClgtahHIh2cKgMhC4H1D5qxapLKgY5Ox\nsAYMdgrrx862oAE+YjT5RCMEt8q/2BayL+m0mgCoII78qkdZmIDKwrFCzVdADlYd1sAEkALkABoA\nCNYV8sIJmLD22mt7B2biKQBdW2yxRa43FE7EnAecSFOQA5ywcFygQ14h6GDREewItkiP+Cxx4Inm\n+HKjR5/v7rt/ru/mLZmysmZW9W7durohJ59UF5G5dwoAM0utA4Al2El6lmstT6H8eBcAHf5kjLbm\nq0KqsnOtqAGDnVZUbltJGsdaLDtUAq1htq+GHvXBRT7kTQqcK7b5CtgBQoAJLCmADs1UgAfAAWww\nxgazlYeB8ThGjBjhormtPMBwDeBCZSAwSQIdrDSkCUgRn3y0sM/COSxELITQIgUsYeFB5p///Fi3\n48493dnDh4WiZWZ77oMPu1OHnOxq1TsLCOb5UeBepSFgJQF00m4tUTfzxwNITIP+TIa2pQGDnbZ1\nv1uttHx0qRT4oKXNj0Cgg1z5Prj844z3vgphAUiQn46ar2jWAkAADaAFJ2QABOgg4J/D6MoK++23\nn4dC4qv5ifjs47tDfqTJObq4AydACgGAAaLC62TN4frQooNMBNIjyMJDWgsWLHDHHXe8e+LJJ1Pd\n+8oLXuCnNa07PC88ywpAcNqeacmGlZJm0rRaVtP8XZAObd02NGCw0zbuc01KqQ8blpO0WHgEOijg\n8TwgVm7zFTABZAAs9LRiAUCAHXRw8sknf03vjJAsQAqbvYhIMxZNTn/961/da6+95iGF4+uvv77b\naKONcqAj4AFyWLAsyU9HoMN1BAEPaQM9Z5xxpltlte+6MReO9uez+iPrzqI3FlWlCDwbCoBNWp5f\nyZS0pilLYJZGy6q+B/neu6Qy2THTQGtpwGCntTTbRtPlA4dZnQ9cvSsMKgNM6BtHPhXAR75/58hZ\nbvMV4KFu5Vh32B82bFhuCokddtjBD+bXr18//0Qwl87ZZ5/tgQdrDWAEqAApgImsMKw5xjkGLGQR\nHDHvDH5AgFYh0AkfQdJm8MPte2zv7phzVyZ9dcLysN2rV283dOiQsnpm8WywKKSlaUrytLSW7Dzb\nBJ5vgCefP5qPVKOfYv5g1EgUy8Y0kNPAf0Xm+9G5PdswDVSoAeCCUXl33313DxfdunWrMMXyLuej\njwyMZ0NFAIQkBR5/Jshk0D8AjXiAgSwhWFrUhIUvjfx0ABWsKlh15KvD
OaAG52bC8ccf7/Ned911\nHb2vGF2ZuXyooNinmYqFPOREDOSQt6w/yLPOOuu4zTff3PfcYk1FRzrvv/++n64CfcctOvGycp7e\nX0uWfOCGDqmPY29cpkr3P/v8C/fyy6+4n+7at8WkqIBxzEZ3LLLecC9YshSQnxDKDbB37NgxaqI8\nzo/9tNtuu/k4tf7hHSJv3vvZs2fn/YNRa7ksP9OAWXbsGWgVDdA0FFpVwg9zq2T4VaJUaliX+OgC\nMvxjz2dhqmbzFbOeU9GwZqRkYOuII47wlhogCT+fAQMGuGeffdZLOnPmTG+pUTMTVhoWWW+AHBaB\nlPaxBMmiA8AwenPoX4Ke8+maiT5XX2PNzDomx58bjbuTrykLvfA8EAQ38TSytp8EOmEZOM97R+AZ\n5PmvRUDPvG/8sWCdlaEoaqEbyyMdGviy20Y6ZDEpGkgDAAaVDR9bRloGQPShbo1i6mOrip68+OCy\nH8JAmDcyEfbZZ59cBcE+lhUchWVtwWLDgoMv52TVEYDcf//9btddd/Wgg28NfjlMIEo8FpqaAJSr\nrrqK5H0YOnSoTxMIYqF3F1Ye0ie+eneFjs8cC5u9gB0qcXSshcQBPS2q7Dn+wvO/cT133onNhgjM\njt6uffvc/aWsKjdr7r30kg94s6QIvT+UK1/gHM87wMPCM67r8l1T6XEAR/mSt4FOpRq161tDA2bZ\naQ2tWprNNMDHln9706dPd8wBxcewWpUPafOx5V8saZIPFVwYqAQFXjpOvEp7X+GQfPnll+f8c7bc\ncksPOsx0jsWGRdBETy6sMPS6Ouqoo7wYvXv3dgcccIDfpkkM3x+sQlzPmqklOAZUyZoDCBFaarby\nkaIfyi3g6dWrl5dJ5xphfcyxx7s333jd3/dGsd4k3RcBSyHQiV/Hfedd03sH+MTfjfg1xe6TNu8c\n7x6BbVmU/AH7MQ2kTAPLpUweE6cBNcAHmo8i82gR+OAKTPgHXmqgAhfcYDWiIpBTdNLHXNYP8hL4\naKZx5OK84ASfGSw4surIr4bjBGADMMHSQ7PV1Vd/OQLx4Ycf7p2cQ9DBSiMrEdBDGvhV7LXXXj6t\nefPmeWsQcdScJWsQFhxZcgAdwQ4XFgs6xEXP0skhhx7OoYYKG2+yqevdexdfxmoBdNoUVA7oUAae\na713vIPADmsAqJz3DjlID6jhOec9ZJ/jBjppe2pMnrgGzLIT14jt10QDwIkA5d1333U777yz/xDz\nMSbwoWbhQ0oQpPCBJVCB84FlIV6xgeu5FisLzVfIQAA2gBEgB5DRmDrxwQOxsvzlL39xwA29mwjX\nXnutHzwQCAmhCcABnOSzwzxan332mV+oeOhiTmAKCVl2KMuaa66ZmyUdyw7QIwjyF5TxAxy++94H\nbtwVl5dxdXovoQv6zBkz3KybZ6ZXyAok0/Ov96KCpPyleueAHXogbrXVVv69C0GRbb1n+d473qFq\nyVRpmex600AxGjDYKUZLFqdVNRB+UNkm8JFnO/wI6wNbyUdWzVfk8cknn3hQAlBkgQlBJ2nwQGY6\nHzJkCJd7AGG+q2222cbvAzukAzSp+Yr0gB3giUU+OoAOwEPo0qWLGzlypO/ZRdMVwEPvMDVjAUJY\ndki/FKuOT/yrn/ETJrrPv/hHwzgnq2yNDDvVBh3pTOti3zveQd658F1UGrY2DWRFAwY7WblTJmfF\nGuDfKvN4EaZNm+Y/4FiUgB3Biaww4dxXnAc26KI+fvx4fz0jHD/wwAO+S7ggBMgR6Kj5i/QEO+pm\njrWH/GjGuuKKK3x6jM2DLHRlx5oj2NHYPaFjsr+gxJ9xV17l/vHPfzcc7Cxd9qFbv327XDNgiWpJ\nbfTWBp3UFtwEMw20kgaWa6V0LVnTQOo0IEsKzVdsYynCnE9zlGAHIMEawxL2vjrmmGNyoIPPDV3I\nv//97+csLQKdeDOYrENKC58fjbjM
mDw4KRNwdAaKCFiHJAfpIRvph749PqL9ZHrKi3y3T01IWFMs\nmAZMA9XRgMFOdfRoqaRcAzRf4aOAxQSnSgIWm5122skP0PfWW295wAA4AB0gA7jAx2bHHXd0Cxcu\n9NcwieesWbPcWmut5UEnbAIToKipSk1XHAdW8LtRl3LkwMkTuThGOOGEE7wDNHGBI0EX17PPcfJj\nsdCYGgB0gBwDnca8v1aq+mnAYKd+ureca6QBKpBCva86d+7sYYLJFAELwQnXaZoHRJ08ebKf+oEm\nJcBFoCNrjpqrBDnsc454xOc6HJzpsg7ssGy44YaOAQYJOD5PnDjRx+c65AjhCwsPAEYw4PFqcAws\n2OEHHb7cyfivQKcUh/uMF9nENw3UTAMGOzVTtWVULw2EzVdhF1nAQc1XTMnAfFPPPPOMB58bbrgh\nmndpqBcZB+F77rnHj4As3xlOcH0IJVh05OujZjDiqbs6/jeADj45Wtjffvvtc5OG3n777b4nDFYc\npU1asu6oSYt0Swkan6eUa7IQ970lS9zW22ybBVELymigU1A9dtI0ULEGlq84BUvANFCBBtQjBN8Z\nbSclJ9M+PULUOyQpXvxYvuYrQEXNRbKgADIMDMjknbKcbLrppg4AYeZxzuOozDmuBTy4FhjBAsPC\nPpDCeQKQQTMVFh18dVjYJi0cm0mLOMOHD3d33HGHW7p0qe/tBVzRzEV6nNeChUgO0dqOlzlp///9\n3/+6dxa/nXQq08fozp/1YKCT9Tto8mdBAwY7WbhLDSYjPU0Y7+PGyHdGY30IYICTpECFwHWMF8N0\nDPSGwkpzZORonK9LLNcUar7CD0bWE6DiT3/6k9tjjz1yoIMDMlNBYH0R6CBbCDqCHPnXyBGZeFwT\nBx32sRQJXoAd4AX4onfXD3/4Qy51F154ofvlL3+Z891RfK1D0OH6lgI6mvfYEy1Fy9z5P/7xLdex\nQ3absQx0MvfImcAZ1YB1Pc/ojcui2MANCx94DQjYs2fPkgYFVLk1OBrp4eMAJMUHGKSCB6aKGTyQ\naRx+/vOfK3l34okn+vSw3jCODtYYQAM4AWiAI4EOa6CJ44IXgU5ozZFFJxwNmQzVlEY69913n59X\ni+Onn366l534XEvTF+BFcxjQRB7IVAzsYDXT6M6k3SiB6SJ6dO/qoTdrZTLQydodM3mzrAGDnSzf\nvYzIDphocsAkKKm0GAAPC5UHlp8jI2sP+RQ799XNN9+cswAhCyMa9+jRw8MFIPLmm286oIwQBx35\n0xCPAHzEQQfgwZoji46sMkAKcCTfIQAK52ag69e//rVPj+YsYI5rSUc+P2wDPICQ0vMXFPjp1au3\nO3P4CLf7T/sWiJWtUx07dHSzb5md17qX1tIY6KT1zphcjaoBg51GvbMpKBfNToBHCCGtKRZ+P+QH\nHGDRIcyZM8dbaIAKltCKgkPxGWec4X1lJBeWlXbt2uWsKEAG8ILlp1u3bs0sOoBO3D9HUAKMYI1h\nEZSoCYq8QmsMctE0BkiRJsCz+eabe8sReT/66KM+PunIyZm1IArgIb0wTZVHayw7hxxymHfmHXPh\naB3O9PrZ5xa4o44c5Ba9sShT5TDQydTtMmEbRAPWG6tBbmTaioGlhWYkFkFPa8uI9YW81OOKeX+0\nTd6yoAAo+OccfPDBOdDZOBrb5KmnnnL0ygIqWAQnXLfddtv5EY+XLVvmp3ygyQlLjByRgRLABggJ\nF45xLmy6SoISrDPEAZa4Zvbs2V5dABCjNgNEoVWJvCkH8IZ8BOIoADeyqHEPaMJ64IH73ROPPaoo\nmV/fd/9c12/f/pkqB0DOs2bdyzN120zYBtCAWXYa4CamrQhYV6hoWdT8U2sZ+fcsKw/WnVVXXdWD\nAZaT1157ze2yyy7egoJcgwcP9ousMvjGCFKAEIACuOBa0gWEGFQQuABcgBmOcQ3WFjUxkZ4gpyXL\n
C2mFMAZMXXTRRW7ChAledQAP0CKokv+OLEjvv/++e+WVV7zzNhWqLFuh3oE/xvK57fY7vZ9LeC6L\n2zTLDR06pBnQprkc3Jd6vQ9p1ovJZhqohQYMdmqh5TaSB9YELCmyKvAPtp4BOQCedyJrzyOPPOIt\nLnPnznUHHHBATqwLLrjAz0mFFQdwkFUGqBDoYEEBdFiAniXR2C50ee4Q9QISfAhyAB7Ah+OAjvxp\nkqw5OSG+2gB41NOLvAAeoAw4I4T+O3/+85/dCy+84J5//nm3YMGC3AzsXyXlV8ANlasWrAkXXXSx\n++jjTzI/+/mtEbBdPWmie+yxeWGRU7ttoJPaW2OCtRENGOy0kRvd2sUELKhUCXzY02SmHzRokLfI\nHHjgge7cc8/1MvLzq1/9ym2wwQbeOiOrDtDCNpCCpUWgA3iwrWYr9hcvXuwHBJR1JQQdNYGRTzGg\nQzw1Q5GH8mXcHcb+UWDcn7ffTh4vB6fqPn36+MlOe/Xq5e+B0mTNgsz4A7268HXXpXNHJZu59aGH\nHeG6dd0uGpPo5NTLbqCT+ltkArYBDRjstIGbXIsiYtHBgpI20KGCB1qwsigABWPGjPGWHM4BJnFQ\nEehgyWHBX0agE/rWvPzyy27rrbfO+fqo2QpYIhQLOpJNUILV5qGHHnJYoph0NCnQNLfnnnvmFixK\nyp/4SouyaKEMZww7K7JgrZRZ687cBx92p0aQM3/B/FRBddI9MtBJ0oodMw3UXgMGO7XXecPlSLdy\nPuosabLooGgqfEYmxqqjQFdyLDMADEFNToACkMI1nMO6ItDhGOBC3NAKhFXnjTfe8D48DELI9Syl\nQg6+QNLhY4895icglbzx9VFHHeWb55ADSMN/J+ydRd4syAzcaAF42MYyxBQVWbXu9Nunv9ulT+/U\nW3W4nz2/snbG76HtmwZMA7XVgMFObfXdcLnhhHzkV93L6+2jk6RcVfgjR4507du3d1OnTnV9+/Z1\nxx57rHc8Fhhg3QkBAcgBdgREAAwwBFzE/XOAjg8++MB9+umnvgmJdFoKAhvWgA7XhgGrDbOtAyXd\nu3f3TU9hd/SHH37YRxfwADuyTgm2KLsAB8jRNutJk652yz780N0ye1aYbeq3x0+c7ObccXvqfXUM\ndFL/KJmAbUwDBjtt7IZXs7j46QA4N0bdzMMu3tXMo9K0VOFrfJ3f/va3fibzKVOm+JGRqfiJIygi\nHoDDAiAQAKHQEVnAA2iwyD8HfdALKunfPJWfFqa7iAemv6C3Fdey4FwsOAG6sES9+uqrrnfv3v5S\n1synBVgJwgQ7yCrgiUMO+ywfffSRO+vMs9ygo452Q046IS5OKvc1rs41U67xOkqlkJFQ3GfuoQXT\ngGkgPRow2EnPvcicJAIcrDtpDQIZIEYgQzduYIcZzsPjQIXiABoEQCberRyoAHLkH0Mcgiw66APw\nwWLDkg9uqBC1AI3IqiD4AkyQi95ZDDaI3JdccomPdvHFF7tOnTr5fJFRcqpZTvIImkiLdAV4L774\nop9t/Zln56e+K/rSZR+64447Puqd1scNOfkkqSl1awOd1N0SE8g04DVgsGMPQlka4KMup+S0+enE\nCxSCg2CG5iH8eAYOHJjrUi7YEegADQIINV2xH4IOFhTABqBBJyxJY9xguRHYsE6CG0GI4ERr+Q8B\nO59//rkbMGCA+8Mf/uCLCfzgswN4ydIEjMm6g3yCKK2BIC1z5z7gbr75JjfzplmpBh7mwKLss26e\nGb+9qdnn3nNvLZgGTAPp04DBTvruSSYk4qPOMjrPLOVpKgSVvIAHgABqcAI+4ogjPKQABlhOgApg\nCBAAHgAbQQ4AIYhgAD9GWwZqgJwkuKEZCqChaQoHboBQsIFuQrAJZRPgaI08su7gR0RzFqM47733\n3l7FG220kRs1apRPGwsTwCNZ41DGecrGOlxGjb7A/TNKd+q1U1
RK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 100, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "Image(filename='sentiment_network_sparse_2.png')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 5: Making our Network More Efficient" + ] + }, + { + "cell_type": "code", + "execution_count": 105, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + "    def __init__(self, reviews, labels, hidden_nodes = 10, learning_rate = 0.1):\n", + "        \n", + "        np.random.seed(1)\n", + "        \n", + "        self.pre_process_data(reviews, labels)\n", + "        \n", + "        self.init_network(len(self.review_vocab), hidden_nodes, 1, learning_rate)\n", + "        \n", + "    \n", + "    def pre_process_data(self, reviews, labels):\n", + "        \n", + "        review_vocab = set()\n", + "        for review in reviews:\n", + "            for word in review.split(\" \"):\n", + "                
review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " self.layer_1 = np.zeros((1,hidden_nodes))\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " self.layer_0[0][self.word2index[word]] = 1\n", + "\n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def train(self, training_reviews_raw, training_labels):\n", + " \n", + " training_reviews = list()\n", + " for review in 
training_reviews_raw:\n", + " indices = set()\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " indices.add(self.word2index[word])\n", + " training_reviews.append(list(indices))\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + "\n", + " # Hidden layer\n", + "# layer_1 = self.layer_0.dot(self.weights_0_1)\n", + " self.layer_1 *= 0\n", + " for index in review:\n", + " self.layer_1 += self.weights_0_1[index]\n", + " \n", + " # Output layer\n", + " layer_2 = self.sigmoid(self.layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward pass ###\n", + "\n", + " # Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error is the difference between desired target and actual output.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # Update the weights\n", + " self.weights_1_2 -= self.layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " \n", + " for index in review:\n", + " self.weights_0_1[index] -= layer_1_delta[0] * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - 
start)\n", + "            \n", + "            sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / float(i+1))[:4] + \"%\")\n", + "    \n", + "    \n", + "    def test(self, testing_reviews, testing_labels):\n", + "        \n", + "        correct = 0\n", + "        \n", + "        start = time.time()\n", + "        \n", + "        for i in range(len(testing_reviews)):\n", + "            pred = self.run(testing_reviews[i])\n", + "            if(pred == testing_labels[i]):\n", + "                correct += 1\n", + "            \n", + "            reviews_per_second = i / float(time.time() - start)\n", + "            \n", + "            sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + "                             + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + "                             + \" #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 100 / float(i+1))[:4] + \"%\")\n", + "    \n", + "    def run(self, review):\n", + "        \n", + "        # Input Layer\n", + "\n", + "\n", + "        # Hidden layer\n", + "        self.layer_1 *= 0\n", + "        unique_indices = set()\n", + "        for word in review.lower().split(\" \"):\n", + "            if word in self.word2index.keys():\n", + "                unique_indices.add(self.word2index[word])\n", + "        for index in unique_indices:\n", + "            self.layer_1 += self.weights_0_1[index]\n", + "        \n", + "        # Output layer\n", + "        layer_2 = self.sigmoid(self.layer_1.dot(self.weights_1_2))\n", + "        \n", + "        if(layer_2[0] > 0.5):\n", + "            return \"POSITIVE\"\n", + "        else:\n", + "            return \"NEGATIVE\"\n", + "        " + ] + }, + { + "cell_type": "code", + "execution_count": 106, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 111, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "mlp.train(reviews[:-1000],labels[:-1000])" +
] + }, + { + "cell_type": "code", + "execution_count": 109, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):1581.% #Correct:857 #Tested:1000 Testing Accuracy:85.7%" + ] + } + ], + "source": [ + "# evaluate the trained network on the held-out test reviews\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Further Noise Reduction" + ] + }, + { + "cell_type": "code", + "execution_count": 112, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": "
7afJXXwMGO9XXaZtNkQ+dPrJ8dOv9zw7gwRGTnif33Xef99MZNmyYmzlzpr9HM6LZsqno\ngJEQeLD0ACzyfwFmcBhOclwGejgOFFHm8847L3f/77zzTkePLTWPATxMP8GUEAz6h1zoiPQFVaQn\nPx0ck9lGr/QAw0qEXFim6EqPzAIzWXVymefZGDNmrPvrRx9nfvbz66fNcDfNuLFiK9U7gdWHe1Ev\nOM9zu4o+DGDgO/Piiy+mogxYlQRfWdVp0cq3iJnQgMFOJm5T+oUU6MiEnQaJkYneWVQE/Mv89a9/\n7TbbbDMPPVhngBy6owMKQAOQI+tO3OFXTVpyXAZKQiuP/Hj4Rxs6Lp9zzjl+IlGAB58hZmQnAELq\nESOYIg3SBHLoKi6QwoGakZ2RiW0WtklTPb8oQzGByn2TTTZxry583XXp3LGYS1IZ59DDjnAH7L+f\n22+//lWTj+eF+6fAs1xvYJcshdaypADblAGAT0NQkxpyGfCk4Y60bRkMdtr2/a9K6dMIOioYIMFC\nhcBoyvzzZRoJZkcn4LiMNQYrDsAj2Al7OKlHFengw6Nu4XHgoZmLc+iDXl9//etffR7y4+nWrZtj\nfi3m7rr33ntzPbUAKebiAnaUZufOncvufeUzLfAz7MzhkZz/l1nrztwHH3anRuPqzF8wv1VhBPDh\nXhLSavUJQSeNYGbAU+BFtFM11YDBTk3V3ZiZpf2DC6QAFMiJrwzWnLA7Ot3SaX6jmQmLCcATWk/k\nb8PdiwNP6MeDVQZgwfpDPHpbATHxQPMV0MPov4AUsnXt2vVrzVeAExYbLFChEzUyltp8FcqAdWe3\nn+7mbrhxuuvRvWt4KhPb+OowvEA1rTotFRzoSZvVh6YiYAK50gg60inNWSxpl1Py2roxNWCw05j3\ntWal4iPGR5cKNM0fXEEKzrxYWrDmMMjgwoULva7ydUcHLORgHFp4ABT58WCNkUVG2/Ljofnsiiuu\nyN0Ppr+45JJLPNysvfba/jjWIqCJ5istQBMyC8Aqbb7KCfDVxvgJEyPoe9Tdc/eXc4jFz6d1nxGT\n5z/3bN3lrrfVh6YhmkGz0kSErAAj8lowDdRDAwY79dB6g+QJ4NAWX8/eH8WqEnAgvP32227rrbd2\nN910k/dd2XLLLf1xwIPeVGF3dFl45GAsh2V/QfQDpLAANsCKgAcLD9NR/OY3v/HnQ6dlrmUU5+OO\nO86DDJYboEkWIgYPZJt0yY+8JUfYtBaXRTKVsu63T3/XrXsPd/bwYaVcVre4jKuzfY9uqXHClSJq\nbfUhP/xy+JORlTFtkJlvBfJmRWbdX1s3hgYMdhrjPtalFDT98AHDupOFAPCwjBs3zs9kPm/ePPfq\nq6/mHIWPPvpoPxIsIIFFR01HrOUMLMgQPGHhEfA89NBDOeChmQmn4rOjEYs5Hg+Mtjxx4kTfTAXs\nhP46pAd0Vbv5Ki4D1ol+e/dzvxh3pRtwwH7x06naX7rsQ3fYoYdGTZGD/D1KlXAxYUKrD1DCUs0A\nLJBH1qwkyIuFB9mrrZNq6tfSakwNGOw05n1t9VLpw5X25qu4IoAUAIVZ0bfffnv/L7OY7ujAD8BD\nsxIggkWGjzbj+JAeC+kBLbLyPPfcc+6QQw7xIjDwIB96Blr87W+/nH183XXX9QMQrrLKKv46riUd\nAr2sBFuh/1Cpva98Ygk//NOm0pz36Dy3wjdXcDNvmpVa/x1A57jjjvfw2NIAgglFresh3g8WBf4g\nVBJIi950DKuQRWDAb46Ar5EF00AtNWCwU0ttN1BefGgxo+vjlaWiATxYdfbbbz/38ssve7DYdttt\n3dKlS30xnnzySQ8zYXd0wOONN97wXcMBHmCHwQHxUyI9QZSsNAAPlh0G/0NXs2fPzo3HwwjP+tgD\nL1iaGDsH0CFd5Ss/HZqx5Dsky1Il+qbCBLxw1ib07burey9ymk6rw/Kpp
w1zb731Rzdj+rRU+4UV\nc09CawzPBUspgfeNa3j3shiqBWt0MsDnjoAP3FlnnZVFdVRV5qlTp7pjjz02lybfpFqGSy+91E+X\nQ54vvPCCwz8yTcFgp4p3gzFc8AkhxF/AQueqKEJNksJHB6sAH64sBsEJ1p0ddtjBT/fAGDg77bST\nL044O/qyZcty3dFxbGZUZABF0AGcEJQmwEIzFM1XckzGdwerkHprYcHhYxB+oOldhDwCHaw9jBHE\nOrQqkZ/y9BmX+COL3KeffurTZ5+mSLqj33v3XakCHiw6o0ef7z788MOGAJ34reL9Cd+hlqw+xMWq\ngzUxzZ0B4uWM7+sPkoA/fr6YfX1PN9100wiE3yrmkoaIE777U6ZMccccc0yuXPWGHQTRfQF0+Mal\nKSyXJmFMltpoQBUma16QUgMfqSw7Gar8/DvGURnfGEYkHjVqlFfFww8/7LuNAy0ADt3CGSMH6ABU\nsN6ouUm6U5pA0CuvvJIDnRtuuMFtsMEGvkmK67EKAUadOnVy1113nS53EyZMcFdffbW3/nCQdARV\nYdMV+ZQbuG8AFaCz1VZb+WY4QAd5aB4686wz3VGRT8ytt99ZbhZVu05NV40KOihq48hCA+BoATy1\nhBAkpfK8Mrt4lkGHslAORnumKbWcgAVBfyrDPwzlpGXXVFcDuh801ZdTt1RXmuapGew014fttaAB\nPsIMzqd/Zy1ET+1poIFK5r333vOmV/xrgBr8bgiMj0OlomYprDL0tqJ5imOAEMADKCgIRHB0JuCE\nfNBBB3lIoimKpjAsN4AM12Ht4YMADBGALHx6gBECceQfpLT9iTJ+uF+DBw/2V1JhUqlS2ZIHC+U5\n7LDD3HkR8I0cMdzRxbtegUEDe0f3hmZAusZnvXIvVo+CHtYEgQ++YQRZVP1Ohn947hjUk/KUGrBq\nATuE1VdfvZllo9S0Gi0+Vh69z6zrEQ488EB/X8h7+PDh3gpZDzmS8lwu6aAdMw3k0wAfKCbQbIQK\n6IknnvD/lOnuTdMV1ptrrrkmV3RBi6aIiAOPYCf8sDCQIH5AzH2F1QirDJYjIIdFY/aQiZq8+De0\n5557+nxxPB0wYIB7J4JKAASwygdXOUELbKjLL/+kCfgHYeHh/unDiByCut12+2k04OJE9+Tjj7m9\n9urn6O5dq4A1h5nMGR354rFjW5zNvFZy1SMfgEDwwzaWVCAYS1wjBOCb57DUwJ8DgIcQNuGwzzn+\nFLDId4UKV8dY826pgwDXxAMgRVNMeE1oSYrHZ5/0SFfXrLHGGl4WrE86xlrWKKXBflw+4nEsLiPf\nJ86FgTJyTBaUsPyKi67Y1oKc8RCPc9tttzWLgn+U8lI6yKN8w8gAKMBDIN14WmHcmm9HH7y6hsi3\npSlqdwVDcwvHonbYZnJFD3bufKTQr50P02A7HiJH0SbSDfNhOymv8Npi5eOaUAauC0Ohc8SL/tU3\nhWVEtmgqg6aoXTZMJrcdpoeuuD5qJ82VDx3FZSC9ePm1ny+fXIZfbUSg0xQ52MYPZ3b/d7/7XVM0\n0F9T1DzVFEFP01/+8pemCOhyejrggAOaIoflpmeeeaYp+gA1LVq0qGnJkiVNPE/RAIBNEQg1RbDg\nyx9NRZG7bs6cOf54BBFNkTWo6bPPPmuK/H+aIifnpsiK1BTN09UUwVBT9MFoirqgN02ePLnpzDPP\nzF3PfYnAy+f10Ucf+bxIh/TIT3kWUjzyRH4/Pk3WyKTA9aRFuSlH9GFqisYG8vlFH+GmaOLRpmHD\nzvLP9CmnntEUzaWlS6u+/mDpsqYxYy9r6vCDDk3HHHt80+LFi6ueR9YTHDp0aBNLowSeN55x1qWE\n8LvHNy8MfMP0PeNbGn4PdVzr+LV8QwvF53sa+aCE2flt0lGa8XX0J6bZubBOI614/Ph++P0u5tsd\nlp+0FCL4yOVFOeKBfJQ35/m2KYTnF
Cdco+d4uPXWW3PpodO0hP9opMYSlfpwEZ8bIUWHSo7f5PiD\nzIMVXqs0wnX8mlLlQ33hixg+qC2dK+eBiucVliXc5iVRKOaFUdx8a9KudmWErrmH3FNkTLpXHOMc\ncYjLNdUKgACVe9RM5aEk8hNpihyGc8/a+PHjPfAAKVGTQtMf/vCHpqjnVlNkNfHXCHgiPxh/DUCo\nEFlnPFBE/8r9NcDO/Pnzm+6///6mW265pSn6d9t07bXXNl1//fVN0WzsTZGPTy5fdH3iiSc2RXN5\nNUXzbDVF00s0y68Q8ACkAh3kAnwIAiVBGIDHx43yADmvv/56U+Rz1BRZp5qiMYg8RA8cNNjLBPQ8\n8+x8Fa3iNQAlyDnk0MOboslPK06zURPgHlZbP/V+7yhTCOAt3bs4IMTjx+uB8DsY345XwoVAR9fy\nDQpBgO2kb5Xix9fhNyv8fsfjhfvKr5hvd7z80k/8eBzaQhhiWyGEllCm+Ha8rkPmME48P6Vf63Xd\nYKechysOBXp4wgcnvFkos9gHkjTCUI58oRzxByDfuXIfqDC98MFK2iYPQjEvTKiD+DYVJlaQagXu\nX/iiJcle6BjX6hmoRKbIf8BDBtASNVX5f5vR3FVN7du3z720WHeeeuqppgULFngYAAywhGCxweIS\njYrs4wIY+rcq64nSxCIUTU/hLTsPPvig/9Bzb6Ju6R58br/99qbIH8pvR6b0XN6Rk7TPE6sT+ZEe\nsgJSScCDBUB6A7xCeYjPtYAd8AREAVMAHJCD9SrqPdb0/PPPe7CTJYsP1siR53jrS8+evZrOPmdU\n0/0PPFSy2oGlqyZMavr5Mcd5GbHkVLsSL1moDFzA/axWSMt7x3M6atSooosVfv/jsEIi8UqdOKpo\nqQfi3z+BRPy68Ntd6FwoD/cnvC5u1eG8vlVxa1B4Xfycvt1Skt5r1sgWhrisOheHjzA/4oTAFuYX\n1jGhLilHqMs4BJJmeG08P87XI9TFZ4e2vrBNMlIGb7JfohsW3ccvQ/SRbtYuiG9DpESd9o5qpBVV\nPP5YpHTf5TsXIdrgPOkohHmRngJpqH2xXPmUVilr2mcVogfKd9dDF9ED5Wfk1jnajcNy6LjWYbmi\nB1aH/Vq6jl4kr+PwJPomv8hiEh5O3MaPZOPIf6AaAV1vs802OZ2Xk2Y10iBffCOYnBPHYRb52Dzw\nwAM5sU4//XSvp8gi4h2V5aysbuQXXnihjxtZVJr5w8jvJgIM39OK6zkWTkuhbuYRKHkn5rXWWsv3\n1Np///19ms8++6zXFZOH4jeEkzTpkY7eGyLin0NZrrrqKn9dVJl4J9DQP0fyIDdl+Pvf/+7n42LN\nwjHSjqDIt/MjJwvv3dlnj3CvLnzVDYl8ar694jfdZZeM9XGYdiKCFu/UjGNzfMEP59DDjnAdO3T0\nvb1ejXqrMQEpz/OUayZ7mb3A9pOoARyVIytI4rlSD1bjnalGGsiN/5Gcr4sph75jxA3rgXzX8h3k\nm0pIqhv0PcUnRYHvYFgvsM+3VYG6QQE9KMSv45p839SwHMgV5hdBRLOySUblU86aPKI/hrlLw/zZ\nVh7EI38Cx1Wvsh/qEt2zT3wC14e64Fh4f8J0OFevUBfYKffhQkkhDPHgyTOfc3EY4lh4E5IeyPCm\n6CGoRD7yLDZU+kApn3i5eLDDh1sPs+KXu+bD1DOqTCsNPPw4vFVDLtIgrUpeKACOCoUA7NA9HOBZ\nb731vEMvxyNLh8OhGVgABoACAc+h0TQGBJyMGaxPAWAAbgALAEVLCDuADjDChwPYwbFZXdSBFWZk\nJ3AtlUPkO9QMeEiffCKrm+/hgoykA3RpGg8BkWQnLaBJk46yZv8t+iwAAEAASURBVB85iUOQHnCw\nRh8sHCNQxnNGnu0ee2yev4ennTrUde7Uwa280rc9BAFC4bJCdNkxPz/aPfDgA27RG4vc1ClXuyMj\nB
9VGcHL3CmnlHyC2GrpK43tH2YoN+j4TP/xuJ13P+XgcgU88fpiuKvswTvgtRYd8c1haui4pLdKl\nntI7GVldfFbUOdRlOBBX8i0L5Q63Q1nC+i3cJo4AJiwbeovrknhxvYT5hfHDtMI4td5evtYZkl9Y\n+PAmSBaUKIuHHi7dBOJTuYuw9WCg3JCQlVaYV9LDjgUlHsJrSpUvnlah/TCfQg9UvKzxNJPKxbEQ\n9OLX1HOf8sRBh/vHfec+J5UHedEX1/GChrrjGGmG/8BKKZ+sVerBIOsOH6SDDz7Yd0OPHIr9BJ6a\nHR0wABCw6GAV4trI7yZnEeHaOFzIasI5AQQ9tDQNBWkALoIjoAq4nDFjhhs4cKAvEqM+n3POOe74\n44/3cYGy++67z9FzDGjZaKON/NAAGj+Hi0gTWQReyIHsLAI2zhFUdsmlHmQcl5XHR/zqh0oYGVks\ntI4GqvUnI43vHXBebAi/GaoP8l0bVrb54ui46hD2k3orKZ7WoRw6lvTNKiSDvlmq55ROa635tvKn\nkEDefD+ROfyOhvASlpE4+jbmky+MH49T6Fw8bmvu1wV2ynm4wocbqKEiD5UYWnyksDAfjhV6+HQN\n6/C6Yh/+UL4wrULbofyVPFDFlquQLMWcw/rBP/JKQ/hvgrS4d/lMvmFeIXiSBt0fFeJp6nipa15q\nKnUqd6waVPZAFOkDBjQtMQYPIEIX88ip2GdBRcJYOkAD1wp0BC6hVYc8iAPkMPYOiwYOJF2uAYYE\nIhtHlicg66ijjnKRj4276KKL/HQXkYOzY0b1SZMmeRm6dOnimOqCZxGgIiBHKEscdMiL8wRkAp6w\nLGlBRll30Auyt/Th84nZT+o0EH9H6v3e8VyXEsLvZSnXpS0u5aAJP6xnkJFvIN9yviXxc5WWgW8C\n3089A6zJS3+Idb7SfNJ8/XJpFi6fbDws8Qc/JNR819nxyjVQ6gcqKcfwXvGCFwM68XRk4dPxME0d\nq2RNxQ5wqDmLua0IwEjUO8vDhORm9OXddtvNQwqwo0WAA2CwcC1WFtIGogAKQEdrtsP5sNgHNpCB\nj9Gdd97p+vTp4+XAj2fDDTfMgU7//v19UxvNYuQhaw4gA9CQv/xz1GyFfAIdgCaEL8mFnJwDhJDb\nQnY1EL4jaX3vCmm3tf7UhenKr5E/C/mWML7kDXWrY/mAJYQZ0qJ1gbyAz6TWCaVX6Tr8s4i8Ah/S\n5RzfGIVwm3P5dKHjScYGpZWWdV2+XuHDUs7DlWT6Sxr4KbxhKDzfwxe/GZXKF08v334oX6M8UPnK\nmu94qOt8cfIdr+TafGlyXNaLEHi6d+/umL+KEPWaauabw4suoAkBh201FQEcAh3gAYgALljYDhfg\nBysRi2Y8B3iQJ+q94icw9YJ89cMgj9E4Pd6vh3yAKlmIkAsZkkAHeSgraQt0lC8yCHSAPvKWXsK8\nbTubGqjk3ank2kq0FX4v4392K0k3bIJKgpaktNFBKE8IDoqfdIxz4XGgM9Qn5Sq2nlI+xa7154z4\nWHRCOcImLM7HdVKJvsPykXa9Ql1gJ67IUgoPFesm8bApLW5G6KxMmpwPH8ikh4jRLvUR1/VKkzSK\nffiJW2qI51PJA1Vq3uXExz+jlN4TxeShe1lM3HicSq6NpxXf55mggseiAZwABCNGjHCdO3f2UeVY\nSC8twEA+MFoLMliHFhQ1FQl0SJdFPjzKS/CBhUUL8BF1f/cWnlBepu8AdtSjSjJoX47I7CMPQMQ/\nMspH3oIrwArYCUEHefV+hHnadrY1UMm7U8m1lWgtrDSTvuXlph1aPPgjrXqA9MgHVwa9A4yurBAC\nAvVSeB3bHGsphO4Y6DVsmm/p2lLrC+rCsKySL36cfKmbpG/yQa6wLuTasO5UWpI5vD9hPafz9VjX\nBXZChZfycKH00KqDyS90SkXh8RcxfCB5ANVGibJJK3xgJJfWihM
+xIUe/lJvYKUPVKn5JcUPy590\nPjyG02spvSfCa8PtUL/cr/h9COMmbSMz9yS812GaSdeUe4zKXs1ZwAZ+MmHAqoIVRXDD1BMsgIWg\nI7TqABdJoCOoCPMDOgAdAIQ1eY8cOTKXPc1pyEbAUfpnP/uZi8bO8fmzBnIkD2vkQVYC+VCeMH3y\nQzbBl2TiQ58UeBYej/y4xo+fEPUau8h3L6eLeXxhRvXxEyb6bvDvRMMXWChdA4343vHHiZ6DxYaw\n0gwr02Kvzxcvbl3hexTCTVhnhM1M4TZph9exnS+E5QAgBA1xoMh3vY4rvzho6HzSOuk7ybHQKKDr\nwvIhJ35G0kvYmxYoCq1GXB+CUVhepV2PdV0clFEMlZUeWG5avocjVDhxVDlzc0hHVKqKj5sQ9rDi\n+vBhyOdwDBTpppQrXzk3EPnkJa8HKimdpAcqKV6px6T7Yp0Vq/XR1f1CXp4FFvSv+5lUDq7h/ocv\nkuIlvcQ619Kaj+7GCc6SvNiygAh4mJk8DFhV+vXr560lNAsRD0jAFwbfnRB0wuYrQENQgYWFIKiI\n73Oc3lZz58718X74wx+6yy67zIMKztKnnXaa1wnnmaWdMTDowo7skgNZ1GylsgA2AI4gB5kkf1wG\nn3H0A6w8/vgT7s45d7l777nL7d1v32guoe+7tddZxx3xVY8xxdU6GrAwgq4v3K233eHwLYoGJXR9\nog/sDtv3iLZ7Kpqt82gAHY0ePTrP2eIP846k6b3jW8IfqGID32jVE3wD+BYkVdLFphfGw52CuiHp\n26J48bFz+Cbz3dT3W/G05tvOdy0eqF+ok1SXhec5x3EBlupIxeEbWUhGxcu3Jn3pUHFCg4COsZYs\n8fhhHHSA7sKA/GHZKvk2h+lWvB19EOsSIiApOBdJVLBmI1IyEibHtEQPXk7uQueIFD2Quet0fbiO\nHqBmw4BzTanycU1043P5hPK1dI64oTzxbdJFnjCEeUUPW3jKb4dpRg9ts/OUN54HOmopMNItow1X\nGhjRM0mGuEzF7ifdv1JkbGkk1wgS/DxSTPOQJBOjHkfQ4UcCjiwdTVF32ibWHNPxp59+uol5uN58\n800/NUP0MfAjIUcQkjgKMnlGoOLn6oocoHP5RmDj59eKAM2PxMzIzo9F92XQoEG5OBFU+fm2yJtn\ng4Vt5GLKi+hj6aeFiMDFjwKNLJEVyE9rkU8ehvVnSgfKz7QRt9x2RxNzWpUTGHmZEZgZiZnlxmjK\nDGSwkKyBao1cnrb3LpqU1j+3yaVOPhp+NyKobxYp/M5HFWyzc9oJ39/4N5U4fDfDPIjP9zPpG6s0\nOUd+SptvM7JwXMdYo38F6qwIMnLndQ3nI0jKHee6UM74dZzXtzssP8fzhbB8ESw2kyvpGvKMy4S8\n8TpO13JfyJ8l331Q3Fqu82ukRlIU+3CFNwhFxwMPpBTMDQwfEOJyw8I4xC10w5R+sfIRn/QkQ/xB\nKHSOa0t9oML0kl5E8pcscdgp9MIgS77AnFiR2Tnf6ZKOI0N4TyVrqWvSIK1KAgDX0hw9wEfUtdvr\nlPhMsRBZRPw+cPHQQw81RePd+Ak+77333qaoq3gT68ja0jRv3jw/zQRTRbz33nt+igbgImpSSgQd\nlQU4iiw0Po/ICtMUjbeTm94BaBLwADLMtTVmzJjcPUePQ4YM8eVCrugfvZ/uAtBhCohobKCmP//5\nz03M2RU1b+WdfgKQEpQwzUO5gKMyxddAExDFJKBXXTU+ftr2v9IA97MaQJim9w5AB3hKCWGFHv+u\nlZJOLeKG32DqpLYSQogTiKWh7HWHnTQowWQoXgPMjaVJJYu/Kn9MPgghuBULO1wTB8r8uRQ+U0xF\nEo1n40Ei8nHJTcwZzo7OHFQAE/NczZo1yy/MdcXs5lh50BkAziSjmk8Lyw0QlRTCiTyjZis/V1Xk\nF+Tns2IWdObZAlqYwwqYAq7
ImxnUQx1GfgB+FneAGKsOwAW0Ms8WoEOagq5QFuIwb5WHkAhyWjtg\n7QF6ACsAy0JzDRQD5M2vKLyXhveOb0mp9xrrCODAM16MVaKwFio7G/5Z43sU/sHmfZOcyErcthC4\nP/r+oJM0hW8gTCScBdNAURo4MhpUkHb2U045paj4xUaibTr0yYn+xTa7NPpwNPPpiV6kZufL3YmA\nxftD4LeTL3Duxz/+sT9Nt/O99trL97DCCRnHYHpCEfCr2GSTTbyfDj4v+MRo3iucEHHGVFdy/HeI\nIz8dn8BXP+hW81vhAK35tvC/YVGXdhyQI2DxSwRQ3jkZmfDPiaw8jrm0CMh0xRVXuA022MD78mhK\nCvkM4adDkCwPP/yIO/mkk9zue+7thg073bVvt54/X4uf8RMnu8kTxrvDI/8fpqSw8KUGeLbwl4qa\n/Kqqknq9d5Sl3A4P+MHIjySCtlYdm6aQskM5CsXjXGTh+JoTb0vXZPF8qJO0ldlgJ4tPVB1l5mPL\nnEuF4KCO4pWUNaBDeXBO1jxSSQnwUX7ppZcc4BFZbzxw0KsJ2AAy9thjD/fGG2/4S4EKIAInZXo6\nAThM7Ans0HUf2AGCAAzBhfLEYZN5pzSEPnNjSS7+k0SWF583Ts9AjWCH60LY4TyBiUyZ5oKATPTm\nYpRlAZfkALrkkDxmzFg3c8Z0d8GYi92AA/bz19b6Z+Fri3w3f/KdMf3LiVVrLUOa8uP+8pyeeuqp\n3vGzGvNk1bt8+oZQrnICPYNw1OVPT2RRKSeJqlxDD6rQ6Tsp0ai5zcNO0rlGOsYfVLrms44sWX5S\n6zSV78tuIGmSyGRJtQaojPlXxpL1QC8XelMBO1FTk1/iZeIfNaADtOjDLDgAaNgOe2hFPgi+FxRg\nwiLDqa5hDeTouPIDHpEH0CGvpIk8SQ+rDaDFmgVLD4E0kQeLEWDDQs8n5tEiAEDsR/49/npZiSQj\n8tBFfMFvfuNuuHF63UAHWbt07ujuuXuO7+XVv/9+DQHWlKuUwPPw+FfPJO8a1r6o2ccfKyWdtMYF\ndsJJc0uVE6sBActUUo+nUtMrN37UXOVBJqlHE72xdL7c9LN0nXqYYYWnR2jagll20nZHMiAPTVkE\nVf5+J4M/yK9/mBKfCkaBSmbw4MF+F4sOH2dZWAAOrCtYVKJ2at8tXGBx0EEH+RnI6dLNiy/LDtYd\nxsxRF29ZdrAwoVOapKjQ2MeaJCACSIAT4AZo0fg9WHZYkENj6AiAuAawAn4YA+iwww5TsdwJJ5zg\nLSfIhyzEOfvscxxdxK+Zck1Nm61yQuXZOPW0YW7uffe62bfMLqmbcp7kUnuYZ41Fgfsft+DwrPJs\nhM+o4mdpjfy8S1isLJgGaqUBg51aabqB8uGjzMeYdfyDnKViYtHBciN4i8vOeWY0x9JCJcM+MCK/\nGSCDwfuAnchp2PvFRL2yfDLnn3++N7HjH4OO1IyFD0/YfEQ8FsJWW23lKzLiC3RkgQGuAB0NXkje\nbOO/w3HicQ2LIMonGv2wj5xnnnlmzuTPeDz8O15vvfUiHVzgyzll6pRUgY7kF/DMXzA/08+byqN1\nCC08WyyFAnBAHKw+LcUtlE69z2HBZOHds2AaqJUGDHZqpekGywdA4IOb1Q8WVh1kB9iSAueAEEBH\nUBf1UHIsgAWAoaHj5STMKMWHHnqoBxCsJTfddJMfsA/gIR3W+MvgywOsnHHGGblZ06NuuA6ZCIIW\nWXTIC6gR6GDFiUMOTVha1FSm67H2sE1g1OU77rjDb2PVOfron7mFry50s2b/KpWg4wWNfgCet976\nY6Z9eICU0JpBhV9q4LkEkkJQKjWNesZHbjWFZ/mPUj11aHmXpwGDnfL01uavAgDo5UPln7V/mVQ4\nWKby+Q1QKan3Vdh8BYSETUm/ifxboq7kHlwAkU6dOrloHB134okn+udjxx13zDUX0XwF6GDZi
ca3\ncQcffLBvNiLiDTfckGsuE+gAVOSFRYe046DDcVlxNCKymqTUuwrAIZ7AiG2OUeFEXem9jDh4zrxp\nluvRvavfT/NPv336u+222zYzvbR4zniWFJKapnSu2LWsO1gay4GlYvNprXjIzAK0WTAN1FIDBju1\n1HaD5YXTJB9zKs8shZbkplJS7ysqFQJgIYsO4IFlBksOzUNYWrC+0DuEeFhOosHb/HW//OUv3dZb\nb+0dhrHoROPi5LqgAifRyMq5aUq4ILTGkGYIOmwDLkALQT45NIux4IODRSmEnTANrmefcnDf6J4+\nZuxl7ujBA316af+hl9b+/fd1l1x6SUXOra1ZzvBdwHLBs1TtAKTL1yxL1hHJzR8lC6aBWmvAYKfW\nGm+g/GQhAR5YshCojDCjU9knWaSSmq8AGCAES0sIOnIOlpWFZiRAAwih59OyZcu8SugyTKUXDern\nrrnmGn+sXbt27sEHH3Q/+MEPfPMT14SgA9SwyBlZQAWoEMiLHleCHNbAEwvnCMgN3ITpCJguvHCM\n22DDDd3UKc3n+vIXpvjn+mkz3E0zboyGALgzFf47VNwsClgtahHIh2cKgMhC4H1D5qxapLKgY5Ox\nsAYMdgrrx862oAE+YjT5RCMEt8q/2BayL+m0mgCoII78qkdZmIDKwrFCzVdADlYd1sAEkALkABoA\nCNYV8sIJmLD22mt7B2biKQBdW2yxRa43FE7EnAecSFOQA5ywcFygQ14h6GDREewItkiP+Cxx4Inm\n+HKjR5/v7rt/ru/mLZmysmZW9W7durohJ59UF5G5dwoAM0utA4Al2El6lmstT6H8eBcAHf5kjLbm\nq0KqsnOtqAGDnVZUbltJGsdaLDtUAq1htq+GHvXBRT7kTQqcK7b5CtgBQoAJLCmADs1UgAfAAWww\nxgazlYeB8ThGjBjhormtPMBwDeBCZSAwSQIdrDSkCUgRn3y0sM/COSxELITQIgUsYeFB5p///Fi3\n48493dnDh4WiZWZ77oMPu1OHnOxq1TsLCOb5UeBepSFgJQF00m4tUTfzxwNITIP+TIa2pQGDnbZ1\nv1uttHx0qRT4oKXNj0Cgg1z5Prj844z3vgphAUiQn46ar2jWAkAADaAFJ2QABOgg4J/D6MoK++23\nn4dC4qv5ifjs47tDfqTJObq4AydACgGAAaLC62TN4frQooNMBNIjyMJDWgsWLHDHHXe8e+LJJ1Pd\n+8oLXuCnNa07PC88ywpAcNqeacmGlZJm0rRaVtP8XZAObd02NGCw0zbuc01KqQ8blpO0WHgEOijg\n8TwgVm7zFTABZAAs9LRiAUCAHXRw8sknf03vjJAsQAqbvYhIMxZNTn/961/da6+95iGF4+uvv77b\naKONcqAj4AFyWLAsyU9HoMN1BAEPaQM9Z5xxpltlte+6MReO9uez+iPrzqI3FlWlCDwbCoBNWp5f\nyZS0pilLYJZGy6q+B/neu6Qy2THTQGtpwGCntTTbRtPlA4dZnQ9cvSsMKgNM6BtHPhXAR75/58hZ\nbvMV4KFu5Vh32B82bFhuCokddtjBD+bXr18//0Qwl87ZZ5/tgQdrDWAEqAApgImsMKw5xjkGLGQR\nHDHvDH5AgFYh0AkfQdJm8MPte2zv7phzVyZ9dcLysN2rV283dOiQsnpm8WywKKSlaUrytLSW7Dzb\nBJ5vgCefP5qPVKOfYv5g1EgUy8Y0kNPAf0Xm+9G5PdswDVSoAeCCUXl33313DxfdunWrMMXyLuej\njwyMZ0NFAIQkBR5/Jshk0D8AjXiAgSwhWFrUhIUvjfx0ABWsKlh15KvDOaAG52bC8ccf7/Ned911\nHb2vGF2ZuXyooNinmYqFPOREDOSQt6w/yLPOOuu4zTff3PfcYk1FRzrvv/++n64CfcctOvGycp7e\nX0uWfOCGDqmPY29cpkr3P/v8C/fyy6+4n+7at8WkqIBxz
EZ3LLLecC9YshSQnxDKDbB37NgxaqI8\nzo/9tNtuu/k4tf7hHSJv3vvZs2fn/YNRa7ksP9OAWXbsGWgVDdA0FFpVwg9zq2T4VaJUaliX+OgC\nMvxjz2dhqmbzFbOeU9GwZqRkYOuII47wlhogCT+fAQMGuGeffdZLOnPmTG+pUTMTVhoWWW+AHBaB\nlPaxBMmiA8AwenPoX4Ke8+maiT5XX2PNzDomx58bjbuTrykLvfA8EAQ38TSytp8EOmEZOM97R+AZ\n5PmvRUDPvG/8sWCdlaEoaqEbyyMdGviy20Y6ZDEpGkgDAAaVDR9bRloGQPShbo1i6mOrip68+OCy\nH8JAmDcyEfbZZ59cBcE+lhUchWVtwWLDgoMv52TVEYDcf//9btddd/Wgg28NfjlMIEo8FpqaAJSr\nrrqK5H0YOnSoTxMIYqF3F1Ye0ie+eneFjs8cC5u9gB0qcXSshcQBPS2q7Dn+wvO/cT133onNhgjM\njt6uffvc/aWsKjdr7r30kg94s6QIvT+UK1/gHM87wMPCM67r8l1T6XEAR/mSt4FOpRq161tDA2bZ\naQ2tWprNNMDHln9706dPd8wBxcewWpUPafOx5V8saZIPFVwYqAQFXjpOvEp7X+GQfPnll+f8c7bc\ncksPOsx0jsWGRdBETy6sMPS6Ouqoo7wYvXv3dgcccIDfpkkM3x+sQlzPmqklOAZUyZoDCBFaarby\nkaIfyi3g6dWrl5dJ5xphfcyxx7s333jd3/dGsd4k3RcBSyHQiV/Hfedd03sH+MTfjfg1xe6TNu8c\n7x6BbVmU/AH7MQ2kTAPLpUweE6cBNcAHmo8i82gR+OAKTPgHXmqgAhfcYDWiIpBTdNLHXNYP8hL4\naKZx5OK84ASfGSw4surIr4bjBGADMMHSQ7PV1Vd/OQLx4Ycf7p2cQ9DBSiMrEdBDGvhV7LXXXj6t\nefPmeWsQcdScJWsQFhxZcgAdwQ4XFgs6xEXP0skhhx7OoYYKG2+yqevdexdfxmoBdNoUVA7oUAae\na713vIPADmsAqJz3DjlID6jhOec9ZJ/jBjppe2pMnrgGzLIT14jt10QDwIkA5d1333U777yz/xDz\nMSbwoWbhQ0oQpPCBJVCB84FlIV6xgeu5FisLzVfIQAA2gBEgB5DRmDrxwQOxsvzlL39xwA29mwjX\nXnutHzwQCAmhCcABnOSzwzxan332mV+oeOhiTmAKCVl2KMuaa66ZmyUdyw7QIwjyF5TxAxy++94H\nbtwVl5dxdXovoQv6zBkz3KybZ6ZXyAok0/Ov96KCpPyleueAHXogbrXVVv69C0GRbb1n+d473qFq\nyVRpmex600AxGjDYKUZLFqdVNRB+UNkm8JFnO/wI6wNbyUdWzVfk8cknn3hQAlBkgQlBJ2nwQGY6\nHzJkCJd7AGG+q2222cbvAzukAzSp+Yr0gB3giUU+OoAOwEPo0qWLGzlypO/ZRdMVwEPvMDVjAUJY\ndki/FKuOT/yrn/ETJrrPv/hHwzgnq2yNDDvVBh3pTOti3zveQd658F1UGrY2DWRFAwY7WblTJmfF\nGuDfKvN4EaZNm+Y/4FiUgB3Biaww4dxXnAc26KI+fvx4fz0jHD/wwAO+S7ggBMgR6Kj5i/QEO+pm\njrWH/GjGuuKKK3x6jM2DLHRlx5oj2NHYPaFjsr+gxJ9xV17l/vHPfzcc7Cxd9qFbv327XDNgiWpJ\nbfTWBp3UFtwEMw20kgaWa6V0LVnTQOo0IEsKzVdsYynCnE9zlGAHIMEawxL2vjrmmGNyoIPPDV3I\nv//97+csLQKdeDOYrENKC58fjbjMmDw4KRNwdAaKCFiHJAfpIRvph749PqL9ZHrKi3y3T01IWFMs\nmAZMA9XRgMFOdfRoqaRcAzRf4aOAxQSnSgIWm5122skP0PfWW295wAA4AB0gA7jAx2bHHXd0Cxcu\n9NcwieesWbPcWmut5
UEnbAIToKipSk1XHAdW8LtRl3LkwMkTuThGOOGEE7wDNHGBI0EX17PPcfJj\nsdCYGgB0gBwDnca8v1aq+mnAYKd+ureca6QBKpBCva86d+7sYYLJFAELwQnXaZoHRJ08ebKf+oEm\nJcBFoCNrjpqrBDnsc454xOc6HJzpsg7ssGy44YaOAQYJOD5PnDjRx+c65AjhCwsPAEYw4PFqcAws\n2OEHHb7cyfivQKcUh/uMF9nENw3UTAMGOzVTtWVULw2EzVdhF1nAQc1XTMnAfFPPPPOMB58bbrgh\nmndpqBcZB+F77rnHj4As3xlOcH0IJVh05OujZjDiqbs6/jeADj45Wtjffvvtc5OG3n777b4nDFYc\npU1asu6oSYt0Swkan6eUa7IQ970lS9zW22ybBVELymigU1A9dtI0ULEGlq84BUvANFCBBtQjBN8Z\nbSclJ9M+PULUOyQpXvxYvuYrQEXNRbKgADIMDMjknbKcbLrppg4AYeZxzuOozDmuBTy4FhjBAsPC\nPpDCeQKQQTMVFh18dVjYJi0cm0mLOMOHD3d33HGHW7p0qe/tBVzRzEV6nNeChUgO0dqOlzlp///9\n3/+6dxa/nXQq08fozp/1YKCT9Tto8mdBAwY7WbhLDSYjPU0Y7+PGyHdGY30IYICTpECFwHWMF8N0\nDPSGwkpzZORonK9LLNcUar7CD0bWE6DiT3/6k9tjjz1yoIMDMlNBYH0R6CBbCDqCHPnXyBGZeFwT\nBx32sRQJXoAd4AX4onfXD3/4Qy51F154ofvlL3+Z891RfK1D0OH6lgI6mvfYEy1Fy9z5P/7xLdex\nQ3absQx0MvfImcAZ1YB1Pc/ojcui2MANCx94DQjYs2fPkgYFVLk1OBrp4eMAJMUHGKSCB6aKGTyQ\naRx+/vOfK3l34okn+vSw3jCODtYYQAM4AWiAI4EOa6CJ44IXgU5ozZFFJxwNmQzVlEY69913n59X\ni+Onn366l534XEvTF+BFcxjQRB7IVAzsYDXT6M6k3SiB6SJ6dO/qoTdrZTLQydodM3mzrAGDnSzf\nvYzIDphocsAkKKm0GAAPC5UHlp8jI2sP+RQ799XNN9+cswAhCyMa9+jRw8MFIPLmm286oIwQBx35\n0xCPAHzEQQfgwZoji46sMkAKcCTfIQAK52ag69e//rVPj+YsYI5rSUc+P2wDPICQ0vMXFPjp1au3\nO3P4CLf7T/sWiJWtUx07dHSzb5md17qX1tIY6KT1zphcjaoBg51GvbMpKBfNToBHCCGtKRZ+P+QH\nHGDRIcyZM8dbaIAKltCKgkPxGWec4X1lJBeWlXbt2uWsKEAG8ILlp1u3bs0sOoBO3D9HUAKMYI1h\nEZSoCYq8QmsMctE0BkiRJsCz+eabe8sReT/66KM+PunIyZm1IArgIb0wTZVHayw7hxxymHfmHXPh\naB3O9PrZ5xa4o44c5Ba9sShT5TDQydTtMmEbRAPWG6tBbmTaioGlhWYkFkFPa8uI9YW81OOKeX+0\nTd6yoAAo+OccfPDBOdDZOBrb5KmnnnL0ygIqWAQnXLfddtv5EY+XLVvmp3ygyQlLjByRgRLABggJ\nF45xLmy6SoISrDPEAZa4Zvbs2V5dABCjNgNEoVWJvCkH8IZ8BOIoADeyqHEPaMJ64IH73ROPPaoo\nmV/fd/9c12/f/pkqB0DOs2bdyzN120zYBtCAWXYa4CamrQhYV6hoWdT8U2sZ+fcsKw/WnVVXXdWD\nAZaT1157ze2yyy7egoJcgwcP9ousMvjGCFKAEIACuOBa0gWEGFQQuABcgBmOcQ3WFjUxkZ4gpyXL\nC2mFMAZMXXTRRW7ChAledQAP0CKokv+OLEjvv/++e+WVV7zzNhWqLFuh3oE/xvK57fY7vZ9LeC6L\n2zTLDR06pBnQprkc3Jd6vQ9p1ovJZhqohQYMdmqh5TaSB9YELCmyKvAPtp4BOQCedyJ
rzyOPPOIt\nLnPnznUHHHBATqwLLrjAz0mFFQdwkFUGqBDoYEEBdFiAniXR2C50ee4Q9QISfAhyAB7Ah+OAjvxp\nkqw5OSG+2gB41NOLvAAeoAw4I4T+O3/+85/dCy+84J5//nm3YMGC3AzsXyXlV8ANlasWrAkXXXSx\n++jjTzI/+/mtEbBdPWmie+yxeWGRU7ttoJPaW2OCtRENGOy0kRvd2sUELKhUCXzY02SmHzRokLfI\nHHjgge7cc8/1MvLzq1/9ym2wwQbeOiOrDtDCNpCCpUWgA3iwrWYr9hcvXuwHBJR1JQQdNYGRTzGg\nQzw1Q5GH8mXcHcb+UWDcn7ffTh4vB6fqPn36+MlOe/Xq5e+B0mTNgsz4A7268HXXpXNHJZu59aGH\nHeG6dd0uGpPo5NTLbqCT+ltkArYBDRjstIGbXIsiYtHBgpI20KGCB1qwsigABWPGjPGWHM4BJnFQ\nEehgyWHBX0agE/rWvPzyy27rrbfO+fqo2QpYIhQLOpJNUILV5qGHHnJYoph0NCnQNLfnnnvmFixK\nyp/4SouyaKEMZww7K7JgrZRZ687cBx92p0aQM3/B/FRBddI9MtBJ0oodMw3UXgMGO7XXecPlSLdy\nPuosabLooGgqfEYmxqqjQFdyLDMADEFNToACkMI1nMO6ItDhGOBC3NAKhFXnjTfe8D48DELI9Syl\nQg6+QNLhY4895icglbzx9VFHHeWb55ADSMN/J+ydRd4syAzcaAF42MYyxBQVWbXu9Nunv9ulT+/U\nW3W4nz2/snbG76HtmwZMA7XVgMFObfXdcLnhhHzkV93L6+2jk6RcVfgjR4507du3d1OnTnV9+/Z1\nxx57rHc8Fhhg3QkBAcgBdgREAAwwBFzE/XOAjg8++MB9+umnvgmJdFoKAhvWgA7XhgGrDbOtAyXd\nu3f3TU9hd/SHH37YRxfwADuyTgm2KLsAB8jRNutJk652yz780N0ye1aYbeq3x0+c7ObccXvqfXUM\ndFL/KJmAbUwDBjtt7IZXs7j46QA4N0bdzMMu3tXMo9K0VOFrfJ3f/va3fibzKVOm+JGRqfiJIygi\nHoDDAiAQAKHQEVnAA2iwyD8HfdALKunfPJWfFqa7iAemv6C3Fdey4FwsOAG6sES9+uqrrnfv3v5S\n1synBVgJwgQ7yCrgiUMO+ywfffSRO+vMs9ygo452Q046IS5OKvc1rs41U67xOkqlkJFQ3GfuoQXT\ngGkgPRow2EnPvcicJAIcrDtpDQIZIEYgQzduYIcZzsPjQIXiABoEQCberRyoAHLkH0Mcgiw66APw\nwWLDkg9uqBC1AI3IqiD4AkyQi95ZDDaI3JdccomPdvHFF7tOnTr5fJFRcqpZTvIImkiLdAV4L774\nop9t/Zln56e+K/rSZR+64447Puqd1scNOfkkqSl1awOd1N0SE8g04DVgsGMPQlka4KMup+S0+enE\nCxSCg2CG5iH8eAYOHJjrUi7YEegADQIINV2xH4IOFhTABqBBJyxJY9xguRHYsE6CG0GI4ERr+Q8B\nO59//rkbMGCA+8Mf/uCLCfzgswN4ydIEjMm6g3yCKK2BIC1z5z7gbr75JjfzplmpBh7mwKLss26e\nGb+9qdnn3nNvLZgGTAPp04DBTvruSSYk4qPOMjrPLOVpKgSVvIAHgABqcAI+4ogjPKQABlhOgApg\nCBAAHgAbQQ4AIYhgAD9GWwZqgJwkuKEZCqChaQoHboBQsIFuQrAJZRPgaI08su7gR0RzFqM47733\n3l7FG220kRs1apRPGwsTwCNZ41DGecrGOlxGjb7A/TNKd+q1U137duul6dZ5WU49bZh7660/uhnT\np6XOAR4BZcUz0Endo2MCmQZyGjDYyanCNorVAP9gs2LVoUyCDEGFLCV77LGHH5fmoIMOyvW6EgwA\nCgIdppag+zcLkIMzcjwkDeBHfqoId9ppJy+HIAe
AEdDE15yLL7JIAWUsQBYTnRL22msvt9tuuzVr\nctPgiIAPZVHTFpCDtSeEHbbPG3W++0s0UOGll12WqvF3sgA670RDLgC1FkwDpoH0asBgp4b3ZrPN\nNssNCIffxVlnnZXLnUpWgZ42jJxbTKB3ET2LFFSxa7811oAOH/csWHXC8oewg5WEaSQYZPDBBx/0\nFh2gAxBYuHChW7RokZs/f75fGC05HnbeeWfHP3kWdBFabshHC2myACddunRxq6yyigcZAU4IPSHg\nJJ0X8CA7wDNp0iQvO7JRDnqbqdlNs6PTxAW0ATsCnhB2wu1zzjnPPfLwQ+6GG6fXvUkLH50zzhjm\nm67SbNEx0Im/GbZvGkinBv4z0lo65StJqksvvdT3UOEiRpp96623SrreIresASwVd999t7vyyitb\njpzCGEClKniags477zx3Y9SbDH8QmrYYMycp7LDDDr4nFHDD6MQEgaUgirXgJg4rDDzIAIRACFAS\nnhfk6Fi4ZjsEJ/JV76uTTz7ZYWUDfi688EI3bdo0b7EJHacBHCw7svCE52TdQR/o5corr4jmM7vb\nbd+jm7tqwqS69dKi19XIs0e4jh07ucmTJqS26cpAh6fRgmkgGxpoKNjJhsqzLSU9jfbZZx+3ceSP\nksVApc6CtebQQw91+N/84he/+FpRgJpdd93Vd09nCgauUQgBBFAR5AhaWAtYwu1NNtnEvffee45B\nDddbb71cUxVxw0Vww5oAjAjQJD/xaY7Dssd0GITrrrsumhhzaM45WU1W8j/C6sOitAQ5SpM0Djhg\nfw99559/gZv/3HOO8YlqNa0E1pwbp890I0ec6QFU5UKuNAWA30AnTXfEZDENtKwBg52WdVS1GI1g\naQJ2AIEsBip1AIJKfo011nBPP/10rhhYbrDYMH7NNttsk+tWTlzBjNYhmLQEOCHssL3aaqt5wKLb\nNyMukxbpshAEHuSrRdCitYTm2i222MJbp5jQlK70DERIWYhLWqTBNgsWHsCHhePKT+lp3TO6vzTN\nMfDgFl06uTFjL3NHDjqiVZ2Xr582w90040bXrv36Dt2k1QfGQEdPia1NA9nSwJdfvGzJ/DVpmdGa\nDzuDrCkwJD7H8JOJB5q7OK6KhTXHkiZY5LjiMfIuQYO5cVxBcVgjDwuVJvukQQjz1DFdH1/fdttt\nuetJg7Q4Vk4grzBvyZRU3pbSp9lE4+u0FDeN5yk7C5X/zTff7AGB0Yrpwo0PVdeuXT0MEIcgZ2b5\nydAbii7gX3zxhW/6Yt3SQhMZcbiO6wGttdZay89YDuRIHpqc1ANMDsb43IQLzWDIywI44St0yCGH\nuG7dunl58QVDVsEOQCTgYjsMKmN4TNukO3LkCA8er77ysusdAdDIc0e7ha8tUpSK11hygJxevXp7\n0KFZjq7lBjoVq9YSMA2YBmIaaFOWHSp3xihhFN14AGCAApyDf/KTn8RP5/aBjqTrcxGiDUCnJZgJ\n48e3gRqaJ8JAnsged2wO48S3q1HeME0GyCNsnNEmLJVFVo3999/fH6Jyfe2117ylRVYWwIAu6oIF\ndQEXOLAutM050uJ6pRnmD6hst912ftBBBgZkXxYYWWO01nGtBUeki1wskydP9hOSksdxxx3nbr/9\ndp835ygHACR/HdIV6Ggt2eJrdAOAcO9vnjXbW3r27rev2yUC/22i96RH967xSwruAzhzH3jIvfrK\nK27uffe6rbfZNmqGG+iOjKYcSXMwi06a747JZhpoWQNtCnbygY7U9Mknn/h5k2huWn311XU4twZi\nigmVgA7px0EnzBMoA8aK6a1VaXnDfNlOg58C1jXdByr7UgOVO9cJeLie5iumYsAXiXMCGaw6LAIK\n1oUAJwQbyUZ+LOQXwou26ZKOUzTxGTMHoNE5wY32WWtbkCJZN9hgA9+7rH///u4vf/mLmzlzph8w\nEdDRNUpPMrF
fbAB6WEaePTxyYr7L4UQ8ecJ4fznAssWWP/Tb3//+Zr7HmdJl8MPPP//CvbP4bfeH\nN99wjz/+mDvk0MPdT3fdxQ0dcqLbOAPgbKCju2lr00B2NdAQzVhU/FQWGkaf20FvLI7JTwaACC0y\nxOU8C00YCgBPIdggXaw/ulbXxdfHHHNMLk7YxTweL99+PvmIX0g+pVet8io91vy77xk1Z6QlcK/K\nCarstWZ0Y/xECAALi6whdPFWsxVrbYfNUsQBimTNIV2sKGGzVNgUFd/GWkj38MWLF/smq7DbuJqz\nOB8OFqhu5BpDh3NMGMpAiQSclblfyERZkFGLZJW8/oIif2jewgozdcrVbtEbi9zsW2a7AQfu71Ze\n6dvuf//9L3dX1J1/5owZuWVJ5JC90ndW9BagUaPO8+8EliKcj7MAOgB+GiC/yNtj0UwDpoE8Gmgz\nlh1ZA9ADIBICCPtUnPL5ARTC86HuAKOWrCqcDwEqvL6Y7Zbko5kLeZOsT0q/WuVVemlcA68t3Yti\n5KbS5d+7QAcYECDQ/MO2LDyh9Ya0ARtZXLSWBYV1aFXJt008mrLoIfbCCy94oCSu0matEN8Gurke\nuMLfB0h+9NFH3dKlS92QIUPck08+6S1T8uMJm7LUM0vlUB6lrGXxKeWarMQFcgiU0YJpwDSQbQ00\nhGWnmFsQWnWSKkjmSVLA1yWf1SDpWl2ndTFxFDdpHcqi8/E0W3IurlZ5lT9rLAWAQVpCWMZqyAQ4\nYO2guUrAA+gIdmQJATiABqwqoUMxFpvQKhO34IT7ioflRlabddZZxzejMlIzTs3kIeghzxB0wvIS\nJ5Rn9uzZudOnn366t6ZQJoAH644ATs1yLVkpc4m1oQ2BTpqe9zakfiuqaaDqGmgzsBPCAb4sqjy0\njvfaSoIdmrCKCYUsLsVcn5RPPM0k+cK0q1HeMD22sX6k6eOPb1S1gSdeZp4PLCdqkqK5CDgBUgQ3\ngIvgJQSa+LbicL0AB1iKdwnHh+jdd991qnDjMoX7en5l3SGtDh06uHHjxvlozz//vB+9Wc1Z9AaL\nAw/WKgv/0YD0nqZn/T/S2ZZpwDRQjgbaDOyUoxy7pnU0AKSoki51HTbPAXw4LNP8WA3oQRY1NSXB\nTTmAA/DIeiOwAUiAE5bQchNqW00nWNNaCtIhacnaxHxfe+65p7+UqSQAVSw5WKlC4JF1R81zLeXV\n6OcNdBr9Dlv52qoG2gzshNaS0MFYJvz4Ooxf64cjqeKOW3Jaki88X63y4pyqyqDWOsmXH3oBnuRv\nlS9eMccFO2oSiltxZMmJW2wENFrH4QZwaglukuTDssDyeDS2UUsB2ZUH+SE7/jusCUcffbRfFwIe\nvQM+Yhv80bONzi2YBkwDjaWBNuOgTHdtNe0AE3EfmDTdVqwXcb+d0KJBk1YIM0myt0Z5sTaoQkjK\nM8vHBDoAgwKWEsABCCCwr0VWGda6lrUW4rNdaQAwe/bs6YEH/bNfKIQyMyUF/jsMAkl39PHjx3un\nZaw7xAPqBEgqA/uUt1jZsTyxfPa3z92SJe9/bUZ4Jj7t2LGDizr8e0dfypLGoOfaQCeNd8dkMg1U\nroGGtezELSEh3GAFCMfCAYJCP564/07lai4tBXqDhfKxH1ou4iCUlHprlZfmkEYLL730kocIKnhV\n/jQH/f/2zj3IiurO4z9SIJYVUu7GZAOuFi5UGNkqERMZhFpFYnzk4Yih3IqRAZJSUUkibtCMZA1b\nAg6iUjKWQXwOJghEHIeYVWMSJCKPUgOl4aEbE0UdrazlHz7WRLOV3G+THzlc7jzune57u/t+TlXP\n6du3+5zf+Zyr/eV3fuecYs+Ox+oUx9tU6rkph6NEgl7I/lIu9azsd9EiIaNhM01H18rESlpoUMLE\n43d8Krpyn22m73pK6v97ChunXnTxJdYwqsHmzLnCfvbYL+zd9963f/jHj9u05
uYDjnGN4+39P35g\nL+99zW5aenNk39lNU2xZ2y09tqUnG+L+zpkidOImS3kQSA+B3Hp2JHb0P355QLTWjqZzS0C4d0fi\nIRQQYZd0N+08vCfp8/7al0R75VmIY7dzibWeVqmuhG1xAHc5ZegFrrbJ26GkvDho14WEck/huV9L\nMnfPmgSLzksl2ST7JdokwiReWlpabN26dd1OR3eBp+e8nX7udWgo7af//YjdsGRxtCjg+IKI0qaj\n5W4SqhWUNz252bZs3mJnnnGmfbrhWDt3SpPNKKzdU4uE0KkFdeqEQPUJ5Ers9Da0o9iV3lYVVpyD\nhEItk8RW6NkJbZF9vbXT74+7vXrB6kXb3yT7+9qG/tbVl+f1Ir/88sujF73fLwHgqdqixustlcv7\nIHHWk+DRc26/PFQSPI888ogdd9y+VY41Hf3GG2+MvDny6ujecEgrFDobN260Fbffab9++ilrnvkN\n+83O3WULnLAdw4Z+ys6bem50zJ37H9HWEe3t91h7+8rIA3XuuVPC2xM9R+gkipfCIZAqArkaxpLH\nQGKgu3/l6wWrRdt0T7FnQQJH4iANXh3ZokUJQ0Ege9euXVuWfXG3Vy9apTgET1RQSv7ohR56Sty7\n4XlKzNxvhuJ21BcSaaWSizOJFh/OUvxOOB29s7Mz8l5p+EqCRzO0wvV33nrrLVuwcJHNunhWtBXE\nLwt1Xf3duf0SOsW2Svh8Y2azbdjwS7ugeYa1tbWZhriq8fvyOvw3XWwbnyEAgXwRGFAIRix/g6F8\nMaA1ZRBQsOukQvyIPCF5SNrnSW3xf+VnrU0SPBJqpQKX9Z+2huM0A0tCRltdXHjhhfbQQw9FzVy/\nfn30nLw/ikPydYC2bdtmN9241IYV9tuaN29erAKnN76LWpfYvJYr7eabFUy9L9aot2fK/V5CRyKn\nFLNyy+J+CEAgGwQQO9nop9RYqeBUxe34v4xTY1iFhrhoiyMWqUIT+v2Y+sK9PcWFSfBoGMs9OBI8\nI0eOjLw5iunR1hLyBCmYWVPmH3yw09TH3/z25fat2ZcWF1eVz9pkVN7XoUOH2uLWRbGKEoROVbqQ\nSiCQOgKIndR1SboNUryIhgm1aaX+dZz1JJHg3pEst0WeKQ+0DtshseOCR1PONWS1adOmaDq67ps6\ndWo0HV1xO62t19vu3busfeW90cadYTnVPlcg8/z5/2VvvPGGrWy/OxbBg9Cpdi9SHwTSQyBXMTvp\nwZpfSyQOtGN1lj0h3jvu1Qnjdfy7rOUSnjok3MLkcUcev6Mhq1LT0RcsWBQNeW0sbBw64aTGsIia\nnCueRzurjxgx0pqnz4yEXH8MQej0hx7PQiD7BPDsZL8Pq94CvVAVuyNvQpbjHjyQd8yYMVHci0SP\n4pGyLn7UP8VxPO7dUfyOByRrLaZdu3bZZz9zov1TIYB5xe0rTCIjbWnOFXMLy0f8tmIPD0InbT2K\nPRCoPgHETvWZ56JGiQId8+fPz2R7FJcyc+bMbm0/+eSTo/ZJNOiQ10TJBVL0IcV/9IIP43gkdpQU\nv+PDWV1dXXbZZbOj9Xce7Fxf1UDkctFpEUPtBL/qR/eW9ShCpyxc3AyB3BJA7OS2a5NtmHt3/GWS\nbG3xly7xIqE2o7CYndqyYcOGKOha+TvvvHNQhcOGDbOxY8dGh3Yl16GUZvFTHMfj8Tu+P9aWLVvs\n9NNPtyc3b03F0NVB0IMLiuGZNesSaxw3rjBDrCX4pvtT/21m2fvYfev4BgIQKIcAYqccWtx7AAEJ\nBQXFavp2lpJEjmzWy1DJvR6apq1Dq2xr/zTNVNq+fXt0lGrf6NGjI9FzwgknRCLIh7/SJIB8AUJ5\n4ZTUVrXxzTffLOy/dp5NPe/fazbrKjKojD+apfX1GdNt+W3LI69bT48idHqiw3cQqD8CiJ366/PY\nWqwXqTwkClaW8MlC0ktQHhqJGBcn7vGQC
NAwjzwf4V5R+l73/6oQvKt9tJ544oloSKW4vVqrRsJH\nXh/Vody9CrUWQGEcj9pz7bULbM/zL5Q9LFTc5mp/XnbLrdax7v5oIcLu6g7b2t09XIcABOqLAGKn\nvvo79tbqxaJgZX/BxF5BjAX61GzNwvKZWCrevR0udBTT4oeu6dA9Eiw6NE377bfftueeey4SQBI/\nO3fuLGmphr9c/LgA0o21ED8SehJfaotW1+7v1g8lG1yFi1pl+bTPTS656KB+h+7FqoIpVAEBCGSE\nAGInIx2VZjM1LKSAX3+ZptVWDzaWrWHSy99FjYsc3z7BPTzy+ihJ6Ggatx/67Nf27t0biR+JIAmg\nV199Naxm//mECRMiz497geQdU6qGAFIcz3e+c6WNOna0Lbx2flRv1v48/OhjNqewuvLWbVv3e87U\nBoRO1noSeyFQPQKIneqxznVNGsaS2NELx4du0tTgnuxzseOBuz41OxQ8EkOeXNx4LuFT6lzXNPTl\nHqBnn3222+GvyZMn7x/6kgfIGcYtgCR2jjnmGHut6/VUTjN3xr3l539tmo1vHLffu4PQ6Y0Y30Og\nvgkgduq7/2NtfU+CItaKyihMQ1casupJiLnYkaDxadkSOi52dE0eHnl3dK8EiB8SOjp3wROKnlLX\nNPwlr48LoO6GvxT8LNGjQ0LI44v6K36uuuq79sGH/29Lb1pSBsX03SrvzvWt10WxOwid9PUPFkEg\nbQQQO2nrkYzbI8Gjl49mO/kLulZNktCZ9LdZSLLJvSXF9kjAhMHJLnjk4Ql3Apfnx+9TGXpOh5KL\nn1D4SOz44SLIcxdCyiV85PVxL1B3w18TJ06M2uOzvyoZ/moY1WB33dOe+qnmEdRe/px66mQbM+a4\nXKzm3UtT+RoCEOgnAcROPwHy+MEEFMOjGVq1nKUlcaPAaR2yozuhI+tdtEjI+Ewsj92RR8fjdvSd\nvD+6zw99drHk5TgRF0AueEKB49ckfkIB5PfI+yPxIxG0efNmL/KAXBtluvBRELQOT6U8QBKgd93d\nbus7O/y2TOfz/nO+ffjBn+z6xddluh0YDwEIJE8AsZM847qswcWGPCsSG+6FSBqGvDkeMK08nHXV\nU90uWNxzEwocFzkSNn6EYsef8Wueu/hRruTix70/Lnhc4IR58fkrr7wSCR+JIAmgnoa/fPaXhJB7\n11TnlVe12KBDBmc2MLm4/3zdnT3P7yn+is8QgAAEDiCA2DkABx/iJODxMvIo+HTvnjws/a1bs6wk\ncCSsdK68r8kFiYSKzl3UKHcx4+f+Ocz9PLzHryn3MmWPzr0+F0ASNzov5eUpFj7uDfKhL+USQdpO\noTiFa/90dq63xUtusLPO+HzxbZn9rGG51WtW7xd1mW0IhkMAAokSQOwkipfCRUBeHokQBQlL9Ciu\npxwh0hNFCSoJG3mPlJRr6KrS5CLEBYlyiZVShwsi/86Fjufh9VLXvGyvSza7+NG5RI4foQjyc89d\nDIVr/2gITJt8FifVlaekPbNGHzuqzx68PLWdtkAAAn0ngNjpOyvu7CcBiR4Jk/b2dmtqaoqCbSVM\nyhU+EjjyFuno7Oy0U045JXrZ9UfkdNe0UBxIvLgwcSET5qGg8XPPdV+p8/Cal+V1eN0ugJS7+PHc\nvTz+2YWPCyFf+6etrc0mTvw3+/GP13TX1ExeX9S6xP5ciNu55prvZdJ+jIYABKpDALFTHc7UEhAI\nxYoEkIa2JHgm/W3mlOf+iDxCeka5jpdffjkSOBI3lYglL7eS3AWIng1FiYSKPheLl1Kf+3rNxY+X\nHdbtAkjiRucublzshLnOJXbe/+MHtuK2H1TS7NQ+s/b+B+zBjo7MbXuRWqAYBoGcEhiY03bRrBQT\nkLjRUJYOJRcxvku3hrzCJCGkQ8JGw2DFYii8N+lzCQtPfi4RIrGhpHMXJ2FeSuDo+/C6n3vu34ef\ndc3LV
V0KnlZS7vbIFp274NHnwq02/Jh/ie7N058hQ4bkqTm0BQIQSIgAYichsBTbdwK+jUPfn0jX\nnS4yZJWLjNALE4oTFyueFwsZfS73murSM0o610wyJdnix7vv/l90LW9/jj7qKPv100/lrVm0BwIQ\niJkAYidmoBQHAREIBdA+z8rfA4MlSPxwIVRK4Oi78Lqfex5+X3xN5XvZyj/88z4BlLfe+dfRDfb8\nC8/nrVm0BwIQiJkAYidmoBQHgVIEQvHj5xIklQx/hcLGBU9v1wYNHFTKLK5BAAIQqAsCiJ266GYa\nmUYCLnpkm84VYyMBpOSeH8//AVUvAAAG+ElEQVQlasLDxY3nLnrCPDwffOjgqNy8/dm5iwUF89an\ntAcCSRBA7CRBlTIhUCEBF0Ceu/hRcS58lIfCJzyX+AkFkAueIR/9aIUWpfuxvYWVpb96/gXpNhLr\nIACBmhNA7NS8CzAAAt0TcNGjO/xcYseHvyRmXAS56NHnUPDofODAj9j//uEP3VfENxCAAARyTOAj\nOW4bTYNALglI9Pgh0aNj4MCBdsghh9jgwYOjQ9tE6DjssMOiQzumd3W9ljse27fvsIZRo3LXLhoE\nAQjESwCxEy9PSoNA1Qm48FEerq0zaNCgSAAdeuih1tjYaGvX3Fd125Ku8KXf/84+9rF8DtElzY7y\nIVBPBBA79dTbtLVuCBQLoCOOOKKwGOOppp3C85T+pzDtvJaLTOaJJW2BQJ4JIHby3Lu0DQIBgRPH\nNdrjG38VXMn2qYTb611d7Hie7W7EeghUhQBipyqYqQQCtScw4aRG27plc+0NicmCp595xr7cVPkO\n9zGZQTEQgEAGCCB2MtBJmAiBOAhob7EX9uy2vKxN07Hufps4YXwcaCgDAhDIOQHETs47mOZBICRw\n9jlT7I477gwvZfL84Ucfi4awJOBIEIAABHojgNjpjRDfQyBHBC695GJ7+Kc/sa7X38h0q+5dudIu\nnT07023AeAhAoHoEEDvVY01NEKg5geHDh9sJnz3R7mm/t+a2VGqAvDra6bx5GisnV8qQ5yBQbwQG\nFFZb/ft2zPXWetoLgToksGPHDhs7dqz9Zudu067hWUvnf22ajR/faN/6Jp6drPUd9kKgVgTw7NSK\nPPVCoEYEjj/+eLt2wUJbuHBhjSyovNplt9xaiNV5Da9O5Qh5EgJ1SQCxU5fdTqPrncDsyy6NhoIk\nHrKSNIvs1rZl9v3vX2OHH354VszGTghAIAUEEDsp6ARMgEC1CUgsLL9teSQesrKqcktLi13Q3MyK\nydX+sVAfBHJAgJidHHQiTYBApQSWLWuzjo4O+9GqVTZs6KcqLSbx5+ZcMddefPG3tr6zI/G6qAAC\nEMgfAcRO/vqUFkGgLAJXXtVie/bsseXLf5BKwSOhs2P7MwVR9gDDV2X1LDdDAAJOgGEsJ0EOgTol\ncP3i66yhocFmzbokdevvSOhoXaClS29C6NTp75NmQyAOAoidOChSBgQyTiAUPGmI4dGihz50tXrN\najb7zPjvC/MhUGsCiJ1a9wD1QyAlBCR4GseNs6/PmG5r73+gZlZp1pW8TIrRWdl+N0KnZj1BxRDI\nDwHETn76kpZAoN8E5s1rsdbFrXbNvKvtoourP6ylqfBfmXJOJLoUjMwU8353KQVAAAIFAogdfgYQ\ngMABBLS55tZtW23AgAE2edIkW9S65IDvk/igLSDObppi2slcU+IlukgQgAAE4iLAbKy4SFIOBHJI\n4PHHH7cVt98ZrVr8+TPOshnTp8U6Y+vOu1faL36+b6+rlqtbbPr06TmkSJMgAIFaE0Ds1LoHqB8C\nGSAg0bPqvjV2+4rlduFFs6xx/El21pmnVyR85MXZtOlJW7d2tQ0dNsxmFGKEmpqaGLLKwO8AEyGQ\nVQKInaz2HHZDoAYEXnrpJdu4caM9+rOf232rfmhfPvscGzFipH3ik5+
0kSNH2JAhQw6yavv2Hfbe\ne+/Z73/3YvTMpEmn2udOO82+9MUvEHx8EC0uQAACSRBA7CRBlTIhUCcE5PHRLup/sQH21FNPl2z1\nkUceaUf985F29NFHReJm+PDhJe/jIgQgAIGkCCB2kiJLuRCAAAQgAAEIpIIAs7FS0Q0YAQEIQAAC\nEIBAUgQQO0mRpVwIQAACEIAABFJBALGTim7ACAhAAAIQgAAEkiKA2EmKLOVCAAIQgAAEIJAKAoid\nVHQDRkAAAhCAAAQgkBQBxE5SZCkXAhCAAAQgAIFUEEDspKIbMAICEIAABCAAgaQIIHaSIku5EIAA\nBCAAAQikggBiJxXdgBEQgAAEIAABCCRFALGTFFnKhQAEIAABCEAgFQQQO6noBoyAAAQgAAEIQCAp\nAoidpMhSLgQgAAEIQAACqSCA2ElFN2AEBCAAAQhAAAJJEUDsJEWWciEAAQhAAAIQSAUBxE4qugEj\nIAABCEAAAhBIigBiJymylAsBCEAAAhCAQCoIIHZS0Q0YAQEIQAACEIBAUgQQO0mRpVwIQAACEIAA\nBFJBALGTim7ACAhAAAIQgAAEkiKA2EmKLOVCAAIQgAAEIJAKAoidVHQDRkAAAhCAAAQgkBQBxE5S\nZCkXAhCAAAQgAIFUEEDspKIbMAICEIAABCAAgaQIIHaSIku5EIAABCAAAQikggBiJxXdgBEQgAAE\nIAABCCRFALGTFFnKhQAEIAABCEAgFQQQO6noBoyAAAQgAAEIQCApAn8FUX2PmBTVQm8AAAAASUVO\nRK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 112, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "Image(filename='sentiment_network_sparse_2.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 113, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('edie', 4.6913478822291435),\n", + " ('paulie', 4.0775374439057197),\n", + " ('felix', 3.1527360223636558),\n", + " ('polanski', 2.8233610476132043),\n", + " ('matthau', 2.8067217286092401),\n", + " ('victoria', 2.6810215287142909),\n", + " ('mildred', 2.6026896854443837),\n", + " ('gandhi', 2.5389738710582761),\n", + " ('flawless', 2.451005098112319),\n", + " ('superbly', 2.2600254785752498),\n", + " ('perfection', 2.1594842493533721),\n", + " ('astaire', 2.1400661634962708),\n", + " ('captures', 2.0386195471595809),\n", + " ('voight', 2.0301704926730531),\n", + " ('wonderfully', 2.0218960560332353),\n", + " ('powell', 1.9783454248084671),\n", + " ('brosnan', 1.9547990964725592),\n", + " ('lily', 1.9203768470501485),\n", + " ('bakshi', 1.9029851043382795),\n", + " ('lincoln', 1.9014583864844796),\n", + " ('refreshing', 
1.8551812956655511),\n", + " ('breathtaking', 1.8481124057791867),\n", + " ('bourne', 1.8478489358790986),\n", + " ('lemmon', 1.8458266904983307),\n", + " ('delightful', 1.8002701588959635),\n", + " ('flynn', 1.7996646487351682),\n", + " ('andrews', 1.7764919970972666),\n", + " ('homer', 1.7692866133759964),\n", + " ('beautifully', 1.7626953362841438),\n", + " ('soccer', 1.7578579175523736),\n", + " ('elvira', 1.7397031072720019),\n", + " ('underrated', 1.7197859696029656),\n", + " ('gripping', 1.7165360479904674),\n", + " ('superb', 1.7091514458966952),\n", + " ('delight', 1.6714733033535532),\n", + " ('welles', 1.6677068205580761),\n", + " ('sadness', 1.663505133704376),\n", + " ('sinatra', 1.6389967146756448),\n", + " ('touching', 1.637217476541176),\n", + " ('timeless', 1.62924053973028),\n", + " ('macy', 1.6211339521972916),\n", + " ('unforgettable', 1.6177367152487956),\n", + " ('favorites', 1.6158688027643908),\n", + " ('stewart', 1.6119987332957739),\n", + " ('hartley', 1.6094379124341003),\n", + " ('sullivan', 1.6094379124341003),\n", + " ('extraordinary', 1.6094379124341003),\n", + " ('brilliantly', 1.5950491749820008),\n", + " ('friendship', 1.5677652160335325),\n", + " ('wonderful', 1.5645425925262093),\n", + " ('palma', 1.5553706911638245),\n", + " ('magnificent', 1.54663701119507),\n", + " ('finest', 1.5462590108125689),\n", + " ('jackie', 1.5439233053234738),\n", + " ('ritter', 1.5404450409471491),\n", + " ('tremendous', 1.5184661342283736),\n", + " ('freedom', 1.5091151908062312),\n", + " ('fantastic', 1.5048433868558566),\n", + " ('terrific', 1.5026699370083942),\n", + " ('noir', 1.493925025312256),\n", + " ('sidney', 1.493925025312256),\n", + " ('outstanding', 1.4910053152089213),\n", + " ('mann', 1.4894785973551214),\n", + " ('pleasantly', 1.4894785973551214),\n", + " ('nancy', 1.488077055429833),\n", + " ('marie', 1.4825711915553104),\n", + " ('marvelous', 1.4739999415389962),\n", + " ('excellent', 1.4647538505723599),\n", + " ('ruth', 
1.4596256342054401),\n", + " ('stanwyck', 1.4412101187160054),\n", + " ('widmark', 1.4350845252893227),\n", + " ('splendid', 1.4271163556401458),\n", + " ('chan', 1.423108334242607),\n", + " ('exceptional', 1.4201959127955721),\n", + " ('tender', 1.410986973710262),\n", + " ('gentle', 1.4078005663408544),\n", + " ('poignant', 1.4022947024663317),\n", + " ('gem', 1.3932148039644643),\n", + " ('amazing', 1.3919815802404802),\n", + " ('chilling', 1.3862943611198906),\n", + " ('captivating', 1.3862943611198906),\n", + " ('fisher', 1.3862943611198906),\n", + " ('davies', 1.3862943611198906),\n", + " ('darker', 1.3652409519220583),\n", + " ('april', 1.3499267169490159),\n", + " ('kelly', 1.3461743673304654),\n", + " ('blake', 1.3418425985490567),\n", + " ('overlooked', 1.329135947279942),\n", + " ('ralph', 1.32818673031261),\n", + " ('bette', 1.3156767939059373),\n", + " ('hoffman', 1.3150668518315229),\n", + " ('cole', 1.3121863889661687),\n", + " ('shines', 1.3049487216659381),\n", + " ('powerful', 1.2999662776313934),\n", + " ('notch', 1.2950456896547455),\n", + " ('remarkable', 1.2883688239495823),\n", + " ('pitt', 1.286210902562908),\n", + " ('winters', 1.2833463918674481),\n", + " ('vivid', 1.2762934659055623),\n", + " ('gritty', 1.2757524867200667),\n", + " ('giallo', 1.2745029551317739),\n", + " ('portrait', 1.2704625455947689),\n", + " ('innocence', 1.2694300209805796),\n", + " ('psychiatrist', 1.2685113254635072),\n", + " ('favorite', 1.2668956297860055),\n", + " ('ensemble', 1.2656663733312759),\n", + " ('stunning', 1.2622417124499117),\n", + " ('burns', 1.259880436264232),\n", + " ('garbo', 1.258954938743289),\n", + " ('barbara', 1.2580400255962119),\n", + " ('panic', 1.2527629684953681),\n", + " ('holly', 1.2527629684953681),\n", + " ('philip', 1.2527629684953681),\n", + " ('carol', 1.2481440226390734),\n", + " ('perfect', 1.246742480713785),\n", + " ('appreciated', 1.2462482874741743),\n", + " ('favourite', 1.2411123512753928),\n", + " ('journey', 
1.2367626271489269),\n", + " ('rural', 1.235471471385307),\n", + " ('bond', 1.2321436812926323),\n", + " ('builds', 1.2305398317106577),\n", + " ('brilliant', 1.2287554137664785),\n", + " ('brooklyn', 1.2286654169163074),\n", + " ('von', 1.225175011976539),\n", + " ('unfolds', 1.2163953243244932),\n", + " ('recommended', 1.2163953243244932),\n", + " ('daniel', 1.20215296760895),\n", + " ('perfectly', 1.1971931173405572),\n", + " ('crafted', 1.1962507582320256),\n", + " ('prince', 1.1939224684724346),\n", + " ('troubled', 1.192138346678933),\n", + " ('consequences', 1.1865810616140668),\n", + " ('haunting', 1.1814999484738773),\n", + " ('cinderella', 1.180052620608284),\n", + " ('alexander', 1.1759989522835299),\n", + " ('emotions', 1.1753049094563641),\n", + " ('boxing', 1.1735135968412274),\n", + " ('subtle', 1.1734135017508081),\n", + " ('curtis', 1.1649873576129823),\n", + " ('rare', 1.1566438362402944),\n", + " ('loved', 1.1563661500586044),\n", + " ('daughters', 1.1526795099383853),\n", + " ('courage', 1.1438688802562305),\n", + " ('dentist', 1.1426722784621401),\n", + " ('highly', 1.1420208631618658),\n", + " ('nominated', 1.1409146683587992),\n", + " ('tony', 1.1397491942285991),\n", + " ('draws', 1.1325138403437911),\n", + " ('everyday', 1.1306150197542835),\n", + " ('contrast', 1.1284652518177909),\n", + " ('cried', 1.1213405397456659),\n", + " ('fabulous', 1.1210851445201684),\n", + " ('ned', 1.120591195386885),\n", + " ('fay', 1.120591195386885),\n", + " ('emma', 1.1184149159642893),\n", + " ('sensitive', 1.113318436057805),\n", + " ('smooth', 1.1089750757036563),\n", + " ('dramas', 1.1080910326226534),\n", + " ('today', 1.1050431789984001),\n", + " ('helps', 1.1023091505494358),\n", + " ('inspiring', 1.0986122886681098),\n", + " ('jimmy', 1.0937696641923216),\n", + " ('awesome', 1.0931328229034842),\n", + " ('unique', 1.0881409888008142),\n", + " ('tragic', 1.0871835928444868),\n", + " ('intense', 1.0870514662670339),\n", + " ('stellar', 
1.0857088838322018),\n", + " ('rival', 1.0822184788924332),\n", + " ('provides', 1.0797081340289569),\n", + " ('depression', 1.0782034170369026),\n", + " ('shy', 1.0775588794702773),\n", + " ('carrie', 1.076139432816051),\n", + " ('blend', 1.0753554265038423),\n", + " ('hank', 1.0736109864626924),\n", + " ('diana', 1.0726368022648489),\n", + " ('adorable', 1.0726368022648489),\n", + " ('unexpected', 1.0722255334949147),\n", + " ('achievement', 1.0668635903535293),\n", + " ('bettie', 1.0663514264498881),\n", + " ('happiness', 1.0632729222228008),\n", + " ('glorious', 1.0608719606852626),\n", + " ('davis', 1.0541605260972757),\n", + " ('terrifying', 1.0525211814678428),\n", + " ('beauty', 1.050410186850232),\n", + " ('ideal', 1.0479685558493548),\n", + " ('fears', 1.0467872208035236),\n", + " ('hong', 1.0438040521731147),\n", + " ('seasons', 1.0433496099930604),\n", + " ('fascinating', 1.0414538748281612),\n", + " ('carries', 1.0345904299031787),\n", + " ('satisfying', 1.0321225473992768),\n", + " ('definite', 1.0319209141694374),\n", + " ('touched', 1.0296194171811581),\n", + " ('greatest', 1.0248947127715422),\n", + " ('creates', 1.0241097613701886),\n", + " ('aunt', 1.023388867430522),\n", + " ('walter', 1.022328983918479),\n", + " ('spectacular', 1.0198314108149955),\n", + " ('portrayal', 1.0189810189761024),\n", + " ('ann', 1.0127808528183286),\n", + " ('enterprise', 1.0116009116784799),\n", + " ('musicals', 1.0096648026516135),\n", + " ('deeply', 1.0094845087721023),\n", + " ('incredible', 1.0061677561461084),\n", + " ('mature', 1.0060195018402847),\n", + " ('triumph', 0.99682959435816731),\n", + " ('margaret', 0.99682959435816731),\n", + " ('navy', 0.99493385919326827),\n", + " ('harry', 0.99176919305006062),\n", + " ('lucas', 0.990398704027877),\n", + " ('sweet', 0.98966110487955483),\n", + " ('joey', 0.98794672078059009),\n", + " ('oscar', 0.98721905111049713),\n", + " ('balance', 0.98649499054740353),\n", + " ('warm', 0.98485340331145166),\n", + " ('ages', 
0.98449898190068863),\n", + " ('glover', 0.98082925301172619),\n", + " ('guilt', 0.98082925301172619),\n", + " ('carrey', 0.98082925301172619),\n", + " ('learns', 0.97881108885548895),\n", + " ('unusual', 0.97788374278196932),\n", + " ('sons', 0.97777581552483595),\n", + " ('complex', 0.97761897738147796),\n", + " ('essence', 0.97753435711487369),\n", + " ('brazil', 0.9769153536905899),\n", + " ('widow', 0.97650959186720987),\n", + " ('solid', 0.97537964824416146),\n", + " ('beautiful', 0.97326301262841053),\n", + " ('holmes', 0.97246100334120955),\n", + " ('awe', 0.97186058302896583),\n", + " ('vhs', 0.97116734209998934),\n", + " ('eerie', 0.97116734209998934),\n", + " ('lonely', 0.96873720724669754),\n", + " ('grim', 0.96873720724669754),\n", + " ('sport', 0.96825047080486615),\n", + " ('debut', 0.96508089604358704),\n", + " ('destiny', 0.96343751029985703),\n", + " ('thrillers', 0.96281074750904794),\n", + " ('tears', 0.95977584381389391),\n", + " ('rose', 0.95664202739772253),\n", + " ('feelings', 0.95551144502743635),\n", + " ('ginger', 0.95551144502743635),\n", + " ('winning', 0.95471810900804055),\n", + " ('stanley', 0.95387344302319799),\n", + " ('cox', 0.95343027882361187),\n", + " ('paris', 0.95278479030472663),\n", + " ('heart', 0.95238806924516806),\n", + " ('hooked', 0.95155887071161305),\n", + " ('comfortable', 0.94803943018873538),\n", + " ('mgm', 0.94446160884085151),\n", + " ('masterpiece', 0.94155039863339296),\n", + " ('themes', 0.94118828349588235),\n", + " ('danny', 0.93967118051821874),\n", + " ('anime', 0.93378388932167222),\n", + " ('perry', 0.93328830824272613),\n", + " ('joy', 0.93301752567946861),\n", + " ('lovable', 0.93081883243706487),\n", + " ('hal', 0.92953595862417571),\n", + " ('mysteries', 0.92953595862417571),\n", + " ('louis', 0.92871325187271225),\n", + " ('charming', 0.92520609553210742),\n", + " ('urban', 0.92367083917177761),\n", + " ('allows', 0.92183091224977043),\n", + " ('impact', 0.91815814604895041),\n", + " 
('gradually', 0.91629073187415511),\n", + " ('lifestyle', 0.91629073187415511),\n", + " ('italy', 0.91629073187415511),\n", + " ('spy', 0.91289514287301687),\n", + " ('treat', 0.91193342650519937),\n", + " ('subsequent', 0.91056005716517008),\n", + " ('kennedy', 0.90981821736853763),\n", + " ('loving', 0.90967549275543591),\n", + " ('surprising', 0.90937028902958128),\n", + " ('quiet', 0.90648673177753425),\n", + " ('winter', 0.90624039602065365),\n", + " ('reveals', 0.90490540964902977),\n", + " ('raw', 0.90445627422715225),\n", + " ('funniest', 0.90078654533818991),\n", + " ('pleased', 0.89994159387262562),\n", + " ('norman', 0.89994159387262562),\n", + " ('thief', 0.89874642222324552),\n", + " ('season', 0.89827222637147675),\n", + " ('secrets', 0.89794159320595857),\n", + " ('colorful', 0.89705936994626756),\n", + " ('highest', 0.8967461358011849),\n", + " ('compelling', 0.89462923509297576),\n", + " ('danes', 0.89248008318043659),\n", + " ('castle', 0.88967708335606499),\n", + " ('kudos', 0.88889175768604067),\n", + " ('great', 0.88810470901464589),\n", + " ('baseball', 0.88730319500090271),\n", + " ('subtitles', 0.88730319500090271),\n", + " ('bleak', 0.88730319500090271),\n", + " ('winner', 0.88643776872447388),\n", + " ('tragedy', 0.88563699078315261),\n", + " ('todd', 0.88551907320740142),\n", + " ('nicely', 0.87924946019380601),\n", + " ('arthur', 0.87546873735389985),\n", + " ('essential', 0.87373111745535925),\n", + " ('gorgeous', 0.8731725250935497),\n", + " ('fonda', 0.87294029100054127),\n", + " ('eastwood', 0.87139541196626402),\n", + " ('focuses', 0.87082835779739776),\n", + " ('enjoyed', 0.87070195951624607),\n", + " ('natural', 0.86997924506912838),\n", + " ('intensity', 0.86835126958503595),\n", + " ('witty', 0.86824103423244681),\n", + " ('rob', 0.8642954367557748),\n", + " ('worlds', 0.86377269759070874),\n", + " ('health', 0.86113891179907498),\n", + " ('magical', 0.85953791528170564),\n", + " ('deeper', 0.85802182375017932),\n", + " ('lucy', 
0.85618680780444956),\n", + " ('moving', 0.85566611005772031),\n", + " ('lovely', 0.85290640004681306),\n", + " ('purple', 0.8513711857748395),\n", + " ('memorable', 0.84801189112086062),\n", + " ('sings', 0.84729786038720367),\n", + " ('craig', 0.84342938360928321),\n", + " ('modesty', 0.84342938360928321),\n", + " ('relate', 0.84326559685926517),\n", + " ('episodes', 0.84223712084137292),\n", + " ('strong', 0.84167135777060931),\n", + " ('smith', 0.83959811108590054),\n", + " ('tear', 0.83704136022001441),\n", + " ('apartment', 0.83333115290549531),\n", + " ('princess', 0.83290912293510388),\n", + " ('disagree', 0.83290912293510388),\n", + " ('kung', 0.83173334384609199),\n", + " ('adventure', 0.83150561393278388),\n", + " ('columbo', 0.82667857318446791),\n", + " ('jake', 0.82667857318446791),\n", + " ('adds', 0.82485652591452319),\n", + " ('hart', 0.82472353834866463),\n", + " ('strength', 0.82417544296634937),\n", + " ('realizes', 0.82360006895738058),\n", + " ('dave', 0.8232003088081431),\n", + " ('childhood', 0.82208086393583857),\n", + " ('forbidden', 0.81989888619908913),\n", + " ('tight', 0.81883539572344199),\n", + " ('surreal', 0.8178506590609026),\n", + " ('manager', 0.81770990320170756),\n", + " ('dancer', 0.81574950265227764),\n", + " ('con', 0.81093021621632877),\n", + " ('studios', 0.81093021621632877),\n", + " ('miike', 0.80821651034473263),\n", + " ('realistic', 0.80807714723392232),\n", + " ('explicit', 0.80792269515237358),\n", + " ('kurt', 0.8060875917405409),\n", + " ('traditional', 0.80535917116687328),\n", + " ('deals', 0.80535917116687328),\n", + " ('holds', 0.80493858654806194),\n", + " ('carl', 0.80437281567016972),\n", + " ('touches', 0.80396154690023547),\n", + " ('gene', 0.80314807577427383),\n", + " ('albert', 0.8027669055771679),\n", + " ('abc', 0.80234647252493729),\n", + " ('cry', 0.80011930011211307),\n", + " ('sides', 0.7995275841185171),\n", + " ('develops', 0.79850769621777162),\n", + " ('eyre', 0.79850769621777162),\n", + " 
('dances', 0.79694397424158891),\n", + " ('oscars', 0.79633141679517616),\n", + " ('legendary', 0.79600456599965308),\n", + " ('importance', 0.79492987486988764),\n", + " ('hearted', 0.79492987486988764),\n", + " ('portraying', 0.79356592830699269),\n", + " ('impressed', 0.79258107754813223),\n", + " ('waters', 0.79112758892014912),\n", + " ('empire', 0.79078565012386137),\n", + " ('edge', 0.789774016249017),\n", + " ('environment', 0.78845736036427028),\n", + " ('jean', 0.78845736036427028),\n", + " ('sentimental', 0.7864791203521645),\n", + " ('captured', 0.78623760362595729),\n", + " ('styles', 0.78592891401091158),\n", + " ('daring', 0.78592891401091158),\n", + " ('backgrounds', 0.78275933924963248),\n", + " ('frank', 0.78275933924963248),\n", + " ('matches', 0.78275933924963248),\n", + " ('tense', 0.78275933924963248),\n", + " ('gothic', 0.78209466657644144),\n", + " ('sharp', 0.7814397877056235),\n", + " ('achieved', 0.78015855754957497),\n", + " ('court', 0.77947526404844247),\n", + " ('steals', 0.7789140023173704),\n", + " ('rules', 0.77844476107184035),\n", + " ('colors', 0.77684619943659217),\n", + " ('reunion', 0.77318988823348167),\n", + " ('covers', 0.77139937745969345),\n", + " ('tale', 0.77010822169607374),\n", + " ('rain', 0.7683706017975328),\n", + " ('denzel', 0.76804848873306297),\n", + " ('stays', 0.76787072675588186),\n", + " ('blob', 0.76725515271366718),\n", + " ('conventional', 0.76214005204689672),\n", + " ('maria', 0.76214005204689672),\n", + " ('fresh', 0.76158434211317383),\n", + " ('midnight', 0.76096977689870637),\n", + " ('landscape', 0.75852993982279704),\n", + " ('animated', 0.75768570169751648),\n", + " ('titanic', 0.75666058628227129),\n", + " ('sunday', 0.75666058628227129),\n", + " ('spring', 0.7537718023763802),\n", + " ('cagney', 0.7537718023763802),\n", + " ('enjoyable', 0.75246375771636476),\n", + " ('immensely', 0.75198768058287868),\n", + " ('sir', 0.7507762933965817),\n", + " ('nevertheless', 0.75067102469813185),\n", + " 
('driven', 0.74994477895307854),\n", + " ('performances', 0.74883252516063137),\n", + " ('memories', 0.74721440183022114),\n", + " ('nowadays', 0.74721440183022114),\n", + " ('simple', 0.74641420974143258),\n", + " ('golden', 0.74533293373051557),\n", + " ('leslie', 0.74533293373051557),\n", + " ('lovers', 0.74497224842453125),\n", + " ('relationship', 0.74484232345601786),\n", + " ('supporting', 0.74357803418683721),\n", + " ('che', 0.74262723782331497),\n", + " ('packed', 0.7410032017375805),\n", + " ('trek', 0.74021469141793106),\n", + " ('provoking', 0.73840377214806618),\n", + " ('strikes', 0.73759894313077912),\n", + " ('depiction', 0.73682224406260699),\n", + " ('emotional', 0.73678211645681524),\n", + " ('secretary', 0.7366322924996842),\n", + " ('influenced', 0.73511137965897755),\n", + " ('florida', 0.73511137965897755),\n", + " ('germany', 0.73288750920945944),\n", + " ('brings', 0.73142936713096229),\n", + " ('lewis', 0.73129894652432159),\n", + " ('elderly', 0.73088750854279239),\n", + " ('owner', 0.72743625403857748),\n", + " ('streets', 0.72666987259858895),\n", + " ('henry', 0.72642196944481741),\n", + " ('portrays', 0.72593700338293632),\n", + " ('bears', 0.7252354951114458),\n", + " ('china', 0.72489587887452556),\n", + " ('anger', 0.72439972406404984),\n", + " ('society', 0.72433010799663333),\n", + " ('available', 0.72415741730250549),\n", + " ('best', 0.72347034060446314),\n", + " ('bugs', 0.72270598280148979),\n", + " ('magic', 0.71878961117328299),\n", + " ('verhoeven', 0.71846498854423513),\n", + " ('delivers', 0.71846498854423513),\n", + " ('jim', 0.71783979315031676),\n", + " ('donald', 0.71667767797013937),\n", + " ('endearing', 0.71465338578090898),\n", + " ('relationships', 0.71393795022901896),\n", + " ('greatly', 0.71256526641704687),\n", + " ('charlie', 0.71024161391924534),\n", + " ('brad', 0.71024161391924534),\n", + " ('simon', 0.70967648251115578),\n", + " ('effectively', 0.70914752190638641),\n", + " ('march', 
0.70774597998109789),\n", + " ('atmosphere', 0.70744773070214162),\n", + " ('influence', 0.70733181555190172),\n", + " ('genius', 0.706392407309966),\n", + " ('emotionally', 0.70556970055850243),\n", + " ('ken', 0.70526854109229009),\n", + " ('identity', 0.70484322032313651),\n", + " ('sophisticated', 0.70470800296102132),\n", + " ('dan', 0.70457587638356811),\n", + " ('andrew', 0.70329955202396321),\n", + " ('india', 0.70144598337464037),\n", + " ('roy', 0.69970458110610434),\n", + " ('surprisingly', 0.6995780708902356),\n", + " ('sky', 0.69780919366575667),\n", + " ('romantic', 0.69664981111114743),\n", + " ('match', 0.69566924999265523),\n", + " ('britain', 0.69314718055994529),\n", + " ('beatty', 0.69314718055994529),\n", + " ('affected', 0.69314718055994529),\n", + " ('cowboy', 0.69314718055994529),\n", + " ('wave', 0.69314718055994529),\n", + " ('stylish', 0.69314718055994529),\n", + " ('bitter', 0.69314718055994529),\n", + " ('patient', 0.69314718055994529),\n", + " ('meets', 0.69314718055994529),\n", + " ('love', 0.69198533541937324),\n", + " ('paul', 0.68980827929443067),\n", + " ('andy', 0.68846333124751902),\n", + " ('performance', 0.68797386327972465),\n", + " ('patrick', 0.68645819240914863),\n", + " ('unlike', 0.68546468438792907),\n", + " ('brooks', 0.68433655087779044),\n", + " ('refuses', 0.68348526964820844),\n", + " ('award', 0.6824518914431974),\n", + " ('complaint', 0.6824518914431974),\n", + " ('ride', 0.68229716453587952),\n", + " ('dawson', 0.68171848473632257),\n", + " ('luke', 0.68158635815886937),\n", + " ('wells', 0.68087708796813096),\n", + " ('france', 0.6804081547825156),\n", + " ('handsome', 0.68007509899259255),\n", + " ('sports', 0.68007509899259255),\n", + " ('rebel', 0.67875844310784572),\n", + " ('directs', 0.67875844310784572),\n", + " ('greater', 0.67605274720064523),\n", + " ('dreams', 0.67599410133369586),\n", + " ('effective', 0.67565402311242806),\n", + " ('interpretation', 0.67479804189174875),\n", + " ('works', 
0.67445504754779284),\n", + " ('brando', 0.67445504754779284),\n", + " ('noble', 0.6737290947028437),\n", + " ('paced', 0.67314651385327573),\n", + " ('le', 0.67067432470788668),\n", + " ('master', 0.67015766233524654),\n", + " ('h', 0.6696166831497512),\n", + " ('rings', 0.66904962898088483),\n", + " ('easy', 0.66895995494594152),\n", + " ('city', 0.66820823221269321),\n", + " ('sunshine', 0.66782937257565544),\n", + " ('succeeds', 0.66647893347778397),\n", + " ('relations', 0.664159643686693),\n", + " ('england', 0.66387679825983203),\n", + " ('glimpse', 0.66329421741026418),\n", + " ('aired', 0.66268797307523675),\n", + " ('sees', 0.66263163663399482),\n", + " ('both', 0.66248336767382998),\n", + " ('definitely', 0.66199789483898808),\n", + " ('imaginative', 0.66139848224536502),\n", + " ('appreciate', 0.66083893732728749),\n", + " ('tricks', 0.66071190480679143),\n", + " ('striking', 0.66071190480679143),\n", + " ('carefully', 0.65999497324304479),\n", + " ('complicated', 0.65981076029235353),\n", + " ('perspective', 0.65962448852130173),\n", + " ('trilogy', 0.65877953705573755),\n", + " ('future', 0.65834665141052828),\n", + " ('lion', 0.65742909795786608),\n", + " ('victor', 0.65540685257709819),\n", + " ('douglas', 0.65540685257709819),\n", + " ('inspired', 0.65459851044271034),\n", + " ('marriage', 0.65392646740666405),\n", + " ('demands', 0.65392646740666405),\n", + " ('father', 0.65172321672194655),\n", + " ('page', 0.65123628494430852),\n", + " ('instant', 0.65058756614114943),\n", + " ('era', 0.6495567444850836),\n", + " ('ruthless', 0.64934455790155243),\n", + " ('saga', 0.64934455790155243),\n", + " ('joan', 0.64891392558311978),\n", + " ('joseph', 0.64841128671855386),\n", + " ('workers', 0.64829661439459352),\n", + " ('fantasy', 0.64726757480925168),\n", + " ('accomplished', 0.64551913157069074),\n", + " ('distant', 0.64551913157069074),\n", + " ('manhattan', 0.64435701639051324),\n", + " ('personal', 0.64355023942057321),\n", + " ('pushing', 
0.64313675998528386),\n", + " ('meeting', 0.64313675998528386),\n", + " ('individual', 0.64313675998528386),\n", + " ('pleasant', 0.64250344774119039),\n", + " ('brave', 0.64185388617239469),\n", + " ('william', 0.64083139119578469),\n", + " ('hudson', 0.64077919504262937),\n", + " ('friendly', 0.63949446706762514),\n", + " ('eccentric', 0.63907995928966954),\n", + " ('awards', 0.63875310849414646),\n", + " ('jack', 0.63838309514997038),\n", + " ('seeking', 0.63808740337691783),\n", + " ('colonel', 0.63757732940513456),\n", + " ('divorce', 0.63757732940513456),\n", + " ('jane', 0.63443957973316734),\n", + " ('keeping', 0.63414883979798953),\n", + " ('gives', 0.63383568159497883),\n", + " ('ted', 0.63342794585832296),\n", + " ('animation', 0.63208692379869902),\n", + " ('progress', 0.6317782341836532),\n", + " ('concert', 0.63127177684185776),\n", + " ('larger', 0.63127177684185776),\n", + " ('nation', 0.6296337748376194),\n", + " ('albeit', 0.62739580299716491),\n", + " ('adapted', 0.62613647027698516),\n", + " ('discovers', 0.62542900650499444),\n", + " ('classic', 0.62504956428050518),\n", + " ('segment', 0.62335141862440335),\n", + " ('morgan', 0.62303761437291871),\n", + " ('mouse', 0.62294292188669675),\n", + " ('impressive', 0.62211140744319349),\n", + " ('artist', 0.62168821657780038),\n", + " ('ultimate', 0.62168821657780038),\n", + " ('griffith', 0.62117368093485603),\n", + " ('emily', 0.62082651898031915),\n", + " ('drew', 0.62082651898031915),\n", + " ('moved', 0.6197197120051281),\n", + " ('profound', 0.61903920840622351),\n", + " ('families', 0.61903920840622351),\n", + " ('innocent', 0.61851219917136446),\n", + " ('versions', 0.61730910416844087),\n", + " ('eddie', 0.61691981517206107),\n", + " ('criticism', 0.61651395453902935),\n", + " ('nature', 0.61594514653194088),\n", + " ('recognized', 0.61518563909023349),\n", + " ('sexuality', 0.61467556511845012),\n", + " ('contract', 0.61400986000122149),\n", + " ('brian', 0.61344043794920278),\n", + " 
('remembered', 0.6131044728864089),\n", + " ('determined', 0.6123858239154869),\n", + " ('offers', 0.61207935747116349),\n", + " ('pleasure', 0.61195702582993206),\n", + " ('washington', 0.61180154110599294),\n", + " ('images', 0.61159731359583758),\n", + " ('games', 0.61067095873570676),\n", + " ('academy', 0.60872983874736208),\n", + " ('fashioned', 0.60798937221963845),\n", + " ('melodrama', 0.60749173598145145),\n", + " ('peoples', 0.60613580357031549),\n", + " ('charismatic', 0.60613580357031549),\n", + " ('rough', 0.60613580357031549),\n", + " ('dealing', 0.60517840761398811),\n", + " ('fine', 0.60496962268013299),\n", + " ('tap', 0.60391604683200273),\n", + " ('trio', 0.60157998703445481),\n", + " ('russell', 0.60120968523425966),\n", + " ('figures', 0.60077386042893011),\n", + " ('ward', 0.60005675749393339),\n", + " ('shine', 0.59911823091166894),\n", + " ('brady', 0.59911823091166894),\n", + " ('job', 0.59845562125168661),\n", + " ('satisfied', 0.59652034487087369),\n", + " ('river', 0.59637962862495086),\n", + " ('brown', 0.595773016534769),\n", + " ('believable', 0.59566072133302495),\n", + " ('bound', 0.59470710774669278),\n", + " ('always', 0.59470710774669278),\n", + " ('hall', 0.5933967777928858),\n", + " ('cook', 0.5916777203950857),\n", + " ('claire', 0.59136448625000293),\n", + " ('broadway', 0.59033768669372433),\n", + " ('anna', 0.58778666490211906),\n", + " ('peace', 0.58628403501758408),\n", + " ('visually', 0.58539431926349916),\n", + " ('falk', 0.58525821854876026),\n", + " ('morality', 0.58525821854876026),\n", + " ('growing', 0.58466653756587539),\n", + " ('experiences', 0.58314628534561685),\n", + " ('stood', 0.58314628534561685),\n", + " ('touch', 0.58122926435596001),\n", + " ('lives', 0.5810976767513224),\n", + " ('kubrick', 0.58066919713325493),\n", + " ('timing', 0.58047401805583243),\n", + " ('struggles', 0.57981849525294216),\n", + " ('expressions', 0.57981849525294216),\n", + " ('authentic', 0.57848427223980559),\n", + " 
('helen', 0.57763429343810091),\n", + " ('pre', 0.57700753064729182),\n", + " ('quirky', 0.5753641449035618),\n", + " ('young', 0.57531672344534313),\n", + " ('inner', 0.57454143815209846),\n", + " ('mexico', 0.57443087372056334),\n", + " ('clint', 0.57380042292737909),\n", + " ('sisters', 0.57286101468544337),\n", + " ('realism', 0.57226528899949558),\n", + " ('personalities', 0.5720692490067093),\n", + " ('french', 0.5720692490067093),\n", + " ('surprises', 0.57113222999698177),\n", + " ('adventures', 0.57113222999698177),\n", + " ('overcome', 0.5697681593994407),\n", + " ('timothy', 0.56953322459276867),\n", + " ('tales', 0.56909453188996639),\n", + " ('war', 0.56843317302781682),\n", + " ('civil', 0.5679840376059393),\n", + " ('countries', 0.56737779327091187),\n", + " ('streep', 0.56710645966458029),\n", + " ('tradition', 0.56685345523565323),\n", + " ('oliver', 0.56673325570428668),\n", + " ('australia', 0.56580775818334383),\n", + " ('understanding', 0.56531380905006046),\n", + " ('players', 0.56509525370004821),\n", + " ('knowing', 0.56489284503626647),\n", + " ('rogers', 0.56421349718405212),\n", + " ('suspenseful', 0.56368911332305849),\n", + " ('variety', 0.56368911332305849),\n", + " ('true', 0.56281525180810066),\n", + " ('jr', 0.56220982311246936),\n", + " ('psychological', 0.56108745854687891),\n", + " ('branagh', 0.55961578793542266),\n", + " ('wealth', 0.55961578793542266),\n", + " ('performing', 0.55961578793542266),\n", + " ('odds', 0.55961578793542266),\n", + " ('sent', 0.55961578793542266),\n", + " ('reminiscent', 0.55961578793542266),\n", + " ('grand', 0.55961578793542266),\n", + " ('overwhelming', 0.55961578793542266),\n", + " ('brothers', 0.55891181043362848),\n", + " ('howard', 0.55811089675600245),\n", + " ('david', 0.55693122256475369),\n", + " ('generation', 0.55628799784274796),\n", + " ('grow', 0.55612538299565417),\n", + " ('survival', 0.55594605904646033),\n", + " ('mainstream', 0.55574731115750231),\n", + " ('dick', 
0.55431073570572953),\n", + " ('charm', 0.55288175575407861),\n", + " ('kirk', 0.55278982286502287),\n", + " ('twists', 0.55244729845681018),\n", + " ('gangster', 0.55206858230003986),\n", + " ('jeff', 0.55179306225421365),\n", + " ('family', 0.55116244510065526),\n", + " ('tend', 0.55053307336110335),\n", + " ('thanks', 0.55049088015842218),\n", + " ('world', 0.54744234723432639),\n", + " ('sutherland', 0.54743536937855164),\n", + " ('life', 0.54695514434959924),\n", + " ('disc', 0.54654370636806993),\n", + " ('bug', 0.54654370636806993),\n", + " ('tribute', 0.5455111817538808),\n", + " ('europe', 0.54522705048332309),\n", + " ('sacrifice', 0.54430155296238014),\n", + " ('color', 0.54405127139431109),\n", + " ('superior', 0.54333490233128523),\n", + " ('york', 0.54318235866536513),\n", + " ('pulls', 0.54266622962164945),\n", + " ('hearts', 0.54232429082536171),\n", + " ('jackson', 0.54232429082536171),\n", + " ('enjoy', 0.54124285135906114),\n", + " ('redemption', 0.54056759296472823),\n", + " ('madness', 0.540384426007535),\n", + " ('hamilton', 0.5389965007326869),\n", + " ('stands', 0.5389965007326869),\n", + " ('trial', 0.5389965007326869),\n", + " ('greek', 0.5389965007326869),\n", + " ('each', 0.5388212312554177),\n", + " ('faithful', 0.53773307668591508),\n", + " ('received', 0.5372768098531604),\n", + " ('jealous', 0.53714293208336406),\n", + " ('documentaries', 0.53714293208336406),\n", + " ('different', 0.53709860682460819),\n", + " ('describes', 0.53680111016925136),\n", + " ('shorts', 0.53596159703753288),\n", + " ('brilliance', 0.53551823635636209),\n", + " ('mountains', 0.53492317534505118),\n", + " ('share', 0.53408248593025787),\n", + " ('dealt', 0.53408248593025787),\n", + " ('providing', 0.53329847961804933),\n", + " ('explore', 0.53329847961804933),\n", + " ('series', 0.5325809226575603),\n", + " ('fellow', 0.5323318289869543),\n", + " ('loves', 0.53062825106217038),\n", + " ('olivier', 0.53062825106217038),\n", + " ('revolution', 
0.53062825106217038),\n", + " ('roman', 0.53062825106217038),\n", + " ('century', 0.53002783074992665),\n", + " ('musical', 0.52966871156747064),\n", + " ('heroic', 0.52925932545482868),\n", + " ('ironically', 0.52806743020049673),\n", + " ('approach', 0.52806743020049673),\n", + " ('temple', 0.52806743020049673),\n", + " ('moves', 0.5279372642387119),\n", + " ('gift', 0.52702030968597136),\n", + " ('julie', 0.52609309589677911),\n", + " ('tells', 0.52415107836314001),\n", + " ('radio', 0.52394671172868779),\n", + " ('uncle', 0.52354439617376536),\n", + " ('union', 0.52324814376454787),\n", + " ('deep', 0.52309571635780505),\n", + " ('reminds', 0.52157841554225237),\n", + " ('famous', 0.52118841080153722),\n", + " ('jazz', 0.52053443789295151),\n", + " ('dennis', 0.51987545928590861),\n", + " ('epic', 0.51919387343650736),\n", + " ('adult', 0.519167695083386),\n", + " ('shows', 0.51915322220375304),\n", + " ('performed', 0.5191244265806858),\n", + " ('demons', 0.5191244265806858),\n", + " ('eric', 0.51879379341516751),\n", + " ('discovered', 0.51879379341516751),\n", + " ('youth', 0.5185626062681431),\n", + " ('human', 0.51851411224987087),\n", + " ('tarzan', 0.51813827061227724),\n", + " ('ourselves', 0.51794309153485463),\n", + " ('wwii', 0.51758240622887042),\n", + " ('passion', 0.5162164724008671),\n", + " ('desire', 0.51607497965213445),\n", + " ('pays', 0.51581316527702981),\n", + " ('fox', 0.51557622652458857),\n", + " ('dirty', 0.51557622652458857),\n", + " ('symbolism', 0.51546600332249293),\n", + " ('sympathetic', 0.51546600332249293),\n", + " ('attitude', 0.51530993621331933),\n", + " ('appearances', 0.51466440007315639),\n", + " ('jeremy', 0.51466440007315639),\n", + " ('fun', 0.51439068993048687),\n", + " ('south', 0.51420972175023116),\n", + " ('arrives', 0.51409894911095988),\n", + " ('present', 0.51341965894303732),\n", + " ('com', 0.51326167856387173),\n", + " ('smile', 0.51265880484765169),\n", + " ('fits', 0.51082562376599072),\n", + " 
('provided', 0.51082562376599072),\n", + " ('carter', 0.51082562376599072),\n", + " ('ring', 0.51082562376599072),\n", + " ('aging', 0.51082562376599072),\n", + " ('countryside', 0.51082562376599072),\n", + " ('alan', 0.51082562376599072),\n", + " ('visit', 0.51082562376599072),\n", + " ('begins', 0.51015650363396647),\n", + " ('success', 0.50900578704900468),\n", + " ('japan', 0.50900578704900468),\n", + " ('accurate', 0.50895471583017893),\n", + " ('proud', 0.50800474742434931),\n", + " ('daily', 0.5075946031845443),\n", + " ('atmospheric', 0.50724780241810674),\n", + " ('karloff', 0.50724780241810674),\n", + " ('recently', 0.50714914903668207),\n", + " ('fu', 0.50704490092608467),\n", + " ('horrors', 0.50656122497953315),\n", + " ('finding', 0.50637127341661037),\n", + " ('lust', 0.5059356384717989),\n", + " ('hitchcock', 0.50574947073413001),\n", + " ('among', 0.50334004951332734),\n", + " ('viewing', 0.50302139827440906),\n", + " ('shining', 0.50262885656181222),\n", + " ('investigation', 0.50262885656181222),\n", + " ('duo', 0.5020919437972361),\n", + " ('cameron', 0.5020919437972361),\n", + " ('finds', 0.50128303100539795),\n", + " ('contemporary', 0.50077528791248915),\n", + " ('genuine', 0.50046283673044401),\n", + " ('frightening', 0.49995595152908684),\n", + " ('plays', 0.49975983848890226),\n", + " ('age', 0.49941323171424595),\n", + " ('position', 0.49899116611898781),\n", + " ('continues', 0.49863035067217237),\n", + " ('roles', 0.49839716550752178),\n", + " ('james', 0.49837216269470402),\n", + " ('individuals', 0.49824684155913052),\n", + " ('brought', 0.49783842823917956),\n", + " ('hilarious', 0.49714551986191058),\n", + " ('brutal', 0.49681488669639234),\n", + " ('appropriate', 0.49643688631389105),\n", + " ('dance', 0.49581998314812048),\n", + " ('league', 0.49578774640145024),\n", + " ('helping', 0.49578774640145024),\n", + " ('answers', 0.49578774640145024),\n", + " ('stunts', 0.49561620510246196),\n", + " ('traveling', 
0.49532143723002542),\n", + " ('thoroughly', 0.49414593456733524),\n", + " ('depicted', 0.49317068852726992),\n", + " ('honor', 0.49247648509779424),\n", + " ('combination', 0.49247648509779424),\n", + " ('differences', 0.49247648509779424),\n", + " ('fully', 0.49213349075383811),\n", + " ('tracy', 0.49159426183810306),\n", + " ('battles', 0.49140753790888908),\n", + " ('possibility', 0.49112055268665822),\n", + " ('romance', 0.4901589869574316),\n", + " ('initially', 0.49002249613622745),\n", + " ('happy', 0.4898997500608791),\n", + " ('crime', 0.48977221456815834),\n", + " ('singing', 0.4893852925281213),\n", + " ('especially', 0.48901267837860624),\n", + " ('shakespeare', 0.48754793889664511),\n", + " ('hugh', 0.48729512635579658),\n", + " ('detail', 0.48609484250827351),\n", + " ('guide', 0.48550781578170082),\n", + " ('companion', 0.48550781578170082),\n", + " ('julia', 0.48550781578170082),\n", + " ('san', 0.48550781578170082),\n", + " ('desperation', 0.48550781578170082),\n", + " ('strongly', 0.48460242866688824),\n", + " ('necessary', 0.48302334245403883),\n", + " ('humanity', 0.48265474679929443),\n", + " ('drama', 0.48221998493060503),\n", + " ('warming', 0.48183808689273838),\n", + " ('intrigue', 0.48183808689273838),\n", + " ('nonetheless', 0.48183808689273838),\n", + " ('cuba', 0.48183808689273838),\n", + " ('planned', 0.47957308026188628),\n", + " ('pictures', 0.47929937011921681),\n", + " ('broadcast', 0.47849024312305422),\n", + " ('nine', 0.47803580094299974),\n", + " ('settings', 0.47743860773325364),\n", + " ('history', 0.47732966933780852),\n", + " ('ordinary', 0.47725880012690741),\n", + " ('trade', 0.47692407209030935),\n", + " ('primary', 0.47608267532211779),\n", + " ('official', 0.47608267532211779),\n", + " ('episode', 0.47529620261150429),\n", + " ('role', 0.47520268270188676),\n", + " ('spirit', 0.47477690799839323),\n", + " ('grey', 0.47409361449726067),\n", + " ('ways', 0.47323464982718205),\n", + " ('cup', 0.47260441094579297),\n", + 
" ('piano', 0.47260441094579297),\n", + " ('familiar', 0.47241617565111949),\n", + " ('sinister', 0.47198579044972683),\n", + " ('reveal', 0.47171449364936496),\n", + " ('max', 0.47150852042515579),\n", + " ('dated', 0.47121648567094482),\n", + " ('discovery', 0.47000362924573563),\n", + " ('vicious', 0.47000362924573563),\n", + " ('losing', 0.47000362924573563),\n", + " ('genuinely', 0.46871413841586385),\n", + " ('hatred', 0.46734051182625186),\n", + " ('mistaken', 0.46702300110759781),\n", + " ('dream', 0.46608972992459924),\n", + " ('challenge', 0.46608972992459924),\n", + " ('crisis', 0.46575733836428446),\n", + " ('photographed', 0.46488852857896512),\n", + " ('machines', 0.46430560813109778),\n", + " ('critics', 0.46430560813109778),\n", + " ('bird', 0.46430560813109778),\n", + " ('born', 0.46411383518967209),\n", + " ('detective', 0.4636633473511525),\n", + " ('higher', 0.46328467899699055),\n", + " ('remains', 0.46262352194811296),\n", + " ('inevitable', 0.46262352194811296),\n", + " ('soviet', 0.4618180446592961),\n", + " ('ryan', 0.46134556650262099),\n", + " ('african', 0.46112595521371813),\n", + " ('smaller', 0.46081520319132935),\n", + " ('techniques', 0.46052488529119184),\n", + " ('information', 0.46034171833399862),\n", + " ('deserved', 0.45999798712841444),\n", + " ('cynical', 0.45953232937844013),\n", + " ('lynch', 0.45953232937844013),\n", + " ('francisco', 0.45953232937844013),\n", + " ('tour', 0.45953232937844013),\n", + " ('spielberg', 0.45953232937844013),\n", + " ('struggle', 0.45911782160048453),\n", + " ('language', 0.45902121257712653),\n", + " ('visual', 0.45823514408822852),\n", + " ('warner', 0.45724137763188427),\n", + " ('social', 0.45720078250735313),\n", + " ('reality', 0.45719346885019546),\n", + " ('hidden', 0.45675840249571492),\n", + " ('breaking', 0.45601738727099561),\n", + " ('sometimes', 0.45563021171182794),\n", + " ('modern', 0.45500247579345005),\n", + " ('surfing', 0.45425527227759638),\n", + " ('popular', 
0.45410691533051023),\n", + " ('surprised', 0.4534409399850382),\n", + " ('follows', 0.45245361754408348),\n", + " ('keeps', 0.45234869400701483),\n", + " ('john', 0.4520909494482197),\n", + " ('defeat', 0.45198512374305722),\n", + " ('mixed', 0.45198512374305722),\n", + " ('justice', 0.45142724367280018),\n", + " ('treasure', 0.45083371313801535),\n", + " ('presents', 0.44973793178615257),\n", + " ('years', 0.44919197032104968),\n", + " ('chief', 0.44895022004790319),\n", + " ('shadows', 0.44802472252696035),\n", + " ('closely', 0.44701411102103689),\n", + " ('segments', 0.44701411102103689),\n", + " ('lose', 0.44658335503763702),\n", + " ('caine', 0.44628710262841953),\n", + " ('caught', 0.44610275383999071),\n", + " ('hamlet', 0.44558510189758965),\n", + " ('chinese', 0.44507424620321018),\n", + " ('welcome', 0.44438052435783792),\n", + " ('birth', 0.44368632092836219),\n", + " ('represents', 0.44320543609101143),\n", + " ('puts', 0.44279106572085081),\n", + " ('fame', 0.44183275227903923),\n", + " ('closer', 0.44183275227903923),\n", + " ('visuals', 0.44183275227903923),\n", + " ('web', 0.44183275227903923),\n", + " ('criminal', 0.4412745608048752),\n", + " ('minor', 0.4409224199448939),\n", + " ('jon', 0.44086703515908027),\n", + " ('liked', 0.44074991514020723),\n", + " ('restaurant', 0.44031183943833246),\n", + " ('flaws', 0.43983275161237217),\n", + " ('de', 0.43983275161237217),\n", + " ('searching', 0.4393666597838457),\n", + " ('rap', 0.43891304217570443),\n", + " ('light', 0.43884433018199892),\n", + " ('elizabeth', 0.43872232986464677),\n", + " ('marry', 0.43861731542506488),\n", + " ('oz', 0.43825493093115531),\n", + " ('controversial', 0.43825493093115531),\n", + " ('learned', 0.43825493093115531),\n", + " ('slowly', 0.43785660389939979),\n", + " ('bridge', 0.43721380642274466),\n", + " ('thrilling', 0.43721380642274466),\n", + " ('wayne', 0.43721380642274466),\n", + " ('comedic', 0.43721380642274466),\n", + " ('married', 0.43658501682196887),\n", + 
" ('nazi', 0.4361020775700542),\n", + " ('murder', 0.4353180712578455),\n", + " ('physical', 0.4353180712578455),\n", + " ('johnny', 0.43483971678806865),\n", + " ('michelle', 0.43445264498141672),\n", + " ('wallace', 0.43403848055222038),\n", + " ('silent', 0.43395706390247063),\n", + " ('comedies', 0.43395706390247063),\n", + " ('played', 0.43387244114515305),\n", + " ('international', 0.43363598507486073),\n", + " ('vision', 0.43286408229627887),\n", + " ('intelligent', 0.43196704885367099),\n", + " ('shop', 0.43078291609245434),\n", + " ('also', 0.43036720209769169),\n", + " ('levels', 0.4302451371066513),\n", + " ('miss', 0.43006426712153217),\n", + " ('ocean', 0.4295626596872249),\n", + " ...]" + ] + }, + "execution_count": 113, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"POSITIVE\" label\n", + "pos_neg_ratios.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 114, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('boll', -4.0778152602708904),\n", + " ('uwe', -3.9218753018711578),\n", + " ('seagal', -3.3202501058581921),\n", + " ('unwatchable', -3.0269848170580955),\n", + " ('stinker', -2.9876839403711624),\n", + " ('mst', -2.7753833211707968),\n", + " ('incoherent', -2.7641396677532537),\n", + " ('unfunny', -2.5545257844967644),\n", + " ('waste', -2.4907515123361046),\n", + " ('blah', -2.4475792789485005),\n", + " ('horrid', -2.3715779644809971),\n", + " ('pointless', -2.3451073877136341),\n", + " ('atrocious', -2.3187369339642556),\n", + " ('redeeming', -2.2667790015910296),\n", + " ('prom', -2.2601040980178784),\n", + " ('drivel', -2.2476029585766928),\n", + " ('lousy', -2.2118080125207054),\n", + " ('worst', -2.1930856334332267),\n", + " ('laughable', -2.172468615469592),\n", + " ('awful', -2.1385076866397488),\n", + " ('poorly', -2.1326133844207011),\n", + " ('wasting', -2.1178155545614512),\n", + " 
('remotely', -2.111046881095167),\n", + " ('existent', -2.0024805005437076),\n", + " ('boredom', -1.9241486572738005),\n", + " ('miserably', -1.9216610938019989),\n", + " ('sucks', -1.9166645809588516),\n", + " ('uninspired', -1.9131499212248517),\n", + " ('lame', -1.9117232884159072),\n", + " ('insult', -1.9085323769376259)]" + ] + }, + "execution_count": 114, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"NEGATIVE\" label\n", + "list(reversed(pos_neg_ratios.most_common()))[0:30]" + ] + }, + { + "cell_type": "code", + "execution_count": 115, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "
\n", + " \n", + " Loading BokehJS ...\n", + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "application/javascript": [ + "\n", + "(function(global) {\n", + " function now() {\n", + " return new Date();\n", + " }\n", + "\n", + " var force = \"1\";\n", + "\n", + " if (typeof (window._bokeh_onload_callbacks) === \"undefined\" || force !== \"\") {\n", + " window._bokeh_onload_callbacks = [];\n", + " window._bokeh_is_loading = undefined;\n", + " }\n", + "\n", + "\n", + " \n", + " if (typeof (window._bokeh_timeout) === \"undefined\" || force !== \"\") {\n", + " window._bokeh_timeout = Date.now() + 5000;\n", + " window._bokeh_failed_load = false;\n", + " }\n", + "\n", + " var NB_LOAD_WARNING = {'data': {'text/html':\n", + " \"
\\n\"+\n", + " \"

\\n\"+\n", + " \"BokehJS does not appear to have successfully loaded. If loading BokehJS from CDN, this \\n\"+\n", + " \"may be due to a slow or bad network connection. Possible fixes:\\n\"+\n", + " \"

\\n\"+\n", + " \"
    \\n\"+\n", + " \"
  • re-rerun `output_notebook()` to attempt to load from CDN again, or
  • \\n\"+\n", + " \"
  • use INLINE resources instead, as so:
  • \\n\"+\n", + " \"
\\n\"+\n", + " \"\\n\"+\n", + " \"from bokeh.resources import INLINE\\n\"+\n", + " \"output_notebook(resources=INLINE)\\n\"+\n", + " \"\\n\"+\n", + " \"
\"}};\n", + "\n", + " function display_loaded() {\n", + " if (window.Bokeh !== undefined) {\n", + " Bokeh.$(\"#fcba94a8-578e-4e33-ab7d-b09fc2376af8\").text(\"BokehJS successfully loaded.\");\n", + " } else if (Date.now() < window._bokeh_timeout) {\n", + " setTimeout(display_loaded, 100)\n", + " }\n", + " }\n", + "\n", + " function run_callbacks() {\n", + " window._bokeh_onload_callbacks.forEach(function(callback) { callback() });\n", + " delete window._bokeh_onload_callbacks\n", + " console.info(\"Bokeh: all callbacks have finished\");\n", + " }\n", + "\n", + " function load_libs(js_urls, callback) {\n", + " window._bokeh_onload_callbacks.push(callback);\n", + " if (window._bokeh_is_loading > 0) {\n", + " console.log(\"Bokeh: BokehJS is being loaded, scheduling callback at\", now());\n", + " return null;\n", + " }\n", + " if (js_urls == null || js_urls.length === 0) {\n", + " run_callbacks();\n", + " return null;\n", + " }\n", + " console.log(\"Bokeh: BokehJS not loaded, scheduling load and callback at\", now());\n", + " window._bokeh_is_loading = js_urls.length;\n", + " for (var i = 0; i < js_urls.length; i++) {\n", + " var url = js_urls[i];\n", + " var s = document.createElement('script');\n", + " s.src = url;\n", + " s.async = false;\n", + " s.onreadystatechange = s.onload = function() {\n", + " window._bokeh_is_loading--;\n", + " if (window._bokeh_is_loading === 0) {\n", + " console.log(\"Bokeh: all BokehJS libraries loaded\");\n", + " run_callbacks()\n", + " }\n", + " };\n", + " s.onerror = function() {\n", + " console.warn(\"failed to load library \" + url);\n", + " };\n", + " console.log(\"Bokeh: injecting script tag for BokehJS library: \", url);\n", + " document.getElementsByTagName(\"head\")[0].appendChild(s);\n", + " }\n", + " };var element = document.getElementById(\"fcba94a8-578e-4e33-ab7d-b09fc2376af8\");\n", + " if (element == null) {\n", + " console.log(\"Bokeh: ERROR: autoload.js configured with elementid 'fcba94a8-578e-4e33-ab7d-b09fc2376af8' but 
no matching script tag was found. \")\n", + " return false;\n", + " }\n", + "\n", + " var js_urls = ['https://cdn.pydata.org/bokeh/release/bokeh-0.12.2.min.js', 'https://cdn.pydata.org/bokeh/release/bokeh-widgets-0.12.2.min.js', 'https://cdn.pydata.org/bokeh/release/bokeh-compiler-0.12.2.min.js'];\n", + "\n", + " var inline_js = [\n", + " function(Bokeh) {\n", + " Bokeh.set_log_level(\"info\");\n", + " },\n", + " \n", + " function(Bokeh) {\n", + " \n", + " Bokeh.$(\"#fcba94a8-578e-4e33-ab7d-b09fc2376af8\").text(\"BokehJS is loading...\");\n", + " },\n", + " function(Bokeh) {\n", + " console.log(\"Bokeh: injecting CSS: https://cdn.pydata.org/bokeh/release/bokeh-0.12.2.min.css\");\n", + " Bokeh.embed.inject_css(\"https://cdn.pydata.org/bokeh/release/bokeh-0.12.2.min.css\");\n", + " console.log(\"Bokeh: injecting CSS: https://cdn.pydata.org/bokeh/release/bokeh-widgets-0.12.2.min.css\");\n", + " Bokeh.embed.inject_css(\"https://cdn.pydata.org/bokeh/release/bokeh-widgets-0.12.2.min.css\");\n", + " }\n", + " ];\n", + "\n", + " function run_inline_js() {\n", + " \n", + " if ((window.Bokeh !== undefined) || (force === \"1\")) {\n", + " for (var i = 0; i < inline_js.length; i++) {\n", + " inline_js[i](window.Bokeh);\n", + " }if (force === \"1\") {\n", + " display_loaded();\n", + " }} else if (Date.now() < window._bokeh_timeout) {\n", + " setTimeout(run_inline_js, 100);\n", + " } else if (!window._bokeh_failed_load) {\n", + " console.log(\"Bokeh: BokehJS failed to load within specified timeout.\");\n", + " window._bokeh_failed_load = true;\n", + " } else if (!force) {\n", + " var cell = $(\"#fcba94a8-578e-4e33-ab7d-b09fc2376af8\").parents('.cell').data().cell;\n", + " cell.output_area.append_execute_result(NB_LOAD_WARNING)\n", + " }\n", + "\n", + " }\n", + "\n", + " if (window._bokeh_is_loading === 0) {\n", + " console.log(\"Bokeh: BokehJS loaded, going straight to plotting\");\n", + " run_inline_js();\n", + " } else {\n", + " load_libs(js_urls, function() {\n", + " 
console.log(\"Bokeh: BokehJS plotting callback run at\", now());\n", + " run_inline_js();\n", + " });\n", + " }\n", + "}(this));" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "from bokeh.models import ColumnDataSource, LabelSet\n", + "from bokeh.plotting import figure, show, output_file\n", + "from bokeh.io import output_notebook\n", + "output_notebook()" + ] + }, + { + "cell_type": "code", + "execution_count": 116, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "
\n", + "
\n", + "
\n", + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "hist, edges = np.histogram(list(map(lambda x:x[1],pos_neg_ratios.most_common())), density=True, bins=100, normed=True)\n", + "\n", + "p = figure(tools=\"pan,wheel_zoom,reset,save\",\n", + " toolbar_location=\"above\",\n", + " title=\"Word Positive/Negative Affinity Distribution\")\n", + "p.quad(top=hist, bottom=0, left=edges[:-1], right=edges[1:], line_color=\"#555555\")\n", + "show(p)" + ] + }, + { + "cell_type": "code", + "execution_count": 117, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "frequency_frequency = Counter()\n", + "\n", + "for word, cnt in total_counts.most_common():\n", + " frequency_frequency[cnt] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 118, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "
\n", + "
\n", + "
\n", + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "hist, edges = np.histogram(list(map(lambda x:x[1],frequency_frequency.most_common())), density=True, bins=100, normed=True)\n", + "\n", + "p = figure(tools=\"pan,wheel_zoom,reset,save\",\n", + " toolbar_location=\"above\",\n", + " title=\"The frequency distribution of the words in our corpus\")\n", + "p.quad(top=hist, bottom=0, left=edges[:-1], right=edges[1:], line_color=\"#555555\")\n", + "show(p)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Reducing Noise by Strategically Reducing the Vocabulary" + ] + }, + { + "cell_type": "code", + "execution_count": 122, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,min_count = 10,polarity_cutoff = 0.1,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews, polarity_cutoff, min_count)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self,reviews, polarity_cutoff,min_count):\n", + " \n", + " positive_counts = Counter()\n", + " negative_counts = Counter()\n", + " total_counts = Counter()\n", + "\n", + " for i in range(len(reviews)):\n", + " if(labels[i] == 'POSITIVE'):\n", + " for word in reviews[i].split(\" \"):\n", + " positive_counts[word] += 1\n", + " total_counts[word] += 1\n", + " else:\n", + " for word in reviews[i].split(\" \"):\n", + " negative_counts[word] += 1\n", + " total_counts[word] += 1\n", + "\n", + " pos_neg_ratios = Counter()\n", + "\n", + " for term,cnt in list(total_counts.most_common()):\n", + " if(cnt >= 50):\n", + " pos_neg_ratio = positive_counts[term] / float(negative_counts[term]+1)\n", + " 
pos_neg_ratios[term] = pos_neg_ratio\n", + "\n", + " for word,ratio in pos_neg_ratios.most_common():\n", + " if(ratio > 1):\n", + " pos_neg_ratios[word] = np.log(ratio)\n", + " else:\n", + " pos_neg_ratios[word] = -np.log((1 / (ratio + 0.01)))\n", + " \n", + " review_vocab = set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " if(total_counts[word] > min_count):\n", + " if(word in pos_neg_ratios.keys()):\n", + " if((pos_neg_ratios[word] >= polarity_cutoff) or (pos_neg_ratios[word] <= -polarity_cutoff)):\n", + " review_vocab.add(word)\n", + " else:\n", + " review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " self.layer_1 = np.zeros((1,hidden_nodes))\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def 
sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " self.layer_0[0][self.word2index[word]] = 1\n", + "\n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def train(self, training_reviews_raw, training_labels):\n", + " \n", + " training_reviews = list()\n", + " for review in training_reviews_raw:\n", + " indices = set()\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " indices.add(self.word2index[word])\n", + " training_reviews.append(list(indices))\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + "\n", + " # Hidden layer\n", + "# layer_1 = self.layer_0.dot(self.weights_0_1)\n", + " self.layer_1 *= 0\n", + " for index in review:\n", + " self.layer_1 += self.weights_0_1[index]\n", + " \n", + " # Output layer\n", + " layer_2 = self.sigmoid(self.layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward pass ###\n", + "\n", + " # Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error is the difference between desired target and actual output.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", 
+ " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # Update the weights\n", + " self.weights_1_2 -= self.layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " \n", + " for index in review:\n", + " self.weights_0_1[index] -= layer_1_delta[0] * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(layer_2 >= 0.5 and label == 'POSITIVE'):\n", + " correct_so_far += 1\n", + " if(layer_2 < 0.5 and label == 'NEGATIVE'):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / float(i+1))[:4] + \"%\")\n", + " \n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \"% #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + "\n", + "\n", + " # Hidden layer\n", + " self.layer_1 *= 0\n", + " unique_indices = set()\n", + " for word in review.lower().split(\" \"):\n", + " if word in self.word2index.keys():\n", + " unique_indices.add(self.word2index[word])\n", + " 
for index in unique_indices:\n", + " self.layer_1 += self.weights_0_1[index]\n", + " \n", + " # Output layer\n", + " layer_2 = self.sigmoid(self.layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] >= 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 123, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000],min_count=20,polarity_cutoff=0.05,learning_rate=0.01)" + ] + }, + { + "cell_type": "code", + "execution_count": 124, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):1371. #Correct:20461 #Trained:24000 Training Accuracy:85.2%" + ] + } + ], + "source": [ + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 125, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):1708.% #Correct:859 #Tested:1000 Testing Accuracy:85.9%" + ] + } + ], + "source": [ + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "code", + "execution_count": 126, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000],min_count=20,polarity_cutoff=0.8,learning_rate=0.01)" + ] + }, + { + "cell_type": "code", + "execution_count": 127, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):7089. 
#Correct:20552 #Trained:24000 Training Accuracy:85.6%" + ] + } + ], + "source": [ + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 128, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\r", + "Progress:0.0% Speed(reviews/sec):0.0% #Correct:0 #Tested:1 Testing Accuracy:0.0%\r", + "Progress:0.1% Speed(reviews/sec):2123.% #Correct:1 #Tested:2 Testing Accuracy:50.0%\r", + "Progress:0.2% Speed(reviews/sec):3623.% #Correct:2 #Tested:3 Testing Accuracy:66.6%\r", + "Progress:0.3% Speed(reviews/sec):4477.% #Correct:3 #Tested:4 Testing Accuracy:75.0%\r", + "Progress:0.4% Speed(reviews/sec):5488.% #Correct:3 #Tested:5 Testing Accuracy:60.0%\r", + "Progress:0.5% Speed(reviews/sec):5995.% #Correct:4 #Tested:6 Testing Accuracy:66.6%\r", + "Progress:0.6% Speed(reviews/sec):5698.% #Correct:5 #Tested:7 Testing Accuracy:71.4%\r", + "Progress:0.7% Speed(reviews/sec):5448.% #Correct:6 #Tested:8 Testing Accuracy:75.0%\r", + "Progress:0.8% Speed(reviews/sec):5041.% #Correct:7 #Tested:9 Testing Accuracy:77.7%\r", + "Progress:0.9% Speed(reviews/sec):2163.% #Correct:8 #Tested:10 Testing Accuracy:80.0%\r", + "Progress:1.0% Speed(reviews/sec):2218.% #Correct:9 #Tested:11 Testing Accuracy:81.8%\r", + "Progress:1.1% Speed(reviews/sec):2356.% #Correct:10 #Tested:12 Testing Accuracy:83.3%\r", + "Progress:1.2% Speed(reviews/sec):2496.% #Correct:10 #Tested:13 Testing Accuracy:76.9%\r", + "Progress:1.3% Speed(reviews/sec):2617.% #Correct:11 #Tested:14 Testing Accuracy:78.5%\r", + "Progress:1.4% Speed(reviews/sec):2746.% #Correct:11 #Tested:15 Testing Accuracy:73.3%\r", + "Progress:1.5% Speed(reviews/sec):2835.% #Correct:12 #Tested:16 Testing Accuracy:75.0%\r", + "Progress:1.6% Speed(reviews/sec):2812.% #Correct:13 #Tested:17 Testing Accuracy:76.4%\r", + "Progress:1.7% Speed(reviews/sec):2891.% #Correct:14 #Tested:18 Testing Accuracy:77.7%\r", + "Progress:1.8% 
Speed(reviews/sec):3022.% #Correct:15 #Tested:19 Testing Accuracy:78.9%\r", + "Progress:1.9% Speed(reviews/sec):3096.% #Correct:16 #Tested:20 Testing Accuracy:80.0%\r", + "Progress:2.0% Speed(reviews/sec):3125.% #Correct:17 #Tested:21 Testing Accuracy:80.9%\r", + "Progress:2.1% Speed(reviews/sec):3218.% #Correct:18 #Tested:22 Testing Accuracy:81.8%\r", + "Progress:2.2% Speed(reviews/sec):3214.% #Correct:19 #Tested:23 Testing Accuracy:82.6%\r", + "Progress:2.3% Speed(reviews/sec):3205.% #Correct:20 #Tested:24 Testing Accuracy:83.3%\r", + "Progress:2.4% Speed(reviews/sec):3137.% #Correct:21 #Tested:25 Testing Accuracy:84.0%\r", + "Progress:2.5% Speed(reviews/sec):3207.% #Correct:22 #Tested:26 Testing Accuracy:84.6%\r", + "Progress:2.6% Speed(reviews/sec):3265.% #Correct:22 #Tested:27 Testing Accuracy:81.4%\r", + "Progress:2.7% Speed(reviews/sec):3164.% #Correct:23 #Tested:28 Testing Accuracy:82.1%\r", + "Progress:2.8% Speed(reviews/sec):3221.% #Correct:24 #Tested:29 Testing Accuracy:82.7%\r", + "Progress:2.9% Speed(reviews/sec):3264.% #Correct:25 #Tested:30 Testing Accuracy:83.3%\r", + "Progress:3.0% Speed(reviews/sec):3286.% #Correct:25 #Tested:31 Testing Accuracy:80.6%\r", + "Progress:3.1% Speed(reviews/sec):3357.% #Correct:26 #Tested:32 Testing Accuracy:81.2%\r", + "Progress:3.2% Speed(reviews/sec):3401.% #Correct:27 #Tested:33 Testing Accuracy:81.8%\r", + "Progress:3.3% Speed(reviews/sec):3443.% #Correct:28 #Tested:34 Testing Accuracy:82.3%\r", + "Progress:3.4% Speed(reviews/sec):3388.% #Correct:29 #Tested:35 Testing Accuracy:82.8%\r", + "Progress:3.5% Speed(reviews/sec):3348.% #Correct:30 #Tested:36 Testing Accuracy:83.3%\r", + "Progress:3.6% Speed(reviews/sec):3013.% #Correct:31 #Tested:37 Testing Accuracy:83.7%\r", + "Progress:3.7% Speed(reviews/sec):3018.% #Correct:32 #Tested:38 Testing Accuracy:84.2%\r", + "Progress:3.8% Speed(reviews/sec):2956.% #Correct:33 #Tested:39 Testing Accuracy:84.6%\r", + "Progress:3.9% Speed(reviews/sec):2880.% #Correct:33 
#Tested:40 Testing Accuracy:82.5%\r", + "[progress log condensed: testing continued from Progress:4.0% through Progress:47.6%, Speed(reviews/sec) ranging roughly 2500-4500, Testing Accuracy fluctuating between about 83% and 88%]\r", + "Progress:47.7% Speed(reviews/sec):3815.% #Correct:411
#Tested:478 Testing Accuracy:85.9%\r", + "Progress:47.8% Speed(reviews/sec):3804.% #Correct:412 #Tested:479 Testing Accuracy:86.0%\r", + "Progress:47.9% Speed(reviews/sec):3803.% #Correct:413 #Tested:480 Testing Accuracy:86.0%\r", + "Progress:48.0% Speed(reviews/sec):3767.% #Correct:414 #Tested:481 Testing Accuracy:86.0%\r", + "Progress:48.1% Speed(reviews/sec):3736.% #Correct:415 #Tested:482 Testing Accuracy:86.0%\r", + "Progress:48.2% Speed(reviews/sec):3737.% #Correct:416 #Tested:483 Testing Accuracy:86.1%\r", + "Progress:48.3% Speed(reviews/sec):3737.% #Correct:417 #Tested:484 Testing Accuracy:86.1%\r", + "Progress:48.4% Speed(reviews/sec):3731.% #Correct:418 #Tested:485 Testing Accuracy:86.1%\r", + "Progress:48.5% Speed(reviews/sec):3726.% #Correct:419 #Tested:486 Testing Accuracy:86.2%\r", + "Progress:48.6% Speed(reviews/sec):3730.% #Correct:420 #Tested:487 Testing Accuracy:86.2%\r", + "Progress:48.7% Speed(reviews/sec):3735.% #Correct:420 #Tested:488 Testing Accuracy:86.0%\r", + "Progress:48.8% Speed(reviews/sec):3738.% #Correct:421 #Tested:489 Testing Accuracy:86.0%\r", + "Progress:48.9% Speed(reviews/sec):3735.% #Correct:421 #Tested:490 Testing Accuracy:85.9%\r", + "Progress:49.0% Speed(reviews/sec):3738.% #Correct:422 #Tested:491 Testing Accuracy:85.9%\r", + "Progress:49.1% Speed(reviews/sec):3743.% #Correct:423 #Tested:492 Testing Accuracy:85.9%\r", + "Progress:49.2% Speed(reviews/sec):3745.% #Correct:424 #Tested:493 Testing Accuracy:86.0%\r", + "Progress:49.3% Speed(reviews/sec):3746.% #Correct:425 #Tested:494 Testing Accuracy:86.0%\r", + "Progress:49.4% Speed(reviews/sec):3750.% #Correct:426 #Tested:495 Testing Accuracy:86.0%\r", + "Progress:49.5% Speed(reviews/sec):3752.% #Correct:427 #Tested:496 Testing Accuracy:86.0%\r", + "Progress:49.6% Speed(reviews/sec):3756.% #Correct:428 #Tested:497 Testing Accuracy:86.1%\r", + "Progress:49.7% Speed(reviews/sec):3761.% #Correct:428 #Tested:498 Testing Accuracy:85.9%\r", + "Progress:49.8% 
Speed(reviews/sec):3759.% #Correct:429 #Tested:499 Testing Accuracy:85.9%\r", + "Progress:49.9% Speed(reviews/sec):3761.% #Correct:430 #Tested:500 Testing Accuracy:86.0%\r", + "Progress:50.0% Speed(reviews/sec):3767.% #Correct:431 #Tested:501 Testing Accuracy:86.0%\r", + "Progress:50.1% Speed(reviews/sec):3764.% #Correct:432 #Tested:502 Testing Accuracy:86.0%\r", + "Progress:50.2% Speed(reviews/sec):3766.% #Correct:433 #Tested:503 Testing Accuracy:86.0%\r", + "Progress:50.3% Speed(reviews/sec):3769.% #Correct:434 #Tested:504 Testing Accuracy:86.1%\r", + "Progress:50.4% Speed(reviews/sec):3772.% #Correct:434 #Tested:505 Testing Accuracy:85.9%\r", + "Progress:50.5% Speed(reviews/sec):3776.% #Correct:435 #Tested:506 Testing Accuracy:85.9%\r", + "Progress:50.6% Speed(reviews/sec):3772.% #Correct:436 #Tested:507 Testing Accuracy:85.9%\r", + "Progress:50.7% Speed(reviews/sec):3762.% #Correct:437 #Tested:508 Testing Accuracy:86.0%\r", + "Progress:50.8% Speed(reviews/sec):3766.% #Correct:438 #Tested:509 Testing Accuracy:86.0%\r", + "Progress:50.9% Speed(reviews/sec):3771.% #Correct:439 #Tested:510 Testing Accuracy:86.0%\r", + "Progress:51.0% Speed(reviews/sec):3756.% #Correct:440 #Tested:511 Testing Accuracy:86.1%\r", + "Progress:51.1% Speed(reviews/sec):3759.% #Correct:441 #Tested:512 Testing Accuracy:86.1%\r", + "Progress:51.2% Speed(reviews/sec):3760.% #Correct:442 #Tested:513 Testing Accuracy:86.1%\r", + "Progress:51.3% Speed(reviews/sec):3765.% #Correct:443 #Tested:514 Testing Accuracy:86.1%\r", + "Progress:51.4% Speed(reviews/sec):3767.% #Correct:444 #Tested:515 Testing Accuracy:86.2%\r", + "Progress:51.5% Speed(reviews/sec):3769.% #Correct:445 #Tested:516 Testing Accuracy:86.2%\r", + "Progress:51.6% Speed(reviews/sec):3769.% #Correct:446 #Tested:517 Testing Accuracy:86.2%\r", + "Progress:51.7% Speed(reviews/sec):3773.% #Correct:447 #Tested:518 Testing Accuracy:86.2%\r", + "Progress:51.8% Speed(reviews/sec):3776.% #Correct:447 #Tested:519 Testing Accuracy:86.1%\r", + 
"Progress:51.9% Speed(reviews/sec):3774.% #Correct:448 #Tested:520 Testing Accuracy:86.1%\r", + "Progress:52.0% Speed(reviews/sec):3776.% #Correct:449 #Tested:521 Testing Accuracy:86.1%\r", + "Progress:52.1% Speed(reviews/sec):3774.% #Correct:450 #Tested:522 Testing Accuracy:86.2%\r", + "Progress:52.2% Speed(reviews/sec):3774.% #Correct:451 #Tested:523 Testing Accuracy:86.2%\r", + "Progress:52.3% Speed(reviews/sec):3777.% #Correct:452 #Tested:524 Testing Accuracy:86.2%\r", + "Progress:52.4% Speed(reviews/sec):3779.% #Correct:453 #Tested:525 Testing Accuracy:86.2%\r", + "Progress:52.5% Speed(reviews/sec):3781.% #Correct:454 #Tested:526 Testing Accuracy:86.3%\r", + "Progress:52.6% Speed(reviews/sec):3785.% #Correct:455 #Tested:527 Testing Accuracy:86.3%\r", + "Progress:52.7% Speed(reviews/sec):3788.% #Correct:455 #Tested:528 Testing Accuracy:86.1%\r", + "Progress:52.8% Speed(reviews/sec):3788.% #Correct:455 #Tested:529 Testing Accuracy:86.0%\r", + "Progress:52.9% Speed(reviews/sec):3791.% #Correct:456 #Tested:530 Testing Accuracy:86.0%\r", + "Progress:53.0% Speed(reviews/sec):3792.% #Correct:457 #Tested:531 Testing Accuracy:86.0%\r", + "Progress:53.1% Speed(reviews/sec):3795.% #Correct:457 #Tested:532 Testing Accuracy:85.9%\r", + "Progress:53.2% Speed(reviews/sec):3800.% #Correct:458 #Tested:533 Testing Accuracy:85.9%\r", + "Progress:53.3% Speed(reviews/sec):3803.% #Correct:459 #Tested:534 Testing Accuracy:85.9%\r", + "Progress:53.4% Speed(reviews/sec):3807.% #Correct:460 #Tested:535 Testing Accuracy:85.9%\r", + "Progress:53.5% Speed(reviews/sec):3811.% #Correct:461 #Tested:536 Testing Accuracy:86.0%\r", + "Progress:53.6% Speed(reviews/sec):3815.% #Correct:461 #Tested:537 Testing Accuracy:85.8%\r", + "Progress:53.7% Speed(reviews/sec):3816.% #Correct:462 #Tested:538 Testing Accuracy:85.8%\r", + "Progress:53.8% Speed(reviews/sec):3816.% #Correct:463 #Tested:539 Testing Accuracy:85.8%\r", + "Progress:53.9% Speed(reviews/sec):3816.% #Correct:464 #Tested:540 Testing 
Accuracy:85.9%\r", + "Progress:54.0% Speed(reviews/sec):3813.% #Correct:465 #Tested:541 Testing Accuracy:85.9%\r", + "Progress:54.1% Speed(reviews/sec):3815.% #Correct:466 #Tested:542 Testing Accuracy:85.9%\r", + "Progress:54.2% Speed(reviews/sec):3813.% #Correct:467 #Tested:543 Testing Accuracy:86.0%\r", + "Progress:54.3% Speed(reviews/sec):3816.% #Correct:468 #Tested:544 Testing Accuracy:86.0%\r", + "Progress:54.4% Speed(reviews/sec):3817.% #Correct:468 #Tested:545 Testing Accuracy:85.8%\r", + "Progress:54.5% Speed(reviews/sec):3819.% #Correct:469 #Tested:546 Testing Accuracy:85.8%\r", + "Progress:54.6% Speed(reviews/sec):3818.% #Correct:469 #Tested:547 Testing Accuracy:85.7%\r", + "Progress:54.7% Speed(reviews/sec):3820.% #Correct:470 #Tested:548 Testing Accuracy:85.7%\r", + "Progress:54.8% Speed(reviews/sec):3825.% #Correct:471 #Tested:549 Testing Accuracy:85.7%\r", + "Progress:54.9% Speed(reviews/sec):3829.% #Correct:472 #Tested:550 Testing Accuracy:85.8%\r", + "Progress:55.0% Speed(reviews/sec):3833.% #Correct:473 #Tested:551 Testing Accuracy:85.8%\r", + "Progress:55.1% Speed(reviews/sec):3835.% #Correct:474 #Tested:552 Testing Accuracy:85.8%\r", + "Progress:55.2% Speed(reviews/sec):3836.% #Correct:475 #Tested:553 Testing Accuracy:85.8%\r", + "Progress:55.3% Speed(reviews/sec):3836.% #Correct:476 #Tested:554 Testing Accuracy:85.9%\r", + "Progress:55.4% Speed(reviews/sec):3827.% #Correct:477 #Tested:555 Testing Accuracy:85.9%\r", + "Progress:55.5% Speed(reviews/sec):3826.% #Correct:478 #Tested:556 Testing Accuracy:85.9%\r", + "Progress:55.6% Speed(reviews/sec):3823.% #Correct:479 #Tested:557 Testing Accuracy:85.9%\r", + "Progress:55.7% Speed(reviews/sec):3822.% #Correct:480 #Tested:558 Testing Accuracy:86.0%\r", + "Progress:55.8% Speed(reviews/sec):3821.% #Correct:480 #Tested:559 Testing Accuracy:85.8%\r", + "Progress:55.9% Speed(reviews/sec):3825.% #Correct:481 #Tested:560 Testing Accuracy:85.8%\r", + "Progress:56.0% Speed(reviews/sec):3829.% #Correct:482 
#Tested:561 Testing Accuracy:85.9%\r", + "Progress:56.1% Speed(reviews/sec):3833.% #Correct:483 #Tested:562 Testing Accuracy:85.9%\r", + "Progress:56.2% Speed(reviews/sec):3834.% #Correct:484 #Tested:563 Testing Accuracy:85.9%\r", + "Progress:56.3% Speed(reviews/sec):3838.% #Correct:485 #Tested:564 Testing Accuracy:85.9%\r", + "Progress:56.4% Speed(reviews/sec):3838.% #Correct:486 #Tested:565 Testing Accuracy:86.0%\r", + "Progress:56.5% Speed(reviews/sec):3843.% #Correct:487 #Tested:566 Testing Accuracy:86.0%\r", + "Progress:56.6% Speed(reviews/sec):3846.% #Correct:488 #Tested:567 Testing Accuracy:86.0%\r", + "Progress:56.7% Speed(reviews/sec):3849.% #Correct:489 #Tested:568 Testing Accuracy:86.0%\r", + "Progress:56.8% Speed(reviews/sec):3853.% #Correct:490 #Tested:569 Testing Accuracy:86.1%\r", + "Progress:56.9% Speed(reviews/sec):3855.% #Correct:491 #Tested:570 Testing Accuracy:86.1%\r", + "Progress:57.0% Speed(reviews/sec):3851.% #Correct:492 #Tested:571 Testing Accuracy:86.1%\r", + "Progress:57.1% Speed(reviews/sec):3850.% #Correct:493 #Tested:572 Testing Accuracy:86.1%\r", + "Progress:57.2% Speed(reviews/sec):3851.% #Correct:493 #Tested:573 Testing Accuracy:86.0%\r", + "Progress:57.3% Speed(reviews/sec):3854.% #Correct:493 #Tested:574 Testing Accuracy:85.8%\r", + "Progress:57.4% Speed(reviews/sec):3853.% #Correct:494 #Tested:575 Testing Accuracy:85.9%\r", + "Progress:57.5% Speed(reviews/sec):3854.% #Correct:495 #Tested:576 Testing Accuracy:85.9%\r", + "Progress:57.6% Speed(reviews/sec):3855.% #Correct:496 #Tested:577 Testing Accuracy:85.9%\r", + "Progress:57.7% Speed(reviews/sec):3857.% #Correct:497 #Tested:578 Testing Accuracy:85.9%\r", + "Progress:57.8% Speed(reviews/sec):3851.% #Correct:498 #Tested:579 Testing Accuracy:86.0%\r", + "Progress:57.9% Speed(reviews/sec):3853.% #Correct:499 #Tested:580 Testing Accuracy:86.0%\r", + "Progress:58.0% Speed(reviews/sec):3853.% #Correct:500 #Tested:581 Testing Accuracy:86.0%\r", + "Progress:58.1% 
Speed(reviews/sec):3855.% #Correct:501 #Tested:582 Testing Accuracy:86.0%\r", + "Progress:58.2% Speed(reviews/sec):3858.% #Correct:502 #Tested:583 Testing Accuracy:86.1%\r", + "Progress:58.3% Speed(reviews/sec):3861.% #Correct:503 #Tested:584 Testing Accuracy:86.1%\r", + "Progress:58.4% Speed(reviews/sec):3864.% #Correct:504 #Tested:585 Testing Accuracy:86.1%\r", + "Progress:58.5% Speed(reviews/sec):3868.% #Correct:505 #Tested:586 Testing Accuracy:86.1%\r", + "Progress:58.6% Speed(reviews/sec):3872.% #Correct:506 #Tested:587 Testing Accuracy:86.2%\r", + "Progress:58.7% Speed(reviews/sec):3875.% #Correct:507 #Tested:588 Testing Accuracy:86.2%\r", + "Progress:58.8% Speed(reviews/sec):3880.% #Correct:508 #Tested:589 Testing Accuracy:86.2%\r", + "Progress:58.9% Speed(reviews/sec):3884.% #Correct:509 #Tested:590 Testing Accuracy:86.2%\r", + "Progress:59.0% Speed(reviews/sec):3887.% #Correct:510 #Tested:591 Testing Accuracy:86.2%\r", + "Progress:59.1% Speed(reviews/sec):3891.% #Correct:511 #Tested:592 Testing Accuracy:86.3%\r", + "Progress:59.2% Speed(reviews/sec):3890.% #Correct:511 #Tested:593 Testing Accuracy:86.1%\r", + "Progress:59.3% Speed(reviews/sec):3893.% #Correct:512 #Tested:594 Testing Accuracy:86.1%\r", + "Progress:59.4% Speed(reviews/sec):3895.% #Correct:513 #Tested:595 Testing Accuracy:86.2%\r", + "Progress:59.5% Speed(reviews/sec):3872.% #Correct:514 #Tested:596 Testing Accuracy:86.2%\r", + "Progress:59.6% Speed(reviews/sec):3874.% #Correct:515 #Tested:597 Testing Accuracy:86.2%\r", + "Progress:59.7% Speed(reviews/sec):3872.% #Correct:516 #Tested:598 Testing Accuracy:86.2%\r", + "Progress:59.8% Speed(reviews/sec):3874.% #Correct:516 #Tested:599 Testing Accuracy:86.1%\r", + "Progress:59.9% Speed(reviews/sec):3878.% #Correct:517 #Tested:600 Testing Accuracy:86.1%\r", + "Progress:60.0% Speed(reviews/sec):3878.% #Correct:517 #Tested:601 Testing Accuracy:86.0%\r", + "Progress:60.1% Speed(reviews/sec):3882.% #Correct:518 #Tested:602 Testing Accuracy:86.0%\r", + 
"Progress:60.2% Speed(reviews/sec):3885.% #Correct:519 #Tested:603 Testing Accuracy:86.0%\r", + "Progress:60.3% Speed(reviews/sec):3890.% #Correct:520 #Tested:604 Testing Accuracy:86.0%\r", + "Progress:60.4% Speed(reviews/sec):3893.% #Correct:521 #Tested:605 Testing Accuracy:86.1%\r", + "Progress:60.5% Speed(reviews/sec):3886.% #Correct:522 #Tested:606 Testing Accuracy:86.1%\r", + "Progress:60.6% Speed(reviews/sec):3890.% #Correct:522 #Tested:607 Testing Accuracy:85.9%\r", + "Progress:60.7% Speed(reviews/sec):3893.% #Correct:523 #Tested:608 Testing Accuracy:86.0%\r", + "Progress:60.8% Speed(reviews/sec):3897.% #Correct:524 #Tested:609 Testing Accuracy:86.0%\r", + "Progress:60.9% Speed(reviews/sec):3902.% #Correct:525 #Tested:610 Testing Accuracy:86.0%\r", + "Progress:61.0% Speed(reviews/sec):3903.% #Correct:525 #Tested:611 Testing Accuracy:85.9%\r", + "Progress:61.1% Speed(reviews/sec):3906.% #Correct:526 #Tested:612 Testing Accuracy:85.9%\r", + "Progress:61.2% Speed(reviews/sec):3911.% #Correct:527 #Tested:613 Testing Accuracy:85.9%\r", + "Progress:61.3% Speed(reviews/sec):3914.% #Correct:528 #Tested:614 Testing Accuracy:85.9%\r", + "Progress:61.4% Speed(reviews/sec):3919.% #Correct:528 #Tested:615 Testing Accuracy:85.8%\r", + "Progress:61.5% Speed(reviews/sec):3921.% #Correct:528 #Tested:616 Testing Accuracy:85.7%\r", + "Progress:61.6% Speed(reviews/sec):3920.% #Correct:529 #Tested:617 Testing Accuracy:85.7%\r", + "Progress:61.7% Speed(reviews/sec):3922.% #Correct:530 #Tested:618 Testing Accuracy:85.7%\r", + "Progress:61.8% Speed(reviews/sec):3925.% #Correct:531 #Tested:619 Testing Accuracy:85.7%\r", + "Progress:61.9% Speed(reviews/sec):3928.% #Correct:531 #Tested:620 Testing Accuracy:85.6%\r", + "Progress:62.0% Speed(reviews/sec):3928.% #Correct:531 #Tested:621 Testing Accuracy:85.5%\r", + "Progress:62.1% Speed(reviews/sec):3932.% #Correct:532 #Tested:622 Testing Accuracy:85.5%\r", + "Progress:62.2% Speed(reviews/sec):3935.% #Correct:532 #Tested:623 Testing 
Accuracy:85.3%\r", + "Progress:62.3% Speed(reviews/sec):3939.% #Correct:533 #Tested:624 Testing Accuracy:85.4%\r", + "Progress:62.4% Speed(reviews/sec):3942.% #Correct:533 #Tested:625 Testing Accuracy:85.2%\r", + "Progress:62.5% Speed(reviews/sec):3936.% #Correct:533 #Tested:626 Testing Accuracy:85.1%\r", + "Progress:62.6% Speed(reviews/sec):3937.% #Correct:533 #Tested:627 Testing Accuracy:85.0%\r", + "Progress:62.7% Speed(reviews/sec):3940.% #Correct:533 #Tested:628 Testing Accuracy:84.8%\r", + "Progress:62.8% Speed(reviews/sec):3945.% #Correct:533 #Tested:629 Testing Accuracy:84.7%\r", + "Progress:62.9% Speed(reviews/sec):3945.% #Correct:534 #Tested:630 Testing Accuracy:84.7%\r", + "Progress:63.0% Speed(reviews/sec):3947.% #Correct:534 #Tested:631 Testing Accuracy:84.6%\r", + "Progress:63.1% Speed(reviews/sec):3944.% #Correct:535 #Tested:632 Testing Accuracy:84.6%\r", + "Progress:63.2% Speed(reviews/sec):3948.% #Correct:535 #Tested:633 Testing Accuracy:84.5%\r", + "Progress:63.3% Speed(reviews/sec):3949.% #Correct:536 #Tested:634 Testing Accuracy:84.5%\r", + "Progress:63.4% Speed(reviews/sec):3948.% #Correct:536 #Tested:635 Testing Accuracy:84.4%\r", + "Progress:63.5% Speed(reviews/sec):3949.% #Correct:537 #Tested:636 Testing Accuracy:84.4%\r", + "Progress:63.6% Speed(reviews/sec):3945.% #Correct:537 #Tested:637 Testing Accuracy:84.3%\r", + "Progress:63.7% Speed(reviews/sec):3944.% #Correct:538 #Tested:638 Testing Accuracy:84.3%\r", + "Progress:63.8% Speed(reviews/sec):3946.% #Correct:539 #Tested:639 Testing Accuracy:84.3%\r", + "Progress:63.9% Speed(reviews/sec):3947.% #Correct:540 #Tested:640 Testing Accuracy:84.3%\r", + "Progress:64.0% Speed(reviews/sec):3949.% #Correct:540 #Tested:641 Testing Accuracy:84.2%\r", + "Progress:64.1% Speed(reviews/sec):3944.% #Correct:540 #Tested:642 Testing Accuracy:84.1%\r", + "Progress:64.2% Speed(reviews/sec):3943.% #Correct:541 #Tested:643 Testing Accuracy:84.1%\r", + "Progress:64.3% Speed(reviews/sec):3946.% #Correct:542 
#Tested:644 Testing Accuracy:84.1%\r", + "Progress:64.4% Speed(reviews/sec):3946.% #Correct:543 #Tested:645 Testing Accuracy:84.1%\r", + "Progress:64.5% Speed(reviews/sec):3943.% #Correct:543 #Tested:646 Testing Accuracy:84.0%\r", + "Progress:64.6% Speed(reviews/sec):3941.% #Correct:544 #Tested:647 Testing Accuracy:84.0%\r", + "Progress:64.7% Speed(reviews/sec):3944.% #Correct:545 #Tested:648 Testing Accuracy:84.1%\r", + "Progress:64.8% Speed(reviews/sec):3947.% #Correct:546 #Tested:649 Testing Accuracy:84.1%\r", + "Progress:64.9% Speed(reviews/sec):3951.% #Correct:547 #Tested:650 Testing Accuracy:84.1%\r", + "Progress:65.0% Speed(reviews/sec):3955.% #Correct:547 #Tested:651 Testing Accuracy:84.0%\r", + "Progress:65.1% Speed(reviews/sec):3958.% #Correct:548 #Tested:652 Testing Accuracy:84.0%\r", + "Progress:65.2% Speed(reviews/sec):3962.% #Correct:549 #Tested:653 Testing Accuracy:84.0%\r", + "Progress:65.3% Speed(reviews/sec):3965.% #Correct:550 #Tested:654 Testing Accuracy:84.0%\r", + "Progress:65.4% Speed(reviews/sec):3963.% #Correct:550 #Tested:655 Testing Accuracy:83.9%\r", + "Progress:65.5% Speed(reviews/sec):3964.% #Correct:551 #Tested:656 Testing Accuracy:83.9%\r", + "Progress:65.6% Speed(reviews/sec):3967.% #Correct:551 #Tested:657 Testing Accuracy:83.8%\r", + "Progress:65.7% Speed(reviews/sec):3968.% #Correct:552 #Tested:658 Testing Accuracy:83.8%\r", + "Progress:65.8% Speed(reviews/sec):3972.% #Correct:553 #Tested:659 Testing Accuracy:83.9%\r", + "Progress:65.9% Speed(reviews/sec):3974.% #Correct:554 #Tested:660 Testing Accuracy:83.9%\r", + "Progress:66.0% Speed(reviews/sec):3978.% #Correct:555 #Tested:661 Testing Accuracy:83.9%\r", + "Progress:66.1% Speed(reviews/sec):3981.% #Correct:556 #Tested:662 Testing Accuracy:83.9%\r", + "Progress:66.2% Speed(reviews/sec):3983.% #Correct:557 #Tested:663 Testing Accuracy:84.0%\r", + "Progress:66.3% Speed(reviews/sec):3986.% #Correct:557 #Tested:664 Testing Accuracy:83.8%\r", + "Progress:66.4% 
Speed(reviews/sec):3989.% #Correct:558 #Tested:665 Testing Accuracy:83.9%\r", + "Progress:66.5% Speed(reviews/sec):3993.% #Correct:559 #Tested:666 Testing Accuracy:83.9%\r", + "Progress:66.6% Speed(reviews/sec):3997.% #Correct:560 #Tested:667 Testing Accuracy:83.9%\r", + "Progress:66.7% Speed(reviews/sec):4000.% #Correct:561 #Tested:668 Testing Accuracy:83.9%\r", + "Progress:66.8% Speed(reviews/sec):4002.% #Correct:562 #Tested:669 Testing Accuracy:84.0%\r", + "Progress:66.9% Speed(reviews/sec):4005.% #Correct:562 #Tested:670 Testing Accuracy:83.8%\r", + "Progress:67.0% Speed(reviews/sec):4010.% #Correct:563 #Tested:671 Testing Accuracy:83.9%\r", + "Progress:67.1% Speed(reviews/sec):4014.% #Correct:564 #Tested:672 Testing Accuracy:83.9%\r", + "Progress:67.2% Speed(reviews/sec):4018.% #Correct:565 #Tested:673 Testing Accuracy:83.9%\r", + "Progress:67.3% Speed(reviews/sec):4020.% #Correct:566 #Tested:674 Testing Accuracy:83.9%\r", + "Progress:67.4% Speed(reviews/sec):4024.% #Correct:567 #Tested:675 Testing Accuracy:84.0%\r", + "Progress:67.5% Speed(reviews/sec):4027.% #Correct:568 #Tested:676 Testing Accuracy:84.0%\r", + "Progress:67.6% Speed(reviews/sec):4031.% #Correct:568 #Tested:677 Testing Accuracy:83.8%\r", + "Progress:67.7% Speed(reviews/sec):4033.% #Correct:568 #Tested:678 Testing Accuracy:83.7%\r", + "Progress:67.8% Speed(reviews/sec):4037.% #Correct:569 #Tested:679 Testing Accuracy:83.7%\r", + "Progress:67.9% Speed(reviews/sec):4041.% #Correct:570 #Tested:680 Testing Accuracy:83.8%\r", + "Progress:68.0% Speed(reviews/sec):4044.% #Correct:570 #Tested:681 Testing Accuracy:83.7%\r", + "Progress:68.1% Speed(reviews/sec):4041.% #Correct:571 #Tested:682 Testing Accuracy:83.7%\r", + "Progress:68.2% Speed(reviews/sec):4036.% #Correct:572 #Tested:683 Testing Accuracy:83.7%\r", + "Progress:68.3% Speed(reviews/sec):4037.% #Correct:573 #Tested:684 Testing Accuracy:83.7%\r", + "Progress:68.4% Speed(reviews/sec):4037.% #Correct:574 #Tested:685 Testing Accuracy:83.7%\r", + 
"Progress:68.5% Speed(reviews/sec):4037.% #Correct:575 #Tested:686 Testing Accuracy:83.8%\r", + "Progress:68.6% Speed(reviews/sec):4039.% #Correct:575 #Tested:687 Testing Accuracy:83.6%\r", + "Progress:68.7% Speed(reviews/sec):4041.% #Correct:576 #Tested:688 Testing Accuracy:83.7%\r", + "Progress:68.8% Speed(reviews/sec):4036.% #Correct:577 #Tested:689 Testing Accuracy:83.7%\r", + "Progress:68.9% Speed(reviews/sec):4039.% #Correct:578 #Tested:690 Testing Accuracy:83.7%\r", + "Progress:69.0% Speed(reviews/sec):4041.% #Correct:579 #Tested:691 Testing Accuracy:83.7%\r", + "Progress:69.1% Speed(reviews/sec):4043.% #Correct:580 #Tested:692 Testing Accuracy:83.8%\r", + "Progress:69.2% Speed(reviews/sec):4041.% #Correct:581 #Tested:693 Testing Accuracy:83.8%\r", + "Progress:69.3% Speed(reviews/sec):4038.% #Correct:582 #Tested:694 Testing Accuracy:83.8%\r", + "Progress:69.4% Speed(reviews/sec):4037.% #Correct:582 #Tested:695 Testing Accuracy:83.7%\r", + "Progress:69.5% Speed(reviews/sec):4036.% #Correct:583 #Tested:696 Testing Accuracy:83.7%\r", + "Progress:69.6% Speed(reviews/sec):4040.% #Correct:584 #Tested:697 Testing Accuracy:83.7%\r", + "Progress:69.7% Speed(reviews/sec):4042.% #Correct:585 #Tested:698 Testing Accuracy:83.8%\r", + "Progress:69.8% Speed(reviews/sec):4046.% #Correct:586 #Tested:699 Testing Accuracy:83.8%\r", + "Progress:69.9% Speed(reviews/sec):4047.% #Correct:587 #Tested:700 Testing Accuracy:83.8%\r", + "Progress:70.0% Speed(reviews/sec):4046.% #Correct:588 #Tested:701 Testing Accuracy:83.8%\r", + "Progress:70.1% Speed(reviews/sec):4047.% #Correct:589 #Tested:702 Testing Accuracy:83.9%\r", + "Progress:70.2% Speed(reviews/sec):4043.% #Correct:590 #Tested:703 Testing Accuracy:83.9%\r", + "Progress:70.3% Speed(reviews/sec):4039.% #Correct:591 #Tested:704 Testing Accuracy:83.9%\r", + "Progress:70.4% Speed(reviews/sec):4035.% #Correct:592 #Tested:705 Testing Accuracy:83.9%\r", + "Progress:70.5% Speed(reviews/sec):4037.% #Correct:593 #Tested:706 Testing 
Accuracy:83.9%\r", + "Progress:70.6% Speed(reviews/sec):4039.% #Correct:594 #Tested:707 Testing Accuracy:84.0%\r", + "Progress:70.7% Speed(reviews/sec):4042.% #Correct:595 #Tested:708 Testing Accuracy:84.0%\r", + "Progress:70.8% Speed(reviews/sec):4043.% #Correct:596 #Tested:709 Testing Accuracy:84.0%\r", + "Progress:70.9% Speed(reviews/sec):4046.% #Correct:597 #Tested:710 Testing Accuracy:84.0%\r", + "Progress:71.0% Speed(reviews/sec):4049.% #Correct:598 #Tested:711 Testing Accuracy:84.1%\r", + "Progress:71.1% Speed(reviews/sec):4052.% #Correct:599 #Tested:712 Testing Accuracy:84.1%\r", + "Progress:71.2% Speed(reviews/sec):4056.% #Correct:599 #Tested:713 Testing Accuracy:84.0%\r", + "Progress:71.3% Speed(reviews/sec):4058.% #Correct:600 #Tested:714 Testing Accuracy:84.0%\r", + "Progress:71.4% Speed(reviews/sec):4060.% #Correct:601 #Tested:715 Testing Accuracy:84.0%\r", + "Progress:71.5% Speed(reviews/sec):4063.% #Correct:602 #Tested:716 Testing Accuracy:84.0%\r", + "Progress:71.6% Speed(reviews/sec):4067.% #Correct:603 #Tested:717 Testing Accuracy:84.1%\r", + "Progress:71.7% Speed(reviews/sec):4070.% #Correct:604 #Tested:718 Testing Accuracy:84.1%\r", + "Progress:71.8% Speed(reviews/sec):4072.% #Correct:605 #Tested:719 Testing Accuracy:84.1%\r", + "Progress:71.9% Speed(reviews/sec):4076.% #Correct:606 #Tested:720 Testing Accuracy:84.1%\r", + "Progress:72.0% Speed(reviews/sec):4080.% #Correct:606 #Tested:721 Testing Accuracy:84.0%\r", + "Progress:72.1% Speed(reviews/sec):4083.% #Correct:607 #Tested:722 Testing Accuracy:84.0%\r", + "Progress:72.2% Speed(reviews/sec):4085.% #Correct:608 #Tested:723 Testing Accuracy:84.0%\r", + "Progress:72.3% Speed(reviews/sec):4087.% #Correct:609 #Tested:724 Testing Accuracy:84.1%\r", + "Progress:72.4% Speed(reviews/sec):4091.% #Correct:609 #Tested:725 Testing Accuracy:84.0%\r", + "Progress:72.5% Speed(reviews/sec):4093.% #Correct:610 #Tested:726 Testing Accuracy:84.0%\r", + "Progress:72.6% Speed(reviews/sec):4090.% #Correct:611 
#Tested:727 Testing Accuracy:84.0%\r", + "Progress:72.7% Speed(reviews/sec):4083.% #Correct:612 #Tested:728 Testing Accuracy:84.0%\r", + "Progress:72.8% Speed(reviews/sec):4086.% #Correct:613 #Tested:729 Testing Accuracy:84.0%\r", + "Progress:72.9% Speed(reviews/sec):4084.% #Correct:614 #Tested:730 Testing Accuracy:84.1%\r", + "Progress:73.0% Speed(reviews/sec):4086.% #Correct:615 #Tested:731 Testing Accuracy:84.1%\r", + "Progress:73.1% Speed(reviews/sec):4089.% #Correct:616 #Tested:732 Testing Accuracy:84.1%\r", + "Progress:73.2% Speed(reviews/sec):4090.% #Correct:617 #Tested:733 Testing Accuracy:84.1%\r", + "Progress:73.3% Speed(reviews/sec):4091.% #Correct:618 #Tested:734 Testing Accuracy:84.1%\r", + "Progress:73.4% Speed(reviews/sec):4091.% #Correct:619 #Tested:735 Testing Accuracy:84.2%\r", + "Progress:73.5% Speed(reviews/sec):4093.% #Correct:620 #Tested:736 Testing Accuracy:84.2%\r", + "Progress:73.6% Speed(reviews/sec):4095.% #Correct:621 #Tested:737 Testing Accuracy:84.2%\r", + "Progress:73.7% Speed(reviews/sec):4098.% #Correct:621 #Tested:738 Testing Accuracy:84.1%\r", + "Progress:73.8% Speed(reviews/sec):4099.% #Correct:622 #Tested:739 Testing Accuracy:84.1%\r", + "Progress:73.9% Speed(reviews/sec):4103.% #Correct:623 #Tested:740 Testing Accuracy:84.1%\r", + "Progress:74.0% Speed(reviews/sec):4107.% #Correct:624 #Tested:741 Testing Accuracy:84.2%\r", + "Progress:74.1% Speed(reviews/sec):4110.% #Correct:625 #Tested:742 Testing Accuracy:84.2%\r", + "Progress:74.2% Speed(reviews/sec):4112.% #Correct:626 #Tested:743 Testing Accuracy:84.2%\r", + "Progress:74.3% Speed(reviews/sec):4113.% #Correct:626 #Tested:744 Testing Accuracy:84.1%\r", + "Progress:74.4% Speed(reviews/sec):4116.% #Correct:627 #Tested:745 Testing Accuracy:84.1%\r", + "Progress:74.5% Speed(reviews/sec):4114.% #Correct:627 #Tested:746 Testing Accuracy:84.0%\r", + "Progress:74.6% Speed(reviews/sec):4116.% #Correct:628 #Tested:747 Testing Accuracy:84.0%\r", + "Progress:74.7% 
Speed(reviews/sec):4113.% #Correct:628 #Tested:748 Testing Accuracy:83.9%\r", + "[... repeated carriage-return progress output trimmed ...]\r", + "Progress:99.9% Speed(reviews/sec):3805.% #Correct:822 #Tested:1000 Testing Accuracy:82.2%" + ] + } + ], + "source": [ + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Analysis: What's Going on in the Weights?" + ] + }, + { + "cell_type": "code", + "execution_count": 129, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp_full = SentimentNetwork(reviews[:-1000],labels[:-1000],min_count=0,polarity_cutoff=0,learning_rate=0.01)" + ] + }, + { + "cell_type": "code", + "execution_count": 130, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):1534. 
#Correct:20335 #Trained:24000 Training Accuracy:84.7%" + ] + } + ], + "source": [ + "mlp_full.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 131, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAl4AAAEoCAYAAACJsv/HAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHsvQv8HdPV/7+pKnVrUCSoS5A0ES1CCC2JpPh7KkLlaV1yafskoUk02hCh\njyhyERWXIMlTInFpRCVBCRIJQRJFtQRJCXVLUOLXoC7Vnv+8t67T9Z3Muc85Z+actV6v+c6cmX1Z\n+7NnZn++a63Ze4NMIM7EEDAEDAFDwBAwBAwBQ6DqCGxY9RqsAkPAEDAEDAFDwBAwBAwBj4ARL7sR\nDAFDwBAwBAwBQ8AQqBECRrxqBLRVYwgYAoaAIWAIGAKGgBEvuwcMAUPAEDAEDAFDwBCoEQJGvGoE\ntFVjCBgChoAhYAgYAoaAES+7BwwBQ8AQMAQMAUPAEKgRAka8agS0VWMIGAKGgCFgCBgChoARL7sH\nDAFDwBAwBAwBQ8AQqBECRrxqBLRVYwgYAoaAIWAIGAKGgBEvuwcMAUPAEDAEDAFDwBCoEQJGvGoE\ntFVjCBgChoAhYAgYAoaAES+7BwwBQ8AQMAQMAUPAEKgRAka8agS0VWMIGAKGgCFgCBgChoARL7sH\nDAFDwBAwBAwBQ8AQqBECRrxqBLRVYwg0GgLvvfee22CDDdzWW2+diqahZ+fOnVOhqylpCBgCjYuA\nEa/G7VtrmSFgCPwbgT59+jiIookhYAgYAvVGwIhXvXvA6jcEGgiB2267zbVt29ZbwrCGDRo0KNs6\nOf/kk09mz2GFIh3nIEYQJH6LJW38+PHrpZ06daq3so0cOTJ7LdcBaSgLvUwMAUPAEEgCAka8ktAL\npoMh0AAIQJwgWi+99FK2NZAkyBTSo0cPv1+wYEF2T57999/fbz179mxBkLgGcdLkjYyc41oxMm7c\nOJfJZNysWbOKSW5pDAFDwBCoOgJGvKoOsVVgCDQHAliVIEQDBw70ZGft2rWuVatWTojWiSee6IGQ\n
fcD/v3cytW\nrpBTqdhzP0LAjHylortMSUOgIAJGvApCZAmqjQBWIEgJ+3rHfEnslcSg0XYhhaUSFsGNgZP2sZC0\nyC677OIJJ8H0xXy5COki0B4hZkt/uSiuRc4J6dpwww19/BiuRQikCO1DHxGu6etyXvakxfK3/Nnn\nUzF9hOida39sr97u+N69HIQ/TWLkK029ZboaAvkR+ELg6hmdP4ldNQSqiwD/ye+www5u8ODB7s03\n3/RL2VS3xujScX0yII8cOdKNGzcumwhismLFCv8FYankiwETEnffffdly4NsMZ8W5UK6mCoCS5YE\n0eNSXLdund841l8uykz0WLgIomcP8SKQXpMuCBdfS4atJOBMvbKhH2QMiwobv0kjMnPmTPevzAbu\njGFD5FSq9399d6178sk/uO9+979S1Q6sm2zLli3zfZcq5U1ZQ8AQaIGAWbxawGE/6okABEAsEZAg\nCEstBMJBveyxuuWqV4hJmMzk0lGsZ6V8uQjREvci00Xw9SLkDAsWxAqCpd2L+stFXItsyEMPPZSz\nHbn05bwQMUlz2cQrXI+ePd2wIafLqVTvCbI/ofdxqXM3atCxwOa6R3U6OzYEDIFkIrBhMtUyrZoR\nAQiNkBVcjkKGqoUFJAMXILPpS935BjSxEjHwFRIZHDXpYkLUadM+X75G5ueCWOFGhHDxlSMb1i6x\ndEG6IFMSz4WVi9gwmTIC9yKuRwLphXRRJ7qWI1j0wEC299f9zXUOvpRsFOnYob1r3aaNK6YPk9pm\n+ibN+icVV9PLEKgVAka8aoW01VM0Ani/sS4hkCJIWJwzxotljdglyBcLd2NhK8aNKMSEgY+8UYLV\njK8J9XQREC5mo+fLQ1n+B9KFVQsLl5Au9pAurpEWMgW5wqUI4dKkCzImpAuLGO5FNrArl3jp9lDO\ngw8ucl0P7qJPp/74m/vu32L+tDQ2yMhXGnvNdDYEPkfAiJfdCYlEAIIDgXnvvfe8NQrLFGQCKxgk\nLBfpydUYiJKUwaBF+XPmzPGEqxySQhkQEzYt1KGni4AoMQM901II6fr00089sQqTLggY57iO8OUi\nrsQw6WL6iCjSRR7aiW5xCG37wUmnxFFUosr46nbb+Y8FEqVUGcrQz/R3qc9CGVVZFkPAEIgRAYvx\nihFMK6q6CGCpgoyxJ4aJLwMhTbgJo6xVMigRZE5AOwMVm/5yslKiAjlh4EMPSFetv1wUKxfICwlE\nlzgEK+Arr77hJl42IY7iElPGvPvmuxtnzHC33HxjYnSqRBGeB/o86hmopFzLawgYAtVBYKPqFGul\nGgLxIwDBggyIMOBAeiBPUQIRYjCCbOUSrlVCvhjwIDy4LbXoNRf1l4viXtRB9MzRxXlckBAp3Ic6\niF5PFxHlWpR60SNfWyVdsfsNNvyCwzpkkmwEJD7RyFey+8m0MwQEAbN4CRK2b1oEKrEUQf6woOkg\n+kJrLmrSVcmXi5A0kUrIo5QR3k+8/Ar30cefpn7i1HC7Vq950+3YprV3/Yavpfk39yL/aEDATAwB\nQyC5CGyYXNVMM0OgNggwUGE5KzVWRsiOJl2HHXaYD6JnAKzml4uadEEcbbAt/l5J4yLfxbQOyxci\n/0gUk8fSGAKGQO0RMOJVe8ytxgQiIO6aYlUj1izqy0UC6flK8qWXXvKTooprMe4vF7WeRrw0Gs19\nLATcyFdz3wfW+mQjYMQr2f1j2tUQgWLJV6EvFzt16uS/THziiSfWmy4iji8XNSRiddPn7Dg/Akyi\n2m6vdvkTpfiqka8Ud56p3hQIGPFqim62RhaDAO5BtlzWAlyRTGehF7rmy0rIDy5GHUS//fbbu222\n2cYtWLAgOykqpItAelnoWoLow5Oihmej118u6nagpwyy+nxcxzIha1zlJaWcV197ze3X+YCkqFMV\nPeS+IO7LxBAwBJKFgH3VmKz+MG2KQADCAdmRPV
kgRUwbgcg0ExxjxWIQ4ms/iYHhfC4hLWWz10L5\nlCF1cK3Ql4sQpnbt2rnFixe71q1b+9nlK/1yUetE+9GpWrLlFpu75cufrVbxdSsXAtwMwj3MfQv5\nKubeL4QJ9xtlyUbZ+rljzjqpR5479tW8RwvpbNcNgSQiYF81JrFXTKf1EOBlT1yVTJ4qL3T2WKkQ\necGTVgYFGSTkHF8gyrZeJeoE5EuXV+mXiytXrvQz0GMJK2XNRR1Er9Tz5FD00+fjPAaDSy651N1z\nz+/iLLbuZY0ZN8Ft9uVNgoW/h9Zdl1oowLMAaeJZKVX0c8dHJFh2ue8gdWyI3IfyjHGOe4c62VM/\naeS5k+eVdCaGQDMiYMSrGXs9JW3mhQ3RYgkhhBc3rr5yBhDyMzAwEDAXGGUTqyVzfXFdiwxW7KlX\nL//Dmoss/4PIl4vMNv/xxx97VyIWFaaMYMO1yDVmrV+7dq13M+69997Zha5lji7IGDPVs+Yiy/8g\nuUgXAxoiA5//EdMfwQic7rjjDl8qujeSDBx0mnv7rTVlr1qQRiy4j+lbCFAxwj85PCfca0KY2Jcj\nlMFzTJkc8wzz3FXj/i1HP8tjCNQaASNetUbc6isKAV7SvJwhWbyo2eIUiAWEjsEoFwHjvI7non7W\nXJTlf4jpIl4LYsVC15AsSJcQL87J8j+y5iIkZvXq1d5yAOkinktIF2lkzcV8bUX3YgfQfOVwjYGQ\n8tgYHDXB5Pp+++7nLho7zh19ZE9+NoS0b9fezbx15nrxfHFhmmSQ6Od87eQe4L5HeD7ifu543iB0\nrPDAc8SxWcCSfMeYbtVAwIhXNVC1MstGgBczL3v+Q4d85Rskyq5EZWQgYoCBgDAIyH/1YdJF/AqD\nEq4WWXNRky49KSoEDNIlQfRYslhbEaLFuotsTz/9tNt///3ddsHM8FyHdOUKolfqeoJUCSbgSptp\nC3s9B5muR44hXkcd81138YWj5VSq90uWPubOHXVOsH7mwvXaAR4iWGMa1SJDO8P3EPc/z50QI46r\nKdTHM4YuPH9C9qpZp5VtCCQFASNeSekJ08MTn+HDh7vzzz/fv4xrCQkkj5c/xAtyIm420eGpp57y\nwfRYucS9iGuRmefF0iXuRUgXaRC+XNx0002zpEtci5wj7mvbbbd1u+22W1Gki8EKKZUQCMlikNMf\nB/jCIv4ce+yxfmBmcGY+skmTro4kKhFZE3/q3F+Mdv/49BN3yfixeXUFa8GbhGGikjdzCi5yL0ib\n5N6HbEGCammBQg/qxbKNHrWsOwXdZCo2KAJGvBq0Y9PULIiO/PcLSSg3hqvSNvPf/r777tuiGL5c\nxL3IgPC1r33NffbZZ96SJROjhi1dpa65+Oqrr3r3XrjeFkr8+4ceLKOuyznSycZi4oXk29/+th+E\nGYhlWgysekIyv/GNb7orA/LVCO7Gbt26B8T+f7OkoxA2ch08RSC+pZJfyZukPW3Cysue547+r4fw\n/EO+eP7q+fzXo+1WZ3MisFFzNttanRQEeOnKC58Xbz3/46VuXIoS56Sni1i4cKFr06aNj9kSS5cm\nXeWuuYi1i/oY/ASHqL7Jdx3cuC6b6B9VDudol3ydxp42i/tUiCMWO7HsHXPMMe7+++5PPfG6btoM\nD0k+nHNhpvNgCQNrhHumXv8oeAUq+IOFCcsu1tx6tgEMIVxY28AZbOupTwWQWlZDoCgEzOJVFEyW\nqBoICOniJcsgkATRVi8ICUsAMRM9li7IV+fOnddzLeovF4nVIlh+s802W8+9KEH0ub5clAEnTD7F\n5SVWFhn4Sc+AVYhoMa+ZEK1evXpliRYWLbFqaaJFW/X2wgsvOMjX8mefdx07tE9CN5Wlw0knn+q+\nd8Lx7vjje5eVPyoT9zD3jAj3crj/5FqS9mJh4h6iDXJv1VtH3gNi/TbyVe/esPqrhYARr2oha+Xm\nRSCJpEsUht
hAVhicsAgQi0Vw/FtvveX+/Oc/u5133tmtW7fOTxcR9eUipIsAeuK5Sv1yUax+eiCE\nXCHsGSgLBcRDGCFaxKthQcBFKq7DMNnSBItjPgjQe7k+PpjPa/fd27qJl00QmFK1n3fffDc8mLdr\n2WPLqkqM6D/ubSSp1jBNupJIEo18perRMmXLQMCIVxmgWZbKEUj6y19cbwMGDPAWDZb+wbXI+oss\n8YPgXsz35SIEjCB6sXQV++UixI/BhwE8PJ1FLuR1QPw+++zjiZaQLbFmyV7IVJhghUkXv8Xd+OKL\nL7pLJ1zqZs66zXU9uEsuNRJ7ntiuoUOHxmrtKtRY+i9p1jDcedxbQvALtaFe14k9Y0u6nvXCx+pN\nNwJGvNLdf6nUnhcqAwAEI4n/cQOquOCYh6tLly5+Y+JULF2PPvqo23PPPbNzdOX7clFIl8zPlWtS\nVCxZssUREI/+ECtNtoRIRREsIWOSRvIKDmAybdp0P8/Y7353Jz9TI8xUv2zpEnfnHXPqqnO9rWHc\nX1hB2afBjSdfGKOviSHQSAgY8Wqk3kxBWyBbvPRxm+EGS6JgKWKDhBBsvmLFCtejR49g+ZxLPOEi\npusPf/iD69ixo59pnklQZY4uPV0EhEziucJzdDEIM6DIVihOK19AvJAjTbI4FoKlLVv6nD4vedlT\nnmAgRBFrHeSRJYT+69hebtTIEUnsuvV0Yt6uQ7oeVPcA8rBitbaGUR/ua/7hScucWejMuwJ906Jz\nuJ/ttyEQhYARryhU7FzVEIBs8TLF6pVUERcdJIUYLojWpEmT/GzbV199tSdTb775pp/0lPgpIV3E\ndUHCiAeDdEFW2BDisoRksS8Up0Wevn37enIqMVvsIUVRREssVrIXghXey3X2QraEaFGnCCSLTdog\nk7w+++yz7rLLJrpLLv2V6/O94yV5Iver17zpTj7pJNe/fz8/S3oilfy3UtoaBkFii1MgLkL24yy3\n2mXxrGD5Qve4Mam27la+IZALASNeuZCx87EjIC/RJLsYaTTESyxGQrxYBuiUU07xXziefPLJHpvn\nnnvOde3aNRtET0yXuBaJB1u8eLGjzQToFyJaQq6EmELoCPAXosXAA7HbcccdW3xxCIHKRa7kvCZb\nUh57LUK02GOli9pkLclLL73Uu1tvvOmWxMZ7QboGDz7NtW/fvuBkqRqHJBzzfLCJcE9UIpTFtCUv\nv/xyKslL/+AjF4TYNBNDoBEQMOLVCL2YkjbwHyuuDnmRJlVtbfGSObuwehF7xcz6t99+u5+SAZL1\nzDPPuG7dunlL19KlS93dd9/tHn74Ybd8+fKCzWPiUvnyUAfEM23Ft771raxFChIIeWLgfO+997y7\nU8iUkCu9l/Sk4VjIllYIF6K2aIWJlpAsvcf6RXwbU2rcc888N3nyZDd9xo2JJF/DzxzhVq160c2Y\n/vnkt7rtaTuGvIvwDLGVIjxv5OHZS6PERRz5QKZnz54egnHjxrmzzz47VXC0bdvWrySB0qXoP3Xq\nVDdo0KBsW3m/1VJ4Z/HOYBWME0880c2aNauW1SeyLiNede6WfA9Tvmt1Vrvk6onpwt3BSzTpwouJ\nDTJDcD3kSzasXQcccIAbNmyYw+LFS+TWW2/16Qu1C3IlRIvpHiBEQvKEIDE4COmCOKED14RYrV27\n1tdLWZp8CdmSctgjlC/xZZpoQaIgW+JC1ARLSJg+h8UPMnnEEUdk3Y+jRp3r5t1zj7v+humJIV9Y\nukaPvsDhCm4E0hW+p3h+9DNUyBpGWqxdDH5J/ZAl3Mao3/LPWiVWL3mf7r777gEpXxVVTaLPif4o\nGSZeEovJtSlTpriBAwdy6KXexAsltA68MyFgzSwbNXPjre3FI5DvwS6mFF6YaQmQpa0QFiEqxGtx\nDvchZOSaa67xW6F25wuIj5ohnsFg66239hOiCunSe44hVAwcuDGxYhBPJmQL
nYVooRvkStogREvI\nVnjPdSFaco1zbK+99pqDeDGJKuWBBfvLLvtVMAv+Pu6HQQzVLy8eU/eYL3Ev0vZGJF20iz5nEylk\nDcPK1a9fv1STLtpKOyCQxIaWQyDHjx+ftRalzdIlfZ3mPURQ+mDkyJFGvNLcmaZ7OhDgv27inCr5\nb7XWLYVcsEFCEIgGE6guWbIkpyrEZR0exOOwYdEiRgsiJK4+rGaQJDaxVuk9S7cceuih3joRJlyS\nTvLz3y+uR+LKvvrVr2Z11EQLIqUJVZhYaYIl14RsyZ5FtSGDHTp08BhI48EG6R+4sbbccis36pxz\n3Isvrqrb144yQeqxx/VOXUyXYFrOnntNhOdMiBjkRL4elnOSLo17yCbPFJZz7rlSBGsfgz7SqlWr\nFtagUsqpd9pyrXSQHm0Bq1c70AHShcuR/mhmArxhvTrB6m0eBHhZslRNOf+p1hMlIR9YvNgOPvhg\nt/3222dVgvRgBfrVr37lXRcspn3dddc53JGs64hVi+B8JlqVdR2ZB4ypI/hUno1BgW3OnDl+oORY\nrpGODWsTMWYySz7kC0KH5Qsd33jjDR9jxteVTO5KoD5Ys2eg4Vjv5VjSkI7AfdrDV5ky6Sskk/nK\nIHnUI2RUSJcAwRI8M2+d6efKOrZXb8cUDrUSrFzn/mK0n5V+zNixTUW6whhDTiBibBxjYeb+gYA1\ngkC4yvnnDTcXzxUSJiAQALmviYMiHeRAzrEnjeSPwpHwAPLqPPyzUigfehFzpvPxm/NRwnMoaSkb\nkfw6vegi5bCXfOxFwu188skn5VJ2r9MQp6Ulqt359NfuRdFNl9dMx6kkXtx0+ibkZuIcTFqL3IBc\n50EIX9dlcBwWHjbK1Tdtrrp03mL103nKOS71xtftBQvy8zBJ++RloXUp5sHW6aOO+Y+b2KY0CZgg\nYkHCIsTGi/vUU0/11yBRd955p4/3YhkhvjhkSSH9JWSYaAnZ0nssXZAgzjFQkgeiBmEjxgxrF1Yz\ndIIAQQIhRxAl+nSPPfbw9zZlCMmCXNGf8luuQbIgZ0K0KEeIFuXSRogeHwjw0QDlCBa+0Tn+MLgz\nQWmPHkd41yPB7c8+tyJH6spPQ7iYGLV7QDLe+evb7t777q3prPSVt6C6JdDfCJP+NorwDuEDF56T\nUkQP8szHl0943/H+1gL5kOBwfZ5jrkWRDSFwPJ9RhIaxiY13sBZ5p1NmtUUTIeoK68I5jZ1OD0ZR\n7Rb9aVtY+Edx//3396cZf2677bZwkqb5nSriRWfxAHCzh0mUPBz6JseUycCBCImSnuWG0mUQkKiF\ncnhoKDcsnONa+EYtVb9wuaX8LufG1+Vz0/PgaLzkZRH3Q4+bkf/C0yZCSNlDwNh++ctfuhkzZnhr\nElNEiBuR4HesYbgjV69e7ckTJEoTLPBlk3NcZ+PLyG233dZbtbCSUVYuogVhgjyxCZmC9HXv3t2T\nPoiZkC1JI0QLi5i2aPFVJmSLPGy0jzaxHV5mfw0bOsSToI02+oLbu+PXHQQsTgsYZE4I1/JnnnaT\np0x2UyZf43YNLDwmLRFI4z88LVvQ8hf3NXGTtKtY4f2m3/P5iBdjgn4f6jooI0wmeAeHSZrOwzHP\nO+9T9iLUowmNnNd7xpZCZev05RxDgiBDIuG281vrLdgxdkSNi1IOe9oXpb+UQRojXqCQAunTp0/O\nBwP1wzc5N5X2I3MzyEOobwqYvL4hwuXkgib8IJaqX65yC52v5MaXsvM9ODz0cT0UzD9FrFM9B0Ze\nfEKipP2V7rHwMADg8sMiJV8/EgD8+OOPe+IlBEvvtUUL9+G9997r5wLDfYiIRYugeT0jvpAocRNG\nWbOOPPJIT+SoD5LFJu5DyhOiRWyXEC2NC32FVOqaoq8nXDLOx6Btu83W3gLWrVt37xIkFqtUgbhd\nOekaN3DQaZ7MvfKXlz3huuXmG8smiKXq
kMb0xOeVS6DjaG81njvaI/dpMTrqf47F2pIvH2mIpeK5\nZq/HBcqS8hgj9BjCWDN//nyfj7zapRlOq9+t1MeXylKf1lGny6Wz1KmvY0QI66Cv62NtxZK2yXX9\nG71ENz12cI71a0V/jRf40HYtmujp8nWaZjjeMC2NhDRpRs7ntHQ2m7ZW0dH6vwmIl+5sbgZNwBjI\nKEsL1/UNo+vSRA4SJw9Hufrpeos9ruTG13XodoXnVhGsK32wCfitJ+nS7eVY92v4Wim/GQBoG5Yp\nSJMQL8jUTjvt5O9VsWixj4rTIjieexP3HqQoF9ESsqX3EDH5zTFWLYjWQQcd5F2HuDzzES0IlxYG\nM/qJLS6hrHPPPcetWLkicHkNc5tusrG7ZNxYT4KJBYNIYb2K2ojbOunkU137du09cVseWAVZnJv+\nw8JVT0IRFz7VLId/CrAOJUXieu74p6AU4iXvMXDQ40AuXHgPSjr24feikAXe+7pNjEGadIR/6zFJ\n/vlHB4gPzzFCfXp80br7BFX4o4kX7dF16mNJxzmtP/gIIRO8pD2UJ+OjqC7Y8pvruixJ0wz71BAv\nueHpFP6b0DcovzV50jc56TUx45r+TyVMzEivb5ZwXdSjbx65OSvRjzqLlUpvfKkn3C4eLHm4SKNf\nKpKnnD0vySQNktJf5bRF58GKx+AG6ZIvEPlqkfguXHbs//rXv64XEM81LE64+N5++2239957xxoQ\nT7nEfHGPEqclFq0w0dJtoR2QJIkL0tfiOiY+57xzR7lFixb6e+vM4We4Dl9v5zbfLIgxCwhZePvi\nF4Ln/H9+5N2WELepU651/YPg6mrqGFdbk1AOVs8kYRXXc8d9StuKFV2vfm9H5YdAhNNAIvR7UYiC\nLpc0mnRJ2bxjRTSpEaLCNf6JZhPrEHWJQYF9tSXcZj2O6WNpn253OC+65sJL2hHGV5cnaZph//m3\n8iloqe4guQm02tywYgni4eBGF+ZNeh4CIWTy8HATaAIn5em69EMi16M+69V5StVPyi1mr+vJd+OH\n2xouO6pdnNOkM5ynEX6DX1T/lNo2BgARsXoJCWPPdYLmmYYBkTgqSBfbI4884r/0xNrFb70nrfyW\n9JIf4sbG7zCp0uSKe//wwCoHqcJKEDUIM4DVgxijC7qxmVQHgXr0a76WxPncEWBfrOh/IGU8yJU3\n6p1IWk0WpDwZQ7ieK1+4PsnLmKPfs2IIkPFLxitN+KinWkI9ooOML+xFX9onbZRz6EIa/c6J0k+n\nL+d6VJ5GOJcai5e+0Yml0oMOxwS7awl3ODd7+EHQljDJq+vhnH7oJE3UXucrR7+oMqPO6XbJjR/G\nQkgX+XV6XV6x7dJ5yjkWa0o5eauRhxeMvFziLF8TInEdMsP9iy++6L8gxB0ocVoQnn333dffj5AQ\n7kvZS5pSAuKl/6PaA7nBJcqmRc4Z+dGo2HG1EIjrudP/8BSja4eXj40AADlaSURBVK73XzF5q5UG\nEkNclLaI6bqwNDGGCBHT16pxrP8RFSuXJoa1IoDVaFtSy0wN8aoUQB7A8ENYjQG4Uj0bMX+pL8so\nDHhxC8EodS8vE8rlHiDol5daHP2PLlifsExJnBYB7fL14V577eWXG4JYSUA8MV9ihYJ06RitUgPi\no7AKn5NgeYmNkb2cD6e334aAIJDU5070K7Qv5R/M8PggZet/qqU82ZMm13skXJ7+xx/yxT/+4lYk\nhIVNp4mLrEo7cu0hXlIvOtMe/c7UxEzSURbnRf9c+yjjRi49mul8aoiXvtEl4DtXZ3Nep6dDo/57\n4MbWDxXp9I3F7/B1zkWJrq8c/aLKjDqn9bMbPwqhwud4udD3UfdE4dwtU+iYLebDIkAea5VYrr7+\n9a/7DJAzzvGlGfFOYbKlp3kgTgsiRx7K10SzZe3F/4L8so0Oll6R4+JzW0pDoHIE4nzuitVGvy/D\nRChc
Blae8Pue39r6I+95cb1RBuVqoiLlas8DepCH8vTzLKQNjwwbYSxaZ7kuZVZrr61v6C31orNu\nqz4mTSFMC+kreBZK12jXU0O8wh1eSkdwI8mDIQ8A+eVFoMviur7xww8iabFcyMPDAI5Uop8voMg/\n4XoqvfGLrLbsZFh6xMJSdiEJzaitXbgXmbIBaxdWK4iVkC9mvGduLPqqU6dO2SkewhOXQrTY5N5i\nH5dIPBfEi/4oJUA5Lh2sHEOg1gjogT3qXR7WBxefpGPPby1i/cH9pseJ8GSo4d/irkMfnY9//qQ+\n6mGc0u90nVbrke9Y58+XTl+TdnFOE0bRW9Iy/gim1IP3QEia5NXjoy6L67qt/NbjGb+bRVJDvPSN\nwc0qhIeOkgdEBiwd78XNoS0b/FcR/gJSSJl0ur7ZqEf/x0NZ+sYWvWRPGaXoJ3UWu6/0xi+2nnzp\ndPvzpeMa7qxGHOSFFLHHOoWVSlyNEC823I0yQzzxXvfff79r166dTwtRqybR0v0SjufKFfel81Tr\nmHuBuL8rr7wqmIz2Ij9lBNNGhLcRZ410V141ya/NF45Pq5ZujVZuIz53/NPAPzTFih7Yw4N+VBmQ\nCMYPnmv2mlQwLkh5ECLGEhHK1vOWacJBWj3maOsSY4/UR52a6JFPjytSV6E94w9laR0K5aGeKJIX\nVb9uN/jo1U8gnDI+QNB0W9FB4wmWUXUW0rURrqfmq0Y6EBIkDw83F1uU6BuDNHIj0MmUIze0EC5u\nFv2lIvn1TasfBl2ffhDL1U+XV+wx+qEzIjd+VN6oGz8qXannBHv89+EHK6qsOAYAjXVUHeWcq+Sh\nZwDYNXDd4QrEtc2LDiIVFs6zPffcc8GSNse71157ze0a5KuVoCdWx3A8F7+FkFVbH+p58MGH3Ow5\nc91dd8513z32uGCw2cN9dbvt3Kl9+0ZC8cILLwTLJn3oZt12u+vdu7c7/PBu7ohgcDj0kK7B8eGR\neezkfxAAI6yblUrSnjveJeF7OV8b0V/GCd6VjAW5nntIBtc1OZCyIQnheCXew4xHeqyQ9LKnLkJP\ndJ3kY+yJqkfysWeOLJ1PXwsfo1+h8sJ5wr9lDJPzlMkWFtKBk+Aavs5vxh7aHRYZizkfRerC6Rv2\ndzBopEYCcpQJbgQmN8m5Bf9ZZNsTfDnSIl2x1yggeMha5A3XiR7BjMPZujgoVT/yBDdoth6tX6Fr\npA3rpH9TLvpo0XUFD4W+5I91mcHD1eJ6FO5gVEgWLVqUOeywwwolS931YA28zPnnn+/1DqaTyOTa\nSBBMlOo3jsGjVkJdwYsub3XoFkx7kTdNuReDhb8zPzjpFH+f/nT4zzO33nZ75o3Va8oq7p5778+M\nOu/8TEDA/HbDDTcUbFtZFTVIJvo0mGuuQVrzn2ZMnDgx069fv/+cKOJIv7sCMtMih37nBUTAv9N5\n9+l3afi93KKA4AdlhvMEhClDvvAYofNyXesmdXKesSss+v0d1on0Aclsobe8n8NjWbhc+U07RAf2\n4TokneypMyCRLfKgY758ur1RY5CU3eh7/ltPndCxugO5Sbjxwx2p03BDhEU/LDwoYaLCjaXTUE+h\nG4s6itWPtPkepnzXyFvqja/LC2NFeegtDx7t1pLvwdbpwscM7IFrIHw69b8hkxCLYiRMtsK/iymj\nlDSQrVLqKDV9IV2oWwjSFVddXTbZylUPBA5C126vdpkrrrgyV7KmP8+zXIh4pw0kSBfkqxTRxCP8\nXtPvPIiXSfUQYAyR8YWxqJkllcSrmTssjW3nP+9qWVXqhUexg1oUAdIWsLj1r8SCha6VDNTUHSwD\n9DkhCghXtQUrGAQMkheFc7XrT3r5pfxzkPS2iH68S0rta6xO/GPNM8teW6GMeAmy1d9r65hY46pf\nazJrSE1wffDQmKQUAeJNCKhuBCFuRmKiiJ3KJ8Q2SVqdjnPEqsQR+6
bLJZ4LKSUGRuennyTuS58v\n5nj+/AXuqCOPCqbT2MwtDPp62JDTi8lWUZqjj+zpWCi79wnfc4MHDXYXXTymovIaLTP9OXfu3IZp\nFvcmzwztKkUCspUNhA/+scgbk1VKuZa2eAQ07oG1q6jY4OJLT1/KDdOnsmmcNgR4UQYxOWlTO1Lf\nyy+/3E8NwUWC5mkbZExIj86Ui3iRBnIUlUfnL+WYsiB0bJWIkLZSdLv44rFu6JAh7pcB8Zl42QTX\npvUOlahQcl5I3u1B4P7vf/+4Y/HtuAltyQolIAMY8I/B9OnTGwYPSCRz4JUjBLQHoSc+a75g+HLK\ntjyFEQBzyBcSWBkLZ2j0FMk0xJlWjYQA7ivivHBFpVlwlwbvg5xbMHFq5thjj81cdtllmeuvvz4b\ncJ+rzeAShwsW1wtlxSmUV4xLJ5j2IRN8pZh5dMmyOKsvuyyC+HE9xoFr2UrUKSNtps/YpP245oqN\nRayT2kVXW2lbdIwRLkbEXI1Fw192Qu3qxd1okslsAAiNTi6tffVHoH///l6JNFu+sCLwHzeL9AaD\nQNbylQvdHXfc0VvEmEaiW7du2YWqsZSJYBVDyrFUoQ+WKaxu1RJcxFjBotyqZ519jluxYoWbPPna\nmlu58rV3+Jkj3Ly773Izb51Ztts1X/lJuRZ2C0f1ExZaLEVpd/WjP8+eWTOTcveZHpUgYMSrEvQs\nb9EIMEgwMLCPGsSLLqjOCSFIuBYhkoEFz82ePdstXLjQbx9//HFe7Tp06OAJmBAxEkPCGFRKJU/g\nyCAkrsG8FVd4EXJHn2lymFTSJU0V8rXssWWpvt+kPbLXBIr+0H0iafSee4Q04orW19J0zPPBxrNn\nYgikHQEjXmnvwRTpD1lhEEjryxNrHbpDejAUs/3zn//02z/+8Q/3+OOPu5/85Cf+NxOAFpJDDjnE\nTw6KNYyFsxlYtDUsV/4oIpQrbVznNdFjRvk5AeG8+ZZbEmXpCrcV8rVq1YtuxvRpqSVf9LW28nCP\nlCrcsxA2TdpKLaOe6dEbaxf3YJr/aasnhlZ3shAw4pWs/mhobXhx7rbbbt5SBAFLk4h1CdcNgwCk\nK5g01X322WcO0vXJJ5+4lStXOqxeuBi5FsTWuMWLF7uHH37Y/f3vfy/Y3J122skTu+7du3uCSoYw\nEWMQinIpFSw8hgRgQPtvvvkWN33Gja7rwV1iKLW6RRBsf+CBB7jzzh1V3YpiKh2MIVsicfQ1ZfK8\n4XIsh7iJLvXaozMbBNLEEGgEBIx4NUIvpqgNP/3pT/3Akrb/vsN6i7VLSNdHH33k5s2b51iTERIG\nIYN8QZxYSujDDz90v/vd79yjjz7qHnvssYI91qZNm/XckgzIWMfqJQzgB3U5yI0YOcr9aED0Uj/1\n0i1Xvc8+t8Kd0Ps4N278OE+Yc6Wr53n9LGDRqYb7GMLMJtbSera3lLpFb/5pMzEEGgUBI16N0pMp\naQeDNwMLRIYtDSKuDgYtLAeIEK9PP/3UQbruuecev1js+++/739DvnBDIhAvFtJmYWzZL1++3JOw\nRx55xAeo+4QF/gwJpmxg3UIhX2FrWIHsFV8mrov+mzrl2orLqmUB102b4W6acUNggZydCFcVJEIT\niVpZoaiHZw8ykwbheUPntFrq0oCx6VgfBIx41Qf3pq6VF+q+++7rgk/eq/LffZzgipuGwYoYNRFN\nvJ5//nlv0dpmm23cunXrvFsRMoY1TKxeLKYN6YoiYZzHEgYJg8BhHSskxxxzjCdgkDCwRKpJxOiz\n7//39/18WR07tC+kXuKun3Tyqe6gg7q4YUOH1EU3bdWCvAuBr6UykD0hXvperqUOxdbFcwfpwq0/\n2lyMxcJm6VKCgBGvlHRUo6lJoDoWLwakarhW4sBLXv7oh75aNPG67777gjiiAx3Wrg8++MBvEC+s\nYZAv0rJBjNggYRAwTcI4FosYgf
mQsGnTpvkydL1Rx5tuuqmTLyXzxYdF5S32HMRl7077uFEjRxSb\nJVHp5t033w0fNtTV6itHiCr3jwgkIgmC9QjSlXQrkkwdoQlrEvAzHQyBOBAw4hUHilZGWQgwADBA\n8XJN2tdKQrrQK+rlD5HCmrVgwQLXtWtX716EbEG82LPhbhTyRVrZNFiQsDARo4ybb77Z/fznP/dk\n7PXXX89axIqND8MlCQnDIibYlmsRE2sXSwHVelZ6jVWlx9W0enG/gJMIZF1wl3NJ2WO9HT58eGIt\nzkl+LySlD02PdCNgxCvd/Zd67eUli0UpKZYvIV2Am4sUQryYx4s4Lr5ihGBBtPiqkY1jTbwItpdN\npqCAiFGOlldffdVbzlje5LnnnvNuRIkLk32p8WEdO3bMxoaVEx9GbNcXN/6Su/jC0VrV1B2L1WvF\nyhWx6K4JOSQrKfdvvsbhbhSSmESLs7wPcj13+dpm1wyBtCBgxCstPdXAevKyxfXBy7begxe6sL5d\nr169vHsxn9UiWJrFffvb3/bkS6aVEAuX3ss12UO8cEHyW0gY+yeffNJbSZgVH+sUU1C88847bo89\n9ljPNSkkTMeH3X333UVNWyHxYVjEBO9c1jAGab5kZC3ENMZ2hR+bbt26uzPOGFbWF46QFjaRpLgP\nRZ9Ce9Fd4sv4ZwfyFY5fLFRONa4X889ONeq1Mg2BeiBgxKseqFud6yHAIDBgwAA3ceLEun3tSFzJ\nHXfc4XUjvgoSlksgiYcddpi/LJYrIVEQKr1BsjTZEtKl98E6co55vDbffHOfVtySDJbbbrutPx8V\nHwbx0iSMwHziwwjWv//++3Opnz1fKD4MQnz9tOnuzjvmZPOk+eDKSde4VwJMf3XpJQWbIZYhSQhh\nEdIi59KyD5Mu0VtivrjX6/W1I88S9UNk0SHfPzuit+0NgTQjYMQrzb3XYLoTIwP5YXCDiNVqkGOA\n5cUvpEtg7devn9eD39olKIMYk8Hq8xwLCWMvREz2YTLGb8gXcWK4AyFdQsZ02qefftqx3BBlaomK\nD9MkjGD9SuPDxowZ51pts21qg+o1XhzLvF653I3cg9wPSFrch17ZPH/kfs31PHGd5w6B+NTKkgfO\nfLHIs84+LdPL5IHaLhkCRSFgxKsomCxRrRDgZczL/4ILLnAQH17IuQaMSnWSumSw4cUPAXvllVey\nRe+zzz4OlyKDsJAs/kMnVkq75+RY0lCAJmEcazIGscKNiKWLpYOEhEXtOYcbEnImJE7Kpj6pmy8j\nJVAfAqa/lJQvJkuJD9tkk01clwMPcmPGjUvFLPXZTitwgLtx4sTLvJuVeyAtQfEFmhV5uRDp0pl4\n1ngWIGHVfO6oE7LF84arm+NqPeO6fXZsCCQFASNeSekJ06MFAgwYvPyJt4KAyUu6RaIyf1A2L3sG\nGV781CP/5TMQc/ynP/0pWzqzyLMYNiRMky5Ijrj/2AsBymYMDoSIsZcN8vTSSy+5tWvXuk6dOmXd\nkpzXFi85Zv/aa6/5dLgd+U1aCBl7IXS6XtEL8iWbkC9tFZP5w6Liw2gv1jZpgy4/zccDB53m/rzy\ned/vjWLViuqPUkiX5Of+51mT545/RHge4hD5R4dnDxGS53/YH0OgiRAw4tVEnZ3GpgoBIxaF/4r5\nb5yBoNTBAKsGpImXPqQKMpdvUOH6jBkzspBBZIYOHer69u3r15sUMiOWJX7nIl/ZQoIDSAy6bLXV\nVu5rX/ua/y3ESfZCqMLWL6xV2223nSMuS5MyIWGSj3LYtAhJ1HpDxPgthCwcH/aNb3zD7blnO3fb\nbbfqolJ/PGbcBPfZp5+4//3f81LfllwNKId06bLknxMhSTx38uzpdIWOKYfnjucXVz4frUDsSn1+\nC9Vj1w2BNCFgxCtNvdXkuvLyZuNFjjuQ4HbIGBuC9YJNBh1xI4kriZe9DCCkyycQl+uuu84NHDiw
\nRTLyX3HFFVnCsvHGGzs2IWBCcFpkUj/QHSsb9WtLEsfUKXvIlN6EhGGhYqoJ+R2155xsQsKiiJi4\nJSFf6K8tYZCxSZMmua232c5NvGyCakH6D5lW4saAVN9y843pb0xEC+T+l+ciIklJp+SZ497lnxYI\nOWXLF7EUxrE8Z7meO56/uHQqqQGW2BBIGAJGvBLWIaZOcQjolzvHCAMOx3pAkJd9KS98yA8bVqXf\n//73fsoIrVXbtm3djTfe6K1PX/rSl7wFir1YjiA0iHY9ir7ok0uoU0STMCFPELF3333Xzx/Wvn17\nT8y05SsXCRPXpBA5KZv6RMcwCYOMzZhxk+uwd6eGCawXbBuZePEMIKXc7z5DkX/kPoZk5XvueAbR\nQT+LRVZhyQyBhkfAiFfDd7E1sFQEICSQE5kUlfUTTz755PWK+fWvf+0OPvhgt9lmm7kvf/nLjmB0\nrF/iziMDxIbBMEwI1yss4oQQMfayQZ4Y9HbeeWf/FaRYtjgvJCyKgMk1IWGk0USM6qlD3KUQsZkz\nZ7lv7rd/wxGv1WvedDu2ae3bGwF7ak9Vm3SlFhhT3BBIGAKf/2ueMKVMHUOg3ggI6YGAQVBw8bVr\n166FWj/+8Y8dMTB/+9vf/GzzTHjKrPVCbiiDhcCRcv7z1yRIyBzE7oADDnAszE2sF6SPaSi22GIL\nt+WWW/rYMdyYrVq18u7MqD3xZWzkIS+kUSx2EC70ps20vRElzcse5eoPcfNVy9KVq147bwgYAqUj\nsFHpWSyHIdD4CAjpWbx4sZ86ggWwIVkXXnihwwImMmHCBLdq1Sp33nnneaKi3XgSjwX5iUPQCWHP\nrPPE3OC6lDplL5Ys2Yu1qxhLmKQlr9QXh+5WRvUQgHRBuArFLVZPAyvZEDAESkHAiFcpaFnapkEA\n0kEAP/FcEnjO/mc/+5lr3bq1D7wXMJhqgnm2cD3iAiQOa+XKle6oo47ycV8Qoqi4L8lfzh79mMAV\nHXcNBl2sVFjFECFg7NmEgLEX16TeC9kK7zf64hfLUS3xeZhEtd1eLa2XiVc6h4JGunIAY6cNgQQj\nYMQrwZ1jqtUHAbH0sGA1k5t+9NFH3hUn0zj06dPHEyzm/xKBAPXs2dONHz/eL/1z6KGH+nwQHx33\nBUGS8iVvuXsIFwMv8WPa2gEBEyLGXjaIVxQR43yYdPF7y8AV2YjyajAn2n6dD0h904x0pb4LrQFN\nioARrybt+DQ3W76swtUmx1HtgZiwEV8lX1lFpYs6R9m487AMQZyEtEBcIDIE1eN67B9MMKnl7LPP\ndmPGjPETo0oe0lMGErfli3ahKy5HLULu2FM/IoRM2hAmYeirSdjGG2/k/vLyS7rYhjjGbZx2MdKV\n9h40/ZsZASNezdz7KWo7X2wxnxBkR+YSEjKlLU+6SQxO5GOG7Iceesjtsssufh4vyBJ5cwl5IGyQ\nJMiKkCZIDJuc5xqTQp5zzjnuueeeyxY3atQoH/c1ZMiQrJsPkkM5MmkpiYUcZTOWeUBbaGuuNul6\npA1SlSZhEDQhX+yZJ21asEB2o8mLL65y7UMfSqSpjUa60tRbpqshsD4CNp3E+pjYmQQhANFiY7CR\nyU+x7mjXWrHqykSQlEd+CBtlhssSC5JYioSM4H775JNPHF8vMsv7Bx984L9mZLmdZcuW+S8ftS6Q\nt9/+9rf+60G+PsRVKV8PQtritH5BFhHqLFWkneTTRIwljYhn09dLLTuJ6VkyqOvBXVz/kLUyibqG\ndTLSFUbEfhsC6UPAiFf6+qwpNIYksbQIkosgVQKEJnRYxGQQFtIlZQvpELccrkfIF3Ffb7zxhnvk\nkUf8TPIQsaVLl/qvHiWv7O+77z4fEybzfUG+sH5BvtgQbZWSfKXuw7qXml/SS5vZH3FED3fWyHPc\n0Uf2lMup37dv197NvHVmTgthUhtopCupPWN6GQKlIWDEqzS8
LHWVEcByAwlikNGEqFrVQlaoD6uX\nrCEXZTWChLBh/YJ88dXim2++GaxluKe3fGH9YluzZo0bMGDAeupec8017lvf+lZ23iyZbJUvJbF8\nhV2A6xVQ5Im4yJdUN+KskW7jL23iLr5wtJxK9X7J0sfcD/v3cytWrkhVO4x0paq7TFlDIC8CNoFq\nXnjsYi0RwApFnBKbELBq14/bkrpwOUKY0CFKhBhhoXr22Wf9LPVdu3b1bkSZkJQJTHfccUcfi8ak\npFpOP/10v8ZjvslWxdKk85V6DGmkPXEI5fzj04/dQ4seiKO4RJRx9z3z3LHH9U6ELsUqAZmmX8Mu\n8WLzWzpDwBBIFgJm8UpWfzStNlidcC+yQYbqIVgVxPqFHlED3aJFizwxZNZ3rF+yrJDEffHFHJYv\nfl977bUtJlulTfvuu6+f74v8Ou4Ly5fEfVXqdizXOkI+vhIVYbBnwzV3/Q3TfVyUXEvrvlu37u6M\nM4Z5op2GNsRtwUxDm01HQ6DRETDi1eg9nPD2MdBjbWIP2WGgr6egB+QLaw+DnpAvzkNMIIVimZK4\nLwm6J+6LWC8hXxL3ddFFF63XpPnz5/v5vnTcl3zxGEfcVzEDdphoYWmU9mqFL754rHvn3bVu4mUT\n9OnUHc/67Wx37dWT3KJFC1OhezF9mIqGmJKGgCHQAgEjXi3gsB+1RAAyI9YtTXJqqUOuuiBfEBP0\nQk+28HQNOu4L8oX1S8iXfPEI+SLu64c//OF6VU2ePNkx0aqslyhxXxCvSskX+kIetc60RUsuoqXT\ncEw5zJK//NnnXccO7cOXU/P7pJNPdQd1OdANGzY08TrTV/JsJF5ZU9AQMARKQsCIV0lwWeI4EcDS\nxaDOIBNlaYmzrnLKgnxNnz7dL3StCYwuS8gX1i+C7iFfLJQN4RLyJUH3p512midmOv+IESPcKaec\n4skX1i8hX1i/Kg26l3g1sSJWMpATZP/ZZ/9MrdVr3n3z3fCAcC17bFki7zV9Txjp0mjYsSHQeAgY\n8Wq8Pk1Fi/iCkAGGLYmkCxBxffJlJYKeuURcjzLfl5CvqLivcePGuSVLlrQoihnyL730Uk++sH4x\n35dMtgr5EgLWIlPoBxYuLHRaIFroXQnhkvLSbvU6tldv1+OI7om3dsXVX9JvtjcEDIHkIWDEK3l9\n0vAaQWjElSfWmKQ2GkIDccE6x3xiuUTIl477wvIlrkcd93Xvvfe6SZMmtSiKWfVZZHunnXbyBAzy\nhfVLFuiGfCESeB8mWpDXXFa5uAbzK6+aFEwU+5i75eYbW+ie9B9XTrrGzbn9t4mP7Yqrn5LeH6af\nIdDsCBjxavY7oMbthzBAtnCDQWbSIBJUD2GEhOUTCBjki03HfeFu1Nvzzz/vfvazn61X1MyZM/06\njxJ0L65HiNszzzzj00O+8hGtcKFgjsUqFzELp8/1m3J69z7e9T7he27YkNNzJUvUeZm3a/KUyQX7\nrp6KG+mqJ/pWtyFQWwSMeNUW76avTcgWJCZNgsuRDQJTSHTcl5AvHfclBIygeyx/Ybn44os9+Xrn\nnXcc84Fh/dpuu+1c586ds25HsXyF8+b6DXmE8Fbq1qUcpsR4dMmyxE8vsXrNm27w4NNcjx5HuGFD\nh+SCpu7njXTVvQtMAUOgpggY8aop3M1dGQOMBNRXSgDqgSQWI4iSLGWUTwdxPUbFfQnxYk8QfniR\nbcrdb7/93HXXXZcNutdxX3zxCPEqlXzFNcDPnj3HjQoWBk/63F6syQjGSXaNxtUn+e5Fu2YIGALJ\nQsCIV7L6o6G1wU3Hli9WKskAlEocw+RLz/fFGo+vv/561v0Ytcg2cV+33357lnxh/apkke24XI70\n0Vlnn+NWrFjhJk++1rVpvUPium34mSPcqlUvuhnTp1Vs5atG4+gLcWFXo3wr0xAwBJKLgBGv5PZN\nQ2lWKmlJauMhjljtirF6
SRsgYH/4wx/cu+++66ecYJHtdu3aOaaMwCJD/JZMtnrhhRdKtuw+zkW2\nxVWK27FSEfI1duzYRM3vlQbSFUfMXaX9Z/kNAUOgPghsWJ9qrdZiEWjbtm3WrTR+/PgW2cTdxH7B\nggUtruX7MXXq1GyZpbqr8pWb7xrxUZCVNLoYdbtog0wxoc+HjyGasj300ENu9913D2KNeriePXu6\no446yrVp0ya7ziOYsM7jIYcc4qZNmxYuyh155JG+rHXr1nmixpeSkDfmDcOVKTFl62WMOAHhEvIV\ncbmkU5eMH+vat2/vTuh9nCOIvd5CTBeTpCbd0mWkq953itVvCNQXgaYnXpAZITCQHJP4EcCtcscd\nd/j4qPhLr22J8nEApEqLkCzZi1tV9q1atfL3GfFZWLr4WpEvF5m3C9IlC20znQQfHkDMtLDINoRP\nFtnGQkbAPnOGlUq+0Cmsv66rlGPI15jA4nVI14Mc0zbUSyB+J590ktsqwDLJ7kUjXfW6Q6xeQyA5\nCGyUHFVMk0ZFACLRq1cvF4d7KwkYQVzCli/OFRKxLurlgDjHb70xZ9eUKVOC+KnJ7u67784WS7A9\nLkvm+5IpKySOjD1lhOf7ymYOHfChADFGlU4xQbHHH987mCNrkbvggl+6ZUuXunPPPbdmrkesXDdM\nv9Gde85ZfoqSfv36hVqajJ9xxtclo0WmhSFgCJSLgBGvcpGrUb5Vq1bVqKbqVVPM/FfVqz3ekqUt\nWIyKIVvh2iFaQpLE0ip7SJOQJ/ZYufi6Ucd9PfXUU27//fd3999/v9t5552zBEwH3ZOXOig3l4jL\nF0Igx7nSFnMeLCBxV199rdu749fdxWMvcf37nVrVwPvrps1wN824wbVus2PeZZ2K0b+aaYx0VRNd\nK9sQSB8CG6ZP5Xg0JiaKgWnkyJHZAl966SV/LsrliEtSx1uRl3PkCYt2XxLTg1CPDLCSXn6zRx82\n5mqSskmn66TcfHLbbbdl81MGZXGuHCmlvYXKh6SIi65Q2qRfpx39gyklZDAtR1/pd4gWM9OzPJC4\nHrfYYgtPhHA94oLs2rWru/7669er5jvf+Y4DV4n7YnkicTviekTEGrZe5n+fEKtXruulnofAnXvu\nOZ4ELX/madc9IGPn/mK0e/a5FaUWlTM9Fi4IV7du3T3pGjp0qJ8uIg7LXc5KK7gg90lS9augaZbV\nEDAEykSgaYlXsXhBrCAwEKcwyeIc15588sm8xZGmEGmCdEHSCpWVqyIIVp8+fVrkpyzOFapblxlH\ne3V5uLOQXWP4ik6Xi558JDBo0CCP29Zbb50ltkJsOAempCFtuP90eaUex0FaRE8sVGHyJTFf7CXu\ni+kktLDoNot4E/clc4IR98W0FcXGfWGpgsDFKWDD3Fkzb53pPv3kY28BY61EYsDKCcIXssXXipC5\nBxbMd/369fVLAOHmTKoY6Upqz5hehkB9ETBXYwH8w2QmnPy9997zgzsuQQKowwKhKkZKIUdR5UEs\ncgkEEfcUX9UVkkrbGy4/7mBiyBPtKcaSR9+E8T/xxBMdC1XzlWElAmGBVFZqyYN8IZAvRMiYuB05\nzzEbywmFF9meMGGCJ9sssg3Z0rFfBPFLXqnHVxL6Aymmn+ImxxAwtnNHjfQfDEC6rrnqSl/7fp0P\ncHt32scf77FHW/+Fp6j1wgsvBETyQ/eXl19yL/x5ZUAMF7kfnHSKO/I7PdwZw34Su55Sb5x7I11x\nomllGQKNhcCGjdWc4lsDCcEVw0AmwmDMOYmrgsxoCxRpuc5G8LMIA3w+4kO58+fPz+aVfOH9wIED\ns2nOPvvs8OWCv3PpR8Z8+knBcbVXymMf5ySR4kothnRpHfRxHGVQHiRFrHm6/HKOhRRBsnA9Eq/F\nTPV89Yi7ERceli/ckKNGjXJDhrRc/mbhwoWB662be+WVV7zrkS8emXIC1yNTTkDG5L6N0o
+2QLyq\nJejfP3DPTp1yrVuxcoW3hPU58QS3+Wabus8+/cTNnTPH3ThjRnZ77dVX3WZf3sQvSXT++f/rdceC\nRuA8uiZdwJIN0mliCBgChsB6CAQv5KaWgKxkAlD8FhCkFlgE1pHstYAUtbjGj1x59XnKDkjXenk5\nIfWyD4hgZBp0knSUq0XOsy+kH2nWrl3rswekMVsm50XKba/kj9qff/75GbZKJSDDmcCi2EJv3f5S\njymLMsuV4Cu+zGGHHVZu9pz5ApKUCSxXmYA0ZQIClQlIfWb16tWZwAqUCb5ozCxevDgzb968zGWX\nXRaJRfAlZGb58uWZYODPvP3225kgBiwTuB8zgfsxQ9lsuYQ2mVSGwMsvv5xhMzEEDAFDIBcCTWvx\nCgbqgqKtXVFuOtxWIrjAsHxFSVTecLpi0oTz6N9aFzkfLrNQjFNc7ZX62WMVisNKgTUujC+WRCyD\nASH1FkWsiuGNa6TB1aqlkJVSp63lsbgaJe4L6xexXVi7JO4LK1jHjh399AnhuK/Bgwf7OdNkvi+C\n7on7KmayVSw0cVnxaolZUurCyoXEcb/7guyPIWAINCQCFuOVp1s1USH2qZAwmIfjvCAHxUg4XzF5\ndJqoesJlhomLzs9xHO0Nl0msSxwDUThWC1cvrtlCosmnfMAgecJlyvl677XrEV0kTos9hExvxH0R\n8/bcc89l1WYeLUj0L37xixYxXwTwE/dFfkTqkYwyrQR9Jsdyzfb5ETDSlR8fu2oIGAL/QcAsXv/B\nwo4SjIC2xkG4iiFd4eZAwnQ+XWY4bb1/CymCJMmUE8R9MdO9WL4k7uuSSy5xp5xySguVZ8+e7QP/\nJe6Lrx5lpvt8cV9m9WoBY1E/jHQVBZMlMgQMgX8jYMQrz62grUg6OD7w22aDlfWxTp+n2KpciiIR\nYQtXIf309bjai+VEBqa4Gq71LLXMSvKWWlel6cXtiKVLky8JuhcChuvxpGC5HCxcWiBdkE1NvsLr\nPJKee1jL4YfHP8WELr+RjuXejsOq20i4WFsMAUMgNwLmasyNjY8LEvcbxEa7rfJkq8sl3GbhOC/t\nSsPtWIh0EAcVd3uxoMjgFBcwtKucrz6pX2MSlz7VLgcCBvkK77XrkeNDDz3UL7I9YMCAFiqxyPY1\n11zjvvWtb2WnnIBs4XpEyIuIlY1jiAT9xj5uIY6Mbd37H7jXXnvdvfHGGy2qIJ6tfft2Lpjj338Z\nCBFMosh9XQ2Mkthe08kQMATiQcAsXgrHsIVIEy3iaPRcWxAU4r7EKhE1270quuqHBJ9r/fiNziJh\nUibn9b5a7SVmqFLRukGeaFu4v/LVQVrw0cRLl5kvb9Q1iEMt46DkPsP1SJwWQfeyyDaWL3TB8iWT\nreZaZJuZ7t9//31XaJFtyAT9FkffUcYNN9zgBg46zbVv194NH36mu3/+A+6DDz9yrbbexp3at2+L\n7cAuB7mPPv7UvfLqG+6yiVf4Z8xPwHrVJE8Go/qj1ueMdNUacavPEGgcBMzipfqSwZkBDssQc3kR\nD8RgLVYgBntNZlTWsi0wuoxKjyvVrxrtxeJ1+eWXV9o0b23UpIl+YcNKhzUvF4kiD/0a5YrNlacY\nZSETtK2Wwr2JpUoHx3OO39r6xe9ci2w/8MAD7vbbb89avpjjC5FytfWL9j0YzGpfrsWJvHffc6+7\ndMJ4PwHqQQcf7M4444ySF9Bm5vpHHl3ili5Z6o468ii3V/uvu+N793L9g7nB6iFGuuqButVpCDQQ\nAsELt6ll1qxZ682HFBCvLCbM9RQM7uulCW6B7Lnw/Fr8luu6rGyh/z6QNOyZWytKyC/pwvXIefaB\n6y2bTp/nOJwv1zxe1F9Oe6P0lnPMaRRYZORn2XvmICvUD+F25/tNWTKvWTlKMYfXnDlzyslacR6Z\njysIks988sknfr6vd999N/P6669nVq5cmQlIZiYgPZ
m77747E8R9Rd4XwSLbmeeffz7z6quvZt55\n551MYAWLnO8rIK2ZYGHuknRmPrBgpvlMu73aZYLFsjPLn32+pPz5Er+xek3m19dPzxx+eDe/3X77\n7HzJY79m83TFDqkVaAg0HQJN72rEBRcQk/WmgQgGbS9Yv5544gmfBuuKFixEBKGXG2+ky6r0GF2w\ncqCvCPoGxLIk/eJuLy4rpNL5obBq0b5wH/jCS/xDGZQVnm6jlGIeeuihmlu8RD9xO2KdIuge16Ne\nZFu7HotZZBvXoyyyHZ7vCxdmsR9IYAW86OIxbvCgwX45oIWBxWvUyBElW7iknVH7Nq13cD8a8Pk6\njaf07e+uuuoqhxuy0vsrqq7wOalD7unwdfttCBgChkAxCGwA1SwmoaUxBMpFgPUMcVf99Kc/LbeI\nFvlwMRLDJi7gFhfz/IBUQlArJcpz5871bRGXU54qq36Jx5cNlyGkiWWCmDaCGC42SBVTSUCs+PKR\nvZYf//jHbujQoX6aCiZjhcARPwahw2UpJK+Qy5HrF1zwS9e6zY6OecQ6dmivq6nq8ZhxE9y555zl\nrrjiSjds2NCq1AXpgnDVMq6vKg2xQg0BQ6DuCBjxqnsXNL4CBFYT5yUWg7haDPHSMVzEcmnBooV1\nS2LAtDVQpyv1WAhkHLFrpdYdlV7+d2KRbDbIV+CC9CQL0sUmVq1gSSF3zz33tCime/fujkW2mSOM\ngH3mC9PkC8saBCwX+Zo+fbobO2asO33oMDdsyOktyq7VDxbgxnLdunVrN37cmFgJkpGuWvWi1WMI\nNAcCRryao5/r2kpcUJCfID7GWw3qqkwMlWP1gITUOrg+n+pCvrB8Qb6CtRmz5AvLl5AvjpctW+Yu\nuuiiFsXhnvztb3/rv4qEgAn5wo2J9QvyhYUPArarmmLirLPPcXfOneOuv2G6X9S6RaE1/kEQ/ujR\nF7g333zTzZg+LRbyZaSrxp1o1RkCTYBA08d4NUEf172JEJV+/frF8nVjvRuD9Y72JIl0gYm4BCXu\nizm6cBtCophmQsd9HXLIIS5YZLsFlKzt2LNnT0fsGscQNSZbxXomcV8QLqyKEGkE0rVixQpHLFfX\ng7u0KK8eP4j/mjrlWte27R6ub78BWT3L1cVIV7nIWT5DwBDIh4BZvPKhY9diQwALEbFeWE3SHCcD\nwTn//PMDy8ro2LCJu6Bw3BduR4n7Etcj+zVr1rjTTjvNEyytw4gRI/wSROJ6hMDJOo8QO8jZvHvv\n96Rr8uRrHYQnaTL8zBHBlDAvlm35MtKVtB41fQyBxkHAiFfj9GXiW0KAPVuSSUs+ELF2oTskEgKp\nhXYlScT1qOO+IF8E12vyxW9io5YsWdJC/eOPP96dd9553mImrkchXzfddFMQRzXe3T5nbk2D6Fso\nWMQPJmylrbfcfGMRqf+TxEjXf7CwI0PAEIgfASNe8WNqJeZAQKxeMrDlSJbY07jaIF5RE3fSNi1J\ncEcK+ZIvHnEZQr5wIWryRdzXzJkzHYRKyy677OJ+/etfu5133jkbdI9rkaWJHl2yLBHuRa1v+JiY\nr8GDT3NdDjww+NLynPDlyN9yb6bZKhvZMDtpCBgCiUHAiFdiuqI5FIG0ECPElAxpEggXOjMwFyO0\nMZyWuLB6DOgQMMgXmwTdQ74k6F5I2NKlS92FF164XvMgZZ06dfKxXqef/hPX5/s/qNvXi+spV+AE\nXzv+sH8/N3nKZG9tzZfcSFc+dOyaIWAIxIWAEa+4kLRyikIAQoLliKkYoixHRRVS40QMyPvuu68L\nZnCvKKieciQwXZpQKxeljvuCfBE0D/kKux5Xr17twotsoyuLbC9b9nv3j8BqVqrrTtpar/2Vk65x\nc27/rVu0aGFOFbBY1osY51TKLhgChkBDImDEqyG7NdmNEpejDHZJ1haixIDM3F0yf1ec+oZdlJBS\ntmqIuB513BfkS1
yPerJVgu4hYWEJlv9JdFxXWF/5zez2PY7oHjnBKn1QKwIs+tjeEDAEmhcBI17N\n2/d1bTmuO4LVsQLVw/1WbOMhXWzoWgshaD8cuB+nJSZMvsT1iOUL16OQL46ZbDVY79E3+8ADurge\nwQLVF184uhYwxF7HvPvmu+HBrPbLHlvW4n4z0hU71FagIWAIFEDAiFcBgOxy9RDA1QjxYvBLIvlK\nin5hFyVYQcbKFSFfMtkqQfcy0z0ETJMvHfcVLFCdyKkjisXhpJNPdQd1OTBr9TLSVSxyls4QMATi\nRMCIV5xoWlklI5AUcqMVx72IWzGppBD90E1LOS5KifuSme7DcV8QMCxf//uL813XQ7/lJl42QVeZ\numOsXpeMG+tjvYx0pa77TGFDoGEQMOLVMF2Z3oZAvhgI+WqwEktOHAhAaiTeB52SaImLameUi1La\nEZVezgn5kiknIF96vi/I13/3+W83c9ZtiZ8+QtqUb9+tW3f3jW/s0xCrKORrp10zBAyB5CKwUXJV\nM82aBQHip4j5gijU82tHiBaz67OhR1pIF/dJlMWL9miJclEyEz/yhS98we/10kPMUn/XXXe5vTvt\n0xCkiwZ2PfTb7h+ffuLban8MAUPAEKgHAmbxqgfqVmckAkJ8hIBBJmohWLkk2J99Nb5erEU7CtUR\n5aKUwP1w3JcE3Q8/8+dup52/ltqg+jAmMq/XipUrwpfstyFgCBgCNUHALF41gdkqKQYBCBcuM4gP\nhIA9WzUtT2Jtg+QRN1UrslcMHnGnAUcw1hIO3IeAHXbYYdlFt1e9+IL7/g9+oLOk+lgW86bd9XZr\npxpIU94QMATKRsAsXmVDZxmriQDWL6xPDJCQL+LA4iJFWH4gXLgTEfa4F00+R2DRokUOArZ27Vp3\n4okn+uNGwoY1HDt8vZ2/rxqpXdYWQ8AQSAcCG6ZDTdOy2RDAMgP5IuAeK9huu+2Wjb3id6kiZAuC\n1apVK18uhIuyjHS1RLNbt26ObZtttgmsXSe3vNgAv3bdbXe3bt0HDdASa4IhYAikEQFzNaax15pI\nZwgYGyQJEsaGJQy3GRYwriGy9z+CP+JCY8/2yiuveBeaBM7HZT2T+hptT5A9uG2xxRaN1jS3xx5t\n3dw5cxquXdYgQ8AQSAcCRrzS0U9NryVEC3cjGyKECouVBMf7C//+A7Fig2jhqgwTM53WjqMR+MJG\nX3RYhxpNGpFMNlofWXsMgUZGwIhXI/duA7eNwGgLjq5uBzOxaiPK13be2f3hiccbsWnWJkPAEEgB\nAhumQEdT0RAwBAyB2BDo2KG9W/nnlbGVZwUZAoaAIVAKAka8SkHL0hoChoAhYAgYAoaAIVABAka8\nKgDPshoChkD6EHj2OZs8NX29ZhobAo2DgBGvxulLa4khECsCsoxQrIUmoLBXX3vN/eCkUxKgialg\nCBgCzYiAEa9m7HVrsyFQBAL/+udn7q9vv11ESktiCBgChoAhUCwCRryKRcrSGQJNhgBfjb711psN\n1+qnnvqja9+uXcO1yxpkCBgC6UDAiFc6+sm0NARqjgDE6ze33FTzeqtd4V9efsltueXm1a7GyjcE\nDAFDIBIBI16RsNhJQ8AQ+HxR7W5uydLHGgqMF4KpJGxC3YbqUmuMIZAqBIx4paq7TFlDoLYIHHBg\nF/fgQ4trW+n/397d9ER1hmEcv+YDsNRUUDMJLEhdYayg3dD6/sZAurOFIU2qSNGkTaoZSXShKGqi\nCRo7VYkVW0QDTtqmMbY2gcS2GKDMoo3ESGUjfgLWyjNqAqiYyDkz5z7nPyvmDPOc+/nds7hyXp7j\n495ciHwyOcniuz4aMzQCCMwvQPCa34dPEYi0wNo1lRr8+6/QGAyPjGhHojY082EiCCBgT4DgZa9n\nVIxA3gTcsy4fjN1XWNa+yvT16sO1VXnzY0cIIIDAXAGC11wR3iOAwCyBmto6XbrU
OWubxTe3bv+e\nO83owiQvBBBAoFACBK9CybNfBIwINO/ZrVu//qLJJ7aXlrja1aXmlhYj6pSJAAJhFSB4hbWzzAsB\njwTi8bhWrvpA31+56tGI+R/GHe36Z3hIDfWsWJ9/ffaIAAIzBWJPp18zN/A3AgggMFcgm82qoqJC\n//53XyveL5/7ceDf7/y0XlVVldq3lyNegW8WBSIQcgGOeIW8wUwPAS8E3GKqR462qa2tzYvh8jpG\nx7nz09d2PeZoV17V2RkCCLxJgOD1Jhm2I4DALIGWL5tzp+tckLHycndjnj/bocOHD8ktCMsLAQQQ\nKLQAwavQHWD/CBgRcMEl/V06F2SsrGafSqX0WUMDK9Ub+Y1RJgJREOAaryh0mTki4KFAR8dZZTIZ\n/djdreIl73k4srdDffX1Nxoff6iff8p4OzCjIYAAAgsQIHgtAI+vIhBVgf0HUhobG1M6/W0gw5cL\nXdnRkemAeJNTjFH9kTJvBAIqwKnGgDaGshAIssDJE8dVXl6upqY9gVvfy4Uut+7YmTOnCV1B/hFR\nGwIRFSB4RbTxTBuBhQrMDF9BuObLLfD68vRiz/UeHoS90AbzfQQQ8EWA4OULK4MiEA0BF74qV6/W\n541J3ei9WbBJu7sX3dE3d01X15XLhK6CdYIdI4DA2wQIXm8T4nMEEJhXoLU1pfYT7TrUelC7duf/\n1KNb3uKTutpcAHQX0rNsxLzt4kMEECiwAMGrwA1g9wiEQcA9eHrw3qBisZg+rq7WsfZTvk/LPQao\nJlGnTF9vbpkLFwB5IYAAAkEX4K7GoHeI+hAwJtDf368LFztzq8Vv2LRFjcl6T+987LzcpT/uPH/2\nYupgSslk0pgQ5SKAQJQFCF5R7j5zR8BHARfAuq9d18ULaX2xq0mVVWu0ZfPGdwph7ujW3bt/qu9G\nj5YUF6tx+pqyRCLBaUUf+8fQCCDgjwDByx9XRkUAgRcCExMTGhgY0O3f7uha9w/aUVOr0tIyLVq8\nWGVlpSoqKnrFanQ0q6mpKT36fzz3nerqj7Ru/Xpt37aVC+df0WIDAghYEiB4WeoWtSIQAgF3JCyb\nzeqpYhoaGn7tjEpKSrRsaYmWL1+WC1rxePy1/8dGBBBAwJoAwctax6gXAQQQQAABBMwKcFej2dZR\nOAIIIIAAAghYEyB4WesY9SKAAAIIIICAWQGCl9nWUTgCCCCAAAIIWBMgeFnrGPUigAACCCCAgFkB\ngpfZ1lE4AggggAACCFgTIHhZ6xj1IoAAAggggIBZAYKX2dZROAIIIIAAAghYEyB4WesY9SKAAAII\nIICAWQGCl9nWUTgCCCCAAAIIWBMgeFnrGPUigAACCCCAgFkBgpfZ1lE4AggggAACCFgTIHhZ6xj1\nIoAAAggggIBZAYKX2dZROAIIIIAAAghYEyB4WesY9SKAAAIIIICAWQGCl9nWUTgCCCCAAAIIWBMg\neFnrGPUigAACCCCAgFkBgpfZ1lE4AggggAACCFgTIHhZ6xj1IoAAAggggIBZAYKX2dZROAIIIIAA\nAghYEyB4WesY9SKAAAIIIICAWYFnu/zIiInRwogAAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 131, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "Image(filename='sentiment_network_sparse.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 132, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_most_similar_words(focus = 
\"horrible\"):\n", + " most_similar = Counter()\n", + "\n", + " for word in mlp_full.word2index.keys():\n", + " most_similar[word] = np.dot(mlp_full.weights_0_1[mlp_full.word2index[word]],mlp_full.weights_0_1[mlp_full.word2index[focus]])\n", + " \n", + " return most_similar.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 133, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('excellent', 0.1367295075735247),\n", + " ('perfect', 0.12548286087225943),\n", + " ('amazing', 0.091827633925999699),\n", + " ('today', 0.090223662694414203),\n", + " ('wonderful', 0.089355976962214589),\n", + " ('fun', 0.087504466674206874),\n", + " ('great', 0.087141758882292017),\n", + " ('best', 0.085810885617880611),\n", + " ('liked', 0.07769762912384344),\n", + " ('definitely', 0.076628781406966023),\n", + " ('brilliant', 0.073423858769279024),\n", + " ('loved', 0.073285428928122121),\n", + " ('favorite', 0.072781136036160765),\n", + " ('superb', 0.071736207178505068),\n", + " ('fantastic', 0.07092219191626617),\n", + " ('job', 0.069160617207634043),\n", + " ('incredible', 0.06642407795261443),\n", + " ('enjoyable', 0.065632560502888793),\n", + " ('rare', 0.064819212662615075),\n", + " ('highly', 0.063889453350970515),\n", + " ('enjoyed', 0.062127546101812939),\n", + " ('wonderfully', 0.062055178604090148),\n", + " ('perfectly', 0.061093208811887394),\n", + " ('fascinating', 0.060663547937493893),\n", + " ('bit', 0.059655427045653034),\n", + " ('gem', 0.059510859296156772),\n", + " ('outstanding', 0.058860808147083013),\n", + " ('beautiful', 0.058613934703162042),\n", + " ('surprised', 0.058273314482562996),\n", + " ('worth', 0.057657484236471213),\n", + " ('especially', 0.057422020781760785),\n", + " ('refreshing', 0.057310532092265762),\n", + " ('entertaining', 0.056612033835629211),\n", + " ('hilarious', 0.056168541032286634),\n", + " ('masterpiece', 0.054993988649431565),\n", + " ('simple', 
0.054484083134924075),\n", + " ('subtle', 0.054368883033508619),\n", + " ('funniest', 0.053457164871302677),\n", + " ('solid', 0.052903564743620651),\n", + " ('awesome', 0.052489194202770414),\n", + " ('always', 0.052260328525345269),\n", + " ('noir', 0.051530194726406887),\n", + " ('guys', 0.051109413645642671),\n", + " ('sweet', 0.050818930317526004),\n", + " ('unique', 0.050670162263589176),\n", + " ('very', 0.050132994948528464),\n", + " ('heart', 0.04994805849824363),\n", + " ('moving', 0.049424601164379113),\n", + " ('atmosphere', 0.048842500895912862),\n", + " ('strong', 0.048570880631759183),\n", + " ('remember', 0.048479036942291276),\n", + " ('believable', 0.048415384391603783),\n", + " ('shows', 0.048336045608039592),\n", + " ('love', 0.047310648160924645),\n", + " ('beautifully', 0.047118717440814896),\n", + " ('both', 0.046957278901480326),\n", + " ('terrific', 0.046686597975756625),\n", + " ('touching', 0.046589962377280969),\n", + " ('fine', 0.046256431328855756),\n", + " ('caught', 0.046163326224782343),\n", + " ('recommended', 0.045876341160885278),\n", + " ('jack', 0.04535290997518833),\n", + " ('everyone', 0.045145273964599365),\n", + " ('episodes', 0.045064457062621278),\n", + " ('classic', 0.044985816637932746),\n", + " ('will', 0.044966672557930479),\n", + " ('appreciate', 0.044764139584570886),\n", + " ('powerful', 0.044176442621852767),\n", + " ('realistic', 0.043597482283464786),\n", + " ('performances', 0.043020249087841737),\n", + " ('human', 0.042657925475092548),\n", + " ('expecting', 0.042588442995212215),\n", + " ('each', 0.042163774519666956),\n", + " ('delightful', 0.041815007170235508),\n", + " ('cry', 0.041750968395934826),\n", + " ('enjoy', 0.0416600917978181),\n", + " ('you', 0.041465994778271079),\n", + " ('surprisingly', 0.0413931392565174),\n", + " ('think', 0.041103720571057052),\n", + " ('performance', 0.040844259420896825),\n", + " ('nice', 0.040016506666931732),\n", + " ('paced', 0.039944488647599613),\n", + " ('true', 
0.039750592643370664),\n", + " ('tight', 0.039425438825552654),\n", + " ('similar', 0.039222380170683489),\n", + " ('friendship', 0.039110112764204313),\n", + " ('somewhat', 0.039069615731010227),\n", + " ('beauty', 0.038130922554738773),\n", + " ('short', 0.03798170013140921),\n", + " ('life', 0.037716639265310242),\n", + " ('stunning', 0.037507364832543758),\n", + " ('still', 0.037479827910101508),\n", + " ('normal', 0.037422144669435123),\n", + " ('works', 0.037255830186344194),\n", + " ('appreciated', 0.037156165138066237),\n", + " ('mind', 0.037080739403157773),\n", + " ('twists', 0.036932552473074115),\n", + " ('knowing', 0.036786021801572075),\n", + " ('captures', 0.036467506884494696),\n", + " ('certain', 0.036348359494082827),\n", + " ('later', 0.036210042786765206),\n", + " ('finest', 0.036132101827862653),\n", + " ('compelling', 0.036098464918935771),\n", + " ('others', 0.036090120202196076),\n", + " ('tragic', 0.036005003580472761),\n", + " ('viewing', 0.035933572455522977),\n", + " ('above', 0.03588671784974258),\n", + " ('them', 0.035717513281555757),\n", + " ('matter', 0.035602710619685632),\n", + " ('future', 0.035323777987573413),\n", + " ('good', 0.035250130839512742),\n", + " ('hooked', 0.035154077227307998),\n", + " ('world', 0.035098777806455039),\n", + " ('unexpected', 0.035078442502957774),\n", + " ('innocent', 0.034765360696729211),\n", + " ('tears', 0.034338309927008835),\n", + " ('certainly', 0.03430103774271414),\n", + " ('available', 0.034268101109487997),\n", + " ('unlike', 0.034253988843446576),\n", + " ('season', 0.034038922427011599),\n", + " ('vhs', 0.034011519281018116),\n", + " ('superior', 0.033917622732495753),\n", + " ('unusual', 0.033797799688239372),\n", + " ('genre', 0.033766115408287264),\n", + " ('criminal', 0.033744472720326837),\n", + " ('makes', 0.033587001877476597),\n", + " ('greatest', 0.033431852271975364),\n", + " ('small', 0.033426529870538409),\n", + " ('episode', 0.033336443796849913),\n", + " ('deal', 
0.03333610766528191),\n", + " ('now', 0.033283339034235492),\n", + " ('quiet', 0.033147935977529283),\n", + " ('played', 0.033108782201536797),\n", + " ('day', 0.033074949731286572),\n", + " ('moved', 0.032873980754099884),\n", + " ('underrated', 0.032738818192726317),\n", + " ('society', 0.032613580418616228),\n", + " ('focuses', 0.032607333858382825),\n", + " ('intense', 0.032564318613854962),\n", + " ('sharp', 0.032309211040923352),\n", + " ('adds', 0.032236076588351786),\n", + " ('check', 0.032030541149668808),\n", + " ('take', 0.031717140193258615),\n", + " ('deeply', 0.031693099458454568),\n", + " ('games', 0.03166349528572017),\n", + " ('pre', 0.031251131973427118),\n", + " ('change', 0.031183353959862575),\n", + " ('thanks', 0.031172398048464695),\n", + " ('own', 0.031121337943347101),\n", + " ('easy', 0.031088479340529659),\n", + " ('pace', 0.030934361491678233),\n", + " ('parts', 0.030850186028628303),\n", + " ('truly', 0.030836637734471675),\n", + " ('tony', 0.030739434811745028),\n", + " ('inspired', 0.030725453849735015),\n", + " ('thought', 0.030707437377997422),\n", + " ('complex', 0.030464622676702038),\n", + " ('worlds', 0.030391255174782042),\n", + " ('language', 0.03026497620030956),\n", + " ('soundtrack', 0.030210032139046036),\n", + " ('steals', 0.030207167115964776),\n", + " ('glad', 0.029812003262142252),\n", + " ('ride', 0.02980179480975171),\n", + " ('came', 0.029760628313031539),\n", + " ('impact', 0.029695785634015849),\n", + " ('personally', 0.029677477012254868),\n", + " ('gritty', 0.029540021762614985),\n", + " ('effective', 0.02951238212335535),\n", + " ('wise', 0.029510408701830339),\n", + " ('ultimate', 0.029442440672320935),\n", + " ('ways', 0.029439341792844208),\n", + " ('well', 0.029238386207701295),\n", + " ('sent', 0.029147924396380087),\n", + " ('after', 0.029037668915531285),\n", + " ('tells', 0.029004383695691496),\n", + " ('along', 0.02893297290163489),\n", + " ('modern', 0.028910642159349319),\n", + " ('family', 
0.02889738066286553),\n", + " ('pleasantly', 0.028754280601052385),\n", + " ('edge', 0.02874468747624128),\n", + " ('american', 0.028706398764554435),\n", + " ('england', 0.028640930969798119),\n", + " ('grand', 0.028581102406371937),\n", + " ('slowly', 0.028470328912922976),\n", + " ('treat', 0.028418097520915959),\n", + " ('pleasure', 0.028370704112004166),\n", + " ('living', 0.028335845213660407),\n", + " ('impressed', 0.028311856507726565),\n", + " ('fans', 0.028234674336798958),\n", + " ('suspenseful', 0.028156658725541156),\n", + " ('smile', 0.028065651834597621),\n", + " ('jim', 0.027910842672277572),\n", + " ('saw', 0.027900239466183016),\n", + " ('length', 0.027896431301274525),\n", + " ('impressive', 0.027894778243362818),\n", + " ('times', 0.027869981332762556),\n", + " ('witty', 0.027809121334036416),\n", + " ('flawless', 0.027676409302939117),\n", + " ('magic', 0.027671001404745994),\n", + " ('though', 0.027434087841071535),\n", + " ('subtitles', 0.02743198117938047),\n", + " ('stands', 0.027348518548416426),\n", + " ('freedom', 0.027271908118037386),\n", + " ('relationship', 0.027231146375769118),\n", + " ('tape', 0.027213179198573845),\n", + " ('apartment', 0.027198859160909993),\n", + " ('shown', 0.027062169058709833),\n", + " ('films', 0.027035590529373467),\n", + " ('lot', 0.026934527370476365),\n", + " ('barbara', 0.026837141036193595),\n", + " ('office', 0.026775230449656295),\n", + " ('damn', 0.026751196837598828),\n", + " ('murder', 0.026709073212876612),\n", + " ('brilliantly', 0.026701889741880671),\n", + " ('learns', 0.026699872569574588),\n", + " ('tends', 0.026683774361335774),\n", + " ('complaint', 0.026587011626106868),\n", + " ('themselves', 0.026524658938498962),\n", + " ('war', 0.026518675436425311),\n", + " ('violence', 0.02645062815807616),\n", + " ('judge', 0.026443267774947335),\n", + " ('thriller', 0.026431555027632107),\n", + " ('his', 0.026370773394088623),\n", + " ('finding', 0.026362279892885022),\n", + " ('cast', 
0.026360860883736618),\n", + " ('police', 0.02635212945330527),\n", + " ('once', 0.02625581764290822),\n", + " ('spectacular', 0.026245466997092383),\n", + " ('deserves', 0.02621450815996168),\n", + " ('driven', 0.026194930792511648),\n", + " ('spot', 0.026171686780563655),\n", + " ('carrey', 0.026162838804053026),\n", + " ('negative', 0.026161677045062219),\n", + " ('suspense', 0.026110016575822802),\n", + " ('flaws', 0.026085421601700298),\n", + " ('brave', 0.026080835779725288),\n", + " ('surprising', 0.026070851171974718),\n", + " ('gives', 0.026069978044960782),\n", + " ('takes', 0.026047493401813341),\n", + " ('light', 0.025921067904644497),\n", + " ('timing', 0.025900303450693642),\n", + " ('crime', 0.025886011572638652),\n", + " ('thank', 0.025873161609513355),\n", + " ('century', 0.02587105631011263),\n", + " ('until', 0.025870245942132539),\n", + " ('nature', 0.02581794293587545),\n", + " ('stellar', 0.025803971141651161),\n", + " ('emotions', 0.025783809728671923),\n", + " ('tremendous', 0.025772614605786563),\n", + " ('missed', 0.025657501028952603),\n", + " ('overall', 0.025655652485101793),\n", + " ('haven', 0.025650692177140798),\n", + " ('portrayal', 0.02559427365790963),\n", + " ('taylor', 0.025516992710898169),\n", + " ('appropriate', 0.025495908849901619),\n", + " ('joan', 0.025489829859140629),\n", + " ('realize', 0.025452457061382182),\n", + " ('different', 0.025434073970060433),\n", + " ('return', 0.025384569542597588),\n", + " ('bound', 0.025380084410398837),\n", + " ('noticed', 0.025306494998440763),\n", + " ('constantly', 0.02528218674576245),\n", + " ('first', 0.025246100888919813),\n", + " ('lovable', 0.025213500492273055),\n", + " ('comic', 0.025074597800944048),\n", + " ('scared', 0.024995376513809515),\n", + " ('fight', 0.024943209945836389),\n", + " ('extraordinary', 0.024940366453083611),\n", + " ('buy', 0.024803940824255594),\n", + " ('know', 0.024749519416087058),\n", + " ('brothers', 0.024675058346350743),\n", + " ('action', 
0.024660907824635262),\n", + " ('needs', 0.024634851651549338),\n", + " ('jerry', 0.02462148438534385),\n", + " ('while', 0.024620233313683848),\n", + " ('also', 0.02451948098747243),\n", + " ('definite', 0.024509585305468831),\n", + " ('genius', 0.024500478757646965),\n", + " ('tragedy', 0.024481339186882278),\n", + " ('heard', 0.024446567944460471),\n", + " ('haunting', 0.024431007352898909),\n", + " ('legendary', 0.024412777264908973),\n", + " ('uses', 0.024358972452014009),\n", + " ('years', 0.024316094895735267),\n", + " ('notch', 0.024310571597216279),\n", + " ('fabulous', 0.024258810824927628),\n", + " ('herself', 0.024241390957491074),\n", + " ('battle', 0.024205827940178139),\n", + " ('ralph', 0.024205046194653312),\n", + " ('provoking', 0.02410610606248181),\n", + " ('ago', 0.024024541904156493),\n", + " ('game', 0.024004541901512386),\n", + " ('deals', 0.02394702024903099),\n", + " ('themes', 0.023936597120221115),\n", + " ('my', 0.023928374753346034),\n", + " ('which', 0.023908264765228702),\n", + " ('together', 0.023887683942808241),\n", + " ('record', 0.023879473557965505),\n", + " ('chilling', 0.023877413677317431),\n", + " ('absorbing', 0.023848541510400115),\n", + " ('studios', 0.023840610970325325),\n", + " ('helps', 0.023800338082370948),\n", + " ('paul', 0.023782537407117971),\n", + " ('drama', 0.023766688862014725),\n", + " ('spots', 0.023727534480488414),\n", + " ('japanese', 0.023708475430511466),\n", + " ('com', 0.023663537310393362),\n", + " ('meets', 0.02364941593652313),\n", + " ('may', 0.023577512715288886),\n", + " ('goal', 0.023571992449256608),\n", + " ('out', 0.023558753773465099),\n", + " ('page', 0.023530160671184866),\n", + " ('con', 0.023523200814540537),\n", + " ('thankfully', 0.023405004970711688),\n", + " ('number', 0.023389568775323544),\n", + " ('captured', 0.0233510560685312),\n", + " ('joy', 0.023338854638575421),\n", + " ('brought', 0.023336907813285956),\n", + " ('max', 0.023250909447975858),\n", + " ('superbly', 
0.023239871167515604),\n", + " ('those', 0.023176845007530658),\n", + " ('course', 0.023170128305056509),\n", + " ('inspiring', 0.023124940469820009),\n", + " ('troubled', 0.02310455328814328),\n", + " ('starring', 0.023098181939380291),\n", + " ('famous', 0.023080990484234926),\n", + " ('nowadays', 0.023041214534459811),\n", + " ('gripping', 0.023039160339941956),\n", + " ('identity', 0.023038352369265165),\n", + " ('many', 0.023030059748964157),\n", + " ('victor', 0.023028627724258649),\n", + " ('michael', 0.022946522358330841),\n", + " ('stop', 0.0229270478594421),\n", + " ('eerie', 0.022877301562370833),\n", + " ('seen', 0.02282092921742266),\n", + " ('caused', 0.02279167067216753),\n", + " ('moment', 0.022789062338184278),\n", + " ('portraying', 0.02272933498308894),\n", + " ('influence', 0.022698569029077059),\n", + " ('when', 0.022541791159242774),\n", + " ('touched', 0.022525639292270222),\n", + " ('complicated', 0.022432126566344628),\n", + " ('turns', 0.022415566693423827),\n", + " ('young', 0.022415228068632005),\n", + " ('award', 0.022414761392271609),\n", + " ('put', 0.02232584900817719),\n", + " ('trust', 0.022301497663936399),\n", + " ('issues', 0.022257753376187496),\n", + " ('innocence', 0.022236928993752805),\n", + " ('anime', 0.022201683728338889),\n", + " ('without', 0.02214454398785887),\n", + " ('himself', 0.022068240705874397),\n", + " ('charlie', 0.022052037301460173),\n", + " ('parents', 0.021888138202371739),\n", + " ('covered', 0.021887533337961746),\n", + " ('final', 0.021877215769079545),\n", + " ('killers', 0.021830664900395112),\n", + " ('ages', 0.021774376677575591),\n", + " ('usual', 0.021760980512718138),\n", + " ('physical', 0.021749103191221808),\n", + " ('like', 0.021730991541426766),\n", + " ('crazy', 0.021727382570242974),\n", + " ('puts', 0.021725737321791526),\n", + " ('got', 0.0217015745002891),\n", + " ('room', 0.021690968569465629),\n", + " ('complaints', 0.021670426593916561),\n", + " ('type', 0.021663628982945167),\n", 
+ " ('brings', 0.021600600975875434),\n", + " ('remarkable', 0.021576791719396037),\n", + " ('get', 0.021538325389801372),\n", + " ('city', 0.021523385378314892),\n", + " ('coming', 0.021492351614142785),\n", + " ('traditional', 0.021430875828269802),\n", + " ('romantic', 0.021420587536168545),\n", + " ('cinema', 0.021411776829230962),\n", + " ('regular', 0.021395882255575843),\n", + " ('intelligent', 0.021391350897315448),\n", + " ('music', 0.021381013806527446),\n", + " ('humor', 0.021365697759571513),\n", + " ('experience', 0.021314525649372928),\n", + " ('favourite', 0.021253476483878254),\n", + " ('social', 0.021250085255237389),\n", + " ('feelings', 0.021245030895714369),\n", + " ('cried', 0.021233271641070736),\n", + " ('rock', 0.021213280029832356),\n", + " ('against', 0.021157314119587267),\n", + " ('including', 0.021156674122491392),\n", + " ('honest', 0.021143458758793497),\n", + " ('parallel', 0.021107353247706458),\n", + " ('eddie', 0.021080182147252734),\n", + " ('crafted', 0.020979194953745076),\n", + " ('more', 0.020933797343193825),\n", + " ('glued', 0.020931988721930153),\n", + " ('insanity', 0.02091493559910116),\n", + " ('thoroughly', 0.020905661542252773),\n", + " ('eyes', 0.020868013291281098),\n", + " ('jr', 0.020865268971014529),\n", + " ('dramas', 0.020836398428109221),\n", + " ('follows', 0.020814937146708408),\n", + " ('situation', 0.020814821105666473),\n", + " ('understood', 0.020749677092470175),\n", + " ('face', 0.020701739464945065),\n", + " ('albeit', 0.020680340389878406),\n", + " ('memorable', 0.02060826012411552),\n", + " ('accurate', 0.020585303033408744),\n", + " ('under', 0.020574430698374231),\n", + " ('arthur', 0.020562083939889467),\n", + " ('elderly', 0.020545350471808114),\n", + " ('opinion', 0.020539570922797762),\n", + " ('whoopi', 0.020515675744150079),\n", + " ('helped', 0.02047624233713053),\n", + " ('detract', 0.020443807698341674),\n", + " ('flawed', 0.020436371691432323),\n", + " ('unusually', 
0.020433523835905333),\n", + " ('performing', 0.020396957567555728),\n", + " ('smooth', 0.020347681451465382),\n", + " ('magnificent', 0.020334637688102841),\n", + " ('desperation', 0.020287768999057227),\n", + " ('lose', 0.02027753568325787),\n", + " ('satisfying', 0.020251527110272064),\n", + " ('friend', 0.020227651020398928),\n", + " ('kudos', 0.02020147732692662),\n", + " ('breaking', 0.020117861519854289),\n", + " ('elephant', 0.020115783447057049),\n", + " ('colors', 0.020112155987764873),\n", + " ('willing', 0.020087728040224333),\n", + " ('fresh', 0.020054019123593746),\n", + " ('offers', 0.020003415308141058),\n", + " ('provides', 0.020002909565985043),\n", + " ('guilt', 0.019987917970659564),\n", + " ('shouldn', 0.019907879458024358),\n", + " ('japan', 0.019906368589571694),\n", + " ('secrets', 0.019876976104814398),\n", + " ('obligatory', 0.019789665431840416),\n", + " ('dvd', 0.01978279618782345),\n", + " ('tale', 0.019752149872839884),\n", + " ('since', 0.019726258912690298),\n", + " ('roles', 0.019710495505207981),\n", + " ('breathtaking', 0.019705824135660539),\n", + " ('ground', 0.019687236524961883),\n", + " ('higher', 0.019670526139537566),\n", + " ('jean', 0.01966540008740161),\n", + " ('rich', 0.019653095716660719),\n", + " ('right', 0.019629293580435747),\n", + " ('stone', 0.019610595905669118),\n", + " ('lives', 0.019610348936710143),\n", + " ('it', 0.019542002303277586),\n", + " ('essential', 0.019533860093920406),\n", + " ('tend', 0.01952340445749683),\n", + " ('places', 0.019510216587218021),\n", + " ('recommend', 0.019506211559818135),\n", + " ('loy', 0.019481148560970919),\n", + " ('tell', 0.019450286669268763),\n", + " ('challenge', 0.019374490591710924),\n", + " ('fiction', 0.019350601498735374),\n", + " ('able', 0.019340445094151427),\n", + " ('animated', 0.019333069625267076),\n", + " ('complain', 0.019332028796550115),\n", + " ('deeper', 0.019318681931941167),\n", + " ('blew', 0.019304454395430135),\n", + " ('seeing', 
0.019302442445035525),\n", + " ('release', 0.019209904006239134),\n", + " ('unfolds', 0.019184703456013679),\n", + " ('boys', 0.019177414753158404),\n", + " ('favorites', 0.019160378141489524),\n", + " ('throughout', 0.01913689284569068),\n", + " ('marvelous', 0.01911001532194358),\n", + " ('relax', 0.019044075162625462),\n", + " ('desire', 0.019016117204605984),\n", + " ('end', 0.019014420138293211),\n", + " ('questions', 0.018977699968684848),\n", + " ('man', 0.018956744494720242),\n", + " ('rea', 0.018928733395777452),\n", + " ('comments', 0.018923870708363079),\n", + " ('vengeance', 0.018908638777923939),\n", + " ('brian', 0.018906876323023587),\n", + " ('learned', 0.018899947923704447),\n", + " ('lovely', 0.018854980464698644),\n", + " ('seasons', 0.018852496578683819),\n", + " ('shines', 0.018827509959493262),\n", + " ('justice', 0.018827310862034662),\n", + " ('succeeds', 0.018776998522312772),\n", + " ('discovered', 0.018766802216817063),\n", + " ('touch', 0.018762806738861482),\n", + " ('white', 0.018743225697414177),\n", + " ('bitter', 0.018724701999912892),\n", + " ('knows', 0.018719063288744283),\n", + " ('gene', 0.018660060796556233),\n", + " ('mainstream', 0.018654252436913925),\n", + " ('raw', 0.018609728881254832),\n", + " ('focus', 0.018605078305494939),\n", + " ('won', 0.018597537876871649),\n", + " ('ve', 0.018560162581379283),\n", + " ('million', 0.018514133006256914),\n", + " ('attention', 0.018406547682637133),\n", + " ('river', 0.018403383531225684),\n", + " ('classics', 0.018375185367387355),\n", + " ('quirky', 0.018358100535754603),\n", + " ('although', 0.01835025297382193),\n", + " ('september', 0.018345012211358883),\n", + " ('emotional', 0.018327165070951747),\n", + " ('events', 0.018324554475918103),\n", + " ('released', 0.018304767183625552),\n", + " ('thus', 0.018302709016086091),\n", + " ('rules', 0.018298967789718679),\n", + " ('trilogy', 0.018261985922288504),\n", + " ('jackie', 0.018261017705562571),\n", + " ('country', 
0.018248984107628777),\n", + " ('find', 0.018220001120247339),\n", + " ('sure', 0.018205281970545911),\n", + " ('overlooked', 0.018173644592107394),\n", + " ('sensitive', 0.018173518786609135),\n", + " ('harsh', 0.0181439980759164),\n", + " ('chair', 0.018127987063468097),\n", + " ('neatly', 0.01812304461217944),\n", + " ('round', 0.018082305853658345),\n", + " ('adult', 0.018060718859389514),\n", + " ('strength', 0.018042558269708915),\n", + " ('aunt', 0.018028313353173647),\n", + " ('description', 0.017997557340833963),\n", + " ('perspective', 0.017974761193339687),\n", + " ('closer', 0.017945066423908043),\n", + " ('extra', 0.017934760731343105),\n", + " ('hit', 0.017910740181690345),\n", + " ('tough', 0.01790450947037623),\n", + " ('work', 0.017882494289916097),\n", + " ('captivating', 0.017875072308920943),\n", + " ('swim', 0.017853354272014843),\n", + " ('holmes', 0.017846058193393119),\n", + " ('unlikely', 0.017843839699452115),\n", + " ('fears', 0.017838067451752794),\n", + " ('nominated', 0.017837439304520596),\n", + " ('neat', 0.017823068474913176),\n", + " ('discovers', 0.017801301834152447),\n", + " ('paris', 0.017798057884200066),\n", + " ('streets', 0.017746147480597597),\n", + " ('realism', 0.017729724930388033),\n", + " ('travel', 0.017694257020940296),\n", + " ('keep', 0.017684400089090127),\n", + " ('anyway', 0.017675995400919457),\n", + " ('realizes', 0.017618932935696135),\n", + " ('variety', 0.017618487604827662),\n", + " ('chief', 0.017603963834362826),\n", + " ('broke', 0.017601657476194948),\n", + " ('craven', 0.01759761349993532),\n", + " ('moves', 0.01755974422177168),\n", + " ('see', 0.017554713803040186),\n", + " ('intellectual', 0.017537349329235126),\n", + " ('normally', 0.017511237908563508),\n", + " ('technique', 0.017502265077830197),\n", + " ('dancer', 0.017501395365645257),\n", + " ('awe', 0.017467446640641385),\n", + " ('technology', 0.017414969148737205),\n", + " ('kelly', 0.017380794671638243),\n", + " ('particular', 
0.017380503339109239),\n", + " ('awards', 0.017343067374305084),\n", + " ('twisted', 0.017342731655512204),\n", + " ('manager', 0.017337683585341684),\n", + " ('fantasy', 0.017314736380004709),\n", + " ('blake', 0.017282963990552184),\n", + " ('criticism', 0.017279558676803676),\n", + " ('identify', 0.017277471199843668),\n", + " ('collection', 0.017253533052260933),\n", + " ('sidney', 0.017239120845031555),\n", + " ('ironic', 0.017225809884120879),\n", + " ('score', 0.017223046869263493),\n", + " ('charm', 0.017204164112517874),\n", + " ('lonely', 0.017192972607511965),\n", + " ('recall', 0.01718951228267028),\n", + " ('dream', 0.017185607849471308),\n", + " ('known', 0.017169341473045788),\n", + " ('hoffman', 0.017123937023014242),\n", + " ('answers', 0.01711237453169525),\n", + " ('taking', 0.017102244694823306),\n", + " ('color', 0.017086755659474467),\n", + " ('existed', 0.01708449183478003),\n", + " ('mel', 0.017080644125498479),\n", + " ('treats', 0.017076365809061661),\n", + " ('kennedy', 0.017063054110179412),\n", + " ('millionaire', 0.017058120181534069),\n", + " ('stewart', 0.01701786393539511),\n", + " ('soon', 0.017016949690113494),\n", + " ('style', 0.0169784466165274),\n", + " ('urban', 0.01696177374188856),\n", + " ('sides', 0.016958377563876276),\n", + " ('nicely', 0.016956584044665043),\n", + " ('survive', 0.016953201066203551),\n", + " ('contrast', 0.016949017788907707),\n", + " ('granted', 0.016948500759420799),\n", + " ('wes', 0.016856895803564038),\n", + " ('heroic', 0.016849533387674566),\n", + " ('sadness', 0.016836182986070529),\n", + " ('faults', 0.01683396699850543),\n", + " ('ladies', 0.016818146836646251),\n", + " ('walter', 0.0168136452096148),\n", + " ('exceptional', 0.016810242985337301),\n", + " ('dangerous', 0.016796058008032445),\n", + " ('fan', 0.016737120507724364),\n", + " ('witch', 0.016717085914917343),\n", + " ('occasionally', 0.016711349636820461),\n", + " ('movies', 0.01667668795406365),\n", + " ('celebration', 
0.01666419756672374),\n", + " ('castle', 0.016661909651854566),\n", + " ('catch', 0.016647995152024708),\n", + " ('its', 0.016639302941262299),\n", + " ('tribute', 0.016629617927918797),\n", + " ('jimmy', 0.016625132101972973),\n", + " ('bravo', 0.01661675415646004),\n", + " ('enjoying', 0.016613140144305667),\n", + " ('bus', 0.016593157501778116),\n", + " ('documentary', 0.016564651461285385),\n", + " ('frightening', 0.016559987706802774),\n", + " ('guilty', 0.016536110253664235),\n", + " ('slightly', 0.016526421724199349),\n", + " ('is', 0.016511509443399734),\n", + " ('chan', 0.016507204515006667),\n", + " ('mixed', 0.016506847567311397),\n", + " ('curious', 0.016506488394564575),\n", + " ('spirit', 0.016502977044099084),\n", + " ('pleased', 0.016487261129390265),\n", + " ('most', 0.016476759333214092),\n", + " ('chemistry', 0.016425356343989072),\n", + " ('age', 0.016410666314929885),\n", + " ('understanding', 0.016345696202945563),\n", + " ('marie', 0.016341053241072719),\n", + " ('dreams', 0.016332672013556301),\n", + " ('again', 0.016287090973937747),\n", + " ('union', 0.016282379359022561),\n", + " ('spy', 0.016278154923785912),\n", + " ('presented', 0.016273043238663493),\n", + " ('steele', 0.0162609933390068),\n", + " ('lay', 0.01625999545879786),\n", + " ('plenty', 0.016247194189832816),\n", + " ('horrors', 0.016246022980305592),\n", + " ('black', 0.016223176851856813),\n", + " ('comedy', 0.016220408022010597),\n", + " ('winner', 0.016220318857398414),\n", + " ('african', 0.01621445660979496),\n", + " ('drummer', 0.016178152199513927),\n", + " ('entertainment', 0.016173112007890973),\n", + " ('delivers', 0.016166599465683083),\n", + " ('stays', 0.016139476352793784),\n", + " ('america', 0.016108896341111501),\n", + " ('disappoint', 0.016066615933996442),\n", + " ('gorgeous', 0.016062350166815058),\n", + " ('sisters', 0.016060080355840688),\n", + " ('subsequent', 0.016043574203873964),\n", + " ('cerebral', 0.016039058904070022),\n", + " ('french', 
0.016038425317363176),\n", + " ('perfection', 0.016033154869346929),\n", + " ('likable', 0.016021713396124574),\n", + " ('warm', 0.016019144095827362),\n", + " ('studio', 0.01600723281846456),\n", + " ('late', 0.01599792335045707),\n", + " ('reality', 0.015978872249423719),\n", + " ('showed', 0.015938750644323922),\n", + " ('figures', 0.015927446608923247),\n", + " ('ever', 0.015926454600790643),\n", + " ('italy', 0.015909186780479367),\n", + " ('accustomed', 0.015906246911558279),\n", + " ('into', 0.015892173681617973),\n", + " ('he', 0.015866239932092331),\n", + " ('journey', 0.015817191390925529),\n", + " ('waters', 0.015800906878826307),\n", + " ('bill', 0.015785976148791334),\n", + " ('cousin', 0.015784382710801667),\n", + " ('explores', 0.015768756345569596),\n", + " ('originally', 0.015766016465315415),\n", + " ('astonishing', 0.015741175347778351),\n", + " ('mouse', 0.015739473070555076),\n", + " ('affect', 0.01571979846044327),\n", + " ('authenticity', 0.015716491136675288),\n", + " ('key', 0.015706372736941265),\n", + " ('authorities', 0.015700111946298504),\n", + " ('fortunately', 0.015676427069879852),\n", + " ('notes', 0.015668388567765472),\n", + " ('disagree', 0.01565982223146424),\n", + " ('advanced', 0.015653464856497615),\n", + " ('contribution', 0.015651919381489538),\n", + " ('flaw', 0.015630623175485563),\n", + " ('burning', 0.015593951152590373),\n", + " ('scoop', 0.015580911014213491),\n", + " ('levels', 0.015579506047588173),\n", + " ('dead', 0.015575945832152268),\n", + " ('reveals', 0.015552631094426436),\n", + " ('explicit', 0.015535052542383243),\n", + " ('fault', 0.015532818014787668),\n", + " ('requires', 0.015440001642516228),\n", + " ('way', 0.015434313286947611),\n", + " ('waitress', 0.015433929845739235),\n", + " ('vividly', 0.015399209375312223),\n", + " ('truman', 0.015388667015530336),\n", + " ('leslie', 0.015388355420398656),\n", + " ('cool', 0.015362419182461007),\n", + " ('i', 0.015358846209804456),\n", + " ('dated', 
0.015351894934707868),\n", + " ('ruthless', 0.015347223840634977),\n", + " ('anymore', 0.015327840988573715),\n", + " ('batman', 0.015325445892906487),\n", + " ('york', 0.01532365079728272),\n", + " ('expressions', 0.015290943599335201),\n", + " ('terms', 0.015285161966075789),\n", + " ('sunday', 0.01527998232990482),\n", + " ('chinese', 0.015240680418926658),\n", + " ('done', 0.01523073330930268),\n", + " ('behind', 0.015219079842199843),\n", + " ('event', 0.015214794169662843),\n", + " ('chamberlain', 0.015214082741427187),\n", + " ('mysteries', 0.01520455675940993),\n", + " ('manages', 0.015203486934632001),\n", + " ('simpsons', 0.01519184981292622),\n", + " ('mine', 0.015191085212402707),\n", + " ('canadian', 0.015117611742208799),\n", + " ('purple', 0.015100505661562475),\n", + " ('website', 0.015095063701722861),\n", + " ('master', 0.015091528696557652),\n", + " ('charming', 0.015088362486196544),\n", + " ('joe', 0.015081920177878145),\n", + " ('reservations', 0.015077821343474082),\n", + " ('fever', 0.015076873583983717),\n", + " ('covers', 0.0150472334532588),\n", + " ('madness', 0.015030361859657219),\n", + " ('glimpse', 0.014991086926970959),\n", + " ('pilot', 0.014978443271049663),\n", + " ('johansson', 0.014975808461544404),\n", + " ('explains', 0.01497051208022746),\n", + " ('excellently', 0.014970388571598842),\n", + " ('hawke', 0.014969750109931358),\n", + " ('genuinely', 0.01494767277070257),\n", + " ('often', 0.014942833143544479),\n", + " ('cube', 0.01493992870936536),\n", + " ('clean', 0.014937853229023529),\n", + " ('ensemble', 0.01491365690908787),\n", + " ('referred', 0.014910582069880145),\n", + " ('replies', 0.014907131594945566),\n", + " ('disease', 0.014895193110452171),\n", + " ('wish', 0.014892245549307062),\n", + " ('logical', 0.014888665766304059),\n", + " ('nathan', 0.014869928851670398),\n", + " ('aware', 0.014869867112894513),\n", + " ('exciting', 0.014823139694980617),\n", + " ('gone', 0.014821497224651536),\n", + " ('critics', 
0.014818559383907352),\n", + " ('split', 0.014788117032985607),\n", + " ('series', 0.01477070870316219),\n", + " ('henry', 0.014757735101897458),\n", + " ('prisoners', 0.014747710184003867),\n", + " ('sentenced', 0.01474621990650384),\n", + " ('laughing', 0.014722151818909785),\n", + " ('president', 0.01467176677949055),\n", + " ('list', 0.014666775185665167),\n", + " ('ones', 0.01465899785410933),\n", + " ('information', 0.014651687169784227),\n", + " ('bonus', 0.014648059891508164),\n", + " ('chicago', 0.014631769872667602),\n", + " ('someday', 0.01462934047526257),\n", + " ('splendid', 0.014609703424340649),\n", + " ('surprises', 0.01460882405466246),\n", + " ('sentimental', 0.01459136104528796),\n", + " ('admit', 0.014588098910742801),\n", + " ('previously', 0.014571223247118629),\n", + " ('conveys', 0.014567143509152131),\n", + " ('prominent', 0.014547363114083278),\n", + " ('born', 0.014536990751946697),\n", + " ('necessary', 0.014533225697989451),\n", + " ('yes', 0.014531704633026971),\n", + " ('marvel', 0.014527554209112409),\n", + " ('initially', 0.014510187714555971),\n", + " ('jake', 0.01450250940847886),\n", + " ('matters', 0.014497730426084206),\n", + " ('lucas', 0.014496736417950703),\n", + " ('stories', 0.014475382661229951),\n", + " ('happy', 0.014471040644253801),\n", + " ('improvement', 0.014459225025278402),\n", + " ('anger', 0.014440696969299309),\n", + " ('hong', 0.014412020732763237),\n", + " ('devotion', 0.01440616559418076),\n", + " ('infamous', 0.014402483161136861),\n", + " ('sir', 0.014390585849942569),\n", + " ('fashioned', 0.014376495163092872),\n", + " ('whenever', 0.014311984840844725),\n", + " ('facing', 0.014311813694297491),\n", + " ('spin', 0.014300937890947234),\n", + " ('clear', 0.014297831903635039),\n", + " ('verhoeven', 0.014290838087095126),\n", + " ('onto', 0.014287704198288405),\n", + " ('sheriff', 0.014266680346279266),\n", + " ('boy', 0.014238393212172486),\n", + " ('felix', 0.014236371593101718),\n", + " ('what', 
0.014231196728127834),\n", + " ('site', 0.014212839329217037),\n", + " ('hits', 0.014208508715996914),\n", + " ('convincingly', 0.014165838532387461),\n", + " ('adventures', 0.014158492204346281),\n", + " ('multiple', 0.014150723728410515),\n", + " ('wrapped', 0.014118759103459121),\n", + " ('reveal', 0.014076510653822791),\n", + " ('toby', 0.014075221493111762),\n", + " ('months', 0.014061986005374691),\n", + " ('comedies', 0.01405030180887607),\n", + " ('shot', 0.014031987455271906),\n", + " ('holds', 0.014023504904484209),\n", + " ('weeks', 0.014002257803042343),\n", + " ('window', 0.013985434541614852),\n", + " ('received', 0.013983301709629945),\n", + " ('him', 0.013968181093938306),\n", + " ('court', 0.013964352058193522),\n", + " ('double', 0.013960483190947271),\n", + " ('refuses', 0.013957613385590649),\n", + " ('stand', 0.013948813859221343),\n", + " ('shocked', 0.013935157243261932),\n", + " ('powell', 0.013934062441977025),\n", + " ('brutal', 0.013924129605946699),\n", + " ('among', 0.013913156765292936),\n", + " ('prostitute', 0.013911765274631791),\n", + " ('nine', 0.013882343344720896),\n", + " ('timeless', 0.013858274395499411),\n", + " ('likes', 0.013844971514262235),\n", + " ('kurosawa', 0.013820064338774897),\n", + " ('fact', 0.013814297186034372),\n", + " ('ass', 0.013813899781949794),\n", + " ('deanna', 0.013799520782801165),\n", + " ('almost', 0.013791517357271334),\n", + " ('technicolor', 0.013790541990858995),\n", + " ('adventure', 0.013782999907047075),\n", + " ('gerard', 0.013776140434137588),\n", + " ('analysis', 0.013764039325045373),\n", + " ('mid', 0.013747853289146203),\n", + " ('stanwyck', 0.013738927891779253),\n", + " ('mann', 0.013726915645691871),\n", + " ('stuart', 0.013700229069235785),\n", + " ('reluctantly', 0.013697113976504024),\n", + " ('humanity', 0.013690830736911051),\n", + " ('classical', 0.013688949911986581),\n", + " ('health', 0.01368478464061345),\n", + " ('edie', 0.013683859176013944),\n", + " ('british', 
0.013666460250876467),\n", + " ('primary', 0.013661794714033899),\n", + " ('coaster', 0.013660631014138398),\n", + " ('explore', 0.013656042478726916),\n", + " ('china', 0.013638756081011155),\n", + " ('advantage', 0.013631698822745392),\n", + " ('protagonists', 0.013627593648932788),\n", + " ('partly', 0.013617059618125359),\n", + " ('artist', 0.01359712346550283),\n", + " ('terrifying', 0.013581203319898157),\n", + " ('scarlett', 0.013567078625941562),\n", + " ('mesmerizing', 0.013547816899479412),\n", + " ('prince', 0.013541105943095601),\n", + " ('weird', 0.013535346249579552),\n", + " ('vance', 0.013518150392608123),\n", + " ('collect', 0.013513303578887654),\n", + " ('humour', 0.013508890166677976),\n", + " ('doc', 0.013507286431402924),\n", + " ('history', 0.013506120200788261),\n", + " ('miss', 0.013498187990897415),\n", + " ('angles', 0.013497507265665429),\n", + " ('dealers', 0.01349360723438389),\n", + " ('mass', 0.013472328625932868),\n", + " ('paramount', 0.01346754666234452),\n", + " ('musicians', 0.01346451713868627),\n", + " ('jackman', 0.013441428735872099),\n", + " ('cheer', 0.013440230376864145),\n", + " ('aired', 0.013427957547366864),\n", + " ('personal', 0.013422418887670075),\n", + " ('become', 0.013415910991211782),\n", + " ('wang', 0.013406655764270567),\n", + " ('unforgettable', 0.013405651085753994),\n", + " ('theme', 0.013397995857105521),\n", + " ('satisfy', 0.013361012634637449),\n", + " ('beginning', 0.013353575498360106),\n", + " ('tongue', 0.013332587937334753),\n", + " ('ran', 0.013322580056022448),\n", + " ('vh', 0.013321694862247341),\n", + " ('april', 0.01331795808268902),\n", + " ('cracking', 0.01331648265485188),\n", + " ('hilariously', 0.013312111975215809),\n", + " ('addictive', 0.013304056341282523),\n", + " ('factory', 0.013302408850101522),\n", + " ('bloom', 0.013287106893282021),\n", + " ('outcome', 0.013278893812795747),\n", + " ('startling', 0.013276469703553513),\n", + " ('portrait', 0.013273055100999263),\n", + " 
('adapted', 0.013258514308676842),\n", + " ('raines', 0.013257908724754863),\n", + " ('sky', 0.013252502620889894),\n", + " ('earlier', 0.013233110743632566),\n", + " ('atlantis', 0.013228188610144569),\n", + " ('delirious', 0.013226874818125444),\n", + " ('titanic', 0.013205633401144464),\n", + " ('nevertheless', 0.013198200611184926),\n", + " ('proved', 0.013189760358384484),\n", + " ('denzel', 0.013188430841614762),\n", + " ('pleasant', 0.013180077348723358),\n", + " ('horses', 0.013178651568029467),\n", + " ('about', 0.013166154528006849),\n", + " ('astounding', 0.013161698337226808),\n", + " ('savage', 0.013154100553759925),\n", + " ('winning', 0.013153246708379673),\n", + " ('rose', 0.013145586701309773),\n", + " ('fitting', 0.013133578254330341),\n", + " ('compared', 0.013131693803520047),\n", + " ('took', 0.01311934348149899),\n", + " ('masterson', 0.013112762074217889),\n", + " ('owner', 0.013108690454819136),\n", + " ('delight', 0.013107278788311007),\n", + " ('conventions', 0.01310603977069605),\n", + " ('natali', 0.013094964441143216),\n", + " ('message', 0.013093664295113419),\n", + " ('stood', 0.013090122718303433),\n", + " ('sailor', 0.013058959170423452),\n", + " ('ida', 0.013058842950256239),\n", + " ('escaping', 0.01305272362470678),\n", + " ('top', 0.013047466741024423),\n", + " ('louis', 0.013046238442637026),\n", + " ('peace', 0.013040907918892317),\n", + " ('several', 0.013028244887060291),\n", + " ('info', 0.013023754625550183),\n", + " ('graphics', 0.013020850288881853),\n", + " ('reflection', 0.013019243823940103),\n", + " ('slimy', 0.013014377070231845),\n", + " ('elvira', 0.013009811638957062),\n", + " ('andre', 0.013000047313446738),\n", + " ('kong', 0.012999080313300514),\n", + " ('mayor', 0.012994758409723568),\n", + " ('punishment', 0.012988264949614945),\n", + " ('morris', 0.012983710119604966),\n", + " ('hall', 0.012981593609354825),\n", + " ('match', 0.012980233583057327),\n", + " ('bleak', 0.01297250508630406),\n", + " ('lindy', 
0.01297224893312126),\n", + " ('sequence', 0.012964435808713577),\n", + " ('learn', 0.012938848970083346),\n", + " ('happen', 0.01293283638787375),\n", + " ('john', 0.012929524979001674),\n", + " ('gothic', 0.012926957011734876),\n", + " ('wider', 0.012920985981480957),\n", + " ('popular', 0.012891690509844084),\n", + " ('diverse', 0.012875263936567821),\n", + " ('compare', 0.012869395292065185),\n", + " ('brooklyn', 0.012852986243263928),\n", + " ('broadcast', 0.012839574692097613),\n", + " ('zane', 0.012834302957709142),\n", + " ('andrew', 0.012824020940615251),\n", + " ('finely', 0.012822716004015855),\n", + " ('confronted', 0.012817523686608628),\n", + " ('going', 0.012809762839304965),\n", + " ('likewise', 0.012804639349082507),\n", + " ('breath', 0.012790132659417907),\n", + " ('building', 0.01278980970479387),\n", + " ('suggesting', 0.012780624321169344),\n", + " ('contemporary', 0.012772749462937518),\n", + " ('midnight', 0.012766963563112074),\n", + " ('victoria', 0.012756422131580529),\n", + " ('lasting', 0.01275242441564259),\n", + " ('kitty', 0.012751468371946009),\n", + " ('continued', 0.012744325456485406),\n", + " ('indian', 0.012712962842718672),\n", + " ('subplots', 0.012709887814283907),\n", + " ('douglas', 0.012693830679455903),\n", + " ('explosions', 0.012692697593201845),\n", + " ('bond', 0.012689802823687826),\n", + " ('delightfully', 0.012669417460922622),\n", + " ('understated', 0.012669374312789354),\n", + " ('greater', 0.012664580396020159),\n", + " ('sailing', 0.01266242458128243),\n", + " ('images', 0.012661803048859862),\n", + " ('copy', 0.012624649645734171),\n", + " ('seat', 0.012610464273152516),\n", + " ('eleven', 0.012602533659978897),\n", + " ('riveting', 0.012591829460094517),\n", + " ('boiled', 0.012588863529638759),\n", + " ('academy', 0.012581996178142985),\n", + " ('whilst', 0.012569841653295643),\n", + " ('heaven', 0.012547361621330928),\n", + " ('fruit', 0.012543513029693254),\n", + " ('reviewer', 0.012534273375083898),\n", 
+ " ('cost', 0.012529643005796611),\n", + " ('week', 0.012522845015008296),\n", + " ('intriguing', 0.012508687653306347),\n", + " ('streak', 0.012507752385208555),\n", + " ('san', 0.012502130058217927),\n", + " ('awareness', 0.01247644644201245),\n", + " ('catching', 0.012467108595451536),\n", + " ('kicks', 0.012457714930570577),\n", + " ('complexities', 0.012454362663082467),\n", + " ('draws', 0.012447753285125906),\n", + " ('easily', 0.012444885855614887),\n", + " ('ealing', 0.012444339255708927),\n", + " ('psychopath', 0.012431259926282273),\n", + " ('skin', 0.012424248540973567),\n", + " ('creative', 0.012386713452491538),\n", + " ('recognition', 0.012354025801439421),\n", + " ('downey', 0.012348698765161131),\n", + " ('symbolism', 0.012329925038271319),\n", + " ('touches', 0.012328013470751468),\n", + " ('everyday', 0.012324934809895891),\n", + " ('achieves', 0.012314898707483488),\n", + " ('outcast', 0.01231366223021968),\n", + " ('overwhelmed', 0.012306633138869481),\n", + " ...]" + ] + }, + "execution_count": 133, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_most_similar_words(\"excellent\")" + ] + }, + { + "cell_type": "code", + "execution_count": 134, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('worst', 0.16966107259049848),\n", + " ('awful', 0.12026847019691242),\n", + " ('waste', 0.11945367265311002),\n", + " ('poor', 0.092758887574435483),\n", + " ('terrible', 0.091425387197727914),\n", + " ('dull', 0.084209271678223591),\n", + " ('poorly', 0.081241544516042027),\n", + " ('disappointment', 0.080064759621368692),\n", + " ('fails', 0.078599773723337499),\n", + " ('disappointing', 0.07733948548032335),\n", + " ('boring', 0.077127858748012895),\n", + " ('unfortunately', 0.075502449705859093),\n", + " ('worse', 0.070601835364194662),\n", + " ('mess', 0.070564299623590385),\n", + " ('stupid', 0.069484822832543036),\n", + " ('badly', 0.066888903666228572),\n", + " 
('annoying', 0.065687021903374138),\n", + " ('bad', 0.063093814537572138),\n", + " ('save', 0.06288059749586572),\n", + " ('disappointed', 0.062692353812072846),\n", + " ('wasted', 0.061387183028051268),\n", + " ('supposed', 0.060985452957725138),\n", + " ('horrible', 0.060121772339380097),\n", + " ('laughable', 0.05869840628546763),\n", + " ('crap', 0.058104528667884549),\n", + " ('basically', 0.057218840369636148),\n", + " ('nothing', 0.057158220043034176),\n", + " ('ridiculous', 0.056905481068931438),\n", + " ('lacks', 0.055766565889465436),\n", + " ('lame', 0.055616009058110163),\n", + " ('avoid', 0.055518726073197189),\n", + " ('unless', 0.054208926212940739),\n", + " ('script', 0.053948359467048485),\n", + " ('failed', 0.05341393055000912),\n", + " ('pointless', 0.052855531546894111),\n", + " ('oh', 0.052761580933176816),\n", + " ('effort', 0.050773747127292324),\n", + " ('guess', 0.050379576420076538),\n", + " ('minutes', 0.049784532804242179),\n", + " ('wooden', 0.049453108380727175),\n", + " ('redeeming', 0.049182869114721736),\n", + " ('seems', 0.049079625154669751),\n", + " ('instead', 0.047957645123532268),\n", + " ('weak', 0.046496387374765677),\n", + " ('pathetic', 0.046099741149715746),\n", + " ('looks', 0.045796536730244836),\n", + " ('hoping', 0.045082242887577006),\n", + " ('wonder', 0.044669791780934595),\n", + " ('forgettable', 0.042854349251871711),\n", + " ('silly', 0.042237829687270009),\n", + " ('attempt', 0.041706299941373509),\n", + " ('predictable', 0.041514442438568111),\n", + " ('someone', 0.041506119027337314),\n", + " ('sorry', 0.04086887728153335),\n", + " ('might', 0.040445683500688362),\n", + " ('slow', 0.040346869107034944),\n", + " ('painful', 0.040220039039613249),\n", + " ('thin', 0.040062642253777855),\n", + " ('mediocre', 0.03940716537757738),\n", + " ('garbage', 0.039310979440981095),\n", + " ('money', 0.038907973313640501),\n", + " ('none', 0.038300807052230948),\n", + " ('bland', 0.038062246057085039),\n", + " ('couldn', 
0.038016664218957927),\n", + " ('either', 0.037738833070341968),\n", + " ('unfunny', 0.037076629805044496),\n", + " ('entire', 0.036642119399463165),\n", + " ('cheap', 0.036516800802525562),\n", + " ('honestly', 0.036212041543797806),\n", + " ('mildly', 0.035744850608185635),\n", + " ('total', 0.035560454471013067),\n", + " ('neither', 0.035415946043548564),\n", + " ('making', 0.035244315060985604),\n", + " ('problem', 0.035088251034562444),\n", + " ('flat', 0.034518947038747069),\n", + " ('bizarre', 0.034509460694521141),\n", + " ('group', 0.034335883528586783),\n", + " ('dreadful', 0.034287618511331844),\n", + " ('ludicrous', 0.034159649323816037),\n", + " ('decent', 0.033771585787868943),\n", + " ('clich', 0.033751444631720563),\n", + " ('daughter', 0.033732725858384868),\n", + " ('bored', 0.033622879572852551),\n", + " ('horror', 0.033464120619956815),\n", + " ('writing', 0.033437913916756788),\n", + " ('skip', 0.033430639850491162),\n", + " ('absurd', 0.033154173530163311),\n", + " ('barely', 0.032653416827517712),\n", + " ('idea', 0.032584013175663208),\n", + " ('wasn', 0.032481207966272067),\n", + " ('fake', 0.032136435098031532),\n", + " ('believe', 0.031677858935800801),\n", + " ('uninteresting', 0.031526815915867132),\n", + " ('reason', 0.031390715260270527),\n", + " ('scenes', 0.031216362935389166),\n", + " ('alright', 0.031046883113956258),\n", + " ('body', 0.030999982945986659),\n", + " ('no', 0.030917695380560415),\n", + " ('insult', 0.030808450146355922),\n", + " ('mst', 0.030527916471397853),\n", + " ('nowhere', 0.030352177599338292),\n", + " ('lousy', 0.030160195468380797),\n", + " ('didn', 0.030115903194061419),\n", + " ('interest', 0.029888118468771117),\n", + " ('half', 0.02981324611505725),\n", + " ('lee', 0.029804235955718638),\n", + " ('dimensional', 0.029562861996904038),\n", + " ('unconvincing', 0.029322607679950232),\n", + " ('left', 0.029322408787030522),\n", + " ('sex', 0.029296748476082143),\n", + " ('even', 0.029225209450923412),\n", + 
" ('far', 0.029192618334294547),\n", + " ('tries', 0.029004001132703523),\n", + " ('anything', 0.028988097743501137),\n", + " ('trying', 0.028919477228465107),\n", + " ('accent', 0.028779542310252575),\n", + " ('nudity', 0.028662654953266066),\n", + " ('apparently', 0.028291626941517919),\n", + " ('zombies', 0.028178583120430672),\n", + " ('sense', 0.028166740534758782),\n", + " ('incoherent', 0.027988926190862507),\n", + " ('something', 0.027986519420278216),\n", + " ('tedious', 0.027952212405329514),\n", + " ('wrong', 0.027831947557365632),\n", + " ('were', 0.027825695799985381),\n", + " ('endless', 0.027824591794431464),\n", + " ('turkey', 0.027624266205058482),\n", + " ('zombie', 0.027543333835110859),\n", + " ('appears', 0.027469840878483233),\n", + " ('embarrassing', 0.027425437142424351),\n", + " ('walked', 0.027411768647042707),\n", + " ('premise', 0.027346072285964175),\n", + " ('ok', 0.027333008356232001),\n", + " ('result', 0.027312558653191901),\n", + " ('complete', 0.027247564384243413),\n", + " ('t', 0.02718673746561022),\n", + " ('least', 0.02694907263201728),\n", + " ('was', 0.026917906772065299),\n", + " ('unwatchable', 0.026829458762459381),\n", + " ('sat', 0.026806511532143459),\n", + " ('to', 0.026801902698524071),\n", + " ('sadly', 0.026753380035391502),\n", + " ('christmas', 0.026735555962199221),\n", + " ('gore', 0.0266701616306084),\n", + " ('mother', 0.026612696987437748),\n", + " ('aspects', 0.026583237615263797),\n", + " ('amateurish', 0.026565159291175689),\n", + " ('below', 0.026548271016778126),\n", + " ('stupidity', 0.026460990221946916),\n", + " ('appeal', 0.026396596713420969),\n", + " ('trite', 0.026331168557051404),\n", + " ('then', 0.026284629203937655),\n", + " ('rubbish', 0.026216695246125493),\n", + " ('okay', 0.025981446095883619),\n", + " ('sucks', 0.025930224401969335),\n", + " ('pretentious', 0.02590791237062829),\n", + " ('positive', 0.025773976409798761),\n", + " ('confusing', 0.025737618729473628),\n", + " ('remotely', 
0.025699566061653016),\n", + " ('obnoxious', 0.025454829745850248),\n", + " ('m', 0.025435495928249188),\n", + " ('rent', 0.025373441934038503),\n", + " ('laughs', 0.025346512576104405),\n", + " ('re', 0.025342239903627856),\n", + " ('context', 0.025274382593713566),\n", + " ('disgusting', 0.025195418263468175),\n", + " ('so', 0.025148024611438793),\n", + " ('tiresome', 0.025031684199042097),\n", + " ('miscast', 0.024970026716882358),\n", + " ('aren', 0.024968703889385907),\n", + " ('forced', 0.024933299777713691),\n", + " ('paid', 0.024906929703330336),\n", + " ('utter', 0.024802282233385511),\n", + " ('uninspired', 0.024799576212017463),\n", + " ('falls', 0.024749631706810708),\n", + " ('throw', 0.024614954073046699),\n", + " ('been', 0.024470487429445055),\n", + " ('ugly', 0.024334820044832374),\n", + " ('hopes', 0.024315635652054308),\n", + " ('dire', 0.024191221840051083),\n", + " ('hunter', 0.02417129112741848),\n", + " ('producers', 0.024089231997130214),\n", + " ('seem', 0.024065146985976858),\n", + " ('straight', 0.02399666645155215),\n", + " ('vampire', 0.023942797574072673),\n", + " ('paper', 0.023908828083961012),\n", + " ('crappy', 0.023807255546688062),\n", + " ('excited', 0.023764516357875833),\n", + " ('start', 0.023739057832096774),\n", + " ('material', 0.023729757962158746),\n", + " ('excuse', 0.023681577270328096),\n", + " ('cop', 0.023480677028928129),\n", + " ('f', 0.023312251619610848),\n", + " ('ms', 0.023282327986278314),\n", + " ('villain', 0.023158273483660733),\n", + " ('fest', 0.023091425711778239),\n", + " ('lack', 0.023039437894325183),\n", + " ('such', 0.023031161078650962),\n", + " ('saving', 0.023025745893238067),\n", + " ('clichs', 0.022928209200342307),\n", + " ('enough', 0.02292139725392528),\n", + " ('mistake', 0.022868689470375),\n", + " ('unbelievable', 0.022864325693347887),\n", + " ('maybe', 0.022825002748295277),\n", + " ('blame', 0.022808369279543168),\n", + " ('bunch', 0.022769532876362856),\n", + " ('version', 
0.022753296945755487),\n", + " ('candy', 0.022749363632616742),\n", + " ('island', 0.02274580066608017),\n", + " ('tripe', 0.022695188509832674),\n", + " ('wasting', 0.022681371343356752),\n", + " ('inept', 0.022679276425665765),\n", + " ('actor', 0.02263697537177102),\n", + " ('flop', 0.022613758633444538),\n", + " ('any', 0.022560608437607196),\n", + " ('k', 0.022554017579615032),\n", + " ('appalling', 0.022500975853556059),\n", + " ('propaganda', 0.022465024430755737),\n", + " ('major', 0.022430482324246572),\n", + " ('sequel', 0.022362296462477865),\n", + " ('offensive', 0.022326080604825445),\n", + " ('revenge', 0.022315150942472609),\n", + " ('shoot', 0.02228810570921174),\n", + " ('whatsoever', 0.022286498346940933),\n", + " ('ruined', 0.022173811528211046),\n", + " ('painfully', 0.022152008209040921),\n", + " ('on', 0.022016020939730041),\n", + " ('shame', 0.021981493467648269),\n", + " ('effects', 0.021849482201960254),\n", + " ('wouldn', 0.021848506706035151),\n", + " ('development', 0.021773241990065747),\n", + " ('plot', 0.021733893676650608),\n", + " ('co', 0.021728673026887642),\n", + " ('church', 0.021719723717009982),\n", + " ('storyline', 0.021663404462350763),\n", + " ('screenwriter', 0.02166017725248592),\n", + " ('bother', 0.02157169990956697),\n", + " ('miserably', 0.021516173872499805),\n", + " ('christian', 0.021515873507543644),\n", + " ('add', 0.021468134313277938),\n", + " ('found', 0.021449077767987133),\n", + " ('watching', 0.021344833140596573),\n", + " ('pseudo', 0.021308384076023465),\n", + " ('boredom', 0.021119995917930002),\n", + " ('please', 0.021090765093296306),\n", + " ('talent', 0.021005847445274794),\n", + " ('continuity', 0.021005145852421921),\n", + " ('talents', 0.020992716564348882),\n", + " ('college', 0.020990718952374872),\n", + " ('tried', 0.020978219626186817),\n", + " ('editing', 0.020865814801443755),\n", + " ('lines', 0.020853755408845792),\n", + " ('drivel', 0.020726493692759695),\n", + " ('generous', 
0.020697017742241999),\n", + " ('potential', 0.020672988272090822),\n", + " ('creatures', 0.020601399429061324),\n", + " ('disjointed', 0.020581338926655212),\n", + " ('irritating', 0.020576764848872681),\n", + " ('pile', 0.020560898967541538),\n", + " ('acts', 0.020560043588043517),\n", + " ('junk', 0.020558505639508208),\n", + " ('raped', 0.020550629285133258),\n", + " ('christ', 0.020481424289613519),\n", + " ('brain', 0.020431161137662711),\n", + " ('slasher', 0.020425652445140888),\n", + " ('seconds', 0.020390927443421889),\n", + " ('nobody', 0.020389268101762604),\n", + " ('dialog', 0.020338349197601486),\n", + " ('makers', 0.020333184431951125),\n", + " ('excitement', 0.0202904560242918),\n", + " ('flashbacks', 0.020267510512910234),\n", + " ('sloppy', 0.020234078734398357),\n", + " ('joke', 0.020212187048528514),\n", + " ('sleep', 0.020108895811675787),\n", + " ('bottom', 0.019986770547280194),\n", + " ('however', 0.019981104962051167),\n", + " ('fail', 0.01993740521162023),\n", + " ('sucked', 0.019874923017311572),\n", + " ('soap', 0.019853525395543012),\n", + " ('looked', 0.019810211840927107),\n", + " ('stinks', 0.019769365381781159),\n", + " ('deserve', 0.019614034321096468),\n", + " ('exact', 0.019555320028259),\n", + " ('substance', 0.019552647432498176),\n", + " ('yeah', 0.019513150136671549),\n", + " ('production', 0.019510696746296522),\n", + " ('female', 0.0194769149781218),\n", + " ('unintentional', 0.019387723280198922),\n", + " ('army', 0.019364852889641605),\n", + " ('minute', 0.019351862554568222),\n", + " ('unrealistic', 0.019350657250497855),\n", + " ('rescue', 0.019340920364464904),\n", + " ('theater', 0.019333829276668497),\n", + " ('monsters', 0.019332636015751026),\n", + " ('frankly', 0.019326550823843876),\n", + " ('children', 0.019314240606868868),\n", + " ('convince', 0.019312073515560635),\n", + " ('shallow', 0.019298445504930546),\n", + " ('synopsis', 0.019259706392396589),\n", + " ('scott', 0.01918347440557033),\n", + " 
('seriously', 0.019182027987149994),\n", + " ('ridiculously', 0.019169300285178967),\n", + " ('looking', 0.019150985439966562),\n", + " ('kareena', 0.019110212601710658),\n", + " ('wrote', 0.019015323411486429),\n", + " ('attempts', 0.019006343780653929),\n", + " ('bothered', 0.018970712777578509),\n", + " ('utterly', 0.018924824767803397),\n", + " ('giant', 0.018891084650049701),\n", + " ('writers', 0.018868906582101302),\n", + " ('atrocious', 0.018848042351202358),\n", + " ('plain', 0.018828766525513598),\n", + " ('presumably', 0.018826629750947937),\n", + " ('example', 0.018796453237837171),\n", + " ('murray', 0.018754173430046931),\n", + " ('seemed', 0.018749132295913067),\n", + " ('stay', 0.01874415970643268),\n", + " ('interview', 0.018672085964709519),\n", + " ('disaster', 0.018553283301235145),\n", + " ('value', 0.018544080955166367),\n", + " ('paint', 0.018529607132429377),\n", + " ('original', 0.018528190682362417),\n", + " ('difficult', 0.018518455298178582),\n", + " ('care', 0.018494804801171251),\n", + " ('watchable', 0.01848187060538909),\n", + " ('useless', 0.018470481000366853),\n", + " ('desperately', 0.018421675047000256),\n", + " ('except', 0.018391993551238547),\n", + " ('doing', 0.018384737621350646),\n", + " ('errors', 0.018380414978330258),\n", + " ('solely', 0.018349321075079389),\n", + " ('sitting', 0.018346519170301077),\n", + " ('giving', 0.018335957397904827),\n", + " ('ideas', 0.018327099221245188),\n", + " ('unbearable', 0.018321159676201411),\n", + " ('advice', 0.01827337252768883),\n", + " ('nor', 0.018254420259554285),\n", + " ('project', 0.018252633214771746),\n", + " ('dozen', 0.018206363291515752),\n", + " ('charles', 0.018163660578293446),\n", + " ('plastic', 0.018161741020378652),\n", + " ('book', 0.018139011699011297),\n", + " ('shots', 0.018114876064363863),\n", + " ('ill', 0.018103621818215732),\n", + " ('grade', 0.018088309511242358),\n", + " ('where', 0.018065882599695146),\n", + " ('women', 0.018026883825059355),\n", + " 
('screenplay', 0.018014307024101311),\n", + " ('through', 0.017990863003241389),\n", + " ('actress', 0.017876003487857155),\n", + " ('sign', 0.01786563614405693),\n", + " ('walk', 0.017823522607756631),\n", + " ('santa', 0.017727102733219178),\n", + " ('happens', 0.017722408798843584),\n", + " ('contrived', 0.017720303645882781),\n", + " ('gun', 0.01768599317693384),\n", + " ('ashamed', 0.017679623098721585),\n", + " ('gratuitous', 0.017665737783803856),\n", + " ('one', 0.017608259344043253),\n", + " ('not', 0.017562336441189891),\n", + " ('credibility', 0.017558852870687949),\n", + " ('promising', 0.017544417082572289),\n", + " ('risk', 0.017532600100721239),\n", + " ('sub', 0.017531947750389465),\n", + " ('lacking', 0.017513759836446527),\n", + " ('fell', 0.017464857159331278),\n", + " ('scenery', 0.017451365955319952),\n", + " ('flesh', 0.017402514298262693),\n", + " ('animal', 0.017386681692205423),\n", + " ('tired', 0.017383214541566681),\n", + " ('writer', 0.017380887757560838),\n", + " ('lady', 0.017370657212565484),\n", + " ('dialogue', 0.017319373946647603),\n", + " ('terribly', 0.017291135257276879),\n", + " ('downright', 0.017277675563205447),\n", + " ('rented', 0.017247977656900705),\n", + " ('clumsy', 0.017241290805182073),\n", + " ('blah', 0.017217377177396766),\n", + " ('random', 0.017199913549247985),\n", + " ('members', 0.017198947117344765),\n", + " ('three', 0.017189383912215896),\n", + " ('celluloid', 0.017174000803758888),\n", + " ('your', 0.017140173886430049),\n", + " ('lost', 0.017127763322061815),\n", + " ('suddenly', 0.017124566068806111),\n", + " ('cover', 0.017066680835874294),\n", + " ('existent', 0.017028540662919325),\n", + " ('mostly', 0.017009366180205404),\n", + " ('dig', 0.016990887715494299),\n", + " ('spending', 0.016944400877991015),\n", + " ('elsewhere', 0.016937877167916525),\n", + " ('suck', 0.016897737192407582),\n", + " ('apparent', 0.016783874225807266),\n", + " ('fill', 0.016766110935370601),\n", + " ('running', 
0.016728621099996368),\n", + " ('jokes', 0.016718920312228033),\n", + " ('cheese', 0.016699473014889825),\n", + " ('outer', 0.016612591391981471),\n", + " ('anil', 0.016581200840654876),\n", + " ('director', 0.01651289445031142),\n", + " ('awfully', 0.016492200414985295),\n", + " ('mix', 0.016468214294032502),\n", + " ('naturally', 0.016404879835269445),\n", + " ('scientist', 0.016395078905109238),\n", + " ('imdb', 0.016343168034107167),\n", + " ('dumb', 0.016289693549692445),\n", + " ('made', 0.016279809910441426),\n", + " ('curiosity', 0.016277433551029966),\n", + " ('somewhere', 0.016236117446747977),\n", + " ('stereotyped', 0.016235814767295294),\n", + " ('officer', 0.016235401039884571),\n", + " ('shelf', 0.016151304702362455),\n", + " ('spends', 0.016089566181633208),\n", + " ('explanation', 0.016040330428242214),\n", + " ('proof', 0.016021381235154272),\n", + " ('killed', 0.016004979798664866),\n", + " ('songs', 0.016002280189188103),\n", + " ('why', 0.015994497048455167),\n", + " ('adequate', 0.0159780034105916),\n", + " ('assume', 0.015953574865902424),\n", + " ('mean', 0.015907137878947274),\n", + " ('year', 0.015900265748875844),\n", + " ('named', 0.015897377296493403),\n", + " ('actors', 0.015880849255718699),\n", + " ('dreck', 0.015844184837849263),\n", + " ('ripped', 0.015809352391222227),\n", + " ('exception', 0.015801037653546943),\n", + " ('let', 0.015747554995806858),\n", + " ('said', 0.015739206756809128),\n", + " ('handed', 0.015729421480492771),\n", + " ('five', 0.015692627471399438),\n", + " ('manage', 0.015647108880417118),\n", + " ('thousands', 0.015643430975892967),\n", + " ('faith', 0.015616976955551864),\n", + " ('hideous', 0.015589158171890801),\n", + " ('alas', 0.015538213296394241),\n", + " ('interesting', 0.015537431607034399),\n", + " ('camera', 0.01553421777185927),\n", + " ('affair', 0.0154993718203294),\n", + " ('basketball', 0.015498025904813827),\n", + " ('saved', 0.015479619606949033),\n", + " ('allow', 0.01547129065797),\n", + 
" ('embarrassed', 0.01546569091101236),\n", + " ('historically', 0.015405093934372963),\n", + " ('guy', 0.01537764125447004),\n", + " ('smoking', 0.015346508854378344),\n", + " ('implausible', 0.01534045398602275),\n", + " ('entirely', 0.01533469278818364),\n", + " ('insulting', 0.015328508644691492),\n", + " ('unable', 0.015321433538157139),\n", + " ('supposedly', 0.015316107621242397),\n", + " ('replaced', 0.015263381265213493),\n", + " ('write', 0.015247349730647834),\n", + " ('devoid', 0.01519618192038018),\n", + " ('angry', 0.015128878425101411),\n", + " ('cannot', 0.015124671278970766),\n", + " ('stinker', 0.015117424017513681),\n", + " ('types', 0.015097306608066994),\n", + " ('hype', 0.015076288365524311),\n", + " ('responsible', 0.014991356276561583),\n", + " ('peter', 0.014969127137333012),\n", + " ('putting', 0.014910707254937244),\n", + " ('over', 0.014897181020826423),\n", + " ('cardboard', 0.014888714204149049),\n", + " ('interspersed', 0.014883165331874143),\n", + " ('haired', 0.014880449676198546),\n", + " ('spend', 0.01487609431622766),\n", + " ('elvis', 0.01485470984415174),\n", + " ('indulgent', 0.014847232132387197),\n", + " ('catholic', 0.014843519648135949),\n", + " ('downhill', 0.014807184967767797),\n", + " ('lazy', 0.01478151469522973),\n", + " ('aged', 0.014773315829198606),\n", + " ('exist', 0.014753607788843255),\n", + " ('torture', 0.014733998799388373),\n", + " ('prove', 0.014729418674653008),\n", + " ('tolerable', 0.014680880104255794),\n", + " ('four', 0.014654547592632506),\n", + " ('acceptable', 0.014651730694965842),\n", + " ('chick', 0.014641428398798827),\n", + " ('unimaginative', 0.014629366067627067),\n", + " ('whiny', 0.014626751487134576),\n", + " ('artsy', 0.014597921349167277),\n", + " ('decide', 0.014596087755808965),\n", + " ('unpleasant', 0.014539257963097196),\n", + " ('rotten', 0.014526987482368661),\n", + " ('racist', 0.014521318292204636),\n", + " ('air', 0.014513999400043521),\n", + " ('flimsy', 
0.014510298364381131),\n", + " ('baldwin', 0.014458793249711601),\n", + " ('merely', 0.014423588430956459),\n", + " ('wood', 0.01440518212855918),\n", + " ('thinking', 0.014365675477621536),\n", + " ('earth', 0.01435295387020083),\n", + " ('kidding', 0.014337420788166336),\n", + " ('unintentionally', 0.014336443850996722),\n", + " ('vampires', 0.014325905430975226),\n", + " ('generic', 0.014319871170399814),\n", + " ('defense', 0.014290336242912224),\n", + " ('saif', 0.014289573796132719),\n", + " ('asleep', 0.014289012435576958),\n", + " ('execution', 0.01428396200827341),\n", + " ('figure', 0.014283770855230148),\n", + " ('lackluster', 0.014273058981901444),\n", + " ('hoped', 0.014264724762345842),\n", + " ('nonsense', 0.014261341497203126),\n", + " ('horrid', 0.01425321660445842),\n", + " ('god', 0.01423736354744793),\n", + " ('l', 0.01418729677374257),\n", + " ('caricatures', 0.014181564208326641),\n", + " ('starts', 0.014153430344591595),\n", + " ('dry', 0.014133935534427947),\n", + " ('display', 0.014128179969827091),\n", + " ('button', 0.014116471162614745),\n", + " ('bore', 0.014116389381443268),\n", + " ('empty', 0.014096772700681904),\n", + " ('harold', 0.01405213089664656),\n", + " ('incomprehensible', 0.014009428713655188),\n", + " ('annie', 0.014008405850952511),\n", + " ('thrown', 0.014007462594894682),\n", + " ('incredibly', 0.014005185007294354),\n", + " ('renting', 0.01392668760863046),\n", + " ('connect', 0.013922471736926735),\n", + " ('younger', 0.013921148395141743),\n", + " ('author', 0.013908729139553388),\n", + " ('mistakes', 0.013902060662024712),\n", + " ('vague', 0.013900188409028451),\n", + " ('susan', 0.013899718009237958),\n", + " ('obvious', 0.013862928310275266),\n", + " ('public', 0.013848261281553172),\n", + " ('porn', 0.013842110384054581),\n", + " ('trash', 0.013803990572178484),\n", + " ('stevens', 0.013796967244647431),\n", + " ('sequels', 0.013782463861472683),\n", + " ('hurt', 0.013769543921240131),\n", + " ('desert', 
0.013763619124969734),\n", + " ('did', 0.013737639449728181),\n", + " ('behave', 0.013719767167839486),\n", + " ('served', 0.01371483823922371),\n", + " ('claims', 0.013706886269650505),\n", + " ('ultimately', 0.01369764359110015),\n", + " ('wide', 0.013685211021307753),\n", + " ('wow', 0.013679184770624804),\n", + " ('worthless', 0.013670533296298285),\n", + " ('dear', 0.01365359137960015),\n", + " ('plodding', 0.013622845840855251),\n", + " ('mike', 0.013594086031988709),\n", + " ('favor', 0.013578310381078488),\n", + " ('call', 0.013577646631327921),\n", + " ('biggest', 0.013529947586389569),\n", + " ('worthy', 0.013524754842185308),\n", + " ('meaning', 0.013517997531900548),\n", + " ('scientific', 0.01351539665384285),\n", + " ('hanks', 0.013467213376215899),\n", + " ('ads', 0.013463653421760929),\n", + " ('gay', 0.013414840808688235),\n", + " ('embarrassingly', 0.013401336286973733),\n", + " ('literary', 0.013389208999321035),\n", + " ('playing', 0.013329954634726381),\n", + " ('bo', 0.013312890564682506),\n", + " ('manipulative', 0.013287016941406323),\n", + " ('dressed', 0.013285092423656568),\n", + " ('embarrassment', 0.013269530319198216),\n", + " ('regarding', 0.013233250211631659),\n", + " ('stilted', 0.013215539220141915),\n", + " ('sleeve', 0.013215085161586723),\n", + " ('rating', 0.013203442200940888),\n", + " ('kills', 0.013183919467358734),\n", + " ('sounds', 0.013178727878711712),\n", + " ('ali', 0.013173031266866366),\n", + " ('non', 0.013162603751805228),\n", + " ('pie', 0.013161492629253844),\n", + " ('populated', 0.013152746747459266),\n", + " ('killing', 0.013111860853151807),\n", + " ('else', 0.013110592541316683),\n", + " ('schneider', 0.013093514941690403),\n", + " ('priest', 0.013071537555948209),\n", + " ('hollow', 0.013068001463175459),\n", + " ('shower', 0.013029604174841079),\n", + " ('ruins', 0.013021597567104507),\n", + " ('mental', 0.013019696244479805),\n", + " ('this', 0.01300977816966453),\n", + " ('pregnant', 
0.012997074834619551),\n", + " ('make', 0.01299285191649867),\n", + " ('timberlake', 0.012979689860020446),\n", + " ('saves', 0.012915795355367859),\n", + " ('vastly', 0.012914828969565756),\n", + " ('swear', 0.01290105947549007),\n", + " ('stella', 0.012883911119651205),\n", + " ('grave', 0.01288255504027714),\n", + " ('thats', 0.012861061812910335),\n", + " ('drinking', 0.012860129471019707),\n", + " ('boom', 0.012851779594694185),\n", + " ('introduction', 0.012831129197335454),\n", + " ('programming', 0.012796219757750256),\n", + " ('career', 0.012773059501084117),\n", + " ('stereotype', 0.012769447626661472),\n", + " ('attractive', 0.012765873120010159),\n", + " ('victims', 0.012749299245502169),\n", + " ('pass', 0.012735021821089279),\n", + " ('experiment', 0.012716112941788907),\n", + " ('retarded', 0.012713099529852412),\n", + " ('stuck', 0.012709332698253251),\n", + " ('akshay', 0.01268427306987787),\n", + " ('cut', 0.012676285239015485),\n", + " ('shoddy', 0.012674792040888047),\n", + " ('damme', 0.01266653641765667),\n", + " ('inaccurate', 0.012653687577536547),\n", + " ('ray', 0.01264981802351017),\n", + " ('woman', 0.012646521945546323),\n", + " ('research', 0.012640494662864557),\n", + " ('mile', 0.012627245693716727),\n", + " ('place', 0.012624645831509396),\n", + " ('demon', 0.012621688470792604),\n", + " ('vulgar', 0.012612150302693324),\n", + " ('engage', 0.012602272831074858),\n", + " ('wives', 0.012601890190118301),\n", + " ('mention', 0.012581598480006471),\n", + " ('if', 0.012569631262234704),\n", + " ('cartoon', 0.012561864177985764),\n", + " ('unbelievably', 0.012550391668315846),\n", + " ('only', 0.012517107727859128),\n", + " ('ended', 0.012507282716729776),\n", + " ('stereotypical', 0.012506426536204346),\n", + " ('spent', 0.012503032775055239),\n", + " ('thing', 0.012483110991541426),\n", + " ('phone', 0.012464039991489134),\n", + " ('stock', 0.012446742147556615),\n", + " ('drop', 0.012432978683590463),\n", + " ('self', 
0.012432059211520803),\n", + " ('headache', 0.012424495134195475),\n", + " ('escapes', 0.01241921129824892),\n", + " ('conceived', 0.012392639977060704),\n", + " ('required', 0.012392260947042827),\n", + " ('assassin', 0.012332404091910096),\n", + " ('meat', 0.012327751187890425),\n", + " ('therefore', 0.012316138729629601),\n", + " ('struggling', 0.012308628353572298),\n", + " ('ho', 0.012307714936265705),\n", + " ('ta', 0.012299409649320241),\n", + " ('cold', 0.012289510775209258),\n", + " ('expects', 0.012271684887263188),\n", + " ('furthermore', 0.012263298696316198),\n", + " ('remote', 0.012254529263879217),\n", + " ('cgi', 0.012250569964074181),\n", + " ('arab', 0.012230232115225252),\n", + " ('feminist', 0.012220004405980534),\n", + " ('hair', 0.012213792907949595),\n", + " ('intelligence', 0.012203964889416771),\n", + " ('destroy', 0.012190213907023965),\n", + " ('cameo', 0.012186034087855131),\n", + " ('claus', 0.012181510618531243),\n", + " ('awake', 0.012171290237450144),\n", + " ('sums', 0.012139945909251909),\n", + " ('auto', 0.012126012687040619),\n", + " ('cue', 0.012120943623008957),\n", + " ('speak', 0.012117784815618097),\n", + " ('stereotypes', 0.012106976159466581),\n", + " ('footage', 0.012103658001584283),\n", + " ('maker', 0.01209336953927035),\n", + " ('rental', 0.012083052888147337),\n", + " ('proper', 0.012063210621690412),\n", + " ('mercifully', 0.012047936344961967),\n", + " ('gimmick', 0.012041001769926642),\n", + " ('coherent', 0.012027899920693617),\n", + " ('inane', 0.011993175877578827),\n", + " ('relies', 0.011992345660343812),\n", + " ('nomination', 0.011982252573531246),\n", + " ('segal', 0.011947340234058405),\n", + " ('christians', 0.011946398905489899),\n", + " ('overrated', 0.011926101166626013),\n", + " ('don', 0.011924357980777277),\n", + " ('severely', 0.011916168552237318),\n", + " ('phony', 0.011913822393121727),\n", + " ('selfish', 0.011900529017180243),\n", + " ('resume', 0.011897346320859058),\n", + " ('another', 
0.01187768443136164),\n", + " ('sean', 0.011876040214137608),\n", + " ('hepburn', 0.011869243078008905),\n", + " ('secondly', 0.01186310933445027),\n", + " ('ups', 0.011859394818287428),\n", + " ('planet', 0.011852030247443603),\n", + " ('changed', 0.01184533561188748),\n", + " ('amused', 0.011842962845878567),\n", + " ('lowest', 0.011831634819501915),\n", + " ('fools', 0.011824116232842369),\n", + " ('spelling', 0.011821902194872624),\n", + " ('repressed', 0.011821527286346348),\n", + " ('unlikeable', 0.01181876011058648),\n", + " ('failure', 0.011816519901709052),\n", + " ('line', 0.011796438571873891),\n", + " ('hyped', 0.011784666544684304),\n", + " ('anti', 0.011764086315539161),\n", + " ('acting', 0.011752348314205383),\n", + " ('promise', 0.011749711660046624),\n", + " ('observe', 0.011739608959278626),\n", + " ('mindless', 0.011729368774426891),\n", + " ('lacked', 0.011718485221863709),\n", + " ('rather', 0.011704535222487891),\n", + " ('ed', 0.011700096242496997),\n", + " ('significant', 0.01169617650193994),\n", + " ('talks', 0.011678101476086883),\n", + " ('arty', 0.011674972481678897),\n", + " ('spit', 0.011671408526135128),\n", + " ('ilk', 0.011661568455359029),\n", + " ('unoriginal', 0.011651107245840887),\n", + " ('forward', 0.011646719533106094),\n", + " ('toilet', 0.011635522207639078),\n", + " ('suppose', 0.011633258510072186),\n", + " ('feed', 0.01161744751742516),\n", + " ('surrounded', 0.011607897169523127),\n", + " ('wanted', 0.011604506869089724),\n", + " ('tashan', 0.011596205445299108),\n", + " ('dr', 0.01154394928133564),\n", + " ('scare', 0.011543316667712905),\n", + " ('murderer', 0.011535350571639676),\n", + " ('explained', 0.011466329649783205),\n", + " ('cheated', 0.011455846970137712),\n", + " ('whats', 0.01145144357723085),\n", + " ('romance', 0.011445558616225329),\n", + " ('jewish', 0.01144156416364368),\n", + " ('sexual', 0.011438682797255701),\n", + " ('books', 0.01141981177753516),\n", + " ('throwing', 
0.011404165894740239),\n", + " ('nose', 0.011395583651720624),\n", + " ('parking', 0.011390688400833907),\n", + " ('pick', 0.011357671445382181),\n", + " ('chose', 0.011354353327826118),\n", + " ('improve', 0.011350584813053918),\n", + " ('kapoor', 0.011340767814074903),\n", + " ('costs', 0.011325900726890981),\n", + " ('saying', 0.011325617629551313),\n", + " ('early', 0.011320525734188087),\n", + " ('technically', 0.011317672837061938),\n", + " ('hackman', 0.011288294849240651),\n", + " ('birthday', 0.011282785404027751),\n", + " ('cinematography', 0.011263572785831684),\n", + " ('hurts', 0.011250154303091528),\n", + " ('saturday', 0.011247837147971233),\n", + " ('meaningless', 0.011239510238506719),\n", + " ('mannered', 0.011239044207972256),\n", + " ('screaming', 0.01123862031022237),\n", + " ('should', 0.011236648355832369),\n", + " ('crazed', 0.011236418275421324),\n", + " ('dignity', 0.011236150963786546),\n", + " ('mate', 0.0112167000098445),\n", + " ('letters', 0.011208675517174478),\n", + " ('recycled', 0.011206236378205576),\n", + " ('promptly', 0.011202237607822145),\n", + " ('inexplicably', 0.01116132181154625),\n", + " ('or', 0.011152965343305343),\n", + " ('simply', 0.011146233896835922),\n", + " ('too', 0.011130044921930288),\n", + " ('nerd', 0.011122543127721436),\n", + " ('chris', 0.011116119389820144),\n", + " ('proceedings', 0.011111786695547108),\n", + " ('lived', 0.011100598930695569),\n", + " ('code', 0.01109542524270142),\n", + " ('potentially', 0.011093285835678523),\n", + " ('open', 0.011075631889800954),\n", + " ('faster', 0.011074177906888309),\n", + " ('moore', 0.011070458274337773),\n", + " ('bowl', 0.011060417562531431),\n", + " ('absolutely', 0.011044130796846871),\n", + " ('just', 0.011033356854991554),\n", + " ('suspension', 0.011031781173072127),\n", + " ('enemy', 0.011025820754518639),\n", + " ('conclusion', 0.010986051066943338),\n", + " ('hospital', 0.010977494845678686),\n", + " ('romances', 0.010962761722118311),\n", + " 
('spoke', 0.010962116403553662),\n", + " ('hardly', 0.010960545391113456),\n", + " ('olds', 0.010951344004097441),\n", + " ('creek', 0.010950023924322864),\n", + " ('shouting', 0.01094372750254274),\n", + " ('originality', 0.010912963822714918),\n", + " ('bollywood', 0.010911409137577785),\n", + " ('cape', 0.01090232612951828),\n", + " ('teeth', 0.01090050204600262),\n", + " ('backdrop', 0.010885688008708722),\n", + " ('turn', 0.010880478059425644),\n", + " ('mason', 0.010866951716170654),\n", + " ('grace', 0.010848406257382322),\n", + " ('valley', 0.010845180425875844),\n", + " ('depressing', 0.010827818086738505),\n", + " ('superficial', 0.01082640323755853),\n", + " ('invested', 0.010812488716640862),\n", + " ('bomb', 0.010811727591767118),\n", + " ('embarrass', 0.010778451069403564),\n", + " ('sided', 0.010773707983617679),\n", + " ('sticking', 0.010762292435547709),\n", + " ('common', 0.010754536408451008),\n", + " ('boat', 0.010750196487059143),\n", + " ('promised', 0.010746025901289747),\n", + " ('wayans', 0.010744338945929416),\n", + " ('sheer', 0.010734103279474522),\n", + " ('wrestling', 0.010724515540975418),\n", + " ('staff', 0.010715523520497053),\n", + " ('apollo', 0.010711377643774767),\n", + " ('leigh', 0.010702080598678557),\n", + " ('virtually', 0.010691942663824007),\n", + " ('seagal', 0.010677324100672111),\n", + " ('comes', 0.0106748997197255),\n", + " ('edition', 0.010673353805904191),\n", + " ('predictably', 0.010666551243955741),\n", + " ('stuff', 0.010664915811483258),\n", + " ('gang', 0.010664441184213124),\n", + " ('cancer', 0.010643225900463574),\n", + " ('obviously', 0.010641670080654522),\n", + " ('would', 0.010623530922231164),\n", + " ('totally', 0.010616092995147883),\n", + " ('profile', 0.010596003501785214),\n", + " ('spacey', 0.010595967407784398),\n", + " ('ability', 0.01058459252136016),\n", + " ('horrendous', 0.010580213328532085),\n", + " ('blood', 0.010579520401095313),\n", + " ('imitation', 0.010568550630572958),\n", + " 
('bikini', 0.010568043371931093),\n", + " ('talented', 0.010566001035979433),\n", + " ('basis', 0.010564729746933205),\n", + " ('dialogs', 0.010551191397294005),\n", + " ('showing', 0.010548613564454221),\n", + " ('door', 0.010544563357219762),\n", + " ('portray', 0.01052779962849062),\n", + " ('strictly', 0.010526959295132305),\n", + " ('mexican', 0.01050873151782232),\n", + " ('stick', 0.010465961443388669),\n", + " ('east', 0.010455324716016765),\n", + " ('anywhere', 0.010431532734666283),\n", + " ('remake', 0.01041986919495284),\n", + " ('am', 0.010410414209203916),\n", + " ('attempting', 0.010386393998627376),\n", + " ('disturbing', 0.010381152608581442),\n", + " ('jude', 0.010377136500506754),\n", + " ('wondering', 0.010363512690012198),\n", + " ('celebrated', 0.01036011176907586),\n", + " ('use', 0.010350554074714637),\n", + " ('wreck', 0.010344734410393921),\n", + " ('appear', 0.010344438351539177),\n", + " ('entitled', 0.010335246001593065),\n", + " ('youth', 0.010323214445994815),\n", + " ('letdown', 0.01031855344625868),\n", + " ('moran', 0.010305507693633359),\n", + " ('mediocrity', 0.010302827140695369),\n", + " ('news', 0.010292874788426091),\n", + " ('bits', 0.010276065293631171),\n", + " ('alone', 0.010268492053981953),\n", + " ('accents', 0.010263852094534689),\n", + " ('inhabited', 0.010244117693024815),\n", + " ('mock', 0.010244061360675905),\n", + " ('g', 0.010223458175403785),\n", + " ('box', 0.010203304329265734),\n", + " ('term', 0.010199983044386091),\n", + " ('behavior', 0.010198776124373237),\n", + " ('tedium', 0.010190092201507218),\n", + " ('intent', 0.010190038120698582),\n", + " ('husband', 0.01018950226595784),\n", + " ('presence', 0.01018719233607417),\n", + " ('z', 0.010184318583214757),\n", + " ('unappealing', 0.010146391189444364),\n", + " ('much', 0.010136790117697133),\n", + " ('tree', 0.010113534581593916),\n", + " ('doctors', 0.010099854380484191),\n", + " ('pi', 0.010095099419111339),\n", + " ('rodney', 
0.010090819798082389),\n", + " ('franchise', 0.010089650929674206),\n", + " ('piece', 0.010086011549585341),\n", + " ('company', 0.01008353958260106),\n", + " ('choppy', 0.010079223420593732),\n", + " ('turned', 0.010069855547990123),\n", + " ('test', 0.010041505355613897),\n", + " ('ball', 0.010040944323609524),\n", + " ('hated', 0.010035509058945862),\n", + " ('bear', 0.01003427246505746),\n", + " ('serves', 0.010027495172169224),\n", + " ('leonard', 0.010022751390164689),\n", + " ('deserved', 0.010022334081283371),\n", + " ('part', 0.010016360436147431),\n", + " ('opportunity', 0.010013126012646686),\n", + " ('turning', 0.010011850960865775),\n", + " ('overacting', 0.010008994714980207),\n", + " ('refer', 0.010006488920574083),\n", + " ('flies', 0.010006418749637626),\n", + " ('uninvolving', 0.0099991338976208165),\n", + " ('produce', 0.0099962014038013722),\n", + " ('jumpy', 0.0099947855808415129),\n", + " ('die', 0.0099914129058671017),\n", + " ('root', 0.0099747135001128327),\n", + " ('insomnia', 0.0099744642555285069),\n", + " ('blatant', 0.0099596620005663813),\n", + " ('larry', 0.0099556905367902439),\n", + " ('threw', 0.0099473965388449589),\n", + " ('billed', 0.0099285818753670832),\n", + " ('bullets', 0.0099281758971005909),\n", + " ('intellectually', 0.009908138827878615),\n", + " ('rip', 0.009901323399604086),\n", + " ('stretching', 0.0099012969699172632),\n", + " ('protest', 0.0098984552675623581),\n", + " ('soldiers', 0.0098936923822449258),\n", + " ('flick', 0.009887063364977652),\n", + " ('justin', 0.009862246602717558),\n", + " ('highlights', 0.0098589088020586291),\n", + " ('move', 0.0098539899809540372),\n", + " ('merit', 0.0098431205949966738),\n", + " ('russian', 0.009841171721984102),\n", + " ('security', 0.0098373450338831089),\n", + " ('idiotic', 0.0098341234288144581),\n", + " ('produced', 0.0098294307574258062),\n", + " ('king', 0.009826687234317566),\n", + " ('magically', 0.009822884247682559),\n", + " ('united', 
0.0098070847890707642),\n", + " ('missile', 0.0097990578193348551),\n", + " ('unlikable', 0.0097869158986480815),\n", + " ('ignorant', 0.0097732743173460923),\n", + " ('amateur', 0.0097674059870561138),\n", + " ('bachelor', 0.0097673429455405695),\n", + " ('asylum', 0.0097627338519779908),\n", + " ('screw', 0.0097568098573927193),\n", + " ('report', 0.0097479232699172434),\n", + " ('dracula', 0.0097467323393205588),\n", + " ('removed', 0.0097416519499422087),\n", + " ('confess', 0.0097162925211573253),\n", + " ('brand', 0.0097152534660907616),\n", + " ('conspiracy', 0.0097116972290396987),\n", + " ('horribly', 0.009708378556425248),\n", + " ('switch', 0.009702684093379545),\n", + " ('jaws', 0.0096877455513713073),\n", + " ('unsuspecting', 0.0096853425035846423),\n", + " ('betty', 0.0096770352133324685),\n", + " ('forwarding', 0.0096711196893192741),\n", + " ('university', 0.0096636715878149638),\n", + " ('star', 0.0096623254931800309),\n", + " ('crawl', 0.0096464318968590562),\n", + " ('dopey', 0.0096460863315858663),\n", + " ('ruin', 0.0096230106385457228),\n", + " ('lifeless', 0.0096228807274879972),\n", + " ('flash', 0.0096193625359650009),\n", + " ('whoever', 0.0096174128915875422),\n", + " ('coincidence', 0.0096024599741402102),\n", + " ('choosing', 0.0095951100051069292),\n", + " ('avid', 0.0095900913284222636),\n", + " ('intended', 0.0095846987041676261),\n", + " ('remained', 0.0095839628178583831),\n", + " ('c', 0.0095732676681762417),\n", + " ('waiting', 0.0095562258694348833),\n", + " ('cassie', 0.0095481354442238063),\n", + " ('garage', 0.0095349544587830237),\n", + " ('clarke', 0.0095345445855698589),\n", + " ('fortune', 0.0095330396648302049),\n", + " ('interminable', 0.0095328159563552606),\n", + " ('incessant', 0.0095235485026846332),\n", + " ('plots', 0.0095225805490624683),\n", + " ('danger', 0.0095171205654692899),\n", + " ('costumes', 0.0094980144667524413),\n", + " ('evidently', 0.0094952158467012243),\n", + " ('minus', 
0.0094911495174661263),\n", + " ('reporters', 0.0094836811040990825),\n", + " ('israeli', 0.0094750077183364638),\n", + " ('failing', 0.0094711841313976849),\n", + " ('paying', 0.00946923440668513),\n", + " ('godzilla', 0.0094586915548437872),\n", + " ('dumber', 0.0094582903092924817),\n", + " ('earn', 0.009447622492842497),\n", + " ('slows', 0.0094467463872487598),\n", + " ('held', 0.0094452736817914641),\n", + " ('chase', 0.0094438362611946568),\n", + " ('lies', 0.0094383969845033347),\n", + " ('hands', 0.0094381781614589089),\n", + " ('grief', 0.009423849453410283),\n", + " ('brains', 0.009418215341663207),\n", + " ('tom', 0.0094130433384347137),\n", + " ('resurrected', 0.0094083423437290523),\n", + " ('asking', 0.0094021029403453284),\n", + " ('sleeps', 0.0094017951882658275),\n", + " ('porno', 0.0093907201413965108),\n", + " ('somehow', 0.0093889261270860523),\n", + " ('sarcasm', 0.0093886064393904137),\n", + " ('tie', 0.0093856009366311641),\n", + " ('fall', 0.0093801640008931118),\n", + " ('bring', 0.0093791273545761507),\n", + " ('rape', 0.0093760851230746296),\n", + " ('village', 0.0093684513318614063),\n", + " ('kitchen', 0.0093649071460109555),\n", + " ('concerned', 0.0093611353238811264),\n", + " ('republic', 0.0093499426948764237),\n", + " ('hell', 0.0093400360705317119),\n", + " ('inducing', 0.0093382129792553541),\n", + " ('stomach', 0.0093378286385158524),\n", + " ('shambles', 0.0093335457329829716),\n", + " ('virgin', 0.0093312001339055962),\n", + " ('extraneous', 0.0093250413800351276),\n", + " ('cameras', 0.009322946026797712),\n", + " ('suffers', 0.0093204929924829982),\n", + " ('justified', 0.0093163217479363125),\n", + " ('plummer', 0.0092948273285103911),\n", + " ('ponderous', 0.0092880344237223321),\n", + " ('player', 0.0092802296345443694),\n", + " ('survivor', 0.0092767026472125765),\n", + " ('rainy', 0.0092697034218137443),\n", + " ('graces', 0.0092620944963291256),\n", + " ...]" + ] + }, + "execution_count": 134, + "metadata": {}, + 
"output_type": "execute_result" + } + ], + "source": [ + "get_most_similar_words(\"terrible\")" + ] + }, + { + "cell_type": "code", + "execution_count": 135, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import matplotlib.colors as colors\n", + "\n", + "words_to_visualize = list()\n", + "for word, ratio in pos_neg_ratios.most_common(500):\n", + " if(word in mlp_full.word2index.keys()):\n", + " words_to_visualize.append(word)\n", + " \n", + "for word, ratio in list(reversed(pos_neg_ratios.most_common()))[0:500]:\n", + " if(word in mlp_full.word2index.keys()):\n", + " words_to_visualize.append(word)" + ] + }, + { + "cell_type": "code", + "execution_count": 136, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "pos = 0\n", + "neg = 0\n", + "\n", + "colors_list = list()\n", + "vectors_list = list()\n", + "for word in words_to_visualize:\n", + " if word in pos_neg_ratios.keys():\n", + " vectors_list.append(mlp_full.weights_0_1[mlp_full.word2index[word]])\n", + " if(pos_neg_ratios[word] > 0):\n", + " pos+=1\n", + " colors_list.append(\"#00ff00\")\n", + " else:\n", + " neg+=1\n", + " colors_list.append(\"#000000\")\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 137, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "from sklearn.manifold import TSNE\n", + "tsne = TSNE(n_components=2, random_state=0)\n", + "words_top_ted_tsne = tsne.fit_transform(vectors_list)" + ] + }, + { + "cell_type": "code", + "execution_count": 139, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "
\n", + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "p = figure(tools=\"pan,wheel_zoom,reset,save\",\n", + " toolbar_location=\"above\",\n", + " title=\"vector T-SNE for most polarized words\")\n", + "\n", + "source = ColumnDataSource(data=dict(x1=words_top_ted_tsne[:,0],\n", + " x2=words_top_ted_tsne[:,1],\n", + " names=words_to_visualize))\n", + "\n", + "p.scatter(x=\"x1\", y=\"x2\", size=8, source=source,color=colors_list)\n", + "\n", + "word_labels = LabelSet(x=\"x1\", y=\"x2\", text=\"names\", y_offset=6,\n", + " text_font_size=\"8pt\", text_color=\"#555555\",\n", + " source=source, text_align='center')\n", + "p.add_layout(word_labels)\n", + "\n", + "show(p)\n", + "\n", + "# green indicates positive words, black indicates negative words" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 1).ipynb b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 1).ipynb new file mode 100644 index 0000000..da72ef0 --- /dev/null +++ b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 1).ipynb @@ -0,0 +1,2407 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sentiment Classification & How To \"Frame Problems\" for a Neural Network\n", + "\n", + "by 
Andrew Trask\n", + "\n", + "- **Twitter**: @iamtrask\n", + "- **Blog**: http://iamtrask.github.io" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### What You Should Already Know\n", + "\n", + "- neural networks, forward and back-propagation\n", + "- stochastic gradient descent\n", + "- mean squared error\n", + "- and train/test splits\n", + "\n", + "### Where to Get Help if You Need it\n", + "- Re-watch previous Udacity Lectures\n", + "- Leverage the recommended Course Reading Material - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) (40% Off: **traskud17**)\n", + "- Shoot me a tweet @iamtrask\n", + "\n", + "\n", + "### Tutorial Outline:\n", + "\n", + "- Intro: The Importance of \"Framing a Problem\"\n", + "\n", + "\n", + "- Curate a Dataset\n", + "- Developing a \"Predictive Theory\"\n", + "- **PROJECT 1**: Quick Theory Validation\n", + "\n", + "\n", + "- Transforming Text to Numbers\n", + "- **PROJECT 2**: Creating the Input/Output Data\n", + "\n", + "\n", + "- Putting it all together in a Neural Network\n", + "- **PROJECT 3**: Building our Neural Network\n", + "\n", + "\n", + "- Understanding Neural Noise\n", + "- **PROJECT 4**: Making Learning Faster by Reducing Noise\n", + "\n", + "\n", + "- Analyzing Inefficiencies in our Network\n", + "- **PROJECT 5**: Making our Network Train and Run Faster\n", + "\n", + "\n", + "- Further Noise Reduction\n", + "- **PROJECT 6**: Reducing Noise by Strategically Reducing the Vocabulary\n", + "\n", + "\n", + "- Analysis: What's going on in the weights?" 
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbpresent": { + "id": "56bb3cba-260c-4ebe-9ed6-b995b4c72aa3" + } + }, + "source": [ + "# Lesson: Curate a Dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "eba2b193-0419-431e-8db9-60f34dd3fe83" + } + }, + "outputs": [], + "source": [ + "def pretty_print_review_and_label(i):\n", + " print(labels[i] + \"\\t:\\t\" + reviews[i][:80] + \"...\")\n", + "\n", + "g = open('reviews.txt','r') # What we know!\n", + "reviews = list(map(lambda x:x[:-1],g.readlines()))\n", + "g.close()\n", + "\n", + "g = open('labels.txt','r') # What we WANT to know!\n", + "labels = list(map(lambda x:x[:-1].upper(),g.readlines()))\n", + "g.close()" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "25000" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "len(reviews)" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "bb95574b-21a0-4213-ae50-34363cf4f87f" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy . it ran at the same time as some other programs about school life such as teachers . my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers . the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students . when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled . . . . . . . . . at . . . . . . . . . . high . a classic line inspector i m here to sack one of your teachers . student welcome to bromwell high . 
i expect that many adults of my age think that bromwell high is far fetched . what a pity that it isn t '" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "reviews[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e0408810-c424-4ed4-afb9-1735e9ddbd0a" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Lesson: Develop a Predictive Theory" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e67a709f-234f-4493-bae6-4fb192141ee0" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "labels.txt \t : \t reviews.txt\n", + "\n", + "NEGATIVE\t:\tthis movie is terrible but it has some good effects . ...\n", + "POSITIVE\t:\tadrian pasdar is excellent is this film . he makes a fascinating woman . ...\n", + "NEGATIVE\t:\tcomment this movie is impossible . is terrible very improbable bad interpretat...\n", + "POSITIVE\t:\texcellent episode movie ala pulp fiction . days suicides . it doesnt get more...\n", + "NEGATIVE\t:\tif you haven t seen this it s terrible . it is pure trash . 
i saw this about ...\n", + "POSITIVE\t:\tthis schiffer guy is a real genius the movie is of excellent quality and both e...\n" + ] + } + ], + "source": [ + "print(\"labels.txt \\t : \\t reviews.txt\\n\")\n", + "pretty_print_review_and_label(2137)\n", + "pretty_print_review_and_label(12816)\n", + "pretty_print_review_and_label(6267)\n", + "pretty_print_review_and_label(21934)\n", + "pretty_print_review_and_label(5297)\n", + "pretty_print_review_and_label(4998)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 1: Quick Theory Validation" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from collections import Counter\n", + "import numpy as np" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "positive_counts = Counter()\n", + "negative_counts = Counter()\n", + "total_counts = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for i in range(len(reviews)):\n", + " if(labels[i] == 'POSITIVE'):\n", + " for word in reviews[i].split(\" \"):\n", + " positive_counts[word] += 1\n", + " total_counts[word] += 1\n", + " else:\n", + " for word in reviews[i].split(\" \"):\n", + " negative_counts[word] += 1\n", + " total_counts[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('', 550468),\n", + " ('the', 173324),\n", + " ('.', 159654),\n", + " ('and', 89722),\n", + " ('a', 83688),\n", + " ('of', 76855),\n", + " ('to', 66746),\n", + " ('is', 57245),\n", + " ('in', 50215),\n", + " ('br', 49235),\n", + " ('it', 48025),\n", + " ('i', 40743),\n", + " ('that', 35630),\n", + " ('this', 35080),\n", + " ('s', 33815),\n", + " ('as', 26308),\n", + " ('with', 23247),\n", + " 
('for', 22416),\n", + " ('was', 21917),\n", + " ('film', 20937),\n", + " ('but', 20822),\n", + " ('movie', 19074),\n", + " ('his', 17227),\n", + " ...several hundred entries omitted...\n", + " ('ok', 302),\n", + " ('admit', 
302),\n", + " ('younger', 302),\n", + " ('reviews', 302),\n", + " ('odd', 301),\n", + " ('fighting', 301),\n", + " ('master', 301),\n", + " ('break', 300),\n", + " ('thanks', 300),\n", + " ('recent', 300),\n", + " ('comment', 300),\n", + " ('apart', 299),\n", + " ('lovely', 298),\n", + " ('begin', 298),\n", + " ('emotions', 298),\n", + " ('doctor', 297),\n", + " ('italian', 297),\n", + " ('party', 297),\n", + " ('la', 296),\n", + " ('missed', 296),\n", + " ...]" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "positive_counts.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "# Ratio of positive to negative occurrences for each word\n", + "pos_neg_ratios = Counter()\n", + "\n", + "# Only consider words that appear more than 100 times in total;\n", + "# the +1 in the denominator avoids division by zero\n", + "for term, cnt in list(total_counts.most_common()):\n", + " if cnt > 100:\n", + " pos_neg_ratio = positive_counts[term] / float(negative_counts[term] + 1)\n", + " pos_neg_ratios[term] = pos_neg_ratio\n", + "\n", + "# Convert to log space so positive and negative words are symmetric around 0\n", + "for word, ratio in pos_neg_ratios.most_common():\n", + " if ratio > 1:\n", + " pos_neg_ratios[word] = np.log(ratio)\n", + " else:\n", + " pos_neg_ratios[word] = -np.log(1 / (ratio + 0.01))" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('edie', 4.6913478822291435),\n", + " ('paulie', 4.0775374439057197),\n", + " ('felix', 3.1527360223636558),\n", + " ('polanski', 2.8233610476132043),\n", + " ('matthau', 2.8067217286092401),\n", + " ('victoria', 2.6810215287142909),\n", + " ('mildred', 2.6026896854443837),\n", + " ('gandhi', 2.5389738710582761),\n", + " ('flawless', 2.451005098112319),\n", + " ('superbly', 2.2600254785752498),\n", + " ('perfection', 2.1594842493533721),\n", + " ('astaire', 2.1400661634962708),\n", + " ('captures', 2.0386195471595809),\n", + " ('voight', 2.0301704926730531),\n", + " ('wonderfully', 2.0218960560332353),\n", + " ('powell', 1.9783454248084671),\n",
+ " ('brosnan', 1.9547990964725592),\n", + " ('lily', 1.9203768470501485),\n", + " ('bakshi', 1.9029851043382795),\n", + " ('lincoln', 1.9014583864844796),\n", + " ('refreshing', 1.8551812956655511),\n", + " ...several hundred entries omitted...\n", + " ('cagney', 
0.7537718023763802),\n", + " ('enjoyable', 0.75246375771636476),\n", + " ('immensely', 0.75198768058287868),\n", + " ('sir', 0.7507762933965817),\n", + " ('nevertheless', 0.75067102469813185),\n", + " ('driven', 0.74994477895307854),\n", + " ('performances', 0.74883252516063137),\n", + " ('memories', 0.74721440183022114),\n", + " ('nowadays', 0.74721440183022114),\n", + " ('simple', 0.74641420974143258),\n", + " ('golden', 0.74533293373051557),\n", + " ('leslie', 0.74533293373051557),\n", + " ('lovers', 0.74497224842453125),\n", + " ('relationship', 0.74484232345601786),\n", + " ('supporting', 0.74357803418683721),\n", + " ('che', 0.74262723782331497),\n", + " ('packed', 0.7410032017375805),\n", + " ('trek', 0.74021469141793106),\n", + " ('provoking', 0.73840377214806618),\n", + " ('strikes', 0.73759894313077912),\n", + " ('depiction', 0.73682224406260699),\n", + " ('emotional', 0.73678211645681524),\n", + " ('secretary', 0.7366322924996842),\n", + " ('influenced', 0.73511137965897755),\n", + " ('florida', 0.73511137965897755),\n", + " ('germany', 0.73288750920945944),\n", + " ('brings', 0.73142936713096229),\n", + " ('lewis', 0.73129894652432159),\n", + " ('elderly', 0.73088750854279239),\n", + " ('owner', 0.72743625403857748),\n", + " ('streets', 0.72666987259858895),\n", + " ('henry', 0.72642196944481741),\n", + " ('portrays', 0.72593700338293632),\n", + " ('bears', 0.7252354951114458),\n", + " ('china', 0.72489587887452556),\n", + " ('anger', 0.72439972406404984),\n", + " ('society', 0.72433010799663333),\n", + " ('available', 0.72415741730250549),\n", + " ('best', 0.72347034060446314),\n", + " ('bugs', 0.72270598280148979),\n", + " ('magic', 0.71878961117328299),\n", + " ('verhoeven', 0.71846498854423513),\n", + " ('delivers', 0.71846498854423513),\n", + " ('jim', 0.71783979315031676),\n", + " ('donald', 0.71667767797013937),\n", + " ('endearing', 0.71465338578090898),\n", + " ('relationships', 0.71393795022901896),\n", + " ('greatly', 
0.71256526641704687),\n", + " ('charlie', 0.71024161391924534),\n", + " ('brad', 0.71024161391924534),\n", + " ('simon', 0.70967648251115578),\n", + " ('effectively', 0.70914752190638641),\n", + " ('march', 0.70774597998109789),\n", + " ('atmosphere', 0.70744773070214162),\n", + " ('influence', 0.70733181555190172),\n", + " ('genius', 0.706392407309966),\n", + " ('emotionally', 0.70556970055850243),\n", + " ('ken', 0.70526854109229009),\n", + " ('identity', 0.70484322032313651),\n", + " ('sophisticated', 0.70470800296102132),\n", + " ('dan', 0.70457587638356811),\n", + " ('andrew', 0.70329955202396321),\n", + " ('india', 0.70144598337464037),\n", + " ('roy', 0.69970458110610434),\n", + " ('surprisingly', 0.6995780708902356),\n", + " ('sky', 0.69780919366575667),\n", + " ('romantic', 0.69664981111114743),\n", + " ('match', 0.69566924999265523),\n", + " ('britain', 0.69314718055994529),\n", + " ('beatty', 0.69314718055994529),\n", + " ('affected', 0.69314718055994529),\n", + " ('cowboy', 0.69314718055994529),\n", + " ('wave', 0.69314718055994529),\n", + " ('stylish', 0.69314718055994529),\n", + " ('bitter', 0.69314718055994529),\n", + " ('patient', 0.69314718055994529),\n", + " ('meets', 0.69314718055994529),\n", + " ('love', 0.69198533541937324),\n", + " ('paul', 0.68980827929443067),\n", + " ('andy', 0.68846333124751902),\n", + " ('performance', 0.68797386327972465),\n", + " ('patrick', 0.68645819240914863),\n", + " ('unlike', 0.68546468438792907),\n", + " ('brooks', 0.68433655087779044),\n", + " ('refuses', 0.68348526964820844),\n", + " ('award', 0.6824518914431974),\n", + " ('complaint', 0.6824518914431974),\n", + " ('ride', 0.68229716453587952),\n", + " ('dawson', 0.68171848473632257),\n", + " ('luke', 0.68158635815886937),\n", + " ('wells', 0.68087708796813096),\n", + " ('france', 0.6804081547825156),\n", + " ('handsome', 0.68007509899259255),\n", + " ('sports', 0.68007509899259255),\n", + " ('rebel', 0.67875844310784572),\n", + " ('directs', 
0.67875844310784572),\n", + " ('greater', 0.67605274720064523),\n", + " ('dreams', 0.67599410133369586),\n", + " ('effective', 0.67565402311242806),\n", + " ('interpretation', 0.67479804189174875),\n", + " ('works', 0.67445504754779284),\n", + " ('brando', 0.67445504754779284),\n", + " ('noble', 0.6737290947028437),\n", + " ('paced', 0.67314651385327573),\n", + " ('le', 0.67067432470788668),\n", + " ('master', 0.67015766233524654),\n", + " ('h', 0.6696166831497512),\n", + " ('rings', 0.66904962898088483),\n", + " ('easy', 0.66895995494594152),\n", + " ('city', 0.66820823221269321),\n", + " ('sunshine', 0.66782937257565544),\n", + " ('succeeds', 0.66647893347778397),\n", + " ('relations', 0.664159643686693),\n", + " ('england', 0.66387679825983203),\n", + " ('glimpse', 0.66329421741026418),\n", + " ('aired', 0.66268797307523675),\n", + " ('sees', 0.66263163663399482),\n", + " ('both', 0.66248336767382998),\n", + " ('definitely', 0.66199789483898808),\n", + " ('imaginative', 0.66139848224536502),\n", + " ('appreciate', 0.66083893732728749),\n", + " ('tricks', 0.66071190480679143),\n", + " ('striking', 0.66071190480679143),\n", + " ('carefully', 0.65999497324304479),\n", + " ('complicated', 0.65981076029235353),\n", + " ('perspective', 0.65962448852130173),\n", + " ('trilogy', 0.65877953705573755),\n", + " ('future', 0.65834665141052828),\n", + " ('lion', 0.65742909795786608),\n", + " ('victor', 0.65540685257709819),\n", + " ('douglas', 0.65540685257709819),\n", + " ('inspired', 0.65459851044271034),\n", + " ('marriage', 0.65392646740666405),\n", + " ('demands', 0.65392646740666405),\n", + " ('father', 0.65172321672194655),\n", + " ('page', 0.65123628494430852),\n", + " ('instant', 0.65058756614114943),\n", + " ('era', 0.6495567444850836),\n", + " ('ruthless', 0.64934455790155243),\n", + " ('saga', 0.64934455790155243),\n", + " ('joan', 0.64891392558311978),\n", + " ('joseph', 0.64841128671855386),\n", + " ('workers', 0.64829661439459352),\n", + " ('fantasy', 
0.64726757480925168),\n", + " ('accomplished', 0.64551913157069074),\n", + " ('distant', 0.64551913157069074),\n", + " ('manhattan', 0.64435701639051324),\n", + " ('personal', 0.64355023942057321),\n", + " ('pushing', 0.64313675998528386),\n", + " ('meeting', 0.64313675998528386),\n", + " ('individual', 0.64313675998528386),\n", + " ('pleasant', 0.64250344774119039),\n", + " ('brave', 0.64185388617239469),\n", + " ('william', 0.64083139119578469),\n", + " ('hudson', 0.64077919504262937),\n", + " ('friendly', 0.63949446706762514),\n", + " ('eccentric', 0.63907995928966954),\n", + " ('awards', 0.63875310849414646),\n", + " ('jack', 0.63838309514997038),\n", + " ('seeking', 0.63808740337691783),\n", + " ('colonel', 0.63757732940513456),\n", + " ('divorce', 0.63757732940513456),\n", + " ('jane', 0.63443957973316734),\n", + " ('keeping', 0.63414883979798953),\n", + " ('gives', 0.63383568159497883),\n", + " ('ted', 0.63342794585832296),\n", + " ('animation', 0.63208692379869902),\n", + " ('progress', 0.6317782341836532),\n", + " ('concert', 0.63127177684185776),\n", + " ('larger', 0.63127177684185776),\n", + " ('nation', 0.6296337748376194),\n", + " ('albeit', 0.62739580299716491),\n", + " ('adapted', 0.62613647027698516),\n", + " ('discovers', 0.62542900650499444),\n", + " ('classic', 0.62504956428050518),\n", + " ('segment', 0.62335141862440335),\n", + " ('morgan', 0.62303761437291871),\n", + " ('mouse', 0.62294292188669675),\n", + " ('impressive', 0.62211140744319349),\n", + " ('artist', 0.62168821657780038),\n", + " ('ultimate', 0.62168821657780038),\n", + " ('griffith', 0.62117368093485603),\n", + " ('emily', 0.62082651898031915),\n", + " ('drew', 0.62082651898031915),\n", + " ('moved', 0.6197197120051281),\n", + " ('profound', 0.61903920840622351),\n", + " ('families', 0.61903920840622351),\n", + " ('innocent', 0.61851219917136446),\n", + " ('versions', 0.61730910416844087),\n", + " ('eddie', 0.61691981517206107),\n", + " ('criticism', 0.61651395453902935),\n", + " 
('nature', 0.61594514653194088),\n", + " ('recognized', 0.61518563909023349),\n", + " ('sexuality', 0.61467556511845012),\n", + " ('contract', 0.61400986000122149),\n", + " ('brian', 0.61344043794920278),\n", + " ('remembered', 0.6131044728864089),\n", + " ('determined', 0.6123858239154869),\n", + " ('offers', 0.61207935747116349),\n", + " ('pleasure', 0.61195702582993206),\n", + " ('washington', 0.61180154110599294),\n", + " ('images', 0.61159731359583758),\n", + " ('games', 0.61067095873570676),\n", + " ('academy', 0.60872983874736208),\n", + " ('fashioned', 0.60798937221963845),\n", + " ('melodrama', 0.60749173598145145),\n", + " ('peoples', 0.60613580357031549),\n", + " ('charismatic', 0.60613580357031549),\n", + " ('rough', 0.60613580357031549),\n", + " ('dealing', 0.60517840761398811),\n", + " ('fine', 0.60496962268013299),\n", + " ('tap', 0.60391604683200273),\n", + " ('trio', 0.60157998703445481),\n", + " ('russell', 0.60120968523425966),\n", + " ('figures', 0.60077386042893011),\n", + " ('ward', 0.60005675749393339),\n", + " ('shine', 0.59911823091166894),\n", + " ('brady', 0.59911823091166894),\n", + " ('job', 0.59845562125168661),\n", + " ('satisfied', 0.59652034487087369),\n", + " ('river', 0.59637962862495086),\n", + " ('brown', 0.595773016534769),\n", + " ('believable', 0.59566072133302495),\n", + " ('bound', 0.59470710774669278),\n", + " ('always', 0.59470710774669278),\n", + " ('hall', 0.5933967777928858),\n", + " ('cook', 0.5916777203950857),\n", + " ('claire', 0.59136448625000293),\n", + " ('broadway', 0.59033768669372433),\n", + " ('anna', 0.58778666490211906),\n", + " ('peace', 0.58628403501758408),\n", + " ('visually', 0.58539431926349916),\n", + " ('falk', 0.58525821854876026),\n", + " ('morality', 0.58525821854876026),\n", + " ('growing', 0.58466653756587539),\n", + " ('experiences', 0.58314628534561685),\n", + " ('stood', 0.58314628534561685),\n", + " ('touch', 0.58122926435596001),\n", + " ('lives', 0.5810976767513224),\n", + " ('kubrick', 
0.58066919713325493),\n", + " ('timing', 0.58047401805583243),\n", + " ('struggles', 0.57981849525294216),\n", + " ('expressions', 0.57981849525294216),\n", + " ('authentic', 0.57848427223980559),\n", + " ('helen', 0.57763429343810091),\n", + " ('pre', 0.57700753064729182),\n", + " ('quirky', 0.5753641449035618),\n", + " ('young', 0.57531672344534313),\n", + " ('inner', 0.57454143815209846),\n", + " ('mexico', 0.57443087372056334),\n", + " ('clint', 0.57380042292737909),\n", + " ('sisters', 0.57286101468544337),\n", + " ('realism', 0.57226528899949558),\n", + " ('personalities', 0.5720692490067093),\n", + " ('french', 0.5720692490067093),\n", + " ('surprises', 0.57113222999698177),\n", + " ('adventures', 0.57113222999698177),\n", + " ('overcome', 0.5697681593994407),\n", + " ('timothy', 0.56953322459276867),\n", + " ('tales', 0.56909453188996639),\n", + " ('war', 0.56843317302781682),\n", + " ('civil', 0.5679840376059393),\n", + " ('countries', 0.56737779327091187),\n", + " ('streep', 0.56710645966458029),\n", + " ('tradition', 0.56685345523565323),\n", + " ('oliver', 0.56673325570428668),\n", + " ('australia', 0.56580775818334383),\n", + " ('understanding', 0.56531380905006046),\n", + " ('players', 0.56509525370004821),\n", + " ('knowing', 0.56489284503626647),\n", + " ('rogers', 0.56421349718405212),\n", + " ('suspenseful', 0.56368911332305849),\n", + " ('variety', 0.56368911332305849),\n", + " ('true', 0.56281525180810066),\n", + " ('jr', 0.56220982311246936),\n", + " ('psychological', 0.56108745854687891),\n", + " ('branagh', 0.55961578793542266),\n", + " ('wealth', 0.55961578793542266),\n", + " ('performing', 0.55961578793542266),\n", + " ('odds', 0.55961578793542266),\n", + " ('sent', 0.55961578793542266),\n", + " ('reminiscent', 0.55961578793542266),\n", + " ('grand', 0.55961578793542266),\n", + " ('overwhelming', 0.55961578793542266),\n", + " ('brothers', 0.55891181043362848),\n", + " ('howard', 0.55811089675600245),\n", + " ('david', 
0.55693122256475369),\n", + " ('generation', 0.55628799784274796),\n", + " ('grow', 0.55612538299565417),\n", + " ('survival', 0.55594605904646033),\n", + " ('mainstream', 0.55574731115750231),\n", + " ('dick', 0.55431073570572953),\n", + " ('charm', 0.55288175575407861),\n", + " ('kirk', 0.55278982286502287),\n", + " ('twists', 0.55244729845681018),\n", + " ('gangster', 0.55206858230003986),\n", + " ('jeff', 0.55179306225421365),\n", + " ('family', 0.55116244510065526),\n", + " ('tend', 0.55053307336110335),\n", + " ('thanks', 0.55049088015842218),\n", + " ('world', 0.54744234723432639),\n", + " ('sutherland', 0.54743536937855164),\n", + " ('life', 0.54695514434959924),\n", + " ('disc', 0.54654370636806993),\n", + " ('bug', 0.54654370636806993),\n", + " ('tribute', 0.5455111817538808),\n", + " ('europe', 0.54522705048332309),\n", + " ('sacrifice', 0.54430155296238014),\n", + " ('color', 0.54405127139431109),\n", + " ('superior', 0.54333490233128523),\n", + " ('york', 0.54318235866536513),\n", + " ('pulls', 0.54266622962164945),\n", + " ('hearts', 0.54232429082536171),\n", + " ('jackson', 0.54232429082536171),\n", + " ('enjoy', 0.54124285135906114),\n", + " ('redemption', 0.54056759296472823),\n", + " ('madness', 0.540384426007535),\n", + " ('hamilton', 0.5389965007326869),\n", + " ('stands', 0.5389965007326869),\n", + " ('trial', 0.5389965007326869),\n", + " ('greek', 0.5389965007326869),\n", + " ('each', 0.5388212312554177),\n", + " ('faithful', 0.53773307668591508),\n", + " ('received', 0.5372768098531604),\n", + " ('jealous', 0.53714293208336406),\n", + " ('documentaries', 0.53714293208336406),\n", + " ('different', 0.53709860682460819),\n", + " ('describes', 0.53680111016925136),\n", + " ('shorts', 0.53596159703753288),\n", + " ('brilliance', 0.53551823635636209),\n", + " ('mountains', 0.53492317534505118),\n", + " ('share', 0.53408248593025787),\n", + " ('dealt', 0.53408248593025787),\n", + " ('providing', 0.53329847961804933),\n", + " ('explore', 
0.53329847961804933),\n", + " ('series', 0.5325809226575603),\n", + " ('fellow', 0.5323318289869543),\n", + " ('loves', 0.53062825106217038),\n", + " ('olivier', 0.53062825106217038),\n", + " ('revolution', 0.53062825106217038),\n", + " ('roman', 0.53062825106217038),\n", + " ('century', 0.53002783074992665),\n", + " ('musical', 0.52966871156747064),\n", + " ('heroic', 0.52925932545482868),\n", + " ('ironically', 0.52806743020049673),\n", + " ('approach', 0.52806743020049673),\n", + " ('temple', 0.52806743020049673),\n", + " ('moves', 0.5279372642387119),\n", + " ('gift', 0.52702030968597136),\n", + " ('julie', 0.52609309589677911),\n", + " ('tells', 0.52415107836314001),\n", + " ('radio', 0.52394671172868779),\n", + " ('uncle', 0.52354439617376536),\n", + " ('union', 0.52324814376454787),\n", + " ('deep', 0.52309571635780505),\n", + " ('reminds', 0.52157841554225237),\n", + " ('famous', 0.52118841080153722),\n", + " ('jazz', 0.52053443789295151),\n", + " ('dennis', 0.51987545928590861),\n", + " ('epic', 0.51919387343650736),\n", + " ('adult', 0.519167695083386),\n", + " ('shows', 0.51915322220375304),\n", + " ('performed', 0.5191244265806858),\n", + " ('demons', 0.5191244265806858),\n", + " ('eric', 0.51879379341516751),\n", + " ('discovered', 0.51879379341516751),\n", + " ('youth', 0.5185626062681431),\n", + " ('human', 0.51851411224987087),\n", + " ('tarzan', 0.51813827061227724),\n", + " ('ourselves', 0.51794309153485463),\n", + " ('wwii', 0.51758240622887042),\n", + " ('passion', 0.5162164724008671),\n", + " ('desire', 0.51607497965213445),\n", + " ('pays', 0.51581316527702981),\n", + " ('fox', 0.51557622652458857),\n", + " ('dirty', 0.51557622652458857),\n", + " ('symbolism', 0.51546600332249293),\n", + " ('sympathetic', 0.51546600332249293),\n", + " ('attitude', 0.51530993621331933),\n", + " ('appearances', 0.51466440007315639),\n", + " ('jeremy', 0.51466440007315639),\n", + " ('fun', 0.51439068993048687),\n", + " ('south', 0.51420972175023116),\n", + " 
('arrives', 0.51409894911095988),\n", + " ('present', 0.51341965894303732),\n", + " ('com', 0.51326167856387173),\n", + " ('smile', 0.51265880484765169),\n", + " ('fits', 0.51082562376599072),\n", + " ('provided', 0.51082562376599072),\n", + " ('carter', 0.51082562376599072),\n", + " ('ring', 0.51082562376599072),\n", + " ('aging', 0.51082562376599072),\n", + " ('countryside', 0.51082562376599072),\n", + " ('alan', 0.51082562376599072),\n", + " ('visit', 0.51082562376599072),\n", + " ('begins', 0.51015650363396647),\n", + " ('success', 0.50900578704900468),\n", + " ('japan', 0.50900578704900468),\n", + " ('accurate', 0.50895471583017893),\n", + " ('proud', 0.50800474742434931),\n", + " ('daily', 0.5075946031845443),\n", + " ('atmospheric', 0.50724780241810674),\n", + " ('karloff', 0.50724780241810674),\n", + " ('recently', 0.50714914903668207),\n", + " ('fu', 0.50704490092608467),\n", + " ('horrors', 0.50656122497953315),\n", + " ('finding', 0.50637127341661037),\n", + " ('lust', 0.5059356384717989),\n", + " ('hitchcock', 0.50574947073413001),\n", + " ('among', 0.50334004951332734),\n", + " ('viewing', 0.50302139827440906),\n", + " ('shining', 0.50262885656181222),\n", + " ('investigation', 0.50262885656181222),\n", + " ('duo', 0.5020919437972361),\n", + " ('cameron', 0.5020919437972361),\n", + " ('finds', 0.50128303100539795),\n", + " ('contemporary', 0.50077528791248915),\n", + " ('genuine', 0.50046283673044401),\n", + " ('frightening', 0.49995595152908684),\n", + " ('plays', 0.49975983848890226),\n", + " ('age', 0.49941323171424595),\n", + " ('position', 0.49899116611898781),\n", + " ('continues', 0.49863035067217237),\n", + " ('roles', 0.49839716550752178),\n", + " ('james', 0.49837216269470402),\n", + " ('individuals', 0.49824684155913052),\n", + " ('brought', 0.49783842823917956),\n", + " ('hilarious', 0.49714551986191058),\n", + " ('brutal', 0.49681488669639234),\n", + " ('appropriate', 0.49643688631389105),\n", + " ('dance', 0.49581998314812048),\n", + " 
('league', 0.49578774640145024),\n", + " ('helping', 0.49578774640145024),\n", + " ('answers', 0.49578774640145024),\n", + " ('stunts', 0.49561620510246196),\n", + " ('traveling', 0.49532143723002542),\n", + " ('thoroughly', 0.49414593456733524),\n", + " ('depicted', 0.49317068852726992),\n", + " ('honor', 0.49247648509779424),\n", + " ('combination', 0.49247648509779424),\n", + " ('differences', 0.49247648509779424),\n", + " ('fully', 0.49213349075383811),\n", + " ('tracy', 0.49159426183810306),\n", + " ('battles', 0.49140753790888908),\n", + " ('possibility', 0.49112055268665822),\n", + " ('romance', 0.4901589869574316),\n", + " ('initially', 0.49002249613622745),\n", + " ('happy', 0.4898997500608791),\n", + " ('crime', 0.48977221456815834),\n", + " ('singing', 0.4893852925281213),\n", + " ('especially', 0.48901267837860624),\n", + " ('shakespeare', 0.48754793889664511),\n", + " ('hugh', 0.48729512635579658),\n", + " ('detail', 0.48609484250827351),\n", + " ('guide', 0.48550781578170082),\n", + " ('companion', 0.48550781578170082),\n", + " ('julia', 0.48550781578170082),\n", + " ('san', 0.48550781578170082),\n", + " ('desperation', 0.48550781578170082),\n", + " ('strongly', 0.48460242866688824),\n", + " ('necessary', 0.48302334245403883),\n", + " ('humanity', 0.48265474679929443),\n", + " ('drama', 0.48221998493060503),\n", + " ('warming', 0.48183808689273838),\n", + " ('intrigue', 0.48183808689273838),\n", + " ('nonetheless', 0.48183808689273838),\n", + " ('cuba', 0.48183808689273838),\n", + " ('planned', 0.47957308026188628),\n", + " ('pictures', 0.47929937011921681),\n", + " ('broadcast', 0.47849024312305422),\n", + " ('nine', 0.47803580094299974),\n", + " ('settings', 0.47743860773325364),\n", + " ('history', 0.47732966933780852),\n", + " ('ordinary', 0.47725880012690741),\n", + " ('trade', 0.47692407209030935),\n", + " ('primary', 0.47608267532211779),\n", + " ('official', 0.47608267532211779),\n", + " ('episode', 0.47529620261150429),\n", + " ('role', 
0.47520268270188676),\n", + " ('spirit', 0.47477690799839323),\n", + " ('grey', 0.47409361449726067),\n", + " ('ways', 0.47323464982718205),\n", + " ('cup', 0.47260441094579297),\n", + " ('piano', 0.47260441094579297),\n", + " ('familiar', 0.47241617565111949),\n", + " ('sinister', 0.47198579044972683),\n", + " ('reveal', 0.47171449364936496),\n", + " ('max', 0.47150852042515579),\n", + " ('dated', 0.47121648567094482),\n", + " ('discovery', 0.47000362924573563),\n", + " ('vicious', 0.47000362924573563),\n", + " ('losing', 0.47000362924573563),\n", + " ('genuinely', 0.46871413841586385),\n", + " ('hatred', 0.46734051182625186),\n", + " ('mistaken', 0.46702300110759781),\n", + " ('dream', 0.46608972992459924),\n", + " ('challenge', 0.46608972992459924),\n", + " ('crisis', 0.46575733836428446),\n", + " ('photographed', 0.46488852857896512),\n", + " ('machines', 0.46430560813109778),\n", + " ('critics', 0.46430560813109778),\n", + " ('bird', 0.46430560813109778),\n", + " ('born', 0.46411383518967209),\n", + " ('detective', 0.4636633473511525),\n", + " ('higher', 0.46328467899699055),\n", + " ('remains', 0.46262352194811296),\n", + " ('inevitable', 0.46262352194811296),\n", + " ('soviet', 0.4618180446592961),\n", + " ('ryan', 0.46134556650262099),\n", + " ('african', 0.46112595521371813),\n", + " ('smaller', 0.46081520319132935),\n", + " ('techniques', 0.46052488529119184),\n", + " ('information', 0.46034171833399862),\n", + " ('deserved', 0.45999798712841444),\n", + " ('cynical', 0.45953232937844013),\n", + " ('lynch', 0.45953232937844013),\n", + " ('francisco', 0.45953232937844013),\n", + " ('tour', 0.45953232937844013),\n", + " ('spielberg', 0.45953232937844013),\n", + " ('struggle', 0.45911782160048453),\n", + " ('language', 0.45902121257712653),\n", + " ('visual', 0.45823514408822852),\n", + " ('warner', 0.45724137763188427),\n", + " ('social', 0.45720078250735313),\n", + " ('reality', 0.45719346885019546),\n", + " ('hidden', 0.45675840249571492),\n", + " 
('breaking', 0.45601738727099561),\n", + " ('sometimes', 0.45563021171182794),\n", + " ('modern', 0.45500247579345005),\n", + " ('surfing', 0.45425527227759638),\n", + " ('popular', 0.45410691533051023),\n", + " ('surprised', 0.4534409399850382),\n", + " ('follows', 0.45245361754408348),\n", + " ('keeps', 0.45234869400701483),\n", + " ('john', 0.4520909494482197),\n", + " ('defeat', 0.45198512374305722),\n", + " ('mixed', 0.45198512374305722),\n", + " ('justice', 0.45142724367280018),\n", + " ('treasure', 0.45083371313801535),\n", + " ('presents', 0.44973793178615257),\n", + " ('years', 0.44919197032104968),\n", + " ('chief', 0.44895022004790319),\n", + " ('shadows', 0.44802472252696035),\n", + " ('closely', 0.44701411102103689),\n", + " ('segments', 0.44701411102103689),\n", + " ('lose', 0.44658335503763702),\n", + " ('caine', 0.44628710262841953),\n", + " ('caught', 0.44610275383999071),\n", + " ('hamlet', 0.44558510189758965),\n", + " ('chinese', 0.44507424620321018),\n", + " ('welcome', 0.44438052435783792),\n", + " ('birth', 0.44368632092836219),\n", + " ('represents', 0.44320543609101143),\n", + " ('puts', 0.44279106572085081),\n", + " ('fame', 0.44183275227903923),\n", + " ('closer', 0.44183275227903923),\n", + " ('visuals', 0.44183275227903923),\n", + " ('web', 0.44183275227903923),\n", + " ('criminal', 0.4412745608048752),\n", + " ('minor', 0.4409224199448939),\n", + " ('jon', 0.44086703515908027),\n", + " ('liked', 0.44074991514020723),\n", + " ('restaurant', 0.44031183943833246),\n", + " ('flaws', 0.43983275161237217),\n", + " ('de', 0.43983275161237217),\n", + " ('searching', 0.4393666597838457),\n", + " ('rap', 0.43891304217570443),\n", + " ('light', 0.43884433018199892),\n", + " ('elizabeth', 0.43872232986464677),\n", + " ('marry', 0.43861731542506488),\n", + " ('oz', 0.43825493093115531),\n", + " ('controversial', 0.43825493093115531),\n", + " ('learned', 0.43825493093115531),\n", + " ('slowly', 0.43785660389939979),\n", + " ('bridge', 
0.43721380642274466),\n", + " ('thrilling', 0.43721380642274466),\n", + " ('wayne', 0.43721380642274466),\n", + " ('comedic', 0.43721380642274466),\n", + " ('married', 0.43658501682196887),\n", + " ('nazi', 0.4361020775700542),\n", + " ('murder', 0.4353180712578455),\n", + " ('physical', 0.4353180712578455),\n", + " ('johnny', 0.43483971678806865),\n", + " ('michelle', 0.43445264498141672),\n", + " ('wallace', 0.43403848055222038),\n", + " ('silent', 0.43395706390247063),\n", + " ('comedies', 0.43395706390247063),\n", + " ('played', 0.43387244114515305),\n", + " ('international', 0.43363598507486073),\n", + " ('vision', 0.43286408229627887),\n", + " ('intelligent', 0.43196704885367099),\n", + " ('shop', 0.43078291609245434),\n", + " ('also', 0.43036720209769169),\n", + " ('levels', 0.4302451371066513),\n", + " ('miss', 0.43006426712153217),\n", + " ('ocean', 0.4295626596872249),\n", + " ...]" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"POSITIVE\" label\n", + "pos_neg_ratios.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('boll', -4.0778152602708904),\n", + " ('uwe', -3.9218753018711578),\n", + " ('seagal', -3.3202501058581921),\n", + " ('unwatchable', -3.0269848170580955),\n", + " ('stinker', -2.9876839403711624),\n", + " ('mst', -2.7753833211707968),\n", + " ('incoherent', -2.7641396677532537),\n", + " ('unfunny', -2.5545257844967644),\n", + " ('waste', -2.4907515123361046),\n", + " ('blah', -2.4475792789485005),\n", + " ('horrid', -2.3715779644809971),\n", + " ('pointless', -2.3451073877136341),\n", + " ('atrocious', -2.3187369339642556),\n", + " ('redeeming', -2.2667790015910296),\n", + " ('prom', -2.2601040980178784),\n", + " ('drivel', -2.2476029585766928),\n", + " ('lousy', -2.2118080125207054),\n", + 
" ('worst', -2.1930856334332267),\n", + " ('laughable', -2.172468615469592),\n", + " ('awful', -2.1385076866397488),\n", + " ('poorly', -2.1326133844207011),\n", + " ('wasting', -2.1178155545614512),\n", + " ('remotely', -2.111046881095167),\n", + " ('existent', -2.0024805005437076),\n", + " ('boredom', -1.9241486572738005),\n", + " ('miserably', -1.9216610938019989),\n", + " ('sucks', -1.9166645809588516),\n", + " ('uninspired', -1.9131499212248517),\n", + " ('lame', -1.9117232884159072),\n", + " ('insult', -1.9085323769376259)]" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"NEGATIVE\" label\n", + "list(reversed(pos_neg_ratios.most_common()))[0:30]" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 2).ipynb b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 2).ipynb new file mode 100644 index 0000000..2748b1e --- /dev/null +++ b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 2).ipynb @@ -0,0 +1,4738 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sentiment Classification & How To \"Frame Problems\" for a Neural Network\n", + "\n", + "by Andrew Trask\n", + "\n", + "- **Twitter**: @iamtrask\n", + "- **Blog**: http://iamtrask.github.io" + ] + }, + { + "cell_type": "markdown", + "metadata": 
{}, + "source": [ + "### What You Should Already Know\n", + "\n", + "- neural networks, forward and back-propagation\n", + "- stochastic gradient descent\n", + "- mean squared error\n", + "- train/test splits\n", + "\n", + "### Where to Get Help if You Need It\n", + "- Re-watch previous Udacity Lectures\n", + "- Leverage the recommended Course Reading Material - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) (40% Off: **traskud17**)\n", + "- Shoot me a tweet @iamtrask\n", + "\n", + "\n", + "### Tutorial Outline:\n", + "\n", + "- Intro: The Importance of \"Framing a Problem\"\n", + "\n", + "\n", + "- Curate a Dataset\n", + "- Developing a \"Predictive Theory\"\n", + "- **PROJECT 1**: Quick Theory Validation\n", + "\n", + "\n", + "- Transforming Text to Numbers\n", + "- **PROJECT 2**: Creating the Input/Output Data\n", + "\n", + "\n", + "- Putting it all together in a Neural Network\n", + "- **PROJECT 3**: Building our Neural Network\n", + "\n", + "\n", + "- Understanding Neural Noise\n", + "- **PROJECT 4**: Making Learning Faster by Reducing Noise\n", + "\n", + "\n", + "- Analyzing Inefficiencies in our Network\n", + "- **PROJECT 5**: Making our Network Train and Run Faster\n", + "\n", + "\n", + "- Further Noise Reduction\n", + "- **PROJECT 6**: Reducing Noise by Strategically Reducing the Vocabulary\n", + "\n", + "\n", + "- Analysis: What's going on in the weights?"
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbpresent": { + "id": "56bb3cba-260c-4ebe-9ed6-b995b4c72aa3" + } + }, + "source": [ + "# Lesson: Curate a Dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "eba2b193-0419-431e-8db9-60f34dd3fe83" + } + }, + "outputs": [], + "source": [ + "def pretty_print_review_and_label(i):\n", + " print(labels[i] + \"\\t:\\t\" + reviews[i][:80] + \"...\")\n", + "\n", + "g = open('reviews.txt','r') # What we know!\n", + "reviews = list(map(lambda x:x[:-1],g.readlines()))\n", + "g.close()\n", + "\n", + "g = open('labels.txt','r') # What we WANT to know!\n", + "labels = list(map(lambda x:x[:-1].upper(),g.readlines()))\n", + "g.close()" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "25000" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "len(reviews)" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "bb95574b-21a0-4213-ae50-34363cf4f87f" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy . it ran at the same time as some other programs about school life such as teachers . my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers . the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students . when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled . . . . . . . . . at . . . . . . . . . . high . a classic line inspector i m here to sack one of your teachers . student welcome to bromwell high . 
i expect that many adults of my age think that bromwell high is far fetched . what a pity that it isn t '" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "reviews[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e0408810-c424-4ed4-afb9-1735e9ddbd0a" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Lesson: Develop a Predictive Theory" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e67a709f-234f-4493-bae6-4fb192141ee0" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "labels.txt \t : \t reviews.txt\n", + "\n", + "NEGATIVE\t:\tthis movie is terrible but it has some good effects . ...\n", + "POSITIVE\t:\tadrian pasdar is excellent is this film . he makes a fascinating woman . ...\n", + "NEGATIVE\t:\tcomment this movie is impossible . is terrible very improbable bad interpretat...\n", + "POSITIVE\t:\texcellent episode movie ala pulp fiction . days suicides . it doesnt get more...\n", + "NEGATIVE\t:\tif you haven t seen this it s terrible . it is pure trash . 
i saw this about ...\n", + "POSITIVE\t:\tthis schiffer guy is a real genius the movie is of excellent quality and both e...\n" + ] + } + ], + "source": [ + "print(\"labels.txt \\t : \\t reviews.txt\\n\")\n", + "pretty_print_review_and_label(2137)\n", + "pretty_print_review_and_label(12816)\n", + "pretty_print_review_and_label(6267)\n", + "pretty_print_review_and_label(21934)\n", + "pretty_print_review_and_label(5297)\n", + "pretty_print_review_and_label(4998)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 1: Quick Theory Validation" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from collections import Counter\n", + "import numpy as np" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "positive_counts = Counter()\n", + "negative_counts = Counter()\n", + "total_counts = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for i in range(len(reviews)):\n", + " if(labels[i] == 'POSITIVE'):\n", + " for word in reviews[i].split(\" \"):\n", + " positive_counts[word] += 1\n", + " total_counts[word] += 1\n", + " else:\n", + " for word in reviews[i].split(\" \"):\n", + " negative_counts[word] += 1\n", + " total_counts[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('', 550468),\n", + " ('the', 173324),\n", + " ('.', 159654),\n", + " ('and', 89722),\n", + " ('a', 83688),\n", + " ('of', 76855),\n", + " ('to', 66746),\n", + " ('is', 57245),\n", + " ('in', 50215),\n", + " ('br', 49235),\n", + " ('it', 48025),\n", + " ('i', 40743),\n", + " ('that', 35630),\n", + " ('this', 35080),\n", + " ('s', 33815),\n", + " ('as', 26308),\n", + " ('with', 23247),\n", + " 
('for', 22416),\n", + " ('was', 21917),\n", + " ('film', 20937),\n", + " ('but', 20822),\n", + " ('movie', 19074),\n", + " ('his', 17227),\n", + " ('on', 17008),\n", + " ('you', 16681),\n", + " ('he', 16282),\n", + " ('are', 14807),\n", + " ('not', 14272),\n", + " ('t', 13720),\n", + " ('one', 13655),\n", + " ('have', 12587),\n", + " ('be', 12416),\n", + " ('by', 11997),\n", + " ('all', 11942),\n", + " ('who', 11464),\n", + " ('an', 11294),\n", + " ('at', 11234),\n", + " ('from', 10767),\n", + " ('her', 10474),\n", + " ('they', 9895),\n", + " ('has', 9186),\n", + " ('so', 9154),\n", + " ('like', 9038),\n", + " ('about', 8313),\n", + " ('very', 8305),\n", + " ('out', 8134),\n", + " ('there', 8057),\n", + " ('she', 7779),\n", + " ('what', 7737),\n", + " ('or', 7732),\n", + " ('good', 7720),\n", + " ('more', 7521),\n", + " ('when', 7456),\n", + " ('some', 7441),\n", + " ('if', 7285),\n", + " ('just', 7152),\n", + " ('can', 7001),\n", + " ('story', 6780),\n", + " ('time', 6515),\n", + " ('my', 6488),\n", + " ('great', 6419),\n", + " ('well', 6405),\n", + " ('up', 6321),\n", + " ('which', 6267),\n", + " ('their', 6107),\n", + " ('see', 6026),\n", + " ('also', 5550),\n", + " ('we', 5531),\n", + " ('really', 5476),\n", + " ('would', 5400),\n", + " ('will', 5218),\n", + " ('me', 5167),\n", + " ('had', 5148),\n", + " ('only', 5137),\n", + " ('him', 5018),\n", + " ('even', 4964),\n", + " ('most', 4864),\n", + " ('other', 4858),\n", + " ('were', 4782),\n", + " ('first', 4755),\n", + " ('than', 4736),\n", + " ('much', 4685),\n", + " ('its', 4622),\n", + " ('no', 4574),\n", + " ('into', 4544),\n", + " ('people', 4479),\n", + " ('best', 4319),\n", + " ('love', 4301),\n", + " ('get', 4272),\n", + " ('how', 4213),\n", + " ('life', 4199),\n", + " ('been', 4189),\n", + " ('because', 4079),\n", + " ('way', 4036),\n", + " ('do', 3941),\n", + " ('made', 3823),\n", + " ('films', 3813),\n", + " ('them', 3805),\n", + " ('after', 3800),\n", + " ('many', 3766),\n", + " ('two', 3733),\n", + 
" ('too', 3659),\n", + " ('think', 3655),\n", + " ('movies', 3586),\n", + " ('characters', 3560),\n", + " ('character', 3514),\n", + " ('don', 3468),\n", + " ('man', 3460),\n", + " ('show', 3432),\n", + " ('watch', 3424),\n", + " ('seen', 3414),\n", + " ('then', 3358),\n", + " ('little', 3341),\n", + " ('still', 3340),\n", + " ('make', 3303),\n", + " ('could', 3237),\n", + " ('never', 3226),\n", + " ('being', 3217),\n", + " ('where', 3173),\n", + " ('does', 3069),\n", + " ('over', 3017),\n", + " ('any', 3002),\n", + " ('while', 2899),\n", + " ('know', 2833),\n", + " ('did', 2790),\n", + " ('years', 2758),\n", + " ('here', 2740),\n", + " ('ever', 2734),\n", + " ('end', 2696),\n", + " ('these', 2694),\n", + " ('such', 2590),\n", + " ('real', 2568),\n", + " ('scene', 2567),\n", + " ('back', 2547),\n", + " ('those', 2485),\n", + " ('though', 2475),\n", + " ('off', 2463),\n", + " ('new', 2458),\n", + " ('your', 2453),\n", + " ('go', 2440),\n", + " ('acting', 2437),\n", + " ('plot', 2432),\n", + " ('world', 2429),\n", + " ('scenes', 2427),\n", + " ('say', 2414),\n", + " ('through', 2409),\n", + " ('makes', 2390),\n", + " ('better', 2381),\n", + " ('now', 2368),\n", + " ('work', 2346),\n", + " ('young', 2343),\n", + " ('old', 2311),\n", + " ('ve', 2307),\n", + " ('find', 2272),\n", + " ('both', 2248),\n", + " ('before', 2177),\n", + " ('us', 2162),\n", + " ('again', 2158),\n", + " ('series', 2153),\n", + " ('quite', 2143),\n", + " ('something', 2135),\n", + " ('cast', 2133),\n", + " ('should', 2121),\n", + " ('part', 2098),\n", + " ('always', 2088),\n", + " ('lot', 2087),\n", + " ('another', 2075),\n", + " ('actors', 2047),\n", + " ('director', 2040),\n", + " ('family', 2032),\n", + " ('between', 2016),\n", + " ('own', 2016),\n", + " ('m', 1998),\n", + " ('may', 1997),\n", + " ('same', 1972),\n", + " ('role', 1967),\n", + " ('watching', 1966),\n", + " ('every', 1954),\n", + " ('funny', 1953),\n", + " ('doesn', 1935),\n", + " ('performance', 1928),\n", + " ('few', 
1918),\n", + " ('bad', 1907),\n", + " ('look', 1900),\n", + " ('re', 1884),\n", + " ('why', 1855),\n", + " ('things', 1849),\n", + " ('times', 1832),\n", + " ('big', 1815),\n", + " ('however', 1795),\n", + " ('actually', 1790),\n", + " ('action', 1789),\n", + " ('going', 1783),\n", + " ('bit', 1757),\n", + " ('comedy', 1742),\n", + " ('down', 1740),\n", + " ('music', 1738),\n", + " ('must', 1728),\n", + " ('take', 1709),\n", + " ('saw', 1692),\n", + " ('long', 1690),\n", + " ('right', 1688),\n", + " ('fun', 1686),\n", + " ('fact', 1684),\n", + " ('excellent', 1683),\n", + " ('around', 1674),\n", + " ('didn', 1672),\n", + " ('without', 1671),\n", + " ('thing', 1662),\n", + " ('thought', 1639),\n", + " ('got', 1635),\n", + " ('each', 1630),\n", + " ('day', 1614),\n", + " ('feel', 1597),\n", + " ('seems', 1596),\n", + " ('come', 1594),\n", + " ('done', 1586),\n", + " ('beautiful', 1580),\n", + " ('especially', 1572),\n", + " ('played', 1571),\n", + " ('almost', 1566),\n", + " ('want', 1562),\n", + " ('yet', 1556),\n", + " ('give', 1553),\n", + " ('pretty', 1549),\n", + " ('last', 1543),\n", + " ('since', 1519),\n", + " ('different', 1504),\n", + " ('although', 1501),\n", + " ('gets', 1490),\n", + " ('true', 1487),\n", + " ('interesting', 1481),\n", + " ('job', 1470),\n", + " ('enough', 1455),\n", + " ('our', 1454),\n", + " ('shows', 1447),\n", + " ('horror', 1441),\n", + " ('woman', 1439),\n", + " ('tv', 1400),\n", + " ('probably', 1398),\n", + " ('father', 1395),\n", + " ('original', 1393),\n", + " ('girl', 1390),\n", + " ('point', 1379),\n", + " ('plays', 1378),\n", + " ('wonderful', 1372),\n", + " ('far', 1358),\n", + " ('course', 1358),\n", + " ('john', 1350),\n", + " ('rather', 1340),\n", + " ('isn', 1328),\n", + " ('ll', 1326),\n", + " ('later', 1324),\n", + " ('dvd', 1324),\n", + " ('war', 1310),\n", + " ('whole', 1310),\n", + " ('d', 1307),\n", + " ('away', 1306),\n", + " ('found', 1306),\n", + " ('screen', 1305),\n", + " ('nothing', 1300),\n", + " ('year', 
1297),\n", + " ('once', 1296),\n", + " ('hard', 1294),\n", + " ('together', 1280),\n", + " ('am', 1277),\n", + " ('set', 1277),\n", + " ('having', 1266),\n", + " ('making', 1265),\n", + " ('place', 1263),\n", + " ('comes', 1260),\n", + " ('might', 1260),\n", + " ('sure', 1253),\n", + " ('american', 1248),\n", + " ('play', 1245),\n", + " ('kind', 1244),\n", + " ('takes', 1242),\n", + " ('perfect', 1242),\n", + " ('performances', 1237),\n", + " ('himself', 1230),\n", + " ('worth', 1221),\n", + " ('everyone', 1221),\n", + " ('anyone', 1214),\n", + " ('actor', 1203),\n", + " ('three', 1201),\n", + " ('wife', 1196),\n", + " ('classic', 1192),\n", + " ('goes', 1186),\n", + " ('ending', 1178),\n", + " ('version', 1168),\n", + " ('star', 1149),\n", + " ('enjoy', 1146),\n", + " ('book', 1142),\n", + " ('nice', 1132),\n", + " ('everything', 1128),\n", + " ('during', 1124),\n", + " ('put', 1118),\n", + " ('seeing', 1111),\n", + " ('least', 1102),\n", + " ('house', 1100),\n", + " ('high', 1095),\n", + " ('watched', 1094),\n", + " ('men', 1087),\n", + " ('loved', 1087),\n", + " ('night', 1082),\n", + " ('anything', 1075),\n", + " ('guy', 1071),\n", + " ('believe', 1071),\n", + " ('top', 1063),\n", + " ('amazing', 1058),\n", + " ('hollywood', 1056),\n", + " ('looking', 1053),\n", + " ('main', 1044),\n", + " ('definitely', 1043),\n", + " ('gives', 1031),\n", + " ('home', 1029),\n", + " ('seem', 1028),\n", + " ('episode', 1023),\n", + " ('sense', 1020),\n", + " ('audience', 1020),\n", + " ('truly', 1017),\n", + " ('special', 1011),\n", + " ('fan', 1009),\n", + " ('second', 1009),\n", + " ('short', 1009),\n", + " ('mind', 1005),\n", + " ('human', 1001),\n", + " ('recommend', 999),\n", + " ('full', 996),\n", + " ('black', 995),\n", + " ('help', 991),\n", + " ('along', 989),\n", + " ('trying', 987),\n", + " ('small', 986),\n", + " ('death', 985),\n", + " ('friends', 981),\n", + " ('remember', 974),\n", + " ('often', 970),\n", + " ('said', 966),\n", + " ('favorite', 962),\n", + " 
('heart', 959),\n", + " ('early', 957),\n", + " ('left', 956),\n", + " ('until', 955),\n", + " ('let', 954),\n", + " ('script', 954),\n", + " ('maybe', 937),\n", + " ('today', 936),\n", + " ('live', 934),\n", + " ('less', 934),\n", + " ('moments', 933),\n", + " ('others', 929),\n", + " ('brilliant', 926),\n", + " ('shot', 925),\n", + " ('liked', 923),\n", + " ('become', 916),\n", + " ('won', 915),\n", + " ('used', 910),\n", + " ('style', 907),\n", + " ('mother', 895),\n", + " ('lives', 894),\n", + " ('came', 893),\n", + " ('stars', 890),\n", + " ('cinema', 889),\n", + " ('looks', 885),\n", + " ('perhaps', 884),\n", + " ('read', 882),\n", + " ('enjoyed', 879),\n", + " ('boy', 875),\n", + " ('drama', 873),\n", + " ('highly', 871),\n", + " ('given', 870),\n", + " ('playing', 867),\n", + " ('use', 864),\n", + " ('next', 859),\n", + " ('women', 858),\n", + " ('fine', 857),\n", + " ('effects', 856),\n", + " ('kids', 854),\n", + " ('entertaining', 853),\n", + " ('need', 852),\n", + " ('line', 850),\n", + " ('works', 848),\n", + " ('someone', 847),\n", + " ('mr', 836),\n", + " ('simply', 835),\n", + " ('children', 833),\n", + " ('picture', 833),\n", + " ('face', 831),\n", + " ('friend', 831),\n", + " ('keep', 831),\n", + " ('dark', 830),\n", + " ('overall', 828),\n", + " ('certainly', 828),\n", + " ('minutes', 827),\n", + " ('wasn', 824),\n", + " ('history', 822),\n", + " ('finally', 820),\n", + " ('couple', 816),\n", + " ('against', 815),\n", + " ('son', 809),\n", + " ('understand', 808),\n", + " ('lost', 807),\n", + " ('michael', 805),\n", + " ('else', 801),\n", + " ('throughout', 798),\n", + " ('fans', 797),\n", + " ('city', 792),\n", + " ('reason', 789),\n", + " ('written', 787),\n", + " ('production', 787),\n", + " ('several', 784),\n", + " ('school', 783),\n", + " ('rest', 781),\n", + " ('based', 781),\n", + " ('try', 780),\n", + " ('dead', 776),\n", + " ('hope', 775),\n", + " ('strong', 768),\n", + " ('white', 765),\n", + " ('tell', 759),\n", + " ('itself', 
758),\n", + " ('half', 753),\n", + " ('person', 749),\n", + " ('sometimes', 746),\n", + " ('past', 744),\n", + " ('start', 744),\n", + " ('genre', 743),\n", + " ('final', 739),\n", + " ('beginning', 739),\n", + " ('town', 738),\n", + " ('art', 734),\n", + " ('game', 732),\n", + " ('humor', 732),\n", + " ('yes', 731),\n", + " ('idea', 731),\n", + " ('late', 730),\n", + " ('becomes', 729),\n", + " ('despite', 729),\n", + " ('able', 726),\n", + " ('case', 726),\n", + " ('money', 723),\n", + " ('child', 721),\n", + " ('completely', 721),\n", + " ('side', 719),\n", + " ('camera', 716),\n", + " ('getting', 714),\n", + " ('instead', 712),\n", + " ('soon', 702),\n", + " ('under', 700),\n", + " ('viewer', 699),\n", + " ('age', 697),\n", + " ('days', 696),\n", + " ('stories', 696),\n", + " ('felt', 694),\n", + " ('simple', 694),\n", + " ('roles', 693),\n", + " ('video', 688),\n", + " ('name', 683),\n", + " ('either', 683),\n", + " ('doing', 677),\n", + " ('turns', 674),\n", + " ('wants', 671),\n", + " ('close', 671),\n", + " ('title', 669),\n", + " ('wrong', 668),\n", + " ('went', 666),\n", + " ('james', 665),\n", + " ('evil', 659),\n", + " ('budget', 657),\n", + " ('episodes', 657),\n", + " ('relationship', 655),\n", + " ('piece', 653),\n", + " ('fantastic', 653),\n", + " ('david', 651),\n", + " ('turn', 648),\n", + " ('murder', 646),\n", + " ('parts', 645),\n", + " ('brother', 644),\n", + " ('head', 643),\n", + " ('absolutely', 643),\n", + " ('experience', 642),\n", + " ('eyes', 641),\n", + " ('sex', 638),\n", + " ('direction', 637),\n", + " ('called', 637),\n", + " ('directed', 636),\n", + " ('lines', 634),\n", + " ('behind', 633),\n", + " ('sort', 632),\n", + " ('actress', 631),\n", + " ('lead', 630),\n", + " ('oscar', 628),\n", + " ('example', 627),\n", + " ('including', 627),\n", + " ('known', 625),\n", + " ('musical', 625),\n", + " ('chance', 621),\n", + " ('score', 620),\n", + " ('feeling', 619),\n", + " ('already', 619),\n", + " ('hit', 619),\n", + " ('voice', 
615),\n", + " ('moment', 612),\n", + " ('living', 612),\n", + " ('low', 610),\n", + " ('supporting', 610),\n", + " ('ago', 609),\n", + " ('themselves', 608),\n", + " ('hilarious', 605),\n", + " ('reality', 605),\n", + " ('jack', 604),\n", + " ('told', 603),\n", + " ('hand', 601),\n", + " ('moving', 600),\n", + " ('dialogue', 600),\n", + " ('quality', 600),\n", + " ('song', 599),\n", + " ('happy', 599),\n", + " ('paul', 598),\n", + " ('matter', 598),\n", + " ('light', 594),\n", + " ('future', 593),\n", + " ('entire', 592),\n", + " ('finds', 591),\n", + " ('gave', 589),\n", + " ('laugh', 587),\n", + " ('released', 586),\n", + " ('expect', 584),\n", + " ('fight', 581),\n", + " ('particularly', 580),\n", + " ('cinematography', 579),\n", + " ('police', 579),\n", + " ('whose', 578),\n", + " ('type', 578),\n", + " ('sound', 578),\n", + " ('enjoyable', 573),\n", + " ('view', 573),\n", + " ('husband', 572),\n", + " ('romantic', 572),\n", + " ('number', 572),\n", + " ('daughter', 572),\n", + " ('documentary', 571),\n", + " ('self', 570),\n", + " ('modern', 569),\n", + " ('robert', 569),\n", + " ('took', 569),\n", + " ('superb', 569),\n", + " ('mean', 566),\n", + " ('shown', 563),\n", + " ('coming', 561),\n", + " ('important', 560),\n", + " ('king', 559),\n", + " ('leave', 559),\n", + " ('change', 558),\n", + " ('wanted', 555),\n", + " ('somewhat', 555),\n", + " ('tells', 554),\n", + " ('run', 552),\n", + " ('events', 552),\n", + " ('country', 552),\n", + " ('career', 552),\n", + " ('heard', 550),\n", + " ('season', 550),\n", + " ('girls', 549),\n", + " ('greatest', 549),\n", + " ('etc', 547),\n", + " ('care', 546),\n", + " ('starts', 545),\n", + " ('english', 542),\n", + " ('killer', 541),\n", + " ('animation', 540),\n", + " ('guys', 540),\n", + " ('totally', 540),\n", + " ('tale', 540),\n", + " ('usual', 539),\n", + " ('opinion', 535),\n", + " ('miss', 535),\n", + " ('violence', 531),\n", + " ('easy', 531),\n", + " ('songs', 530),\n", + " ('british', 528),\n", + " ('says', 
526),\n", + " ('realistic', 525),\n", + " ('writing', 524),\n", + " ('act', 522),\n", + " ('writer', 522),\n", + " ('comic', 521),\n", + " ('thriller', 519),\n", + " ('television', 517),\n", + " ('power', 516),\n", + " ('ones', 515),\n", + " ('kid', 514),\n", + " ('novel', 513),\n", + " ('york', 513),\n", + " ('problem', 512),\n", + " ('alone', 512),\n", + " ('attention', 509),\n", + " ('involved', 508),\n", + " ('kill', 507),\n", + " ('extremely', 507),\n", + " ('seemed', 506),\n", + " ('hero', 505),\n", + " ('french', 505),\n", + " ('rock', 504),\n", + " ('stuff', 501),\n", + " ('wish', 499),\n", + " ('begins', 498),\n", + " ('taken', 497),\n", + " ('sad', 497),\n", + " ('ways', 496),\n", + " ('richard', 495),\n", + " ('knows', 494),\n", + " ('atmosphere', 493),\n", + " ('surprised', 491),\n", + " ('similar', 491),\n", + " ('taking', 491),\n", + " ('car', 491),\n", + " ('george', 490),\n", + " ('perfectly', 490),\n", + " ('across', 489),\n", + " ('sequence', 489),\n", + " ('eye', 489),\n", + " ('team', 489),\n", + " ('serious', 488),\n", + " ('powerful', 488),\n", + " ('room', 488),\n", + " ('due', 488),\n", + " ('among', 488),\n", + " ('order', 487),\n", + " ('b', 487),\n", + " ('cannot', 487),\n", + " ('strange', 487),\n", + " ('beauty', 486),\n", + " ('famous', 485),\n", + " ('tries', 484),\n", + " ('myself', 484),\n", + " ('happened', 484),\n", + " ('herself', 484),\n", + " ('class', 483),\n", + " ('four', 482),\n", + " ('cool', 481),\n", + " ('release', 479),\n", + " ('anyway', 479),\n", + " ('theme', 479),\n", + " ('opening', 478),\n", + " ('entertainment', 477),\n", + " ('unique', 475),\n", + " ('ends', 475),\n", + " ('slow', 475),\n", + " ('exactly', 475),\n", + " ('red', 474),\n", + " ('o', 474),\n", + " ('level', 474),\n", + " ('easily', 474),\n", + " ('interest', 472),\n", + " ('happen', 471),\n", + " ('crime', 470),\n", + " ('viewing', 468),\n", + " ('memorable', 467),\n", + " ('sets', 467),\n", + " ('group', 466),\n", + " ('stop', 466),\n", + " 
('dance', 463),\n", + " ('message', 463),\n", + " ('sister', 463),\n", + " ('working', 463),\n", + " ('problems', 463),\n", + " ('knew', 462),\n", + " ('mystery', 461),\n", + " ('nature', 461),\n", + " ('bring', 460),\n", + " ('believable', 459),\n", + " ('thinking', 459),\n", + " ('brought', 459),\n", + " ('mostly', 458),\n", + " ('couldn', 457),\n", + " ('disney', 457),\n", + " ('society', 456),\n", + " ('within', 455),\n", + " ('lady', 455),\n", + " ('blood', 454),\n", + " ('upon', 453),\n", + " ('viewers', 453),\n", + " ('parents', 453),\n", + " ('meets', 452),\n", + " ('form', 452),\n", + " ('soundtrack', 452),\n", + " ('usually', 452),\n", + " ('tom', 452),\n", + " ('peter', 452),\n", + " ('local', 450),\n", + " ('certain', 448),\n", + " ('follow', 448),\n", + " ('whether', 447),\n", + " ('possible', 446),\n", + " ('emotional', 445),\n", + " ('killed', 444),\n", + " ('de', 444),\n", + " ('above', 444),\n", + " ('middle', 443),\n", + " ('god', 443),\n", + " ('happens', 442),\n", + " ('flick', 442),\n", + " ('needs', 442),\n", + " ('masterpiece', 441),\n", + " ('major', 440),\n", + " ('period', 440),\n", + " ('haven', 439),\n", + " ('named', 439),\n", + " ('th', 438),\n", + " ('particular', 438),\n", + " ('earth', 437),\n", + " ('feature', 437),\n", + " ('stand', 436),\n", + " ('words', 435),\n", + " ('typical', 435),\n", + " ('obviously', 433),\n", + " ('elements', 433),\n", + " ('romance', 431),\n", + " ('jane', 430),\n", + " ('yourself', 427),\n", + " ('showing', 427),\n", + " ('fantasy', 426),\n", + " ('brings', 426),\n", + " ('america', 423),\n", + " ('guess', 423),\n", + " ('huge', 422),\n", + " ('unfortunately', 422),\n", + " ('indeed', 421),\n", + " ('running', 421),\n", + " ('talent', 420),\n", + " ('stage', 419),\n", + " ('started', 418),\n", + " ('sweet', 417),\n", + " ('leads', 417),\n", + " ('japanese', 417),\n", + " ('poor', 416),\n", + " ('deal', 416),\n", + " ('personal', 413),\n", + " ('incredible', 413),\n", + " ('fast', 412),\n", + " 
('became', 410),\n", + " ('deep', 410),\n", + " ('hours', 409),\n", + " ('nearly', 408),\n", + " ('dream', 408),\n", + " ('giving', 408),\n", + " ('turned', 407),\n", + " ('clearly', 407),\n", + " ('near', 406),\n", + " ('obvious', 406),\n", + " ('cut', 405),\n", + " ('surprise', 405),\n", + " ('body', 404),\n", + " ('era', 404),\n", + " ('female', 403),\n", + " ('hour', 403),\n", + " ('five', 403),\n", + " ('note', 399),\n", + " ('learn', 398),\n", + " ('truth', 398),\n", + " ('match', 397),\n", + " ('feels', 397),\n", + " ('except', 397),\n", + " ('tony', 397),\n", + " ('filmed', 394),\n", + " ('complete', 394),\n", + " ('clear', 394),\n", + " ('older', 393),\n", + " ('street', 393),\n", + " ('lots', 393),\n", + " ('eventually', 393),\n", + " ('keeps', 393),\n", + " ('buy', 392),\n", + " ('stewart', 391),\n", + " ('william', 391),\n", + " ('joe', 390),\n", + " ('meet', 390),\n", + " ('fall', 390),\n", + " ('shots', 389),\n", + " ('talking', 389),\n", + " ('difficult', 389),\n", + " ('unlike', 389),\n", + " ('rating', 389),\n", + " ('means', 388),\n", + " ('dramatic', 388),\n", + " ('appears', 386),\n", + " ('subject', 386),\n", + " ('wonder', 386),\n", + " ('present', 386),\n", + " ('situation', 386),\n", + " ('comments', 385),\n", + " ('sequences', 383),\n", + " ('general', 383),\n", + " ('lee', 383),\n", + " ('earlier', 382),\n", + " ('points', 382),\n", + " ('check', 379),\n", + " ('gone', 379),\n", + " ('ten', 378),\n", + " ('suspense', 378),\n", + " ('recommended', 378),\n", + " ('business', 377),\n", + " ('third', 377),\n", + " ('talk', 375),\n", + " ('leaves', 375),\n", + " ('beyond', 375),\n", + " ('portrayal', 374),\n", + " ('beautifully', 373),\n", + " ('single', 372),\n", + " ('bill', 372),\n", + " ('word', 371),\n", + " ('plenty', 371),\n", + " ('falls', 370),\n", + " ('whom', 370),\n", + " ('figure', 369),\n", + " ('battle', 369),\n", + " ('scary', 369),\n", + " ('non', 369),\n", + " ('return', 368),\n", + " ('using', 368),\n", + " ('doubt', 
367),\n", + " ('add', 367),\n", + " ('hear', 366),\n", + " ('solid', 366),\n", + " ('success', 366),\n", + " ('touching', 365),\n", + " ('political', 365),\n", + " ('oh', 365),\n", + " ('jokes', 365),\n", + " ('awesome', 364),\n", + " ('hell', 364),\n", + " ('boys', 364),\n", + " ('dog', 362),\n", + " ('recently', 362),\n", + " ('sexual', 362),\n", + " ('please', 361),\n", + " ('wouldn', 361),\n", + " ('features', 361),\n", + " ('straight', 361),\n", + " ('lack', 360),\n", + " ('forget', 360),\n", + " ('setting', 360),\n", + " ('mark', 359),\n", + " ('married', 359),\n", + " ('social', 357),\n", + " ('adventure', 356),\n", + " ('interested', 356),\n", + " ('brothers', 355),\n", + " ('sees', 355),\n", + " ('actual', 355),\n", + " ('terrific', 355),\n", + " ('move', 354),\n", + " ('call', 354),\n", + " ('various', 353),\n", + " ('dr', 353),\n", + " ('theater', 353),\n", + " ('animated', 352),\n", + " ('western', 351),\n", + " ('space', 350),\n", + " ('baby', 350),\n", + " ('leading', 348),\n", + " ('disappointed', 348),\n", + " ('portrayed', 346),\n", + " ('aren', 346),\n", + " ('screenplay', 345),\n", + " ('smith', 345),\n", + " ('hate', 344),\n", + " ('towards', 344),\n", + " ('noir', 343),\n", + " ('outstanding', 342),\n", + " ('decent', 342),\n", + " ('kelly', 342),\n", + " ('directors', 341),\n", + " ('journey', 341),\n", + " ('none', 340),\n", + " ('effective', 340),\n", + " ('looked', 340),\n", + " ('caught', 339),\n", + " ('cold', 339),\n", + " ('storyline', 339),\n", + " ('fi', 339),\n", + " ('sci', 339),\n", + " ('mary', 339),\n", + " ('rich', 338),\n", + " ('charming', 338),\n", + " ('harry', 337),\n", + " ('popular', 337),\n", + " ('manages', 337),\n", + " ('rare', 337),\n", + " ('spirit', 336),\n", + " ('open', 335),\n", + " ('appreciate', 335),\n", + " ('basically', 334),\n", + " ('moves', 334),\n", + " ('acted', 334),\n", + " ('deserves', 333),\n", + " ('subtle', 333),\n", + " ('mention', 333),\n", + " ('inside', 333),\n", + " ('pace', 333),\n", + " 
('century', 333),\n", + " ('boring', 333),\n", + " ('familiar', 332),\n", + " ('background', 332),\n", + " ('ben', 331),\n", + " ('creepy', 330),\n", + " ('supposed', 330),\n", + " ('secret', 329),\n", + " ('jim', 328),\n", + " ('die', 328),\n", + " ('question', 327),\n", + " ('effect', 327),\n", + " ('natural', 327),\n", + " ('rate', 326),\n", + " ('language', 326),\n", + " ('impressive', 326),\n", + " ('intelligent', 325),\n", + " ('saying', 325),\n", + " ('material', 324),\n", + " ('realize', 324),\n", + " ('telling', 324),\n", + " ('scott', 324),\n", + " ('singing', 323),\n", + " ('dancing', 322),\n", + " ('adult', 321),\n", + " ('imagine', 321),\n", + " ('visual', 321),\n", + " ('kept', 320),\n", + " ('office', 320),\n", + " ('uses', 319),\n", + " ('pure', 318),\n", + " ('wait', 318),\n", + " ('stunning', 318),\n", + " ('copy', 317),\n", + " ('review', 317),\n", + " ('previous', 317),\n", + " ('seriously', 317),\n", + " ('somehow', 316),\n", + " ('created', 316),\n", + " ('magic', 316),\n", + " ('create', 316),\n", + " ('hot', 316),\n", + " ('reading', 316),\n", + " ('crazy', 315),\n", + " ('air', 315),\n", + " ('frank', 315),\n", + " ('stay', 315),\n", + " ('escape', 315),\n", + " ('attempt', 315),\n", + " ('hands', 314),\n", + " ('filled', 313),\n", + " ('surprisingly', 312),\n", + " ('expected', 312),\n", + " ('average', 312),\n", + " ('complex', 311),\n", + " ('studio', 310),\n", + " ('successful', 310),\n", + " ('quickly', 310),\n", + " ('male', 309),\n", + " ('plus', 309),\n", + " ('co', 307),\n", + " ('minute', 306),\n", + " ('images', 306),\n", + " ('casting', 306),\n", + " ('exciting', 306),\n", + " ('following', 306),\n", + " ('members', 305),\n", + " ('german', 305),\n", + " ('e', 305),\n", + " ('reasons', 305),\n", + " ('follows', 305),\n", + " ('themes', 305),\n", + " ('touch', 304),\n", + " ('genius', 304),\n", + " ('free', 304),\n", + " ('edge', 304),\n", + " ('cute', 304),\n", + " ('outside', 303),\n", + " ('ok', 302),\n", + " ('admit', 
302),\n", + " ('younger', 302),\n", + " ('reviews', 302),\n", + " ('odd', 301),\n", + " ('fighting', 301),\n", + " ('master', 301),\n", + " ('break', 300),\n", + " ('thanks', 300),\n", + " ('recent', 300),\n", + " ('comment', 300),\n", + " ('apart', 299),\n", + " ('lovely', 298),\n", + " ('begin', 298),\n", + " ('emotions', 298),\n", + " ('doctor', 297),\n", + " ('italian', 297),\n", + " ('party', 297),\n", + " ('la', 296),\n", + " ('missed', 296),\n", + " ...]" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "positive_counts.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "pos_neg_ratios = Counter()\n", + "\n", + "for term,cnt in list(total_counts.most_common()):\n", + " if(cnt > 100):\n", + " pos_neg_ratio = positive_counts[term] / float(negative_counts[term]+1)\n", + " pos_neg_ratios[term] = pos_neg_ratio\n", + "\n", + "for word,ratio in pos_neg_ratios.most_common():\n", + " if(ratio > 1):\n", + " pos_neg_ratios[word] = np.log(ratio)\n", + " else:\n", + " pos_neg_ratios[word] = -np.log((1 / (ratio+0.01)))" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('edie', 4.6913478822291435),\n", + " ('paulie', 4.0775374439057197),\n", + " ('felix', 3.1527360223636558),\n", + " ('polanski', 2.8233610476132043),\n", + " ('matthau', 2.8067217286092401),\n", + " ('victoria', 2.6810215287142909),\n", + " ('mildred', 2.6026896854443837),\n", + " ('gandhi', 2.5389738710582761),\n", + " ('flawless', 2.451005098112319),\n", + " ('superbly', 2.2600254785752498),\n", + " ('perfection', 2.1594842493533721),\n", + " ('astaire', 2.1400661634962708),\n", + " ('captures', 2.0386195471595809),\n", + " ('voight', 2.0301704926730531),\n", + " ('wonderfully', 2.0218960560332353),\n", + " ('powell', 1.9783454248084671),\n", 
+ " ('brosnan', 1.9547990964725592),\n", + " ('lily', 1.9203768470501485),\n", + " ('bakshi', 1.9029851043382795),\n", + " ('lincoln', 1.9014583864844796),\n", + " ('refreshing', 1.8551812956655511),\n", + " ('breathtaking', 1.8481124057791867),\n", + " ('bourne', 1.8478489358790986),\n", + " ('lemmon', 1.8458266904983307),\n", + " ('delightful', 1.8002701588959635),\n", + " ('flynn', 1.7996646487351682),\n", + " ('andrews', 1.7764919970972666),\n", + " ('homer', 1.7692866133759964),\n", + " ('beautifully', 1.7626953362841438),\n", + " ('soccer', 1.7578579175523736),\n", + " ('elvira', 1.7397031072720019),\n", + " ('underrated', 1.7197859696029656),\n", + " ('gripping', 1.7165360479904674),\n", + " ('superb', 1.7091514458966952),\n", + " ('delight', 1.6714733033535532),\n", + " ('welles', 1.6677068205580761),\n", + " ('sadness', 1.663505133704376),\n", + " ('sinatra', 1.6389967146756448),\n", + " ('touching', 1.637217476541176),\n", + " ('timeless', 1.62924053973028),\n", + " ('macy', 1.6211339521972916),\n", + " ('unforgettable', 1.6177367152487956),\n", + " ('favorites', 1.6158688027643908),\n", + " ('stewart', 1.6119987332957739),\n", + " ('hartley', 1.6094379124341003),\n", + " ('sullivan', 1.6094379124341003),\n", + " ('extraordinary', 1.6094379124341003),\n", + " ('brilliantly', 1.5950491749820008),\n", + " ('friendship', 1.5677652160335325),\n", + " ('wonderful', 1.5645425925262093),\n", + " ('palma', 1.5553706911638245),\n", + " ('magnificent', 1.54663701119507),\n", + " ('finest', 1.5462590108125689),\n", + " ('jackie', 1.5439233053234738),\n", + " ('ritter', 1.5404450409471491),\n", + " ('tremendous', 1.5184661342283736),\n", + " ('freedom', 1.5091151908062312),\n", + " ('fantastic', 1.5048433868558566),\n", + " ('terrific', 1.5026699370083942),\n", + " ('noir', 1.493925025312256),\n", + " ('sidney', 1.493925025312256),\n", + " ('outstanding', 1.4910053152089213),\n", + " ('mann', 1.4894785973551214),\n", + " ('pleasantly', 1.4894785973551214),\n", + " 
('nancy', 1.488077055429833),\n", + " ('marie', 1.4825711915553104),\n", + " ('marvelous', 1.4739999415389962),\n", + " ('excellent', 1.4647538505723599),\n", + " ('ruth', 1.4596256342054401),\n", + " ('stanwyck', 1.4412101187160054),\n", + " ('widmark', 1.4350845252893227),\n", + " ('splendid', 1.4271163556401458),\n", + " ('chan', 1.423108334242607),\n", + " ('exceptional', 1.4201959127955721),\n", + " ('tender', 1.410986973710262),\n", + " ('gentle', 1.4078005663408544),\n", + " ('poignant', 1.4022947024663317),\n", + " ('gem', 1.3932148039644643),\n", + " ('amazing', 1.3919815802404802),\n", + " ('chilling', 1.3862943611198906),\n", + " ('captivating', 1.3862943611198906),\n", + " ('fisher', 1.3862943611198906),\n", + " ('davies', 1.3862943611198906),\n", + " ('darker', 1.3652409519220583),\n", + " ('april', 1.3499267169490159),\n", + " ('kelly', 1.3461743673304654),\n", + " ('blake', 1.3418425985490567),\n", + " ('overlooked', 1.329135947279942),\n", + " ('ralph', 1.32818673031261),\n", + " ('bette', 1.3156767939059373),\n", + " ('hoffman', 1.3150668518315229),\n", + " ('cole', 1.3121863889661687),\n", + " ('shines', 1.3049487216659381),\n", + " ('powerful', 1.2999662776313934),\n", + " ('notch', 1.2950456896547455),\n", + " ('remarkable', 1.2883688239495823),\n", + " ('pitt', 1.286210902562908),\n", + " ('winters', 1.2833463918674481),\n", + " ('vivid', 1.2762934659055623),\n", + " ('gritty', 1.2757524867200667),\n", + " ('giallo', 1.2745029551317739),\n", + " ('portrait', 1.2704625455947689),\n", + " ('innocence', 1.2694300209805796),\n", + " ('psychiatrist', 1.2685113254635072),\n", + " ('favorite', 1.2668956297860055),\n", + " ('ensemble', 1.2656663733312759),\n", + " ('stunning', 1.2622417124499117),\n", + " ('burns', 1.259880436264232),\n", + " ('garbo', 1.258954938743289),\n", + " ('barbara', 1.2580400255962119),\n", + " ('panic', 1.2527629684953681),\n", + " ('holly', 1.2527629684953681),\n", + " ('philip', 1.2527629684953681),\n", + " ('carol', 
1.2481440226390734),\n", + " ('perfect', 1.246742480713785),\n", + " ('appreciated', 1.2462482874741743),\n", + " ('favourite', 1.2411123512753928),\n", + " ('journey', 1.2367626271489269),\n", + " ('rural', 1.235471471385307),\n", + " ('bond', 1.2321436812926323),\n", + " ('builds', 1.2305398317106577),\n", + " ('brilliant', 1.2287554137664785),\n", + " ('brooklyn', 1.2286654169163074),\n", + " ('von', 1.225175011976539),\n", + " ('unfolds', 1.2163953243244932),\n", + " ('recommended', 1.2163953243244932),\n", + " ('daniel', 1.20215296760895),\n", + " ('perfectly', 1.1971931173405572),\n", + " ('crafted', 1.1962507582320256),\n", + " ('prince', 1.1939224684724346),\n", + " ('troubled', 1.192138346678933),\n", + " ('consequences', 1.1865810616140668),\n", + " ('haunting', 1.1814999484738773),\n", + " ('cinderella', 1.180052620608284),\n", + " ('alexander', 1.1759989522835299),\n", + " ('emotions', 1.1753049094563641),\n", + " ('boxing', 1.1735135968412274),\n", + " ('subtle', 1.1734135017508081),\n", + " ('curtis', 1.1649873576129823),\n", + " ('rare', 1.1566438362402944),\n", + " ('loved', 1.1563661500586044),\n", + " ('daughters', 1.1526795099383853),\n", + " ('courage', 1.1438688802562305),\n", + " ('dentist', 1.1426722784621401),\n", + " ('highly', 1.1420208631618658),\n", + " ('nominated', 1.1409146683587992),\n", + " ('tony', 1.1397491942285991),\n", + " ('draws', 1.1325138403437911),\n", + " ('everyday', 1.1306150197542835),\n", + " ('contrast', 1.1284652518177909),\n", + " ('cried', 1.1213405397456659),\n", + " ('fabulous', 1.1210851445201684),\n", + " ('ned', 1.120591195386885),\n", + " ('fay', 1.120591195386885),\n", + " ('emma', 1.1184149159642893),\n", + " ('sensitive', 1.113318436057805),\n", + " ('smooth', 1.1089750757036563),\n", + " ('dramas', 1.1080910326226534),\n", + " ('today', 1.1050431789984001),\n", + " ('helps', 1.1023091505494358),\n", + " ('inspiring', 1.0986122886681098),\n", + " ('jimmy', 1.0937696641923216),\n", + " ('awesome', 
1.0931328229034842),\n", + " ('unique', 1.0881409888008142),\n", + " ('tragic', 1.0871835928444868),\n", + " ('intense', 1.0870514662670339),\n", + " ('stellar', 1.0857088838322018),\n", + " ('rival', 1.0822184788924332),\n", + " ('provides', 1.0797081340289569),\n", + " ('depression', 1.0782034170369026),\n", + " ('shy', 1.0775588794702773),\n", + " ('carrie', 1.076139432816051),\n", + " ('blend', 1.0753554265038423),\n", + " ('hank', 1.0736109864626924),\n", + " ('diana', 1.0726368022648489),\n", + " ('adorable', 1.0726368022648489),\n", + " ('unexpected', 1.0722255334949147),\n", + " ('achievement', 1.0668635903535293),\n", + " ('bettie', 1.0663514264498881),\n", + " ('happiness', 1.0632729222228008),\n", + " ('glorious', 1.0608719606852626),\n", + " ('davis', 1.0541605260972757),\n", + " ('terrifying', 1.0525211814678428),\n", + " ('beauty', 1.050410186850232),\n", + " ('ideal', 1.0479685558493548),\n", + " ('fears', 1.0467872208035236),\n", + " ('hong', 1.0438040521731147),\n", + " ('seasons', 1.0433496099930604),\n", + " ('fascinating', 1.0414538748281612),\n", + " ('carries', 1.0345904299031787),\n", + " ('satisfying', 1.0321225473992768),\n", + " ('definite', 1.0319209141694374),\n", + " ('touched', 1.0296194171811581),\n", + " ('greatest', 1.0248947127715422),\n", + " ('creates', 1.0241097613701886),\n", + " ('aunt', 1.023388867430522),\n", + " ('walter', 1.022328983918479),\n", + " ('spectacular', 1.0198314108149955),\n", + " ('portrayal', 1.0189810189761024),\n", + " ('ann', 1.0127808528183286),\n", + " ('enterprise', 1.0116009116784799),\n", + " ('musicals', 1.0096648026516135),\n", + " ('deeply', 1.0094845087721023),\n", + " ('incredible', 1.0061677561461084),\n", + " ('mature', 1.0060195018402847),\n", + " ('triumph', 0.99682959435816731),\n", + " ('margaret', 0.99682959435816731),\n", + " ('navy', 0.99493385919326827),\n", + " ('harry', 0.99176919305006062),\n", + " ('lucas', 0.990398704027877),\n", + " ('sweet', 0.98966110487955483),\n", + " 
('joey', 0.98794672078059009),\n", + " ('oscar', 0.98721905111049713),\n", + " ('balance', 0.98649499054740353),\n", + " ('warm', 0.98485340331145166),\n", + " ('ages', 0.98449898190068863),\n", + " ('glover', 0.98082925301172619),\n", + " ('guilt', 0.98082925301172619),\n", + " ('carrey', 0.98082925301172619),\n", + " ('learns', 0.97881108885548895),\n", + " ('unusual', 0.97788374278196932),\n", + " ('sons', 0.97777581552483595),\n", + " ('complex', 0.97761897738147796),\n", + " ('essence', 0.97753435711487369),\n", + " ('brazil', 0.9769153536905899),\n", + " ('widow', 0.97650959186720987),\n", + " ('solid', 0.97537964824416146),\n", + " ('beautiful', 0.97326301262841053),\n", + " ('holmes', 0.97246100334120955),\n", + " ('awe', 0.97186058302896583),\n", + " ('vhs', 0.97116734209998934),\n", + " ('eerie', 0.97116734209998934),\n", + " ('lonely', 0.96873720724669754),\n", + " ('grim', 0.96873720724669754),\n", + " ('sport', 0.96825047080486615),\n", + " ('debut', 0.96508089604358704),\n", + " ('destiny', 0.96343751029985703),\n", + " ('thrillers', 0.96281074750904794),\n", + " ('tears', 0.95977584381389391),\n", + " ('rose', 0.95664202739772253),\n", + " ('feelings', 0.95551144502743635),\n", + " ('ginger', 0.95551144502743635),\n", + " ('winning', 0.95471810900804055),\n", + " ('stanley', 0.95387344302319799),\n", + " ('cox', 0.95343027882361187),\n", + " ('paris', 0.95278479030472663),\n", + " ('heart', 0.95238806924516806),\n", + " ('hooked', 0.95155887071161305),\n", + " ('comfortable', 0.94803943018873538),\n", + " ('mgm', 0.94446160884085151),\n", + " ('masterpiece', 0.94155039863339296),\n", + " ('themes', 0.94118828349588235),\n", + " ('danny', 0.93967118051821874),\n", + " ('anime', 0.93378388932167222),\n", + " ('perry', 0.93328830824272613),\n", + " ('joy', 0.93301752567946861),\n", + " ('lovable', 0.93081883243706487),\n", + " ('hal', 0.92953595862417571),\n", + " ('mysteries', 0.92953595862417571),\n", + " ('louis', 0.92871325187271225),\n", + " 
('charming', 0.92520609553210742),\n", + " ('urban', 0.92367083917177761),\n", + " ('allows', 0.92183091224977043),\n", + " ('impact', 0.91815814604895041),\n", + " ('gradually', 0.91629073187415511),\n", + " ('lifestyle', 0.91629073187415511),\n", + " ('italy', 0.91629073187415511),\n", + " ('spy', 0.91289514287301687),\n", + " ('treat', 0.91193342650519937),\n", + " ('subsequent', 0.91056005716517008),\n", + " ('kennedy', 0.90981821736853763),\n", + " ('loving', 0.90967549275543591),\n", + " ('surprising', 0.90937028902958128),\n", + " ('quiet', 0.90648673177753425),\n", + " ('winter', 0.90624039602065365),\n", + " ('reveals', 0.90490540964902977),\n", + " ('raw', 0.90445627422715225),\n", + " ('funniest', 0.90078654533818991),\n", + " ('pleased', 0.89994159387262562),\n", + " ('norman', 0.89994159387262562),\n", + " ('thief', 0.89874642222324552),\n", + " ('season', 0.89827222637147675),\n", + " ('secrets', 0.89794159320595857),\n", + " ('colorful', 0.89705936994626756),\n", + " ('highest', 0.8967461358011849),\n", + " ('compelling', 0.89462923509297576),\n", + " ('danes', 0.89248008318043659),\n", + " ('castle', 0.88967708335606499),\n", + " ('kudos', 0.88889175768604067),\n", + " ('great', 0.88810470901464589),\n", + " ('baseball', 0.88730319500090271),\n", + " ('subtitles', 0.88730319500090271),\n", + " ('bleak', 0.88730319500090271),\n", + " ('winner', 0.88643776872447388),\n", + " ('tragedy', 0.88563699078315261),\n", + " ('todd', 0.88551907320740142),\n", + " ('nicely', 0.87924946019380601),\n", + " ('arthur', 0.87546873735389985),\n", + " ('essential', 0.87373111745535925),\n", + " ('gorgeous', 0.8731725250935497),\n", + " ('fonda', 0.87294029100054127),\n", + " ('eastwood', 0.87139541196626402),\n", + " ('focuses', 0.87082835779739776),\n", + " ('enjoyed', 0.87070195951624607),\n", + " ('natural', 0.86997924506912838),\n", + " ('intensity', 0.86835126958503595),\n", + " ('witty', 0.86824103423244681),\n", + " ('rob', 0.8642954367557748),\n", + " 
('worlds', 0.86377269759070874),\n", + " ('health', 0.86113891179907498),\n", + " ('magical', 0.85953791528170564),\n", + " ('deeper', 0.85802182375017932),\n", + " ('lucy', 0.85618680780444956),\n", + " ('moving', 0.85566611005772031),\n", + " ('lovely', 0.85290640004681306),\n", + " ('purple', 0.8513711857748395),\n", + " ('memorable', 0.84801189112086062),\n", + " ('sings', 0.84729786038720367),\n", + " ('craig', 0.84342938360928321),\n", + " ('modesty', 0.84342938360928321),\n", + " ('relate', 0.84326559685926517),\n", + " ('episodes', 0.84223712084137292),\n", + " ('strong', 0.84167135777060931),\n", + " ('smith', 0.83959811108590054),\n", + " ('tear', 0.83704136022001441),\n", + " ('apartment', 0.83333115290549531),\n", + " ('princess', 0.83290912293510388),\n", + " ('disagree', 0.83290912293510388),\n", + " ('kung', 0.83173334384609199),\n", + " ('adventure', 0.83150561393278388),\n", + " ('columbo', 0.82667857318446791),\n", + " ('jake', 0.82667857318446791),\n", + " ('adds', 0.82485652591452319),\n", + " ('hart', 0.82472353834866463),\n", + " ('strength', 0.82417544296634937),\n", + " ('realizes', 0.82360006895738058),\n", + " ('dave', 0.8232003088081431),\n", + " ('childhood', 0.82208086393583857),\n", + " ('forbidden', 0.81989888619908913),\n", + " ('tight', 0.81883539572344199),\n", + " ('surreal', 0.8178506590609026),\n", + " ('manager', 0.81770990320170756),\n", + " ('dancer', 0.81574950265227764),\n", + " ('con', 0.81093021621632877),\n", + " ('studios', 0.81093021621632877),\n", + " ('miike', 0.80821651034473263),\n", + " ('realistic', 0.80807714723392232),\n", + " ('explicit', 0.80792269515237358),\n", + " ('kurt', 0.8060875917405409),\n", + " ('traditional', 0.80535917116687328),\n", + " ('deals', 0.80535917116687328),\n", + " ('holds', 0.80493858654806194),\n", + " ('carl', 0.80437281567016972),\n", + " ('touches', 0.80396154690023547),\n", + " ('gene', 0.80314807577427383),\n", + " ('albert', 0.8027669055771679),\n", + " ('abc', 
0.80234647252493729),\n", + " ('cry', 0.80011930011211307),\n", + " ('sides', 0.7995275841185171),\n", + " ('develops', 0.79850769621777162),\n", + " ('eyre', 0.79850769621777162),\n", + " ('dances', 0.79694397424158891),\n", + " ('oscars', 0.79633141679517616),\n", + " ('legendary', 0.79600456599965308),\n", + " ('importance', 0.79492987486988764),\n", + " ('hearted', 0.79492987486988764),\n", + " ('portraying', 0.79356592830699269),\n", + " ('impressed', 0.79258107754813223),\n", + " ('waters', 0.79112758892014912),\n", + " ('empire', 0.79078565012386137),\n", + " ('edge', 0.789774016249017),\n", + " ('environment', 0.78845736036427028),\n", + " ('jean', 0.78845736036427028),\n", + " ('sentimental', 0.7864791203521645),\n", + " ('captured', 0.78623760362595729),\n", + " ('styles', 0.78592891401091158),\n", + " ('daring', 0.78592891401091158),\n", + " ('backgrounds', 0.78275933924963248),\n", + " ('frank', 0.78275933924963248),\n", + " ('matches', 0.78275933924963248),\n", + " ('tense', 0.78275933924963248),\n", + " ('gothic', 0.78209466657644144),\n", + " ('sharp', 0.7814397877056235),\n", + " ('achieved', 0.78015855754957497),\n", + " ('court', 0.77947526404844247),\n", + " ('steals', 0.7789140023173704),\n", + " ('rules', 0.77844476107184035),\n", + " ('colors', 0.77684619943659217),\n", + " ('reunion', 0.77318988823348167),\n", + " ('covers', 0.77139937745969345),\n", + " ('tale', 0.77010822169607374),\n", + " ('rain', 0.7683706017975328),\n", + " ('denzel', 0.76804848873306297),\n", + " ('stays', 0.76787072675588186),\n", + " ('blob', 0.76725515271366718),\n", + " ('conventional', 0.76214005204689672),\n", + " ('maria', 0.76214005204689672),\n", + " ('fresh', 0.76158434211317383),\n", + " ('midnight', 0.76096977689870637),\n", + " ('landscape', 0.75852993982279704),\n", + " ('animated', 0.75768570169751648),\n", + " ('titanic', 0.75666058628227129),\n", + " ('sunday', 0.75666058628227129),\n", + " ('spring', 0.7537718023763802),\n", + " ('cagney', 
0.7537718023763802),\n", + " ('enjoyable', 0.75246375771636476),\n", + " ('immensely', 0.75198768058287868),\n", + " ('sir', 0.7507762933965817),\n", + " ('nevertheless', 0.75067102469813185),\n", + " ('driven', 0.74994477895307854),\n", + " ('performances', 0.74883252516063137),\n", + " ('memories', 0.74721440183022114),\n", + " ('nowadays', 0.74721440183022114),\n", + " ('simple', 0.74641420974143258),\n", + " ('golden', 0.74533293373051557),\n", + " ('leslie', 0.74533293373051557),\n", + " ('lovers', 0.74497224842453125),\n", + " ('relationship', 0.74484232345601786),\n", + " ('supporting', 0.74357803418683721),\n", + " ('che', 0.74262723782331497),\n", + " ('packed', 0.7410032017375805),\n", + " ('trek', 0.74021469141793106),\n", + " ('provoking', 0.73840377214806618),\n", + " ('strikes', 0.73759894313077912),\n", + " ('depiction', 0.73682224406260699),\n", + " ('emotional', 0.73678211645681524),\n", + " ('secretary', 0.7366322924996842),\n", + " ('influenced', 0.73511137965897755),\n", + " ('florida', 0.73511137965897755),\n", + " ('germany', 0.73288750920945944),\n", + " ('brings', 0.73142936713096229),\n", + " ('lewis', 0.73129894652432159),\n", + " ('elderly', 0.73088750854279239),\n", + " ('owner', 0.72743625403857748),\n", + " ('streets', 0.72666987259858895),\n", + " ('henry', 0.72642196944481741),\n", + " ('portrays', 0.72593700338293632),\n", + " ('bears', 0.7252354951114458),\n", + " ('china', 0.72489587887452556),\n", + " ('anger', 0.72439972406404984),\n", + " ('society', 0.72433010799663333),\n", + " ('available', 0.72415741730250549),\n", + " ('best', 0.72347034060446314),\n", + " ('bugs', 0.72270598280148979),\n", + " ('magic', 0.71878961117328299),\n", + " ('verhoeven', 0.71846498854423513),\n", + " ('delivers', 0.71846498854423513),\n", + " ('jim', 0.71783979315031676),\n", + " ('donald', 0.71667767797013937),\n", + " ('endearing', 0.71465338578090898),\n", + " ('relationships', 0.71393795022901896),\n", + " ('greatly', 
0.71256526641704687),\n", + " ('charlie', 0.71024161391924534),\n", + " ('brad', 0.71024161391924534),\n", + " ('simon', 0.70967648251115578),\n", + " ('effectively', 0.70914752190638641),\n", + " ('march', 0.70774597998109789),\n", + " ('atmosphere', 0.70744773070214162),\n", + " ('influence', 0.70733181555190172),\n", + " ('genius', 0.706392407309966),\n", + " ('emotionally', 0.70556970055850243),\n", + " ('ken', 0.70526854109229009),\n", + " ('identity', 0.70484322032313651),\n", + " ('sophisticated', 0.70470800296102132),\n", + " ('dan', 0.70457587638356811),\n", + " ('andrew', 0.70329955202396321),\n", + " ('india', 0.70144598337464037),\n", + " ('roy', 0.69970458110610434),\n", + " ('surprisingly', 0.6995780708902356),\n", + " ('sky', 0.69780919366575667),\n", + " ('romantic', 0.69664981111114743),\n", + " ('match', 0.69566924999265523),\n", + " ('britain', 0.69314718055994529),\n", + " ('beatty', 0.69314718055994529),\n", + " ('affected', 0.69314718055994529),\n", + " ('cowboy', 0.69314718055994529),\n", + " ('wave', 0.69314718055994529),\n", + " ('stylish', 0.69314718055994529),\n", + " ('bitter', 0.69314718055994529),\n", + " ('patient', 0.69314718055994529),\n", + " ('meets', 0.69314718055994529),\n", + " ('love', 0.69198533541937324),\n", + " ('paul', 0.68980827929443067),\n", + " ('andy', 0.68846333124751902),\n", + " ('performance', 0.68797386327972465),\n", + " ('patrick', 0.68645819240914863),\n", + " ('unlike', 0.68546468438792907),\n", + " ('brooks', 0.68433655087779044),\n", + " ('refuses', 0.68348526964820844),\n", + " ('award', 0.6824518914431974),\n", + " ('complaint', 0.6824518914431974),\n", + " ('ride', 0.68229716453587952),\n", + " ('dawson', 0.68171848473632257),\n", + " ('luke', 0.68158635815886937),\n", + " ('wells', 0.68087708796813096),\n", + " ('france', 0.6804081547825156),\n", + " ('handsome', 0.68007509899259255),\n", + " ('sports', 0.68007509899259255),\n", + " ('rebel', 0.67875844310784572),\n", + " ('directs', 
0.67875844310784572),\n", + " ('greater', 0.67605274720064523),\n", + " ('dreams', 0.67599410133369586),\n", + " ('effective', 0.67565402311242806),\n", + " ('interpretation', 0.67479804189174875),\n", + " ('works', 0.67445504754779284),\n", + " ('brando', 0.67445504754779284),\n", + " ('noble', 0.6737290947028437),\n", + " ('paced', 0.67314651385327573),\n", + " ('le', 0.67067432470788668),\n", + " ('master', 0.67015766233524654),\n", + " ('h', 0.6696166831497512),\n", + " ('rings', 0.66904962898088483),\n", + " ('easy', 0.66895995494594152),\n", + " ('city', 0.66820823221269321),\n", + " ('sunshine', 0.66782937257565544),\n", + " ('succeeds', 0.66647893347778397),\n", + " ('relations', 0.664159643686693),\n", + " ('england', 0.66387679825983203),\n", + " ('glimpse', 0.66329421741026418),\n", + " ('aired', 0.66268797307523675),\n", + " ('sees', 0.66263163663399482),\n", + " ('both', 0.66248336767382998),\n", + " ('definitely', 0.66199789483898808),\n", + " ('imaginative', 0.66139848224536502),\n", + " ('appreciate', 0.66083893732728749),\n", + " ('tricks', 0.66071190480679143),\n", + " ('striking', 0.66071190480679143),\n", + " ('carefully', 0.65999497324304479),\n", + " ('complicated', 0.65981076029235353),\n", + " ('perspective', 0.65962448852130173),\n", + " ('trilogy', 0.65877953705573755),\n", + " ('future', 0.65834665141052828),\n", + " ('lion', 0.65742909795786608),\n", + " ('victor', 0.65540685257709819),\n", + " ('douglas', 0.65540685257709819),\n", + " ('inspired', 0.65459851044271034),\n", + " ('marriage', 0.65392646740666405),\n", + " ('demands', 0.65392646740666405),\n", + " ('father', 0.65172321672194655),\n", + " ('page', 0.65123628494430852),\n", + " ('instant', 0.65058756614114943),\n", + " ('era', 0.6495567444850836),\n", + " ('ruthless', 0.64934455790155243),\n", + " ('saga', 0.64934455790155243),\n", + " ('joan', 0.64891392558311978),\n", + " ('joseph', 0.64841128671855386),\n", + " ('workers', 0.64829661439459352),\n", + " ('fantasy', 
0.64726757480925168),\n", + " ('accomplished', 0.64551913157069074),\n", + " ('distant', 0.64551913157069074),\n", + " ('manhattan', 0.64435701639051324),\n", + " ('personal', 0.64355023942057321),\n", + " ('pushing', 0.64313675998528386),\n", + " ('meeting', 0.64313675998528386),\n", + " ('individual', 0.64313675998528386),\n", + " ('pleasant', 0.64250344774119039),\n", + " ('brave', 0.64185388617239469),\n", + " ('william', 0.64083139119578469),\n", + " ('hudson', 0.64077919504262937),\n", + " ('friendly', 0.63949446706762514),\n", + " ('eccentric', 0.63907995928966954),\n", + " ('awards', 0.63875310849414646),\n", + " ('jack', 0.63838309514997038),\n", + " ('seeking', 0.63808740337691783),\n", + " ('colonel', 0.63757732940513456),\n", + " ('divorce', 0.63757732940513456),\n", + " ('jane', 0.63443957973316734),\n", + " ('keeping', 0.63414883979798953),\n", + " ('gives', 0.63383568159497883),\n", + " ('ted', 0.63342794585832296),\n", + " ('animation', 0.63208692379869902),\n", + " ('progress', 0.6317782341836532),\n", + " ('concert', 0.63127177684185776),\n", + " ('larger', 0.63127177684185776),\n", + " ('nation', 0.6296337748376194),\n", + " ('albeit', 0.62739580299716491),\n", + " ('adapted', 0.62613647027698516),\n", + " ('discovers', 0.62542900650499444),\n", + " ('classic', 0.62504956428050518),\n", + " ('segment', 0.62335141862440335),\n", + " ('morgan', 0.62303761437291871),\n", + " ('mouse', 0.62294292188669675),\n", + " ('impressive', 0.62211140744319349),\n", + " ('artist', 0.62168821657780038),\n", + " ('ultimate', 0.62168821657780038),\n", + " ('griffith', 0.62117368093485603),\n", + " ('emily', 0.62082651898031915),\n", + " ('drew', 0.62082651898031915),\n", + " ('moved', 0.6197197120051281),\n", + " ('profound', 0.61903920840622351),\n", + " ('families', 0.61903920840622351),\n", + " ('innocent', 0.61851219917136446),\n", + " ('versions', 0.61730910416844087),\n", + " ('eddie', 0.61691981517206107),\n", + " ('criticism', 0.61651395453902935),\n", + " 
('nature', 0.61594514653194088),\n", + " ('recognized', 0.61518563909023349),\n", + " ('sexuality', 0.61467556511845012),\n", + " ('contract', 0.61400986000122149),\n", + " ('brian', 0.61344043794920278),\n", + " ('remembered', 0.6131044728864089),\n", + " ('determined', 0.6123858239154869),\n", + " ('offers', 0.61207935747116349),\n", + " ('pleasure', 0.61195702582993206),\n", + " ('washington', 0.61180154110599294),\n", + " ('images', 0.61159731359583758),\n", + " ('games', 0.61067095873570676),\n", + " ('academy', 0.60872983874736208),\n", + " ('fashioned', 0.60798937221963845),\n", + " ('melodrama', 0.60749173598145145),\n", + " ('peoples', 0.60613580357031549),\n", + " ('charismatic', 0.60613580357031549),\n", + " ('rough', 0.60613580357031549),\n", + " ('dealing', 0.60517840761398811),\n", + " ('fine', 0.60496962268013299),\n", + " ('tap', 0.60391604683200273),\n", + " ('trio', 0.60157998703445481),\n", + " ('russell', 0.60120968523425966),\n", + " ('figures', 0.60077386042893011),\n", + " ('ward', 0.60005675749393339),\n", + " ('shine', 0.59911823091166894),\n", + " ('brady', 0.59911823091166894),\n", + " ('job', 0.59845562125168661),\n", + " ('satisfied', 0.59652034487087369),\n", + " ('river', 0.59637962862495086),\n", + " ('brown', 0.595773016534769),\n", + " ('believable', 0.59566072133302495),\n", + " ('bound', 0.59470710774669278),\n", + " ('always', 0.59470710774669278),\n", + " ('hall', 0.5933967777928858),\n", + " ('cook', 0.5916777203950857),\n", + " ('claire', 0.59136448625000293),\n", + " ('broadway', 0.59033768669372433),\n", + " ('anna', 0.58778666490211906),\n", + " ('peace', 0.58628403501758408),\n", + " ('visually', 0.58539431926349916),\n", + " ('falk', 0.58525821854876026),\n", + " ('morality', 0.58525821854876026),\n", + " ('growing', 0.58466653756587539),\n", + " ('experiences', 0.58314628534561685),\n", + " ('stood', 0.58314628534561685),\n", + " ('touch', 0.58122926435596001),\n", + " ('lives', 0.5810976767513224),\n", + " ('kubrick', 
0.58066919713325493),\n", + " ('timing', 0.58047401805583243),\n", + " ('struggles', 0.57981849525294216),\n", + " ('expressions', 0.57981849525294216),\n", + " ('authentic', 0.57848427223980559),\n", + " ('helen', 0.57763429343810091),\n", + " ('pre', 0.57700753064729182),\n", + " ('quirky', 0.5753641449035618),\n", + " ('young', 0.57531672344534313),\n", + " ('inner', 0.57454143815209846),\n", + " ('mexico', 0.57443087372056334),\n", + " ('clint', 0.57380042292737909),\n", + " ('sisters', 0.57286101468544337),\n", + " ('realism', 0.57226528899949558),\n", + " ('personalities', 0.5720692490067093),\n", + " ('french', 0.5720692490067093),\n", + " ('surprises', 0.57113222999698177),\n", + " ('adventures', 0.57113222999698177),\n", + " ('overcome', 0.5697681593994407),\n", + " ('timothy', 0.56953322459276867),\n", + " ('tales', 0.56909453188996639),\n", + " ('war', 0.56843317302781682),\n", + " ('civil', 0.5679840376059393),\n", + " ('countries', 0.56737779327091187),\n", + " ('streep', 0.56710645966458029),\n", + " ('tradition', 0.56685345523565323),\n", + " ('oliver', 0.56673325570428668),\n", + " ('australia', 0.56580775818334383),\n", + " ('understanding', 0.56531380905006046),\n", + " ('players', 0.56509525370004821),\n", + " ('knowing', 0.56489284503626647),\n", + " ('rogers', 0.56421349718405212),\n", + " ('suspenseful', 0.56368911332305849),\n", + " ('variety', 0.56368911332305849),\n", + " ('true', 0.56281525180810066),\n", + " ('jr', 0.56220982311246936),\n", + " ('psychological', 0.56108745854687891),\n", + " ('branagh', 0.55961578793542266),\n", + " ('wealth', 0.55961578793542266),\n", + " ('performing', 0.55961578793542266),\n", + " ('odds', 0.55961578793542266),\n", + " ('sent', 0.55961578793542266),\n", + " ('reminiscent', 0.55961578793542266),\n", + " ('grand', 0.55961578793542266),\n", + " ('overwhelming', 0.55961578793542266),\n", + " ('brothers', 0.55891181043362848),\n", + " ('howard', 0.55811089675600245),\n", + " ('david', 
0.55693122256475369),\n", + " ('generation', 0.55628799784274796),\n", + " ('grow', 0.55612538299565417),\n", + " ('survival', 0.55594605904646033),\n", + " ('mainstream', 0.55574731115750231),\n", + " ('dick', 0.55431073570572953),\n", + " ('charm', 0.55288175575407861),\n", + " ('kirk', 0.55278982286502287),\n", + " ('twists', 0.55244729845681018),\n", + " ('gangster', 0.55206858230003986),\n", + " ('jeff', 0.55179306225421365),\n", + " ('family', 0.55116244510065526),\n", + " ('tend', 0.55053307336110335),\n", + " ('thanks', 0.55049088015842218),\n", + " ('world', 0.54744234723432639),\n", + " ('sutherland', 0.54743536937855164),\n", + " ('life', 0.54695514434959924),\n", + " ('disc', 0.54654370636806993),\n", + " ('bug', 0.54654370636806993),\n", + " ('tribute', 0.5455111817538808),\n", + " ('europe', 0.54522705048332309),\n", + " ('sacrifice', 0.54430155296238014),\n", + " ('color', 0.54405127139431109),\n", + " ('superior', 0.54333490233128523),\n", + " ('york', 0.54318235866536513),\n", + " ('pulls', 0.54266622962164945),\n", + " ('hearts', 0.54232429082536171),\n", + " ('jackson', 0.54232429082536171),\n", + " ('enjoy', 0.54124285135906114),\n", + " ('redemption', 0.54056759296472823),\n", + " ('madness', 0.540384426007535),\n", + " ('hamilton', 0.5389965007326869),\n", + " ('stands', 0.5389965007326869),\n", + " ('trial', 0.5389965007326869),\n", + " ('greek', 0.5389965007326869),\n", + " ('each', 0.5388212312554177),\n", + " ('faithful', 0.53773307668591508),\n", + " ('received', 0.5372768098531604),\n", + " ('jealous', 0.53714293208336406),\n", + " ('documentaries', 0.53714293208336406),\n", + " ('different', 0.53709860682460819),\n", + " ('describes', 0.53680111016925136),\n", + " ('shorts', 0.53596159703753288),\n", + " ('brilliance', 0.53551823635636209),\n", + " ('mountains', 0.53492317534505118),\n", + " ('share', 0.53408248593025787),\n", + " ('dealt', 0.53408248593025787),\n", + " ('providing', 0.53329847961804933),\n", + " ('explore', 
0.53329847961804933),\n", + " ('series', 0.5325809226575603),\n", + " ('fellow', 0.5323318289869543),\n", + " ('loves', 0.53062825106217038),\n", + " ('olivier', 0.53062825106217038),\n", + " ('revolution', 0.53062825106217038),\n", + " ('roman', 0.53062825106217038),\n", + " ('century', 0.53002783074992665),\n", + " ('musical', 0.52966871156747064),\n", + " ('heroic', 0.52925932545482868),\n", + " ('ironically', 0.52806743020049673),\n", + " ('approach', 0.52806743020049673),\n", + " ('temple', 0.52806743020049673),\n", + " ('moves', 0.5279372642387119),\n", + " ('gift', 0.52702030968597136),\n", + " ('julie', 0.52609309589677911),\n", + " ('tells', 0.52415107836314001),\n", + " ('radio', 0.52394671172868779),\n", + " ('uncle', 0.52354439617376536),\n", + " ('union', 0.52324814376454787),\n", + " ('deep', 0.52309571635780505),\n", + " ('reminds', 0.52157841554225237),\n", + " ('famous', 0.52118841080153722),\n", + " ('jazz', 0.52053443789295151),\n", + " ('dennis', 0.51987545928590861),\n", + " ('epic', 0.51919387343650736),\n", + " ('adult', 0.519167695083386),\n", + " ('shows', 0.51915322220375304),\n", + " ('performed', 0.5191244265806858),\n", + " ('demons', 0.5191244265806858),\n", + " ('eric', 0.51879379341516751),\n", + " ('discovered', 0.51879379341516751),\n", + " ('youth', 0.5185626062681431),\n", + " ('human', 0.51851411224987087),\n", + " ('tarzan', 0.51813827061227724),\n", + " ('ourselves', 0.51794309153485463),\n", + " ('wwii', 0.51758240622887042),\n", + " ('passion', 0.5162164724008671),\n", + " ('desire', 0.51607497965213445),\n", + " ('pays', 0.51581316527702981),\n", + " ('fox', 0.51557622652458857),\n", + " ('dirty', 0.51557622652458857),\n", + " ('symbolism', 0.51546600332249293),\n", + " ('sympathetic', 0.51546600332249293),\n", + " ('attitude', 0.51530993621331933),\n", + " ('appearances', 0.51466440007315639),\n", + " ('jeremy', 0.51466440007315639),\n", + " ('fun', 0.51439068993048687),\n", + " ('south', 0.51420972175023116),\n", + " 
('arrives', 0.51409894911095988),\n", + " ('present', 0.51341965894303732),\n", + " ('com', 0.51326167856387173),\n", + " ('smile', 0.51265880484765169),\n", + " ('fits', 0.51082562376599072),\n", + " ('provided', 0.51082562376599072),\n", + " ('carter', 0.51082562376599072),\n", + " ('ring', 0.51082562376599072),\n", + " ('aging', 0.51082562376599072),\n", + " ('countryside', 0.51082562376599072),\n", + " ('alan', 0.51082562376599072),\n", + " ('visit', 0.51082562376599072),\n", + " ('begins', 0.51015650363396647),\n", + " ('success', 0.50900578704900468),\n", + " ('japan', 0.50900578704900468),\n", + " ('accurate', 0.50895471583017893),\n", + " ('proud', 0.50800474742434931),\n", + " ('daily', 0.5075946031845443),\n", + " ('atmospheric', 0.50724780241810674),\n", + " ('karloff', 0.50724780241810674),\n", + " ('recently', 0.50714914903668207),\n", + " ('fu', 0.50704490092608467),\n", + " ('horrors', 0.50656122497953315),\n", + " ('finding', 0.50637127341661037),\n", + " ('lust', 0.5059356384717989),\n", + " ('hitchcock', 0.50574947073413001),\n", + " ('among', 0.50334004951332734),\n", + " ('viewing', 0.50302139827440906),\n", + " ('shining', 0.50262885656181222),\n", + " ('investigation', 0.50262885656181222),\n", + " ('duo', 0.5020919437972361),\n", + " ('cameron', 0.5020919437972361),\n", + " ('finds', 0.50128303100539795),\n", + " ('contemporary', 0.50077528791248915),\n", + " ('genuine', 0.50046283673044401),\n", + " ('frightening', 0.49995595152908684),\n", + " ('plays', 0.49975983848890226),\n", + " ('age', 0.49941323171424595),\n", + " ('position', 0.49899116611898781),\n", + " ('continues', 0.49863035067217237),\n", + " ('roles', 0.49839716550752178),\n", + " ('james', 0.49837216269470402),\n", + " ('individuals', 0.49824684155913052),\n", + " ('brought', 0.49783842823917956),\n", + " ('hilarious', 0.49714551986191058),\n", + " ('brutal', 0.49681488669639234),\n", + " ('appropriate', 0.49643688631389105),\n", + " ('dance', 0.49581998314812048),\n", + " 
('league', 0.49578774640145024),\n", + " ('helping', 0.49578774640145024),\n", + " ('answers', 0.49578774640145024),\n", + " ('stunts', 0.49561620510246196),\n", + " ('traveling', 0.49532143723002542),\n", + " ('thoroughly', 0.49414593456733524),\n", + " ('depicted', 0.49317068852726992),\n", + " ('honor', 0.49247648509779424),\n", + " ('combination', 0.49247648509779424),\n", + " ('differences', 0.49247648509779424),\n", + " ('fully', 0.49213349075383811),\n", + " ('tracy', 0.49159426183810306),\n", + " ('battles', 0.49140753790888908),\n", + " ('possibility', 0.49112055268665822),\n", + " ('romance', 0.4901589869574316),\n", + " ('initially', 0.49002249613622745),\n", + " ('happy', 0.4898997500608791),\n", + " ('crime', 0.48977221456815834),\n", + " ('singing', 0.4893852925281213),\n", + " ('especially', 0.48901267837860624),\n", + " ('shakespeare', 0.48754793889664511),\n", + " ('hugh', 0.48729512635579658),\n", + " ('detail', 0.48609484250827351),\n", + " ('guide', 0.48550781578170082),\n", + " ('companion', 0.48550781578170082),\n", + " ('julia', 0.48550781578170082),\n", + " ('san', 0.48550781578170082),\n", + " ('desperation', 0.48550781578170082),\n", + " ('strongly', 0.48460242866688824),\n", + " ('necessary', 0.48302334245403883),\n", + " ('humanity', 0.48265474679929443),\n", + " ('drama', 0.48221998493060503),\n", + " ('warming', 0.48183808689273838),\n", + " ('intrigue', 0.48183808689273838),\n", + " ('nonetheless', 0.48183808689273838),\n", + " ('cuba', 0.48183808689273838),\n", + " ('planned', 0.47957308026188628),\n", + " ('pictures', 0.47929937011921681),\n", + " ('broadcast', 0.47849024312305422),\n", + " ('nine', 0.47803580094299974),\n", + " ('settings', 0.47743860773325364),\n", + " ('history', 0.47732966933780852),\n", + " ('ordinary', 0.47725880012690741),\n", + " ('trade', 0.47692407209030935),\n", + " ('primary', 0.47608267532211779),\n", + " ('official', 0.47608267532211779),\n", + " ('episode', 0.47529620261150429),\n", + " ('role', 
0.47520268270188676),\n", + " ('spirit', 0.47477690799839323),\n", + " ('grey', 0.47409361449726067),\n", + " ('ways', 0.47323464982718205),\n", + " ('cup', 0.47260441094579297),\n", + " ('piano', 0.47260441094579297),\n", + " ('familiar', 0.47241617565111949),\n", + " ('sinister', 0.47198579044972683),\n", + " ('reveal', 0.47171449364936496),\n", + " ('max', 0.47150852042515579),\n", + " ('dated', 0.47121648567094482),\n", + " ('discovery', 0.47000362924573563),\n", + " ('vicious', 0.47000362924573563),\n", + " ('losing', 0.47000362924573563),\n", + " ('genuinely', 0.46871413841586385),\n", + " ('hatred', 0.46734051182625186),\n", + " ('mistaken', 0.46702300110759781),\n", + " ('dream', 0.46608972992459924),\n", + " ('challenge', 0.46608972992459924),\n", + " ('crisis', 0.46575733836428446),\n", + " ('photographed', 0.46488852857896512),\n", + " ('machines', 0.46430560813109778),\n", + " ('critics', 0.46430560813109778),\n", + " ('bird', 0.46430560813109778),\n", + " ('born', 0.46411383518967209),\n", + " ('detective', 0.4636633473511525),\n", + " ('higher', 0.46328467899699055),\n", + " ('remains', 0.46262352194811296),\n", + " ('inevitable', 0.46262352194811296),\n", + " ('soviet', 0.4618180446592961),\n", + " ('ryan', 0.46134556650262099),\n", + " ('african', 0.46112595521371813),\n", + " ('smaller', 0.46081520319132935),\n", + " ('techniques', 0.46052488529119184),\n", + " ('information', 0.46034171833399862),\n", + " ('deserved', 0.45999798712841444),\n", + " ('cynical', 0.45953232937844013),\n", + " ('lynch', 0.45953232937844013),\n", + " ('francisco', 0.45953232937844013),\n", + " ('tour', 0.45953232937844013),\n", + " ('spielberg', 0.45953232937844013),\n", + " ('struggle', 0.45911782160048453),\n", + " ('language', 0.45902121257712653),\n", + " ('visual', 0.45823514408822852),\n", + " ('warner', 0.45724137763188427),\n", + " ('social', 0.45720078250735313),\n", + " ('reality', 0.45719346885019546),\n", + " ('hidden', 0.45675840249571492),\n", + " 
('breaking', 0.45601738727099561),\n", + " ('sometimes', 0.45563021171182794),\n", + " ('modern', 0.45500247579345005),\n", + " ('surfing', 0.45425527227759638),\n", + " ('popular', 0.45410691533051023),\n", + " ('surprised', 0.4534409399850382),\n", + " ('follows', 0.45245361754408348),\n", + " ('keeps', 0.45234869400701483),\n", + " ('john', 0.4520909494482197),\n", + " ('defeat', 0.45198512374305722),\n", + " ('mixed', 0.45198512374305722),\n", + " ('justice', 0.45142724367280018),\n", + " ('treasure', 0.45083371313801535),\n", + " ('presents', 0.44973793178615257),\n", + " ('years', 0.44919197032104968),\n", + " ('chief', 0.44895022004790319),\n", + " ('shadows', 0.44802472252696035),\n", + " ('closely', 0.44701411102103689),\n", + " ('segments', 0.44701411102103689),\n", + " ('lose', 0.44658335503763702),\n", + " ('caine', 0.44628710262841953),\n", + " ('caught', 0.44610275383999071),\n", + " ('hamlet', 0.44558510189758965),\n", + " ('chinese', 0.44507424620321018),\n", + " ('welcome', 0.44438052435783792),\n", + " ('birth', 0.44368632092836219),\n", + " ('represents', 0.44320543609101143),\n", + " ('puts', 0.44279106572085081),\n", + " ('fame', 0.44183275227903923),\n", + " ('closer', 0.44183275227903923),\n", + " ('visuals', 0.44183275227903923),\n", + " ('web', 0.44183275227903923),\n", + " ('criminal', 0.4412745608048752),\n", + " ('minor', 0.4409224199448939),\n", + " ('jon', 0.44086703515908027),\n", + " ('liked', 0.44074991514020723),\n", + " ('restaurant', 0.44031183943833246),\n", + " ('flaws', 0.43983275161237217),\n", + " ('de', 0.43983275161237217),\n", + " ('searching', 0.4393666597838457),\n", + " ('rap', 0.43891304217570443),\n", + " ('light', 0.43884433018199892),\n", + " ('elizabeth', 0.43872232986464677),\n", + " ('marry', 0.43861731542506488),\n", + " ('oz', 0.43825493093115531),\n", + " ('controversial', 0.43825493093115531),\n", + " ('learned', 0.43825493093115531),\n", + " ('slowly', 0.43785660389939979),\n", + " ('bridge', 
0.43721380642274466),\n", + " ('thrilling', 0.43721380642274466),\n", + " ('wayne', 0.43721380642274466),\n", + " ('comedic', 0.43721380642274466),\n", + " ('married', 0.43658501682196887),\n", + " ('nazi', 0.4361020775700542),\n", + " ('murder', 0.4353180712578455),\n", + " ('physical', 0.4353180712578455),\n", + " ('johnny', 0.43483971678806865),\n", + " ('michelle', 0.43445264498141672),\n", + " ('wallace', 0.43403848055222038),\n", + " ('silent', 0.43395706390247063),\n", + " ('comedies', 0.43395706390247063),\n", + " ('played', 0.43387244114515305),\n", + " ('international', 0.43363598507486073),\n", + " ('vision', 0.43286408229627887),\n", + " ('intelligent', 0.43196704885367099),\n", + " ('shop', 0.43078291609245434),\n", + " ('also', 0.43036720209769169),\n", + " ('levels', 0.4302451371066513),\n", + " ('miss', 0.43006426712153217),\n", + " ('ocean', 0.4295626596872249),\n", + " ...]" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"POSITIVE\" label\n", + "pos_neg_ratios.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('boll', -4.0778152602708904),\n", + " ('uwe', -3.9218753018711578),\n", + " ('seagal', -3.3202501058581921),\n", + " ('unwatchable', -3.0269848170580955),\n", + " ('stinker', -2.9876839403711624),\n", + " ('mst', -2.7753833211707968),\n", + " ('incoherent', -2.7641396677532537),\n", + " ('unfunny', -2.5545257844967644),\n", + " ('waste', -2.4907515123361046),\n", + " ('blah', -2.4475792789485005),\n", + " ('horrid', -2.3715779644809971),\n", + " ('pointless', -2.3451073877136341),\n", + " ('atrocious', -2.3187369339642556),\n", + " ('redeeming', -2.2667790015910296),\n", + " ('prom', -2.2601040980178784),\n", + " ('drivel', -2.2476029585766928),\n", + " ('lousy', -2.2118080125207054),\n", + 
" ('worst', -2.1930856334332267),\n", + " ('laughable', -2.172468615469592),\n", + " ('awful', -2.1385076866397488),\n", + " ('poorly', -2.1326133844207011),\n", + " ('wasting', -2.1178155545614512),\n", + " ('remotely', -2.111046881095167),\n", + " ('existent', -2.0024805005437076),\n", + " ('boredom', -1.9241486572738005),\n", + " ('miserably', -1.9216610938019989),\n", + " ('sucks', -1.9166645809588516),\n", + " ('uninspired', -1.9131499212248517),\n", + " ('lame', -1.9117232884159072),\n", + " ('insult', -1.9085323769376259)]" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"NEGATIVE\" label\n", + "list(reversed(pos_neg_ratios.most_common()))[0:30]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Transforming Text into Numbers" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"<base64-encoded PNG figure data omitted: plot output of the cell below; nothing else is recoverable from the image payload>"
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
uAvnnHFNwOHMTNt4N/jZBhX+cHnJMrbATVLUQSIlAIywivKHx0GOKglgi\nZ5xxho1t4Jw4Gdj9wd3HgAdm1dJNPxe3oZuOZbSXYFXEicgrrFBieqxICTvzpikbcsVbe90Fnw8G\nTawQzockqc6kdzsJJ83jpyMv8XUWLlzYCUgWhHzvGQsEAuULRCZYWmwee+wxax1Br9mzZ9vYJH66\nfh2LhPQLadUjBApGoMi1wEWWxV4sQVOHfQIi0qkiICcjQrSH0xOvwxc/fodflp+GY78cYntECfld\nunA97jzf4RDx/rVwvrg4ItSfpb1RertzxFQI3hrdz9Z8VxniPQuIgb9Q5vDqafdKcTFW6HdiyBQZ\nV4N2VLnXjOKEZLn7lEcI1AOB2k7N4FcRDNSxsS7wq1i3bp1NE+wZE4zvfxQiofKWniUK6h9LKeaI\nMOa8fWLNcYK+AdFKpV/R7XWm6zwrOVx76vS9atUqc+CBB9ZJpa664DdCX6R1YiUfH2cF6FYJlgsc\ngKdOnWqj6bL65tJLL+1pAelWZvgauhCldfPmzTZ0/DXXXGOnbfpxf7k63D0d1k2/hYAQqDcCo+BD\n9VZR2pWFAL4BbNde123a07Z7w4YNJtiAzQ6GafPWIT1kJK0Ta68pGq5DyPfff3/ry9HPwRrfkc9/\n/vMmsL5Y4lMGxpAQ2gQRkggBIdBMBGprEWkmnM3SGufD5cuXN0vpLto+99xz1uehS5JaX8rixNpt\nSS99ixVkzpw5dk+YfpIQgMbqgvVlzZo11iG2CAdbvwNFQnw0dCwEmouALCLN7bvcmjMw4BgazK8X\naqbPrVjGAljaysZtaZ0/M1ZXWjamW+ibpO1w0zM+0bjiiivMypUra4EHbYEMvfnmm2bt2rWFWC9E\nQkq7/VSwEOg7ArKI9B3y+lSIOZupjGXLltVHqYyasAx19OjRiQfvjNX0JRuEgk9SvxHSMtjzQSAh\nrCjDGpGUzJTZMO4zdvjFT2rKlCkdPbPWKRKSFTnlEwL1REAWkXr2S9+0YrDDfM+g1eR59g984APm\nkksuMTNmzOgbdv2oiP5J6jdCWhyjISFFWR6KbqMjSVn1EwkpukdUnhCoHgFZRKrvg0o1wMdg4sSJ\n5u67765UjzyVYw3B55qYJgzG/idPuXXIm8ZvhGBuTMd8/etfry2pvPHGG81+++1nzj///NTwioSk\nhkwZhEAjEJBFpBHdVK6SDNxYRfj2/QzKrbW40g899FC7ZJTlo2GhTb7gE1OH6QpfpyTHvfxGGKSx\nnNRlOqZbm5hCYorm2GOPNfPmzeuWtHNNJKQDhQ6EQOsQEBFpXZdmaxAm87ffftvO5WcroZpcLBFl\nVQZOqkmEQZDB2pekUx9+niqOne5YSXzhPMuwcQhtylJsiAXh4em7cHv8tnEsEhJGRL+FQLsQEBFp\nV39mbg2DGQPyrbfeWlmI7rTKF2UFoJzwjsi9Bse0uhaZHiuPT54IVrZlyxa7RLfIesoui+XFixYt\n6hr3JdzWsnVS+UJACPQfARGR/mNe2xp56DNF04QlsGVbAcJTOiwNrtO0FeTJORejW1OXYGMV4Z4j\n5khY6IM6E8KwvvotBIRANgRERLLh1tpcTHXcddddZuPGjZ2Bro6NZQBjOSjOj/0QfDQY7H3xrRL+\n+X4do9MXv/hFs++++yb2teiXbknrceQ3vGpLJCQpgkonBJqPgIhI8/uw8BbkXWJZuEKhAuuiX3hK\np9+OsBARrCHoccABB4RQas7P008/3e6B46wiIiHN6TtpKgSKQEBEpAgUW1hGXQZ7H1qmY9hMra5x\nMtAv7Ahb5pQOfYT0yyrk90WRxxAP9sNhwzyRkCKRVVlCoBkIiIg0o58q0ZKBbv369TZIVtVLXhnk\nWfKJZA2GVQWIUVM6Rfk9QHKa4M+TBHeWYF9wwQXm4osvTpJcaYSAEGgRAu9oUVvUlIIR4
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "\n", + "review = \"This was a horrible, terrible movie.\"\n", + "\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAi4AAAECCAYAAADZzFwPAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQVdV5/xdNZjIxjRgrM52qFI01ERQVExWNeMMLQy0YiEiNEgOYaJAO\nitIaGYo2TFGQeElQAREjRa0oDEG8AKagosYYkEuSjjUEbP+orZFc/KMzmfe3Pys+57fOfvfZZ1/P\nWXu/zzNz3rPP3uvyrO/a717f/axnPatfTyBGRRFQBBQBRUARUAQUgQog8CcV0FFVVAQUAUVAEVAE\nFAFFwCKgxEVvBEVAEVAEFAFFQBGoDAJKXCrTVaqoIqAIKAKKgCKgCChx0XtAEVAEFAFFQBFQBCqD\ngBKXynSVKqoIKAKKgCKgCCgCSlz0HlAEFAFFQBFQBBSByiCgxKUyXaWKKgKKgCKgCCgCioASF70H\nFAFFQBFQBBQBRaAyCHy8MpqqooqAItAVBH784x+bPXv2mJ07d5q9e/eat99+2+zYsaOXLuPGjTOH\nHHKIGTp0qBkyZIg59dRTzac//ele6fSEIqAIKAJ5EOinkXPzwKd5FYF6IrBp0yazYcMGs2rVKjNg\nwAAzcuRIc8IJJ5jBgwebgw8+2Hzuc59ravh//dd/mf/8z/807777rtm1a5d58cUX7Qcyc8kll5gv\nf/nLSmKaENMfioAikBUBJS5ZkdN8ikDNEPjtb39rli9fbh566CHbshkzZpgLLrjA/MVf/EWmllLe\nxo0bzfr1682yZcvMjTfeaG644YbM5WVSQjMpAopA7RBQH5fadak2SBFIj8A999xjPv/5z5stW7aY\nJUuWmO3bt5tJkyblIhlME1166aVm6dKl1hqDVocffriZOXOmwUKjoggoAopAFgSUuGRBTfMoAjVB\nAP+Vk046yaxZs8Z+nnzySfPFL36x8NZhtVmwYEGDwFDHihUrCq9HC1QEFIH6I6BTRfXvY22hIhCJ\nAFaW+fPnm3nz5lnrSmSikk5CmKZOnWqOOeYYOz2lTrwlAa3FKgI1REAtLjXsVG2SIhCHAL4nU6ZM\nsRaWzZs3d5y0oBsWl61bt5pBgwbZKapf/OIXcSrrNUVAEVAEGgioxaUBhR4oAvVHANIyZswYc+ih\nh3pj6WDK6JZbbjGQqPBqpfr3iLZQEVAE0iKgcVzSIqbpFYGKIiCkZdiwYdbfxJdm4ATMEuvzzjtP\nyYsvnaJ6KAIeI6DExePOUdUUgaIQ8JW0SPtYfYQoeRFE9FsRU
LrSiqi4aTimHKwu7QLyhfP5\n9nv8+PFmxIgRldjtWkmLb3eP6lNXBJS41LVnS2oXy1LDlo2SqspULOTlpZdeMo888og59thjM5UR\nlYlBKSlpiMqf9Nwzzzxjk7L0G8kzBQcWSFWtLmCOHxO+U777iihpsbea/lEEOoKAEpeOwFyPSph+\nIJCcb5YWF12mFiAtBJZL4rTr5m13XLTfi1hz3HrFOTgPYZHyqm51YSURxIUVaz6Lkhafe0d1qyMC\nSlzq2KsltAlCcNVVV9mVKp1yWM3aDAgB01llDHp5/F7CRIVAce60kNveogbDe+65x2zZsqVwEufq\nWsbxihUrzKJFi+zqsTLKL6rMovqpKH20HEWgLyCgxKUv9HLONjLgMk2CJaMqKzuwjvDGvmbNmlzT\nLVHQCQFpZxWRdFJGHFGRNPJN3rC/i1xL8005Z511lnVenjRpUpqsXUtbZt8V2SglLUWiqWUpAskR\nUOKSHKs+mxK/FmTp0qWVwqDst3YGLtfvBaLhBklLQ1SigGUALyIWCOWgJ74irSw8UfV34xxEqyxr\nWZHtUdJSJJpaliKQDgElLunw6nOpeUBXxUEyqnOwumBpKMPaAFF55ZVXzEEHHWT3UZIYKlF6ZD1X\n1ABJoMBp06Z5HzYfkvzBBx94PbVVVJ9kvSc0nyLQ1xFQ4tLX74A27a/SctSophRJvLBcRAV7y+P3\nEqWze66oKSPKdJeL+7hKx3f96AusVu2mCN3+02NFQBEoHgElLsVjWpsSixz0uwkK5OuSSy5JbXUJ\nExV3WijcnjIHNYgRUoRTtJADHwIHuhiKXr6uWCuSQLrt1mNFQBFIj4DuVZQes7Y5TjnlFNOvXz/7\nYZdeV+Q836+++qp7KfaY/VrcvLGJC7r4wAMPmFmzZnkfQ6Ndc9kSgBUq7QSi5n4gCrxdyyfOSsE1\n0pGfQa5IQQ/XdyZP2cR0GTZsmNUVYtZtASuIpQQOjMO4W7oqaekW8lqvIhCNQCWJC2RABnFIgkrx\nCPCwXrZsmR1Uii+9syXKSihIhSsuSeFYCIp8ZxlEyYuFRKwkbn15joUU5SlD8kJe2MUbCxIOzN0S\niBMrng455BBvYwMpaenW3aH1KgKtEfh460t6pS8jsHHjRjN58uRCpid8wPGv//qvLRFzdYEMlCGs\n3BHyUsT0jugI0WCwL2JlELt4v/7662bq1Klm3bp1hngvReoqOkd9QwbYEPPv/u7vzMMPP5x6Ci+q\nzDLOKWkpA1UtUxHIj0AlLS75m11uCT/5yU9MT0+P/TAwVFHWr19v34arqHtYZ4LnfelLX7JTc2JN\nKYu0SN1CAoqcjhELEANqEQIGW7duNSeeeKLd1wjyUlTZrfRjdRNWFoLi4ehaxmqvVnWnOa+kJQ1a\nmlYR6DACwQBbGdm2bVtPAE/kJ5i379WOBx98sIfzbh7O7d+/PzKtpLv88svt9TvuuKORVzJIGr7R\nh8+FF15o01E24tYp51rlf/bZZxv5KZOyOBeWxx9/vKEL6aIkTXuj8rvngoG3J/CrcE9V/rgbbQpW\nIfUElo1CsSu6PJQLSETPuHHjesDotttuK7TvweCpp57qCQiS/XDss6AveKgoAoqAnwjUcqro3Xff\ntdMczz//fDDGN8s111xjjjzySBOQAzN48ODmi86viy66yETld5LYt9Wbb77ZPZXqGBP9vHnzmvJQ\nJ5+AhFgzftPFFj+KaK9btFgJxGrgXstzjJ7EPdmxY4fdqPHll182AYlsKpK+OfPMM83RRx9tLQFn\nnHGGOeKII5rSZP0xfPhw87Of/axjUyLo6Trtxq1KStMmLCXik5MmX1xapp/Y24m+x4eMmDTnnnuu\ntYicfvrpqaensFgw3Ugfr1q1yoD9nDlzDFNUPotaWnzuHdVNEfgjArUkLvhmxJEOBsuLL77Y7Nq1\nyxDdNCyPPfZY+FTk7zykhQLDp
MWtBIJ1wgknGAaNdpK3veHyIRgMNEUJ5dHWxYsXty2Svgnjz6qg\nW265JTeBGTFihNm9e3dXti2AbEAKIDJFEEKIBX40RZTldgoEBuddSAbEgylDsEe4J8AQGTJkSNP/\nzp49e8yBAwfMW2+9Zd544w1LTgMLjl2GfsMNNxSup1Wi4D9KWgoGVItTBEpCoFI+LgzigeHKWiME\nD5Z2cg6/EoRlwy5pwXLBdT7BdItks2/67u/GhY8OKDeYBmrkDV+X3zzUpfws/iyt9KN89gZqJ0W1\n162HwV0GKPd8luPnnnvOYDVJQlpalU9eyqCsPII1h4G1WyJOtWLRyqMHhKWoJdJRekCwsI6wzQP1\nbN682UAgEQgKfTJ//vzGZ+fOnfba6NGjrcWG/wksOPiwFE2ubEUF/xFnaumjgovX4hQBRaBIBIIH\nTOUEX44AA/vBn8SV4OHauBaQCveSPW6V1z1P2fiuRInUy7f4woTTJfVxaacfdQSDhC2+lY9L1vaG\ndXZ/33333T188gq+RAFZaPSHi12WY8qizKyCbwh+HN2WIv1eyvB36TY+na4fX666+XN1GkOtTxHo\nJAK1myp67bXXgjHxjxJlNRg1apRctkGvgkGkyeQtF5NM0bAPTh4hmmtY8O9whWmWOF+cotrr1olV\ngjfnvIJvA1M/rmDJCgifjd3BVFiU8PbOfjVMVbjWM8qizJtuuikqW2XOFen3UuQS6coAWKCiEm+n\nClahAputRSkClUagdsSFCJwi+LG0kyjiwuCaRPr3758kWcs0UU6nYZ8b9IuTItobLh/SEKVbOF27\n3xAPV5iau+yyy9xTkcdCGiEoRBd2/W0os+rERRpdhN8LJAjBP0OOpXz9jkdASUs8PnpVEfAVgUr5\nuPgKouoVjYBrLcEXKAlpCZcEiSGviFumnKvyt/hU5PF7oQxioqgkR0BJS3KsNKUi4BsCtSMurrXE\nda4N5t8aTrTucRGWhaydihNsWMIWlnb6ldFeQrAzRVWkDBo0KHNxefJmrrSDGZmm4BPekiCNCrJE\nOk2evppWSUtf7Xltd10QqN1U0WmnnWZ9V+ggfCVk2sHHDiN6KPFiXCHuhQirYNoRlzLaO3To0F6+\nKaJT1m9WpWRZdUV95K27FOH3UtYSabDHIsSSZ/yM9u3bZ/bu3dvUJZBd7humT/HJgkj5KEpafOwV\n1UkRSIdA5S0u7733XlOLzznnnMZvYqG4uzNjRbjuuuu82aCR2CaufixtRmeRK6+8Ug5bfpfVXpa8\n5hWccEWIzXLnnXeasEVJrkd9kxZ83LgubplReeLOMfAywPosDPgMrjLAptEVqw2+LnzyCmUQnn/K\nlCk2GN2ECRPMypUrbbEDBw60u4azc7h8xJmblwXOsQkqzutsI5ClLXn1j8oveqgjbhQ6ek4RqA4C\nlbe48AbIQ5IpE2K54EdBfAlxWoUIuGTA7RoesN2WOP0kbkacjmW0l+BieeKuiL4MXC7pIGAfn2Bb\nA3PooYfagU3Sut9YWN5///2mFUVyPc9KLsgYVgHfBZ8VBlmsHOIDk1Rn0ueJqktenKgXLlxoJIBc\nsAVA21gsYQsLxCdYqm02bNhgrS/odf3113ctcq6SlqR3kKZTBCqAQCfXXhdVF3v5BNA2fQLi0ig+\nIDNN+/+E0/KbuC2uuHFc3LLcNBy7ZRFbJUrIL+nC9ch5vt29kNzzHIfztYrjQv1Z2hult5wjpkXw\nVio/M38Tg0b2cQq3L8tvypK4NlmUIoZLsCopS9au5AksTpn2zMmST2Lc0O/E8Ckyrgn6dHOvIo3T\n0pXbVytVBEpDAIfVSgoDuxvcLIpskCY8cBL0LSq4HGllMI0qS0CSNHznJS4QDspwiQ765tlkMWl7\npT2tvhnAithoLnBA7tUHLoZJj2kXZeUR6ipyQM6jS9K8DPpZgswlHawpn00VhbDwu0wRAhPsg1T
I\n/dVOV+7hqvV5uzbpdUWgryNQWeLS1zuu7PYH+x/1PPzww4VVAzF0CVpSwkIe8uYVLC3sTlxVgbyk\nJRXtCA/XwQRLVKcHd6w63ANFRGhu1aeQlrSYtSpLzysCioA/CPRDleABoqIINCGAY+a9995b+Ioe\nHKTZIRp/k5/+9Kfm17/+dVO9n/nMZ8zJJ59sV6cUuTP07bffbuuZPXt2U31V+oHPC6uPAutIYrVb\n+busWLHCxsfBQZz9hLohtAc/LnYCX7RoUaEB9CgbnDQoXzd6VutUBMpFQIlLufhWtnScK4niG7yJ\npxoofW0wS4Vx+k3r7Opbe3AypW+StiPKKXXmzJl26wQf8KAtM2bMMO+8845Zu3ZtIURDSYtvd63q\nowgUi0Dll0MXC4eWJgjwpnrjjTeaZcuWyanKfmM9GjBgQOLB3ueGYkXgkzRYHWkhB3wQSAsr7oi0\nm5T8lIkH9xk7UAdTgmbMmDENPbPWqaQlK3KaTxGoDgJqcalOX3VcUwbHsWPH2kGuyiZ3pp6mTZtm\nAr+djmNYZoX0D5ssJukb0gaO4Ja0FGXZKLptQqqy6qekpege0fIUAT8RUIuLn/3ihVbE5mCDw+XL\nl3uhTxYlsLbgxsWu4Aze7idLeT7lSROsjuB77Kz96KOPJiI63WjnggULrL/L1Vdfnbp6JS2pIdMM\nikBlEVCLS2W7rjOKM9BjdeGbaYeqyUknnWTmzJkTGfiMNrmCT48P0yeuTkmO2/m9MKhjmfFleiiu\nTUxpMWUULJc2SR2plbTEIarXFIH6IaDEpX59WniLMOF/8MEH1heh8MJLLJBw82vWrEm8MopBk8Hd\nlaRTMW6ebhyL7lERbM866yzrANut1UNp8YCIECGZvgu3J1yWkpYwIvpbEag/Akpc6t/HuVvIoMgA\nft9990VaLnJXUEIBRVkZKCeIBdKkYbvBtClxh39gRXLJFsvAd+zYYZ588skOa5KvOpZrs0R6+/bt\nLQsKt7VlQr2gCCgCtUJAiUuturO8xjBIMGXkwxLadq2EaJVpZQhPMbHU2qdpNMiWOOyiW1WXtGN1\n4Z6bPn16ry6nD3wmkL0U1hOKgCJQGAJKXAqDsv4FMfXy0EMPma1btzYGRh9bzYDH8lqcPTsh+JhA\nDlxxrR7u+U4do9Ott95qjjrqqMS+Ip3SLWk9QpaZvhMiRl4lLUkR1HSKQD0RUOJSz34trVV5l6yW\npthHBfuiX3iKqdOOvxAXrC3oceyxx5YNe2nljx8/3owYMaJhdVHSUhrUWrAiUBkElLhUpqv8UdQX\ncuAiwvTQ3LlzvY1TIs6zrs5lTjHRR0inrE5uu4o8hqhMnTrV+rooaSkSWS1LEaguAkpcqtt3XdWc\ngTHYuNAGNev2EmJIAUtokazBy7oBZtQUU1F+G5CiKvgjJcGdJe3XXHONue6665Ik1zSKgCJQcwQ+\nXvP2afNKQoA3eXxe8Cfp5moj3sJx4Jw4caKN1+L6QpTU9MKKxaE37NRLe1zJMsW0adMmG4+m24TS\nbUee42D3aruXUZ4yNK8ioAjUBwG1uNSnL7vSEiEORKa97bbbeg3EZSmFleW73/2uuf/++003dzgu\nq31SbtQUUzvHX6xh/fv3r6xTrrRdvvHTgSCHHaDlun4rAopA30JAiUvf6u9SWsvgin8JIeVnzZpl\nCNlepuWDMP7Ud8wxx1irT9hqUUojPSo07PiLau4UE1MrS5YsaTrnkfqZVKnT1FcmADSTIqAINBBQ\n4tKAQg/yIoD1Zf78+Wbbtm2WwLAipChSATl66qmnbFAy9Fy4cKE5//zz86pcm/wyxfThhx+ac845\np7KxW1p1yJQpU2xsnqpE/23VDj2vCCgC+RHQTRbzY6glfIQAb/1EaCVU+759++xyXAYcoqDiiJpW\nICtYVygDX49169ZZwkI0VSUtzWiCPZ+DDjrI4BOCYJmpiww
dOtTeU3Vpj7ZDEVAEsiOgFpfs2GnO\nNghAPFh5tH79erNhwwYzYMAAO71DXA6EnaddYQfjAwcOmLfeesu88cYbNlQ9g/All1xiLrjggsKs\nN26ddTuGJBIgcOnSpbVqGg7HixcvrtzWBbXqBG2MIuAJArqqyJOOqKMa+Llceumljf2NsABATvbv\n328JCtNKrgwaNMgMHDjQjB492nzjG9+olY+G284yjyF+WCfqJljcVBQBRUARAAElLnofdAwBlufW\nZYlux0DTiiwCOOfiO6WiCCgCioD6uOg9oAgoAt4jgJN3Fj8p7xumCioCikBqBJS4pIZMMygCioAi\noAgoAopAtxBQ4tIt5LVeRUARSIyAWlsSQ6UJFYHaI6DEpfZdrA1UBKqPAFFzZZl39VujLVAEFIE8\nCChxyYOe5lUEPEOAUP/E0FFRBBQBRaCuCChxqWvParv6JAKDBw82e/furV3bWVF04okn1q5d2iBF\nQBFIj4ASl/SYaQ5FwFsE2IBx9erV3uqXVTGCEhLjR0URUAQUASUueg8oAjVCgKB/WCbqFO6f7iGS\n8umnn16jntKmKAKKQFYElLhkRU7zKQKeIjBy5Ejz3HPPeapderUgYe+9954GL0wPneZQBGqJgBKX\nWnarNqovIzBq1Ci70WVdMHj11VfNxIkT69IcbYcioAjkRECJS04ANbsi4BsC7JyNlaIusU8WLVpk\nIGMqioAioAiAgBIXvQ8UgRoigIXirrvuqnzLfvzjH9tpIsiYiiKgCCgCINCvJxCFQhFQBOqFANaW\nL37xi+bnP/+5wWG3qjJ+/HgzYsQIM3369Ko2QfVWBBSBghFQi0vBgGpxioAPCLAp4fDhw83y5ct9\nUCeTDlhbiN9y9dVXZ8qvmRQBRaCeCKjFpZ79qq1SBKyfC3FdCJcPkamaqLWlaj2m+ioCnUFALS6d\nwVlrUQQ6jsDnPvc5c9ttt1VymmXFihXm7bffVmtLx+8arVAR8B8Btbj430eqoSKQGYHf/va3BqvL\nvHnzzKRJkzKX08mM4p+zZs0a66fTybq1LkVAEfAfASUu/veRaqgI5EIAX5GxY8eazZs3VyKI23nn\nnWfOPfdcM3v27Fzt1syKgCJQTwR0qqie/aqtUgQaCLC6aNasWWbChAkGC4zPMnPmTKuekhafe0l1\nUwS6i4BaXLqLv9auCHQMAUjBm2++adauXevlEmn027hxo9m6dauX+nWso7QiRUARiEVALS6x8OhF\nRaA+CCxYsMAMGzbMjBkzxjvLC6Rl1apV5vHHH1fSUp9bTluiCJSCgBKXUmDVQhUBPxFwyYsPO0gz\ndSWWoKr44PjZs6qVItB3EFDi0nf6WluqCFgEIC84v+IEu2nTpq6hwuohrD8yfcXybRVFQBFQBNoh\noMSlHUJ6XRGoIQI4vz7yyCPmqquuMlOmTOn41BFxWnAahkBhaanytgQ1vD20SYqA1wgocfG6e1Q5\nRaA8BNi4kL2MEGK93HPPPeVV9lHJLM3G0sOOz8Rp0dVDpUOuFSgCtUNAVxXVrku1QYpAegQgFPPn\nz7fRar/+9a/biLVFWkGefvpps3LlSrv3UJWC4aVHUnMoAopA2QgocSkbYS1fEagQAhCYBx54wCxb\ntsxMnjzZjB492owcOTLTVA5lPfvss+b+++83AwYMMDNmzDBf/vKXM5VVIQhVVUVAESgZASUuJQOs\nxSsCVUQAx9kXXnjBrFu3zqxevdr6orCUeuDAgWbIkCHm4IMP7tUsdnI+cOCA2bFjh81z4oknmnHj\nxpnLLrusEhF7ezVITygCioCXCChx8bJbVClFwC8EsJ7s2bPHEpMtW7ZEKjdo0KAGsTnuuOMquSN1\nZMP0pCKgCHiFgBIXr7pDlVEEFAFFQBFQBBSBOAR0VVEcOnpNEVAEFAFFQBFQBLxCQImLV92hyigC\nioAioAgoAopAHAJKXOL
Q0WuKgCKgCCgCioAi4BUCSly86g5VRhFQBBQBRUARUATiEFDiEoeOXlME\nFAFFQBFQBBQBrxBQ4uJVd6gyioAioAgoAoqAIhCHgBKXOHT0miKgCCgCioAioAh4hYASF6+6Q5VR\nBBQBRUARUAQUgTgElLjEoaPXFAFFQBFQBBQBRcArBP4fntNQJrCufL0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 27, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review = \"The movie was excellent\"\n", + "\n", + "Image(filename='sentiment_network_pos.png')" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "# Project 2: Creating the Input/Output Data" + ] + }, + { + "cell_type": "code", + "execution_count": 74, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "74074\n" + ] + } + ], + "source": [ + "vocab = set(total_counts.keys())\n", + "vocab_size = len(vocab)\n", + "print(vocab_size)" + ] + }, + { + "cell_type": "code", + "execution_count": 75, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['',\n", + " 'inhabitants',\n", + " 'goku',\n", + " 'stunts',\n", + " 'catepillar',\n", + " 'kristensen',\n", + " 'senegal',\n", + " 'goddess',\n", + " 'distroy',\n", + " 'unexplainably',\n", + " 'concoctions',\n", + " 'petite',\n", + " 'scribe',\n", + " 'stevson',\n", + " 'sctv',\n", + " 'soundscape',\n", + " 'rana',\n", + " 'metamorphose',\n", + " 'immortalizer',\n", + " 'henstridge',\n", + " 'planning',\n", + " 'akiva',\n", + " 'plod',\n", + " 'eko',\n", + " 'orderly',\n", + " 'zeleznice',\n", + " 'verbose',\n", + " 'amplify',\n", + " 'resonation',\n", + " 'critize',\n", + " 'jefferies',\n", + " 'mountainbillies',\n", + " 'steinbichler',\n", + " 'vowel',\n", + " 'rafe',\n", + " 'bonbons',\n", + " 'tulipe',\n", + " 'clot',\n", + " 'distended',\n", + " 'his',\n", + " 'impatiently',\n", + " 'unfortuntly',\n", + " 'lung',\n", + " 'scapegoats',\n", + " 'muzzle',\n", + " 'pscychosexual',\n", + " 'outbid',\n", + " 'obit',\n", + " 
'sideshows',\n", + " 'jugde',\n", + " 'particolare',\n", + " 'kevloun',\n", + " 'masterful',\n", + " 'quartier',\n", + " 'unravelling',\n", + " 'necessarily',\n", + " 'antiques',\n", + " 'strutts',\n", + " 'tilts',\n", + " 'disconcert',\n", + " 'dossiers',\n", + " 'sorriest',\n", + " 'blart',\n", + " 'iberia',\n", + " 'situations',\n", + " 'frmann',\n", + " 'daniell',\n", + " 'rays',\n", + " 'pried',\n", + " 'khoobsurat',\n", + " 'leavitt',\n", + " 'caiano',\n", + " 'sagan',\n", + " 'attractiveness',\n", + " 'kitaparaporn',\n", + " 'hamilton',\n", + " 'massages',\n", + " 'reasonably',\n", + " 'horgan',\n", + " 'chemist',\n", + " 'audrey',\n", + " 'jana',\n", + " 'dutch',\n", + " 'override',\n", + " 'spasms',\n", + " 'resumed',\n", + " 'stinson',\n", + " 'widows',\n", + " 'stonewall',\n", + " 'palatial',\n", + " 'neuman',\n", + " 'abandon',\n", + " 'anglophile',\n", + " 'marathon',\n", + " 'chevette',\n", + " 'unscary',\n", + " 'eponymously',\n", + " 'spoilerific',\n", + " 'fleashens',\n", + " 'brigand',\n", + " 'politeness',\n", + " 'clued',\n", + " 'dermatonecrotic',\n", + " 'grady',\n", + " 'mulligan',\n", + " 'ol',\n", + " 'bertolucci',\n", + " 'incubation',\n", + " 'oldboy',\n", + " 'snden',\n", + " 'plaintiffs',\n", + " 'fk',\n", + " 'deply',\n", + " 'franchot',\n", + " 'cyhper',\n", + " 'glorifying',\n", + " 'mazovia',\n", + " 'elizabeth',\n", + " 'palestine',\n", + " 'robby',\n", + " 'wongo',\n", + " 'moshing',\n", + " 'eeeee',\n", + " 'doltish',\n", + " 'bree',\n", + " 'postponed',\n", + " 'gunslinger',\n", + " 'debacles',\n", + " 'kamm',\n", + " 'herman',\n", + " 'rapture',\n", + " 'rolando',\n", + " 'tetsuothe',\n", + " 'premises',\n", + " 'bruck',\n", + " 'loosely',\n", + " 'boylen',\n", + " 'proportions',\n", + " 'grecianized',\n", + " 'wodehousian',\n", + " 'encapsuling',\n", + " 'partly',\n", + " 'posative',\n", + " 'calms',\n", + " 'stadling',\n", + " 'austrailia',\n", + " 'shortland',\n", + " 'wheeling',\n", + " 'darkie',\n", + " 'mckellar',\n", + " 
'cushy',\n", + " 'ooookkkk',\n", + " 'milky',\n", + " 'unfolded',\n", + " 'degrades',\n", + " 'authenticating',\n", + " 'rotheroe',\n", + " 'beart',\n", + " 'neath',\n", + " 'grispin',\n", + " 'intoxicants',\n", + " 'nnette',\n", + " 'slinging',\n", + " 'tsukamoto',\n", + " 'stows',\n", + " 'suddenness',\n", + " 'waqt',\n", + " 'degrading',\n", + " 'camazotz',\n", + " 'blarney',\n", + " 'shakher',\n", + " 'delinquency',\n", + " 'tomreynolds',\n", + " 'insecticide',\n", + " 'charlton',\n", + " 'hare',\n", + " 'wayland',\n", + " 'nakada',\n", + " 'urbane',\n", + " 'sadomasochistic',\n", + " 'larnia',\n", + " 'hyping',\n", + " 'yr',\n", + " 'hebert',\n", + " 'accentuating',\n", + " 'deathrow',\n", + " 'galligan',\n", + " 'unmediated',\n", + " 'treble',\n", + " 'alphabet',\n", + " 'soad',\n", + " 'donen',\n", + " 'lord',\n", + " 'recess',\n", + " 'handsome',\n", + " 'center',\n", + " 'vignettes',\n", + " 'rescuers',\n", + " 'pairings',\n", + " 'uselful',\n", + " 'sanders',\n", + " 'nots',\n", + " 'hatsumomo',\n", + " 'appleby',\n", + " 'tampax',\n", + " 'sprinkling',\n", + " 'defacing',\n", + " 'lofty',\n", + " 'opaque',\n", + " 'tlc',\n", + " 'romagna',\n", + " 'tablespoons',\n", + " 'bernhard',\n", + " 'verger',\n", + " 'acumen',\n", + " 'percentages',\n", + " 'wendingo',\n", + " 'resonating',\n", + " 'vntoarea',\n", + " 'redundancies',\n", + " 'red',\n", + " 'pitied',\n", + " 'belying',\n", + " 'gleefulness',\n", + " 'bibbidi',\n", + " 'heiligt',\n", + " 'gitane',\n", + " 'journalist',\n", + " 'focusing',\n", + " 'plethora',\n", + " 'citizen',\n", + " 'coster',\n", + " 'clunkers',\n", + " 'deplorable',\n", + " 'forgive',\n", + " 'proplems',\n", + " 'magwood',\n", + " 'bankers',\n", + " 'aqua',\n", + " 'donated',\n", + " 'disbelieving',\n", + " 'acomplication',\n", + " 'immediately',\n", + " 'contrasted',\n", + " 'reidelsheimer',\n", + " 'fox',\n", + " 'springs',\n", + " 'toolbox',\n", + " 'contacting',\n", + " 'ace',\n", + " 'washrooms',\n", + " 'raving',\n", + " 
'dynamism',\n", + " 'mae',\n", + " 'sky',\n", + " 'disharmony',\n", + " 'untutored',\n", + " 'icarus',\n", + " 'taint',\n", + " 'kargil',\n", + " 'captain',\n", + " 'paucity',\n", + " 'fits',\n", + " 'tumbles',\n", + " 'amer',\n", + " 'bueller',\n", + " 'redubbed',\n", + " 'cleansed',\n", + " 'kollos',\n", + " 'shara',\n", + " 'humma',\n", + " 'felichy',\n", + " 'outa',\n", + " 'piglets',\n", + " 'gombell',\n", + " 'supermen',\n", + " 'superlow',\n", + " 'enhance',\n", + " 'goode',\n", + " 'shalt',\n", + " 'kubanskie',\n", + " 'zenith',\n", + " 'ananda',\n", + " 'ocd',\n", + " 'matlin',\n", + " 'nosed',\n", + " 'presumptuous',\n", + " 'rerun',\n", + " 'toyko',\n", + " 'mazar',\n", + " 'sundry',\n", + " 'bilb',\n", + " 'fugly',\n", + " 'orchestrating',\n", + " 'prosaically',\n", + " 'maricarmen',\n", + " 'moveis',\n", + " 'conelly',\n", + " 'estrange',\n", + " 'lusciously',\n", + " 'seasonings',\n", + " 'sums',\n", + " 'delirious',\n", + " 'quincey',\n", + " 'flesh',\n", + " 'tootsie',\n", + " 'ai',\n", + " 'tenma',\n", + " 'appropriations',\n", + " 'chainsaw',\n", + " 'ides',\n", + " 'surrogacy',\n", + " 'pungent',\n", + " 'gallon',\n", + " 'damaso',\n", + " 'caribou',\n", + " 'perico',\n", + " 'supplying',\n", + " 'ro',\n", + " 'yuy',\n", + " 'valium',\n", + " 'debuted',\n", + " 'robbin',\n", + " 'mounts',\n", + " 'interpolated',\n", + " 'aetv',\n", + " 'plummer',\n", + " 'competence',\n", + " 'toadies',\n", + " 'dubiel',\n", + " 'clavichord',\n", + " 'asunder',\n", + " 'sublety',\n", + " 'airfix',\n", + " 'stoltzfus',\n", + " 'ruth',\n", + " 'fluorescent',\n", + " 'improves',\n", + " 'rebenga',\n", + " 'russells',\n", + " 'deliberation',\n", + " 'zsa',\n", + " 'dardino',\n", + " 'macs',\n", + " 'servile',\n", + " 'jlb',\n", + " 'apallonia',\n", + " 'crossbows',\n", + " 'locus',\n", + " 'mislead',\n", + " 'corey',\n", + " 'blundered',\n", + " 'jeopardizes',\n", + " 'disorganized',\n", + " 'discuss',\n", + " 'longish',\n", + " 'tieing',\n", + " 'ledger',\n", + " 
'speechifying',\n", + " 'amitabhz',\n", + " 'bbc',\n", + " 'chimayo',\n", + " 'pranked',\n", + " 'superman',\n", + " 'aggravated',\n", + " 'rifleman',\n", + " 'yvone',\n", + " 'radiant',\n", + " 'galico',\n", + " 'debris',\n", + " 'waking',\n", + " 'btw',\n", + " 'havnt',\n", + " 'francen',\n", + " 'chattered',\n", + " 'scathed',\n", + " 'pic',\n", + " 'ceremonies',\n", + " 'watergate',\n", + " 'betsy',\n", + " 'majorca',\n", + " 'meercat',\n", + " 'noirs',\n", + " 'grunts',\n", + " 'drecky',\n", + " 'tribulations',\n", + " 'avery',\n", + " 'talladega',\n", + " 'eights',\n", + " 'dumbing',\n", + " 'alloimono',\n", + " 'scrutinising',\n", + " 'geta',\n", + " 'beltrami',\n", + " 'pvc',\n", + " 'horse',\n", + " 'tiburon',\n", + " 'huitime',\n", + " 'ripple',\n", + " 'loitering',\n", + " 'forensics',\n", + " 'nearly',\n", + " 'elizabethan',\n", + " 'ellington',\n", + " 'uzi',\n", + " 'sicily',\n", + " 'camion',\n", + " 'motivated',\n", + " 'rung',\n", + " 'gao',\n", + " 'licitates',\n", + " 'protocol',\n", + " 'smirker',\n", + " 'torin',\n", + " 'newlywed',\n", + " 'rich',\n", + " 'dismay',\n", + " 'skyler',\n", + " 'moonwalks',\n", + " 'haranguing',\n", + " 'sunburst',\n", + " 'grifter',\n", + " 'undersold',\n", + " 'chearator',\n", + " 'marino',\n", + " 'scala',\n", + " 'conditioner',\n", + " 'ulysses',\n", + " 'lamarre',\n", + " 'figueroa',\n", + " 'flane',\n", + " 'allllllll',\n", + " 'slide',\n", + " 'lateness',\n", + " 'selbst',\n", + " 'gandhis',\n", + " 'dramatizing',\n", + " 'catchphrase',\n", + " 'doable',\n", + " 'stadiums',\n", + " 'alexanderplatz',\n", + " 'pandemonium',\n", + " 'misrepresents',\n", + " 'earth',\n", + " 'mounties',\n", + " 'seeker',\n", + " 'cheat',\n", + " 'outbreaks',\n", + " 'snowstorm',\n", + " 'baur',\n", + " 'schedules',\n", + " 'bathetic',\n", + " 'incorrect',\n", + " 'johnathon',\n", + " 'rosanne',\n", + " 'mundanely',\n", + " 'cauldrons',\n", + " 'forrest',\n", + " 'poky',\n", + " 'legislation',\n", + " 'womanness',\n", + " 
'spender',\n", + " 'crazy',\n", + " 'rational',\n", + " 'terrell',\n", + " 'zero',\n", + " 'coincides',\n", + " 'thoughout',\n", + " 'mathew',\n", + " 'narnia',\n", + " 'naseeruddin',\n", + " 'bucks',\n", + " 'affronts',\n", + " 'topple',\n", + " 'degree',\n", + " 'preyed',\n", + " 'passionately',\n", + " 'defeats',\n", + " 'torchwood',\n", + " 'sources',\n", + " 'botticelli',\n", + " 'compactor',\n", + " 'kosturica',\n", + " 'waiving',\n", + " 'gunnar',\n", + " 'stiffler',\n", + " 'fwd',\n", + " 'kawajiri',\n", + " 'eleanor',\n", + " 'sistahs',\n", + " 'soulhunter',\n", + " 'belies',\n", + " 'wrathful',\n", + " 'americans',\n", + " 'ferdinandvongalitzien',\n", + " 'kendra',\n", + " 'weirdy',\n", + " 'unforgivably',\n", + " 'chepart',\n", + " 'tatta',\n", + " 'departmentthe',\n", + " 'dig',\n", + " 'blatty',\n", + " 'marionettes',\n", + " 'atop',\n", + " 'chim',\n", + " 'saurian',\n", + " 'woes',\n", + " 'cloudscape',\n", + " 'resignedly',\n", + " 'unrooted',\n", + " 'keuck',\n", + " 'hitlerian',\n", + " 'stylings',\n", + " 'crewed',\n", + " 'bedeviled',\n", + " 'unfurnished',\n", + " 'reedus',\n", + " 'circumstances',\n", + " 'grasped',\n", + " 'smurfettes',\n", + " 'fn',\n", + " 'dishwashers',\n", + " 'roadie',\n", + " 'ruthlessness',\n", + " 'refrains',\n", + " 'lampooning',\n", + " 'semblance',\n", + " 'richart',\n", + " 'legions',\n", + " 'gwenneth',\n", + " 'enmity',\n", + " 'assess',\n", + " 'manufacturer',\n", + " 'bullosa',\n", + " 'outrun',\n", + " 'hogan',\n", + " 'chekov',\n", + " 'blithe',\n", + " 'code',\n", + " 'drillings',\n", + " 'revolvers',\n", + " 'aredavid',\n", + " 'robespierre',\n", + " 'achcha',\n", + " 'boyfriendhe',\n", + " 'wallow',\n", + " 'toga',\n", + " 'graphed',\n", + " 'tonking',\n", + " 'going',\n", + " 'bosnians',\n", + " 'willy',\n", + " 'rohauer',\n", + " 'fim',\n", + " 'forbidding',\n", + " 'yew',\n", + " 'rationalised',\n", + " 'shimomo',\n", + " 'opposition',\n", + " 'landis',\n", + " 'minded',\n", + " 'despicableness',\n", + 
" 'easting',\n", + " 'arghhhhh',\n", + " 'ebb',\n", + " 'trialat',\n", + " 'protected',\n", + " 'negras',\n", + " 'rick',\n", + " 'muti',\n", + " 'tracker',\n", + " 'shawl',\n", + " 'differentiates',\n", + " 'sweetheart',\n", + " 'deepened',\n", + " 'manmohan',\n", + " 'trevethyn',\n", + " 'brain',\n", + " 'incomprehensibly',\n", + " 'piercing',\n", + " 'pasadena',\n", + " 'shtick',\n", + " 'ute',\n", + " 'viggo',\n", + " 'supersedes',\n", + " 'ack',\n", + " 'cites',\n", + " 'taurus',\n", + " 'relevent',\n", + " 'minidress',\n", + " 'philosopher',\n", + " 'bel',\n", + " 'mahattan',\n", + " 'moden',\n", + " 'compiling',\n", + " 'advertising',\n", + " 'rogues',\n", + " 'unimaginative',\n", + " 'subpaar',\n", + " 'ademir',\n", + " 'darkly',\n", + " 'saturate',\n", + " 'fledgling',\n", + " 'breaths',\n", + " 'padre',\n", + " 'aszombi',\n", + " 'pachabel',\n", + " 'incalculable',\n", + " 'ozone',\n", + " 'sped',\n", + " 'mpho',\n", + " 'rawail',\n", + " 'forbid',\n", + " 'synth',\n", + " 'guttersnipe',\n", + " 'reputedly',\n", + " 'holiness',\n", + " 'unessential',\n", + " 'hampden',\n", + " 'asylum',\n", + " 'bolye',\n", + " 'strangers',\n", + " 'rantzen',\n", + " 'farrellys',\n", + " 'vigourous',\n", + " 'cantinflas',\n", + " 'enshrined',\n", + " 'boris',\n", + " 'expetations',\n", + " 'replaying',\n", + " 'prestige',\n", + " 'bukater',\n", + " 'overpaid',\n", + " 'exhude',\n", + " 'backsides',\n", + " 'topless',\n", + " 'sufferings',\n", + " 'nitwits',\n", + " 'cordova',\n", + " 'incensed',\n", + " 'danira',\n", + " 'unrelenting',\n", + " 'disabling',\n", + " 'ferdy',\n", + " 'gerard',\n", + " 'drewitt',\n", + " 'mero',\n", + " 'monsters',\n", + " 'precautions',\n", + " 'lamping',\n", + " 'relinquish',\n", + " 'demy',\n", + " 'drink',\n", + " 'chamberlin',\n", + " 'unjustifiably',\n", + " 'cove',\n", + " 'floodwaters',\n", + " 'searing',\n", + " 'isral',\n", + " 'ling',\n", + " 'grossness',\n", + " 'pickier',\n", + " 'pax',\n", + " 'wierd',\n", + " 'tereasa',\n", + " 
'smog',\n", + " 'girotti',\n", + " 'spat',\n", + " 'sera',\n", + " 'noxious',\n", + " 'misbehaving',\n", + " 'scouts',\n", + " 'refreshments',\n", + " 'autobiographic',\n", + " 'shi',\n", + " 'toyomichi',\n", + " 'bits',\n", + " 'psychotics',\n", + " 'barzell',\n", + " 'colt',\n", + " 'shivering',\n", + " 'pugilist',\n", + " 'gladiator',\n", + " 'dryer',\n", + " 'reissues',\n", + " 'scrivener',\n", + " 'predicable',\n", + " 'objection',\n", + " 'marmalade',\n", + " 'seems',\n", + " 'spellbind',\n", + " 'trifecta',\n", + " 'innovator',\n", + " 'shriekfest',\n", + " 'inthused',\n", + " 'contestants',\n", + " 'goody',\n", + " 'samotri',\n", + " 'serviced',\n", + " 'nozires',\n", + " 'ins',\n", + " 'mutilating',\n", + " 'dupes',\n", + " 'launius',\n", + " 'widescreen',\n", + " 'joo',\n", + " 'discretionary',\n", + " 'enlivens',\n", + " 'bushes',\n", + " 'chills',\n", + " 'header',\n", + " 'activist',\n", + " 'gethsemane',\n", + " 'phoenixs',\n", + " 'wreathed',\n", + " 'sacrine',\n", + " 'electrifyingly',\n", + " 'basely',\n", + " 'ghidora',\n", + " 'binder',\n", + " 'dogfights',\n", + " 'sugar',\n", + " 'doddsville',\n", + " 'porkys',\n", + " 'scattershot',\n", + " 'refunded',\n", + " 'rudely',\n", + " 'insteadit',\n", + " 'zatichi',\n", + " 'eurotrash',\n", + " 'radioraptus',\n", + " 'hurls',\n", + " 'boogeman',\n", + " 'weighs',\n", + " 'danniele',\n", + " 'converging',\n", + " 'hypothermia',\n", + " 'glorfindel',\n", + " 'birthdays',\n", + " 'attentive',\n", + " 'mallepa',\n", + " 'spacewalk',\n", + " 'manoy',\n", + " 'bombshells',\n", + " 'farts',\n", + " 'lyoko',\n", + " 'southron',\n", + " 'destruction',\n", + " 'flemming',\n", + " 'manhole',\n", + " 'elainor',\n", + " 'bowersock',\n", + " 'lowly',\n", + " 'wfst',\n", + " 'limousines',\n", + " 'skolimowski',\n", + " 'saban',\n", + " 'koen',\n", + " 'malaysia',\n", + " 'uwi',\n", + " 'cyd',\n", + " 'apeing',\n", + " 'bonecrushing',\n", + " 'dini',\n", + " 'merest',\n", + " 'janina',\n", + " 'chemotrodes',\n", + " 
'trials',\n", + " 'authorize',\n", + " 'whilhelm',\n", + " 'asthmatic',\n", + " 'broads',\n", + " 'missteps',\n", + " 'embittered',\n", + " 'chandeliers',\n", + " 'seeming',\n", + " 'miscalculate',\n", + " 'recommeded',\n", + " 'schoolwork',\n", + " 'coy',\n", + " 'mcconaughey',\n", + " 'philosophically',\n", + " 'waver',\n", + " 'fanny',\n", + " 'mestressat',\n", + " 'unwatchably',\n", + " 'saggy',\n", + " 'topness',\n", + " 'dwellings',\n", + " 'breakup',\n", + " 'hasselhoff',\n", + " 'superstars',\n", + " 'replay',\n", + " 'aggravates',\n", + " 'balances',\n", + " 'urging',\n", + " 'snidely',\n", + " 'aleksandar',\n", + " 'hildy',\n", + " 'kazuhiro',\n", + " 'slayer',\n", + " 'tangy',\n", + " 'brussels',\n", + " 'horne',\n", + " 'masayuki',\n", + " 'molden',\n", + " 'unravel',\n", + " 'goodtime',\n", + " 'interrogates',\n", + " 'bismillahhirrahmannirrahim',\n", + " 'rowboat',\n", + " 'dumann',\n", + " 'datedness',\n", + " 'astrotheology',\n", + " 'dekhiye',\n", + " 'valga',\n", + " 'kata',\n", + " 'wipes',\n", + " 'hostilities',\n", + " 'sentimentalising',\n", + " 'documentary',\n", + " 'salesman',\n", + " 'virtue',\n", + " 'unreasonably',\n", + " 'haver',\n", + " 'cei',\n", + " 'unglamorised',\n", + " 'balky',\n", + " 'complementary',\n", + " 'paychecks',\n", + " 'mnica',\n", + " 'wada',\n", + " 'ily',\n", + " 'prc',\n", + " 'ennobling',\n", + " 'functionality',\n", + " 'dissociated',\n", + " 'elk',\n", + " 'throbbing',\n", + " 'tempe',\n", + " 'linoleum',\n", + " 'photogrsphed',\n", + " 'bottacin',\n", + " 'hipper',\n", + " 'titillating',\n", + " 'barging',\n", + " 'untie',\n", + " 'sacchetti',\n", + " 'gnat',\n", + " 'roedel',\n", + " 'cohabitation',\n", + " 'performs',\n", + " 'sales',\n", + " 'migrs',\n", + " 'teachs',\n", + " 'nanavati',\n", + " 'fresco',\n", + " 'davison',\n", + " 'obstinate',\n", + " 'burglar',\n", + " 'masue',\n", + " 'dickory',\n", + " 'grills',\n", + " 'appelagate',\n", + " 'linkage',\n", + " 'enables',\n", + " 'loesser',\n", + " 
'patties',\n", + " 'prudent',\n", + " 'mallorquins',\n", + " 'nativetex',\n", + " 'suprise',\n", + " 'drippy',\n", + " 'quill',\n", + " 'speeded',\n", + " 'farscape',\n", + " 'saddening',\n", + " 'centuries',\n", + " 'mos',\n", + " 'improvisationally',\n", + " 'neccessarily',\n", + " 'transmitter',\n", + " 'tankers',\n", + " 'latte',\n", + " 'mechanisation',\n", + " 'faracy',\n", + " 'synthetically',\n", + " 'thoughtless',\n", + " 'rake',\n", + " 'ropes',\n", + " 'desirable',\n", + " 'whitewashed',\n", + " 'donal',\n", + " 'crabby',\n", + " 'lifeless',\n", + " 'perfidy',\n", + " 'teresa',\n", + " 'bulldog',\n", + " 'cockamamie',\n", + " 'rasberries',\n", + " 'notethe',\n", + " 'captivity',\n", + " 'chiseling',\n", + " 'smaller',\n", + " 'clampets',\n", + " 'alerts',\n", + " 'tough',\n", + " 'wellingtonian',\n", + " 'aaaahhhhhhh',\n", + " 'dither',\n", + " 'incertitude',\n", + " 'florentine',\n", + " 'imperioli',\n", + " 'licking',\n", + " 'disparagement',\n", + " 'artfully',\n", + " 'feds',\n", + " 'fumiya',\n", + " 'tearfully',\n", + " 'lanchester',\n", + " 'undertaken',\n", + " 'longlost',\n", + " 'netted',\n", + " 'carrell',\n", + " 'uncompelling',\n", + " 'reliefs',\n", + " 'leona',\n", + " 'autorenfilm',\n", + " 'unfriendly',\n", + " 'typewriter',\n", + " 'shifted',\n", + " 'bertrand',\n", + " 'blesses',\n", + " 'tricking',\n", + " 'fireflies',\n", + " 'zanes',\n", + " 'unknowingly',\n", + " 'unnerve',\n", + " 'caning',\n", + " 'flat',\n", + " 'recluse',\n", + " 'dcreasy',\n", + " 'chipmunk',\n", + " 'dipper',\n", + " 'musee',\n", + " 'cousin',\n", + " 'shys',\n", + " 'berserkers',\n", + " 'eve',\n", + " 'conflagration',\n", + " 'irks',\n", + " 'restricts',\n", + " 'parsing',\n", + " 'positronic',\n", + " 'copout',\n", + " 'khala',\n", + " 'swiftness',\n", + " 'higginson',\n", + " 'imprint',\n", + " 'walter',\n", + " 'sundance',\n", + " 'whispering',\n", + " 'thematically',\n", + " 'underimpressed',\n", + " 'uno',\n", + " 'expressly',\n", + " 'russkies',\n", + 
" 'discos',\n", + " 'shaping',\n", + " 'verson',\n", + " 'prototype',\n", + " 'chapman',\n", + " 'trafficker',\n", + " 'semetary',\n", + " 'unrealistically',\n", + " 'lifewell',\n", + " 'rivas',\n", + " 'consequent',\n", + " 'katsu',\n", + " 'titantic',\n", + " 'jalees',\n", + " 'ranee',\n", + " 'shipbuilding',\n", + " 'gambles',\n", + " 'dispenses',\n", + " 'disfigurement',\n", + " 'bright',\n", + " 'cristian',\n", + " 'puertorricans',\n", + " 'constituent',\n", + " 'capta',\n", + " 'jewel',\n", + " 'erect',\n", + " 'farah',\n", + " 'despondently',\n", + " 'avoide',\n", + " 'inconnu',\n", + " 'headquarters',\n", + " 'sanguisga',\n", + " ...]" + ] + }, + "execution_count": 75, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "list(vocab)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 0., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 46, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "import numpy as np\n", + "\n", + "layer_0 = np.zeros((1,vocab_size))\n", + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
ITBiD8G9077fkJiJ5O56gR2HhXNgC4ltRSO/m48J5w7/z6hcuL+53Ezo7TveizvMGkFXy5M1S\nJ29wWB6cv0jZUzJxOjq/keXLl1v/kbRvlnHl6vzgIJDnfydP3jwI77XXXp3s3aYCOokSHhxxxBGd\nlElfaCEjzn+PzFE+ZlHnSEvAQCcXXnjhMP8LXrIdSXFpivomAJoTLCG+fv60DGkOOuggl9RgPUGv\nrJJmmixNHbmJSJaOdwoSzc2Zl7gRcKRBYJXOzOTSQkT8myXKx4JpDRcxDmchJI9+ru4k30V2dpL6\n8qZhXjYc8S9vmf4/Zdqy8uRNW5dLj+UBX4x+Tsm4ut238wc577zzrC6OGLnr+hYCvRDI87+TJ28v\nvbpdf9/73te5/OMf/7hznPfAd+TEZ8ONA5TLy60fNZWoq05cBFF+48fn5+PY+fa59FHfLKZwgzzT\nzZ/61KeikkWe86f2IxOETrrpGXfa6Rc1LYP/iHshZ2xFL//Zzzjsj53hKKtEq3biynG/C/sOzDK5\nhKU+gTKdj7+cNWj0sNgSQSM6dYWXDJEvvO6a374EJshOPdTpL/UML99lyRKSVT90de3y20SZcdf8\n8/7yXadHcJN0ykQvJ36+cJtJQ/1OFzDwxZ3nO6ynny587JZlhs+n/e0v7UIH+oG+TSqk9dtHGZSZ\nVVgemWZZcvDgGPrGN76Rtbpc+aKW87Jcl/P9FnAAO+4Lt0QXHMMfAqGRJvBRqETPfuNSdH0O37zl\n1u3/jvs2sOYlbpb/P8+zMiz+czvueeA/+xhrnPjPUz9N+Nh/BpM/fL3bb1dfeNzplsevD12j0ro0\nfvtJFyU+hq4sfzmvnydcnksf/ga7sPjjlj/mhtPl+R3dwpQlZul4n1TQUDd4+f9gYVCS3izhzsii\nn58nPMDHXcva2X55eYiIu6nczdytG4t6IEb9M6AHDxf6kutRH66Rxunsf4fx7taO8DUCbKXZYI3B\nl4G/34N/N8LRL32oB7yIawH+fDuiAS5RH9Jz70BQyJMnIFq47wbhN5imIcpxmNTt/y5tu8LPcvf8\nd+31n6VpiQhlxz1b3HMm6hnj1+nSue8w4XBEhG9/oHbp+WaMYyxy5yjDlygd3bM7rIufzx2DmSvb\nfXcjCnH3jMvLOOTaFVdHuJ9curzfhRCRtB2PtcI1nm//puh2jVopbAYAACaLSURBVMaGO8gvh2M6\nNwxWWv2oxycHvn69rmXpbL+utESk282MrnGS9sERVw5YR+kQ7pekv6P6L67uqPNpIjz6Az549Euo\nCwtEN0E3yEoZgjXDEQmIB7976ROnB21xAdEgJRCVrGXF1dGm8/QpOOWVuv3fpX0BoP3+cyM8gPrP\n+bRExGHLs9ivg2cQ5CDqGevycI363POKZzO6cN6d49sfYxizfMLh8lBmHIHhWjgf5VIX4ref83Hi\nt89/oY9LT51hndA3PMa5/PSLa3dcP7i0eb7jW5ih1KQd74MHCGEJW0vCLA0w/TQA1Q1MV35S/UhP\nea4Dwp3U7Rp503a2X17UPwn1O11oty/dbmY/XfiYgS6NKTWc3/+NDn6fOl3TflMGZeURBlgG1iQS\nJh/h30nKSJPGTX8kzZM2fa9yaR+DYFmEwREc7iusJpJoBPi/KIKs1en/DkILGUkj/mAbfq6lKacf\naf1nMAP+oIhPsBxJKqPthRKRMhRUmeUhwIBU5Fs3/6w+qUpKRMgTJntZW530IR9FOnwLSdb64/Ll\nsXCga56Bi7oJvw1BSDtYxLWn23n0hRByf0Xh3C3vIFxLQ5aT4FGH/7ssfY1VwU1rJHmbT4JF1jT+\nixTPI//ll5dDpyfPF9IOgtA/7hkOJmXKKAoPKpMMIAJXXHGFjXzKio0iBe/05557zgZ5Y437L37x\ni2HF/8Vf/IUhWixLnj/ykY8MW/I2LGHKHxs2bLDr93utBHDxQ4KBe
UQNxPLgfJEhy6MCl42ouMeJ\nrGWAybnnnmumT59u5s+fX2i7eqhsWJIcvOkaljWyZYPk9wjcfPPNNvzAjTfeWCgkVf3f8f9EAL6A\n8KZuDytSXETSgFCVGnujm3K+Ht3ScS2wDGSKl9Sr3Lpd9zEpvc1lshyVXW8E2KiqqA24qm4p0wKz\nZ8+2/gq9dOn1lt7req/y/etYnPJYM/yy0lpV8N3ACpJ0qsqvq6hjdOYe41MUDkXpVkU5YMAqrdGj\nR7cGjyz+IT72zhpR9lu3X2fUse8bEpCCjjXAP677FFJUu7Kc861V/bAAaWomSy+1JA8PRQYqBoum\nC235q7/6K/uQh0jEkYm48377KauIKSvqoqwihfKStIE5ewb/ItpRhP7oU/RUYBF69aMM+oA+4+P6\nAyyqJIhFtjtvW/B1cYN9UVO0WduHH4TvF+H0wsEzyn8vaz11z0c/0HampPxpqrL01tRMgPYgC9Mz\nSNFm4n5j+vDDD5tbbrllWKRDoqU6IQCQm26JmpJx6dx3t+kblybu2wUpK3O/mG6b5tGnRHRcu3Zt\np81xuvbzPHqxWRtTZ20OY8+9409TRG1uyLTVI488MmxH8X72RVF1cR9OnTp1WHuLKlvlDA4CIiKD\n09eRLXXzu8GbWq0GrUhlu5w89NBDrQ/EaaedFpkKchBMRdldKknAXjO9CEmWsO/gSV39GGij/Ebq\nSkJcpzgy0vT7zbXHffukN8m9xT0CQSFfr/vQ1VHH79NPP90cffTR5tJLL62jetKpIQiIiDSko8pU\nk8GBEL9NfZhgDbnmmmvM5s2bY2EKk4okb60UFs4XW0FwIYoYdEtfxDWf+OAEedddd5mNGzfWmlTW\nnSwl6Rf6Opgm6yTNYv2iv9gjhNDgTRT+N7CGtI1UNrEvmq6ziEjTe7AA/RnMeIvDnNy0tzPeLI86\n6iizcOFCc/zxx0eiQfuQbm2LG1gon/y9LBw8lKNM8JEKFXwSHbH2/PM//7N5+umne+pacPWZimP3\n0MCHpTGracCYAddJEX1NmZSzZs0au+rEld2Ub1lDmtJT9ddTRKT+fdQXDdnxeMuWLY17O0uidxqr\nhgObPE7+93//13z0ox+NtDK4ASrLG7ErP++3G9BuvfVWEzc1lbeOovND7sDs3nvvjSWQRdeZtjz/\nHsDHqBcZTVs+6fEVWbRoUe2tWOG2Ob27WSHDefRbCMQhICISh8yAnWcww7IwZ84cU3RckbKgZKDA\nNMx3nLWDa3lJAthE+ZcwmHKtjAEqDWZMdbCV+tKlS9Nkqzytm1Kry1QS/ek7mea9b5ICjGUhWHnS\nGOsQ1kMsWk215CTtF6XrHwIiIv3DuvY18YDBVIwJuurBtRdYSawADCxIHEnpVYd/nfooD1z4JmDb\nu9/9bhPEg6hsSgb93KDQjYz57ajbcdXmfXBzksTJ1KUt8pv7CdLTBIsW/wdTpkyxLwBN9Skrsu9U\nVjEIiIgUg2NrSuEt9ZJLLqn1Ekv3MAwCIHVddlyENcTvWEdseGv2fQTi/Ev8vGUdVz2Q520XfdRP\nh8cq+6obVi4CLkt6ua/rKjNnzrTWt6Y62NYV10HXS0Rk0O+AiPbXeVWDIyF77rlnV3+WokkIMFE3\nUzS9pq78t+yyfAvQx1lDmr5qoUwyRZ8V7WQK9mXIv//7v9upUVbS1NEiWefnQhn9oTL7h4CISP+w\nblRN7qGzePHi2jwUHQkByG7BupzloogpGddplEn9DBBpSE54ICzS/E8fsV9P0/dxcVYR3z/D4Z7l\nu19EMItucXnc/bV169ZaWiTd86Db/11c23ReCPRCQESkF0IDfJ2HT10iYfKg/vSnP232228/u8rA\nRUmN6p40RCEqf/gclgfqc8QGcoE+Wd5ayecPuP4UT7jebr/Rgby01enVLX3drxGQrtsS7G76hzHt\nl5NpN53SXEN/xPWjmx6tg88I9
xk+IYhIiIVBf0pA4E//XyAllKsiW4BAsNmRHfhZTbPvvvsaBosq\n5MEHH7R+BGeccYYdrN75znfGqlEGCWGA2GuvvTp1Uj9Levnupksng3cAocEq4j7btm0zP/7xjy2x\n+fWvfz2sHi/biMNvf/vbxr09j7jYwBPvete7zLPPPmu453oJg+Mrr7xiMWMQZ/oLUuYw7ZW/TtfD\nJATdDjzwQPu/NmvWLPPb3/7WfPzjH69EZf6XWA7O0vXbb789cvl6JYqp0tYhIItI67q0+AZhETjz\nzDPN/vvvb4gG6d7ciq9peIkMOCwnfuyxx6wVhCWO3awQUQ/14SWm+8WDuJvFomjSQ3t9f4Zu0zhY\nq5ocDTfcE/QdlgzfWuSn8Z1My/S78ess+7jX/cp1rIDIggULci9DT9oe7sOvfvWrlnxcd911PX2i\nkpardEIgFoGydtNTue1CgF1fb7rpJrsjIzupBgNGaQ10dQWEZ2jGjBmdHWw5HwzUsfUm2ZU2NrN3\ngXqSlpU0nVd84kMwpnz3QS8n7HjaDQuXrknffptcH0S1vUltitOVvk36P8T/Hf8LZf/foWvgjG3r\nmjZtWmL94tqo80IgKQImaUKlEwIgwMOTB2LAbO13kYMhZbuHLg/CqEHeDVDh3ohKG06T5Dc6pGkT\n6fn0Q9CLdr700ksW/37U2c86zj333KGrrrrKtjFNH/RTxyLqynLPcN/7/3dF3e+0h7L5v4MI8lm/\nfn0RzVQZQiAxAn8SayrRBSEQgQDTMjfeeGPHhE6ERT5M2TBVkVYwuRMumjKYiti+fbuN2Eicgiin\nQ3wsOE9dmJARTNjkzSvognSb/gnXAR7o4XQJXy/yN3rR9t/97ncmIGpFFl2Lsg4//HDzZ3/2Z7aN\nafqgFsonVKLXdExcMdz37v+OKTn8R/DZYouDLP936IFTLHFBmOrC52bJkiV248i4PZvidNN5IZAX\nAfmI5EVQ+Q3BmPDjCN6k7H41DJJjx461PgzAwxJTHnY7duywaO3atcume/755+3vyZMnm1NOOcVM\nmjQplUMcxIEHdPCGGUlabOEJ//Aw7+YP0qsY8kcRp175slyHuL366qtdg7llKbfqPGCIL0Rbg2Vl\nJSFx/cL/HRF+2eiQD5sIsqpswoQJnSzjx483r7/+uv0d9393xBFH9M3vq6OYDoSAh4CIiAeGDvMj\ngGUgMKvbFR08+BCsHOyF4j8gJ06caK0YWBTyyDe/+U3zvve9L5UVw6/P6ZuXRFAOA00/3uSxPiFt\nC7HdZiJSNAnx72GO3X3MSqpu/3cQk7333rsv92lYR/0WAnEIiIjEIaPztUfAPdyxikB+0pIJ8vMA\nL4o8YKGBWKFPmdJWIkJfYDkLJpbLhK/vZbv7NC/p7rviqlAI9AkB+Yj0CWhVUzwCTMm4gR8Swhs1\ng1kSyeIP0qtcCA2ESJINgbIJXDat8uVy95lISD4clbvdCIiItLt/W9u6KJ8MyAhvn+4NNK7x5GVg\nKGNwcIQorm6dHxwEnIWsjPtscFBUSwcBARGRQejllrURohG3SsZNs7g3Ub/pWEscgSnz7RvdepEh\nXy8d/x4BMGvLoO1ISJn3me4bIdAWBERE2tKTA9QONyUT12Rn7YB0OGGQ45PWj8TlT/NN/ZCepNNE\nacpuc1r6FSfmpotISNN7UPr3G4F39LtC1ddeBBh48ZFgWa5bKhjVWre0Fw9+9tVI8xbsLBpR5frn\neBN10yR/+qd/av76r/+6MKdUv564YywzSXWNKyPuPMuhN27cGHe5seeDwFqN1d0pLhLikNC3EEiO\ngIhIcqyUMgIBrAxPPvmkDUrmYhkcdthhNobI3LlzI3IYu7SXJb2LFy82q1atMkE0Rxugi03t3NRK\nVEbqipuSiUrPOVZh/OY3v4m7XOp54pIwMHVrUxYFxo0bZ/HOkrfOeYh3wb3QVBEJaWrPSe+qEdD
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
", + "text/plain": [ + "" + ] + }, + "execution_count": 47, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "{'': 0,\n", + " 'inhabitants': 1,\n", + " 'goku': 2,\n", + " 'stunts': 3,\n", + " 'catepillar': 4,\n", + " 'kristensen': 5,\n", + " 'goddess': 7,\n", + " 'offing': 49797,\n", + " 'distroy': 8,\n", + " 'unexplainably': 9,\n", + " 'concoctions': 10,\n", + " 'petite': 11,\n", + " 'paramilitary': 24759,\n", + " 'scribe': 12,\n", + " 'stevson': 13,\n", + " 'senegal': 6,\n", + " 'sctv': 14,\n", + " 'soundscape': 15,\n", + " 'rana': 16,\n", + " 'immortalizer': 18,\n", + " 'rene': 67354,\n", + " 'eko': 23,\n", + " 'planning': 20,\n", + " 'akiva': 21,\n", + " 'plod': 22,\n", + " 'orderly': 24,\n", + " 'zeleznice': 25,\n", + " 'critize': 29,\n", + " 'baguettes': 25649,\n", + " 'jefferies': 30,\n", + " 'uncertainties': 61695,\n", + " 'mountainbillies': 31,\n", + " 'steinbichler': 32,\n", + " 'vowel': 33,\n", + " 'rafe': 34,\n", + " 'donig': 68719,\n", + " 
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 47, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "{'': 0,\n", + " 'inhabitants': 1,\n", + " 'goku': 2,\n", + " 'stunts': 3,\n", + " 'catepillar': 4,\n", + " 'kristensen': 5,\n", + " 'goddess': 7,\n", + " 'offing': 49797,\n", + " 'distroy': 8,\n", + " 'unexplainably': 9,\n", + " 'concoctions': 10,\n", + " 'petite': 11,\n", + " 'paramilitary': 24759,\n", + " 'scribe': 12,\n", + " 'stevson': 13,\n", + " 'senegal': 6,\n", + " 'sctv': 14,\n", + " 'soundscape': 15,\n", + " 'rana': 16,\n", + " 'immortalizer': 18,\n", + " 'rene': 67354,\n", + " 'eko': 23,\n", + " 'planning': 20,\n", + " 'akiva': 21,\n", + " 'plod': 22,\n", + " 'orderly': 24,\n", + " 'zeleznice': 25,\n", + " 'critize': 29,\n", + " 'baguettes': 25649,\n", + " 'jefferies': 30,\n", + " 'uncertainties': 61695,\n", + " 'mountainbillies': 31,\n", + " 'steinbichler': 32,\n", + " 'vowel': 33,\n", + " 'rafe': 34,\n", + " 'donig': 68719,\n", + " 
'tulipe': 36,\n", + " 'clot': 37,\n", + " 'hack': 12526,\n", + " 'distended': 38,\n", + " 'cornered': 37116,\n", + " 'impatiently': 40,\n", + " 'batrice': 12525,\n", + " 'unfortuntly': 41,\n", + " 'lung': 42,\n", + " 'scapegoats': 43,\n", + " 'pscychosexual': 45,\n", + " 'outbid': 46,\n", + " 'obit': 47,\n", + " 'sideshows': 48,\n", + " 'jugde': 49,\n", + " 'kevloun': 51,\n", + " 'quartier': 53,\n", + " 'harp': 61948,\n", + " 'unravelling': 54,\n", + " 'antiques': 56,\n", + " 'strutts': 57,\n", + " 'tilts': 58,\n", + " 'disconcert': 59,\n", + " 'dossiers': 60,\n", + " 'sorriest': 61,\n", + " 'craftsman': 49412,\n", + " 'blart': 62,\n", + " 'dependence': 37120,\n", + " 'sated': 61698,\n", + " 'iberia': 63,\n", + " 'sagan': 72,\n", + " 'frmann': 65,\n", + " 'daniell': 66,\n", + " 'rays': 67,\n", + " 'pried': 68,\n", + " 'khoobsurat': 69,\n", + " 'leavitt': 70,\n", + " 'caiano': 71,\n", + " 'attractiveness': 73,\n", + " 'kitaparaporn': 74,\n", + " 'hamilton': 75,\n", + " 'massages': 76,\n", + " 'horgan': 78,\n", + " 'chemist': 79,\n", + " 'audrey': 80,\n", + " 'yeow': 55655,\n", + " 'jana': 81,\n", + " 'dutch': 82,\n", + " 'pinchot': 24773,\n", + " 'override': 83,\n", + " 'dwervick': 63223,\n", + " 'spasms': 84,\n", + " 'resumed': 85,\n", + " 'tamale': 66259,\n", + " 'calibanian': 49636,\n", + " 'stinson': 86,\n", + " 'widows': 87,\n", + " 'stonewall': 88,\n", + " 'palatial': 89,\n", + " 'neuman': 90,\n", + " 'abandon': 91,\n", + " 'lemmings': 65314,\n", + " 'anglophile': 92,\n", + " 'ertha': 61706,\n", + " 'chevette': 94,\n", + " 'unscary': 95,\n", + " 'spoilerific': 97,\n", + " 'neworleans': 67639,\n", + " 'metamorphose': 17,\n", + " 'brigand': 99,\n", + " 'cheating': 41603,\n", + " 'clued': 101,\n", + " 'dermatonecrotic': 102,\n", + " 'grady': 103,\n", + " 'mulligan': 104,\n", + " 'ol': 105,\n", + " 'incubation': 107,\n", + " 'plaintiffs': 110,\n", + " 'snden': 109,\n", + " 'fk': 111,\n", + " 'deply': 112,\n", + " 'franchot': 113,\n", + " 'henstridge': 19,\n", + " 
'cyhper': 114,\n", + " 'verbose': 26,\n", + " 'mazovia': 116,\n", + " 'elizabeth': 117,\n", + " 'palestine': 118,\n", + " 'robby': 119,\n", + " 'wongo': 120,\n", + " 'moshing': 121,\n", + " 'mstified': 12543,\n", + " 'eeeee': 122,\n", + " 'doltish': 123,\n", + " 'bree': 124,\n", + " 'postponed': 125,\n", + " 'debacles': 127,\n", + " 'amplify': 27,\n", + " 'kamm': 128,\n", + " 'phantom': 18893,\n", + " 'boylen': 136,\n", + " 'rolando': 131,\n", + " 'premises': 133,\n", + " 'bruck': 134,\n", + " 'loosely': 135,\n", + " 'wodehousian': 139,\n", + " 'onishi': 70389,\n", + " 'encapsuling': 140,\n", + " 'partly': 141,\n", + " 'stadling': 144,\n", + " 'calms': 143,\n", + " 'darkie': 148,\n", + " 'wheeling': 147,\n", + " 'ursla': 15875,\n", + " 'subsidized': 49420,\n", + " 'mckellar': 149,\n", + " 'ooookkkk': 151,\n", + " 'milky': 152,\n", + " 'unfolded': 153,\n", + " 'degrades': 154,\n", + " 'authenticating': 155,\n", + " 'writeup': 12548,\n", + " 'rotheroe': 156,\n", + " 'beart': 157,\n", + " 'intoxicants': 160,\n", + " 'grispin': 159,\n", + " 'cannes': 61718,\n", + " 'antithetical': 70398,\n", + " 'nnette': 161,\n", + " 'tsukamoto': 163,\n", + " 'antwones': 44205,\n", + " 'stows': 164,\n", + " 'suddenness': 165,\n", + " 'vol': 61720,\n", + " 'waqt': 166,\n", + " 'camazotz': 168,\n", + " 'paps': 55042,\n", + " 'shakher': 170,\n", + " 'terminate': 63868,\n", + " 'kotex': 56419,\n", + " 'delinquency': 171,\n", + " 'bromwell': 25214,\n", + " 'insecticide': 173,\n", + " 'charlton': 174,\n", + " 'nakada': 177,\n", + " 'titted': 24791,\n", + " 'urbane': 178,\n", + " 'depicted': 54491,\n", + " 'sadomasochistic': 179,\n", + " 'hyping': 181,\n", + " 'yr': 182,\n", + " 'hebert': 183,\n", + " 'waxwork': 12990,\n", + " 'deathrow': 185,\n", + " 'nourishes': 24792,\n", + " 'unmediated': 187,\n", + " 'tamper': 37143,\n", + " 'soad': 190,\n", + " 'alphabet': 189,\n", + " 'donen': 191,\n", + " 'lord': 192,\n", + " 'recess': 193,\n", + " 'watchably': 61023,\n", + " 'handsome': 194,\n", + " 
'vignettes': 196,\n", + " 'pairings': 198,\n", + " 'uselful': 199,\n", + " 'sanders': 200,\n", + " 'outbursts': 72891,\n", + " 'nots': 201,\n", + " 'hatsumomo': 202,\n", + " 'actioned': 18292,\n", + " 'krimi': 24797,\n", + " 'appleby': 203,\n", + " 'tampax': 204,\n", + " 'sprinkling': 205,\n", + " 'defacing': 206,\n", + " 'lofty': 207,\n", + " 'verger': 213,\n", + " 'tablespoons': 211,\n", + " 'bernhard': 212,\n", + " 'goosebump': 64565,\n", + " 'acumen': 214,\n", + " 'percentages': 215,\n", + " 'wendingo': 216,\n", + " 'resonating': 217,\n", + " 'vntoarea': 218,\n", + " 'redundancies': 219,\n", + " 'strictly': 57081,\n", + " 'pitied': 221,\n", + " 'belying': 222,\n", + " 'michelangelo': 53153,\n", + " 'gleefulness': 223,\n", + " 'environmentalist': 24803,\n", + " 'gitane': 226,\n", + " 'corrected': 66547,\n", + " 'journalist': 227,\n", + " 'focusing': 228,\n", + " 'plethora': 229,\n", + " 'his': 39,\n", + " 'citizen': 230,\n", + " 'south': 55579,\n", + " 'clunkers': 232,\n", + " 'pendulous': 55991,\n", + " 'mounds': 24805,\n", + " 'deplorable': 233,\n", + " 'forgive': 234,\n", + " 'proplems': 235,\n", + " 'bankers': 237,\n", + " 'aqua': 238,\n", + " 'donated': 239,\n", + " 'disbelieving': 240,\n", + " 'acomplication': 241,\n", + " 'contrasted': 243,\n", + " 'muzzle': 44,\n", + " 'amphibians': 72141,\n", + " 'springs': 246,\n", + " 'reformatted': 49443,\n", + " 'toolbox': 247,\n", + " 'contacting': 248,\n", + " 'washrooms': 250,\n", + " 'raving': 251,\n", + " 'dynamism': 252,\n", + " 'mae': 253,\n", + " 'disharmony': 255,\n", + " 'molls': 72979,\n", + " 'dewaere': 12569,\n", + " 'untutored': 256,\n", + " 'icarus': 257,\n", + " 'taint': 258,\n", + " 'kargil': 259,\n", + " 'captain': 260,\n", + " 'paucity': 261,\n", + " 'fits': 262,\n", + " 'tumbles': 263,\n", + " 'amer': 264,\n", + " 'bueller': 265,\n", + " 'cleansed': 267,\n", + " 'shara': 269,\n", + " 'humma': 270,\n", + " 'outa': 272,\n", + " 'piglets': 273,\n", + " 'gombell': 274,\n", + " 'supermen': 275,\n", + 
" 'superlow': 276,\n", + " 'kubanskie': 280,\n", + " 'goode': 278,\n", + " 'disorganised': 45570,\n", + " 'zenith': 281,\n", + " 'ananda': 282,\n", + " 'matlin': 284,\n", + " 'particolare': 50,\n", + " 'presumptuous': 286,\n", + " 'rerun': 287,\n", + " 'toyko': 288,\n", + " 'bilb': 291,\n", + " 'sundry': 290,\n", + " 'fugly': 292,\n", + " 'orchestrating': 293,\n", + " 'prosaically': 294,\n", + " 'moveis': 296,\n", + " 'conelly': 297,\n", + " 'estrange': 298,\n", + " 'elfriede': 49455,\n", + " 'masterful': 52,\n", + " 'seasonings': 300,\n", + " 'quincey': 303,\n", + " 'frowning': 49456,\n", + " 'painkillers': 53444,\n", + " 'high': 25515,\n", + " 'flesh': 304,\n", + " 'tootsie': 305,\n", + " 'ai': 306,\n", + " 'tenma': 307,\n", + " 'duguay': 71257,\n", + " 'appropriations': 308,\n", + " 'ides': 310,\n", + " 'rui': 61734,\n", + " 'surrogacy': 311,\n", + " 'pungent': 312,\n", + " 'damaso': 314,\n", + " 'authoritarian': 61736,\n", + " 'caribou': 315,\n", + " 'ro': 318,\n", + " 'supplying': 317,\n", + " 'yuy': 319,\n", + " 'debuted': 321,\n", + " 'mounts': 323,\n", + " 'interpolated': 324,\n", + " 'aetv': 325,\n", + " 'plummer': 326,\n", + " 'asunder': 331,\n", + " 'airfix': 333,\n", + " 'dubiel': 329,\n", + " 'clavichord': 330,\n", + " 'crafty': 50465,\n", + " 'sublety': 332,\n", + " 'stoltzfus': 334,\n", + " 'ruth': 335,\n", + " 'fluorescent': 336,\n", + " 'improves': 337,\n", + " 'russells': 339,\n", + " 'tick': 43838,\n", + " 'zsa': 341,\n", + " 'macs': 343,\n", + " 'jlb': 345,\n", + " 'locus': 348,\n", + " 'mislead': 349,\n", + " 'merly': 49461,\n", + " 'corey': 350,\n", + " 'blundered': 351,\n", + " 'humourless': 3568,\n", + " 'disorganized': 353,\n", + " 'discuss': 354,\n", + " 'sharifi': 45391,\n", + " 'tieing': 356,\n", + " 'kats': 34784,\n", + " 'bbc': 360,\n", + " 'pranked': 362,\n", + " 'superman': 363,\n", + " 'holroyd': 9223,\n", + " 'aggravated': 364,\n", + " 'rifleman': 365,\n", + " 'yvone': 366,\n", + " 'vaugier': 24820,\n", + " 'radiant': 367,\n", + " 
'galico': 368,\n", + " 'debris': 369,\n", + " 'btw': 371,\n", + " 'denote': 24822,\n", + " 'havnt': 372,\n", + " 'francen': 373,\n", + " 'chattered': 374,\n", + " 'scathed': 375,\n", + " 'pic': 376,\n", + " 'ceremonies': 377,\n", + " 'everyplace': 65309,\n", + " 'betsy': 379,\n", + " 'finster': 37176,\n", + " 'meercat': 381,\n", + " 'noirs': 382,\n", + " 'grunts': 383,\n", + " 'tribulations': 385,\n", + " 'apparatus': 47673,\n", + " 'martnez': 25825,\n", + " 'telethons': 24825,\n", + " 'talladega': 387,\n", + " 'alloimono': 390,\n", + " 'situations': 64,\n", + " 'scrutinising': 391,\n", + " 'geta': 392,\n", + " 'beltrami': 393,\n", + " 'pvc': 394,\n", + " 'horse': 395,\n", + " 'tiburon': 396,\n", + " 'huitime': 397,\n", + " 'ripple': 398,\n", + " 'exceed': 61748,\n", + " 'loitering': 399,\n", + " 'forensics': 400,\n", + " 'nearly': 401,\n", + " 'ellington': 403,\n", + " 'uzi': 404,\n", + " 'rung': 408,\n", + " 'pillaged': 24829,\n", + " 'gao': 409,\n", + " 'licitates': 410,\n", + " 'protocol': 411,\n", + " 'smirker': 412,\n", + " 'torin': 413,\n", + " 'vizier': 31853,\n", + " 'newlywed': 414,\n", + " 'dismay': 416,\n", + " 'moonwalks': 418,\n", + " 'skyler': 417,\n", + " 'invested': 18455,\n", + " 'grifter': 421,\n", + " 'undersold': 422,\n", + " 'chearator': 423,\n", + " 'marino': 424,\n", + " 'scala': 425,\n", + " 'conditioner': 426,\n", + " 'lamarre': 428,\n", + " 'figueroa': 429,\n", + " 'mcinnerny': 61753,\n", + " 'allllllll': 431,\n", + " 'slide': 432,\n", + " 'lateness': 433,\n", + " 'selbst': 434,\n", + " 'dramatizing': 436,\n", + " 'doable': 438,\n", + " 'hollywoodize': 27207,\n", + " 'alexanderplatz': 440,\n", + " 'wholesome': 45745,\n", + " 'pandemonium': 441,\n", + " 'earth': 443,\n", + " 'mounties': 444,\n", + " 'seeker': 445,\n", + " 'cheat': 446,\n", + " 'outbreaks': 447,\n", + " 'savagely': 61759,\n", + " 'snowstorm': 448,\n", + " 'baur': 449,\n", + " 'schedules': 450,\n", + " 'bathetic': 451,\n", + " 'johnathon': 453,\n", + " 'origonal': 57843,\n", 
+ " 'rosanne': 454,\n", + " 'cauldrons': 456,\n", + " 'forrest': 457,\n", + " 'poky': 458,\n", + " 'aristos': 54856,\n", + " 'womanness': 460,\n", + " 'spender': 461,\n", + " 'pagliai': 37108,\n", + " 'rational': 463,\n", + " 'terrell': 464,\n", + " 'affronts': 472,\n", + " 'concise': 49476,\n", + " 'mathew': 468,\n", + " 'narnia': 469,\n", + " 'naseeruddin': 470,\n", + " 'bucks': 471,\n", + " 'proceeds': 69809,\n", + " 'topple': 473,\n", + " 'degree': 474,\n", + " 'passionately': 476,\n", + " 'defeats': 477,\n", + " 'gras': 49477,\n", + " 'sources': 479,\n", + " 'pflug': 49976,\n", + " 'botticelli': 480,\n", + " 'fwd': 486,\n", + " 'waiving': 483,\n", + " 'gunnar': 484,\n", + " 'stiffler': 485,\n", + " 'unwise': 49480,\n", + " 'kawajiri': 487,\n", + " 'sistahs': 489,\n", + " 'swallowed': 30511,\n", + " 'soulhunter': 490,\n", + " 'belies': 491,\n", + " 'wrathful': 492,\n", + " 'badmouth': 16696,\n", + " 'floradora': 61766,\n", + " 'unforgivably': 497,\n", + " 'weirdy': 496,\n", + " 'violation': 63309,\n", + " 'chepart': 498,\n", + " 'departmentthe': 500,\n", + " 'posehn': 49483,\n", + " 'peyote': 37188,\n", + " 'psychiatrically': 24846,\n", + " 'marionettes': 503,\n", + " 'blatty': 502,\n", + " 'atop': 504,\n", + " 'debases': 25135,\n", + " 'henze': 24845,\n", + " 'unrooted': 510,\n", + " 'cloudscape': 508,\n", + " 'resignedly': 509,\n", + " 'begin': 49917,\n", + " 'hitlerian': 512,\n", + " 'reedus': 517,\n", + " 'crewed': 514,\n", + " 'bedeviled': 515,\n", + " 'unfurnished': 516,\n", + " 'herrmann': 12602,\n", + " 'circumstances': 518,\n", + " 'grasped': 519,\n", + " 'fn': 521,\n", + " 'beefed': 22200,\n", + " 'scwatch': 64018,\n", + " 'dishwashers': 522,\n", + " 'roadie': 523,\n", + " 'ruthlessness': 524,\n", + " 'migrant': 12605,\n", + " 'refrains': 525,\n", + " 'preponderance': 44377,\n", + " 'lampooning': 526,\n", + " 'richart': 528,\n", + " 'gwenneth': 530,\n", + " 'enmity': 531,\n", + " 'vortex': 61772,\n", + " 'assess': 532,\n", + " 'manufacturer': 533,\n", 
+ " 'bullosa': 534,\n", + " 'citizenship': 61774,\n", + " 'chekov': 537,\n", + " 'hogan': 536,\n", + " 'blithe': 538,\n", + " 'aredavid': 542,\n", + " 'drillings': 540,\n", + " 'revolvers': 541,\n", + " 'boyfriendhe': 545,\n", + " 'achcha': 544,\n", + " 'wallow': 546,\n", + " 'toga': 547,\n", + " 'bosnians': 551,\n", + " 'going': 550,\n", + " 'willy': 552,\n", + " 'fim': 554,\n", + " 'forbidding': 555,\n", + " 'delete': 56779,\n", + " 'rationalised': 557,\n", + " 'shimomo': 558,\n", + " 'opposition': 559,\n", + " 'landis': 560,\n", + " 'minded': 561,\n", + " 'arghhhhh': 564,\n", + " 'trialat': 566,\n", + " 'protected': 567,\n", + " 'negras': 568,\n", + " 'tracker': 571,\n", + " 'muti': 570,\n", + " 'dinky': 49489,\n", + " 'shawl': 572,\n", + " 'differentiates': 573,\n", + " 'dipaolo': 61779,\n", + " 'sweetheart': 574,\n", + " 'manmohan': 576,\n", + " 'enamored': 66265,\n", + " 'trevethyn': 577,\n", + " 'brain': 578,\n", + " 'incomprehensibly': 579,\n", + " 'pasadena': 581,\n", + " 'bruton': 59142,\n", + " 'shtick': 582,\n", + " 'ute': 583,\n", + " 'viggo': 584,\n", + " 'relevent': 589,\n", + " 'cites': 587,\n", + " 'greenaways': 61781,\n", + " 'minidress': 590,\n", + " 'philosopher': 591,\n", + " 'mahattan': 593,\n", + " 'moden': 594,\n", + " 'compiling': 595,\n", + " 'unimaginative': 598,\n", + " 'rogues': 597,\n", + " 'subpaar': 599,\n", + " 'darkly': 601,\n", + " 'saturate': 602,\n", + " 'fledgling': 603,\n", + " 'breaths': 604,\n", + " 'sceam': 37206,\n", + " 'empathized': 58870,\n", + " 'aszombi': 606,\n", + " 'incalculable': 608,\n", + " 'formations': 28596,\n", + " 'hampden': 619,\n", + " 'rawail': 612,\n", + " 'forbid': 613,\n", + " 'holiness': 617,\n", + " 'unessential': 618,\n", + " 'reputedly': 616,\n", + " 'wage': 63181,\n", + " 'kewpie': 24860,\n", + " 'asylum': 620,\n", + " 'bolye': 621,\n", + " 'celticism': 63189,\n", + " 'strangers': 622,\n", + " 'rantzen': 623,\n", + " 'farrellys': 624,\n", + " 'marathon': 93,\n", + " 'cantinflas': 626,\n", + " 
'disproportionately': 12617,\n", + " 'bared': 67212,\n", + " 'enshrined': 627,\n", + " 'expetations': 629,\n", + " 'replaying': 630,\n", + " 'topless': 636,\n", + " 'bukater': 632,\n", + " 'overpaid': 633,\n", + " 'exhude': 634,\n", + " 'nitwits': 638,\n", + " 'tsst': 51554,\n", + " 'sufferings': 637,\n", + " 'ci': 24693,\n", + " 'eponymously': 96,\n", + " 'ferdy': 644,\n", + " 'danira': 641,\n", + " 'unrelenting': 642,\n", + " 'disabling': 643,\n", + " 'gerard': 645,\n", + " 'drewitt': 646,\n", + " 'lamping': 650,\n", + " 'demy': 652,\n", + " 'wicklow': 37214,\n", + " 'relinquish': 651,\n", + " 'feminized': 64196,\n", + " 'drink': 653,\n", + " 'chamberlin': 654,\n", + " 'floodwaters': 657,\n", + " 'searing': 658,\n", + " 'isral': 659,\n", + " 'ling': 660,\n", + " 'grossness': 661,\n", + " 'sassier': 24865,\n", + " 'pickier': 662,\n", + " 'pax': 663,\n", + " 'fleashens': 98,\n", + " 'wierd': 664,\n", + " 'tereasa': 665,\n", + " 'smog': 666,\n", + " 'girotti': 667,\n", + " 'zooey': 64814,\n", + " 'spat': 668,\n", + " 'sera': 669,\n", + " 'misbehaving': 671,\n", + " 'scouts': 672,\n", + " 'refreshments': 673,\n", + " 'itll': 39668,\n", + " 'toyomichi': 676,\n", + " 'politeness': 100,\n", + " 'bits': 677,\n", + " 'psychotics': 678,\n", + " 'optimistic': 61796,\n", + " 'barzell': 679,\n", + " 'colt': 680,\n", + " 'anita': 49501,\n", + " 'shivering': 681,\n", + " 'utah': 59297,\n", + " 'scrivener': 686,\n", + " 'predicable': 687,\n", + " 'dryer': 684,\n", + " 'reissues': 685,\n", + " 'sexier': 26115,\n", + " 'spellbind': 691,\n", + " 'marmalade': 689,\n", + " 'seems': 690,\n", + " 'wyke': 37223,\n", + " 'innovator': 693,\n", + " 'inthused': 695,\n", + " 'scatman': 6309,\n", + " 'contestants': 696,\n", + " 'bertolucci': 106,\n", + " 'serviced': 699,\n", + " 'nozires': 700,\n", + " 'ins': 701,\n", + " 'mutilating': 702,\n", + " 'dupes': 703,\n", + " 'launius': 704,\n", + " 'widescreen': 705,\n", + " 'joo': 706,\n", + " 'discretionary': 707,\n", + " 'enlivens': 708,\n", + 
" 'manos': 55596,\n", + " 'bushes': 709,\n", + " 'header': 711,\n", + " 'activist': 712,\n", + " 'gethsemane': 713,\n", + " 'phoenixs': 714,\n", + " 'wreathed': 715,\n", + " 'oldboy': 108,\n", + " 'electrifyingly': 717,\n", + " 'inseparability': 24874,\n", + " 'ghidora': 719,\n", + " 'binder': 720,\n", + " 'tibet': 51530,\n", + " 'doddsville': 723,\n", + " 'sugar': 722,\n", + " 'porkys': 724,\n", + " 'hopefully': 37226,\n", + " 'scattershot': 725,\n", + " 'refunded': 726,\n", + " 'rudely': 727,\n", + " 'enacts': 67435,\n", + " 'insteadit': 728,\n", + " 'nightwatch': 61803,\n", + " 'eurotrash': 730,\n", + " 'radioraptus': 731,\n", + " 'unreservedly': 73710,\n", + " 'vall': 49508,\n", + " 'boogeman': 733,\n", + " 'flunked': 24880,\n", + " 'weighs': 734,\n", + " 'glorfindel': 738,\n", + " 'hypothermia': 737,\n", + " 'misled': 64919,\n", + " 'toiletries': 71501,\n", + " 'birthdays': 739,\n", + " 'attentive': 740,\n", + " 'mallepa': 741,\n", + " 'manoy': 743,\n", + " 'bombshells': 744,\n", + " 'glorifying': 115,\n", + " 'southron': 747,\n", + " 'destruction': 748,\n", + " 'manhole': 750,\n", + " 'elainor': 751,\n", + " 'bounder': 13003,\n", + " 'bowersock': 752,\n", + " 'lowly': 753,\n", + " 'wfst': 754,\n", + " 'limousines': 755,\n", + " 'skolimowski': 756,\n", + " 'saban': 757,\n", + " 'malaysia': 759,\n", + " 'cyd': 761,\n", + " 'bonecrushing': 763,\n", + " 'merest': 765,\n", + " 'janina': 766,\n", + " 'chemotrodes': 767,\n", + " 'trials': 768,\n", + " 'whilhelm': 770,\n", + " 'asthmatic': 771,\n", + " 'missteps': 773,\n", + " 'melyvn': 24885,\n", + " 'embittered': 774,\n", + " 'profit': 37234,\n", + " 'seeming': 776,\n", + " 'miscalculate': 777,\n", + " 'recommeded': 778,\n", + " 'mankin': 37235,\n", + " 'schoolwork': 779,\n", + " 'coy': 780,\n", + " 'mcconaughey': 781,\n", + " 'waver': 783,\n", + " 'unwatchably': 786,\n", + " 'saggy': 787,\n", + " 'breakup': 790,\n", + " 'pufnstuf': 37237,\n", + " 'superstars': 792,\n", + " 'replay': 793,\n", + " 'aggravates': 
794,\n", + " 'urging': 796,\n", + " 'snidely': 797,\n", + " 'aleksandar': 798,\n", + " 'hildy': 799,\n", + " 'kazuhiro': 800,\n", + " 'slayer': 801,\n", + " 'tangy': 802,\n", + " 'horne': 804,\n", + " 'masayuki': 805,\n", + " 'molden': 806,\n", + " 'unravel': 807,\n", + " 'goodtime': 808,\n", + " 'rowboat': 811,\n", + " 'dekhiye': 815,\n", + " 'datedness': 813,\n", + " 'astrotheology': 814,\n", + " 'suriani': 59610,\n", + " 'hostilities': 819,\n", + " 'wipes': 818,\n", + " 'sentimentalising': 820,\n", + " 'documentary': 821,\n", + " 'virtue': 823,\n", + " 'unreasonably': 824,\n", + " 'cei': 826,\n", + " 'hobbled': 37240,\n", + " 'unglamorised': 827,\n", + " 'balky': 828,\n", + " 'complementary': 829,\n", + " 'paychecks': 830,\n", + " 'tughlaq': 45551,\n", + " 'functionality': 836,\n", + " 'ily': 833,\n", + " 'prc': 834,\n", + " 'ennobling': 835,\n", + " 'dissociated': 837,\n", + " 'elk': 838,\n", + " 'throbbing': 839,\n", + " 'tempe': 840,\n", + " 'linoleum': 841,\n", + " 'bottacin': 843,\n", + " 'hipper': 844,\n", + " 'barging': 846,\n", + " 'untie': 847,\n", + " 'sacchetti': 848,\n", + " 'gnat': 849,\n", + " 'roedel': 850,\n", + " 'performs': 852,\n", + " 'nanavati': 856,\n", + " 'migrs': 854,\n", + " 'teachs': 855,\n", + " 'gunslinger': 126,\n", + " 'fresco': 857,\n", + " 'davison': 858,\n", + " 'jet': 59446,\n", + " 'burglar': 860,\n", + " 'jerker': 69267,\n", + " 'masue': 861,\n", + " 'dickory': 862,\n", + " 'muggy': 46634,\n", + " 'grills': 863,\n", + " 'figment': 28693,\n", + " 'monogamistic': 49527,\n", + " 'appelagate': 864,\n", + " 'linkage': 865,\n", + " 'loesser': 867,\n", + " 'patties': 868,\n", + " 'prudent': 869,\n", + " 'mallorquins': 870,\n", + " 'nativetex': 871,\n", + " 'suprise': 872,\n", + " 'quill': 874,\n", + " 'angsty': 71451,\n", + " 'speeded': 875,\n", + " 'farscape': 876,\n", + " 'herman': 129,\n", + " 'saddening': 877,\n", + " 'centuries': 878,\n", + " 'mos': 879,\n", + " 'neccessarily': 881,\n", + " 'tankers': 883,\n", + " 'latte': 
884,\n", + " 'faracy': 886,\n", + " 'stilts': 24897,\n", + " 'synthetically': 887,\n", + " 'thoughtless': 888,\n", + " 'authoring': 62813,\n", + " 'rake': 889,\n", + " 'ropes': 890,\n", + " 'whitewashed': 892,\n", + " 'donal': 893,\n", + " 'arching': 4910,\n", + " 'cockamamie': 899,\n", + " 'lifeless': 895,\n", + " 'perfidy': 896,\n", + " 'teresa': 897,\n", + " 'bulldog': 898,\n", + " 'vingh': 73726,\n", + " 'evacuees': 65858,\n", + " 'rasberries': 900,\n", + " 'chiseling': 903,\n", + " 'clampets': 905,\n", + " 'grecianized': 138,\n", + " 'smaller': 904,\n", + " 'kluznick': 62184,\n", + " 'alerts': 906,\n", + " 'aaaahhhhhhh': 909,\n", + " 'wellingtonian': 908,\n", + " 'dither': 910,\n", + " 'incertitude': 911,\n", + " 'florentine': 912,\n", + " 'imperioli': 913,\n", + " 'licking': 914,\n", + " 'disparagement': 915,\n", + " 'artfully': 916,\n", + " 'feds': 917,\n", + " 'fumiya': 918,\n", + " 'jbl': 52774,\n", + " 'tearfully': 919,\n", + " 'welfare': 24905,\n", + " 'idyllically': 49534,\n", + " 'isha': 43702,\n", + " 'lanchester': 920,\n", + " 'undertaken': 921,\n", + " 'longlost': 922,\n", + " 'netted': 923,\n", + " 'carrell': 924,\n", + " 'uncompelling': 925,\n", + " 'stems': 37258,\n", + " 'reliefs': 926,\n", + " 'leona': 927,\n", + " 'autorenfilm': 928,\n", + " 'unfriendly': 929,\n", + " 'typewriter': 930,\n", + " 'shifted': 931,\n", + " 'bertrand': 932,\n", + " 'blesses': 933,\n", + " 'leukemia': 12666,\n", + " 'posative': 142,\n", + " 'tricking': 934,\n", + " 'zanes': 936,\n", + " 'dashboard': 12667,\n", + " 'unknowingly': 937,\n", + " 'flatmates': 51897,\n", + " 'unnerve': 938,\n", + " 'caning': 939,\n", + " 'shortland': 146,\n", + " 'recluse': 941,\n", + " 'dcreasy': 942,\n", + " 'scratchiness': 24911,\n", + " 'pms': 30930,\n", + " 'chipmunk': 943,\n", + " 'tkachenko': 49537,\n", + " 'dipper': 944,\n", + " 'europeans': 61601,\n", + " 'berserkers': 948,\n", + " 'shys': 947,\n", + " 'monte': 68505,\n", + " 'eve': 949,\n", + " 'luxury': 61828,\n", + " 
'conflagration': 950,\n", + " 'water': 46389,\n", + " 'irks': 951,\n", + " 'positronic': 954,\n", + " 'cushy': 150,\n", + " 'swiftness': 957,\n", + " 'underimpressed': 964,\n", + " 'imprint': 959,\n", + " 'sundance': 961,\n", + " 'aida': 31951,\n", + " 'thematically': 963,\n", + " 'uno': 965,\n", + " 'expressly': 966,\n", + " 'russkies': 967,\n", + " 'discos': 968,\n", + " 'shaping': 969,\n", + " 'verson': 970,\n", + " 'blushed': 61831,\n", + " 'prototype': 971,\n", + " 'lifewell': 976,\n", + " 'trafficker': 973,\n", + " 'crucifixions': 62188,\n", + " 'unrealistically': 975,\n", + " 'rivas': 977,\n", + " 'consequent': 978,\n", + " 'katsu': 979,\n", + " 'titantic': 980,\n", + " 'jalees': 981,\n", + " 'ranee': 982,\n", + " 'gambles': 984,\n", + " 'dispenses': 985,\n", + " 'disfigurement': 986,\n", + " 'bright': 987,\n", + " 'cristian': 988,\n", + " 'subculture': 37268,\n", + " 'capta': 991,\n", + " 'jewel': 992,\n", + " 'erect': 993,\n", + " 'avoide': 996,\n", + " 'inconnu': 997,\n", + " 'headquarters': 998,\n", + " 'babbling': 1000,\n", + " 'pac': 1001,\n", + " 'performace': 1003,\n", + " 'dorrit': 1004,\n", + " 'runners': 1005,\n", + " 'sentimentality': 1006,\n", + " 'marred': 1007,\n", + " 'commemorative': 1008,\n", + " 'helpers': 1012,\n", + " 'chiles': 1011,\n", + " 'snowy': 1013,\n", + " 'cheddar': 1014,\n", + " 'neath': 158,\n", + " 'outshine': 1016,\n", + " 'nadu': 1019,\n", + " 'wellbeing': 1020,\n", + " 'envisioned': 43779,\n", + " 'fanaticism': 1021,\n", + " 'morrisette': 12687,\n", + " 'sesame': 1024,\n", + " 'gran': 1023,\n", + " 'marlina': 1025,\n", + " 'artificiality': 1030,\n", + " 'coinsidence': 1027,\n", + " 'founders': 1028,\n", + " 'dismissably': 1029,\n", + " 'dracht': 66299,\n", + " 'scavengers': 1031,\n", + " 'neese': 12685,\n", + " 'pangborn': 1034,\n", + " 'elmore': 1039,\n", + " 'bristol': 71162,\n", + " 'lillies': 1035,\n", + " 'parkers': 1036,\n", + " 'skipped': 1038,\n", + " 'clipboard': 1042,\n", + " 'jucier': 1041,\n", + " 'haifa': 
1043,\n", + " ...}" + ] + }, + "execution_count": 48, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "word2index = {}\n", + "\n", + "for i,word in enumerate(vocab):\n", + " word2index[word] = i\n", + "word2index" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_target_for_label(label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 54, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "1" + ] + }, + "execution_count": 52, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 55, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'NEGATIVE'" + 
] + }, + "execution_count": 55, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[1]" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 53, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[1])" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 3).ipynb b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 3).ipynb new file mode 100644 index 0000000..9bc914c --- /dev/null +++ b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 3).ipynb @@ -0,0 +1,5102 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sentiment Classification & How To \"Frame Problems\" for a Neural Network\n", + "\n", + "by Andrew Trask\n", + "\n", + "- **Twitter**: @iamtrask\n", + "- **Blog**: http://iamtrask.github.io" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### What You Should Already Know\n", + "\n", + "- neural networks, forward and back-propagation\n", + "- stochastic gradient descent\n", + "- mean squared error\n", + "- and train/test splits\n", + "\n", + "### Where to Get Help if You Need it\n", + "- Re-watch previous Udacity Lectures\n", + "- Leverage the recommended Course 
Reading Material - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) (40% Off: **traskud17**)\n", + "- Shoot me a tweet @iamtrask\n", + "\n", + "\n", + "### Tutorial Outline:\n", + "\n", + "- Intro: The Importance of \"Framing a Problem\"\n", + "\n", + "\n", + "- Curate a Dataset\n", + "- Developing a \"Predictive Theory\"\n", + "- **PROJECT 1**: Quick Theory Validation\n", + "\n", + "\n", + "- Transforming Text to Numbers\n", + "- **PROJECT 2**: Creating the Input/Output Data\n", + "\n", + "\n", + "- Putting it all together in a Neural Network\n", + "- **PROJECT 3**: Building our Neural Network\n", + "\n", + "\n", + "- Understanding Neural Noise\n", + "- **PROJECT 4**: Making Learning Faster by Reducing Noise\n", + "\n", + "\n", + "- Analyzing Inefficiencies in our Network\n", + "- **PROJECT 5**: Making our Network Train and Run Faster\n", + "\n", + "\n", + "- Further Noise Reduction\n", + "- **PROJECT 6**: Reducing Noise by Strategically Reducing the Vocabulary\n", + "\n", + "\n", + "- Analysis: What's going on in the weights?" 
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbpresent": { + "id": "56bb3cba-260c-4ebe-9ed6-b995b4c72aa3" + } + }, + "source": [ + "# Lesson: Curate a Dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "eba2b193-0419-431e-8db9-60f34dd3fe83" + } + }, + "outputs": [], + "source": [ + "def pretty_print_review_and_label(i):\n", + " print(labels[i] + \"\\t:\\t\" + reviews[i][:80] + \"...\")\n", + "\n", + "g = open('reviews.txt','r') # What we know!\n", + "reviews = list(map(lambda x:x[:-1],g.readlines()))\n", + "g.close()\n", + "\n", + "g = open('labels.txt','r') # What we WANT to know!\n", + "labels = list(map(lambda x:x[:-1].upper(),g.readlines()))\n", + "g.close()" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "25000" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "len(reviews)" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "bb95574b-21a0-4213-ae50-34363cf4f87f" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy . it ran at the same time as some other programs about school life such as teachers . my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers . the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students . when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled . . . . . . . . . at . . . . . . . . . . high . a classic line inspector i m here to sack one of your teachers . student welcome to bromwell high . 
i expect that many adults of my age think that bromwell high is far fetched . what a pity that it isn t '" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "reviews[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e0408810-c424-4ed4-afb9-1735e9ddbd0a" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Lesson: Develop a Predictive Theory" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e67a709f-234f-4493-bae6-4fb192141ee0" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "labels.txt \t : \t reviews.txt\n", + "\n", + "NEGATIVE\t:\tthis movie is terrible but it has some good effects . ...\n", + "POSITIVE\t:\tadrian pasdar is excellent is this film . he makes a fascinating woman . ...\n", + "NEGATIVE\t:\tcomment this movie is impossible . is terrible very improbable bad interpretat...\n", + "POSITIVE\t:\texcellent episode movie ala pulp fiction . days suicides . it doesnt get more...\n", + "NEGATIVE\t:\tif you haven t seen this it s terrible . it is pure trash . 
i saw this about ...\n", + "POSITIVE\t:\tthis schiffer guy is a real genius the movie is of excellent quality and both e...\n" + ] + } + ], + "source": [ + "print(\"labels.txt \\t : \\t reviews.txt\\n\")\n", + "pretty_print_review_and_label(2137)\n", + "pretty_print_review_and_label(12816)\n", + "pretty_print_review_and_label(6267)\n", + "pretty_print_review_and_label(21934)\n", + "pretty_print_review_and_label(5297)\n", + "pretty_print_review_and_label(4998)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 1: Quick Theory Validation" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from collections import Counter\n", + "import numpy as np" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "positive_counts = Counter()\n", + "negative_counts = Counter()\n", + "total_counts = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for i in range(len(reviews)):\n", + " if(labels[i] == 'POSITIVE'):\n", + " for word in reviews[i].split(\" \"):\n", + " positive_counts[word] += 1\n", + " total_counts[word] += 1\n", + " else:\n", + " for word in reviews[i].split(\" \"):\n", + " negative_counts[word] += 1\n", + " total_counts[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('', 550468),\n", + " ('the', 173324),\n", + " ('.', 159654),\n", + " ('and', 89722),\n", + " ('a', 83688),\n", + " ('of', 76855),\n", + " ('to', 66746),\n", + " ('is', 57245),\n", + " ('in', 50215),\n", + " ('br', 49235),\n", + " ('it', 48025),\n", + " ('i', 40743),\n", + " ('that', 35630),\n", + " ('this', 35080),\n", + " ('s', 33815),\n", + " ('as', 26308),\n", + " ('with', 23247),\n", + " 
('for', 22416),\n", + " ('was', 21917),\n", + " ('film', 20937),\n", + " ('but', 20822),\n", + " ('movie', 19074),\n", + " ('his', 17227),\n", + " ('on', 17008),\n", + " ('you', 16681),\n", + " ('he', 16282),\n", + " ('are', 14807),\n", + " ('not', 14272),\n", + " ('t', 13720),\n", + " ('one', 13655),\n", + " ('have', 12587),\n", + " ('be', 12416),\n", + " ('by', 11997),\n", + " ('all', 11942),\n", + " ('who', 11464),\n", + " ('an', 11294),\n", + " ('at', 11234),\n", + " ('from', 10767),\n", + " ('her', 10474),\n", + " ('they', 9895),\n", + " ('has', 9186),\n", + " ('so', 9154),\n", + " ('like', 9038),\n", + " ('about', 8313),\n", + " ('very', 8305),\n", + " ('out', 8134),\n", + " ('there', 8057),\n", + " ('she', 7779),\n", + " ('what', 7737),\n", + " ('or', 7732),\n", + " ('good', 7720),\n", + " ('more', 7521),\n", + " ('when', 7456),\n", + " ('some', 7441),\n", + " ('if', 7285),\n", + " ('just', 7152),\n", + " ('can', 7001),\n", + " ('story', 6780),\n", + " ('time', 6515),\n", + " ('my', 6488),\n", + " ('great', 6419),\n", + " ('well', 6405),\n", + " ('up', 6321),\n", + " ('which', 6267),\n", + " ('their', 6107),\n", + " ('see', 6026),\n", + " ('also', 5550),\n", + " ('we', 5531),\n", + " ('really', 5476),\n", + " ('would', 5400),\n", + " ('will', 5218),\n", + " ('me', 5167),\n", + " ('had', 5148),\n", + " ('only', 5137),\n", + " ('him', 5018),\n", + " ('even', 4964),\n", + " ('most', 4864),\n", + " ('other', 4858),\n", + " ('were', 4782),\n", + " ('first', 4755),\n", + " ('than', 4736),\n", + " ('much', 4685),\n", + " ('its', 4622),\n", + " ('no', 4574),\n", + " ('into', 4544),\n", + " ('people', 4479),\n", + " ('best', 4319),\n", + " ('love', 4301),\n", + " ('get', 4272),\n", + " ('how', 4213),\n", + " ('life', 4199),\n", + " ('been', 4189),\n", + " ('because', 4079),\n", + " ('way', 4036),\n", + " ('do', 3941),\n", + " ('made', 3823),\n", + " ('films', 3813),\n", + " ('them', 3805),\n", + " ('after', 3800),\n", + " ('many', 3766),\n", + " ('two', 3733),\n", + 
" ('too', 3659),\n", + " ('think', 3655),\n", + " ('movies', 3586),\n", + " ('characters', 3560),\n", + " ('character', 3514),\n", + " ('don', 3468),\n", + " ('man', 3460),\n", + " ('show', 3432),\n", + " ('watch', 3424),\n", + " ('seen', 3414),\n", + " ('then', 3358),\n", + " ('little', 3341),\n", + " ('still', 3340),\n", + " ('make', 3303),\n", + " ('could', 3237),\n", + " ('never', 3226),\n", + " ('being', 3217),\n", + " ('where', 3173),\n", + " ('does', 3069),\n", + " ('over', 3017),\n", + " ('any', 3002),\n", + " ('while', 2899),\n", + " ('know', 2833),\n", + " ('did', 2790),\n", + " ('years', 2758),\n", + " ('here', 2740),\n", + " ('ever', 2734),\n", + " ('end', 2696),\n", + " ('these', 2694),\n", + " ('such', 2590),\n", + " ('real', 2568),\n", + " ('scene', 2567),\n", + " ('back', 2547),\n", + " ('those', 2485),\n", + " ('though', 2475),\n", + " ('off', 2463),\n", + " ('new', 2458),\n", + " ('your', 2453),\n", + " ('go', 2440),\n", + " ('acting', 2437),\n", + " ('plot', 2432),\n", + " ('world', 2429),\n", + " ('scenes', 2427),\n", + " ('say', 2414),\n", + " ('through', 2409),\n", + " ('makes', 2390),\n", + " ('better', 2381),\n", + " ('now', 2368),\n", + " ('work', 2346),\n", + " ('young', 2343),\n", + " ('old', 2311),\n", + " ('ve', 2307),\n", + " ('find', 2272),\n", + " ('both', 2248),\n", + " ('before', 2177),\n", + " ('us', 2162),\n", + " ('again', 2158),\n", + " ('series', 2153),\n", + " ('quite', 2143),\n", + " ('something', 2135),\n", + " ('cast', 2133),\n", + " ('should', 2121),\n", + " ('part', 2098),\n", + " ('always', 2088),\n", + " ('lot', 2087),\n", + " ('another', 2075),\n", + " ('actors', 2047),\n", + " ('director', 2040),\n", + " ('family', 2032),\n", + " ('between', 2016),\n", + " ('own', 2016),\n", + " ('m', 1998),\n", + " ('may', 1997),\n", + " ('same', 1972),\n", + " ('role', 1967),\n", + " ('watching', 1966),\n", + " ('every', 1954),\n", + " ('funny', 1953),\n", + " ('doesn', 1935),\n", + " ('performance', 1928),\n", + " ('few', 
1918),\n", + " ('bad', 1907),\n", + " ('look', 1900),\n", + " ('re', 1884),\n", + " ('why', 1855),\n", + " ('things', 1849),\n", + " ('times', 1832),\n", + " ('big', 1815),\n", + " ('however', 1795),\n", + " ('actually', 1790),\n", + " ('action', 1789),\n", + " ('going', 1783),\n", + " ('bit', 1757),\n", + " ('comedy', 1742),\n", + " ('down', 1740),\n", + " ('music', 1738),\n", + " ('must', 1728),\n", + " ('take', 1709),\n", + " ('saw', 1692),\n", + " ('long', 1690),\n", + " ('right', 1688),\n", + " ('fun', 1686),\n", + " ('fact', 1684),\n", + " ('excellent', 1683),\n", + " ('around', 1674),\n", + " ('didn', 1672),\n", + " ('without', 1671),\n", + " ('thing', 1662),\n", + " ('thought', 1639),\n", + " ('got', 1635),\n", + " ('each', 1630),\n", + " ('day', 1614),\n", + " ('feel', 1597),\n", + " ('seems', 1596),\n", + " ('come', 1594),\n", + " ('done', 1586),\n", + " ('beautiful', 1580),\n", + " ('especially', 1572),\n", + " ('played', 1571),\n", + " ('almost', 1566),\n", + " ('want', 1562),\n", + " ('yet', 1556),\n", + " ('give', 1553),\n", + " ('pretty', 1549),\n", + " ('last', 1543),\n", + " ('since', 1519),\n", + " ('different', 1504),\n", + " ('although', 1501),\n", + " ('gets', 1490),\n", + " ('true', 1487),\n", + " ('interesting', 1481),\n", + " ('job', 1470),\n", + " ('enough', 1455),\n", + " ('our', 1454),\n", + " ('shows', 1447),\n", + " ('horror', 1441),\n", + " ('woman', 1439),\n", + " ('tv', 1400),\n", + " ('probably', 1398),\n", + " ('father', 1395),\n", + " ('original', 1393),\n", + " ('girl', 1390),\n", + " ('point', 1379),\n", + " ('plays', 1378),\n", + " ('wonderful', 1372),\n", + " ('far', 1358),\n", + " ('course', 1358),\n", + " ('john', 1350),\n", + " ('rather', 1340),\n", + " ('isn', 1328),\n", + " ('ll', 1326),\n", + " ('later', 1324),\n", + " ('dvd', 1324),\n", + " ('war', 1310),\n", + " ('whole', 1310),\n", + " ('d', 1307),\n", + " ('away', 1306),\n", + " ('found', 1306),\n", + " ('screen', 1305),\n", + " ('nothing', 1300),\n", + " ('year', 
1297),\n", + " ('once', 1296),\n", + " ('hard', 1294),\n", + " ('together', 1280),\n", + " ('am', 1277),\n", + " ('set', 1277),\n", + " ('having', 1266),\n", + " ('making', 1265),\n", + " ('place', 1263),\n", + " ('comes', 1260),\n", + " ('might', 1260),\n", + " ('sure', 1253),\n", + " ('american', 1248),\n", + " ('play', 1245),\n", + " ('kind', 1244),\n", + " ('takes', 1242),\n", + " ('perfect', 1242),\n", + " ('performances', 1237),\n", + " ('himself', 1230),\n", + " ('worth', 1221),\n", + " ('everyone', 1221),\n", + " ('anyone', 1214),\n", + " ('actor', 1203),\n", + " ('three', 1201),\n", + " ('wife', 1196),\n", + " ('classic', 1192),\n", + " ('goes', 1186),\n", + " ('ending', 1178),\n", + " ('version', 1168),\n", + " ('star', 1149),\n", + " ('enjoy', 1146),\n", + " ('book', 1142),\n", + " ('nice', 1132),\n", + " ('everything', 1128),\n", + " ('during', 1124),\n", + " ('put', 1118),\n", + " ('seeing', 1111),\n", + " ('least', 1102),\n", + " ('house', 1100),\n", + " ('high', 1095),\n", + " ('watched', 1094),\n", + " ('men', 1087),\n", + " ('loved', 1087),\n", + " ('night', 1082),\n", + " ('anything', 1075),\n", + " ('guy', 1071),\n", + " ('believe', 1071),\n", + " ('top', 1063),\n", + " ('amazing', 1058),\n", + " ('hollywood', 1056),\n", + " ('looking', 1053),\n", + " ('main', 1044),\n", + " ('definitely', 1043),\n", + " ('gives', 1031),\n", + " ('home', 1029),\n", + " ('seem', 1028),\n", + " ('episode', 1023),\n", + " ('sense', 1020),\n", + " ('audience', 1020),\n", + " ('truly', 1017),\n", + " ('special', 1011),\n", + " ('fan', 1009),\n", + " ('second', 1009),\n", + " ('short', 1009),\n", + " ('mind', 1005),\n", + " ('human', 1001),\n", + " ('recommend', 999),\n", + " ('full', 996),\n", + " ('black', 995),\n", + " ('help', 991),\n", + " ('along', 989),\n", + " ('trying', 987),\n", + " ('small', 986),\n", + " ('death', 985),\n", + " ('friends', 981),\n", + " ('remember', 974),\n", + " ('often', 970),\n", + " ('said', 966),\n", + " ('favorite', 962),\n", + " 
('heart', 959),\n", + " ('early', 957),\n", + " ('left', 956),\n", + " ('until', 955),\n", + " ('let', 954),\n", + " ('script', 954),\n", + " ('maybe', 937),\n", + " ('today', 936),\n", + " ('live', 934),\n", + " ('less', 934),\n", + " ('moments', 933),\n", + " ('others', 929),\n", + " ('brilliant', 926),\n", + " ('shot', 925),\n", + " ('liked', 923),\n", + " ('become', 916),\n", + " ('won', 915),\n", + " ('used', 910),\n", + " ('style', 907),\n", + " ('mother', 895),\n", + " ('lives', 894),\n", + " ('came', 893),\n", + " ('stars', 890),\n", + " ('cinema', 889),\n", + " ('looks', 885),\n", + " ('perhaps', 884),\n", + " ('read', 882),\n", + " ('enjoyed', 879),\n", + " ('boy', 875),\n", + " ('drama', 873),\n", + " ('highly', 871),\n", + " ('given', 870),\n", + " ('playing', 867),\n", + " ('use', 864),\n", + " ('next', 859),\n", + " ('women', 858),\n", + " ('fine', 857),\n", + " ('effects', 856),\n", + " ('kids', 854),\n", + " ('entertaining', 853),\n", + " ('need', 852),\n", + " ('line', 850),\n", + " ('works', 848),\n", + " ('someone', 847),\n", + " ('mr', 836),\n", + " ('simply', 835),\n", + " ('children', 833),\n", + " ('picture', 833),\n", + " ('face', 831),\n", + " ('friend', 831),\n", + " ('keep', 831),\n", + " ('dark', 830),\n", + " ('overall', 828),\n", + " ('certainly', 828),\n", + " ('minutes', 827),\n", + " ('wasn', 824),\n", + " ('history', 822),\n", + " ('finally', 820),\n", + " ('couple', 816),\n", + " ('against', 815),\n", + " ('son', 809),\n", + " ('understand', 808),\n", + " ('lost', 807),\n", + " ('michael', 805),\n", + " ('else', 801),\n", + " ('throughout', 798),\n", + " ('fans', 797),\n", + " ('city', 792),\n", + " ('reason', 789),\n", + " ('written', 787),\n", + " ('production', 787),\n", + " ('several', 784),\n", + " ('school', 783),\n", + " ('rest', 781),\n", + " ('based', 781),\n", + " ('try', 780),\n", + " ('dead', 776),\n", + " ('hope', 775),\n", + " ('strong', 768),\n", + " ('white', 765),\n", + " ('tell', 759),\n", + " ('itself', 
758),\n", + " ('half', 753),\n", + " ('person', 749),\n", + " ('sometimes', 746),\n", + " ('past', 744),\n", + " ('start', 744),\n", + " ('genre', 743),\n", + " ('final', 739),\n", + " ('beginning', 739),\n", + " ('town', 738),\n", + " ('art', 734),\n", + " ('game', 732),\n", + " ('humor', 732),\n", + " ('yes', 731),\n", + " ('idea', 731),\n", + " ('late', 730),\n", + " ('becomes', 729),\n", + " ('despite', 729),\n", + " ('able', 726),\n", + " ('case', 726),\n", + " ('money', 723),\n", + " ('child', 721),\n", + " ('completely', 721),\n", + " ('side', 719),\n", + " ('camera', 716),\n", + " ('getting', 714),\n", + " ('instead', 712),\n", + " ('soon', 702),\n", + " ('under', 700),\n", + " ('viewer', 699),\n", + " ('age', 697),\n", + " ('days', 696),\n", + " ('stories', 696),\n", + " ('felt', 694),\n", + " ('simple', 694),\n", + " ('roles', 693),\n", + " ('video', 688),\n", + " ('name', 683),\n", + " ('either', 683),\n", + " ('doing', 677),\n", + " ('turns', 674),\n", + " ('wants', 671),\n", + " ('close', 671),\n", + " ('title', 669),\n", + " ('wrong', 668),\n", + " ('went', 666),\n", + " ('james', 665),\n", + " ('evil', 659),\n", + " ('budget', 657),\n", + " ('episodes', 657),\n", + " ('relationship', 655),\n", + " ('piece', 653),\n", + " ('fantastic', 653),\n", + " ('david', 651),\n", + " ('turn', 648),\n", + " ('murder', 646),\n", + " ('parts', 645),\n", + " ('brother', 644),\n", + " ('head', 643),\n", + " ('absolutely', 643),\n", + " ('experience', 642),\n", + " ('eyes', 641),\n", + " ('sex', 638),\n", + " ('direction', 637),\n", + " ('called', 637),\n", + " ('directed', 636),\n", + " ('lines', 634),\n", + " ('behind', 633),\n", + " ('sort', 632),\n", + " ('actress', 631),\n", + " ('lead', 630),\n", + " ('oscar', 628),\n", + " ('example', 627),\n", + " ('including', 627),\n", + " ('known', 625),\n", + " ('musical', 625),\n", + " ('chance', 621),\n", + " ('score', 620),\n", + " ('feeling', 619),\n", + " ('already', 619),\n", + " ('hit', 619),\n", + " ('voice', 
615),\n", + " ('moment', 612),\n", + " ('living', 612),\n", + " ('low', 610),\n", + " ('supporting', 610),\n", + " ('ago', 609),\n", + " ('themselves', 608),\n", + " ('hilarious', 605),\n", + " ('reality', 605),\n", + " ('jack', 604),\n", + " ('told', 603),\n", + " ('hand', 601),\n", + " ('moving', 600),\n", + " ('dialogue', 600),\n", + " ('quality', 600),\n", + " ('song', 599),\n", + " ('happy', 599),\n", + " ('paul', 598),\n", + " ('matter', 598),\n", + " ('light', 594),\n", + " ('future', 593),\n", + " ('entire', 592),\n", + " ('finds', 591),\n", + " ('gave', 589),\n", + " ('laugh', 587),\n", + " ('released', 586),\n", + " ('expect', 584),\n", + " ('fight', 581),\n", + " ('particularly', 580),\n", + " ('cinematography', 579),\n", + " ('police', 579),\n", + " ('whose', 578),\n", + " ('type', 578),\n", + " ('sound', 578),\n", + " ('enjoyable', 573),\n", + " ('view', 573),\n", + " ('husband', 572),\n", + " ('romantic', 572),\n", + " ('number', 572),\n", + " ('daughter', 572),\n", + " ('documentary', 571),\n", + " ('self', 570),\n", + " ('modern', 569),\n", + " ('robert', 569),\n", + " ('took', 569),\n", + " ('superb', 569),\n", + " ('mean', 566),\n", + " ('shown', 563),\n", + " ('coming', 561),\n", + " ('important', 560),\n", + " ('king', 559),\n", + " ('leave', 559),\n", + " ('change', 558),\n", + " ('wanted', 555),\n", + " ('somewhat', 555),\n", + " ('tells', 554),\n", + " ('run', 552),\n", + " ('events', 552),\n", + " ('country', 552),\n", + " ('career', 552),\n", + " ('heard', 550),\n", + " ('season', 550),\n", + " ('girls', 549),\n", + " ('greatest', 549),\n", + " ('etc', 547),\n", + " ('care', 546),\n", + " ('starts', 545),\n", + " ('english', 542),\n", + " ('killer', 541),\n", + " ('animation', 540),\n", + " ('guys', 540),\n", + " ('totally', 540),\n", + " ('tale', 540),\n", + " ('usual', 539),\n", + " ('opinion', 535),\n", + " ('miss', 535),\n", + " ('violence', 531),\n", + " ('easy', 531),\n", + " ('songs', 530),\n", + " ('british', 528),\n", + " ('says', 
526),\n", + " ('realistic', 525),\n", + " ('writing', 524),\n", + " ('act', 522),\n", + " ('writer', 522),\n", + " ('comic', 521),\n", + " ('thriller', 519),\n", + " ('television', 517),\n", + " ('power', 516),\n", + " ('ones', 515),\n", + " ('kid', 514),\n", + " ('novel', 513),\n", + " ('york', 513),\n", + " ('problem', 512),\n", + " ('alone', 512),\n", + " ('attention', 509),\n", + " ('involved', 508),\n", + " ('kill', 507),\n", + " ('extremely', 507),\n", + " ('seemed', 506),\n", + " ('hero', 505),\n", + " ('french', 505),\n", + " ('rock', 504),\n", + " ('stuff', 501),\n", + " ('wish', 499),\n", + " ('begins', 498),\n", + " ('taken', 497),\n", + " ('sad', 497),\n", + " ('ways', 496),\n", + " ('richard', 495),\n", + " ('knows', 494),\n", + " ('atmosphere', 493),\n", + " ('surprised', 491),\n", + " ('similar', 491),\n", + " ('taking', 491),\n", + " ('car', 491),\n", + " ('george', 490),\n", + " ('perfectly', 490),\n", + " ('across', 489),\n", + " ('sequence', 489),\n", + " ('eye', 489),\n", + " ('team', 489),\n", + " ('serious', 488),\n", + " ('powerful', 488),\n", + " ('room', 488),\n", + " ('due', 488),\n", + " ('among', 488),\n", + " ('order', 487),\n", + " ('b', 487),\n", + " ('cannot', 487),\n", + " ('strange', 487),\n", + " ('beauty', 486),\n", + " ('famous', 485),\n", + " ('tries', 484),\n", + " ('myself', 484),\n", + " ('happened', 484),\n", + " ('herself', 484),\n", + " ('class', 483),\n", + " ('four', 482),\n", + " ('cool', 481),\n", + " ('release', 479),\n", + " ('anyway', 479),\n", + " ('theme', 479),\n", + " ('opening', 478),\n", + " ('entertainment', 477),\n", + " ('unique', 475),\n", + " ('ends', 475),\n", + " ('slow', 475),\n", + " ('exactly', 475),\n", + " ('red', 474),\n", + " ('o', 474),\n", + " ('level', 474),\n", + " ('easily', 474),\n", + " ('interest', 472),\n", + " ('happen', 471),\n", + " ('crime', 470),\n", + " ('viewing', 468),\n", + " ('memorable', 467),\n", + " ('sets', 467),\n", + " ('group', 466),\n", + " ('stop', 466),\n", + " 
('dance', 463),\n", + " ('message', 463),\n", + " ('sister', 463),\n", + " ('working', 463),\n", + " ('problems', 463),\n", + " ('knew', 462),\n", + " ('mystery', 461),\n", + " ('nature', 461),\n", + " ('bring', 460),\n", + " ('believable', 459),\n", + " ('thinking', 459),\n", + " ('brought', 459),\n", + " ('mostly', 458),\n", + " ('couldn', 457),\n", + " ('disney', 457),\n", + " ('society', 456),\n", + " ('within', 455),\n", + " ('lady', 455),\n", + " ('blood', 454),\n", + " ('upon', 453),\n", + " ('viewers', 453),\n", + " ('parents', 453),\n", + " ('meets', 452),\n", + " ('form', 452),\n", + " ('soundtrack', 452),\n", + " ('usually', 452),\n", + " ('tom', 452),\n", + " ('peter', 452),\n", + " ('local', 450),\n", + " ('certain', 448),\n", + " ('follow', 448),\n", + " ('whether', 447),\n", + " ('possible', 446),\n", + " ('emotional', 445),\n", + " ('killed', 444),\n", + " ('de', 444),\n", + " ('above', 444),\n", + " ('middle', 443),\n", + " ('god', 443),\n", + " ('happens', 442),\n", + " ('flick', 442),\n", + " ('needs', 442),\n", + " ('masterpiece', 441),\n", + " ('major', 440),\n", + " ('period', 440),\n", + " ('haven', 439),\n", + " ('named', 439),\n", + " ('th', 438),\n", + " ('particular', 438),\n", + " ('earth', 437),\n", + " ('feature', 437),\n", + " ('stand', 436),\n", + " ('words', 435),\n", + " ('typical', 435),\n", + " ('obviously', 433),\n", + " ('elements', 433),\n", + " ('romance', 431),\n", + " ('jane', 430),\n", + " ('yourself', 427),\n", + " ('showing', 427),\n", + " ('fantasy', 426),\n", + " ('brings', 426),\n", + " ('america', 423),\n", + " ('guess', 423),\n", + " ('huge', 422),\n", + " ('unfortunately', 422),\n", + " ('indeed', 421),\n", + " ('running', 421),\n", + " ('talent', 420),\n", + " ('stage', 419),\n", + " ('started', 418),\n", + " ('sweet', 417),\n", + " ('leads', 417),\n", + " ('japanese', 417),\n", + " ('poor', 416),\n", + " ('deal', 416),\n", + " ('personal', 413),\n", + " ('incredible', 413),\n", + " ('fast', 412),\n", + " 
('became', 410),\n", + " ('deep', 410),\n", + " ('hours', 409),\n", + " ('nearly', 408),\n", + " ('dream', 408),\n", + " ('giving', 408),\n", + " ('turned', 407),\n", + " ('clearly', 407),\n", + " ('near', 406),\n", + " ('obvious', 406),\n", + " ('cut', 405),\n", + " ('surprise', 405),\n", + " ('body', 404),\n", + " ('era', 404),\n", + " ('female', 403),\n", + " ('hour', 403),\n", + " ('five', 403),\n", + " ('note', 399),\n", + " ('learn', 398),\n", + " ('truth', 398),\n", + " ('match', 397),\n", + " ('feels', 397),\n", + " ('except', 397),\n", + " ('tony', 397),\n", + " ('filmed', 394),\n", + " ('complete', 394),\n", + " ('clear', 394),\n", + " ('older', 393),\n", + " ('street', 393),\n", + " ('lots', 393),\n", + " ('eventually', 393),\n", + " ('keeps', 393),\n", + " ('buy', 392),\n", + " ('stewart', 391),\n", + " ('william', 391),\n", + " ('joe', 390),\n", + " ('meet', 390),\n", + " ('fall', 390),\n", + " ('shots', 389),\n", + " ('talking', 389),\n", + " ('difficult', 389),\n", + " ('unlike', 389),\n", + " ('rating', 389),\n", + " ('means', 388),\n", + " ('dramatic', 388),\n", + " ('appears', 386),\n", + " ('subject', 386),\n", + " ('wonder', 386),\n", + " ('present', 386),\n", + " ('situation', 386),\n", + " ('comments', 385),\n", + " ('sequences', 383),\n", + " ('general', 383),\n", + " ('lee', 383),\n", + " ('earlier', 382),\n", + " ('points', 382),\n", + " ('check', 379),\n", + " ('gone', 379),\n", + " ('ten', 378),\n", + " ('suspense', 378),\n", + " ('recommended', 378),\n", + " ('business', 377),\n", + " ('third', 377),\n", + " ('talk', 375),\n", + " ('leaves', 375),\n", + " ('beyond', 375),\n", + " ('portrayal', 374),\n", + " ('beautifully', 373),\n", + " ('single', 372),\n", + " ('bill', 372),\n", + " ('word', 371),\n", + " ('plenty', 371),\n", + " ('falls', 370),\n", + " ('whom', 370),\n", + " ('figure', 369),\n", + " ('battle', 369),\n", + " ('scary', 369),\n", + " ('non', 369),\n", + " ('return', 368),\n", + " ('using', 368),\n", + " ('doubt', 
367),\n", + " ('add', 367),\n", + " ('hear', 366),\n", + " ('solid', 366),\n", + " ('success', 366),\n", + " ('touching', 365),\n", + " ('political', 365),\n", + " ('oh', 365),\n", + " ('jokes', 365),\n", + " ('awesome', 364),\n", + " ('hell', 364),\n", + " ('boys', 364),\n", + " ('dog', 362),\n", + " ('recently', 362),\n", + " ('sexual', 362),\n", + " ('please', 361),\n", + " ('wouldn', 361),\n", + " ('features', 361),\n", + " ('straight', 361),\n", + " ('lack', 360),\n", + " ('forget', 360),\n", + " ('setting', 360),\n", + " ('mark', 359),\n", + " ('married', 359),\n", + " ('social', 357),\n", + " ('adventure', 356),\n", + " ('interested', 356),\n", + " ('brothers', 355),\n", + " ('sees', 355),\n", + " ('actual', 355),\n", + " ('terrific', 355),\n", + " ('move', 354),\n", + " ('call', 354),\n", + " ('various', 353),\n", + " ('dr', 353),\n", + " ('theater', 353),\n", + " ('animated', 352),\n", + " ('western', 351),\n", + " ('space', 350),\n", + " ('baby', 350),\n", + " ('leading', 348),\n", + " ('disappointed', 348),\n", + " ('portrayed', 346),\n", + " ('aren', 346),\n", + " ('screenplay', 345),\n", + " ('smith', 345),\n", + " ('hate', 344),\n", + " ('towards', 344),\n", + " ('noir', 343),\n", + " ('outstanding', 342),\n", + " ('decent', 342),\n", + " ('kelly', 342),\n", + " ('directors', 341),\n", + " ('journey', 341),\n", + " ('none', 340),\n", + " ('effective', 340),\n", + " ('looked', 340),\n", + " ('caught', 339),\n", + " ('cold', 339),\n", + " ('storyline', 339),\n", + " ('fi', 339),\n", + " ('sci', 339),\n", + " ('mary', 339),\n", + " ('rich', 338),\n", + " ('charming', 338),\n", + " ('harry', 337),\n", + " ('popular', 337),\n", + " ('manages', 337),\n", + " ('rare', 337),\n", + " ('spirit', 336),\n", + " ('open', 335),\n", + " ('appreciate', 335),\n", + " ('basically', 334),\n", + " ('moves', 334),\n", + " ('acted', 334),\n", + " ('deserves', 333),\n", + " ('subtle', 333),\n", + " ('mention', 333),\n", + " ('inside', 333),\n", + " ('pace', 333),\n", + " 
('century', 333),\n", + " ('boring', 333),\n", + " ('familiar', 332),\n", + " ('background', 332),\n", + " ('ben', 331),\n", + " ('creepy', 330),\n", + " ('supposed', 330),\n", + " ('secret', 329),\n", + " ('jim', 328),\n", + " ('die', 328),\n", + " ('question', 327),\n", + " ('effect', 327),\n", + " ('natural', 327),\n", + " ('rate', 326),\n", + " ('language', 326),\n", + " ('impressive', 326),\n", + " ('intelligent', 325),\n", + " ('saying', 325),\n", + " ('material', 324),\n", + " ('realize', 324),\n", + " ('telling', 324),\n", + " ('scott', 324),\n", + " ('singing', 323),\n", + " ('dancing', 322),\n", + " ('adult', 321),\n", + " ('imagine', 321),\n", + " ('visual', 321),\n", + " ('kept', 320),\n", + " ('office', 320),\n", + " ('uses', 319),\n", + " ('pure', 318),\n", + " ('wait', 318),\n", + " ('stunning', 318),\n", + " ('copy', 317),\n", + " ('review', 317),\n", + " ('previous', 317),\n", + " ('seriously', 317),\n", + " ('somehow', 316),\n", + " ('created', 316),\n", + " ('magic', 316),\n", + " ('create', 316),\n", + " ('hot', 316),\n", + " ('reading', 316),\n", + " ('crazy', 315),\n", + " ('air', 315),\n", + " ('frank', 315),\n", + " ('stay', 315),\n", + " ('escape', 315),\n", + " ('attempt', 315),\n", + " ('hands', 314),\n", + " ('filled', 313),\n", + " ('surprisingly', 312),\n", + " ('expected', 312),\n", + " ('average', 312),\n", + " ('complex', 311),\n", + " ('studio', 310),\n", + " ('successful', 310),\n", + " ('quickly', 310),\n", + " ('male', 309),\n", + " ('plus', 309),\n", + " ('co', 307),\n", + " ('minute', 306),\n", + " ('images', 306),\n", + " ('casting', 306),\n", + " ('exciting', 306),\n", + " ('following', 306),\n", + " ('members', 305),\n", + " ('german', 305),\n", + " ('e', 305),\n", + " ('reasons', 305),\n", + " ('follows', 305),\n", + " ('themes', 305),\n", + " ('touch', 304),\n", + " ('genius', 304),\n", + " ('free', 304),\n", + " ('edge', 304),\n", + " ('cute', 304),\n", + " ('outside', 303),\n", + " ('ok', 302),\n", + " ('admit', 
302),\n", + " ('younger', 302),\n", + " ('reviews', 302),\n", + " ('odd', 301),\n", + " ('fighting', 301),\n", + " ('master', 301),\n", + " ('break', 300),\n", + " ('thanks', 300),\n", + " ('recent', 300),\n", + " ('comment', 300),\n", + " ('apart', 299),\n", + " ('lovely', 298),\n", + " ('begin', 298),\n", + " ('emotions', 298),\n", + " ('doctor', 297),\n", + " ('italian', 297),\n", + " ('party', 297),\n", + " ('la', 296),\n", + " ('missed', 296),\n", + " ...]" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "positive_counts.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "pos_neg_ratios = Counter()\n", + "\n", + "for term,cnt in list(total_counts.most_common()):\n", + " if(cnt > 100):\n", + " pos_neg_ratio = positive_counts[term] / float(negative_counts[term]+1)\n", + " pos_neg_ratios[term] = pos_neg_ratio\n", + "\n", + "for word,ratio in pos_neg_ratios.most_common():\n", + " if(ratio > 1):\n", + " pos_neg_ratios[word] = np.log(ratio)\n", + " else:\n", + " pos_neg_ratios[word] = -np.log((1 / (ratio+0.01)))" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('edie', 4.6913478822291435),\n", + " ('paulie', 4.0775374439057197),\n", + " ('felix', 3.1527360223636558),\n", + " ('polanski', 2.8233610476132043),\n", + " ('matthau', 2.8067217286092401),\n", + " ('victoria', 2.6810215287142909),\n", + " ('mildred', 2.6026896854443837),\n", + " ('gandhi', 2.5389738710582761),\n", + " ('flawless', 2.451005098112319),\n", + " ('superbly', 2.2600254785752498),\n", + " ('perfection', 2.1594842493533721),\n", + " ('astaire', 2.1400661634962708),\n", + " ('captures', 2.0386195471595809),\n", + " ('voight', 2.0301704926730531),\n", + " ('wonderfully', 2.0218960560332353),\n", + " ('powell', 1.9783454248084671),\n", 
+ " ('brosnan', 1.9547990964725592),\n", + " ('lily', 1.9203768470501485),\n", + " ('bakshi', 1.9029851043382795),\n", + " ('lincoln', 1.9014583864844796),\n", + " ('refreshing', 1.8551812956655511),\n", + " ('breathtaking', 1.8481124057791867),\n", + " ('bourne', 1.8478489358790986),\n", + " ('lemmon', 1.8458266904983307),\n", + " ('delightful', 1.8002701588959635),\n", + " ('flynn', 1.7996646487351682),\n", + " ('andrews', 1.7764919970972666),\n", + " ('homer', 1.7692866133759964),\n", + " ('beautifully', 1.7626953362841438),\n", + " ('soccer', 1.7578579175523736),\n", + " ('elvira', 1.7397031072720019),\n", + " ('underrated', 1.7197859696029656),\n", + " ('gripping', 1.7165360479904674),\n", + " ('superb', 1.7091514458966952),\n", + " ('delight', 1.6714733033535532),\n", + " ('welles', 1.6677068205580761),\n", + " ('sadness', 1.663505133704376),\n", + " ('sinatra', 1.6389967146756448),\n", + " ('touching', 1.637217476541176),\n", + " ('timeless', 1.62924053973028),\n", + " ('macy', 1.6211339521972916),\n", + " ('unforgettable', 1.6177367152487956),\n", + " ('favorites', 1.6158688027643908),\n", + " ('stewart', 1.6119987332957739),\n", + " ('hartley', 1.6094379124341003),\n", + " ('sullivan', 1.6094379124341003),\n", + " ('extraordinary', 1.6094379124341003),\n", + " ('brilliantly', 1.5950491749820008),\n", + " ('friendship', 1.5677652160335325),\n", + " ('wonderful', 1.5645425925262093),\n", + " ('palma', 1.5553706911638245),\n", + " ('magnificent', 1.54663701119507),\n", + " ('finest', 1.5462590108125689),\n", + " ('jackie', 1.5439233053234738),\n", + " ('ritter', 1.5404450409471491),\n", + " ('tremendous', 1.5184661342283736),\n", + " ('freedom', 1.5091151908062312),\n", + " ('fantastic', 1.5048433868558566),\n", + " ('terrific', 1.5026699370083942),\n", + " ('noir', 1.493925025312256),\n", + " ('sidney', 1.493925025312256),\n", + " ('outstanding', 1.4910053152089213),\n", + " ('mann', 1.4894785973551214),\n", + " ('pleasantly', 1.4894785973551214),\n", + " 
('nancy', 1.488077055429833),\n", + " ('marie', 1.4825711915553104),\n", + " ('marvelous', 1.4739999415389962),\n", + " ('excellent', 1.4647538505723599),\n", + " ('ruth', 1.4596256342054401),\n", + " ('stanwyck', 1.4412101187160054),\n", + " ('widmark', 1.4350845252893227),\n", + " ('splendid', 1.4271163556401458),\n", + " ('chan', 1.423108334242607),\n", + " ('exceptional', 1.4201959127955721),\n", + " ('tender', 1.410986973710262),\n", + " ('gentle', 1.4078005663408544),\n", + " ('poignant', 1.4022947024663317),\n", + " ('gem', 1.3932148039644643),\n", + " ('amazing', 1.3919815802404802),\n", + " ('chilling', 1.3862943611198906),\n", + " ('captivating', 1.3862943611198906),\n", + " ('fisher', 1.3862943611198906),\n", + " ('davies', 1.3862943611198906),\n", + " ('darker', 1.3652409519220583),\n", + " ('april', 1.3499267169490159),\n", + " ('kelly', 1.3461743673304654),\n", + " ('blake', 1.3418425985490567),\n", + " ('overlooked', 1.329135947279942),\n", + " ('ralph', 1.32818673031261),\n", + " ('bette', 1.3156767939059373),\n", + " ('hoffman', 1.3150668518315229),\n", + " ('cole', 1.3121863889661687),\n", + " ('shines', 1.3049487216659381),\n", + " ('powerful', 1.2999662776313934),\n", + " ('notch', 1.2950456896547455),\n", + " ('remarkable', 1.2883688239495823),\n", + " ('pitt', 1.286210902562908),\n", + " ('winters', 1.2833463918674481),\n", + " ('vivid', 1.2762934659055623),\n", + " ('gritty', 1.2757524867200667),\n", + " ('giallo', 1.2745029551317739),\n", + " ('portrait', 1.2704625455947689),\n", + " ('innocence', 1.2694300209805796),\n", + " ('psychiatrist', 1.2685113254635072),\n", + " ('favorite', 1.2668956297860055),\n", + " ('ensemble', 1.2656663733312759),\n", + " ('stunning', 1.2622417124499117),\n", + " ('burns', 1.259880436264232),\n", + " ('garbo', 1.258954938743289),\n", + " ('barbara', 1.2580400255962119),\n", + " ('panic', 1.2527629684953681),\n", + " ('holly', 1.2527629684953681),\n", + " ('philip', 1.2527629684953681),\n", + " ('carol', 
1.2481440226390734),\n", + " ('perfect', 1.246742480713785),\n", + " ('appreciated', 1.2462482874741743),\n", + " ('favourite', 1.2411123512753928),\n", + " ('journey', 1.2367626271489269),\n", + " ('rural', 1.235471471385307),\n", + " ('bond', 1.2321436812926323),\n", + " ('builds', 1.2305398317106577),\n", + " ('brilliant', 1.2287554137664785),\n", + " ('brooklyn', 1.2286654169163074),\n", + " ('von', 1.225175011976539),\n", + " ('unfolds', 1.2163953243244932),\n", + " ('recommended', 1.2163953243244932),\n", + " ('daniel', 1.20215296760895),\n", + " ('perfectly', 1.1971931173405572),\n", + " ('crafted', 1.1962507582320256),\n", + " ('prince', 1.1939224684724346),\n", + " ('troubled', 1.192138346678933),\n", + " ('consequences', 1.1865810616140668),\n", + " ('haunting', 1.1814999484738773),\n", + " ('cinderella', 1.180052620608284),\n", + " ('alexander', 1.1759989522835299),\n", + " ('emotions', 1.1753049094563641),\n", + " ('boxing', 1.1735135968412274),\n", + " ('subtle', 1.1734135017508081),\n", + " ('curtis', 1.1649873576129823),\n", + " ('rare', 1.1566438362402944),\n", + " ('loved', 1.1563661500586044),\n", + " ('daughters', 1.1526795099383853),\n", + " ('courage', 1.1438688802562305),\n", + " ('dentist', 1.1426722784621401),\n", + " ('highly', 1.1420208631618658),\n", + " ('nominated', 1.1409146683587992),\n", + " ('tony', 1.1397491942285991),\n", + " ('draws', 1.1325138403437911),\n", + " ('everyday', 1.1306150197542835),\n", + " ('contrast', 1.1284652518177909),\n", + " ('cried', 1.1213405397456659),\n", + " ('fabulous', 1.1210851445201684),\n", + " ('ned', 1.120591195386885),\n", + " ('fay', 1.120591195386885),\n", + " ('emma', 1.1184149159642893),\n", + " ('sensitive', 1.113318436057805),\n", + " ('smooth', 1.1089750757036563),\n", + " ('dramas', 1.1080910326226534),\n", + " ('today', 1.1050431789984001),\n", + " ('helps', 1.1023091505494358),\n", + " ('inspiring', 1.0986122886681098),\n", + " ('jimmy', 1.0937696641923216),\n", + " ('awesome', 
1.0931328229034842),\n", + " ('unique', 1.0881409888008142),\n", + " ('tragic', 1.0871835928444868),\n", + " ('intense', 1.0870514662670339),\n", + " ('stellar', 1.0857088838322018),\n", + " ('rival', 1.0822184788924332),\n", + " ('provides', 1.0797081340289569),\n", + " ('depression', 1.0782034170369026),\n", + " ('shy', 1.0775588794702773),\n", + " ('carrie', 1.076139432816051),\n", + " ('blend', 1.0753554265038423),\n", + " ('hank', 1.0736109864626924),\n", + " ('diana', 1.0726368022648489),\n", + " ('adorable', 1.0726368022648489),\n", + " ('unexpected', 1.0722255334949147),\n", + " ('achievement', 1.0668635903535293),\n", + " ('bettie', 1.0663514264498881),\n", + " ('happiness', 1.0632729222228008),\n", + " ('glorious', 1.0608719606852626),\n", + " ('davis', 1.0541605260972757),\n", + " ('terrifying', 1.0525211814678428),\n", + " ('beauty', 1.050410186850232),\n", + " ('ideal', 1.0479685558493548),\n", + " ('fears', 1.0467872208035236),\n", + " ('hong', 1.0438040521731147),\n", + " ('seasons', 1.0433496099930604),\n", + " ('fascinating', 1.0414538748281612),\n", + " ('carries', 1.0345904299031787),\n", + " ('satisfying', 1.0321225473992768),\n", + " ('definite', 1.0319209141694374),\n", + " ('touched', 1.0296194171811581),\n", + " ('greatest', 1.0248947127715422),\n", + " ('creates', 1.0241097613701886),\n", + " ('aunt', 1.023388867430522),\n", + " ('walter', 1.022328983918479),\n", + " ('spectacular', 1.0198314108149955),\n", + " ('portrayal', 1.0189810189761024),\n", + " ('ann', 1.0127808528183286),\n", + " ('enterprise', 1.0116009116784799),\n", + " ('musicals', 1.0096648026516135),\n", + " ('deeply', 1.0094845087721023),\n", + " ('incredible', 1.0061677561461084),\n", + " ('mature', 1.0060195018402847),\n", + " ('triumph', 0.99682959435816731),\n", + " ('margaret', 0.99682959435816731),\n", + " ('navy', 0.99493385919326827),\n", + " ('harry', 0.99176919305006062),\n", + " ('lucas', 0.990398704027877),\n", + " ('sweet', 0.98966110487955483),\n", + " 
('joey', 0.98794672078059009),\n", + " ('oscar', 0.98721905111049713),\n", + " ('balance', 0.98649499054740353),\n", + " ('warm', 0.98485340331145166),\n", + " ('ages', 0.98449898190068863),\n", + " ('glover', 0.98082925301172619),\n", + " ('guilt', 0.98082925301172619),\n", + " ('carrey', 0.98082925301172619),\n", + " ('learns', 0.97881108885548895),\n", + " ('unusual', 0.97788374278196932),\n", + " ('sons', 0.97777581552483595),\n", + " ('complex', 0.97761897738147796),\n", + " ('essence', 0.97753435711487369),\n", + " ('brazil', 0.9769153536905899),\n", + " ('widow', 0.97650959186720987),\n", + " ('solid', 0.97537964824416146),\n", + " ('beautiful', 0.97326301262841053),\n", + " ('holmes', 0.97246100334120955),\n", + " ('awe', 0.97186058302896583),\n", + " ('vhs', 0.97116734209998934),\n", + " ('eerie', 0.97116734209998934),\n", + " ('lonely', 0.96873720724669754),\n", + " ('grim', 0.96873720724669754),\n", + " ('sport', 0.96825047080486615),\n", + " ('debut', 0.96508089604358704),\n", + " ('destiny', 0.96343751029985703),\n", + " ('thrillers', 0.96281074750904794),\n", + " ('tears', 0.95977584381389391),\n", + " ('rose', 0.95664202739772253),\n", + " ('feelings', 0.95551144502743635),\n", + " ('ginger', 0.95551144502743635),\n", + " ('winning', 0.95471810900804055),\n", + " ('stanley', 0.95387344302319799),\n", + " ('cox', 0.95343027882361187),\n", + " ('paris', 0.95278479030472663),\n", + " ('heart', 0.95238806924516806),\n", + " ('hooked', 0.95155887071161305),\n", + " ('comfortable', 0.94803943018873538),\n", + " ('mgm', 0.94446160884085151),\n", + " ('masterpiece', 0.94155039863339296),\n", + " ('themes', 0.94118828349588235),\n", + " ('danny', 0.93967118051821874),\n", + " ('anime', 0.93378388932167222),\n", + " ('perry', 0.93328830824272613),\n", + " ('joy', 0.93301752567946861),\n", + " ('lovable', 0.93081883243706487),\n", + " ('hal', 0.92953595862417571),\n", + " ('mysteries', 0.92953595862417571),\n", + " ('louis', 0.92871325187271225),\n", + " 
('charming', 0.92520609553210742),\n", + " ('urban', 0.92367083917177761),\n", + " ('allows', 0.92183091224977043),\n", + " ('impact', 0.91815814604895041),\n", + " ('gradually', 0.91629073187415511),\n", + " ('lifestyle', 0.91629073187415511),\n", + " ('italy', 0.91629073187415511),\n", + " ('spy', 0.91289514287301687),\n", + " ('treat', 0.91193342650519937),\n", + " ('subsequent', 0.91056005716517008),\n", + " ('kennedy', 0.90981821736853763),\n", + " ('loving', 0.90967549275543591),\n", + " ('surprising', 0.90937028902958128),\n", + " ('quiet', 0.90648673177753425),\n", + " ('winter', 0.90624039602065365),\n", + " ('reveals', 0.90490540964902977),\n", + " ('raw', 0.90445627422715225),\n", + " ('funniest', 0.90078654533818991),\n", + " ('pleased', 0.89994159387262562),\n", + " ('norman', 0.89994159387262562),\n", + " ('thief', 0.89874642222324552),\n", + " ('season', 0.89827222637147675),\n", + " ('secrets', 0.89794159320595857),\n", + " ('colorful', 0.89705936994626756),\n", + " ('highest', 0.8967461358011849),\n", + " ('compelling', 0.89462923509297576),\n", + " ('danes', 0.89248008318043659),\n", + " ('castle', 0.88967708335606499),\n", + " ('kudos', 0.88889175768604067),\n", + " ('great', 0.88810470901464589),\n", + " ('baseball', 0.88730319500090271),\n", + " ('subtitles', 0.88730319500090271),\n", + " ('bleak', 0.88730319500090271),\n", + " ('winner', 0.88643776872447388),\n", + " ('tragedy', 0.88563699078315261),\n", + " ('todd', 0.88551907320740142),\n", + " ('nicely', 0.87924946019380601),\n", + " ('arthur', 0.87546873735389985),\n", + " ('essential', 0.87373111745535925),\n", + " ('gorgeous', 0.8731725250935497),\n", + " ('fonda', 0.87294029100054127),\n", + " ('eastwood', 0.87139541196626402),\n", + " ('focuses', 0.87082835779739776),\n", + " ('enjoyed', 0.87070195951624607),\n", + " ('natural', 0.86997924506912838),\n", + " ('intensity', 0.86835126958503595),\n", + " ('witty', 0.86824103423244681),\n", + " ('rob', 0.8642954367557748),\n", + " 
('worlds', 0.86377269759070874),\n", + " ('health', 0.86113891179907498),\n", + " ('magical', 0.85953791528170564),\n", + " ('deeper', 0.85802182375017932),\n", + " ('lucy', 0.85618680780444956),\n", + " ('moving', 0.85566611005772031),\n", + " ('lovely', 0.85290640004681306),\n", + " ('purple', 0.8513711857748395),\n", + " ('memorable', 0.84801189112086062),\n", + " ('sings', 0.84729786038720367),\n", + " ('craig', 0.84342938360928321),\n", + " ('modesty', 0.84342938360928321),\n", + " ('relate', 0.84326559685926517),\n", + " ('episodes', 0.84223712084137292),\n", + " ('strong', 0.84167135777060931),\n", + " ('smith', 0.83959811108590054),\n", + " ('tear', 0.83704136022001441),\n", + " ('apartment', 0.83333115290549531),\n", + " ('princess', 0.83290912293510388),\n", + " ('disagree', 0.83290912293510388),\n", + " ('kung', 0.83173334384609199),\n", + " ('adventure', 0.83150561393278388),\n", + " ('columbo', 0.82667857318446791),\n", + " ('jake', 0.82667857318446791),\n", + " ('adds', 0.82485652591452319),\n", + " ('hart', 0.82472353834866463),\n", + " ('strength', 0.82417544296634937),\n", + " ('realizes', 0.82360006895738058),\n", + " ('dave', 0.8232003088081431),\n", + " ('childhood', 0.82208086393583857),\n", + " ('forbidden', 0.81989888619908913),\n", + " ('tight', 0.81883539572344199),\n", + " ('surreal', 0.8178506590609026),\n", + " ('manager', 0.81770990320170756),\n", + " ('dancer', 0.81574950265227764),\n", + " ('con', 0.81093021621632877),\n", + " ('studios', 0.81093021621632877),\n", + " ('miike', 0.80821651034473263),\n", + " ('realistic', 0.80807714723392232),\n", + " ('explicit', 0.80792269515237358),\n", + " ('kurt', 0.8060875917405409),\n", + " ('traditional', 0.80535917116687328),\n", + " ('deals', 0.80535917116687328),\n", + " ('holds', 0.80493858654806194),\n", + " ('carl', 0.80437281567016972),\n", + " ('touches', 0.80396154690023547),\n", + " ('gene', 0.80314807577427383),\n", + " ('albert', 0.8027669055771679),\n", + " ('abc', 
0.80234647252493729),\n", + " ('cry', 0.80011930011211307),\n", + " ('sides', 0.7995275841185171),\n", + " ('develops', 0.79850769621777162),\n", + " ('eyre', 0.79850769621777162),\n", + " ('dances', 0.79694397424158891),\n", + " ('oscars', 0.79633141679517616),\n", + " ('legendary', 0.79600456599965308),\n", + " ('importance', 0.79492987486988764),\n", + " ('hearted', 0.79492987486988764),\n", + " ('portraying', 0.79356592830699269),\n", + " ('impressed', 0.79258107754813223),\n", + " ('waters', 0.79112758892014912),\n", + " ('empire', 0.79078565012386137),\n", + " ('edge', 0.789774016249017),\n", + " ('environment', 0.78845736036427028),\n", + " ('jean', 0.78845736036427028),\n", + " ('sentimental', 0.7864791203521645),\n", + " ('captured', 0.78623760362595729),\n", + " ('styles', 0.78592891401091158),\n", + " ('daring', 0.78592891401091158),\n", + " ('backgrounds', 0.78275933924963248),\n", + " ('frank', 0.78275933924963248),\n", + " ('matches', 0.78275933924963248),\n", + " ('tense', 0.78275933924963248),\n", + " ('gothic', 0.78209466657644144),\n", + " ('sharp', 0.7814397877056235),\n", + " ('achieved', 0.78015855754957497),\n", + " ('court', 0.77947526404844247),\n", + " ('steals', 0.7789140023173704),\n", + " ('rules', 0.77844476107184035),\n", + " ('colors', 0.77684619943659217),\n", + " ('reunion', 0.77318988823348167),\n", + " ('covers', 0.77139937745969345),\n", + " ('tale', 0.77010822169607374),\n", + " ('rain', 0.7683706017975328),\n", + " ('denzel', 0.76804848873306297),\n", + " ('stays', 0.76787072675588186),\n", + " ('blob', 0.76725515271366718),\n", + " ('conventional', 0.76214005204689672),\n", + " ('maria', 0.76214005204689672),\n", + " ('fresh', 0.76158434211317383),\n", + " ('midnight', 0.76096977689870637),\n", + " ('landscape', 0.75852993982279704),\n", + " ('animated', 0.75768570169751648),\n", + " ('titanic', 0.75666058628227129),\n", + " ('sunday', 0.75666058628227129),\n", + " ('spring', 0.7537718023763802),\n", + " ('cagney', 
0.7537718023763802),\n", + " ('enjoyable', 0.75246375771636476),\n", + " ('immensely', 0.75198768058287868),\n", + " ('sir', 0.7507762933965817),\n", + " ('nevertheless', 0.75067102469813185),\n", + " ('driven', 0.74994477895307854),\n", + " ('performances', 0.74883252516063137),\n", + " ('memories', 0.74721440183022114),\n", + " ('nowadays', 0.74721440183022114),\n", + " ('simple', 0.74641420974143258),\n", + " ('golden', 0.74533293373051557),\n", + " ('leslie', 0.74533293373051557),\n", + " ('lovers', 0.74497224842453125),\n", + " ('relationship', 0.74484232345601786),\n", + " ('supporting', 0.74357803418683721),\n", + " ('che', 0.74262723782331497),\n", + " ('packed', 0.7410032017375805),\n", + " ('trek', 0.74021469141793106),\n", + " ('provoking', 0.73840377214806618),\n", + " ('strikes', 0.73759894313077912),\n", + " ('depiction', 0.73682224406260699),\n", + " ('emotional', 0.73678211645681524),\n", + " ('secretary', 0.7366322924996842),\n", + " ('influenced', 0.73511137965897755),\n", + " ('florida', 0.73511137965897755),\n", + " ('germany', 0.73288750920945944),\n", + " ('brings', 0.73142936713096229),\n", + " ('lewis', 0.73129894652432159),\n", + " ('elderly', 0.73088750854279239),\n", + " ('owner', 0.72743625403857748),\n", + " ('streets', 0.72666987259858895),\n", + " ('henry', 0.72642196944481741),\n", + " ('portrays', 0.72593700338293632),\n", + " ('bears', 0.7252354951114458),\n", + " ('china', 0.72489587887452556),\n", + " ('anger', 0.72439972406404984),\n", + " ('society', 0.72433010799663333),\n", + " ('available', 0.72415741730250549),\n", + " ('best', 0.72347034060446314),\n", + " ('bugs', 0.72270598280148979),\n", + " ('magic', 0.71878961117328299),\n", + " ('verhoeven', 0.71846498854423513),\n", + " ('delivers', 0.71846498854423513),\n", + " ('jim', 0.71783979315031676),\n", + " ('donald', 0.71667767797013937),\n", + " ('endearing', 0.71465338578090898),\n", + " ('relationships', 0.71393795022901896),\n", + " ('greatly', 
0.71256526641704687),\n", + " ('charlie', 0.71024161391924534),\n", + " ('brad', 0.71024161391924534),\n", + " ('simon', 0.70967648251115578),\n", + " ('effectively', 0.70914752190638641),\n", + " ('march', 0.70774597998109789),\n", + " ('atmosphere', 0.70744773070214162),\n", + " ('influence', 0.70733181555190172),\n", + " ('genius', 0.706392407309966),\n", + " ('emotionally', 0.70556970055850243),\n", + " ('ken', 0.70526854109229009),\n", + " ('identity', 0.70484322032313651),\n", + " ('sophisticated', 0.70470800296102132),\n", + " ('dan', 0.70457587638356811),\n", + " ('andrew', 0.70329955202396321),\n", + " ('india', 0.70144598337464037),\n", + " ('roy', 0.69970458110610434),\n", + " ('surprisingly', 0.6995780708902356),\n", + " ('sky', 0.69780919366575667),\n", + " ('romantic', 0.69664981111114743),\n", + " ('match', 0.69566924999265523),\n", + " ('britain', 0.69314718055994529),\n", + " ('beatty', 0.69314718055994529),\n", + " ('affected', 0.69314718055994529),\n", + " ('cowboy', 0.69314718055994529),\n", + " ('wave', 0.69314718055994529),\n", + " ('stylish', 0.69314718055994529),\n", + " ('bitter', 0.69314718055994529),\n", + " ('patient', 0.69314718055994529),\n", + " ('meets', 0.69314718055994529),\n", + " ('love', 0.69198533541937324),\n", + " ('paul', 0.68980827929443067),\n", + " ('andy', 0.68846333124751902),\n", + " ('performance', 0.68797386327972465),\n", + " ('patrick', 0.68645819240914863),\n", + " ('unlike', 0.68546468438792907),\n", + " ('brooks', 0.68433655087779044),\n", + " ('refuses', 0.68348526964820844),\n", + " ('award', 0.6824518914431974),\n", + " ('complaint', 0.6824518914431974),\n", + " ('ride', 0.68229716453587952),\n", + " ('dawson', 0.68171848473632257),\n", + " ('luke', 0.68158635815886937),\n", + " ('wells', 0.68087708796813096),\n", + " ('france', 0.6804081547825156),\n", + " ('handsome', 0.68007509899259255),\n", + " ('sports', 0.68007509899259255),\n", + " ('rebel', 0.67875844310784572),\n", + " ('directs', 
0.67875844310784572),\n", + " ('greater', 0.67605274720064523),\n", + " ('dreams', 0.67599410133369586),\n", + " ('effective', 0.67565402311242806),\n", + " ('interpretation', 0.67479804189174875),\n", + " ('works', 0.67445504754779284),\n", + " ('brando', 0.67445504754779284),\n", + " ('noble', 0.6737290947028437),\n", + " ('paced', 0.67314651385327573),\n", + " ('le', 0.67067432470788668),\n", + " ('master', 0.67015766233524654),\n", + " ('h', 0.6696166831497512),\n", + " ('rings', 0.66904962898088483),\n", + " ('easy', 0.66895995494594152),\n", + " ('city', 0.66820823221269321),\n", + " ('sunshine', 0.66782937257565544),\n", + " ('succeeds', 0.66647893347778397),\n", + " ('relations', 0.664159643686693),\n", + " ('england', 0.66387679825983203),\n", + " ('glimpse', 0.66329421741026418),\n", + " ('aired', 0.66268797307523675),\n", + " ('sees', 0.66263163663399482),\n", + " ('both', 0.66248336767382998),\n", + " ('definitely', 0.66199789483898808),\n", + " ('imaginative', 0.66139848224536502),\n", + " ('appreciate', 0.66083893732728749),\n", + " ('tricks', 0.66071190480679143),\n", + " ('striking', 0.66071190480679143),\n", + " ('carefully', 0.65999497324304479),\n", + " ('complicated', 0.65981076029235353),\n", + " ('perspective', 0.65962448852130173),\n", + " ('trilogy', 0.65877953705573755),\n", + " ('future', 0.65834665141052828),\n", + " ('lion', 0.65742909795786608),\n", + " ('victor', 0.65540685257709819),\n", + " ('douglas', 0.65540685257709819),\n", + " ('inspired', 0.65459851044271034),\n", + " ('marriage', 0.65392646740666405),\n", + " ('demands', 0.65392646740666405),\n", + " ('father', 0.65172321672194655),\n", + " ('page', 0.65123628494430852),\n", + " ('instant', 0.65058756614114943),\n", + " ('era', 0.6495567444850836),\n", + " ('ruthless', 0.64934455790155243),\n", + " ('saga', 0.64934455790155243),\n", + " ('joan', 0.64891392558311978),\n", + " ('joseph', 0.64841128671855386),\n", + " ('workers', 0.64829661439459352),\n", + " ('fantasy', 
0.64726757480925168),\n", + " ('accomplished', 0.64551913157069074),\n", + " ('distant', 0.64551913157069074),\n", + " ('manhattan', 0.64435701639051324),\n", + " ('personal', 0.64355023942057321),\n", + " ('pushing', 0.64313675998528386),\n", + " ('meeting', 0.64313675998528386),\n", + " ('individual', 0.64313675998528386),\n", + " ('pleasant', 0.64250344774119039),\n", + " ('brave', 0.64185388617239469),\n", + " ('william', 0.64083139119578469),\n", + " ('hudson', 0.64077919504262937),\n", + " ('friendly', 0.63949446706762514),\n", + " ('eccentric', 0.63907995928966954),\n", + " ('awards', 0.63875310849414646),\n", + " ('jack', 0.63838309514997038),\n", + " ('seeking', 0.63808740337691783),\n", + " ('colonel', 0.63757732940513456),\n", + " ('divorce', 0.63757732940513456),\n", + " ('jane', 0.63443957973316734),\n", + " ('keeping', 0.63414883979798953),\n", + " ('gives', 0.63383568159497883),\n", + " ('ted', 0.63342794585832296),\n", + " ('animation', 0.63208692379869902),\n", + " ('progress', 0.6317782341836532),\n", + " ('concert', 0.63127177684185776),\n", + " ('larger', 0.63127177684185776),\n", + " ('nation', 0.6296337748376194),\n", + " ('albeit', 0.62739580299716491),\n", + " ('adapted', 0.62613647027698516),\n", + " ('discovers', 0.62542900650499444),\n", + " ('classic', 0.62504956428050518),\n", + " ('segment', 0.62335141862440335),\n", + " ('morgan', 0.62303761437291871),\n", + " ('mouse', 0.62294292188669675),\n", + " ('impressive', 0.62211140744319349),\n", + " ('artist', 0.62168821657780038),\n", + " ('ultimate', 0.62168821657780038),\n", + " ('griffith', 0.62117368093485603),\n", + " ('emily', 0.62082651898031915),\n", + " ('drew', 0.62082651898031915),\n", + " ('moved', 0.6197197120051281),\n", + " ('profound', 0.61903920840622351),\n", + " ('families', 0.61903920840622351),\n", + " ('innocent', 0.61851219917136446),\n", + " ('versions', 0.61730910416844087),\n", + " ('eddie', 0.61691981517206107),\n", + " ('criticism', 0.61651395453902935),\n", + " 
('nature', 0.61594514653194088),\n", + " ('recognized', 0.61518563909023349),\n", + " ('sexuality', 0.61467556511845012),\n", + " ('contract', 0.61400986000122149),\n", + " ('brian', 0.61344043794920278),\n", + " ('remembered', 0.6131044728864089),\n", + " ('determined', 0.6123858239154869),\n", + " ('offers', 0.61207935747116349),\n", + " ('pleasure', 0.61195702582993206),\n", + " ('washington', 0.61180154110599294),\n", + " ('images', 0.61159731359583758),\n", + " ('games', 0.61067095873570676),\n", + " ('academy', 0.60872983874736208),\n", + " ('fashioned', 0.60798937221963845),\n", + " ('melodrama', 0.60749173598145145),\n", + " ('peoples', 0.60613580357031549),\n", + " ('charismatic', 0.60613580357031549),\n", + " ('rough', 0.60613580357031549),\n", + " ('dealing', 0.60517840761398811),\n", + " ('fine', 0.60496962268013299),\n", + " ('tap', 0.60391604683200273),\n", + " ('trio', 0.60157998703445481),\n", + " ('russell', 0.60120968523425966),\n", + " ('figures', 0.60077386042893011),\n", + " ('ward', 0.60005675749393339),\n", + " ('shine', 0.59911823091166894),\n", + " ('brady', 0.59911823091166894),\n", + " ('job', 0.59845562125168661),\n", + " ('satisfied', 0.59652034487087369),\n", + " ('river', 0.59637962862495086),\n", + " ('brown', 0.595773016534769),\n", + " ('believable', 0.59566072133302495),\n", + " ('bound', 0.59470710774669278),\n", + " ('always', 0.59470710774669278),\n", + " ('hall', 0.5933967777928858),\n", + " ('cook', 0.5916777203950857),\n", + " ('claire', 0.59136448625000293),\n", + " ('broadway', 0.59033768669372433),\n", + " ('anna', 0.58778666490211906),\n", + " ('peace', 0.58628403501758408),\n", + " ('visually', 0.58539431926349916),\n", + " ('falk', 0.58525821854876026),\n", + " ('morality', 0.58525821854876026),\n", + " ('growing', 0.58466653756587539),\n", + " ('experiences', 0.58314628534561685),\n", + " ('stood', 0.58314628534561685),\n", + " ('touch', 0.58122926435596001),\n", + " ('lives', 0.5810976767513224),\n", + " ('kubrick', 
0.58066919713325493),\n", + " ('timing', 0.58047401805583243),\n", + " ('struggles', 0.57981849525294216),\n", + " ('expressions', 0.57981849525294216),\n", + " ('authentic', 0.57848427223980559),\n", + " ('helen', 0.57763429343810091),\n", + " ('pre', 0.57700753064729182),\n", + " ('quirky', 0.5753641449035618),\n", + " ('young', 0.57531672344534313),\n", + " ('inner', 0.57454143815209846),\n", + " ('mexico', 0.57443087372056334),\n", + " ('clint', 0.57380042292737909),\n", + " ('sisters', 0.57286101468544337),\n", + " ('realism', 0.57226528899949558),\n", + " ('personalities', 0.5720692490067093),\n", + " ('french', 0.5720692490067093),\n", + " ('surprises', 0.57113222999698177),\n", + " ('adventures', 0.57113222999698177),\n", + " ('overcome', 0.5697681593994407),\n", + " ('timothy', 0.56953322459276867),\n", + " ('tales', 0.56909453188996639),\n", + " ('war', 0.56843317302781682),\n", + " ('civil', 0.5679840376059393),\n", + " ('countries', 0.56737779327091187),\n", + " ('streep', 0.56710645966458029),\n", + " ('tradition', 0.56685345523565323),\n", + " ('oliver', 0.56673325570428668),\n", + " ('australia', 0.56580775818334383),\n", + " ('understanding', 0.56531380905006046),\n", + " ('players', 0.56509525370004821),\n", + " ('knowing', 0.56489284503626647),\n", + " ('rogers', 0.56421349718405212),\n", + " ('suspenseful', 0.56368911332305849),\n", + " ('variety', 0.56368911332305849),\n", + " ('true', 0.56281525180810066),\n", + " ('jr', 0.56220982311246936),\n", + " ('psychological', 0.56108745854687891),\n", + " ('branagh', 0.55961578793542266),\n", + " ('wealth', 0.55961578793542266),\n", + " ('performing', 0.55961578793542266),\n", + " ('odds', 0.55961578793542266),\n", + " ('sent', 0.55961578793542266),\n", + " ('reminiscent', 0.55961578793542266),\n", + " ('grand', 0.55961578793542266),\n", + " ('overwhelming', 0.55961578793542266),\n", + " ('brothers', 0.55891181043362848),\n", + " ('howard', 0.55811089675600245),\n", + " ('david', 
0.55693122256475369),\n", + " ('generation', 0.55628799784274796),\n", + " ('grow', 0.55612538299565417),\n", + " ('survival', 0.55594605904646033),\n", + " ('mainstream', 0.55574731115750231),\n", + " ('dick', 0.55431073570572953),\n", + " ('charm', 0.55288175575407861),\n", + " ('kirk', 0.55278982286502287),\n", + " ('twists', 0.55244729845681018),\n", + " ('gangster', 0.55206858230003986),\n", + " ('jeff', 0.55179306225421365),\n", + " ('family', 0.55116244510065526),\n", + " ('tend', 0.55053307336110335),\n", + " ('thanks', 0.55049088015842218),\n", + " ('world', 0.54744234723432639),\n", + " ('sutherland', 0.54743536937855164),\n", + " ('life', 0.54695514434959924),\n", + " ('disc', 0.54654370636806993),\n", + " ('bug', 0.54654370636806993),\n", + " ('tribute', 0.5455111817538808),\n", + " ('europe', 0.54522705048332309),\n", + " ('sacrifice', 0.54430155296238014),\n", + " ('color', 0.54405127139431109),\n", + " ('superior', 0.54333490233128523),\n", + " ('york', 0.54318235866536513),\n", + " ('pulls', 0.54266622962164945),\n", + " ('hearts', 0.54232429082536171),\n", + " ('jackson', 0.54232429082536171),\n", + " ('enjoy', 0.54124285135906114),\n", + " ('redemption', 0.54056759296472823),\n", + " ('madness', 0.540384426007535),\n", + " ('hamilton', 0.5389965007326869),\n", + " ('stands', 0.5389965007326869),\n", + " ('trial', 0.5389965007326869),\n", + " ('greek', 0.5389965007326869),\n", + " ('each', 0.5388212312554177),\n", + " ('faithful', 0.53773307668591508),\n", + " ('received', 0.5372768098531604),\n", + " ('jealous', 0.53714293208336406),\n", + " ('documentaries', 0.53714293208336406),\n", + " ('different', 0.53709860682460819),\n", + " ('describes', 0.53680111016925136),\n", + " ('shorts', 0.53596159703753288),\n", + " ('brilliance', 0.53551823635636209),\n", + " ('mountains', 0.53492317534505118),\n", + " ('share', 0.53408248593025787),\n", + " ('dealt', 0.53408248593025787),\n", + " ('providing', 0.53329847961804933),\n", + " ('explore', 
0.53329847961804933),\n", + " ('series', 0.5325809226575603),\n", + " ('fellow', 0.5323318289869543),\n", + " ('loves', 0.53062825106217038),\n", + " ('olivier', 0.53062825106217038),\n", + " ('revolution', 0.53062825106217038),\n", + " ('roman', 0.53062825106217038),\n", + " ('century', 0.53002783074992665),\n", + " ('musical', 0.52966871156747064),\n", + " ('heroic', 0.52925932545482868),\n", + " ('ironically', 0.52806743020049673),\n", + " ('approach', 0.52806743020049673),\n", + " ('temple', 0.52806743020049673),\n", + " ('moves', 0.5279372642387119),\n", + " ('gift', 0.52702030968597136),\n", + " ('julie', 0.52609309589677911),\n", + " ('tells', 0.52415107836314001),\n", + " ('radio', 0.52394671172868779),\n", + " ('uncle', 0.52354439617376536),\n", + " ('union', 0.52324814376454787),\n", + " ('deep', 0.52309571635780505),\n", + " ('reminds', 0.52157841554225237),\n", + " ('famous', 0.52118841080153722),\n", + " ('jazz', 0.52053443789295151),\n", + " ('dennis', 0.51987545928590861),\n", + " ('epic', 0.51919387343650736),\n", + " ('adult', 0.519167695083386),\n", + " ('shows', 0.51915322220375304),\n", + " ('performed', 0.5191244265806858),\n", + " ('demons', 0.5191244265806858),\n", + " ('eric', 0.51879379341516751),\n", + " ('discovered', 0.51879379341516751),\n", + " ('youth', 0.5185626062681431),\n", + " ('human', 0.51851411224987087),\n", + " ('tarzan', 0.51813827061227724),\n", + " ('ourselves', 0.51794309153485463),\n", + " ('wwii', 0.51758240622887042),\n", + " ('passion', 0.5162164724008671),\n", + " ('desire', 0.51607497965213445),\n", + " ('pays', 0.51581316527702981),\n", + " ('fox', 0.51557622652458857),\n", + " ('dirty', 0.51557622652458857),\n", + " ('symbolism', 0.51546600332249293),\n", + " ('sympathetic', 0.51546600332249293),\n", + " ('attitude', 0.51530993621331933),\n", + " ('appearances', 0.51466440007315639),\n", + " ('jeremy', 0.51466440007315639),\n", + " ('fun', 0.51439068993048687),\n", + " ('south', 0.51420972175023116),\n", + " 
('arrives', 0.51409894911095988),\n", + " ('present', 0.51341965894303732),\n", + " ('com', 0.51326167856387173),\n", + " ('smile', 0.51265880484765169),\n", + " ('fits', 0.51082562376599072),\n", + " ('provided', 0.51082562376599072),\n", + " ('carter', 0.51082562376599072),\n", + " ('ring', 0.51082562376599072),\n", + " ('aging', 0.51082562376599072),\n", + " ('countryside', 0.51082562376599072),\n", + " ('alan', 0.51082562376599072),\n", + " ('visit', 0.51082562376599072),\n", + " ('begins', 0.51015650363396647),\n", + " ('success', 0.50900578704900468),\n", + " ('japan', 0.50900578704900468),\n", + " ('accurate', 0.50895471583017893),\n", + " ('proud', 0.50800474742434931),\n", + " ('daily', 0.5075946031845443),\n", + " ('atmospheric', 0.50724780241810674),\n", + " ('karloff', 0.50724780241810674),\n", + " ('recently', 0.50714914903668207),\n", + " ('fu', 0.50704490092608467),\n", + " ('horrors', 0.50656122497953315),\n", + " ('finding', 0.50637127341661037),\n", + " ('lust', 0.5059356384717989),\n", + " ('hitchcock', 0.50574947073413001),\n", + " ('among', 0.50334004951332734),\n", + " ('viewing', 0.50302139827440906),\n", + " ('shining', 0.50262885656181222),\n", + " ('investigation', 0.50262885656181222),\n", + " ('duo', 0.5020919437972361),\n", + " ('cameron', 0.5020919437972361),\n", + " ('finds', 0.50128303100539795),\n", + " ('contemporary', 0.50077528791248915),\n", + " ('genuine', 0.50046283673044401),\n", + " ('frightening', 0.49995595152908684),\n", + " ('plays', 0.49975983848890226),\n", + " ('age', 0.49941323171424595),\n", + " ('position', 0.49899116611898781),\n", + " ('continues', 0.49863035067217237),\n", + " ('roles', 0.49839716550752178),\n", + " ('james', 0.49837216269470402),\n", + " ('individuals', 0.49824684155913052),\n", + " ('brought', 0.49783842823917956),\n", + " ('hilarious', 0.49714551986191058),\n", + " ('brutal', 0.49681488669639234),\n", + " ('appropriate', 0.49643688631389105),\n", + " ('dance', 0.49581998314812048),\n", + " 
('league', 0.49578774640145024),\n", + " ('helping', 0.49578774640145024),\n", + " ('answers', 0.49578774640145024),\n", + " ('stunts', 0.49561620510246196),\n", + " ('traveling', 0.49532143723002542),\n", + " ('thoroughly', 0.49414593456733524),\n", + " ('depicted', 0.49317068852726992),\n", + " ('honor', 0.49247648509779424),\n", + " ('combination', 0.49247648509779424),\n", + " ('differences', 0.49247648509779424),\n", + " ('fully', 0.49213349075383811),\n", + " ('tracy', 0.49159426183810306),\n", + " ('battles', 0.49140753790888908),\n", + " ('possibility', 0.49112055268665822),\n", + " ('romance', 0.4901589869574316),\n", + " ('initially', 0.49002249613622745),\n", + " ('happy', 0.4898997500608791),\n", + " ('crime', 0.48977221456815834),\n", + " ('singing', 0.4893852925281213),\n", + " ('especially', 0.48901267837860624),\n", + " ('shakespeare', 0.48754793889664511),\n", + " ('hugh', 0.48729512635579658),\n", + " ('detail', 0.48609484250827351),\n", + " ('guide', 0.48550781578170082),\n", + " ('companion', 0.48550781578170082),\n", + " ('julia', 0.48550781578170082),\n", + " ('san', 0.48550781578170082),\n", + " ('desperation', 0.48550781578170082),\n", + " ('strongly', 0.48460242866688824),\n", + " ('necessary', 0.48302334245403883),\n", + " ('humanity', 0.48265474679929443),\n", + " ('drama', 0.48221998493060503),\n", + " ('warming', 0.48183808689273838),\n", + " ('intrigue', 0.48183808689273838),\n", + " ('nonetheless', 0.48183808689273838),\n", + " ('cuba', 0.48183808689273838),\n", + " ('planned', 0.47957308026188628),\n", + " ('pictures', 0.47929937011921681),\n", + " ('broadcast', 0.47849024312305422),\n", + " ('nine', 0.47803580094299974),\n", + " ('settings', 0.47743860773325364),\n", + " ('history', 0.47732966933780852),\n", + " ('ordinary', 0.47725880012690741),\n", + " ('trade', 0.47692407209030935),\n", + " ('primary', 0.47608267532211779),\n", + " ('official', 0.47608267532211779),\n", + " ('episode', 0.47529620261150429),\n", + " ('role', 
0.47520268270188676),\n", + " ('spirit', 0.47477690799839323),\n", + " ('grey', 0.47409361449726067),\n", + " ('ways', 0.47323464982718205),\n", + " ('cup', 0.47260441094579297),\n", + " ('piano', 0.47260441094579297),\n", + " ('familiar', 0.47241617565111949),\n", + " ('sinister', 0.47198579044972683),\n", + " ('reveal', 0.47171449364936496),\n", + " ('max', 0.47150852042515579),\n", + " ('dated', 0.47121648567094482),\n", + " ('discovery', 0.47000362924573563),\n", + " ('vicious', 0.47000362924573563),\n", + " ('losing', 0.47000362924573563),\n", + " ('genuinely', 0.46871413841586385),\n", + " ('hatred', 0.46734051182625186),\n", + " ('mistaken', 0.46702300110759781),\n", + " ('dream', 0.46608972992459924),\n", + " ('challenge', 0.46608972992459924),\n", + " ('crisis', 0.46575733836428446),\n", + " ('photographed', 0.46488852857896512),\n", + " ('machines', 0.46430560813109778),\n", + " ('critics', 0.46430560813109778),\n", + " ('bird', 0.46430560813109778),\n", + " ('born', 0.46411383518967209),\n", + " ('detective', 0.4636633473511525),\n", + " ('higher', 0.46328467899699055),\n", + " ('remains', 0.46262352194811296),\n", + " ('inevitable', 0.46262352194811296),\n", + " ('soviet', 0.4618180446592961),\n", + " ('ryan', 0.46134556650262099),\n", + " ('african', 0.46112595521371813),\n", + " ('smaller', 0.46081520319132935),\n", + " ('techniques', 0.46052488529119184),\n", + " ('information', 0.46034171833399862),\n", + " ('deserved', 0.45999798712841444),\n", + " ('cynical', 0.45953232937844013),\n", + " ('lynch', 0.45953232937844013),\n", + " ('francisco', 0.45953232937844013),\n", + " ('tour', 0.45953232937844013),\n", + " ('spielberg', 0.45953232937844013),\n", + " ('struggle', 0.45911782160048453),\n", + " ('language', 0.45902121257712653),\n", + " ('visual', 0.45823514408822852),\n", + " ('warner', 0.45724137763188427),\n", + " ('social', 0.45720078250735313),\n", + " ('reality', 0.45719346885019546),\n", + " ('hidden', 0.45675840249571492),\n", + " 
('breaking', 0.45601738727099561),\n", + " ('sometimes', 0.45563021171182794),\n", + " ('modern', 0.45500247579345005),\n", + " ('surfing', 0.45425527227759638),\n", + " ('popular', 0.45410691533051023),\n", + " ('surprised', 0.4534409399850382),\n", + " ('follows', 0.45245361754408348),\n", + " ('keeps', 0.45234869400701483),\n", + " ('john', 0.4520909494482197),\n", + " ('defeat', 0.45198512374305722),\n", + " ('mixed', 0.45198512374305722),\n", + " ('justice', 0.45142724367280018),\n", + " ('treasure', 0.45083371313801535),\n", + " ('presents', 0.44973793178615257),\n", + " ('years', 0.44919197032104968),\n", + " ('chief', 0.44895022004790319),\n", + " ('shadows', 0.44802472252696035),\n", + " ('closely', 0.44701411102103689),\n", + " ('segments', 0.44701411102103689),\n", + " ('lose', 0.44658335503763702),\n", + " ('caine', 0.44628710262841953),\n", + " ('caught', 0.44610275383999071),\n", + " ('hamlet', 0.44558510189758965),\n", + " ('chinese', 0.44507424620321018),\n", + " ('welcome', 0.44438052435783792),\n", + " ('birth', 0.44368632092836219),\n", + " ('represents', 0.44320543609101143),\n", + " ('puts', 0.44279106572085081),\n", + " ('fame', 0.44183275227903923),\n", + " ('closer', 0.44183275227903923),\n", + " ('visuals', 0.44183275227903923),\n", + " ('web', 0.44183275227903923),\n", + " ('criminal', 0.4412745608048752),\n", + " ('minor', 0.4409224199448939),\n", + " ('jon', 0.44086703515908027),\n", + " ('liked', 0.44074991514020723),\n", + " ('restaurant', 0.44031183943833246),\n", + " ('flaws', 0.43983275161237217),\n", + " ('de', 0.43983275161237217),\n", + " ('searching', 0.4393666597838457),\n", + " ('rap', 0.43891304217570443),\n", + " ('light', 0.43884433018199892),\n", + " ('elizabeth', 0.43872232986464677),\n", + " ('marry', 0.43861731542506488),\n", + " ('oz', 0.43825493093115531),\n", + " ('controversial', 0.43825493093115531),\n", + " ('learned', 0.43825493093115531),\n", + " ('slowly', 0.43785660389939979),\n", + " ('bridge', 
0.43721380642274466),\n", + " ('thrilling', 0.43721380642274466),\n", + " ('wayne', 0.43721380642274466),\n", + " ('comedic', 0.43721380642274466),\n", + " ('married', 0.43658501682196887),\n", + " ('nazi', 0.4361020775700542),\n", + " ('murder', 0.4353180712578455),\n", + " ('physical', 0.4353180712578455),\n", + " ('johnny', 0.43483971678806865),\n", + " ('michelle', 0.43445264498141672),\n", + " ('wallace', 0.43403848055222038),\n", + " ('silent', 0.43395706390247063),\n", + " ('comedies', 0.43395706390247063),\n", + " ('played', 0.43387244114515305),\n", + " ('international', 0.43363598507486073),\n", + " ('vision', 0.43286408229627887),\n", + " ('intelligent', 0.43196704885367099),\n", + " ('shop', 0.43078291609245434),\n", + " ('also', 0.43036720209769169),\n", + " ('levels', 0.4302451371066513),\n", + " ('miss', 0.43006426712153217),\n", + " ('ocean', 0.4295626596872249),\n", + " ...]" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"POSITIVE\" label\n", + "pos_neg_ratios.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('boll', -4.0778152602708904),\n", + " ('uwe', -3.9218753018711578),\n", + " ('seagal', -3.3202501058581921),\n", + " ('unwatchable', -3.0269848170580955),\n", + " ('stinker', -2.9876839403711624),\n", + " ('mst', -2.7753833211707968),\n", + " ('incoherent', -2.7641396677532537),\n", + " ('unfunny', -2.5545257844967644),\n", + " ('waste', -2.4907515123361046),\n", + " ('blah', -2.4475792789485005),\n", + " ('horrid', -2.3715779644809971),\n", + " ('pointless', -2.3451073877136341),\n", + " ('atrocious', -2.3187369339642556),\n", + " ('redeeming', -2.2667790015910296),\n", + " ('prom', -2.2601040980178784),\n", + " ('drivel', -2.2476029585766928),\n", + " ('lousy', -2.2118080125207054),\n", + 
" ('worst', -2.1930856334332267),\n", + " ('laughable', -2.172468615469592),\n", + " ('awful', -2.1385076866397488),\n", + " ('poorly', -2.1326133844207011),\n", + " ('wasting', -2.1178155545614512),\n", + " ('remotely', -2.111046881095167),\n", + " ('existent', -2.0024805005437076),\n", + " ('boredom', -1.9241486572738005),\n", + " ('miserably', -1.9216610938019989),\n", + " ('sucks', -1.9166645809588516),\n", + " ('uninspired', -1.9131499212248517),\n", + " ('lame', -1.9117232884159072),\n", + " ('insult', -1.9085323769376259)]" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"NEGATIVE\" label\n", + "list(reversed(pos_neg_ratios.most_common()))[0:30]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Transforming Text into Numbers" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
ITBiD8G9077fkJiJ5O56gR2HhXNgC4ltRSO/m48J5w7/z6hcuL+53Ezo7TveizvMGkFXy5M1S\nJ29wWB6cv0jZUzJxOjq/keXLl1v/kbRvlnHl6vzgIJDnfydP3jwI77XXXp3s3aYCOokSHhxxxBGd\nlElfaCEjzn+PzFE+ZlHnSEvAQCcXXnjhMP8LXrIdSXFpivomAJoTLCG+fv60DGkOOuggl9RgPUGv\nrJJmmixNHbmJSJaOdwoSzc2Zl7gRcKRBYJXOzOTSQkT8myXKx4JpDRcxDmchJI9+ru4k30V2dpL6\n8qZhXjYc8S9vmf4/Zdqy8uRNW5dLj+UBX4x+Tsm4ut238wc577zzrC6OGLnr+hYCvRDI87+TJ28v\nvbpdf9/73te5/OMf/7hznPfAd+TEZ8ONA5TLy60fNZWoq05cBFF+48fn5+PY+fa59FHfLKZwgzzT\nzZ/61KeikkWe86f2IxOETrrpGXfa6Rc1LYP/iHshZ2xFL//Zzzjsj53hKKtEq3biynG/C/sOzDK5\nhKU+gTKdj7+cNWj0sNgSQSM6dYWXDJEvvO6a374EJshOPdTpL/UML99lyRKSVT90de3y20SZcdf8\n8/7yXadHcJN0ykQvJ36+cJtJQ/1OFzDwxZ3nO6ynny587JZlhs+n/e0v7UIH+oG+TSqk9dtHGZSZ\nVVgemWZZcvDgGPrGN76Rtbpc+aKW87Jcl/P9FnAAO+4Lt0QXHMMfAqGRJvBRqETPfuNSdH0O37zl\n1u3/jvs2sOYlbpb/P8+zMiz+czvueeA/+xhrnPjPUz9N+Nh/BpM/fL3bb1dfeNzplsevD12j0ro0\nfvtJFyU+hq4sfzmvnydcnksf/ga7sPjjlj/mhtPl+R3dwpQlZul4n1TQUDd4+f9gYVCS3izhzsii\nn58nPMDHXcva2X55eYiIu6nczdytG4t6IEb9M6AHDxf6kutRH66Rxunsf4fx7taO8DUCbKXZYI3B\nl4G/34N/N8LRL32oB7yIawH+fDuiAS5RH9Jz70BQyJMnIFq47wbhN5imIcpxmNTt/y5tu8LPcvf8\nd+31n6VpiQhlxz1b3HMm6hnj1+nSue8w4XBEhG9/oHbp+WaMYyxy5yjDlygd3bM7rIufzx2DmSvb\nfXcjCnH3jMvLOOTaFVdHuJ9curzfhRCRtB2PtcI1nm//puh2jVopbAYAACaLSURBVMaGO8gvh2M6\nNwxWWv2oxycHvn69rmXpbL+utESk282MrnGS9sERVw5YR+kQ7pekv6P6L67uqPNpIjz6Az549Euo\nCwtEN0E3yEoZgjXDEQmIB7976ROnB21xAdEgJRCVrGXF1dGm8/QpOOWVuv3fpX0BoP3+cyM8gPrP\n+bRExGHLs9ivg2cQ5CDqGevycI363POKZzO6cN6d49sfYxizfMLh8lBmHIHhWjgf5VIX4ref83Hi\nt89/oY9LT51hndA3PMa5/PSLa3dcP7i0eb7jW5ih1KQd74MHCGEJW0vCLA0w/TQA1Q1MV35S/UhP\nea4Dwp3U7Rp503a2X17UPwn1O11oty/dbmY/XfiYgS6NKTWc3/+NDn6fOl3TflMGZeURBlgG1iQS\nJh/h30nKSJPGTX8kzZM2fa9yaR+DYFmEwREc7iusJpJoBPi/KIKs1en/DkILGUkj/mAbfq6lKacf\naf1nMAP+oIhPsBxJKqPthRKRMhRUmeUhwIBU5Fs3/6w+qUpKRMgTJntZW530IR9FOnwLSdb64/Ll\nsXCga56Bi7oJvw1BSDtYxLWn23n0hRByf0Xh3C3vIFxLQ5aT4FGH/7ssfY1VwU1rJHmbT4JF1jT+\nixTPI//ll5dDpyfPF9IOgtA/7hkOJmXKKAoPKpMMIAJXXHGFjXzKio0iBe/05557zgZ5Y437L37x\ni2HF/8Vf/IUhWixLnj/ykY8MW/I2LGHKHxs2bLDr93utBHDxQ4KBe
UQNxPLgfJEhy6MCl42ouMeJ\nrGWAybnnnmumT59u5s+fX2i7eqhsWJIcvOkaljWyZYPk9wjcfPPNNvzAjTfeWCgkVf3f8f9EAL6A\n8KZuDytSXETSgFCVGnujm3K+Ht3ScS2wDGSKl9Sr3Lpd9zEpvc1lshyVXW8E2KiqqA24qm4p0wKz\nZ8+2/gq9dOn1lt7req/y/etYnPJYM/yy0lpV8N3ACpJ0qsqvq6hjdOYe41MUDkXpVkU5YMAqrdGj\nR7cGjyz+IT72zhpR9lu3X2fUse8bEpCCjjXAP677FFJUu7Kc861V/bAAaWomSy+1JA8PRQYqBoum\nC235q7/6K/uQh0jEkYm48377KauIKSvqoqwihfKStIE5ewb/ItpRhP7oU/RUYBF69aMM+oA+4+P6\nAyyqJIhFtjtvW/B1cYN9UVO0WduHH4TvF+H0wsEzyn8vaz11z0c/0HampPxpqrL01tRMgPYgC9Mz\nSNFm4n5j+vDDD5tbbrllWKRDoqU6IQCQm26JmpJx6dx3t+kblybu2wUpK3O/mG6b5tGnRHRcu3Zt\np81xuvbzPHqxWRtTZ20OY8+9409TRG1uyLTVI488MmxH8X72RVF1cR9OnTp1WHuLKlvlDA4CIiKD\n09eRLXXzu8GbWq0GrUhlu5w89NBDrQ/EaaedFpkKchBMRdldKknAXjO9CEmWsO/gSV39GGij/Ebq\nSkJcpzgy0vT7zbXHffukN8m9xT0CQSFfr/vQ1VHH79NPP90cffTR5tJLL62jetKpIQiIiDSko8pU\nk8GBEL9NfZhgDbnmmmvM5s2bY2EKk4okb60UFs4XW0FwIYoYdEtfxDWf+OAEedddd5mNGzfWmlTW\nnSwl6Rf6Opgm6yTNYv2iv9gjhNDgTRT+N7CGtI1UNrEvmq6ziEjTe7AA/RnMeIvDnNy0tzPeLI86\n6iizcOFCc/zxx0eiQfuQbm2LG1gon/y9LBw8lKNM8JEKFXwSHbH2/PM//7N5+umne+pacPWZimP3\n0MCHpTGracCYAddJEX1NmZSzZs0au+rEld2Ub1lDmtJT9ddTRKT+fdQXDdnxeMuWLY17O0uidxqr\nhgObPE7+93//13z0ox+NtDK4ASrLG7ErP++3G9BuvfVWEzc1lbeOovND7sDs3nvvjSWQRdeZtjz/\nHsDHqBcZTVs+6fEVWbRoUe2tWOG2Ob27WSHDefRbCMQhICISh8yAnWcww7IwZ84cU3RckbKgZKDA\nNMx3nLWDa3lJAthE+ZcwmHKtjAEqDWZMdbCV+tKlS9Nkqzytm1Kry1QS/ek7mea9b5ICjGUhWHnS\nGOsQ1kMsWk215CTtF6XrHwIiIv3DuvY18YDBVIwJuurBtRdYSawADCxIHEnpVYd/nfooD1z4JmDb\nu9/9bhPEg6hsSgb93KDQjYz57ajbcdXmfXBzksTJ1KUt8pv7CdLTBIsW/wdTpkyxLwBN9Skrsu9U\nVjEIiIgUg2NrSuEt9ZJLLqn1Ekv3MAwCIHVddlyENcTvWEdseGv2fQTi/Ev8vGUdVz2Q520XfdRP\nh8cq+6obVi4CLkt6ua/rKjNnzrTWt6Y62NYV10HXS0Rk0O+AiPbXeVWDIyF77rlnV3+WokkIMFE3\nUzS9pq78t+yyfAvQx1lDmr5qoUwyRZ8V7WQK9mXIv//7v9upUVbS1NEiWefnQhn9oTL7h4CISP+w\nblRN7qGzePHi2jwUHQkByG7BupzloogpGddplEn9DBBpSE54ICzS/E8fsV9P0/dxcVYR3z/D4Z7l\nu19EMItucXnc/bV169ZaWiTd86Db/11c23ReCPRCQESkF0IDfJ2HT10iYfKg/vSnP232228/u8rA\nRUmN6p40RCEqf/gclgfqc8QGcoE+Wd5ayecPuP4UT7jebr/Rgby01enVLX3drxGQrtsS7G76hzHt\nl5NpN53SXEN/xPWjmx6tg88I9
xk+IYhIiIVBf0pA4E//XyAllKsiW4BAsNmRHfhZTbPvvvsaBosq\n5MEHH7R+BGeccYYdrN75znfGqlEGCWGA2GuvvTp1Uj9Levnupksng3cAocEq4j7btm0zP/7xjy2x\n+fWvfz2sHi/biMNvf/vbxr09j7jYwBPvete7zLPPPmu453oJg+Mrr7xiMWMQZ/oLUuYw7ZW/TtfD\nJATdDjzwQPu/NmvWLPPb3/7WfPzjH69EZf6XWA7O0vXbb789cvl6JYqp0tYhIItI67q0+AZhETjz\nzDPN/vvvb4gG6d7ciq9peIkMOCwnfuyxx6wVhCWO3awQUQ/14SWm+8WDuJvFomjSQ3t9f4Zu0zhY\nq5ocDTfcE/QdlgzfWuSn8Z1My/S78ess+7jX/cp1rIDIggULci9DT9oe7sOvfvWrlnxcd911PX2i\nkpardEIgFoGydtNTue1CgF1fb7rpJrsjIzupBgNGaQ10dQWEZ2jGjBmdHWw5HwzUsfUm2ZU2NrN3\ngXqSlpU0nVd84kMwpnz3QS8n7HjaDQuXrknffptcH0S1vUltitOVvk36P8T/Hf8LZf/foWvgjG3r\nmjZtWmL94tqo80IgKQImaUKlEwIgwMOTB2LAbO13kYMhZbuHLg/CqEHeDVDh3ohKG06T5Dc6pGkT\n6fn0Q9CLdr700ksW/37U2c86zj333KGrrrrKtjFNH/RTxyLqynLPcN/7/3dF3e+0h7L5v4MI8lm/\nfn0RzVQZQiAxAn8SayrRBSEQgQDTMjfeeGPHhE6ERT5M2TBVkVYwuRMumjKYiti+fbuN2Eicgiin\nQ3wsOE9dmJARTNjkzSvognSb/gnXAR7o4XQJXy/yN3rR9t/97ncmIGpFFl2Lsg4//HDzZ3/2Z7aN\nafqgFsonVKLXdExcMdz37v+OKTn8R/DZYouDLP936IFTLHFBmOrC52bJkiV248i4PZvidNN5IZAX\nAfmI5EVQ+Q3BmPDjCN6k7H41DJJjx461PgzAwxJTHnY7duywaO3atcume/755+3vyZMnm1NOOcVM\nmjQplUMcxIEHdPCGGUlabOEJ//Aw7+YP0qsY8kcRp175slyHuL366qtdg7llKbfqPGCIL0Rbg2Vl\nJSFx/cL/HRF+2eiQD5sIsqpswoQJnSzjx483r7/+uv0d9393xBFH9M3vq6OYDoSAh4CIiAeGDvMj\ngGUgMKvbFR08+BCsHOyF4j8gJ06caK0YWBTyyDe/+U3zvve9L5UVw6/P6ZuXRFAOA00/3uSxPiFt\nC7HdZiJSNAnx72GO3X3MSqpu/3cQk7333rsv92lYR/0WAnEIiIjEIaPztUfAPdyxikB+0pIJ8vMA\nL4o8YKGBWKFPmdJWIkJfYDkLJpbLhK/vZbv7NC/p7rviqlAI9AkB+Yj0CWhVUzwCTMm4gR8Swhs1\ng1kSyeIP0qtcCA2ESJINgbIJXDat8uVy95lISD4clbvdCIiItLt/W9u6KJ8MyAhvn+4NNK7x5GVg\nKGNwcIQorm6dHxwEnIWsjPtscFBUSwcBARGRQejllrURohG3SsZNs7g3Ub/pWEscgSnz7RvdepEh\nXy8d/x4BMGvLoO1ISJn3me4bIdAWBERE2tKTA9QONyUT12Rn7YB0OGGQ45PWj8TlT/NN/ZCepNNE\nacpuc1r6FSfmpotISNN7UPr3G4F39LtC1ddeBBh48ZFgWa5bKhjVWre0Fw9+9tVI8xbsLBpR5frn\neBN10yR/+qd/av76r/+6MKdUv564YywzSXWNKyPuPMuhN27cGHe5seeDwFqN1d0pLhLikNC3EEiO\ngIhIcqyUMgIBrAxPPvmkDUrmYhkcdthhNobI3LlzI3IYu7SXJb2LFy82q1atMkE0Rxugi03t3NRK\nVEbqipuSiUrPOVZh/OY3v4m7XOp54pIwMHVrUxYFxo0bZ/HOkrfOeYh3wb3QVBEJaWrPSe+qEdD
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
uAvnnHFNwOHMTNt4N/jZBhX+cHnJMrbATVLUQSIlAIywivKHx0GOKglgi\nZ5xxho1t4Jw4Gdj9wd3HgAdm1dJNPxe3oZuOZbSXYFXEicgrrFBieqxICTvzpikbcsVbe90Fnw8G\nTawQzockqc6kdzsJJ83jpyMv8XUWLlzYCUgWhHzvGQsEAuULRCZYWmwee+wxax1Br9mzZ9vYJH66\nfh2LhPQLadUjBApGoMi1wEWWxV4sQVOHfQIi0qkiICcjQrSH0xOvwxc/fodflp+GY78cYntECfld\nunA97jzf4RDx/rVwvrg4ItSfpb1RertzxFQI3hrdz9Z8VxniPQuIgb9Q5vDqafdKcTFW6HdiyBQZ\nV4N2VLnXjOKEZLn7lEcI1AOB2k7N4FcRDNSxsS7wq1i3bp1NE+wZE4zvfxQiofKWniUK6h9LKeaI\nMOa8fWLNcYK+AdFKpV/R7XWm6zwrOVx76vS9atUqc+CBB9ZJpa664DdCX6R1YiUfH2cF6FYJlgsc\ngKdOnWqj6bL65tJLL+1pAelWZvgauhCldfPmzTZ0/DXXXGOnbfpxf7k63D0d1k2/hYAQqDcCo+BD\n9VZR2pWFAL4BbNde123a07Z7w4YNJtiAzQ6GafPWIT1kJK0Ta68pGq5DyPfff3/ry9HPwRrfkc9/\n/vMmsL5Y4lMGxpAQ2gQRkggBIdBMBGprEWkmnM3SGufD5cuXN0vpLto+99xz1uehS5JaX8rixNpt\nSS99ixVkzpw5dk+YfpIQgMbqgvVlzZo11iG2CAdbvwNFQnw0dCwEmouALCLN7bvcmjMw4BgazK8X\naqbPrVjGAljaysZtaZ0/M1ZXWjamW+ibpO1w0zM+0bjiiivMypUra4EHbYEMvfnmm2bt2rWFWC9E\nQkq7/VSwEOg7ArKI9B3y+lSIOZupjGXLltVHqYyasAx19OjRiQfvjNX0JRuEgk9SvxHSMtjzQSAh\nrCjDGpGUzJTZMO4zdvjFT2rKlCkdPbPWKRKSFTnlEwL1REAWkXr2S9+0YrDDfM+g1eR59g984APm\nkksuMTNmzOgbdv2oiP5J6jdCWhyjISFFWR6KbqMjSVn1EwkpukdUnhCoHgFZRKrvg0o1wMdg4sSJ\n5u67765UjzyVYw3B55qYJgzG/idPuXXIm8ZvhGBuTMd8/etfry2pvPHGG81+++1nzj///NTwioSk\nhkwZhEAjEJBFpBHdVK6SDNxYRfj2/QzKrbW40g899FC7ZJTlo2GhTb7gE1OH6QpfpyTHvfxGGKSx\nnNRlOqZbm5hCYorm2GOPNfPmzeuWtHNNJKQDhQ6EQOsQEBFpXZdmaxAm87ffftvO5WcroZpcLBFl\nVQZOqkmEQZDB2pekUx9+niqOne5YSXzhPMuwcQhtylJsiAXh4em7cHv8tnEsEhJGRL+FQLsQEBFp\nV39mbg2DGQPyrbfeWlmI7rTKF2UFoJzwjsi9Bse0uhaZHiuPT54IVrZlyxa7RLfIesoui+XFixYt\n6hr3JdzWsnVS+UJACPQfARGR/mNe2xp56DNF04QlsGVbAcJTOiwNrtO0FeTJORejW1OXYGMV4Z4j\n5khY6IM6E8KwvvotBIRANgRERLLh1tpcTHXcddddZuPGjZ2Bro6NZQBjOSjOj/0QfDQY7H3xrRL+\n+X4do9MXv/hFs++++yb2teiXbknrceQ3vGpLJCQpgkonBJqPgIhI8/uw8BbkXWJZuEKhAuuiX3hK\np9+OsBARrCHoccABB4RQas7P008/3e6B46wiIiHN6TtpKgSKQEBEpAgUW1hGXQZ7H1qmY9hMra5x\nMtAv7Ahb5pQOfYT0yyrk90WRxxAP9sNhwzyRkCKRVVlCoBkIiIg0o58q0ZKBbv369TZIVtVLXhnk\nWfKJZA2GVQWIUVM6Rfk9QHKa4M+TBHeWYF9wwQXm4osvTpJcaYSAEGgRAu9oUVvUlIIR4
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "\n", + "review = \"This was a horrible, terrible movie.\"\n", + "\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAi4AAAECCAYAAADZzFwPAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQVdV5/xdNZjIxjRgrM52qFI01ERQVExWNeMMLQy0YiEiNEgOYaJAO\nitIaGYo2TFGQeElQAREjRa0oDEG8AKagosYYkEuSjjUEbP+orZFc/KMzmfe3Pys+57fOfvfZZ1/P\nWXu/zzNz3rPP3uvyrO/a717f/axnPatfTyBGRRFQBBQBRUARUAQUgQog8CcV0FFVVAQUAUVAEVAE\nFAFFwCKgxEVvBEVAEVAEFAFFQBGoDAJKXCrTVaqoIqAIKAKKgCKgCChx0XtAEVAEFAFFQBFQBCqD\ngBKXynSVKqoIKAKKgCKgCCgCSlz0HlAEFAFFQBFQBBSByiCgxKUyXaWKKgKKgCKgCCgCioASF70H\nFAFFQBFQBBQBRaAyCHy8MpqqooqAItAVBH784x+bPXv2mJ07d5q9e/eat99+2+zYsaOXLuPGjTOH\nHHKIGTp0qBkyZIg59dRTzac//ele6fSEIqAIKAJ5EOinkXPzwKd5FYF6IrBp0yazYcMGs2rVKjNg\nwAAzcuRIc8IJJ5jBgwebgw8+2Hzuc59ravh//dd/mf/8z/807777rtm1a5d58cUX7Qcyc8kll5gv\nf/nLSmKaENMfioAikBUBJS5ZkdN8ikDNEPjtb39rli9fbh566CHbshkzZpgLLrjA/MVf/EWmllLe\nxo0bzfr1682yZcvMjTfeaG644YbM5WVSQjMpAopA7RBQH5fadak2SBFIj8A999xjPv/5z5stW7aY\nJUuWmO3bt5tJkyblIhlME1166aVm6dKl1hqDVocffriZOXOmwUKjoggoAopAFgSUuGRBTfMoAjVB\nAP+Vk046yaxZs8Z+nnzySfPFL36x8NZhtVmwYEGDwFDHihUrCq9HC1QEFIH6I6BTRfXvY22hIhCJ\nAFaW+fPnm3nz5lnrSmSikk5CmKZOnWqOOeYYOz2lTrwlAa3FKgI1REAtLjXsVG2SIhCHAL4nU6ZM\nsRaWzZs3d5y0oBsWl61bt5pBgwbZKapf/OIXcSrrNUVAEVAEGgioxaUBhR4oAvVHANIyZswYc+ih\nh3pj6WDK6JZbbjGQqPBqpfr3iLZQEVAE0iKgcVzSIqbpFYGKIiCkZdiwYdbfxJdm4ATMEuvzzjtP\nyYsvnaJ6KAIeI6DExePOUdUUgaIQ8JW0SPtYfYQoeRFE9FsRU
ARaIaDEpRUyel4RqBECc+fOta1h\nZY+vAnn5zW9+YyZMmGD9X9Rh19eeUr0Uge4ioD4u3cVfa1cESkdAfEh+/vOfVyJ6LY7DH3zwgWFp\ntooioAgoAmEEdFVRGBH9rQjUCAECveH4SpyWqlgwFi1aZPdD0jgvNboRtSmKQIEIqMWlQDC1KEXA\nNwTGjx9vTjzxRDN79mzfVIvVhzgvY8eONVWxEsU2Ri8qAopAoQgocSkUTi1MEfAHgaoP/mwNgPjs\nl+NPb6smikDfQUCJS9/pa21pH0MAaws7M7PcuIrCNBd7G7HrdNaNHqvYbtVZEVAE4hFQ4hKPj15V\nBCqJgFhbGPSrLGp1qXLvqe6KQDkIKHEpB1ctVRHoKgKszBk6dKiZPn16V/XIWzlWF7YHUF+XvEhq\nfkWgPgjoqqL69KW2RBGwCBBsbtmyZYapoqoLU0RsA7Bx48aqN0X1VwQUgYIQUOJSEJBajCLgCwIM\n8pMnT66NXwg+OuvXr/cFXtVDEVAEuoyAEpcud4BWrwgUjQCD/FlnnVV0sV0r74ILLrAWpK4poBUr\nAoqAVwgocfGqO1QZRSA/Ahs2bDCnn356/oI8KYHponPPPdfgcKyiCCgCioASF70HFIEaIYAzK4Jf\nSJ2EHa23bdtWpyZpWxQBRSAjAkpcMgKn2RQBHxFg+fPw4cN9VC2XTieccILZt29frjI0syKgCNQD\nASUu9ehHbYUiYBHAKjFo0KDaoTF48GCzd+/e2rVLG6QIKALpEVDikh4zzaEIeI3AwIEDvdZPlVME\nFAFFIA8CSlzyoKd5FQFFoCMIfP7znzerV6/uSF1aiSKgCPiNgBIXv/tHtVMEFIEAgU9/+tOKgyKg\nCCgCFgElLnojKAKKgCKgCCgCikBlEFDiUpmuUkUVgb6LwC9+8YvaRALuu72oLVcEikFAiUsxOGop\nikDXEWCPogMHDnRdjzIU+M1vflPLZd5lYKVlKgJ1R+DjdW+gts8PBIh6umfPHrNz5067rPXtt982\nO3bsaFKOCKnEIDnkkEPszsYcszOwSmsEICvsnIwcfPDB5vjjj6/lvj4QFxVFQBFQBEBAiYveB6Uh\nsGnTJrNq1SpDCPoBAwaYkSNHGgKJTZgwwQ6y4eiuRH0lgNq7775rdu3aZWbNmmVefPFFu2Hg6NGj\nbX510jRGcKLjICsuuWOAf+edd0rr024VvHv3bjNixIhuVa/1KgKKgEcI9OsJxCN9VJWKI4AFYPny\n5Wb+/PmWrMyYMcOwSR7WlCxCeex2vHLlShvyfeLEieaGG27IXF4WHXzI45KVww8/PLb9/fr1MxCY\nOpG88ePHmyuuuMJceumlPnSH6qAIKAJdREB9XLoIfp2qhmDcc889hngbW7ZsMWvWrDHbt283kyZN\nih1k22HA4Mtg9eSTTzY22WPgnjlzpqHOOgsOqUyxyeaCWFb4tCOBbEj4+uuv1woaIgKfdtpptWqT\nNkYRUASyIaDEJRtumstBgIH1rLPOahAWSIY7feEkzXXIgL1gwQI7nfTBBx9YkrRixYpcZfqWWYgK\n37Q3KVlx2wFxeeWVV9xTlT4GC6Ya2xG2SjdSlVcEFIHECOhUUWKoNGEUArfffru5//77zbx586x1\nJSpNWecY0KZOnWq+8IUvmEWLFlV2aoR2iBRB+LDU4EeExasOwj32P//zP+buu++uQ3O0DYqAIpAT\nASUuOQHsq9mZphkzZoxt/qOPPtq1t2H0wI8Gh9TFixebsMOvj/2DzrISCP2KICvhdp500klm4cKF\n5vzzzw9fqtxvpgbXrVtnPvWpT1WifysHsCqsCFQMAZ0qqliH+aCukJZhw4aZtWvXdo20gAU+MEuX\nLjVMj5x33nkGa4OPgnMtlhU+HMsUUBmkhfZ//etftyu6fMQijU5PP/20JSvca0wVudapNOVoWkVA\nEagPAmpxqU9fdqQlLmnB3
8QnYZCbNm2a2bx5sxdv5mlWAhWNI/3EUmmWl1fZNwTL0Zw5c5pWE0Fe\nyiJ8RfeDlqcIKALFI6DEpXhMa1uiz6RFQO82ecHiI8HS2i1bFp3L+IY0sST997//vbVIlVFH2WXS\nl3Pnzo301VHyUjb6Wr4i4C8CSlz87RvvNJsyZYphNQ+rhnwWnDkJXMc0VidimbjTFywH70SdcfhD\nntCBD/qwNL1qFgpIMivVwtYWt93g7gPerk56rAgoAuUjoMSlfIxrUQMxWh566CGzdevWrg/MSQAl\nYBlbB+D/Uoa4ZMUnUhAezFkuzoqrqvSb9BXkky0h2pFkSBpTYd0mi6K3fisCikD5CChxKR/jytfA\n4MCbLSthqrBqB8B5Y0fnRx55pJCVNZRX9kqgvDcKpCWKREHiTjzxRDN79uy8VXQkP+0YO3asdcRN\n4p+j5KUj3aKVKALeIKDExZuu8FcRBj72iZk+fbq/SkZoxl5JV111lSUcWd7IXbKCo6uvpE30jCIt\nwCKrmO67774mJ9cIyLp+KquuSl663nWqgCLQMQSUuHQM6mpWJA6SVZtqELTTki53JZDPZEXah74Q\nl3akSkicLyuuRH/3m3YQG4ilz1lWrEFeIKhJrDRuvXqsCCgC1UJAiUu1+qvj2kYtR+24EjkqZDAj\nvgvTPK2sLqTxYSVQ2mZCWpAkAzVpf/SjH5mbbrrJm+XibnuFtBx99NG5/JLSYOLWr8eKgCJQHQQ+\nXh1VVdNOI4C1BanyjrxYIthRmh2r3akul6zgC9POYtFp7NvVl9a6QDyXv/3bvzWf/OQnLZHzyfIi\npIU240icRyBxkBc+SQhdnro0ryKgCHQHAa8i5z7xxBOmX79+9sNgUzVx9acdrki7+H711VfdS22P\nTznllAYuS5YsaZu+qAQrV660y1GLKq9b5bBvDzFNcPqUD4MaPiF8WlliuqVvu3ppA/onHZhJL/4v\nkFB8XSBrQkzb1VfmdQiYTA9BporoC8GFslUUAUWgfgh4RVzqB291W8Qb6+rVq83IkSOr2whHc/a5\nYTqoqmRFmiIkJOkATz8SCM8VyMvrr79uowzPnDnT+si41zt1DHFiGo8VRFl8WuL0FGKn5CUOJb2m\nCFQTASUu1ey30rV+4YUXzOTJkwt5Ay5d2TYVQFa+/e1vmw0bNrRJ6e9lplOEtKTRslXIfzChvL17\n99pAbxx3SiBTBDNkewaC47lTeEXqALmDwCh5KRJVLUsR6D4CSlwK7IPLLrvM9PT0ND4FFt3xotiN\nd/To0YXWu2fPHsNU13XXXWf9TtzpMzlmWoxpwjvvvNM899xzDafZvIqcfvrpld10kIGej0z3JMWi\nHdFhUCfAG7trY/VgBVaZBAbyRSBD2kFwQBym07YpadslnZIXQUK/FYEaIRAMtN7I448/3hNAaz+X\nX3651evBBx9snOPaLbfc0rN///6WOm/bts2mkXL4PvLII3so58CBA5H53LTk53PhhRfaesmHJEnj\n6k96V8L5n3322UYdXKM+zkVJsDy0Ub/o46aLajP4oU9WQafgbT1r9qZ8Lp4uDkmO6bs77rijZd81\nVdTmRzBQ9wSDZZtUfl2mD7L0Q9p8wTRaz913390DRuPGjevZuHFjYUCA+Y033tgouxt9QPuC6bHC\n2qQFKQKKQPcQaB5du6eHrdkd+Bl4+UQNbgxmUeSFAS4qvZwj3+7du3u1Uq7zHS5DiEKSNK7+pHfF\nzQ/5cn+7x1KfmzeOuJDezR8+pq60wsASRFpNmy0yfTv9wvq2+g0GUX0eWWmLk8HUV89TTz3V4qp/\np+mHLKSFlmQdpBngH374Ydv/kBgIByQmrR7Uf9tttzWVk7aMMnokKy5l6KJlKgKKQDYEvF0O/dhj\njwVjWLQEA5iNR7Fq1apGAqYgbr755sbvqAPyXXzxxWbXrl2G4GJR0q4M8iRJE1W2nJs3b54
c9vq+\n5pprzAknnGCY2mgnTKWQPk6oa9CgQWbq1KlxyZquMaVzzDHHNJ3L8iOJfknLffPNN63PDWVmlaFD\nhxrugSoIUzas/EnqhOu2qd0UkZs2fEx9kyZNsh98Q8B78eLF1lGbbQO4L7ifBg4cGM5qtmzZYt5/\n/327weW5555r+CxcuLCQLRd6VZbxhPj2lD1FlVE9zaYIKAIJEPDaxyWYPjGBhcT6jATTPIbfIi6x\nYbUIm7KJEHkzmJ6w+QI+ZwKrg1yyA9cDDzzQ+B11QHry8Wk14CdJE1W2nAssEY06AkuNnLbf7K+T\nRNx2uVgxOAfWqkYRYANGSYX8hPjPK3fddVevItCTtvOhj8KfYLrMXqNt9KMrzz//vB1I3XNpjocM\nGWIH1zR5upFWiEcW0hK1iihrG4htg+MsfjD8L3Cfzpo1K5K0UMe1115rl52TlqXN7I10/vnnZ62+\ntHxCXvC5UVEEFIEKIhA8ZLyR8FRLMIA26YavRABx4yM+K+F8pAuLO+3ElJErbpmki5IkacJ6uOW4\n+YNB2b1kj8NTKtI2LkZNFTHl5ZYZxor8tFPSoFtSwdeBTx6hfqmb71bTdO3qCOPCVF5WYZoA/w1f\npQg/DJ0KSd67TMWBuYoioAhUCwFvLS68bR9xxBHBmPf/JfxbrAg7duxoJAoGyMhplq997WuNNFgU\nmA6JEuJKtJMkaeLKuOSSS3pdPvPMM5vOvfvuu02/wz+Y7hLBihHGhqmwK6+8UpKYX/3qV43jdgeY\n/LFO5JEwvsTpGDx4cOoisXi5lhemjOooWVcOuViIpcY9p8etEcCiBO5qeWmNkV5RBHxEwFsfl2OP\nPTYxXu+8804jbZgAyIX+/fvLof0W0tN0MvgRThe+zu8kaaLyybkwyeB82OemlX5SRmDRkEPDFArL\niePkgw8+iLvc61pYn14JUp6I8olIWgT3Ql0JCxgweCJ5th0ocorIKtNH/oA5vjwsDc8yNddHYNJm\nKgJeIeCtxcUrlFSZ3Ajs3LkzUxkQuJdffjlT3ipkkuBoDJx5JFixk3gLgDz11DGvWF6EQNaxjdom\nRaBOCNSCuLCjrEirQc61UJC2aIuC1J/kG4fjsIQtLFFWGTePa/XBETeYoYz9fOc733Gzxx6zaiQ8\n1RObIeJieFUUDsJp91liuuzv//7vm1YC5Z2mi1C1a6eY2oGw5CUtOkWUvwvF2qXkJT+WWoIiUDYC\n3k4VpWk4yzRF8F9hE8PwwBnEppAkBj+YLP4WjQJyHuBDctFFFzWV4hIu9GtHXI4//vhGfvJCfIoi\nY0zrhIleo7IUB6wyYSktQr+wdJsPROszn/mMOfnkk3uVxpQW00L//u//Hjk91GoqsFdBnp8oimx0\nYoqIOl577TXbh9y7iCx75hjiNXz4cA4N/4vcP/z/CRmwFyrwh3bQVj55yWQFmqsqKgKVRaAWxIXY\nLAz2DI7It771LfO9732vQV7Yp8ZdPu06rXaj58KxVdhV2o3HkkQ/iBdOqwzytPsrX/mKWbRoUYOQ\nUSYb6Akmwaoiw5YESaUI4sJeND/84Q8bOkjdbl/IuSTfLJHOQzixImFN6qbgCBqsZiks1D1TRGXE\nJGEKi3uIjTbfe+89S0xYIn/FFVc0SLXUy0CPHgjL27du3WrvRfKNGjXKbh3Bxo5VECEvtL9qxKsK\n+KqOikAhCPi0CMpdThy1LDkYhJuW2PJbJLxsNgCnKa38DghOr/Dxco3vVsuGk6Rx9Se9K27+dsdu\nuygjajk058P1tSqX/GmkyGXDbGMA5q10S3o+sN706rc0bSItkVzzLvNOW6ebnsixLMEtSspY+kxk\nYaImBwO4xStPHbSXKLxBIDpbHthzrgoSWDAL7asqtFl1VASqgkAtfFyCwc8GinMDsnEuLFhlCHBW\n1JRKuPykv4NYJC2TEpit3TSRZMaCQvo4wSqzdu3auCS
9rh1++OH2TbvXhQwnmBJj6TZtBv+0wrQS\nffb9738/d7+xbF6mNNLqkTc9VgmkqLd4yqOfipKnn37anHTSSWbu3Llmzpw51oJCADmxqmSpB+sF\nUXgJRscu0Pv27bM6s9Gi70uQWWGE/uI8naX9mkcRUATKQaAWU0UCDQ6omLOZh3fD6jNg8hCeMGFC\n7sFP6srzze7Hf/mXf2mjjMoyX2Kx3HDDDb18X9rVQ5wT/D7Wr1/ftBUBhOWb3/xmy8i/ceXywMZX\noShzOUTxpptush+ma8SfhwEtLDhaM52DnwQkoyiSyUDJtMfy5cvDVZb+GxxlICyqMqZm8pAK0QPd\nmEp9++23LWEpa0oHXflwjxONd/78+YYI0T5G1hVspM+K+j+QcvVbEVAE8iHQD9NQviI0dx0RwD8G\n8sAgUwfZtGmTgdhGkaUy24cTbtY9h1rpVZRj74oVK+x2GITxv/rqqzsax4T+uOqqq6wPDL5ZPsdQ\nKdovqVW/6nlFQBFIhkBtpoqSNVdTJUUAp0rM+3UQBsmVK1faaYtOtkcIRpGDclFTRBBTplbpY8hp\nkTomwRhLC07KrCIbM2aM11MyYIO1iP5UUQQUge4joBaX7veBtxrgQ4GFoii/jG41lDdm2rB06VIz\nYMCAhhpFTLU0CnMOynxDFzLkVJfqEN0gCgi+T50mLFHKQqLY6b0K91pe/KPar+cUAUUgHQJKXNLh\n1adSEzSOZdHsM1RlYUpk3bp1dpdjtx3hN+gipnSwiAhRcusq4jjvoOkjaRFccA5m+bySF0FEvxUB\nRaAVAkpcWiGj520gLqwuOILisFtVYbXMwoUL2zqC4oTpRjCm7WnaLSuH0uRJimkRZY8fP94GjvPF\n0hJue9XISxFEN4yB/lYEFIH2CChxaY9Rn06BGR+pqtUFawvOn9u3b0/dj5AFSJsIK5xaTZuVsXJI\n6uU7r7UF69mLL77ozfSQ2zb3uCp6ojN9Dkn1YbrNxVCPFYG6I6DEpe49nLN9DN5YHnCkbDVo56yi\ntOxMjfBWjANqEf4slAcOrojTZplv33lJi6zgoZwyrEEuHkUcYxk65JBDrE9SEeWVWYaSlzLR1bIV\ngWgElLhE46JnHQQIGMbg3+mlxI4KmQ6nTJli8+GUW5Zg0XG3ISiCILm65p0iEvLme8wUt81V07ls\na5uLjR4rAoqAMUpc9C5IhAC7Mo8dO7YycV3KtjKI9SVMVLBquJLXEpPX2tIJ8ua2t6hj6T8sXFWY\nislLMIvCTctRBPoCAkpc+kIvF9BG3iohL1V4cy9bVwYpiEuSqTN0yerwm5e0kB+yWZXBP3ybMmVE\nBGeiXldBlLxUoZdUxzogoMSlDr3YoTZUYdUHhII4JcHGfqUMeHkHJ/IncfjNWw+3BAM/W2BUNfox\nGFRtVVsR/dahf2etRhGoLAJKXCrbdd1RXMLE+xhvA9JywQUXmC996UulrIIqw5dBppzc3hSH3/A0\nlJum3XHVrS3SPla19e/fvxQSKnUU/Q15SWqRK7puLU8R6AsIKHHpC71ccBt9jHQqlpbjjz/eXHnl\nlYWsInJhgwjk9Vdxy4s7Djv8ZqmXPqrDXlMy7edaqeKw8+Ua9yMEJsl0oi86qx6KQFUQUOJSlZ7y\nTE+ZNvLB54XBjZ2/R44c2bC0FEk0KCuP9SNN10VNNaT1k2HQJOZM1QMHCm74Vl1//fWmrJ2rpZ6i\nv5W8FI2olqcI/BGBj/1jIAqGIpAWgeOOO84cffTRZurUqebDDz80Z599dtoiCkmPdYJdhm+99VZz\n8803N8rEN2Lv3r3m//7v/zKvSmHgeeuttzpGWlAeR1osLK4cdthh1teDNvFBL9JBcvj87ne/M6QR\neeaZZ8x///d/2xD6cq7q3xs3bjR/8zd/U6lmfOITnzB8uIfoNxVFQBEoBgG1uBSDY58tBWvAtdde\na9s/f/78jg3yDNg
4nb799ttmyZIlLeslHQN9WpN91nx5boSslh0hMlL33XffbX19Jk2aJKcq/U1f\nYPGq2nSRC3rWvnXL0GNFQBH4IwJ/okAoAnkQgBDgqMuyWz74VjDQlCUM0oSF5w2WpbJbt25tSVrQ\ngUixfBg4koron5bsJC0/Kh11Zn0rJ84JA7t8du3aZT71qU/ZkPRRdVXtHP3Hrt5p+tC3NtI3Vdbf\nNzxVn76NgBKXvt3/hbUe6wfTFwgDMIHPCCJWlGDZgRThu8GO1bx9E98jSXAyGdgZOCA+cUI9SKdD\n4xfljwIB2rFjh10K3UniFYdpEdfwX9qzZ08RRXWtDCUvXYNeK64ZAkpcatah3WwOBIHNGAm4NnTo\nUHPjjTcadmaGcEBi2pGGsO4QDawrlIGDJstit2zZYubMmZOJWDBwMLCLRSWqPrHQhK+V+Zt2olsR\nAgEaN25cEUV5VQYrpHbu3OmVTlmUEfKS9n8hS12aRxGoKwLq41LXnvWkXVgwnnjiCWsFWL16tZ3e\nOeaYY+w3RCQsEJP333/f7mRMEDk+Z5xxhjn//PMbSfMO9BAXBg7XIpG3zIZyKQ+ERBVl4cFZmQG+\nqrt5t4KP/sGH6sknn2yVpFLn+b+gz5NYDCvVMFVWEegAAh/vQB1aRR9GAHLghmzngY1FZtu2bZGo\n4OjLdFCcBULeWuPSRBb+0UkGDIgLgyEreJjiylpWXD1JrmEhKbJuptGwTqj4jQD/F0pe/O4j1c5f\nBNTi4m/fqGYxCBRhqaCMV155xVx00UVdefMtw8rDTt5IVcP8t+pyiCaEtqenp1WSSp6HvGB1Kcri\nVkkQVGlFICUC6uOSEjBN7gcCYjXJ6isgxIf9fDjOWk5WNKgz6yqirHVWOV9dp1RkulLuxyr3kequ\nCHQKASUunUJa6ykcAR76spIpTeG85SLylks5DBydHDyKWkWUpt2a1k8E5D7s5P3nJxKqlSKQDAEl\nLslw0lSeIoCPihCRJCoyPcNAIYOF5JE33zRlSd6032VMEaXVoWrpGdTDfVa1NsTpK21T8hKHkl5T\nBP6IgBIXvRMqjQBTCHySPPCFMLSadhBCQ7qyBD11iig9uliohg8fnj5jhXIIeekEea4QLKqqItAL\nAV1V1AsSPVEGAgzYr732mtm/f7+NxUIdsuyZY6LgskxajlkZc/rppzctWbYXI/7wwBdLSsRl67+S\ndOUQpEZWLWXZlTmqfvdc0auI3LI5PvLII8369evDpyv/m5VofUG4l/G3gryIFTBPu/m/+9nPfmZ2\n795t90z64IMPmv7vqE8IIf+D/N8NHjy40JVuefTXvIpAFAK6qigKFT1XCAI8fInhQvyW9957zz4g\nR4wYYYYMGWJXiFCJLAUmLYMTH3nIvvHGGzbfxIkTzahRo5piuUQpKBYV9xoPbgaCLIMAOkFk5E3Y\nLTfLcZR+WcqJy0Mds2bNstswxKWr2rW6rpZq1Q/cs9y7We9b+b8jijIBCfm/g9QeccQRtsrw/x0n\nCVGwb98+w4aW/L/yPzd69Gi763orK2Ur/fW8IlAmAkpcykS3D5bNA5cH39y5c23reWhedtllmR7A\nFAB5ePXVV82iRYvsw5RB+eqrr45cvhx+2PPgR/IQjzzEx1b+0Z8idHHLa3UMBiwbhgBmHfhald3N\n82whwcDLbuR5+rObbUhbN32Z1FJI2U8//bS599577f9MUrLfSifunRdeeMEQ0JD/wW9+85tm8uTJ\nfQb7VrjoeU8QCOIiqCgChSDw1FNP9QSDSk9AVno4Llpef/11WzZ1BDsg9wSDc68qgge9Pc93MC3T\n63qWE9RD3Xkkb/4kdVMHn1NOOaXnX//1X5NkqUwa+lz6VNrZCUx9AKhdO4MXhZ5gmsd+yvi/A/dg\n+w4C6NjvqP87H3BSHfoOAgR0UlEEciHAgy0IzW8fnO0esrkq+igzdUCOeFjz0A7Lw
w8/HElqwunS\n/qbeLA/tsjChXPcj7bntttt6+NRFuL/o6yhx218UUY2qp9vnou4h2sv/AaSuDMISbjP1BVYXWx//\nYyqKQLcQUOLSLeRrUi8PMN7EsIB0WsTCwyAthIIHPMcMdmUI5aYZIEmbJn2cztTtDtSt0sobeKvr\nVTtP//LG307AOQk+7crx9bpLXqLu/U7pjR4QSUiM/N91qm6tRxEAAfVx8WTKrmpqMP+OHwv+LI8/\n/nhmH5a87WYu/qtf/ar5wx/+YIIBzpx99tm2yDJ9Siib9idxnMzjkEs9wWDcgCjNKieWXK9Zs6bh\n/NwopIIH7A6+ZMmS1G0BexHwCCwT8rOy37TpBz/4gXV472b/cv/PmDHD4EDfzf//ynakKp4LASUu\nueDrm5l5aI0ZM8Y2fu3atZGOsp1EhgH+1ltvNc8995xdTSOEIg9paKc/GAQWkNjBNG39UqbUnWew\nvf322w0bLlZ9l2gcTiHI27dvF1gyfbskEOdluUcyFdbFTDNnzjQvvfSSeeSRR8yxxx7bRU3+WDWr\nvdi1e/PmzZXFtOsgqgKpEVDikhqyvp1BSMuwYcO8GRTRieWaPNRXrVrV9BBNSx7S9i7lR1lCGCiR\ndm/55BcpckClfogPFpt2Okj9Pn6zl9QVV1xhLr300sLUCxPEqP4rrLICC+L+fvPNNw0vC7TBl36F\nXE6bNq3p/67AZmtRikAvBJS49IJET7RCwEfSEtZVyAvWEMgMOjOIl/mGHRXvpRVhcokKuks8jXA7\nivgNFkhVrS5gNXbsWGvZKjOOCP0X+GpYrIokj7bAgv64pKVMLLKqq+QlK3KaLwsCSlyyoNZH8/j+\n8JRuET0xXyMMTLydlvnAhxxBkiBILmlxB0V0KZOoUL4r6ER9VTXj49syZ86cQq0tLj5Rx/QhpFfE\nB2sM0zEPPfSQ2bp1a6n3sLQ56zcxX4i35LueWdun+fxBQImLP33htSY8lG655ZbS336LAuG8884z\nwRJtM3v2bFukSyaKqiNcDoMeDpN/9md/ZgYMGGAvd3vgY9BDJyFxYZ19/e2L3i7x7IY1RqxOVSGf\nBApkW4Enn3zS11tL9aoBAkpcatCJZTdB3ty7uYohbRujdC6DvITf0AmVDmnpNmFx8YLEMeUyffp0\n97S3x5AF8MPyUeYUX1oAwn1ddh9TH3XMmzfPTJo0Ka26XUmPzmeddZZdcVQVnbsClFaaCwElLrng\n6xuZcZAM4jY0rBdVaXXYdA2ZQfI6NUKARNy3cJcYdWJ6SnRo940ukBfM+Gy/4LNUaeBzrTF5VoC1\n6g9WhrHXUNWsF2Il4jvv/1orbPR830ZAiUvf7v+2rZeHkDi7ts3gWQJIFxvMibXBJRdJVXUHKPJE\n+alEkSLy4Vfjw8P7vvvuM//0T/9k/u3f/s0rK4bbB5AWltn7tGLN1S/umP53Y+5E3SNx+cPXKK/K\nq8Kq7hge7g/97RkCGodPEYhDgAiZnQgnHqdDnmtE+QyIQ1OETzcCaVTZAUlrisCaJDpoqzKJ5kp5\n3RR0ow1EOQaLbuvTCgui47J1RBK8W5Xhy3kwlw/3QFoBi25Eo06rZ6v0tDkY6nJHjQ52rLblUNaD\nDz7YqjpvzwckPJP+QVC/Rj7a3mkJYkD1iO7XXnttp6tvW1/nEWmrkv8JpEOj/pnirvnfsmYN6xI6\nnv1c3EGAgdEdvHnIyiAjg3wzEvG/yBMn1NcuTVz+rNei6mVA9I28oCfh4+tCWsL9Fb6/wtfDv2XQ\nB5cqC/canzwiz1O+qyiif9RYwTn5QFRc6TZxQRdXh2effdZVr+vHfxIAp1JTBPr162fk88QTT6Ru\nJcHcCOtddZk1a5ZdTuq2Y+fOnXbahKkjBNO+fNIsmxaTvlt2+JjyKJu6mA7phKAXn/CUBTFdcPbE\n52XTpk2dUCW2Dpkeeuedd2xgtTTYxxbs0UWmC
uXekvuAe4EPfRSWu+66ywQDvtdLn8M6R/2+4YYb\nzMKFCzPf82zzQMA9hP9hlc4igD9cQLxspawo9Uq6Tp0qqEAci4671ummBjdaS0bfTpe6vPVJO//q\nr/6q53vf+561fIi1pQgrSNoyqBtsy5Qkdcgmfa4lqkydosoGO6w/ed/Ko8quyjnXGiP3pW8WsTxY\nYu3MMtXMVMWRRx5pn19811HyPJ87hQfTc6KnT1N1anHxikb6o8wLL7xgAvN95d/6QJQ3W94eeKvn\njVeW2Mrbb1bUKZcy0ojUjeNuGYJOvOHziRNC6BMbhCXuWF/K0idKB6wsrJhhiTZOw1WN7BvVtrTn\n6CfuIT4cf//73zfuSrW05fmWnu0aVq5cmVqtYJrC7N+/3+a7/vrrm/JjiRFL8re//W0b9fjOO+9s\nnOMaaYKptqZ87o9XX33VkFfK4XvgwIFt82G5njhxYlM+freyaJ9yyimNtOiESH5XnwkTJth0Ug7f\nrm6SNtzOPXv2yKXGt5vmoosuapznIKrdcfqPGjWqkf/+++9vHHf9oJPMzbVGBA23zlbBzdmkQmCS\najA8mHb4ulsGx2GBqbsskXpa1eXmJY9bdlweN12YhcZdoz6czdw2Us/ll19u5xNdfeTYLQ8syH/h\nhRc2YRTWgfKk3eHv8Fyq1BP+xucgy5tSuBxffvM2i6NxWHjjzWIByZpP6o/yP5FrWb7zlIfVJRg0\nreUjCxZJ9UVHqYv7q8y6kurkW7pgh/MePnUR+pxnEN9pxH3u8cxzxX2+8yx1n4fu844ywuMH5dxx\nxx0tn4/kZ9zZvXu3W6U9Dj+33bo45rkbFrcd8pxO8nx2/UsoWwS93HqlTLnOt1iqSOded3Fzy5Bj\n2hclLr6++Lr8f0SiNC7gHDeO23ABSb7DNwnpXeBdMMOdGb6h6VQ3r9ThfofzpNUPSKJuRoEq7lqW\nGydcntsW99j9p0nyjyH6tvqm7LoNLAzOUW2C1KR9sKadImqFM+WkrTtcFm2SaYbwtaS/KYMpG/qd\n77zlufVSthAWpg6Kws6toy7HOCjXDR/ahKN/UgkPzuF87Z6jrZ6LlJM0L+MIL8Ei4bHHrcM9pnxX\nws9vriV5Pofra1VmeMVPGDt+IxAOV89Wx2H9yesSvXB9XO+GlE5c4kiLgBe+ScI3l7Bm9yYIA+jO\niUq5Ud+U4UoW/Vw9wh3d6lrWG8ctL6o97jlhw0n+MVwMwsetrBPhdGX+dttQVD1x8+1pBos0aZPo\nDt5RhKrsvFHlowdv/JA8LFQcZ2kvbWL5NZYV7lG+s5QTpWOdz4FVN6WM/zvuoTS+VO7zn+dzWNzr\n4EUaGSP4dtvAdXlZDY8RPFvlGnWELSoM2CJumdQnpCb84hvW131+h8cKdJMPRMWVOOLiEgnGTldc\nbFxdXD04L4QmjFd4LKZsV5dwfW7dnTwu9b/EbTAd5HZc3DUAcIHmhnLTA57cqAKW22HhutyO5poM\n8G6Z4Txx11zd3DaF9XavuXnS3DhuvrCOYTLk/qOhC+nlQ3uSCm9HDPLdFPdBIQ+JvPrw8Gz1AMXq\nkcTKwMCelWTE6U+ZSep3yyB9XmuNW174mPuAQQcCw33EmzMERHAMf5OW+wbSw4e0TDeWqWNY5yr/\nhtiBcTeljP877gHuhaTCS6k8t8IvqJQRftaHxwKeF5Kfb3kuhp/pLmkR3dz2u4O0+xx2n+vkCz+H\npSy+4/K5Ooafz2Fd3TJbWVVI42IneobTR+FFW0WfsC7gJNf4Dud3devUcanOuT/60Y+Cdv5RAvJh\npk6dKj+ts2QAbON32PEnWAHSuMbyzfnz5zd+s3HeEUcc0fjNgRsWO1zXTTfd1FjWRdq33nqLL5NH\nP1tAwj84UMmyPrIsW7bMDB482OamHQ888IAJbhz7O7gpTPCPYI/Df8LtwvEquFEbydjcrAgJbnQb\nbbaIsoooI
8oBLUu5YLxly5bIrLIMt91y5YBgtHV8jaygzclgoLfl4lybRMQJV/ROkidtmvPPP99u\n87B9+3br6Mj/IPvQtJL+/fvbZavoBk5Lly61OzuXqWMrXap4PiB45tBDD/VG9aL+73jGpXk2vfba\naw0MjjrqqMZx1EHwEthrLMC52X0u/vKXv7RZ2T5BhGfB6aefLj8b31/72tcaxzyLBYPTTjutcf6a\na64xOMCKAzDP4WDAbnwaCUs6YOwICFGj9Jdffrlx/MMf/rBxfOaZZ9rjXbt2Nc61wuvKK69spPnV\nr37VOOYgPNYyPnRbPl6mAu4NSNj1sLgeywzs/ONy0yHcVAzUkBZEBn46zCVA9mLw5/nnn5dDu69O\n48dHBz/5yU/Cp0we/XoVFnMi6Y0jbQ3fOFJ08OYrh43vk08+uXFc1wNirkQ9ZNK2N/wPGM7Pih8G\n3VYrheKuhcvK8psBnrqpp9UGfhCrwNLSUscs9SbJI7q1wiZJGZomHgHfXhiK+r9j64LVq1fHN965\nykalIocccogcRn5/4QtfiDzvEp5f//rXNg2rCkVkUJff8g35doUxCZk2bZpZvHhx49LNN99sjyEx\nSGCpMZCe8Coee7GEP9QnY+JPf/pTWwMkC7KFQFDk5TiwQNlz/GGcZLVSnLQjmW55ceWUea1Ui4sA\nSwMuvvjipuVdgCdWBmmg3CTyG9YcTuNaYiTdu+++K4f2m2VtSSSvfknqII3b0XLjuEvdOBbSQvpW\nN07SdlFGHsEqEcY9T3l587J0Vt588pbVLr8Qh3C6JIHmwnmy/kYHCSDnliHnlDy4qOhxWQgU9X+H\nNTGNyOCbJk/ZaSEBEEtepqPkscces2MclphOiGv5FCvL+vXrG1WzR1udpVTikhc4iEz4JuYtQKV8\nBNpZJ5Jo4MZbCBO1dr95EIhwD0B8eSh0gsDwhhiOaFrWFJG0MfwdjvcicVbkfDi9/lYEBIGq/t+J\n/u40iJxr9S3WlPB1mR7i/NFHH20vyzc/3OkVe/GjP+5LJqdkBoBjyMt3vvOdxpQQrg583Jc8LDGd\neEZhgZZ6eT5SZ+CThppWXIuSa0XCUuNOa0Ud00bfpVTi4t6AgYNPW8DCgyWMPyycC1tY3JuL9Pv2\n7Qtni/ydV7/IQiNO1vHGiWhmqaf45+ShEHVPFF0xb4hMyYi/S9lTRK30F7+XFStWWP+XtG+urcrV\n84pAUgQ6+X8nOh122GFy2NL6LAmYvgmPB/x2p3UGDRpkk7tT7bSLYGxhCVbCNU5BDCArlOe+aAkx\nwWWBD64AQiLI7LoGNAor4cANzEeQP3GXcKeJqPb4449v1A5hC89sNC4mPOiU5T9OnVKJi+vQlNZS\nQuRAeevmpqAzEG4496bkHMTFvXGifEQAW24+iWCYRz/qTSpF3zhJ682ajnll+efMWkbV82HZwJek\nk1NEYczEn2XSpElWFyFS4XT6WxGoEwKf/exnG81xLSeNk6GDYMVSg7xAMvjtilgfsNq648S3vvWt\nJvJCJF0Zc8gvDqu8ULv5wi/PvJQzLom4L6pyrt132NLTLj3X3eki19UgPE0E+ZKXdPT8yle+0vR8\nZ6x1x0eJ3is6hIkh5XVbSiUu55xzTqN9ODEJYeAkYFx33XUNMkFoZBEY4cyZM+WnXdkwd+7cxm86\nKcyW5SYjEW/mzz33XCM9UwzujSU3clb9GgUnPMh74ySsJjZZmn+MoUOHNvnlxBZc44s4yL7yyiul\nrCJqB1vYn6WV30u7coq4DmHC6nTPPfdYixcPxqgP/7OkYfPG8FRbEXr0hTLS/J9WBQ+mOdNYC90F\nB//xH//RtplYGiAWvJjyLZYHMuInKQMtL7isSBXBx3H48OGNMcgd/CnH3djRtW5AbqQ+6oQQiXCe\nMtMK4yNlhUlDXDnudJGbTsY395zbFvAZMmRIo91sNyDjIwSH7VFccY0OGBD
CMxxu2k4dl7qqCABY\nQilOsHSOeGGHG+gCSx4BkhsBYAGL+TlhxLBld6UQN6h747k3k1uXeyNn1c8tL+kx7aMdiNw4UXmj\nbpyodGnPCfbBGv1eN2ZUWUU8QFk1xttIkZLnnwYr0qCPzMZJdMLicsYZZ9hBOM2DN0nZcWl40LOK\nJ+zPwm8hNGXrQz3sV8U01YsvvmiC+CL2rY03M/d/1W0H+HLfYBFlFQmm+SCui32wq0Oxi1T0MQOe\nG/YhOlX7s7793/EimmYwd1eb8qwkf6v/e8aE999/v4msCEIMsv/8z/8sP+03Uzt79+5tGiuaEgQ/\nGHMISeHW+Y1vfMP6kLikKJyP3xAPN19UGjmHfu3Kk7StviFUssKJNJQpRM3Nw1jH/2ar8Ze0jD1r\n1651s9ljd7HIyJEje13vyomyA8YEBCQ25H/Q6KbAdIHndlOwGwmig55x17geDtpD2e4n6NRGxEPS\nI2n1I0/QwY1yXf3aXSOtq0/4mHLRxxW3LgIBhcUtM/B4b7pMe8N1hIMLNWX46AeBsLodgC5Kr7zn\n0kTwJCAcH6STEV+pK3hQxzYVvQg+V4ZI8EHuG0L/87udPq30oC0SwI4gdkTSzVpWqzrqdJ4+Bae6\nCf2edgdw99lFgDdX3GdeQFzsM51nn/usCz+X3fwcU2Y4T0BY7FgUDPDh5I3flOvqJnVynvEpLO7z\nO6wT6YMX6Sa95fkcHsvC5cpv2iE68B2uQ9LJN3WGA7KiY1w+t71RY5CU3clvHGY7IgDjAgDI3Dhh\nINw0ABoW92bjRgsP9HSMm4Z62nUMdSTVj7RxN2PcNfKmvXHc8sJYUR56y41Lu12J+8dw04WPGRiD\nN/rw6cr/howxECeRMFkJ/05SRpo0DOhp6kibvp0u1M2gWRbBEELEfdUqenE7HfvCdf6X60buIC2Q\nlzTiDtzh55r7zIO4qJSHAGOIjC+MRb5Ix4iLLw1WPZIhwABW1lt9Mg2KT5V0UIgiEK4FpmjN8lhQ\n0DXPQEfdhGOHUKQdXLLggL4QSO6vKJyzlFmnPGnIdVXanaWvsXrwYsr/LN+uFUSJS+d63rXOiDWo\nc7W3rqlU59zgplOpKALMZYYdoCvaFKs2DqP4abQLP49vB3FcwoJPCU6qRa/syRufJY/TLpiQn1Vk\n+POweqlsoT6255gxY4YZO3ZsR5a3l92mIssnwviGDRuKLLKrZfH/RCTctD5O+ImII21gVTfBoNnV\ndvTFyoMXInPvvffapgfWlkS+kZ3CSYlLp5CuWD14puOYWQdhgMbpDOLSTgILRMsVELJEul0ZSa/L\nagtIUR4RJ14hQUnKYknnVVddZR555BGzYMGCtoQuSZlp0kCSWKmE4+95551XOCFMo4svaSHF3Av0\nSdEEuVttxMF74sSJmarHkTZwHbB5w3vZZSpQM6VCALIIaUTchS+pCikpsRKXkoCterFYXBgIeWOq\nupx66qn2je24446zgyUDZpQkCTTHEuk0BCGqHs5RF4NUOwtQq/zh85TFp1Xb3PQsW4YwbN682bCR\nYrcEfdGBtzliUhSBa7fakrVe2kyf8eF/jWXm4BJMo2Ut0qt8ixYtMu4qobTKkR9hZaobTiNtOZo+\nHQJYWyTYZzBd1LE9mJJq2Y9ZpKSJNV3fQkBi6fBGXmV5+umnrcmTQVLEHeAxSwuBYNBoJ0LmkqQN\nl8WbNNMyaU3n4XLiftO2Vps00qcMAligpM1xZXXqGnqtWrXKEhmxIHWq7k7Ww72DVU8kqp+wdK5b\nt65px3tJX6Vv7kOmA932Vkl/1dVfBJS4+Ns3XdeMhywDLAOtT4NcWmBOOukkM2fOHHPppZdGZoVM\nPPXUU434B/i4tCMlPJTTkg/wpK5ODMy8ydNnbjt8JS3SKUJeqn6/SXvk2yXJSe4t7hEIDfnc/pPy\nqvKN9QifnenTp1dFZdWzIggocalIR3V
LTQYTgo5V9eGDtYWoy9u3b28JYZiEJHkrprBwvpYVBBei\niERc+iKuuUSJiLYPPfSQ2bp1q9ck1HdylaRf6GtM7SJpCS756C92aceRuYrC/wbWlrqR0Cr2RR11\nVuJSx14tsE0MfrwlxjmtFlhdoUXx5orvxMKFC1v6ctA+JO7NttVARPnkb2dB4SEeNSVQaGNbFIaO\nWJP+4R/+wfq1tNO1RTEdPY2zLo7Usqqko5VnqAyMGaBFiuhryqScNWvWpLbsiR7d/FZrSzfRr3/d\nSlzq38e5W4iT1o4dOyr39pdE7zRWEwGSPCL/+7//a1iBFTWVJgNaljduKT/vtwyA9913X8upsrx1\nFJ0fMghmrK7ppvNwXLvcewAfqTIIIb4uOKf6biUL4yR6x1k5w3n0tyKQBgElLmnQ6qNpGfywXBB7\noxOxPoqAmYEFUzXfrawpXMtLKsAmyj+GwZdrZQxoafBh6oW9RpYuXZomW9fTyhSfL4M2/ek6mea9\nb5ICjOUiCOBWGesT1kksZlW1FCXtF03XXQSUuHQX/8rUzgMJ0zUm8W4Pxu1AS2JlYCBCWpGadnW4\n16mP8sCFb3aUPuigg8yAAQO6NkWEfjKIxJE3tx2+HXd7ugHcRJI41UraIr+5nyBJVbCY8X8wZswY\n+8JQVZ+4IvtOyyoPASUu5WFbu5J5C542bZrXS1bl4UlskLhl3EVYW9wOFiLEW7nr49DKP8bNW9Zx\ntwf+vO2ijzrp4NnNvorDigCKBAtkiTT3ta8yZcoUa92rqkOxr7iqXr0RUOLSGxM9E4OAz6s+hLQc\neuihsf44RZMW4KJupozaTaW5b/Fl+Uagj1hbqr6qo0zyRZ8V7VQL9mXIv/zLv9ipWlYa+Wjx9Pm5\nUEZ/aJndRUCJS3fxr2Tt8pBavHixNw9RIS0AGhdcTSwjRUwRSedRJvUzoKQhReGBs8jpCPqof//+\nlfGNECzD32J1cf1LwmnS/O4UcUyjU7u0cn/t2bPHS4unPA/i/u/atVGvKwJpEFDikgYtTdtAgIeV\nL5FOebB/9atfNUcffbRdhRG1wkcUT0MsJE/cN5YNN9AbZAR9srwVk88doN0ppzgdwtfQgby0tUiC\nFq6nU78JIBi3pD1OjzCmnXKqjdMpzTX0R6QfZbrWB58X7jN8WhAlLRYG/dMhBD72j4F0qC6tpkYI\nsPkZRIHVRkcddZRhcOmGPPHEE9YP4rLLLrOD2yc+8YmWapRBWhhQDjvssEad1M8Sab7jdGlkcA4g\nQFhd5LN3717zy1/+0hKh3/3ud031ONl6Hb700ktG3s57XazgiU9+8pPm5Zdfbmy4F9cEBtO33nrL\nYsagz3QcJE4wjcvr27UwaUE/9tvif40NCD/88ENz9tlnd0Vt/peIRE0oADZAjHtZ6IqCWmmtEVCL\nS627t/zGYXGYMGGCOeaYY2y0T3kzLLtmBiiWZ2/YsMFaWVgyGmfliBoE8ujIgzvOIlI0SaK9rj9G\n3LQS1rAqRzsO9wt9h6XEtUa5aVyn2jL9htw6yz5ud79yHSsjMn/+/NzL+pO2h/vwu9/9riUr7Bjc\nzqcrabmaThFIhQCbLKooAnkQCMKb99x9991s1tlz44039gQDTJ7iYvNKXQFB6pk8eXIPvxG+g4G9\nZd5gt92W19JcoJ6kZSVNl6Z+SQvGlC8fwYHrAYmLxULKqNK32ybpg6i2V6lNrXSlb5P+D/F/x/9C\n2f936Bo4n9u6xo0bl1i/Vm3U84pAHgTU4pKK5mniOAR4C7zrrrvslE3wILXm7DgrSFxZ4WuUzTJL\n3i6HDx9uZs2a1estU6wSYT+Goqwf6EAdSdtEeqQTViixOnzsYx8zp5xyigkeCmEIK/2bN/s///M/\nN6wyqotVJapDstwz3JPsx4UfEP93WEDD/wNRdSU5R9nLly+3+1yRPquvUZK6NI0ikBSBP0maUNMp\nAu0
QYIAmdkrwtmiTEkGTDxvGQR7SCoMx4cMpg6mRffv22YicEJioBzPz7JynLh64CAMBefMKuiBJ\nSQtpwQM9RBfOlSXoRdv/8Ic/mOCNuKxqulYuZOxP//RPbRvT9EHXFM5QcRbSQjXc9/J/xxQh/i/4\nwbDlRZb/O/TACZi4LJBEfIaWLFliNyr1dQuGDHBrlgojoBaXCndeFVQneBZ+KBs3brT7HTGoDho0\nyPpgoD9Ldnk47t+/3zbnwIEDNt22bdvs71GjRpnRo0ebkSNHpnIAhGjwQIdERZEcW3jCPzz84/xZ\n2hVD/rw6tKtDrkP0du7cGRt8T9JW6RsMsbbVNbhZVtLSqg/5vyOC84svvmg/bFqJM/3QoUMbWYYM\nGWJ2795tf7f6vzvttNM6YjFsKKUHikACBJS4JABJkxSDAJYHHExZ8cKDEsGKwl467gOVqaA459Ok\n2jzzzDPms5/9bCoriVu26JuXdFAOA1MnLAVYt5C6hVyvM3EpmrS49zDHch+3+7+DyBxxxBEduU/D\nOupvRSANAkpc0qClaSuDgAwGWF0gS2nJB/l54BdFNrAAMXWEPmVKXYkLfYFlrm6+O3KfdsIPqsz7\nTstWBDqJgPq4dBJtratjCDBFJEQB0sIbO4NfEsniz9KuXAiQu5y5XXq93oxA2YSvubbO/JL7TElL\nZ/DWWuqDgBKX+vSltuQjBKJ8SiAvvN3KG24rsMjLQFLGYCIEqlXder7vICAWuDLus76Dora0ryKg\nxKWv9nxN2w0xabWKSKZ95E3XhQBrjBCeMt/u0a0deXL10uM/IgBmdRnkhbSUeZ/pfaMI1BkBJS51\n7t0+2DaZImrVdLGmQFJEGBT5pPWDkfxpvqkfkpR02ipN2XVOS7/itF11UdJS9R5U/X1A4OM+KKE6\n1B8BBmp8PFjmLEsvo1otS6VZ4cC+LGnessViElWue443XZm2IWAbgc3EGuOmK+uYupLqmlYHlpdv\n3bo1bTbv0wfRcr3XsZ2CSlraIaTXFYFkCChxSYaTpsqAAFaMF154wQaRI54EsSSGDRtmY7gQ+TZK\nWLLJEunFixeb1atXG/YgIvYLmyjGkQvqajVFFFUP51il8vvf/77V5VLPExeGgSyuTVkUGDx4sMU7\nS16f8xBvhHuhqqKkpao9p3r7iIAuh/axVyquE1E3V65caa0rE7JufpwAABmPSURBVCdONASRO/XU\nUzMtBZZAWuxAO2DAADNnzpzIYHRpLRikl6BykB4sQkWTiHbdSL1IGqtSuzJpRx2XDRPFlUCE7Ehc\nNVHSUrUeU319R0CJi+89VCH9IBnslYK0Ihh5muMSovvuu68xiKUhLTJlFfZnaXU+j75J8qbRPUl5\npCHcOyHaw21Mmt/HdFjTNm/e3HFymRcLJS15EdT8ikBvBJS49MZEz6REAMsBkVrxX3EJRcpiEidn\nsGc/lkMPPdTMnDnTDtRJrBZJLCtlEIl2DSu6TjDB12X27Nntqq7EdQZ/9qvCQbdKoqSlSr2lulYJ\nAV1VVKXe8lBXrCC82eN/gPNtJ0z51Ld9+3YzduxYO32QZP8aBhGk3XQQZUMksMB0SopcIg05Y0+a\nH/zgB7YdnWpDmfU88cQThinHKgn3EGRalzxXqddU16ogoBaXqvSUh3ryZr9q1Sq7Y3O3piUgJNde\ne621vixfvjxyoGAQEX+WpDBSLoNOEktO0jLj0mV9Oyefu+IGEoTOVZ1aicKIqa+FCxeaquxMXLQF\nLQoTPacI9GUElLj05d7P2HasEVdffbV5//33zaOPPtqxwb2VuugzY8YM884775i1a9c2yEtev5Uk\nU0utdMpyPsmAFyYqrQjZ7bffbpedL1iwIIsq3uQRvyksbFWQJH1YhXaojoqAzwgocfG5dzzUDTIw\nZswYq5lLEnxQFQvQm2++ackLevJpNzXUTu+85Kdd+e516oIsuTozE
LrSiqi4aTimHKwu7QLyhfP5\n9nv8+PFmxIgRldjtWkmLb3eP6lNXBJS41LVnS2oXy1LDlo2SqspULOTlpZdeMo888og59thjM5UR\nlYlBKSlpiMqf9Nwzzzxjk7L0G8kzBQcWSFWtLmCOHxO+U777iihpsbea/lEEOoKAEpeOwFyPSph+\nIJCcb5YWF12mFiAtBJZL4rTr5m13XLTfi1hz3HrFOTgPYZHyqm51YSURxIUVaz6Lkhafe0d1qyMC\nSlzq2KsltAlCcNVVV9mVKp1yWM3aDAgB01llDHp5/F7CRIVAce60kNveogbDe+65x2zZsqVwEufq\nWsbxihUrzKJFi+zqsTLKL6rMovqpKH20HEWgLyCgxKUv9HLONjLgMk2CJaMqKzuwjvDGvmbNmlzT\nLVHQCQFpZxWRdFJGHFGRNPJN3rC/i1xL8005Z511lnVenjRpUpqsXUtbZt8V2SglLUWiqWUpAskR\nUOKSHKs+mxK/FmTp0qWVwqDst3YGLtfvBaLhBklLQ1SigGUALyIWCOWgJ74irSw8UfV34xxEqyxr\nWZHtUdJSJJpaliKQDgElLunw6nOpeUBXxUEyqnOwumBpKMPaAFF55ZVXzEEHHWT3UZIYKlF6ZD1X\n1ABJoMBp06Z5HzYfkvzBBx94PbVVVJ9kvSc0nyLQ1xFQ4tLX74A27a/SctSophRJvLBcRAV7y+P3\nEqWze66oKSPKdJeL+7hKx3f96AusVu2mCN3+02NFQBEoHgElLsVjWpsSixz0uwkK5OuSSy5JbXUJ\nExV3WijcnjIHNYgRUoRTtJADHwIHuhiKXr6uWCuSQLrt1mNFQBFIj4DuVZQes7Y5TjnlFNOvXz/7\nYZdeV+Q836+++qp7KfaY/VrcvLGJC7r4wAMPmFmzZnkfQ6Ndc9kSgBUq7QSi5n4gCrxdyyfOSsE1\n0pGfQa5IQQ/XdyZP2cR0GTZsmNUVYtZtASuIpQQOjMO4W7oqaekW8lqvIhCNQCWJC2RABnFIgkrx\nCPCwXrZsmR1Uii+9syXKSihIhSsuSeFYCIp8ZxlEyYuFRKwkbn15joUU5SlD8kJe2MUbCxIOzN0S\niBMrng455BBvYwMpaenW3aH1KgKtEfh460t6pS8jsHHjRjN58uRCpid8wPGv//qvLRFzdYEMlCGs\n3BHyUsT0jugI0WCwL2JlELt4v/7662bq1Klm3bp1hngvReoqOkd9QwbYEPPv/u7vzMMPP5x6Ci+q\nzDLOKWkpA1UtUxHIj0AlLS75m11uCT/5yU9MT0+P/TAwVFHWr19v34arqHtYZ4LnfelLX7JTc2JN\nKYu0SN1CAoqcjhELEANqEQIGW7duNSeeeKLd1wjyUlTZrfRjdRNWFoLi4ehaxmqvVnWnOa+kJQ1a\nmlYR6DACwQBbGdm2bVtPAE/kJ5i379WOBx98sIfzbh7O7d+/PzKtpLv88svt9TvuuKORVzJIGr7R\nh8+FF15o01E24tYp51rlf/bZZxv5KZOyOBeWxx9/vKEL6aIkTXuj8rvngoG3J/CrcE9V/rgbbQpW\nIfUElo1CsSu6PJQLSETPuHHjesDotttuK7TvweCpp57qCQiS/XDss6AveKgoAoqAnwjUcqro3Xff\ntdMczz//fDDGN8s111xjjjzySBOQAzN48ODmi86viy66yETld5LYt9Wbb77ZPZXqGBP9vHnzmvJQ\nJ5+AhFgzftPFFj+KaK9btFgJxGrgXstzjJ7EPdmxY4fdqPHll182AYlsKpK+OfPMM83RRx9tLQFn\nnHGGOeKII5rSZP0xfPhw87Of/axjUyLo6Trtxq1KStMmLCXik5MmX1xapp/Y24m+x4eMmDTnnnuu\ntYicfvrpqaensFgw3Ugfr1q1yoD9nDlzDFNUPotaWnzuHdVNEfgjArUkLvhmxJEOBsuLL77Y7Nq1\nyxDdNCyPPfZY+FTk7zykhQLDp
MWtBIJ1wgknGAaNdpK3veHyIRgMNEUJ5dHWxYsXty2Svgnjz6qg\nW265JTeBGTFihNm9e3dXti2AbEAKIDJFEEKIBX40RZTldgoEBuddSAbEgylDsEe4J8AQGTJkSNP/\nzp49e8yBAwfMW2+9Zd544w1LTgMLjl2GfsMNNxSup1Wi4D9KWgoGVItTBEpCoFI+LgzigeHKWiME\nD5Z2cg6/EoRlwy5pwXLBdT7BdItks2/67u/GhY8OKDeYBmrkDV+X3zzUpfws/iyt9KN89gZqJ0W1\n162HwV0GKPd8luPnnnvOYDVJQlpalU9eyqCsPII1h4G1WyJOtWLRyqMHhKWoJdJRekCwsI6wzQP1\nbN682UAgEQgKfTJ//vzGZ+fOnfba6NGjrcWG/wksOPiwFE2ubEUF/xFnaumjgovX4hQBRaBIBIIH\nTOUEX44AA/vBn8SV4OHauBaQCveSPW6V1z1P2fiuRInUy7f4woTTJfVxaacfdQSDhC2+lY9L1vaG\ndXZ/33333T188gq+RAFZaPSHi12WY8qizKyCbwh+HN2WIv1eyvB36TY+na4fX666+XN1GkOtTxHo\nJAK1myp67bXXgjHxjxJlNRg1apRctkGvgkGkyeQtF5NM0bAPTh4hmmtY8O9whWmWOF+cotrr1olV\ngjfnvIJvA1M/rmDJCgifjd3BVFiU8PbOfjVMVbjWM8qizJtuuikqW2XOFen3UuQS6coAWKCiEm+n\nClahAputRSkClUagdsSFCJwi+LG0kyjiwuCaRPr3758kWcs0UU6nYZ8b9IuTItobLh/SEKVbOF27\n3xAPV5iau+yyy9xTkcdCGiEoRBd2/W0os+rERRpdhN8LJAjBP0OOpXz9jkdASUs8PnpVEfAVgUr5\nuPgKouoVjYBrLcEXKAlpCZcEiSGviFumnKvyt/hU5PF7oQxioqgkR0BJS3KsNKUi4BsCtSMurrXE\nda4N5t8aTrTucRGWhaydihNsWMIWlnb6ldFeQrAzRVWkDBo0KHNxefJmrrSDGZmm4BPekiCNCrJE\nOk2evppWSUtf7Xltd10QqN1U0WmnnWZ9V+ggfCVk2sHHDiN6KPFiXCHuhQirYNoRlzLaO3To0F6+\nKaJT1m9WpWRZdUV95K27FOH3UtYSabDHIsSSZ/yM9u3bZ/bu3dvUJZBd7humT/HJgkj5KEpafOwV\n1UkRSIdA5S0u7733XlOLzznnnMZvYqG4uzNjRbjuuuu82aCR2CaufixtRmeRK6+8Ug5bfpfVXpa8\n5hWccEWIzXLnnXeasEVJrkd9kxZ83LgubplReeLOMfAywPosDPgMrjLAptEVqw2+LnzyCmUQnn/K\nlCk2GN2ECRPMypUrbbEDBw60u4azc7h8xJmblwXOsQkqzutsI5ClLXn1j8oveqgjbhQ6ek4RqA4C\nlbe48AbIQ5IpE2K54EdBfAlxWoUIuGTA7RoesN2WOP0kbkacjmW0l+BieeKuiL4MXC7pIGAfn2Bb\nA3PooYfagU3Sut9YWN5///2mFUVyPc9KLsgYVgHfBZ8VBlmsHOIDk1Rn0ueJqktenKgXLlxoJIBc\nsAVA21gsYQsLxCdYqm02bNhgrS/odf3113ctcq6SlqR3kKZTBCqAQCfXXhdVF3v5BNA2fQLi0ig+\nIDNN+/+E0/KbuC2uuHFc3LLcNBy7ZRFbJUrIL+nC9ch5vt29kNzzHIfztYrjQv1Z2hult5wjpkXw\nVio/M38Tg0b2cQq3L8tvypK4NlmUIoZLsCopS9au5AksTpn2zMmST2Lc0O/E8Ckyrgn6dHOvIo3T\n0pXbVytVBEpDAIfVSgoDuxvcLIpskCY8cBL0LSq4HGllMI0qS0CSNHznJS4QDspwiQ765tlkMWl7\npT2tvhnAithoLnBA7tUHLoZJj2kXZeUR6ipyQM6jS9K8DPpZgswlHawpn00VhbDwu0wRAhPsg1T
I\n/dVOV+7hqvV5uzbpdUWgryNQWeLS1zuu7PYH+x/1PPzww4VVAzF0CVpSwkIe8uYVLC3sTlxVgbyk\nJRXtCA/XwQRLVKcHd6w63ANFRGhu1aeQlrSYtSpLzysCioA/CPRDleABoqIINCGAY+a9995b+Ioe\nHKTZIRp/k5/+9Kfm17/+dVO9n/nMZ8zJJ59sV6cUuTP07bffbuuZPXt2U31V+oHPC6uPAutIYrVb\n+busWLHCxsfBQZz9hLohtAc/LnYCX7RoUaEB9CgbnDQoXzd6VutUBMpFQIlLufhWtnScK4niG7yJ\npxoofW0wS4Vx+k3r7Opbe3AypW+StiPKKXXmzJl26wQf8KAtM2bMMO+8845Zu3ZtIURDSYtvd63q\nowgUi0Dll0MXC4eWJgjwpnrjjTeaZcuWyanKfmM9GjBgQOLB3ueGYkXgkzRYHWkhB3wQSAsr7oi0\nm5T8lIkH9xk7UAdTgmbMmDENPbPWqaQlK3KaTxGoDgJqcalOX3VcUwbHsWPH2kGuyiZ3pp6mTZtm\nAr+djmNYZoX0D5ssJukb0gaO4Ja0FGXZKLptQqqy6qekpege0fIUAT8RUIuLn/3ihVbE5mCDw+XL\nl3uhTxYlsLbgxsWu4Aze7idLeT7lSROsjuB77Kz96KOPJiI63WjnggULrL/L1Vdfnbp6JS2pIdMM\nikBlEVCLS2W7rjOKM9BjdeGbaYeqyUknnWTmzJkTGfiMNrmCT48P0yeuTkmO2/m9MKhjmfFleiiu\nTUxpMWUULJc2SR2plbTEIarXFIH6IaDEpX59WniLMOF/8MEH1heh8MJLLJBw82vWrEm8MopBk8Hd\nlaRTMW6ebhyL7lERbM866yzrANut1UNp8YCIECGZvgu3J1yWkpYwIvpbEag/Akpc6t/HuVvIoMgA\nft9990VaLnJXUEIBRVkZKCeIBdKkYbvBtClxh39gRXLJFsvAd+zYYZ588skOa5KvOpZrs0R6+/bt\nLQsKt7VlQr2gCCgCtUJAiUuturO8xjBIMGXkwxLadq2EaJVpZQhPMbHU2qdpNMiWOOyiW1WXtGN1\n4Z6bPn16ry6nD3wmkL0U1hOKgCJQGAJKXAqDsv4FMfXy0EMPma1btzYGRh9bzYDH8lqcPTsh+JhA\nDlxxrR7u+U4do9Ott95qjjrqqMS+Ip3SLWk9QpaZvhMiRl4lLUkR1HSKQD0RUOJSz34trVV5l6yW\npthHBfuiX3iKqdOOvxAXrC3oceyxx5YNe2nljx8/3owYMaJhdVHSUhrUWrAiUBkElLhUpqv8UdQX\ncuAiwvTQ3LlzvY1TIs6zrs5lTjHRR0inrE5uu4o8hqhMnTrV+rooaSkSWS1LEaguAkpcqtt3XdWc\ngTHYuNAGNev2EmJIAUtokazBy7oBZtQUU1F+G5CiKvgjJcGdJe3XXHONue6665Ik1zSKgCJQcwQ+\nXvP2afNKQoA3eXxe8Cfp5moj3sJx4Jw4caKN1+L6QpTU9MKKxaE37NRLe1zJMsW0adMmG4+m24TS\nbUee42D3aruXUZ4yNK8ioAjUBwG1uNSnL7vSEiEORKa97bbbeg3EZSmFleW73/2uuf/++003dzgu\nq31SbtQUUzvHX6xh/fv3r6xTrrRdvvHTgSCHHaDlun4rAopA30JAiUvf6u9SWsvgin8JIeVnzZpl\nCNlepuWDMP7Ud8wxx1irT9hqUUojPSo07PiLau4UE1MrS5YsaTrnkfqZVKnT1FcmADSTIqAINBBQ\n4tKAQg/yIoD1Zf78+Wbbtm2WwLAipChSATl66qmnbFAy9Fy4cKE5//zz86pcm/wyxfThhx+ac845\np7KxW1p1yJQpU2xsnqpE/23VDj2vCCgC+RHQTRbzY6glfIQAb/1EaCVU+759++xyXAYcoqDiiJpW\nICtYVygDX49169ZZwkI0VSUtzWiCPZ+DDjrI4BOCYJmpiww
dOtTeU3Vpj7ZDEVAEsiOgFpfs2GnO\nNghAPFh5tH79erNhwwYzYMAAO71DXA6EnaddYQfjAwcOmLfeesu88cYbNlQ9g/All1xiLrjggsKs\nN26ddTuGJBIgcOnSpbVqGg7HixcvrtzWBbXqBG2MIuAJArqqyJOOqKMa+Llceumljf2NsABATvbv\n328JCtNKrgwaNMgMHDjQjB492nzjG9+olY+G284yjyF+WCfqJljcVBQBRUARAAElLnofdAwBlufW\nZYlux0DTiiwCOOfiO6WiCCgCioD6uOg9oAgoAt4jgJN3Fj8p7xumCioCikBqBJS4pIZMMygCioAi\noAgoAopAtxBQ4tIt5LVeRUARSIyAWlsSQ6UJFYHaI6DEpfZdrA1UBKqPAFFzZZl39VujLVAEFIE8\nCChxyYOe5lUEPEOAUP/E0FFRBBQBRaCuCChxqWvParv6JAKDBw82e/furV3bWVF04okn1q5d2iBF\nQBFIj4ASl/SYaQ5FwFsE2IBx9erV3uqXVTGCEhLjR0URUAQUASUueg8oAjVCgKB/WCbqFO6f7iGS\n8umnn16jntKmKAKKQFYElLhkRU7zKQKeIjBy5Ejz3HPPeapderUgYe+9954GL0wPneZQBGqJgBKX\nWnarNqovIzBq1Ci70WVdMHj11VfNxIkT69IcbYcioAjkRECJS04ANbsi4BsC7JyNlaIusU8WLVpk\nIGMqioAioAiAgBIXvQ8UgRoigIXirrvuqnzLfvzjH9tpIsiYiiKgCCgCINCvJxCFQhFQBOqFANaW\nL37xi+bnP/+5wWG3qjJ+/HgzYsQIM3369Ko2QfVWBBSBghFQi0vBgGpxioAPCLAp4fDhw83y5ct9\nUCeTDlhbiN9y9dVXZ8qvmRQBRaCeCKjFpZ79qq1SBKyfC3FdCJcPkamaqLWlaj2m+ioCnUFALS6d\nwVlrUQQ6jsDnPvc5c9ttt1VymmXFihXm7bffVmtLx+8arVAR8B8Btbj430eqoSKQGYHf/va3BqvL\nvHnzzKRJkzKX08mM4p+zZs0a66fTybq1LkVAEfAfASUu/veRaqgI5EIAX5GxY8eazZs3VyKI23nn\nnWfOPfdcM3v27Fzt1syKgCJQTwR0qqie/aqtUgQaCLC6aNasWWbChAkGC4zPMnPmTKuekhafe0l1\nUwS6i4BaXLqLv9auCHQMAUjBm2++adauXevlEmn027hxo9m6dauX+nWso7QiRUARiEVALS6x8OhF\nRaA+CCxYsMAMGzbMjBkzxjvLC6Rl1apV5vHHH1fSUp9bTluiCJSCgBKXUmDVQhUBPxFwyYsPO0gz\ndSWWoKr44PjZs6qVItB3EFDi0nf6WluqCFgEIC84v+IEu2nTpq6hwuohrD8yfcXybRVFQBFQBNoh\noMSlHUJ6XRGoIQI4vz7yyCPmqquuMlOmTOn41BFxWnAahkBhaanytgQ1vD20SYqA1wgocfG6e1Q5\nRaA8BNi4kL2MEGK93HPPPeVV9lHJLM3G0sOOz8Rp0dVDpUOuFSgCtUNAVxXVrku1QYpAegQgFPPn\nz7fRar/+9a/biLVFWkGefvpps3LlSrv3UJWC4aVHUnMoAopA2QgocSkbYS1fEagQAhCYBx54wCxb\ntsxMnjzZjB492owcOTLTVA5lPfvss+b+++83AwYMMDNmzDBf/vKXM5VVIQhVVUVAESgZASUuJQOs\nxSsCVUQAx9kXXnjBrFu3zqxevdr6orCUeuDAgWbIkCHm4IMP7tUsdnI+cOCA2bFjh81z4oknmnHj\nxpnLLrusEhF7ezVITygCioCXCChx8bJbVClFwC8EsJ7s2bPHEpMtW7ZEKjdo0KAGsTnuuOMquSN1\nZMP0pCKgCHiFgBIXr7pDlVEEFAFFQBFQBBSBOAR0VVEcOnpNEVAEFAFFQBFQBLxCQImLV92hyigC\nioAioAgoAopAHAJKXOL
Q0WuKgCKgCCgCioAi4BUCSly86g5VRhFQBBQBRUARUATiEFDiEoeOXlME\nFAFFQBFQBBQBrxBQ4uJVd6gyioAioAgoAoqAIhCHgBKXOHT0miKgCCgCioAioAh4hYASF6+6Q5VR\nBBQBRUARUAQUgTgElLjEoaPXFAFFQBFQBBQBRcArBP4fntNQJrCufL0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 27, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review = \"The movie was excellent\"\n", + "\n", + "Image(filename='sentiment_network_pos.png')" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "# Project 2: Creating the Input/Output Data" + ] + }, + { + "cell_type": "code", + "execution_count": 74, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "74074\n" + ] + } + ], + "source": [ + "vocab = set(total_counts.keys())\n", + "vocab_size = len(vocab)\n", + "print(vocab_size)" + ] + }, + { + "cell_type": "code", + "execution_count": 75, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['',\n", + " 'inhabitants',\n", + " 'goku',\n", + " 'stunts',\n", + " 'catepillar',\n", + " 'kristensen',\n", + " 'senegal',\n", + " 'goddess',\n", + " 'distroy',\n", + " 'unexplainably',\n", + " 'concoctions',\n", + " 'petite',\n", + " 'scribe',\n", + " 'stevson',\n", + " 'sctv',\n", + " 'soundscape',\n", + " 'rana',\n", + " 'metamorphose',\n", + " 'immortalizer',\n", + " 'henstridge',\n", + " 'planning',\n", + " 'akiva',\n", + " 'plod',\n", + " 'eko',\n", + " 'orderly',\n", + " 'zeleznice',\n", + " 'verbose',\n", + " 'amplify',\n", + " 'resonation',\n", + " 'critize',\n", + " 'jefferies',\n", + " 'mountainbillies',\n", + " 'steinbichler',\n", + " 'vowel',\n", + " 'rafe',\n", + " 'bonbons',\n", + " 'tulipe',\n", + " 'clot',\n", + " 'distended',\n", + " 'his',\n", + " 'impatiently',\n", + " 'unfortuntly',\n", + " 'lung',\n", + " 'scapegoats',\n", + " 'muzzle',\n", + " 'pscychosexual',\n", + " 'outbid',\n", + " 'obit',\n", + " 
'sideshows',\n", + " 'jugde',\n", + " 'particolare',\n", + " 'kevloun',\n", + " 'masterful',\n", + " 'quartier',\n", + " 'unravelling',\n", + " 'necessarily',\n", + " 'antiques',\n", + " 'strutts',\n", + " 'tilts',\n", + " 'disconcert',\n", + " 'dossiers',\n", + " 'sorriest',\n", + " 'blart',\n", + " 'iberia',\n", + " 'situations',\n", + " 'frmann',\n", + " 'daniell',\n", + " 'rays',\n", + " 'pried',\n", + " 'khoobsurat',\n", + " 'leavitt',\n", + " 'caiano',\n", + " 'sagan',\n", + " 'attractiveness',\n", + " 'kitaparaporn',\n", + " 'hamilton',\n", + " 'massages',\n", + " 'reasonably',\n", + " 'horgan',\n", + " 'chemist',\n", + " 'audrey',\n", + " 'jana',\n", + " 'dutch',\n", + " 'override',\n", + " 'spasms',\n", + " 'resumed',\n", + " 'stinson',\n", + " 'widows',\n", + " 'stonewall',\n", + " 'palatial',\n", + " 'neuman',\n", + " 'abandon',\n", + " 'anglophile',\n", + " 'marathon',\n", + " 'chevette',\n", + " 'unscary',\n", + " 'eponymously',\n", + " 'spoilerific',\n", + " 'fleashens',\n", + " 'brigand',\n", + " 'politeness',\n", + " 'clued',\n", + " 'dermatonecrotic',\n", + " 'grady',\n", + " 'mulligan',\n", + " 'ol',\n", + " 'bertolucci',\n", + " 'incubation',\n", + " 'oldboy',\n", + " 'snden',\n", + " 'plaintiffs',\n", + " 'fk',\n", + " 'deply',\n", + " 'franchot',\n", + " 'cyhper',\n", + " 'glorifying',\n", + " 'mazovia',\n", + " 'elizabeth',\n", + " 'palestine',\n", + " 'robby',\n", + " 'wongo',\n", + " 'moshing',\n", + " 'eeeee',\n", + " 'doltish',\n", + " 'bree',\n", + " 'postponed',\n", + " 'gunslinger',\n", + " 'debacles',\n", + " 'kamm',\n", + " 'herman',\n", + " 'rapture',\n", + " 'rolando',\n", + " 'tetsuothe',\n", + " 'premises',\n", + " 'bruck',\n", + " 'loosely',\n", + " 'boylen',\n", + " 'proportions',\n", + " 'grecianized',\n", + " 'wodehousian',\n", + " 'encapsuling',\n", + " 'partly',\n", + " 'posative',\n", + " 'calms',\n", + " 'stadling',\n", + " 'austrailia',\n", + " 'shortland',\n", + " 'wheeling',\n", + " 'darkie',\n", + " 'mckellar',\n", + " 
'cushy',\n", + " 'ooookkkk',\n", + " 'milky',\n", + " 'unfolded',\n", + " 'degrades',\n", + " 'authenticating',\n", + " 'rotheroe',\n", + " 'beart',\n", + " 'neath',\n", + " 'grispin',\n", + " 'intoxicants',\n", + " 'nnette',\n", + " 'slinging',\n", + " 'tsukamoto',\n", + " 'stows',\n", + " 'suddenness',\n", + " 'waqt',\n", + " 'degrading',\n", + " 'camazotz',\n", + " 'blarney',\n", + " 'shakher',\n", + " 'delinquency',\n", + " 'tomreynolds',\n", + " 'insecticide',\n", + " 'charlton',\n", + " 'hare',\n", + " 'wayland',\n", + " 'nakada',\n", + " 'urbane',\n", + " 'sadomasochistic',\n", + " 'larnia',\n", + " 'hyping',\n", + " 'yr',\n", + " 'hebert',\n", + " 'accentuating',\n", + " 'deathrow',\n", + " 'galligan',\n", + " 'unmediated',\n", + " 'treble',\n", + " 'alphabet',\n", + " 'soad',\n", + " 'donen',\n", + " 'lord',\n", + " 'recess',\n", + " 'handsome',\n", + " 'center',\n", + " 'vignettes',\n", + " 'rescuers',\n", + " 'pairings',\n", + " 'uselful',\n", + " 'sanders',\n", + " 'nots',\n", + " 'hatsumomo',\n", + " 'appleby',\n", + " 'tampax',\n", + " 'sprinkling',\n", + " 'defacing',\n", + " 'lofty',\n", + " 'opaque',\n", + " 'tlc',\n", + " 'romagna',\n", + " 'tablespoons',\n", + " 'bernhard',\n", + " 'verger',\n", + " 'acumen',\n", + " 'percentages',\n", + " 'wendingo',\n", + " 'resonating',\n", + " 'vntoarea',\n", + " 'redundancies',\n", + " 'red',\n", + " 'pitied',\n", + " 'belying',\n", + " 'gleefulness',\n", + " 'bibbidi',\n", + " 'heiligt',\n", + " 'gitane',\n", + " 'journalist',\n", + " 'focusing',\n", + " 'plethora',\n", + " 'citizen',\n", + " 'coster',\n", + " 'clunkers',\n", + " 'deplorable',\n", + " 'forgive',\n", + " 'proplems',\n", + " 'magwood',\n", + " 'bankers',\n", + " 'aqua',\n", + " 'donated',\n", + " 'disbelieving',\n", + " 'acomplication',\n", + " 'immediately',\n", + " 'contrasted',\n", + " 'reidelsheimer',\n", + " 'fox',\n", + " 'springs',\n", + " 'toolbox',\n", + " 'contacting',\n", + " 'ace',\n", + " 'washrooms',\n", + " 'raving',\n", + " 
'dynamism',\n", + " 'mae',\n", + " 'sky',\n", + " 'disharmony',\n", + " 'untutored',\n", + " 'icarus',\n", + " 'taint',\n", + " 'kargil',\n", + " 'captain',\n", + " 'paucity',\n", + " 'fits',\n", + " 'tumbles',\n", + " 'amer',\n", + " 'bueller',\n", + " 'redubbed',\n", + " 'cleansed',\n", + " 'kollos',\n", + " 'shara',\n", + " 'humma',\n", + " 'felichy',\n", + " 'outa',\n", + " 'piglets',\n", + " 'gombell',\n", + " 'supermen',\n", + " 'superlow',\n", + " 'enhance',\n", + " 'goode',\n", + " 'shalt',\n", + " 'kubanskie',\n", + " 'zenith',\n", + " 'ananda',\n", + " 'ocd',\n", + " 'matlin',\n", + " 'nosed',\n", + " 'presumptuous',\n", + " 'rerun',\n", + " 'toyko',\n", + " 'mazar',\n", + " 'sundry',\n", + " 'bilb',\n", + " 'fugly',\n", + " 'orchestrating',\n", + " 'prosaically',\n", + " 'maricarmen',\n", + " 'moveis',\n", + " 'conelly',\n", + " 'estrange',\n", + " 'lusciously',\n", + " 'seasonings',\n", + " 'sums',\n", + " 'delirious',\n", + " 'quincey',\n", + " 'flesh',\n", + " 'tootsie',\n", + " 'ai',\n", + " 'tenma',\n", + " 'appropriations',\n", + " 'chainsaw',\n", + " 'ides',\n", + " 'surrogacy',\n", + " 'pungent',\n", + " 'gallon',\n", + " 'damaso',\n", + " 'caribou',\n", + " 'perico',\n", + " 'supplying',\n", + " 'ro',\n", + " 'yuy',\n", + " 'valium',\n", + " 'debuted',\n", + " 'robbin',\n", + " 'mounts',\n", + " 'interpolated',\n", + " 'aetv',\n", + " 'plummer',\n", + " 'competence',\n", + " 'toadies',\n", + " 'dubiel',\n", + " 'clavichord',\n", + " 'asunder',\n", + " 'sublety',\n", + " 'airfix',\n", + " 'stoltzfus',\n", + " 'ruth',\n", + " 'fluorescent',\n", + " 'improves',\n", + " 'rebenga',\n", + " 'russells',\n", + " 'deliberation',\n", + " 'zsa',\n", + " 'dardino',\n", + " 'macs',\n", + " 'servile',\n", + " 'jlb',\n", + " 'apallonia',\n", + " 'crossbows',\n", + " 'locus',\n", + " 'mislead',\n", + " 'corey',\n", + " 'blundered',\n", + " 'jeopardizes',\n", + " 'disorganized',\n", + " 'discuss',\n", + " 'longish',\n", + " 'tieing',\n", + " 'ledger',\n", + " 
'speechifying',\n", + " 'amitabhz',\n", + " 'bbc',\n", + " 'chimayo',\n", + " 'pranked',\n", + " 'superman',\n", + " 'aggravated',\n", + " 'rifleman',\n", + " 'yvone',\n", + " 'radiant',\n", + " 'galico',\n", + " 'debris',\n", + " 'waking',\n", + " 'btw',\n", + " 'havnt',\n", + " 'francen',\n", + " 'chattered',\n", + " 'scathed',\n", + " 'pic',\n", + " 'ceremonies',\n", + " 'watergate',\n", + " 'betsy',\n", + " 'majorca',\n", + " 'meercat',\n", + " 'noirs',\n", + " 'grunts',\n", + " 'drecky',\n", + " 'tribulations',\n", + " 'avery',\n", + " 'talladega',\n", + " 'eights',\n", + " 'dumbing',\n", + " 'alloimono',\n", + " 'scrutinising',\n", + " 'geta',\n", + " 'beltrami',\n", + " 'pvc',\n", + " 'horse',\n", + " 'tiburon',\n", + " 'huitime',\n", + " 'ripple',\n", + " 'loitering',\n", + " 'forensics',\n", + " 'nearly',\n", + " 'elizabethan',\n", + " 'ellington',\n", + " 'uzi',\n", + " 'sicily',\n", + " 'camion',\n", + " 'motivated',\n", + " 'rung',\n", + " 'gao',\n", + " 'licitates',\n", + " 'protocol',\n", + " 'smirker',\n", + " 'torin',\n", + " 'newlywed',\n", + " 'rich',\n", + " 'dismay',\n", + " 'skyler',\n", + " 'moonwalks',\n", + " 'haranguing',\n", + " 'sunburst',\n", + " 'grifter',\n", + " 'undersold',\n", + " 'chearator',\n", + " 'marino',\n", + " 'scala',\n", + " 'conditioner',\n", + " 'ulysses',\n", + " 'lamarre',\n", + " 'figueroa',\n", + " 'flane',\n", + " 'allllllll',\n", + " 'slide',\n", + " 'lateness',\n", + " 'selbst',\n", + " 'gandhis',\n", + " 'dramatizing',\n", + " 'catchphrase',\n", + " 'doable',\n", + " 'stadiums',\n", + " 'alexanderplatz',\n", + " 'pandemonium',\n", + " 'misrepresents',\n", + " 'earth',\n", + " 'mounties',\n", + " 'seeker',\n", + " 'cheat',\n", + " 'outbreaks',\n", + " 'snowstorm',\n", + " 'baur',\n", + " 'schedules',\n", + " 'bathetic',\n", + " 'incorrect',\n", + " 'johnathon',\n", + " 'rosanne',\n", + " 'mundanely',\n", + " 'cauldrons',\n", + " 'forrest',\n", + " 'poky',\n", + " 'legislation',\n", + " 'womanness',\n", + " 
'spender',\n", + " 'crazy',\n", + " 'rational',\n", + " 'terrell',\n", + " 'zero',\n", + " 'coincides',\n", + " 'thoughout',\n", + " 'mathew',\n", + " 'narnia',\n", + " 'naseeruddin',\n", + " 'bucks',\n", + " 'affronts',\n", + " 'topple',\n", + " 'degree',\n", + " 'preyed',\n", + " 'passionately',\n", + " 'defeats',\n", + " 'torchwood',\n", + " 'sources',\n", + " 'botticelli',\n", + " 'compactor',\n", + " 'kosturica',\n", + " 'waiving',\n", + " 'gunnar',\n", + " 'stiffler',\n", + " 'fwd',\n", + " 'kawajiri',\n", + " 'eleanor',\n", + " 'sistahs',\n", + " 'soulhunter',\n", + " 'belies',\n", + " 'wrathful',\n", + " 'americans',\n", + " 'ferdinandvongalitzien',\n", + " 'kendra',\n", + " 'weirdy',\n", + " 'unforgivably',\n", + " 'chepart',\n", + " 'tatta',\n", + " 'departmentthe',\n", + " 'dig',\n", + " 'blatty',\n", + " 'marionettes',\n", + " 'atop',\n", + " 'chim',\n", + " 'saurian',\n", + " 'woes',\n", + " 'cloudscape',\n", + " 'resignedly',\n", + " 'unrooted',\n", + " 'keuck',\n", + " 'hitlerian',\n", + " 'stylings',\n", + " 'crewed',\n", + " 'bedeviled',\n", + " 'unfurnished',\n", + " 'reedus',\n", + " 'circumstances',\n", + " 'grasped',\n", + " 'smurfettes',\n", + " 'fn',\n", + " 'dishwashers',\n", + " 'roadie',\n", + " 'ruthlessness',\n", + " 'refrains',\n", + " 'lampooning',\n", + " 'semblance',\n", + " 'richart',\n", + " 'legions',\n", + " 'gwenneth',\n", + " 'enmity',\n", + " 'assess',\n", + " 'manufacturer',\n", + " 'bullosa',\n", + " 'outrun',\n", + " 'hogan',\n", + " 'chekov',\n", + " 'blithe',\n", + " 'code',\n", + " 'drillings',\n", + " 'revolvers',\n", + " 'aredavid',\n", + " 'robespierre',\n", + " 'achcha',\n", + " 'boyfriendhe',\n", + " 'wallow',\n", + " 'toga',\n", + " 'graphed',\n", + " 'tonking',\n", + " 'going',\n", + " 'bosnians',\n", + " 'willy',\n", + " 'rohauer',\n", + " 'fim',\n", + " 'forbidding',\n", + " 'yew',\n", + " 'rationalised',\n", + " 'shimomo',\n", + " 'opposition',\n", + " 'landis',\n", + " 'minded',\n", + " 'despicableness',\n", + 
" 'easting',\n", + " 'arghhhhh',\n", + " 'ebb',\n", + " 'trialat',\n", + " 'protected',\n", + " 'negras',\n", + " 'rick',\n", + " 'muti',\n", + " 'tracker',\n", + " 'shawl',\n", + " 'differentiates',\n", + " 'sweetheart',\n", + " 'deepened',\n", + " 'manmohan',\n", + " 'trevethyn',\n", + " 'brain',\n", + " 'incomprehensibly',\n", + " 'piercing',\n", + " 'pasadena',\n", + " 'shtick',\n", + " 'ute',\n", + " 'viggo',\n", + " 'supersedes',\n", + " 'ack',\n", + " 'cites',\n", + " 'taurus',\n", + " 'relevent',\n", + " 'minidress',\n", + " 'philosopher',\n", + " 'bel',\n", + " 'mahattan',\n", + " 'moden',\n", + " 'compiling',\n", + " 'advertising',\n", + " 'rogues',\n", + " 'unimaginative',\n", + " 'subpaar',\n", + " 'ademir',\n", + " 'darkly',\n", + " 'saturate',\n", + " 'fledgling',\n", + " 'breaths',\n", + " 'padre',\n", + " 'aszombi',\n", + " 'pachabel',\n", + " 'incalculable',\n", + " 'ozone',\n", + " 'sped',\n", + " 'mpho',\n", + " 'rawail',\n", + " 'forbid',\n", + " 'synth',\n", + " 'guttersnipe',\n", + " 'reputedly',\n", + " 'holiness',\n", + " 'unessential',\n", + " 'hampden',\n", + " 'asylum',\n", + " 'bolye',\n", + " 'strangers',\n", + " 'rantzen',\n", + " 'farrellys',\n", + " 'vigourous',\n", + " 'cantinflas',\n", + " 'enshrined',\n", + " 'boris',\n", + " 'expetations',\n", + " 'replaying',\n", + " 'prestige',\n", + " 'bukater',\n", + " 'overpaid',\n", + " 'exhude',\n", + " 'backsides',\n", + " 'topless',\n", + " 'sufferings',\n", + " 'nitwits',\n", + " 'cordova',\n", + " 'incensed',\n", + " 'danira',\n", + " 'unrelenting',\n", + " 'disabling',\n", + " 'ferdy',\n", + " 'gerard',\n", + " 'drewitt',\n", + " 'mero',\n", + " 'monsters',\n", + " 'precautions',\n", + " 'lamping',\n", + " 'relinquish',\n", + " 'demy',\n", + " 'drink',\n", + " 'chamberlin',\n", + " 'unjustifiably',\n", + " 'cove',\n", + " 'floodwaters',\n", + " 'searing',\n", + " 'isral',\n", + " 'ling',\n", + " 'grossness',\n", + " 'pickier',\n", + " 'pax',\n", + " 'wierd',\n", + " 'tereasa',\n", + " 
'smog',\n", + " 'girotti',\n", + " 'spat',\n", + " 'sera',\n", + " 'noxious',\n", + " 'misbehaving',\n", + " 'scouts',\n", + " 'refreshments',\n", + " 'autobiographic',\n", + " 'shi',\n", + " 'toyomichi',\n", + " 'bits',\n", + " 'psychotics',\n", + " 'barzell',\n", + " 'colt',\n", + " 'shivering',\n", + " 'pugilist',\n", + " 'gladiator',\n", + " 'dryer',\n", + " 'reissues',\n", + " 'scrivener',\n", + " 'predicable',\n", + " 'objection',\n", + " 'marmalade',\n", + " 'seems',\n", + " 'spellbind',\n", + " 'trifecta',\n", + " 'innovator',\n", + " 'shriekfest',\n", + " 'inthused',\n", + " 'contestants',\n", + " 'goody',\n", + " 'samotri',\n", + " 'serviced',\n", + " 'nozires',\n", + " 'ins',\n", + " 'mutilating',\n", + " 'dupes',\n", + " 'launius',\n", + " 'widescreen',\n", + " 'joo',\n", + " 'discretionary',\n", + " 'enlivens',\n", + " 'bushes',\n", + " 'chills',\n", + " 'header',\n", + " 'activist',\n", + " 'gethsemane',\n", + " 'phoenixs',\n", + " 'wreathed',\n", + " 'sacrine',\n", + " 'electrifyingly',\n", + " 'basely',\n", + " 'ghidora',\n", + " 'binder',\n", + " 'dogfights',\n", + " 'sugar',\n", + " 'doddsville',\n", + " 'porkys',\n", + " 'scattershot',\n", + " 'refunded',\n", + " 'rudely',\n", + " 'insteadit',\n", + " 'zatichi',\n", + " 'eurotrash',\n", + " 'radioraptus',\n", + " 'hurls',\n", + " 'boogeman',\n", + " 'weighs',\n", + " 'danniele',\n", + " 'converging',\n", + " 'hypothermia',\n", + " 'glorfindel',\n", + " 'birthdays',\n", + " 'attentive',\n", + " 'mallepa',\n", + " 'spacewalk',\n", + " 'manoy',\n", + " 'bombshells',\n", + " 'farts',\n", + " 'lyoko',\n", + " 'southron',\n", + " 'destruction',\n", + " 'flemming',\n", + " 'manhole',\n", + " 'elainor',\n", + " 'bowersock',\n", + " 'lowly',\n", + " 'wfst',\n", + " 'limousines',\n", + " 'skolimowski',\n", + " 'saban',\n", + " 'koen',\n", + " 'malaysia',\n", + " 'uwi',\n", + " 'cyd',\n", + " 'apeing',\n", + " 'bonecrushing',\n", + " 'dini',\n", + " 'merest',\n", + " 'janina',\n", + " 'chemotrodes',\n", + " 
'trials',\n", + " 'authorize',\n", + " 'whilhelm',\n", + " 'asthmatic',\n", + " 'broads',\n", + " 'missteps',\n", + " 'embittered',\n", + " 'chandeliers',\n", + " 'seeming',\n", + " 'miscalculate',\n", + " 'recommeded',\n", + " 'schoolwork',\n", + " 'coy',\n", + " 'mcconaughey',\n", + " 'philosophically',\n", + " 'waver',\n", + " 'fanny',\n", + " 'mestressat',\n", + " 'unwatchably',\n", + " 'saggy',\n", + " 'topness',\n", + " 'dwellings',\n", + " 'breakup',\n", + " 'hasselhoff',\n", + " 'superstars',\n", + " 'replay',\n", + " 'aggravates',\n", + " 'balances',\n", + " 'urging',\n", + " 'snidely',\n", + " 'aleksandar',\n", + " 'hildy',\n", + " 'kazuhiro',\n", + " 'slayer',\n", + " 'tangy',\n", + " 'brussels',\n", + " 'horne',\n", + " 'masayuki',\n", + " 'molden',\n", + " 'unravel',\n", + " 'goodtime',\n", + " 'interrogates',\n", + " 'bismillahhirrahmannirrahim',\n", + " 'rowboat',\n", + " 'dumann',\n", + " 'datedness',\n", + " 'astrotheology',\n", + " 'dekhiye',\n", + " 'valga',\n", + " 'kata',\n", + " 'wipes',\n", + " 'hostilities',\n", + " 'sentimentalising',\n", + " 'documentary',\n", + " 'salesman',\n", + " 'virtue',\n", + " 'unreasonably',\n", + " 'haver',\n", + " 'cei',\n", + " 'unglamorised',\n", + " 'balky',\n", + " 'complementary',\n", + " 'paychecks',\n", + " 'mnica',\n", + " 'wada',\n", + " 'ily',\n", + " 'prc',\n", + " 'ennobling',\n", + " 'functionality',\n", + " 'dissociated',\n", + " 'elk',\n", + " 'throbbing',\n", + " 'tempe',\n", + " 'linoleum',\n", + " 'photogrsphed',\n", + " 'bottacin',\n", + " 'hipper',\n", + " 'titillating',\n", + " 'barging',\n", + " 'untie',\n", + " 'sacchetti',\n", + " 'gnat',\n", + " 'roedel',\n", + " 'cohabitation',\n", + " 'performs',\n", + " 'sales',\n", + " 'migrs',\n", + " 'teachs',\n", + " 'nanavati',\n", + " 'fresco',\n", + " 'davison',\n", + " 'obstinate',\n", + " 'burglar',\n", + " 'masue',\n", + " 'dickory',\n", + " 'grills',\n", + " 'appelagate',\n", + " 'linkage',\n", + " 'enables',\n", + " 'loesser',\n", + " 
'patties',\n", + " 'prudent',\n", + " 'mallorquins',\n", + " 'nativetex',\n", + " 'suprise',\n", + " 'drippy',\n", + " 'quill',\n", + " 'speeded',\n", + " 'farscape',\n", + " 'saddening',\n", + " 'centuries',\n", + " 'mos',\n", + " 'improvisationally',\n", + " 'neccessarily',\n", + " 'transmitter',\n", + " 'tankers',\n", + " 'latte',\n", + " 'mechanisation',\n", + " 'faracy',\n", + " 'synthetically',\n", + " 'thoughtless',\n", + " 'rake',\n", + " 'ropes',\n", + " 'desirable',\n", + " 'whitewashed',\n", + " 'donal',\n", + " 'crabby',\n", + " 'lifeless',\n", + " 'perfidy',\n", + " 'teresa',\n", + " 'bulldog',\n", + " 'cockamamie',\n", + " 'rasberries',\n", + " 'notethe',\n", + " 'captivity',\n", + " 'chiseling',\n", + " 'smaller',\n", + " 'clampets',\n", + " 'alerts',\n", + " 'tough',\n", + " 'wellingtonian',\n", + " 'aaaahhhhhhh',\n", + " 'dither',\n", + " 'incertitude',\n", + " 'florentine',\n", + " 'imperioli',\n", + " 'licking',\n", + " 'disparagement',\n", + " 'artfully',\n", + " 'feds',\n", + " 'fumiya',\n", + " 'tearfully',\n", + " 'lanchester',\n", + " 'undertaken',\n", + " 'longlost',\n", + " 'netted',\n", + " 'carrell',\n", + " 'uncompelling',\n", + " 'reliefs',\n", + " 'leona',\n", + " 'autorenfilm',\n", + " 'unfriendly',\n", + " 'typewriter',\n", + " 'shifted',\n", + " 'bertrand',\n", + " 'blesses',\n", + " 'tricking',\n", + " 'fireflies',\n", + " 'zanes',\n", + " 'unknowingly',\n", + " 'unnerve',\n", + " 'caning',\n", + " 'flat',\n", + " 'recluse',\n", + " 'dcreasy',\n", + " 'chipmunk',\n", + " 'dipper',\n", + " 'musee',\n", + " 'cousin',\n", + " 'shys',\n", + " 'berserkers',\n", + " 'eve',\n", + " 'conflagration',\n", + " 'irks',\n", + " 'restricts',\n", + " 'parsing',\n", + " 'positronic',\n", + " 'copout',\n", + " 'khala',\n", + " 'swiftness',\n", + " 'higginson',\n", + " 'imprint',\n", + " 'walter',\n", + " 'sundance',\n", + " 'whispering',\n", + " 'thematically',\n", + " 'underimpressed',\n", + " 'uno',\n", + " 'expressly',\n", + " 'russkies',\n", + 
" 'discos',\n", + " 'shaping',\n", + " 'verson',\n", + " 'prototype',\n", + " 'chapman',\n", + " 'trafficker',\n", + " 'semetary',\n", + " 'unrealistically',\n", + " 'lifewell',\n", + " 'rivas',\n", + " 'consequent',\n", + " 'katsu',\n", + " 'titantic',\n", + " 'jalees',\n", + " 'ranee',\n", + " 'shipbuilding',\n", + " 'gambles',\n", + " 'dispenses',\n", + " 'disfigurement',\n", + " 'bright',\n", + " 'cristian',\n", + " 'puertorricans',\n", + " 'constituent',\n", + " 'capta',\n", + " 'jewel',\n", + " 'erect',\n", + " 'farah',\n", + " 'despondently',\n", + " 'avoide',\n", + " 'inconnu',\n", + " 'headquarters',\n", + " 'sanguisga',\n", + " ...]" + ] + }, + "execution_count": 75, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "list(vocab)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 0., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 46, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "import numpy as np\n", + "\n", + "layer_0 = np.zeros((1,vocab_size))\n", + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
ITBiD8G9077fkJiJ5O56gR2HhXNgC4ltRSO/m48J5w7/z6hcuL+53Ezo7TveizvMGkFXy5M1S\nJ29wWB6cv0jZUzJxOjq/keXLl1v/kbRvlnHl6vzgIJDnfydP3jwI77XXXp3s3aYCOokSHhxxxBGd\nlElfaCEjzn+PzFE+ZlHnSEvAQCcXXnjhMP8LXrIdSXFpivomAJoTLCG+fv60DGkOOuggl9RgPUGv\nrJJmmixNHbmJSJaOdwoSzc2Zl7gRcKRBYJXOzOTSQkT8myXKx4JpDRcxDmchJI9+ru4k30V2dpL6\n8qZhXjYc8S9vmf4/Zdqy8uRNW5dLj+UBX4x+Tsm4ut238wc577zzrC6OGLnr+hYCvRDI87+TJ28v\nvbpdf9/73te5/OMf/7hznPfAd+TEZ8ONA5TLy60fNZWoq05cBFF+48fn5+PY+fa59FHfLKZwgzzT\nzZ/61KeikkWe86f2IxOETrrpGXfa6Rc1LYP/iHshZ2xFL//Zzzjsj53hKKtEq3biynG/C/sOzDK5\nhKU+gTKdj7+cNWj0sNgSQSM6dYWXDJEvvO6a374EJshOPdTpL/UML99lyRKSVT90de3y20SZcdf8\n8/7yXadHcJN0ykQvJ36+cJtJQ/1OFzDwxZ3nO6ynny587JZlhs+n/e0v7UIH+oG+TSqk9dtHGZSZ\nVVgemWZZcvDgGPrGN76Rtbpc+aKW87Jcl/P9FnAAO+4Lt0QXHMMfAqGRJvBRqETPfuNSdH0O37zl\n1u3/jvs2sOYlbpb/P8+zMiz+czvueeA/+xhrnPjPUz9N+Nh/BpM/fL3bb1dfeNzplsevD12j0ro0\nfvtJFyU+hq4sfzmvnydcnksf/ga7sPjjlj/mhtPl+R3dwpQlZul4n1TQUDd4+f9gYVCS3izhzsii\nn58nPMDHXcva2X55eYiIu6nczdytG4t6IEb9M6AHDxf6kutRH66Rxunsf4fx7taO8DUCbKXZYI3B\nl4G/34N/N8LRL32oB7yIawH+fDuiAS5RH9Jz70BQyJMnIFq47wbhN5imIcpxmNTt/y5tu8LPcvf8\nd+31n6VpiQhlxz1b3HMm6hnj1+nSue8w4XBEhG9/oHbp+WaMYyxy5yjDlygd3bM7rIufzx2DmSvb\nfXcjCnH3jMvLOOTaFVdHuJ9curzfhRCRtB2PtcI1nm//puh2jVopbAYAACaLSURBVMaGO8gvh2M6\nNwxWWv2oxycHvn69rmXpbL+utESk282MrnGS9sERVw5YR+kQ7pekv6P6L67uqPNpIjz6Az549Euo\nCwtEN0E3yEoZgjXDEQmIB7976ROnB21xAdEgJRCVrGXF1dGm8/QpOOWVuv3fpX0BoP3+cyM8gPrP\n+bRExGHLs9ivg2cQ5CDqGevycI363POKZzO6cN6d49sfYxizfMLh8lBmHIHhWjgf5VIX4ref83Hi\nt89/oY9LT51hndA3PMa5/PSLa3dcP7i0eb7jW5ih1KQd74MHCGEJW0vCLA0w/TQA1Q1MV35S/UhP\nea4Dwp3U7Rp503a2X17UPwn1O11oty/dbmY/XfiYgS6NKTWc3/+NDn6fOl3TflMGZeURBlgG1iQS\nJh/h30nKSJPGTX8kzZM2fa9yaR+DYFmEwREc7iusJpJoBPi/KIKs1en/DkILGUkj/mAbfq6lKacf\naf1nMAP+oIhPsBxJKqPthRKRMhRUmeUhwIBU5Fs3/6w+qUpKRMgTJntZW530IR9FOnwLSdb64/Ll\nsXCga56Bi7oJvw1BSDtYxLWn23n0hRByf0Xh3C3vIFxLQ5aT4FGH/7ssfY1VwU1rJHmbT4JF1jT+\nixTPI//ll5dDpyfPF9IOgtA/7hkOJmXKKAoPKpMMIAJXXHGFjXzKio0iBe/05557zgZ5Y437L37x\ni2HF/8Vf/IUhWixLnj/ykY8MW/I2LGHKHxs2bLDr93utBHDxQ4KBe
UQNxPLgfJEhy6MCl42ouMeJ\nrGWAybnnnmumT59u5s+fX2i7eqhsWJIcvOkaljWyZYPk9wjcfPPNNvzAjTfeWCgkVf3f8f9EAL6A\n8KZuDytSXETSgFCVGnujm3K+Ht3ScS2wDGSKl9Sr3Lpd9zEpvc1lshyVXW8E2KiqqA24qm4p0wKz\nZ8+2/gq9dOn1lt7req/y/etYnPJYM/yy0lpV8N3ACpJ0qsqvq6hjdOYe41MUDkXpVkU5YMAqrdGj\nR7cGjyz+IT72zhpR9lu3X2fUse8bEpCCjjXAP677FFJUu7Kc861V/bAAaWomSy+1JA8PRQYqBoum\nC235q7/6K/uQh0jEkYm48377KauIKSvqoqwihfKStIE5ewb/ItpRhP7oU/RUYBF69aMM+oA+4+P6\nAyyqJIhFtjtvW/B1cYN9UVO0WduHH4TvF+H0wsEzyn8vaz11z0c/0HampPxpqrL01tRMgPYgC9Mz\nSNFm4n5j+vDDD5tbbrllWKRDoqU6IQCQm26JmpJx6dx3t+kblybu2wUpK3O/mG6b5tGnRHRcu3Zt\np81xuvbzPHqxWRtTZ20OY8+9409TRG1uyLTVI488MmxH8X72RVF1cR9OnTp1WHuLKlvlDA4CIiKD\n09eRLXXzu8GbWq0GrUhlu5w89NBDrQ/EaaedFpkKchBMRdldKknAXjO9CEmWsO/gSV39GGij/Ebq\nSkJcpzgy0vT7zbXHffukN8m9xT0CQSFfr/vQ1VHH79NPP90cffTR5tJLL62jetKpIQiIiDSko8pU\nk8GBEL9NfZhgDbnmmmvM5s2bY2EKk4okb60UFs4XW0FwIYoYdEtfxDWf+OAEedddd5mNGzfWmlTW\nnSwl6Rf6Opgm6yTNYv2iv9gjhNDgTRT+N7CGtI1UNrEvmq6ziEjTe7AA/RnMeIvDnNy0tzPeLI86\n6iizcOFCc/zxx0eiQfuQbm2LG1gon/y9LBw8lKNM8JEKFXwSHbH2/PM//7N5+umne+pacPWZimP3\n0MCHpTGracCYAddJEX1NmZSzZs0au+rEld2Ub1lDmtJT9ddTRKT+fdQXDdnxeMuWLY17O0uidxqr\nhgObPE7+93//13z0ox+NtDK4ASrLG7ErP++3G9BuvfVWEzc1lbeOovND7sDs3nvvjSWQRdeZtjz/\nHsDHqBcZTVs+6fEVWbRoUe2tWOG2Ob27WSHDefRbCMQhICISh8yAnWcww7IwZ84cU3RckbKgZKDA\nNMx3nLWDa3lJAthE+ZcwmHKtjAEqDWZMdbCV+tKlS9Nkqzytm1Kry1QS/ek7mea9b5ICjGUhWHnS\nGOsQ1kMsWk215CTtF6XrHwIiIv3DuvY18YDBVIwJuurBtRdYSawADCxIHEnpVYd/nfooD1z4JmDb\nu9/9bhPEg6hsSgb93KDQjYz57ajbcdXmfXBzksTJ1KUt8pv7CdLTBIsW/wdTpkyxLwBN9Skrsu9U\nVjEIiIgUg2NrSuEt9ZJLLqn1Ekv3MAwCIHVddlyENcTvWEdseGv2fQTi/Ev8vGUdVz2Q520XfdRP\nh8cq+6obVi4CLkt6ua/rKjNnzrTWt6Y62NYV10HXS0Rk0O+AiPbXeVWDIyF77rlnV3+WokkIMFE3\nUzS9pq78t+yyfAvQx1lDmr5qoUwyRZ8V7WQK9mXIv//7v9upUVbS1NEiWefnQhn9oTL7h4CISP+w\nblRN7qGzePHi2jwUHQkByG7BupzloogpGddplEn9DBBpSE54ICzS/E8fsV9P0/dxcVYR3z/D4Z7l\nu19EMItucXnc/bV169ZaWiTd86Db/11c23ReCPRCQESkF0IDfJ2HT10iYfKg/vSnP232228/u8rA\nRUmN6p40RCEqf/gclgfqc8QGcoE+Wd5ayecPuP4UT7jebr/Rgby01enVLX3drxGQrtsS7G76hzHt\nl5NpN53SXEN/xPWjmx6tg88I9
xk+IYhIiIVBf0pA4E//XyAllKsiW4BAsNmRHfhZTbPvvvsaBosq\n5MEHH7R+BGeccYYdrN75znfGqlEGCWGA2GuvvTp1Uj9Levnupksng3cAocEq4j7btm0zP/7xjy2x\n+fWvfz2sHi/biMNvf/vbxr09j7jYwBPvete7zLPPPmu453oJg+Mrr7xiMWMQZ/oLUuYw7ZW/TtfD\nJATdDjzwQPu/NmvWLPPb3/7WfPzjH69EZf6XWA7O0vXbb789cvl6JYqp0tYhIItI67q0+AZhETjz\nzDPN/vvvb4gG6d7ciq9peIkMOCwnfuyxx6wVhCWO3awQUQ/14SWm+8WDuJvFomjSQ3t9f4Zu0zhY\nq5ocDTfcE/QdlgzfWuSn8Z1My/S78ess+7jX/cp1rIDIggULci9DT9oe7sOvfvWrlnxcd911PX2i\nkpardEIgFoGydtNTue1CgF1fb7rpJrsjIzupBgNGaQ10dQWEZ2jGjBmdHWw5HwzUsfUm2ZU2NrN3\ngXqSlpU0nVd84kMwpnz3QS8n7HjaDQuXrknffptcH0S1vUltitOVvk36P8T/Hf8LZf/foWvgjG3r\nmjZtWmL94tqo80IgKQImaUKlEwIgwMOTB2LAbO13kYMhZbuHLg/CqEHeDVDh3ohKG06T5Dc6pGkT\n6fn0Q9CLdr700ksW/37U2c86zj333KGrrrrKtjFNH/RTxyLqynLPcN/7/3dF3e+0h7L5v4MI8lm/\nfn0RzVQZQiAxAn8SayrRBSEQgQDTMjfeeGPHhE6ERT5M2TBVkVYwuRMumjKYiti+fbuN2Eicgiin\nQ3wsOE9dmJARTNjkzSvognSb/gnXAR7o4XQJXy/yN3rR9t/97ncmIGpFFl2Lsg4//HDzZ3/2Z7aN\nafqgFsonVKLXdExcMdz37v+OKTn8R/DZYouDLP936IFTLHFBmOrC52bJkiV248i4PZvidNN5IZAX\nAfmI5EVQ+Q3BmPDjCN6k7H41DJJjx461PgzAwxJTHnY7duywaO3atcume/755+3vyZMnm1NOOcVM\nmjQplUMcxIEHdPCGGUlabOEJ//Aw7+YP0qsY8kcRp175slyHuL366qtdg7llKbfqPGCIL0Rbg2Vl\nJSFx/cL/HRF+2eiQD5sIsqpswoQJnSzjx483r7/+uv0d9393xBFH9M3vq6OYDoSAh4CIiAeGDvMj\ngGUgMKvbFR08+BCsHOyF4j8gJ06caK0YWBTyyDe/+U3zvve9L5UVw6/P6ZuXRFAOA00/3uSxPiFt\nC7HdZiJSNAnx72GO3X3MSqpu/3cQk7333rsv92lYR/0WAnEIiIjEIaPztUfAPdyxikB+0pIJ8vMA\nL4o8YKGBWKFPmdJWIkJfYDkLJpbLhK/vZbv7NC/p7rviqlAI9AkB+Yj0CWhVUzwCTMm4gR8Swhs1\ng1kSyeIP0qtcCA2ESJINgbIJXDat8uVy95lISD4clbvdCIiItLt/W9u6KJ8MyAhvn+4NNK7x5GVg\nKGNwcIQorm6dHxwEnIWsjPtscFBUSwcBARGRQejllrURohG3SsZNs7g3Ub/pWEscgSnz7RvdepEh\nXy8d/x4BMGvLoO1ISJn3me4bIdAWBERE2tKTA9QONyUT12Rn7YB0OGGQ45PWj8TlT/NN/ZCepNNE\nacpuc1r6FSfmpotISNN7UPr3G4F39LtC1ddeBBh48ZFgWa5bKhjVWre0Fw9+9tVI8xbsLBpR5frn\neBN10yR/+qd/av76r/+6MKdUv564YywzSXWNKyPuPMuhN27cGHe5seeDwFqN1d0pLhLikNC3EEiO\ngIhIcqyUMgIBrAxPPvmkDUrmYhkcdthhNobI3LlzI3IYu7SXJb2LFy82q1atMkE0Rxugi03t3NRK\nVEbqipuSiUrPOVZh/OY3v4m7XOp54pIwMHVrUxYFxo0bZ/HOkrfOeYh3wb3QVBEJaWrPSe+qEdD
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
uAvnnHFNwOHMTNt4N/jZBhX+cHnJMrbATVLUQSIlAIywivKHx0GOKglgi\nZ5xxho1t4Jw4Gdj9wd3HgAdm1dJNPxe3oZuOZbSXYFXEicgrrFBieqxICTvzpikbcsVbe90Fnw8G\nTawQzockqc6kdzsJJ83jpyMv8XUWLlzYCUgWhHzvGQsEAuULRCZYWmwee+wxax1Br9mzZ9vYJH66\nfh2LhPQLadUjBApGoMi1wEWWxV4sQVOHfQIi0qkiICcjQrSH0xOvwxc/fodflp+GY78cYntECfld\nunA97jzf4RDx/rVwvrg4ItSfpb1RertzxFQI3hrdz9Z8VxniPQuIgb9Q5vDqafdKcTFW6HdiyBQZ\nV4N2VLnXjOKEZLn7lEcI1AOB2k7N4FcRDNSxsS7wq1i3bp1NE+wZE4zvfxQiofKWniUK6h9LKeaI\nMOa8fWLNcYK+AdFKpV/R7XWm6zwrOVx76vS9atUqc+CBB9ZJpa664DdCX6R1YiUfH2cF6FYJlgsc\ngKdOnWqj6bL65tJLL+1pAelWZvgauhCldfPmzTZ0/DXXXGOnbfpxf7k63D0d1k2/hYAQqDcCo+BD\n9VZR2pWFAL4BbNde123a07Z7w4YNJtiAzQ6GafPWIT1kJK0Ta68pGq5DyPfff3/ry9HPwRrfkc9/\n/vMmsL5Y4lMGxpAQ2gQRkggBIdBMBGprEWkmnM3SGufD5cuXN0vpLto+99xz1uehS5JaX8rixNpt\nSS99ixVkzpw5dk+YfpIQgMbqgvVlzZo11iG2CAdbvwNFQnw0dCwEmouALCLN7bvcmjMw4BgazK8X\naqbPrVjGAljaysZtaZ0/M1ZXWjamW+ibpO1w0zM+0bjiiivMypUra4EHbYEMvfnmm2bt2rWFWC9E\nQkq7/VSwEOg7ArKI9B3y+lSIOZupjGXLltVHqYyasAx19OjRiQfvjNX0JRuEgk9SvxHSMtjzQSAh\nrCjDGpGUzJTZMO4zdvjFT2rKlCkdPbPWKRKSFTnlEwL1REAWkXr2S9+0YrDDfM+g1eR59g984APm\nkksuMTNmzOgbdv2oiP5J6jdCWhyjISFFWR6KbqMjSVn1EwkpukdUnhCoHgFZRKrvg0o1wMdg4sSJ\n5u67765UjzyVYw3B55qYJgzG/idPuXXIm8ZvhGBuTMd8/etfry2pvPHGG81+++1nzj///NTwioSk\nhkwZhEAjEJBFpBHdVK6SDNxYRfj2/QzKrbW40g899FC7ZJTlo2GhTb7gE1OH6QpfpyTHvfxGGKSx\nnNRlOqZbm5hCYorm2GOPNfPmzeuWtHNNJKQDhQ6EQOsQEBFpXZdmaxAm87ffftvO5WcroZpcLBFl\nVQZOqkmEQZDB2pekUx9+niqOne5YSXzhPMuwcQhtylJsiAXh4em7cHv8tnEsEhJGRL+FQLsQEBFp\nV39mbg2DGQPyrbfeWlmI7rTKF2UFoJzwjsi9Bse0uhaZHiuPT54IVrZlyxa7RLfIesoui+XFixYt\n6hr3JdzWsnVS+UJACPQfARGR/mNe2xp56DNF04QlsGVbAcJTOiwNrtO0FeTJORejW1OXYGMV4Z4j\n5khY6IM6E8KwvvotBIRANgRERLLh1tpcTHXcddddZuPGjZ2Bro6NZQBjOSjOj/0QfDQY7H3xrRL+\n+X4do9MXv/hFs++++yb2teiXbknrceQ3vGpLJCQpgkonBJqPgIhI8/uw8BbkXWJZuEKhAuuiX3hK\np9+OsBARrCHoccABB4RQas7P008/3e6B46wiIiHN6TtpKgSKQEBEpAgUW1hGXQZ7H1qmY9hMra5x\nMtAv7Ahb5pQOfYT0yyrk90WRxxAP9sNhwzyRkCKRVVlCoBkIiIg0o58q0ZKBbv369TZIVtVLXhnk\nWfKJZA2GVQWIUVM6Rfk9QHKa4M+TBHeWYF9wwQXm4osvTpJcaYSAEGgRAu9oUVvUlIIR4
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 47, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "{'': 0,\n", + " 'inhabitants': 1,\n", + " 'goku': 2,\n", + " 'stunts': 3,\n", + " 'catepillar': 4,\n", + " 'kristensen': 5,\n", + " 'goddess': 7,\n", + " 'offing': 49797,\n", + " 'distroy': 8,\n", + " 'unexplainably': 9,\n", + " 'concoctions': 10,\n", + " 'petite': 11,\n", + " 'paramilitary': 24759,\n", + " 'scribe': 12,\n", + " 'stevson': 13,\n", + " 'senegal': 6,\n", + " 'sctv': 14,\n", + " 'soundscape': 15,\n", + " 'rana': 16,\n", + " 'immortalizer': 18,\n", + " 'rene': 67354,\n", + " 'eko': 23,\n", + " 'planning': 20,\n", + " 'akiva': 21,\n", + " 'plod': 22,\n", + " 'orderly': 24,\n", + " 'zeleznice': 25,\n", + " 'critize': 29,\n", + " 'baguettes': 25649,\n", + " 'jefferies': 30,\n", + " 'uncertainties': 61695,\n", + " 'mountainbillies': 31,\n", + " 'steinbichler': 32,\n", + " 'vowel': 33,\n", + " 'rafe': 34,\n", + " 'donig': 68719,\n", + " 
'tulipe': 36,\n", + " 'clot': 37,\n", + " 'hack': 12526,\n", + " 'distended': 38,\n", + " 'cornered': 37116,\n", + " 'impatiently': 40,\n", + " 'batrice': 12525,\n", + " 'unfortuntly': 41,\n", + " 'lung': 42,\n", + " 'scapegoats': 43,\n", + " 'pscychosexual': 45,\n", + " 'outbid': 46,\n", + " 'obit': 47,\n", + " 'sideshows': 48,\n", + " 'jugde': 49,\n", + " 'kevloun': 51,\n", + " 'quartier': 53,\n", + " 'harp': 61948,\n", + " 'unravelling': 54,\n", + " 'antiques': 56,\n", + " 'strutts': 57,\n", + " 'tilts': 58,\n", + " 'disconcert': 59,\n", + " 'dossiers': 60,\n", + " 'sorriest': 61,\n", + " 'craftsman': 49412,\n", + " 'blart': 62,\n", + " 'dependence': 37120,\n", + " 'sated': 61698,\n", + " 'iberia': 63,\n", + " 'sagan': 72,\n", + " 'frmann': 65,\n", + " 'daniell': 66,\n", + " 'rays': 67,\n", + " 'pried': 68,\n", + " 'khoobsurat': 69,\n", + " 'leavitt': 70,\n", + " 'caiano': 71,\n", + " 'attractiveness': 73,\n", + " 'kitaparaporn': 74,\n", + " 'hamilton': 75,\n", + " 'massages': 76,\n", + " 'horgan': 78,\n", + " 'chemist': 79,\n", + " 'audrey': 80,\n", + " 'yeow': 55655,\n", + " 'jana': 81,\n", + " 'dutch': 82,\n", + " 'pinchot': 24773,\n", + " 'override': 83,\n", + " 'dwervick': 63223,\n", + " 'spasms': 84,\n", + " 'resumed': 85,\n", + " 'tamale': 66259,\n", + " 'calibanian': 49636,\n", + " 'stinson': 86,\n", + " 'widows': 87,\n", + " 'stonewall': 88,\n", + " 'palatial': 89,\n", + " 'neuman': 90,\n", + " 'abandon': 91,\n", + " 'lemmings': 65314,\n", + " 'anglophile': 92,\n", + " 'ertha': 61706,\n", + " 'chevette': 94,\n", + " 'unscary': 95,\n", + " 'spoilerific': 97,\n", + " 'neworleans': 67639,\n", + " 'metamorphose': 17,\n", + " 'brigand': 99,\n", + " 'cheating': 41603,\n", + " 'clued': 101,\n", + " 'dermatonecrotic': 102,\n", + " 'grady': 103,\n", + " 'mulligan': 104,\n", + " 'ol': 105,\n", + " 'incubation': 107,\n", + " 'plaintiffs': 110,\n", + " 'snden': 109,\n", + " 'fk': 111,\n", + " 'deply': 112,\n", + " 'franchot': 113,\n", + " 'henstridge': 19,\n", + " 
'cyhper': 114,\n", + " 'verbose': 26,\n", + " 'mazovia': 116,\n", + " 'elizabeth': 117,\n", + " 'palestine': 118,\n", + " 'robby': 119,\n", + " 'wongo': 120,\n", + " 'moshing': 121,\n", + " 'mstified': 12543,\n", + " 'eeeee': 122,\n", + " 'doltish': 123,\n", + " 'bree': 124,\n", + " 'postponed': 125,\n", + " 'debacles': 127,\n", + " 'amplify': 27,\n", + " 'kamm': 128,\n", + " 'phantom': 18893,\n", + " 'boylen': 136,\n", + " 'rolando': 131,\n", + " 'premises': 133,\n", + " 'bruck': 134,\n", + " 'loosely': 135,\n", + " 'wodehousian': 139,\n", + " 'onishi': 70389,\n", + " 'encapsuling': 140,\n", + " 'partly': 141,\n", + " 'stadling': 144,\n", + " 'calms': 143,\n", + " 'darkie': 148,\n", + " 'wheeling': 147,\n", + " 'ursla': 15875,\n", + " 'subsidized': 49420,\n", + " 'mckellar': 149,\n", + " 'ooookkkk': 151,\n", + " 'milky': 152,\n", + " 'unfolded': 153,\n", + " 'degrades': 154,\n", + " 'authenticating': 155,\n", + " 'writeup': 12548,\n", + " 'rotheroe': 156,\n", + " 'beart': 157,\n", + " 'intoxicants': 160,\n", + " 'grispin': 159,\n", + " 'cannes': 61718,\n", + " 'antithetical': 70398,\n", + " 'nnette': 161,\n", + " 'tsukamoto': 163,\n", + " 'antwones': 44205,\n", + " 'stows': 164,\n", + " 'suddenness': 165,\n", + " 'vol': 61720,\n", + " 'waqt': 166,\n", + " 'camazotz': 168,\n", + " 'paps': 55042,\n", + " 'shakher': 170,\n", + " 'terminate': 63868,\n", + " 'kotex': 56419,\n", + " 'delinquency': 171,\n", + " 'bromwell': 25214,\n", + " 'insecticide': 173,\n", + " 'charlton': 174,\n", + " 'nakada': 177,\n", + " 'titted': 24791,\n", + " 'urbane': 178,\n", + " 'depicted': 54491,\n", + " 'sadomasochistic': 179,\n", + " 'hyping': 181,\n", + " 'yr': 182,\n", + " 'hebert': 183,\n", + " 'waxwork': 12990,\n", + " 'deathrow': 185,\n", + " 'nourishes': 24792,\n", + " 'unmediated': 187,\n", + " 'tamper': 37143,\n", + " 'soad': 190,\n", + " 'alphabet': 189,\n", + " 'donen': 191,\n", + " 'lord': 192,\n", + " 'recess': 193,\n", + " 'watchably': 61023,\n", + " 'handsome': 194,\n", + " 
'vignettes': 196,\n", + " 'pairings': 198,\n", + " 'uselful': 199,\n", + " 'sanders': 200,\n", + " 'outbursts': 72891,\n", + " 'nots': 201,\n", + " 'hatsumomo': 202,\n", + " 'actioned': 18292,\n", + " 'krimi': 24797,\n", + " 'appleby': 203,\n", + " 'tampax': 204,\n", + " 'sprinkling': 205,\n", + " 'defacing': 206,\n", + " 'lofty': 207,\n", + " 'verger': 213,\n", + " 'tablespoons': 211,\n", + " 'bernhard': 212,\n", + " 'goosebump': 64565,\n", + " 'acumen': 214,\n", + " 'percentages': 215,\n", + " 'wendingo': 216,\n", + " 'resonating': 217,\n", + " 'vntoarea': 218,\n", + " 'redundancies': 219,\n", + " 'strictly': 57081,\n", + " 'pitied': 221,\n", + " 'belying': 222,\n", + " 'michelangelo': 53153,\n", + " 'gleefulness': 223,\n", + " 'environmentalist': 24803,\n", + " 'gitane': 226,\n", + " 'corrected': 66547,\n", + " 'journalist': 227,\n", + " 'focusing': 228,\n", + " 'plethora': 229,\n", + " 'his': 39,\n", + " 'citizen': 230,\n", + " 'south': 55579,\n", + " 'clunkers': 232,\n", + " 'pendulous': 55991,\n", + " 'mounds': 24805,\n", + " 'deplorable': 233,\n", + " 'forgive': 234,\n", + " 'proplems': 235,\n", + " 'bankers': 237,\n", + " 'aqua': 238,\n", + " 'donated': 239,\n", + " 'disbelieving': 240,\n", + " 'acomplication': 241,\n", + " 'contrasted': 243,\n", + " 'muzzle': 44,\n", + " 'amphibians': 72141,\n", + " 'springs': 246,\n", + " 'reformatted': 49443,\n", + " 'toolbox': 247,\n", + " 'contacting': 248,\n", + " 'washrooms': 250,\n", + " 'raving': 251,\n", + " 'dynamism': 252,\n", + " 'mae': 253,\n", + " 'disharmony': 255,\n", + " 'molls': 72979,\n", + " 'dewaere': 12569,\n", + " 'untutored': 256,\n", + " 'icarus': 257,\n", + " 'taint': 258,\n", + " 'kargil': 259,\n", + " 'captain': 260,\n", + " 'paucity': 261,\n", + " 'fits': 262,\n", + " 'tumbles': 263,\n", + " 'amer': 264,\n", + " 'bueller': 265,\n", + " 'cleansed': 267,\n", + " 'shara': 269,\n", + " 'humma': 270,\n", + " 'outa': 272,\n", + " 'piglets': 273,\n", + " 'gombell': 274,\n", + " 'supermen': 275,\n", + 
" 'superlow': 276,\n", + " 'kubanskie': 280,\n", + " 'goode': 278,\n", + " 'disorganised': 45570,\n", + " 'zenith': 281,\n", + " 'ananda': 282,\n", + " 'matlin': 284,\n", + " 'particolare': 50,\n", + " 'presumptuous': 286,\n", + " 'rerun': 287,\n", + " 'toyko': 288,\n", + " 'bilb': 291,\n", + " 'sundry': 290,\n", + " 'fugly': 292,\n", + " 'orchestrating': 293,\n", + " 'prosaically': 294,\n", + " 'moveis': 296,\n", + " 'conelly': 297,\n", + " 'estrange': 298,\n", + " 'elfriede': 49455,\n", + " 'masterful': 52,\n", + " 'seasonings': 300,\n", + " 'quincey': 303,\n", + " 'frowning': 49456,\n", + " 'painkillers': 53444,\n", + " 'high': 25515,\n", + " 'flesh': 304,\n", + " 'tootsie': 305,\n", + " 'ai': 306,\n", + " 'tenma': 307,\n", + " 'duguay': 71257,\n", + " 'appropriations': 308,\n", + " 'ides': 310,\n", + " 'rui': 61734,\n", + " 'surrogacy': 311,\n", + " 'pungent': 312,\n", + " 'damaso': 314,\n", + " 'authoritarian': 61736,\n", + " 'caribou': 315,\n", + " 'ro': 318,\n", + " 'supplying': 317,\n", + " 'yuy': 319,\n", + " 'debuted': 321,\n", + " 'mounts': 323,\n", + " 'interpolated': 324,\n", + " 'aetv': 325,\n", + " 'plummer': 326,\n", + " 'asunder': 331,\n", + " 'airfix': 333,\n", + " 'dubiel': 329,\n", + " 'clavichord': 330,\n", + " 'crafty': 50465,\n", + " 'sublety': 332,\n", + " 'stoltzfus': 334,\n", + " 'ruth': 335,\n", + " 'fluorescent': 336,\n", + " 'improves': 337,\n", + " 'russells': 339,\n", + " 'tick': 43838,\n", + " 'zsa': 341,\n", + " 'macs': 343,\n", + " 'jlb': 345,\n", + " 'locus': 348,\n", + " 'mislead': 349,\n", + " 'merly': 49461,\n", + " 'corey': 350,\n", + " 'blundered': 351,\n", + " 'humourless': 3568,\n", + " 'disorganized': 353,\n", + " 'discuss': 354,\n", + " 'sharifi': 45391,\n", + " 'tieing': 356,\n", + " 'kats': 34784,\n", + " 'bbc': 360,\n", + " 'pranked': 362,\n", + " 'superman': 363,\n", + " 'holroyd': 9223,\n", + " 'aggravated': 364,\n", + " 'rifleman': 365,\n", + " 'yvone': 366,\n", + " 'vaugier': 24820,\n", + " 'radiant': 367,\n", + " 
'galico': 368,\n", + " 'debris': 369,\n", + " 'btw': 371,\n", + " 'denote': 24822,\n", + " 'havnt': 372,\n", + " 'francen': 373,\n", + " 'chattered': 374,\n", + " 'scathed': 375,\n", + " 'pic': 376,\n", + " 'ceremonies': 377,\n", + " 'everyplace': 65309,\n", + " 'betsy': 379,\n", + " 'finster': 37176,\n", + " 'meercat': 381,\n", + " 'noirs': 382,\n", + " 'grunts': 383,\n", + " 'tribulations': 385,\n", + " 'apparatus': 47673,\n", + " 'martnez': 25825,\n", + " 'telethons': 24825,\n", + " 'talladega': 387,\n", + " 'alloimono': 390,\n", + " 'situations': 64,\n", + " 'scrutinising': 391,\n", + " 'geta': 392,\n", + " 'beltrami': 393,\n", + " 'pvc': 394,\n", + " 'horse': 395,\n", + " 'tiburon': 396,\n", + " 'huitime': 397,\n", + " 'ripple': 398,\n", + " 'exceed': 61748,\n", + " 'loitering': 399,\n", + " 'forensics': 400,\n", + " 'nearly': 401,\n", + " 'ellington': 403,\n", + " 'uzi': 404,\n", + " 'rung': 408,\n", + " 'pillaged': 24829,\n", + " 'gao': 409,\n", + " 'licitates': 410,\n", + " 'protocol': 411,\n", + " 'smirker': 412,\n", + " 'torin': 413,\n", + " 'vizier': 31853,\n", + " 'newlywed': 414,\n", + " 'dismay': 416,\n", + " 'moonwalks': 418,\n", + " 'skyler': 417,\n", + " 'invested': 18455,\n", + " 'grifter': 421,\n", + " 'undersold': 422,\n", + " 'chearator': 423,\n", + " 'marino': 424,\n", + " 'scala': 425,\n", + " 'conditioner': 426,\n", + " 'lamarre': 428,\n", + " 'figueroa': 429,\n", + " 'mcinnerny': 61753,\n", + " 'allllllll': 431,\n", + " 'slide': 432,\n", + " 'lateness': 433,\n", + " 'selbst': 434,\n", + " 'dramatizing': 436,\n", + " 'doable': 438,\n", + " 'hollywoodize': 27207,\n", + " 'alexanderplatz': 440,\n", + " 'wholesome': 45745,\n", + " 'pandemonium': 441,\n", + " 'earth': 443,\n", + " 'mounties': 444,\n", + " 'seeker': 445,\n", + " 'cheat': 446,\n", + " 'outbreaks': 447,\n", + " 'savagely': 61759,\n", + " 'snowstorm': 448,\n", + " 'baur': 449,\n", + " 'schedules': 450,\n", + " 'bathetic': 451,\n", + " 'johnathon': 453,\n", + " 'origonal': 57843,\n", 
+ " 'rosanne': 454,\n", + " 'cauldrons': 456,\n", + " 'forrest': 457,\n", + " 'poky': 458,\n", + " 'aristos': 54856,\n", + " 'womanness': 460,\n", + " 'spender': 461,\n", + " 'pagliai': 37108,\n", + " 'rational': 463,\n", + " 'terrell': 464,\n", + " 'affronts': 472,\n", + " 'concise': 49476,\n", + " 'mathew': 468,\n", + " 'narnia': 469,\n", + " 'naseeruddin': 470,\n", + " 'bucks': 471,\n", + " 'proceeds': 69809,\n", + " 'topple': 473,\n", + " 'degree': 474,\n", + " 'passionately': 476,\n", + " 'defeats': 477,\n", + " 'gras': 49477,\n", + " 'sources': 479,\n", + " 'pflug': 49976,\n", + " 'botticelli': 480,\n", + " 'fwd': 486,\n", + " 'waiving': 483,\n", + " 'gunnar': 484,\n", + " 'stiffler': 485,\n", + " 'unwise': 49480,\n", + " 'kawajiri': 487,\n", + " 'sistahs': 489,\n", + " 'swallowed': 30511,\n", + " 'soulhunter': 490,\n", + " 'belies': 491,\n", + " 'wrathful': 492,\n", + " 'badmouth': 16696,\n", + " 'floradora': 61766,\n", + " 'unforgivably': 497,\n", + " 'weirdy': 496,\n", + " 'violation': 63309,\n", + " 'chepart': 498,\n", + " 'departmentthe': 500,\n", + " 'posehn': 49483,\n", + " 'peyote': 37188,\n", + " 'psychiatrically': 24846,\n", + " 'marionettes': 503,\n", + " 'blatty': 502,\n", + " 'atop': 504,\n", + " 'debases': 25135,\n", + " 'henze': 24845,\n", + " 'unrooted': 510,\n", + " 'cloudscape': 508,\n", + " 'resignedly': 509,\n", + " 'begin': 49917,\n", + " 'hitlerian': 512,\n", + " 'reedus': 517,\n", + " 'crewed': 514,\n", + " 'bedeviled': 515,\n", + " 'unfurnished': 516,\n", + " 'herrmann': 12602,\n", + " 'circumstances': 518,\n", + " 'grasped': 519,\n", + " 'fn': 521,\n", + " 'beefed': 22200,\n", + " 'scwatch': 64018,\n", + " 'dishwashers': 522,\n", + " 'roadie': 523,\n", + " 'ruthlessness': 524,\n", + " 'migrant': 12605,\n", + " 'refrains': 525,\n", + " 'preponderance': 44377,\n", + " 'lampooning': 526,\n", + " 'richart': 528,\n", + " 'gwenneth': 530,\n", + " 'enmity': 531,\n", + " 'vortex': 61772,\n", + " 'assess': 532,\n", + " 'manufacturer': 533,\n", 
+ " 'bullosa': 534,\n", + " 'citizenship': 61774,\n", + " 'chekov': 537,\n", + " 'hogan': 536,\n", + " 'blithe': 538,\n", + " 'aredavid': 542,\n", + " 'drillings': 540,\n", + " 'revolvers': 541,\n", + " 'boyfriendhe': 545,\n", + " 'achcha': 544,\n", + " 'wallow': 546,\n", + " 'toga': 547,\n", + " 'bosnians': 551,\n", + " 'going': 550,\n", + " 'willy': 552,\n", + " 'fim': 554,\n", + " 'forbidding': 555,\n", + " 'delete': 56779,\n", + " 'rationalised': 557,\n", + " 'shimomo': 558,\n", + " 'opposition': 559,\n", + " 'landis': 560,\n", + " 'minded': 561,\n", + " 'arghhhhh': 564,\n", + " 'trialat': 566,\n", + " 'protected': 567,\n", + " 'negras': 568,\n", + " 'tracker': 571,\n", + " 'muti': 570,\n", + " 'dinky': 49489,\n", + " 'shawl': 572,\n", + " 'differentiates': 573,\n", + " 'dipaolo': 61779,\n", + " 'sweetheart': 574,\n", + " 'manmohan': 576,\n", + " 'enamored': 66265,\n", + " 'trevethyn': 577,\n", + " 'brain': 578,\n", + " 'incomprehensibly': 579,\n", + " 'pasadena': 581,\n", + " 'bruton': 59142,\n", + " 'shtick': 582,\n", + " 'ute': 583,\n", + " 'viggo': 584,\n", + " 'relevent': 589,\n", + " 'cites': 587,\n", + " 'greenaways': 61781,\n", + " 'minidress': 590,\n", + " 'philosopher': 591,\n", + " 'mahattan': 593,\n", + " 'moden': 594,\n", + " 'compiling': 595,\n", + " 'unimaginative': 598,\n", + " 'rogues': 597,\n", + " 'subpaar': 599,\n", + " 'darkly': 601,\n", + " 'saturate': 602,\n", + " 'fledgling': 603,\n", + " 'breaths': 604,\n", + " 'sceam': 37206,\n", + " 'empathized': 58870,\n", + " 'aszombi': 606,\n", + " 'incalculable': 608,\n", + " 'formations': 28596,\n", + " 'hampden': 619,\n", + " 'rawail': 612,\n", + " 'forbid': 613,\n", + " 'holiness': 617,\n", + " 'unessential': 618,\n", + " 'reputedly': 616,\n", + " 'wage': 63181,\n", + " 'kewpie': 24860,\n", + " 'asylum': 620,\n", + " 'bolye': 621,\n", + " 'celticism': 63189,\n", + " 'strangers': 622,\n", + " 'rantzen': 623,\n", + " 'farrellys': 624,\n", + " 'marathon': 93,\n", + " 'cantinflas': 626,\n", + " 
'disproportionately': 12617,\n", + " 'bared': 67212,\n", + " 'enshrined': 627,\n", + " 'expetations': 629,\n", + " 'replaying': 630,\n", + " 'topless': 636,\n", + " 'bukater': 632,\n", + " 'overpaid': 633,\n", + " 'exhude': 634,\n", + " 'nitwits': 638,\n", + " 'tsst': 51554,\n", + " 'sufferings': 637,\n", + " 'ci': 24693,\n", + " 'eponymously': 96,\n", + " 'ferdy': 644,\n", + " 'danira': 641,\n", + " 'unrelenting': 642,\n", + " 'disabling': 643,\n", + " 'gerard': 645,\n", + " 'drewitt': 646,\n", + " 'lamping': 650,\n", + " 'demy': 652,\n", + " 'wicklow': 37214,\n", + " 'relinquish': 651,\n", + " 'feminized': 64196,\n", + " 'drink': 653,\n", + " 'chamberlin': 654,\n", + " 'floodwaters': 657,\n", + " 'searing': 658,\n", + " 'isral': 659,\n", + " 'ling': 660,\n", + " 'grossness': 661,\n", + " 'sassier': 24865,\n", + " 'pickier': 662,\n", + " 'pax': 663,\n", + " 'fleashens': 98,\n", + " 'wierd': 664,\n", + " 'tereasa': 665,\n", + " 'smog': 666,\n", + " 'girotti': 667,\n", + " 'zooey': 64814,\n", + " 'spat': 668,\n", + " 'sera': 669,\n", + " 'misbehaving': 671,\n", + " 'scouts': 672,\n", + " 'refreshments': 673,\n", + " 'itll': 39668,\n", + " 'toyomichi': 676,\n", + " 'politeness': 100,\n", + " 'bits': 677,\n", + " 'psychotics': 678,\n", + " 'optimistic': 61796,\n", + " 'barzell': 679,\n", + " 'colt': 680,\n", + " 'anita': 49501,\n", + " 'shivering': 681,\n", + " 'utah': 59297,\n", + " 'scrivener': 686,\n", + " 'predicable': 687,\n", + " 'dryer': 684,\n", + " 'reissues': 685,\n", + " 'sexier': 26115,\n", + " 'spellbind': 691,\n", + " 'marmalade': 689,\n", + " 'seems': 690,\n", + " 'wyke': 37223,\n", + " 'innovator': 693,\n", + " 'inthused': 695,\n", + " 'scatman': 6309,\n", + " 'contestants': 696,\n", + " 'bertolucci': 106,\n", + " 'serviced': 699,\n", + " 'nozires': 700,\n", + " 'ins': 701,\n", + " 'mutilating': 702,\n", + " 'dupes': 703,\n", + " 'launius': 704,\n", + " 'widescreen': 705,\n", + " 'joo': 706,\n", + " 'discretionary': 707,\n", + " 'enlivens': 708,\n", + 
" 'manos': 55596,\n", + " 'bushes': 709,\n", + " 'header': 711,\n", + " 'activist': 712,\n", + " 'gethsemane': 713,\n", + " 'phoenixs': 714,\n", + " 'wreathed': 715,\n", + " 'oldboy': 108,\n", + " 'electrifyingly': 717,\n", + " 'inseparability': 24874,\n", + " 'ghidora': 719,\n", + " 'binder': 720,\n", + " 'tibet': 51530,\n", + " 'doddsville': 723,\n", + " 'sugar': 722,\n", + " 'porkys': 724,\n", + " 'hopefully': 37226,\n", + " 'scattershot': 725,\n", + " 'refunded': 726,\n", + " 'rudely': 727,\n", + " 'enacts': 67435,\n", + " 'insteadit': 728,\n", + " 'nightwatch': 61803,\n", + " 'eurotrash': 730,\n", + " 'radioraptus': 731,\n", + " 'unreservedly': 73710,\n", + " 'vall': 49508,\n", + " 'boogeman': 733,\n", + " 'flunked': 24880,\n", + " 'weighs': 734,\n", + " 'glorfindel': 738,\n", + " 'hypothermia': 737,\n", + " 'misled': 64919,\n", + " 'toiletries': 71501,\n", + " 'birthdays': 739,\n", + " 'attentive': 740,\n", + " 'mallepa': 741,\n", + " 'manoy': 743,\n", + " 'bombshells': 744,\n", + " 'glorifying': 115,\n", + " 'southron': 747,\n", + " 'destruction': 748,\n", + " 'manhole': 750,\n", + " 'elainor': 751,\n", + " 'bounder': 13003,\n", + " 'bowersock': 752,\n", + " 'lowly': 753,\n", + " 'wfst': 754,\n", + " 'limousines': 755,\n", + " 'skolimowski': 756,\n", + " 'saban': 757,\n", + " 'malaysia': 759,\n", + " 'cyd': 761,\n", + " 'bonecrushing': 763,\n", + " 'merest': 765,\n", + " 'janina': 766,\n", + " 'chemotrodes': 767,\n", + " 'trials': 768,\n", + " 'whilhelm': 770,\n", + " 'asthmatic': 771,\n", + " 'missteps': 773,\n", + " 'melyvn': 24885,\n", + " 'embittered': 774,\n", + " 'profit': 37234,\n", + " 'seeming': 776,\n", + " 'miscalculate': 777,\n", + " 'recommeded': 778,\n", + " 'mankin': 37235,\n", + " 'schoolwork': 779,\n", + " 'coy': 780,\n", + " 'mcconaughey': 781,\n", + " 'waver': 783,\n", + " 'unwatchably': 786,\n", + " 'saggy': 787,\n", + " 'breakup': 790,\n", + " 'pufnstuf': 37237,\n", + " 'superstars': 792,\n", + " 'replay': 793,\n", + " 'aggravates': 
794,\n", + " 'urging': 796,\n", + " 'snidely': 797,\n", + " 'aleksandar': 798,\n", + " 'hildy': 799,\n", + " 'kazuhiro': 800,\n", + " 'slayer': 801,\n", + " 'tangy': 802,\n", + " 'horne': 804,\n", + " 'masayuki': 805,\n", + " 'molden': 806,\n", + " 'unravel': 807,\n", + " 'goodtime': 808,\n", + " 'rowboat': 811,\n", + " 'dekhiye': 815,\n", + " 'datedness': 813,\n", + " 'astrotheology': 814,\n", + " 'suriani': 59610,\n", + " 'hostilities': 819,\n", + " 'wipes': 818,\n", + " 'sentimentalising': 820,\n", + " 'documentary': 821,\n", + " 'virtue': 823,\n", + " 'unreasonably': 824,\n", + " 'cei': 826,\n", + " 'hobbled': 37240,\n", + " 'unglamorised': 827,\n", + " 'balky': 828,\n", + " 'complementary': 829,\n", + " 'paychecks': 830,\n", + " 'tughlaq': 45551,\n", + " 'functionality': 836,\n", + " 'ily': 833,\n", + " 'prc': 834,\n", + " 'ennobling': 835,\n", + " 'dissociated': 837,\n", + " 'elk': 838,\n", + " 'throbbing': 839,\n", + " 'tempe': 840,\n", + " 'linoleum': 841,\n", + " 'bottacin': 843,\n", + " 'hipper': 844,\n", + " 'barging': 846,\n", + " 'untie': 847,\n", + " 'sacchetti': 848,\n", + " 'gnat': 849,\n", + " 'roedel': 850,\n", + " 'performs': 852,\n", + " 'nanavati': 856,\n", + " 'migrs': 854,\n", + " 'teachs': 855,\n", + " 'gunslinger': 126,\n", + " 'fresco': 857,\n", + " 'davison': 858,\n", + " 'jet': 59446,\n", + " 'burglar': 860,\n", + " 'jerker': 69267,\n", + " 'masue': 861,\n", + " 'dickory': 862,\n", + " 'muggy': 46634,\n", + " 'grills': 863,\n", + " 'figment': 28693,\n", + " 'monogamistic': 49527,\n", + " 'appelagate': 864,\n", + " 'linkage': 865,\n", + " 'loesser': 867,\n", + " 'patties': 868,\n", + " 'prudent': 869,\n", + " 'mallorquins': 870,\n", + " 'nativetex': 871,\n", + " 'suprise': 872,\n", + " 'quill': 874,\n", + " 'angsty': 71451,\n", + " 'speeded': 875,\n", + " 'farscape': 876,\n", + " 'herman': 129,\n", + " 'saddening': 877,\n", + " 'centuries': 878,\n", + " 'mos': 879,\n", + " 'neccessarily': 881,\n", + " 'tankers': 883,\n", + " 'latte': 
884,\n", + " 'faracy': 886,\n", + " 'stilts': 24897,\n", + " 'synthetically': 887,\n", + " 'thoughtless': 888,\n", + " 'authoring': 62813,\n", + " 'rake': 889,\n", + " 'ropes': 890,\n", + " 'whitewashed': 892,\n", + " 'donal': 893,\n", + " 'arching': 4910,\n", + " 'cockamamie': 899,\n", + " 'lifeless': 895,\n", + " 'perfidy': 896,\n", + " 'teresa': 897,\n", + " 'bulldog': 898,\n", + " 'vingh': 73726,\n", + " 'evacuees': 65858,\n", + " 'rasberries': 900,\n", + " 'chiseling': 903,\n", + " 'clampets': 905,\n", + " 'grecianized': 138,\n", + " 'smaller': 904,\n", + " 'kluznick': 62184,\n", + " 'alerts': 906,\n", + " 'aaaahhhhhhh': 909,\n", + " 'wellingtonian': 908,\n", + " 'dither': 910,\n", + " 'incertitude': 911,\n", + " 'florentine': 912,\n", + " 'imperioli': 913,\n", + " 'licking': 914,\n", + " 'disparagement': 915,\n", + " 'artfully': 916,\n", + " 'feds': 917,\n", + " 'fumiya': 918,\n", + " 'jbl': 52774,\n", + " 'tearfully': 919,\n", + " 'welfare': 24905,\n", + " 'idyllically': 49534,\n", + " 'isha': 43702,\n", + " 'lanchester': 920,\n", + " 'undertaken': 921,\n", + " 'longlost': 922,\n", + " 'netted': 923,\n", + " 'carrell': 924,\n", + " 'uncompelling': 925,\n", + " 'stems': 37258,\n", + " 'reliefs': 926,\n", + " 'leona': 927,\n", + " 'autorenfilm': 928,\n", + " 'unfriendly': 929,\n", + " 'typewriter': 930,\n", + " 'shifted': 931,\n", + " 'bertrand': 932,\n", + " 'blesses': 933,\n", + " 'leukemia': 12666,\n", + " 'posative': 142,\n", + " 'tricking': 934,\n", + " 'zanes': 936,\n", + " 'dashboard': 12667,\n", + " 'unknowingly': 937,\n", + " 'flatmates': 51897,\n", + " 'unnerve': 938,\n", + " 'caning': 939,\n", + " 'shortland': 146,\n", + " 'recluse': 941,\n", + " 'dcreasy': 942,\n", + " 'scratchiness': 24911,\n", + " 'pms': 30930,\n", + " 'chipmunk': 943,\n", + " 'tkachenko': 49537,\n", + " 'dipper': 944,\n", + " 'europeans': 61601,\n", + " 'berserkers': 948,\n", + " 'shys': 947,\n", + " 'monte': 68505,\n", + " 'eve': 949,\n", + " 'luxury': 61828,\n", + " 
'conflagration': 950,\n", + " 'water': 46389,\n", + " 'irks': 951,\n", + " 'positronic': 954,\n", + " 'cushy': 150,\n", + " 'swiftness': 957,\n", + " 'underimpressed': 964,\n", + " 'imprint': 959,\n", + " 'sundance': 961,\n", + " 'aida': 31951,\n", + " 'thematically': 963,\n", + " 'uno': 965,\n", + " 'expressly': 966,\n", + " 'russkies': 967,\n", + " 'discos': 968,\n", + " 'shaping': 969,\n", + " 'verson': 970,\n", + " 'blushed': 61831,\n", + " 'prototype': 971,\n", + " 'lifewell': 976,\n", + " 'trafficker': 973,\n", + " 'crucifixions': 62188,\n", + " 'unrealistically': 975,\n", + " 'rivas': 977,\n", + " 'consequent': 978,\n", + " 'katsu': 979,\n", + " 'titantic': 980,\n", + " 'jalees': 981,\n", + " 'ranee': 982,\n", + " 'gambles': 984,\n", + " 'dispenses': 985,\n", + " 'disfigurement': 986,\n", + " 'bright': 987,\n", + " 'cristian': 988,\n", + " 'subculture': 37268,\n", + " 'capta': 991,\n", + " 'jewel': 992,\n", + " 'erect': 993,\n", + " 'avoide': 996,\n", + " 'inconnu': 997,\n", + " 'headquarters': 998,\n", + " 'babbling': 1000,\n", + " 'pac': 1001,\n", + " 'performace': 1003,\n", + " 'dorrit': 1004,\n", + " 'runners': 1005,\n", + " 'sentimentality': 1006,\n", + " 'marred': 1007,\n", + " 'commemorative': 1008,\n", + " 'helpers': 1012,\n", + " 'chiles': 1011,\n", + " 'snowy': 1013,\n", + " 'cheddar': 1014,\n", + " 'neath': 158,\n", + " 'outshine': 1016,\n", + " 'nadu': 1019,\n", + " 'wellbeing': 1020,\n", + " 'envisioned': 43779,\n", + " 'fanaticism': 1021,\n", + " 'morrisette': 12687,\n", + " 'sesame': 1024,\n", + " 'gran': 1023,\n", + " 'marlina': 1025,\n", + " 'artificiality': 1030,\n", + " 'coinsidence': 1027,\n", + " 'founders': 1028,\n", + " 'dismissably': 1029,\n", + " 'dracht': 66299,\n", + " 'scavengers': 1031,\n", + " 'neese': 12685,\n", + " 'pangborn': 1034,\n", + " 'elmore': 1039,\n", + " 'bristol': 71162,\n", + " 'lillies': 1035,\n", + " 'parkers': 1036,\n", + " 'skipped': 1038,\n", + " 'clipboard': 1042,\n", + " 'jucier': 1041,\n", + " 'haifa': 
1043,\n", + " ...}" + ] + }, + "execution_count": 48, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "word2index = {}\n", + "\n", + "for i,word in enumerate(vocab):\n", + " word2index[word] = i\n", + "word2index" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_target_for_label(label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 54, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "1" + ] + }, + "execution_count": 52, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 55, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'NEGATIVE'" + 
] + }, + "execution_count": 55, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[1]" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 53, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[1])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 3: Building a Neural Network" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "- Start with your neural network from the last chapter\n", + "- 3 layer neural network\n", + "- no non-linearity in hidden layer\n", + "- use our functions to create the training data\n", + "- create a \"pre_process_data\" function to create vocabulary for our training data generating functions\n", + "- modify \"train\" to train over the entire corpus" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Where to Get Help if You Need it\n", + "- Re-watch previous week's Udacity Lectures\n", + "- Chapters 3-5 - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) - (40% Off: **traskud17**)" + ] + }, + { + "cell_type": "code", + "execution_count": 86, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " # set our random number generator \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews, labels)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self, reviews, labels):\n", + " \n", + " review_vocab = 
set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " \n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " self.layer_0[0][self.word2index[word]] += 1\n", + " \n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def train(self, training_reviews, 
training_labels):\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + " self.update_input_layer(review)\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward pass ###\n", + "\n", + " # TODO: Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error is the difference between desired target and actual output.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # TODO: Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # TODO: Update the weights\n", + " self.weights_1_2 -= layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " self.weights_0_1 -= self.layer_0.T.dot(layer_1_delta) * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / 
float(i+1))[:4] + \"%\")\n", + " if(i % 2500 == 0):\n", + " print(\"\")\n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \"% #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + " self.update_input_layer(review.lower())\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] > 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 87, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 61, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):587.5% #Correct:500 #Tested:1000 Testing Accuracy:50.0%" + ] + } + ], + "source": [ + "# evaluate our model before training (just to show how horrible it is)\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "code", + "execution_count": 62, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% 
Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):89.58 #Correct:1250 #Trained:2501 Training Accuracy:49.9%\n", + "Progress:20.8% Speed(reviews/sec):95.03 #Correct:2500 #Trained:5001 Training Accuracy:49.9%\n", + "Progress:27.4% Speed(reviews/sec):95.46 #Correct:3295 #Trained:6592 Training Accuracy:49.9%" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output 
weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 63, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.01)" + ] + }, + { + "cell_type": "code", + "execution_count": 64, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):96.39 #Correct:1247 #Trained:2501 Training Accuracy:49.8%\n", + "Progress:20.8% Speed(reviews/sec):99.31 #Correct:2497 #Trained:5001 Training Accuracy:49.9%\n", + "Progress:22.8% Speed(reviews/sec):99.02 #Correct:2735 #Trained:5476 Training Accuracy:49.9%" + ] + }, + { + "ename": 
"KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m \u001b[0;34m-=\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 65, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.001)" + ] + }, + { + "cell_type": "code", + "execution_count": 66, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):98.77 #Correct:1267 #Trained:2501 Training Accuracy:50.6%\n", + "Progress:20.8% Speed(reviews/sec):98.79 #Correct:2640 #Trained:5001 Training Accuracy:52.7%\n", + "Progress:31.2% Speed(reviews/sec):98.58 #Correct:4109 #Trained:7501 Training Accuracy:54.7%\n", + "Progress:41.6% Speed(reviews/sec):93.78 #Correct:5638 #Trained:10001 Training Accuracy:56.3%\n", + "Progress:52.0% Speed(reviews/sec):91.76 #Correct:7246 #Trained:12501 Training Accuracy:57.9%\n", + "Progress:62.5% 
Speed(reviews/sec):92.42 #Correct:8841 #Trained:15001 Training Accuracy:58.9%\n", + "Progress:69.4% Speed(reviews/sec):92.58 #Correct:9934 #Trained:16668 Training Accuracy:59.5%" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m 
\u001b[0;34m-=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 4).ipynb b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 4).ipynb new file mode 100644 index 0000000..e2bc1a6 --- /dev/null +++ b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 4).ipynb @@ -0,0 +1,5553 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sentiment 
Classification & How To \"Frame Problems\" for a Neural Network\n", + "\n", + "by Andrew Trask\n", + "\n", + "- **Twitter**: @iamtrask\n", + "- **Blog**: http://iamtrask.github.io" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### What You Should Already Know\n", + "\n", + "- neural networks, forward and back-propagation\n", + "- stochastic gradient descent\n", + "- mean squared error\n", + "- and train/test splits\n", + "\n", + "### Where to Get Help if You Need it\n", + "- Re-watch previous Udacity Lectures\n", + "- Leverage the recommended Course Reading Material - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) (40% Off: **traskud17**)\n", + "- Shoot me a tweet @iamtrask\n", + "\n", + "\n", + "### Tutorial Outline:\n", + "\n", + "- Intro: The Importance of \"Framing a Problem\"\n", + "\n", + "\n", + "- Curate a Dataset\n", + "- Developing a \"Predictive Theory\"\n", + "- **PROJECT 1**: Quick Theory Validation\n", + "\n", + "\n", + "- Transforming Text to Numbers\n", + "- **PROJECT 2**: Creating the Input/Output Data\n", + "\n", + "\n", + "- Putting it all together in a Neural Network\n", + "- **PROJECT 3**: Building our Neural Network\n", + "\n", + "\n", + "- Understanding Neural Noise\n", + "- **PROJECT 4**: Making Learning Faster by Reducing Noise\n", + "\n", + "\n", + "- Analyzing Inefficiencies in our Network\n", + "- **PROJECT 5**: Making our Network Train and Run Faster\n", + "\n", + "\n", + "- Further Noise Reduction\n", + "- **PROJECT 6**: Reducing Noise by Strategically Reducing the Vocabulary\n", + "\n", + "\n", + "- Analysis: What's going on in the weights?" 
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbpresent": { + "id": "56bb3cba-260c-4ebe-9ed6-b995b4c72aa3" + } + }, + "source": [ + "# Lesson: Curate a Dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "eba2b193-0419-431e-8db9-60f34dd3fe83" + } + }, + "outputs": [], + "source": [ + "def pretty_print_review_and_label(i):\n", + " print(labels[i] + \"\\t:\\t\" + reviews[i][:80] + \"...\")\n", + "\n", + "g = open('reviews.txt','r') # What we know!\n", + "reviews = list(map(lambda x:x[:-1],g.readlines()))\n", + "g.close()\n", + "\n", + "g = open('labels.txt','r') # What we WANT to know!\n", + "labels = list(map(lambda x:x[:-1].upper(),g.readlines()))\n", + "g.close()" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "25000" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "len(reviews)" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "bb95574b-21a0-4213-ae50-34363cf4f87f" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy . it ran at the same time as some other programs about school life such as teachers . my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers . the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students . when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled . . . . . . . . . at . . . . . . . . . . high . a classic line inspector i m here to sack one of your teachers . student welcome to bromwell high . 
i expect that many adults of my age think that bromwell high is far fetched . what a pity that it isn t '" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "reviews[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e0408810-c424-4ed4-afb9-1735e9ddbd0a" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Lesson: Develop a Predictive Theory" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e67a709f-234f-4493-bae6-4fb192141ee0" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "labels.txt \t : \t reviews.txt\n", + "\n", + "NEGATIVE\t:\tthis movie is terrible but it has some good effects . ...\n", + "POSITIVE\t:\tadrian pasdar is excellent is this film . he makes a fascinating woman . ...\n", + "NEGATIVE\t:\tcomment this movie is impossible . is terrible very improbable bad interpretat...\n", + "POSITIVE\t:\texcellent episode movie ala pulp fiction . days suicides . it doesnt get more...\n", + "NEGATIVE\t:\tif you haven t seen this it s terrible . it is pure trash . 
i saw this about ...\n", + "POSITIVE\t:\tthis schiffer guy is a real genius the movie is of excellent quality and both e...\n" + ] + } + ], + "source": [ + "print(\"labels.txt \\t : \\t reviews.txt\\n\")\n", + "pretty_print_review_and_label(2137)\n", + "pretty_print_review_and_label(12816)\n", + "pretty_print_review_and_label(6267)\n", + "pretty_print_review_and_label(21934)\n", + "pretty_print_review_and_label(5297)\n", + "pretty_print_review_and_label(4998)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 1: Quick Theory Validation" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from collections import Counter\n", + "import numpy as np" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "positive_counts = Counter()\n", + "negative_counts = Counter()\n", + "total_counts = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for i in range(len(reviews)):\n", + " if(labels[i] == 'POSITIVE'):\n", + " for word in reviews[i].split(\" \"):\n", + " positive_counts[word] += 1\n", + " total_counts[word] += 1\n", + " else:\n", + " for word in reviews[i].split(\" \"):\n", + " negative_counts[word] += 1\n", + " total_counts[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('', 550468),\n", + " ('the', 173324),\n", + " ('.', 159654),\n", + " ('and', 89722),\n", + " ('a', 83688),\n", + " ('of', 76855),\n", + " ('to', 66746),\n", + " ('is', 57245),\n", + " ('in', 50215),\n", + " ('br', 49235),\n", + " ('it', 48025),\n", + " ('i', 40743),\n", + " ('that', 35630),\n", + " ('this', 35080),\n", + " ('s', 33815),\n", + " ('as', 26308),\n", + " ('with', 23247),\n", + " 
('for', 22416),\n", + " ('was', 21917),\n", + " ('film', 20937),\n", + " ('but', 20822),\n", + " ('movie', 19074),\n", + " ('his', 17227),\n", + " ('on', 17008),\n", + " ('you', 16681),\n", + " ('he', 16282),\n", + " ('are', 14807),\n", + " ('not', 14272),\n", + " ('t', 13720),\n", + " ('one', 13655),\n", + " ('have', 12587),\n", + " ('be', 12416),\n", + " ('by', 11997),\n", + " ('all', 11942),\n", + " ('who', 11464),\n", + " ('an', 11294),\n", + " ('at', 11234),\n", + " ('from', 10767),\n", + " ('her', 10474),\n", + " ('they', 9895),\n", + " ('has', 9186),\n", + " ('so', 9154),\n", + " ('like', 9038),\n", + " ('about', 8313),\n", + " ('very', 8305),\n", + " ('out', 8134),\n", + " ('there', 8057),\n", + " ('she', 7779),\n", + " ('what', 7737),\n", + " ('or', 7732),\n", + " ('good', 7720),\n", + " ('more', 7521),\n", + " ('when', 7456),\n", + " ('some', 7441),\n", + " ('if', 7285),\n", + " ('just', 7152),\n", + " ('can', 7001),\n", + " ('story', 6780),\n", + " ('time', 6515),\n", + " ('my', 6488),\n", + " ('great', 6419),\n", + " ('well', 6405),\n", + " ('up', 6321),\n", + " ('which', 6267),\n", + " ('their', 6107),\n", + " ('see', 6026),\n", + " ('also', 5550),\n", + " ('we', 5531),\n", + " ('really', 5476),\n", + " ('would', 5400),\n", + " ('will', 5218),\n", + " ('me', 5167),\n", + " ('had', 5148),\n", + " ('only', 5137),\n", + " ('him', 5018),\n", + " ('even', 4964),\n", + " ('most', 4864),\n", + " ('other', 4858),\n", + " ('were', 4782),\n", + " ('first', 4755),\n", + " ('than', 4736),\n", + " ('much', 4685),\n", + " ('its', 4622),\n", + " ('no', 4574),\n", + " ('into', 4544),\n", + " ('people', 4479),\n", + " ('best', 4319),\n", + " ('love', 4301),\n", + " ('get', 4272),\n", + " ('how', 4213),\n", + " ('life', 4199),\n", + " ('been', 4189),\n", + " ('because', 4079),\n", + " ('way', 4036),\n", + " ('do', 3941),\n", + " ('made', 3823),\n", + " ('films', 3813),\n", + " ('them', 3805),\n", + " ('after', 3800),\n", + " ('many', 3766),\n", + " ('two', 3733),\n", + 
" ('too', 3659),\n", + " ('think', 3655),\n", + " ('movies', 3586),\n", + " ('characters', 3560),\n", + " ('character', 3514),\n", + " ('don', 3468),\n", + " ('man', 3460),\n", + " ('show', 3432),\n", + " ('watch', 3424),\n", + " ('seen', 3414),\n", + " ('then', 3358),\n", + " ('little', 3341),\n", + " ('still', 3340),\n", + " ('make', 3303),\n", + " ('could', 3237),\n", + " ('never', 3226),\n", + " ('being', 3217),\n", + " ('where', 3173),\n", + " ('does', 3069),\n", + " ('over', 3017),\n", + " ('any', 3002),\n", + " ('while', 2899),\n", + " ('know', 2833),\n", + " ('did', 2790),\n", + " ('years', 2758),\n", + " ('here', 2740),\n", + " ('ever', 2734),\n", + " ('end', 2696),\n", + " ('these', 2694),\n", + " ('such', 2590),\n", + " ('real', 2568),\n", + " ('scene', 2567),\n", + " ('back', 2547),\n", + " ('those', 2485),\n", + " ('though', 2475),\n", + " ('off', 2463),\n", + " ('new', 2458),\n", + " ('your', 2453),\n", + " ('go', 2440),\n", + " ('acting', 2437),\n", + " ('plot', 2432),\n", + " ('world', 2429),\n", + " ('scenes', 2427),\n", + " ('say', 2414),\n", + " ('through', 2409),\n", + " ('makes', 2390),\n", + " ('better', 2381),\n", + " ('now', 2368),\n", + " ('work', 2346),\n", + " ('young', 2343),\n", + " ('old', 2311),\n", + " ('ve', 2307),\n", + " ('find', 2272),\n", + " ('both', 2248),\n", + " ('before', 2177),\n", + " ('us', 2162),\n", + " ('again', 2158),\n", + " ('series', 2153),\n", + " ('quite', 2143),\n", + " ('something', 2135),\n", + " ('cast', 2133),\n", + " ('should', 2121),\n", + " ('part', 2098),\n", + " ('always', 2088),\n", + " ('lot', 2087),\n", + " ('another', 2075),\n", + " ('actors', 2047),\n", + " ('director', 2040),\n", + " ('family', 2032),\n", + " ('between', 2016),\n", + " ('own', 2016),\n", + " ('m', 1998),\n", + " ('may', 1997),\n", + " ('same', 1972),\n", + " ('role', 1967),\n", + " ('watching', 1966),\n", + " ('every', 1954),\n", + " ('funny', 1953),\n", + " ('doesn', 1935),\n", + " ('performance', 1928),\n", + " ('few', 
1918),\n", + " ('bad', 1907),\n", + " ('look', 1900),\n", + " ('re', 1884),\n", + " ('why', 1855),\n", + " ('things', 1849),\n", + " ('times', 1832),\n", + " ('big', 1815),\n", + " ('however', 1795),\n", + " ('actually', 1790),\n", + " ('action', 1789),\n", + " ('going', 1783),\n", + " ('bit', 1757),\n", + " ('comedy', 1742),\n", + " ('down', 1740),\n", + " ('music', 1738),\n", + " ('must', 1728),\n", + " ('take', 1709),\n", + " ('saw', 1692),\n", + " ('long', 1690),\n", + " ('right', 1688),\n", + " ('fun', 1686),\n", + " ('fact', 1684),\n", + " ('excellent', 1683),\n", + " ('around', 1674),\n", + " ('didn', 1672),\n", + " ('without', 1671),\n", + " ('thing', 1662),\n", + " ('thought', 1639),\n", + " ('got', 1635),\n", + " ('each', 1630),\n", + " ('day', 1614),\n", + " ('feel', 1597),\n", + " ('seems', 1596),\n", + " ('come', 1594),\n", + " ('done', 1586),\n", + " ('beautiful', 1580),\n", + " ('especially', 1572),\n", + " ('played', 1571),\n", + " ('almost', 1566),\n", + " ('want', 1562),\n", + " ('yet', 1556),\n", + " ('give', 1553),\n", + " ('pretty', 1549),\n", + " ('last', 1543),\n", + " ('since', 1519),\n", + " ('different', 1504),\n", + " ('although', 1501),\n", + " ('gets', 1490),\n", + " ('true', 1487),\n", + " ('interesting', 1481),\n", + " ('job', 1470),\n", + " ('enough', 1455),\n", + " ('our', 1454),\n", + " ('shows', 1447),\n", + " ('horror', 1441),\n", + " ('woman', 1439),\n", + " ('tv', 1400),\n", + " ('probably', 1398),\n", + " ('father', 1395),\n", + " ('original', 1393),\n", + " ('girl', 1390),\n", + " ('point', 1379),\n", + " ('plays', 1378),\n", + " ('wonderful', 1372),\n", + " ('far', 1358),\n", + " ('course', 1358),\n", + " ('john', 1350),\n", + " ('rather', 1340),\n", + " ('isn', 1328),\n", + " ('ll', 1326),\n", + " ('later', 1324),\n", + " ('dvd', 1324),\n", + " ('war', 1310),\n", + " ('whole', 1310),\n", + " ('d', 1307),\n", + " ('away', 1306),\n", + " ('found', 1306),\n", + " ('screen', 1305),\n", + " ('nothing', 1300),\n", + " ('year', 
1297),\n", + " ('once', 1296),\n", + " ('hard', 1294),\n", + " ('together', 1280),\n", + " ('am', 1277),\n", + " ('set', 1277),\n", + " ('having', 1266),\n", + " ('making', 1265),\n", + " ('place', 1263),\n", + " ('comes', 1260),\n", + " ('might', 1260),\n", + " ('sure', 1253),\n", + " ('american', 1248),\n", + " ('play', 1245),\n", + " ('kind', 1244),\n", + " ('takes', 1242),\n", + " ('perfect', 1242),\n", + " ('performances', 1237),\n", + " ('himself', 1230),\n", + " ('worth', 1221),\n", + " ('everyone', 1221),\n", + " ('anyone', 1214),\n", + " ('actor', 1203),\n", + " ('three', 1201),\n", + " ('wife', 1196),\n", + " ('classic', 1192),\n", + " ('goes', 1186),\n", + " ('ending', 1178),\n", + " ('version', 1168),\n", + " ('star', 1149),\n", + " ('enjoy', 1146),\n", + " ('book', 1142),\n", + " ('nice', 1132),\n", + " ('everything', 1128),\n", + " ('during', 1124),\n", + " ('put', 1118),\n", + " ('seeing', 1111),\n", + " ('least', 1102),\n", + " ('house', 1100),\n", + " ('high', 1095),\n", + " ('watched', 1094),\n", + " ('men', 1087),\n", + " ('loved', 1087),\n", + " ('night', 1082),\n", + " ('anything', 1075),\n", + " ('guy', 1071),\n", + " ('believe', 1071),\n", + " ('top', 1063),\n", + " ('amazing', 1058),\n", + " ('hollywood', 1056),\n", + " ('looking', 1053),\n", + " ('main', 1044),\n", + " ('definitely', 1043),\n", + " ('gives', 1031),\n", + " ('home', 1029),\n", + " ('seem', 1028),\n", + " ('episode', 1023),\n", + " ('sense', 1020),\n", + " ('audience', 1020),\n", + " ('truly', 1017),\n", + " ('special', 1011),\n", + " ('fan', 1009),\n", + " ('second', 1009),\n", + " ('short', 1009),\n", + " ('mind', 1005),\n", + " ('human', 1001),\n", + " ('recommend', 999),\n", + " ('full', 996),\n", + " ('black', 995),\n", + " ('help', 991),\n", + " ('along', 989),\n", + " ('trying', 987),\n", + " ('small', 986),\n", + " ('death', 985),\n", + " ('friends', 981),\n", + " ('remember', 974),\n", + " ('often', 970),\n", + " ('said', 966),\n", + " ('favorite', 962),\n", + " 
('heart', 959),\n", + " ('early', 957),\n", + " ('left', 956),\n", + " ('until', 955),\n", + " ('let', 954),\n", + " ('script', 954),\n", + " ('maybe', 937),\n", + " ('today', 936),\n", + " ('live', 934),\n", + " ('less', 934),\n", + " ('moments', 933),\n", + " ('others', 929),\n", + " ('brilliant', 926),\n", + " ('shot', 925),\n", + " ('liked', 923),\n", + " ('become', 916),\n", + " ('won', 915),\n", + " ('used', 910),\n", + " ('style', 907),\n", + " ('mother', 895),\n", + " ('lives', 894),\n", + " ('came', 893),\n", + " ('stars', 890),\n", + " ('cinema', 889),\n", + " ('looks', 885),\n", + " ('perhaps', 884),\n", + " ('read', 882),\n", + " ('enjoyed', 879),\n", + " ('boy', 875),\n", + " ('drama', 873),\n", + " ('highly', 871),\n", + " ('given', 870),\n", + " ('playing', 867),\n", + " ('use', 864),\n", + " ('next', 859),\n", + " ('women', 858),\n", + " ('fine', 857),\n", + " ('effects', 856),\n", + " ('kids', 854),\n", + " ('entertaining', 853),\n", + " ('need', 852),\n", + " ('line', 850),\n", + " ('works', 848),\n", + " ('someone', 847),\n", + " ('mr', 836),\n", + " ('simply', 835),\n", + " ('children', 833),\n", + " ('picture', 833),\n", + " ('face', 831),\n", + " ('friend', 831),\n", + " ('keep', 831),\n", + " ('dark', 830),\n", + " ('overall', 828),\n", + " ('certainly', 828),\n", + " ('minutes', 827),\n", + " ('wasn', 824),\n", + " ('history', 822),\n", + " ('finally', 820),\n", + " ('couple', 816),\n", + " ('against', 815),\n", + " ('son', 809),\n", + " ('understand', 808),\n", + " ('lost', 807),\n", + " ('michael', 805),\n", + " ('else', 801),\n", + " ('throughout', 798),\n", + " ('fans', 797),\n", + " ('city', 792),\n", + " ('reason', 789),\n", + " ('written', 787),\n", + " ('production', 787),\n", + " ('several', 784),\n", + " ('school', 783),\n", + " ('rest', 781),\n", + " ('based', 781),\n", + " ('try', 780),\n", + " ('dead', 776),\n", + " ('hope', 775),\n", + " ('strong', 768),\n", + " ('white', 765),\n", + " ('tell', 759),\n", + " ('itself', 
758),\n", + " ('half', 753),\n", + " ('person', 749),\n", + " ('sometimes', 746),\n", + " ('past', 744),\n", + " ('start', 744),\n", + " ('genre', 743),\n", + " ('final', 739),\n", + " ('beginning', 739),\n", + " ('town', 738),\n", + " ('art', 734),\n", + " ('game', 732),\n", + " ('humor', 732),\n", + " ('yes', 731),\n", + " ('idea', 731),\n", + " ('late', 730),\n", + " ('becomes', 729),\n", + " ('despite', 729),\n", + " ('able', 726),\n", + " ('case', 726),\n", + " ('money', 723),\n", + " ('child', 721),\n", + " ('completely', 721),\n", + " ('side', 719),\n", + " ('camera', 716),\n", + " ('getting', 714),\n", + " ('instead', 712),\n", + " ('soon', 702),\n", + " ('under', 700),\n", + " ('viewer', 699),\n", + " ('age', 697),\n", + " ('days', 696),\n", + " ('stories', 696),\n", + " ('felt', 694),\n", + " ('simple', 694),\n", + " ('roles', 693),\n", + " ('video', 688),\n", + " ('name', 683),\n", + " ('either', 683),\n", + " ('doing', 677),\n", + " ('turns', 674),\n", + " ('wants', 671),\n", + " ('close', 671),\n", + " ('title', 669),\n", + " ('wrong', 668),\n", + " ('went', 666),\n", + " ('james', 665),\n", + " ('evil', 659),\n", + " ('budget', 657),\n", + " ('episodes', 657),\n", + " ('relationship', 655),\n", + " ('piece', 653),\n", + " ('fantastic', 653),\n", + " ('david', 651),\n", + " ('turn', 648),\n", + " ('murder', 646),\n", + " ('parts', 645),\n", + " ('brother', 644),\n", + " ('head', 643),\n", + " ('absolutely', 643),\n", + " ('experience', 642),\n", + " ('eyes', 641),\n", + " ('sex', 638),\n", + " ('direction', 637),\n", + " ('called', 637),\n", + " ('directed', 636),\n", + " ('lines', 634),\n", + " ('behind', 633),\n", + " ('sort', 632),\n", + " ('actress', 631),\n", + " ('lead', 630),\n", + " ('oscar', 628),\n", + " ('example', 627),\n", + " ('including', 627),\n", + " ('known', 625),\n", + " ('musical', 625),\n", + " ('chance', 621),\n", + " ('score', 620),\n", + " ('feeling', 619),\n", + " ('already', 619),\n", + " ('hit', 619),\n", + " ('voice', 
615),\n", + " ('moment', 612),\n", + " ('living', 612),\n", + " ('low', 610),\n", + " ('supporting', 610),\n", + " ('ago', 609),\n", + " ('themselves', 608),\n", + " ('hilarious', 605),\n", + " ('reality', 605),\n", + " ('jack', 604),\n", + " ('told', 603),\n", + " ('hand', 601),\n", + " ('moving', 600),\n", + " ('dialogue', 600),\n", + " ('quality', 600),\n", + " ('song', 599),\n", + " ('happy', 599),\n", + " ('paul', 598),\n", + " ('matter', 598),\n", + " ('light', 594),\n", + " ('future', 593),\n", + " ('entire', 592),\n", + " ('finds', 591),\n", + " ('gave', 589),\n", + " ('laugh', 587),\n", + " ('released', 586),\n", + " ('expect', 584),\n", + " ('fight', 581),\n", + " ('particularly', 580),\n", + " ('cinematography', 579),\n", + " ('police', 579),\n", + " ('whose', 578),\n", + " ('type', 578),\n", + " ('sound', 578),\n", + " ('enjoyable', 573),\n", + " ('view', 573),\n", + " ('husband', 572),\n", + " ('romantic', 572),\n", + " ('number', 572),\n", + " ('daughter', 572),\n", + " ('documentary', 571),\n", + " ('self', 570),\n", + " ('modern', 569),\n", + " ('robert', 569),\n", + " ('took', 569),\n", + " ('superb', 569),\n", + " ('mean', 566),\n", + " ('shown', 563),\n", + " ('coming', 561),\n", + " ('important', 560),\n", + " ('king', 559),\n", + " ('leave', 559),\n", + " ('change', 558),\n", + " ('wanted', 555),\n", + " ('somewhat', 555),\n", + " ('tells', 554),\n", + " ('run', 552),\n", + " ('events', 552),\n", + " ('country', 552),\n", + " ('career', 552),\n", + " ('heard', 550),\n", + " ('season', 550),\n", + " ('girls', 549),\n", + " ('greatest', 549),\n", + " ('etc', 547),\n", + " ('care', 546),\n", + " ('starts', 545),\n", + " ('english', 542),\n", + " ('killer', 541),\n", + " ('animation', 540),\n", + " ('guys', 540),\n", + " ('totally', 540),\n", + " ('tale', 540),\n", + " ('usual', 539),\n", + " ('opinion', 535),\n", + " ('miss', 535),\n", + " ('violence', 531),\n", + " ('easy', 531),\n", + " ('songs', 530),\n", + " ('british', 528),\n", + " ('says', 
526),\n", + " ('realistic', 525),\n", + " ('writing', 524),\n", + " ('act', 522),\n", + " ('writer', 522),\n", + " ('comic', 521),\n", + " ('thriller', 519),\n", + " ('television', 517),\n", + " ('power', 516),\n", + " ('ones', 515),\n", + " ('kid', 514),\n", + " ('novel', 513),\n", + " ('york', 513),\n", + " ('problem', 512),\n", + " ('alone', 512),\n", + " ('attention', 509),\n", + " ('involved', 508),\n", + " ('kill', 507),\n", + " ('extremely', 507),\n", + " ('seemed', 506),\n", + " ('hero', 505),\n", + " ('french', 505),\n", + " ('rock', 504),\n", + " ('stuff', 501),\n", + " ('wish', 499),\n", + " ('begins', 498),\n", + " ('taken', 497),\n", + " ('sad', 497),\n", + " ('ways', 496),\n", + " ('richard', 495),\n", + " ('knows', 494),\n", + " ('atmosphere', 493),\n", + " ('surprised', 491),\n", + " ('similar', 491),\n", + " ('taking', 491),\n", + " ('car', 491),\n", + " ('george', 490),\n", + " ('perfectly', 490),\n", + " ('across', 489),\n", + " ('sequence', 489),\n", + " ('eye', 489),\n", + " ('team', 489),\n", + " ('serious', 488),\n", + " ('powerful', 488),\n", + " ('room', 488),\n", + " ('due', 488),\n", + " ('among', 488),\n", + " ('order', 487),\n", + " ('b', 487),\n", + " ('cannot', 487),\n", + " ('strange', 487),\n", + " ('beauty', 486),\n", + " ('famous', 485),\n", + " ('tries', 484),\n", + " ('myself', 484),\n", + " ('happened', 484),\n", + " ('herself', 484),\n", + " ('class', 483),\n", + " ('four', 482),\n", + " ('cool', 481),\n", + " ('release', 479),\n", + " ('anyway', 479),\n", + " ('theme', 479),\n", + " ('opening', 478),\n", + " ('entertainment', 477),\n", + " ('unique', 475),\n", + " ('ends', 475),\n", + " ('slow', 475),\n", + " ('exactly', 475),\n", + " ('red', 474),\n", + " ('o', 474),\n", + " ('level', 474),\n", + " ('easily', 474),\n", + " ('interest', 472),\n", + " ('happen', 471),\n", + " ('crime', 470),\n", + " ('viewing', 468),\n", + " ('memorable', 467),\n", + " ('sets', 467),\n", + " ('group', 466),\n", + " ('stop', 466),\n", + " 
('dance', 463),\n", + " ('message', 463),\n", + " ('sister', 463),\n", + " ('working', 463),\n", + " ('problems', 463),\n", + " ('knew', 462),\n", + " ('mystery', 461),\n", + " ('nature', 461),\n", + " ('bring', 460),\n", + " ('believable', 459),\n", + " ('thinking', 459),\n", + " ('brought', 459),\n", + " ('mostly', 458),\n", + " ('couldn', 457),\n", + " ('disney', 457),\n", + " ('society', 456),\n", + " ('within', 455),\n", + " ('lady', 455),\n", + " ('blood', 454),\n", + " ('upon', 453),\n", + " ('viewers', 453),\n", + " ('parents', 453),\n", + " ('meets', 452),\n", + " ('form', 452),\n", + " ('soundtrack', 452),\n", + " ('usually', 452),\n", + " ('tom', 452),\n", + " ('peter', 452),\n", + " ('local', 450),\n", + " ('certain', 448),\n", + " ('follow', 448),\n", + " ('whether', 447),\n", + " ('possible', 446),\n", + " ('emotional', 445),\n", + " ('killed', 444),\n", + " ('de', 444),\n", + " ('above', 444),\n", + " ('middle', 443),\n", + " ('god', 443),\n", + " ('happens', 442),\n", + " ('flick', 442),\n", + " ('needs', 442),\n", + " ('masterpiece', 441),\n", + " ('major', 440),\n", + " ('period', 440),\n", + " ('haven', 439),\n", + " ('named', 439),\n", + " ('th', 438),\n", + " ('particular', 438),\n", + " ('earth', 437),\n", + " ('feature', 437),\n", + " ('stand', 436),\n", + " ('words', 435),\n", + " ('typical', 435),\n", + " ('obviously', 433),\n", + " ('elements', 433),\n", + " ('romance', 431),\n", + " ('jane', 430),\n", + " ('yourself', 427),\n", + " ('showing', 427),\n", + " ('fantasy', 426),\n", + " ('brings', 426),\n", + " ('america', 423),\n", + " ('guess', 423),\n", + " ('huge', 422),\n", + " ('unfortunately', 422),\n", + " ('indeed', 421),\n", + " ('running', 421),\n", + " ('talent', 420),\n", + " ('stage', 419),\n", + " ('started', 418),\n", + " ('sweet', 417),\n", + " ('leads', 417),\n", + " ('japanese', 417),\n", + " ('poor', 416),\n", + " ('deal', 416),\n", + " ('personal', 413),\n", + " ('incredible', 413),\n", + " ('fast', 412),\n", + " 
('became', 410),\n", + " ('deep', 410),\n", + " ('hours', 409),\n", + " ('nearly', 408),\n", + " ('dream', 408),\n", + " ('giving', 408),\n", + " ('turned', 407),\n", + " ('clearly', 407),\n", + " ('near', 406),\n", + " ('obvious', 406),\n", + " ('cut', 405),\n", + " ('surprise', 405),\n", + " ('body', 404),\n", + " ('era', 404),\n", + " ('female', 403),\n", + " ('hour', 403),\n", + " ('five', 403),\n", + " ('note', 399),\n", + " ('learn', 398),\n", + " ('truth', 398),\n", + " ('match', 397),\n", + " ('feels', 397),\n", + " ('except', 397),\n", + " ('tony', 397),\n", + " ('filmed', 394),\n", + " ('complete', 394),\n", + " ('clear', 394),\n", + " ('older', 393),\n", + " ('street', 393),\n", + " ('lots', 393),\n", + " ('eventually', 393),\n", + " ('keeps', 393),\n", + " ('buy', 392),\n", + " ('stewart', 391),\n", + " ('william', 391),\n", + " ('joe', 390),\n", + " ('meet', 390),\n", + " ('fall', 390),\n", + " ('shots', 389),\n", + " ('talking', 389),\n", + " ('difficult', 389),\n", + " ('unlike', 389),\n", + " ('rating', 389),\n", + " ('means', 388),\n", + " ('dramatic', 388),\n", + " ('appears', 386),\n", + " ('subject', 386),\n", + " ('wonder', 386),\n", + " ('present', 386),\n", + " ('situation', 386),\n", + " ('comments', 385),\n", + " ('sequences', 383),\n", + " ('general', 383),\n", + " ('lee', 383),\n", + " ('earlier', 382),\n", + " ('points', 382),\n", + " ('check', 379),\n", + " ('gone', 379),\n", + " ('ten', 378),\n", + " ('suspense', 378),\n", + " ('recommended', 378),\n", + " ('business', 377),\n", + " ('third', 377),\n", + " ('talk', 375),\n", + " ('leaves', 375),\n", + " ('beyond', 375),\n", + " ('portrayal', 374),\n", + " ('beautifully', 373),\n", + " ('single', 372),\n", + " ('bill', 372),\n", + " ('word', 371),\n", + " ('plenty', 371),\n", + " ('falls', 370),\n", + " ('whom', 370),\n", + " ('figure', 369),\n", + " ('battle', 369),\n", + " ('scary', 369),\n", + " ('non', 369),\n", + " ('return', 368),\n", + " ('using', 368),\n", + " ('doubt', 
367),\n", + " ('add', 367),\n", + " ('hear', 366),\n", + " ('solid', 366),\n", + " ('success', 366),\n", + " ('touching', 365),\n", + " ('political', 365),\n", + " ('oh', 365),\n", + " ('jokes', 365),\n", + " ('awesome', 364),\n", + " ('hell', 364),\n", + " ('boys', 364),\n", + " ('dog', 362),\n", + " ('recently', 362),\n", + " ('sexual', 362),\n", + " ('please', 361),\n", + " ('wouldn', 361),\n", + " ('features', 361),\n", + " ('straight', 361),\n", + " ('lack', 360),\n", + " ('forget', 360),\n", + " ('setting', 360),\n", + " ('mark', 359),\n", + " ('married', 359),\n", + " ('social', 357),\n", + " ('adventure', 356),\n", + " ('interested', 356),\n", + " ('brothers', 355),\n", + " ('sees', 355),\n", + " ('actual', 355),\n", + " ('terrific', 355),\n", + " ('move', 354),\n", + " ('call', 354),\n", + " ('various', 353),\n", + " ('dr', 353),\n", + " ('theater', 353),\n", + " ('animated', 352),\n", + " ('western', 351),\n", + " ('space', 350),\n", + " ('baby', 350),\n", + " ('leading', 348),\n", + " ('disappointed', 348),\n", + " ('portrayed', 346),\n", + " ('aren', 346),\n", + " ('screenplay', 345),\n", + " ('smith', 345),\n", + " ('hate', 344),\n", + " ('towards', 344),\n", + " ('noir', 343),\n", + " ('outstanding', 342),\n", + " ('decent', 342),\n", + " ('kelly', 342),\n", + " ('directors', 341),\n", + " ('journey', 341),\n", + " ('none', 340),\n", + " ('effective', 340),\n", + " ('looked', 340),\n", + " ('caught', 339),\n", + " ('cold', 339),\n", + " ('storyline', 339),\n", + " ('fi', 339),\n", + " ('sci', 339),\n", + " ('mary', 339),\n", + " ('rich', 338),\n", + " ('charming', 338),\n", + " ('harry', 337),\n", + " ('popular', 337),\n", + " ('manages', 337),\n", + " ('rare', 337),\n", + " ('spirit', 336),\n", + " ('open', 335),\n", + " ('appreciate', 335),\n", + " ('basically', 334),\n", + " ('moves', 334),\n", + " ('acted', 334),\n", + " ('deserves', 333),\n", + " ('subtle', 333),\n", + " ('mention', 333),\n", + " ('inside', 333),\n", + " ('pace', 333),\n", + " 
('century', 333),\n", + " ('boring', 333),\n", + " ('familiar', 332),\n", + " ('background', 332),\n", + " ('ben', 331),\n", + " ('creepy', 330),\n", + " ('supposed', 330),\n", + " ('secret', 329),\n", + " ('jim', 328),\n", + " ('die', 328),\n", + " ('question', 327),\n", + " ('effect', 327),\n", + " ('natural', 327),\n", + " ('rate', 326),\n", + " ('language', 326),\n", + " ('impressive', 326),\n", + " ('intelligent', 325),\n", + " ('saying', 325),\n", + " ('material', 324),\n", + " ('realize', 324),\n", + " ('telling', 324),\n", + " ('scott', 324),\n", + " ('singing', 323),\n", + " ('dancing', 322),\n", + " ('adult', 321),\n", + " ('imagine', 321),\n", + " ('visual', 321),\n", + " ('kept', 320),\n", + " ('office', 320),\n", + " ('uses', 319),\n", + " ('pure', 318),\n", + " ('wait', 318),\n", + " ('stunning', 318),\n", + " ('copy', 317),\n", + " ('review', 317),\n", + " ('previous', 317),\n", + " ('seriously', 317),\n", + " ('somehow', 316),\n", + " ('created', 316),\n", + " ('magic', 316),\n", + " ('create', 316),\n", + " ('hot', 316),\n", + " ('reading', 316),\n", + " ('crazy', 315),\n", + " ('air', 315),\n", + " ('frank', 315),\n", + " ('stay', 315),\n", + " ('escape', 315),\n", + " ('attempt', 315),\n", + " ('hands', 314),\n", + " ('filled', 313),\n", + " ('surprisingly', 312),\n", + " ('expected', 312),\n", + " ('average', 312),\n", + " ('complex', 311),\n", + " ('studio', 310),\n", + " ('successful', 310),\n", + " ('quickly', 310),\n", + " ('male', 309),\n", + " ('plus', 309),\n", + " ('co', 307),\n", + " ('minute', 306),\n", + " ('images', 306),\n", + " ('casting', 306),\n", + " ('exciting', 306),\n", + " ('following', 306),\n", + " ('members', 305),\n", + " ('german', 305),\n", + " ('e', 305),\n", + " ('reasons', 305),\n", + " ('follows', 305),\n", + " ('themes', 305),\n", + " ('touch', 304),\n", + " ('genius', 304),\n", + " ('free', 304),\n", + " ('edge', 304),\n", + " ('cute', 304),\n", + " ('outside', 303),\n", + " ('ok', 302),\n", + " ('admit', 
302),\n", + " ('younger', 302),\n", + " ('reviews', 302),\n", + " ('odd', 301),\n", + " ('fighting', 301),\n", + " ('master', 301),\n", + " ('break', 300),\n", + " ('thanks', 300),\n", + " ('recent', 300),\n", + " ('comment', 300),\n", + " ('apart', 299),\n", + " ('lovely', 298),\n", + " ('begin', 298),\n", + " ('emotions', 298),\n", + " ('doctor', 297),\n", + " ('italian', 297),\n", + " ('party', 297),\n", + " ('la', 296),\n", + " ('missed', 296),\n", + " ...]" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "positive_counts.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "pos_neg_ratios = Counter()\n", + "\n", + "for term,cnt in list(total_counts.most_common()):\n", + " if(cnt > 100):\n", + " pos_neg_ratio = positive_counts[term] / float(negative_counts[term]+1)\n", + " pos_neg_ratios[term] = pos_neg_ratio\n", + "\n", + "for word,ratio in pos_neg_ratios.most_common():\n", + " if(ratio > 1):\n", + " pos_neg_ratios[word] = np.log(ratio)\n", + " else:\n", + " pos_neg_ratios[word] = -np.log((1 / (ratio+0.01)))" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('edie', 4.6913478822291435),\n", + " ('paulie', 4.0775374439057197),\n", + " ('felix', 3.1527360223636558),\n", + " ('polanski', 2.8233610476132043),\n", + " ('matthau', 2.8067217286092401),\n", + " ('victoria', 2.6810215287142909),\n", + " ('mildred', 2.6026896854443837),\n", + " ('gandhi', 2.5389738710582761),\n", + " ('flawless', 2.451005098112319),\n", + " ('superbly', 2.2600254785752498),\n", + " ('perfection', 2.1594842493533721),\n", + " ('astaire', 2.1400661634962708),\n", + " ('captures', 2.0386195471595809),\n", + " ('voight', 2.0301704926730531),\n", + " ('wonderfully', 2.0218960560332353),\n", + " ('powell', 1.9783454248084671),\n", 
+ " ('brosnan', 1.9547990964725592),\n", + " ('lily', 1.9203768470501485),\n", + " ('bakshi', 1.9029851043382795),\n", + " ('lincoln', 1.9014583864844796),\n", + " ('refreshing', 1.8551812956655511),\n", + " ('breathtaking', 1.8481124057791867),\n", + " ('bourne', 1.8478489358790986),\n", + " ('lemmon', 1.8458266904983307),\n", + " ('delightful', 1.8002701588959635),\n", + " ('flynn', 1.7996646487351682),\n", + " ('andrews', 1.7764919970972666),\n", + " ('homer', 1.7692866133759964),\n", + " ('beautifully', 1.7626953362841438),\n", + " ('soccer', 1.7578579175523736),\n", + " ('elvira', 1.7397031072720019),\n", + " ('underrated', 1.7197859696029656),\n", + " ('gripping', 1.7165360479904674),\n", + " ('superb', 1.7091514458966952),\n", + " ('delight', 1.6714733033535532),\n", + " ('welles', 1.6677068205580761),\n", + " ('sadness', 1.663505133704376),\n", + " ('sinatra', 1.6389967146756448),\n", + " ('touching', 1.637217476541176),\n", + " ('timeless', 1.62924053973028),\n", + " ('macy', 1.6211339521972916),\n", + " ('unforgettable', 1.6177367152487956),\n", + " ('favorites', 1.6158688027643908),\n", + " ('stewart', 1.6119987332957739),\n", + " ('hartley', 1.6094379124341003),\n", + " ('sullivan', 1.6094379124341003),\n", + " ('extraordinary', 1.6094379124341003),\n", + " ('brilliantly', 1.5950491749820008),\n", + " ('friendship', 1.5677652160335325),\n", + " ('wonderful', 1.5645425925262093),\n", + " ('palma', 1.5553706911638245),\n", + " ('magnificent', 1.54663701119507),\n", + " ('finest', 1.5462590108125689),\n", + " ('jackie', 1.5439233053234738),\n", + " ('ritter', 1.5404450409471491),\n", + " ('tremendous', 1.5184661342283736),\n", + " ('freedom', 1.5091151908062312),\n", + " ('fantastic', 1.5048433868558566),\n", + " ('terrific', 1.5026699370083942),\n", + " ('noir', 1.493925025312256),\n", + " ('sidney', 1.493925025312256),\n", + " ('outstanding', 1.4910053152089213),\n", + " ('mann', 1.4894785973551214),\n", + " ('pleasantly', 1.4894785973551214),\n", + " 
('nancy', 1.488077055429833),\n", + " ('marie', 1.4825711915553104),\n", + " ('marvelous', 1.4739999415389962),\n", + " ('excellent', 1.4647538505723599),\n", + " ('ruth', 1.4596256342054401),\n", + " ('stanwyck', 1.4412101187160054),\n", + " ('widmark', 1.4350845252893227),\n", + " ('splendid', 1.4271163556401458),\n", + " ('chan', 1.423108334242607),\n", + " ('exceptional', 1.4201959127955721),\n", + " ('tender', 1.410986973710262),\n", + " ('gentle', 1.4078005663408544),\n", + " ('poignant', 1.4022947024663317),\n", + " ('gem', 1.3932148039644643),\n", + " ('amazing', 1.3919815802404802),\n", + " ('chilling', 1.3862943611198906),\n", + " ('captivating', 1.3862943611198906),\n", + " ('fisher', 1.3862943611198906),\n", + " ('davies', 1.3862943611198906),\n", + " ('darker', 1.3652409519220583),\n", + " ('april', 1.3499267169490159),\n", + " ('kelly', 1.3461743673304654),\n", + " ('blake', 1.3418425985490567),\n", + " ('overlooked', 1.329135947279942),\n", + " ('ralph', 1.32818673031261),\n", + " ('bette', 1.3156767939059373),\n", + " ('hoffman', 1.3150668518315229),\n", + " ('cole', 1.3121863889661687),\n", + " ('shines', 1.3049487216659381),\n", + " ('powerful', 1.2999662776313934),\n", + " ('notch', 1.2950456896547455),\n", + " ('remarkable', 1.2883688239495823),\n", + " ('pitt', 1.286210902562908),\n", + " ('winters', 1.2833463918674481),\n", + " ('vivid', 1.2762934659055623),\n", + " ('gritty', 1.2757524867200667),\n", + " ('giallo', 1.2745029551317739),\n", + " ('portrait', 1.2704625455947689),\n", + " ('innocence', 1.2694300209805796),\n", + " ('psychiatrist', 1.2685113254635072),\n", + " ('favorite', 1.2668956297860055),\n", + " ('ensemble', 1.2656663733312759),\n", + " ('stunning', 1.2622417124499117),\n", + " ('burns', 1.259880436264232),\n", + " ('garbo', 1.258954938743289),\n", + " ('barbara', 1.2580400255962119),\n", + " ('panic', 1.2527629684953681),\n", + " ('holly', 1.2527629684953681),\n", + " ('philip', 1.2527629684953681),\n", + " ('carol', 
1.2481440226390734),\n", + " ('perfect', 1.246742480713785),\n", + " ('appreciated', 1.2462482874741743),\n", + " ('favourite', 1.2411123512753928),\n", + " ('journey', 1.2367626271489269),\n", + " ('rural', 1.235471471385307),\n", + " ('bond', 1.2321436812926323),\n", + " ('builds', 1.2305398317106577),\n", + " ('brilliant', 1.2287554137664785),\n", + " ('brooklyn', 1.2286654169163074),\n", + " ('von', 1.225175011976539),\n", + " ('unfolds', 1.2163953243244932),\n", + " ('recommended', 1.2163953243244932),\n", + " ('daniel', 1.20215296760895),\n", + " ('perfectly', 1.1971931173405572),\n", + " ('crafted', 1.1962507582320256),\n", + " ('prince', 1.1939224684724346),\n", + " ('troubled', 1.192138346678933),\n", + " ('consequences', 1.1865810616140668),\n", + " ('haunting', 1.1814999484738773),\n", + " ('cinderella', 1.180052620608284),\n", + " ('alexander', 1.1759989522835299),\n", + " ('emotions', 1.1753049094563641),\n", + " ('boxing', 1.1735135968412274),\n", + " ('subtle', 1.1734135017508081),\n", + " ('curtis', 1.1649873576129823),\n", + " ('rare', 1.1566438362402944),\n", + " ('loved', 1.1563661500586044),\n", + " ('daughters', 1.1526795099383853),\n", + " ('courage', 1.1438688802562305),\n", + " ('dentist', 1.1426722784621401),\n", + " ('highly', 1.1420208631618658),\n", + " ('nominated', 1.1409146683587992),\n", + " ('tony', 1.1397491942285991),\n", + " ('draws', 1.1325138403437911),\n", + " ('everyday', 1.1306150197542835),\n", + " ('contrast', 1.1284652518177909),\n", + " ('cried', 1.1213405397456659),\n", + " ('fabulous', 1.1210851445201684),\n", + " ('ned', 1.120591195386885),\n", + " ('fay', 1.120591195386885),\n", + " ('emma', 1.1184149159642893),\n", + " ('sensitive', 1.113318436057805),\n", + " ('smooth', 1.1089750757036563),\n", + " ('dramas', 1.1080910326226534),\n", + " ('today', 1.1050431789984001),\n", + " ('helps', 1.1023091505494358),\n", + " ('inspiring', 1.0986122886681098),\n", + " ('jimmy', 1.0937696641923216),\n", + " ('awesome', 
1.0931328229034842),\n", + " ('unique', 1.0881409888008142),\n", + " ('tragic', 1.0871835928444868),\n", + " ('intense', 1.0870514662670339),\n", + " ('stellar', 1.0857088838322018),\n", + " ('rival', 1.0822184788924332),\n", + " ('provides', 1.0797081340289569),\n", + " ('depression', 1.0782034170369026),\n", + " ('shy', 1.0775588794702773),\n", + " ('carrie', 1.076139432816051),\n", + " ('blend', 1.0753554265038423),\n", + " ('hank', 1.0736109864626924),\n", + " ('diana', 1.0726368022648489),\n", + " ('adorable', 1.0726368022648489),\n", + " ('unexpected', 1.0722255334949147),\n", + " ('achievement', 1.0668635903535293),\n", + " ('bettie', 1.0663514264498881),\n", + " ('happiness', 1.0632729222228008),\n", + " ('glorious', 1.0608719606852626),\n", + " ('davis', 1.0541605260972757),\n", + " ('terrifying', 1.0525211814678428),\n", + " ('beauty', 1.050410186850232),\n", + " ('ideal', 1.0479685558493548),\n", + " ('fears', 1.0467872208035236),\n", + " ('hong', 1.0438040521731147),\n", + " ('seasons', 1.0433496099930604),\n", + " ('fascinating', 1.0414538748281612),\n", + " ('carries', 1.0345904299031787),\n", + " ('satisfying', 1.0321225473992768),\n", + " ('definite', 1.0319209141694374),\n", + " ('touched', 1.0296194171811581),\n", + " ('greatest', 1.0248947127715422),\n", + " ('creates', 1.0241097613701886),\n", + " ('aunt', 1.023388867430522),\n", + " ('walter', 1.022328983918479),\n", + " ('spectacular', 1.0198314108149955),\n", + " ('portrayal', 1.0189810189761024),\n", + " ('ann', 1.0127808528183286),\n", + " ('enterprise', 1.0116009116784799),\n", + " ('musicals', 1.0096648026516135),\n", + " ('deeply', 1.0094845087721023),\n", + " ('incredible', 1.0061677561461084),\n", + " ('mature', 1.0060195018402847),\n", + " ('triumph', 0.99682959435816731),\n", + " ('margaret', 0.99682959435816731),\n", + " ('navy', 0.99493385919326827),\n", + " ('harry', 0.99176919305006062),\n", + " ('lucas', 0.990398704027877),\n", + " ('sweet', 0.98966110487955483),\n", + " 
('joey', 0.98794672078059009),\n", + " ('oscar', 0.98721905111049713),\n", + " ('balance', 0.98649499054740353),\n", + " ('warm', 0.98485340331145166),\n", + " ('ages', 0.98449898190068863),\n", + " ('glover', 0.98082925301172619),\n", + " ('guilt', 0.98082925301172619),\n", + " ('carrey', 0.98082925301172619),\n", + " ('learns', 0.97881108885548895),\n", + " ('unusual', 0.97788374278196932),\n", + " ('sons', 0.97777581552483595),\n", + " ('complex', 0.97761897738147796),\n", + " ('essence', 0.97753435711487369),\n", + " ('brazil', 0.9769153536905899),\n", + " ('widow', 0.97650959186720987),\n", + " ('solid', 0.97537964824416146),\n", + " ('beautiful', 0.97326301262841053),\n", + " ('holmes', 0.97246100334120955),\n", + " ('awe', 0.97186058302896583),\n", + " ('vhs', 0.97116734209998934),\n", + " ('eerie', 0.97116734209998934),\n", + " ('lonely', 0.96873720724669754),\n", + " ('grim', 0.96873720724669754),\n", + " ('sport', 0.96825047080486615),\n", + " ('debut', 0.96508089604358704),\n", + " ('destiny', 0.96343751029985703),\n", + " ('thrillers', 0.96281074750904794),\n", + " ('tears', 0.95977584381389391),\n", + " ('rose', 0.95664202739772253),\n", + " ('feelings', 0.95551144502743635),\n", + " ('ginger', 0.95551144502743635),\n", + " ('winning', 0.95471810900804055),\n", + " ('stanley', 0.95387344302319799),\n", + " ('cox', 0.95343027882361187),\n", + " ('paris', 0.95278479030472663),\n", + " ('heart', 0.95238806924516806),\n", + " ('hooked', 0.95155887071161305),\n", + " ('comfortable', 0.94803943018873538),\n", + " ('mgm', 0.94446160884085151),\n", + " ('masterpiece', 0.94155039863339296),\n", + " ('themes', 0.94118828349588235),\n", + " ('danny', 0.93967118051821874),\n", + " ('anime', 0.93378388932167222),\n", + " ('perry', 0.93328830824272613),\n", + " ('joy', 0.93301752567946861),\n", + " ('lovable', 0.93081883243706487),\n", + " ('hal', 0.92953595862417571),\n", + " ('mysteries', 0.92953595862417571),\n", + " ('louis', 0.92871325187271225),\n", + " 
('charming', 0.92520609553210742),\n", + " ('urban', 0.92367083917177761),\n", + " ('allows', 0.92183091224977043),\n", + " ('impact', 0.91815814604895041),\n", + " ('gradually', 0.91629073187415511),\n", + " ('lifestyle', 0.91629073187415511),\n", + " ('italy', 0.91629073187415511),\n", + " ('spy', 0.91289514287301687),\n", + " ('treat', 0.91193342650519937),\n", + " ('subsequent', 0.91056005716517008),\n", + " ('kennedy', 0.90981821736853763),\n", + " ('loving', 0.90967549275543591),\n", + " ('surprising', 0.90937028902958128),\n", + " ('quiet', 0.90648673177753425),\n", + " ('winter', 0.90624039602065365),\n", + " ('reveals', 0.90490540964902977),\n", + " ('raw', 0.90445627422715225),\n", + " ('funniest', 0.90078654533818991),\n", + " ('pleased', 0.89994159387262562),\n", + " ('norman', 0.89994159387262562),\n", + " ('thief', 0.89874642222324552),\n", + " ('season', 0.89827222637147675),\n", + " ('secrets', 0.89794159320595857),\n", + " ('colorful', 0.89705936994626756),\n", + " ('highest', 0.8967461358011849),\n", + " ('compelling', 0.89462923509297576),\n", + " ('danes', 0.89248008318043659),\n", + " ('castle', 0.88967708335606499),\n", + " ('kudos', 0.88889175768604067),\n", + " ('great', 0.88810470901464589),\n", + " ('baseball', 0.88730319500090271),\n", + " ('subtitles', 0.88730319500090271),\n", + " ('bleak', 0.88730319500090271),\n", + " ('winner', 0.88643776872447388),\n", + " ('tragedy', 0.88563699078315261),\n", + " ('todd', 0.88551907320740142),\n", + " ('nicely', 0.87924946019380601),\n", + " ('arthur', 0.87546873735389985),\n", + " ('essential', 0.87373111745535925),\n", + " ('gorgeous', 0.8731725250935497),\n", + " ('fonda', 0.87294029100054127),\n", + " ('eastwood', 0.87139541196626402),\n", + " ('focuses', 0.87082835779739776),\n", + " ('enjoyed', 0.87070195951624607),\n", + " ('natural', 0.86997924506912838),\n", + " ('intensity', 0.86835126958503595),\n", + " ('witty', 0.86824103423244681),\n", + " ('rob', 0.8642954367557748),\n", + " 
('worlds', 0.86377269759070874),\n", + " ('health', 0.86113891179907498),\n", + " ('magical', 0.85953791528170564),\n", + " ('deeper', 0.85802182375017932),\n", + " ('lucy', 0.85618680780444956),\n", + " ('moving', 0.85566611005772031),\n", + " ('lovely', 0.85290640004681306),\n", + " ('purple', 0.8513711857748395),\n", + " ('memorable', 0.84801189112086062),\n", + " ('sings', 0.84729786038720367),\n", + " ('craig', 0.84342938360928321),\n", + " ('modesty', 0.84342938360928321),\n", + " ('relate', 0.84326559685926517),\n", + " ('episodes', 0.84223712084137292),\n", + " ('strong', 0.84167135777060931),\n", + " ('smith', 0.83959811108590054),\n", + " ('tear', 0.83704136022001441),\n", + " ('apartment', 0.83333115290549531),\n", + " ('princess', 0.83290912293510388),\n", + " ('disagree', 0.83290912293510388),\n", + " ('kung', 0.83173334384609199),\n", + " ('adventure', 0.83150561393278388),\n", + " ('columbo', 0.82667857318446791),\n", + " ('jake', 0.82667857318446791),\n", + " ('adds', 0.82485652591452319),\n", + " ('hart', 0.82472353834866463),\n", + " ('strength', 0.82417544296634937),\n", + " ('realizes', 0.82360006895738058),\n", + " ('dave', 0.8232003088081431),\n", + " ('childhood', 0.82208086393583857),\n", + " ('forbidden', 0.81989888619908913),\n", + " ('tight', 0.81883539572344199),\n", + " ('surreal', 0.8178506590609026),\n", + " ('manager', 0.81770990320170756),\n", + " ('dancer', 0.81574950265227764),\n", + " ('con', 0.81093021621632877),\n", + " ('studios', 0.81093021621632877),\n", + " ('miike', 0.80821651034473263),\n", + " ('realistic', 0.80807714723392232),\n", + " ('explicit', 0.80792269515237358),\n", + " ('kurt', 0.8060875917405409),\n", + " ('traditional', 0.80535917116687328),\n", + " ('deals', 0.80535917116687328),\n", + " ('holds', 0.80493858654806194),\n", + " ('carl', 0.80437281567016972),\n", + " ('touches', 0.80396154690023547),\n", + " ('gene', 0.80314807577427383),\n", + " ('albert', 0.8027669055771679),\n", + " ('abc', 
0.80234647252493729),\n", + " ('cry', 0.80011930011211307),\n", + " ('sides', 0.7995275841185171),\n", + " ('develops', 0.79850769621777162),\n", + " ('eyre', 0.79850769621777162),\n", + " ('dances', 0.79694397424158891),\n", + " ('oscars', 0.79633141679517616),\n", + " ('legendary', 0.79600456599965308),\n", + " ('importance', 0.79492987486988764),\n", + " ('hearted', 0.79492987486988764),\n", + " ('portraying', 0.79356592830699269),\n", + " ('impressed', 0.79258107754813223),\n", + " ('waters', 0.79112758892014912),\n", + " ('empire', 0.79078565012386137),\n", + " ('edge', 0.789774016249017),\n", + " ('environment', 0.78845736036427028),\n", + " ('jean', 0.78845736036427028),\n", + " ('sentimental', 0.7864791203521645),\n", + " ('captured', 0.78623760362595729),\n", + " ('styles', 0.78592891401091158),\n", + " ('daring', 0.78592891401091158),\n", + " ('backgrounds', 0.78275933924963248),\n", + " ('frank', 0.78275933924963248),\n", + " ('matches', 0.78275933924963248),\n", + " ('tense', 0.78275933924963248),\n", + " ('gothic', 0.78209466657644144),\n", + " ('sharp', 0.7814397877056235),\n", + " ('achieved', 0.78015855754957497),\n", + " ('court', 0.77947526404844247),\n", + " ('steals', 0.7789140023173704),\n", + " ('rules', 0.77844476107184035),\n", + " ('colors', 0.77684619943659217),\n", + " ('reunion', 0.77318988823348167),\n", + " ('covers', 0.77139937745969345),\n", + " ('tale', 0.77010822169607374),\n", + " ('rain', 0.7683706017975328),\n", + " ('denzel', 0.76804848873306297),\n", + " ('stays', 0.76787072675588186),\n", + " ('blob', 0.76725515271366718),\n", + " ('conventional', 0.76214005204689672),\n", + " ('maria', 0.76214005204689672),\n", + " ('fresh', 0.76158434211317383),\n", + " ('midnight', 0.76096977689870637),\n", + " ('landscape', 0.75852993982279704),\n", + " ('animated', 0.75768570169751648),\n", + " ('titanic', 0.75666058628227129),\n", + " ('sunday', 0.75666058628227129),\n", + " ('spring', 0.7537718023763802),\n", + " ('cagney', 
0.7537718023763802),\n", + " ('enjoyable', 0.75246375771636476),\n", + " ('immensely', 0.75198768058287868),\n", + " ('sir', 0.7507762933965817),\n", + " ('nevertheless', 0.75067102469813185),\n", + " ('driven', 0.74994477895307854),\n", + " ('performances', 0.74883252516063137),\n", + " ('memories', 0.74721440183022114),\n", + " ('nowadays', 0.74721440183022114),\n", + " ('simple', 0.74641420974143258),\n", + " ('golden', 0.74533293373051557),\n", + " ('leslie', 0.74533293373051557),\n", + " ('lovers', 0.74497224842453125),\n", + " ('relationship', 0.74484232345601786),\n", + " ('supporting', 0.74357803418683721),\n", + " ('che', 0.74262723782331497),\n", + " ('packed', 0.7410032017375805),\n", + " ('trek', 0.74021469141793106),\n", + " ('provoking', 0.73840377214806618),\n", + " ('strikes', 0.73759894313077912),\n", + " ('depiction', 0.73682224406260699),\n", + " ('emotional', 0.73678211645681524),\n", + " ('secretary', 0.7366322924996842),\n", + " ('influenced', 0.73511137965897755),\n", + " ('florida', 0.73511137965897755),\n", + " ('germany', 0.73288750920945944),\n", + " ('brings', 0.73142936713096229),\n", + " ('lewis', 0.73129894652432159),\n", + " ('elderly', 0.73088750854279239),\n", + " ('owner', 0.72743625403857748),\n", + " ('streets', 0.72666987259858895),\n", + " ('henry', 0.72642196944481741),\n", + " ('portrays', 0.72593700338293632),\n", + " ('bears', 0.7252354951114458),\n", + " ('china', 0.72489587887452556),\n", + " ('anger', 0.72439972406404984),\n", + " ('society', 0.72433010799663333),\n", + " ('available', 0.72415741730250549),\n", + " ('best', 0.72347034060446314),\n", + " ('bugs', 0.72270598280148979),\n", + " ('magic', 0.71878961117328299),\n", + " ('verhoeven', 0.71846498854423513),\n", + " ('delivers', 0.71846498854423513),\n", + " ('jim', 0.71783979315031676),\n", + " ('donald', 0.71667767797013937),\n", + " ('endearing', 0.71465338578090898),\n", + " ('relationships', 0.71393795022901896),\n", + " ('greatly', 
0.71256526641704687),\n", + " ('charlie', 0.71024161391924534),\n", + " ('brad', 0.71024161391924534),\n", + " ('simon', 0.70967648251115578),\n", + " ('effectively', 0.70914752190638641),\n", + " ('march', 0.70774597998109789),\n", + " ('atmosphere', 0.70744773070214162),\n", + " ('influence', 0.70733181555190172),\n", + " ('genius', 0.706392407309966),\n", + " ('emotionally', 0.70556970055850243),\n", + " ('ken', 0.70526854109229009),\n", + " ('identity', 0.70484322032313651),\n", + " ('sophisticated', 0.70470800296102132),\n", + " ('dan', 0.70457587638356811),\n", + " ('andrew', 0.70329955202396321),\n", + " ('india', 0.70144598337464037),\n", + " ('roy', 0.69970458110610434),\n", + " ('surprisingly', 0.6995780708902356),\n", + " ('sky', 0.69780919366575667),\n", + " ('romantic', 0.69664981111114743),\n", + " ('match', 0.69566924999265523),\n", + " ('britain', 0.69314718055994529),\n", + " ('beatty', 0.69314718055994529),\n", + " ('affected', 0.69314718055994529),\n", + " ('cowboy', 0.69314718055994529),\n", + " ('wave', 0.69314718055994529),\n", + " ('stylish', 0.69314718055994529),\n", + " ('bitter', 0.69314718055994529),\n", + " ('patient', 0.69314718055994529),\n", + " ('meets', 0.69314718055994529),\n", + " ('love', 0.69198533541937324),\n", + " ('paul', 0.68980827929443067),\n", + " ('andy', 0.68846333124751902),\n", + " ('performance', 0.68797386327972465),\n", + " ('patrick', 0.68645819240914863),\n", + " ('unlike', 0.68546468438792907),\n", + " ('brooks', 0.68433655087779044),\n", + " ('refuses', 0.68348526964820844),\n", + " ('award', 0.6824518914431974),\n", + " ('complaint', 0.6824518914431974),\n", + " ('ride', 0.68229716453587952),\n", + " ('dawson', 0.68171848473632257),\n", + " ('luke', 0.68158635815886937),\n", + " ('wells', 0.68087708796813096),\n", + " ('france', 0.6804081547825156),\n", + " ('handsome', 0.68007509899259255),\n", + " ('sports', 0.68007509899259255),\n", + " ('rebel', 0.67875844310784572),\n", + " ('directs', 
0.67875844310784572),\n", + " ('greater', 0.67605274720064523),\n", + " ('dreams', 0.67599410133369586),\n", + " ('effective', 0.67565402311242806),\n", + " ('interpretation', 0.67479804189174875),\n", + " ('works', 0.67445504754779284),\n", + " ('brando', 0.67445504754779284),\n", + " ('noble', 0.6737290947028437),\n", + " ('paced', 0.67314651385327573),\n", + " ('le', 0.67067432470788668),\n", + " ('master', 0.67015766233524654),\n", + " ('h', 0.6696166831497512),\n", + " ('rings', 0.66904962898088483),\n", + " ('easy', 0.66895995494594152),\n", + " ('city', 0.66820823221269321),\n", + " ('sunshine', 0.66782937257565544),\n", + " ('succeeds', 0.66647893347778397),\n", + " ('relations', 0.664159643686693),\n", + " ('england', 0.66387679825983203),\n", + " ('glimpse', 0.66329421741026418),\n", + " ('aired', 0.66268797307523675),\n", + " ('sees', 0.66263163663399482),\n", + " ('both', 0.66248336767382998),\n", + " ('definitely', 0.66199789483898808),\n", + " ('imaginative', 0.66139848224536502),\n", + " ('appreciate', 0.66083893732728749),\n", + " ('tricks', 0.66071190480679143),\n", + " ('striking', 0.66071190480679143),\n", + " ('carefully', 0.65999497324304479),\n", + " ('complicated', 0.65981076029235353),\n", + " ('perspective', 0.65962448852130173),\n", + " ('trilogy', 0.65877953705573755),\n", + " ('future', 0.65834665141052828),\n", + " ('lion', 0.65742909795786608),\n", + " ('victor', 0.65540685257709819),\n", + " ('douglas', 0.65540685257709819),\n", + " ('inspired', 0.65459851044271034),\n", + " ('marriage', 0.65392646740666405),\n", + " ('demands', 0.65392646740666405),\n", + " ('father', 0.65172321672194655),\n", + " ('page', 0.65123628494430852),\n", + " ('instant', 0.65058756614114943),\n", + " ('era', 0.6495567444850836),\n", + " ('ruthless', 0.64934455790155243),\n", + " ('saga', 0.64934455790155243),\n", + " ('joan', 0.64891392558311978),\n", + " ('joseph', 0.64841128671855386),\n", + " ('workers', 0.64829661439459352),\n", + " ('fantasy', 
0.64726757480925168),\n", + " ('accomplished', 0.64551913157069074),\n", + " ('distant', 0.64551913157069074),\n", + " ('manhattan', 0.64435701639051324),\n", + " ('personal', 0.64355023942057321),\n", + " ('pushing', 0.64313675998528386),\n", + " ('meeting', 0.64313675998528386),\n", + " ('individual', 0.64313675998528386),\n", + " ('pleasant', 0.64250344774119039),\n", + " ('brave', 0.64185388617239469),\n", + " ('william', 0.64083139119578469),\n", + " ('hudson', 0.64077919504262937),\n", + " ('friendly', 0.63949446706762514),\n", + " ('eccentric', 0.63907995928966954),\n", + " ('awards', 0.63875310849414646),\n", + " ('jack', 0.63838309514997038),\n", + " ('seeking', 0.63808740337691783),\n", + " ('colonel', 0.63757732940513456),\n", + " ('divorce', 0.63757732940513456),\n", + " ('jane', 0.63443957973316734),\n", + " ('keeping', 0.63414883979798953),\n", + " ('gives', 0.63383568159497883),\n", + " ('ted', 0.63342794585832296),\n", + " ('animation', 0.63208692379869902),\n", + " ('progress', 0.6317782341836532),\n", + " ('concert', 0.63127177684185776),\n", + " ('larger', 0.63127177684185776),\n", + " ('nation', 0.6296337748376194),\n", + " ('albeit', 0.62739580299716491),\n", + " ('adapted', 0.62613647027698516),\n", + " ('discovers', 0.62542900650499444),\n", + " ('classic', 0.62504956428050518),\n", + " ('segment', 0.62335141862440335),\n", + " ('morgan', 0.62303761437291871),\n", + " ('mouse', 0.62294292188669675),\n", + " ('impressive', 0.62211140744319349),\n", + " ('artist', 0.62168821657780038),\n", + " ('ultimate', 0.62168821657780038),\n", + " ('griffith', 0.62117368093485603),\n", + " ('emily', 0.62082651898031915),\n", + " ('drew', 0.62082651898031915),\n", + " ('moved', 0.6197197120051281),\n", + " ('profound', 0.61903920840622351),\n", + " ('families', 0.61903920840622351),\n", + " ('innocent', 0.61851219917136446),\n", + " ('versions', 0.61730910416844087),\n", + " ('eddie', 0.61691981517206107),\n", + " ('criticism', 0.61651395453902935),\n", + " 
('nature', 0.61594514653194088),\n", + " ('recognized', 0.61518563909023349),\n", + " ('sexuality', 0.61467556511845012),\n", + " ('contract', 0.61400986000122149),\n", + " ('brian', 0.61344043794920278),\n", + " ('remembered', 0.6131044728864089),\n", + " ('determined', 0.6123858239154869),\n", + " ('offers', 0.61207935747116349),\n", + " ('pleasure', 0.61195702582993206),\n", + " ('washington', 0.61180154110599294),\n", + " ('images', 0.61159731359583758),\n", + " ('games', 0.61067095873570676),\n", + " ('academy', 0.60872983874736208),\n", + " ('fashioned', 0.60798937221963845),\n", + " ('melodrama', 0.60749173598145145),\n", + " ('peoples', 0.60613580357031549),\n", + " ('charismatic', 0.60613580357031549),\n", + " ('rough', 0.60613580357031549),\n", + " ('dealing', 0.60517840761398811),\n", + " ('fine', 0.60496962268013299),\n", + " ('tap', 0.60391604683200273),\n", + " ('trio', 0.60157998703445481),\n", + " ('russell', 0.60120968523425966),\n", + " ('figures', 0.60077386042893011),\n", + " ('ward', 0.60005675749393339),\n", + " ('shine', 0.59911823091166894),\n", + " ('brady', 0.59911823091166894),\n", + " ('job', 0.59845562125168661),\n", + " ('satisfied', 0.59652034487087369),\n", + " ('river', 0.59637962862495086),\n", + " ('brown', 0.595773016534769),\n", + " ('believable', 0.59566072133302495),\n", + " ('bound', 0.59470710774669278),\n", + " ('always', 0.59470710774669278),\n", + " ('hall', 0.5933967777928858),\n", + " ('cook', 0.5916777203950857),\n", + " ('claire', 0.59136448625000293),\n", + " ('broadway', 0.59033768669372433),\n", + " ('anna', 0.58778666490211906),\n", + " ('peace', 0.58628403501758408),\n", + " ('visually', 0.58539431926349916),\n", + " ('falk', 0.58525821854876026),\n", + " ('morality', 0.58525821854876026),\n", + " ('growing', 0.58466653756587539),\n", + " ('experiences', 0.58314628534561685),\n", + " ('stood', 0.58314628534561685),\n", + " ('touch', 0.58122926435596001),\n", + " ('lives', 0.5810976767513224),\n", + " ('kubrick', 
0.58066919713325493),\n", + " ('timing', 0.58047401805583243),\n", + " ('struggles', 0.57981849525294216),\n", + " ('expressions', 0.57981849525294216),\n", + " ('authentic', 0.57848427223980559),\n", + " ('helen', 0.57763429343810091),\n", + " ('pre', 0.57700753064729182),\n", + " ('quirky', 0.5753641449035618),\n", + " ('young', 0.57531672344534313),\n", + " ('inner', 0.57454143815209846),\n", + " ('mexico', 0.57443087372056334),\n", + " ('clint', 0.57380042292737909),\n", + " ('sisters', 0.57286101468544337),\n", + " ('realism', 0.57226528899949558),\n", + " ('personalities', 0.5720692490067093),\n", + " ('french', 0.5720692490067093),\n", + " ('surprises', 0.57113222999698177),\n", + " ('adventures', 0.57113222999698177),\n", + " ('overcome', 0.5697681593994407),\n", + " ('timothy', 0.56953322459276867),\n", + " ('tales', 0.56909453188996639),\n", + " ('war', 0.56843317302781682),\n", + " ('civil', 0.5679840376059393),\n", + " ('countries', 0.56737779327091187),\n", + " ('streep', 0.56710645966458029),\n", + " ('tradition', 0.56685345523565323),\n", + " ('oliver', 0.56673325570428668),\n", + " ('australia', 0.56580775818334383),\n", + " ('understanding', 0.56531380905006046),\n", + " ('players', 0.56509525370004821),\n", + " ('knowing', 0.56489284503626647),\n", + " ('rogers', 0.56421349718405212),\n", + " ('suspenseful', 0.56368911332305849),\n", + " ('variety', 0.56368911332305849),\n", + " ('true', 0.56281525180810066),\n", + " ('jr', 0.56220982311246936),\n", + " ('psychological', 0.56108745854687891),\n", + " ('branagh', 0.55961578793542266),\n", + " ('wealth', 0.55961578793542266),\n", + " ('performing', 0.55961578793542266),\n", + " ('odds', 0.55961578793542266),\n", + " ('sent', 0.55961578793542266),\n", + " ('reminiscent', 0.55961578793542266),\n", + " ('grand', 0.55961578793542266),\n", + " ('overwhelming', 0.55961578793542266),\n", + " ('brothers', 0.55891181043362848),\n", + " ('howard', 0.55811089675600245),\n", + " ('david', 
0.55693122256475369),\n", + " ('generation', 0.55628799784274796),\n", + " ('grow', 0.55612538299565417),\n", + " ('survival', 0.55594605904646033),\n", + " ('mainstream', 0.55574731115750231),\n", + " ('dick', 0.55431073570572953),\n", + " ('charm', 0.55288175575407861),\n", + " ('kirk', 0.55278982286502287),\n", + " ('twists', 0.55244729845681018),\n", + " ('gangster', 0.55206858230003986),\n", + " ('jeff', 0.55179306225421365),\n", + " ('family', 0.55116244510065526),\n", + " ('tend', 0.55053307336110335),\n", + " ('thanks', 0.55049088015842218),\n", + " ('world', 0.54744234723432639),\n", + " ('sutherland', 0.54743536937855164),\n", + " ('life', 0.54695514434959924),\n", + " ('disc', 0.54654370636806993),\n", + " ('bug', 0.54654370636806993),\n", + " ('tribute', 0.5455111817538808),\n", + " ('europe', 0.54522705048332309),\n", + " ('sacrifice', 0.54430155296238014),\n", + " ('color', 0.54405127139431109),\n", + " ('superior', 0.54333490233128523),\n", + " ('york', 0.54318235866536513),\n", + " ('pulls', 0.54266622962164945),\n", + " ('hearts', 0.54232429082536171),\n", + " ('jackson', 0.54232429082536171),\n", + " ('enjoy', 0.54124285135906114),\n", + " ('redemption', 0.54056759296472823),\n", + " ('madness', 0.540384426007535),\n", + " ('hamilton', 0.5389965007326869),\n", + " ('stands', 0.5389965007326869),\n", + " ('trial', 0.5389965007326869),\n", + " ('greek', 0.5389965007326869),\n", + " ('each', 0.5388212312554177),\n", + " ('faithful', 0.53773307668591508),\n", + " ('received', 0.5372768098531604),\n", + " ('jealous', 0.53714293208336406),\n", + " ('documentaries', 0.53714293208336406),\n", + " ('different', 0.53709860682460819),\n", + " ('describes', 0.53680111016925136),\n", + " ('shorts', 0.53596159703753288),\n", + " ('brilliance', 0.53551823635636209),\n", + " ('mountains', 0.53492317534505118),\n", + " ('share', 0.53408248593025787),\n", + " ('dealt', 0.53408248593025787),\n", + " ('providing', 0.53329847961804933),\n", + " ('explore', 
0.53329847961804933),\n", + " ('series', 0.5325809226575603),\n", + " ('fellow', 0.5323318289869543),\n", + " ('loves', 0.53062825106217038),\n", + " ('olivier', 0.53062825106217038),\n", + " ('revolution', 0.53062825106217038),\n", + " ('roman', 0.53062825106217038),\n", + " ('century', 0.53002783074992665),\n", + " ('musical', 0.52966871156747064),\n", + " ('heroic', 0.52925932545482868),\n", + " ('ironically', 0.52806743020049673),\n", + " ('approach', 0.52806743020049673),\n", + " ('temple', 0.52806743020049673),\n", + " ('moves', 0.5279372642387119),\n", + " ('gift', 0.52702030968597136),\n", + " ('julie', 0.52609309589677911),\n", + " ('tells', 0.52415107836314001),\n", + " ('radio', 0.52394671172868779),\n", + " ('uncle', 0.52354439617376536),\n", + " ('union', 0.52324814376454787),\n", + " ('deep', 0.52309571635780505),\n", + " ('reminds', 0.52157841554225237),\n", + " ('famous', 0.52118841080153722),\n", + " ('jazz', 0.52053443789295151),\n", + " ('dennis', 0.51987545928590861),\n", + " ('epic', 0.51919387343650736),\n", + " ('adult', 0.519167695083386),\n", + " ('shows', 0.51915322220375304),\n", + " ('performed', 0.5191244265806858),\n", + " ('demons', 0.5191244265806858),\n", + " ('eric', 0.51879379341516751),\n", + " ('discovered', 0.51879379341516751),\n", + " ('youth', 0.5185626062681431),\n", + " ('human', 0.51851411224987087),\n", + " ('tarzan', 0.51813827061227724),\n", + " ('ourselves', 0.51794309153485463),\n", + " ('wwii', 0.51758240622887042),\n", + " ('passion', 0.5162164724008671),\n", + " ('desire', 0.51607497965213445),\n", + " ('pays', 0.51581316527702981),\n", + " ('fox', 0.51557622652458857),\n", + " ('dirty', 0.51557622652458857),\n", + " ('symbolism', 0.51546600332249293),\n", + " ('sympathetic', 0.51546600332249293),\n", + " ('attitude', 0.51530993621331933),\n", + " ('appearances', 0.51466440007315639),\n", + " ('jeremy', 0.51466440007315639),\n", + " ('fun', 0.51439068993048687),\n", + " ('south', 0.51420972175023116),\n", + " 
('arrives', 0.51409894911095988),\n", + " ('present', 0.51341965894303732),\n", + " ('com', 0.51326167856387173),\n", + " ('smile', 0.51265880484765169),\n", + " ('fits', 0.51082562376599072),\n", + " ('provided', 0.51082562376599072),\n", + " ('carter', 0.51082562376599072),\n", + " ('ring', 0.51082562376599072),\n", + " ('aging', 0.51082562376599072),\n", + " ('countryside', 0.51082562376599072),\n", + " ('alan', 0.51082562376599072),\n", + " ('visit', 0.51082562376599072),\n", + " ('begins', 0.51015650363396647),\n", + " ('success', 0.50900578704900468),\n", + " ('japan', 0.50900578704900468),\n", + " ('accurate', 0.50895471583017893),\n", + " ('proud', 0.50800474742434931),\n", + " ('daily', 0.5075946031845443),\n", + " ('atmospheric', 0.50724780241810674),\n", + " ('karloff', 0.50724780241810674),\n", + " ('recently', 0.50714914903668207),\n", + " ('fu', 0.50704490092608467),\n", + " ('horrors', 0.50656122497953315),\n", + " ('finding', 0.50637127341661037),\n", + " ('lust', 0.5059356384717989),\n", + " ('hitchcock', 0.50574947073413001),\n", + " ('among', 0.50334004951332734),\n", + " ('viewing', 0.50302139827440906),\n", + " ('shining', 0.50262885656181222),\n", + " ('investigation', 0.50262885656181222),\n", + " ('duo', 0.5020919437972361),\n", + " ('cameron', 0.5020919437972361),\n", + " ('finds', 0.50128303100539795),\n", + " ('contemporary', 0.50077528791248915),\n", + " ('genuine', 0.50046283673044401),\n", + " ('frightening', 0.49995595152908684),\n", + " ('plays', 0.49975983848890226),\n", + " ('age', 0.49941323171424595),\n", + " ('position', 0.49899116611898781),\n", + " ('continues', 0.49863035067217237),\n", + " ('roles', 0.49839716550752178),\n", + " ('james', 0.49837216269470402),\n", + " ('individuals', 0.49824684155913052),\n", + " ('brought', 0.49783842823917956),\n", + " ('hilarious', 0.49714551986191058),\n", + " ('brutal', 0.49681488669639234),\n", + " ('appropriate', 0.49643688631389105),\n", + " ('dance', 0.49581998314812048),\n", + " 
('league', 0.49578774640145024),\n", + " ('helping', 0.49578774640145024),\n", + " ('answers', 0.49578774640145024),\n", + " ('stunts', 0.49561620510246196),\n", + " ('traveling', 0.49532143723002542),\n", + " ('thoroughly', 0.49414593456733524),\n", + " ('depicted', 0.49317068852726992),\n", + " ('honor', 0.49247648509779424),\n", + " ('combination', 0.49247648509779424),\n", + " ('differences', 0.49247648509779424),\n", + " ('fully', 0.49213349075383811),\n", + " ('tracy', 0.49159426183810306),\n", + " ('battles', 0.49140753790888908),\n", + " ('possibility', 0.49112055268665822),\n", + " ('romance', 0.4901589869574316),\n", + " ('initially', 0.49002249613622745),\n", + " ('happy', 0.4898997500608791),\n", + " ('crime', 0.48977221456815834),\n", + " ('singing', 0.4893852925281213),\n", + " ('especially', 0.48901267837860624),\n", + " ('shakespeare', 0.48754793889664511),\n", + " ('hugh', 0.48729512635579658),\n", + " ('detail', 0.48609484250827351),\n", + " ('guide', 0.48550781578170082),\n", + " ('companion', 0.48550781578170082),\n", + " ('julia', 0.48550781578170082),\n", + " ('san', 0.48550781578170082),\n", + " ('desperation', 0.48550781578170082),\n", + " ('strongly', 0.48460242866688824),\n", + " ('necessary', 0.48302334245403883),\n", + " ('humanity', 0.48265474679929443),\n", + " ('drama', 0.48221998493060503),\n", + " ('warming', 0.48183808689273838),\n", + " ('intrigue', 0.48183808689273838),\n", + " ('nonetheless', 0.48183808689273838),\n", + " ('cuba', 0.48183808689273838),\n", + " ('planned', 0.47957308026188628),\n", + " ('pictures', 0.47929937011921681),\n", + " ('broadcast', 0.47849024312305422),\n", + " ('nine', 0.47803580094299974),\n", + " ('settings', 0.47743860773325364),\n", + " ('history', 0.47732966933780852),\n", + " ('ordinary', 0.47725880012690741),\n", + " ('trade', 0.47692407209030935),\n", + " ('primary', 0.47608267532211779),\n", + " ('official', 0.47608267532211779),\n", + " ('episode', 0.47529620261150429),\n", + " ('role', 
0.47520268270188676),\n", + " ('spirit', 0.47477690799839323),\n", + " ('grey', 0.47409361449726067),\n", + " ('ways', 0.47323464982718205),\n", + " ('cup', 0.47260441094579297),\n", + " ('piano', 0.47260441094579297),\n", + " ('familiar', 0.47241617565111949),\n", + " ('sinister', 0.47198579044972683),\n", + " ('reveal', 0.47171449364936496),\n", + " ('max', 0.47150852042515579),\n", + " ('dated', 0.47121648567094482),\n", + " ('discovery', 0.47000362924573563),\n", + " ('vicious', 0.47000362924573563),\n", + " ('losing', 0.47000362924573563),\n", + " ('genuinely', 0.46871413841586385),\n", + " ('hatred', 0.46734051182625186),\n", + " ('mistaken', 0.46702300110759781),\n", + " ('dream', 0.46608972992459924),\n", + " ('challenge', 0.46608972992459924),\n", + " ('crisis', 0.46575733836428446),\n", + " ('photographed', 0.46488852857896512),\n", + " ('machines', 0.46430560813109778),\n", + " ('critics', 0.46430560813109778),\n", + " ('bird', 0.46430560813109778),\n", + " ('born', 0.46411383518967209),\n", + " ('detective', 0.4636633473511525),\n", + " ('higher', 0.46328467899699055),\n", + " ('remains', 0.46262352194811296),\n", + " ('inevitable', 0.46262352194811296),\n", + " ('soviet', 0.4618180446592961),\n", + " ('ryan', 0.46134556650262099),\n", + " ('african', 0.46112595521371813),\n", + " ('smaller', 0.46081520319132935),\n", + " ('techniques', 0.46052488529119184),\n", + " ('information', 0.46034171833399862),\n", + " ('deserved', 0.45999798712841444),\n", + " ('cynical', 0.45953232937844013),\n", + " ('lynch', 0.45953232937844013),\n", + " ('francisco', 0.45953232937844013),\n", + " ('tour', 0.45953232937844013),\n", + " ('spielberg', 0.45953232937844013),\n", + " ('struggle', 0.45911782160048453),\n", + " ('language', 0.45902121257712653),\n", + " ('visual', 0.45823514408822852),\n", + " ('warner', 0.45724137763188427),\n", + " ('social', 0.45720078250735313),\n", + " ('reality', 0.45719346885019546),\n", + " ('hidden', 0.45675840249571492),\n", + " 
('breaking', 0.45601738727099561),\n", + " ('sometimes', 0.45563021171182794),\n", + " ('modern', 0.45500247579345005),\n", + " ('surfing', 0.45425527227759638),\n", + " ('popular', 0.45410691533051023),\n", + " ('surprised', 0.4534409399850382),\n", + " ('follows', 0.45245361754408348),\n", + " ('keeps', 0.45234869400701483),\n", + " ('john', 0.4520909494482197),\n", + " ('defeat', 0.45198512374305722),\n", + " ('mixed', 0.45198512374305722),\n", + " ('justice', 0.45142724367280018),\n", + " ('treasure', 0.45083371313801535),\n", + " ('presents', 0.44973793178615257),\n", + " ('years', 0.44919197032104968),\n", + " ('chief', 0.44895022004790319),\n", + " ('shadows', 0.44802472252696035),\n", + " ('closely', 0.44701411102103689),\n", + " ('segments', 0.44701411102103689),\n", + " ('lose', 0.44658335503763702),\n", + " ('caine', 0.44628710262841953),\n", + " ('caught', 0.44610275383999071),\n", + " ('hamlet', 0.44558510189758965),\n", + " ('chinese', 0.44507424620321018),\n", + " ('welcome', 0.44438052435783792),\n", + " ('birth', 0.44368632092836219),\n", + " ('represents', 0.44320543609101143),\n", + " ('puts', 0.44279106572085081),\n", + " ('fame', 0.44183275227903923),\n", + " ('closer', 0.44183275227903923),\n", + " ('visuals', 0.44183275227903923),\n", + " ('web', 0.44183275227903923),\n", + " ('criminal', 0.4412745608048752),\n", + " ('minor', 0.4409224199448939),\n", + " ('jon', 0.44086703515908027),\n", + " ('liked', 0.44074991514020723),\n", + " ('restaurant', 0.44031183943833246),\n", + " ('flaws', 0.43983275161237217),\n", + " ('de', 0.43983275161237217),\n", + " ('searching', 0.4393666597838457),\n", + " ('rap', 0.43891304217570443),\n", + " ('light', 0.43884433018199892),\n", + " ('elizabeth', 0.43872232986464677),\n", + " ('marry', 0.43861731542506488),\n", + " ('oz', 0.43825493093115531),\n", + " ('controversial', 0.43825493093115531),\n", + " ('learned', 0.43825493093115531),\n", + " ('slowly', 0.43785660389939979),\n", + " ('bridge', 
0.43721380642274466),\n", + " ('thrilling', 0.43721380642274466),\n", + " ('wayne', 0.43721380642274466),\n", + " ('comedic', 0.43721380642274466),\n", + " ('married', 0.43658501682196887),\n", + " ('nazi', 0.4361020775700542),\n", + " ('murder', 0.4353180712578455),\n", + " ('physical', 0.4353180712578455),\n", + " ('johnny', 0.43483971678806865),\n", + " ('michelle', 0.43445264498141672),\n", + " ('wallace', 0.43403848055222038),\n", + " ('silent', 0.43395706390247063),\n", + " ('comedies', 0.43395706390247063),\n", + " ('played', 0.43387244114515305),\n", + " ('international', 0.43363598507486073),\n", + " ('vision', 0.43286408229627887),\n", + " ('intelligent', 0.43196704885367099),\n", + " ('shop', 0.43078291609245434),\n", + " ('also', 0.43036720209769169),\n", + " ('levels', 0.4302451371066513),\n", + " ('miss', 0.43006426712153217),\n", + " ('ocean', 0.4295626596872249),\n", + " ...]" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words with the highest positive-to-negative log ratios (most strongly associated with \"POSITIVE\" reviews)\n", + "pos_neg_ratios.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('boll', -4.0778152602708904),\n", + " ('uwe', -3.9218753018711578),\n", + " ('seagal', -3.3202501058581921),\n", + " ('unwatchable', -3.0269848170580955),\n", + " ('stinker', -2.9876839403711624),\n", + " ('mst', -2.7753833211707968),\n", + " ('incoherent', -2.7641396677532537),\n", + " ('unfunny', -2.5545257844967644),\n", + " ('waste', -2.4907515123361046),\n", + " ('blah', -2.4475792789485005),\n", + " ('horrid', -2.3715779644809971),\n", + " ('pointless', -2.3451073877136341),\n", + " ('atrocious', -2.3187369339642556),\n", + " ('redeeming', -2.2667790015910296),\n", + " ('prom', -2.2601040980178784),\n", + " ('drivel', -2.2476029585766928),\n", + " ('lousy', -2.2118080125207054),\n", + 
" ('worst', -2.1930856334332267),\n", + " ('laughable', -2.172468615469592),\n", + " ('awful', -2.1385076866397488),\n", + " ('poorly', -2.1326133844207011),\n", + " ('wasting', -2.1178155545614512),\n", + " ('remotely', -2.111046881095167),\n", + " ('existent', -2.0024805005437076),\n", + " ('boredom', -1.9241486572738005),\n", + " ('miserably', -1.9216610938019989),\n", + " ('sucks', -1.9166645809588516),\n", + " ('uninspired', -1.9131499212248517),\n", + " ('lame', -1.9117232884159072),\n", + " ('insult', -1.9085323769376259)]" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"NEGATIVE\" label\n", + "list(reversed(pos_neg_ratios.most_common()))[0:30]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Transforming Text into Numbers" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
ITBiD8G9077fkJiJ5O56gR2HhXNgC4ltRSO/m48J5w7/z6hcuL+53Ezo7TveizvMGkFXy5M1S\nJ29wWB6cv0jZUzJxOjq/keXLl1v/kbRvlnHl6vzgIJDnfydP3jwI77XXXp3s3aYCOokSHhxxxBGd\nlElfaCEjzn+PzFE+ZlHnSEvAQCcXXnjhMP8LXrIdSXFpivomAJoTLCG+fv60DGkOOuggl9RgPUGv\nrJJmmixNHbmJSJaOdwoSzc2Zl7gRcKRBYJXOzOTSQkT8myXKx4JpDRcxDmchJI9+ru4k30V2dpL6\n8qZhXjYc8S9vmf4/Zdqy8uRNW5dLj+UBX4x+Tsm4ut238wc577zzrC6OGLnr+hYCvRDI87+TJ28v\nvbpdf9/73te5/OMf/7hznPfAd+TEZ8ONA5TLy60fNZWoq05cBFF+48fn5+PY+fa59FHfLKZwgzzT\nzZ/61KeikkWe86f2IxOETrrpGXfa6Rc1LYP/iHshZ2xFL//Zzzjsj53hKKtEq3biynG/C/sOzDK5\nhKU+gTKdj7+cNWj0sNgSQSM6dYWXDJEvvO6a374EJshOPdTpL/UML99lyRKSVT90de3y20SZcdf8\n8/7yXadHcJN0ykQvJ36+cJtJQ/1OFzDwxZ3nO6ynny587JZlhs+n/e0v7UIH+oG+TSqk9dtHGZSZ\nVVgemWZZcvDgGPrGN76Rtbpc+aKW87Jcl/P9FnAAO+4Lt0QXHMMfAqGRJvBRqETPfuNSdH0O37zl\n1u3/jvs2sOYlbpb/P8+zMiz+czvueeA/+xhrnPjPUz9N+Nh/BpM/fL3bb1dfeNzplsevD12j0ro0\nfvtJFyU+hq4sfzmvnydcnksf/ga7sPjjlj/mhtPl+R3dwpQlZul4n1TQUDd4+f9gYVCS3izhzsii\nn58nPMDHXcva2X55eYiIu6nczdytG4t6IEb9M6AHDxf6kutRH66Rxunsf4fx7taO8DUCbKXZYI3B\nl4G/34N/N8LRL32oB7yIawH+fDuiAS5RH9Jz70BQyJMnIFq47wbhN5imIcpxmNTt/y5tu8LPcvf8\nd+31n6VpiQhlxz1b3HMm6hnj1+nSue8w4XBEhG9/oHbp+WaMYyxy5yjDlygd3bM7rIufzx2DmSvb\nfXcjCnH3jMvLOOTaFVdHuJ9curzfhRCRtB2PtcI1nm//puh2jVopbAYAACaLSURBVMaGO8gvh2M6\nNwxWWv2oxycHvn69rmXpbL+utESk282MrnGS9sERVw5YR+kQ7pekv6P6L67uqPNpIjz6Az549Euo\nCwtEN0E3yEoZgjXDEQmIB7976ROnB21xAdEgJRCVrGXF1dGm8/QpOOWVuv3fpX0BoP3+cyM8gPrP\n+bRExGHLs9ivg2cQ5CDqGevycI363POKZzO6cN6d49sfYxizfMLh8lBmHIHhWjgf5VIX4ref83Hi\nt89/oY9LT51hndA3PMa5/PSLa3dcP7i0eb7jW5ih1KQd74MHCGEJW0vCLA0w/TQA1Q1MV35S/UhP\nea4Dwp3U7Rp503a2X17UPwn1O11oty/dbmY/XfiYgS6NKTWc3/+NDn6fOl3TflMGZeURBlgG1iQS\nJh/h30nKSJPGTX8kzZM2fa9yaR+DYFmEwREc7iusJpJoBPi/KIKs1en/DkILGUkj/mAbfq6lKacf\naf1nMAP+oIhPsBxJKqPthRKRMhRUmeUhwIBU5Fs3/6w+qUpKRMgTJntZW530IR9FOnwLSdb64/Ll\nsXCga56Bi7oJvw1BSDtYxLWn23n0hRByf0Xh3C3vIFxLQ5aT4FGH/7ssfY1VwU1rJHmbT4JF1jT+\nixTPI//ll5dDpyfPF9IOgtA/7hkOJmXKKAoPKpMMIAJXXHGFjXzKio0iBe/05557zgZ5Y437L37x\ni2HF/8Vf/IUhWixLnj/ykY8MW/I2LGHKHxs2bLDr93utBHDxQ4KBe
UQNxPLgfJEhy6MCl42ouMeJ\nrGWAybnnnmumT59u5s+fX2i7eqhsWJIcvOkaljWyZYPk9wjcfPPNNvzAjTfeWCgkVf3f8f9EAL6A\n8KZuDytSXETSgFCVGnujm3K+Ht3ScS2wDGSKl9Sr3Lpd9zEpvc1lshyVXW8E2KiqqA24qm4p0wKz\nZ8+2/gq9dOn1lt7req/y/etYnPJYM/yy0lpV8N3ACpJ0qsqvq6hjdOYe41MUDkXpVkU5YMAqrdGj\nR7cGjyz+IT72zhpR9lu3X2fUse8bEpCCjjXAP677FFJUu7Kc861V/bAAaWomSy+1JA8PRQYqBoum\nC235q7/6K/uQh0jEkYm48377KauIKSvqoqwihfKStIE5ewb/ItpRhP7oU/RUYBF69aMM+oA+4+P6\nAyyqJIhFtjtvW/B1cYN9UVO0WduHH4TvF+H0wsEzyn8vaz11z0c/0HampPxpqrL01tRMgPYgC9Mz\nSNFm4n5j+vDDD5tbbrllWKRDoqU6IQCQm26JmpJx6dx3t+kblybu2wUpK3O/mG6b5tGnRHRcu3Zt\np81xuvbzPHqxWRtTZ20OY8+9409TRG1uyLTVI488MmxH8X72RVF1cR9OnTp1WHuLKlvlDA4CIiKD\n09eRLXXzu8GbWq0GrUhlu5w89NBDrQ/EaaedFpkKchBMRdldKknAXjO9CEmWsO/gSV39GGij/Ebq\nSkJcpzgy0vT7zbXHffukN8m9xT0CQSFfr/vQ1VHH79NPP90cffTR5tJLL62jetKpIQiIiDSko8pU\nk8GBEL9NfZhgDbnmmmvM5s2bY2EKk4okb60UFs4XW0FwIYoYdEtfxDWf+OAEedddd5mNGzfWmlTW\nnSwl6Rf6Opgm6yTNYv2iv9gjhNDgTRT+N7CGtI1UNrEvmq6ziEjTe7AA/RnMeIvDnNy0tzPeLI86\n6iizcOFCc/zxx0eiQfuQbm2LG1gon/y9LBw8lKNM8JEKFXwSHbH2/PM//7N5+umne+pacPWZimP3\n0MCHpTGracCYAddJEX1NmZSzZs0au+rEld2Ub1lDmtJT9ddTRKT+fdQXDdnxeMuWLY17O0uidxqr\nhgObPE7+93//13z0ox+NtDK4ASrLG7ErP++3G9BuvfVWEzc1lbeOovND7sDs3nvvjSWQRdeZtjz/\nHsDHqBcZTVs+6fEVWbRoUe2tWOG2Ob27WSHDefRbCMQhICISh8yAnWcww7IwZ84cU3RckbKgZKDA\nNMx3nLWDa3lJAthE+ZcwmHKtjAEqDWZMdbCV+tKlS9Nkqzytm1Kry1QS/ek7mea9b5ICjGUhWHnS\nGOsQ1kMsWk215CTtF6XrHwIiIv3DuvY18YDBVIwJuurBtRdYSawADCxIHEnpVYd/nfooD1z4JmDb\nu9/9bhPEg6hsSgb93KDQjYz57ajbcdXmfXBzksTJ1KUt8pv7CdLTBIsW/wdTpkyxLwBN9Skrsu9U\nVjEIiIgUg2NrSuEt9ZJLLqn1Ekv3MAwCIHVddlyENcTvWEdseGv2fQTi/Ev8vGUdVz2Q520XfdRP\nh8cq+6obVi4CLkt6ua/rKjNnzrTWt6Y62NYV10HXS0Rk0O+AiPbXeVWDIyF77rlnV3+WokkIMFE3\nUzS9pq78t+yyfAvQx1lDmr5qoUwyRZ8V7WQK9mXIv//7v9upUVbS1NEiWefnQhn9oTL7h4CISP+w\nblRN7qGzePHi2jwUHQkByG7BupzloogpGddplEn9DBBpSE54ICzS/E8fsV9P0/dxcVYR3z/D4Z7l\nu19EMItucXnc/bV169ZaWiTd86Db/11c23ReCPRCQESkF0IDfJ2HT10iYfKg/vSnP232228/u8rA\nRUmN6p40RCEqf/gclgfqc8QGcoE+Wd5ayecPuP4UT7jebr/Rgby01enVLX3drxGQrtsS7G76hzHt\nl5NpN53SXEN/xPWjmx6tg88I9
xk+IYhIiIVBf0pA4E//XyAllKsiW4BAsNmRHfhZTbPvvvsaBosq\n5MEHH7R+BGeccYYdrN75znfGqlEGCWGA2GuvvTp1Uj9Levnupksng3cAocEq4j7btm0zP/7xjy2x\n+fWvfz2sHi/biMNvf/vbxr09j7jYwBPvete7zLPPPmu453oJg+Mrr7xiMWMQZ/oLUuYw7ZW/TtfD\nJATdDjzwQPu/NmvWLPPb3/7WfPzjH69EZf6XWA7O0vXbb789cvl6JYqp0tYhIItI67q0+AZhETjz\nzDPN/vvvb4gG6d7ciq9peIkMOCwnfuyxx6wVhCWO3awQUQ/14SWm+8WDuJvFomjSQ3t9f4Zu0zhY\nq5ocDTfcE/QdlgzfWuSn8Z1My/S78ess+7jX/cp1rIDIggULci9DT9oe7sOvfvWrlnxcd911PX2i\nkpardEIgFoGydtNTue1CgF1fb7rpJrsjIzupBgNGaQ10dQWEZ2jGjBmdHWw5HwzUsfUm2ZU2NrN3\ngXqSlpU0nVd84kMwpnz3QS8n7HjaDQuXrknffptcH0S1vUltitOVvk36P8T/Hf8LZf/foWvgjG3r\nmjZtWmL94tqo80IgKQImaUKlEwIgwMOTB2LAbO13kYMhZbuHLg/CqEHeDVDh3ohKG06T5Dc6pGkT\n6fn0Q9CLdr700ksW/37U2c86zj333KGrrrrKtjFNH/RTxyLqynLPcN/7/3dF3e+0h7L5v4MI8lm/\nfn0RzVQZQiAxAn8SayrRBSEQgQDTMjfeeGPHhE6ERT5M2TBVkVYwuRMumjKYiti+fbuN2Eicgiin\nQ3wsOE9dmJARTNjkzSvognSb/gnXAR7o4XQJXy/yN3rR9t/97ncmIGpFFl2Lsg4//HDzZ3/2Z7aN\nafqgFsonVKLXdExcMdz37v+OKTn8R/DZYouDLP936IFTLHFBmOrC52bJkiV248i4PZvidNN5IZAX\nAfmI5EVQ+Q3BmPDjCN6k7H41DJJjx461PgzAwxJTHnY7duywaO3atcume/755+3vyZMnm1NOOcVM\nmjQplUMcxIEHdPCGGUlabOEJ//Aw7+YP0qsY8kcRp175slyHuL366qtdg7llKbfqPGCIL0Rbg2Vl\nJSFx/cL/HRF+2eiQD5sIsqpswoQJnSzjx483r7/+uv0d9393xBFH9M3vq6OYDoSAh4CIiAeGDvMj\ngGUgMKvbFR08+BCsHOyF4j8gJ06caK0YWBTyyDe/+U3zvve9L5UVw6/P6ZuXRFAOA00/3uSxPiFt\nC7HdZiJSNAnx72GO3X3MSqpu/3cQk7333rsv92lYR/0WAnEIiIjEIaPztUfAPdyxikB+0pIJ8vMA\nL4o8YKGBWKFPmdJWIkJfYDkLJpbLhK/vZbv7NC/p7rviqlAI9AkB+Yj0CWhVUzwCTMm4gR8Swhs1\ng1kSyeIP0qtcCA2ESJINgbIJXDat8uVy95lISD4clbvdCIiItLt/W9u6KJ8MyAhvn+4NNK7x5GVg\nKGNwcIQorm6dHxwEnIWsjPtscFBUSwcBARGRQejllrURohG3SsZNs7g3Ub/pWEscgSnz7RvdepEh\nXy8d/x4BMGvLoO1ISJn3me4bIdAWBERE2tKTA9QONyUT12Rn7YB0OGGQ45PWj8TlT/NN/ZCepNNE\nacpuc1r6FSfmpotISNN7UPr3G4F39LtC1ddeBBh48ZFgWa5bKhjVWre0Fw9+9tVI8xbsLBpR5frn\neBN10yR/+qd/av76r/+6MKdUv564YywzSXWNKyPuPMuhN27cGHe5seeDwFqN1d0pLhLikNC3EEiO\ngIhIcqyUMgIBrAxPPvmkDUrmYhkcdthhNobI3LlzI3IYu7SXJb2LFy82q1atMkE0Rxugi03t3NRK\nVEbqipuSiUrPOVZh/OY3v4m7XOp54pIwMHVrUxYFxo0bZ/HOkrfOeYh3wb3QVBEJaWrPSe+qEdD
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
uAvnnHFNwOHMTNt4N/jZBhX+cHnJMrbATVLUQSIlAIywivKHx0GOKglgi\nZ5xxho1t4Jw4Gdj9wd3HgAdm1dJNPxe3oZuOZbSXYFXEicgrrFBieqxICTvzpikbcsVbe90Fnw8G\nTawQzockqc6kdzsJJ83jpyMv8XUWLlzYCUgWhHzvGQsEAuULRCZYWmwee+wxax1Br9mzZ9vYJH66\nfh2LhPQLadUjBApGoMi1wEWWxV4sQVOHfQIi0qkiICcjQrSH0xOvwxc/fodflp+GY78cYntECfld\nunA97jzf4RDx/rVwvrg4ItSfpb1RertzxFQI3hrdz9Z8VxniPQuIgb9Q5vDqafdKcTFW6HdiyBQZ\nV4N2VLnXjOKEZLn7lEcI1AOB2k7N4FcRDNSxsS7wq1i3bp1NE+wZE4zvfxQiofKWniUK6h9LKeaI\nMOa8fWLNcYK+AdFKpV/R7XWm6zwrOVx76vS9atUqc+CBB9ZJpa664DdCX6R1YiUfH2cF6FYJlgsc\ngKdOnWqj6bL65tJLL+1pAelWZvgauhCldfPmzTZ0/DXXXGOnbfpxf7k63D0d1k2/hYAQqDcCo+BD\n9VZR2pWFAL4BbNde123a07Z7w4YNJtiAzQ6GafPWIT1kJK0Ta68pGq5DyPfff3/ry9HPwRrfkc9/\n/vMmsL5Y4lMGxpAQ2gQRkggBIdBMBGprEWkmnM3SGufD5cuXN0vpLto+99xz1uehS5JaX8rixNpt\nSS99ixVkzpw5dk+YfpIQgMbqgvVlzZo11iG2CAdbvwNFQnw0dCwEmouALCLN7bvcmjMw4BgazK8X\naqbPrVjGAljaysZtaZ0/M1ZXWjamW+ibpO1w0zM+0bjiiivMypUra4EHbYEMvfnmm2bt2rWFWC9E\nQkq7/VSwEOg7ArKI9B3y+lSIOZupjGXLltVHqYyasAx19OjRiQfvjNX0JRuEgk9SvxHSMtjzQSAh\nrCjDGpGUzJTZMO4zdvjFT2rKlCkdPbPWKRKSFTnlEwL1REAWkXr2S9+0YrDDfM+g1eR59g984APm\nkksuMTNmzOgbdv2oiP5J6jdCWhyjISFFWR6KbqMjSVn1EwkpukdUnhCoHgFZRKrvg0o1wMdg4sSJ\n5u67765UjzyVYw3B55qYJgzG/idPuXXIm8ZvhGBuTMd8/etfry2pvPHGG81+++1nzj///NTwioSk\nhkwZhEAjEJBFpBHdVK6SDNxYRfj2/QzKrbW40g899FC7ZJTlo2GhTb7gE1OH6QpfpyTHvfxGGKSx\nnNRlOqZbm5hCYorm2GOPNfPmzeuWtHNNJKQDhQ6EQOsQEBFpXZdmaxAm87ffftvO5WcroZpcLBFl\nVQZOqkmEQZDB2pekUx9+niqOne5YSXzhPMuwcQhtylJsiAXh4em7cHv8tnEsEhJGRL+FQLsQEBFp\nV39mbg2DGQPyrbfeWlmI7rTKF2UFoJzwjsi9Bse0uhaZHiuPT54IVrZlyxa7RLfIesoui+XFixYt\n6hr3JdzWsnVS+UJACPQfARGR/mNe2xp56DNF04QlsGVbAcJTOiwNrtO0FeTJORejW1OXYGMV4Z4j\n5khY6IM6E8KwvvotBIRANgRERLLh1tpcTHXcddddZuPGjZ2Bro6NZQBjOSjOj/0QfDQY7H3xrRL+\n+X4do9MXv/hFs++++yb2teiXbknrceQ3vGpLJCQpgkonBJqPgIhI8/uw8BbkXWJZuEKhAuuiX3hK\np9+OsBARrCHoccABB4RQas7P008/3e6B46wiIiHN6TtpKgSKQEBEpAgUW1hGXQZ7H1qmY9hMra5x\nMtAv7Ahb5pQOfYT0yyrk90WRxxAP9sNhwzyRkCKRVVlCoBkIiIg0o58q0ZKBbv369TZIVtVLXhnk\nWfKJZA2GVQWIUVM6Rfk9QHKa4M+TBHeWYF9wwQXm4osvTpJcaYSAEGgRAu9oUVvUlIIR4
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "\n", + "review = \"This was a horrible, terrible movie.\"\n", + "\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAi4AAAECCAYAAADZzFwPAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQVdV5/xdNZjIxjRgrM52qFI01ERQVExWNeMMLQy0YiEiNEgOYaJAO\nitIaGYo2TFGQeElQAREjRa0oDEG8AKagosYYkEuSjjUEbP+orZFc/KMzmfe3Pys+57fOfvfZZ1/P\nWXu/zzNz3rPP3uvyrO/a717f/axnPatfTyBGRRFQBBQBRUARUAQUgQog8CcV0FFVVAQUAUVAEVAE\nFAFFwCKgxEVvBEVAEVAEFAFFQBGoDAJKXCrTVaqoIqAIKAKKgCKgCChx0XtAEVAEFAFFQBFQBCqD\ngBKXynSVKqoIKAKKgCKgCCgCSlz0HlAEFAFFQBFQBBSByiCgxKUyXaWKKgKKgCKgCCgCioASF70H\nFAFFQBFQBBQBRaAyCHy8MpqqooqAItAVBH784x+bPXv2mJ07d5q9e/eat99+2+zYsaOXLuPGjTOH\nHHKIGTp0qBkyZIg59dRTzac//ele6fSEIqAIKAJ5EOinkXPzwKd5FYF6IrBp0yazYcMGs2rVKjNg\nwAAzcuRIc8IJJ5jBgwebgw8+2Hzuc59ravh//dd/mf/8z/807777rtm1a5d58cUX7Qcyc8kll5gv\nf/nLSmKaENMfioAikBUBJS5ZkdN8ikDNEPjtb39rli9fbh566CHbshkzZpgLLrjA/MVf/EWmllLe\nxo0bzfr1682yZcvMjTfeaG644YbM5WVSQjMpAopA7RBQH5fadak2SBFIj8A999xjPv/5z5stW7aY\nJUuWmO3bt5tJkyblIhlME1166aVm6dKl1hqDVocffriZOXOmwUKjoggoAopAFgSUuGRBTfMoAjVB\nAP+Vk046yaxZs8Z+nnzySfPFL36x8NZhtVmwYEGDwFDHihUrCq9HC1QEFIH6I6BTRfXvY22hIhCJ\nAFaW+fPnm3nz5lnrSmSikk5CmKZOnWqOOeYYOz2lTrwlAa3FKgI1REAtLjXsVG2SIhCHAL4nU6ZM\nsRaWzZs3d5y0oBsWl61bt5pBgwbZKapf/OIXcSrrNUVAEVAEGgioxaUBhR4oAvVHANIyZswYc+ih\nh3pj6WDK6JZbbjGQqPBqpfr3iLZQEVAE0iKgcVzSIqbpFYGKIiCkZdiwYdbfxJdm4ATMEuvzzjtP\nyYsvnaJ6KAIeI6DExePOUdUUgaIQ8JW0SPtYfYQoeRFE9FsRU
...base64-encoded PNG data elided (rendered image output of sentiment_network_pos.png)...\n", + "text/plain": [ + "" + ] + }, + "execution_count": 27, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review = \"The movie was excellent\"\n", + "\n", + "Image(filename='sentiment_network_pos.png')" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "# Project 2: Creating the Input/Output Data" + ] + }, + { + "cell_type": "code", + "execution_count": 74, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "74074\n" + ] + } + ], + "source": [ + "vocab = set(total_counts.keys())\n", + "vocab_size = len(vocab)\n", + "print(vocab_size)" + ] + }, + { + "cell_type": "code", + "execution_count": 75, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['',\n", + " 'inhabitants',\n", + " 'goku',\n", + " 'stunts',\n", + " 'catepillar',\n", + " 'kristensen',\n", + " 'senegal',\n", + " 'goddess',\n", + " 'distroy',\n", + " 'unexplainably',\n", + " 'concoctions',\n", + " 'petite',\n", + " 'scribe',\n", + " 'stevson',\n", + " 'sctv',\n", + " 'soundscape',\n", + " 'rana',\n", + " 'metamorphose',\n", + " 'immortalizer',\n", + " 'henstridge',\n", + " 'planning',\n", + " 'akiva',\n", + " 'plod',\n", + " 'eko',\n", + " 'orderly',\n", + " 'zeleznice',\n", + " 'verbose',\n", + " 'amplify',\n", + " 'resonation',\n", + " 'critize',\n", + " 'jefferies',\n", + " 'mountainbillies',\n", + " 'steinbichler',\n", + " 'vowel',\n", + " 'rafe',\n", + " 'bonbons',\n", + " 'tulipe',\n", + " 'clot',\n", + " 'distended',\n", + " 'his',\n", + " 'impatiently',\n", + " 'unfortuntly',\n", + " 'lung',\n", + " 'scapegoats',\n", + " 'muzzle',\n", + " 'pscychosexual',\n", + " 'outbid',\n", + " 'obit',\n", + " 
'sideshows',\n", + " 'jugde',\n", + " 'particolare',\n", + " 'kevloun',\n", + " 'masterful',\n", + " 'quartier',\n", + " 'unravelling',\n", + " 'necessarily',\n", + " 'antiques',\n", + " 'strutts',\n", + " 'tilts',\n", + " 'disconcert',\n", + " 'dossiers',\n", + " 'sorriest',\n", + " 'blart',\n", + " 'iberia',\n", + " 'situations',\n", + " 'frmann',\n", + " 'daniell',\n", + " 'rays',\n", + " 'pried',\n", + " 'khoobsurat',\n", + " 'leavitt',\n", + " 'caiano',\n", + " 'sagan',\n", + " 'attractiveness',\n", + " 'kitaparaporn',\n", + " 'hamilton',\n", + " 'massages',\n", + " 'reasonably',\n", + " 'horgan',\n", + " 'chemist',\n", + " 'audrey',\n", + " 'jana',\n", + " 'dutch',\n", + " 'override',\n", + " 'spasms',\n", + " 'resumed',\n", + " 'stinson',\n", + " 'widows',\n", + " 'stonewall',\n", + " 'palatial',\n", + " 'neuman',\n", + " 'abandon',\n", + " 'anglophile',\n", + " 'marathon',\n", + " 'chevette',\n", + " 'unscary',\n", + " 'eponymously',\n", + " 'spoilerific',\n", + " 'fleashens',\n", + " 'brigand',\n", + " 'politeness',\n", + " 'clued',\n", + " 'dermatonecrotic',\n", + " 'grady',\n", + " 'mulligan',\n", + " 'ol',\n", + " 'bertolucci',\n", + " 'incubation',\n", + " 'oldboy',\n", + " 'snden',\n", + " 'plaintiffs',\n", + " 'fk',\n", + " 'deply',\n", + " 'franchot',\n", + " 'cyhper',\n", + " 'glorifying',\n", + " 'mazovia',\n", + " 'elizabeth',\n", + " 'palestine',\n", + " 'robby',\n", + " 'wongo',\n", + " 'moshing',\n", + " 'eeeee',\n", + " 'doltish',\n", + " 'bree',\n", + " 'postponed',\n", + " 'gunslinger',\n", + " 'debacles',\n", + " 'kamm',\n", + " 'herman',\n", + " 'rapture',\n", + " 'rolando',\n", + " 'tetsuothe',\n", + " 'premises',\n", + " 'bruck',\n", + " 'loosely',\n", + " 'boylen',\n", + " 'proportions',\n", + " 'grecianized',\n", + " 'wodehousian',\n", + " 'encapsuling',\n", + " 'partly',\n", + " 'posative',\n", + " 'calms',\n", + " 'stadling',\n", + " 'austrailia',\n", + " 'shortland',\n", + " 'wheeling',\n", + " 'darkie',\n", + " 'mckellar',\n", + " 
'cushy',\n", + " 'ooookkkk',\n", + " 'milky',\n", + " 'unfolded',\n", + " 'degrades',\n", + " 'authenticating',\n", + " 'rotheroe',\n", + " 'beart',\n", + " 'neath',\n", + " 'grispin',\n", + " 'intoxicants',\n", + " 'nnette',\n", + " 'slinging',\n", + " 'tsukamoto',\n", + " 'stows',\n", + " 'suddenness',\n", + " 'waqt',\n", + " 'degrading',\n", + " 'camazotz',\n", + " 'blarney',\n", + " 'shakher',\n", + " 'delinquency',\n", + " 'tomreynolds',\n", + " 'insecticide',\n", + " 'charlton',\n", + " 'hare',\n", + " 'wayland',\n", + " 'nakada',\n", + " 'urbane',\n", + " 'sadomasochistic',\n", + " 'larnia',\n", + " 'hyping',\n", + " 'yr',\n", + " 'hebert',\n", + " 'accentuating',\n", + " 'deathrow',\n", + " 'galligan',\n", + " 'unmediated',\n", + " 'treble',\n", + " 'alphabet',\n", + " 'soad',\n", + " 'donen',\n", + " 'lord',\n", + " 'recess',\n", + " 'handsome',\n", + " 'center',\n", + " 'vignettes',\n", + " 'rescuers',\n", + " 'pairings',\n", + " 'uselful',\n", + " 'sanders',\n", + " 'nots',\n", + " 'hatsumomo',\n", + " 'appleby',\n", + " 'tampax',\n", + " 'sprinkling',\n", + " 'defacing',\n", + " 'lofty',\n", + " 'opaque',\n", + " 'tlc',\n", + " 'romagna',\n", + " 'tablespoons',\n", + " 'bernhard',\n", + " 'verger',\n", + " 'acumen',\n", + " 'percentages',\n", + " 'wendingo',\n", + " 'resonating',\n", + " 'vntoarea',\n", + " 'redundancies',\n", + " 'red',\n", + " 'pitied',\n", + " 'belying',\n", + " 'gleefulness',\n", + " 'bibbidi',\n", + " 'heiligt',\n", + " 'gitane',\n", + " 'journalist',\n", + " 'focusing',\n", + " 'plethora',\n", + " 'citizen',\n", + " 'coster',\n", + " 'clunkers',\n", + " 'deplorable',\n", + " 'forgive',\n", + " 'proplems',\n", + " 'magwood',\n", + " 'bankers',\n", + " 'aqua',\n", + " 'donated',\n", + " 'disbelieving',\n", + " 'acomplication',\n", + " 'immediately',\n", + " 'contrasted',\n", + " 'reidelsheimer',\n", + " 'fox',\n", + " 'springs',\n", + " 'toolbox',\n", + " 'contacting',\n", + " 'ace',\n", + " 'washrooms',\n", + " 'raving',\n", + " 
'dynamism',\n", + " 'mae',\n", + " 'sky',\n", + " 'disharmony',\n", + " 'untutored',\n", + " 'icarus',\n", + " 'taint',\n", + " 'kargil',\n", + " 'captain',\n", + " 'paucity',\n", + " 'fits',\n", + " 'tumbles',\n", + " 'amer',\n", + " 'bueller',\n", + " 'redubbed',\n", + " 'cleansed',\n", + " 'kollos',\n", + " 'shara',\n", + " 'humma',\n", + " 'felichy',\n", + " 'outa',\n", + " 'piglets',\n", + " 'gombell',\n", + " 'supermen',\n", + " 'superlow',\n", + " 'enhance',\n", + " 'goode',\n", + " 'shalt',\n", + " 'kubanskie',\n", + " 'zenith',\n", + " 'ananda',\n", + " 'ocd',\n", + " 'matlin',\n", + " 'nosed',\n", + " 'presumptuous',\n", + " 'rerun',\n", + " 'toyko',\n", + " 'mazar',\n", + " 'sundry',\n", + " 'bilb',\n", + " 'fugly',\n", + " 'orchestrating',\n", + " 'prosaically',\n", + " 'maricarmen',\n", + " 'moveis',\n", + " 'conelly',\n", + " 'estrange',\n", + " 'lusciously',\n", + " 'seasonings',\n", + " 'sums',\n", + " 'delirious',\n", + " 'quincey',\n", + " 'flesh',\n", + " 'tootsie',\n", + " 'ai',\n", + " 'tenma',\n", + " 'appropriations',\n", + " 'chainsaw',\n", + " 'ides',\n", + " 'surrogacy',\n", + " 'pungent',\n", + " 'gallon',\n", + " 'damaso',\n", + " 'caribou',\n", + " 'perico',\n", + " 'supplying',\n", + " 'ro',\n", + " 'yuy',\n", + " 'valium',\n", + " 'debuted',\n", + " 'robbin',\n", + " 'mounts',\n", + " 'interpolated',\n", + " 'aetv',\n", + " 'plummer',\n", + " 'competence',\n", + " 'toadies',\n", + " 'dubiel',\n", + " 'clavichord',\n", + " 'asunder',\n", + " 'sublety',\n", + " 'airfix',\n", + " 'stoltzfus',\n", + " 'ruth',\n", + " 'fluorescent',\n", + " 'improves',\n", + " 'rebenga',\n", + " 'russells',\n", + " 'deliberation',\n", + " 'zsa',\n", + " 'dardino',\n", + " 'macs',\n", + " 'servile',\n", + " 'jlb',\n", + " 'apallonia',\n", + " 'crossbows',\n", + " 'locus',\n", + " 'mislead',\n", + " 'corey',\n", + " 'blundered',\n", + " 'jeopardizes',\n", + " 'disorganized',\n", + " 'discuss',\n", + " 'longish',\n", + " 'tieing',\n", + " 'ledger',\n", + " 
'speechifying',\n", + " 'amitabhz',\n", + " 'bbc',\n", + " 'chimayo',\n", + " 'pranked',\n", + " 'superman',\n", + " 'aggravated',\n", + " 'rifleman',\n", + " 'yvone',\n", + " 'radiant',\n", + " 'galico',\n", + " 'debris',\n", + " 'waking',\n", + " 'btw',\n", + " 'havnt',\n", + " 'francen',\n", + " 'chattered',\n", + " 'scathed',\n", + " 'pic',\n", + " 'ceremonies',\n", + " 'watergate',\n", + " 'betsy',\n", + " 'majorca',\n", + " 'meercat',\n", + " 'noirs',\n", + " 'grunts',\n", + " 'drecky',\n", + " 'tribulations',\n", + " 'avery',\n", + " 'talladega',\n", + " 'eights',\n", + " 'dumbing',\n", + " 'alloimono',\n", + " 'scrutinising',\n", + " 'geta',\n", + " 'beltrami',\n", + " 'pvc',\n", + " 'horse',\n", + " 'tiburon',\n", + " 'huitime',\n", + " 'ripple',\n", + " 'loitering',\n", + " 'forensics',\n", + " 'nearly',\n", + " 'elizabethan',\n", + " 'ellington',\n", + " 'uzi',\n", + " 'sicily',\n", + " 'camion',\n", + " 'motivated',\n", + " 'rung',\n", + " 'gao',\n", + " 'licitates',\n", + " 'protocol',\n", + " 'smirker',\n", + " 'torin',\n", + " 'newlywed',\n", + " 'rich',\n", + " 'dismay',\n", + " 'skyler',\n", + " 'moonwalks',\n", + " 'haranguing',\n", + " 'sunburst',\n", + " 'grifter',\n", + " 'undersold',\n", + " 'chearator',\n", + " 'marino',\n", + " 'scala',\n", + " 'conditioner',\n", + " 'ulysses',\n", + " 'lamarre',\n", + " 'figueroa',\n", + " 'flane',\n", + " 'allllllll',\n", + " 'slide',\n", + " 'lateness',\n", + " 'selbst',\n", + " 'gandhis',\n", + " 'dramatizing',\n", + " 'catchphrase',\n", + " 'doable',\n", + " 'stadiums',\n", + " 'alexanderplatz',\n", + " 'pandemonium',\n", + " 'misrepresents',\n", + " 'earth',\n", + " 'mounties',\n", + " 'seeker',\n", + " 'cheat',\n", + " 'outbreaks',\n", + " 'snowstorm',\n", + " 'baur',\n", + " 'schedules',\n", + " 'bathetic',\n", + " 'incorrect',\n", + " 'johnathon',\n", + " 'rosanne',\n", + " 'mundanely',\n", + " 'cauldrons',\n", + " 'forrest',\n", + " 'poky',\n", + " 'legislation',\n", + " 'womanness',\n", + " 
'spender',\n", + " 'crazy',\n", + " 'rational',\n", + " 'terrell',\n", + " 'zero',\n", + " 'coincides',\n", + " 'thoughout',\n", + " 'mathew',\n", + " 'narnia',\n", + " 'naseeruddin',\n", + " 'bucks',\n", + " 'affronts',\n", + " 'topple',\n", + " 'degree',\n", + " 'preyed',\n", + " 'passionately',\n", + " 'defeats',\n", + " 'torchwood',\n", + " 'sources',\n", + " 'botticelli',\n", + " 'compactor',\n", + " 'kosturica',\n", + " 'waiving',\n", + " 'gunnar',\n", + " 'stiffler',\n", + " 'fwd',\n", + " 'kawajiri',\n", + " 'eleanor',\n", + " 'sistahs',\n", + " 'soulhunter',\n", + " 'belies',\n", + " 'wrathful',\n", + " 'americans',\n", + " 'ferdinandvongalitzien',\n", + " 'kendra',\n", + " 'weirdy',\n", + " 'unforgivably',\n", + " 'chepart',\n", + " 'tatta',\n", + " 'departmentthe',\n", + " 'dig',\n", + " 'blatty',\n", + " 'marionettes',\n", + " 'atop',\n", + " 'chim',\n", + " 'saurian',\n", + " 'woes',\n", + " 'cloudscape',\n", + " 'resignedly',\n", + " 'unrooted',\n", + " 'keuck',\n", + " 'hitlerian',\n", + " 'stylings',\n", + " 'crewed',\n", + " 'bedeviled',\n", + " 'unfurnished',\n", + " 'reedus',\n", + " 'circumstances',\n", + " 'grasped',\n", + " 'smurfettes',\n", + " 'fn',\n", + " 'dishwashers',\n", + " 'roadie',\n", + " 'ruthlessness',\n", + " 'refrains',\n", + " 'lampooning',\n", + " 'semblance',\n", + " 'richart',\n", + " 'legions',\n", + " 'gwenneth',\n", + " 'enmity',\n", + " 'assess',\n", + " 'manufacturer',\n", + " 'bullosa',\n", + " 'outrun',\n", + " 'hogan',\n", + " 'chekov',\n", + " 'blithe',\n", + " 'code',\n", + " 'drillings',\n", + " 'revolvers',\n", + " 'aredavid',\n", + " 'robespierre',\n", + " 'achcha',\n", + " 'boyfriendhe',\n", + " 'wallow',\n", + " 'toga',\n", + " 'graphed',\n", + " 'tonking',\n", + " 'going',\n", + " 'bosnians',\n", + " 'willy',\n", + " 'rohauer',\n", + " 'fim',\n", + " 'forbidding',\n", + " 'yew',\n", + " 'rationalised',\n", + " 'shimomo',\n", + " 'opposition',\n", + " 'landis',\n", + " 'minded',\n", + " 'despicableness',\n", + 
" 'easting',\n", + " 'arghhhhh',\n", + " 'ebb',\n", + " 'trialat',\n", + " 'protected',\n", + " 'negras',\n", + " 'rick',\n", + " 'muti',\n", + " 'tracker',\n", + " 'shawl',\n", + " 'differentiates',\n", + " 'sweetheart',\n", + " 'deepened',\n", + " 'manmohan',\n", + " 'trevethyn',\n", + " 'brain',\n", + " 'incomprehensibly',\n", + " 'piercing',\n", + " 'pasadena',\n", + " 'shtick',\n", + " 'ute',\n", + " 'viggo',\n", + " 'supersedes',\n", + " 'ack',\n", + " 'cites',\n", + " 'taurus',\n", + " 'relevent',\n", + " 'minidress',\n", + " 'philosopher',\n", + " 'bel',\n", + " 'mahattan',\n", + " 'moden',\n", + " 'compiling',\n", + " 'advertising',\n", + " 'rogues',\n", + " 'unimaginative',\n", + " 'subpaar',\n", + " 'ademir',\n", + " 'darkly',\n", + " 'saturate',\n", + " 'fledgling',\n", + " 'breaths',\n", + " 'padre',\n", + " 'aszombi',\n", + " 'pachabel',\n", + " 'incalculable',\n", + " 'ozone',\n", + " 'sped',\n", + " 'mpho',\n", + " 'rawail',\n", + " 'forbid',\n", + " 'synth',\n", + " 'guttersnipe',\n", + " 'reputedly',\n", + " 'holiness',\n", + " 'unessential',\n", + " 'hampden',\n", + " 'asylum',\n", + " 'bolye',\n", + " 'strangers',\n", + " 'rantzen',\n", + " 'farrellys',\n", + " 'vigourous',\n", + " 'cantinflas',\n", + " 'enshrined',\n", + " 'boris',\n", + " 'expetations',\n", + " 'replaying',\n", + " 'prestige',\n", + " 'bukater',\n", + " 'overpaid',\n", + " 'exhude',\n", + " 'backsides',\n", + " 'topless',\n", + " 'sufferings',\n", + " 'nitwits',\n", + " 'cordova',\n", + " 'incensed',\n", + " 'danira',\n", + " 'unrelenting',\n", + " 'disabling',\n", + " 'ferdy',\n", + " 'gerard',\n", + " 'drewitt',\n", + " 'mero',\n", + " 'monsters',\n", + " 'precautions',\n", + " 'lamping',\n", + " 'relinquish',\n", + " 'demy',\n", + " 'drink',\n", + " 'chamberlin',\n", + " 'unjustifiably',\n", + " 'cove',\n", + " 'floodwaters',\n", + " 'searing',\n", + " 'isral',\n", + " 'ling',\n", + " 'grossness',\n", + " 'pickier',\n", + " 'pax',\n", + " 'wierd',\n", + " 'tereasa',\n", + " 
'smog',\n", + " 'girotti',\n", + " 'spat',\n", + " 'sera',\n", + " 'noxious',\n", + " 'misbehaving',\n", + " 'scouts',\n", + " 'refreshments',\n", + " 'autobiographic',\n", + " 'shi',\n", + " 'toyomichi',\n", + " 'bits',\n", + " 'psychotics',\n", + " 'barzell',\n", + " 'colt',\n", + " 'shivering',\n", + " 'pugilist',\n", + " 'gladiator',\n", + " 'dryer',\n", + " 'reissues',\n", + " 'scrivener',\n", + " 'predicable',\n", + " 'objection',\n", + " 'marmalade',\n", + " 'seems',\n", + " 'spellbind',\n", + " 'trifecta',\n", + " 'innovator',\n", + " 'shriekfest',\n", + " 'inthused',\n", + " 'contestants',\n", + " 'goody',\n", + " 'samotri',\n", + " 'serviced',\n", + " 'nozires',\n", + " 'ins',\n", + " 'mutilating',\n", + " 'dupes',\n", + " 'launius',\n", + " 'widescreen',\n", + " 'joo',\n", + " 'discretionary',\n", + " 'enlivens',\n", + " 'bushes',\n", + " 'chills',\n", + " 'header',\n", + " 'activist',\n", + " 'gethsemane',\n", + " 'phoenixs',\n", + " 'wreathed',\n", + " 'sacrine',\n", + " 'electrifyingly',\n", + " 'basely',\n", + " 'ghidora',\n", + " 'binder',\n", + " 'dogfights',\n", + " 'sugar',\n", + " 'doddsville',\n", + " 'porkys',\n", + " 'scattershot',\n", + " 'refunded',\n", + " 'rudely',\n", + " 'insteadit',\n", + " 'zatichi',\n", + " 'eurotrash',\n", + " 'radioraptus',\n", + " 'hurls',\n", + " 'boogeman',\n", + " 'weighs',\n", + " 'danniele',\n", + " 'converging',\n", + " 'hypothermia',\n", + " 'glorfindel',\n", + " 'birthdays',\n", + " 'attentive',\n", + " 'mallepa',\n", + " 'spacewalk',\n", + " 'manoy',\n", + " 'bombshells',\n", + " 'farts',\n", + " 'lyoko',\n", + " 'southron',\n", + " 'destruction',\n", + " 'flemming',\n", + " 'manhole',\n", + " 'elainor',\n", + " 'bowersock',\n", + " 'lowly',\n", + " 'wfst',\n", + " 'limousines',\n", + " 'skolimowski',\n", + " 'saban',\n", + " 'koen',\n", + " 'malaysia',\n", + " 'uwi',\n", + " 'cyd',\n", + " 'apeing',\n", + " 'bonecrushing',\n", + " 'dini',\n", + " 'merest',\n", + " 'janina',\n", + " 'chemotrodes',\n", + " 
'trials',\n", + " 'authorize',\n", + " 'whilhelm',\n", + " 'asthmatic',\n", + " 'broads',\n", + " 'missteps',\n", + " 'embittered',\n", + " 'chandeliers',\n", + " 'seeming',\n", + " 'miscalculate',\n", + " 'recommeded',\n", + " 'schoolwork',\n", + " 'coy',\n", + " 'mcconaughey',\n", + " 'philosophically',\n", + " 'waver',\n", + " 'fanny',\n", + " 'mestressat',\n", + " 'unwatchably',\n", + " 'saggy',\n", + " 'topness',\n", + " 'dwellings',\n", + " 'breakup',\n", + " 'hasselhoff',\n", + " 'superstars',\n", + " 'replay',\n", + " 'aggravates',\n", + " 'balances',\n", + " 'urging',\n", + " 'snidely',\n", + " 'aleksandar',\n", + " 'hildy',\n", + " 'kazuhiro',\n", + " 'slayer',\n", + " 'tangy',\n", + " 'brussels',\n", + " 'horne',\n", + " 'masayuki',\n", + " 'molden',\n", + " 'unravel',\n", + " 'goodtime',\n", + " 'interrogates',\n", + " 'bismillahhirrahmannirrahim',\n", + " 'rowboat',\n", + " 'dumann',\n", + " 'datedness',\n", + " 'astrotheology',\n", + " 'dekhiye',\n", + " 'valga',\n", + " 'kata',\n", + " 'wipes',\n", + " 'hostilities',\n", + " 'sentimentalising',\n", + " 'documentary',\n", + " 'salesman',\n", + " 'virtue',\n", + " 'unreasonably',\n", + " 'haver',\n", + " 'cei',\n", + " 'unglamorised',\n", + " 'balky',\n", + " 'complementary',\n", + " 'paychecks',\n", + " 'mnica',\n", + " 'wada',\n", + " 'ily',\n", + " 'prc',\n", + " 'ennobling',\n", + " 'functionality',\n", + " 'dissociated',\n", + " 'elk',\n", + " 'throbbing',\n", + " 'tempe',\n", + " 'linoleum',\n", + " 'photogrsphed',\n", + " 'bottacin',\n", + " 'hipper',\n", + " 'titillating',\n", + " 'barging',\n", + " 'untie',\n", + " 'sacchetti',\n", + " 'gnat',\n", + " 'roedel',\n", + " 'cohabitation',\n", + " 'performs',\n", + " 'sales',\n", + " 'migrs',\n", + " 'teachs',\n", + " 'nanavati',\n", + " 'fresco',\n", + " 'davison',\n", + " 'obstinate',\n", + " 'burglar',\n", + " 'masue',\n", + " 'dickory',\n", + " 'grills',\n", + " 'appelagate',\n", + " 'linkage',\n", + " 'enables',\n", + " 'loesser',\n", + " 
'patties',\n", + " 'prudent',\n", + " 'mallorquins',\n", + " 'nativetex',\n", + " 'suprise',\n", + " 'drippy',\n", + " 'quill',\n", + " 'speeded',\n", + " 'farscape',\n", + " 'saddening',\n", + " 'centuries',\n", + " 'mos',\n", + " 'improvisationally',\n", + " 'neccessarily',\n", + " 'transmitter',\n", + " 'tankers',\n", + " 'latte',\n", + " 'mechanisation',\n", + " 'faracy',\n", + " 'synthetically',\n", + " 'thoughtless',\n", + " 'rake',\n", + " 'ropes',\n", + " 'desirable',\n", + " 'whitewashed',\n", + " 'donal',\n", + " 'crabby',\n", + " 'lifeless',\n", + " 'perfidy',\n", + " 'teresa',\n", + " 'bulldog',\n", + " 'cockamamie',\n", + " 'rasberries',\n", + " 'notethe',\n", + " 'captivity',\n", + " 'chiseling',\n", + " 'smaller',\n", + " 'clampets',\n", + " 'alerts',\n", + " 'tough',\n", + " 'wellingtonian',\n", + " 'aaaahhhhhhh',\n", + " 'dither',\n", + " 'incertitude',\n", + " 'florentine',\n", + " 'imperioli',\n", + " 'licking',\n", + " 'disparagement',\n", + " 'artfully',\n", + " 'feds',\n", + " 'fumiya',\n", + " 'tearfully',\n", + " 'lanchester',\n", + " 'undertaken',\n", + " 'longlost',\n", + " 'netted',\n", + " 'carrell',\n", + " 'uncompelling',\n", + " 'reliefs',\n", + " 'leona',\n", + " 'autorenfilm',\n", + " 'unfriendly',\n", + " 'typewriter',\n", + " 'shifted',\n", + " 'bertrand',\n", + " 'blesses',\n", + " 'tricking',\n", + " 'fireflies',\n", + " 'zanes',\n", + " 'unknowingly',\n", + " 'unnerve',\n", + " 'caning',\n", + " 'flat',\n", + " 'recluse',\n", + " 'dcreasy',\n", + " 'chipmunk',\n", + " 'dipper',\n", + " 'musee',\n", + " 'cousin',\n", + " 'shys',\n", + " 'berserkers',\n", + " 'eve',\n", + " 'conflagration',\n", + " 'irks',\n", + " 'restricts',\n", + " 'parsing',\n", + " 'positronic',\n", + " 'copout',\n", + " 'khala',\n", + " 'swiftness',\n", + " 'higginson',\n", + " 'imprint',\n", + " 'walter',\n", + " 'sundance',\n", + " 'whispering',\n", + " 'thematically',\n", + " 'underimpressed',\n", + " 'uno',\n", + " 'expressly',\n", + " 'russkies',\n", + 
" 'discos',\n", + " 'shaping',\n", + " 'verson',\n", + " 'prototype',\n", + " 'chapman',\n", + " 'trafficker',\n", + " 'semetary',\n", + " 'unrealistically',\n", + " 'lifewell',\n", + " 'rivas',\n", + " 'consequent',\n", + " 'katsu',\n", + " 'titantic',\n", + " 'jalees',\n", + " 'ranee',\n", + " 'shipbuilding',\n", + " 'gambles',\n", + " 'dispenses',\n", + " 'disfigurement',\n", + " 'bright',\n", + " 'cristian',\n", + " 'puertorricans',\n", + " 'constituent',\n", + " 'capta',\n", + " 'jewel',\n", + " 'erect',\n", + " 'farah',\n", + " 'despondently',\n", + " 'avoide',\n", + " 'inconnu',\n", + " 'headquarters',\n", + " 'sanguisga',\n", + " ...]" + ] + }, + "execution_count": 75, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "list(vocab)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 0., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 46, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "import numpy as np\n", + "\n", + "layer_0 = np.zeros((1,vocab_size))\n", + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
[base64-encoded PNG image data elided]\n", + "text/plain": [ + "" + ] + }, + "execution_count": 47, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "{'': 0,\n", + " 'inhabitants': 1,\n", + " 'goku': 2,\n", + " 'stunts': 3,\n", + " 'catepillar': 4,\n", + " 'kristensen': 5,\n", + " 'goddess': 7,\n", + " 'offing': 49797,\n", + " 'distroy': 8,\n", + " 'unexplainably': 9,\n", + " 'concoctions': 10,\n", + " 'petite': 11,\n", + " 'paramilitary': 24759,\n", + " 'scribe': 12,\n", + " 'stevson': 13,\n", + " 'senegal': 6,\n", + " 'sctv': 14,\n", + " 'soundscape': 15,\n", + " 'rana': 16,\n", + " 'immortalizer': 18,\n", + " 'rene': 67354,\n", + " 'eko': 23,\n", + " 'planning': 20,\n", + " 'akiva': 21,\n", + " 'plod': 22,\n", + " 'orderly': 24,\n", + " 'zeleznice': 25,\n", + " 'critize': 29,\n", + " 'baguettes': 25649,\n", + " 'jefferies': 30,\n", + " 'uncertainties': 61695,\n", + " 'mountainbillies': 31,\n", + " 'steinbichler': 32,\n", + " 'vowel': 33,\n", + " 'rafe': 34,\n", + " 'donig': 68719,\n", + " 
'tulipe': 36,\n", + " 'clot': 37,\n", + " 'hack': 12526,\n", + " 'distended': 38,\n", + " 'cornered': 37116,\n", + " 'impatiently': 40,\n", + " 'batrice': 12525,\n", + " 'unfortuntly': 41,\n", + " 'lung': 42,\n", + " 'scapegoats': 43,\n", + " 'pscychosexual': 45,\n", + " 'outbid': 46,\n", + " 'obit': 47,\n", + " 'sideshows': 48,\n", + " 'jugde': 49,\n", + " 'kevloun': 51,\n", + " 'quartier': 53,\n", + " 'harp': 61948,\n", + " 'unravelling': 54,\n", + " 'antiques': 56,\n", + " 'strutts': 57,\n", + " 'tilts': 58,\n", + " 'disconcert': 59,\n", + " 'dossiers': 60,\n", + " 'sorriest': 61,\n", + " 'craftsman': 49412,\n", + " 'blart': 62,\n", + " 'dependence': 37120,\n", + " 'sated': 61698,\n", + " 'iberia': 63,\n", + " 'sagan': 72,\n", + " 'frmann': 65,\n", + " 'daniell': 66,\n", + " 'rays': 67,\n", + " 'pried': 68,\n", + " 'khoobsurat': 69,\n", + " 'leavitt': 70,\n", + " 'caiano': 71,\n", + " 'attractiveness': 73,\n", + " 'kitaparaporn': 74,\n", + " 'hamilton': 75,\n", + " 'massages': 76,\n", + " 'horgan': 78,\n", + " 'chemist': 79,\n", + " 'audrey': 80,\n", + " 'yeow': 55655,\n", + " 'jana': 81,\n", + " 'dutch': 82,\n", + " 'pinchot': 24773,\n", + " 'override': 83,\n", + " 'dwervick': 63223,\n", + " 'spasms': 84,\n", + " 'resumed': 85,\n", + " 'tamale': 66259,\n", + " 'calibanian': 49636,\n", + " 'stinson': 86,\n", + " 'widows': 87,\n", + " 'stonewall': 88,\n", + " 'palatial': 89,\n", + " 'neuman': 90,\n", + " 'abandon': 91,\n", + " 'lemmings': 65314,\n", + " 'anglophile': 92,\n", + " 'ertha': 61706,\n", + " 'chevette': 94,\n", + " 'unscary': 95,\n", + " 'spoilerific': 97,\n", + " 'neworleans': 67639,\n", + " 'metamorphose': 17,\n", + " 'brigand': 99,\n", + " 'cheating': 41603,\n", + " 'clued': 101,\n", + " 'dermatonecrotic': 102,\n", + " 'grady': 103,\n", + " 'mulligan': 104,\n", + " 'ol': 105,\n", + " 'incubation': 107,\n", + " 'plaintiffs': 110,\n", + " 'snden': 109,\n", + " 'fk': 111,\n", + " 'deply': 112,\n", + " 'franchot': 113,\n", + " 'henstridge': 19,\n", + " 
'cyhper': 114,\n", + " 'verbose': 26,\n", + " 'mazovia': 116,\n", + " 'elizabeth': 117,\n", + " 'palestine': 118,\n", + " 'robby': 119,\n", + " 'wongo': 120,\n", + " 'moshing': 121,\n", + " 'mstified': 12543,\n", + " 'eeeee': 122,\n", + " 'doltish': 123,\n", + " 'bree': 124,\n", + " 'postponed': 125,\n", + " 'debacles': 127,\n", + " 'amplify': 27,\n", + " 'kamm': 128,\n", + " 'phantom': 18893,\n", + " 'boylen': 136,\n", + " 'rolando': 131,\n", + " 'premises': 133,\n", + " 'bruck': 134,\n", + " 'loosely': 135,\n", + " 'wodehousian': 139,\n", + " 'onishi': 70389,\n", + " 'encapsuling': 140,\n", + " 'partly': 141,\n", + " 'stadling': 144,\n", + " 'calms': 143,\n", + " 'darkie': 148,\n", + " 'wheeling': 147,\n", + " 'ursla': 15875,\n", + " 'subsidized': 49420,\n", + " 'mckellar': 149,\n", + " 'ooookkkk': 151,\n", + " 'milky': 152,\n", + " 'unfolded': 153,\n", + " 'degrades': 154,\n", + " 'authenticating': 155,\n", + " 'writeup': 12548,\n", + " 'rotheroe': 156,\n", + " 'beart': 157,\n", + " 'intoxicants': 160,\n", + " 'grispin': 159,\n", + " 'cannes': 61718,\n", + " 'antithetical': 70398,\n", + " 'nnette': 161,\n", + " 'tsukamoto': 163,\n", + " 'antwones': 44205,\n", + " 'stows': 164,\n", + " 'suddenness': 165,\n", + " 'vol': 61720,\n", + " 'waqt': 166,\n", + " 'camazotz': 168,\n", + " 'paps': 55042,\n", + " 'shakher': 170,\n", + " 'terminate': 63868,\n", + " 'kotex': 56419,\n", + " 'delinquency': 171,\n", + " 'bromwell': 25214,\n", + " 'insecticide': 173,\n", + " 'charlton': 174,\n", + " 'nakada': 177,\n", + " 'titted': 24791,\n", + " 'urbane': 178,\n", + " 'depicted': 54491,\n", + " 'sadomasochistic': 179,\n", + " 'hyping': 181,\n", + " 'yr': 182,\n", + " 'hebert': 183,\n", + " 'waxwork': 12990,\n", + " 'deathrow': 185,\n", + " 'nourishes': 24792,\n", + " 'unmediated': 187,\n", + " 'tamper': 37143,\n", + " 'soad': 190,\n", + " 'alphabet': 189,\n", + " 'donen': 191,\n", + " 'lord': 192,\n", + " 'recess': 193,\n", + " 'watchably': 61023,\n", + " 'handsome': 194,\n", + " 
'vignettes': 196,\n", + " 'pairings': 198,\n", + " 'uselful': 199,\n", + " 'sanders': 200,\n", + " 'outbursts': 72891,\n", + " 'nots': 201,\n", + " 'hatsumomo': 202,\n", + " 'actioned': 18292,\n", + " 'krimi': 24797,\n", + " 'appleby': 203,\n", + " 'tampax': 204,\n", + " 'sprinkling': 205,\n", + " 'defacing': 206,\n", + " 'lofty': 207,\n", + " 'verger': 213,\n", + " 'tablespoons': 211,\n", + " 'bernhard': 212,\n", + " 'goosebump': 64565,\n", + " 'acumen': 214,\n", + " 'percentages': 215,\n", + " 'wendingo': 216,\n", + " 'resonating': 217,\n", + " 'vntoarea': 218,\n", + " 'redundancies': 219,\n", + " 'strictly': 57081,\n", + " 'pitied': 221,\n", + " 'belying': 222,\n", + " 'michelangelo': 53153,\n", + " 'gleefulness': 223,\n", + " 'environmentalist': 24803,\n", + " 'gitane': 226,\n", + " 'corrected': 66547,\n", + " 'journalist': 227,\n", + " 'focusing': 228,\n", + " 'plethora': 229,\n", + " 'his': 39,\n", + " 'citizen': 230,\n", + " 'south': 55579,\n", + " 'clunkers': 232,\n", + " 'pendulous': 55991,\n", + " 'mounds': 24805,\n", + " 'deplorable': 233,\n", + " 'forgive': 234,\n", + " 'proplems': 235,\n", + " 'bankers': 237,\n", + " 'aqua': 238,\n", + " 'donated': 239,\n", + " 'disbelieving': 240,\n", + " 'acomplication': 241,\n", + " 'contrasted': 243,\n", + " 'muzzle': 44,\n", + " 'amphibians': 72141,\n", + " 'springs': 246,\n", + " 'reformatted': 49443,\n", + " 'toolbox': 247,\n", + " 'contacting': 248,\n", + " 'washrooms': 250,\n", + " 'raving': 251,\n", + " 'dynamism': 252,\n", + " 'mae': 253,\n", + " 'disharmony': 255,\n", + " 'molls': 72979,\n", + " 'dewaere': 12569,\n", + " 'untutored': 256,\n", + " 'icarus': 257,\n", + " 'taint': 258,\n", + " 'kargil': 259,\n", + " 'captain': 260,\n", + " 'paucity': 261,\n", + " 'fits': 262,\n", + " 'tumbles': 263,\n", + " 'amer': 264,\n", + " 'bueller': 265,\n", + " 'cleansed': 267,\n", + " 'shara': 269,\n", + " 'humma': 270,\n", + " 'outa': 272,\n", + " 'piglets': 273,\n", + " 'gombell': 274,\n", + " 'supermen': 275,\n", + 
" 'superlow': 276,\n", + " 'kubanskie': 280,\n", + " 'goode': 278,\n", + " 'disorganised': 45570,\n", + " 'zenith': 281,\n", + " 'ananda': 282,\n", + " 'matlin': 284,\n", + " 'particolare': 50,\n", + " 'presumptuous': 286,\n", + " 'rerun': 287,\n", + " 'toyko': 288,\n", + " 'bilb': 291,\n", + " 'sundry': 290,\n", + " 'fugly': 292,\n", + " 'orchestrating': 293,\n", + " 'prosaically': 294,\n", + " 'moveis': 296,\n", + " 'conelly': 297,\n", + " 'estrange': 298,\n", + " 'elfriede': 49455,\n", + " 'masterful': 52,\n", + " 'seasonings': 300,\n", + " 'quincey': 303,\n", + " 'frowning': 49456,\n", + " 'painkillers': 53444,\n", + " 'high': 25515,\n", + " 'flesh': 304,\n", + " 'tootsie': 305,\n", + " 'ai': 306,\n", + " 'tenma': 307,\n", + " 'duguay': 71257,\n", + " 'appropriations': 308,\n", + " 'ides': 310,\n", + " 'rui': 61734,\n", + " 'surrogacy': 311,\n", + " 'pungent': 312,\n", + " 'damaso': 314,\n", + " 'authoritarian': 61736,\n", + " 'caribou': 315,\n", + " 'ro': 318,\n", + " 'supplying': 317,\n", + " 'yuy': 319,\n", + " 'debuted': 321,\n", + " 'mounts': 323,\n", + " 'interpolated': 324,\n", + " 'aetv': 325,\n", + " 'plummer': 326,\n", + " 'asunder': 331,\n", + " 'airfix': 333,\n", + " 'dubiel': 329,\n", + " 'clavichord': 330,\n", + " 'crafty': 50465,\n", + " 'sublety': 332,\n", + " 'stoltzfus': 334,\n", + " 'ruth': 335,\n", + " 'fluorescent': 336,\n", + " 'improves': 337,\n", + " 'russells': 339,\n", + " 'tick': 43838,\n", + " 'zsa': 341,\n", + " 'macs': 343,\n", + " 'jlb': 345,\n", + " 'locus': 348,\n", + " 'mislead': 349,\n", + " 'merly': 49461,\n", + " 'corey': 350,\n", + " 'blundered': 351,\n", + " 'humourless': 3568,\n", + " 'disorganized': 353,\n", + " 'discuss': 354,\n", + " 'sharifi': 45391,\n", + " 'tieing': 356,\n", + " 'kats': 34784,\n", + " 'bbc': 360,\n", + " 'pranked': 362,\n", + " 'superman': 363,\n", + " 'holroyd': 9223,\n", + " 'aggravated': 364,\n", + " 'rifleman': 365,\n", + " 'yvone': 366,\n", + " 'vaugier': 24820,\n", + " 'radiant': 367,\n", + " 
'galico': 368,\n", + " 'debris': 369,\n", + " 'btw': 371,\n", + " 'jucier': 1041,\n", + " 'haifa': 
1043,\n", + " ...}" + ] + }, + "execution_count": 48, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "word2index = {}\n", + "\n", + "for i,word in enumerate(vocab):\n", + " word2index[word] = i\n", + "word2index" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_target_for_label(label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 54, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "1" + ] + }, + "execution_count": 52, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 55, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'NEGATIVE'" + 
] + }, + "execution_count": 55, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[1]" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 53, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[1])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 3: Building a Neural Network" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "- Start with your neural network from the last chapter\n", + "- 3 layer neural network\n", + "- no non-linearity in hidden layer\n", + "- use our functions to create the training data\n", + "- create a \"pre_process_data\" function to create vocabulary for our training data generating functions\n", + "- modify \"train\" to train over the entire corpus" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Where to Get Help if You Need it\n", + "- Re-watch previous week's Udacity Lectures\n", + "- Chapters 3-5 - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) - (40% Off: **traskud17**)" + ] + }, + { + "cell_type": "code", + "execution_count": 86, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " # set our random number generator \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews, labels)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self, reviews, labels):\n", + " \n", + " review_vocab = 
set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " \n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " self.layer_0[0][self.word2index[word]] += 1\n", + " \n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def train(self, training_reviews, 
training_labels):\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + " self.update_input_layer(review)\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward pass ###\n", + "\n", + " # TODO: Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error is the difference between desired target and actual output.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # TODO: Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # TODO: Update the weights\n", + " self.weights_1_2 -= layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " self.weights_0_1 -= self.layer_0.T.dot(layer_1_delta) * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / 
float(i+1))[:4] + \"%\")\n", + " if(i % 2500 == 0):\n", + " print(\"\")\n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \" #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + " self.update_input_layer(review.lower())\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] > 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 87, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 61, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):587.5 #Correct:500 #Tested:1000 Testing Accuracy:50.0%" + ] + } + ], + "source": [ + "# evaluate our model before training (just to show how horrible it is)\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "code", + "execution_count": 62, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% 
Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):89.58 #Correct:1250 #Trained:2501 Training Accuracy:49.9%\n", + "Progress:20.8% Speed(reviews/sec):95.03 #Correct:2500 #Trained:5001 Training Accuracy:49.9%\n", + "Progress:27.4% Speed(reviews/sec):95.46 #Correct:3295 #Trained:6592 Training Accuracy:49.9%" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output 
weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 63, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.01)" + ] + }, + { + "cell_type": "code", + "execution_count": 64, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):96.39 #Correct:1247 #Trained:2501 Training Accuracy:49.8%\n", + "Progress:20.8% Speed(reviews/sec):99.31 #Correct:2497 #Trained:5001 Training Accuracy:49.9%\n", + "Progress:22.8% Speed(reviews/sec):99.02 #Correct:2735 #Trained:5476 Training Accuracy:49.9%" + ] + }, + { + "ename": 
"KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m \u001b[0;34m-=\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 65, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.001)" + ] + }, + { + "cell_type": "code", + "execution_count": 66, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):98.77 #Correct:1267 #Trained:2501 Training Accuracy:50.6%\n", + "Progress:20.8% Speed(reviews/sec):98.79 #Correct:2640 #Trained:5001 Training Accuracy:52.7%\n", + "Progress:31.2% Speed(reviews/sec):98.58 #Correct:4109 #Trained:7501 Training Accuracy:54.7%\n", + "Progress:41.6% Speed(reviews/sec):93.78 #Correct:5638 #Trained:10001 Training Accuracy:56.3%\n", + "Progress:52.0% Speed(reviews/sec):91.76 #Correct:7246 #Trained:12501 Training Accuracy:57.9%\n", + "Progress:62.5% 
Speed(reviews/sec):92.42 #Correct:8841 #Trained:15001 Training Accuracy:58.9%\n", + "Progress:69.4% Speed(reviews/sec):92.58 #Correct:9934 #Trained:16668 Training Accuracy:59.5%" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m 
\u001b[0;34m-=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Understanding Neural Noise" + ] + }, + { + "cell_type": "code", + "execution_count": 67, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
ITBiD8G9077fkJiJ5O56gR2HhXNgC4ltRSO/m48J5w7/z6hcuL+53Ezo7TveizvMGkFXy5M1S\nJ29wWB6cv0jZUzJxOjq/keXLl1v/kbRvlnHl6vzgIJDnfydP3jwI77XXXp3s3aYCOokSHhxxxBGd\nlElfaCEjzn+PzFE+ZlHnSEvAQCcXXnjhMP8LXrIdSXFpivomAJoTLCG+fv60DGkOOuggl9RgPUGv\nrJJmmixNHbmJSJaOdwoSzc2Zl7gRcKRBYJXOzOTSQkT8myXKx4JpDRcxDmchJI9+ru4k30V2dpL6\n8qZhXjYc8S9vmf4/Zdqy8uRNW5dLj+UBX4x+Tsm4ut238wc577zzrC6OGLnr+hYCvRDI87+TJ28v\nvbpdf9/73te5/OMf/7hznPfAd+TEZ8ONA5TLy60fNZWoq05cBFF+48fn5+PY+fa59FHfLKZwgzzT\nzZ/61KeikkWe86f2IxOETrrpGXfa6Rc1LYP/iHshZ2xFL//Zzzjsj53hKKtEq3biynG/C/sOzDK5\nhKU+gTKdj7+cNWj0sNgSQSM6dYWXDJEvvO6a374EJshOPdTpL/UML99lyRKSVT90de3y20SZcdf8\n8/7yXadHcJN0ykQvJ36+cJtJQ/1OFzDwxZ3nO6ynny587JZlhs+n/e0v7UIH+oG+TSqk9dtHGZSZ\nVVgemWZZcvDgGPrGN76Rtbpc+aKW87Jcl/P9FnAAO+4Lt0QXHMMfAqGRJvBRqETPfuNSdH0O37zl\n1u3/jvs2sOYlbpb/P8+zMiz+czvueeA/+xhrnPjPUz9N+Nh/BpM/fL3bb1dfeNzplsevD12j0ro0\nfvtJFyU+hq4sfzmvnydcnksf/ga7sPjjlj/mhtPl+R3dwpQlZul4n1TQUDd4+f9gYVCS3izhzsii\nn58nPMDHXcva2X55eYiIu6nczdytG4t6IEb9M6AHDxf6kutRH66Rxunsf4fx7taO8DUCbKXZYI3B\nl4G/34N/N8LRL32oB7yIawH+fDuiAS5RH9Jz70BQyJMnIFq47wbhN5imIcpxmNTt/y5tu8LPcvf8\nd+31n6VpiQhlxz1b3HMm6hnj1+nSue8w4XBEhG9/oHbp+WaMYyxy5yjDlygd3bM7rIufzx2DmSvb\nfXcjCnH3jMvLOOTaFVdHuJ9curzfhRCRtB2PtcI1nm//puh2jVopbAYAACaLSURBVMaGO8gvh2M6\nNwxWWv2oxycHvn69rmXpbL+utESk282MrnGS9sERVw5YR+kQ7pekv6P6L67uqPNpIjz6Az549Euo\nCwtEN0E3yEoZgjXDEQmIB7976ROnB21xAdEgJRCVrGXF1dGm8/QpOOWVuv3fpX0BoP3+cyM8gPrP\n+bRExGHLs9ivg2cQ5CDqGevycI363POKZzO6cN6d49sfYxizfMLh8lBmHIHhWjgf5VIX4ref83Hi\nt89/oY9LT51hndA3PMa5/PSLa3dcP7i0eb7jW5ih1KQd74MHCGEJW0vCLA0w/TQA1Q1MV35S/UhP\nea4Dwp3U7Rp503a2X17UPwn1O11oty/dbmY/XfiYgS6NKTWc3/+NDn6fOl3TflMGZeURBlgG1iQS\nJh/h30nKSJPGTX8kzZM2fa9yaR+DYFmEwREc7iusJpJoBPi/KIKs1en/DkILGUkj/mAbfq6lKacf\naf1nMAP+oIhPsBxJKqPthRKRMhRUmeUhwIBU5Fs3/6w+qUpKRMgTJntZW530IR9FOnwLSdb64/Ll\nsXCga56Bi7oJvw1BSDtYxLWn23n0hRByf0Xh3C3vIFxLQ5aT4FGH/7ssfY1VwU1rJHmbT4JF1jT+\nixTPI//ll5dDpyfPF9IOgtA/7hkOJmXKKAoPKpMMIAJXXHGFjXzKio0iBe/05557zgZ5Y437L37x\ni2HF/8Vf/IUhWixLnj/ykY8MW/I2LGHKHxs2bLDr93utBHDxQ4KBe
UQNxPLgfJEhy6MCl42ouMeJ\nrGWAybnnnmumT59u5s+fX2i7eqhsWJIcvOkaljWyZYPk9wjcfPPNNvzAjTfeWCgkVf3f8f9EAL6A\n8KZuDytSXETSgFCVGnujm3K+Ht3ScS2wDGSKl9Sr3Lpd9zEpvc1lshyVXW8E2KiqqA24qm4p0wKz\nZ8+2/gq9dOn1lt7req/y/etYnPJYM/yy0lpV8N3ACpJ0qsqvq6hjdOYe41MUDkXpVkU5YMAqrdGj\nR7cGjyz+IT72zhpR9lu3X2fUse8bEpCCjjXAP677FFJUu7Kc861V/bAAaWomSy+1JA8PRQYqBoum\nC235q7/6K/uQh0jEkYm48377KauIKSvqoqwihfKStIE5ewb/ItpRhP7oU/RUYBF69aMM+oA+4+P6\nAyyqJIhFtjtvW/B1cYN9UVO0WduHH4TvF+H0wsEzyn8vaz11z0c/0HampPxpqrL01tRMgPYgC9Mz\nSNFm4n5j+vDDD5tbbrllWKRDoqU6IQCQm26JmpJx6dx3t+kblybu2wUpK3O/mG6b5tGnRHRcu3Zt\np81xuvbzPHqxWRtTZ20OY8+9409TRG1uyLTVI488MmxH8X72RVF1cR9OnTp1WHuLKlvlDA4CIiKD\n09eRLXXzu8GbWq0GrUhlu5w89NBDrQ/EaaedFpkKchBMRdldKknAXjO9CEmWsO/gSV39GGij/Ebq\nSkJcpzgy0vT7zbXHffukN8m9xT0CQSFfr/vQ1VHH79NPP90cffTR5tJLL62jetKpIQiIiDSko8pU\nk8GBEL9NfZhgDbnmmmvM5s2bY2EKk4okb60UFs4XW0FwIYoYdEtfxDWf+OAEedddd5mNGzfWmlTW\nnSwl6Rf6Opgm6yTNYv2iv9gjhNDgTRT+N7CGtI1UNrEvmq6ziEjTe7AA/RnMeIvDnNy0tzPeLI86\n6iizcOFCc/zxx0eiQfuQbm2LG1gon/y9LBw8lKNM8JEKFXwSHbH2/PM//7N5+umne+pacPWZimP3\n0MCHpTGracCYAddJEX1NmZSzZs0au+rEld2Ub1lDmtJT9ddTRKT+fdQXDdnxeMuWLY17O0uidxqr\nhgObPE7+93//13z0ox+NtDK4ASrLG7ErP++3G9BuvfVWEzc1lbeOovND7sDs3nvvjSWQRdeZtjz/\nHsDHqBcZTVs+6fEVWbRoUe2tWOG2Ob27WSHDefRbCMQhICISh8yAnWcww7IwZ84cU3RckbKgZKDA\nNMx3nLWDa3lJAthE+ZcwmHKtjAEqDWZMdbCV+tKlS9Nkqzytm1Kry1QS/ek7mea9b5ICjGUhWHnS\nGOsQ1kMsWk215CTtF6XrHwIiIv3DuvY18YDBVIwJuurBtRdYSawADCxIHEnpVYd/nfooD1z4JmDb\nu9/9bhPEg6hsSgb93KDQjYz57ajbcdXmfXBzksTJ1KUt8pv7CdLTBIsW/wdTpkyxLwBN9Skrsu9U\nVjEIiIgUg2NrSuEt9ZJLLqn1Ekv3MAwCIHVddlyENcTvWEdseGv2fQTi/Ev8vGUdVz2Q520XfdRP\nh8cq+6obVi4CLkt6ua/rKjNnzrTWt6Y62NYV10HXS0Rk0O+AiPbXeVWDIyF77rlnV3+WokkIMFE3\nUzS9pq78t+yyfAvQx1lDmr5qoUwyRZ8V7WQK9mXIv//7v9upUVbS1NEiWefnQhn9oTL7h4CISP+w\nblRN7qGzePHi2jwUHQkByG7BupzloogpGddplEn9DBBpSE54ICzS/E8fsV9P0/dxcVYR3z/D4Z7l\nu19EMItucXnc/bV169ZaWiTd86Db/11c23ReCPRCQESkF0IDfJ2HT10iYfKg/vSnP232228/u8rA\nRUmN6p40RCEqf/gclgfqc8QGcoE+Wd5ayecPuP4UT7jebr/Rgby01enVLX3drxGQrtsS7G76hzHt\nl5NpN53SXEN/xPWjmx6tg88I9
xk+IYhIiIVBf0pA4E//XyAllKsiW4BAsNmRHfhZTbPvvvsaBosq\n5MEHH7R+BGeccYYdrN75znfGqlEGCWGA2GuvvTp1Uj9Levnupksng3cAocEq4j7btm0zP/7xjy2x\n+fWvfz2sHi/biMNvf/vbxr09j7jYwBPvete7zLPPPmu453oJg+Mrr7xiMWMQZ/oLUuYw7ZW/TtfD\nJATdDjzwQPu/NmvWLPPb3/7WfPzjH69EZf6XWA7O0vXbb789cvl6JYqp0tYhIItI67q0+AZhETjz\nzDPN/vvvb4gG6d7ciq9peIkMOCwnfuyxx6wVhCWO3awQUQ/14SWm+8WDuJvFomjSQ3t9f4Zu0zhY\nq5ocDTfcE/QdlgzfWuSn8Z1My/S78ess+7jX/cp1rIDIggULci9DT9oe7sOvfvWrlnxcd911PX2i\nkpardEIgFoGydtNTue1CgF1fb7rpJrsjIzupBgNGaQ10dQWEZ2jGjBmdHWw5HwzUsfUm2ZU2NrN3\ngXqSlpU0nVd84kMwpnz3QS8n7HjaDQuXrknffptcH0S1vUltitOVvk36P8T/Hf8LZf/foWvgjG3r\nmjZtWmL94tqo80IgKQImaUKlEwIgwMOTB2LAbO13kYMhZbuHLg/CqEHeDVDh3ohKG06T5Dc6pGkT\n6fn0Q9CLdr700ksW/37U2c86zj333KGrrrrKtjFNH/RTxyLqynLPcN/7/3dF3e+0h7L5v4MI8lm/\nfn0RzVQZQiAxAn8SayrRBSEQgQDTMjfeeGPHhE6ERT5M2TBVkVYwuRMumjKYiti+fbuN2Eicgiin\nQ3wsOE9dmJARTNjkzSvognSb/gnXAR7o4XQJXy/yN3rR9t/97ncmIGpFFl2Lsg4//HDzZ3/2Z7aN\nafqgFsonVKLXdExcMdz37v+OKTn8R/DZYouDLP936IFTLHFBmOrC52bJkiV248i4PZvidNN5IZAX\nAfmI5EVQ+Q3BmPDjCN6k7H41DJJjx461PgzAwxJTHnY7duywaO3atcume/755+3vyZMnm1NOOcVM\nmjQplUMcxIEHdPCGGUlabOEJ//Aw7+YP0qsY8kcRp175slyHuL366qtdg7llKbfqPGCIL0Rbg2Vl\nJSFx/cL/HRF+2eiQD5sIsqpswoQJnSzjx483r7/+uv0d9393xBFH9M3vq6OYDoSAh4CIiAeGDvMj\ngGUgMKvbFR08+BCsHOyF4j8gJ06caK0YWBTyyDe/+U3zvve9L5UVw6/P6ZuXRFAOA00/3uSxPiFt\nC7HdZiJSNAnx72GO3X3MSqpu/3cQk7333rsv92lYR/0WAnEIiIjEIaPztUfAPdyxikB+0pIJ8vMA\nL4o8YKGBWKFPmdJWIkJfYDkLJpbLhK/vZbv7NC/p7rviqlAI9AkB+Yj0CWhVUzwCTMm4gR8Swhs1\ng1kSyeIP0qtcCA2ESJINgbIJXDat8uVy95lISD4clbvdCIiItLt/W9u6KJ8MyAhvn+4NNK7x5GVg\nKGNwcIQorm6dHxwEnIWsjPtscFBUSwcBARGRQejllrURohG3SsZNs7g3Ub/pWEscgSnz7RvdepEh\nXy8d/x4BMGvLoO1ISJn3me4bIdAWBERE2tKTA9QONyUT12Rn7YB0OGGQ45PWj8TlT/NN/ZCepNNE\nacpuc1r6FSfmpotISNN7UPr3G4F39LtC1ddeBBh48ZFgWa5bKhjVWre0Fw9+9tVI8xbsLBpR5frn\neBN10yR/+qd/av76r/+6MKdUv564YywzSXWNKyPuPMuhN27cGHe5seeDwFqN1d0pLhLikNC3EEiO\ngIhIcqyUMgIBrAxPPvmkDUrmYhkcdthhNobI3LlzI3IYu7SXJb2LFy82q1atMkE0Rxugi03t3NRK\nVEbqipuSiUrPOVZh/OY3v4m7XOp54pIwMHVrUxYFxo0bZ/HOkrfOeYh3wb3QVBEJaWrPSe+qEdD
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 67, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 70, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 71, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 71, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 79, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "review_counter = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 80, + "metadata": { + "collapsed": true + }, + 
"outputs": [], + "source": [ + "for word in reviews[0].split(\" \"):\n", + " review_counter[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 81, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('.', 27),\n", + " ('', 18),\n", + " ('the', 9),\n", + " ('to', 6),\n", + " ('i', 5),\n", + " ('high', 5),\n", + " ('is', 4),\n", + " ('of', 4),\n", + " ('a', 4),\n", + " ('bromwell', 4),\n", + " ('teachers', 4),\n", + " ('that', 4),\n", + " ('their', 2),\n", + " ('my', 2),\n", + " ('at', 2),\n", + " ('as', 2),\n", + " ('me', 2),\n", + " ('in', 2),\n", + " ('students', 2),\n", + " ('it', 2),\n", + " ('student', 2),\n", + " ('school', 2),\n", + " ('through', 1),\n", + " ('insightful', 1),\n", + " ('ran', 1),\n", + " ('years', 1),\n", + " ('here', 1),\n", + " ('episode', 1),\n", + " ('reality', 1),\n", + " ('what', 1),\n", + " ('far', 1),\n", + " ('t', 1),\n", + " ('saw', 1),\n", + " ('s', 1),\n", + " ('repeatedly', 1),\n", + " ('isn', 1),\n", + " ('closer', 1),\n", + " ('and', 1),\n", + " ('fetched', 1),\n", + " ('remind', 1),\n", + " ('can', 1),\n", + " ('welcome', 1),\n", + " ('line', 1),\n", + " ('your', 1),\n", + " ('survive', 1),\n", + " ('teaching', 1),\n", + " ('satire', 1),\n", + " ('classic', 1),\n", + " ('who', 1),\n", + " ('age', 1),\n", + " ('knew', 1),\n", + " ('schools', 1),\n", + " ('inspector', 1),\n", + " ('comedy', 1),\n", + " ('down', 1),\n", + " ('about', 1),\n", + " ('pity', 1),\n", + " ('m', 1),\n", + " ('all', 1),\n", + " ('adults', 1),\n", + " ('see', 1),\n", + " ('think', 1),\n", + " ('situation', 1),\n", + " ('time', 1),\n", + " ('pomp', 1),\n", + " ('lead', 1),\n", + " ('other', 1),\n", + " ('much', 1),\n", + " ('many', 1),\n", + " ('which', 1),\n", + " ('one', 1),\n", + " ('profession', 1),\n", + " ('programs', 1),\n", + " ('same', 1),\n", + " ('some', 1),\n", + " ('such', 1),\n", + " ('pettiness', 1),\n", + " ('immediately', 1),\n", + " ('expect', 1),\n", + " 
('financially', 1),\n", + " ('recalled', 1),\n", + " ('tried', 1),\n", + " ('whole', 1),\n", + " ('right', 1),\n", + " ('life', 1),\n", + " ('cartoon', 1),\n", + " ('scramble', 1),\n", + " ('sack', 1),\n", + " ('believe', 1),\n", + " ('when', 1),\n", + " ('than', 1),\n", + " ('burn', 1),\n", + " ('pathetic', 1)]" + ] + }, + "execution_count": 81, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review_counter.most_common()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 4: Reducing Noise in our Input Data" + ] + }, + { + "cell_type": "code", + "execution_count": 82, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " # set our random number generator \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews, labels)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self, reviews, labels):\n", + " \n", + " review_vocab = set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def 
init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " \n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " self.layer_0[0][self.word2index[word]] = 1\n", + " \n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def train(self, training_reviews, training_labels):\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + " self.update_input_layer(review)\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward 
pass ###\n", + "\n", + " # TODO: Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error is the difference between desired target and actual output.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # TODO: Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # TODO: Update the weights\n", + " self.weights_1_2 -= layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " self.weights_0_1 -= self.layer_0.T.dot(layer_1_delta) * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / float(i+1))[:4] + \"%\")\n", + " if(i % 2500 == 0):\n", + " print(\"\")\n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \"% #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 
100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + " self.update_input_layer(review.lower())\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] > 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 83, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 84, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):91.50 #Correct:1795 #Trained:2501 Training Accuracy:71.7%\n", + "Progress:20.8% Speed(reviews/sec):95.25 #Correct:3811 #Trained:5001 Training Accuracy:76.2%\n", + "Progress:31.2% Speed(reviews/sec):93.74 #Correct:5898 #Trained:7501 Training Accuracy:78.6%\n", + "Progress:41.6% Speed(reviews/sec):93.69 #Correct:8042 #Trained:10001 Training Accuracy:80.4%\n", + "Progress:52.0% Speed(reviews/sec):95.27 #Correct:10186 #Trained:12501 Training Accuracy:81.4%\n", + "Progress:62.5% Speed(reviews/sec):98.19 #Correct:12317 #Trained:15001 Training Accuracy:82.1%\n", + "Progress:72.9% Speed(reviews/sec):98.56 #Correct:14440 #Trained:17501 Training Accuracy:82.5%\n", + "Progress:83.3% Speed(reviews/sec):99.74 #Correct:16613 #Trained:20001 Training Accuracy:83.0%\n", + "Progress:93.7% Speed(reviews/sec):100.7 #Correct:18794 #Trained:22501 Training Accuracy:83.5%\n", + "Progress:99.9% Speed(reviews/sec):101.9 #Correct:20115 #Trained:24000 Training Accuracy:83.8%" + ] + } + ], + "source": [ + 
"mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 85, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):832.7% #Correct:851 #Tested:1000 Testing Accuracy:85.1%" + ] + } + ], + "source": [ + "# evaluate the trained model on the 1,000 held-out reviews\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 5).ipynb b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 5).ipynb new file mode 100644 index 0000000..33a506e --- /dev/null +++ b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 5).ipynb @@ -0,0 +1,5997 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sentiment Classification & How To \"Frame Problems\" for a Neural Network\n", + "\n", + "by Andrew Trask\n", + "\n", + "- **Twitter**: @iamtrask\n", + "- **Blog**: http://iamtrask.github.io" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### What You Should Already Know\n", + "\n", + "- neural networks, forward and back-propagation\n", + "- stochastic gradient descent\n", + "- mean squared error\n", + "- and train/test splits\n", + "\n", + "### Where to Get Help if You Need it\n", + "- Re-watch previous Udacity 
Lectures\n", + "- Leverage the recommended Course Reading Material - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) (40% Off: **traskud17**)\n", + "- Shoot me a tweet @iamtrask\n", + "\n", + "\n", + "### Tutorial Outline:\n", + "\n", + "- Intro: The Importance of \"Framing a Problem\"\n", + "\n", + "\n", + "- Curate a Dataset\n", + "- Developing a \"Predictive Theory\"\n", + "- **PROJECT 1**: Quick Theory Validation\n", + "\n", + "\n", + "- Transforming Text to Numbers\n", + "- **PROJECT 2**: Creating the Input/Output Data\n", + "\n", + "\n", + "- Putting it all together in a Neural Network\n", + "- **PROJECT 3**: Building our Neural Network\n", + "\n", + "\n", + "- Understanding Neural Noise\n", + "- **PROJECT 4**: Making Learning Faster by Reducing Noise\n", + "\n", + "\n", + "- Analyzing Inefficiencies in our Network\n", + "- **PROJECT 5**: Making our Network Train and Run Faster\n", + "\n", + "\n", + "- Further Noise Reduction\n", + "- **PROJECT 6**: Reducing Noise by Strategically Reducing the Vocabulary\n", + "\n", + "\n", + "- Analysis: What's going on in the weights?" 
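The input-layer scheme the Project 4 network above uses (zero out `layer_0`, then mark each in-vocabulary word with a 1 rather than accumulating raw counts) can be sketched as a standalone snippet. The names `word2index` and `layer_0` mirror the notebook, but the tiny five-word vocabulary here is invented purely for illustration:

```python
import numpy as np

# Toy vocabulary for illustration (the notebook builds word2index
# from the full IMDB review vocabulary instead).
vocab = ["the", "movie", "was", "great", "terrible"]
word2index = {word: i for i, word in enumerate(vocab)}

layer_0 = np.zeros((1, len(vocab)))

def update_input_layer(review):
    """Reset layer_0, then mark word *presence*: Project 4's noise fix
    sets 1 per seen word instead of a count per occurrence."""
    layer_0[:] = 0  # clear out previous state
    for word in review.split(" "):
        if word in word2index:  # ignore out-of-vocabulary tokens
            layer_0[0][word2index[word]] = 1

update_input_layer("the movie was great great great")
print(layer_0)  # 'great' contributes 1, not 3
```

With raw counts, high-frequency filler tokens such as `''`, `'.'`, and `'the'` (which dominate the `review_counter.most_common()` output above) swamp the informative words; binary presence removes that source of noise from the input layer.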
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbpresent": { + "id": "56bb3cba-260c-4ebe-9ed6-b995b4c72aa3" + } + }, + "source": [ + "# Lesson: Curate a Dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "eba2b193-0419-431e-8db9-60f34dd3fe83" + } + }, + "outputs": [], + "source": [ + "def pretty_print_review_and_label(i):\n", + " print(labels[i] + \"\\t:\\t\" + reviews[i][:80] + \"...\")\n", + "\n", + "g = open('reviews.txt','r') # What we know!\n", + "reviews = list(map(lambda x:x[:-1],g.readlines()))\n", + "g.close()\n", + "\n", + "g = open('labels.txt','r') # What we WANT to know!\n", + "labels = list(map(lambda x:x[:-1].upper(),g.readlines()))\n", + "g.close()" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "25000" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "len(reviews)" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "bb95574b-21a0-4213-ae50-34363cf4f87f" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy . it ran at the same time as some other programs about school life such as teachers . my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers . the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students . when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled . . . . . . . . . at . . . . . . . . . . high . a classic line inspector i m here to sack one of your teachers . student welcome to bromwell high . 
i expect that many adults of my age think that bromwell high is far fetched . what a pity that it isn t '" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "reviews[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e0408810-c424-4ed4-afb9-1735e9ddbd0a" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Lesson: Develop a Predictive Theory" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e67a709f-234f-4493-bae6-4fb192141ee0" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "labels.txt \t : \t reviews.txt\n", + "\n", + "NEGATIVE\t:\tthis movie is terrible but it has some good effects . ...\n", + "POSITIVE\t:\tadrian pasdar is excellent is this film . he makes a fascinating woman . ...\n", + "NEGATIVE\t:\tcomment this movie is impossible . is terrible very improbable bad interpretat...\n", + "POSITIVE\t:\texcellent episode movie ala pulp fiction . days suicides . it doesnt get more...\n", + "NEGATIVE\t:\tif you haven t seen this it s terrible . it is pure trash . 
i saw this about ...\n", + "POSITIVE\t:\tthis schiffer guy is a real genius the movie is of excellent quality and both e...\n" + ] + } + ], + "source": [ + "print(\"labels.txt \\t : \\t reviews.txt\\n\")\n", + "pretty_print_review_and_label(2137)\n", + "pretty_print_review_and_label(12816)\n", + "pretty_print_review_and_label(6267)\n", + "pretty_print_review_and_label(21934)\n", + "pretty_print_review_and_label(5297)\n", + "pretty_print_review_and_label(4998)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 1: Quick Theory Validation" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from collections import Counter\n", + "import numpy as np" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "positive_counts = Counter()\n", + "negative_counts = Counter()\n", + "total_counts = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for i in range(len(reviews)):\n", + " if(labels[i] == 'POSITIVE'):\n", + " for word in reviews[i].split(\" \"):\n", + " positive_counts[word] += 1\n", + " total_counts[word] += 1\n", + " else:\n", + " for word in reviews[i].split(\" \"):\n", + " negative_counts[word] += 1\n", + " total_counts[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('', 550468),\n", + " ('the', 173324),\n", + " ('.', 159654),\n", + " ('and', 89722),\n", + " ('a', 83688),\n", + " ('of', 76855),\n", + " ('to', 66746),\n", + " ('is', 57245),\n", + " ('in', 50215),\n", + " ('br', 49235),\n", + " ('it', 48025),\n", + " ('i', 40743),\n", + " ('that', 35630),\n", + " ('this', 35080),\n", + " ('s', 33815),\n", + " ('as', 26308),\n", + " ('with', 23247),\n", + " 
('for', 22416),\n", + " ('was', 21917),\n", + " ('film', 20937),\n", + " ('but', 20822),\n", + " ('movie', 19074),\n", + " ('his', 17227),\n", + " ('on', 17008),\n", + " ('you', 16681),\n", + " ('he', 16282),\n", + " ('are', 14807),\n", + " ('not', 14272),\n", + " ('t', 13720),\n", + " ('one', 13655),\n", + " ('have', 12587),\n", + " ('be', 12416),\n", + " ('by', 11997),\n", + " ('all', 11942),\n", + " ('who', 11464),\n", + " ('an', 11294),\n", + " ('at', 11234),\n", + " ('from', 10767),\n", + " ('her', 10474),\n", + " ('they', 9895),\n", + " ('has', 9186),\n", + " ('so', 9154),\n", + " ('like', 9038),\n", + " ('about', 8313),\n", + " ('very', 8305),\n", + " ('out', 8134),\n", + " ('there', 8057),\n", + " ('she', 7779),\n", + " ('what', 7737),\n", + " ('or', 7732),\n", + " ('good', 7720),\n", + " ('more', 7521),\n", + " ('when', 7456),\n", + " ('some', 7441),\n", + " ('if', 7285),\n", + " ('just', 7152),\n", + " ('can', 7001),\n", + " ('story', 6780),\n", + " ('time', 6515),\n", + " ('my', 6488),\n", + " ('great', 6419),\n", + " ('well', 6405),\n", + " ('up', 6321),\n", + " ('which', 6267),\n", + " ('their', 6107),\n", + " ('see', 6026),\n", + " ('also', 5550),\n", + " ('we', 5531),\n", + " ('really', 5476),\n", + " ('would', 5400),\n", + " ('will', 5218),\n", + " ('me', 5167),\n", + " ('had', 5148),\n", + " ('only', 5137),\n", + " ('him', 5018),\n", + " ('even', 4964),\n", + " ('most', 4864),\n", + " ('other', 4858),\n", + " ('were', 4782),\n", + " ('first', 4755),\n", + " ('than', 4736),\n", + " ('much', 4685),\n", + " ('its', 4622),\n", + " ('no', 4574),\n", + " ('into', 4544),\n", + " ('people', 4479),\n", + " ('best', 4319),\n", + " ('love', 4301),\n", + " ('get', 4272),\n", + " ('how', 4213),\n", + " ('life', 4199),\n", + " ('been', 4189),\n", + " ('because', 4079),\n", + " ('way', 4036),\n", + " ('do', 3941),\n", + " ('made', 3823),\n", + " ('films', 3813),\n", + " ('them', 3805),\n", + " ('after', 3800),\n", + " ('many', 3766),\n", + " ('two', 3733),\n", + 
" ('too', 3659),\n", + " ('think', 3655),\n", + " ('movies', 3586),\n", + " ('characters', 3560),\n", + " ('character', 3514),\n", + " ('don', 3468),\n", + " ('man', 3460),\n", + " ('show', 3432),\n", + " ('watch', 3424),\n", + " ('seen', 3414),\n", + " ('then', 3358),\n", + " ('little', 3341),\n", + " ('still', 3340),\n", + " ('make', 3303),\n", + " ('could', 3237),\n", + " ('never', 3226),\n", + " ('being', 3217),\n", + " ('where', 3173),\n", + " ('does', 3069),\n", + " ('over', 3017),\n", + " ('any', 3002),\n", + " ('while', 2899),\n", + " ('know', 2833),\n", + " ('did', 2790),\n", + " ('years', 2758),\n", + " ('here', 2740),\n", + " ('ever', 2734),\n", + " ('end', 2696),\n", + " ('these', 2694),\n", + " ('such', 2590),\n", + " ('real', 2568),\n", + " ('scene', 2567),\n", + " ('back', 2547),\n", + " ('those', 2485),\n", + " ('though', 2475),\n", + " ('off', 2463),\n", + " ('new', 2458),\n", + " ('your', 2453),\n", + " ('go', 2440),\n", + " ('acting', 2437),\n", + " ('plot', 2432),\n", + " ('world', 2429),\n", + " ('scenes', 2427),\n", + " ('say', 2414),\n", + " ('through', 2409),\n", + " ('makes', 2390),\n", + " ('better', 2381),\n", + " ('now', 2368),\n", + " ('work', 2346),\n", + " ('young', 2343),\n", + " ('old', 2311),\n", + " ('ve', 2307),\n", + " ('find', 2272),\n", + " ('both', 2248),\n", + " ('before', 2177),\n", + " ('us', 2162),\n", + " ('again', 2158),\n", + " ('series', 2153),\n", + " ('quite', 2143),\n", + " ('something', 2135),\n", + " ('cast', 2133),\n", + " ('should', 2121),\n", + " ('part', 2098),\n", + " ('always', 2088),\n", + " ('lot', 2087),\n", + " ('another', 2075),\n", + " ('actors', 2047),\n", + " ('director', 2040),\n", + " ('family', 2032),\n", + " ('between', 2016),\n", + " ('own', 2016),\n", + " ('m', 1998),\n", + " ('may', 1997),\n", + " ('same', 1972),\n", + " ('role', 1967),\n", + " ('watching', 1966),\n", + " ('every', 1954),\n", + " ('funny', 1953),\n", + " ('doesn', 1935),\n", + " ('performance', 1928),\n", + " ('few', 
1918),\n", + " ('bad', 1907),\n", + " ('look', 1900),\n", + " ('re', 1884),\n", + " ('why', 1855),\n", + " ('things', 1849),\n", + " ('times', 1832),\n", + " ('big', 1815),\n", + " ('however', 1795),\n", + " ('actually', 1790),\n", + " ('action', 1789),\n", + " ('going', 1783),\n", + " ('bit', 1757),\n", + " ('comedy', 1742),\n", + " ('down', 1740),\n", + " ('music', 1738),\n", + " ('must', 1728),\n", + " ('take', 1709),\n", + " ('saw', 1692),\n", + " ('long', 1690),\n", + " ('right', 1688),\n", + " ('fun', 1686),\n", + " ('fact', 1684),\n", + " ('excellent', 1683),\n", + " ('around', 1674),\n", + " ('didn', 1672),\n", + " ('without', 1671),\n", + " ('thing', 1662),\n", + " ('thought', 1639),\n", + " ('got', 1635),\n", + " ('each', 1630),\n", + " ('day', 1614),\n", + " ('feel', 1597),\n", + " ('seems', 1596),\n", + " ('come', 1594),\n", + " ('done', 1586),\n", + " ('beautiful', 1580),\n", + " ('especially', 1572),\n", + " ('played', 1571),\n", + " ('almost', 1566),\n", + " ('want', 1562),\n", + " ('yet', 1556),\n", + " ('give', 1553),\n", + " ('pretty', 1549),\n", + " ('last', 1543),\n", + " ('since', 1519),\n", + " ('different', 1504),\n", + " ('although', 1501),\n", + " ('gets', 1490),\n", + " ('true', 1487),\n", + " ('interesting', 1481),\n", + " ('job', 1470),\n", + " ('enough', 1455),\n", + " ('our', 1454),\n", + " ('shows', 1447),\n", + " ('horror', 1441),\n", + " ('woman', 1439),\n", + " ('tv', 1400),\n", + " ('probably', 1398),\n", + " ('father', 1395),\n", + " ('original', 1393),\n", + " ('girl', 1390),\n", + " ('point', 1379),\n", + " ('plays', 1378),\n", + " ('wonderful', 1372),\n", + " ('far', 1358),\n", + " ('course', 1358),\n", + " ('john', 1350),\n", + " ('rather', 1340),\n", + " ('isn', 1328),\n", + " ('ll', 1326),\n", + " ('later', 1324),\n", + " ('dvd', 1324),\n", + " ('war', 1310),\n", + " ('whole', 1310),\n", + " ('d', 1307),\n", + " ('away', 1306),\n", + " ('found', 1306),\n", + " ('screen', 1305),\n", + " ('nothing', 1300),\n", + " ('year', 
1297),\n", + " ('once', 1296),\n", + " ('hard', 1294),\n", + " ('together', 1280),\n", + " ('am', 1277),\n", + " ('set', 1277),\n", + " ('having', 1266),\n", + " ('making', 1265),\n", + " ('place', 1263),\n", + " ('comes', 1260),\n", + " ('might', 1260),\n", + " ('sure', 1253),\n", + " ('american', 1248),\n", + " ('play', 1245),\n", + " ('kind', 1244),\n", + " ('takes', 1242),\n", + " ('perfect', 1242),\n", + " ('performances', 1237),\n", + " ('himself', 1230),\n", + " ('worth', 1221),\n", + " ('everyone', 1221),\n", + " ('anyone', 1214),\n", + " ('actor', 1203),\n", + " ('three', 1201),\n", + " ('wife', 1196),\n", + " ('classic', 1192),\n", + " ('goes', 1186),\n", + " ('ending', 1178),\n", + " ('version', 1168),\n", + " ('star', 1149),\n", + " ('enjoy', 1146),\n", + " ('book', 1142),\n", + " ('nice', 1132),\n", + " ('everything', 1128),\n", + " ('during', 1124),\n", + " ('put', 1118),\n", + " ('seeing', 1111),\n", + " ('least', 1102),\n", + " ('house', 1100),\n", + " ('high', 1095),\n", + " ('watched', 1094),\n", + " ('men', 1087),\n", + " ('loved', 1087),\n", + " ('night', 1082),\n", + " ('anything', 1075),\n", + " ('guy', 1071),\n", + " ('believe', 1071),\n", + " ('top', 1063),\n", + " ('amazing', 1058),\n", + " ('hollywood', 1056),\n", + " ('looking', 1053),\n", + " ('main', 1044),\n", + " ('definitely', 1043),\n", + " ('gives', 1031),\n", + " ('home', 1029),\n", + " ('seem', 1028),\n", + " ('episode', 1023),\n", + " ('sense', 1020),\n", + " ('audience', 1020),\n", + " ('truly', 1017),\n", + " ('special', 1011),\n", + " ('fan', 1009),\n", + " ('second', 1009),\n", + " ('short', 1009),\n", + " ('mind', 1005),\n", + " ('human', 1001),\n", + " ('recommend', 999),\n", + " ('full', 996),\n", + " ('black', 995),\n", + " ('help', 991),\n", + " ('along', 989),\n", + " ('trying', 987),\n", + " ('small', 986),\n", + " ('death', 985),\n", + " ('friends', 981),\n", + " ('remember', 974),\n", + " ('often', 970),\n", + " ('said', 966),\n", + " ('favorite', 962),\n", + " 
('heart', 959),\n", + " ('early', 957),\n", + " ('left', 956),\n", + " ('until', 955),\n", + " ('let', 954),\n", + " ('script', 954),\n", + " ('maybe', 937),\n", + " ('today', 936),\n", + " ('live', 934),\n", + " ('less', 934),\n", + " ('moments', 933),\n", + " ('others', 929),\n", + " ('brilliant', 926),\n", + " ('shot', 925),\n", + " ('liked', 923),\n", + " ('become', 916),\n", + " ('won', 915),\n", + " ('used', 910),\n", + " ('style', 907),\n", + " ('mother', 895),\n", + " ('lives', 894),\n", + " ('came', 893),\n", + " ('stars', 890),\n", + " ('cinema', 889),\n", + " ('looks', 885),\n", + " ('perhaps', 884),\n", + " ('read', 882),\n", + " ('enjoyed', 879),\n", + " ('boy', 875),\n", + " ('drama', 873),\n", + " ('highly', 871),\n", + " ('given', 870),\n", + " ('playing', 867),\n", + " ('use', 864),\n", + " ('next', 859),\n", + " ('women', 858),\n", + " ('fine', 857),\n", + " ('effects', 856),\n", + " ('kids', 854),\n", + " ('entertaining', 853),\n", + " ('need', 852),\n", + " ('line', 850),\n", + " ('works', 848),\n", + " ('someone', 847),\n", + " ('mr', 836),\n", + " ('simply', 835),\n", + " ('children', 833),\n", + " ('picture', 833),\n", + " ('face', 831),\n", + " ('friend', 831),\n", + " ('keep', 831),\n", + " ('dark', 830),\n", + " ('overall', 828),\n", + " ('certainly', 828),\n", + " ('minutes', 827),\n", + " ('wasn', 824),\n", + " ('history', 822),\n", + " ('finally', 820),\n", + " ('couple', 816),\n", + " ('against', 815),\n", + " ('son', 809),\n", + " ('understand', 808),\n", + " ('lost', 807),\n", + " ('michael', 805),\n", + " ('else', 801),\n", + " ('throughout', 798),\n", + " ('fans', 797),\n", + " ('city', 792),\n", + " ('reason', 789),\n", + " ('written', 787),\n", + " ('production', 787),\n", + " ('several', 784),\n", + " ('school', 783),\n", + " ('rest', 781),\n", + " ('based', 781),\n", + " ('try', 780),\n", + " ('dead', 776),\n", + " ('hope', 775),\n", + " ('strong', 768),\n", + " ('white', 765),\n", + " ('tell', 759),\n", + " ('itself', 
758),\n", + " ('half', 753),\n", + " ('person', 749),\n", + " ('sometimes', 746),\n", + " ('past', 744),\n", + " ('start', 744),\n", + " ('genre', 743),\n", + " ('final', 739),\n", + " ('beginning', 739),\n", + " ('town', 738),\n", + " ('art', 734),\n", + " ('game', 732),\n", + " ('humor', 732),\n", + " ('yes', 731),\n", + " ('idea', 731),\n", + " ('late', 730),\n", + " ('becomes', 729),\n", + " ('despite', 729),\n", + " ('able', 726),\n", + " ('case', 726),\n", + " ('money', 723),\n", + " ('child', 721),\n", + " ('completely', 721),\n", + " ('side', 719),\n", + " ('camera', 716),\n", + " ('getting', 714),\n", + " ('instead', 712),\n", + " ('soon', 702),\n", + " ('under', 700),\n", + " ('viewer', 699),\n", + " ('age', 697),\n", + " ('days', 696),\n", + " ('stories', 696),\n", + " ('felt', 694),\n", + " ('simple', 694),\n", + " ('roles', 693),\n", + " ('video', 688),\n", + " ('name', 683),\n", + " ('either', 683),\n", + " ('doing', 677),\n", + " ('turns', 674),\n", + " ('wants', 671),\n", + " ('close', 671),\n", + " ('title', 669),\n", + " ('wrong', 668),\n", + " ('went', 666),\n", + " ('james', 665),\n", + " ('evil', 659),\n", + " ('budget', 657),\n", + " ('episodes', 657),\n", + " ('relationship', 655),\n", + " ('piece', 653),\n", + " ('fantastic', 653),\n", + " ('david', 651),\n", + " ('turn', 648),\n", + " ('murder', 646),\n", + " ('parts', 645),\n", + " ('brother', 644),\n", + " ('head', 643),\n", + " ('absolutely', 643),\n", + " ('experience', 642),\n", + " ('eyes', 641),\n", + " ('sex', 638),\n", + " ('direction', 637),\n", + " ('called', 637),\n", + " ('directed', 636),\n", + " ('lines', 634),\n", + " ('behind', 633),\n", + " ('sort', 632),\n", + " ('actress', 631),\n", + " ('lead', 630),\n", + " ('oscar', 628),\n", + " ('example', 627),\n", + " ('including', 627),\n", + " ('known', 625),\n", + " ('musical', 625),\n", + " ('chance', 621),\n", + " ('score', 620),\n", + " ('feeling', 619),\n", + " ('already', 619),\n", + " ('hit', 619),\n", + " ('voice', 
615),\n", + " ('moment', 612),\n", + " ('living', 612),\n", + " ('low', 610),\n", + " ('supporting', 610),\n", + " ('ago', 609),\n", + " ('themselves', 608),\n", + " ('hilarious', 605),\n", + " ('reality', 605),\n", + " ('jack', 604),\n", + " ('told', 603),\n", + " ('hand', 601),\n", + " ('moving', 600),\n", + " ('dialogue', 600),\n", + " ('quality', 600),\n", + " ('song', 599),\n", + " ('happy', 599),\n", + " ('paul', 598),\n", + " ('matter', 598),\n", + " ('light', 594),\n", + " ('future', 593),\n", + " ('entire', 592),\n", + " ('finds', 591),\n", + " ('gave', 589),\n", + " ('laugh', 587),\n", + " ('released', 586),\n", + " ('expect', 584),\n", + " ('fight', 581),\n", + " ('particularly', 580),\n", + " ('cinematography', 579),\n", + " ('police', 579),\n", + " ('whose', 578),\n", + " ('type', 578),\n", + " ('sound', 578),\n", + " ('enjoyable', 573),\n", + " ('view', 573),\n", + " ('husband', 572),\n", + " ('romantic', 572),\n", + " ('number', 572),\n", + " ('daughter', 572),\n", + " ('documentary', 571),\n", + " ('self', 570),\n", + " ('modern', 569),\n", + " ('robert', 569),\n", + " ('took', 569),\n", + " ('superb', 569),\n", + " ('mean', 566),\n", + " ('shown', 563),\n", + " ('coming', 561),\n", + " ('important', 560),\n", + " ('king', 559),\n", + " ('leave', 559),\n", + " ('change', 558),\n", + " ('wanted', 555),\n", + " ('somewhat', 555),\n", + " ('tells', 554),\n", + " ('run', 552),\n", + " ('events', 552),\n", + " ('country', 552),\n", + " ('career', 552),\n", + " ('heard', 550),\n", + " ('season', 550),\n", + " ('girls', 549),\n", + " ('greatest', 549),\n", + " ('etc', 547),\n", + " ('care', 546),\n", + " ('starts', 545),\n", + " ('english', 542),\n", + " ('killer', 541),\n", + " ('animation', 540),\n", + " ('guys', 540),\n", + " ('totally', 540),\n", + " ('tale', 540),\n", + " ('usual', 539),\n", + " ('opinion', 535),\n", + " ('miss', 535),\n", + " ('violence', 531),\n", + " ('easy', 531),\n", + " ('songs', 530),\n", + " ('british', 528),\n", + " ('says', 
526),\n", + " ('realistic', 525),\n", + " ('writing', 524),\n", + " ('act', 522),\n", + " ('writer', 522),\n", + " ('comic', 521),\n", + " ('thriller', 519),\n", + " ('television', 517),\n", + " ('power', 516),\n", + " ('ones', 515),\n", + " ('kid', 514),\n", + " ('novel', 513),\n", + " ('york', 513),\n", + " ('problem', 512),\n", + " ('alone', 512),\n", + " ('attention', 509),\n", + " ('involved', 508),\n", + " ('kill', 507),\n", + " ('extremely', 507),\n", + " ('seemed', 506),\n", + " ('hero', 505),\n", + " ('french', 505),\n", + " ('rock', 504),\n", + " ('stuff', 501),\n", + " ('wish', 499),\n", + " ('begins', 498),\n", + " ('taken', 497),\n", + " ('sad', 497),\n", + " ('ways', 496),\n", + " ('richard', 495),\n", + " ('knows', 494),\n", + " ('atmosphere', 493),\n", + " ('surprised', 491),\n", + " ('similar', 491),\n", + " ('taking', 491),\n", + " ('car', 491),\n", + " ('george', 490),\n", + " ('perfectly', 490),\n", + " ('across', 489),\n", + " ('sequence', 489),\n", + " ('eye', 489),\n", + " ('team', 489),\n", + " ('serious', 488),\n", + " ('powerful', 488),\n", + " ('room', 488),\n", + " ('due', 488),\n", + " ('among', 488),\n", + " ('order', 487),\n", + " ('b', 487),\n", + " ('cannot', 487),\n", + " ('strange', 487),\n", + " ('beauty', 486),\n", + " ('famous', 485),\n", + " ('tries', 484),\n", + " ('myself', 484),\n", + " ('happened', 484),\n", + " ('herself', 484),\n", + " ('class', 483),\n", + " ('four', 482),\n", + " ('cool', 481),\n", + " ('release', 479),\n", + " ('anyway', 479),\n", + " ('theme', 479),\n", + " ('opening', 478),\n", + " ('entertainment', 477),\n", + " ('unique', 475),\n", + " ('ends', 475),\n", + " ('slow', 475),\n", + " ('exactly', 475),\n", + " ('red', 474),\n", + " ('o', 474),\n", + " ('level', 474),\n", + " ('easily', 474),\n", + " ('interest', 472),\n", + " ('happen', 471),\n", + " ('crime', 470),\n", + " ('viewing', 468),\n", + " ('memorable', 467),\n", + " ('sets', 467),\n", + " ('group', 466),\n", + " ('stop', 466),\n", + " 
('dance', 463),\n", + " ('message', 463),\n", + " ('sister', 463),\n", + " ('working', 463),\n", + " ('problems', 463),\n", + " ('knew', 462),\n", + " ('mystery', 461),\n", + " ('nature', 461),\n", + " ('bring', 460),\n", + " ('believable', 459),\n", + " ('thinking', 459),\n", + " ('brought', 459),\n", + " ('mostly', 458),\n", + " ('couldn', 457),\n", + " ('disney', 457),\n", + " ('society', 456),\n", + " ('within', 455),\n", + " ('lady', 455),\n", + " ('blood', 454),\n", + " ('upon', 453),\n", + " ('viewers', 453),\n", + " ('parents', 453),\n", + " ('meets', 452),\n", + " ('form', 452),\n", + " ('soundtrack', 452),\n", + " ('usually', 452),\n", + " ('tom', 452),\n", + " ('peter', 452),\n", + " ('local', 450),\n", + " ('certain', 448),\n", + " ('follow', 448),\n", + " ('whether', 447),\n", + " ('possible', 446),\n", + " ('emotional', 445),\n", + " ('killed', 444),\n", + " ('de', 444),\n", + " ('above', 444),\n", + " ('middle', 443),\n", + " ('god', 443),\n", + " ('happens', 442),\n", + " ('flick', 442),\n", + " ('needs', 442),\n", + " ('masterpiece', 441),\n", + " ('major', 440),\n", + " ('period', 440),\n", + " ('haven', 439),\n", + " ('named', 439),\n", + " ('th', 438),\n", + " ('particular', 438),\n", + " ('earth', 437),\n", + " ('feature', 437),\n", + " ('stand', 436),\n", + " ('words', 435),\n", + " ('typical', 435),\n", + " ('obviously', 433),\n", + " ('elements', 433),\n", + " ('romance', 431),\n", + " ('jane', 430),\n", + " ('yourself', 427),\n", + " ('showing', 427),\n", + " ('fantasy', 426),\n", + " ('brings', 426),\n", + " ('america', 423),\n", + " ('guess', 423),\n", + " ('huge', 422),\n", + " ('unfortunately', 422),\n", + " ('indeed', 421),\n", + " ('running', 421),\n", + " ('talent', 420),\n", + " ('stage', 419),\n", + " ('started', 418),\n", + " ('sweet', 417),\n", + " ('leads', 417),\n", + " ('japanese', 417),\n", + " ('poor', 416),\n", + " ('deal', 416),\n", + " ('personal', 413),\n", + " ('incredible', 413),\n", + " ('fast', 412),\n", + " 
('became', 410),\n", + " ('deep', 410),\n", + " ('hours', 409),\n", + " ('nearly', 408),\n", + " ('dream', 408),\n", + " ('giving', 408),\n", + " ('turned', 407),\n", + " ('clearly', 407),\n", + " ('near', 406),\n", + " ('obvious', 406),\n", + " ('cut', 405),\n", + " ('surprise', 405),\n", + " ('body', 404),\n", + " ('era', 404),\n", + " ('female', 403),\n", + " ('hour', 403),\n", + " ('five', 403),\n", + " ('note', 399),\n", + " ('learn', 398),\n", + " ('truth', 398),\n", + " ('match', 397),\n", + " ('feels', 397),\n", + " ('except', 397),\n", + " ('tony', 397),\n", + " ('filmed', 394),\n", + " ('complete', 394),\n", + " ('clear', 394),\n", + " ('older', 393),\n", + " ('street', 393),\n", + " ('lots', 393),\n", + " ('eventually', 393),\n", + " ('keeps', 393),\n", + " ('buy', 392),\n", + " ('stewart', 391),\n", + " ('william', 391),\n", + " ('joe', 390),\n", + " ('meet', 390),\n", + " ('fall', 390),\n", + " ('shots', 389),\n", + " ('talking', 389),\n", + " ('difficult', 389),\n", + " ('unlike', 389),\n", + " ('rating', 389),\n", + " ('means', 388),\n", + " ('dramatic', 388),\n", + " ('appears', 386),\n", + " ('subject', 386),\n", + " ('wonder', 386),\n", + " ('present', 386),\n", + " ('situation', 386),\n", + " ('comments', 385),\n", + " ('sequences', 383),\n", + " ('general', 383),\n", + " ('lee', 383),\n", + " ('earlier', 382),\n", + " ('points', 382),\n", + " ('check', 379),\n", + " ('gone', 379),\n", + " ('ten', 378),\n", + " ('suspense', 378),\n", + " ('recommended', 378),\n", + " ('business', 377),\n", + " ('third', 377),\n", + " ('talk', 375),\n", + " ('leaves', 375),\n", + " ('beyond', 375),\n", + " ('portrayal', 374),\n", + " ('beautifully', 373),\n", + " ('single', 372),\n", + " ('bill', 372),\n", + " ('word', 371),\n", + " ('plenty', 371),\n", + " ('falls', 370),\n", + " ('whom', 370),\n", + " ('figure', 369),\n", + " ('battle', 369),\n", + " ('scary', 369),\n", + " ('non', 369),\n", + " ('return', 368),\n", + " ('using', 368),\n", + " ('doubt', 
367),\n", + " ('add', 367),\n", + " ('hear', 366),\n", + " ('solid', 366),\n", + " ('success', 366),\n", + " ('touching', 365),\n", + " ('political', 365),\n", + " ('oh', 365),\n", + " ('jokes', 365),\n", + " ('awesome', 364),\n", + " ('hell', 364),\n", + " ('boys', 364),\n", + " ('dog', 362),\n", + " ('recently', 362),\n", + " ('sexual', 362),\n", + " ('please', 361),\n", + " ('wouldn', 361),\n", + " ('features', 361),\n", + " ('straight', 361),\n", + " ('lack', 360),\n", + " ('forget', 360),\n", + " ('setting', 360),\n", + " ('mark', 359),\n", + " ('married', 359),\n", + " ('social', 357),\n", + " ('adventure', 356),\n", + " ('interested', 356),\n", + " ('brothers', 355),\n", + " ('sees', 355),\n", + " ('actual', 355),\n", + " ('terrific', 355),\n", + " ('move', 354),\n", + " ('call', 354),\n", + " ('various', 353),\n", + " ('dr', 353),\n", + " ('theater', 353),\n", + " ('animated', 352),\n", + " ('western', 351),\n", + " ('space', 350),\n", + " ('baby', 350),\n", + " ('leading', 348),\n", + " ('disappointed', 348),\n", + " ('portrayed', 346),\n", + " ('aren', 346),\n", + " ('screenplay', 345),\n", + " ('smith', 345),\n", + " ('hate', 344),\n", + " ('towards', 344),\n", + " ('noir', 343),\n", + " ('outstanding', 342),\n", + " ('decent', 342),\n", + " ('kelly', 342),\n", + " ('directors', 341),\n", + " ('journey', 341),\n", + " ('none', 340),\n", + " ('effective', 340),\n", + " ('looked', 340),\n", + " ('caught', 339),\n", + " ('cold', 339),\n", + " ('storyline', 339),\n", + " ('fi', 339),\n", + " ('sci', 339),\n", + " ('mary', 339),\n", + " ('rich', 338),\n", + " ('charming', 338),\n", + " ('harry', 337),\n", + " ('popular', 337),\n", + " ('manages', 337),\n", + " ('rare', 337),\n", + " ('spirit', 336),\n", + " ('open', 335),\n", + " ('appreciate', 335),\n", + " ('basically', 334),\n", + " ('moves', 334),\n", + " ('acted', 334),\n", + " ('deserves', 333),\n", + " ('subtle', 333),\n", + " ('mention', 333),\n", + " ('inside', 333),\n", + " ('pace', 333),\n", + " 
('century', 333),\n", + " ('boring', 333),\n", + " ('familiar', 332),\n", + " ('background', 332),\n", + " ('ben', 331),\n", + " ('creepy', 330),\n", + " ('supposed', 330),\n", + " ('secret', 329),\n", + " ('jim', 328),\n", + " ('die', 328),\n", + " ('question', 327),\n", + " ('effect', 327),\n", + " ('natural', 327),\n", + " ('rate', 326),\n", + " ('language', 326),\n", + " ('impressive', 326),\n", + " ('intelligent', 325),\n", + " ('saying', 325),\n", + " ('material', 324),\n", + " ('realize', 324),\n", + " ('telling', 324),\n", + " ('scott', 324),\n", + " ('singing', 323),\n", + " ('dancing', 322),\n", + " ('adult', 321),\n", + " ('imagine', 321),\n", + " ('visual', 321),\n", + " ('kept', 320),\n", + " ('office', 320),\n", + " ('uses', 319),\n", + " ('pure', 318),\n", + " ('wait', 318),\n", + " ('stunning', 318),\n", + " ('copy', 317),\n", + " ('review', 317),\n", + " ('previous', 317),\n", + " ('seriously', 317),\n", + " ('somehow', 316),\n", + " ('created', 316),\n", + " ('magic', 316),\n", + " ('create', 316),\n", + " ('hot', 316),\n", + " ('reading', 316),\n", + " ('crazy', 315),\n", + " ('air', 315),\n", + " ('frank', 315),\n", + " ('stay', 315),\n", + " ('escape', 315),\n", + " ('attempt', 315),\n", + " ('hands', 314),\n", + " ('filled', 313),\n", + " ('surprisingly', 312),\n", + " ('expected', 312),\n", + " ('average', 312),\n", + " ('complex', 311),\n", + " ('studio', 310),\n", + " ('successful', 310),\n", + " ('quickly', 310),\n", + " ('male', 309),\n", + " ('plus', 309),\n", + " ('co', 307),\n", + " ('minute', 306),\n", + " ('images', 306),\n", + " ('casting', 306),\n", + " ('exciting', 306),\n", + " ('following', 306),\n", + " ('members', 305),\n", + " ('german', 305),\n", + " ('e', 305),\n", + " ('reasons', 305),\n", + " ('follows', 305),\n", + " ('themes', 305),\n", + " ('touch', 304),\n", + " ('genius', 304),\n", + " ('free', 304),\n", + " ('edge', 304),\n", + " ('cute', 304),\n", + " ('outside', 303),\n", + " ('ok', 302),\n", + " ('admit', 
302),\n", + " ('younger', 302),\n", + " ('reviews', 302),\n", + " ('odd', 301),\n", + " ('fighting', 301),\n", + " ('master', 301),\n", + " ('break', 300),\n", + " ('thanks', 300),\n", + " ('recent', 300),\n", + " ('comment', 300),\n", + " ('apart', 299),\n", + " ('lovely', 298),\n", + " ('begin', 298),\n", + " ('emotions', 298),\n", + " ('doctor', 297),\n", + " ('italian', 297),\n", + " ('party', 297),\n", + " ('la', 296),\n", + " ('missed', 296),\n", + " ...]" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "positive_counts.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "pos_neg_ratios = Counter()\n", + "\n", + "# Ratio of positive to negative occurrences for frequent words;\n", + "# the +1 in the denominator guards against division by zero.\n", + "for term, cnt in list(total_counts.most_common()):\n", + " if cnt > 100:\n", + " pos_neg_ratio = positive_counts[term] / float(negative_counts[term] + 1)\n", + " pos_neg_ratios[term] = pos_neg_ratio\n", + "\n", + "# Move ratios onto a log scale so positive and negative words sit\n", + "# symmetrically around zero; the +0.01 avoids taking the log of zero.\n", + "for word, ratio in pos_neg_ratios.most_common():\n", + " if ratio > 1:\n", + " pos_neg_ratios[word] = np.log(ratio)\n", + " else:\n", + " pos_neg_ratios[word] = -np.log(1 / (ratio + 0.01))" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('edie', 4.6913478822291435),\n", + " ('paulie', 4.0775374439057197),\n", + " ('felix', 3.1527360223636558),\n", + " ('polanski', 2.8233610476132043),\n", + " ('matthau', 2.8067217286092401),\n", + " ('victoria', 2.6810215287142909),\n", + " ('mildred', 2.6026896854443837),\n", + " ('gandhi', 2.5389738710582761),\n", + " ('flawless', 2.451005098112319),\n", + " ('superbly', 2.2600254785752498),\n", + " ('perfection', 2.1594842493533721),\n", + " ('astaire', 2.1400661634962708),\n", + " ('captures', 2.0386195471595809),\n", + " ('voight', 2.0301704926730531),\n", + " ('wonderfully', 2.0218960560332353),\n", + " ('powell', 1.9783454248084671),\n",
+ " ('brosnan', 1.9547990964725592),\n", + " ('lily', 1.9203768470501485),\n", + " ('bakshi', 1.9029851043382795),\n", + " ('lincoln', 1.9014583864844796),\n", + " ('refreshing', 1.8551812956655511),\n", + " ('breathtaking', 1.8481124057791867),\n", + " ('bourne', 1.8478489358790986),\n", + " ('lemmon', 1.8458266904983307),\n", + " ('delightful', 1.8002701588959635),\n", + " ('flynn', 1.7996646487351682),\n", + " ('andrews', 1.7764919970972666),\n", + " ('homer', 1.7692866133759964),\n", + " ('beautifully', 1.7626953362841438),\n", + " ('soccer', 1.7578579175523736),\n", + " ('elvira', 1.7397031072720019),\n", + " ('underrated', 1.7197859696029656),\n", + " ('gripping', 1.7165360479904674),\n", + " ('superb', 1.7091514458966952),\n", + " ('delight', 1.6714733033535532),\n", + " ('welles', 1.6677068205580761),\n", + " ('sadness', 1.663505133704376),\n", + " ('sinatra', 1.6389967146756448),\n", + " ('touching', 1.637217476541176),\n", + " ('timeless', 1.62924053973028),\n", + " ('macy', 1.6211339521972916),\n", + " ('unforgettable', 1.6177367152487956),\n", + " ('favorites', 1.6158688027643908),\n", + " ('stewart', 1.6119987332957739),\n", + " ('hartley', 1.6094379124341003),\n", + " ('sullivan', 1.6094379124341003),\n", + " ('extraordinary', 1.6094379124341003),\n", + " ('brilliantly', 1.5950491749820008),\n", + " ('friendship', 1.5677652160335325),\n", + " ('wonderful', 1.5645425925262093),\n", + " ('palma', 1.5553706911638245),\n", + " ('magnificent', 1.54663701119507),\n", + " ('finest', 1.5462590108125689),\n", + " ('jackie', 1.5439233053234738),\n", + " ('ritter', 1.5404450409471491),\n", + " ('tremendous', 1.5184661342283736),\n", + " ('freedom', 1.5091151908062312),\n", + " ('fantastic', 1.5048433868558566),\n", + " ('terrific', 1.5026699370083942),\n", + " ('noir', 1.493925025312256),\n", + " ('sidney', 1.493925025312256),\n", + " ('outstanding', 1.4910053152089213),\n", + " ('mann', 1.4894785973551214),\n", + " ('pleasantly', 1.4894785973551214),\n", + " 
('nancy', 1.488077055429833),\n", + " ('marie', 1.4825711915553104),\n", + " ('marvelous', 1.4739999415389962),\n", + " ('excellent', 1.4647538505723599),\n", + " ('ruth', 1.4596256342054401),\n", + " ('stanwyck', 1.4412101187160054),\n", + " ('widmark', 1.4350845252893227),\n", + " ('splendid', 1.4271163556401458),\n", + " ('chan', 1.423108334242607),\n", + " ('exceptional', 1.4201959127955721),\n", + " ('tender', 1.410986973710262),\n", + " ('gentle', 1.4078005663408544),\n", + " ('poignant', 1.4022947024663317),\n", + " ('gem', 1.3932148039644643),\n", + " ('amazing', 1.3919815802404802),\n", + " ('chilling', 1.3862943611198906),\n", + " ('captivating', 1.3862943611198906),\n", + " ('fisher', 1.3862943611198906),\n", + " ('davies', 1.3862943611198906),\n", + " ('darker', 1.3652409519220583),\n", + " ('april', 1.3499267169490159),\n", + " ('kelly', 1.3461743673304654),\n", + " ('blake', 1.3418425985490567),\n", + " ('overlooked', 1.329135947279942),\n", + " ('ralph', 1.32818673031261),\n", + " ('bette', 1.3156767939059373),\n", + " ('hoffman', 1.3150668518315229),\n", + " ('cole', 1.3121863889661687),\n", + " ('shines', 1.3049487216659381),\n", + " ('powerful', 1.2999662776313934),\n", + " ('notch', 1.2950456896547455),\n", + " ('remarkable', 1.2883688239495823),\n", + " ('pitt', 1.286210902562908),\n", + " ('winters', 1.2833463918674481),\n", + " ('vivid', 1.2762934659055623),\n", + " ('gritty', 1.2757524867200667),\n", + " ('giallo', 1.2745029551317739),\n", + " ('portrait', 1.2704625455947689),\n", + " ('innocence', 1.2694300209805796),\n", + " ('psychiatrist', 1.2685113254635072),\n", + " ('favorite', 1.2668956297860055),\n", + " ('ensemble', 1.2656663733312759),\n", + " ('stunning', 1.2622417124499117),\n", + " ('burns', 1.259880436264232),\n", + " ('garbo', 1.258954938743289),\n", + " ('barbara', 1.2580400255962119),\n", + " ('panic', 1.2527629684953681),\n", + " ('holly', 1.2527629684953681),\n", + " ('philip', 1.2527629684953681),\n", + " ('carol', 
1.2481440226390734),\n", + " ('perfect', 1.246742480713785),\n", + " ('appreciated', 1.2462482874741743),\n", + " ('favourite', 1.2411123512753928),\n", + " ('journey', 1.2367626271489269),\n", + " ('rural', 1.235471471385307),\n", + " ('bond', 1.2321436812926323),\n", + " ('builds', 1.2305398317106577),\n", + " ('brilliant', 1.2287554137664785),\n", + " ('brooklyn', 1.2286654169163074),\n", + " ('von', 1.225175011976539),\n", + " ('unfolds', 1.2163953243244932),\n", + " ('recommended', 1.2163953243244932),\n", + " ('daniel', 1.20215296760895),\n", + " ('perfectly', 1.1971931173405572),\n", + " ('crafted', 1.1962507582320256),\n", + " ('prince', 1.1939224684724346),\n", + " ('troubled', 1.192138346678933),\n", + " ('consequences', 1.1865810616140668),\n", + " ('haunting', 1.1814999484738773),\n", + " ('cinderella', 1.180052620608284),\n", + " ('alexander', 1.1759989522835299),\n", + " ('emotions', 1.1753049094563641),\n", + " ('boxing', 1.1735135968412274),\n", + " ('subtle', 1.1734135017508081),\n", + " ('curtis', 1.1649873576129823),\n", + " ('rare', 1.1566438362402944),\n", + " ('loved', 1.1563661500586044),\n", + " ('daughters', 1.1526795099383853),\n", + " ('courage', 1.1438688802562305),\n", + " ('dentist', 1.1426722784621401),\n", + " ('highly', 1.1420208631618658),\n", + " ('nominated', 1.1409146683587992),\n", + " ('tony', 1.1397491942285991),\n", + " ('draws', 1.1325138403437911),\n", + " ('everyday', 1.1306150197542835),\n", + " ('contrast', 1.1284652518177909),\n", + " ('cried', 1.1213405397456659),\n", + " ('fabulous', 1.1210851445201684),\n", + " ('ned', 1.120591195386885),\n", + " ('fay', 1.120591195386885),\n", + " ('emma', 1.1184149159642893),\n", + " ('sensitive', 1.113318436057805),\n", + " ('smooth', 1.1089750757036563),\n", + " ('dramas', 1.1080910326226534),\n", + " ('today', 1.1050431789984001),\n", + " ('helps', 1.1023091505494358),\n", + " ('inspiring', 1.0986122886681098),\n", + " ('jimmy', 1.0937696641923216),\n", + " ('awesome', 
1.0931328229034842),\n", + " ('unique', 1.0881409888008142),\n", + " ('tragic', 1.0871835928444868),\n", + " ('intense', 1.0870514662670339),\n", + " ('stellar', 1.0857088838322018),\n", + " ('rival', 1.0822184788924332),\n", + " ('provides', 1.0797081340289569),\n", + " ('depression', 1.0782034170369026),\n", + " ('shy', 1.0775588794702773),\n", + " ('carrie', 1.076139432816051),\n", + " ('blend', 1.0753554265038423),\n", + " ('hank', 1.0736109864626924),\n", + " ('diana', 1.0726368022648489),\n", + " ('adorable', 1.0726368022648489),\n", + " ('unexpected', 1.0722255334949147),\n", + " ('achievement', 1.0668635903535293),\n", + " ('bettie', 1.0663514264498881),\n", + " ('happiness', 1.0632729222228008),\n", + " ('glorious', 1.0608719606852626),\n", + " ('davis', 1.0541605260972757),\n", + " ('terrifying', 1.0525211814678428),\n", + " ('beauty', 1.050410186850232),\n", + " ('ideal', 1.0479685558493548),\n", + " ('fears', 1.0467872208035236),\n", + " ('hong', 1.0438040521731147),\n", + " ('seasons', 1.0433496099930604),\n", + " ('fascinating', 1.0414538748281612),\n", + " ('carries', 1.0345904299031787),\n", + " ('satisfying', 1.0321225473992768),\n", + " ('definite', 1.0319209141694374),\n", + " ('touched', 1.0296194171811581),\n", + " ('greatest', 1.0248947127715422),\n", + " ('creates', 1.0241097613701886),\n", + " ('aunt', 1.023388867430522),\n", + " ('walter', 1.022328983918479),\n", + " ('spectacular', 1.0198314108149955),\n", + " ('portrayal', 1.0189810189761024),\n", + " ('ann', 1.0127808528183286),\n", + " ('enterprise', 1.0116009116784799),\n", + " ('musicals', 1.0096648026516135),\n", + " ('deeply', 1.0094845087721023),\n", + " ('incredible', 1.0061677561461084),\n", + " ('mature', 1.0060195018402847),\n", + " ('triumph', 0.99682959435816731),\n", + " ('margaret', 0.99682959435816731),\n", + " ('navy', 0.99493385919326827),\n", + " ('harry', 0.99176919305006062),\n", + " ('lucas', 0.990398704027877),\n", + " ('sweet', 0.98966110487955483),\n", + " 
('joey', 0.98794672078059009),\n", + " ('oscar', 0.98721905111049713),\n", + " ('balance', 0.98649499054740353),\n", + " ('warm', 0.98485340331145166),\n", + " ('ages', 0.98449898190068863),\n", + " ('glover', 0.98082925301172619),\n", + " ('guilt', 0.98082925301172619),\n", + " ('carrey', 0.98082925301172619),\n", + " ('learns', 0.97881108885548895),\n", + " ('unusual', 0.97788374278196932),\n", + " ('sons', 0.97777581552483595),\n", + " ('complex', 0.97761897738147796),\n", + " ('essence', 0.97753435711487369),\n", + " ('brazil', 0.9769153536905899),\n", + " ('widow', 0.97650959186720987),\n", + " ('solid', 0.97537964824416146),\n", + " ('beautiful', 0.97326301262841053),\n", + " ('holmes', 0.97246100334120955),\n", + " ('awe', 0.97186058302896583),\n", + " ('vhs', 0.97116734209998934),\n", + " ('eerie', 0.97116734209998934),\n", + " ('lonely', 0.96873720724669754),\n", + " ('grim', 0.96873720724669754),\n", + " ('sport', 0.96825047080486615),\n", + " ('debut', 0.96508089604358704),\n", + " ('destiny', 0.96343751029985703),\n", + " ('thrillers', 0.96281074750904794),\n", + " ('tears', 0.95977584381389391),\n", + " ('rose', 0.95664202739772253),\n", + " ('feelings', 0.95551144502743635),\n", + " ('ginger', 0.95551144502743635),\n", + " ('winning', 0.95471810900804055),\n", + " ('stanley', 0.95387344302319799),\n", + " ('cox', 0.95343027882361187),\n", + " ('paris', 0.95278479030472663),\n", + " ('heart', 0.95238806924516806),\n", + " ('hooked', 0.95155887071161305),\n", + " ('comfortable', 0.94803943018873538),\n", + " ('mgm', 0.94446160884085151),\n", + " ('masterpiece', 0.94155039863339296),\n", + " ('themes', 0.94118828349588235),\n", + " ('danny', 0.93967118051821874),\n", + " ('anime', 0.93378388932167222),\n", + " ('perry', 0.93328830824272613),\n", + " ('joy', 0.93301752567946861),\n", + " ('lovable', 0.93081883243706487),\n", + " ('hal', 0.92953595862417571),\n", + " ('mysteries', 0.92953595862417571),\n", + " ('louis', 0.92871325187271225),\n", + " 
('charming', 0.92520609553210742),\n", + " ('urban', 0.92367083917177761),\n", + " ('allows', 0.92183091224977043),\n", + " ('impact', 0.91815814604895041),\n", + " ('gradually', 0.91629073187415511),\n", + " ('lifestyle', 0.91629073187415511),\n", + " ('italy', 0.91629073187415511),\n", + " ('spy', 0.91289514287301687),\n", + " ('treat', 0.91193342650519937),\n", + " ('subsequent', 0.91056005716517008),\n", + " ('kennedy', 0.90981821736853763),\n", + " ('loving', 0.90967549275543591),\n", + " ('surprising', 0.90937028902958128),\n", + " ('quiet', 0.90648673177753425),\n", + " ('winter', 0.90624039602065365),\n", + " ('reveals', 0.90490540964902977),\n", + " ('raw', 0.90445627422715225),\n", + " ('funniest', 0.90078654533818991),\n", + " ('pleased', 0.89994159387262562),\n", + " ('norman', 0.89994159387262562),\n", + " ('thief', 0.89874642222324552),\n", + " ('season', 0.89827222637147675),\n", + " ('secrets', 0.89794159320595857),\n", + " ('colorful', 0.89705936994626756),\n", + " ('highest', 0.8967461358011849),\n", + " ('compelling', 0.89462923509297576),\n", + " ('danes', 0.89248008318043659),\n", + " ('castle', 0.88967708335606499),\n", + " ('kudos', 0.88889175768604067),\n", + " ('great', 0.88810470901464589),\n", + " ('baseball', 0.88730319500090271),\n", + " ('subtitles', 0.88730319500090271),\n", + " ('bleak', 0.88730319500090271),\n", + " ('winner', 0.88643776872447388),\n", + " ('tragedy', 0.88563699078315261),\n", + " ('todd', 0.88551907320740142),\n", + " ('nicely', 0.87924946019380601),\n", + " ('arthur', 0.87546873735389985),\n", + " ('essential', 0.87373111745535925),\n", + " ('gorgeous', 0.8731725250935497),\n", + " ('fonda', 0.87294029100054127),\n", + " ('eastwood', 0.87139541196626402),\n", + " ('focuses', 0.87082835779739776),\n", + " ('enjoyed', 0.87070195951624607),\n", + " ('natural', 0.86997924506912838),\n", + " ('intensity', 0.86835126958503595),\n", + " ('witty', 0.86824103423244681),\n", + " ('rob', 0.8642954367557748),\n", + " 
('worlds', 0.86377269759070874),\n", + " ('health', 0.86113891179907498),\n", + " ('magical', 0.85953791528170564),\n", + " ('bridge', 
0.43721380642274466),\n", + " ...]" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most strongly associated with a \"POSITIVE\" label (highest log pos/neg count ratios)\n", + "pos_neg_ratios.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('boll', -4.0778152602708904),\n", + " ('uwe', -3.9218753018711578),\n", + " ('seagal', -3.3202501058581921),\n", + " ('unwatchable', -3.0269848170580955),\n", + " ('stinker', -2.9876839403711624),\n", + " ('mst', -2.7753833211707968),\n", + " ('incoherent', -2.7641396677532537),\n", + " ('unfunny', -2.5545257844967644),\n", + " ('waste', -2.4907515123361046),\n", + " ('blah', -2.4475792789485005),\n", + " ('horrid', -2.3715779644809971),\n", + " ('pointless', -2.3451073877136341),\n", + " ('atrocious', -2.3187369339642556),\n", + " ('redeeming', -2.2667790015910296),\n", + " ('prom', -2.2601040980178784),\n", + " ('drivel', -2.2476029585766928),\n", + " ('lousy', -2.2118080125207054),\n", +
" ('worst', -2.1930856334332267),\n", + " ('laughable', -2.172468615469592),\n", + " ('awful', -2.1385076866397488),\n", + " ('poorly', -2.1326133844207011),\n", + " ('wasting', -2.1178155545614512),\n", + " ('remotely', -2.111046881095167),\n", + " ('existent', -2.0024805005437076),\n", + " ('boredom', -1.9241486572738005),\n", + " ('miserably', -1.9216610938019989),\n", + " ('sucks', -1.9166645809588516),\n", + " ('uninspired', -1.9131499212248517),\n", + " ('lame', -1.9117232884159072),\n", + " ('insult', -1.9085323769376259)]" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most strongly associated with a \"NEGATIVE\" label (lowest log pos/neg count ratios)\n", + "list(reversed(pos_neg_ratios.most_common()))[0:30]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Transforming Text into Numbers" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "\n", + "review = \"This was a horrible, terrible movie.\"\n", + "\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAi4AAAECCAYAAADZzFwPAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQVdV5/xdNZjIxjRgrM52qFI01ERQVExWNeMMLQy0YiEiNEgOYaJAO\nitIaGYo2TFGQeElQAREjRa0oDEG8AKagosYYkEuSjjUEbP+orZFc/KMzmfe3Pys+57fOfvfZZ1/P\nWXu/zzNz3rPP3uvyrO/a717f/axnPatfTyBGRRFQBBQBRUARUAQUgQog8CcV0FFVVAQUAUVAEVAE\nFAFFwCKgxEVvBEVAEVAEFAFFQBGoDAJKXCrTVaqoIqAIKAKKgCKgCChx0XtAEVAEFAFFQBFQBCqD\ngBKXynSVKqoIKAKKgCKgCCgCSlz0HlAEFAFFQBFQBBSByiCgxKUyXaWKKgKKgCKgCCgCioASF70H\nFAFFQBFQBBQBRaAyCHy8MpqqooqAItAVBH784x+bPXv2mJ07d5q9e/eat99+2+zYsaOXLuPGjTOH\nHHKIGTp0qBkyZIg59dRTzac//ele6fSEIqAIKAJ5EOinkXPzwKd5FYF6IrBp0yazYcMGs2rVKjNg\nwAAzcuRIc8IJJ5jBgwebgw8+2Hzuc59ravh//dd/mf/8z/807777rtm1a5d58cUX7Qcyc8kll5gv\nf/nLSmKaENMfioAikBUBJS5ZkdN8ikDNEPjtb39rli9fbh566CHbshkzZpgLLrjA/MVf/EWmllLe\nxo0bzfr1682yZcvMjTfeaG644YbM5WVSQjMpAopA7RBQH5fadak2SBFIj8A999xjPv/5z5stW7aY\nJUuWmO3bt5tJkyblIhlME1166aVm6dKl1hqDVocffriZOXOmwUKjoggoAopAFgSUuGRBTfMoAjVB\nAP+Vk046yaxZs8Z+nnzySfPFL36x8NZhtVmwYEGDwFDHihUrCq9HC1QEFIH6I6BTRfXvY22hIhCJ\nAFaW+fPnm3nz5lnrSmSikk5CmKZOnWqOOeYYOz2lTrwlAa3FKgI1REAtLjXsVG2SIhCHAL4nU6ZM\nsRaWzZs3d5y0oBsWl61bt5pBgwbZKapf/OIXcSrrNUVAEVAEGgioxaUBhR4oAvVHANIyZswYc+ih\nh3pj6WDK6JZbbjGQqPBqpfr3iLZQEVAE0iKgcVzSIqbpFYGKIiCkZdiwYdbfxJdm4ATMEuvzzjtP\nyYsvnaJ6KAIeI6DExePOUdUUgaIQ8JW0SPtYfYQoeRFE9FsRU
ARaIaDEpRUyel4RqBECc+fOta1h\nZY+vAnn5zW9+YyZMmGD9X9Rh19eeUr0Uge4ioD4u3cVfa1cESkdAfEh+/vOfVyJ6LY7DH3zwgWFp\ntooioAgoAmEEdFVRGBH9rQjUCAECveH4SpyWqlgwFi1aZPdD0jgvNboRtSmKQIEIqMWlQDC1KEXA\nNwTGjx9vTjzxRDN79mzfVIvVhzgvY8eONVWxEsU2Ri8qAopAoQgocSkUTi1MEfAHgaoP/mwNgPjs\nl+NPb6smikDfQUCJS9/pa21pH0MAaws7M7PcuIrCNBd7G7HrdNaNHqvYbtVZEVAE4hFQ4hKPj15V\nBCqJgFhbGPSrLGp1qXLvqe6KQDkIKHEpB1ctVRHoKgKszBk6dKiZPn16V/XIWzlWF7YHUF+XvEhq\nfkWgPgjoqqL69KW2RBGwCBBsbtmyZYapoqoLU0RsA7Bx48aqN0X1VwQUgYIQUOJSEJBajCLgCwIM\n8pMnT66NXwg+OuvXr/cFXtVDEVAEuoyAEpcud4BWrwgUjQCD/FlnnVV0sV0r74ILLrAWpK4poBUr\nAoqAVwgocfGqO1QZRSA/Ahs2bDCnn356/oI8KYHponPPPdfgcKyiCCgCioASF70HFIEaIYAzK4Jf\nSJ2EHa23bdtWpyZpWxQBRSAjAkpcMgKn2RQBHxFg+fPw4cN9VC2XTieccILZt29frjI0syKgCNQD\nASUu9ehHbYUiYBHAKjFo0KDaoTF48GCzd+/e2rVLG6QIKALpEVDikh4zzaEIeI3AwIEDvdZPlVME\nFAFFIA8CSlzyoKd5FQFFoCMIfP7znzerV6/uSF1aiSKgCPiNgBIXv/tHtVMEFIEAgU9/+tOKgyKg\nCCgCFgElLnojKAKKgCKgCCgCikBlEFDiUpmuUkUVgb6LwC9+8YvaRALuu72oLVcEikFAiUsxOGop\nikDXEWCPogMHDnRdjzIU+M1vflPLZd5lYKVlKgJ1R+DjdW+gts8PBIh6umfPHrNz5067rPXtt982\nO3bsaFKOCKnEIDnkkEPszsYcszOwSmsEICvsnIwcfPDB5vjjj6/lvj4QFxVFQBFQBEBAiYveB6Uh\nsGnTJrNq1SpDCPoBAwaYkSNHGgKJTZgwwQ6y4eiuRH0lgNq7775rdu3aZWbNmmVefPFFu2Hg6NGj\nbX510jRGcKLjICsuuWOAf+edd0rr024VvHv3bjNixIhuVa/1KgKKgEcI9OsJxCN9VJWKI4AFYPny\n5Wb+/PmWrMyYMcOwSR7WlCxCeex2vHLlShvyfeLEieaGG27IXF4WHXzI45KVww8/PLb9/fr1MxCY\nOpG88ePHmyuuuMJceumlPnSH6qAIKAJdREB9XLoIfp2qhmDcc889hngbW7ZsMWvWrDHbt283kyZN\nih1k22HA4Mtg9eSTTzY22WPgnjlzpqHOOgsOqUyxyeaCWFb4tCOBbEj4+uuv1woaIgKfdtpptWqT\nNkYRUASyIaDEJRtumstBgIH1rLPOahAWSIY7feEkzXXIgL1gwQI7nfTBBx9YkrRixYpcZfqWWYgK\n37Q3KVlx2wFxeeWVV9xTlT4GC6Ya2xG2SjdSlVcEFIHECOhUUWKoNGEUArfffru5//77zbx586x1\nJSpNWecY0KZOnWq+8IUvmEWLFlV2aoR2iBRB+LDU4EeExasOwj32P//zP+buu++uQ3O0DYqAIpAT\nASUuOQHsq9mZphkzZoxt/qOPPtq1t2H0wI8Gh9TFixebsMOvj/2DzrISCP2KICvhdp500klm4cKF\n5vzzzw9fqtxvpgbXrVtnPvWpT1WifysHsCqsCFQMAZ0qqliH+aCukJZhw4aZtWvXdo20gAU+MEuX\nLjVMj5x33nkGa4OPgnMtlhU+HMsUUBmkhfZ//etftyu6fMQijU5PP/20JSvca0wVudapNOVoWkVA\nEagPAmpxqU9fdqQlLmnB3
8QnYZCbNm2a2bx5sxdv5mlWAhWNI/3EUmmWl1fZNwTL0Zw5c5pWE0Fe\nyiJ8RfeDlqcIKALFI6DEpXhMa1uiz6RFQO82ecHiI8HS2i1bFp3L+IY0sST997//vbVIlVFH2WXS\nl3Pnzo301VHyUjb6Wr4i4C8CSlz87RvvNJsyZYphNQ+rhnwWnDkJXMc0VidimbjTFywH70SdcfhD\nntCBD/qwNL1qFgpIMivVwtYWt93g7gPerk56rAgoAuUjoMSlfIxrUQMxWh566CGzdevWrg/MSQAl\nYBlbB+D/Uoa4ZMUnUhAezFkuzoqrqvSb9BXkky0h2pFkSBpTYd0mi6K3fisCikD5CChxKR/jytfA\n4MCbLSthqrBqB8B5Y0fnRx55pJCVNZRX9kqgvDcKpCWKREHiTjzxRDN79uy8VXQkP+0YO3asdcRN\n4p+j5KUj3aKVKALeIKDExZuu8FcRBj72iZk+fbq/SkZoxl5JV111lSUcWd7IXbKCo6uvpE30jCIt\nwCKrmO67774mJ9cIyLp+KquuSl663nWqgCLQMQSUuHQM6mpWJA6SVZtqELTTki53JZDPZEXah74Q\nl3akSkicLyuuRH/3m3YQG4ilz1lWrEFeIKhJrDRuvXqsCCgC1UJAiUu1+qvj2kYtR+24EjkqZDAj\nvgvTPK2sLqTxYSVQ2mZCWpAkAzVpf/SjH5mbbrrJm+XibnuFtBx99NG5/JLSYOLWr8eKgCJQHQQ+\nXh1VVdNOI4C1BanyjrxYIthRmh2r3akul6zgC9POYtFp7NvVl9a6QDyXv/3bvzWf/OQnLZHzyfIi\npIU240icRyBxkBc+SQhdnro0ryKgCHQHAa8i5z7xxBOmX79+9sNgUzVx9acdrki7+H711VfdS22P\nTznllAYuS5YsaZu+qAQrV660y1GLKq9b5bBvDzFNcPqUD4MaPiF8WlliuqVvu3ppA/onHZhJL/4v\nkFB8XSBrQkzb1VfmdQiYTA9BporoC8GFslUUAUWgfgh4RVzqB291W8Qb6+rVq83IkSOr2whHc/a5\nYTqoqmRFmiIkJOkATz8SCM8VyMvrr79uowzPnDnT+si41zt1DHFiGo8VRFl8WuL0FGKn5CUOJb2m\nCFQTASUu1ey30rV+4YUXzOTJkwt5Ay5d2TYVQFa+/e1vmw0bNrRJ6e9lplOEtKTRslXIfzChvL17\n99pAbxx3SiBTBDNkewaC47lTeEXqALmDwCh5KRJVLUsR6D4CSlwK7IPLLrvM9PT0ND4FFt3xotiN\nd/To0YXWu2fPHsNU13XXXWf9TtzpMzlmWoxpwjvvvNM899xzDafZvIqcfvrpld10kIGej0z3JMWi\nHdFhUCfAG7trY/VgBVaZBAbyRSBD2kFwQBym07YpadslnZIXQUK/FYEaIRAMtN7I448/3hNAaz+X\nX3651evBBx9snOPaLbfc0rN///6WOm/bts2mkXL4PvLII3so58CBA5H53LTk53PhhRfaesmHJEnj\n6k96V8L5n3322UYdXKM+zkVJsDy0Ub/o46aLajP4oU9WQafgbT1r9qZ8Lp4uDkmO6bs77rijZd81\nVdTmRzBQ9wSDZZtUfl2mD7L0Q9p8wTRaz913390DRuPGjevZuHFjYUCA+Y033tgouxt9QPuC6bHC\n2qQFKQKKQPcQaB5du6eHrdkd+Bl4+UQNbgxmUeSFAS4qvZwj3+7du3u1Uq7zHS5DiEKSNK7+pHfF\nzQ/5cn+7x1KfmzeOuJDezR8+pq60wsASRFpNmy0yfTv9wvq2+g0GUX0eWWmLk8HUV89TTz3V4qp/\np+mHLKSFlmQdpBngH374Ydv/kBgIByQmrR7Uf9tttzWVk7aMMnokKy5l6KJlKgKKQDYEvF0O/dhj\njwVjWLQEA5iNR7Fq1apGAqYgbr755sbvqAPyXXzxxWbXrl2G4GJR0q4M8iRJE1W2nJs3b54
c9vq+\n5pprzAknnGCY2mgnTKWQPk6oa9CgQWbq1KlxyZquMaVzzDHHNJ3L8iOJfknLffPNN63PDWVmlaFD\nhxrugSoIUzas/EnqhOu2qd0UkZs2fEx9kyZNsh98Q8B78eLF1lGbbQO4L7ifBg4cGM5qtmzZYt5/\n/327weW5555r+CxcuLCQLRd6VZbxhPj2lD1FlVE9zaYIKAIJEPDaxyWYPjGBhcT6jATTPIbfIi6x\nYbUIm7KJEHkzmJ6w+QI+ZwKrg1yyA9cDDzzQ+B11QHry8Wk14CdJE1W2nAssEY06AkuNnLbf7K+T\nRNx2uVgxOAfWqkYRYANGSYX8hPjPK3fddVevItCTtvOhj8KfYLrMXqNt9KMrzz//vB1I3XNpjocM\nGWIH1zR5upFWiEcW0hK1iihrG4htg+MsfjD8L3Cfzpo1K5K0UMe1115rl52TlqXN7I10/vnnZ62+\ntHxCXvC5UVEEFIEKIhA8ZLyR8FRLMIA26YavRABx4yM+K+F8pAuLO+3ElJErbpmki5IkacJ6uOW4\n+YNB2b1kj8NTKtI2LkZNFTHl5ZYZxor8tFPSoFtSwdeBTx6hfqmb71bTdO3qCOPCVF5WYZoA/w1f\npQg/DJ0KSd67TMWBuYoioAhUCwFvLS68bR9xxBHBmPf/JfxbrAg7duxoJAoGyMhplq997WuNNFgU\nmA6JEuJKtJMkaeLKuOSSS3pdPvPMM5vOvfvuu02/wz+Y7hLBihHGhqmwK6+8UpKYX/3qV43jdgeY\n/LFO5JEwvsTpGDx4cOoisXi5lhemjOooWVcOuViIpcY9p8etEcCiBO5qeWmNkV5RBHxEwFsfl2OP\nPTYxXu+8804jbZgAyIX+/fvLof0W0tN0MvgRThe+zu8kaaLyybkwyeB82OemlX5SRmDRkEPDFArL\niePkgw8+iLvc61pYn14JUp6I8olIWgT3Ql0JCxgweCJ5th0ocorIKtNH/oA5vjwsDc8yNddHYNJm\nKgJeIeCtxcUrlFSZ3Ajs3LkzUxkQuJdffjlT3ipkkuBoDJx5JFixk3gLgDz11DGvWF6EQNaxjdom\nRaBOCNSCuLCjrEirQc61UJC2aIuC1J/kG4fjsIQtLFFWGTePa/XBETeYoYz9fOc733Gzxx6zaiQ8\n1RObIeJieFUUDsJp91liuuzv//7vm1YC5Z2mi1C1a6eY2oGw5CUtOkWUvwvF2qXkJT+WWoIiUDYC\n3k4VpWk4yzRF8F9hE8PwwBnEppAkBj+YLP4WjQJyHuBDctFFFzWV4hIu9GtHXI4//vhGfvJCfIoi\nY0zrhIleo7IUB6wyYSktQr+wdJsPROszn/mMOfnkk3uVxpQW00L//u//Hjk91GoqsFdBnp8oimx0\nYoqIOl577TXbh9y7iCx75hjiNXz4cA4N/4vcP/z/CRmwFyrwh3bQVj55yWQFmqsqKgKVRaAWxIXY\nLAz2DI7It771LfO9732vQV7Yp8ZdPu06rXaj58KxVdhV2o3HkkQ/iBdOqwzytPsrX/mKWbRoUYOQ\nUSYb6Akmwaoiw5YESaUI4sJeND/84Q8bOkjdbl/IuSTfLJHOQzixImFN6qbgCBqsZiks1D1TRGXE\nJGEKi3uIjTbfe+89S0xYIn/FFVc0SLXUy0CPHgjL27du3WrvRfKNGjXKbh3Bxo5VECEvtL9qxKsK\n+KqOikAhCPi0CMpdThy1LDkYhJuW2PJbJLxsNgCnKa38DghOr/Dxco3vVsuGk6Rx9Se9K27+dsdu\nuygjajk058P1tSqX/GmkyGXDbGMA5q10S3o+sN706rc0bSItkVzzLvNOW6ebnsixLMEtSspY+kxk\nYaImBwO4xStPHbSXKLxBIDpbHthzrgoSWDAL7asqtFl1VASqgkAtfFyCwc8GinMDsnEuLFhlCHBW\n1JRKuPykv4NYJC2TEpit3TSRZMaCQvo4wSqzdu3auCS
9rh1++OH2TbvXhQwnmBJj6TZtBv+0wrQS\nffb9738/d7+xbF6mNNLqkTc9VgmkqLd4yqOfipKnn37anHTSSWbu3Llmzpw51oJCADmxqmSpB+sF\nUXgJRscu0Pv27bM6s9Gi70uQWWGE/uI8naX9mkcRUATKQaAWU0UCDQ6omLOZh3fD6jNg8hCeMGFC\n7sFP6srzze7Hf/mXf2mjjMoyX2Kx3HDDDb18X9rVQ5wT/D7Wr1/ftBUBhOWb3/xmy8i/ceXywMZX\noShzOUTxpptush+ma8SfhwEtLDhaM52DnwQkoyiSyUDJtMfy5cvDVZb+GxxlICyqMqZm8pAK0QPd\nmEp9++23LWEpa0oHXflwjxONd/78+YYI0T5G1hVspM+K+j+QcvVbEVAE8iHQD9NQviI0dx0RwD8G\n8sAgUwfZtGmTgdhGkaUy24cTbtY9h1rpVZRj74oVK+x2GITxv/rqqzsax4T+uOqqq6wPDL5ZPsdQ\nKdovqVW/6nlFQBFIhkBtpoqSNVdTJUUAp0rM+3UQBsmVK1faaYtOtkcIRpGDclFTRBBTplbpY8hp\nkTomwRhLC07KrCIbM2aM11MyYIO1iP5UUQQUge4joBaX7veBtxrgQ4GFoii/jG41lDdm2rB06VIz\nYMCAhhpFTLU0CnMOynxDFzLkVJfqEN0gCgi+T50mLFHKQqLY6b0K91pe/KPar+cUAUUgHQJKXNLh\n1adSEzSOZdHsM1RlYUpk3bp1dpdjtx3hN+gipnSwiAhRcusq4jjvoOkjaRFccA5m+bySF0FEvxUB\nRaAVAkpcWiGj520gLqwuOILisFtVYbXMwoUL2zqC4oTpRjCm7WnaLSuH0uRJimkRZY8fP94GjvPF\n0hJue9XISxFEN4yB/lYEFIH2CChxaY9Rn06BGR+pqtUFawvOn9u3b0/dj5AFSJsIK5xaTZuVsXJI\n6uU7r7UF69mLL77ozfSQ2zb3uCp6ojN9Dkn1YbrNxVCPFYG6I6DEpe49nLN9DN5YHnCkbDVo56yi\ntOxMjfBWjANqEf4slAcOrojTZplv33lJi6zgoZwyrEEuHkUcYxk65JBDrE9SEeWVWYaSlzLR1bIV\ngWgElLhE46JnHQQIGMbg3+mlxI4KmQ6nTJli8+GUW5Zg0XG3ISiCILm65p0iEvLme8wUt81V07ls\na5uLjR4rAoqAMUpc9C5IhAC7Mo8dO7YycV3KtjKI9SVMVLBquJLXEpPX2tIJ8ua2t6hj6T8sXFWY\nislLMIvCTctRBPoCAkpc+kIvF9BG3iohL1V4cy9bVwYpiEuSqTN0yerwm5e0kB+yWZXBP3ybMmVE\nBGeiXldBlLxUoZdUxzogoMSlDr3YoTZUYdUHhII4JcHGfqUMeHkHJ/IncfjNWw+3BAM/W2BUNfox\nGFRtVVsR/dahf2etRhGoLAJKXCrbdd1RXMLE+xhvA9JywQUXmC996UulrIIqw5dBppzc3hSH3/A0\nlJum3XHVrS3SPla19e/fvxQSKnUU/Q15SWqRK7puLU8R6AsIKHHpC71ccBt9jHQqlpbjjz/eXHnl\nlYWsInJhgwjk9Vdxy4s7Djv8ZqmXPqrDXlMy7edaqeKw8+Ua9yMEJsl0oi86qx6KQFUQUOJSlZ7y\nTE+ZNvLB54XBjZ2/R44c2bC0FEk0KCuP9SNN10VNNaT1k2HQJOZM1QMHCm74Vl1//fWmrJ2rpZ6i\nv5W8FI2olqcI/BGBj/1jIAqGIpAWgeOOO84cffTRZurUqebDDz80Z599dtoiCkmPdYJdhm+99VZz\n8803N8rEN2Lv3r3m//7v/zKvSmHgeeuttzpGWlAeR1osLK4cdthh1teDNvFBL9JBcvj87ne/M6QR\neeaZZ8x///d/2xD6cq7q3xs3bjR/8zd/U6lmfOITnzB8uIfoNxVFQBEoBgG1uBSDY58tBWvAtdde\na9s/f/78jg3yDNg
4nb799ttmyZIlLeslHQN9WpN91nx5boSslh0hMlL33XffbX19Jk2aJKcq/U1f\nYPGq2nSRC3rWvnXL0GNFQBH4IwJ/okAoAnkQgBDgqMuyWz74VjDQlCUM0oSF5w2WpbJbt25tSVrQ\ngUixfBg4koron5bsJC0/Kh11Zn0rJ84JA7t8du3aZT71qU/ZkPRRdVXtHP3Hrt5p+tC3NtI3Vdbf\nNzxVn76NgBKXvt3/hbUe6wfTFwgDMIHPCCJWlGDZgRThu8GO1bx9E98jSXAyGdgZOCA+cUI9SKdD\n4xfljwIB2rFjh10K3UniFYdpEdfwX9qzZ08RRXWtDCUvXYNeK64ZAkpcatah3WwOBIHNGAm4NnTo\nUHPjjTcadmaGcEBi2pGGsO4QDawrlIGDJstit2zZYubMmZOJWDBwMLCLRSWqPrHQhK+V+Zt2olsR\nAgEaN25cEUV5VQYrpHbu3OmVTlmUEfKS9n8hS12aRxGoKwLq41LXnvWkXVgwnnjiCWsFWL16tZ3e\nOeaYY+w3RCQsEJP333/f7mRMEDk+Z5xxhjn//PMbSfMO9BAXBg7XIpG3zIZyKQ+ERBVl4cFZmQG+\nqrt5t4KP/sGH6sknn2yVpFLn+b+gz5NYDCvVMFVWEegAAh/vQB1aRR9GAHLghmzngY1FZtu2bZGo\n4OjLdFCcBULeWuPSRBb+0UkGDIgLgyEreJjiylpWXD1JrmEhKbJuptGwTqj4jQD/F0pe/O4j1c5f\nBNTi4m/fqGYxCBRhqaCMV155xVx00UVdefMtw8rDTt5IVcP8t+pyiCaEtqenp1WSSp6HvGB1Kcri\nVkkQVGlFICUC6uOSEjBN7gcCYjXJ6isgxIf9fDjOWk5WNKgz6yqirHVWOV9dp1RkulLuxyr3kequ\nCHQKASUunUJa6ykcAR76spIpTeG85SLylks5DBydHDyKWkWUpt2a1k8E5D7s5P3nJxKqlSKQDAEl\nLslw0lSeIoCPihCRJCoyPcNAIYOF5JE33zRlSd6032VMEaXVoWrpGdTDfVa1NsTpK21T8hKHkl5T\nBP6IgBIXvRMqjQBTCHySPPCFMLSadhBCQ7qyBD11iig9uliohg8fnj5jhXIIeekEea4QLKqqItAL\nAV1V1AsSPVEGAgzYr732mtm/f7+NxUIdsuyZY6LgskxajlkZc/rppzctWbYXI/7wwBdLSsRl67+S\ndOUQpEZWLWXZlTmqfvdc0auI3LI5PvLII8369evDpyv/m5VofUG4l/G3gryIFTBPu/m/+9nPfmZ2\n795t90z64IMPmv7vqE8IIf+D/N8NHjy40JVuefTXvIpAFAK6qigKFT1XCAI8fInhQvyW9957zz4g\nR4wYYYYMGWJXiFCJLAUmLYMTH3nIvvHGGzbfxIkTzahRo5piuUQpKBYV9xoPbgaCLIMAOkFk5E3Y\nLTfLcZR+WcqJy0Mds2bNstswxKWr2rW6rpZq1Q/cs9y7We9b+b8jijIBCfm/g9QeccQRtsrw/x0n\nCVGwb98+w4aW/L/yPzd69Gi763orK2Ur/fW8IlAmAkpcykS3D5bNA5cH39y5c23reWhedtllmR7A\nFAB5ePXVV82iRYvsw5RB+eqrr45cvhx+2PPgR/IQjzzEx1b+0Z8idHHLa3UMBiwbhgBmHfhald3N\n82whwcDLbuR5+rObbUhbN32Z1FJI2U8//bS599577f9MUrLfSifunRdeeMEQ0JD/wW9+85tm8uTJ\nfQb7VrjoeU8QCOIiqCgChSDw1FNP9QSDSk9AVno4Llpef/11WzZ1BDsg9wSDc68qgge9Pc93MC3T\n63qWE9RD3Xkkb/4kdVMHn1NOOaXnX//1X5NkqUwa+lz6VNrZCUx9AKhdO4MXhZ5gmsd+yvi/A/dg\n+w4C6NjvqP87H3BSHfoOAgR0UlEEciHAgy0IzW8fnO0esrkq+igzdUCOeFjz0A7Lw
w8/HElqwunS\n/qbeLA/tsjChXPcj7bntttt6+NRFuL/o6yhx218UUY2qp9vnou4h2sv/AaSuDMISbjP1BVYXWx//\nYyqKQLcQUOLSLeRrUi8PMN7EsIB0WsTCwyAthIIHPMcMdmUI5aYZIEmbJn2cztTtDtSt0sobeKvr\nVTtP//LG307AOQk+7crx9bpLXqLu/U7pjR4QSUiM/N91qm6tRxEAAfVx8WTKrmpqMP+OHwv+LI8/\n/nhmH5a87WYu/qtf/ar5wx/+YIIBzpx99tm2yDJ9Siib9idxnMzjkEs9wWDcgCjNKieWXK9Zs6bh\n/NwopIIH7A6+ZMmS1G0BexHwCCwT8rOy37TpBz/4gXV472b/cv/PmDHD4EDfzf//ynakKp4LASUu\nueDrm5l5aI0ZM8Y2fu3atZGOsp1EhgH+1ltvNc8995xdTSOEIg9paKc/GAQWkNjBNG39UqbUnWew\nvf322w0bLlZ9l2gcTiHI27dvF1gyfbskEOdluUcyFdbFTDNnzjQvvfSSeeSRR8yxxx7bRU3+WDWr\nvdi1e/PmzZXFtOsgqgKpEVDikhqyvp1BSMuwYcO8GRTRieWaPNRXrVrV9BBNSx7S9i7lR1lCGCiR\ndm/55BcpckClfogPFpt2Okj9Pn6zl9QVV1xhLr300sLUCxPEqP4rrLICC+L+fvPNNw0vC7TBl36F\nXE6bNq3p/67AZmtRikAvBJS49IJET7RCwEfSEtZVyAvWEMgMOjOIl/mGHRXvpRVhcokKuks8jXA7\nivgNFkhVrS5gNXbsWGvZKjOOCP0X+GpYrIokj7bAgv64pKVMLLKqq+QlK3KaLwsCSlyyoNZH8/j+\n8JRuET0xXyMMTLydlvnAhxxBkiBILmlxB0V0KZOoUL4r6ER9VTXj49syZ86cQq0tLj5Rx/QhpFfE\nB2sM0zEPPfSQ2bp1a6n3sLQ56zcxX4i35LueWdun+fxBQImLP33htSY8lG655ZbS336LAuG8884z\nwRJtM3v2bFukSyaKqiNcDoMeDpN/9md/ZgYMGGAvd3vgY9BDJyFxYZ19/e2L3i7x7IY1RqxOVSGf\nBApkW4Enn3zS11tL9aoBAkpcatCJZTdB3ty7uYohbRujdC6DvITf0AmVDmnpNmFx8YLEMeUyffp0\n97S3x5AF8MPyUeYUX1oAwn1ddh9TH3XMmzfPTJo0Ka26XUmPzmeddZZdcVQVnbsClFaaCwElLrng\n6xuZcZAM4jY0rBdVaXXYdA2ZQfI6NUKARNy3cJcYdWJ6SnRo940ukBfM+Gy/4LNUaeBzrTF5VoC1\n6g9WhrHXUNWsF2Il4jvv/1orbPR830ZAiUvf7v+2rZeHkDi7ts3gWQJIFxvMibXBJRdJVXUHKPJE\n+alEkSLy4Vfjw8P7vvvuM//0T/9k/u3f/s0rK4bbB5AWltn7tGLN1S/umP53Y+5E3SNx+cPXKK/K\nq8Kq7hge7g/97RkCGodPEYhDgAiZnQgnHqdDnmtE+QyIQ1OETzcCaVTZAUlrisCaJDpoqzKJ5kp5\n3RR0ow1EOQaLbuvTCgui47J1RBK8W5Xhy3kwlw/3QFoBi25Eo06rZ6v0tDkY6nJHjQ52rLblUNaD\nDz7YqjpvzwckPJP+QVC/Rj7a3mkJYkD1iO7XXnttp6tvW1/nEWmrkv8JpEOj/pnirvnfsmYN6xI6\nnv1c3EGAgdEdvHnIyiAjg3wzEvG/yBMn1NcuTVz+rNei6mVA9I28oCfh4+tCWsL9Fb6/wtfDv2XQ\nB5cqC/canzwiz1O+qyiif9RYwTn5QFRc6TZxQRdXh2effdZVr+vHfxIAp1JTBPr162fk88QTT6Ru\nJcHcCOtddZk1a5ZdTuq2Y+fOnXbahKkjBNO+fNIsmxaTvlt2+JjyKJu6mA7phKAXn/CUBTFdcPbE\n52XTpk2dUCW2Dpkeeuedd2xgtTTYxxbs0UWmC
uXekvuAe4EPfRSWu+66ywQDvtdLn8M6R/2+4YYb\nzMKFCzPf82zzQMA9hP9hlc4igD9cQLxspawo9Uq6Tp0qqEAci4671ummBjdaS0bfTpe6vPVJO//q\nr/6q53vf+561fIi1pQgrSNoyqBtsy5Qkdcgmfa4lqkydosoGO6w/ed/Ko8quyjnXGiP3pW8WsTxY\nYu3MMtXMVMWRRx5pn19811HyPJ87hQfTc6KnT1N1anHxikb6o8wLL7xgAvN95d/6QJQ3W94eeKvn\njVeW2Mrbb1bUKZcy0ojUjeNuGYJOvOHziRNC6BMbhCXuWF/K0idKB6wsrJhhiTZOw1WN7BvVtrTn\n6CfuIT4cf//73zfuSrW05fmWnu0aVq5cmVqtYJrC7N+/3+a7/vrrm/JjiRFL8re//W0b9fjOO+9s\nnOMaaYKptqZ87o9XX33VkFfK4XvgwIFt82G5njhxYlM+freyaJ9yyimNtOiESH5XnwkTJth0Ug7f\nrm6SNtzOPXv2yKXGt5vmoosuapznIKrdcfqPGjWqkf/+++9vHHf9oJPMzbVGBA23zlbBzdmkQmCS\najA8mHb4ulsGx2GBqbsskXpa1eXmJY9bdlweN12YhcZdoz6czdw2Us/ll19u5xNdfeTYLQ8syH/h\nhRc2YRTWgfKk3eHv8Fyq1BP+xucgy5tSuBxffvM2i6NxWHjjzWIByZpP6o/yP5FrWb7zlIfVJRg0\nreUjCxZJ9UVHqYv7q8y6kurkW7pgh/MePnUR+pxnEN9pxH3u8cxzxX2+8yx1n4fu844ywuMH5dxx\nxx0tn4/kZ9zZvXu3W6U9Dj+33bo45rkbFrcd8pxO8nx2/UsoWwS93HqlTLnOt1iqSOded3Fzy5Bj\n2hclLr6++Lr8f0SiNC7gHDeO23ABSb7DNwnpXeBdMMOdGb6h6VQ3r9ThfofzpNUPSKJuRoEq7lqW\nGydcntsW99j9p0nyjyH6tvqm7LoNLAzOUW2C1KR9sKadImqFM+WkrTtcFm2SaYbwtaS/KYMpG/qd\n77zlufVSthAWpg6Kws6toy7HOCjXDR/ahKN/UgkPzuF87Z6jrZ6LlJM0L+MIL8Ei4bHHrcM9pnxX\nws9vriV5Pofra1VmeMVPGDt+IxAOV89Wx2H9yesSvXB9XO+GlE5c4kiLgBe+ScI3l7Bm9yYIA+jO\niUq5Ud+U4UoW/Vw9wh3d6lrWG8ctL6o97jlhw0n+MVwMwsetrBPhdGX+dttQVD1x8+1pBos0aZPo\nDt5RhKrsvFHlowdv/JA8LFQcZ2kvbWL5NZYV7lG+s5QTpWOdz4FVN6WM/zvuoTS+VO7zn+dzWNzr\n4EUaGSP4dtvAdXlZDY8RPFvlGnWELSoM2CJumdQnpCb84hvW131+h8cKdJMPRMWVOOLiEgnGTldc\nbFxdXD04L4QmjFd4LKZsV5dwfW7dnTwu9b/EbTAd5HZc3DUAcIHmhnLTA57cqAKW22HhutyO5poM\n8G6Z4Txx11zd3DaF9XavuXnS3DhuvrCOYTLk/qOhC+nlQ3uSCm9HDPLdFPdBIQ+JvPrw8Gz1AMXq\nkcTKwMCelWTE6U+ZSep3yyB9XmuNW174mPuAQQcCw33EmzMERHAMf5OW+wbSw4e0TDeWqWNY5yr/\nhtiBcTeljP877gHuhaTCS6k8t8IvqJQRftaHxwKeF5Kfb3kuhp/pLmkR3dz2u4O0+xx2n+vkCz+H\npSy+4/K5Ooafz2Fd3TJbWVVI42IneobTR+FFW0WfsC7gJNf4Dud3devUcanOuT/60Y+Cdv5RAvJh\npk6dKj+ts2QAbON32PEnWAHSuMbyzfnz5zd+s3HeEUcc0fjNgRsWO1zXTTfd1FjWRdq33nqLL5NH\nP1tAwj84UMmyPrIsW7bMDB482OamHQ888IAJbhz7O7gpTPCPYI/Df8LtwvEquFEbydjcrAgJbnQb\nbbaIsoooI
8oBLUu5YLxly5bIrLIMt91y5YBgtHV8jaygzclgoLfl4lybRMQJV/ROkidtmvPPP99u\n87B9+3br6Mj/IPvQtJL+/fvbZavoBk5Lly61OzuXqWMrXap4PiB45tBDD/VG9aL+73jGpXk2vfba\naw0MjjrqqMZx1EHwEthrLMC52X0u/vKXv7RZ2T5BhGfB6aefLj8b31/72tcaxzyLBYPTTjutcf6a\na64xOMCKAzDP4WDAbnwaCUs6YOwICFGj9Jdffrlx/MMf/rBxfOaZZ9rjXbt2Nc61wuvKK69spPnV\nr37VOOYgPNYyPnRbPl6mAu4NSNj1sLgeywzs/ONy0yHcVAzUkBZEBn46zCVA9mLw5/nnn5dDu69O\n48dHBz/5yU/Cp0we/XoVFnMi6Y0jbQ3fOFJ08OYrh43vk08+uXFc1wNirkQ9ZNK2N/wPGM7Pih8G\n3VYrheKuhcvK8psBnrqpp9UGfhCrwNLSUscs9SbJI7q1wiZJGZomHgHfXhiK+r9j64LVq1fHN965\nykalIocccogcRn5/4QtfiDzvEp5f//rXNg2rCkVkUJff8g35doUxCZk2bZpZvHhx49LNN99sjyEx\nSGCpMZCe8Coee7GEP9QnY+JPf/pTWwMkC7KFQFDk5TiwQNlz/GGcZLVSnLQjmW55ceWUea1Ui4sA\nSwMuvvjipuVdgCdWBmmg3CTyG9YcTuNaYiTdu+++K4f2m2VtSSSvfknqII3b0XLjuEvdOBbSQvpW\nN07SdlFGHsEqEcY9T3l587J0Vt588pbVLr8Qh3C6JIHmwnmy/kYHCSDnliHnlDy4qOhxWQgU9X+H\nNTGNyOCbJk/ZaSEBEEtepqPkscces2MclphOiGv5FCvL+vXrG1WzR1udpVTikhc4iEz4JuYtQKV8\nBNpZJ5Jo4MZbCBO1dr95EIhwD0B8eSh0gsDwhhiOaFrWFJG0MfwdjvcicVbkfDi9/lYEBIGq/t+J\n/u40iJxr9S3WlPB1mR7i/NFHH20vyzc/3OkVe/GjP+5LJqdkBoBjyMt3vvOdxpQQrg583Jc8LDGd\neEZhgZZ6eT5SZ+CThppWXIuSa0XCUuNOa0Ud00bfpVTi4t6AgYNPW8DCgyWMPyycC1tY3JuL9Pv2\n7Qtni/ydV7/IQiNO1vHGiWhmqaf45+ShEHVPFF0xb4hMyYi/S9lTRK30F7+XFStWWP+XtG+urcrV\n84pAUgQ6+X8nOh122GFy2NL6LAmYvgmPB/x2p3UGDRpkk7tT7bSLYGxhCVbCNU5BDCArlOe+aAkx\nwWWBD64AQiLI7LoGNAor4cANzEeQP3GXcKeJqPb4449v1A5hC89sNC4mPOiU5T9OnVKJi+vQlNZS\nQuRAeevmpqAzEG4496bkHMTFvXGifEQAW24+iWCYRz/qTSpF3zhJ682ajnll+efMWkbV82HZwJek\nk1NEYczEn2XSpElWFyFS4XT6WxGoEwKf/exnG81xLSeNk6GDYMVSg7xAMvjtilgfsNq648S3vvWt\nJvJCJF0Zc8gvDqu8ULv5wi/PvJQzLom4L6pyrt132NLTLj3X3eki19UgPE0E+ZKXdPT8yle+0vR8\nZ6x1x0eJ3is6hIkh5XVbSiUu55xzTqN9ODEJYeAkYFx33XUNMkFoZBEY4cyZM+WnXdkwd+7cxm86\nKcyW5SYjEW/mzz33XCM9UwzujSU3clb9GgUnPMh74ySsJjZZmn+MoUOHNvnlxBZc44s4yL7yyiul\nrCJqB1vYn6WV30u7coq4DmHC6nTPPfdYixcPxqgP/7OkYfPG8FRbEXr0hTLS/J9WBQ+mOdNYC90F\nB//xH//RtplYGiAWvJjyLZYHMuInKQMtL7isSBXBx3H48OGNMcgd/CnH3djRtW5AbqQ+6oQQiXCe\nMtMK4yNlhUlDXDnudJGbTsY395zbFvAZMmRIo91sNyDjIwSH7VFccY0OGBD
CMxxu2k4dl7qqCABY\nQilOsHSOeGGHG+gCSx4BkhsBYAGL+TlhxLBld6UQN6h747k3k1uXeyNn1c8tL+kx7aMdiNw4UXmj\nbpyodGnPCfbBGv1eN2ZUWUU8QFk1xttIkZLnnwYr0qCPzMZJdMLicsYZZ9hBOM2DN0nZcWl40LOK\nJ+zPwm8hNGXrQz3sV8U01YsvvmiC+CL2rY03M/d/1W0H+HLfYBFlFQmm+SCui32wq0Oxi1T0MQOe\nG/YhOlX7s7793/EimmYwd1eb8qwkf6v/e8aE999/v4msCEIMsv/8z/8sP+03Uzt79+5tGiuaEgQ/\nGHMISeHW+Y1vfMP6kLikKJyP3xAPN19UGjmHfu3Kk7StviFUssKJNJQpRM3Nw1jH/2ar8Ze0jD1r\n1651s9ljd7HIyJEje13vyomyA8YEBCQ25H/Q6KbAdIHndlOwGwmig55x17geDtpD2e4n6NRGxEPS\nI2n1I0/QwY1yXf3aXSOtq0/4mHLRxxW3LgIBhcUtM/B4b7pMe8N1hIMLNWX46AeBsLodgC5Kr7zn\n0kTwJCAcH6STEV+pK3hQxzYVvQg+V4ZI8EHuG0L/87udPq30oC0SwI4gdkTSzVpWqzrqdJ4+Bae6\nCf2edgdw99lFgDdX3GdeQFzsM51nn/usCz+X3fwcU2Y4T0BY7FgUDPDh5I3flOvqJnVynvEpLO7z\nO6wT6YMX6Sa95fkcHsvC5cpv2iE68B2uQ9LJN3WGA7KiY1w+t71RY5CU3clvHGY7IgDjAgDI3Dhh\nINw0ABoW92bjRgsP9HSMm4Z62nUMdSTVj7RxN2PcNfKmvXHc8sJYUR56y41Lu12J+8dw04WPGRiD\nN/rw6cr/howxECeRMFkJ/05SRpo0DOhp6kibvp0u1M2gWRbBEELEfdUqenE7HfvCdf6X60buIC2Q\nlzTiDtzh55r7zIO4qJSHAGOIjC+MRb5Ix4iLLw1WPZIhwABW1lt9Mg2KT5V0UIgiEK4FpmjN8lhQ\n0DXPQEfdhGOHUKQdXLLggL4QSO6vKJyzlFmnPGnIdVXanaWvsXrwYsr/LN+uFUSJS+d63rXOiDWo\nc7W3rqlU59zgplOpKALMZYYdoCvaFKs2DqP4abQLP49vB3FcwoJPCU6qRa/syRufJY/TLpiQn1Vk\n+POweqlsoT6255gxY4YZO3ZsR5a3l92mIssnwviGDRuKLLKrZfH/RCTctD5O+ImII21gVTfBoNnV\ndvTFyoMXInPvvffapgfWlkS+kZ3CSYlLp5CuWD14puOYWQdhgMbpDOLSTgILRMsVELJEul0ZSa/L\nagtIUR4RJ14hQUnKYknnVVddZR555BGzYMGCtoQuSZlp0kCSWKmE4+95551XOCFMo4svaSHF3Av0\nSdEEuVttxMF74sSJmarHkTZwHbB5w3vZZSpQM6VCALIIaUTchS+pCikpsRKXkoCterFYXBgIeWOq\nupx66qn2je24446zgyUDZpQkCTTHEuk0BCGqHs5RF4NUOwtQq/zh85TFp1Xb3PQsW4YwbN682bCR\nYrcEfdGBtzliUhSBa7fakrVe2kyf8eF/jWXm4BJMo2Ut0qt8ixYtMu4qobTKkR9hZaobTiNtOZo+\nHQJYWyTYZzBd1LE9mJJq2Y9ZpKSJNV3fQkBi6fBGXmV5+umnrcmTQVLEHeAxSwuBYNBoJ0LmkqQN\nl8WbNNMyaU3n4XLiftO2Vps00qcMAligpM1xZXXqGnqtWrXKEhmxIHWq7k7Ww72DVU8kqp+wdK5b\nt65px3tJX6Vv7kOmA932Vkl/1dVfBJS4+Ns3XdeMhywDLAOtT4NcWmBOOukkM2fOHHPppZdGZoVM\nPPXUU434B/i4tCMlPJTTkg/wpK5ODMy8ydNnbjt8JS3SKUJeqn6/SXvk2yXJSe4t7hEIDfnc/pPy\nqvKN9QifnenTp1dFZdWzIggocalIR3V
LTQYTgo5V9eGDtYWoy9u3b28JYZiEJHkrprBwvpYVBBei\niERc+iKuuUSJiLYPPfSQ2bp1q9ck1HdylaRf6GtM7SJpCS756C92aceRuYrC/wbWlrqR0Cr2RR11\nVuJSx14tsE0MfrwlxjmtFlhdoUXx5orvxMKFC1v6ctA+JO7NttVARPnkb2dB4SEeNSVQaGNbFIaO\nWJP+4R/+wfq1tNO1RTEdPY2zLo7Usqqko5VnqAyMGaBFiuhryqScNWvWpLbsiR7d/FZrSzfRr3/d\nSlzq38e5W4iT1o4dOyr39pdE7zRWEwGSPCL/+7//a1iBFTWVJgNaljduKT/vtwyA9913X8upsrx1\nFJ0fMghmrK7ppvNwXLvcewAfqTIIIb4uOKf6biUL4yR6x1k5w3n0tyKQBgElLmnQ6qNpGfywXBB7\noxOxPoqAmYEFUzXfrawpXMtLKsAmyj+GwZdrZQxoafBh6oW9RpYuXZomW9fTyhSfL4M2/ek6mea9\nb5ICjOUiCOBWGesT1kksZlW1FCXtF03XXQSUuHQX/8rUzgMJ0zUm8W4Pxu1AS2JlYCBCWpGadnW4\n16mP8sCFb3aUPuigg8yAAQO6NkWEfjKIxJE3tx2+HXd7ugHcRJI41UraIr+5nyBJVbCY8X8wZswY\n+8JQVZ+4IvtOyyoPASUu5WFbu5J5C542bZrXS1bl4UlskLhl3EVYW9wOFiLEW7nr49DKP8bNW9Zx\ntwf+vO2ijzrp4NnNvorDigCKBAtkiTT3ta8yZcoUa92rqkOxr7iqXr0RUOLSGxM9E4OAz6s+hLQc\neuihsf44RZMW4KJupozaTaW5b/Fl+Uagj1hbqr6qo0zyRZ8V7VQL9mXIv/zLv9ipWlYa+Wjx9Pm5\nUEZ/aJndRUCJS3fxr2Tt8pBavHixNw9RIS0AGhdcTSwjRUwRSedRJvUzoKQhReGBs8jpCPqof//+\nlfGNECzD32J1cf1LwmnS/O4UcUyjU7u0cn/t2bPHS4unPA/i/u/atVGvKwJpEFDikgYtTdtAgIeV\nL5FOebB/9atfNUcffbRdhRG1wkcUT0MsJE/cN5YNN9AbZAR9srwVk88doN0ppzgdwtfQgby0tUiC\nFq6nU78JIBi3pD1OjzCmnXKqjdMpzTX0R6QfZbrWB58X7jN8WhAlLRYG/dMhBD72j4F0qC6tpkYI\nsPkZRIHVRkcddZRhcOmGPPHEE9YP4rLLLrOD2yc+8YmWapRBWhhQDjvssEad1M8Sab7jdGlkcA4g\nQFhd5LN3717zy1/+0hKh3/3ud031ONl6Hb700ktG3s57XazgiU9+8pPm5Zdfbmy4F9cEBtO33nrL\nYsagz3QcJE4wjcvr27UwaUE/9tvif40NCD/88ENz9tlnd0Vt/peIRE0oADZAjHtZ6IqCWmmtEVCL\nS627t/zGYXGYMGGCOeaYY2y0T3kzLLtmBiiWZ2/YsMFaWVgyGmfliBoE8ujIgzvOIlI0SaK9rj9G\n3LQS1rAqRzsO9wt9h6XEtUa5aVyn2jL9htw6yz5ud79yHSsjMn/+/NzL+pO2h/vwu9/9riUr7Bjc\nzqcrabmaThFIhQCbLKooAnkQCMKb99x9991s1tlz44039gQDTJ7iYvNKXQFB6pk8eXIPvxG+g4G9\nZd5gt92W19JcoJ6kZSVNl6Z+SQvGlC8fwYHrAYmLxULKqNK32ybpg6i2V6lNrXSlb5P+D/F/x/9C\n2f936Bo4n9u6xo0bl1i/Vm3U84pAHgTU4pKK5mniOAR4C7zrrrvslE3wILXm7DgrSFxZ4WuUzTJL\n3i6HDx9uZs2a1estU6wSYT+Goqwf6EAdSdtEeqQTViixOnzsYx8zp5xyigkeCmEIK/2bN/s///M/\nN6wyqotVJapDstwz3JPsx4UfEP93WEDD/wNRdSU5R9nLly+3+1yRPquvUZK6NI0ikBSBP0maUNMp\nAu0
QYIAmdkrwtmiTEkGTDxvGQR7SCoMx4cMpg6mRffv22YicEJioBzPz7JynLh64CAMBefMKuiBJ\nSQtpwQM9RBfOlSXoRdv/8Ic/mOCNuKxqulYuZOxP//RPbRvT9EHXFM5QcRbSQjXc9/J/xxQh/i/4\nwbDlRZb/O/TACZi4LJBEfIaWLFliNyr1dQuGDHBrlgojoBaXCndeFVQneBZ+KBs3brT7HTGoDho0\nyPpgoD9Ldnk47t+/3zbnwIEDNt22bdvs71GjRpnRo0ebkSNHpnIAhGjwQIdERZEcW3jCPzz84/xZ\n2hVD/rw6tKtDrkP0du7cGRt8T9JW6RsMsbbVNbhZVtLSqg/5vyOC84svvmg/bFqJM/3QoUMbWYYM\nGWJ2795tf7f6vzvttNM6YjFsKKUHikACBJS4JABJkxSDAJYHHExZ8cKDEsGKwl467gOVqaA459Ok\n2jzzzDPms5/9bCoriVu26JuXdFAOA1MnLAVYt5C6hVyvM3EpmrS49zDHch+3+7+DyBxxxBEduU/D\nOupvRSANAkpc0qClaSuDgAwGWF0gS2nJB/l54BdFNrAAMXWEPmVKXYkLfYFlrm6+O3KfdsIPqsz7\nTstWBDqJgPq4dBJtratjCDBFJEQB0sIbO4NfEsniz9KuXAiQu5y5XXq93oxA2YSvubbO/JL7TElL\nZ/DWWuqDgBKX+vSltuQjBKJ8SiAvvN3KG24rsMjLQFLGYCIEqlXder7vICAWuDLus76Dora0ryKg\nxKWv9nxN2w0xabWKSKZ95E3XhQBrjBCeMt/u0a0deXL10uM/IgBmdRnkhbSUeZ/pfaMI1BkBJS51\n7t0+2DaZImrVdLGmQFJEGBT5pPWDkfxpvqkfkpR02ipN2XVOS7/itF11UdJS9R5U/X1A4OM+KKE6\n1B8BBmp8PFjmLEsvo1otS6VZ4cC+LGnessViElWue443XZm2IWAbgc3EGuOmK+uYupLqmlYHlpdv\n3bo1bTbv0wfRcr3XsZ2CSlraIaTXFYFkCChxSYaTpsqAAFaMF154wQaRI54EsSSGDRtmY7gQ+TZK\nWLLJEunFixeb1atXG/YgIvYLmyjGkQvqajVFFFUP51il8vvf/77V5VLPExeGgSyuTVkUGDx4sMU7\nS16f8xBvhHuhqqKkpao9p3r7iIAuh/axVyquE1E3V65caa0rE7JufpwAABmPSURBVCdONASRO/XU\nUzMtBZZAWuxAO2DAADNnzpzIYHRpLRikl6BykB4sQkWTiHbdSL1IGqtSuzJpRx2XDRPFlUCE7Ehc\nNVHSUrUeU319R0CJi+89VCH9IBnslYK0Ihh5muMSovvuu68xiKUhLTJlFfZnaXU+j75J8qbRPUl5\npCHcOyHaw21Mmt/HdFjTNm/e3HFymRcLJS15EdT8ikBvBJS49MZEz6REAMsBkVrxX3EJRcpiEidn\nsGc/lkMPPdTMnDnTDtRJrBZJLCtlEIl2DSu6TjDB12X27Nntqq7EdQZ/9qvCQbdKoqSlSr2lulYJ\nAV1VVKXe8lBXrCC82eN/gPNtJ0z51Ld9+3YzduxYO32QZP8aBhGk3XQQZUMksMB0SopcIg05Y0+a\nH/zgB7YdnWpDmfU88cQThinHKgn3EGRalzxXqddU16ogoBaXqvSUh3ryZr9q1Sq7Y3O3piUgJNde\ne621vixfvjxyoGAQEX+WpDBSLoNOEktO0jLj0mV9Oyefu+IGEoTOVZ1aicKIqa+FCxeaquxMXLQF\nLQoTPacI9GUElLj05d7P2HasEVdffbV5//33zaOPPtqxwb2VuugzY8YM884775i1a9c2yEtev5Uk\nU0utdMpyPsmAFyYqrQjZ7bffbpedL1iwIIsq3uQRvyksbFWQJH1YhXaojoqAzwgocfG5dzzUDTIw\nZswYq5lLEnxQFQvQm2++ackLevJpNzXUTu+85Kdd+e516oIsuTozE
LrSiqi4aTimHKwu7QLyhfP5\n9nv8+PFmxIgRldjtWkmLb3eP6lNXBJS41LVnS2oXy1LDlo2SqspULOTlpZdeMo888og59thjM5UR\nlYlBKSlpiMqf9Nwzzzxjk7L0G8kzBQcWSFWtLmCOHxO+U777iihpsbea/lEEOoKAEpeOwFyPSph+\nIJCcb5YWF12mFiAtBJZL4rTr5m13XLTfi1hz3HrFOTgPYZHyqm51YSURxIUVaz6Lkhafe0d1qyMC\nSlzq2KsltAlCcNVVV9mVKp1yWM3aDAgB01llDHp5/F7CRIVAce60kNveogbDe+65x2zZsqVwEufq\nWsbxihUrzKJFi+zqsTLKL6rMovqpKH20HEWgLyCgxKUv9HLONjLgMk2CJaMqKzuwjvDGvmbNmlzT\nLVHQCQFpZxWRdFJGHFGRNPJN3rC/i1xL8005Z511lnVenjRpUpqsXUtbZt8V2SglLUWiqWUpAskR\nUOKSHKs+mxK/FmTp0qWVwqDst3YGLtfvBaLhBklLQ1SigGUALyIWCOWgJ74irSw8UfV34xxEqyxr\nWZHtUdJSJJpaliKQDgElLunw6nOpeUBXxUEyqnOwumBpKMPaAFF55ZVXzEEHHWT3UZIYKlF6ZD1X\n1ABJoMBp06Z5HzYfkvzBBx94PbVVVJ9kvSc0nyLQ1xFQ4tLX74A27a/SctSophRJvLBcRAV7y+P3\nEqWze66oKSPKdJeL+7hKx3f96AusVu2mCN3+02NFQBEoHgElLsVjWpsSixz0uwkK5OuSSy5JbXUJ\nExV3WijcnjIHNYgRUoRTtJADHwIHuhiKXr6uWCuSQLrt1mNFQBFIj4DuVZQes7Y5TjnlFNOvXz/7\nYZdeV+Q836+++qp7KfaY/VrcvLGJC7r4wAMPmFmzZnkfQ6Ndc9kSgBUq7QSi5n4gCrxdyyfOSsE1\n0pGfQa5IQQ/XdyZP2cR0GTZsmNUVYtZtASuIpQQOjMO4W7oqaekW8lqvIhCNQCWJC2RABnFIgkrx\nCPCwXrZsmR1Uii+9syXKSihIhSsuSeFYCIp8ZxlEyYuFRKwkbn15joUU5SlD8kJe2MUbCxIOzN0S\niBMrng455BBvYwMpaenW3aH1KgKtEfh460t6pS8jsHHjRjN58uRCpid8wPGv//qvLRFzdYEMlCGs\n3BHyUsT0jugI0WCwL2JlELt4v/7662bq1Klm3bp1hngvReoqOkd9QwbYEPPv/u7vzMMPP5x6Ci+q\nzDLOKWkpA1UtUxHIj0AlLS75m11uCT/5yU9MT0+P/TAwVFHWr19v34arqHtYZ4LnfelLX7JTc2JN\nKYu0SN1CAoqcjhELEANqEQIGW7duNSeeeKLd1wjyUlTZrfRjdRNWFoLi4ehaxmqvVnWnOa+kJQ1a\nmlYR6DACwQBbGdm2bVtPAE/kJ5i379WOBx98sIfzbh7O7d+/PzKtpLv88svt9TvuuKORVzJIGr7R\nh8+FF15o01E24tYp51rlf/bZZxv5KZOyOBeWxx9/vKEL6aIkTXuj8rvngoG3J/CrcE9V/rgbbQpW\nIfUElo1CsSu6PJQLSETPuHHjesDotttuK7TvweCpp57qCQiS/XDss6AveKgoAoqAnwjUcqro3Xff\ntdMczz//fDDGN8s111xjjjzySBOQAzN48ODmi86viy66yETld5LYt9Wbb77ZPZXqGBP9vHnzmvJQ\nJ5+AhFgzftPFFj+KaK9btFgJxGrgXstzjJ7EPdmxY4fdqPHll182AYlsKpK+OfPMM83RRx9tLQFn\nnHGGOeKII5rSZP0xfPhw87Of/axjUyLo6Trtxq1KStMmLCXik5MmX1xapp/Y24m+x4eMmDTnnnuu\ntYicfvrpqaensFgw3Ugfr1q1yoD9nDlzDFNUPotaWnzuHdVNEfgjArUkLvhmxJEOBsuLL77Y7Nq1\nyxDdNCyPPfZY+FTk7zykhQLDp
MWtBIJ1wgknGAaNdpK3veHyIRgMNEUJ5dHWxYsXty2Svgnjz6qg\nW265JTeBGTFihNm9e3dXti2AbEAKIDJFEEKIBX40RZTldgoEBuddSAbEgylDsEe4J8AQGTJkSNP/\nzp49e8yBAwfMW2+9Zd544w1LTgMLjl2GfsMNNxSup1Wi4D9KWgoGVItTBEpCoFI+LgzigeHKWiME\nD5Z2cg6/EoRlwy5pwXLBdT7BdItks2/67u/GhY8OKDeYBmrkDV+X3zzUpfws/iyt9KN89gZqJ0W1\n162HwV0GKPd8luPnnnvOYDVJQlpalU9eyqCsPII1h4G1WyJOtWLRyqMHhKWoJdJRekCwsI6wzQP1\nbN682UAgEQgKfTJ//vzGZ+fOnfba6NGjrcWG/wksOPiwFE2ubEUF/xFnaumjgovX4hQBRaBIBIIH\nTOUEX44AA/vBn8SV4OHauBaQCveSPW6V1z1P2fiuRInUy7f4woTTJfVxaacfdQSDhC2+lY9L1vaG\ndXZ/33333T188gq+RAFZaPSHi12WY8qizKyCbwh+HN2WIv1eyvB36TY+na4fX666+XN1GkOtTxHo\nJAK1myp67bXXgjHxjxJlNRg1apRctkGvgkGkyeQtF5NM0bAPTh4hmmtY8O9whWmWOF+cotrr1olV\ngjfnvIJvA1M/rmDJCgifjd3BVFiU8PbOfjVMVbjWM8qizJtuuikqW2XOFen3UuQS6coAWKCiEm+n\nClahAputRSkClUagdsSFCJwi+LG0kyjiwuCaRPr3758kWcs0UU6nYZ8b9IuTItobLh/SEKVbOF27\n3xAPV5iau+yyy9xTkcdCGiEoRBd2/W0os+rERRpdhN8LJAjBP0OOpXz9jkdASUs8PnpVEfAVgUr5\nuPgKouoVjYBrLcEXKAlpCZcEiSGviFumnKvyt/hU5PF7oQxioqgkR0BJS3KsNKUi4BsCtSMurrXE\nda4N5t8aTrTucRGWhaydihNsWMIWlnb6ldFeQrAzRVWkDBo0KHNxefJmrrSDGZmm4BPekiCNCrJE\nOk2evppWSUtf7Xltd10QqN1U0WmnnWZ9V+ggfCVk2sHHDiN6KPFiXCHuhQirYNoRlzLaO3To0F6+\nKaJT1m9WpWRZdUV95K27FOH3UtYSabDHIsSSZ/yM9u3bZ/bu3dvUJZBd7humT/HJgkj5KEpafOwV\n1UkRSIdA5S0u7733XlOLzznnnMZvYqG4uzNjRbjuuuu82aCR2CaufixtRmeRK6+8Ug5bfpfVXpa8\n5hWccEWIzXLnnXeasEVJrkd9kxZ83LgubplReeLOMfAywPosDPgMrjLAptEVqw2+LnzyCmUQnn/K\nlCk2GN2ECRPMypUrbbEDBw60u4azc7h8xJmblwXOsQkqzutsI5ClLXn1j8oveqgjbhQ6ek4RqA4C\nlbe48AbIQ5IpE2K54EdBfAlxWoUIuGTA7RoesN2WOP0kbkacjmW0l+BieeKuiL4MXC7pIGAfn2Bb\nA3PooYfagU3Sut9YWN5///2mFUVyPc9KLsgYVgHfBZ8VBlmsHOIDk1Rn0ueJqktenKgXLlxoJIBc\nsAVA21gsYQsLxCdYqm02bNhgrS/odf3113ctcq6SlqR3kKZTBCqAQCfXXhdVF3v5BNA2fQLi0ig+\nIDNN+/+E0/KbuC2uuHFc3LLcNBy7ZRFbJUrIL+nC9ch5vt29kNzzHIfztYrjQv1Z2hult5wjpkXw\nVio/M38Tg0b2cQq3L8tvypK4NlmUIoZLsCopS9au5AksTpn2zMmST2Lc0O/E8Ckyrgn6dHOvIo3T\n0pXbVytVBEpDAIfVSgoDuxvcLIpskCY8cBL0LSq4HGllMI0qS0CSNHznJS4QDspwiQ765tlkMWl7\npT2tvhnAithoLnBA7tUHLoZJj2kXZeUR6ipyQM6jS9K8DPpZgswlHawpn00VhbDwu0wRAhPsg1T
I\n/dVOV+7hqvV5uzbpdUWgryNQWeLS1zuu7PYH+x/1PPzww4VVAzF0CVpSwkIe8uYVLC3sTlxVgbyk\nJRXtCA/XwQRLVKcHd6w63ANFRGhu1aeQlrSYtSpLzysCioA/CPRDleABoqIINCGAY+a9995b+Ioe\nHKTZIRp/k5/+9Kfm17/+dVO9n/nMZ8zJJ59sV6cUuTP07bffbuuZPXt2U31V+oHPC6uPAutIYrVb\n+busWLHCxsfBQZz9hLohtAc/LnYCX7RoUaEB9CgbnDQoXzd6VutUBMpFQIlLufhWtnScK4niG7yJ\npxoofW0wS4Vx+k3r7Opbe3AypW+StiPKKXXmzJl26wQf8KAtM2bMMO+8845Zu3ZtIURDSYtvd63q\nowgUi0Dll0MXC4eWJgjwpnrjjTeaZcuWyanKfmM9GjBgQOLB3ueGYkXgkzRYHWkhB3wQSAsr7oi0\nm5T8lIkH9xk7UAdTgmbMmDENPbPWqaQlK3KaTxGoDgJqcalOX3VcUwbHsWPH2kGuyiZ3pp6mTZtm\nAr+djmNYZoX0D5ssJukb0gaO4Ja0FGXZKLptQqqy6qekpege0fIUAT8RUIuLn/3ihVbE5mCDw+XL\nl3uhTxYlsLbgxsWu4Aze7idLeT7lSROsjuB77Kz96KOPJiI63WjnggULrL/L1Vdfnbp6JS2pIdMM\nikBlEVCLS2W7rjOKM9BjdeGbaYeqyUknnWTmzJkTGfiMNrmCT48P0yeuTkmO2/m9MKhjmfFleiiu\nTUxpMWUULJc2SR2plbTEIarXFIH6IaDEpX59WniLMOF/8MEH1heh8MJLLJBw82vWrEm8MopBk8Hd\nlaRTMW6ebhyL7lERbM866yzrANut1UNp8YCIECGZvgu3J1yWkpYwIvpbEag/Akpc6t/HuVvIoMgA\nft9990VaLnJXUEIBRVkZKCeIBdKkYbvBtClxh39gRXLJFsvAd+zYYZ588skOa5KvOpZrs0R6+/bt\nLQsKt7VlQr2gCCgCtUJAiUuturO8xjBIMGXkwxLadq2EaJVpZQhPMbHU2qdpNMiWOOyiW1WXtGN1\n4Z6bPn16ry6nD3wmkL0U1hOKgCJQGAJKXAqDsv4FMfXy0EMPma1btzYGRh9bzYDH8lqcPTsh+JhA\nDlxxrR7u+U4do9Ott95qjjrqqMS+Ip3SLWk9QpaZvhMiRl4lLUkR1HSKQD0RUOJSz34trVV5l6yW\npthHBfuiX3iKqdOOvxAXrC3oceyxx5YNe2nljx8/3owYMaJhdVHSUhrUWrAiUBkElLhUpqv8UdQX\ncuAiwvTQ3LlzvY1TIs6zrs5lTjHRR0inrE5uu4o8hqhMnTrV+rooaSkSWS1LEaguAkpcqtt3XdWc\ngTHYuNAGNev2EmJIAUtokazBy7oBZtQUU1F+G5CiKvgjJcGdJe3XXHONue6665Ik1zSKgCJQcwQ+\nXvP2afNKQoA3eXxe8Cfp5moj3sJx4Jw4caKN1+L6QpTU9MKKxaE37NRLe1zJMsW0adMmG4+m24TS\nbUee42D3aruXUZ4yNK8ioAjUBwG1uNSnL7vSEiEORKa97bbbeg3EZSmFleW73/2uuf/++003dzgu\nq31SbtQUUzvHX6xh/fv3r6xTrrRdvvHTgSCHHaDlun4rAopA30JAiUvf6u9SWsvgin8JIeVnzZpl\nCNlepuWDMP7Ud8wxx1irT9hqUUojPSo07PiLau4UE1MrS5YsaTrnkfqZVKnT1FcmADSTIqAINBBQ\n4tKAQg/yIoD1Zf78+Wbbtm2WwLAipChSATl66qmnbFAy9Fy4cKE5//zz86pcm/wyxfThhx+ac845\np7KxW1p1yJQpU2xsnqpE/23VDj2vCCgC+RHQTRbzY6glfIQAb/1EaCVU+759++xyXAYcoqDiiJpW\nICtYVygDX49169ZZwkI0VSUtzWiCPZ+DDjrI4BOCYJmpiww
dOtTeU3Vpj7ZDEVAEsiOgFpfs2GnO\nNghAPFh5tH79erNhwwYzYMAAO71DXA6EnaddYQfjAwcOmLfeesu88cYbNlQ9g/All1xiLrjggsKs\nN26ddTuGJBIgcOnSpbVqGg7HixcvrtzWBbXqBG2MIuAJArqqyJOOqKMa+Llceumljf2NsABATvbv\n328JCtNKrgwaNMgMHDjQjB492nzjG9+olY+G284yjyF+WCfqJljcVBQBRUARAAElLnofdAwBlufW\nZYlux0DTiiwCOOfiO6WiCCgCioD6uOg9oAgoAt4jgJN3Fj8p7xumCioCikBqBJS4pIZMMygCioAi\noAgoAopAtxBQ4tIt5LVeRUARSIyAWlsSQ6UJFYHaI6DEpfZdrA1UBKqPAFFzZZl39VujLVAEFIE8\nCChxyYOe5lUEPEOAUP/E0FFRBBQBRaCuCChxqWvParv6JAKDBw82e/furV3bWVF04okn1q5d2iBF\nQBFIj4ASl/SYaQ5FwFsE2IBx9erV3uqXVTGCEhLjR0URUAQUASUueg8oAjVCgKB/WCbqFO6f7iGS\n8umnn16jntKmKAKKQFYElLhkRU7zKQKeIjBy5Ejz3HPPeapderUgYe+9954GL0wPneZQBGqJgBKX\nWnarNqovIzBq1Ci70WVdMHj11VfNxIkT69IcbYcioAjkRECJS04ANbsi4BsC7JyNlaIusU8WLVpk\nIGMqioAioAiAgBIXvQ8UgRoigIXirrvuqnzLfvzjH9tpIsiYiiKgCCgCINCvJxCFQhFQBOqFANaW\nL37xi+bnP/+5wWG3qjJ+/HgzYsQIM3369Ko2QfVWBBSBghFQi0vBgGpxioAPCLAp4fDhw83y5ct9\nUCeTDlhbiN9y9dVXZ8qvmRQBRaCeCKjFpZ79qq1SBKyfC3FdCJcPkamaqLWlaj2m+ioCnUFALS6d\nwVlrUQQ6jsDnPvc5c9ttt1VymmXFihXm7bffVmtLx+8arVAR8B8Btbj430eqoSKQGYHf/va3BqvL\nvHnzzKRJkzKX08mM4p+zZs0a66fTybq1LkVAEfAfASUu/veRaqgI5EIAX5GxY8eazZs3VyKI23nn\nnWfOPfdcM3v27Fzt1syKgCJQTwR0qqie/aqtUgQaCLC6aNasWWbChAkGC4zPMnPmTKuekhafe0l1\nUwS6i4BaXLqLv9auCHQMAUjBm2++adauXevlEmn027hxo9m6dauX+nWso7QiRUARiEVALS6x8OhF\nRaA+CCxYsMAMGzbMjBkzxjvLC6Rl1apV5vHHH1fSUp9bTluiCJSCgBKXUmDVQhUBPxFwyYsPO0gz\ndSWWoKr44PjZs6qVItB3EFDi0nf6WluqCFgEIC84v+IEu2nTpq6hwuohrD8yfcXybRVFQBFQBNoh\noMSlHUJ6XRGoIQI4vz7yyCPmqquuMlOmTOn41BFxWnAahkBhaanytgQ1vD20SYqA1wgocfG6e1Q5\nRaA8BNi4kL2MEGK93HPPPeVV9lHJLM3G0sOOz8Rp0dVDpUOuFSgCtUNAVxXVrku1QYpAegQgFPPn\nz7fRar/+9a/biLVFWkGefvpps3LlSrv3UJWC4aVHUnMoAopA2QgocSkbYS1fEagQAhCYBx54wCxb\ntsxMnjzZjB492owcOTLTVA5lPfvss+b+++83AwYMMDNmzDBf/vKXM5VVIQhVVUVAESgZASUuJQOs\nxSsCVUQAx9kXXnjBrFu3zqxevdr6orCUeuDAgWbIkCHm4IMP7tUsdnI+cOCA2bFjh81z4oknmnHj\nxpnLLrusEhF7ezVITygCioCXCChx8bJbVClFwC8EsJ7s2bPHEpMtW7ZEKjdo0KAGsTnuuOMquSN1\nZMP0pCKgCHiFgBIXr7pDlVEEFAFFQBFQBBSBOAR0VVEcOnpNEVAEFAFFQBFQBLxCQImLV92hyigC\nioAioAgoAopAHAJKXOL
Q0WuKgCKgCCgCioAi4BUCSly86g5VRhFQBBQBRUARUATiEFDiEoeOXlME\nFAFFQBFQBBQBrxBQ4uJVd6gyioAioAgoAoqAIhCHgBKXOHT0miKgCCgCioAioAh4hYASF6+6Q5VR\nBBQBRUARUAQUgTgElLjEoaPXFAFFQBFQBBQBRcArBP4fntNQJrCufL0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 27, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review = \"The movie was excellent\"\n", + "\n", + "Image(filename='sentiment_network_pos.png')" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "# Project 2: Creating the Input/Output Data" + ] + }, + { + "cell_type": "code", + "execution_count": 74, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "74074\n" + ] + } + ], + "source": [ + "vocab = set(total_counts.keys())\n", + "vocab_size = len(vocab)\n", + "print(vocab_size)" + ] + }, + { + "cell_type": "code", + "execution_count": 75, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['',\n", + " 'inhabitants',\n", + " 'goku',\n", + " 'stunts',\n", + " 'catepillar',\n", + " 'kristensen',\n", + " 'senegal',\n", + " 'goddess',\n", + " 'distroy',\n", + " 'unexplainably',\n", + " 'concoctions',\n", + " 'petite',\n", + " 'scribe',\n", + " 'stevson',\n", + " 'sctv',\n", + " 'soundscape',\n", + " 'rana',\n", + " 'metamorphose',\n", + " 'immortalizer',\n", + " 'henstridge',\n", + " 'planning',\n", + " 'akiva',\n", + " 'plod',\n", + " 'eko',\n", + " 'orderly',\n", + " 'zeleznice',\n", + " 'verbose',\n", + " 'amplify',\n", + " 'resonation',\n", + " 'critize',\n", + " 'jefferies',\n", + " 'mountainbillies',\n", + " 'steinbichler',\n", + " 'vowel',\n", + " 'rafe',\n", + " 'bonbons',\n", + " 'tulipe',\n", + " 'clot',\n", + " 'distended',\n", + " 'his',\n", + " 'impatiently',\n", + " 'unfortuntly',\n", + " 'lung',\n", + " 'scapegoats',\n", + " 'muzzle',\n", + " 'pscychosexual',\n", + " 'outbid',\n", + " 'obit',\n", + " 
'sideshows',\n", + " 'jugde',\n", + " 'particolare',\n", + " 'kevloun',\n", + " 'masterful',\n", + " 'quartier',\n", + " 'unravelling',\n", + " 'necessarily',\n", + " 'antiques',\n", + " 'strutts',\n", + " 'tilts',\n", + " 'disconcert',\n", + " 'dossiers',\n", + " 'sorriest',\n", + " 'blart',\n", + " 'iberia',\n", + " 'situations',\n", + " 'frmann',\n", + " 'daniell',\n", + " 'rays',\n", + " 'pried',\n", + " 'khoobsurat',\n", + " 'leavitt',\n", + " 'caiano',\n", + " 'sagan',\n", + " 'attractiveness',\n", + " 'kitaparaporn',\n", + " 'hamilton',\n", + " 'massages',\n", + " 'reasonably',\n", + " 'horgan',\n", + " 'chemist',\n", + " 'audrey',\n", + " 'jana',\n", + " 'dutch',\n", + " 'override',\n", + " 'spasms',\n", + " 'resumed',\n", + " 'stinson',\n", + " 'widows',\n", + " 'stonewall',\n", + " 'palatial',\n", + " 'neuman',\n", + " 'abandon',\n", + " 'anglophile',\n", + " 'marathon',\n", + " 'chevette',\n", + " 'unscary',\n", + " 'eponymously',\n", + " 'spoilerific',\n", + " 'fleashens',\n", + " 'brigand',\n", + " 'politeness',\n", + " 'clued',\n", + " 'dermatonecrotic',\n", + " 'grady',\n", + " 'mulligan',\n", + " 'ol',\n", + " 'bertolucci',\n", + " 'incubation',\n", + " 'oldboy',\n", + " 'snden',\n", + " 'plaintiffs',\n", + " 'fk',\n", + " 'deply',\n", + " 'franchot',\n", + " 'cyhper',\n", + " 'glorifying',\n", + " 'mazovia',\n", + " 'elizabeth',\n", + " 'palestine',\n", + " 'robby',\n", + " 'wongo',\n", + " 'moshing',\n", + " 'eeeee',\n", + " 'doltish',\n", + " 'bree',\n", + " 'postponed',\n", + " 'gunslinger',\n", + " 'debacles',\n", + " 'kamm',\n", + " 'herman',\n", + " 'rapture',\n", + " 'rolando',\n", + " 'tetsuothe',\n", + " 'premises',\n", + " 'bruck',\n", + " 'loosely',\n", + " 'boylen',\n", + " 'proportions',\n", + " 'grecianized',\n", + " 'wodehousian',\n", + " 'encapsuling',\n", + " 'partly',\n", + " 'posative',\n", + " 'calms',\n", + " 'stadling',\n", + " 'austrailia',\n", + " 'shortland',\n", + " 'wheeling',\n", + " 'darkie',\n", + " 'mckellar',\n", + " 
'cushy',\n", + " 'ooookkkk',\n", + " 'milky',\n", + " 'unfolded',\n", + " 'degrades',\n", + " 'authenticating',\n", + " 'rotheroe',\n", + " 'beart',\n", + " 'neath',\n", + " 'grispin',\n", + " 'intoxicants',\n", + " 'nnette',\n", + " 'slinging',\n", + " 'tsukamoto',\n", + " 'stows',\n", + " 'suddenness',\n", + " 'waqt',\n", + " 'degrading',\n", + " 'camazotz',\n", + " 'blarney',\n", + " 'shakher',\n", + " 'delinquency',\n", + " 'tomreynolds',\n", + " 'insecticide',\n", + " 'charlton',\n", + " 'hare',\n", + " 'wayland',\n", + " 'nakada',\n", + " 'urbane',\n", + " 'sadomasochistic',\n", + " 'larnia',\n", + " 'hyping',\n", + " 'yr',\n", + " 'hebert',\n", + " 'accentuating',\n", + " 'deathrow',\n", + " 'galligan',\n", + " 'unmediated',\n", + " 'treble',\n", + " 'alphabet',\n", + " 'soad',\n", + " 'donen',\n", + " 'lord',\n", + " 'recess',\n", + " 'handsome',\n", + " 'center',\n", + " 'vignettes',\n", + " 'rescuers',\n", + " 'pairings',\n", + " 'uselful',\n", + " 'sanders',\n", + " 'nots',\n", + " 'hatsumomo',\n", + " 'appleby',\n", + " 'tampax',\n", + " 'sprinkling',\n", + " 'defacing',\n", + " 'lofty',\n", + " 'opaque',\n", + " 'tlc',\n", + " 'romagna',\n", + " 'tablespoons',\n", + " 'bernhard',\n", + " 'verger',\n", + " 'acumen',\n", + " 'percentages',\n", + " 'wendingo',\n", + " 'resonating',\n", + " 'vntoarea',\n", + " 'redundancies',\n", + " 'red',\n", + " 'pitied',\n", + " 'belying',\n", + " 'gleefulness',\n", + " 'bibbidi',\n", + " 'heiligt',\n", + " 'gitane',\n", + " 'journalist',\n", + " 'focusing',\n", + " 'plethora',\n", + " 'citizen',\n", + " 'coster',\n", + " 'clunkers',\n", + " 'deplorable',\n", + " 'forgive',\n", + " 'proplems',\n", + " 'magwood',\n", + " 'bankers',\n", + " 'aqua',\n", + " 'donated',\n", + " 'disbelieving',\n", + " 'acomplication',\n", + " 'immediately',\n", + " 'contrasted',\n", + " 'reidelsheimer',\n", + " 'fox',\n", + " 'springs',\n", + " 'toolbox',\n", + " 'contacting',\n", + " 'ace',\n", + " 'washrooms',\n", + " 'raving',\n", + " 
'dynamism',\n", + " 'mae',\n", + " 'sky',\n", + " 'disharmony',\n", + " 'untutored',\n", + " 'icarus',\n", + " 'taint',\n", + " 'kargil',\n", + " 'captain',\n", + " 'paucity',\n", + " 'fits',\n", + " 'tumbles',\n", + " 'amer',\n", + " 'bueller',\n", + " 'redubbed',\n", + " 'cleansed',\n", + " 'kollos',\n", + " 'shara',\n", + " 'humma',\n", + " 'felichy',\n", + " 'outa',\n", + " 'piglets',\n", + " 'gombell',\n", + " 'supermen',\n", + " 'superlow',\n", + " 'enhance',\n", + " 'goode',\n", + " 'shalt',\n", + " 'kubanskie',\n", + " 'zenith',\n", + " 'ananda',\n", + " 'ocd',\n", + " 'matlin',\n", + " 'nosed',\n", + " 'presumptuous',\n", + " 'rerun',\n", + " 'toyko',\n", + " 'mazar',\n", + " 'sundry',\n", + " 'bilb',\n", + " 'fugly',\n", + " 'orchestrating',\n", + " 'prosaically',\n", + " 'maricarmen',\n", + " 'moveis',\n", + " 'conelly',\n", + " 'estrange',\n", + " 'lusciously',\n", + " 'seasonings',\n", + " 'sums',\n", + " 'delirious',\n", + " 'quincey',\n", + " 'flesh',\n", + " 'tootsie',\n", + " 'ai',\n", + " 'tenma',\n", + " 'appropriations',\n", + " 'chainsaw',\n", + " 'ides',\n", + " 'surrogacy',\n", + " 'pungent',\n", + " 'gallon',\n", + " 'damaso',\n", + " 'caribou',\n", + " 'perico',\n", + " 'supplying',\n", + " 'ro',\n", + " 'yuy',\n", + " 'valium',\n", + " 'debuted',\n", + " 'robbin',\n", + " 'mounts',\n", + " 'interpolated',\n", + " 'aetv',\n", + " 'plummer',\n", + " 'competence',\n", + " 'toadies',\n", + " 'dubiel',\n", + " 'clavichord',\n", + " 'asunder',\n", + " 'sublety',\n", + " 'airfix',\n", + " 'stoltzfus',\n", + " 'ruth',\n", + " 'fluorescent',\n", + " 'improves',\n", + " 'rebenga',\n", + " 'russells',\n", + " 'deliberation',\n", + " 'zsa',\n", + " 'dardino',\n", + " 'macs',\n", + " 'servile',\n", + " 'jlb',\n", + " 'apallonia',\n", + " 'crossbows',\n", + " 'locus',\n", + " 'mislead',\n", + " 'corey',\n", + " 'blundered',\n", + " 'jeopardizes',\n", + " 'disorganized',\n", + " 'discuss',\n", + " 'longish',\n", + " 'tieing',\n", + " 'ledger',\n", + " 
'speechifying',\n", + " 'amitabhz',\n", + " 'bbc',\n", + " 'chimayo',\n", + " 'pranked',\n", + " 'superman',\n", + " 'aggravated',\n", + " 'rifleman',\n", + " 'yvone',\n", + " 'radiant',\n", + " 'galico',\n", + " 'debris',\n", + " 'waking',\n", + " 'btw',\n", + " 'havnt',\n", + " 'francen',\n", + " 'chattered',\n", + " 'scathed',\n", + " 'pic',\n", + " 'ceremonies',\n", + " 'watergate',\n", + " 'betsy',\n", + " 'majorca',\n", + " 'meercat',\n", + " 'noirs',\n", + " 'grunts',\n", + " 'drecky',\n", + " 'tribulations',\n", + " 'avery',\n", + " 'talladega',\n", + " 'eights',\n", + " 'dumbing',\n", + " 'alloimono',\n", + " 'scrutinising',\n", + " 'geta',\n", + " 'beltrami',\n", + " 'pvc',\n", + " 'horse',\n", + " 'tiburon',\n", + " 'huitime',\n", + " 'ripple',\n", + " 'loitering',\n", + " 'forensics',\n", + " 'nearly',\n", + " 'elizabethan',\n", + " 'ellington',\n", + " 'uzi',\n", + " 'sicily',\n", + " 'camion',\n", + " 'motivated',\n", + " 'rung',\n", + " 'gao',\n", + " 'licitates',\n", + " 'protocol',\n", + " 'smirker',\n", + " 'torin',\n", + " 'newlywed',\n", + " 'rich',\n", + " 'dismay',\n", + " 'skyler',\n", + " 'moonwalks',\n", + " 'haranguing',\n", + " 'sunburst',\n", + " 'grifter',\n", + " 'undersold',\n", + " 'chearator',\n", + " 'marino',\n", + " 'scala',\n", + " 'conditioner',\n", + " 'ulysses',\n", + " 'lamarre',\n", + " 'figueroa',\n", + " 'flane',\n", + " 'allllllll',\n", + " 'slide',\n", + " 'lateness',\n", + " 'selbst',\n", + " 'gandhis',\n", + " 'dramatizing',\n", + " 'catchphrase',\n", + " 'doable',\n", + " 'stadiums',\n", + " 'alexanderplatz',\n", + " 'pandemonium',\n", + " 'misrepresents',\n", + " 'earth',\n", + " 'mounties',\n", + " 'seeker',\n", + " 'cheat',\n", + " 'outbreaks',\n", + " 'snowstorm',\n", + " 'baur',\n", + " 'schedules',\n", + " 'bathetic',\n", + " 'incorrect',\n", + " 'johnathon',\n", + " 'rosanne',\n", + " 'mundanely',\n", + " 'cauldrons',\n", + " 'forrest',\n", + " 'poky',\n", + " 'legislation',\n", + " 'womanness',\n", + " 
'spender',\n", + " 'crazy',\n", + " 'rational',\n", + " 'terrell',\n", + " 'zero',\n", + " 'coincides',\n", + " 'thoughout',\n", + " 'mathew',\n", + " 'narnia',\n", + " 'naseeruddin',\n", + " 'bucks',\n", + " 'affronts',\n", + " 'topple',\n", + " 'degree',\n", + " 'preyed',\n", + " 'passionately',\n", + " 'defeats',\n", + " 'torchwood',\n", + " 'sources',\n", + " 'botticelli',\n", + " 'compactor',\n", + " 'kosturica',\n", + " 'waiving',\n", + " 'gunnar',\n", + " 'stiffler',\n", + " 'fwd',\n", + " 'kawajiri',\n", + " 'eleanor',\n", + " 'sistahs',\n", + " 'soulhunter',\n", + " 'belies',\n", + " 'wrathful',\n", + " 'americans',\n", + " 'ferdinandvongalitzien',\n", + " 'kendra',\n", + " 'weirdy',\n", + " 'unforgivably',\n", + " 'chepart',\n", + " 'tatta',\n", + " 'departmentthe',\n", + " 'dig',\n", + " 'blatty',\n", + " 'marionettes',\n", + " 'atop',\n", + " 'chim',\n", + " 'saurian',\n", + " 'woes',\n", + " 'cloudscape',\n", + " 'resignedly',\n", + " 'unrooted',\n", + " 'keuck',\n", + " 'hitlerian',\n", + " 'stylings',\n", + " 'crewed',\n", + " 'bedeviled',\n", + " 'unfurnished',\n", + " 'reedus',\n", + " 'circumstances',\n", + " 'grasped',\n", + " 'smurfettes',\n", + " 'fn',\n", + " 'dishwashers',\n", + " 'roadie',\n", + " 'ruthlessness',\n", + " 'refrains',\n", + " 'lampooning',\n", + " 'semblance',\n", + " 'richart',\n", + " 'legions',\n", + " 'gwenneth',\n", + " 'enmity',\n", + " 'assess',\n", + " 'manufacturer',\n", + " 'bullosa',\n", + " 'outrun',\n", + " 'hogan',\n", + " 'chekov',\n", + " 'blithe',\n", + " 'code',\n", + " 'drillings',\n", + " 'revolvers',\n", + " 'aredavid',\n", + " 'robespierre',\n", + " 'achcha',\n", + " 'boyfriendhe',\n", + " 'wallow',\n", + " 'toga',\n", + " 'graphed',\n", + " 'tonking',\n", + " 'going',\n", + " 'bosnians',\n", + " 'willy',\n", + " 'rohauer',\n", + " 'fim',\n", + " 'forbidding',\n", + " 'yew',\n", + " 'rationalised',\n", + " 'shimomo',\n", + " 'opposition',\n", + " 'landis',\n", + " 'minded',\n", + " 'despicableness',\n", + 
" 'easting',\n", + " 'arghhhhh',\n", + " 'ebb',\n", + " 'trialat',\n", + " 'protected',\n", + " 'negras',\n", + " 'rick',\n", + " 'muti',\n", + " 'tracker',\n", + " 'shawl',\n", + " 'differentiates',\n", + " 'sweetheart',\n", + " 'deepened',\n", + " 'manmohan',\n", + " 'trevethyn',\n", + " 'brain',\n", + " 'incomprehensibly',\n", + " 'piercing',\n", + " 'pasadena',\n", + " 'shtick',\n", + " 'ute',\n", + " 'viggo',\n", + " 'supersedes',\n", + " 'ack',\n", + " 'cites',\n", + " 'taurus',\n", + " 'relevent',\n", + " 'minidress',\n", + " 'philosopher',\n", + " 'bel',\n", + " 'mahattan',\n", + " 'moden',\n", + " 'compiling',\n", + " 'advertising',\n", + " 'rogues',\n", + " 'unimaginative',\n", + " 'subpaar',\n", + " 'ademir',\n", + " 'darkly',\n", + " 'saturate',\n", + " 'fledgling',\n", + " 'breaths',\n", + " 'padre',\n", + " 'aszombi',\n", + " 'pachabel',\n", + " 'incalculable',\n", + " 'ozone',\n", + " 'sped',\n", + " 'mpho',\n", + " 'rawail',\n", + " 'forbid',\n", + " 'synth',\n", + " 'guttersnipe',\n", + " 'reputedly',\n", + " 'holiness',\n", + " 'unessential',\n", + " 'hampden',\n", + " 'asylum',\n", + " 'bolye',\n", + " 'strangers',\n", + " 'rantzen',\n", + " 'farrellys',\n", + " 'vigourous',\n", + " 'cantinflas',\n", + " 'enshrined',\n", + " 'boris',\n", + " 'expetations',\n", + " 'replaying',\n", + " 'prestige',\n", + " 'bukater',\n", + " 'overpaid',\n", + " 'exhude',\n", + " 'backsides',\n", + " 'topless',\n", + " 'sufferings',\n", + " 'nitwits',\n", + " 'cordova',\n", + " 'incensed',\n", + " 'danira',\n", + " 'unrelenting',\n", + " 'disabling',\n", + " 'ferdy',\n", + " 'gerard',\n", + " 'drewitt',\n", + " 'mero',\n", + " 'monsters',\n", + " 'precautions',\n", + " 'lamping',\n", + " 'relinquish',\n", + " 'demy',\n", + " 'drink',\n", + " 'chamberlin',\n", + " 'unjustifiably',\n", + " 'cove',\n", + " 'floodwaters',\n", + " 'searing',\n", + " 'isral',\n", + " 'ling',\n", + " 'grossness',\n", + " 'pickier',\n", + " 'pax',\n", + " 'wierd',\n", + " 'tereasa',\n", + " 
'smog',\n", + " 'girotti',\n", + " 'spat',\n", + " 'sera',\n", + " 'noxious',\n", + " 'misbehaving',\n", + " 'scouts',\n", + " 'refreshments',\n", + " 'autobiographic',\n", + " 'shi',\n", + " 'toyomichi',\n", + " 'bits',\n", + " 'psychotics',\n", + " 'barzell',\n", + " 'colt',\n", + " 'shivering',\n", + " 'pugilist',\n", + " 'gladiator',\n", + " 'dryer',\n", + " 'reissues',\n", + " 'scrivener',\n", + " 'predicable',\n", + " 'objection',\n", + " 'marmalade',\n", + " 'seems',\n", + " 'spellbind',\n", + " 'trifecta',\n", + " 'innovator',\n", + " 'shriekfest',\n", + " 'inthused',\n", + " 'contestants',\n", + " 'goody',\n", + " 'samotri',\n", + " 'serviced',\n", + " 'nozires',\n", + " 'ins',\n", + " 'mutilating',\n", + " 'dupes',\n", + " 'launius',\n", + " 'widescreen',\n", + " 'joo',\n", + " 'discretionary',\n", + " 'enlivens',\n", + " 'bushes',\n", + " 'chills',\n", + " 'header',\n", + " 'activist',\n", + " 'gethsemane',\n", + " 'phoenixs',\n", + " 'wreathed',\n", + " 'sacrine',\n", + " 'electrifyingly',\n", + " 'basely',\n", + " 'ghidora',\n", + " 'binder',\n", + " 'dogfights',\n", + " 'sugar',\n", + " 'doddsville',\n", + " 'porkys',\n", + " 'scattershot',\n", + " 'refunded',\n", + " 'rudely',\n", + " 'insteadit',\n", + " 'zatichi',\n", + " 'eurotrash',\n", + " 'radioraptus',\n", + " 'hurls',\n", + " 'boogeman',\n", + " 'weighs',\n", + " 'danniele',\n", + " 'converging',\n", + " 'hypothermia',\n", + " 'glorfindel',\n", + " 'birthdays',\n", + " 'attentive',\n", + " 'mallepa',\n", + " 'spacewalk',\n", + " 'manoy',\n", + " 'bombshells',\n", + " 'farts',\n", + " 'lyoko',\n", + " 'southron',\n", + " 'destruction',\n", + " 'flemming',\n", + " 'manhole',\n", + " 'elainor',\n", + " 'bowersock',\n", + " 'lowly',\n", + " 'wfst',\n", + " 'limousines',\n", + " 'skolimowski',\n", + " 'saban',\n", + " 'koen',\n", + " 'malaysia',\n", + " 'uwi',\n", + " 'cyd',\n", + " 'apeing',\n", + " 'bonecrushing',\n", + " 'dini',\n", + " 'merest',\n", + " 'janina',\n", + " 'chemotrodes',\n", + " 
'trials',\n", + " 'authorize',\n", + " 'whilhelm',\n", + " 'asthmatic',\n", + " 'broads',\n", + " 'missteps',\n", + " 'embittered',\n", + " 'chandeliers',\n", + " 'seeming',\n", + " 'miscalculate',\n", + " 'recommeded',\n", + " 'schoolwork',\n", + " 'coy',\n", + " 'mcconaughey',\n", + " 'philosophically',\n", + " 'waver',\n", + " 'fanny',\n", + " 'mestressat',\n", + " 'unwatchably',\n", + " 'saggy',\n", + " 'topness',\n", + " 'dwellings',\n", + " 'breakup',\n", + " 'hasselhoff',\n", + " 'superstars',\n", + " 'replay',\n", + " 'aggravates',\n", + " 'balances',\n", + " 'urging',\n", + " 'snidely',\n", + " 'aleksandar',\n", + " 'hildy',\n", + " 'kazuhiro',\n", + " 'slayer',\n", + " 'tangy',\n", + " 'brussels',\n", + " 'horne',\n", + " 'masayuki',\n", + " 'molden',\n", + " 'unravel',\n", + " 'goodtime',\n", + " 'interrogates',\n", + " 'bismillahhirrahmannirrahim',\n", + " 'rowboat',\n", + " 'dumann',\n", + " 'datedness',\n", + " 'astrotheology',\n", + " 'dekhiye',\n", + " 'valga',\n", + " 'kata',\n", + " 'wipes',\n", + " 'hostilities',\n", + " 'sentimentalising',\n", + " 'documentary',\n", + " 'salesman',\n", + " 'virtue',\n", + " 'unreasonably',\n", + " 'haver',\n", + " 'cei',\n", + " 'unglamorised',\n", + " 'balky',\n", + " 'complementary',\n", + " 'paychecks',\n", + " 'mnica',\n", + " 'wada',\n", + " 'ily',\n", + " 'prc',\n", + " 'ennobling',\n", + " 'functionality',\n", + " 'dissociated',\n", + " 'elk',\n", + " 'throbbing',\n", + " 'tempe',\n", + " 'linoleum',\n", + " 'photogrsphed',\n", + " 'bottacin',\n", + " 'hipper',\n", + " 'titillating',\n", + " 'barging',\n", + " 'untie',\n", + " 'sacchetti',\n", + " 'gnat',\n", + " 'roedel',\n", + " 'cohabitation',\n", + " 'performs',\n", + " 'sales',\n", + " 'migrs',\n", + " 'teachs',\n", + " 'nanavati',\n", + " 'fresco',\n", + " 'davison',\n", + " 'obstinate',\n", + " 'burglar',\n", + " 'masue',\n", + " 'dickory',\n", + " 'grills',\n", + " 'appelagate',\n", + " 'linkage',\n", + " 'enables',\n", + " 'loesser',\n", + " 
'patties',\n", + " 'prudent',\n", + " 'mallorquins',\n", + " 'nativetex',\n", + " 'suprise',\n", + " 'drippy',\n", + " 'quill',\n", + " 'speeded',\n", + " 'farscape',\n", + " 'saddening',\n", + " 'centuries',\n", + " 'mos',\n", + " 'improvisationally',\n", + " 'neccessarily',\n", + " 'transmitter',\n", + " 'tankers',\n", + " 'latte',\n", + " 'mechanisation',\n", + " 'faracy',\n", + " 'synthetically',\n", + " 'thoughtless',\n", + " 'rake',\n", + " 'ropes',\n", + " 'desirable',\n", + " 'whitewashed',\n", + " 'donal',\n", + " 'crabby',\n", + " 'lifeless',\n", + " 'perfidy',\n", + " 'teresa',\n", + " 'bulldog',\n", + " 'cockamamie',\n", + " 'rasberries',\n", + " 'notethe',\n", + " 'captivity',\n", + " 'chiseling',\n", + " 'smaller',\n", + " 'clampets',\n", + " 'alerts',\n", + " 'tough',\n", + " 'wellingtonian',\n", + " 'aaaahhhhhhh',\n", + " 'dither',\n", + " 'incertitude',\n", + " 'florentine',\n", + " 'imperioli',\n", + " 'licking',\n", + " 'disparagement',\n", + " 'artfully',\n", + " 'feds',\n", + " 'fumiya',\n", + " 'tearfully',\n", + " 'lanchester',\n", + " 'undertaken',\n", + " 'longlost',\n", + " 'netted',\n", + " 'carrell',\n", + " 'uncompelling',\n", + " 'reliefs',\n", + " 'leona',\n", + " 'autorenfilm',\n", + " 'unfriendly',\n", + " 'typewriter',\n", + " 'shifted',\n", + " 'bertrand',\n", + " 'blesses',\n", + " 'tricking',\n", + " 'fireflies',\n", + " 'zanes',\n", + " 'unknowingly',\n", + " 'unnerve',\n", + " 'caning',\n", + " 'flat',\n", + " 'recluse',\n", + " 'dcreasy',\n", + " 'chipmunk',\n", + " 'dipper',\n", + " 'musee',\n", + " 'cousin',\n", + " 'shys',\n", + " 'berserkers',\n", + " 'eve',\n", + " 'conflagration',\n", + " 'irks',\n", + " 'restricts',\n", + " 'parsing',\n", + " 'positronic',\n", + " 'copout',\n", + " 'khala',\n", + " 'swiftness',\n", + " 'higginson',\n", + " 'imprint',\n", + " 'walter',\n", + " 'sundance',\n", + " 'whispering',\n", + " 'thematically',\n", + " 'underimpressed',\n", + " 'uno',\n", + " 'expressly',\n", + " 'russkies',\n", + 
" 'discos',\n", + " 'shaping',\n", + " 'verson',\n", + " 'prototype',\n", + " 'chapman',\n", + " 'trafficker',\n", + " 'semetary',\n", + " 'unrealistically',\n", + " 'lifewell',\n", + " 'rivas',\n", + " 'consequent',\n", + " 'katsu',\n", + " 'titantic',\n", + " 'jalees',\n", + " 'ranee',\n", + " 'shipbuilding',\n", + " 'gambles',\n", + " 'dispenses',\n", + " 'disfigurement',\n", + " 'bright',\n", + " 'cristian',\n", + " 'puertorricans',\n", + " 'constituent',\n", + " 'capta',\n", + " 'jewel',\n", + " 'erect',\n", + " 'farah',\n", + " 'despondently',\n", + " 'avoide',\n", + " 'inconnu',\n", + " 'headquarters',\n", + " 'sanguisga',\n", + " ...]" + ] + }, + "execution_count": 75, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "list(vocab)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 0., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 46, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "import numpy as np\n", + "\n", + "layer_0 = np.zeros((1,vocab_size))\n", + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
ITBiD8G9077fkJiJ5O56gR2HhXNgC4ltRSO/m48J5w7/z6hcuL+53Ezo7TveizvMGkFXy5M1S\nJ29wWB6cv0jZUzJxOjq/keXLl1v/kbRvlnHl6vzgIJDnfydP3jwI77XXXp3s3aYCOokSHhxxxBGd\nlElfaCEjzn+PzFE+ZlHnSEvAQCcXXnjhMP8LXrIdSXFpivomAJoTLCG+fv60DGkOOuggl9RgPUGv\nrJJmmixNHbmJSJaOdwoSzc2Zl7gRcKRBYJXOzOTSQkT8myXKx4JpDRcxDmchJI9+ru4k30V2dpL6\n8qZhXjYc8S9vmf4/Zdqy8uRNW5dLj+UBX4x+Tsm4ut238wc577zzrC6OGLnr+hYCvRDI87+TJ28v\nvbpdf9/73te5/OMf/7hznPfAd+TEZ8ONA5TLy60fNZWoq05cBFF+48fn5+PY+fa59FHfLKZwgzzT\nzZ/61KeikkWe86f2IxOETrrpGXfa6Rc1LYP/iHshZ2xFL//Zzzjsj53hKKtEq3biynG/C/sOzDK5\nhKU+gTKdj7+cNWj0sNgSQSM6dYWXDJEvvO6a374EJshOPdTpL/UML99lyRKSVT90de3y20SZcdf8\n8/7yXadHcJN0ykQvJ36+cJtJQ/1OFzDwxZ3nO6ynny587JZlhs+n/e0v7UIH+oG+TSqk9dtHGZSZ\nVVgemWZZcvDgGPrGN76Rtbpc+aKW87Jcl/P9FnAAO+4Lt0QXHMMfAqGRJvBRqETPfuNSdH0O37zl\n1u3/jvs2sOYlbpb/P8+zMiz+czvueeA/+xhrnPjPUz9N+Nh/BpM/fL3bb1dfeNzplsevD12j0ro0\nfvtJFyU+hq4sfzmvnydcnksf/ga7sPjjlj/mhtPl+R3dwpQlZul4n1TQUDd4+f9gYVCS3izhzsii\nn58nPMDHXcva2X55eYiIu6nczdytG4t6IEb9M6AHDxf6kutRH66Rxunsf4fx7taO8DUCbKXZYI3B\nl4G/34N/N8LRL32oB7yIawH+fDuiAS5RH9Jz70BQyJMnIFq47wbhN5imIcpxmNTt/y5tu8LPcvf8\nd+31n6VpiQhlxz1b3HMm6hnj1+nSue8w4XBEhG9/oHbp+WaMYyxy5yjDlygd3bM7rIufzx2DmSvb\nfXcjCnH3jMvLOOTaFVdHuJ9curzfhRCRtB2PtcI1nm//puh2jVopbAYAACaLSURBVMaGO8gvh2M6\nNwxWWv2oxycHvn69rmXpbL+utESk282MrnGS9sERVw5YR+kQ7pekv6P6L67uqPNpIjz6Az549Euo\nCwtEN0E3yEoZgjXDEQmIB7976ROnB21xAdEgJRCVrGXF1dGm8/QpOOWVuv3fpX0BoP3+cyM8gPrP\n+bRExGHLs9ivg2cQ5CDqGevycI363POKZzO6cN6d49sfYxizfMLh8lBmHIHhWjgf5VIX4ref83Hi\nt89/oY9LT51hndA3PMa5/PSLa3dcP7i0eb7jW5ih1KQd74MHCGEJW0vCLA0w/TQA1Q1MV35S/UhP\nea4Dwp3U7Rp503a2X17UPwn1O11oty/dbmY/XfiYgS6NKTWc3/+NDn6fOl3TflMGZeURBlgG1iQS\nJh/h30nKSJPGTX8kzZM2fa9yaR+DYFmEwREc7iusJpJoBPi/KIKs1en/DkILGUkj/mAbfq6lKacf\naf1nMAP+oIhPsBxJKqPthRKRMhRUmeUhwIBU5Fs3/6w+qUpKRMgTJntZW530IR9FOnwLSdb64/Ll\nsXCga56Bi7oJvw1BSDtYxLWn23n0hRByf0Xh3C3vIFxLQ5aT4FGH/7ssfY1VwU1rJHmbT4JF1jT+\nixTPI//ll5dDpyfPF9IOgtA/7hkOJmXKKAoPKpMMIAJXXHGFjXzKio0iBe/05557zgZ5Y437L37x\ni2HF/8Vf/IUhWixLnj/ykY8MW/I2LGHKHxs2bLDr93utBHDxQ4KBe
UQNxPLgfJEhy6MCl42ouMeJ\nrGWAybnnnmumT59u5s+fX2i7eqhsWJIcvOkaljWyZYPk9wjcfPPNNvzAjTfeWCgkVf3f8f9EAL6A\n8KZuDytSXETSgFCVGnujm3K+Ht3ScS2wDGSKl9Sr3Lpd9zEpvc1lshyVXW8E2KiqqA24qm4p0wKz\nZ8+2/gq9dOn1lt7req/y/etYnPJYM/yy0lpV8N3ACpJ0qsqvq6hjdOYe41MUDkXpVkU5YMAqrdGj\nR7cGjyz+IT72zhpR9lu3X2fUse8bEpCCjjXAP677FFJUu7Kc861V/bAAaWomSy+1JA8PRQYqBoum\nC235q7/6K/uQh0jEkYm48377KauIKSvqoqwihfKStIE5ewb/ItpRhP7oU/RUYBF69aMM+oA+4+P6\nAyyqJIhFtjtvW/B1cYN9UVO0WduHH4TvF+H0wsEzyn8vaz11z0c/0HampPxpqrL01tRMgPYgC9Mz\nSNFm4n5j+vDDD5tbbrllWKRDoqU6IQCQm26JmpJx6dx3t+kblybu2wUpK3O/mG6b5tGnRHRcu3Zt\np81xuvbzPHqxWRtTZ20OY8+9409TRG1uyLTVI488MmxH8X72RVF1cR9OnTp1WHuLKlvlDA4CIiKD\n09eRLXXzu8GbWq0GrUhlu5w89NBDrQ/EaaedFpkKchBMRdldKknAXjO9CEmWsO/gSV39GGij/Ebq\nSkJcpzgy0vT7zbXHffukN8m9xT0CQSFfr/vQ1VHH79NPP90cffTR5tJLL62jetKpIQiIiDSko8pU\nk8GBEL9NfZhgDbnmmmvM5s2bY2EKk4okb60UFs4XW0FwIYoYdEtfxDWf+OAEedddd5mNGzfWmlTW\nnSwl6Rf6Opgm6yTNYv2iv9gjhNDgTRT+N7CGtI1UNrEvmq6ziEjTe7AA/RnMeIvDnNy0tzPeLI86\n6iizcOFCc/zxx0eiQfuQbm2LG1gon/y9LBw8lKNM8JEKFXwSHbH2/PM//7N5+umne+pacPWZimP3\n0MCHpTGracCYAddJEX1NmZSzZs0au+rEld2Ub1lDmtJT9ddTRKT+fdQXDdnxeMuWLY17O0uidxqr\nhgObPE7+93//13z0ox+NtDK4ASrLG7ErP++3G9BuvfVWEzc1lbeOovND7sDs3nvvjSWQRdeZtjz/\nHsDHqBcZTVs+6fEVWbRoUe2tWOG2Ob27WSHDefRbCMQhICISh8yAnWcww7IwZ84cU3RckbKgZKDA\nNMx3nLWDa3lJAthE+ZcwmHKtjAEqDWZMdbCV+tKlS9Nkqzytm1Kry1QS/ek7mea9b5ICjGUhWHnS\nGOsQ1kMsWk215CTtF6XrHwIiIv3DuvY18YDBVIwJuurBtRdYSawADCxIHEnpVYd/nfooD1z4JmDb\nu9/9bhPEg6hsSgb93KDQjYz57ajbcdXmfXBzksTJ1KUt8pv7CdLTBIsW/wdTpkyxLwBN9Skrsu9U\nVjEIiIgUg2NrSuEt9ZJLLqn1Ekv3MAwCIHVddlyENcTvWEdseGv2fQTi/Ev8vGUdVz2Q520XfdRP\nh8cq+6obVi4CLkt6ua/rKjNnzrTWt6Y62NYV10HXS0Rk0O+AiPbXeVWDIyF77rlnV3+WokkIMFE3\nUzS9pq78t+yyfAvQx1lDmr5qoUwyRZ8V7WQK9mXIv//7v9upUVbS1NEiWefnQhn9oTL7h4CISP+w\nblRN7qGzePHi2jwUHQkByG7BupzloogpGddplEn9DBBpSE54ICzS/E8fsV9P0/dxcVYR3z/D4Z7l\nu19EMItucXnc/bV169ZaWiTd86Db/11c23ReCPRCQESkF0IDfJ2HT10iYfKg/vSnP232228/u8rA\nRUmN6p40RCEqf/gclgfqc8QGcoE+Wd5ayecPuP4UT7jebr/Rgby01enVLX3drxGQrtsS7G76hzHt\nl5NpN53SXEN/xPWjmx6tg88I9
xk+IYhIiIVBf0pA4E//XyAllKsiW4BAsNmRHfhZTbPvvvsaBosq\n5MEHH7R+BGeccYYdrN75znfGqlEGCWGA2GuvvTp1Uj9Levnupksng3cAocEq4j7btm0zP/7xjy2x\n+fWvfz2sHi/biMNvf/vbxr09j7jYwBPvete7zLPPPmu453oJg+Mrr7xiMWMQZ/oLUuYw7ZW/TtfD\nJATdDjzwQPu/NmvWLPPb3/7WfPzjH69EZf6XWA7O0vXbb789cvl6JYqp0tYhIItI67q0+AZhETjz\nzDPN/vvvb4gG6d7ciq9peIkMOCwnfuyxx6wVhCWO3awQUQ/14SWm+8WDuJvFomjSQ3t9f4Zu0zhY\nq5ocDTfcE/QdlgzfWuSn8Z1My/S78ess+7jX/cp1rIDIggULci9DT9oe7sOvfvWrlnxcd911PX2i\nkpardEIgFoGydtNTue1CgF1fb7rpJrsjIzupBgNGaQ10dQWEZ2jGjBmdHWw5HwzUsfUm2ZU2NrN3\ngXqSlpU0nVd84kMwpnz3QS8n7HjaDQuXrknffptcH0S1vUltitOVvk36P8T/Hf8LZf/foWvgjG3r\nmjZtWmL94tqo80IgKQImaUKlEwIgwMOTB2LAbO13kYMhZbuHLg/CqEHeDVDh3ohKG06T5Dc6pGkT\n6fn0Q9CLdr700ksW/37U2c86zj333KGrrrrKtjFNH/RTxyLqynLPcN/7/3dF3e+0h7L5v4MI8lm/\nfn0RzVQZQiAxAn8SayrRBSEQgQDTMjfeeGPHhE6ERT5M2TBVkVYwuRMumjKYiti+fbuN2Eicgiin\nQ3wsOE9dmJARTNjkzSvognSb/gnXAR7o4XQJXy/yN3rR9t/97ncmIGpFFl2Lsg4//HDzZ3/2Z7aN\nafqgFsonVKLXdExcMdz37v+OKTn8R/DZYouDLP936IFTLHFBmOrC52bJkiV248i4PZvidNN5IZAX\nAfmI5EVQ+Q3BmPDjCN6k7H41DJJjx461PgzAwxJTHnY7duywaO3atcume/755+3vyZMnm1NOOcVM\nmjQplUMcxIEHdPCGGUlabOEJ//Aw7+YP0qsY8kcRp175slyHuL366qtdg7llKbfqPGCIL0Rbg2Vl\nJSFx/cL/HRF+2eiQD5sIsqpswoQJnSzjx483r7/+uv0d9393xBFH9M3vq6OYDoSAh4CIiAeGDvMj\ngGUgMKvbFR08+BCsHOyF4j8gJ06caK0YWBTyyDe/+U3zvve9L5UVw6/P6ZuXRFAOA00/3uSxPiFt\nC7HdZiJSNAnx72GO3X3MSqpu/3cQk7333rsv92lYR/0WAnEIiIjEIaPztUfAPdyxikB+0pIJ8vMA\nL4o8YKGBWKFPmdJWIkJfYDkLJpbLhK/vZbv7NC/p7rviqlAI9AkB+Yj0CWhVUzwCTMm4gR8Swhs1\ng1kSyeIP0qtcCA2ESJINgbIJXDat8uVy95lISD4clbvdCIiItLt/W9u6KJ8MyAhvn+4NNK7x5GVg\nKGNwcIQorm6dHxwEnIWsjPtscFBUSwcBARGRQejllrURohG3SsZNs7g3Ub/pWEscgSnz7RvdepEh\nXy8d/x4BMGvLoO1ISJn3me4bIdAWBERE2tKTA9QONyUT12Rn7YB0OGGQ45PWj8TlT/NN/ZCepNNE\nacpuc1r6FSfmpotISNN7UPr3G4F39LtC1ddeBBh48ZFgWa5bKhjVWre0Fw9+9tVI8xbsLBpR5frn\neBN10yR/+qd/av76r/+6MKdUv564YywzSXWNKyPuPMuhN27cGHe5seeDwFqN1d0pLhLikNC3EEiO\ngIhIcqyUMgIBrAxPPvmkDUrmYhkcdthhNobI3LlzI3IYu7SXJb2LFy82q1atMkE0Rxugi03t3NRK\nVEbqipuSiUrPOVZh/OY3v4m7XOp54pIwMHVrUxYFxo0bZ/HOkrfOeYh3wb3QVBEJaWrPSe+qEdD
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
uAvnnHFNwOHMTNt4N/jZBhX+cHnJMrbATVLUQSIlAIywivKHx0GOKglgi\nZ5xxho1t4Jw4Gdj9wd3HgAdm1dJNPxe3oZuOZbSXYFXEicgrrFBieqxICTvzpikbcsVbe90Fnw8G\nTawQzockqc6kdzsJJ83jpyMv8XUWLlzYCUgWhHzvGQsEAuULRCZYWmwee+wxax1Br9mzZ9vYJH66\nfh2LhPQLadUjBApGoMi1wEWWxV4sQVOHfQIi0qkiICcjQrSH0xOvwxc/fodflp+GY78cYntECfld\nunA97jzf4RDx/rVwvrg4ItSfpb1RertzxFQI3hrdz9Z8VxniPQuIgb9Q5vDqafdKcTFW6HdiyBQZ\nV4N2VLnXjOKEZLn7lEcI1AOB2k7N4FcRDNSxsS7wq1i3bp1NE+wZE4zvfxQiofKWniUK6h9LKeaI\nMOa8fWLNcYK+AdFKpV/R7XWm6zwrOVx76vS9atUqc+CBB9ZJpa664DdCX6R1YiUfH2cF6FYJlgsc\ngKdOnWqj6bL65tJLL+1pAelWZvgauhCldfPmzTZ0/DXXXGOnbfpxf7k63D0d1k2/hYAQqDcCo+BD\n9VZR2pWFAL4BbNde123a07Z7w4YNJtiAzQ6GafPWIT1kJK0Ta68pGq5DyPfff3/ry9HPwRrfkc9/\n/vMmsL5Y4lMGxpAQ2gQRkggBIdBMBGprEWkmnM3SGufD5cuXN0vpLto+99xz1uehS5JaX8rixNpt\nSS99ixVkzpw5dk+YfpIQgMbqgvVlzZo11iG2CAdbvwNFQnw0dCwEmouALCLN7bvcmjMw4BgazK8X\naqbPrVjGAljaysZtaZ0/M1ZXWjamW+ibpO1w0zM+0bjiiivMypUra4EHbYEMvfnmm2bt2rWFWC9E\nQkq7/VSwEOg7ArKI9B3y+lSIOZupjGXLltVHqYyasAx19OjRiQfvjNX0JRuEgk9SvxHSMtjzQSAh\nrCjDGpGUzJTZMO4zdvjFT2rKlCkdPbPWKRKSFTnlEwL1REAWkXr2S9+0YrDDfM+g1eR59g984APm\nkksuMTNmzOgbdv2oiP5J6jdCWhyjISFFWR6KbqMjSVn1EwkpukdUnhCoHgFZRKrvg0o1wMdg4sSJ\n5u67765UjzyVYw3B55qYJgzG/idPuXXIm8ZvhGBuTMd8/etfry2pvPHGG81+++1nzj///NTwioSk\nhkwZhEAjEJBFpBHdVK6SDNxYRfj2/QzKrbW40g899FC7ZJTlo2GhTb7gE1OH6QpfpyTHvfxGGKSx\nnNRlOqZbm5hCYorm2GOPNfPmzeuWtHNNJKQDhQ6EQOsQEBFpXZdmaxAm87ffftvO5WcroZpcLBFl\nVQZOqkmEQZDB2pekUx9+niqOne5YSXzhPMuwcQhtylJsiAXh4em7cHv8tnEsEhJGRL+FQLsQEBFp\nV39mbg2DGQPyrbfeWlmI7rTKF2UFoJzwjsi9Bse0uhaZHiuPT54IVrZlyxa7RLfIesoui+XFixYt\n6hr3JdzWsnVS+UJACPQfARGR/mNe2xp56DNF04QlsGVbAcJTOiwNrtO0FeTJORejW1OXYGMV4Z4j\n5khY6IM6E8KwvvotBIRANgRERLLh1tpcTHXcddddZuPGjZ2Bro6NZQBjOSjOj/0QfDQY7H3xrRL+\n+X4do9MXv/hFs++++yb2teiXbknrceQ3vGpLJCQpgkonBJqPgIhI8/uw8BbkXWJZuEKhAuuiX3hK\np9+OsBARrCHoccABB4RQas7P008/3e6B46wiIiHN6TtpKgSKQEBEpAgUW1hGXQZ7H1qmY9hMra5x\nMtAv7Ahb5pQOfYT0yyrk90WRxxAP9sNhwzyRkCKRVVlCoBkIiIg0o58q0ZKBbv369TZIVtVLXhnk\nWfKJZA2GVQWIUVM6Rfk9QHKa4M+TBHeWYF9wwQXm4osvTpJcaYSAEGgRAu9oUVvUlIIR4
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 47, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "{'': 0,\n", + " 'inhabitants': 1,\n", + " 'goku': 2,\n", + " 'stunts': 3,\n", + " 'catepillar': 4,\n", + " 'kristensen': 5,\n", + " 'goddess': 7,\n", + " 'offing': 49797,\n", + " 'distroy': 8,\n", + " 'unexplainably': 9,\n", + " 'concoctions': 10,\n", + " 'petite': 11,\n", + " 'paramilitary': 24759,\n", + " 'scribe': 12,\n", + " 'stevson': 13,\n", + " 'senegal': 6,\n", + " 'sctv': 14,\n", + " 'soundscape': 15,\n", + " 'rana': 16,\n", + " 'immortalizer': 18,\n", + " 'rene': 67354,\n", + " 'eko': 23,\n", + " 'planning': 20,\n", + " 'akiva': 21,\n", + " 'plod': 22,\n", + " 'orderly': 24,\n", + " 'zeleznice': 25,\n", + " 'critize': 29,\n", + " 'baguettes': 25649,\n", + " 'jefferies': 30,\n", + " 'uncertainties': 61695,\n", + " 'mountainbillies': 31,\n", + " 'steinbichler': 32,\n", + " 'vowel': 33,\n", + " 'rafe': 34,\n", + " 'donig': 68719,\n", + " 
'tulipe': 36,\n", + " 'clot': 37,\n", + " 'hack': 12526,\n", + " 'distended': 38,\n", + " 'cornered': 37116,\n", + " 'impatiently': 40,\n", + " 'batrice': 12525,\n", + " 'unfortuntly': 41,\n", + " 'lung': 42,\n", + " 'scapegoats': 43,\n", + " 'pscychosexual': 45,\n", + " 'outbid': 46,\n", + " 'obit': 47,\n", + " 'sideshows': 48,\n", + " 'jugde': 49,\n", + " 'kevloun': 51,\n", + " 'quartier': 53,\n", + " 'harp': 61948,\n", + " 'unravelling': 54,\n", + " 'antiques': 56,\n", + " 'strutts': 57,\n", + " 'tilts': 58,\n", + " 'disconcert': 59,\n", + " 'dossiers': 60,\n", + " 'sorriest': 61,\n", + " 'craftsman': 49412,\n", + " 'blart': 62,\n", + " 'dependence': 37120,\n", + " 'sated': 61698,\n", + " 'iberia': 63,\n", + " 'sagan': 72,\n", + " 'frmann': 65,\n", + " 'daniell': 66,\n", + " 'rays': 67,\n", + " 'pried': 68,\n", + " 'khoobsurat': 69,\n", + " 'leavitt': 70,\n", + " 'caiano': 71,\n", + " 'attractiveness': 73,\n", + " 'kitaparaporn': 74,\n", + " 'hamilton': 75,\n", + " 'massages': 76,\n", + " 'horgan': 78,\n", + " 'chemist': 79,\n", + " 'audrey': 80,\n", + " 'yeow': 55655,\n", + " 'jana': 81,\n", + " 'dutch': 82,\n", + " 'pinchot': 24773,\n", + " 'override': 83,\n", + " 'dwervick': 63223,\n", + " 'spasms': 84,\n", + " 'resumed': 85,\n", + " 'tamale': 66259,\n", + " 'calibanian': 49636,\n", + " 'stinson': 86,\n", + " 'widows': 87,\n", + " 'stonewall': 88,\n", + " 'palatial': 89,\n", + " 'neuman': 90,\n", + " 'abandon': 91,\n", + " 'lemmings': 65314,\n", + " 'anglophile': 92,\n", + " 'ertha': 61706,\n", + " 'chevette': 94,\n", + " 'unscary': 95,\n", + " 'spoilerific': 97,\n", + " 'neworleans': 67639,\n", + " 'metamorphose': 17,\n", + " 'brigand': 99,\n", + " 'cheating': 41603,\n", + " 'clued': 101,\n", + " 'dermatonecrotic': 102,\n", + " 'grady': 103,\n", + " 'mulligan': 104,\n", + " 'ol': 105,\n", + " 'incubation': 107,\n", + " 'plaintiffs': 110,\n", + " 'snden': 109,\n", + " 'fk': 111,\n", + " 'deply': 112,\n", + " 'franchot': 113,\n", + " 'henstridge': 19,\n", + " 
'cyhper': 114,\n", + " 'verbose': 26,\n", + " 'mazovia': 116,\n", + " 'elizabeth': 117,\n", + " 'palestine': 118,\n", + " 'robby': 119,\n", + " 'wongo': 120,\n", + " 'moshing': 121,\n", + " 'mstified': 12543,\n", + " 'eeeee': 122,\n", + " 'doltish': 123,\n", + " 'bree': 124,\n", + " 'postponed': 125,\n", + " 'debacles': 127,\n", + " 'amplify': 27,\n", + " 'kamm': 128,\n", + " 'phantom': 18893,\n", + " 'boylen': 136,\n", + " 'rolando': 131,\n", + " 'premises': 133,\n", + " 'bruck': 134,\n", + " 'loosely': 135,\n", + " 'wodehousian': 139,\n", + " 'onishi': 70389,\n", + " 'encapsuling': 140,\n", + " 'partly': 141,\n", + " 'stadling': 144,\n", + " 'calms': 143,\n", + " 'darkie': 148,\n", + " 'wheeling': 147,\n", + " 'ursla': 15875,\n", + " 'subsidized': 49420,\n", + " 'mckellar': 149,\n", + " 'ooookkkk': 151,\n", + " 'milky': 152,\n", + " 'unfolded': 153,\n", + " 'degrades': 154,\n", + " 'authenticating': 155,\n", + " 'writeup': 12548,\n", + " 'rotheroe': 156,\n", + " 'beart': 157,\n", + " 'intoxicants': 160,\n", + " 'grispin': 159,\n", + " 'cannes': 61718,\n", + " 'antithetical': 70398,\n", + " 'nnette': 161,\n", + " 'tsukamoto': 163,\n", + " 'antwones': 44205,\n", + " 'stows': 164,\n", + " 'suddenness': 165,\n", + " 'vol': 61720,\n", + " 'waqt': 166,\n", + " 'camazotz': 168,\n", + " 'paps': 55042,\n", + " 'shakher': 170,\n", + " 'terminate': 63868,\n", + " 'kotex': 56419,\n", + " 'delinquency': 171,\n", + " 'bromwell': 25214,\n", + " 'insecticide': 173,\n", + " 'charlton': 174,\n", + " 'nakada': 177,\n", + " 'titted': 24791,\n", + " 'urbane': 178,\n", + " 'depicted': 54491,\n", + " 'sadomasochistic': 179,\n", + " 'hyping': 181,\n", + " 'yr': 182,\n", + " 'hebert': 183,\n", + " 'waxwork': 12990,\n", + " 'deathrow': 185,\n", + " 'nourishes': 24792,\n", + " 'unmediated': 187,\n", + " 'tamper': 37143,\n", + " 'soad': 190,\n", + " 'alphabet': 189,\n", + " 'donen': 191,\n", + " 'lord': 192,\n", + " 'recess': 193,\n", + " 'watchably': 61023,\n", + " 'handsome': 194,\n", + " 
'vignettes': 196,\n", + " 'pairings': 198,\n", + " 'uselful': 199,\n", + " 'sanders': 200,\n", + " 'outbursts': 72891,\n", + " 'nots': 201,\n", + " 'hatsumomo': 202,\n", + " 'actioned': 18292,\n", + " 'krimi': 24797,\n", + " 'appleby': 203,\n", + " 'tampax': 204,\n", + " 'sprinkling': 205,\n", + " 'defacing': 206,\n", + " 'lofty': 207,\n", + " 'verger': 213,\n", + " 'tablespoons': 211,\n", + " 'bernhard': 212,\n", + " 'goosebump': 64565,\n", + " 'acumen': 214,\n", + " 'percentages': 215,\n", + " 'wendingo': 216,\n", + " 'resonating': 217,\n", + " 'vntoarea': 218,\n", + " 'redundancies': 219,\n", + " 'strictly': 57081,\n", + " 'pitied': 221,\n", + " 'belying': 222,\n", + " 'michelangelo': 53153,\n", + " 'gleefulness': 223,\n", + " 'environmentalist': 24803,\n", + " 'gitane': 226,\n", + " 'corrected': 66547,\n", + " 'journalist': 227,\n", + " 'focusing': 228,\n", + " 'plethora': 229,\n", + " 'his': 39,\n", + " 'citizen': 230,\n", + " 'south': 55579,\n", + " 'clunkers': 232,\n", + " 'pendulous': 55991,\n", + " 'mounds': 24805,\n", + " 'deplorable': 233,\n", + " 'forgive': 234,\n", + " 'proplems': 235,\n", + " 'bankers': 237,\n", + " 'aqua': 238,\n", + " 'donated': 239,\n", + " 'disbelieving': 240,\n", + " 'acomplication': 241,\n", + " 'contrasted': 243,\n", + " 'muzzle': 44,\n", + " 'amphibians': 72141,\n", + " 'springs': 246,\n", + " 'reformatted': 49443,\n", + " 'toolbox': 247,\n", + " 'contacting': 248,\n", + " 'washrooms': 250,\n", + " 'raving': 251,\n", + " 'dynamism': 252,\n", + " 'mae': 253,\n", + " 'disharmony': 255,\n", + " 'molls': 72979,\n", + " 'dewaere': 12569,\n", + " 'untutored': 256,\n", + " 'icarus': 257,\n", + " 'taint': 258,\n", + " 'kargil': 259,\n", + " 'captain': 260,\n", + " 'paucity': 261,\n", + " 'fits': 262,\n", + " 'tumbles': 263,\n", + " 'amer': 264,\n", + " 'bueller': 265,\n", + " 'cleansed': 267,\n", + " 'shara': 269,\n", + " 'humma': 270,\n", + " 'outa': 272,\n", + " 'piglets': 273,\n", + " 'gombell': 274,\n", + " 'supermen': 275,\n", + 
" 'superlow': 276,\n", + " 'kubanskie': 280,\n", + " 'goode': 278,\n", + " 'haifa':
1043,\n", + " ...}" + ] + }, + "execution_count": 48, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "word2index = {}\n", + "\n", + "for i,word in enumerate(vocab):\n", + " word2index[word] = i\n", + "word2index" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_target_for_label(label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 54, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "1" + ] + }, + "execution_count": 52, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 55, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'NEGATIVE'" + 
] + }, + "execution_count": 55, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[1]" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 53, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[1])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 3: Building a Neural Network" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "- Start with your neural network from the last chapter\n", + "- 3 layer neural network\n", + "- no non-linearity in hidden layer\n", + "- use our functions to create the training data\n", + "- create a \"pre_process_data\" function to create vocabulary for our training data generating functions\n", + "- modify \"train\" to train over the entire corpus" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Where to Get Help if You Need it\n", + "- Re-watch previous week's Udacity Lectures\n", + "- Chapters 3-5 - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) - (40% Off: **traskud17**)" + ] + }, + { + "cell_type": "code", + "execution_count": 86, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " # set our random number generator \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews, labels)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self, reviews, labels):\n", + " \n", + " review_vocab = 
set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " \n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " self.layer_0[0][self.word2index[word]] += 1\n", + " \n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def train(self, training_reviews, 
training_labels):\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + " self.update_input_layer(review)\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward pass ###\n", + "\n", + " # TODO: Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error is the difference between desired target and actual output.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # TODO: Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # TODO: Update the weights\n", + " self.weights_1_2 -= layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " self.weights_0_1 -= self.layer_0.T.dot(layer_1_delta) * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / 
float(i+1))[:4] + \"%\")\n", + " if(i % 2500 == 0):\n", + " print(\"\")\n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \"% #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + " self.update_input_layer(review.lower())\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] > 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 87, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 61, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):587.5% #Correct:500 #Tested:1000 Testing Accuracy:50.0%" + ] + } + ], + "source": [ + "# evaluate our model before training (just to show how horrible it is)\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "code", + "execution_count": 62, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% 
Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):89.58 #Correct:1250 #Trained:2501 Training Accuracy:49.9%\n", + "Progress:20.8% Speed(reviews/sec):95.03 #Correct:2500 #Trained:5001 Training Accuracy:49.9%\n", + "Progress:27.4% Speed(reviews/sec):95.46 #Correct:3295 #Trained:6592 Training Accuracy:49.9%" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output 
weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 63, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.01)" + ] + }, + { + "cell_type": "code", + "execution_count": 64, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):96.39 #Correct:1247 #Trained:2501 Training Accuracy:49.8%\n", + "Progress:20.8% Speed(reviews/sec):99.31 #Correct:2497 #Trained:5001 Training Accuracy:49.9%\n", + "Progress:22.8% Speed(reviews/sec):99.02 #Correct:2735 #Trained:5476 Training Accuracy:49.9%" + ] + }, + { + "ename": 
"KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m \u001b[0;34m-=\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 65, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.001)" + ] + }, + { + "cell_type": "code", + "execution_count": 66, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):98.77 #Correct:1267 #Trained:2501 Training Accuracy:50.6%\n", + "Progress:20.8% Speed(reviews/sec):98.79 #Correct:2640 #Trained:5001 Training Accuracy:52.7%\n", + "Progress:31.2% Speed(reviews/sec):98.58 #Correct:4109 #Trained:7501 Training Accuracy:54.7%\n", + "Progress:41.6% Speed(reviews/sec):93.78 #Correct:5638 #Trained:10001 Training Accuracy:56.3%\n", + "Progress:52.0% Speed(reviews/sec):91.76 #Correct:7246 #Trained:12501 Training Accuracy:57.9%\n", + "Progress:62.5% 
Speed(reviews/sec):92.42 #Correct:8841 #Trained:15001 Training Accuracy:58.9%\n", + "Progress:69.4% Speed(reviews/sec):92.58 #Correct:9934 #Trained:16668 Training Accuracy:59.5%" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m 
\u001b[0;34m-=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Understanding Neural Noise" + ] + }, + { + "cell_type": "code", + "execution_count": 67, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
ITBiD8G9077fkJiJ5O56gR2HhXNgC4ltRSO/m48J5w7/z6hcuL+53Ezo7TveizvMGkFXy5M1S\nJ29wWB6cv0jZUzJxOjq/keXLl1v/kbRvlnHl6vzgIJDnfydP3jwI77XXXp3s3aYCOokSHhxxxBGd\nlElfaCEjzn+PzFE+ZlHnSEvAQCcXXnjhMP8LXrIdSXFpivomAJoTLCG+fv60DGkOOuggl9RgPUGv\nrJJmmixNHbmJSJaOdwoSzc2Zl7gRcKRBYJXOzOTSQkT8myXKx4JpDRcxDmchJI9+ru4k30V2dpL6\n8qZhXjYc8S9vmf4/Zdqy8uRNW5dLj+UBX4x+Tsm4ut238wc577zzrC6OGLnr+hYCvRDI87+TJ28v\nvbpdf9/73te5/OMf/7hznPfAd+TEZ8ONA5TLy60fNZWoq05cBFF+48fn5+PY+fa59FHfLKZwgzzT\nzZ/61KeikkWe86f2IxOETrrpGXfa6Rc1LYP/iHshZ2xFL//Zzzjsj53hKKtEq3biynG/C/sOzDK5\nhKU+gTKdj7+cNWj0sNgSQSM6dYWXDJEvvO6a374EJshOPdTpL/UML99lyRKSVT90de3y20SZcdf8\n8/7yXadHcJN0ykQvJ36+cJtJQ/1OFzDwxZ3nO6ynny587JZlhs+n/e0v7UIH+oG+TSqk9dtHGZSZ\nVVgemWZZcvDgGPrGN76Rtbpc+aKW87Jcl/P9FnAAO+4Lt0QXHMMfAqGRJvBRqETPfuNSdH0O37zl\n1u3/jvs2sOYlbpb/P8+zMiz+czvueeA/+xhrnPjPUz9N+Nh/BpM/fL3bb1dfeNzplsevD12j0ro0\nfvtJFyU+hq4sfzmvnydcnksf/ga7sPjjlj/mhtPl+R3dwpQlZul4n1TQUDd4+f9gYVCS3izhzsii\nn58nPMDHXcva2X55eYiIu6nczdytG4t6IEb9M6AHDxf6kutRH66Rxunsf4fx7taO8DUCbKXZYI3B\nl4G/34N/N8LRL32oB7yIawH+fDuiAS5RH9Jz70BQyJMnIFq47wbhN5imIcpxmNTt/y5tu8LPcvf8\nd+31n6VpiQhlxz1b3HMm6hnj1+nSue8w4XBEhG9/oHbp+WaMYyxy5yjDlygd3bM7rIufzx2DmSvb\nfXcjCnH3jMvLOOTaFVdHuJ9curzfhRCRtB2PtcI1nm//puh2jVopbAYAACaLSURBVMaGO8gvh2M6\nNwxWWv2oxycHvn69rmXpbL+utESk282MrnGS9sERVw5YR+kQ7pekv6P6L67uqPNpIjz6Az549Euo\nCwtEN0E3yEoZgjXDEQmIB7976ROnB21xAdEgJRCVrGXF1dGm8/QpOOWVuv3fpX0BoP3+cyM8gPrP\n+bRExGHLs9ivg2cQ5CDqGevycI363POKZzO6cN6d49sfYxizfMLh8lBmHIHhWjgf5VIX4ref83Hi\nt89/oY9LT51hndA3PMa5/PSLa3dcP7i0eb7jW5ih1KQd74MHCGEJW0vCLA0w/TQA1Q1MV35S/UhP\nea4Dwp3U7Rp503a2X17UPwn1O11oty/dbmY/XfiYgS6NKTWc3/+NDn6fOl3TflMGZeURBlgG1iQS\nJh/h30nKSJPGTX8kzZM2fa9yaR+DYFmEwREc7iusJpJoBPi/KIKs1en/DkILGUkj/mAbfq6lKacf\naf1nMAP+oIhPsBxJKqPthRKRMhRUmeUhwIBU5Fs3/6w+qUpKRMgTJntZW530IR9FOnwLSdb64/Ll\nsXCga56Bi7oJvw1BSDtYxLWn23n0hRByf0Xh3C3vIFxLQ5aT4FGH/7ssfY1VwU1rJHmbT4JF1jT+\nixTPI//ll5dDpyfPF9IOgtA/7hkOJmXKKAoPKpMMIAJXXHGFjXzKio0iBe/05557zgZ5Y437L37x\ni2HF/8Vf/IUhWixLnj/ykY8MW/I2LGHKHxs2bLDr93utBHDxQ4KBe
UQNxPLgfJEhy6MCl42ouMeJ\nrGWAybnnnmumT59u5s+fX2i7eqhsWJIcvOkaljWyZYPk9wjcfPPNNvzAjTfeWCgkVf3f8f9EAL6A\n8KZuDytSXETSgFCVGnujm3K+Ht3ScS2wDGSKl9Sr3Lpd9zEpvc1lshyVXW8E2KiqqA24qm4p0wKz\nZ8+2/gq9dOn1lt7req/y/etYnPJYM/yy0lpV8N3ACpJ0qsqvq6hjdOYe41MUDkXpVkU5YMAqrdGj\nR7cGjyz+IT72zhpR9lu3X2fUse8bEpCCjjXAP677FFJUu7Kc861V/bAAaWomSy+1JA8PRQYqBoum\nC235q7/6K/uQh0jEkYm48377KauIKSvqoqwihfKStIE5ewb/ItpRhP7oU/RUYBF69aMM+oA+4+P6\nAyyqJIhFtjtvW/B1cYN9UVO0WduHH4TvF+H0wsEzyn8vaz11z0c/0HampPxpqrL01tRMgPYgC9Mz\nSNFm4n5j+vDDD5tbbrllWKRDoqU6IQCQm26JmpJx6dx3t+kblybu2wUpK3O/mG6b5tGnRHRcu3Zt\np81xuvbzPHqxWRtTZ20OY8+9409TRG1uyLTVI488MmxH8X72RVF1cR9OnTp1WHuLKlvlDA4CIiKD\n09eRLXXzu8GbWq0GrUhlu5w89NBDrQ/EaaedFpkKchBMRdldKknAXjO9CEmWsO/gSV39GGij/Ebq\nSkJcpzgy0vT7zbXHffukN8m9xT0CQSFfr/vQ1VHH79NPP90cffTR5tJLL62jetKpIQiIiDSko8pU\nk8GBEL9NfZhgDbnmmmvM5s2bY2EKk4okb60UFs4XW0FwIYoYdEtfxDWf+OAEedddd5mNGzfWmlTW\nnSwl6Rf6Opgm6yTNYv2iv9gjhNDgTRT+N7CGtI1UNrEvmq6ziEjTe7AA/RnMeIvDnNy0tzPeLI86\n6iizcOFCc/zxx0eiQfuQbm2LG1gon/y9LBw8lKNM8JEKFXwSHbH2/PM//7N5+umne+pacPWZimP3\n0MCHpTGracCYAddJEX1NmZSzZs0au+rEld2Ub1lDmtJT9ddTRKT+fdQXDdnxeMuWLY17O0uidxqr\nhgObPE7+93//13z0ox+NtDK4ASrLG7ErP++3G9BuvfVWEzc1lbeOovND7sDs3nvvjSWQRdeZtjz/\nHsDHqBcZTVs+6fEVWbRoUe2tWOG2Ob27WSHDefRbCMQhICISh8yAnWcww7IwZ84cU3RckbKgZKDA\nNMx3nLWDa3lJAthE+ZcwmHKtjAEqDWZMdbCV+tKlS9Nkqzytm1Kry1QS/ek7mea9b5ICjGUhWHnS\nGOsQ1kMsWk215CTtF6XrHwIiIv3DuvY18YDBVIwJuurBtRdYSawADCxIHEnpVYd/nfooD1z4JmDb\nu9/9bhPEg6hsSgb93KDQjYz57ajbcdXmfXBzksTJ1KUt8pv7CdLTBIsW/wdTpkyxLwBN9Skrsu9U\nVjEIiIgUg2NrSuEt9ZJLLqn1Ekv3MAwCIHVddlyENcTvWEdseGv2fQTi/Ev8vGUdVz2Q520XfdRP\nh8cq+6obVi4CLkt6ua/rKjNnzrTWt6Y62NYV10HXS0Rk0O+AiPbXeVWDIyF77rlnV3+WokkIMFE3\nUzS9pq78t+yyfAvQx1lDmr5qoUwyRZ8V7WQK9mXIv//7v9upUVbS1NEiWefnQhn9oTL7h4CISP+w\nblRN7qGzePHi2jwUHQkByG7BupzloogpGddplEn9DBBpSE54ICzS/E8fsV9P0/dxcVYR3z/D4Z7l\nu19EMItucXnc/bV169ZaWiTd86Db/11c23ReCPRCQESkF0IDfJ2HT10iYfKg/vSnP232228/u8rA\nRUmN6p40RCEqf/gclgfqc8QGcoE+Wd5ayecPuP4UT7jebr/Rgby01enVLX3drxGQrtsS7G76hzHt\nl5NpN53SXEN/xPWjmx6tg88I9
xk+IYhIiIVBf0pA4E//XyAllKsiW4BAsNmRHfhZTbPvvvsaBosq\n5MEHH7R+BGeccYYdrN75znfGqlEGCWGA2GuvvTp1Uj9Levnupksng3cAocEq4j7btm0zP/7xjy2x\n+fWvfz2sHi/biMNvf/vbxr09j7jYwBPvete7zLPPPmu453oJg+Mrr7xiMWMQZ/oLUuYw7ZW/TtfD\nJATdDjzwQPu/NmvWLPPb3/7WfPzjH69EZf6XWA7O0vXbb789cvl6JYqp0tYhIItI67q0+AZhETjz\nzDPN/vvvb4gG6d7ciq9peIkMOCwnfuyxx6wVhCWO3awQUQ/14SWm+8WDuJvFomjSQ3t9f4Zu0zhY\nq5ocDTfcE/QdlgzfWuSn8Z1My/S78ess+7jX/cp1rIDIggULci9DT9oe7sOvfvWrlnxcd911PX2i\nkpardEIgFoGydtNTue1CgF1fb7rpJrsjIzupBgNGaQ10dQWEZ2jGjBmdHWw5HwzUsfUm2ZU2NrN3\ngXqSlpU0nVd84kMwpnz3QS8n7HjaDQuXrknffptcH0S1vUltitOVvk36P8T/Hf8LZf/foWvgjG3r\nmjZtWmL94tqo80IgKQImaUKlEwIgwMOTB2LAbO13kYMhZbuHLg/CqEHeDVDh3ohKG06T5Dc6pGkT\n6fn0Q9CLdr700ksW/37U2c86zj333KGrrrrKtjFNH/RTxyLqynLPcN/7/3dF3e+0h7L5v4MI8lm/\nfn0RzVQZQiAxAn8SayrRBSEQgQDTMjfeeGPHhE6ERT5M2TBVkVYwuRMumjKYiti+fbuN2Eicgiin\nQ3wsOE9dmJARTNjkzSvognSb/gnXAR7o4XQJXy/yN3rR9t/97ncmIGpFFl2Lsg4//HDzZ3/2Z7aN\nafqgFsonVKLXdExcMdz37v+OKTn8R/DZYouDLP936IFTLHFBmOrC52bJkiV248i4PZvidNN5IZAX\nAfmI5EVQ+Q3BmPDjCN6k7H41DJJjx461PgzAwxJTHnY7duywaO3atcume/755+3vyZMnm1NOOcVM\nmjQplUMcxIEHdPCGGUlabOEJ//Aw7+YP0qsY8kcRp175slyHuL366qtdg7llKbfqPGCIL0Rbg2Vl\nJSFx/cL/HRF+2eiQD5sIsqpswoQJnSzjx483r7/+uv0d9393xBFH9M3vq6OYDoSAh4CIiAeGDvMj\ngGUgMKvbFR08+BCsHOyF4j8gJ06caK0YWBTyyDe/+U3zvve9L5UVw6/P6ZuXRFAOA00/3uSxPiFt\nC7HdZiJSNAnx72GO3X3MSqpu/3cQk7333rsv92lYR/0WAnEIiIjEIaPztUfAPdyxikB+0pIJ8vMA\nL4o8YKGBWKFPmdJWIkJfYDkLJpbLhK/vZbv7NC/p7rviqlAI9AkB+Yj0CWhVUzwCTMm4gR8Swhs1\ng1kSyeIP0qtcCA2ESJINgbIJXDat8uVy95lISD4clbvdCIiItLt/W9u6KJ8MyAhvn+4NNK7x5GVg\nKGNwcIQorm6dHxwEnIWsjPtscFBUSwcBARGRQejllrURohG3SsZNs7g3Ub/pWEscgSnz7RvdepEh\nXy8d/x4BMGvLoO1ISJn3me4bIdAWBERE2tKTA9QONyUT12Rn7YB0OGGQ45PWj8TlT/NN/ZCepNNE\nacpuc1r6FSfmpotISNN7UPr3G4F39LtC1ddeBBh48ZFgWa5bKhjVWre0Fw9+9tVI8xbsLBpR5frn\neBN10yR/+qd/av76r/+6MKdUv564YywzSXWNKyPuPMuhN27cGHe5seeDwFqN1d0pLhLikNC3EEiO\ngIhIcqyUMgIBrAxPPvmkDUrmYhkcdthhNobI3LlzI3IYu7SXJb2LFy82q1atMkE0Rxugi03t3NRK\nVEbqipuSiUrPOVZh/OY3v4m7XOp54pIwMHVrUxYFxo0bZ/HOkrfOeYh3wb3QVBEJaWrPSe+qEdD
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 67, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 70, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 71, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 71, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 79, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "review_counter = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 80, + "metadata": { + "collapsed": true + }, + 
"outputs": [], + "source": [ + "for word in reviews[0].split(\" \"):\n", + " review_counter[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 81, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('.', 27),\n", + " ('', 18),\n", + " ('the', 9),\n", + " ('to', 6),\n", + " ('i', 5),\n", + " ('high', 5),\n", + " ('is', 4),\n", + " ('of', 4),\n", + " ('a', 4),\n", + " ('bromwell', 4),\n", + " ('teachers', 4),\n", + " ('that', 4),\n", + " ('their', 2),\n", + " ('my', 2),\n", + " ('at', 2),\n", + " ('as', 2),\n", + " ('me', 2),\n", + " ('in', 2),\n", + " ('students', 2),\n", + " ('it', 2),\n", + " ('student', 2),\n", + " ('school', 2),\n", + " ('through', 1),\n", + " ('insightful', 1),\n", + " ('ran', 1),\n", + " ('years', 1),\n", + " ('here', 1),\n", + " ('episode', 1),\n", + " ('reality', 1),\n", + " ('what', 1),\n", + " ('far', 1),\n", + " ('t', 1),\n", + " ('saw', 1),\n", + " ('s', 1),\n", + " ('repeatedly', 1),\n", + " ('isn', 1),\n", + " ('closer', 1),\n", + " ('and', 1),\n", + " ('fetched', 1),\n", + " ('remind', 1),\n", + " ('can', 1),\n", + " ('welcome', 1),\n", + " ('line', 1),\n", + " ('your', 1),\n", + " ('survive', 1),\n", + " ('teaching', 1),\n", + " ('satire', 1),\n", + " ('classic', 1),\n", + " ('who', 1),\n", + " ('age', 1),\n", + " ('knew', 1),\n", + " ('schools', 1),\n", + " ('inspector', 1),\n", + " ('comedy', 1),\n", + " ('down', 1),\n", + " ('about', 1),\n", + " ('pity', 1),\n", + " ('m', 1),\n", + " ('all', 1),\n", + " ('adults', 1),\n", + " ('see', 1),\n", + " ('think', 1),\n", + " ('situation', 1),\n", + " ('time', 1),\n", + " ('pomp', 1),\n", + " ('lead', 1),\n", + " ('other', 1),\n", + " ('much', 1),\n", + " ('many', 1),\n", + " ('which', 1),\n", + " ('one', 1),\n", + " ('profession', 1),\n", + " ('programs', 1),\n", + " ('same', 1),\n", + " ('some', 1),\n", + " ('such', 1),\n", + " ('pettiness', 1),\n", + " ('immediately', 1),\n", + " ('expect', 1),\n", + " 
('financially', 1),\n", + " ('recalled', 1),\n", + " ('tried', 1),\n", + " ('whole', 1),\n", + " ('right', 1),\n", + " ('life', 1),\n", + " ('cartoon', 1),\n", + " ('scramble', 1),\n", + " ('sack', 1),\n", + " ('believe', 1),\n", + " ('when', 1),\n", + " ('than', 1),\n", + " ('burn', 1),\n", + " ('pathetic', 1)]" + ] + }, + "execution_count": 81, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review_counter.most_common()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 4: Reducing Noise in our Input Data" + ] + }, + { + "cell_type": "code", + "execution_count": 82, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " # set our random number generator \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews, labels)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self, reviews, labels):\n", + " \n", + " review_vocab = set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def 
init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " \n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " self.layer_0[0][self.word2index[word]] = 1\n", + " \n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def train(self, training_reviews, training_labels):\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + " self.update_input_layer(review)\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward 
pass ###\n", + "\n", + " # TODO: Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error is the difference between desired target and actual output.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # TODO: Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # TODO: Update the weights\n", + " self.weights_1_2 -= layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " self.weights_0_1 -= self.layer_0.T.dot(layer_1_delta) * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / float(i+1))[:4] + \"%\")\n", + " if(i % 2500 == 0):\n", + " print(\"\")\n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \"% #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 
100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + " self.update_input_layer(review.lower())\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] > 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 83, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 84, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):91.50 #Correct:1795 #Trained:2501 Training Accuracy:71.7%\n", + "Progress:20.8% Speed(reviews/sec):95.25 #Correct:3811 #Trained:5001 Training Accuracy:76.2%\n", + "Progress:31.2% Speed(reviews/sec):93.74 #Correct:5898 #Trained:7501 Training Accuracy:78.6%\n", + "Progress:41.6% Speed(reviews/sec):93.69 #Correct:8042 #Trained:10001 Training Accuracy:80.4%\n", + "Progress:52.0% Speed(reviews/sec):95.27 #Correct:10186 #Trained:12501 Training Accuracy:81.4%\n", + "Progress:62.5% Speed(reviews/sec):98.19 #Correct:12317 #Trained:15001 Training Accuracy:82.1%\n", + "Progress:72.9% Speed(reviews/sec):98.56 #Correct:14440 #Trained:17501 Training Accuracy:82.5%\n", + "Progress:83.3% Speed(reviews/sec):99.74 #Correct:16613 #Trained:20001 Training Accuracy:83.0%\n", + "Progress:93.7% Speed(reviews/sec):100.7 #Correct:18794 #Trained:22501 Training Accuracy:83.5%\n", + "Progress:99.9% Speed(reviews/sec):101.9 #Correct:20115 #Trained:24000 Training Accuracy:83.8%" + ] + } + ], + "source": [ + 
"mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 85, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):832.7% #Correct:851 #Tested:1000 Testing Accuracy:85.1%" + ] + } + ], + "source": [ + "# evaluate our model before training (just to show how horrible it is)\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Analyzing Inefficiencies in our Network" + ] + }, + { + "cell_type": "code", + "execution_count": 88, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAl4AAAEoCAYAAACJsv/HAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHsvQv8HdPV/7+pKnVrUCSoS5A0ES1CCC2JpPh7KkLlaV1yafskoUk02hCh\njyhyERWXIMlTInFpRCVBCRIJQRJFtQRJCXVLUOLXoC7Vnv+8t67T9Z3Muc85Z+actV6v+c6cmX1Z\n+7NnZn++a63Ze4NMIM7EEDAEDAFDwBAwBAwBQ6DqCGxY9RqsAkPAEDAEDAFDwBAwBAwBj4ARL7sR\nDAFDwBAwBAwBQ8AQqBECRrxqBLRVYwgYAoaAIWAIGAKGgBEvuwcMAUPAEDAEDAFDwBCoEQJGvGoE\ntFVjCBgChoAhYAgYAoaAES+7BwwBQ8AQMAQMAUPAEKgRAka8agS0VWMIGAKGgCFgCBgChoARL7sH\nDAFDwBAwBAwBQ8AQqBECRrxqBLRVYwgYAoaAIWAIGAKGgBEvuwcMAUPAEDAEDAFDwBCoEQJGvGoE\ntFVjCBgChoAhYAg
YAoaAES+7BwwBQ8AQMAQMAUPAEKgRAka8agS0VWMIGAKGgCFgCBgChoARL7sH\nDAFDwBAwBAwBQ8AQqBECRrxqBLRVYwg0GgLvvfee22CDDdzWW2+diqahZ+fOnVOhqylpCBgCjYuA\nEa/G7VtrmSFgCPwbgT59+jiIookhYAgYAvVGwIhXvXvA6jcEGgiB2267zbVt29ZbwrCGDRo0KNs6\nOf/kk09mz2GFIh3nIEYQJH6LJW38+PHrpZ06daq3so0cOTJ7LdcBaSgLvUwMAUPAEEgCAka8ktAL\npoMh0AAIQJwgWi+99FK2NZAkyBTSo0cPv1+wYEF2T57999/fbz179mxBkLgGcdLkjYyc41oxMm7c\nOJfJZNysWbOKSW5pDAFDwBCoOgJGvKoOsVVgCDQHAliVIEQDBw70ZGft2rWuVatWTojWiSee6IGQ\n32L54jwEjd+QM4gS2xNPPOF23333FmSMAiQNpMrEEDAEDIG0IWDEK209ZvoaAglFQAgXZAmrVNgy\nBWGCiAnhEgIG8RIrGefE1UggPOchc5KHpp999tkJRcDUMgQMAUOgMAJGvApjZCkMAUOgCAQgScRs\nCaGCgEG0tECyIFKkgUzhZiSdyJQpU7IWL7F8sSediSFgCBgCjYCAEa9G6EVrgyGQAARwF0KqIFe4\nASFU/NYicV7ylSFpESFflCHWL53Pjg0BQ8AQaBQEjHg1Sk9aOwyBOiMg1i2C4XEXQq7knKgG0eKc\nEDIhXrgpsWphBZOvH7XLUfLb3hAwBAyBtCNgxCvtPWj6GwIJQQDyJBYtyBVuQ7F66ekchGyRVixd\nNGH+/Pk+MF83hzI5b2IIGAKGQKMgsEEQP5FplMZYOwwBQyD5CDA3F4H3uCMtUD75/WUaGgKGQLwI\nmMUrXjytNEPAEMiBAG5E3IeQLixiWLMqEaxo4o6M2lOPiSFgCBgCSUNgo6QpZPoYAoZA4yOApSsc\n/1Vqq3FZmsG+VNQsvSFgCNQbAXM11rsHrH5DwBAwBAwBQ8AQaBoEzNXYNF1tDTUEDAFDwBAwBAyB\neiNgxKvePWD1GwKGgCFgCBgChkDTIGDEq2m62hpqCBgChoAhYAgYAvVGwIhXvXvA6jcEDAFDwBAw\nBAyBpkHAiFfTdLU11BAwBAwBQ8AQMATqjYARr3r3gNVvCBgChoAhYAgYAk2DgBGvpulqa6ghYAgY\nAoaAIWAI1BsBm0C13j1g9RsCTYTAgw8+6F555RW3/Nnn3RtvvOHWrH7DPfjgovUQ+MFJp7gtttjC\ndezYwX1t553c4Ycf7r7yla+sl85OGAKGgCGQNgRsAtW09ZjpawikDIG5c+e6hx9Z4u6+607Xuk0b\n981993e77b6b23PPPd1WW27puh7cpUWLnn1uhXv1tdfc6tVr3EsvveSeefpP7q475zrI2JHf6eF6\n9eplJKwFYvbDEDAE0oSAEa809ZbpagikBIH/9//+n5tx401uzuzZXuPeJ3zPHdG9u+vYoX1ZLVi9\n5k0379773WPLlrr/mzrZ/XzE2e4npw92u+66a1nlWSZDwBAwBOqFgBGveiFv9RoCDYrAlVde5a65\n+mq3X+cD3Kl9+7qjj+wZa0uxiP3619e5yydeagQsVmStMEPAEKgFAka8aoGy1WEINAECxG9dcMEv\n3RZbbuVOO/302AlXGEIhYPPuvsudM+oc169fv3AS+20IGAKGQOIQMOKVuC4xhQyB9CFw5VWT3DWT\nJrnThw5zw4acXtMGzLtvvrtk3NggfmzHwNJ2lcV/1RR9q8wQMARKRcCmkygVMUtfFgKdO3d2G2yw\ngXvyySfLyl+LTFOnTnVbb7211xNdx48fX4tqU10HsVyDBp/uFix4wF1/w/Saky7Aw5V58y23uO23\n38Ed1OUg98c//jHVmJryhoAh0NgI2HQSjd2/1roiEYAQDho0qEXqkSNH+t9nn312i
/P243MEIF19\n+w1wm2++uZs8+VrXpvUOdYOGuideNsF/Lfn9//6+m3nrTPfNb36zbvpYxYaAIWAI5ELALF65kLHz\nVUWAaQJ69uyZtS5xzDmkT58+/ry2OElaOcceq5RskKb33ntvvfxY2tgKCdYu5MQTT3SZTMaNGzfO\n/16wYIHf25+WCAjpatt2D3fLzTfWlXRpzXBzjhg5ykG+zPKlkbFjQ8AQSAoCRryS0hNNpgfkSpMa\njiFXSI8ePfxeX8ci1apVKzdw4EBvmRJrlE8Y/IE4SX45Bzkr1rUp6SBeiOzlvJRpe+c06cLKlDT5\n0YC+Rr6S1immjyFgCGQRMOKVhcIOaoUAli0Izf777++tS1iYOJbzkCtIlpAeCBjWLNKwh2RxvGrV\nKp9/7dq1nqyRXpO13Xff3XHtiSeeKNg0sZZRLyJ7zsu1goU0SYIxYz+PfUsi6ZIugHwR6D98+Jme\nKMp52xsChoAhUG8ELMar3j3QhPVDiCBbECixXImbUeCAWEGi2ISAYYWSY/Zt27aV5Nm9XOcE6YVA\nZRPYQUUITJ8+3d05d45bGEwdkXTB7bj8mWfc6T8Z6t2hSdfX9DMEDIHmQMAsXs3Rz4lrJXFXEq+F\ncuJeFEW1qw/yBYGSc6TBKgZ5C2/lBsILQRMCKFYuzss10a1Z93/5y1/c2DFj3cRggtR6BtKXgv/o\n0ef79SAhjCaGgCFgCCQBASNeSeiFJtPhtttu85YryBZB7JAobakCDrFWQc4gXljAIEDsEcpgi0uk\nXOpCpGw5H1c9aS5n1Lm/cCf0+X7VJ0aNEyMI4lkjz/GEkdg0E0PAEDAE6o2AEa9690AT1i8WJFyN\nfJWIy1AsTAKHkCw5L9Yu3JQQNc7L14/yZSNzcEl6KafYPWUiEC7KExdo2BJXbHmNlo5Z6f/wxON+\nfcS0tY15vo4+5rtOYtPSpr/pawgYAo2FgBGvxurPVLQGMqNdghxr4iONELIFCZPrXJsyZYq3lAmB\n4xxlzp8/v2y3IJYtytVlYo3TelJPs8rU/7vOB6unxcUY7qcf//hHbsIl4xzuUhNDwBAwBOqJgC0Z\nVE/0re68COB+JBZMSFXexHaxaghg7Ro8aLBbsXJF1eqoRcHDzxzhvvjFjdwl48fWojqrwxAwBAyB\nSATM4hUJi52sNwK4DWXiU23tKkcv3I/ijozaSz3llN0MeW75za2u74Afpb6pWL34ItNivVLfldYA\nQyDVCJjFK9Xd17jKS7wW7sZZs2Y1bkMT3jJICu7X5c8+7zp2aJ9wbQurd2yv3u743r1c//79Cye2\nFIaAIWAIVAEBs3hVAVQrsnIEmPiUqSKMdFWOZSUlzJ071/3PwMENQbrAoUewOsKSpY9VAonlNQQM\nAUOgIgSMeFUEn2U2BBobAUjK3p06NUwjj+je3f3f1MkN0x5riCFgCKQPASNe6esz09gQqBkCix9c\n5Dr/e+60mlVaxYpwl3732OMcHwyYGAKGgCFQDwSMeNUDdavTEEgBAjL1QteDu6RA2+JVbNt2D/f0\n088Un8FSGgKGgCEQIwJGvGIE04oyBBoJAYjXfp0PaKQm+bbstvtu7rXX32i4dlmDDAFDIB0IGPFK\nRz+ZloZAzRHAHbf99jvUvN5qV7jnnnu6N94w4lVtnK18Q8AQiEZgo+jTdtYQMASaHYFWW2/jNt5k\ns2aHwdpvCBgChkCsCJjFK1Y4rTBDoHEQ+Ne//tU4jVEt+cY+ndxvbrlJnbFDQ8AQMARqh4ARr9ph\nbTUZAoZAAhBI63qTCYDOVDAEDIEYEDDiFQOIVoQhYAgYAoaAIWAIGALFIGDEqxiULI0hYAg0DAJM\nCttur3YN0x5riCFgCKQLASNe6eov09YQqAkCrNG4ukG//PvbunUNOU1GTW4Mq8QQMAQqRsC+aqwY\nQiug1ggwzcEf//hHvzHXFNsrr7zSQo2tttrKf
fOb33Rf+cpX/P7www93bCa5EYBsgSsCbh07dmjI\ndQ3ff//93CDYFUPAEDAEqoyAEa8qA2zFx4MAizXfcMMNfqkXSAEkCmLVv3//LLnSNQkhYw+Z+OlP\nf+r+9Kc/uV69ernjjjvOb5TT7CI4gYPgKphAxGbPuUN+Nsz+xRdXuS5dDmyY9lhDDAFDIF0IbJAJ\nJF0qm7bNggAD/+WXX+43SAHkCdK06667lgUB5QmBg4xR1ujRo8surywlEpBJky2wzIfnBhts4N5Y\nvcY10peAJ518qvtOzyM8aU9Ad5gKhoAh0GQIWIxXk3V4GpoLQRJChFsRsgRZgHjlIwmF2gZ5w0Im\nrkrS77bbbv4cdTayQDRpNxuCxZCtEJ4sKP3Io0t8nkb584cnHvdtb5T2WDsMAUMgXQgY8UpXfzW8\nthADXIjsIVzsIQhxC4QD1+XLL7/sIF38xrrWSAJ2stE+cGTjuFjpcuCBgYv26WKTJz7dvPvmu9Zt\n2pSEQeIbZQoaAoZAqhCwGK9UdVdjK4tFCzKEtYvjWggkRAieWMPQIa3xXxAtEUhWpXLMMUe74cPP\nrLSYxOR/5JFHXZsdd0qMPqaIIWAINB8CFuPVfH2euBZjcRKSAAkqxSITZ2PQA/KFWxPyheUt6YLO\n8iUiugqOcerdrVt3d9pPhrg+3zs+zmLrUlb7du3d5CmTIz/IqItCVqkhYAg0HQIbNl2LrcGJQkBI\nl7gX60W6AAUrF8QP8sKmCU2SQIMYiguRY9GXfTWk9/HHuwXz51ej6JqWed20GW6v9l/3eHGfaetg\nTRWxygwBQ6CpETCLV1N3f30br0kXFqYkCfrg7mRwToLlC4LFhkAa2GohkE8I3d/+9je3/NnnXccO\n7WtRbVXqwHI3dOhQd/zxvbPl07+0z8QQMAQMgVohYMSrVkhbPS0QSDLpEkXrTb4gPeCE1JJsSfsh\nJUy5AenaY489XbfuR7ipU66Vy6naY+26acYNbtGihevpbeRrPUjshCFgCFQRASNeVQTXis6NAAM6\npIJBL8kiVi/0rEXAvcYDS1st6ozCH9I5YMCAFpd23mlnN+XX17mjj+zZ4nzSf6xe86Y7+aST1rN2\nab3BvZ54a13s2BAwBBobASNejd2/iWwdXy0ysGPRqRexKAUYXFES/1VKvmLTarKVBLcXZPOKK67I\nqs/yS8S+4eqcPn2Gu/mWW1I1oeq5vxjtXn5plbvl5huzbYo64H7EspiGezJKfztnCBgC6UDAiFc6\n+qlhtGRw23fffd1TTz2ViNipYoDFMseADFnEUlepUB44iCSBbKELekG6pk+fLqq5XXbZxZOub3zj\nG45FLpj1ve0ee7qLLxydTZPkA+btGj5sqLv3vnt9HxbS1chXIYTsuiFgCFSKgBGvShG0/CUhAMlg\nw+qVJsHiI1NNlGMR0WSL/EkI2Nf4ox/9wnqWIpCtRYsWeQvQv/71L/fPf/7Tvfjii8F6l8e5kaPO\ncz8a0FeSJnL/7HMr3Am9j3Njxo5tEVBfSFkjX4UQsuuGgCFQCQJGvCpBz/KWhAAWIwgXA1s55KWk\nyqqQGGLCVixpxDXHhiSRbHnFgj/0B5a8V155RU65fv36uYkTJ3q9IVxs//jHP9ynn37qZs2a5X71\nq8vc9Bk3uq4Hd8nmSdIBcV2DB5/m2rdv7y4ZP7Zk1eQexdJpYggYAoZAnAgY8YoTTSsrLwIMYpAW\nLEdpFAZjiBdkKhdxJA3WI4T2siVZiC+TLxdFzzPOOMOTLn4L6YJwffTRR+7jjz92K1eudA8uetDN\nvHWmu/GmWxJHvoR07RgsDXTttVdLs0reC2lOeh+W3DDLYAgYAnVFwIhXXeFvnsrF2iWDWVpbDmlk\nINZWL0220vRlHH0S/nJx2rRp3tpFPBfuRbFyQbr+/ve/uw8//NDde++97pBDDnG/u+tud+usZJEv\nIV3cXzOmT
8tJkIu9/+R+NfJVLGKWzhAwBAohsGGhBGm9PnXqVLfBBhv4rWfPnqlrhtafdmiRdrFf\nsGCBvlTwuG3btllcxo8fXzB9XAmEeMVVXr3KgXixmDaWItkYlLGEseWyhNVL31z1EkSvSRdfLhLP\nhYsR0oWlCyvXJ5984gnX+++/7+fzItaNjyPWrVvnDu92mPveCSe6Q7oe5Jgnq96yZOljWffinXfM\niaUv6FsEcm1iCBgChkAcCGwURyFWhiGQDwGsBg899JD/Oi5furRcY0JR3IlxfOFY6zajd74vFyWI\n/rPPPsuSLqxcEK9nnnnGbbPNNt7qBTnbeOON3dH/31Fu+x22c2MuusAtD66PGPGzukw1AfGbMG6M\nO33IEDds6JBYYYV8gRvkK2kfRcTaUCvMEDAEaoJAw1q8aoKeVVIUAlhJevXqFYsFoqgKq5gIqxZB\n57QpbQJ5QH89XQRfLjK1B3shXbgXcS1+8MEHnnBBNP2SQcuXu+22285bwkiLxXWjjTZyPXr0cNdf\nf737y19edid9/weOKRxqJXy5OHDQaZ50sfh13KRL2oElEwJmli9BxPaGgCFQLgJGvMpFrsr5Bg4c\n6F0+WBbY0iyQlLitQ08++aTDVdqnTx+HK1m7X+UYtyrXRo4c6W677Tb33nvvxQIj5CVtxEusNXq6\nCNyKMl0ErkWxcgnpwp0opOuuu+5yXbp08WkAEWvXJpts4rdNN93U7bXXXu6GadcF83yd5OfNYr6v\nahIwYrnGjJvgp4uAFC17bJknlbF0cI5CjHzlAMZOGwKGQEkINBXxYqCWQZn9oEGD3EsvvZQTMOKn\nSKPzbL311n7AzzWI67TkZ+vcubMvQ2KqikmTL8YrrDCkQuqgbI45V45EtRnygj7lCm5GyEocInjS\nRiFUnIsS+pZrQtAgYuTJ1XdRZUSdE3dTWqwfxKKBv54ugi8XCaSHTIS/XMStqEnX8sDS1bp1a5/u\nC1/4QpZsbb755o4N4vWlL33JffGLXwzixvq7JUuXBCTtwCwBm/Xb2VEwlnWOOK7hZ45w3YP2LH/m\naf9lJdNF0I5aiJAvMDUxBAwBQ6AsBAJrSkPKlClTMBP5LXCFZNjkt963atUqs2rVqvUwOPvssyPT\nS17yPfHEE+vlk+vsw2WMGzfOpy8mjdaf9Fp0/sAyllNPqU/n3X333bPpw9f5rcsOH1NXqRK4sTLB\n7OelZotMX0i/sL65foNBVJ9HVprjZOA6zQTEJcfV5JxGxzAOnAtchZmAcGUCt2Im+FoxE7ghM2vW\nrPG4BIQy8/DDD2fmzZuXmT17dmbw4MGZ3/zmN5mAzGcCy1dm4cKFmccffzzz/PPPZ1577bXMu+++\nmwniwDJBIH4msJr5skEgILiZ4KOKzOGHd8u026td5qfDf5659bbbM8uffb4kgO659/7MqPPOz5Yz\n4qyRmZdffrmkMqqROLAWVqNYK9MQMAQaHIGmCK7PZREJBiRv/cCqNX/+f+JSsJCIdYo0UYLVBEtQ\nMIC7gIRFJSlYBpkK1RNZsDqZzxKFdWf//ff3MTgqS+QhFjLS5xPqCkiLCwhlvmQtrmEVIjamUilG\nv2LrwBKGizIgzsVmWS8dVi8+Gkiy5Fpz8bDDDst+uainiyCmS+K6CKjH5Xj//fd7axnWLCxbm222\nmdtiiy38xrG2dmENE2suuGAdwp3Jxn2w+OFH3Nw5c9x/n3hCUGY317rNjm777XdwXw3ixsKCNQtd\n7rpzrvvusce5Aw88wJ1xxrDYXdbhekv5jRVRrIml5LO0hoAh0NwIbNgszYeAMNAGRNqtXbvWExJp\nuyZmECpNhiAakDLysRF7JRJOK+f1Xsdq5SIsxaTRZYaPA0tQVr/AUtbicj5iphNq0qWxglgSPC0C\nNrS7WIGcMEBVKrpPpCz0pO1s9FF4Y4Z1roEv/aiFGLFy3bGUA/FKqruJIHq
mvdALXbPmIvpCugiM\nJ54L0sV0EfLVorgX2XPuz3/+s9ttt918PNeXv/xlT7aYdoINFyPniPMi3itMujTWgheB7yxUzXM0\nceJlbuD//MhtteVmbvMvb+I23WRjv20WHH/68YeuT0DOzhx+hk/L1BDnnTsqUaRL2ifkC8xNDAFD\nwBAoCoHgJdiQEnbVhV1LwSDdwgUTkDGPQzgf6cKi3Za4HLUEoGfLJV2UFJMmrIcuR+cPSIW+5I8D\nspHVgbTSNi7iZpP8pENwmco59mGsyE87JQ26FSvnn39+hq0SoX6pm30uN2+hOsK44AouV3AzBSSm\n3OxVyxeQ4kzwhWILvPgNhrgXcQXiEgysSZl33nkn8+qrr3qX4e9///vMAw88kLnzzjszAWH1rkVc\njLgagwlTM4888kgmCMz398abb76ZCYLuM4FFzLsqcVlSdjMLLnWwNzEEDAFDoBACTWHxwtoRtniE\nf4sVB0uICC5Ebe2R81hQRMin88h59lF59fVi04Tz6N8nnnii/umPw/Xm+4CADFp/rEhhbMBB11Oo\nPK0QVhYJRtfnSznW+pEPKxZ6lipYHHXbwuWWWl7S0uPOA+tyv1wkqJ4lgQi2X7x4sTvqqKO8ZWvL\nLbf0bkMsXbgZsXQRTM9UEoUsXUnDqFr6iOvZLF/VQtjKNQQaB4GmiPHSg22hrtOkItfgDhHRIqRN\nn+M4nC58vdg0UfnkXFTbwvXm0k/K0NchI8Tp5BOdPl86uRb3F2dRbZa6Cu3Jq/u4UPq0XIfglrLm\nIq5Eiediz3JAzFSPGzIImPeLS8tXi5Atjonpkq8XIV0bbvj5/22F7pe0YFipnpAviWmM+56vVDfL\nbwgYAslBoCksXsmB2zSJA4FyLVUQxnLzxqF3tcpgOaZu3br5ObekjuDLRT/Ra2Dy9hYs4rmwZgnh\nCsdzEesF6YJQvfXWW65Tp04+lgsrFxYviJfEcwnp0oH0Um+z78XylfQPL5q9n6z9hkA9ETDiFUJf\nW1NyDdJhi0/YwhQqsqo/o3QM66fbFKWM1h83JYN1vi2I8YoqJvIcXzRiBahEwq5TAu2jgu3z1YGV\ni69XNTbhcvPlT+q1ctdcZGJUXItYuiBd9DdfLgaxXu673/1uC9KFpSuKdCUVk3rrBflCjHzVuyes\nfkMgmQg0hauxFOi1e5FBmi8ewwO0/lIQ0qLzlFJXHGnRT8dfUab+ShP9ChEvrT9EjnZrMlaJnhCv\nOOJeaKN8hYh+fIXJRt/k6gPS0R5IV5R7MdyvlbSz1nnBtNw1FyFcuBexgPF1I5YrSJe2dEksl54u\nAtdipVYuyAhu0b+te9899tjvPWzowrQRSDDfl9uv8wH+uH27doG1bXPHl4NCZvyFFPzhvqetbByb\nGAKGgCEgCBjxEiT+vWeAZ0Bn0EawkmDhkUGa35rYhEnPv4up2S48txa/9dQQxegH8ZLYJ9rN/GS0\nWQiZlCmYcE1/YFCosXEQLwLjwV10kDqlL4SUyflCe/SX9hVKG3VdYnmirlX7HHhCRnQQPWstBl9a\n+iB4XIYEyIt7EauWTBkB6eJYgughUkwHQcB88OWjO+SQQ7LxXJAurlXqWgSr3919j3sg6L81q1d7\nYrV3p33cET16ujZtWnu4mDICYe3FV4MYM+Spp/7onnt+pbvjjjt9vm8Hc391PbiLnyrDJ0j4HyFf\ntD9txDHh0Jp6hkCqETDiFeo+rCcM8kJesJRARKKEtHxhV29BV9E3rAttKUZIB6lEsBKxJE+UQNBK\nIV0QhNGjR0cVVdI5SBKEL+wuLKmQfyeGjFbab/WyZDCIE0Svl/9hglIW7iagG8LFRqC8zNGFRYlN\nSBfXSIMFi2B5SBdz3FFu1PxcfLmIQNJKkdmz57irrrrKk6YT+nzfnTXyHHf0kdHPkpTbsUN7x4bo\ntBCyBxYudLPn3OHGjR3nTh8yxB373f9
ykJskC/pBlI18JbmXTDdDoLYIWIxXBN6QkELkAtLFhJ3s\n6yn5iBXkopCbUXSnvYXICGXR5lKEgYe1GuMQCBMTutLmcnDHasmkqmzl5NdtYCCFVNZScNFRpyZd\npay5CPmCjEG6IFPEbUG0sHRhMZOvF7WlqxzSBeHq1q27J12n9O3vVqxc4S6+cHQLIlUqbpCxYUNO\nd1jGJl55lXv55Vf85K5XXjUpFld2qfqUkh5CzHPAPWNiCBgChoBZvHLcA+JexJUVjukSYlbp4J2j\n6pJOQ5ggRASbSxwT1iF0LMbNqCsjD+QEt50OXqd86uF6qcKAw6zpcf3HD+YQRDYsc+JqlL3WD71l\no11x9RcWDMhkLd1HfLk4YMAA3Ty/yDXWLgLjcS/q5X9wL2Lhkngulv/B0kVaXIeQLln+54UXXvAu\nRpmfi3gvCJfEdLWoNM8P+viSCb8KLFxvOAjXjwb0zZO6/EtYwth+/OMfuYsvvthdM2mSuyrYevbs\nUX6hVc6pyVct75sqN8uKNwQMgTIQ2CB4ETPLtYkhUDUE+gfL10DA4nA5Vk3JEgqeO3eub0utLBhx\nrLkI6ULCay5CXo855pi8ay4WA8306dPd2DFjXd8BP3L9+53q2rTeoZhssaSZ9dvZ7n+DJYW6dT/C\njR17sXe5xlJwFQoRt2OtraVVaIoVaQgYAmUiYK7GMoGzbMUjQOwQZKURREgXZLLawiBNPZWuuQjp\n0kH0uBSZn4sPFfbee++S1lyMavNZZ5/jSRcuwFEjR9SUdKFPn+8d7xb6LyXXub79BiTapYflC9KF\n29jEEDAEmhMBs3g1Z7/XvNUMOAw2aXezQIZwWTJBKVY8kbgtGNRDmXF/uUhMl8RyPf744+7oo48u\n+8tFdIToIJMnX1tzwiXY6/3wM0e4eXff5WbeOjPx9xrPQ9z3jcbCjg0BQyCZCBjxSma/NJxWuMsY\nqG8IYpXSLJdffrm33oUtFuHfEEzIZjmCC7MaXy5CumQWeiZKZS1GposgnqvUIPokki7B+rppM9yE\ncWOMfAkgtjcEDIFEIWDEK1Hd0bjKMP3CbrvtFnyN9nILS1HaWoyVC/IFMconkCfIiQj52AoJBI6y\nmVlehC8XmS4C4YtEJj3FfaiXAJIger3mImSK6SIIoteWrr/+9a/+/J577pmdo4uyS5ku4qSTT/VT\nVCTF0oX+WtJGvioh6rrddmwIGALJR8CIV/L7qGE0JF4JSavVC8LFBoksVcij82ENC7tdwaVaXy5C\nvNj4cnHp0qV+brpyvlyk3RddPCZYWujxxLgXc/XFub8Y7Z55+k9uxvRpZVsfc5Ud93mIOsS8XCtp\n3PpYeYaAIVA9BIx4VQ9bKzmEAMQDq9dTTz21HukIJU3cT6xXDIwE18cRl0N5+qtIXLE6novgd+o6\n7LDD/BQQWLr0dBHhSVFlugiAC3+5SEwXVi/m59KkCwtXKVYuyp4/f4EbGkxeevucudmJTjmfVMEy\nt1WwyPe1116dVBWzehn5ykJhB4ZAQyNgxKuhuzd5jWNKCQiFJh3J03J9jcS1iO5xCgQsvOYi5eNa\nxMUoy//gXsS1qJf/EfIVXnMRgiWuRUgXVi7OrVmzxpMy5jYrh3Sh60FdDnK/DCxefEmYBlm95k3X\nPfhIIenzfAmWPBdYvSD5JoaAIdCYCBjxasx+TXSrcLFBZNIyrxdkCzepWCTiAhcig/VMW7r0mous\nvSgxXVi72rZt6yc1lYlRIWFRay4K6WIvli6C6B955BHXvXv3skgXbR40+HRvfZs65dq4IKhJOTLP\n17LHlqXClScuaSNfNbk9rBJDoOYIGPGqOeRWIQQGwkFMk1iSkopKtXSlXNqul//JteaiWLpYT/Gt\nt97yVi+W/sEyss0227RYcxGyJV8uYulihnpI18MPP+yOOOIID3Op7kUyEfQ/eNBgP19WLSdHjeu+\nwOX
[... base64-encoded PNG image data elided ...]",
+      "text/plain": [
+       ""
+      ]
+     },
+     "execution_count": 88,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "Image(filename='sentiment_network_sparse.png')"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 89,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "layer_0 = np.zeros(10)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 90,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "array([ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])"
+      ]
+     },
+     "execution_count": 90,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "layer_0"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 91,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "layer_0[4] = 1\n",
+    "layer_0[9] = 1"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 92,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "array([ 0., 0., 0., 0., 1., 0., 0., 0., 0., 1.])"
+      ]
+     },
+     "execution_count": 92,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "layer_0"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 93,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "weights_0_1 = np.random.randn(10,5)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 94,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "array([-0.10503756, 0.44222989, 0.24392938, -0.55961832, 0.21389503])"
+      ]
+     },
+     "execution_count": 94,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "layer_0.dot(weights_0_1)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 101,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "indices = [4,9]"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 102,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "layer_1 = np.zeros(5)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 103,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "for index in indices:\n",
+    "    layer_1 += (weights_0_1[index])"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 104,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "array([-0.10503756, 0.44222989, 0.24392938, -0.55961832, 0.21389503])"
+      ]
+     },
+     "execution_count": 104,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "layer_1"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 100,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [
+    {
+     "data": {
+      "image/png": "[... base64-encoded PNG image data elided ...]
ilb1/37HMLGqEoVgbT\ngGmglTVgsNPKCrbkTQNp0QBgsMWWW6ZFnIrl6NO7t7dUVZyQJWAaMA00vAYMdhr+FlsBTQNfauDJ\nxx9z2/zkJw2jDixUe/fb1+F0bcE0YBowDRTSgMFOIe3YOdNAg2iAEZMJPbp39etG+WGm9pdffqVR\nimPlMA2YBlpJAwY7raRYS9Y0kCYNADtbb7NtmkSqiiybbLqJW/L+B1VJyxIxDZgGGlcDBjuNe2+t\nZKaBnAZo6ll33fVy+42ysfnmm7sPPjDYaZT7aeUwDbSWBpZvrYQtXdOAaSA9Glh9jTXdN1dcKT0C\nmSSmAdOAaaCGGjDLTg2VbVmZBuqlgf/3//5fvbJu1Xy3+uGW7lezbmrVPCxx04BpIPsaMNjJ/j20\nEpgG2qwG2rdrvKa5NnszreCmgVbUgMFOKyrXkjYNmAZMA6YB04BpoP4aMNip/z0wCUwDpoEyNcBA\niR1+0KHMq+0y04BpoK1owGCnrdxpK2eb0wDTQxx55JHuu9/9rps8aVJDlv/Tzz5ryC71DXmzrFCm\ngTpqwHpj1VH5lrXzo9/+/ve/dyyMBcPy7rvvNlPNaqut5n70ox/5Spt1z549/dIsku34GcABHLqZ\ns/70009zWmH7ncVv5/YbZeNvf/tboxTFymEaMA20ogYMdlpRuZZ0sgaoiG+88UZfKWN1AF6AGFkh\n2A6DIIg1UHTKKae4l156ye2zzz5u33339QvptMXATObok+Xuu+9upoLvfe97XjfolXhXjLuq2flG\n2PnjH99yXbtu1whFsTKYBkwDraiBbzRFoRXTt6RNA14DVLZXXnmlXwATgAVQ2XjjjcvSkCp5oAkA\nIq3Ro0eXnV5ZQtTpIqBPwAj0hWGrrbbyukAfITTymi+33HLug6XLXCP1YDr0sCPcrn37eFAO9WDb\npgHTgGkg1IDBTqgN2666BkLIofIFSLDkVDNQ+ZPu9OnT3aBBg3JAVc086p0WQCcLThLgYL0J4TH8\nD8M24+z077+/OyLSz4AD9qt3caqWf8cOHd0DDz7QJiC3akqzhEwDbVAD5qDcBm96rYqM7wiAIx8S\n1tUGHcqCdQgLz+LFi31zDftYkbIe1GRHeX784x+7888/3zffUS6a8KZNm+Y++eSTXNMezVaAjeDm\n//7v/9y///1v969//cv985//dF26dImufznrasnJP/fBh1279u39/c8dtA3TgGnANJCgAfPZSVCK\nHapcAzRTASBYXNiuRQAKsH4AVVg6WCNDlvx5ZL1hHToY46QNKGK9YVGZBDfoF+sN+0AOC/v/+7//\nm9vfaacd3MUXj41ijiZ65sPTTz/j2q+/QebLYQUwDZgGWl8D1ozV+jpuUznQbCXrDRU2AFKPgBwA\nD01cAE/ov1IPefLliZxAmSAnDjiCG9YKAA1BoAPUsAhytAZ0BDtaHx75uJx+5pkN0ZRFE9Y1U67J\n9dKTfmxtGjANmAbiGjDYiWvE9svWgEAHsKAZSdaHshOswoWyMAEUaQEe9CS4ydeDCrgRNKKGEHBk\nwYkDjoAmDjnh/jXXTHWg0tQpV1dBu/VL4vppM9zdd81x99w9xzdd0uQX6qt+klnOpgHTQBo1YLCT\nxruSQZlC0MGSkqaAPEBPPYFHPaiAnCeeeKKZeuhBRUWNJSoEsjjgyIIjyCkGbmTl0Rofn7322su9\nuvB116Vzx2ZyZGmnV6/e7uSTT3b77dc/Jzb314Anpw7bMA2YBgINmM9OoAzbLE8DaQYdSgREEKgI\nawk8WBvID9gqpgcVMuYDHMFKCDgcC/fDbcUXIJHuN77xDe8H1L379u6qq67KrHUHqw4hBB32dX9Z\nWzANmAZMA6EGzLITasO2y9IATS4ADxV7moOsO8jZWk1sAA5wgwUnPhI0PaioiNGXfJkEN+gNMGFf\nlhsBSxxqtJ8EN5yLAw7j67C8/PLLbs
0113Rrr722Gzx4sJs4+Rq3+0/7pvmWfU22pcs+dIcdeujX\nrDphRO4vFrLWusdhXrZtGjANZEMDBjvZuE+plZLeVlTuVPJZqFyADeQERqoVSIsKNh/gADcs0k8S\n4AhSWAtm4us43GhfcMNaQYDzX//1X47l6aefdj/5yU887Kywwgpu9uxb3MMPPexmzf5VpgYZHHnu\naLf47bfcrJtnqqiJa55HgFI6T4xkB00DpoE2owGDnTZzq6tfUCoUxn958cUXm/maVD+n6qWIBYpK\nEEADQMoNgI2WavagEuAAMoKZ+DZxBDgCJ5qoWAQ34RrQ2WWXXdzyyy/vz7NmOebY412nzl3cmAtH\nl6uGml7HuDqnDjm56EEEDXhqenssM9NAqjVgsJPq25Nu4bCSsIyOrDtZCkAKfjw4DRf7zx9IEtzk\n60GFLkKAEoioaUprYCVcBDOF4EbxSUPp5gMcwcznn3/uXnnlFdenTx8PNzouEFqyZInbp98+btjw\ns93Rgwem+hYufG2R27//vu7isWO/5qtTSHADnkLasXOmgbajAYOdtnOvq1pSLCNADpVJscBQVQEq\nTKwYUFMPKpqo8gEO0FSoB1UccJKABpCJA4/ghnVLgCOIYS2Qef/99x2ws8022+SO6ZyauIAlyjVi\n+Ah3w43TXY/uXSvUautcjp/Occcd7zp27Oguu5RBEUsLekax6FkwDZgG2qYGDHba5n2vuNRUHMAO\nlX0WAxUgwBO37ghwgLl8Pai4rhDgqIkpBBbBjI5pX/Cj41rHAUeAArAkwU14DIsN8TbbbLNmoCNL\nEGsC5aMsAw4c4J588slUAo9AZ/1oWoirr55U9qPGfSUY8Hg12I9poM1pwGCnzd3yygssq44qkMpT\nrE8KgBqVH01PlAkLThxwdt55Z3+eOKoo1YyE1ICNrDdsC1YEM/F9wY3WOq90lLbghrUAJ74W4HBc\n52i2osfVpptu6o8BNqShINBhH2CjvPQSGzhwkDt7RLosPAIdZJ0xfVrFFkQ9r7qPpGvBNGAaaBsa\nyATsTJ061R177LH+juBo+fDDD2fq7oTyI7gqNLbDyodyUb5iA//c3377bR/9kksucWeddVaxl1YU\nD2sAoMCS5aDKPl4GKn/ghkVNdOE9E5gAKnHAEbwIdgQ18TXXaSF/ngOBieBFAKN1EtxwTPGxzmy9\n9dZu9dVXzws4KitWOSYWZc4tIIBy3nnnHLf//vu5626YXncfnmefW+B4psttulI542vKiv9VaJmL\nx7F904BpoPE0sHzjFclK1JoaoLJgBGCcdbMeKIvCoEGDPNwAciHgADnhIpiJr+Nww/n4McGNwCkJ\nbgQuApsQZnRMcVjLAgTo9O3b1xcnBGiVL1zThAfoELBoUV5k6h85AM+bN88dH/nHvBpZiIYNO70u\n3dIZNPDySy52J5x0khty8kmh6BVvY9UBdtCBAU/F6rQETAOZ0YDBTmZuVToEBXKwfAgI0iFVeVIw\nfxeVPRUga0IINmwLUPLBjaBGawFOGF9pkn4+wBHI5IOb8DiAo3To9k7F3bt3b5IvKsgiJwuWLkLO\nHXfc0d3763vdyHPO84P3nRk5L9dq4EF6XDGy85OPP+Yn+AQ8WyPw7HLPDXhaQ7uWpmkgnRow2KnB\nfTnmmGMcSyMEddtuhLJQ6fPvnkqVip4gwAlhJR/ICGxYx+O3BDiCF0EOVhpt61zcgiPAkeWGEZqx\nUvTq1avo20HzFX46NF+FgBdC3frrr++uv26qmz59hhty0olu2+26uiMGDmw16ME358bpM92Made7\nfvv2d/MXzG91mDbgKfqRsYimgYbQwHJZLcWll16a83Pg449Pj/xXksr0yCOP+DjE1bLGGms40mFy\nxKSgeKy5noWuvOxzHaGYOPjshPGS8tKx2267LZcH15Afx8oJSWWmqQN5yg00YbXWP+5iZJIe1WRT\nzD
WF4qgpg3/5VPgCm3//+9/uX//6l/vHP/7h/v73v7svvvgit9Clm4VjnCMOC/H/53/+x6dDnsAK\noxV/61vfct/5znf8stJKK7mVV17Zac22Fo4R79vf/rZfVlxxRX8taQiAZNXRVBSSv1AZdY4yJjVf\nCfCQneWf//ynXwYMONDNnXt/NGFoZw89hx52hLv19juVXMVr/HJOPW2Y6x3B5quvvOxm3zLbdy2v\nldVQwINjugXTgGmgsTWQSdihohs+fHizO0MFDhgkAQ9xkyp5IIdzOPr+9re/bZZefAdwII1C8YqJ\nE0833AdqBgwY0CwP8uOY4CqMX2ib+EllFgDJ4btQGvFzVJbf+9733MZRE0C9Qz5ALUcu4I2yqdLH\nUqNKH4gJQUeAI8gJAQcQA0q++c1vOkAFcBHUaB2CDscEOCHkAEekQVryyRHkUT5kJZR6H/I1XwF5\n8TJTPhbK8fOfHx11CnjI7bTjDm7yhAmuY4eOHlIAH5qeSgmMgsyUD8xaftSRgyIYXN6PiMz0D6WA\nWyl5FooL8HD/DXgKacnOmQayr4HMNWNRWecLVIBU4mFvLSr9lkCB6wCDt956y/dkSUq/pTS4ppg4\nSWnrWCGLC1DG3EbF9NYCmuIwqDy0Ji+6J5fSg4tKttQKVvlVe10IOkvNi0oWZ2VgJ27ZwcqBlYcF\nIFBzD3kIQMLmJjVHaS2LTHwdXiNrDWsF0k4KVMrIW6r1o1DzFWUG7mTJUlnJH6sSYZVVVnGHH36Y\nO+qowW7hwoXuqaefcXfNmeMOOnD/CBZ6uXbt13frrrueW3uddXz88AerDZawe++5y+3db1+33Xbb\nuqFDh3iH8DBePbcFPKwtmAZMA42ngf98XTNUNir9F154wVdOH3/8sYcAiR/CEBATAgiVOyAkf4rQ\njyYeV+mFa+Lr2nyQUEycMM34Nt1tlceUKVOanS4EQ2HEEHRCXQFzISyhG8pdbAAI0lQZhPe62DIk\nxdtqq638P/uwGUuVv5qzBADcG6AECMD6QpNTviYqWW7CtSw4YROVwEfwVAh00H+poAOk5mu+EuhQ\nPjXHsQbygJ8kwAO26CWFNQZ9jBt3hTsmsv6stmrURPedFd23V4z0Ei0rRdv//ucX0aCF+7vTTh3q\n495z9xx3zsizUwU6eibQLTCJH5QF04BpoLE0kDnLDuq/9dZbvVWCbcYUARCwzChQgXMcC0dYmQMP\nYWXPPs1eqjSBCdJKClwXh494vGLixK8J9wGlEKLYR37Bi8pD2fIFLB5hU16oK2CPfZrtSJeFNMkn\niwG9AK+F9FFMuQQPgkxZb7QPfAhIwjXWmrjFJumYmqK0Fsxo3ZKM6ipdLmgW23wlXx3Ah7JTFtYE\nZEfeJJnV/FSufC2Vv5bnKYMsmHouapm/5WUaMA20jgYyZ9mhwmYJQ3xfgBM2dVAhhqCj68OKnuvC\naxSHddK14fli48SvCfcPPPDAcNdvx/MNQeZrkaMDofxYdeK6QQ9hPi2lF+bBv15VbOHxUrcBU1Wc\npa7DvCgrflpYqEopR5hGuC1ZqNiBGip7+d9gwZF/DWv53oTbOiZLD9YbrmfBEkSahaAhlEXbWNMq\nsagV03wlyJE1B6sWFp9QHwI1ydXIa55xdG4Wnka+y1a2tqaBzFl24pV3oRsWVoBU/EkhbhUQKMXj\nxuPFz7NfTJyk63QsqWzxNPPJpzTC88AAFVahEMYvFE/n0vZvl3ssy1doFZO8pazRVQg5XAvwYOkh\nhOcVj3W4CAoECrrOJ1DiDxUuoVzALLb5iuYqgQ5WHYLgjDU6oIwqm4/Q4D/o3Cw8DX6TrXhtSgOZ\ng502dXessDXTAHCiypzKncAxKvuwKUdgEwIAx3S9BGa/kkBFC1huXEHPtyO/ms4jPngg8IYvDmAj\n0AF2sOhQVvRA+WSVYq3yUq5Ky1aJXmp5rYCn0vtQS5ktL9OAaSBZ
A5lrxkouRvLR0FISNu+EseOW\njbglJYzb2ttJMsblC8uUJE8oP01gVF6Flpb8kMI8qHiphBs1UIkDLqroaYaSA7KcjEMHY8GArB4C\ngUphgOZCdM1Sbhg9Ov/ggXJKVu8rQAfwUdMVgEf399CJGp0AQW0tyKomK1tbK7+V1zTQKBpoaMtO\n2HQFNOCIHPeBCXs4AQrhNbW+ycgX+tOQv5yn2Ua+lmAnlB94otwhAJFOuYHKtxp+DDiBxyGuXJl0\nXUt6UbykteBElTn7ACIVvIJAhjiKz7lwW3ErWQM6PXv2rCQJD6TF9L4CdrQAOgTADYgDdORzpCYt\n6UDCAQDI++lnf3MLFvzGH1YXc3Y6/KCD23qbbf3xjh06uFVXXdmXTQDhT2Tgh+eesrKwbcE0YBrI\nngb+8zXPnuwtSgw44M+hipUxeMIeWeyHMBEHjRYzqHKE+Ng37MsfhayKkQ/YoeLHl4Vy4wxMmQVB\nSlM64VzopN1SkaoBO5KlpbxqdV6+GeQn4MmXd7XhRvlU2uNK6bAup/mKpq0k0FETliAPXf36vvvd\noxGYL1u61MPMFlv+0PXZpa9r376dF4Pu5QQGHHxvyRK//eKLv3evvf6Gu/vue/x1O0Vj8/To3jUn\nq4+U4h8BD+XPGqylWK0mmmmgZhpoaNjBooHTqoABAAi7qIdaJm6+budhvNbeRlbJG8+rWAdc4mmE\nZKw79FhKCkBRKaCDxYHmkUYL+sfeWiDTkr7IH9ip1KJDPtyffHNf5Wu+AnSAmXjzVQg6d945x02c\nONGDyv4DDnbFTBDapXPHaKqJjr744WSiQNCj0ezqd865210y9hI/u3m/vfdKvdUE4BGUGvC09FTb\nedNAujTQ8I3wVPwtVeiATjXGa6n01haCGUCs2KYaytsSuJEWZS4l8LFnbqxGC/xbrwZolKMXQIdQ\njcqTclS7+eqee+51ffrs4kHn8IFHukVvLHJjLhxd0aSgANCQk05wWIDGjZ/gFi9+122yySZu/ISJ\nVWkm9QptpR85K6NrC6YB00B2NNDwsMOtoKmGwfTi0CNrDiMLp6FpBfmQNYQa5EL2QiCU9LgRn1Gm\n49eRNiBEmcN8ktKIHwN2mBurkT70/FMH4KoBG3F9tbQvPaLXaoRym6+w6sT9dF577TV35OCj3aRJ\nkxyQ89hj89zRgwdWQ8xmaWDxGXfF5e7Vha+7+fMXuG5du0UQnn9KmGYX12nHgKdOirdsTQMVaOAb\nkSPml0OkVpCIXdp2NECFSuXcCM1ZwAYOtjfeeGPNAY58ASwqzmoE7gdWndVWW81hLSJdXm11M6fH\nFRN7hrO10/2cpjtAh15mGhSRUbUnRBaXgRHsHDnoCNe+3XrVELGoNJhc9LxoOolevfu4sWPHVE0/\nRWVeYiQ1adXLKliiuBbdNNCmNWCw06Zvf+mFv+uuuzzoyCpRegrpuQIg+PTTT71AjEVDpcXS2lYe\nQId8qhW4Fz/+8Y99cnOiyTn33Xff3HADGk9Hs7cLdsIpIeheD+iwXHDBRe6xeY/65qXQz6ZashaT\nztJlH7ozzhjmweyC80e1+v0oRqZCcap9PwvlZedMA6aB8jRgsFOe3tr0VUACH/jWhoLWVLIcgnHm\njYdVV13Vlw0g0aI4lTgxt5YlgPtAOQA2YJSAVQeHZKAmtOoAO+xzjt5XdC9nDCGg6IwzzvRWnilT\np9TUmiPdxtennjbMzb3vXjf7ltmpf9YMeOJ3z/ZNA+nSwH9F5u/R6RLJpEm7Bj788EMPO1gQshqo\n5Kn0WQCEDtE4MAyktzTqTv23v/3Nvfvuu96XZ/r06b55iCEKXnzxRT8zeLt27TwkUPZi4YemJfTW\nrVu3qqqM1/eWW27xzVdUuJRLzVdx2NFs5hyXnw5WHYDozDOHu29F106Zck0qQAcl7fbTXd23V17V\nnXbKULfDjju49darXXNaqTeJ
pl30z9qCacA0kD4NmGUnffck9RJRcdN7ZvHixZn+uFMxXXnlld4i\ngtLxb2FhiIJHH33UL1RgH3/88dfuSadOnVzv3r1981GvXr28PoiUBD/oi1DtirBazVennXaG+8Zy\ny7lrrrk6NaDjFfbVz/XTZrjLL7k4MxaeavpihXqwbdOAaaB8DRjslK+7Nn2lev7g3JvFAOSwACJY\nQkJrCHNE0azDIvgBLJ588km/fPDBB18rMtaeEH7kQ0PzEs1+1QYdBKhG89WkSVd7HUy9dmoqQUeK\nHnnuaPfKyy+5GdOnpdppGXl5Vrjf3HcLpgHTQDo0YLCTjvuQOSmABKw7NO1kzXcH3xkqI5qv8MkJ\nQUcOvTTtsNDkw6KAn8tnn33mXnnlFQ8+Tz31lKObdjzQnERT0eDBg91+++3nsP4oJFl/dK7YNc1X\nlfa+YsDJMWMudndFoxpr8L9i869HvEMPO8KtFvlTXX31pHpkX1KeBjwlqcsimwZaXQMGO62u4sbN\ngAoXYODDnqUgX6O4M698XIAc/FuYNworD8ex8BAAGICHRdusgZ6XX37Zr5977rlEdfTo0cNDjyxA\n+udfKvygb1mOyu19BdQdfPDBbuyll7sBB+yXKG/aDtJLq3cEpxOikZz79t0lbeJ9TR7uU2tZ9b6W\nmR0wDZgGCmrAYKegeuxkSxrAqgM8AD5ZCADOkdFYQfrnjcxYdgAaWXVwWgZ24sBDXMBGSxL04NyM\n1WeNNdbw4MM2IEQvqHiQ3w9WH+AFSxmhJfipRvPVueeOcmusuaabOuXquFip3tc4PPMXzM9EMxEW\nUAKWRAumAdNA/TRgsFM/3TdEzkBDz+jfNr47spiktWD5ZBXsyLIThx0sPVh4sO4QFxhhAXpYC3re\ne+8935OLUa85p+NsL4kmxAR61PxVyO9H8CPrTQg/1Wi+YmTtiy8e656IfJBqOWBgtZ4LmrO6dO7s\nRo4cUa0kWzUdA55WVa8lbhooSgMGO0WpySIV0gCgc8opp/iut2n136HC6RlBGUCGY3IY4j47NF8B\nPFoDO1h9WAAi4mtROoAOUMIUHGETV7gdAhAWIByeBT/5/H769Onj5abpi/S33nprn2W5zVcMHDgs\n6ma+XTQtw9nDh0n8TK2ZSHSLLp0y1RvQgCdTj5gJ24AaMNhpwJtajyIBEFgd6KqdNuDBIZl50AhJ\n3eUFLlhuABqsOACOFh0DdOIL1/7ud7/z49wwb5iCrD4CHNbhtqw+IQwBP1h/WL/++utKKnFN13gs\nQOSPTMgMoGlKiHyDB2LVueCCCzNr1ZEyGHBwrTXXyIx1B7kBHp7FtL0f0qmtTQONrAGDnUa+uzUu\nG74w+MSkCXjU80rTQjB3FDJi5QkD0ADsCHhkyZGDsvYBC+JoDXRsueWWbpVVVsldz3kBFHlgkdEi\n6NE6CXoERbL6AED5nJ47R805O+20k8P5edttt/VA9cUXX3h/I2Qm33Duq4suGus223zzzFp1dM+e\nfW6BO+rIQX4Wdh3LwprnEegx4MnC3TIZG0kDBjuNdDdTUBY1aaXBhwcfHZqtABusTmxreohx48a5\noUOH5jQGnBAEKXELTrgv2Jk3b57bcccdc+ATj0M8LUpX+STBTz7wAXrovk7Ybrvt3JqRYzE9v5L8\nftZee23f1EVlyrLhhhs6RkkGxoAf4IgZxrPQ1dwXuMBPv336u/367+MdzgtES90pA57U3RITqA1o\nwGCnDdzkWhdRwIOlJ+4fUytZ1KwG5OBPRKCSAXBmzJjh94844gg3bdo0DzjAhwLbIZwIWLT+6KOP\nHGPUYFER4HCOba3DbR0L16TPfhiw6JC3LDtq4sLhmfQ6duzo7rnnnpxPENaqxx57zDd9Pfvss346\nijA9ttdaay3XpUsXD2X4FX388X+7e++9Ox4tk/vjJ052r0YgmLUeZSibZ1EO85lUvgltGsiYBg
x2\nMnbDsiIuH3JghwB4YF2pRaCJgHxZA13xfIEMrDqnn366FweAABgYDyVubQkBSPCDzw+ws9VWWxUF\nNknQEx7TttJnTZAsDBz40EMP+WM0mWHV0TniYq3Btwh/HZb58+f7gR6x/KCDMGy3bVd32MCBbshJ\nJ4SHM7uNo/L+/ffNXFNWqHCafOPPaHjetk0DpoHqaMBgpzp6tFTyaADLCrBDExLbG7fSeCOADYB1\n1VVXeesNeWnQvlA0AAHAABz23ntv79jLeYCHnk6hVUWWFc4DGMAD11MG1lqw0GgRvIRWHI5p0XHF\nC9faVlqMTn3SSSeRfTQj+Rmuf//+fhtZVA45UwM6QA9pcH6FFVZw3/nOd9z777/v9fL8889HY/18\n4a67/gbXo3tXn04j/PTq1duNGnVepoHBgKcRnkQrQ9o1YLCT9jvUAPJhsqcpiRnE99nnSx8L4Kca\nAcDReDSkl9TbSvkITgACIIGeSYcddpgHAuKMHTvW/exnP3PLL7+8hwXW8qMhH3p0ATphIE0FIEV5\nCGoELtonby06Fl/rvLqZM9v3nXfemQMc4iM/C93j1UWefQKgg5/OSiut5OhqzvrPf/6z23PPPX0a\nkrcR1vTK+tFWW7hBgwZlujgGPJm+fSZ8BjSwXAZkNBEzrgEsLFhePvnkE+80C/gADTQ30TMKGCol\nUDEoDZoAwi7fQImCwCNcAwpa8GWhiYgmKcKIESPcgQce6Ltvh5YSrEDkEeajPNSkxFpgBCTRA4r5\nsVgADxYsLQIQQQj78YVzv/jFL5SFu/XWW/313/rWt3y65EOZaMJCTrqbh6M9c5wFaFIz1xtvvOEG\nHHRILs1G2Vh7nXW8w3XWy8NzzHNd6ruQ9XKb/KaBWmnALDu10rTl00wDQAkAxPqJJ57wIAEA0YMo\nqflJFQG9qYATKgcWWYiAH5qwVo0miqS5iTQEOWHGHMMCQpMPFhFNC3HOOee4O+64w0dt3769mzt3\nrsOigg9M3759vbUHyBDchGkW2iY/BbYBLQIgon1ZcjjHNuP27Lbbbj4eTYA0twle5J+D3IylQzdz\nFo5zPc1wQJHgCthif/Lkya79+hu5cVdc7tNtlJ+5Dz7sZkYO57NuntkQReJ94D1IegcaooBWCNNA\nnTRgsFMnxVu2zTXAR55/tUBNUhAEAThJgWs5BwzRHZxu4YIJ1kCKAkARQgOWEfZvv/32aBqFixXN\n+/4MHz7cW2ew1KhZC6AghGnmLipiA3kIrLXI2sSacXvefvtt39sLAOOYLDSy5CAzozADPIAP55EH\nGYEbWYEk93XXT3OdOnfJ/Pg6cfU2GuxQPgOe+F22fdNA5Row2Klch5ZCSjRAJbHzzju7zz77zF1y\nySXu5JNPzjVZIaKsMsADwIOFR01AAAPAg1XlxBNPzJXo3HPPdccff3wOIAQ8WHmUZi5ymRsh/Iwa\nNcpddNFF3jKD/xHj4yArMCNLFIAD6LAI1EgDmYAbrDqCHFmjrplyrftBh44NBzvMhL5++3YeGstU\nfyov41nGuoOVx4JpwDRQuQa+/ItaeTqWgmmg7hqgeQtYIGCRwQFZzrtYRNgGaIAHAvCDMy8Aw4LF\nhhGWx48f748T58ILL/ROy0BF6MdDGrLKEK+SIAijuzigQ7j55pvdOpE/SmilkawAjBaghqYq/H5o\nwqOCZKEcrDmGDxAA1IghixOZFnMfsGQS3okNH1DMtRbHNGAa+LoGDHa+rhM7kmENMGig/F3oaUUv\nJKw2wEoILFh34s0+TMYJ9NC7C6dkBvMj3H///d5vhxGLBU1YhUijWsBDPkdGDtsEeqzRzRz5ADDW\nCspPlhwACNABbugtxiLgAXSwDLEQx0K2NCCrjgFPtu6bSZtODRjspPO+mFRlagAIuOGGG/zVjDFD\nD6vQz0XNVTQLARLAAtaTBQsW+KkUGGSQYwAGgw8eddRRPq
1Fixb5uaeeeeaZXM8nWYmqAT2jo3GB\n8DcCWm6MHLcJlIW0WeSzg3UK0MKyhPzIHlp1BDsCHc6xrPitFX2ajfbDwIIdftCh0YqVK48BT04V\ntmEaqEgDBjsVqc8uTpsGgBQsG3PmzPGi3X333e62227L+bsAO8CPLDMAA1Mt7LHHHo5eWHQPZwEi\naCrC2kKzFgG4weJCExPpqFkMEFHTGIBSasA/g5GSCYAO8pMOC+kiK3mTnxYAiLICZjRjIXPYnZ1t\nHfPrVRrTsvPekiVu6222LVXlmYov4OE5sWAaMA2Up4Hly7vMrjINVEcD70Q+CY9HPbC0JlV6VmnC\nTsa20ccePwa2e/bsWXDWaACmV69ebsCAAX6MGpyMO3To4DbaaCMPD0AEcXDwff31190uu+ziC8Mx\n+cKwrSYr8mVOqn79+vl4TDWBI/Oll17qrS74zbAQuJ70w6Ynf6LAD0BFoPlKXenZj1t0kCcENfKS\nzw4+OQAa4MMx5JcMpLPWmmu43zz/W5JtqMA9bAuB5573AuCRP08l5eZ9Iy0tpB2+d1gYlY/eO9Y9\no3fPgmkgixqw3lhZvGsZl5kPLBYMDSiojyhrrBoEfVSJqw+xPsw6BhhokUoECFhAsL5svvnmvncW\nMPDII4/4aMDAn/70J28x6dGjR86Kw0lZUbhWViDS4jiBZq0//vGPfrt79+5uypQpfjweIAMrixyd\nAQ3Bho+c54fmK6w6VC5UQLLqqBzADb5GGlOHbSxJpE05ZL2hqUrAgwzx/JkOY9y4qyJouyuPJNk8\nfPEll7uVvrOiGzrk5GwWoESpeRd4TnhXSg3he/fuu+/6nou8Z4AUCyH+3nHs8eDPCPkTR++d3lfi\nWTANpFkDBjtpvjsNJBsfSeCGyp3AxxKLRjkfba7nw81HmEH3SJtBBVmABpp+aPYBFGiiYlA+AoMD\nYuVZEjV9YAVhBGX1VFKzFVYZwAbA4XpBj4CH84Ca/IK47sEHH3SdOnXyaQI8LFhWQuuKFyD2Qxk0\n1QXNbuiE9Fnko0P+DBoo2KFcnAdogBvkVxkEXEn5oiP8ebi2kcIxxx7v/vynZf45UIXdSOVLKgv3\nkmcH6Cgm8LzyngBJghTW5QTS4D0mTbZ5hzWaeTnp2TWmgVppwGCnVppuw/nwYeSDCNjwcWSpZgB6\ngCgqAPJhfB0AADAAFiZNmuQuuOACnyWWGSqJ73//+x4WsIgQF1AAXAAFQtzCQzoAD2lidSGvIUOG\n+Lj8XH/99W733XfPpQOM0Myk9JKsPOiD5jqar6hACMBICGuy6gA7wBfnSBd5JTvWHcAHSw/n4nmh\nH8LoUee7s84+2+3+075+vxF+OkZjB82+Zba3iFH5KnCPGz1wXwuVk2eK94HA+3Fkld873gEgijnv\nmJuMbbP0NPpTl93yGexk996lXnI+hnxg+ScK8BT6MFejMIIeKr1f/vKXfuJLWWfwy6FrOQH/Gz7K\nagYSNAAQHAMWBB2y8CgdoAfgATrIZ+DAgTnRmaFcIy4DTlh4gA8WQgghVD6t1XzF9BthkN4vGnOx\n+8c//+3GXDg6PJ3Z7WefW+BGnj0imrF+3tfKIMDjBBYflkYMScDDc8l7JxhhuzUD+QFVyMJzLcBq\nzTwtbdNAqRow2ClVYxa/KA3wL+/UU0/1g/zxAaxlmDZtms8bEKHrOeAxa9Ysb/FBDvx4sMRgdeFc\n6PfCvoAHC05oZRHwsFazFiB32mmn5fx48AG65ZZbchaeJD8eKqFqNl+9+eabvpkLmGIR3MR1zj/9\nq64anwgH8bhZ2B957mj3P//+l7vs0rEFxaUyZlHIpx+dz9o6BB7+VAAbAA7vXS0tLchBvlgskaOW\neWftnpm8tdeAwU7tdd7QOVL5618elWu5PjmVKompFugmzkzrzHe16667+sEB+RgT6FFF8xFWF5qA\nsO5oAXjkaCxHYaw5AA
6WHZYQeLACMQmpJhLlesbtadeunYcp4EnNWsAIoFNJ8xU6/vjjj3NAxeCH\na665ZjPLkS9kwg/NPuPGT2iIpqxevXpHMH1eXrhLKL4/RKWsgMWHJeuBMvEHgzXvXb2AjmeTdwyg\nr+f7n/X7afJXXwMGO9XXaZtNkQ+dPrJ8dOv9zw7gwRGTnif33Xef99MZNmyYmzlzpr9HM6LZsqno\ngJEQeLD0ACzyfwFmcBhOclwGejgOFFHm8847L3f/77zzTkePLTWPATxMP8GUEAz6h1zoiPQFVaQn\nPx0ck9lGr/QAw0qEXFim6EqPzAIzWXVymefZGDNmrPvrRx9nfvbz66fNcDfNuLFiK9U7gdWHe1Ev\nOM9zu4o+DGDgO/Piiy+mogxYlQRfWdVp0cq3iJnQgMFOJm5T+oUU6MiEnQaJkYneWVQE/Mv89a9/\n7TbbbDMPPVhngBy6owMKQAOQI+tO3OFXTVpyXAZKQiuP/Hj4Rxs6Lp9zzjl+IlGAB58hZmQnAELq\nESOYIg3SBHLoKi6QwoGakZ2RiW0WtklTPb8oQzGByn2TTTZxry583XXp3LGYS1IZ59DDjnAH7L+f\n22+//lWTj+eF+6fAs1xvYJcshdaypADblAGAT0NQkxpyGfCk4Y60bRkMdtr2/a9K6dMIOioYIMFC\nhcBoyvzzZRoJZkcn4LiMNQYrDsAj2Al7OKlHFengw6Nu4XHgoZmLc+iDXl9//etffR7y4+nWrZtj\nfi3m7rr33ntzPbUAKebiAnaUZufOncvufeUzLfAz7MzhkZz/l1nrztwHH3anRuPqzF8wv1VhBPDh\nXhLSavUJQSeNYGbAU+BFtFM11YDBTk3V3ZiZpf2DC6QAFMiJrwzWnLA7Ot3SaX6jmQmLCcATWk/k\nb8PdiwNP6MeDVQZgwfpDPHpbATHxQPMV0MPov4AUsnXt2vVrzVeAExYbLFChEzUyltp8FcqAdWe3\nn+7mbrhxuuvRvWt4KhPb+OowvEA1rTotFRzoSZvVh6YiYAK50gg60inNWSxpl1Py2roxNWCw05j3\ntWal4iPGR5cKNM0fXEEKzrxYWrDmMMjgwoULva7ydUcHLORgHFp4ABT58WCNkUVG2/Ljofnsiiuu\nyN0Ppr+45JJLPNysvfba/jjWIqCJ5istQBMyC8Aqbb7KCfDVxvgJEyPoe9Tdc/eXc4jFz6d1nxGT\n5z/3bN3lrrfVh6YhmkGz0kSErAAj8lowDdRDAwY79dB6g+QJ4NAWX8/eH8WqEnAgvP32227rrbd2\nN910k/dd2XLLLf1xwIPeVGF3dFl45GAsh2V/QfQDpLAANsCKgAcLD9NR/OY3v/HnQ6dlrmUU5+OO\nO86DDJYboEkWIgYPZJt0yY+8JUfYtBaXRTKVsu63T3/XrXsPd/bwYaVcVre4jKuzfY9uqXHClSJq\nbfUhP/xy+JORlTFtkJlvBfJmRWbdX1s3hgYMdhrjPtalFDT98AHDupOFAPCwjBs3zs9kPm/ePPfq\nq6/mHIWPPvpoPxIsIIFFR01HrOUMLMgQPGHhEfA89NBDOeChmQmn4rOjEYs5Hg+Mtjxx4kTfTAXs\nhP46pAd0Vbv5Ki4D1ol+e/dzvxh3pRtwwH7x06naX7rsQ3fYoYdGTZGD/D1KlXAxYUKrD1DCUs0A\nLJBH1qwkyIuFB9mrrZNq6tfSakwNGOw05n1t9VLpw5X25qu4IoAUAIVZ0bfffnv/L7OY7ujAD8BD\nsxIggkWGjzbj+JAeC+kBLbLyPPfcc+6QQw7xIjDwIB96Blr87W+/nH183XXX9QMQrrLKKv46riUd\nAr2sBFuh/1Cpva98Ygk//NOm0pz36Dy3wjdXcDNvmpVa/x1A57jjjvfw2NIAgglFresh3g8WBf4g\nVBJIi950DKuQRWDAb46Ar5EF00AtNWCwU0ttN1BefGgxo+vjlaWiATxYdfbbbz/38ssv
e7DYdttt\n3dKlS30xnnzySQ8zYXd0wOONN97wXcMBHmCHwQHxUyI9QZSsNAAPlh0G/0NXs2fPzo3HwwjP+tgD\nL1iaGDsH0CFd5Ss/HZqx5Dsky1Il+qbCBLxw1ib07burey9ymk6rw/Kppw1zb731Rzdj+rRU+4UV\nc09CawzPBUspgfeNa3j3shiqBWt0MsDnjoAP3FlnnZVFdVRV5qlTp7pjjz02lybfpFqGSy+91E+X\nQ54vvPCCwz8yTcFgp4p3gzFc8AkhxF/AQueqKEJNksJHB6sAH64sBsEJ1p0ddtjBT/fAGDg77bST\nL044O/qyZcty3dFxbGZUZABF0AGcEJQmwEIzFM1XckzGdwerkHprYcHhYxB+oOldhDwCHaw9jBHE\nOrQqkZ/y9BmX+COL3KeffurTZ5+mSLqj33v3XakCHiw6o0ef7z788MOGAJ34reL9Cd+hlqw+xMWq\ngzUxzZ0B4uWM7+sPkoA/fr6YfX1PN9100wiE3yrmkoaIE777U6ZMccccc0yuXPWGHQTRfQF0+Mal\nKSyXJmFMltpoQBUma16QUgMfqSw7Gar8/DvGURnfGEYkHjVqlFfFww8/7LuNAy0ADt3CGSMH6ABU\nsN6ouUm6U5pA0CuvvJIDnRtuuMFtsMEGvkmK67EKAUadOnVy1113nS53EyZMcFdffbW3/nCQdARV\nYdMV+ZQbuG8AFaCz1VZb+WY4QAd5aB4686wz3VGRT8ytt99ZbhZVu05NV40KOihq48hCA+BoATy1\nhBAkpfK8Mrt4lkGHslAORnumKbWcgAVBfyrDPwzlpGXXVFcDuh801ZdTt1RXmuapGew014fttaAB\nPsIMzqd/Zy1ET+1poIFK5r333vOmV/xrgBr8bgiMj0OlomYprDL0tqJ5imOAEMADKCgIRHB0JuCE\nfNBBB3lIoimKpjAsN4AM12Ht4YMADBGALHx6gBECceQfpLT9iTJ+uF+DBw/2V1JhUqlS2ZIHC+U5\n7LDD3HkR8I0cMdzRxbtegUEDe0f3hmZAusZnvXIvVo+CHtYEgQ++YQRZVP1Ohn947hjUk/KUGrBq\nATuE1VdfvZllo9S0Gi0+Vh69z6zrEQ488EB/X8h7+PDh3gpZDzmS8lwu6aAdMw3k0wAfKCbQbIQK\n6IknnvD/lOnuTdMV1ptrrrkmV3RBi6aIiAOPYCf8sDCQIH5AzH2F1QirDJYjIIdFY/aQiZq8+De0\n5557+nxxPB0wYIB7J4JKAASwygdXOUELbKjLL/+kCfgHYeHh/unDiByCut12+2k04OJE9+Tjj7m9\n9urn6O5dq4A1h5nMGR354rFjW5zNvFZy1SMfgEDwwzaWVCAYS1wjBOCb57DUwJ8DgIcQNuGwzzn+\nFLDId4UKV8dY826pgwDXxAMgRVNMeE1oSYrHZ5/0SFfXrLHGGl4WrE86xlrWKKXBflw+4nEsLiPf\nJ86FgTJyTBaUsPyKi67Y1oKc8RCPc9tttzWLgn+U8lI6yKN8w8gAKMBDIN14WmHcmm9HH7y6hsi3\npSlqdwVDcwvHonbYZnJFD3bufKTQr50P02A7HiJH0SbSDfNhOymv8Npi5eOaUAauC0Ohc8SL/tU3\nhWVEtmgqg6aoXTZMJrcdpoeuuD5qJ82VDx3FZSC9ePm1ny+fXIZfbUSg0xQ52MYPZ3b/d7/7XVM0\n0F9T1DzVFEFP01/+8pemCOhyejrggAOaIoflpmeeeaYp+gA1LVq0qGnJkiVNPE/RAIBNEQg1RbDg\nyx9NRZG7bs6cOf54BBFNkTWo6bPPPmuK/H+aIifnpsiK1BTN09UUwVBT9MFoirqgN02ePLnpzDPP\nzF3PfYnAy+f10Ucf+bxIh/TIT3kWUjzyRH4/Pk3WyKTA9aRFuSlH9GFqisYG8vlFH+GmaOLRpmHD\nzvLP9CmnntEUzaWlS6u+/mDpsqYxYy9r6vCDDk3H
HHt80+LFi6ueR9YTHDp0aBNLowSeN55x1qWE\n8LvHNy8MfMP0PeNbGn4PdVzr+LV8QwvF53sa+aCE2flt0lGa8XX0J6bZubBOI614/Ph++P0u5tsd\nlp+0FCL4yOVFOeKBfJQ35/m2KYTnFCdco+d4uPXWW3PpodO0hP9opMYSlfpwEZ8bIUWHSo7f5PiD\nzIMVXqs0wnX8mlLlQ33hixg+qC2dK+eBiucVliXc5iVRKOaFUdx8a9KudmWErrmH3FNkTLpXHOMc\ncYjLNdUKgACVe9RM5aEk8hNpihyGc8/a+PHjPfAAKVGTQtMf/vCHpqjnVlNkNfHXCHgiPxh/DUCo\nEFlnPFBE/8r9NcDO/Pnzm+6///6mW265pSn6d9t07bXXNl1//fVN0WzsTZGPTy5fdH3iiSc2RXN5\nNUXzbDVF00s0y68Q8ACkAh3kAnwIAiVBGIDHx43yADmvv/56U+Rz1BRZp5qiMYg8RA8cNNjLBPQ8\n8+x8Fa3iNQAlyDnk0MOboslPK06zURPgHlZbP/V+7yhTCOAt3bs4IMTjx+uB8DsY345XwoVAR9fy\nDQpBgO2kb5Xix9fhNyv8fsfjhfvKr5hvd7z80k/8eBzaQhhiWyGEllCm+Ha8rkPmME48P6Vf63Xd\nYKechysOBXp4wgcnvFkos9gHkjTCUI58oRzxByDfuXIfqDC98MFK2iYPQjEvTKiD+DYVJlaQagXu\nX/iiJcle6BjX6hmoRKbIf8BDBtASNVX5f5vR3FVN7du3z720WHeeeuqppgULFngYAAywhGCxweIS\njYrs4wIY+rcq64nSxCIUTU/hLTsPPvig/9Bzb6Ju6R58br/99qbIH8pvR6b0XN6Rk7TPE6sT+ZEe\nsgJSScCDBUB6A7xCeYjPtYAd8AREAVMAHJCD9SrqPdb0/PPPe7CTJYsP1siR53jrS8+evZrOPmdU\n0/0PPFSy2oGlqyZMavr5Mcd5GbHkVLsSL1moDFzA/axWSMt7x3M6atSooosVfv/jsEIi8UqdOKpo\nqQfi3z+BRPy68Ntd6FwoD/cnvC5u1eG8vlVxa1B4Xfycvt1Skt5r1sgWhrisOheHjzA/4oTAFuYX\n1jGhLilHqMs4BJJmeG08P87XI9TFZ4e2vrBNMlIGb7JfohsW3ccvQ/SRbtYuiG9DpESd9o5qpBVV\nPP5YpHTf5TsXIdrgPOkohHmRngJpqH2xXPmUVilr2mcVogfKd9dDF9ED5Wfk1jnajcNy6LjWYbmi\nB1aH/Vq6jl4kr+PwJPomv8hiEh5O3MaPZOPIf6AaAV1vs802OZ2Xk2Y10iBffCOYnBPHYRb52Dzw\nwAM5sU4//XSvp8gi4h2V5aysbuQXXnihjxtZVJr5w8jvJgIM39OK6zkWTkuhbuYRKHkn5rXWWsv3\n1Np///19ms8++6zXFZOH4jeEkzTpkY7eGyLin0NZrrrqKn9dVJl4J9DQP0fyIDdl+Pvf/+7n42LN\nwjHSjqDIt/MjJwvv3dlnj3CvLnzVDYl8ar694jfdZZeM9XGYdiKCFu/UjGNzfMEP59DDjnAdO3T0\nvb1ejXqrMQEpz/OUayZ7mb3A9pOoARyVIytI4rlSD1bjnalGGsiN/5Gcr4sph75jxA3rgXzX8h3k\nm0pIqhv0PcUnRYHvYFgvsM+3VYG6QQE9KMSv45p839SwHMgV5hdBRLOySUblU86aPKI/hrlLw/zZ\nVh7EI38Cx1Wvsh/qEt2zT3wC14e64Fh4f8J0OFevUBfYKffhQkkhDPHgyTOfc3EY4lh4E5IeyPCm\n6CGoRD7yLDZU+kApn3i5eLDDh1sPs+KXu+bD1DOqTCsNPPw4vFVDLtIgrUpeKACOCoUA7NA9HOBZ\nb731vEMvxyNLh8OhGVgABoACAc+h0TQGBJyMGaxPAWAAbgALAEVLCDuADjDChwPYwbFZXdSBFWZk\nJ3AtlUPkO9QM
eEiffCKrm+/hgoykA3RpGg8BkWQnLaBJk46yZv8t+iwAAEAASURBVB85iUOQHnCw\nRh8sHCNQxnNGnu0ee2yev4ennTrUde7Uwa280rc9BAFC4bJCdNkxPz/aPfDgA27RG4vc1ClXuyMj\nB9VGcHL3CmnlHyC2GrpK43tH2YoN+j4TP/xuJ13P+XgcgU88fpiuKvswTvgtRYd8c1haui4pLdKl\nntI7GVldfFbUOdRlOBBX8i0L5Q63Q1nC+i3cJo4AJiwbeovrknhxvYT5hfHDtMI4td5evtYZkl9Y\n+PAmSBaUKIuHHi7dBOJTuYuw9WCg3JCQlVaYV9LDjgUlHsJrSpUvnlah/TCfQg9UvKzxNJPKxbEQ\n9OLX1HOf8sRBh/vHfec+J5UHedEX1/GChrrjGGmG/8BKKZ+sVerBIOsOH6SDDz7Yd0OPHIr9BJ6a\nHR0wABCw6GAV4trI7yZnEeHaOFzIasI5AQQ9tDQNBWkALoIjoAq4nDFjhhs4cKAvEqM+n3POOe74\n44/3cYGy++67z9FzDGjZaKON/NAAGj+Hi0gTWQReyIHsLAI2zhFUdsmlHmQcl5XHR/zqh0oYGVks\ntI4GqvUnI43vHXBebAi/GaoP8l0bVrb54ui46hD2k3orKZ7WoRw6lvTNKiSDvlmq55ROa635tvKn\nkEDefD+ROfyOhvASlpE4+jbmky+MH49T6Fw8bmvu1wV2ynm4wocbqKEiD5UYWnyksDAfjhV6+HQN\n6/C6Yh/+UL4wrULbofyVPFDFlquQLMWcw/rBP/JKQ/hvgrS4d/lMvmFeIXiSBt0fFeJp6nipa15q\nKnUqd6waVPZAFOkDBjQtMQYPIEIX88ip2GdBRcJYOkAD1wp0BC6hVYc8iAPkMPYOiwYOJF2uAYYE\nIhtHlicg66ijjnKRj4276KKL/HQXkYOzY0b1SZMmeRm6dOnimOqCZxGgIiBHKEscdMiL8wRkAp6w\nLGlBRll30Auyt/Th84nZT+o0EH9H6v3e8VyXEsLvZSnXpS0u5aAJP6xnkJFvIN9yviXxc5WWgW8C\n3089A6zJS3+Idb7SfNJ8/XJpFi6fbDws8Qc/JNR819nxyjVQ6gcqKcfwXvGCFwM68XRk4dPxME0d\nq2RNxQ5wqDmLua0IwEjUO8vDhORm9OXddtvNQwqwo0WAA2CwcC1WFtIGogAKQEdrtsP5sNgHNpCB\nj9Gdd97p+vTp4+XAj2fDDTfMgU7//v19UxvNYuQhaw4gA9CQv/xz1GyFfAIdgCaEL8mFnJwDhJDb\nQnY1EL4jaX3vCmm3tf7UhenKr5E/C/mWML7kDXWrY/mAJYQZ0qJ1gbyAz6TWCaVX6Tr8s4i8Ah/S\n5RzfGIVwm3P5dKHjScYGpZWWdV2+XuHDUs7DlWT6Sxr4KbxhKDzfwxe/GZXKF08v334oX6M8UPnK\nmu94qOt8cfIdr+TafGlyXNaLEHi6d+/umL+KEPWaauabw4suoAkBh201FQEcAh3gAYgALljYDhfg\nBysRi2Y8B3iQJ+q94icw9YJ89cMgj9E4Pd6vh3yAKlmIkAsZkkAHeSgraQt0lC8yCHSAPvKWXsK8\nbTubGqjk3ank2kq0FX4v4392K0k3bIJKgpaktNFBKE8IDoqfdIxz4XGgM9Qn5Sq2nlI+xa7154z4\nWHRCOcImLM7HdVKJvsPykXa9Ql1gJ67IUgoPFesm8bApLW5G6KxMmpwPH8ikh4jRLvUR1/VKkzSK\nffiJW2qI51PJA1Vq3uXExz+jlN4TxeShe1lM3HicSq6NpxXf55mggseiAZwABCNGjHCdO3f2UeVY\nSC8twEA+MFoLMliHFhQ1FQl0SJdFPjzKS/CBhUUL8BF1f/cWnlBepu8AdtSjSjJoX47I7CMPQMQ/\nMspH3oIrwArYCUEHefV+hHnadrY1UMm7U8m1lWgtrDSTvuXlph1aPPgjrXqA9M
gHVwa9A4yurBAC\nAvVSeB3bHGsphO4Y6DVsmm/p2lLrC+rCsKySL36cfKmbpG/yQa6wLuTasO5UWpI5vD9hPafz9VjX\nBXZChZfycKH00KqDyS90SkXh8RcxfCB5ANVGibJJK3xgJJfWihM+xIUe/lJvYKUPVKn5JcUPy590\nPjyG02spvSfCa8PtUL/cr/h9COMmbSMz9yS812GaSdeUe4zKXs1ZwAZ+MmHAqoIVRXDD1BMsgIWg\nI7TqABdJoCOoCPMDOgAdAIQ1eY8cOTKXPc1pyEbAUfpnP/uZi8bO8fmzBnIkD2vkQVYC+VCeMH3y\nQzbBl2TiQ58UeBYej/y4xo+fEPUau8h3L6eLeXxhRvXxEyb6bvDvRMMXWChdA4343vHHiZ6DxYaw\n0gwr02Kvzxcvbl3hexTCTVhnhM1M4TZph9exnS+E5QAgBA1xoMh3vY4rvzho6HzSOuk7ybHQKKDr\nwvIhJ35G0kvYmxYoCq1GXB+CUVhepV2PdV0clFEMlZUeWG5avocjVDhxVDlzc0hHVKqKj5sQ9rDi\n+vBhyOdwDBTpppQrXzk3EPnkJa8HKimdpAcqKV6px6T7Yp0Vq/XR1f1CXp4FFvSv+5lUDq7h/ocv\nkuIlvcQ619Kaj+7GCc6SvNiygAh4mJk8DFhV+vXr560lNAsRD0jAFwbfnRB0wuYrQENQgYWFIKiI\n73Oc3lZz58718X74wx+6yy67zIMKztKnnXaa1wnnmaWdMTDowo7skgNZ1GylsgA2AI4gB5kkf1wG\nn3H0A6w8/vgT7s45d7l777nL7d1v32guoe+7tddZxx3xVY8xxdU6GrAwgq4v3K233eHwLYoGJXR9\nog/sDtv3iLZ7Kpqt82gAHY0ePTrP2eIP846k6b3jW8IfqGID32jVE3wD+BYkVdLFphfGw52CuiHp\n26J48bFz+Cbz3dT3W/G05tvOdy0eqF+ok1SXhec5x3EBlupIxeEbWUhGxcu3Jn3pUHFCg4COsZYs\n8fhhHHSA7sKA/GHZKvk2h+lWvB19EOsSIiApOBdJVLBmI1IyEibHtEQPXk7uQueIFD2Quet0fbiO\nHqBmw4BzTanycU1043P5hPK1dI64oTzxbdJFnjCEeUUPW3jKb4dpRg9ts/OUN54HOmopMNItow1X\nGhjRM0mGuEzF7ifdv1JkbGkk1wgS/DxSTPOQJBOjHkfQ4UcCjiwdTVF32ibWHNPxp59+uol5uN58\n800/NUP0MfAjIUcQkjgKMnlGoOLn6oocoHP5RmDj59eKAM2PxMzIzo9F92XQoEG5OBFU+fm2yJtn\ng4Vt5GLKi+hj6aeFiMDFjwKNLJEVyE9rkU8ehvVnSgfKz7QRt9x2RxNzWpUTGHmZEZgZiZnlxmjK\nDGSwkKyBao1cnrb3LpqU1j+3yaVOPhp+NyKobxYp/M5HFWyzc9oJ39/4N5U4fDfDPIjP9zPpG6s0\nOUd+SptvM7JwXMdYo38F6qwIMnLndQ3nI0jKHee6UM74dZzXtzssP8fzhbB8ESw2kyvpGvKMy4S8\n8TpO13JfyJ8l331Q3Fqu82ukRlIU+3CFNwhFxwMPpBTMDQwfEOJyw8I4xC10w5R+sfIRn/QkQ/xB\nKHSOa0t9oML0kl5E8pcscdgp9MIgS77AnFiR2Tnf6ZKOI0N4TyVrqWvSIK1KAgDX0hw9wEfUtdvr\nlPhMsRBZRPw+cPHQQw81RePd+Ak+77333qaoq3gT68ja0jRv3jw/zQRTRbz33nt+igbgImpSSgQd\nlQU4iiw0Po/ICtMUjbeTm94BaBLwADLMtTVmzJjcPUePQ4YM8eVCrugfvZ/uAtBhCohobKCmP//5\nz03M2RU1b+WdfgKQEpQwzUO5gKMyxddAExDFJKBXXTU+ftr2v9IA97MaQJim9w5AB3hKCWGFHv+u\nlZJOLeKG32DqpLYSQogTiKWh7HWHnTQowW
QoXgPMjaVJJYu/Kn9MPgghuBULO1wTB8r8uRQ+U0xF\nEo1n40Ei8nHJTcwZzo7OHFQAE/NczZo1yy/MdcXs5lh50BkAziSjmk8Lyw0QlRTCiTyjZis/V1Xk\nF+Tns2IWdObZAlqYwwqYAq7ImxnUQx1GfgB+FneAGKsOwAW0Ms8WoEOagq5QFuIwb5WHkAhyWjtg\n7QF6ACsAy0JzDRQD5M2vKLyXhveOb0mp9xrrCODAM16MVaKwFio7G/5Z43sU/sHmfZOcyErcthC4\nP/r+oJM0hW8gTCScBdNAURo4MhpUkHb2U045paj4xUaibTr0yYn+xTa7NPpwNPPpiV6kZufL3YmA\nxftD4LeTL3Duxz/+sT9Nt/O99trL97DCCRnHYHpCEfCr2GSTTbyfDj4v+MRo3iucEHHGVFdy/HeI\nIz8dn8BXP+hW81vhAK35tvC/YVGXdhyQI2DxSwRQ3jkZmfDPiaw8jrm0CMh0xRVXuA022MD78mhK\nCvkM4adDkCwPP/yIO/mkk9zue+7thg073bVvt54/X4uf8RMnu8kTxrvDI/8fpqSw8KUGeLbwl4qa\n/Kqqknq9d5Sl3A4P+MHIjySCtlYdm6aQskM5CsXjXGTh+JoTb0vXZPF8qJO0ldlgJ4tPVB1l5mPL\nnEuF4KCO4pWUNaBDeXBO1jxSSQnwUX7ppZcc4BFZbzxw0KsJ2AAy9thjD/fGG2/4S4EKIAInZXo6\nAThM7Ans0HUf2AGCAAzBhfLEYZN5pzSEPnNjSS7+k0SWF583Ts9AjWCH60LY4TyBiUyZ5oKATPTm\nYpRlAZfkALrkkDxmzFg3c8Z0d8GYi92AA/bz19b6Z+Fri3w3f/KdMf3LiVVrLUOa8uP+8pyeeuqp\n3vGzGvNk1bt8+oZQrnICPYNw1OVPT2RRKSeJqlxDD6rQ6Tsp0ai5zcNO0rlGOsYfVLrms44sWX5S\n6zSV78tuIGmSyGRJtQaojPlXxpL1QC8XelMBO1FTk1/iZeIfNaADtOjDLDgAaNgOe2hFPgi+FxRg\nwiLDqa5hDeTouPIDHpEH0CGvpIk8SQ+rDaDFmgVLD4E0kQeLEWDDQs8n5tEiAEDsR/49/npZiSQj\n8tBFfMFvfuNuuHF63UAHWbt07ujuuXuO7+XVv/9+DQHWlKuUwPPw+FfPJO8a1r6o2ccfKyWdtMYF\ndsJJc0uVE6sBActUUo+nUtMrN37UXOVBJqlHE72xdL7c9LN0nXqYYYWnR2jagll20nZHMiAPTVkE\nVf5+J4M/yK9/mBKfCkaBSmbw4MF+F4sOH2dZWAAOrCtYVKJ2at8tXGBx0EEH+RnI6dLNiy/LDtYd\nxsxRF29ZdrAwoVOapKjQ2MeaJCACSIAT4AZo0fg9WHZYkENj6AiAuAawAn4YA+iwww5TsdwJJ5zg\nLSfIhyzEOfvscxxdxK+Zck1Nm61yQuXZOPW0YW7uffe62bfMLqmbcp7kUnuYZ41Fgfsft+DwrPJs\nhM+o4mdpjfy8S1isLJgGaqUBg51aabqB8uGjzMeYdfyDnKViYtHBciN4i8vOeWY0x9JCJcM+MCK/\nGSCDwfuAnchp2PvFRL2yfDLnn3++N7HjH4OO1IyFD0/YfEQ8FsJWW23lKzLiC3RkgQGuAB0NXkje\nbOO/w3HicQ2LIMonGv2wj5xnnnlmzuTPeDz8O15vvfUiHVzgyzll6pRUgY7kF/DMXzA/08+byqN1\nCC08WyyFAnBAHKw+LcUtlE69z2HBZOHds2AaqJUGDHZqpekGywdA4IOb1Q8WVh1kB9iSAueAEEBH\nUBf1UHIsgAWAoaHj5STMKMWHHnqoBxCsJTfddJMfsA/gIR3W+MvgywOsnHHGGblZ06NuuA6ZCIIW\nWXTIC6gR6GDFiUMOTVha1FSm67H2sE1g1OU77rjDb2PVOfron7mFry50s2b/KpWg4wWNfgCet976\nY6Z9eI
CU0JpBhV9q4LkEkkJQKjWNesZHbjWFZ/mPUj11aHmXpwGDnfL01uavAgDo5UPln7V/mVQ4\nWKby+Q1QKan3Vdh8BYSETUm/ifxboq7kHlwAkU6dOrloHB134okn+udjxx13zDUX0XwF6GDZica3\ncQcffLBvNiLiDTfckGsuE+gAVOSFRYe046DDcVlxNCKymqTUuwrAIZ7AiG2OUeFEXem9jDh4zrxp\nluvRvavfT/NPv336u+222zYzvbR4zniWFJKapnSu2LWsO1gay4GlYvNprXjIzAK0WTAN1FIDBju1\n1HaD5YXTJB9zKs8shZbkplJS7ysqFQJgIYsO4IFlBksOzUNYWrC+0DuEeFhOosHb/HW//OUv3dZb\nb+0dhrHoROPi5LqgAifRyMq5aUq4ILTGkGYIOmwDLkALQT45NIux4IODRSmEnTANrmefcnDf6J4+\nZuxl7ujBA316af+hl9b+/fd1l1x6SUXOra1ZzvBdwHLBs1TtAKTL1yxL1hHJzR8lC6aBWmvAYKfW\nGm+g/GQhAR5YshCojDCjU9knWaSSmq8AGCAES0sIOnIOlpWFZiRAAwih59OyZcu8SugyTKUXDern\nrrnmGn+sXbt27sEHH3Q/+MEPfPMT14SgA9SwyBlZQAWoEMiLHleCHNbAEwvnCMgN3ITpCJguvHCM\n22DDDd3UKc3n+vIXpvjn+mkz3E0zboyGALgzFf47VNwsClgtahHIh2cKgMhC4H1D5qxapLKgY5Ox\nsAYMdgrrx862oAE+YjT5RCMEt8q/2BayL+m0mgCoII78qkdZmIDKwrFCzVdADlYd1sAEkALkABoA\nCNYV8sIJmLD22mt7B2biKQBdW2yxRa43FE7EnAecSFOQA5ywcFygQ14h6GDREewItkiP+Cxx4Inm\n+HKjR5/v7rt/ru/mLZmysmZW9W7durohJ59UF5G5dwoAM0utA4Al2El6lmstT6H8eBcAHf5kjLbm\nq0KqsnOtqAGDnVZUbltJGsdaLDtUAq1htq+GHvXBRT7kTQqcK7b5CtgBQoAJLCmADs1UgAfAAWww\nxgazlYeB8ThGjBjhormtPMBwDeBCZSAwSQIdrDSkCUgRn3y0sM/COSxELITQIgUsYeFB5p///Fi3\n48493dnDh4WiZWZ77oMPu1OHnOxq1TsLCOb5UeBepSFgJQF00m4tUTfzxwNITIP+TIa2pQGDnbZ1\nv1uttHx0qRT4oKXNj0Cgg1z5Prj844z3vgphAUiQn46ar2jWAkAADaAFJ2QABOgg4J/D6MoK++23\nn4dC4qv5ifjs47tDfqTJObq4AydACgGAAaLC62TN4frQooNMBNIjyMJDWgsWLHDHHXe8e+LJJ1Pd\n+8oLXuCnNa07PC88ywpAcNqeacmGlZJm0rRaVtP8XZAObd02NGCw0zbuc01KqQ8blpO0WHgEOijg\n8TwgVm7zFTABZAAs9LRiAUCAHXRw8sknf03vjJAsQAqbvYhIMxZNTn/961/da6+95iGF4+uvv77b\naKONcqAj4AFyWLAsyU9HoMN1BAEPaQM9Z5xxpltlte+6MReO9uez+iPrzqI3FlWlCDwbCoBNWp5f\nyZS0pilLYJZGy6q+B/neu6Qy2THTQGtpwGCntTTbRtPlA4dZnQ9cvSsMKgNM6BtHPhXAR75/58hZ\nbvMV4KFu5Vh32B82bFhuCokddtjBD+bXr18//0Qwl87ZZ5/tgQdrDWAEqAApgImsMKw5xjkGLGQR\nHDHvDH5AgFYh0AkfQdJm8MPte2zv7phzVyZ9dcLysN2rV283dOiQsnpm8WywKKSlaUrytLSW7Dzb\nBJ5vgCefP5qPVKOfYv5g1EgUy8Y0kNPAf0Xm+9G5PdswDVSoAeCCUXl33313DxfdunWrMMXyLuej\njwyMZ0NFAIQkBR5/Jshk0D8AjXiAgSwhWFrUhIUvjfx0ABWsKlh15KvD
hLnOJXbe/+MHtuK2H1TS7NQ+s/b+B+zBjo7MbXuRWqAYBoGcEhiY03bRrBQT\nkLjRUJYOJRcxvku3hrzCJCGkQ8JGw2DFYii8N+lzCQtPfi4RIrGhpHMXJ2FeSuDo+/C6n3vu34ef\ndc3LVV0KnlZS7vbIFp274NHnwq02/Jh/ie7N058hQ4bkqTm0BQIQSIgAYichsBTbdwK+jUPfn0jX\nnS4yZJWLjNALE4oTFyueFwsZfS73murSM0o610wyJdnix7vv/l90LW9/jj7qKPv100/lrVm0BwIQ\niJkAYidmoBQHAREIBdA+z8rfA4MlSPxwIVRK4Oi78Lqfex5+X3xN5XvZyj/88z4BlLfe+dfRDfb8\nC8/nrVm0BwIQiJkAYidmoBQHgVIEQvHj5xIklQx/hcLGBU9v1wYNHFTKLK5BAAIQqAsCiJ266GYa\nmUYCLnpkm84VYyMBpOSeH8//AVUvAAAG+ElEQVQlasLDxY3nLnrCPDwffOjgqNy8/dm5iwUF89an\ntAcCSRBA7CRBlTIhUCEBF0Ceu/hRcS58lIfCJzyX+AkFkAueIR/9aIUWpfuxvYWVpb96/gXpNhLr\nIACBmhNA7NS8CzAAAt0TcNGjO/xcYseHvyRmXAS56NHnUPDofODAj9j//uEP3VfENxCAAARyTOAj\nOW4bTYNALglI9Pgh0aNj4MCBdsghh9jgwYOjQ9tE6DjssMOiQzumd3W9ljse27fvsIZRo3LXLhoE\nAQjESwCxEy9PSoNA1Qm48FEerq0zaNCgSAAdeuih1tjYaGvX3Fd125Ku8KXf/84+9rF8DtElzY7y\nIVBPBBA79dTbtLVuCBQLoCOOOKKwGOOppp3C85T+pzDtvJaLTOaJJW2BQJ4JIHby3Lu0DQIBgRPH\nNdrjG38VXMn2qYTb611d7Hie7W7EeghUhQBipyqYqQQCtScw4aRG27plc+0NicmCp595xr7cVPkO\n9zGZQTEQgEAGCCB2MtBJmAiBOAhob7EX9uy2vKxN07Hufps4YXwcaCgDAhDIOQHETs47mOZBICRw\n9jlT7I477gwvZfL84Ucfi4awJOBIEIAABHojgNjpjRDfQyBHBC695GJ7+Kc/sa7X38h0q+5dudIu\nnT07023AeAhAoHoEEDvVY01NEKg5geHDh9sJnz3R7mm/t+a2VGqAvDra6bx5GisnV8qQ5yBQbwQG\nFFZb/ft2zPXWetoLgToksGPHDhs7dqz9Zudu067hWUvnf22ajR/faN/6Jp6drPUd9kKgVgTw7NSK\nPPVCoEYEjj/+eLt2wUJbuHBhjSyovNplt9xaiNV5Da9O5Qh5EgJ1SQCxU5fdTqPrncDsyy6NhoIk\nHrKSNIvs1rZl9v3vX2OHH354VszGTghAIAUEEDsp6ARMgEC1CUgsLL9teSQesrKqcktLi13Q3MyK\nydX+sVAfBHJAgJidHHQiTYBApQSWLWuzjo4O+9GqVTZs6KcqLSbx5+ZcMddefPG3tr6zI/G6qAAC\nEMgfAcRO/vqUFkGgLAJXXtVie/bsseXLf5BKwSOhs2P7MwVR9gDDV2X1LDdDAAJOgGEsJ0EOgTol\ncP3i66yhocFmzbokdevvSOhoXaClS29C6NTp75NmQyAOAoidOChSBgQyTiAUPGmI4dGihz50tXrN\najb7zPjvC/MhUGsCiJ1a9wD1QyAlBCR4GseNs6/PmG5r73+gZlZp1pW8TIrRWdl+N0KnZj1BxRDI\nDwHETn76kpZAoN8E5s1rsdbFrXbNvKvtoourP6ylqfBfmXJOJLoUjMwU8353KQVAAAIFAogdfgYQ\ngMABBLS55tZtW23AgAE2edIkW9S65IDvk/igLSDObppi2slcU+IlukgQgAAE4iLAbKy4SFIOBHJI\n4PHHH7cVt98ZrVr8+TPOshnTp8U6Y+vOu1faL36+b6+rlqtbbPr06TmkSJMgAIFaE0
Ds1LoHqB8C\nGSAg0bPqvjV2+4rlduFFs6xx/El21pmnVyR85MXZtOlJW7d2tQ0dNsxmFGKEmpqaGLLKwO8AEyGQ\nVQKInaz2HHZDoAYEXnrpJdu4caM9+rOf232rfmhfPvscGzFipH3ik5+0kSNH2JAhQw6yavv2Hfbe\ne+/Z73/3YvTMpEmn2udOO82+9MUvEHx8EC0uQAACSRBA7CRBlTIhUCcE5PHRLup/sQH21FNPl2z1\nkUceaUf985F29NFHReJm+PDhJe/jIgQgAIGkCCB2kiJLuRCAAAQgAAEIpIIAs7FS0Q0YAQEIQAAC\nEIBAUgQQO0mRpVwIQAACEIAABFJBALGTim7ACAhAAAIQgAAEkiKA2EmKLOVCAAIQgAAEIJAKAoid\nVHQDRkAAAhCAAAQgkBQBxE5SZCkXAhCAAAQgAIFUEEDspKIbMAICEIAABCAAgaQIIHaSIku5EIAA\nBCAAAQikggBiJxXdgBEQgAAEIAABCCRFALGTFFnKhQAEIAABCEAgFQQQO6noBoyAAAQgAAEIQCAp\nAoidpMhSLgQgAAEIQAACqSCA2ElFN2AEBCAAAQhAAAJJEUDsJEWWciEAAQhAAAIQSAUBxE4qugEj\nIAABCEAAAhBIigBiJymylAsBCEAAAhCAQCoIIHZS0Q0YAQEIQAACEIBAUgQQO0mRpVwIQAACEIAA\nBFJBALGTim7ACAhAAAIQgAAEkiKA2EmKLOVCAAIQgAAEIJAKAoidVHQDRkAAAhCAAAQgkBQBxE5S\nZCkXAhCAAAQgAIFUEEDspKIbMAICEIAABCAAgaQIIHaSIku5EIAABCAAAQikggBiJxXdgBEQgAAE\nIAABCCRFALGTFFnKhQAEIAABCEAgFQQQO6noBoyAAAQgAAEIQCApAn8FUX2PmBTVQm8AAAAASUVO\nRK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 100, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "Image(filename='sentiment_network_sparse_2.png')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 5: Making our Network More Efficient" + ] + }, + { + "cell_type": "code", + "execution_count": 105, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self,reviews):\n", + " \n", + " review_vocab = set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " 
review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " self.layer_1 = np.zeros((1,hidden_nodes))\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " self.layer_0[0][self.word2index[word]] = 1\n", + "\n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def train(self, training_reviews_raw, training_labels):\n", + " \n", + " training_reviews = list()\n", + " for review in 
training_reviews_raw:\n", + " indices = set()\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " indices.add(self.word2index[word])\n", + " training_reviews.append(list(indices))\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + "\n", + " # Hidden layer\n", + "# layer_1 = self.layer_0.dot(self.weights_0_1)\n", + " self.layer_1 *= 0\n", + " for index in review:\n", + " self.layer_1 += self.weights_0_1[index]\n", + " \n", + " # Output layer\n", + " layer_2 = self.sigmoid(self.layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward pass ###\n", + "\n", + " # Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error is the difference between desired target and actual output.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # Update the weights\n", + " self.weights_1_2 -= self.layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " \n", + " for index in review:\n", + " self.weights_0_1[index] -= layer_1_delta[0] * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - 
start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / float(i+1))[:4] + \"%\")\n", + " \n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \"% #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + "\n", + "\n", + " # Hidden layer\n", + " self.layer_1 *= 0\n", + " unique_indices = set()\n", + " for word in review.lower().split(\" \"):\n", + " if word in self.word2index.keys():\n", + " unique_indices.add(self.word2index[word])\n", + " for index in unique_indices:\n", + " self.layer_1 += self.weights_0_1[index]\n", + " \n", + " # Output layer\n", + " layer_2 = self.sigmoid(self.layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] > 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 106, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 111, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "mlp.train(reviews[:-1000],labels[:-1000])" + 
] + }, + { + "cell_type": "code", + "execution_count": 109, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):1581.% #Correct:857 #Tested:1000 Testing Accuracy:85.7%" + ] + } + ], + "source": [ + "# evaluate the trained network on the 1,000 held-out reviews\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 6).ipynb b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 6).ipynb new file mode 100644 index 0000000..b6eeb8c --- /dev/null +++ b/sentiment_network/Sentiment Classification - How to Best Frame a Problem for a Neural Network (Project 6).ipynb @@ -0,0 +1,8938 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Sentiment Classification & How To \"Frame Problems\" for a Neural Network\n", + "\n", + "by Andrew Trask\n", + "\n", + "- **Twitter**: @iamtrask\n", + "- **Blog**: http://iamtrask.github.io" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### What You Should Already Know\n", + "\n", + "- neural networks, forward and back-propagation\n", + "- stochastic gradient descent\n", + "- mean squared error\n", + "- and train/test splits\n", + "\n", + "### Where to Get Help if You Need it\n", + "- Re-watch previous Udacity Lectures\n", + "- Leverage the recommended Course
Reading Material - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) (40% Off: **traskud17**)\n", + "- Shoot me a tweet @iamtrask\n", + "\n", + "\n", + "### Tutorial Outline:\n", + "\n", + "- Intro: The Importance of \"Framing a Problem\"\n", + "\n", + "\n", + "- Curate a Dataset\n", + "- Developing a \"Predictive Theory\"\n", + "- **PROJECT 1**: Quick Theory Validation\n", + "\n", + "\n", + "- Transforming Text to Numbers\n", + "- **PROJECT 2**: Creating the Input/Output Data\n", + "\n", + "\n", + "- Putting it all together in a Neural Network\n", + "- **PROJECT 3**: Building our Neural Network\n", + "\n", + "\n", + "- Understanding Neural Noise\n", + "- **PROJECT 4**: Making Learning Faster by Reducing Noise\n", + "\n", + "\n", + "- Analyzing Inefficiencies in our Network\n", + "- **PROJECT 5**: Making our Network Train and Run Faster\n", + "\n", + "\n", + "- Further Noise Reduction\n", + "- **PROJECT 6**: Reducing Noise by Strategically Reducing the Vocabulary\n", + "\n", + "\n", + "- Analysis: What's going on in the weights?" 
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "nbpresent": { + "id": "56bb3cba-260c-4ebe-9ed6-b995b4c72aa3" + } + }, + "source": [ + "# Lesson: Curate a Dataset" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "eba2b193-0419-431e-8db9-60f34dd3fe83" + } + }, + "outputs": [], + "source": [ + "def pretty_print_review_and_label(i):\n", + " print(labels[i] + \"\\t:\\t\" + reviews[i][:80] + \"...\")\n", + "\n", + "g = open('reviews.txt','r') # What we know!\n", + "reviews = list(map(lambda x:x[:-1],g.readlines()))\n", + "g.close()\n", + "\n", + "g = open('labels.txt','r') # What we WANT to know!\n", + "labels = list(map(lambda x:x[:-1].upper(),g.readlines()))\n", + "g.close()" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "25000" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "len(reviews)" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "bb95574b-21a0-4213-ae50-34363cf4f87f" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'bromwell high is a cartoon comedy . it ran at the same time as some other programs about school life such as teachers . my years in the teaching profession lead me to believe that bromwell high s satire is much closer to reality than is teachers . the scramble to survive financially the insightful students who can see right through their pathetic teachers pomp the pettiness of the whole situation all remind me of the schools i knew and their students . when i saw the episode in which a student repeatedly tried to burn down the school i immediately recalled . . . . . . . . . at . . . . . . . . . . high . a classic line inspector i m here to sack one of your teachers . student welcome to bromwell high . 
i expect that many adults of my age think that bromwell high is far fetched . what a pity that it isn t '" + ] + }, + "execution_count": 5, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "reviews[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e0408810-c424-4ed4-afb9-1735e9ddbd0a" + } + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Lesson: Develop a Predictive Theory" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false, + "nbpresent": { + "id": "e67a709f-234f-4493-bae6-4fb192141ee0" + } + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "labels.txt \t : \t reviews.txt\n", + "\n", + "NEGATIVE\t:\tthis movie is terrible but it has some good effects . ...\n", + "POSITIVE\t:\tadrian pasdar is excellent is this film . he makes a fascinating woman . ...\n", + "NEGATIVE\t:\tcomment this movie is impossible . is terrible very improbable bad interpretat...\n", + "POSITIVE\t:\texcellent episode movie ala pulp fiction . days suicides . it doesnt get more...\n", + "NEGATIVE\t:\tif you haven t seen this it s terrible . it is pure trash . 
i saw this about ...\n", + "POSITIVE\t:\tthis schiffer guy is a real genius the movie is of excellent quality and both e...\n" + ] + } + ], + "source": [ + "print(\"labels.txt \\t : \\t reviews.txt\\n\")\n", + "pretty_print_review_and_label(2137)\n", + "pretty_print_review_and_label(12816)\n", + "pretty_print_review_and_label(6267)\n", + "pretty_print_review_and_label(21934)\n", + "pretty_print_review_and_label(5297)\n", + "pretty_print_review_and_label(4998)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 1: Quick Theory Validation" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "from collections import Counter\n", + "import numpy as np" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "positive_counts = Counter()\n", + "negative_counts = Counter()\n", + "total_counts = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for i in range(len(reviews)):\n", + " if(labels[i] == 'POSITIVE'):\n", + " for word in reviews[i].split(\" \"):\n", + " positive_counts[word] += 1\n", + " total_counts[word] += 1\n", + " else:\n", + " for word in reviews[i].split(\" \"):\n", + " negative_counts[word] += 1\n", + " total_counts[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('', 550468),\n", + " ('the', 173324),\n", + " ('.', 159654),\n", + " ('and', 89722),\n", + " ('a', 83688),\n", + " ('of', 76855),\n", + " ('to', 66746),\n", + " ('is', 57245),\n", + " ('in', 50215),\n", + " ('br', 49235),\n", + " ('it', 48025),\n", + " ('i', 40743),\n", + " ('that', 35630),\n", + " ('this', 35080),\n", + " ('s', 33815),\n", + " ('as', 26308),\n", + " ('with', 23247),\n", + " 
('for', 22416),\n", + " ('was', 21917),\n", + " ('film', 20937),\n", + " ('but', 20822),\n", + " ('movie', 19074),\n", + " ('his', 17227),\n", + " ('on', 17008),\n", + " ('you', 16681),\n", + " ('he', 16282),\n", + " ('are', 14807),\n", + " ...,\n", + " ('ok', 302),\n", + " ('admit', 
302),\n", + " ('younger', 302),\n", + " ('reviews', 302),\n", + " ('odd', 301),\n", + " ('fighting', 301),\n", + " ('master', 301),\n", + " ('break', 300),\n", + " ('thanks', 300),\n", + " ('recent', 300),\n", + " ('comment', 300),\n", + " ('apart', 299),\n", + " ('lovely', 298),\n", + " ('begin', 298),\n", + " ('emotions', 298),\n", + " ('doctor', 297),\n", + " ('italian', 297),\n", + " ('party', 297),\n", + " ('la', 296),\n", + " ('missed', 296),\n", + " ...]" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "positive_counts.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "pos_neg_ratios = Counter()\n", + "\n", + "for term,cnt in list(total_counts.most_common()):\n", + " if(cnt > 100):\n", + " pos_neg_ratio = positive_counts[term] / float(negative_counts[term]+1)\n", + " pos_neg_ratios[term] = pos_neg_ratio\n", + "\n", + "for word,ratio in pos_neg_ratios.most_common():\n", + " if(ratio > 1):\n", + " pos_neg_ratios[word] = np.log(ratio)\n", + " else:\n", + " pos_neg_ratios[word] = -np.log((1 / (ratio+0.01)))" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('edie', 4.6913478822291435),\n", + " ('paulie', 4.0775374439057197),\n", + " ('felix', 3.1527360223636558),\n", + " ('polanski', 2.8233610476132043),\n", + " ('matthau', 2.8067217286092401),\n", + " ('victoria', 2.6810215287142909),\n", + " ('mildred', 2.6026896854443837),\n", + " ('gandhi', 2.5389738710582761),\n", + " ('flawless', 2.451005098112319),\n", + " ('superbly', 2.2600254785752498),\n", + " ('perfection', 2.1594842493533721),\n", + " ('astaire', 2.1400661634962708),\n", + " ('captures', 2.0386195471595809),\n", + " ('voight', 2.0301704926730531),\n", + " ('wonderfully', 2.0218960560332353),\n", + " ('powell', 1.9783454248084671),\n", 
+ " ('brosnan', 1.9547990964725592),\n", + " ('lily', 1.9203768470501485),\n", + " ('refreshing', 1.8551812956655511),\n", + " ('delightful', 1.8002701588959635),\n", + " ('superb', 1.7091514458966952),\n", + " ('wonderful', 1.5645425925262093),\n", + " ...,\n", + " ('albert', 0.8027669055771679),\n", + " ('abc', 
0.80234647252493729),\n", + " ('cry', 0.80011930011211307),\n", + " ('sides', 0.7995275841185171),\n", + " ('develops', 0.79850769621777162),\n", + " ('eyre', 0.79850769621777162),\n", + " ('dances', 0.79694397424158891),\n", + " ('oscars', 0.79633141679517616),\n", + " ('legendary', 0.79600456599965308),\n", + " ('importance', 0.79492987486988764),\n", + " ('hearted', 0.79492987486988764),\n", + " ('portraying', 0.79356592830699269),\n", + " ('impressed', 0.79258107754813223),\n", + " ('waters', 0.79112758892014912),\n", + " ('empire', 0.79078565012386137),\n", + " ('edge', 0.789774016249017),\n", + " ('environment', 0.78845736036427028),\n", + " ('jean', 0.78845736036427028),\n", + " ('sentimental', 0.7864791203521645),\n", + " ('captured', 0.78623760362595729),\n", + " ('styles', 0.78592891401091158),\n", + " ('daring', 0.78592891401091158),\n", + " ('backgrounds', 0.78275933924963248),\n", + " ('frank', 0.78275933924963248),\n", + " ('matches', 0.78275933924963248),\n", + " ('tense', 0.78275933924963248),\n", + " ('gothic', 0.78209466657644144),\n", + " ('sharp', 0.7814397877056235),\n", + " ('achieved', 0.78015855754957497),\n", + " ('court', 0.77947526404844247),\n", + " ('steals', 0.7789140023173704),\n", + " ('rules', 0.77844476107184035),\n", + " ('colors', 0.77684619943659217),\n", + " ('reunion', 0.77318988823348167),\n", + " ('covers', 0.77139937745969345),\n", + " ('tale', 0.77010822169607374),\n", + " ('rain', 0.7683706017975328),\n", + " ('denzel', 0.76804848873306297),\n", + " ('stays', 0.76787072675588186),\n", + " ('blob', 0.76725515271366718),\n", + " ('conventional', 0.76214005204689672),\n", + " ('maria', 0.76214005204689672),\n", + " ('fresh', 0.76158434211317383),\n", + " ('midnight', 0.76096977689870637),\n", + " ('landscape', 0.75852993982279704),\n", + " ('animated', 0.75768570169751648),\n", + " ('titanic', 0.75666058628227129),\n", + " ('sunday', 0.75666058628227129),\n", + " ('spring', 0.7537718023763802),\n", + " ('cagney', 
0.7537718023763802),\n", + " ('enjoyable', 0.75246375771636476),\n", + " ('immensely', 0.75198768058287868),\n", + " ('sir', 0.7507762933965817),\n", + " ('nevertheless', 0.75067102469813185),\n", + " ('driven', 0.74994477895307854),\n", + " ('performances', 0.74883252516063137),\n", + " ('memories', 0.74721440183022114),\n", + " ('nowadays', 0.74721440183022114),\n", + " ('simple', 0.74641420974143258),\n", + " ('golden', 0.74533293373051557),\n", + " ('leslie', 0.74533293373051557),\n", + " ('lovers', 0.74497224842453125),\n", + " ('relationship', 0.74484232345601786),\n", + " ('supporting', 0.74357803418683721),\n", + " ('che', 0.74262723782331497),\n", + " ('packed', 0.7410032017375805),\n", + " ('trek', 0.74021469141793106),\n", + " ('provoking', 0.73840377214806618),\n", + " ('strikes', 0.73759894313077912),\n", + " ('depiction', 0.73682224406260699),\n", + " ('emotional', 0.73678211645681524),\n", + " ('secretary', 0.7366322924996842),\n", + " ('influenced', 0.73511137965897755),\n", + " ('florida', 0.73511137965897755),\n", + " ('germany', 0.73288750920945944),\n", + " ('brings', 0.73142936713096229),\n", + " ('lewis', 0.73129894652432159),\n", + " ('elderly', 0.73088750854279239),\n", + " ('owner', 0.72743625403857748),\n", + " ('streets', 0.72666987259858895),\n", + " ('henry', 0.72642196944481741),\n", + " ('portrays', 0.72593700338293632),\n", + " ('bears', 0.7252354951114458),\n", + " ('china', 0.72489587887452556),\n", + " ('anger', 0.72439972406404984),\n", + " ('society', 0.72433010799663333),\n", + " ('available', 0.72415741730250549),\n", + " ('best', 0.72347034060446314),\n", + " ('bugs', 0.72270598280148979),\n", + " ('magic', 0.71878961117328299),\n", + " ('verhoeven', 0.71846498854423513),\n", + " ('delivers', 0.71846498854423513),\n", + " ('jim', 0.71783979315031676),\n", + " ('donald', 0.71667767797013937),\n", + " ('endearing', 0.71465338578090898),\n", + " ('relationships', 0.71393795022901896),\n", + " ('greatly', 
0.71256526641704687),\n", + " ('charlie', 0.71024161391924534),\n", + " ('brad', 0.71024161391924534),\n", + " ('simon', 0.70967648251115578),\n", + " ('effectively', 0.70914752190638641),\n", + " ('march', 0.70774597998109789),\n", + " ('atmosphere', 0.70744773070214162),\n", + " ('influence', 0.70733181555190172),\n", + " ('genius', 0.706392407309966),\n", + " ('emotionally', 0.70556970055850243),\n", + " ('ken', 0.70526854109229009),\n", + " ('identity', 0.70484322032313651),\n", + " ('sophisticated', 0.70470800296102132),\n", + " ('dan', 0.70457587638356811),\n", + " ('andrew', 0.70329955202396321),\n", + " ('india', 0.70144598337464037),\n", + " ('roy', 0.69970458110610434),\n", + " ('surprisingly', 0.6995780708902356),\n", + " ('sky', 0.69780919366575667),\n", + " ('romantic', 0.69664981111114743),\n", + " ('match', 0.69566924999265523),\n", + " ('britain', 0.69314718055994529),\n", + " ('beatty', 0.69314718055994529),\n", + " ('affected', 0.69314718055994529),\n", + " ('cowboy', 0.69314718055994529),\n", + " ('wave', 0.69314718055994529),\n", + " ('stylish', 0.69314718055994529),\n", + " ('bitter', 0.69314718055994529),\n", + " ('patient', 0.69314718055994529),\n", + " ('meets', 0.69314718055994529),\n", + " ('love', 0.69198533541937324),\n", + " ('paul', 0.68980827929443067),\n", + " ('andy', 0.68846333124751902),\n", + " ('performance', 0.68797386327972465),\n", + " ('patrick', 0.68645819240914863),\n", + " ('unlike', 0.68546468438792907),\n", + " ('brooks', 0.68433655087779044),\n", + " ('refuses', 0.68348526964820844),\n", + " ('award', 0.6824518914431974),\n", + " ('complaint', 0.6824518914431974),\n", + " ('ride', 0.68229716453587952),\n", + " ('dawson', 0.68171848473632257),\n", + " ('luke', 0.68158635815886937),\n", + " ('wells', 0.68087708796813096),\n", + " ('france', 0.6804081547825156),\n", + " ('handsome', 0.68007509899259255),\n", + " ('sports', 0.68007509899259255),\n", + " ('rebel', 0.67875844310784572),\n", + " ('directs', 
0.67875844310784572),\n", + " ('greater', 0.67605274720064523),\n", + " ('dreams', 0.67599410133369586),\n", + " ('effective', 0.67565402311242806),\n", + " ('interpretation', 0.67479804189174875),\n", + " ('works', 0.67445504754779284),\n", + " ('brando', 0.67445504754779284),\n", + " ('noble', 0.6737290947028437),\n", + " ('paced', 0.67314651385327573),\n", + " ('le', 0.67067432470788668),\n", + " ('master', 0.67015766233524654),\n", + " ('h', 0.6696166831497512),\n", + " ('rings', 0.66904962898088483),\n", + " ('easy', 0.66895995494594152),\n", + " ('city', 0.66820823221269321),\n", + " ('sunshine', 0.66782937257565544),\n", + " ('succeeds', 0.66647893347778397),\n", + " ('relations', 0.664159643686693),\n", + " ('england', 0.66387679825983203),\n", + " ('glimpse', 0.66329421741026418),\n", + " ('aired', 0.66268797307523675),\n", + " ('sees', 0.66263163663399482),\n", + " ('both', 0.66248336767382998),\n", + " ('definitely', 0.66199789483898808),\n", + " ('imaginative', 0.66139848224536502),\n", + " ('appreciate', 0.66083893732728749),\n", + " ('tricks', 0.66071190480679143),\n", + " ('striking', 0.66071190480679143),\n", + " ('carefully', 0.65999497324304479),\n", + " ('complicated', 0.65981076029235353),\n", + " ('perspective', 0.65962448852130173),\n", + " ('trilogy', 0.65877953705573755),\n", + " ('future', 0.65834665141052828),\n", + " ('lion', 0.65742909795786608),\n", + " ('victor', 0.65540685257709819),\n", + " ('douglas', 0.65540685257709819),\n", + " ('inspired', 0.65459851044271034),\n", + " ('marriage', 0.65392646740666405),\n", + " ('demands', 0.65392646740666405),\n", + " ('father', 0.65172321672194655),\n", + " ('page', 0.65123628494430852),\n", + " ('instant', 0.65058756614114943),\n", + " ('era', 0.6495567444850836),\n", + " ('ruthless', 0.64934455790155243),\n", + " ('saga', 0.64934455790155243),\n", + " ('joan', 0.64891392558311978),\n", + " ('joseph', 0.64841128671855386),\n", + " ('workers', 0.64829661439459352),\n", + " ('fantasy', 
0.64726757480925168),\n", + " ('accomplished', 0.64551913157069074),\n", + " ('distant', 0.64551913157069074),\n", + " ('manhattan', 0.64435701639051324),\n", + " ('personal', 0.64355023942057321),\n", + " ('pushing', 0.64313675998528386),\n", + " ('meeting', 0.64313675998528386),\n", + " ('individual', 0.64313675998528386),\n", + " ('pleasant', 0.64250344774119039),\n", + " ('brave', 0.64185388617239469),\n", + " ('william', 0.64083139119578469),\n", + " ('hudson', 0.64077919504262937),\n", + " ('friendly', 0.63949446706762514),\n", + " ('eccentric', 0.63907995928966954),\n", + " ('awards', 0.63875310849414646),\n", + " ('jack', 0.63838309514997038),\n", + " ('seeking', 0.63808740337691783),\n", + " ('colonel', 0.63757732940513456),\n", + " ('divorce', 0.63757732940513456),\n", + " ('jane', 0.63443957973316734),\n", + " ('keeping', 0.63414883979798953),\n", + " ('gives', 0.63383568159497883),\n", + " ('ted', 0.63342794585832296),\n", + " ('animation', 0.63208692379869902),\n", + " ('progress', 0.6317782341836532),\n", + " ('concert', 0.63127177684185776),\n", + " ('larger', 0.63127177684185776),\n", + " ('nation', 0.6296337748376194),\n", + " ('albeit', 0.62739580299716491),\n", + " ('adapted', 0.62613647027698516),\n", + " ('discovers', 0.62542900650499444),\n", + " ('classic', 0.62504956428050518),\n", + " ('segment', 0.62335141862440335),\n", + " ('morgan', 0.62303761437291871),\n", + " ('mouse', 0.62294292188669675),\n", + " ('impressive', 0.62211140744319349),\n", + " ('artist', 0.62168821657780038),\n", + " ('ultimate', 0.62168821657780038),\n", + " ('griffith', 0.62117368093485603),\n", + " ('emily', 0.62082651898031915),\n", + " ('drew', 0.62082651898031915),\n", + " ('moved', 0.6197197120051281),\n", + " ('profound', 0.61903920840622351),\n", + " ('families', 0.61903920840622351),\n", + " ('innocent', 0.61851219917136446),\n", + " ('versions', 0.61730910416844087),\n", + " ('eddie', 0.61691981517206107),\n", + " ('criticism', 0.61651395453902935),\n", + " 
('nature', 0.61594514653194088),\n", + " ('recognized', 0.61518563909023349),\n", + " ('sexuality', 0.61467556511845012),\n", + " ('contract', 0.61400986000122149),\n", + " ('brian', 0.61344043794920278),\n", + " ('remembered', 0.6131044728864089),\n", + " ('determined', 0.6123858239154869),\n", + " ('offers', 0.61207935747116349),\n", + " ('pleasure', 0.61195702582993206),\n", + " ('washington', 0.61180154110599294),\n", + " ('images', 0.61159731359583758),\n", + " ('games', 0.61067095873570676),\n", + " ('academy', 0.60872983874736208),\n", + " ('fashioned', 0.60798937221963845),\n", + " ('melodrama', 0.60749173598145145),\n", + " ('peoples', 0.60613580357031549),\n", + " ('charismatic', 0.60613580357031549),\n", + " ('rough', 0.60613580357031549),\n", + " ('dealing', 0.60517840761398811),\n", + " ('fine', 0.60496962268013299),\n", + " ('tap', 0.60391604683200273),\n", + " ('trio', 0.60157998703445481),\n", + " ('russell', 0.60120968523425966),\n", + " ('figures', 0.60077386042893011),\n", + " ('ward', 0.60005675749393339),\n", + " ('shine', 0.59911823091166894),\n", + " ('brady', 0.59911823091166894),\n", + " ('job', 0.59845562125168661),\n", + " ('satisfied', 0.59652034487087369),\n", + " ('river', 0.59637962862495086),\n", + " ('brown', 0.595773016534769),\n", + " ('believable', 0.59566072133302495),\n", + " ('bound', 0.59470710774669278),\n", + " ('always', 0.59470710774669278),\n", + " ('hall', 0.5933967777928858),\n", + " ('cook', 0.5916777203950857),\n", + " ('claire', 0.59136448625000293),\n", + " ('broadway', 0.59033768669372433),\n", + " ('anna', 0.58778666490211906),\n", + " ('peace', 0.58628403501758408),\n", + " ('visually', 0.58539431926349916),\n", + " ('falk', 0.58525821854876026),\n", + " ('morality', 0.58525821854876026),\n", + " ('growing', 0.58466653756587539),\n", + " ('experiences', 0.58314628534561685),\n", + " ('stood', 0.58314628534561685),\n", + " ('touch', 0.58122926435596001),\n", + " ('lives', 0.5810976767513224),\n", + " ('kubrick', 
0.58066919713325493),\n", + " ('timing', 0.58047401805583243),\n", + " ('struggles', 0.57981849525294216),\n", + " ('expressions', 0.57981849525294216),\n", + " ('authentic', 0.57848427223980559),\n", + " ('helen', 0.57763429343810091),\n", + " ('pre', 0.57700753064729182),\n", + " ('quirky', 0.5753641449035618),\n", + " ('young', 0.57531672344534313),\n", + " ('inner', 0.57454143815209846),\n", + " ('mexico', 0.57443087372056334),\n", + " ('clint', 0.57380042292737909),\n", + " ('sisters', 0.57286101468544337),\n", + " ('realism', 0.57226528899949558),\n", + " ('personalities', 0.5720692490067093),\n", + " ('french', 0.5720692490067093),\n", + " ('surprises', 0.57113222999698177),\n", + " ('adventures', 0.57113222999698177),\n", + " ('overcome', 0.5697681593994407),\n", + " ('timothy', 0.56953322459276867),\n", + " ('tales', 0.56909453188996639),\n", + " ('war', 0.56843317302781682),\n", + " ('civil', 0.5679840376059393),\n", + " ('countries', 0.56737779327091187),\n", + " ('streep', 0.56710645966458029),\n", + " ('tradition', 0.56685345523565323),\n", + " ('oliver', 0.56673325570428668),\n", + " ('australia', 0.56580775818334383),\n", + " ('understanding', 0.56531380905006046),\n", + " ('players', 0.56509525370004821),\n", + " ('knowing', 0.56489284503626647),\n", + " ('rogers', 0.56421349718405212),\n", + " ('suspenseful', 0.56368911332305849),\n", + " ('variety', 0.56368911332305849),\n", + " ('true', 0.56281525180810066),\n", + " ('jr', 0.56220982311246936),\n", + " ('psychological', 0.56108745854687891),\n", + " ('branagh', 0.55961578793542266),\n", + " ('wealth', 0.55961578793542266),\n", + " ('performing', 0.55961578793542266),\n", + " ('odds', 0.55961578793542266),\n", + " ('sent', 0.55961578793542266),\n", + " ('reminiscent', 0.55961578793542266),\n", + " ('grand', 0.55961578793542266),\n", + " ('overwhelming', 0.55961578793542266),\n", + " ('brothers', 0.55891181043362848),\n", + " ('howard', 0.55811089675600245),\n", + " ('david', 
0.55693122256475369),\n", + " ('generation', 0.55628799784274796),\n", + " ('grow', 0.55612538299565417),\n", + " ('survival', 0.55594605904646033),\n", + " ('mainstream', 0.55574731115750231),\n", + " ('dick', 0.55431073570572953),\n", + " ('charm', 0.55288175575407861),\n", + " ('kirk', 0.55278982286502287),\n", + " ('twists', 0.55244729845681018),\n", + " ('gangster', 0.55206858230003986),\n", + " ('jeff', 0.55179306225421365),\n", + " ('family', 0.55116244510065526),\n", + " ('tend', 0.55053307336110335),\n", + " ('thanks', 0.55049088015842218),\n", + " ('world', 0.54744234723432639),\n", + " ('sutherland', 0.54743536937855164),\n", + " ('life', 0.54695514434959924),\n", + " ('disc', 0.54654370636806993),\n", + " ('bug', 0.54654370636806993),\n", + " ('tribute', 0.5455111817538808),\n", + " ('europe', 0.54522705048332309),\n", + " ('sacrifice', 0.54430155296238014),\n", + " ('color', 0.54405127139431109),\n", + " ('superior', 0.54333490233128523),\n", + " ('york', 0.54318235866536513),\n", + " ('pulls', 0.54266622962164945),\n", + " ('hearts', 0.54232429082536171),\n", + " ('jackson', 0.54232429082536171),\n", + " ('enjoy', 0.54124285135906114),\n", + " ('redemption', 0.54056759296472823),\n", + " ('madness', 0.540384426007535),\n", + " ('hamilton', 0.5389965007326869),\n", + " ('stands', 0.5389965007326869),\n", + " ('trial', 0.5389965007326869),\n", + " ('greek', 0.5389965007326869),\n", + " ('each', 0.5388212312554177),\n", + " ('faithful', 0.53773307668591508),\n", + " ('received', 0.5372768098531604),\n", + " ('jealous', 0.53714293208336406),\n", + " ('documentaries', 0.53714293208336406),\n", + " ('different', 0.53709860682460819),\n", + " ('describes', 0.53680111016925136),\n", + " ('shorts', 0.53596159703753288),\n", + " ('brilliance', 0.53551823635636209),\n", + " ('mountains', 0.53492317534505118),\n", + " ('share', 0.53408248593025787),\n", + " ('dealt', 0.53408248593025787),\n", + " ('providing', 0.53329847961804933),\n", + " ('explore', 
0.53329847961804933),\n", + " ('series', 0.5325809226575603),\n", + " ('fellow', 0.5323318289869543),\n", + " ('loves', 0.53062825106217038),\n", + " ('olivier', 0.53062825106217038),\n", + " ('revolution', 0.53062825106217038),\n", + " ('roman', 0.53062825106217038),\n", + " ('century', 0.53002783074992665),\n", + " ('musical', 0.52966871156747064),\n", + " ('heroic', 0.52925932545482868),\n", + " ('ironically', 0.52806743020049673),\n", + " ('approach', 0.52806743020049673),\n", + " ('temple', 0.52806743020049673),\n", + " ('moves', 0.5279372642387119),\n", + " ('gift', 0.52702030968597136),\n", + " ('julie', 0.52609309589677911),\n", + " ('tells', 0.52415107836314001),\n", + " ('radio', 0.52394671172868779),\n", + " ('uncle', 0.52354439617376536),\n", + " ('union', 0.52324814376454787),\n", + " ('deep', 0.52309571635780505),\n", + " ('reminds', 0.52157841554225237),\n", + " ('famous', 0.52118841080153722),\n", + " ('jazz', 0.52053443789295151),\n", + " ('dennis', 0.51987545928590861),\n", + " ('epic', 0.51919387343650736),\n", + " ('adult', 0.519167695083386),\n", + " ('shows', 0.51915322220375304),\n", + " ('performed', 0.5191244265806858),\n", + " ('demons', 0.5191244265806858),\n", + " ('eric', 0.51879379341516751),\n", + " ('discovered', 0.51879379341516751),\n", + " ('youth', 0.5185626062681431),\n", + " ('human', 0.51851411224987087),\n", + " ('tarzan', 0.51813827061227724),\n", + " ('ourselves', 0.51794309153485463),\n", + " ('wwii', 0.51758240622887042),\n", + " ('passion', 0.5162164724008671),\n", + " ('desire', 0.51607497965213445),\n", + " ('pays', 0.51581316527702981),\n", + " ('fox', 0.51557622652458857),\n", + " ('dirty', 0.51557622652458857),\n", + " ('symbolism', 0.51546600332249293),\n", + " ('sympathetic', 0.51546600332249293),\n", + " ('attitude', 0.51530993621331933),\n", + " ('appearances', 0.51466440007315639),\n", + " ('jeremy', 0.51466440007315639),\n", + " ('fun', 0.51439068993048687),\n", + " ('south', 0.51420972175023116),\n", + " 
('arrives', 0.51409894911095988),\n", + " ('present', 0.51341965894303732),\n", + " ('com', 0.51326167856387173),\n", + " ('smile', 0.51265880484765169),\n", + " ('fits', 0.51082562376599072),\n", + " ('provided', 0.51082562376599072),\n", + " ('carter', 0.51082562376599072),\n", + " ('ring', 0.51082562376599072),\n", + " ('aging', 0.51082562376599072),\n", + " ('countryside', 0.51082562376599072),\n", + " ('alan', 0.51082562376599072),\n", + " ('visit', 0.51082562376599072),\n", + " ('begins', 0.51015650363396647),\n", + " ('success', 0.50900578704900468),\n", + " ('japan', 0.50900578704900468),\n", + " ('accurate', 0.50895471583017893),\n", + " ('proud', 0.50800474742434931),\n", + " ('daily', 0.5075946031845443),\n", + " ('atmospheric', 0.50724780241810674),\n", + " ('karloff', 0.50724780241810674),\n", + " ('recently', 0.50714914903668207),\n", + " ('fu', 0.50704490092608467),\n", + " ('horrors', 0.50656122497953315),\n", + " ('finding', 0.50637127341661037),\n", + " ('lust', 0.5059356384717989),\n", + " ('hitchcock', 0.50574947073413001),\n", + " ('among', 0.50334004951332734),\n", + " ('viewing', 0.50302139827440906),\n", + " ('shining', 0.50262885656181222),\n", + " ('investigation', 0.50262885656181222),\n", + " ('duo', 0.5020919437972361),\n", + " ('cameron', 0.5020919437972361),\n", + " ('finds', 0.50128303100539795),\n", + " ('contemporary', 0.50077528791248915),\n", + " ('genuine', 0.50046283673044401),\n", + " ('frightening', 0.49995595152908684),\n", + " ('plays', 0.49975983848890226),\n", + " ('age', 0.49941323171424595),\n", + " ('position', 0.49899116611898781),\n", + " ('continues', 0.49863035067217237),\n", + " ('roles', 0.49839716550752178),\n", + " ('james', 0.49837216269470402),\n", + " ('individuals', 0.49824684155913052),\n", + " ('brought', 0.49783842823917956),\n", + " ('hilarious', 0.49714551986191058),\n", + " ('brutal', 0.49681488669639234),\n", + " ('appropriate', 0.49643688631389105),\n", + " ('dance', 0.49581998314812048),\n", + " 
('league', 0.49578774640145024),\n", + " ('helping', 0.49578774640145024),\n", + " ('answers', 0.49578774640145024),\n", + " ('stunts', 0.49561620510246196),\n", + " ('traveling', 0.49532143723002542),\n", + " ('thoroughly', 0.49414593456733524),\n", + " ('depicted', 0.49317068852726992),\n", + " ('honor', 0.49247648509779424),\n", + " ('combination', 0.49247648509779424),\n", + " ('differences', 0.49247648509779424),\n", + " ('fully', 0.49213349075383811),\n", + " ('tracy', 0.49159426183810306),\n", + " ('battles', 0.49140753790888908),\n", + " ('possibility', 0.49112055268665822),\n", + " ('romance', 0.4901589869574316),\n", + " ('initially', 0.49002249613622745),\n", + " ('happy', 0.4898997500608791),\n", + " ('crime', 0.48977221456815834),\n", + " ('singing', 0.4893852925281213),\n", + " ('especially', 0.48901267837860624),\n", + " ('shakespeare', 0.48754793889664511),\n", + " ('hugh', 0.48729512635579658),\n", + " ('detail', 0.48609484250827351),\n", + " ('guide', 0.48550781578170082),\n", + " ('companion', 0.48550781578170082),\n", + " ('julia', 0.48550781578170082),\n", + " ('san', 0.48550781578170082),\n", + " ('desperation', 0.48550781578170082),\n", + " ('strongly', 0.48460242866688824),\n", + " ('necessary', 0.48302334245403883),\n", + " ('humanity', 0.48265474679929443),\n", + " ('drama', 0.48221998493060503),\n", + " ('warming', 0.48183808689273838),\n", + " ('intrigue', 0.48183808689273838),\n", + " ('nonetheless', 0.48183808689273838),\n", + " ('cuba', 0.48183808689273838),\n", + " ('planned', 0.47957308026188628),\n", + " ('pictures', 0.47929937011921681),\n", + " ('broadcast', 0.47849024312305422),\n", + " ('nine', 0.47803580094299974),\n", + " ('settings', 0.47743860773325364),\n", + " ('history', 0.47732966933780852),\n", + " ('ordinary', 0.47725880012690741),\n", + " ('trade', 0.47692407209030935),\n", + " ('primary', 0.47608267532211779),\n", + " ('official', 0.47608267532211779),\n", + " ('episode', 0.47529620261150429),\n", + " ('role', 
0.47520268270188676),\n", + " ('spirit', 0.47477690799839323),\n", + " ('grey', 0.47409361449726067),\n", + " ('ways', 0.47323464982718205),\n", + " ('cup', 0.47260441094579297),\n", + " ('piano', 0.47260441094579297),\n", + " ('familiar', 0.47241617565111949),\n", + " ('sinister', 0.47198579044972683),\n", + " ('reveal', 0.47171449364936496),\n", + " ('max', 0.47150852042515579),\n", + " ('dated', 0.47121648567094482),\n", + " ('discovery', 0.47000362924573563),\n", + " ('vicious', 0.47000362924573563),\n", + " ('losing', 0.47000362924573563),\n", + " ('genuinely', 0.46871413841586385),\n", + " ('hatred', 0.46734051182625186),\n", + " ('mistaken', 0.46702300110759781),\n", + " ('dream', 0.46608972992459924),\n", + " ('challenge', 0.46608972992459924),\n", + " ('crisis', 0.46575733836428446),\n", + " ('photographed', 0.46488852857896512),\n", + " ('machines', 0.46430560813109778),\n", + " ('critics', 0.46430560813109778),\n", + " ('bird', 0.46430560813109778),\n", + " ('born', 0.46411383518967209),\n", + " ('detective', 0.4636633473511525),\n", + " ('higher', 0.46328467899699055),\n", + " ('remains', 0.46262352194811296),\n", + " ('inevitable', 0.46262352194811296),\n", + " ('soviet', 0.4618180446592961),\n", + " ('ryan', 0.46134556650262099),\n", + " ('african', 0.46112595521371813),\n", + " ('smaller', 0.46081520319132935),\n", + " ('techniques', 0.46052488529119184),\n", + " ('information', 0.46034171833399862),\n", + " ('deserved', 0.45999798712841444),\n", + " ('cynical', 0.45953232937844013),\n", + " ('lynch', 0.45953232937844013),\n", + " ('francisco', 0.45953232937844013),\n", + " ('tour', 0.45953232937844013),\n", + " ('spielberg', 0.45953232937844013),\n", + " ('struggle', 0.45911782160048453),\n", + " ('language', 0.45902121257712653),\n", + " ('visual', 0.45823514408822852),\n", + " ('warner', 0.45724137763188427),\n", + " ('social', 0.45720078250735313),\n", + " ('reality', 0.45719346885019546),\n", + " ('hidden', 0.45675840249571492),\n", + " 
('breaking', 0.45601738727099561),\n", + " ('sometimes', 0.45563021171182794),\n", + " ('modern', 0.45500247579345005),\n", + " ('surfing', 0.45425527227759638),\n", + " ('popular', 0.45410691533051023),\n", + " ('surprised', 0.4534409399850382),\n", + " ('follows', 0.45245361754408348),\n", + " ('keeps', 0.45234869400701483),\n", + " ('john', 0.4520909494482197),\n", + " ('defeat', 0.45198512374305722),\n", + " ('mixed', 0.45198512374305722),\n", + " ('justice', 0.45142724367280018),\n", + " ('treasure', 0.45083371313801535),\n", + " ('presents', 0.44973793178615257),\n", + " ('years', 0.44919197032104968),\n", + " ('chief', 0.44895022004790319),\n", + " ('shadows', 0.44802472252696035),\n", + " ('closely', 0.44701411102103689),\n", + " ('segments', 0.44701411102103689),\n", + " ('lose', 0.44658335503763702),\n", + " ('caine', 0.44628710262841953),\n", + " ('caught', 0.44610275383999071),\n", + " ('hamlet', 0.44558510189758965),\n", + " ('chinese', 0.44507424620321018),\n", + " ('welcome', 0.44438052435783792),\n", + " ('birth', 0.44368632092836219),\n", + " ('represents', 0.44320543609101143),\n", + " ('puts', 0.44279106572085081),\n", + " ('fame', 0.44183275227903923),\n", + " ('closer', 0.44183275227903923),\n", + " ('visuals', 0.44183275227903923),\n", + " ('web', 0.44183275227903923),\n", + " ('criminal', 0.4412745608048752),\n", + " ('minor', 0.4409224199448939),\n", + " ('jon', 0.44086703515908027),\n", + " ('liked', 0.44074991514020723),\n", + " ('restaurant', 0.44031183943833246),\n", + " ('flaws', 0.43983275161237217),\n", + " ('de', 0.43983275161237217),\n", + " ('searching', 0.4393666597838457),\n", + " ('rap', 0.43891304217570443),\n", + " ('light', 0.43884433018199892),\n", + " ('elizabeth', 0.43872232986464677),\n", + " ('marry', 0.43861731542506488),\n", + " ('oz', 0.43825493093115531),\n", + " ('controversial', 0.43825493093115531),\n", + " ('learned', 0.43825493093115531),\n", + " ('slowly', 0.43785660389939979),\n", + " ('bridge', 
0.43721380642274466),\n", + "  ('thrilling', 0.43721380642274466),\n", + "  ('wayne', 0.43721380642274466),\n", + "  ('comedic', 0.43721380642274466),\n", + "  ('married', 0.43658501682196887),\n", + "  ('nazi', 0.4361020775700542),\n", + "  ('murder', 0.4353180712578455),\n", + "  ('physical', 0.4353180712578455),\n", + "  ('johnny', 0.43483971678806865),\n", + "  ('michelle', 0.43445264498141672),\n", + "  ('wallace', 0.43403848055222038),\n", + "  ('silent', 0.43395706390247063),\n", + "  ('comedies', 0.43395706390247063),\n", + "  ('played', 0.43387244114515305),\n", + "  ('international', 0.43363598507486073),\n", + "  ('vision', 0.43286408229627887),\n", + "  ('intelligent', 0.43196704885367099),\n", + "  ('shop', 0.43078291609245434),\n", + "  ('also', 0.43036720209769169),\n", + "  ('levels', 0.4302451371066513),\n", + "  ('miss', 0.43006426712153217),\n", + "  ('ocean', 0.4295626596872249),\n", + "  ...]" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most indicative of a \"POSITIVE\" label (highest positive-to-negative log ratio)\n", + "pos_neg_ratios.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('boll', -4.0778152602708904),\n", + " ('uwe', -3.9218753018711578),\n", + " ('seagal', -3.3202501058581921),\n", + " ('unwatchable', -3.0269848170580955),\n", + " ('stinker', -2.9876839403711624),\n", + " ('mst', -2.7753833211707968),\n", + " ('incoherent', -2.7641396677532537),\n", + " ('unfunny', -2.5545257844967644),\n", + " ('waste', -2.4907515123361046),\n", + " ('blah', -2.4475792789485005),\n", + " ('horrid', -2.3715779644809971),\n", + " ('pointless', -2.3451073877136341),\n", + " ('atrocious', -2.3187369339642556),\n", + " ('redeeming', -2.2667790015910296),\n", + " ('prom', -2.2601040980178784),\n", + " ('drivel', -2.2476029585766928),\n", + " ('lousy', -2.2118080125207054),\n", + 
" ('worst', -2.1930856334332267),\n", + " ('laughable', -2.172468615469592),\n", + " ('awful', -2.1385076866397488),\n", + " ('poorly', -2.1326133844207011),\n", + " ('wasting', -2.1178155545614512),\n", + " ('remotely', -2.111046881095167),\n", + " ('existent', -2.0024805005437076),\n", + " ('boredom', -1.9241486572738005),\n", + " ('miserably', -1.9216610938019989),\n", + " ('sucks', -1.9166645809588516),\n", + " ('uninspired', -1.9131499212248517),\n", + " ('lame', -1.9117232884159072),\n", + " ('insult', -1.9085323769376259)]" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most indicative of a \"NEGATIVE\" label (lowest positive-to-negative log ratio)\n", + "list(reversed(pos_neg_ratios.most_common()))[0:30]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Transforming Text into Numbers" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAA
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "\n", + "review = \"This was a horrible, terrible movie.\"\n", + "\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAi4AAAECCAYAAADZzFwPAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQVdV5/xdNZjIxjRgrM52qFI01ERQVExWNeMMLQy0YiEiNEgOYaJAO\nitIaGYo2TFGQeElQAREjRa0oDEG8AKagosYYkEuSjjUEbP+orZFc/KMzmfe3Pys+57fOfvfZZ1/P\nWXu/zzNz3rPP3uvyrO/a717f/axnPatfTyBGRRFQBBQBRUARUAQUgQog8CcV0FFVVAQUAUVAEVAE\nFAFFwCKgxEVvBEVAEVAEFAFFQBGoDAJKXCrTVaqoIqAIKAKKgCKgCChx0XtAEVAEFAFFQBFQBCqD\ngBKXynSVKqoIKAKKgCKgCCgCSlz0HlAEFAFFQBFQBBSByiCgxKUyXaWKKgKKgCKgCCgCioASF70H\nFAFFQBFQBBQBRaAyCHy8MpqqooqAItAVBH784x+bPXv2mJ07d5q9e/eat99+2+zYsaOXLuPGjTOH\nHHKIGTp0qBkyZIg59dRTzac//ele6fSEIqAIKAJ5EOinkXPzwKd5FYF6IrBp0yazYcMGs2rVKjNg\nwAAzcuRIc8IJJ5jBgwebgw8+2Hzuc59ravh//dd/mf/8z/807777rtm1a5d58cUX7Qcyc8kll5gv\nf/nLSmKaENMfioAikBUBJS5ZkdN8ikDNEPjtb39rli9fbh566CHbshkzZpgLLrjA/MVf/EWmllLe\nxo0bzfr1682yZcvMjTfeaG644YbM5WVSQjMpAopA7RBQH5fadak2SBFIj8A999xjPv/5z5stW7aY\nJUuWmO3bt5tJkyblIhlME1166aVm6dKl1hqDVocffriZOXOmwUKjoggoAopAFgSUuGRBTfMoAjVB\nAP+Vk046yaxZs8Z+nnzySfPFL36x8NZhtVmwYEGDwFDHihUrCq9HC1QEFIH6I6BTRfXvY22hIhCJ\nAFaW+fPnm3nz5lnrSmSikk5CmKZOnWqOOeYYOz2lTrwlAa3FKgI1REAtLjXsVG2SIhCHAL4nU6ZM\nsRaWzZs3d5y0oBsWl61bt5pBgwbZKapf/OIXcSrrNUVAEVAEGgioxaUBhR4oAvVHANIyZswYc+ih\nh3pj6WDK6JZbbjGQqPBqpfr3iLZQEVAE0iKgcVzSIqbpFYGKIiCkZdiwYdbfxJdm4ATMEuvzzjtP\nyYsvnaJ6KAIeI6DExePOUdUUgaIQ8JW0SPtYfYQoeRFE9FsRU
ARaIaDEpRUyel4RqBECc+fOta1h\nZY+vAnn5zW9+YyZMmGD9X9Rh19eeUr0Uge4ioD4u3cVfa1cESkdAfEh+/vOfVyJ6LY7DH3zwgWFp\ntooioAgoAmEEdFVRGBH9rQjUCAECveH4SpyWqlgwFi1aZPdD0jgvNboRtSmKQIEIqMWlQDC1KEXA\nNwTGjx9vTjzxRDN79mzfVIvVhzgvY8eONVWxEsU2Ri8qAopAoQgocSkUTi1MEfAHgaoP/mwNgPjs\nl+NPb6smikDfQUCJS9/pa21pH0MAaws7M7PcuIrCNBd7G7HrdNaNHqvYbtVZEVAE4hFQ4hKPj15V\nBCqJgFhbGPSrLGp1qXLvqe6KQDkIKHEpB1ctVRHoKgKszBk6dKiZPn16V/XIWzlWF7YHUF+XvEhq\nfkWgPgjoqqL69KW2RBGwCBBsbtmyZYapoqoLU0RsA7Bx48aqN0X1VwQUgYIQUOJSEJBajCLgCwIM\n8pMnT66NXwg+OuvXr/cFXtVDEVAEuoyAEpcud4BWrwgUjQCD/FlnnVV0sV0r74ILLrAWpK4poBUr\nAoqAVwgocfGqO1QZRSA/Ahs2bDCnn356/oI8KYHponPPPdfgcKyiCCgCioASF70HFIEaIYAzK4Jf\nSJ2EHa23bdtWpyZpWxQBRSAjAkpcMgKn2RQBHxFg+fPw4cN9VC2XTieccILZt29frjI0syKgCNQD\nASUu9ehHbYUiYBHAKjFo0KDaoTF48GCzd+/e2rVLG6QIKALpEVDikh4zzaEIeI3AwIEDvdZPlVME\nFAFFIA8CSlzyoKd5FQFFoCMIfP7znzerV6/uSF1aiSKgCPiNgBIXv/tHtVMEFIEAgU9/+tOKgyKg\nCCgCFgElLnojKAKKgCKgCCgCikBlEFDiUpmuUkUVgb6LwC9+8YvaRALuu72oLVcEikFAiUsxOGop\nikDXEWCPogMHDnRdjzIU+M1vflPLZd5lYKVlKgJ1R+DjdW+gts8PBIh6umfPHrNz5067rPXtt982\nO3bsaFKOCKnEIDnkkEPszsYcszOwSmsEICvsnIwcfPDB5vjjj6/lvj4QFxVFQBFQBEBAiYveB6Uh\nsGnTJrNq1SpDCPoBAwaYkSNHGgKJTZgwwQ6y4eiuRH0lgNq7775rdu3aZWbNmmVefPFFu2Hg6NGj\nbX510jRGcKLjICsuuWOAf+edd0rr024VvHv3bjNixIhuVa/1KgKKgEcI9OsJxCN9VJWKI4AFYPny\n5Wb+/PmWrMyYMcOwSR7WlCxCeex2vHLlShvyfeLEieaGG27IXF4WHXzI45KVww8/PLb9/fr1MxCY\nOpG88ePHmyuuuMJceumlPnSH6qAIKAJdREB9XLoIfp2qhmDcc889hngbW7ZsMWvWrDHbt283kyZN\nih1k22HA4Mtg9eSTTzY22WPgnjlzpqHOOgsOqUyxyeaCWFb4tCOBbEj4+uuv1woaIgKfdtpptWqT\nNkYRUASyIaDEJRtumstBgIH1rLPOahAWSIY7feEkzXXIgL1gwQI7nfTBBx9YkrRixYpcZfqWWYgK\n37Q3KVlx2wFxeeWVV9xTlT4GC6Ya2xG2SjdSlVcEFIHECOhUUWKoNGEUArfffru5//77zbx586x1\nJSpNWecY0KZOnWq+8IUvmEWLFlV2aoR2iBRB+LDU4EeExasOwj32P//zP+buu++uQ3O0DYqAIpAT\nASUuOQHsq9mZphkzZoxt/qOPPtq1t2H0wI8Gh9TFixebsMOvj/2DzrISCP2KICvhdp500klm4cKF\n5vzzzw9fqtxvpgbXrVtnPvWpT1WifysHsCqsCFQMAZ0qqliH+aCukJZhw4aZtWvXdo20gAU+MEuX\nLjVMj5x33nkGa4OPgnMtlhU+HMsUUBmkhfZ//etftyu6fMQijU5PP/20JSvca0wVudapNOVoWkVA\nEagPAmpxqU9fdqQlLmnB3
8QnYZCbNm2a2bx5sxdv5mlWAhWNI/3EUmmWl1fZNwTL0Zw5c5pWE0Fe\nyiJ8RfeDlqcIKALFI6DEpXhMa1uiz6RFQO82ecHiI8HS2i1bFp3L+IY0sST997//vbVIlVFH2WXS\nl3Pnzo301VHyUjb6Wr4i4C8CSlz87RvvNJsyZYphNQ+rhnwWnDkJXMc0VidimbjTFywH70SdcfhD\nntCBD/qwNL1qFgpIMivVwtYWt93g7gPerk56rAgoAuUjoMSlfIxrUQMxWh566CGzdevWrg/MSQAl\nYBlbB+D/Uoa4ZMUnUhAezFkuzoqrqvSb9BXkky0h2pFkSBpTYd0mi6K3fisCikD5CChxKR/jytfA\n4MCbLSthqrBqB8B5Y0fnRx55pJCVNZRX9kqgvDcKpCWKREHiTjzxRDN79uy8VXQkP+0YO3asdcRN\n4p+j5KUj3aKVKALeIKDExZuu8FcRBj72iZk+fbq/SkZoxl5JV111lSUcWd7IXbKCo6uvpE30jCIt\nwCKrmO67774mJ9cIyLp+KquuSl663nWqgCLQMQSUuHQM6mpWJA6SVZtqELTTki53JZDPZEXah74Q\nl3akSkicLyuuRH/3m3YQG4ilz1lWrEFeIKhJrDRuvXqsCCgC1UJAiUu1+qvj2kYtR+24EjkqZDAj\nvgvTPK2sLqTxYSVQ2mZCWpAkAzVpf/SjH5mbbrrJm+XibnuFtBx99NG5/JLSYOLWr8eKgCJQHQQ+\nXh1VVdNOI4C1BanyjrxYIthRmh2r3akul6zgC9POYtFp7NvVl9a6QDyXv/3bvzWf/OQnLZHzyfIi\npIU240icRyBxkBc+SQhdnro0ryKgCHQHAa8i5z7xxBOmX79+9sNgUzVx9acdrki7+H711VfdS22P\nTznllAYuS5YsaZu+qAQrV660y1GLKq9b5bBvDzFNcPqUD4MaPiF8WlliuqVvu3ppA/onHZhJL/4v\nkFB8XSBrQkzb1VfmdQiYTA9BporoC8GFslUUAUWgfgh4RVzqB291W8Qb6+rVq83IkSOr2whHc/a5\nYTqoqmRFmiIkJOkATz8SCM8VyMvrr79uowzPnDnT+si41zt1DHFiGo8VRFl8WuL0FGKn5CUOJb2m\nCFQTASUu1ey30rV+4YUXzOTJkwt5Ay5d2TYVQFa+/e1vmw0bNrRJ6e9lplOEtKTRslXIfzChvL17\n99pAbxx3SiBTBDNkewaC47lTeEXqALmDwCh5KRJVLUsR6D4CSlwK7IPLLrvM9PT0ND4FFt3xotiN\nd/To0YXWu2fPHsNU13XXXWf9TtzpMzlmWoxpwjvvvNM899xzDafZvIqcfvrpld10kIGej0z3JMWi\nHdFhUCfAG7trY/VgBVaZBAbyRSBD2kFwQBym07YpadslnZIXQUK/FYEaIRAMtN7I448/3hNAaz+X\nX3651evBBx9snOPaLbfc0rN///6WOm/bts2mkXL4PvLII3so58CBA5H53LTk53PhhRfaesmHJEnj\n6k96V8L5n3322UYdXKM+zkVJsDy0Ub/o46aLajP4oU9WQafgbT1r9qZ8Lp4uDkmO6bs77rijZd81\nVdTmRzBQ9wSDZZtUfl2mD7L0Q9p8wTRaz913390DRuPGjevZuHFjYUCA+Y033tgouxt9QPuC6bHC\n2qQFKQKKQPcQaB5du6eHrdkd+Bl4+UQNbgxmUeSFAS4qvZwj3+7du3u1Uq7zHS5DiEKSNK7+pHfF\nzQ/5cn+7x1KfmzeOuJDezR8+pq60wsASRFpNmy0yfTv9wvq2+g0GUX0eWWmLk8HUV89TTz3V4qp/\np+mHLKSFlmQdpBngH374Ydv/kBgIByQmrR7Uf9tttzWVk7aMMnokKy5l6KJlKgKKQDYEvF0O/dhj\njwVjWLQEA5iNR7Fq1apGAqYgbr755sbvqAPyXXzxxWbXrl2G4GJR0q4M8iRJE1W2nJs3b54
c9vq+\n5pprzAknnGCY2mgnTKWQPk6oa9CgQWbq1KlxyZquMaVzzDHHNJ3L8iOJfknLffPNN63PDWVmlaFD\nhxrugSoIUzas/EnqhOu2qd0UkZs2fEx9kyZNsh98Q8B78eLF1lGbbQO4L7ifBg4cGM5qtmzZYt5/\n/327weW5555r+CxcuLCQLRd6VZbxhPj2lD1FlVE9zaYIKAIJEPDaxyWYPjGBhcT6jATTPIbfIi6x\nYbUIm7KJEHkzmJ6w+QI+ZwKrg1yyA9cDDzzQ+B11QHry8Wk14CdJE1W2nAssEY06AkuNnLbf7K+T\nRNx2uVgxOAfWqkYRYANGSYX8hPjPK3fddVevItCTtvOhj8KfYLrMXqNt9KMrzz//vB1I3XNpjocM\nGWIH1zR5upFWiEcW0hK1iihrG4htg+MsfjD8L3Cfzpo1K5K0UMe1115rl52TlqXN7I10/vnnZ62+\ntHxCXvC5UVEEFIEKIhA8ZLyR8FRLMIA26YavRABx4yM+K+F8pAuLO+3ElJErbpmki5IkacJ6uOW4\n+YNB2b1kj8NTKtI2LkZNFTHl5ZYZxor8tFPSoFtSwdeBTx6hfqmb71bTdO3qCOPCVF5WYZoA/w1f\npQg/DJ0KSd67TMWBuYoioAhUCwFvLS68bR9xxBHBmPf/JfxbrAg7duxoJAoGyMhplq997WuNNFgU\nmA6JEuJKtJMkaeLKuOSSS3pdPvPMM5vOvfvuu02/wz+Y7hLBihHGhqmwK6+8UpKYX/3qV43jdgeY\n/LFO5JEwvsTpGDx4cOoisXi5lhemjOooWVcOuViIpcY9p8etEcCiBO5qeWmNkV5RBHxEwFsfl2OP\nPTYxXu+8804jbZgAyIX+/fvLof0W0tN0MvgRThe+zu8kaaLyybkwyeB82OemlX5SRmDRkEPDFArL\niePkgw8+iLvc61pYn14JUp6I8olIWgT3Ql0JCxgweCJ5th0ocorIKtNH/oA5vjwsDc8yNddHYNJm\nKgJeIeCtxcUrlFSZ3Ajs3LkzUxkQuJdffjlT3ipkkuBoDJx5JFixk3gLgDz11DGvWF6EQNaxjdom\nRaBOCNSCuLCjrEirQc61UJC2aIuC1J/kG4fjsIQtLFFWGTePa/XBETeYoYz9fOc733Gzxx6zaiQ8\n1RObIeJieFUUDsJp91liuuzv//7vm1YC5Z2mi1C1a6eY2oGw5CUtOkWUvwvF2qXkJT+WWoIiUDYC\n3k4VpWk4yzRF8F9hE8PwwBnEppAkBj+YLP4WjQJyHuBDctFFFzWV4hIu9GtHXI4//vhGfvJCfIoi\nY0zrhIleo7IUB6wyYSktQr+wdJsPROszn/mMOfnkk3uVxpQW00L//u//Hjk91GoqsFdBnp8oimx0\nYoqIOl577TXbh9y7iCx75hjiNXz4cA4N/4vcP/z/CRmwFyrwh3bQVj55yWQFmqsqKgKVRaAWxIXY\nLAz2DI7It771LfO9732vQV7Yp8ZdPu06rXaj58KxVdhV2o3HkkQ/iBdOqwzytPsrX/mKWbRoUYOQ\nUSYb6Akmwaoiw5YESaUI4sJeND/84Q8bOkjdbl/IuSTfLJHOQzixImFN6qbgCBqsZiks1D1TRGXE\nJGEKi3uIjTbfe+89S0xYIn/FFVc0SLXUy0CPHgjL27du3WrvRfKNGjXKbh3Bxo5VECEvtL9qxKsK\n+KqOikAhCPi0CMpdThy1LDkYhJuW2PJbJLxsNgCnKa38DghOr/Dxco3vVsuGk6Rx9Se9K27+dsdu\nuygjajk058P1tSqX/GmkyGXDbGMA5q10S3o+sN706rc0bSItkVzzLvNOW6ebnsixLMEtSspY+kxk\nYaImBwO4xStPHbSXKLxBIDpbHthzrgoSWDAL7asqtFl1VASqgkAtfFyCwc8GinMDsnEuLFhlCHBW\n1JRKuPykv4NYJC2TEpit3TSRZMaCQvo4wSqzdu3auCS
9rh1++OH2TbvXhQwnmBJj6TZtBv+0wrQS\nffb9738/d7+xbF6mNNLqkTc9VgmkqLd4yqOfipKnn37anHTSSWbu3Llmzpw51oJCADmxqmSpB+sF\nUXgJRscu0Pv27bM6s9Gi70uQWWGE/uI8naX9mkcRUATKQaAWU0UCDQ6omLOZh3fD6jNg8hCeMGFC\n7sFP6srzze7Hf/mXf2mjjMoyX2Kx3HDDDb18X9rVQ5wT/D7Wr1/ftBUBhOWb3/xmy8i/ceXywMZX\noShzOUTxpptush+ma8SfhwEtLDhaM52DnwQkoyiSyUDJtMfy5cvDVZb+GxxlICyqMqZm8pAK0QPd\nmEp9++23LWEpa0oHXflwjxONd/78+YYI0T5G1hVspM+K+j+QcvVbEVAE8iHQD9NQviI0dx0RwD8G\n8sAgUwfZtGmTgdhGkaUy24cTbtY9h1rpVZRj74oVK+x2GITxv/rqqzsax4T+uOqqq6wPDL5ZPsdQ\nKdovqVW/6nlFQBFIhkBtpoqSNVdTJUUAp0rM+3UQBsmVK1faaYtOtkcIRpGDclFTRBBTplbpY8hp\nkTomwRhLC07KrCIbM2aM11MyYIO1iP5UUQQUge4joBaX7veBtxrgQ4GFoii/jG41lDdm2rB06VIz\nYMCAhhpFTLU0CnMOynxDFzLkVJfqEN0gCgi+T50mLFHKQqLY6b0K91pe/KPar+cUAUUgHQJKXNLh\n1adSEzSOZdHsM1RlYUpk3bp1dpdjtx3hN+gipnSwiAhRcusq4jjvoOkjaRFccA5m+bySF0FEvxUB\nRaAVAkpcWiGj520gLqwuOILisFtVYbXMwoUL2zqC4oTpRjCm7WnaLSuH0uRJimkRZY8fP94GjvPF\n0hJue9XISxFEN4yB/lYEFIH2CChxaY9Rn06BGR+pqtUFawvOn9u3b0/dj5AFSJsIK5xaTZuVsXJI\n6uU7r7UF69mLL77ozfSQ2zb3uCp6ojN9Dkn1YbrNxVCPFYG6I6DEpe49nLN9DN5YHnCkbDVo56yi\ntOxMjfBWjANqEf4slAcOrojTZplv33lJi6zgoZwyrEEuHkUcYxk65JBDrE9SEeWVWYaSlzLR1bIV\ngWgElLhE46JnHQQIGMbg3+mlxI4KmQ6nTJli8+GUW5Zg0XG3ISiCILm65p0iEvLme8wUt81V07ls\na5uLjR4rAoqAMUpc9C5IhAC7Mo8dO7YycV3KtjKI9SVMVLBquJLXEpPX2tIJ8ua2t6hj6T8sXFWY\nislLMIvCTctRBPoCAkpc+kIvF9BG3iohL1V4cy9bVwYpiEuSqTN0yerwm5e0kB+yWZXBP3ybMmVE\nBGeiXldBlLxUoZdUxzogoMSlDr3YoTZUYdUHhII4JcHGfqUMeHkHJ/IncfjNWw+3BAM/W2BUNfox\nGFRtVVsR/dahf2etRhGoLAJKXCrbdd1RXMLE+xhvA9JywQUXmC996UulrIIqw5dBppzc3hSH3/A0\nlJum3XHVrS3SPla19e/fvxQSKnUU/Q15SWqRK7puLU8R6AsIKHHpC71ccBt9jHQqlpbjjz/eXHnl\nlYWsInJhgwjk9Vdxy4s7Djv8ZqmXPqrDXlMy7edaqeKw8+Ua9yMEJsl0oi86qx6KQFUQUOJSlZ7y\nTE+ZNvLB54XBjZ2/R44c2bC0FEk0KCuP9SNN10VNNaT1k2HQJOZM1QMHCm74Vl1//fWmrJ2rpZ6i\nv5W8FI2olqcI/BGBj/1jIAqGIpAWgeOOO84cffTRZurUqebDDz80Z599dtoiCkmPdYJdhm+99VZz\n8803N8rEN2Lv3r3m//7v/zKvSmHgeeuttzpGWlAeR1osLK4cdthh1teDNvFBL9JBcvj87ne/M6QR\neeaZZ8x///d/2xD6cq7q3xs3bjR/8zd/U6lmfOITnzB8uIfoNxVFQBEoBgG1uBSDY58tBWvAtdde\na9s/f/78jg3yDNg
4nb799ttmyZIlLeslHQN9WpN91nx5boSslh0hMlL33XffbX19Jk2aJKcq/U1f\nYPGq2nSRC3rWvnXL0GNFQBH4IwJ/okAoAnkQgBDgqMuyWz74VjDQlCUM0oSF5w2WpbJbt25tSVrQ\ngUixfBg4koron5bsJC0/Kh11Zn0rJ84JA7t8du3aZT71qU/ZkPRRdVXtHP3Hrt5p+tC3NtI3Vdbf\nNzxVn76NgBKXvt3/hbUe6wfTFwgDMIHPCCJWlGDZgRThu8GO1bx9E98jSXAyGdgZOCA+cUI9SKdD\n4xfljwIB2rFjh10K3UniFYdpEdfwX9qzZ08RRXWtDCUvXYNeK64ZAkpcatah3WwOBIHNGAm4NnTo\nUHPjjTcadmaGcEBi2pGGsO4QDawrlIGDJstit2zZYubMmZOJWDBwMLCLRSWqPrHQhK+V+Zt2olsR\nAgEaN25cEUV5VQYrpHbu3OmVTlmUEfKS9n8hS12aRxGoKwLq41LXnvWkXVgwnnjiCWsFWL16tZ3e\nOeaYY+w3RCQsEJP333/f7mRMEDk+Z5xxhjn//PMbSfMO9BAXBg7XIpG3zIZyKQ+ERBVl4cFZmQG+\nqrt5t4KP/sGH6sknn2yVpFLn+b+gz5NYDCvVMFVWEegAAh/vQB1aRR9GAHLghmzngY1FZtu2bZGo\n4OjLdFCcBULeWuPSRBb+0UkGDIgLgyEreJjiylpWXD1JrmEhKbJuptGwTqj4jQD/F0pe/O4j1c5f\nBNTi4m/fqGYxCBRhqaCMV155xVx00UVdefMtw8rDTt5IVcP8t+pyiCaEtqenp1WSSp6HvGB1Kcri\nVkkQVGlFICUC6uOSEjBN7gcCYjXJ6isgxIf9fDjOWk5WNKgz6yqirHVWOV9dp1RkulLuxyr3kequ\nCHQKASUunUJa6ykcAR76spIpTeG85SLylks5DBydHDyKWkWUpt2a1k8E5D7s5P3nJxKqlSKQDAEl\nLslw0lSeIoCPihCRJCoyPcNAIYOF5JE33zRlSd6032VMEaXVoWrpGdTDfVa1NsTpK21T8hKHkl5T\nBP6IgBIXvRMqjQBTCHySPPCFMLSadhBCQ7qyBD11iig9uliohg8fnj5jhXIIeekEea4QLKqqItAL\nAV1V1AsSPVEGAgzYr732mtm/f7+NxUIdsuyZY6LgskxajlkZc/rppzctWbYXI/7wwBdLSsRl67+S\ndOUQpEZWLWXZlTmqfvdc0auI3LI5PvLII8369evDpyv/m5VofUG4l/G3gryIFTBPu/m/+9nPfmZ2\n795t90z64IMPmv7vqE8IIf+D/N8NHjy40JVuefTXvIpAFAK6qigKFT1XCAI8fInhQvyW9957zz4g\nR4wYYYYMGWJXiFCJLAUmLYMTH3nIvvHGGzbfxIkTzahRo5piuUQpKBYV9xoPbgaCLIMAOkFk5E3Y\nLTfLcZR+WcqJy0Mds2bNstswxKWr2rW6rpZq1Q/cs9y7We9b+b8jijIBCfm/g9QeccQRtsrw/x0n\nCVGwb98+w4aW/L/yPzd69Gi763orK2Ur/fW8IlAmAkpcykS3D5bNA5cH39y5c23reWhedtllmR7A\nFAB5ePXVV82iRYvsw5RB+eqrr45cvhx+2PPgR/IQjzzEx1b+0Z8idHHLa3UMBiwbhgBmHfhald3N\n82whwcDLbuR5+rObbUhbN32Z1FJI2U8//bS599577f9MUrLfSifunRdeeMEQ0JD/wW9+85tm8uTJ\nfQb7VrjoeU8QCOIiqCgChSDw1FNP9QSDSk9AVno4Llpef/11WzZ1BDsg9wSDc68qgge9Pc93MC3T\n63qWE9RD3Xkkb/4kdVMHn1NOOaXnX//1X5NkqUwa+lz6VNrZCUx9AKhdO4MXhZ5gmsd+yvi/A/dg\n+w4C6NjvqP87H3BSHfoOAgR0UlEEciHAgy0IzW8fnO0esrkq+igzdUCOeFjz0A7Lw
w8/HElqwunS\n/qbeLA/tsjChXPcj7bntttt6+NRFuL/o6yhx218UUY2qp9vnou4h2sv/AaSuDMISbjP1BVYXWx//\nYyqKQLcQUOLSLeRrUi8PMN7EsIB0WsTCwyAthIIHPMcMdmUI5aYZIEmbJn2cztTtDtSt0sobeKvr\nVTtP//LG307AOQk+7crx9bpLXqLu/U7pjR4QSUiM/N91qm6tRxEAAfVx8WTKrmpqMP+OHwv+LI8/\n/nhmH5a87WYu/qtf/ar5wx/+YIIBzpx99tm2yDJ9Siib9idxnMzjkEs9wWDcgCjNKieWXK9Zs6bh\n/NwopIIH7A6+ZMmS1G0BexHwCCwT8rOy37TpBz/4gXV472b/cv/PmDHD4EDfzf//ynakKp4LASUu\nueDrm5l5aI0ZM8Y2fu3atZGOsp1EhgH+1ltvNc8995xdTSOEIg9paKc/GAQWkNjBNG39UqbUnWew\nvf322w0bLlZ9l2gcTiHI27dvF1gyfbskEOdluUcyFdbFTDNnzjQvvfSSeeSRR8yxxx7bRU3+WDWr\nvdi1e/PmzZXFtOsgqgKpEVDikhqyvp1BSMuwYcO8GRTRieWaPNRXrVrV9BBNSx7S9i7lR1lCGCiR\ndm/55BcpckClfogPFpt2Okj9Pn6zl9QVV1xhLr300sLUCxPEqP4rrLICC+L+fvPNNw0vC7TBl36F\nXE6bNq3p/67AZmtRikAvBJS49IJET7RCwEfSEtZVyAvWEMgMOjOIl/mGHRXvpRVhcokKuks8jXA7\nivgNFkhVrS5gNXbsWGvZKjOOCP0X+GpYrIokj7bAgv64pKVMLLKqq+QlK3KaLwsCSlyyoNZH8/j+\n8JRuET0xXyMMTLydlvnAhxxBkiBILmlxB0V0KZOoUL4r6ER9VTXj49syZ86cQq0tLj5Rx/QhpFfE\nB2sM0zEPPfSQ2bp1a6n3sLQ56zcxX4i35LueWdun+fxBQImLP33htSY8lG655ZbS336LAuG8884z\nwRJtM3v2bFukSyaKqiNcDoMeDpN/9md/ZgYMGGAvd3vgY9BDJyFxYZ19/e2L3i7x7IY1RqxOVSGf\nBApkW4Enn3zS11tL9aoBAkpcatCJZTdB3ty7uYohbRujdC6DvITf0AmVDmnpNmFx8YLEMeUyffp0\n97S3x5AF8MPyUeYUX1oAwn1ddh9TH3XMmzfPTJo0Ka26XUmPzmeddZZdcVQVnbsClFaaCwElLrng\n6xuZcZAM4jY0rBdVaXXYdA2ZQfI6NUKARNy3cJcYdWJ6SnRo940ukBfM+Gy/4LNUaeBzrTF5VoC1\n6g9WhrHXUNWsF2Il4jvv/1orbPR830ZAiUvf7v+2rZeHkDi7ts3gWQJIFxvMibXBJRdJVXUHKPJE\n+alEkSLy4Vfjw8P7vvvuM//0T/9k/u3f/s0rK4bbB5AWltn7tGLN1S/umP53Y+5E3SNx+cPXKK/K\nq8Kq7hge7g/97RkCGodPEYhDgAiZnQgnHqdDnmtE+QyIQ1OETzcCaVTZAUlrisCaJDpoqzKJ5kp5\n3RR0ow1EOQaLbuvTCgui47J1RBK8W5Xhy3kwlw/3QFoBi25Eo06rZ6v0tDkY6nJHjQ52rLblUNaD\nDz7YqjpvzwckPJP+QVC/Rj7a3mkJYkD1iO7XXnttp6tvW1/nEWmrkv8JpEOj/pnirvnfsmYN6xI6\nnv1c3EGAgdEdvHnIyiAjg3wzEvG/yBMn1NcuTVz+rNei6mVA9I28oCfh4+tCWsL9Fb6/wtfDv2XQ\nB5cqC/canzwiz1O+qyiif9RYwTn5QFRc6TZxQRdXh2effdZVr+vHfxIAp1JTBPr162fk88QTT6Ru\nJcHcCOtddZk1a5ZdTuq2Y+fOnXbahKkjBNO+fNIsmxaTvlt2+JjyKJu6mA7phKAXn/CUBTFdcPbE\n52XTpk2dUCW2Dpkeeuedd2xgtTTYxxbs0UWmC
uXekvuAe4EPfRSWu+66ywQDvtdLn8M6R/2+4YYb\nzMKFCzPf82zzQMA9hP9hlc4igD9cQLxspawo9Uq6Tp0qqEAci4671ummBjdaS0bfTpe6vPVJO//q\nr/6q53vf+561fIi1pQgrSNoyqBtsy5Qkdcgmfa4lqkydosoGO6w/ed/Ko8quyjnXGiP3pW8WsTxY\nYu3MMtXMVMWRRx5pn19811HyPJ87hQfTc6KnT1N1anHxikb6o8wLL7xgAvN95d/6QJQ3W94eeKvn\njVeW2Mrbb1bUKZcy0ojUjeNuGYJOvOHziRNC6BMbhCXuWF/K0idKB6wsrJhhiTZOw1WN7BvVtrTn\n6CfuIT4cf//73zfuSrW05fmWnu0aVq5cmVqtYJrC7N+/3+a7/vrrm/JjiRFL8re//W0b9fjOO+9s\nnOMaaYKptqZ87o9XX33VkFfK4XvgwIFt82G5njhxYlM+freyaJ9yyimNtOiESH5XnwkTJth0Ug7f\nrm6SNtzOPXv2yKXGt5vmoosuapznIKrdcfqPGjWqkf/+++9vHHf9oJPMzbVGBA23zlbBzdmkQmCS\najA8mHb4ulsGx2GBqbsskXpa1eXmJY9bdlweN12YhcZdoz6czdw2Us/ll19u5xNdfeTYLQ8syH/h\nhRc2YRTWgfKk3eHv8Fyq1BP+xucgy5tSuBxffvM2i6NxWHjjzWIByZpP6o/yP5FrWb7zlIfVJRg0\nreUjCxZJ9UVHqYv7q8y6kurkW7pgh/MePnUR+pxnEN9pxH3u8cxzxX2+8yx1n4fu844ywuMH5dxx\nxx0tn4/kZ9zZvXu3W6U9Dj+33bo45rkbFrcd8pxO8nx2/UsoWwS93HqlTLnOt1iqSOded3Fzy5Bj\n2hclLr6++Lr8f0SiNC7gHDeO23ABSb7DNwnpXeBdMMOdGb6h6VQ3r9ThfofzpNUPSKJuRoEq7lqW\nGydcntsW99j9p0nyjyH6tvqm7LoNLAzOUW2C1KR9sKadImqFM+WkrTtcFm2SaYbwtaS/KYMpG/qd\n77zlufVSthAWpg6Kws6toy7HOCjXDR/ahKN/UgkPzuF87Z6jrZ6LlJM0L+MIL8Ei4bHHrcM9pnxX\nws9vriV5Pofra1VmeMVPGDt+IxAOV89Wx2H9yesSvXB9XO+GlE5c4kiLgBe+ScI3l7Bm9yYIA+jO\niUq5Ud+U4UoW/Vw9wh3d6lrWG8ctL6o97jlhw0n+MVwMwsetrBPhdGX+dttQVD1x8+1pBos0aZPo\nDt5RhKrsvFHlowdv/JA8LFQcZ2kvbWL5NZYV7lG+s5QTpWOdz4FVN6WM/zvuoTS+VO7zn+dzWNzr\n4EUaGSP4dtvAdXlZDY8RPFvlGnWELSoM2CJumdQnpCb84hvW131+h8cKdJMPRMWVOOLiEgnGTldc\nbFxdXD04L4QmjFd4LKZsV5dwfW7dnTwu9b/EbTAd5HZc3DUAcIHmhnLTA57cqAKW22HhutyO5poM\n8G6Z4Txx11zd3DaF9XavuXnS3DhuvrCOYTLk/qOhC+nlQ3uSCm9HDPLdFPdBIQ+JvPrw8Gz1AMXq\nkcTKwMCelWTE6U+ZSep3yyB9XmuNW174mPuAQQcCw33EmzMERHAMf5OW+wbSw4e0TDeWqWNY5yr/\nhtiBcTeljP877gHuhaTCS6k8t8IvqJQRftaHxwKeF5Kfb3kuhp/pLmkR3dz2u4O0+xx2n+vkCz+H\npSy+4/K5Ooafz2Fd3TJbWVVI42IneobTR+FFW0WfsC7gJNf4Dud3devUcanOuT/60Y+Cdv5RAvJh\npk6dKj+ts2QAbON32PEnWAHSuMbyzfnz5zd+s3HeEUcc0fjNgRsWO1zXTTfd1FjWRdq33nqLL5NH\nP1tAwj84UMmyPrIsW7bMDB482OamHQ888IAJbhz7O7gpTPCPYI/Df8LtwvEquFEbydjcrAgJbnQb\nbbaIsoooI
8oBLUu5YLxly5bIrLIMt91y5YBgtHV8jaygzclgoLfl4lybRMQJV/ROkidtmvPPP99u\n87B9+3br6Mj/IPvQtJL+/fvbZavoBk5Lly61OzuXqWMrXap4PiB45tBDD/VG9aL+73jGpXk2vfba\naw0MjjrqqMZx1EHwEthrLMC52X0u/vKXv7RZ2T5BhGfB6aefLj8b31/72tcaxzyLBYPTTjutcf6a\na64xOMCKAzDP4WDAbnwaCUs6YOwICFGj9Jdffrlx/MMf/rBxfOaZZ9rjXbt2Nc61wuvKK69spPnV\nr37VOOYgPNYyPnRbPl6mAu4NSNj1sLgeywzs/ONy0yHcVAzUkBZEBn46zCVA9mLw5/nnn5dDu69O\n48dHBz/5yU/Cp0we/XoVFnMi6Y0jbQ3fOFJ08OYrh43vk08+uXFc1wNirkQ9ZNK2N/wPGM7Pih8G\n3VYrheKuhcvK8psBnrqpp9UGfhCrwNLSUscs9SbJI7q1wiZJGZomHgHfXhiK+r9j64LVq1fHN965\nykalIocccogcRn5/4QtfiDzvEp5f//rXNg2rCkVkUJff8g35doUxCZk2bZpZvHhx49LNN99sjyEx\nSGCpMZCe8Coee7GEP9QnY+JPf/pTWwMkC7KFQFDk5TiwQNlz/GGcZLVSnLQjmW55ceWUea1Ui4sA\nSwMuvvjipuVdgCdWBmmg3CTyG9YcTuNaYiTdu+++K4f2m2VtSSSvfknqII3b0XLjuEvdOBbSQvpW\nN07SdlFGHsEqEcY9T3l587J0Vt588pbVLr8Qh3C6JIHmwnmy/kYHCSDnliHnlDy4qOhxWQgU9X+H\nNTGNyOCbJk/ZaSEBEEtepqPkscces2MclphOiGv5FCvL+vXrG1WzR1udpVTikhc4iEz4JuYtQKV8\nBNpZJ5Jo4MZbCBO1dr95EIhwD0B8eSh0gsDwhhiOaFrWFJG0MfwdjvcicVbkfDi9/lYEBIGq/t+J\n/u40iJxr9S3WlPB1mR7i/NFHH20vyzc/3OkVe/GjP+5LJqdkBoBjyMt3vvOdxpQQrg583Jc8LDGd\neEZhgZZ6eT5SZ+CThppWXIuSa0XCUuNOa0Ud00bfpVTi4t6AgYNPW8DCgyWMPyycC1tY3JuL9Pv2\n7Qtni/ydV7/IQiNO1vHGiWhmqaf45+ShEHVPFF0xb4hMyYi/S9lTRK30F7+XFStWWP+XtG+urcrV\n84pAUgQ6+X8nOh122GFy2NL6LAmYvgmPB/x2p3UGDRpkk7tT7bSLYGxhCVbCNU5BDCArlOe+aAkx\nwWWBD64AQiLI7LoGNAor4cANzEeQP3GXcKeJqPb4449v1A5hC89sNC4mPOiU5T9OnVKJi+vQlNZS\nQuRAeevmpqAzEG4496bkHMTFvXGifEQAW24+iWCYRz/qTSpF3zhJ682ajnll+efMWkbV82HZwJek\nk1NEYczEn2XSpElWFyFS4XT6WxGoEwKf/exnG81xLSeNk6GDYMVSg7xAMvjtilgfsNq648S3vvWt\nJvJCJF0Zc8gvDqu8ULv5wi/PvJQzLom4L6pyrt132NLTLj3X3eki19UgPE0E+ZKXdPT8yle+0vR8\nZ6x1x0eJ3is6hIkh5XVbSiUu55xzTqN9ODEJYeAkYFx33XUNMkFoZBEY4cyZM+WnXdkwd+7cxm86\nKcyW5SYjEW/mzz33XCM9UwzujSU3clb9GgUnPMh74ySsJjZZmn+MoUOHNvnlxBZc44s4yL7yyiul\nrCJqB1vYn6WV30u7coq4DmHC6nTPPfdYixcPxqgP/7OkYfPG8FRbEXr0hTLS/J9WBQ+mOdNYC90F\nB//xH//RtplYGiAWvJjyLZYHMuInKQMtL7isSBXBx3H48OGNMcgd/CnH3djRtW5AbqQ+6oQQiXCe\nMtMK4yNlhUlDXDnudJGbTsY395zbFvAZMmRIo91sNyDjIwSH7VFccY0OGBD
CMxxu2k4dl7qqCABY\nQilOsHSOeGGHG+gCSx4BkhsBYAGL+TlhxLBld6UQN6h747k3k1uXeyNn1c8tL+kx7aMdiNw4UXmj\nbpyodGnPCfbBGv1eN2ZUWUU8QFk1xttIkZLnnwYr0qCPzMZJdMLicsYZZ9hBOM2DN0nZcWl40LOK\nJ+zPwm8hNGXrQz3sV8U01YsvvmiC+CL2rY03M/d/1W0H+HLfYBFlFQmm+SCui32wq0Oxi1T0MQOe\nG/YhOlX7s7793/EimmYwd1eb8qwkf6v/e8aE999/v4msCEIMsv/8z/8sP+03Uzt79+5tGiuaEgQ/\nGHMISeHW+Y1vfMP6kLikKJyP3xAPN19UGjmHfu3Kk7StviFUssKJNJQpRM3Nw1jH/2ar8Ze0jD1r\n1651s9ljd7HIyJEje13vyomyA8YEBCQ25H/Q6KbAdIHndlOwGwmig55x17geDtpD2e4n6NRGxEPS\nI2n1I0/QwY1yXf3aXSOtq0/4mHLRxxW3LgIBhcUtM/B4b7pMe8N1hIMLNWX46AeBsLodgC5Kr7zn\n0kTwJCAcH6STEV+pK3hQxzYVvQg+V4ZI8EHuG0L/87udPq30oC0SwI4gdkTSzVpWqzrqdJ4+Bae6\nCf2edgdw99lFgDdX3GdeQFzsM51nn/usCz+X3fwcU2Y4T0BY7FgUDPDh5I3flOvqJnVynvEpLO7z\nO6wT6YMX6Sa95fkcHsvC5cpv2iE68B2uQ9LJN3WGA7KiY1w+t71RY5CU3clvHGY7IgDjAgDI3Dhh\nINw0ABoW92bjRgsP9HSMm4Z62nUMdSTVj7RxN2PcNfKmvXHc8sJYUR56y41Lu12J+8dw04WPGRiD\nN/rw6cr/howxECeRMFkJ/05SRpo0DOhp6kibvp0u1M2gWRbBEELEfdUqenE7HfvCdf6X60buIC2Q\nlzTiDtzh55r7zIO4qJSHAGOIjC+MRb5Ix4iLLw1WPZIhwABW1lt9Mg2KT5V0UIgiEK4FpmjN8lhQ\n0DXPQEfdhGOHUKQdXLLggL4QSO6vKJyzlFmnPGnIdVXanaWvsXrwYsr/LN+uFUSJS+d63rXOiDWo\nc7W3rqlU59zgplOpKALMZYYdoCvaFKs2DqP4abQLP49vB3FcwoJPCU6qRa/syRufJY/TLpiQn1Vk\n+POweqlsoT6255gxY4YZO3ZsR5a3l92mIssnwviGDRuKLLKrZfH/RCTctD5O+ImII21gVTfBoNnV\ndvTFyoMXInPvvffapgfWlkS+kZ3CSYlLp5CuWD14puOYWQdhgMbpDOLSTgILRMsVELJEul0ZSa/L\nagtIUR4RJ14hQUnKYknnVVddZR555BGzYMGCtoQuSZlp0kCSWKmE4+95551XOCFMo4svaSHF3Av0\nSdEEuVttxMF74sSJmarHkTZwHbB5w3vZZSpQM6VCALIIaUTchS+pCikpsRKXkoCterFYXBgIeWOq\nupx66qn2je24446zgyUDZpQkCTTHEuk0BCGqHs5RF4NUOwtQq/zh85TFp1Xb3PQsW4YwbN682bCR\nYrcEfdGBtzliUhSBa7fakrVe2kyf8eF/jWXm4BJMo2Ut0qt8ixYtMu4qobTKkR9hZaobTiNtOZo+\nHQJYWyTYZzBd1LE9mJJq2Y9ZpKSJNV3fQkBi6fBGXmV5+umnrcmTQVLEHeAxSwuBYNBoJ0LmkqQN\nl8WbNNMyaU3n4XLiftO2Vps00qcMAligpM1xZXXqGnqtWrXKEhmxIHWq7k7Ww72DVU8kqp+wdK5b\nt65px3tJX6Vv7kOmA932Vkl/1dVfBJS4+Ns3XdeMhywDLAOtT4NcWmBOOukkM2fOHHPppZdGZoVM\nPPXUU434B/i4tCMlPJTTkg/wpK5ODMy8ydNnbjt8JS3SKUJeqn6/SXvk2yXJSe4t7hEIDfnc/pPy\nqvKN9QifnenTp1dFZdWzIggocalIR3V
LTQYTgo5V9eGDtYWoy9u3b28JYZiEJHkrprBwvpYVBBei\niERc+iKuuUSJiLYPPfSQ2bp1q9ck1HdylaRf6GtM7SJpCS756C92aceRuYrC/wbWlrqR0Cr2RR11\nVuJSx14tsE0MfrwlxjmtFlhdoUXx5orvxMKFC1v6ctA+JO7NttVARPnkb2dB4SEeNSVQaGNbFIaO\nWJP+4R/+wfq1tNO1RTEdPY2zLo7Usqqko5VnqAyMGaBFiuhryqScNWvWpLbsiR7d/FZrSzfRr3/d\nSlzq38e5W4iT1o4dOyr39pdE7zRWEwGSPCL/+7//a1iBFTWVJgNaljduKT/vtwyA9913X8upsrx1\nFJ0fMghmrK7ppvNwXLvcewAfqTIIIb4uOKf6biUL4yR6x1k5w3n0tyKQBgElLmnQ6qNpGfywXBB7\noxOxPoqAmYEFUzXfrawpXMtLKsAmyj+GwZdrZQxoafBh6oW9RpYuXZomW9fTyhSfL4M2/ek6mea9\nb5ICjOUiCOBWGesT1kksZlW1FCXtF03XXQSUuHQX/8rUzgMJ0zUm8W4Pxu1AS2JlYCBCWpGadnW4\n16mP8sCFb3aUPuigg8yAAQO6NkWEfjKIxJE3tx2+HXd7ugHcRJI41UraIr+5nyBJVbCY8X8wZswY\n+8JQVZ+4IvtOyyoPASUu5WFbu5J5C542bZrXS1bl4UlskLhl3EVYW9wOFiLEW7nr49DKP8bNW9Zx\ntwf+vO2ijzrp4NnNvorDigCKBAtkiTT3ta8yZcoUa92rqkOxr7iqXr0RUOLSGxM9E4OAz6s+hLQc\neuihsf44RZMW4KJupozaTaW5b/Fl+Uagj1hbqr6qo0zyRZ8V7VQL9mXIv/zLv9ipWlYa+Wjx9Pm5\nUEZ/aJndRUCJS3fxr2Tt8pBavHixNw9RIS0AGhdcTSwjRUwRSedRJvUzoKQhReGBs8jpCPqof//+\nlfGNECzD32J1cf1LwmnS/O4UcUyjU7u0cn/t2bPHS4unPA/i/u/atVGvKwJpEFDikgYtTdtAgIeV\nL5FOebB/9atfNUcffbRdhRG1wkcUT0MsJE/cN5YNN9AbZAR9srwVk88doN0ppzgdwtfQgby0tUiC\nFq6nU78JIBi3pD1OjzCmnXKqjdMpzTX0R6QfZbrWB58X7jN8WhAlLRYG/dMhBD72j4F0qC6tpkYI\nsPkZRIHVRkcddZRhcOmGPPHEE9YP4rLLLrOD2yc+8YmWapRBWhhQDjvssEad1M8Sab7jdGlkcA4g\nQFhd5LN3717zy1/+0hKh3/3ud031ONl6Hb700ktG3s57XazgiU9+8pPm5Zdfbmy4F9cEBtO33nrL\nYsagz3QcJE4wjcvr27UwaUE/9tvif40NCD/88ENz9tlnd0Vt/peIRE0oADZAjHtZ6IqCWmmtEVCL\nS627t/zGYXGYMGGCOeaYY2y0T3kzLLtmBiiWZ2/YsMFaWVgyGmfliBoE8ujIgzvOIlI0SaK9rj9G\n3LQS1rAqRzsO9wt9h6XEtUa5aVyn2jL9htw6yz5ud79yHSsjMn/+/NzL+pO2h/vwu9/9riUr7Bjc\nzqcrabmaThFIhQCbLKooAnkQCMKb99x9991s1tlz44039gQDTJ7iYvNKXQFB6pk8eXIPvxG+g4G9\nZd5gt92W19JcoJ6kZSVNl6Z+SQvGlC8fwYHrAYmLxULKqNK32ybpg6i2V6lNrXSlb5P+D/F/x/9C\n2f936Bo4n9u6xo0bl1i/Vm3U84pAHgTU4pKK5mniOAR4C7zrrrvslE3wILXm7DgrSFxZ4WuUzTJL\n3i6HDx9uZs2a1estU6wSYT+Goqwf6EAdSdtEeqQTViixOnzsYx8zp5xyigkeCmEIK/2bN/s///M/\nN6wyqotVJapDstwz3JPsx4UfEP93WEDD/wNRdSU5R9nLly+3+1yRPquvUZK6NI0ikBSBP0maUNMp\nAu0
QYIAmdkrwtmiTEkGTDxvGQR7SCoMx4cMpg6mRffv22YicEJioBzPz7JynLh64CAMBefMKuiBJ\nSQtpwQM9RBfOlSXoRdv/8Ic/mOCNuKxqulYuZOxP//RPbRvT9EHXFM5QcRbSQjXc9/J/xxQh/i/4\nwbDlRZb/O/TACZi4LJBEfIaWLFliNyr1dQuGDHBrlgojoBaXCndeFVQneBZ+KBs3brT7HTGoDho0\nyPpgoD9Ldnk47t+/3zbnwIEDNt22bdvs71GjRpnRo0ebkSNHpnIAhGjwQIdERZEcW3jCPzz84/xZ\n2hVD/rw6tKtDrkP0du7cGRt8T9JW6RsMsbbVNbhZVtLSqg/5vyOC84svvmg/bFqJM/3QoUMbWYYM\nGWJ2795tf7f6vzvttNM6YjFsKKUHikACBJS4JABJkxSDAJYHHExZ8cKDEsGKwl467gOVqaA459Ok\n2jzzzDPms5/9bCoriVu26JuXdFAOA1MnLAVYt5C6hVyvM3EpmrS49zDHch+3+7+DyBxxxBEduU/D\nOupvRSANAkpc0qClaSuDgAwGWF0gS2nJB/l54BdFNrAAMXWEPmVKXYkLfYFlrm6+O3KfdsIPqsz7\nTstWBDqJgPq4dBJtratjCDBFJEQB0sIbO4NfEsniz9KuXAiQu5y5XXq93oxA2YSvubbO/JL7TElL\nZ/DWWuqDgBKX+vSltuQjBKJ8SiAvvN3KG24rsMjLQFLGYCIEqlXder7vICAWuDLus76Dora0ryKg\nxKWv9nxN2w0xabWKSKZ95E3XhQBrjBCeMt/u0a0deXL10uM/IgBmdRnkhbSUeZ/pfaMI1BkBJS51\n7t0+2DaZImrVdLGmQFJEGBT5pPWDkfxpvqkfkpR02ipN2XVOS7/itF11UdJS9R5U/X1A4OM+KKE6\n1B8BBmp8PFjmLEsvo1otS6VZ4cC+LGnessViElWue443XZm2IWAbgc3EGuOmK+uYupLqmlYHlpdv\n3bo1bTbv0wfRcr3XsZ2CSlraIaTXFYFkCChxSYaTpsqAAFaMF154wQaRI54EsSSGDRtmY7gQ+TZK\nWLLJEunFixeb1atXG/YgIvYLmyjGkQvqajVFFFUP51il8vvf/77V5VLPExeGgSyuTVkUGDx4sMU7\nS16f8xBvhHuhqqKkpao9p3r7iIAuh/axVyquE1E3V65caa0rE7JufpwAABmPSURBVCdONASRO/XU\nUzMtBZZAWuxAO2DAADNnzpzIYHRpLRikl6BykB4sQkWTiHbdSL1IGqtSuzJpRx2XDRPFlUCE7Ehc\nNVHSUrUeU319R0CJi+89VCH9IBnslYK0Ihh5muMSovvuu68xiKUhLTJlFfZnaXU+j75J8qbRPUl5\npCHcOyHaw21Mmt/HdFjTNm/e3HFymRcLJS15EdT8ikBvBJS49MZEz6REAMsBkVrxX3EJRcpiEidn\nsGc/lkMPPdTMnDnTDtRJrBZJLCtlEIl2DSu6TjDB12X27Nntqq7EdQZ/9qvCQbdKoqSlSr2lulYJ\nAV1VVKXe8lBXrCC82eN/gPNtJ0z51Ld9+3YzduxYO32QZP8aBhGk3XQQZUMksMB0SopcIg05Y0+a\nH/zgB7YdnWpDmfU88cQThinHKgn3EGRalzxXqddU16ogoBaXqvSUh3ryZr9q1Sq7Y3O3piUgJNde\ne621vixfvjxyoGAQEX+WpDBSLoNOEktO0jLj0mV9Oyefu+IGEoTOVZ1aicKIqa+FCxeaquxMXLQF\nLQoTPacI9GUElLj05d7P2HasEVdffbV5//33zaOPPtqxwb2VuugzY8YM884775i1a9c2yEtev5Uk\nU0utdMpyPsmAFyYqrQjZ7bffbpedL1iwIIsq3uQRvyksbFWQJH1YhXaojoqAzwgocfG5dzzUDTIw\nZswYq5lLEnxQFQvQm2++ackLevJpNzXUTu+85Kdd+e516oIsuTozE
LrSiqi4aTimHKwu7QLyhfP5\n9nv8+PFmxIgRldjtWkmLb3eP6lNXBJS41LVnS2oXy1LDlo2SqspULOTlpZdeMo888og59thjM5UR\nlYlBKSlpiMqf9Nwzzzxjk7L0G8kzBQcWSFWtLmCOHxO+U777iihpsbea/lEEOoKAEpeOwFyPSph+\nIJCcb5YWF12mFiAtBJZL4rTr5m13XLTfi1hz3HrFOTgPYZHyqm51YSURxIUVaz6Lkhafe0d1qyMC\nSlzq2KsltAlCcNVVV9mVKp1yWM3aDAgB01llDHp5/F7CRIVAce60kNveogbDe+65x2zZsqVwEufq\nWsbxihUrzKJFi+zqsTLKL6rMovqpKH20HEWgLyCgxKUv9HLONjLgMk2CJaMqKzuwjvDGvmbNmlzT\nLVHQCQFpZxWRdFJGHFGRNPJN3rC/i1xL8005Z511lnVenjRpUpqsXUtbZt8V2SglLUWiqWUpAskR\nUOKSHKs+mxK/FmTp0qWVwqDst3YGLtfvBaLhBklLQ1SigGUALyIWCOWgJ74irSw8UfV34xxEqyxr\nWZHtUdJSJJpaliKQDgElLunw6nOpeUBXxUEyqnOwumBpKMPaAFF55ZVXzEEHHWT3UZIYKlF6ZD1X\n1ABJoMBp06Z5HzYfkvzBBx94PbVVVJ9kvSc0nyLQ1xFQ4tLX74A27a/SctSophRJvLBcRAV7y+P3\nEqWze66oKSPKdJeL+7hKx3f96AusVu2mCN3+02NFQBEoHgElLsVjWpsSixz0uwkK5OuSSy5JbXUJ\nExV3WijcnjIHNYgRUoRTtJADHwIHuhiKXr6uWCuSQLrt1mNFQBFIj4DuVZQes7Y5TjnlFNOvXz/7\nYZdeV+Q836+++qp7KfaY/VrcvLGJC7r4wAMPmFmzZnkfQ6Ndc9kSgBUq7QSi5n4gCrxdyyfOSsE1\n0pGfQa5IQQ/XdyZP2cR0GTZsmNUVYtZtASuIpQQOjMO4W7oqaekW8lqvIhCNQCWJC2RABnFIgkrx\nCPCwXrZsmR1Uii+9syXKSihIhSsuSeFYCIp8ZxlEyYuFRKwkbn15joUU5SlD8kJe2MUbCxIOzN0S\niBMrng455BBvYwMpaenW3aH1KgKtEfh460t6pS8jsHHjRjN58uRCpid8wPGv//qvLRFzdYEMlCGs\n3BHyUsT0jugI0WCwL2JlELt4v/7662bq1Klm3bp1hngvReoqOkd9QwbYEPPv/u7vzMMPP5x6Ci+q\nzDLOKWkpA1UtUxHIj0AlLS75m11uCT/5yU9MT0+P/TAwVFHWr19v34arqHtYZ4LnfelLX7JTc2JN\nKYu0SN1CAoqcjhELEANqEQIGW7duNSeeeKLd1wjyUlTZrfRjdRNWFoLi4ehaxmqvVnWnOa+kJQ1a\nmlYR6DACwQBbGdm2bVtPAE/kJ5i379WOBx98sIfzbh7O7d+/PzKtpLv88svt9TvuuKORVzJIGr7R\nh8+FF15o01E24tYp51rlf/bZZxv5KZOyOBeWxx9/vKEL6aIkTXuj8rvngoG3J/CrcE9V/rgbbQpW\nIfUElo1CsSu6PJQLSETPuHHjesDotttuK7TvweCpp57qCQiS/XDss6AveKgoAoqAnwjUcqro3Xff\ntdMczz//fDDGN8s111xjjjzySBOQAzN48ODmi86viy66yETld5LYt9Wbb77ZPZXqGBP9vHnzmvJQ\nJ5+AhFgzftPFFj+KaK9btFgJxGrgXstzjJ7EPdmxY4fdqPHll182AYlsKpK+OfPMM83RRx9tLQFn\nnHGGOeKII5rSZP0xfPhw87Of/axjUyLo6Trtxq1KStMmLCXik5MmX1xapp/Y24m+x4eMmDTnnnuu\ntYicfvrpqaensFgw3Ugfr1q1yoD9nDlzDFNUPotaWnzuHdVNEfgjArUkLvhmxJEOBsuLL77Y7Nq1\nyxDdNCyPPfZY+FTk7zykhQLDp
Q0WuKgCKgCCgCioAi4BUCSly86g5VRhFQBBQBRUARUATiEFDiEoeOXlME\nFAFFQBFQBBQBrxBQ4uJVd6gyioAioAgoAoqAIhCHgBKXOHT0miKgCCgCioAioAh4hYASF6+6Q5VR\nBBQBRUARUAQUgTgElLjEoaPXFAFFQBFQBBQBRcArBP4fntNQJrCufL0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 27, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review = \"The movie was excellent\"\n", + "\n", + "Image(filename='sentiment_network_pos.png')" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "# Project 2: Creating the Input/Output Data" + ] + }, + { + "cell_type": "code", + "execution_count": 74, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "74074\n" + ] + } + ], + "source": [ + "vocab = set(total_counts.keys())\n", + "vocab_size = len(vocab)\n", + "print(vocab_size)" + ] + }, + { + "cell_type": "code", + "execution_count": 75, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "['',\n", + " 'inhabitants',\n", + " 'goku',\n", + " 'stunts',\n", + " 'catepillar',\n", + " 'kristensen',\n", + " 'senegal',\n", + " 'goddess',\n", + " 'distroy',\n", + " 'unexplainably',\n", + " 'concoctions',\n", + " 'petite',\n", + " 'scribe',\n", + " 'stevson',\n", + " 'sctv',\n", + " 'soundscape',\n", + " 'rana',\n", + " 'metamorphose',\n", + " 'immortalizer',\n", + " 'henstridge',\n", + " 'planning',\n", + " 'akiva',\n", + " 'plod',\n", + " 'eko',\n", + " 'orderly',\n", + " 'zeleznice',\n", + " 'verbose',\n", + " 'amplify',\n", + " 'resonation',\n", + " 'critize',\n", + " 'jefferies',\n", + " 'mountainbillies',\n", + " 'steinbichler',\n", + " 'vowel',\n", + " 'rafe',\n", + " 'bonbons',\n", + " 'tulipe',\n", + " 'clot',\n", + " 'distended',\n", + " 'his',\n", + " 'impatiently',\n", + " 'unfortuntly',\n", + " 'lung',\n", + " 'scapegoats',\n", + " 'muzzle',\n", + " 'pscychosexual',\n", + " 'outbid',\n", + " 'obit',\n", + " 
'sideshows',\n", + " 'jugde',\n", + " 'particolare',\n", + " 'kevloun',\n", + " 'masterful',\n", + " 'quartier',\n", + " 'unravelling',\n", + " 'necessarily',\n", + " 'antiques',\n", + " 'strutts',\n", + " 'tilts',\n", + " 'disconcert',\n", + " 'dossiers',\n", + " 'sorriest',\n", + " 'blart',\n", + " 'iberia',\n", + " 'situations',\n", + " 'frmann',\n", + " 'daniell',\n", + " 'rays',\n", + " 'pried',\n", + " 'khoobsurat',\n", + " 'leavitt',\n", + " 'caiano',\n", + " 'sagan',\n", + " 'attractiveness',\n", + " 'kitaparaporn',\n", + " 'hamilton',\n", + " 'massages',\n", + " 'reasonably',\n", + " 'horgan',\n", + " 'chemist',\n", + " 'audrey',\n", + " 'jana',\n", + " 'dutch',\n", + " 'override',\n", + " 'spasms',\n", + " 'resumed',\n", + " 'stinson',\n", + " 'widows',\n", + " 'stonewall',\n", + " 'palatial',\n", + " 'neuman',\n", + " 'abandon',\n", + " 'anglophile',\n", + " 'marathon',\n", + " 'chevette',\n", + " 'unscary',\n", + " 'eponymously',\n", + " 'spoilerific',\n", + " 'fleashens',\n", + " 'brigand',\n", + " 'politeness',\n", + " 'clued',\n", + " 'dermatonecrotic',\n", + " 'grady',\n", + " 'mulligan',\n", + " 'ol',\n", + " 'bertolucci',\n", + " 'incubation',\n", + " 'oldboy',\n", + " 'snden',\n", + " 'plaintiffs',\n", + " 'fk',\n", + " 'deply',\n", + " 'franchot',\n", + " 'cyhper',\n", + " 'glorifying',\n", + " 'mazovia',\n", + " 'elizabeth',\n", + " 'palestine',\n", + " 'robby',\n", + " 'wongo',\n", + " 'moshing',\n", + " 'eeeee',\n", + " 'doltish',\n", + " 'bree',\n", + " 'postponed',\n", + " 'gunslinger',\n", + " 'debacles',\n", + " 'kamm',\n", + " 'herman',\n", + " 'rapture',\n", + " 'rolando',\n", + " 'tetsuothe',\n", + " 'premises',\n", + " 'bruck',\n", + " 'loosely',\n", + " 'boylen',\n", + " 'proportions',\n", + " 'grecianized',\n", + " 'wodehousian',\n", + " 'encapsuling',\n", + " 'partly',\n", + " 'posative',\n", + " 'calms',\n", + " 'stadling',\n", + " 'austrailia',\n", + " 'shortland',\n", + " 'wheeling',\n", + " 'darkie',\n", + " 'mckellar',\n", + " 
'cushy',\n", + " 'ooookkkk',\n", + " 'milky',\n", + " 'unfolded',\n", + " 'degrades',\n", + " 'authenticating',\n", + " 'rotheroe',\n", + " 'beart',\n", + " 'neath',\n", + " 'grispin',\n", + " 'intoxicants',\n", + " 'nnette',\n", + " 'slinging',\n", + " 'tsukamoto',\n", + " 'stows',\n", + " 'suddenness',\n", + " 'waqt',\n", + " 'degrading',\n", + " 'camazotz',\n", + " 'blarney',\n", + " 'shakher',\n", + " 'delinquency',\n", + " 'tomreynolds',\n", + " 'insecticide',\n", + " 'charlton',\n", + " 'hare',\n", + " 'wayland',\n", + " 'nakada',\n", + " 'urbane',\n", + " 'sadomasochistic',\n", + " 'larnia',\n", + " 'hyping',\n", + " 'yr',\n", + " 'hebert',\n", + " 'accentuating',\n", + " 'deathrow',\n", + " 'galligan',\n", + " 'unmediated',\n", + " 'treble',\n", + " 'alphabet',\n", + " 'soad',\n", + " 'donen',\n", + " 'lord',\n", + " 'recess',\n", + " 'handsome',\n", + " 'center',\n", + " 'vignettes',\n", + " 'rescuers',\n", + " 'pairings',\n", + " 'uselful',\n", + " 'sanders',\n", + " 'nots',\n", + " 'hatsumomo',\n", + " 'appleby',\n", + " 'tampax',\n", + " 'sprinkling',\n", + " 'defacing',\n", + " 'lofty',\n", + " 'opaque',\n", + " 'tlc',\n", + " 'romagna',\n", + " 'tablespoons',\n", + " 'bernhard',\n", + " 'verger',\n", + " 'acumen',\n", + " 'percentages',\n", + " 'wendingo',\n", + " 'resonating',\n", + " 'vntoarea',\n", + " 'redundancies',\n", + " 'red',\n", + " 'pitied',\n", + " 'belying',\n", + " 'gleefulness',\n", + " 'bibbidi',\n", + " 'heiligt',\n", + " 'gitane',\n", + " 'journalist',\n", + " 'focusing',\n", + " 'plethora',\n", + " 'citizen',\n", + " 'coster',\n", + " 'clunkers',\n", + " 'deplorable',\n", + " 'forgive',\n", + " 'proplems',\n", + " 'magwood',\n", + " 'bankers',\n", + " 'aqua',\n", + " 'donated',\n", + " 'disbelieving',\n", + " 'acomplication',\n", + " 'immediately',\n", + " 'contrasted',\n", + " 'reidelsheimer',\n", + " 'fox',\n", + " 'springs',\n", + " 'toolbox',\n", + " 'contacting',\n", + " 'ace',\n", + " 'washrooms',\n", + " 'raving',\n", + " 
'dynamism',\n", + " 'mae',\n", + " 'sky',\n", + " 'disharmony',\n", + " 'untutored',\n", + " 'icarus',\n", + " 'taint',\n", + " 'kargil',\n", + " 'captain',\n", + " 'paucity',\n", + " 'fits',\n", + " 'tumbles',\n", + " 'amer',\n", + " 'bueller',\n", + " 'redubbed',\n", + " 'cleansed',\n", + " 'kollos',\n", + " 'shara',\n", + " 'humma',\n", + " 'felichy',\n", + " 'outa',\n", + " 'piglets',\n", + " 'gombell',\n", + " 'supermen',\n", + " 'superlow',\n", + " 'enhance',\n", + " 'goode',\n", + " 'shalt',\n", + " 'kubanskie',\n", + " 'zenith',\n", + " 'ananda',\n", + " 'ocd',\n", + " 'matlin',\n", + " 'nosed',\n", + " 'presumptuous',\n", + " 'rerun',\n", + " 'toyko',\n", + " 'mazar',\n", + " 'sundry',\n", + " 'bilb',\n", + " 'fugly',\n", + " 'orchestrating',\n", + " 'prosaically',\n", + " 'maricarmen',\n", + " 'moveis',\n", + " 'conelly',\n", + " 'estrange',\n", + " 'lusciously',\n", + " 'seasonings',\n", + " 'sums',\n", + " 'delirious',\n", + " 'quincey',\n", + " 'flesh',\n", + " 'tootsie',\n", + " 'ai',\n", + " 'tenma',\n", + " 'appropriations',\n", + " 'chainsaw',\n", + " 'ides',\n", + " 'surrogacy',\n", + " 'pungent',\n", + " 'gallon',\n", + " 'damaso',\n", + " 'caribou',\n", + " 'perico',\n", + " 'supplying',\n", + " 'ro',\n", + " 'yuy',\n", + " 'valium',\n", + " 'debuted',\n", + " 'robbin',\n", + " 'mounts',\n", + " 'interpolated',\n", + " 'aetv',\n", + " 'plummer',\n", + " 'competence',\n", + " 'toadies',\n", + " 'dubiel',\n", + " 'clavichord',\n", + " 'asunder',\n", + " 'sublety',\n", + " 'airfix',\n", + " 'stoltzfus',\n", + " 'ruth',\n", + " 'fluorescent',\n", + " 'improves',\n", + " 'rebenga',\n", + " 'russells',\n", + " 'deliberation',\n", + " 'zsa',\n", + " 'dardino',\n", + " 'macs',\n", + " 'servile',\n", + " 'jlb',\n", + " 'apallonia',\n", + " 'crossbows',\n", + " 'locus',\n", + " 'mislead',\n", + " 'corey',\n", + " 'blundered',\n", + " 'jeopardizes',\n", + " 'disorganized',\n", + " 'discuss',\n", + " 'longish',\n", + " 'tieing',\n", + " 'ledger',\n", + " 
'speechifying',\n", + " 'amitabhz',\n", + " 'bbc',\n", + " 'chimayo',\n", + " 'pranked',\n", + " 'superman',\n", + " 'aggravated',\n", + " 'rifleman',\n", + " 'yvone',\n", + " 'radiant',\n", + " 'galico',\n", + " 'debris',\n", + " 'waking',\n", + " 'btw',\n", + " 'havnt',\n", + " 'francen',\n", + " 'chattered',\n", + " 'scathed',\n", + " 'pic',\n", + " 'ceremonies',\n", + " 'watergate',\n", + " 'betsy',\n", + " 'majorca',\n", + " 'meercat',\n", + " 'noirs',\n", + " 'grunts',\n", + " 'drecky',\n", + " 'tribulations',\n", + " 'avery',\n", + " 'talladega',\n", + " 'eights',\n", + " 'dumbing',\n", + " 'alloimono',\n", + " 'scrutinising',\n", + " 'geta',\n", + " 'beltrami',\n", + " 'pvc',\n", + " 'horse',\n", + " 'tiburon',\n", + " 'huitime',\n", + " 'ripple',\n", + " 'loitering',\n", + " 'forensics',\n", + " 'nearly',\n", + " 'elizabethan',\n", + " 'ellington',\n", + " 'uzi',\n", + " 'sicily',\n", + " 'camion',\n", + " 'motivated',\n", + " 'rung',\n", + " 'gao',\n", + " 'licitates',\n", + " 'protocol',\n", + " 'smirker',\n", + " 'torin',\n", + " 'newlywed',\n", + " 'rich',\n", + " 'dismay',\n", + " 'skyler',\n", + " 'moonwalks',\n", + " 'haranguing',\n", + " 'sunburst',\n", + " 'grifter',\n", + " 'undersold',\n", + " 'chearator',\n", + " 'marino',\n", + " 'scala',\n", + " 'conditioner',\n", + " 'ulysses',\n", + " 'lamarre',\n", + " 'figueroa',\n", + " 'flane',\n", + " 'allllllll',\n", + " 'slide',\n", + " 'lateness',\n", + " 'selbst',\n", + " 'gandhis',\n", + " 'dramatizing',\n", + " 'catchphrase',\n", + " 'doable',\n", + " 'stadiums',\n", + " 'alexanderplatz',\n", + " 'pandemonium',\n", + " 'misrepresents',\n", + " 'earth',\n", + " 'mounties',\n", + " 'seeker',\n", + " 'cheat',\n", + " 'outbreaks',\n", + " 'snowstorm',\n", + " 'baur',\n", + " 'schedules',\n", + " 'bathetic',\n", + " 'incorrect',\n", + " 'johnathon',\n", + " 'rosanne',\n", + " 'mundanely',\n", + " 'cauldrons',\n", + " 'forrest',\n", + " 'poky',\n", + " 'legislation',\n", + " 'womanness',\n", + " 
'spender',\n", + " 'crazy',\n", + " 'rational',\n", + " 'terrell',\n", + " 'zero',\n", + " 'coincides',\n", + " 'thoughout',\n", + " 'mathew',\n", + " 'narnia',\n", + " 'naseeruddin',\n", + " 'bucks',\n", + " 'affronts',\n", + " 'topple',\n", + " 'degree',\n", + " 'preyed',\n", + " 'passionately',\n", + " 'defeats',\n", + " 'torchwood',\n", + " 'sources',\n", + " 'botticelli',\n", + " 'compactor',\n", + " 'kosturica',\n", + " 'waiving',\n", + " 'gunnar',\n", + " 'stiffler',\n", + " 'fwd',\n", + " 'kawajiri',\n", + " 'eleanor',\n", + " 'sistahs',\n", + " 'soulhunter',\n", + " 'belies',\n", + " 'wrathful',\n", + " 'americans',\n", + " 'ferdinandvongalitzien',\n", + " 'kendra',\n", + " 'weirdy',\n", + " 'unforgivably',\n", + " 'chepart',\n", + " 'tatta',\n", + " 'departmentthe',\n", + " 'dig',\n", + " 'blatty',\n", + " 'marionettes',\n", + " 'atop',\n", + " 'chim',\n", + " 'saurian',\n", + " 'woes',\n", + " 'cloudscape',\n", + " 'resignedly',\n", + " 'unrooted',\n", + " 'keuck',\n", + " 'hitlerian',\n", + " 'stylings',\n", + " 'crewed',\n", + " 'bedeviled',\n", + " 'unfurnished',\n", + " 'reedus',\n", + " 'circumstances',\n", + " 'grasped',\n", + " 'smurfettes',\n", + " 'fn',\n", + " 'dishwashers',\n", + " 'roadie',\n", + " 'ruthlessness',\n", + " 'refrains',\n", + " 'lampooning',\n", + " 'semblance',\n", + " 'richart',\n", + " 'legions',\n", + " 'gwenneth',\n", + " 'enmity',\n", + " 'assess',\n", + " 'manufacturer',\n", + " 'bullosa',\n", + " 'outrun',\n", + " 'hogan',\n", + " 'chekov',\n", + " 'blithe',\n", + " 'code',\n", + " 'drillings',\n", + " 'revolvers',\n", + " 'aredavid',\n", + " 'robespierre',\n", + " 'achcha',\n", + " 'boyfriendhe',\n", + " 'wallow',\n", + " 'toga',\n", + " 'graphed',\n", + " 'tonking',\n", + " 'going',\n", + " 'bosnians',\n", + " 'willy',\n", + " 'rohauer',\n", + " 'fim',\n", + " 'forbidding',\n", + " 'yew',\n", + " 'rationalised',\n", + " 'shimomo',\n", + " 'opposition',\n", + " 'landis',\n", + " 'minded',\n", + " 'despicableness',\n", + 
" 'easting',\n", + " 'arghhhhh',\n", + " 'ebb',\n", + " 'trialat',\n", + " 'protected',\n", + " 'negras',\n", + " 'rick',\n", + " 'muti',\n", + " 'tracker',\n", + " 'shawl',\n", + " 'differentiates',\n", + " 'sweetheart',\n", + " 'deepened',\n", + " 'manmohan',\n", + " 'trevethyn',\n", + " 'brain',\n", + " 'incomprehensibly',\n", + " 'piercing',\n", + " 'pasadena',\n", + " 'shtick',\n", + " 'ute',\n", + " 'viggo',\n", + " 'supersedes',\n", + " 'ack',\n", + " 'cites',\n", + " 'taurus',\n", + " 'relevent',\n", + " 'minidress',\n", + " 'philosopher',\n", + " 'bel',\n", + " 'mahattan',\n", + " 'moden',\n", + " 'compiling',\n", + " 'advertising',\n", + " 'rogues',\n", + " 'unimaginative',\n", + " 'subpaar',\n", + " 'ademir',\n", + " 'darkly',\n", + " 'saturate',\n", + " 'fledgling',\n", + " 'breaths',\n", + " 'padre',\n", + " 'aszombi',\n", + " 'pachabel',\n", + " 'incalculable',\n", + " 'ozone',\n", + " 'sped',\n", + " 'mpho',\n", + " 'rawail',\n", + " 'forbid',\n", + " 'synth',\n", + " 'guttersnipe',\n", + " 'reputedly',\n", + " 'holiness',\n", + " 'unessential',\n", + " 'hampden',\n", + " 'asylum',\n", + " 'bolye',\n", + " 'strangers',\n", + " 'rantzen',\n", + " 'farrellys',\n", + " 'vigourous',\n", + " 'cantinflas',\n", + " 'enshrined',\n", + " 'boris',\n", + " 'expetations',\n", + " 'replaying',\n", + " 'prestige',\n", + " 'bukater',\n", + " 'overpaid',\n", + " 'exhude',\n", + " 'backsides',\n", + " 'topless',\n", + " 'sufferings',\n", + " 'nitwits',\n", + " 'cordova',\n", + " 'incensed',\n", + " 'danira',\n", + " 'unrelenting',\n", + " 'disabling',\n", + " 'ferdy',\n", + " 'gerard',\n", + " 'drewitt',\n", + " 'mero',\n", + " 'monsters',\n", + " 'precautions',\n", + " 'lamping',\n", + " 'relinquish',\n", + " 'demy',\n", + " 'drink',\n", + " 'chamberlin',\n", + " 'unjustifiably',\n", + " 'cove',\n", + " 'floodwaters',\n", + " 'searing',\n", + " 'isral',\n", + " 'ling',\n", + " 'grossness',\n", + " 'pickier',\n", + " 'pax',\n", + " 'wierd',\n", + " 'tereasa',\n", + " 
'smog',\n", + " 'girotti',\n", + " 'spat',\n", + " 'sera',\n", + " 'noxious',\n", + " 'misbehaving',\n", + " 'scouts',\n", + " 'refreshments',\n", + " 'autobiographic',\n", + " 'shi',\n", + " 'toyomichi',\n", + " 'bits',\n", + " 'psychotics',\n", + " 'barzell',\n", + " 'colt',\n", + " 'shivering',\n", + " 'pugilist',\n", + " 'gladiator',\n", + " 'dryer',\n", + " 'reissues',\n", + " 'scrivener',\n", + " 'predicable',\n", + " 'objection',\n", + " 'marmalade',\n", + " 'seems',\n", + " 'spellbind',\n", + " 'trifecta',\n", + " 'innovator',\n", + " 'shriekfest',\n", + " 'inthused',\n", + " 'contestants',\n", + " 'goody',\n", + " 'samotri',\n", + " 'serviced',\n", + " 'nozires',\n", + " 'ins',\n", + " 'mutilating',\n", + " 'dupes',\n", + " 'launius',\n", + " 'widescreen',\n", + " 'joo',\n", + " 'discretionary',\n", + " 'enlivens',\n", + " 'bushes',\n", + " 'chills',\n", + " 'header',\n", + " 'activist',\n", + " 'gethsemane',\n", + " 'phoenixs',\n", + " 'wreathed',\n", + " 'sacrine',\n", + " 'electrifyingly',\n", + " 'basely',\n", + " 'ghidora',\n", + " 'binder',\n", + " 'dogfights',\n", + " 'sugar',\n", + " 'doddsville',\n", + " 'porkys',\n", + " 'scattershot',\n", + " 'refunded',\n", + " 'rudely',\n", + " 'insteadit',\n", + " 'zatichi',\n", + " 'eurotrash',\n", + " 'radioraptus',\n", + " 'hurls',\n", + " 'boogeman',\n", + " 'weighs',\n", + " 'danniele',\n", + " 'converging',\n", + " 'hypothermia',\n", + " 'glorfindel',\n", + " 'birthdays',\n", + " 'attentive',\n", + " 'mallepa',\n", + " 'spacewalk',\n", + " 'manoy',\n", + " 'bombshells',\n", + " 'farts',\n", + " 'lyoko',\n", + " 'southron',\n", + " 'destruction',\n", + " 'flemming',\n", + " 'manhole',\n", + " 'elainor',\n", + " 'bowersock',\n", + " 'lowly',\n", + " 'wfst',\n", + " 'limousines',\n", + " 'skolimowski',\n", + " 'saban',\n", + " 'koen',\n", + " 'malaysia',\n", + " 'uwi',\n", + " 'cyd',\n", + " 'apeing',\n", + " 'bonecrushing',\n", + " 'dini',\n", + " 'merest',\n", + " 'janina',\n", + " 'chemotrodes',\n", + " 
'trials',\n", + " 'authorize',\n", + " 'whilhelm',\n", + " 'asthmatic',\n", + " 'broads',\n", + " 'missteps',\n", + " 'embittered',\n", + " 'chandeliers',\n", + " 'seeming',\n", + " 'miscalculate',\n", + " 'recommeded',\n", + " 'schoolwork',\n", + " 'coy',\n", + " 'mcconaughey',\n", + " 'philosophically',\n", + " 'waver',\n", + " 'fanny',\n", + " 'mestressat',\n", + " 'unwatchably',\n", + " 'saggy',\n", + " 'topness',\n", + " 'dwellings',\n", + " 'breakup',\n", + " 'hasselhoff',\n", + " 'superstars',\n", + " 'replay',\n", + " 'aggravates',\n", + " 'balances',\n", + " 'urging',\n", + " 'snidely',\n", + " 'aleksandar',\n", + " 'hildy',\n", + " 'kazuhiro',\n", + " 'slayer',\n", + " 'tangy',\n", + " 'brussels',\n", + " 'horne',\n", + " 'masayuki',\n", + " 'molden',\n", + " 'unravel',\n", + " 'goodtime',\n", + " 'interrogates',\n", + " 'bismillahhirrahmannirrahim',\n", + " 'rowboat',\n", + " 'dumann',\n", + " 'datedness',\n", + " 'astrotheology',\n", + " 'dekhiye',\n", + " 'valga',\n", + " 'kata',\n", + " 'wipes',\n", + " 'hostilities',\n", + " 'sentimentalising',\n", + " 'documentary',\n", + " 'salesman',\n", + " 'virtue',\n", + " 'unreasonably',\n", + " 'haver',\n", + " 'cei',\n", + " 'unglamorised',\n", + " 'balky',\n", + " 'complementary',\n", + " 'paychecks',\n", + " 'mnica',\n", + " 'wada',\n", + " 'ily',\n", + " 'prc',\n", + " 'ennobling',\n", + " 'functionality',\n", + " 'dissociated',\n", + " 'elk',\n", + " 'throbbing',\n", + " 'tempe',\n", + " 'linoleum',\n", + " 'photogrsphed',\n", + " 'bottacin',\n", + " 'hipper',\n", + " 'titillating',\n", + " 'barging',\n", + " 'untie',\n", + " 'sacchetti',\n", + " 'gnat',\n", + " 'roedel',\n", + " 'cohabitation',\n", + " 'performs',\n", + " 'sales',\n", + " 'migrs',\n", + " 'teachs',\n", + " 'nanavati',\n", + " 'fresco',\n", + " 'davison',\n", + " 'obstinate',\n", + " 'burglar',\n", + " 'masue',\n", + " 'dickory',\n", + " 'grills',\n", + " 'appelagate',\n", + " 'linkage',\n", + " 'enables',\n", + " 'loesser',\n", + " 
'patties',\n", + " 'prudent',\n", + " 'mallorquins',\n", + " 'nativetex',\n", + " 'suprise',\n", + " 'drippy',\n", + " 'quill',\n", + " 'speeded',\n", + " 'farscape',\n", + " 'saddening',\n", + " 'centuries',\n", + " 'mos',\n", + " 'improvisationally',\n", + " 'neccessarily',\n", + " 'transmitter',\n", + " 'tankers',\n", + " 'latte',\n", + " 'mechanisation',\n", + " 'faracy',\n", + " 'synthetically',\n", + " 'thoughtless',\n", + " 'rake',\n", + " 'ropes',\n", + " 'desirable',\n", + " 'whitewashed',\n", + " 'donal',\n", + " 'crabby',\n", + " 'lifeless',\n", + " 'perfidy',\n", + " 'teresa',\n", + " 'bulldog',\n", + " 'cockamamie',\n", + " 'rasberries',\n", + " 'notethe',\n", + " 'captivity',\n", + " 'chiseling',\n", + " 'smaller',\n", + " 'clampets',\n", + " 'alerts',\n", + " 'tough',\n", + " 'wellingtonian',\n", + " 'aaaahhhhhhh',\n", + " 'dither',\n", + " 'incertitude',\n", + " 'florentine',\n", + " 'imperioli',\n", + " 'licking',\n", + " 'disparagement',\n", + " 'artfully',\n", + " 'feds',\n", + " 'fumiya',\n", + " 'tearfully',\n", + " 'lanchester',\n", + " 'undertaken',\n", + " 'longlost',\n", + " 'netted',\n", + " 'carrell',\n", + " 'uncompelling',\n", + " 'reliefs',\n", + " 'leona',\n", + " 'autorenfilm',\n", + " 'unfriendly',\n", + " 'typewriter',\n", + " 'shifted',\n", + " 'bertrand',\n", + " 'blesses',\n", + " 'tricking',\n", + " 'fireflies',\n", + " 'zanes',\n", + " 'unknowingly',\n", + " 'unnerve',\n", + " 'caning',\n", + " 'flat',\n", + " 'recluse',\n", + " 'dcreasy',\n", + " 'chipmunk',\n", + " 'dipper',\n", + " 'musee',\n", + " 'cousin',\n", + " 'shys',\n", + " 'berserkers',\n", + " 'eve',\n", + " 'conflagration',\n", + " 'irks',\n", + " 'restricts',\n", + " 'parsing',\n", + " 'positronic',\n", + " 'copout',\n", + " 'khala',\n", + " 'swiftness',\n", + " 'higginson',\n", + " 'imprint',\n", + " 'walter',\n", + " 'sundance',\n", + " 'whispering',\n", + " 'thematically',\n", + " 'underimpressed',\n", + " 'uno',\n", + " 'expressly',\n", + " 'russkies',\n", + 
" 'discos',\n", + " 'shaping',\n", + " 'verson',\n", + " 'prototype',\n", + " 'chapman',\n", + " 'trafficker',\n", + " 'semetary',\n", + " 'unrealistically',\n", + " 'lifewell',\n", + " 'rivas',\n", + " 'consequent',\n", + " 'katsu',\n", + " 'titantic',\n", + " 'jalees',\n", + " 'ranee',\n", + " 'shipbuilding',\n", + " 'gambles',\n", + " 'dispenses',\n", + " 'disfigurement',\n", + " 'bright',\n", + " 'cristian',\n", + " 'puertorricans',\n", + " 'constituent',\n", + " 'capta',\n", + " 'jewel',\n", + " 'erect',\n", + " 'farah',\n", + " 'despondently',\n", + " 'avoide',\n", + " 'inconnu',\n", + " 'headquarters',\n", + " 'sanguisga',\n", + " ...]" + ] + }, + "execution_count": 75, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "list(vocab)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 0., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 46, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "import numpy as np\n", + "\n", + "layer_0 = np.zeros((1,vocab_size))\n", + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
ITBiD8G9077fkJiJ5O56gR2HhXNgC4ltRSO/m48J5w7/z6hcuL+53Ezo7TveizvMGkFXy5M1S\nJ29wWB6cv0jZUzJxOjq/keXLl1v/kbRvlnHl6vzgIJDnfydP3jwI77XXXp3s3aYCOokSHhxxxBGd\nlElfaCEjzn+PzFE+ZlHnSEvAQCcXXnjhMP8LXrIdSXFpivomAJoTLCG+fv60DGkOOuggl9RgPUGv\nrJJmmixNHbmJSJaOdwoSzc2Zl7gRcKRBYJXOzOTSQkT8myXKx4JpDRcxDmchJI9+ru4k30V2dpL6\n8qZhXjYc8S9vmf4/Zdqy8uRNW5dLj+UBX4x+Tsm4ut238wc577zzrC6OGLnr+hYCvRDI87+TJ28v\nvbpdf9/73te5/OMf/7hznPfAd+TEZ8ONA5TLy60fNZWoq05cBFF+48fn5+PY+fa59FHfLKZwgzzT\nzZ/61KeikkWe86f2IxOETrrpGXfa6Rc1LYP/iHshZ2xFL//Zzzjsj53hKKtEq3biynG/C/sOzDK5\nhKU+gTKdj7+cNWj0sNgSQSM6dYWXDJEvvO6a374EJshOPdTpL/UML99lyRKSVT90de3y20SZcdf8\n8/7yXadHcJN0ykQvJ36+cJtJQ/1OFzDwxZ3nO6ynny587JZlhs+n/e0v7UIH+oG+TSqk9dtHGZSZ\nVVgemWZZcvDgGPrGN76Rtbpc+aKW87Jcl/P9FnAAO+4Lt0QXHMMfAqGRJvBRqETPfuNSdH0O37zl\n1u3/jvs2sOYlbpb/P8+zMiz+czvueeA/+xhrnPjPUz9N+Nh/BpM/fL3bb1dfeNzplsevD12j0ro0\nfvtJFyU+hq4sfzmvnydcnksf/ga7sPjjlj/mhtPl+R3dwpQlZul4n1TQUDd4+f9gYVCS3izhzsii\nn58nPMDHXcva2X55eYiIu6nczdytG4t6IEb9M6AHDxf6kutRH66Rxunsf4fx7taO8DUCbKXZYI3B\nl4G/34N/N8LRL32oB7yIawH+fDuiAS5RH9Jz70BQyJMnIFq47wbhN5imIcpxmNTt/y5tu8LPcvf8\nd+31n6VpiQhlxz1b3HMm6hnj1+nSue8w4XBEhG9/oHbp+WaMYyxy5yjDlygd3bM7rIufzx2DmSvb\nfXcjCnH3jMvLOOTaFVdHuJ9curzfhRCRtB2PtcI1nm//puh2jVopbAYAACaLSURBVMaGO8gvh2M6\nNwxWWv2oxycHvn69rmXpbL+utESk282MrnGS9sERVw5YR+kQ7pekv6P6L67uqPNpIjz6Az549Euo\nCwtEN0E3yEoZgjXDEQmIB7976ROnB21xAdEgJRCVrGXF1dGm8/QpOOWVuv3fpX0BoP3+cyM8gPrP\n+bRExGHLs9ivg2cQ5CDqGevycI363POKZzO6cN6d49sfYxizfMLh8lBmHIHhWjgf5VIX4ref83Hi\nt89/oY9LT51hndA3PMa5/PSLa3dcP7i0eb7jW5ih1KQd74MHCGEJW0vCLA0w/TQA1Q1MV35S/UhP\nea4Dwp3U7Rp503a2X17UPwn1O11oty/dbmY/XfiYgS6NKTWc3/+NDn6fOl3TflMGZeURBlgG1iQS\nJh/h30nKSJPGTX8kzZM2fa9yaR+DYFmEwREc7iusJpJoBPi/KIKs1en/DkILGUkj/mAbfq6lKacf\naf1nMAP+oIhPsBxJKqPthRKRMhRUmeUhwIBU5Fs3/6w+qUpKRMgTJntZW530IR9FOnwLSdb64/Ll\nsXCga56Bi7oJvw1BSDtYxLWn23n0hRByf0Xh3C3vIFxLQ5aT4FGH/7ssfY1VwU1rJHmbT4JF1jT+\nixTPI//ll5dDpyfPF9IOgtA/7hkOJmXKKAoPKpMMIAJXXHGFjXzKio0iBe/05557zgZ5Y437L37x\ni2HF/8Vf/IUhWixLnj/ykY8MW/I2LGHKHxs2bLDr93utBHDxQ4KBe
UQNxPLgfJEhy6MCl42ouMeJ\nrGWAybnnnmumT59u5s+fX2i7eqhsWJIcvOkaljWyZYPk9wjcfPPNNvzAjTfeWCgkVf3f8f9EAL6A\n8KZuDytSXETSgFCVGnujm3K+Ht3ScS2wDGSKl9Sr3Lpd9zEpvc1lshyVXW8E2KiqqA24qm4p0wKz\nZ8+2/gq9dOn1lt7req/y/etYnPJYM/yy0lpV8N3ACpJ0qsqvq6hjdOYe41MUDkXpVkU5YMAqrdGj\nR7cGjyz+IT72zhpR9lu3X2fUse8bEpCCjjXAP677FFJUu7Kc861V/bAAaWomSy+1JA8PRQYqBoum\nC235q7/6K/uQh0jEkYm48377KauIKSvqoqwihfKStIE5ewb/ItpRhP7oU/RUYBF69aMM+oA+4+P6\nAyyqJIhFtjtvW/B1cYN9UVO0WduHH4TvF+H0wsEzyn8vaz11z0c/0HampPxpqrL01tRMgPYgC9Mz\nSNFm4n5j+vDDD5tbbrllWKRDoqU6IQCQm26JmpJx6dx3t+kblybu2wUpK3O/mG6b5tGnRHRcu3Zt\np81xuvbzPHqxWRtTZ20OY8+9409TRG1uyLTVI488MmxH8X72RVF1cR9OnTp1WHuLKlvlDA4CIiKD\n09eRLXXzu8GbWq0GrUhlu5w89NBDrQ/EaaedFpkKchBMRdldKknAXjO9CEmWsO/gSV39GGij/Ebq\nSkJcpzgy0vT7zbXHffukN8m9xT0CQSFfr/vQ1VHH79NPP90cffTR5tJLL62jetKpIQiIiDSko8pU\nk8GBEL9NfZhgDbnmmmvM5s2bY2EKk4okb60UFs4XW0FwIYoYdEtfxDWf+OAEedddd5mNGzfWmlTW\nnSwl6Rf6Opgm6yTNYv2iv9gjhNDgTRT+N7CGtI1UNrEvmq6ziEjTe7AA/RnMeIvDnNy0tzPeLI86\n6iizcOFCc/zxx0eiQfuQbm2LG1gon/y9LBw8lKNM8JEKFXwSHbH2/PM//7N5+umne+pacPWZimP3\n0MCHpTGracCYAddJEX1NmZSzZs0au+rEld2Ub1lDmtJT9ddTRKT+fdQXDdnxeMuWLY17O0uidxqr\nhgObPE7+93//13z0ox+NtDK4ASrLG7ErP++3G9BuvfVWEzc1lbeOovND7sDs3nvvjSWQRdeZtjz/\nHsDHqBcZTVs+6fEVWbRoUe2tWOG2Ob27WSHDefRbCMQhICISh8yAnWcww7IwZ84cU3RckbKgZKDA\nNMx3nLWDa3lJAthE+ZcwmHKtjAEqDWZMdbCV+tKlS9Nkqzytm1Kry1QS/ek7mea9b5ICjGUhWHnS\nGOsQ1kMsWk215CTtF6XrHwIiIv3DuvY18YDBVIwJuurBtRdYSawADCxIHEnpVYd/nfooD1z4JmDb\nu9/9bhPEg6hsSgb93KDQjYz57ajbcdXmfXBzksTJ1KUt8pv7CdLTBIsW/wdTpkyxLwBN9Skrsu9U\nVjEIiIgUg2NrSuEt9ZJLLqn1Ekv3MAwCIHVddlyENcTvWEdseGv2fQTi/Ev8vGUdVz2Q520XfdRP\nh8cq+6obVi4CLkt6ua/rKjNnzrTWt6Y62NYV10HXS0Rk0O+AiPbXeVWDIyF77rlnV3+WokkIMFE3\nUzS9pq78t+yyfAvQx1lDmr5qoUwyRZ8V7WQK9mXIv//7v9upUVbS1NEiWefnQhn9oTL7h4CISP+w\nblRN7qGzePHi2jwUHQkByG7BupzloogpGddplEn9DBBpSE54ICzS/E8fsV9P0/dxcVYR3z/D4Z7l\nu19EMItucXnc/bV169ZaWiTd86Db/11c23ReCPRCQESkF0IDfJ2HT10iYfKg/vSnP232228/u8rA\nRUmN6p40RCEqf/gclgfqc8QGcoE+Wd5ayecPuP4UT7jebr/Rgby01enVLX3drxGQrtsS7G76hzHt\nl5NpN53SXEN/xPWjmx6tg88I9
xk+IYhIiIVBf0pA4E//XyAllKsiW4BAsNmRHfhZTbPvvvsaBosq\n5MEHH7R+BGeccYYdrN75znfGqlEGCWGA2GuvvTp1Uj9Levnupksng3cAocEq4j7btm0zP/7xjy2x\n+fWvfz2sHi/biMNvf/vbxr09j7jYwBPvete7zLPPPmu453oJg+Mrr7xiMWMQZ/oLUuYw7ZW/TtfD\nJATdDjzwQPu/NmvWLPPb3/7WfPzjH69EZf6XWA7O0vXbb789cvl6JYqp0tYhIItI67q0+AZhETjz\nzDPN/vvvb4gG6d7ciq9peIkMOCwnfuyxx6wVhCWO3awQUQ/14SWm+8WDuJvFomjSQ3t9f4Zu0zhY\nq5ocDTfcE/QdlgzfWuSn8Z1My/S78ess+7jX/cp1rIDIggULci9DT9oe7sOvfvWrlnxcd911PX2i\nkpardEIgFoGydtNTue1CgF1fb7rpJrsjIzupBgNGaQ10dQWEZ2jGjBmdHWw5HwzUsfUm2ZU2NrN3\ngXqSlpU0nVd84kMwpnz3QS8n7HjaDQuXrknffptcH0S1vUltitOVvk36P8T/Hf8LZf/foWvgjG3r\nmjZtWmL94tqo80IgKQImaUKlEwIgwMOTB2LAbO13kYMhZbuHLg/CqEHeDVDh3ohKG06T5Dc6pGkT\n6fn0Q9CLdr700ksW/37U2c86zj333KGrrrrKtjFNH/RTxyLqynLPcN/7/3dF3e+0h7L5v4MI8lm/\nfn0RzVQZQiAxAn8SayrRBSEQgQDTMjfeeGPHhE6ERT5M2TBVkVYwuRMumjKYiti+fbuN2Eicgiin\nQ3wsOE9dmJARTNjkzSvognSb/gnXAR7o4XQJXy/yN3rR9t/97ncmIGpFFl2Lsg4//HDzZ3/2Z7aN\nafqgFsonVKLXdExcMdz37v+OKTn8R/DZYouDLP936IFTLHFBmOrC52bJkiV248i4PZvidNN5IZAX\nAfmI5EVQ+Q3BmPDjCN6k7H41DJJjx461PgzAwxJTHnY7duywaO3atcume/755+3vyZMnm1NOOcVM\nmjQplUMcxIEHdPCGGUlabOEJ//Aw7+YP0qsY8kcRp175slyHuL366qtdg7llKbfqPGCIL0Rbg2Vl\nJSFx/cL/HRF+2eiQD5sIsqpswoQJnSzjx483r7/+uv0d9393xBFH9M3vq6OYDoSAh4CIiAeGDvMj\ngGUgMKvbFR08+BCsHOyF4j8gJ06caK0YWBTyyDe/+U3zvve9L5UVw6/P6ZuXRFAOA00/3uSxPiFt\nC7HdZiJSNAnx72GO3X3MSqpu/3cQk7333rsv92lYR/0WAnEIiIjEIaPztUfAPdyxikB+0pIJ8vMA\nL4o8YKGBWKFPmdJWIkJfYDkLJpbLhK/vZbv7NC/p7rviqlAI9AkB+Yj0CWhVUzwCTMm4gR8Swhs1\ng1kSyeIP0qtcCA2ESJINgbIJXDat8uVy95lISD4clbvdCIiItLt/W9u6KJ8MyAhvn+4NNK7x5GVg\nKGNwcIQorm6dHxwEnIWsjPtscFBUSwcBARGRQejllrURohG3SsZNs7g3Ub/pWEscgSnz7RvdepEh\nXy8d/x4BMGvLoO1ISJn3me4bIdAWBERE2tKTA9QONyUT12Rn7YB0OGGQ45PWj8TlT/NN/ZCepNNE\nacpuc1r6FSfmpotISNN7UPr3G4F39LtC1ddeBBh48ZFgWa5bKhjVWre0Fw9+9tVI8xbsLBpR5frn\neBN10yR/+qd/av76r/+6MKdUv564YywzSXWNKyPuPMuhN27cGHe5seeDwFqN1d0pLhLikNC3EEiO\ngIhIcqyUMgIBrAxPPvmkDUrmYhkcdthhNobI3LlzI3IYu7SXJb2LFy82q1atMkE0Rxugi03t3NRK\nVEbqipuSiUrPOVZh/OY3v4m7XOp54pIwMHVrUxYFxo0bZ/HOkrfOeYh3wb3QVBEJaWrPSe+qEdD
y\n3ap7oKH1E5VxxYoV1voxffp0Q1CyD3/4w5mWrrrATOzwOXr0aDN//vzI4GZpLQykd0HKIDFYbIom\nBb26j3qRNFafXmXSjjYucyXKJ4Ht2PG1aSIS0rQek751QkBEpE690QBdIA3BnhdW0zjCkKcZPsG5\n9dZbO4NSGhLipojC/iBx5/PomyRvGt2TlEcawnsTkjvcxqT565gOa9dTTz3Vd7KYFwuRkLwIKv+g\nIyAiMuh3QML282ZPJE/8P3yCkDB76mQM3uynseeeexq2vGfgTWJVSGL5KIMY9Gpg0XWCCb4i8+bN\n61V1I64zmLPfEA6rTRKRkCb1lnStKwJaNVPXnqmRXlgpePNm/h5n1H6Yzqlv8+bNZurUqdZcn2T/\nEQYFpNf0C2VDDLCQ9EuKXNIL2WJPkfvuu8+2o19tKLOeBx980DDF1yThHoIca4luk3pNutYRAVlE\n6tgrNdKJN++VK1faHXGrmgaAYFx00UXWOnL33XdHPvgZFJw/SFL4KJdBJImlJWmZ3dJlfXsmn7+i\nBFKDzk2dyojCCIvXwoULTVN2fi3awhWFic4JgUFBQERkUHo6ZTuxFpx//vnm5z//ufn617/et8E6\nTk30mTNnjnnzzTfN2rVrO2Qkr99HkqmcOJ2ynE8ygIWJRxzBYgt4lkmzPXyTxfkdYQFrgiTpwya0\nQzoKgbogICJSl56okR4M7lOmTLEa+YN+HVTEQvPyyy9bMoKefHpNxfTSOy+Z6VW+f526ID++zgxs\nvsQRDz8Nx5SDVaRXgLdwvrr9Pv30083RRx/diN2ERULqdvdInzYgICLShl4suA0sowxbHgquIldx\nkJFvf/vb5t577zUHHHBArrL8zAwySUmAny/t8Te/+U2bhaXKSJ4pL7BAmmoVAXP8gPA9qruvhUiI\nvdX0RwgUjoCISOGQNrtAzP0EJqubJcRHFVM+JIRAZUmcWP28vY6L9htx1ha/Xucsm4eAuPKabhVh\npQxEhBVZdRaRkDr3jnRrOgIiIk3vwQL1Z4A/99xz7UqMfjlwZlWfAZ7pozIGsTx+I2HiQeAxfxrG\nb29Rg9vNN99snnnmmcJJma9rGcfLly83ixYtsqujyii/qDKL6qei9FE5QqBtCIiItK1HM7aHAZRp\nCSwNTVm5gPWCN+o1a9bkmt6IgswRil5WC5fOldGNeLg07pu8YX8Rdy3NN+UcddRR1pn3vPPOS5O1\nsrRl9l2RjRIJKRJNlSUEohEQEYnGZeDO4heCLF26tFFtL/utmoHI9xuBOPhBt9IQjyhgGZCLiEVB\nOeiJr0WcBSaq/irOQZzKsmYV2R6RkCLRVFlCIB4BEZF4bAbmCg/cpjgMRnUKVhEsAWVYAyAezz33\nnHn3u99t98FxMTyi9Mh6rqgBj8Bzl1xySe3DpEN633777VpPJRXVJ1nvCeUTAoOEgIjIIPV2TFub\ntHwyqglFEiksC1HBw/L4jUTp7J8raoqGMv3lzXVchVJ3/egLrEq9puT8/tOxEBAC+RAQEcmHX+Nz\nFzmIVwkGZOrUU09NbRUJEw9/GibcnjIHKYgOUoSTsBvs6xCIzsfQ6VXXFVlFEkK/3ToWAkKgOwLa\na6Y7PrFXDz/8cDNq1Cj7YRdUX9x5vjdt2uRf6nrMfht+3q6JC7p4xx13mLlz59Y+hkOv5hICnhUY\nvQTi5X8Y+Hn7dZ9uVgSukY78DFpFCnr4vid5yiamyGGHHWZ1hWhVLWAFUXSB6LphXJWuIiFVIa96\nhYAxtSUiDO5uUGbQlxSPAA/fZcuW2UGi+NL7W6Jb6QNJ8MUnHRw7wuG+swyK5MWC4awYfn15jh3J\nyVOGywsZYZdkLDw49FYlECFW9Oyxxx61jU0jElLV3aF6hcDvEXiHgBhcBNavX29mzJhRyHRAHVD8\nxCc+YYmVrwuDexnCyhRHRoqYTnE6QhwYvItY+cIuyd/5znf
MrFmzzCOPPGKIN1Kkrk7nqG8GdzYo\n/PznP2/uueee1FNmUWWWcU4kpAxUVaYQSIdAbS0i6ZrR/9QvvfSSGRoash8e9E2URx991L6tNlH3\nsM4EY/vYxz5mp8KctaMsEuLqdoN6kdMfzkLDAFmEgMHGjRvNIYccYvelgYwUVXacfqzewQpCkDUc\nP8tYzRRXd5rzIiFp0FJaIVAiAsFgWit5/vnnh4LmRn6Cee8Rut55551DnPfzcG7Hjh2RaV26s846\ny16//vrrO3ldBpeGb/Thc+KJJ9p0lI34dbpzcfkff/zxTn7KpCzOheWBBx7o6EK6KEnT3qj8/rlg\nIB0K/BL8U40/rqJNwSqbocDyUCh2RZeHcgEpGJo2bdoQGF177bWF9j0YrF69eiggPPbDcZ0FfcFD\nIgSEQPUIRI92FeqVlIhANBw58ImDO95nn32GXn/99WEtYRB31yEi4fwusUvDt09U+O1IR1IicvXV\nV3fq9Mv1y3L1diMiWdrryo365iHMgNQ2YaANppwqaRbkgQGuKCmDjKAbfX/55Zfb+/LYY48dCqZO\nMg3KjnxQFvcS2NedgNB+kRBQkAiB+iDQWB8RfBueeOKJYDyPlmDgNieffLJ57bXXDNEvw3L//feH\nT0X+vuqqqyLPJz153XXXxSa94IILzMEHH2yOPPLI2DTuQt72unLc91tvvWUmTpzoflbyPX36dOP6\nIfiXKEQHtpMPCGglYeqZBmGahumVYGDO3R6Cp+GHUkRZvjL4n+DMOn/+fIOfEFN0AWG2SbgnwBAZ\nP378sP+drVu3ml27dplXXnnFvPjii2bLli0mIB922fRll11WuJ5WiYL/aDqmYEBVnBAoAIHa+Ygw\nKDMoBZaHTvMC64M9h18GwjJXn4SQljx8AqtCJx9kxP/dufCHA8oNLDCdvOHr7jcPaVd+Fn+QOP0o\nn71deklR7fXrYbB2A45/vqrj4C21kKoDS5gdKAspLEMhzsm0CL8RCEhRS3qjmgJhwqGVsP7U89RT\nTxmWQSMQjsWLF5sFCxZ0Pq+++qq9dsoppxhWtfE/we7H+IAUTZZsRQX/cc7Fro8KLl7FCQEhkBWB\n4GFSS2EKJGiT/TAN4kvwsOxcY+ojLHF5/fOUzTRQlLh6+Xa+JOF0aaZmwnnDegQPfZskbmoma3vD\n9fq/b7rppiE+VQrYOqzj+iKtfkxnMEVQtWD+L2pqpahyqsakyvrxhWqbP1SVeKpuIVAkArWziAQD\nU0954YUXOmmi3uonT57cuU4Qpbi37SRTIuxjkkeI9hmWj370o8NOMU3STYpqr18H5nWsB3UR97Zd\nF33y6oG1gamaIoKfuSW9eXUa1Pwu3ksTrDaD2kdq92Aj0EgiArlwgh+IC3zmvsMDbBQRYVomiey+\n++5JksWm2XvvvUdcC/usROnnZyqivX55HLPpWJRu4XT9+v2lL33J4IPQNoGMuCmBrG0reklvVj2a\nmE8kpIm9Jp0HDYFGEpFB66Q6t9ePgOuIYNJv56hK+5xz8Q033NA6QuJ8EvL4jVBGsNqlzrdC7XQT\nCaldl0ghIRCJQCOJiG/N8J1NgzmrjlOpf1zlmz9OoWEJW0B66VdGewm53WtKKKx32b8hI6xSwjrS\nNmFagE84BH2adrqpnjR5BjWtSMig9rza3UQEGrl894gjjrAbaAE4vgVJfD2q6hyiS5500knDqn/2\n2Wc7v5lG6kVEymjvhAkTrBWio4gOSkfA9xvptstvN0XKWtJLnVhsmB6DEG7fvt1s27ZtmCqQV+4b\npivHjRtnfWCGJajJD5GQmnSE1BACCRFohEVk586dw5pzzDHHdH4Ti8Pf/Za3/IsvvrjjN1L1hnnE\nEfH1YykuOjs555xz3GHsd1ntZYlm24SBlAGzzpLHbwSrCrEwigjTThmEY585c6YN/37mmWeaFStW\nWOjGjBljd2VmZ2b3Ydk
uAvnnHFNwOHMTNt4N/jZBhX+cHnJMrbATVLUQSIlAIywivKHx0GOKglgi\nZ5xxho1t4Jw4Gdj9wd3HgAdm1dJNPxe3oZuOZbSXYFXEicgrrFBieqxICTvzpikbcsVbe90Fnw8G\nTawQzockqc6kdzsJJ83jpyMv8XUWLlzYCUgWhHzvGQsEAuULRCZYWmwee+wxax1Br9mzZ9vYJH66\nfh2LhPQLadUjBApGoMi1wEWWxV4sQVOHfQIi0qkiICcjQrSH0xOvwxc/fodflp+GY78cYntECfld\nunA97jzf4RDx/rVwvrg4ItSfpb1RertzxFQI3hrdz9Z8VxniPQuIgb9Q5vDqafdKcTFW6HdiyBQZ\nV4N2VLnXjOKEZLn7lEcI1AOB2k7N4FcRDNSxsS7wq1i3bp1NE+wZE4zvfxQiofKWniUK6h9LKeaI\nMOa8fWLNcYK+AdFKpV/R7XWm6zwrOVx76vS9atUqc+CBB9ZJpa664DdCX6R1YiUfH2cF6FYJlgsc\ngKdOnWqj6bL65tJLL+1pAelWZvgauhCldfPmzTZ0/DXXXGOnbfpxf7k63D0d1k2/hYAQqDcCo+BD\n9VZR2pWFAL4BbNde123a07Z7w4YNJtiAzQ6GafPWIT1kJK0Ta68pGq5DyPfff3/ry9HPwRrfkc9/\n/vMmsL5Y4lMGxpAQ2gQRkggBIdBMBGprEWkmnM3SGufD5cuXN0vpLto+99xz1uehS5JaX8rixNpt\nSS99ixVkzpw5dk+YfpIQgMbqgvVlzZo11iG2CAdbvwNFQnw0dCwEmouALCLN7bvcmjMw4BgazK8X\naqbPrVjGAljaysZtaZ0/M1ZXWjamW+ibpO1w0zM+0bjiiivMypUra4EHbYEMvfnmm2bt2rWFWC9E\nQkq7/VSwEOg7ArKI9B3y+lSIOZupjGXLltVHqYyasAx19OjRiQfvjNX0JRuEgk9SvxHSMtjzQSAh\nrCjDGpGUzJTZMO4zdvjFT2rKlCkdPbPWKRKSFTnlEwL1REAWkXr2S9+0YrDDfM+g1eR59g984APm\nkksuMTNmzOgbdv2oiP5J6jdCWhyjISFFWR6KbqMjSVn1EwkpukdUnhCoHgFZRKrvg0o1wMdg4sSJ\n5u67765UjzyVYw3B55qYJgzG/idPuXXIm8ZvhGBuTMd8/etfry2pvPHGG81+++1nzj///NTwioSk\nhkwZhEAjEJBFpBHdVK6SDNxYRfj2/QzKrbW40g899FC7ZJTlo2GhTb7gE1OH6QpfpyTHvfxGGKSx\nnNRlOqZbm5hCYorm2GOPNfPmzeuWtHNNJKQDhQ6EQOsQEBFpXZdmaxAm87ffftvO5WcroZpcLBFl\nVQZOqkmEQZDB2pekUx9+niqOne5YSXzhPMuwcQhtylJsiAXh4em7cHv8tnEsEhJGRL+FQLsQEBFp\nV39mbg2DGQPyrbfeWlmI7rTKF2UFoJzwjsi9Bse0uhaZHiuPT54IVrZlyxa7RLfIesoui+XFixYt\n6hr3JdzWsnVS+UJACPQfARGR/mNe2xp56DNF04QlsGVbAcJTOiwNrtO0FeTJORejW1OXYGMV4Z4j\n5khY6IM6E8KwvvotBIRANgRERLLh1tpcTHXcddddZuPGjZ2Bro6NZQBjOSjOj/0QfDQY7H3xrRL+\n+X4do9MXv/hFs++++yb2teiXbknrceQ3vGpLJCQpgkonBJqPgIhI8/uw8BbkXWJZuEKhAuuiX3hK\np9+OsBARrCHoccABB4RQas7P008/3e6B46wiIiHN6TtpKgSKQEBEpAgUW1hGXQZ7H1qmY9hMra5x\nMtAv7Ahb5pQOfYT0yyrk90WRxxAP9sNhwzyRkCKRVVlCoBkIiIg0o58q0ZKBbv369TZIVtVLXhnk\nWfKJZA2GVQWIUVM6Rfk9QHKa4M+TBHeWYF9wwQXm4osvTpJcaYSAEGgRAu9oUVvUlIIR4
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 47, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 48, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "{'': 0,\n", + " 'inhabitants': 1,\n", + " 'goku': 2,\n", + " 'stunts': 3,\n", + " 'catepillar': 4,\n", + " 'kristensen': 5,\n", + " 'goddess': 7,\n", + " 'offing': 49797,\n", + " 'distroy': 8,\n", + " 'unexplainably': 9,\n", + " 'concoctions': 10,\n", + " 'petite': 11,\n", + " 'paramilitary': 24759,\n", + " 'scribe': 12,\n", + " 'stevson': 13,\n", + " 'senegal': 6,\n", + " 'sctv': 14,\n", + " 'soundscape': 15,\n", + " 'rana': 16,\n", + " 'immortalizer': 18,\n", + " 'rene': 67354,\n", + " 'eko': 23,\n", + " 'planning': 20,\n", + " 'akiva': 21,\n", + " 'plod': 22,\n", + " 'orderly': 24,\n", + " 'zeleznice': 25,\n", + " 'critize': 29,\n", + " 'baguettes': 25649,\n", + " 'jefferies': 30,\n", + " 'uncertainties': 61695,\n", + " 'mountainbillies': 31,\n", + " 'steinbichler': 32,\n", + " 'vowel': 33,\n", + " 'rafe': 34,\n", + " 'donig': 68719,\n", + " 
'tulipe': 36,\n", + " 'clot': 37,\n", + " 'hack': 12526,\n", + " 'distended': 38,\n", + " 'cornered': 37116,\n", + " 'impatiently': 40,\n", + " 'batrice': 12525,\n", + " 'unfortuntly': 41,\n", + " 'lung': 42,\n", + " 'scapegoats': 43,\n", + " 'pscychosexual': 45,\n", + " 'outbid': 46,\n", + " 'obit': 47,\n", + " 'sideshows': 48,\n", + " 'jugde': 49,\n", + " 'kevloun': 51,\n", + " 'quartier': 53,\n", + " 'harp': 61948,\n", + " 'unravelling': 54,\n", + " 'antiques': 56,\n", + " 'strutts': 57,\n", + " 'tilts': 58,\n", + " 'disconcert': 59,\n", + " 'dossiers': 60,\n", + " 'sorriest': 61,\n", + " 'craftsman': 49412,\n", + " 'blart': 62,\n", + " 'dependence': 37120,\n", + " 'sated': 61698,\n", + " 'iberia': 63,\n", + " 'sagan': 72,\n", + " 'frmann': 65,\n", + " 'daniell': 66,\n", + " 'rays': 67,\n", + " 'pried': 68,\n", + " 'khoobsurat': 69,\n", + " 'leavitt': 70,\n", + " 'caiano': 71,\n", + " 'attractiveness': 73,\n", + " 'kitaparaporn': 74,\n", + " 'hamilton': 75,\n", + " 'massages': 76,\n", + " 'horgan': 78,\n", + " 'chemist': 79,\n", + " 'audrey': 80,\n", + " 'yeow': 55655,\n", + " 'jana': 81,\n", + " 'dutch': 82,\n", + " 'pinchot': 24773,\n", + " 'override': 83,\n", + " 'dwervick': 63223,\n", + " 'spasms': 84,\n", + " 'resumed': 85,\n", + " 'tamale': 66259,\n", + " 'calibanian': 49636,\n", + " 'stinson': 86,\n", + " 'widows': 87,\n", + " 'stonewall': 88,\n", + " 'palatial': 89,\n", + " 'neuman': 90,\n", + " 'abandon': 91,\n", + " 'lemmings': 65314,\n", + " 'anglophile': 92,\n", + " 'ertha': 61706,\n", + " 'chevette': 94,\n", + " 'unscary': 95,\n", + " 'spoilerific': 97,\n", + " 'neworleans': 67639,\n", + " 'metamorphose': 17,\n", + " 'brigand': 99,\n", + " 'cheating': 41603,\n", + " 'clued': 101,\n", + " 'dermatonecrotic': 102,\n", + " 'grady': 103,\n", + " 'mulligan': 104,\n", + " 'ol': 105,\n", + " 'incubation': 107,\n", + " 'plaintiffs': 110,\n", + " 'snden': 109,\n", + " 'fk': 111,\n", + " 'deply': 112,\n", + " 'franchot': 113,\n", + " 'henstridge': 19,\n", + " 
'cyhper': 114,\n", + " 'verbose': 26,\n", + " 'mazovia': 116,\n", + " 'elizabeth': 117,\n", + " 'palestine': 118,\n", + " 'robby': 119,\n", + " 'wongo': 120,\n", + " 'moshing': 121,\n", + " 'mstified': 12543,\n", + " 'eeeee': 122,\n", + " 'doltish': 123,\n", + " 'bree': 124,\n", + " 'postponed': 125,\n", + " 'debacles': 127,\n", + " 'amplify': 27,\n", + " 'kamm': 128,\n", + " 'phantom': 18893,\n", + " 'boylen': 136,\n", + " 'rolando': 131,\n", + " 'premises': 133,\n", + " 'bruck': 134,\n", + " 'loosely': 135,\n", + " 'wodehousian': 139,\n", + " 'onishi': 70389,\n", + " 'encapsuling': 140,\n", + " 'partly': 141,\n", + " 'stadling': 144,\n", + " 'calms': 143,\n", + " 'darkie': 148,\n", + " 'wheeling': 147,\n", + " 'ursla': 15875,\n", + " 'subsidized': 49420,\n", + " 'mckellar': 149,\n", + " 'ooookkkk': 151,\n", + " 'milky': 152,\n", + " 'unfolded': 153,\n", + " 'degrades': 154,\n", + " 'authenticating': 155,\n", + " 'writeup': 12548,\n", + " 'rotheroe': 156,\n", + " 'beart': 157,\n", + " 'intoxicants': 160,\n", + " 'grispin': 159,\n", + " 'cannes': 61718,\n", + " 'antithetical': 70398,\n", + " 'nnette': 161,\n", + " 'tsukamoto': 163,\n", + " 'antwones': 44205,\n", + " 'stows': 164,\n", + " 'suddenness': 165,\n", + " 'vol': 61720,\n", + " 'waqt': 166,\n", + " 'camazotz': 168,\n", + " 'paps': 55042,\n", + " 'shakher': 170,\n", + " 'terminate': 63868,\n", + " 'kotex': 56419,\n", + " 'delinquency': 171,\n", + " 'bromwell': 25214,\n", + " 'insecticide': 173,\n", + " 'charlton': 174,\n", + " 'nakada': 177,\n", + " 'titted': 24791,\n", + " 'urbane': 178,\n", + " 'depicted': 54491,\n", + " 'sadomasochistic': 179,\n", + " 'hyping': 181,\n", + " 'yr': 182,\n", + " 'hebert': 183,\n", + " 'waxwork': 12990,\n", + " 'deathrow': 185,\n", + " 'nourishes': 24792,\n", + " 'unmediated': 187,\n", + " 'tamper': 37143,\n", + " 'soad': 190,\n", + " 'alphabet': 189,\n", + " 'donen': 191,\n", + " 'lord': 192,\n", + " 'recess': 193,\n", + " 'watchably': 61023,\n", + " 'handsome': 194,\n", + " 
'vignettes': 196,\n", + " 'pairings': 198,\n", + " 'uselful': 199,\n", + " 'sanders': 200,\n", + " 'outbursts': 72891,\n", + " 'nots': 201,\n", + " 'hatsumomo': 202,\n", + " 'actioned': 18292,\n", + " 'krimi': 24797,\n", + " 'appleby': 203,\n", + " 'tampax': 204,\n", + " 'sprinkling': 205,\n", + " 'defacing': 206,\n", + " 'lofty': 207,\n", + " 'verger': 213,\n", + " 'tablespoons': 211,\n", + " 'bernhard': 212,\n", + " 'goosebump': 64565,\n", + " 'acumen': 214,\n", + " 'percentages': 215,\n", + " 'wendingo': 216,\n", + " 'resonating': 217,\n", + " 'vntoarea': 218,\n", + " 'redundancies': 219,\n", + " 'strictly': 57081,\n", + " 'pitied': 221,\n", + " 'belying': 222,\n", + " 'michelangelo': 53153,\n", + " 'gleefulness': 223,\n", + " 'environmentalist': 24803,\n", + " 'gitane': 226,\n", + " 'corrected': 66547,\n", + " 'journalist': 227,\n", + " 'focusing': 228,\n", + " 'plethora': 229,\n", + " 'his': 39,\n", + " 'citizen': 230,\n", + " 'south': 55579,\n", + " 'clunkers': 232,\n", + " 'pendulous': 55991,\n", + " 'mounds': 24805,\n", + " 'deplorable': 233,\n", + " 'forgive': 234,\n", + " 'proplems': 235,\n", + " 'bankers': 237,\n", + " 'aqua': 238,\n", + " 'donated': 239,\n", + " 'disbelieving': 240,\n", + " 'acomplication': 241,\n", + " 'contrasted': 243,\n", + " 'muzzle': 44,\n", + " 'amphibians': 72141,\n", + " 'springs': 246,\n", + " 'reformatted': 49443,\n", + " 'toolbox': 247,\n", + " 'contacting': 248,\n", + " 'washrooms': 250,\n", + " 'raving': 251,\n", + " 'dynamism': 252,\n", + " 'mae': 253,\n", + " 'disharmony': 255,\n", + " 'molls': 72979,\n", + " 'dewaere': 12569,\n", + " 'untutored': 256,\n", + " 'icarus': 257,\n", + " 'taint': 258,\n", + " 'kargil': 259,\n", + " 'captain': 260,\n", + " 'paucity': 261,\n", + " 'fits': 262,\n", + " 'tumbles': 263,\n", + " 'amer': 264,\n", + " 'bueller': 265,\n", + " 'cleansed': 267,\n", + " 'shara': 269,\n", + " 'humma': 270,\n", + " 'outa': 272,\n", + " 'piglets': 273,\n", + " 'gombell': 274,\n", + " 'supermen': 275,\n", + 
" 'superlow': 276,\n", + " 'kubanskie': 280,\n", + " 'goode': 278,\n", + " 'disorganised': 45570,\n", + " 'zenith': 281,\n", + " 'ananda': 282,\n", + " 'matlin': 284,\n", + " 'particolare': 50,\n", + " 'presumptuous': 286,\n", + " 'rerun': 287,\n", + " 'toyko': 288,\n", + " 'bilb': 291,\n", + " 'sundry': 290,\n", + " 'fugly': 292,\n", + " 'orchestrating': 293,\n", + " 'prosaically': 294,\n", + " 'moveis': 296,\n", + " 'conelly': 297,\n", + " 'estrange': 298,\n", + " 'elfriede': 49455,\n", + " 'masterful': 52,\n", + " 'seasonings': 300,\n", + " 'quincey': 303,\n", + " 'frowning': 49456,\n", + " 'painkillers': 53444,\n", + " 'high': 25515,\n", + " 'flesh': 304,\n", + " 'tootsie': 305,\n", + " 'ai': 306,\n", + " 'tenma': 307,\n", + " 'duguay': 71257,\n", + " 'appropriations': 308,\n", + " 'ides': 310,\n", + " 'rui': 61734,\n", + " 'surrogacy': 311,\n", + " 'pungent': 312,\n", + " 'damaso': 314,\n", + " 'authoritarian': 61736,\n", + " 'caribou': 315,\n", + " 'ro': 318,\n", + " 'supplying': 317,\n", + " 'yuy': 319,\n", + " 'debuted': 321,\n", + " 'mounts': 323,\n", + " 'interpolated': 324,\n", + " 'aetv': 325,\n", + " 'plummer': 326,\n", + " 'asunder': 331,\n", + " 'airfix': 333,\n", + " 'dubiel': 329,\n", + " 'clavichord': 330,\n", + " 'crafty': 50465,\n", + " 'sublety': 332,\n", + " 'stoltzfus': 334,\n", + " 'ruth': 335,\n", + " 'fluorescent': 336,\n", + " 'improves': 337,\n", + " 'russells': 339,\n", + " 'tick': 43838,\n", + " 'zsa': 341,\n", + " 'macs': 343,\n", + " 'jlb': 345,\n", + " 'locus': 348,\n", + " 'mislead': 349,\n", + " 'merly': 49461,\n", + " 'corey': 350,\n", + " 'blundered': 351,\n", + " 'humourless': 3568,\n", + " 'disorganized': 353,\n", + " 'discuss': 354,\n", + " 'sharifi': 45391,\n", + " 'tieing': 356,\n", + " 'kats': 34784,\n", + " 'bbc': 360,\n", + " 'pranked': 362,\n", + " 'superman': 363,\n", + " 'holroyd': 9223,\n", + " 'aggravated': 364,\n", + " 'rifleman': 365,\n", + " 'yvone': 366,\n", + " 'vaugier': 24820,\n", + " 'radiant': 367,\n", + " 
'galico': 368,\n", + " 'debris': 369,\n", + " 'btw': 371,\n", + " 'denote': 24822,\n", + " 'havnt': 372,\n", + " 'francen': 373,\n", + " 'chattered': 374,\n", + " 'scathed': 375,\n", + " 'pic': 376,\n", + " 'ceremonies': 377,\n", + " 'everyplace': 65309,\n", + " 'betsy': 379,\n", + " 'finster': 37176,\n", + " 'meercat': 381,\n", + " 'noirs': 382,\n", + " 'grunts': 383,\n", + " 'tribulations': 385,\n", + " 'apparatus': 47673,\n", + " 'martnez': 25825,\n", + " 'telethons': 24825,\n", + " 'talladega': 387,\n", + " 'alloimono': 390,\n", + " 'situations': 64,\n", + " 'scrutinising': 391,\n", + " 'geta': 392,\n", + " 'beltrami': 393,\n", + " 'pvc': 394,\n", + " 'horse': 395,\n", + " 'tiburon': 396,\n", + " 'huitime': 397,\n", + " 'ripple': 398,\n", + " 'exceed': 61748,\n", + " 'loitering': 399,\n", + " 'forensics': 400,\n", + " 'nearly': 401,\n", + " 'ellington': 403,\n", + " 'uzi': 404,\n", + " 'rung': 408,\n", + " 'pillaged': 24829,\n", + " 'gao': 409,\n", + " 'licitates': 410,\n", + " 'protocol': 411,\n", + " 'smirker': 412,\n", + " 'torin': 413,\n", + " 'vizier': 31853,\n", + " 'newlywed': 414,\n", + " 'dismay': 416,\n", + " 'moonwalks': 418,\n", + " 'skyler': 417,\n", + " 'invested': 18455,\n", + " 'grifter': 421,\n", + " 'undersold': 422,\n", + " 'chearator': 423,\n", + " 'marino': 424,\n", + " 'scala': 425,\n", + " 'conditioner': 426,\n", + " 'lamarre': 428,\n", + " 'figueroa': 429,\n", + " 'mcinnerny': 61753,\n", + " 'allllllll': 431,\n", + " 'slide': 432,\n", + " 'lateness': 433,\n", + " 'selbst': 434,\n", + " 'dramatizing': 436,\n", + " 'doable': 438,\n", + " 'hollywoodize': 27207,\n", + " 'alexanderplatz': 440,\n", + " 'wholesome': 45745,\n", + " 'pandemonium': 441,\n", + " 'earth': 443,\n", + " 'mounties': 444,\n", + " 'seeker': 445,\n", + " 'cheat': 446,\n", + " 'outbreaks': 447,\n", + " 'savagely': 61759,\n", + " 'snowstorm': 448,\n", + " 'baur': 449,\n", + " 'schedules': 450,\n", + " 'bathetic': 451,\n", + " 'johnathon': 453,\n", + " 'origonal': 57843,\n", 
+ " 'rosanne': 454,\n", + " 'cauldrons': 456,\n", + " 'forrest': 457,\n", + " 'poky': 458,\n", + " 'aristos': 54856,\n", + " 'womanness': 460,\n", + " 'spender': 461,\n", + " 'pagliai': 37108,\n", + " 'rational': 463,\n", + " 'terrell': 464,\n", + " 'affronts': 472,\n", + " 'concise': 49476,\n", + " 'mathew': 468,\n", + " 'narnia': 469,\n", + " 'naseeruddin': 470,\n", + " 'bucks': 471,\n", + " 'proceeds': 69809,\n", + " 'topple': 473,\n", + " 'degree': 474,\n", + " 'passionately': 476,\n", + " 'defeats': 477,\n", + " 'gras': 49477,\n", + " 'sources': 479,\n", + " 'pflug': 49976,\n", + " 'botticelli': 480,\n", + " 'fwd': 486,\n", + " 'waiving': 483,\n", + " 'gunnar': 484,\n", + " 'stiffler': 485,\n", + " 'unwise': 49480,\n", + " 'kawajiri': 487,\n", + " 'sistahs': 489,\n", + " 'swallowed': 30511,\n", + " 'soulhunter': 490,\n", + " 'belies': 491,\n", + " 'wrathful': 492,\n", + " 'badmouth': 16696,\n", + " 'floradora': 61766,\n", + " 'unforgivably': 497,\n", + " 'weirdy': 496,\n", + " 'violation': 63309,\n", + " 'chepart': 498,\n", + " 'departmentthe': 500,\n", + " 'posehn': 49483,\n", + " 'peyote': 37188,\n", + " 'psychiatrically': 24846,\n", + " 'marionettes': 503,\n", + " 'blatty': 502,\n", + " 'atop': 504,\n", + " 'debases': 25135,\n", + " 'henze': 24845,\n", + " 'unrooted': 510,\n", + " 'cloudscape': 508,\n", + " 'resignedly': 509,\n", + " 'begin': 49917,\n", + " 'hitlerian': 512,\n", + " 'reedus': 517,\n", + " 'crewed': 514,\n", + " 'bedeviled': 515,\n", + " 'unfurnished': 516,\n", + " 'herrmann': 12602,\n", + " 'circumstances': 518,\n", + " 'grasped': 519,\n", + " 'fn': 521,\n", + " 'beefed': 22200,\n", + " 'scwatch': 64018,\n", + " 'dishwashers': 522,\n", + " 'roadie': 523,\n", + " 'ruthlessness': 524,\n", + " 'migrant': 12605,\n", + " 'refrains': 525,\n", + " 'preponderance': 44377,\n", + " 'lampooning': 526,\n", + " 'richart': 528,\n", + " 'gwenneth': 530,\n", + " 'enmity': 531,\n", + " 'vortex': 61772,\n", + " 'assess': 532,\n", + " 'manufacturer': 533,\n", 
+ " 'bullosa': 534,\n", + " 'citizenship': 61774,\n", + " 'chekov': 537,\n", + " 'hogan': 536,\n", + " 'blithe': 538,\n", + " 'aredavid': 542,\n", + " 'drillings': 540,\n", + " 'revolvers': 541,\n", + " 'boyfriendhe': 545,\n", + " 'achcha': 544,\n", + " 'wallow': 546,\n", + " 'toga': 547,\n", + " 'bosnians': 551,\n", + " 'going': 550,\n", + " 'willy': 552,\n", + " 'fim': 554,\n", + " 'forbidding': 555,\n", + " 'delete': 56779,\n", + " 'rationalised': 557,\n", + " 'shimomo': 558,\n", + " 'opposition': 559,\n", + " 'landis': 560,\n", + " 'minded': 561,\n", + " 'arghhhhh': 564,\n", + " 'trialat': 566,\n", + " 'protected': 567,\n", + " 'negras': 568,\n", + " 'tracker': 571,\n", + " 'muti': 570,\n", + " 'dinky': 49489,\n", + " 'shawl': 572,\n", + " 'differentiates': 573,\n", + " 'dipaolo': 61779,\n", + " 'sweetheart': 574,\n", + " 'manmohan': 576,\n", + " 'enamored': 66265,\n", + " 'trevethyn': 577,\n", + " 'brain': 578,\n", + " 'incomprehensibly': 579,\n", + " 'pasadena': 581,\n", + " 'bruton': 59142,\n", + " 'shtick': 582,\n", + " 'ute': 583,\n", + " 'viggo': 584,\n", + " 'relevent': 589,\n", + " 'cites': 587,\n", + " 'greenaways': 61781,\n", + " 'minidress': 590,\n", + " 'philosopher': 591,\n", + " 'mahattan': 593,\n", + " 'moden': 594,\n", + " 'compiling': 595,\n", + " 'unimaginative': 598,\n", + " 'rogues': 597,\n", + " 'subpaar': 599,\n", + " 'darkly': 601,\n", + " 'saturate': 602,\n", + " 'fledgling': 603,\n", + " 'breaths': 604,\n", + " 'sceam': 37206,\n", + " 'empathized': 58870,\n", + " 'aszombi': 606,\n", + " 'incalculable': 608,\n", + " 'formations': 28596,\n", + " 'hampden': 619,\n", + " 'rawail': 612,\n", + " 'forbid': 613,\n", + " 'holiness': 617,\n", + " 'unessential': 618,\n", + " 'reputedly': 616,\n", + " 'wage': 63181,\n", + " 'kewpie': 24860,\n", + " 'asylum': 620,\n", + " 'bolye': 621,\n", + " 'celticism': 63189,\n", + " 'strangers': 622,\n", + " 'rantzen': 623,\n", + " 'farrellys': 624,\n", + " 'marathon': 93,\n", + " 'cantinflas': 626,\n", + " 
'disproportionately': 12617,\n", + " 'bared': 67212,\n", + " 'enshrined': 627,\n", + " 'expetations': 629,\n", + " 'replaying': 630,\n", + " 'topless': 636,\n", + " 'bukater': 632,\n", + " 'overpaid': 633,\n", + " 'exhude': 634,\n", + " 'nitwits': 638,\n", + " 'tsst': 51554,\n", + " 'sufferings': 637,\n", + " 'ci': 24693,\n", + " 'eponymously': 96,\n", + " 'ferdy': 644,\n", + " 'danira': 641,\n", + " 'unrelenting': 642,\n", + " 'disabling': 643,\n", + " 'gerard': 645,\n", + " 'drewitt': 646,\n", + " 'lamping': 650,\n", + " 'demy': 652,\n", + " 'wicklow': 37214,\n", + " 'relinquish': 651,\n", + " 'feminized': 64196,\n", + " 'drink': 653,\n", + " 'chamberlin': 654,\n", + " 'floodwaters': 657,\n", + " 'searing': 658,\n", + " 'isral': 659,\n", + " 'ling': 660,\n", + " 'grossness': 661,\n", + " 'sassier': 24865,\n", + " 'pickier': 662,\n", + " 'pax': 663,\n", + " 'fleashens': 98,\n", + " 'wierd': 664,\n", + " 'tereasa': 665,\n", + " 'smog': 666,\n", + " 'girotti': 667,\n", + " 'zooey': 64814,\n", + " 'spat': 668,\n", + " 'sera': 669,\n", + " 'misbehaving': 671,\n", + " 'scouts': 672,\n", + " 'refreshments': 673,\n", + " 'itll': 39668,\n", + " 'toyomichi': 676,\n", + " 'politeness': 100,\n", + " 'bits': 677,\n", + " 'psychotics': 678,\n", + " 'optimistic': 61796,\n", + " 'barzell': 679,\n", + " 'colt': 680,\n", + " 'anita': 49501,\n", + " 'shivering': 681,\n", + " 'utah': 59297,\n", + " 'scrivener': 686,\n", + " 'predicable': 687,\n", + " 'dryer': 684,\n", + " 'reissues': 685,\n", + " 'sexier': 26115,\n", + " 'spellbind': 691,\n", + " 'marmalade': 689,\n", + " 'seems': 690,\n", + " 'wyke': 37223,\n", + " 'innovator': 693,\n", + " 'inthused': 695,\n", + " 'scatman': 6309,\n", + " 'contestants': 696,\n", + " 'bertolucci': 106,\n", + " 'serviced': 699,\n", + " 'nozires': 700,\n", + " 'ins': 701,\n", + " 'mutilating': 702,\n", + " 'dupes': 703,\n", + " 'launius': 704,\n", + " 'widescreen': 705,\n", + " 'joo': 706,\n", + " 'discretionary': 707,\n", + " 'enlivens': 708,\n", + 
" 'manos': 55596,\n", + " 'bushes': 709,\n", + " 'header': 711,\n", + " 'activist': 712,\n", + " 'gethsemane': 713,\n", + " 'phoenixs': 714,\n", + " 'wreathed': 715,\n", + " 'oldboy': 108,\n", + " 'electrifyingly': 717,\n", + " 'inseparability': 24874,\n", + " 'ghidora': 719,\n", + " 'binder': 720,\n", + " 'tibet': 51530,\n", + " 'doddsville': 723,\n", + " 'sugar': 722,\n", + " 'porkys': 724,\n", + " 'hopefully': 37226,\n", + " 'scattershot': 725,\n", + " 'refunded': 726,\n", + " 'rudely': 727,\n", + " 'enacts': 67435,\n", + " 'insteadit': 728,\n", + " 'nightwatch': 61803,\n", + " 'eurotrash': 730,\n", + " 'radioraptus': 731,\n", + " 'unreservedly': 73710,\n", + " 'vall': 49508,\n", + " 'boogeman': 733,\n", + " 'flunked': 24880,\n", + " 'weighs': 734,\n", + " 'glorfindel': 738,\n", + " 'hypothermia': 737,\n", + " 'misled': 64919,\n", + " 'toiletries': 71501,\n", + " 'birthdays': 739,\n", + " 'attentive': 740,\n", + " 'mallepa': 741,\n", + " 'manoy': 743,\n", + " 'bombshells': 744,\n", + " 'glorifying': 115,\n", + " 'southron': 747,\n", + " 'destruction': 748,\n", + " 'manhole': 750,\n", + " 'elainor': 751,\n", + " 'bounder': 13003,\n", + " 'bowersock': 752,\n", + " 'lowly': 753,\n", + " 'wfst': 754,\n", + " 'limousines': 755,\n", + " 'skolimowski': 756,\n", + " 'saban': 757,\n", + " 'malaysia': 759,\n", + " 'cyd': 761,\n", + " 'bonecrushing': 763,\n", + " 'merest': 765,\n", + " 'janina': 766,\n", + " 'chemotrodes': 767,\n", + " 'trials': 768,\n", + " 'whilhelm': 770,\n", + " 'asthmatic': 771,\n", + " 'missteps': 773,\n", + " 'melyvn': 24885,\n", + " 'embittered': 774,\n", + " 'profit': 37234,\n", + " 'seeming': 776,\n", + " 'miscalculate': 777,\n", + " 'recommeded': 778,\n", + " 'mankin': 37235,\n", + " 'schoolwork': 779,\n", + " 'coy': 780,\n", + " 'mcconaughey': 781,\n", + " 'waver': 783,\n", + " 'unwatchably': 786,\n", + " 'saggy': 787,\n", + " 'breakup': 790,\n", + " 'pufnstuf': 37237,\n", + " 'superstars': 792,\n", + " 'replay': 793,\n", + " 'aggravates': 
794,\n", + " 'urging': 796,\n", + " 'snidely': 797,\n", + " 'aleksandar': 798,\n", + " 'hildy': 799,\n", + " 'kazuhiro': 800,\n", + " 'slayer': 801,\n", + " 'tangy': 802,\n", + " 'horne': 804,\n", + " 'masayuki': 805,\n", + " 'molden': 806,\n", + " 'unravel': 807,\n", + " 'goodtime': 808,\n", + " 'rowboat': 811,\n", + " 'dekhiye': 815,\n", + " 'datedness': 813,\n", + " 'astrotheology': 814,\n", + " 'suriani': 59610,\n", + " 'hostilities': 819,\n", + " 'wipes': 818,\n", + " 'sentimentalising': 820,\n", + " 'documentary': 821,\n", + " 'virtue': 823,\n", + " 'unreasonably': 824,\n", + " 'cei': 826,\n", + " 'hobbled': 37240,\n", + " 'unglamorised': 827,\n", + " 'balky': 828,\n", + " 'complementary': 829,\n", + " 'paychecks': 830,\n", + " 'tughlaq': 45551,\n", + " 'functionality': 836,\n", + " 'ily': 833,\n", + " 'prc': 834,\n", + " 'ennobling': 835,\n", + " 'dissociated': 837,\n", + " 'elk': 838,\n", + " 'throbbing': 839,\n", + " 'tempe': 840,\n", + " 'linoleum': 841,\n", + " 'bottacin': 843,\n", + " 'hipper': 844,\n", + " 'barging': 846,\n", + " 'untie': 847,\n", + " 'sacchetti': 848,\n", + " 'gnat': 849,\n", + " 'roedel': 850,\n", + " 'performs': 852,\n", + " 'nanavati': 856,\n", + " 'migrs': 854,\n", + " 'teachs': 855,\n", + " 'gunslinger': 126,\n", + " 'fresco': 857,\n", + " 'davison': 858,\n", + " 'jet': 59446,\n", + " 'burglar': 860,\n", + " 'jerker': 69267,\n", + " 'masue': 861,\n", + " 'dickory': 862,\n", + " 'muggy': 46634,\n", + " 'grills': 863,\n", + " 'figment': 28693,\n", + " 'monogamistic': 49527,\n", + " 'appelagate': 864,\n", + " 'linkage': 865,\n", + " 'loesser': 867,\n", + " 'patties': 868,\n", + " 'prudent': 869,\n", + " 'mallorquins': 870,\n", + " 'nativetex': 871,\n", + " 'suprise': 872,\n", + " 'quill': 874,\n", + " 'angsty': 71451,\n", + " 'speeded': 875,\n", + " 'farscape': 876,\n", + " 'herman': 129,\n", + " 'saddening': 877,\n", + " 'centuries': 878,\n", + " 'mos': 879,\n", + " 'neccessarily': 881,\n", + " 'tankers': 883,\n", + " 'latte': 
884,\n", + " 'faracy': 886,\n", + " 'stilts': 24897,\n", + " 'synthetically': 887,\n", + " 'thoughtless': 888,\n", + " 'authoring': 62813,\n", + " 'rake': 889,\n", + " 'ropes': 890,\n", + " 'whitewashed': 892,\n", + " 'donal': 893,\n", + " 'arching': 4910,\n", + " 'cockamamie': 899,\n", + " 'lifeless': 895,\n", + " 'perfidy': 896,\n", + " 'teresa': 897,\n", + " 'bulldog': 898,\n", + " 'vingh': 73726,\n", + " 'evacuees': 65858,\n", + " 'rasberries': 900,\n", + " 'chiseling': 903,\n", + " 'clampets': 905,\n", + " 'grecianized': 138,\n", + " 'smaller': 904,\n", + " 'kluznick': 62184,\n", + " 'alerts': 906,\n", + " 'aaaahhhhhhh': 909,\n", + " 'wellingtonian': 908,\n", + " 'dither': 910,\n", + " 'incertitude': 911,\n", + " 'florentine': 912,\n", + " 'imperioli': 913,\n", + " 'licking': 914,\n", + " 'disparagement': 915,\n", + " 'artfully': 916,\n", + " 'feds': 917,\n", + " 'fumiya': 918,\n", + " 'jbl': 52774,\n", + " 'tearfully': 919,\n", + " 'welfare': 24905,\n", + " 'idyllically': 49534,\n", + " 'isha': 43702,\n", + " 'lanchester': 920,\n", + " 'undertaken': 921,\n", + " 'longlost': 922,\n", + " 'netted': 923,\n", + " 'carrell': 924,\n", + " 'uncompelling': 925,\n", + " 'stems': 37258,\n", + " 'reliefs': 926,\n", + " 'leona': 927,\n", + " 'autorenfilm': 928,\n", + " 'unfriendly': 929,\n", + " 'typewriter': 930,\n", + " 'shifted': 931,\n", + " 'bertrand': 932,\n", + " 'blesses': 933,\n", + " 'leukemia': 12666,\n", + " 'posative': 142,\n", + " 'tricking': 934,\n", + " 'zanes': 936,\n", + " 'dashboard': 12667,\n", + " 'unknowingly': 937,\n", + " 'flatmates': 51897,\n", + " 'unnerve': 938,\n", + " 'caning': 939,\n", + " 'shortland': 146,\n", + " 'recluse': 941,\n", + " 'dcreasy': 942,\n", + " 'scratchiness': 24911,\n", + " 'pms': 30930,\n", + " 'chipmunk': 943,\n", + " 'tkachenko': 49537,\n", + " 'dipper': 944,\n", + " 'europeans': 61601,\n", + " 'berserkers': 948,\n", + " 'shys': 947,\n", + " 'monte': 68505,\n", + " 'eve': 949,\n", + " 'luxury': 61828,\n", + " 
'conflagration': 950,\n", + " 'water': 46389,\n", + " 'irks': 951,\n", + " 'positronic': 954,\n", + " 'cushy': 150,\n", + " 'swiftness': 957,\n", + " 'underimpressed': 964,\n", + " 'imprint': 959,\n", + " 'sundance': 961,\n", + " 'aida': 31951,\n", + " 'thematically': 963,\n", + " 'uno': 965,\n", + " 'expressly': 966,\n", + " 'russkies': 967,\n", + " 'discos': 968,\n", + " 'shaping': 969,\n", + " 'verson': 970,\n", + " 'blushed': 61831,\n", + " 'prototype': 971,\n", + " 'lifewell': 976,\n", + " 'trafficker': 973,\n", + " 'crucifixions': 62188,\n", + " 'unrealistically': 975,\n", + " 'rivas': 977,\n", + " 'consequent': 978,\n", + " 'katsu': 979,\n", + " 'titantic': 980,\n", + " 'jalees': 981,\n", + " 'ranee': 982,\n", + " 'gambles': 984,\n", + " 'dispenses': 985,\n", + " 'disfigurement': 986,\n", + " 'bright': 987,\n", + " 'cristian': 988,\n", + " 'subculture': 37268,\n", + " 'capta': 991,\n", + " 'jewel': 992,\n", + " 'erect': 993,\n", + " 'avoide': 996,\n", + " 'inconnu': 997,\n", + " 'headquarters': 998,\n", + " 'babbling': 1000,\n", + " 'pac': 1001,\n", + " 'performace': 1003,\n", + " 'dorrit': 1004,\n", + " 'runners': 1005,\n", + " 'sentimentality': 1006,\n", + " 'marred': 1007,\n", + " 'commemorative': 1008,\n", + " 'helpers': 1012,\n", + " 'chiles': 1011,\n", + " 'snowy': 1013,\n", + " 'cheddar': 1014,\n", + " 'neath': 158,\n", + " 'outshine': 1016,\n", + " 'nadu': 1019,\n", + " 'wellbeing': 1020,\n", + " 'envisioned': 43779,\n", + " 'fanaticism': 1021,\n", + " 'morrisette': 12687,\n", + " 'sesame': 1024,\n", + " 'gran': 1023,\n", + " 'marlina': 1025,\n", + " 'artificiality': 1030,\n", + " 'coinsidence': 1027,\n", + " 'founders': 1028,\n", + " 'dismissably': 1029,\n", + " 'dracht': 66299,\n", + " 'scavengers': 1031,\n", + " 'neese': 12685,\n", + " 'pangborn': 1034,\n", + " 'elmore': 1039,\n", + " 'bristol': 71162,\n", + " 'lillies': 1035,\n", + " 'parkers': 1036,\n", + " 'skipped': 1038,\n", + " 'clipboard': 1042,\n", + " 'jucier': 1041,\n", + " 'haifa': 
1043,\n", + " ...}" + ] + }, + "execution_count": 48, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "word2index = {}\n", + "\n", + "for i,word in enumerate(vocab):\n", + " word2index[word] = i\n", + "word2index" + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_target_for_label(label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0" + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'POSITIVE'" + ] + }, + "execution_count": 54, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[0]" + ] + }, + { + "cell_type": "code", + "execution_count": 52, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "1" + ] + }, + "execution_count": 52, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 55, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'NEGATIVE'" + 
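The `word2index` lookup and `update_input_layer` cells above can be condensed into a standalone sketch of the bag-of-words encoding. The tiny vocabulary and review string below are illustrative assumptions, not the notebook's IMDB data:

```python
import numpy as np

# Sketch of the bag-of-words encoding used above: map each vocabulary word
# to a column index, then count a review's words into a 1 x vocab_size row.
vocab = ["the", "movie", "was", "great", "terrible"]   # toy vocabulary (assumption)
word2index = {word: i for i, word in enumerate(vocab)}

layer_0 = np.zeros((1, len(vocab)))

def update_input_layer(review):
    layer_0[:] = 0                        # clear out previous state
    for word in review.split(" "):
        if word in word2index:            # ignore out-of-vocabulary words
            layer_0[0][word2index[word]] += 1

update_input_layer("the movie was great great")
print(layer_0)  # -> [[1. 1. 1. 2. 0.]]
```

Repeated words accumulate as counts (here "great" appears twice), which is exactly why the first `layer_0` shown above starts with an 18 for a very common token.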
] + }, + "execution_count": 55, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "labels[1]" + ] + }, + { + "cell_type": "code", + "execution_count": 53, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "0" + ] + }, + "execution_count": 53, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "get_target_for_label(labels[1])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 3: Building a Neural Network" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "- Start with your neural network from the last chapter\n", + "- 3 layer neural network\n", + "- no non-linearity in hidden layer\n", + "- use our functions to create the training data\n", + "- create a \"pre_process_data\" function to create vocabulary for our training data generating functions\n", + "- modify \"train\" to train over the entire corpus" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Where to Get Help if You Need it\n", + "- Re-watch previous week's Udacity Lectures\n", + "- Chapters 3-5 - [Grokking Deep Learning](https://www.manning.com/books/grokking-deep-learning) - (40% Off: **traskud17**)" + ] + }, + { + "cell_type": "code", + "execution_count": 86, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " # set our random number generator \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews, labels)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self, reviews, labels):\n", + " \n", + " review_vocab = 
set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " \n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " self.layer_0[0][self.word2index[word]] += 1\n", + " \n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def train(self, training_reviews, 
training_labels):\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + " self.update_input_layer(review)\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward pass ###\n", + "\n", + " # TODO: Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error: actual output minus the desired target.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # TODO: Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # TODO: Update the weights\n", + " self.weights_1_2 -= layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " self.weights_0_1 -= self.layer_0.T.dot(layer_1_delta) * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 /
float(i+1))[:4] + \"%\")\n", + " if(i % 2500 == 0):\n", + " print(\"\")\n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \" #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + " self.update_input_layer(review.lower())\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] > 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 87, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 61, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):587.5 #Correct:500 #Tested:1000 Testing Accuracy:50.0%" + ] + } + ], + "source": [ + "# evaluate our model before training (just to show how horrible it is)\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "code", + "execution_count": 62, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0%
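One forward/backward pass of the `SentimentNetwork` class above can be reduced to a minimal sketch. Because the hidden layer has no nonlinearity, `layer_1_delta` is just the backpropagated error with no derivative term, and the zero-initialized `weights_0_1` still train (the first update flows in through `weights_1_2`). The layer sizes and toy input below are assumptions for illustration, not the notebook's data:

```python
import numpy as np

# Minimal sketch of the training loop above: linear hidden layer,
# sigmoid output, plain gradient-descent weight updates on one example.
np.random.seed(1)
n_input, n_hidden = 5, 3
weights_0_1 = np.zeros((n_input, n_hidden))              # input -> hidden
weights_1_2 = np.random.normal(0.0, 1.0, (n_hidden, 1))  # hidden -> output
learning_rate = 0.1

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

layer_0 = np.array([[1., 0., 2., 0., 1.]])  # toy bag-of-words counts (assumption)
target = 1                                   # i.e. a POSITIVE label

for _ in range(10):
    layer_1 = layer_0.dot(weights_0_1)             # hidden layer (no nonlinearity)
    layer_2 = sigmoid(layer_1.dot(weights_1_2))    # output layer
    layer_2_error = layer_2 - target
    layer_2_delta = layer_2_error * layer_2 * (1 - layer_2)
    layer_1_delta = layer_2_delta.dot(weights_1_2.T)  # no derivative: hidden is linear
    weights_1_2 -= layer_1.T.dot(layer_2_delta) * learning_rate
    weights_0_1 -= layer_0.T.dot(layer_1_delta) * learning_rate

print(float(np.abs(layer_2_error)))  # shrinks toward 0 as the example is fit
```

The `abs(layer_2_error) < 0.5` check in `train()` counts a prediction as correct, which is why an untrained network sits at ~50% accuracy in the test output above.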
Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):89.58 #Correct:1250 #Trained:2501 Training Accuracy:49.9%\n", + "Progress:20.8% Speed(reviews/sec):95.03 #Correct:2500 #Trained:5001 Training Accuracy:49.9%\n", + "Progress:27.4% Speed(reviews/sec):95.46 #Correct:3295 #Trained:6592 Training Accuracy:49.9%" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output 
weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 63, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.01)" + ] + }, + { + "cell_type": "code", + "execution_count": 64, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):96.39 #Correct:1247 #Trained:2501 Training Accuracy:49.8%\n", + "Progress:20.8% Speed(reviews/sec):99.31 #Correct:2497 #Trained:5001 Training Accuracy:49.9%\n", + "Progress:22.8% Speed(reviews/sec):99.02 #Correct:2735 #Trained:5476 Training Accuracy:49.9%" + ] + }, + { + "ename": 
"KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m \u001b[0;34m-=\u001b[0m 
\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 65, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.001)" + ] + }, + { + "cell_type": "code", + "execution_count": 66, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):98.77 #Correct:1267 #Trained:2501 Training Accuracy:50.6%\n", + "Progress:20.8% Speed(reviews/sec):98.79 #Correct:2640 #Trained:5001 Training Accuracy:52.7%\n", + "Progress:31.2% Speed(reviews/sec):98.58 #Correct:4109 #Trained:7501 Training Accuracy:54.7%\n", + "Progress:41.6% Speed(reviews/sec):93.78 #Correct:5638 #Trained:10001 Training Accuracy:56.3%\n", + "Progress:52.0% Speed(reviews/sec):91.76 #Correct:7246 #Trained:12501 Training Accuracy:57.9%\n", + "Progress:62.5% 
Speed(reviews/sec):92.42 #Correct:8841 #Trained:15001 Training Accuracy:58.9%\n", + "Progress:69.4% Speed(reviews/sec):92.58 #Correct:9934 #Trained:16668 Training Accuracy:59.5%" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m# train the network\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0mmlp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mreviews\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0mlabels\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m-\u001b[0m\u001b[0;36m1000\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m", + "\u001b[0;32m\u001b[0m in \u001b[0;36mtrain\u001b[0;34m(self, training_reviews, training_labels)\u001b[0m\n\u001b[1;32m 117\u001b[0m \u001b[0;31m# TODO: Update the weights\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 118\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_1_2\u001b[0m \u001b[0;34m-=\u001b[0m \u001b[0mlayer_1\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update hidden-to-output weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 119\u001b[0;31m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mweights_0_1\u001b[0m 
\u001b[0;34m-=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlayer_0\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mT\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdot\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_1_delta\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m*\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mlearning_rate\u001b[0m \u001b[0;31m# update input-to-hidden weights with gradient descent step\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 120\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 121\u001b[0m \u001b[0;32mif\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mnp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mabs\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mlayer_2_error\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m<\u001b[0m \u001b[0;36m0.5\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "# train the network\n", + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Understanding Neural Noise" + ] + }, + { + "cell_type": "code", + "execution_count": 67, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAiIAAAFKCAYAAAAg+zSAAAAABGdBTUEAALGPC/xhBQAAACBjSFJN\nAAB6JgAAgIQAAPoAAACA6AAAdTAAAOpgAAA6mAAAF3CculE8AAAB1WlUWHRYTUw6Y29tLmFkb2Jl\nLnhtcAAAAAAAPHg6eG1wbWV0YSB4bWxuczp4PSJhZG9iZTpuczptZXRhLyIgeDp4bXB0az0iWE1Q\nIENvcmUgNS40LjAiPgogICA8cmRmOlJERiB4bWxuczpyZGY9Imh0dHA6Ly93d3cudzMub3JnLzE5\nOTkvMDIvMjItcmRmLXN5bnRheC1ucyMiPgogICAgICA8cmRmOkRlc2NyaXB0aW9uIHJkZjphYm91\ndD0iIgogICAgICAgICAgICB4bWxuczp0aWZmPSJodHRwOi8vbnMuYWRvYmUuY29tL3RpZmYvMS4w\nLyI+CiAgICAgICAgIDx0aWZmOkNvbXByZXNzaW9uPjE8L3RpZmY6Q29tcHJlc3Npb24+CiAgICAg\nICAgIDx0aWZmOk9yaWVudGF0aW9uPjE8L3RpZmY6T3JpZW50YXRpb24+CiAgICAgICAgIDx0aWZm\nOlBob3RvbWV0cmljSW50ZXJwcmV0YXRpb24+MjwvdGlmZjpQaG90b21ldHJpY0ludGVycHJldGF0\naW9uPgogICAgICA8L3JkZjpEZXNjcmlwdGlvbj4KICAgPC9yZGY6UkRGPgo8L3g6eG1wbWV0YT4K\nAtiABQAAQABJREFUeAHtnXvQXVV5/1daZxy1BUpJp1MhE5BSSSAgqBAV5BIuGaQJBoEUATEJAiXY\ncMsUTfMDK9MAMXKRAEmAgGkASUiGIgQSsEQgKGDCJV6GYkywfzRWibc/OuO8v/1Zuo7r3e/e5+zr\n2ZfzfWbOe/bZe12e9V373eu7n/WsZ40aCsRIhIAQEAJCQAgIASFQAQJ/UkGdqlIICAEhIASEgBAQ\nAhYBERHdCEJACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSq\nWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQ\nGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEh\nIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC\nQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYC\nQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaA\niEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qFgBAQAkJACAgBERHdA0JACAgB\nISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSAPQXX3yxGTVqlPnlL39Z\nQGkqQggIASEgBITA4CAgIjI4fR3Z0iVLlpgHH3ww8ppOCgEhIASEgBAoG4FRQ4GUXYnKry8CJ510\nknnf+95nbrvttvoqKc2EgBAQAkKgtQjIItLarlXDhIAQEAJCQAjUHwERkfr3kTQUAkJACAgBIdBa\nBERECuhanFXHjBkzrKR169ZZB9atW7daHwymQHBodZ8bbrhhWHp+kJbr+G1wHM7Db65FiV9f1HXO\noSO6ItRPXU888YRZvHhxRy853Fp49EcICAEhIAT6hICISMlAz5kzxyxbtszMmDHD4I7D5/HHHzfr\n16+3RCOq+u9973tm/Pjx5oMf/GAnD/kmTZpkLrjgAjN9+vSob
KnOXXnllbbsE0880Vx00UWdenbb\nbbdU5SixEBACQkAICIE8CLwjT2bl7Y3Az372M/P0008bf4DHsrHPPvtYsoGFY9asWcMKwkJx5513\njjgPeTjqqKPMxIkTzWGHHWb4LRECQkAICAEh0GQEZBEpufcuvPDCYSTEVTdu3DhLRrZt2+ZOdb4h\nGWFy4i4eeeSR1oJxyy23uFP6FgJCQAgIASHQWAREREruuoMPPrhrDb/4xS9GXD/rrLNGnPNPHHPM\nMWbHjh1m06ZN/mkdCwEhIASEgBBoHAIiIiV3mT8lk7SqPfbYo2vS3Xff3V7ftWtX13S6KASEgBAQ\nAkKg7giIiNS9h7roJyLSBRxdEgJCQAgIgUYgICJSw256++23u2q1fft2ez28ZLhrJl0UAkJACAgB\nIVBDBEREatgp999/f1etnnrqKevoiuNqWOLigOBPgl+JRAgIASEgBIRAnRAQEalTb/xBl5dffjk2\ncBkb1EFU5s2bN0xzlvSyJHjjxo3DzrsfN910kzvUtxAQAkJACAiB2iAgIlKbrvijItdff725/fbb\nbRRU38JBNNQzzzzTLt8NL+/FKXb27NnmqquuGkZi3nrrLRsA7Uc/+pElKn+s5fdHe+65p3nhhRfC\np/VbCAgBISAEhEBfEBAR6QvM6Sph1cxLL71k9t13X3PQQQd1wq8TjZVAZ3E75RLgjOuQGBdKHisJ\nQlC1KMGysnPnzk56QstLhIAQEAJCQAj0C4FRQejwoX5Vpnq6IwAJILR7VFTV7jl1VQgIASEgBIRA\nMxGQRaSZ/SathYAQEAJCQAi0AgERkVZ0oxohBISAEBACQqCZCIiINLPfpLUQEAJCQAgIgVYgICLS\nim5UI4SAEBACQkAINBMBOas2s9+ktRAQAkJACAiBViAgi0grulGNEAJCQAgIASHQTARERJrZb9Ja\nCAgBISAEhEArEBARaUU3qhFCQAgIASEgBJqJgIhIM/tNWgsBISAEhIAQaAUCIiKt6MaRjfjVr35l\nHn744ZEXdEYICAEhIASEQI0Q0KqZGnVG0aq8973vNd/97nfN3/zN3xRdtMoTAkJACAgBIVAIArKI\nFAJjPQuZPn26WbZsWT2Vk1ZCQAgIASEgBAIEZBFp8W3wwx/+0Bx33HHmpz/9aYtbqaYJASEgBIRA\nkxGQRaTJvddD97/7u78zo0ePNhs2bOiRUpeFgBAQAkJACFSDgIhINbj3rdY5c+aYlStX9q0+VSQE\nhIAQEAJCIA0CmppJg1YD0/73f/+3wWn1l7/8pfnzP//zBrZAKgsBISAEhECbEZBFpM29G7SNFTMz\nZswwq1evbnlL1TwhIASEgBBoIgIiIk3stZQ6s3pm0aJFKXMpuRAQAkJACAiB8hEQESkf48prOP74\n483OnTsNq2gkQkAICAEhIATqhICISJ16o0RdLrzwQrNkyZISa1DRQkAICAEhIATSIyBn1fSYNTKH\nYoo0stuktBAQAkKg9QjIItL6Lv59A4kpwkf7zwxIh6uZQkAICIGGICAi0pCOKkLN2bNnmxUrVhRR\nlMoQAkJACAgBIVAIApqaKQTGZhTCjry77babDfmujfCa0WfSUggIASHQdgRkEWl7D3vtI6DZ5Zdf\nbh566CHvrA6FgBAQAkJACFSHgIhIddhXUvPkyZPNXXfdVUndqlQICAEhIASEQBgBEZEwIi3/TUwR\n5Lvf/W7LW6rmCQEhIASEQBMQEBFpQi8VrONnP/tZ88ADDxRcqooTAkJACAgBIZAeATmrpses8Tm0\nEV7ju1ANEAJCQAi0BgERkdZ0ZbqGnH766ebss882p512WrqMSt1KBJiq27p1q3n11VfNtm3bzBtv\nvGG2bNkyoq3Tpk0ze+yxh5kwYYIZP368+fCHP6xdnUegpBNCQAikQUBEJA1aLUpLYLNbbrnFPPXU\nUy1qlZqSBoENGzaYxx57z
KxcudKMHj3aTJo0yRx88MFm3Lhxdpk3AfB8wZL205/+1Lz11lvmtdde\nM08//bT9QE5OPfVU88lPflKkxAdMx0JACCRCQEQkEUztS0RMkfe///3WaVUxRdrXv3Etot/vvvvu\nzsqpOXPmmBNOOMFkvQcob/369ebRRx81y5Yts8vDL7vssszlxemt80JACLQXATmrtrdvu7aMmCLT\np0+3g0fXhLrYGgRuvvlmSz6feeYZuwHi5s2bzXnnnZeLNHAfMb23dOlSay0BrPe+973miiuuMFhQ\nJEJACAiBXgiIiPRCqMXXZ82aZVatWtXiFqppIID/x6GHHmrWrFljPwS0+9CHPlQ4OFhVbrzxxg4h\noY7ly5cXXo8KFAJCoF0IiIi0qz9Ttcb5AOArIGknAlhBpk6dapiCwR+oDAISRs4REojPokWLDI7R\nTOFIhIAQEAJRCIiIRKEyQOcYoHBWlLQLAQb+mTNnWgsIBIQpmH4LpGfjxo1m7Nixdkrohz/8Yb9V\nUH1CQAg0AAE5qzagk8pUUTFFykS3mrIhIVOmTDF77rmndUzFj6NqYYrm6quvtlYZZ4mrWifVLwSE\nQD0QkEWkHv1QmRaY0WfMmGFWr15dmQ6quDgEHAk57LDD7OaGdSAhtA6LzK233mqOO+44I8tIcf2t\nkoRAGxAQEWlDL+ZsA6tn5FSYE8QaZPdJCE6jdRNW14iM1K1XpI8QqB4BTc1U3we10IAll/gSyGxe\ni+7IpARLZl9++eXaB6mD9OLEiv9IXSw2mQBXJiEgBApBQBaRQmBsfiEXXnihjS3R/JYMZgsY3HE6\nXrt2be0BYJrmgx/8oDn//PNrr6sUFAJCoHwEZBEpH+NG1MC8PfP3hPBGcGLNGm2zEQ1ukZL0FStU\nWC7bj+W5RUDHNNJRRx1llxVXsaKniDaoDCEgBIpBQBaRYnBsfClMyfBhD5ovfelL1tGx8Y0akAZc\neumlBotWU0gI3cKUzJIlS+xKGkiJpFkIXHzxxeakk04apjTnRo0aZX75y18OO1/kDzZmpI4bbrih\nyGJVVsUIiIhU3AF1qJ43aqJvHnTQQTb2xL/8y7/UQS3pkAAB+u355583//RP/5Qgdb2SQJxwlL7m\nmmvqpZi0qRwBNlaEbJRJaipvpBToIKCpmQ4Ug3vgpmUgJE5uuukmw5u2pN4IELWUnW+bOr3BPYej\nNFOCmgqs973ma4f147/+67/MunXr/NOFHVPuySefbF5//XW7G3RhBaugWiIgi0gtu6W/SjElw4oZ\nSbMQcNaQppIQ0IZ8XH755eYrX/lKs8CXtkJACBSGgIhIYVA2uyDIiIKaNasP77jjDjN37txmKR2h\n7WWXXWZX/MhXJAIcnRICA4CAiMgAdHLSJhJw6p577kmaXOkqRIBBe9myZXZDuQrVKKRqrCIQ4fXr\n1xdSXpMKcc6XOO5yjAMozpjuw2+uRQnX+OBH4RxFx4wZMyLpgw8+aH1xXJl8f+ELX7D1jUj8hxOU\niY+Gnwd/njhdyIYOUfVzLao80ob9QEhHnUzLIOPHj7e/nXOqj5dNEPqDfocffvgwvWkrPidhYfqH\nuigTjMLYuzrD+fS7eARERIrHtNElYubHVC6pNwIM2oTmb4tfBffdo48+Wm/QS9Tue9/7nh10ia8y\nNDTU+UyaNMlccMEFlkjEVf+pT33KXtq1a5fZvn37sGSQAwLdsTTflbtjxw7zi1/8wtYX5ePBoH3s\nscdaYvjAAw908n3mM5+xU7iUmUYY6HGEJ9gePh9Oj3nz5plbbrnF1gUBQXbbbTd7/fHHH7e/Xfor\nr7zS/o77Q36IxO23326thK4O19Z99tkn1p+FjT8h9fw/uXzUz/8YZUr6gEAAvEQIjEDgBz/4wYhz\nOlEfBIKH5lBgvaqPQjk1CZxVh4LHXc5Smpc9GGhtu2n7nXfeGdmAYFWUTXP99dcPu37iiSc
OBQPs\nULCZ4LDz7gflcT0YjN2pYd/k43pAYIadp9xgr6IR510iVy/fvlx00UW2PP8cZVPWWWed5Z/uHLu2\nhdsQEAHbZvDxxeEVxoryu+ns2upj4eqIyxdXl6+PjotBQBaRPpC9JlaBqVxSXwQee+wxc+SRR9ZX\nwZSaYdnhLRwH3EGUYDA0s2bNimw6/RwM8tZ6EE7AGz/XooR4QLNnzzZ777131GWbj/xYPZxgIXni\niSdsXBqsE1HCcmvyJRHKxhKC9SNKXNvuu+++qMuJzm3atMncf//9XXUGI3RevHjxiDKJwRPV1nHj\nxhksKdu2bRuRRyeKRUBEpFg8VZoQKB0Bt8y6bWSRwZiYKIMowRt912Yfc8wxdiBl0PUFzKKIBj4P\nDLxEr40T8pHfXzH3zDPP2OSTJ0+Oy2YJMPmSCGWTlkE9Tm677bYRU0pxaaPOv/rqq/Z0N51dW92U\nj1/OwQcf7P8cccw0lqRcBEREysVXpQuBwhEg5sbEiRMLL7fqAhkQwj4OVevUr/r32GOPrlXtvvvu\n9jp+IL7stdde/s/OsUvHfeI7nIaPsVb8/Oc/7+Rj0MUKEEVuOomCgwMOOMD/GXv87LPPJk4bW0iP\nC/jXIFFWDT/rEUccYXbu3Omfsse98o3IoBOFIyAiUjikKlAIlIsAVoOxY8eWW0kFpfPWLDN4d+Ad\nweie6o9XAz+HjgNmMJsfeRzlsPrHEnQkBMpHQESkfIxVgxAoHIG4ZZKFV6QC+4LA22+/3bUeZylK\n2u/OgvLaa691LTd88S/+4i/slI5bxRK+7n7/6Ec/coddv0ePHm16pWVVTXgZb9dCQxc/8IEP2DO9\ndH7hhRcM+kjqh4CISP36RBoJgYFE4P3vf79ZtWrVQLYdZ8tugq8FUyZJHZQ/8pGP2OK2bNkSWyzL\ndJmq8ZfjHnLIITZ9N18diANTOkkE3xfSkidOVqxYYR1xs06ROB8PHLjjxOncyxcnLr/Ol4uAiEi5\n+Kp0ISAEEiLAjryDKgzWccHCcDyFqMStPInCDB8PVopcd911Juzg6tK7FSSXXHKJO2XOOOMM61xK\nyP04CwOrcZIKQdAgUHF5IEOsmPnEJz6RtMgR6SBnEAxiiHTTGT0+97nPjcivE9UjICJSfR9IAyEg\nBAYcgSBGiB1IsU74gylTFmeeeaYlFXHLe+Og+7d/+zcTxPqw5MInOQz+EARIShCPY8SKFojB97//\nfUOgNEiQE6wK5MO5NW7JsEvrviFE1A2RIi91O6HsKVOm2OkSdPXFTS3h7JpE2O4Ax12WgPs6u7ZS\nDnpktbok0UFpsiMgIpIdO+UUAkKgQATYBbotkWLTwsKqmZdeesnsu+++NgqpW91CdE/IAktc0wqD\nLo6oWFIeeuihzuoZLAMIMT6iyA1Ow88995whqiskyOlCuHV8SNI6txKdlKXE++23n7WOuPKI+Iol\ng3aHCYKLL0JU2fD0URQOrq3EBCFKqquDtrJ8mPYoSmoUcvU4N4q4aPVQRVoIASHQCwH2mPnqV79q\neGO89NJLeyVv1HWCmS1YsMAOmo1SPIeyWBkY4CEbUaQgR9HKKgQag8A7GqOpFK09AgwkPFgJMMQy\nzDfeeMOEneV44yW2AW+AEyZMsMcf+tCHat+2KhWEfAQh960KvPmxb0cb92XxpySqxFt1CwEh0F8E\nRET6i3fratuwYYPdwh2PdZbGYc7Fix2TLoNmOPonUUEJyMXcLUsL2cb+6aefthtOnXLKKTb/IDst\nuhvE4cRvcPTJGgP2m2++6ZK25puYF0cffXRr2qOGCAEhkAwBTc0kw0mpPAR4Q7/77rutGR3ywe6V\nJ5xwQub5fcpjLpxlfCwbxKntsssuy1yep2qjDn3y8d73vrdr+5kDh5C0ibSdfvrp5uyzzzannXZa\no/otj7KamsmDnvK2BQE5q7alJ/vQDgjDzTffbIj3wJ4
Ua9asMZs3bzZs4Z7HyZDBlMEHhzq36RkD\nMc5s1NlmwUGTNrt2Y/ng0wtPVgd85zvfaRU0kFDCcEuEgBAYLARERAarvzO3loGSDbQcAYE0+NMF\nmQsOZWQAvvHGG+30DdEmIT3Lly8PpWr2T0c8+Ka9ScmH32qICCsB2iJggXWtFwFrS3tdO1ihwnqB\ntjmqYt079NBDXTP1LQS6IqCpma7w6CIIEIyIYEHEHcD60U9hgOIh/cEPftAsWrSosVMRtMNJEQQO\nSwp+OFik2iDcYz/72c/MTTfd1IbmDHwb6E/2xeGlQiIEeiEgItILoQG+zrQIAYeQr3/965W9raIH\nfig4aBINMuwAW8cuQme30gX9iiAf4Xbyxrlw4UJz/PHHhy817jdTcY888oh5z3ve04j+bRzAfVaY\ne5MAYmXc931uiqrrAwKamukDyE2swpEQggGtXbu2MhICdviQLF261EZNPO644wzWgDoK5mgsH3w4\ndlMuZT2MP/vZz9oVS3XEIo1ODz/8sCUf3GtMzfjWozTlKG09EHD9V9Z9X49WSosiEZBFpEg0W1KW\nT0LqZlpl0GJvDDYBq4NlJM1Kl6JvD/qJpb0sh26ybwVvz/Pnzx+2WobBTANZ0XdMf8rDyZyAe2n2\nxumPZqqlrgiIiNS1ZyrSq84kxEFSNRnBIuOCb/VaZut0LuMbEkQk0t/85jfWYlRGHWWXSV9ec801\nkb4uIiNlo198+Tw/cDCn75pMjotHRiV2Q0BEpBs6A3ht5syZhtUqrIqps+AMRyA0po36EUvDmZvB\nhAdtP+rshj9kCB34oA9LqZtmQWDQYiVW2Britxvc64C3r5OO4xGAWN5yyy3WYhmfSleEwHAERESG\n4zHQv4gRctddd5mNGzdWPtAm6QgCYBEqHv+RMsQnH3Ua5MODM8ubWVHUlH5zfQWZZAuAXqQX0sXb\nddXkz+mt73gEeJEhQvIgBaWLR0NXkiIgIpIUqZan42HPmycrPerge5EEbmcGvvfeewtZOUJ5Za90\nSdKubmkgIVGkCFJ2yCGHNGZennZMnTo1sQlfZKTbXVGPa0wVMlXZtoi/9UC33VqIiLS7fxO3joGM\nfT6atqMre92ce+65lkBkeWP2yUfU3jiJASw5odMzioRQtVulc+utt9b+bTSrriIjJd9kOYvHModV\nriwLZU71lL3GCIiI1Lhz+qWacxhsmmnf4ZOWRDEQstIEqTP5cO1DX4hIL0uVI2V1WVHk9Pe/aQex\naViqm2VFFmQEwilHSB/VehyztP4LX/hCIdbJerRIWvQLARGRfiFd43qilk/WWN0RqjE48RBkWiXO\nKkKaOqx0GaF8jxOQECTJwEvab33rW+bKK6+szfJmv3mOhOy333653prTYOLXr+PyEHD/g47gl1eT\nSm4jAu9oY6PUpuQIYA1BmuxchqWAHXvZEdifWvLJB/4vvSwKyVHrT0r0T/P2zyDwD//wD+Zd73qX\nJWZ1sow4EgJyONbmEUgZZIRPEoKWpy7lTYbAgw8+aP8Hk6VWKiEQQiDYcKmv8sADDwwFKtjPWWed\n1de6i6jM1592+OLaxXewk6h/qedxYKru4HLnnXf2TF9UgmnTpg2tXr26qOIqKyfYiXYoGJSG+Haf\nwAJSmT55K6YNafQnvS/0KfdhHfo2sFQNBZv0DV1++eW+irmPA+I1RNmS6hHgf099UX0/NFUDhXgP\nntaDKrxRrlq1ykyaNKkVELBPCdMvOHTyiZumqXtj3cqYpPrTj6xW8AULV0BObBRaIl1ikahCsLgx\nbcYKmSw+Id10xhrCB8uRpDoE8E1i5+SmWRyrQ0w1hxEQEQkjMkC/n3zySTNjxozGDth+V0E8cJR7\n7LHH/NONOoYsOBKSRnGmZBiQwwImlLdt2zYbOIzjfgnkiJgShOMn2Jo/ZVakDm7qSmSkSFTTlcX/\nHJtSSoRAVgRERFI
id8YZZzAf0/mkzF6r5Ox2SvChtsiRRx7Z2E3gGLj5QB7SSC/iAkEhYBgDBVYJ\nVhiVSUggUwTGox0Em8OBOG2b0rSftCIjaRErLj39zQ7QJ5xwQnGFqqSBQ6AWRITtokeNGtX58Gb7\n1ltvxXbGpk2b7Nuvn2fMmDF222m3MiKc2U9Lfj4nnXSSrZP6kSRpcMry04Xr8X+vW7euUwd5qI9z\nWSSqzThook9WYVrmiCOOyJo9dz6HI+0oQpxpuGlvxxAQxOmfFAvyhadk4vKed955lhQQK8YREkzq\nRQmYMwWEU/AzzzxjV+0wFZN0eimvHo6MlEmy8urYxvzr1683gZ9ZpEWuje1Vm0pCoN/OLb6zJ86q\nfIKmjfjss88+Qzt27Bih3vXXXz8irZ+ffK+//vqIfH6acBnOOTRJGl9/0vvi57/66qtj9XT1+Xm7\nOauS3i87fExdaQXHsiASZ9pshaZ37TjxxBMLKzeYaqqFg2bSBtEPOF1mkbCDatIycIK95557bP8H\nFhPrRBoMKKn1oP5rr712WDlZ25JU9yTpsuKSpGylGY5AW5zdh7dKv/qNQKXLd++///5gLIqWgITY\neAgrV67sJMBycdVVV3V+Rx2Q7+STTzavvfaaDVYVlaZXGeRJkiaqbHfuuuuuc4cjvi+44AJz8MEH\nG6YSegkWFNJ3E+oaO3asmTVrVrdkw65t3brV7L///sPOVfXjiSeeKKzqCRMmGO6BJghv71k3dOs1\nJdOt/VgPsJDwwZLBPbZ48WLruEyYeO4L7iesjGHB2vHzn//cbjgYrIQxfDDNH3/88eGklf12vjFl\nTwlV1sCaVIxFDqsqy+YlQiAPApVPzQRvwyawYFifi127dhl+O/GJClMubJLlhMiMwRLZjq9GYBVw\nl+xAdMcdd3R+Rx2QPmB99hM3gCdJE1W2OxdYMjp1BJYUd9p+sz9KEvHb5WPFYBtYkzpFgE3ctFQn\nkXdAfsz0dRGmnoqQ8ePH26mBIsoqswxHJLJMXaSZkunVBqaDcCTFj4T/B+7TuXPnRpIQyrrooovM\nggULbFrilMybN69WJMS115GRqlYLOT3a/M09EyzJ7tv0W5uxHPi29dsEE57aCAbEYSoQfyPolM4n\nICf2ejhfVJwOf5qHKRpf/DJJFyVJ0oT18Mvx8wcEwr9kj8NTLK5tXIyammGKyS8zjBX5aadLg25J\n5aabbhriU6U4vflmesbHI6temOUxF9dVmBbJO3WQN39dsSlDL6a+wFxSPAJM7TKlJxECeRGo1CKC\nVWPvvfcOxqE/Svi3e8tnu3AnweAbOa3xmc98xiWxVhGmH6KEuAa9JEmabmWceuqpIy5/9KMfHXau\nm0MuCZlecoI1JIwN+6Scc845Lon5yU9+0jnudYCJHetBXsHR1Dmdpv3262Z6BjM/02+9cPHzNekY\nSwafPFMGzpLSpHZXqSsWHzCXZaTYXnAO4XWakiu2hSqtnwhUSkQOOOCAxG198803O2nDA7q7sPvu\nu7tD++1IzLCTwY9wuvB1fidJE5XPnQuTBs5DHHyJ08+lCSwE7tAwUEcN9L4vyttvv91Jn+QgrE+S\nPGWmefnll60/TJNjgcThw2CIpF0Z45dHGUlXyfj5Bv1YZKT4O4AXBlbLSIRAEQhUSkSKaIDKEAJ1\nR8C9PUYFHUuje1zgsjRlDGpaR0YcIRxUHIpqN4sIiKkkEQJFINAYIsKOnU6effZZdzjs27cgcKHK\nN34ccMMStoBEWU38PL5VBsfUYB6u6+fLX/6yn73rMasi4qauumbUxVQIMJUCAclLQjQlkwr2yMTO\nGiUyEglP4pPEnwFLh2fijEooBGIQqHT5boxOkadZVuiEFR+ssggvfw1iI7gkBj+ScePGdX73+wAf\nDAKY+eITKPTrRUQOOuigTnbyQmSKIlcszQwTt05lKQ5YNfH5z38+RY7eSXvh0ruEe
qQoijz0Y0qG\nOl544QXrW8W9i7hluhxDpCZOnMih4X+R+4f/v6YNRrSDtvLJSw4tGAP4B2sIS78lQqAoBBpDRIgN\nwuANCUH+8R//0Xzta1/rkBGisfrLfX0nzqLASlNOOLYHEVD9eCBJ9INI4dCL7wTt/tSnPmW3UHcE\nizKJZukwCVbNpDKXFkFEnC5psCkzLVYerD1VCo6RRYY2Z0omj4NrHBZMGXEPEQti586dlmiwpPvs\ns8/ukGRXLwM3eiAsm9+4caO9F8k3efJku1UAG+01QRwZof1NI1JV48u9vWzZMhMEsqtaFdXfJgTy\nLrtJm99f/hq1jDYYVDvLUQOch0VXDS9/5XrUJyAsI5aC+unilrkmSePrT3pf/Py9jmmnL1HLd7ke\nri+uXPKnkbovc03TFj8tkT6rXJZMZFGWjBYlZSzVXb16dScaKnjlqYP2EqU1WPE0FAzwFnvONUEC\nC2OhfdWENufVkb4merFECBSJQGN8RIIB2EYO9QN8cS4sWE0ef/zxwqYwwuUn/R2EkY9NSqCzpNMP\nOISRvptgNVm7dm23JCOusfqCN+G2Ccu83RRCv9uG1QAp6i2b8opcJfPwww+bQw891FxzzTVm/vz5\n1sLB1JqzemTBC+sCZnqCm7HL7vbt263ObHxX9yWzbn8a50ycpf2Dlmf58uWt2ihz0Pqvru1tFBEB\nRBwyIRphQgIBYcAm9kYdpguIrxFYM+zUiut8YoGge1wkV5cu/E16nF/D5AYCQptfeumlxMTGlc0A\nwlx/mx7CDHyQK8Km91scjuBalDAVUkR56Mauu46AbN682ZQxjQKhYaM79MbPhH4ocmO9onD1yxEZ\n8dHofgwx5l4q497pXrOuth2BUZhX2t5ItS8aAfxLcDokxHcbhEEPosrbeT8Fp9Sse8bE6VmUoytv\nsJB2wraff/75fQ3HTX+ce+651odk0aJFfa07Dte480X79cTV0+TzbCOBXxlkUyIEikSgcRaRIhs/\n6GXhZIg5vQ3CoLdixYq+e/M7wpBlz5g43IuakoFoQkLoY8hmkTrG6e6fJ+omTrsE2psyZUqtrW9g\ng0WH/pREI4C18cwzz4y+qLNCIAcCIiI5wGt6VgYKTK1uWqHJ7fnwhz9sp71Gjx5tBxMGlDIHFd6g\nHQkpGre8UzLoxhYFrLYqcvVOlnYywLM5GtOI6FT3e01kJLqX3f9SHn+i6JJ1VggYo6mZAb8L2mJu\nZQrikUcesYOe36XuAerOFTGFgsWCwb4op1SnG995yQ16YX1AcGDutxXEVhzzB2fZSy65xE6dlYFd\nTLWZTufth0yV1jgT1jUCLOLcLBECRSMgIlI0og0rz00D5H0Lr7rZrAZZuHBhzy3peSP3I9yyKiWN\nQyh4IWnyJMWmiLJxSiUQWd1IiMOgaWSkCOLq2t7Ub8gtOEDOyrjvm4qL9C4OARGR4rBsbEm87SBN\ndULDGoIzJKtB0gqDPyTMCZFr497WITFYGMp6GOd9C8e69fTTT9eWhDiMm6In+tLn9HedLEsOx359\nQx5vueWWvjuB96t9qqd6BEREqu+DyjVwVhH8CeIG4cqVjFHAva3hkFnE/DXlgYMvzm+gzLfjvCTE\nrVBpylsrlps99tjDLF261Ie6lseDTkZmzpzZqMi5tbyJpFRXBEREusIzOBcJQMVg3u+lr3kR5iGJ\nlDmgYXHxY9MUQXj8duedknFk7N577+05NeXXW+Vx03Qu2xpWZV90q9u9pDCdOchWoW4Y6Vp+BERE\n8mPYmhJY1TB16tTGxBUp2wrgrCNh4oHVwZe8lpK81pB+kDG/vUUdu/7DAtWEQS4vYSwKt36WAwln\nX6EyiX4/26O66omAiEg9+6USrXjrg4w04c26bF0ZdCAiSaaq0CWrA2xeEkJ+yGNTBvPwjc0UDRF+\nm7IaY9DICM8DNhRlqb9ECJSFgIhIWcg2tNwmr
GqAILBENdhorZQBLO9gQ/4kDrB56+EWYyBnx9ym\nRscFA1YuNWnVVhH91oTHgyP7/r3cBL2lY/MQEBFpXp+VrjHmWCJy4i+SxCJQukJeBZCQE044wXzs\nYx8rZZUPD9+iV8a4KR6vGZ0onuFpHz9Nr+OmW0Nc+5oYowIyktRi5trZtO+2xBhqGu6DqK+IyCD2\neoI2MzisXLmyVmTEWUIOOuggc8455xSySsaHgoE9r7+HX16347ADbJZ66aM27BXU1Ddv7kcISd3I\nerf7Ls01LFV1fBlJ0walbQYCIiLN6KdKtHTTNHXwGWGwYp+LSZMmdSwhRRIHyspjnUjTQVGmfdqX\nxs+EQZCYJ02a0uiGEb4Is2fPbtzOrm0lIzgSX3755Zli83TrZ10TAlEI/On/CyTqgs4JgQMPPNDs\nt99+ZtasWea3v/2t+fjHP14JKFgP2MX1i1/8ornqqqs6OvDGtm3bNvN///d/mVddMJC88sorfSMh\nKI9jKRYQX/baay/rK0Gb+KAX6SAtfH79618b0jj55je/af7nf/7Hhkx355r+vX79evP3f//3jWrG\nO9/5TsOHe4h+a4vcdttt1g+LiMUSIVA2ArKIlI1wC8rnbf2iiy6yLVmwYEHfBm0GYJww33jjDbNk\nyZLYeknHwJ3WRJ41X54uzWp5ccTE1X3TTTdZX5nzzjvPnWr0N32BRarJjpFZ+7ZuHce91iZrW93w\nlT4jEdDuuyMx0ZkQAgzwzBWzTJQPvgkMHGUJD0Ic5XjDZGkncQy6TZsQgpsPA0FScfqnJS9Jy49K\nR51Z35pxoAUD93nttdfMe97zntrvZhuFQ9Q5+s/tnBx1vQnn6Js092Dd2sT/HYJlatq0aaVtZVC3\ndkuf6hEQEam+DxqjAdYJpgsQBlQCaTGXXJRgeYHk8Da2a9cu+3ZMfIkkwa7cQM1A4B6ocXpRD8Lg\n108pyp8DQrNlyxa7dLefRKpsrPD/2bp1a9nVlFp+k8nInDlz7P/zihUrzNlnn10qTipcCPgIiIj4\naOi4JwIM+GyOh2PlhAkTrEMb88gQCEhJLxIQrgDigPWDMnBYZKvxZ555xsyfPz8TUWAgYKB2Fo+o\n+pwFJXytzN+0E92KEAgNb6xtE1YAvfrqq41vliMjaf8Xqm7422+/bZ3BV61aZadD2fYh7v+oal1V\nf7sQeEe7mqPW9AsBCAkWEj5YGB588EGzePFi+yBjOmX//fe30yoQi7BANNiqnp1iCUrGZ+HChcOi\nN+YZuLES8ABFL99ikKfMcBvS/EaXrFMyUfVgNRg7dmzUpUafmzhxoiWhjW7EH5SHjHD/QXqTWPTq\n1macwlk102+rYd1wkD79QUBEpD84t7oWBns/RDcPYCwmzz//fGS7cXxl+qWbhcC9VXZLE1n4H07y\nAOWNFPLBChWmlLKW1a2eJNewYBRZN9NWWA8k9UaA/4umkhFeDrB8SoRAPxAQEekHygNWh7NC5B18\nsSJgTcj6VsabKGWsW7fOnHTSSZX0QlVWmEoam7NSCCPTAm0SR0a4F7Pex/3GAxKydu3afler+gYY\nAfmIDHDn173pzqqRda7dzW+zHwvHWcvJilPRUzJZ9WhKviZOYSTB1hFzdz8myVNVGv7nmGJta19U\nhavq7Y6AiEh3fHS1YgR4iLuVOmlUwSSOuLdQymEg6OdgUNQqmTTtVtp6IuDuw37ef1mQWLNmzTC/\nqixlKI8QSIuAiEhaxJS+7whgsnfEIknlTIfw4HcPf5fHvZmmKcvlTfutKZm0iJlc03Dpa+t/Dnc/\n1pWMfPnLXy7Ul6n/CKvGpiIgItLUnhsgvTET80nyAHcEIM607AgK6coS9CxylUxZetatXCxIrJxp\nszgy0g8ynBZHR9TT5lN6IZAXATmr5kVQ+TsIMAC/8MILZseOHZ1lmG6ZLoncsl53zMqPI488MpEp\nmAe4s3R0K
vQO8P9IujIGkoIjLeVhbYkjLV7xqQ6LXiUTrnyfffYxjz76aPh043/7m/41vjFdGsC9\nzP0KGSli8Of/7vvf/755/fXX7Z43xAPx/++ozxE8/gf5vxs3bpysH136SJf6i4D2mukv3q2rjYcp\nMURY7bBz5077wDv66KPN+PHj7RJdGuxWz5CWwYaPe2i++OKLNt/06dPN5MmTh8USiQLLWTz8azyI\nebBneaijE0TEvan65WY5jtIvSznd8lDH3Llzbdj9bumado0AWgixaQZBuGe5d7Pet+7/jii7BLjj\n/w6Suvfee1v4wv93nGRJ/fbt220Yd/5f+Z875ZRTbPyfogn5IPSh2lgMAiIixeA4UKXwAGU/imuu\nuca2m4fgGWeckemBSgGQgU2bNplFixZZUsIge/7550daKsIPbx7kSB4ikYfI2Mr/8KcIXfzy4o7B\ngDgsELqsA1lc2VWeZ8sABlJ2e87Tn1W2IW3d9GVSSx5lP/zww+aWW26x/zNJyXucTtw7Tz75pGF3\na/4HL7zwQjNjxoyBwT4OF52vAIEhiRBIgcDq1auHgkFiKCAfQxwXLd/5znds2dQR7DA7FAy2I6oI\nHtz2PN/BNMiI61lOUA9155G8+ZPUTR18Dj/88KFvfOMbSbI0Jg197vrUtbMfmNYBoF7tDIj/UDCt\nYj9l/N+BexBJdSgYgux31P9dHXCSDu1EwLSzWWpV0QjwoAoCHdkHYa+HZhF1Uwdkh4cvD+Gw3HPP\nPZEkJZwu7W/qzfIQLgsTyvU/rj3XXnvtEJ+2CPcXfR0lfvuLIp5R9VR9Luoeor38H0DSyiAg4TZT\nX2AVsfXxPyYRAv1AQESkHyg3vA4eSLwpYaHotzgLDIOuIwg8sDlm8CpDKDfNgEfaNOm76Uzd/sAb\nl9a9Icddb9p5+pc38l4Czknw6VVOXa/7ZCTq3u+X3ugBMYSUuP+7ftWtegYPAfmIVDAd1pQqmb/G\nDwR/kAceeCCzD0je9jKX/elPf9r87ne/M8GAZT7+8Y/bIsv0yaBs2p/EkTCPgyr1BINrB6I0q3hY\nIkwAKueU2CmkgQfsvrxkyZLUbQF7J+ARWA7cz8Z+06b77rvPOoBX2b/c/3PmzDE4lFf5/9/YjpTi\niREQEUkM1WAl5CE0ZcoU22j2najao54B+4tf/KLdN+app57qEIQ8JKBXj4JBYKHoOjimrd+V6erO\nM3h+6UtfMmyA1/TNyXDAhPBu3rzZwZLp2yd1OPMmIZGZKio50xVXXGG+/e1vm3vvvdcccMABJdfW\nu3hWMy1YsMCu0moqpr1bqRRVIiAiUiX6Na3bkZDDDjusNoMcOkGGeEivXLly2EMxLRlICzvlR1kq\nGPiQXm/h5HdS5ABJ/RAZLCq9dHD11/GbvYDOPvtsc9pppxWmXpjwRfVfYZUVWBD398svv2w3naMN\ndelXyOIll1wy7P+uwGarqAFHQERkwG+AcPPrSELCOjoygrUCcoLODMplvq1FxRuJI0A+8UD3MqdO\nwAJpqlUErKZOnWotT2Va3ei/wNfBYlUkGbQFFvTHJyFlYpFVXZGRrMgpXy8ERER6ITRg1+v+MHTd\n4fRkmgZhoOHtscwHOGQH0gPh8UmIP8ihS5nEg/J9QSfq86er/Ot1P8Y3ZP78+YVaQ3q1mT6ExDqp\ng7WE6Y+77rrLbNy4sdR72LU56zcxR4j3U3c9s7ZP+apBQESkGtxrWSsPmauvvrr0t9OiGn/ccceZ\nYEmxmTdvni3SJwdF1REuh0EMB8K//Mu/NKNHj7aXqx7IGMTQyZGysM51/V0XvX0iWYW1xFmFmkIm\nCTxHGPmHHnqorreW9GoYAiIiDeuwstR1b9ZVeumnbVuUzmWQkfAbNKGxISFVExAfL0gZUxxNCY/O\n4A9+WCbKnFLzMUpyHO7rsvuY+qjjuuuuM+edd14SFStPg85HHXWUXVHTFJ0
rB00KdEVARKQrPINz\nEYfBIG5Ax7rQlJaHTcWQEySvkx+Exon/luwTnX5MBzkden2jC2QEsznh9ussTRrIfGtJnhVOcf3B\nyif2immadcFZcfjO+78Wh43ODw4CIiKD09exLXUPFef8GZuwphcgUWz45awBPllIqrI/4JAnys8j\niuSQD7+UOjyMb731VvOv//qv5j//8z9rZWXw+wASwrLwOq3I8vXrdkz/+zFfou6RbvnD1yivyaue\nmu4oHe4P/a4QgcGL4aYWhxEggmI/wkeH6y3qN1EgAyIwLAKkH6Eyqp6AdA2L0JkkemRcmUT7pLwq\nBd1oA1FwwaJqfeKwIHoqWwUkwTuujLqcB3P34R5IK2BRRbTitHrGpafNwdCVO6pwsCOwLYey7rzz\nzrjqBup8EECugwm49FvoB+rlE+yUXnr1/W9h6U0qpoLgja3TEeF/jm7Xiqm9f6W0JVQ4+3H4D3UG\nOn8w5qHpBg03aKdBmTzdhPp6pemWP+u1qHoZ4OpGRtCTcOFtISHh/grfX+Hr4d9uEAeXJgv3Gp88\n4p6nfA+SuIGeb4iHL1UTEXRx/XLiiSf6qpVy/CcBCJIGIzBq1CjjPg8++GDqlhAcjDDOTZe5c+fa\n5Y9+O1599VU7TcFUDYIp3X3SLPN1JnS/7PAx5VE2dTH90A9BLz7hKQJiiuD8iM/Ihg0b+qFK1zrc\ndMybb75pA3Wlwb5rwTW6yNScu7fcfcC9wIc+CstXvvIVEwzgtV6qG9Y56vdll11mFi5cmPmeJ6w/\nAdwQ/ocl9UHA9ccTTzxhsowtqVpSCr1pQaGODQZgjjAXdrvW76ajn/uEWXUvXdryVuba+bd/+7dD\nX/va16xlwllDirBSpC2DusG2TElSh9s0zbcUlalTVNlgh3Um71tzVNlNOedbS9x9WTeLVR4ssUZm\nmdoNticY2meffezzi+9BE/fc5jvts7sfWIX7h99liSwiwV0wqPLkk0+awFze+Lcy+o83T1aL8NbN\nG6lbEureTrP2MeVSRhpxdePIWoagE2/gfLoJIdOJTcGSbKwjZekTpQNWEFaEsKQYJ9qmRn6Nalva\nc/QT9xAfjm+77Tbjr8RKW17d0hOef8WKFanVCgZfs2PHDptv9uzZw/LzBu4svV/4whfstRtuuKFz\njmsXX3yx2bp167B8/g+sLYcffviwPJx76623/GTDjimPcl3dY8aMsdYA8rhzfIfL4HdYP9JxLqzj\n9OnTbVl+xWeeeaY95ywPfvspBwmm8YbpgJ5hCadZt27dsCSbNm0y4Om3BX1cvX5i7tFzzjnHnqKf\nHn/8cf9yscdFMhx8KXxrQaCptSYEjRhWTRA0q/MWDxMOX/fL4DgsMDPfmYZ64ury8ybVjzy+Dml9\nRHC+8tuIbmeddVYs6/XrAgvyMy/n2gVGYR0oz10Pfydl18zZZ3mT8TGt0zFvmzjehoU30iwWiqz5\nXP3M/6e1pri8Ud95ysMqEgyC1jKRBYsofaLOoaOri/urzLqi6m/CuWAH6SE+bRH6nGcQ32nEf+7x\nzPOFZ5h7rvEs9Z+H7rz7DuflGdotPc/TKAdMynFlhr+vv/76Ydf8MYuywunDv/3nd5Jnt99+ynJy\n0UUXdeqKsiJRj6ub674Vw7/m0vjf4ByWgHx0yivTV+SPLQxrkOJ32o4nPSA5EHwAwh0QvsnodD+v\nK8P/DudJqx9N9/9J/Juo17UsnR2uy2+Lf8wN7CTJzezSxn1TdtEDBVjTh/QpOkb1Fee4RhrSkqco\nYbCNahMkJe2DsigSQTlp6w7jQZucWT98LelvymCKhH7nO295fr2U7QgIpvqisPPraMsxDrtF41P1\n/x1twvE9qYQH73C+8DjgPwfDx+EBshsJcXl5BvmDNMdRzyqXPvztP7P853c4nf/b1Zfk2R1uv8Mn\nfD5MqHyiwrETn1D4OoWPw2MdOvtpwvW
58vN+F0JEsnR8eMB2Het3qg8kDU16s1CGL1n08/UId07c\ntayd7Zfnd3rUMXUgSW5mH4PwcZz1IJwu6W/6z/8niNK92znyunsgaZ1R6brNV6d5+KdJG6VH+Bx4\nRxGkcLqo33nyRpWHHryRQ9qwIHGcpb3oxXJhLB/0Ld9ZyonSsc3nwKooqcv/HfdQGl8k//kfJhJg\nEx5wSeMGQcaB8PPPDfLhfP6zu9s1Xx/6x88XtoZw3T2rwlYUP1/4mnt2u76nHPdBN1/CurprYWLg\n10can0z59fljjI8l7fCxDBM0yvTzhuvjehGS+z8iDJivaLdrKO830L0du44BENfZrqGU7a7z7dcV\nvllcJ3TTods1Xze/nrDe/jU/T5rO9vOF20U7/DbTTl/8a7QnqfD2wqBdhKCj/w/g65TmmDJcv2XV\ni4dh3AMRqwSDZy9hoM5KGrqVTZlJ6vfLIH1ea4pfXviY+4BBBEJCX/FmC6FwOIa/Sct9A4nhQ1qm\n98rUMaxzk39D1MC4CKnT/x33APdCUvFfWsIvnJQRfjaHx4KwRcVd98v1Le1OL38M4bnrhOe1e1ZF\n5eOcu863q88vj+dXWPxne/j57JcXvhZuv1+u30YfOx8TdHHkzD/v6+7KJJ3//A7r4hOVKGxcOXm+\nczurfutb3wra9nsJlDSzZs1yP63zYNBRnd+3335755iDYIVD5zfLDRcsWND5zUZme++9d+c3B34Y\n5HBdV155pY3W6DK88sor9jCPfq6sJN84JLllaKRftmyZGTdunM1KO+644w4TdLb9HdzEsY4/4Xad\ndNJJJrjZbD7+sNlUERLcnDYaad6ycNKiz2lTXqEMygo7gqUpF4yfeeaZyCxu2Wiv5bUBYejpCBpZ\nQY+TwcBty8XZNIk4p1Snd5I8adMcf/zxNqz/5s2brTMc/4PsIxInu+++u11miW7gtHTpUrtzbpk6\nxunSxPMBYTN77rlnbtXr9n/HMy7Ns+mFF17oYLDvvvt2jqMOgsF8xFjgnq3h9H65RFsOy+TJkzun\neF7TH3xYouokKl/UOdLzvAoGYPvZvn27LQKHUJxicSb1xwRXft7vY445plPEf/zHf3SOn3322c7x\nJz7xCesQzYnXXnutcz4gXCOw9J1SSfiTn/ykk56D/fbbr/P7xRdf7BwXefCOvIVl6Xgajhx55JF2\nt1dICOI6jRvPJzT2YvDHv1mCNzh3uvP90ksvdY7dQR79XBlJvpN2tmtruLNdHVHt+sAHPuAu1+6b\nOCRhEkL/BSza7LHHHubggw+O1JkYHzy4gjfyYf1KWZQJscwiYfIaLoMVLQyicSthul0Ll5XlNwM2\ndVNP3IZqEKXAEhKrY5Z6k+RxusVhk6QMpemOQFEvAHX7vyNU/apVq7o33rvKxpFOeE50kwMOOKDb\n5WHX3BjCyZNPPnnYtagfkJCwjB8/Pnyq8xI54kJwgjICK4K54IILoi4Xfs5vF89LXoIhZt/73vc6\ndbGNgpPA4uEO7bPWrcLpnAwddCOUP/vZz0Kpi/mZm4hk6XhHRGgCb/v33XffsMHMt5S4ZobfkllW\nlUTy6pekDtIU1dlJ25VUr7h0WA1YdpdXIBK+8A+ZZNM1SCgC4eANYuLEiZ1iKDMrEekU0uXAEYHw\ngJskcFmXYlNdou6ofWrQASIS1i1V4UrcegTq9n+HtS+NhF9e0uStU1pISDDV1nmJdrph2R47dqxh\nFsAfg9z1PN+Mn7zo3X///bYYLCG8gC1evNj+xirsP0/z1NWvvH/Sr4ri6qEjwzclb8uS8hHoZT1I\nooFvpeKfLwkJCZfrLGPuvF+mO1f0N29w4YiXZU3JxOkejjfi4ny483H5dF4I+P8jTfq/cz2H1bQM\n8csNnEU70yZu+iT8HfUMxGoVlvAY5a7z4uWIBnWTjjq+/OUvR1r1Xb6838QFcoIlhLY68adlOMd0\nqhM
E0bnxH8\nMapcTcNbMg6N06dPt/FCnJNmwc0tpTgcXMNOrrTHlyxTOuw0DDmsmiD67chzPG3aNLsXTZ4ylFcI\nCIFmIiCLSDP7ra9aOyJA5NJrr712xMBaljJYQb761a+a22+/3Vx33XWNiZGRFo+oKZ1ejrBYq3bf\nfffGOqmGMcLPBcIbdggOp9NvISAE2oeAiEj7+rSUFjFY4p9BCPG5c+faEN1lWiYI2059+++/v7XK\nhK0KpTSyRoWGHWFRzZ/SYSpjyZIlw87VSP1MqrRpqikTAMokBAYUARGRAe34rM3GOrJgwQLz/PPP\nW0LCioeiSAJkZ/Xq1TbIFfotXLjQHH/88VlVbV0+N6Xz29/+1hxzzDGNjR0S1zEzZ860EWKbEh02\nrh06LwSEQDoEtOldOrwGPjVv5Q899JANzb19+3a7fJQBhCiZOGamFcgH1g/KwFfikUcesQSEFRQi\nIcPRBHs+7373uw0+FQiWk7bIhAkTDPeURAgIgcFCQBaRwervwlsLkWBlzaOPPmoee+wxM3r0aDud\ncvTRR9u62NnXF3aI3bVrl3nllVfMiy++aEOTM6ieeuqp5oQTTijMuuLX2bZjSB8B55YuXdqqpuGA\nu3jx4saFqm9VJ6gxQqACBLRqpgLQ21QlfiLseut2vuUNHbKxY8cOSziYxvFl7NixZsyYMeaUU04x\nn/vc51rl4+C3s8xjiBzWg7YJFjGJEBACg4eAiMjg9XmpLW7TktJSgVLhIxDAWRXfI4kQEAKDhYB8\nRAarv9VaIVBbBHB6zuJnVNsGSTEhIAQSISAikggmJRICQkAICAEhIATKQEBEpAxUVaYQEAKpEZA1\nJDVkyiAEWoGAiEgrulGNEALNR4Coqm5ZcvNboxYIASGQFAERkaRIKZ0QqAkChHZXvI2adIbUEAJC\nIDcCIiK5IVQBQqC/CIwbN85s27atv5X2oTZWzBxyyCF9qElVCAEhUCcERETq1BvSRQgkQIAN8Vat\nWpUgZbOSEOSOGDMSISAEBgsBEZHB6m+1tgUIEEQOy0GbwrvTLUTaPfLII1vQQ2qCEBACaRAQEUmD\nltIKgZogMGnSJLNu3bqaaJNfDUjVzp07DQHxJEJACAwWAiIig9Xfam1LEJg8ebLdeLAlzTGbNm0y\n06dPb0tz1A4hIARSICAikgIsJRUCdUGAnYmxIrQl9saiRYsM5EoiBITA4CEgIjJ4fa4WtwQBLAhf\n+cpXGt+a7373u3ZaBnIlEQJCYPAQGDUUyOA1Wy0WAs1HAGvIhz70IfODH/zA4MDaVDn99NPN0Ucf\nbS699NKmNkF6CwEhkAMBWURygKesQqBKBNgkbuLEiebuu++uUo1cdWMNIX7I+eefn6scZRYCQqC5\nCMgi0ty+k+ZCwPqJEFeE8OgQk6aJrCFN6zHpKwSKR0AWkeIxVYlCoG8IsNz12muvbeS0xvLly80b\nb7wha0jf7hZVJATqiYAsIvXsF2klBBIj8Ktf/cpgFbnuuuvMeeedlzhflQmdf8uaNWusn0uVuqhu\nISAEqkVARKRa/FW7ECgEAXwtpk6dap566qlGBAU77rjjzLHHHmvmzZtXSPtViBAQAs1FQFMzze07\naS4EOgiwembu3LnmzDPPNFhI6ixXXHGFVU8kpM69JN2EQP8QkEWkf1irJiFQOgIM8i+//LJZu3Zt\nLZf0ot/69evNxo0ba6lf6R2kCoSAEBiBgCwiIyDRCSHQXARuvPFGc9hhh5kpU6bUzjICCVm5cqV5\n4IEHREKae4tJcyFQOAIiIoVDqgKFQLUI+GSkDjv0MlXkLDVN8WGptgdVuxAYLARERAarv9XaAUEA\nMoIzKE6hGzZsqKzVrI7BOuOmi7S7bmVdoYqFQG0REBGpbddIMSGQDwGcQe+9915z7rnnmpkzZ/Z9\nqoY4ITjRQoiwhDQ5DH2+nlBuISAEuiEgItINHV0TA
g1HgI3k2IsGIdbIzTffXHqLWEqMJYYddYkT\notUxpUOuCoRAoxHQqplGd5+UFwLJEYAgLFiwwEYz/exnP2sjmhZppXj44YfNihUr7N4xTQqulhxB\npRQCQqAMBEREykBVZQqBGiMAIbnjjjvMsmXLzIwZM8wpp5xiJk2alGnqhLIef/xxc/vtt5vRo0eb\nOXPmmE9+8pOZyqoxZFJNCAiBEhEQESkRXBUtBOqMAI6kTz75pHnkkUfMqlWrrC8HS3/HjBljxo8f\nb3bbbbcR6rNT7q5du8yWLVtsnkMOOcRMmzbNnHHGGY2I6DqiQTohBIRA5QiIiFTeBVJACNQDAawb\nW7dutUTjmWeeiVRq7NixHaJy4IEHNnLH38iG6aQQEAKVISAiUhn0qlgICAEhIASEgBDQqhndA0JA\nCAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUhICJSGfSqWAgIASEgBISA\nEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJCQERE94AQEAJCQAgIASFQGQIiIpVBr4qF\ngBAQAkJACAgBERHdA0JACAgBISAEhEBlCIiIVAa9KhYCQkAICAEhIARERHQPCAEhIASEgBAQApUh\nICJSGfSqWAgIASEgBISAEBAR0T0gBISAEBACQkAIVIaAiEhl0KtiISAEhIAQEAJC4P8Di13nEo+f\nAH0AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 67, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "from IPython.display import Image\n", + "Image(filename='sentiment_network.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 70, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def update_input_layer(review):\n", + " \n", + " global layer_0\n", + " \n", + " # clear out previous state, reset the layer to be all 0s\n", + " layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " layer_0[0][word2index[word]] += 1\n", + "\n", + "update_input_layer(reviews[0])" + ] + }, + { + "cell_type": "code", + "execution_count": 71, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 18., 0., 0., ..., 0., 0., 0.]])" + ] + }, + "execution_count": 71, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 79, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "review_counter = Counter()" + ] + }, + { + "cell_type": "code", + "execution_count": 80, + "metadata": { + "collapsed": true + }, + 
"outputs": [], + "source": [ + "for word in reviews[0].split(\" \"):\n", + " review_counter[word] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 81, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('.', 27),\n", + " ('', 18),\n", + " ('the', 9),\n", + " ('to', 6),\n", + " ('i', 5),\n", + " ('high', 5),\n", + " ('is', 4),\n", + " ('of', 4),\n", + " ('a', 4),\n", + " ('bromwell', 4),\n", + " ('teachers', 4),\n", + " ('that', 4),\n", + " ('their', 2),\n", + " ('my', 2),\n", + " ('at', 2),\n", + " ('as', 2),\n", + " ('me', 2),\n", + " ('in', 2),\n", + " ('students', 2),\n", + " ('it', 2),\n", + " ('student', 2),\n", + " ('school', 2),\n", + " ('through', 1),\n", + " ('insightful', 1),\n", + " ('ran', 1),\n", + " ('years', 1),\n", + " ('here', 1),\n", + " ('episode', 1),\n", + " ('reality', 1),\n", + " ('what', 1),\n", + " ('far', 1),\n", + " ('t', 1),\n", + " ('saw', 1),\n", + " ('s', 1),\n", + " ('repeatedly', 1),\n", + " ('isn', 1),\n", + " ('closer', 1),\n", + " ('and', 1),\n", + " ('fetched', 1),\n", + " ('remind', 1),\n", + " ('can', 1),\n", + " ('welcome', 1),\n", + " ('line', 1),\n", + " ('your', 1),\n", + " ('survive', 1),\n", + " ('teaching', 1),\n", + " ('satire', 1),\n", + " ('classic', 1),\n", + " ('who', 1),\n", + " ('age', 1),\n", + " ('knew', 1),\n", + " ('schools', 1),\n", + " ('inspector', 1),\n", + " ('comedy', 1),\n", + " ('down', 1),\n", + " ('about', 1),\n", + " ('pity', 1),\n", + " ('m', 1),\n", + " ('all', 1),\n", + " ('adults', 1),\n", + " ('see', 1),\n", + " ('think', 1),\n", + " ('situation', 1),\n", + " ('time', 1),\n", + " ('pomp', 1),\n", + " ('lead', 1),\n", + " ('other', 1),\n", + " ('much', 1),\n", + " ('many', 1),\n", + " ('which', 1),\n", + " ('one', 1),\n", + " ('profession', 1),\n", + " ('programs', 1),\n", + " ('same', 1),\n", + " ('some', 1),\n", + " ('such', 1),\n", + " ('pettiness', 1),\n", + " ('immediately', 1),\n", + " ('expect', 1),\n", + " 
('financially', 1),\n", + " ('recalled', 1),\n", + " ('tried', 1),\n", + " ('whole', 1),\n", + " ('right', 1),\n", + " ('life', 1),\n", + " ('cartoon', 1),\n", + " ('scramble', 1),\n", + " ('sack', 1),\n", + " ('believe', 1),\n", + " ('when', 1),\n", + " ('than', 1),\n", + " ('burn', 1),\n", + " ('pathetic', 1)]" + ] + }, + "execution_count": 81, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "review_counter.most_common()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 4: Reducing Noise in our Input Data" + ] + }, + { + "cell_type": "code", + "execution_count": 82, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " # set our random number generator \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews, labels)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self, reviews, labels):\n", + " \n", + " review_vocab = set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def 
init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " \n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " self.layer_0[0][self.word2index[word]] = 1\n", + " \n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def train(self, training_reviews, training_labels):\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + " self.update_input_layer(review)\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward 
pass ###\n", + "\n", + " # TODO: Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error is the difference between desired target and actual output.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # TODO: Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", + " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # TODO: Update the weights\n", + " self.weights_1_2 -= layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " self.weights_0_1 -= self.layer_0.T.dot(layer_1_delta) * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(np.abs(layer_2_error) < 0.5):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / float(i+1))[:4] + \"%\")\n", + " if(i % 2500 == 0):\n", + " print(\"\")\n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \"% #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 
100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + " self.update_input_layer(review.lower())\n", + "\n", + " # Hidden layer\n", + " layer_1 = self.layer_0.dot(self.weights_0_1)\n", + "\n", + " # Output layer\n", + " layer_2 = self.sigmoid(layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] > 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 83, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000], learning_rate=0.1)" + ] + }, + { + "cell_type": "code", + "execution_count": 84, + "metadata": { + "collapsed": false, + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:0.0% Speed(reviews/sec):0.0 #Correct:0 #Trained:1 Training Accuracy:0.0%\n", + "Progress:10.4% Speed(reviews/sec):91.50 #Correct:1795 #Trained:2501 Training Accuracy:71.7%\n", + "Progress:20.8% Speed(reviews/sec):95.25 #Correct:3811 #Trained:5001 Training Accuracy:76.2%\n", + "Progress:31.2% Speed(reviews/sec):93.74 #Correct:5898 #Trained:7501 Training Accuracy:78.6%\n", + "Progress:41.6% Speed(reviews/sec):93.69 #Correct:8042 #Trained:10001 Training Accuracy:80.4%\n", + "Progress:52.0% Speed(reviews/sec):95.27 #Correct:10186 #Trained:12501 Training Accuracy:81.4%\n", + "Progress:62.5% Speed(reviews/sec):98.19 #Correct:12317 #Trained:15001 Training Accuracy:82.1%\n", + "Progress:72.9% Speed(reviews/sec):98.56 #Correct:14440 #Trained:17501 Training Accuracy:82.5%\n", + "Progress:83.3% Speed(reviews/sec):99.74 #Correct:16613 #Trained:20001 Training Accuracy:83.0%\n", + "Progress:93.7% Speed(reviews/sec):100.7 #Correct:18794 #Trained:22501 Training Accuracy:83.5%\n", + "Progress:99.9% Speed(reviews/sec):101.9 #Correct:20115 #Trained:24000 Training Accuracy:83.8%" + ] + } + ], + "source": [ + 
"mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 85, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):832.7% #Correct:851 #Tested:1000 Testing Accuracy:85.1%" + ] + } + ], + "source": [ + "# evaluate the trained network on the 1,000 held-out reviews\n", + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Analyzing Inefficiencies in our Network" + ] + }, + { + "cell_type": "code", + "execution_count": 88, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAA
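The core change in the Project 4 cells above is in `update_input_layer`: the earlier global function accumulated word counts (`+= 1`), while the `SentimentNetwork` version only records word presence (`= 1`). A minimal standalone sketch of that difference follows; the toy vocabulary and review are made up for illustration and are not part of the repository diff:

```python
import numpy as np

# Hypothetical toy vocabulary, standing in for the notebook's word2index.
word2index = {"the": 0, "movie": 1, "was": 2, "great": 3, ".": 4}

def counts_input(review):
    # Project 3 behaviour: accumulate raw counts per word.
    layer_0 = np.zeros((1, len(word2index)))
    for word in review.split(" "):
        if word in word2index:
            layer_0[0][word2index[word]] += 1  # frequent filler words dominate
    return layer_0

def binary_input(review):
    # Project 4 behaviour: record presence only, capping every entry at 1.
    layer_0 = np.zeros((1, len(word2index)))
    for word in review.split(" "):
        if word in word2index:
            layer_0[0][word2index[word]] = 1
    return layer_0

review = "the movie . the . was great . the"
print(counts_input(review)[0])  # [3. 1. 1. 1. 3.] -- "the" and "." dominate
print(binary_input(review)[0])  # [1. 1. 1. 1. 1.] -- every seen word weighs the same
```

Because high-frequency tokens like "the" and "." carry almost no sentiment, capping each input at 1 removes that noise without changing the vocabulary, which is consistent with the accuracy jump the training output above shows.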
ykJskC/pBlI18JbmXTDdDoLYIWIxXBN6QkELkAtLFhJ3s\n6yn5iBXkopCbUXSnvYXICGXR5lKEgYe1GuMQCBMTutLmcnDHasmkqmzl5NdtYCCFVNZScNFRpyZd\npay5CPmCjEG6IFPEbUG0sHRhMZOvF7WlqxzSBeHq1q27J12n9O3vVqxc4S6+cHQLIlUqbpCxYUNO\nd1jGJl55lXv55Vf85K5XXjUpFld2qfqUkh5CzHPAPWNiCBgChoBZvHLcA+JexJUVjukSYlbp4J2j\n6pJOQ5ggRASbSxwT1iF0LMbNqCsjD+QEt50OXqd86uF6qcKAw6zpcf3HD+YQRDYsc+JqlL3WD71l\no11x9RcWDMhkLd1HfLk4YMAA3Ty/yDXWLgLjcS/q5X9wL2Lhkngulv/B0kVaXIeQLln+54UXXvAu\nRpmfi3gvCJfEdLWoNM8P+viSCb8KLFxvOAjXjwb0zZO6/EtYwth+/OMfuYsvvthdM2mSuyrYevbs\nUX6hVc6pyVct75sqN8uKNwQMgTIQ2CB4ETPLtYkhUDUE+gfL10DA4nA5Vk3JEgqeO3eub0utLBhx\nrLkI6ULCay5CXo855pi8ay4WA8306dPd2DFjXd8BP3L9+53q2rTeoZhssaSZ9dvZ7n+DJYW6dT/C\njR17sXe5xlJwFQoRt2OtraVVaIoVaQgYAmUiYK7GMoGzbMUjQOwQZKURREgXZLLawiBNPZWuuQjp\n0kH0uBSZn4sPFfbee++S1lyMavNZZ5/jSRcuwFEjR9SUdKFPn+8d7xb6LyXXub79BiTapYflC9KF\n29jEEDAEmhMBs3g1Z7/XvNUMOAw2aXezQIZwWTJBKVY8kbgtGNRDmXF/uUhMl8RyPf744+7oo48u\n+8tFdIToIJMnX1tzwiXY6/3wM0e4eXff5WbeOjPx9xrPQ9z3jcbCjg0BQyCZCBjxSma/NJxWuMsY\nqG8IYpXSLJdffrm33oUtFuHfEEzIZjmCC7MaXy5CumQWeiZKZS1GposgnqvUIPokki7B+rppM9yE\ncWOMfAkgtjcEDIFEIWDEK1Hd0bjKMP3CbrvtFnyN9nILS1HaWoyVC/IFMconkCfIiQj52AoJBI6y\nmVlehC8XmS4C4YtEJj3FfaiXAJIger3mImSK6SIIoteWrr/+9a/+/J577pmdo4uyS5ku4qSTT/VT\nVCTF0oX+WtJGvioh6rrddmwIGALJR8CIV/L7qGE0JF4JSavVC8LFBoksVcij82ENC7tdwaVaXy5C\nvNj4cnHp0qV+brpyvlyk3RddPCZYWujxxLgXc/XFub8Y7Z55+k9uxvRpZVsfc5Ud93mIOsS8XCtp\n3PpYeYaAIVA9BIx4VQ9bKzmEAMQDq9dTTz21HukIJU3cT6xXDIwE18cRl0N5+qtIXLE6novgd+o6\n7LDD/BQQWLr0dBHhSVFlugiAC3+5SEwXVi/m59KkCwtXKVYuyp4/f4EbGkxeevucudmJTjmfVMEy\nt1WwyPe1116dVBWzehn5ykJhB4ZAQyNgxKuhuzd5jWNKCQiFJh3J03J9jcS1iO5xCgQsvOYi5eNa\nxMUoy//gXsS1qJf/EfIVXnMRgiWuRUgXVi7OrVmzxpMy5jYrh3Sh60FdDnK/DCxefEmYBlm95k3X\nPfhIIenzfAmWPBdYvSD5JoaAIdCYCBjxasx+TXSrcLFBZNIyrxdkCzepWCTiAhcig/VMW7r0mous\nvSgxXVi72rZt6yc1lYlRIWFRay4K6WIvli6C6B955BHXvXv3skgXbR40+HRvfZs65dq4IKhJOTLP\n17LHlqXClScuaSNfNbk9rBJDoOYIGPGqOeRWIQQGwkFMk1iSkopKtXSlXNqul//JteaiWLpYT/Gt\nt97yVi+W/sEyss0227RYcxGyJV8uYulihnpI18MPP+yOOOIID3Op7kUyEfQ/eNBgP19WLSdHjeu+\nwOX
YsUMHd+6558RVZFXLMfJVVXitcEOgrggY8aor/M1bOaQLFxsDejjIPCmoiEUKkkhQfVxCmyFd\npXy5KF8t4l788MMP/VeNb7/9tnv33Xc9sYJgffWrX3UsFyWWLr5oJN7r9ddf9+QMC0o5pIt2Q1z2\n7rSPnyA1LhxqWQ6LbO/d8eup+qrWyFct7xCryxCoHQJGvGqHtdUUQgAyg7sxieQL0sVEqVihIIlx\nCWWV8uUiJEtci5AuHUTPV4nEbsmai6z+xWANCYNwsSZjt27d3OLFi/2+3DbQP2m2dkm7mVx12222\nTo3VC73pT+7FpP5zItja3hAwBIpHwIhX8VhZyiogQOwUMVRJIl9i6cJChFUOi1ccQll6+Z9ivlwU\n0gUBg3QR64VArCBYEs+lv1wUSxfE7IILLmhBuhjAS52yYMRZI12rrbdJrbVL+m7J0sfcD/v3cytW\nrpBTqdhzP0LAjHylortMSUOgIAJGvApCZAmqjQBWIEgJ+3rHfEnslcSg0XYhhaUSFsGNgZP2sZC0\nyC677OIJJ8H0xXy5COki0B4hZkt/uSiuRc4J6dpwww19/BiuRQikCO1DHxGu6etyXvakxfK3/Nnn\nUzF9hOida39sr97u+N69HIQ/TWLkK029ZboaAvkR+ELg6hmdP4ldNQSqiwD/ye+www5u8ODB7s03\n3/RL2VS3xujScX0yII8cOdKNGzcumwhismLFCv8FYankiwETEnffffdly4NsMZ8W5UK6mCoCS5YE\n0eNSXLdund841l8uykz0WLgIomcP8SKQXpMuCBdfS4atJOBMvbKhH2QMiwobv0kjMnPmTPevzAbu\njGFD5FSq9399d6178sk/uO9+979S1Q6sm2zLli3zfZcq5U1ZQ8AQaIGAWbxawGE/6okABEAsEZAg\nCEstBMJBveyxuuWqV4hJmMzk0lGsZ6V8uQjREvci00Xw9SLkDAsWxAqCpd2L+stFXItsyEMPPZSz\nHbn05bwQMUlz2cQrXI+ePd2wIafLqVTvCbI/ofdxqXM3atCxwOa6R3U6OzYEDIFkIrBhMtUyrZoR\nAQiNkBVcjkKGqoUFJAMXILPpS935BjSxEjHwFRIZHDXpYkLUadM+X75G5ueCWOFGhHDxlSMb1i6x\ndEG6IFMSz4WVi9gwmTIC9yKuRwLphXRRJ7qWI1j0wEC299f9zXUOvpRsFOnYob1r3aaNK6YPk9pm\n+ibN+icVV9PLEKgVAka8aoW01VM0Ani/sS4hkCJIWJwzxotljdglyBcLd2NhK8aNKMSEgY+8UYLV\njK8J9XQREC5mo+fLQ1n+B9KFVQsLl5Au9pAurpEWMgW5wqUI4dKkCzImpAuLGO5FNrArl3jp9lDO\ngw8ucl0P7qJPp/74m/vu32L+tDQ2yMhXGnvNdDYEPkfAiJfdCYlEAIIDgXnvvfe8NQrLFGQCKxgk\nLBfpydUYiJKUwaBF+XPmzPGEqxySQhkQEzYt1KGni4AoMQM901II6fr00089sQqTLggY57iO8OUi\nrsQw6WL6iCjSRR7aiW5xCG37wUmnxFFUosr46nbb+Y8FEqVUGcrQz/R3qc9CGVVZFkPAEIgRAYvx\nihFMK6q6CGCpgoyxJ4aJLwMhTbgJo6xVMigRZE5AOwMVm/5yslKiAjlh4EMPSFetv1wUKxfICwlE\nlzgEK+Arr77hJl42IY7iElPGvPvmuxtnzHC33HxjYnSqRBGeB/o86hmopFzLawgYAtVBYKPqFGul\nGgLxIwDBggyIMOBAeiBPUQIRYjCCbOUSrlVCvhjwIDy4LbXoNRf1l4viXtRB9MzRxXlckBAp3Ic6\niF5PFxHlWpR60SNfWyVdsfsNNvyCwzpkkmwEJD7RyFey+8m0MwQEAbN4CRK2b1oEKrEUQf6woOkg\n+kJrLmrSVcmXi5A0kUrIo5QR3k+8/Ar30cefpn7i1HC7Vq950+3Yp
rV3/Yavpfk39yL/aEDATAwB\nQyC5CGyYXNVMM0OgNggwUGE5KzVWRsiOJl2HHXaYD6JnAKzml4uadEEcbbAt/l5J4yLfxbQOyxci\n/0gUk8fSGAKGQO0RMOJVe8ytxgQiIO6aYlUj1izqy0UC6flK8qWXXvKTooprMe4vF7WeRrw0Gs19\nLATcyFdz3wfW+mQjYMQr2f1j2tUQgWLJV6EvFzt16uS/THziiSfWmy4iji8XNSRiddPn7Dg/Akyi\n2m6vdvkTpfiqka8Ud56p3hQIGPFqim62RhaDAO5BtlzWAlyRTGehF7rmy0rIDy5GHUS//fbbu222\n2cYtWLAgOykqpItAelnoWoLow5Oihmej118u6nagpwyy+nxcxzIha1zlJaWcV197ze3X+YCkqFMV\nPeS+IO7LxBAwBJKFgH3VmKz+MG2KQADCAdmRPVkgRUwbgcg0ExxjxWIQ4ms/iYHhfC4hLWWz10L5\nlCF1cK3Ql4sQpnbt2rnFixe71q1b+9nlK/1yUetE+9GpWrLlFpu75cufrVbxdSsXAtwMwj3MfQv5\nKubeL4QJ9xtlyUbZ+rljzjqpR5479tW8RwvpbNcNgSQiYF81JrFXTKf1EOBlT1yVTJ4qL3T2WKkQ\necGTVgYFGSTkHF8gyrZeJeoE5EuXV+mXiytXrvQz0GMJK2XNRR1Er9Tz5FD00+fjPAaDSy651N1z\nz+/iLLbuZY0ZN8Ft9uVNgoW/h9Zdl1oowLMAaeJZKVX0c8dHJFh2ue8gdWyI3IfyjHGOe4c62VM/\naeS5k+eVdCaGQDMiYMSrGXs9JW3mhQ3RYgkhhBc3rr5yBhDyMzAwEDAXGGUTqyVzfXFdiwxW7KlX\nL//Dmoss/4PIl4vMNv/xxx97VyIWFaaMYMO1yDVmrV+7dq13M+69997Zha5lji7IGDPVs+Yiy/8g\nuUgXAxoiA5//EdMfwQic7rjjDl8qujeSDBx0mnv7rTVlr1qQRiy4j+lbCFAxwj85PCfca0KY2Jcj\nlMFzTJkc8wzz3FXj/i1HP8tjCNQaASNetUbc6isKAV7SvJwhWbyo2eIUiAWEjsEoFwHjvI7non7W\nXJTlf4jpIl4LYsVC15AsSJcQL87J8j+y5iIkZvXq1d5yAOkinktIF2lkzcV8bUX3YgfQfOVwjYGQ\n8tgYHDXB5Pp+++7nLho7zh19ZE9+NoS0b9fezbx15nrxfHFhmmSQ6Od87eQe4L5HeD7ifu543iB0\nrPDAc8SxWcCSfMeYbtVAwIhXNVC1MstGgBczL3v+Q4d85Rskyq5EZWQgYoCBgDAIyH/1YdJF/AqD\nEq4WWXNRky49KSoEDNIlQfRYslhbEaLFuotsTz/9tNt///3ddsHM8FyHdOUKolfqeoJUCSbgSptp\nC3s9B5muR44hXkcd81138YWj5VSq90uWPubOHXVOsH7mwvXaAR4iWGMa1SJDO8P3EPc/z50QI46r\nKdTHM4YuPH9C9qpZp5VtCCQFASNeSekJ08MTn+HDh7vzzz/fv4xrCQkkj5c/xAtyIm420eGpp57y\nwfRYucS9iGuRmefF0iXuRUgXaRC+XNx0002zpEtci5wj7mvbbbd1u+22W1Gki8EKKZUQCMlikNMf\nB/jCIv4ce+yxfmBmcGY+skmTro4kKhFZE3/q3F+Mdv/49BN3yfixeXUFa8GbhGGikjdzCi5yL0ib\n5N6HbEGCammBQg/qxbKNHrWsOwXdZCo2KAJGvBq0Y9PULIiO/PcLSSg3hqvSNvPf/r777tuiGL5c\nxL3IgPC1r33NffbZZ96SJROjhi1dpa65+Oqrr3r3XrjeFkr8+4ceLKOuyznSycZi4oXk29/+th+E\nGYhlWgysekIyv/GNb7orA/LVCO7Gbt26B8T+f7OkoxA2ch08RSC+pZJfyZukPW3Cysue547+r4fw\n/EO+eP7q+fzXo+1WZ3MisFFzN
ttanRQEeOnKC58Xbz3/46VuXIoS56Sni1i4cKFr06aNj9kSS5cm\nXeWuuYi1i/oY/ASHqL7Jdx3cuC6b6B9VDudol3ydxp42i/tUiCMWO7HsHXPMMe7+++5PPfG6btoM\nD0k+nHNhpvNgCQNrhHumXv8oeAUq+IOFCcsu1tx6tgEMIVxY28AZbOupTwWQWlZDoCgEzOJVFEyW\nqBoICOniJcsgkATRVi8ICUsAMRM9li7IV+fOnddzLeovF4nVIlh+s802W8+9KEH0ub5clAEnTD7F\n5SVWFhn4Sc+AVYhoMa+ZEK1evXpliRYWLbFqaaJFW/X2wgsvOMjX8mefdx07tE9CN5Wlw0knn+q+\nd8Lx7vjje5eVPyoT9zD3jAj3crj/5FqS9mJh4h6iDXJv1VtH3gNi/TbyVe/esPqrhYARr2oha+Xm\nRSCJpEsUhthAVhicsAgQi0Vw/FtvveX+/Oc/u5133tmtW7fOTxcR9eUipIsAeuK5Sv1yUax+eiCE\nXCHsGSgLBcRDGCFaxKthQcBFKq7DMNnSBItjPgjQe7k+PpjPa/fd27qJl00QmFK1n3fffDc8mLdr\n2WPLqkqM6D/ubSSp1jBNupJIEo18perRMmXLQMCIVxmgWZbKEUj6y19cbwMGDPAWDZb+wbXI+oss\n8YPgXsz35SIEjCB6sXQV++UixI/BhwE8PJ1FLuR1QPw+++zjiZaQLbFmyV7IVJhghUkXv8Xd+OKL\nL7pLJ1zqZs66zXU9uEsuNRJ7ntiuoUOHxmrtKtRY+i9p1jDcedxbQvALtaFe14k9Y0u6nvXCx+pN\nNwJGvNLdf6nUnhcqAwAEI4n/cQOquOCYh6tLly5+Y+JULF2PPvqo23PPPbNzdOX7clFIl8zPlWtS\nVCxZssUREI/+ECtNtoRIRREsIWOSRvIKDmAybdp0P8/Y7353Jz9TI8xUv2zpEnfnHXPqqnO9rWHc\nX1hB2afBjSdfGKOviSHQSAgY8Wqk3kxBWyBbvPRxm+EGS6JgKWKDhBBsvmLFCtejR49g+ZxLPOEi\npusPf/iD69ixo59pnklQZY4uPV0EhEziucJzdDEIM6DIVihOK19AvJAjTbI4FoKlLVv6nD4vedlT\nnmAgRBFrHeSRJYT+69hebtTIEUnsuvV0Yt6uQ7oeVPcA8rBitbaGUR/ua/7hScucWejMuwJ906Jz\nuJ/ttyEQhYARryhU7FzVEIBs8TLF6pVUERcdJIUYLojWpEmT/GzbV199tSdTb775pp/0lPgpIV3E\ndUHCiAeDdEFW2BDisoRksS8Up0Wevn37enIqMVvsIUVRREssVrIXghXey3X2QraEaFGnCCSLTdog\nk7w+++yz7rLLJrpLLv2V6/O94yV5Iver17zpTj7pJNe/fz8/S3oilfy3UtoaBkFii1MgLkL24yy3\n2mXxrGD5Qve4Mam27la+IZALASNeuZCx87EjIC/RJLsYaTTESyxGQrxYBuiUU07xXziefPLJHpvn\nnnvOde3aNRtET0yXuBaJB1u8eLGjzQToFyJaQq6EmELoCPAXosXAA7HbcccdW3xxCIHKRa7kvCZb\nUh57LUK02GOli9pkLclLL73Uu1tvvOmWxMZ7QboGDz7NtW/fvuBkqRqHJBzzfLCJcE9UIpTFtCUv\nv/xyKslL/+AjF4TYNBNDoBEQMOLVCL2YkjbwHyuuDnmRJlVtbfGSObuwehF7xcz6t99+u5+SAZL1\nzDPPuG7dunlL19KlS93dd9/tHn74Ybd8+fKCzWPiUvnyUAfEM23Ft771raxFChIIeWLgfO+997y7\nU8iUkCu9l/Sk4VjIllYIF6K2aIWJlpAsvcf6RXwbU2rcc888N3nyZDd9xo2JJF/DzxzhVq160c2Y\n/vnkt7rtaTuGvIvwDLGVIjxv5OHZS6PERRz5QKZnz54egnHjxrmzzz47VXC0bdvWrySB0qXoP3X
q\nVDdo0KBsW3m/1VJ4Z/HOYBWME0880c2aNauW1SeyLiNede6WfA9Tvmt1Vrvk6onpwt3BSzTpwouJ\nDTJDcD3kSzasXQcccIAbNmyYw+LFS+TWW2/16Qu1C3IlRIvpHiBEQvKEIDE4COmCOKED14RYrV27\n1tdLWZp8CdmSctgjlC/xZZpoQaIgW+JC1ARLSJg+h8UPMnnEEUdk3Y+jRp3r5t1zj7v+humJIV9Y\nukaPvsDhCm4E0hW+p3h+9DNUyBpGWqxdDH5J/ZAl3Mao3/LPWiVWL3mf7r777gEpXxVVTaLPif4o\nGSZeEovJtSlTpriBAwdy6KXexAsltA68MyFgzSwbNXPjre3FI5DvwS6mFF6YaQmQpa0QFiEqxGtx\nDvchZOSaa67xW6F25wuIj5ohnsFg66239hOiCunSe44hVAwcuDGxYhBPJmQLnYVooRvkStogREvI\nVnjPdSFaco1zbK+99pqDeDGJKuWBBfvLLvtVMAv+Pu6HQQzVLy8eU/eYL3Ev0vZGJF20iz5nEylk\nDcPK1a9fv1STLtpKOyCQxIaWQyDHjx+ftRalzdIlfZ3mPURQ+mDkyJFGvNLcmaZ7OhDgv27inCr5\nb7XWLYVcsEFCEIgGE6guWbIkpyrEZR0exOOwYdEiRgsiJK4+rGaQJDaxVuk9S7cceuih3joRJlyS\nTvLz3y+uR+LKvvrVr2Z11EQLIqUJVZhYaYIl14RsyZ5FtSGDHTp08BhI48EG6R+4sbbccis36pxz\n3Isvrqrb144yQeqxx/VOXUyXYFrOnntNhOdMiBjkRL4elnOSLo17yCbPFJZz7rlSBGsfgz7SqlWr\nFtagUsqpd9pyrXSQHm0Bq1c70AHShcuR/mhmArxhvTrB6m0eBHhZslRNOf+p1hMlIR9YvNgOPvhg\nt/3222dVgvRgBfrVr37lXRcspn3dddc53JGs64hVi+B8JlqVdR2ZB4ypI/hUno1BgW3OnDl+oORY\nrpGODWsTMWYySz7kC0KH5Qsd33jjDR9jxteVTO5KoD5Ys2eg4Vjv5VjSkI7AfdrDV5ky6Sskk/nK\nIHnUI2RUSJcAwRI8M2+d6efKOrZXb8cUDrUSrFzn/mK0n5V+zNixTUW6whhDTiBibBxjYeb+gYA1\ngkC4yvnnDTcXzxUSJiAQALmviYMiHeRAzrEnjeSPwpHwAPLqPPyzUigfehFzpvPxm/NRwnMoaSkb\nkfw6vegi5bCXfOxFwu188skn5VJ2r9MQp6Ulqt359NfuRdFNl9dMx6kkXtx0+ibkZuIcTFqL3IBc\n50EIX9dlcBwWHjbK1Tdtrrp03mL103nKOS71xtftBQvy8zBJ++RloXUp5sHW6aOO+Y+b2KY0CZgg\nYkHCIsTGi/vUU0/11yBRd955p4/3YhkhvjhkSSH9JWSYaAnZ0nssXZAgzjFQkgeiBmEjxgxrF1Yz\ndIIAQQIhRxAl+nSPPfbw9zZlCMmCXNGf8luuQbIgZ0K0KEeIFuXSRogeHwjw0QDlCBa+0Tn+MLgz\nQWmPHkd41yPB7c8+tyJH6spPQ7iYGLV7QDLe+evb7t777q3prPSVt6C6JdDfCJP+NorwDuEDF56T\nUkQP8szHl0943/H+1gL5kOBwfZ5jrkWRDSFwPJ9RhIaxiY13sBZ5p1NmtUUTIeoK68I5jZ1OD0ZR\n7Rb9aVtY+Edx//3396cZf2677bZwkqb5nSriRWfxAHCzh0mUPBz6JseUycCBCImSnuWG0mUQkKiF\ncnhoKDcsnONa+EYtVb9wuaX8LufG1+Vz0/PgaLzkZRH3Q4+bkf/C0yZCSNlDwNh++ctfuhkzZnhr\nElNEiBuR4HesYbgjV69e7ckTJEoTLPBlk3NcZ+PLyG233dZbtbCSUVYuogVhgjyxCZmC9HXv3t2T\nPoiZkC1JI0QLi5i2aPFVJmSLPGy0jzaxHV5mfw0bOsSToI0
2+oLbu+PXHQQsTgsYZE4I1/JnnnaT\np0x2UyZf43YNLDwmLRFI4z88LVvQ8hf3NXGTtKtY4f2m3/P5iBdjgn4f6jooI0wmeAeHSZrOwzHP\nO+9T9iLUowmNnNd7xpZCZev05RxDgiBDIuG281vrLdgxdkSNi1IOe9oXpb+UQRojXqCQAunTp0/O\nBwP1wzc5N5X2I3MzyEOobwqYvL4hwuXkgib8IJaqX65yC52v5MaXsvM9ODz0cT0UzD9FrFM9B0Ze\nfEKipP2V7rHwMADg8sMiJV8/EgD8+OOPe+IlBEvvtUUL9+G9997r5wLDfYiIRYugeT0jvpAocRNG\nWbOOPPJIT+SoD5LFJu5DyhOiRWyXEC2NC32FVOqaoq8nXDLOx6Btu83W3gLWrVt37xIkFqtUgbhd\nOekaN3DQaZ7MvfKXlz3huuXmG8smiKXqkMb0xOeVS6DjaG81njvaI/dpMTrqf47F2pIvH2mIpeK5\nZq/HBcqS8hgj9BjCWDN//nyfj7zapRlOq9+t1MeXylKf1lGny6Wz1KmvY0QI66Cv62NtxZK2yXX9\nG71ENz12cI71a0V/jRf40HYtmujp8nWaZjjeMC2NhDRpRs7ntHQ2m7ZW0dH6vwmIl+5sbgZNwBjI\nKEsL1/UNo+vSRA4SJw9Hufrpeos9ruTG13XodoXnVhGsK32wCfitJ+nS7eVY92v4Wim/GQBoG5Yp\nSJMQL8jUTjvt5O9VsWixj4rTIjieexP3HqQoF9ESsqX3EDH5zTFWLYjWQQcd5F2HuDzzES0IlxYG\nM/qJLS6hrHPPPcetWLkicHkNc5tusrG7ZNxYT4KJBYNIYb2K2ojbOunkU137du09cVseWAVZnJv+\nw8JVT0IRFz7VLId/CrAOJUXieu74p6AU4iXvMXDQ40AuXHgPSjr24feikAXe+7pNjEGadIR/6zFJ\n/vlHB4gPzzFCfXp80br7BFX4o4kX7dF16mNJxzmtP/gIIRO8pD2UJ+OjqC7Y8pvruixJ0wz71BAv\nueHpFP6b0DcovzV50jc56TUx45r+TyVMzEivb5ZwXdSjbx65OSvRjzqLlUpvfKkn3C4eLHm4SKNf\nKpKnnD0vySQNktJf5bRF58GKx+AG6ZIvEPlqkfguXHbs//rXv64XEM81LE64+N5++2239957xxoQ\nT7nEfHGPEqclFq0w0dJtoR2QJIkL0tfiOiY+57xzR7lFixb6e+vM4We4Dl9v5zbfLIgxCwhZePvi\nF4Ln/H9+5N2WELepU651/YPg6mrqGFdbk1AOVs8kYRXXc8d9StuKFV2vfm9H5YdAhNNAIvR7UYiC\nLpc0mnRJ2bxjRTSpEaLCNf6JZhPrEHWJQYF9tSXcZj2O6WNpn253OC+65sJL2hHGV5cnaZph//m3\n8iloqe4guQm02tywYgni4eBGF+ZNeh4CIWTy8HATaAIn5em69EMi16M+69V5StVPyi1mr+vJd+OH\n2xouO6pdnNOkM5ynEX6DX1T/lNo2BgARsXoJCWPPdYLmmYYBkTgqSBfbI4884r/0xNrFb70nrfyW\n9JIf4sbG7zCp0uSKe//wwCoHqcJKEDUIM4DVgxijC7qxmVQHgXr0a76WxPncEWBfrOh/IGU8yJU3\n6p1IWk0WpDwZQ7ieK1+4PsnLmKPfs2IIkPFLxitN+KinWkI9ooOML+xFX9onbZRz6EIa/c6J0k+n\nL+d6VJ5GOJcai5e+0Yml0oMOxwS7awl3ODd7+EHQljDJq+vhnH7oJE3UXucrR7+oMqPO6XbJjR/G\nQkgX+XV6XV6x7dJ5yjkWa0o5eauRhxeMvFziLF8TInEdMsP9iy++6L8gxB0ocVoQnn333dffj5AQ\n7kvZS5pSAuKl/6PaA7nBJcqmRc4Z+dGo2HG1EIjrudP/8BSja4eXj40AADlaSURBVK73XzF5q5UG\nEkNclLaI6bqwNDGGCBH
T16pxrP8RFSuXJoa1IoDVaFtSy0wN8aoUQB7A8ENYjQG4Uj0bMX+pL8so\nDHhxC8EodS8vE8rlHiDol5daHP2PLlifsExJnBYB7fL14V577eWXG4JYSUA8MV9ihYJ06RitUgPi\no7AKn5NgeYmNkb2cD6e334aAIJDU5070K7Qv5R/M8PggZet/qqU82ZMm13skXJ7+xx/yxT/+4lYk\nhIVNp4mLrEo7cu0hXlIvOtMe/c7UxEzSURbnRf9c+yjjRi49mul8aoiXvtEl4DtXZ3Nep6dDo/57\n4MbWDxXp9I3F7/B1zkWJrq8c/aLKjDqn9bMbPwqhwud4udD3UfdE4dwtU+iYLebDIkAea5VYrr7+\n9a/7DJAzzvGlGfFOYbKlp3kgTgsiRx7K10SzZe3F/4L8so0Oll6R4+JzW0pDoHIE4nzuitVGvy/D\nRChcBlae8Pue39r6I+95cb1RBuVqoiLlas8DepCH8vTzLKQNjwwbYSxaZ7kuZVZrr61v6C31orNu\nqz4mTSFMC+kreBZK12jXU0O8wh1eSkdwI8mDIQ8A+eVFoMviur7xww8iabFcyMPDAI5Uop8voMg/\n4XoqvfGLrLbsZFh6xMJSdiEJzaitXbgXmbIBaxdWK4iVkC9mvGduLPqqU6dO2SkewhOXQrTY5N5i\nH5dIPBfEi/4oJUA5Lh2sHEOg1gjogT3qXR7WBxefpGPPby1i/cH9pseJ8GSo4d/irkMfnY9//qQ+\n6mGc0u90nVbrke9Y58+XTl+TdnFOE0bRW9Iy/gim1IP3QEia5NXjoy6L67qt/NbjGb+bRVJDvPSN\nwc0qhIeOkgdEBiwd78XNoS0b/FcR/gJSSJl0ur7ZqEf/x0NZ+sYWvWRPGaXoJ3UWu6/0xi+2nnzp\ndPvzpeMa7qxGHOSFFLHHOoWVSlyNEC823I0yQzzxXvfff79r166dTwtRqybR0v0SjufKFfel81Tr\nmHuBuL8rr7wqmIz2Ij9lBNNGhLcRZ410V141ya/NF45Pq5ZujVZuIz53/NPAPzTFih7Yw4N+VBmQ\nCMYPnmv2mlQwLkh5ECLGEhHK1vOWacJBWj3maOsSY4/UR52a6JFPjytSV6E94w9laR0K5aGeKJIX\nVb9uN/jo1U8gnDI+QNB0W9FB4wmWUXUW0rURrqfmq0Y6EBIkDw83F1uU6BuDNHIj0MmUIze0EC5u\nFv2lIvn1TasfBl2ffhDL1U+XV+wx+qEzIjd+VN6oGz8qXannBHv89+EHK6qsOAYAjXVUHeWcq+Sh\nZwDYNXDd4QrEtc2LDiIVFs6zPffcc8GSNse71157ze0a5KuVoCdWx3A8F7+FkFVbH+p58MGH3Ow5\nc91dd8513z32uGCw2cN9dbvt3Kl9+0ZC8cILLwTLJn3oZt12u+vdu7c7/PBu7ohgcDj0kK7B8eGR\neezkfxAAI6yblUrSnjveJeF7OV8b0V/GCd6VjAW5nntIBtc1OZCyIQnheCXew4xHeqyQ9LKnLkJP\ndJ3kY+yJqkfysWeOLJ1PXwsfo1+h8sJ5wr9lDJPzlMkWFtKBk+Aavs5vxh7aHRYZizkfRerC6Rv2\ndzBopEYCcpQJbgQmN8m5Bf9ZZNsTfDnSIl2x1yggeMha5A3XiR7BjMPZujgoVT/yBDdoth6tX6Fr\npA3rpH9TLvpo0XUFD4W+5I91mcHD1eJ6FO5gVEgWLVqUOeywwwolS931YA28zPnnn+/1DqaTyOTa\nSBBMlOo3jsGjVkJdwYsub3XoFkx7kTdNuReDhb8zPzjpFH+f/nT4zzO33nZ75o3Va8oq7p5778+M\nOu/8TEDA/HbDDTcUbFtZFTVIJvo0mGuuQVrzn2ZMnDgx069fv/+cKOJIv7sCMtMih37nBUTAv9N5\n9+l3afi93KKA4AdlhvMEhClDvvAYofNyXesmdXKesSss+v0d1on0Aclsobe8n8NjWbhc+
U07RAf2\n4TokneypMyCRLfKgY758ur1RY5CU3eh7/ltPndCxugO5Sbjxwx2p03BDhEU/LDwoYaLCjaXTUE+h\nG4s6itWPtPkepnzXyFvqja/LC2NFeegtDx7t1pLvwdbpwscM7IFrIHw69b8hkxCLYiRMtsK/iymj\nlDSQrVLqKDV9IV2oWwjSFVddXTbZylUPBA5C126vdpkrrrgyV7KmP8+zXIh4pw0kSBfkqxTRxCP8\nXtPvPIiXSfUQYAyR8YWxqJkllcSrmTssjW3nP+9qWVXqhUexg1oUAdIWsLj1r8SCha6VDNTUHSwD\n9DkhCghXtQUrGAQMkheFc7XrT3r5pfxzkPS2iH68S0rta6xO/GPNM8teW6GMeAmy1d9r65hY46pf\nazJrSE1wffDQmKQUAeJNCKhuBCFuRmKiiJ3KJ8Q2SVqdjnPEqsQR+6bLJZ4LKSUGRuennyTuS58v\n5nj+/AXuqCOPCqbT2MwtDPp62JDTi8lWUZqjj+zpWCi79wnfc4MHDXYXXTymovIaLTP9OXfu3IZp\nFvcmzwztKkUCspUNhA/+scgbk1VKuZa2eAQ07oG1q6jY4OJLT1/KDdOnsmmcNgR4UQYxOWlTO1Lf\nyy+/3E8NwUWC5mkbZExIj86Ui3iRBnIUlUfnL+WYsiB0bJWIkLZSdLv44rFu6JAh7pcB8Zl42QTX\npvUOlahQcl5I3u1B4P7vf/+4Y/HtuAltyQolIAMY8I/B9OnTGwYPSCRz4JUjBLQHoSc+a75g+HLK\ntjyFEQBzyBcSWBkLZ2j0FMk0xJlWjYQA7ivivHBFpVlwlwbvg5xbMHFq5thjj81cdtllmeuvvz4b\ncJ+rzeAShwsW1wtlxSmUV4xLJ5j2IRN8pZh5dMmyOKsvuyyC+HE9xoFr2UrUKSNtps/YpP245oqN\nRayT2kVXW2lbdIwRLkbEXI1Fw192Qu3qxd1okslsAAiNTi6tffVHoH///l6JNFu+sCLwHzeL9AaD\nQNbylQvdHXfc0VvEmEaiW7du2YWqsZSJYBVDyrFUoQ+WKaxu1RJcxFjBotyqZ519jluxYoWbPPna\nmlu58rV3+Jkj3Ly773Izb51Ztts1X/lJuRZ2C0f1ExZaLEVpd/WjP8+eWTOTcveZHpUgYMSrEvQs\nb9EIMEgwMLCPGsSLLqjOCSFIuBYhkoEFz82ePdstXLjQbx9//HFe7Tp06OAJmBAxEkPCGFRKJU/g\nyCAkrsG8FVd4EXJHn2lymFTSJU0V8rXssWWpvt+kPbLXBIr+0H0iafSee4Q04orW19J0zPPBxrNn\nYgikHQEjXmnvwRTpD1lhEEjryxNrHbpDejAUs/3zn//02z/+8Q/3+OOPu5/85Cf+NxOAFpJDDjnE\nTw6KNYyFsxlYtDUsV/4oIpQrbVznNdFjRvk5AeG8+ZZbEmXpCrcV8rVq1YtuxvRpqSVf9LW28nCP\nlCrcsxA2TdpKLaOe6dEbaxf3YJr/aasnhlZ3shAw4pWs/mhobXhx7rbbbt5SBAFLk4h1CdcNgwCk\nK5g01X322WcO0vXJJ5+4lStXOqxeuBi5FsTWuMWLF7uHH37Y/f3vfy/Y3J122skTu+7du3uCSoYw\nEWMQinIpFSw8hgRgQPtvvvkWN33Gja7rwV1iKLW6RRBsf+CBB7jzzh1V3YpiKh2MIVsicfQ1ZfK8\n4XIsh7iJLvXaozMbBNLEEGgEBIx4NUIvpqgNP/3pT/3Akrb/vsN6i7VLSNdHH33k5s2b51iTERIG\nIYN8QZxYSujDDz90v/vd79yjjz7qHnvssYI91qZNm/XckgzIWMfqJQzgB3U5yI0YOcr9aED0Uj/1\n0i1Xvc8+t8Kd0Ps4N278OE+Yc6Wr53n9LGDRqYb7GMLMJtbSera3lLpFb/5pMzEEGgUBI16N0pMp\naQeDNwMLRIYtDSKuDgYtLAeIEK9PP/3UQbruuecev
1js+++/739DvnBDIhAvFtJmYWzZL1++3JOw\nRx55xAeo+4QF/gwJpmxg3UIhX2FrWIHsFV8mrov+mzrl2orLqmUB102b4W6acUNggZydCFcVJEIT\niVpZoaiHZw8ykwbheUPntFrq0oCx6VgfBIx41Qf3pq6VF+q+++7rgk/eq/LffZzgipuGwYoYNRFN\nvJ5//nlv0dpmm23cunXrvFsRMoY1TKxeLKYN6YoiYZzHEgYJg8BhHSskxxxzjCdgkDCwRKpJxOiz\n7//39/18WR07tC+kXuKun3Tyqe6gg7q4YUOH1EU3bdWCvAuBr6UykD0hXvperqUOxdbFcwfpwq0/\n2lyMxcJm6VKCgBGvlHRUo6lJoDoWLwakarhW4sBLXv7oh75aNPG67777gjiiAx3Wrg8++MBvEC+s\nYZAv0rJBjNggYRAwTcI4FosYgfmQsGnTpvkydL1Rx5tuuqmTLyXzxYdF5S32HMRl7077uFEjRxSb\nJVHp5t033w0fNtTV6itHiCr3jwgkIgmC9QjSlXQrkkwdoQlrEvAzHQyBOBAw4hUHilZGWQgwADBA\n8XJN2tdKQrrQK+rlD5HCmrVgwQLXtWtX716EbEG82LPhbhTyRVrZNFiQsDARo4ybb77Z/fznP/dk\n7PXXX89axIqND8MlCQnDIibYlmsRE2sXSwHVelZ6jVWlx9W0enG/gJMIZF1wl3NJ2WO9HT58eGIt\nzkl+LySlD02PdCNgxCvd/Zd67eUli0UpKZYvIV2Am4sUQryYx4s4Lr5ihGBBtPiqkY1jTbwItpdN\npqCAiFGOlldffdVbzlje5LnnnvNuRIkLk32p8WEdO3bMxoaVEx9GbNcXN/6Su/jC0VrV1B2L1WvF\nyhWx6K4JOSQrKfdvvsbhbhSSmESLs7wPcj13+dpm1wyBtCBgxCstPdXAevKyxfXBy7begxe6sL5d\nr169vHsxn9UiWJrFffvb3/bkS6aVEAuX3ss12UO8cEHyW0gY+yeffNJbSZgVH+sUU1C88847bo89\n9ljPNSkkTMeH3X333UVNWyHxYVjEBO9c1jAGab5kZC3ENMZ2hR+bbt26uzPOGFbWF46QFjaRpLgP\nRZ9Ce9Fd4sv4ZwfyFY5fLFRONa4X889ONeq1Mg2BeiBgxKseqFud6yHAIDBgwAA3ceLEun3tSFzJ\nHXfc4XUjvgoSlksgiYcddpi/LJYrIVEQKr1BsjTZEtKl98E6co55vDbffHOfVtySDJbbbrutPx8V\nHwbx0iSMwHziwwjWv//++3Opnz1fKD4MQnz9tOnuzjvmZPOk+eDKSde4VwJMf3XpJQWbIZYhSQhh\nEdIi59KyD5Mu0VtivrjX6/W1I88S9UNk0SHfPzuit+0NgTQjYMQrzb3XYLoTIwP5YXCDiNVqkGOA\n5cUvpEtg7devn9eD39olKIMYk8Hq8xwLCWMvREz2YTLGb8gXcWK4AyFdQsZ02qefftqx3BBlaomK\nD9MkjGD9SuPDxowZ51pts21qg+o1XhzLvF653I3cg9wPSFrch17ZPH/kfs31PHGd5w6B+NTKkgfO\nfLHIs84+LdPL5IHaLhkCRSFgxKsomCxRrRDgZczL/4ILLnAQH17IuQaMSnWSumSw4cUPAXvllVey\nRe+zzz4OlyKDsJAs/kMnVkq75+RY0lCAJmEcazIGscKNiKWLpYOEhEXtOYcbEnImJE7Kpj6pmy8j\nJVAfAqa/lJQvJkuJD9tkk01clwMPcmPGjUvFLPXZTitwgLtx4sTLvJuVeyAtQfEFmhV5uRDp0pl4\n1ngWIGHVfO6oE7LF84arm+NqPeO6fXZsCCQFASNeSekJ06MFAgwYvPyJt4KAyUu6RaIyf1A2L3sG\nGV781CP/5TMQc/ynP/0pWzqzyLMYNiRMky5Ijrj/2AsBymYMDoSIsZcN8vTSSy+5tWvXuk6dOmXd\nkpzXFi85Zv/aa
6/5dLgd+U1aCBl7IXS6XtEL8iWbkC9tFZP5w6Liw2gv1jZpgy4/zccDB53m/rzy\ned/vjWLViuqPUkiX5Of+51mT545/RHge4hD5R4dnDxGS53/YH0OgiRAw4tVEnZ3GpgoBIxaF/4r5\nb5yBoNTBAKsGpImXPqQKMpdvUOH6jBkzspBBZIYOHer69u3r15sUMiOWJX7nIl/ZQoIDSAy6bLXV\nVu5rX/ua/y3ESfZCqMLWL6xV2223nSMuS5MyIWGSj3LYtAhJ1HpDxPgthCwcH/aNb3zD7blnO3fb\nbbfqolJ/PGbcBPfZp5+4//3f81LfllwNKId06bLknxMhSTx38uzpdIWOKYfnjucXVz4frUDsSn1+\nC9Vj1w2BNCFgxCtNvdXkuvLyZuNFjjuQ4HbIGBuC9YJNBh1xI4kriZe9DCCkyycQl+uuu84NHDiw\nRTLyX3HFFVnCsvHGGzs2IWBCcFpkUj/QHSsb9WtLEsfUKXvIlN6EhGGhYqoJ+R2155xsQsKiiJi4\nJSFf6K8tYZCxSZMmua232c5NvGyCakH6D5lW4saAVN9y843pb0xEC+T+l+ciIklJp+SZ497lnxYI\nOWXLF7EUxrE8Z7meO56/uHQqqQGW2BBIGAJGvBLWIaZOcQjolzvHCAMOx3pAkJd9KS98yA8bVqXf\n//73fsoIrVXbtm3djTfe6K1PX/rSl7wFir1YjiA0iHY9ir7ok0uoU0STMCFPELF3333Xzx/Wvn17\nT8y05SsXCRPXpBA5KZv6RMcwCYOMzZhxk+uwd6eGCawXbBuZePEMIKXc7z5DkX/kPoZk5XvueAbR\nQT+LRVZhyQyBhkfAiFfDd7E1sFQEICSQE5kUlfUTTz755PWK+fWvf+0OPvhgt9lmm7kvf/nLjmB0\nrF/iziMDxIbBMEwI1yss4oQQMfayQZ4Y9HbeeWf/FaRYtjgvJCyKgMk1IWGk0USM6qlD3KUQsZkz\nZ7lv7rd/wxGv1WvedDu2ae3bGwF7ak9Vm3SlFhhT3BBIGAKf/2ueMKVMHUOg3ggI6YGAQVBw8bVr\n166FWj/+8Y8dMTB/+9vf/GzzTHjKrPVCbiiDhcCRcv7z1yRIyBzE7oADDnAszE2sF6SPaSi22GIL\nt+WWW/rYMdyYrVq18u7MqD3xZWzkIS+kUSx2EC70ps20vRElzcse5eoPcfNVy9KVq147bwgYAqUj\nsFHpWSyHIdD4CAjpWbx4sZ86ggWwIVkXXnihwwImMmHCBLdq1Sp33nnneaKi3XgSjwX5iUPQCWHP\nrPPE3OC6lDplL5Ys2Yu1qxhLmKQlr9QXh+5WRvUQgHRBuArFLVZPAyvZEDAESkHAiFcpaFnapkEA\n0kEAP/FcEnjO/mc/+5lr3bq1D7wXMJhqgnm2cD3iAiQOa+XKle6oo47ycV8Qoqi4L8lfzh79mMAV\nHXcNBl2sVFjFECFg7NmEgLEX16TeC9kK7zf64hfLUS3xeZhEtd1eLa2XiVc6h4JGunIAY6cNgQQj\nYMQrwZ1jqtUHAbH0sGA1k5t+9NFH3hUn0zj06dPHEyzm/xKBAPXs2dONHz/eL/1z6KGH+nwQHx33\nBUGS8iVvuXsIFwMv8WPa2gEBEyLGXjaIVxQR43yYdPF7y8AV2YjyajAn2n6dD0h904x0pb4LrQFN\nioARrybt+DQ3W76swtUmx1HtgZiwEV8lX1lFpYs6R9m487AMQZyEtEBcIDIE1eN67B9MMKnl7LPP\ndmPGjPETo0oe0lMGErfli3ahKy5HLULu2FM/IoRM2hAmYeirSdjGG2/k/vLyS7rYhjjGbZx2MdKV\n9h40/ZsZASNezdz7KWo7X2wxnxBkR+YSEjKlLU+6SQxO5GOG7Iceesjtsssufh4vyBJ5cwl5IGyQ\nJMiKkCZIDJuc5xqTQp5zzjnuueeeyxY3atQoH/c1ZMiQrJsPkkM5MmkpiYUcZTO
WeUBbaGuuNul6\npA1SlSZhEDQhX+yZJ21asEB2o8mLL65y7UMfSqSpjUa60tRbpqshsD4CNp3E+pjYmQQhANFiY7CR\nyU+x7mjXWrHqykSQlEd+CBtlhssSC5JYioSM4H775JNPHF8vMsv7Bx984L9mZLmdZcuW+S8ftS6Q\nt9/+9rf+60G+PsRVKV8PQtritH5BFhHqLFWkneTTRIwljYhn09dLLTuJ6VkyqOvBXVz/kLUyibqG\ndTLSFUbEfhsC6UPAiFf6+qwpNIYksbQIkosgVQKEJnRYxGQQFtIlZQvpELccrkfIF3Ffb7zxhnvk\nkUf8TPIQsaVLl/qvHiWv7O+77z4fEybzfUG+sH5BvtgQbZWSfKXuw7qXml/SS5vZH3FED3fWyHPc\n0Uf2lMup37dv197NvHVmTgthUhtopCupPWN6GQKlIWDEqzS8LHWVEcByAwlikNGEqFrVQlaoD6uX\nrCEXZTWChLBh/YJ88dXim2++GaxluKe3fGH9YluzZo0bMGDAeupec8017lvf+lZ23iyZbJUvJbF8\nhV2A6xVQ5Im4yJdUN+KskW7jL23iLr5wtJxK9X7J0sfcD/v3cytWrkhVO4x0paq7TFlDIC8CNoFq\nXnjsYi0RwApFnBKbELBq14/bkrpwOUKY0CFKhBhhoXr22Wf9LPVdu3b1bkSZkJQJTHfccUcfi8ak\npFpOP/10v8ZjvslWxdKk85V6DGmkPXEI5fzj04/dQ4seiKO4RJRx9z3z3LHH9U6ELsUqAZmmX8Mu\n8WLzWzpDwBBIFgJm8UpWfzStNlidcC+yQYbqIVgVxPqFHlED3aJFizwxZNZ3rF+yrJDEffHFHJYv\nfl977bUtJlulTfvuu6+f74v8Ou4Ly5fEfVXqdizXOkI+vhIVYbBnwzV3/Q3TfVyUXEvrvlu37u6M\nM4Z5op2GNsRtwUxDm01HQ6DRETDi1eg9nPD2MdBjbWIP2WGgr6egB+QLaw+DnpAvzkNMIIVimZK4\nLwm6J+6LWC8hXxL3ddFFF63XpPnz5/v5vnTcl3zxGEfcVzEDdphoYWmU9mqFL754rHvn3bVu4mUT\n9OnUHc/67Wx37dWT3KJFC1OhezF9mIqGmJKGgCHQAgEjXi3gsB+1RAAyI9YtTXJqqUOuuiBfEBP0\nQk+28HQNOu4L8oX1S8iXfPEI+SLu64c//OF6VU2ePNkx0aqslyhxXxCvSskX+kIetc60RUsuoqXT\ncEw5zJK//NnnXccO7cOXU/P7pJNPdQd1OdANGzY08TrTV/JsJF5ZU9AQMARKQsCIV0lwWeI4EcDS\nxaDOIBNlaYmzrnLKgnxNnz7dL3StCYwuS8gX1i+C7iFfLJQN4RLyJUH3p512midmOv+IESPcKaec\n4skX1i8hX1i/Kg26l3g1sSJWMpATZP/ZZ/9MrdVr3n3z3fCAcC17bFki7zV9Txjp0mjYsSHQeAgY\n8Wq8Pk1Fi/iCkAGGLYmkCxBxffJlJYKeuURcjzLfl5CvqLivcePGuSVLlrQoihnyL730Uk++sH4x\n35dMtgr5EgLWIlPoBxYuLHRaIFroXQnhkvLSbvU6tldv1+OI7om3dsXVX9JvtjcEDIHkIWDEK3l9\n0vAaQWjElSfWmKQ2GkIDccE6x3xiuUTIl477wvIlrkcd93Xvvfe6SZMmtSiKWfVZZHunnXbyBAzy\nhfVLFuiGfCESeB8mWpDXXFa5uAbzK6+aFEwU+5i75eYbW+ie9B9XTrrGzbn9t4mP7Yqrn5LeH6af\nIdDsCBjxavY7oMbthzBAtnCDQWbSIBJUD2GEhOUTCBjki03HfeFu1Nvzzz/vfvazn61X1MyZM/06\njxJ0L65HiNszzzzj00O+8hGtcKFgjsUqFzELp8/1m3J69z7e9T7he27YkNNzJUvUeZm3a/KUyQX7\nrp6KG+mqJ/pWtyFQWwSMeNUW76avTcgWJCZ
NgsuRDQJTSHTcl5AvHfclBIygeyx/Ybn44os9+Xrn\nnXcc84Fh/dpuu+1c586ds25HsXyF8+b6DXmE8Fbq1qUcpsR4dMmyxE8vsXrNm27w4NNcjx5HuGFD\nh+SCpu7njXTVvQtMAUOgpggY8aop3M1dGQOMBNRXSgDqgSQWI4iSLGWUTwdxPUbFfQnxYk8QfniR\nbcrdb7/93HXXXZcNutdxX3zxCPEqlXzFNcDPnj3HjQoWBk/63F6syQjGSXaNxtUn+e5Fu2YIGALJ\nQsCIV7L6o6G1wU3Hli9WKskAlEocw+RLz/fFGo+vv/561v0Ytcg2cV+33357lnxh/apkke24XI70\n0Vlnn+NWrFjhJk++1rVpvUPium34mSPcqlUvuhnTp1Vs5atG4+gLcWFXo3wr0xAwBJKLgBGv5PZN\nQ2lWKmlJauMhjljtirF6SRsgYH/4wx/cu+++66ecYJHtdu3aOaaMwCJD/JZMtnrhhRdKtuw+zkW2\nxVWK27FSEfI1duzYRM3vlQbSFUfMXaX9Z/kNAUOgPghsWJ9qrdZiEWjbtm3WrTR+/PgW2cTdxH7B\nggUtruX7MXXq1GyZpbqr8pWb7xrxUZCVNLoYdbtog0wxoc+HjyGasj300ENu9913D2KNeriePXu6\no446yrVp0ya7ziOYsM7jIYcc4qZNmxYuyh155JG+rHXr1nmixpeSkDfmDcOVKTFl62WMOAHhEvIV\ncbmkU5eMH+vat2/vTuh9nCOIvd5CTBeTpCbd0mWkq953itVvCNQXgaYnXpAZITCQHJP4EcCtcscd\nd/j4qPhLr22J8nEApEqLkCzZi1tV9q1atfL3GfFZWLr4WpEvF5m3C9IlC20znQQfHkDMtLDINoRP\nFtnGQkbAPnOGlUq+0Cmsv66rlGPI15jA4nVI14Mc0zbUSyB+J590ktsqwDLJ7kUjXfW6Q6xeQyA5\nCGyUHFVMk0ZFACLRq1cvF4d7KwkYQVzCli/OFRKxLurlgDjHb70xZ9eUKVOC+KnJ7u67784WS7A9\nLkvm+5IpKySOjD1lhOf7ymYOHfChADFGlU4xQbHHH987mCNrkbvggl+6ZUuXunPPPbdmrkesXDdM\nv9Gde85ZfoqSfv36hVqajJ9xxtclo0WmhSFgCJSLgBGvcpGrUb5Vq1bVqKbqVVPM/FfVqz3ekqUt\nWIyKIVvh2iFaQpLE0ip7SJOQJ/ZYufi6Ucd9PfXUU27//fd3999/v9t5552zBEwH3ZOXOig3l4jL\nF0Igx7nSFnMeLCBxV199rdu749fdxWMvcf37nVrVwPvrps1wN824wbVus2PeZZ2K0b+aaYx0VRNd\nK9sQSB8CG6ZP5Xg0JiaKgWnkyJHZAl966SV/LsrliEtSx1uRl3PkCYt2XxLTg1CPDLCSXn6zRx82\n5mqSskmn66TcfHLbbbdl81MGZXGuHCmlvYXKh6SIi65Q2qRfpx39gyklZDAtR1/pd4gWM9OzPJC4\nHrfYYgtPhHA94oLs2rWru/7669er5jvf+Y4DV4n7YnkicTviekTEGrZe5n+fEKtXruulnofAnXvu\nOZ4ELX/madc9IGPn/mK0e/a5FaUWlTM9Fi4IV7du3T3pGjp0qJ8uIg7LXc5KK7gg90lS9augaZbV\nEDAEykSgaYlXsXhBrCAwEKcwyeIc15588sm8xZGmEGmCdEHSCpWVqyIIVp8+fVrkpyzOFapblxlH\ne3V5uLOQXWP4ik6Xi558JDBo0CCP29Zbb50ltkJsOAempCFtuP90eaUex0FaRE8sVGHyJTFf7CXu\ni+kktLDoNot4E/clc4IR98W0FcXGfWGpgsDFKWDD3Fkzb53pPv3kY28BY61EYsDKCcIXssXXipC5\nBxbMd/369fVLAOHmTKoY6Upqz5hehkB9ETBXYwH8w2QmnPy9997zgzsuQQKowwKhKkZKIUdR5UEs\ncgkEEfc
UX9UVkkrbGy4/7mBiyBPtKcaSR9+E8T/xxBMdC1XzlWElAmGBVFZqyYN8IZAvRMiYuB05\nzzEbywmFF9meMGGCJ9sssg3Z0rFfBPFLXqnHVxL6Aymmn+ImxxAwtnNHjfQfDEC6rrnqSl/7fp0P\ncHt32scf77FHW/+Fp6j1wgsvBETyQ/eXl19yL/x5ZUAMF7kfnHSKO/I7PdwZw34Su55Sb5x7I11x\nomllGQKNhcCGjdWc4lsDCcEVw0AmwmDMOYmrgsxoCxRpuc5G8LMIA3w+4kO58+fPz+aVfOH9wIED\ns2nOPvvs8OWCv3PpR8Z8+knBcbVXymMf5ySR4kothnRpHfRxHGVQHiRFrHm6/HKOhRRBsnA9Eq/F\nTPV89Yi7ERceli/ckKNGjXJDhrRc/mbhwoWB662be+WVV7zrkS8emXIC1yNTTkDG5L6N0o+2QLyq\nJejfP3DPTp1yrVuxcoW3hPU58QS3+Wabus8+/cTNnTPH3ThjRnZ77dVX3WZf3sQvSXT++f/rdceC\nRuA8uiZdwJIN0mliCBgChsB6CAQv5KaWgKxkAlD8FhCkFlgE1pHstYAUtbjGj1x59XnKDkjXenk5\nIfWyD4hgZBp0knSUq0XOsy+kH2nWrl3rswekMVsm50XKba/kj9qff/75GbZKJSDDmcCi2EJv3f5S\njymLMsuV4Cu+zGGHHVZu9pz5ApKUCSxXmYA0ZQIClQlIfWb16tWZwAqUCb5ozCxevDgzb968zGWX\nXRaJRfAlZGb58uWZYODPvP3225kgBiwTuB8zgfsxQ9lsuYQ2mVSGwMsvv5xhMzEEDAFDIBcCTWvx\nCgbqgqKtXVFuOtxWIrjAsHxFSVTecLpi0oTz6N9aFzkfLrNQjFNc7ZX62WMVisNKgTUujC+WRCyD\nASH1FkWsiuGNa6TB1aqlkJVSp63lsbgaJe4L6xexXVi7JO4LK1jHjh399AnhuK/Bgwf7OdNkvi+C\n7on7KmayVSw0cVnxaolZUurCyoXEcb/7guyPIWAINCQCFuOVp1s1USH2qZAwmIfjvCAHxUg4XzF5\ndJqoesJlhomLzs9xHO0Nl0msSxwDUThWC1cvrtlCosmnfMAgecJlyvl677XrEV0kTos9hExvxH0R\n8/bcc89l1WYeLUj0L37xixYxXwTwE/dFfkTqkYwyrQR9Jsdyzfb5ETDSlR8fu2oIGAL/QcAsXv/B\nwo4SjIC2xkG4iiFd4eZAwnQ+XWY4bb1/CymCJMmUE8R9MdO9WL4k7uuSSy5xp5xySguVZ8+e7QP/\nJe6Lrx5lpvt8cV9m9WoBY1E/jHQVBZMlMgQMgX8jYMQrz62grUg6OD7w22aDlfWxTp+n2KpciiIR\nYQtXIf309bjai+VEBqa4Gq71LLXMSvKWWlel6cXtiKVLky8JuhcChuvxpGC5HCxcWiBdkE1NvsLr\nPJKee1jL4YfHP8WELr+RjuXejsOq20i4WFsMAUMgNwLmasyNjY8LEvcbxEa7rfJkq8sl3GbhOC/t\nSsPtWIh0EAcVd3uxoMjgFBcwtKucrz6pX2MSlz7VLgcCBvkK77XrkeNDDz3UL7I9YMCAFiqxyPY1\n11zjvvWtb2WnnIBs4XpEyIuIlY1jiAT9xj5uIY6Mbd37H7jXXnvdvfHGGy2qIJ6tfft2Lpjj338Z\nCBFMosh9XQ2Mkthe08kQMATiQcAsXgrHsIVIEy3iaPRcWxAU4r7EKhE1270quuqHBJ9r/fiNziJh\nUibn9b5a7SVmqFLRukGeaFu4v/LVQVrw0cRLl5kvb9Q1iEMt46DkPsP1SJwWQfeyyDaWL3TB8iWT\nreZaZJuZ7t9//31XaJFtyAT9FkffUcYNN9zgBg46zbVv194NH36mu3/+A+6DDz9yrbbexp3at2+L\n7cAuB7mPPv7UvfLqG+6yiVf4Z8xPwHrVJE8Go/qj1ueMdNUacavPEGgcB
MzipfqSwZkBDssQc3kR\nD8RgLVYgBntNZlTWsi0wuoxKjyvVrxrtxeJ1+eWXV9o0b23UpIl+YcNKhzUvF4kiD/0a5YrNlacY\nZSETtK2Wwr2JpUoHx3OO39r6xe9ci2w/8MAD7vbbb89avpjjC5FytfWL9j0YzGpfrsWJvHffc6+7\ndMJ4PwHqQQcf7M4444ySF9Bm5vpHHl3ili5Z6o468ii3V/uvu+N793L9g7nB6iFGuuqButVpCDQQ\nAsELt6ll1qxZ682HFBCvLCbM9RQM7uulCW6B7Lnw/Fr8luu6rGyh/z6QNOyZWytKyC/pwvXIefaB\n6y2bTp/nOJwv1zxe1F9Oe6P0lnPMaRRYZORn2XvmICvUD+F25/tNWTKvWTlKMYfXnDlzyslacR6Z\njysIks988sknfr6vd999N/P6669nVq5cmQlIZiYgPZm77747E8R9Rd4XwSLbmeeffz7z6quvZt55\n551MYAWLnO8rIK2ZYGHuknRmPrBgpvlMu73aZYLFsjPLn32+pPz5Er+xek3m19dPzxx+eDe/3X77\n7HzJY79m83TFDqkVaAg0HQJN72rEBRcQk/WmgQgGbS9Yv5544gmfBuuKFixEBKGXG2+ky6r0GF2w\ncqCvCPoGxLIk/eJuLy4rpNL5obBq0b5wH/jCS/xDGZQVnm6jlGIeeuihmlu8RD9xO2KdIuge16Ne\nZFu7HotZZBvXoyyyHZ7vCxdmsR9IYAW86OIxbvCgwX45oIWBxWvUyBElW7iknVH7Nq13cD8a8Pk6\njaf07e+uuuoqhxuy0vsrqq7wOalD7unwdfttCBgChkAxCGwA1SwmoaUxBMpFgPUMcVf99Kc/LbeI\nFvlwMRLDJi7gFhfz/IBUQlArJcpz5871bRGXU54qq36Jx5cNlyGkiWWCmDaCGC42SBVTSUCs+PKR\nvZYf//jHbujQoX6aCiZjhcARPwahw2UpJK+Qy5HrF1zwS9e6zY6OecQ6dmivq6nq8ZhxE9y555zl\nrrjiSjds2NCq1AXpgnDVMq6vKg2xQg0BQ6DuCBjxqnsXNL4CBFYT5yUWg7haDPHSMVzEcmnBooV1\nS2LAtDVQpyv1WAhkHLFrpdYdlV7+d2KRbDbIV+CC9CQL0sUmVq1gSSF3zz33tCime/fujkW2mSOM\ngH3mC9PkC8saBCwX+Zo+fbobO2asO33oMDdsyOktyq7VDxbgxnLdunVrN37cmFgJkpGuWvWi1WMI\nNAcCRryao5/r2kpcUJCfID7GWw3qqkwMlWP1gITUOrg+n+pCvrB8Qb6CtRmz5AvLl5AvjpctW+Yu\nuuiiFsXhnvztb3/rv4qEgAn5wo2J9QvyhYUPArarmmLirLPPcXfOneOuv2G6X9S6RaE1/kEQ/ujR\nF7g333zTzZg+LRbyZaSrxp1o1RkCTYBA08d4NUEf172JEJV+/frF8nVjvRuD9Y72JIl0gYm4BCXu\nizm6cBtCophmQsd9HXLIIS5YZLsFlKzt2LNnT0fsGscQNSZbxXomcV8QLqyKEGkE0rVixQpHLFfX\ng7u0KK8eP4j/mjrlWte27R6ub78BWT3L1cVIV7nIWT5DwBDIh4BZvPKhY9diQwALEbFeWE3SHCcD\nwTn//PMDy8ro2LCJu6Bw3BduR4n7Etcj+zVr1rjTTjvNEyytw4gRI/wSROJ6hMDJOo8QO8jZvHvv\n96Rr8uRrHYQnaTL8zBHBlDAvlm35MtKVtB41fQyBxkHAiFfj9GXiW0KAPVuSSUs+ELF2oTskEgKp\nhXYlScT1qOO+IF8E12vyxW9io5YsWdJC/eOPP96dd9553mImrkchXzfddFMQRzXe3T5nbk2D6Fso\nWMQPJmylrbfcfGMRqf+TxEjXf7CwI0PAEIgfASNe8WNqJeZAQKxeMrDlSJbY07jaIF5RE3fSNi1J\ncEcK+ZIvHnEZQr5wIWryRdzXzJkzH
", + "text/plain": [ + "" + ] + }, + "execution_count": 88, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "Image(filename='sentiment_network_sparse.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 89, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "layer_0 = np.zeros(10)" + ] + }, + { + "cell_type": "code", + "execution_count": 90, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.])" + ] + }, + "execution_count": 90, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 91, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "layer_0[4] = 1\n", + "layer_0[9] = 1" + ] + }, + { + "cell_type": "code", + "execution_count": 92, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([ 0., 0., 0., 0., 1., 0., 0., 0., 0., 1.])" + ] + }, + "execution_count": 92, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0" + ] + }, + { + "cell_type": "code", + "execution_count": 93, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "weights_0_1 = np.random.randn(10,5)" + ] + }, + { + "cell_type": "code", + "execution_count": 94, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([-0.10503756, 0.44222989, 0.24392938, -0.55961832, 0.21389503])" + ] + }, + "execution_count": 94, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_0.dot(weights_0_1)" + ] + }, + { + "cell_type": "code", + "execution_count": 101, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "indices = [4,9]" + ] + }, + { + "cell_type": "code", +
"execution_count": 102, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "layer_1 = np.zeros(5)" + ] + }, + { + "cell_type": "code", + "execution_count": 103, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "for index in indices:\n", + " layer_1 += (weights_0_1[index])" + ] + }, + { + "cell_type": "code", + "execution_count": 104, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([-0.10503756, 0.44222989, 0.24392938, -0.55961832, 0.21389503])" + ] + }, + "execution_count": 104, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "layer_1" + ] + }, + { + "cell_type": "code", + "execution_count": 100, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgo
QoXgPMjaVJJYu/Kn9MPgghuBULO1wTB8r8uRQ+U0xF\nEo1n40Ei8nHJTcwZzo7OHFQAE/NczZo1yy/MdcXs5lh50BkAziSjmk8Lyw0QlRTCiTyjZis/V1Xk\nF+Tns2IWdObZAlqYwwqYAq7ImxnUQx1GfgB+FneAGKsOwAW0Ms8WoEOagq5QFuIwb5WHkAhyWjtg\n7QF6ACsAy0JzDRQD5M2vKLyXhveOb0mp9xrrCODAM16MVaKwFio7G/5Z43sU/sHmfZOcyErcthC4\nP/r+oJM0hW8gTCScBdNAURo4MhpUkHb2U045paj4xUaibTr0yYn+xTa7NPpwNPPpiV6kZufL3YmA\nxftD4LeTL3Duxz/+sT9Nt/O99trL97DCCRnHYHpCEfCr2GSTTbyfDj4v+MRo3iucEHHGVFdy/HeI\nIz8dn8BXP+hW81vhAK35tvC/YVGXdhyQI2DxSwRQ3jkZmfDPiaw8jrm0CMh0xRVXuA022MD78mhK\nCvkM4adDkCwPP/yIO/mkk9zue+7thg073bVvt54/X4uf8RMnu8kTxrvDI/8fpqSw8KUGeLbwl4qa\n/Kqqknq9d5Sl3A4P+MHIjySCtlYdm6aQskM5CsXjXGTh+JoTb0vXZPF8qJO0ldlgJ4tPVB1l5mPL\nnEuF4KCO4pWUNaBDeXBO1jxSSQnwUX7ppZcc4BFZbzxw0KsJ2AAy9thjD/fGG2/4S4EKIAInZXo6\nAThM7Ans0HUf2AGCAAzBhfLEYZN5pzSEPnNjSS7+k0SWF583Ts9AjWCH60LY4TyBiUyZ5oKATPTm\nYpRlAZfkALrkkDxmzFg3c8Z0d8GYi92AA/bz19b6Z+Fri3w3f/KdMf3LiVVrLUOa8uP+8pyeeuqp\n3vGzGvNk1bt8+oZQrnICPYNw1OVPT2RRKSeJqlxDD6rQ6Tsp0ai5zcNO0rlGOsYfVLrms44sWX5S\n6zSV78tuIGmSyGRJtQaojPlXxpL1QC8XelMBO1FTk1/iZeIfNaADtOjDLDgAaNgOe2hFPgi+FxRg\nwiLDqa5hDeTouPIDHpEH0CGvpIk8SQ+rDaDFmgVLD4E0kQeLEWDDQs8n5tEiAEDsR/49/npZiSQj\n8tBFfMFvfuNuuHF63UAHWbt07ujuuXuO7+XVv/9+DQHWlKuUwPPw+FfPJO8a1r6o2ccfKyWdtMYF\ndsJJc0uVE6sBActUUo+nUtMrN37UXOVBJqlHE72xdL7c9LN0nXqYYYWnR2jagll20nZHMiAPTVkE\nVf5+J4M/yK9/mBKfCkaBSmbw4MF+F4sOH2dZWAAOrCtYVKJ2at8tXGBx0EEH+RnI6dLNiy/LDtYd\nxsxRF29ZdrAwoVOapKjQ2MeaJCACSIAT4AZo0fg9WHZYkENj6AiAuAawAn4YA+iwww5TsdwJJ5zg\nLSfIhyzEOfvscxxdxK+Zck1Nm61yQuXZOPW0YW7uffe62bfMLqmbcp7kUnuYZ41Fgfsft+DwrPJs\nhM+o4mdpjfy8S1isLJgGaqUBg51aabqB8uGjzMeYdfyDnKViYtHBciN4i8vOeWY0x9JCJcM+MCK/\nGSCDwfuAnchp2PvFRL2yfDLnn3++N7HjH4OO1IyFD0/YfEQ8FsJWW23lKzLiC3RkgQGuAB0NXkje\nbOO/w3HicQ2LIMonGv2wj5xnnnlmzuTPeDz8O15vvfUiHVzgyzll6pRUgY7kF/DMXzA/08+byqN1\nCC08WyyFAnBAHKw+LcUtlE69z2HBZOHds2AaqJUGDHZqpekGywdA4IOb1Q8WVh1kB9iSAueAEEBH\nUBf1UHIsgAWAoaHj5STMKMWHHnqoBxCsJTfddJMfsA/gIR3W+MvgywOsnHHGGblZ06NuuA6ZCIIW\nWXTIC6gR6GDFiUMOTVha1FSm67H2sE1g1OU77rjDb2PVOfron7mFry50s2b/KpWg4wWNfgCet976\nY6Z9eI
CU0JpBhV9q4LkEkkJQKjWNesZHbjWFZ/mPUj11aHmXpwGDnfL01uavAgDo5UPln7V/mVQ4\nWKby+Q1QKan3Vdh8BYSETUm/ifxboq7kHlwAkU6dOrloHB134okn+udjxx13zDUX0XwF6GDZica3\ncQcffLBvNiLiDTfckGsuE+gAVOSFRYe046DDcVlxNCKymqTUuwrAIZ7AiG2OUeFEXem9jDh4zrxp\nluvRvavfT/NPv336u+222zYzvbR4zniWFJKapnSu2LWsO1gay4GlYvNprXjIzAK0WTAN1FIDBju1\n1HaD5YXTJB9zKs8shZbkplJS7ysqFQJgIYsO4IFlBksOzUNYWrC+0DuEeFhOosHb/HW//OUv3dZb\nb+0dhrHoROPi5LqgAifRyMq5aUq4ILTGkGYIOmwDLkALQT45NIux4IODRSmEnTANrmefcnDf6J4+\nZuxl7ujBA316af+hl9b+/fd1l1x6SUXOra1ZzvBdwHLBs1TtAKTL1yxL1hHJzR8lC6aBWmvAYKfW\nGm+g/GQhAR5YshCojDCjU9knWaSSmq8AGCAES0sIOnIOlpWFZiRAAwih59OyZcu8SugyTKUXDern\nrrnmGn+sXbt27sEHH3Q/+MEPfPMT14SgA9SwyBlZQAWoEMiLHleCHNbAEwvnCMgN3ITpCJguvHCM\n22DDDd3UKc3n+vIXpvjn+mkz3E0zboyGALgzFf47VNwsClgtahHIh2cKgMhC4H1D5qxapLKgY5Ox\nsAYMdgrrx862oAE+YjT5RCMEt8q/2BayL+m0mgCoII78qkdZmIDKwrFCzVdADlYd1sAEkALkABoA\nCNYV8sIJmLD22mt7B2biKQBdW2yxRa43FE7EnAecSFOQA5ywcFygQ14h6GDREewItkiP+Cxx4Inm\n+HKjR5/v7rt/ru/mLZmysmZW9W7durohJ59UF5G5dwoAM0utA4Al2El6lmstT6H8eBcAHf5kjLbm\nq0KqsnOtqAGDnVZUbltJGsdaLDtUAq1htq+GHvXBRT7kTQqcK7b5CtgBQoAJLCmADs1UgAfAAWww\nxgazlYeB8ThGjBjhormtPMBwDeBCZSAwSQIdrDSkCUgRn3y0sM/COSxELITQIgUsYeFB5p///Fi3\n48493dnDh4WiZWZ77oMPu1OHnOxq1TsLCOb5UeBepSFgJQF00m4tUTfzxwNITIP+TIa2pQGDnbZ1\nv1uttHx0qRT4oKXNj0Cgg1z5Prj844z3vgphAUiQn46ar2jWAkAADaAFJ2QABOgg4J/D6MoK++23\nn4dC4qv5ifjs47tDfqTJObq4AydACgGAAaLC62TN4frQooNMBNIjyMJDWgsWLHDHHXe8e+LJJ1Pd\n+8oLXuCnNa07PC88ywpAcNqeacmGlZJm0rRaVtP8XZAObd02NGCw0zbuc01KqQ8blpO0WHgEOijg\n8TwgVm7zFTABZAAs9LRiAUCAHXRw8sknf03vjJAsQAqbvYhIMxZNTn/961/da6+95iGF4+uvv77b\naKONcqAj4AFyWLAsyU9HoMN1BAEPaQM9Z5xxpltlte+6MReO9uez+iPrzqI3FlWlCDwbCoBNWp5f\nyZS0pilLYJZGy6q+B/neu6Qy2THTQGtpwGCntTTbRtPlA4dZnQ9cvSsMKgNM6BtHPhXAR75/58hZ\nbvMV4KFu5Vh32B82bFhuCokddtjBD+bXr18//0Qwl87ZZ5/tgQdrDWAEqAApgImsMKw5xjkGLGQR\nHDHvDH5AgFYh0AkfQdJm8MPte2zv7phzVyZ9dcLysN2rV283dOiQsnpm8WywKKSlaUrytLSW7Dzb\nBJ5vgCefP5qPVKOfYv5g1EgUy8Y0kNPAf0Xm+9G5PdswDVSoAeCCUXl33313DxfdunWrMMXyLuej\njwyMZ0NFAIQkBR5/Jshk0D8AjXiAgSwhWFrUhIUvjfx0ABWsKlh15KvD
OaAG52bC8ccf7/Ned911\nHb2vGF2ZuXyooNinmYqFPOREDOSQt6w/yLPOOuu4zTff3PfcYk1FRzrvv/++n64CfcctOvGycp7e\nX0uWfOCGDqmPY29cpkr3P/v8C/fyy6+4n+7at8WkqIBxzEZ3LLLecC9YshSQnxDKDbB37NgxaqI8\nzo/9tNtuu/k4tf7hHSJv3vvZs2fn/YNRa7ksP9OAWXbsGWgVDdA0FFpVwg9zq2T4VaJUaliX+OgC\nMvxjz2dhqmbzFbOeU9GwZqRkYOuII47wlhogCT+fAQMGuGeffdZLOnPmTG+pUTMTVhoWWW+AHBaB\nlPaxBMmiA8AwenPoX4Ke8+maiT5XX2PNzDomx58bjbuTrykLvfA8EAQ38TSytp8EOmEZOM97R+AZ\n5PmvRUDPvG/8sWCdlaEoaqEbyyMdGviy20Y6ZDEpGkgDAAaVDR9bRloGQPShbo1i6mOrip68+OCy\nH8JAmDcyEfbZZ59cBcE+lhUchWVtwWLDgoMv52TVEYDcf//9btddd/Wgg28NfjlMIEo8FpqaAJSr\nrrqK5H0YOnSoTxMIYqF3F1Ye0ie+eneFjs8cC5u9gB0qcXSshcQBPS2q7Dn+wvO/cT133onNhgjM\njt6uffvc/aWsKjdr7r30kg94s6QIvT+UK1/gHM87wMPCM67r8l1T6XEAR/mSt4FOpRq161tDA2bZ\naQ2tWprNNMDHln9706dPd8wBxcewWpUPafOx5V8saZIPFVwYqAQFXjpOvEp7X+GQfPnll+f8c7bc\ncksPOsx0jsWGRdBETy6sMPS6Ouqoo7wYvXv3dgcccIDfpkkM3x+sQlzPmqklOAZUyZoDCBFaarby\nkaIfyi3g6dWrl5dJ5xphfcyxx7s333jd3/dGsd4k3RcBSyHQiV/Hfedd03sH+MTfjfg1xe6TNu8c\n7x6BbVmU/AH7MQ2kTAPLpUweE6cBNcAHmo8i82gR+OAKTPgHXmqgAhfcYDWiIpBTdNLHXNYP8hL4\naKZx5OK84ASfGSw4surIr4bjBGADMMHSQ7PV1Vd/OQLx4Ycf7p2cQ9DBSiMrEdBDGvhV7LXXXj6t\nefPmeWsQcdScJWsQFhxZcgAdwQ4XFgs6xEXP0skhhx7OoYYKG2+yqevdexdfxmoBdNoUVA7oUAae\na713vIPADmsAqJz3DjlID6jhOec9ZJ/jBjppe2pMnrgGzLIT14jt10QDwIkA5d1333U777yz/xDz\nMSbwoWbhQ0oQpPCBJVCB84FlIV6xgeu5FisLzVfIQAA2gBEgB5DRmDrxwQOxsvzlL39xwA29mwjX\nXnutHzwQCAmhCcABnOSzwzxan332mV+oeOhiTmAKCVl2KMuaa66ZmyUdyw7QIwjyF5TxAxy++94H\nbtwVl5dxdXovoQv6zBkz3KybZ6ZXyAok0/Ov96KCpPyleueAHXogbrXVVv69C0GRbb1n+d473qFq\nyVRpmex600AxGjDYKUZLFqdVNRB+UNkm8JFnO/wI6wNbyUdWzVfk8cknn3hQAlBkgQlBJ2nwQGY6\nHzJkCJd7AGG+q2222cbvAzukAzSp+Yr0gB3giUU+OoAOwEPo0qWLGzlypO/ZRdMVwEPvMDVjAUJY\ndki/FKuOT/yrn/ETJrrPv/hHwzgnq2yNDDvVBh3pTOti3zveQd658F1UGrY2DWRFAwY7WblTJmfF\nGuDfKvN4EaZNm+Y/4FiUgB3Biaww4dxXnAc26KI+fvx4fz0jHD/wwAO+S7ggBMgR6Kj5i/QEO+pm\njrWH/GjGuuKKK3x6jM2DLHRlx5oj2NHYPaFjsr+gxJ9xV17l/vHPfzcc7Cxd9qFbv327XDNgiWpJ\nbfTWBp3UFtwEMw20kgaWa6V0LVnTQOo0IEsKzVdsYynCnE9zlGAHIMEawxL2vjrmmGNyoIPPDV3I\nv//97+csLQKdeDOYrENKC58fjbjM
mDw4KRNwdAaKCFiHJAfpIRvph749PqL9ZHrKi3y3T01IWFMs\nmAZMA9XRgMFOdfRoqaRcAzRf4aOAxQSnSgIWm5122skP0PfWW295wAA4AB0gA7jAx2bHHXd0Cxcu\n9NcwieesWbPcWmut5UEnbAIToKipSk1XHAdW8LtRl3LkwMkTuThGOOGEE7wDNHGBI0EX17PPcfJj\nsdCYGgB0gBwDnca8v1aq+mnAYKd+ureca6QBKpBCva86d+7sYYLJFAELwQnXaZoHRJ08ebKf+oEm\nJcBFoCNrjpqrBDnsc454xOc6HJzpsg7ssGy44YaOAQYJOD5PnDjRx+c65AjhCwsPAEYw4PFqcAws\n2OEHHb7cyfivQKcUh/uMF9nENw3UTAMGOzVTtWVULw2EzVdhF1nAQc1XTMnAfFPPPPOMB58bbrgh\nmndpqBcZB+F77rnHj4As3xlOcH0IJVh05OujZjDiqbs6/jeADj45Wtjffvvtc5OG3n777b4nDFYc\npU1asu6oSYt0Swkan6eUa7IQ970lS9zW22ybBVELymigU1A9dtI0ULEGlq84BUvANFCBBtQjBN8Z\nbSclJ9M+PULUOyQpXvxYvuYrQEXNRbKgADIMDMjknbKcbLrppg4AYeZxzuOozDmuBTy4FhjBAsPC\nPpDCeQKQQTMVFh18dVjYJi0cm0mLOMOHD3d33HGHW7p0qe/tBVzRzEV6nNeChUgO0dqOlzlp///9\n3/+6dxa/nXQq08fozp/1YKCT9Tto8mdBAwY7WbhLDSYjPU0Y7+PGyHdGY30IYICTpECFwHWMF8N0\nDPSGwkpzZORonK9LLNcUar7CD0bWE6DiT3/6k9tjjz1yoIMDMlNBYH0R6CBbCDqCHPnXyBGZeFwT\nBx32sRQJXoAd4AX4onfXD3/4Qy51F154ofvlL3+Z891RfK1D0OH6lgI6mvfYEy1Fy9z5P/7xLdex\nQ3absQx0MvfImcAZ1YB1Pc/ojcui2MANCx94DQjYs2fPkgYFVLk1OBrp4eMAJMUHGKSCB6aKGTyQ\naRx+/vOfK3l34okn+vSw3jCODtYYQAM4AWiAI4EOa6CJ44IXgU5ozZFFJxwNmQzVlEY69913n59X\ni+Onn366l534XEvTF+BFcxjQRB7IVAzsYDXT6M6k3SiB6SJ6dO/qoTdrZTLQydodM3mzrAGDnSzf\nvYzIDphocsAkKKm0GAAPC5UHlp8jI2sP+RQ799XNN9+cswAhCyMa9+jRw8MFIPLmm286oIwQBx35\n0xCPAHzEQQfgwZoji46sMkAKcCTfIQAK52ag69e//rVPj+YsYI5rSUc+P2wDPICQ0vMXFPjp1au3\nO3P4CLf7T/sWiJWtUx07dHSzb5md17qX1tIY6KT1zphcjaoBg51GvbMpKBfNToBHCCGtKRZ+P+QH\nHGDRIcyZM8dbaIAKltCKgkPxGWec4X1lJBeWlXbt2uWsKEAG8ILlp1u3bs0sOoBO3D9HUAKMYI1h\nEZSoCYq8QmsMctE0BkiRJsCz+eabe8sReT/66KM+PunIyZm1IArgIb0wTZVHayw7hxxymHfmHXPh\naB3O9PrZ5xa4o44c5Ba9sShT5TDQydTtMmEbRAPWG6tBbmTaioGlhWYkFkFPa8uI9YW81OOKeX+0\nTd6yoAAo+OccfPDBOdDZOBrb5KmnnnL0ygIqWAQnXLfddtv5EY+XLVvmp3ygyQlLjByRgRLABggJ\nF45xLmy6SoISrDPEAZa4Zvbs2V5dABCjNgNEoVWJvCkH8IZ8BOIoADeyqHEPaMJ64IH73ROPPaoo\nmV/fd/9c12/f/pkqB0DOs2bdyzN120zYBtCAWXYa4CamrQhYV6hoWdT8U2sZ+fcsKw/WnVVXXdWD\nAZaT1157ze2yyy7egoJcgwcP9ousMvjGCFKAEIACuOBa0gWEGFQQuABcgBmOcQ3WFjUxkZ4gpyXL\n
C2mFMAZMXXTRRW7ChAledQAP0CKokv+OLEjvv/++e+WVV7zzNhWqLFuh3oE/xvK57fY7vZ9LeC6L\n2zTLDR06pBnQprkc3Jd6vQ9p1ovJZhqohQYMdmqh5TaSB9YELCmyKvAPtp4BOQCedyJrzyOPPOIt\nLnPnznUHHHBATqwLLrjAz0mFFQdwkFUGqBDoYEEBdFiAniXR2C50ee4Q9QISfAhyAB7Ah+OAjvxp\nkqw5OSG+2gB41NOLvAAeoAw4I4T+O3/+85/dCy+84J5//nm3YMGC3AzsXyXlV8ANlasWrAkXXXSx\n++jjTzI/+/mtEbBdPWmie+yxeWGRU7ttoJPaW2OCtRENGOy0kRvd2sUELKhUCXzY02SmHzRokLfI\nHHjgge7cc8/1MvLzq1/9ym2wwQbeOiOrDtDCNpCCpUWgA3iwrWYr9hcvXuwHBJR1JQQdNYGRTzGg\nQzw1Q5GH8mXcHcb+UWDcn7ffTh4vB6fqPn36+MlOe/Xq5e+B0mTNgsz4A7268HXXpXNHJZu59aGH\nHeG6dd0uGpPo5NTLbqCT+ltkArYBDRjstIGbXIsiYtHBgpI20KGCB1qwsigABWPGjPGWHM4BJnFQ\nEehgyWHBX0agE/rWvPzyy27rrbfO+fqo2QpYIhQLOpJNUILV5qGHHnJYoph0NCnQNLfnnnvmFixK\nyp/4SouyaKEMZww7K7JgrZRZ687cBx92p0aQM3/B/FRBddI9MtBJ0oodMw3UXgMGO7XXecPlSLdy\nPuosabLooGgqfEYmxqqjQFdyLDMADEFNToACkMI1nMO6ItDhGOBC3NAKhFXnjTfe8D48DELI9Syl\nQg6+QNLhY4895icglbzx9VFHHeWb55ADSMN/J+ydRd4syAzcaAF42MYyxBQVWbXu9Nunv9ulT+/U\nW3W4nz2/snbG76HtmwZMA7XVgMFObfXdcLnhhHzkV93L6+2jk6RcVfgjR4507du3d1OnTnV9+/Z1\nxx57rHc8Fhhg3QkBAcgBdgREAAwwBFzE/XOAjg8++MB9+umnvgmJdFoKAhvWgA7XhgGrDbOtAyXd\nu3f3TU9hd/SHH37YRxfwADuyTgm2KLsAB8jRNutJk652yz780N0ye1aYbeq3x0+c7ObccXvqfXUM\ndFL/KJmAbUwDBjtt7IZXs7j46QA4N0bdzMMu3tXMo9K0VOFrfJ3f/va3fibzKVOm+JGRqfiJIygi\nHoDDAiAQAKHQEVnAA2iwyD8HfdALKunfPJWfFqa7iAemv6C3Fdey4FwsOAG6sES9+uqrrnfv3v5S\n1synBVgJwgQ7yCrgiUMO+ywfffSRO+vMs9ygo452Q046IS5OKvc1rs41U67xOkqlkJFQ3GfuoQXT\ngGkgPRow2EnPvcicJAIcrDtpDQIZIEYgQzduYIcZzsPjQIXiABoEQCberRyoAHLkH0Mcgiw66APw\nwWLDkg9uqBC1AI3IqiD4AkyQi95ZDDaI3JdccomPdvHFF7tOnTr5fJFRcqpZTvIImkiLdAV4L774\nop9t/Zln56e+K/rSZR+64447Puqd1scNOfkkqSl1awOd1N0SE8g04DVgsGMPQlka4KMup+S0+enE\nCxSCg2CG5iH8eAYOHJjrUi7YEegADQIINV2xH4IOFhTABqBBJyxJY9xguRHYsE6CG0GI4ERr+Q8B\nO59//rkbMGCA+8Mf/uCLCfzgswN4ydIEjMm6g3yCKK2BIC1z5z7gbr75JjfzplmpBh7mwKLss26e\nGb+9qdnn3nNvLZgGTAPp04DBTvruSSYk4qPOMjrPLOVpKgSVvIAHgABqcAI+4ogjPKQABlhOgApg\nCBAAHgAbQQ4AIYhgAD9GWwZqgJwkuKEZCqChaQoHboBQsIFuQrAJZRPgaI08su7gR0RzFqM47733\n3l7FG220kRs1apRPGwsTwCNZ41DGecrGOlxGjb7A/TNKd+q1U1
37duul6dZ5WU49bZh7660/uhnT\np6XOAR4BZcUz0Endo2MCmQZyGjDYyanCNorVAP9gs2LVoUyCDEGFLCV77LGHH5fmoIMOyvW6EgwA\nCgIdppag+zcLkIMzcjwkDeBHfqoId9ppJy+HIAeAEdDE15yLL7JIAWUsQBYTnRL22msvt9tuuzVr\nctPgiIAPZVHTFpCDtSeEHbbPG3W++0s0UOGll12WqvF3sgA670RDLgC1FkwDpoH0asBgp4b3ZrPN\nNssNCIffxVlnnZXLnUpWgZ42jJxbTKB3ET2LFFSxa7811oAOH/csWHXC8oewg5WEaSQYZPDBBx/0\nFh2gAxBYuHChW7RokZs/f75fGC05HnbeeWfHP3kWdBFabshHC2myACddunRxq6yyigcZAU4IPSHg\nJJ0X8CA7wDNp0iQvO7JRDnqbqdlNs6PTxAW0ATsCnhB2wu1zzjnPPfLwQ+6GG6fXvUkLH50zzhjm\nm67SbNEx0Im/GbZvGkinBv4z0lo65StJqksvvdT3UOEiRpp96623SrreIresASwVd999t7vyyitb\njpzCGEClKniags477zx3Y9SbDH8QmrYYMycp7LDDDr4nFHDD6MQEgaUgirXgJg4rDDzIAIRACFAS\nnhfk6Fi4ZjsEJ/JV76uTTz7ZYWUDfi688EI3bdo0b7EJHacBHCw7svCE52TdQR/o5corr4jmM7vb\nbd+jm7tqwqS69dKi19XIs0e4jh07ucmTJqS26cpAh6fRgmkgGxpoKNjJhsqzLSU9jfbZZx+3ceSP\nksVApc6CtebQQw91+N/84he/+FpRgJpdd93Vd09nCgauUQgBBFAR5AhaWAtYwu1NNtnEvffee45B\nDddbb71cUxVxw0Vww5oAjAjQJD/xaY7Dssd0GITrrrsumhhzaM45WU1W8j/C6sOitAQ5SpM0Djhg\nfw99559/gZv/3HOO8YlqNa0E1pwbp890I0ec6QFU5UKuNAWA30AnTXfEZDENtKwBg52WdVS1GI1g\naQJ2AIEsBip1AIJKfo011nBPP/10rhhYbrDYMH7NNttsk+tWTlzBjNYhmLQEOCHssL3aaqt5wKLb\nNyMukxbpshAEHuSrRdCitYTm2i222MJbp5jQlK70DERIWYhLWqTBNgsWHsCHhePKT+lp3TO6vzTN\nMfDgFl06uTFjL3NHDjqiVZ2Xr582w90040bXrv36Dt2k1QfGQEdPia1NA9nSwJdfvGzJ/DVpmdGa\nDzuDrCkwJD7H8JOJB5q7OK6KhTXHkiZY5LjiMfIuQYO5cVxBcVgjDwuVJvukQQjz1DFdH1/fdttt\nuetJg7Q4Vk4grzBvyZRU3pbSp9lE4+u0FDeN5yk7C5X/zTff7AGB0Yrpwo0PVdeuXT0MEIcgZ2b5\nydAbii7gX3zxhW/6Yt3SQhMZcbiO6wGttdZay89YDuRIHpqc1ANMDsb43IQLzWDIywI44St0yCGH\nuG7dunl58QVDVsEOQCTgYjsMKmN4TNukO3LkCA8er77ysusdAdDIc0e7ha8tUpSK11hygJxevXp7\n0KFZjq7lBjoVq9YSMA2YBmIaaFOWHSp3xihhFN14AGCAApyDf/KTn8RP5/aBjqTrcxGiDUCnJZgJ\n48e3gRqaJ8JAnsged2wO48S3q1HeME0GyCNsnNEmLJVFVo3999/fH6Jyfe2117ylRVYWwIAu6oIF\ndQEXOLAutM050uJ6pRnmD6hst912ftBBBgZkXxYYWWO01nGtBUeki1wskydP9hOSksdxxx3nbr/9\ndp835ygHACR/HdIV6Ggt2eJrdAOAcO9vnjXbW3r27rev2yUC/22i96RH967xSwruAzhzH3jIvfrK\nK27uffe6rbfZNmqGG+iOjKYcSXMwi06a747JZhpoWQNtCnbygY7U9Mknn/h5k2huWn311XU4twZi\nigmVgA7px0EnzBMoA8aK6a
1VaXnDfNlOg58C1jXdByr7UgOVO9cJeLie5iumYsAXiXMCGaw6LAIK\n1oUAJwQbyUZ+LOQXwou26ZKOUzTxGTMHoNE5wY32WWtbkCJZN9hgA9+7rH///u4vf/mLmzlzph8w\nEdDRNUpPMrFfbAB6WEaePTxyYr7L4UQ8ecJ4fznAssWWP/Tb3//+Zr7HmdJl8MPPP//CvbP4bfeH\nN99wjz/+mDvk0MPdT3fdxQ0dcqLbOAPgbKCju2lr00B2NdAQzVhU/FQWGkaf20FvLI7JTwaACC0y\nxOU8C00YCgBPIdggXaw/ulbXxdfHHHNMLk7YxTweL99+PvmIX0g+pVet8io91vy77xk1Z6QlcK/K\nCarstWZ0Y/xECAALi6whdPFWsxVrbYfNUsQBimTNIV2sKGGzVNgUFd/GWkj38MWLF/smq7DbuJqz\nOB8OFqhu5BpDh3NMGMpAiQSclblfyERZkFGLZJW8/oIif2jewgozdcrVbtEbi9zsW2a7AQfu71Ze\n6dvuf//9L3dX1J1/5owZuWVJ5JC90ndW9BagUaPO8+8EliKcj7MAOgB+GiC/yNtj0UwDpoE8Gmgz\nlh1ZA9ADIBICCPtUnPL5ARTC86HuAKOWrCqcDwEqvL6Y7Zbko5kLeZOsT0q/WuVVemlcA68t3Yti\n5KbS5d+7QAcYECDQ/MO2LDyh9Ya0ARtZXLSWBYV1aFXJt008mrLoIfbCCy94oCSu0matEN8Gurke\nuMLfB0h+9NFH3dKlS92QIUPck08+6S1T8uMJm7LUM0vlUB6lrGXxKeWarMQFcgiU0YJpwDSQbQ00\nhGWnmFsQWnWSKkjmSVLA1yWf1SDpWl2ndTFxFDdpHcqi8/E0W3IurlZ5lT9rLAWAQVpCWMZqyAQ4\nYO2guUrAA+gIdmQJATiABqwqoUMxFpvQKhO34IT7ioflRlabddZZxzejMlIzTs3kIeghzxB0wvIS\nJ5Rn9uzZudOnn366t6ZQJoAH644ATs1yLVkpc4m1oQ2BTpqe9zakfiuqaaDqGmgzsBPCAb4sqjy0\njvfaSoIdmrCKCYUsLsVcn5RPPM0k+cK0q1HeMD22sX6k6eOPb1S1gSdeZp4PLCdqkqK5CDgBUgQ3\ngIvgJQSa+LbicL0AB1iKdwnHh+jdd991qnDjMoX7en5l3SGtDh06uHHjxvlozz//vB+9Wc1Z9AaL\nAw/WKgv/0YD0nqZn/T/S2ZZpwDRQjgbaDOyUoxy7pnU0AKSoki51HTbPAXw4LNP8WA3oQRY1NSXB\nTTmAA/DIeiOwAUiAE5bQchNqW00nWNNaCtIhacnaxHxfe+65p7+UqSQAVSw5WKlC4JF1R81zLeXV\n6OcNdBr9Dlv52qoG2gzshNaS0MFYJvz4Ooxf64cjqeKOW3Jaki88X63y4pyqyqDWOsmXH3oBnuRv\nlS9eMccFO2oSiltxZMmJW2wENFrH4QZwaglukuTDssDyeDS2UUsB2ZUH+SE7/jusCUcffbRfFwIe\nvQM+Yhv80bONzi2YBkwDjaWBNuOgTHdtNe0AE3EfmDTdVqwXcb+d0KJBk1YIM0myt0Z5sTaoQkjK\nM8vHBDoAgwKWEsABCCCwr0VWGda6lrUW4rNdaQAwe/bs6YEH/bNfKIQyMyUF/jsMAkl39PHjx3un\nZaw7xAPqBEgqA/uUt1jZsTyxfPa3z92SJe9/bUZ4Jj7t2LGDizr8e0dfypLGoOfaQCeNd8dkMg1U\nroGGtezELSEh3GAFCMfCAYJCP564/07lai4tBXqDhfKxH1ou4iCUlHprlZfmkEYLL730kocIKnhV\n/jQH/f/2zj3IiurO4z9SIJYVUu7GZAOuFi5UGNkqERMZhFpFYnzk4Yih3IqRAZJSUUkibtCMZA1b\nAg6iUjKWQXwOJghEHIeYVWMSJCKPUgOl4aEbE0UdrazlHz7WRLOV3G+THzlc7jzune57u/t+
TlXP\n6du3+5zf+Zyr/eV3fuecYs+Ox+oUx9tU6rkph6NEgl7I/lIu9azsd9EiIaNhM01H18rESlpoUMLE\n43d8Krpyn22m73pK6v97ChunXnTxJdYwqsHmzLnCfvbYL+zd9963f/jHj9u05uYDjnGN4+39P35g\nL+99zW5aenNk39lNU2xZ2y09tqUnG+L+zpkidOImS3kQSA+B3Hp2JHb0P355QLTWjqZzS0C4d0fi\nIRQQYZd0N+08vCfp8/7al0R75VmIY7dzibWeVqmuhG1xAHc5ZegFrrbJ26GkvDho14WEck/huV9L\nMnfPmgSLzksl2ST7JdokwiReWlpabN26dd1OR3eBp+e8nX7udWgo7af//YjdsGRxtCjg+IKI0qaj\n5W4SqhWUNz252bZs3mJnnnGmfbrhWDt3SpPNKKzdU4uE0KkFdeqEQPUJ5Ers9Da0o9iV3lYVVpyD\nhEItk8RW6NkJbZF9vbXT74+7vXrB6kXb3yT7+9qG/tbVl+f1Ir/88sujF73fLwHgqdqixustlcv7\nIHHWk+DRc26/PFQSPI888ogdd9y+VY41Hf3GG2+MvDny6ujecEgrFDobN260Fbffab9++ilrnvkN\n+83O3WULnLAdw4Z+ys6bem50zJ37H9HWEe3t91h7+8rIA3XuuVPC2xM9R+gkipfCIZAqArkaxpLH\nQGKgu3/l6wWrRdt0T7FnQQJH4iANXh3ZokUJQ0Ege9euXVuWfXG3Vy9apTgET1RQSv7ohR56Sty7\n4XlKzNxvhuJ21BcSaaWSizOJFh/OUvxOOB29s7Mz8l5p+EqCRzO0wvV33nrrLVuwcJHNunhWtBXE\nLwt1Xf3duf0SOsW2Svh8Y2azbdjwS7ugeYa1tbWZhriq8fvyOvw3XWwbnyEAgXwRGFAIRix/g6F8\nMaA1ZRBQsOukQvyIPCF5SNrnSW3xf+VnrU0SPBJqpQKX9Z+2huM0A0tCRltdXHjhhfbQQw9FzVy/\nfn30nLw/ikPydYC2bdtmN9241IYV9tuaN29erAKnN76LWpfYvJYr7eabFUy9L9aot2fK/V5CRyKn\nFLNyy+J+CEAgGwQQO9nop9RYqeBUxe34v4xTY1iFhrhoiyMWqUIT+v2Y+sK9PcWFSfBoGMs9OBI8\nI0eOjLw5iunR1hLyBCmYWVPmH3yw09TH3/z25fat2ZcWF1eVz9pkVN7XoUOH2uLWRbGKEoROVbqQ\nSiCQOgKIndR1SboNUryIhgm1aaX+dZz1JJHg3pEst0WeKQ+0DtshseOCR1PONWS1adOmaDq67ps6\ndWo0HV1xO62t19vu3busfeW90cadYTnVPlcg8/z5/2VvvPGGrWy/OxbBg9Cpdi9SHwTSQyBXMTvp\nwZpfSyQOtGN1lj0h3jvu1Qnjdfy7rOUSnjok3MLkcUcev6Mhq1LT0RcsWBQNeW0sbBw64aTGsIia\nnCueRzurjxgx0pqnz4yEXH8MQej0hx7PQiD7BPDsZL8Pq94CvVAVuyNvQpbjHjyQd8yYMVHci0SP\n4pGyLn7UP8VxPO7dUfyOByRrLaZdu3bZZz9zov1TIYB5xe0rTCIjbWnOFXMLy0f8tmIPD0InbT2K\nPRCoPgHETvWZ56JGiQId8+fPz2R7FJcyc+bMbm0/+eSTo/ZJNOiQ10TJBVL0IcV/9IIP43gkdpQU\nv+PDWV1dXXbZZbOj9Xce7Fxf1UDkctFpEUPtBL/qR/eW9ShCpyxc3AyB3BJA7OS2a5NtmHt3/GWS\nbG3xly7xIqE2o7CYndqyYcOGKOha+TvvvHNQhcOGDbOxY8dGh3Yl16GUZvFTHMfj8Tu+P9aWLVvs\n9NNPtyc3b03F0NVB0IMLiuGZNesSaxw3rjBDrCX4pvtT/21m2fvYfev4BgIQKIcAYqccWtx7AAEJ\nBQXFavp2lpJEjmzWy1DJvR6apq1Dq2xr/zTNVNq+fXt0
lGrf6NGjI9FzwgknRCLIh7/SJIB8AUJ5\n4ZTUVrXxzTffLOy/dp5NPe/fazbrKjKojD+apfX1GdNt+W3LI69bT48idHqiw3cQqD8CiJ366/PY\nWqwXqTwkClaW8MlC0ktQHhqJGBcn7vGQCNAwjzwf4V5R+l73/6oQvKt9tJ544oloSKW4vVqrRsJH\nXh/Vody9CrUWQGEcj9pz7bULbM/zL5Q9LFTc5mp/XnbLrdax7v5oIcLu6g7b2t09XIcABOqLAGKn\nvvo79tbqxaJgZX/BxF5BjAX61GzNwvKZWCrevR0udBTT4oeu6dA9Eiw6NE377bfftueeey4SQBI/\nO3fuLGmphr9c/LgA0o21ED8SehJfaotW1+7v1g8lG1yFi1pl+bTPTS656KB+h+7FqoIpVAEBCGSE\nAGInIx2VZjM1LKSAX3+ZptVWDzaWrWHSy99FjYsc3z7BPTzy+ihJ6Ggatx/67Nf27t0biR+JIAmg\nV199Naxm//mECRMiz497geQdU6qGAFIcz3e+c6WNOna0Lbx2flRv1v48/OhjNqewuvLWbVv3e87U\nBoRO1noSeyFQPQKIneqxznVNGsaS2NELx4du0tTgnuxzseOBuz41OxQ8EkOeXNx4LuFT6lzXNPTl\nHqBnn3222+GvyZMn7x/6kgfIGcYtgCR2jjnmGHut6/VUTjN3xr3l539tmo1vHLffu4PQ6Y0Y30Og\nvgkgduq7/2NtfU+CItaKyihMQ1casupJiLnYkaDxadkSOi52dE0eHnl3dK8EiB8SOjp3wROKnlLX\nNPwlr48LoO6GvxT8LNGjQ0LI44v6K36uuuq79sGH/29Lb1pSBsX03SrvzvWt10WxOwid9PUPFkEg\nbQQQO2nrkYzbI8Gjl49mO/kLulZNktCZ9LdZSLLJvSXF9kjAhMHJLnjk4Ql3Apfnx+9TGXpOh5KL\nn1D4SOz44SLIcxdCyiV85PVxL1B3w18TJ06M2uOzvyoZ/moY1WB33dOe+qnmEdRe/px66mQbM+a4\nXKzm3UtT+RoCEOgnAcROPwHy+MEEFMOjGVq1nKUlcaPAaR2yozuhI+tdtEjI+Ewsj92RR8fjdvSd\nvD+6zw99drHk5TgRF0AueEKB49ckfkIB5PfI+yPxIxG0efNmL/KAXBtluvBRELQOT6U8QBKgd93d\nbus7O/y2TOfz/nO+ffjBn+z6xddluh0YDwEIJE8AsZM847qswcWGPCsSG+6FSBqGvDkeMK08nHXV\nU90uWNxzEwocFzkSNn6EYsef8Wueu/hRruTix70/Lnhc4IR58fkrr7wSCR+JIAmgnoa/fPaXhJB7\n11TnlVe12KBDBmc2MLm4/3zdnT3P7yn+is8QgAAEDiCA2DkABx/iJODxMvIo+HTvnjws/a1bs6wk\ncCSsdK68r8kFiYSKzl3UKHcx4+f+Ocz9PLzHryn3MmWPzr0+F0ASNzov5eUpFj7uDfKhL+USQdpO\noTiFa/90dq63xUtusLPO+HzxbZn9rGG51WtW7xd1mW0IhkMAAokSQOwkipfCRUBeHokQBQlL9Ciu\npxwh0hNFCSoJG3mPlJRr6KrS5CLEBYlyiZVShwsi/86Fjufh9VLXvGyvSza7+NG5RI4foQjyc89d\nDIVr/2gITJt8FifVlaekPbNGHzuqzx68PLWdtkAAAn0ngNjpOyvu7CcBiR4Jk/b2dmtqaoqCbSVM\nyhU+EjjyFuno7Oy0U045JXrZ9UfkdNe0UBxIvLgwcSET5qGg8XPPdV+p8/Cal+V1eN0ugJS7+PHc\nvTz+2YWPCyFf+6etrc0mTvw3+/GP13TX1ExeX9S6xP5ciNu55prvZdJ+jIYABKpDALFTHc7UEhAI\nxYoEkIa2JHgm/W3mlOf+iDxCeka5jpdffjkSOBI3lYglL7eS3AWIng1FiYSKPheLl1Kf+3rNxY+X\nHdbtAkjiRucublzs
hLnOJXbe/+MHtuK2H1TS7NQ+s/b+B+zBjo7MbXuRWqAYBoGcEhiY03bRrBQT\nkLjRUJYOJRcxvku3hrzCJCGkQ8JGw2DFYii8N+lzCQtPfi4RIrGhpHMXJ2FeSuDo+/C6n3vu34ef\ndc3LVV0KnlZS7vbIFp274NHnwq02/Jh/ie7N058hQ4bkqTm0BQIQSIgAYichsBTbdwK+jUPfn0jX\nnS4yZJWLjNALE4oTFyueFwsZfS73murSM0o610wyJdnix7vv/l90LW9/jj7qKPv100/lrVm0BwIQ\niJkAYidmoBQHAREIBdA+z8rfA4MlSPxwIVRK4Oi78Lqfex5+X3xN5XvZyj/88z4BlLfe+dfRDfb8\nC8/nrVm0BwIQiJkAYidmoBQHgVIEQvHj5xIklQx/hcLGBU9v1wYNHFTKLK5BAAIQqAsCiJ266GYa\nmUYCLnpkm84VYyMBpOSeH8//AVUvAAAG+ElEQVQlasLDxY3nLnrCPDwffOjgqNy8/dm5iwUF89an\ntAcCSRBA7CRBlTIhUCEBF0Ceu/hRcS58lIfCJzyX+AkFkAueIR/9aIUWpfuxvYWVpb96/gXpNhLr\nIACBmhNA7NS8CzAAAt0TcNGjO/xcYseHvyRmXAS56NHnUPDofODAj9j//uEP3VfENxCAAARyTOAj\nOW4bTYNALglI9Pgh0aNj4MCBdsghh9jgwYOjQ9tE6DjssMOiQzumd3W9ljse27fvsIZRo3LXLhoE\nAQjESwCxEy9PSoNA1Qm48FEerq0zaNCgSAAdeuih1tjYaGvX3Fd125Ku8KXf/84+9rF8DtElzY7y\nIVBPBBA79dTbtLVuCBQLoCOOOKKwGOOppp3C85T+pzDtvJaLTOaJJW2BQJ4JIHby3Lu0DQIBgRPH\nNdrjG38VXMn2qYTb611d7Hie7W7EeghUhQBipyqYqQQCtScw4aRG27plc+0NicmCp595xr7cVPkO\n9zGZQTEQgEAGCCB2MtBJmAiBOAhob7EX9uy2vKxN07Hufps4YXwcaCgDAhDIOQHETs47mOZBICRw\n9jlT7I477gwvZfL84Ucfi4awJOBIEIAABHojgNjpjRDfQyBHBC695GJ7+Kc/sa7X38h0q+5dudIu\nnT07023AeAhAoHoEEDvVY01NEKg5geHDh9sJnz3R7mm/t+a2VGqAvDra6bx5GisnV8qQ5yBQbwQG\nFFZb/ft2zPXWetoLgToksGPHDhs7dqz9Zudu067hWUvnf22ajR/faN/6Jp6drPUd9kKgVgTw7NSK\nPPVCoEYEjj/+eLt2wUJbuHBhjSyovNplt9xaiNV5Da9O5Qh5EgJ1SQCxU5fdTqPrncDsyy6NhoIk\nHrKSNIvs1rZl9v3vX2OHH354VszGTghAIAUEEDsp6ARMgEC1CUgsLL9teSQesrKqcktLi13Q3MyK\nydX+sVAfBHJAgJidHHQiTYBApQSWLWuzjo4O+9GqVTZs6KcqLSbx5+ZcMddefPG3tr6zI/G6qAAC\nEMgfAcRO/vqUFkGgLAJXXtVie/bsseXLf5BKwSOhs2P7MwVR9gDDV2X1LDdDAAJOgGEsJ0EOgTol\ncP3i66yhocFmzbokdevvSOhoXaClS29C6NTp75NmQyAOAoidOChSBgQyTiAUPGmI4dGihz50tXrN\najb7zPjvC/MhUGsCiJ1a9wD1QyAlBCR4GseNs6/PmG5r73+gZlZp1pW8TIrRWdl+N0KnZj1BxRDI\nDwHETn76kpZAoN8E5s1rsdbFrXbNvKvtoourP6ylqfBfmXJOJLoUjMwU8353KQVAAAIFAogdfgYQ\ngMABBLS55tZtW23AgAE2edIkW9S65IDvk/igLSDObppi2slcU+IlukgQgAAE4iLAbKy4SFIOBHJI\n4PHHH7cVt98ZrVr8+TPOshnTp8U6Y+vOu1faL36+b6+rlqtbbPr06TmkSJMgAIFaE0
Ds1LoHqB8C\nGSAg0bPqvjV2+4rlduFFs6xx/El21pmnVyR85MXZtOlJW7d2tQ0dNsxmFGKEmpqaGLLKwO8AEyGQ\nVQKInaz2HHZDoAYEXnrpJdu4caM9+rOf232rfmhfPvscGzFipH3ik5+0kSNH2JAhQw6yavv2Hfbe\ne+/Z73/3YvTMpEmn2udOO82+9MUvEHx8EC0uQAACSRBA7CRBlTIhUCcE5PHRLup/sQH21FNPl2z1\nkUceaUf985F29NFHReJm+PDhJe/jIgQgAIGkCCB2kiJLuRCAAAQgAAEIpIIAs7FS0Q0YAQEIQAAC\nEIBAUgQQO0mRpVwIQAACEIAABFJBALGTim7ACAhAAAIQgAAEkiKA2EmKLOVCAAIQgAAEIJAKAoid\nVHQDRkAAAhCAAAQgkBQBxE5SZCkXAhCAAAQgAIFUEEDspKIbMAICEIAABCAAgaQIIHaSIku5EIAA\nBCAAAQikggBiJxXdgBEQgAAEIAABCCRFALGTFFnKhQAEIAABCEAgFQQQO6noBoyAAAQgAAEIQCAp\nAoidpMhSLgQgAAEIQAACqSCA2ElFN2AEBCAAAQhAAAJJEUDsJEWWciEAAQhAAAIQSAUBxE4qugEj\nIAABCEAAAhBIigBiJymylAsBCEAAAhCAQCoIIHZS0Q0YAQEIQAACEIBAUgQQO0mRpVwIQAACEIAA\nBFJBALGTim7ACAhAAAIQgAAEkiKA2EmKLOVCAAIQgAAEIJAKAoidVHQDRkAAAhCAAAQgkBQBxE5S\nZCkXAhCAAAQgAIFUEEDspKIbMAICEIAABCAAgaQIIHaSIku5EIAABCAAAQikggBiJxXdgBEQgAAE\nIAABCCRFALGTFFnKhQAEIAABCEAgFQQQO6noBoyAAAQgAAEIQCApAn8FUX2PmBTVQm8AAAAASUVO\nRK5CYII=\n", + "text/plain": [ + "" + ] + }, + "execution_count": 100, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "Image(filename='sentiment_network_sparse_2.png')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Project 5: Making our Network More Efficient" + ] + }, + { + "cell_type": "code", + "execution_count": 105, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self,reviews):\n", + " \n", + " review_vocab = set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " 
+    "                review_vocab.add(word)\n",
+    "        self.review_vocab = list(review_vocab)\n",
+    "        \n",
+    "        label_vocab = set()\n",
+    "        for label in labels:\n",
+    "            label_vocab.add(label)\n",
+    "        \n",
+    "        self.label_vocab = list(label_vocab)\n",
+    "        \n",
+    "        self.review_vocab_size = len(self.review_vocab)\n",
+    "        self.label_vocab_size = len(self.label_vocab)\n",
+    "        \n",
+    "        self.word2index = {}\n",
+    "        for i, word in enumerate(self.review_vocab):\n",
+    "            self.word2index[word] = i\n",
+    "        \n",
+    "        self.label2index = {}\n",
+    "        for i, label in enumerate(self.label_vocab):\n",
+    "            self.label2index[label] = i\n",
+    "        \n",
+    "    def init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n",
+    "        # Set number of nodes in input, hidden and output layers.\n",
+    "        self.input_nodes = input_nodes\n",
+    "        self.hidden_nodes = hidden_nodes\n",
+    "        self.output_nodes = output_nodes\n",
+    "\n",
+    "        # Initialize weights\n",
+    "        self.weights_0_1 = np.zeros((self.input_nodes, self.hidden_nodes))\n",
+    "        \n",
+    "        self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5,\n",
+    "                                            (self.hidden_nodes, self.output_nodes))\n",
+    "        \n",
+    "        self.learning_rate = learning_rate\n",
+    "        \n",
+    "        self.layer_0 = np.zeros((1, input_nodes))\n",
+    "        self.layer_1 = np.zeros((1, hidden_nodes))\n",
+    "        \n",
+    "    def sigmoid(self, x):\n",
+    "        return 1 / (1 + np.exp(-x))\n",
+    "        \n",
+    "    def sigmoid_output_2_derivative(self, output):\n",
+    "        return output * (1 - output)\n",
+    "        \n",
+    "    def update_input_layer(self, review):\n",
+    "\n",
+    "        # clear out previous state, reset the layer to be all 0s\n",
+    "        self.layer_0 *= 0\n",
+    "        for word in review.split(\" \"):\n",
+    "            self.layer_0[0][self.word2index[word]] = 1\n",
+    "\n",
+    "    def get_target_for_label(self, label):\n",
+    "        if(label == 'POSITIVE'):\n",
+    "            return 1\n",
+    "        else:\n",
+    "            return 0\n",
+    "        \n",
+    "    def train(self, training_reviews_raw, training_labels):\n",
+    "        \n",
+    "        # convert each review to a list of unique word indices up front\n",
+    "        training_reviews = list()\n",
+    "        for review in training_reviews_raw:\n",
+    "            indices = set()\n",
+    "            for word in review.split(\" \"):\n",
+    "                if(word in self.word2index.keys()):\n",
+    "                    indices.add(self.word2index[word])\n",
+    "            training_reviews.append(list(indices))\n",
+    "        \n",
+    "        assert(len(training_reviews) == len(training_labels))\n",
+    "        \n",
+    "        correct_so_far = 0\n",
+    "        \n",
+    "        start = time.time()\n",
+    "        \n",
+    "        for i in range(len(training_reviews)):\n",
+    "            \n",
+    "            review = training_reviews[i]\n",
+    "            label = training_labels[i]\n",
+    "            \n",
+    "            #### Implement the forward pass here ####\n",
+    "            ### Forward pass ###\n",
+    "\n",
+    "            # Input Layer\n",
+    "\n",
+    "            # Hidden layer\n",
+    "            # layer_1 = self.layer_0.dot(self.weights_0_1)\n",
+    "            self.layer_1 *= 0\n",
+    "            for index in review:\n",
+    "                self.layer_1 += self.weights_0_1[index]\n",
+    "            \n",
+    "            # Output layer\n",
+    "            layer_2 = self.sigmoid(self.layer_1.dot(self.weights_1_2))\n",
+    "\n",
+    "            #### Implement the backward pass here ####\n",
+    "            ### Backward pass ###\n",
+    "\n",
+    "            # Output error\n",
+    "            layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error is the difference between desired target and actual output.\n",
+    "            layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n",
+    "\n",
+    "            # Backpropagated error\n",
+    "            layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n",
+    "            layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n",
+    "\n",
+    "            # Update the weights\n",
+    "            self.weights_1_2 -= self.layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n",
+    "            \n",
+    "            for index in review:\n",
+    "                self.weights_0_1[index] -= layer_1_delta[0] * self.learning_rate # update input-to-hidden weights with gradient descent step\n",
+    "\n",
+    "            if(np.abs(layer_2_error) < 0.5):\n",
+    "                correct_so_far += 1\n",
+    "            \n",
+    "            reviews_per_second = i / float(time.time() - start)\n",
+    "            \n",
+    "            sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / float(i+1))[:4] + \"%\")\n",
+    "        \n",
+    "    def test(self, testing_reviews, testing_labels):\n",
+    "        \n",
+    "        correct = 0\n",
+    "        \n",
+    "        start = time.time()\n",
+    "        \n",
+    "        for i in range(len(testing_reviews)):\n",
+    "            pred = self.run(testing_reviews[i])\n",
+    "            if(pred == testing_labels[i]):\n",
+    "                correct += 1\n",
+    "            \n",
+    "            reviews_per_second = i / float(time.time() - start)\n",
+    "            \n",
+    "            sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n",
+    "                             + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n",
+    "                             + \" #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 100 / float(i+1))[:4] + \"%\")\n",
+    "        \n",
+    "    def run(self, review):\n",
+    "        \n",
+    "        # Input Layer\n",
+    "\n",
+    "        # Hidden layer\n",
+    "        self.layer_1 *= 0\n",
+    "        unique_indices = set()\n",
+    "        for word in review.lower().split(\" \"):\n",
+    "            if word in self.word2index.keys():\n",
+    "                unique_indices.add(self.word2index[word])\n",
+    "        for index in unique_indices:\n",
+    "            self.layer_1 += self.weights_0_1[index]\n",
+    "        \n",
+    "        # Output layer\n",
+    "        layer_2 = self.sigmoid(self.layer_1.dot(self.weights_1_2))\n",
+    "        \n",
+    "        if(layer_2[0] > 0.5):\n",
+    "            return \"POSITIVE\"\n",
+    "        else:\n",
+    "            return \"NEGATIVE\"\n"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 106,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": [
+    "mlp = SentimentNetwork(reviews[:-1000], labels[:-1000], learning_rate=0.1)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 111,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [],
+   "source": [
+    "mlp.train(reviews[:-1000], labels[:-1000])"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 109,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "Progress:99.9% Speed(reviews/sec):1581.% #Correct:857 #Tested:1000 Testing Accuracy:85.7%"
+     ]
+    }
+   ],
+   "source": [
+    "# evaluate the trained model on the held-out test reviews\n",
+    "mlp.test(reviews[-1000:], labels[-1000:])"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Further Noise Reduction"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 112,
+   "metadata": {
+    "collapsed": false
+   },
+   "outputs": [
+    {
+     "data": {
+      "image/png": "[... base64-encoded PNG data elided ...]
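The efficiency change in the `train` and `run` cells above is that the full `layer_0.dot(self.weights_0_1)` product is never computed: since the input vector holds only 0s and 1s, the hidden layer equals the sum of the rows of `weights_0_1` at the word indices present in the review. A minimal standalone sketch of that equivalence (toy sizes and indices chosen for illustration, not the notebook's data):

```python
import numpy as np

np.random.seed(1)
vocab_size, hidden_nodes = 6, 3
weights_0_1 = np.random.normal(0.0, 0.1, (vocab_size, hidden_nodes))

# one-hot-style input layer: suppose words 1 and 4 appear in the review
layer_0 = np.zeros((1, vocab_size))
word_indices = [1, 4]
for i in word_indices:
    layer_0[0][i] = 1

# slow path: full matrix multiply over the whole (mostly zero) input vector
dense = layer_0.dot(weights_0_1)

# fast path: just add the weight rows for the words that are present
sparse = np.zeros((1, hidden_nodes))
for i in word_indices:
    sparse += weights_0_1[i]

assert np.allclose(dense, sparse)
```

The same identity drives the weight update: only the rows of `weights_0_1` indexed by the review's words receive a nonzero gradient, so only those rows are touched.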
Y4yJjVf\nCfCQneWf//ynXwYMONDNnXt/NGFoZw89hx52hLv19juVXMVr/HJOPW2Y6x3B5quvvOxm3zLbdy2v\nldVQwINjugXTgGmgsTWQSdihohs+fHizO0MFDhgkAQ9xkyp5IIdzOPr+9re/bZZefAdwII1C8YqJ\nE0833AdqBgwY0CwP8uOY4CqMX2ib+EllFgDJ4btQGvFzVJbf+9733MZRE0C9Qz5ALUcu4I2yqdLH\nUqNKH4gJQUeAI8gJAQcQA0q++c1vOkAFcBHUaB2CDscEOCHkAEekQVryyRHkUT5kJZR6H/I1XwF5\n8TJTPhbK8fOfHx11CnjI7bTjDm7yhAmuY4eOHlIAH5qeSgmMgsyUD8xaftSRgyIYXN6PiMz0D6WA\nWyl5FooL8HD/DXgKacnOmQayr4HMNWNRWecLVIBU4mFvLSr9lkCB6wCDt956y/dkSUq/pTS4ppg4\nSWnrWCGLC1DG3EbF9NYCmuIwqDy0Ji+6J5fSg4tKttQKVvlVe10IOkvNi0oWZ2VgJ27ZwcqBlYcF\nIFBzD3kIQMLmJjVHaS2LTHwdXiNrDWsF0k4KVMrIW6r1o1DzFWUG7mTJUlnJH6sSYZVVVnGHH36Y\nO+qowW7hwoXuqaefcXfNmeMOOnD/CBZ6uXbt13frrrueW3uddXz88AerDZawe++5y+3db1+33Xbb\nuqFDh3iH8DBePbcFPKwtmAZMA42ngf98XTNUNir9F154wVdOH3/8sYcAiR/CEBATAgiVOyAkf4rQ\njyYeV+mFa+Lr2nyQUEycMM34Nt1tlceUKVOanS4EQ2HEEHRCXQFzISyhG8pdbAAI0lQZhPe62DIk\nxdtqq638P/uwGUuVv5qzBADcG6AECMD6QpNTviYqWW7CtSw4YROVwEfwVAh00H+poAOk5mu+EuhQ\nPjXHsQbygJ8kwAO26CWFNQZ9jBt3hTsmsv6stmrURPedFd23V4z0Ei0rRdv//ucX0aCF+7vTTh3q\n495z9xx3zsizUwU6eibQLTCJH5QF04BpoLE0kDnLDuq/9dZbvVWCbcYUARCwzChQgXMcC0dYmQMP\nYWXPPs1eqjSBCdJKClwXh494vGLixK8J9wGlEKLYR37Bi8pD2fIFLB5hU16oK2CPfZrtSJeFNMkn\niwG9AK+F9FFMuQQPgkxZb7QPfAhIwjXWmrjFJumYmqK0Fsxo3ZKM6ipdLmgW23wlXx3Ah7JTFtYE\nZEfeJJnV/FSufC2Vv5bnKYMsmHouapm/5WUaMA20jgYyZ9mhwmYJQ3xfgBM2dVAhhqCj68OKnuvC\naxSHddK14fli48SvCfcPPPDAcNdvx/MNQeZrkaMDofxYdeK6QQ9hPi2lF+bBv15VbOHxUrcBU1Wc\npa7DvCgrflpYqEopR5hGuC1ZqNiBGip7+d9gwZF/DWv53oTbOiZLD9YbrmfBEkSahaAhlEXbWNMq\nsagV03wlyJE1B6sWFp9QHwI1ydXIa55xdG4Wnka+y1a2tqaBzFl24pV3oRsWVoBU/EkhbhUQKMXj\nxuPFz7NfTJyk63QsqWzxNPPJpzTC88AAFVahEMYvFE/n0vZvl3ssy1doFZO8pazRVQg5XAvwYOkh\nhOcVj3W4CAoECrrOJ1DiDxUuoVzALLb5iuYqgQ5WHYLgjDU6oIwqm4/Q4D/o3Cw8DX6TrXhtSgOZ\ng502dXessDXTAHCiypzKncAxKvuwKUdgEwIAx3S9BGa/kkBFC1huXEHPtyO/ms4jPngg8IYvDmAj\n0AF2sOhQVvRA+WSVYq3yUq5Ky1aJXmp5rYCn0vtQS5ktL9OAaSBZA5lrxkouRvLR0FISNu+EseOW\njbglJYzb2ttJMsblC8uUJE8oP01gVF6Flpb8kMI8qHiphBs1UIkDLqroaYaSA7KcjEMHY8GArB4C\ngUphgOZCdM1Sbhg9Ov/ggXJKVu8rQAfwUdMVgEf39
9CJGp0AQW0tyKomK1tbK7+V1zTQKBpoaMtO\n2HQFNOCIHPeBCXs4AQrhNbW+ycgX+tOQv5yn2Ua+lmAnlB94otwhAJFOuYHKtxp+DDiBxyGuXJl0\nXUt6UbykteBElTn7ACIVvIJAhjiKz7lwW3ErWQM6PXv2rCQJD6TF9L4CdrQAOgTADYgDdORzpCYt\n6UDCAQDI++lnf3MLFvzGH1YXc3Y6/KCD23qbbf3xjh06uFVXXdmXTQDhT2Tgh+eesrKwbcE0YBrI\nngb+8zXPnuwtSgw44M+hipUxeMIeWeyHMBEHjRYzqHKE+Ng37MsfhayKkQ/YoeLHl4Vy4wxMmQVB\nSlM64VzopN1SkaoBO5KlpbxqdV6+GeQn4MmXd7XhRvlU2uNK6bAup/mKpq0k0FETliAPXf36vvvd\noxGYL1u61MPMFlv+0PXZpa9r376dF4Pu5QQGHHxvyRK//eKLv3evvf6Gu/vue/x1O0Vj8/To3jUn\nq4+U4h8BD+XPGqylWK0mmmmgZhpoaNjBooHTqoABAAi7qIdaJm6+budhvNbeRlbJG8+rWAdc4mmE\nZKw79FhKCkBRKaCDxYHmkUYL+sfeWiDTkr7IH9ip1KJDPtyffHNf5Wu+AnSAmXjzVQg6d945x02c\nONGDyv4DDnbFTBDapXPHaKqJjr744WSiQNCj0ezqd865210y9hI/u3m/vfdKvdUE4BGUGvC09FTb\nedNAujTQ8I3wVPwtVeiATjXGa6n01haCGUCs2KYaytsSuJEWZS4l8LFnbqxGC/xbrwZolKMXQIdQ\njcqTclS7+eqee+51ffrs4kHn8IFHukVvLHJjLhxd0aSgANCQk05wWIDGjZ/gFi9+122yySZu/ISJ\nVWkm9QptpR85K6NrC6YB00B2NNDwsMOtoKmGwfTi0CNrDiMLp6FpBfmQNYQa5EL2QiCU9LgRn1Gm\n49eRNiBEmcN8ktKIHwN2mBurkT70/FMH4KoBG3F9tbQvPaLXaoRym6+w6sT9dF577TV35OCj3aRJ\nkxyQ89hj89zRgwdWQ8xmaWDxGXfF5e7Vha+7+fMXuG5du0UQnn9KmGYX12nHgKdOirdsTQMVaOAb\nkSPml0OkVpCIXdp2NECFSuXcCM1ZwAYOtjfeeGPNAY58ASwqzmoE7gdWndVWW81hLSJdXm11M6fH\nFRN7hrO10/2cpjtAh15mGhSRUbUnRBaXgRHsHDnoCNe+3XrVELGoNJhc9LxoOolevfu4sWPHVE0/\nRWVeYiQ1adXLKliiuBbdNNCmNWCw06Zvf+mFv+uuuzzoyCpRegrpuQIg+PTTT71AjEVDpcXS2lYe\nQId8qhW4Fz/+8Y99cnOiyTn33Xff3HADGk9Hs7cLdsIpIeheD+iwXHDBRe6xeY/65qXQz6ZashaT\nztJlH7ozzhjmweyC80e1+v0oRqZCcap9PwvlZedMA6aB8jRgsFOe3tr0VUACH/jWhoLWVLIcgnHm\njYdVV13Vlw0g0aI4lTgxt5YlgPtAOQA2YJSAVQeHZKAmtOoAO+xzjt5XdC9nDCGg6IwzzvRWnilT\np9TUmiPdxtennjbMzb3vXjf7ltmpf9YMeOJ3z/ZNA+nSwH9F5u/R6RLJpEm7Bj788EMPO1gQshqo\n5Kn0WQCEDtE4MAyktzTqTv23v/3Nvfvuu96XZ/r06b55iCEKXnzxRT8zeLt27TwkUPZi4YemJfTW\nrVu3qqqM1/eWW27xzVdUuJRLzVdx2NFs5hyXnw5WHYDozDOHu29F106Zck0qQAcl7fbTXd23V17V\nnXbKULfDjju49darXXNaqTeJpl30z9qCacA0kD4NmGUnffck9RJRcdN7ZvHixZn+uFMxXXnlld4i\ngtLxb2FhiIJHH33UL1RgH3/88dfuSadOnVzv3r1981GvXr28PoiUBD/oi1DtirBazVennXaG+8Zy\ny7lrrrk6NaDjF
fbVz/XTZrjLL7k4MxaeavpihXqwbdOAaaB8DRjslK+7Nn2lev7g3JvFAOSwACJY\nQkJrCHNE0azDIvgBLJ588km/fPDBB18rMtaeEH7kQ0PzEs1+1QYdBKhG89WkSVd7HUy9dmoqQUeK\nHnnuaPfKyy+5GdOnpdppGXl5Vrjf3HcLpgHTQDo0YLCTjvuQOSmABKw7NO1kzXcH3xkqI5qv8MkJ\nQUcOvTTtsNDkw6KAn8tnn33mXnnlFQ8+Tz31lKObdjzQnERT0eDBg91+++3nsP4oJFl/dK7YNc1X\nlfa+YsDJMWMudndFoxpr8L9i869HvEMPO8KtFvlTXX31pHpkX1KeBjwlqcsimwZaXQMGO62u4sbN\ngAoXYODDnqUgX6O4M698XIAc/FuYNworD8ex8BAAGICHRdusgZ6XX37Zr5977rlEdfTo0cNDjyxA\n+udfKvygb1mOyu19BdQdfPDBbuyll7sBB+yXKG/aDtJLq3cEpxOikZz79t0lbeJ9TR7uU2tZ9b6W\nmR0wDZgGCmrAYKegeuxkSxrAqgM8AD5ZCADOkdFYQfrnjcxYdgAaWXVwWgZ24sBDXMBGSxL04NyM\n1WeNNdbw4MM2IEQvqHiQ3w9WH+AFSxmhJfipRvPVueeOcmusuaabOuXquFip3tc4PPMXzM9EMxEW\nUAKWRAumAdNA/TRgsFM/3TdEzkBDz+jfNr47spiktWD5ZBXsyLIThx0sPVh4sO4QFxhhAXpYC3re\ne+8935OLUa85p+NsL4kmxAR61PxVyO9H8CPrTQg/1Wi+YmTtiy8e656IfJBqOWBgtZ4LmrO6dO7s\nRo4cUa0kWzUdA55WVa8lbhooSgMGO0WpySIV0gCgc8opp/iut2n136HC6RlBGUCGY3IY4j47NF8B\nPFoDO1h9WAAi4mtROoAOUMIUHGETV7gdAhAWIByeBT/5/H769Onj5abpi/S33nprn2W5zVcMHDgs\n6ma+XTQtw9nDh0n8TK2ZSHSLLp0y1RvQgCdTj5gJ24AaMNhpwJtajyIBEFgd6KqdNuDBIZl50AhJ\n3eUFLlhuABqsOACOFh0DdOIL1/7ud7/z49wwb5iCrD4CHNbhtqw+IQwBP1h/WL/++utKKnFN13gs\nQOSPTMgMoGlKiHyDB2LVueCCCzNr1ZEyGHBwrTXXyIx1B7kBHp7FtL0f0qmtTQONrAGDnUa+uzUu\nG74w+MSkCXjU80rTQjB3FDJi5QkD0ADsCHhkyZGDsvYBC+JoDXRsueWWbpVVVsldz3kBFHlgkdEi\n6NE6CXoERbL6AED5nJ47R805O+20k8P5edttt/VA9cUXX3h/I2Qm33Duq4suGus223zzzFp1dM+e\nfW6BO+rIQX4Wdh3LwprnEegx4MnC3TIZG0kDBjuNdDdTUBY1aaXBhwcfHZqtABusTmxreohx48a5\noUOH5jQGnBAEKXELTrgv2Jk3b57bcccdc+ATj0M8LUpX+STBTz7wAXrovk7Ybrvt3JqRYzE9v5L8\nftZee23f1EVlyrLhhhs6RkkGxoAf4IgZxrPQ1dwXuMBPv336u/367+MdzgtES90pA57U3RITqA1o\nwGCnDdzkWhdRwIOlJ+4fUytZ1KwG5OBPRKCSAXBmzJjh94844gg3bdo0DzjAhwLbIZwIWLT+6KOP\nHGPUYFER4HCOba3DbR0L16TPfhiw6JC3LDtq4sLhmfQ6duzo7rnnnpxPENaqxx57zDd9Pfvss346\nijA9ttdaay3XpUsXD2X4FX388X+7e++9Ox4tk/vjJ052r0YgmLUeZSibZ1EO85lUvgltGsiYBgx2\nMnbDsiIuH3JghwB4YF2pRaCJgHxZA13xfIEMrDqnn366FweAABgYDyVubQkBSPCDzw+ws9VWWxUF\nNknQEx7TttJnTZAsDBz40EMP+WM0mWHV0TniYq3Btwh/HZb58+f7gR6x/KCDMGy
3bVd32MCBbshJ\nJ4SHM7uNo/L+/ffNXFNWqHCafOPPaHjetk0DpoHqaMBgpzp6tFTyaADLCrBDExLbG7fSeCOADYB1\n1VVXeesNeWnQvlA0AAHAABz23ntv79jLeYCHnk6hVUWWFc4DGMAD11MG1lqw0GgRvIRWHI5p0XHF\nC9faVlqMTn3SSSeRfTQj+Rmuf//+fhtZVA45UwM6QA9pcH6FFVZw3/nOd9z777/v9fL8889HY/18\n4a67/gbXo3tXn04j/PTq1duNGnVepoHBgKcRnkQrQ9o1YLCT9jvUAPJhsqcpiRnE99nnSx8L4Kca\nAcDReDSkl9TbSvkITgACIIGeSYcddpgHAuKMHTvW/exnP3PLL7+8hwXW8qMhH3p0ATphIE0FIEV5\nCGoELtonby06Fl/rvLqZM9v3nXfemQMc4iM/C93j1UWefQKgg5/OSiut5OhqzvrPf/6z23PPPX0a\nkrcR1vTK+tFWW7hBgwZlujgGPJm+fSZ8BjSwXAZkNBEzrgEsLFhePvnkE+80C/gADTQ30TMKGCol\nUDEoDZoAwi7fQImCwCNcAwpa8GWhiYgmKcKIESPcgQce6Ltvh5YSrEDkEeajPNSkxFpgBCTRA4r5\nsVgADxYsLQIQQQj78YVzv/jFL5SFu/XWW/313/rWt3y65EOZaMJCTrqbh6M9c5wFaFIz1xtvvOEG\nHHRILs1G2Vh7nXW8w3XWy8NzzHNd6ruQ9XKb/KaBWmnALDu10rTl00wDQAkAxPqJJ57wIAEA0YMo\nqflJFQG9qYATKgcWWYiAH5qwVo0miqS5iTQEOWHGHMMCQpMPFhFNC3HOOee4O+64w0dt3769mzt3\nrsOigg9M3759vbUHyBDchGkW2iY/BbYBLQIgon1ZcjjHNuP27Lbbbj4eTYA0twle5J+D3IylQzdz\nFo5zPc1wQJHgCthif/Lkya79+hu5cVdc7tNtlJ+5Dz7sZkYO57NuntkQReJ94D1IegcaooBWCNNA\nnTRgsFMnxVu2zTXAR55/tUBNUhAEAThJgWs5BwzRHZxu4YIJ1kCKAkARQgOWEfZvv/32aBqFixXN\n+/4MHz7cW2ew1KhZC6AghGnmLipiA3kIrLXI2sSacXvefvtt39sLAOOYLDSy5CAzozADPIAP55EH\nGYEbWYEk93XXT3OdOnfJ/Pg6cfU2GuxQPgOe+F22fdNA5Row2Klch5ZCSjRAJbHzzju7zz77zF1y\nySXu5JNPzjVZIaKsMsADwIOFR01AAAPAg1XlxBNPzJXo3HPPdccff3wOIAQ8WHmUZi5ymRsh/Iwa\nNcpddNFF3jKD/xHj4yArMCNLFIAD6LAI1EgDmYAbrDqCHFmjrplyrftBh44NBzvMhL5++3YeGstU\nfyov41nGuoOVx4JpwDRQuQa+/ItaeTqWgmmg7hqgeQtYIGCRwQFZzrtYRNgGaIAHAvCDMy8Aw4LF\nhhGWx48f748T58ILL/ROy0BF6MdDGrLKEK+SIAijuzigQ7j55pvdOpE/SmilkawAjBaghqYq/H5o\nwqOCZKEcrDmGDxAA1IghixOZFnMfsGQS3okNH1DMtRbHNGAa+LoGDHa+rhM7kmENMGig/F3oaUUv\nJKw2wEoILFh34s0+TMYJ9NC7C6dkBvMj3H///d5vhxGLBU1YhUijWsBDPkdGDtsEeqzRzRz5ADDW\nCspPlhwACNABbugtxiLgAXSwDLEQx0K2NCCrjgFPtu6bSZtODRjspPO+mFRlagAIuOGGG/zVjDFD\nD6vQz0XNVTQLARLAAtaTBQsW+KkUGGSQYwAGgw8eddRRPq1Fixb5uaeeeeaZXM8nWYmqAT2jo3GB\n8DcCWm6MHLcJlIW0WeSzg3UK0MKyhPzIHlp1BDsCHc6xrPitFX2ajfbDwIIdftCh0YqVK48BT04V\ntmEaqEgDBjsVqc8uTpsGgBQsG3PmzPGi3X3
33e62227L+bsAO8CPLDMAA1Mt7LHHHo5eWHQPZwEi\naCrC2kKzFgG4weJCExPpqFkMEFHTGIBSasA/g5GSCYAO8pMOC+kiK3mTnxYAiLICZjRjIXPYnZ1t\nHfPrVRrTsvPekiVu6222LVXlmYov4OE5sWAaMA2Up4Hly7vMrjINVEcD70Q+CY9HPbC0JlV6VmnC\nTsa20ccePwa2e/bsWXDWaACmV69ebsCAAX6MGpyMO3To4DbaaCMPD0AEcXDwff31190uu+ziC8Mx\n+cKwrSYr8mVOqn79+vl4TDWBI/Oll17qrS74zbAQuJ70w6Ynf6LAD0BFoPlKXenZj1t0kCcENfKS\nzw4+OQAa4MMx5JcMpLPWmmu43zz/W5JtqMA9bAuB5573AuCRP08l5eZ9Iy0tpB2+d1gYlY/eO9Y9\no3fPgmkgixqw3lhZvGsZl5kPLBYMDSiojyhrrBoEfVSJqw+xPsw6BhhokUoECFhAsL5svvnmvncW\nMPDII4/4aMDAn/70J28x6dGjR86Kw0lZUbhWViDS4jiBZq0//vGPfrt79+5uypQpfjweIAMrixyd\nAQ3Bho+c54fmK6w6VC5UQLLqqBzADb5GGlOHbSxJpE05ZL2hqUrAgwzx/JkOY9y4qyJouyuPJNk8\nfPEll7uVvrOiGzrk5GwWoESpeRd4TnhXSg3he/fuu+/6nou8Z4AUCyH+3nHs8eDPCPkTR++d3lfi\nWTANpFkDBjtpvjsNJBsfSeCGyp3AxxKLRjkfba7nw81HmEH3SJtBBVmABpp+aPYBFGiiYlA+AoMD\nYuVZEjV9YAVhBGX1VFKzFVYZwAbA4XpBj4CH84Ca/IK47sEHH3SdOnXyaQI8LFhWQuuKFyD2Qxk0\n1QXNbuiE9Fnko0P+DBoo2KFcnAdogBvkVxkEXEn5oiP8ebi2kcIxxx7v/vynZf45UIXdSOVLKgv3\nkmcH6Cgm8LzyngBJghTW5QTS4D0mTbZ5hzWaeTnp2TWmgVppwGCnVppuw/nwYeSDCNjwcWSpZgB6\ngCgqAPJhfB0AADAAFiZNmuQuuOACnyWWGSqJ73//+x4WsIgQF1AAXAAFQtzCQzoAD2lidSGvIUOG\n+Lj8XH/99W733XfPpQOM0Myk9JKsPOiD5jqar6hACMBICGuy6gA7wBfnSBd5JTvWHcAHSw/n4nmh\nH8LoUee7s84+2+3+075+vxF+OkZjB82+Zba3iFH5KnCPGz1wXwuVk2eK94HA+3Fkld873gEgijnv\nmJuMbbP0NPpTl93yGexk996lXnI+hnxg+ScK8BT6MFejMIIeKr1f/vKXfuJLWWfwy6FrOQH/Gz7K\nagYSNAAQHAMWBB2y8CgdoAfgATrIZ+DAgTnRmaFcIy4DTlh4gA8WQgghVD6t1XzF9BthkN4vGnOx\n+8c//+3GXDg6PJ3Z7WefW+BGnj0imrF+3tfKIMDjBBYflkYMScDDc8l7JxhhuzUD+QFVyMJzLcBq\nzTwtbdNAqRow2ClVYxa/KA3wL+/UU0/1g/zxAaxlmDZtms8bEKHrOeAxa9Ysb/FBDvx4sMRgdeFc\n6PfCvoAHC05oZRHwsFazFiB32mmn5fx48AG65ZZbchaeJD8eKqFqNl+9+eabvpkLmGIR3MR1zj/9\nq64anwgH8bhZ2B957mj3P//+l7vs0rEFxaUyZlHIpx+dz9o6BB7+VAAbAA7vXS0tLchBvlgskaOW\neWftnpm8tdeAwU7tdd7QOVL5618elWu5PjmVKompFugmzkzrzHe16667+sEB+RgT6FFF8xFWF5qA\nsO5oAXjkaCxHYaw5AA6WHZYQeLACMQmpJhLlesbtadeunYcp4EnNWsAIoFNJ8xU6/vjjj3NAxeCH\na665ZjPLkS9kwg/NPuPGT2iIpqxevXpHMH1eXrhLKL4/RKWsgMWHJeuBMvEHgzXvXb2AjmeTdwyg\nr+f7n/X
7afJXXwMGO9XXaZtNkQ+dPrJ8dOv9zw7gwRGTnif33Xef99MZNmyYmzlzpr9HM6LZsqno\ngJEQeLD0ACzyfwFmcBhOclwGejgOFFHm8847L3f/77zzTkePLTWPATxMP8GUEAz6h1zoiPQFVaQn\nPx0ck9lGr/QAw0qEXFim6EqPzAIzWXVymefZGDNmrPvrRx9nfvbz66fNcDfNuLFiK9U7gdWHe1Ev\nOM9zu4o+DGDgO/Piiy+mogxYlQRfWdVp0cq3iJnQgMFOJm5T+oUU6MiEnQaJkYneWVQE/Mv89a9/\n7TbbbDMPPVhngBy6owMKQAOQI+tO3OFXTVpyXAZKQiuP/Hj4Rxs6Lp9zzjl+IlGAB58hZmQnAELq\nESOYIg3SBHLoKi6QwoGakZ2RiW0WtklTPb8oQzGByn2TTTZxry583XXp3LGYS1IZ59DDjnAH7L+f\n22+//lWTj+eF+6fAs1xvYJcshdaypADblAGAT0NQkxpyGfCk4Y60bRkMdtr2/a9K6dMIOioYIMFC\nhcBoyvzzZRoJZkcn4LiMNQYrDsAj2Al7OKlHFengw6Nu4XHgoZmLc+iDXl9//etffR7y4+nWrZtj\nfi3m7rr33ntzPbUAKebiAnaUZufOncvufeUzLfAz7MzhkZz/l1nrztwHH3anRuPqzF8wv1VhBPDh\nXhLSavUJQSeNYGbAU+BFtFM11YDBTk3V3ZiZpf2DC6QAFMiJrwzWnLA7Ot3SaX6jmQmLCcATWk/k\nb8PdiwNP6MeDVQZgwfpDPHpbATHxQPMV0MPov4AUsnXt2vVrzVeAExYbLFChEzUyltp8FcqAdWe3\nn+7mbrhxuuvRvWt4KhPb+OowvEA1rTotFRzoSZvVh6YiYAK50gg60inNWSxpl1Py2roxNWCw05j3\ntWal4iPGR5cKNM0fXEEKzrxYWrDmMMjgwoULva7ydUcHLORgHFp4ABT58WCNkUVG2/Ljofnsiiuu\nyN0Ppr+45JJLPNysvfba/jjWIqCJ5istQBMyC8Aqbb7KCfDVxvgJEyPoe9Tdc/eXc4jFz6d1nxGT\n5z/3bN3lrrfVh6YhmkGz0kSErAAj8lowDdRDAwY79dB6g+QJ4NAWX8/eH8WqEnAgvP32227rrbd2\nN910k/dd2XLLLf1xwIPeVGF3dFl45GAsh2V/QfQDpLAANsCKgAcLD9NR/OY3v/HnQ6dlrmUU5+OO\nO86DDJYboEkWIgYPZJt0yY+8JUfYtBaXRTKVsu63T3/XrXsPd/bwYaVcVre4jKuzfY9uqXHClSJq\nbfUhP/xy+JORlTFtkJlvBfJmRWbdX1s3hgYMdhrjPtalFDT98AHDupOFAPCwjBs3zs9kPm/ePPfq\nq6/mHIWPPvpoPxIsIIFFR01HrOUMLMgQPGHhEfA89NBDOeChmQmn4rOjEYs5Hg+Mtjxx4kTfTAXs\nhP46pAd0Vbv5Ki4D1ol+e/dzvxh3pRtwwH7x06naX7rsQ3fYoYdGTZGD/D1KlXAxYUKrD1DCUs0A\nLJBH1qwkyIuFB9mrrZNq6tfSakwNGOw05n1t9VLpw5X25qu4IoAUAIVZ0bfffnv/L7OY7ujAD8BD\nsxIggkWGjzbj+JAeC+kBLbLyPPfcc+6QQw7xIjDwIB96Blr87W+/nH183XXX9QMQrrLKKv46riUd\nAr2sBFuh/1Cpva98Ygk//NOm0pz36Dy3wjdXcDNvmpVa/x1A57jjjvfw2NIAgglFresh3g8WBf4g\nVBJIi950DKuQRWDAb46Ar5EF00AtNWCwU0ttN1BefGgxo+vjlaWiATxYdfbbbz/38ssve7DYdttt\n3dKlS30xnnzySQ8zYXd0wOONN97wXcMBHmCHwQHxUyI9QZSsNAAPlh0G/0NXs2fPzo3HwwjP+tgD\nL1iaGDsH0CFd5Ss/HZqx5Dsky1Il+qbCBLxw1ib07burey9ymk6rw/Kpp
w1zb731Rzdj+rRU+4UV\nc09CawzPBUspgfeNa3j3shiqBWt0MsDnjoAP3FlnnZVFdVRV5qlTp7pjjz02lybfpFqGSy+91E+X\nQ54vvPCCwz8yTcFgp4p3gzFc8AkhxF/AQueqKEJNksJHB6sAH64sBsEJ1p0ddtjBT/fAGDg77bST\nL044O/qyZcty3dFxbGZUZABF0AGcEJQmwEIzFM1XckzGdwerkHprYcHhYxB+oOldhDwCHaw9jBHE\nOrQqkZ/y9BmX+COL3KeffurTZ5+mSLqj33v3XakCHiw6o0ef7z788MOGAJ34reL9Cd+hlqw+xMWq\ngzUxzZ0B4uWM7+sPkoA/fr6YfX1PN9100wiE3yrmkoaIE777U6ZMccccc0yuXPWGHQTRfQF0+Mal\nKSyXJmFMltpoQBUma16QUgMfqSw7Gar8/DvGURnfGEYkHjVqlFfFww8/7LuNAy0ADt3CGSMH6ABU\nsN6ouUm6U5pA0CuvvJIDnRtuuMFtsMEGvkmK67EKAUadOnVy1113nS53EyZMcFdffbW3/nCQdARV\nYdMV+ZQbuG8AFaCz1VZb+WY4QAd5aB4686wz3VGRT8ytt99ZbhZVu05NV40KOihq48hCA+BoATy1\nhBAkpfK8Mrt4lkGHslAORnumKbWcgAVBfyrDPwzlpGXXVFcDuh801ZdTt1RXmuapGew014fttaAB\nPsIMzqd/Zy1ET+1poIFK5r333vOmV/xrgBr8bgiMj0OlomYprDL0tqJ5imOAEMADKCgIRHB0JuCE\nfNBBB3lIoimKpjAsN4AM12Ht4YMADBGALHx6gBECceQfpLT9iTJ+uF+DBw/2V1JhUqlS2ZIHC+U5\n7LDD3HkR8I0cMdzRxbtegUEDe0f3hmZAusZnvXIvVo+CHtYEgQ++YQRZVP1Ohn947hjUk/KUGrBq\nATuE1VdfvZllo9S0Gi0+Vh69z6zrEQ488EB/X8h7+PDh3gpZDzmS8lwu6aAdMw3k0wAfKCbQbIQK\n6IknnvD/lOnuTdMV1ptrrrkmV3RBi6aIiAOPYCf8sDCQIH5AzH2F1QirDJYjIIdFY/aQiZq8+De0\n5557+nxxPB0wYIB7J4JKAASwygdXOUELbKjLL/+kCfgHYeHh/unDiByCut12+2k04OJE9+Tjj7m9\n9urn6O5dq4A1h5nMGR354rFjW5zNvFZy1SMfgEDwwzaWVCAYS1wjBOCb57DUwJ8DgIcQNuGwzzn+\nFLDId4UKV8dY826pgwDXxAMgRVNMeE1oSYrHZ5/0SFfXrLHGGl4WrE86xlrWKKXBflw+4nEsLiPf\nJ86FgTJyTBaUsPyKi67Y1oKc8RCPc9tttzWLgn+U8lI6yKN8w8gAKMBDIN14WmHcmm9HH7y6hsi3\npSlqdwVDcwvHonbYZnJFD3bufKTQr50P02A7HiJH0SbSDfNhOymv8Npi5eOaUAauC0Ohc8SL/tU3\nhWVEtmgqg6aoXTZMJrcdpoeuuD5qJ82VDx3FZSC9ePm1ny+fXIZfbUSg0xQ52MYPZ3b/d7/7XVM0\n0F9T1DzVFEFP01/+8pemCOhyejrggAOaIoflpmeeeaYp+gA1LVq0qGnJkiVNPE/RAIBNEQg1RbDg\nyx9NRZG7bs6cOf54BBFNkTWo6bPPPmuK/H+aIifnpsiK1BTN09UUwVBT9MFoirqgN02ePLnpzDPP\nzF3PfYnAy+f10Ucf+bxIh/TIT3kWUjzyRH4/Pk3WyKTA9aRFuSlH9GFqisYG8vlFH+GmaOLRpmHD\nzvLP9CmnntEUzaWlS6u+/mDpsqYxYy9r6vCDDk3HHHt80+LFi6ueR9YTHDp0aBNLowSeN55x1qWE\n8LvHNy8MfMP0PeNbGn4PdVzr+LV8QwvF53sa+aCE2flt0lGa8XX0J6bZubBOI614/Ph++P0u5tsd\nlp+0FCL4yOVFOeKBfJQ35/m2KYTnF
Cdco+d4uPXWW3PpodO0hP9opMYSlfpwEZ8bIUWHSo7f5PiD\nzIMVXqs0wnX8mlLlQ33hixg+qC2dK+eBiucVliXc5iVRKOaFUdx8a9KudmWErrmH3FNkTLpXHOMc\ncYjLNdUKgACVe9RM5aEk8hNpihyGc8/a+PHjPfAAKVGTQtMf/vCHpqjnVlNkNfHXCHgiPxh/DUCo\nEFlnPFBE/8r9NcDO/Pnzm+6///6mW265pSn6d9t07bXXNl1//fVN0WzsTZGPTy5fdH3iiSc2RXN5\nNUXzbDVF00s0y68Q8ACkAh3kAnwIAiVBGIDHx43yADmvv/56U+Rz1BRZp5qiMYg8RA8cNNjLBPQ8\n8+x8Fa3iNQAlyDnk0MOboslPK06zURPgHlZbP/V+7yhTCOAt3bs4IMTjx+uB8DsY345XwoVAR9fy\nDQpBgO2kb5Xix9fhNyv8fsfjhfvKr5hvd7z80k/8eBzaQhhiWyGEllCm+Ha8rkPmME48P6Vf63Xd\nYKechysOBXp4wgcnvFkos9gHkjTCUI58oRzxByDfuXIfqDC98MFK2iYPQjEvTKiD+DYVJlaQagXu\nX/iiJcle6BjX6hmoRKbIf8BDBtASNVX5f5vR3FVN7du3z720WHeeeuqppgULFngYAAywhGCxweIS\njYrs4wIY+rcq64nSxCIUTU/hLTsPPvig/9Bzb6Ju6R58br/99qbIH8pvR6b0XN6Rk7TPE6sT+ZEe\nsgJSScCDBUB6A7xCeYjPtYAd8AREAVMAHJCD9SrqPdb0/PPPe7CTJYsP1siR53jrS8+evZrOPmdU\n0/0PPFSy2oGlqyZMavr5Mcd5GbHkVLsSL1moDFzA/axWSMt7x3M6atSooosVfv/jsEIi8UqdOKpo\nqQfi3z+BRPy68Ntd6FwoD/cnvC5u1eG8vlVxa1B4Xfycvt1Skt5r1sgWhrisOheHjzA/4oTAFuYX\n1jGhLilHqMs4BJJmeG08P87XI9TFZ4e2vrBNMlIGb7JfohsW3ccvQ/SRbtYuiG9DpESd9o5qpBVV\nPP5YpHTf5TsXIdrgPOkohHmRngJpqH2xXPmUVilr2mcVogfKd9dDF9ED5Wfk1jnajcNy6LjWYbmi\nB1aH/Vq6jl4kr+PwJPomv8hiEh5O3MaPZOPIf6AaAV1vs802OZ2Xk2Y10iBffCOYnBPHYRb52Dzw\nwAM5sU4//XSvp8gi4h2V5aysbuQXXnihjxtZVJr5w8jvJgIM39OK6zkWTkuhbuYRKHkn5rXWWsv3\n1Np///19ms8++6zXFZOH4jeEkzTpkY7eGyLin0NZrrrqKn9dVJl4J9DQP0fyIDdl+Pvf/+7n42LN\nwjHSjqDIt/MjJwvv3dlnj3CvLnzVDYl8ar694jfdZZeM9XGYdiKCFu/UjGNzfMEP59DDjnAdO3T0\nvb1ejXqrMQEpz/OUayZ7mb3A9pOoARyVIytI4rlSD1bjnalGGsiN/5Gcr4sph75jxA3rgXzX8h3k\nm0pIqhv0PcUnRYHvYFgvsM+3VYG6QQE9KMSv45p839SwHMgV5hdBRLOySUblU86aPKI/hrlLw/zZ\nVh7EI38Cx1Wvsh/qEt2zT3wC14e64Fh4f8J0OFevUBfYKffhQkkhDPHgyTOfc3EY4lh4E5IeyPCm\n6CGoRD7yLDZU+kApn3i5eLDDh1sPs+KXu+bD1DOqTCsNPPw4vFVDLtIgrUpeKACOCoUA7NA9HOBZ\nb731vEMvxyNLh8OhGVgABoACAc+h0TQGBJyMGaxPAWAAbgALAEVLCDuADjDChwPYwbFZXdSBFWZk\nJ3AtlUPkO9QMeEiffCKrm+/hgoykA3RpGg8BkWQnLaBJk46yZv8t+iwAAEAASURBVB85iUOQHnCw\nRh8sHCNQxnNGnu0ee2yev4ennTrUde7Uwa280rc9BAFC4bJCdNkxPz/aPfDgA27RG4vc1ClXuyMj\nB
9VGcHL3CmnlHyC2GrpK43tH2YoN+j4TP/xuJ13P+XgcgU88fpiuKvswTvgtRYd8c1haui4pLdKl\nntI7GVldfFbUOdRlOBBX8i0L5Q63Q1nC+i3cJo4AJiwbeovrknhxvYT5hfHDtMI4td5evtYZkl9Y\n+PAmSBaUKIuHHi7dBOJTuYuw9WCg3JCQlVaYV9LDjgUlHsJrSpUvnlah/TCfQg9UvKzxNJPKxbEQ\n9OLX1HOf8sRBh/vHfec+J5UHedEX1/GChrrjGGmG/8BKKZ+sVerBIOsOH6SDDz7Yd0OPHIr9BJ6a\nHR0wABCw6GAV4trI7yZnEeHaOFzIasI5AQQ9tDQNBWkALoIjoAq4nDFjhhs4cKAvEqM+n3POOe74\n44/3cYGy++67z9FzDGjZaKON/NAAGj+Hi0gTWQReyIHsLAI2zhFUdsmlHmQcl5XHR/zqh0oYGVks\ntI4GqvUnI43vHXBebAi/GaoP8l0bVrb54ui46hD2k3orKZ7WoRw6lvTNKiSDvlmq55ROa635tvKn\nkEDefD+ROfyOhvASlpE4+jbmky+MH49T6Fw8bmvu1wV2ynm4wocbqKEiD5UYWnyksDAfjhV6+HQN\n6/C6Yh/+UL4wrULbofyVPFDFlquQLMWcw/rBP/JKQ/hvgrS4d/lMvmFeIXiSBt0fFeJp6nipa15q\nKnUqd6waVPZAFOkDBjQtMQYPIEIX88ip2GdBRcJYOkAD1wp0BC6hVYc8iAPkMPYOiwYOJF2uAYYE\nIhtHlicg66ijjnKRj4276KKL/HQXkYOzY0b1SZMmeRm6dOnimOqCZxGgIiBHKEscdMiL8wRkAp6w\nLGlBRll30Auyt/Th84nZT+o0EH9H6v3e8VyXEsLvZSnXpS0u5aAJP6xnkJFvIN9yviXxc5WWgW8C\n3089A6zJS3+Idb7SfNJ8/XJpFi6fbDws8Qc/JNR819nxyjVQ6gcqKcfwXvGCFwM68XRk4dPxME0d\nq2RNxQ5wqDmLua0IwEjUO8vDhORm9OXddtvNQwqwo0WAA2CwcC1WFtIGogAKQEdrtsP5sNgHNpCB\nj9Gdd97p+vTp4+XAj2fDDTfMgU7//v19UxvNYuQhaw4gA9CQv/xz1GyFfAIdgCaEL8mFnJwDhJDb\nQnY1EL4jaX3vCmm3tf7UhenKr5E/C/mWML7kDXWrY/mAJYQZ0qJ1gbyAz6TWCaVX6Tr8s4i8Ah/S\n5RzfGIVwm3P5dKHjScYGpZWWdV2+XuHDUs7DlWT6Sxr4KbxhKDzfwxe/GZXKF08v334oX6M8UPnK\nmu94qOt8cfIdr+TafGlyXNaLEHi6d+/umL+KEPWaauabw4suoAkBh201FQEcAh3gAYgALljYDhfg\nBysRi2Y8B3iQJ+q94icw9YJ89cMgj9E4Pd6vh3yAKlmIkAsZkkAHeSgraQt0lC8yCHSAPvKWXsK8\nbTubGqjk3ank2kq0FX4v4392K0k3bIJKgpaktNFBKE8IDoqfdIxz4XGgM9Qn5Sq2nlI+xa7154z4\nWHRCOcImLM7HdVKJvsPykXa9Ql1gJ67IUgoPFesm8bApLW5G6KxMmpwPH8ikh4jRLvUR1/VKkzSK\nffiJW2qI51PJA1Vq3uXExz+jlN4TxeShe1lM3HicSq6NpxXf55mggseiAZwABCNGjHCdO3f2UeVY\nSC8twEA+MFoLMliHFhQ1FQl0SJdFPjzKS/CBhUUL8BF1f/cWnlBepu8AdtSjSjJoX47I7CMPQMQ/\nMspH3oIrwArYCUEHefV+hHnadrY1UMm7U8m1lWgtrDSTvuXlph1aPPgjrXqA9MgHVwa9A4yurBAC\nAvVSeB3bHGsphO4Y6DVsmm/p2lLrC+rCsKySL36cfKmbpG/yQa6wLuTasO5UWpI5vD9hPafz9VjX\nBXZChZfycKH00KqDyS90SkXh8RcxfCB5ANVGibJJK3xgJJfWihM
+xIUe/lJvYKUPVKn5JcUPy590\nPjyG02spvSfCa8PtUL/cr/h9COMmbSMz9yS812GaSdeUe4zKXs1ZwAZ+MmHAqoIVRXDD1BMsgIWg\nI7TqABdJoCOoCPMDOgAdAIQ1eY8cOTKXPc1pyEbAUfpnP/uZi8bO8fmzBnIkD2vkQVYC+VCeMH3y\nQzbBl2TiQ58UeBYej/y4xo+fEPUau8h3L6eLeXxhRvXxEyb6bvDvRMMXWChdA4343vHHiZ6DxYaw\n0gwr02Kvzxcvbl3hexTCTVhnhM1M4TZph9exnS+E5QAgBA1xoMh3vY4rvzho6HzSOuk7ybHQKKDr\nwvIhJ35G0kvYmxYoCq1GXB+CUVhepV2PdV0clFEMlZUeWG5avocjVDhxVDlzc0hHVKqKj5sQ9rDi\n+vBhyOdwDBTpppQrXzk3EPnkJa8HKimdpAcqKV6px6T7Yp0Vq/XR1f1CXp4FFvSv+5lUDq7h/ocv\nkuIlvcQ619Kaj+7GCc6SvNiygAh4mJk8DFhV+vXr560lNAsRD0jAFwbfnRB0wuYrQENQgYWFIKiI\n73Oc3lZz58718X74wx+6yy67zIMKztKnnXaa1wnnmaWdMTDowo7skgNZ1GylsgA2AI4gB5kkf1wG\nn3H0A6w8/vgT7s45d7l777nL7d1v32guoe+7tddZxx3xVY8xxdU6GrAwgq4v3K233eHwLYoGJXR9\nog/sDtv3iLZ7Kpqt82gAHY0ePTrP2eIP846k6b3jW8IfqGID32jVE3wD+BYkVdLFphfGw52CuiHp\n26J48bFz+Cbz3dT3W/G05tvOdy0eqF+ok1SXhec5x3EBlupIxeEbWUhGxcu3Jn3pUHFCg4COsZYs\n8fhhHHSA7sKA/GHZKvk2h+lWvB19EOsSIiApOBdJVLBmI1IyEibHtEQPXk7uQueIFD2Quet0fbiO\nHqBmw4BzTanycU1043P5hPK1dI64oTzxbdJFnjCEeUUPW3jKb4dpRg9ts/OUN54HOmopMNItow1X\nGhjRM0mGuEzF7ifdv1JkbGkk1wgS/DxSTPOQJBOjHkfQ4UcCjiwdTVF32ibWHNPxp59+uol5uN58\n800/NUP0MfAjIUcQkjgKMnlGoOLn6oocoHP5RmDj59eKAM2PxMzIzo9F92XQoEG5OBFU+fm2yJtn\ng4Vt5GLKi+hj6aeFiMDFjwKNLJEVyE9rkU8ehvVnSgfKz7QRt9x2RxNzWpUTGHmZEZgZiZnlxmjK\nDGSwkKyBao1cnrb3LpqU1j+3yaVOPhp+NyKobxYp/M5HFWyzc9oJ39/4N5U4fDfDPIjP9zPpG6s0\nOUd+SptvM7JwXMdYo38F6qwIMnLndQ3nI0jKHee6UM74dZzXtzssP8fzhbB8ESw2kyvpGvKMy4S8\n8TpO13JfyJ8l331Q3Fqu82ukRlIU+3CFNwhFxwMPpBTMDQwfEOJyw8I4xC10w5R+sfIRn/QkQ/xB\nKHSOa0t9oML0kl5E8pcscdgp9MIgS77AnFiR2Tnf6ZKOI0N4TyVrqWvSIK1KAgDX0hw9wEfUtdvr\nlPhMsRBZRPw+cPHQQw81RePd+Ak+77333qaoq3gT68ja0jRv3jw/zQRTRbz33nt+igbgImpSSgQd\nlQU4iiw0Po/ICtMUjbeTm94BaBLwADLMtTVmzJjcPUePQ4YM8eVCrugfvZ/uAtBhCohobKCmP//5\nz03M2RU1b+WdfgKQEpQwzUO5gKMyxddAExDFJKBXXTU+ftr2v9IA97MaQJim9w5AB3hKCWGFHv+u\nlZJOLeKG32DqpLYSQogTiKWh7HWHnTQowWQoXgPMjaVJJYu/Kn9MPgghuBULO1wTB8r8uRQ+U0xF\nEo1n40Ei8nHJTcwZzo7OHFQAE/NczZo1yy/MdcXs5lh50BkAziSjmk8Lyw0QlRTCiTyjZis/V1Xk\nF+Tns2IWdObZAlqYwwqYAq7
ImxnUQx1GfgB+FneAGKsOwAW0Ms8WoEOagq5QFuIwb5WHkAhyWjtg\n7QF6ACsAy0JzDRQD5M2vKLyXhveOb0mp9xrrCODAM16MVaKwFio7G/5Z43sU/sHmfZOcyErcthC4\nP/r+oJM0hW8gTCScBdNAURo4MhpUkHb2U045paj4xUaibTr0yYn+xTa7NPpwNPPpiV6kZufL3YmA\nxftD4LeTL3Duxz/+sT9Nt/O99trL97DCCRnHYHpCEfCr2GSTTbyfDj4v+MRo3iucEHHGVFdy/HeI\nIz8dn8BXP+hW81vhAK35tvC/YVGXdhyQI2DxSwRQ3jkZmfDPiaw8jrm0CMh0xRVXuA022MD78mhK\nCvkM4adDkCwPP/yIO/mkk9zue+7thg073bVvt54/X4uf8RMnu8kTxrvDI/8fpqSw8KUGeLbwl4qa\n/Kqqknq9d5Sl3A4P+MHIjySCtlYdm6aQskM5CsXjXGTh+JoTb0vXZPF8qJO0ldlgJ4tPVB1l5mPL\nnEuF4KCO4pWUNaBDeXBO1jxSSQnwUX7ppZcc4BFZbzxw0KsJ2AAy9thjD/fGG2/4S4EKIAInZXo6\nAThM7Ans0HUf2AGCAAzBhfLEYZN5pzSEPnNjSS7+k0SWF583Ts9AjWCH60LY4TyBiUyZ5oKATPTm\nYpRlAZfkALrkkDxmzFg3c8Z0d8GYi92AA/bz19b6Z+Fri3w3f/KdMf3LiVVrLUOa8uP+8pyeeuqp\n3vGzGvNk1bt8+oZQrnICPYNw1OVPT2RRKSeJqlxDD6rQ6Tsp0ai5zcNO0rlGOsYfVLrms44sWX5S\n6zSV78tuIGmSyGRJtQaojPlXxpL1QC8XelMBO1FTk1/iZeIfNaADtOjDLDgAaNgOe2hFPgi+FxRg\nwiLDqa5hDeTouPIDHpEH0CGvpIk8SQ+rDaDFmgVLD4E0kQeLEWDDQs8n5tEiAEDsR/49/npZiSQj\n8tBFfMFvfuNuuHF63UAHWbt07ujuuXuO7+XVv/9+DQHWlKuUwPPw+FfPJO8a1r6o2ccfKyWdtMYF\ndsJJc0uVE6sBActUUo+nUtMrN37UXOVBJqlHE72xdL7c9LN0nXqYYYWnR2jagll20nZHMiAPTVkE\nVf5+J4M/yK9/mBKfCkaBSmbw4MF+F4sOH2dZWAAOrCtYVKJ2at8tXGBx0EEH+RnI6dLNiy/LDtYd\nxsxRF29ZdrAwoVOapKjQ2MeaJCACSIAT4AZo0fg9WHZYkENj6AiAuAawAn4YA+iwww5TsdwJJ5zg\nLSfIhyzEOfvscxxdxK+Zck1Nm61yQuXZOPW0YW7uffe62bfMLqmbcp7kUnuYZ41Fgfsft+DwrPJs\nhM+o4mdpjfy8S1isLJgGaqUBg51aabqB8uGjzMeYdfyDnKViYtHBciN4i8vOeWY0x9JCJcM+MCK/\nGSCDwfuAnchp2PvFRL2yfDLnn3++N7HjH4OO1IyFD0/YfEQ8FsJWW23lKzLiC3RkgQGuAB0NXkje\nbOO/w3HicQ2LIMonGv2wj5xnnnlmzuTPeDz8O15vvfUiHVzgyzll6pRUgY7kF/DMXzA/08+byqN1\nCC08WyyFAnBAHKw+LcUtlE69z2HBZOHds2AaqJUGDHZqpekGywdA4IOb1Q8WVh1kB9iSAueAEEBH\nUBf1UHIsgAWAoaHj5STMKMWHHnqoBxCsJTfddJMfsA/gIR3W+MvgywOsnHHGGblZ06NuuA6ZCIIW\nWXTIC6gR6GDFiUMOTVha1FSm67H2sE1g1OU77rjDb2PVOfron7mFry50s2b/KpWg4wWNfgCet976\nY6Z9eICU0JpBhV9q4LkEkkJQKjWNesZHbjWFZ/mPUj11aHmXpwGDnfL01uavAgDo5UPln7V/mVQ4\nWKby+Q1QKan3Vdh8BYSETUm/ifxboq7kHlwAkU6dOrloHB134okn+udjxx13zDUX0XwF6GDZi
ca3\ncQcffLBvNiLiDTfckGsuE+gAVOSFRYe046DDcVlxNCKymqTUuwrAIZ7AiG2OUeFEXem9jDh4zrxp\nluvRvavfT/NPv336u+222zYzvbR4zniWFJKapnSu2LWsO1gay4GlYvNprXjIzAK0WTAN1FIDBju1\n1HaD5YXTJB9zKs8shZbkplJS7ysqFQJgIYsO4IFlBksOzUNYWrC+0DuEeFhOosHb/HW//OUv3dZb\nb+0dhrHoROPi5LqgAifRyMq5aUq4ILTGkGYIOmwDLkALQT45NIux4IODRSmEnTANrmefcnDf6J4+\nZuxl7ujBA316af+hl9b+/fd1l1x6SUXOra1ZzvBdwHLBs1TtAKTL1yxL1hHJzR8lC6aBWmvAYKfW\nGm+g/GQhAR5YshCojDCjU9knWaSSmq8AGCAES0sIOnIOlpWFZiRAAwih59OyZcu8SugyTKUXDern\nrrnmGn+sXbt27sEHH3Q/+MEPfPMT14SgA9SwyBlZQAWoEMiLHleCHNbAEwvnCMgN3ITpCJguvHCM\n22DDDd3UKc3n+vIXpvjn+mkz3E0zboyGALgzFf47VNwsClgtahHIh2cKgMhC4H1D5qxapLKgY5Ox\nsAYMdgrrx862oAE+YjT5RCMEt8q/2BayL+m0mgCoII78qkdZmIDKwrFCzVdADlYd1sAEkALkABoA\nCNYV8sIJmLD22mt7B2biKQBdW2yxRa43FE7EnAecSFOQA5ywcFygQ14h6GDREewItkiP+Cxx4Inm\n+HKjR5/v7rt/ru/mLZmysmZW9W7durohJ59UF5G5dwoAM0utA4Al2El6lmstT6H8eBcAHf5kjLbm\nq0KqsnOtqAGDnVZUbltJGsdaLDtUAq1htq+GHvXBRT7kTQqcK7b5CtgBQoAJLCmADs1UgAfAAWww\nxgazlYeB8ThGjBjhormtPMBwDeBCZSAwSQIdrDSkCUgRn3y0sM/COSxELITQIgUsYeFB5p///Fi3\n48493dnDh4WiZWZ77oMPu1OHnOxq1TsLCOb5UeBepSFgJQF00m4tUTfzxwNITIP+TIa2pQGDnbZ1\nv1uttHx0qRT4oKXNj0Cgg1z5Prj844z3vgphAUiQn46ar2jWAkAADaAFJ2QABOgg4J/D6MoK++23\nn4dC4qv5ifjs47tDfqTJObq4AydACgGAAaLC62TN4frQooNMBNIjyMJDWgsWLHDHHXe8e+LJJ1Pd\n+8oLXuCnNa07PC88ywpAcNqeacmGlZJm0rRaVtP8XZAObd02NGCw0zbuc01KqQ8blpO0WHgEOijg\n8TwgVm7zFTABZAAs9LRiAUCAHXRw8sknf03vjJAsQAqbvYhIMxZNTn/961/da6+95iGF4+uvv77b\naKONcqAj4AFyWLAsyU9HoMN1BAEPaQM9Z5xxpltlte+6MReO9uez+iPrzqI3FlWlCDwbCoBNWp5f\nyZS0pilLYJZGy6q+B/neu6Qy2THTQGtpwGCntTTbRtPlA4dZnQ9cvSsMKgNM6BtHPhXAR75/58hZ\nbvMV4KFu5Vh32B82bFhuCokddtjBD+bXr18//0Qwl87ZZ5/tgQdrDWAEqAApgImsMKw5xjkGLGQR\nHDHvDH5AgFYh0AkfQdJm8MPte2zv7phzVyZ9dcLysN2rV283dOiQsnpm8WywKKSlaUrytLSW7Dzb\nBJ5vgCefP5qPVKOfYv5g1EgUy8Y0kNPAf0Xm+9G5PdswDVSoAeCCUXl33313DxfdunWrMMXyLuej\njwyMZ0NFAIQkBR5/Jshk0D8AjXiAgSwhWFrUhIUvjfx0ABWsKlh15KvDOaAG52bC8ccf7/Ned911\nHb2vGF2ZuXyooNinmYqFPOREDOSQt6w/yLPOOuu4zTff3PfcYk1FRzrvv/++n64CfcctOvGycp7e\nX0uWfOCGDqmPY29cpkr3P/v8C/fyy6+4n+7at8WkqIBxz
EZ3LLLecC9YshSQnxDKDbB37NgxaqI8\nzo/9tNtuu/k4tf7hHSJv3vvZs2fn/YNRa7ksP9OAWXbsGWgVDdA0FFpVwg9zq2T4VaJUaliX+OgC\nMvxjz2dhqmbzFbOeU9GwZqRkYOuII47wlhogCT+fAQMGuGeffdZLOnPmTG+pUTMTVhoWWW+AHBaB\nlPaxBMmiA8AwenPoX4Ke8+maiT5XX2PNzDomx58bjbuTrykLvfA8EAQ38TSytp8EOmEZOM97R+AZ\n5PmvRUDPvG/8sWCdlaEoaqEbyyMdGviy20Y6ZDEpGkgDAAaVDR9bRloGQPShbo1i6mOrip68+OCy\nH8JAmDcyEfbZZ59cBcE+lhUchWVtwWLDgoMv52TVEYDcf//9btddd/Wgg28NfjlMIEo8FpqaAJSr\nrrqK5H0YOnSoTxMIYqF3F1Ye0ie+eneFjs8cC5u9gB0qcXSshcQBPS2q7Dn+wvO/cT133onNhgjM\njt6uffvc/aWsKjdr7r30kg94s6QIvT+UK1/gHM87wMPCM67r8l1T6XEAR/mSt4FOpRq161tDA2bZ\naQ2tWprNNMDHln9706dPd8wBxcewWpUPafOx5V8saZIPFVwYqAQFXjpOvEp7X+GQfPnll+f8c7bc\ncksPOsx0jsWGRdBETy6sMPS6Ouqoo7wYvXv3dgcccIDfpkkM3x+sQlzPmqklOAZUyZoDCBFaarby\nkaIfyi3g6dWrl5dJ5xphfcyxx7s333jd3/dGsd4k3RcBSyHQiV/Hfedd03sH+MTfjfg1xe6TNu8c\n7x6BbVmU/AH7MQ2kTAPLpUweE6cBNcAHmo8i82gR+OAKTPgHXmqgAhfcYDWiIpBTdNLHXNYP8hL4\naKZx5OK84ASfGSw4surIr4bjBGADMMHSQ7PV1Vd/OQLx4Ycf7p2cQ9DBSiMrEdBDGvhV7LXXXj6t\nefPmeWsQcdScJWsQFhxZcgAdwQ4XFgs6xEXP0skhhx7OoYYKG2+yqevdexdfxmoBdNoUVA7oUAae\na713vIPADmsAqJz3DjlID6jhOec9ZJ/jBjppe2pMnrgGzLIT14jt10QDwIkA5d1333U777yz/xDz\nMSbwoWbhQ0oQpPCBJVCB84FlIV6xgeu5FisLzVfIQAA2gBEgB5DRmDrxwQOxsvzlL39xwA29mwjX\nXnutHzwQCAmhCcABnOSzwzxan332mV+oeOhiTmAKCVl2KMuaa66ZmyUdyw7QIwjyF5TxAxy++94H\nbtwVl5dxdXovoQv6zBkz3KybZ6ZXyAok0/Ov96KCpPyleueAHXogbrXVVv69C0GRbb1n+d473qFq\nyVRpmex600AxGjDYKUZLFqdVNRB+UNkm8JFnO/wI6wNbyUdWzVfk8cknn3hQAlBkgQlBJ2nwQGY6\nHzJkCJd7AGG+q2222cbvAzukAzSp+Yr0gB3giUU+OoAOwEPo0qWLGzlypO/ZRdMVwEPvMDVjAUJY\ndki/FKuOT/yrn/ETJrrPv/hHwzgnq2yNDDvVBh3pTOti3zveQd658F1UGrY2DWRFAwY7WblTJmfF\nGuDfKvN4EaZNm+Y/4FiUgB3Biaww4dxXnAc26KI+fvx4fz0jHD/wwAO+S7ggBMgR6Kj5i/QEO+pm\njrWH/GjGuuKKK3x6jM2DLHRlx5oj2NHYPaFjsr+gxJ9xV17l/vHPfzcc7Cxd9qFbv327XDNgiWpJ\nbfTWBp3UFtwEMw20kgaWa6V0LVnTQOo0IEsKzVdsYynCnE9zlGAHIMEawxL2vjrmmGNyoIPPDV3I\nv//97+csLQKdeDOYrENKC58fjbjMmDw4KRNwdAaKCFiHJAfpIRvph749PqL9ZHrKi3y3T01IWFMs\nmAZMA9XRgMFOdfRoqaRcAzRf4aOAxQSnSgIWm5122skP0PfWW295wAA4AB0gA7jAx2bHHXd0Cxcu\n9NcwieesWbPcWmut5
UEnbAIToKipSk1XHAdW8LtRl3LkwMkTuThGOOGEE7wDNHGBI0EX17PPcfJj\nsdCYGgB0gBwDnca8v1aq+mnAYKd+ureca6QBKpBCva86d+7sYYLJFAELwQnXaZoHRJ08ebKf+oEm\nJcBFoCNrjpqrBDnsc454xOc6HJzpsg7ssGy44YaOAQYJOD5PnDjRx+c65AjhCwsPAEYw4PFqcAws\n2OEHHb7cyfivQKcUh/uMF9nENw3UTAMGOzVTtWVULw2EzVdhF1nAQc1XTMnAfFPPPPOMB58bbrgh\nmndpqBcZB+F77rnHj4As3xlOcH0IJVh05OujZjDiqbs6/jeADj45Wtjffvvtc5OG3n777b4nDFYc\npU1asu6oSYt0Swkan6eUa7IQ970lS9zW22ybBVELymigU1A9dtI0ULEGlq84BUvANFCBBtQjBN8Z\nbSclJ9M+PULUOyQpXvxYvuYrQEXNRbKgADIMDMjknbKcbLrppg4AYeZxzuOozDmuBTy4FhjBAsPC\nPpDCeQKQQTMVFh18dVjYJi0cm0mLOMOHD3d33HGHW7p0qe/tBVzRzEV6nNeChUgO0dqOlzlp///9\n3/+6dxa/nXQq08fozp/1YKCT9Tto8mdBAwY7WbhLDSYjPU0Y7+PGyHdGY30IYICTpECFwHWMF8N0\nDPSGwkpzZORonK9LLNcUar7CD0bWE6DiT3/6k9tjjz1yoIMDMlNBYH0R6CBbCDqCHPnXyBGZeFwT\nBx32sRQJXoAd4AX4onfXD3/4Qy51F154ofvlL3+Z891RfK1D0OH6lgI6mvfYEy1Fy9z5P/7xLdex\nQ3absQx0MvfImcAZ1YB1Pc/ojcui2MANCx94DQjYs2fPkgYFVLk1OBrp4eMAJMUHGKSCB6aKGTyQ\naRx+/vOfK3l34okn+vSw3jCODtYYQAM4AWiAI4EOa6CJ44IXgU5ozZFFJxwNmQzVlEY69913n59X\ni+Onn366l534XEvTF+BFcxjQRB7IVAzsYDXT6M6k3SiB6SJ6dO/qoTdrZTLQydodM3mzrAGDnSzf\nvYzIDphocsAkKKm0GAAPC5UHlp8jI2sP+RQ799XNN9+cswAhCyMa9+jRw8MFIPLmm286oIwQBx35\n0xCPAHzEQQfgwZoji46sMkAKcCTfIQAK52ag69e//rVPj+YsYI5rSUc+P2wDPICQ0vMXFPjp1au3\nO3P4CLf7T/sWiJWtUx07dHSzb5md17qX1tIY6KT1zphcjaoBg51GvbMpKBfNToBHCCGtKRZ+P+QH\nHGDRIcyZM8dbaIAKltCKgkPxGWec4X1lJBeWlXbt2uWsKEAG8ILlp1u3bs0sOoBO3D9HUAKMYI1h\nEZSoCYq8QmsMctE0BkiRJsCz+eabe8sReT/66KM+PunIyZm1IArgIb0wTZVHayw7hxxymHfmHXPh\naB3O9PrZ5xa4o44c5Ba9sShT5TDQydTtMmEbRAPWG6tBbmTaioGlhWYkFkFPa8uI9YW81OOKeX+0\nTd6yoAAo+OccfPDBOdDZOBrb5KmnnnL0ygIqWAQnXLfddtv5EY+XLVvmp3ygyQlLjByRgRLABggJ\nF45xLmy6SoISrDPEAZa4Zvbs2V5dABCjNgNEoVWJvCkH8IZ8BOIoADeyqHEPaMJ64IH73ROPPaoo\nmV/fd/9c12/f/pkqB0DOs2bdyzN120zYBtCAWXYa4CamrQhYV6hoWdT8U2sZ+fcsKw/WnVVXXdWD\nAZaT1157ze2yyy7egoJcgwcP9ousMvjGCFKAEIACuOBa0gWEGFQQuABcgBmOcQ3WFjUxkZ4gpyXL\nC2mFMAZMXXTRRW7ChAledQAP0CKokv+OLEjvv/++e+WVV7zzNhWqLFuh3oE/xvK57fY7vZ9LeC6L\n2zTLDR06pBnQprkc3Jd6vQ9p1ovJZhqohQYMdmqh5TaSB9YELCmyKvAPtp4BOQCedyJ
[... base64-encoded PNG image data elided ...]\n", + "text/plain": [ + "" + ] + }, + "execution_count": 112, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "Image(filename='sentiment_network_sparse_2.png')" + ] + }, + { + "cell_type": "code", + "execution_count": 113, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('edie', 4.6913478822291435),\n", + " ('paulie', 4.0775374439057197),\n", + " ('felix', 3.1527360223636558),\n", + " ('polanski', 2.8233610476132043),\n", + " ('matthau', 2.8067217286092401),\n", + " ('victoria', 2.6810215287142909),\n", + " ('mildred', 2.6026896854443837),\n", + " ('gandhi', 2.5389738710582761),\n", + " ('flawless', 2.451005098112319),\n", + " ('superbly', 2.2600254785752498),\n", + " ('perfection', 2.1594842493533721),\n", + " ('astaire', 2.1400661634962708),\n", + " ('captures', 2.0386195471595809),\n", + " ('voight', 2.0301704926730531),\n", + " ('wonderfully', 2.0218960560332353),\n", + " ('powell', 1.9783454248084671),\n", + " ('brosnan', 1.9547990964725592),\n", + " ('lily', 1.9203768470501485),\n", + " ('bakshi', 1.9029851043382795),\n", + " ('lincoln', 1.9014583864844796),\n", + " ('refreshing', 
1.8551812956655511),\n", + " ('breathtaking', 1.8481124057791867),\n", + " ('bourne', 1.8478489358790986),\n", + " ('lemmon', 1.8458266904983307),\n", + " ('delightful', 1.8002701588959635),\n", + " ('flynn', 1.7996646487351682),\n", + " ('andrews', 1.7764919970972666),\n", + " ('homer', 1.7692866133759964),\n", + " ('beautifully', 1.7626953362841438),\n", + " ('soccer', 1.7578579175523736),\n", + " ('elvira', 1.7397031072720019),\n", + " ('underrated', 1.7197859696029656),\n", + " ('gripping', 1.7165360479904674),\n", + " ('superb', 1.7091514458966952),\n", + " ('delight', 1.6714733033535532),\n", + " ('welles', 1.6677068205580761),\n", + " ('sadness', 1.663505133704376),\n", + " ('sinatra', 1.6389967146756448),\n", + " ('touching', 1.637217476541176),\n", + " ('timeless', 1.62924053973028),\n", + " ('macy', 1.6211339521972916),\n", + " ('unforgettable', 1.6177367152487956),\n", + " ('favorites', 1.6158688027643908),\n", + " ('stewart', 1.6119987332957739),\n", + " ('hartley', 1.6094379124341003),\n", + " ('sullivan', 1.6094379124341003),\n", + " ('extraordinary', 1.6094379124341003),\n", + " ('brilliantly', 1.5950491749820008),\n", + " ('friendship', 1.5677652160335325),\n", + " ('wonderful', 1.5645425925262093),\n", + " ('palma', 1.5553706911638245),\n", + " ('magnificent', 1.54663701119507),\n", + " ('finest', 1.5462590108125689),\n", + " ('jackie', 1.5439233053234738),\n", + " ('ritter', 1.5404450409471491),\n", + " ('tremendous', 1.5184661342283736),\n", + " ('freedom', 1.5091151908062312),\n", + " ('fantastic', 1.5048433868558566),\n", + " ('terrific', 1.5026699370083942),\n", + " ('noir', 1.493925025312256),\n", + " ('sidney', 1.493925025312256),\n", + " ('outstanding', 1.4910053152089213),\n", + " ('mann', 1.4894785973551214),\n", + " ('pleasantly', 1.4894785973551214),\n", + " ('nancy', 1.488077055429833),\n", + " ('marie', 1.4825711915553104),\n", + " ('marvelous', 1.4739999415389962),\n", + " ('excellent', 1.4647538505723599),\n", + " ('ruth', 
1.4596256342054401),\n", + " ('stanwyck', 1.4412101187160054),\n", + " ('widmark', 1.4350845252893227),\n", + " ('splendid', 1.4271163556401458),\n", + " ('chan', 1.423108334242607),\n", + " ('exceptional', 1.4201959127955721),\n", + " ('tender', 1.410986973710262),\n", + " ('gentle', 1.4078005663408544),\n", + " ('poignant', 1.4022947024663317),\n", + " ('gem', 1.3932148039644643),\n", + " ('amazing', 1.3919815802404802),\n", + " ('chilling', 1.3862943611198906),\n", + " ('captivating', 1.3862943611198906),\n", + " ('fisher', 1.3862943611198906),\n", + " ('davies', 1.3862943611198906),\n", + " ('darker', 1.3652409519220583),\n", + " ('april', 1.3499267169490159),\n", + " ('kelly', 1.3461743673304654),\n", + " ('blake', 1.3418425985490567),\n", + " ('overlooked', 1.329135947279942),\n", + " ('ralph', 1.32818673031261),\n", + " ('bette', 1.3156767939059373),\n", + " ('hoffman', 1.3150668518315229),\n", + " ('cole', 1.3121863889661687),\n", + " ('shines', 1.3049487216659381),\n", + " ('powerful', 1.2999662776313934),\n", + " ('notch', 1.2950456896547455),\n", + " ('remarkable', 1.2883688239495823),\n", + " ('pitt', 1.286210902562908),\n", + " ('winters', 1.2833463918674481),\n", + " ('vivid', 1.2762934659055623),\n", + " ('gritty', 1.2757524867200667),\n", + " ('giallo', 1.2745029551317739),\n", + " ('portrait', 1.2704625455947689),\n", + " ('innocence', 1.2694300209805796),\n", + " ('psychiatrist', 1.2685113254635072),\n", + " ('favorite', 1.2668956297860055),\n", + " ('ensemble', 1.2656663733312759),\n", + " ('stunning', 1.2622417124499117),\n", + " ('burns', 1.259880436264232),\n", + " ('garbo', 1.258954938743289),\n", + " ('barbara', 1.2580400255962119),\n", + " ('panic', 1.2527629684953681),\n", + " ('holly', 1.2527629684953681),\n", + " ('philip', 1.2527629684953681),\n", + " ('carol', 1.2481440226390734),\n", + " ('perfect', 1.246742480713785),\n", + " ('appreciated', 1.2462482874741743),\n", + " ('favourite', 1.2411123512753928),\n", + " ('journey', 
1.2367626271489269),\n", + " ('rural', 1.235471471385307),\n", + " ('bond', 1.2321436812926323),\n", + " ('builds', 1.2305398317106577),\n", + " ('brilliant', 1.2287554137664785),\n", + " ('brooklyn', 1.2286654169163074),\n", + " ('von', 1.225175011976539),\n", + " ('unfolds', 1.2163953243244932),\n", + " ('recommended', 1.2163953243244932),\n", + " ('daniel', 1.20215296760895),\n", + " ('perfectly', 1.1971931173405572),\n", + " ('crafted', 1.1962507582320256),\n", + " ('prince', 1.1939224684724346),\n", + " ('troubled', 1.192138346678933),\n", + " ('consequences', 1.1865810616140668),\n", + " ('haunting', 1.1814999484738773),\n", + " ('cinderella', 1.180052620608284),\n", + " ('alexander', 1.1759989522835299),\n", + " ('emotions', 1.1753049094563641),\n", + " ('boxing', 1.1735135968412274),\n", + " ('subtle', 1.1734135017508081),\n", + " ('curtis', 1.1649873576129823),\n", + " ('rare', 1.1566438362402944),\n", + " ('loved', 1.1563661500586044),\n", + " ('daughters', 1.1526795099383853),\n", + " ('courage', 1.1438688802562305),\n", + " ('dentist', 1.1426722784621401),\n", + " ('highly', 1.1420208631618658),\n", + " ('nominated', 1.1409146683587992),\n", + " ('tony', 1.1397491942285991),\n", + " ('draws', 1.1325138403437911),\n", + " ('everyday', 1.1306150197542835),\n", + " ('contrast', 1.1284652518177909),\n", + " ('cried', 1.1213405397456659),\n", + " ('fabulous', 1.1210851445201684),\n", + " ('ned', 1.120591195386885),\n", + " ('fay', 1.120591195386885),\n", + " ('emma', 1.1184149159642893),\n", + " ('sensitive', 1.113318436057805),\n", + " ('smooth', 1.1089750757036563),\n", + " ('dramas', 1.1080910326226534),\n", + " ('today', 1.1050431789984001),\n", + " ('helps', 1.1023091505494358),\n", + " ('inspiring', 1.0986122886681098),\n", + " ('jimmy', 1.0937696641923216),\n", + " ('awesome', 1.0931328229034842),\n", + " ('unique', 1.0881409888008142),\n", + " ('tragic', 1.0871835928444868),\n", + " ('intense', 1.0870514662670339),\n", + " ('stellar', 
1.0857088838322018),\n", + " ('rival', 1.0822184788924332),\n", + " ('provides', 1.0797081340289569),\n", + " ('depression', 1.0782034170369026),\n", + " ('shy', 1.0775588794702773),\n", + " ('carrie', 1.076139432816051),\n", + " ('blend', 1.0753554265038423),\n", + " ('hank', 1.0736109864626924),\n", + " ('diana', 1.0726368022648489),\n", + " ('adorable', 1.0726368022648489),\n", + " ('unexpected', 1.0722255334949147),\n", + " ('achievement', 1.0668635903535293),\n", + " ('bettie', 1.0663514264498881),\n", + " ('happiness', 1.0632729222228008),\n", + " ('glorious', 1.0608719606852626),\n", + " ('davis', 1.0541605260972757),\n", + " ('terrifying', 1.0525211814678428),\n", + " ('beauty', 1.050410186850232),\n", + " ('ideal', 1.0479685558493548),\n", + " ('fears', 1.0467872208035236),\n", + " ('hong', 1.0438040521731147),\n", + " ('seasons', 1.0433496099930604),\n", + " ('fascinating', 1.0414538748281612),\n", + " ('carries', 1.0345904299031787),\n", + " ('satisfying', 1.0321225473992768),\n", + " ('definite', 1.0319209141694374),\n", + " ('touched', 1.0296194171811581),\n", + " ('greatest', 1.0248947127715422),\n", + " ('creates', 1.0241097613701886),\n", + " ('aunt', 1.023388867430522),\n", + " ('walter', 1.022328983918479),\n", + " ('spectacular', 1.0198314108149955),\n", + " ('portrayal', 1.0189810189761024),\n", + " ('ann', 1.0127808528183286),\n", + " ('enterprise', 1.0116009116784799),\n", + " ('musicals', 1.0096648026516135),\n", + " ('deeply', 1.0094845087721023),\n", + " ('incredible', 1.0061677561461084),\n", + " ('mature', 1.0060195018402847),\n", + " ('triumph', 0.99682959435816731),\n", + " ('margaret', 0.99682959435816731),\n", + " ('navy', 0.99493385919326827),\n", + " ('harry', 0.99176919305006062),\n", + " ('lucas', 0.990398704027877),\n", + " ('sweet', 0.98966110487955483),\n", + " ('joey', 0.98794672078059009),\n", + " ('oscar', 0.98721905111049713),\n", + " ('balance', 0.98649499054740353),\n", + " ('warm', 0.98485340331145166),\n", + " ('ages', 
0.98449898190068863),\n", + " ('glover', 0.98082925301172619),\n", + " ('guilt', 0.98082925301172619),\n", + " ('carrey', 0.98082925301172619),\n", + " ('learns', 0.97881108885548895),\n", + " ('unusual', 0.97788374278196932),\n", + " ('sons', 0.97777581552483595),\n", + " ('complex', 0.97761897738147796),\n", + " ('essence', 0.97753435711487369),\n", + " ('brazil', 0.9769153536905899),\n", + " ('widow', 0.97650959186720987),\n", + " ('solid', 0.97537964824416146),\n", + " ('beautiful', 0.97326301262841053),\n", + " ('holmes', 0.97246100334120955),\n", + " ('awe', 0.97186058302896583),\n", + " ('vhs', 0.97116734209998934),\n", + " ('eerie', 0.97116734209998934),\n", + " ('lonely', 0.96873720724669754),\n", + " ('grim', 0.96873720724669754),\n", + " ('sport', 0.96825047080486615),\n", + " ('debut', 0.96508089604358704),\n", + " ('destiny', 0.96343751029985703),\n", + " ('thrillers', 0.96281074750904794),\n", + " ('tears', 0.95977584381389391),\n", + " ('rose', 0.95664202739772253),\n", + " ('feelings', 0.95551144502743635),\n", + " ('ginger', 0.95551144502743635),\n", + " ('winning', 0.95471810900804055),\n", + " ('stanley', 0.95387344302319799),\n", + " ('cox', 0.95343027882361187),\n", + " ('paris', 0.95278479030472663),\n", + " ('heart', 0.95238806924516806),\n", + " ('hooked', 0.95155887071161305),\n", + " ('comfortable', 0.94803943018873538),\n", + " ('mgm', 0.94446160884085151),\n", + " ('masterpiece', 0.94155039863339296),\n", + " ('themes', 0.94118828349588235),\n", + " ('danny', 0.93967118051821874),\n", + " ('anime', 0.93378388932167222),\n", + " ('perry', 0.93328830824272613),\n", + " ('joy', 0.93301752567946861),\n", + " ('lovable', 0.93081883243706487),\n", + " ('hal', 0.92953595862417571),\n", + " ('mysteries', 0.92953595862417571),\n", + " ('louis', 0.92871325187271225),\n", + " ('charming', 0.92520609553210742),\n", + " ('urban', 0.92367083917177761),\n", + " ('allows', 0.92183091224977043),\n", + " ('impact', 0.91815814604895041),\n", + " 
('gradually', 0.91629073187415511),\n", + " ('lifestyle', 0.91629073187415511),\n", + " ('italy', 0.91629073187415511),\n", + " ('spy', 0.91289514287301687),\n", + " ('treat', 0.91193342650519937),\n", + " ('subsequent', 0.91056005716517008),\n", + " ('kennedy', 0.90981821736853763),\n", + " ('loving', 0.90967549275543591),\n", + " ('surprising', 0.90937028902958128),\n", + " ('quiet', 0.90648673177753425),\n", + " ('winter', 0.90624039602065365),\n", + " ('reveals', 0.90490540964902977),\n", + " ('raw', 0.90445627422715225),\n", + " ('funniest', 0.90078654533818991),\n", + " ('pleased', 0.89994159387262562),\n", + " ('norman', 0.89994159387262562),\n", + " ('thief', 0.89874642222324552),\n", + " ('season', 0.89827222637147675),\n", + " ('secrets', 0.89794159320595857),\n", + " ('colorful', 0.89705936994626756),\n", + " ('highest', 0.8967461358011849),\n", + " ('compelling', 0.89462923509297576),\n", + " ('danes', 0.89248008318043659),\n", + " ('castle', 0.88967708335606499),\n", + " ('kudos', 0.88889175768604067),\n", + " ('great', 0.88810470901464589),\n", + " ('baseball', 0.88730319500090271),\n", + " ('subtitles', 0.88730319500090271),\n", + " ('bleak', 0.88730319500090271),\n", + " ('winner', 0.88643776872447388),\n", + " ('tragedy', 0.88563699078315261),\n", + " ('todd', 0.88551907320740142),\n", + " ('nicely', 0.87924946019380601),\n", + " ('arthur', 0.87546873735389985),\n", + " ('essential', 0.87373111745535925),\n", + " ('gorgeous', 0.8731725250935497),\n", + " ('fonda', 0.87294029100054127),\n", + " ('eastwood', 0.87139541196626402),\n", + " ('focuses', 0.87082835779739776),\n", + " ('enjoyed', 0.87070195951624607),\n", + " ('natural', 0.86997924506912838),\n", + " ('intensity', 0.86835126958503595),\n", + " ('witty', 0.86824103423244681),\n", + " ('rob', 0.8642954367557748),\n", + " ('worlds', 0.86377269759070874),\n", + " ('health', 0.86113891179907498),\n", + " ('magical', 0.85953791528170564),\n", + " ('deeper', 0.85802182375017932),\n", + " ('lucy', 
0.85618680780444956),\n", + " ('moving', 0.85566611005772031),\n", + " ('lovely', 0.85290640004681306),\n", + " ('purple', 0.8513711857748395),\n", + " ('memorable', 0.84801189112086062),\n", + " ('sings', 0.84729786038720367),\n", + " ('craig', 0.84342938360928321),\n", + " ('modesty', 0.84342938360928321),\n", + " ('relate', 0.84326559685926517),\n", + " ('episodes', 0.84223712084137292),\n", + " ('strong', 0.84167135777060931),\n", + " ('smith', 0.83959811108590054),\n", + " ('tear', 0.83704136022001441),\n", + " ('apartment', 0.83333115290549531),\n", + " ('princess', 0.83290912293510388),\n", + " ('disagree', 0.83290912293510388),\n", + " ('kung', 0.83173334384609199),\n", + " ('adventure', 0.83150561393278388),\n", + " ('columbo', 0.82667857318446791),\n", + " ('jake', 0.82667857318446791),\n", + " ('adds', 0.82485652591452319),\n", + " ('hart', 0.82472353834866463),\n", + " ('strength', 0.82417544296634937),\n", + " ('realizes', 0.82360006895738058),\n", + " ('dave', 0.8232003088081431),\n", + " ('childhood', 0.82208086393583857),\n", + " ('forbidden', 0.81989888619908913),\n", + " ('tight', 0.81883539572344199),\n", + " ('surreal', 0.8178506590609026),\n", + " ('manager', 0.81770990320170756),\n", + " ('dancer', 0.81574950265227764),\n", + " ('con', 0.81093021621632877),\n", + " ('studios', 0.81093021621632877),\n", + " ('miike', 0.80821651034473263),\n", + " ('realistic', 0.80807714723392232),\n", + " ('explicit', 0.80792269515237358),\n", + " ('kurt', 0.8060875917405409),\n", + " ('traditional', 0.80535917116687328),\n", + " ('deals', 0.80535917116687328),\n", + " ('holds', 0.80493858654806194),\n", + " ('carl', 0.80437281567016972),\n", + " ('touches', 0.80396154690023547),\n", + " ('gene', 0.80314807577427383),\n", + " ('albert', 0.8027669055771679),\n", + " ('abc', 0.80234647252493729),\n", + " ('cry', 0.80011930011211307),\n", + " ('sides', 0.7995275841185171),\n", + " ('develops', 0.79850769621777162),\n", + " ('eyre', 0.79850769621777162),\n", + " 
('dances', 0.79694397424158891),\n", + " ('oscars', 0.79633141679517616),\n", + " ('legendary', 0.79600456599965308),\n", + " ('importance', 0.79492987486988764),\n", + " ('hearted', 0.79492987486988764),\n", + " ('portraying', 0.79356592830699269),\n", + " ('impressed', 0.79258107754813223),\n", + " ('waters', 0.79112758892014912),\n", + " ('empire', 0.79078565012386137),\n", + " ('edge', 0.789774016249017),\n", + " ('environment', 0.78845736036427028),\n", + " ('jean', 0.78845736036427028),\n", + " ('sentimental', 0.7864791203521645),\n", + " ('captured', 0.78623760362595729),\n", + " ('styles', 0.78592891401091158),\n", + " ('daring', 0.78592891401091158),\n", + " ('backgrounds', 0.78275933924963248),\n", + " ('frank', 0.78275933924963248),\n", + " ('matches', 0.78275933924963248),\n", + " ('tense', 0.78275933924963248),\n", + " ('gothic', 0.78209466657644144),\n", + " ('sharp', 0.7814397877056235),\n", + " ('achieved', 0.78015855754957497),\n", + " ('court', 0.77947526404844247),\n", + " ('steals', 0.7789140023173704),\n", + " ('rules', 0.77844476107184035),\n", + " ('colors', 0.77684619943659217),\n", + " ('reunion', 0.77318988823348167),\n", + " ('covers', 0.77139937745969345),\n", + " ('tale', 0.77010822169607374),\n", + " ('rain', 0.7683706017975328),\n", + " ('denzel', 0.76804848873306297),\n", + " ('stays', 0.76787072675588186),\n", + " ('blob', 0.76725515271366718),\n", + " ('conventional', 0.76214005204689672),\n", + " ('maria', 0.76214005204689672),\n", + " ('fresh', 0.76158434211317383),\n", + " ('midnight', 0.76096977689870637),\n", + " ('landscape', 0.75852993982279704),\n", + " ('animated', 0.75768570169751648),\n", + " ('titanic', 0.75666058628227129),\n", + " ('sunday', 0.75666058628227129),\n", + " ('spring', 0.7537718023763802),\n", + " ('cagney', 0.7537718023763802),\n", + " ('enjoyable', 0.75246375771636476),\n", + " ('immensely', 0.75198768058287868),\n", + " ('sir', 0.7507762933965817),\n", + " ('nevertheless', 0.75067102469813185),\n", + " 
('driven', 0.74994477895307854),\n", + " ('performances', 0.74883252516063137),\n", + " ('memories', 0.74721440183022114),\n", + " ('nowadays', 0.74721440183022114),\n", + " ('simple', 0.74641420974143258),\n", + " ('golden', 0.74533293373051557),\n", + " ('leslie', 0.74533293373051557),\n", + " ('lovers', 0.74497224842453125),\n", + " ('relationship', 0.74484232345601786),\n", + " ('supporting', 0.74357803418683721),\n", + " ('che', 0.74262723782331497),\n", + " ('packed', 0.7410032017375805),\n", + " ('trek', 0.74021469141793106),\n", + " ('provoking', 0.73840377214806618),\n", + " ('strikes', 0.73759894313077912),\n", + " ('depiction', 0.73682224406260699),\n", + " ('emotional', 0.73678211645681524),\n", + " ('secretary', 0.7366322924996842),\n", + " ('influenced', 0.73511137965897755),\n", + " ('florida', 0.73511137965897755),\n", + " ('germany', 0.73288750920945944),\n", + " ('brings', 0.73142936713096229),\n", + " ('lewis', 0.73129894652432159),\n", + " ('elderly', 0.73088750854279239),\n", + " ('owner', 0.72743625403857748),\n", + " ('streets', 0.72666987259858895),\n", + " ('henry', 0.72642196944481741),\n", + " ('portrays', 0.72593700338293632),\n", + " ('bears', 0.7252354951114458),\n", + " ('china', 0.72489587887452556),\n", + " ('anger', 0.72439972406404984),\n", + " ('society', 0.72433010799663333),\n", + " ('available', 0.72415741730250549),\n", + " ('best', 0.72347034060446314),\n", + " ('bugs', 0.72270598280148979),\n", + " ('magic', 0.71878961117328299),\n", + " ('verhoeven', 0.71846498854423513),\n", + " ('delivers', 0.71846498854423513),\n", + " ('jim', 0.71783979315031676),\n", + " ('donald', 0.71667767797013937),\n", + " ('endearing', 0.71465338578090898),\n", + " ('relationships', 0.71393795022901896),\n", + " ('greatly', 0.71256526641704687),\n", + " ('charlie', 0.71024161391924534),\n", + " ('brad', 0.71024161391924534),\n", + " ('simon', 0.70967648251115578),\n", + " ('effectively', 0.70914752190638641),\n", + " ('march', 
0.70774597998109789),\n", + " ('atmosphere', 0.70744773070214162),\n", + " ('influence', 0.70733181555190172),\n", + " ('genius', 0.706392407309966),\n", + " ('emotionally', 0.70556970055850243),\n", + " ('ken', 0.70526854109229009),\n", + " ('identity', 0.70484322032313651),\n", + " ('sophisticated', 0.70470800296102132),\n", + " ('dan', 0.70457587638356811),\n", + " ('andrew', 0.70329955202396321),\n", + " ('india', 0.70144598337464037),\n", + " ('roy', 0.69970458110610434),\n", + " ('surprisingly', 0.6995780708902356),\n", + " ('sky', 0.69780919366575667),\n", + " ('romantic', 0.69664981111114743),\n", + " ('match', 0.69566924999265523),\n", + " ('britain', 0.69314718055994529),\n", + " ('beatty', 0.69314718055994529),\n", + " ('affected', 0.69314718055994529),\n", + " ('cowboy', 0.69314718055994529),\n", + " ('wave', 0.69314718055994529),\n", + " ('stylish', 0.69314718055994529),\n", + " ('bitter', 0.69314718055994529),\n", + " ('patient', 0.69314718055994529),\n", + " ('meets', 0.69314718055994529),\n", + " ('love', 0.69198533541937324),\n", + " ('paul', 0.68980827929443067),\n", + " ('andy', 0.68846333124751902),\n", + " ('performance', 0.68797386327972465),\n", + " ('patrick', 0.68645819240914863),\n", + " ('unlike', 0.68546468438792907),\n", + " ('brooks', 0.68433655087779044),\n", + " ('refuses', 0.68348526964820844),\n", + " ('award', 0.6824518914431974),\n", + " ('complaint', 0.6824518914431974),\n", + " ('ride', 0.68229716453587952),\n", + " ('dawson', 0.68171848473632257),\n", + " ('luke', 0.68158635815886937),\n", + " ('wells', 0.68087708796813096),\n", + " ('france', 0.6804081547825156),\n", + " ('handsome', 0.68007509899259255),\n", + " ('sports', 0.68007509899259255),\n", + " ('rebel', 0.67875844310784572),\n", + " ('directs', 0.67875844310784572),\n", + " ('greater', 0.67605274720064523),\n", + " ('dreams', 0.67599410133369586),\n", + " ('effective', 0.67565402311242806),\n", + " ('interpretation', 0.67479804189174875),\n", + " ('works', 
0.67445504754779284),\n", + " ('brando', 0.67445504754779284),\n", + " ('noble', 0.6737290947028437),\n", + " ('paced', 0.67314651385327573),\n", + " ('le', 0.67067432470788668),\n", + " ('master', 0.67015766233524654),\n", + " ('h', 0.6696166831497512),\n", + " ('rings', 0.66904962898088483),\n", + " ('easy', 0.66895995494594152),\n", + " ('city', 0.66820823221269321),\n", + " ('sunshine', 0.66782937257565544),\n", + " ('succeeds', 0.66647893347778397),\n", + " ('relations', 0.664159643686693),\n", + " ('england', 0.66387679825983203),\n", + " ('glimpse', 0.66329421741026418),\n", + " ('aired', 0.66268797307523675),\n", + " ('sees', 0.66263163663399482),\n", + " ('both', 0.66248336767382998),\n", + " ('definitely', 0.66199789483898808),\n", + " ('imaginative', 0.66139848224536502),\n", + " ('appreciate', 0.66083893732728749),\n", + " ('tricks', 0.66071190480679143),\n", + " ('striking', 0.66071190480679143),\n", + " ('carefully', 0.65999497324304479),\n", + " ('complicated', 0.65981076029235353),\n", + " ('perspective', 0.65962448852130173),\n", + " ('trilogy', 0.65877953705573755),\n", + " ('future', 0.65834665141052828),\n", + " ('lion', 0.65742909795786608),\n", + " ('victor', 0.65540685257709819),\n", + " ('douglas', 0.65540685257709819),\n", + " ('inspired', 0.65459851044271034),\n", + " ('marriage', 0.65392646740666405),\n", + " ('demands', 0.65392646740666405),\n", + " ('father', 0.65172321672194655),\n", + " ('page', 0.65123628494430852),\n", + " ('instant', 0.65058756614114943),\n", + " ('era', 0.6495567444850836),\n", + " ('ruthless', 0.64934455790155243),\n", + " ('saga', 0.64934455790155243),\n", + " ('joan', 0.64891392558311978),\n", + " ('joseph', 0.64841128671855386),\n", + " ('workers', 0.64829661439459352),\n", + " ('fantasy', 0.64726757480925168),\n", + " ('accomplished', 0.64551913157069074),\n", + " ('distant', 0.64551913157069074),\n", + " ('manhattan', 0.64435701639051324),\n", + " ('personal', 0.64355023942057321),\n", + " ('pushing', 
0.64313675998528386),\n", + " ('meeting', 0.64313675998528386),\n", + " ('individual', 0.64313675998528386),\n", + " ('pleasant', 0.64250344774119039),\n", + " ('brave', 0.64185388617239469),\n", + " ('william', 0.64083139119578469),\n", + " ('hudson', 0.64077919504262937),\n", + " ('friendly', 0.63949446706762514),\n", + " ('eccentric', 0.63907995928966954),\n", + " ('awards', 0.63875310849414646),\n", + " ('jack', 0.63838309514997038),\n", + " ('seeking', 0.63808740337691783),\n", + " ('colonel', 0.63757732940513456),\n", + " ('divorce', 0.63757732940513456),\n", + " ('jane', 0.63443957973316734),\n", + " ('keeping', 0.63414883979798953),\n", + " ('gives', 0.63383568159497883),\n", + " ('ted', 0.63342794585832296),\n", + " ('animation', 0.63208692379869902),\n", + " ('progress', 0.6317782341836532),\n", + " ('concert', 0.63127177684185776),\n", + " ('larger', 0.63127177684185776),\n", + " ('nation', 0.6296337748376194),\n", + " ('albeit', 0.62739580299716491),\n", + " ('adapted', 0.62613647027698516),\n", + " ('discovers', 0.62542900650499444),\n", + " ('classic', 0.62504956428050518),\n", + " ('segment', 0.62335141862440335),\n", + " ('morgan', 0.62303761437291871),\n", + " ('mouse', 0.62294292188669675),\n", + " ('impressive', 0.62211140744319349),\n", + " ('artist', 0.62168821657780038),\n", + " ('ultimate', 0.62168821657780038),\n", + " ('griffith', 0.62117368093485603),\n", + " ('emily', 0.62082651898031915),\n", + " ('drew', 0.62082651898031915),\n", + " ('moved', 0.6197197120051281),\n", + " ('profound', 0.61903920840622351),\n", + " ('families', 0.61903920840622351),\n", + " ('innocent', 0.61851219917136446),\n", + " ('versions', 0.61730910416844087),\n", + " ('eddie', 0.61691981517206107),\n", + " ('criticism', 0.61651395453902935),\n", + " ('nature', 0.61594514653194088),\n", + " ('recognized', 0.61518563909023349),\n", + " ('sexuality', 0.61467556511845012),\n", + " ('contract', 0.61400986000122149),\n", + " ('brian', 0.61344043794920278),\n", + " 
('remembered', 0.6131044728864089),\n", + " ('determined', 0.6123858239154869),\n", + " ('offers', 0.61207935747116349),\n", + " ('pleasure', 0.61195702582993206),\n", + " ('washington', 0.61180154110599294),\n", + " ('images', 0.61159731359583758),\n", + " ('games', 0.61067095873570676),\n", + " ('academy', 0.60872983874736208),\n", + " ('fashioned', 0.60798937221963845),\n", + " ('melodrama', 0.60749173598145145),\n", + " ('peoples', 0.60613580357031549),\n", + " ('charismatic', 0.60613580357031549),\n", + " ('rough', 0.60613580357031549),\n", + " ('dealing', 0.60517840761398811),\n", + " ('fine', 0.60496962268013299),\n", + " ('tap', 0.60391604683200273),\n", + " ('trio', 0.60157998703445481),\n", + " ('russell', 0.60120968523425966),\n", + " ('figures', 0.60077386042893011),\n", + " ('ward', 0.60005675749393339),\n", + " ('shine', 0.59911823091166894),\n", + " ('brady', 0.59911823091166894),\n", + " ('job', 0.59845562125168661),\n", + " ('satisfied', 0.59652034487087369),\n", + " ('river', 0.59637962862495086),\n", + " ('brown', 0.595773016534769),\n", + " ('believable', 0.59566072133302495),\n", + " ('bound', 0.59470710774669278),\n", + " ('always', 0.59470710774669278),\n", + " ('hall', 0.5933967777928858),\n", + " ('cook', 0.5916777203950857),\n", + " ('claire', 0.59136448625000293),\n", + " ('broadway', 0.59033768669372433),\n", + " ('anna', 0.58778666490211906),\n", + " ('peace', 0.58628403501758408),\n", + " ('visually', 0.58539431926349916),\n", + " ('falk', 0.58525821854876026),\n", + " ('morality', 0.58525821854876026),\n", + " ('growing', 0.58466653756587539),\n", + " ('experiences', 0.58314628534561685),\n", + " ('stood', 0.58314628534561685),\n", + " ('touch', 0.58122926435596001),\n", + " ('lives', 0.5810976767513224),\n", + " ('kubrick', 0.58066919713325493),\n", + " ('timing', 0.58047401805583243),\n", + " ('struggles', 0.57981849525294216),\n", + " ('expressions', 0.57981849525294216),\n", + " ('authentic', 0.57848427223980559),\n", + " 
('helen', 0.57763429343810091),\n", + " ('pre', 0.57700753064729182),\n", + " ('quirky', 0.5753641449035618),\n", + " ('young', 0.57531672344534313),\n", + " ('inner', 0.57454143815209846),\n", + " ('mexico', 0.57443087372056334),\n", + " ('clint', 0.57380042292737909),\n", + " ('sisters', 0.57286101468544337),\n", + " ('realism', 0.57226528899949558),\n", + " ('personalities', 0.5720692490067093),\n", + " ('french', 0.5720692490067093),\n", + " ('surprises', 0.57113222999698177),\n", + " ('adventures', 0.57113222999698177),\n", + " ('overcome', 0.5697681593994407),\n", + " ('timothy', 0.56953322459276867),\n", + " ('tales', 0.56909453188996639),\n", + " ('war', 0.56843317302781682),\n", + " ('civil', 0.5679840376059393),\n", + " ('countries', 0.56737779327091187),\n", + " ('streep', 0.56710645966458029),\n", + " ('tradition', 0.56685345523565323),\n", + " ('oliver', 0.56673325570428668),\n", + " ('australia', 0.56580775818334383),\n", + " ('understanding', 0.56531380905006046),\n", + " ('players', 0.56509525370004821),\n", + " ('knowing', 0.56489284503626647),\n", + " ('rogers', 0.56421349718405212),\n", + " ('suspenseful', 0.56368911332305849),\n", + " ('variety', 0.56368911332305849),\n", + " ('true', 0.56281525180810066),\n", + " ('jr', 0.56220982311246936),\n", + " ('psychological', 0.56108745854687891),\n", + " ('branagh', 0.55961578793542266),\n", + " ('wealth', 0.55961578793542266),\n", + " ('performing', 0.55961578793542266),\n", + " ('odds', 0.55961578793542266),\n", + " ('sent', 0.55961578793542266),\n", + " ('reminiscent', 0.55961578793542266),\n", + " ('grand', 0.55961578793542266),\n", + " ('overwhelming', 0.55961578793542266),\n", + " ('brothers', 0.55891181043362848),\n", + " ('howard', 0.55811089675600245),\n", + " ('david', 0.55693122256475369),\n", + " ('generation', 0.55628799784274796),\n", + " ('grow', 0.55612538299565417),\n", + " ('survival', 0.55594605904646033),\n", + " ('mainstream', 0.55574731115750231),\n", + " ('dick', 
0.55431073570572953),\n", + " ('charm', 0.55288175575407861),\n", + " ('kirk', 0.55278982286502287),\n", + " ('twists', 0.55244729845681018),\n", + " ('gangster', 0.55206858230003986),\n", + " ('jeff', 0.55179306225421365),\n", + " ('family', 0.55116244510065526),\n", + " ('tend', 0.55053307336110335),\n", + " ('thanks', 0.55049088015842218),\n", + " ('world', 0.54744234723432639),\n", + " ('sutherland', 0.54743536937855164),\n", + " ('life', 0.54695514434959924),\n", + " ('disc', 0.54654370636806993),\n", + " ('bug', 0.54654370636806993),\n", + " ('tribute', 0.5455111817538808),\n", + " ('europe', 0.54522705048332309),\n", + " ('sacrifice', 0.54430155296238014),\n", + " ('color', 0.54405127139431109),\n", + " ('superior', 0.54333490233128523),\n", + " ('york', 0.54318235866536513),\n", + " ('pulls', 0.54266622962164945),\n", + " ('hearts', 0.54232429082536171),\n", + " ('jackson', 0.54232429082536171),\n", + " ('enjoy', 0.54124285135906114),\n", + " ('redemption', 0.54056759296472823),\n", + " ('madness', 0.540384426007535),\n", + " ('hamilton', 0.5389965007326869),\n", + " ('stands', 0.5389965007326869),\n", + " ('trial', 0.5389965007326869),\n", + " ('greek', 0.5389965007326869),\n", + " ('each', 0.5388212312554177),\n", + " ('faithful', 0.53773307668591508),\n", + " ('received', 0.5372768098531604),\n", + " ('jealous', 0.53714293208336406),\n", + " ('documentaries', 0.53714293208336406),\n", + " ('different', 0.53709860682460819),\n", + " ('describes', 0.53680111016925136),\n", + " ('shorts', 0.53596159703753288),\n", + " ('brilliance', 0.53551823635636209),\n", + " ('mountains', 0.53492317534505118),\n", + " ('share', 0.53408248593025787),\n", + " ('dealt', 0.53408248593025787),\n", + " ('providing', 0.53329847961804933),\n", + " ('explore', 0.53329847961804933),\n", + " ('series', 0.5325809226575603),\n", + " ('fellow', 0.5323318289869543),\n", + " ('loves', 0.53062825106217038),\n", + " ('olivier', 0.53062825106217038),\n", + " ('revolution', 
0.53062825106217038),\n", + " ('roman', 0.53062825106217038),\n", + " ('century', 0.53002783074992665),\n", + " ('musical', 0.52966871156747064),\n", + " ('heroic', 0.52925932545482868),\n", + " ('ironically', 0.52806743020049673),\n", + " ('approach', 0.52806743020049673),\n", + " ('temple', 0.52806743020049673),\n", + " ('moves', 0.5279372642387119),\n", + " ('gift', 0.52702030968597136),\n", + " ('julie', 0.52609309589677911),\n", + " ('tells', 0.52415107836314001),\n", + " ('radio', 0.52394671172868779),\n", + " ('uncle', 0.52354439617376536),\n", + " ('union', 0.52324814376454787),\n", + " ('deep', 0.52309571635780505),\n", + " ('reminds', 0.52157841554225237),\n", + " ('famous', 0.52118841080153722),\n", + " ('jazz', 0.52053443789295151),\n", + " ('dennis', 0.51987545928590861),\n", + " ('epic', 0.51919387343650736),\n", + " ('adult', 0.519167695083386),\n", + " ('shows', 0.51915322220375304),\n", + " ('performed', 0.5191244265806858),\n", + " ('demons', 0.5191244265806858),\n", + " ('eric', 0.51879379341516751),\n", + " ('discovered', 0.51879379341516751),\n", + " ('youth', 0.5185626062681431),\n", + " ('human', 0.51851411224987087),\n", + " ('tarzan', 0.51813827061227724),\n", + " ('ourselves', 0.51794309153485463),\n", + " ('wwii', 0.51758240622887042),\n", + " ('passion', 0.5162164724008671),\n", + " ('desire', 0.51607497965213445),\n", + " ('pays', 0.51581316527702981),\n", + " ('fox', 0.51557622652458857),\n", + " ('dirty', 0.51557622652458857),\n", + " ('symbolism', 0.51546600332249293),\n", + " ('sympathetic', 0.51546600332249293),\n", + " ('attitude', 0.51530993621331933),\n", + " ('appearances', 0.51466440007315639),\n", + " ('jeremy', 0.51466440007315639),\n", + " ('fun', 0.51439068993048687),\n", + " ('south', 0.51420972175023116),\n", + " ('arrives', 0.51409894911095988),\n", + " ('present', 0.51341965894303732),\n", + " ('com', 0.51326167856387173),\n", + " ('smile', 0.51265880484765169),\n", + " ('fits', 0.51082562376599072),\n", + " 
('provided', 0.51082562376599072),\n", + " ('carter', 0.51082562376599072),\n", + " ('ring', 0.51082562376599072),\n", + " ('aging', 0.51082562376599072),\n", + " ('countryside', 0.51082562376599072),\n", + " ('alan', 0.51082562376599072),\n", + " ('visit', 0.51082562376599072),\n", + " ('begins', 0.51015650363396647),\n", + " ('success', 0.50900578704900468),\n", + " ('japan', 0.50900578704900468),\n", + " ('accurate', 0.50895471583017893),\n", + " ('proud', 0.50800474742434931),\n", + " ('daily', 0.5075946031845443),\n", + " ('atmospheric', 0.50724780241810674),\n", + " ('karloff', 0.50724780241810674),\n", + " ('recently', 0.50714914903668207),\n", + " ('fu', 0.50704490092608467),\n", + " ('horrors', 0.50656122497953315),\n", + " ('finding', 0.50637127341661037),\n", + " ('lust', 0.5059356384717989),\n", + " ('hitchcock', 0.50574947073413001),\n", + " ('among', 0.50334004951332734),\n", + " ('viewing', 0.50302139827440906),\n", + " ('shining', 0.50262885656181222),\n", + " ('investigation', 0.50262885656181222),\n", + " ('duo', 0.5020919437972361),\n", + " ('cameron', 0.5020919437972361),\n", + " ('finds', 0.50128303100539795),\n", + " ('contemporary', 0.50077528791248915),\n", + " ('genuine', 0.50046283673044401),\n", + " ('frightening', 0.49995595152908684),\n", + " ('plays', 0.49975983848890226),\n", + " ('age', 0.49941323171424595),\n", + " ('position', 0.49899116611898781),\n", + " ('continues', 0.49863035067217237),\n", + " ('roles', 0.49839716550752178),\n", + " ('james', 0.49837216269470402),\n", + " ('individuals', 0.49824684155913052),\n", + " ('brought', 0.49783842823917956),\n", + " ('hilarious', 0.49714551986191058),\n", + " ('brutal', 0.49681488669639234),\n", + " ('appropriate', 0.49643688631389105),\n", + " ('dance', 0.49581998314812048),\n", + " ('league', 0.49578774640145024),\n", + " ('helping', 0.49578774640145024),\n", + " ('answers', 0.49578774640145024),\n", + " ('stunts', 0.49561620510246196),\n", + " ('traveling', 
0.49532143723002542),\n", + " ('thoroughly', 0.49414593456733524),\n", + " ('depicted', 0.49317068852726992),\n", + " ('honor', 0.49247648509779424),\n", + " ('combination', 0.49247648509779424),\n", + " ('differences', 0.49247648509779424),\n", + " ('fully', 0.49213349075383811),\n", + " ('tracy', 0.49159426183810306),\n", + " ('battles', 0.49140753790888908),\n", + " ('possibility', 0.49112055268665822),\n", + " ('romance', 0.4901589869574316),\n", + " ('initially', 0.49002249613622745),\n", + " ('happy', 0.4898997500608791),\n", + " ('crime', 0.48977221456815834),\n", + " ('singing', 0.4893852925281213),\n", + " ('especially', 0.48901267837860624),\n", + " ('shakespeare', 0.48754793889664511),\n", + " ('hugh', 0.48729512635579658),\n", + " ('detail', 0.48609484250827351),\n", + " ('guide', 0.48550781578170082),\n", + " ('companion', 0.48550781578170082),\n", + " ('julia', 0.48550781578170082),\n", + " ('san', 0.48550781578170082),\n", + " ('desperation', 0.48550781578170082),\n", + " ('strongly', 0.48460242866688824),\n", + " ('necessary', 0.48302334245403883),\n", + " ('humanity', 0.48265474679929443),\n", + " ('drama', 0.48221998493060503),\n", + " ('warming', 0.48183808689273838),\n", + " ('intrigue', 0.48183808689273838),\n", + " ('nonetheless', 0.48183808689273838),\n", + " ('cuba', 0.48183808689273838),\n", + " ('planned', 0.47957308026188628),\n", + " ('pictures', 0.47929937011921681),\n", + " ('broadcast', 0.47849024312305422),\n", + " ('nine', 0.47803580094299974),\n", + " ('settings', 0.47743860773325364),\n", + " ('history', 0.47732966933780852),\n", + " ('ordinary', 0.47725880012690741),\n", + " ('trade', 0.47692407209030935),\n", + " ('primary', 0.47608267532211779),\n", + " ('official', 0.47608267532211779),\n", + " ('episode', 0.47529620261150429),\n", + " ('role', 0.47520268270188676),\n", + " ('spirit', 0.47477690799839323),\n", + " ('grey', 0.47409361449726067),\n", + " ('ways', 0.47323464982718205),\n", + " ('cup', 0.47260441094579297),\n", + 
" ('piano', 0.47260441094579297),\n", + " ('familiar', 0.47241617565111949),\n", + " ('sinister', 0.47198579044972683),\n", + " ('reveal', 0.47171449364936496),\n", + " ('max', 0.47150852042515579),\n", + " ('dated', 0.47121648567094482),\n", + " ('discovery', 0.47000362924573563),\n", + " ('vicious', 0.47000362924573563),\n", + " ('losing', 0.47000362924573563),\n", + " ('genuinely', 0.46871413841586385),\n", + " ('hatred', 0.46734051182625186),\n", + " ('mistaken', 0.46702300110759781),\n", + " ('dream', 0.46608972992459924),\n", + " ('challenge', 0.46608972992459924),\n", + " ('crisis', 0.46575733836428446),\n", + " ('photographed', 0.46488852857896512),\n", + " ('machines', 0.46430560813109778),\n", + " ('critics', 0.46430560813109778),\n", + " ('bird', 0.46430560813109778),\n", + " ('born', 0.46411383518967209),\n", + " ('detective', 0.4636633473511525),\n", + " ('higher', 0.46328467899699055),\n", + " ('remains', 0.46262352194811296),\n", + " ('inevitable', 0.46262352194811296),\n", + " ('soviet', 0.4618180446592961),\n", + " ('ryan', 0.46134556650262099),\n", + " ('african', 0.46112595521371813),\n", + " ('smaller', 0.46081520319132935),\n", + " ('techniques', 0.46052488529119184),\n", + " ('information', 0.46034171833399862),\n", + " ('deserved', 0.45999798712841444),\n", + " ('cynical', 0.45953232937844013),\n", + " ('lynch', 0.45953232937844013),\n", + " ('francisco', 0.45953232937844013),\n", + " ('tour', 0.45953232937844013),\n", + " ('spielberg', 0.45953232937844013),\n", + " ('struggle', 0.45911782160048453),\n", + " ('language', 0.45902121257712653),\n", + " ('visual', 0.45823514408822852),\n", + " ('warner', 0.45724137763188427),\n", + " ('social', 0.45720078250735313),\n", + " ('reality', 0.45719346885019546),\n", + " ('hidden', 0.45675840249571492),\n", + " ('breaking', 0.45601738727099561),\n", + " ('sometimes', 0.45563021171182794),\n", + " ('modern', 0.45500247579345005),\n", + " ('surfing', 0.45425527227759638),\n", + " ('popular', 
0.45410691533051023),\n", + " ('surprised', 0.4534409399850382),\n", + " ('follows', 0.45245361754408348),\n", + " ('keeps', 0.45234869400701483),\n", + " ('john', 0.4520909494482197),\n", + " ('defeat', 0.45198512374305722),\n", + " ('mixed', 0.45198512374305722),\n", + " ('justice', 0.45142724367280018),\n", + " ('treasure', 0.45083371313801535),\n", + " ('presents', 0.44973793178615257),\n", + " ('years', 0.44919197032104968),\n", + " ('chief', 0.44895022004790319),\n", + " ('shadows', 0.44802472252696035),\n", + " ('closely', 0.44701411102103689),\n", + " ('segments', 0.44701411102103689),\n", + " ('lose', 0.44658335503763702),\n", + " ('caine', 0.44628710262841953),\n", + " ('caught', 0.44610275383999071),\n", + " ('hamlet', 0.44558510189758965),\n", + " ('chinese', 0.44507424620321018),\n", + " ('welcome', 0.44438052435783792),\n", + " ('birth', 0.44368632092836219),\n", + " ('represents', 0.44320543609101143),\n", + " ('puts', 0.44279106572085081),\n", + " ('fame', 0.44183275227903923),\n", + " ('closer', 0.44183275227903923),\n", + " ('visuals', 0.44183275227903923),\n", + " ('web', 0.44183275227903923),\n", + " ('criminal', 0.4412745608048752),\n", + " ('minor', 0.4409224199448939),\n", + " ('jon', 0.44086703515908027),\n", + " ('liked', 0.44074991514020723),\n", + " ('restaurant', 0.44031183943833246),\n", + " ('flaws', 0.43983275161237217),\n", + " ('de', 0.43983275161237217),\n", + " ('searching', 0.4393666597838457),\n", + " ('rap', 0.43891304217570443),\n", + " ('light', 0.43884433018199892),\n", + " ('elizabeth', 0.43872232986464677),\n", + " ('marry', 0.43861731542506488),\n", + " ('oz', 0.43825493093115531),\n", + " ('controversial', 0.43825493093115531),\n", + " ('learned', 0.43825493093115531),\n", + " ('slowly', 0.43785660389939979),\n", + " ('bridge', 0.43721380642274466),\n", + " ('thrilling', 0.43721380642274466),\n", + " ('wayne', 0.43721380642274466),\n", + " ('comedic', 0.43721380642274466),\n", + " ('married', 0.43658501682196887),\n", + 
" ('nazi', 0.4361020775700542),\n", + " ('murder', 0.4353180712578455),\n", + " ('physical', 0.4353180712578455),\n", + " ('johnny', 0.43483971678806865),\n", + " ('michelle', 0.43445264498141672),\n", + " ('wallace', 0.43403848055222038),\n", + " ('silent', 0.43395706390247063),\n", + " ('comedies', 0.43395706390247063),\n", + " ('played', 0.43387244114515305),\n", + " ('international', 0.43363598507486073),\n", + " ('vision', 0.43286408229627887),\n", + " ('intelligent', 0.43196704885367099),\n", + " ('shop', 0.43078291609245434),\n", + " ('also', 0.43036720209769169),\n", + " ('levels', 0.4302451371066513),\n", + " ('miss', 0.43006426712153217),\n", + " ('ocean', 0.4295626596872249),\n", + " ...]" + ] + }, + "execution_count": 113, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"POSITIVE\" label\n", + "pos_neg_ratios.most_common()" + ] + }, + { + "cell_type": "code", + "execution_count": 114, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "[('boll', -4.0778152602708904),\n", + " ('uwe', -3.9218753018711578),\n", + " ('seagal', -3.3202501058581921),\n", + " ('unwatchable', -3.0269848170580955),\n", + " ('stinker', -2.9876839403711624),\n", + " ('mst', -2.7753833211707968),\n", + " ('incoherent', -2.7641396677532537),\n", + " ('unfunny', -2.5545257844967644),\n", + " ('waste', -2.4907515123361046),\n", + " ('blah', -2.4475792789485005),\n", + " ('horrid', -2.3715779644809971),\n", + " ('pointless', -2.3451073877136341),\n", + " ('atrocious', -2.3187369339642556),\n", + " ('redeeming', -2.2667790015910296),\n", + " ('prom', -2.2601040980178784),\n", + " ('drivel', -2.2476029585766928),\n", + " ('lousy', -2.2118080125207054),\n", + " ('worst', -2.1930856334332267),\n", + " ('laughable', -2.172468615469592),\n", + " ('awful', -2.1385076866397488),\n", + " ('poorly', -2.1326133844207011),\n", + " ('wasting', -2.1178155545614512),\n", + " 
('remotely', -2.111046881095167),\n", + " ('existent', -2.0024805005437076),\n", + " ('boredom', -1.9241486572738005),\n", + " ('miserably', -1.9216610938019989),\n", + " ('sucks', -1.9166645809588516),\n", + " ('uninspired', -1.9131499212248517),\n", + " ('lame', -1.9117232884159072),\n", + " ('insult', -1.9085323769376259)]" + ] + }, + "execution_count": 114, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# words most frequently seen in a review with a \"NEGATIVE\" label\n", + "list(reversed(pos_neg_ratios.most_common()))[0:30]" + ] + }, + { + "cell_type": "code", + "execution_count": 115, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "
\n", + " \n", + " Loading BokehJS ...\n", + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + }, + { + "data": { + "application/javascript": [ + "\n", + "(function(global) {\n", + " function now() {\n", + " return new Date();\n", + " }\n", + "\n", + " var force = \"1\";\n", + "\n", + " if (typeof (window._bokeh_onload_callbacks) === \"undefined\" || force !== \"\") {\n", + " window._bokeh_onload_callbacks = [];\n", + " window._bokeh_is_loading = undefined;\n", + " }\n", + "\n", + "\n", + " \n", + " if (typeof (window._bokeh_timeout) === \"undefined\" || force !== \"\") {\n", + " window._bokeh_timeout = Date.now() + 5000;\n", + " window._bokeh_failed_load = false;\n", + " }\n", + "\n", + " var NB_LOAD_WARNING = {'data': {'text/html':\n", + " \"<div style='background-color: #fdd'>\\n\"+\n", + " \"<p>\\n\"+\n", + " \"BokehJS does not appear to have successfully loaded. If loading BokehJS from CDN, this \\n\"+\n", + " \"may be due to a slow or bad network connection. Possible fixes:\\n\"+\n", + " \"</p>\\n\"+\n", + " \"<ul>\\n\"+\n", + " \"<li>re-rerun `output_notebook()` to attempt to load from CDN again, or</li>\\n\"+\n", + " \"<li>use INLINE resources instead, as so:</li>\\n\"+\n", + " \"</ul>\\n\"+\n", + " \"<code>\\n\"+\n", + " \"from bokeh.resources import INLINE\\n\"+\n", + " \"output_notebook(resources=INLINE)\\n\"+\n", + " \"</code>\\n\"+\n", + " \"</div>
\"}};\n", + "\n", + " function display_loaded() {\n", + " if (window.Bokeh !== undefined) {\n", + " Bokeh.$(\"#fcba94a8-578e-4e33-ab7d-b09fc2376af8\").text(\"BokehJS successfully loaded.\");\n", + " } else if (Date.now() < window._bokeh_timeout) {\n", + " setTimeout(display_loaded, 100)\n", + " }\n", + " }\n", + "\n", + " function run_callbacks() {\n", + " window._bokeh_onload_callbacks.forEach(function(callback) { callback() });\n", + " delete window._bokeh_onload_callbacks\n", + " console.info(\"Bokeh: all callbacks have finished\");\n", + " }\n", + "\n", + " function load_libs(js_urls, callback) {\n", + " window._bokeh_onload_callbacks.push(callback);\n", + " if (window._bokeh_is_loading > 0) {\n", + " console.log(\"Bokeh: BokehJS is being loaded, scheduling callback at\", now());\n", + " return null;\n", + " }\n", + " if (js_urls == null || js_urls.length === 0) {\n", + " run_callbacks();\n", + " return null;\n", + " }\n", + " console.log(\"Bokeh: BokehJS not loaded, scheduling load and callback at\", now());\n", + " window._bokeh_is_loading = js_urls.length;\n", + " for (var i = 0; i < js_urls.length; i++) {\n", + " var url = js_urls[i];\n", + " var s = document.createElement('script');\n", + " s.src = url;\n", + " s.async = false;\n", + " s.onreadystatechange = s.onload = function() {\n", + " window._bokeh_is_loading--;\n", + " if (window._bokeh_is_loading === 0) {\n", + " console.log(\"Bokeh: all BokehJS libraries loaded\");\n", + " run_callbacks()\n", + " }\n", + " };\n", + " s.onerror = function() {\n", + " console.warn(\"failed to load library \" + url);\n", + " };\n", + " console.log(\"Bokeh: injecting script tag for BokehJS library: \", url);\n", + " document.getElementsByTagName(\"head\")[0].appendChild(s);\n", + " }\n", + " };var element = document.getElementById(\"fcba94a8-578e-4e33-ab7d-b09fc2376af8\");\n", + " if (element == null) {\n", + " console.log(\"Bokeh: ERROR: autoload.js configured with elementid 'fcba94a8-578e-4e33-ab7d-b09fc2376af8' but 
no matching script tag was found. \")\n", + " return false;\n", + " }\n", + "\n", + " var js_urls = ['https://cdn.pydata.org/bokeh/release/bokeh-0.12.2.min.js', 'https://cdn.pydata.org/bokeh/release/bokeh-widgets-0.12.2.min.js', 'https://cdn.pydata.org/bokeh/release/bokeh-compiler-0.12.2.min.js'];\n", + "\n", + " var inline_js = [\n", + " function(Bokeh) {\n", + " Bokeh.set_log_level(\"info\");\n", + " },\n", + " \n", + " function(Bokeh) {\n", + " \n", + " Bokeh.$(\"#fcba94a8-578e-4e33-ab7d-b09fc2376af8\").text(\"BokehJS is loading...\");\n", + " },\n", + " function(Bokeh) {\n", + " console.log(\"Bokeh: injecting CSS: https://cdn.pydata.org/bokeh/release/bokeh-0.12.2.min.css\");\n", + " Bokeh.embed.inject_css(\"https://cdn.pydata.org/bokeh/release/bokeh-0.12.2.min.css\");\n", + " console.log(\"Bokeh: injecting CSS: https://cdn.pydata.org/bokeh/release/bokeh-widgets-0.12.2.min.css\");\n", + " Bokeh.embed.inject_css(\"https://cdn.pydata.org/bokeh/release/bokeh-widgets-0.12.2.min.css\");\n", + " }\n", + " ];\n", + "\n", + " function run_inline_js() {\n", + " \n", + " if ((window.Bokeh !== undefined) || (force === \"1\")) {\n", + " for (var i = 0; i < inline_js.length; i++) {\n", + " inline_js[i](window.Bokeh);\n", + " }if (force === \"1\") {\n", + " display_loaded();\n", + " }} else if (Date.now() < window._bokeh_timeout) {\n", + " setTimeout(run_inline_js, 100);\n", + " } else if (!window._bokeh_failed_load) {\n", + " console.log(\"Bokeh: BokehJS failed to load within specified timeout.\");\n", + " window._bokeh_failed_load = true;\n", + " } else if (!force) {\n", + " var cell = $(\"#fcba94a8-578e-4e33-ab7d-b09fc2376af8\").parents('.cell').data().cell;\n", + " cell.output_area.append_execute_result(NB_LOAD_WARNING)\n", + " }\n", + "\n", + " }\n", + "\n", + " if (window._bokeh_is_loading === 0) {\n", + " console.log(\"Bokeh: BokehJS loaded, going straight to plotting\");\n", + " run_inline_js();\n", + " } else {\n", + " load_libs(js_urls, function() {\n", + " 
console.log(\"Bokeh: BokehJS plotting callback run at\", now());\n", + " run_inline_js();\n", + " });\n", + " }\n", + "}(this));" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "from bokeh.models import ColumnDataSource, LabelSet\n", + "from bokeh.plotting import figure, show, output_file\n", + "from bokeh.io import output_notebook\n", + "output_notebook()" + ] + }, + { + "cell_type": "code", + "execution_count": 116, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "
\n", + "
\n", + "
\n", + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "hist, edges = np.histogram(list(map(lambda x:x[1],pos_neg_ratios.most_common())), density=True, bins=100)\n", + "\n", + "p = figure(tools=\"pan,wheel_zoom,reset,save\",\n", + " toolbar_location=\"above\",\n", + " title=\"Word Positive/Negative Affinity Distribution\")\n", + "p.quad(top=hist, bottom=0, left=edges[:-1], right=edges[1:], line_color=\"#555555\")\n", + "show(p)" + ] + }, + { + "cell_type": "code", + "execution_count": 117, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "frequency_frequency = Counter()\n", + "\n", + "for word, cnt in total_counts.most_common():\n", + " frequency_frequency[cnt] += 1" + ] + }, + { + "cell_type": "code", + "execution_count": 118, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/html": [ + "\n", + "\n", + "
\n", + "
\n", + "
\n", + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "hist, edges = np.histogram(list(map(lambda x:x[1],frequency_frequency.most_common())), density=True, bins=100, normed=True)\n", + "\n", + "p = figure(tools=\"pan,wheel_zoom,reset,save\",\n", + " toolbar_location=\"above\",\n", + " title=\"The frequency distribution of the words in our corpus\")\n", + "p.quad(top=hist, bottom=0, left=edges[:-1], right=edges[1:], line_color=\"#555555\")\n", + "show(p)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Reducing Noise by Strategically Reducing the Vocabulary" + ] + }, + { + "cell_type": "code", + "execution_count": 122, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "import time\n", + "import sys\n", + "import numpy as np\n", + "\n", + "# Let's tweak our network from before to model these phenomena\n", + "class SentimentNetwork:\n", + " def __init__(self, reviews,labels,min_count = 10,polarity_cutoff = 0.1,hidden_nodes = 10, learning_rate = 0.1):\n", + " \n", + " np.random.seed(1)\n", + " \n", + " self.pre_process_data(reviews, polarity_cutoff, min_count)\n", + " \n", + " self.init_network(len(self.review_vocab),hidden_nodes, 1, learning_rate)\n", + " \n", + " \n", + " def pre_process_data(self,reviews, polarity_cutoff,min_count):\n", + " \n", + " positive_counts = Counter()\n", + " negative_counts = Counter()\n", + " total_counts = Counter()\n", + "\n", + " for i in range(len(reviews)):\n", + " if(labels[i] == 'POSITIVE'):\n", + " for word in reviews[i].split(\" \"):\n", + " positive_counts[word] += 1\n", + " total_counts[word] += 1\n", + " else:\n", + " for word in reviews[i].split(\" \"):\n", + " negative_counts[word] += 1\n", + " total_counts[word] += 1\n", + "\n", + " pos_neg_ratios = Counter()\n", + "\n", + " for term,cnt in list(total_counts.most_common()):\n", + " if(cnt >= 50):\n", + " pos_neg_ratio = positive_counts[term] / float(negative_counts[term]+1)\n", + " 
pos_neg_ratios[term] = pos_neg_ratio\n", + "\n", + " for word,ratio in pos_neg_ratios.most_common():\n", + " if(ratio > 1):\n", + " pos_neg_ratios[word] = np.log(ratio)\n", + " else:\n", + " pos_neg_ratios[word] = -np.log((1 / (ratio + 0.01)))\n", + " \n", + " review_vocab = set()\n", + " for review in reviews:\n", + " for word in review.split(\" \"):\n", + " if(total_counts[word] > min_count):\n", + " if(word in pos_neg_ratios.keys()):\n", + " if((pos_neg_ratios[word] >= polarity_cutoff) or (pos_neg_ratios[word] <= -polarity_cutoff)):\n", + " review_vocab.add(word)\n", + " else:\n", + " review_vocab.add(word)\n", + " self.review_vocab = list(review_vocab)\n", + " \n", + " label_vocab = set()\n", + " for label in labels:\n", + " label_vocab.add(label)\n", + " \n", + " self.label_vocab = list(label_vocab)\n", + " \n", + " self.review_vocab_size = len(self.review_vocab)\n", + " self.label_vocab_size = len(self.label_vocab)\n", + " \n", + " self.word2index = {}\n", + " for i, word in enumerate(self.review_vocab):\n", + " self.word2index[word] = i\n", + " \n", + " self.label2index = {}\n", + " for i, label in enumerate(self.label_vocab):\n", + " self.label2index[label] = i\n", + " \n", + " \n", + " def init_network(self, input_nodes, hidden_nodes, output_nodes, learning_rate):\n", + " # Set number of nodes in input, hidden and output layers.\n", + " self.input_nodes = input_nodes\n", + " self.hidden_nodes = hidden_nodes\n", + " self.output_nodes = output_nodes\n", + "\n", + " # Initialize weights\n", + " self.weights_0_1 = np.zeros((self.input_nodes,self.hidden_nodes))\n", + " \n", + " self.weights_1_2 = np.random.normal(0.0, self.output_nodes**-0.5, \n", + " (self.hidden_nodes, self.output_nodes))\n", + " \n", + " self.learning_rate = learning_rate\n", + " \n", + " self.layer_0 = np.zeros((1,input_nodes))\n", + " self.layer_1 = np.zeros((1,hidden_nodes))\n", + " \n", + " def sigmoid(self,x):\n", + " return 1 / (1 + np.exp(-x))\n", + " \n", + " \n", + " def 
sigmoid_output_2_derivative(self,output):\n", + " return output * (1 - output)\n", + " \n", + " def update_input_layer(self,review):\n", + "\n", + " # clear out previous state, reset the layer to be all 0s\n", + " self.layer_0 *= 0\n", + " for word in review.split(\" \"):\n", + " self.layer_0[0][self.word2index[word]] = 1\n", + "\n", + " def get_target_for_label(self,label):\n", + " if(label == 'POSITIVE'):\n", + " return 1\n", + " else:\n", + " return 0\n", + " \n", + " def train(self, training_reviews_raw, training_labels):\n", + " \n", + " training_reviews = list()\n", + " for review in training_reviews_raw:\n", + " indices = set()\n", + " for word in review.split(\" \"):\n", + " if(word in self.word2index.keys()):\n", + " indices.add(self.word2index[word])\n", + " training_reviews.append(list(indices))\n", + " \n", + " assert(len(training_reviews) == len(training_labels))\n", + " \n", + " correct_so_far = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(training_reviews)):\n", + " \n", + " review = training_reviews[i]\n", + " label = training_labels[i]\n", + " \n", + " #### Implement the forward pass here ####\n", + " ### Forward pass ###\n", + "\n", + " # Input Layer\n", + "\n", + " # Hidden layer\n", + "# layer_1 = self.layer_0.dot(self.weights_0_1)\n", + " self.layer_1 *= 0\n", + " for index in review:\n", + " self.layer_1 += self.weights_0_1[index]\n", + " \n", + " # Output layer\n", + " layer_2 = self.sigmoid(self.layer_1.dot(self.weights_1_2))\n", + "\n", + " #### Implement the backward pass here ####\n", + " ### Backward pass ###\n", + "\n", + " # Output error\n", + " layer_2_error = layer_2 - self.get_target_for_label(label) # Output layer error is the difference between desired target and actual output.\n", + " layer_2_delta = layer_2_error * self.sigmoid_output_2_derivative(layer_2)\n", + "\n", + " # Backpropagated error\n", + " layer_1_error = layer_2_delta.dot(self.weights_1_2.T) # errors propagated to the hidden layer\n", 
+ " layer_1_delta = layer_1_error # hidden layer gradients - no nonlinearity so it's the same as the error\n", + "\n", + " # Update the weights\n", + " self.weights_1_2 -= self.layer_1.T.dot(layer_2_delta) * self.learning_rate # update hidden-to-output weights with gradient descent step\n", + " \n", + " for index in review:\n", + " self.weights_0_1[index] -= layer_1_delta[0] * self.learning_rate # update input-to-hidden weights with gradient descent step\n", + "\n", + " if(layer_2 >= 0.5 and label == 'POSITIVE'):\n", + " correct_so_far += 1\n", + " if(layer_2 < 0.5 and label == 'NEGATIVE'):\n", + " correct_so_far += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(training_reviews)))[:4] + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] + \" #Correct:\" + str(correct_so_far) + \" #Trained:\" + str(i+1) + \" Training Accuracy:\" + str(correct_so_far * 100 / float(i+1))[:4] + \"%\")\n", + " \n", + " \n", + " def test(self, testing_reviews, testing_labels):\n", + " \n", + " correct = 0\n", + " \n", + " start = time.time()\n", + " \n", + " for i in range(len(testing_reviews)):\n", + " pred = self.run(testing_reviews[i])\n", + " if(pred == testing_labels[i]):\n", + " correct += 1\n", + " \n", + " reviews_per_second = i / float(time.time() - start)\n", + " \n", + " sys.stdout.write(\"\\rProgress:\" + str(100 * i/float(len(testing_reviews)))[:4] \\\n", + " + \"% Speed(reviews/sec):\" + str(reviews_per_second)[0:5] \\\n", + " + \" #Correct:\" + str(correct) + \" #Tested:\" + str(i+1) + \" Testing Accuracy:\" + str(correct * 100 / float(i+1))[:4] + \"%\")\n", + " \n", + " def run(self, review):\n", + " \n", + " # Input Layer\n", + "\n", + "\n", + " # Hidden layer\n", + " self.layer_1 *= 0\n", + " unique_indices = set()\n", + " for word in review.lower().split(\" \"):\n", + " if word in self.word2index.keys():\n", + " unique_indices.add(self.word2index[word])\n", + " 
for index in unique_indices:\n", + " self.layer_1 += self.weights_0_1[index]\n", + " \n", + " # Output layer\n", + " layer_2 = self.sigmoid(self.layer_1.dot(self.weights_1_2))\n", + " \n", + " if(layer_2[0] >= 0.5):\n", + " return \"POSITIVE\"\n", + " else:\n", + " return \"NEGATIVE\"\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 123, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000],min_count=20,polarity_cutoff=0.05,learning_rate=0.01)" + ] + }, + { + "cell_type": "code", + "execution_count": 124, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):1371. #Correct:20461 #Trained:24000 Training Accuracy:85.2%" + ] + } + ], + "source": [ + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 125, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):1708.% #Correct:859 #Tested:1000 Testing Accuracy:85.9%" + ] + } + ], + "source": [ + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + }, + { + "cell_type": "code", + "execution_count": 126, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "mlp = SentimentNetwork(reviews[:-1000],labels[:-1000],min_count=20,polarity_cutoff=0.8,learning_rate=0.01)" + ] + }, + { + "cell_type": "code", + "execution_count": 127, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Progress:99.9% Speed(reviews/sec):7089. 
#Correct:20552 #Trained:24000 Training Accuracy:85.6%" + ] + } + ], + "source": [ + "mlp.train(reviews[:-1000],labels[:-1000])" + ] + }, + { + "cell_type": "code", + "execution_count": 128, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\r", + "Progress:0.0% Speed(reviews/sec):0.0% #Correct:0 #Tested:1 Testing Accuracy:0.0%\r", + "Progress:0.1% Speed(reviews/sec):2123.% #Correct:1 #Tested:2 Testing Accuracy:50.0%\r", + "Progress:0.2% Speed(reviews/sec):3623.% #Correct:2 #Tested:3 Testing Accuracy:66.6%\r", + "Progress:0.3% Speed(reviews/sec):4477.% #Correct:3 #Tested:4 Testing Accuracy:75.0%\r", + "Progress:0.4% Speed(reviews/sec):5488.% #Correct:3 #Tested:5 Testing Accuracy:60.0%\r", + "Progress:0.5% Speed(reviews/sec):5995.% #Correct:4 #Tested:6 Testing Accuracy:66.6%\r", + "Progress:0.6% Speed(reviews/sec):5698.% #Correct:5 #Tested:7 Testing Accuracy:71.4%\r", + "Progress:0.7% Speed(reviews/sec):5448.% #Correct:6 #Tested:8 Testing Accuracy:75.0%\r", + "Progress:0.8% Speed(reviews/sec):5041.% #Correct:7 #Tested:9 Testing Accuracy:77.7%\r", + "Progress:0.9% Speed(reviews/sec):2163.% #Correct:8 #Tested:10 Testing Accuracy:80.0%\r", + "Progress:1.0% Speed(reviews/sec):2218.% #Correct:9 #Tested:11 Testing Accuracy:81.8%\r", + "Progress:1.1% Speed(reviews/sec):2356.% #Correct:10 #Tested:12 Testing Accuracy:83.3%\r", + "Progress:1.2% Speed(reviews/sec):2496.% #Correct:10 #Tested:13 Testing Accuracy:76.9%\r", + "Progress:1.3% Speed(reviews/sec):2617.% #Correct:11 #Tested:14 Testing Accuracy:78.5%\r", + "Progress:1.4% Speed(reviews/sec):2746.% #Correct:11 #Tested:15 Testing Accuracy:73.3%\r", + "Progress:1.5% Speed(reviews/sec):2835.% #Correct:12 #Tested:16 Testing Accuracy:75.0%\r", + "Progress:1.6% Speed(reviews/sec):2812.% #Correct:13 #Tested:17 Testing Accuracy:76.4%\r", + "Progress:1.7% Speed(reviews/sec):2891.% #Correct:14 #Tested:18 Testing Accuracy:77.7%\r", + "Progress:1.8% 
Speed(reviews/sec):3022.% #Correct:15 #Tested:19 Testing Accuracy:78.9%\r", + 
"Progress:35.3% Speed(reviews/sec):4333.% #Correct:309 #Tested:354 Testing Accuracy:87.2%\r", + "Progress:35.4% Speed(reviews/sec):4338.% #Correct:310 #Tested:355 Testing Accuracy:87.3%\r", + "Progress:35.5% Speed(reviews/sec):4331.% #Correct:311 #Tested:356 Testing Accuracy:87.3%\r", + "Progress:35.6% Speed(reviews/sec):4334.% #Correct:311 #Tested:357 Testing Accuracy:87.1%\r", + "Progress:35.7% Speed(reviews/sec):4333.% #Correct:311 #Tested:358 Testing Accuracy:86.8%\r", + "Progress:35.8% Speed(reviews/sec):4339.% #Correct:312 #Tested:359 Testing Accuracy:86.9%\r", + "Progress:35.9% Speed(reviews/sec):4344.% #Correct:313 #Tested:360 Testing Accuracy:86.9%\r", + "Progress:36.0% Speed(reviews/sec):4346.% #Correct:314 #Tested:361 Testing Accuracy:86.9%\r", + "Progress:36.1% Speed(reviews/sec):4351.% #Correct:315 #Tested:362 Testing Accuracy:87.0%\r", + "Progress:36.2% Speed(reviews/sec):4322.% #Correct:316 #Tested:363 Testing Accuracy:87.0%\r", + "Progress:36.3% Speed(reviews/sec):4324.% #Correct:317 #Tested:364 Testing Accuracy:87.0%\r", + "Progress:36.4% Speed(reviews/sec):4330.% #Correct:317 #Tested:365 Testing Accuracy:86.8%\r", + "Progress:36.5% Speed(reviews/sec):4330.% #Correct:318 #Tested:366 Testing Accuracy:86.8%\r", + "Progress:36.6% Speed(reviews/sec):4331.% #Correct:319 #Tested:367 Testing Accuracy:86.9%\r", + "Progress:36.7% Speed(reviews/sec):4337.% #Correct:320 #Tested:368 Testing Accuracy:86.9%\r", + "Progress:36.8% Speed(reviews/sec):4338.% #Correct:320 #Tested:369 Testing Accuracy:86.7%\r", + "Progress:36.9% Speed(reviews/sec):4343.% #Correct:320 #Tested:370 Testing Accuracy:86.4%\r", + "Progress:37.0% Speed(reviews/sec):4344.% #Correct:321 #Tested:371 Testing Accuracy:86.5%\r", + "Progress:37.1% Speed(reviews/sec):4318.% #Correct:322 #Tested:372 Testing Accuracy:86.5%\r", + "Progress:37.2% Speed(reviews/sec):4315.% #Correct:322 #Tested:373 Testing Accuracy:86.3%\r", + "Progress:37.3% Speed(reviews/sec):4316.% #Correct:322 #Tested:374 Testing 
Accuracy:86.0%\r", + "Progress:37.4% Speed(reviews/sec):4300.% #Correct:323 #Tested:375 Testing Accuracy:86.1%\r", + "Progress:37.5% Speed(reviews/sec):4296.% #Correct:324 #Tested:376 Testing Accuracy:86.1%\r", + "Progress:37.6% Speed(reviews/sec):4300.% #Correct:325 #Tested:377 Testing Accuracy:86.2%\r", + "Progress:37.7% Speed(reviews/sec):4303.% #Correct:326 #Tested:378 Testing Accuracy:86.2%\r", + "Progress:37.8% Speed(reviews/sec):4309.% #Correct:326 #Tested:379 Testing Accuracy:86.0%\r", + "Progress:37.9% Speed(reviews/sec):4316.% #Correct:327 #Tested:380 Testing Accuracy:86.0%\r", + "Progress:38.0% Speed(reviews/sec):4304.% #Correct:328 #Tested:381 Testing Accuracy:86.0%\r", + "Progress:38.1% Speed(reviews/sec):4310.% #Correct:329 #Tested:382 Testing Accuracy:86.1%\r", + "Progress:38.2% Speed(reviews/sec):4309.% #Correct:330 #Tested:383 Testing Accuracy:86.1%\r", + "Progress:38.3% Speed(reviews/sec):4313.% #Correct:331 #Tested:384 Testing Accuracy:86.1%\r", + "Progress:38.4% Speed(reviews/sec):4318.% #Correct:332 #Tested:385 Testing Accuracy:86.2%\r", + "Progress:38.5% Speed(reviews/sec):4324.% #Correct:333 #Tested:386 Testing Accuracy:86.2%\r", + "Progress:38.6% Speed(reviews/sec):4324.% #Correct:334 #Tested:387 Testing Accuracy:86.3%\r", + "Progress:38.7% Speed(reviews/sec):4332.% #Correct:335 #Tested:388 Testing Accuracy:86.3%\r", + "Progress:38.8% Speed(reviews/sec):4336.% #Correct:335 #Tested:389 Testing Accuracy:86.1%\r", + "Progress:38.9% Speed(reviews/sec):4341.% #Correct:336 #Tested:390 Testing Accuracy:86.1%\r", + "Progress:39.0% Speed(reviews/sec):4347.% #Correct:336 #Tested:391 Testing Accuracy:85.9%\r", + "Progress:39.1% Speed(reviews/sec):4351.% #Correct:337 #Tested:392 Testing Accuracy:85.9%\r", + "Progress:39.2% Speed(reviews/sec):4358.% #Correct:337 #Tested:393 Testing Accuracy:85.7%\r", + "Progress:39.3% Speed(reviews/sec):4358.% #Correct:338 #Tested:394 Testing Accuracy:85.7%\r", + "Progress:39.4% Speed(reviews/sec):4354.% #Correct:338 
#Tested:395 Testing Accuracy:85.5%\r", + "Progress:39.5% Speed(reviews/sec):4351.% #Correct:339 #Tested:396 Testing Accuracy:85.6%\r", + "Progress:39.6% Speed(reviews/sec):4344.% #Correct:340 #Tested:397 Testing Accuracy:85.6%\r", + "Progress:39.7% Speed(reviews/sec):4338.% #Correct:341 #Tested:398 Testing Accuracy:85.6%\r", + "Progress:39.8% Speed(reviews/sec):4314.% #Correct:341 #Tested:399 Testing Accuracy:85.4%\r", + "Progress:39.9% Speed(reviews/sec):4304.% #Correct:342 #Tested:400 Testing Accuracy:85.5%\r", + "Progress:40.0% Speed(reviews/sec):4283.% #Correct:343 #Tested:401 Testing Accuracy:85.5%\r", + "Progress:40.1% Speed(reviews/sec):4285.% #Correct:344 #Tested:402 Testing Accuracy:85.5%\r", + "Progress:40.2% Speed(reviews/sec):4284.% #Correct:345 #Tested:403 Testing Accuracy:85.6%\r", + "Progress:40.3% Speed(reviews/sec):4288.% #Correct:345 #Tested:404 Testing Accuracy:85.3%\r", + "Progress:40.4% Speed(reviews/sec):4293.% #Correct:346 #Tested:405 Testing Accuracy:85.4%\r", + "Progress:40.5% Speed(reviews/sec):4296.% #Correct:347 #Tested:406 Testing Accuracy:85.4%\r", + "Progress:40.6% Speed(reviews/sec):4294.% #Correct:348 #Tested:407 Testing Accuracy:85.5%\r", + "Progress:40.7% Speed(reviews/sec):4293.% #Correct:349 #Tested:408 Testing Accuracy:85.5%\r", + "Progress:40.8% Speed(reviews/sec):4287.% #Correct:350 #Tested:409 Testing Accuracy:85.5%\r", + "Progress:40.9% Speed(reviews/sec):4290.% #Correct:351 #Tested:410 Testing Accuracy:85.6%\r", + "Progress:41.0% Speed(reviews/sec):4294.% #Correct:352 #Tested:411 Testing Accuracy:85.6%\r", + "Progress:41.1% Speed(reviews/sec):4292.% #Correct:353 #Tested:412 Testing Accuracy:85.6%\r", + "Progress:41.2% Speed(reviews/sec):4297.% #Correct:354 #Tested:413 Testing Accuracy:85.7%\r", + "Progress:41.3% Speed(reviews/sec):4294.% #Correct:355 #Tested:414 Testing Accuracy:85.7%\r", + "Progress:41.4% Speed(reviews/sec):4299.% #Correct:356 #Tested:415 Testing Accuracy:85.7%\r", + "Progress:41.5% 
Speed(reviews/sec):4301.% #Correct:357 #Tested:416 Testing Accuracy:85.8%\r", + "Progress:41.6% Speed(reviews/sec):4305.% #Correct:358 #Tested:417 Testing Accuracy:85.8%\r", + "Progress:41.7% Speed(reviews/sec):4308.% #Correct:359 #Tested:418 Testing Accuracy:85.8%\r", + "Progress:41.8% Speed(reviews/sec):4311.% #Correct:360 #Tested:419 Testing Accuracy:85.9%\r", + "Progress:41.9% Speed(reviews/sec):4316.% #Correct:360 #Tested:420 Testing Accuracy:85.7%\r", + "Progress:42.0% Speed(reviews/sec):4312.% #Correct:361 #Tested:421 Testing Accuracy:85.7%\r", + "Progress:42.1% Speed(reviews/sec):4315.% #Correct:362 #Tested:422 Testing Accuracy:85.7%\r", + "Progress:42.2% Speed(reviews/sec):4318.% #Correct:363 #Tested:423 Testing Accuracy:85.8%\r", + "Progress:42.3% Speed(reviews/sec):4321.% #Correct:364 #Tested:424 Testing Accuracy:85.8%\r", + "Progress:42.4% Speed(reviews/sec):4323.% #Correct:365 #Tested:425 Testing Accuracy:85.8%\r", + "Progress:42.5% Speed(reviews/sec):4329.% #Correct:366 #Tested:426 Testing Accuracy:85.9%\r", + "Progress:42.6% Speed(reviews/sec):4320.% #Correct:367 #Tested:427 Testing Accuracy:85.9%\r", + "Progress:42.7% Speed(reviews/sec):4324.% #Correct:368 #Tested:428 Testing Accuracy:85.9%\r", + "Progress:42.8% Speed(reviews/sec):4326.% #Correct:369 #Tested:429 Testing Accuracy:86.0%\r", + "Progress:42.9% Speed(reviews/sec):4330.% #Correct:370 #Tested:430 Testing Accuracy:86.0%\r", + "Progress:43.0% Speed(reviews/sec):4335.% #Correct:371 #Tested:431 Testing Accuracy:86.0%\r", + "Progress:43.1% Speed(reviews/sec):4340.% #Correct:372 #Tested:432 Testing Accuracy:86.1%\r", + "Progress:43.2% Speed(reviews/sec):4342.% #Correct:372 #Tested:433 Testing Accuracy:85.9%\r", + "Progress:43.3% Speed(reviews/sec):4347.% #Correct:373 #Tested:434 Testing Accuracy:85.9%\r", + "Progress:43.4% Speed(reviews/sec):4350.% #Correct:374 #Tested:435 Testing Accuracy:85.9%\r", + "Progress:43.5% Speed(reviews/sec):4352.% #Correct:375 #Tested:436 Testing Accuracy:86.0%\r", + 
"Progress:43.6% Speed(reviews/sec):4358.% #Correct:376 #Tested:437 Testing Accuracy:86.0%\r", + "Progress:43.7% Speed(reviews/sec):4352.% #Correct:377 #Tested:438 Testing Accuracy:86.0%\r", + "Progress:43.8% Speed(reviews/sec):4358.% #Correct:378 #Tested:439 Testing Accuracy:86.1%\r", + "Progress:43.9% Speed(reviews/sec):4356.% #Correct:379 #Tested:440 Testing Accuracy:86.1%\r", + "Progress:44.0% Speed(reviews/sec):4353.% #Correct:380 #Tested:441 Testing Accuracy:86.1%\r", + "Progress:44.1% Speed(reviews/sec):4358.% #Correct:381 #Tested:442 Testing Accuracy:86.1%\r", + "Progress:44.2% Speed(reviews/sec):4364.% #Correct:382 #Tested:443 Testing Accuracy:86.2%\r", + "Progress:44.3% Speed(reviews/sec):4366.% #Correct:383 #Tested:444 Testing Accuracy:86.2%\r", + "Progress:44.4% Speed(reviews/sec):4357.% #Correct:384 #Tested:445 Testing Accuracy:86.2%\r", + "Progress:44.5% Speed(reviews/sec):4360.% #Correct:385 #Tested:446 Testing Accuracy:86.3%\r", + "Progress:44.6% Speed(reviews/sec):4364.% #Correct:386 #Tested:447 Testing Accuracy:86.3%\r", + "Progress:44.7% Speed(reviews/sec):4311.% #Correct:387 #Tested:448 Testing Accuracy:86.3%\r", + "Progress:44.8% Speed(reviews/sec):4302.% #Correct:388 #Tested:449 Testing Accuracy:86.4%\r", + "Progress:44.9% Speed(reviews/sec):4285.% #Correct:388 #Tested:450 Testing Accuracy:86.2%\r", + "Progress:45.0% Speed(reviews/sec):4285.% #Correct:389 #Tested:451 Testing Accuracy:86.2%\r", + "Progress:45.1% Speed(reviews/sec):4262.% #Correct:389 #Tested:452 Testing Accuracy:86.0%\r", + "Progress:45.2% Speed(reviews/sec):4262.% #Correct:390 #Tested:453 Testing Accuracy:86.0%\r", + "Progress:45.3% Speed(reviews/sec):4261.% #Correct:391 #Tested:454 Testing Accuracy:86.1%\r", + "Progress:45.4% Speed(reviews/sec):4265.% #Correct:392 #Tested:455 Testing Accuracy:86.1%\r", + "Progress:45.5% Speed(reviews/sec):4259.% #Correct:393 #Tested:456 Testing Accuracy:86.1%\r", + "Progress:45.6% Speed(reviews/sec):4257.% #Correct:394 #Tested:457 Testing 
Accuracy:86.2%\r", + "Progress:45.7% Speed(reviews/sec):4251.% #Correct:395 #Tested:458 Testing Accuracy:86.2%\r", + "Progress:45.8% Speed(reviews/sec):4247.% #Correct:396 #Tested:459 Testing Accuracy:86.2%\r", + "Progress:45.9% Speed(reviews/sec):4222.% #Correct:397 #Tested:460 Testing Accuracy:86.3%\r", + "Progress:46.0% Speed(reviews/sec):4217.% #Correct:398 #Tested:461 Testing Accuracy:86.3%\r", + "Progress:46.1% Speed(reviews/sec):4188.% #Correct:398 #Tested:462 Testing Accuracy:86.1%\r", + "Progress:46.2% Speed(reviews/sec):4083.% #Correct:399 #Tested:463 Testing Accuracy:86.1%\r", + "Progress:46.3% Speed(reviews/sec):4064.% #Correct:400 #Tested:464 Testing Accuracy:86.2%\r", + "Progress:46.4% Speed(reviews/sec):4057.% #Correct:401 #Tested:465 Testing Accuracy:86.2%\r", + "Progress:46.5% Speed(reviews/sec):4042.% #Correct:402 #Tested:466 Testing Accuracy:86.2%\r", + "Progress:46.6% Speed(reviews/sec):4043.% #Correct:403 #Tested:467 Testing Accuracy:86.2%\r", + "Progress:46.7% Speed(reviews/sec):4019.% #Correct:404 #Tested:468 Testing Accuracy:86.3%\r", + "Progress:46.8% Speed(reviews/sec):4007.% #Correct:405 #Tested:469 Testing Accuracy:86.3%\r", + "Progress:46.9% Speed(reviews/sec):4008.% #Correct:405 #Tested:470 Testing Accuracy:86.1%\r", + "Progress:47.0% Speed(reviews/sec):4008.% #Correct:406 #Tested:471 Testing Accuracy:86.1%\r", + "Progress:47.1% Speed(reviews/sec):3958.% #Correct:406 #Tested:472 Testing Accuracy:86.0%\r", + "Progress:47.2% Speed(reviews/sec):3957.% #Correct:407 #Tested:473 Testing Accuracy:86.0%\r", + "Progress:47.3% Speed(reviews/sec):3948.% #Correct:408 #Tested:474 Testing Accuracy:86.0%\r", + "Progress:47.4% Speed(reviews/sec):3938.% #Correct:409 #Tested:475 Testing Accuracy:86.1%\r", + "Progress:47.5% Speed(reviews/sec):3912.% #Correct:410 #Tested:476 Testing Accuracy:86.1%\r", + "Progress:47.6% Speed(reviews/sec):3890.% #Correct:411 #Tested:477 Testing Accuracy:86.1%\r", + "Progress:47.7% Speed(reviews/sec):3815.% #Correct:411 
#Tested:478 Testing Accuracy:85.9%\r", + "Progress:47.8% Speed(reviews/sec):3804.% #Correct:412 #Tested:479 Testing Accuracy:86.0%\r", + "Progress:47.9% Speed(reviews/sec):3803.% #Correct:413 #Tested:480 Testing Accuracy:86.0%\r", + "Progress:48.0% Speed(reviews/sec):3767.% #Correct:414 #Tested:481 Testing Accuracy:86.0%\r", + "Progress:48.1% Speed(reviews/sec):3736.% #Correct:415 #Tested:482 Testing Accuracy:86.0%\r", + "Progress:48.2% Speed(reviews/sec):3737.% #Correct:416 #Tested:483 Testing Accuracy:86.1%\r", + "Progress:48.3% Speed(reviews/sec):3737.% #Correct:417 #Tested:484 Testing Accuracy:86.1%\r", + "Progress:48.4% Speed(reviews/sec):3731.% #Correct:418 #Tested:485 Testing Accuracy:86.1%\r", + "Progress:48.5% Speed(reviews/sec):3726.% #Correct:419 #Tested:486 Testing Accuracy:86.2%\r", + "Progress:48.6% Speed(reviews/sec):3730.% #Correct:420 #Tested:487 Testing Accuracy:86.2%\r", + "Progress:48.7% Speed(reviews/sec):3735.% #Correct:420 #Tested:488 Testing Accuracy:86.0%\r", + "Progress:48.8% Speed(reviews/sec):3738.% #Correct:421 #Tested:489 Testing Accuracy:86.0%\r", + "Progress:48.9% Speed(reviews/sec):3735.% #Correct:421 #Tested:490 Testing Accuracy:85.9%\r", + "Progress:49.0% Speed(reviews/sec):3738.% #Correct:422 #Tested:491 Testing Accuracy:85.9%\r", + "Progress:49.1% Speed(reviews/sec):3743.% #Correct:423 #Tested:492 Testing Accuracy:85.9%\r", + "Progress:49.2% Speed(reviews/sec):3745.% #Correct:424 #Tested:493 Testing Accuracy:86.0%\r", + "Progress:49.3% Speed(reviews/sec):3746.% #Correct:425 #Tested:494 Testing Accuracy:86.0%\r", + "Progress:49.4% Speed(reviews/sec):3750.% #Correct:426 #Tested:495 Testing Accuracy:86.0%\r", + "Progress:49.5% Speed(reviews/sec):3752.% #Correct:427 #Tested:496 Testing Accuracy:86.0%\r", + "Progress:49.6% Speed(reviews/sec):3756.% #Correct:428 #Tested:497 Testing Accuracy:86.1%\r", + "Progress:49.7% Speed(reviews/sec):3761.% #Correct:428 #Tested:498 Testing Accuracy:85.9%\r", + "Progress:49.8% 
Speed(reviews/sec):3759.% #Correct:429 #Tested:499 Testing Accuracy:85.9%\r", + "Progress:49.9% Speed(reviews/sec):3761.% #Correct:430 #Tested:500 Testing Accuracy:86.0%\r", + "Progress:50.0% Speed(reviews/sec):3767.% #Correct:431 #Tested:501 Testing Accuracy:86.0%\r", + "Progress:50.1% Speed(reviews/sec):3764.% #Correct:432 #Tested:502 Testing Accuracy:86.0%\r", + "Progress:50.2% Speed(reviews/sec):3766.% #Correct:433 #Tested:503 Testing Accuracy:86.0%\r", + "Progress:50.3% Speed(reviews/sec):3769.% #Correct:434 #Tested:504 Testing Accuracy:86.1%\r", + "Progress:50.4% Speed(reviews/sec):3772.% #Correct:434 #Tested:505 Testing Accuracy:85.9%\r", + "Progress:50.5% Speed(reviews/sec):3776.% #Correct:435 #Tested:506 Testing Accuracy:85.9%\r", + "Progress:50.6% Speed(reviews/sec):3772.% #Correct:436 #Tested:507 Testing Accuracy:85.9%\r", + "Progress:50.7% Speed(reviews/sec):3762.% #Correct:437 #Tested:508 Testing Accuracy:86.0%\r", + "Progress:50.8% Speed(reviews/sec):3766.% #Correct:438 #Tested:509 Testing Accuracy:86.0%\r", + "Progress:50.9% Speed(reviews/sec):3771.% #Correct:439 #Tested:510 Testing Accuracy:86.0%\r", + "Progress:51.0% Speed(reviews/sec):3756.% #Correct:440 #Tested:511 Testing Accuracy:86.1%\r", + "Progress:51.1% Speed(reviews/sec):3759.% #Correct:441 #Tested:512 Testing Accuracy:86.1%\r", + "Progress:51.2% Speed(reviews/sec):3760.% #Correct:442 #Tested:513 Testing Accuracy:86.1%\r", + "Progress:51.3% Speed(reviews/sec):3765.% #Correct:443 #Tested:514 Testing Accuracy:86.1%\r", + "Progress:51.4% Speed(reviews/sec):3767.% #Correct:444 #Tested:515 Testing Accuracy:86.2%\r", + "Progress:51.5% Speed(reviews/sec):3769.% #Correct:445 #Tested:516 Testing Accuracy:86.2%\r", + "Progress:51.6% Speed(reviews/sec):3769.% #Correct:446 #Tested:517 Testing Accuracy:86.2%\r", + "Progress:51.7% Speed(reviews/sec):3773.% #Correct:447 #Tested:518 Testing Accuracy:86.2%\r", + "Progress:51.8% Speed(reviews/sec):3776.% #Correct:447 #Tested:519 Testing Accuracy:86.1%\r", + 
"Progress:51.9% Speed(reviews/sec):3774.% #Correct:448 #Tested:520 Testing Accuracy:86.1%\r", + "Progress:52.0% Speed(reviews/sec):3776.% #Correct:449 #Tested:521 Testing Accuracy:86.1%\r", + "Progress:52.1% Speed(reviews/sec):3774.% #Correct:450 #Tested:522 Testing Accuracy:86.2%\r", + "Progress:52.2% Speed(reviews/sec):3774.% #Correct:451 #Tested:523 Testing Accuracy:86.2%\r", + "Progress:52.3% Speed(reviews/sec):3777.% #Correct:452 #Tested:524 Testing Accuracy:86.2%\r", + "Progress:52.4% Speed(reviews/sec):3779.% #Correct:453 #Tested:525 Testing Accuracy:86.2%\r", + "Progress:52.5% Speed(reviews/sec):3781.% #Correct:454 #Tested:526 Testing Accuracy:86.3%\r", + "Progress:52.6% Speed(reviews/sec):3785.% #Correct:455 #Tested:527 Testing Accuracy:86.3%\r", + "Progress:52.7% Speed(reviews/sec):3788.% #Correct:455 #Tested:528 Testing Accuracy:86.1%\r", + "Progress:52.8% Speed(reviews/sec):3788.% #Correct:455 #Tested:529 Testing Accuracy:86.0%\r", + "Progress:52.9% Speed(reviews/sec):3791.% #Correct:456 #Tested:530 Testing Accuracy:86.0%\r", + "Progress:53.0% Speed(reviews/sec):3792.% #Correct:457 #Tested:531 Testing Accuracy:86.0%\r", + "Progress:53.1% Speed(reviews/sec):3795.% #Correct:457 #Tested:532 Testing Accuracy:85.9%\r", + "Progress:53.2% Speed(reviews/sec):3800.% #Correct:458 #Tested:533 Testing Accuracy:85.9%\r", + "Progress:53.3% Speed(reviews/sec):3803.% #Correct:459 #Tested:534 Testing Accuracy:85.9%\r", + "Progress:53.4% Speed(reviews/sec):3807.% #Correct:460 #Tested:535 Testing Accuracy:85.9%\r", + "Progress:53.5% Speed(reviews/sec):3811.% #Correct:461 #Tested:536 Testing Accuracy:86.0%\r", + "Progress:53.6% Speed(reviews/sec):3815.% #Correct:461 #Tested:537 Testing Accuracy:85.8%\r", + "Progress:53.7% Speed(reviews/sec):3816.% #Correct:462 #Tested:538 Testing Accuracy:85.8%\r", + "Progress:53.8% Speed(reviews/sec):3816.% #Correct:463 #Tested:539 Testing Accuracy:85.8%\r", + "Progress:53.9% Speed(reviews/sec):3816.% #Correct:464 #Tested:540 Testing 
Accuracy:85.9%\r", + "Progress:54.0% Speed(reviews/sec):3813.% #Correct:465 #Tested:541 Testing Accuracy:85.9%\r", + "Progress:54.1% Speed(reviews/sec):3815.% #Correct:466 #Tested:542 Testing Accuracy:85.9%\r", + "Progress:54.2% Speed(reviews/sec):3813.% #Correct:467 #Tested:543 Testing Accuracy:86.0%\r", + "Progress:54.3% Speed(reviews/sec):3816.% #Correct:468 #Tested:544 Testing Accuracy:86.0%\r", + "Progress:54.4% Speed(reviews/sec):3817.% #Correct:468 #Tested:545 Testing Accuracy:85.8%\r", + "Progress:54.5% Speed(reviews/sec):3819.% #Correct:469 #Tested:546 Testing Accuracy:85.8%\r", + "Progress:54.6% Speed(reviews/sec):3818.% #Correct:469 #Tested:547 Testing Accuracy:85.7%\r", + "Progress:54.7% Speed(reviews/sec):3820.% #Correct:470 #Tested:548 Testing Accuracy:85.7%\r", + "Progress:54.8% Speed(reviews/sec):3825.% #Correct:471 #Tested:549 Testing Accuracy:85.7%\r", + "Progress:54.9% Speed(reviews/sec):3829.% #Correct:472 #Tested:550 Testing Accuracy:85.8%\r", + "Progress:55.0% Speed(reviews/sec):3833.% #Correct:473 #Tested:551 Testing Accuracy:85.8%\r", + "Progress:55.1% Speed(reviews/sec):3835.% #Correct:474 #Tested:552 Testing Accuracy:85.8%\r", + "Progress:55.2% Speed(reviews/sec):3836.% #Correct:475 #Tested:553 Testing Accuracy:85.8%\r", + "Progress:55.3% Speed(reviews/sec):3836.% #Correct:476 #Tested:554 Testing Accuracy:85.9%\r", + "Progress:55.4% Speed(reviews/sec):3827.% #Correct:477 #Tested:555 Testing Accuracy:85.9%\r", + "Progress:55.5% Speed(reviews/sec):3826.% #Correct:478 #Tested:556 Testing Accuracy:85.9%\r", + "Progress:55.6% Speed(reviews/sec):3823.% #Correct:479 #Tested:557 Testing Accuracy:85.9%\r", + "Progress:55.7% Speed(reviews/sec):3822.% #Correct:480 #Tested:558 Testing Accuracy:86.0%\r", + "Progress:55.8% Speed(reviews/sec):3821.% #Correct:480 #Tested:559 Testing Accuracy:85.8%\r", + "Progress:55.9% Speed(reviews/sec):3825.% #Correct:481 #Tested:560 Testing Accuracy:85.8%\r", + "Progress:56.0% Speed(reviews/sec):3829.% #Correct:482 
#Tested:561 Testing Accuracy:85.9%\r", + "Progress:56.1% Speed(reviews/sec):3833.% #Correct:483 #Tested:562 Testing Accuracy:85.9%\r", + "Progress:56.2% Speed(reviews/sec):3834.% #Correct:484 #Tested:563 Testing Accuracy:85.9%\r", + "Progress:56.3% Speed(reviews/sec):3838.% #Correct:485 #Tested:564 Testing Accuracy:85.9%\r", + "Progress:56.4% Speed(reviews/sec):3838.% #Correct:486 #Tested:565 Testing Accuracy:86.0%\r", + "Progress:56.5% Speed(reviews/sec):3843.% #Correct:487 #Tested:566 Testing Accuracy:86.0%\r", + "Progress:56.6% Speed(reviews/sec):3846.% #Correct:488 #Tested:567 Testing Accuracy:86.0%\r", + "Progress:56.7% Speed(reviews/sec):3849.% #Correct:489 #Tested:568 Testing Accuracy:86.0%\r", + "Progress:56.8% Speed(reviews/sec):3853.% #Correct:490 #Tested:569 Testing Accuracy:86.1%\r", + "Progress:56.9% Speed(reviews/sec):3855.% #Correct:491 #Tested:570 Testing Accuracy:86.1%\r", + "Progress:57.0% Speed(reviews/sec):3851.% #Correct:492 #Tested:571 Testing Accuracy:86.1%\r", + "Progress:57.1% Speed(reviews/sec):3850.% #Correct:493 #Tested:572 Testing Accuracy:86.1%\r", + "Progress:57.2% Speed(reviews/sec):3851.% #Correct:493 #Tested:573 Testing Accuracy:86.0%\r", + "Progress:57.3% Speed(reviews/sec):3854.% #Correct:493 #Tested:574 Testing Accuracy:85.8%\r", + "Progress:57.4% Speed(reviews/sec):3853.% #Correct:494 #Tested:575 Testing Accuracy:85.9%\r", + "Progress:57.5% Speed(reviews/sec):3854.% #Correct:495 #Tested:576 Testing Accuracy:85.9%\r", + "Progress:57.6% Speed(reviews/sec):3855.% #Correct:496 #Tested:577 Testing Accuracy:85.9%\r", + "Progress:57.7% Speed(reviews/sec):3857.% #Correct:497 #Tested:578 Testing Accuracy:85.9%\r", + "Progress:57.8% Speed(reviews/sec):3851.% #Correct:498 #Tested:579 Testing Accuracy:86.0%\r", + "Progress:57.9% Speed(reviews/sec):3853.% #Correct:499 #Tested:580 Testing Accuracy:86.0%\r", + "Progress:58.0% Speed(reviews/sec):3853.% #Correct:500 #Tested:581 Testing Accuracy:86.0%\r", + "Progress:58.1% 
Speed(reviews/sec):3855.% #Correct:501 #Tested:582 Testing Accuracy:86.0%\r", + "Progress:58.2% Speed(reviews/sec):3858.% #Correct:502 #Tested:583 Testing Accuracy:86.1%\r", + "Progress:58.3% Speed(reviews/sec):3861.% #Correct:503 #Tested:584 Testing Accuracy:86.1%\r", + "Progress:58.4% Speed(reviews/sec):3864.% #Correct:504 #Tested:585 Testing Accuracy:86.1%\r", + "Progress:58.5% Speed(reviews/sec):3868.% #Correct:505 #Tested:586 Testing Accuracy:86.1%\r", + "Progress:58.6% Speed(reviews/sec):3872.% #Correct:506 #Tested:587 Testing Accuracy:86.2%\r", + "Progress:58.7% Speed(reviews/sec):3875.% #Correct:507 #Tested:588 Testing Accuracy:86.2%\r", + "Progress:58.8% Speed(reviews/sec):3880.% #Correct:508 #Tested:589 Testing Accuracy:86.2%\r", + "Progress:58.9% Speed(reviews/sec):3884.% #Correct:509 #Tested:590 Testing Accuracy:86.2%\r", + "Progress:59.0% Speed(reviews/sec):3887.% #Correct:510 #Tested:591 Testing Accuracy:86.2%\r", + "Progress:59.1% Speed(reviews/sec):3891.% #Correct:511 #Tested:592 Testing Accuracy:86.3%\r", + "Progress:59.2% Speed(reviews/sec):3890.% #Correct:511 #Tested:593 Testing Accuracy:86.1%\r", + "Progress:59.3% Speed(reviews/sec):3893.% #Correct:512 #Tested:594 Testing Accuracy:86.1%\r", + "Progress:59.4% Speed(reviews/sec):3895.% #Correct:513 #Tested:595 Testing Accuracy:86.2%\r", + "Progress:59.5% Speed(reviews/sec):3872.% #Correct:514 #Tested:596 Testing Accuracy:86.2%\r", + "Progress:59.6% Speed(reviews/sec):3874.% #Correct:515 #Tested:597 Testing Accuracy:86.2%\r", + "Progress:59.7% Speed(reviews/sec):3872.% #Correct:516 #Tested:598 Testing Accuracy:86.2%\r", + "Progress:59.8% Speed(reviews/sec):3874.% #Correct:516 #Tested:599 Testing Accuracy:86.1%\r", + "Progress:59.9% Speed(reviews/sec):3878.% #Correct:517 #Tested:600 Testing Accuracy:86.1%\r", + "Progress:60.0% Speed(reviews/sec):3878.% #Correct:517 #Tested:601 Testing Accuracy:86.0%\r", + "Progress:60.1% Speed(reviews/sec):3882.% #Correct:518 #Tested:602 Testing Accuracy:86.0%\r", + 
"Progress:60.2% Speed(reviews/sec):3885.% #Correct:519 #Tested:603 Testing Accuracy:86.0%\r", + "Progress:60.3% Speed(reviews/sec):3890.% #Correct:520 #Tested:604 Testing Accuracy:86.0%\r", + "Progress:60.4% Speed(reviews/sec):3893.% #Correct:521 #Tested:605 Testing Accuracy:86.1%\r", + "Progress:60.5% Speed(reviews/sec):3886.% #Correct:522 #Tested:606 Testing Accuracy:86.1%\r", + "Progress:60.6% Speed(reviews/sec):3890.% #Correct:522 #Tested:607 Testing Accuracy:85.9%\r", + "Progress:60.7% Speed(reviews/sec):3893.% #Correct:523 #Tested:608 Testing Accuracy:86.0%\r", + "Progress:60.8% Speed(reviews/sec):3897.% #Correct:524 #Tested:609 Testing Accuracy:86.0%\r", + "Progress:60.9% Speed(reviews/sec):3902.% #Correct:525 #Tested:610 Testing Accuracy:86.0%\r", + "Progress:61.0% Speed(reviews/sec):3903.% #Correct:525 #Tested:611 Testing Accuracy:85.9%\r", + "Progress:61.1% Speed(reviews/sec):3906.% #Correct:526 #Tested:612 Testing Accuracy:85.9%\r", + "Progress:61.2% Speed(reviews/sec):3911.% #Correct:527 #Tested:613 Testing Accuracy:85.9%\r", + "Progress:61.3% Speed(reviews/sec):3914.% #Correct:528 #Tested:614 Testing Accuracy:85.9%\r", + "Progress:61.4% Speed(reviews/sec):3919.% #Correct:528 #Tested:615 Testing Accuracy:85.8%\r", + "Progress:61.5% Speed(reviews/sec):3921.% #Correct:528 #Tested:616 Testing Accuracy:85.7%\r", + "Progress:61.6% Speed(reviews/sec):3920.% #Correct:529 #Tested:617 Testing Accuracy:85.7%\r", + "Progress:61.7% Speed(reviews/sec):3922.% #Correct:530 #Tested:618 Testing Accuracy:85.7%\r", + "Progress:61.8% Speed(reviews/sec):3925.% #Correct:531 #Tested:619 Testing Accuracy:85.7%\r", + "Progress:61.9% Speed(reviews/sec):3928.% #Correct:531 #Tested:620 Testing Accuracy:85.6%\r", + "Progress:62.0% Speed(reviews/sec):3928.% #Correct:531 #Tested:621 Testing Accuracy:85.5%\r", + "Progress:62.1% Speed(reviews/sec):3932.% #Correct:532 #Tested:622 Testing Accuracy:85.5%\r", + "Progress:62.2% Speed(reviews/sec):3935.% #Correct:532 #Tested:623 Testing 
Accuracy:85.3%\r", + "Progress:62.3% Speed(reviews/sec):3939.% #Correct:533 #Tested:624 Testing Accuracy:85.4%\r", + "[... roughly 370 similar per-review progress lines elided; testing accuracy drifts from ~85% down to ~82% ...]\r", + "Progress:99.4% Speed(reviews/sec):3798.% #Correct:818 #Tested:995 Testing Accuracy:82.2%\r", + "Progress:99.5% Speed(reviews/sec):3798.% #Correct:819 #Tested:996 Testing Accuracy:82.2%\r", + "Progress:99.6% 
Speed(reviews/sec):3800.% #Correct:820 #Tested:997 Testing Accuracy:82.2%\r", + "Progress:99.7% Speed(reviews/sec):3802.% #Correct:821 #Tested:998 Testing Accuracy:82.2%\r", + "Progress:99.8% Speed(reviews/sec):3803.% #Correct:821 #Tested:999 Testing Accuracy:82.1%\r", + "Progress:99.9% Speed(reviews/sec):3805.% #Correct:822 #Tested:1000 Testing Accuracy:82.2%" + ] + } + ], + "source": [ + "mlp.test(reviews[-1000:],labels[-1000:])" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python [default]", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.2" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/sentiment_network/labels.txt b/sentiment_network/labels.txt new file mode 100644 index 0000000..10366d9 --- /dev/null +++ b/sentiment_network/labels.txt @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:301d18424e95957a000293f5d8393239451d7f54622a8f21613172aebe64ca06 +size 225000 diff --git a/sentiment_network/reviews.txt b/sentiment_network/reviews.txt new file mode 100644 index 0000000..2940b8f --- /dev/null +++ b/sentiment_network/reviews.txt @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:d7c2f21e5c8f5240859910ba5331b9baa88310aa978dbd8747ebd4ff6ebbefa6 +size 33678267 diff --git a/sentiment_network/sentiment_network.png b/sentiment_network/sentiment_network.png new file mode 100644 index 0000000..49e086d Binary files /dev/null and b/sentiment_network/sentiment_network.png differ diff --git a/sentiment_network/sentiment_network_2.png b/sentiment_network/sentiment_network_2.png new file mode 100644 index 0000000..78c68e7 Binary files /dev/null and b/sentiment_network/sentiment_network_2.png differ diff --git 
a/sentiment_network/sentiment_network_pos.png b/sentiment_network/sentiment_network_pos.png new file mode 100644 index 0000000..b67313d Binary files /dev/null and b/sentiment_network/sentiment_network_pos.png differ diff --git a/sentiment_network/sentiment_network_sparse.png b/sentiment_network/sentiment_network_sparse.png new file mode 100644 index 0000000..b41e891 Binary files /dev/null and b/sentiment_network/sentiment_network_sparse.png differ diff --git a/sentiment_network/sentiment_network_sparse_2.png b/sentiment_network/sentiment_network_sparse_2.png new file mode 100644 index 0000000..ed316a5 Binary files /dev/null and b/sentiment_network/sentiment_network_sparse_2.png differ diff --git a/tensorboard/.ipynb_checkpoints/Anna KaRNNa Name Scoped-checkpoint.ipynb b/tensorboard/.ipynb_checkpoints/Anna KaRNNa Name Scoped-checkpoint.ipynb new file mode 100644 index 0000000..db2d3c7 --- /dev/null +++ b/tensorboard/.ipynb_checkpoints/Anna KaRNNa Name Scoped-checkpoint.ipynb @@ -0,0 +1,847 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Anna KaRNNa\n", + "\n", + "In this notebook, I'll build a character-wise RNN trained on Anna Karenina, one of my all-time favorite books. It'll be able to generate new text based on the text from the book.\n", + "\n", + "This network is based off of Andrej Karpathy's [post on RNNs](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) and [implementation in Torch](https://github.com/karpathy/char-rnn). Also, some information [here at r2rt](http://r2rt.com/recurrent-neural-networks-in-tensorflow-ii.html) and from [Sherjil Ozair](https://github.com/sherjilozair/char-rnn-tensorflow) on GitHub. 
Below is the general architecture of the character-wise RNN.\n", + "\n", + "" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "import time\n", + "from collections import namedtuple\n", + "\n", + "import numpy as np\n", + "import tensorflow as tf" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First we'll load the text file and convert it into integers for our network to use." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "with open('anna.txt', 'r') as f:\n", + " text=f.read()\n", + "vocab = set(text)\n", + "vocab_to_int = {c: i for i, c in enumerate(vocab)}\n", + "int_to_vocab = dict(enumerate(vocab))\n", + "chars = np.array([vocab_to_int[c] for c in text], dtype=np.int32)" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'Chapter 1\\n\\n\\nHappy families are all alike; every unhappy family is unhappy in its own\\nway.\\n\\nEverythin'" + ] + }, + "execution_count": 3, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "text[:100]" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([26, 49, 22, 46, 33, 70, 51, 37, 36, 31, 31, 31, 65, 22, 46, 46, 39,\n", + " 37, 23, 22, 4, 77, 32, 77, 70, 81, 37, 22, 51, 70, 37, 22, 32, 32,\n", + " 37, 22, 32, 77, 0, 70, 64, 37, 70, 16, 70, 51, 39, 37, 20, 74, 49,\n", + " 22, 46, 46, 39, 37, 23, 22, 4, 77, 32, 39, 37, 77, 81, 37, 20, 74,\n", + " 49, 22, 46, 46, 39, 37, 77, 74, 37, 77, 33, 81, 37, 21, 75, 74, 31,\n", + " 75, 22, 39, 13, 31, 31, 24, 16, 70, 51, 39, 33, 49, 77, 74], dtype=int32)" + ] + }, + "execution_count": 4, + "metadata": {}, + 
"output_type": "execute_result" + } + ], + "source": [ + "chars[:100]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now I need to split up the data into batches, and into training and validation sets. I should be making a test set here, but I'm not going to worry about that. My test will be if the network can generate new text.\n", + "\n", + "Here I'll make both input and target arrays. The targets are the same as the inputs, except shifted one character over. I'll also drop the last bit of data so that I'll only have completely full batches.\n", + "\n", + "The idea here is to make a 2D matrix where the number of rows is equal to the number of batches. Each row will be one long concatenated string from the character data. We'll split this data into a training set and validation set using the `split_frac` keyword. This will keep 90% of the batches in the training set, the other 10% in the validation set." + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def split_data(chars, batch_size, num_steps, split_frac=0.9):\n", + " \"\"\" \n", + " Split character data into training and validation sets, inputs and targets for each set.\n", + " \n", + " Arguments\n", + " ---------\n", + " chars: character array\n", + " batch_size: Size of examples in each of batch\n", + " num_steps: Number of sequence steps to keep in the input and pass to the network\n", + " split_frac: Fraction of batches to keep in the training set\n", + " \n", + " \n", + " Returns train_x, train_y, val_x, val_y\n", + " \"\"\"\n", + " \n", + " \n", + " slice_size = batch_size * num_steps\n", + " n_batches = int(len(chars) / slice_size)\n", + " \n", + " # Drop the last few characters to make only full batches\n", + " x = chars[: n_batches*slice_size]\n", + " y = chars[1: n_batches*slice_size + 1]\n", + " \n", + " # Split the data into batch_size slices, then stack them into a 2D matrix \n", + " 
x = np.stack(np.split(x, batch_size))\n", + " y = np.stack(np.split(y, batch_size))\n", + " \n", + " # Now x and y are arrays with dimensions batch_size x n_batches*num_steps\n", + " \n", + " # Split into training and validation sets, keep the first split_frac batches for training\n", + " split_idx = int(n_batches*split_frac)\n", + " train_x, train_y = x[:, :split_idx*num_steps], y[:, :split_idx*num_steps]\n", + " val_x, val_y = x[:, split_idx*num_steps:], y[:, split_idx*num_steps:]\n", + " \n", + " return train_x, train_y, val_x, val_y" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "train_x, train_y, val_x, val_y = split_data(chars, 10, 200)" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "(10, 178400)" + ] + }, + "execution_count": 7, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "train_x.shape" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[26, 49, 22, 46, 33, 70, 51, 37, 36, 31],\n", + " [11, 74, 53, 37, 49, 70, 37, 4, 21, 16],\n", + " [37, 1, 22, 33, 1, 49, 77, 74, 69, 37],\n", + " [21, 33, 49, 70, 51, 37, 75, 21, 20, 32],\n", + " [37, 33, 49, 70, 37, 32, 22, 74, 53, 2],\n", + " [37, 52, 49, 51, 21, 20, 69, 49, 37, 32],\n", + " [33, 37, 33, 21, 31, 53, 21, 13, 31, 31],\n", + " [21, 37, 49, 70, 51, 81, 70, 32, 23, 54],\n", + " [49, 22, 33, 37, 77, 81, 37, 33, 49, 70],\n", + " [70, 51, 81, 70, 32, 23, 37, 22, 74, 53]], dtype=int32)" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "train_x[:,:10]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "I'll write another function to grab batches out of the arrays made by `split_data`. 
Here each batch will be a sliding window on these arrays with size `batch_size X num_steps`. For example, if we want our network to train on a sequence of 100 characters, `num_steps = 100`. For the next batch, we'll shift this window to the next sequence of `num_steps` characters. In this way we can feed batches to the network and the cell states will continue through on each batch." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_batch(arrs, num_steps):\n", + " batch_size, slice_size = arrs[0].shape\n", + " \n", + " n_batches = int(slice_size/num_steps)\n", + " for b in range(n_batches):\n", + " yield [x[:, b*num_steps: (b+1)*num_steps] for x in arrs]" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def build_rnn(num_classes, batch_size=50, num_steps=50, lstm_size=128, num_layers=2,\n", + " learning_rate=0.001, grad_clip=5, sampling=False):\n", + " \n", + " if sampling:\n", + " batch_size, num_steps = 1, 1\n", + "\n", + " tf.reset_default_graph()\n", + " \n", + " # Declare placeholders we'll feed into the graph\n", + " with tf.name_scope('inputs'):\n", + " inputs = tf.placeholder(tf.int32, [batch_size, num_steps], name='inputs')\n", + " x_one_hot = tf.one_hot(inputs, num_classes, name='x_one_hot')\n", + " \n", + " with tf.name_scope('targets'):\n", + " targets = tf.placeholder(tf.int32, [batch_size, num_steps], name='targets')\n", + " y_one_hot = tf.one_hot(targets, num_classes, name='y_one_hot')\n", + " y_reshaped = tf.reshape(y_one_hot, [-1, num_classes])\n", + " \n", + " keep_prob = tf.placeholder(tf.float32, name='keep_prob')\n", + " \n", + " # Build the RNN layers\n", + " with tf.name_scope(\"RNN_layers\"):\n", + " lstm = tf.contrib.rnn.BasicLSTMCell(lstm_size)\n", + " drop = tf.contrib.rnn.DropoutWrapper(lstm, 
output_keep_prob=keep_prob)\n", + " cell = tf.contrib.rnn.MultiRNNCell([drop] * num_layers)\n", + " \n", + " with tf.name_scope(\"RNN_init_state\"):\n", + " initial_state = cell.zero_state(batch_size, tf.float32)\n", + "\n", + " # Run the data through the RNN layers\n", + " with tf.name_scope(\"RNN_forward\"):\n", + " rnn_inputs = [tf.squeeze(i, squeeze_dims=[1]) for i in tf.split(x_one_hot, num_steps, 1)]\n", + " outputs, state = tf.contrib.rnn.static_rnn(cell, rnn_inputs, initial_state=initial_state)\n", + " \n", + " final_state = state\n", + " \n", + " # Reshape output so it's a bunch of rows, one row for each cell output\n", + " with tf.name_scope('sequence_reshape'):\n", + " seq_output = tf.concat(outputs, axis=1, name='seq_output')\n", + " output = tf.reshape(seq_output, [-1, lstm_size], name='graph_output')\n", + " \n", + " # Now connect the RNN outputs to a softmax layer and calculate the cost\n", + " with tf.name_scope('logits'):\n", + " softmax_w = tf.Variable(tf.truncated_normal((lstm_size, num_classes), stddev=0.1),\n", + " name='softmax_w')\n", + " softmax_b = tf.Variable(tf.zeros(num_classes), name='softmax_b')\n", + " logits = tf.matmul(output, softmax_w) + softmax_b\n", + "\n", + " with tf.name_scope('predictions'):\n", + " preds = tf.nn.softmax(logits, name='predictions')\n", + " \n", + " \n", + " with tf.name_scope('cost'):\n", + " loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y_reshaped, name='loss')\n", + " cost = tf.reduce_mean(loss, name='cost')\n", + "\n", + " # Optimizer for training, using gradient clipping to control exploding gradients\n", + " with tf.name_scope('train'):\n", + " tvars = tf.trainable_variables()\n", + " grads, _ = tf.clip_by_global_norm(tf.gradients(cost, tvars), grad_clip)\n", + " train_op = tf.train.AdamOptimizer(learning_rate)\n", + " optimizer = train_op.apply_gradients(zip(grads, tvars))\n", + " \n", + " # Export the nodes \n", + " export_nodes = ['inputs', 'targets', 'initial_state', 
'final_state',\n", + " 'keep_prob', 'cost', 'preds', 'optimizer']\n", + " Graph = namedtuple('Graph', export_nodes)\n", + " local_dict = locals()\n", + " graph = Graph(*[local_dict[each] for each in export_nodes])\n", + " \n", + " return graph" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Hyperparameters\n", + "\n", + "Here I'm defining the hyperparameters for the network. The two you probably haven't seen before are `lstm_size` and `num_layers`. These set the number of hidden units in the LSTM layers and the number of LSTM layers, respectively. Of course, making these bigger will improve the network's performance, but you'll have to watch out for overfitting. If your validation loss is much larger than the training loss, you're probably overfitting. Decrease the size of the network or decrease the dropout keep probability." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "batch_size = 100\n", + "num_steps = 100\n", + "lstm_size = 512\n", + "num_layers = 2\n", + "learning_rate = 0.001" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Write out the graph for TensorBoard" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "model = build_rnn(len(vocab), \n", + " batch_size=batch_size,\n", + " num_steps=num_steps,\n", + " learning_rate=learning_rate,\n", + " lstm_size=lstm_size,\n", + " num_layers=num_layers)\n", + "\n", + "with tf.Session() as sess:\n", + " \n", + " sess.run(tf.global_variables_initializer())\n", + " file_writer = tf.summary.FileWriter('./logs/3', sess.graph)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training\n", + "\n", + "Time for training, which is pretty straightforward. Here I pass in some data, and get an LSTM state back. 
Then I pass that state back in to the network so the next batch can continue the state from the previous batch. And every so often (set by `save_every_n`) I calculate the validation loss and save a checkpoint." + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "!mkdir -p checkpoints/anna" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true, + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Epoch 1/10 Iteration 1/1780 Training loss: 4.4195 1.3313 sec/batch\n", + "Epoch 1/10 Iteration 2/1780 Training loss: 4.3756 0.1287 sec/batch\n", + "Epoch 1/10 Iteration 3/1780 Training loss: 4.2069 0.1276 sec/batch\n", + "Epoch 1/10 Iteration 4/1780 Training loss: 4.5396 0.1185 sec/batch\n", + "Epoch 1/10 Iteration 5/1780 Training loss: 4.4190 0.1206 sec/batch\n", + "Epoch 1/10 Iteration 6/1780 Training loss: 4.3547 0.1233 sec/batch\n", + "Epoch 1/10 Iteration 7/1780 Training loss: 4.2792 0.1188 sec/batch\n", + "Epoch 1/10 Iteration 8/1780 Training loss: 4.2018 0.1170 sec/batch\n", + "Epoch 1/10 Iteration 9/1780 Training loss: 4.1251 0.1187 sec/batch\n", + "Epoch 1/10 Iteration 10/1780 Training loss: 4.0558 0.1174 sec/batch\n", + "Epoch 1/10 Iteration 11/1780 Training loss: 3.9946 0.1190 sec/batch\n", + "Epoch 1/10 Iteration 12/1780 Training loss: 3.9451 0.1193 sec/batch\n", + "Epoch 1/10 Iteration 13/1780 Training loss: 3.9011 0.1210 sec/batch\n", + "Epoch 1/10 Iteration 14/1780 Training loss: 3.8632 0.1185 sec/batch\n", + "Epoch 1/10 Iteration 15/1780 Training loss: 3.8275 0.1199 sec/batch\n", + "Epoch 1/10 Iteration 16/1780 Training loss: 3.7945 0.1211 sec/batch\n", + "Epoch 1/10 Iteration 17/1780 Training loss: 3.7649 0.1215 sec/batch\n", + "Epoch 1/10 Iteration 18/1780 Training loss: 3.7400 0.1214 sec/batch\n", + "Epoch 1/10 Iteration 19/1780 Training 
loss: 3.7164 0.1247 sec/batch\n", + "Epoch 1/10 Iteration 20/1780 Training loss: 3.6933 0.1212 sec/batch\n", + "Epoch 1/10 Iteration 21/1780 Training loss: 3.6728 0.1203 sec/batch\n", + "Epoch 1/10 Iteration 22/1780 Training loss: 3.6538 0.1207 sec/batch\n", + "Epoch 1/10 Iteration 23/1780 Training loss: 3.6359 0.1200 sec/batch\n", + "Epoch 1/10 Iteration 24/1780 Training loss: 3.6198 0.1229 sec/batch\n", + "Epoch 1/10 Iteration 25/1780 Training loss: 3.6041 0.1204 sec/batch\n", + "Epoch 1/10 Iteration 26/1780 Training loss: 3.5904 0.1202 sec/batch\n", + "Epoch 1/10 Iteration 27/1780 Training loss: 3.5774 0.1189 sec/batch\n", + "Epoch 1/10 Iteration 28/1780 Training loss: 3.5642 0.1214 sec/batch\n", + "Epoch 1/10 Iteration 29/1780 Training loss: 3.5522 0.1231 sec/batch\n", + "Epoch 1/10 Iteration 30/1780 Training loss: 3.5407 0.1199 sec/batch\n", + "Epoch 1/10 Iteration 31/1780 Training loss: 3.5309 0.1180 sec/batch\n", + "Epoch 1/10 Iteration 32/1780 Training loss: 3.5207 0.1179 sec/batch\n", + "Epoch 1/10 Iteration 33/1780 Training loss: 3.5109 0.1224 sec/batch\n", + "Epoch 1/10 Iteration 34/1780 Training loss: 3.5021 0.1206 sec/batch\n", + "Epoch 1/10 Iteration 35/1780 Training loss: 3.4931 0.1241 sec/batch\n", + "Epoch 1/10 Iteration 36/1780 Training loss: 3.4850 0.1169 sec/batch\n", + "Epoch 1/10 Iteration 37/1780 Training loss: 3.4767 0.1204 sec/batch\n", + "Epoch 1/10 Iteration 38/1780 Training loss: 3.4688 0.1202 sec/batch\n", + "Epoch 1/10 Iteration 39/1780 Training loss: 3.4611 0.1213 sec/batch\n" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 33\u001b[0m model.initial_state: new_state}\n\u001b[1;32m 34\u001b[0m batch_loss, new_state, _ = 
sess.run([model.cost, model.final_state, model.optimizer], \n\u001b[0;32m---> 35\u001b[0;31m feed_dict=feed)\n\u001b[0m\u001b[1;32m 36\u001b[0m \u001b[0mloss\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0mbatch_loss\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 37\u001b[0m \u001b[0mend\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtime\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtime\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/mat/miniconda3/envs/tf-gpu/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36mrun\u001b[0;34m(self, fetches, feed_dict, options, run_metadata)\u001b[0m\n\u001b[1;32m 765\u001b[0m \u001b[0;32mtry\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 766\u001b[0m result = self._run(None, fetches, feed_dict, options_ptr,\n\u001b[0;32m--> 767\u001b[0;31m run_metadata_ptr)\n\u001b[0m\u001b[1;32m 768\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mrun_metadata\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 769\u001b[0m \u001b[0mproto_data\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtf_session\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mTF_GetBuffer\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mrun_metadata_ptr\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/mat/miniconda3/envs/tf-gpu/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_run\u001b[0;34m(self, handle, fetches, feed_dict, options, run_metadata)\u001b[0m\n\u001b[1;32m 963\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mfinal_fetches\u001b[0m \u001b[0;32mor\u001b[0m \u001b[0mfinal_targets\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 964\u001b[0m results = self._do_run(handle, final_targets, final_fetches,\n\u001b[0;32m--> 965\u001b[0;31m feed_dict_string, options, run_metadata)\n\u001b[0m\u001b[1;32m 966\u001b[0m 
\u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 967\u001b[0m \u001b[0mresults\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/mat/miniconda3/envs/tf-gpu/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_do_run\u001b[0;34m(self, handle, target_list, fetch_list, feed_dict, options, run_metadata)\u001b[0m\n\u001b[1;32m 1013\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mhandle\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1014\u001b[0m return self._do_call(_run_fn, self._session, feed_dict, fetch_list,\n\u001b[0;32m-> 1015\u001b[0;31m target_list, options, run_metadata)\n\u001b[0m\u001b[1;32m 1016\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1017\u001b[0m return self._do_call(_prun_fn, self._session, handle, feed_dict,\n", + "\u001b[0;32m/home/mat/miniconda3/envs/tf-gpu/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_do_call\u001b[0;34m(self, fn, *args)\u001b[0m\n\u001b[1;32m 1020\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_do_call\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfn\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m*\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1021\u001b[0m \u001b[0;32mtry\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 1022\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0mfn\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m*\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 1023\u001b[0m \u001b[0;32mexcept\u001b[0m \u001b[0merrors\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mOpError\u001b[0m \u001b[0;32mas\u001b[0m 
\u001b[0me\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1024\u001b[0m \u001b[0mmessage\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mcompat\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mas_text\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0me\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mmessage\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/mat/miniconda3/envs/tf-gpu/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_run_fn\u001b[0;34m(session, feed_dict, fetch_list, target_list, options, run_metadata)\u001b[0m\n\u001b[1;32m 1002\u001b[0m return tf_session.TF_Run(session, options,\n\u001b[1;32m 1003\u001b[0m \u001b[0mfeed_dict\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfetch_list\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtarget_list\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 1004\u001b[0;31m status, run_metadata)\n\u001b[0m\u001b[1;32m 1005\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1006\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_prun_fn\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msession\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mhandle\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfeed_dict\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfetch_list\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "epochs = 10\n", + "save_every_n = 200\n", + "train_x, train_y, val_x, val_y = split_data(chars, batch_size, num_steps)\n", + "\n", + "model = build_rnn(len(vocab), \n", + " batch_size=batch_size,\n", + " num_steps=num_steps,\n", + " learning_rate=learning_rate,\n", + " lstm_size=lstm_size,\n", + " num_layers=num_layers)\n", + "\n", + "saver = tf.train.Saver(max_to_keep=100)\n", + "\n", + "with tf.Session() as sess:\n", + " sess.run(tf.global_variables_initializer())\n", + " \n", + " # Use the line below to load a checkpoint and resume training\n", + " 
#saver.restore(sess, 'checkpoints/anna20.ckpt')\n", + " \n", + " n_batches = int(train_x.shape[1]/num_steps)\n", + " iterations = n_batches * epochs\n", + " for e in range(epochs):\n", + " \n", + " # Train network\n", + " new_state = sess.run(model.initial_state)\n", + " loss = 0\n", + " for b, (x, y) in enumerate(get_batch([train_x, train_y], num_steps), 1):\n", + " iteration = e*n_batches + b\n", + " start = time.time()\n", + " feed = {model.inputs: x,\n", + " model.targets: y,\n", + " model.keep_prob: 0.5,\n", + " model.initial_state: new_state}\n", + " batch_loss, new_state, _ = sess.run([model.cost, model.final_state, model.optimizer], \n", + " feed_dict=feed)\n", + " loss += batch_loss\n", + " end = time.time()\n", + " print('Epoch {}/{} '.format(e+1, epochs),\n", + " 'Iteration {}/{}'.format(iteration, iterations),\n", + " 'Training loss: {:.4f}'.format(loss/b),\n", + " '{:.4f} sec/batch'.format((end-start)))\n", + " \n", + " \n", + " if (iteration%save_every_n == 0) or (iteration == iterations):\n", + " # Check performance, notice dropout has been set to 1\n", + " val_loss = []\n", + " new_state = sess.run(model.initial_state)\n", + " for x, y in get_batch([val_x, val_y], num_steps):\n", + " feed = {model.inputs: x,\n", + " model.targets: y,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " batch_loss, new_state = sess.run([model.cost, model.final_state], feed_dict=feed)\n", + " val_loss.append(batch_loss)\n", + "\n", + " print('Validation loss:', np.mean(val_loss),\n", + " 'Saving checkpoint!')\n", + " saver.save(sess, \"checkpoints/anna/i{}_l{}_{:.3f}.ckpt\".format(iteration, lstm_size, np.mean(val_loss)))" + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "model_checkpoint_path: \"checkpoints/anna/i3560_l512_1.122.ckpt\"\n", + "all_model_checkpoint_paths: 
\"checkpoints/anna/i200_l512_2.432.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i400_l512_1.980.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i600_l512_1.750.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i800_l512_1.595.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1000_l512_1.484.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1200_l512_1.407.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1400_l512_1.349.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1600_l512_1.292.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1800_l512_1.255.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2000_l512_1.224.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2200_l512_1.204.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2400_l512_1.187.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2600_l512_1.172.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2800_l512_1.160.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3000_l512_1.148.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3200_l512_1.137.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3400_l512_1.129.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3560_l512_1.122.ckpt\"" + ] + }, + "execution_count": 35, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "tf.train.get_checkpoint_state('checkpoints/anna')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Sampling\n", + "\n", + "Now that the network is trained, we'll can use it to generate new text. The idea is that we pass in a character, then the network will predict the next character. We can use the new one, to predict the next one. And we keep doing this to generate all new text. 
I also included some functionality to prime the network with some text by passing in a string and building up a state from that.\n", + "\n", + "The network gives us predictions for each character. To reduce noise and make things a little less random, I'm going to only choose a new character from the top N most likely characters.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def pick_top_n(preds, vocab_size, top_n=5):\n", + " p = np.squeeze(preds)\n", + " p[np.argsort(p)[:-top_n]] = 0\n", + " p = p / np.sum(p)\n", + " c = np.random.choice(vocab_size, 1, p=p)[0]\n", + " return c" + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def sample(checkpoint, n_samples, lstm_size, vocab_size, prime=\"The \"):\n", + " samples = [c for c in prime]\n", + " model = build_rnn(vocab_size, lstm_size=lstm_size, sampling=True)\n", + " saver = tf.train.Saver()\n", + " with tf.Session() as sess:\n", + " saver.restore(sess, checkpoint)\n", + " new_state = sess.run(model.initial_state)\n", + " for c in prime:\n", + " x = np.zeros((1, 1))\n", + " x[0,0] = vocab_to_int[c]\n", + " feed = {model.inputs: x,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " preds, new_state = sess.run([model.preds, model.final_state], \n", + " feed_dict=feed)\n", + "\n", + " c = pick_top_n(preds, vocab_size)\n", + " samples.append(int_to_vocab[c])\n", + "\n", + " for i in range(n_samples):\n", + " x[0,0] = c\n", + " feed = {model.inputs: x,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " preds, new_state = sess.run([model.preds, model.final_state], \n", + " feed_dict=feed)\n", + "\n", + " c = pick_top_n(preds, vocab_size)\n", + " samples.append(int_to_vocab[c])\n", + " \n", + " return ''.join(samples)" + ] + }, + { + "cell_type": "code", + 
"execution_count": 44, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farlathit that if had so\n", + "like it that it were. He could not trouble to his wife, and there was\n", + "anything in them of the side of his weaky in the creature at his forteren\n", + "to him.\n", + "\n", + "\"What is it? I can't bread to those,\" said Stepan Arkadyevitch. \"It's not\n", + "my children, and there is an almost this arm, true it mays already,\n", + "and tell you what I have say to you, and was not looking at the peasant,\n", + "why is, I don't know him out, and she doesn't speak to me immediately, as\n", + "you would say the countess and the more frest an angelembre, and time and\n", + "things's silent, but I was not in my stand that is in my head. But if he\n", + "say, and was so feeling with his soul. A child--in his soul of his\n", + "soul of his soul. He should not see that any of that sense of. Here he\n", + "had not been so composed and to speak for as in a whole picture, but\n", + "all the setting and her excellent and society, who had been delighted\n", + "and see to anywing had been being troed to thousand words on them,\n", + "we liked him.\n", + "\n", + "That set in her money at the table, he came into the party. The capable\n", + "of his she could not be as an old composure.\n", + "\n", + "\"That's all something there will be down becime by throe is\n", + "such a silent, as in a countess, I should state it out and divorct.\n", + "The discussion is not for me. I was that something was simply they are\n", + "all three manshess of a sensitions of mind it all.\"\n", + "\n", + "\"No,\" he thought, shouted and lifting his soul. \"While it might see your\n", + "honser and she, I could burst. And I had been a midelity. 
And I had a\n", + "marnief are through the countess,\" he said, looking at him, a chosing\n", + "which they had been carried out and still solied, and there was a sen that\n", + "was to be completely, and that this matter of all the seconds of it, and\n", + "a concipation were to her husband, who came up and conscaously, that he\n", + "was not the station. All his fourse she was always at the country,,\n", + "to speak oft, and though they were to hear the delightful throom and\n", + "whether they came towards the morning, and his living and a coller and\n", + "hold--the children. \n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i3560_l512_1.122.ckpt\"\n", + "samp = sample(checkpoint, 2000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farnt him oste wha sorind thans tout thint asd an sesand an hires on thime sind thit aled, ban thand and out hore as the ter hos ton ho te that, was tis tart al the hand sostint him sore an tit an son thes, win he se ther san ther hher tas tarereng,.\n", + "\n", + "Anl at an ades in ond hesiln, ad hhe torers teans, wast tar arering tho this sos alten sorer has hhas an siton ther him he had sin he ard ate te anling the sosin her ans and\n", + "arins asd and ther ale te tot an tand tanginge wath and ho ald, so sot th asend sat hare sother horesinnd, he hesense wing ante her so tith tir sherinn, anded and to the toul anderin he sorit he torsith she se atere an ting ot hand and thit hhe so the te wile har\n", + "ens ont in the sersise, and we he seres tar aterer, to ato tat or has he he wan ton here won and sen heren he sosering, to to theer oo adent har herere the wosh oute, was serild ward tous hed astend..\n", + "\n", + "I's sint on alt in har tor tit her asd hade shithans ored he talereng an soredendere tim tot hees. 
Tise sor and \n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i200_l512_2.432.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Fard as astice her said he celatice of to seress in the raice, and to be the some and sere allats to that said to that the sark and a cast a the wither ald the pacinesse of her had astition, he said to the sount as she west at hissele. Af the cond it he was a fact onthis astisarianing.\n", + "\n", + "\n", + "\"Or a ton to to be that's a more at aspestale as the sont of anstiring as\n", + "thours and trey.\n", + "\n", + "The same wo dangring the\n", + "raterst, who sore and somethy had ast out an of his book. \"We had's beane were that, and a morted a thay he had to tere. Then to\n", + "her homent andertersed his his ancouted to the pirsted, the soution for of the pirsice inthirgest and stenciol, with the hard and and\n", + "a colrice of to be oneres,\n", + "the song to this anderssad.\n", + "The could ounterss the said to serom of\n", + "soment a carsed of sheres of she\n", + "torded\n", + "har and want in their of hould, but\n", + "her told in that in he tad a the same to her. 
Serghing an her has and with the seed, and the camt ont his about of the\n", + "sail, the her then all houg ant or to hus to \n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i600_l512_1.750.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farrat, his felt has at it.\n", + "\n", + "\"When the pose ther hor exceed\n", + "to his sheant was,\" weat a sime of his sounsed. The coment and the facily that which had began terede a marilicaly whice whether the pose of his hand, at she was alligated herself the same on she had to\n", + "taiking to his forthing and streath how to hand\n", + "began in a lang at some at it, this he cholded not set all her. \"Wo love that is setthing. Him anstering as seen that.\"\n", + "\n", + "\"Yes in the man that say the mare a crances is it?\" said Sergazy Ivancatching. \"You doon think were somether is ifficult of a mone of\n", + "though the most at the countes that the\n", + "mean on the come to say the most, to\n", + "his feesing of\n", + "a man she, whilo he\n", + "sained and well, that he would still at to said. He wind at his for the sore in the most\n", + "of hoss and almoved to see him. 
They have betine the sumper into at he his stire, and what he was that at the so steate of the\n", + "sound, and shin should have a geest of shall feet on the conderation to she had been at that imporsing the dre\n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i1000_l512_1.484.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + } + ], + "metadata": { + "hide_input": false, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.1" + }, + "toc": { + "colors": { + "hover_highlight": "#DAA520", + "running_highlight": "#FF0000", + "selected_highlight": "#FFD700" + }, + "moveMenuLeft": true, + "nav_menu": { + "height": "111px", + "width": "251px" + }, + "navigate_menu": true, + "number_sections": true, + "sideBar": true, + "threshold": 4, + "toc_cell": false, + "toc_section_display": "block", + "toc_window_display": false + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/tensorboard/.ipynb_checkpoints/Anna KaRNNa-checkpoint.ipynb b/tensorboard/.ipynb_checkpoints/Anna KaRNNa-checkpoint.ipynb new file mode 100644 index 0000000..7b4f6a6 --- /dev/null +++ b/tensorboard/.ipynb_checkpoints/Anna KaRNNa-checkpoint.ipynb @@ -0,0 +1,794 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Anna KaRNNa\n", + "\n", + "In this notebook, I'll build a character-wise RNN trained on Anna Karenina, one of my all-time favorite books. 
It'll be able to generate new text based on the text from the book.\n", + "\n", + "This network is based off of Andrej Karpathy's [post on RNNs](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) and [implementation in Torch](https://github.com/karpathy/char-rnn). Also, some information [here at r2rt](http://r2rt.com/recurrent-neural-networks-in-tensorflow-ii.html) and from [Sherjil Ozair](https://github.com/sherjilozair/char-rnn-tensorflow) on GitHub. Below is the general architecture of the character-wise RNN.\n", + "\n", + "" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "import time\n", + "from collections import namedtuple\n", + "\n", + "import numpy as np\n", + "import tensorflow as tf" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First we'll load the text file and convert it into integers for our network to use." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "with open('anna.txt', 'r') as f:\n", + " text=f.read()\n", + "vocab = set(text)\n", + "vocab_to_int = {c: i for i, c in enumerate(vocab)}\n", + "int_to_vocab = dict(enumerate(vocab))\n", + "chars = np.array([vocab_to_int[c] for c in text], dtype=np.int32)" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'Chapter 1\\n\\n\\nHappy families are all alike; every unhappy family is unhappy in its own\\nway.\\n\\nEverythin'" + ] + }, + "execution_count": 3, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "text[:100]" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([82, 78, 3, 48, 15, 79, 77, 50, 30, 20, 20, 20, 38, 3, 
48, 48, 8,\n", + " 50, 10, 3, 9, 33, 4, 33, 79, 43, 50, 3, 77, 79, 50, 3, 4, 4,\n", + " 50, 3, 4, 33, 17, 79, 64, 50, 79, 44, 79, 77, 8, 50, 49, 70, 78,\n", + " 3, 48, 48, 8, 50, 10, 3, 9, 33, 4, 8, 50, 33, 43, 50, 49, 70,\n", + " 78, 3, 48, 48, 8, 50, 33, 70, 50, 33, 15, 43, 50, 55, 62, 70, 20,\n", + " 62, 3, 8, 22, 20, 20, 80, 44, 79, 77, 8, 15, 78, 33, 70], dtype=int32)" + ] + }, + "execution_count": 4, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "chars[:100]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now I need to split up the data into batches, and into training and validation sets. I should be making a test set here, but I'm not going to worry about that. My test will be if the network can generate new text.\n", + "\n", + "Here I'll make both input and target arrays. The targets are the same as the inputs, except shifted one character over. I'll also drop the last bit of data so that I'll only have completely full batches.\n", + "\n", + "The idea here is to make a 2D matrix where the number of rows is equal to the number of batches. Each row will be one long concatenated string from the character data. We'll split this data into a training set and validation set using the `split_frac` keyword. This will keep 90% of the batches in the training set, the other 10% in the validation set." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def split_data(chars, batch_size, num_steps, split_frac=0.9):\n", + " \"\"\" \n", + " Split character data into training and validation sets, inputs and targets for each set.\n", + " \n", + " Arguments\n", + " ---------\n", + " chars: character array\n", + " batch_size: Number of examples in each batch\n", + " num_steps: Number of sequence steps to keep in the input and pass to the network\n", + " split_frac: Fraction of batches to keep in the training set\n", + " \n", + " \n", + " Returns train_x, train_y, val_x, val_y\n", + " \"\"\"\n", + " \n", + " \n", + " slice_size = batch_size * num_steps\n", + " n_batches = int(len(chars) / slice_size)\n", + " \n", + " # Drop the last few characters to make only full batches\n", + " x = chars[: n_batches*slice_size]\n", + " y = chars[1: n_batches*slice_size + 1]\n", + " \n", + " # Split the data into batch_size slices, then stack them into a 2D matrix \n", + " x = np.stack(np.split(x, batch_size))\n", + " y = np.stack(np.split(y, batch_size))\n", + " \n", + " # Now x and y are arrays with dimensions batch_size x n_batches*num_steps\n", + " \n", + " # Split into training and validation sets, keep the first split_frac batches for training\n", + " split_idx = int(n_batches*split_frac)\n", + " train_x, train_y = x[:, :split_idx*num_steps], y[:, :split_idx*num_steps]\n", + " val_x, val_y = x[:, split_idx*num_steps:], y[:, split_idx*num_steps:]\n", + " \n", + " return train_x, train_y, val_x, val_y" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "train_x, train_y, val_x, val_y = split_data(chars, 10, 200)" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "(10, 178400)" + ] + }, + "execution_count": 7, +
"metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "train_x.shape" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[82, 78, 3, 48, 15, 79, 77, 50, 30, 20],\n", + " [67, 70, 58, 50, 78, 79, 50, 9, 55, 44],\n", + " [50, 65, 3, 15, 65, 78, 33, 70, 32, 50],\n", + " [55, 15, 78, 79, 77, 50, 62, 55, 49, 4],\n", + " [50, 15, 78, 79, 50, 4, 3, 70, 58, 18],\n", + " [50, 51, 78, 77, 55, 49, 32, 78, 50, 4],\n", + " [15, 50, 15, 55, 20, 58, 55, 22, 20, 20],\n", + " [55, 50, 78, 79, 77, 43, 79, 4, 10, 56],\n", + " [78, 3, 15, 50, 33, 43, 50, 15, 78, 79],\n", + " [79, 77, 43, 79, 4, 10, 50, 3, 70, 58]], dtype=int32)" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "train_x[:,:10]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "I'll write another function to grab batches out of the arrays made by `split_data`. Here each batch will be a sliding window on these arrays with size `batch_size X num_steps`. For example, if we want our network to train on a sequence of 100 characters, `num_steps = 100`. For the next batch, we'll shift this window to the next sequence of `num_steps` characters. In this way we can feed batches to the network and the cell states will continue through on each batch."
+ ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_batch(arrs, num_steps):\n", + " batch_size, slice_size = arrs[0].shape\n", + " \n", + " n_batches = int(slice_size/num_steps)\n", + " for b in range(n_batches):\n", + " yield [x[:, b*num_steps: (b+1)*num_steps] for x in arrs]" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def build_rnn(num_classes, batch_size=50, num_steps=50, lstm_size=128, num_layers=2,\n", + " learning_rate=0.001, grad_clip=5, sampling=False):\n", + " \n", + " if sampling == True:\n", + " batch_size, num_steps = 1, 1\n", + "\n", + " tf.reset_default_graph()\n", + " \n", + " # Declare placeholders we'll feed into the graph\n", + " \n", + " inputs = tf.placeholder(tf.int32, [batch_size, num_steps], name='inputs')\n", + " x_one_hot = tf.one_hot(inputs, num_classes, name='x_one_hot')\n", + "\n", + "\n", + " targets = tf.placeholder(tf.int32, [batch_size, num_steps], name='targets')\n", + " y_one_hot = tf.one_hot(targets, num_classes, name='y_one_hot')\n", + " y_reshaped = tf.reshape(y_one_hot, [-1, num_classes])\n", + " \n", + " keep_prob = tf.placeholder(tf.float32, name='keep_prob')\n", + " \n", + " # Build the RNN layers\n", + " \n", + " lstm = tf.contrib.rnn.BasicLSTMCell(lstm_size)\n", + " drop = tf.contrib.rnn.DropoutWrapper(lstm, output_keep_prob=keep_prob)\n", + " cell = tf.contrib.rnn.MultiRNNCell([drop] * num_layers)\n", + "\n", + " initial_state = cell.zero_state(batch_size, tf.float32)\n", + "\n", + " # Run the data through the RNN layers\n", + " rnn_inputs = [tf.squeeze(i, squeeze_dims=[1]) for i in tf.split(x_one_hot, num_steps, 1)]\n", + " outputs, state = tf.contrib.rnn.static_rnn(cell, rnn_inputs, initial_state=initial_state)\n", + " \n", + " final_state = tf.identity(state, name='final_state')\n", + " \n", + " # 
Reshape output so it's a bunch of rows, one row for each cell output\n", + " \n", + " seq_output = tf.concat(outputs, axis=1, name='seq_output')\n", + " output = tf.reshape(seq_output, [-1, lstm_size], name='graph_output')\n", + " \n", + " # Now connect the RNN outputs to a softmax layer and calculate the cost\n", + " softmax_w = tf.Variable(tf.truncated_normal((lstm_size, num_classes), stddev=0.1),\n", + " name='softmax_w')\n", + " softmax_b = tf.Variable(tf.zeros(num_classes), name='softmax_b')\n", + " logits = tf.matmul(output, softmax_w) + softmax_b\n", + "\n", + " preds = tf.nn.softmax(logits, name='predictions')\n", + " \n", + " loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y_reshaped, name='loss')\n", + " cost = tf.reduce_mean(loss, name='cost')\n", + "\n", + " # Optimizer for training, using gradient clipping to control exploding gradients\n", + " tvars = tf.trainable_variables()\n", + " grads, _ = tf.clip_by_global_norm(tf.gradients(cost, tvars), grad_clip)\n", + " train_op = tf.train.AdamOptimizer(learning_rate)\n", + " optimizer = train_op.apply_gradients(zip(grads, tvars))\n", + "\n", + " # Export the nodes \n", + " export_nodes = ['inputs', 'targets', 'initial_state', 'final_state',\n", + " 'keep_prob', 'cost', 'preds', 'optimizer']\n", + " Graph = namedtuple('Graph', export_nodes)\n", + " local_dict = locals()\n", + " graph = Graph(*[local_dict[each] for each in export_nodes])\n", + " \n", + " return graph" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Hyperparameters\n", + "\n", + "Here I'm defining the hyperparameters for the network. The two you probably haven't seen before are `lstm_size` and `num_layers`. These set the number of hidden units in the LSTM layers and the number of LSTM layers, respectively. Of course, making these bigger can improve the network's performance but you'll have to watch out for overfitting.
If your validation loss is much larger than the training loss, you're probably overfitting. Decrease the size of the network or decrease the dropout keep probability." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "batch_size = 100\n", + "num_steps = 100\n", + "lstm_size = 512\n", + "num_layers = 2\n", + "learning_rate = 0.001" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Write out the graph for TensorBoard" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "model = build_rnn(len(vocab),\n", + " batch_size=batch_size,\n", + " num_steps=num_steps,\n", + " learning_rate=learning_rate,\n", + " lstm_size=lstm_size,\n", + " num_layers=num_layers)\n", + "\n", + "with tf.Session() as sess:\n", + " \n", + " sess.run(tf.global_variables_initializer())\n", + " file_writer = tf.summary.FileWriter('./logs/1', sess.graph)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training\n", + "\n", + "Time for training, which is pretty straightforward. Here I pass in some data, and get an LSTM state back. Then I pass that state back into the network so the next batch can continue the state from the previous batch. And every so often (set by `save_every_n`) I calculate the validation loss and save a checkpoint."
+ ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "!mkdir -p checkpoints/anna" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true, + "scrolled": true + }, + "outputs": [ + { + "ename": "ValueError", + "evalue": "Expected state to be a tuple of length 2, but received: Tensor(\"initial_state:0\", shape=(2, 2, 100, 512), dtype=float32)", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mValueError\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 8\u001b[0m \u001b[0mlearning_rate\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mlearning_rate\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 9\u001b[0m \u001b[0mlstm_size\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mlstm_size\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 10\u001b[0;31m num_layers=num_layers)\n\u001b[0m\u001b[1;32m 11\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 12\u001b[0m \u001b[0msaver\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtf\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mSaver\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmax_to_keep\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;36m100\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m\u001b[0m in \u001b[0;36mbuild_rnn\u001b[0;34m(num_classes, batch_size, num_steps, lstm_size, num_layers, learning_rate, grad_clip, sampling)\u001b[0m\n\u001b[1;32m 25\u001b[0m \u001b[0;31m# Run the data through the RNN layers\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 26\u001b[0m \u001b[0mrnn_inputs\u001b[0m \u001b[0;34m=\u001b[0m 
\u001b[0;34m[\u001b[0m\u001b[0mtf\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msqueeze\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msqueeze_dims\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mi\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mtf\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msplit\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mx_one_hot\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mnum_steps\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 27\u001b[0;31m \u001b[0moutputs\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstate\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtf\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcontrib\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mrnn\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mstatic_rnn\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcell\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mrnn_inputs\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0minitial_state\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0minitial_state\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 28\u001b[0m \u001b[0mfinal_state\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtf\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0midentity\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstate\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mname\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'final_state'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 29\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/mat/miniconda3/envs/tf-gpu/lib/python3.5/site-packages/tensorflow/contrib/rnn/python/ops/core_rnn.py\u001b[0m in \u001b[0;36mstatic_rnn\u001b[0;34m(cell, inputs, initial_state, dtype, sequence_length, scope)\u001b[0m\n\u001b[1;32m 195\u001b[0m state_size=cell.state_size)\n\u001b[1;32m 196\u001b[0m 
\u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 197\u001b[0;31m \u001b[0;34m(\u001b[0m\u001b[0moutput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstate\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mcall_cell\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 198\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 199\u001b[0m \u001b[0moutputs\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0moutput\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/mat/miniconda3/envs/tf-gpu/lib/python3.5/site-packages/tensorflow/contrib/rnn/python/ops/core_rnn.py\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 182\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mtime\u001b[0m \u001b[0;34m>\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mvarscope\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mreuse_variables\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 183\u001b[0m \u001b[0;31m# pylint: disable=cell-var-from-loop\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 184\u001b[0;31m \u001b[0mcall_cell\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mlambda\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mcell\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minput_\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstate\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 185\u001b[0m \u001b[0;31m# pylint: enable=cell-var-from-loop\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 186\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0msequence_length\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + 
"\u001b[0;32m/home/mat/miniconda3/envs/tf-gpu/lib/python3.5/site-packages/tensorflow/contrib/rnn/python/ops/core_rnn_cell_impl.py\u001b[0m in \u001b[0;36m__call__\u001b[0;34m(self, inputs, state, scope)\u001b[0m\n\u001b[1;32m 647\u001b[0m raise ValueError(\n\u001b[1;32m 648\u001b[0m \u001b[0;34m\"Expected state to be a tuple of length %d, but received: %s\"\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 649\u001b[0;31m % (len(self.state_size), state))\n\u001b[0m\u001b[1;32m 650\u001b[0m \u001b[0mcur_state\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mstate\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 651\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mValueError\u001b[0m: Expected state to be a tuple of length 2, but received: Tensor(\"initial_state:0\", shape=(2, 2, 100, 512), dtype=float32)" + ] + } + ], + "source": [ + "epochs = 1\n", + "save_every_n = 200\n", + "train_x, train_y, val_x, val_y = split_data(chars, batch_size, num_steps)\n", + "\n", + "model = build_rnn(len(vocab), \n", + " batch_size=batch_size,\n", + " num_steps=num_steps,\n", + " learning_rate=learning_rate,\n", + " lstm_size=lstm_size,\n", + " num_layers=num_layers)\n", + "\n", + "saver = tf.train.Saver(max_to_keep=100)\n", + "\n", + "with tf.Session() as sess:\n", + " sess.run(tf.global_variables_initializer())\n", + " \n", + " # Use the line below to load a checkpoint and resume training\n", + " #saver.restore(sess, 'checkpoints/anna20.ckpt')\n", + " \n", + " n_batches = int(train_x.shape[1]/num_steps)\n", + " iterations = n_batches * epochs\n", + " for e in range(epochs):\n", + " \n", + " # Train network\n", + " new_state = sess.run(model.initial_state)\n", + " loss = 0\n", + " for b, (x, y) in enumerate(get_batch([train_x, train_y], num_steps), 1):\n", + " iteration = e*n_batches + b\n", + " start = time.time()\n", + " feed = {model.inputs: x,\n", + " 
model.targets: y,\n", + " model.keep_prob: 0.5,\n", + " model.initial_state: new_state}\n", + " batch_loss, new_state, _ = sess.run([model.cost, model.final_state, model.optimizer], \n", + " feed_dict=feed)\n", + " loss += batch_loss\n", + " end = time.time()\n", + " print('Epoch {}/{} '.format(e+1, epochs),\n", + " 'Iteration {}/{}'.format(iteration, iterations),\n", + " 'Training loss: {:.4f}'.format(loss/b),\n", + " '{:.4f} sec/batch'.format((end-start)))\n", + " \n", + " \n", + " if (iteration%save_every_n == 0) or (iteration == iterations):\n", + " # Check performance, notice dropout has been set to 1\n", + " val_loss = []\n", + " new_state = sess.run(model.initial_state)\n", + " for x, y in get_batch([val_x, val_y], num_steps):\n", + " feed = {model.inputs: x,\n", + " model.targets: y,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " batch_loss, new_state = sess.run([model.cost, model.final_state], feed_dict=feed)\n", + " val_loss.append(batch_loss)\n", + "\n", + " print('Validation loss:', np.mean(val_loss),\n", + " 'Saving checkpoint!')\n", + " saver.save(sess, \"checkpoints/anna/i{}_l{}_{:.3f}.ckpt\".format(iteration, lstm_size, np.mean(val_loss)))" + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "model_checkpoint_path: \"checkpoints/anna/i3560_l512_1.122.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i200_l512_2.432.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i400_l512_1.980.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i600_l512_1.750.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i800_l512_1.595.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1000_l512_1.484.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1200_l512_1.407.ckpt\"\n", + "all_model_checkpoint_paths: 
\"checkpoints/anna/i1400_l512_1.349.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1600_l512_1.292.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1800_l512_1.255.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2000_l512_1.224.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2200_l512_1.204.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2400_l512_1.187.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2600_l512_1.172.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2800_l512_1.160.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3000_l512_1.148.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3200_l512_1.137.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3400_l512_1.129.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3560_l512_1.122.ckpt\"" + ] + }, + "execution_count": 35, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "tf.train.get_checkpoint_state('checkpoints/anna')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Sampling\n", + "\n", + "Now that the network is trained, we can use it to generate new text. The idea is that we pass in a character, then the network will predict the next character. We can use the new one to predict the next one. And we keep doing this to generate all new text. I also included some functionality to prime the network with some text by passing in a string and building up a state from that.\n", + "\n", + "The network gives us predictions for each character.
To reduce noise and make things a little less random, I'm going to only choose a new character from the top N most likely characters.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def pick_top_n(preds, vocab_size, top_n=5):\n", + " p = np.squeeze(preds)\n", + " p[np.argsort(p)[:-top_n]] = 0\n", + " p = p / np.sum(p)\n", + " c = np.random.choice(vocab_size, 1, p=p)[0]\n", + " return c" + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def sample(checkpoint, n_samples, lstm_size, vocab_size, prime=\"The \"):\n", + " samples = [c for c in prime]\n", + " model = build_rnn(vocab_size, lstm_size=lstm_size, sampling=True)\n", + " saver = tf.train.Saver()\n", + " with tf.Session() as sess:\n", + " saver.restore(sess, checkpoint)\n", + " new_state = sess.run(model.initial_state)\n", + " for c in prime:\n", + " x = np.zeros((1, 1))\n", + " x[0,0] = vocab_to_int[c]\n", + " feed = {model.inputs: x,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " preds, new_state = sess.run([model.preds, model.final_state], \n", + " feed_dict=feed)\n", + "\n", + " c = pick_top_n(preds, vocab_size)\n", + " samples.append(int_to_vocab[c])\n", + "\n", + " for i in range(n_samples):\n", + " x[0,0] = c\n", + " feed = {model.inputs: x,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " preds, new_state = sess.run([model.preds, model.final_state], \n", + " feed_dict=feed)\n", + "\n", + " c = pick_top_n(preds, vocab_size)\n", + " samples.append(int_to_vocab[c])\n", + " \n", + " return ''.join(samples)" + ] + }, + { + "cell_type": "code", + "execution_count": 44, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farlathit that if had so\n", + "like it that it were.
He could not trouble to his wife, and there was\n", + "anything in them of the side of his weaky in the creature at his forteren\n", + "to him.\n", + "\n", + "\"What is it? I can't bread to those,\" said Stepan Arkadyevitch. \"It's not\n", + "my children, and there is an almost this arm, true it mays already,\n", + "and tell you what I have say to you, and was not looking at the peasant,\n", + "why is, I don't know him out, and she doesn't speak to me immediately, as\n", + "you would say the countess and the more frest an angelembre, and time and\n", + "things's silent, but I was not in my stand that is in my head. But if he\n", + "say, and was so feeling with his soul. A child--in his soul of his\n", + "soul of his soul. He should not see that any of that sense of. Here he\n", + "had not been so composed and to speak for as in a whole picture, but\n", + "all the setting and her excellent and society, who had been delighted\n", + "and see to anywing had been being troed to thousand words on them,\n", + "we liked him.\n", + "\n", + "That set in her money at the table, he came into the party. The capable\n", + "of his she could not be as an old composure.\n", + "\n", + "\"That's all something there will be down becime by throe is\n", + "such a silent, as in a countess, I should state it out and divorct.\n", + "The discussion is not for me. I was that something was simply they are\n", + "all three manshess of a sensitions of mind it all.\"\n", + "\n", + "\"No,\" he thought, shouted and lifting his soul. \"While it might see your\n", + "honser and she, I could burst. And I had been a midelity. And I had a\n", + "marnief are through the countess,\" he said, looking at him, a chosing\n", + "which they had been carried out and still solied, and there was a sen that\n", + "was to be completely, and that this matter of all the seconds of it, and\n", + "a concipation were to her husband, who came up and conscaously, that he\n", + "was not the station. 
All his fourse she was always at the country,,\n", + "to speak oft, and though they were to hear the delightful throom and\n", + "whether they came towards the morning, and his living and a coller and\n", + "hold--the children. \n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i3560_l512_1.122.ckpt\"\n", + "samp = sample(checkpoint, 2000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farnt him oste wha sorind thans tout thint asd an sesand an hires on thime sind thit aled, ban thand and out hore as the ter hos ton ho te that, was tis tart al the hand sostint him sore an tit an son thes, win he se ther san ther hher tas tarereng,.\n", + "\n", + "Anl at an ades in ond hesiln, ad hhe torers teans, wast tar arering tho this sos alten sorer has hhas an siton ther him he had sin he ard ate te anling the sosin her ans and\n", + "arins asd and ther ale te tot an tand tanginge wath and ho ald, so sot th asend sat hare sother horesinnd, he hesense wing ante her so tith tir sherinn, anded and to the toul anderin he sorit he torsith she se atere an ting ot hand and thit hhe so the te wile har\n", + "ens ont in the sersise, and we he seres tar aterer, to ato tat or has he he wan ton here won and sen heren he sosering, to to theer oo adent har herere the wosh oute, was serild ward tous hed astend..\n", + "\n", + "I's sint on alt in har tor tit her asd hade shithans ored he talereng an soredendere tim tot hees. 
Tise sor and \n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i200_l512_2.432.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Fard as astice her said he celatice of to seress in the raice, and to be the some and sere allats to that said to that the sark and a cast a the wither ald the pacinesse of her had astition, he said to the sount as she west at hissele. Af the cond it he was a fact onthis astisarianing.\n", + "\n", + "\n", + "\"Or a ton to to be that's a more at aspestale as the sont of anstiring as\n", + "thours and trey.\n", + "\n", + "The same wo dangring the\n", + "raterst, who sore and somethy had ast out an of his book. \"We had's beane were that, and a morted a thay he had to tere. Then to\n", + "her homent andertersed his his ancouted to the pirsted, the soution for of the pirsice inthirgest and stenciol, with the hard and and\n", + "a colrice of to be oneres,\n", + "the song to this anderssad.\n", + "The could ounterss the said to serom of\n", + "soment a carsed of sheres of she\n", + "torded\n", + "har and want in their of hould, but\n", + "her told in that in he tad a the same to her. 
Serghing an her has and with the seed, and the camt ont his about of the\n", + "sail, the her then all houg ant or to hus to \n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i600_l512_1.750.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farrat, his felt has at it.\n", + "\n", + "\"When the pose ther hor exceed\n", + "to his sheant was,\" weat a sime of his sounsed. The coment and the facily that which had began terede a marilicaly whice whether the pose of his hand, at she was alligated herself the same on she had to\n", + "taiking to his forthing and streath how to hand\n", + "began in a lang at some at it, this he cholded not set all her. \"Wo love that is setthing. Him anstering as seen that.\"\n", + "\n", + "\"Yes in the man that say the mare a crances is it?\" said Sergazy Ivancatching. \"You doon think were somether is ifficult of a mone of\n", + "though the most at the countes that the\n", + "mean on the come to say the most, to\n", + "his feesing of\n", + "a man she, whilo he\n", + "sained and well, that he would still at to said. He wind at his for the sore in the most\n", + "of hoss and almoved to see him. 
They have betine the sumper into at he his stire, and what he was that at the so steate of the\n", + "sound, and shin should have a geest of shall feet on the conderation to she had been at that imporsing the dre\n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i1000_l512_1.484.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + } + ], + "metadata": { + "hide_input": false, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.1" + }, + "toc": { + "colors": { + "hover_highlight": "#DAA520", + "running_highlight": "#FF0000", + "selected_highlight": "#FFD700" + }, + "moveMenuLeft": true, + "nav_menu": { + "height": "123px", + "width": "335px" + }, + "navigate_menu": true, + "number_sections": true, + "sideBar": true, + "threshold": 4, + "toc_cell": false, + "toc_section_display": "block", + "toc_window_display": false + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/tensorboard/Anna KaRNNa Hyperparameters.ipynb b/tensorboard/Anna KaRNNa Hyperparameters.ipynb new file mode 100644 index 0000000..7fa52d6 --- /dev/null +++ b/tensorboard/Anna KaRNNa Hyperparameters.ipynb @@ -0,0 +1,64830 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Anna KaRNNa\n", + "\n", + "In this notebook, I'll build a character-wise RNN trained on Anna Karenina, one of my all-time favorite books. It'll be able to generate new text based on the text from the book.\n", + "\n", + "This network is based off of Andrej Karpathy's [post on RNNs](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) and [implementation in Torch](https://github.com/karpathy/char-rnn). 
Also, some information [here at r2rt](http://r2rt.com/recurrent-neural-networks-in-tensorflow-ii.html) and from [Sherjil Ozair](https://github.com/sherjilozair/char-rnn-tensorflow) on GitHub. Below is the general architecture of the character-wise RNN.\n", + "\n", + "" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "import time\n", + "from collections import namedtuple\n", + "\n", + "import numpy as np\n", + "import tensorflow as tf" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First we'll load the text file and convert it into integers for our network to use." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "with open('anna.txt', 'r') as f:\n", + " text=f.read()\n", + "vocab = set(text)\n", + "vocab_to_int = {c: i for i, c in enumerate(vocab)}\n", + "int_to_vocab = dict(enumerate(vocab))\n", + "chars = np.array([vocab_to_int[c] for c in text], dtype=np.int32)" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'Chapter 1\\n\\n\\nHappy families are all alike; every unhappy family is unhappy in its own\\nway.\\n\\nEverythin'" + ] + }, + "execution_count": 3, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "text[:100]" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([42, 74, 81, 14, 15, 4, 75, 5, 29, 51, 51, 51, 19, 81, 14, 14, 63,\n", + " 5, 45, 81, 50, 39, 35, 39, 4, 34, 5, 81, 75, 4, 5, 81, 35, 35,\n", + " 5, 81, 35, 39, 23, 4, 10, 5, 4, 43, 4, 75, 63, 5, 65, 82, 74,\n", + " 81, 14, 14, 63, 5, 45, 81, 50, 39, 35, 63, 5, 39, 34, 5, 65, 82,\n", + " 74, 81, 14, 14, 63, 5, 39, 82, 5, 39, 
15, 34, 5, 6, 16, 82, 51,\n", + " 16, 81, 63, 64, 51, 51, 59, 43, 4, 75, 63, 15, 74, 39, 82], dtype=int32)" + ] + }, + "execution_count": 4, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "chars[:100]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now I need to split up the data into batches, and into training and validation sets. I should be making a test set here, but I'm not going to worry about that. My test will be if the network can generate new text.\n", + "\n", + "Here I'll make both input and target arrays. The targets are the same as the inputs, except shifted one character over. I'll also drop the last bit of data so that I'll only have completely full batches.\n", + "\n", + "The idea here is to make a 2D matrix where the number of rows is equal to the number of batches. Each row will be one long concatenated string from the character data. We'll split this data into a training set and validation set using the `split_frac` keyword. This will keep 90% of the batches in the training set, the other 10% in the validation set." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def split_data(chars, batch_size, num_steps, split_frac=0.9):\n", + " \"\"\" \n", + " Split character data into training and validation sets, inputs and targets for each set.\n", + " \n", + " Arguments\n", + " ---------\n", + " chars: character array\n", + " batch_size: Number of sequences in each batch\n", + " num_steps: Number of sequence steps to keep in the input and pass to the network\n", + " split_frac: Fraction of batches to keep in the training set\n", + " \n", + " \n", + " Returns train_x, train_y, val_x, val_y\n", + " \"\"\"\n", + " \n", + " slice_size = batch_size * num_steps\n", + " n_batches = int(len(chars) / slice_size)\n", + " \n", + " # Drop the last few characters to make only full batches\n", + " x = chars[: n_batches*slice_size]\n", + " y = chars[1: n_batches*slice_size + 1]\n", + " \n", + " # Split the data into batch_size slices, then stack them into a 2D matrix \n", + " x = np.stack(np.split(x, batch_size))\n", + " y = np.stack(np.split(y, batch_size))\n", + " \n", + " # Now x and y are arrays with dimensions batch_size x n_batches*num_steps\n", + " \n", + " # Split into training and validation sets, keep the first split_frac batches for training\n", + " split_idx = int(n_batches*split_frac)\n", + " train_x, train_y = x[:, :split_idx*num_steps], y[:, :split_idx*num_steps]\n", + " val_x, val_y = x[:, split_idx*num_steps:], y[:, split_idx*num_steps:]\n", + " \n", + " return train_x, train_y, val_x, val_y" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "train_x, train_y, val_x, val_y = split_data(chars, 10, 200)" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "(10, 178400)" + ] + }, + "execution_count": 7, + "metadata":
{}, + "output_type": "execute_result" + } + ], + "source": [ + "train_x.shape" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[42, 74, 81, 14, 15, 4, 75, 5, 29, 51],\n", + " [12, 82, 47, 5, 74, 4, 5, 50, 6, 43],\n", + " [ 5, 18, 81, 15, 18, 74, 39, 82, 53, 5],\n", + " [ 6, 15, 74, 4, 75, 5, 16, 6, 65, 35],\n", + " [ 5, 15, 74, 4, 5, 35, 81, 82, 47, 76],\n", + " [ 5, 70, 74, 75, 6, 65, 53, 74, 5, 35],\n", + " [15, 5, 15, 6, 51, 47, 6, 64, 51, 51],\n", + " [ 6, 5, 74, 4, 75, 34, 4, 35, 45, 30],\n", + " [74, 81, 15, 5, 39, 34, 5, 15, 74, 4],\n", + " [ 4, 75, 34, 4, 35, 45, 5, 81, 82, 47]], dtype=int32)" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "train_x[:,:10]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "I'll write another function to grab batches out of the arrays made by `split_data`. Here each batch will be a sliding window over these arrays of size `batch_size x num_steps`. For example, if we want our network to train on a sequence of 100 characters, `num_steps = 100`. For the next batch, we'll shift this window to the next sequence of `num_steps` characters. In this way we can feed batches to the network and the cell state will carry through from batch to batch."
+ ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_batch(arrs, num_steps):\n", + " batch_size, slice_size = arrs[0].shape\n", + " \n", + " n_batches = int(slice_size/num_steps)\n", + " for b in range(n_batches):\n", + " yield [x[:, b*num_steps: (b+1)*num_steps] for x in arrs]" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def build_rnn(num_classes, batch_size=50, num_steps=50, lstm_size=128, num_layers=2,\n", + " learning_rate=0.001, grad_clip=5, sampling=False):\n", + " \n", + " if sampling == True:\n", + " batch_size, num_steps = 1, 1\n", + "\n", + " tf.reset_default_graph()\n", + " \n", + " # Declare placeholders we'll feed into the graph\n", + " with tf.name_scope('inputs'):\n", + " inputs = tf.placeholder(tf.int32, [batch_size, num_steps], name='inputs')\n", + " x_one_hot = tf.one_hot(inputs, num_classes, name='x_one_hot')\n", + " \n", + " with tf.name_scope('targets'):\n", + " targets = tf.placeholder(tf.int32, [batch_size, num_steps], name='targets')\n", + " y_one_hot = tf.one_hot(targets, num_classes, name='y_one_hot')\n", + " y_reshaped = tf.reshape(y_one_hot, [-1, num_classes])\n", + " \n", + " keep_prob = tf.placeholder(tf.float32, name='keep_prob')\n", + " \n", + " # Build the RNN layers\n", + " with tf.name_scope(\"RNN_cells\"):\n", + " lstm = tf.contrib.rnn.BasicLSTMCell(lstm_size)\n", + " drop = tf.contrib.rnn.DropoutWrapper(lstm, output_keep_prob=keep_prob)\n", + " cell = tf.contrib.rnn.MultiRNNCell([drop] * num_layers)\n", + " \n", + " with tf.name_scope(\"RNN_init_state\"):\n", + " initial_state = cell.zero_state(batch_size, tf.float32)\n", + "\n", + " # Run the data through the RNN layers\n", + " with tf.name_scope(\"RNN_forward\"):\n", + " rnn_inputs = [tf.squeeze(i, squeeze_dims=[1]) for i in tf.split(x_one_hot, num_steps, 
1)]\n", + " outputs, state = tf.contrib.rnn.static_rnn(cell, rnn_inputs, initial_state=initial_state)\n", + " \n", + " final_state = state\n", + " \n", + " # Reshape output so it's a bunch of rows, one row for each cell output\n", + " with tf.name_scope('sequence_reshape'):\n", + " seq_output = tf.concat(outputs, axis=1,name='seq_output')\n", + " output = tf.reshape(seq_output, [-1, lstm_size], name='graph_output')\n", + " \n", + " # Now connect the RNN outputs to a softmax layer and calculate the cost\n", + " with tf.name_scope('logits'):\n", + " softmax_w = tf.Variable(tf.truncated_normal((lstm_size, num_classes), stddev=0.1),\n", + " name='softmax_w')\n", + " softmax_b = tf.Variable(tf.zeros(num_classes), name='softmax_b')\n", + " logits = tf.matmul(output, softmax_w) + softmax_b\n", + " tf.summary.histogram('softmax_w', softmax_w)\n", + " tf.summary.histogram('softmax_b', softmax_b)\n", + "\n", + " with tf.name_scope('predictions'):\n", + " preds = tf.nn.softmax(logits, name='predictions')\n", + " tf.summary.histogram('predictions', preds)\n", + " \n", + " with tf.name_scope('cost'):\n", + " loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y_reshaped, name='loss')\n", + " cost = tf.reduce_mean(loss, name='cost')\n", + " tf.summary.scalar('cost', cost)\n", + "\n", + " # Optimizer for training, using gradient clipping to control exploding gradients\n", + " with tf.name_scope('train'):\n", + " tvars = tf.trainable_variables()\n", + " grads, _ = tf.clip_by_global_norm(tf.gradients(cost, tvars), grad_clip)\n", + " train_op = tf.train.AdamOptimizer(learning_rate)\n", + " optimizer = train_op.apply_gradients(zip(grads, tvars))\n", + " \n", + " merged = tf.summary.merge_all()\n", + " \n", + " # Export the nodes \n", + " export_nodes = ['inputs', 'targets', 'initial_state', 'final_state',\n", + " 'keep_prob', 'cost', 'preds', 'optimizer', 'merged']\n", + " Graph = namedtuple('Graph', export_nodes)\n", + " local_dict = locals()\n", + " graph = 
Graph(*[local_dict[each] for each in export_nodes])\n", + " \n", + " return graph" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Hyperparameters\n", + "\n", + "Here I'm defining the hyperparameters for the network. The two you probably haven't seen before are `lstm_size` and `num_layers`. These set the number of hidden units in the LSTM layers and the number of LSTM layers, respectively. Of course, making these bigger will improve the network's performance, but you'll have to watch out for overfitting. If your validation loss is much larger than the training loss, you're probably overfitting. Decrease the size of the network or decrease the dropout keep probability." + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "batch_size = 100\n", + "num_steps = 100\n", + "lstm_size = 512\n", + "num_layers = 2\n", + "learning_rate = 0.001" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training\n", + "\n", + "Time for training, which is pretty straightforward. Here I pass in some data and get an LSTM state back. Then I pass that state back into the network so the next batch can continue the state from the previous batch. Every iteration I also write the merged TensorBoard summaries out to the log file with `file_writer`."
+ ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "!mkdir -p checkpoints/anna" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def train(model, epochs, file_writer):\n", + " \n", + " with tf.Session() as sess:\n", + " sess.run(tf.global_variables_initializer())\n", + "\n", + " # Use the line below to load a checkpoint and resume training\n", + " #saver.restore(sess, 'checkpoints/anna20.ckpt')\n", + "\n", + " n_batches = int(train_x.shape[1]/num_steps)\n", + " iterations = n_batches * epochs\n", + " for e in range(epochs):\n", + "\n", + " # Train network\n", + " new_state = sess.run(model.initial_state)\n", + " loss = 0\n", + " for b, (x, y) in enumerate(get_batch([train_x, train_y], num_steps), 1):\n", + " iteration = e*n_batches + b\n", + " start = time.time()\n", + " feed = {model.inputs: x,\n", + " model.targets: y,\n", + " model.keep_prob: 0.5,\n", + " model.initial_state: new_state}\n", + " summary, batch_loss, new_state, _ = sess.run([model.merged, model.cost, \n", + " model.final_state, model.optimizer], \n", + " feed_dict=feed)\n", + " loss += batch_loss\n", + " end = time.time()\n", + " print('Epoch {}/{} '.format(e+1, epochs),\n", + " 'Iteration {}/{}'.format(iteration, iterations),\n", + " 'Training loss: {:.4f}'.format(loss/b),\n", + " '{:.4f} sec/batch'.format((end-start)))\n", + "\n", + " file_writer.add_summary(summary, iteration)" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Epoch 1/20 Iteration 1/3560 Training loss: 4.4166 0.6227 sec/batch\n", + "Epoch 1/20 Iteration 2/3560 Training loss: 4.4060 0.0297 sec/batch\n", + "Epoch 1/20 Iteration 3/3560 Training loss: 4.3931 0.0292 sec/batch\n", + "Epoch 1/20 
Iteration 4/3560 Training loss: 4.3737 0.0312 sec/batch\n", + "Epoch 1/20 Iteration 5/3560 Training loss: 4.3372 0.0347 sec/batch\n", + "Epoch 1/20 Iteration 6/3560 Training loss: 4.2606 0.0412 sec/batch\n", + "Epoch 1/20 Iteration 7/3560 Training loss: 4.1762 0.0489 sec/batch\n", + "Epoch 1/20 Iteration 8/3560 Training loss: 4.1022 0.0450 sec/batch\n", + "Epoch 1/20 Iteration 9/3560 Training loss: 4.0332 0.0439 sec/batch\n", + "Epoch 1/20 Iteration 10/3560 Training loss: 3.9722 0.0439 sec/batch\n", + "Epoch 1/20 Iteration 11/3560 Training loss: 3.9176 0.0445 sec/batch\n", + "Epoch 1/20 Iteration 12/3560 Training loss: 3.8715 0.0548 sec/batch\n", + "Epoch 1/20 Iteration 13/3560 Training loss: 3.8303 0.0448 sec/batch\n", + "Epoch 1/20 Iteration 14/3560 Training loss: 3.7942 0.0443 sec/batch\n", + "Epoch 1/20 Iteration 15/3560 Training loss: 3.7616 0.0448 sec/batch\n", + "Epoch 1/20 Iteration 16/3560 Training loss: 3.7324 0.0459 sec/batch\n", + "Epoch 1/20 Iteration 17/3560 Training loss: 3.7053 0.0483 sec/batch\n", + "Epoch 1/20 Iteration 18/3560 Training loss: 3.6826 0.0521 sec/batch\n", + "Epoch 1/20 Iteration 19/3560 Training loss: 3.6605 0.0460 sec/batch\n", + "Epoch 1/20 Iteration 20/3560 Training loss: 3.6386 0.0452 sec/batch\n", + "Epoch 1/20 Iteration 21/3560 Training loss: 3.6194 0.0454 sec/batch\n", + "Epoch 1/20 Iteration 22/3560 Training loss: 3.6012 0.0465 sec/batch\n", + "Epoch 1/20 Iteration 23/3560 Training loss: 3.5842 0.0462 sec/batch\n", + "Epoch 1/20 Iteration 24/3560 Training loss: 3.5683 0.0446 sec/batch\n", + "Epoch 1/20 Iteration 25/3560 Training loss: 3.5534 0.0446 sec/batch\n", + "Epoch 1/20 Iteration 26/3560 Training loss: 3.5402 0.0444 sec/batch\n", + "Epoch 1/20 Iteration 27/3560 Training loss: 3.5276 0.0527 sec/batch\n", + "Epoch 1/20 Iteration 28/3560 Training loss: 3.5150 0.0445 sec/batch\n", + "Epoch 1/20 Iteration 29/3560 Training loss: 3.5035 0.0452 sec/batch\n", + "Epoch 1/20 Iteration 30/3560 Training loss: 3.4925 0.0444 
sec/batch\n", + "Epoch 1/20 Iteration 31/3560 Training loss: 3.4830 0.0466 sec/batch\n", + "Epoch 1/20 Iteration 32/3560 Training loss: 3.4730 0.0498 sec/batch\n", + "Epoch 1/20 Iteration 33/3560 Training loss: 3.4632 0.0469 sec/batch\n", + "Epoch 1/20 Iteration 34/3560 Training loss: 3.4546 0.0458 sec/batch\n", + "Epoch 1/20 Iteration 35/3560 Training loss: 3.4457 0.0497 sec/batch\n", + "Epoch 1/20 Iteration 36/3560 Training loss: 3.4379 0.0448 sec/batch\n", + "Epoch 1/20 Iteration 37/3560 Training loss: 3.4300 0.0476 sec/batch\n", + "Epoch 1/20 Iteration 38/3560 Training loss: 3.4222 0.0437 sec/batch\n", + "Epoch 1/20 Iteration 39/3560 Training loss: 3.4146 0.0444 sec/batch\n", + "Epoch 1/20 Iteration 40/3560 Training loss: 3.4074 0.0445 sec/batch\n", + "Epoch 1/20 Iteration 41/3560 Training loss: 3.4007 0.0448 sec/batch\n", + "Epoch 1/20 Iteration 42/3560 Training loss: 3.3942 0.0481 sec/batch\n", + "Epoch 1/20 Iteration 43/3560 Training loss: 3.3879 0.0445 sec/batch\n", + "Epoch 1/20 Iteration 44/3560 Training loss: 3.3818 0.0458 sec/batch\n", + "Epoch 1/20 Iteration 45/3560 Training loss: 3.3758 0.0442 sec/batch\n", + "Epoch 1/20 Iteration 46/3560 Training loss: 3.3703 0.0444 sec/batch\n", + "Epoch 1/20 Iteration 47/3560 Training loss: 3.3651 0.0493 sec/batch\n", + "Epoch 1/20 Iteration 48/3560 Training loss: 3.3604 0.0453 sec/batch\n", + "Epoch 1/20 Iteration 49/3560 Training loss: 3.3556 0.0451 sec/batch\n", + "Epoch 1/20 Iteration 50/3560 Training loss: 3.3510 0.0456 sec/batch\n", + "Epoch 1/20 Iteration 51/3560 Training loss: 3.3465 0.0447 sec/batch\n", + "Epoch 1/20 Iteration 52/3560 Training loss: 3.3417 0.0446 sec/batch\n", + "Epoch 1/20 Iteration 53/3560 Training loss: 3.3375 0.0506 sec/batch\n", + "Epoch 1/20 Iteration 54/3560 Training loss: 3.3331 0.0468 sec/batch\n", + "Epoch 1/20 Iteration 55/3560 Training loss: 3.3290 0.0449 sec/batch\n", + "Epoch 1/20 Iteration 56/3560 Training loss: 3.3248 0.0448 sec/batch\n", + "Epoch 1/20 Iteration 57/3560 
Training loss: 3.3208 0.0482 sec/batch\n", + "Epoch 1/20 Iteration 58/3560 Training loss: 3.3170 0.0442 sec/batch\n", + "Epoch 1/20 Iteration 59/3560 Training loss: 3.3131 0.0447 sec/batch\n", + "Epoch 1/20 Iteration 60/3560 Training loss: 3.3095 0.0476 sec/batch\n", + "Epoch 1/20 Iteration 61/3560 Training loss: 3.3059 0.0454 sec/batch\n", + "Epoch 1/20 Iteration 62/3560 Training loss: 3.3028 0.0467 sec/batch\n", + "Epoch 1/20 Iteration 63/3560 Training loss: 3.2998 0.0455 sec/batch\n", + "Epoch 1/20 Iteration 64/3560 Training loss: 3.2962 0.0443 sec/batch\n", + "Epoch 1/20 Iteration 65/3560 Training loss: 3.2927 0.0489 sec/batch\n", + "Epoch 1/20 Iteration 66/3560 Training loss: 3.2897 0.0447 sec/batch\n", + "Epoch 1/20 Iteration 67/3560 Training loss: 3.2866 0.0492 sec/batch\n", + "Epoch 1/20 Iteration 68/3560 Training loss: 3.2829 0.0446 sec/batch\n", + "Epoch 1/20 Iteration 69/3560 Training loss: 3.2796 0.0448 sec/batch\n", + "Epoch 1/20 Iteration 70/3560 Training loss: 3.2767 0.0453 sec/batch\n", + "Epoch 1/20 Iteration 71/3560 Training loss: 3.2737 0.0449 sec/batch\n", + "Epoch 1/20 Iteration 72/3560 Training loss: 3.2710 0.0451 sec/batch\n", + "Epoch 1/20 Iteration 73/3560 Training loss: 3.2681 0.0450 sec/batch\n", + "Epoch 1/20 Iteration 74/3560 Training loss: 3.2652 0.0452 sec/batch\n", + "Epoch 1/20 Iteration 75/3560 Training loss: 3.2625 0.0451 sec/batch\n", + "Epoch 1/20 Iteration 76/3560 Training loss: 3.2599 0.0487 sec/batch\n", + "Epoch 1/20 Iteration 77/3560 Training loss: 3.2572 0.0491 sec/batch\n", + "Epoch 1/20 Iteration 78/3560 Training loss: 3.2545 0.0457 sec/batch\n", + "Epoch 1/20 Iteration 79/3560 Training loss: 3.2518 0.0452 sec/batch\n", + "Epoch 1/20 Iteration 80/3560 Training loss: 3.2489 0.0445 sec/batch\n", + "Epoch 1/20 Iteration 81/3560 Training loss: 3.2462 0.0516 sec/batch\n", + "Epoch 1/20 Iteration 82/3560 Training loss: 3.2436 0.0503 sec/batch\n", + "Epoch 1/20 Iteration 83/3560 Training loss: 3.2410 0.0450 sec/batch\n", + 
"Epoch 1/20 Iteration 84/3560 Training loss: 3.2384 0.0454 sec/batch\n", + "Epoch 1/20 Iteration 85/3560 Training loss: 3.2355 0.0455 sec/batch\n", + "Epoch 1/20 Iteration 86/3560 Training loss: 3.2328 0.0453 sec/batch\n", + "Epoch 1/20 Iteration 87/3560 Training loss: 3.2302 0.0449 sec/batch\n", + "Epoch 1/20 Iteration 88/3560 Training loss: 3.2274 0.0463 sec/batch\n", + "Epoch 1/20 Iteration 89/3560 Training loss: 3.2250 0.0511 sec/batch\n", + "Epoch 1/20 Iteration 90/3560 Training loss: 3.2224 0.0460 sec/batch\n", + "Epoch 1/20 Iteration 91/3560 Training loss: 3.2199 0.0454 sec/batch\n", + "Epoch 1/20 Iteration 92/3560 Training loss: 3.2172 0.0467 sec/batch\n", + "Epoch 1/20 Iteration 93/3560 Training loss: 3.2146 0.0452 sec/batch\n", + "Epoch 1/20 Iteration 94/3560 Training loss: 3.2119 0.0458 sec/batch\n", + "Epoch 1/20 Iteration 95/3560 Training loss: 3.2092 0.0472 sec/batch\n", + "Epoch 1/20 Iteration 96/3560 Training loss: 3.2064 0.0470 sec/batch\n", + "Epoch 1/20 Iteration 97/3560 Training loss: 3.2039 0.0471 sec/batch\n", + "Epoch 1/20 Iteration 98/3560 Training loss: 3.2011 0.0513 sec/batch\n", + "Epoch 1/20 Iteration 99/3560 Training loss: 3.1984 0.0540 sec/batch\n", + "Epoch 1/20 Iteration 100/3560 Training loss: 3.1957 0.0468 sec/batch\n", + "Epoch 1/20 Iteration 101/3560 Training loss: 3.1930 0.0456 sec/batch\n", + "Epoch 1/20 Iteration 102/3560 Training loss: 3.1903 0.0506 sec/batch\n", + "Epoch 1/20 Iteration 103/3560 Training loss: 3.1876 0.0459 sec/batch\n", + "Epoch 1/20 Iteration 104/3560 Training loss: 3.1848 0.0458 sec/batch\n", + "Epoch 1/20 Iteration 105/3560 Training loss: 3.1820 0.0465 sec/batch\n", + "Epoch 1/20 Iteration 106/3560 Training loss: 3.1792 0.0522 sec/batch\n", + "Epoch 1/20 Iteration 107/3560 Training loss: 3.1762 0.0506 sec/batch\n", + "Epoch 1/20 Iteration 108/3560 Training loss: 3.1734 0.0461 sec/batch\n", + "Epoch 1/20 Iteration 109/3560 Training loss: 3.1706 0.0467 sec/batch\n", + "Epoch 1/20 Iteration 110/3560 Training 
loss: 3.1676 0.0462 sec/batch\n", + "Epoch 1/20 Iteration 111/3560 Training loss: 3.1647 0.0538 sec/batch\n", + "Epoch 1/20 Iteration 112/3560 Training loss: 3.1619 0.0498 sec/batch\n", + "Epoch 1/20 Iteration 113/3560 Training loss: 3.1589 0.0470 sec/batch\n", + "Epoch 1/20 Iteration 114/3560 Training loss: 3.1559 0.0526 sec/batch\n", + "Epoch 1/20 Iteration 115/3560 Training loss: 3.1529 0.0532 sec/batch\n", + "Epoch 1/20 Iteration 116/3560 Training loss: 3.1498 0.0468 sec/batch\n", + "Epoch 1/20 Iteration 117/3560 Training loss: 3.1469 0.0465 sec/batch\n", + "Epoch 1/20 Iteration 118/3560 Training loss: 3.1441 0.0515 sec/batch\n", + "Epoch 1/20 Iteration 119/3560 Training loss: 3.1413 0.0467 sec/batch\n", + "Epoch 1/20 Iteration 120/3560 Training loss: 3.1384 0.0567 sec/batch\n", + "Epoch 1/20 Iteration 121/3560 Training loss: 3.1357 0.0461 sec/batch\n", + "Epoch 1/20 Iteration 122/3560 Training loss: 3.1328 0.0525 sec/batch\n", + "Epoch 1/20 Iteration 123/3560 Training loss: 3.1299 0.0462 sec/batch\n", + "Epoch 1/20 Iteration 124/3560 Training loss: 3.1271 0.0462 sec/batch\n", + "Epoch 1/20 Iteration 125/3560 Training loss: 3.1242 0.0560 sec/batch\n", + "Epoch 1/20 Iteration 126/3560 Training loss: 3.1211 0.0467 sec/batch\n", + "Epoch 1/20 Iteration 127/3560 Training loss: 3.1183 0.0471 sec/batch\n", + "Epoch 1/20 Iteration 128/3560 Training loss: 3.1155 0.0464 sec/batch\n", + "Epoch 1/20 Iteration 129/3560 Training loss: 3.1125 0.0471 sec/batch\n", + "Epoch 1/20 Iteration 130/3560 Training loss: 3.1096 0.0484 sec/batch\n", + "Epoch 1/20 Iteration 131/3560 Training loss: 3.1068 0.0465 sec/batch\n", + "Epoch 1/20 Iteration 132/3560 Training loss: 3.1038 0.0524 sec/batch\n", + "Epoch 1/20 Iteration 133/3560 Training loss: 3.1009 0.0471 sec/batch\n", + "Epoch 1/20 Iteration 134/3560 Training loss: 3.0980 0.0551 sec/batch\n", + "Epoch 1/20 Iteration 135/3560 Training loss: 3.0949 0.0480 sec/batch\n", + "Epoch 1/20 Iteration 136/3560 Training loss: 3.0919 0.0469 
sec/batch\n", + "Epoch 1/20 Iteration 137/3560 Training loss: 3.0889 0.0465 sec/batch\n", + "Epoch 1/20 Iteration 138/3560 Training loss: 3.0859 0.0469 sec/batch\n", + "Epoch 1/20 Iteration 139/3560 Training loss: 3.0831 0.0548 sec/batch\n", + "Epoch 1/20 Iteration 140/3560 Training loss: 3.0802 0.0471 sec/batch\n", + "Epoch 1/20 Iteration 141/3560 Training loss: 3.0774 0.0469 sec/batch\n", + "Epoch 1/20 Iteration 142/3560 Training loss: 3.0744 0.0465 sec/batch\n", + "Epoch 1/20 Iteration 143/3560 Training loss: 3.0715 0.0523 sec/batch\n", + "Epoch 1/20 Iteration 144/3560 Training loss: 3.0686 0.0470 sec/batch\n", + "Epoch 1/20 Iteration 145/3560 Training loss: 3.0658 0.0515 sec/batch\n", + "Epoch 1/20 Iteration 146/3560 Training loss: 3.0630 0.0479 sec/batch\n", + "Epoch 1/20 Iteration 147/3560 Training loss: 3.0602 0.0616 sec/batch\n", + "Epoch 1/20 Iteration 148/3560 Training loss: 3.0576 0.0510 sec/batch\n", + "Epoch 1/20 Iteration 149/3560 Training loss: 3.0547 0.0527 sec/batch\n", + "Epoch 1/20 Iteration 150/3560 Training loss: 3.0518 0.0471 sec/batch\n", + "Epoch 1/20 Iteration 151/3560 Training loss: 3.0493 0.0478 sec/batch\n", + "Epoch 1/20 Iteration 152/3560 Training loss: 3.0468 0.0483 sec/batch\n", + "Epoch 1/20 Iteration 153/3560 Training loss: 3.0441 0.0559 sec/batch\n", + "Epoch 1/20 Iteration 154/3560 Training loss: 3.0414 0.0477 sec/batch\n", + "Epoch 1/20 Iteration 155/3560 Training loss: 3.0386 0.0479 sec/batch\n", + "Epoch 1/20 Iteration 156/3560 Training loss: 3.0358 0.0491 sec/batch\n", + "Epoch 1/20 Iteration 157/3560 Training loss: 3.0330 0.0472 sec/batch\n", + "Epoch 1/20 Iteration 158/3560 Training loss: 3.0302 0.0483 sec/batch\n", + "Epoch 1/20 Iteration 159/3560 Training loss: 3.0274 0.0477 sec/batch\n", + "Epoch 1/20 Iteration 160/3560 Training loss: 3.0248 0.0473 sec/batch\n", + "Epoch 1/20 Iteration 161/3560 Training loss: 3.0222 0.0471 sec/batch\n", + "Epoch 1/20 Iteration 162/3560 Training loss: 3.0194 0.0477 sec/batch\n", + "Epoch 
1/20 Iteration 163/3560 Training loss: 3.0166 0.0497 sec/batch\n", + "Epoch 1/20 Iteration 164/3560 Training loss: 3.0140 0.0494 sec/batch\n", + "Epoch 1/20 Iteration 165/3560 Training loss: 3.0114 0.0524 sec/batch\n", + "Epoch 1/20 Iteration 166/3560 Training loss: 3.0087 0.0478 sec/batch\n", + "Epoch 1/20 Iteration 167/3560 Training loss: 3.0062 0.0498 sec/batch\n", + "Epoch 1/20 Iteration 168/3560 Training loss: 3.0036 0.0491 sec/batch\n", + "Epoch 1/20 Iteration 169/3560 Training loss: 3.0011 0.0479 sec/batch\n", + "Epoch 1/20 Iteration 170/3560 Training loss: 2.9985 0.0474 sec/batch\n", + "Epoch 1/20 Iteration 171/3560 Training loss: 2.9960 0.0487 sec/batch\n", + "Epoch 1/20 Iteration 172/3560 Training loss: 2.9937 0.0481 sec/batch\n", + "Epoch 1/20 Iteration 173/3560 Training loss: 2.9914 0.0472 sec/batch\n", + "Epoch 1/20 Iteration 174/3560 Training loss: 2.9892 0.0474 sec/batch\n", + "Epoch 1/20 Iteration 175/3560 Training loss: 2.9869 0.0480 sec/batch\n", + "Epoch 1/20 Iteration 176/3560 Training loss: 2.9845 0.0476 sec/batch\n", + "Epoch 1/20 Iteration 177/3560 Training loss: 2.9820 0.0534 sec/batch\n", + "Epoch 1/20 Iteration 178/3560 Training loss: 2.9794 0.0474 sec/batch\n", + "Epoch 2/20 Iteration 179/3560 Training loss: 2.5898 0.0466 sec/batch\n", + "Epoch 2/20 Iteration 180/3560 Training loss: 2.5498 0.0472 sec/batch\n", + "Epoch 2/20 Iteration 181/3560 Training loss: 2.5412 0.0492 sec/batch\n", + "Epoch 2/20 Iteration 182/3560 Training loss: 2.5373 0.0489 sec/batch\n", + "Epoch 2/20 Iteration 183/3560 Training loss: 2.5342 0.0467 sec/batch\n", + "Epoch 2/20 Iteration 184/3560 Training loss: 2.5320 0.0481 sec/batch\n", + "Epoch 2/20 Iteration 185/3560 Training loss: 2.5321 0.0474 sec/batch\n", + "Epoch 2/20 Iteration 186/3560 Training loss: 2.5316 0.0482 sec/batch\n", + "Epoch 2/20 Iteration 187/3560 Training loss: 2.5321 0.0614 sec/batch\n", + "Epoch 2/20 Iteration 188/3560 Training loss: 2.5309 0.0479 sec/batch\n", + "Epoch 2/20 Iteration 189/3560 
Training loss: 2.5284 0.0486 sec/batch\n", + "Epoch 2/20 Iteration 190/3560 Training loss: 2.5277 0.0472 sec/batch\n", + "Epoch 2/20 Iteration 191/3560 Training loss: 2.5269 0.0531 sec/batch\n", + "Epoch 2/20 Iteration 192/3560 Training loss: 2.5283 0.0516 sec/batch\n", + "Epoch 2/20 Iteration 193/3560 Training loss: 2.5275 0.0476 sec/batch\n", + "Epoch 2/20 Iteration 194/3560 Training loss: 2.5268 0.0479 sec/batch\n", + "Epoch 2/20 Iteration 195/3560 Training loss: 2.5258 0.0474 sec/batch\n", + "Epoch 2/20 Iteration 196/3560 Training loss: 2.5266 0.0494 sec/batch\n", + "Epoch 2/20 Iteration 197/3560 Training loss: 2.5257 0.0486 sec/batch\n", + "Epoch 2/20 Iteration 198/3560 Training loss: 2.5236 0.0480 sec/batch\n", + "Epoch 2/20 Iteration 199/3560 Training loss: 2.5222 0.0528 sec/batch\n", + "Epoch 2/20 Iteration 200/3560 Training loss: 2.5227 0.0490 sec/batch\n", + "Epoch 2/20 Iteration 201/3560 Training loss: 2.5216 0.0494 sec/batch\n", + "Epoch 2/20 Iteration 202/3560 Training loss: 2.5201 0.0599 sec/batch\n", + "Epoch 2/20 Iteration 203/3560 Training loss: 2.5184 0.0476 sec/batch\n", + "Epoch 2/20 Iteration 204/3560 Training loss: 2.5173 0.0481 sec/batch\n", + "Epoch 2/20 Iteration 205/3560 Training loss: 2.5161 0.0485 sec/batch\n", + "Epoch 2/20 Iteration 206/3560 Training loss: 2.5148 0.0500 sec/batch\n", + "Epoch 2/20 Iteration 207/3560 Training loss: 2.5142 0.0480 sec/batch\n", + "Epoch 2/20 Iteration 208/3560 Training loss: 2.5133 0.0486 sec/batch\n", + "Epoch 2/20 Iteration 209/3560 Training loss: 2.5129 0.0476 sec/batch\n", + "Epoch 2/20 Iteration 210/3560 Training loss: 2.5119 0.0530 sec/batch\n", + "Epoch 2/20 Iteration 211/3560 Training loss: 2.5106 0.0480 sec/batch\n", + "Epoch 2/20 Iteration 212/3560 Training loss: 2.5098 0.0487 sec/batch\n", + "Epoch 2/20 Iteration 213/3560 Training loss: 2.5086 0.0481 sec/batch\n", + "Epoch 2/20 Iteration 214/3560 Training loss: 2.5079 0.0491 sec/batch\n", + "Epoch 2/20 Iteration 215/3560 Training loss: 2.5068 
0.0523 sec/batch\n", + "Epoch 2/20 Iteration 216/3560 Training loss: 2.5051 0.0479 sec/batch\n", + "Epoch 2/20 Iteration 217/3560 Training loss: 2.5035 0.0481 sec/batch\n", + "Epoch 2/20 Iteration 218/3560 Training loss: 2.5021 0.0477 sec/batch\n", + "Epoch 2/20 Iteration 219/3560 Training loss: 2.5008 0.0495 sec/batch\n", + "Epoch 2/20 Iteration 220/3560 Training loss: 2.4995 0.0510 sec/batch\n", + "Epoch 2/20 Iteration 221/3560 Training loss: 2.4980 0.0534 sec/batch\n", + "Epoch 2/20 Iteration 222/3560 Training loss: 2.4968 0.0489 sec/batch\n", + "Epoch 2/20 Iteration 223/3560 Training loss: 2.4956 0.0596 sec/batch\n", + "Epoch 2/20 Iteration 224/3560 Training loss: 2.4938 0.0479 sec/batch\n", + "Epoch 2/20 Iteration 225/3560 Training loss: 2.4933 0.0492 sec/batch\n", + "Epoch 2/20 Iteration 226/3560 Training loss: 2.4922 0.0488 sec/batch\n", + "Epoch 2/20 Iteration 227/3560 Training loss: 2.4912 0.0476 sec/batch\n", + "Epoch 2/20 Iteration 228/3560 Training loss: 2.4905 0.0484 sec/batch\n", + "Epoch 2/20 Iteration 229/3560 Training loss: 2.4893 0.0593 sec/batch\n", + "Epoch 2/20 Iteration 230/3560 Training loss: 2.4884 0.0485 sec/batch\n", + "Epoch 2/20 Iteration 231/3560 Training loss: 2.4874 0.0482 sec/batch\n", + "Epoch 2/20 Iteration 232/3560 Training loss: 2.4862 0.0475 sec/batch\n", + "Epoch 2/20 Iteration 233/3560 Training loss: 2.4851 0.0552 sec/batch\n", + "Epoch 2/20 Iteration 234/3560 Training loss: 2.4842 0.0559 sec/batch\n", + "Epoch 2/20 Iteration 235/3560 Training loss: 2.4833 0.0497 sec/batch\n", + "Epoch 2/20 Iteration 236/3560 Training loss: 2.4821 0.0501 sec/batch\n", + "Epoch 2/20 Iteration 237/3560 Training loss: 2.4810 0.0525 sec/batch\n", + "Epoch 2/20 Iteration 238/3560 Training loss: 2.4804 0.0496 sec/batch\n", + "Epoch 2/20 Iteration 239/3560 Training loss: 2.4795 0.0477 sec/batch\n", + "Epoch 2/20 Iteration 240/3560 Training loss: 2.4787 0.0476 sec/batch\n", + "Epoch 2/20 Iteration 241/3560 Training loss: 2.4782 0.0475 sec/batch\n", + 
"Epoch 2/20 Iteration 242/3560 Training loss: 2.4773 0.0524 sec/batch\n", + "Epoch 2/20 Iteration 243/3560 Training loss: 2.4762 0.0506 sec/batch\n", + "Epoch 2/20 Iteration 244/3560 Training loss: 2.4755 0.0487 sec/batch\n", + "Epoch 2/20 Iteration 245/3560 Training loss: 2.4747 0.0552 sec/batch\n", + "Epoch 2/20 Iteration 246/3560 Training loss: 2.4734 0.0482 sec/batch\n", + "Epoch 2/20 Iteration 247/3560 Training loss: 2.4723 0.0530 sec/batch\n", + "Epoch 2/20 Iteration 248/3560 Training loss: 2.4717 0.0490 sec/batch\n", + "Epoch 2/20 Iteration 249/3560 Training loss: 2.4709 0.0477 sec/batch\n", + "Epoch 2/20 Iteration 250/3560 Training loss: 2.4704 0.0503 sec/batch\n", + "Epoch 2/20 Iteration 251/3560 Training loss: 2.4697 0.0565 sec/batch\n", + "Epoch 2/20 Iteration 252/3560 Training loss: 2.4687 0.0479 sec/batch\n", + "Epoch 2/20 Iteration 253/3560 Training loss: 2.4679 0.0528 sec/batch\n", + "Epoch 2/20 Iteration 254/3560 Training loss: 2.4676 0.0481 sec/batch\n", + "Epoch 2/20 Iteration 255/3560 Training loss: 2.4667 0.0495 sec/batch\n", + "Epoch 2/20 Iteration 256/3560 Training loss: 2.4662 0.0477 sec/batch\n", + "Epoch 2/20 Iteration 257/3560 Training loss: 2.4653 0.0473 sec/batch\n", + "Epoch 2/20 Iteration 258/3560 Training loss: 2.4645 0.0486 sec/batch\n", + "Epoch 2/20 Iteration 259/3560 Training loss: 2.4636 0.0560 sec/batch\n", + "Epoch 2/20 Iteration 260/3560 Training loss: 2.4630 0.0522 sec/batch\n", + "Epoch 2/20 Iteration 261/3560 Training loss: 2.4622 0.0486 sec/batch\n", + "Epoch 2/20 Iteration 262/3560 Training loss: 2.4612 0.0494 sec/batch\n", + "Epoch 2/20 Iteration 263/3560 Training loss: 2.4600 0.0574 sec/batch\n", + "Epoch 2/20 Iteration 264/3560 Training loss: 2.4592 0.0497 sec/batch\n", + "Epoch 2/20 Iteration 265/3560 Training loss: 2.4585 0.0486 sec/batch\n", + "Epoch 2/20 Iteration 266/3560 Training loss: 2.4578 0.0525 sec/batch\n", + "Epoch 2/20 Iteration 267/3560 Training loss: 2.4569 0.0483 sec/batch\n", + "Epoch 2/20 Iteration 
268/3560 Training loss: 2.4564 0.0493 sec/batch\n", + "Epoch 2/20 Iteration 269/3560 Training loss: 2.4556 0.0542 sec/batch\n", + "Epoch 2/20 Iteration 270/3560 Training loss: 2.4549 0.0490 sec/batch\n", + "Epoch 2/20 Iteration 271/3560 Training loss: 2.4542 0.0476 sec/batch\n", + "Epoch 2/20 Iteration 272/3560 Training loss: 2.4533 0.0497 sec/batch\n", + "Epoch 2/20 Iteration 273/3560 Training loss: 2.4524 0.0487 sec/batch\n", + "Epoch 2/20 Iteration 274/3560 Training loss: 2.4516 0.0494 sec/batch\n", + "Epoch 2/20 Iteration 275/3560 Training loss: 2.4508 0.0479 sec/batch\n", + "Epoch 2/20 Iteration 276/3560 Training loss: 2.4500 0.0535 sec/batch\n", + "Epoch 2/20 Iteration 277/3560 Training loss: 2.4493 0.0484 sec/batch\n", + "Epoch 2/20 Iteration 278/3560 Training loss: 2.4485 0.0479 sec/batch\n", + "Epoch 2/20 Iteration 279/3560 Training loss: 2.4479 0.0483 sec/batch\n", + "Epoch 2/20 Iteration 280/3560 Training loss: 2.4472 0.0482 sec/batch\n", + "Epoch 2/20 Iteration 281/3560 Training loss: 2.4463 0.0497 sec/batch\n", + "Epoch 2/20 Iteration 282/3560 Training loss: 2.4455 0.0493 sec/batch\n", + "Epoch 2/20 Iteration 283/3560 Training loss: 2.4448 0.0499 sec/batch\n", + "Epoch 2/20 Iteration 284/3560 Training loss: 2.4441 0.0518 sec/batch\n", + "Epoch 2/20 Iteration 285/3560 Training loss: 2.4433 0.0489 sec/batch\n", + "Epoch 2/20 Iteration 286/3560 Training loss: 2.4428 0.0502 sec/batch\n", + "Epoch 2/20 Iteration 287/3560 Training loss: 2.4423 0.0478 sec/batch\n", + "Epoch 2/20 Iteration 288/3560 Training loss: 2.4413 0.0480 sec/batch\n", + "Epoch 2/20 Iteration 289/3560 Training loss: 2.4407 0.0548 sec/batch\n", + "Epoch 2/20 Iteration 290/3560 Training loss: 2.4401 0.0506 sec/batch\n", + "Epoch 2/20 Iteration 291/3560 Training loss: 2.4395 0.0489 sec/batch\n", + "Epoch 2/20 Iteration 292/3560 Training loss: 2.4387 0.0484 sec/batch\n", + "Epoch 2/20 Iteration 293/3560 Training loss: 2.4380 0.0477 sec/batch\n", + "Epoch 2/20 Iteration 294/3560 Training loss: 
2.4370 0.0544 sec/batch\n", + "Epoch 2/20 Iteration 295/3560 Training loss: 2.4363 0.0486 sec/batch\n", + "Epoch 2/20 Iteration 296/3560 Training loss: 2.4357 0.0491 sec/batch\n", + "Epoch 2/20 Iteration 297/3560 Training loss: 2.4352 0.0485 sec/batch\n", + "Epoch 2/20 Iteration 298/3560 Training loss: 2.4346 0.0488 sec/batch\n", + "Epoch 2/20 Iteration 299/3560 Training loss: 2.4341 0.0604 sec/batch\n", + "Epoch 2/20 Iteration 300/3560 Training loss: 2.4335 0.0486 sec/batch\n", + "Epoch 2/20 Iteration 301/3560 Training loss: 2.4328 0.0536 sec/batch\n", + "Epoch 2/20 Iteration 302/3560 Training loss: 2.4323 0.0501 sec/batch\n", + "Epoch 2/20 Iteration 303/3560 Training loss: 2.4317 0.0493 sec/batch\n", + "Epoch 2/20 Iteration 304/3560 Training loss: 2.4310 0.0519 sec/batch\n", + "Epoch 2/20 Iteration 305/3560 Training loss: 2.4305 0.0511 sec/batch\n", + "Epoch 2/20 Iteration 306/3560 Training loss: 2.4300 0.0489 sec/batch\n", + "Epoch 2/20 Iteration 307/3560 Training loss: 2.4294 0.0486 sec/batch\n", + "Epoch 2/20 Iteration 308/3560 Training loss: 2.4288 0.0485 sec/batch\n", + "Epoch 2/20 Iteration 309/3560 Training loss: 2.4283 0.0489 sec/batch\n", + "Epoch 2/20 Iteration 310/3560 Training loss: 2.4275 0.0513 sec/batch\n", + "Epoch 2/20 Iteration 311/3560 Training loss: 2.4270 0.0484 sec/batch\n", + "Epoch 2/20 Iteration 312/3560 Training loss: 2.4265 0.0521 sec/batch\n", + "Epoch 2/20 Iteration 313/3560 Training loss: 2.4258 0.0483 sec/batch\n", + "Epoch 2/20 Iteration 314/3560 Training loss: 2.4253 0.0496 sec/batch\n", + "Epoch 2/20 Iteration 315/3560 Training loss: 2.4248 0.0529 sec/batch\n", + "Epoch 2/20 Iteration 316/3560 Training loss: 2.4243 0.0511 sec/batch\n", + "Epoch 2/20 Iteration 317/3560 Training loss: 2.4239 0.0486 sec/batch\n", + "Epoch 2/20 Iteration 318/3560 Training loss: 2.4233 0.0486 sec/batch\n", + "Epoch 2/20 Iteration 319/3560 Training loss: 2.4229 0.0483 sec/batch\n", + "Epoch 2/20 Iteration 320/3560 Training loss: 2.4223 0.0526 
sec/batch\n", + "Epoch 2/20 Iteration 321/3560 Training loss: 2.4217 0.0499 sec/batch\n", + "Epoch 2/20 Iteration 322/3560 Training loss: 2.4211 0.0485 sec/batch\n", + "Epoch 2/20 Iteration 323/3560 Training loss: 2.4206 0.0596 sec/batch\n", + "Epoch 2/20 Iteration 324/3560 Training loss: 2.4202 0.0490 sec/batch\n", + "Epoch 2/20 Iteration 325/3560 Training loss: 2.4196 0.0524 sec/batch\n", + "Epoch 2/20 Iteration 326/3560 Training loss: 2.4192 0.0486 sec/batch\n", + "Epoch 2/20 Iteration 327/3560 Training loss: 2.4187 0.0501 sec/batch\n", + "Epoch 2/20 Iteration 328/3560 Training loss: 2.4180 0.0489 sec/batch\n", + "Epoch 2/20 Iteration 329/3560 Training loss: 2.4177 0.0552 sec/batch\n", + "Epoch 2/20 Iteration 330/3560 Training loss: 2.4176 0.0494 sec/batch\n", + "Epoch 2/20 Iteration 331/3560 Training loss: 2.4171 0.0481 sec/batch\n", + "Epoch 2/20 Iteration 332/3560 Training loss: 2.4167 0.0537 sec/batch\n", + "Epoch 2/20 Iteration 333/3560 Training loss: 2.4162 0.0496 sec/batch\n", + "Epoch 2/20 Iteration 334/3560 Training loss: 2.4157 0.0483 sec/batch\n", + "Epoch 2/20 Iteration 335/3560 Training loss: 2.4152 0.0543 sec/batch\n", + "Epoch 2/20 Iteration 336/3560 Training loss: 2.4147 0.0486 sec/batch\n", + "Epoch 2/20 Iteration 337/3560 Training loss: 2.4140 0.0506 sec/batch\n", + "Epoch 2/20 Iteration 338/3560 Training loss: 2.4137 0.0545 sec/batch\n", + "Epoch 2/20 Iteration 339/3560 Training loss: 2.4132 0.0495 sec/batch\n", + "Epoch 2/20 Iteration 340/3560 Training loss: 2.4126 0.0502 sec/batch\n", + "Epoch 2/20 Iteration 341/3560 Training loss: 2.4120 0.0528 sec/batch\n", + "Epoch 2/20 Iteration 342/3560 Training loss: 2.4115 0.0487 sec/batch\n", + "Epoch 2/20 Iteration 343/3560 Training loss: 2.4111 0.0504 sec/batch\n", + "Epoch 2/20 Iteration 344/3560 Training loss: 2.4106 0.0499 sec/batch\n", + "Epoch 2/20 Iteration 345/3560 Training loss: 2.4101 0.0532 sec/batch\n", + "Epoch 2/20 Iteration 346/3560 Training loss: 2.4096 0.0529 sec/batch\n", + "Epoch 
2/20 Iteration 347/3560 Training loss: 2.4092 0.0492 sec/batch\n", + "Epoch 2/20 Iteration 348/3560 Training loss: 2.4086 0.0496 sec/batch\n", + "Epoch 2/20 Iteration 349/3560 Training loss: 2.4082 0.0495 sec/batch\n", + "Epoch 2/20 Iteration 350/3560 Training loss: 2.4079 0.0486 sec/batch\n", + "Epoch 2/20 Iteration 351/3560 Training loss: 2.4077 0.0493 sec/batch\n", + "Epoch 2/20 Iteration 352/3560 Training loss: 2.4074 0.0559 sec/batch\n", + "Epoch 2/20 Iteration 353/3560 Training loss: 2.4071 0.0504 sec/batch\n", + "Epoch 2/20 Iteration 354/3560 Training loss: 2.4067 0.0538 sec/batch\n", + "Epoch 2/20 Iteration 355/3560 Training loss: 2.4062 0.0501 sec/batch\n", + "Epoch 2/20 Iteration 356/3560 Training loss: 2.4055 0.0547 sec/batch\n", + "Epoch 3/20 Iteration 357/3560 Training loss: 2.3644 0.0508 sec/batch\n", + "Epoch 3/20 Iteration 358/3560 Training loss: 2.3223 0.0501 sec/batch\n", + "Epoch 3/20 Iteration 359/3560 Training loss: 2.3169 0.0494 sec/batch\n", + "Epoch 3/20 Iteration 360/3560 Training loss: 2.3140 0.0519 sec/batch\n", + "Epoch 3/20 Iteration 361/3560 Training loss: 2.3134 0.0513 sec/batch\n", + "Epoch 3/20 Iteration 362/3560 Training loss: 2.3097 0.0507 sec/batch\n", + "Epoch 3/20 Iteration 363/3560 Training loss: 2.3106 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 364/3560 Training loss: 2.3116 0.0483 sec/batch\n", + "Epoch 3/20 Iteration 365/3560 Training loss: 2.3127 0.0560 sec/batch\n", + "Epoch 3/20 Iteration 366/3560 Training loss: 2.3123 0.0481 sec/batch\n", + "Epoch 3/20 Iteration 367/3560 Training loss: 2.3106 0.0495 sec/batch\n", + "Epoch 3/20 Iteration 368/3560 Training loss: 2.3091 0.0529 sec/batch\n", + "Epoch 3/20 Iteration 369/3560 Training loss: 2.3088 0.0564 sec/batch\n", + "Epoch 3/20 Iteration 370/3560 Training loss: 2.3111 0.0552 sec/batch\n", + "Epoch 3/20 Iteration 371/3560 Training loss: 2.3109 0.0497 sec/batch\n", + "Epoch 3/20 Iteration 372/3560 Training loss: 2.3108 0.0484 sec/batch\n", + "Epoch 3/20 Iteration 373/3560 
Training loss: 2.3099 0.0521 sec/batch\n", + "Epoch 3/20 Iteration 374/3560 Training loss: 2.3117 0.0483 sec/batch\n", + "Epoch 3/20 Iteration 375/3560 Training loss: 2.3120 0.0500 sec/batch\n", + "Epoch 3/20 Iteration 376/3560 Training loss: 2.3107 0.0486 sec/batch\n", + "Epoch 3/20 Iteration 377/3560 Training loss: 2.3102 0.0562 sec/batch\n", + "Epoch 3/20 Iteration 378/3560 Training loss: 2.3123 0.0500 sec/batch\n", + "Epoch 3/20 Iteration 379/3560 Training loss: 2.3116 0.0495 sec/batch\n", + "Epoch 3/20 Iteration 380/3560 Training loss: 2.3109 0.0559 sec/batch\n", + "Epoch 3/20 Iteration 381/3560 Training loss: 2.3104 0.0495 sec/batch\n", + "Epoch 3/20 Iteration 382/3560 Training loss: 2.3098 0.0545 sec/batch\n", + "Epoch 3/20 Iteration 383/3560 Training loss: 2.3092 0.0535 sec/batch\n", + "Epoch 3/20 Iteration 384/3560 Training loss: 2.3090 0.0575 sec/batch\n", + "Epoch 3/20 Iteration 385/3560 Training loss: 2.3096 0.0508 sec/batch\n", + "Epoch 3/20 Iteration 386/3560 Training loss: 2.3093 0.0485 sec/batch\n", + "Epoch 3/20 Iteration 387/3560 Training loss: 2.3096 0.0507 sec/batch\n", + "Epoch 3/20 Iteration 388/3560 Training loss: 2.3089 0.0484 sec/batch\n", + "Epoch 3/20 Iteration 389/3560 Training loss: 2.3085 0.0536 sec/batch\n", + "Epoch 3/20 Iteration 390/3560 Training loss: 2.3086 0.0486 sec/batch\n", + "Epoch 3/20 Iteration 391/3560 Training loss: 2.3082 0.0482 sec/batch\n", + "Epoch 3/20 Iteration 392/3560 Training loss: 2.3079 0.0484 sec/batch\n", + "Epoch 3/20 Iteration 393/3560 Training loss: 2.3074 0.0501 sec/batch\n", + "Epoch 3/20 Iteration 394/3560 Training loss: 2.3061 0.0505 sec/batch\n", + "Epoch 3/20 Iteration 395/3560 Training loss: 2.3052 0.0483 sec/batch\n", + "Epoch 3/20 Iteration 396/3560 Training loss: 2.3045 0.0496 sec/batch\n", + "Epoch 3/20 Iteration 397/3560 Training loss: 2.3038 0.0524 sec/batch\n", + "Epoch 3/20 Iteration 398/3560 Training loss: 2.3032 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 399/3560 Training loss: 2.3024 
0.0487 sec/batch\n", + "Epoch 3/20 Iteration 400/3560 Training loss: 2.3016 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 401/3560 Training loss: 2.3011 0.0552 sec/batch\n", + "Epoch 3/20 Iteration 402/3560 Training loss: 2.2998 0.0499 sec/batch\n", + "Epoch 3/20 Iteration 403/3560 Training loss: 2.2996 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 404/3560 Training loss: 2.2990 0.0488 sec/batch\n", + "Epoch 3/20 Iteration 405/3560 Training loss: 2.2985 0.0505 sec/batch\n", + "Epoch 3/20 Iteration 406/3560 Training loss: 2.2986 0.0501 sec/batch\n", + "Epoch 3/20 Iteration 407/3560 Training loss: 2.2978 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 408/3560 Training loss: 2.2977 0.0515 sec/batch\n", + "Epoch 3/20 Iteration 409/3560 Training loss: 2.2971 0.0652 sec/batch\n", + "Epoch 3/20 Iteration 410/3560 Training loss: 2.2964 0.0488 sec/batch\n", + "Epoch 3/20 Iteration 411/3560 Training loss: 2.2961 0.0505 sec/batch\n", + "Epoch 3/20 Iteration 412/3560 Training loss: 2.2959 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 413/3560 Training loss: 2.2955 0.0495 sec/batch\n", + "Epoch 3/20 Iteration 414/3560 Training loss: 2.2952 0.0483 sec/batch\n", + "Epoch 3/20 Iteration 415/3560 Training loss: 2.2948 0.0494 sec/batch\n", + "Epoch 3/20 Iteration 416/3560 Training loss: 2.2947 0.0489 sec/batch\n", + "Epoch 3/20 Iteration 417/3560 Training loss: 2.2942 0.0509 sec/batch\n", + "Epoch 3/20 Iteration 418/3560 Training loss: 2.2940 0.0494 sec/batch\n", + "Epoch 3/20 Iteration 419/3560 Training loss: 2.2940 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 420/3560 Training loss: 2.2937 0.0495 sec/batch\n", + "Epoch 3/20 Iteration 421/3560 Training loss: 2.2932 0.0512 sec/batch\n", + "Epoch 3/20 Iteration 422/3560 Training loss: 2.2932 0.0569 sec/batch\n", + "Epoch 3/20 Iteration 423/3560 Training loss: 2.2929 0.0568 sec/batch\n", + "Epoch 3/20 Iteration 424/3560 Training loss: 2.2921 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 425/3560 Training loss: 2.2916 0.0508 sec/batch\n", + 
"Epoch 3/20 Iteration 426/3560 Training loss: 2.2914 0.0552 sec/batch\n", + "Epoch 3/20 Iteration 427/3560 Training loss: 2.2911 0.0500 sec/batch\n", + "Epoch 3/20 Iteration 428/3560 Training loss: 2.2909 0.0487 sec/batch\n", + "Epoch 3/20 Iteration 429/3560 Training loss: 2.2909 0.0486 sec/batch\n", + "Epoch 3/20 Iteration 430/3560 Training loss: 2.2903 0.0517 sec/batch\n", + "Epoch 3/20 Iteration 431/3560 Training loss: 2.2899 0.0483 sec/batch\n", + "Epoch 3/20 Iteration 432/3560 Training loss: 2.2900 0.0495 sec/batch\n", + "Epoch 3/20 Iteration 433/3560 Training loss: 2.2896 0.0505 sec/batch\n", + "Epoch 3/20 Iteration 434/3560 Training loss: 2.2895 0.0578 sec/batch\n", + "Epoch 3/20 Iteration 435/3560 Training loss: 2.2889 0.0484 sec/batch\n", + "Epoch 3/20 Iteration 436/3560 Training loss: 2.2885 0.0489 sec/batch\n", + "Epoch 3/20 Iteration 437/3560 Training loss: 2.2880 0.0497 sec/batch\n", + "Epoch 3/20 Iteration 438/3560 Training loss: 2.2879 0.0562 sec/batch\n", + "Epoch 3/20 Iteration 439/3560 Training loss: 2.2874 0.0487 sec/batch\n", + "Epoch 3/20 Iteration 440/3560 Training loss: 2.2869 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 441/3560 Training loss: 2.2861 0.0536 sec/batch\n", + "Epoch 3/20 Iteration 442/3560 Training loss: 2.2855 0.0565 sec/batch\n", + "Epoch 3/20 Iteration 443/3560 Training loss: 2.2852 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 444/3560 Training loss: 2.2847 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 445/3560 Training loss: 2.2843 0.0494 sec/batch\n", + "Epoch 3/20 Iteration 446/3560 Training loss: 2.2840 0.0514 sec/batch\n", + "Epoch 3/20 Iteration 447/3560 Training loss: 2.2836 0.0483 sec/batch\n", + "Epoch 3/20 Iteration 448/3560 Training loss: 2.2832 0.0503 sec/batch\n", + "Epoch 3/20 Iteration 449/3560 Training loss: 2.2827 0.0482 sec/batch\n", + "Epoch 3/20 Iteration 450/3560 Training loss: 2.2821 0.0518 sec/batch\n", + "Epoch 3/20 Iteration 451/3560 Training loss: 2.2815 0.0538 sec/batch\n", + "Epoch 3/20 Iteration 
452/3560 Training loss: 2.2810 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 453/3560 Training loss: 2.2806 0.0524 sec/batch\n", + "Epoch 3/20 Iteration 454/3560 Training loss: 2.2802 0.0496 sec/batch\n", + "Epoch 3/20 Iteration 455/3560 Training loss: 2.2798 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 456/3560 Training loss: 2.2793 0.0500 sec/batch\n", + "Epoch 3/20 Iteration 457/3560 Training loss: 2.2790 0.0497 sec/batch\n", + "Epoch 3/20 Iteration 458/3560 Training loss: 2.2787 0.0489 sec/batch\n", + "Epoch 3/20 Iteration 459/3560 Training loss: 2.2781 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 460/3560 Training loss: 2.2777 0.0534 sec/batch\n", + "Epoch 3/20 Iteration 461/3560 Training loss: 2.2772 0.0494 sec/batch\n", + "Epoch 3/20 Iteration 462/3560 Training loss: 2.2769 0.0538 sec/batch\n", + "Epoch 3/20 Iteration 463/3560 Training loss: 2.2765 0.0499 sec/batch\n", + "Epoch 3/20 Iteration 464/3560 Training loss: 2.2764 0.0490 sec/batch\n", + "Epoch 3/20 Iteration 465/3560 Training loss: 2.2763 0.0496 sec/batch\n", + "Epoch 3/20 Iteration 466/3560 Training loss: 2.2758 0.0512 sec/batch\n", + "Epoch 3/20 Iteration 467/3560 Training loss: 2.2754 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 468/3560 Training loss: 2.2752 0.0495 sec/batch\n", + "Epoch 3/20 Iteration 469/3560 Training loss: 2.2748 0.0574 sec/batch\n", + "Epoch 3/20 Iteration 470/3560 Training loss: 2.2743 0.0546 sec/batch\n", + "Epoch 3/20 Iteration 471/3560 Training loss: 2.2739 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 472/3560 Training loss: 2.2732 0.0486 sec/batch\n", + "Epoch 3/20 Iteration 473/3560 Training loss: 2.2729 0.0531 sec/batch\n", + "Epoch 3/20 Iteration 474/3560 Training loss: 2.2726 0.0515 sec/batch\n", + "Epoch 3/20 Iteration 475/3560 Training loss: 2.2725 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 476/3560 Training loss: 2.2722 0.0508 sec/batch\n", + "Epoch 3/20 Iteration 477/3560 Training loss: 2.2720 0.0510 sec/batch\n", + "Epoch 3/20 Iteration 478/3560 Training loss: 
2.2716 0.0526 sec/batch\n", + "Epoch 3/20 Iteration 479/3560 Training loss: 2.2711 0.0513 sec/batch\n", + "Epoch 3/20 Iteration 480/3560 Training loss: 2.2710 0.0545 sec/batch\n", + "Epoch 3/20 Iteration 481/3560 Training loss: 2.2706 0.0488 sec/batch\n", + "Epoch 3/20 Iteration 482/3560 Training loss: 2.2702 0.0577 sec/batch\n", + "Epoch 3/20 Iteration 483/3560 Training loss: 2.2700 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 484/3560 Training loss: 2.2698 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 485/3560 Training loss: 2.2695 0.0488 sec/batch\n", + "Epoch 3/20 Iteration 486/3560 Training loss: 2.2693 0.0562 sec/batch\n", + "Epoch 3/20 Iteration 487/3560 Training loss: 2.2689 0.0490 sec/batch\n", + "Epoch 3/20 Iteration 488/3560 Training loss: 2.2685 0.0508 sec/batch\n", + "Epoch 3/20 Iteration 489/3560 Training loss: 2.2682 0.0505 sec/batch\n", + "Epoch 3/20 Iteration 490/3560 Training loss: 2.2681 0.0506 sec/batch\n", + "Epoch 3/20 Iteration 491/3560 Training loss: 2.2678 0.0486 sec/batch\n", + "Epoch 3/20 Iteration 492/3560 Training loss: 2.2675 0.0499 sec/batch\n", + "Epoch 3/20 Iteration 493/3560 Training loss: 2.2673 0.0494 sec/batch\n", + "Epoch 3/20 Iteration 494/3560 Training loss: 2.2671 0.0508 sec/batch\n", + "Epoch 3/20 Iteration 495/3560 Training loss: 2.2671 0.0489 sec/batch\n", + "Epoch 3/20 Iteration 496/3560 Training loss: 2.2668 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 497/3560 Training loss: 2.2667 0.0484 sec/batch\n", + "Epoch 3/20 Iteration 498/3560 Training loss: 2.2664 0.0533 sec/batch\n", + "Epoch 3/20 Iteration 499/3560 Training loss: 2.2661 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 500/3560 Training loss: 2.2658 0.0510 sec/batch\n", + "Epoch 3/20 Iteration 501/3560 Training loss: 2.2655 0.0488 sec/batch\n", + "Epoch 3/20 Iteration 502/3560 Training loss: 2.2654 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 503/3560 Training loss: 2.2652 0.0534 sec/batch\n", + "Epoch 3/20 Iteration 504/3560 Training loss: 2.2651 0.0495 
sec/batch\n", + "Epoch 3/20 Iteration 505/3560 Training loss: 2.2648 0.0494 sec/batch\n", + "Epoch 3/20 Iteration 506/3560 Training loss: 2.2643 0.0490 sec/batch\n", + "Epoch 3/20 Iteration 507/3560 Training loss: 2.2642 0.0550 sec/batch\n", + "Epoch 3/20 Iteration 508/3560 Training loss: 2.2642 0.0502 sec/batch\n", + "Epoch 3/20 Iteration 509/3560 Training loss: 2.2640 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 510/3560 Training loss: 2.2639 0.0536 sec/batch\n", + "Epoch 3/20 Iteration 511/3560 Training loss: 2.2635 0.0533 sec/batch\n", + "Epoch 3/20 Iteration 512/3560 Training loss: 2.2633 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 513/3560 Training loss: 2.2630 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 514/3560 Training loss: 2.2626 0.0485 sec/batch\n", + "Epoch 3/20 Iteration 515/3560 Training loss: 2.2622 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 516/3560 Training loss: 2.2622 0.0496 sec/batch\n", + "Epoch 3/20 Iteration 517/3560 Training loss: 2.2621 0.0506 sec/batch\n", + "Epoch 3/20 Iteration 518/3560 Training loss: 2.2617 0.0488 sec/batch\n", + "Epoch 3/20 Iteration 519/3560 Training loss: 2.2614 0.0510 sec/batch\n", + "Epoch 3/20 Iteration 520/3560 Training loss: 2.2611 0.0497 sec/batch\n", + "Epoch 3/20 Iteration 521/3560 Training loss: 2.2610 0.0532 sec/batch\n", + "Epoch 3/20 Iteration 522/3560 Training loss: 2.2607 0.0490 sec/batch\n", + "Epoch 3/20 Iteration 523/3560 Training loss: 2.2605 0.0596 sec/batch\n", + "Epoch 3/20 Iteration 524/3560 Training loss: 2.2603 0.0500 sec/batch\n", + "Epoch 3/20 Iteration 525/3560 Training loss: 2.2601 0.0484 sec/batch\n", + "Epoch 3/20 Iteration 526/3560 Training loss: 2.2598 0.0495 sec/batch\n", + "Epoch 3/20 Iteration 527/3560 Training loss: 2.2595 0.0533 sec/batch\n", + "Epoch 3/20 Iteration 528/3560 Training loss: 2.2593 0.0500 sec/batch\n", + "Epoch 3/20 Iteration 529/3560 Training loss: 2.2592 0.0495 sec/batch\n", + "Epoch 3/20 Iteration 530/3560 Training loss: 2.2591 0.0516 sec/batch\n", + "Epoch 
3/20 Iteration 531/3560 Training loss: 2.2590 0.0547 sec/batch\n", + "Epoch 3/20 Iteration 532/3560 Training loss: 2.2591 0.0499 sec/batch\n", + "Epoch 3/20 Iteration 533/3560 Training loss: 2.2591 0.0489 sec/batch\n", + "Epoch 3/20 Iteration 534/3560 Training loss: 2.2592 0.0500 sec/batch\n", + "Epoch 4/20 Iteration 535/3560 Training loss: 2.2781 0.0522 sec/batch\n", + "Epoch 4/20 Iteration 536/3560 Training loss: 2.2326 0.0508 sec/batch\n", + "Epoch 4/20 Iteration 537/3560 Training loss: 2.2199 0.0503 sec/batch\n", + "Epoch 4/20 Iteration 538/3560 Training loss: 2.2145 0.0492 sec/batch\n", + "Epoch 4/20 Iteration 539/3560 Training loss: 2.2121 0.0577 sec/batch\n", + "Epoch 4/20 Iteration 540/3560 Training loss: 2.2082 0.0493 sec/batch\n", + "Epoch 4/20 Iteration 541/3560 Training loss: 2.2080 0.0544 sec/batch\n", + "Epoch 4/20 Iteration 542/3560 Training loss: 2.2080 0.0548 sec/batch\n", + "Epoch 4/20 Iteration 543/3560 Training loss: 2.2089 0.0567 sec/batch\n", + "Epoch 4/20 Iteration 544/3560 Training loss: 2.2082 0.0493 sec/batch\n", + "Epoch 4/20 Iteration 545/3560 Training loss: 2.2074 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 546/3560 Training loss: 2.2060 0.0494 sec/batch\n", + "Epoch 4/20 Iteration 547/3560 Training loss: 2.2058 0.0562 sec/batch\n", + "Epoch 4/20 Iteration 548/3560 Training loss: 2.2074 0.0543 sec/batch\n", + "Epoch 4/20 Iteration 549/3560 Training loss: 2.2073 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 550/3560 Training loss: 2.2064 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 551/3560 Training loss: 2.2064 0.0500 sec/batch\n", + "Epoch 4/20 Iteration 552/3560 Training loss: 2.2082 0.0487 sec/batch\n", + "Epoch 4/20 Iteration 553/3560 Training loss: 2.2079 0.0486 sec/batch\n", + "Epoch 4/20 Iteration 554/3560 Training loss: 2.2075 0.0502 sec/batch\n", + "Epoch 4/20 Iteration 555/3560 Training loss: 2.2068 0.0532 sec/batch\n", + "Epoch 4/20 Iteration 556/3560 Training loss: 2.2082 0.0489 sec/batch\n", + "Epoch 4/20 Iteration 557/3560 
Training loss: 2.2080 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 558/3560 Training loss: 2.2072 0.0493 sec/batch\n", + "Epoch 4/20 Iteration 559/3560 Training loss: 2.2063 0.0503 sec/batch\n", + "Epoch 4/20 Iteration 560/3560 Training loss: 2.2054 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 561/3560 Training loss: 2.2047 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 562/3560 Training loss: 2.2045 0.0543 sec/batch\n", + "Epoch 4/20 Iteration 563/3560 Training loss: 2.2053 0.0517 sec/batch\n", + "Epoch 4/20 Iteration 564/3560 Training loss: 2.2048 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 565/3560 Training loss: 2.2046 0.0503 sec/batch\n", + "Epoch 4/20 Iteration 566/3560 Training loss: 2.2041 0.0489 sec/batch\n", + "Epoch 4/20 Iteration 567/3560 Training loss: 2.2035 0.0558 sec/batch\n", + "Epoch 4/20 Iteration 568/3560 Training loss: 2.2039 0.0541 sec/batch\n", + "Epoch 4/20 Iteration 569/3560 Training loss: 2.2037 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 570/3560 Training loss: 2.2034 0.0499 sec/batch\n", + "Epoch 4/20 Iteration 571/3560 Training loss: 2.2031 0.0537 sec/batch\n", + "Epoch 4/20 Iteration 572/3560 Training loss: 2.2018 0.0493 sec/batch\n", + "Epoch 4/20 Iteration 573/3560 Training loss: 2.2009 0.0494 sec/batch\n", + "Epoch 4/20 Iteration 574/3560 Training loss: 2.2001 0.0505 sec/batch\n", + "Epoch 4/20 Iteration 575/3560 Training loss: 2.1995 0.0520 sec/batch\n", + "Epoch 4/20 Iteration 576/3560 Training loss: 2.1991 0.0605 sec/batch\n", + "Epoch 4/20 Iteration 577/3560 Training loss: 2.1981 0.0498 sec/batch\n", + "Epoch 4/20 Iteration 578/3560 Training loss: 2.1972 0.0563 sec/batch\n", + "Epoch 4/20 Iteration 579/3560 Training loss: 2.1971 0.0534 sec/batch\n", + "Epoch 4/20 Iteration 580/3560 Training loss: 2.1960 0.0537 sec/batch\n", + "Epoch 4/20 Iteration 581/3560 Training loss: 2.1958 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 582/3560 Training loss: 2.1952 0.0489 sec/batch\n", + "Epoch 4/20 Iteration 583/3560 Training loss: 2.1949 
0.0558 sec/batch\n", + "Epoch 4/20 Iteration 584/3560 Training loss: 2.1951 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 712/3560 Training loss: 2.1634 0.0538 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 2.1803 0.0490 sec/batch\n", + "Epoch 5/20 Iteration 890/3560 Training loss: 2.0840 0.0503 sec/batch\n", + "Epoch 6/20 Iteration 891/3560 Training loss: 2.1134 0.0553 sec/batch\n", + "Epoch 6/20 Iteration 1068/3560 Training loss: 2.0188 0.0497 sec/batch\n", + "Epoch 7/20 Iteration 1069/3560 Training loss: 2.0628 0.0556 sec/batch\n", + "Epoch 7/20 Iteration 1081/3560 Training loss: 1.9911 0.0575 sec/batch\n", 
+ "Epoch 7/20 Iteration 1082/3560 Training loss: 1.9934 0.0498 sec/batch\n", + "Epoch 7/20 Iteration 1083/3560 Training loss: 1.9927 0.0504 sec/batch\n", + "Epoch 7/20 Iteration 1084/3560 Training loss: 1.9910 0.0495 sec/batch\n", + "Epoch 7/20 Iteration 1085/3560 Training loss: 1.9906 0.0534 sec/batch\n", + "Epoch 7/20 Iteration 1086/3560 Training loss: 1.9933 0.0506 sec/batch\n", + "Epoch 7/20 Iteration 1087/3560 Training loss: 1.9935 0.0500 sec/batch\n", + "Epoch 7/20 Iteration 1088/3560 Training loss: 1.9937 0.0494 sec/batch\n", + "Epoch 7/20 Iteration 1089/3560 Training loss: 1.9933 0.0679 sec/batch\n", + "Epoch 7/20 Iteration 1090/3560 Training loss: 1.9947 0.0529 sec/batch\n", + "Epoch 7/20 Iteration 1091/3560 Training loss: 1.9937 0.0545 sec/batch\n", + "Epoch 7/20 Iteration 1092/3560 Training loss: 1.9925 0.0515 sec/batch\n", + "Epoch 7/20 Iteration 1093/3560 Training loss: 1.9923 0.0548 sec/batch\n", + "Epoch 7/20 Iteration 1094/3560 Training loss: 1.9912 0.0497 sec/batch\n", + "Epoch 7/20 Iteration 1095/3560 Training loss: 1.9902 0.0503 sec/batch\n", + "Epoch 7/20 Iteration 1096/3560 Training loss: 1.9902 0.0496 sec/batch\n", + "Epoch 7/20 Iteration 1097/3560 Training loss: 1.9912 0.0524 sec/batch\n", + "Epoch 7/20 Iteration 1098/3560 Training loss: 1.9915 0.0562 sec/batch\n", + "Epoch 7/20 Iteration 1099/3560 Training loss: 1.9912 0.0541 sec/batch\n", + "Epoch 7/20 Iteration 1100/3560 Training loss: 1.9904 0.0517 sec/batch\n", + "Epoch 7/20 Iteration 1101/3560 Training loss: 1.9902 0.0530 sec/batch\n", + "Epoch 7/20 Iteration 1102/3560 Training loss: 1.9910 0.0490 sec/batch\n", + "Epoch 7/20 Iteration 1103/3560 Training loss: 1.9905 0.0498 sec/batch\n", + "Epoch 7/20 Iteration 1104/3560 Training loss: 1.9903 0.0556 sec/batch\n", + "Epoch 7/20 Iteration 1105/3560 Training loss: 1.9899 0.0550 sec/batch\n", + "Epoch 7/20 Iteration 1106/3560 Training loss: 1.9889 0.0506 sec/batch\n", + "Epoch 7/20 Iteration 1107/3560 Training loss: 1.9877 0.0553 
sec/batch\n", + "Epoch 7/20 Iteration 1108/3560 Training loss: 1.9872 0.0537 sec/batch\n", + "Epoch 7/20 Iteration 1109/3560 Training loss: 1.9866 0.0598 sec/batch\n", + "Epoch 7/20 Iteration 1110/3560 Training loss: 1.9866 0.0502 sec/batch\n", + "Epoch 7/20 Iteration 1111/3560 Training loss: 1.9860 0.0605 sec/batch\n", + "Epoch 7/20 Iteration 1112/3560 Training loss: 1.9854 0.0512 sec/batch\n", + "Epoch 7/20 Iteration 1113/3560 Training loss: 1.9854 0.0554 sec/batch\n", + "Epoch 7/20 Iteration 1114/3560 Training loss: 1.9843 0.0499 sec/batch\n", + "Epoch 7/20 Iteration 1115/3560 Training loss: 1.9843 0.0556 sec/batch\n", + "Epoch 7/20 Iteration 1116/3560 Training loss: 1.9836 0.0605 sec/batch\n", + "Epoch 7/20 Iteration 1117/3560 Training loss: 1.9833 0.0565 sec/batch\n", + "Epoch 7/20 Iteration 1118/3560 Training loss: 1.9837 0.0524 sec/batch\n", + "Epoch 7/20 Iteration 1119/3560 Training loss: 1.9831 0.0503 sec/batch\n", + "Epoch 7/20 Iteration 1120/3560 Training loss: 1.9838 0.0503 sec/batch\n", + "Epoch 7/20 Iteration 1121/3560 Training loss: 1.9837 0.0565 sec/batch\n", + "Epoch 7/20 Iteration 1122/3560 Training loss: 1.9834 0.0556 sec/batch\n", + "Epoch 7/20 Iteration 1123/3560 Training loss: 1.9832 0.0570 sec/batch\n", + "Epoch 7/20 Iteration 1124/3560 Training loss: 1.9835 0.0507 sec/batch\n", + "Epoch 7/20 Iteration 1125/3560 Training loss: 1.9836 0.0517 sec/batch\n", + "Epoch 7/20 Iteration 1126/3560 Training loss: 1.9831 0.0496 sec/batch\n", + "Epoch 7/20 Iteration 1127/3560 Training loss: 1.9828 0.0504 sec/batch\n", + "Epoch 7/20 Iteration 1128/3560 Training loss: 1.9831 0.0570 sec/batch\n", + "Epoch 7/20 Iteration 1129/3560 Training loss: 1.9828 0.0536 sec/batch\n", + "Epoch 7/20 Iteration 1130/3560 Training loss: 1.9834 0.0503 sec/batch\n", + "Epoch 7/20 Iteration 1131/3560 Training loss: 1.9837 0.0497 sec/batch\n", + "Epoch 7/20 Iteration 1132/3560 Training loss: 1.9839 0.0541 sec/batch\n", + "Epoch 7/20 Iteration 1133/3560 Training loss: 1.9836 
0.0581 sec/batch\n", + "Epoch 7/20 Iteration 1134/3560 Training loss: 1.9838 0.0504 sec/batch\n", + "Epoch 7/20 Iteration 1135/3560 Training loss: 1.9840 0.0510 sec/batch\n", + "Epoch 7/20 Iteration 1136/3560 Training loss: 1.9835 0.0562 sec/batch\n", + "Epoch 7/20 Iteration 1137/3560 Training loss: 1.9834 0.0527 sec/batch\n", + "Epoch 7/20 Iteration 1138/3560 Training loss: 1.9834 0.0531 sec/batch\n", + "Epoch 7/20 Iteration 1139/3560 Training loss: 1.9837 0.0505 sec/batch\n", + "Epoch 7/20 Iteration 1140/3560 Training loss: 1.9838 0.0501 sec/batch\n", + "Epoch 7/20 Iteration 1141/3560 Training loss: 1.9840 0.0574 sec/batch\n", + "Epoch 7/20 Iteration 1142/3560 Training loss: 1.9836 0.0497 sec/batch\n", + "Epoch 7/20 Iteration 1143/3560 Training loss: 1.9835 0.0546 sec/batch\n", + "Epoch 7/20 Iteration 1144/3560 Training loss: 1.9836 0.0496 sec/batch\n", + "Epoch 7/20 Iteration 1145/3560 Training loss: 1.9834 0.0616 sec/batch\n", + "Epoch 7/20 Iteration 1146/3560 Training loss: 1.9835 0.0507 sec/batch\n", + "Epoch 7/20 Iteration 1147/3560 Training loss: 1.9830 0.0502 sec/batch\n", + "Epoch 7/20 Iteration 1148/3560 Training loss: 1.9827 0.0501 sec/batch\n", + "Epoch 7/20 Iteration 1149/3560 Training loss: 1.9822 0.0506 sec/batch\n", + "Epoch 7/20 Iteration 1150/3560 Training loss: 1.9823 0.0506 sec/batch\n", + "Epoch 7/20 Iteration 1151/3560 Training loss: 1.9818 0.0527 sec/batch\n", + "Epoch 7/20 Iteration 1152/3560 Training loss: 1.9816 0.0509 sec/batch\n", + "Epoch 7/20 Iteration 1153/3560 Training loss: 1.9810 0.0517 sec/batch\n", + "Epoch 7/20 Iteration 1154/3560 Training loss: 1.9806 0.0503 sec/batch\n", + "Epoch 7/20 Iteration 1155/3560 Training loss: 1.9804 0.0502 sec/batch\n", + "Epoch 7/20 Iteration 1156/3560 Training loss: 1.9799 0.0495 sec/batch\n", + "Epoch 7/20 Iteration 1157/3560 Training loss: 1.9794 0.0547 sec/batch\n", + "Epoch 7/20 Iteration 1158/3560 Training loss: 1.9794 0.0497 sec/batch\n", + "Epoch 7/20 Iteration 1159/3560 Training loss: 
1.9790 0.0495 sec/batch\n", + "Epoch 7/20 Iteration 1160/3560 Training loss: 1.9787 0.0508 sec/batch\n", + "Epoch 7/20 Iteration 1161/3560 Training loss: 1.9781 0.0573 sec/batch\n", + "Epoch 7/20 Iteration 1162/3560 Training loss: 1.9777 0.0505 sec/batch\n", + "Epoch 7/20 Iteration 1163/3560 Training loss: 1.9773 0.0496 sec/batch\n", + "Epoch 7/20 Iteration 1164/3560 Training loss: 1.9773 0.0495 sec/batch\n", + "Epoch 7/20 Iteration 1165/3560 Training loss: 1.9770 0.0553 sec/batch\n", + "Epoch 7/20 Iteration 1166/3560 Training loss: 1.9766 0.0507 sec/batch\n", + "Epoch 7/20 Iteration 1167/3560 Training loss: 1.9761 0.0512 sec/batch\n", + "Epoch 7/20 Iteration 1168/3560 Training loss: 1.9757 0.0499 sec/batch\n", + "Epoch 7/20 Iteration 1169/3560 Training loss: 1.9757 0.0520 sec/batch\n", + "Epoch 7/20 Iteration 1170/3560 Training loss: 1.9755 0.0497 sec/batch\n", + "Epoch 7/20 Iteration 1171/3560 Training loss: 1.9752 0.0501 sec/batch\n", + "Epoch 7/20 Iteration 1172/3560 Training loss: 1.9750 0.0515 sec/batch\n", + "Epoch 7/20 Iteration 1173/3560 Training loss: 1.9747 0.0507 sec/batch\n", + "Epoch 7/20 Iteration 1174/3560 Training loss: 1.9745 0.0501 sec/batch\n", + "Epoch 7/20 Iteration 1175/3560 Training loss: 1.9743 0.0502 sec/batch\n", + "Epoch 7/20 Iteration 1176/3560 Training loss: 1.9743 0.0513 sec/batch\n", + "Epoch 7/20 Iteration 1177/3560 Training loss: 1.9742 0.0537 sec/batch\n", + "Epoch 7/20 Iteration 1178/3560 Training loss: 1.9741 0.0495 sec/batch\n", + "Epoch 7/20 Iteration 1179/3560 Training loss: 1.9739 0.0524 sec/batch\n", + "Epoch 7/20 Iteration 1180/3560 Training loss: 1.9737 0.0503 sec/batch\n", + "Epoch 7/20 Iteration 1181/3560 Training loss: 1.9735 0.0537 sec/batch\n", + "Epoch 7/20 Iteration 1182/3560 Training loss: 1.9733 0.0510 sec/batch\n", + "Epoch 7/20 Iteration 1183/3560 Training loss: 1.9730 0.0518 sec/batch\n", + "Epoch 7/20 Iteration 1184/3560 Training loss: 1.9727 0.0504 sec/batch\n", + "Epoch 7/20 Iteration 1185/3560 Training 
loss: 1.9726 0.0590 sec/batch\n", + "Epoch 7/20 Iteration 1186/3560 Training loss: 1.9725 0.0496 sec/batch\n", + "Epoch 7/20 Iteration 1187/3560 Training loss: 1.9724 0.0506 sec/batch\n", + "Epoch 7/20 Iteration 1188/3560 Training loss: 1.9724 0.0493 sec/batch\n", + "Epoch 7/20 Iteration 1189/3560 Training loss: 1.9724 0.0587 sec/batch\n", + "Epoch 7/20 Iteration 1190/3560 Training loss: 1.9721 0.0498 sec/batch\n", + "Epoch 7/20 Iteration 1191/3560 Training loss: 1.9718 0.0551 sec/batch\n", + "Epoch 7/20 Iteration 1192/3560 Training loss: 1.9718 0.0502 sec/batch\n", + "Epoch 7/20 Iteration 1193/3560 Training loss: 1.9717 0.0547 sec/batch\n", + "Epoch 7/20 Iteration 1194/3560 Training loss: 1.9713 0.0507 sec/batch\n", + "Epoch 7/20 Iteration 1195/3560 Training loss: 1.9713 0.0502 sec/batch\n", + "Epoch 7/20 Iteration 1196/3560 Training loss: 1.9713 0.0559 sec/batch\n", + "Epoch 7/20 Iteration 1197/3560 Training loss: 1.9712 0.0544 sec/batch\n", + "Epoch 7/20 Iteration 1198/3560 Training loss: 1.9711 0.0499 sec/batch\n", + "Epoch 7/20 Iteration 1199/3560 Training loss: 1.9709 0.0498 sec/batch\n", + "Epoch 7/20 Iteration 1200/3560 Training loss: 1.9705 0.0499 sec/batch\n", + "Epoch 7/20 Iteration 1201/3560 Training loss: 1.9705 0.0553 sec/batch\n", + "Epoch 7/20 Iteration 1202/3560 Training loss: 1.9704 0.0496 sec/batch\n", + "Epoch 7/20 Iteration 1203/3560 Training loss: 1.9703 0.0566 sec/batch\n", + "Epoch 7/20 Iteration 1204/3560 Training loss: 1.9703 0.0569 sec/batch\n", + "Epoch 7/20 Iteration 1205/3560 Training loss: 1.9703 0.0536 sec/batch\n", + "Epoch 7/20 Iteration 1206/3560 Training loss: 1.9703 0.0497 sec/batch\n", + "Epoch 7/20 Iteration 1207/3560 Training loss: 1.9704 0.0520 sec/batch\n", + "Epoch 7/20 Iteration 1208/3560 Training loss: 1.9702 0.0495 sec/batch\n", + "Epoch 7/20 Iteration 1209/3560 Training loss: 1.9703 0.0538 sec/batch\n", + "Epoch 7/20 Iteration 1210/3560 Training loss: 1.9701 0.0514 sec/batch\n", + "Epoch 7/20 Iteration 1211/3560 
Training loss: 1.9700 0.0490 sec/batch\n", + "Epoch 7/20 Iteration 1212/3560 Training loss: 1.9699 0.0496 sec/batch\n", + "Epoch 7/20 Iteration 1213/3560 Training loss: 1.9697 0.0562 sec/batch\n", + "Epoch 7/20 Iteration 1214/3560 Training loss: 1.9698 0.0500 sec/batch\n", + "Epoch 7/20 Iteration 1215/3560 Training loss: 1.9697 0.0491 sec/batch\n", + "Epoch 7/20 Iteration 1216/3560 Training loss: 1.9698 0.0500 sec/batch\n", + "Epoch 7/20 Iteration 1217/3560 Training loss: 1.9697 0.0534 sec/batch\n", + "Epoch 7/20 Iteration 1218/3560 Training loss: 1.9695 0.0513 sec/batch\n", + "Epoch 7/20 Iteration 1219/3560 Training loss: 1.9692 0.0555 sec/batch\n", + "Epoch 7/20 Iteration 1220/3560 Training loss: 1.9693 0.0503 sec/batch\n", + "Epoch 7/20 Iteration 1221/3560 Training loss: 1.9693 0.0541 sec/batch\n", + "Epoch 7/20 Iteration 1222/3560 Training loss: 1.9693 0.0507 sec/batch\n", + "Epoch 7/20 Iteration 1223/3560 Training loss: 1.9692 0.0498 sec/batch\n", + "Epoch 7/20 Iteration 1224/3560 Training loss: 1.9691 0.0503 sec/batch\n", + "Epoch 7/20 Iteration 1225/3560 Training loss: 1.9691 0.0537 sec/batch\n", + "Epoch 7/20 Iteration 1226/3560 Training loss: 1.9689 0.0502 sec/batch\n", + "Epoch 7/20 Iteration 1227/3560 Training loss: 1.9686 0.0525 sec/batch\n", + "Epoch 7/20 Iteration 1228/3560 Training loss: 1.9687 0.0573 sec/batch\n", + "Epoch 7/20 Iteration 1229/3560 Training loss: 1.9687 0.0526 sec/batch\n", + "Epoch 7/20 Iteration 1230/3560 Training loss: 1.9686 0.0552 sec/batch\n", + "Epoch 7/20 Iteration 1231/3560 Training loss: 1.9686 0.0505 sec/batch\n", + "Epoch 7/20 Iteration 1232/3560 Training loss: 1.9686 0.0504 sec/batch\n", + "Epoch 7/20 Iteration 1233/3560 Training loss: 1.9685 0.0530 sec/batch\n", + "Epoch 7/20 Iteration 1234/3560 Training loss: 1.9683 0.0499 sec/batch\n", + "Epoch 7/20 Iteration 1235/3560 Training loss: 1.9683 0.0500 sec/batch\n", + "Epoch 7/20 Iteration 1236/3560 Training loss: 1.9686 0.0489 sec/batch\n", + "Epoch 7/20 Iteration 
1237/3560 Training loss: 1.9685 0.0505 sec/batch\n", + "Epoch 7/20 Iteration 1238/3560 Training loss: 1.9683 0.0500 sec/batch\n", + "Epoch 7/20 Iteration 1239/3560 Training loss: 1.9682 0.0502 sec/batch\n", + "Epoch 7/20 Iteration 1240/3560 Training loss: 1.9679 0.0506 sec/batch\n", + "Epoch 7/20 Iteration 1241/3560 Training loss: 1.9679 0.0575 sec/batch\n", + "Epoch 7/20 Iteration 1242/3560 Training loss: 1.9679 0.0508 sec/batch\n", + "Epoch 7/20 Iteration 1243/3560 Training loss: 1.9679 0.0499 sec/batch\n", + "Epoch 7/20 Iteration 1244/3560 Training loss: 1.9678 0.0505 sec/batch\n", + "Epoch 7/20 Iteration 1245/3560 Training loss: 1.9676 0.0513 sec/batch\n", + "Epoch 7/20 Iteration 1246/3560 Training loss: 1.9675 0.0542 sec/batch\n", + "Epoch 8/20 Iteration 1247/3560 Training loss: 2.0081 0.0507 sec/batch\n", + "Epoch 8/20 Iteration 1248/3560 Training loss: 1.9702 0.0656 sec/batch\n", + "Epoch 8/20 Iteration 1249/3560 Training loss: 1.9630 0.0579 sec/batch\n", + "Epoch 8/20 Iteration 1250/3560 Training loss: 1.9546 0.0505 sec/batch\n", + "Epoch 8/20 Iteration 1251/3560 Training loss: 1.9532 0.0503 sec/batch\n", + "Epoch 8/20 Iteration 1252/3560 Training loss: 1.9467 0.0504 sec/batch\n", + "Epoch 8/20 Iteration 1253/3560 Training loss: 1.9465 0.0527 sec/batch\n", + "Epoch 8/20 Iteration 1254/3560 Training loss: 1.9459 0.0521 sec/batch\n", + "Epoch 8/20 Iteration 1255/3560 Training loss: 1.9476 0.0498 sec/batch\n", + "Epoch 8/20 Iteration 1256/3560 Training loss: 1.9484 0.0498 sec/batch\n", + "Epoch 8/20 Iteration 1257/3560 Training loss: 1.9454 0.0517 sec/batch\n", + "Epoch 8/20 Iteration 1258/3560 Training loss: 1.9441 0.0501 sec/batch\n", + "Epoch 8/20 Iteration 1259/3560 Training loss: 1.9438 0.0496 sec/batch\n", + "Epoch 8/20 Iteration 1260/3560 Training loss: 1.9461 0.0506 sec/batch\n", + "Epoch 8/20 Iteration 1261/3560 Training loss: 1.9451 0.0543 sec/batch\n", + "Epoch 8/20 Iteration 1262/3560 Training loss: 1.9436 0.0524 sec/batch\n", + "Epoch 8/20 
Iteration 1263/3560 Training loss: 1.9437 0.0509 sec/batch\n", + "Epoch 8/20 Iteration 1264/3560 Training loss: 1.9458 0.0508 sec/batch\n", + "Epoch 8/20 Iteration 1265/3560 Training loss: 1.9455 0.0569 sec/batch\n", + "Epoch 8/20 Iteration 1266/3560 Training loss: 1.9456 0.0510 sec/batch\n", + "Epoch 8/20 Iteration 1267/3560 Training loss: 1.9455 0.0501 sec/batch\n", + "Epoch 8/20 Iteration 1268/3560 Training loss: 1.9465 0.0501 sec/batch\n", + "Epoch 8/20 Iteration 1269/3560 Training loss: 1.9459 0.0533 sec/batch\n", + "Epoch 8/20 Iteration 1270/3560 Training loss: 1.9449 0.0525 sec/batch\n", + "Epoch 8/20 Iteration 1271/3560 Training loss: 1.9443 0.0500 sec/batch\n", + "Epoch 8/20 Iteration 1272/3560 Training loss: 1.9440 0.0517 sec/batch\n", + "Epoch 8/20 Iteration 1273/3560 Training loss: 1.9431 0.0522 sec/batch\n", + "Epoch 8/20 Iteration 1274/3560 Training loss: 1.9432 0.0500 sec/batch\n", + "Epoch 8/20 Iteration 1275/3560 Training loss: 1.9444 0.0502 sec/batch\n", + "Epoch 8/20 Iteration 1276/3560 Training loss: 1.9449 0.0499 sec/batch\n", + "Epoch 8/20 Iteration 1277/3560 Training loss: 1.9447 0.0534 sec/batch\n", + "Epoch 8/20 Iteration 1278/3560 Training loss: 1.9436 0.0544 sec/batch\n", + "Epoch 8/20 Iteration 1279/3560 Training loss: 1.9437 0.0506 sec/batch\n", + "Epoch 8/20 Iteration 1280/3560 Training loss: 1.9444 0.0564 sec/batch\n", + "Epoch 8/20 Iteration 1281/3560 Training loss: 1.9443 0.0554 sec/batch\n", + "Epoch 8/20 Iteration 1282/3560 Training loss: 1.9442 0.0501 sec/batch\n", + "Epoch 8/20 Iteration 1283/3560 Training loss: 1.9438 0.0495 sec/batch\n", + "Epoch 8/20 Iteration 1284/3560 Training loss: 1.9429 0.0500 sec/batch\n", + "Epoch 8/20 Iteration 1285/3560 Training loss: 1.9418 0.0506 sec/batch\n", + "Epoch 8/20 Iteration 1286/3560 Training loss: 1.9413 0.0552 sec/batch\n", + "Epoch 8/20 Iteration 1287/3560 Training loss: 1.9406 0.0500 sec/batch\n", + "Epoch 8/20 Iteration 1288/3560 Training loss: 1.9409 0.0499 sec/batch\n", + "Epoch 
8/20 Iteration 1289/3560 Training loss: 1.9406 0.0618 sec/batch\n", + "Epoch 8/20 Iteration 1290/3560 Training loss: 1.9399 0.0493 sec/batch\n", + "Epoch 8/20 Iteration 1291/3560 Training loss: 1.9400 0.0506 sec/batch\n", + "Epoch 8/20 Iteration 1292/3560 Training loss: 1.9386 0.0520 sec/batch\n", + "Epoch 8/20 Iteration 1293/3560 Training loss: 1.9384 0.0518 sec/batch\n", + "Epoch 8/20 Iteration 1294/3560 Training loss: 1.9379 0.0508 sec/batch\n", + "Epoch 8/20 Iteration 1295/3560 Training loss: 1.9380 0.0502 sec/batch\n", + "Epoch 8/20 Iteration 1296/3560 Training loss: 1.9384 0.0507 sec/batch\n", + "Epoch 8/20 Iteration 1297/3560 Training loss: 1.9378 0.0549 sec/batch\n", + "Epoch 8/20 Iteration 1298/3560 Training loss: 1.9384 0.0498 sec/batch\n", + "Epoch 8/20 Iteration 1299/3560 Training loss: 1.9381 0.0508 sec/batch\n", + "Epoch 8/20 Iteration 1300/3560 Training loss: 1.9380 0.0537 sec/batch\n", + "Epoch 8/20 Iteration 1301/3560 Training loss: 1.9378 0.0651 sec/batch\n", + "Epoch 8/20 Iteration 1302/3560 Training loss: 1.9381 0.0495 sec/batch\n", + "Epoch 8/20 Iteration 1303/3560 Training loss: 1.9382 0.0506 sec/batch\n", + "Epoch 8/20 Iteration 1304/3560 Training loss: 1.9379 0.0525 sec/batch\n", + "Epoch 8/20 Iteration 1305/3560 Training loss: 1.9376 0.0537 sec/batch\n", + "Epoch 8/20 Iteration 1306/3560 Training loss: 1.9380 0.0499 sec/batch\n", + "Epoch 8/20 Iteration 1307/3560 Training loss: 1.9377 0.0500 sec/batch\n", + "Epoch 8/20 Iteration 1308/3560 Training loss: 1.9382 0.0502 sec/batch\n", + "Epoch 8/20 Iteration 1309/3560 Training loss: 1.9385 0.0509 sec/batch\n", + "Epoch 8/20 Iteration 1310/3560 Training loss: 1.9387 0.0494 sec/batch\n", + "Epoch 8/20 Iteration 1311/3560 Training loss: 1.9386 0.0546 sec/batch\n", + "Epoch 8/20 Iteration 1312/3560 Training loss: 1.9389 0.0559 sec/batch\n", + "Epoch 8/20 Iteration 1313/3560 Training loss: 1.9390 0.0600 sec/batch\n", + "Epoch 8/20 Iteration 1314/3560 Training loss: 1.9385 0.0541 sec/batch\n", + 
"Epoch 8/20 Iteration 1315/3560 Training loss: 1.9384 0.0536 sec/batch\n", + "Epoch 8/20 Iteration 1316/3560 Training loss: 1.9384 0.0503 sec/batch\n", + "Epoch 8/20 Iteration 1317/3560 Training loss: 1.9389 0.0559 sec/batch\n", + "Epoch 8/20 Iteration 1318/3560 Training loss: 1.9389 0.0505 sec/batch\n", + "Epoch 8/20 Iteration 1319/3560 Training loss: 1.9390 0.0573 sec/batch\n", + "Epoch 8/20 Iteration 1320/3560 Training loss: 1.9389 0.0506 sec/batch\n", + "Epoch 8/20 Iteration 1321/3560 Training loss: 1.9387 0.0511 sec/batch\n", + "Epoch 8/20 Iteration 1322/3560 Training loss: 1.9390 0.0502 sec/batch\n", + "Epoch 8/20 Iteration 1323/3560 Training loss: 1.9387 0.0507 sec/batch\n", + "Epoch 8/20 Iteration 1324/3560 Training loss: 1.9389 0.0499 sec/batch\n", + "Epoch 8/20 Iteration 1325/3560 Training loss: 1.9383 0.0555 sec/batch\n", + "Epoch 8/20 Iteration 1326/3560 Training loss: 1.9382 0.0498 sec/batch\n", + "Epoch 8/20 Iteration 1327/3560 Training loss: 1.9376 0.0498 sec/batch\n", + "Epoch 8/20 Iteration 1328/3560 Training loss: 1.9377 0.0524 sec/batch\n", + "Epoch 8/20 Iteration 1329/3560 Training loss: 1.9373 0.0519 sec/batch\n", + "Epoch 8/20 Iteration 1330/3560 Training loss: 1.9371 0.0517 sec/batch\n", + "Epoch 8/20 Iteration 1331/3560 Training loss: 1.9365 0.0505 sec/batch\n", + "Epoch 8/20 Iteration 1332/3560 Training loss: 1.9361 0.0513 sec/batch\n", + "Epoch 8/20 Iteration 1333/3560 Training loss: 1.9359 0.0492 sec/batch\n", + "Epoch 8/20 Iteration 1334/3560 Training loss: 1.9355 0.0555 sec/batch\n", + "Epoch 8/20 Iteration 1335/3560 Training loss: 1.9350 0.0498 sec/batch\n", + "Epoch 8/20 Iteration 1336/3560 Training loss: 1.9350 0.0497 sec/batch\n", + "Epoch 8/20 Iteration 1337/3560 Training loss: 1.9346 0.0546 sec/batch\n", + "Epoch 8/20 Iteration 1338/3560 Training loss: 1.9343 0.0503 sec/batch\n", + "Epoch 8/20 Iteration 1339/3560 Training loss: 1.9338 0.0544 sec/batch\n", + "Epoch 8/20 Iteration 1340/3560 Training loss: 1.9334 0.0526 sec/batch\n", 
+ "Epoch 8/20 Iteration 1341/3560 Training loss: 1.9330 0.0547 sec/batch\n", + "Epoch 8/20 Iteration 1342/3560 Training loss: 1.9329 0.0547 sec/batch\n", + "Epoch 8/20 Iteration 1343/3560 Training loss: 1.9328 0.0504 sec/batch\n", + "Epoch 8/20 Iteration 1344/3560 Training loss: 1.9323 0.0509 sec/batch\n", + "Epoch 8/20 Iteration 1345/3560 Training loss: 1.9320 0.0619 sec/batch\n", + "Epoch 8/20 Iteration 1346/3560 Training loss: 1.9315 0.0512 sec/batch\n", + "Epoch 8/20 Iteration 1347/3560 Training loss: 1.9315 0.0507 sec/batch\n", + "Epoch 8/20 Iteration 1348/3560 Training loss: 1.9313 0.0501 sec/batch\n", + "Epoch 8/20 Iteration 1349/3560 Training loss: 1.9311 0.0583 sec/batch\n", + "Epoch 8/20 Iteration 1350/3560 Training loss: 1.9308 0.0499 sec/batch\n", + "Epoch 8/20 Iteration 1351/3560 Training loss: 1.9305 0.0504 sec/batch\n", + "Epoch 8/20 Iteration 1352/3560 Training loss: 1.9304 0.0500 sec/batch\n", + "Epoch 8/20 Iteration 1353/3560 Training loss: 1.9302 0.0559 sec/batch\n", + "Epoch 8/20 Iteration 1354/3560 Training loss: 1.9302 0.0502 sec/batch\n", + "Epoch 8/20 Iteration 1355/3560 Training loss: 1.9301 0.0507 sec/batch\n", + "Epoch 8/20 Iteration 1356/3560 Training loss: 1.9300 0.0560 sec/batch\n", + "Epoch 8/20 Iteration 1357/3560 Training loss: 1.9299 0.0524 sec/batch\n", + "Epoch 8/20 Iteration 1358/3560 Training loss: 1.9296 0.0504 sec/batch\n", + "Epoch 8/20 Iteration 1359/3560 Training loss: 1.9294 0.0514 sec/batch\n", + "Epoch 8/20 Iteration 1360/3560 Training loss: 1.9292 0.0502 sec/batch\n", + "Epoch 8/20 Iteration 1361/3560 Training loss: 1.9289 0.0534 sec/batch\n", + "Epoch 8/20 Iteration 1362/3560 Training loss: 1.9285 0.0513 sec/batch\n", + "Epoch 8/20 Iteration 1363/3560 Training loss: 1.9284 0.0502 sec/batch\n", + "Epoch 8/20 Iteration 1364/3560 Training loss: 1.9283 0.0526 sec/batch\n", + "Epoch 8/20 Iteration 1365/3560 Training loss: 1.9283 0.0565 sec/batch\n", + "Epoch 8/20 Iteration 1366/3560 Training loss: 1.9282 0.0513 
sec/batch\n", + "Epoch 8/20 Iteration 1367/3560 Training loss: 1.9281 0.0497 sec/batch\n", + "Epoch 8/20 Iteration 1368/3560 Training loss: 1.9278 0.0499 sec/batch\n", + "Epoch 8/20 Iteration 1369/3560 Training loss: 1.9275 0.0585 sec/batch\n", + "Epoch 8/20 Iteration 1370/3560 Training loss: 1.9277 0.0511 sec/batch\n", + "Epoch 8/20 Iteration 1371/3560 Training loss: 1.9275 0.0530 sec/batch\n", + "Epoch 8/20 Iteration 1372/3560 Training loss: 1.9270 0.0504 sec/batch\n", + "Epoch 8/20 Iteration 1373/3560 Training loss: 1.9272 0.0561 sec/batch\n", + "Epoch 8/20 Iteration 1374/3560 Training loss: 1.9272 0.0538 sec/batch\n", + "Epoch 8/20 Iteration 1375/3560 Training loss: 1.9271 0.0510 sec/batch\n", + "Epoch 8/20 Iteration 1376/3560 Training loss: 1.9270 0.0512 sec/batch\n", + "Epoch 8/20 Iteration 1377/3560 Training loss: 1.9267 0.0536 sec/batch\n", + "Epoch 8/20 Iteration 1378/3560 Training loss: 1.9263 0.0502 sec/batch\n", + "Epoch 8/20 Iteration 1379/3560 Training loss: 1.9263 0.0559 sec/batch\n", + "Epoch 8/20 Iteration 1380/3560 Training loss: 1.9262 0.0492 sec/batch\n", + "Epoch 8/20 Iteration 1381/3560 Training loss: 1.9262 0.0529 sec/batch\n", + "Epoch 8/20 Iteration 1382/3560 Training loss: 1.9262 0.0513 sec/batch\n", + "Epoch 8/20 Iteration 1383/3560 Training loss: 1.9261 0.0494 sec/batch\n", + "Epoch 8/20 Iteration 1384/3560 Training loss: 1.9261 0.0500 sec/batch\n", + "Epoch 8/20 Iteration 1385/3560 Training loss: 1.9262 0.0593 sec/batch\n", + "Epoch 8/20 Iteration 1386/3560 Training loss: 1.9260 0.0498 sec/batch\n", + "Epoch 8/20 Iteration 1387/3560 Training loss: 1.9262 0.0516 sec/batch\n", + "Epoch 8/20 Iteration 1388/3560 Training loss: 1.9260 0.0519 sec/batch\n", + "Epoch 8/20 Iteration 1389/3560 Training loss: 1.9259 0.0536 sec/batch\n", + "Epoch 8/20 Iteration 1390/3560 Training loss: 1.9258 0.0503 sec/batch\n", + "Epoch 8/20 Iteration 1391/3560 Training loss: 1.9257 0.0517 sec/batch\n", + "Epoch 8/20 Iteration 1392/3560 Training loss: 1.9257 
0.0498 sec/batch\n", + "Epoch 8/20 Iteration 1393/3560 Training loss: 1.9257 0.0523 sec/batch\n", + "Epoch 8/20 Iteration 1394/3560 Training loss: 1.9258 0.0500 sec/batch\n", + "Epoch 8/20 Iteration 1395/3560 Training loss: 1.9257 0.0506 sec/batch\n", + "Epoch 8/20 Iteration 1396/3560 Training loss: 1.9255 0.0541 sec/batch\n", + "Epoch 8/20 Iteration 1397/3560 Training loss: 1.9253 0.0555 sec/batch\n", + "Epoch 8/20 Iteration 1398/3560 Training loss: 1.9254 0.0505 sec/batch\n", + "Epoch 8/20 Iteration 1399/3560 Training loss: 1.9254 0.0556 sec/batch\n", + "Epoch 8/20 Iteration 1400/3560 Training loss: 1.9254 0.0566 sec/batch\n", + "Epoch 8/20 Iteration 1401/3560 Training loss: 1.9253 0.0525 sec/batch\n", + "Epoch 8/20 Iteration 1402/3560 Training loss: 1.9252 0.0553 sec/batch\n", + "Epoch 8/20 Iteration 1403/3560 Training loss: 1.9252 0.0520 sec/batch\n", + "Epoch 8/20 Iteration 1404/3560 Training loss: 1.9251 0.0510 sec/batch\n", + "Epoch 8/20 Iteration 1405/3560 Training loss: 1.9249 0.0560 sec/batch\n", + "Epoch 8/20 Iteration 1406/3560 Training loss: 1.9250 0.0501 sec/batch\n", + "Epoch 8/20 Iteration 1407/3560 Training loss: 1.9251 0.0555 sec/batch\n", + "Epoch 8/20 Iteration 1408/3560 Training loss: 1.9250 0.0570 sec/batch\n", + "Epoch 8/20 Iteration 1409/3560 Training loss: 1.9250 0.0587 sec/batch\n", + "Epoch 8/20 Iteration 1410/3560 Training loss: 1.9249 0.0502 sec/batch\n", + "Epoch 8/20 Iteration 1411/3560 Training loss: 1.9248 0.0555 sec/batch\n", + "Epoch 8/20 Iteration 1412/3560 Training loss: 1.9247 0.0503 sec/batch\n", + "Epoch 8/20 Iteration 1413/3560 Training loss: 1.9246 0.0582 sec/batch\n", + "Epoch 8/20 Iteration 1414/3560 Training loss: 1.9249 0.0505 sec/batch\n", + "Epoch 8/20 Iteration 1415/3560 Training loss: 1.9248 0.0492 sec/batch\n", + "Epoch 8/20 Iteration 1416/3560 Training loss: 1.9247 0.0511 sec/batch\n", + "Epoch 8/20 Iteration 1417/3560 Training loss: 1.9246 0.0563 sec/batch\n", + "Epoch 8/20 Iteration 1418/3560 Training loss: 
1.9244 0.0499 sec/batch\n", + "Epoch 8/20 Iteration 1419/3560 Training loss: 1.9244 0.0547 sec/batch\n", + "Epoch 8/20 Iteration 1420/3560 Training loss: 1.9244 0.0520 sec/batch\n", + "Epoch 8/20 Iteration 1421/3560 Training loss: 1.9244 0.0549 sec/batch\n", + "Epoch 8/20 Iteration 1422/3560 Training loss: 1.9244 0.0507 sec/batch\n", + "Epoch 8/20 Iteration 1423/3560 Training loss: 1.9242 0.0502 sec/batch\n", + "Epoch 8/20 Iteration 1424/3560 Training loss: 1.9242 0.0507 sec/batch\n", + "Epoch 9/20 Iteration 1425/3560 Training loss: 1.9618 0.0536 sec/batch\n", + "Epoch 9/20 Iteration 1426/3560 Training loss: 1.9312 0.0504 sec/batch\n", + "Epoch 9/20 Iteration 1427/3560 Training loss: 1.9216 0.0512 sec/batch\n", + "Epoch 9/20 Iteration 1428/3560 Training loss: 1.9159 0.0505 sec/batch\n", + "Epoch 9/20 Iteration 1429/3560 Training loss: 1.9128 0.0532 sec/batch\n", + "Epoch 9/20 Iteration 1430/3560 Training loss: 1.9063 0.0505 sec/batch\n", + "Epoch 9/20 Iteration 1431/3560 Training loss: 1.9070 0.0500 sec/batch\n", + "Epoch 9/20 Iteration 1432/3560 Training loss: 1.9073 0.0509 sec/batch\n", + "Epoch 9/20 Iteration 1433/3560 Training loss: 1.9091 0.0515 sec/batch\n", + "Epoch 9/20 Iteration 1434/3560 Training loss: 1.9092 0.0511 sec/batch\n", + "Epoch 9/20 Iteration 1435/3560 Training loss: 1.9060 0.0502 sec/batch\n", + "Epoch 9/20 Iteration 1436/3560 Training loss: 1.9037 0.0541 sec/batch\n", + "Epoch 9/20 Iteration 1437/3560 Training loss: 1.9037 0.0593 sec/batch\n", + "Epoch 9/20 Iteration 1438/3560 Training loss: 1.9063 0.0504 sec/batch\n", + "Epoch 9/20 Iteration 1439/3560 Training loss: 1.9062 0.0517 sec/batch\n", + "Epoch 9/20 Iteration 1440/3560 Training loss: 1.9042 0.0499 sec/batch\n", + "Epoch 9/20 Iteration 1441/3560 Training loss: 1.9042 0.0536 sec/batch\n", + "Epoch 9/20 Iteration 1442/3560 Training loss: 1.9065 0.0506 sec/batch\n", + "Epoch 9/20 Iteration 1443/3560 Training loss: 1.9067 0.0521 sec/batch\n", + "Epoch 9/20 Iteration 1444/3560 Training 
loss: 1.9073 0.0576 sec/batch\n", + "[Epoch 9/20 through Epoch 11/20, Iterations 1445-1957/3560: training loss decreases steadily from ~1.907 to ~1.825 at roughly 0.05 sec/batch]\n", + "Epoch 11/20 Iteration 1958/3560 
Training loss: 1.8253 0.0521 sec/batch\n", + "Epoch 12/20 Iteration 1959/3560 Training loss: 1.8809 0.0500 sec/batch\n", + "Epoch 12/20 Iteration 1960/3560 Training loss: 1.8497 0.0514 sec/batch\n", + "Epoch 12/20 Iteration 1961/3560 Training loss: 1.8381 0.0592 sec/batch\n", + "Epoch 12/20 Iteration 1962/3560 Training loss: 1.8308 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 1963/3560 Training loss: 1.8275 0.0511 sec/batch\n", + "Epoch 12/20 Iteration 1964/3560 Training loss: 1.8182 0.0501 sec/batch\n", + "Epoch 12/20 Iteration 1965/3560 Training loss: 1.8184 0.0508 sec/batch\n", + "Epoch 12/20 Iteration 1966/3560 Training loss: 1.8167 0.0511 sec/batch\n", + "Epoch 12/20 Iteration 1967/3560 Training loss: 1.8186 0.0504 sec/batch\n", + "Epoch 12/20 Iteration 1968/3560 Training loss: 1.8197 0.0569 sec/batch\n", + "Epoch 12/20 Iteration 1969/3560 Training loss: 1.8169 0.0501 sec/batch\n", + "Epoch 12/20 Iteration 1970/3560 Training loss: 1.8151 0.0518 sec/batch\n", + "Epoch 12/20 Iteration 1971/3560 Training loss: 1.8149 0.0576 sec/batch\n", + "Epoch 12/20 Iteration 1972/3560 Training loss: 1.8166 0.0616 sec/batch\n", + "Epoch 12/20 Iteration 1973/3560 Training loss: 1.8155 0.0561 sec/batch\n", + "Epoch 12/20 Iteration 1974/3560 Training loss: 1.8136 0.0528 sec/batch\n", + "Epoch 12/20 Iteration 1975/3560 Training loss: 1.8134 0.0504 sec/batch\n", + "Epoch 12/20 Iteration 1976/3560 Training loss: 1.8151 0.0504 sec/batch\n", + "Epoch 12/20 Iteration 1977/3560 Training loss: 1.8153 0.0541 sec/batch\n", + "Epoch 12/20 Iteration 1978/3560 Training loss: 1.8158 0.0522 sec/batch\n", + "Epoch 12/20 Iteration 1979/3560 Training loss: 1.8157 0.0529 sec/batch\n", + "Epoch 12/20 Iteration 1980/3560 Training loss: 1.8158 0.0505 sec/batch\n", + "Epoch 12/20 Iteration 1981/3560 Training loss: 1.8155 0.0603 sec/batch\n", + "Epoch 12/20 Iteration 1982/3560 Training loss: 1.8146 0.0520 sec/batch\n", + "Epoch 12/20 Iteration 1983/3560 Training loss: 1.8140 0.0509 sec/batch\n", + 
"Epoch 12/20 Iteration 1984/3560 Training loss: 1.8128 0.0514 sec/batch\n", + "Epoch 12/20 Iteration 1985/3560 Training loss: 1.8119 0.0551 sec/batch\n", + "Epoch 12/20 Iteration 1986/3560 Training loss: 1.8120 0.0521 sec/batch\n", + "Epoch 12/20 Iteration 1987/3560 Training loss: 1.8133 0.0504 sec/batch\n", + "Epoch 12/20 Iteration 1988/3560 Training loss: 1.8134 0.0544 sec/batch\n", + "Epoch 12/20 Iteration 1989/3560 Training loss: 1.8132 0.0610 sec/batch\n", + "Epoch 12/20 Iteration 1990/3560 Training loss: 1.8125 0.0490 sec/batch\n", + "Epoch 12/20 Iteration 1991/3560 Training loss: 1.8127 0.0504 sec/batch\n", + "Epoch 12/20 Iteration 1992/3560 Training loss: 1.8135 0.0553 sec/batch\n", + "Epoch 12/20 Iteration 1993/3560 Training loss: 1.8134 0.0514 sec/batch\n", + "Epoch 12/20 Iteration 1994/3560 Training loss: 1.8131 0.0511 sec/batch\n", + "Epoch 12/20 Iteration 1995/3560 Training loss: 1.8127 0.0576 sec/batch\n", + "Epoch 12/20 Iteration 1996/3560 Training loss: 1.8118 0.0501 sec/batch\n", + "Epoch 12/20 Iteration 1997/3560 Training loss: 1.8106 0.0514 sec/batch\n", + "Epoch 12/20 Iteration 1998/3560 Training loss: 1.8100 0.0500 sec/batch\n", + "Epoch 12/20 Iteration 1999/3560 Training loss: 1.8095 0.0502 sec/batch\n", + "Epoch 12/20 Iteration 2000/3560 Training loss: 1.8101 0.0520 sec/batch\n", + "Epoch 12/20 Iteration 2001/3560 Training loss: 1.8096 0.0511 sec/batch\n", + "Epoch 12/20 Iteration 2002/3560 Training loss: 1.8087 0.0564 sec/batch\n", + "Epoch 12/20 Iteration 2003/3560 Training loss: 1.8088 0.0513 sec/batch\n", + "Epoch 12/20 Iteration 2004/3560 Training loss: 1.8078 0.0507 sec/batch\n", + "Epoch 12/20 Iteration 2005/3560 Training loss: 1.8077 0.0538 sec/batch\n", + "Epoch 12/20 Iteration 2006/3560 Training loss: 1.8071 0.0508 sec/batch\n", + "Epoch 12/20 Iteration 2007/3560 Training loss: 1.8069 0.0508 sec/batch\n", + "Epoch 12/20 Iteration 2008/3560 Training loss: 1.8076 0.0564 sec/batch\n", + "Epoch 12/20 Iteration 2009/3560 Training loss: 
1.8071 0.0527 sec/batch\n", + "Epoch 12/20 Iteration 2010/3560 Training loss: 1.8080 0.0502 sec/batch\n", + "Epoch 12/20 Iteration 2011/3560 Training loss: 1.8079 0.0504 sec/batch\n", + "Epoch 12/20 Iteration 2012/3560 Training loss: 1.8080 0.0506 sec/batch\n", + "Epoch 12/20 Iteration 2013/3560 Training loss: 1.8077 0.0512 sec/batch\n", + "Epoch 12/20 Iteration 2014/3560 Training loss: 1.8077 0.0577 sec/batch\n", + "Epoch 12/20 Iteration 2015/3560 Training loss: 1.8082 0.0532 sec/batch\n", + "Epoch 12/20 Iteration 2016/3560 Training loss: 1.8080 0.0516 sec/batch\n", + "Epoch 12/20 Iteration 2017/3560 Training loss: 1.8077 0.0517 sec/batch\n", + "Epoch 12/20 Iteration 2018/3560 Training loss: 1.8082 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 2019/3560 Training loss: 1.8081 0.0512 sec/batch\n", + "Epoch 12/20 Iteration 2020/3560 Training loss: 1.8089 0.0519 sec/batch\n", + "Epoch 12/20 Iteration 2021/3560 Training loss: 1.8094 0.0564 sec/batch\n", + "Epoch 12/20 Iteration 2022/3560 Training loss: 1.8098 0.0502 sec/batch\n", + "Epoch 12/20 Iteration 2023/3560 Training loss: 1.8097 0.0508 sec/batch\n", + "Epoch 12/20 Iteration 2024/3560 Training loss: 1.8100 0.0511 sec/batch\n", + "Epoch 12/20 Iteration 2025/3560 Training loss: 1.8103 0.0514 sec/batch\n", + "Epoch 12/20 Iteration 2026/3560 Training loss: 1.8099 0.0502 sec/batch\n", + "Epoch 12/20 Iteration 2027/3560 Training loss: 1.8098 0.0501 sec/batch\n", + "Epoch 12/20 Iteration 2028/3560 Training loss: 1.8097 0.0529 sec/batch\n", + "Epoch 12/20 Iteration 2029/3560 Training loss: 1.8102 0.0499 sec/batch\n", + "Epoch 12/20 Iteration 2030/3560 Training loss: 1.8102 0.0531 sec/batch\n", + "Epoch 12/20 Iteration 2031/3560 Training loss: 1.8107 0.0503 sec/batch\n", + "Epoch 12/20 Iteration 2032/3560 Training loss: 1.8105 0.0512 sec/batch\n", + "Epoch 12/20 Iteration 2033/3560 Training loss: 1.8104 0.0543 sec/batch\n", + "Epoch 12/20 Iteration 2034/3560 Training loss: 1.8106 0.0507 sec/batch\n", + "Epoch 12/20 
Iteration 2035/3560 Training loss: 1.8105 0.0507 sec/batch\n", + "Epoch 12/20 Iteration 2036/3560 Training loss: 1.8106 0.0506 sec/batch\n", + "Epoch 12/20 Iteration 2037/3560 Training loss: 1.8100 0.0526 sec/batch\n", + "Epoch 12/20 Iteration 2038/3560 Training loss: 1.8097 0.0500 sec/batch\n", + "Epoch 12/20 Iteration 2039/3560 Training loss: 1.8093 0.0506 sec/batch\n", + "Epoch 12/20 Iteration 2040/3560 Training loss: 1.8094 0.0514 sec/batch\n", + "Epoch 12/20 Iteration 2041/3560 Training loss: 1.8090 0.0567 sec/batch\n", + "Epoch 12/20 Iteration 2042/3560 Training loss: 1.8089 0.0512 sec/batch\n", + "Epoch 12/20 Iteration 2043/3560 Training loss: 1.8084 0.0503 sec/batch\n", + "Epoch 12/20 Iteration 2044/3560 Training loss: 1.8081 0.0501 sec/batch\n", + "Epoch 12/20 Iteration 2045/3560 Training loss: 1.8077 0.0536 sec/batch\n", + "Epoch 12/20 Iteration 2046/3560 Training loss: 1.8074 0.0513 sec/batch\n", + "Epoch 12/20 Iteration 2047/3560 Training loss: 1.8069 0.0503 sec/batch\n", + "Epoch 12/20 Iteration 2048/3560 Training loss: 1.8070 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 2049/3560 Training loss: 1.8065 0.0604 sec/batch\n", + "Epoch 12/20 Iteration 2050/3560 Training loss: 1.8064 0.0515 sec/batch\n", + "Epoch 12/20 Iteration 2051/3560 Training loss: 1.8059 0.0499 sec/batch\n", + "Epoch 12/20 Iteration 2052/3560 Training loss: 1.8055 0.0505 sec/batch\n", + "Epoch 12/20 Iteration 2053/3560 Training loss: 1.8052 0.0587 sec/batch\n", + "Epoch 12/20 Iteration 2054/3560 Training loss: 1.8051 0.0525 sec/batch\n", + "Epoch 12/20 Iteration 2055/3560 Training loss: 1.8051 0.0508 sec/batch\n", + "Epoch 12/20 Iteration 2056/3560 Training loss: 1.8046 0.0507 sec/batch\n", + "Epoch 12/20 Iteration 2057/3560 Training loss: 1.8043 0.0609 sec/batch\n", + "Epoch 12/20 Iteration 2058/3560 Training loss: 1.8039 0.0551 sec/batch\n", + "Epoch 12/20 Iteration 2059/3560 Training loss: 1.8038 0.0500 sec/batch\n", + "Epoch 12/20 Iteration 2060/3560 Training loss: 1.8037 0.0511 
sec/batch\n", + "Epoch 12/20 Iteration 2061/3560 Training loss: 1.8035 0.0635 sec/batch\n", + "Epoch 12/20 Iteration 2062/3560 Training loss: 1.8033 0.0525 sec/batch\n", + "Epoch 12/20 Iteration 2063/3560 Training loss: 1.8031 0.0505 sec/batch\n", + "Epoch 12/20 Iteration 2064/3560 Training loss: 1.8029 0.0506 sec/batch\n", + "Epoch 12/20 Iteration 2065/3560 Training loss: 1.8029 0.0547 sec/batch\n", + "Epoch 12/20 Iteration 2066/3560 Training loss: 1.8029 0.0498 sec/batch\n", + "Epoch 12/20 Iteration 2067/3560 Training loss: 1.8030 0.0510 sec/batch\n", + "Epoch 12/20 Iteration 2068/3560 Training loss: 1.8030 0.0508 sec/batch\n", + "Epoch 12/20 Iteration 2069/3560 Training loss: 1.8029 0.0527 sec/batch\n", + "Epoch 12/20 Iteration 2070/3560 Training loss: 1.8028 0.0503 sec/batch\n", + "Epoch 12/20 Iteration 2071/3560 Training loss: 1.8026 0.0511 sec/batch\n", + "Epoch 12/20 Iteration 2072/3560 Training loss: 1.8024 0.0549 sec/batch\n", + "Epoch 12/20 Iteration 2073/3560 Training loss: 1.8021 0.0566 sec/batch\n", + "Epoch 12/20 Iteration 2074/3560 Training loss: 1.8018 0.0559 sec/batch\n", + "Epoch 12/20 Iteration 2075/3560 Training loss: 1.8018 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 2076/3560 Training loss: 1.8017 0.0511 sec/batch\n", + "Epoch 12/20 Iteration 2077/3560 Training loss: 1.8016 0.0537 sec/batch\n", + "Epoch 12/20 Iteration 2078/3560 Training loss: 1.8016 0.0521 sec/batch\n", + "Epoch 12/20 Iteration 2079/3560 Training loss: 1.8015 0.0514 sec/batch\n", + "Epoch 12/20 Iteration 2080/3560 Training loss: 1.8013 0.0530 sec/batch\n", + "Epoch 12/20 Iteration 2081/3560 Training loss: 1.8009 0.0501 sec/batch\n", + "Epoch 12/20 Iteration 2082/3560 Training loss: 1.8010 0.0494 sec/batch\n", + "Epoch 12/20 Iteration 2083/3560 Training loss: 1.8010 0.0570 sec/batch\n", + "Epoch 12/20 Iteration 2084/3560 Training loss: 1.8006 0.0558 sec/batch\n", + "Epoch 12/20 Iteration 2085/3560 Training loss: 1.8007 0.0522 sec/batch\n", + "Epoch 12/20 Iteration 2086/3560 
Training loss: 1.8007 0.0562 sec/batch\n", + "Epoch 12/20 Iteration 2087/3560 Training loss: 1.8006 0.0506 sec/batch\n", + "Epoch 12/20 Iteration 2088/3560 Training loss: 1.8004 0.0520 sec/batch\n", + "Epoch 12/20 Iteration 2089/3560 Training loss: 1.8001 0.0571 sec/batch\n", + "Epoch 12/20 Iteration 2090/3560 Training loss: 1.7998 0.0542 sec/batch\n", + "Epoch 12/20 Iteration 2091/3560 Training loss: 1.7998 0.0503 sec/batch\n", + "Epoch 12/20 Iteration 2092/3560 Training loss: 1.7998 0.0525 sec/batch\n", + "Epoch 12/20 Iteration 2093/3560 Training loss: 1.7998 0.0530 sec/batch\n", + "Epoch 12/20 Iteration 2094/3560 Training loss: 1.7999 0.0507 sec/batch\n", + "Epoch 12/20 Iteration 2095/3560 Training loss: 1.7999 0.0500 sec/batch\n", + "Epoch 12/20 Iteration 2096/3560 Training loss: 1.7999 0.0516 sec/batch\n", + "Epoch 12/20 Iteration 2097/3560 Training loss: 1.8001 0.0566 sec/batch\n", + "Epoch 12/20 Iteration 2098/3560 Training loss: 1.8000 0.0530 sec/batch\n", + "Epoch 12/20 Iteration 2099/3560 Training loss: 1.8002 0.0568 sec/batch\n", + "Epoch 12/20 Iteration 2100/3560 Training loss: 1.8000 0.0569 sec/batch\n", + "Epoch 12/20 Iteration 2101/3560 Training loss: 1.8000 0.0575 sec/batch\n", + "Epoch 12/20 Iteration 2102/3560 Training loss: 1.8000 0.0499 sec/batch\n", + "Epoch 12/20 Iteration 2103/3560 Training loss: 1.7999 0.0566 sec/batch\n", + "Epoch 12/20 Iteration 2104/3560 Training loss: 1.7999 0.0516 sec/batch\n", + "Epoch 12/20 Iteration 2105/3560 Training loss: 1.7999 0.0532 sec/batch\n", + "Epoch 12/20 Iteration 2106/3560 Training loss: 1.8001 0.0507 sec/batch\n", + "Epoch 12/20 Iteration 2107/3560 Training loss: 1.8001 0.0505 sec/batch\n", + "Epoch 12/20 Iteration 2108/3560 Training loss: 1.7999 0.0507 sec/batch\n", + "Epoch 12/20 Iteration 2109/3560 Training loss: 1.7996 0.0580 sec/batch\n", + "Epoch 12/20 Iteration 2110/3560 Training loss: 1.7997 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 2111/3560 Training loss: 1.7998 0.0522 sec/batch\n", + 
"Epoch 12/20 Iteration 2112/3560 Training loss: 1.7998 0.0515 sec/batch\n", + "Epoch 12/20 Iteration 2113/3560 Training loss: 1.7998 0.0512 sec/batch\n", + "Epoch 12/20 Iteration 2114/3560 Training loss: 1.7997 0.0502 sec/batch\n", + "Epoch 12/20 Iteration 2115/3560 Training loss: 1.7997 0.0520 sec/batch\n", + "Epoch 12/20 Iteration 2116/3560 Training loss: 1.7997 0.0516 sec/batch\n", + "Epoch 12/20 Iteration 2117/3560 Training loss: 1.7995 0.0612 sec/batch\n", + "Epoch 12/20 Iteration 2118/3560 Training loss: 1.7997 0.0534 sec/batch\n", + "Epoch 12/20 Iteration 2119/3560 Training loss: 1.7998 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 2120/3560 Training loss: 1.7998 0.0506 sec/batch\n", + "Epoch 12/20 Iteration 2121/3560 Training loss: 1.7999 0.0539 sec/batch\n", + "Epoch 12/20 Iteration 2122/3560 Training loss: 1.7999 0.0523 sec/batch\n", + "Epoch 12/20 Iteration 2123/3560 Training loss: 1.7998 0.0515 sec/batch\n", + "Epoch 12/20 Iteration 2124/3560 Training loss: 1.7997 0.0506 sec/batch\n", + "Epoch 12/20 Iteration 2125/3560 Training loss: 1.7998 0.0510 sec/batch\n", + "Epoch 12/20 Iteration 2126/3560 Training loss: 1.8001 0.0521 sec/batch\n", + "Epoch 12/20 Iteration 2127/3560 Training loss: 1.8000 0.0552 sec/batch\n", + "Epoch 12/20 Iteration 2128/3560 Training loss: 1.8000 0.0508 sec/batch\n", + "Epoch 12/20 Iteration 2129/3560 Training loss: 1.7999 0.0588 sec/batch\n", + "Epoch 12/20 Iteration 2130/3560 Training loss: 1.7997 0.0523 sec/batch\n", + "Epoch 12/20 Iteration 2131/3560 Training loss: 1.7997 0.0540 sec/batch\n", + "Epoch 12/20 Iteration 2132/3560 Training loss: 1.7998 0.0522 sec/batch\n", + "Epoch 12/20 Iteration 2133/3560 Training loss: 1.7999 0.0538 sec/batch\n", + "Epoch 12/20 Iteration 2134/3560 Training loss: 1.7998 0.0512 sec/batch\n", + "Epoch 12/20 Iteration 2135/3560 Training loss: 1.7996 0.0506 sec/batch\n", + "Epoch 12/20 Iteration 2136/3560 Training loss: 1.7996 0.0512 sec/batch\n", + "Epoch 13/20 Iteration 2137/3560 Training loss: 
1.8573 0.0518 sec/batch\n", + "Epoch 13/20 Iteration 2138/3560 Training loss: 1.8201 0.0533 sec/batch\n", + "Epoch 13/20 Iteration 2139/3560 Training loss: 1.8087 0.0504 sec/batch\n", + "Epoch 13/20 Iteration 2140/3560 Training loss: 1.8042 0.0538 sec/batch\n", + "Epoch 13/20 Iteration 2141/3560 Training loss: 1.8002 0.0525 sec/batch\n", + "Epoch 13/20 Iteration 2142/3560 Training loss: 1.7910 0.0515 sec/batch\n", + "Epoch 13/20 Iteration 2143/3560 Training loss: 1.7905 0.0505 sec/batch\n", + "Epoch 13/20 Iteration 2144/3560 Training loss: 1.7896 0.0564 sec/batch\n", + "Epoch 13/20 Iteration 2145/3560 Training loss: 1.7922 0.0546 sec/batch\n", + "Epoch 13/20 Iteration 2146/3560 Training loss: 1.7925 0.0507 sec/batch\n", + "Epoch 13/20 Iteration 2147/3560 Training loss: 1.7890 0.0507 sec/batch\n", + "Epoch 13/20 Iteration 2148/3560 Training loss: 1.7873 0.0505 sec/batch\n", + "Epoch 13/20 Iteration 2149/3560 Training loss: 1.7873 0.0570 sec/batch\n", + "Epoch 13/20 Iteration 2150/3560 Training loss: 1.7890 0.0511 sec/batch\n", + "Epoch 13/20 Iteration 2151/3560 Training loss: 1.7885 0.0499 sec/batch\n", + "Epoch 13/20 Iteration 2152/3560 Training loss: 1.7866 0.0511 sec/batch\n", + "Epoch 13/20 Iteration 2153/3560 Training loss: 1.7870 0.0532 sec/batch\n", + "Epoch 13/20 Iteration 2154/3560 Training loss: 1.7896 0.0507 sec/batch\n", + "Epoch 13/20 Iteration 2155/3560 Training loss: 1.7898 0.0522 sec/batch\n", + "Epoch 13/20 Iteration 2156/3560 Training loss: 1.7912 0.0502 sec/batch\n", + "Epoch 13/20 Iteration 2157/3560 Training loss: 1.7912 0.0548 sec/batch\n", + "Epoch 13/20 Iteration 2158/3560 Training loss: 1.7918 0.0517 sec/batch\n", + "Epoch 13/20 Iteration 2159/3560 Training loss: 1.7911 0.0527 sec/batch\n", + "Epoch 13/20 Iteration 2160/3560 Training loss: 1.7904 0.0512 sec/batch\n", + "Epoch 13/20 Iteration 2161/3560 Training loss: 1.7903 0.0530 sec/batch\n", + "Epoch 13/20 Iteration 2162/3560 Training loss: 1.7893 0.0572 sec/batch\n", + "Epoch 13/20 
Iteration 2163/3560 Training loss: 1.7883 0.0513 sec/batch\n", + "Epoch 13/20 Iteration 2164/3560 Training loss: 1.7886 0.0512 sec/batch\n", + "Epoch 13/20 Iteration 2165/3560 Training loss: 1.7893 0.0586 sec/batch\n", + "Epoch 13/20 Iteration 2166/3560 Training loss: 1.7893 0.0501 sec/batch\n", + "Epoch 13/20 Iteration 2167/3560 Training loss: 1.7890 0.0526 sec/batch\n", + "Epoch 13/20 Iteration 2168/3560 Training loss: 1.7883 0.0519 sec/batch\n", + "Epoch 13/20 Iteration 2169/3560 Training loss: 1.7886 0.0585 sec/batch\n", + "Epoch 13/20 Iteration 2170/3560 Training loss: 1.7894 0.0551 sec/batch\n", + "Epoch 13/20 Iteration 2171/3560 Training loss: 1.7892 0.0549 sec/batch\n", + "Epoch 13/20 Iteration 2172/3560 Training loss: 1.7891 0.0555 sec/batch\n", + "Epoch 13/20 Iteration 2173/3560 Training loss: 1.7885 0.0520 sec/batch\n", + "Epoch 13/20 Iteration 2174/3560 Training loss: 1.7873 0.0509 sec/batch\n", + "Epoch 13/20 Iteration 2175/3560 Training loss: 1.7861 0.0511 sec/batch\n", + "Epoch 13/20 Iteration 2176/3560 Training loss: 1.7857 0.0516 sec/batch\n", + "Epoch 13/20 Iteration 2177/3560 Training loss: 1.7852 0.0553 sec/batch\n", + "Epoch 13/20 Iteration 2178/3560 Training loss: 1.7855 0.0509 sec/batch\n", + "Epoch 13/20 Iteration 2179/3560 Training loss: 1.7851 0.0508 sec/batch\n", + "Epoch 13/20 Iteration 2180/3560 Training loss: 1.7844 0.0507 sec/batch\n", + "Epoch 13/20 Iteration 2181/3560 Training loss: 1.7847 0.0599 sec/batch\n", + "Epoch 13/20 Iteration 2182/3560 Training loss: 1.7838 0.0558 sec/batch\n", + "Epoch 13/20 Iteration 2183/3560 Training loss: 1.7836 0.0514 sec/batch\n", + "Epoch 13/20 Iteration 2184/3560 Training loss: 1.7833 0.0514 sec/batch\n", + "Epoch 13/20 Iteration 2185/3560 Training loss: 1.7831 0.0531 sec/batch\n", + "Epoch 13/20 Iteration 2186/3560 Training loss: 1.7837 0.0531 sec/batch\n", + "Epoch 13/20 Iteration 2187/3560 Training loss: 1.7833 0.0508 sec/batch\n", + "Epoch 13/20 Iteration 2188/3560 Training loss: 1.7839 0.0506 
sec/batch\n", + "Epoch 13/20 Iteration 2189/3560 Training loss: 1.7836 0.0523 sec/batch\n", + "Epoch 13/20 Iteration 2190/3560 Training loss: 1.7839 0.0509 sec/batch\n", + "Epoch 13/20 Iteration 2191/3560 Training loss: 1.7836 0.0501 sec/batch\n", + "Epoch 13/20 Iteration 2192/3560 Training loss: 1.7837 0.0502 sec/batch\n", + "Epoch 13/20 Iteration 2193/3560 Training loss: 1.7840 0.0524 sec/batch\n", + "Epoch 13/20 Iteration 2194/3560 Training loss: 1.7838 0.0527 sec/batch\n", + "Epoch 13/20 Iteration 2195/3560 Training loss: 1.7834 0.0499 sec/batch\n", + "Epoch 13/20 Iteration 2196/3560 Training loss: 1.7841 0.0567 sec/batch\n", + "Epoch 13/20 Iteration 2197/3560 Training loss: 1.7839 0.0557 sec/batch\n", + "Epoch 13/20 Iteration 2198/3560 Training loss: 1.7845 0.0530 sec/batch\n", + "Epoch 13/20 Iteration 2199/3560 Training loss: 1.7849 0.0518 sec/batch\n", + "Epoch 13/20 Iteration 2200/3560 Training loss: 1.7852 0.0562 sec/batch\n", + "Epoch 13/20 Iteration 2201/3560 Training loss: 1.7850 0.0565 sec/batch\n", + "Epoch 13/20 Iteration 2202/3560 Training loss: 1.7853 0.0538 sec/batch\n", + "Epoch 13/20 Iteration 2203/3560 Training loss: 1.7856 0.0535 sec/batch\n", + "Epoch 13/20 Iteration 2204/3560 Training loss: 1.7853 0.0584 sec/batch\n", + "Epoch 13/20 Iteration 2205/3560 Training loss: 1.7852 0.0577 sec/batch\n", + "Epoch 13/20 Iteration 2206/3560 Training loss: 1.7852 0.0512 sec/batch\n", + "Epoch 13/20 Iteration 2207/3560 Training loss: 1.7856 0.0528 sec/batch\n", + "Epoch 13/20 Iteration 2208/3560 Training loss: 1.7855 0.0560 sec/batch\n", + "Epoch 13/20 Iteration 2209/3560 Training loss: 1.7860 0.0508 sec/batch\n", + "Epoch 13/20 Iteration 2210/3560 Training loss: 1.7858 0.0502 sec/batch\n", + "Epoch 13/20 Iteration 2211/3560 Training loss: 1.7857 0.0501 sec/batch\n", + "Epoch 13/20 Iteration 2212/3560 Training loss: 1.7861 0.0517 sec/batch\n", + "Epoch 13/20 Iteration 2213/3560 Training loss: 1.7858 0.0530 sec/batch\n", + "Epoch 13/20 Iteration 2214/3560 
Training loss: 1.7859 0.0520 sec/batch\n", + "Epoch 13/20 Iteration 2215/3560 Training loss: 1.7855 0.0512 sec/batch\n", + "Epoch 13/20 Iteration 2216/3560 Training loss: 1.7854 0.0504 sec/batch\n", + "Epoch 13/20 Iteration 2217/3560 Training loss: 1.7849 0.0625 sec/batch\n", + "Epoch 13/20 Iteration 2218/3560 Training loss: 1.7850 0.0561 sec/batch\n", + "Epoch 13/20 Iteration 2219/3560 Training loss: 1.7847 0.0504 sec/batch\n", + "Epoch 13/20 Iteration 2220/3560 Training loss: 1.7847 0.0507 sec/batch\n", + "Epoch 13/20 Iteration 2221/3560 Training loss: 1.7841 0.0544 sec/batch\n", + "Epoch 13/20 Iteration 2222/3560 Training loss: 1.7837 0.0504 sec/batch\n", + "Epoch 13/20 Iteration 2223/3560 Training loss: 1.7834 0.0507 sec/batch\n", + "Epoch 13/20 Iteration 2224/3560 Training loss: 1.7830 0.0599 sec/batch\n", + "Epoch 13/20 Iteration 2225/3560 Training loss: 1.7825 0.0581 sec/batch\n", + "Epoch 13/20 Iteration 2226/3560 Training loss: 1.7827 0.0504 sec/batch\n", + "Epoch 13/20 Iteration 2227/3560 Training loss: 1.7825 0.0587 sec/batch\n", + "Epoch 13/20 Iteration 2228/3560 Training loss: 1.7823 0.0507 sec/batch\n", + "Epoch 13/20 Iteration 2229/3560 Training loss: 1.7817 0.0545 sec/batch\n", + "Epoch 13/20 Iteration 2230/3560 Training loss: 1.7814 0.0504 sec/batch\n", + "Epoch 13/20 Iteration 2231/3560 Training loss: 1.7811 0.0503 sec/batch\n", + "Epoch 13/20 Iteration 2232/3560 Training loss: 1.7810 0.0502 sec/batch\n", + "Epoch 13/20 Iteration 2233/3560 Training loss: 1.7809 0.0543 sec/batch\n", + "Epoch 13/20 Iteration 2234/3560 Training loss: 1.7805 0.0516 sec/batch\n", + "Epoch 13/20 Iteration 2235/3560 Training loss: 1.7802 0.0511 sec/batch\n", + "Epoch 13/20 Iteration 2236/3560 Training loss: 1.7798 0.0534 sec/batch\n", + "Epoch 13/20 Iteration 2237/3560 Training loss: 1.7797 0.0643 sec/batch\n", + "Epoch 13/20 Iteration 2238/3560 Training loss: 1.7796 0.0519 sec/batch\n", + "Epoch 13/20 Iteration 2239/3560 Training loss: 1.7793 0.0510 sec/batch\n", + 
"Epoch 13/20 Iteration 2240/3560 Training loss: 1.7792 0.0546 sec/batch\n", + "Epoch 13/20 Iteration 2241/3560 Training loss: 1.7790 0.0539 sec/batch\n", + "Epoch 13/20 Iteration 2242/3560 Training loss: 1.7788 0.0533 sec/batch\n", + "Epoch 13/20 Iteration 2243/3560 Training loss: 1.7789 0.0511 sec/batch\n", + "Epoch 13/20 Iteration 2244/3560 Training loss: 1.7788 0.0549 sec/batch\n", + "Epoch 13/20 Iteration 2245/3560 Training loss: 1.7788 0.0557 sec/batch\n", + "Epoch 13/20 Iteration 2246/3560 Training loss: 1.7789 0.0599 sec/batch\n", + "Epoch 13/20 Iteration 2247/3560 Training loss: 1.7788 0.0517 sec/batch\n", + "Epoch 13/20 Iteration 2248/3560 Training loss: 1.7787 0.0512 sec/batch\n", + "Epoch 13/20 Iteration 2249/3560 Training loss: 1.7786 0.0557 sec/batch\n", + "Epoch 13/20 Iteration 2250/3560 Training loss: 1.7784 0.0521 sec/batch\n", + "Epoch 13/20 Iteration 2251/3560 Training loss: 1.7782 0.0510 sec/batch\n", + "Epoch 13/20 Iteration 2252/3560 Training loss: 1.7778 0.0510 sec/batch\n", + "Epoch 13/20 Iteration 2253/3560 Training loss: 1.7777 0.0516 sec/batch\n", + "Epoch 13/20 Iteration 2254/3560 Training loss: 1.7776 0.0506 sec/batch\n", + "Epoch 13/20 Iteration 2255/3560 Training loss: 1.7775 0.0511 sec/batch\n", + "Epoch 13/20 Iteration 2256/3560 Training loss: 1.7774 0.0504 sec/batch\n", + "Epoch 13/20 Iteration 2257/3560 Training loss: 1.7775 0.0571 sec/batch\n", + "Epoch 13/20 Iteration 2258/3560 Training loss: 1.7771 0.0570 sec/batch\n", + "Epoch 13/20 Iteration 2259/3560 Training loss: 1.7768 0.0505 sec/batch\n", + "Epoch 13/20 Iteration 2260/3560 Training loss: 1.7769 0.0527 sec/batch\n", + "Epoch 13/20 Iteration 2261/3560 Training loss: 1.7769 0.0530 sec/batch\n", + "Epoch 13/20 Iteration 2262/3560 Training loss: 1.7765 0.0569 sec/batch\n", + "Epoch 13/20 Iteration 2263/3560 Training loss: 1.7767 0.0500 sec/batch\n", + "Epoch 13/20 Iteration 2264/3560 Training loss: 1.7768 0.0502 sec/batch\n", + "Epoch 13/20 Iteration 2265/3560 Training loss: 
1.7766 0.0523 sec/batch\n", + "Epoch 13/20 Iteration 2266/3560 Training loss: 1.7765 0.0510 sec/batch\n", + "Epoch 13/20 Iteration 2267/3560 Training loss: 1.7762 0.0513 sec/batch\n", + "Epoch 13/20 Iteration 2268/3560 Training loss: 1.7758 0.0514 sec/batch\n", + "Epoch 13/20 Iteration 2269/3560 Training loss: 1.7758 0.0584 sec/batch\n", + "Epoch 13/20 Iteration 2270/3560 Training loss: 1.7759 0.0548 sec/batch\n", + "Epoch 13/20 Iteration 2271/3560 Training loss: 1.7758 0.0531 sec/batch\n", + "Epoch 13/20 Iteration 2272/3560 Training loss: 1.7760 0.0511 sec/batch\n", + "Epoch 13/20 Iteration 2273/3560 Training loss: 1.7761 0.0508 sec/batch\n", + "Epoch 13/20 Iteration 2274/3560 Training loss: 1.7761 0.0501 sec/batch\n", + "Epoch 13/20 Iteration 2275/3560 Training loss: 1.7762 0.0500 sec/batch\n", + "Epoch 13/20 Iteration 2276/3560 Training loss: 1.7761 0.0514 sec/batch\n", + "Epoch 13/20 Iteration 2277/3560 Training loss: 1.7763 0.0504 sec/batch\n", + "Epoch 13/20 Iteration 2278/3560 Training loss: 1.7762 0.0517 sec/batch\n", + "Epoch 13/20 Iteration 2279/3560 Training loss: 1.7761 0.0497 sec/batch\n", + "Epoch 13/20 Iteration 2280/3560 Training loss: 1.7762 0.0518 sec/batch\n", + "Epoch 13/20 Iteration 2281/3560 Training loss: 1.7761 0.0519 sec/batch\n", + "Epoch 13/20 Iteration 2282/3560 Training loss: 1.7762 0.0567 sec/batch\n", + "Epoch 13/20 Iteration 2283/3560 Training loss: 1.7762 0.0500 sec/batch\n", + "Epoch 13/20 Iteration 2284/3560 Training loss: 1.7763 0.0505 sec/batch\n", + "Epoch 13/20 Iteration 2285/3560 Training loss: 1.7763 0.0565 sec/batch\n", + "Epoch 13/20 Iteration 2286/3560 Training loss: 1.7761 0.0532 sec/batch\n", + "Epoch 13/20 Iteration 2287/3560 Training loss: 1.7758 0.0543 sec/batch\n", + "Epoch 13/20 Iteration 2288/3560 Training loss: 1.7760 0.0504 sec/batch\n", + "Epoch 13/20 Iteration 2289/3560 Training loss: 1.7761 0.0560 sec/batch\n", + "Epoch 13/20 Iteration 2290/3560 Training loss: 1.7761 0.0514 sec/batch\n", + "Epoch 13/20 
Iteration 2291/3560 Training loss: 1.7761 0.0508 sec/batch\n", + "Epoch 13/20 Iteration 2292/3560 Training loss: 1.7760 0.0559 sec/batch\n", + "Epoch 13/20 Iteration 2293/3560 Training loss: 1.7761 0.0642 sec/batch\n", + "Epoch 13/20 Iteration 2294/3560 Training loss: 1.7761 0.0508 sec/batch\n", + "Epoch 13/20 Iteration 2295/3560 Training loss: 1.7759 0.0568 sec/batch\n", + "Epoch 13/20 Iteration 2296/3560 Training loss: 1.7761 0.0570 sec/batch\n", + "Epoch 13/20 Iteration 2297/3560 Training loss: 1.7763 0.0548 sec/batch\n", + "Epoch 13/20 Iteration 2298/3560 Training loss: 1.7762 0.0505 sec/batch\n", + "Epoch 13/20 Iteration 2299/3560 Training loss: 1.7763 0.0507 sec/batch\n", + "Epoch 13/20 Iteration 2300/3560 Training loss: 1.7763 0.0506 sec/batch\n", + "Epoch 13/20 Iteration 2301/3560 Training loss: 1.7762 0.0642 sec/batch\n", + "Epoch 13/20 Iteration 2302/3560 Training loss: 1.7762 0.0511 sec/batch\n", + "Epoch 13/20 Iteration 2303/3560 Training loss: 1.7762 0.0512 sec/batch\n", + "Epoch 13/20 Iteration 2304/3560 Training loss: 1.7765 0.0512 sec/batch\n", + "Epoch 13/20 Iteration 2305/3560 Training loss: 1.7765 0.0503 sec/batch\n", + "Epoch 13/20 Iteration 2306/3560 Training loss: 1.7765 0.0569 sec/batch\n", + "Epoch 13/20 Iteration 2307/3560 Training loss: 1.7764 0.0511 sec/batch\n", + "Epoch 13/20 Iteration 2308/3560 Training loss: 1.7762 0.0527 sec/batch\n", + "Epoch 13/20 Iteration 2309/3560 Training loss: 1.7762 0.0545 sec/batch\n", + "Epoch 13/20 Iteration 2310/3560 Training loss: 1.7762 0.0519 sec/batch\n", + "Epoch 13/20 Iteration 2311/3560 Training loss: 1.7762 0.0513 sec/batch\n", + "Epoch 13/20 Iteration 2312/3560 Training loss: 1.7762 0.0513 sec/batch\n", + "Epoch 13/20 Iteration 2313/3560 Training loss: 1.7760 0.0520 sec/batch\n", + "Epoch 13/20 Iteration 2314/3560 Training loss: 1.7761 0.0563 sec/batch\n", + "Epoch 14/20 Iteration 2315/3560 Training loss: 1.8300 0.0508 sec/batch\n", + "Epoch 14/20 Iteration 2316/3560 Training loss: 1.8032 0.0506 
sec/batch\n", + "Epoch 14/20 Iteration 2317/3560 Training loss: 1.7926 0.0539 sec/batch\n", + "Epoch 14/20 Iteration 2318/3560 Training loss: 1.7869 0.0511 sec/batch\n", + "Epoch 14/20 Iteration 2319/3560 Training loss: 1.7813 0.0509 sec/batch\n", + "Epoch 14/20 Iteration 2320/3560 Training loss: 1.7720 0.0509 sec/batch\n", + "Epoch 14/20 Iteration 2321/3560 Training loss: 1.7713 0.0521 sec/batch\n", + "Epoch 14/20 Iteration 2322/3560 Training loss: 1.7696 0.0542 sec/batch\n", + "Epoch 14/20 Iteration 2323/3560 Training loss: 1.7708 0.0516 sec/batch\n", + "Epoch 14/20 Iteration 2324/3560 Training loss: 1.7702 0.0508 sec/batch\n", + "Epoch 14/20 Iteration 2325/3560 Training loss: 1.7668 0.0521 sec/batch\n", + "Epoch 14/20 Iteration 2326/3560 Training loss: 1.7658 0.0569 sec/batch\n", + "Epoch 14/20 Iteration 2327/3560 Training loss: 1.7659 0.0522 sec/batch\n", + "Epoch 14/20 Iteration 2328/3560 Training loss: 1.7678 0.0508 sec/batch\n", + "Epoch 14/20 Iteration 2329/3560 Training loss: 1.7669 0.0624 sec/batch\n", + "Epoch 14/20 Iteration 2330/3560 Training loss: 1.7652 0.0507 sec/batch\n", + "Epoch 14/20 Iteration 2331/3560 Training loss: 1.7648 0.0504 sec/batch\n", + "Epoch 14/20 Iteration 2332/3560 Training loss: 1.7672 0.0524 sec/batch\n", + "Epoch 14/20 Iteration 2333/3560 Training loss: 1.7679 0.0534 sec/batch\n", + "Epoch 14/20 Iteration 2334/3560 Training loss: 1.7683 0.0507 sec/batch\n", + "Epoch 14/20 Iteration 2335/3560 Training loss: 1.7682 0.0575 sec/batch\n", + "Epoch 14/20 Iteration 2336/3560 Training loss: 1.7688 0.0508 sec/batch\n", + "Epoch 14/20 Iteration 2337/3560 Training loss: 1.7679 0.0587 sec/batch\n", + "Epoch 14/20 Iteration 2338/3560 Training loss: 1.7673 0.0507 sec/batch\n", + "Epoch 14/20 Iteration 2339/3560 Training loss: 1.7673 0.0500 sec/batch\n", + "Epoch 14/20 Iteration 2340/3560 Training loss: 1.7665 0.0501 sec/batch\n", + "Epoch 14/20 Iteration 2341/3560 Training loss: 1.7655 0.0532 sec/batch\n", + "Epoch 14/20 Iteration 2342/3560 
Training loss: 1.7659 0.0517 sec/batch\n", + "Epoch 14/20 Iteration 2343/3560 Training loss: 1.7666 0.0506 sec/batch\n", + "Epoch 14/20 Iteration 2344/3560 Training loss: 1.7667 0.0507 sec/batch\n", + "Epoch 14/20 Iteration 2345/3560 Training loss: 1.7666 0.0576 sec/batch\n", + "Epoch 14/20 Iteration 2346/3560 Training loss: 1.7660 0.0503 sec/batch\n", + "Epoch 14/20 Iteration 2347/3560 Training loss: 1.7663 0.0506 sec/batch\n", + "Epoch 14/20 Iteration 2348/3560 Training loss: 1.7674 0.0505 sec/batch\n", + "Epoch 14/20 Iteration 2349/3560 Training loss: 1.7672 0.0571 sec/batch\n", + "Epoch 14/20 Iteration 2350/3560 Training loss: 1.7671 0.0507 sec/batch\n", + "Epoch 14/20 Iteration 2351/3560 Training loss: 1.7666 0.0511 sec/batch\n", + "Epoch 14/20 Iteration 2352/3560 Training loss: 1.7656 0.0509 sec/batch\n", + "Epoch 14/20 Iteration 2353/3560 Training loss: 1.7645 0.0547 sec/batch\n", + "Epoch 14/20 Iteration 2354/3560 Training loss: 1.7640 0.0516 sec/batch\n", + "Epoch 14/20 Iteration 2355/3560 Training loss: 1.7635 0.0508 sec/batch\n", + "Epoch 14/20 Iteration 2356/3560 Training loss: 1.7639 0.0503 sec/batch\n", + "Epoch 14/20 Iteration 2357/3560 Training loss: 1.7636 0.0549 sec/batch\n", + "Epoch 14/20 Iteration 2358/3560 Training loss: 1.7629 0.0530 sec/batch\n", + "Epoch 14/20 Iteration 2359/3560 Training loss: 1.7630 0.0619 sec/batch\n", + "Epoch 14/20 Iteration 2360/3560 Training loss: 1.7618 0.0506 sec/batch\n", + "Epoch 14/20 Iteration 2361/3560 Training loss: 1.7615 0.0513 sec/batch\n", + "Epoch 14/20 Iteration 2362/3560 Training loss: 1.7610 0.0499 sec/batch\n", + "Epoch 14/20 Iteration 2363/3560 Training loss: 1.7608 0.0516 sec/batch\n", + "Epoch 14/20 Iteration 2364/3560 Training loss: 1.7614 0.0593 sec/batch\n", + "Epoch 14/20 Iteration 2365/3560 Training loss: 1.7610 0.0529 sec/batch\n", + "Epoch 14/20 Iteration 2366/3560 Training loss: 1.7618 0.0512 sec/batch\n", + "Epoch 14/20 Iteration 2367/3560 Training loss: 1.7615 0.0565 sec/batch\n", + 
"Epoch 14/20 Iteration 2368/3560 Training loss: 1.7615 0.0556 sec/batch\n", + "Epoch 14/20 Iteration 2369/3560 Training loss: 1.7613 0.0585 sec/batch\n", + "Epoch 14/20 Iteration 2370/3560 Training loss: 1.7614 0.0511 sec/batch\n", + "Epoch 14/20 Iteration 2371/3560 Training loss: 1.7618 0.0552 sec/batch\n", + "Epoch 14/20 Iteration 2372/3560 Training loss: 1.7614 0.0509 sec/batch\n", + "Epoch 14/20 Iteration 2373/3560 Training loss: 1.7611 0.0527 sec/batch\n", + "Epoch 14/20 Iteration 2374/3560 Training loss: 1.7617 0.0510 sec/batch\n", + "Epoch 14/20 Iteration 2375/3560 Training loss: 1.7616 0.0527 sec/batch\n", + "Epoch 14/20 Iteration 2376/3560 Training loss: 1.7625 0.0557 sec/batch\n", + "Epoch 14/20 Iteration 2377/3560 Training loss: 1.7630 0.0530 sec/batch\n", + "Epoch 14/20 Iteration 2378/3560 Training loss: 1.7635 0.0533 sec/batch\n", + "Epoch 14/20 Iteration 2379/3560 Training loss: 1.7634 0.0504 sec/batch\n", + "Epoch 14/20 Iteration 2380/3560 Training loss: 1.7636 0.0514 sec/batch\n", + "Epoch 14/20 Iteration 2381/3560 Training loss: 1.7640 0.0576 sec/batch\n", + "Epoch 14/20 Iteration 2382/3560 Training loss: 1.7637 0.0498 sec/batch\n", + "Epoch 14/20 Iteration 2383/3560 Training loss: 1.7637 0.0507 sec/batch\n", + "Epoch 14/20 Iteration 2384/3560 Training loss: 1.7636 0.0504 sec/batch\n", + "Epoch 14/20 Iteration 2385/3560 Training loss: 1.7640 0.0570 sec/batch\n", + "Epoch 14/20 Iteration 2386/3560 Training loss: 1.7641 0.0552 sec/batch\n", + "Epoch 14/20 Iteration 2387/3560 Training loss: 1.7645 0.0547 sec/batch\n", + "Epoch 14/20 Iteration 2388/3560 Training loss: 1.7643 0.0503 sec/batch\n", + "Epoch 14/20 Iteration 2389/3560 Training loss: 1.7641 0.0655 sec/batch\n", + "Epoch 14/20 Iteration 2390/3560 Training loss: 1.7644 0.0498 sec/batch\n", + "Epoch 14/20 Iteration 2391/3560 Training loss: 1.7642 0.0516 sec/batch\n", + "Epoch 14/20 Iteration 2392/3560 Training loss: 1.7643 0.0557 sec/batch\n", + "Epoch 14/20 Iteration 2393/3560 Training loss: 
1.7638 0.0529 sec/batch\n", + "Epoch 14/20 Iteration 2394/3560 Training loss: 1.7638 0.0541 sec/batch\n", + "Epoch 14/20 Iteration 2395/3560 Training loss: 1.7633 0.0512 sec/batch\n", + "Epoch 14/20 Iteration 2396/3560 Training loss: 1.7634 0.0547 sec/batch\n", + "Epoch 14/20 Iteration 2397/3560 Training loss: 1.7631 0.0549 sec/batch\n", + "Epoch 14/20 Iteration 2398/3560 Training loss: 1.7631 0.0586 sec/batch\n", + "Epoch 14/20 Iteration 2399/3560 Training loss: 1.7625 0.0504 sec/batch\n", + "Epoch 14/20 Iteration 2400/3560 Training loss: 1.7623 0.0509 sec/batch\n", + "Epoch 14/20 Iteration 2401/3560 Training loss: 1.7621 0.0541 sec/batch\n", + "Epoch 14/20 Iteration 2402/3560 Training loss: 1.7618 0.0506 sec/batch\n", + "Epoch 14/20 Iteration 2403/3560 Training loss: 1.7612 0.0516 sec/batch\n", + "Epoch 14/20 Iteration 2404/3560 Training loss: 1.7613 0.0513 sec/batch\n", + "Epoch 14/20 Iteration 2405/3560 Training loss: 1.7610 0.0575 sec/batch\n", + "Epoch 14/20 Iteration 2406/3560 Training loss: 1.7608 0.0522 sec/batch\n", + "Epoch 14/20 Iteration 2407/3560 Training loss: 1.7603 0.0503 sec/batch\n", + "Epoch 14/20 Iteration 2408/3560 Training loss: 1.7600 0.0521 sec/batch\n", + "Epoch 14/20 Iteration 2409/3560 Training loss: 1.7598 0.0538 sec/batch\n", + "Epoch 14/20 Iteration 2410/3560 Training loss: 1.7598 0.0524 sec/batch\n", + "Epoch 14/20 Iteration 2411/3560 Training loss: 1.7596 0.0502 sec/batch\n", + "Epoch 14/20 Iteration 2412/3560 Training loss: 1.7592 0.0512 sec/batch\n", + "Epoch 14/20 Iteration 2413/3560 Training loss: 1.7589 0.0558 sec/batch\n", + "Epoch 14/20 Iteration 2414/3560 Training loss: 1.7584 0.0551 sec/batch\n", + "Epoch 14/20 Iteration 2415/3560 Training loss: 1.7584 0.0509 sec/batch\n", + "Epoch 14/20 Iteration 2416/3560 Training loss: 1.7583 0.0507 sec/batch\n", + "Epoch 14/20 Iteration 2417/3560 Training loss: 1.7581 0.0526 sec/batch\n", + "Epoch 14/20 Iteration 2418/3560 Training loss: 1.7580 0.0528 sec/batch\n", + "Epoch 14/20 
Iteration 2419/3560 Training loss: 1.7578 0.0555 sec/batch\n", + "Epoch 14/20 Iteration 2420/3560 Training loss: 1.7578 0.0522 sec/batch\n", + "Epoch 14/20 Iteration 2421/3560 Training loss: 1.7577 0.0536 sec/batch\n", + "Epoch 14/20 Iteration 2422/3560 Training loss: 1.7576 0.0502 sec/batch\n", + "Epoch 14/20 Iteration 2423/3560 Training loss: 1.7577 0.0502 sec/batch\n", + "Epoch 14/20 Iteration 2424/3560 Training loss: 1.7577 0.0516 sec/batch\n", + "Epoch 14/20 Iteration 2425/3560 Training loss: 1.7577 0.0524 sec/batch\n", + "Epoch 14/20 Iteration 2426/3560 Training loss: 1.7576 0.0514 sec/batch\n", + "Epoch 14/20 Iteration 2427/3560 Training loss: 1.7575 0.0522 sec/batch\n", + "Epoch 14/20 Iteration 2428/3560 Training loss: 1.7574 0.0512 sec/batch\n", + "Epoch 14/20 Iteration 2429/3560 Training loss: 1.7572 0.0608 sec/batch\n", + "Epoch 14/20 Iteration 2430/3560 Training loss: 1.7567 0.0552 sec/batch\n", + "Epoch 14/20 Iteration 2431/3560 Training loss: 1.7567 0.0527 sec/batch\n", + "Epoch 14/20 Iteration 2432/3560 Training loss: 1.7566 0.0562 sec/batch\n", + "Epoch 14/20 Iteration 2433/3560 Training loss: 1.7565 0.0555 sec/batch\n", + "Epoch 14/20 Iteration 2434/3560 Training loss: 1.7565 0.0532 sec/batch\n", + "Epoch 14/20 Iteration 2435/3560 Training loss: 1.7565 0.0511 sec/batch\n", + "Epoch 14/20 Iteration 2436/3560 Training loss: 1.7561 0.0521 sec/batch\n", + "Epoch 14/20 Iteration 2437/3560 Training loss: 1.7557 0.0628 sec/batch\n", + "Epoch 14/20 Iteration 2438/3560 Training loss: 1.7558 0.0560 sec/batch\n", + "Epoch 14/20 Iteration 2439/3560 Training loss: 1.7558 0.0509 sec/batch\n", + "Epoch 14/20 Iteration 2440/3560 Training loss: 1.7554 0.0518 sec/batch\n", + "Epoch 14/20 Iteration 2441/3560 Training loss: 1.7554 0.0596 sec/batch\n", + "Epoch 14/20 Iteration 2442/3560 Training loss: 1.7555 0.0524 sec/batch\n", + "Epoch 14/20 Iteration 2443/3560 Training loss: 1.7554 0.0505 sec/batch\n", + "Epoch 14/20 Iteration 2444/3560 Training loss: 1.7552 0.0518 
sec/batch\n", + "Epoch 14/20 Iteration 2445/3560 Training loss: 1.7549 0.0570 sec/batch\n", + "Epoch 14/20 Iteration 2446/3560 Training loss: 1.7546 0.0522 sec/batch\n", + "Epoch 14/20 Iteration 2447/3560 Training loss: 1.7545 0.0509 sec/batch\n", + "Epoch 14/20 Iteration 2448/3560 Training loss: 1.7546 0.0554 sec/batch\n", + "Epoch 14/20 Iteration 2449/3560 Training loss: 1.7545 0.0563 sec/batch\n", + "Epoch 14/20 Iteration 2450/3560 Training loss: 1.7546 0.0517 sec/batch\n", + "Epoch 14/20 Iteration 2451/3560 Training loss: 1.7547 0.0510 sec/batch\n", + "Epoch 14/20 Iteration 2452/3560 Training loss: 1.7547 0.0514 sec/batch\n", + "Epoch 14/20 Iteration 2453/3560 Training loss: 1.7548 0.0525 sec/batch\n", + "Epoch 14/20 Iteration 2454/3560 Training loss: 1.7547 0.0508 sec/batch\n", + "Epoch 14/20 Iteration 2455/3560 Training loss: 1.7550 0.0517 sec/batch\n", + "Epoch 14/20 Iteration 2456/3560 Training loss: 1.7549 0.0510 sec/batch\n", + "Epoch 14/20 Iteration 2457/3560 Training loss: 1.7549 0.0573 sec/batch\n", + "Epoch 14/20 Iteration 2458/3560 Training loss: 1.7550 0.0507 sec/batch\n", + "Epoch 14/20 Iteration 2459/3560 Training loss: 1.7549 0.0533 sec/batch\n", + "Epoch 14/20 Iteration 2460/3560 Training loss: 1.7550 0.0556 sec/batch\n", + "Epoch 14/20 Iteration 2461/3560 Training loss: 1.7551 0.0522 sec/batch\n", + "Epoch 14/20 Iteration 2462/3560 Training loss: 1.7553 0.0515 sec/batch\n", + "Epoch 14/20 Iteration 2463/3560 Training loss: 1.7554 0.0565 sec/batch\n", + "Epoch 14/20 Iteration 2464/3560 Training loss: 1.7552 0.0520 sec/batch\n", + "Epoch 14/20 Iteration 2465/3560 Training loss: 1.7550 0.0515 sec/batch\n", + "Epoch 14/20 Iteration 2466/3560 Training loss: 1.7550 0.0570 sec/batch\n", + "Epoch 14/20 Iteration 2467/3560 Training loss: 1.7550 0.0507 sec/batch\n", + "Epoch 14/20 Iteration 2468/3560 Training loss: 1.7551 0.0507 sec/batch\n", + "Epoch 14/20 Iteration 2469/3560 Training loss: 1.7551 0.0517 sec/batch\n", + "Epoch 14/20 Iteration 2470/3560 
Training loss: 1.7550 0.0513 sec/batch\n", + "Epoch 14/20 Iteration 2471/3560 Training loss: 1.7550 0.0520 sec/batch\n", + "Epoch 14/20 Iteration 2472/3560 Training loss: 1.7550 0.0515 sec/batch\n", + "Epoch 14/20 Iteration 2473/3560 Training loss: 1.7548 0.0529 sec/batch\n", + "Epoch 14/20 Iteration 2474/3560 Training loss: 1.7550 0.0546 sec/batch\n", + "Epoch 14/20 Iteration 2475/3560 Training loss: 1.7551 0.0508 sec/batch\n", + "Epoch 14/20 Iteration 2476/3560 Training loss: 1.7551 0.0509 sec/batch\n", + "Epoch 14/20 Iteration 2477/3560 Training loss: 1.7552 0.0587 sec/batch\n", + "Epoch 14/20 Iteration 2478/3560 Training loss: 1.7552 0.0512 sec/batch\n", + "Epoch 14/20 Iteration 2479/3560 Training loss: 1.7552 0.0512 sec/batch\n", + "Epoch 14/20 Iteration 2480/3560 Training loss: 1.7551 0.0574 sec/batch\n", + "Epoch 14/20 Iteration 2481/3560 Training loss: 1.7551 0.0510 sec/batch\n", + "Epoch 14/20 Iteration 2482/3560 Training loss: 1.7555 0.0508 sec/batch\n", + "Epoch 14/20 Iteration 2483/3560 Training loss: 1.7554 0.0564 sec/batch\n", + "Epoch 14/20 Iteration 2484/3560 Training loss: 1.7554 0.0507 sec/batch\n", + "Epoch 14/20 Iteration 2485/3560 Training loss: 1.7553 0.0545 sec/batch\n", + "Epoch 14/20 Iteration 2486/3560 Training loss: 1.7552 0.0522 sec/batch\n", + "Epoch 14/20 Iteration 2487/3560 Training loss: 1.7552 0.0507 sec/batch\n", + "Epoch 14/20 Iteration 2488/3560 Training loss: 1.7552 0.0513 sec/batch\n", + "Epoch 14/20 Iteration 2489/3560 Training loss: 1.7552 0.0529 sec/batch\n", + "Epoch 14/20 Iteration 2490/3560 Training loss: 1.7551 0.0527 sec/batch\n", + "Epoch 14/20 Iteration 2491/3560 Training loss: 1.7550 0.0505 sec/batch\n", + "Epoch 14/20 Iteration 2492/3560 Training loss: 1.7550 0.0522 sec/batch\n", + "Epoch 15/20 Iteration 2493/3560 Training loss: 1.8231 0.0533 sec/batch\n", + "Epoch 15/20 Iteration 2494/3560 Training loss: 1.7839 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2495/3560 Training loss: 1.7719 0.0512 sec/batch\n", + 
"Epoch 15/20 Iteration 2496/3560 Training loss: 1.7675 0.0518 sec/batch\n", + "Epoch 15/20 Iteration 2497/3560 Training loss: 1.7609 0.0625 sec/batch\n", + "Epoch 15/20 Iteration 2498/3560 Training loss: 1.7502 0.0528 sec/batch\n", + "Epoch 15/20 Iteration 2499/3560 Training loss: 1.7509 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2500/3560 Training loss: 1.7478 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2501/3560 Training loss: 1.7501 0.0550 sec/batch\n", + "Epoch 15/20 Iteration 2502/3560 Training loss: 1.7498 0.0513 sec/batch\n", + "Epoch 15/20 Iteration 2503/3560 Training loss: 1.7466 0.0561 sec/batch\n", + "Epoch 15/20 Iteration 2504/3560 Training loss: 1.7452 0.0556 sec/batch\n", + "Epoch 15/20 Iteration 2505/3560 Training loss: 1.7449 0.0541 sec/batch\n", + "Epoch 15/20 Iteration 2506/3560 Training loss: 1.7466 0.0517 sec/batch\n", + "Epoch 15/20 Iteration 2507/3560 Training loss: 1.7457 0.0505 sec/batch\n", + "Epoch 15/20 Iteration 2508/3560 Training loss: 1.7441 0.0505 sec/batch\n", + "Epoch 15/20 Iteration 2509/3560 Training loss: 1.7443 0.0534 sec/batch\n", + "Epoch 15/20 Iteration 2510/3560 Training loss: 1.7466 0.0557 sec/batch\n", + "Epoch 15/20 Iteration 2511/3560 Training loss: 1.7467 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2512/3560 Training loss: 1.7478 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2513/3560 Training loss: 1.7478 0.0548 sec/batch\n", + "Epoch 15/20 Iteration 2514/3560 Training loss: 1.7484 0.0517 sec/batch\n", + "Epoch 15/20 Iteration 2515/3560 Training loss: 1.7475 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2516/3560 Training loss: 1.7470 0.0566 sec/batch\n", + "Epoch 15/20 Iteration 2517/3560 Training loss: 1.7467 0.0580 sec/batch\n", + "Epoch 15/20 Iteration 2518/3560 Training loss: 1.7456 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2519/3560 Training loss: 1.7445 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2520/3560 Training loss: 1.7448 0.0559 sec/batch\n", + "Epoch 15/20 Iteration 2521/3560 Training loss: 
1.7457 0.0564 sec/batch\n", + "Epoch 15/20 Iteration 2522/3560 Training loss: 1.7458 0.0579 sec/batch\n", + "Epoch 15/20 Iteration 2523/3560 Training loss: 1.7456 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2524/3560 Training loss: 1.7452 0.0506 sec/batch\n", + "Epoch 15/20 Iteration 2525/3560 Training loss: 1.7456 0.0579 sec/batch\n", + "Epoch 15/20 Iteration 2526/3560 Training loss: 1.7462 0.0504 sec/batch\n", + "Epoch 15/20 Iteration 2527/3560 Training loss: 1.7462 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2528/3560 Training loss: 1.7459 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2529/3560 Training loss: 1.7453 0.0586 sec/batch\n", + "Epoch 15/20 Iteration 2530/3560 Training loss: 1.7445 0.0554 sec/batch\n", + "Epoch 15/20 Iteration 2531/3560 Training loss: 1.7435 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2532/3560 Training loss: 1.7429 0.0557 sec/batch\n", + "Epoch 15/20 Iteration 2533/3560 Training loss: 1.7424 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2534/3560 Training loss: 1.7430 0.0613 sec/batch\n", + "Epoch 15/20 Iteration 2535/3560 Training loss: 1.7427 0.0516 sec/batch\n", + "Epoch 15/20 Iteration 2536/3560 Training loss: 1.7419 0.0508 sec/batch\n", + "Epoch 15/20 Iteration 2537/3560 Training loss: 1.7420 0.0579 sec/batch\n", + "Epoch 15/20 Iteration 2538/3560 Training loss: 1.7410 0.0533 sec/batch\n", + "Epoch 15/20 Iteration 2539/3560 Training loss: 1.7407 0.0502 sec/batch\n", + "Epoch 15/20 Iteration 2540/3560 Training loss: 1.7404 0.0529 sec/batch\n", + "Epoch 15/20 Iteration 2541/3560 Training loss: 1.7401 0.0555 sec/batch\n", + "Epoch 15/20 Iteration 2542/3560 Training loss: 1.7406 0.0528 sec/batch\n", + "Epoch 15/20 Iteration 2543/3560 Training loss: 1.7400 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2544/3560 Training loss: 1.7407 0.0508 sec/batch\n", + "Epoch 15/20 Iteration 2545/3560 Training loss: 1.7405 0.0539 sec/batch\n", + "Epoch 15/20 Iteration 2546/3560 Training loss: 1.7407 0.0510 sec/batch\n", + "Epoch 15/20 
Iteration 2547/3560 Training loss: 1.7404 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2548/3560 Training loss: 1.7408 0.0562 sec/batch\n", + "Epoch 15/20 Iteration 2549/3560 Training loss: 1.7413 0.0544 sec/batch\n", + "Epoch 15/20 Iteration 2550/3560 Training loss: 1.7408 0.0512 sec/batch\n", + "Epoch 15/20 Iteration 2551/3560 Training loss: 1.7404 0.0505 sec/batch\n", + "Epoch 15/20 Iteration 2552/3560 Training loss: 1.7410 0.0549 sec/batch\n", + "Epoch 15/20 Iteration 2553/3560 Training loss: 1.7410 0.0535 sec/batch\n", + "Epoch 15/20 Iteration 2554/3560 Training loss: 1.7418 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2555/3560 Training loss: 1.7422 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2556/3560 Training loss: 1.7425 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2557/3560 Training loss: 1.7424 0.0557 sec/batch\n", + "Epoch 15/20 Iteration 2558/3560 Training loss: 1.7425 0.0504 sec/batch\n", + "Epoch 15/20 Iteration 2559/3560 Training loss: 1.7429 0.0555 sec/batch\n", + "Epoch 15/20 Iteration 2560/3560 Training loss: 1.7425 0.0506 sec/batch\n", + "Epoch 15/20 Iteration 2561/3560 Training loss: 1.7425 0.0554 sec/batch\n", + "Epoch 15/20 Iteration 2562/3560 Training loss: 1.7425 0.0575 sec/batch\n", + "Epoch 15/20 Iteration 2563/3560 Training loss: 1.7432 0.0568 sec/batch\n", + "Epoch 15/20 Iteration 2564/3560 Training loss: 1.7432 0.0618 sec/batch\n", + "Epoch 15/20 Iteration 2565/3560 Training loss: 1.7435 0.0541 sec/batch\n", + "Epoch 15/20 Iteration 2566/3560 Training loss: 1.7433 0.0508 sec/batch\n", + "Epoch 15/20 Iteration 2567/3560 Training loss: 1.7432 0.0558 sec/batch\n", + "Epoch 15/20 Iteration 2568/3560 Training loss: 1.7435 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2569/3560 Training loss: 1.7434 0.0567 sec/batch\n", + "Epoch 15/20 Iteration 2570/3560 Training loss: 1.7434 0.0536 sec/batch\n", + "Epoch 15/20 Iteration 2571/3560 Training loss: 1.7431 0.0601 sec/batch\n", + "Epoch 15/20 Iteration 2572/3560 Training loss: 1.7430 0.0503 
sec/batch\n", + "Epoch 15/20 Iteration 2573/3560 Training loss: 1.7425 0.0520 sec/batch\n", + "Epoch 15/20 Iteration 2574/3560 Training loss: 1.7427 0.0560 sec/batch\n", + "Epoch 15/20 Iteration 2575/3560 Training loss: 1.7423 0.0502 sec/batch\n", + "Epoch 15/20 Iteration 2576/3560 Training loss: 1.7422 0.0521 sec/batch\n", + "Epoch 15/20 Iteration 2577/3560 Training loss: 1.7417 0.0631 sec/batch\n", + "Epoch 15/20 Iteration 2578/3560 Training loss: 1.7414 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2579/3560 Training loss: 1.7412 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2580/3560 Training loss: 1.7410 0.0565 sec/batch\n", + "Epoch 15/20 Iteration 2581/3560 Training loss: 1.7405 0.0537 sec/batch\n", + "Epoch 15/20 Iteration 2582/3560 Training loss: 1.7406 0.0508 sec/batch\n", + "Epoch 15/20 Iteration 2583/3560 Training loss: 1.7403 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2584/3560 Training loss: 1.7402 0.0509 sec/batch\n", + "Epoch 15/20 Iteration 2585/3560 Training loss: 1.7398 0.0530 sec/batch\n", + "Epoch 15/20 Iteration 2586/3560 Training loss: 1.7394 0.0563 sec/batch\n", + "Epoch 15/20 Iteration 2587/3560 Training loss: 1.7390 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2588/3560 Training loss: 1.7391 0.0544 sec/batch\n", + "Epoch 15/20 Iteration 2589/3560 Training loss: 1.7389 0.0524 sec/batch\n", + "Epoch 15/20 Iteration 2590/3560 Training loss: 1.7385 0.0509 sec/batch\n", + "Epoch 15/20 Iteration 2591/3560 Training loss: 1.7382 0.0545 sec/batch\n", + "Epoch 15/20 Iteration 2592/3560 Training loss: 1.7378 0.0506 sec/batch\n", + "Epoch 15/20 Iteration 2593/3560 Training loss: 1.7378 0.0577 sec/batch\n", + "Epoch 15/20 Iteration 2594/3560 Training loss: 1.7377 0.0560 sec/batch\n", + "Epoch 15/20 Iteration 2595/3560 Training loss: 1.7375 0.0504 sec/batch\n", + "Epoch 15/20 Iteration 2596/3560 Training loss: 1.7373 0.0527 sec/batch\n", + "Epoch 15/20 Iteration 2597/3560 Training loss: 1.7371 0.0569 sec/batch\n", + "Epoch 15/20 Iteration 2598/3560 
Training loss: 1.7370 0.0584 sec/batch\n", + "Epoch 15/20 Iteration 2599/3560 Training loss: 1.7369 0.0505 sec/batch\n", + "Epoch 15/20 Iteration 2600/3560 Training loss: 1.7369 0.0546 sec/batch\n", + "Epoch 15/20 Iteration 2601/3560 Training loss: 1.7370 0.0525 sec/batch\n", + "Epoch 15/20 Iteration 2602/3560 Training loss: 1.7371 0.0509 sec/batch\n", + "Epoch 15/20 Iteration 2603/3560 Training loss: 1.7370 0.0570 sec/batch\n", + "Epoch 15/20 Iteration 2604/3560 Training loss: 1.7369 0.0509 sec/batch\n", + "Epoch 15/20 Iteration 2605/3560 Training loss: 1.7367 0.0555 sec/batch\n", + "Epoch 15/20 Iteration 2606/3560 Training loss: 1.7365 0.0572 sec/batch\n", + "Epoch 15/20 Iteration 2607/3560 Training loss: 1.7364 0.0512 sec/batch\n", + "Epoch 15/20 Iteration 2608/3560 Training loss: 1.7361 0.0528 sec/batch\n", + "Epoch 15/20 Iteration 2609/3560 Training loss: 1.7360 0.0671 sec/batch\n", + "Epoch 15/20 Iteration 2610/3560 Training loss: 1.7360 0.0560 sec/batch\n", + "Epoch 15/20 Iteration 2611/3560 Training loss: 1.7358 0.0550 sec/batch\n", + "Epoch 15/20 Iteration 2612/3560 Training loss: 1.7358 0.0500 sec/batch\n", + "Epoch 15/20 Iteration 2613/3560 Training loss: 1.7357 0.0528 sec/batch\n", + "Epoch 15/20 Iteration 2614/3560 Training loss: 1.7354 0.0567 sec/batch\n", + "Epoch 15/20 Iteration 2615/3560 Training loss: 1.7350 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2616/3560 Training loss: 1.7351 0.0531 sec/batch\n", + "Epoch 15/20 Iteration 2617/3560 Training loss: 1.7350 0.0538 sec/batch\n", + "Epoch 15/20 Iteration 2618/3560 Training loss: 1.7346 0.0516 sec/batch\n", + "Epoch 15/20 Iteration 2619/3560 Training loss: 1.7347 0.0550 sec/batch\n", + "Epoch 15/20 Iteration 2620/3560 Training loss: 1.7349 0.0528 sec/batch\n", + "Epoch 15/20 Iteration 2621/3560 Training loss: 1.7347 0.0539 sec/batch\n", + "Epoch 15/20 Iteration 2622/3560 Training loss: 1.7346 0.0579 sec/batch\n", + "Epoch 15/20 Iteration 2623/3560 Training loss: 1.7343 0.0505 sec/batch\n", + 
"Epoch 15/20 Iteration 2624/3560 Training loss: 1.7339 0.0518 sec/batch\n", + "Epoch 15/20 Iteration 2625/3560 Training loss: 1.7339 0.0538 sec/batch\n", + "Epoch 15/20 Iteration 2626/3560 Training loss: 1.7339 0.0545 sec/batch\n", + "Epoch 15/20 Iteration 2627/3560 Training loss: 1.7339 0.0512 sec/batch\n", + "Epoch 15/20 Iteration 2628/3560 Training loss: 1.7340 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2629/3560 Training loss: 1.7341 0.0562 sec/batch\n", + "Epoch 15/20 Iteration 2630/3560 Training loss: 1.7342 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2631/3560 Training loss: 1.7343 0.0518 sec/batch\n", + "Epoch 15/20 Iteration 2632/3560 Training loss: 1.7342 0.0512 sec/batch\n", + "Epoch 15/20 Iteration 2633/3560 Training loss: 1.7344 0.0580 sec/batch\n", + "Epoch 15/20 Iteration 2634/3560 Training loss: 1.7344 0.0516 sec/batch\n", + "Epoch 15/20 Iteration 2635/3560 Training loss: 1.7344 0.0542 sec/batch\n", + "Epoch 15/20 Iteration 2636/3560 Training loss: 1.7344 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2637/3560 Training loss: 1.7343 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2638/3560 Training loss: 1.7344 0.0556 sec/batch\n", + "Epoch 15/20 Iteration 2639/3560 Training loss: 1.7345 0.0536 sec/batch\n", + "Epoch 15/20 Iteration 2640/3560 Training loss: 1.7347 0.0567 sec/batch\n", + "Epoch 15/20 Iteration 2641/3560 Training loss: 1.7347 0.0503 sec/batch\n", + "Epoch 15/20 Iteration 2642/3560 Training loss: 1.7345 0.0533 sec/batch\n", + "Epoch 15/20 Iteration 2643/3560 Training loss: 1.7343 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2644/3560 Training loss: 1.7343 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2645/3560 Training loss: 1.7344 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2646/3560 Training loss: 1.7344 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2647/3560 Training loss: 1.7344 0.0506 sec/batch\n", + "Epoch 15/20 Iteration 2648/3560 Training loss: 1.7343 0.0503 sec/batch\n", + "Epoch 15/20 Iteration 2649/3560 Training loss: 
1.7344 0.0560 sec/batch\n", + "Epoch 15/20 Iteration 2650/3560 Training loss: 1.7343 0.0527 sec/batch\n", + "Epoch 15/20 Iteration 2651/3560 Training loss: 1.7341 0.0508 sec/batch\n", + "Epoch 15/20 Iteration 2652/3560 Training loss: 1.7343 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2653/3560 Training loss: 1.7344 0.0529 sec/batch\n", + "Epoch 15/20 Iteration 2654/3560 Training loss: 1.7344 0.0518 sec/batch\n", + "Epoch 15/20 Iteration 2655/3560 Training loss: 1.7345 0.0527 sec/batch\n", + "Epoch 15/20 Iteration 2656/3560 Training loss: 1.7345 0.0518 sec/batch\n", + "Epoch 15/20 Iteration 2657/3560 Training loss: 1.7345 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2658/3560 Training loss: 1.7345 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2659/3560 Training loss: 1.7345 0.0522 sec/batch\n", + "Epoch 15/20 Iteration 2660/3560 Training loss: 1.7350 0.0504 sec/batch\n", + "Epoch 15/20 Iteration 2661/3560 Training loss: 1.7349 0.0560 sec/batch\n", + "Epoch 15/20 Iteration 2662/3560 Training loss: 1.7349 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2663/3560 Training loss: 1.7348 0.0501 sec/batch\n", + "Epoch 15/20 Iteration 2664/3560 Training loss: 1.7347 0.0566 sec/batch\n", + "Epoch 15/20 Iteration 2665/3560 Training loss: 1.7347 0.0561 sec/batch\n", + "Epoch 15/20 Iteration 2666/3560 Training loss: 1.7348 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2667/3560 Training loss: 1.7348 0.0509 sec/batch\n", + "Epoch 15/20 Iteration 2668/3560 Training loss: 1.7347 0.0514 sec/batch\n", + "Epoch 15/20 Iteration 2669/3560 Training loss: 1.7345 0.0542 sec/batch\n", + "Epoch 15/20 Iteration 2670/3560 Training loss: 1.7345 0.0532 sec/batch\n", + "Epoch 16/20 Iteration 2671/3560 Training loss: 1.8036 0.0545 sec/batch\n", + "Epoch 16/20 Iteration 2672/3560 Training loss: 1.7691 0.0513 sec/batch\n", + "Epoch 16/20 Iteration 2673/3560 Training loss: 1.7589 0.0526 sec/batch\n", + "Epoch 16/20 Iteration 2674/3560 Training loss: 1.7531 0.0513 sec/batch\n", + "Epoch 16/20 
Iteration 2675/3560 Training loss: 1.7458 0.0545 sec/batch\n", + "Epoch 16/20 Iteration 2676/3560 Training loss: 1.7359 0.0548 sec/batch\n", + "Epoch 16/20 Iteration 2677/3560 Training loss: 1.7358 0.0578 sec/batch\n", + "Epoch 16/20 Iteration 2678/3560 Training loss: 1.7324 0.0558 sec/batch\n", + "Epoch 16/20 Iteration 2679/3560 Training loss: 1.7336 0.0512 sec/batch\n", + "Epoch 16/20 Iteration 2680/3560 Training loss: 1.7324 0.0517 sec/batch\n", + "Epoch 16/20 Iteration 2681/3560 Training loss: 1.7297 0.0538 sec/batch\n", + "Epoch 16/20 Iteration 2682/3560 Training loss: 1.7278 0.0512 sec/batch\n", + "Epoch 16/20 Iteration 2683/3560 Training loss: 1.7271 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2684/3560 Training loss: 1.7288 0.0570 sec/batch\n", + "Epoch 16/20 Iteration 2685/3560 Training loss: 1.7279 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2686/3560 Training loss: 1.7266 0.0514 sec/batch\n", + "Epoch 16/20 Iteration 2687/3560 Training loss: 1.7271 0.0507 sec/batch\n", + "Epoch 16/20 Iteration 2688/3560 Training loss: 1.7294 0.0516 sec/batch\n", + "Epoch 16/20 Iteration 2689/3560 Training loss: 1.7298 0.0560 sec/batch\n", + "Epoch 16/20 Iteration 2690/3560 Training loss: 1.7299 0.0547 sec/batch\n", + "Epoch 16/20 Iteration 2691/3560 Training loss: 1.7300 0.0503 sec/batch\n", + "Epoch 16/20 Iteration 2692/3560 Training loss: 1.7299 0.0508 sec/batch\n", + "Epoch 16/20 Iteration 2693/3560 Training loss: 1.7292 0.0556 sec/batch\n", + "Epoch 16/20 Iteration 2694/3560 Training loss: 1.7289 0.0505 sec/batch\n", + "Epoch 16/20 Iteration 2695/3560 Training loss: 1.7288 0.0539 sec/batch\n", + "Epoch 16/20 Iteration 2696/3560 Training loss: 1.7275 0.0519 sec/batch\n", + "Epoch 16/20 Iteration 2697/3560 Training loss: 1.7265 0.0519 sec/batch\n", + "Epoch 16/20 Iteration 2698/3560 Training loss: 1.7273 0.0519 sec/batch\n", + "Epoch 16/20 Iteration 2699/3560 Training loss: 1.7279 0.0504 sec/batch\n", + "Epoch 16/20 Iteration 2700/3560 Training loss: 1.7278 0.0566 
sec/batch\n", + "Epoch 16/20 Iteration 2701/3560 Training loss: 1.7279 0.0572 sec/batch\n", + "Epoch 16/20 Iteration 2702/3560 Training loss: 1.7271 0.0532 sec/batch\n", + "Epoch 16/20 Iteration 2703/3560 Training loss: 1.7275 0.0506 sec/batch\n", + "Epoch 16/20 Iteration 2704/3560 Training loss: 1.7283 0.0505 sec/batch\n", + "Epoch 16/20 Iteration 2705/3560 Training loss: 1.7283 0.0517 sec/batch\n", + "Epoch 16/20 Iteration 2706/3560 Training loss: 1.7282 0.0593 sec/batch\n", + "Epoch 16/20 Iteration 2707/3560 Training loss: 1.7276 0.0517 sec/batch\n", + "Epoch 16/20 Iteration 2708/3560 Training loss: 1.7265 0.0513 sec/batch\n", + "Epoch 16/20 Iteration 2709/3560 Training loss: 1.7254 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2710/3560 Training loss: 1.7248 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2711/3560 Training loss: 1.7246 0.0508 sec/batch\n", + "Epoch 16/20 Iteration 2712/3560 Training loss: 1.7251 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2713/3560 Training loss: 1.7246 0.0580 sec/batch\n", + "Epoch 16/20 Iteration 2714/3560 Training loss: 1.7238 0.0583 sec/batch\n", + "Epoch 16/20 Iteration 2715/3560 Training loss: 1.7241 0.0515 sec/batch\n", + "Epoch 16/20 Iteration 2716/3560 Training loss: 1.7230 0.0504 sec/batch\n", + "Epoch 16/20 Iteration 2717/3560 Training loss: 1.7227 0.0560 sec/batch\n", + "Epoch 16/20 Iteration 2718/3560 Training loss: 1.7221 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2719/3560 Training loss: 1.7219 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2720/3560 Training loss: 1.7226 0.0540 sec/batch\n", + "Epoch 16/20 Iteration 2721/3560 Training loss: 1.7223 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2722/3560 Training loss: 1.7233 0.0548 sec/batch\n", + "Epoch 16/20 Iteration 2723/3560 Training loss: 1.7230 0.0516 sec/batch\n", + "Epoch 16/20 Iteration 2724/3560 Training loss: 1.7233 0.0509 sec/batch\n", + "Epoch 16/20 Iteration 2725/3560 Training loss: 1.7230 0.0527 sec/batch\n", + "Epoch 16/20 Iteration 2726/3560 
Training loss: 1.7232 0.0516 sec/batch\n", + "Epoch 16/20 Iteration 2727/3560 Training loss: 1.7237 0.0523 sec/batch\n", + "Epoch 16/20 Iteration 2728/3560 Training loss: 1.7235 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2729/3560 Training loss: 1.7231 0.0562 sec/batch\n", + "Epoch 16/20 Iteration 2730/3560 Training loss: 1.7238 0.0544 sec/batch\n", + "Epoch 16/20 Iteration 2731/3560 Training loss: 1.7237 0.0504 sec/batch\n", + "Epoch 16/20 Iteration 2732/3560 Training loss: 1.7245 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2733/3560 Training loss: 1.7250 0.0560 sec/batch\n", + "Epoch 16/20 Iteration 2734/3560 Training loss: 1.7252 0.0526 sec/batch\n", + "Epoch 16/20 Iteration 2735/3560 Training loss: 1.7252 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2736/3560 Training loss: 1.7255 0.0505 sec/batch\n", + "Epoch 16/20 Iteration 2737/3560 Training loss: 1.7259 0.0575 sec/batch\n", + "Epoch 16/20 Iteration 2738/3560 Training loss: 1.7257 0.0519 sec/batch\n", + "Epoch 16/20 Iteration 2739/3560 Training loss: 1.7256 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2740/3560 Training loss: 1.7254 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2741/3560 Training loss: 1.7258 0.0568 sec/batch\n", + "Epoch 16/20 Iteration 2742/3560 Training loss: 1.7257 0.0504 sec/batch\n", + "Epoch 16/20 Iteration 2743/3560 Training loss: 1.7263 0.0507 sec/batch\n", + "Epoch 16/20 Iteration 2744/3560 Training loss: 1.7260 0.0513 sec/batch\n", + "Epoch 16/20 Iteration 2745/3560 Training loss: 1.7259 0.0527 sec/batch\n", + "Epoch 16/20 Iteration 2746/3560 Training loss: 1.7263 0.0562 sec/batch\n", + "Epoch 16/20 Iteration 2747/3560 Training loss: 1.7261 0.0567 sec/batch\n", + "Epoch 16/20 Iteration 2748/3560 Training loss: 1.7261 0.0506 sec/batch\n", + "Epoch 16/20 Iteration 2749/3560 Training loss: 1.7257 0.0594 sec/batch\n", + "Epoch 16/20 Iteration 2750/3560 Training loss: 1.7255 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2751/3560 Training loss: 1.7249 0.0512 sec/batch\n", + 
"Epoch 16/20 Iteration 2752/3560 Training loss: 1.7251 0.0555 sec/batch\n", + "Epoch 16/20 Iteration 2848/3560 Training loss: 1.7173 0.0508 sec/batch\n", + "Epoch 17/20 Iteration 2849/3560 Training loss: 1.7878 0.0563 sec/batch\n", + "Epoch 17/20 Iteration 3026/3560 Training loss: 1.7010 0.0532 sec/batch\n", + "Epoch 18/20 Iteration 3027/3560 Training loss: 1.7728 0.0507 sec/batch\n", + "Epoch 18/20 Iteration 3204/3560 Training loss: 1.6860 0.0574 sec/batch\n", + "Epoch 19/20 Iteration 3205/3560 Training loss: 1.7648 0.0531 sec/batch\n", + "Epoch 19/20 Iteration 3263/3560 Training loss: 1.6761 0.0522 sec/batch\n", + 
"Epoch 19/20 Iteration 3264/3560 Training loss: 1.6768 0.0507 sec/batch\n", + "Epoch 19/20 Iteration 3265/3560 Training loss: 1.6768 0.0535 sec/batch\n", + "Epoch 19/20 Iteration 3266/3560 Training loss: 1.6779 0.0527 sec/batch\n", + "Epoch 19/20 Iteration 3267/3560 Training loss: 1.6785 0.0516 sec/batch\n", + "Epoch 19/20 Iteration 3268/3560 Training loss: 1.6787 0.0531 sec/batch\n", + "Epoch 19/20 Iteration 3269/3560 Training loss: 1.6785 0.0521 sec/batch\n", + "Epoch 19/20 Iteration 3270/3560 Training loss: 1.6789 0.0525 sec/batch\n", + "Epoch 19/20 Iteration 3271/3560 Training loss: 1.6792 0.0515 sec/batch\n", + "Epoch 19/20 Iteration 3272/3560 Training loss: 1.6790 0.0513 sec/batch\n", + "Epoch 19/20 Iteration 3273/3560 Training loss: 1.6792 0.0570 sec/batch\n", + "Epoch 19/20 Iteration 3274/3560 Training loss: 1.6791 0.0535 sec/batch\n", + "Epoch 19/20 Iteration 3275/3560 Training loss: 1.6795 0.0526 sec/batch\n", + "Epoch 19/20 Iteration 3276/3560 Training loss: 1.6795 0.0511 sec/batch\n", + "Epoch 19/20 Iteration 3277/3560 Training loss: 1.6802 0.0580 sec/batch\n", + "Epoch 19/20 Iteration 3278/3560 Training loss: 1.6799 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3279/3560 Training loss: 1.6798 0.0525 sec/batch\n", + "Epoch 19/20 Iteration 3280/3560 Training loss: 1.6801 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3281/3560 Training loss: 1.6800 0.0602 sec/batch\n", + "Epoch 19/20 Iteration 3282/3560 Training loss: 1.6802 0.0591 sec/batch\n", + "Epoch 19/20 Iteration 3283/3560 Training loss: 1.6797 0.0508 sec/batch\n", + "Epoch 19/20 Iteration 3284/3560 Training loss: 1.6798 0.0525 sec/batch\n", + "Epoch 19/20 Iteration 3285/3560 Training loss: 1.6793 0.0535 sec/batch\n", + "Epoch 19/20 Iteration 3286/3560 Training loss: 1.6794 0.0594 sec/batch\n", + "Epoch 19/20 Iteration 3287/3560 Training loss: 1.6788 0.0511 sec/batch\n", + "Epoch 19/20 Iteration 3288/3560 Training loss: 1.6789 0.0515 sec/batch\n", + "Epoch 19/20 Iteration 3289/3560 Training loss: 
1.6785 0.0589 sec/batch\n", + "Epoch 19/20 Iteration 3290/3560 Training loss: 1.6782 0.0585 sec/batch\n", + "Epoch 19/20 Iteration 3291/3560 Training loss: 1.6779 0.0529 sec/batch\n", + "Epoch 19/20 Iteration 3292/3560 Training loss: 1.6776 0.0508 sec/batch\n", + "Epoch 19/20 Iteration 3293/3560 Training loss: 1.6772 0.0575 sec/batch\n", + "Epoch 19/20 Iteration 3294/3560 Training loss: 1.6773 0.0518 sec/batch\n", + "Epoch 19/20 Iteration 3295/3560 Training loss: 1.6770 0.0554 sec/batch\n", + "Epoch 19/20 Iteration 3296/3560 Training loss: 1.6769 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3297/3560 Training loss: 1.6765 0.0528 sec/batch\n", + "Epoch 19/20 Iteration 3298/3560 Training loss: 1.6762 0.0586 sec/batch\n", + "Epoch 19/20 Iteration 3299/3560 Training loss: 1.6759 0.0515 sec/batch\n", + "Epoch 19/20 Iteration 3300/3560 Training loss: 1.6760 0.0557 sec/batch\n", + "Epoch 19/20 Iteration 3301/3560 Training loss: 1.6760 0.0605 sec/batch\n", + "Epoch 19/20 Iteration 3302/3560 Training loss: 1.6756 0.0526 sec/batch\n", + "Epoch 19/20 Iteration 3303/3560 Training loss: 1.6753 0.0517 sec/batch\n", + "Epoch 19/20 Iteration 3304/3560 Training loss: 1.6749 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3305/3560 Training loss: 1.6749 0.0547 sec/batch\n", + "Epoch 19/20 Iteration 3306/3560 Training loss: 1.6748 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3307/3560 Training loss: 1.6747 0.0526 sec/batch\n", + "Epoch 19/20 Iteration 3308/3560 Training loss: 1.6746 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3309/3560 Training loss: 1.6744 0.0531 sec/batch\n", + "Epoch 19/20 Iteration 3310/3560 Training loss: 1.6742 0.0593 sec/batch\n", + "Epoch 19/20 Iteration 3311/3560 Training loss: 1.6742 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3312/3560 Training loss: 1.6742 0.0509 sec/batch\n", + "Epoch 19/20 Iteration 3313/3560 Training loss: 1.6742 0.0527 sec/batch\n", + "Epoch 19/20 Iteration 3314/3560 Training loss: 1.6742 0.0539 sec/batch\n", + "Epoch 19/20 
Iteration 3315/3560 Training loss: 1.6741 0.0506 sec/batch\n", + "Epoch 19/20 Iteration 3316/3560 Training loss: 1.6740 0.0518 sec/batch\n", + "Epoch 19/20 Iteration 3317/3560 Training loss: 1.6739 0.0539 sec/batch\n", + "Epoch 19/20 Iteration 3318/3560 Training loss: 1.6738 0.0535 sec/batch\n", + "Epoch 19/20 Iteration 3319/3560 Training loss: 1.6735 0.0558 sec/batch\n", + "Epoch 19/20 Iteration 3320/3560 Training loss: 1.6731 0.0544 sec/batch\n", + "Epoch 19/20 Iteration 3321/3560 Training loss: 1.6731 0.0535 sec/batch\n", + "Epoch 19/20 Iteration 3322/3560 Training loss: 1.6730 0.0527 sec/batch\n", + "Epoch 19/20 Iteration 3323/3560 Training loss: 1.6729 0.0516 sec/batch\n", + "Epoch 19/20 Iteration 3324/3560 Training loss: 1.6729 0.0568 sec/batch\n", + "Epoch 19/20 Iteration 3325/3560 Training loss: 1.6729 0.0517 sec/batch\n", + "Epoch 19/20 Iteration 3326/3560 Training loss: 1.6725 0.0578 sec/batch\n", + "Epoch 19/20 Iteration 3327/3560 Training loss: 1.6722 0.0552 sec/batch\n", + "Epoch 19/20 Iteration 3328/3560 Training loss: 1.6722 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3329/3560 Training loss: 1.6722 0.0522 sec/batch\n", + "Epoch 19/20 Iteration 3330/3560 Training loss: 1.6717 0.0566 sec/batch\n", + "Epoch 19/20 Iteration 3331/3560 Training loss: 1.6719 0.0514 sec/batch\n", + "Epoch 19/20 Iteration 3332/3560 Training loss: 1.6719 0.0516 sec/batch\n", + "Epoch 19/20 Iteration 3333/3560 Training loss: 1.6718 0.0548 sec/batch\n", + "Epoch 19/20 Iteration 3334/3560 Training loss: 1.6717 0.0531 sec/batch\n", + "Epoch 19/20 Iteration 3335/3560 Training loss: 1.6714 0.0505 sec/batch\n", + "Epoch 19/20 Iteration 3336/3560 Training loss: 1.6712 0.0517 sec/batch\n", + "Epoch 19/20 Iteration 3337/3560 Training loss: 1.6712 0.0518 sec/batch\n", + "Epoch 19/20 Iteration 3338/3560 Training loss: 1.6712 0.0581 sec/batch\n", + "Epoch 19/20 Iteration 3339/3560 Training loss: 1.6712 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3340/3560 Training loss: 1.6713 0.0516 
sec/batch\n", + "Epoch 19/20 Iteration 3341/3560 Training loss: 1.6714 0.0528 sec/batch\n", + "Epoch 19/20 Iteration 3342/3560 Training loss: 1.6714 0.0571 sec/batch\n", + "Epoch 19/20 Iteration 3343/3560 Training loss: 1.6715 0.0503 sec/batch\n", + "Epoch 19/20 Iteration 3344/3560 Training loss: 1.6713 0.0504 sec/batch\n", + "Epoch 19/20 Iteration 3345/3560 Training loss: 1.6717 0.0590 sec/batch\n", + "Epoch 19/20 Iteration 3346/3560 Training loss: 1.6716 0.0579 sec/batch\n", + "Epoch 19/20 Iteration 3347/3560 Training loss: 1.6715 0.0509 sec/batch\n", + "Epoch 19/20 Iteration 3348/3560 Training loss: 1.6716 0.0523 sec/batch\n", + "Epoch 19/20 Iteration 3349/3560 Training loss: 1.6716 0.0595 sec/batch\n", + "Epoch 19/20 Iteration 3350/3560 Training loss: 1.6717 0.0517 sec/batch\n", + "Epoch 19/20 Iteration 3351/3560 Training loss: 1.6717 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3352/3560 Training loss: 1.6719 0.0560 sec/batch\n", + "Epoch 19/20 Iteration 3353/3560 Training loss: 1.6720 0.0571 sec/batch\n", + "Epoch 19/20 Iteration 3354/3560 Training loss: 1.6718 0.0514 sec/batch\n", + "Epoch 19/20 Iteration 3355/3560 Training loss: 1.6715 0.0532 sec/batch\n", + "Epoch 19/20 Iteration 3356/3560 Training loss: 1.6716 0.0589 sec/batch\n", + "Epoch 19/20 Iteration 3357/3560 Training loss: 1.6717 0.0528 sec/batch\n", + "Epoch 19/20 Iteration 3358/3560 Training loss: 1.6717 0.0591 sec/batch\n", + "Epoch 19/20 Iteration 3359/3560 Training loss: 1.6718 0.0516 sec/batch\n", + "Epoch 19/20 Iteration 3360/3560 Training loss: 1.6717 0.0565 sec/batch\n", + "Epoch 19/20 Iteration 3361/3560 Training loss: 1.6719 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3362/3560 Training loss: 1.6720 0.0543 sec/batch\n", + "Epoch 19/20 Iteration 3363/3560 Training loss: 1.6719 0.0514 sec/batch\n", + "Epoch 19/20 Iteration 3364/3560 Training loss: 1.6720 0.0510 sec/batch\n", + "Epoch 19/20 Iteration 3365/3560 Training loss: 1.6723 0.0538 sec/batch\n", + "Epoch 19/20 Iteration 3366/3560 
Training loss: 1.6723 0.0537 sec/batch\n", + "Epoch 19/20 Iteration 3367/3560 Training loss: 1.6724 0.0526 sec/batch\n", + "Epoch 19/20 Iteration 3368/3560 Training loss: 1.6724 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3369/3560 Training loss: 1.6724 0.0527 sec/batch\n", + "Epoch 19/20 Iteration 3370/3560 Training loss: 1.6724 0.0551 sec/batch\n", + "Epoch 19/20 Iteration 3371/3560 Training loss: 1.6725 0.0516 sec/batch\n", + "Epoch 19/20 Iteration 3372/3560 Training loss: 1.6729 0.0534 sec/batch\n", + "Epoch 19/20 Iteration 3373/3560 Training loss: 1.6729 0.0631 sec/batch\n", + "Epoch 19/20 Iteration 3374/3560 Training loss: 1.6729 0.0515 sec/batch\n", + "Epoch 19/20 Iteration 3375/3560 Training loss: 1.6729 0.0513 sec/batch\n", + "Epoch 19/20 Iteration 3376/3560 Training loss: 1.6727 0.0617 sec/batch\n", + "Epoch 19/20 Iteration 3377/3560 Training loss: 1.6729 0.0638 sec/batch\n", + "Epoch 19/20 Iteration 3378/3560 Training loss: 1.6729 0.0518 sec/batch\n", + "Epoch 19/20 Iteration 3379/3560 Training loss: 1.6730 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3380/3560 Training loss: 1.6730 0.0516 sec/batch\n", + "Epoch 19/20 Iteration 3381/3560 Training loss: 1.6729 0.0616 sec/batch\n", + "Epoch 19/20 Iteration 3382/3560 Training loss: 1.6729 0.0542 sec/batch\n", + "Epoch 20/20 Iteration 3383/3560 Training loss: 1.7312 0.0501 sec/batch\n", + "Epoch 20/20 Iteration 3384/3560 Training loss: 1.6992 0.0585 sec/batch\n", + "Epoch 20/20 Iteration 3385/3560 Training loss: 1.6889 0.0554 sec/batch\n", + "Epoch 20/20 Iteration 3386/3560 Training loss: 1.6860 0.0569 sec/batch\n", + "Epoch 20/20 Iteration 3387/3560 Training loss: 1.6816 0.0508 sec/batch\n", + "Epoch 20/20 Iteration 3388/3560 Training loss: 1.6731 0.0519 sec/batch\n", + "Epoch 20/20 Iteration 3389/3560 Training loss: 1.6725 0.0588 sec/batch\n", + "Epoch 20/20 Iteration 3390/3560 Training loss: 1.6707 0.0564 sec/batch\n", + "Epoch 20/20 Iteration 3391/3560 Training loss: 1.6721 0.0527 sec/batch\n", + 
"Epoch 20/20 Iteration 3392/3560 Training loss: 1.6715 0.0609 sec/batch\n", + "Epoch 20/20 Iteration 3393/3560 Training loss: 1.6689 0.0506 sec/batch\n", + "Epoch 20/20 Iteration 3394/3560 Training loss: 1.6673 0.0517 sec/batch\n", + "Epoch 20/20 Iteration 3395/3560 Training loss: 1.6669 0.0505 sec/batch\n", + "Epoch 20/20 Iteration 3396/3560 Training loss: 1.6689 0.0512 sec/batch\n", + "Epoch 20/20 Iteration 3397/3560 Training loss: 1.6684 0.0519 sec/batch\n", + "Epoch 20/20 Iteration 3398/3560 Training loss: 1.6665 0.0643 sec/batch\n", + "Epoch 20/20 Iteration 3399/3560 Training loss: 1.6668 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3400/3560 Training loss: 1.6687 0.0521 sec/batch\n", + "Epoch 20/20 Iteration 3401/3560 Training loss: 1.6692 0.0523 sec/batch\n", + "Epoch 20/20 Iteration 3402/3560 Training loss: 1.6698 0.0508 sec/batch\n", + "Epoch 20/20 Iteration 3403/3560 Training loss: 1.6697 0.0562 sec/batch\n", + "Epoch 20/20 Iteration 3404/3560 Training loss: 1.6704 0.0511 sec/batch\n", + "Epoch 20/20 Iteration 3405/3560 Training loss: 1.6696 0.0522 sec/batch\n", + "Epoch 20/20 Iteration 3406/3560 Training loss: 1.6694 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3407/3560 Training loss: 1.6696 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3408/3560 Training loss: 1.6682 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3409/3560 Training loss: 1.6670 0.0563 sec/batch\n", + "Epoch 20/20 Iteration 3410/3560 Training loss: 1.6675 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3411/3560 Training loss: 1.6681 0.0506 sec/batch\n", + "Epoch 20/20 Iteration 3412/3560 Training loss: 1.6683 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3413/3560 Training loss: 1.6684 0.0588 sec/batch\n", + "Epoch 20/20 Iteration 3414/3560 Training loss: 1.6679 0.0529 sec/batch\n", + "Epoch 20/20 Iteration 3415/3560 Training loss: 1.6686 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3416/3560 Training loss: 1.6690 0.0511 sec/batch\n", + "Epoch 20/20 Iteration 3417/3560 Training loss: 
1.6689 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3418/3560 Training loss: 1.6685 0.0570 sec/batch\n", + "Epoch 20/20 Iteration 3419/3560 Training loss: 1.6680 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3420/3560 Training loss: 1.6672 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3421/3560 Training loss: 1.6657 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3422/3560 Training loss: 1.6651 0.0522 sec/batch\n", + "Epoch 20/20 Iteration 3423/3560 Training loss: 1.6649 0.0511 sec/batch\n", + "Epoch 20/20 Iteration 3424/3560 Training loss: 1.6654 0.0538 sec/batch\n", + "Epoch 20/20 Iteration 3425/3560 Training loss: 1.6650 0.0569 sec/batch\n", + "Epoch 20/20 Iteration 3426/3560 Training loss: 1.6644 0.0568 sec/batch\n", + "Epoch 20/20 Iteration 3427/3560 Training loss: 1.6644 0.0556 sec/batch\n", + "Epoch 20/20 Iteration 3428/3560 Training loss: 1.6635 0.0547 sec/batch\n", + "Epoch 20/20 Iteration 3429/3560 Training loss: 1.6632 0.0555 sec/batch\n", + "Epoch 20/20 Iteration 3430/3560 Training loss: 1.6628 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3431/3560 Training loss: 1.6626 0.0519 sec/batch\n", + "Epoch 20/20 Iteration 3432/3560 Training loss: 1.6632 0.0590 sec/batch\n", + "Epoch 20/20 Iteration 3433/3560 Training loss: 1.6629 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3434/3560 Training loss: 1.6637 0.0560 sec/batch\n", + "Epoch 20/20 Iteration 3435/3560 Training loss: 1.6635 0.0522 sec/batch\n", + "Epoch 20/20 Iteration 3436/3560 Training loss: 1.6636 0.0531 sec/batch\n", + "Epoch 20/20 Iteration 3437/3560 Training loss: 1.6634 0.0601 sec/batch\n", + "Epoch 20/20 Iteration 3438/3560 Training loss: 1.6636 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3439/3560 Training loss: 1.6642 0.0583 sec/batch\n", + "Epoch 20/20 Iteration 3440/3560 Training loss: 1.6639 0.0523 sec/batch\n", + "Epoch 20/20 Iteration 3441/3560 Training loss: 1.6634 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3442/3560 Training loss: 1.6640 0.0595 sec/batch\n", + "Epoch 20/20 
Iteration 3443/3560 Training loss: 1.6639 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3444/3560 Training loss: 1.6649 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3445/3560 Training loss: 1.6653 0.0602 sec/batch\n", + "Epoch 20/20 Iteration 3446/3560 Training loss: 1.6655 0.0517 sec/batch\n", + "Epoch 20/20 Iteration 3447/3560 Training loss: 1.6656 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3448/3560 Training loss: 1.6659 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3449/3560 Training loss: 1.6663 0.0562 sec/batch\n", + "Epoch 20/20 Iteration 3450/3560 Training loss: 1.6660 0.0517 sec/batch\n", + "Epoch 20/20 Iteration 3451/3560 Training loss: 1.6661 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3452/3560 Training loss: 1.6661 0.0534 sec/batch\n", + "Epoch 20/20 Iteration 3453/3560 Training loss: 1.6668 0.0531 sec/batch\n", + "Epoch 20/20 Iteration 3454/3560 Training loss: 1.6668 0.0521 sec/batch\n", + "Epoch 20/20 Iteration 3455/3560 Training loss: 1.6673 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3456/3560 Training loss: 1.6672 0.0506 sec/batch\n", + "Epoch 20/20 Iteration 3457/3560 Training loss: 1.6671 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3458/3560 Training loss: 1.6673 0.0530 sec/batch\n", + "Epoch 20/20 Iteration 3459/3560 Training loss: 1.6672 0.0560 sec/batch\n", + "Epoch 20/20 Iteration 3460/3560 Training loss: 1.6673 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3461/3560 Training loss: 1.6667 0.0591 sec/batch\n", + "Epoch 20/20 Iteration 3462/3560 Training loss: 1.6665 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3463/3560 Training loss: 1.6660 0.0511 sec/batch\n", + "Epoch 20/20 Iteration 3464/3560 Training loss: 1.6660 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3465/3560 Training loss: 1.6657 0.0568 sec/batch\n", + "Epoch 20/20 Iteration 3466/3560 Training loss: 1.6659 0.0579 sec/batch\n", + "Epoch 20/20 Iteration 3467/3560 Training loss: 1.6656 0.0546 sec/batch\n", + "Epoch 20/20 Iteration 3468/3560 Training loss: 1.6653 0.0519 
sec/batch\n", + "Epoch 20/20 Iteration 3469/3560 Training loss: 1.6651 0.0521 sec/batch\n", + "Epoch 20/20 Iteration 3470/3560 Training loss: 1.6648 0.0576 sec/batch\n", + "Epoch 20/20 Iteration 3471/3560 Training loss: 1.6643 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3472/3560 Training loss: 1.6643 0.0532 sec/batch\n", + "Epoch 20/20 Iteration 3473/3560 Training loss: 1.6640 0.0568 sec/batch\n", + "Epoch 20/20 Iteration 3474/3560 Training loss: 1.6640 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3475/3560 Training loss: 1.6636 0.0525 sec/batch\n", + "Epoch 20/20 Iteration 3476/3560 Training loss: 1.6634 0.0532 sec/batch\n", + "Epoch 20/20 Iteration 3477/3560 Training loss: 1.6631 0.0537 sec/batch\n", + "Epoch 20/20 Iteration 3478/3560 Training loss: 1.6631 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3479/3560 Training loss: 1.6631 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3480/3560 Training loss: 1.6627 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3481/3560 Training loss: 1.6624 0.0544 sec/batch\n", + "Epoch 20/20 Iteration 3482/3560 Training loss: 1.6620 0.0533 sec/batch\n", + "Epoch 20/20 Iteration 3483/3560 Training loss: 1.6620 0.0522 sec/batch\n", + "Epoch 20/20 Iteration 3484/3560 Training loss: 1.6619 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3485/3560 Training loss: 1.6617 0.0536 sec/batch\n", + "Epoch 20/20 Iteration 3486/3560 Training loss: 1.6616 0.0519 sec/batch\n", + "Epoch 20/20 Iteration 3487/3560 Training loss: 1.6614 0.0505 sec/batch\n", + "Epoch 20/20 Iteration 3488/3560 Training loss: 1.6612 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3489/3560 Training loss: 1.6611 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3490/3560 Training loss: 1.6612 0.0524 sec/batch\n", + "Epoch 20/20 Iteration 3491/3560 Training loss: 1.6613 0.0554 sec/batch\n", + "Epoch 20/20 Iteration 3492/3560 Training loss: 1.6613 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3493/3560 Training loss: 1.6612 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3494/3560 
Training loss: 1.6611 0.0556 sec/batch\n", + "Epoch 20/20 Iteration 3495/3560 Training loss: 1.6609 0.0517 sec/batch\n", + "Epoch 20/20 Iteration 3496/3560 Training loss: 1.6607 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3497/3560 Training loss: 1.6606 0.0544 sec/batch\n", + "Epoch 20/20 Iteration 3498/3560 Training loss: 1.6602 0.0574 sec/batch\n", + "Epoch 20/20 Iteration 3499/3560 Training loss: 1.6600 0.0505 sec/batch\n", + "Epoch 20/20 Iteration 3500/3560 Training loss: 1.6600 0.0522 sec/batch\n", + "Epoch 20/20 Iteration 3501/3560 Training loss: 1.6599 0.0533 sec/batch\n", + "Epoch 20/20 Iteration 3502/3560 Training loss: 1.6599 0.0529 sec/batch\n", + "Epoch 20/20 Iteration 3503/3560 Training loss: 1.6598 0.0543 sec/batch\n", + "Epoch 20/20 Iteration 3504/3560 Training loss: 1.6594 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3505/3560 Training loss: 1.6591 0.0543 sec/batch\n", + "Epoch 20/20 Iteration 3506/3560 Training loss: 1.6592 0.0581 sec/batch\n", + "Epoch 20/20 Iteration 3507/3560 Training loss: 1.6592 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3508/3560 Training loss: 1.6588 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3509/3560 Training loss: 1.6589 0.0548 sec/batch\n", + "Epoch 20/20 Iteration 3510/3560 Training loss: 1.6589 0.0605 sec/batch\n", + "Epoch 20/20 Iteration 3511/3560 Training loss: 1.6588 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3512/3560 Training loss: 1.6586 0.0517 sec/batch\n", + "Epoch 20/20 Iteration 3513/3560 Training loss: 1.6584 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3514/3560 Training loss: 1.6580 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3515/3560 Training loss: 1.6580 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3516/3560 Training loss: 1.6580 0.0512 sec/batch\n", + "Epoch 20/20 Iteration 3517/3560 Training loss: 1.6581 0.0533 sec/batch\n", + "Epoch 20/20 Iteration 3518/3560 Training loss: 1.6581 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3519/3560 Training loss: 1.6582 0.0503 sec/batch\n", + 
"Epoch 20/20 Iteration 3520/3560 Training loss: 1.6583 0.0545 sec/batch\n", + "Epoch 20/20 Iteration 3521/3560 Training loss: 1.6583 0.0545 sec/batch\n", + "Epoch 20/20 Iteration 3522/3560 Training loss: 1.6582 0.0547 sec/batch\n", + "Epoch 20/20 Iteration 3523/3560 Training loss: 1.6586 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3524/3560 Training loss: 1.6586 0.0619 sec/batch\n", + "Epoch 20/20 Iteration 3525/3560 Training loss: 1.6585 0.0550 sec/batch\n", + "Epoch 20/20 Iteration 3526/3560 Training loss: 1.6585 0.0574 sec/batch\n", + "Epoch 20/20 Iteration 3527/3560 Training loss: 1.6585 0.0503 sec/batch\n", + "Epoch 20/20 Iteration 3528/3560 Training loss: 1.6586 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3529/3560 Training loss: 1.6587 0.0544 sec/batch\n", + "Epoch 20/20 Iteration 3530/3560 Training loss: 1.6588 0.0561 sec/batch\n", + "Epoch 20/20 Iteration 3531/3560 Training loss: 1.6589 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3532/3560 Training loss: 1.6588 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3533/3560 Training loss: 1.6586 0.0573 sec/batch\n", + "Epoch 20/20 Iteration 3534/3560 Training loss: 1.6586 0.0525 sec/batch\n", + "Epoch 20/20 Iteration 3535/3560 Training loss: 1.6587 0.0530 sec/batch\n", + "Epoch 20/20 Iteration 3536/3560 Training loss: 1.6587 0.0616 sec/batch\n", + "Epoch 20/20 Iteration 3537/3560 Training loss: 1.6588 0.0520 sec/batch\n", + "Epoch 20/20 Iteration 3538/3560 Training loss: 1.6588 0.0531 sec/batch\n", + "Epoch 20/20 Iteration 3539/3560 Training loss: 1.6589 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3540/3560 Training loss: 1.6590 0.0525 sec/batch\n", + "Epoch 20/20 Iteration 3541/3560 Training loss: 1.6588 0.0548 sec/batch\n", + "Epoch 20/20 Iteration 3542/3560 Training loss: 1.6589 0.0543 sec/batch\n", + "Epoch 20/20 Iteration 3543/3560 Training loss: 1.6591 0.0499 sec/batch\n", + "Epoch 20/20 Iteration 3544/3560 Training loss: 1.6591 0.0525 sec/batch\n", + "Epoch 20/20 Iteration 3545/3560 Training loss: 
1.6592 0.0575 sec/batch\n", + "Epoch 20/20 Iteration 3546/3560 Training loss: 1.6592 0.0541 sec/batch\n", + "Epoch 20/20 Iteration 3547/3560 Training loss: 1.6592 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3548/3560 Training loss: 1.6592 0.0520 sec/batch\n", + "Epoch 20/20 Iteration 3549/3560 Training loss: 1.6594 0.0531 sec/batch\n", + "Epoch 20/20 Iteration 3550/3560 Training loss: 1.6598 0.0557 sec/batch\n", + "Epoch 20/20 Iteration 3551/3560 Training loss: 1.6598 0.0504 sec/batch\n", + "Epoch 20/20 Iteration 3552/3560 Training loss: 1.6598 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3553/3560 Training loss: 1.6597 0.0556 sec/batch\n", + "Epoch 20/20 Iteration 3554/3560 Training loss: 1.6595 0.0537 sec/batch\n", + "Epoch 20/20 Iteration 3555/3560 Training loss: 1.6596 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3556/3560 Training loss: 1.6596 0.0574 sec/batch\n", + "Epoch 20/20 Iteration 3557/3560 Training loss: 1.6598 0.0613 sec/batch\n", + "Epoch 20/20 Iteration 3558/3560 Training loss: 1.6597 0.0523 sec/batch\n", + "Epoch 20/20 Iteration 3559/3560 Training loss: 1.6596 0.0512 sec/batch\n", + "Epoch 20/20 Iteration 3560/3560 Training loss: 1.6597 0.0511 sec/batch\n", + "Epoch 1/20 Iteration 1/3560 Training loss: 4.4214 0.6314 sec/batch\n", + "Epoch 1/20 Iteration 2/3560 Training loss: 4.4153 0.0325 sec/batch\n", + "Epoch 1/20 Iteration 3/3560 Training loss: 4.4091 0.0330 sec/batch\n", + "Epoch 1/20 Iteration 4/3560 Training loss: 4.4028 0.0327 sec/batch\n", + "Epoch 1/20 Iteration 5/3560 Training loss: 4.3959 0.0293 sec/batch\n", + "Epoch 1/20 Iteration 6/3560 Training loss: 4.3878 0.0306 sec/batch\n", + "Epoch 1/20 Iteration 7/3560 Training loss: 4.3775 0.0326 sec/batch\n", + "Epoch 1/20 Iteration 8/3560 Training loss: 4.3631 0.0375 sec/batch\n", + "Epoch 1/20 Iteration 9/3560 Training loss: 4.3384 0.0373 sec/batch\n", + "Epoch 1/20 Iteration 10/3560 Training loss: 4.2987 0.0440 sec/batch\n", + "Epoch 1/20 Iteration 11/3560 Training loss: 4.2504 0.0413 
sec/batch\n", + "Epoch 1/20 Iteration 12/3560 Training loss: 4.2024 0.0461 sec/batch\n", + "Epoch 1/20 Iteration 13/3560 Training loss: 4.1567 0.0429 sec/batch\n", + "Epoch 1/20 Iteration 14/3560 Training loss: 4.1134 0.0515 sec/batch\n", + "Epoch 1/20 Iteration 15/3560 Training loss: 4.0724 0.0521 sec/batch\n", + "Epoch 1/20 Iteration 16/3560 Training loss: 4.0352 0.0441 sec/batch\n", + "Epoch 1/20 Iteration 17/3560 Training loss: 3.9999 0.0490 sec/batch\n", + "Epoch 1/20 Iteration 18/3560 Training loss: 3.9694 0.0510 sec/batch\n", + "Epoch 1/20 Iteration 19/3560 Training loss: 3.9394 0.0465 sec/batch\n", + "Epoch 1/20 Iteration 20/3560 Training loss: 3.9103 0.0449 sec/batch\n", + "Epoch 1/20 Iteration 21/3560 Training loss: 3.8841 0.0492 sec/batch\n", + "Epoch 1/20 Iteration 22/3560 Training loss: 3.8592 0.0456 sec/batch\n", + "Epoch 1/20 Iteration 23/3560 Training loss: 3.8363 0.0545 sec/batch\n", + "Epoch 1/20 Iteration 24/3560 Training loss: 3.8150 0.0465 sec/batch\n", + "Epoch 1/20 Iteration 25/3560 Training loss: 3.7947 0.0507 sec/batch\n", + "Epoch 1/20 Iteration 26/3560 Training loss: 3.7763 0.0448 sec/batch\n", + "Epoch 1/20 Iteration 27/3560 Training loss: 3.7592 0.0462 sec/batch\n", + "Epoch 1/20 Iteration 28/3560 Training loss: 3.7417 0.0583 sec/batch\n", + "Epoch 1/20 Iteration 29/3560 Training loss: 3.7255 0.0552 sec/batch\n", + "Epoch 1/20 Iteration 30/3560 Training loss: 3.7108 0.0450 sec/batch\n", + "Epoch 1/20 Iteration 31/3560 Training loss: 3.6974 0.0444 sec/batch\n", + "Epoch 1/20 Iteration 32/3560 Training loss: 3.6836 0.0474 sec/batch\n", + "Epoch 1/20 Iteration 33/3560 Training loss: 3.6703 0.0495 sec/batch\n", + "Epoch 1/20 Iteration 34/3560 Training loss: 3.6581 0.0518 sec/batch\n", + "Epoch 1/20 Iteration 35/3560 Training loss: 3.6460 0.0455 sec/batch\n", + "Epoch 1/20 Iteration 36/3560 Training loss: 3.6349 0.0446 sec/batch\n", + "Epoch 1/20 Iteration 37/3560 Training loss: 3.6234 0.0498 sec/batch\n", + "Epoch 1/20 Iteration 38/3560 
Training loss: 3.6129 0.0454 sec/batch\n", + "Epoch 1/20 Iteration 39/3560 Training loss: 3.6027 0.0470 sec/batch\n", + "Epoch 1/20 Iteration 40/3560 Training loss: 3.5930 0.0447 sec/batch\n", + "Epoch 1/20 Iteration 41/3560 Training loss: 3.5837 0.0499 sec/batch\n", + "Epoch 1/20 Iteration 42/3560 Training loss: 3.5747 0.0448 sec/batch\n", + "Epoch 1/20 Iteration 43/3560 Training loss: 3.5657 0.0484 sec/batch\n", + "Epoch 1/20 Iteration 44/3560 Training loss: 3.5574 0.0492 sec/batch\n", + "Epoch 1/20 Iteration 45/3560 Training loss: 3.5491 0.0463 sec/batch\n", + "Epoch 1/20 Iteration 46/3560 Training loss: 3.5415 0.0448 sec/batch\n", + "Epoch 1/20 Iteration 47/3560 Training loss: 3.5341 0.0451 sec/batch\n", + "Epoch 1/20 Iteration 48/3560 Training loss: 3.5273 0.0451 sec/batch\n", + "Epoch 1/20 Iteration 49/3560 Training loss: 3.5206 0.0494 sec/batch\n", + "Epoch 1/20 Iteration 50/3560 Training loss: 3.5141 0.0474 sec/batch\n", + "Epoch 1/20 Iteration 51/3560 Training loss: 3.5078 0.0444 sec/batch\n", + "Epoch 1/20 Iteration 52/3560 Training loss: 3.5014 0.0444 sec/batch\n", + "Epoch 1/20 Iteration 53/3560 Training loss: 3.4955 0.0454 sec/batch\n", + "Epoch 1/20 Iteration 54/3560 Training loss: 3.4893 0.0475 sec/batch\n", + "Epoch 1/20 Iteration 55/3560 Training loss: 3.4838 0.0469 sec/batch\n", + "Epoch 1/20 Iteration 56/3560 Training loss: 3.4781 0.0445 sec/batch\n", + "Epoch 1/20 Iteration 57/3560 Training loss: 3.4728 0.0438 sec/batch\n", + "Epoch 1/20 Iteration 58/3560 Training loss: 3.4677 0.0454 sec/batch\n", + "Epoch 1/20 Iteration 59/3560 Training loss: 3.4624 0.0457 sec/batch\n", + "Epoch 1/20 Iteration 60/3560 Training loss: 3.4575 0.0469 sec/batch\n", + "Epoch 1/20 Iteration 61/3560 Training loss: 3.4528 0.0447 sec/batch\n", + "Epoch 1/20 Iteration 62/3560 Training loss: 3.4484 0.0443 sec/batch\n", + "Epoch 1/20 Iteration 63/3560 Training loss: 3.4441 0.0444 sec/batch\n", + "Epoch 1/20 Iteration 64/3560 Training loss: 3.4394 0.0474 sec/batch\n", + 
"Epoch 1/20 Iteration 65/3560 Training loss: 3.4349 0.0492 sec/batch\n", + "Epoch 1/20 Iteration 66/3560 Training loss: 3.4310 0.0443 sec/batch\n", + "Epoch 1/20 Iteration 67/3560 Training loss: 3.4270 0.0507 sec/batch\n", + "Epoch 1/20 Iteration 68/3560 Training loss: 3.4225 0.0575 sec/batch\n", + "Epoch 1/20 Iteration 69/3560 Training loss: 3.4183 0.0472 sec/batch\n", + "Epoch 1/20 Iteration 70/3560 Training loss: 3.4145 0.0443 sec/batch\n", + "Epoch 1/20 Iteration 71/3560 Training loss: 3.4107 0.0458 sec/batch\n", + "Epoch 1/20 Iteration 72/3560 Training loss: 3.4074 0.0487 sec/batch\n", + "Epoch 1/20 Iteration 73/3560 Training loss: 3.4038 0.0504 sec/batch\n", + "Epoch 1/20 Iteration 74/3560 Training loss: 3.4003 0.0459 sec/batch\n", + "Epoch 1/20 Iteration 75/3560 Training loss: 3.3969 0.0478 sec/batch\n", + "Epoch 1/20 Iteration 76/3560 Training loss: 3.3937 0.0511 sec/batch\n", + "Epoch 1/20 Iteration 77/3560 Training loss: 3.3905 0.0447 sec/batch\n", + "Epoch 1/20 Iteration 78/3560 Training loss: 3.3872 0.0453 sec/batch\n", + "Epoch 1/20 Iteration 79/3560 Training loss: 3.3839 0.0512 sec/batch\n", + "Epoch 1/20 Iteration 80/3560 Training loss: 3.3806 0.0457 sec/batch\n", + "Epoch 1/20 Iteration 81/3560 Training loss: 3.3775 0.0449 sec/batch\n", + "Epoch 1/20 Iteration 82/3560 Training loss: 3.3747 0.0444 sec/batch\n", + "Epoch 1/20 Iteration 83/3560 Training loss: 3.3718 0.0463 sec/batch\n", + "Epoch 1/20 Iteration 84/3560 Training loss: 3.3690 0.0486 sec/batch\n", + "Epoch 1/20 Iteration 85/3560 Training loss: 3.3661 0.0486 sec/batch\n", + "Epoch 1/20 Iteration 86/3560 Training loss: 3.3631 0.0444 sec/batch\n", + "Epoch 1/20 Iteration 87/3560 Training loss: 3.3603 0.0471 sec/batch\n", + "Epoch 1/20 Iteration 88/3560 Training loss: 3.3575 0.0504 sec/batch\n", + "Epoch 1/20 Iteration 89/3560 Training loss: 3.3550 0.0512 sec/batch\n", + "Epoch 1/20 Iteration 90/3560 Training loss: 3.3524 0.0481 sec/batch\n", + "Epoch 1/20 Iteration 91/3560 Training loss: 
3.3499 0.0447 sec/batch\n", + "Epoch 1/20 Iteration 92/3560 Training loss: 3.3473 0.0466 sec/batch\n", + "Epoch 1/20 Iteration 93/3560 Training loss: 3.3448 0.0509 sec/batch\n", + "Epoch 1/20 Iteration 94/3560 Training loss: 3.3423 0.0479 sec/batch\n", + "Epoch 1/20 Iteration 95/3560 Training loss: 3.3399 0.0461 sec/batch\n", + "Epoch 1/20 Iteration 96/3560 Training loss: 3.3375 0.0453 sec/batch\n", + "Epoch 1/20 Iteration 97/3560 Training loss: 3.3351 0.0462 sec/batch\n", + "Epoch 1/20 Iteration 98/3560 Training loss: 3.3327 0.0482 sec/batch\n", + "Epoch 1/20 Iteration 99/3560 Training loss: 3.3304 0.0518 sec/batch\n", + "Epoch 1/20 Iteration 100/3560 Training loss: 3.3281 0.0458 sec/batch\n", + "Epoch 1/20 Iteration 101/3560 Training loss: 3.3258 0.0443 sec/batch\n", + "Epoch 1/20 Iteration 102/3560 Training loss: 3.3236 0.0440 sec/batch\n", + "Epoch 1/20 Iteration 103/3560 Training loss: 3.3214 0.0557 sec/batch\n", + "Epoch 1/20 Iteration 104/3560 Training loss: 3.3192 0.0526 sec/batch\n", + "Epoch 1/20 Iteration 105/3560 Training loss: 3.3170 0.0455 sec/batch\n", + "Epoch 1/20 Iteration 106/3560 Training loss: 3.3148 0.0455 sec/batch\n", + "Epoch 1/20 Iteration 107/3560 Training loss: 3.3125 0.0472 sec/batch\n", + "Epoch 1/20 Iteration 108/3560 Training loss: 3.3102 0.0497 sec/batch\n", + "Epoch 1/20 Iteration 109/3560 Training loss: 3.3082 0.0498 sec/batch\n", + "Epoch 1/20 Iteration 110/3560 Training loss: 3.3057 0.0479 sec/batch\n", + "Epoch 1/20 Iteration 111/3560 Training loss: 3.3037 0.0456 sec/batch\n", + "Epoch 1/20 Iteration 112/3560 Training loss: 3.3016 0.0472 sec/batch\n", + "Epoch 1/20 Iteration 113/3560 Training loss: 3.2995 0.0535 sec/batch\n", + "Epoch 1/20 Iteration 114/3560 Training loss: 3.2973 0.0467 sec/batch\n", + "Epoch 1/20 Iteration 115/3560 Training loss: 3.2951 0.0452 sec/batch\n", + "Epoch 1/20 Iteration 116/3560 Training loss: 3.2929 0.0461 sec/batch\n", + "Epoch 1/20 Iteration 117/3560 Training loss: 3.2909 0.0456 sec/batch\n", + 
"Epoch 1/20 Iteration 118/3560 Training loss: 3.2890 0.0459 sec/batch\n", + "Epoch 1/20 Iteration 119/3560 Training loss: 3.2870 0.0517 sec/batch\n", + "Epoch 1/20 Iteration 120/3560 Training loss: 3.2849 0.0457 sec/batch\n", + "Epoch 1/20 Iteration 121/3560 Training loss: 3.2832 0.0440 sec/batch\n", + "Epoch 1/20 Iteration 122/3560 Training loss: 3.2812 0.0457 sec/batch\n", + "Epoch 1/20 Iteration 123/3560 Training loss: 3.2793 0.0452 sec/batch\n", + "Epoch 1/20 Iteration 124/3560 Training loss: 3.2775 0.0464 sec/batch\n", + "Epoch 1/20 Iteration 125/3560 Training loss: 3.2755 0.0456 sec/batch\n", + "Epoch 1/20 Iteration 126/3560 Training loss: 3.2733 0.0448 sec/batch\n", + "Epoch 1/20 Iteration 127/3560 Training loss: 3.2714 0.0451 sec/batch\n", + "Epoch 1/20 Iteration 128/3560 Training loss: 3.2695 0.0470 sec/batch\n", + "Epoch 1/20 Iteration 129/3560 Training loss: 3.2675 0.0514 sec/batch\n", + "Epoch 1/20 Iteration 130/3560 Training loss: 3.2656 0.0539 sec/batch\n", + "Epoch 1/20 Iteration 131/3560 Training loss: 3.2637 0.0460 sec/batch\n", + "Epoch 1/20 Iteration 132/3560 Training loss: 3.2617 0.0450 sec/batch\n", + "Epoch 1/20 Iteration 133/3560 Training loss: 3.2598 0.0514 sec/batch\n", + "Epoch 1/20 Iteration 134/3560 Training loss: 3.2578 0.0550 sec/batch\n", + "Epoch 1/20 Iteration 135/3560 Training loss: 3.2556 0.0458 sec/batch\n", + "Epoch 1/20 Iteration 136/3560 Training loss: 3.2536 0.0469 sec/batch\n", + "Epoch 1/20 Iteration 137/3560 Training loss: 3.2516 0.0457 sec/batch\n", + "Epoch 1/20 Iteration 138/3560 Training loss: 3.2495 0.0475 sec/batch\n", + "Epoch 1/20 Iteration 139/3560 Training loss: 3.2476 0.0486 sec/batch\n", + "Epoch 1/20 Iteration 140/3560 Training loss: 3.2456 0.0444 sec/batch\n", + "Epoch 1/20 Iteration 141/3560 Training loss: 3.2437 0.0457 sec/batch\n", + "Epoch 1/20 Iteration 142/3560 Training loss: 3.2415 0.0454 sec/batch\n", + "Epoch 1/20 Iteration 143/3560 Training loss: 3.2395 0.0462 sec/batch\n", + "Epoch 1/20 Iteration 
144/3560 Training loss: 3.2375 0.0457 sec/batch\n", + "Epoch 1/20 Iteration 145/3560 Training loss: 3.2355 0.0512 sec/batch\n", + "Epoch 1/20 Iteration 146/3560 Training loss: 3.2336 0.0462 sec/batch\n", + "Epoch 1/20 Iteration 147/3560 Training loss: 3.2316 0.0455 sec/batch\n", + "Epoch 1/20 Iteration 148/3560 Training loss: 3.2298 0.0472 sec/batch\n", + "Epoch 1/20 Iteration 149/3560 Training loss: 3.2277 0.0481 sec/batch\n", + "Epoch 1/20 Iteration 150/3560 Training loss: 3.2257 0.0475 sec/batch\n", + "Epoch 1/20 Iteration 151/3560 Training loss: 3.2239 0.0473 sec/batch\n", + "Epoch 1/20 Iteration 152/3560 Training loss: 3.2221 0.0466 sec/batch\n", + "Epoch 1/20 Iteration 153/3560 Training loss: 3.2201 0.0492 sec/batch\n", + "Epoch 1/20 Iteration 154/3560 Training loss: 3.2182 0.0487 sec/batch\n", + "Epoch 1/20 Iteration 155/3560 Training loss: 3.2161 0.0473 sec/batch\n", + "Epoch 1/20 Iteration 156/3560 Training loss: 3.2141 0.0469 sec/batch\n", + "Epoch 1/20 Iteration 157/3560 Training loss: 3.2120 0.0455 sec/batch\n", + "Epoch 1/20 Iteration 158/3560 Training loss: 3.2099 0.0456 sec/batch\n", + "Epoch 1/20 Iteration 159/3560 Training loss: 3.2078 0.0474 sec/batch\n", + "Epoch 1/20 Iteration 160/3560 Training loss: 3.2057 0.0502 sec/batch\n", + "Epoch 1/20 Iteration 161/3560 Training loss: 3.2037 0.0476 sec/batch\n", + "Epoch 1/20 Iteration 162/3560 Training loss: 3.2015 0.0457 sec/batch\n", + "Epoch 1/20 Iteration 163/3560 Training loss: 3.1994 0.0476 sec/batch\n", + "Epoch 1/20 Iteration 164/3560 Training loss: 3.1974 0.0488 sec/batch\n", + "Epoch 1/20 Iteration 165/3560 Training loss: 3.1954 0.0474 sec/batch\n", + "Epoch 1/20 Iteration 166/3560 Training loss: 3.1933 0.0510 sec/batch\n", + "Epoch 1/20 Iteration 167/3560 Training loss: 3.1912 0.0463 sec/batch\n", + "Epoch 1/20 Iteration 168/3560 Training loss: 3.1892 0.0497 sec/batch\n", + "Epoch 1/20 Iteration 169/3560 Training loss: 3.1871 0.0494 sec/batch\n", + "Epoch 1/20 Iteration 170/3560 Training loss: 
3.1850 0.0468 sec/batch\n", + "Epoch 1/20 Iteration 171/3560 Training loss: 3.1830 0.0462 sec/batch\n", + "Epoch 1/20 Iteration 172/3560 Training loss: 3.1811 0.0472 sec/batch\n", + "Epoch 1/20 Iteration 173/3560 Training loss: 3.1794 0.0462 sec/batch\n", + "Epoch 1/20 Iteration 174/3560 Training loss: 3.1776 0.0490 sec/batch\n", + "Epoch 1/20 Iteration 175/3560 Training loss: 3.1758 0.0475 sec/batch\n", + "Epoch 1/20 Iteration 176/3560 Training loss: 3.1739 0.0480 sec/batch\n", + "Epoch 1/20 Iteration 177/3560 Training loss: 3.1718 0.0452 sec/batch\n", + "Epoch 1/20 Iteration 178/3560 Training loss: 3.1697 0.0491 sec/batch\n", + "Epoch 2/20 Iteration 179/3560 Training loss: 2.8815 0.0530 sec/batch\n", + "Epoch 2/20 Iteration 180/3560 Training loss: 2.8324 0.0473 sec/batch\n", + "Epoch 2/20 Iteration 181/3560 Training loss: 2.8155 0.0456 sec/batch\n", + "Epoch 2/20 Iteration 182/3560 Training loss: 2.8105 0.0460 sec/batch\n", + "Epoch 2/20 Iteration 183/3560 Training loss: 2.8063 0.0521 sec/batch\n", + "Epoch 2/20 Iteration 184/3560 Training loss: 2.8030 0.0508 sec/batch\n", + "Epoch 2/20 Iteration 185/3560 Training loss: 2.8015 0.0481 sec/batch\n", + "Epoch 2/20 Iteration 186/3560 Training loss: 2.7995 0.0449 sec/batch\n", + "Epoch 2/20 Iteration 187/3560 Training loss: 2.7975 0.0469 sec/batch\n", + "Epoch 2/20 Iteration 188/3560 Training loss: 2.7950 0.0468 sec/batch\n", + "Epoch 2/20 Iteration 189/3560 Training loss: 2.7916 0.0480 sec/batch\n", + "Epoch 2/20 Iteration 190/3560 Training loss: 2.7906 0.0469 sec/batch\n", + "Epoch 2/20 Iteration 191/3560 Training loss: 2.7885 0.0461 sec/batch\n", + "Epoch 2/20 Iteration 192/3560 Training loss: 2.7883 0.0463 sec/batch\n", + "Epoch 2/20 Iteration 193/3560 Training loss: 2.7869 0.0536 sec/batch\n", + "Epoch 2/20 Iteration 194/3560 Training loss: 2.7858 0.0515 sec/batch\n", + "Epoch 2/20 Iteration 195/3560 Training loss: 2.7841 0.0485 sec/batch\n", + "Epoch 2/20 Iteration 196/3560 Training loss: 2.7845 0.0468 
sec/batch\n", + "Epoch 2/20 Iteration 197/3560 Training loss: 2.7828 0.0463 sec/batch\n", + "Epoch 2/20 Iteration 198/3560 Training loss: 2.7797 0.0485 sec/batch\n", + "Epoch 2/20 Iteration 199/3560 Training loss: 2.7778 0.0487 sec/batch\n", + "Epoch 2/20 Iteration 200/3560 Training loss: 2.7768 0.0477 sec/batch\n", + "Epoch 2/20 Iteration 201/3560 Training loss: 2.7749 0.0458 sec/batch\n", + "Epoch 2/20 Iteration 202/3560 Training loss: 2.7730 0.0487 sec/batch\n", + "Epoch 2/20 Iteration 203/3560 Training loss: 2.7710 0.0477 sec/batch\n", + "Epoch 2/20 Iteration 204/3560 Training loss: 2.7695 0.0597 sec/batch\n", + "Epoch 2/20 Iteration 205/3560 Training loss: 2.7680 0.0480 sec/batch\n", + "Epoch 2/20 Iteration 206/3560 Training loss: 2.7659 0.0461 sec/batch\n", + "Epoch 2/20 Iteration 207/3560 Training loss: 2.7643 0.0488 sec/batch\n", + "Epoch 2/20 Iteration 208/3560 Training loss: 2.7628 0.0516 sec/batch\n", + "Epoch 2/20 Iteration 209/3560 Training loss: 2.7619 0.0488 sec/batch\n", + "Epoch 2/20 Iteration 210/3560 Training loss: 2.7601 0.0462 sec/batch\n", + "Epoch 2/20 Iteration 211/3560 Training loss: 2.7579 0.0463 sec/batch\n", + "Epoch 2/20 Iteration 212/3560 Training loss: 2.7565 0.0503 sec/batch\n", + "Epoch 2/20 Iteration 213/3560 Training loss: 2.7546 0.0527 sec/batch\n", + "Epoch 2/20 Iteration 214/3560 Training loss: 2.7534 0.0511 sec/batch\n", + "Epoch 2/20 Iteration 215/3560 Training loss: 2.7514 0.0458 sec/batch\n", + "Epoch 2/20 Iteration 216/3560 Training loss: 2.7494 0.0461 sec/batch\n", + "Epoch 2/20 Iteration 217/3560 Training loss: 2.7473 0.0554 sec/batch\n", + "Epoch 2/20 Iteration 218/3560 Training loss: 2.7454 0.0487 sec/batch\n", + "Epoch 2/20 Iteration 219/3560 Training loss: 2.7433 0.0471 sec/batch\n", + "Epoch 2/20 Iteration 220/3560 Training loss: 2.7415 0.0473 sec/batch\n", + "Epoch 2/20 Iteration 221/3560 Training loss: 2.7398 0.0456 sec/batch\n", + "Epoch 2/20 Iteration 222/3560 Training loss: 2.7380 0.0533 sec/batch\n", + "Epoch 
2/20 Iteration 223/3560 Training loss: 2.7361 0.0477 sec/batch\n", + "Epoch 2/20 Iteration 224/3560 Training loss: 2.7343 0.0483 sec/batch\n", + "Epoch 2/20 Iteration 225/3560 Training loss: 2.7330 0.0489 sec/batch\n", + "Epoch 2/20 Iteration 226/3560 Training loss: 2.7316 0.0470 sec/batch\n", + "Epoch 2/20 Iteration 227/3560 Training loss: 2.7301 0.0520 sec/batch\n", + "Epoch 2/20 Iteration 228/3560 Training loss: 2.7290 0.0534 sec/batch\n", + "Epoch 2/20 Iteration 229/3560 Training loss: 2.7274 0.0520 sec/batch\n", + "Epoch 2/20 Iteration 230/3560 Training loss: 2.7259 0.0464 sec/batch\n", + "Epoch 2/20 Iteration 231/3560 Training loss: 2.7243 0.0453 sec/batch\n", + "Epoch 2/20 Iteration 232/3560 Training loss: 2.7226 0.0532 sec/batch\n", + "Epoch 2/20 Iteration 233/3560 Training loss: 2.7210 0.0479 sec/batch\n", + "Epoch 2/20 Iteration 234/3560 Training loss: 2.7195 0.0507 sec/batch\n", + "Epoch 2/20 Iteration 235/3560 Training loss: 2.7181 0.0465 sec/batch\n", + "Epoch 2/20 Iteration 236/3560 Training loss: 2.7166 0.0528 sec/batch\n", + "Epoch 2/20 Iteration 237/3560 Training loss: 2.7151 0.0501 sec/batch\n", + "Epoch 2/20 Iteration 238/3560 Training loss: 2.7140 0.0490 sec/batch\n", + "Epoch 2/20 Iteration 239/3560 Training loss: 2.7126 0.0459 sec/batch\n", + "Epoch 2/20 Iteration 240/3560 Training loss: 2.7115 0.0471 sec/batch\n", + "Epoch 2/20 Iteration 241/3560 Training loss: 2.7104 0.0522 sec/batch\n", + "Epoch 2/20 Iteration 242/3560 Training loss: 2.7089 0.0487 sec/batch\n", + "Epoch 2/20 Iteration 243/3560 Training loss: 2.7074 0.0475 sec/batch\n", + "Epoch 2/20 Iteration 244/3560 Training loss: 2.7066 0.0482 sec/batch\n", + "Epoch 2/20 Iteration 245/3560 Training loss: 2.7052 0.0471 sec/batch\n", + "Epoch 2/20 Iteration 246/3560 Training loss: 2.7034 0.0507 sec/batch\n", + "Epoch 2/20 Iteration 247/3560 Training loss: 2.7018 0.0541 sec/batch\n", + "Epoch 2/20 Iteration 248/3560 Training loss: 2.7006 0.0525 sec/batch\n", + "Epoch 2/20 Iteration 249/3560 
Training loss: 2.6994 0.0466 sec/batch\n", + "Epoch 2/20 Iteration 250/3560 Training loss: 2.6984 0.0473 sec/batch\n", + "Epoch 2/20 Iteration 251/3560 Training loss: 2.6970 0.0471 sec/batch\n", + "Epoch 2/20 Iteration 252/3560 Training loss: 2.6957 0.0586 sec/batch\n", + "Epoch 2/20 Iteration 253/3560 Training loss: 2.6946 0.0480 sec/batch\n", + "Epoch 2/20 Iteration 254/3560 Training loss: 2.6938 0.0469 sec/batch\n", + "Epoch 2/20 Iteration 255/3560 Training loss: 2.6925 0.0516 sec/batch\n", + "Epoch 2/20 Iteration 256/3560 Training loss: 2.6915 0.0550 sec/batch\n", + "Epoch 2/20 Iteration 257/3560 Training loss: 2.6902 0.0482 sec/batch\n", + "Epoch 2/20 Iteration 258/3560 Training loss: 2.6890 0.0469 sec/batch\n", + "Epoch 2/20 Iteration 259/3560 Training loss: 2.6877 0.0498 sec/batch\n", + "Epoch 2/20 Iteration 260/3560 Training loss: 2.6865 0.0492 sec/batch\n", + "Epoch 2/20 Iteration 261/3560 Training loss: 2.6854 0.0520 sec/batch\n", + "Epoch 2/20 Iteration 262/3560 Training loss: 2.6841 0.0467 sec/batch\n", + "Epoch 2/20 Iteration 263/3560 Training loss: 2.6825 0.0524 sec/batch\n", + "Epoch 2/20 Iteration 264/3560 Training loss: 2.6811 0.0497 sec/batch\n", + "Epoch 2/20 Iteration 265/3560 Training loss: 2.6799 0.0463 sec/batch\n", + "Epoch 2/20 Iteration 266/3560 Training loss: 2.6787 0.0481 sec/batch\n", + "Epoch 2/20 Iteration 267/3560 Training loss: 2.6775 0.0497 sec/batch\n", + "Epoch 2/20 Iteration 268/3560 Training loss: 2.6764 0.0488 sec/batch\n", + "Epoch 2/20 Iteration 269/3560 Training loss: 2.6753 0.0488 sec/batch\n", + "Epoch 2/20 Iteration 270/3560 Training loss: 2.6742 0.0472 sec/batch\n", + "Epoch 2/20 Iteration 271/3560 Training loss: 2.6731 0.0471 sec/batch\n", + "Epoch 2/20 Iteration 272/3560 Training loss: 2.6719 0.0541 sec/batch\n", + "Epoch 2/20 Iteration 273/3560 Training loss: 2.6706 0.0500 sec/batch\n", + "Epoch 2/20 Iteration 274/3560 Training loss: 2.6693 0.0507 sec/batch\n", + "Epoch 2/20 Iteration 275/3560 Training loss: 2.6682 
0.0488 sec/batch\n", + "Epoch 2/20 Iteration 276/3560 Training loss: 2.6671 0.0470 sec/batch\n", + "Epoch 2/20 Iteration 277/3560 Training loss: 2.6660 0.0476 sec/batch\n", + "Epoch 2/20 Iteration 278/3560 Training loss: 2.6649 0.0533 sec/batch\n", + "Epoch 2/20 Iteration 279/3560 Training loss: 2.6640 0.0564 sec/batch\n", + "Epoch 2/20 Iteration 280/3560 Training loss: 2.6630 0.0468 sec/batch\n", + "Epoch 2/20 Iteration 281/3560 Training loss: 2.6618 0.0470 sec/batch\n", + "Epoch 2/20 Iteration 282/3560 Training loss: 2.6606 0.0486 sec/batch\n", + "Epoch 2/20 Iteration 283/3560 Training loss: 2.6595 0.0493 sec/batch\n", + "Epoch 2/20 Iteration 284/3560 Training loss: 2.6585 0.0472 sec/batch\n", + "Epoch 2/20 Iteration 285/3560 Training loss: 2.6573 0.0496 sec/batch\n", + "Epoch 2/20 Iteration 286/3560 Training loss: 2.6565 0.0481 sec/batch\n", + "Epoch 2/20 Iteration 287/3560 Training loss: 2.6556 0.0502 sec/batch\n", + "Epoch 2/20 Iteration 288/3560 Training loss: 2.6543 0.0473 sec/batch\n", + "Epoch 2/20 Iteration 289/3560 Training loss: 2.6533 0.0472 sec/batch\n", + "Epoch 2/20 Iteration 290/3560 Training loss: 2.6524 0.0532 sec/batch\n", + "Epoch 2/20 Iteration 291/3560 Training loss: 2.6514 0.0526 sec/batch\n", + "Epoch 2/20 Iteration 292/3560 Training loss: 2.6503 0.0529 sec/batch\n", + "Epoch 2/20 Iteration 293/3560 Training loss: 2.6492 0.0491 sec/batch\n", + "Epoch 2/20 Iteration 294/3560 Training loss: 2.6480 0.0485 sec/batch\n", + "Epoch 2/20 Iteration 295/3560 Training loss: 2.6469 0.0508 sec/batch\n", + "Epoch 2/20 Iteration 296/3560 Training loss: 2.6460 0.0487 sec/batch\n", + "Epoch 2/20 Iteration 297/3560 Training loss: 2.6452 0.0471 sec/batch\n", + "Epoch 2/20 Iteration 298/3560 Training loss: 2.6442 0.0467 sec/batch\n", + "Epoch 2/20 Iteration 299/3560 Training loss: 2.6434 0.0498 sec/batch\n", + "Epoch 2/20 Iteration 300/3560 Training loss: 2.6425 0.0508 sec/batch\n", + "Epoch 2/20 Iteration 301/3560 Training loss: 2.6415 0.0479 sec/batch\n", + 
"Epoch 2/20 Iteration 302/3560 Training loss: 2.6406 0.0489 sec/batch\n", + "Epoch 2/20 Iteration 303/3560 Training loss: 2.6397 0.0515 sec/batch\n", + "Epoch 2/20 Iteration 304/3560 Training loss: 2.6386 0.0499 sec/batch\n", + "Epoch 2/20 Iteration 305/3560 Training loss: 2.6378 0.0482 sec/batch\n", + "Epoch 2/20 Iteration 306/3560 Training loss: 2.6370 0.0473 sec/batch\n", + "Epoch 2/20 Iteration 307/3560 Training loss: 2.6360 0.0484 sec/batch\n", + "Epoch 2/20 Iteration 308/3560 Training loss: 2.6351 0.0511 sec/batch\n", + "Epoch 2/20 Iteration 309/3560 Training loss: 2.6343 0.0535 sec/batch\n", + "Epoch 2/20 Iteration 310/3560 Training loss: 2.6333 0.0493 sec/batch\n", + "Epoch 2/20 Iteration 311/3560 Training loss: 2.6325 0.0470 sec/batch\n", + "Epoch 2/20 Iteration 312/3560 Training loss: 2.6317 0.0481 sec/batch\n", + "Epoch 2/20 Iteration 313/3560 Training loss: 2.6306 0.0482 sec/batch\n", + "Epoch 2/20 Iteration 314/3560 Training loss: 2.6297 0.0591 sec/batch\n", + "Epoch 2/20 Iteration 315/3560 Training loss: 2.6288 0.0541 sec/batch\n", + "Epoch 2/20 Iteration 316/3560 Training loss: 2.6279 0.0474 sec/batch\n", + "Epoch 2/20 Iteration 317/3560 Training loss: 2.6272 0.0540 sec/batch\n", + "Epoch 2/20 Iteration 318/3560 Training loss: 2.6263 0.0526 sec/batch\n", + "Epoch 2/20 Iteration 319/3560 Training loss: 2.6256 0.0481 sec/batch\n", + "Epoch 2/20 Iteration 320/3560 Training loss: 2.6247 0.0469 sec/batch\n", + "Epoch 2/20 Iteration 321/3560 Training loss: 2.6239 0.0520 sec/batch\n", + "Epoch 2/20 Iteration 322/3560 Training loss: 2.6230 0.0480 sec/batch\n", + "Epoch 2/20 Iteration 323/3560 Training loss: 2.6221 0.0487 sec/batch\n", + "Epoch 2/20 Iteration 324/3560 Training loss: 2.6214 0.0477 sec/batch\n", + "Epoch 2/20 Iteration 325/3560 Training loss: 2.6206 0.0563 sec/batch\n", + "Epoch 2/20 Iteration 326/3560 Training loss: 2.6200 0.0495 sec/batch\n", + "Epoch 2/20 Iteration 327/3560 Training loss: 2.6191 0.0544 sec/batch\n", + "Epoch 2/20 Iteration 
328/3560 Training loss: 2.6183 0.0474 sec/batch\n", + "Epoch 2/20 Iteration 329/3560 Training loss: 2.6177 0.0537 sec/batch\n", + "Epoch 2/20 Iteration 330/3560 Training loss: 2.6172 0.0495 sec/batch\n", + "Epoch 2/20 Iteration 331/3560 Training loss: 2.6165 0.0473 sec/batch\n", + "Epoch 2/20 Iteration 332/3560 Training loss: 2.6158 0.0528 sec/batch\n", + "Epoch 2/20 Iteration 333/3560 Training loss: 2.6150 0.0507 sec/batch\n", + "Epoch 2/20 Iteration 334/3560 Training loss: 2.6142 0.0514 sec/batch\n", + "Epoch 2/20 Iteration 335/3560 Training loss: 2.6133 0.0479 sec/batch\n", + "Epoch 2/20 Iteration 336/3560 Training loss: 2.6125 0.0480 sec/batch\n", + "Epoch 2/20 Iteration 337/3560 Training loss: 2.6116 0.0541 sec/batch\n", + "Epoch 2/20 Iteration 338/3560 Training loss: 2.6109 0.0475 sec/batch\n", + "Epoch 2/20 Iteration 339/3560 Training loss: 2.6102 0.0477 sec/batch\n", + "Epoch 2/20 Iteration 340/3560 Training loss: 2.6093 0.0475 sec/batch\n", + "Epoch 2/20 Iteration 341/3560 Training loss: 2.6084 0.0597 sec/batch\n", + "Epoch 2/20 Iteration 342/3560 Training loss: 2.6076 0.0592 sec/batch\n", + "Epoch 2/20 Iteration 343/3560 Training loss: 2.6068 0.0492 sec/batch\n", + "Epoch 2/20 Iteration 344/3560 Training loss: 2.6061 0.0479 sec/batch\n", + "Epoch 2/20 Iteration 345/3560 Training loss: 2.6054 0.0487 sec/batch\n", + "Epoch 2/20 Iteration 346/3560 Training loss: 2.6047 0.0537 sec/batch\n", + "Epoch 2/20 Iteration 347/3560 Training loss: 2.6040 0.0494 sec/batch\n", + "Epoch 2/20 Iteration 348/3560 Training loss: 2.6031 0.0469 sec/batch\n", + "Epoch 2/20 Iteration 349/3560 Training loss: 2.6025 0.0534 sec/batch\n", + "Epoch 2/20 Iteration 350/3560 Training loss: 2.6020 0.0485 sec/batch\n", + "Epoch 2/20 Iteration 351/3560 Training loss: 2.6015 0.0503 sec/batch\n", + "Epoch 2/20 Iteration 352/3560 Training loss: 2.6010 0.0475 sec/batch\n", + "Epoch 2/20 Iteration 353/3560 Training loss: 2.6005 0.0474 sec/batch\n", + "Epoch 2/20 Iteration 354/3560 Training loss: 
2.5998 0.0501 sec/batch\n", + "Epoch 2/20 Iteration 355/3560 Training loss: 2.5990 0.0505 sec/batch\n", + "Epoch 2/20 Iteration 356/3560 Training loss: 2.5981 0.0484 sec/batch\n", + "Epoch 3/20 Iteration 357/3560 Training loss: 2.5417 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 358/3560 Training loss: 2.4880 0.0477 sec/batch\n", + "Epoch 3/20 Iteration 359/3560 Training loss: 2.4736 0.0484 sec/batch\n", + "Epoch 3/20 Iteration 360/3560 Training loss: 2.4689 0.0499 sec/batch\n", + "Epoch 3/20 Iteration 361/3560 Training loss: 2.4665 0.0477 sec/batch\n", + "Epoch 3/20 Iteration 362/3560 Training loss: 2.4644 0.0472 sec/batch\n", + "Epoch 3/20 Iteration 363/3560 Training loss: 2.4639 0.0472 sec/batch\n", + "Epoch 3/20 Iteration 364/3560 Training loss: 2.4657 0.0545 sec/batch\n", + "Epoch 3/20 Iteration 365/3560 Training loss: 2.4664 0.0556 sec/batch\n", + "Epoch 3/20 Iteration 366/3560 Training loss: 2.4657 0.0485 sec/batch\n", + "Epoch 3/20 Iteration 367/3560 Training loss: 2.4642 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 368/3560 Training loss: 2.4642 0.0474 sec/batch\n", + "Epoch 3/20 Iteration 369/3560 Training loss: 2.4641 0.0550 sec/batch\n", + "Epoch 3/20 Iteration 370/3560 Training loss: 2.4653 0.0501 sec/batch\n", + "Epoch 3/20 Iteration 371/3560 Training loss: 2.4650 0.0473 sec/batch\n", + "Epoch 3/20 Iteration 372/3560 Training loss: 2.4651 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 373/3560 Training loss: 2.4651 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 374/3560 Training loss: 2.4664 0.0518 sec/batch\n", + "Epoch 3/20 Iteration 375/3560 Training loss: 2.4664 0.0486 sec/batch\n", + "Epoch 3/20 Iteration 376/3560 Training loss: 2.4646 0.0470 sec/batch\n", + "Epoch 3/20 Iteration 377/3560 Training loss: 2.4639 0.0562 sec/batch\n", + "Epoch 3/20 Iteration 378/3560 Training loss: 2.4650 0.0518 sec/batch\n", + "Epoch 3/20 Iteration 379/3560 Training loss: 2.4644 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 380/3560 Training loss: 2.4634 0.0480 
sec/batch\n", + "Epoch 3/20 Iteration 381/3560 Training loss: 2.4626 0.0485 sec/batch\n", + "Epoch 3/20 Iteration 382/3560 Training loss: 2.4620 0.0497 sec/batch\n", + "Epoch 3/20 Iteration 383/3560 Training loss: 2.4615 0.0523 sec/batch\n", + "Epoch 3/20 Iteration 384/3560 Training loss: 2.4610 0.0518 sec/batch\n", + "Epoch 3/20 Iteration 385/3560 Training loss: 2.4611 0.0543 sec/batch\n", + "Epoch 3/20 Iteration 386/3560 Training loss: 2.4608 0.0560 sec/batch\n", + "Epoch 3/20 Iteration 387/3560 Training loss: 2.4609 0.0484 sec/batch\n", + "Epoch 3/20 Iteration 388/3560 Training loss: 2.4604 0.0476 sec/batch\n", + "Epoch 3/20 Iteration 389/3560 Training loss: 2.4595 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 390/3560 Training loss: 2.4591 0.0560 sec/batch\n", + "Epoch 3/20 Iteration 391/3560 Training loss: 2.4583 0.0510 sec/batch\n", + "Epoch 3/20 Iteration 392/3560 Training loss: 2.4580 0.0497 sec/batch\n", + "Epoch 3/20 Iteration 393/3560 Training loss: 2.4573 0.0557 sec/batch\n", + "Epoch 3/20 Iteration 394/3560 Training loss: 2.4561 0.0546 sec/batch\n", + "Epoch 3/20 Iteration 395/3560 Training loss: 2.4550 0.0536 sec/batch\n", + "Epoch 3/20 Iteration 396/3560 Training loss: 2.4543 0.0488 sec/batch\n", + "Epoch 3/20 Iteration 397/3560 Training loss: 2.4534 0.0526 sec/batch\n", + "Epoch 3/20 Iteration 398/3560 Training loss: 2.4524 0.0530 sec/batch\n", + "Epoch 3/20 Iteration 399/3560 Training loss: 2.4516 0.0483 sec/batch\n", + "Epoch 3/20 Iteration 400/3560 Training loss: 2.4508 0.0486 sec/batch\n", + "Epoch 3/20 Iteration 401/3560 Training loss: 2.4499 0.0503 sec/batch\n", + "Epoch 3/20 Iteration 402/3560 Training loss: 2.4486 0.0489 sec/batch\n", + "Epoch 3/20 Iteration 403/3560 Training loss: 2.4482 0.0499 sec/batch\n", + "Epoch 3/20 Iteration 404/3560 Training loss: 2.4477 0.0478 sec/batch\n", + "Epoch 3/20 Iteration 405/3560 Training loss: 2.4470 0.0481 sec/batch\n", + "Epoch 3/20 Iteration 406/3560 Training loss: 2.4469 0.0513 sec/batch\n", + "Epoch 
3/20 Iteration 407/3560 Training loss: 2.4461 0.0521 sec/batch\n", + "Epoch 3/20 Iteration 408/3560 Training loss: 2.4457 0.0482 sec/batch\n", + "Epoch 3/20 Iteration 409/3560 Training loss: 2.4448 0.0542 sec/batch\n", + "Epoch 3/20 Iteration 410/3560 Training loss: 2.4440 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 411/3560 Training loss: 2.4434 0.0533 sec/batch\n", + "Epoch 3/20 Iteration 412/3560 Training loss: 2.4430 0.0477 sec/batch\n", + "Epoch 3/20 Iteration 413/3560 Training loss: 2.4425 0.0484 sec/batch\n", + "Epoch 3/20 Iteration 414/3560 Training loss: 2.4417 0.0522 sec/batch\n", + "Epoch 3/20 Iteration 415/3560 Training loss: 2.4410 0.0525 sec/batch\n", + "Epoch 3/20 Iteration 416/3560 Training loss: 2.4407 0.0503 sec/batch\n", + "Epoch 3/20 Iteration 417/3560 Training loss: 2.4402 0.0536 sec/batch\n", + "Epoch 3/20 Iteration 418/3560 Training loss: 2.4399 0.0526 sec/batch\n", + "Epoch 3/20 Iteration 419/3560 Training loss: 2.4398 0.0490 sec/batch\n", + "Epoch 3/20 Iteration 420/3560 Training loss: 2.4392 0.0473 sec/batch\n", + "Epoch 3/20 Iteration 421/3560 Training loss: 2.4387 0.0507 sec/batch\n", + "Epoch 3/20 Iteration 422/3560 Training loss: 2.4385 0.0562 sec/batch\n", + "Epoch 3/20 Iteration 423/3560 Training loss: 2.4381 0.0475 sec/batch\n", + "Epoch 3/20 Iteration 424/3560 Training loss: 2.4372 0.0478 sec/batch\n", + "Epoch 3/20 Iteration 425/3560 Training loss: 2.4366 0.0527 sec/batch\n", + "Epoch 3/20 Iteration 426/3560 Training loss: 2.4363 0.0494 sec/batch\n", + "Epoch 3/20 Iteration 427/3560 Training loss: 2.4359 0.0529 sec/batch\n", + "Epoch 3/20 Iteration 428/3560 Training loss: 2.4356 0.0479 sec/batch\n", + "Epoch 3/20 Iteration 429/3560 Training loss: 2.4353 0.0537 sec/batch\n", + "Epoch 3/20 Iteration 430/3560 Training loss: 2.4348 0.0519 sec/batch\n", + "Epoch 3/20 Iteration 431/3560 Training loss: 2.4343 0.0483 sec/batch\n", + "Epoch 3/20 Iteration 432/3560 Training loss: 2.4343 0.0498 sec/batch\n", + "Epoch 3/20 Iteration 433/3560 
Training loss: 2.4339 0.0497 sec/batch\n", + "Epoch 3/20 Iteration 434/3560 Training loss: 2.4337 0.0534 sec/batch\n", + "Epoch 3/20 Iteration 435/3560 Training loss: 2.4331 0.0480 sec/batch\n", + "Epoch 3/20 Iteration 436/3560 Training loss: 2.4326 0.0481 sec/batch\n", + "Epoch 3/20 Iteration 437/3560 Training loss: 2.4321 0.0503 sec/batch\n", + "Epoch 3/20 Iteration 438/3560 Training loss: 2.4319 0.0581 sec/batch\n", + "Epoch 3/20 Iteration 439/3560 Training loss: 2.4313 0.0481 sec/batch\n", + "Epoch 3/20 Iteration 440/3560 Training loss: 2.4308 0.0480 sec/batch\n", + "Epoch 3/20 Iteration 441/3560 Training loss: 2.4299 0.0490 sec/batch\n", + "Epoch 3/20 Iteration 442/3560 Training loss: 2.4293 0.0533 sec/batch\n", + "Epoch 3/20 Iteration 443/3560 Training loss: 2.4289 0.0509 sec/batch\n", + "Epoch 3/20 Iteration 444/3560 Training loss: 2.4284 0.0505 sec/batch\n", + "Epoch 3/20 Iteration 445/3560 Training loss: 2.4280 0.0517 sec/batch\n", + "Epoch 3/20 Iteration 446/3560 Training loss: 2.4277 0.0508 sec/batch\n", + "Epoch 3/20 Iteration 447/3560 Training loss: 2.4272 0.0553 sec/batch\n", + "Epoch 3/20 Iteration 448/3560 Training loss: 2.4268 0.0484 sec/batch\n", + "Epoch 3/20 Iteration 449/3560 Training loss: 2.4263 0.0507 sec/batch\n", + "Epoch 3/20 Iteration 450/3560 Training loss: 2.4258 0.0513 sec/batch\n", + "Epoch 3/20 Iteration 451/3560 Training loss: 2.4251 0.0474 sec/batch\n", + "Epoch 3/20 Iteration 452/3560 Training loss: 2.4246 0.0489 sec/batch\n", + "Epoch 3/20 Iteration 453/3560 Training loss: 2.4242 0.0488 sec/batch\n", + "Epoch 3/20 Iteration 454/3560 Training loss: 2.4239 0.0495 sec/batch\n", + "Epoch 3/20 Iteration 455/3560 Training loss: 2.4234 0.0496 sec/batch\n", + "Epoch 3/20 Iteration 456/3560 Training loss: 2.4228 0.0483 sec/batch\n", + "Epoch 3/20 Iteration 457/3560 Training loss: 2.4227 0.0516 sec/batch\n", + "Epoch 3/20 Iteration 458/3560 Training loss: 2.4223 0.0532 sec/batch\n", + "Epoch 3/20 Iteration 459/3560 Training loss: 2.4217 
0.0484 sec/batch\n", + "Epoch 3/20 Iteration 460/3560 Training loss: 2.4213 0.0533 sec/batch\n", + "Epoch 3/20 Iteration 461/3560 Training loss: 2.4209 0.0480 sec/batch\n", + "Epoch 3/20 Iteration 462/3560 Training loss: 2.4206 0.0504 sec/batch\n", + "Epoch 3/20 Iteration 463/3560 Training loss: 2.4202 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 464/3560 Training loss: 2.4200 0.0487 sec/batch\n", + "Epoch 3/20 Iteration 465/3560 Training loss: 2.4198 0.0481 sec/batch\n", + "Epoch 3/20 Iteration 466/3560 Training loss: 2.4192 0.0509 sec/batch\n", + "Epoch 3/20 Iteration 467/3560 Training loss: 2.4189 0.0579 sec/batch\n", + "Epoch 3/20 Iteration 468/3560 Training loss: 2.4187 0.0566 sec/batch\n", + "Epoch 3/20 Iteration 469/3560 Training loss: 2.4184 0.0503 sec/batch\n", + "Epoch 3/20 Iteration 470/3560 Training loss: 2.4180 0.0499 sec/batch\n", + "Epoch 3/20 Iteration 471/3560 Training loss: 2.4176 0.0484 sec/batch\n", + "Epoch 3/20 Iteration 472/3560 Training loss: 2.4170 0.0530 sec/batch\n", + "Epoch 3/20 Iteration 473/3560 Training loss: 2.4165 0.0541 sec/batch\n", + "Epoch 3/20 Iteration 474/3560 Training loss: 2.4163 0.0484 sec/batch\n", + "Epoch 3/20 Iteration 475/3560 Training loss: 2.4162 0.0508 sec/batch\n", + "Epoch 3/20 Iteration 476/3560 Training loss: 2.4159 0.0529 sec/batch\n", + "Epoch 3/20 Iteration 477/3560 Training loss: 2.4156 0.0474 sec/batch\n", + "Epoch 3/20 Iteration 478/3560 Training loss: 2.4153 0.0480 sec/batch\n", + "Epoch 3/20 Iteration 479/3560 Training loss: 2.4149 0.0541 sec/batch\n", + "Epoch 3/20 Iteration 480/3560 Training loss: 2.4146 0.0480 sec/batch\n", + "Epoch 3/20 Iteration 481/3560 Training loss: 2.4143 0.0482 sec/batch\n", + "Epoch 3/20 Iteration 482/3560 Training loss: 2.4138 0.0489 sec/batch\n", + "Epoch 3/20 Iteration 483/3560 Training loss: 2.4136 0.0482 sec/batch\n", + "Epoch 3/20 Iteration 484/3560 Training loss: 2.4134 0.0527 sec/batch\n", + "Epoch 3/20 Iteration 485/3560 Training loss: 2.4131 0.0482 sec/batch\n", + 
"Epoch 3/20 Iteration 486/3560 Training loss: 2.4128 0.0486 sec/batch\n", + "Epoch 3/20 Iteration 487/3560 Training loss: 2.4125 0.0504 sec/batch\n", + "Epoch 3/20 Iteration 488/3560 Training loss: 2.4120 0.0501 sec/batch\n", + "Epoch 3/20 Iteration 489/3560 Training loss: 2.4118 0.0515 sec/batch\n", + "Epoch 3/20 Iteration 490/3560 Training loss: 2.4116 0.0498 sec/batch\n", + "Epoch 3/20 Iteration 491/3560 Training loss: 2.4111 0.0554 sec/batch\n", + "Epoch 3/20 Iteration 492/3560 Training loss: 2.4109 0.0556 sec/batch\n", + "Epoch 3/20 Iteration 493/3560 Training loss: 2.4105 0.0477 sec/batch\n", + "Epoch 3/20 Iteration 494/3560 Training loss: 2.4102 0.0481 sec/batch\n", + "Epoch 3/20 Iteration 495/3560 Training loss: 2.4102 0.0500 sec/batch\n", + "Epoch 3/20 Iteration 496/3560 Training loss: 2.4099 0.0519 sec/batch\n", + "Epoch 3/20 Iteration 497/3560 Training loss: 2.4098 0.0494 sec/batch\n", + "Epoch 3/20 Iteration 498/3560 Training loss: 2.4095 0.0479 sec/batch\n", + "Epoch 3/20 Iteration 499/3560 Training loss: 2.4091 0.0555 sec/batch\n", + "Epoch 3/20 Iteration 500/3560 Training loss: 2.4088 0.0560 sec/batch\n", + "Epoch 3/20 Iteration 501/3560 Training loss: 2.4085 0.0531 sec/batch\n", + "Epoch 3/20 Iteration 502/3560 Training loss: 2.4084 0.0484 sec/batch\n", + "Epoch 3/20 Iteration 503/3560 Training loss: 2.4081 0.0486 sec/batch\n", + "Epoch 3/20 Iteration 504/3560 Training loss: 2.4080 0.0496 sec/batch\n", + "Epoch 3/20 Iteration 505/3560 Training loss: 2.4077 0.0517 sec/batch\n", + "Epoch 3/20 Iteration 506/3560 Training loss: 2.4073 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 507/3560 Training loss: 2.4073 0.0513 sec/batch\n", + "Epoch 3/20 Iteration 508/3560 Training loss: 2.4073 0.0521 sec/batch\n", + "Epoch 3/20 Iteration 509/3560 Training loss: 2.4071 0.0497 sec/batch\n", + "Epoch 3/20 Iteration 510/3560 Training loss: 2.4070 0.0511 sec/batch\n", + "Epoch 3/20 Iteration 511/3560 Training loss: 2.4067 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 
512/3560 Training loss: 2.4064 0.0565 sec/batch\n", + "Epoch 3/20 Iteration 513/3560 Training loss: 2.4060 0.0487 sec/batch\n", + "Epoch 3/20 Iteration 514/3560 Training loss: 2.4057 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 515/3560 Training loss: 2.4053 0.0517 sec/batch\n", + "Epoch 3/20 Iteration 516/3560 Training loss: 2.4051 0.0608 sec/batch\n", + "Epoch 3/20 Iteration 517/3560 Training loss: 2.4049 0.0483 sec/batch\n", + "Epoch 3/20 Iteration 518/3560 Training loss: 2.4044 0.0486 sec/batch\n", + "Epoch 3/20 Iteration 519/3560 Training loss: 2.4041 0.0585 sec/batch\n", + "Epoch 3/20 Iteration 520/3560 Training loss: 2.4038 0.0636 sec/batch\n", + "Epoch 3/20 Iteration 521/3560 Training loss: 2.4036 0.0502 sec/batch\n", + "Epoch 3/20 Iteration 522/3560 Training loss: 2.4034 0.0479 sec/batch\n", + "Epoch 3/20 Iteration 523/3560 Training loss: 2.4032 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 524/3560 Training loss: 2.4030 0.0535 sec/batch\n", + "Epoch 3/20 Iteration 525/3560 Training loss: 2.4027 0.0496 sec/batch\n", + "Epoch 3/20 Iteration 526/3560 Training loss: 2.4024 0.0495 sec/batch\n", + "Epoch 3/20 Iteration 527/3560 Training loss: 2.4022 0.0538 sec/batch\n", + "Epoch 3/20 Iteration 528/3560 Training loss: 2.4020 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 529/3560 Training loss: 2.4021 0.0521 sec/batch\n", + "Epoch 3/20 Iteration 530/3560 Training loss: 2.4021 0.0478 sec/batch\n", + "Epoch 3/20 Iteration 531/3560 Training loss: 2.4021 0.0562 sec/batch\n", + "Epoch 3/20 Iteration 532/3560 Training loss: 2.4019 0.0542 sec/batch\n", + "Epoch 3/20 Iteration 533/3560 Training loss: 2.4016 0.0487 sec/batch\n", + "Epoch 3/20 Iteration 534/3560 Training loss: 2.4012 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 535/3560 Training loss: 2.4175 0.0534 sec/batch\n", + "Epoch 4/20 Iteration 536/3560 Training loss: 2.3651 0.0506 sec/batch\n", + "Epoch 4/20 Iteration 537/3560 Training loss: 2.3536 0.0489 sec/batch\n", + "Epoch 4/20 Iteration 538/3560 Training loss: 
2.3508 0.0508 sec/batch\n", + "Epoch 4/20 Iteration 539/3560 Training loss: 2.3491 0.0515 sec/batch\n", + "Epoch 4/20 Iteration 540/3560 Training loss: 2.3479 0.0570 sec/batch\n", + "Epoch 4/20 Iteration 541/3560 Training loss: 2.3479 0.0487 sec/batch\n", + "Epoch 4/20 Iteration 542/3560 Training loss: 2.3484 0.0481 sec/batch\n", + "Epoch 4/20 Iteration 543/3560 Training loss: 2.3505 0.0544 sec/batch\n", + "Epoch 4/20 Iteration 544/3560 Training loss: 2.3495 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 545/3560 Training loss: 2.3478 0.0507 sec/batch\n", + "Epoch 4/20 Iteration 546/3560 Training loss: 2.3473 0.0493 sec/batch\n", + "Epoch 4/20 Iteration 547/3560 Training loss: 2.3473 0.0482 sec/batch\n", + "Epoch 4/20 Iteration 548/3560 Training loss: 2.3493 0.0496 sec/batch\n", + "Epoch 4/20 Iteration 549/3560 Training loss: 2.3496 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 550/3560 Training loss: 2.3498 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 551/3560 Training loss: 2.3493 0.0508 sec/batch\n", + "Epoch 4/20 Iteration 552/3560 Training loss: 2.3513 0.0575 sec/batch\n", + "Epoch 4/20 Iteration 553/3560 Training loss: 2.3516 0.0485 sec/batch\n", + "Epoch 4/20 Iteration 554/3560 Training loss: 2.3506 0.0537 sec/batch\n", + "Epoch 4/20 Iteration 555/3560 Training loss: 2.3499 0.0505 sec/batch\n", + "Epoch 4/20 Iteration 556/3560 Training loss: 2.3512 0.0534 sec/batch\n", + "Epoch 4/20 Iteration 557/3560 Training loss: 2.3508 0.0501 sec/batch\n", + "Epoch 4/20 Iteration 558/3560 Training loss: 2.3498 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 559/3560 Training loss: 2.3491 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 560/3560 Training loss: 2.3488 0.0492 sec/batch\n", + "Epoch 4/20 Iteration 561/3560 Training loss: 2.3480 0.0550 sec/batch\n", + "Epoch 4/20 Iteration 562/3560 Training loss: 2.3476 0.0484 sec/batch\n", + "Epoch 4/20 Iteration 563/3560 Training loss: 2.3480 0.0561 sec/batch\n", + "Epoch 4/20 Iteration 564/3560 Training loss: 2.3479 0.0524 
sec/batch\n", + "Epoch 4/20 Iteration 565/3560 Training loss: 2.3482 0.0572 sec/batch\n", + "Epoch 4/20 Iteration 566/3560 Training loss: 2.3475 0.0485 sec/batch\n", + "Epoch 4/20 Iteration 567/3560 Training loss: 2.3469 0.0567 sec/batch\n", + "Epoch 4/20 Iteration 568/3560 Training loss: 2.3470 0.0516 sec/batch\n", + "Epoch 4/20 Iteration 569/3560 Training loss: 2.3465 0.0506 sec/batch\n", + "Epoch 4/20 Iteration 570/3560 Training loss: 2.3464 0.0495 sec/batch\n", + "Epoch 4/20 Iteration 571/3560 Training loss: 2.3459 0.0514 sec/batch\n", + "Epoch 4/20 Iteration 572/3560 Training loss: 2.3448 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 573/3560 Training loss: 2.3441 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 574/3560 Training loss: 2.3435 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 575/3560 Training loss: 2.3428 0.0492 sec/batch\n", + "Epoch 4/20 Iteration 576/3560 Training loss: 2.3423 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 577/3560 Training loss: 2.3415 0.0525 sec/batch\n", + "Epoch 4/20 Iteration 578/3560 Training loss: 2.3406 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 579/3560 Training loss: 2.3401 0.0492 sec/batch\n", + "Epoch 4/20 Iteration 580/3560 Training loss: 2.3390 0.0488 sec/batch\n", + "Epoch 4/20 Iteration 581/3560 Training loss: 2.3390 0.0532 sec/batch\n", + "Epoch 4/20 Iteration 582/3560 Training loss: 2.3386 0.0531 sec/batch\n", + "Epoch 4/20 Iteration 583/3560 Training loss: 2.3382 0.0487 sec/batch\n", + "Epoch 4/20 Iteration 584/3560 Training loss: 2.3384 0.0572 sec/batch\n", + "Epoch 4/20 Iteration 585/3560 Training loss: 2.3378 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 586/3560 Training loss: 2.3377 0.0515 sec/batch\n", + "Epoch 4/20 Iteration 587/3560 Training loss: 2.3372 0.0495 sec/batch\n", + "Epoch 4/20 Iteration 588/3560 Training loss: 2.3367 0.0496 sec/batch\n", + "Epoch 4/20 Iteration 589/3560 Training loss: 2.3362 0.0567 sec/batch\n", + "Epoch 4/20 Iteration 590/3560 Training loss: 2.3361 0.0486 sec/batch\n", + "Epoch 
4/20 Iteration 591/3560 Training loss: 2.3359 0.0482 sec/batch\n", + "Epoch 4/20 Iteration 592/3560 Training loss: 2.3353 0.0523 sec/batch\n", + "Epoch 4/20 Iteration 593/3560 Training loss: 2.3348 0.0506 sec/batch\n", + "Epoch 4/20 Iteration 594/3560 Training loss: 2.3349 0.0529 sec/batch\n", + "Epoch 4/20 Iteration 595/3560 Training loss: 2.3344 0.0485 sec/batch\n", + "Epoch 4/20 Iteration 596/3560 Training loss: 2.3346 0.0539 sec/batch\n", + "Epoch 4/20 Iteration 597/3560 Training loss: 2.3346 0.0496 sec/batch\n", + "Epoch 4/20 Iteration 598/3560 Training loss: 2.3344 0.0536 sec/batch\n", + "Epoch 4/20 Iteration 599/3560 Training loss: 2.3339 0.0498 sec/batch\n", + "Epoch 4/20 Iteration 600/3560 Training loss: 2.3339 0.0570 sec/batch\n", + "Epoch 4/20 Iteration 601/3560 Training loss: 2.3337 0.0527 sec/batch\n", + "Epoch 4/20 Iteration 602/3560 Training loss: 2.3331 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 603/3560 Training loss: 2.3326 0.0510 sec/batch\n", + "Epoch 4/20 Iteration 604/3560 Training loss: 2.3325 0.0496 sec/batch\n", + "Epoch 4/20 Iteration 605/3560 Training loss: 2.3322 0.0528 sec/batch\n", + "Epoch 4/20 Iteration 606/3560 Training loss: 2.3322 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 607/3560 Training loss: 2.3320 0.0483 sec/batch\n", + "Epoch 4/20 Iteration 608/3560 Training loss: 2.3316 0.0553 sec/batch\n", + "Epoch 4/20 Iteration 609/3560 Training loss: 2.3314 0.0540 sec/batch\n", + "Epoch 4/20 Iteration 610/3560 Training loss: 2.3317 0.0483 sec/batch\n", + "Epoch 4/20 Iteration 611/3560 Training loss: 2.3314 0.0488 sec/batch\n", + "Epoch 4/20 Iteration 612/3560 Training loss: 2.3314 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 613/3560 Training loss: 2.3309 0.0537 sec/batch\n", + "Epoch 4/20 Iteration 614/3560 Training loss: 2.3306 0.0492 sec/batch\n", + "Epoch 4/20 Iteration 615/3560 Training loss: 2.3302 0.0483 sec/batch\n", + "Epoch 4/20 Iteration 616/3560 Training loss: 2.3301 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 617/3560 
Training loss: 2.3297 0.0495 sec/batch\n", + "Epoch 4/20 Iteration 618/3560 Training loss: 2.3293 0.0527 sec/batch\n", + "Epoch 4/20 Iteration 619/3560 Training loss: 2.3285 0.0496 sec/batch\n", + "Epoch 4/20 Iteration 620/3560 Training loss: 2.3281 0.0561 sec/batch\n", + "Epoch 4/20 Iteration 621/3560 Training loss: 2.3278 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 622/3560 Training loss: 2.3274 0.0502 sec/batch\n", + "Epoch 4/20 Iteration 623/3560 Training loss: 2.3270 0.0507 sec/batch\n", + "Epoch 4/20 Iteration 624/3560 Training loss: 2.3268 0.0519 sec/batch\n", + "Epoch 4/20 Iteration 625/3560 Training loss: 2.3265 0.0493 sec/batch\n", + "Epoch 4/20 Iteration 626/3560 Training loss: 2.3263 0.0510 sec/batch\n", + "Epoch 4/20 Iteration 627/3560 Training loss: 2.3259 0.0493 sec/batch\n", + "Epoch 4/20 Iteration 628/3560 Training loss: 2.3255 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 629/3560 Training loss: 2.3251 0.0554 sec/batch\n", + "Epoch 4/20 Iteration 630/3560 Training loss: 2.3247 0.0486 sec/batch\n", + "Epoch 4/20 Iteration 631/3560 Training loss: 2.3245 0.0501 sec/batch\n", + "Epoch 4/20 Iteration 632/3560 Training loss: 2.3242 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 633/3560 Training loss: 2.3237 0.0516 sec/batch\n", + "Epoch 4/20 Iteration 634/3560 Training loss: 2.3233 0.0498 sec/batch\n", + "Epoch 4/20 Iteration 635/3560 Training loss: 2.3233 0.0538 sec/batch\n", + "Epoch 4/20 Iteration 636/3560 Training loss: 2.3230 0.0517 sec/batch\n", + "Epoch 4/20 Iteration 637/3560 Training loss: 2.3225 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 638/3560 Training loss: 2.3222 0.0515 sec/batch\n", + "Epoch 4/20 Iteration 639/3560 Training loss: 2.3220 0.0486 sec/batch\n", + "Epoch 4/20 Iteration 640/3560 Training loss: 2.3217 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 641/3560 Training loss: 2.3214 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 642/3560 Training loss: 2.3214 0.0534 sec/batch\n", + "Epoch 4/20 Iteration 643/3560 Training loss: 2.3213 
0.0522 sec/batch\n", + "Epoch 4/20 Iteration 644/3560 Training loss: 2.3208 0.0523 sec/batch\n", + "Epoch 4/20 Iteration 645/3560 Training loss: 2.3206 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 646/3560 Training loss: 2.3205 0.0519 sec/batch\n", + "Epoch 4/20 Iteration 647/3560 Training loss: 2.3202 0.0489 sec/batch\n", + "Epoch 4/20 Iteration 648/3560 Training loss: 2.3199 0.0503 sec/batch\n", + "Epoch 4/20 Iteration 649/3560 Training loss: 2.3196 0.0508 sec/batch\n", + "Epoch 4/20 Iteration 650/3560 Training loss: 2.3191 0.0567 sec/batch\n", + "Epoch 4/20 Iteration 651/3560 Training loss: 2.3189 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 652/3560 Training loss: 2.3187 0.0561 sec/batch\n", + "Epoch 4/20 Iteration 653/3560 Training loss: 2.3186 0.0493 sec/batch\n", + "Epoch 4/20 Iteration 654/3560 Training loss: 2.3184 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 655/3560 Training loss: 2.3184 0.0510 sec/batch\n", + "Epoch 4/20 Iteration 656/3560 Training loss: 2.3181 0.0529 sec/batch\n", + "Epoch 4/20 Iteration 657/3560 Training loss: 2.3178 0.0565 sec/batch\n", + "Epoch 4/20 Iteration 658/3560 Training loss: 2.3177 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 659/3560 Training loss: 2.3174 0.0494 sec/batch\n", + "Epoch 4/20 Iteration 660/3560 Training loss: 2.3170 0.0537 sec/batch\n", + "Epoch 4/20 Iteration 661/3560 Training loss: 2.3169 0.0588 sec/batch\n", + "Epoch 4/20 Iteration 662/3560 Training loss: 2.3168 0.0492 sec/batch\n", + "Epoch 4/20 Iteration 663/3560 Training loss: 2.3167 0.0489 sec/batch\n", + "Epoch 4/20 Iteration 664/3560 Training loss: 2.3165 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 665/3560 Training loss: 2.3163 0.0489 sec/batch\n", + "Epoch 4/20 Iteration 666/3560 Training loss: 2.3159 0.0536 sec/batch\n", + "Epoch 4/20 Iteration 667/3560 Training loss: 2.3158 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 668/3560 Training loss: 2.3157 0.0551 sec/batch\n", + "Epoch 4/20 Iteration 669/3560 Training loss: 2.3154 0.0504 sec/batch\n", + 
"Epoch 4/20 Iteration 670/3560 Training loss: 2.3153 0.0505 sec/batch\n", + "Epoch 4/20 Iteration 671/3560 Training loss: 2.3152 0.0507 sec/batch\n", + "Epoch 4/20 Iteration 672/3560 Training loss: 2.3150 0.0492 sec/batch\n", + "Epoch 4/20 Iteration 673/3560 Training loss: 2.3150 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 674/3560 Training loss: 2.3148 0.0546 sec/batch\n", + "Epoch 4/20 Iteration 675/3560 Training loss: 2.3148 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 676/3560 Training loss: 2.3146 0.0514 sec/batch\n", + "Epoch 4/20 Iteration 677/3560 Training loss: 2.3144 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 678/3560 Training loss: 2.3142 0.0492 sec/batch\n", + "Epoch 4/20 Iteration 679/3560 Training loss: 2.3140 0.0486 sec/batch\n", + "Epoch 4/20 Iteration 680/3560 Training loss: 2.3141 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 681/3560 Training loss: 2.3140 0.0499 sec/batch\n", + "Epoch 4/20 Iteration 682/3560 Training loss: 2.3140 0.0504 sec/batch\n", + "Epoch 4/20 Iteration 683/3560 Training loss: 2.3138 0.0542 sec/batch\n", + "Epoch 4/20 Iteration 684/3560 Training loss: 2.3134 0.0508 sec/batch\n", + "Epoch 4/20 Iteration 685/3560 Training loss: 2.3134 0.0594 sec/batch\n", + "Epoch 4/20 Iteration 686/3560 Training loss: 2.3135 0.0502 sec/batch\n", + "Epoch 4/20 Iteration 687/3560 Training loss: 2.3134 0.0489 sec/batch\n", + "Epoch 4/20 Iteration 688/3560 Training loss: 2.3133 0.0494 sec/batch\n", + "Epoch 4/20 Iteration 689/3560 Training loss: 2.3131 0.0562 sec/batch\n", + "Epoch 4/20 Iteration 690/3560 Training loss: 2.3129 0.0575 sec/batch\n", + "Epoch 4/20 Iteration 691/3560 Training loss: 2.3127 0.0492 sec/batch\n", + "Epoch 4/20 Iteration 692/3560 Training loss: 2.3124 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 693/3560 Training loss: 2.3121 0.0500 sec/batch\n", + "Epoch 4/20 Iteration 694/3560 Training loss: 2.3121 0.0511 sec/batch\n", + "Epoch 4/20 Iteration 695/3560 Training loss: 2.3120 0.0554 sec/batch\n", + "Epoch 4/20 Iteration 
696/3560 Training loss: 2.3118 0.0521 sec/batch\n", + "Epoch 4/20 Iteration 697/3560 Training loss: 2.3116 0.0539 sec/batch\n", + "Epoch 4/20 Iteration 698/3560 Training loss: 2.3114 0.0494 sec/batch\n", + "Epoch 4/20 Iteration 699/3560 Training loss: 2.3113 0.0503 sec/batch\n", + "Epoch 4/20 Iteration 700/3560 Training loss: 2.3111 0.0545 sec/batch\n", + "Epoch 4/20 Iteration 701/3560 Training loss: 2.3110 0.0496 sec/batch\n", + "Epoch 4/20 Iteration 702/3560 Training loss: 2.3109 0.0523 sec/batch\n", + "Epoch 4/20 Iteration 703/3560 Training loss: 2.3108 0.0504 sec/batch\n", + "Epoch 4/20 Iteration 704/3560 Training loss: 2.3105 0.0527 sec/batch\n", + "Epoch 4/20 Iteration 705/3560 Training loss: 2.3103 0.0539 sec/batch\n", + "Epoch 4/20 Iteration 706/3560 Training loss: 2.3102 0.0503 sec/batch\n", + "Epoch 4/20 Iteration 707/3560 Training loss: 2.3102 0.0500 sec/batch\n", + "Epoch 4/20 Iteration 708/3560 Training loss: 2.3101 0.0502 sec/batch\n", + "Epoch 4/20 Iteration 709/3560 Training loss: 2.3101 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 710/3560 Training loss: 2.3101 0.0548 sec/batch\n", + "Epoch 4/20 Iteration 711/3560 Training loss: 2.3100 0.0496 sec/batch\n", + "Epoch 4/20 Iteration 712/3560 Training loss: 2.3098 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 2.3466 0.0568 sec/batch\n", + "Epoch 5/20 Iteration 714/3560 Training loss: 2.3025 0.0527 sec/batch\n", + "Epoch 5/20 Iteration 715/3560 Training loss: 2.2858 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 716/3560 Training loss: 2.2805 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 717/3560 Training loss: 2.2773 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 718/3560 Training loss: 2.2741 0.0501 sec/batch\n", + "Epoch 5/20 Iteration 719/3560 Training loss: 2.2743 0.0483 sec/batch\n", + "Epoch 5/20 Iteration 720/3560 Training loss: 2.2759 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 721/3560 Training loss: 2.2777 0.0539 sec/batch\n", + "Epoch 5/20 Iteration 722/3560 Training loss: 
2.2769 0.0495 sec/batch\n", + "Epoch 5/20 Iteration 723/3560 Training loss: 2.2755 0.0536 sec/batch\n", + "Epoch 5/20 Iteration 724/3560 Training loss: 2.2749 0.0499 sec/batch\n", + "Epoch 5/20 Iteration 725/3560 Training loss: 2.2747 0.0609 sec/batch\n", + "Epoch 5/20 Iteration 726/3560 Training loss: 2.2771 0.0492 sec/batch\n", + "Epoch 5/20 Iteration 727/3560 Training loss: 2.2766 0.0494 sec/batch\n", + "Epoch 5/20 Iteration 728/3560 Training loss: 2.2761 0.0557 sec/batch\n", + "Epoch 5/20 Iteration 729/3560 Training loss: 2.2757 0.0503 sec/batch\n", + "Epoch 5/20 Iteration 730/3560 Training loss: 2.2776 0.0543 sec/batch\n", + "Epoch 5/20 Iteration 731/3560 Training loss: 2.2776 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 732/3560 Training loss: 2.2767 0.0575 sec/batch\n", + "Epoch 5/20 Iteration 733/3560 Training loss: 2.2760 0.0514 sec/batch\n", + "Epoch 5/20 Iteration 734/3560 Training loss: 2.2774 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 735/3560 Training loss: 2.2767 0.0494 sec/batch\n", + "Epoch 5/20 Iteration 736/3560 Training loss: 2.2761 0.0499 sec/batch\n", + "Epoch 5/20 Iteration 737/3560 Training loss: 2.2753 0.0520 sec/batch\n", + "Epoch 5/20 Iteration 738/3560 Training loss: 2.2746 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 739/3560 Training loss: 2.2741 0.0510 sec/batch\n", + "Epoch 5/20 Iteration 740/3560 Training loss: 2.2741 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 741/3560 Training loss: 2.2748 0.0504 sec/batch\n", + "Epoch 5/20 Iteration 742/3560 Training loss: 2.2748 0.0492 sec/batch\n", + "Epoch 5/20 Iteration 743/3560 Training loss: 2.2749 0.0499 sec/batch\n", + "Epoch 5/20 Iteration 744/3560 Training loss: 2.2741 0.0574 sec/batch\n", + "Epoch 5/20 Iteration 745/3560 Training loss: 2.2736 0.0535 sec/batch\n", + "Epoch 5/20 Iteration 746/3560 Training loss: 2.2742 0.0492 sec/batch\n", + "Epoch 5/20 Iteration 747/3560 Training loss: 2.2738 0.0482 sec/batch\n", + "Epoch 5/20 Iteration 748/3560 Training loss: 2.2734 0.0522 
sec/batch\n", + "Epoch 5/20 Iteration 749/3560 Training loss: 2.2729 0.0512 sec/batch\n", + "Epoch 5/20 Iteration 750/3560 Training loss: 2.2717 0.0550 sec/batch\n", + "Epoch 5/20 Iteration 751/3560 Training loss: 2.2711 0.0486 sec/batch\n", + "Epoch 5/20 Iteration 752/3560 Training loss: 2.2704 0.0554 sec/batch\n", + "Epoch 5/20 Iteration 753/3560 Training loss: 2.2696 0.0546 sec/batch\n", + "Epoch 5/20 Iteration 754/3560 Training loss: 2.2691 0.0503 sec/batch\n", + "Epoch 5/20 Iteration 755/3560 Training loss: 2.2685 0.0489 sec/batch\n", + "Epoch 5/20 Iteration 756/3560 Training loss: 2.2677 0.0536 sec/batch\n", + "Epoch 5/20 Iteration 757/3560 Training loss: 2.2674 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 758/3560 Training loss: 2.2662 0.0486 sec/batch\n", + "Epoch 5/20 Iteration 759/3560 Training loss: 2.2664 0.0557 sec/batch\n", + "Epoch 5/20 Iteration 760/3560 Training loss: 2.2659 0.0519 sec/batch\n", + "Epoch 5/20 Iteration 761/3560 Training loss: 2.2656 0.0510 sec/batch\n", + "Epoch 5/20 Iteration 762/3560 Training loss: 2.2659 0.0488 sec/batch\n", + "Epoch 5/20 Iteration 763/3560 Training loss: 2.2652 0.0499 sec/batch\n", + "Epoch 5/20 Iteration 764/3560 Training loss: 2.2654 0.0539 sec/batch\n", + "Epoch 5/20 Iteration 765/3560 Training loss: 2.2648 0.0509 sec/batch\n", + "Epoch 5/20 Iteration 766/3560 Training loss: 2.2644 0.0646 sec/batch\n", + "Epoch 5/20 Iteration 767/3560 Training loss: 2.2641 0.0516 sec/batch\n", + "Epoch 5/20 Iteration 768/3560 Training loss: 2.2641 0.0492 sec/batch\n", + "Epoch 5/20 Iteration 769/3560 Training loss: 2.2639 0.0494 sec/batch\n", + "Epoch 5/20 Iteration 770/3560 Training loss: 2.2633 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 771/3560 Training loss: 2.2628 0.0517 sec/batch\n", + "Epoch 5/20 Iteration 772/3560 Training loss: 2.2630 0.0522 sec/batch\n", + "Epoch 5/20 Iteration 773/3560 Training loss: 2.2627 0.0505 sec/batch\n", + "Epoch 5/20 Iteration 774/3560 Training loss: 2.2627 0.0553 sec/batch\n", + "Epoch 
5/20 Iteration 775/3560 Training loss: 2.2629 0.0509 sec/batch\n", + "Epoch 5/20 Iteration 776/3560 Training loss: 2.2627 0.0488 sec/batch\n", + "Epoch 5/20 Iteration 777/3560 Training loss: 2.2624 0.0534 sec/batch\n", + "Epoch 5/20 Iteration 778/3560 Training loss: 2.2624 0.0500 sec/batch\n", + "Epoch 5/20 Iteration 779/3560 Training loss: 2.2622 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 780/3560 Training loss: 2.2617 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 781/3560 Training loss: 2.2613 0.0522 sec/batch\n", + "Epoch 5/20 Iteration 782/3560 Training loss: 2.2614 0.0559 sec/batch\n", + "Epoch 5/20 Iteration 783/3560 Training loss: 2.2613 0.0494 sec/batch\n", + "Epoch 5/20 Iteration 784/3560 Training loss: 2.2613 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 785/3560 Training loss: 2.2614 0.0506 sec/batch\n", + "Epoch 5/20 Iteration 786/3560 Training loss: 2.2610 0.0494 sec/batch\n", + "Epoch 5/20 Iteration 787/3560 Training loss: 2.2609 0.0490 sec/batch\n", + "Epoch 5/20 Iteration 788/3560 Training loss: 2.2611 0.0492 sec/batch\n", + "Epoch 5/20 Iteration 789/3560 Training loss: 2.2608 0.0555 sec/batch\n", + "Epoch 5/20 Iteration 790/3560 Training loss: 2.2608 0.0500 sec/batch\n", + "Epoch 5/20 Iteration 791/3560 Training loss: 2.2603 0.0555 sec/batch\n", + "Epoch 5/20 Iteration 792/3560 Training loss: 2.2600 0.0550 sec/batch\n", + "Epoch 5/20 Iteration 793/3560 Training loss: 2.2595 0.0523 sec/batch\n", + "Epoch 5/20 Iteration 794/3560 Training loss: 2.2594 0.0547 sec/batch\n", + "Epoch 5/20 Iteration 795/3560 Training loss: 2.2590 0.0547 sec/batch\n", + "Epoch 5/20 Iteration 796/3560 Training loss: 2.2586 0.0520 sec/batch\n", + "Epoch 5/20 Iteration 797/3560 Training loss: 2.2580 0.0495 sec/batch\n", + "Epoch 5/20 Iteration 798/3560 Training loss: 2.2576 0.0549 sec/batch\n", + "Epoch 5/20 Iteration 799/3560 Training loss: 2.2574 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 800/3560 Training loss: 2.2571 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 801/3560 
Training loss: 2.2567 0.0572 sec/batch\n", + "Epoch 5/20 Iteration 802/3560 Training loss: 2.2567 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 803/3560 Training loss: 2.2565 0.0494 sec/batch\n", + "Epoch 5/20 Iteration 804/3560 Training loss: 2.2564 0.0505 sec/batch\n", + "Epoch 5/20 Iteration 805/3560 Training loss: 2.2560 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 806/3560 Training loss: 2.2556 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 807/3560 Training loss: 2.2552 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 808/3560 Training loss: 2.2548 0.0506 sec/batch\n", + "Epoch 5/20 Iteration 809/3560 Training loss: 2.2547 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 810/3560 Training loss: 2.2544 0.0562 sec/batch\n", + "Epoch 5/20 Iteration 811/3560 Training loss: 2.2541 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 812/3560 Training loss: 2.2536 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 813/3560 Training loss: 2.2536 0.0543 sec/batch\n", + "Epoch 5/20 Iteration 814/3560 Training loss: 2.2535 0.0587 sec/batch\n", + "Epoch 5/20 Iteration 815/3560 Training loss: 2.2531 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 816/3560 Training loss: 2.2528 0.0540 sec/batch\n", + "Epoch 5/20 Iteration 817/3560 Training loss: 2.2526 0.0522 sec/batch\n", + "Epoch 5/20 Iteration 818/3560 Training loss: 2.2524 0.0523 sec/batch\n", + "Epoch 5/20 Iteration 819/3560 Training loss: 2.2522 0.0492 sec/batch\n", + "Epoch 5/20 Iteration 820/3560 Training loss: 2.2521 0.0520 sec/batch\n", + "Epoch 5/20 Iteration 821/3560 Training loss: 2.2520 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 822/3560 Training loss: 2.2516 0.0540 sec/batch\n", + "Epoch 5/20 Iteration 823/3560 Training loss: 2.2515 0.0539 sec/batch\n", + "Epoch 5/20 Iteration 824/3560 Training loss: 2.2515 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 825/3560 Training loss: 2.2512 0.0517 sec/batch\n", + "Epoch 5/20 Iteration 826/3560 Training loss: 2.2510 0.0527 sec/batch\n", + "Epoch 5/20 Iteration 827/3560 Training loss: 2.2508 
0.0495 sec/batch\n", + "Epoch 5/20 Iteration 828/3560 Training loss: 2.2504 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 829/3560 Training loss: 2.2502 0.0584 sec/batch\n", + "Epoch 5/20 Iteration 830/3560 Training loss: 2.2500 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 831/3560 Training loss: 2.2501 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 832/3560 Training loss: 2.2500 0.0503 sec/batch\n", + "Epoch 5/20 Iteration 833/3560 Training loss: 2.2500 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 834/3560 Training loss: 2.2498 0.0536 sec/batch\n", + "Epoch 5/20 Iteration 835/3560 Training loss: 2.2495 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 836/3560 Training loss: 2.2495 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 837/3560 Training loss: 2.2493 0.0536 sec/batch\n", + "Epoch 5/20 Iteration 838/3560 Training loss: 2.2490 0.0541 sec/batch\n", + "Epoch 5/20 Iteration 839/3560 Training loss: 2.2490 0.0492 sec/batch\n", + "Epoch 5/20 Iteration 840/3560 Training loss: 2.2489 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 841/3560 Training loss: 2.2489 0.0586 sec/batch\n", + "Epoch 5/20 Iteration 842/3560 Training loss: 2.2488 0.0554 sec/batch\n", + "Epoch 5/20 Iteration 843/3560 Training loss: 2.2486 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 844/3560 Training loss: 2.2482 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 845/3560 Training loss: 2.2481 0.0574 sec/batch\n", + "Epoch 5/20 Iteration 846/3560 Training loss: 2.2481 0.0526 sec/batch\n", + "Epoch 5/20 Iteration 847/3560 Training loss: 2.2479 0.0573 sec/batch\n", + "Epoch 5/20 Iteration 848/3560 Training loss: 2.2478 0.0514 sec/batch\n", + "Epoch 5/20 Iteration 849/3560 Training loss: 2.2477 0.0520 sec/batch\n", + "Epoch 5/20 Iteration 850/3560 Training loss: 2.2477 0.0523 sec/batch\n", + "Epoch 5/20 Iteration 851/3560 Training loss: 2.2478 0.0506 sec/batch\n", + "Epoch 5/20 Iteration 852/3560 Training loss: 2.2476 0.0504 sec/batch\n", + "Epoch 5/20 Iteration 853/3560 Training loss: 2.2476 0.0545 sec/batch\n", + 
"Epoch 5/20 Iteration 854/3560 Training loss: 2.2474 0.0585 sec/batch\n", + "Epoch 5/20 Iteration 855/3560 Training loss: 2.2473 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 856/3560 Training loss: 2.2472 0.0488 sec/batch\n", + "Epoch 5/20 Iteration 857/3560 Training loss: 2.2471 0.0530 sec/batch\n", + "Epoch 5/20 Iteration 858/3560 Training loss: 2.2471 0.0504 sec/batch\n", + "Epoch 5/20 Iteration 859/3560 Training loss: 2.2470 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 860/3560 Training loss: 2.2471 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 861/3560 Training loss: 2.2469 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 862/3560 Training loss: 2.2467 0.0550 sec/batch\n", + "Epoch 5/20 Iteration 863/3560 Training loss: 2.2466 0.0504 sec/batch\n", + "Epoch 5/20 Iteration 864/3560 Training loss: 2.2467 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 865/3560 Training loss: 2.2467 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 866/3560 Training loss: 2.2467 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 867/3560 Training loss: 2.2465 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 868/3560 Training loss: 2.2465 0.0495 sec/batch\n", + "Epoch 5/20 Iteration 869/3560 Training loss: 2.2463 0.0517 sec/batch\n", + "Epoch 5/20 Iteration 870/3560 Training loss: 2.2461 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 871/3560 Training loss: 2.2459 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 872/3560 Training loss: 2.2460 0.0495 sec/batch\n", + "Epoch 5/20 Iteration 873/3560 Training loss: 2.2460 0.0500 sec/batch\n", + "Epoch 5/20 Iteration 874/3560 Training loss: 2.2457 0.0552 sec/batch\n", + "Epoch 5/20 Iteration 875/3560 Training loss: 2.2456 0.0486 sec/batch\n", + "Epoch 5/20 Iteration 876/3560 Training loss: 2.2455 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 877/3560 Training loss: 2.2454 0.0510 sec/batch\n", + "Epoch 5/20 Iteration 878/3560 Training loss: 2.2453 0.0518 sec/batch\n", + "Epoch 5/20 Iteration 879/3560 Training loss: 2.2452 0.0517 sec/batch\n", + "Epoch 5/20 Iteration 
880/3560 Training loss: 2.2452 0.0548 sec/batch\n", + "Epoch 5/20 Iteration 881/3560 Training loss: 2.2452 0.0598 sec/batch\n", + "Epoch 5/20 Iteration 882/3560 Training loss: 2.2450 0.0579 sec/batch\n", + "Epoch 5/20 Iteration 883/3560 Training loss: 2.2448 0.0519 sec/batch\n", + "Epoch 5/20 Iteration 884/3560 Training loss: 2.2447 0.0492 sec/batch\n", + "Epoch 5/20 Iteration 885/3560 Training loss: 2.2447 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 886/3560 Training loss: 2.2446 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 887/3560 Training loss: 2.2446 0.0505 sec/batch\n", + "Epoch 5/20 Iteration 888/3560 Training loss: 2.2445 0.0560 sec/batch\n", + "Epoch 5/20 Iteration 889/3560 Training loss: 2.2443 0.0553 sec/batch\n", + "Epoch 5/20 Iteration 890/3560 Training loss: 2.2441 0.0501 sec/batch\n", + "Epoch 6/20 Iteration 891/3560 Training loss: 2.2940 0.0495 sec/batch\n", + "Epoch 6/20 Iteration 892/3560 Training loss: 2.2406 0.0490 sec/batch\n", + "Epoch 6/20 Iteration 893/3560 Training loss: 2.2274 0.0491 sec/batch\n", + "Epoch 6/20 Iteration 894/3560 Training loss: 2.2237 0.0530 sec/batch\n", + "Epoch 6/20 Iteration 895/3560 Training loss: 2.2213 0.0501 sec/batch\n", + "Epoch 6/20 Iteration 896/3560 Training loss: 2.2166 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 897/3560 Training loss: 2.2168 0.0571 sec/batch\n", + "Epoch 6/20 Iteration 898/3560 Training loss: 2.2179 0.0549 sec/batch\n", + "Epoch 6/20 Iteration 899/3560 Training loss: 2.2202 0.0499 sec/batch\n", + "Epoch 6/20 Iteration 900/3560 Training loss: 2.2202 0.0492 sec/batch\n", + "Epoch 6/20 Iteration 901/3560 Training loss: 2.2192 0.0486 sec/batch\n", + "Epoch 6/20 Iteration 902/3560 Training loss: 2.2176 0.0496 sec/batch\n", + "Epoch 6/20 Iteration 903/3560 Training loss: 2.2180 0.0487 sec/batch\n", + "Epoch 6/20 Iteration 904/3560 Training loss: 2.2201 0.0492 sec/batch\n", + "Epoch 6/20 Iteration 905/3560 Training loss: 2.2194 0.0509 sec/batch\n", + "Epoch 6/20 Iteration 906/3560 Training loss: 
2.2185 0.0545 sec/batch\n", + "Epoch 6/20 Iteration 907/3560 Training loss: 2.2185 0.0502 sec/batch\n", + "Epoch 6/20 Iteration 908/3560 Training loss: 2.2206 0.0495 sec/batch\n", + "Epoch 6/20 Iteration 909/3560 Training loss: 2.2207 0.0495 sec/batch\n", + "Epoch 6/20 Iteration 910/3560 Training loss: 2.2202 0.0530 sec/batch\n", + "Epoch 6/20 Iteration 911/3560 Training loss: 2.2200 0.0552 sec/batch\n", + "Epoch 6/20 Iteration 912/3560 Training loss: 2.2206 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 913/3560 Training loss: 2.2207 0.0493 sec/batch\n", + "Epoch 6/20 Iteration 914/3560 Training loss: 2.2203 0.0501 sec/batch\n", + "Epoch 6/20 Iteration 915/3560 Training loss: 2.2199 0.0526 sec/batch\n", + "Epoch 6/20 Iteration 916/3560 Training loss: 2.2192 0.0553 sec/batch\n", + "Epoch 6/20 Iteration 917/3560 Training loss: 2.2182 0.0507 sec/batch\n", + "Epoch 6/20 Iteration 918/3560 Training loss: 2.2183 0.0515 sec/batch\n", + "Epoch 6/20 Iteration 919/3560 Training loss: 2.2189 0.0502 sec/batch\n", + "Epoch 6/20 Iteration 920/3560 Training loss: 2.2190 0.0500 sec/batch\n", + "Epoch 6/20 Iteration 921/3560 Training loss: 2.2196 0.0496 sec/batch\n", + "Epoch 6/20 Iteration 922/3560 Training loss: 2.2189 0.0551 sec/batch\n", + "Epoch 6/20 Iteration 923/3560 Training loss: 2.2187 0.0596 sec/batch\n", + "Epoch 6/20 Iteration 924/3560 Training loss: 2.2192 0.0489 sec/batch\n", + "Epoch 6/20 Iteration 925/3560 Training loss: 2.2185 0.0501 sec/batch\n", + "Epoch 6/20 Iteration 926/3560 Training loss: 2.2185 0.0547 sec/batch\n", + "Epoch 6/20 Iteration 927/3560 Training loss: 2.2181 0.0503 sec/batch\n", + "Epoch 6/20 Iteration 928/3560 Training loss: 2.2172 0.0540 sec/batch\n", + "Epoch 6/20 Iteration 929/3560 Training loss: 2.2163 0.0590 sec/batch\n", + "Epoch 6/20 Iteration 930/3560 Training loss: 2.2156 0.0502 sec/batch\n", + "Epoch 6/20 Iteration 931/3560 Training loss: 2.2152 0.0534 sec/batch\n", + "Epoch 6/20 Iteration 932/3560 Training loss: 2.2147 0.0496 
sec/batch\n", + "Epoch 6/20 Iteration 933/3560 Training loss: 2.2140 0.0492 sec/batch\n", + "Epoch 6/20 Iteration 934/3560 Training loss: 2.2133 0.0508 sec/batch\n", + "Epoch 6/20 Iteration 935/3560 Training loss: 2.2131 0.0611 sec/batch\n", + "Epoch 6/20 Iteration 936/3560 Training loss: 2.2117 0.0499 sec/batch\n", + "Epoch 6/20 Iteration 937/3560 Training loss: 2.2119 0.0497 sec/batch\n", + "Epoch 6/20 Iteration 938/3560 Training loss: 2.2114 0.0520 sec/batch\n", + "Epoch 6/20 Iteration 939/3560 Training loss: 2.2109 0.0564 sec/batch\n", + "Epoch 6/20 Iteration 940/3560 Training loss: 2.2113 0.0511 sec/batch\n", + "Epoch 6/20 Iteration 941/3560 Training loss: 2.2107 0.0492 sec/batch\n", + "Epoch 6/20 Iteration 942/3560 Training loss: 2.2111 0.0522 sec/batch\n", + "Epoch 6/20 Iteration 943/3560 Training loss: 2.2108 0.0528 sec/batch\n", + "Epoch 6/20 Iteration 944/3560 Training loss: 2.2104 0.0515 sec/batch\n", + "Epoch 6/20 Iteration 945/3560 Training loss: 2.2101 0.0512 sec/batch\n", + "Epoch 6/20 Iteration 946/3560 Training loss: 2.2100 0.0507 sec/batch\n", + "Epoch 6/20 Iteration 947/3560 Training loss: 2.2100 0.0537 sec/batch\n", + "Epoch 6/20 Iteration 948/3560 Training loss: 2.2095 0.0506 sec/batch\n", + "Epoch 6/20 Iteration 949/3560 Training loss: 2.2092 0.0540 sec/batch\n", + "Epoch 6/20 Iteration 950/3560 Training loss: 2.2094 0.0545 sec/batch\n", + "Epoch 6/20 Iteration 951/3560 Training loss: 2.2091 0.0506 sec/batch\n", + "Epoch 6/20 Iteration 952/3560 Training loss: 2.2092 0.0499 sec/batch\n", + "Epoch 6/20 Iteration 953/3560 Training loss: 2.2093 0.0494 sec/batch\n", + "Epoch 6/20 Iteration 954/3560 Training loss: 2.2091 0.0494 sec/batch\n", + "Epoch 6/20 Iteration 955/3560 Training loss: 2.2088 0.0538 sec/batch\n", + "Epoch 6/20 Iteration 956/3560 Training loss: 2.2090 0.0502 sec/batch\n", + "Epoch 6/20 Iteration 957/3560 Training loss: 2.2090 0.0490 sec/batch\n", + "Epoch 6/20 Iteration 958/3560 Training loss: 2.2083 0.0513 sec/batch\n", + "Epoch 
6/20 Iteration 959/3560 Training loss: 2.2080 0.0552 sec/batch\n", + "Epoch 6/20 Iteration 960/3560 Training loss: 2.2080 0.0519 sec/batch\n", + "Epoch 6/20 Iteration 961/3560 Training loss: 2.2080 0.0537 sec/batch\n", + "Epoch 6/20 Iteration 962/3560 Training loss: 2.2079 0.0550 sec/batch\n", + "Epoch 6/20 Iteration 963/3560 Training loss: 2.2080 0.0579 sec/batch\n", + "Epoch 6/20 Iteration 964/3560 Training loss: 2.2076 0.0504 sec/batch\n", + "Epoch 6/20 Iteration 965/3560 Training loss: 2.2073 0.0502 sec/batch\n", + "Epoch 6/20 Iteration 966/3560 Training loss: 2.2076 0.0491 sec/batch\n", + "Epoch 6/20 Iteration 967/3560 Training loss: 2.2075 0.0634 sec/batch\n", + "Epoch 6/20 Iteration 968/3560 Training loss: 2.2077 0.0512 sec/batch\n", + "Epoch 6/20 Iteration 969/3560 Training loss: 2.2071 0.0557 sec/batch\n", + "Epoch 6/20 Iteration 970/3560 Training loss: 2.2070 0.0606 sec/batch\n", + "Epoch 6/20 Iteration 971/3560 Training loss: 2.2067 0.0515 sec/batch\n", + "Epoch 6/20 Iteration 972/3560 Training loss: 2.2066 0.0533 sec/batch\n", + "Epoch 6/20 Iteration 973/3560 Training loss: 2.2062 0.0570 sec/batch\n", + "Epoch 6/20 Iteration 974/3560 Training loss: 2.2058 0.0571 sec/batch\n", + "Epoch 6/20 Iteration 975/3560 Training loss: 2.2051 0.0571 sec/batch\n", + "Epoch 6/20 Iteration 976/3560 Training loss: 2.2047 0.0504 sec/batch\n", + "Epoch 6/20 Iteration 977/3560 Training loss: 2.2047 0.0546 sec/batch\n", + "Epoch 6/20 Iteration 978/3560 Training loss: 2.2044 0.0508 sec/batch\n", + "Epoch 6/20 Iteration 979/3560 Training loss: 2.2040 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 980/3560 Training loss: 2.2041 0.0488 sec/batch\n", + "Epoch 6/20 Iteration 981/3560 Training loss: 2.2038 0.0485 sec/batch\n", + "Epoch 6/20 Iteration 982/3560 Training loss: 2.2037 0.0517 sec/batch\n", + "Epoch 6/20 Iteration 983/3560 Training loss: 2.2032 0.0513 sec/batch\n", + "Epoch 6/20 Iteration 984/3560 Training loss: 2.2029 0.0559 sec/batch\n", + "Epoch 6/20 Iteration 985/3560 
Training loss: 2.2025 0.0499 sec/batch\n", + "[... repetitive training-log output elided: Epochs 6 through 9 continue in the same format, with training loss decreasing steadily from ~2.20 to ~2.09 at roughly 0.05 sec/batch ...]\n", + "Epoch 9/20 Iteration 1477/3560 Training loss: 2.0909 
0.0504 sec/batch\n", + "Epoch 9/20 Iteration 1478/3560 Training loss: 2.0908 0.0513 sec/batch\n", + "Epoch 9/20 Iteration 1479/3560 Training loss: 2.0905 0.0586 sec/batch\n", + "Epoch 9/20 Iteration 1480/3560 Training loss: 2.0908 0.0503 sec/batch\n", + "Epoch 9/20 Iteration 1481/3560 Training loss: 2.0909 0.0498 sec/batch\n", + "Epoch 9/20 Iteration 1482/3560 Training loss: 2.0905 0.0561 sec/batch\n", + "Epoch 9/20 Iteration 1483/3560 Training loss: 2.0902 0.0562 sec/batch\n", + "Epoch 9/20 Iteration 1484/3560 Training loss: 2.0907 0.0529 sec/batch\n", + "Epoch 9/20 Iteration 1485/3560 Training loss: 2.0906 0.0543 sec/batch\n", + "Epoch 9/20 Iteration 1486/3560 Training loss: 2.0910 0.0610 sec/batch\n", + "Epoch 9/20 Iteration 1487/3560 Training loss: 2.0913 0.0559 sec/batch\n", + "Epoch 9/20 Iteration 1488/3560 Training loss: 2.0915 0.0593 sec/batch\n", + "Epoch 9/20 Iteration 1489/3560 Training loss: 2.0911 0.0493 sec/batch\n", + "Epoch 9/20 Iteration 1490/3560 Training loss: 2.0914 0.0518 sec/batch\n", + "Epoch 9/20 Iteration 1491/3560 Training loss: 2.0914 0.0521 sec/batch\n", + "Epoch 9/20 Iteration 1492/3560 Training loss: 2.0908 0.0537 sec/batch\n", + "Epoch 9/20 Iteration 1493/3560 Training loss: 2.0907 0.0493 sec/batch\n", + "Epoch 9/20 Iteration 1494/3560 Training loss: 2.0905 0.0587 sec/batch\n", + "Epoch 9/20 Iteration 1495/3560 Training loss: 2.0906 0.0561 sec/batch\n", + "Epoch 9/20 Iteration 1496/3560 Training loss: 2.0907 0.0524 sec/batch\n", + "Epoch 9/20 Iteration 1497/3560 Training loss: 2.0909 0.0497 sec/batch\n", + "Epoch 9/20 Iteration 1498/3560 Training loss: 2.0905 0.0501 sec/batch\n", + "Epoch 9/20 Iteration 1499/3560 Training loss: 2.0902 0.0533 sec/batch\n", + "Epoch 9/20 Iteration 1500/3560 Training loss: 2.0906 0.0523 sec/batch\n", + "Epoch 9/20 Iteration 1501/3560 Training loss: 2.0905 0.0501 sec/batch\n", + "Epoch 9/20 Iteration 1502/3560 Training loss: 2.0906 0.0621 sec/batch\n", + "Epoch 9/20 Iteration 1503/3560 Training loss: 
2.0901 0.0532 sec/batch\n", + "Epoch 9/20 Iteration 1504/3560 Training loss: 2.0899 0.0520 sec/batch\n", + "Epoch 9/20 Iteration 1505/3560 Training loss: 2.0894 0.0566 sec/batch\n", + "Epoch 9/20 Iteration 1506/3560 Training loss: 2.0896 0.0515 sec/batch\n", + "Epoch 9/20 Iteration 1507/3560 Training loss: 2.0891 0.0521 sec/batch\n", + "Epoch 9/20 Iteration 1508/3560 Training loss: 2.0889 0.0532 sec/batch\n", + "Epoch 9/20 Iteration 1509/3560 Training loss: 2.0883 0.0505 sec/batch\n", + "Epoch 9/20 Iteration 1510/3560 Training loss: 2.0880 0.0524 sec/batch\n", + "Epoch 9/20 Iteration 1511/3560 Training loss: 2.0878 0.0572 sec/batch\n", + "Epoch 9/20 Iteration 1512/3560 Training loss: 2.0877 0.0517 sec/batch\n", + "Epoch 9/20 Iteration 1513/3560 Training loss: 2.0872 0.0502 sec/batch\n", + "Epoch 9/20 Iteration 1514/3560 Training loss: 2.0873 0.0505 sec/batch\n", + "Epoch 9/20 Iteration 1515/3560 Training loss: 2.0870 0.0523 sec/batch\n", + "Epoch 9/20 Iteration 1516/3560 Training loss: 2.0869 0.0499 sec/batch\n", + "Epoch 9/20 Iteration 1517/3560 Training loss: 2.0863 0.0498 sec/batch\n", + "Epoch 9/20 Iteration 1518/3560 Training loss: 2.0861 0.0521 sec/batch\n", + "Epoch 9/20 Iteration 1519/3560 Training loss: 2.0858 0.0630 sec/batch\n", + "Epoch 9/20 Iteration 1520/3560 Training loss: 2.0857 0.0542 sec/batch\n", + "Epoch 9/20 Iteration 1521/3560 Training loss: 2.0856 0.0537 sec/batch\n", + "Epoch 9/20 Iteration 1522/3560 Training loss: 2.0852 0.0540 sec/batch\n", + "Epoch 9/20 Iteration 1523/3560 Training loss: 2.0849 0.0508 sec/batch\n", + "Epoch 9/20 Iteration 1524/3560 Training loss: 2.0845 0.0495 sec/batch\n", + "Epoch 9/20 Iteration 1525/3560 Training loss: 2.0845 0.0499 sec/batch\n", + "Epoch 9/20 Iteration 1526/3560 Training loss: 2.0843 0.0519 sec/batch\n", + "Epoch 9/20 Iteration 1527/3560 Training loss: 2.0841 0.0545 sec/batch\n", + "Epoch 9/20 Iteration 1528/3560 Training loss: 2.0839 0.0505 sec/batch\n", + "Epoch 9/20 Iteration 1529/3560 Training 
loss: 2.0837 0.0501 sec/batch\n", + "Epoch 9/20 Iteration 1530/3560 Training loss: 2.0836 0.0517 sec/batch\n", + "Epoch 9/20 Iteration 1531/3560 Training loss: 2.0834 0.0560 sec/batch\n", + "Epoch 9/20 Iteration 1532/3560 Training loss: 2.0834 0.0503 sec/batch\n", + "Epoch 9/20 Iteration 1533/3560 Training loss: 2.0834 0.0511 sec/batch\n", + "Epoch 9/20 Iteration 1534/3560 Training loss: 2.0833 0.0588 sec/batch\n", + "Epoch 9/20 Iteration 1535/3560 Training loss: 2.0832 0.0516 sec/batch\n", + "Epoch 9/20 Iteration 1536/3560 Training loss: 2.0831 0.0528 sec/batch\n", + "Epoch 9/20 Iteration 1537/3560 Training loss: 2.0830 0.0494 sec/batch\n", + "Epoch 9/20 Iteration 1538/3560 Training loss: 2.0828 0.0499 sec/batch\n", + "Epoch 9/20 Iteration 1539/3560 Training loss: 2.0826 0.0571 sec/batch\n", + "Epoch 9/20 Iteration 1540/3560 Training loss: 2.0823 0.0500 sec/batch\n", + "Epoch 9/20 Iteration 1541/3560 Training loss: 2.0822 0.0512 sec/batch\n", + "Epoch 9/20 Iteration 1542/3560 Training loss: 2.0822 0.0566 sec/batch\n", + "Epoch 9/20 Iteration 1543/3560 Training loss: 2.0822 0.0554 sec/batch\n", + "Epoch 9/20 Iteration 1544/3560 Training loss: 2.0821 0.0508 sec/batch\n", + "Epoch 9/20 Iteration 1545/3560 Training loss: 2.0822 0.0534 sec/batch\n", + "Epoch 9/20 Iteration 1546/3560 Training loss: 2.0819 0.0517 sec/batch\n", + "Epoch 9/20 Iteration 1547/3560 Training loss: 2.0817 0.0493 sec/batch\n", + "Epoch 9/20 Iteration 1548/3560 Training loss: 2.0818 0.0559 sec/batch\n", + "Epoch 9/20 Iteration 1549/3560 Training loss: 2.0817 0.0513 sec/batch\n", + "Epoch 9/20 Iteration 1550/3560 Training loss: 2.0813 0.0519 sec/batch\n", + "Epoch 9/20 Iteration 1551/3560 Training loss: 2.0813 0.0565 sec/batch\n", + "Epoch 9/20 Iteration 1552/3560 Training loss: 2.0813 0.0513 sec/batch\n", + "Epoch 9/20 Iteration 1553/3560 Training loss: 2.0813 0.0494 sec/batch\n", + "Epoch 9/20 Iteration 1554/3560 Training loss: 2.0812 0.0516 sec/batch\n", + "Epoch 9/20 Iteration 1555/3560 
Training loss: 2.0810 0.0526 sec/batch\n", + "Epoch 9/20 Iteration 1556/3560 Training loss: 2.0807 0.0507 sec/batch\n", + "Epoch 9/20 Iteration 1557/3560 Training loss: 2.0807 0.0544 sec/batch\n", + "Epoch 9/20 Iteration 1558/3560 Training loss: 2.0807 0.0514 sec/batch\n", + "Epoch 9/20 Iteration 1559/3560 Training loss: 2.0806 0.0616 sec/batch\n", + "Epoch 9/20 Iteration 1560/3560 Training loss: 2.0806 0.0495 sec/batch\n", + "Epoch 9/20 Iteration 1561/3560 Training loss: 2.0806 0.0497 sec/batch\n", + "Epoch 9/20 Iteration 1562/3560 Training loss: 2.0806 0.0512 sec/batch\n", + "Epoch 9/20 Iteration 1563/3560 Training loss: 2.0808 0.0554 sec/batch\n", + "Epoch 9/20 Iteration 1564/3560 Training loss: 2.0806 0.0501 sec/batch\n", + "Epoch 9/20 Iteration 1565/3560 Training loss: 2.0808 0.0492 sec/batch\n", + "Epoch 9/20 Iteration 1566/3560 Training loss: 2.0806 0.0568 sec/batch\n", + "Epoch 9/20 Iteration 1567/3560 Training loss: 2.0805 0.0512 sec/batch\n", + "Epoch 9/20 Iteration 1568/3560 Training loss: 2.0805 0.0512 sec/batch\n", + "Epoch 9/20 Iteration 1569/3560 Training loss: 2.0803 0.0537 sec/batch\n", + "Epoch 9/20 Iteration 1570/3560 Training loss: 2.0804 0.0532 sec/batch\n", + "Epoch 9/20 Iteration 1571/3560 Training loss: 2.0804 0.0512 sec/batch\n", + "Epoch 9/20 Iteration 1572/3560 Training loss: 2.0805 0.0535 sec/batch\n", + "Epoch 9/20 Iteration 1573/3560 Training loss: 2.0804 0.0543 sec/batch\n", + "Epoch 9/20 Iteration 1574/3560 Training loss: 2.0802 0.0523 sec/batch\n", + "Epoch 9/20 Iteration 1575/3560 Training loss: 2.0800 0.0507 sec/batch\n", + "Epoch 9/20 Iteration 1576/3560 Training loss: 2.0802 0.0509 sec/batch\n", + "Epoch 9/20 Iteration 1577/3560 Training loss: 2.0802 0.0501 sec/batch\n", + "Epoch 9/20 Iteration 1578/3560 Training loss: 2.0802 0.0500 sec/batch\n", + "Epoch 9/20 Iteration 1579/3560 Training loss: 2.0801 0.0582 sec/batch\n", + "Epoch 9/20 Iteration 1580/3560 Training loss: 2.0800 0.0515 sec/batch\n", + "Epoch 9/20 Iteration 
1581/3560 Training loss: 2.0799 0.0498 sec/batch\n", + "Epoch 9/20 Iteration 1582/3560 Training loss: 2.0799 0.0570 sec/batch\n", + "Epoch 9/20 Iteration 1583/3560 Training loss: 2.0797 0.0516 sec/batch\n", + "Epoch 9/20 Iteration 1584/3560 Training loss: 2.0799 0.0558 sec/batch\n", + "Epoch 9/20 Iteration 1585/3560 Training loss: 2.0799 0.0499 sec/batch\n", + "Epoch 9/20 Iteration 1586/3560 Training loss: 2.0798 0.0503 sec/batch\n", + "Epoch 9/20 Iteration 1587/3560 Training loss: 2.0798 0.0505 sec/batch\n", + "Epoch 9/20 Iteration 1588/3560 Training loss: 2.0798 0.0542 sec/batch\n", + "Epoch 9/20 Iteration 1589/3560 Training loss: 2.0797 0.0499 sec/batch\n", + "Epoch 9/20 Iteration 1590/3560 Training loss: 2.0796 0.0507 sec/batch\n", + "Epoch 9/20 Iteration 1591/3560 Training loss: 2.0796 0.0506 sec/batch\n", + "Epoch 9/20 Iteration 1592/3560 Training loss: 2.0798 0.0532 sec/batch\n", + "Epoch 9/20 Iteration 1593/3560 Training loss: 2.0797 0.0492 sec/batch\n", + "Epoch 9/20 Iteration 1594/3560 Training loss: 2.0797 0.0603 sec/batch\n", + "Epoch 9/20 Iteration 1595/3560 Training loss: 2.0795 0.0522 sec/batch\n", + "Epoch 9/20 Iteration 1596/3560 Training loss: 2.0794 0.0553 sec/batch\n", + "Epoch 9/20 Iteration 1597/3560 Training loss: 2.0795 0.0495 sec/batch\n", + "Epoch 9/20 Iteration 1598/3560 Training loss: 2.0795 0.0508 sec/batch\n", + "Epoch 9/20 Iteration 1599/3560 Training loss: 2.0795 0.0546 sec/batch\n", + "Epoch 9/20 Iteration 1600/3560 Training loss: 2.0794 0.0514 sec/batch\n", + "Epoch 9/20 Iteration 1601/3560 Training loss: 2.0793 0.0506 sec/batch\n", + "Epoch 9/20 Iteration 1602/3560 Training loss: 2.0793 0.0517 sec/batch\n", + "Epoch 10/20 Iteration 1603/3560 Training loss: 2.1283 0.0509 sec/batch\n", + "Epoch 10/20 Iteration 1604/3560 Training loss: 2.0882 0.0503 sec/batch\n", + "Epoch 10/20 Iteration 1605/3560 Training loss: 2.0748 0.0498 sec/batch\n", + "Epoch 10/20 Iteration 1606/3560 Training loss: 2.0695 0.0567 sec/batch\n", + "Epoch 10/20 
Iteration 1607/3560 Training loss: 2.0689 0.0516 sec/batch\n", + "Epoch 10/20 Iteration 1608/3560 Training loss: 2.0634 0.0546 sec/batch\n", + "Epoch 10/20 Iteration 1609/3560 Training loss: 2.0643 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1610/3560 Training loss: 2.0644 0.0540 sec/batch\n", + "Epoch 10/20 Iteration 1611/3560 Training loss: 2.0670 0.0524 sec/batch\n", + "Epoch 10/20 Iteration 1612/3560 Training loss: 2.0669 0.0577 sec/batch\n", + "Epoch 10/20 Iteration 1613/3560 Training loss: 2.0645 0.0551 sec/batch\n", + "Epoch 10/20 Iteration 1614/3560 Training loss: 2.0620 0.0519 sec/batch\n", + "Epoch 10/20 Iteration 1615/3560 Training loss: 2.0618 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1616/3560 Training loss: 2.0634 0.0558 sec/batch\n", + "Epoch 10/20 Iteration 1617/3560 Training loss: 2.0632 0.0491 sec/batch\n", + "Epoch 10/20 Iteration 1618/3560 Training loss: 2.0619 0.0518 sec/batch\n", + "Epoch 10/20 Iteration 1619/3560 Training loss: 2.0620 0.0578 sec/batch\n", + "Epoch 10/20 Iteration 1620/3560 Training loss: 2.0644 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1621/3560 Training loss: 2.0644 0.0592 sec/batch\n", + "Epoch 10/20 Iteration 1622/3560 Training loss: 2.0648 0.0512 sec/batch\n", + "Epoch 10/20 Iteration 1623/3560 Training loss: 2.0639 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1624/3560 Training loss: 2.0653 0.0512 sec/batch\n", + "Epoch 10/20 Iteration 1625/3560 Training loss: 2.0651 0.0499 sec/batch\n", + "Epoch 10/20 Iteration 1626/3560 Training loss: 2.0646 0.0615 sec/batch\n", + "Epoch 10/20 Iteration 1627/3560 Training loss: 2.0642 0.0517 sec/batch\n", + "Epoch 10/20 Iteration 1628/3560 Training loss: 2.0629 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1629/3560 Training loss: 2.0625 0.0499 sec/batch\n", + "Epoch 10/20 Iteration 1630/3560 Training loss: 2.0629 0.0497 sec/batch\n", + "Epoch 10/20 Iteration 1631/3560 Training loss: 2.0638 0.0590 sec/batch\n", + "Epoch 10/20 Iteration 1632/3560 Training loss: 2.0641 0.0549 
sec/batch\n", + "Epoch 10/20 Iteration 1633/3560 Training loss: 2.0643 0.0518 sec/batch\n", + "Epoch 10/20 Iteration 1634/3560 Training loss: 2.0636 0.0503 sec/batch\n", + "Epoch 10/20 Iteration 1635/3560 Training loss: 2.0638 0.0516 sec/batch\n", + "Epoch 10/20 Iteration 1636/3560 Training loss: 2.0647 0.0520 sec/batch\n", + "Epoch 10/20 Iteration 1637/3560 Training loss: 2.0642 0.0541 sec/batch\n", + "Epoch 10/20 Iteration 1638/3560 Training loss: 2.0642 0.0587 sec/batch\n", + "Epoch 10/20 Iteration 1639/3560 Training loss: 2.0639 0.0565 sec/batch\n", + "Epoch 10/20 Iteration 1640/3560 Training loss: 2.0629 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1641/3560 Training loss: 2.0618 0.0503 sec/batch\n", + "Epoch 10/20 Iteration 1642/3560 Training loss: 2.0612 0.0618 sec/batch\n", + "Epoch 10/20 Iteration 1643/3560 Training loss: 2.0608 0.0560 sec/batch\n", + "Epoch 10/20 Iteration 1644/3560 Training loss: 2.0608 0.0529 sec/batch\n", + "Epoch 10/20 Iteration 1645/3560 Training loss: 2.0602 0.0505 sec/batch\n", + "Epoch 10/20 Iteration 1646/3560 Training loss: 2.0595 0.0514 sec/batch\n", + "Epoch 10/20 Iteration 1647/3560 Training loss: 2.0597 0.0544 sec/batch\n", + "Epoch 10/20 Iteration 1648/3560 Training loss: 2.0584 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1649/3560 Training loss: 2.0581 0.0500 sec/batch\n", + "Epoch 10/20 Iteration 1650/3560 Training loss: 2.0574 0.0498 sec/batch\n", + "Epoch 10/20 Iteration 1651/3560 Training loss: 2.0571 0.0496 sec/batch\n", + "Epoch 10/20 Iteration 1652/3560 Training loss: 2.0575 0.0566 sec/batch\n", + "Epoch 10/20 Iteration 1653/3560 Training loss: 2.0569 0.0499 sec/batch\n", + "Epoch 10/20 Iteration 1654/3560 Training loss: 2.0572 0.0660 sec/batch\n", + "Epoch 10/20 Iteration 1655/3560 Training loss: 2.0570 0.0518 sec/batch\n", + "Epoch 10/20 Iteration 1656/3560 Training loss: 2.0568 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1657/3560 Training loss: 2.0565 0.0505 sec/batch\n", + "Epoch 10/20 Iteration 1658/3560 
Training loss: 2.0567 0.0571 sec/batch\n", + "Epoch 10/20 Iteration 1659/3560 Training loss: 2.0569 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1660/3560 Training loss: 2.0565 0.0554 sec/batch\n", + "Epoch 10/20 Iteration 1661/3560 Training loss: 2.0560 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1662/3560 Training loss: 2.0564 0.0558 sec/batch\n", + "Epoch 10/20 Iteration 1663/3560 Training loss: 2.0562 0.0521 sec/batch\n", + "Epoch 10/20 Iteration 1664/3560 Training loss: 2.0566 0.0516 sec/batch\n", + "Epoch 10/20 Iteration 1665/3560 Training loss: 2.0571 0.0505 sec/batch\n", + "Epoch 10/20 Iteration 1666/3560 Training loss: 2.0573 0.0516 sec/batch\n", + "Epoch 10/20 Iteration 1667/3560 Training loss: 2.0572 0.0579 sec/batch\n", + "Epoch 10/20 Iteration 1668/3560 Training loss: 2.0576 0.0513 sec/batch\n", + "Epoch 10/20 Iteration 1669/3560 Training loss: 2.0577 0.0503 sec/batch\n", + "Epoch 10/20 Iteration 1670/3560 Training loss: 2.0570 0.0557 sec/batch\n", + "Epoch 10/20 Iteration 1671/3560 Training loss: 2.0569 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1672/3560 Training loss: 2.0568 0.0539 sec/batch\n", + "Epoch 10/20 Iteration 1673/3560 Training loss: 2.0570 0.0497 sec/batch\n", + "Epoch 10/20 Iteration 1674/3560 Training loss: 2.0570 0.0502 sec/batch\n", + "Epoch 10/20 Iteration 1675/3560 Training loss: 2.0573 0.0505 sec/batch\n", + "Epoch 10/20 Iteration 1676/3560 Training loss: 2.0569 0.0501 sec/batch\n", + "Epoch 10/20 Iteration 1677/3560 Training loss: 2.0567 0.0509 sec/batch\n", + "Epoch 10/20 Iteration 1678/3560 Training loss: 2.0570 0.0529 sec/batch\n", + "Epoch 10/20 Iteration 1679/3560 Training loss: 2.0569 0.0512 sec/batch\n", + "Epoch 10/20 Iteration 1680/3560 Training loss: 2.0571 0.0535 sec/batch\n", + "Epoch 10/20 Iteration 1681/3560 Training loss: 2.0566 0.0508 sec/batch\n", + "Epoch 10/20 Iteration 1682/3560 Training loss: 2.0565 0.0553 sec/batch\n", + "Epoch 10/20 Iteration 1683/3560 Training loss: 2.0560 0.0499 sec/batch\n", + 
"Epoch 10/20 Iteration 1684/3560 Training loss: 2.0562 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1685/3560 Training loss: 2.0557 0.0499 sec/batch\n", + "Epoch 10/20 Iteration 1686/3560 Training loss: 2.0555 0.0524 sec/batch\n", + "Epoch 10/20 Iteration 1687/3560 Training loss: 2.0550 0.0583 sec/batch\n", + "Epoch 10/20 Iteration 1688/3560 Training loss: 2.0546 0.0502 sec/batch\n", + "Epoch 10/20 Iteration 1689/3560 Training loss: 2.0545 0.0508 sec/batch\n", + "Epoch 10/20 Iteration 1690/3560 Training loss: 2.0543 0.0530 sec/batch\n", + "Epoch 10/20 Iteration 1691/3560 Training loss: 2.0538 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1692/3560 Training loss: 2.0540 0.0524 sec/batch\n", + "Epoch 10/20 Iteration 1693/3560 Training loss: 2.0536 0.0584 sec/batch\n", + "Epoch 10/20 Iteration 1694/3560 Training loss: 2.0535 0.0504 sec/batch\n", + "Epoch 10/20 Iteration 1695/3560 Training loss: 2.0529 0.0525 sec/batch\n", + "Epoch 10/20 Iteration 1696/3560 Training loss: 2.0526 0.0509 sec/batch\n", + "Epoch 10/20 Iteration 1697/3560 Training loss: 2.0523 0.0551 sec/batch\n", + "Epoch 10/20 Iteration 1698/3560 Training loss: 2.0522 0.0527 sec/batch\n", + "Epoch 10/20 Iteration 1699/3560 Training loss: 2.0522 0.0637 sec/batch\n", + "Epoch 10/20 Iteration 1700/3560 Training loss: 2.0518 0.0511 sec/batch\n", + "Epoch 10/20 Iteration 1701/3560 Training loss: 2.0515 0.0517 sec/batch\n", + "Epoch 10/20 Iteration 1702/3560 Training loss: 2.0510 0.0538 sec/batch\n", + "Epoch 10/20 Iteration 1703/3560 Training loss: 2.0510 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1704/3560 Training loss: 2.0511 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1705/3560 Training loss: 2.0507 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1706/3560 Training loss: 2.0506 0.0556 sec/batch\n", + "Epoch 10/20 Iteration 1707/3560 Training loss: 2.0504 0.0572 sec/batch\n", + "Epoch 10/20 Iteration 1708/3560 Training loss: 2.0503 0.0512 sec/batch\n", + "Epoch 10/20 Iteration 1709/3560 Training loss: 
2.0502 0.0496 sec/batch\n", + "Epoch 10/20 Iteration 1710/3560 Training loss: 2.0502 0.0537 sec/batch\n", + "Epoch 10/20 Iteration 1711/3560 Training loss: 2.0503 0.0513 sec/batch\n", + "Epoch 10/20 Iteration 1712/3560 Training loss: 2.0501 0.0525 sec/batch\n", + "Epoch 10/20 Iteration 1713/3560 Training loss: 2.0500 0.0552 sec/batch\n", + "Epoch 10/20 Iteration 1714/3560 Training loss: 2.0499 0.0642 sec/batch\n", + "Epoch 10/20 Iteration 1715/3560 Training loss: 2.0498 0.0539 sec/batch\n", + "Epoch 10/20 Iteration 1716/3560 Training loss: 2.0497 0.0529 sec/batch\n", + "Epoch 10/20 Iteration 1717/3560 Training loss: 2.0495 0.0515 sec/batch\n", + "Epoch 10/20 Iteration 1718/3560 Training loss: 2.0490 0.0573 sec/batch\n", + "Epoch 10/20 Iteration 1719/3560 Training loss: 2.0489 0.0568 sec/batch\n", + "Epoch 10/20 Iteration 1720/3560 Training loss: 2.0489 0.0509 sec/batch\n", + "Epoch 10/20 Iteration 1721/3560 Training loss: 2.0489 0.0500 sec/batch\n", + "Epoch 10/20 Iteration 1722/3560 Training loss: 2.0489 0.0531 sec/batch\n", + "Epoch 10/20 Iteration 1723/3560 Training loss: 2.0488 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1724/3560 Training loss: 2.0485 0.0530 sec/batch\n", + "Epoch 10/20 Iteration 1725/3560 Training loss: 2.0483 0.0498 sec/batch\n", + "Epoch 10/20 Iteration 1726/3560 Training loss: 2.0484 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1727/3560 Training loss: 2.0483 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1728/3560 Training loss: 2.0480 0.0587 sec/batch\n", + "Epoch 10/20 Iteration 1729/3560 Training loss: 2.0481 0.0498 sec/batch\n", + "Epoch 10/20 Iteration 1730/3560 Training loss: 2.0481 0.0627 sec/batch\n", + "Epoch 10/20 Iteration 1731/3560 Training loss: 2.0482 0.0539 sec/batch\n", + "Epoch 10/20 Iteration 1732/3560 Training loss: 2.0482 0.0500 sec/batch\n", + "Epoch 10/20 Iteration 1733/3560 Training loss: 2.0480 0.0546 sec/batch\n", + "Epoch 10/20 Iteration 1734/3560 Training loss: 2.0476 0.0522 sec/batch\n", + "Epoch 10/20 
Iteration 1735/3560 Training loss: 2.0476 0.0518 sec/batch\n", + "Epoch 10/20 Iteration 1736/3560 Training loss: 2.0477 0.0557 sec/batch\n", + "Epoch 10/20 Iteration 1737/3560 Training loss: 2.0476 0.0509 sec/batch\n", + "Epoch 10/20 Iteration 1738/3560 Training loss: 2.0477 0.0519 sec/batch\n", + "Epoch 10/20 Iteration 1739/3560 Training loss: 2.0478 0.0558 sec/batch\n", + "Epoch 10/20 Iteration 1740/3560 Training loss: 2.0478 0.0548 sec/batch\n", + "Epoch 10/20 Iteration 1741/3560 Training loss: 2.0479 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1742/3560 Training loss: 2.0477 0.0572 sec/batch\n", + "Epoch 10/20 Iteration 1743/3560 Training loss: 2.0479 0.0548 sec/batch\n", + "Epoch 10/20 Iteration 1744/3560 Training loss: 2.0478 0.0504 sec/batch\n", + "Epoch 10/20 Iteration 1745/3560 Training loss: 2.0477 0.0537 sec/batch\n", + "Epoch 10/20 Iteration 1746/3560 Training loss: 2.0477 0.0505 sec/batch\n", + "Epoch 10/20 Iteration 1747/3560 Training loss: 2.0476 0.0529 sec/batch\n", + "Epoch 10/20 Iteration 1748/3560 Training loss: 2.0476 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1749/3560 Training loss: 2.0477 0.0560 sec/batch\n", + "Epoch 10/20 Iteration 1750/3560 Training loss: 2.0478 0.0532 sec/batch\n", + "Epoch 10/20 Iteration 1751/3560 Training loss: 2.0478 0.0566 sec/batch\n", + "Epoch 10/20 Iteration 1752/3560 Training loss: 2.0476 0.0511 sec/batch\n", + "Epoch 10/20 Iteration 1753/3560 Training loss: 2.0475 0.0508 sec/batch\n", + "Epoch 10/20 Iteration 1754/3560 Training loss: 2.0477 0.0525 sec/batch\n", + "Epoch 10/20 Iteration 1755/3560 Training loss: 2.0477 0.0516 sec/batch\n", + "Epoch 10/20 Iteration 1756/3560 Training loss: 2.0477 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1757/3560 Training loss: 2.0477 0.0504 sec/batch\n", + "Epoch 10/20 Iteration 1758/3560 Training loss: 2.0476 0.0575 sec/batch\n", + "Epoch 10/20 Iteration 1759/3560 Training loss: 2.0476 0.0521 sec/batch\n", + "Epoch 10/20 Iteration 1760/3560 Training loss: 2.0475 0.0492 
sec/batch\n", + "Epoch 10/20 Iteration 1761/3560 Training loss: 2.0473 0.0498 sec/batch\n", + "Epoch 10/20 Iteration 1762/3560 Training loss: 2.0475 0.0489 sec/batch\n", + "Epoch 10/20 Iteration 1763/3560 Training loss: 2.0475 0.0517 sec/batch\n", + "Epoch 10/20 Iteration 1764/3560 Training loss: 2.0474 0.0618 sec/batch\n", + "Epoch 10/20 Iteration 1765/3560 Training loss: 2.0474 0.0550 sec/batch\n", + "Epoch 10/20 Iteration 1766/3560 Training loss: 2.0474 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1767/3560 Training loss: 2.0475 0.0561 sec/batch\n", + "Epoch 10/20 Iteration 1768/3560 Training loss: 2.0474 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1769/3560 Training loss: 2.0474 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1770/3560 Training loss: 2.0476 0.0573 sec/batch\n", + "Epoch 10/20 Iteration 1771/3560 Training loss: 2.0475 0.0504 sec/batch\n", + "Epoch 10/20 Iteration 1772/3560 Training loss: 2.0475 0.0566 sec/batch\n", + "Epoch 10/20 Iteration 1773/3560 Training loss: 2.0474 0.0497 sec/batch\n", + "Epoch 10/20 Iteration 1774/3560 Training loss: 2.0472 0.0511 sec/batch\n", + "Epoch 10/20 Iteration 1775/3560 Training loss: 2.0473 0.0509 sec/batch\n", + "Epoch 10/20 Iteration 1776/3560 Training loss: 2.0473 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1777/3560 Training loss: 2.0473 0.0544 sec/batch\n", + "Epoch 10/20 Iteration 1778/3560 Training loss: 2.0472 0.0585 sec/batch\n", + "Epoch 10/20 Iteration 1779/3560 Training loss: 2.0471 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1780/3560 Training loss: 2.0471 0.0536 sec/batch\n", + "Epoch 11/20 Iteration 1781/3560 Training loss: 2.0992 0.0497 sec/batch\n", + "Epoch 11/20 Iteration 1782/3560 Training loss: 2.0561 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1783/3560 Training loss: 2.0453 0.0512 sec/batch\n", + "Epoch 11/20 Iteration 1784/3560 Training loss: 2.0382 0.0514 sec/batch\n", + "Epoch 11/20 Iteration 1785/3560 Training loss: 2.0350 0.0520 sec/batch\n", + "Epoch 11/20 Iteration 1786/3560 
Training loss: 2.0290 0.0509 sec/batch\n", + "Epoch 11/20 Iteration 1787/3560 Training loss: 2.0298 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1788/3560 Training loss: 2.0300 0.0534 sec/batch\n", + "Epoch 11/20 Iteration 1789/3560 Training loss: 2.0330 0.0499 sec/batch\n", + "Epoch 11/20 Iteration 1790/3560 Training loss: 2.0329 0.0558 sec/batch\n", + "Epoch 11/20 Iteration 1791/3560 Training loss: 2.0296 0.0501 sec/batch\n", + "Epoch 11/20 Iteration 1792/3560 Training loss: 2.0278 0.0507 sec/batch\n", + "Epoch 11/20 Iteration 1793/3560 Training loss: 2.0284 0.0503 sec/batch\n", + "Epoch 11/20 Iteration 1794/3560 Training loss: 2.0301 0.0532 sec/batch\n", + "Epoch 11/20 Iteration 1795/3560 Training loss: 2.0293 0.0512 sec/batch\n", + "Epoch 11/20 Iteration 1796/3560 Training loss: 2.0285 0.0516 sec/batch\n", + "Epoch 11/20 Iteration 1797/3560 Training loss: 2.0287 0.0508 sec/batch\n", + "Epoch 11/20 Iteration 1798/3560 Training loss: 2.0309 0.0509 sec/batch\n", + "Epoch 11/20 Iteration 1799/3560 Training loss: 2.0311 0.0618 sec/batch\n", + "Epoch 11/20 Iteration 1800/3560 Training loss: 2.0306 0.0508 sec/batch\n", + "Epoch 11/20 Iteration 1801/3560 Training loss: 2.0300 0.0512 sec/batch\n", + "Epoch 11/20 Iteration 1802/3560 Training loss: 2.0308 0.0500 sec/batch\n", + "Epoch 11/20 Iteration 1803/3560 Training loss: 2.0304 0.0589 sec/batch\n", + "Epoch 11/20 Iteration 1804/3560 Training loss: 2.0301 0.0511 sec/batch\n", + "Epoch 11/20 Iteration 1805/3560 Training loss: 2.0300 0.0498 sec/batch\n", + "Epoch 11/20 Iteration 1806/3560 Training loss: 2.0290 0.0525 sec/batch\n", + "Epoch 11/20 Iteration 1807/3560 Training loss: 2.0282 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1808/3560 Training loss: 2.0288 0.0558 sec/batch\n", + "Epoch 11/20 Iteration 1809/3560 Training loss: 2.0300 0.0512 sec/batch\n", + "Epoch 11/20 Iteration 1810/3560 Training loss: 2.0305 0.0525 sec/batch\n", + "Epoch 11/20 Iteration 1811/3560 Training loss: 2.0307 0.0575 sec/batch\n", + 
"Epoch 11/20 Iteration 1812/3560 Training loss: 2.0300 0.0506 sec/batch\n", + "Epoch 11/20 Iteration 1813/3560 Training loss: 2.0303 0.0487 sec/batch\n", + "Epoch 11/20 Iteration 1814/3560 Training loss: 2.0309 0.0538 sec/batch\n", + "Epoch 11/20 Iteration 1815/3560 Training loss: 2.0306 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1816/3560 Training loss: 2.0305 0.0525 sec/batch\n", + "Epoch 11/20 Iteration 1817/3560 Training loss: 2.0302 0.0546 sec/batch\n", + "Epoch 11/20 Iteration 1818/3560 Training loss: 2.0292 0.0525 sec/batch\n", + "Epoch 11/20 Iteration 1819/3560 Training loss: 2.0282 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1820/3560 Training loss: 2.0275 0.0505 sec/batch\n", + "Epoch 11/20 Iteration 1821/3560 Training loss: 2.0270 0.0504 sec/batch\n", + "Epoch 11/20 Iteration 1822/3560 Training loss: 2.0270 0.0519 sec/batch\n", + "Epoch 11/20 Iteration 1823/3560 Training loss: 2.0266 0.0507 sec/batch\n", + "Epoch 11/20 Iteration 1824/3560 Training loss: 2.0259 0.0577 sec/batch\n", + "Epoch 11/20 Iteration 1825/3560 Training loss: 2.0259 0.0513 sec/batch\n", + "Epoch 11/20 Iteration 1826/3560 Training loss: 2.0246 0.0536 sec/batch\n", + "Epoch 11/20 Iteration 1827/3560 Training loss: 2.0246 0.0568 sec/batch\n", + "Epoch 11/20 Iteration 1828/3560 Training loss: 2.0242 0.0503 sec/batch\n", + "Epoch 11/20 Iteration 1829/3560 Training loss: 2.0239 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1830/3560 Training loss: 2.0244 0.0566 sec/batch\n", + "Epoch 11/20 Iteration 1831/3560 Training loss: 2.0239 0.0512 sec/batch\n", + "Epoch 11/20 Iteration 1832/3560 Training loss: 2.0244 0.0505 sec/batch\n", + "Epoch 11/20 Iteration 1833/3560 Training loss: 2.0243 0.0550 sec/batch\n", + "Epoch 11/20 Iteration 1834/3560 Training loss: 2.0243 0.0544 sec/batch\n", + "Epoch 11/20 Iteration 1835/3560 Training loss: 2.0240 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1836/3560 Training loss: 2.0243 0.0525 sec/batch\n", + "Epoch 11/20 Iteration 1837/3560 Training loss: 
2.0245 0.0504 sec/batch\n", + "Epoch 11/20 Iteration 1838/3560 Training loss: 2.0243 0.0527 sec/batch\n", + "Epoch 11/20 Iteration 1839/3560 Training loss: 2.0239 0.0503 sec/batch\n", + "Epoch 11/20 Iteration 1840/3560 Training loss: 2.0245 0.0503 sec/batch\n", + "Epoch 11/20 Iteration 1841/3560 Training loss: 2.0244 0.0500 sec/batch\n", + "Epoch 11/20 Iteration 1842/3560 Training loss: 2.0251 0.0547 sec/batch\n", + "Epoch 11/20 Iteration 1843/3560 Training loss: 2.0255 0.0546 sec/batch\n", + "Epoch 11/20 Iteration 1844/3560 Training loss: 2.0257 0.0558 sec/batch\n", + "Epoch 11/20 Iteration 1845/3560 Training loss: 2.0255 0.0492 sec/batch\n", + "Epoch 11/20 Iteration 1846/3560 Training loss: 2.0258 0.0575 sec/batch\n", + "Epoch 11/20 Iteration 1847/3560 Training loss: 2.0259 0.0508 sec/batch\n", + "Epoch 11/20 Iteration 1848/3560 Training loss: 2.0255 0.0509 sec/batch\n", + "Epoch 11/20 Iteration 1849/3560 Training loss: 2.0253 0.0509 sec/batch\n", + "Epoch 11/20 Iteration 1850/3560 Training loss: 2.0253 0.0509 sec/batch\n", + "Epoch 11/20 Iteration 1851/3560 Training loss: 2.0256 0.0571 sec/batch\n", + "Epoch 11/20 Iteration 1852/3560 Training loss: 2.0258 0.0516 sec/batch\n", + "Epoch 11/20 Iteration 1853/3560 Training loss: 2.0261 0.0500 sec/batch\n", + "Epoch 11/20 Iteration 1854/3560 Training loss: 2.0257 0.0565 sec/batch\n", + "Epoch 11/20 Iteration 1855/3560 Training loss: 2.0257 0.0561 sec/batch\n", + "Epoch 11/20 Iteration 1856/3560 Training loss: 2.0261 0.0505 sec/batch\n", + "Epoch 11/20 Iteration 1857/3560 Training loss: 2.0260 0.0505 sec/batch\n", + "Epoch 11/20 Iteration 1858/3560 Training loss: 2.0262 0.0512 sec/batch\n", + "Epoch 11/20 Iteration 1859/3560 Training loss: 2.0258 0.0520 sec/batch\n", + "Epoch 11/20 Iteration 1860/3560 Training loss: 2.0258 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1861/3560 Training loss: 2.0253 0.0504 sec/batch\n", + "Epoch 11/20 Iteration 1862/3560 Training loss: 2.0255 0.0568 sec/batch\n", + "Epoch 11/20 
Iteration 1863/3560 Training loss: 2.0251 0.0524 sec/batch\n", +
[... training log truncated: epochs 11-14 of 20, iterations 1863-2374 of 3560; training loss decreases steadily from ~2.03 to ~1.94 at roughly 0.05 sec/batch ...]
+ "Epoch 14/20 
Iteration 2375/3560 Training loss: 1.9451 0.0586 sec/batch\n", + "Epoch 14/20 Iteration 2376/3560 Training loss: 1.9456 0.0525 sec/batch\n", + "Epoch 14/20 Iteration 2377/3560 Training loss: 1.9459 0.0499 sec/batch\n", + "Epoch 14/20 Iteration 2378/3560 Training loss: 1.9462 0.0536 sec/batch\n", + "Epoch 14/20 Iteration 2379/3560 Training loss: 1.9459 0.0514 sec/batch\n", + "Epoch 14/20 Iteration 2380/3560 Training loss: 1.9462 0.0548 sec/batch\n", + "Epoch 14/20 Iteration 2381/3560 Training loss: 1.9463 0.0503 sec/batch\n", + "Epoch 14/20 Iteration 2382/3560 Training loss: 1.9457 0.0537 sec/batch\n", + "Epoch 14/20 Iteration 2383/3560 Training loss: 1.9454 0.0506 sec/batch\n", + "Epoch 14/20 Iteration 2384/3560 Training loss: 1.9454 0.0522 sec/batch\n", + "Epoch 14/20 Iteration 2385/3560 Training loss: 1.9458 0.0504 sec/batch\n", + "Epoch 14/20 Iteration 2386/3560 Training loss: 1.9459 0.0512 sec/batch\n", + "Epoch 14/20 Iteration 2387/3560 Training loss: 1.9464 0.0503 sec/batch\n", + "Epoch 14/20 Iteration 2388/3560 Training loss: 1.9462 0.0562 sec/batch\n", + "Epoch 14/20 Iteration 2389/3560 Training loss: 1.9461 0.0516 sec/batch\n", + "Epoch 14/20 Iteration 2390/3560 Training loss: 1.9464 0.0536 sec/batch\n", + "Epoch 14/20 Iteration 2391/3560 Training loss: 1.9464 0.0539 sec/batch\n", + "Epoch 14/20 Iteration 2392/3560 Training loss: 1.9465 0.0559 sec/batch\n", + "Epoch 14/20 Iteration 2393/3560 Training loss: 1.9460 0.0528 sec/batch\n", + "Epoch 14/20 Iteration 2394/3560 Training loss: 1.9458 0.0525 sec/batch\n", + "Epoch 14/20 Iteration 2395/3560 Training loss: 1.9453 0.0560 sec/batch\n", + "Epoch 14/20 Iteration 2396/3560 Training loss: 1.9454 0.0502 sec/batch\n", + "Epoch 14/20 Iteration 2397/3560 Training loss: 1.9450 0.0512 sec/batch\n", + "Epoch 14/20 Iteration 2398/3560 Training loss: 1.9448 0.0585 sec/batch\n", + "Epoch 14/20 Iteration 2399/3560 Training loss: 1.9443 0.0519 sec/batch\n", + "Epoch 14/20 Iteration 2400/3560 Training loss: 1.9439 0.0509 
sec/batch\n", + "Epoch 14/20 Iteration 2401/3560 Training loss: 1.9438 0.0503 sec/batch\n", + "Epoch 14/20 Iteration 2402/3560 Training loss: 1.9435 0.0519 sec/batch\n", + "Epoch 14/20 Iteration 2403/3560 Training loss: 1.9431 0.0558 sec/batch\n", + "Epoch 14/20 Iteration 2404/3560 Training loss: 1.9433 0.0549 sec/batch\n", + "Epoch 14/20 Iteration 2405/3560 Training loss: 1.9429 0.0502 sec/batch\n", + "Epoch 14/20 Iteration 2406/3560 Training loss: 1.9428 0.0514 sec/batch\n", + "Epoch 14/20 Iteration 2407/3560 Training loss: 1.9424 0.0525 sec/batch\n", + "Epoch 14/20 Iteration 2408/3560 Training loss: 1.9421 0.0528 sec/batch\n", + "Epoch 14/20 Iteration 2409/3560 Training loss: 1.9418 0.0499 sec/batch\n", + "Epoch 14/20 Iteration 2410/3560 Training loss: 1.9419 0.0551 sec/batch\n", + "Epoch 14/20 Iteration 2411/3560 Training loss: 1.9419 0.0527 sec/batch\n", + "Epoch 14/20 Iteration 2412/3560 Training loss: 1.9415 0.0628 sec/batch\n", + "Epoch 14/20 Iteration 2413/3560 Training loss: 1.9411 0.0504 sec/batch\n", + "Epoch 14/20 Iteration 2414/3560 Training loss: 1.9406 0.0510 sec/batch\n", + "Epoch 14/20 Iteration 2415/3560 Training loss: 1.9406 0.0514 sec/batch\n", + "Epoch 14/20 Iteration 2416/3560 Training loss: 1.9406 0.0518 sec/batch\n", + "Epoch 14/20 Iteration 2417/3560 Training loss: 1.9403 0.0497 sec/batch\n", + "Epoch 14/20 Iteration 2418/3560 Training loss: 1.9401 0.0567 sec/batch\n", + "Epoch 14/20 Iteration 2419/3560 Training loss: 1.9399 0.0536 sec/batch\n", + "Epoch 14/20 Iteration 2420/3560 Training loss: 1.9399 0.0521 sec/batch\n", + "Epoch 14/20 Iteration 2421/3560 Training loss: 1.9399 0.0514 sec/batch\n", + "Epoch 14/20 Iteration 2422/3560 Training loss: 1.9399 0.0516 sec/batch\n", + "Epoch 14/20 Iteration 2423/3560 Training loss: 1.9399 0.0526 sec/batch\n", + "Epoch 14/20 Iteration 2424/3560 Training loss: 1.9399 0.0533 sec/batch\n", + "Epoch 14/20 Iteration 2425/3560 Training loss: 1.9398 0.0520 sec/batch\n", + "Epoch 14/20 Iteration 2426/3560 
Training loss: 1.9397 0.0554 sec/batch\n", + "Epoch 14/20 Iteration 2427/3560 Training loss: 1.9396 0.0557 sec/batch\n", + "Epoch 14/20 Iteration 2428/3560 Training loss: 1.9395 0.0573 sec/batch\n", + "Epoch 14/20 Iteration 2429/3560 Training loss: 1.9393 0.0561 sec/batch\n", + "Epoch 14/20 Iteration 2430/3560 Training loss: 1.9390 0.0575 sec/batch\n", + "Epoch 14/20 Iteration 2431/3560 Training loss: 1.9390 0.0522 sec/batch\n", + "Epoch 14/20 Iteration 2432/3560 Training loss: 1.9389 0.0586 sec/batch\n", + "Epoch 14/20 Iteration 2433/3560 Training loss: 1.9389 0.0554 sec/batch\n", + "Epoch 14/20 Iteration 2434/3560 Training loss: 1.9389 0.0628 sec/batch\n", + "Epoch 14/20 Iteration 2435/3560 Training loss: 1.9389 0.0615 sec/batch\n", + "Epoch 14/20 Iteration 2436/3560 Training loss: 1.9386 0.0502 sec/batch\n", + "Epoch 14/20 Iteration 2437/3560 Training loss: 1.9384 0.0500 sec/batch\n", + "Epoch 14/20 Iteration 2438/3560 Training loss: 1.9385 0.0551 sec/batch\n", + "Epoch 14/20 Iteration 2439/3560 Training loss: 1.9385 0.0518 sec/batch\n", + "Epoch 14/20 Iteration 2440/3560 Training loss: 1.9382 0.0597 sec/batch\n", + "Epoch 14/20 Iteration 2441/3560 Training loss: 1.9383 0.0512 sec/batch\n", + "Epoch 14/20 Iteration 2442/3560 Training loss: 1.9384 0.0540 sec/batch\n", + "Epoch 14/20 Iteration 2443/3560 Training loss: 1.9383 0.0516 sec/batch\n", + "Epoch 14/20 Iteration 2444/3560 Training loss: 1.9383 0.0573 sec/batch\n", + "Epoch 14/20 Iteration 2445/3560 Training loss: 1.9381 0.0510 sec/batch\n", + "Epoch 14/20 Iteration 2446/3560 Training loss: 1.9377 0.0523 sec/batch\n", + "Epoch 14/20 Iteration 2447/3560 Training loss: 1.9378 0.0511 sec/batch\n", + "Epoch 14/20 Iteration 2448/3560 Training loss: 1.9379 0.0590 sec/batch\n", + "Epoch 14/20 Iteration 2449/3560 Training loss: 1.9378 0.0525 sec/batch\n", + "Epoch 14/20 Iteration 2450/3560 Training loss: 1.9379 0.0535 sec/batch\n", + "Epoch 14/20 Iteration 2451/3560 Training loss: 1.9379 0.0516 sec/batch\n", + 
"Epoch 14/20 Iteration 2452/3560 Training loss: 1.9379 0.0558 sec/batch\n", + "Epoch 14/20 Iteration 2453/3560 Training loss: 1.9382 0.0523 sec/batch\n", + "Epoch 14/20 Iteration 2454/3560 Training loss: 1.9380 0.0535 sec/batch\n", + "Epoch 14/20 Iteration 2455/3560 Training loss: 1.9383 0.0535 sec/batch\n", + "Epoch 14/20 Iteration 2456/3560 Training loss: 1.9382 0.0565 sec/batch\n", + "Epoch 14/20 Iteration 2457/3560 Training loss: 1.9381 0.0520 sec/batch\n", + "Epoch 14/20 Iteration 2458/3560 Training loss: 1.9381 0.0578 sec/batch\n", + "Epoch 14/20 Iteration 2459/3560 Training loss: 1.9380 0.0533 sec/batch\n", + "Epoch 14/20 Iteration 2460/3560 Training loss: 1.9381 0.0525 sec/batch\n", + "Epoch 14/20 Iteration 2461/3560 Training loss: 1.9381 0.0497 sec/batch\n", + "Epoch 14/20 Iteration 2462/3560 Training loss: 1.9383 0.0541 sec/batch\n", + "Epoch 14/20 Iteration 2463/3560 Training loss: 1.9382 0.0517 sec/batch\n", + "Epoch 14/20 Iteration 2464/3560 Training loss: 1.9381 0.0516 sec/batch\n", + "Epoch 14/20 Iteration 2465/3560 Training loss: 1.9379 0.0519 sec/batch\n", + "Epoch 14/20 Iteration 2466/3560 Training loss: 1.9381 0.0542 sec/batch\n", + "Epoch 14/20 Iteration 2467/3560 Training loss: 1.9381 0.0525 sec/batch\n", + "Epoch 14/20 Iteration 2468/3560 Training loss: 1.9381 0.0536 sec/batch\n", + "Epoch 14/20 Iteration 2469/3560 Training loss: 1.9381 0.0505 sec/batch\n", + "Epoch 14/20 Iteration 2470/3560 Training loss: 1.9380 0.0527 sec/batch\n", + "Epoch 14/20 Iteration 2471/3560 Training loss: 1.9379 0.0550 sec/batch\n", + "Epoch 14/20 Iteration 2472/3560 Training loss: 1.9379 0.0533 sec/batch\n", + "Epoch 14/20 Iteration 2473/3560 Training loss: 1.9377 0.0504 sec/batch\n", + "Epoch 14/20 Iteration 2474/3560 Training loss: 1.9379 0.0533 sec/batch\n", + "Epoch 14/20 Iteration 2475/3560 Training loss: 1.9380 0.0561 sec/batch\n", + "Epoch 14/20 Iteration 2476/3560 Training loss: 1.9379 0.0529 sec/batch\n", + "Epoch 14/20 Iteration 2477/3560 Training loss: 
1.9380 0.0504 sec/batch\n", + "Epoch 14/20 Iteration 2478/3560 Training loss: 1.9380 0.0539 sec/batch\n", + "Epoch 14/20 Iteration 2479/3560 Training loss: 1.9380 0.0542 sec/batch\n", + "Epoch 14/20 Iteration 2480/3560 Training loss: 1.9379 0.0521 sec/batch\n", + "Epoch 14/20 Iteration 2481/3560 Training loss: 1.9379 0.0500 sec/batch\n", + "Epoch 14/20 Iteration 2482/3560 Training loss: 1.9382 0.0518 sec/batch\n", + "Epoch 14/20 Iteration 2483/3560 Training loss: 1.9382 0.0578 sec/batch\n", + "Epoch 14/20 Iteration 2484/3560 Training loss: 1.9382 0.0509 sec/batch\n", + "Epoch 14/20 Iteration 2485/3560 Training loss: 1.9380 0.0503 sec/batch\n", + "Epoch 14/20 Iteration 2486/3560 Training loss: 1.9379 0.0551 sec/batch\n", + "Epoch 14/20 Iteration 2487/3560 Training loss: 1.9380 0.0525 sec/batch\n", + "Epoch 14/20 Iteration 2488/3560 Training loss: 1.9381 0.0542 sec/batch\n", + "Epoch 14/20 Iteration 2489/3560 Training loss: 1.9381 0.0511 sec/batch\n", + "Epoch 14/20 Iteration 2490/3560 Training loss: 1.9381 0.0514 sec/batch\n", + "Epoch 14/20 Iteration 2491/3560 Training loss: 1.9380 0.0536 sec/batch\n", + "Epoch 14/20 Iteration 2492/3560 Training loss: 1.9381 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2493/3560 Training loss: 1.9932 0.0494 sec/batch\n", + "Epoch 15/20 Iteration 2494/3560 Training loss: 1.9537 0.0558 sec/batch\n", + "Epoch 15/20 Iteration 2495/3560 Training loss: 1.9445 0.0525 sec/batch\n", + "Epoch 15/20 Iteration 2496/3560 Training loss: 1.9410 0.0541 sec/batch\n", + "Epoch 15/20 Iteration 2497/3560 Training loss: 1.9389 0.0500 sec/batch\n", + "Epoch 15/20 Iteration 2498/3560 Training loss: 1.9301 0.0539 sec/batch\n", + "Epoch 15/20 Iteration 2499/3560 Training loss: 1.9293 0.0522 sec/batch\n", + "Epoch 15/20 Iteration 2500/3560 Training loss: 1.9278 0.0493 sec/batch\n", + "Epoch 15/20 Iteration 2501/3560 Training loss: 1.9302 0.0496 sec/batch\n", + "Epoch 15/20 Iteration 2502/3560 Training loss: 1.9308 0.0544 sec/batch\n", + "Epoch 15/20 
Iteration 2503/3560 Training loss: 1.9263 0.0574 sec/batch\n", + "Epoch 15/20 Iteration 2504/3560 Training loss: 1.9245 0.0530 sec/batch\n", + "Epoch 15/20 Iteration 2505/3560 Training loss: 1.9245 0.0506 sec/batch\n", + "Epoch 15/20 Iteration 2506/3560 Training loss: 1.9265 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2507/3560 Training loss: 1.9260 0.0529 sec/batch\n", + "Epoch 15/20 Iteration 2508/3560 Training loss: 1.9244 0.0582 sec/batch\n", + "Epoch 15/20 Iteration 2509/3560 Training loss: 1.9247 0.0504 sec/batch\n", + "Epoch 15/20 Iteration 2510/3560 Training loss: 1.9269 0.0557 sec/batch\n", + "Epoch 15/20 Iteration 2511/3560 Training loss: 1.9269 0.0510 sec/batch\n", + "Epoch 15/20 Iteration 2512/3560 Training loss: 1.9274 0.0641 sec/batch\n", + "Epoch 15/20 Iteration 2513/3560 Training loss: 1.9263 0.0525 sec/batch\n", + "Epoch 15/20 Iteration 2514/3560 Training loss: 1.9273 0.0527 sec/batch\n", + "Epoch 15/20 Iteration 2515/3560 Training loss: 1.9269 0.0562 sec/batch\n", + "Epoch 15/20 Iteration 2516/3560 Training loss: 1.9266 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2517/3560 Training loss: 1.9262 0.0495 sec/batch\n", + "Epoch 15/20 Iteration 2518/3560 Training loss: 1.9249 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2519/3560 Training loss: 1.9241 0.0562 sec/batch\n", + "Epoch 15/20 Iteration 2520/3560 Training loss: 1.9245 0.0585 sec/batch\n", + "Epoch 15/20 Iteration 2521/3560 Training loss: 1.9256 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2522/3560 Training loss: 1.9261 0.0546 sec/batch\n", + "Epoch 15/20 Iteration 2523/3560 Training loss: 1.9258 0.0579 sec/batch\n", + "Epoch 15/20 Iteration 2524/3560 Training loss: 1.9250 0.0577 sec/batch\n", + "Epoch 15/20 Iteration 2525/3560 Training loss: 1.9254 0.0506 sec/batch\n", + "Epoch 15/20 Iteration 2526/3560 Training loss: 1.9265 0.0525 sec/batch\n", + "Epoch 15/20 Iteration 2527/3560 Training loss: 1.9261 0.0568 sec/batch\n", + "Epoch 15/20 Iteration 2528/3560 Training loss: 1.9260 0.0516 
sec/batch\n", + "Epoch 15/20 Iteration 2529/3560 Training loss: 1.9256 0.0569 sec/batch\n", + "Epoch 15/20 Iteration 2530/3560 Training loss: 1.9245 0.0578 sec/batch\n", + "Epoch 15/20 Iteration 2531/3560 Training loss: 1.9235 0.0569 sec/batch\n", + "Epoch 15/20 Iteration 2532/3560 Training loss: 1.9228 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2533/3560 Training loss: 1.9222 0.0521 sec/batch\n", + "Epoch 15/20 Iteration 2534/3560 Training loss: 1.9227 0.0578 sec/batch\n", + "Epoch 15/20 Iteration 2535/3560 Training loss: 1.9223 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2536/3560 Training loss: 1.9215 0.0529 sec/batch\n", + "Epoch 15/20 Iteration 2537/3560 Training loss: 1.9219 0.0514 sec/batch\n", + "Epoch 15/20 Iteration 2538/3560 Training loss: 1.9207 0.0578 sec/batch\n", + "Epoch 15/20 Iteration 2539/3560 Training loss: 1.9205 0.0552 sec/batch\n", + "Epoch 15/20 Iteration 2540/3560 Training loss: 1.9201 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2541/3560 Training loss: 1.9200 0.0570 sec/batch\n", + "Epoch 15/20 Iteration 2542/3560 Training loss: 1.9209 0.0665 sec/batch\n", + "Epoch 15/20 Iteration 2543/3560 Training loss: 1.9204 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2544/3560 Training loss: 1.9210 0.0512 sec/batch\n", + "Epoch 15/20 Iteration 2545/3560 Training loss: 1.9207 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2546/3560 Training loss: 1.9207 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2547/3560 Training loss: 1.9205 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2548/3560 Training loss: 1.9207 0.0538 sec/batch\n", + "Epoch 15/20 Iteration 2549/3560 Training loss: 1.9208 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2550/3560 Training loss: 1.9205 0.0535 sec/batch\n", + "Epoch 15/20 Iteration 2551/3560 Training loss: 1.9201 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2552/3560 Training loss: 1.9207 0.0520 sec/batch\n", + "Epoch 15/20 Iteration 2553/3560 Training loss: 1.9206 0.0494 sec/batch\n", + "Epoch 15/20 Iteration 2554/3560 
Training loss: 1.9213 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2555/3560 Training loss: 1.9216 0.0525 sec/batch\n", + "Epoch 15/20 Iteration 2556/3560 Training loss: 1.9221 0.0565 sec/batch\n", + "Epoch 15/20 Iteration 2557/3560 Training loss: 1.9220 0.0506 sec/batch\n", + "Epoch 15/20 Iteration 2558/3560 Training loss: 1.9223 0.0529 sec/batch\n", + "Epoch 15/20 Iteration 2559/3560 Training loss: 1.9223 0.0518 sec/batch\n", + "Epoch 15/20 Iteration 2560/3560 Training loss: 1.9217 0.0594 sec/batch\n", + "Epoch 15/20 Iteration 2561/3560 Training loss: 1.9217 0.0505 sec/batch\n", + "Epoch 15/20 Iteration 2562/3560 Training loss: 1.9217 0.0535 sec/batch\n", + "Epoch 15/20 Iteration 2563/3560 Training loss: 1.9220 0.0535 sec/batch\n", + "Epoch 15/20 Iteration 2564/3560 Training loss: 1.9222 0.0529 sec/batch\n", + "Epoch 15/20 Iteration 2565/3560 Training loss: 1.9226 0.0503 sec/batch\n", + "Epoch 15/20 Iteration 2566/3560 Training loss: 1.9222 0.0520 sec/batch\n", + "Epoch 15/20 Iteration 2567/3560 Training loss: 1.9221 0.0626 sec/batch\n", + "Epoch 15/20 Iteration 2568/3560 Training loss: 1.9225 0.0570 sec/batch\n", + "Epoch 15/20 Iteration 2569/3560 Training loss: 1.9225 0.0505 sec/batch\n", + "Epoch 15/20 Iteration 2570/3560 Training loss: 1.9227 0.0536 sec/batch\n", + "Epoch 15/20 Iteration 2571/3560 Training loss: 1.9222 0.0560 sec/batch\n", + "Epoch 15/20 Iteration 2572/3560 Training loss: 1.9222 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2573/3560 Training loss: 1.9217 0.0522 sec/batch\n", + "Epoch 15/20 Iteration 2574/3560 Training loss: 1.9217 0.0580 sec/batch\n", + "Epoch 15/20 Iteration 2575/3560 Training loss: 1.9213 0.0530 sec/batch\n", + "Epoch 15/20 Iteration 2576/3560 Training loss: 1.9212 0.0534 sec/batch\n", + "Epoch 15/20 Iteration 2577/3560 Training loss: 1.9207 0.0500 sec/batch\n", + "Epoch 15/20 Iteration 2578/3560 Training loss: 1.9203 0.0536 sec/batch\n", + "Epoch 15/20 Iteration 2579/3560 Training loss: 1.9202 0.0526 sec/batch\n", + 
"Epoch 15/20 Iteration 2580/3560 Training loss: 1.9199 0.0553 sec/batch\n", + "Epoch 15/20 Iteration 2581/3560 Training loss: 1.9194 0.0504 sec/batch\n", + "Epoch 15/20 Iteration 2582/3560 Training loss: 1.9196 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2583/3560 Training loss: 1.9194 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2584/3560 Training loss: 1.9193 0.0512 sec/batch\n", + "Epoch 15/20 Iteration 2585/3560 Training loss: 1.9189 0.0509 sec/batch\n", + "Epoch 15/20 Iteration 2586/3560 Training loss: 1.9187 0.0527 sec/batch\n", + "Epoch 15/20 Iteration 2587/3560 Training loss: 1.9184 0.0522 sec/batch\n", + "Epoch 15/20 Iteration 2588/3560 Training loss: 1.9185 0.0567 sec/batch\n", + "Epoch 15/20 Iteration 2589/3560 Training loss: 1.9185 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2590/3560 Training loss: 1.9181 0.0590 sec/batch\n", + "Epoch 15/20 Iteration 2591/3560 Training loss: 1.9177 0.0516 sec/batch\n", + "Epoch 15/20 Iteration 2592/3560 Training loss: 1.9173 0.0530 sec/batch\n", + "Epoch 15/20 Iteration 2593/3560 Training loss: 1.9173 0.0495 sec/batch\n", + "Epoch 15/20 Iteration 2594/3560 Training loss: 1.9172 0.0552 sec/batch\n", + "Epoch 15/20 Iteration 2595/3560 Training loss: 1.9170 0.0514 sec/batch\n", + "Epoch 15/20 Iteration 2596/3560 Training loss: 1.9169 0.0528 sec/batch\n", + "Epoch 15/20 Iteration 2597/3560 Training loss: 1.9166 0.0508 sec/batch\n", + "Epoch 15/20 Iteration 2598/3560 Training loss: 1.9165 0.0563 sec/batch\n", + "Epoch 15/20 Iteration 2599/3560 Training loss: 1.9164 0.0571 sec/batch\n", + "Epoch 15/20 Iteration 2600/3560 Training loss: 1.9164 0.0503 sec/batch\n", + "Epoch 15/20 Iteration 2601/3560 Training loss: 1.9164 0.0506 sec/batch\n", + "Epoch 15/20 Iteration 2602/3560 Training loss: 1.9164 0.0528 sec/batch\n", + "Epoch 15/20 Iteration 2603/3560 Training loss: 1.9164 0.0534 sec/batch\n", + "Epoch 15/20 Iteration 2604/3560 Training loss: 1.9162 0.0530 sec/batch\n", + "Epoch 15/20 Iteration 2605/3560 Training loss: 
1.9162 0.0509 sec/batch\n", + "Epoch 15/20 Iteration 2606/3560 Training loss: 1.9160 0.0522 sec/batch\n", + "Epoch 15/20 Iteration 2607/3560 Training loss: 1.9158 0.0564 sec/batch\n", + "Epoch 15/20 Iteration 2608/3560 Training loss: 1.9155 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2609/3560 Training loss: 1.9155 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2610/3560 Training loss: 1.9154 0.0587 sec/batch\n", + "Epoch 15/20 Iteration 2611/3560 Training loss: 1.9154 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2612/3560 Training loss: 1.9154 0.0514 sec/batch\n", + "Epoch 15/20 Iteration 2613/3560 Training loss: 1.9154 0.0513 sec/batch\n", + "Epoch 15/20 Iteration 2614/3560 Training loss: 1.9151 0.0528 sec/batch\n", + "Epoch 15/20 Iteration 2615/3560 Training loss: 1.9149 0.0531 sec/batch\n", + "Epoch 15/20 Iteration 2616/3560 Training loss: 1.9150 0.0526 sec/batch\n", + "Epoch 15/20 Iteration 2617/3560 Training loss: 1.9150 0.0509 sec/batch\n", + "Epoch 15/20 Iteration 2618/3560 Training loss: 1.9147 0.0527 sec/batch\n", + "Epoch 15/20 Iteration 2619/3560 Training loss: 1.9149 0.0561 sec/batch\n", + "Epoch 15/20 Iteration 2620/3560 Training loss: 1.9149 0.0505 sec/batch\n", + "Epoch 15/20 Iteration 2621/3560 Training loss: 1.9150 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2622/3560 Training loss: 1.9149 0.0524 sec/batch\n", + "Epoch 15/20 Iteration 2623/3560 Training loss: 1.9147 0.0509 sec/batch\n", + "Epoch 15/20 Iteration 2624/3560 Training loss: 1.9144 0.0543 sec/batch\n", + "Epoch 15/20 Iteration 2625/3560 Training loss: 1.9145 0.0502 sec/batch\n", + "Epoch 15/20 Iteration 2626/3560 Training loss: 1.9146 0.0540 sec/batch\n", + "Epoch 15/20 Iteration 2627/3560 Training loss: 1.9146 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2628/3560 Training loss: 1.9147 0.0583 sec/batch\n", + "Epoch 15/20 Iteration 2629/3560 Training loss: 1.9147 0.0593 sec/batch\n", + "Epoch 15/20 Iteration 2630/3560 Training loss: 1.9148 0.0543 sec/batch\n", + "Epoch 15/20 
Iteration 2631/3560 Training loss: 1.9150 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2632/3560 Training loss: 1.9148 0.0541 sec/batch\n", + "Epoch 15/20 Iteration 2633/3560 Training loss: 1.9150 0.0501 sec/batch\n", + "Epoch 15/20 Iteration 2634/3560 Training loss: 1.9149 0.0543 sec/batch\n", + "Epoch 15/20 Iteration 2635/3560 Training loss: 1.9149 0.0505 sec/batch\n", + "Epoch 15/20 Iteration 2636/3560 Training loss: 1.9149 0.0564 sec/batch\n", + "Epoch 15/20 Iteration 2637/3560 Training loss: 1.9148 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2638/3560 Training loss: 1.9149 0.0513 sec/batch\n", + "Epoch 15/20 Iteration 2639/3560 Training loss: 1.9149 0.0518 sec/batch\n", + "Epoch 15/20 Iteration 2640/3560 Training loss: 1.9151 0.0564 sec/batch\n", + "Epoch 15/20 Iteration 2641/3560 Training loss: 1.9151 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2642/3560 Training loss: 1.9149 0.0513 sec/batch\n", + "Epoch 15/20 Iteration 2643/3560 Training loss: 1.9147 0.0536 sec/batch\n", + "Epoch 15/20 Iteration 2644/3560 Training loss: 1.9149 0.0567 sec/batch\n", + "Epoch 15/20 Iteration 2645/3560 Training loss: 1.9150 0.0503 sec/batch\n", + "Epoch 15/20 Iteration 2646/3560 Training loss: 1.9151 0.0559 sec/batch\n", + "Epoch 15/20 Iteration 2647/3560 Training loss: 1.9150 0.0529 sec/batch\n", + "Epoch 15/20 Iteration 2648/3560 Training loss: 1.9150 0.0562 sec/batch\n", + "Epoch 15/20 Iteration 2649/3560 Training loss: 1.9150 0.0533 sec/batch\n", + "Epoch 15/20 Iteration 2650/3560 Training loss: 1.9150 0.0536 sec/batch\n", + "Epoch 15/20 Iteration 2651/3560 Training loss: 1.9148 0.0500 sec/batch\n", + "Epoch 15/20 Iteration 2652/3560 Training loss: 1.9150 0.0559 sec/batch\n", + "Epoch 15/20 Iteration 2653/3560 Training loss: 1.9151 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2654/3560 Training loss: 1.9151 0.0544 sec/batch\n", + "Epoch 15/20 Iteration 2655/3560 Training loss: 1.9152 0.0502 sec/batch\n", + "Epoch 15/20 Iteration 2656/3560 Training loss: 1.9152 0.0539 
sec/batch\n", + "Epoch 15/20 Iteration 2657/3560 Training loss: 1.9152 0.0506 sec/batch\n", + "Epoch 15/20 Iteration 2658/3560 Training loss: 1.9152 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2659/3560 Training loss: 1.9151 0.0602 sec/batch\n", + "Epoch 15/20 Iteration 2660/3560 Training loss: 1.9154 0.0530 sec/batch\n", + "Epoch 15/20 Iteration 2661/3560 Training loss: 1.9154 0.0499 sec/batch\n", + "Epoch 15/20 Iteration 2662/3560 Training loss: 1.9153 0.0603 sec/batch\n", + "Epoch 15/20 Iteration 2663/3560 Training loss: 1.9152 0.0525 sec/batch\n", + "Epoch 15/20 Iteration 2664/3560 Training loss: 1.9150 0.0554 sec/batch\n", + "Epoch 15/20 Iteration 2665/3560 Training loss: 1.9151 0.0508 sec/batch\n", + "Epoch 15/20 Iteration 2666/3560 Training loss: 1.9152 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2667/3560 Training loss: 1.9152 0.0535 sec/batch\n", + "Epoch 15/20 Iteration 2668/3560 Training loss: 1.9152 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2669/3560 Training loss: 1.9151 0.0556 sec/batch\n", + "Epoch 15/20 Iteration 2670/3560 Training loss: 1.9151 0.0615 sec/batch\n", + "Epoch 16/20 Iteration 2671/3560 Training loss: 1.9738 0.0645 sec/batch\n", + "Epoch 16/20 Iteration 2672/3560 Training loss: 1.9353 0.0565 sec/batch\n", + "Epoch 16/20 Iteration 2673/3560 Training loss: 1.9267 0.0647 sec/batch\n", + "Epoch 16/20 Iteration 2674/3560 Training loss: 1.9221 0.0552 sec/batch\n", + "Epoch 16/20 Iteration 2675/3560 Training loss: 1.9174 0.0545 sec/batch\n", + "Epoch 16/20 Iteration 2676/3560 Training loss: 1.9093 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2677/3560 Training loss: 1.9090 0.0499 sec/batch\n", + "Epoch 16/20 Iteration 2678/3560 Training loss: 1.9068 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2679/3560 Training loss: 1.9092 0.0526 sec/batch\n", + "Epoch 16/20 Iteration 2680/3560 Training loss: 1.9089 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2681/3560 Training loss: 1.9057 0.0514 sec/batch\n", + "Epoch 16/20 Iteration 2682/3560 
Training loss: 1.9041 0.0558 sec/batch\n", + "Epoch 16/20 Iteration 2683/3560 Training loss: 1.9044 0.0525 sec/batch\n", + "Epoch 16/20 Iteration 2684/3560 Training loss: 1.9065 0.0530 sec/batch\n", + "Epoch 16/20 Iteration 2685/3560 Training loss: 1.9061 0.0554 sec/batch\n", + "Epoch 16/20 Iteration 2686/3560 Training loss: 1.9048 0.0517 sec/batch\n", + "Epoch 16/20 Iteration 2687/3560 Training loss: 1.9044 0.0577 sec/batch\n", + "Epoch 16/20 Iteration 2688/3560 Training loss: 1.9060 0.0536 sec/batch\n", + "Epoch 16/20 Iteration 2689/3560 Training loss: 1.9066 0.0502 sec/batch\n", + "Epoch 16/20 Iteration 2690/3560 Training loss: 1.9068 0.0524 sec/batch\n", + "Epoch 16/20 Iteration 2691/3560 Training loss: 1.9062 0.0550 sec/batch\n", + "Epoch 16/20 Iteration 2692/3560 Training loss: 1.9070 0.0508 sec/batch\n", + "Epoch 16/20 Iteration 2693/3560 Training loss: 1.9068 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2694/3560 Training loss: 1.9068 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2695/3560 Training loss: 1.9064 0.0512 sec/batch\n", + "Epoch 16/20 Iteration 2696/3560 Training loss: 1.9052 0.0587 sec/batch\n", + "Epoch 16/20 Iteration 2697/3560 Training loss: 1.9045 0.0569 sec/batch\n", + "Epoch 16/20 Iteration 2698/3560 Training loss: 1.9050 0.0537 sec/batch\n", + "Epoch 16/20 Iteration 2699/3560 Training loss: 1.9060 0.0533 sec/batch\n", + "Epoch 16/20 Iteration 2700/3560 Training loss: 1.9064 0.0527 sec/batch\n", + "Epoch 16/20 Iteration 2701/3560 Training loss: 1.9062 0.0519 sec/batch\n", + "Epoch 16/20 Iteration 2702/3560 Training loss: 1.9054 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2703/3560 Training loss: 1.9058 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2704/3560 Training loss: 1.9066 0.0524 sec/batch\n", + "Epoch 16/20 Iteration 2705/3560 Training loss: 1.9064 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2706/3560 Training loss: 1.9064 0.0533 sec/batch\n", + "Epoch 16/20 Iteration 2707/3560 Training loss: 1.9061 0.0512 sec/batch\n", + 
"Epoch 16/20 Iteration 2708/3560 Training loss: 1.9051 0.0525 sec/batch\n", + "...\n", + "Epoch 16/20 Iteration 2848/3560 Training loss: 1.8954 0.0523 sec/batch\n", + "Epoch 17/20 Iteration 2849/3560 Training loss: 1.9573 0.0511 sec/batch\n", + "...\n", + "Epoch 17/20 Iteration 3026/3560 Training loss: 1.8758 0.0516 sec/batch\n", + "Epoch 18/20 Iteration 3027/3560 Training loss: 1.9420 0.0520 sec/batch\n", + "...\n", + "Epoch 18/20 Iteration 3204/3560 Training loss: 1.8583 0.0527 sec/batch\n", + "Epoch 19/20 Iteration 3205/3560 Training loss: 1.9234 0.0507 sec/batch\n", + "...\n", + "Epoch 19/20 Iteration 3245/3560 Training loss: 
1.8486 0.0526 sec/batch\n", + "Epoch 19/20 Iteration 3246/3560 Training loss: 1.8489 0.0538 sec/batch\n", + "Epoch 19/20 Iteration 3247/3560 Training loss: 1.8484 0.0511 sec/batch\n", + "Epoch 19/20 Iteration 3248/3560 Training loss: 1.8477 0.0607 sec/batch\n", + "Epoch 19/20 Iteration 3249/3560 Training loss: 1.8480 0.0521 sec/batch\n", + "Epoch 19/20 Iteration 3250/3560 Training loss: 1.8467 0.0561 sec/batch\n", + "Epoch 19/20 Iteration 3251/3560 Training loss: 1.8465 0.0533 sec/batch\n", + "Epoch 19/20 Iteration 3252/3560 Training loss: 1.8463 0.0506 sec/batch\n", + "Epoch 19/20 Iteration 3253/3560 Training loss: 1.8462 0.0511 sec/batch\n", + "Epoch 19/20 Iteration 3254/3560 Training loss: 1.8468 0.0575 sec/batch\n", + "Epoch 19/20 Iteration 3255/3560 Training loss: 1.8464 0.0528 sec/batch\n", + "Epoch 19/20 Iteration 3256/3560 Training loss: 1.8473 0.0567 sec/batch\n", + "Epoch 19/20 Iteration 3257/3560 Training loss: 1.8474 0.0503 sec/batch\n", + "Epoch 19/20 Iteration 3258/3560 Training loss: 1.8475 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3259/3560 Training loss: 1.8472 0.0514 sec/batch\n", + "Epoch 19/20 Iteration 3260/3560 Training loss: 1.8475 0.0570 sec/batch\n", + "Epoch 19/20 Iteration 3261/3560 Training loss: 1.8478 0.0510 sec/batch\n", + "Epoch 19/20 Iteration 3262/3560 Training loss: 1.8476 0.0537 sec/batch\n", + "Epoch 19/20 Iteration 3263/3560 Training loss: 1.8473 0.0509 sec/batch\n", + "Epoch 19/20 Iteration 3264/3560 Training loss: 1.8479 0.0520 sec/batch\n", + "Epoch 19/20 Iteration 3265/3560 Training loss: 1.8478 0.0513 sec/batch\n", + "Epoch 19/20 Iteration 3266/3560 Training loss: 1.8485 0.0558 sec/batch\n", + "Epoch 19/20 Iteration 3267/3560 Training loss: 1.8488 0.0547 sec/batch\n", + "Epoch 19/20 Iteration 3268/3560 Training loss: 1.8492 0.0530 sec/batch\n", + "Epoch 19/20 Iteration 3269/3560 Training loss: 1.8490 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3270/3560 Training loss: 1.8492 0.0542 sec/batch\n", + "Epoch 19/20 
Iteration 3271/3560 Training loss: 1.8494 0.0589 sec/batch\n", + "Epoch 19/20 Iteration 3272/3560 Training loss: 1.8490 0.0538 sec/batch\n", + "Epoch 19/20 Iteration 3273/3560 Training loss: 1.8489 0.0509 sec/batch\n", + "Epoch 19/20 Iteration 3274/3560 Training loss: 1.8488 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3275/3560 Training loss: 1.8490 0.0531 sec/batch\n", + "Epoch 19/20 Iteration 3276/3560 Training loss: 1.8492 0.0513 sec/batch\n", + "Epoch 19/20 Iteration 3277/3560 Training loss: 1.8497 0.0506 sec/batch\n", + "Epoch 19/20 Iteration 3278/3560 Training loss: 1.8495 0.0503 sec/batch\n", + "Epoch 19/20 Iteration 3279/3560 Training loss: 1.8495 0.0500 sec/batch\n", + "Epoch 19/20 Iteration 3280/3560 Training loss: 1.8497 0.0547 sec/batch\n", + "Epoch 19/20 Iteration 3281/3560 Training loss: 1.8497 0.0516 sec/batch\n", + "Epoch 19/20 Iteration 3282/3560 Training loss: 1.8498 0.0552 sec/batch\n", + "Epoch 19/20 Iteration 3283/3560 Training loss: 1.8493 0.0551 sec/batch\n", + "Epoch 19/20 Iteration 3284/3560 Training loss: 1.8492 0.0510 sec/batch\n", + "Epoch 19/20 Iteration 3285/3560 Training loss: 1.8487 0.0514 sec/batch\n", + "Epoch 19/20 Iteration 3286/3560 Training loss: 1.8488 0.0535 sec/batch\n", + "Epoch 19/20 Iteration 3287/3560 Training loss: 1.8483 0.0543 sec/batch\n", + "Epoch 19/20 Iteration 3288/3560 Training loss: 1.8482 0.0529 sec/batch\n", + "Epoch 19/20 Iteration 3289/3560 Training loss: 1.8477 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3290/3560 Training loss: 1.8475 0.0529 sec/batch\n", + "Epoch 19/20 Iteration 3291/3560 Training loss: 1.8473 0.0521 sec/batch\n", + "Epoch 19/20 Iteration 3292/3560 Training loss: 1.8471 0.0514 sec/batch\n", + "Epoch 19/20 Iteration 3293/3560 Training loss: 1.8466 0.0514 sec/batch\n", + "Epoch 19/20 Iteration 3294/3560 Training loss: 1.8467 0.0513 sec/batch\n", + "Epoch 19/20 Iteration 3295/3560 Training loss: 1.8464 0.0534 sec/batch\n", + "Epoch 19/20 Iteration 3296/3560 Training loss: 1.8461 0.0553 
sec/batch\n", + "Epoch 19/20 Iteration 3297/3560 Training loss: 1.8457 0.0537 sec/batch\n", + "Epoch 19/20 Iteration 3298/3560 Training loss: 1.8454 0.0586 sec/batch\n", + "Epoch 19/20 Iteration 3299/3560 Training loss: 1.8453 0.0589 sec/batch\n", + "Epoch 19/20 Iteration 3300/3560 Training loss: 1.8452 0.0558 sec/batch\n", + "Epoch 19/20 Iteration 3301/3560 Training loss: 1.8450 0.0516 sec/batch\n", + "Epoch 19/20 Iteration 3302/3560 Training loss: 1.8446 0.0544 sec/batch\n", + "Epoch 19/20 Iteration 3303/3560 Training loss: 1.8442 0.0631 sec/batch\n", + "Epoch 19/20 Iteration 3304/3560 Training loss: 1.8438 0.0533 sec/batch\n", + "Epoch 19/20 Iteration 3305/3560 Training loss: 1.8438 0.0517 sec/batch\n", + "Epoch 19/20 Iteration 3306/3560 Training loss: 1.8437 0.0526 sec/batch\n", + "Epoch 19/20 Iteration 3307/3560 Training loss: 1.8435 0.0523 sec/batch\n", + "Epoch 19/20 Iteration 3308/3560 Training loss: 1.8433 0.0548 sec/batch\n", + "Epoch 19/20 Iteration 3309/3560 Training loss: 1.8431 0.0575 sec/batch\n", + "Epoch 19/20 Iteration 3310/3560 Training loss: 1.8430 0.0514 sec/batch\n", + "Epoch 19/20 Iteration 3311/3560 Training loss: 1.8429 0.0546 sec/batch\n", + "Epoch 19/20 Iteration 3312/3560 Training loss: 1.8430 0.0508 sec/batch\n", + "Epoch 19/20 Iteration 3313/3560 Training loss: 1.8429 0.0543 sec/batch\n", + "Epoch 19/20 Iteration 3314/3560 Training loss: 1.8429 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3315/3560 Training loss: 1.8429 0.0572 sec/batch\n", + "Epoch 19/20 Iteration 3316/3560 Training loss: 1.8429 0.0566 sec/batch\n", + "Epoch 19/20 Iteration 3317/3560 Training loss: 1.8427 0.0505 sec/batch\n", + "Epoch 19/20 Iteration 3318/3560 Training loss: 1.8425 0.0498 sec/batch\n", + "Epoch 19/20 Iteration 3319/3560 Training loss: 1.8423 0.0514 sec/batch\n", + "Epoch 19/20 Iteration 3320/3560 Training loss: 1.8419 0.0549 sec/batch\n", + "Epoch 19/20 Iteration 3321/3560 Training loss: 1.8418 0.0511 sec/batch\n", + "Epoch 19/20 Iteration 3322/3560 
Training loss: 1.8418 0.0584 sec/batch\n", + "Epoch 19/20 Iteration 3323/3560 Training loss: 1.8419 0.0522 sec/batch\n", + "Epoch 19/20 Iteration 3324/3560 Training loss: 1.8419 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3325/3560 Training loss: 1.8419 0.0507 sec/batch\n", + "Epoch 19/20 Iteration 3326/3560 Training loss: 1.8416 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3327/3560 Training loss: 1.8414 0.0542 sec/batch\n", + "Epoch 19/20 Iteration 3328/3560 Training loss: 1.8416 0.0541 sec/batch\n", + "Epoch 19/20 Iteration 3329/3560 Training loss: 1.8416 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3330/3560 Training loss: 1.8413 0.0542 sec/batch\n", + "Epoch 19/20 Iteration 3331/3560 Training loss: 1.8415 0.0528 sec/batch\n", + "Epoch 19/20 Iteration 3332/3560 Training loss: 1.8416 0.0517 sec/batch\n", + "Epoch 19/20 Iteration 3333/3560 Training loss: 1.8414 0.0520 sec/batch\n", + "Epoch 19/20 Iteration 3334/3560 Training loss: 1.8414 0.0499 sec/batch\n", + "Epoch 19/20 Iteration 3335/3560 Training loss: 1.8412 0.0532 sec/batch\n", + "Epoch 19/20 Iteration 3336/3560 Training loss: 1.8408 0.0575 sec/batch\n", + "Epoch 19/20 Iteration 3337/3560 Training loss: 1.8409 0.0523 sec/batch\n", + "Epoch 19/20 Iteration 3338/3560 Training loss: 1.8410 0.0536 sec/batch\n", + "Epoch 19/20 Iteration 3339/3560 Training loss: 1.8410 0.0580 sec/batch\n", + "Epoch 19/20 Iteration 3340/3560 Training loss: 1.8411 0.0561 sec/batch\n", + "Epoch 19/20 Iteration 3341/3560 Training loss: 1.8412 0.0523 sec/batch\n", + "Epoch 19/20 Iteration 3342/3560 Training loss: 1.8412 0.0543 sec/batch\n", + "Epoch 19/20 Iteration 3343/3560 Training loss: 1.8414 0.0509 sec/batch\n", + "Epoch 19/20 Iteration 3344/3560 Training loss: 1.8412 0.0547 sec/batch\n", + "Epoch 19/20 Iteration 3345/3560 Training loss: 1.8415 0.0521 sec/batch\n", + "Epoch 19/20 Iteration 3346/3560 Training loss: 1.8414 0.0545 sec/batch\n", + "Epoch 19/20 Iteration 3347/3560 Training loss: 1.8414 0.0525 sec/batch\n", + 
"Epoch 19/20 Iteration 3348/3560 Training loss: 1.8415 0.0624 sec/batch\n", + "Epoch 19/20 Iteration 3349/3560 Training loss: 1.8413 0.0534 sec/batch\n", + "Epoch 19/20 Iteration 3350/3560 Training loss: 1.8413 0.0510 sec/batch\n", + "Epoch 19/20 Iteration 3351/3560 Training loss: 1.8414 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3352/3560 Training loss: 1.8415 0.0510 sec/batch\n", + "Epoch 19/20 Iteration 3353/3560 Training loss: 1.8415 0.0521 sec/batch\n", + "Epoch 19/20 Iteration 3354/3560 Training loss: 1.8414 0.0521 sec/batch\n", + "Epoch 19/20 Iteration 3355/3560 Training loss: 1.8412 0.0538 sec/batch\n", + "Epoch 19/20 Iteration 3356/3560 Training loss: 1.8413 0.0523 sec/batch\n", + "Epoch 19/20 Iteration 3357/3560 Training loss: 1.8413 0.0528 sec/batch\n", + "Epoch 19/20 Iteration 3358/3560 Training loss: 1.8415 0.0548 sec/batch\n", + "Epoch 19/20 Iteration 3359/3560 Training loss: 1.8414 0.0533 sec/batch\n", + "Epoch 19/20 Iteration 3360/3560 Training loss: 1.8414 0.0600 sec/batch\n", + "Epoch 19/20 Iteration 3361/3560 Training loss: 1.8414 0.0509 sec/batch\n", + "Epoch 19/20 Iteration 3362/3560 Training loss: 1.8414 0.0564 sec/batch\n", + "Epoch 19/20 Iteration 3363/3560 Training loss: 1.8412 0.0541 sec/batch\n", + "Epoch 19/20 Iteration 3364/3560 Training loss: 1.8415 0.0532 sec/batch\n", + "Epoch 19/20 Iteration 3365/3560 Training loss: 1.8416 0.0534 sec/batch\n", + "Epoch 19/20 Iteration 3366/3560 Training loss: 1.8416 0.0561 sec/batch\n", + "Epoch 19/20 Iteration 3367/3560 Training loss: 1.8417 0.0547 sec/batch\n", + "Epoch 19/20 Iteration 3368/3560 Training loss: 1.8418 0.0617 sec/batch\n", + "Epoch 19/20 Iteration 3369/3560 Training loss: 1.8417 0.0562 sec/batch\n", + "Epoch 19/20 Iteration 3370/3560 Training loss: 1.8417 0.0527 sec/batch\n", + "Epoch 19/20 Iteration 3371/3560 Training loss: 1.8417 0.0568 sec/batch\n", + "Epoch 19/20 Iteration 3372/3560 Training loss: 1.8421 0.0582 sec/batch\n", + "Epoch 19/20 Iteration 3373/3560 Training loss: 
1.8420 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3374/3560 Training loss: 1.8420 0.0537 sec/batch\n", + "Epoch 19/20 Iteration 3375/3560 Training loss: 1.8419 0.0546 sec/batch\n", + "Epoch 19/20 Iteration 3376/3560 Training loss: 1.8417 0.0520 sec/batch\n", + "Epoch 19/20 Iteration 3377/3560 Training loss: 1.8418 0.0551 sec/batch\n", + "Epoch 19/20 Iteration 3378/3560 Training loss: 1.8419 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3379/3560 Training loss: 1.8419 0.0514 sec/batch\n", + "Epoch 19/20 Iteration 3380/3560 Training loss: 1.8418 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3381/3560 Training loss: 1.8417 0.0615 sec/batch\n", + "Epoch 19/20 Iteration 3382/3560 Training loss: 1.8418 0.0531 sec/batch\n", + "Epoch 20/20 Iteration 3383/3560 Training loss: 1.9053 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3384/3560 Training loss: 1.8657 0.0536 sec/batch\n", + "Epoch 20/20 Iteration 3385/3560 Training loss: 1.8540 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3386/3560 Training loss: 1.8505 0.0570 sec/batch\n", + "Epoch 20/20 Iteration 3387/3560 Training loss: 1.8464 0.0529 sec/batch\n", + "Epoch 20/20 Iteration 3388/3560 Training loss: 1.8376 0.0524 sec/batch\n", + "Epoch 20/20 Iteration 3389/3560 Training loss: 1.8383 0.0520 sec/batch\n", + "Epoch 20/20 Iteration 3390/3560 Training loss: 1.8364 0.0559 sec/batch\n", + "Epoch 20/20 Iteration 3391/3560 Training loss: 1.8395 0.0519 sec/batch\n", + "Epoch 20/20 Iteration 3392/3560 Training loss: 1.8398 0.0595 sec/batch\n", + "Epoch 20/20 Iteration 3393/3560 Training loss: 1.8362 0.0556 sec/batch\n", + "Epoch 20/20 Iteration 3394/3560 Training loss: 1.8341 0.0552 sec/batch\n", + "Epoch 20/20 Iteration 3395/3560 Training loss: 1.8347 0.0524 sec/batch\n", + "Epoch 20/20 Iteration 3396/3560 Training loss: 1.8371 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3397/3560 Training loss: 1.8365 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3398/3560 Training loss: 1.8344 0.0539 sec/batch\n", + "Epoch 20/20 
Iteration 3399/3560 Training loss: 1.8345 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3400/3560 Training loss: 1.8362 0.0531 sec/batch\n", + "Epoch 20/20 Iteration 3401/3560 Training loss: 1.8357 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3402/3560 Training loss: 1.8369 0.0504 sec/batch\n", + "Epoch 20/20 Iteration 3403/3560 Training loss: 1.8365 0.0595 sec/batch\n", + "Epoch 20/20 Iteration 3404/3560 Training loss: 1.8374 0.0523 sec/batch\n", + "Epoch 20/20 Iteration 3405/3560 Training loss: 1.8368 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3406/3560 Training loss: 1.8365 0.0524 sec/batch\n", + "Epoch 20/20 Iteration 3407/3560 Training loss: 1.8359 0.0520 sec/batch\n", + "Epoch 20/20 Iteration 3408/3560 Training loss: 1.8348 0.0517 sec/batch\n", + "Epoch 20/20 Iteration 3409/3560 Training loss: 1.8340 0.0520 sec/batch\n", + "Epoch 20/20 Iteration 3410/3560 Training loss: 1.8343 0.0552 sec/batch\n", + "Epoch 20/20 Iteration 3411/3560 Training loss: 1.8355 0.0570 sec/batch\n", + "Epoch 20/20 Iteration 3412/3560 Training loss: 1.8364 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3413/3560 Training loss: 1.8360 0.0568 sec/batch\n", + "Epoch 20/20 Iteration 3414/3560 Training loss: 1.8353 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3415/3560 Training loss: 1.8354 0.0554 sec/batch\n", + "Epoch 20/20 Iteration 3416/3560 Training loss: 1.8362 0.0538 sec/batch\n", + "Epoch 20/20 Iteration 3417/3560 Training loss: 1.8361 0.0521 sec/batch\n", + "Epoch 20/20 Iteration 3418/3560 Training loss: 1.8359 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3419/3560 Training loss: 1.8355 0.0525 sec/batch\n", + "Epoch 20/20 Iteration 3420/3560 Training loss: 1.8344 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3421/3560 Training loss: 1.8332 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3422/3560 Training loss: 1.8323 0.0583 sec/batch\n", + "Epoch 20/20 Iteration 3423/3560 Training loss: 1.8319 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3424/3560 Training loss: 1.8323 0.0553 
sec/batch\n", + "Epoch 20/20 Iteration 3425/3560 Training loss: 1.8318 0.0533 sec/batch\n", + "Epoch 20/20 Iteration 3426/3560 Training loss: 1.8312 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3427/3560 Training loss: 1.8315 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3428/3560 Training loss: 1.8303 0.0641 sec/batch\n", + "Epoch 20/20 Iteration 3429/3560 Training loss: 1.8301 0.0520 sec/batch\n", + "Epoch 20/20 Iteration 3430/3560 Training loss: 1.8299 0.0554 sec/batch\n", + "Epoch 20/20 Iteration 3431/3560 Training loss: 1.8300 0.0541 sec/batch\n", + "Epoch 20/20 Iteration 3432/3560 Training loss: 1.8310 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3433/3560 Training loss: 1.8305 0.0537 sec/batch\n", + "Epoch 20/20 Iteration 3434/3560 Training loss: 1.8315 0.0558 sec/batch\n", + "Epoch 20/20 Iteration 3435/3560 Training loss: 1.8314 0.0534 sec/batch\n", + "Epoch 20/20 Iteration 3436/3560 Training loss: 1.8315 0.0520 sec/batch\n", + "Epoch 20/20 Iteration 3437/3560 Training loss: 1.8314 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3438/3560 Training loss: 1.8316 0.0523 sec/batch\n", + "Epoch 20/20 Iteration 3439/3560 Training loss: 1.8319 0.0561 sec/batch\n", + "Epoch 20/20 Iteration 3440/3560 Training loss: 1.8316 0.0573 sec/batch\n", + "Epoch 20/20 Iteration 3441/3560 Training loss: 1.8312 0.0512 sec/batch\n", + "Epoch 20/20 Iteration 3442/3560 Training loss: 1.8318 0.0543 sec/batch\n", + "Epoch 20/20 Iteration 3443/3560 Training loss: 1.8316 0.0562 sec/batch\n", + "Epoch 20/20 Iteration 3444/3560 Training loss: 1.8325 0.0542 sec/batch\n", + "Epoch 20/20 Iteration 3445/3560 Training loss: 1.8328 0.0503 sec/batch\n", + "Epoch 20/20 Iteration 3446/3560 Training loss: 1.8331 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3447/3560 Training loss: 1.8329 0.0648 sec/batch\n", + "Epoch 20/20 Iteration 3448/3560 Training loss: 1.8334 0.0598 sec/batch\n", + "Epoch 20/20 Iteration 3449/3560 Training loss: 1.8335 0.0534 sec/batch\n", + "Epoch 20/20 Iteration 3450/3560 
Training loss: 1.8331 0.0604 sec/batch\n", + "Epoch 20/20 Iteration 3451/3560 Training loss: 1.8329 0.0547 sec/batch\n", + "Epoch 20/20 Iteration 3452/3560 Training loss: 1.8330 0.0554 sec/batch\n", + "Epoch 20/20 Iteration 3453/3560 Training loss: 1.8335 0.0531 sec/batch\n", + "Epoch 20/20 Iteration 3454/3560 Training loss: 1.8336 0.0511 sec/batch\n", + "Epoch 20/20 Iteration 3455/3560 Training loss: 1.8340 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3456/3560 Training loss: 1.8336 0.0622 sec/batch\n", + "Epoch 20/20 Iteration 3457/3560 Training loss: 1.8336 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3458/3560 Training loss: 1.8340 0.0616 sec/batch\n", + "Epoch 20/20 Iteration 3459/3560 Training loss: 1.8339 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3460/3560 Training loss: 1.8340 0.0508 sec/batch\n", + "Epoch 20/20 Iteration 3461/3560 Training loss: 1.8335 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3462/3560 Training loss: 1.8335 0.0519 sec/batch\n", + "Epoch 20/20 Iteration 3463/3560 Training loss: 1.8329 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3464/3560 Training loss: 1.8330 0.0575 sec/batch\n", + "Epoch 20/20 Iteration 3465/3560 Training loss: 1.8325 0.0524 sec/batch\n", + "Epoch 20/20 Iteration 3466/3560 Training loss: 1.8324 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3467/3560 Training loss: 1.8320 0.0506 sec/batch\n", + "Epoch 20/20 Iteration 3468/3560 Training loss: 1.8317 0.0546 sec/batch\n", + "Epoch 20/20 Iteration 3469/3560 Training loss: 1.8316 0.0519 sec/batch\n", + "Epoch 20/20 Iteration 3470/3560 Training loss: 1.8312 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3471/3560 Training loss: 1.8307 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3472/3560 Training loss: 1.8308 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3473/3560 Training loss: 1.8305 0.0564 sec/batch\n", + "Epoch 20/20 Iteration 3474/3560 Training loss: 1.8304 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3475/3560 Training loss: 1.8300 0.0536 sec/batch\n", + 
"Epoch 20/20 Iteration 3476/3560 Training loss: 1.8296 0.0679 sec/batch\n", + "Epoch 20/20 Iteration 3477/3560 Training loss: 1.8294 0.0506 sec/batch\n", + "Epoch 20/20 Iteration 3478/3560 Training loss: 1.8294 0.0531 sec/batch\n", + "Epoch 20/20 Iteration 3479/3560 Training loss: 1.8293 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3480/3560 Training loss: 1.8289 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3481/3560 Training loss: 1.8285 0.0579 sec/batch\n", + "Epoch 20/20 Iteration 3482/3560 Training loss: 1.8281 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3483/3560 Training loss: 1.8281 0.0533 sec/batch\n", + "Epoch 20/20 Iteration 3484/3560 Training loss: 1.8280 0.0550 sec/batch\n", + "Epoch 20/20 Iteration 3485/3560 Training loss: 1.8278 0.0525 sec/batch\n", + "Epoch 20/20 Iteration 3486/3560 Training loss: 1.8276 0.0533 sec/batch\n", + "Epoch 20/20 Iteration 3487/3560 Training loss: 1.8274 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3488/3560 Training loss: 1.8273 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3489/3560 Training loss: 1.8272 0.0511 sec/batch\n", + "Epoch 20/20 Iteration 3490/3560 Training loss: 1.8273 0.0521 sec/batch\n", + "Epoch 20/20 Iteration 3491/3560 Training loss: 1.8273 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3492/3560 Training loss: 1.8273 0.0547 sec/batch\n", + "Epoch 20/20 Iteration 3493/3560 Training loss: 1.8273 0.0530 sec/batch\n", + "Epoch 20/20 Iteration 3494/3560 Training loss: 1.8272 0.0571 sec/batch\n", + "Epoch 20/20 Iteration 3495/3560 Training loss: 1.8271 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3496/3560 Training loss: 1.8270 0.0502 sec/batch\n", + "Epoch 20/20 Iteration 3497/3560 Training loss: 1.8268 0.0503 sec/batch\n", + "Epoch 20/20 Iteration 3498/3560 Training loss: 1.8265 0.0502 sec/batch\n", + "Epoch 20/20 Iteration 3499/3560 Training loss: 1.8264 0.0519 sec/batch\n", + "Epoch 20/20 Iteration 3500/3560 Training loss: 1.8264 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3501/3560 Training loss: 
1.8264 0.0517 sec/batch\n", + "Epoch 20/20 Iteration 3502/3560 Training loss: 1.8264 0.0538 sec/batch\n", + "Epoch 20/20 Iteration 3503/3560 Training loss: 1.8263 0.0524 sec/batch\n", + "Epoch 20/20 Iteration 3504/3560 Training loss: 1.8260 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3505/3560 Training loss: 1.8257 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3506/3560 Training loss: 1.8259 0.0519 sec/batch\n", + "Epoch 20/20 Iteration 3507/3560 Training loss: 1.8259 0.0530 sec/batch\n", + "Epoch 20/20 Iteration 3508/3560 Training loss: 1.8256 0.0598 sec/batch\n", + "Epoch 20/20 Iteration 3509/3560 Training loss: 1.8258 0.0499 sec/batch\n", + "Epoch 20/20 Iteration 3510/3560 Training loss: 1.8258 0.0502 sec/batch\n", + "Epoch 20/20 Iteration 3511/3560 Training loss: 1.8258 0.0549 sec/batch\n", + "Epoch 20/20 Iteration 3512/3560 Training loss: 1.8257 0.0596 sec/batch\n", + "Epoch 20/20 Iteration 3513/3560 Training loss: 1.8254 0.0564 sec/batch\n", + "Epoch 20/20 Iteration 3514/3560 Training loss: 1.8251 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3515/3560 Training loss: 1.8253 0.0558 sec/batch\n", + "Epoch 20/20 Iteration 3516/3560 Training loss: 1.8254 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3517/3560 Training loss: 1.8254 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3518/3560 Training loss: 1.8255 0.0546 sec/batch\n", + "Epoch 20/20 Iteration 3519/3560 Training loss: 1.8256 0.0544 sec/batch\n", + "Epoch 20/20 Iteration 3520/3560 Training loss: 1.8256 0.0540 sec/batch\n", + "Epoch 20/20 Iteration 3521/3560 Training loss: 1.8258 0.0511 sec/batch\n", + "Epoch 20/20 Iteration 3522/3560 Training loss: 1.8257 0.0549 sec/batch\n", + "Epoch 20/20 Iteration 3523/3560 Training loss: 1.8259 0.0502 sec/batch\n", + "Epoch 20/20 Iteration 3524/3560 Training loss: 1.8258 0.0537 sec/batch\n", + "Epoch 20/20 Iteration 3525/3560 Training loss: 1.8257 0.0505 sec/batch\n", + "Epoch 20/20 Iteration 3526/3560 Training loss: 1.8258 0.0578 sec/batch\n", + "Epoch 20/20 
Iteration 3527/3560 Training loss: 1.8257 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3528/3560 Training loss: 1.8258 0.0542 sec/batch\n", + "Epoch 20/20 Iteration 3529/3560 Training loss: 1.8259 0.0508 sec/batch\n", + "Epoch 20/20 Iteration 3530/3560 Training loss: 1.8261 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3531/3560 Training loss: 1.8261 0.0534 sec/batch\n", + "Epoch 20/20 Iteration 3532/3560 Training loss: 1.8259 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3533/3560 Training loss: 1.8257 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3534/3560 Training loss: 1.8259 0.0539 sec/batch\n", + "Epoch 20/20 Iteration 3535/3560 Training loss: 1.8259 0.0578 sec/batch\n", + "Epoch 20/20 Iteration 3536/3560 Training loss: 1.8260 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3537/3560 Training loss: 1.8259 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3538/3560 Training loss: 1.8260 0.0543 sec/batch\n", + "Epoch 20/20 Iteration 3539/3560 Training loss: 1.8260 0.0503 sec/batch\n", + "Epoch 20/20 Iteration 3540/3560 Training loss: 1.8260 0.0602 sec/batch\n", + "Epoch 20/20 Iteration 3541/3560 Training loss: 1.8257 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3542/3560 Training loss: 1.8260 0.0540 sec/batch\n", + "Epoch 20/20 Iteration 3543/3560 Training loss: 1.8261 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3544/3560 Training loss: 1.8261 0.0576 sec/batch\n", + "Epoch 20/20 Iteration 3545/3560 Training loss: 1.8262 0.0549 sec/batch\n", + "Epoch 20/20 Iteration 3546/3560 Training loss: 1.8262 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3547/3560 Training loss: 1.8262 0.0586 sec/batch\n", + "Epoch 20/20 Iteration 3548/3560 Training loss: 1.8261 0.0579 sec/batch\n", + "Epoch 20/20 Iteration 3549/3560 Training loss: 1.8262 0.0504 sec/batch\n", + "Epoch 20/20 Iteration 3550/3560 Training loss: 1.8265 0.0601 sec/batch\n", + "Epoch 20/20 Iteration 3551/3560 Training loss: 1.8265 0.0538 sec/batch\n", + "Epoch 20/20 Iteration 3552/3560 Training loss: 1.8265 0.0626 
sec/batch\n", + "Epoch 20/20 Iteration 3553/3560 Training loss: 1.8264 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3554/3560 Training loss: 1.8262 0.0512 sec/batch\n", + "Epoch 20/20 Iteration 3555/3560 Training loss: 1.8263 0.0519 sec/batch\n", + "Epoch 20/20 Iteration 3556/3560 Training loss: 1.8264 0.0524 sec/batch\n", + "Epoch 20/20 Iteration 3557/3560 Training loss: 1.8265 0.0506 sec/batch\n", + "Epoch 20/20 Iteration 3558/3560 Training loss: 1.8263 0.0512 sec/batch\n", + "Epoch 20/20 Iteration 3559/3560 Training loss: 1.8262 0.0548 sec/batch\n", + "Epoch 20/20 Iteration 3560/3560 Training loss: 1.8263 0.0577 sec/batch\n", + "Epoch 1/20 Iteration 1/3560 Training loss: 4.4182 0.6103 sec/batch\n", + "Epoch 1/20 Iteration 2/3560 Training loss: 4.4154 0.0271 sec/batch\n", + "Epoch 1/20 Iteration 3/3560 Training loss: 4.4123 0.0267 sec/batch\n", + "Epoch 1/20 Iteration 4/3560 Training loss: 4.4093 0.0290 sec/batch\n", + "Epoch 1/20 Iteration 5/3560 Training loss: 4.4061 0.0281 sec/batch\n", + "Epoch 1/20 Iteration 6/3560 Training loss: 4.4030 0.0275 sec/batch\n", + "Epoch 1/20 Iteration 7/3560 Training loss: 4.3994 0.0285 sec/batch\n", + "Epoch 1/20 Iteration 8/3560 Training loss: 4.3957 0.0293 sec/batch\n", + "Epoch 1/20 Iteration 9/3560 Training loss: 4.3915 0.0343 sec/batch\n", + "Epoch 1/20 Iteration 10/3560 Training loss: 4.3868 0.0354 sec/batch\n", + "Epoch 1/20 Iteration 11/3560 Training loss: 4.3812 0.0407 sec/batch\n", + "Epoch 1/20 Iteration 12/3560 Training loss: 4.3747 0.0351 sec/batch\n", + "Epoch 1/20 Iteration 13/3560 Training loss: 4.3665 0.0358 sec/batch\n", + "Epoch 1/20 Iteration 14/3560 Training loss: 4.3558 0.0374 sec/batch\n", + "Epoch 1/20 Iteration 15/3560 Training loss: 4.3403 0.0382 sec/batch\n", + "Epoch 1/20 Iteration 16/3560 Training loss: 4.3181 0.0444 sec/batch\n", + "Epoch 1/20 Iteration 17/3560 Training loss: 4.2906 0.0429 sec/batch\n", + "Epoch 1/20 Iteration 18/3560 Training loss: 4.2625 0.0463 sec/batch\n", + "Epoch 1/20 
Iteration 19/3560 Training loss: 4.2334 0.0424 sec/batch\n", + "[... repetitive per-iteration training log trimmed; loss decreases steadily at ~0.045-0.05 sec/batch: Epoch 1: 4.23 -> 3.32, Epoch 2: 3.19 -> 2.96, Epoch 3: 2.88 -> 2.65, Epoch 4: 2.64 -> 2.54 ...]\n", + "Epoch 4/20 Iteration 570/3560 Training loss: 2.5413 0.0474 sec/batch\n", + "Epoch 
4/20 Iteration 572/3560 Training loss: 2.5394 0.0474 sec/batch\n", + "Epoch 4/20 Iteration 573/3560 Training loss: 2.5387 0.0482 sec/batch\n", + "Epoch 4/20 Iteration 574/3560 Training loss: 2.5380 0.0488 sec/batch\n", + "Epoch 4/20 Iteration 575/3560 Training loss: 2.5369 0.0476 sec/batch\n", + "Epoch 4/20 Iteration 576/3560 Training loss: 2.5362 0.0471 sec/batch\n", + "Epoch 4/20 Iteration 577/3560 Training loss: 2.5353 0.0537 sec/batch\n", + "Epoch 4/20 Iteration 578/3560 Training loss: 2.5345 0.0507 sec/batch\n", + "Epoch 4/20 Iteration 579/3560 Training loss: 2.5337 0.0504 sec/batch\n", + "Epoch 4/20 Iteration 580/3560 Training loss: 2.5326 0.0488 sec/batch\n", + "Epoch 4/20 Iteration 581/3560 Training loss: 2.5324 0.0520 sec/batch\n", + "Epoch 4/20 Iteration 582/3560 Training loss: 2.5319 0.0547 sec/batch\n", + "Epoch 4/20 Iteration 583/3560 Training loss: 2.5313 0.0515 sec/batch\n", + "Epoch 4/20 Iteration 584/3560 Training loss: 2.5311 0.0508 sec/batch\n", + "Epoch 4/20 Iteration 585/3560 Training loss: 2.5305 0.0505 sec/batch\n", + "Epoch 4/20 Iteration 586/3560 Training loss: 2.5301 0.0542 sec/batch\n", + "Epoch 4/20 Iteration 587/3560 Training loss: 2.5294 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 588/3560 Training loss: 2.5288 0.0473 sec/batch\n", + "Epoch 4/20 Iteration 589/3560 Training loss: 2.5283 0.0526 sec/batch\n", + "Epoch 4/20 Iteration 590/3560 Training loss: 2.5278 0.0494 sec/batch\n", + "Epoch 4/20 Iteration 591/3560 Training loss: 2.5275 0.0496 sec/batch\n", + "Epoch 4/20 Iteration 592/3560 Training loss: 2.5268 0.0482 sec/batch\n", + "Epoch 4/20 Iteration 593/3560 Training loss: 2.5262 0.0540 sec/batch\n", + "Epoch 4/20 Iteration 594/3560 Training loss: 2.5259 0.0472 sec/batch\n", + "Epoch 4/20 Iteration 595/3560 Training loss: 2.5254 0.0493 sec/batch\n", + "Epoch 4/20 Iteration 596/3560 Training loss: 2.5251 0.0480 sec/batch\n", + "Epoch 4/20 Iteration 597/3560 Training loss: 2.5250 0.0474 sec/batch\n", + "Epoch 4/20 Iteration 598/3560 
Training loss: 2.5244 0.0645 sec/batch\n", + "Epoch 4/20 Iteration 599/3560 Training loss: 2.5239 0.0482 sec/batch\n", + "Epoch 4/20 Iteration 600/3560 Training loss: 2.5236 0.0533 sec/batch\n", + "Epoch 4/20 Iteration 601/3560 Training loss: 2.5231 0.0493 sec/batch\n", + "Epoch 4/20 Iteration 602/3560 Training loss: 2.5220 0.0474 sec/batch\n", + "Epoch 4/20 Iteration 603/3560 Training loss: 2.5213 0.0498 sec/batch\n", + "Epoch 4/20 Iteration 604/3560 Training loss: 2.5210 0.0534 sec/batch\n", + "Epoch 4/20 Iteration 605/3560 Training loss: 2.5206 0.0595 sec/batch\n", + "Epoch 4/20 Iteration 606/3560 Training loss: 2.5203 0.0520 sec/batch\n", + "Epoch 4/20 Iteration 607/3560 Training loss: 2.5199 0.0489 sec/batch\n", + "Epoch 4/20 Iteration 608/3560 Training loss: 2.5194 0.0488 sec/batch\n", + "Epoch 4/20 Iteration 609/3560 Training loss: 2.5190 0.0486 sec/batch\n", + "Epoch 4/20 Iteration 610/3560 Training loss: 2.5191 0.0525 sec/batch\n", + "Epoch 4/20 Iteration 611/3560 Training loss: 2.5185 0.0510 sec/batch\n", + "Epoch 4/20 Iteration 612/3560 Training loss: 2.5184 0.0514 sec/batch\n", + "Epoch 4/20 Iteration 613/3560 Training loss: 2.5179 0.0487 sec/batch\n", + "Epoch 4/20 Iteration 614/3560 Training loss: 2.5175 0.0530 sec/batch\n", + "Epoch 4/20 Iteration 615/3560 Training loss: 2.5171 0.0556 sec/batch\n", + "Epoch 4/20 Iteration 616/3560 Training loss: 2.5168 0.0517 sec/batch\n", + "Epoch 4/20 Iteration 617/3560 Training loss: 2.5164 0.0510 sec/batch\n", + "Epoch 4/20 Iteration 618/3560 Training loss: 2.5157 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 619/3560 Training loss: 2.5150 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 620/3560 Training loss: 2.5145 0.0478 sec/batch\n", + "Epoch 4/20 Iteration 621/3560 Training loss: 2.5140 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 622/3560 Training loss: 2.5136 0.0508 sec/batch\n", + "Epoch 4/20 Iteration 623/3560 Training loss: 2.5133 0.0500 sec/batch\n", + "Epoch 4/20 Iteration 624/3560 Training loss: 2.5130 
0.0514 sec/batch\n", + "Epoch 4/20 Iteration 625/3560 Training loss: 2.5126 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 626/3560 Training loss: 2.5123 0.0550 sec/batch\n", + "Epoch 4/20 Iteration 627/3560 Training loss: 2.5119 0.0505 sec/batch\n", + "Epoch 4/20 Iteration 628/3560 Training loss: 2.5112 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 629/3560 Training loss: 2.5106 0.0529 sec/batch\n", + "Epoch 4/20 Iteration 630/3560 Training loss: 2.5102 0.0549 sec/batch\n", + "Epoch 4/20 Iteration 631/3560 Training loss: 2.5098 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 632/3560 Training loss: 2.5093 0.0494 sec/batch\n", + "Epoch 4/20 Iteration 633/3560 Training loss: 2.5089 0.0476 sec/batch\n", + "Epoch 4/20 Iteration 634/3560 Training loss: 2.5084 0.0492 sec/batch\n", + "Epoch 4/20 Iteration 635/3560 Training loss: 2.5081 0.0474 sec/batch\n", + "Epoch 4/20 Iteration 636/3560 Training loss: 2.5078 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 637/3560 Training loss: 2.5072 0.0523 sec/batch\n", + "Epoch 4/20 Iteration 638/3560 Training loss: 2.5068 0.0531 sec/batch\n", + "Epoch 4/20 Iteration 639/3560 Training loss: 2.5065 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 640/3560 Training loss: 2.5062 0.0517 sec/batch\n", + "Epoch 4/20 Iteration 641/3560 Training loss: 2.5057 0.0482 sec/batch\n", + "Epoch 4/20 Iteration 642/3560 Training loss: 2.5055 0.0544 sec/batch\n", + "Epoch 4/20 Iteration 643/3560 Training loss: 2.5053 0.0519 sec/batch\n", + "Epoch 4/20 Iteration 644/3560 Training loss: 2.5047 0.0606 sec/batch\n", + "Epoch 4/20 Iteration 645/3560 Training loss: 2.5045 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 646/3560 Training loss: 2.5042 0.0488 sec/batch\n", + "Epoch 4/20 Iteration 647/3560 Training loss: 2.5038 0.0562 sec/batch\n", + "Epoch 4/20 Iteration 648/3560 Training loss: 2.5033 0.0477 sec/batch\n", + "Epoch 4/20 Iteration 649/3560 Training loss: 2.5030 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 650/3560 Training loss: 2.5023 0.0485 sec/batch\n", + 
"Epoch 4/20 Iteration 651/3560 Training loss: 2.5020 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 652/3560 Training loss: 2.5017 0.0568 sec/batch\n", + "Epoch 4/20 Iteration 653/3560 Training loss: 2.5015 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 654/3560 Training loss: 2.5012 0.0487 sec/batch\n", + "Epoch 4/20 Iteration 655/3560 Training loss: 2.5010 0.0503 sec/batch\n", + "Epoch 4/20 Iteration 656/3560 Training loss: 2.5008 0.0505 sec/batch\n", + "Epoch 4/20 Iteration 657/3560 Training loss: 2.5004 0.0480 sec/batch\n", + "Epoch 4/20 Iteration 658/3560 Training loss: 2.5002 0.0468 sec/batch\n", + "Epoch 4/20 Iteration 659/3560 Training loss: 2.4999 0.0480 sec/batch\n", + "Epoch 4/20 Iteration 660/3560 Training loss: 2.4994 0.0486 sec/batch\n", + "Epoch 4/20 Iteration 661/3560 Training loss: 2.4992 0.0480 sec/batch\n", + "Epoch 4/20 Iteration 662/3560 Training loss: 2.4991 0.0486 sec/batch\n", + "Epoch 4/20 Iteration 663/3560 Training loss: 2.4988 0.0541 sec/batch\n", + "Epoch 4/20 Iteration 664/3560 Training loss: 2.4986 0.0502 sec/batch\n", + "Epoch 4/20 Iteration 665/3560 Training loss: 2.4983 0.0610 sec/batch\n", + "Epoch 4/20 Iteration 666/3560 Training loss: 2.4979 0.0484 sec/batch\n", + "Epoch 4/20 Iteration 667/3560 Training loss: 2.4976 0.0492 sec/batch\n", + "Epoch 4/20 Iteration 668/3560 Training loss: 2.4974 0.0492 sec/batch\n", + "Epoch 4/20 Iteration 669/3560 Training loss: 2.4970 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 670/3560 Training loss: 2.4967 0.0523 sec/batch\n", + "Epoch 4/20 Iteration 671/3560 Training loss: 2.4964 0.0501 sec/batch\n", + "Epoch 4/20 Iteration 672/3560 Training loss: 2.4961 0.0495 sec/batch\n", + "Epoch 4/20 Iteration 673/3560 Training loss: 2.4960 0.0516 sec/batch\n", + "Epoch 4/20 Iteration 674/3560 Training loss: 2.4956 0.0548 sec/batch\n", + "Epoch 4/20 Iteration 675/3560 Training loss: 2.4954 0.0501 sec/batch\n", + "Epoch 4/20 Iteration 676/3560 Training loss: 2.4950 0.0483 sec/batch\n", + "Epoch 4/20 Iteration 
677/3560 Training loss: 2.4947 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 678/3560 Training loss: 2.4944 0.0596 sec/batch\n", + "Epoch 4/20 Iteration 679/3560 Training loss: 2.4941 0.0511 sec/batch\n", + "Epoch 4/20 Iteration 680/3560 Training loss: 2.4940 0.0543 sec/batch\n", + "Epoch 4/20 Iteration 681/3560 Training loss: 2.4937 0.0596 sec/batch\n", + "Epoch 4/20 Iteration 682/3560 Training loss: 2.4936 0.0523 sec/batch\n", + "Epoch 4/20 Iteration 683/3560 Training loss: 2.4932 0.0481 sec/batch\n", + "Epoch 4/20 Iteration 684/3560 Training loss: 2.4928 0.0495 sec/batch\n", + "Epoch 4/20 Iteration 685/3560 Training loss: 2.4928 0.0501 sec/batch\n", + "Epoch 4/20 Iteration 686/3560 Training loss: 2.4927 0.0511 sec/batch\n", + "Epoch 4/20 Iteration 687/3560 Training loss: 2.4926 0.0500 sec/batch\n", + "Epoch 4/20 Iteration 688/3560 Training loss: 2.4924 0.0517 sec/batch\n", + "Epoch 4/20 Iteration 689/3560 Training loss: 2.4921 0.0498 sec/batch\n", + "Epoch 4/20 Iteration 690/3560 Training loss: 2.4918 0.0562 sec/batch\n", + "Epoch 4/20 Iteration 691/3560 Training loss: 2.4915 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 692/3560 Training loss: 2.4912 0.0529 sec/batch\n", + "Epoch 4/20 Iteration 693/3560 Training loss: 2.4907 0.0483 sec/batch\n", + "Epoch 4/20 Iteration 694/3560 Training loss: 2.4906 0.0486 sec/batch\n", + "Epoch 4/20 Iteration 695/3560 Training loss: 2.4904 0.0532 sec/batch\n", + "Epoch 4/20 Iteration 696/3560 Training loss: 2.4899 0.0583 sec/batch\n", + "Epoch 4/20 Iteration 697/3560 Training loss: 2.4895 0.0493 sec/batch\n", + "Epoch 4/20 Iteration 698/3560 Training loss: 2.4892 0.0599 sec/batch\n", + "Epoch 4/20 Iteration 699/3560 Training loss: 2.4890 0.0487 sec/batch\n", + "Epoch 4/20 Iteration 700/3560 Training loss: 2.4888 0.0489 sec/batch\n", + "Epoch 4/20 Iteration 701/3560 Training loss: 2.4885 0.0494 sec/batch\n", + "Epoch 4/20 Iteration 702/3560 Training loss: 2.4883 0.0503 sec/batch\n", + "Epoch 4/20 Iteration 703/3560 Training loss: 
2.4880 0.0539 sec/batch\n", + "Epoch 4/20 Iteration 704/3560 Training loss: 2.4876 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 705/3560 Training loss: 2.4874 0.0501 sec/batch\n", + "Epoch 4/20 Iteration 706/3560 Training loss: 2.4872 0.0481 sec/batch\n", + "Epoch 4/20 Iteration 707/3560 Training loss: 2.4871 0.0603 sec/batch\n", + "Epoch 4/20 Iteration 708/3560 Training loss: 2.4871 0.0556 sec/batch\n", + "Epoch 4/20 Iteration 709/3560 Training loss: 2.4870 0.0486 sec/batch\n", + "Epoch 4/20 Iteration 710/3560 Training loss: 2.4868 0.0553 sec/batch\n", + "Epoch 4/20 Iteration 711/3560 Training loss: 2.4865 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 712/3560 Training loss: 2.4861 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 2.5175 0.0476 sec/batch\n", + "Epoch 5/20 Iteration 714/3560 Training loss: 2.4606 0.0511 sec/batch\n", + "Epoch 5/20 Iteration 715/3560 Training loss: 2.4456 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 716/3560 Training loss: 2.4417 0.0579 sec/batch\n", + "Epoch 5/20 Iteration 717/3560 Training loss: 2.4386 0.0545 sec/batch\n", + "Epoch 5/20 Iteration 718/3560 Training loss: 2.4359 0.0482 sec/batch\n", + "Epoch 5/20 Iteration 719/3560 Training loss: 2.4367 0.0545 sec/batch\n", + "Epoch 5/20 Iteration 720/3560 Training loss: 2.4375 0.0510 sec/batch\n", + "Epoch 5/20 Iteration 721/3560 Training loss: 2.4387 0.0472 sec/batch\n", + "Epoch 5/20 Iteration 722/3560 Training loss: 2.4380 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 723/3560 Training loss: 2.4360 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 724/3560 Training loss: 2.4362 0.0470 sec/batch\n", + "Epoch 5/20 Iteration 725/3560 Training loss: 2.4357 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 726/3560 Training loss: 2.4379 0.0495 sec/batch\n", + "Epoch 5/20 Iteration 727/3560 Training loss: 2.4380 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 728/3560 Training loss: 2.4380 0.0549 sec/batch\n", + "Epoch 5/20 Iteration 729/3560 Training loss: 2.4382 0.0489 
sec/batch\n", + "Epoch 5/20 Iteration 730/3560 Training loss: 2.4397 0.0488 sec/batch\n", + "Epoch 5/20 Iteration 731/3560 Training loss: 2.4398 0.0486 sec/batch\n", + "Epoch 5/20 Iteration 732/3560 Training loss: 2.4383 0.0512 sec/batch\n", + "Epoch 5/20 Iteration 733/3560 Training loss: 2.4373 0.0593 sec/batch\n", + "Epoch 5/20 Iteration 734/3560 Training loss: 2.4378 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 735/3560 Training loss: 2.4374 0.0479 sec/batch\n", + "Epoch 5/20 Iteration 736/3560 Training loss: 2.4370 0.0488 sec/batch\n", + "Epoch 5/20 Iteration 737/3560 Training loss: 2.4362 0.0489 sec/batch\n", + "Epoch 5/20 Iteration 738/3560 Training loss: 2.4359 0.0477 sec/batch\n", + "Epoch 5/20 Iteration 739/3560 Training loss: 2.4353 0.0550 sec/batch\n", + "Epoch 5/20 Iteration 740/3560 Training loss: 2.4349 0.0528 sec/batch\n", + "Epoch 5/20 Iteration 741/3560 Training loss: 2.4352 0.0506 sec/batch\n", + "Epoch 5/20 Iteration 742/3560 Training loss: 2.4351 0.0499 sec/batch\n", + "Epoch 5/20 Iteration 743/3560 Training loss: 2.4354 0.0501 sec/batch\n", + "Epoch 5/20 Iteration 744/3560 Training loss: 2.4348 0.0485 sec/batch\n", + "Epoch 5/20 Iteration 745/3560 Training loss: 2.4341 0.0495 sec/batch\n", + "Epoch 5/20 Iteration 746/3560 Training loss: 2.4340 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 747/3560 Training loss: 2.4337 0.0490 sec/batch\n", + "Epoch 5/20 Iteration 748/3560 Training loss: 2.4339 0.0551 sec/batch\n", + "Epoch 5/20 Iteration 749/3560 Training loss: 2.4334 0.0539 sec/batch\n", + "Epoch 5/20 Iteration 750/3560 Training loss: 2.4323 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 751/3560 Training loss: 2.4318 0.0527 sec/batch\n", + "Epoch 5/20 Iteration 752/3560 Training loss: 2.4312 0.0485 sec/batch\n", + "Epoch 5/20 Iteration 753/3560 Training loss: 2.4304 0.0534 sec/batch\n", + "Epoch 5/20 Iteration 754/3560 Training loss: 2.4298 0.0536 sec/batch\n", + "Epoch 5/20 Iteration 755/3560 Training loss: 2.4293 0.0551 sec/batch\n", + "Epoch 
5/20 Iteration 756/3560 Training loss: 2.4285 0.0509 sec/batch\n", + "Epoch 5/20 Iteration 757/3560 Training loss: 2.4276 0.0487 sec/batch\n", + "Epoch 5/20 Iteration 758/3560 Training loss: 2.4264 0.0526 sec/batch\n", + "Epoch 5/20 Iteration 759/3560 Training loss: 2.4264 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 760/3560 Training loss: 2.4261 0.0490 sec/batch\n", + "Epoch 5/20 Iteration 761/3560 Training loss: 2.4258 0.0490 sec/batch\n", + "Epoch 5/20 Iteration 762/3560 Training loss: 2.4259 0.0519 sec/batch\n", + "Epoch 5/20 Iteration 763/3560 Training loss: 2.4254 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 764/3560 Training loss: 2.4252 0.0536 sec/batch\n", + "Epoch 5/20 Iteration 765/3560 Training loss: 2.4246 0.0510 sec/batch\n", + "Epoch 5/20 Iteration 766/3560 Training loss: 2.4242 0.0484 sec/batch\n", + "Epoch 5/20 Iteration 767/3560 Training loss: 2.4237 0.0517 sec/batch\n", + "Epoch 5/20 Iteration 768/3560 Training loss: 2.4235 0.0488 sec/batch\n", + "Epoch 5/20 Iteration 769/3560 Training loss: 2.4234 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 770/3560 Training loss: 2.4229 0.0525 sec/batch\n", + "Epoch 5/20 Iteration 771/3560 Training loss: 2.4226 0.0520 sec/batch\n", + "Epoch 5/20 Iteration 772/3560 Training loss: 2.4225 0.0577 sec/batch\n", + "Epoch 5/20 Iteration 773/3560 Training loss: 2.4220 0.0528 sec/batch\n", + "Epoch 5/20 Iteration 774/3560 Training loss: 2.4219 0.0506 sec/batch\n", + "Epoch 5/20 Iteration 775/3560 Training loss: 2.4219 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 776/3560 Training loss: 2.4217 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 777/3560 Training loss: 2.4212 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 778/3560 Training loss: 2.4213 0.0500 sec/batch\n", + "Epoch 5/20 Iteration 779/3560 Training loss: 2.4209 0.0510 sec/batch\n", + "Epoch 5/20 Iteration 780/3560 Training loss: 2.4202 0.0536 sec/batch\n", + "Epoch 5/20 Iteration 781/3560 Training loss: 2.4196 0.0487 sec/batch\n", + "Epoch 5/20 Iteration 782/3560 
Training loss: 2.4194 0.0499 sec/batch\n", + "Epoch 5/20 Iteration 783/3560 Training loss: 2.4193 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 784/3560 Training loss: 2.4192 0.0485 sec/batch\n", + "Epoch 5/20 Iteration 785/3560 Training loss: 2.4189 0.0613 sec/batch\n", + "Epoch 5/20 Iteration 786/3560 Training loss: 2.4185 0.0490 sec/batch\n", + "Epoch 5/20 Iteration 787/3560 Training loss: 2.4182 0.0478 sec/batch\n", + "Epoch 5/20 Iteration 788/3560 Training loss: 2.4184 0.0533 sec/batch\n", + "Epoch 5/20 Iteration 789/3560 Training loss: 2.4181 0.0486 sec/batch\n", + "Epoch 5/20 Iteration 790/3560 Training loss: 2.4181 0.0526 sec/batch\n", + "Epoch 5/20 Iteration 791/3560 Training loss: 2.4178 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 792/3560 Training loss: 2.4174 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 793/3560 Training loss: 2.4170 0.0490 sec/batch\n", + "Epoch 5/20 Iteration 794/3560 Training loss: 2.4169 0.0554 sec/batch\n", + "Epoch 5/20 Iteration 795/3560 Training loss: 2.4165 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 796/3560 Training loss: 2.4161 0.0492 sec/batch\n", + "Epoch 5/20 Iteration 797/3560 Training loss: 2.4153 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 798/3560 Training loss: 2.4149 0.0543 sec/batch\n", + "Epoch 5/20 Iteration 799/3560 Training loss: 2.4146 0.0492 sec/batch\n", + "Epoch 5/20 Iteration 800/3560 Training loss: 2.4143 0.0548 sec/batch\n", + "Epoch 5/20 Iteration 801/3560 Training loss: 2.4140 0.0489 sec/batch\n", + "Epoch 5/20 Iteration 802/3560 Training loss: 2.4139 0.0504 sec/batch\n", + "Epoch 5/20 Iteration 803/3560 Training loss: 2.4136 0.0488 sec/batch\n", + "Epoch 5/20 Iteration 804/3560 Training loss: 2.4135 0.0509 sec/batch\n", + "Epoch 5/20 Iteration 805/3560 Training loss: 2.4132 0.0484 sec/batch\n", + "Epoch 5/20 Iteration 806/3560 Training loss: 2.4128 0.0528 sec/batch\n", + "Epoch 5/20 Iteration 807/3560 Training loss: 2.4124 0.0533 sec/batch\n", + "Epoch 5/20 Iteration 808/3560 Training loss: 2.4120 
0.0500 sec/batch\n", + "Epoch 5/20 Iteration 809/3560 Training loss: 2.4118 0.0501 sec/batch\n", + "Epoch 5/20 Iteration 810/3560 Training loss: 2.4116 0.0596 sec/batch\n", + "Epoch 5/20 Iteration 811/3560 Training loss: 2.4113 0.0511 sec/batch\n", + "Epoch 5/20 Iteration 812/3560 Training loss: 2.4110 0.0488 sec/batch\n", + "Epoch 5/20 Iteration 813/3560 Training loss: 2.4109 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 814/3560 Training loss: 2.4106 0.0488 sec/batch\n", + "Epoch 5/20 Iteration 815/3560 Training loss: 2.4102 0.0573 sec/batch\n", + "Epoch 5/20 Iteration 816/3560 Training loss: 2.4099 0.0487 sec/batch\n", + "Epoch 5/20 Iteration 817/3560 Training loss: 2.4096 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 818/3560 Training loss: 2.4094 0.0512 sec/batch\n", + "Epoch 5/20 Iteration 819/3560 Training loss: 2.4092 0.0548 sec/batch\n", + "Epoch 5/20 Iteration 820/3560 Training loss: 2.4090 0.0527 sec/batch\n", + "Epoch 5/20 Iteration 821/3560 Training loss: 2.4089 0.0485 sec/batch\n", + "Epoch 5/20 Iteration 822/3560 Training loss: 2.4085 0.0524 sec/batch\n", + "Epoch 5/20 Iteration 823/3560 Training loss: 2.4084 0.0558 sec/batch\n", + "Epoch 5/20 Iteration 824/3560 Training loss: 2.4084 0.0501 sec/batch\n", + "Epoch 5/20 Iteration 825/3560 Training loss: 2.4081 0.0477 sec/batch\n", + "Epoch 5/20 Iteration 826/3560 Training loss: 2.4077 0.0492 sec/batch\n", + "Epoch 5/20 Iteration 827/3560 Training loss: 2.4075 0.0553 sec/batch\n", + "Epoch 5/20 Iteration 828/3560 Training loss: 2.4070 0.0537 sec/batch\n", + "Epoch 5/20 Iteration 829/3560 Training loss: 2.4068 0.0517 sec/batch\n", + "Epoch 5/20 Iteration 830/3560 Training loss: 2.4067 0.0482 sec/batch\n", + "Epoch 5/20 Iteration 831/3560 Training loss: 2.4066 0.0541 sec/batch\n", + "Epoch 5/20 Iteration 832/3560 Training loss: 2.4065 0.0512 sec/batch\n", + "Epoch 5/20 Iteration 833/3560 Training loss: 2.4065 0.0481 sec/batch\n", + "Epoch 5/20 Iteration 834/3560 Training loss: 2.4063 0.0577 sec/batch\n", + 
"Epoch 5/20 Iteration 835/3560 Training loss: 2.4060 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 836/3560 Training loss: 2.4058 0.0541 sec/batch\n", + "Epoch 5/20 Iteration 837/3560 Training loss: 2.4057 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 838/3560 Training loss: 2.4053 0.0479 sec/batch\n", + "Epoch 5/20 Iteration 839/3560 Training loss: 2.4052 0.0512 sec/batch\n", + "Epoch 5/20 Iteration 840/3560 Training loss: 2.4051 0.0564 sec/batch\n", + "Epoch 5/20 Iteration 841/3560 Training loss: 2.4049 0.0499 sec/batch\n", + "Epoch 5/20 Iteration 842/3560 Training loss: 2.4048 0.0551 sec/batch\n", + "Epoch 5/20 Iteration 843/3560 Training loss: 2.4046 0.0505 sec/batch\n", + "Epoch 5/20 Iteration 844/3560 Training loss: 2.4042 0.0527 sec/batch\n", + "Epoch 5/20 Iteration 845/3560 Training loss: 2.4041 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 846/3560 Training loss: 2.4041 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 847/3560 Training loss: 2.4038 0.0518 sec/batch\n", + "Epoch 5/20 Iteration 848/3560 Training loss: 2.4036 0.0478 sec/batch\n", + "Epoch 5/20 Iteration 849/3560 Training loss: 2.4035 0.0484 sec/batch\n", + "Epoch 5/20 Iteration 850/3560 Training loss: 2.4033 0.0521 sec/batch\n", + "Epoch 5/20 Iteration 851/3560 Training loss: 2.4034 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 852/3560 Training loss: 2.4031 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 853/3560 Training loss: 2.4031 0.0577 sec/batch\n", + "Epoch 5/20 Iteration 854/3560 Training loss: 2.4028 0.0521 sec/batch\n", + "Epoch 5/20 Iteration 855/3560 Training loss: 2.4026 0.0590 sec/batch\n", + "Epoch 5/20 Iteration 856/3560 Training loss: 2.4024 0.0505 sec/batch\n", + "Epoch 5/20 Iteration 857/3560 Training loss: 2.4022 0.0486 sec/batch\n", + "Epoch 5/20 Iteration 858/3560 Training loss: 2.4022 0.0577 sec/batch\n", + "Epoch 5/20 Iteration 859/3560 Training loss: 2.4020 0.0558 sec/batch\n", + "Epoch 5/20 Iteration 860/3560 Training loss: 2.4021 0.0483 sec/batch\n", + "Epoch 5/20 Iteration 
861/3560 Training loss: 2.4018 0.0505 sec/batch\n", + "Epoch 5/20 Iteration 862/3560 Training loss: 2.4016 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 863/3560 Training loss: 2.4016 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 864/3560 Training loss: 2.4017 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 865/3560 Training loss: 2.4016 0.0543 sec/batch\n", + "Epoch 5/20 Iteration 866/3560 Training loss: 2.4015 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 867/3560 Training loss: 2.4013 0.0505 sec/batch\n", + "Epoch 5/20 Iteration 868/3560 Training loss: 2.4011 0.0481 sec/batch\n", + "Epoch 5/20 Iteration 869/3560 Training loss: 2.4008 0.0505 sec/batch\n", + "Epoch 5/20 Iteration 870/3560 Training loss: 2.4006 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 871/3560 Training loss: 2.4003 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 872/3560 Training loss: 2.4002 0.0529 sec/batch\n", + "Epoch 5/20 Iteration 873/3560 Training loss: 2.4001 0.0485 sec/batch\n", + "Epoch 5/20 Iteration 874/3560 Training loss: 2.3998 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 875/3560 Training loss: 2.3996 0.0521 sec/batch\n", + "Epoch 5/20 Iteration 876/3560 Training loss: 2.3993 0.0537 sec/batch\n", + "Epoch 5/20 Iteration 877/3560 Training loss: 2.3993 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 878/3560 Training loss: 2.3991 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 879/3560 Training loss: 2.3990 0.0482 sec/batch\n", + "Epoch 5/20 Iteration 880/3560 Training loss: 2.3989 0.0511 sec/batch\n", + "Epoch 5/20 Iteration 881/3560 Training loss: 2.3988 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 882/3560 Training loss: 2.3986 0.0510 sec/batch\n", + "Epoch 5/20 Iteration 883/3560 Training loss: 2.3984 0.0494 sec/batch\n", + "Epoch 5/20 Iteration 884/3560 Training loss: 2.3983 0.0503 sec/batch\n", + "Epoch 5/20 Iteration 885/3560 Training loss: 2.3983 0.0501 sec/batch\n", + "Epoch 5/20 Iteration 886/3560 Training loss: 2.3982 0.0514 sec/batch\n", + "Epoch 5/20 Iteration 887/3560 Training loss: 
2.3982 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 888/3560 Training loss: 2.3980 0.0506 sec/batch\n", + "Epoch 5/20 Iteration 889/3560 Training loss: 2.3978 0.0519 sec/batch\n", + "Epoch 5/20 Iteration 890/3560 Training loss: 2.3975 0.0535 sec/batch\n", + "Epoch 6/20 Iteration 891/3560 Training loss: 2.4472 0.0505 sec/batch\n", + "Epoch 6/20 Iteration 892/3560 Training loss: 2.3902 0.0528 sec/batch\n", + "Epoch 6/20 Iteration 893/3560 Training loss: 2.3741 0.0487 sec/batch\n", + "Epoch 6/20 Iteration 894/3560 Training loss: 2.3679 0.0505 sec/batch\n", + "Epoch 6/20 Iteration 895/3560 Training loss: 2.3659 0.0503 sec/batch\n", + "Epoch 6/20 Iteration 896/3560 Training loss: 2.3647 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 897/3560 Training loss: 2.3649 0.0555 sec/batch\n", + "Epoch 6/20 Iteration 898/3560 Training loss: 2.3667 0.0491 sec/batch\n", + "Epoch 6/20 Iteration 899/3560 Training loss: 2.3682 0.0508 sec/batch\n", + "Epoch 6/20 Iteration 900/3560 Training loss: 2.3680 0.0541 sec/batch\n", + "Epoch 6/20 Iteration 901/3560 Training loss: 2.3669 0.0592 sec/batch\n", + "Epoch 6/20 Iteration 902/3560 Training loss: 2.3657 0.0528 sec/batch\n", + "Epoch 6/20 Iteration 903/3560 Training loss: 2.3649 0.0500 sec/batch\n", + "Epoch 6/20 Iteration 904/3560 Training loss: 2.3678 0.0530 sec/batch\n", + "Epoch 6/20 Iteration 905/3560 Training loss: 2.3672 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 906/3560 Training loss: 2.3680 0.0616 sec/batch\n", + "Epoch 6/20 Iteration 907/3560 Training loss: 2.3683 0.0500 sec/batch\n", + "Epoch 6/20 Iteration 908/3560 Training loss: 2.3706 0.0514 sec/batch\n", + "Epoch 6/20 Iteration 909/3560 Training loss: 2.3711 0.0548 sec/batch\n", + "Epoch 6/20 Iteration 910/3560 Training loss: 2.3700 0.0565 sec/batch\n", + "Epoch 6/20 Iteration 911/3560 Training loss: 2.3695 0.0525 sec/batch\n", + "Epoch 6/20 Iteration 912/3560 Training loss: 2.3702 0.0486 sec/batch\n", + "Epoch 6/20 Iteration 913/3560 Training loss: 2.3700 0.0493 
sec/batch\n", + "Epoch 6/20 Iteration 914/3560 Training loss: 2.3694 0.0506 sec/batch\n", + "Epoch 6/20 Iteration 915/3560 Training loss: 2.3686 0.0523 sec/batch\n", + "Epoch 6/20 Iteration 916/3560 Training loss: 2.3682 0.0598 sec/batch\n", + "Epoch 6/20 Iteration 917/3560 Training loss: 2.3680 0.0525 sec/batch\n", + "Epoch 6/20 Iteration 918/3560 Training loss: 2.3677 0.0496 sec/batch\n", + "Epoch 6/20 Iteration 919/3560 Training loss: 2.3680 0.0496 sec/batch\n", + "Epoch 6/20 Iteration 920/3560 Training loss: 2.3680 0.0512 sec/batch\n", + "Epoch 6/20 Iteration 921/3560 Training loss: 2.3684 0.0494 sec/batch\n", + "Epoch 6/20 Iteration 922/3560 Training loss: 2.3680 0.0513 sec/batch\n", + "Epoch 6/20 Iteration 923/3560 Training loss: 2.3672 0.0583 sec/batch\n", + "Epoch 6/20 Iteration 924/3560 Training loss: 2.3674 0.0614 sec/batch\n", + "Epoch 6/20 Iteration 925/3560 Training loss: 2.3671 0.0538 sec/batch\n", + "Epoch 6/20 Iteration 926/3560 Training loss: 2.3673 0.0523 sec/batch\n", + "Epoch 6/20 Iteration 927/3560 Training loss: 2.3669 0.0503 sec/batch\n", + "Epoch 6/20 Iteration 928/3560 Training loss: 2.3658 0.0479 sec/batch\n", + "Epoch 6/20 Iteration 929/3560 Training loss: 2.3651 0.0483 sec/batch\n", + "Epoch 6/20 Iteration 930/3560 Training loss: 2.3645 0.0509 sec/batch\n", + "Epoch 6/20 Iteration 931/3560 Training loss: 2.3638 0.0521 sec/batch\n", + "Epoch 6/20 Iteration 932/3560 Training loss: 2.3633 0.0503 sec/batch\n", + "Epoch 6/20 Iteration 933/3560 Training loss: 2.3627 0.0511 sec/batch\n", + "Epoch 6/20 Iteration 934/3560 Training loss: 2.3619 0.0499 sec/batch\n", + "Epoch 6/20 Iteration 935/3560 Training loss: 2.3612 0.0503 sec/batch\n", + "Epoch 6/20 Iteration 936/3560 Training loss: 2.3599 0.0539 sec/batch\n", + "Epoch 6/20 Iteration 937/3560 Training loss: 2.3601 0.0513 sec/batch\n", + "Epoch 6/20 Iteration 938/3560 Training loss: 2.3596 0.0505 sec/batch\n", + "Epoch 6/20 Iteration 939/3560 Training loss: 2.3594 0.0534 sec/batch\n", + "Epoch 
6/20 Iteration 940/3560 Training loss: 2.3596 0.0493 sec/batch\n", +
    [… several hundred near-identical log records condensed: Epochs 6–9/20, iterations 940–1458 of 3560; training loss declines steadily from 2.3596 to 2.2366 at roughly 0.05 sec/batch …]
    "Epoch 9/20 Iteration 1458/3560 Training loss: 2.2366 0.0490 sec/batch\n", + 
"Epoch 9/20 Iteration 1459/3560 Training loss: 2.2364 0.0531 sec/batch\n", + "Epoch 9/20 Iteration 1460/3560 Training loss: 2.2365 0.0568 sec/batch\n", + "Epoch 9/20 Iteration 1461/3560 Training loss: 2.2363 0.0511 sec/batch\n", + "Epoch 9/20 Iteration 1462/3560 Training loss: 2.2351 0.0545 sec/batch\n", + "Epoch 9/20 Iteration 1463/3560 Training loss: 2.2345 0.0515 sec/batch\n", + "Epoch 9/20 Iteration 1464/3560 Training loss: 2.2339 0.0547 sec/batch\n", + "Epoch 9/20 Iteration 1465/3560 Training loss: 2.2337 0.0491 sec/batch\n", + "Epoch 9/20 Iteration 1466/3560 Training loss: 2.2333 0.0580 sec/batch\n", + "Epoch 9/20 Iteration 1467/3560 Training loss: 2.2328 0.0512 sec/batch\n", + "Epoch 9/20 Iteration 1468/3560 Training loss: 2.2322 0.0504 sec/batch\n", + "Epoch 9/20 Iteration 1469/3560 Training loss: 2.2319 0.0490 sec/batch\n", + "Epoch 9/20 Iteration 1470/3560 Training loss: 2.2308 0.0552 sec/batch\n", + "Epoch 9/20 Iteration 1471/3560 Training loss: 2.2307 0.0493 sec/batch\n", + "Epoch 9/20 Iteration 1472/3560 Training loss: 2.2304 0.0515 sec/batch\n", + "Epoch 9/20 Iteration 1473/3560 Training loss: 2.2302 0.0502 sec/batch\n", + "Epoch 9/20 Iteration 1474/3560 Training loss: 2.2306 0.0503 sec/batch\n", + "Epoch 9/20 Iteration 1475/3560 Training loss: 2.2301 0.0497 sec/batch\n", + "Epoch 9/20 Iteration 1476/3560 Training loss: 2.2306 0.0517 sec/batch\n", + "Epoch 9/20 Iteration 1477/3560 Training loss: 2.2304 0.0539 sec/batch\n", + "Epoch 9/20 Iteration 1478/3560 Training loss: 2.2303 0.0525 sec/batch\n", + "Epoch 9/20 Iteration 1479/3560 Training loss: 2.2299 0.0630 sec/batch\n", + "Epoch 9/20 Iteration 1480/3560 Training loss: 2.2300 0.0492 sec/batch\n", + "Epoch 9/20 Iteration 1481/3560 Training loss: 2.2298 0.0535 sec/batch\n", + "Epoch 9/20 Iteration 1482/3560 Training loss: 2.2296 0.0581 sec/batch\n", + "Epoch 9/20 Iteration 1483/3560 Training loss: 2.2293 0.0521 sec/batch\n", + "Epoch 9/20 Iteration 1484/3560 Training loss: 2.2296 0.0548 sec/batch\n", 
+ "Epoch 9/20 Iteration 1485/3560 Training loss: 2.2294 0.0514 sec/batch\n", + "Epoch 9/20 Iteration 1486/3560 Training loss: 2.2298 0.0505 sec/batch\n", + "Epoch 9/20 Iteration 1487/3560 Training loss: 2.2300 0.0496 sec/batch\n", + "Epoch 9/20 Iteration 1488/3560 Training loss: 2.2300 0.0561 sec/batch\n", + "Epoch 9/20 Iteration 1489/3560 Training loss: 2.2297 0.0489 sec/batch\n", + "Epoch 9/20 Iteration 1490/3560 Training loss: 2.2300 0.0499 sec/batch\n", + "Epoch 9/20 Iteration 1491/3560 Training loss: 2.2300 0.0501 sec/batch\n", + "Epoch 9/20 Iteration 1492/3560 Training loss: 2.2294 0.0506 sec/batch\n", + "Epoch 9/20 Iteration 1493/3560 Training loss: 2.2292 0.0552 sec/batch\n", + "Epoch 9/20 Iteration 1494/3560 Training loss: 2.2292 0.0487 sec/batch\n", + "Epoch 9/20 Iteration 1495/3560 Training loss: 2.2294 0.0562 sec/batch\n", + "Epoch 9/20 Iteration 1496/3560 Training loss: 2.2296 0.0508 sec/batch\n", + "Epoch 9/20 Iteration 1497/3560 Training loss: 2.2297 0.0504 sec/batch\n", + "Epoch 9/20 Iteration 1498/3560 Training loss: 2.2294 0.0508 sec/batch\n", + "Epoch 9/20 Iteration 1499/3560 Training loss: 2.2293 0.0508 sec/batch\n", + "Epoch 9/20 Iteration 1500/3560 Training loss: 2.2297 0.0538 sec/batch\n", + "Epoch 9/20 Iteration 1501/3560 Training loss: 2.2295 0.0545 sec/batch\n", + "Epoch 9/20 Iteration 1502/3560 Training loss: 2.2296 0.0514 sec/batch\n", + "Epoch 9/20 Iteration 1503/3560 Training loss: 2.2292 0.0499 sec/batch\n", + "Epoch 9/20 Iteration 1504/3560 Training loss: 2.2289 0.0505 sec/batch\n", + "Epoch 9/20 Iteration 1505/3560 Training loss: 2.2284 0.0527 sec/batch\n", + "Epoch 9/20 Iteration 1506/3560 Training loss: 2.2286 0.0527 sec/batch\n", + "Epoch 9/20 Iteration 1507/3560 Training loss: 2.2282 0.0668 sec/batch\n", + "Epoch 9/20 Iteration 1508/3560 Training loss: 2.2280 0.0534 sec/batch\n", + "Epoch 9/20 Iteration 1509/3560 Training loss: 2.2275 0.0617 sec/batch\n", + "Epoch 9/20 Iteration 1510/3560 Training loss: 2.2272 0.0499 
sec/batch\n", + "Epoch 9/20 Iteration 1511/3560 Training loss: 2.2271 0.0514 sec/batch\n", + "Epoch 9/20 Iteration 1512/3560 Training loss: 2.2268 0.0571 sec/batch\n", + "Epoch 9/20 Iteration 1513/3560 Training loss: 2.2266 0.0598 sec/batch\n", + "Epoch 9/20 Iteration 1514/3560 Training loss: 2.2267 0.0534 sec/batch\n", + "Epoch 9/20 Iteration 1515/3560 Training loss: 2.2265 0.0494 sec/batch\n", + "Epoch 9/20 Iteration 1516/3560 Training loss: 2.2265 0.0538 sec/batch\n", + "Epoch 9/20 Iteration 1517/3560 Training loss: 2.2262 0.0519 sec/batch\n", + "Epoch 9/20 Iteration 1518/3560 Training loss: 2.2258 0.0496 sec/batch\n", + "Epoch 9/20 Iteration 1519/3560 Training loss: 2.2255 0.0534 sec/batch\n", + "Epoch 9/20 Iteration 1520/3560 Training loss: 2.2253 0.0579 sec/batch\n", + "Epoch 9/20 Iteration 1521/3560 Training loss: 2.2252 0.0498 sec/batch\n", + "Epoch 9/20 Iteration 1522/3560 Training loss: 2.2251 0.0561 sec/batch\n", + "Epoch 9/20 Iteration 1523/3560 Training loss: 2.2249 0.0493 sec/batch\n", + "Epoch 9/20 Iteration 1524/3560 Training loss: 2.2246 0.0516 sec/batch\n", + "Epoch 9/20 Iteration 1525/3560 Training loss: 2.2247 0.0511 sec/batch\n", + "Epoch 9/20 Iteration 1526/3560 Training loss: 2.2245 0.0512 sec/batch\n", + "Epoch 9/20 Iteration 1527/3560 Training loss: 2.2242 0.0497 sec/batch\n", + "Epoch 9/20 Iteration 1528/3560 Training loss: 2.2240 0.0499 sec/batch\n", + "Epoch 9/20 Iteration 1529/3560 Training loss: 2.2238 0.0516 sec/batch\n", + "Epoch 9/20 Iteration 1530/3560 Training loss: 2.2237 0.0497 sec/batch\n", + "Epoch 9/20 Iteration 1531/3560 Training loss: 2.2235 0.0495 sec/batch\n", + "Epoch 9/20 Iteration 1532/3560 Training loss: 2.2236 0.0594 sec/batch\n", + "Epoch 9/20 Iteration 1533/3560 Training loss: 2.2236 0.0541 sec/batch\n", + "Epoch 9/20 Iteration 1534/3560 Training loss: 2.2234 0.0499 sec/batch\n", + "Epoch 9/20 Iteration 1535/3560 Training loss: 2.2233 0.0545 sec/batch\n", + "Epoch 9/20 Iteration 1536/3560 Training loss: 2.2233 
0.0522 sec/batch\n", + "Epoch 9/20 Iteration 1537/3560 Training loss: 2.2231 0.0579 sec/batch\n", + "Epoch 9/20 Iteration 1538/3560 Training loss: 2.2229 0.0518 sec/batch\n", + "Epoch 9/20 Iteration 1539/3560 Training loss: 2.2228 0.0561 sec/batch\n", + "Epoch 9/20 Iteration 1540/3560 Training loss: 2.2223 0.0529 sec/batch\n", + "Epoch 9/20 Iteration 1541/3560 Training loss: 2.2222 0.0546 sec/batch\n", + "Epoch 9/20 Iteration 1542/3560 Training loss: 2.2222 0.0503 sec/batch\n", + "Epoch 9/20 Iteration 1543/3560 Training loss: 2.2222 0.0512 sec/batch\n", + "Epoch 9/20 Iteration 1544/3560 Training loss: 2.2221 0.0545 sec/batch\n", + "Epoch 9/20 Iteration 1545/3560 Training loss: 2.2222 0.0525 sec/batch\n", + "Epoch 9/20 Iteration 1546/3560 Training loss: 2.2220 0.0520 sec/batch\n", + "Epoch 9/20 Iteration 1547/3560 Training loss: 2.2219 0.0517 sec/batch\n", + "Epoch 9/20 Iteration 1548/3560 Training loss: 2.2219 0.0535 sec/batch\n", + "Epoch 9/20 Iteration 1549/3560 Training loss: 2.2218 0.0495 sec/batch\n", + "Epoch 9/20 Iteration 1550/3560 Training loss: 2.2215 0.0499 sec/batch\n", + "Epoch 9/20 Iteration 1551/3560 Training loss: 2.2215 0.0519 sec/batch\n", + "Epoch 9/20 Iteration 1552/3560 Training loss: 2.2216 0.0609 sec/batch\n", + "Epoch 9/20 Iteration 1553/3560 Training loss: 2.2215 0.0506 sec/batch\n", + "Epoch 9/20 Iteration 1554/3560 Training loss: 2.2215 0.0570 sec/batch\n", + "Epoch 9/20 Iteration 1555/3560 Training loss: 2.2213 0.0523 sec/batch\n", + "Epoch 9/20 Iteration 1556/3560 Training loss: 2.2210 0.0532 sec/batch\n", + "Epoch 9/20 Iteration 1557/3560 Training loss: 2.2210 0.0517 sec/batch\n", + "Epoch 9/20 Iteration 1558/3560 Training loss: 2.2211 0.0500 sec/batch\n", + "Epoch 9/20 Iteration 1559/3560 Training loss: 2.2211 0.0496 sec/batch\n", + "Epoch 9/20 Iteration 1560/3560 Training loss: 2.2211 0.0503 sec/batch\n", + "Epoch 9/20 Iteration 1561/3560 Training loss: 2.2211 0.0528 sec/batch\n", + "Epoch 9/20 Iteration 1562/3560 Training loss: 
2.2211 0.0515 sec/batch\n", + "Epoch 9/20 Iteration 1563/3560 Training loss: 2.2213 0.0521 sec/batch\n", + "Epoch 9/20 Iteration 1564/3560 Training loss: 2.2212 0.0560 sec/batch\n", + "Epoch 9/20 Iteration 1565/3560 Training loss: 2.2214 0.0524 sec/batch\n", + "Epoch 9/20 Iteration 1566/3560 Training loss: 2.2213 0.0518 sec/batch\n", + "Epoch 9/20 Iteration 1567/3560 Training loss: 2.2212 0.0574 sec/batch\n", + "Epoch 9/20 Iteration 1568/3560 Training loss: 2.2211 0.0542 sec/batch\n", + "Epoch 9/20 Iteration 1569/3560 Training loss: 2.2211 0.0497 sec/batch\n", + "Epoch 9/20 Iteration 1570/3560 Training loss: 2.2212 0.0515 sec/batch\n", + "Epoch 9/20 Iteration 1571/3560 Training loss: 2.2212 0.0504 sec/batch\n", + "Epoch 9/20 Iteration 1572/3560 Training loss: 2.2213 0.0503 sec/batch\n", + "Epoch 9/20 Iteration 1573/3560 Training loss: 2.2212 0.0544 sec/batch\n", + "Epoch 9/20 Iteration 1574/3560 Training loss: 2.2210 0.0510 sec/batch\n", + "Epoch 9/20 Iteration 1575/3560 Training loss: 2.2210 0.0489 sec/batch\n", + "Epoch 9/20 Iteration 1576/3560 Training loss: 2.2212 0.0501 sec/batch\n", + "Epoch 9/20 Iteration 1577/3560 Training loss: 2.2212 0.0551 sec/batch\n", + "Epoch 9/20 Iteration 1578/3560 Training loss: 2.2213 0.0536 sec/batch\n", + "Epoch 9/20 Iteration 1579/3560 Training loss: 2.2212 0.0494 sec/batch\n", + "Epoch 9/20 Iteration 1580/3560 Training loss: 2.2212 0.0529 sec/batch\n", + "Epoch 9/20 Iteration 1581/3560 Training loss: 2.2211 0.0548 sec/batch\n", + "Epoch 9/20 Iteration 1582/3560 Training loss: 2.2209 0.0527 sec/batch\n", + "Epoch 9/20 Iteration 1583/3560 Training loss: 2.2208 0.0512 sec/batch\n", + "Epoch 9/20 Iteration 1584/3560 Training loss: 2.2208 0.0518 sec/batch\n", + "Epoch 9/20 Iteration 1585/3560 Training loss: 2.2208 0.0518 sec/batch\n", + "Epoch 9/20 Iteration 1586/3560 Training loss: 2.2207 0.0567 sec/batch\n", + "Epoch 9/20 Iteration 1587/3560 Training loss: 2.2206 0.0504 sec/batch\n", + "Epoch 9/20 Iteration 1588/3560 Training 
loss: 2.2206 0.0550 sec/batch\n", + "Epoch 9/20 Iteration 1589/3560 Training loss: 2.2206 0.0610 sec/batch\n", + "Epoch 9/20 Iteration 1590/3560 Training loss: 2.2206 0.0504 sec/batch\n", + "Epoch 9/20 Iteration 1591/3560 Training loss: 2.2205 0.0514 sec/batch\n", + "Epoch 9/20 Iteration 1592/3560 Training loss: 2.2206 0.0587 sec/batch\n", + "Epoch 9/20 Iteration 1593/3560 Training loss: 2.2206 0.0527 sec/batch\n", + "Epoch 9/20 Iteration 1594/3560 Training loss: 2.2205 0.0534 sec/batch\n", + "Epoch 9/20 Iteration 1595/3560 Training loss: 2.2204 0.0498 sec/batch\n", + "Epoch 9/20 Iteration 1596/3560 Training loss: 2.2203 0.0561 sec/batch\n", + "Epoch 9/20 Iteration 1597/3560 Training loss: 2.2203 0.0559 sec/batch\n", + "Epoch 9/20 Iteration 1598/3560 Training loss: 2.2204 0.0544 sec/batch\n", + "Epoch 9/20 Iteration 1599/3560 Training loss: 2.2205 0.0547 sec/batch\n", + "Epoch 9/20 Iteration 1600/3560 Training loss: 2.2205 0.0535 sec/batch\n", + "Epoch 9/20 Iteration 1601/3560 Training loss: 2.2203 0.0517 sec/batch\n", + "Epoch 9/20 Iteration 1602/3560 Training loss: 2.2203 0.0512 sec/batch\n", + "Epoch 10/20 Iteration 1603/3560 Training loss: 2.2872 0.0511 sec/batch\n", + "Epoch 10/20 Iteration 1604/3560 Training loss: 2.2339 0.0540 sec/batch\n", + "Epoch 10/20 Iteration 1605/3560 Training loss: 2.2195 0.0547 sec/batch\n", + "Epoch 10/20 Iteration 1606/3560 Training loss: 2.2130 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1607/3560 Training loss: 2.2099 0.0537 sec/batch\n", + "Epoch 10/20 Iteration 1608/3560 Training loss: 2.2052 0.0516 sec/batch\n", + "Epoch 10/20 Iteration 1609/3560 Training loss: 2.2062 0.0496 sec/batch\n", + "Epoch 10/20 Iteration 1610/3560 Training loss: 2.2072 0.0503 sec/batch\n", + "Epoch 10/20 Iteration 1611/3560 Training loss: 2.2088 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1612/3560 Training loss: 2.2092 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1613/3560 Training loss: 2.2074 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 
1614/3560 Training loss: 2.2065 0.0503 sec/batch\n", + "Epoch 10/20 Iteration 1615/3560 Training loss: 2.2060 0.0502 sec/batch\n", + "Epoch 10/20 Iteration 1616/3560 Training loss: 2.2084 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1617/3560 Training loss: 2.2078 0.0528 sec/batch\n", + "Epoch 10/20 Iteration 1618/3560 Training loss: 2.2072 0.0544 sec/batch\n", + "Epoch 10/20 Iteration 1619/3560 Training loss: 2.2074 0.0496 sec/batch\n", + "Epoch 10/20 Iteration 1620/3560 Training loss: 2.2095 0.0528 sec/batch\n", + "Epoch 10/20 Iteration 1621/3560 Training loss: 2.2101 0.0513 sec/batch\n", + "Epoch 10/20 Iteration 1622/3560 Training loss: 2.2094 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1623/3560 Training loss: 2.2087 0.0498 sec/batch\n", + "Epoch 10/20 Iteration 1624/3560 Training loss: 2.2095 0.0605 sec/batch\n", + "Epoch 10/20 Iteration 1625/3560 Training loss: 2.2093 0.0502 sec/batch\n", + "Epoch 10/20 Iteration 1626/3560 Training loss: 2.2084 0.0502 sec/batch\n", + "Epoch 10/20 Iteration 1627/3560 Training loss: 2.2080 0.0537 sec/batch\n", + "Epoch 10/20 Iteration 1628/3560 Training loss: 2.2075 0.0523 sec/batch\n", + "Epoch 10/20 Iteration 1629/3560 Training loss: 2.2072 0.0541 sec/batch\n", + "Epoch 10/20 Iteration 1630/3560 Training loss: 2.2075 0.0523 sec/batch\n", + "Epoch 10/20 Iteration 1631/3560 Training loss: 2.2083 0.0587 sec/batch\n", + "Epoch 10/20 Iteration 1632/3560 Training loss: 2.2086 0.0632 sec/batch\n", + "Epoch 10/20 Iteration 1633/3560 Training loss: 2.2086 0.0505 sec/batch\n", + "Epoch 10/20 Iteration 1634/3560 Training loss: 2.2081 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1635/3560 Training loss: 2.2077 0.0504 sec/batch\n", + "Epoch 10/20 Iteration 1636/3560 Training loss: 2.2086 0.0523 sec/batch\n", + "Epoch 10/20 Iteration 1637/3560 Training loss: 2.2084 0.0570 sec/batch\n", + "Epoch 10/20 Iteration 1638/3560 Training loss: 2.2085 0.0568 sec/batch\n", + "Epoch 10/20 Iteration 1639/3560 Training loss: 2.2082 0.0522 
sec/batch\n", + "Epoch 10/20 Iteration 1640/3560 Training loss: 2.2072 0.0565 sec/batch\n", + "Epoch 10/20 Iteration 1641/3560 Training loss: 2.2066 0.0559 sec/batch\n", + "Epoch 10/20 Iteration 1642/3560 Training loss: 2.2061 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1643/3560 Training loss: 2.2058 0.0515 sec/batch\n", + "Epoch 10/20 Iteration 1644/3560 Training loss: 2.2056 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1645/3560 Training loss: 2.2052 0.0533 sec/batch\n", + "Epoch 10/20 Iteration 1646/3560 Training loss: 2.2047 0.0542 sec/batch\n", + "Epoch 10/20 Iteration 1647/3560 Training loss: 2.2047 0.0516 sec/batch\n", + "Epoch 10/20 Iteration 1648/3560 Training loss: 2.2034 0.0562 sec/batch\n", + "Epoch 10/20 Iteration 1649/3560 Training loss: 2.2038 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1650/3560 Training loss: 2.2033 0.0494 sec/batch\n", + "Epoch 10/20 Iteration 1651/3560 Training loss: 2.2031 0.0556 sec/batch\n", + "Epoch 10/20 Iteration 1652/3560 Training loss: 2.2037 0.0508 sec/batch\n", + "Epoch 10/20 Iteration 1653/3560 Training loss: 2.2032 0.0525 sec/batch\n", + "Epoch 10/20 Iteration 1654/3560 Training loss: 2.2036 0.0538 sec/batch\n", + "Epoch 10/20 Iteration 1655/3560 Training loss: 2.2032 0.0530 sec/batch\n", + "Epoch 10/20 Iteration 1656/3560 Training loss: 2.2029 0.0495 sec/batch\n", + "Epoch 10/20 Iteration 1657/3560 Training loss: 2.2027 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1658/3560 Training loss: 2.2029 0.0532 sec/batch\n", + "Epoch 10/20 Iteration 1659/3560 Training loss: 2.2029 0.0537 sec/batch\n", + "Epoch 10/20 Iteration 1660/3560 Training loss: 2.2025 0.0530 sec/batch\n", + "Epoch 10/20 Iteration 1661/3560 Training loss: 2.2021 0.0545 sec/batch\n", + "Epoch 10/20 Iteration 1662/3560 Training loss: 2.2022 0.0540 sec/batch\n", + "Epoch 10/20 Iteration 1663/3560 Training loss: 2.2020 0.0508 sec/batch\n", + "Epoch 10/20 Iteration 1664/3560 Training loss: 2.2023 0.0523 sec/batch\n", + "Epoch 10/20 Iteration 1665/3560 
Training loss: 2.2025 0.0529 sec/batch\n", + "Epoch 10/20 Iteration 1666/3560 Training loss: 2.2025 0.0528 sec/batch\n", + "Epoch 10/20 Iteration 1667/3560 Training loss: 2.2023 0.0570 sec/batch\n", + "Epoch 10/20 Iteration 1668/3560 Training loss: 2.2025 0.0595 sec/batch\n", + "Epoch 10/20 Iteration 1669/3560 Training loss: 2.2025 0.0518 sec/batch\n", + "Epoch 10/20 Iteration 1670/3560 Training loss: 2.2019 0.0538 sec/batch\n", + "Epoch 10/20 Iteration 1671/3560 Training loss: 2.2016 0.0551 sec/batch\n", + "Epoch 10/20 Iteration 1672/3560 Training loss: 2.2015 0.0544 sec/batch\n", + "Epoch 10/20 Iteration 1673/3560 Training loss: 2.2017 0.0505 sec/batch\n", + "Epoch 10/20 Iteration 1674/3560 Training loss: 2.2018 0.0500 sec/batch\n", + "Epoch 10/20 Iteration 1675/3560 Training loss: 2.2019 0.0534 sec/batch\n", + "Epoch 10/20 Iteration 1676/3560 Training loss: 2.2017 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1677/3560 Training loss: 2.2015 0.0500 sec/batch\n", + "Epoch 10/20 Iteration 1678/3560 Training loss: 2.2020 0.0500 sec/batch\n", + "Epoch 10/20 Iteration 1679/3560 Training loss: 2.2018 0.0498 sec/batch\n", + "Epoch 10/20 Iteration 1680/3560 Training loss: 2.2020 0.0530 sec/batch\n", + "Epoch 10/20 Iteration 1681/3560 Training loss: 2.2017 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1682/3560 Training loss: 2.2015 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1683/3560 Training loss: 2.2011 0.0508 sec/batch\n", + "Epoch 10/20 Iteration 1684/3560 Training loss: 2.2014 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1685/3560 Training loss: 2.2010 0.0500 sec/batch\n", + "Epoch 10/20 Iteration 1686/3560 Training loss: 2.2007 0.0494 sec/batch\n", + "Epoch 10/20 Iteration 1687/3560 Training loss: 2.2001 0.0497 sec/batch\n", + "Epoch 10/20 Iteration 1688/3560 Training loss: 2.1999 0.0576 sec/batch\n", + "Epoch 10/20 Iteration 1689/3560 Training loss: 2.1999 0.0518 sec/batch\n", + "Epoch 10/20 Iteration 1690/3560 Training loss: 2.1995 0.0511 sec/batch\n", + 
"Epoch 10/20 Iteration 1691/3560 Training loss: 2.1992 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1692/3560 Training loss: 2.1992 0.0514 sec/batch\n", + "Epoch 10/20 Iteration 1693/3560 Training loss: 2.1991 0.0595 sec/batch\n", + "Epoch 10/20 Iteration 1694/3560 Training loss: 2.1991 0.0518 sec/batch\n", + "Epoch 10/20 Iteration 1695/3560 Training loss: 2.1987 0.0531 sec/batch\n", + "Epoch 10/20 Iteration 1696/3560 Training loss: 2.1984 0.0573 sec/batch\n", + "Epoch 10/20 Iteration 1697/3560 Training loss: 2.1982 0.0493 sec/batch\n", + "Epoch 10/20 Iteration 1698/3560 Training loss: 2.1981 0.0578 sec/batch\n", + "Epoch 10/20 Iteration 1699/3560 Training loss: 2.1979 0.0505 sec/batch\n", + "Epoch 10/20 Iteration 1700/3560 Training loss: 2.1977 0.0502 sec/batch\n", + "Epoch 10/20 Iteration 1701/3560 Training loss: 2.1974 0.0504 sec/batch\n", + "Epoch 10/20 Iteration 1702/3560 Training loss: 2.1971 0.0504 sec/batch\n", + "Epoch 10/20 Iteration 1703/3560 Training loss: 2.1972 0.0557 sec/batch\n", + "Epoch 10/20 Iteration 1704/3560 Training loss: 2.1971 0.0521 sec/batch\n", + "Epoch 10/20 Iteration 1705/3560 Training loss: 2.1968 0.0539 sec/batch\n", + "Epoch 10/20 Iteration 1706/3560 Training loss: 2.1966 0.0501 sec/batch\n", + "Epoch 10/20 Iteration 1707/3560 Training loss: 2.1964 0.0520 sec/batch\n", + "Epoch 10/20 Iteration 1708/3560 Training loss: 2.1963 0.0541 sec/batch\n", + "Epoch 10/20 Iteration 1709/3560 Training loss: 2.1962 0.0509 sec/batch\n", + "Epoch 10/20 Iteration 1710/3560 Training loss: 2.1961 0.0560 sec/batch\n", + "Epoch 10/20 Iteration 1711/3560 Training loss: 2.1961 0.0523 sec/batch\n", + "Epoch 10/20 Iteration 1712/3560 Training loss: 2.1960 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1713/3560 Training loss: 2.1959 0.0545 sec/batch\n", + "Epoch 10/20 Iteration 1714/3560 Training loss: 2.1958 0.0500 sec/batch\n", + "Epoch 10/20 Iteration 1715/3560 Training loss: 2.1956 0.0540 sec/batch\n", + "Epoch 10/20 Iteration 1716/3560 Training loss: 
2.1954 0.0585 sec/batch\n", + "Epoch 10/20 Iteration 1717/3560 Training loss: 2.1953 0.0499 sec/batch\n", + "Epoch 10/20 Iteration 1718/3560 Training loss: 2.1949 0.0504 sec/batch\n", + "Epoch 10/20 Iteration 1719/3560 Training loss: 2.1949 0.0554 sec/batch\n", + "Epoch 10/20 Iteration 1720/3560 Training loss: 2.1948 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1721/3560 Training loss: 2.1950 0.0517 sec/batch\n", + "Epoch 10/20 Iteration 1722/3560 Training loss: 2.1949 0.0509 sec/batch\n", + "Epoch 10/20 Iteration 1723/3560 Training loss: 2.1949 0.0525 sec/batch\n", + "Epoch 10/20 Iteration 1724/3560 Training loss: 2.1947 0.0565 sec/batch\n", + "Epoch 10/20 Iteration 1725/3560 Training loss: 2.1946 0.0611 sec/batch\n", + "Epoch 10/20 Iteration 1726/3560 Training loss: 2.1946 0.0523 sec/batch\n", + "Epoch 10/20 Iteration 1727/3560 Training loss: 2.1946 0.0517 sec/batch\n", + "Epoch 10/20 Iteration 1728/3560 Training loss: 2.1943 0.0552 sec/batch\n", + "Epoch 10/20 Iteration 1729/3560 Training loss: 2.1944 0.0495 sec/batch\n", + "Epoch 10/20 Iteration 1730/3560 Training loss: 2.1945 0.0525 sec/batch\n", + "Epoch 10/20 Iteration 1731/3560 Training loss: 2.1944 0.0525 sec/batch\n", + "Epoch 10/20 Iteration 1732/3560 Training loss: 2.1943 0.0568 sec/batch\n", + "Epoch 10/20 Iteration 1733/3560 Training loss: 2.1941 0.0564 sec/batch\n", + "Epoch 10/20 Iteration 1734/3560 Training loss: 2.1938 0.0554 sec/batch\n", + "Epoch 10/20 Iteration 1735/3560 Training loss: 2.1938 0.0532 sec/batch\n", + "Epoch 10/20 Iteration 1736/3560 Training loss: 2.1939 0.0559 sec/batch\n", + "Epoch 10/20 Iteration 1737/3560 Training loss: 2.1938 0.0517 sec/batch\n", + "Epoch 10/20 Iteration 1738/3560 Training loss: 2.1938 0.0570 sec/batch\n", + "Epoch 10/20 Iteration 1739/3560 Training loss: 2.1938 0.0502 sec/batch\n", + "Epoch 10/20 Iteration 1740/3560 Training loss: 2.1939 0.0555 sec/batch\n", + "Epoch 10/20 Iteration 1741/3560 Training loss: 2.1941 0.0548 sec/batch\n", + "Epoch 10/20 
Iteration 1742/3560 Training loss: 2.1939 0.0508 sec/batch\n", + "Epoch 10/20 Iteration 1743/3560 Training loss: 2.1940 0.0563 sec/batch\n", + "Epoch 10/20 Iteration 1744/3560 Training loss: 2.1939 0.0650 sec/batch\n", + "Epoch 10/20 Iteration 1745/3560 Training loss: 2.1938 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1746/3560 Training loss: 2.1938 0.0515 sec/batch\n", + "Epoch 10/20 Iteration 1747/3560 Training loss: 2.1937 0.0511 sec/batch\n", + "Epoch 10/20 Iteration 1748/3560 Training loss: 2.1938 0.0561 sec/batch\n", + "Epoch 10/20 Iteration 1749/3560 Training loss: 2.1937 0.0557 sec/batch\n", + "Epoch 10/20 Iteration 1750/3560 Training loss: 2.1939 0.0498 sec/batch\n", + "Epoch 10/20 Iteration 1751/3560 Training loss: 2.1938 0.0519 sec/batch\n", + "Epoch 10/20 Iteration 1752/3560 Training loss: 2.1937 0.0559 sec/batch\n", + "Epoch 10/20 Iteration 1753/3560 Training loss: 2.1936 0.0516 sec/batch\n", + "Epoch 10/20 Iteration 1754/3560 Training loss: 2.1937 0.0512 sec/batch\n", + "Epoch 10/20 Iteration 1755/3560 Training loss: 2.1937 0.0498 sec/batch\n", + "Epoch 10/20 Iteration 1756/3560 Training loss: 2.1938 0.0514 sec/batch\n", + "Epoch 10/20 Iteration 1757/3560 Training loss: 2.1937 0.0515 sec/batch\n", + "Epoch 10/20 Iteration 1758/3560 Training loss: 2.1936 0.0551 sec/batch\n", + "Epoch 10/20 Iteration 1759/3560 Training loss: 2.1935 0.0511 sec/batch\n", + "Epoch 10/20 Iteration 1760/3560 Training loss: 2.1934 0.0582 sec/batch\n", + "Epoch 10/20 Iteration 1761/3560 Training loss: 2.1933 0.0508 sec/batch\n", + "Epoch 10/20 Iteration 1762/3560 Training loss: 2.1934 0.0530 sec/batch\n", + "Epoch 10/20 Iteration 1763/3560 Training loss: 2.1935 0.0521 sec/batch\n", + "Epoch 10/20 Iteration 1764/3560 Training loss: 2.1934 0.0612 sec/batch\n", + "Epoch 10/20 Iteration 1765/3560 Training loss: 2.1933 0.0502 sec/batch\n", + "Epoch 10/20 Iteration 1766/3560 Training loss: 2.1934 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1767/3560 Training loss: 2.1934 0.0510 
sec/batch\n", + "Epoch 10/20 Iteration 1768/3560 Training loss: 2.1933 0.0498 sec/batch\n", + "Epoch 10/20 Iteration 1769/3560 Training loss: 2.1933 0.0527 sec/batch\n", + "Epoch 10/20 Iteration 1770/3560 Training loss: 2.1935 0.0571 sec/batch\n", + "Epoch 10/20 Iteration 1771/3560 Training loss: 2.1935 0.0505 sec/batch\n", + "Epoch 10/20 Iteration 1772/3560 Training loss: 2.1934 0.0500 sec/batch\n", + "Epoch 10/20 Iteration 1773/3560 Training loss: 2.1933 0.0496 sec/batch\n", + "Epoch 10/20 Iteration 1774/3560 Training loss: 2.1932 0.0496 sec/batch\n", + "Epoch 10/20 Iteration 1775/3560 Training loss: 2.1933 0.0514 sec/batch\n", + "Epoch 10/20 Iteration 1776/3560 Training loss: 2.1933 0.0520 sec/batch\n", + "Epoch 10/20 Iteration 1777/3560 Training loss: 2.1933 0.0508 sec/batch\n", + "Epoch 10/20 Iteration 1778/3560 Training loss: 2.1933 0.0503 sec/batch\n", + "Epoch 10/20 Iteration 1779/3560 Training loss: 2.1932 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1780/3560 Training loss: 2.1932 0.0582 sec/batch\n", + "Epoch 11/20 Iteration 1781/3560 Training loss: 2.2717 0.0526 sec/batch\n", + "Epoch 11/20 Iteration 1782/3560 Training loss: 2.2128 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1783/3560 Training loss: 2.1968 0.0516 sec/batch\n", + "Epoch 11/20 Iteration 1784/3560 Training loss: 2.1921 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1785/3560 Training loss: 2.1891 0.0551 sec/batch\n", + "Epoch 11/20 Iteration 1786/3560 Training loss: 2.1824 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1787/3560 Training loss: 2.1841 0.0496 sec/batch\n", + "Epoch 11/20 Iteration 1788/3560 Training loss: 2.1848 0.0531 sec/batch\n", + "Epoch 11/20 Iteration 1789/3560 Training loss: 2.1861 0.0500 sec/batch\n", + "Epoch 11/20 Iteration 1790/3560 Training loss: 2.1860 0.0500 sec/batch\n", + "Epoch 11/20 Iteration 1791/3560 Training loss: 2.1836 0.0503 sec/batch\n", + "Epoch 11/20 Iteration 1792/3560 Training loss: 2.1827 0.0498 sec/batch\n", + "Epoch 11/20 Iteration 1793/3560 
Training loss: 2.1824 0.0563 sec/batch\n", + "Epoch 11/20 Iteration 1794/3560 Training loss: 2.1845 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1795/3560 Training loss: 2.1844 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1796/3560 Training loss: 2.1841 0.0567 sec/batch\n", + "Epoch 11/20 Iteration 1797/3560 Training loss: 2.1841 0.0552 sec/batch\n", + "Epoch 11/20 Iteration 1798/3560 Training loss: 2.1866 0.0502 sec/batch\n", + "Epoch 11/20 Iteration 1799/3560 Training loss: 2.1862 0.0525 sec/batch\n", + "Epoch 11/20 Iteration 1800/3560 Training loss: 2.1852 0.0552 sec/batch\n", + "Epoch 11/20 Iteration 1801/3560 Training loss: 2.1846 0.0512 sec/batch\n", + "Epoch 11/20 Iteration 1802/3560 Training loss: 2.1851 0.0490 sec/batch\n", + "Epoch 11/20 Iteration 1803/3560 Training loss: 2.1850 0.0492 sec/batch\n", + "Epoch 11/20 Iteration 1804/3560 Training loss: 2.1841 0.0498 sec/batch\n", + "Epoch 11/20 Iteration 1805/3560 Training loss: 2.1839 0.0528 sec/batch\n", + "Epoch 11/20 Iteration 1806/3560 Training loss: 2.1831 0.0567 sec/batch\n", + "Epoch 11/20 Iteration 1807/3560 Training loss: 2.1826 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1808/3560 Training loss: 2.1827 0.0495 sec/batch\n", + "Epoch 11/20 Iteration 1809/3560 Training loss: 2.1834 0.0570 sec/batch\n", + "Epoch 11/20 Iteration 1810/3560 Training loss: 2.1833 0.0495 sec/batch\n", + "Epoch 11/20 Iteration 1811/3560 Training loss: 2.1834 0.0520 sec/batch\n", + "Epoch 11/20 Iteration 1812/3560 Training loss: 2.1831 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1813/3560 Training loss: 2.1828 0.0587 sec/batch\n", + "Epoch 11/20 Iteration 1814/3560 Training loss: 2.1834 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1815/3560 Training loss: 2.1830 0.0525 sec/batch\n", + "Epoch 11/20 Iteration 1816/3560 Training loss: 2.1830 0.0527 sec/batch\n", + "Epoch 11/20 Iteration 1817/3560 Training loss: 2.1828 0.0505 sec/batch\n", + "Epoch 11/20 Iteration 1818/3560 Training loss: 2.1819 0.0527 sec/batch\n", + 
"Epoch 11/20 Iteration 1819/3560 Training loss: 2.1812 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1820/3560 Training loss: 2.1805 0.0526 sec/batch\n", + "Epoch 11/20 Iteration 1821/3560 Training loss: 2.1802 0.0553 sec/batch\n", + "Epoch 11/20 Iteration 1822/3560 Training loss: 2.1799 0.0513 sec/batch\n", + "Epoch 11/20 Iteration 1823/3560 Training loss: 2.1794 0.0556 sec/batch\n", + "Epoch 11/20 Iteration 1824/3560 Training loss: 2.1789 0.0511 sec/batch\n", + "Epoch 11/20 Iteration 1825/3560 Training loss: 2.1789 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1826/3560 Training loss: 2.1777 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1827/3560 Training loss: 2.1778 0.0615 sec/batch\n", + "Epoch 11/20 Iteration 1828/3560 Training loss: 2.1772 0.0618 sec/batch\n", + "Epoch 11/20 Iteration 1829/3560 Training loss: 2.1769 0.0520 sec/batch\n", + "Epoch 11/20 Iteration 1830/3560 Training loss: 2.1776 0.0556 sec/batch\n", + "Epoch 11/20 Iteration 1831/3560 Training loss: 2.1771 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1832/3560 Training loss: 2.1775 0.0598 sec/batch\n", + "Epoch 11/20 Iteration 1833/3560 Training loss: 2.1771 0.0529 sec/batch\n", + "Epoch 11/20 Iteration 1834/3560 Training loss: 2.1769 0.0541 sec/batch\n", + "Epoch 11/20 Iteration 1835/3560 Training loss: 2.1765 0.0511 sec/batch\n", + "Epoch 11/20 Iteration 1836/3560 Training loss: 2.1766 0.0503 sec/batch\n", + "Epoch 11/20 Iteration 1837/3560 Training loss: 2.1766 0.0560 sec/batch\n", + "Epoch 11/20 Iteration 1838/3560 Training loss: 2.1762 0.0514 sec/batch\n", + "Epoch 11/20 Iteration 1839/3560 Training loss: 2.1758 0.0493 sec/batch\n", + "Epoch 11/20 Iteration 1840/3560 Training loss: 2.1762 0.0539 sec/batch\n", + "Epoch 11/20 Iteration 1841/3560 Training loss: 2.1760 0.0534 sec/batch\n", + "Epoch 11/20 Iteration 1842/3560 Training loss: 2.1764 0.0492 sec/batch\n", + "Epoch 11/20 Iteration 1843/3560 Training loss: 2.1767 0.0507 sec/batch\n", + "Epoch 11/20 Iteration 1844/3560 Training loss: 
2.1769 0.0541 sec/batch\n", + "Epoch 11/20 Iteration 1845/3560 Training loss: 2.1767 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1846/3560 Training loss: 2.1769 0.0516 sec/batch\n", + "Epoch 11/20 Iteration 1847/3560 Training loss: 2.1769 0.0564 sec/batch\n", + "Epoch 11/20 Iteration 1848/3560 Training loss: 2.1763 0.0559 sec/batch\n", + "Epoch 11/20 Iteration 1849/3560 Training loss: 2.1759 0.0506 sec/batch\n", + "Epoch 11/20 Iteration 1850/3560 Training loss: 2.1759 0.0509 sec/batch\n", + "Epoch 11/20 Iteration 1851/3560 Training loss: 2.1761 0.0554 sec/batch\n", + "Epoch 11/20 Iteration 1852/3560 Training loss: 2.1762 0.0520 sec/batch\n", + "Epoch 11/20 Iteration 1853/3560 Training loss: 2.1764 0.0567 sec/batch\n", + "Epoch 11/20 Iteration 1854/3560 Training loss: 2.1761 0.0536 sec/batch\n", + "Epoch 11/20 Iteration 1855/3560 Training loss: 2.1760 0.0511 sec/batch\n", + "Epoch 11/20 Iteration 1856/3560 Training loss: 2.1763 0.0571 sec/batch\n", + "Epoch 11/20 Iteration 1857/3560 Training loss: 2.1761 0.0511 sec/batch\n", + "Epoch 11/20 Iteration 1858/3560 Training loss: 2.1762 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1859/3560 Training loss: 2.1758 0.0522 sec/batch\n", + "Epoch 11/20 Iteration 1860/3560 Training loss: 2.1756 0.0525 sec/batch\n", + "Epoch 11/20 Iteration 1861/3560 Training loss: 2.1753 0.0531 sec/batch\n", + "Epoch 11/20 Iteration 1862/3560 Training loss: 2.1754 0.0503 sec/batch\n", + "Epoch 11/20 Iteration 1863/3560 Training loss: 2.1750 0.0555 sec/batch\n", + "Epoch 11/20 Iteration 1864/3560 Training loss: 2.1747 0.0528 sec/batch\n", + "Epoch 11/20 Iteration 1865/3560 Training loss: 2.1742 0.0492 sec/batch\n", + "Epoch 11/20 Iteration 1866/3560 Training loss: 2.1740 0.0488 sec/batch\n", + "Epoch 11/20 Iteration 1867/3560 Training loss: 2.1741 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1868/3560 Training loss: 2.1739 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1869/3560 Training loss: 2.1736 0.0574 sec/batch\n", + "Epoch 11/20 
Iteration 1870/3560 Training loss: 2.1737 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1871/3560 Training loss: 2.1736 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1872/3560 Training loss: 2.1734 0.0532 sec/batch\n", + "Epoch 11/20 Iteration 1873/3560 Training loss: 2.1731 0.0545 sec/batch\n", + "Epoch 11/20 Iteration 1874/3560 Training loss: 2.1728 0.0509 sec/batch\n", + "Epoch 11/20 Iteration 1875/3560 Training loss: 2.1725 0.0544 sec/batch\n", + "Epoch 11/20 Iteration 1876/3560 Training loss: 2.1724 0.0553 sec/batch\n", + "Epoch 11/20 Iteration 1877/3560 Training loss: 2.1723 0.0599 sec/batch\n", + "Epoch 11/20 Iteration 1878/3560 Training loss: 2.1721 0.0496 sec/batch\n", + "Epoch 11/20 Iteration 1879/3560 Training loss: 2.1718 0.0500 sec/batch\n", + "Epoch 11/20 Iteration 1880/3560 Training loss: 2.1715 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1881/3560 Training loss: 2.1715 0.0549 sec/batch\n", + "Epoch 11/20 Iteration 1882/3560 Training loss: 2.1714 0.0490 sec/batch\n", + "Epoch 11/20 Iteration 1883/3560 Training loss: 2.1712 0.0556 sec/batch\n", + "Epoch 11/20 Iteration 1884/3560 Training loss: 2.1710 0.0503 sec/batch\n", + "Epoch 11/20 Iteration 1885/3560 Training loss: 2.1708 0.0504 sec/batch\n", + "Epoch 11/20 Iteration 1886/3560 Training loss: 2.1707 0.0507 sec/batch\n", + "Epoch 11/20 Iteration 1887/3560 Training loss: 2.1705 0.0504 sec/batch\n", + "Epoch 11/20 Iteration 1888/3560 Training loss: 2.1705 0.0532 sec/batch\n", + "Epoch 11/20 Iteration 1889/3560 Training loss: 2.1704 0.0602 sec/batch\n", + "Epoch 11/20 Iteration 1890/3560 Training loss: 2.1703 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1891/3560 Training loss: 2.1703 0.0542 sec/batch\n", + "Epoch 11/20 Iteration 1892/3560 Training loss: 2.1703 0.0502 sec/batch\n", + "Epoch 11/20 Iteration 1893/3560 Training loss: 2.1700 0.0547 sec/batch\n", + "Epoch 11/20 Iteration 1894/3560 Training loss: 2.1698 0.0493 sec/batch\n", + "Epoch 11/20 Iteration 1895/3560 Training loss: 2.1696 0.0510 
sec/batch\n", + "Epoch 11/20 Iteration 1896/3560 Training loss: 2.1693 0.0546 sec/batch\n", + "Epoch 11/20 Iteration 1897/3560 Training loss: 2.1692 0.0554 sec/batch\n", + "Epoch 11/20 Iteration 1898/3560 Training loss: 2.1692 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1899/3560 Training loss: 2.1693 0.0535 sec/batch\n", + "Epoch 11/20 Iteration 1900/3560 Training loss: 2.1693 0.0525 sec/batch\n", + "Epoch 11/20 Iteration 1901/3560 Training loss: 2.1694 0.0617 sec/batch\n", + "Epoch 11/20 Iteration 1902/3560 Training loss: 2.1692 0.0519 sec/batch\n", + "Epoch 11/20 Iteration 1903/3560 Training loss: 2.1691 0.0506 sec/batch\n", + "Epoch 11/20 Iteration 1904/3560 Training loss: 2.1692 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1905/3560 Training loss: 2.1691 0.0537 sec/batch\n", + "Epoch 11/20 Iteration 1906/3560 Training loss: 2.1689 0.0535 sec/batch\n", + "Epoch 11/20 Iteration 1907/3560 Training loss: 2.1690 0.0504 sec/batch\n", + "Epoch 11/20 Iteration 1908/3560 Training loss: 2.1691 0.0564 sec/batch\n", + "Epoch 11/20 Iteration 1909/3560 Training loss: 2.1691 0.0571 sec/batch\n", + "Epoch 11/20 Iteration 1910/3560 Training loss: 2.1691 0.0498 sec/batch\n", + "Epoch 11/20 Iteration 1911/3560 Training loss: 2.1690 0.0495 sec/batch\n", + "Epoch 11/20 Iteration 1912/3560 Training loss: 2.1686 0.0580 sec/batch\n", + "Epoch 11/20 Iteration 1913/3560 Training loss: 2.1687 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1914/3560 Training loss: 2.1687 0.0507 sec/batch\n", + "Epoch 11/20 Iteration 1915/3560 Training loss: 2.1688 0.0497 sec/batch\n", + "Epoch 11/20 Iteration 1916/3560 Training loss: 2.1688 0.0506 sec/batch\n", + "Epoch 11/20 Iteration 1917/3560 Training loss: 2.1688 0.0528 sec/batch\n", + "Epoch 11/20 Iteration 1918/3560 Training loss: 2.1689 0.0508 sec/batch\n", + "Epoch 11/20 Iteration 1919/3560 Training loss: 2.1691 0.0499 sec/batch\n", + "Epoch 11/20 Iteration 1920/3560 Training loss: 2.1690 0.0497 sec/batch\n", + "Epoch 11/20 Iteration 1921/3560 
Training loss: 2.1691 0.0560 sec/batch\n", + "Epoch 11/20 Iteration 1922/3560 Training loss: 2.1691 0.0578 sec/batch\n", + "Epoch 11/20 Iteration 1923/3560 Training loss: 2.1690 0.0562 sec/batch\n", + "Epoch 11/20 Iteration 1924/3560 Training loss: 2.1690 0.0587 sec/batch\n", + "Epoch 11/20 Iteration 1925/3560 Training loss: 2.1689 0.0516 sec/batch\n", + "Epoch 11/20 Iteration 1926/3560 Training loss: 2.1690 0.0513 sec/batch\n", + "Epoch 11/20 Iteration 1927/3560 Training loss: 2.1690 0.0553 sec/batch\n", + "Epoch 11/20 Iteration 1928/3560 Training loss: 2.1692 0.0519 sec/batch\n", + "Epoch 11/20 Iteration 1929/3560 Training loss: 2.1692 0.0570 sec/batch\n", + "Epoch 11/20 Iteration 1930/3560 Training loss: 2.1690 0.0496 sec/batch\n", + "Epoch 11/20 Iteration 1931/3560 Training loss: 2.1689 0.0502 sec/batch\n", + "Epoch 11/20 Iteration 1932/3560 Training loss: 2.1691 0.0526 sec/batch\n", + "Epoch 11/20 Iteration 1933/3560 Training loss: 2.1691 0.0608 sec/batch\n", + "Epoch 11/20 Iteration 1934/3560 Training loss: 2.1692 0.0495 sec/batch\n", + "Epoch 11/20 Iteration 1935/3560 Training loss: 2.1691 0.0533 sec/batch\n", + "Epoch 11/20 Iteration 1936/3560 Training loss: 2.1691 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1937/3560 Training loss: 2.1691 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1938/3560 Training loss: 2.1690 0.0505 sec/batch\n", + "Epoch 11/20 Iteration 1939/3560 Training loss: 2.1688 0.0548 sec/batch\n", + "Epoch 11/20 Iteration 1940/3560 Training loss: 2.1690 0.0530 sec/batch\n", + "Epoch 11/20 Iteration 1941/3560 Training loss: 2.1690 0.0502 sec/batch\n", + "Epoch 11/20 Iteration 1942/3560 Training loss: 2.1689 0.0547 sec/batch\n", + "Epoch 11/20 Iteration 1943/3560 Training loss: 2.1690 0.0499 sec/batch\n", + "Epoch 11/20 Iteration 1944/3560 Training loss: 2.1689 0.0548 sec/batch\n", + "Epoch 11/20 Iteration 1945/3560 Training loss: 2.1690 0.0516 sec/batch\n", + "Epoch 11/20 Iteration 1946/3560 Training loss: 2.1689 0.0513 sec/batch\n", + 
"Epoch 11/20 Iteration 1947/3560 Training loss: 2.1689 0.0549 sec/batch\n", + "Epoch 11/20 Iteration 1948/3560 Training loss: 2.1690 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1949/3560 Training loss: 2.1691 0.0570 sec/batch\n", + "Epoch 11/20 Iteration 1950/3560 Training loss: 2.1690 0.0505 sec/batch\n", + "Epoch 11/20 Iteration 1951/3560 Training loss: 2.1689 0.0548 sec/batch\n", + "Epoch 11/20 Iteration 1952/3560 Training loss: 2.1689 0.0492 sec/batch\n", + "Epoch 11/20 Iteration 1953/3560 Training loss: 2.1690 0.0651 sec/batch\n", + "Epoch 11/20 Iteration 1954/3560 Training loss: 2.1690 0.0545 sec/batch\n", + "Epoch 11/20 Iteration 1955/3560 Training loss: 2.1690 0.0508 sec/batch\n", + "Epoch 11/20 Iteration 1956/3560 Training loss: 2.1691 0.0555 sec/batch\n", + "Epoch 11/20 Iteration 1957/3560 Training loss: 2.1690 0.0550 sec/batch\n", + "Epoch 11/20 Iteration 1958/3560 Training loss: 2.1689 0.0499 sec/batch\n", + "Epoch 12/20 Iteration 1959/3560 Training loss: 2.2459 0.0516 sec/batch\n", + "Epoch 12/20 Iteration 1960/3560 Training loss: 2.1872 0.0540 sec/batch\n", + "Epoch 12/20 Iteration 1961/3560 Training loss: 2.1738 0.0544 sec/batch\n", + "Epoch 12/20 Iteration 1962/3560 Training loss: 2.1688 0.0517 sec/batch\n", + "Epoch 12/20 Iteration 1963/3560 Training loss: 2.1662 0.0518 sec/batch\n", + "Epoch 12/20 Iteration 1964/3560 Training loss: 2.1604 0.0512 sec/batch\n", + "Epoch 12/20 Iteration 1965/3560 Training loss: 2.1607 0.0515 sec/batch\n", + "Epoch 12/20 Iteration 1966/3560 Training loss: 2.1620 0.0502 sec/batch\n", + "Epoch 12/20 Iteration 1967/3560 Training loss: 2.1640 0.0515 sec/batch\n", + "Epoch 12/20 Iteration 1968/3560 Training loss: 2.1638 0.0532 sec/batch\n", + "Epoch 12/20 Iteration 1969/3560 Training loss: 2.1622 0.0559 sec/batch\n", + "Epoch 12/20 Iteration 1970/3560 Training loss: 2.1614 0.0540 sec/batch\n", + "Epoch 12/20 Iteration 1971/3560 Training loss: 2.1613 0.0571 sec/batch\n", + "Epoch 12/20 Iteration 1972/3560 Training loss: 
2.1625 0.0511 sec/batch\n", + "Epoch 12/20 Iteration 1973/3560 Training loss: 2.1626 0.0503 sec/batch\n", + "Epoch 12/20 Iteration 1974/3560 Training loss: 2.1620 0.0505 sec/batch\n", + "Epoch 12/20 Iteration 1975/3560 Training loss: 2.1621 0.0525 sec/batch\n", + "Epoch 12/20 Iteration 1976/3560 Training loss: 2.1644 0.0520 sec/batch\n", + "Epoch 12/20 Iteration 1977/3560 Training loss: 2.1641 0.0521 sec/batch\n", + "Epoch 12/20 Iteration 1978/3560 Training loss: 2.1636 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 1979/3560 Training loss: 2.1628 0.0510 sec/batch\n", + "Epoch 12/20 Iteration 1980/3560 Training loss: 2.1635 0.0524 sec/batch\n", + "Epoch 12/20 Iteration 1981/3560 Training loss: 2.1636 0.0521 sec/batch\n", + "Epoch 12/20 Iteration 1982/3560 Training loss: 2.1625 0.0497 sec/batch\n", + "Epoch 12/20 Iteration 1983/3560 Training loss: 2.1620 0.0497 sec/batch\n", + "Epoch 12/20 Iteration 1984/3560 Training loss: 2.1612 0.0511 sec/batch\n", + "Epoch 12/20 Iteration 1985/3560 Training loss: 2.1609 0.0559 sec/batch\n", + "Epoch 12/20 Iteration 1986/3560 Training loss: 2.1610 0.0514 sec/batch\n", + "Epoch 12/20 Iteration 1987/3560 Training loss: 2.1620 0.0528 sec/batch\n", + "Epoch 12/20 Iteration 1988/3560 Training loss: 2.1619 0.0539 sec/batch\n", + "Epoch 12/20 Iteration 1989/3560 Training loss: 2.1618 0.0518 sec/batch\n", + "Epoch 12/20 Iteration 1990/3560 Training loss: 2.1616 0.0518 sec/batch\n", + "Epoch 12/20 Iteration 1991/3560 Training loss: 2.1613 0.0507 sec/batch\n", + "Epoch 12/20 Iteration 1992/3560 Training loss: 2.1621 0.0503 sec/batch\n", + "Epoch 12/20 Iteration 1993/3560 Training loss: 2.1617 0.0548 sec/batch\n", + "Epoch 12/20 Iteration 1994/3560 Training loss: 2.1617 0.0499 sec/batch\n", + "Epoch 12/20 Iteration 1995/3560 Training loss: 2.1614 0.0516 sec/batch\n", + "Epoch 12/20 Iteration 1996/3560 Training loss: 2.1603 0.0515 sec/batch\n", + "Epoch 12/20 Iteration 1997/3560 Training loss: 2.1593 0.0528 sec/batch\n", + "Epoch 12/20 
Iteration 1998/3560 Training loss: 2.1588 0.0542 sec/batch\n", + "Epoch 12/20 Iteration 1999/3560 Training loss: 2.1584 0.0499 sec/batch\n", + "Epoch 12/20 Iteration 2000/3560 Training loss: 2.1587 0.0515 sec/batch\n", + "Epoch 12/20 Iteration 2001/3560 Training loss: 2.1583 0.0536 sec/batch\n", + "Epoch 12/20 Iteration 2002/3560 Training loss: 2.1577 0.0569 sec/batch\n", + "Epoch 12/20 Iteration 2003/3560 Training loss: 2.1577 0.0525 sec/batch\n", + "Epoch 12/20 Iteration 2004/3560 Training loss: 2.1564 0.0558 sec/batch\n", + "Epoch 12/20 Iteration 2005/3560 Training loss: 2.1566 0.0655 sec/batch\n", + "Epoch 12/20 Iteration 2006/3560 Training loss: 2.1562 0.0506 sec/batch\n", + "Epoch 12/20 Iteration 2007/3560 Training loss: 2.1559 0.0504 sec/batch\n", + "Epoch 12/20 Iteration 2008/3560 Training loss: 2.1563 0.0531 sec/batch\n", + "Epoch 12/20 Iteration 2009/3560 Training loss: 2.1557 0.0591 sec/batch\n", + "Epoch 12/20 Iteration 2010/3560 Training loss: 2.1562 0.0555 sec/batch\n", + "Epoch 12/20 Iteration 2011/3560 Training loss: 2.1559 0.0498 sec/batch\n", + "Epoch 12/20 Iteration 2012/3560 Training loss: 2.1555 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 2013/3560 Training loss: 2.1552 0.0544 sec/batch\n", + "Epoch 12/20 Iteration 2014/3560 Training loss: 2.1553 0.0528 sec/batch\n", + "Epoch 12/20 Iteration 2015/3560 Training loss: 2.1551 0.0486 sec/batch\n", + "Epoch 12/20 Iteration 2016/3560 Training loss: 2.1547 0.0573 sec/batch\n", + "Epoch 12/20 Iteration 2017/3560 Training loss: 2.1543 0.0578 sec/batch\n", + "Epoch 12/20 Iteration 2018/3560 Training loss: 2.1548 0.0493 sec/batch\n", + "Epoch 12/20 Iteration 2019/3560 Training loss: 2.1544 0.0528 sec/batch\n", + "Epoch 12/20 Iteration 2020/3560 Training loss: 2.1548 0.0505 sec/batch\n", + "Epoch 12/20 Iteration 2021/3560 Training loss: 2.1551 0.0568 sec/batch\n", + "Epoch 12/20 Iteration 2022/3560 Training loss: 2.1551 0.0526 sec/batch\n", + "Epoch 12/20 Iteration 2023/3560 Training loss: 2.1549 0.0541 
sec/batch\n", + "Epoch 12/20 Iteration 2024/3560 Training loss: 2.1553 0.0553 sec/batch\n", + "Epoch 12/20 Iteration 2025/3560 Training loss: 2.1553 0.0514 sec/batch\n", + "Epoch 12/20 Iteration 2026/3560 Training loss: 2.1548 0.0511 sec/batch\n", + "Epoch 12/20 Iteration 2027/3560 Training loss: 2.1545 0.0571 sec/batch\n", + "Epoch 12/20 Iteration 2028/3560 Training loss: 2.1544 0.0519 sec/batch\n", + "Epoch 12/20 Iteration 2029/3560 Training loss: 2.1547 0.0603 sec/batch\n", + "Epoch 12/20 Iteration 2030/3560 Training loss: 2.1548 0.0504 sec/batch\n", + "Epoch 12/20 Iteration 2031/3560 Training loss: 2.1551 0.0562 sec/batch\n", + "Epoch 12/20 Iteration 2032/3560 Training loss: 2.1547 0.0559 sec/batch\n", + "Epoch 12/20 Iteration 2033/3560 Training loss: 2.1546 0.0514 sec/batch\n", + "Epoch 12/20 Iteration 2034/3560 Training loss: 2.1550 0.0524 sec/batch\n", + "Epoch 12/20 Iteration 2035/3560 Training loss: 2.1549 0.0517 sec/batch\n", + "Epoch 12/20 Iteration 2036/3560 Training loss: 2.1549 0.0555 sec/batch\n", + "Epoch 12/20 Iteration 2037/3560 Training loss: 2.1545 0.0508 sec/batch\n", + "Epoch 12/20 Iteration 2038/3560 Training loss: 2.1544 0.0517 sec/batch\n", + "Epoch 12/20 Iteration 2039/3560 Training loss: 2.1540 0.0512 sec/batch\n", + "Epoch 12/20 Iteration 2040/3560 Training loss: 2.1542 0.0627 sec/batch\n", + "Epoch 12/20 Iteration 2041/3560 Training loss: 2.1539 0.0589 sec/batch\n", + "Epoch 12/20 Iteration 2042/3560 Training loss: 2.1536 0.0503 sec/batch\n", + "Epoch 12/20 Iteration 2043/3560 Training loss: 2.1530 0.0526 sec/batch\n", + "Epoch 12/20 Iteration 2044/3560 Training loss: 2.1527 0.0547 sec/batch\n", + "Epoch 12/20 Iteration 2045/3560 Training loss: 2.1527 0.0499 sec/batch\n", + "Epoch 12/20 Iteration 2046/3560 Training loss: 2.1524 0.0568 sec/batch\n", + "Epoch 12/20 Iteration 2047/3560 Training loss: 2.1520 0.0496 sec/batch\n", + "Epoch 12/20 Iteration 2048/3560 Training loss: 2.1520 0.0503 sec/batch\n", + "Epoch 12/20 Iteration 2049/3560 
Training loss: 2.1518 0.0530 sec/batch\n", + "Epoch 12/20 Iteration 2050/3560 Training loss: 2.1518 0.0506 sec/batch\n", + "Epoch 12/20 Iteration 2051/3560 Training loss: 2.1514 0.0579 sec/batch\n", + "Epoch 12/20 Iteration 2052/3560 Training loss: 2.1513 0.0500 sec/batch\n", + "Epoch 12/20 Iteration 2053/3560 Training loss: 2.1510 0.0517 sec/batch\n", + "Epoch 12/20 Iteration 2054/3560 Training loss: 2.1509 0.0504 sec/batch\n", + "Epoch 12/20 Iteration 2055/3560 Training loss: 2.1508 0.0521 sec/batch\n", + "Epoch 12/20 Iteration 2056/3560 Training loss: 2.1506 0.0586 sec/batch\n", + "Epoch 12/20 Iteration 2057/3560 Training loss: 2.1502 0.0504 sec/batch\n", + "Epoch 12/20 Iteration 2058/3560 Training loss: 2.1499 0.0523 sec/batch\n", + "Epoch 12/20 Iteration 2059/3560 Training loss: 2.1500 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 2060/3560 Training loss: 2.1499 0.0511 sec/batch\n", + "Epoch 12/20 Iteration 2061/3560 Training loss: 2.1496 0.0666 sec/batch\n", + "Epoch 12/20 Iteration 2062/3560 Training loss: 2.1493 0.0500 sec/batch\n", + "Epoch 12/20 Iteration 2063/3560 Training loss: 2.1491 0.0494 sec/batch\n", + "Epoch 12/20 Iteration 2064/3560 Training loss: 2.1491 0.0522 sec/batch\n", + "Epoch 12/20 Iteration 2065/3560 Training loss: 2.1490 0.0550 sec/batch\n", + "Epoch 12/20 Iteration 2066/3560 Training loss: 2.1490 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 2067/3560 Training loss: 2.1490 0.0545 sec/batch\n", + "Epoch 12/20 Iteration 2068/3560 Training loss: 2.1489 0.0508 sec/batch\n", + "Epoch 12/20 Iteration 2069/3560 Training loss: 2.1489 0.0544 sec/batch\n", + "Epoch 12/20 Iteration 2070/3560 Training loss: 2.1489 0.0544 sec/batch\n", + "Epoch 12/20 Iteration 2071/3560 Training loss: 2.1487 0.0546 sec/batch\n", + "Epoch 12/20 Iteration 2072/3560 Training loss: 2.1485 0.0497 sec/batch\n", + "Epoch 12/20 Iteration 2073/3560 Training loss: 2.1483 0.0628 sec/batch\n", + "Epoch 12/20 Iteration 2074/3560 Training loss: 2.1479 0.0523 sec/batch\n", + 
"Epoch 12/20 Iteration 2075/3560 Training loss: 2.1479 0.0522 sec/batch\n", + "Epoch 12/20 Iteration 2076/3560 Training loss: 2.1478 0.0594 sec/batch\n", + "Epoch 12/20 Iteration 2077/3560 Training loss: 2.1480 0.0530 sec/batch\n", + "Epoch 12/20 Iteration 2078/3560 Training loss: 2.1479 0.0599 sec/batch\n", + "Epoch 12/20 Iteration 2079/3560 Training loss: 2.1479 0.0520 sec/batch\n", + "Epoch 12/20 Iteration 2080/3560 Training loss: 2.1478 0.0519 sec/batch\n", + "Epoch 12/20 Iteration 2081/3560 Training loss: 2.1477 0.0492 sec/batch\n", + "Epoch 12/20 Iteration 2082/3560 Training loss: 2.1477 0.0629 sec/batch\n", + "Epoch 12/20 Iteration 2083/3560 Training loss: 2.1476 0.0497 sec/batch\n", + "Epoch 12/20 Iteration 2084/3560 Training loss: 2.1473 0.0501 sec/batch\n", + "Epoch 12/20 Iteration 2085/3560 Training loss: 2.1474 0.0536 sec/batch\n", + "Epoch 12/20 Iteration 2086/3560 Training loss: 2.1474 0.0554 sec/batch\n", + "Epoch 12/20 Iteration 2087/3560 Training loss: 2.1474 0.0497 sec/batch\n", + "Epoch 12/20 Iteration 2088/3560 Training loss: 2.1474 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 2089/3560 Training loss: 2.1472 0.0491 sec/batch\n", + "Epoch 12/20 Iteration 2090/3560 Training loss: 2.1469 0.0550 sec/batch\n", + "Epoch 12/20 Iteration 2091/3560 Training loss: 2.1470 0.0491 sec/batch\n", + "Epoch 12/20 Iteration 2092/3560 Training loss: 2.1471 0.0494 sec/batch\n", + "Epoch 12/20 Iteration 2093/3560 Training loss: 2.1470 0.0570 sec/batch\n", + "Epoch 12/20 Iteration 2094/3560 Training loss: 2.1472 0.0522 sec/batch\n", + "Epoch 12/20 Iteration 2095/3560 Training loss: 2.1471 0.0500 sec/batch\n", + "Epoch 12/20 Iteration 2096/3560 Training loss: 2.1472 0.0538 sec/batch\n", + "Epoch 12/20 Iteration 2097/3560 Training loss: 2.1474 0.0511 sec/batch\n", + "Epoch 12/20 Iteration 2098/3560 Training loss: 2.1474 0.0545 sec/batch\n", + "Epoch 12/20 Iteration 2099/3560 Training loss: 2.1474 0.0536 sec/batch\n", + "Epoch 12/20 Iteration 2100/3560 Training loss: 
2.1473 0.0497 sec/batch\n", + "Epoch 12/20 Iteration 2101/3560 Training loss: 2.1473 0.0548 sec/batch\n", + "Epoch 12/20 Iteration 2102/3560 Training loss: 2.1473 0.0567 sec/batch\n", + "Epoch 12/20 Iteration 2103/3560 Training loss: 2.1472 0.0498 sec/batch\n", + "Epoch 12/20 Iteration 2104/3560 Training loss: 2.1474 0.0547 sec/batch\n", + "Epoch 12/20 Iteration 2105/3560 Training loss: 2.1474 0.0618 sec/batch\n", + "Epoch 12/20 Iteration 2106/3560 Training loss: 2.1475 0.0569 sec/batch\n", + "Epoch 12/20 Iteration 2107/3560 Training loss: 2.1475 0.0504 sec/batch\n", + "Epoch 12/20 Iteration 2108/3560 Training loss: 2.1474 0.0502 sec/batch\n", + "Epoch 12/20 Iteration 2109/3560 Training loss: 2.1473 0.0557 sec/batch\n", + "Epoch 12/20 Iteration 2110/3560 Training loss: 2.1475 0.0541 sec/batch\n", + "Epoch 12/20 Iteration 2111/3560 Training loss: 2.1475 0.0515 sec/batch\n", + "Epoch 12/20 Iteration 2112/3560 Training loss: 2.1475 0.0497 sec/batch\n", + "Epoch 12/20 Iteration 2113/3560 Training loss: 2.1475 0.0550 sec/batch\n", + "Epoch 12/20 Iteration 2114/3560 Training loss: 2.1475 0.0517 sec/batch\n", + "Epoch 12/20 Iteration 2115/3560 Training loss: 2.1474 0.0517 sec/batch\n", + "Epoch 12/20 Iteration 2116/3560 Training loss: 2.1474 0.0546 sec/batch\n", + "Epoch 12/20 Iteration 2117/3560 Training loss: 2.1473 0.0564 sec/batch\n", + "Epoch 12/20 Iteration 2118/3560 Training loss: 2.1474 0.0534 sec/batch\n", + "Epoch 12/20 Iteration 2119/3560 Training loss: 2.1474 0.0517 sec/batch\n", + "Epoch 12/20 Iteration 2120/3560 Training loss: 2.1474 0.0554 sec/batch\n", + "Epoch 12/20 Iteration 2121/3560 Training loss: 2.1474 0.0536 sec/batch\n", + "Epoch 12/20 Iteration 2122/3560 Training loss: 2.1473 0.0521 sec/batch\n", + "Epoch 12/20 Iteration 2123/3560 Training loss: 2.1474 0.0517 sec/batch\n", + "Epoch 12/20 Iteration 2124/3560 Training loss: 2.1473 0.0525 sec/batch\n", + "Epoch 12/20 Iteration 2125/3560 Training loss: 2.1473 0.0561 sec/batch\n", + "Epoch 12/20 
Iteration 2126/3560 Training loss: 2.1475 0.0506 sec/batch\n", + "Epoch 12/20 Iteration 2127/3560 Training loss: 2.1475 0.0494 sec/batch\n", + "Epoch 12/20 Iteration 2128/3560 Training loss: 2.1474 0.0506 sec/batch\n", + "Epoch 12/20 Iteration 2129/3560 Training loss: 2.1474 0.0569 sec/batch\n", + "Epoch 12/20 Iteration 2130/3560 Training loss: 2.1474 0.0503 sec/batch\n", + "Epoch 12/20 Iteration 2131/3560 Training loss: 2.1475 0.0499 sec/batch\n", + "Epoch 12/20 Iteration 2132/3560 Training loss: 2.1475 0.0503 sec/batch\n", + "Epoch 12/20 Iteration 2133/3560 Training loss: 2.1476 0.0594 sec/batch\n", + "Epoch 12/20 Iteration 2134/3560 Training loss: 2.1476 0.0581 sec/batch\n", + "Epoch 12/20 Iteration 2135/3560 Training loss: 2.1475 0.0495 sec/batch\n", + "Epoch 12/20 Iteration 2136/3560 Training loss: 2.1474 0.0499 sec/batch\n", + "Epoch 13/20 Iteration 2137/3560 Training loss: 2.2197 0.0492 sec/batch\n", + "Epoch 13/20 Iteration 2138/3560 Training loss: 2.1697 0.0551 sec/batch\n", + "Epoch 13/20 Iteration 2139/3560 Training loss: 2.1586 0.0526 sec/batch\n", + "Epoch 13/20 Iteration 2140/3560 Training loss: 2.1505 0.0586 sec/batch\n", + "Epoch 13/20 Iteration 2141/3560 Training loss: 2.1450 0.0501 sec/batch\n", + "Epoch 13/20 Iteration 2142/3560 Training loss: 2.1394 0.0539 sec/batch\n", + "Epoch 13/20 Iteration 2143/3560 Training loss: 2.1404 0.0535 sec/batch\n", + "Epoch 13/20 Iteration 2144/3560 Training loss: 2.1405 0.0507 sec/batch\n", + "Epoch 13/20 Iteration 2145/3560 Training loss: 2.1422 0.0525 sec/batch\n", + "Epoch 13/20 Iteration 2146/3560 Training loss: 2.1430 0.0516 sec/batch\n", + "Epoch 13/20 Iteration 2147/3560 Training loss: 2.1416 0.0511 sec/batch\n", + "Epoch 13/20 Iteration 2148/3560 Training loss: 2.1404 0.0496 sec/batch\n", + "Epoch 13/20 Iteration 2149/3560 Training loss: 2.1401 0.0496 sec/batch\n", + "Epoch 13/20 Iteration 2150/3560 Training loss: 2.1425 0.0578 sec/batch\n", + "Epoch 13/20 Iteration 2151/3560 Training loss: 2.1417 0.0495 
sec/batch\n", + "Epoch 13/20 Iteration 2152/3560 Training loss: 2.1406 0.0534 sec/batch\n", + "Epoch 13/20 Iteration 2153/3560 Training loss: 2.1403 0.0530 sec/batch\n", + "Epoch 13/20 Iteration 2154/3560 Training loss: 2.1423 0.0513 sec/batch\n", + "Epoch 13/20 Iteration 2155/3560 Training loss: 2.1423 0.0492 sec/batch\n", + "Epoch 13/20 Iteration 2156/3560 Training loss: 2.1415 0.0559 sec/batch\n", + "Epoch 13/20 Iteration 2157/3560 Training loss: 2.1412 0.0514 sec/batch\n", + "Epoch 13/20 Iteration 2158/3560 Training loss: 2.1420 0.0515 sec/batch\n", + "Epoch 13/20 Iteration 2159/3560 Training loss: 2.1417 0.0528 sec/batch\n", + "Epoch 13/20 Iteration 2160/3560 Training loss: 2.1409 0.0506 sec/batch\n", + "Epoch 13/20 Iteration 2161/3560 Training loss: 2.1408 0.0505 sec/batch\n", + "Epoch 13/20 Iteration 2162/3560 Training loss: 2.1399 0.0527 sec/batch\n", + "Epoch 13/20 Iteration 2163/3560 Training loss: 2.1391 0.0580 sec/batch\n", + "Epoch 13/20 Iteration 2164/3560 Training loss: 2.1392 0.0511 sec/batch\n", + "Epoch 13/20 Iteration 2165/3560 Training loss: 2.1401 0.0514 sec/batch\n", + "Epoch 13/20 Iteration 2166/3560 Training loss: 2.1402 0.0521 sec/batch\n", + "Epoch 13/20 Iteration 2167/3560 Training loss: 2.1400 0.0562 sec/batch\n", + "Epoch 13/20 Iteration 2168/3560 Training loss: 2.1395 0.0508 sec/batch\n", + "Epoch 13/20 Iteration 2169/3560 Training loss: 2.1394 0.0519 sec/batch\n", + "Epoch 13/20 Iteration 2170/3560 Training loss: 2.1405 0.0521 sec/batch\n", + "Epoch 13/20 Iteration 2171/3560 Training loss: 2.1403 0.0555 sec/batch\n", + "Epoch 13/20 Iteration 2172/3560 Training loss: 2.1401 0.0503 sec/batch\n", + "Epoch 13/20 Iteration 2173/3560 Training loss: 2.1398 0.0566 sec/batch\n", + "Epoch 13/20 Iteration 2174/3560 Training loss: 2.1385 0.0574 sec/batch\n", + "Epoch 13/20 Iteration 2175/3560 Training loss: 2.1378 0.0522 sec/batch\n", + "Epoch 13/20 Iteration 2176/3560 Training loss: 2.1370 0.0524 sec/batch\n", + "Epoch 13/20 Iteration 2177/3560 
Training loss: 2.1365 0.0527 sec/batch\n", + "Epoch 13/20 Iteration 2178/3560 Training loss: 2.1365 0.0577 sec/batch\n", + "Epoch 13/20 Iteration 2179/3560 Training loss: 2.1363 0.0564 sec/batch\n", + "Epoch 13/20 Iteration 2180/3560 Training loss: 2.1356 0.0563 sec/batch\n", + "Epoch 13/20 Iteration 2181/3560 Training loss: 2.1356 0.0581 sec/batch\n", + "Epoch 13/20 Iteration 2182/3560 Training loss: 2.1344 0.0504 sec/batch\n", + "Epoch 13/20 Iteration 2183/3560 Training loss: 2.1345 0.0568 sec/batch\n", + "Epoch 13/20 Iteration 2184/3560 Training loss: 2.1340 0.0502 sec/batch\n", + "Epoch 13/20 Iteration 2185/3560 Training loss: 2.1339 0.0538 sec/batch\n", + "Epoch 13/20 Iteration 2186/3560 Training loss: 2.1345 0.0596 sec/batch\n", + "Epoch 13/20 Iteration 2187/3560 Training loss: 2.1340 0.0482 sec/batch\n", + "Epoch 13/20 Iteration 2188/3560 Training loss: 2.1345 0.0496 sec/batch\n", + "Epoch 13/20 Iteration 2189/3560 Training loss: 2.1341 0.0548 sec/batch\n", + "Epoch 13/20 Iteration 2190/3560 Training loss: 2.1339 0.0656 sec/batch\n", + "Epoch 13/20 Iteration 2191/3560 Training loss: 2.1336 0.0520 sec/batch\n", + "Epoch 13/20 Iteration 2192/3560 Training loss: 2.1339 0.0524 sec/batch\n", + "Epoch 13/20 Iteration 2193/3560 Training loss: 2.1340 0.0509 sec/batch\n", + "Epoch 13/20 Iteration 2194/3560 Training loss: 2.1337 0.0540 sec/batch\n", + "Epoch 13/20 Iteration 2195/3560 Training loss: 2.1334 0.0494 sec/batch\n", + "Epoch 13/20 Iteration 2196/3560 Training loss: 2.1337 0.0500 sec/batch\n", + "Epoch 13/20 Iteration 2197/3560 Training loss: 2.1335 0.0521 sec/batch\n", + "Epoch 13/20 Iteration 2198/3560 Training loss: 2.1340 0.0497 sec/batch\n", + "Epoch 13/20 Iteration 2199/3560 Training loss: 2.1342 0.0570 sec/batch\n", + "Epoch 13/20 Iteration 2200/3560 Training loss: 2.1344 0.0547 sec/batch\n", + "Epoch 13/20 Iteration 2201/3560 Training loss: 2.1341 0.0596 sec/batch\n", + "Epoch 13/20 Iteration 2202/3560 Training loss: 2.1344 0.0602 sec/batch\n", + 
"Epoch 13/20 Iteration 2203/3560 Training loss: 2.1345 0.0601 sec/batch\n", + "Epoch 13/20 Iteration 2204/3560 Training loss: 2.1341 0.0568 sec/batch\n", + "Epoch 13/20 Iteration 2205/3560 Training loss: 2.1339 0.0536 sec/batch\n", + "Epoch 13/20 Iteration 2206/3560 Training loss: 2.1339 0.0632 sec/batch\n", + "Epoch 13/20 Iteration 2207/3560 Training loss: 2.1343 0.0584 sec/batch\n", + "Epoch 13/20 Iteration 2208/3560 Training loss: 2.1345 0.0540 sec/batch\n", + "Epoch 13/20 Iteration 2209/3560 Training loss: 2.1348 0.0511 sec/batch\n", + "Epoch 13/20 Iteration 2210/3560 Training loss: 2.1344 0.0560 sec/batch\n", + "Epoch 13/20 Iteration 2211/3560 Training loss: 2.1344 0.0509 sec/batch\n", + "Epoch 13/20 Iteration 2212/3560 Training loss: 2.1349 0.0514 sec/batch\n", + "Epoch 13/20 Iteration 2213/3560 Training loss: 2.1347 0.0568 sec/batch\n", + "Epoch 13/20 Iteration 2214/3560 Training loss: 2.1349 0.0585 sec/batch\n", + "Epoch 13/20 Iteration 2215/3560 Training loss: 2.1346 0.0512 sec/batch\n", + "Epoch 13/20 Iteration 2216/3560 Training loss: 2.1344 0.0523 sec/batch\n", + "Epoch 13/20 Iteration 2217/3560 Training loss: 2.1340 0.0544 sec/batch\n", + "Epoch 13/20 Iteration 2218/3560 Training loss: 2.1340 0.0508 sec/batch\n", + "Epoch 13/20 Iteration 2219/3560 Training loss: 2.1336 0.0507 sec/batch\n", + "Epoch 13/20 Iteration 2220/3560 Training loss: 2.1333 0.0535 sec/batch\n", + "Epoch 13/20 Iteration 2221/3560 Training loss: 2.1329 0.0564 sec/batch\n", + "Epoch 13/20 Iteration 2222/3560 Training loss: 2.1327 0.0507 sec/batch\n", + "Epoch 13/20 Iteration 2223/3560 Training loss: 2.1328 0.0513 sec/batch\n", + "Epoch 13/20 Iteration 2224/3560 Training loss: 2.1325 0.0517 sec/batch\n", + "Epoch 13/20 Iteration 2225/3560 Training loss: 2.1322 0.0522 sec/batch\n", + "Epoch 13/20 Iteration 2226/3560 Training loss: 2.1323 0.0608 sec/batch\n", + "Epoch 13/20 Iteration 2227/3560 Training loss: 2.1321 0.0520 sec/batch\n", + "Epoch 13/20 Iteration 2228/3560 Training loss: 
2.1320 0.0576 sec/batch\n", + "Epoch 13/20 Iteration 2229/3560 Training loss: 2.1317 0.0542 sec/batch\n", + "Epoch 13/20 Iteration 2230/3560 Training loss: 2.1315 0.0566 sec/batch\n", + "Epoch 13/20 Iteration 2231/3560 Training loss: 2.1312 0.0534 sec/batch\n", + "Epoch 13/20 Iteration 2232/3560 Training loss: 2.1311 0.0516 sec/batch\n", + "Epoch 13/20 Iteration 2233/3560 Training loss: 2.1310 0.0588 sec/batch\n", + "Epoch 13/20 Iteration 2234/3560 Training loss: 2.1308 0.0523 sec/batch\n", + "Epoch 13/20 Iteration 2235/3560 Training loss: 2.1305 0.0507 sec/batch\n", + "Epoch 13/20 Iteration 2236/3560 Training loss: 2.1301 0.0540 sec/batch\n", + "Epoch 13/20 Iteration 2237/3560 Training loss: 2.1302 0.0503 sec/batch\n", + "Epoch 13/20 Iteration 2238/3560 Training loss: 2.1301 0.0570 sec/batch\n", + "Epoch 13/20 Iteration 2239/3560 Training loss: 2.1299 0.0509 sec/batch\n", + "Epoch 13/20 Iteration 2240/3560 Training loss: 2.1297 0.0517 sec/batch\n", + "Epoch 13/20 Iteration 2241/3560 Training loss: 2.1294 0.0497 sec/batch\n", + "Epoch 13/20 Iteration 2242/3560 Training loss: 2.1294 0.0526 sec/batch\n", + "Epoch 13/20 Iteration 2243/3560 Training loss: 2.1293 0.0539 sec/batch\n", + "Epoch 13/20 Iteration 2244/3560 Training loss: 2.1294 0.0533 sec/batch\n", + "Epoch 13/20 Iteration 2245/3560 Training loss: 2.1294 0.0530 sec/batch\n", + "Epoch 13/20 Iteration 2246/3560 Training loss: 2.1293 0.0531 sec/batch\n", + "Epoch 13/20 Iteration 2247/3560 Training loss: 2.1293 0.0516 sec/batch\n", + "Epoch 13/20 Iteration 2248/3560 Training loss: 2.1292 0.0506 sec/batch\n", + "Epoch 13/20 Iteration 2249/3560 Training loss: 2.1291 0.0524 sec/batch\n", + "Epoch 13/20 Iteration 2250/3560 Training loss: 2.1290 0.0539 sec/batch\n", + "Epoch 13/20 Iteration 2251/3560 Training loss: 2.1289 0.0524 sec/batch\n", + "Epoch 13/20 Iteration 2252/3560 Training loss: 2.1284 0.0507 sec/batch\n", + "Epoch 13/20 Iteration 2253/3560 Training loss: 2.1284 0.0518 sec/batch\n", + "Epoch 13/20 
Iteration 2254/3560 Training loss: 2.1284 0.0599 sec/batch\n", + "Epoch 13/20 Iteration 2314/3560 Training loss: 2.1278 0.0577 sec/batch\n", + "Epoch 14/20 Iteration 2315/3560 Training loss: 2.1948 0.0497 sec/batch\n", + "Epoch 14/20 Iteration 2492/3560 Training loss: 2.1082 0.0522 sec/batch\n", + "Epoch 15/20 Iteration 2493/3560 Training loss: 2.1757 0.0494 sec/batch\n", + "Epoch 15/20 Iteration 2670/3560 Training loss: 2.0890 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2671/3560 Training loss: 2.1553 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2765/3560 Training loss: 2.0741 0.0566 sec/batch\n", + "Epoch 16/20 
Iteration 2766/3560 Training loss: 2.0740 0.0516 sec/batch\n", + "Epoch 16/20 Iteration 2767/3560 Training loss: 2.0739 0.0512 sec/batch\n", + "Epoch 16/20 Iteration 2768/3560 Training loss: 2.0737 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2769/3560 Training loss: 2.0733 0.0506 sec/batch\n", + "Epoch 16/20 Iteration 2770/3560 Training loss: 2.0730 0.0508 sec/batch\n", + "Epoch 16/20 Iteration 2771/3560 Training loss: 2.0731 0.0494 sec/batch\n", + "Epoch 16/20 Iteration 2772/3560 Training loss: 2.0731 0.0547 sec/batch\n", + "Epoch 16/20 Iteration 2773/3560 Training loss: 2.0728 0.0505 sec/batch\n", + "Epoch 16/20 Iteration 2774/3560 Training loss: 2.0727 0.0494 sec/batch\n", + "Epoch 16/20 Iteration 2775/3560 Training loss: 2.0724 0.0531 sec/batch\n", + "Epoch 16/20 Iteration 2776/3560 Training loss: 2.0723 0.0571 sec/batch\n", + "Epoch 16/20 Iteration 2777/3560 Training loss: 2.0723 0.0509 sec/batch\n", + "Epoch 16/20 Iteration 2778/3560 Training loss: 2.0722 0.0557 sec/batch\n", + "Epoch 16/20 Iteration 2779/3560 Training loss: 2.0723 0.0506 sec/batch\n", + "Epoch 16/20 Iteration 2780/3560 Training loss: 2.0722 0.0507 sec/batch\n", + "Epoch 16/20 Iteration 2781/3560 Training loss: 2.0722 0.0548 sec/batch\n", + "Epoch 16/20 Iteration 2782/3560 Training loss: 2.0722 0.0532 sec/batch\n", + "Epoch 16/20 Iteration 2783/3560 Training loss: 2.0721 0.0509 sec/batch\n", + "Epoch 16/20 Iteration 2784/3560 Training loss: 2.0719 0.0497 sec/batch\n", + "Epoch 16/20 Iteration 2785/3560 Training loss: 2.0718 0.0514 sec/batch\n", + "Epoch 16/20 Iteration 2786/3560 Training loss: 2.0714 0.0553 sec/batch\n", + "Epoch 16/20 Iteration 2787/3560 Training loss: 2.0714 0.0503 sec/batch\n", + "Epoch 16/20 Iteration 2788/3560 Training loss: 2.0713 0.0553 sec/batch\n", + "Epoch 16/20 Iteration 2789/3560 Training loss: 2.0714 0.0532 sec/batch\n", + "Epoch 16/20 Iteration 2790/3560 Training loss: 2.0714 0.0556 sec/batch\n", + "Epoch 16/20 Iteration 2791/3560 Training loss: 2.0715 0.0505 
sec/batch\n", + "Epoch 16/20 Iteration 2792/3560 Training loss: 2.0713 0.0501 sec/batch\n", + "Epoch 16/20 Iteration 2793/3560 Training loss: 2.0711 0.0535 sec/batch\n", + "Epoch 16/20 Iteration 2794/3560 Training loss: 2.0713 0.0502 sec/batch\n", + "Epoch 16/20 Iteration 2795/3560 Training loss: 2.0713 0.0589 sec/batch\n", + "Epoch 16/20 Iteration 2796/3560 Training loss: 2.0710 0.0585 sec/batch\n", + "Epoch 16/20 Iteration 2797/3560 Training loss: 2.0711 0.0535 sec/batch\n", + "Epoch 16/20 Iteration 2798/3560 Training loss: 2.0711 0.0543 sec/batch\n", + "Epoch 16/20 Iteration 2799/3560 Training loss: 2.0711 0.0542 sec/batch\n", + "Epoch 16/20 Iteration 2800/3560 Training loss: 2.0712 0.0501 sec/batch\n", + "Epoch 16/20 Iteration 2801/3560 Training loss: 2.0710 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2802/3560 Training loss: 2.0707 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2803/3560 Training loss: 2.0708 0.0535 sec/batch\n", + "Epoch 16/20 Iteration 2804/3560 Training loss: 2.0708 0.0537 sec/batch\n", + "Epoch 16/20 Iteration 2805/3560 Training loss: 2.0707 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2806/3560 Training loss: 2.0708 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2807/3560 Training loss: 2.0709 0.0519 sec/batch\n", + "Epoch 16/20 Iteration 2808/3560 Training loss: 2.0710 0.0500 sec/batch\n", + "Epoch 16/20 Iteration 2809/3560 Training loss: 2.0712 0.0489 sec/batch\n", + "Epoch 16/20 Iteration 2810/3560 Training loss: 2.0711 0.0514 sec/batch\n", + "Epoch 16/20 Iteration 2811/3560 Training loss: 2.0713 0.0508 sec/batch\n", + "Epoch 16/20 Iteration 2812/3560 Training loss: 2.0712 0.0608 sec/batch\n", + "Epoch 16/20 Iteration 2813/3560 Training loss: 2.0713 0.0534 sec/batch\n", + "Epoch 16/20 Iteration 2814/3560 Training loss: 2.0712 0.0540 sec/batch\n", + "Epoch 16/20 Iteration 2815/3560 Training loss: 2.0711 0.0502 sec/batch\n", + "Epoch 16/20 Iteration 2816/3560 Training loss: 2.0713 0.0555 sec/batch\n", + "Epoch 16/20 Iteration 2817/3560 
Training loss: 2.0712 0.0532 sec/batch\n", + "Epoch 16/20 Iteration 2818/3560 Training loss: 2.0714 0.0539 sec/batch\n", + "Epoch 16/20 Iteration 2819/3560 Training loss: 2.0714 0.0508 sec/batch\n", + "Epoch 16/20 Iteration 2820/3560 Training loss: 2.0713 0.0563 sec/batch\n", + "Epoch 16/20 Iteration 2821/3560 Training loss: 2.0711 0.0534 sec/batch\n", + "Epoch 16/20 Iteration 2822/3560 Training loss: 2.0713 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2823/3560 Training loss: 2.0713 0.0517 sec/batch\n", + "Epoch 16/20 Iteration 2824/3560 Training loss: 2.0714 0.0500 sec/batch\n", + "Epoch 16/20 Iteration 2825/3560 Training loss: 2.0714 0.0503 sec/batch\n", + "Epoch 16/20 Iteration 2826/3560 Training loss: 2.0714 0.0502 sec/batch\n", + "Epoch 16/20 Iteration 2827/3560 Training loss: 2.0713 0.0514 sec/batch\n", + "Epoch 16/20 Iteration 2828/3560 Training loss: 2.0713 0.0535 sec/batch\n", + "Epoch 16/20 Iteration 2829/3560 Training loss: 2.0711 0.0564 sec/batch\n", + "Epoch 16/20 Iteration 2830/3560 Training loss: 2.0713 0.0509 sec/batch\n", + "Epoch 16/20 Iteration 2831/3560 Training loss: 2.0713 0.0599 sec/batch\n", + "Epoch 16/20 Iteration 2832/3560 Training loss: 2.0713 0.0557 sec/batch\n", + "Epoch 16/20 Iteration 2833/3560 Training loss: 2.0713 0.0547 sec/batch\n", + "Epoch 16/20 Iteration 2834/3560 Training loss: 2.0713 0.0562 sec/batch\n", + "Epoch 16/20 Iteration 2835/3560 Training loss: 2.0714 0.0534 sec/batch\n", + "Epoch 16/20 Iteration 2836/3560 Training loss: 2.0714 0.0546 sec/batch\n", + "Epoch 16/20 Iteration 2837/3560 Training loss: 2.0713 0.0603 sec/batch\n", + "Epoch 16/20 Iteration 2838/3560 Training loss: 2.0716 0.0500 sec/batch\n", + "Epoch 16/20 Iteration 2839/3560 Training loss: 2.0716 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2840/3560 Training loss: 2.0715 0.0568 sec/batch\n", + "Epoch 16/20 Iteration 2841/3560 Training loss: 2.0714 0.0541 sec/batch\n", + "Epoch 16/20 Iteration 2842/3560 Training loss: 2.0713 0.0503 sec/batch\n", + 
"Epoch 16/20 Iteration 2843/3560 Training loss: 2.0714 0.0527 sec/batch\n", + "Epoch 16/20 Iteration 2844/3560 Training loss: 2.0714 0.0506 sec/batch\n", + "Epoch 16/20 Iteration 2845/3560 Training loss: 2.0715 0.0555 sec/batch\n", + "Epoch 16/20 Iteration 2846/3560 Training loss: 2.0715 0.0531 sec/batch\n", + "Epoch 16/20 Iteration 2847/3560 Training loss: 2.0714 0.0620 sec/batch\n", + "Epoch 16/20 Iteration 2848/3560 Training loss: 2.0714 0.0511 sec/batch\n", + "Epoch 17/20 Iteration 2849/3560 Training loss: 2.1316 0.0554 sec/batch\n", + "Epoch 17/20 Iteration 2850/3560 Training loss: 2.0920 0.0520 sec/batch\n", + "Epoch 17/20 Iteration 2851/3560 Training loss: 2.0784 0.0565 sec/batch\n", + "Epoch 17/20 Iteration 2852/3560 Training loss: 2.0722 0.0524 sec/batch\n", + "Epoch 17/20 Iteration 2853/3560 Training loss: 2.0685 0.0508 sec/batch\n", + "Epoch 17/20 Iteration 2854/3560 Training loss: 2.0609 0.0525 sec/batch\n", + "Epoch 17/20 Iteration 2855/3560 Training loss: 2.0617 0.0522 sec/batch\n", + "Epoch 17/20 Iteration 2856/3560 Training loss: 2.0616 0.0512 sec/batch\n", + "Epoch 17/20 Iteration 2857/3560 Training loss: 2.0639 0.0578 sec/batch\n", + "Epoch 17/20 Iteration 2858/3560 Training loss: 2.0643 0.0557 sec/batch\n", + "Epoch 17/20 Iteration 2859/3560 Training loss: 2.0617 0.0509 sec/batch\n", + "Epoch 17/20 Iteration 2860/3560 Training loss: 2.0597 0.0522 sec/batch\n", + "Epoch 17/20 Iteration 2861/3560 Training loss: 2.0606 0.0499 sec/batch\n", + "Epoch 17/20 Iteration 2862/3560 Training loss: 2.0632 0.0506 sec/batch\n", + "Epoch 17/20 Iteration 2863/3560 Training loss: 2.0626 0.0509 sec/batch\n", + "Epoch 17/20 Iteration 2864/3560 Training loss: 2.0616 0.0546 sec/batch\n", + "Epoch 17/20 Iteration 2865/3560 Training loss: 2.0618 0.0497 sec/batch\n", + "Epoch 17/20 Iteration 2866/3560 Training loss: 2.0643 0.0511 sec/batch\n", + "Epoch 17/20 Iteration 2867/3560 Training loss: 2.0648 0.0569 sec/batch\n", + "Epoch 17/20 Iteration 2868/3560 Training loss: 
2.0648 0.0520 sec/batch\n", + "Epoch 17/20 Iteration 2869/3560 Training loss: 2.0641 0.0549 sec/batch\n", + "Epoch 17/20 Iteration 2870/3560 Training loss: 2.0647 0.0500 sec/batch\n", + "Epoch 17/20 Iteration 2871/3560 Training loss: 2.0644 0.0550 sec/batch\n", + "Epoch 17/20 Iteration 2872/3560 Training loss: 2.0638 0.0493 sec/batch\n", + "Epoch 17/20 Iteration 2873/3560 Training loss: 2.0637 0.0549 sec/batch\n", + "Epoch 17/20 Iteration 2874/3560 Training loss: 2.0627 0.0516 sec/batch\n", + "Epoch 17/20 Iteration 2875/3560 Training loss: 2.0622 0.0543 sec/batch\n", + "Epoch 17/20 Iteration 2876/3560 Training loss: 2.0623 0.0539 sec/batch\n", + "Epoch 17/20 Iteration 2877/3560 Training loss: 2.0631 0.0546 sec/batch\n", + "Epoch 17/20 Iteration 2878/3560 Training loss: 2.0633 0.0513 sec/batch\n", + "Epoch 17/20 Iteration 2879/3560 Training loss: 2.0631 0.0520 sec/batch\n", + "Epoch 17/20 Iteration 2880/3560 Training loss: 2.0627 0.0573 sec/batch\n", + "Epoch 17/20 Iteration 2881/3560 Training loss: 2.0626 0.0580 sec/batch\n", + "Epoch 17/20 Iteration 2882/3560 Training loss: 2.0633 0.0528 sec/batch\n", + "Epoch 17/20 Iteration 2883/3560 Training loss: 2.0631 0.0523 sec/batch\n", + "Epoch 17/20 Iteration 2884/3560 Training loss: 2.0630 0.0548 sec/batch\n", + "Epoch 17/20 Iteration 2885/3560 Training loss: 2.0627 0.0519 sec/batch\n", + "Epoch 17/20 Iteration 2886/3560 Training loss: 2.0619 0.0523 sec/batch\n", + "Epoch 17/20 Iteration 2887/3560 Training loss: 2.0610 0.0534 sec/batch\n", + "Epoch 17/20 Iteration 2888/3560 Training loss: 2.0602 0.0536 sec/batch\n", + "Epoch 17/20 Iteration 2889/3560 Training loss: 2.0599 0.0515 sec/batch\n", + "Epoch 17/20 Iteration 2890/3560 Training loss: 2.0601 0.0500 sec/batch\n", + "Epoch 17/20 Iteration 2891/3560 Training loss: 2.0598 0.0500 sec/batch\n", + "Epoch 17/20 Iteration 2892/3560 Training loss: 2.0592 0.0515 sec/batch\n", + "Epoch 17/20 Iteration 2893/3560 Training loss: 2.0594 0.0583 sec/batch\n", + "Epoch 17/20 
Iteration 2894/3560 Training loss: 2.0581 0.0529 sec/batch\n", + "Epoch 17/20 Iteration 2895/3560 Training loss: 2.0580 0.0539 sec/batch\n", + "Epoch 17/20 Iteration 2896/3560 Training loss: 2.0576 0.0505 sec/batch\n", + "Epoch 17/20 Iteration 2897/3560 Training loss: 2.0575 0.0517 sec/batch\n", + "Epoch 17/20 Iteration 2898/3560 Training loss: 2.0582 0.0504 sec/batch\n", + "Epoch 17/20 Iteration 2899/3560 Training loss: 2.0576 0.0536 sec/batch\n", + "Epoch 17/20 Iteration 2900/3560 Training loss: 2.0583 0.0653 sec/batch\n", + "Epoch 17/20 Iteration 2901/3560 Training loss: 2.0581 0.0557 sec/batch\n", + "Epoch 17/20 Iteration 2902/3560 Training loss: 2.0581 0.0580 sec/batch\n", + "Epoch 17/20 Iteration 2903/3560 Training loss: 2.0580 0.0537 sec/batch\n", + "Epoch 17/20 Iteration 2904/3560 Training loss: 2.0580 0.0523 sec/batch\n", + "Epoch 17/20 Iteration 2905/3560 Training loss: 2.0581 0.0514 sec/batch\n", + "Epoch 17/20 Iteration 2906/3560 Training loss: 2.0579 0.0543 sec/batch\n", + "Epoch 17/20 Iteration 2907/3560 Training loss: 2.0576 0.0539 sec/batch\n", + "Epoch 17/20 Iteration 2908/3560 Training loss: 2.0580 0.0516 sec/batch\n", + "Epoch 17/20 Iteration 2909/3560 Training loss: 2.0578 0.0517 sec/batch\n", + "Epoch 17/20 Iteration 2910/3560 Training loss: 2.0583 0.0601 sec/batch\n", + "Epoch 17/20 Iteration 2911/3560 Training loss: 2.0588 0.0520 sec/batch\n", + "Epoch 17/20 Iteration 2912/3560 Training loss: 2.0589 0.0590 sec/batch\n", + "Epoch 17/20 Iteration 2913/3560 Training loss: 2.0588 0.0506 sec/batch\n", + "Epoch 17/20 Iteration 2914/3560 Training loss: 2.0591 0.0555 sec/batch\n", + "Epoch 17/20 Iteration 2915/3560 Training loss: 2.0592 0.0512 sec/batch\n", + "Epoch 17/20 Iteration 2916/3560 Training loss: 2.0587 0.0506 sec/batch\n", + "Epoch 17/20 Iteration 2917/3560 Training loss: 2.0585 0.0510 sec/batch\n", + "Epoch 17/20 Iteration 2918/3560 Training loss: 2.0584 0.0513 sec/batch\n", + "Epoch 17/20 Iteration 2919/3560 Training loss: 2.0587 0.0515 
sec/batch\n", + "Epoch 17/20 Iteration 2920/3560 Training loss: 2.0588 0.0497 sec/batch\n", + "Epoch 17/20 Iteration 2921/3560 Training loss: 2.0590 0.0509 sec/batch\n", + "Epoch 17/20 Iteration 2922/3560 Training loss: 2.0586 0.0521 sec/batch\n", + "Epoch 17/20 Iteration 2923/3560 Training loss: 2.0585 0.0569 sec/batch\n", + "Epoch 17/20 Iteration 2924/3560 Training loss: 2.0590 0.0528 sec/batch\n", + "Epoch 17/20 Iteration 2925/3560 Training loss: 2.0590 0.0537 sec/batch\n", + "Epoch 17/20 Iteration 2926/3560 Training loss: 2.0589 0.0606 sec/batch\n", + "Epoch 17/20 Iteration 2927/3560 Training loss: 2.0586 0.0542 sec/batch\n", + "Epoch 17/20 Iteration 2928/3560 Training loss: 2.0584 0.0503 sec/batch\n", + "Epoch 17/20 Iteration 2929/3560 Training loss: 2.0580 0.0512 sec/batch\n", + "Epoch 17/20 Iteration 2930/3560 Training loss: 2.0581 0.0512 sec/batch\n", + "Epoch 17/20 Iteration 2931/3560 Training loss: 2.0576 0.0525 sec/batch\n", + "Epoch 17/20 Iteration 2932/3560 Training loss: 2.0575 0.0533 sec/batch\n", + "Epoch 17/20 Iteration 2933/3560 Training loss: 2.0570 0.0548 sec/batch\n", + "Epoch 17/20 Iteration 2934/3560 Training loss: 2.0568 0.0560 sec/batch\n", + "Epoch 17/20 Iteration 2935/3560 Training loss: 2.0567 0.0537 sec/batch\n", + "Epoch 17/20 Iteration 2936/3560 Training loss: 2.0566 0.0543 sec/batch\n", + "Epoch 17/20 Iteration 2937/3560 Training loss: 2.0562 0.0491 sec/batch\n", + "Epoch 17/20 Iteration 2938/3560 Training loss: 2.0562 0.0512 sec/batch\n", + "Epoch 17/20 Iteration 2939/3560 Training loss: 2.0560 0.0571 sec/batch\n", + "Epoch 17/20 Iteration 2940/3560 Training loss: 2.0560 0.0547 sec/batch\n", + "Epoch 17/20 Iteration 2941/3560 Training loss: 2.0555 0.0513 sec/batch\n", + "Epoch 17/20 Iteration 2942/3560 Training loss: 2.0554 0.0520 sec/batch\n", + "Epoch 17/20 Iteration 2943/3560 Training loss: 2.0551 0.0520 sec/batch\n", + "Epoch 17/20 Iteration 2944/3560 Training loss: 2.0550 0.0545 sec/batch\n", + "Epoch 17/20 Iteration 2945/3560 
Training loss: 2.0549 0.0528 sec/batch\n", + "Epoch 17/20 Iteration 2946/3560 Training loss: 2.0547 0.0513 sec/batch\n", + "Epoch 17/20 Iteration 2947/3560 Training loss: 2.0543 0.0549 sec/batch\n", + "Epoch 17/20 Iteration 2948/3560 Training loss: 2.0540 0.0563 sec/batch\n", + "Epoch 17/20 Iteration 2949/3560 Training loss: 2.0542 0.0512 sec/batch\n", + "Epoch 17/20 Iteration 2950/3560 Training loss: 2.0541 0.0517 sec/batch\n", + "Epoch 17/20 Iteration 2951/3560 Training loss: 2.0539 0.0540 sec/batch\n", + "Epoch 17/20 Iteration 2952/3560 Training loss: 2.0538 0.0520 sec/batch\n", + "Epoch 17/20 Iteration 2953/3560 Training loss: 2.0535 0.0524 sec/batch\n", + "Epoch 17/20 Iteration 2954/3560 Training loss: 2.0535 0.0529 sec/batch\n", + "Epoch 17/20 Iteration 2955/3560 Training loss: 2.0534 0.0517 sec/batch\n", + "Epoch 17/20 Iteration 2956/3560 Training loss: 2.0532 0.0548 sec/batch\n", + "Epoch 17/20 Iteration 2957/3560 Training loss: 2.0532 0.0527 sec/batch\n", + "Epoch 17/20 Iteration 2958/3560 Training loss: 2.0532 0.0550 sec/batch\n", + "Epoch 17/20 Iteration 2959/3560 Training loss: 2.0532 0.0496 sec/batch\n", + "Epoch 17/20 Iteration 2960/3560 Training loss: 2.0532 0.0511 sec/batch\n", + "Epoch 17/20 Iteration 2961/3560 Training loss: 2.0530 0.0544 sec/batch\n", + "Epoch 17/20 Iteration 2962/3560 Training loss: 2.0530 0.0513 sec/batch\n", + "Epoch 17/20 Iteration 2963/3560 Training loss: 2.0527 0.0531 sec/batch\n", + "Epoch 17/20 Iteration 2964/3560 Training loss: 2.0524 0.0530 sec/batch\n", + "Epoch 17/20 Iteration 2965/3560 Training loss: 2.0524 0.0522 sec/batch\n", + "Epoch 17/20 Iteration 2966/3560 Training loss: 2.0523 0.0541 sec/batch\n", + "Epoch 17/20 Iteration 2967/3560 Training loss: 2.0523 0.0543 sec/batch\n", + "Epoch 17/20 Iteration 2968/3560 Training loss: 2.0523 0.0575 sec/batch\n", + "Epoch 17/20 Iteration 2969/3560 Training loss: 2.0524 0.0517 sec/batch\n", + "Epoch 17/20 Iteration 2970/3560 Training loss: 2.0522 0.0579 sec/batch\n", + 
"Epoch 17/20 Iteration 2971/3560 Training loss: 2.0520 0.0520 sec/batch\n", + "Epoch 17/20 Iteration 2972/3560 Training loss: 2.0522 0.0553 sec/batch\n", + "Epoch 17/20 Iteration 2973/3560 Training loss: 2.0522 0.0503 sec/batch\n", + "Epoch 17/20 Iteration 2974/3560 Training loss: 2.0519 0.0551 sec/batch\n", + "Epoch 17/20 Iteration 2975/3560 Training loss: 2.0521 0.0526 sec/batch\n", + "Epoch 17/20 Iteration 2976/3560 Training loss: 2.0522 0.0544 sec/batch\n", + "Epoch 17/20 Iteration 2977/3560 Training loss: 2.0522 0.0506 sec/batch\n", + "Epoch 17/20 Iteration 2978/3560 Training loss: 2.0523 0.0587 sec/batch\n", + "Epoch 17/20 Iteration 2979/3560 Training loss: 2.0521 0.0667 sec/batch\n", + "Epoch 17/20 Iteration 2980/3560 Training loss: 2.0519 0.0503 sec/batch\n", + "Epoch 17/20 Iteration 2981/3560 Training loss: 2.0520 0.0501 sec/batch\n", + "Epoch 17/20 Iteration 2982/3560 Training loss: 2.0521 0.0495 sec/batch\n", + "Epoch 17/20 Iteration 2983/3560 Training loss: 2.0521 0.0570 sec/batch\n", + "Epoch 17/20 Iteration 2984/3560 Training loss: 2.0522 0.0551 sec/batch\n", + "Epoch 17/20 Iteration 2985/3560 Training loss: 2.0523 0.0528 sec/batch\n", + "Epoch 17/20 Iteration 2986/3560 Training loss: 2.0523 0.0532 sec/batch\n", + "Epoch 17/20 Iteration 2987/3560 Training loss: 2.0525 0.0544 sec/batch\n", + "Epoch 17/20 Iteration 2988/3560 Training loss: 2.0524 0.0540 sec/batch\n", + "Epoch 17/20 Iteration 2989/3560 Training loss: 2.0527 0.0534 sec/batch\n", + "Epoch 17/20 Iteration 2990/3560 Training loss: 2.0526 0.0515 sec/batch\n", + "Epoch 17/20 Iteration 2991/3560 Training loss: 2.0526 0.0557 sec/batch\n", + "Epoch 17/20 Iteration 2992/3560 Training loss: 2.0527 0.0540 sec/batch\n", + "Epoch 17/20 Iteration 2993/3560 Training loss: 2.0527 0.0529 sec/batch\n", + "Epoch 17/20 Iteration 2994/3560 Training loss: 2.0528 0.0551 sec/batch\n", + "Epoch 17/20 Iteration 2995/3560 Training loss: 2.0528 0.0503 sec/batch\n", + "Epoch 17/20 Iteration 2996/3560 Training loss: 
2.0530 0.0502 sec/batch\n", + "Epoch 17/20 Iteration 2997/3560 Training loss: 2.0530 0.0517 sec/batch\n", + "Epoch 17/20 Iteration 2998/3560 Training loss: 2.0529 0.0514 sec/batch\n", + "Epoch 17/20 Iteration 2999/3560 Training loss: 2.0528 0.0540 sec/batch\n", + "Epoch 17/20 Iteration 3000/3560 Training loss: 2.0529 0.0515 sec/batch\n", + "Epoch 17/20 Iteration 3001/3560 Training loss: 2.0530 0.0520 sec/batch\n", + "Epoch 17/20 Iteration 3002/3560 Training loss: 2.0530 0.0556 sec/batch\n", + "Epoch 17/20 Iteration 3003/3560 Training loss: 2.0530 0.0497 sec/batch\n", + "Epoch 17/20 Iteration 3004/3560 Training loss: 2.0530 0.0553 sec/batch\n", + "Epoch 17/20 Iteration 3005/3560 Training loss: 2.0530 0.0523 sec/batch\n", + "Epoch 17/20 Iteration 3006/3560 Training loss: 2.0529 0.0506 sec/batch\n", + "Epoch 17/20 Iteration 3007/3560 Training loss: 2.0527 0.0516 sec/batch\n", + "Epoch 17/20 Iteration 3008/3560 Training loss: 2.0529 0.0579 sec/batch\n", + "Epoch 17/20 Iteration 3009/3560 Training loss: 2.0530 0.0517 sec/batch\n", + "Epoch 17/20 Iteration 3010/3560 Training loss: 2.0530 0.0570 sec/batch\n", + "Epoch 17/20 Iteration 3011/3560 Training loss: 2.0531 0.0506 sec/batch\n", + "Epoch 17/20 Iteration 3012/3560 Training loss: 2.0531 0.0529 sec/batch\n", + "Epoch 17/20 Iteration 3013/3560 Training loss: 2.0531 0.0570 sec/batch\n", + "Epoch 17/20 Iteration 3014/3560 Training loss: 2.0531 0.0585 sec/batch\n", + "Epoch 17/20 Iteration 3015/3560 Training loss: 2.0531 0.0502 sec/batch\n", + "Epoch 17/20 Iteration 3016/3560 Training loss: 2.0534 0.0499 sec/batch\n", + "Epoch 17/20 Iteration 3017/3560 Training loss: 2.0533 0.0489 sec/batch\n", + "Epoch 17/20 Iteration 3018/3560 Training loss: 2.0533 0.0568 sec/batch\n", + "Epoch 17/20 Iteration 3019/3560 Training loss: 2.0533 0.0508 sec/batch\n", + "Epoch 17/20 Iteration 3020/3560 Training loss: 2.0532 0.0500 sec/batch\n", + "Epoch 17/20 Iteration 3021/3560 Training loss: 2.0533 0.0504 sec/batch\n", + "Epoch 17/20 
Iteration 3022/3560 Training loss: 2.0533 0.0569 sec/batch\n", + "Epoch 17/20 Iteration 3023/3560 Training loss: 2.0534 0.0579 sec/batch\n", + "Epoch 17/20 Iteration 3024/3560 Training loss: 2.0534 0.0529 sec/batch\n", + "Epoch 17/20 Iteration 3025/3560 Training loss: 2.0533 0.0518 sec/batch\n", + "Epoch 17/20 Iteration 3026/3560 Training loss: 2.0533 0.0523 sec/batch\n", + "Epoch 18/20 Iteration 3027/3560 Training loss: 2.1187 0.0598 sec/batch\n", + "Epoch 18/20 Iteration 3028/3560 Training loss: 2.0770 0.0501 sec/batch\n", + "Epoch 18/20 Iteration 3029/3560 Training loss: 2.0627 0.0515 sec/batch\n", + "Epoch 18/20 Iteration 3030/3560 Training loss: 2.0532 0.0526 sec/batch\n", + "Epoch 18/20 Iteration 3031/3560 Training loss: 2.0501 0.0538 sec/batch\n", + "Epoch 18/20 Iteration 3032/3560 Training loss: 2.0439 0.0523 sec/batch\n", + "Epoch 18/20 Iteration 3033/3560 Training loss: 2.0447 0.0510 sec/batch\n", + "Epoch 18/20 Iteration 3034/3560 Training loss: 2.0445 0.0528 sec/batch\n", + "Epoch 18/20 Iteration 3035/3560 Training loss: 2.0470 0.0558 sec/batch\n", + "Epoch 18/20 Iteration 3036/3560 Training loss: 2.0463 0.0539 sec/batch\n", + "Epoch 18/20 Iteration 3037/3560 Training loss: 2.0434 0.0496 sec/batch\n", + "Epoch 18/20 Iteration 3038/3560 Training loss: 2.0420 0.0500 sec/batch\n", + "Epoch 18/20 Iteration 3039/3560 Training loss: 2.0426 0.0562 sec/batch\n", + "Epoch 18/20 Iteration 3040/3560 Training loss: 2.0445 0.0575 sec/batch\n", + "Epoch 18/20 Iteration 3041/3560 Training loss: 2.0443 0.0506 sec/batch\n", + "Epoch 18/20 Iteration 3042/3560 Training loss: 2.0432 0.0596 sec/batch\n", + "Epoch 18/20 Iteration 3043/3560 Training loss: 2.0429 0.0523 sec/batch\n", + "Epoch 18/20 Iteration 3044/3560 Training loss: 2.0452 0.0589 sec/batch\n", + "Epoch 18/20 Iteration 3045/3560 Training loss: 2.0456 0.0561 sec/batch\n", + "Epoch 18/20 Iteration 3046/3560 Training loss: 2.0458 0.0517 sec/batch\n", + "Epoch 18/20 Iteration 3047/3560 Training loss: 2.0454 0.0561 
sec/batch\n", + "Epoch 18/20 Iteration 3048/3560 Training loss: 2.0462 0.0533 sec/batch\n", + "Epoch 18/20 Iteration 3049/3560 Training loss: 2.0461 0.0535 sec/batch\n", + "Epoch 18/20 Iteration 3050/3560 Training loss: 2.0456 0.0529 sec/batch\n", + "Epoch 18/20 Iteration 3051/3560 Training loss: 2.0455 0.0556 sec/batch\n", + "Epoch 18/20 Iteration 3052/3560 Training loss: 2.0447 0.0587 sec/batch\n", + "Epoch 18/20 Iteration 3053/3560 Training loss: 2.0440 0.0626 sec/batch\n", + "Epoch 18/20 Iteration 3054/3560 Training loss: 2.0448 0.0640 sec/batch\n", + "Epoch 18/20 Iteration 3055/3560 Training loss: 2.0459 0.0520 sec/batch\n", + "Epoch 18/20 Iteration 3056/3560 Training loss: 2.0467 0.0514 sec/batch\n", + "Epoch 18/20 Iteration 3057/3560 Training loss: 2.0464 0.0554 sec/batch\n", + "Epoch 18/20 Iteration 3058/3560 Training loss: 2.0458 0.0543 sec/batch\n", + "Epoch 18/20 Iteration 3059/3560 Training loss: 2.0459 0.0510 sec/batch\n", + "Epoch 18/20 Iteration 3060/3560 Training loss: 2.0468 0.0525 sec/batch\n", + "Epoch 18/20 Iteration 3061/3560 Training loss: 2.0467 0.0544 sec/batch\n", + "Epoch 18/20 Iteration 3062/3560 Training loss: 2.0466 0.0530 sec/batch\n", + "Epoch 18/20 Iteration 3063/3560 Training loss: 2.0463 0.0513 sec/batch\n", + "Epoch 18/20 Iteration 3064/3560 Training loss: 2.0453 0.0506 sec/batch\n", + "Epoch 18/20 Iteration 3065/3560 Training loss: 2.0446 0.0611 sec/batch\n", + "Epoch 18/20 Iteration 3066/3560 Training loss: 2.0443 0.0614 sec/batch\n", + "Epoch 18/20 Iteration 3067/3560 Training loss: 2.0440 0.0521 sec/batch\n", + "Epoch 18/20 Iteration 3068/3560 Training loss: 2.0441 0.0532 sec/batch\n", + "Epoch 18/20 Iteration 3069/3560 Training loss: 2.0437 0.0531 sec/batch\n", + "Epoch 18/20 Iteration 3070/3560 Training loss: 2.0430 0.0546 sec/batch\n", + "Epoch 18/20 Iteration 3071/3560 Training loss: 2.0429 0.0539 sec/batch\n", + "Epoch 18/20 Iteration 3072/3560 Training loss: 2.0417 0.0536 sec/batch\n", + "Epoch 18/20 Iteration 3073/3560 
Training loss: 2.0416 0.0503 sec/batch\n", + "Epoch 18/20 Iteration 3074/3560 Training loss: 2.0411 0.0562 sec/batch\n", + "Epoch 18/20 Iteration 3075/3560 Training loss: 2.0411 0.0549 sec/batch\n", + "Epoch 18/20 Iteration 3076/3560 Training loss: 2.0418 0.0505 sec/batch\n", + "Epoch 18/20 Iteration 3077/3560 Training loss: 2.0414 0.0543 sec/batch\n", + "Epoch 18/20 Iteration 3078/3560 Training loss: 2.0420 0.0508 sec/batch\n", + "Epoch 18/20 Iteration 3079/3560 Training loss: 2.0416 0.0547 sec/batch\n", + "Epoch 18/20 Iteration 3080/3560 Training loss: 2.0415 0.0517 sec/batch\n", + "Epoch 18/20 Iteration 3081/3560 Training loss: 2.0412 0.0501 sec/batch\n", + "Epoch 18/20 Iteration 3082/3560 Training loss: 2.0413 0.0511 sec/batch\n", + "Epoch 18/20 Iteration 3083/3560 Training loss: 2.0416 0.0619 sec/batch\n", + "Epoch 18/20 Iteration 3084/3560 Training loss: 2.0415 0.0514 sec/batch\n", + "Epoch 18/20 Iteration 3085/3560 Training loss: 2.0411 0.0541 sec/batch\n", + "Epoch 18/20 Iteration 3086/3560 Training loss: 2.0416 0.0507 sec/batch\n", + "Epoch 18/20 Iteration 3087/3560 Training loss: 2.0414 0.0524 sec/batch\n", + "Epoch 18/20 Iteration 3088/3560 Training loss: 2.0420 0.0536 sec/batch\n", + "Epoch 18/20 Iteration 3089/3560 Training loss: 2.0424 0.0506 sec/batch\n", + "Epoch 18/20 Iteration 3090/3560 Training loss: 2.0426 0.0517 sec/batch\n", + "Epoch 18/20 Iteration 3091/3560 Training loss: 2.0424 0.0522 sec/batch\n", + "Epoch 18/20 Iteration 3092/3560 Training loss: 2.0426 0.0497 sec/batch\n", + "Epoch 18/20 Iteration 3093/3560 Training loss: 2.0426 0.0526 sec/batch\n", + "Epoch 18/20 Iteration 3094/3560 Training loss: 2.0420 0.0512 sec/batch\n", + "Epoch 18/20 Iteration 3095/3560 Training loss: 2.0419 0.0532 sec/batch\n", + "Epoch 18/20 Iteration 3096/3560 Training loss: 2.0419 0.0526 sec/batch\n", + "Epoch 18/20 Iteration 3097/3560 Training loss: 2.0421 0.0504 sec/batch\n", + "Epoch 18/20 Iteration 3098/3560 Training loss: 2.0423 0.0512 sec/batch\n", + 
"Epoch 18/20 Iteration 3099/3560 Training loss: 2.0426 0.0518 sec/batch\n", + "Epoch 18/20 Iteration 3100/3560 Training loss: 2.0422 0.0495 sec/batch\n", + "Epoch 18/20 Iteration 3101/3560 Training loss: 2.0423 0.0561 sec/batch\n", + "Epoch 18/20 Iteration 3102/3560 Training loss: 2.0427 0.0515 sec/batch\n", + "Epoch 18/20 Iteration 3103/3560 Training loss: 2.0427 0.0502 sec/batch\n", + "Epoch 18/20 Iteration 3104/3560 Training loss: 2.0428 0.0553 sec/batch\n", + "Epoch 18/20 Iteration 3105/3560 Training loss: 2.0425 0.0530 sec/batch\n", + "Epoch 18/20 Iteration 3106/3560 Training loss: 2.0423 0.0577 sec/batch\n", + "Epoch 18/20 Iteration 3107/3560 Training loss: 2.0419 0.0530 sec/batch\n", + "Epoch 18/20 Iteration 3108/3560 Training loss: 2.0420 0.0568 sec/batch\n", + "Epoch 18/20 Iteration 3109/3560 Training loss: 2.0416 0.0517 sec/batch\n", + "Epoch 18/20 Iteration 3110/3560 Training loss: 2.0414 0.0528 sec/batch\n", + "Epoch 18/20 Iteration 3111/3560 Training loss: 2.0408 0.0498 sec/batch\n", + "Epoch 18/20 Iteration 3112/3560 Training loss: 2.0405 0.0522 sec/batch\n", + "Epoch 18/20 Iteration 3113/3560 Training loss: 2.0404 0.0504 sec/batch\n", + "Epoch 18/20 Iteration 3114/3560 Training loss: 2.0401 0.0579 sec/batch\n", + "Epoch 18/20 Iteration 3115/3560 Training loss: 2.0397 0.0531 sec/batch\n", + "Epoch 18/20 Iteration 3116/3560 Training loss: 2.0398 0.0552 sec/batch\n", + "Epoch 18/20 Iteration 3117/3560 Training loss: 2.0396 0.0570 sec/batch\n", + "Epoch 18/20 Iteration 3118/3560 Training loss: 2.0394 0.0622 sec/batch\n", + "Epoch 18/20 Iteration 3119/3560 Training loss: 2.0391 0.0560 sec/batch\n", + "Epoch 18/20 Iteration 3120/3560 Training loss: 2.0389 0.0500 sec/batch\n", + "Epoch 18/20 Iteration 3121/3560 Training loss: 2.0387 0.0498 sec/batch\n", + "Epoch 18/20 Iteration 3122/3560 Training loss: 2.0386 0.0544 sec/batch\n", + "Epoch 18/20 Iteration 3123/3560 Training loss: 2.0385 0.0502 sec/batch\n", + "Epoch 18/20 Iteration 3124/3560 Training loss: 
2.0382 0.0514 sec/batch\n", + "Epoch 18/20 Iteration 3125/3560 Training loss: 2.0378 0.0534 sec/batch\n", + "Epoch 18/20 Iteration 3126/3560 Training loss: 2.0375 0.0520 sec/batch\n", + "Epoch 18/20 Iteration 3127/3560 Training loss: 2.0376 0.0547 sec/batch\n", + "Epoch 18/20 Iteration 3128/3560 Training loss: 2.0375 0.0516 sec/batch\n", + "Epoch 18/20 Iteration 3129/3560 Training loss: 2.0372 0.0527 sec/batch\n", + "Epoch 18/20 Iteration 3130/3560 Training loss: 2.0371 0.0555 sec/batch\n", + "Epoch 18/20 Iteration 3131/3560 Training loss: 2.0369 0.0570 sec/batch\n", + "Epoch 18/20 Iteration 3132/3560 Training loss: 2.0368 0.0576 sec/batch\n", + "Epoch 18/20 Iteration 3133/3560 Training loss: 2.0367 0.0532 sec/batch\n", + "Epoch 18/20 Iteration 3134/3560 Training loss: 2.0366 0.0499 sec/batch\n", + "Epoch 18/20 Iteration 3135/3560 Training loss: 2.0366 0.0525 sec/batch\n", + "Epoch 18/20 Iteration 3136/3560 Training loss: 2.0365 0.0541 sec/batch\n", + "Epoch 18/20 Iteration 3137/3560 Training loss: 2.0365 0.0542 sec/batch\n", + "Epoch 18/20 Iteration 3138/3560 Training loss: 2.0365 0.0531 sec/batch\n", + "Epoch 18/20 Iteration 3139/3560 Training loss: 2.0364 0.0562 sec/batch\n", + "Epoch 18/20 Iteration 3140/3560 Training loss: 2.0363 0.0530 sec/batch\n", + "Epoch 18/20 Iteration 3141/3560 Training loss: 2.0361 0.0613 sec/batch\n", + "Epoch 18/20 Iteration 3142/3560 Training loss: 2.0357 0.0514 sec/batch\n", + "Epoch 18/20 Iteration 3143/3560 Training loss: 2.0356 0.0496 sec/batch\n", + "Epoch 18/20 Iteration 3144/3560 Training loss: 2.0357 0.0516 sec/batch\n", + "Epoch 18/20 Iteration 3145/3560 Training loss: 2.0358 0.0639 sec/batch\n", + "Epoch 18/20 Iteration 3146/3560 Training loss: 2.0358 0.0501 sec/batch\n", + "Epoch 18/20 Iteration 3147/3560 Training loss: 2.0359 0.0510 sec/batch\n", + "Epoch 18/20 Iteration 3148/3560 Training loss: 2.0357 0.0595 sec/batch\n", + "Epoch 18/20 Iteration 3149/3560 Training loss: 2.0355 0.0576 sec/batch\n", + "Epoch 18/20 
Iteration 3150/3560 Training loss: 2.0357 0.0589 sec/batch\n", + "Epoch 18/20 Iteration 3151/3560 Training loss: 2.0356 0.0556 sec/batch\n", + "Epoch 18/20 Iteration 3152/3560 Training loss: 2.0353 0.0531 sec/batch\n", + "Epoch 18/20 Iteration 3153/3560 Training loss: 2.0354 0.0545 sec/batch\n", + "Epoch 18/20 Iteration 3154/3560 Training loss: 2.0355 0.0544 sec/batch\n", + "Epoch 18/20 Iteration 3155/3560 Training loss: 2.0355 0.0525 sec/batch\n", + "Epoch 18/20 Iteration 3156/3560 Training loss: 2.0355 0.0507 sec/batch\n", + "Epoch 18/20 Iteration 3157/3560 Training loss: 2.0354 0.0528 sec/batch\n", + "Epoch 18/20 Iteration 3158/3560 Training loss: 2.0351 0.0547 sec/batch\n", + "Epoch 18/20 Iteration 3159/3560 Training loss: 2.0352 0.0538 sec/batch\n", + "Epoch 18/20 Iteration 3160/3560 Training loss: 2.0353 0.0516 sec/batch\n", + "Epoch 18/20 Iteration 3161/3560 Training loss: 2.0353 0.0513 sec/batch\n", + "Epoch 18/20 Iteration 3162/3560 Training loss: 2.0354 0.0529 sec/batch\n", + "Epoch 18/20 Iteration 3163/3560 Training loss: 2.0355 0.0499 sec/batch\n", + "Epoch 18/20 Iteration 3164/3560 Training loss: 2.0355 0.0500 sec/batch\n", + "Epoch 18/20 Iteration 3165/3560 Training loss: 2.0357 0.0507 sec/batch\n", + "Epoch 18/20 Iteration 3166/3560 Training loss: 2.0357 0.0521 sec/batch\n", + "Epoch 18/20 Iteration 3167/3560 Training loss: 2.0359 0.0512 sec/batch\n", + "Epoch 18/20 Iteration 3168/3560 Training loss: 2.0358 0.0518 sec/batch\n", + "Epoch 18/20 Iteration 3169/3560 Training loss: 2.0357 0.0518 sec/batch\n", + "Epoch 18/20 Iteration 3170/3560 Training loss: 2.0358 0.0516 sec/batch\n", + "Epoch 18/20 Iteration 3171/3560 Training loss: 2.0357 0.0514 sec/batch\n", + "Epoch 18/20 Iteration 3172/3560 Training loss: 2.0358 0.0502 sec/batch\n", + "Epoch 18/20 Iteration 3173/3560 Training loss: 2.0358 0.0494 sec/batch\n", + "Epoch 18/20 Iteration 3174/3560 Training loss: 2.0360 0.0518 sec/batch\n", + "Epoch 18/20 Iteration 3175/3560 Training loss: 2.0360 0.0520 
sec/batch\n", + "Epoch 18/20 Iteration 3176/3560 Training loss: 2.0358 0.0541 sec/batch\n", + "Epoch 18/20 Iteration 3177/3560 Training loss: 2.0357 0.0502 sec/batch\n", + "Epoch 18/20 Iteration 3178/3560 Training loss: 2.0358 0.0525 sec/batch\n", + "Epoch 18/20 Iteration 3179/3560 Training loss: 2.0359 0.0550 sec/batch\n", + "Epoch 18/20 Iteration 3180/3560 Training loss: 2.0359 0.0570 sec/batch\n", + "Epoch 18/20 Iteration 3181/3560 Training loss: 2.0359 0.0514 sec/batch\n", + "Epoch 18/20 Iteration 3182/3560 Training loss: 2.0358 0.0546 sec/batch\n", + "Epoch 18/20 Iteration 3183/3560 Training loss: 2.0357 0.0499 sec/batch\n", + "Epoch 18/20 Iteration 3184/3560 Training loss: 2.0357 0.0532 sec/batch\n", + "Epoch 18/20 Iteration 3185/3560 Training loss: 2.0356 0.0538 sec/batch\n", + "Epoch 18/20 Iteration 3186/3560 Training loss: 2.0357 0.0508 sec/batch\n", + "Epoch 18/20 Iteration 3187/3560 Training loss: 2.0358 0.0497 sec/batch\n", + "Epoch 18/20 Iteration 3188/3560 Training loss: 2.0358 0.0516 sec/batch\n", + "Epoch 18/20 Iteration 3189/3560 Training loss: 2.0358 0.0587 sec/batch\n", + "Epoch 18/20 Iteration 3190/3560 Training loss: 2.0358 0.0565 sec/batch\n", + "Epoch 18/20 Iteration 3191/3560 Training loss: 2.0359 0.0534 sec/batch\n", + "Epoch 18/20 Iteration 3192/3560 Training loss: 2.0359 0.0531 sec/batch\n", + "Epoch 18/20 Iteration 3193/3560 Training loss: 2.0358 0.0523 sec/batch\n", + "Epoch 18/20 Iteration 3194/3560 Training loss: 2.0361 0.0565 sec/batch\n", + "Epoch 18/20 Iteration 3195/3560 Training loss: 2.0361 0.0529 sec/batch\n", + "Epoch 18/20 Iteration 3196/3560 Training loss: 2.0360 0.0576 sec/batch\n", + "Epoch 18/20 Iteration 3197/3560 Training loss: 2.0360 0.0530 sec/batch\n", + "Epoch 18/20 Iteration 3198/3560 Training loss: 2.0359 0.0565 sec/batch\n", + "Epoch 18/20 Iteration 3199/3560 Training loss: 2.0360 0.0512 sec/batch\n", + "Epoch 18/20 Iteration 3200/3560 Training loss: 2.0360 0.0508 sec/batch\n", + "Epoch 18/20 Iteration 3201/3560 
Training loss: 2.0361 0.0534 sec/batch\n", + "Epoch 18/20 Iteration 3202/3560 Training loss: 2.0361 0.0525 sec/batch\n", + "Epoch 18/20 Iteration 3203/3560 Training loss: 2.0360 0.0542 sec/batch\n", + "Epoch 18/20 Iteration 3204/3560 Training loss: 2.0359 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3205/3560 Training loss: 2.0962 0.0495 sec/batch\n", + "Epoch 19/20 Iteration 3206/3560 Training loss: 2.0577 0.0510 sec/batch\n", + "Epoch 19/20 Iteration 3207/3560 Training loss: 2.0449 0.0584 sec/batch\n", + "Epoch 19/20 Iteration 3208/3560 Training loss: 2.0368 0.0509 sec/batch\n", + "Epoch 19/20 Iteration 3209/3560 Training loss: 2.0335 0.0505 sec/batch\n", + "Epoch 19/20 Iteration 3210/3560 Training loss: 2.0274 0.0511 sec/batch\n", + "Epoch 19/20 Iteration 3211/3560 Training loss: 2.0283 0.0578 sec/batch\n", + "Epoch 19/20 Iteration 3212/3560 Training loss: 2.0286 0.0552 sec/batch\n", + "Epoch 19/20 Iteration 3213/3560 Training loss: 2.0305 0.0520 sec/batch\n", + "Epoch 19/20 Iteration 3214/3560 Training loss: 2.0309 0.0511 sec/batch\n", + "Epoch 19/20 Iteration 3215/3560 Training loss: 2.0284 0.0499 sec/batch\n", + "Epoch 19/20 Iteration 3216/3560 Training loss: 2.0270 0.0552 sec/batch\n", + "Epoch 19/20 Iteration 3217/3560 Training loss: 2.0277 0.0497 sec/batch\n", + "Epoch 19/20 Iteration 3218/3560 Training loss: 2.0300 0.0545 sec/batch\n", + "Epoch 19/20 Iteration 3219/3560 Training loss: 2.0304 0.0570 sec/batch\n", + "Epoch 19/20 Iteration 3220/3560 Training loss: 2.0295 0.0738 sec/batch\n", + "Epoch 19/20 Iteration 3221/3560 Training loss: 2.0294 0.0501 sec/batch\n", + "Epoch 19/20 Iteration 3222/3560 Training loss: 2.0315 0.0531 sec/batch\n", + "Epoch 19/20 Iteration 3223/3560 Training loss: 2.0314 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3224/3560 Training loss: 2.0316 0.0550 sec/batch\n", + "Epoch 19/20 Iteration 3225/3560 Training loss: 2.0313 0.0540 sec/batch\n", + "Epoch 19/20 Iteration 3226/3560 Training loss: 2.0319 0.0566 sec/batch\n", + 
"Epoch 19/20 Iteration 3227/3560 Training loss: 2.0317 0.0534 sec/batch\n", + "Epoch 19/20 Iteration 3228/3560 Training loss: 2.0310 0.0617 sec/batch\n", + "Epoch 19/20 Iteration 3229/3560 Training loss: 2.0306 0.0515 sec/batch\n", + "Epoch 19/20 Iteration 3230/3560 Training loss: 2.0298 0.0606 sec/batch\n", + "Epoch 19/20 Iteration 3231/3560 Training loss: 2.0292 0.0542 sec/batch\n", + "Epoch 19/20 Iteration 3232/3560 Training loss: 2.0296 0.0585 sec/batch\n", + "Epoch 19/20 Iteration 3233/3560 Training loss: 2.0304 0.0602 sec/batch\n", + "Epoch 19/20 Iteration 3234/3560 Training loss: 2.0307 0.0508 sec/batch\n", + "Epoch 19/20 Iteration 3235/3560 Training loss: 2.0305 0.0517 sec/batch\n", + "Epoch 19/20 Iteration 3236/3560 Training loss: 2.0298 0.0539 sec/batch\n", + "Epoch 19/20 Iteration 3237/3560 Training loss: 2.0298 0.0541 sec/batch\n", + "Epoch 19/20 Iteration 3238/3560 Training loss: 2.0305 0.0518 sec/batch\n", + "Epoch 19/20 Iteration 3239/3560 Training loss: 2.0302 0.0510 sec/batch\n", + "Epoch 19/20 Iteration 3240/3560 Training loss: 2.0301 0.0579 sec/batch\n", + "Epoch 19/20 Iteration 3241/3560 Training loss: 2.0298 0.0525 sec/batch\n", + "Epoch 19/20 Iteration 3242/3560 Training loss: 2.0287 0.0522 sec/batch\n", + "Epoch 19/20 Iteration 3243/3560 Training loss: 2.0277 0.0509 sec/batch\n", + "Epoch 19/20 Iteration 3244/3560 Training loss: 2.0273 0.0499 sec/batch\n", + "Epoch 19/20 Iteration 3245/3560 Training loss: 2.0269 0.0518 sec/batch\n", + "Epoch 19/20 Iteration 3246/3560 Training loss: 2.0272 0.0543 sec/batch\n", + "Epoch 19/20 Iteration 3247/3560 Training loss: 2.0268 0.0508 sec/batch\n", + "Epoch 19/20 Iteration 3248/3560 Training loss: 2.0262 0.0503 sec/batch\n", + "Epoch 19/20 Iteration 3249/3560 Training loss: 2.0263 0.0495 sec/batch\n", + "Epoch 19/20 Iteration 3250/3560 Training loss: 2.0252 0.0560 sec/batch\n", + "Epoch 19/20 Iteration 3251/3560 Training loss: 2.0254 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3252/3560 Training loss: 
2.0250 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3253/3560 Training loss: 2.0249 0.0506 sec/batch\n", + "Epoch 19/20 Iteration 3254/3560 Training loss: 2.0254 0.0551 sec/batch\n", + "Epoch 19/20 Iteration 3255/3560 Training loss: 2.0249 0.0548 sec/batch\n", + "Epoch 19/20 Iteration 3256/3560 Training loss: 2.0257 0.0505 sec/batch\n", + "Epoch 19/20 Iteration 3257/3560 Training loss: 2.0254 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3258/3560 Training loss: 2.0253 0.0538 sec/batch\n", + "Epoch 19/20 Iteration 3259/3560 Training loss: 2.0251 0.0592 sec/batch\n", + "Epoch 19/20 Iteration 3260/3560 Training loss: 2.0252 0.0523 sec/batch\n", + "Epoch 19/20 Iteration 3261/3560 Training loss: 2.0254 0.0495 sec/batch\n", + "Epoch 19/20 Iteration 3262/3560 Training loss: 2.0252 0.0526 sec/batch\n", + "Epoch 19/20 Iteration 3263/3560 Training loss: 2.0249 0.0601 sec/batch\n", + "Epoch 19/20 Iteration 3264/3560 Training loss: 2.0253 0.0546 sec/batch\n", + "Epoch 19/20 Iteration 3265/3560 Training loss: 2.0252 0.0533 sec/batch\n", + "Epoch 19/20 Iteration 3266/3560 Training loss: 2.0258 0.0515 sec/batch\n", + "Epoch 19/20 Iteration 3267/3560 Training loss: 2.0261 0.0582 sec/batch\n", + "Epoch 19/20 Iteration 3268/3560 Training loss: 2.0263 0.0520 sec/batch\n", + "Epoch 19/20 Iteration 3269/3560 Training loss: 2.0260 0.0534 sec/batch\n", + "Epoch 19/20 Iteration 3270/3560 Training loss: 2.0263 0.0538 sec/batch\n", + "Epoch 19/20 Iteration 3271/3560 Training loss: 2.0263 0.0513 sec/batch\n", + "Epoch 19/20 Iteration 3272/3560 Training loss: 2.0259 0.0532 sec/batch\n", + "Epoch 19/20 Iteration 3273/3560 Training loss: 2.0258 0.0506 sec/batch\n", + "Epoch 19/20 Iteration 3274/3560 Training loss: 2.0258 0.0567 sec/batch\n", + "Epoch 19/20 Iteration 3275/3560 Training loss: 2.0261 0.0529 sec/batch\n", + "Epoch 19/20 Iteration 3276/3560 Training loss: 2.0262 0.0563 sec/batch\n", + "Epoch 19/20 Iteration 3277/3560 Training loss: 2.0266 0.0521 sec/batch\n", + "Epoch 19/20 
Iteration 3278/3560 Training loss: 2.0262 0.0530 sec/batch\n", + "Epoch 19/20 Iteration 3279/3560 Training loss: 2.0261 0.0584 sec/batch\n", + "Epoch 19/20 Iteration 3280/3560 Training loss: 2.0264 0.0563 sec/batch\n", + "Epoch 19/20 Iteration 3281/3560 Training loss: 2.0263 0.0534 sec/batch\n", + "Epoch 19/20 Iteration 3282/3560 Training loss: 2.0264 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3283/3560 Training loss: 2.0262 0.0589 sec/batch\n", + "Epoch 19/20 Iteration 3284/3560 Training loss: 2.0261 0.0561 sec/batch\n", + "Epoch 19/20 Iteration 3285/3560 Training loss: 2.0257 0.0583 sec/batch\n", + "Epoch 19/20 Iteration 3286/3560 Training loss: 2.0258 0.0535 sec/batch\n", + "Epoch 19/20 Iteration 3287/3560 Training loss: 2.0254 0.0532 sec/batch\n", + "Epoch 19/20 Iteration 3288/3560 Training loss: 2.0252 0.0557 sec/batch\n", + "Epoch 19/20 Iteration 3289/3560 Training loss: 2.0247 0.0514 sec/batch\n", + "Epoch 19/20 Iteration 3290/3560 Training loss: 2.0245 0.0514 sec/batch\n", + "Epoch 19/20 Iteration 3291/3560 Training loss: 2.0244 0.0523 sec/batch\n", + "Epoch 19/20 Iteration 3292/3560 Training loss: 2.0242 0.0526 sec/batch\n", + "Epoch 19/20 Iteration 3293/3560 Training loss: 2.0238 0.0554 sec/batch\n", + "Epoch 19/20 Iteration 3294/3560 Training loss: 2.0239 0.0560 sec/batch\n", + "Epoch 19/20 Iteration 3295/3560 Training loss: 2.0237 0.0536 sec/batch\n", + "Epoch 19/20 Iteration 3296/3560 Training loss: 2.0237 0.0634 sec/batch\n", + "Epoch 19/20 Iteration 3297/3560 Training loss: 2.0233 0.0541 sec/batch\n", + "Epoch 19/20 Iteration 3298/3560 Training loss: 2.0230 0.0510 sec/batch\n", + "Epoch 19/20 Iteration 3299/3560 Training loss: 2.0228 0.0518 sec/batch\n", + "Epoch 19/20 Iteration 3300/3560 Training loss: 2.0227 0.0556 sec/batch\n", + "Epoch 19/20 Iteration 3301/3560 Training loss: 2.0226 0.0556 sec/batch\n", + "Epoch 19/20 Iteration 3302/3560 Training loss: 2.0223 0.0589 sec/batch\n", + "Epoch 19/20 Iteration 3303/3560 Training loss: 2.0221 0.0508 
sec/batch\n", + "Epoch 19/20 Iteration 3304/3560 Training loss: 2.0217 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3305/3560 Training loss: 2.0216 0.0510 sec/batch\n", + "Epoch 19/20 Iteration 3306/3560 Training loss: 2.0215 0.0536 sec/batch\n", + "Epoch 19/20 Iteration 3307/3560 Training loss: 2.0213 0.0527 sec/batch\n", + "Epoch 19/20 Iteration 3308/3560 Training loss: 2.0211 0.0556 sec/batch\n", + "Epoch 19/20 Iteration 3309/3560 Training loss: 2.0209 0.0516 sec/batch\n", + "Epoch 19/20 Iteration 3310/3560 Training loss: 2.0208 0.0510 sec/batch\n", + "Epoch 19/20 Iteration 3311/3560 Training loss: 2.0207 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3312/3560 Training loss: 2.0207 0.0511 sec/batch\n", + "Epoch 19/20 Iteration 3313/3560 Training loss: 2.0208 0.0500 sec/batch\n", + "Epoch 19/20 Iteration 3314/3560 Training loss: 2.0208 0.0496 sec/batch\n", + "Epoch 19/20 Iteration 3315/3560 Training loss: 2.0207 0.0570 sec/batch\n", + "Epoch 19/20 Iteration 3316/3560 Training loss: 2.0206 0.0507 sec/batch\n", + "Epoch 19/20 Iteration 3317/3560 Training loss: 2.0205 0.0578 sec/batch\n", + "Epoch 19/20 Iteration 3318/3560 Training loss: 2.0203 0.0500 sec/batch\n", + "Epoch 19/20 Iteration 3319/3560 Training loss: 2.0202 0.0553 sec/batch\n", + "Epoch 19/20 Iteration 3320/3560 Training loss: 2.0199 0.0604 sec/batch\n", + "Epoch 19/20 Iteration 3321/3560 Training loss: 2.0199 0.0555 sec/batch\n", + "Epoch 19/20 Iteration 3322/3560 Training loss: 2.0198 0.0525 sec/batch\n", + "Epoch 19/20 Iteration 3323/3560 Training loss: 2.0199 0.0526 sec/batch\n", + "Epoch 19/20 Iteration 3324/3560 Training loss: 2.0199 0.0572 sec/batch\n", + "Epoch 19/20 Iteration 3325/3560 Training loss: 2.0199 0.0563 sec/batch\n", + "Epoch 19/20 Iteration 3326/3560 Training loss: 2.0197 0.0554 sec/batch\n", + "Epoch 19/20 Iteration 3327/3560 Training loss: 2.0195 0.0498 sec/batch\n", + "Epoch 19/20 Iteration 3328/3560 Training loss: 2.0196 0.0540 sec/batch\n", + "Epoch 19/20 Iteration 3329/3560 
Training loss: 2.0196 0.0492 sec/batch\n", + "Epoch 19/20 Iteration 3330/3560 Training loss: 2.0192 0.0526 sec/batch\n", + "Epoch 19/20 Iteration 3331/3560 Training loss: 2.0193 0.0525 sec/batch\n", + "Epoch 19/20 Iteration 3332/3560 Training loss: 2.0193 0.0518 sec/batch\n", + "Epoch 19/20 Iteration 3333/3560 Training loss: 2.0192 0.0533 sec/batch\n", + "Epoch 19/20 Iteration 3334/3560 Training loss: 2.0193 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3335/3560 Training loss: 2.0191 0.0617 sec/batch\n", + "Epoch 19/20 Iteration 3336/3560 Training loss: 2.0188 0.0598 sec/batch\n", + "Epoch 19/20 Iteration 3337/3560 Training loss: 2.0189 0.0526 sec/batch\n", + "Epoch 19/20 Iteration 3338/3560 Training loss: 2.0190 0.0507 sec/batch\n", + "Epoch 19/20 Iteration 3339/3560 Training loss: 2.0189 0.0499 sec/batch\n", + "Epoch 19/20 Iteration 3340/3560 Training loss: 2.0191 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3341/3560 Training loss: 2.0191 0.0645 sec/batch\n", + "Epoch 19/20 Iteration 3342/3560 Training loss: 2.0192 0.0540 sec/batch\n", + "Epoch 19/20 Iteration 3343/3560 Training loss: 2.0194 0.0560 sec/batch\n", + "Epoch 19/20 Iteration 3344/3560 Training loss: 2.0193 0.0551 sec/batch\n", + "Epoch 19/20 Iteration 3345/3560 Training loss: 2.0195 0.0616 sec/batch\n", + "Epoch 19/20 Iteration 3346/3560 Training loss: 2.0195 0.0500 sec/batch\n", + "Epoch 19/20 Iteration 3347/3560 Training loss: 2.0194 0.0500 sec/batch\n", + "Epoch 19/20 Iteration 3348/3560 Training loss: 2.0194 0.0542 sec/batch\n", + "Epoch 19/20 Iteration 3349/3560 Training loss: 2.0193 0.0505 sec/batch\n", + "Epoch 19/20 Iteration 3350/3560 Training loss: 2.0194 0.0520 sec/batch\n", + "Epoch 19/20 Iteration 3351/3560 Training loss: 2.0194 0.0575 sec/batch\n", + "Epoch 19/20 Iteration 3352/3560 Training loss: 2.0196 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3353/3560 Training loss: 2.0196 0.0514 sec/batch\n", + "Epoch 19/20 Iteration 3354/3560 Training loss: 2.0195 0.0649 sec/batch\n", + 
"Epoch 19/20 Iteration 3355/3560 Training loss: 2.0193 0.0535 sec/batch\n", + "Epoch 19/20 Iteration 3356/3560 Training loss: 2.0194 0.0511 sec/batch\n", + "Epoch 19/20 Iteration 3357/3560 Training loss: 2.0195 0.0530 sec/batch\n", + "Epoch 19/20 Iteration 3358/3560 Training loss: 2.0196 0.0501 sec/batch\n", + "Epoch 19/20 Iteration 3359/3560 Training loss: 2.0195 0.0561 sec/batch\n", + "Epoch 19/20 Iteration 3360/3560 Training loss: 2.0195 0.0582 sec/batch\n", + "Epoch 19/20 Iteration 3361/3560 Training loss: 2.0194 0.0505 sec/batch\n", + "Epoch 19/20 Iteration 3362/3560 Training loss: 2.0194 0.0507 sec/batch\n", + "Epoch 19/20 Iteration 3363/3560 Training loss: 2.0192 0.0531 sec/batch\n", + "Epoch 19/20 Iteration 3364/3560 Training loss: 2.0193 0.0562 sec/batch\n", + "Epoch 19/20 Iteration 3365/3560 Training loss: 2.0194 0.0506 sec/batch\n", + "Epoch 19/20 Iteration 3366/3560 Training loss: 2.0193 0.0536 sec/batch\n", + "Epoch 19/20 Iteration 3367/3560 Training loss: 2.0194 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3368/3560 Training loss: 2.0194 0.0619 sec/batch\n", + "Epoch 19/20 Iteration 3369/3560 Training loss: 2.0194 0.0590 sec/batch\n", + "Epoch 19/20 Iteration 3370/3560 Training loss: 2.0194 0.0534 sec/batch\n", + "Epoch 19/20 Iteration 3371/3560 Training loss: 2.0193 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3372/3560 Training loss: 2.0196 0.0553 sec/batch\n", + "Epoch 19/20 Iteration 3373/3560 Training loss: 2.0196 0.0511 sec/batch\n", + "Epoch 19/20 Iteration 3374/3560 Training loss: 2.0196 0.0561 sec/batch\n", + "Epoch 19/20 Iteration 3375/3560 Training loss: 2.0196 0.0503 sec/batch\n", + "Epoch 19/20 Iteration 3376/3560 Training loss: 2.0196 0.0522 sec/batch\n", + "Epoch 19/20 Iteration 3377/3560 Training loss: 2.0197 0.0515 sec/batch\n", + "Epoch 19/20 Iteration 3378/3560 Training loss: 2.0198 0.0526 sec/batch\n", + "Epoch 19/20 Iteration 3379/3560 Training loss: 2.0199 0.0505 sec/batch\n", + "Epoch 19/20 Iteration 3380/3560 Training loss: 
2.0199 0.0525 sec/batch\n", + "Epoch 19/20 Iteration 3381/3560 Training loss: 2.0198 0.0505 sec/batch\n", + "Epoch 19/20 Iteration 3382/3560 Training loss: 2.0198 0.0582 sec/batch\n", + "Epoch 20/20 Iteration 3383/3560 Training loss: 2.0867 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3384/3560 Training loss: 2.0443 0.0546 sec/batch\n", + "Epoch 20/20 Iteration 3385/3560 Training loss: 2.0324 0.0543 sec/batch\n", + "Epoch 20/20 Iteration 3386/3560 Training loss: 2.0260 0.0566 sec/batch\n", + "Epoch 20/20 Iteration 3387/3560 Training loss: 2.0223 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3388/3560 Training loss: 2.0160 0.0523 sec/batch\n", + "Epoch 20/20 Iteration 3389/3560 Training loss: 2.0178 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3390/3560 Training loss: 2.0177 0.0545 sec/batch\n", + "Epoch 20/20 Iteration 3391/3560 Training loss: 2.0206 0.0530 sec/batch\n", + "Epoch 20/20 Iteration 3392/3560 Training loss: 2.0201 0.0532 sec/batch\n", + "Epoch 20/20 Iteration 3393/3560 Training loss: 2.0180 0.0497 sec/batch\n", + "Epoch 20/20 Iteration 3394/3560 Training loss: 2.0158 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3395/3560 Training loss: 2.0163 0.0576 sec/batch\n", + "Epoch 20/20 Iteration 3396/3560 Training loss: 2.0178 0.0556 sec/batch\n", + "Epoch 20/20 Iteration 3397/3560 Training loss: 2.0174 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3398/3560 Training loss: 2.0156 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3399/3560 Training loss: 2.0155 0.0551 sec/batch\n", + "Epoch 20/20 Iteration 3400/3560 Training loss: 2.0174 0.0498 sec/batch\n", + "Epoch 20/20 Iteration 3401/3560 Training loss: 2.0173 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3402/3560 Training loss: 2.0174 0.0500 sec/batch\n", + "Epoch 20/20 Iteration 3403/3560 Training loss: 2.0165 0.0525 sec/batch\n", + "Epoch 20/20 Iteration 3404/3560 Training loss: 2.0173 0.0538 sec/batch\n", + "Epoch 20/20 Iteration 3405/3560 Training loss: 2.0172 0.0521 sec/batch\n", + "Epoch 20/20 
Iteration 3406/3560 Training loss: 2.0164 0.0501 sec/batch\n", + "Epoch 20/20 Iteration 3407/3560 Training loss: 2.0158 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3408/3560 Training loss: 2.0149 0.0584 sec/batch\n", + "Epoch 20/20 Iteration 3409/3560 Training loss: 2.0139 0.0532 sec/batch\n", + "Epoch 20/20 Iteration 3410/3560 Training loss: 2.0140 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3411/3560 Training loss: 2.0148 0.0541 sec/batch\n", + "Epoch 20/20 Iteration 3412/3560 Training loss: 2.0150 0.0638 sec/batch\n", + "Epoch 20/20 Iteration 3413/3560 Training loss: 2.0147 0.0537 sec/batch\n", + "Epoch 20/20 Iteration 3414/3560 Training loss: 2.0143 0.0529 sec/batch\n", + "Epoch 20/20 Iteration 3415/3560 Training loss: 2.0143 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3416/3560 Training loss: 2.0154 0.0563 sec/batch\n", + "Epoch 20/20 Iteration 3417/3560 Training loss: 2.0149 0.0572 sec/batch\n", + "Epoch 20/20 Iteration 3418/3560 Training loss: 2.0148 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3419/3560 Training loss: 2.0146 0.0496 sec/batch\n", + "Epoch 20/20 Iteration 3420/3560 Training loss: 2.0135 0.0533 sec/batch\n", + "Epoch 20/20 Iteration 3421/3560 Training loss: 2.0124 0.0536 sec/batch\n", + "Epoch 20/20 Iteration 3422/3560 Training loss: 2.0119 0.0552 sec/batch\n", + "Epoch 20/20 Iteration 3423/3560 Training loss: 2.0116 0.0532 sec/batch\n", + "Epoch 20/20 Iteration 3424/3560 Training loss: 2.0119 0.0627 sec/batch\n", + "Epoch 20/20 Iteration 3425/3560 Training loss: 2.0115 0.0504 sec/batch\n", + "Epoch 20/20 Iteration 3426/3560 Training loss: 2.0107 0.0569 sec/batch\n", + "Epoch 20/20 Iteration 3427/3560 Training loss: 2.0107 0.0492 sec/batch\n", + "Epoch 20/20 Iteration 3428/3560 Training loss: 2.0097 0.0522 sec/batch\n", + "Epoch 20/20 Iteration 3429/3560 Training loss: 2.0095 0.0605 sec/batch\n", + "Epoch 20/20 Iteration 3430/3560 Training loss: 2.0090 0.0548 sec/batch\n", + "Epoch 20/20 Iteration 3431/3560 Training loss: 2.0087 0.0541 
sec/batch\n", + "Epoch 20/20 Iteration 3432/3560 Training loss: 2.0093 0.0566 sec/batch\n", + "Epoch 20/20 Iteration 3433/3560 Training loss: 2.0088 0.0517 sec/batch\n", + "Epoch 20/20 Iteration 3434/3560 Training loss: 2.0096 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3435/3560 Training loss: 2.0094 0.0525 sec/batch\n", + "Epoch 20/20 Iteration 3436/3560 Training loss: 2.0094 0.0508 sec/batch\n", + "Epoch 20/20 Iteration 3437/3560 Training loss: 2.0091 0.0539 sec/batch\n", + "Epoch 20/20 Iteration 3438/3560 Training loss: 2.0093 0.0508 sec/batch\n", + "Epoch 20/20 Iteration 3439/3560 Training loss: 2.0095 0.0511 sec/batch\n", + "Epoch 20/20 Iteration 3440/3560 Training loss: 2.0093 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3441/3560 Training loss: 2.0088 0.0504 sec/batch\n", + "Epoch 20/20 Iteration 3442/3560 Training loss: 2.0094 0.0537 sec/batch\n", + "Epoch 20/20 Iteration 3443/3560 Training loss: 2.0092 0.0502 sec/batch\n", + "Epoch 20/20 Iteration 3444/3560 Training loss: 2.0097 0.0552 sec/batch\n", + "Epoch 20/20 Iteration 3445/3560 Training loss: 2.0101 0.0582 sec/batch\n", + "Epoch 20/20 Iteration 3446/3560 Training loss: 2.0104 0.0558 sec/batch\n", + "Epoch 20/20 Iteration 3447/3560 Training loss: 2.0103 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3448/3560 Training loss: 2.0105 0.0522 sec/batch\n", + "Epoch 20/20 Iteration 3449/3560 Training loss: 2.0106 0.0543 sec/batch\n", + "Epoch 20/20 Iteration 3450/3560 Training loss: 2.0100 0.0539 sec/batch\n", + "Epoch 20/20 Iteration 3451/3560 Training loss: 2.0099 0.0543 sec/batch\n", + "Epoch 20/20 Iteration 3452/3560 Training loss: 2.0099 0.0534 sec/batch\n", + "Epoch 20/20 Iteration 3453/3560 Training loss: 2.0103 0.0584 sec/batch\n", + "Epoch 20/20 Iteration 3454/3560 Training loss: 2.0104 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3455/3560 Training loss: 2.0107 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3456/3560 Training loss: 2.0104 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3457/3560 
Training loss: 2.0103 0.0500 sec/batch\n", + "Epoch 20/20 Iteration 3458/3560 Training loss: 2.0106 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3459/3560 Training loss: 2.0106 0.0497 sec/batch\n", + "Epoch 20/20 Iteration 3460/3560 Training loss: 2.0107 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3461/3560 Training loss: 2.0102 0.0501 sec/batch\n", + "Epoch 20/20 Iteration 3462/3560 Training loss: 2.0101 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3463/3560 Training loss: 2.0097 0.0562 sec/batch\n", + "Epoch 20/20 Iteration 3464/3560 Training loss: 2.0097 0.0547 sec/batch\n", + "Epoch 20/20 Iteration 3465/3560 Training loss: 2.0092 0.0508 sec/batch\n", + "Epoch 20/20 Iteration 3466/3560 Training loss: 2.0090 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3467/3560 Training loss: 2.0085 0.0548 sec/batch\n", + "Epoch 20/20 Iteration 3468/3560 Training loss: 2.0082 0.0503 sec/batch\n", + "Epoch 20/20 Iteration 3469/3560 Training loss: 2.0081 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3470/3560 Training loss: 2.0080 0.0497 sec/batch\n", + "Epoch 20/20 Iteration 3471/3560 Training loss: 2.0076 0.0573 sec/batch\n", + "Epoch 20/20 Iteration 3472/3560 Training loss: 2.0076 0.0564 sec/batch\n", + "Epoch 20/20 Iteration 3473/3560 Training loss: 2.0074 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3474/3560 Training loss: 2.0074 0.0536 sec/batch\n", + "Epoch 20/20 Iteration 3475/3560 Training loss: 2.0069 0.0508 sec/batch\n", + "Epoch 20/20 Iteration 3476/3560 Training loss: 2.0067 0.0529 sec/batch\n", + "Epoch 20/20 Iteration 3477/3560 Training loss: 2.0064 0.0494 sec/batch\n", + "Epoch 20/20 Iteration 3478/3560 Training loss: 2.0063 0.0492 sec/batch\n", + "Epoch 20/20 Iteration 3479/3560 Training loss: 2.0062 0.0505 sec/batch\n", + "Epoch 20/20 Iteration 3480/3560 Training loss: 2.0058 0.0572 sec/batch\n", + "Epoch 20/20 Iteration 3481/3560 Training loss: 2.0055 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3482/3560 Training loss: 2.0052 0.0507 sec/batch\n", + 
"Epoch 20/20 Iteration 3483/3560 Training loss: 2.0052 0.0499 sec/batch\n", + "Epoch 20/20 Iteration 3484/3560 Training loss: 2.0052 0.0522 sec/batch\n", + "Epoch 20/20 Iteration 3485/3560 Training loss: 2.0049 0.0545 sec/batch\n", + "Epoch 20/20 Iteration 3486/3560 Training loss: 2.0047 0.0519 sec/batch\n", + "Epoch 20/20 Iteration 3487/3560 Training loss: 2.0045 0.0520 sec/batch\n", + "Epoch 20/20 Iteration 3488/3560 Training loss: 2.0044 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3489/3560 Training loss: 2.0044 0.0595 sec/batch\n", + "Epoch 20/20 Iteration 3490/3560 Training loss: 2.0044 0.0567 sec/batch\n", + "Epoch 20/20 Iteration 3491/3560 Training loss: 2.0044 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3492/3560 Training loss: 2.0043 0.0558 sec/batch\n", + "Epoch 20/20 Iteration 3493/3560 Training loss: 2.0043 0.0512 sec/batch\n", + "Epoch 20/20 Iteration 3494/3560 Training loss: 2.0042 0.0556 sec/batch\n", + "Epoch 20/20 Iteration 3495/3560 Training loss: 2.0041 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3496/3560 Training loss: 2.0039 0.0541 sec/batch\n", + "Epoch 20/20 Iteration 3497/3560 Training loss: 2.0037 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3498/3560 Training loss: 2.0035 0.0517 sec/batch\n", + "Epoch 20/20 Iteration 3499/3560 Training loss: 2.0035 0.0508 sec/batch\n", + "Epoch 20/20 Iteration 3500/3560 Training loss: 2.0034 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3501/3560 Training loss: 2.0034 0.0565 sec/batch\n", + "Epoch 20/20 Iteration 3502/3560 Training loss: 2.0034 0.0498 sec/batch\n", + "Epoch 20/20 Iteration 3503/3560 Training loss: 2.0035 0.0525 sec/batch\n", + "Epoch 20/20 Iteration 3504/3560 Training loss: 2.0033 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3505/3560 Training loss: 2.0032 0.0637 sec/batch\n", + "Epoch 20/20 Iteration 3506/3560 Training loss: 2.0033 0.0505 sec/batch\n", + "Epoch 20/20 Iteration 3507/3560 Training loss: 2.0032 0.0555 sec/batch\n", + "Epoch 20/20 Iteration 3508/3560 Training loss: 
2.0029 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3509/3560 Training loss: 2.0030 0.0543 sec/batch\n", + "Epoch 20/20 Iteration 3510/3560 Training loss: 2.0030 0.0560 sec/batch\n", + "Epoch 20/20 Iteration 3511/3560 Training loss: 2.0031 0.0512 sec/batch\n", + "Epoch 20/20 Iteration 3512/3560 Training loss: 2.0031 0.0557 sec/batch\n", + "Epoch 20/20 Iteration 3513/3560 Training loss: 2.0028 0.0502 sec/batch\n", + "Epoch 20/20 Iteration 3514/3560 Training loss: 2.0026 0.0555 sec/batch\n", + "Epoch 20/20 Iteration 3515/3560 Training loss: 2.0026 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3516/3560 Training loss: 2.0027 0.0530 sec/batch\n", + "Epoch 20/20 Iteration 3517/3560 Training loss: 2.0027 0.0505 sec/batch\n", + "Epoch 20/20 Iteration 3518/3560 Training loss: 2.0028 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3519/3560 Training loss: 2.0029 0.0512 sec/batch\n", + "Epoch 20/20 Iteration 3520/3560 Training loss: 2.0030 0.0584 sec/batch\n", + "Epoch 20/20 Iteration 3521/3560 Training loss: 2.0032 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3522/3560 Training loss: 2.0030 0.0505 sec/batch\n", + "Epoch 20/20 Iteration 3523/3560 Training loss: 2.0032 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3524/3560 Training loss: 2.0031 0.0548 sec/batch\n", + "Epoch 20/20 Iteration 3525/3560 Training loss: 2.0031 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3526/3560 Training loss: 2.0031 0.0557 sec/batch\n", + "Epoch 20/20 Iteration 3527/3560 Training loss: 2.0030 0.0506 sec/batch\n", + "Epoch 20/20 Iteration 3528/3560 Training loss: 2.0031 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3529/3560 Training loss: 2.0031 0.0550 sec/batch\n", + "Epoch 20/20 Iteration 3530/3560 Training loss: 2.0034 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3531/3560 Training loss: 2.0034 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3532/3560 Training loss: 2.0033 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3533/3560 Training loss: 2.0031 0.0570 sec/batch\n", + "Epoch 20/20 
Iteration 3534/3560 Training loss: 2.0032 0.0539 sec/batch\n", + "Epoch 20/20 Iteration 3535/3560 Training loss: 2.0032 0.0533 sec/batch\n", + "Epoch 20/20 Iteration 3536/3560 Training loss: 2.0033 0.0499 sec/batch\n", + "Epoch 20/20 Iteration 3537/3560 Training loss: 2.0032 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3538/3560 Training loss: 2.0032 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3539/3560 Training loss: 2.0032 0.0503 sec/batch\n", + "Epoch 20/20 Iteration 3540/3560 Training loss: 2.0032 0.0594 sec/batch\n", + "Epoch 20/20 Iteration 3541/3560 Training loss: 2.0030 0.0548 sec/batch\n", + "Epoch 20/20 Iteration 3542/3560 Training loss: 2.0032 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3543/3560 Training loss: 2.0033 0.0569 sec/batch\n", + "Epoch 20/20 Iteration 3544/3560 Training loss: 2.0032 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3545/3560 Training loss: 2.0033 0.0543 sec/batch\n", + "Epoch 20/20 Iteration 3546/3560 Training loss: 2.0034 0.0612 sec/batch\n", + "Epoch 20/20 Iteration 3547/3560 Training loss: 2.0034 0.0559 sec/batch\n", + "Epoch 20/20 Iteration 3548/3560 Training loss: 2.0033 0.0612 sec/batch\n", + "Epoch 20/20 Iteration 3549/3560 Training loss: 2.0033 0.0536 sec/batch\n", + "Epoch 20/20 Iteration 3550/3560 Training loss: 2.0036 0.0503 sec/batch\n", + "Epoch 20/20 Iteration 3551/3560 Training loss: 2.0036 0.0520 sec/batch\n", + "Epoch 20/20 Iteration 3552/3560 Training loss: 2.0035 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3553/3560 Training loss: 2.0035 0.0529 sec/batch\n", + "Epoch 20/20 Iteration 3554/3560 Training loss: 2.0034 0.0532 sec/batch\n", + "Epoch 20/20 Iteration 3555/3560 Training loss: 2.0035 0.0636 sec/batch\n", + "Epoch 20/20 Iteration 3556/3560 Training loss: 2.0036 0.0584 sec/batch\n", + "Epoch 20/20 Iteration 3557/3560 Training loss: 2.0037 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3558/3560 Training loss: 2.0036 0.0590 sec/batch\n", + "Epoch 20/20 Iteration 3559/3560 Training loss: 2.0035 0.0565 
sec/batch\n", + "Epoch 20/20 Iteration 3560/3560 Training loss: 2.0035 0.0629 sec/batch\n", + "Epoch 1/20 Iteration 1/3560 Training loss: 4.4210 1.2749 sec/batch\n", + "Epoch 1/20 Iteration 2/3560 Training loss: 4.4073 0.0588 sec/batch\n", + "Epoch 1/20 Iteration 3/3560 Training loss: 4.3853 0.0543 sec/batch\n", + "Epoch 1/20 Iteration 4/3560 Training loss: 4.3343 0.0545 sec/batch\n", + "Epoch 1/20 Iteration 5/3560 Training loss: 4.2398 0.0575 sec/batch\n", + "Epoch 1/20 Iteration 6/3560 Training loss: 4.1429 0.0600 sec/batch\n", + "Epoch 1/20 Iteration 7/3560 Training loss: 4.0643 0.0658 sec/batch\n", + "Epoch 1/20 Iteration 8/3560 Training loss: 3.9951 0.0589 sec/batch\n", + "Epoch 1/20 Iteration 9/3560 Training loss: 3.9320 0.0632 sec/batch\n", + "Epoch 1/20 Iteration 10/3560 Training loss: 3.8764 0.0565 sec/batch\n", + "Epoch 1/20 Iteration 11/3560 Training loss: 3.8281 0.0610 sec/batch\n", + "Epoch 1/20 Iteration 12/3560 Training loss: 3.7870 0.0587 sec/batch\n", + "Epoch 1/20 Iteration 13/3560 Training loss: 3.7523 0.0628 sec/batch\n", + "Epoch 1/20 Iteration 14/3560 Training loss: 3.7220 0.0686 sec/batch\n", + "Epoch 1/20 Iteration 15/3560 Training loss: 3.6952 0.0707 sec/batch\n", + "Epoch 1/20 Iteration 16/3560 Training loss: 3.6706 0.0601 sec/batch\n", + "Epoch 1/20 Iteration 17/3560 Training loss: 3.6469 0.0623 sec/batch\n", + "Epoch 1/20 Iteration 18/3560 Training loss: 3.6278 0.0600 sec/batch\n", + "Epoch 1/20 Iteration 19/3560 Training loss: 3.6086 0.0687 sec/batch\n", + "Epoch 1/20 Iteration 20/3560 Training loss: 3.5897 0.0649 sec/batch\n", + "Epoch 1/20 Iteration 21/3560 Training loss: 3.5730 0.0655 sec/batch\n", + "Epoch 1/20 Iteration 22/3560 Training loss: 3.5579 0.0642 sec/batch\n", + "Epoch 1/20 Iteration 23/3560 Training loss: 3.5435 0.0566 sec/batch\n", + "Epoch 1/20 Iteration 24/3560 Training loss: 3.5304 0.0638 sec/batch\n", + "Epoch 1/20 Iteration 25/3560 Training loss: 3.5176 0.0628 sec/batch\n", + "Epoch 1/20 Iteration 26/3560 Training 
loss: 3.5065 0.0646 sec/batch\n", + "Epoch 1/20 Iteration 27/3560 Training loss: 3.4961 0.0648 sec/batch\n", + "Epoch 1/20 Iteration 28/3560 Training loss: 3.4852 0.0575 sec/batch\n", + "Epoch 1/20 Iteration 29/3560 Training loss: 3.4751 0.0604 sec/batch\n", + "Epoch 1/20 Iteration 30/3560 Training loss: 3.4661 0.0561 sec/batch\n", + "Epoch 1/20 Iteration 31/3560 Training loss: 3.4581 0.0602 sec/batch\n", + "Epoch 1/20 Iteration 32/3560 Training loss: 3.4498 0.0684 sec/batch\n", + "Epoch 1/20 Iteration 33/3560 Training loss: 3.4417 0.0687 sec/batch\n", + "Epoch 1/20 Iteration 34/3560 Training loss: 3.4346 0.0592 sec/batch\n", + "Epoch 1/20 Iteration 35/3560 Training loss: 3.4271 0.0621 sec/batch\n", + "Epoch 1/20 Iteration 36/3560 Training loss: 3.4207 0.0579 sec/batch\n", + "Epoch 1/20 Iteration 37/3560 Training loss: 3.4138 0.0557 sec/batch\n", + "Epoch 1/20 Iteration 38/3560 Training loss: 3.4074 0.0590 sec/batch\n", + "Epoch 1/20 Iteration 39/3560 Training loss: 3.4009 0.0591 sec/batch\n", + "Epoch 1/20 Iteration 40/3560 Training loss: 3.3950 0.0572 sec/batch\n", + "Epoch 1/20 Iteration 41/3560 Training loss: 3.3892 0.0586 sec/batch\n", + "Epoch 1/20 Iteration 42/3560 Training loss: 3.3838 0.0579 sec/batch\n", + "Epoch 1/20 Iteration 43/3560 Training loss: 3.3785 0.0570 sec/batch\n", + "Epoch 1/20 Iteration 44/3560 Training loss: 3.3735 0.0585 sec/batch\n", + "Epoch 1/20 Iteration 45/3560 Training loss: 3.3685 0.0644 sec/batch\n", + "Epoch 1/20 Iteration 46/3560 Training loss: 3.3639 0.0617 sec/batch\n", + "Epoch 1/20 Iteration 47/3560 Training loss: 3.3596 0.0608 sec/batch\n", + "Epoch 1/20 Iteration 48/3560 Training loss: 3.3557 0.0581 sec/batch\n", + "Epoch 1/20 Iteration 49/3560 Training loss: 3.3518 0.0581 sec/batch\n", + "Epoch 1/20 Iteration 50/3560 Training loss: 3.3480 0.0628 sec/batch\n", + "Epoch 1/20 Iteration 51/3560 Training loss: 3.3443 0.0634 sec/batch\n", + "Epoch 1/20 Iteration 52/3560 Training loss: 3.3406 0.0568 sec/batch\n", + "Epoch 1/20 
Iteration 53/3560 Training loss: 3.3371 0.0588 sec/batch\n", + "Epoch 1/20 Iteration 54/3560 Training loss: 3.3334 0.0565 sec/batch\n", + "Epoch 1/20 Iteration 55/3560 Training loss: 3.3301 0.0575 sec/batch\n", + "Epoch 1/20 Iteration 56/3560 Training loss: 3.3266 0.0602 sec/batch\n", + "Epoch 1/20 Iteration 57/3560 Training loss: 3.3233 0.0591 sec/batch\n", + "Epoch 1/20 Iteration 58/3560 Training loss: 3.3202 0.0574 sec/batch\n", + "Epoch 1/20 Iteration 59/3560 Training loss: 3.3171 0.0618 sec/batch\n", + "Epoch 1/20 Iteration 60/3560 Training loss: 3.3142 0.0569 sec/batch\n", + "Epoch 1/20 Iteration 61/3560 Training loss: 3.3113 0.0608 sec/batch\n", + "Epoch 1/20 Iteration 62/3560 Training loss: 3.3090 0.0602 sec/batch\n", + "Epoch 1/20 Iteration 63/3560 Training loss: 3.3067 0.0620 sec/batch\n", + "Epoch 1/20 Iteration 64/3560 Training loss: 3.3038 0.0609 sec/batch\n", + "Epoch 1/20 Iteration 65/3560 Training loss: 3.3011 0.0567 sec/batch\n", + "Epoch 1/20 Iteration 66/3560 Training loss: 3.2989 0.0635 sec/batch\n", + "Epoch 1/20 Iteration 67/3560 Training loss: 3.2966 0.0669 sec/batch\n", + "Epoch 1/20 Iteration 68/3560 Training loss: 3.2936 0.0624 sec/batch\n", + "Epoch 1/20 Iteration 69/3560 Training loss: 3.2911 0.0566 sec/batch\n", + "Epoch 1/20 Iteration 70/3560 Training loss: 3.2891 0.0573 sec/batch\n", + "Epoch 1/20 Iteration 71/3560 Training loss: 3.2869 0.0622 sec/batch\n", + "Epoch 1/20 Iteration 72/3560 Training loss: 3.2850 0.0622 sec/batch\n", + "Epoch 1/20 Iteration 73/3560 Training loss: 3.2828 0.0588 sec/batch\n", + "Epoch 1/20 Iteration 74/3560 Training loss: 3.2807 0.0602 sec/batch\n", + "Epoch 1/20 Iteration 75/3560 Training loss: 3.2789 0.0640 sec/batch\n", + "Epoch 1/20 Iteration 76/3560 Training loss: 3.2772 0.0663 sec/batch\n", + "Epoch 1/20 Iteration 77/3560 Training loss: 3.2753 0.0585 sec/batch\n", + "Epoch 1/20 Iteration 78/3560 Training loss: 3.2735 0.0635 sec/batch\n", + "Epoch 1/20 Iteration 79/3560 Training loss: 3.2716 0.0555 
sec/batch\n", + "Epoch 1/20 Iteration 80/3560 Training loss: 3.2696 0.0697 sec/batch\n", + "Epoch 1/20 Iteration 81/3560 Training loss: 3.2677 0.0656 sec/batch\n", + "Epoch 1/20 Iteration 82/3560 Training loss: 3.2661 0.0628 sec/batch\n", + "Epoch 1/20 Iteration 83/3560 Training loss: 3.2644 0.0578 sec/batch\n", + "Epoch 1/20 Iteration 84/3560 Training loss: 3.2627 0.0715 sec/batch\n", + "Epoch 1/20 Iteration 85/3560 Training loss: 3.2608 0.0612 sec/batch\n", + "Epoch 1/20 Iteration 86/3560 Training loss: 3.2591 0.0593 sec/batch\n", + "Epoch 1/20 Iteration 87/3560 Training loss: 3.2573 0.0654 sec/batch\n", + "Epoch 1/20 Iteration 88/3560 Training loss: 3.2556 0.0757 sec/batch\n", + "Epoch 1/20 Iteration 89/3560 Training loss: 3.2542 0.0573 sec/batch\n", + "Epoch 1/20 Iteration 90/3560 Training loss: 3.2528 0.0606 sec/batch\n", + "Epoch 1/20 Iteration 91/3560 Training loss: 3.2513 0.0703 sec/batch\n", + "Epoch 1/20 Iteration 92/3560 Training loss: 3.2497 0.0700 sec/batch\n", + "Epoch 1/20 Iteration 93/3560 Training loss: 3.2482 0.0594 sec/batch\n", + "Epoch 1/20 Iteration 94/3560 Training loss: 3.2468 0.0585 sec/batch\n", + "Epoch 1/20 Iteration 95/3560 Training loss: 3.2452 0.0576 sec/batch\n", + "Epoch 1/20 Iteration 96/3560 Training loss: 3.2436 0.0575 sec/batch\n", + "Epoch 1/20 Iteration 97/3560 Training loss: 3.2422 0.0642 sec/batch\n", + "Epoch 1/20 Iteration 98/3560 Training loss: 3.2407 0.0631 sec/batch\n", + "Epoch 1/20 Iteration 99/3560 Training loss: 3.2392 0.0582 sec/batch\n", + "Epoch 1/20 Iteration 100/3560 Training loss: 3.2377 0.0587 sec/batch\n", + "Epoch 1/20 Iteration 101/3560 Training loss: 3.2363 0.0618 sec/batch\n", + "Epoch 1/20 Iteration 102/3560 Training loss: 3.2349 0.0608 sec/batch\n", + "Epoch 1/20 Iteration 103/3560 Training loss: 3.2335 0.0594 sec/batch\n", + "Epoch 1/20 Iteration 104/3560 Training loss: 3.2320 0.0635 sec/batch\n", + "Epoch 1/20 Iteration 105/3560 Training loss: 3.2305 0.0614 sec/batch\n", + "Epoch 1/20 Iteration 
106/3560 Training loss: 3.2291 0.0571 sec/batch\n", + "Epoch 1/20 Iteration 107/3560 Training loss: 3.2274 0.0639 sec/batch\n", + "Epoch 1/20 Iteration 108/3560 Training loss: 3.2257 0.0620 sec/batch\n", + "Epoch 1/20 Iteration 109/3560 Training loss: 3.2242 0.0692 sec/batch\n", + "Epoch 1/20 Iteration 110/3560 Training loss: 3.2224 0.0596 sec/batch\n", + "Epoch 1/20 Iteration 111/3560 Training loss: 3.2208 0.0581 sec/batch\n", + "Epoch 1/20 Iteration 112/3560 Training loss: 3.2193 0.0588 sec/batch\n", + "Epoch 1/20 Iteration 113/3560 Training loss: 3.2175 0.0613 sec/batch\n", + "Epoch 1/20 Iteration 114/3560 Training loss: 3.2157 0.0598 sec/batch\n", + "Epoch 1/20 Iteration 115/3560 Training loss: 3.2139 0.0619 sec/batch\n", + "Epoch 1/20 Iteration 116/3560 Training loss: 3.2122 0.0648 sec/batch\n", + "Epoch 1/20 Iteration 117/3560 Training loss: 3.2104 0.0609 sec/batch\n", + "Epoch 1/20 Iteration 118/3560 Training loss: 3.2089 0.0606 sec/batch\n", + "Epoch 1/20 Iteration 119/3560 Training loss: 3.2073 0.0590 sec/batch\n", + "Epoch 1/20 Iteration 120/3560 Training loss: 3.2056 0.0586 sec/batch\n", + "Epoch 1/20 Iteration 121/3560 Training loss: 3.2040 0.0644 sec/batch\n", + "Epoch 1/20 Iteration 122/3560 Training loss: 3.2024 0.0631 sec/batch\n", + "Epoch 1/20 Iteration 123/3560 Training loss: 3.2007 0.0629 sec/batch\n", + "Epoch 1/20 Iteration 124/3560 Training loss: 3.1991 0.0653 sec/batch\n", + "Epoch 1/20 Iteration 125/3560 Training loss: 3.1973 0.0649 sec/batch\n", + "Epoch 1/20 Iteration 126/3560 Training loss: 3.1953 0.0640 sec/batch\n", + "Epoch 1/20 Iteration 127/3560 Training loss: 3.1936 0.0615 sec/batch\n", + "Epoch 1/20 Iteration 128/3560 Training loss: 3.1919 0.0614 sec/batch\n", + "Epoch 1/20 Iteration 129/3560 Training loss: 3.1900 0.0593 sec/batch\n", + "Epoch 1/20 Iteration 130/3560 Training loss: 3.1881 0.0653 sec/batch\n", + "Epoch 1/20 Iteration 131/3560 Training loss: 3.1864 0.0638 sec/batch\n", + "Epoch 1/20 Iteration 132/3560 Training loss: 
3.1844 0.0635 sec/batch\n", + "Epoch 1/20 Iteration 133/3560 Training loss: 3.1825 0.0633 sec/batch\n", + "Epoch 1/20 Iteration 134/3560 Training loss: 3.1805 0.0589 sec/batch\n", + "Epoch 1/20 Iteration 135/3560 Training loss: 3.1782 0.0684 sec/batch\n", + "Epoch 1/20 Iteration 136/3560 Training loss: 3.1760 0.0624 sec/batch\n", + "Epoch 1/20 Iteration 137/3560 Training loss: 3.1741 0.0625 sec/batch\n", + "Epoch 1/20 Iteration 138/3560 Training loss: 3.1720 0.0613 sec/batch\n", + "Epoch 1/20 Iteration 139/3560 Training loss: 3.1700 0.0629 sec/batch\n", + "Epoch 1/20 Iteration 140/3560 Training loss: 3.1680 0.0695 sec/batch\n", + "Epoch 1/20 Iteration 141/3560 Training loss: 3.1660 0.0599 sec/batch\n", + "Epoch 1/20 Iteration 142/3560 Training loss: 3.1637 0.0634 sec/batch\n", + "Epoch 1/20 Iteration 143/3560 Training loss: 3.1616 0.0621 sec/batch\n", + "Epoch 1/20 Iteration 144/3560 Training loss: 3.1594 0.0753 sec/batch\n", + "Epoch 1/20 Iteration 145/3560 Training loss: 3.1573 0.0613 sec/batch\n", + "Epoch 1/20 Iteration 146/3560 Training loss: 3.1553 0.0644 sec/batch\n", + "Epoch 1/20 Iteration 147/3560 Training loss: 3.1533 0.0674 sec/batch\n", + "Epoch 1/20 Iteration 148/3560 Training loss: 3.1514 0.0625 sec/batch\n", + "Epoch 1/20 Iteration 149/3560 Training loss: 3.1491 0.0648 sec/batch\n", + "Epoch 1/20 Iteration 150/3560 Training loss: 3.1469 0.0640 sec/batch\n", + "Epoch 1/20 Iteration 151/3560 Training loss: 3.1451 0.0670 sec/batch\n", + "Epoch 1/20 Iteration 152/3560 Training loss: 3.1432 0.0678 sec/batch\n", + "Epoch 1/20 Iteration 153/3560 Training loss: 3.1412 0.0648 sec/batch\n", + "Epoch 1/20 Iteration 154/3560 Training loss: 3.1390 0.0626 sec/batch\n", + "Epoch 1/20 Iteration 155/3560 Training loss: 3.1368 0.0684 sec/batch\n", + "Epoch 1/20 Iteration 156/3560 Training loss: 3.1346 0.0668 sec/batch\n", + "Epoch 1/20 Iteration 157/3560 Training loss: 3.1324 0.0598 sec/batch\n", + "Epoch 1/20 Iteration 158/3560 Training loss: 3.1302 0.0636 
sec/batch\n", + "Epoch 1/20 Iteration 159/3560 Training loss: 3.1279 0.0608 sec/batch\n", + "Epoch 1/20 Iteration 160/3560 Training loss: 3.1256 0.0708 sec/batch\n", + "Epoch 1/20 Iteration 161/3560 Training loss: 3.1235 0.0644 sec/batch\n", + "Epoch 1/20 Iteration 162/3560 Training loss: 3.1210 0.0623 sec/batch\n", + "Epoch 1/20 Iteration 163/3560 Training loss: 3.1186 0.0637 sec/batch\n", + "Epoch 1/20 Iteration 164/3560 Training loss: 3.1164 0.0725 sec/batch\n", + "Epoch 1/20 Iteration 165/3560 Training loss: 3.1141 0.0737 sec/batch\n", + "Epoch 1/20 Iteration 166/3560 Training loss: 3.1119 0.0757 sec/batch\n", + "Epoch 1/20 Iteration 167/3560 Training loss: 3.1096 0.0690 sec/batch\n", + "Epoch 1/20 Iteration 168/3560 Training loss: 3.1074 0.0725 sec/batch\n", + "Epoch 1/20 Iteration 169/3560 Training loss: 3.1051 0.0639 sec/batch\n", + "Epoch 1/20 Iteration 170/3560 Training loss: 3.1028 0.0697 sec/batch\n", + "Epoch 1/20 Iteration 171/3560 Training loss: 3.1006 0.0605 sec/batch\n", + "Epoch 1/20 Iteration 172/3560 Training loss: 3.0985 0.0604 sec/batch\n", + "Epoch 1/20 Iteration 173/3560 Training loss: 3.0965 0.0694 sec/batch\n", + "Epoch 1/20 Iteration 174/3560 Training loss: 3.0946 0.0633 sec/batch\n", + "Epoch 1/20 Iteration 175/3560 Training loss: 3.0925 0.0628 sec/batch\n", + "Epoch 1/20 Iteration 176/3560 Training loss: 3.0902 0.0602 sec/batch\n", + "Epoch 1/20 Iteration 177/3560 Training loss: 3.0879 0.0631 sec/batch\n", + "Epoch 1/20 Iteration 178/3560 Training loss: 3.0854 0.0664 sec/batch\n", + "Epoch 2/20 Iteration 179/3560 Training loss: 2.7080 0.0728 sec/batch\n", + "Epoch 2/20 Iteration 180/3560 Training loss: 2.6724 0.0623 sec/batch\n", + "Epoch 2/20 Iteration 181/3560 Training loss: 2.6671 0.0716 sec/batch\n", + "Epoch 2/20 Iteration 182/3560 Training loss: 2.6624 0.0607 sec/batch\n", + "Epoch 2/20 Iteration 183/3560 Training loss: 2.6583 0.0779 sec/batch\n", + "Epoch 2/20 Iteration 184/3560 Training loss: 2.6563 0.0660 sec/batch\n", + "Epoch 
2/20 Iteration 185/3560 Training loss: 2.6560 0.0624 sec/batch\n", + "Epoch 2/20 Iteration 186/3560 Training loss: 2.6554 0.0636 sec/batch\n", + "Epoch 2/20 Iteration 187/3560 Training loss: 2.6541 0.0645 sec/batch\n", + "Epoch 2/20 Iteration 188/3560 Training loss: 2.6513 0.0625 sec/batch\n", + "Epoch 2/20 Iteration 189/3560 Training loss: 2.6484 0.0670 sec/batch\n", + "Epoch 2/20 Iteration 190/3560 Training loss: 2.6478 0.0645 sec/batch\n", + "Epoch 2/20 Iteration 191/3560 Training loss: 2.6463 0.0601 sec/batch\n", + "Epoch 2/20 Iteration 192/3560 Training loss: 2.6471 0.0634 sec/batch\n", + "Epoch 2/20 Iteration 193/3560 Training loss: 2.6460 0.0610 sec/batch\n", + "Epoch 2/20 Iteration 194/3560 Training loss: 2.6443 0.0659 sec/batch\n", + "Epoch 2/20 Iteration 195/3560 Training loss: 2.6421 0.0721 sec/batch\n", + "Epoch 2/20 Iteration 196/3560 Training loss: 2.6422 0.0662 sec/batch\n", + "Epoch 2/20 Iteration 197/3560 Training loss: 2.6407 0.0661 sec/batch\n", + "Epoch 2/20 Iteration 198/3560 Training loss: 2.6379 0.0733 sec/batch\n", + "Epoch 2/20 Iteration 199/3560 Training loss: 2.6362 0.0634 sec/batch\n", + "Epoch 2/20 Iteration 200/3560 Training loss: 2.6358 0.0634 sec/batch\n", + "Epoch 2/20 Iteration 201/3560 Training loss: 2.6340 0.0687 sec/batch\n", + "Epoch 2/20 Iteration 202/3560 Training loss: 2.6319 0.0681 sec/batch\n", + "Epoch 2/20 Iteration 203/3560 Training loss: 2.6298 0.0617 sec/batch\n", + "Epoch 2/20 Iteration 204/3560 Training loss: 2.6290 0.0719 sec/batch\n", + "Epoch 2/20 Iteration 205/3560 Training loss: 2.6273 0.0640 sec/batch\n", + "Epoch 2/20 Iteration 206/3560 Training loss: 2.6255 0.0659 sec/batch\n", + "Epoch 2/20 Iteration 207/3560 Training loss: 2.6242 0.0654 sec/batch\n", + "Epoch 2/20 Iteration 208/3560 Training loss: 2.6223 0.0625 sec/batch\n", + "Epoch 2/20 Iteration 209/3560 Training loss: 2.6216 0.0653 sec/batch\n", + "Epoch 2/20 Iteration 210/3560 Training loss: 2.6197 0.0652 sec/batch\n", + "Epoch 2/20 Iteration 211/3560 
Training loss: 2.6180 0.0628 sec/batch\n", + "Epoch 2/20 Iteration 212/3560 Training loss: 2.6164 0.0647 sec/batch\n", + "Epoch 2/20 Iteration 213/3560 Training loss: 2.6144 0.0614 sec/batch\n", + "Epoch 2/20 Iteration 214/3560 Training loss: 2.6130 0.0668 sec/batch\n", + "Epoch 2/20 Iteration 215/3560 Training loss: 2.6111 0.0639 sec/batch\n", + "Epoch 2/20 Iteration 216/3560 Training loss: 2.6089 0.0672 sec/batch\n", + "Epoch 2/20 Iteration 217/3560 Training loss: 2.6073 0.0640 sec/batch\n", + "Epoch 2/20 Iteration 218/3560 Training loss: 2.6055 0.0662 sec/batch\n", + "Epoch 2/20 Iteration 219/3560 Training loss: 2.6036 0.0688 sec/batch\n", + "Epoch 2/20 Iteration 220/3560 Training loss: 2.6017 0.0629 sec/batch\n", + "Epoch 2/20 Iteration 221/3560 Training loss: 2.6000 0.0685 sec/batch\n", + "Epoch 2/20 Iteration 222/3560 Training loss: 2.5983 0.0665 sec/batch\n", + "Epoch 2/20 Iteration 223/3560 Training loss: 2.5965 0.0626 sec/batch\n", + "Epoch 2/20 Iteration 224/3560 Training loss: 2.5946 0.0637 sec/batch\n", + "Epoch 2/20 Iteration 225/3560 Training loss: 2.5936 0.0674 sec/batch\n", + "Epoch 2/20 Iteration 226/3560 Training loss: 2.5922 0.0665 sec/batch\n", + "Epoch 2/20 Iteration 227/3560 Training loss: 2.5908 0.0636 sec/batch\n", + "Epoch 2/20 Iteration 228/3560 Training loss: 2.5899 0.0650 sec/batch\n", + "Epoch 2/20 Iteration 229/3560 Training loss: 2.5886 0.0683 sec/batch\n", + "Epoch 2/20 Iteration 230/3560 Training loss: 2.5871 0.0610 sec/batch\n", + "Epoch 2/20 Iteration 231/3560 Training loss: 2.5855 0.0609 sec/batch\n", + "Epoch 2/20 Iteration 232/3560 Training loss: 2.5839 0.0644 sec/batch\n", + "Epoch 2/20 Iteration 233/3560 Training loss: 2.5826 0.0670 sec/batch\n", + "Epoch 2/20 Iteration 234/3560 Training loss: 2.5814 0.0599 sec/batch\n", + "Epoch 2/20 Iteration 235/3560 Training loss: 2.5802 0.0605 sec/batch\n", + "Epoch 2/20 Iteration 236/3560 Training loss: 2.5788 0.0703 sec/batch\n", + "Epoch 2/20 Iteration 237/3560 Training loss: 2.5775 
0.0774 sec/batch\n", + "Epoch 2/20 Iteration 238/3560 Training loss: 2.5764 0.0608 sec/batch\n", + "Epoch 2/20 Iteration 239/3560 Training loss: 2.5751 0.0685 sec/batch\n", + "Epoch 2/20 Iteration 240/3560 Training loss: 2.5741 0.0673 sec/batch\n", + "Epoch 2/20 Iteration 241/3560 Training loss: 2.5732 0.0670 sec/batch\n", + "Epoch 2/20 Iteration 242/3560 Training loss: 2.5718 0.0630 sec/batch\n", + "Epoch 2/20 Iteration 243/3560 Training loss: 2.5704 0.0673 sec/batch\n", + "Epoch 2/20 Iteration 244/3560 Training loss: 2.5695 0.0605 sec/batch\n", + "Epoch 2/20 Iteration 245/3560 Training loss: 2.5683 0.0633 sec/batch\n", + "Epoch 2/20 Iteration 246/3560 Training loss: 2.5666 0.0651 sec/batch\n", + "Epoch 2/20 Iteration 247/3560 Training loss: 2.5653 0.0648 sec/batch\n", + "Epoch 2/20 Iteration 248/3560 Training loss: 2.5643 0.0665 sec/batch\n", + "Epoch 2/20 Iteration 249/3560 Training loss: 2.5633 0.0702 sec/batch\n", + "Epoch 2/20 Iteration 250/3560 Training loss: 2.5623 0.0628 sec/batch\n", + "Epoch 2/20 Iteration 251/3560 Training loss: 2.5613 0.0605 sec/batch\n", + "Epoch 2/20 Iteration 252/3560 Training loss: 2.5601 0.0655 sec/batch\n", + "Epoch 2/20 Iteration 253/3560 Training loss: 2.5590 0.0742 sec/batch\n", + "Epoch 2/20 Iteration 254/3560 Training loss: 2.5584 0.0621 sec/batch\n", + "Epoch 2/20 Iteration 255/3560 Training loss: 2.5574 0.0605 sec/batch\n", + "Epoch 2/20 Iteration 256/3560 Training loss: 2.5564 0.0632 sec/batch\n", + "Epoch 2/20 Iteration 257/3560 Training loss: 2.5552 0.0647 sec/batch\n", + "Epoch 2/20 Iteration 258/3560 Training loss: 2.5540 0.0601 sec/batch\n", + "Epoch 2/20 Iteration 259/3560 Training loss: 2.5528 0.0630 sec/batch\n", + "Epoch 2/20 Iteration 260/3560 Training loss: 2.5518 0.0642 sec/batch\n", + "Epoch 2/20 Iteration 261/3560 Training loss: 2.5507 0.0639 sec/batch\n", + "Epoch 2/20 Iteration 262/3560 Training loss: 2.5496 0.0623 sec/batch\n", + "Epoch 2/20 Iteration 263/3560 Training loss: 2.5480 0.0701 sec/batch\n", + 
"Epoch 2/20 Iteration 264/3560 Training loss: 2.5469 0.0621 sec/batch\n", + "Epoch 2/20 Iteration 265/3560 Training loss: 2.5457 0.0671 sec/batch\n", + "Epoch 2/20 Iteration 266/3560 Training loss: 2.5447 0.0630 sec/batch\n", + "Epoch 2/20 Iteration 267/3560 Training loss: 2.5436 0.0632 sec/batch\n", + "Epoch 2/20 Iteration 268/3560 Training loss: 2.5429 0.0744 sec/batch\n", + "Epoch 2/20 Iteration 269/3560 Training loss: 2.5418 0.0707 sec/batch\n", + "Epoch 2/20 Iteration 270/3560 Training loss: 2.5410 0.0623 sec/batch\n", + "Epoch 2/20 Iteration 271/3560 Training loss: 2.5400 0.0603 sec/batch\n", + "Epoch 2/20 Iteration 272/3560 Training loss: 2.5389 0.0646 sec/batch\n", + "Epoch 2/20 Iteration 273/3560 Training loss: 2.5377 0.0753 sec/batch\n", + "Epoch 2/20 Iteration 274/3560 Training loss: 2.5366 0.0596 sec/batch\n", + "Epoch 2/20 Iteration 275/3560 Training loss: 2.5357 0.0635 sec/batch\n", + "Epoch 2/20 Iteration 276/3560 Training loss: 2.5347 0.0622 sec/batch\n", + "Epoch 2/20 Iteration 277/3560 Training loss: 2.5337 0.0684 sec/batch\n", + "Epoch 2/20 Iteration 278/3560 Training loss: 2.5328 0.0632 sec/batch\n", + "Epoch 2/20 Iteration 279/3560 Training loss: 2.5321 0.0608 sec/batch\n", + "Epoch 2/20 Iteration 280/3560 Training loss: 2.5311 0.0652 sec/batch\n", + "Epoch 2/20 Iteration 281/3560 Training loss: 2.5300 0.0646 sec/batch\n", + "Epoch 2/20 Iteration 282/3560 Training loss: 2.5290 0.0734 sec/batch\n", + "Epoch 2/20 Iteration 283/3560 Training loss: 2.5280 0.0632 sec/batch\n", + "Epoch 2/20 Iteration 284/3560 Training loss: 2.5272 0.0652 sec/batch\n", + "Epoch 2/20 Iteration 285/3560 Training loss: 2.5262 0.0612 sec/batch\n", + "Epoch 2/20 Iteration 286/3560 Training loss: 2.5255 0.0661 sec/batch\n", + "Epoch 2/20 Iteration 287/3560 Training loss: 2.5247 0.0628 sec/batch\n", + "Epoch 2/20 Iteration 288/3560 Training loss: 2.5236 0.0691 sec/batch\n", + "Epoch 2/20 Iteration 289/3560 Training loss: 2.5228 0.0630 sec/batch\n", + "Epoch 2/20 Iteration 
290/3560 Training loss: 2.5220 0.0649 sec/batch\n", + "Epoch 2/20 Iteration 291/3560 Training loss: 2.5211 0.0626 sec/batch\n", + "Epoch 2/20 Iteration 292/3560 Training loss: 2.5201 0.0642 sec/batch\n", + "Epoch 2/20 Iteration 293/3560 Training loss: 2.5191 0.0636 sec/batch\n", + "Epoch 2/20 Iteration 294/3560 Training loss: 2.5180 0.0628 sec/batch\n", + "Epoch 2/20 Iteration 295/3560 Training loss: 2.5171 0.0655 sec/batch\n", + "Epoch 2/20 Iteration 296/3560 Training loss: 2.5163 0.0648 sec/batch\n", + "Epoch 2/20 Iteration 297/3560 Training loss: 2.5157 0.0638 sec/batch\n", + "Epoch 2/20 Iteration 298/3560 Training loss: 2.5148 0.0621 sec/batch\n", + "Epoch 2/20 Iteration 299/3560 Training loss: 2.5142 0.0674 sec/batch\n", + "Epoch 2/20 Iteration 300/3560 Training loss: 2.5134 0.0676 sec/batch\n", + "Epoch 2/20 Iteration 301/3560 Training loss: 2.5125 0.0687 sec/batch\n", + "Epoch 2/20 Iteration 302/3560 Training loss: 2.5118 0.0644 sec/batch\n", + "Epoch 2/20 Iteration 303/3560 Training loss: 2.5110 0.0621 sec/batch\n", + "Epoch 2/20 Iteration 304/3560 Training loss: 2.5100 0.0627 sec/batch\n", + "Epoch 2/20 Iteration 305/3560 Training loss: 2.5093 0.0676 sec/batch\n", + "Epoch 2/20 Iteration 306/3560 Training loss: 2.5087 0.0639 sec/batch\n", + "Epoch 2/20 Iteration 307/3560 Training loss: 2.5079 0.0705 sec/batch\n", + "Epoch 2/20 Iteration 308/3560 Training loss: 2.5071 0.0608 sec/batch\n", + "Epoch 2/20 Iteration 309/3560 Training loss: 2.5063 0.0647 sec/batch\n", + "Epoch 2/20 Iteration 310/3560 Training loss: 2.5054 0.0684 sec/batch\n", + "Epoch 2/20 Iteration 311/3560 Training loss: 2.5047 0.0646 sec/batch\n", + "Epoch 2/20 Iteration 312/3560 Training loss: 2.5041 0.0616 sec/batch\n", + "Epoch 2/20 Iteration 313/3560 Training loss: 2.5032 0.0630 sec/batch\n", + "Epoch 2/20 Iteration 314/3560 Training loss: 2.5024 0.0611 sec/batch\n", + "Epoch 2/20 Iteration 315/3560 Training loss: 2.5016 0.0622 sec/batch\n", + "Epoch 2/20 Iteration 316/3560 Training loss: 
2.5008 0.0664 sec/batch\n", + "Epoch 2/20 Iteration 317/3560 Training loss: 2.5002 0.0615 sec/batch\n", + "Epoch 2/20 Iteration 318/3560 Training loss: 2.4994 0.0657 sec/batch\n", + "Epoch 2/20 Iteration 319/3560 Training loss: 2.4988 0.0739 sec/batch\n", + "Epoch 2/20 Iteration 320/3560 Training loss: 2.4979 0.0626 sec/batch\n", + "Epoch 2/20 Iteration 321/3560 Training loss: 2.4971 0.0692 sec/batch\n", + "Epoch 2/20 Iteration 322/3560 Training loss: 2.4963 0.0702 sec/batch\n", + "Epoch 2/20 Iteration 323/3560 Training loss: 2.4957 0.0662 sec/batch\n", + "Epoch 2/20 Iteration 324/3560 Training loss: 2.4951 0.0655 sec/batch\n", + "Epoch 2/20 Iteration 325/3560 Training loss: 2.4944 0.0627 sec/batch\n", + "Epoch 2/20 Iteration 326/3560 Training loss: 2.4939 0.0713 sec/batch\n", + "Epoch 2/20 Iteration 327/3560 Training loss: 2.4931 0.0616 sec/batch\n", + "Epoch 2/20 Iteration 328/3560 Training loss: 2.4922 0.0680 sec/batch\n", + "Epoch 2/20 Iteration 329/3560 Training loss: 2.4916 0.0684 sec/batch\n", + "Epoch 2/20 Iteration 330/3560 Training loss: 2.4912 0.0643 sec/batch\n", + "Epoch 2/20 Iteration 331/3560 Training loss: 2.4906 0.0628 sec/batch\n", + "Epoch 2/20 Iteration 332/3560 Training loss: 2.4900 0.0653 sec/batch\n", + "Epoch 2/20 Iteration 333/3560 Training loss: 2.4893 0.0617 sec/batch\n", + "Epoch 2/20 Iteration 334/3560 Training loss: 2.4886 0.0684 sec/batch\n", + "Epoch 2/20 Iteration 335/3560 Training loss: 2.4878 0.0614 sec/batch\n", + "Epoch 2/20 Iteration 336/3560 Training loss: 2.4870 0.0629 sec/batch\n", + "Epoch 2/20 Iteration 337/3560 Training loss: 2.4862 0.0651 sec/batch\n", + "Epoch 2/20 Iteration 338/3560 Training loss: 2.4855 0.0662 sec/batch\n", + "Epoch 2/20 Iteration 339/3560 Training loss: 2.4849 0.0687 sec/batch\n", + "Epoch 2/20 Iteration 340/3560 Training loss: 2.4840 0.0791 sec/batch\n", + "Epoch 2/20 Iteration 341/3560 Training loss: 2.4832 0.0696 sec/batch\n", + "Epoch 2/20 Iteration 342/3560 Training loss: 2.4824 0.0623 
sec/batch\n", + "Epoch 2/20 Iteration 343/3560 Training loss: 2.4818 0.0626 sec/batch\n", + "Epoch 2/20 Iteration 344/3560 Training loss: 2.4811 0.0627 sec/batch\n", + "Epoch 2/20 Iteration 345/3560 Training loss: 2.4804 0.0657 sec/batch\n", + "Epoch 2/20 Iteration 346/3560 Training loss: 2.4798 0.0652 sec/batch\n", + "Epoch 2/20 Iteration 347/3560 Training loss: 2.4791 0.0663 sec/batch\n", + "Epoch 2/20 Iteration 348/3560 Training loss: 2.4784 0.0696 sec/batch\n", + "Epoch 2/20 Iteration 349/3560 Training loss: 2.4778 0.0700 sec/batch\n", + "Epoch 2/20 Iteration 350/3560 Training loss: 2.4773 0.0631 sec/batch\n", + "Epoch 2/20 Iteration 351/3560 Training loss: 2.4770 0.0633 sec/batch\n", + "Epoch 2/20 Iteration 352/3560 Training loss: 2.4766 0.0627 sec/batch\n", + "Epoch 2/20 Iteration 353/3560 Training loss: 2.4763 0.0711 sec/batch\n", + "Epoch 2/20 Iteration 354/3560 Training loss: 2.4756 0.0646 sec/batch\n", + "Epoch 2/20 Iteration 355/3560 Training loss: 2.4749 0.0667 sec/batch\n", + "Epoch 2/20 Iteration 356/3560 Training loss: 2.4741 0.0665 sec/batch\n", + "Epoch 3/20 Iteration 357/3560 Training loss: 2.4108 0.0776 sec/batch\n", + "Epoch 3/20 Iteration 358/3560 Training loss: 2.3642 0.0672 sec/batch\n", + "Epoch 3/20 Iteration 359/3560 Training loss: 2.3541 0.0650 sec/batch\n", + "Epoch 3/20 Iteration 360/3560 Training loss: 2.3517 0.0640 sec/batch\n", + "Epoch 3/20 Iteration 361/3560 Training loss: 2.3502 0.0641 sec/batch\n", + "Epoch 3/20 Iteration 362/3560 Training loss: 2.3488 0.0691 sec/batch\n", + "Epoch 3/20 Iteration 363/3560 Training loss: 2.3491 0.0659 sec/batch\n", + "Epoch 3/20 Iteration 364/3560 Training loss: 2.3512 0.0638 sec/batch\n", + "Epoch 3/20 Iteration 365/3560 Training loss: 2.3519 0.0728 sec/batch\n", + "Epoch 3/20 Iteration 366/3560 Training loss: 2.3504 0.0688 sec/batch\n", + "Epoch 3/20 Iteration 367/3560 Training loss: 2.3487 0.0666 sec/batch\n", + "Epoch 3/20 Iteration 368/3560 Training loss: 2.3486 0.0645 sec/batch\n", + "Epoch 
3/20 Iteration 369/3560 Training loss: 2.3486 0.0677 sec/batch\n", + "Epoch 3/20 Iteration 370/3560 Training loss: 2.3504 0.0640 sec/batch\n", + "Epoch 3/20 Iteration 371/3560 Training loss: 2.3508 0.0655 sec/batch\n", + "Epoch 3/20 Iteration 372/3560 Training loss: 2.3500 0.0652 sec/batch\n", + "Epoch 3/20 Iteration 373/3560 Training loss: 2.3491 0.0649 sec/batch\n", + "Epoch 3/20 Iteration 374/3560 Training loss: 2.3504 0.0651 sec/batch\n", + "Epoch 3/20 Iteration 375/3560 Training loss: 2.3500 0.0629 sec/batch\n", + "Epoch 3/20 Iteration 376/3560 Training loss: 2.3484 0.0643 sec/batch\n", + "Epoch 3/20 Iteration 377/3560 Training loss: 2.3478 0.0656 sec/batch\n", + "Epoch 3/20 Iteration 378/3560 Training loss: 2.3483 0.0655 sec/batch\n", + "Epoch 3/20 Iteration 379/3560 Training loss: 2.3479 0.0645 sec/batch\n", + "Epoch 3/20 Iteration 380/3560 Training loss: 2.3468 0.0731 sec/batch\n", + "Epoch 3/20 Iteration 381/3560 Training loss: 2.3462 0.0641 sec/batch\n", + "Epoch 3/20 Iteration 382/3560 Training loss: 2.3457 0.0706 sec/batch\n", + "Epoch 3/20 Iteration 383/3560 Training loss: 2.3450 0.0729 sec/batch\n", + "Epoch 3/20 Iteration 384/3560 Training loss: 2.3444 0.0630 sec/batch\n", + "Epoch 3/20 Iteration 385/3560 Training loss: 2.3447 0.0638 sec/batch\n", + "Epoch 3/20 Iteration 386/3560 Training loss: 2.3445 0.0701 sec/batch\n", + "Epoch 3/20 Iteration 387/3560 Training loss: 2.3445 0.0676 sec/batch\n", + "Epoch 3/20 Iteration 388/3560 Training loss: 2.3438 0.0640 sec/batch\n", + "Epoch 3/20 Iteration 389/3560 Training loss: 2.3428 0.0644 sec/batch\n", + "Epoch 3/20 Iteration 390/3560 Training loss: 2.3428 0.0670 sec/batch\n", + "Epoch 3/20 Iteration 391/3560 Training loss: 2.3421 0.0662 sec/batch\n", + "Epoch 3/20 Iteration 392/3560 Training loss: 2.3419 0.0662 sec/batch\n", + "Epoch 3/20 Iteration 393/3560 Training loss: 2.3412 0.0675 sec/batch\n", + "Epoch 3/20 Iteration 394/3560 Training loss: 2.3402 0.0627 sec/batch\n", + "Epoch 3/20 Iteration 395/3560 
Training loss: 2.3393 0.0661 sec/batch\n", + "Epoch 3/20 Iteration 396/3560 Training loss: 2.3381 0.0652 sec/batch\n", + "Epoch 3/20 Iteration 397/3560 Training loss: 2.3372 0.0653 sec/batch\n", + "Epoch 3/20 Iteration 398/3560 Training loss: 2.3365 0.0662 sec/batch\n", + "Epoch 3/20 Iteration 399/3560 Training loss: 2.3356 0.0653 sec/batch\n", + "Epoch 3/20 Iteration 400/3560 Training loss: 2.3349 0.0657 sec/batch\n", + "Epoch 3/20 Iteration 401/3560 Training loss: 2.3343 0.0681 sec/batch\n", + "Epoch 3/20 Iteration 402/3560 Training loss: 2.3328 0.0621 sec/batch\n", + "Epoch 3/20 Iteration 403/3560 Training loss: 2.3326 0.0677 sec/batch\n", + "Epoch 3/20 Iteration 404/3560 Training loss: 2.3320 0.0676 sec/batch\n", + "Epoch 3/20 Iteration 405/3560 Training loss: 2.3315 0.0662 sec/batch\n", + "Epoch 3/20 Iteration 406/3560 Training loss: 2.3314 0.0666 sec/batch\n", + "Epoch 3/20 Iteration 407/3560 Training loss: 2.3305 0.0682 sec/batch\n", + "Epoch 3/20 Iteration 408/3560 Training loss: 2.3302 0.0639 sec/batch\n", + "Epoch 3/20 Iteration 409/3560 Training loss: 2.3296 0.0653 sec/batch\n", + "Epoch 3/20 Iteration 410/3560 Training loss: 2.3290 0.0638 sec/batch\n", + "Epoch 3/20 Iteration 411/3560 Training loss: 2.3283 0.0646 sec/batch\n", + "Epoch 3/20 Iteration 412/3560 Training loss: 2.3280 0.0742 sec/batch\n", + "Epoch 3/20 Iteration 413/3560 Training loss: 2.3275 0.0662 sec/batch\n", + "Epoch 3/20 Iteration 414/3560 Training loss: 2.3269 0.0654 sec/batch\n", + "Epoch 3/20 Iteration 415/3560 Training loss: 2.3261 0.0679 sec/batch\n", + "Epoch 3/20 Iteration 416/3560 Training loss: 2.3260 0.0709 sec/batch\n", + "Epoch 3/20 Iteration 417/3560 Training loss: 2.3255 0.0643 sec/batch\n", + "Epoch 3/20 Iteration 418/3560 Training loss: 2.3253 0.0652 sec/batch\n", + "Epoch 3/20 Iteration 419/3560 Training loss: 2.3252 0.0638 sec/batch\n", + "Epoch 3/20 Iteration 420/3560 Training loss: 2.3245 0.0650 sec/batch\n", + "Epoch 3/20 Iteration 421/3560 Training loss: 2.3239 
0.0653 sec/batch\n", + "Epoch 3/20 Iteration 422/3560 Training loss: 2.3238 0.0672 sec/batch\n", + "[per-iteration training log condensed: Epoch 3 ends at iteration 534 with loss 2.2839; Epoch 4 (iterations 535-712) falls from 2.2808 to 2.1773; Epoch 5 (iterations 713-890) falls from 2.2059 to 2.1004; Epoch 6 opens at 2.1294 and reaches 2.0571 by iteration 946; batch time holds near 0.06-0.08 sec/batch throughout]\n", + "Epoch 6/20 Iteration 947/3560
Training loss: 2.0571 0.0733 sec/batch\n", + "Epoch 6/20 Iteration 948/3560 Training loss: 2.0567 0.0683 sec/batch\n", + "Epoch 6/20 Iteration 949/3560 Training loss: 2.0562 0.0689 sec/batch\n", + "Epoch 6/20 Iteration 950/3560 Training loss: 2.0564 0.0664 sec/batch\n", + "Epoch 6/20 Iteration 951/3560 Training loss: 2.0562 0.0672 sec/batch\n", + "Epoch 6/20 Iteration 952/3560 Training loss: 2.0566 0.0650 sec/batch\n", + "Epoch 6/20 Iteration 953/3560 Training loss: 2.0570 0.0741 sec/batch\n", + "Epoch 6/20 Iteration 954/3560 Training loss: 2.0569 0.0807 sec/batch\n", + "Epoch 6/20 Iteration 955/3560 Training loss: 2.0566 0.0652 sec/batch\n", + "Epoch 6/20 Iteration 956/3560 Training loss: 2.0569 0.0730 sec/batch\n", + "Epoch 6/20 Iteration 957/3560 Training loss: 2.0569 0.0642 sec/batch\n", + "Epoch 6/20 Iteration 958/3560 Training loss: 2.0564 0.0676 sec/batch\n", + "Epoch 6/20 Iteration 959/3560 Training loss: 2.0561 0.0729 sec/batch\n", + "Epoch 6/20 Iteration 960/3560 Training loss: 2.0559 0.0682 sec/batch\n", + "Epoch 6/20 Iteration 961/3560 Training loss: 2.0560 0.0627 sec/batch\n", + "Epoch 6/20 Iteration 962/3560 Training loss: 2.0561 0.0670 sec/batch\n", + "Epoch 6/20 Iteration 963/3560 Training loss: 2.0562 0.0629 sec/batch\n", + "Epoch 6/20 Iteration 964/3560 Training loss: 2.0559 0.0670 sec/batch\n", + "Epoch 6/20 Iteration 965/3560 Training loss: 2.0557 0.0663 sec/batch\n", + "Epoch 6/20 Iteration 966/3560 Training loss: 2.0559 0.0755 sec/batch\n", + "Epoch 6/20 Iteration 967/3560 Training loss: 2.0556 0.0678 sec/batch\n", + "Epoch 6/20 Iteration 968/3560 Training loss: 2.0556 0.0649 sec/batch\n", + "Epoch 6/20 Iteration 969/3560 Training loss: 2.0551 0.0634 sec/batch\n", + "Epoch 6/20 Iteration 970/3560 Training loss: 2.0547 0.0629 sec/batch\n", + "Epoch 6/20 Iteration 971/3560 Training loss: 2.0543 0.0653 sec/batch\n", + "Epoch 6/20 Iteration 972/3560 Training loss: 2.0542 0.0735 sec/batch\n", + "Epoch 6/20 Iteration 973/3560 Training loss: 2.0537 
0.0646 sec/batch\n", + "Epoch 6/20 Iteration 974/3560 Training loss: 2.0536 0.0622 sec/batch\n", + "Epoch 6/20 Iteration 975/3560 Training loss: 2.0530 0.0660 sec/batch\n", + "Epoch 6/20 Iteration 976/3560 Training loss: 2.0526 0.0633 sec/batch\n", + "Epoch 6/20 Iteration 977/3560 Training loss: 2.0524 0.0646 sec/batch\n", + "Epoch 6/20 Iteration 978/3560 Training loss: 2.0522 0.0648 sec/batch\n", + "Epoch 6/20 Iteration 979/3560 Training loss: 2.0517 0.0662 sec/batch\n", + "Epoch 6/20 Iteration 980/3560 Training loss: 2.0516 0.0639 sec/batch\n", + "Epoch 6/20 Iteration 981/3560 Training loss: 2.0513 0.0670 sec/batch\n", + "Epoch 6/20 Iteration 982/3560 Training loss: 2.0512 0.0698 sec/batch\n", + "Epoch 6/20 Iteration 983/3560 Training loss: 2.0507 0.0713 sec/batch\n", + "Epoch 6/20 Iteration 984/3560 Training loss: 2.0504 0.0686 sec/batch\n", + "Epoch 6/20 Iteration 985/3560 Training loss: 2.0499 0.0707 sec/batch\n", + "Epoch 6/20 Iteration 986/3560 Training loss: 2.0497 0.0662 sec/batch\n", + "Epoch 6/20 Iteration 987/3560 Training loss: 2.0495 0.0647 sec/batch\n", + "Epoch 6/20 Iteration 988/3560 Training loss: 2.0492 0.0647 sec/batch\n", + "Epoch 6/20 Iteration 989/3560 Training loss: 2.0488 0.0782 sec/batch\n", + "Epoch 6/20 Iteration 990/3560 Training loss: 2.0483 0.0669 sec/batch\n", + "Epoch 6/20 Iteration 991/3560 Training loss: 2.0483 0.0714 sec/batch\n", + "Epoch 6/20 Iteration 992/3560 Training loss: 2.0480 0.0678 sec/batch\n", + "Epoch 6/20 Iteration 993/3560 Training loss: 2.0475 0.0689 sec/batch\n", + "Epoch 6/20 Iteration 994/3560 Training loss: 2.0472 0.0685 sec/batch\n", + "Epoch 6/20 Iteration 995/3560 Training loss: 2.0469 0.0671 sec/batch\n", + "Epoch 6/20 Iteration 996/3560 Training loss: 2.0468 0.0681 sec/batch\n", + "Epoch 6/20 Iteration 997/3560 Training loss: 2.0466 0.0743 sec/batch\n", + "Epoch 6/20 Iteration 998/3560 Training loss: 2.0465 0.0715 sec/batch\n", + "Epoch 6/20 Iteration 999/3560 Training loss: 2.0464 0.0667 sec/batch\n", + 
"Epoch 6/20 Iteration 1000/3560 Training loss: 2.0461 0.0665 sec/batch\n", + "Epoch 6/20 Iteration 1001/3560 Training loss: 2.0460 0.0670 sec/batch\n", + "Epoch 6/20 Iteration 1002/3560 Training loss: 2.0458 0.0659 sec/batch\n", + "Epoch 6/20 Iteration 1003/3560 Training loss: 2.0455 0.0655 sec/batch\n", + "Epoch 6/20 Iteration 1004/3560 Training loss: 2.0453 0.0652 sec/batch\n", + "Epoch 6/20 Iteration 1005/3560 Training loss: 2.0449 0.0724 sec/batch\n", + "Epoch 6/20 Iteration 1006/3560 Training loss: 2.0444 0.0809 sec/batch\n", + "Epoch 6/20 Iteration 1007/3560 Training loss: 2.0444 0.0663 sec/batch\n", + "Epoch 6/20 Iteration 1008/3560 Training loss: 2.0442 0.0665 sec/batch\n", + "Epoch 6/20 Iteration 1009/3560 Training loss: 2.0441 0.0662 sec/batch\n", + "Epoch 6/20 Iteration 1010/3560 Training loss: 2.0439 0.0708 sec/batch\n", + "Epoch 6/20 Iteration 1011/3560 Training loss: 2.0439 0.0668 sec/batch\n", + "Epoch 6/20 Iteration 1012/3560 Training loss: 2.0437 0.0662 sec/batch\n", + "Epoch 6/20 Iteration 1013/3560 Training loss: 2.0436 0.0704 sec/batch\n", + "Epoch 6/20 Iteration 1014/3560 Training loss: 2.0436 0.0721 sec/batch\n", + "Epoch 6/20 Iteration 1015/3560 Training loss: 2.0435 0.0711 sec/batch\n", + "Epoch 6/20 Iteration 1016/3560 Training loss: 2.0431 0.0674 sec/batch\n", + "Epoch 6/20 Iteration 1017/3560 Training loss: 2.0430 0.0651 sec/batch\n", + "Epoch 6/20 Iteration 1018/3560 Training loss: 2.0429 0.0652 sec/batch\n", + "Epoch 6/20 Iteration 1019/3560 Training loss: 2.0428 0.0632 sec/batch\n", + "Epoch 6/20 Iteration 1020/3560 Training loss: 2.0426 0.0670 sec/batch\n", + "Epoch 6/20 Iteration 1021/3560 Training loss: 2.0423 0.0656 sec/batch\n", + "Epoch 6/20 Iteration 1022/3560 Training loss: 2.0421 0.0657 sec/batch\n", + "Epoch 6/20 Iteration 1023/3560 Training loss: 2.0421 0.0653 sec/batch\n", + "Epoch 6/20 Iteration 1024/3560 Training loss: 2.0419 0.0690 sec/batch\n", + "Epoch 6/20 Iteration 1025/3560 Training loss: 2.0417 0.0683 sec/batch\n", 
+ "Epoch 6/20 Iteration 1026/3560 Training loss: 2.0417 0.0685 sec/batch\n", + "Epoch 6/20 Iteration 1027/3560 Training loss: 2.0416 0.0665 sec/batch\n", + "Epoch 6/20 Iteration 1028/3560 Training loss: 2.0416 0.0761 sec/batch\n", + "Epoch 6/20 Iteration 1029/3560 Training loss: 2.0416 0.0684 sec/batch\n", + "Epoch 6/20 Iteration 1030/3560 Training loss: 2.0413 0.0658 sec/batch\n", + "Epoch 6/20 Iteration 1031/3560 Training loss: 2.0414 0.0704 sec/batch\n", + "Epoch 6/20 Iteration 1032/3560 Training loss: 2.0412 0.0677 sec/batch\n", + "Epoch 6/20 Iteration 1033/3560 Training loss: 2.0410 0.0701 sec/batch\n", + "Epoch 6/20 Iteration 1034/3560 Training loss: 2.0409 0.0670 sec/batch\n", + "Epoch 6/20 Iteration 1035/3560 Training loss: 2.0407 0.0657 sec/batch\n", + "Epoch 6/20 Iteration 1036/3560 Training loss: 2.0407 0.0698 sec/batch\n", + "Epoch 6/20 Iteration 1037/3560 Training loss: 2.0407 0.0712 sec/batch\n", + "Epoch 6/20 Iteration 1038/3560 Training loss: 2.0407 0.0660 sec/batch\n", + "Epoch 6/20 Iteration 1039/3560 Training loss: 2.0405 0.0762 sec/batch\n", + "Epoch 6/20 Iteration 1040/3560 Training loss: 2.0403 0.0713 sec/batch\n", + "Epoch 6/20 Iteration 1041/3560 Training loss: 2.0401 0.0720 sec/batch\n", + "Epoch 6/20 Iteration 1042/3560 Training loss: 2.0402 0.0646 sec/batch\n", + "Epoch 6/20 Iteration 1043/3560 Training loss: 2.0401 0.0651 sec/batch\n", + "Epoch 6/20 Iteration 1044/3560 Training loss: 2.0401 0.0651 sec/batch\n", + "Epoch 6/20 Iteration 1045/3560 Training loss: 2.0399 0.0638 sec/batch\n", + "Epoch 6/20 Iteration 1046/3560 Training loss: 2.0397 0.0637 sec/batch\n", + "Epoch 6/20 Iteration 1047/3560 Training loss: 2.0396 0.0660 sec/batch\n", + "Epoch 6/20 Iteration 1048/3560 Training loss: 2.0395 0.0635 sec/batch\n", + "Epoch 6/20 Iteration 1049/3560 Training loss: 2.0392 0.0693 sec/batch\n", + "Epoch 6/20 Iteration 1050/3560 Training loss: 2.0392 0.0689 sec/batch\n", + "Epoch 6/20 Iteration 1051/3560 Training loss: 2.0392 0.0645 
sec/batch\n", + "Epoch 6/20 Iteration 1052/3560 Training loss: 2.0390 0.0711 sec/batch\n", + "Epoch 6/20 Iteration 1053/3560 Training loss: 2.0389 0.0708 sec/batch\n", + "Epoch 6/20 Iteration 1054/3560 Training loss: 2.0388 0.0677 sec/batch\n", + "Epoch 6/20 Iteration 1055/3560 Training loss: 2.0387 0.0655 sec/batch\n", + "Epoch 6/20 Iteration 1056/3560 Training loss: 2.0385 0.0719 sec/batch\n", + "Epoch 6/20 Iteration 1057/3560 Training loss: 2.0384 0.0704 sec/batch\n", + "Epoch 6/20 Iteration 1058/3560 Training loss: 2.0385 0.0744 sec/batch\n", + "Epoch 6/20 Iteration 1059/3560 Training loss: 2.0383 0.0664 sec/batch\n", + "Epoch 6/20 Iteration 1060/3560 Training loss: 2.0382 0.0664 sec/batch\n", + "Epoch 6/20 Iteration 1061/3560 Training loss: 2.0380 0.0664 sec/batch\n", + "Epoch 6/20 Iteration 1062/3560 Training loss: 2.0378 0.0665 sec/batch\n", + "Epoch 6/20 Iteration 1063/3560 Training loss: 2.0378 0.0655 sec/batch\n", + "Epoch 6/20 Iteration 1064/3560 Training loss: 2.0377 0.0641 sec/batch\n", + "Epoch 6/20 Iteration 1065/3560 Training loss: 2.0377 0.0662 sec/batch\n", + "Epoch 6/20 Iteration 1066/3560 Training loss: 2.0375 0.0672 sec/batch\n", + "Epoch 6/20 Iteration 1067/3560 Training loss: 2.0373 0.0705 sec/batch\n", + "Epoch 6/20 Iteration 1068/3560 Training loss: 2.0372 0.0675 sec/batch\n", + "Epoch 7/20 Iteration 1069/3560 Training loss: 2.0798 0.0664 sec/batch\n", + "Epoch 7/20 Iteration 1070/3560 Training loss: 2.0381 0.0703 sec/batch\n", + "Epoch 7/20 Iteration 1071/3560 Training loss: 2.0277 0.0674 sec/batch\n", + "Epoch 7/20 Iteration 1072/3560 Training loss: 2.0213 0.0648 sec/batch\n", + "Epoch 7/20 Iteration 1073/3560 Training loss: 2.0184 0.0647 sec/batch\n", + "Epoch 7/20 Iteration 1074/3560 Training loss: 2.0106 0.0658 sec/batch\n", + "Epoch 7/20 Iteration 1075/3560 Training loss: 2.0125 0.0698 sec/batch\n", + "Epoch 7/20 Iteration 1076/3560 Training loss: 2.0123 0.0674 sec/batch\n", + "Epoch 7/20 Iteration 1077/3560 Training loss: 2.0142 
0.0678 sec/batch\n", + "Epoch 7/20 Iteration 1078/3560 Training loss: 2.0134 0.0666 sec/batch\n", + "Epoch 7/20 Iteration 1079/3560 Training loss: 2.0119 0.0652 sec/batch\n", + "Epoch 7/20 Iteration 1080/3560 Training loss: 2.0107 0.0702 sec/batch\n", + "Epoch 7/20 Iteration 1081/3560 Training loss: 2.0112 0.0664 sec/batch\n", + "Epoch 7/20 Iteration 1082/3560 Training loss: 2.0140 0.0640 sec/batch\n", + "Epoch 7/20 Iteration 1083/3560 Training loss: 2.0126 0.0653 sec/batch\n", + "Epoch 7/20 Iteration 1084/3560 Training loss: 2.0114 0.0719 sec/batch\n", + "Epoch 7/20 Iteration 1085/3560 Training loss: 2.0113 0.0673 sec/batch\n", + "Epoch 7/20 Iteration 1086/3560 Training loss: 2.0140 0.0652 sec/batch\n", + "Epoch 7/20 Iteration 1087/3560 Training loss: 2.0137 0.0749 sec/batch\n", + "Epoch 7/20 Iteration 1088/3560 Training loss: 2.0134 0.0769 sec/batch\n", + "Epoch 7/20 Iteration 1089/3560 Training loss: 2.0126 0.0780 sec/batch\n", + "Epoch 7/20 Iteration 1090/3560 Training loss: 2.0137 0.0655 sec/batch\n", + "Epoch 7/20 Iteration 1091/3560 Training loss: 2.0127 0.0686 sec/batch\n", + "Epoch 7/20 Iteration 1092/3560 Training loss: 2.0117 0.0725 sec/batch\n", + "Epoch 7/20 Iteration 1093/3560 Training loss: 2.0111 0.0686 sec/batch\n", + "Epoch 7/20 Iteration 1094/3560 Training loss: 2.0102 0.0688 sec/batch\n", + "Epoch 7/20 Iteration 1095/3560 Training loss: 2.0094 0.0721 sec/batch\n", + "Epoch 7/20 Iteration 1096/3560 Training loss: 2.0097 0.0657 sec/batch\n", + "Epoch 7/20 Iteration 1097/3560 Training loss: 2.0103 0.0666 sec/batch\n", + "Epoch 7/20 Iteration 1098/3560 Training loss: 2.0104 0.0651 sec/batch\n", + "Epoch 7/20 Iteration 1099/3560 Training loss: 2.0103 0.0702 sec/batch\n", + "Epoch 7/20 Iteration 1100/3560 Training loss: 2.0096 0.0640 sec/batch\n", + "Epoch 7/20 Iteration 1101/3560 Training loss: 2.0095 0.0703 sec/batch\n", + "Epoch 7/20 Iteration 1102/3560 Training loss: 2.0101 0.0742 sec/batch\n", + "Epoch 7/20 Iteration 1103/3560 Training loss: 
2.0094 0.0679 sec/batch\n", + "Epoch 7/20 Iteration 1104/3560 Training loss: 2.0091 0.0658 sec/batch\n", + "Epoch 7/20 Iteration 1105/3560 Training loss: 2.0084 0.0655 sec/batch\n", + "Epoch 7/20 Iteration 1106/3560 Training loss: 2.0070 0.0653 sec/batch\n", + "Epoch 7/20 Iteration 1107/3560 Training loss: 2.0057 0.0738 sec/batch\n", + "Epoch 7/20 Iteration 1108/3560 Training loss: 2.0048 0.0685 sec/batch\n", + "Epoch 7/20 Iteration 1109/3560 Training loss: 2.0039 0.0651 sec/batch\n", + "Epoch 7/20 Iteration 1110/3560 Training loss: 2.0036 0.0685 sec/batch\n", + "Epoch 7/20 Iteration 1111/3560 Training loss: 2.0028 0.0637 sec/batch\n", + "Epoch 7/20 Iteration 1112/3560 Training loss: 2.0020 0.0712 sec/batch\n", + "Epoch 7/20 Iteration 1113/3560 Training loss: 2.0018 0.0837 sec/batch\n", + "Epoch 7/20 Iteration 1114/3560 Training loss: 2.0006 0.0644 sec/batch\n", + "Epoch 7/20 Iteration 1115/3560 Training loss: 2.0005 0.0720 sec/batch\n", + "Epoch 7/20 Iteration 1116/3560 Training loss: 2.0000 0.0659 sec/batch\n", + "Epoch 7/20 Iteration 1117/3560 Training loss: 1.9996 0.0685 sec/batch\n", + "Epoch 7/20 Iteration 1118/3560 Training loss: 2.0003 0.0659 sec/batch\n", + "Epoch 7/20 Iteration 1119/3560 Training loss: 1.9996 0.0819 sec/batch\n", + "Epoch 7/20 Iteration 1120/3560 Training loss: 2.0002 0.0726 sec/batch\n", + "Epoch 7/20 Iteration 1121/3560 Training loss: 1.9997 0.0673 sec/batch\n", + "Epoch 7/20 Iteration 1122/3560 Training loss: 1.9993 0.0661 sec/batch\n", + "Epoch 7/20 Iteration 1123/3560 Training loss: 1.9989 0.0659 sec/batch\n", + "Epoch 7/20 Iteration 1124/3560 Training loss: 1.9989 0.0662 sec/batch\n", + "Epoch 7/20 Iteration 1125/3560 Training loss: 1.9989 0.0673 sec/batch\n", + "Epoch 7/20 Iteration 1126/3560 Training loss: 1.9987 0.0638 sec/batch\n", + "Epoch 7/20 Iteration 1127/3560 Training loss: 1.9983 0.0664 sec/batch\n", + "Epoch 7/20 Iteration 1128/3560 Training loss: 1.9983 0.0636 sec/batch\n", + "Epoch 7/20 Iteration 1129/3560 Training 
loss: 1.9982 0.0693 sec/batch\n", + "Epoch 7/20 Iteration 1130/3560 Training loss: 1.9986 0.0689 sec/batch\n", + "Epoch 7/20 Iteration 1131/3560 Training loss: 1.9988 0.0673 sec/batch\n", + "Epoch 7/20 Iteration 1132/3560 Training loss: 1.9989 0.0738 sec/batch\n", + "Epoch 7/20 Iteration 1133/3560 Training loss: 1.9986 0.0632 sec/batch\n", + "Epoch 7/20 Iteration 1134/3560 Training loss: 1.9988 0.0647 sec/batch\n", + "Epoch 7/20 Iteration 1135/3560 Training loss: 1.9988 0.0690 sec/batch\n", + "Epoch 7/20 Iteration 1136/3560 Training loss: 1.9982 0.0758 sec/batch\n", + "Epoch 7/20 Iteration 1137/3560 Training loss: 1.9979 0.0739 sec/batch\n", + "Epoch 7/20 Iteration 1138/3560 Training loss: 1.9978 0.0734 sec/batch\n", + "Epoch 7/20 Iteration 1139/3560 Training loss: 1.9980 0.0667 sec/batch\n", + "Epoch 7/20 Iteration 1140/3560 Training loss: 1.9981 0.0634 sec/batch\n", + "Epoch 7/20 Iteration 1141/3560 Training loss: 1.9983 0.0670 sec/batch\n", + "Epoch 7/20 Iteration 1142/3560 Training loss: 1.9981 0.0638 sec/batch\n", + "Epoch 7/20 Iteration 1143/3560 Training loss: 1.9979 0.0631 sec/batch\n", + "Epoch 7/20 Iteration 1144/3560 Training loss: 1.9981 0.0642 sec/batch\n", + "Epoch 7/20 Iteration 1145/3560 Training loss: 1.9981 0.0627 sec/batch\n", + "Epoch 7/20 Iteration 1146/3560 Training loss: 1.9981 0.0650 sec/batch\n", + "Epoch 7/20 Iteration 1147/3560 Training loss: 1.9976 0.0664 sec/batch\n", + "Epoch 7/20 Iteration 1148/3560 Training loss: 1.9973 0.0747 sec/batch\n", + "Epoch 7/20 Iteration 1149/3560 Training loss: 1.9966 0.0664 sec/batch\n", + "Epoch 7/20 Iteration 1150/3560 Training loss: 1.9966 0.0772 sec/batch\n", + "Epoch 7/20 Iteration 1151/3560 Training loss: 1.9961 0.0658 sec/batch\n", + "Epoch 7/20 Iteration 1152/3560 Training loss: 1.9957 0.0657 sec/batch\n", + "Epoch 7/20 Iteration 1153/3560 Training loss: 1.9951 0.0645 sec/batch\n", + "Epoch 7/20 Iteration 1154/3560 Training loss: 1.9948 0.0652 sec/batch\n", + "Epoch 7/20 Iteration 1155/3560 
Training loss: 1.9947 0.0651 sec/batch\n", + "Epoch 7/20 Iteration 1156/3560 Training loss: 1.9944 0.0653 sec/batch\n", + "Epoch 7/20 Iteration 1157/3560 Training loss: 1.9939 0.0753 sec/batch\n", + "Epoch 7/20 Iteration 1158/3560 Training loss: 1.9939 0.0693 sec/batch\n", + "Epoch 7/20 Iteration 1159/3560 Training loss: 1.9934 0.0676 sec/batch\n", + "Epoch 7/20 Iteration 1160/3560 Training loss: 1.9933 0.0660 sec/batch\n", + "Epoch 7/20 Iteration 1161/3560 Training loss: 1.9930 0.0697 sec/batch\n", + "Epoch 7/20 Iteration 1162/3560 Training loss: 1.9924 0.0651 sec/batch\n", + "Epoch 7/20 Iteration 1163/3560 Training loss: 1.9920 0.0646 sec/batch\n", + "Epoch 7/20 Iteration 1164/3560 Training loss: 1.9918 0.0629 sec/batch\n", + "Epoch 7/20 Iteration 1165/3560 Training loss: 1.9916 0.0654 sec/batch\n", + "Epoch 7/20 Iteration 1166/3560 Training loss: 1.9912 0.0660 sec/batch\n", + "Epoch 7/20 Iteration 1167/3560 Training loss: 1.9908 0.0682 sec/batch\n", + "Epoch 7/20 Iteration 1168/3560 Training loss: 1.9904 0.0671 sec/batch\n", + "Epoch 7/20 Iteration 1169/3560 Training loss: 1.9902 0.0671 sec/batch\n", + "Epoch 7/20 Iteration 1170/3560 Training loss: 1.9900 0.0724 sec/batch\n", + "Epoch 7/20 Iteration 1171/3560 Training loss: 1.9896 0.0637 sec/batch\n", + "Epoch 7/20 Iteration 1172/3560 Training loss: 1.9893 0.0626 sec/batch\n", + "Epoch 7/20 Iteration 1173/3560 Training loss: 1.9890 0.0744 sec/batch\n", + "Epoch 7/20 Iteration 1174/3560 Training loss: 1.9889 0.0710 sec/batch\n", + "Epoch 7/20 Iteration 1175/3560 Training loss: 1.9888 0.0706 sec/batch\n", + "Epoch 7/20 Iteration 1176/3560 Training loss: 1.9887 0.0661 sec/batch\n", + "Epoch 7/20 Iteration 1177/3560 Training loss: 1.9886 0.0656 sec/batch\n", + "Epoch 7/20 Iteration 1178/3560 Training loss: 1.9885 0.0644 sec/batch\n", + "Epoch 7/20 Iteration 1179/3560 Training loss: 1.9884 0.0672 sec/batch\n", + "Epoch 7/20 Iteration 1180/3560 Training loss: 1.9881 0.0653 sec/batch\n", + "Epoch 7/20 Iteration 
1181/3560 Training loss: 1.9879 0.0693 sec/batch\n", + "Epoch 7/20 Iteration 1182/3560 Training loss: 1.9877 0.0671 sec/batch\n", + "Epoch 7/20 Iteration 1183/3560 Training loss: 1.9874 0.0661 sec/batch\n", + "Epoch 7/20 Iteration 1184/3560 Training loss: 1.9870 0.0646 sec/batch\n", + "Epoch 7/20 Iteration 1185/3560 Training loss: 1.9868 0.0650 sec/batch\n", + "Epoch 7/20 Iteration 1186/3560 Training loss: 1.9868 0.0636 sec/batch\n", + "Epoch 7/20 Iteration 1187/3560 Training loss: 1.9867 0.0705 sec/batch\n", + "Epoch 7/20 Iteration 1188/3560 Training loss: 1.9865 0.0660 sec/batch\n", + "Epoch 7/20 Iteration 1189/3560 Training loss: 1.9864 0.0625 sec/batch\n", + "Epoch 7/20 Iteration 1190/3560 Training loss: 1.9862 0.0654 sec/batch\n", + "Epoch 7/20 Iteration 1191/3560 Training loss: 1.9859 0.0667 sec/batch\n", + "Epoch 7/20 Iteration 1192/3560 Training loss: 1.9860 0.0657 sec/batch\n", + "Epoch 7/20 Iteration 1193/3560 Training loss: 1.9859 0.0644 sec/batch\n", + "Epoch 7/20 Iteration 1194/3560 Training loss: 1.9855 0.0666 sec/batch\n", + "Epoch 7/20 Iteration 1195/3560 Training loss: 1.9854 0.0636 sec/batch\n", + "Epoch 7/20 Iteration 1196/3560 Training loss: 1.9854 0.0659 sec/batch\n", + "Epoch 7/20 Iteration 1197/3560 Training loss: 1.9854 0.0618 sec/batch\n", + "Epoch 7/20 Iteration 1198/3560 Training loss: 1.9852 0.0789 sec/batch\n", + "Epoch 7/20 Iteration 1199/3560 Training loss: 1.9851 0.0686 sec/batch\n", + "Epoch 7/20 Iteration 1200/3560 Training loss: 1.9848 0.0792 sec/batch\n", + "Epoch 7/20 Iteration 1201/3560 Training loss: 1.9848 0.0639 sec/batch\n", + "Epoch 7/20 Iteration 1202/3560 Training loss: 1.9848 0.0643 sec/batch\n", + "Epoch 7/20 Iteration 1203/3560 Training loss: 1.9846 0.0692 sec/batch\n", + "Epoch 7/20 Iteration 1204/3560 Training loss: 1.9845 0.0654 sec/batch\n", + "Epoch 7/20 Iteration 1205/3560 Training loss: 1.9845 0.0649 sec/batch\n", + "Epoch 7/20 Iteration 1206/3560 Training loss: 1.9844 0.0657 sec/batch\n", + "Epoch 7/20 
Iteration 1207/3560 Training loss: 1.9844 0.0688 sec/batch\n", + "Epoch 7/20 Iteration 1208/3560 Training loss: 1.9842 0.0671 sec/batch\n", + "Epoch 7/20 Iteration 1209/3560 Training loss: 1.9843 0.0688 sec/batch\n", + "Epoch 7/20 Iteration 1210/3560 Training loss: 1.9840 0.0631 sec/batch\n", + "Epoch 7/20 Iteration 1211/3560 Training loss: 1.9839 0.0668 sec/batch\n", + "Epoch 7/20 Iteration 1212/3560 Training loss: 1.9839 0.0701 sec/batch\n", + "Epoch 7/20 Iteration 1213/3560 Training loss: 1.9836 0.0702 sec/batch\n", + "Epoch 7/20 Iteration 1214/3560 Training loss: 1.9836 0.0626 sec/batch\n", + "Epoch 7/20 Iteration 1215/3560 Training loss: 1.9836 0.0644 sec/batch\n", + "Epoch 7/20 Iteration 1216/3560 Training loss: 1.9836 0.0646 sec/batch\n", + "Epoch 7/20 Iteration 1217/3560 Training loss: 1.9835 0.0636 sec/batch\n", + "Epoch 7/20 Iteration 1218/3560 Training loss: 1.9833 0.0726 sec/batch\n", + "Epoch 7/20 Iteration 1219/3560 Training loss: 1.9831 0.0672 sec/batch\n", + "Epoch 7/20 Iteration 1220/3560 Training loss: 1.9832 0.0667 sec/batch\n", + "Epoch 7/20 Iteration 1221/3560 Training loss: 1.9832 0.0653 sec/batch\n", + "Epoch 7/20 Iteration 1222/3560 Training loss: 1.9831 0.0637 sec/batch\n", + "Epoch 7/20 Iteration 1223/3560 Training loss: 1.9830 0.0640 sec/batch\n", + "Epoch 7/20 Iteration 1224/3560 Training loss: 1.9829 0.0634 sec/batch\n", + "Epoch 7/20 Iteration 1225/3560 Training loss: 1.9829 0.0670 sec/batch\n", + "Epoch 7/20 Iteration 1226/3560 Training loss: 1.9828 0.0674 sec/batch\n", + "Epoch 7/20 Iteration 1227/3560 Training loss: 1.9826 0.0688 sec/batch\n", + "Epoch 7/20 Iteration 1228/3560 Training loss: 1.9826 0.0641 sec/batch\n", + "Epoch 7/20 Iteration 1229/3560 Training loss: 1.9827 0.0647 sec/batch\n", + "Epoch 7/20 Iteration 1230/3560 Training loss: 1.9825 0.0649 sec/batch\n", + "Epoch 7/20 Iteration 1231/3560 Training loss: 1.9825 0.0727 sec/batch\n", + "Epoch 7/20 Iteration 1232/3560 Training loss: 1.9825 0.0736 sec/batch\n", + "Epoch 
7/20 Iteration 1233/3560 Training loss: 1.9823 0.0675 sec/batch\n", + "Epoch 7/20 Iteration 1234/3560 Training loss: 1.9822 0.0653 sec/batch\n", + "Epoch 7/20 Iteration 1235/3560 Training loss: 1.9821 0.0650 sec/batch\n", + "Epoch 7/20 Iteration 1236/3560 Training loss: 1.9823 0.0710 sec/batch\n", + "Epoch 7/20 Iteration 1237/3560 Training loss: 1.9822 0.0666 sec/batch\n", + "Epoch 7/20 Iteration 1238/3560 Training loss: 1.9821 0.0765 sec/batch\n", + "Epoch 7/20 Iteration 1239/3560 Training loss: 1.9819 0.0756 sec/batch\n", + "Epoch 7/20 Iteration 1240/3560 Training loss: 1.9817 0.0644 sec/batch\n", + "Epoch 7/20 Iteration 1241/3560 Training loss: 1.9818 0.0734 sec/batch\n", + "Epoch 7/20 Iteration 1242/3560 Training loss: 1.9818 0.0657 sec/batch\n", + "Epoch 7/20 Iteration 1243/3560 Training loss: 1.9817 0.0662 sec/batch\n", + "Epoch 7/20 Iteration 1244/3560 Training loss: 1.9816 0.0674 sec/batch\n", + "Epoch 7/20 Iteration 1245/3560 Training loss: 1.9814 0.0714 sec/batch\n", + "Epoch 7/20 Iteration 1246/3560 Training loss: 1.9813 0.0648 sec/batch\n", + "Epoch 8/20 Iteration 1247/3560 Training loss: 2.0264 0.0702 sec/batch\n", + "Epoch 8/20 Iteration 1248/3560 Training loss: 1.9843 0.0700 sec/batch\n", + "Epoch 8/20 Iteration 1249/3560 Training loss: 1.9707 0.0731 sec/batch\n", + "Epoch 8/20 Iteration 1250/3560 Training loss: 1.9648 0.0722 sec/batch\n", + "Epoch 8/20 Iteration 1251/3560 Training loss: 1.9621 0.0675 sec/batch\n", + "Epoch 8/20 Iteration 1252/3560 Training loss: 1.9544 0.0665 sec/batch\n", + "Epoch 8/20 Iteration 1253/3560 Training loss: 1.9553 0.0664 sec/batch\n", + "Epoch 8/20 Iteration 1254/3560 Training loss: 1.9565 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1255/3560 Training loss: 1.9596 0.0659 sec/batch\n", + "Epoch 8/20 Iteration 1256/3560 Training loss: 1.9602 0.0643 sec/batch\n", + "Epoch 8/20 Iteration 1257/3560 Training loss: 1.9571 0.0666 sec/batch\n", + "Epoch 8/20 Iteration 1258/3560 Training loss: 1.9557 0.0678 sec/batch\n", + 
"Epoch 8/20 Iteration 1259/3560 Training loss: 1.9561 0.0649 sec/batch\n", + "Epoch 8/20 Iteration 1260/3560 Training loss: 1.9588 0.0631 sec/batch\n", + "Epoch 8/20 Iteration 1261/3560 Training loss: 1.9582 0.0639 sec/batch\n", + "Epoch 8/20 Iteration 1262/3560 Training loss: 1.9571 0.0677 sec/batch\n", + "Epoch 8/20 Iteration 1263/3560 Training loss: 1.9569 0.0679 sec/batch\n", + "Epoch 8/20 Iteration 1264/3560 Training loss: 1.9587 0.0635 sec/batch\n", + "Epoch 8/20 Iteration 1265/3560 Training loss: 1.9583 0.0651 sec/batch\n", + "Epoch 8/20 Iteration 1266/3560 Training loss: 1.9583 0.0643 sec/batch\n", + "Epoch 8/20 Iteration 1267/3560 Training loss: 1.9581 0.0700 sec/batch\n", + "Epoch 8/20 Iteration 1268/3560 Training loss: 1.9595 0.0696 sec/batch\n", + "Epoch 8/20 Iteration 1269/3560 Training loss: 1.9587 0.0690 sec/batch\n", + "Epoch 8/20 Iteration 1270/3560 Training loss: 1.9577 0.0652 sec/batch\n", + "Epoch 8/20 Iteration 1271/3560 Training loss: 1.9569 0.0666 sec/batch\n", + "Epoch 8/20 Iteration 1272/3560 Training loss: 1.9559 0.0724 sec/batch\n", + "Epoch 8/20 Iteration 1273/3560 Training loss: 1.9551 0.0641 sec/batch\n", + "Epoch 8/20 Iteration 1274/3560 Training loss: 1.9551 0.0636 sec/batch\n", + "Epoch 8/20 Iteration 1275/3560 Training loss: 1.9558 0.0763 sec/batch\n", + "Epoch 8/20 Iteration 1276/3560 Training loss: 1.9563 0.0690 sec/batch\n", + "Epoch 8/20 Iteration 1277/3560 Training loss: 1.9561 0.0658 sec/batch\n", + "Epoch 8/20 Iteration 1278/3560 Training loss: 1.9555 0.0644 sec/batch\n", + "Epoch 8/20 Iteration 1279/3560 Training loss: 1.9559 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1280/3560 Training loss: 1.9564 0.0686 sec/batch\n", + "Epoch 8/20 Iteration 1281/3560 Training loss: 1.9562 0.0732 sec/batch\n", + "Epoch 8/20 Iteration 1282/3560 Training loss: 1.9558 0.0647 sec/batch\n", + "Epoch 8/20 Iteration 1283/3560 Training loss: 1.9553 0.0636 sec/batch\n", + "Epoch 8/20 Iteration 1284/3560 Training loss: 1.9545 0.0644 sec/batch\n", 
+ "Epoch 8/20 Iteration 1285/3560 Training loss: 1.9533 0.0700 sec/batch\n", + "Epoch 8/20 Iteration 1286/3560 Training loss: 1.9527 0.0642 sec/batch\n", + "Epoch 8/20 Iteration 1287/3560 Training loss: 1.9521 0.0676 sec/batch\n", + "Epoch 8/20 Iteration 1288/3560 Training loss: 1.9519 0.0684 sec/batch\n", + "Epoch 8/20 Iteration 1289/3560 Training loss: 1.9511 0.0664 sec/batch\n", + "Epoch 8/20 Iteration 1290/3560 Training loss: 1.9504 0.0685 sec/batch\n", + "Epoch 8/20 Iteration 1291/3560 Training loss: 1.9505 0.0671 sec/batch\n", + "Epoch 8/20 Iteration 1292/3560 Training loss: 1.9491 0.0678 sec/batch\n", + "Epoch 8/20 Iteration 1293/3560 Training loss: 1.9490 0.0630 sec/batch\n", + "Epoch 8/20 Iteration 1294/3560 Training loss: 1.9486 0.0794 sec/batch\n", + "Epoch 8/20 Iteration 1295/3560 Training loss: 1.9484 0.0666 sec/batch\n", + "Epoch 8/20 Iteration 1296/3560 Training loss: 1.9488 0.0633 sec/batch\n", + "Epoch 8/20 Iteration 1297/3560 Training loss: 1.9483 0.0680 sec/batch\n", + "Epoch 8/20 Iteration 1298/3560 Training loss: 1.9491 0.0666 sec/batch\n", + "Epoch 8/20 Iteration 1299/3560 Training loss: 1.9487 0.0686 sec/batch\n", + "Epoch 8/20 Iteration 1300/3560 Training loss: 1.9485 0.0745 sec/batch\n", + "Epoch 8/20 Iteration 1301/3560 Training loss: 1.9483 0.0669 sec/batch\n", + "Epoch 8/20 Iteration 1302/3560 Training loss: 1.9486 0.0686 sec/batch\n", + "Epoch 8/20 Iteration 1303/3560 Training loss: 1.9487 0.0660 sec/batch\n", + "Epoch 8/20 Iteration 1304/3560 Training loss: 1.9484 0.0643 sec/batch\n", + "Epoch 8/20 Iteration 1305/3560 Training loss: 1.9481 0.0662 sec/batch\n", + "Epoch 8/20 Iteration 1306/3560 Training loss: 1.9485 0.0725 sec/batch\n", + "Epoch 8/20 Iteration 1307/3560 Training loss: 1.9483 0.0701 sec/batch\n", + "Epoch 8/20 Iteration 1308/3560 Training loss: 1.9489 0.0640 sec/batch\n", + "Epoch 8/20 Iteration 1309/3560 Training loss: 1.9493 0.0662 sec/batch\n", + "Epoch 8/20 Iteration 1310/3560 Training loss: 1.9493 0.0640 
sec/batch\n", + "Epoch 8/20 Iteration 1311/3560 Training loss: 1.9493 0.0760 sec/batch\n", + "Epoch 8/20 Iteration 1312/3560 Training loss: 1.9495 0.0649 sec/batch\n", + "Epoch 8/20 Iteration 1313/3560 Training loss: 1.9496 0.0668 sec/batch\n", + "Epoch 8/20 Iteration 1314/3560 Training loss: 1.9491 0.0638 sec/batch\n", + "Epoch 8/20 Iteration 1315/3560 Training loss: 1.9489 0.0638 sec/batch\n", + "Epoch 8/20 Iteration 1316/3560 Training loss: 1.9489 0.0762 sec/batch\n", + "Epoch 8/20 Iteration 1317/3560 Training loss: 1.9492 0.0639 sec/batch\n", + "Epoch 8/20 Iteration 1318/3560 Training loss: 1.9493 0.0735 sec/batch\n", + "Epoch 8/20 Iteration 1319/3560 Training loss: 1.9496 0.0673 sec/batch\n", + "Epoch 8/20 Iteration 1320/3560 Training loss: 1.9493 0.0652 sec/batch\n", + "Epoch 8/20 Iteration 1321/3560 Training loss: 1.9492 0.0716 sec/batch\n", + "Epoch 8/20 Iteration 1322/3560 Training loss: 1.9494 0.0656 sec/batch\n", + "Epoch 8/20 Iteration 1323/3560 Training loss: 1.9494 0.0650 sec/batch\n", + "Epoch 8/20 Iteration 1324/3560 Training loss: 1.9493 0.0649 sec/batch\n", + "Epoch 8/20 Iteration 1325/3560 Training loss: 1.9488 0.0703 sec/batch\n", + "Epoch 8/20 Iteration 1326/3560 Training loss: 1.9487 0.0698 sec/batch\n", + "Epoch 8/20 Iteration 1327/3560 Training loss: 1.9481 0.0642 sec/batch\n", + "Epoch 8/20 Iteration 1328/3560 Training loss: 1.9481 0.0632 sec/batch\n", + "Epoch 8/20 Iteration 1329/3560 Training loss: 1.9477 0.0654 sec/batch\n", + "Epoch 8/20 Iteration 1330/3560 Training loss: 1.9476 0.0660 sec/batch\n", + "Epoch 8/20 Iteration 1331/3560 Training loss: 1.9470 0.0650 sec/batch\n", + "Epoch 8/20 Iteration 1332/3560 Training loss: 1.9467 0.0655 sec/batch\n", + "Epoch 8/20 Iteration 1333/3560 Training loss: 1.9466 0.0636 sec/batch\n", + "Epoch 8/20 Iteration 1334/3560 Training loss: 1.9464 0.0639 sec/batch\n", + "Epoch 8/20 Iteration 1335/3560 Training loss: 1.9459 0.0683 sec/batch\n", + "Epoch 8/20 Iteration 1336/3560 Training loss: 1.9458 
0.0631 sec/batch\n", + "Epoch 8/20 Iteration 1337/3560 Training loss: 1.9454 0.0648 sec/batch\n", + "Epoch 8/20 Iteration 1338/3560 Training loss: 1.9453 0.0697 sec/batch\n", + "Epoch 8/20 Iteration 1339/3560 Training loss: 1.9449 0.0717 sec/batch\n", + "Epoch 8/20 Iteration 1340/3560 Training loss: 1.9444 0.0720 sec/batch\n", + "Epoch 8/20 Iteration 1341/3560 Training loss: 1.9439 0.0654 sec/batch\n", + "Epoch 8/20 Iteration 1342/3560 Training loss: 1.9439 0.0650 sec/batch\n", + "Epoch 8/20 Iteration 1343/3560 Training loss: 1.9437 0.0648 sec/batch\n", + "Epoch 8/20 Iteration 1344/3560 Training loss: 1.9433 0.0725 sec/batch\n", + "Epoch 8/20 Iteration 1345/3560 Training loss: 1.9429 0.0670 sec/batch\n", + "Epoch 8/20 Iteration 1346/3560 Training loss: 1.9424 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1347/3560 Training loss: 1.9423 0.0639 sec/batch\n", + "Epoch 8/20 Iteration 1348/3560 Training loss: 1.9421 0.0644 sec/batch\n", + "Epoch 8/20 Iteration 1349/3560 Training loss: 1.9418 0.0669 sec/batch\n", + "Epoch 8/20 Iteration 1350/3560 Training loss: 1.9415 0.0633 sec/batch\n", + "Epoch 8/20 Iteration 1351/3560 Training loss: 1.9412 0.0714 sec/batch\n", + "Epoch 8/20 Iteration 1352/3560 Training loss: 1.9411 0.0694 sec/batch\n", + "Epoch 8/20 Iteration 1353/3560 Training loss: 1.9409 0.0681 sec/batch\n", + "Epoch 8/20 Iteration 1354/3560 Training loss: 1.9408 0.0628 sec/batch\n", + "Epoch 8/20 Iteration 1355/3560 Training loss: 1.9407 0.0647 sec/batch\n", + "Epoch 8/20 Iteration 1356/3560 Training loss: 1.9406 0.0640 sec/batch\n", + "Epoch 8/20 Iteration 1357/3560 Training loss: 1.9406 0.0771 sec/batch\n", + "Epoch 8/20 Iteration 1358/3560 Training loss: 1.9404 0.0645 sec/batch\n", + "Epoch 8/20 Iteration 1359/3560 Training loss: 1.9402 0.0629 sec/batch\n", + "Epoch 8/20 Iteration 1360/3560 Training loss: 1.9400 0.0689 sec/batch\n", + "Epoch 8/20 Iteration 1361/3560 Training loss: 1.9398 0.0655 sec/batch\n", + "Epoch 8/20 Iteration 1362/3560 Training loss: 
1.9394 0.0626 sec/batch\n", + "Epoch 8/20 Iteration 1363/3560 Training loss: 1.9392 0.0745 sec/batch\n", + "Epoch 8/20 Iteration 1364/3560 Training loss: 1.9392 0.0651 sec/batch\n", + "Epoch 8/20 Iteration 1365/3560 Training loss: 1.9390 0.0691 sec/batch\n", + "Epoch 8/20 Iteration 1366/3560 Training loss: 1.9390 0.0657 sec/batch\n", + "Epoch 8/20 Iteration 1367/3560 Training loss: 1.9389 0.0625 sec/batch\n", + "Epoch 8/20 Iteration 1368/3560 Training loss: 1.9387 0.0645 sec/batch\n", + "Epoch 8/20 Iteration 1369/3560 Training loss: 1.9384 0.0649 sec/batch\n", + "Epoch 8/20 Iteration 1370/3560 Training loss: 1.9384 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1371/3560 Training loss: 1.9382 0.0650 sec/batch\n", + "Epoch 8/20 Iteration 1372/3560 Training loss: 1.9378 0.0664 sec/batch\n", + "Epoch 8/20 Iteration 1373/3560 Training loss: 1.9378 0.0660 sec/batch\n", + "Epoch 8/20 Iteration 1374/3560 Training loss: 1.9379 0.0669 sec/batch\n", + "Epoch 8/20 Iteration 1375/3560 Training loss: 1.9378 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1376/3560 Training loss: 1.9377 0.0675 sec/batch\n", + "Epoch 8/20 Iteration 1377/3560 Training loss: 1.9374 0.0673 sec/batch\n", + "Epoch 8/20 Iteration 1378/3560 Training loss: 1.9372 0.0651 sec/batch\n", + "Epoch 8/20 Iteration 1379/3560 Training loss: 1.9371 0.0636 sec/batch\n", + "Epoch 8/20 Iteration 1380/3560 Training loss: 1.9372 0.0640 sec/batch\n", + "Epoch 8/20 Iteration 1381/3560 Training loss: 1.9371 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1382/3560 Training loss: 1.9370 0.0696 sec/batch\n", + "Epoch 8/20 Iteration 1383/3560 Training loss: 1.9371 0.0670 sec/batch\n", + "Epoch 8/20 Iteration 1384/3560 Training loss: 1.9371 0.0646 sec/batch\n", + "Epoch 8/20 Iteration 1385/3560 Training loss: 1.9371 0.0719 sec/batch\n", + "Epoch 8/20 Iteration 1386/3560 Training loss: 1.9369 0.0628 sec/batch\n", + "Epoch 8/20 Iteration 1387/3560 Training loss: 1.9370 0.0638 sec/batch\n", + "Epoch 8/20 Iteration 1388/3560 Training 
loss: 1.9368 0.0648 sec/batch\n", + "Epoch 8/20 Iteration 1389/3560 Training loss: 1.9368 0.0624 sec/batch\n", + "Epoch 8/20 Iteration 1390/3560 Training loss: 1.9367 0.0651 sec/batch\n", + "Epoch 8/20 Iteration 1391/3560 Training loss: 1.9366 0.0625 sec/batch\n", + "Epoch 8/20 Iteration 1392/3560 Training loss: 1.9366 0.0634 sec/batch\n", + "Epoch 8/20 Iteration 1393/3560 Training loss: 1.9366 0.0638 sec/batch\n", + "Epoch 8/20 Iteration 1394/3560 Training loss: 1.9366 0.0769 sec/batch\n", + "Epoch 8/20 Iteration 1395/3560 Training loss: 1.9365 0.0652 sec/batch\n", + "Epoch 8/20 Iteration 1396/3560 Training loss: 1.9363 0.0673 sec/batch\n", + "Epoch 8/20 Iteration 1397/3560 Training loss: 1.9361 0.0710 sec/batch\n", + "Epoch 8/20 Iteration 1398/3560 Training loss: 1.9363 0.0650 sec/batch\n", + "Epoch 8/20 Iteration 1399/3560 Training loss: 1.9362 0.0630 sec/batch\n", + "Epoch 8/20 Iteration 1400/3560 Training loss: 1.9362 0.0753 sec/batch\n", + "Epoch 8/20 Iteration 1401/3560 Training loss: 1.9361 0.0659 sec/batch\n", + "Epoch 8/20 Iteration 1402/3560 Training loss: 1.9360 0.0657 sec/batch\n", + "Epoch 8/20 Iteration 1403/3560 Training loss: 1.9360 0.0684 sec/batch\n", + "Epoch 8/20 Iteration 1404/3560 Training loss: 1.9359 0.0645 sec/batch\n", + "Epoch 8/20 Iteration 1405/3560 Training loss: 1.9357 0.0715 sec/batch\n", + "Epoch 8/20 Iteration 1406/3560 Training loss: 1.9357 0.0660 sec/batch\n", + "Epoch 8/20 Iteration 1407/3560 Training loss: 1.9358 0.0650 sec/batch\n", + "Epoch 8/20 Iteration 1408/3560 Training loss: 1.9356 0.0710 sec/batch\n", + "Epoch 8/20 Iteration 1409/3560 Training loss: 1.9356 0.0697 sec/batch\n", + "Epoch 8/20 Iteration 1410/3560 Training loss: 1.9355 0.0666 sec/batch\n", + "Epoch 8/20 Iteration 1411/3560 Training loss: 1.9354 0.0712 sec/batch\n", + "Epoch 8/20 Iteration 1412/3560 Training loss: 1.9353 0.0732 sec/batch\n", + "Epoch 8/20 Iteration 1413/3560 Training loss: 1.9352 0.0683 sec/batch\n", + "Epoch 8/20 Iteration 1414/3560 
Training loss: 1.9355 0.0675 sec/batch\n", + "Epoch 8/20 Iteration 1415/3560 Training loss: 1.9355 0.0666 sec/batch\n", + "Epoch 8/20 Iteration 1416/3560 Training loss: 1.9353 0.0657 sec/batch\n", + "Epoch 8/20 Iteration 1417/3560 Training loss: 1.9351 0.0668 sec/batch\n", + "Epoch 8/20 Iteration 1418/3560 Training loss: 1.9349 0.0651 sec/batch\n", + "Epoch 8/20 Iteration 1419/3560 Training loss: 1.9350 0.0713 sec/batch\n", + "Epoch 8/20 Iteration 1420/3560 Training loss: 1.9349 0.0663 sec/batch\n", + "Epoch 8/20 Iteration 1421/3560 Training loss: 1.9350 0.0690 sec/batch\n", + "Epoch 8/20 Iteration 1422/3560 Training loss: 1.9348 0.0669 sec/batch\n", + "Epoch 8/20 Iteration 1423/3560 Training loss: 1.9346 0.0745 sec/batch\n", + "Epoch 8/20 Iteration 1424/3560 Training loss: 1.9346 0.0702 sec/batch\n", + "Epoch 9/20 Iteration 1425/3560 Training loss: 1.9785 0.0706 sec/batch\n", + "Epoch 9/20 Iteration 1426/3560 Training loss: 1.9388 0.0705 sec/batch\n", + "Epoch 9/20 Iteration 1427/3560 Training loss: 1.9263 0.0656 sec/batch\n", + "Epoch 9/20 Iteration 1428/3560 Training loss: 1.9226 0.0679 sec/batch\n", + "Epoch 9/20 Iteration 1429/3560 Training loss: 1.9199 0.0721 sec/batch\n", + "Epoch 9/20 Iteration 1430/3560 Training loss: 1.9138 0.0638 sec/batch\n", + "Epoch 9/20 Iteration 1431/3560 Training loss: 1.9147 0.0699 sec/batch\n", + "Epoch 9/20 Iteration 1432/3560 Training loss: 1.9137 0.0631 sec/batch\n", + "Epoch 9/20 Iteration 1433/3560 Training loss: 1.9163 0.0690 sec/batch\n", + "Epoch 9/20 Iteration 1434/3560 Training loss: 1.9157 0.0716 sec/batch\n", + "Epoch 9/20 Iteration 1435/3560 Training loss: 1.9142 0.0656 sec/batch\n", + "Epoch 9/20 Iteration 1436/3560 Training loss: 1.9130 0.0630 sec/batch\n", + "Epoch 9/20 Iteration 1437/3560 Training loss: 1.9133 0.0629 sec/batch\n", + "Epoch 9/20 Iteration 1438/3560 Training loss: 1.9156 0.0645 sec/batch\n", + "Epoch 9/20 Iteration 1439/3560 Training loss: 1.9153 0.0672 sec/batch\n", + "Epoch 9/20 Iteration 
1440/3560 Training loss: 1.9134 0.0651 sec/batch\n", + "Epoch 9/20 Iteration 1441/3560 Training loss: 1.9135 0.0717 sec/batch\n", + "Epoch 9/20 Iteration 1442/3560 Training loss: 1.9149 0.0652 sec/batch\n", + "Epoch 9/20 Iteration 1443/3560 Training loss: 1.9152 0.0721 sec/batch\n", + "Epoch 9/20 Iteration 1444/3560 Training loss: 1.9156 0.0660 sec/batch\n", + "Epoch 9/20 Iteration 1445/3560 Training loss: 1.9150 0.0669 sec/batch\n", + "Epoch 9/20 Iteration 1446/3560 Training loss: 1.9162 0.0636 sec/batch\n", + "Epoch 9/20 Iteration 1447/3560 Training loss: 1.9157 0.0683 sec/batch\n", + "Epoch 9/20 Iteration 1448/3560 Training loss: 1.9155 0.0650 sec/batch\n", + "Epoch 9/20 Iteration 1449/3560 Training loss: 1.9152 0.0636 sec/batch\n", + "Epoch 9/20 Iteration 1450/3560 Training loss: 1.9141 0.0668 sec/batch\n", + "Epoch 9/20 Iteration 1451/3560 Training loss: 1.9135 0.0657 sec/batch\n", + "Epoch 9/20 Iteration 1452/3560 Training loss: 1.9142 0.0680 sec/batch\n", + "Epoch 9/20 Iteration 1453/3560 Training loss: 1.9150 0.0701 sec/batch\n", + "Epoch 9/20 Iteration 1454/3560 Training loss: 1.9152 0.0787 sec/batch\n", + "Epoch 9/20 Iteration 1455/3560 Training loss: 1.9150 0.0690 sec/batch\n", + "Epoch 9/20 Iteration 1456/3560 Training loss: 1.9143 0.0666 sec/batch\n", + "Epoch 9/20 Iteration 1457/3560 Training loss: 1.9146 0.0642 sec/batch\n", + "Epoch 9/20 Iteration 1458/3560 Training loss: 1.9151 0.0692 sec/batch\n", + "Epoch 9/20 Iteration 1459/3560 Training loss: 1.9149 0.0688 sec/batch\n", + "Epoch 9/20 Iteration 1460/3560 Training loss: 1.9145 0.0660 sec/batch\n", + "Epoch 9/20 Iteration 1461/3560 Training loss: 1.9138 0.0664 sec/batch\n", + "Epoch 9/20 Iteration 1462/3560 Training loss: 1.9127 0.0636 sec/batch\n", + "Epoch 9/20 Iteration 1463/3560 Training loss: 1.9114 0.0707 sec/batch\n", + "Epoch 9/20 Iteration 1464/3560 Training loss: 1.9107 0.0675 sec/batch\n", + "Epoch 9/20 Iteration 1465/3560 Training loss: 1.9103 0.0654 sec/batch\n", + "Epoch 9/20 
Iteration 1466/3560 Training loss: 1.9104 0.0636 sec/batch\n", + "Epoch 9/20 Iteration 1467/3560 Training loss: 1.9098 0.0664 sec/batch\n", + "Epoch 9/20 Iteration 1468/3560 Training loss: 1.9089 0.0692 sec/batch\n", + "Epoch 9/20 Iteration 1469/3560 Training loss: 1.9091 0.0689 sec/batch\n", + "Epoch 9/20 Iteration 1470/3560 Training loss: 1.9081 0.0673 sec/batch\n", + "Epoch 9/20 Iteration 1471/3560 Training loss: 1.9081 0.0748 sec/batch\n", + "Epoch 9/20 Iteration 1472/3560 Training loss: 1.9079 0.0661 sec/batch\n", + "Epoch 9/20 Iteration 1473/3560 Training loss: 1.9078 0.0659 sec/batch\n", + "Epoch 9/20 Iteration 1474/3560 Training loss: 1.9083 0.0730 sec/batch\n", + "Epoch 9/20 Iteration 1475/3560 Training loss: 1.9079 0.0686 sec/batch\n", + "Epoch 9/20 Iteration 1476/3560 Training loss: 1.9086 0.0744 sec/batch\n", + "Epoch 9/20 Iteration 1477/3560 Training loss: 1.9082 0.0683 sec/batch\n", + "Epoch 9/20 Iteration 1478/3560 Training loss: 1.9080 0.0649 sec/batch\n", + "Epoch 9/20 Iteration 1479/3560 Training loss: 1.9076 0.0672 sec/batch\n", + "Epoch 9/20 Iteration 1480/3560 Training loss: 1.9078 0.0716 sec/batch\n", + "Epoch 9/20 Iteration 1481/3560 Training loss: 1.9080 0.0643 sec/batch\n", + "Epoch 9/20 Iteration 1482/3560 Training loss: 1.9077 0.0724 sec/batch\n", + "Epoch 9/20 Iteration 1483/3560 Training loss: 1.9073 0.0738 sec/batch\n", + "Epoch 9/20 Iteration 1484/3560 Training loss: 1.9077 0.0673 sec/batch\n", + "Epoch 9/20 Iteration 1485/3560 Training loss: 1.9075 0.0652 sec/batch\n", + "Epoch 9/20 Iteration 1486/3560 Training loss: 1.9081 0.0706 sec/batch\n", + "Epoch 9/20 Iteration 1487/3560 Training loss: 1.9085 0.0657 sec/batch\n", + "Epoch 9/20 Iteration 1488/3560 Training loss: 1.9086 0.0656 sec/batch\n", + "Epoch 9/20 Iteration 1489/3560 Training loss: 1.9083 0.0640 sec/batch\n", + "Epoch 9/20 Iteration 1490/3560 Training loss: 1.9085 0.0704 sec/batch\n", + "Epoch 9/20 Iteration 1491/3560 Training loss: 1.9087 0.0680 sec/batch\n", + "Epoch 
9/20 Iteration 1492/3560 Training loss: 1.9082 0.0703 sec/batch\n", + "Epoch 9/20 Iteration 1493/3560 Training loss: 1.9080 0.0711 sec/batch\n", + "Epoch 9/20 Iteration 1494/3560 Training loss: 1.9080 0.0676 sec/batch\n", + "Epoch 9/20 Iteration 1495/3560 Training loss: 1.9084 0.0772 sec/batch\n", + "Epoch 9/20 Iteration 1496/3560 Training loss: 1.9085 0.0636 sec/batch\n", + "Epoch 9/20 Iteration 1497/3560 Training loss: 1.9088 0.0682 sec/batch\n", + "Epoch 9/20 Iteration 1498/3560 Training loss: 1.9084 0.0725 sec/batch\n", + "Epoch 9/20 Iteration 1499/3560 Training loss: 1.9084 0.0685 sec/batch\n", + "Epoch 9/20 Iteration 1500/3560 Training loss: 1.9085 0.0647 sec/batch\n", + "Epoch 9/20 Iteration 1501/3560 Training loss: 1.9085 0.0656 sec/batch\n", + "Epoch 9/20 Iteration 1502/3560 Training loss: 1.9084 0.0650 sec/batch\n", + "Epoch 9/20 Iteration 1503/3560 Training loss: 1.9079 0.0657 sec/batch\n", + "Epoch 9/20 Iteration 1504/3560 Training loss: 1.9077 0.0673 sec/batch\n", + "Epoch 9/20 Iteration 1505/3560 Training loss: 1.9072 0.0722 sec/batch\n", + "Epoch 9/20 Iteration 1506/3560 Training loss: 1.9072 0.0652 sec/batch\n", + "Epoch 9/20 Iteration 1507/3560 Training loss: 1.9068 0.0655 sec/batch\n", + "Epoch 9/20 Iteration 1508/3560 Training loss: 1.9067 0.0737 sec/batch\n", + "Epoch 9/20 Iteration 1509/3560 Training loss: 1.9062 0.0664 sec/batch\n", + "Epoch 9/20 Iteration 1510/3560 Training loss: 1.9059 0.0659 sec/batch\n", + "Epoch 9/20 Iteration 1511/3560 Training loss: 1.9058 0.0647 sec/batch\n", + "Epoch 9/20 Iteration 1512/3560 Training loss: 1.9054 0.0764 sec/batch\n", + "Epoch 9/20 Iteration 1513/3560 Training loss: 1.9049 0.0686 sec/batch\n", + "Epoch 9/20 Iteration 1514/3560 Training loss: 1.9048 0.0671 sec/batch\n", + "Epoch 9/20 Iteration 1515/3560 Training loss: 1.9045 0.0718 sec/batch\n", + "Epoch 9/20 Iteration 1516/3560 Training loss: 1.9044 0.0635 sec/batch\n", + "Epoch 9/20 Iteration 1517/3560 Training loss: 1.9039 0.0650 sec/batch\n", + 
"Epoch 9/20 Iteration 1518/3560 Training loss: 1.9034 0.0652 sec/batch\n", + "Epoch 9/20 Iteration 1519/3560 Training loss: 1.9030 0.0680 sec/batch\n", + "Epoch 9/20 Iteration 1520/3560 Training loss: 1.9029 0.0631 sec/batch\n", + "Epoch 9/20 Iteration 1521/3560 Training loss: 1.9027 0.0643 sec/batch\n", + "Epoch 9/20 Iteration 1522/3560 Training loss: 1.9024 0.0674 sec/batch\n", + "Epoch 9/20 Iteration 1523/3560 Training loss: 1.9020 0.0651 sec/batch\n", + "Epoch 9/20 Iteration 1524/3560 Training loss: 1.9017 0.0646 sec/batch\n", + "Epoch 9/20 Iteration 1525/3560 Training loss: 1.9016 0.0731 sec/batch\n", + "Epoch 9/20 Iteration 1526/3560 Training loss: 1.9015 0.0669 sec/batch\n", + "Epoch 9/20 Iteration 1527/3560 Training loss: 1.9011 0.0631 sec/batch\n", + "Epoch 9/20 Iteration 1528/3560 Training loss: 1.9008 0.0650 sec/batch\n", + "Epoch 9/20 Iteration 1529/3560 Training loss: 1.9005 0.0712 sec/batch\n", + "Epoch 9/20 Iteration 1530/3560 Training loss: 1.9004 0.0692 sec/batch\n", + "Epoch 9/20 Iteration 1531/3560 Training loss: 1.9002 0.0664 sec/batch\n", + "Epoch 9/20 Iteration 1532/3560 Training loss: 1.9002 0.0724 sec/batch\n", + "Epoch 9/20 Iteration 1533/3560 Training loss: 1.9002 0.0693 sec/batch\n", + "Epoch 9/20 Iteration 1534/3560 Training loss: 1.9001 0.0636 sec/batch\n", + "Epoch 9/20 Iteration 1535/3560 Training loss: 1.9002 0.0687 sec/batch\n", + "Epoch 9/20 Iteration 1536/3560 Training loss: 1.8999 0.0639 sec/batch\n", + "Epoch 9/20 Iteration 1537/3560 Training loss: 1.8997 0.0678 sec/batch\n", + "Epoch 9/20 Iteration 1538/3560 Training loss: 1.8996 0.0809 sec/batch\n", + "Epoch 9/20 Iteration 1539/3560 Training loss: 1.8993 0.0625 sec/batch\n", + "Epoch 9/20 Iteration 1540/3560 Training loss: 1.8990 0.0646 sec/batch\n", + "Epoch 9/20 Iteration 1541/3560 Training loss: 1.8989 0.0664 sec/batch\n", + "Epoch 9/20 Iteration 1542/3560 Training loss: 1.8989 0.0648 sec/batch\n", + "Epoch 9/20 Iteration 1543/3560 Training loss: 1.8989 0.0659 sec/batch\n", 
+ "Epoch 9/20 Iteration 1544/3560 Training loss: 1.8988 0.0668 sec/batch\n", + "Epoch 9/20 Iteration 1545/3560 Training loss: 1.8987 0.0749 sec/batch\n", + "Epoch 9/20 Iteration 1546/3560 Training loss: 1.8984 0.0648 sec/batch\n", + "Epoch 9/20 Iteration 1547/3560 Training loss: 1.8982 0.0674 sec/batch\n", + "Epoch 9/20 Iteration 1548/3560 Training loss: 1.8983 0.0650 sec/batch\n", + "Epoch 9/20 Iteration 1549/3560 Training loss: 1.8982 0.0724 sec/batch\n", + "Epoch 9/20 Iteration 1550/3560 Training loss: 1.8977 0.0657 sec/batch\n", + "Epoch 9/20 Iteration 1551/3560 Training loss: 1.8978 0.0702 sec/batch\n", + "Epoch 9/20 Iteration 1552/3560 Training loss: 1.8978 0.0684 sec/batch\n", + "Epoch 9/20 Iteration 1553/3560 Training loss: 1.8977 0.0662 sec/batch\n", + "Epoch 9/20 Iteration 1554/3560 Training loss: 1.8976 0.0710 sec/batch\n", + "Epoch 9/20 Iteration 1555/3560 Training loss: 1.8973 0.0666 sec/batch\n", + "Epoch 9/20 Iteration 1556/3560 Training loss: 1.8971 0.0709 sec/batch\n", + "Epoch 9/20 Iteration 1557/3560 Training loss: 1.8971 0.0723 sec/batch\n", + "Epoch 9/20 Iteration 1558/3560 Training loss: 1.8971 0.0687 sec/batch\n", + "Epoch 9/20 Iteration 1559/3560 Training loss: 1.8969 0.0632 sec/batch\n", + "Epoch 9/20 Iteration 1560/3560 Training loss: 1.8970 0.0659 sec/batch\n", + "Epoch 9/20 Iteration 1561/3560 Training loss: 1.8969 0.0747 sec/batch\n", + "Epoch 9/20 Iteration 1562/3560 Training loss: 1.8969 0.0665 sec/batch\n", + "Epoch 9/20 Iteration 1563/3560 Training loss: 1.8970 0.0676 sec/batch\n", + "Epoch 9/20 Iteration 1564/3560 Training loss: 1.8968 0.0736 sec/batch\n", + "Epoch 9/20 Iteration 1565/3560 Training loss: 1.8969 0.0672 sec/batch\n", + "Epoch 9/20 Iteration 1566/3560 Training loss: 1.8967 0.0632 sec/batch\n", + "Epoch 9/20 Iteration 1567/3560 Training loss: 1.8966 0.0716 sec/batch\n", + "Epoch 9/20 Iteration 1568/3560 Training loss: 1.8966 0.0638 sec/batch\n", + "Epoch 9/20 Iteration 1569/3560 Training loss: 1.8964 0.0691 
sec/batch\n", + "Epoch 9/20 Iteration 1570/3560 Training loss: 1.8965 0.0667 sec/batch\n", + "Epoch 9/20 Iteration 1571/3560 Training loss: 1.8965 0.0743 sec/batch\n", + "Epoch 9/20 Iteration 1572/3560 Training loss: 1.8966 0.0665 sec/batch\n", + "Epoch 9/20 Iteration 1573/3560 Training loss: 1.8965 0.0699 sec/batch\n", + "Epoch 9/20 Iteration 1574/3560 Training loss: 1.8964 0.0647 sec/batch\n", + "Epoch 9/20 Iteration 1575/3560 Training loss: 1.8962 0.0677 sec/batch\n", + "Epoch 9/20 Iteration 1576/3560 Training loss: 1.8963 0.0659 sec/batch\n", + "Epoch 9/20 Iteration 1577/3560 Training loss: 1.8962 0.0726 sec/batch\n", + "Epoch 9/20 Iteration 1578/3560 Training loss: 1.8962 0.0750 sec/batch\n", + "Epoch 9/20 Iteration 1579/3560 Training loss: 1.8962 0.0659 sec/batch\n", + "Epoch 9/20 Iteration 1580/3560 Training loss: 1.8961 0.0652 sec/batch\n", + "Epoch 9/20 Iteration 1581/3560 Training loss: 1.8961 0.0667 sec/batch\n", + "Epoch 9/20 Iteration 1582/3560 Training loss: 1.8960 0.0713 sec/batch\n", + "Epoch 9/20 Iteration 1583/3560 Training loss: 1.8958 0.0700 sec/batch\n", + "Epoch 9/20 Iteration 1584/3560 Training loss: 1.8959 0.0637 sec/batch\n", + "Epoch 9/20 Iteration 1585/3560 Training loss: 1.8960 0.0643 sec/batch\n", + "Epoch 9/20 Iteration 1586/3560 Training loss: 1.8959 0.0652 sec/batch\n", + "Epoch 9/20 Iteration 1587/3560 Training loss: 1.8959 0.0664 sec/batch\n", + "Epoch 9/20 Iteration 1588/3560 Training loss: 1.8959 0.0674 sec/batch\n", + "Epoch 9/20 Iteration 1589/3560 Training loss: 1.8958 0.0665 sec/batch\n", + "Epoch 9/20 Iteration 1590/3560 Training loss: 1.8957 0.0766 sec/batch\n", + "Epoch 9/20 Iteration 1591/3560 Training loss: 1.8956 0.0681 sec/batch\n", + "Epoch 9/20 Iteration 1592/3560 Training loss: 1.8959 0.0737 sec/batch\n", + "Epoch 9/20 Iteration 1593/3560 Training loss: 1.8958 0.0630 sec/batch\n", + "Epoch 9/20 Iteration 1594/3560 Training loss: 1.8957 0.0651 sec/batch\n", + "Epoch 9/20 Iteration 1595/3560 Training loss: 1.8956 
0.0677 sec/batch\n", + "Epoch 9/20 Iteration 1596/3560 Training loss: 1.8954 0.0693 sec/batch\n", + "Epoch 9/20 Iteration 1597/3560 Training loss: 1.8955 0.0690 sec/batch\n", + "Epoch 9/20 Iteration 1598/3560 Training loss: 1.8955 0.0617 sec/batch\n", + "Epoch 9/20 Iteration 1599/3560 Training loss: 1.8954 0.0729 sec/batch\n", + "Epoch 9/20 Iteration 1600/3560 Training loss: 1.8953 0.0685 sec/batch\n", + "Epoch 9/20 Iteration 1601/3560 Training loss: 1.8951 0.0820 sec/batch\n", + "Epoch 9/20 Iteration 1602/3560 Training loss: 1.8950 0.0720 sec/batch\n", + "Epoch 10/20 Iteration 1603/3560 Training loss: 1.9570 0.0645 sec/batch\n", + "Epoch 10/20 Iteration 1604/3560 Training loss: 1.9120 0.0665 sec/batch\n", + "Epoch 10/20 Iteration 1605/3560 Training loss: 1.8987 0.0739 sec/batch\n", + "Epoch 10/20 Iteration 1606/3560 Training loss: 1.8920 0.0675 sec/batch\n", + "Epoch 10/20 Iteration 1607/3560 Training loss: 1.8893 0.0657 sec/batch\n", + "Epoch 10/20 Iteration 1608/3560 Training loss: 1.8825 0.0680 sec/batch\n", + "Epoch 10/20 Iteration 1609/3560 Training loss: 1.8825 0.0662 sec/batch\n", + "Epoch 10/20 Iteration 1610/3560 Training loss: 1.8823 0.0629 sec/batch\n", + "Epoch 10/20 Iteration 1611/3560 Training loss: 1.8846 0.0693 sec/batch\n", + "Epoch 10/20 Iteration 1612/3560 Training loss: 1.8852 0.0643 sec/batch\n", + "Epoch 10/20 Iteration 1613/3560 Training loss: 1.8815 0.0683 sec/batch\n", + "Epoch 10/20 Iteration 1614/3560 Training loss: 1.8810 0.0680 sec/batch\n", + "Epoch 10/20 Iteration 1615/3560 Training loss: 1.8808 0.0667 sec/batch\n", + "Epoch 10/20 Iteration 1616/3560 Training loss: 1.8829 0.0697 sec/batch\n", + "Epoch 10/20 Iteration 1617/3560 Training loss: 1.8815 0.0647 sec/batch\n", + "Epoch 10/20 Iteration 1618/3560 Training loss: 1.8799 0.0636 sec/batch\n", + "Epoch 10/20 Iteration 1619/3560 Training loss: 1.8801 0.0700 sec/batch\n", + "Epoch 10/20 Iteration 1620/3560 Training loss: 1.8818 0.0671 sec/batch\n", + "Epoch 10/20 Iteration 1621/3560 
Training loss: 1.8818 0.0691 sec/batch\n", + "Epoch 10/20 Iteration 1622/3560 Training loss: 1.8820 0.0707 sec/batch\n", + "Epoch 10/20 Iteration 1623/3560 Training loss: 1.8808 0.0698 sec/batch\n", + "Epoch 10/20 Iteration 1624/3560 Training loss: 1.8814 0.0673 sec/batch\n", + "Epoch 10/20 Iteration 1625/3560 Training loss: 1.8810 0.0693 sec/batch\n", + "Epoch 10/20 Iteration 1626/3560 Training loss: 1.8807 0.0689 sec/batch\n", + "Epoch 10/20 Iteration 1627/3560 Training loss: 1.8802 0.0718 sec/batch\n", + "Epoch 10/20 Iteration 1628/3560 Training loss: 1.8792 0.0660 sec/batch\n", + "Epoch 10/20 Iteration 1629/3560 Training loss: 1.8787 0.0666 sec/batch\n", + "Epoch 10/20 Iteration 1630/3560 Training loss: 1.8791 0.0667 sec/batch\n", + "Epoch 10/20 Iteration 1631/3560 Training loss: 1.8799 0.0808 sec/batch\n", + "Epoch 10/20 Iteration 1632/3560 Training loss: 1.8799 0.0701 sec/batch\n", + "Epoch 10/20 Iteration 1633/3560 Training loss: 1.8795 0.0692 sec/batch\n", + "Epoch 10/20 Iteration 1634/3560 Training loss: 1.8787 0.0693 sec/batch\n", + "Epoch 10/20 Iteration 1635/3560 Training loss: 1.8788 0.0688 sec/batch\n", + "Epoch 10/20 Iteration 1636/3560 Training loss: 1.8793 0.0658 sec/batch\n", + "Epoch 10/20 Iteration 1637/3560 Training loss: 1.8789 0.0699 sec/batch\n", + "Epoch 10/20 Iteration 1638/3560 Training loss: 1.8785 0.0712 sec/batch\n", + "Epoch 10/20 Iteration 1639/3560 Training loss: 1.8778 0.0654 sec/batch\n", + "Epoch 10/20 Iteration 1640/3560 Training loss: 1.8770 0.0689 sec/batch\n", + "Epoch 10/20 Iteration 1641/3560 Training loss: 1.8758 0.0649 sec/batch\n", + "Epoch 10/20 Iteration 1642/3560 Training loss: 1.8751 0.0666 sec/batch\n", + "Epoch 10/20 Iteration 1643/3560 Training loss: 1.8745 0.0668 sec/batch\n", + "Epoch 10/20 Iteration 1644/3560 Training loss: 1.8747 0.0665 sec/batch\n", + "Epoch 10/20 Iteration 1645/3560 Training loss: 1.8741 0.0662 sec/batch\n", + "Epoch 10/20 Iteration 1646/3560 Training loss: 1.8734 0.0658 sec/batch\n", + 
"Epoch 10/20 Iteration 1647/3560 Training loss: 1.8734 0.0670 sec/batch\n", + "Epoch 10/20 Iteration 1648/3560 Training loss: 1.8724 0.0702 sec/batch\n", + "Epoch 10/20 Iteration 1649/3560 Training loss: 1.8722 0.0677 sec/batch\n", + "Epoch 10/20 Iteration 1650/3560 Training loss: 1.8719 0.0660 sec/batch\n", + "Epoch 10/20 Iteration 1651/3560 Training loss: 1.8717 0.0693 sec/batch\n", + "Epoch 10/20 Iteration 1652/3560 Training loss: 1.8721 0.0649 sec/batch\n", + "Epoch 10/20 Iteration 1653/3560 Training loss: 1.8716 0.0757 sec/batch\n", + "Epoch 10/20 Iteration 1654/3560 Training loss: 1.8725 0.0662 sec/batch\n", + "Epoch 10/20 Iteration 1655/3560 Training loss: 1.8722 0.0685 sec/batch\n", + "Epoch 10/20 Iteration 1656/3560 Training loss: 1.8719 0.0690 sec/batch\n", + "Epoch 10/20 Iteration 1657/3560 Training loss: 1.8717 0.0647 sec/batch\n", + "Epoch 10/20 Iteration 1658/3560 Training loss: 1.8719 0.0659 sec/batch\n", + "Epoch 10/20 Iteration 1659/3560 Training loss: 1.8721 0.0642 sec/batch\n", + "Epoch 10/20 Iteration 1660/3560 Training loss: 1.8719 0.0625 sec/batch\n", + "Epoch 10/20 Iteration 1661/3560 Training loss: 1.8714 0.0695 sec/batch\n", + "Epoch 10/20 Iteration 1662/3560 Training loss: 1.8718 0.0693 sec/batch\n", + "Epoch 10/20 Iteration 1663/3560 Training loss: 1.8717 0.0657 sec/batch\n", + "Epoch 10/20 Iteration 1664/3560 Training loss: 1.8723 0.0696 sec/batch\n", + "Epoch 10/20 Iteration 1665/3560 Training loss: 1.8727 0.0651 sec/batch\n", + "Epoch 10/20 Iteration 1666/3560 Training loss: 1.8728 0.0707 sec/batch\n", + "Epoch 10/20 Iteration 1667/3560 Training loss: 1.8728 0.0749 sec/batch\n", + "Epoch 10/20 Iteration 1668/3560 Training loss: 1.8730 0.0706 sec/batch\n", + "Epoch 10/20 Iteration 1669/3560 Training loss: 1.8732 0.0678 sec/batch\n", + "Epoch 10/20 Iteration 1670/3560 Training loss: 1.8727 0.0659 sec/batch\n", + "Epoch 10/20 Iteration 1671/3560 Training loss: 1.8724 0.0665 sec/batch\n", + "Epoch 10/20 Iteration 1672/3560 Training loss: 
1.8723 0.0657 sec/batch\n", + "Epoch 10/20 Iteration 1673/3560 Training loss: 1.8727 0.0644 sec/batch\n", + "Epoch 10/20 Iteration 1674/3560 Training loss: 1.8728 0.0652 sec/batch\n", + "Epoch 10/20 Iteration 1675/3560 Training loss: 1.8730 0.0647 sec/batch\n", + "Epoch 10/20 Iteration 1676/3560 Training loss: 1.8727 0.0699 sec/batch\n", + "Epoch 10/20 Iteration 1677/3560 Training loss: 1.8727 0.0660 sec/batch\n", + "Epoch 10/20 Iteration 1678/3560 Training loss: 1.8728 0.0683 sec/batch\n", + "Epoch 10/20 Iteration 1679/3560 Training loss: 1.8728 0.0748 sec/batch\n", + "Epoch 10/20 Iteration 1680/3560 Training loss: 1.8728 0.0712 sec/batch\n", + "Epoch 10/20 Iteration 1681/3560 Training loss: 1.8721 0.0701 sec/batch\n", + "Epoch 10/20 Iteration 1682/3560 Training loss: 1.8720 0.0742 sec/batch\n", + "Epoch 10/20 Iteration 1683/3560 Training loss: 1.8716 0.0659 sec/batch\n", + "Epoch 10/20 Iteration 1684/3560 Training loss: 1.8716 0.0799 sec/batch\n", + "Epoch 10/20 Iteration 1685/3560 Training loss: 1.8711 0.0684 sec/batch\n", + "Epoch 10/20 Iteration 1686/3560 Training loss: 1.8710 0.0672 sec/batch\n", + "Epoch 10/20 Iteration 1687/3560 Training loss: 1.8704 0.0662 sec/batch\n", + "Epoch 10/20 Iteration 1688/3560 Training loss: 1.8702 0.0703 sec/batch\n", + "Epoch 10/20 Iteration 1689/3560 Training loss: 1.8700 0.0678 sec/batch\n", + "Epoch 10/20 Iteration 1690/3560 Training loss: 1.8697 0.0702 sec/batch\n", + "Epoch 10/20 Iteration 1691/3560 Training loss: 1.8692 0.0654 sec/batch\n", + "Epoch 10/20 Iteration 1692/3560 Training loss: 1.8691 0.0678 sec/batch\n", + "Epoch 10/20 Iteration 1693/3560 Training loss: 1.8687 0.0698 sec/batch\n", + "Epoch 10/20 Iteration 1694/3560 Training loss: 1.8686 0.0754 sec/batch\n", + "Epoch 10/20 Iteration 1695/3560 Training loss: 1.8681 0.0653 sec/batch\n", + "Epoch 10/20 Iteration 1696/3560 Training loss: 1.8677 0.0638 sec/batch\n", + "Epoch 10/20 Iteration 1697/3560 Training loss: 1.8673 0.0644 sec/batch\n", + "Epoch 10/20 
Iteration 1698/3560 Training loss: 1.8673 0.0642 sec/batch\n", + "Epoch 10/20 Iteration 1699/3560 Training loss: 1.8672 0.0661 sec/batch\n", + "Epoch 10/20 Iteration 1700/3560 Training loss: 1.8669 0.0649 sec/batch\n", + "Epoch 10/20 Iteration 1701/3560 Training loss: 1.8664 0.0659 sec/batch\n", + "Epoch 10/20 Iteration 1702/3560 Training loss: 1.8661 0.0655 sec/batch\n", + "Epoch 10/20 Iteration 1703/3560 Training loss: 1.8660 0.0649 sec/batch\n", + "Epoch 10/20 Iteration 1704/3560 Training loss: 1.8658 0.0650 sec/batch\n", + "Epoch 10/20 Iteration 1705/3560 Training loss: 1.8655 0.0651 sec/batch\n", + "Epoch 10/20 Iteration 1706/3560 Training loss: 1.8653 0.0632 sec/batch\n", + "Epoch 10/20 Iteration 1707/3560 Training loss: 1.8650 0.0687 sec/batch\n", + "Epoch 10/20 Iteration 1708/3560 Training loss: 1.8650 0.0689 sec/batch\n", + "Epoch 10/20 Iteration 1709/3560 Training loss: 1.8648 0.0662 sec/batch\n", + "Epoch 10/20 Iteration 1710/3560 Training loss: 1.8648 0.0678 sec/batch\n", + "Epoch 10/20 Iteration 1711/3560 Training loss: 1.8648 0.0708 sec/batch\n", + "Epoch 10/20 Iteration 1712/3560 Training loss: 1.8648 0.0663 sec/batch\n", + "Epoch 10/20 Iteration 1713/3560 Training loss: 1.8648 0.0667 sec/batch\n", + "Epoch 10/20 Iteration 1714/3560 Training loss: 1.8646 0.0676 sec/batch\n", + "Epoch 10/20 Iteration 1715/3560 Training loss: 1.8644 0.0670 sec/batch\n", + "Epoch 10/20 Iteration 1716/3560 Training loss: 1.8644 0.0645 sec/batch\n", + "Epoch 10/20 Iteration 1717/3560 Training loss: 1.8642 0.0702 sec/batch\n", + "Epoch 10/20 Iteration 1718/3560 Training loss: 1.8638 0.0709 sec/batch\n", + "Epoch 10/20 Iteration 1719/3560 Training loss: 1.8638 0.0654 sec/batch\n", + "Epoch 10/20 Iteration 1720/3560 Training loss: 1.8637 0.0645 sec/batch\n", + "Epoch 10/20 Iteration 1721/3560 Training loss: 1.8636 0.0648 sec/batch\n", + "Epoch 10/20 Iteration 1722/3560 Training loss: 1.8635 0.0663 sec/batch\n", + "Epoch 10/20 Iteration 1723/3560 Training loss: 1.8634 0.0697 
sec/batch\n", + "Epoch 10/20 Iteration 1724/3560 Training loss: 1.8631 0.0751 sec/batch\n", + "Epoch 10/20 Iteration 1725/3560 Training loss: 1.8629 0.0665 sec/batch\n", + "Epoch 10/20 Iteration 1726/3560 Training loss: 1.8630 0.0678 sec/batch\n", + "Epoch 10/20 Iteration 1727/3560 Training loss: 1.8630 0.0735 sec/batch\n", + "Epoch 10/20 Iteration 1728/3560 Training loss: 1.8626 0.0672 sec/batch\n", + "Epoch 10/20 Iteration 1729/3560 Training loss: 1.8627 0.0655 sec/batch\n", + "Epoch 10/20 Iteration 1730/3560 Training loss: 1.8627 0.0643 sec/batch\n", + "Epoch 10/20 Iteration 1731/3560 Training loss: 1.8626 0.0640 sec/batch\n", + "Epoch 10/20 Iteration 1732/3560 Training loss: 1.8625 0.0669 sec/batch\n", + "Epoch 10/20 Iteration 1733/3560 Training loss: 1.8622 0.0674 sec/batch\n", + "Epoch 10/20 Iteration 1734/3560 Training loss: 1.8620 0.0702 sec/batch\n", + "Epoch 10/20 Iteration 1735/3560 Training loss: 1.8620 0.0634 sec/batch\n", + "Epoch 10/20 Iteration 1736/3560 Training loss: 1.8620 0.0649 sec/batch\n", + "Epoch 10/20 Iteration 1737/3560 Training loss: 1.8619 0.0763 sec/batch\n", + "Epoch 10/20 Iteration 1738/3560 Training loss: 1.8619 0.0686 sec/batch\n", + "Epoch 10/20 Iteration 1739/3560 Training loss: 1.8619 0.0673 sec/batch\n", + "Epoch 10/20 Iteration 1740/3560 Training loss: 1.8619 0.0670 sec/batch\n", + "Epoch 10/20 Iteration 1741/3560 Training loss: 1.8620 0.0682 sec/batch\n", + "Epoch 10/20 Iteration 1742/3560 Training loss: 1.8617 0.0650 sec/batch\n", + "Epoch 10/20 Iteration 1743/3560 Training loss: 1.8619 0.0707 sec/batch\n", + "Epoch 10/20 Iteration 1744/3560 Training loss: 1.8617 0.0709 sec/batch\n", + "Epoch 10/20 Iteration 1745/3560 Training loss: 1.8616 0.0669 sec/batch\n", + "Epoch 10/20 Iteration 1746/3560 Training loss: 1.8616 0.0680 sec/batch\n", + "Epoch 10/20 Iteration 1747/3560 Training loss: 1.8615 0.0777 sec/batch\n", + "Epoch 10/20 Iteration 1748/3560 Training loss: 1.8615 0.0742 sec/batch\n", + "Epoch 10/20 Iteration 1749/3560 
Training loss: 1.8614 0.0643 sec/batch\n", + "Epoch 10/20 Iteration 1750/3560 Training loss: 1.8616 0.0691 sec/batch\n", + "Epoch 10/20 Iteration 1751/3560 Training loss: 1.8615 0.0664 sec/batch\n", + "Epoch 10/20 Iteration 1752/3560 Training loss: 1.8614 0.0660 sec/batch\n", + "Epoch 10/20 Iteration 1753/3560 Training loss: 1.8611 0.0686 sec/batch\n", + "Epoch 10/20 Iteration 1754/3560 Training loss: 1.8613 0.0683 sec/batch\n", + "Epoch 10/20 Iteration 1755/3560 Training loss: 1.8613 0.0735 sec/batch\n", + "Epoch 10/20 Iteration 1756/3560 Training loss: 1.8613 0.0734 sec/batch\n", + "Epoch 10/20 Iteration 1757/3560 Training loss: 1.8612 0.0645 sec/batch\n", + "Epoch 10/20 Iteration 1758/3560 Training loss: 1.8612 0.0756 sec/batch\n", + "Epoch 10/20 Iteration 1759/3560 Training loss: 1.8612 0.0661 sec/batch\n", + "Epoch 10/20 Iteration 1760/3560 Training loss: 1.8612 0.0636 sec/batch\n", + "Epoch 10/20 Iteration 1761/3560 Training loss: 1.8609 0.0651 sec/batch\n", + "Epoch 10/20 Iteration 1762/3560 Training loss: 1.8611 0.0664 sec/batch\n", + "Epoch 10/20 Iteration 1763/3560 Training loss: 1.8612 0.0661 sec/batch\n", + "Epoch 10/20 Iteration 1764/3560 Training loss: 1.8611 0.0646 sec/batch\n", + "Epoch 10/20 Iteration 1765/3560 Training loss: 1.8611 0.0660 sec/batch\n", + "Epoch 10/20 Iteration 1766/3560 Training loss: 1.8611 0.0680 sec/batch\n", + "Epoch 10/20 Iteration 1767/3560 Training loss: 1.8610 0.0678 sec/batch\n", + "Epoch 10/20 Iteration 1768/3560 Training loss: 1.8608 0.0714 sec/batch\n", + "Epoch 10/20 Iteration 1769/3560 Training loss: 1.8608 0.0684 sec/batch\n", + "Epoch 10/20 Iteration 1770/3560 Training loss: 1.8611 0.0757 sec/batch\n", + "Epoch 10/20 Iteration 1771/3560 Training loss: 1.8611 0.0706 sec/batch\n", + "Epoch 10/20 Iteration 1772/3560 Training loss: 1.8609 0.0659 sec/batch\n", + "Epoch 10/20 Iteration 1773/3560 Training loss: 1.8608 0.0707 sec/batch\n", + "Epoch 10/20 Iteration 1774/3560 Training loss: 1.8606 0.0707 sec/batch\n", + 
"Epoch 10/20 Iteration 1775/3560 Training loss: 1.8606 0.0692 sec/batch\n", + "Epoch 10/20 Iteration 1776/3560 Training loss: 1.8606 0.0710 sec/batch\n", + "Epoch 10/20 Iteration 1777/3560 Training loss: 1.8606 0.0704 sec/batch\n", + "Epoch 10/20 Iteration 1778/3560 Training loss: 1.8605 0.0657 sec/batch\n", + "Epoch 10/20 Iteration 1779/3560 Training loss: 1.8603 0.0681 sec/batch\n", + "Epoch 10/20 Iteration 1780/3560 Training loss: 1.8603 0.0685 sec/batch\n", + "Epoch 11/20 Iteration 1781/3560 Training loss: 1.9115 0.0746 sec/batch\n", + "Epoch 11/20 Iteration 1782/3560 Training loss: 1.8723 0.0659 sec/batch\n", + "Epoch 11/20 Iteration 1783/3560 Training loss: 1.8617 0.0657 sec/batch\n", + "Epoch 11/20 Iteration 1784/3560 Training loss: 1.8564 0.0645 sec/batch\n", + "Epoch 11/20 Iteration 1785/3560 Training loss: 1.8542 0.0661 sec/batch\n", + "Epoch 11/20 Iteration 1786/3560 Training loss: 1.8459 0.0650 sec/batch\n", + "Epoch 11/20 Iteration 1787/3560 Training loss: 1.8473 0.0641 sec/batch\n", + "Epoch 11/20 Iteration 1788/3560 Training loss: 1.8473 0.0724 sec/batch\n", + "Epoch 11/20 Iteration 1789/3560 Training loss: 1.8499 0.0684 sec/batch\n", + "Epoch 11/20 Iteration 1790/3560 Training loss: 1.8495 0.0652 sec/batch\n", + "Epoch 11/20 Iteration 1791/3560 Training loss: 1.8467 0.0679 sec/batch\n", + "Epoch 11/20 Iteration 1792/3560 Training loss: 1.8459 0.0659 sec/batch\n", + "Epoch 11/20 Iteration 1793/3560 Training loss: 1.8457 0.0637 sec/batch\n", + "Epoch 11/20 Iteration 1794/3560 Training loss: 1.8482 0.0684 sec/batch\n", + "Epoch 11/20 Iteration 1795/3560 Training loss: 1.8474 0.0671 sec/batch\n", + "Epoch 11/20 Iteration 1796/3560 Training loss: 1.8456 0.0690 sec/batch\n", + "Epoch 11/20 Iteration 1797/3560 Training loss: 1.8457 0.0662 sec/batch\n", + "Epoch 11/20 Iteration 1798/3560 Training loss: 1.8471 0.0685 sec/batch\n", + "Epoch 11/20 Iteration 1799/3560 Training loss: 1.8467 0.0666 sec/batch\n", + "Epoch 11/20 Iteration 1800/3560 Training loss: 
1.8468 0.0666 sec/batch\n", + "Epoch 11/20 Iteration 1801/3560 Training loss: 1.8462 0.0664 sec/batch\n", + "Epoch 11/20 Iteration 1802/3560 Training loss: 1.8477 0.0739 sec/batch\n", + "Epoch 11/20 Iteration 1803/3560 Training loss: 1.8471 0.0677 sec/batch\n", + "Epoch 11/20 Iteration 1804/3560 Training loss: 1.8470 0.0661 sec/batch\n", + "Epoch 11/20 Iteration 1805/3560 Training loss: 1.8467 0.0682 sec/batch\n", + "Epoch 11/20 Iteration 1806/3560 Training loss: 1.8457 0.0887 sec/batch\n", + "Epoch 11/20 Iteration 1807/3560 Training loss: 1.8451 0.0707 sec/batch\n", + "Epoch 11/20 Iteration 1808/3560 Training loss: 1.8455 0.0638 sec/batch\n", + "Epoch 11/20 Iteration 1809/3560 Training loss: 1.8463 0.0625 sec/batch\n", + "Epoch 11/20 Iteration 1810/3560 Training loss: 1.8467 0.0639 sec/batch\n", + "Epoch 11/20 Iteration 1811/3560 Training loss: 1.8467 0.0654 sec/batch\n", + "Epoch 11/20 Iteration 1812/3560 Training loss: 1.8462 0.0658 sec/batch\n", + "Epoch 11/20 Iteration 1813/3560 Training loss: 1.8464 0.0680 sec/batch\n", + "Epoch 11/20 Iteration 1814/3560 Training loss: 1.8473 0.0670 sec/batch\n", + "Epoch 11/20 Iteration 1815/3560 Training loss: 1.8467 0.0650 sec/batch\n", + "Epoch 11/20 Iteration 1816/3560 Training loss: 1.8466 0.0620 sec/batch\n", + "Epoch 11/20 Iteration 1817/3560 Training loss: 1.8461 0.0648 sec/batch\n", + "Epoch 11/20 Iteration 1818/3560 Training loss: 1.8452 0.0733 sec/batch\n", + "Epoch 11/20 Iteration 1819/3560 Training loss: 1.8441 0.0676 sec/batch\n", + "Epoch 11/20 Iteration 1820/3560 Training loss: 1.8434 0.0653 sec/batch\n", + "Epoch 11/20 Iteration 1821/3560 Training loss: 1.8430 0.0663 sec/batch\n", + "Epoch 11/20 Iteration 1822/3560 Training loss: 1.8431 0.0733 sec/batch\n", + "Epoch 11/20 Iteration 1823/3560 Training loss: 1.8425 0.0669 sec/batch\n", + "Epoch 11/20 Iteration 1824/3560 Training loss: 1.8417 0.0664 sec/batch\n", + "Epoch 11/20 Iteration 1825/3560 Training loss: 1.8418 0.0696 sec/batch\n", + "Epoch 11/20 
Iteration 1826/3560 Training loss: 1.8406 0.0647 sec/batch\n", + "Epoch 11/20 Iteration 1827/3560 Training loss: 1.8403 0.0647 sec/batch\n", + "Epoch 11/20 Iteration 1828/3560 Training loss: 1.8399 0.0647 sec/batch\n", + "Epoch 11/20 Iteration 1829/3560 Training loss: 1.8398 0.0697 sec/batch\n", + "Epoch 11/20 Iteration 1830/3560 Training loss: 1.8403 0.0674 sec/batch\n", + "Epoch 11/20 Iteration 1831/3560 Training loss: 1.8398 0.0636 sec/batch\n", + "Epoch 11/20 Iteration 1832/3560 Training loss: 1.8407 0.0635 sec/batch\n", + "Epoch 11/20 Iteration 1833/3560 Training loss: 1.8405 0.0632 sec/batch\n", + "Epoch 11/20 Iteration 1834/3560 Training loss: 1.8404 0.0642 sec/batch\n", + "Epoch 11/20 Iteration 1835/3560 Training loss: 1.8400 0.0667 sec/batch\n", + "Epoch 11/20 Iteration 1836/3560 Training loss: 1.8400 0.0684 sec/batch\n", + "Epoch 11/20 Iteration 1837/3560 Training loss: 1.8402 0.0670 sec/batch\n", + "Epoch 11/20 Iteration 1838/3560 Training loss: 1.8400 0.0666 sec/batch\n", + "Epoch 11/20 Iteration 1839/3560 Training loss: 1.8395 0.0679 sec/batch\n", + "Epoch 11/20 Iteration 1840/3560 Training loss: 1.8400 0.0716 sec/batch\n", + "Epoch 11/20 Iteration 1841/3560 Training loss: 1.8399 0.0635 sec/batch\n", + "Epoch 11/20 Iteration 1842/3560 Training loss: 1.8407 0.0759 sec/batch\n", + "Epoch 11/20 Iteration 1843/3560 Training loss: 1.8410 0.0760 sec/batch\n", + "Epoch 11/20 Iteration 1844/3560 Training loss: 1.8413 0.0693 sec/batch\n", + "Epoch 11/20 Iteration 1845/3560 Training loss: 1.8412 0.0661 sec/batch\n", + "Epoch 11/20 Iteration 1846/3560 Training loss: 1.8414 0.0651 sec/batch\n", + "Epoch 11/20 Iteration 1847/3560 Training loss: 1.8416 0.0649 sec/batch\n", + "Epoch 11/20 Iteration 1848/3560 Training loss: 1.8412 0.0767 sec/batch\n", + "Epoch 11/20 Iteration 1849/3560 Training loss: 1.8411 0.0743 sec/batch\n", + "Epoch 11/20 Iteration 1850/3560 Training loss: 1.8409 0.0661 sec/batch\n", + "Epoch 11/20 Iteration 1851/3560 Training loss: 1.8413 0.0688 
sec/batch\n", + "Epoch 11/20 Iteration 1852/3560 Training loss: 1.8414 0.0722 sec/batch\n", + "Epoch 11/20 Iteration 1853/3560 Training loss: 1.8418 0.0667 sec/batch\n", + "Epoch 11/20 Iteration 1854/3560 Training loss: 1.8416 0.0722 sec/batch\n", + "Epoch 11/20 Iteration 1855/3560 Training loss: 1.8414 0.0723 sec/batch\n", + "Epoch 11/20 Iteration 1856/3560 Training loss: 1.8415 0.0687 sec/batch\n", + "Epoch 11/20 Iteration 1857/3560 Training loss: 1.8414 0.0708 sec/batch\n", + "Epoch 11/20 Iteration 1858/3560 Training loss: 1.8414 0.0675 sec/batch\n", + "Epoch 11/20 Iteration 1859/3560 Training loss: 1.8409 0.0642 sec/batch\n", + "Epoch 11/20 Iteration 1860/3560 Training loss: 1.8409 0.0648 sec/batch\n", + "Epoch 11/20 Iteration 1861/3560 Training loss: 1.8404 0.0764 sec/batch\n", + "Epoch 11/20 Iteration 1862/3560 Training loss: 1.8404 0.0677 sec/batch\n", + "Epoch 11/20 Iteration 1863/3560 Training loss: 1.8400 0.0674 sec/batch\n", + "Epoch 11/20 Iteration 1864/3560 Training loss: 1.8398 0.0742 sec/batch\n", + "Epoch 11/20 Iteration 1865/3560 Training loss: 1.8392 0.0637 sec/batch\n", + "Epoch 11/20 Iteration 1866/3560 Training loss: 1.8389 0.0764 sec/batch\n", + "Epoch 11/20 Iteration 1867/3560 Training loss: 1.8387 0.0722 sec/batch\n", + "Epoch 11/20 Iteration 1868/3560 Training loss: 1.8383 0.0702 sec/batch\n", + "Epoch 11/20 Iteration 1869/3560 Training loss: 1.8378 0.0714 sec/batch\n", + "Epoch 11/20 Iteration 1870/3560 Training loss: 1.8377 0.0678 sec/batch\n", + "Epoch 11/20 Iteration 1871/3560 Training loss: 1.8374 0.0636 sec/batch\n", + "Epoch 11/20 Iteration 1872/3560 Training loss: 1.8372 0.0696 sec/batch\n", + "Epoch 11/20 Iteration 1873/3560 Training loss: 1.8369 0.0648 sec/batch\n", + "Epoch 11/20 Iteration 1874/3560 Training loss: 1.8364 0.0674 sec/batch\n", + "Epoch 11/20 Iteration 1875/3560 Training loss: 1.8360 0.0637 sec/batch\n", + "Epoch 11/20 Iteration 1876/3560 Training loss: 1.8359 0.0663 sec/batch\n", + "Epoch 11/20 Iteration 1877/3560 
Training loss: 1.8358 0.0690 sec/batch\n", + "Epoch 11/20 Iteration 1878/3560 Training loss: 1.8354 0.0654 sec/batch\n", + "Epoch 11/20 Iteration 1879/3560 Training loss: 1.8350 0.0720 sec/batch\n", + "Epoch 11/20 Iteration 1880/3560 Training loss: 1.8346 0.0748 sec/batch\n", + "Epoch 11/20 Iteration 1881/3560 Training loss: 1.8346 0.0639 sec/batch\n", + "Epoch 11/20 Iteration 1882/3560 Training loss: 1.8345 0.0642 sec/batch\n", + "Epoch 11/20 Iteration 1883/3560 Training loss: 1.8344 0.0634 sec/batch\n", + "Epoch 11/20 Iteration 1884/3560 Training loss: 1.8341 0.0657 sec/batch\n", + "Epoch 11/20 Iteration 1885/3560 Training loss: 1.8339 0.0711 sec/batch\n", + "Epoch 11/20 Iteration 1886/3560 Training loss: 1.8338 0.0708 sec/batch\n", + "Epoch 11/20 Iteration 1887/3560 Training loss: 1.8336 0.0686 sec/batch\n", + "Epoch 11/20 Iteration 1888/3560 Training loss: 1.8335 0.0683 sec/batch\n", + "Epoch 11/20 Iteration 1889/3560 Training loss: 1.8336 0.0683 sec/batch\n", + "Epoch 11/20 Iteration 1890/3560 Training loss: 1.8336 0.0660 sec/batch\n", + "Epoch 11/20 Iteration 1891/3560 Training loss: 1.8335 0.0654 sec/batch\n", + "Epoch 11/20 Iteration 1892/3560 Training loss: 1.8334 0.0659 sec/batch\n", + "Epoch 11/20 Iteration 1893/3560 Training loss: 1.8333 0.0747 sec/batch\n", + "Epoch 11/20 Iteration 1894/3560 Training loss: 1.8331 0.0654 sec/batch\n", + "Epoch 11/20 Iteration 1895/3560 Training loss: 1.8329 0.0707 sec/batch\n", + "Epoch 11/20 Iteration 1896/3560 Training loss: 1.8325 0.0665 sec/batch\n", + "Epoch 11/20 Iteration 1897/3560 Training loss: 1.8325 0.0773 sec/batch\n", + "Epoch 11/20 Iteration 1898/3560 Training loss: 1.8325 0.0659 sec/batch\n", + "Epoch 11/20 Iteration 1899/3560 Training loss: 1.8324 0.0672 sec/batch\n", + "Epoch 11/20 Iteration 1900/3560 Training loss: 1.8323 0.0679 sec/batch\n", + "Epoch 11/20 Iteration 1901/3560 Training loss: 1.8323 0.0649 sec/batch\n", + "Epoch 11/20 Iteration 1902/3560 Training loss: 1.8320 0.0661 sec/batch\n", + 
"Epoch 11/20 Iteration 1903/3560 Training loss: 1.8318 0.0642 sec/batch\n", + "Epoch 11/20 Iteration 1904/3560 Training loss: 1.8320 0.0636 sec/batch\n", + "Epoch 11/20 Iteration 1905/3560 Training loss: 1.8319 0.0695 sec/batch\n", + "Epoch 11/20 Iteration 1906/3560 Training loss: 1.8315 0.0683 sec/batch\n", + "Epoch 11/20 Iteration 1907/3560 Training loss: 1.8316 0.0692 sec/batch\n", + "Epoch 11/20 Iteration 1908/3560 Training loss: 1.8318 0.0641 sec/batch\n", + "Epoch 11/20 Iteration 1909/3560 Training loss: 1.8317 0.0629 sec/batch\n", + "Epoch 11/20 Iteration 1910/3560 Training loss: 1.8315 0.0680 sec/batch\n", + "Epoch 11/20 Iteration 1911/3560 Training loss: 1.8312 0.0670 sec/batch\n", + "Epoch 11/20 Iteration 1912/3560 Training loss: 1.8310 0.0692 sec/batch\n", + "Epoch 11/20 Iteration 1913/3560 Training loss: 1.8310 0.0731 sec/batch\n", + "Epoch 11/20 Iteration 1914/3560 Training loss: 1.8310 0.0645 sec/batch\n", + "Epoch 11/20 Iteration 1915/3560 Training loss: 1.8309 0.0631 sec/batch\n", + "Epoch 11/20 Iteration 1916/3560 Training loss: 1.8310 0.0657 sec/batch\n", + "Epoch 11/20 Iteration 1917/3560 Training loss: 1.8311 0.0640 sec/batch\n", + "Epoch 11/20 Iteration 1918/3560 Training loss: 1.8311 0.0650 sec/batch\n", + "Epoch 11/20 Iteration 1919/3560 Training loss: 1.8312 0.0646 sec/batch\n", + "Epoch 11/20 Iteration 1920/3560 Training loss: 1.8311 0.0640 sec/batch\n", + "Epoch 11/20 Iteration 1921/3560 Training loss: 1.8313 0.0677 sec/batch\n", + "Epoch 11/20 Iteration 1922/3560 Training loss: 1.8311 0.0665 sec/batch\n", + "Epoch 11/20 Iteration 1923/3560 Training loss: 1.8310 0.0668 sec/batch\n", + "Epoch 11/20 Iteration 1924/3560 Training loss: 1.8309 0.0677 sec/batch\n", + "Epoch 11/20 Iteration 1925/3560 Training loss: 1.8308 0.0640 sec/batch\n", + "Epoch 11/20 Iteration 1926/3560 Training loss: 1.8308 0.0629 sec/batch\n", + "Epoch 11/20 Iteration 1927/3560 Training loss: 1.8308 0.0645 sec/batch\n", + "Epoch 11/20 Iteration 1928/3560 Training loss: 
1.8309 0.0666 sec/batch\n", + "Epoch 11/20 Iteration 1929/3560 Training loss: 1.8308 0.0690 sec/batch\n", + "Epoch 11/20 Iteration 1930/3560 Training loss: 1.8307 0.0660 sec/batch\n", + "Epoch 11/20 Iteration 1931/3560 Training loss: 1.8305 0.0708 sec/batch\n", + "Epoch 11/20 Iteration 1932/3560 Training loss: 1.8306 0.0717 sec/batch\n", + "Epoch 11/20 Iteration 1933/3560 Training loss: 1.8305 0.0775 sec/batch\n", + "Epoch 11/20 Iteration 1934/3560 Training loss: 1.8306 0.0650 sec/batch\n", + "Epoch 11/20 Iteration 1935/3560 Training loss: 1.8306 0.0656 sec/batch\n", + "Epoch 11/20 Iteration 1936/3560 Training loss: 1.8304 0.0648 sec/batch\n", + "Epoch 11/20 Iteration 1937/3560 Training loss: 1.8304 0.0720 sec/batch\n", + "Epoch 11/20 Iteration 1938/3560 Training loss: 1.8304 0.0644 sec/batch\n", + "Epoch 11/20 Iteration 1939/3560 Training loss: 1.8302 0.0653 sec/batch\n", + "Epoch 11/20 Iteration 1940/3560 Training loss: 1.8303 0.0798 sec/batch\n", + "Epoch 11/20 Iteration 1941/3560 Training loss: 1.8305 0.0657 sec/batch\n", + "Epoch 11/20 Iteration 1942/3560 Training loss: 1.8304 0.0680 sec/batch\n", + "Epoch 11/20 Iteration 1943/3560 Training loss: 1.8305 0.0663 sec/batch\n", + "Epoch 11/20 Iteration 1944/3560 Training loss: 1.8305 0.0634 sec/batch\n", + "Epoch 11/20 Iteration 1945/3560 Training loss: 1.8304 0.0650 sec/batch\n", + "Epoch 11/20 Iteration 1946/3560 Training loss: 1.8303 0.0647 sec/batch\n", + "Epoch 11/20 Iteration 1947/3560 Training loss: 1.8303 0.0753 sec/batch\n", + "Epoch 11/20 Iteration 1948/3560 Training loss: 1.8307 0.0672 sec/batch\n", + "Epoch 11/20 Iteration 1949/3560 Training loss: 1.8306 0.0768 sec/batch\n", + "Epoch 11/20 Iteration 1950/3560 Training loss: 1.8305 0.0685 sec/batch\n", + "Epoch 11/20 Iteration 1951/3560 Training loss: 1.8304 0.0684 sec/batch\n", + "Epoch 11/20 Iteration 1952/3560 Training loss: 1.8302 0.0657 sec/batch\n", + "Epoch 11/20 Iteration 1953/3560 Training loss: 1.8303 0.0707 sec/batch\n", + "Epoch 11/20 
Iteration 1954/3560 Training loss: 1.8303 0.0709 sec/batch\n", + "Epoch 11/20 Iteration 1955/3560 Training loss: 1.8303 0.0719 sec/batch\n", + "Epoch 11/20 Iteration 1956/3560 Training loss: 1.8302 0.0634 sec/batch\n", + "Epoch 11/20 Iteration 1957/3560 Training loss: 1.8300 0.0737 sec/batch\n", + "Epoch 11/20 Iteration 1958/3560 Training loss: 1.8301 0.0706 sec/batch\n", + "Epoch 12/20 Iteration 1959/3560 Training loss: 1.9000 0.0651 sec/batch\n", + "Epoch 12/20 Iteration 1960/3560 Training loss: 1.8582 0.0704 sec/batch\n", + "Epoch 12/20 Iteration 1961/3560 Training loss: 1.8429 0.0677 sec/batch\n", + "Epoch 12/20 Iteration 1962/3560 Training loss: 1.8380 0.0652 sec/batch\n", + "Epoch 12/20 Iteration 1963/3560 Training loss: 1.8337 0.0672 sec/batch\n", + "Epoch 12/20 Iteration 1964/3560 Training loss: 1.8248 0.0687 sec/batch\n", + "Epoch 12/20 Iteration 1965/3560 Training loss: 1.8240 0.0669 sec/batch\n", + "Epoch 12/20 Iteration 1966/3560 Training loss: 1.8220 0.0726 sec/batch\n", + "Epoch 12/20 Iteration 1967/3560 Training loss: 1.8237 0.0657 sec/batch\n", + "Epoch 12/20 Iteration 1968/3560 Training loss: 1.8231 0.0642 sec/batch\n", + "Epoch 12/20 Iteration 1969/3560 Training loss: 1.8202 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 1970/3560 Training loss: 1.8192 0.0759 sec/batch\n", + "Epoch 12/20 Iteration 1971/3560 Training loss: 1.8192 0.0667 sec/batch\n", + "Epoch 12/20 Iteration 1972/3560 Training loss: 1.8213 0.0642 sec/batch\n", + "Epoch 12/20 Iteration 1973/3560 Training loss: 1.8211 0.0687 sec/batch\n", + "Epoch 12/20 Iteration 1974/3560 Training loss: 1.8192 0.0665 sec/batch\n", + "Epoch 12/20 Iteration 1975/3560 Training loss: 1.8199 0.0668 sec/batch\n", + "Epoch 12/20 Iteration 1976/3560 Training loss: 1.8221 0.0633 sec/batch\n", + "Epoch 12/20 Iteration 1977/3560 Training loss: 1.8214 0.0651 sec/batch\n", + "Epoch 12/20 Iteration 1978/3560 Training loss: 1.8218 0.0681 sec/batch\n", + "Epoch 12/20 Iteration 1979/3560 Training loss: 1.8215 0.0649 
sec/batch\n", + "Epoch 12/20 Iteration 1980/3560 Training loss: 1.8223 0.0630 sec/batch\n", + "Epoch 12/20 Iteration 1981/3560 Training loss: 1.8214 0.0671 sec/batch\n", + "Epoch 12/20 Iteration 1982/3560 Training loss: 1.8209 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 1983/3560 Training loss: 1.8204 0.0679 sec/batch\n", + "Epoch 12/20 Iteration 1984/3560 Training loss: 1.8192 0.0644 sec/batch\n", + "Epoch 12/20 Iteration 1985/3560 Training loss: 1.8187 0.0644 sec/batch\n", + "Epoch 12/20 Iteration 1986/3560 Training loss: 1.8193 0.0721 sec/batch\n", + "Epoch 12/20 Iteration 1987/3560 Training loss: 1.8200 0.0635 sec/batch\n", + "Epoch 12/20 Iteration 1988/3560 Training loss: 1.8207 0.0650 sec/batch\n", + "Epoch 12/20 Iteration 1989/3560 Training loss: 1.8206 0.0680 sec/batch\n", + "Epoch 12/20 Iteration 1990/3560 Training loss: 1.8200 0.0633 sec/batch\n", + "Epoch 12/20 Iteration 1991/3560 Training loss: 1.8200 0.0698 sec/batch\n", + "Epoch 12/20 Iteration 1992/3560 Training loss: 1.8204 0.0632 sec/batch\n", + "Epoch 12/20 Iteration 1993/3560 Training loss: 1.8201 0.0654 sec/batch\n", + "Epoch 12/20 Iteration 1994/3560 Training loss: 1.8197 0.0631 sec/batch\n", + "Epoch 12/20 Iteration 1995/3560 Training loss: 1.8193 0.0628 sec/batch\n", + "Epoch 12/20 Iteration 1996/3560 Training loss: 1.8184 0.0703 sec/batch\n", + "Epoch 12/20 Iteration 1997/3560 Training loss: 1.8175 0.0694 sec/batch\n", + "Epoch 12/20 Iteration 1998/3560 Training loss: 1.8169 0.0650 sec/batch\n", + "Epoch 12/20 Iteration 1999/3560 Training loss: 1.8161 0.0627 sec/batch\n", + "Epoch 12/20 Iteration 2000/3560 Training loss: 1.8164 0.0658 sec/batch\n", + "Epoch 12/20 Iteration 2001/3560 Training loss: 1.8158 0.0663 sec/batch\n", + "Epoch 12/20 Iteration 2002/3560 Training loss: 1.8148 0.0748 sec/batch\n", + "Epoch 12/20 Iteration 2003/3560 Training loss: 1.8150 0.0666 sec/batch\n", + "Epoch 12/20 Iteration 2004/3560 Training loss: 1.8140 0.0665 sec/batch\n", + "Epoch 12/20 Iteration 2005/3560 
Training loss: 1.8140 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 2006/3560 Training loss: 1.8136 0.0668 sec/batch\n", + "Epoch 12/20 Iteration 2007/3560 Training loss: 1.8134 0.0664 sec/batch\n", + "Epoch 12/20 Iteration 2008/3560 Training loss: 1.8139 0.0633 sec/batch\n", + "Epoch 12/20 Iteration 2009/3560 Training loss: 1.8132 0.0655 sec/batch\n", + "Epoch 12/20 Iteration 2010/3560 Training loss: 1.8139 0.0697 sec/batch\n", + "Epoch 12/20 Iteration 2011/3560 Training loss: 1.8136 0.0697 sec/batch\n", + "Epoch 12/20 Iteration 2012/3560 Training loss: 1.8133 0.0653 sec/batch\n", + "Epoch 12/20 Iteration 2013/3560 Training loss: 1.8128 0.0646 sec/batch\n", + "Epoch 12/20 Iteration 2014/3560 Training loss: 1.8131 0.0701 sec/batch\n", + "Epoch 12/20 Iteration 2015/3560 Training loss: 1.8133 0.0749 sec/batch\n", + "Epoch 12/20 Iteration 2016/3560 Training loss: 1.8132 0.0653 sec/batch\n", + "Epoch 12/20 Iteration 2017/3560 Training loss: 1.8128 0.0670 sec/batch\n", + "Epoch 12/20 Iteration 2018/3560 Training loss: 1.8131 0.0707 sec/batch\n", + "Epoch 12/20 Iteration 2019/3560 Training loss: 1.8131 0.0695 sec/batch\n", + "Epoch 12/20 Iteration 2020/3560 Training loss: 1.8139 0.0638 sec/batch\n", + "Epoch 12/20 Iteration 2021/3560 Training loss: 1.8141 0.0652 sec/batch\n", + "Epoch 12/20 Iteration 2022/3560 Training loss: 1.8143 0.0651 sec/batch\n", + "Epoch 12/20 Iteration 2023/3560 Training loss: 1.8142 0.0690 sec/batch\n", + "Epoch 12/20 Iteration 2024/3560 Training loss: 1.8145 0.0653 sec/batch\n", + "Epoch 12/20 Iteration 2025/3560 Training loss: 1.8147 0.0643 sec/batch\n", + "Epoch 12/20 Iteration 2026/3560 Training loss: 1.8144 0.0646 sec/batch\n", + "Epoch 12/20 Iteration 2027/3560 Training loss: 1.8141 0.0652 sec/batch\n", + "Epoch 12/20 Iteration 2028/3560 Training loss: 1.8139 0.0693 sec/batch\n", + "Epoch 12/20 Iteration 2029/3560 Training loss: 1.8143 0.0685 sec/batch\n", + "Epoch 12/20 Iteration 2030/3560 Training loss: 1.8144 0.0664 sec/batch\n", + 
"Epoch 12/20 Iteration 2031/3560 Training loss: 1.8148 0.0641 sec/batch\n", + "Epoch 12/20 Iteration 2032/3560 Training loss: 1.8145 0.0625 sec/batch\n", + "Epoch 12/20 Iteration 2033/3560 Training loss: 1.8144 0.0714 sec/batch\n", + "Epoch 12/20 Iteration 2034/3560 Training loss: 1.8145 0.0691 sec/batch\n", + "Epoch 12/20 Iteration 2035/3560 Training loss: 1.8145 0.0710 sec/batch\n", + "Epoch 12/20 Iteration 2036/3560 Training loss: 1.8145 0.0666 sec/batch\n", + "Epoch 12/20 Iteration 2037/3560 Training loss: 1.8141 0.0674 sec/batch\n", + "Epoch 12/20 Iteration 2038/3560 Training loss: 1.8139 0.0689 sec/batch\n", + "Epoch 12/20 Iteration 2039/3560 Training loss: 1.8135 0.0675 sec/batch\n", + "Epoch 12/20 Iteration 2040/3560 Training loss: 1.8135 0.0699 sec/batch\n", + "Epoch 12/20 Iteration 2041/3560 Training loss: 1.8131 0.0798 sec/batch\n", + "Epoch 12/20 Iteration 2042/3560 Training loss: 1.8129 0.0676 sec/batch\n", + "Epoch 12/20 Iteration 2043/3560 Training loss: 1.8124 0.0731 sec/batch\n", + "Epoch 12/20 Iteration 2044/3560 Training loss: 1.8121 0.0696 sec/batch\n", + "Epoch 12/20 Iteration 2045/3560 Training loss: 1.8121 0.0668 sec/batch\n", + "Epoch 12/20 Iteration 2046/3560 Training loss: 1.8117 0.0685 sec/batch\n", + "Epoch 12/20 Iteration 2047/3560 Training loss: 1.8113 0.0725 sec/batch\n", + "Epoch 12/20 Iteration 2048/3560 Training loss: 1.8112 0.0700 sec/batch\n", + "Epoch 12/20 Iteration 2049/3560 Training loss: 1.8108 0.0678 sec/batch\n", + "Epoch 12/20 Iteration 2050/3560 Training loss: 1.8107 0.0666 sec/batch\n", + "Epoch 12/20 Iteration 2051/3560 Training loss: 1.8102 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 2052/3560 Training loss: 1.8098 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 2053/3560 Training loss: 1.8095 0.0651 sec/batch\n", + "Epoch 12/20 Iteration 2054/3560 Training loss: 1.8095 0.0723 sec/batch\n", + "Epoch 12/20 Iteration 2055/3560 Training loss: 1.8093 0.0712 sec/batch\n", + "Epoch 12/20 Iteration 2056/3560 Training loss: 
1.8089 0.0712 sec/batch\n", + "Epoch 12/20 Iteration 2057/3560 Training loss: 1.8086 0.0650 sec/batch\n", + "Epoch 12/20 Iteration 2058/3560 Training loss: 1.8080 0.0685 sec/batch\n", + "Epoch 12/20 Iteration 2059/3560 Training loss: 1.8079 0.0684 sec/batch\n", + "Epoch 12/20 Iteration 2060/3560 Training loss: 1.8079 0.0668 sec/batch\n", + "Epoch 12/20 Iteration 2061/3560 Training loss: 1.8076 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 2062/3560 Training loss: 1.8074 0.0664 sec/batch\n", + "Epoch 12/20 Iteration 2063/3560 Training loss: 1.8072 0.0687 sec/batch\n", + "Epoch 12/20 Iteration 2064/3560 Training loss: 1.8071 0.0712 sec/batch\n", + "Epoch 12/20 Iteration 2065/3560 Training loss: 1.8071 0.0641 sec/batch\n", + "Epoch 12/20 Iteration 2066/3560 Training loss: 1.8071 0.0704 sec/batch\n", + "Epoch 12/20 Iteration 2067/3560 Training loss: 1.8071 0.0635 sec/batch\n", + "Epoch 12/20 Iteration 2068/3560 Training loss: 1.8071 0.0647 sec/batch\n", + "Epoch 12/20 Iteration 2069/3560 Training loss: 1.8071 0.0657 sec/batch\n", + "Epoch 12/20 Iteration 2070/3560 Training loss: 1.8068 0.0683 sec/batch\n", + "Epoch 12/20 Iteration 2071/3560 Training loss: 1.8067 0.0658 sec/batch\n", + "Epoch 12/20 Iteration 2072/3560 Training loss: 1.8066 0.0704 sec/batch\n", + "Epoch 12/20 Iteration 2073/3560 Training loss: 1.8064 0.0673 sec/batch\n", + "Epoch 12/20 Iteration 2074/3560 Training loss: 1.8061 0.0653 sec/batch\n", + "Epoch 12/20 Iteration 2075/3560 Training loss: 1.8061 0.0750 sec/batch\n", + "Epoch 12/20 Iteration 2076/3560 Training loss: 1.8061 0.0678 sec/batch\n", + "Epoch 12/20 Iteration 2077/3560 Training loss: 1.8061 0.0639 sec/batch\n", + "Epoch 12/20 Iteration 2078/3560 Training loss: 1.8060 0.0723 sec/batch\n", + "Epoch 12/20 Iteration 2079/3560 Training loss: 1.8060 0.0670 sec/batch\n", + "Epoch 12/20 Iteration 2080/3560 Training loss: 1.8057 0.0721 sec/batch\n", + "Epoch 12/20 Iteration 2081/3560 Training loss: 1.8054 0.0788 sec/batch\n", + "Epoch 12/20 
Iteration 2082/3560 Training loss: 1.8055 0.0654 sec/batch\n", + "Epoch 12/20 Iteration 2083/3560 Training loss: 1.8055 0.0655 sec/batch\n", + "Epoch 12/20 Iteration 2084/3560 Training loss: 1.8051 0.0697 sec/batch\n", + "Epoch 12/20 Iteration 2085/3560 Training loss: 1.8052 0.0743 sec/batch\n", + "Epoch 12/20 Iteration 2086/3560 Training loss: 1.8053 0.0684 sec/batch\n", + "Epoch 12/20 Iteration 2087/3560 Training loss: 1.8051 0.0793 sec/batch\n", + "Epoch 12/20 Iteration 2088/3560 Training loss: 1.8050 0.0664 sec/batch\n", + "Epoch 12/20 Iteration 2089/3560 Training loss: 1.8047 0.0655 sec/batch\n", + "Epoch 12/20 Iteration 2090/3560 Training loss: 1.8044 0.0667 sec/batch\n", + "Epoch 12/20 Iteration 2091/3560 Training loss: 1.8044 0.0643 sec/batch\n", + "Epoch 12/20 Iteration 2092/3560 Training loss: 1.8044 0.0652 sec/batch\n", + "Epoch 12/20 Iteration 2093/3560 Training loss: 1.8043 0.0721 sec/batch\n", + "Epoch 12/20 Iteration 2094/3560 Training loss: 1.8043 0.0664 sec/batch\n", + "Epoch 12/20 Iteration 2095/3560 Training loss: 1.8044 0.0637 sec/batch\n", + "Epoch 12/20 Iteration 2096/3560 Training loss: 1.8045 0.0734 sec/batch\n", + "Epoch 12/20 Iteration 2097/3560 Training loss: 1.8045 0.0725 sec/batch\n", + "Epoch 12/20 Iteration 2098/3560 Training loss: 1.8044 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 2099/3560 Training loss: 1.8046 0.0670 sec/batch\n", + "Epoch 12/20 Iteration 2100/3560 Training loss: 1.8045 0.0678 sec/batch\n", + "Epoch 12/20 Iteration 2101/3560 Training loss: 1.8044 0.0685 sec/batch\n", + "Epoch 12/20 Iteration 2102/3560 Training loss: 1.8044 0.0679 sec/batch\n", + "Epoch 12/20 Iteration 2103/3560 Training loss: 1.8042 0.0660 sec/batch\n", + "Epoch 12/20 Iteration 2104/3560 Training loss: 1.8042 0.0671 sec/batch\n", + "Epoch 12/20 Iteration 2105/3560 Training loss: 1.8042 0.0649 sec/batch\n", + "Epoch 12/20 Iteration 2106/3560 Training loss: 1.8043 0.0636 sec/batch\n", + "Epoch 12/20 Iteration 2107/3560 Training loss: 1.8043 0.0752 
sec/batch\n", + "Epoch 12/20 Iteration 2108/3560 Training loss: 1.8041 0.0735 sec/batch\n", + "Epoch 12/20 Iteration 2109/3560 Training loss: 1.8039 0.0631 sec/batch\n", + "Epoch 12/20 Iteration 2110/3560 Training loss: 1.8040 0.0710 sec/batch\n", + "Epoch 12/20 Iteration 2111/3560 Training loss: 1.8039 0.0656 sec/batch\n", + "Epoch 12/20 Iteration 2112/3560 Training loss: 1.8040 0.0656 sec/batch\n", + "Epoch 12/20 Iteration 2113/3560 Training loss: 1.8039 0.0711 sec/batch\n", + "Epoch 12/20 Iteration 2114/3560 Training loss: 1.8039 0.0739 sec/batch\n", + "Epoch 12/20 Iteration 2115/3560 Training loss: 1.8040 0.0690 sec/batch\n", + "Epoch 12/20 Iteration 2116/3560 Training loss: 1.8039 0.0720 sec/batch\n", + "Epoch 12/20 Iteration 2117/3560 Training loss: 1.8036 0.0707 sec/batch\n", + "Epoch 12/20 Iteration 2118/3560 Training loss: 1.8038 0.0669 sec/batch\n", + "Epoch 12/20 Iteration 2119/3560 Training loss: 1.8040 0.0725 sec/batch\n", + "Epoch 12/20 Iteration 2120/3560 Training loss: 1.8039 0.0748 sec/batch\n", + "Epoch 12/20 Iteration 2121/3560 Training loss: 1.8040 0.0677 sec/batch\n", + "Epoch 12/20 Iteration 2122/3560 Training loss: 1.8039 0.0677 sec/batch\n", + "Epoch 12/20 Iteration 2123/3560 Training loss: 1.8039 0.0675 sec/batch\n", + "Epoch 12/20 Iteration 2124/3560 Training loss: 1.8038 0.0656 sec/batch\n", + "Epoch 12/20 Iteration 2125/3560 Training loss: 1.8038 0.0721 sec/batch\n", + "Epoch 12/20 Iteration 2126/3560 Training loss: 1.8042 0.0770 sec/batch\n", + "Epoch 12/20 Iteration 2127/3560 Training loss: 1.8041 0.0671 sec/batch\n", + "Epoch 12/20 Iteration 2128/3560 Training loss: 1.8040 0.0660 sec/batch\n", + "Epoch 12/20 Iteration 2129/3560 Training loss: 1.8038 0.0657 sec/batch\n", + "Epoch 12/20 Iteration 2130/3560 Training loss: 1.8037 0.0665 sec/batch\n", + "Epoch 12/20 Iteration 2131/3560 Training loss: 1.8037 0.0722 sec/batch\n", + "Epoch 12/20 Iteration 2132/3560 Training loss: 1.8038 0.0686 sec/batch\n", + "Epoch 12/20 Iteration 2133/3560 
Training loss: 1.8038 0.0692 sec/batch\n", + "Epoch 12/20 Iteration 2134/3560 Training loss: 1.8037 0.0735 sec/batch\n", + "Epoch 12/20 Iteration 2135/3560 Training loss: 1.8036 0.0726 sec/batch\n", + "Epoch 12/20 Iteration 2136/3560 Training loss: 1.8036 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2137/3560 Training loss: 1.8651 0.0677 sec/batch\n", + "Epoch 13/20 Iteration 2138/3560 Training loss: 1.8239 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2139/3560 Training loss: 1.8073 0.0647 sec/batch\n", + "Epoch 13/20 Iteration 2140/3560 Training loss: 1.8017 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2141/3560 Training loss: 1.7986 0.0667 sec/batch\n", + "Epoch 13/20 Iteration 2142/3560 Training loss: 1.7903 0.0685 sec/batch\n", + "Epoch 13/20 Iteration 2143/3560 Training loss: 1.7903 0.0686 sec/batch\n", + "Epoch 13/20 Iteration 2144/3560 Training loss: 1.7911 0.0629 sec/batch\n", + "Epoch 13/20 Iteration 2145/3560 Training loss: 1.7942 0.0656 sec/batch\n", + "Epoch 13/20 Iteration 2146/3560 Training loss: 1.7940 0.0704 sec/batch\n", + "Epoch 13/20 Iteration 2147/3560 Training loss: 1.7906 0.0669 sec/batch\n", + "Epoch 13/20 Iteration 2148/3560 Training loss: 1.7898 0.0676 sec/batch\n", + "Epoch 13/20 Iteration 2149/3560 Training loss: 1.7896 0.0637 sec/batch\n", + "Epoch 13/20 Iteration 2150/3560 Training loss: 1.7919 0.0627 sec/batch\n", + "Epoch 13/20 Iteration 2151/3560 Training loss: 1.7918 0.0711 sec/batch\n", + "Epoch 13/20 Iteration 2152/3560 Training loss: 1.7900 0.0687 sec/batch\n", + "Epoch 13/20 Iteration 2153/3560 Training loss: 1.7903 0.0737 sec/batch\n", + "Epoch 13/20 Iteration 2154/3560 Training loss: 1.7921 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2155/3560 Training loss: 1.7924 0.0652 sec/batch\n", + "Epoch 13/20 Iteration 2156/3560 Training loss: 1.7931 0.0690 sec/batch\n", + "Epoch 13/20 Iteration 2157/3560 Training loss: 1.7925 0.0669 sec/batch\n", + "Epoch 13/20 Iteration 2158/3560 Training loss: 1.7933 0.0689 sec/batch\n", + 
"Epoch 13/20 Iteration 2159/3560 Training loss: 1.7927 0.0641 sec/batch\n", + "Epoch 13/20 Iteration 2160/3560 Training loss: 1.7923 0.0847 sec/batch\n", + "Epoch 13/20 Iteration 2161/3560 Training loss: 1.7920 0.0653 sec/batch\n", + "Epoch 13/20 Iteration 2162/3560 Training loss: 1.7911 0.0689 sec/batch\n", + "Epoch 13/20 Iteration 2163/3560 Training loss: 1.7908 0.0668 sec/batch\n", + "Epoch 13/20 Iteration 2164/3560 Training loss: 1.7915 0.0748 sec/batch\n", + "Epoch 13/20 Iteration 2165/3560 Training loss: 1.7922 0.0650 sec/batch\n", + "Epoch 13/20 Iteration 2166/3560 Training loss: 1.7925 0.0660 sec/batch\n", + "Epoch 13/20 Iteration 2167/3560 Training loss: 1.7925 0.0675 sec/batch\n", + "Epoch 13/20 Iteration 2168/3560 Training loss: 1.7921 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2169/3560 Training loss: 1.7922 0.0652 sec/batch\n", + "Epoch 13/20 Iteration 2170/3560 Training loss: 1.7927 0.0747 sec/batch\n", + "Epoch 13/20 Iteration 2171/3560 Training loss: 1.7924 0.0699 sec/batch\n", + "Epoch 13/20 Iteration 2172/3560 Training loss: 1.7925 0.0642 sec/batch\n", + "Epoch 13/20 Iteration 2173/3560 Training loss: 1.7917 0.0727 sec/batch\n", + "Epoch 13/20 Iteration 2174/3560 Training loss: 1.7908 0.0679 sec/batch\n", + "Epoch 13/20 Iteration 2175/3560 Training loss: 1.7900 0.0654 sec/batch\n", + "Epoch 13/20 Iteration 2176/3560 Training loss: 1.7894 0.0656 sec/batch\n", + "Epoch 13/20 Iteration 2177/3560 Training loss: 1.7889 0.0738 sec/batch\n", + "Epoch 13/20 Iteration 2178/3560 Training loss: 1.7893 0.0679 sec/batch\n", + "Epoch 13/20 Iteration 2179/3560 Training loss: 1.7886 0.0724 sec/batch\n", + "Epoch 13/20 Iteration 2180/3560 Training loss: 1.7876 0.0686 sec/batch\n", + "Epoch 13/20 Iteration 2181/3560 Training loss: 1.7877 0.0627 sec/batch\n", + "Epoch 13/20 Iteration 2182/3560 Training loss: 1.7868 0.0645 sec/batch\n", + "Epoch 13/20 Iteration 2183/3560 Training loss: 1.7870 0.0689 sec/batch\n", + "Epoch 13/20 Iteration 2184/3560 Training loss: 
1.7868 0.0680 sec/batch\n", + "Epoch 13/20 Iteration 2185/3560 Training loss: 1.7864 0.0672 sec/batch\n", + "Epoch 13/20 Iteration 2186/3560 Training loss: 1.7871 0.0715 sec/batch\n", + "Epoch 13/20 Iteration 2187/3560 Training loss: 1.7868 0.0789 sec/batch\n", + "Epoch 13/20 Iteration 2188/3560 Training loss: 1.7874 0.0682 sec/batch\n", + "Epoch 13/20 Iteration 2189/3560 Training loss: 1.7872 0.0828 sec/batch\n", + "Epoch 13/20 Iteration 2190/3560 Training loss: 1.7872 0.0655 sec/batch\n", + "Epoch 13/20 Iteration 2191/3560 Training loss: 1.7867 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2192/3560 Training loss: 1.7869 0.0687 sec/batch\n", + "Epoch 13/20 Iteration 2193/3560 Training loss: 1.7872 0.0790 sec/batch\n", + "Epoch 13/20 Iteration 2194/3560 Training loss: 1.7870 0.0672 sec/batch\n", + "Epoch 13/20 Iteration 2195/3560 Training loss: 1.7867 0.0671 sec/batch\n", + "Epoch 13/20 Iteration 2196/3560 Training loss: 1.7871 0.0719 sec/batch\n", + "Epoch 13/20 Iteration 2197/3560 Training loss: 1.7870 0.0688 sec/batch\n", + "Epoch 13/20 Iteration 2198/3560 Training loss: 1.7878 0.0740 sec/batch\n", + "Epoch 13/20 Iteration 2199/3560 Training loss: 1.7881 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2200/3560 Training loss: 1.7883 0.0718 sec/batch\n", + "Epoch 13/20 Iteration 2201/3560 Training loss: 1.7882 0.0630 sec/batch\n", + "Epoch 13/20 Iteration 2202/3560 Training loss: 1.7885 0.0668 sec/batch\n", + "Epoch 13/20 Iteration 2203/3560 Training loss: 1.7887 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2204/3560 Training loss: 1.7884 0.0689 sec/batch\n", + "Epoch 13/20 Iteration 2205/3560 Training loss: 1.7883 0.0660 sec/batch\n", + "Epoch 13/20 Iteration 2206/3560 Training loss: 1.7882 0.0716 sec/batch\n", + "Epoch 13/20 Iteration 2207/3560 Training loss: 1.7886 0.0696 sec/batch\n", + "Epoch 13/20 Iteration 2208/3560 Training loss: 1.7887 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2209/3560 Training loss: 1.7892 0.0652 sec/batch\n", + "Epoch 13/20 
Iteration 2210/3560 Training loss: 1.7890 0.0690 sec/batch\n", + "[... epoch 13/20 per-iteration log truncated; loss drifts down to 1.7800 by iteration 2314, ~0.063-0.08 sec/batch ...]\n", + "Epoch 14/20 Iteration 2315/3560 Training loss: 1.8372 0.0682 sec/batch\n", + "[... epoch 14/20 per-iteration log truncated; loss ends at 1.7584 (iteration 2492) ...]\n", + "Epoch 15/20 Iteration 2493/3560 Training loss: 1.8154 0.0678 sec/batch\n", + "[... epoch 15/20 per-iteration log truncated; loss ends at 1.7395 (iteration 2670) ...]\n", + "Epoch 16/20 Iteration 2671/3560 Training loss: 1.7970 0.0686 sec/batch\n", + "[... epoch 16/20 per-iteration log truncated through iteration 2721 (loss 1.7279) ...]\n", + "Epoch 16/20 
Iteration 2722/3560 Training loss: 1.7290 0.0663 sec/batch\n", + "Epoch 16/20 Iteration 2723/3560 Training loss: 1.7289 0.0689 sec/batch\n", + "Epoch 16/20 Iteration 2724/3560 Training loss: 1.7288 0.0688 sec/batch\n", + "Epoch 16/20 Iteration 2725/3560 Training loss: 1.7285 0.0690 sec/batch\n", + "Epoch 16/20 Iteration 2726/3560 Training loss: 1.7287 0.0639 sec/batch\n", + "Epoch 16/20 Iteration 2727/3560 Training loss: 1.7293 0.0667 sec/batch\n", + "Epoch 16/20 Iteration 2728/3560 Training loss: 1.7290 0.0649 sec/batch\n", + "Epoch 16/20 Iteration 2729/3560 Training loss: 1.7286 0.0670 sec/batch\n", + "Epoch 16/20 Iteration 2730/3560 Training loss: 1.7291 0.0658 sec/batch\n", + "Epoch 16/20 Iteration 2731/3560 Training loss: 1.7290 0.0665 sec/batch\n", + "Epoch 16/20 Iteration 2732/3560 Training loss: 1.7297 0.0697 sec/batch\n", + "Epoch 16/20 Iteration 2733/3560 Training loss: 1.7301 0.0632 sec/batch\n", + "Epoch 16/20 Iteration 2734/3560 Training loss: 1.7304 0.0694 sec/batch\n", + "Epoch 16/20 Iteration 2735/3560 Training loss: 1.7301 0.0746 sec/batch\n", + "Epoch 16/20 Iteration 2736/3560 Training loss: 1.7305 0.0680 sec/batch\n", + "Epoch 16/20 Iteration 2737/3560 Training loss: 1.7308 0.0674 sec/batch\n", + "Epoch 16/20 Iteration 2738/3560 Training loss: 1.7304 0.0810 sec/batch\n", + "Epoch 16/20 Iteration 2739/3560 Training loss: 1.7302 0.0660 sec/batch\n", + "Epoch 16/20 Iteration 2740/3560 Training loss: 1.7302 0.0677 sec/batch\n", + "Epoch 16/20 Iteration 2741/3560 Training loss: 1.7306 0.0650 sec/batch\n", + "Epoch 16/20 Iteration 2742/3560 Training loss: 1.7306 0.0651 sec/batch\n", + "Epoch 16/20 Iteration 2743/3560 Training loss: 1.7310 0.0689 sec/batch\n", + "Epoch 16/20 Iteration 2744/3560 Training loss: 1.7308 0.0704 sec/batch\n", + "Epoch 16/20 Iteration 2745/3560 Training loss: 1.7308 0.0683 sec/batch\n", + "Epoch 16/20 Iteration 2746/3560 Training loss: 1.7308 0.0668 sec/batch\n", + "Epoch 16/20 Iteration 2747/3560 Training loss: 1.7308 0.0653 
sec/batch\n", + "Epoch 16/20 Iteration 2748/3560 Training loss: 1.7307 0.0645 sec/batch\n", + "Epoch 16/20 Iteration 2749/3560 Training loss: 1.7302 0.0647 sec/batch\n", + "Epoch 16/20 Iteration 2750/3560 Training loss: 1.7301 0.0653 sec/batch\n", + "Epoch 16/20 Iteration 2751/3560 Training loss: 1.7295 0.0649 sec/batch\n", + "Epoch 16/20 Iteration 2752/3560 Training loss: 1.7296 0.0782 sec/batch\n", + "Epoch 16/20 Iteration 2753/3560 Training loss: 1.7292 0.0665 sec/batch\n", + "Epoch 16/20 Iteration 2754/3560 Training loss: 1.7292 0.0714 sec/batch\n", + "Epoch 16/20 Iteration 2755/3560 Training loss: 1.7288 0.0657 sec/batch\n", + "Epoch 16/20 Iteration 2756/3560 Training loss: 1.7287 0.0642 sec/batch\n", + "Epoch 16/20 Iteration 2757/3560 Training loss: 1.7285 0.0755 sec/batch\n", + "Epoch 16/20 Iteration 2758/3560 Training loss: 1.7283 0.0716 sec/batch\n", + "Epoch 16/20 Iteration 2759/3560 Training loss: 1.7277 0.0698 sec/batch\n", + "Epoch 16/20 Iteration 2760/3560 Training loss: 1.7277 0.0648 sec/batch\n", + "Epoch 16/20 Iteration 2761/3560 Training loss: 1.7271 0.0653 sec/batch\n", + "Epoch 16/20 Iteration 2762/3560 Training loss: 1.7270 0.0683 sec/batch\n", + "Epoch 16/20 Iteration 2763/3560 Training loss: 1.7265 0.0639 sec/batch\n", + "Epoch 16/20 Iteration 2764/3560 Training loss: 1.7262 0.0655 sec/batch\n", + "Epoch 16/20 Iteration 2765/3560 Training loss: 1.7259 0.0685 sec/batch\n", + "Epoch 16/20 Iteration 2766/3560 Training loss: 1.7259 0.0762 sec/batch\n", + "Epoch 16/20 Iteration 2767/3560 Training loss: 1.7258 0.0661 sec/batch\n", + "Epoch 16/20 Iteration 2768/3560 Training loss: 1.7254 0.0646 sec/batch\n", + "Epoch 16/20 Iteration 2769/3560 Training loss: 1.7251 0.0665 sec/batch\n", + "Epoch 16/20 Iteration 2770/3560 Training loss: 1.7245 0.0693 sec/batch\n", + "Epoch 16/20 Iteration 2771/3560 Training loss: 1.7245 0.0721 sec/batch\n", + "Epoch 16/20 Iteration 2772/3560 Training loss: 1.7245 0.0719 sec/batch\n", + "Epoch 16/20 Iteration 2773/3560 
Training loss: 1.7242 0.0670 sec/batch\n", + "Epoch 16/20 Iteration 2774/3560 Training loss: 1.7239 0.0664 sec/batch\n", + "Epoch 16/20 Iteration 2775/3560 Training loss: 1.7237 0.0647 sec/batch\n", + "Epoch 16/20 Iteration 2776/3560 Training loss: 1.7236 0.0665 sec/batch\n", + "Epoch 16/20 Iteration 2777/3560 Training loss: 1.7235 0.0774 sec/batch\n", + "Epoch 16/20 Iteration 2778/3560 Training loss: 1.7234 0.0676 sec/batch\n", + "Epoch 16/20 Iteration 2779/3560 Training loss: 1.7234 0.0670 sec/batch\n", + "Epoch 16/20 Iteration 2780/3560 Training loss: 1.7235 0.0727 sec/batch\n", + "Epoch 16/20 Iteration 2781/3560 Training loss: 1.7236 0.0695 sec/batch\n", + "Epoch 16/20 Iteration 2782/3560 Training loss: 1.7234 0.0687 sec/batch\n", + "Epoch 16/20 Iteration 2783/3560 Training loss: 1.7233 0.0694 sec/batch\n", + "Epoch 16/20 Iteration 2784/3560 Training loss: 1.7231 0.0671 sec/batch\n", + "Epoch 16/20 Iteration 2785/3560 Training loss: 1.7229 0.0686 sec/batch\n", + "Epoch 16/20 Iteration 2786/3560 Training loss: 1.7225 0.0755 sec/batch\n", + "Epoch 16/20 Iteration 2787/3560 Training loss: 1.7226 0.0656 sec/batch\n", + "Epoch 16/20 Iteration 2788/3560 Training loss: 1.7225 0.0666 sec/batch\n", + "Epoch 16/20 Iteration 2789/3560 Training loss: 1.7224 0.0704 sec/batch\n", + "Epoch 16/20 Iteration 2790/3560 Training loss: 1.7224 0.0690 sec/batch\n", + "Epoch 16/20 Iteration 2791/3560 Training loss: 1.7224 0.0645 sec/batch\n", + "Epoch 16/20 Iteration 2792/3560 Training loss: 1.7221 0.0704 sec/batch\n", + "Epoch 16/20 Iteration 2793/3560 Training loss: 1.7218 0.0643 sec/batch\n", + "Epoch 16/20 Iteration 2794/3560 Training loss: 1.7219 0.0656 sec/batch\n", + "Epoch 16/20 Iteration 2795/3560 Training loss: 1.7219 0.0738 sec/batch\n", + "Epoch 16/20 Iteration 2796/3560 Training loss: 1.7215 0.0658 sec/batch\n", + "Epoch 16/20 Iteration 2797/3560 Training loss: 1.7216 0.0632 sec/batch\n", + "Epoch 16/20 Iteration 2798/3560 Training loss: 1.7217 0.0658 sec/batch\n", + 
"Epoch 16/20 Iteration 2799/3560 Training loss: 1.7214 0.0671 sec/batch\n", + "Epoch 16/20 Iteration 2800/3560 Training loss: 1.7213 0.0677 sec/batch\n", + "Epoch 16/20 Iteration 2801/3560 Training loss: 1.7209 0.0674 sec/batch\n", + "Epoch 16/20 Iteration 2802/3560 Training loss: 1.7207 0.0685 sec/batch\n", + "Epoch 16/20 Iteration 2803/3560 Training loss: 1.7208 0.0651 sec/batch\n", + "Epoch 16/20 Iteration 2804/3560 Training loss: 1.7209 0.0647 sec/batch\n", + "Epoch 16/20 Iteration 2805/3560 Training loss: 1.7208 0.0652 sec/batch\n", + "Epoch 16/20 Iteration 2806/3560 Training loss: 1.7209 0.0636 sec/batch\n", + "Epoch 16/20 Iteration 2807/3560 Training loss: 1.7211 0.0682 sec/batch\n", + "Epoch 16/20 Iteration 2808/3560 Training loss: 1.7212 0.0731 sec/batch\n", + "Epoch 16/20 Iteration 2809/3560 Training loss: 1.7212 0.0664 sec/batch\n", + "Epoch 16/20 Iteration 2810/3560 Training loss: 1.7210 0.0685 sec/batch\n", + "Epoch 16/20 Iteration 2811/3560 Training loss: 1.7213 0.0697 sec/batch\n", + "Epoch 16/20 Iteration 2812/3560 Training loss: 1.7212 0.0691 sec/batch\n", + "Epoch 16/20 Iteration 2813/3560 Training loss: 1.7212 0.0728 sec/batch\n", + "Epoch 16/20 Iteration 2814/3560 Training loss: 1.7213 0.0663 sec/batch\n", + "Epoch 16/20 Iteration 2815/3560 Training loss: 1.7212 0.0643 sec/batch\n", + "Epoch 16/20 Iteration 2816/3560 Training loss: 1.7213 0.0658 sec/batch\n", + "Epoch 16/20 Iteration 2817/3560 Training loss: 1.7213 0.0709 sec/batch\n", + "Epoch 16/20 Iteration 2818/3560 Training loss: 1.7214 0.0682 sec/batch\n", + "Epoch 16/20 Iteration 2819/3560 Training loss: 1.7214 0.0674 sec/batch\n", + "Epoch 16/20 Iteration 2820/3560 Training loss: 1.7213 0.0696 sec/batch\n", + "Epoch 16/20 Iteration 2821/3560 Training loss: 1.7210 0.0702 sec/batch\n", + "Epoch 16/20 Iteration 2822/3560 Training loss: 1.7212 0.0711 sec/batch\n", + "Epoch 16/20 Iteration 2823/3560 Training loss: 1.7212 0.0664 sec/batch\n", + "Epoch 16/20 Iteration 2824/3560 Training loss: 
1.7212 0.0696 sec/batch\n", + "Epoch 16/20 Iteration 2825/3560 Training loss: 1.7212 0.0697 sec/batch\n", + "Epoch 16/20 Iteration 2826/3560 Training loss: 1.7212 0.0745 sec/batch\n", + "Epoch 16/20 Iteration 2827/3560 Training loss: 1.7213 0.0667 sec/batch\n", + "Epoch 16/20 Iteration 2828/3560 Training loss: 1.7213 0.0689 sec/batch\n", + "Epoch 16/20 Iteration 2829/3560 Training loss: 1.7211 0.0675 sec/batch\n", + "Epoch 16/20 Iteration 2830/3560 Training loss: 1.7211 0.0697 sec/batch\n", + "Epoch 16/20 Iteration 2831/3560 Training loss: 1.7214 0.0689 sec/batch\n", + "Epoch 16/20 Iteration 2832/3560 Training loss: 1.7213 0.0676 sec/batch\n", + "Epoch 16/20 Iteration 2833/3560 Training loss: 1.7215 0.0657 sec/batch\n", + "Epoch 16/20 Iteration 2834/3560 Training loss: 1.7215 0.0689 sec/batch\n", + "Epoch 16/20 Iteration 2835/3560 Training loss: 1.7214 0.0678 sec/batch\n", + "Epoch 16/20 Iteration 2836/3560 Training loss: 1.7214 0.0713 sec/batch\n", + "Epoch 16/20 Iteration 2837/3560 Training loss: 1.7214 0.0673 sec/batch\n", + "Epoch 16/20 Iteration 2838/3560 Training loss: 1.7218 0.0700 sec/batch\n", + "Epoch 16/20 Iteration 2839/3560 Training loss: 1.7217 0.0665 sec/batch\n", + "Epoch 16/20 Iteration 2840/3560 Training loss: 1.7217 0.0665 sec/batch\n", + "Epoch 16/20 Iteration 2841/3560 Training loss: 1.7215 0.0652 sec/batch\n", + "Epoch 16/20 Iteration 2842/3560 Training loss: 1.7214 0.0676 sec/batch\n", + "Epoch 16/20 Iteration 2843/3560 Training loss: 1.7216 0.0676 sec/batch\n", + "Epoch 16/20 Iteration 2844/3560 Training loss: 1.7216 0.0694 sec/batch\n", + "Epoch 16/20 Iteration 2845/3560 Training loss: 1.7217 0.0680 sec/batch\n", + "Epoch 16/20 Iteration 2846/3560 Training loss: 1.7216 0.0654 sec/batch\n", + "Epoch 16/20 Iteration 2847/3560 Training loss: 1.7215 0.0665 sec/batch\n", + "Epoch 16/20 Iteration 2848/3560 Training loss: 1.7216 0.0649 sec/batch\n", + "Epoch 17/20 Iteration 2849/3560 Training loss: 1.7885 0.0771 sec/batch\n", + "Epoch 17/20 
Iteration 2850/3560 Training loss: 1.7524 0.0695 sec/batch\n", + "Epoch 17/20 Iteration 2851/3560 Training loss: 1.7381 0.0672 sec/batch\n", + "Epoch 17/20 Iteration 2852/3560 Training loss: 1.7334 0.0653 sec/batch\n", + "Epoch 17/20 Iteration 2853/3560 Training loss: 1.7281 0.0738 sec/batch\n", + "Epoch 17/20 Iteration 2854/3560 Training loss: 1.7181 0.0672 sec/batch\n", + "Epoch 17/20 Iteration 2855/3560 Training loss: 1.7179 0.0700 sec/batch\n", + "Epoch 17/20 Iteration 2856/3560 Training loss: 1.7160 0.0664 sec/batch\n", + "Epoch 17/20 Iteration 2857/3560 Training loss: 1.7181 0.0696 sec/batch\n", + "Epoch 17/20 Iteration 2858/3560 Training loss: 1.7199 0.0644 sec/batch\n", + "Epoch 17/20 Iteration 2859/3560 Training loss: 1.7168 0.0676 sec/batch\n", + "Epoch 17/20 Iteration 2860/3560 Training loss: 1.7159 0.0675 sec/batch\n", + "Epoch 17/20 Iteration 2861/3560 Training loss: 1.7154 0.0653 sec/batch\n", + "Epoch 17/20 Iteration 2862/3560 Training loss: 1.7173 0.0625 sec/batch\n", + "Epoch 17/20 Iteration 2863/3560 Training loss: 1.7164 0.0696 sec/batch\n", + "Epoch 17/20 Iteration 2864/3560 Training loss: 1.7149 0.0670 sec/batch\n", + "Epoch 17/20 Iteration 2865/3560 Training loss: 1.7151 0.0667 sec/batch\n", + "Epoch 17/20 Iteration 2866/3560 Training loss: 1.7168 0.0648 sec/batch\n", + "Epoch 17/20 Iteration 2867/3560 Training loss: 1.7169 0.0659 sec/batch\n", + "Epoch 17/20 Iteration 2868/3560 Training loss: 1.7174 0.0669 sec/batch\n", + "Epoch 17/20 Iteration 2869/3560 Training loss: 1.7167 0.0681 sec/batch\n", + "Epoch 17/20 Iteration 2870/3560 Training loss: 1.7169 0.0635 sec/batch\n", + "Epoch 17/20 Iteration 2871/3560 Training loss: 1.7158 0.0702 sec/batch\n", + "Epoch 17/20 Iteration 2872/3560 Training loss: 1.7155 0.0674 sec/batch\n", + "Epoch 17/20 Iteration 2873/3560 Training loss: 1.7152 0.0648 sec/batch\n", + "Epoch 17/20 Iteration 2874/3560 Training loss: 1.7147 0.0721 sec/batch\n", + "Epoch 17/20 Iteration 2875/3560 Training loss: 1.7143 0.0680 
sec/batch\n", + "Epoch 17/20 Iteration 2876/3560 Training loss: 1.7148 0.0666 sec/batch\n", + "Epoch 17/20 Iteration 2877/3560 Training loss: 1.7153 0.0675 sec/batch\n", + "Epoch 17/20 Iteration 2878/3560 Training loss: 1.7158 0.0786 sec/batch\n", + "Epoch 17/20 Iteration 2879/3560 Training loss: 1.7158 0.0669 sec/batch\n", + "Epoch 17/20 Iteration 2880/3560 Training loss: 1.7151 0.0705 sec/batch\n", + "Epoch 17/20 Iteration 2881/3560 Training loss: 1.7153 0.0691 sec/batch\n", + "Epoch 17/20 Iteration 2882/3560 Training loss: 1.7159 0.0652 sec/batch\n", + "Epoch 17/20 Iteration 2883/3560 Training loss: 1.7157 0.0692 sec/batch\n", + "Epoch 17/20 Iteration 2884/3560 Training loss: 1.7156 0.0730 sec/batch\n", + "Epoch 17/20 Iteration 2885/3560 Training loss: 1.7148 0.0664 sec/batch\n", + "Epoch 17/20 Iteration 2886/3560 Training loss: 1.7138 0.0775 sec/batch\n", + "Epoch 17/20 Iteration 2887/3560 Training loss: 1.7125 0.0634 sec/batch\n", + "Epoch 17/20 Iteration 2888/3560 Training loss: 1.7119 0.0664 sec/batch\n", + "Epoch 17/20 Iteration 2889/3560 Training loss: 1.7112 0.0674 sec/batch\n", + "Epoch 17/20 Iteration 2890/3560 Training loss: 1.7119 0.0644 sec/batch\n", + "Epoch 17/20 Iteration 2891/3560 Training loss: 1.7114 0.0651 sec/batch\n", + "Epoch 17/20 Iteration 2892/3560 Training loss: 1.7106 0.0696 sec/batch\n", + "Epoch 17/20 Iteration 2893/3560 Training loss: 1.7108 0.0852 sec/batch\n", + "Epoch 17/20 Iteration 2894/3560 Training loss: 1.7098 0.0697 sec/batch\n", + "Epoch 17/20 Iteration 2895/3560 Training loss: 1.7098 0.0717 sec/batch\n", + "Epoch 17/20 Iteration 2896/3560 Training loss: 1.7096 0.0659 sec/batch\n", + "Epoch 17/20 Iteration 2897/3560 Training loss: 1.7094 0.0833 sec/batch\n", + "Epoch 17/20 Iteration 2898/3560 Training loss: 1.7102 0.0654 sec/batch\n", + "Epoch 17/20 Iteration 2899/3560 Training loss: 1.7099 0.0688 sec/batch\n", + "Epoch 17/20 Iteration 2900/3560 Training loss: 1.7108 0.0750 sec/batch\n", + "Epoch 17/20 Iteration 2901/3560 
Training loss: 1.7108 0.0669 sec/batch\n", + "Epoch 17/20 Iteration 2902/3560 Training loss: 1.7108 0.0680 sec/batch\n", + "Epoch 17/20 Iteration 2903/3560 Training loss: 1.7105 0.0668 sec/batch\n", + "Epoch 17/20 Iteration 2904/3560 Training loss: 1.7107 0.0674 sec/batch\n", + "Epoch 17/20 Iteration 2905/3560 Training loss: 1.7111 0.0683 sec/batch\n", + "Epoch 17/20 Iteration 2906/3560 Training loss: 1.7110 0.0751 sec/batch\n", + "Epoch 17/20 Iteration 2907/3560 Training loss: 1.7104 0.0652 sec/batch\n", + "Epoch 17/20 Iteration 2908/3560 Training loss: 1.7108 0.0673 sec/batch\n", + "Epoch 17/20 Iteration 2909/3560 Training loss: 1.7108 0.0717 sec/batch\n", + "Epoch 17/20 Iteration 2910/3560 Training loss: 1.7117 0.0681 sec/batch\n", + "Epoch 17/20 Iteration 2911/3560 Training loss: 1.7120 0.0653 sec/batch\n", + "Epoch 17/20 Iteration 2912/3560 Training loss: 1.7122 0.0656 sec/batch\n", + "Epoch 17/20 Iteration 2913/3560 Training loss: 1.7120 0.0734 sec/batch\n", + "Epoch 17/20 Iteration 2914/3560 Training loss: 1.7122 0.0688 sec/batch\n", + "Epoch 17/20 Iteration 2915/3560 Training loss: 1.7123 0.0657 sec/batch\n", + "Epoch 17/20 Iteration 2916/3560 Training loss: 1.7119 0.0641 sec/batch\n", + "Epoch 17/20 Iteration 2917/3560 Training loss: 1.7117 0.0662 sec/batch\n", + "Epoch 17/20 Iteration 2918/3560 Training loss: 1.7115 0.0654 sec/batch\n", + "Epoch 17/20 Iteration 2919/3560 Training loss: 1.7121 0.0651 sec/batch\n", + "Epoch 17/20 Iteration 2920/3560 Training loss: 1.7125 0.0709 sec/batch\n", + "Epoch 17/20 Iteration 2921/3560 Training loss: 1.7129 0.0663 sec/batch\n", + "Epoch 17/20 Iteration 2922/3560 Training loss: 1.7127 0.0772 sec/batch\n", + "Epoch 17/20 Iteration 2923/3560 Training loss: 1.7126 0.0709 sec/batch\n", + "Epoch 17/20 Iteration 2924/3560 Training loss: 1.7128 0.0683 sec/batch\n", + "Epoch 17/20 Iteration 2925/3560 Training loss: 1.7129 0.0692 sec/batch\n", + "Epoch 17/20 Iteration 2926/3560 Training loss: 1.7131 0.0646 sec/batch\n", + 
"Epoch 17/20 Iteration 2927/3560 Training loss: 1.7124 0.0741 sec/batch\n", + "Epoch 17/20 Iteration 2928/3560 Training loss: 1.7123 0.0737 sec/batch\n", + "Epoch 17/20 Iteration 2929/3560 Training loss: 1.7118 0.0689 sec/batch\n", + "Epoch 17/20 Iteration 2930/3560 Training loss: 1.7119 0.0666 sec/batch\n", + "Epoch 17/20 Iteration 2931/3560 Training loss: 1.7116 0.0660 sec/batch\n", + "Epoch 17/20 Iteration 2932/3560 Training loss: 1.7116 0.0730 sec/batch\n", + "Epoch 17/20 Iteration 2933/3560 Training loss: 1.7111 0.0691 sec/batch\n", + "Epoch 17/20 Iteration 2934/3560 Training loss: 1.7109 0.0800 sec/batch\n", + "Epoch 17/20 Iteration 2935/3560 Training loss: 1.7106 0.0697 sec/batch\n", + "Epoch 17/20 Iteration 2936/3560 Training loss: 1.7103 0.0669 sec/batch\n", + "Epoch 17/20 Iteration 2937/3560 Training loss: 1.7099 0.0646 sec/batch\n", + "Epoch 17/20 Iteration 2938/3560 Training loss: 1.7099 0.0627 sec/batch\n", + "Epoch 17/20 Iteration 2939/3560 Training loss: 1.7096 0.0674 sec/batch\n", + "Epoch 17/20 Iteration 2940/3560 Training loss: 1.7094 0.0754 sec/batch\n", + "Epoch 17/20 Iteration 2941/3560 Training loss: 1.7090 0.0787 sec/batch\n", + "Epoch 17/20 Iteration 2942/3560 Training loss: 1.7086 0.0641 sec/batch\n", + "Epoch 17/20 Iteration 2943/3560 Training loss: 1.7084 0.0693 sec/batch\n", + "Epoch 17/20 Iteration 2944/3560 Training loss: 1.7084 0.0706 sec/batch\n", + "Epoch 17/20 Iteration 2945/3560 Training loss: 1.7083 0.0678 sec/batch\n", + "Epoch 17/20 Iteration 2946/3560 Training loss: 1.7079 0.0698 sec/batch\n", + "Epoch 17/20 Iteration 2947/3560 Training loss: 1.7077 0.0697 sec/batch\n", + "Epoch 17/20 Iteration 2948/3560 Training loss: 1.7072 0.0692 sec/batch\n", + "Epoch 17/20 Iteration 2949/3560 Training loss: 1.7073 0.0699 sec/batch\n", + "Epoch 17/20 Iteration 2950/3560 Training loss: 1.7071 0.0756 sec/batch\n", + "Epoch 17/20 Iteration 2951/3560 Training loss: 1.7068 0.0664 sec/batch\n", + "Epoch 17/20 Iteration 2952/3560 Training loss: 
1.7066 0.0644 sec/batch\n", + "Epoch 17/20 Iteration 2953/3560 Training loss: 1.7065 0.0670 sec/batch\n", + "Epoch 17/20 Iteration 2954/3560 Training loss: 1.7064 0.0695 sec/batch\n", + "Epoch 17/20 Iteration 2955/3560 Training loss: 1.7063 0.0681 sec/batch\n", + "Epoch 17/20 Iteration 2956/3560 Training loss: 1.7064 0.0651 sec/batch\n", + "Epoch 17/20 Iteration 2957/3560 Training loss: 1.7064 0.0651 sec/batch\n", + "Epoch 17/20 Iteration 2958/3560 Training loss: 1.7065 0.0645 sec/batch\n", + "Epoch 17/20 Iteration 2959/3560 Training loss: 1.7064 0.0646 sec/batch\n", + "Epoch 17/20 Iteration 2960/3560 Training loss: 1.7061 0.0719 sec/batch\n", + "Epoch 17/20 Iteration 2961/3560 Training loss: 1.7060 0.0662 sec/batch\n", + "Epoch 17/20 Iteration 2962/3560 Training loss: 1.7058 0.0678 sec/batch\n", + "Epoch 17/20 Iteration 2963/3560 Training loss: 1.7056 0.0731 sec/batch\n", + "Epoch 17/20 Iteration 2964/3560 Training loss: 1.7053 0.0671 sec/batch\n", + "Epoch 17/20 Iteration 2965/3560 Training loss: 1.7054 0.0693 sec/batch\n", + "Epoch 17/20 Iteration 2966/3560 Training loss: 1.7053 0.0661 sec/batch\n", + "Epoch 17/20 Iteration 2967/3560 Training loss: 1.7052 0.0650 sec/batch\n", + "Epoch 17/20 Iteration 2968/3560 Training loss: 1.7052 0.0658 sec/batch\n", + "Epoch 17/20 Iteration 2969/3560 Training loss: 1.7052 0.0796 sec/batch\n", + "Epoch 17/20 Iteration 2970/3560 Training loss: 1.7049 0.0635 sec/batch\n", + "Epoch 17/20 Iteration 2971/3560 Training loss: 1.7047 0.0690 sec/batch\n", + "Epoch 17/20 Iteration 2972/3560 Training loss: 1.7048 0.0685 sec/batch\n", + "Epoch 17/20 Iteration 2973/3560 Training loss: 1.7047 0.0727 sec/batch\n", + "Epoch 17/20 Iteration 2974/3560 Training loss: 1.7044 0.0664 sec/batch\n", + "Epoch 17/20 Iteration 2975/3560 Training loss: 1.7045 0.0681 sec/batch\n", + "Epoch 17/20 Iteration 2976/3560 Training loss: 1.7047 0.0745 sec/batch\n", + "Epoch 17/20 Iteration 2977/3560 Training loss: 1.7046 0.0667 sec/batch\n", + "Epoch 17/20 
Iteration 2978/3560 Training loss: 1.7044 0.0670 sec/batch\n", + "Epoch 17/20 Iteration 2979/3560 Training loss: 1.7041 0.0715 sec/batch\n", + "Epoch 17/20 Iteration 2980/3560 Training loss: 1.7039 0.0660 sec/batch\n", + "Epoch 17/20 Iteration 2981/3560 Training loss: 1.7039 0.0709 sec/batch\n", + "Epoch 17/20 Iteration 2982/3560 Training loss: 1.7040 0.0673 sec/batch\n", + "Epoch 17/20 Iteration 2983/3560 Training loss: 1.7039 0.0671 sec/batch\n", + "Epoch 17/20 Iteration 2984/3560 Training loss: 1.7040 0.0655 sec/batch\n", + "Epoch 17/20 Iteration 2985/3560 Training loss: 1.7041 0.0735 sec/batch\n", + "Epoch 17/20 Iteration 2986/3560 Training loss: 1.7042 0.0676 sec/batch\n", + "Epoch 17/20 Iteration 2987/3560 Training loss: 1.7043 0.0659 sec/batch\n", + "Epoch 17/20 Iteration 2988/3560 Training loss: 1.7042 0.0668 sec/batch\n", + "Epoch 17/20 Iteration 2989/3560 Training loss: 1.7044 0.0647 sec/batch\n", + "Epoch 17/20 Iteration 2990/3560 Training loss: 1.7043 0.0699 sec/batch\n", + "Epoch 17/20 Iteration 2991/3560 Training loss: 1.7042 0.0671 sec/batch\n", + "Epoch 17/20 Iteration 2992/3560 Training loss: 1.7042 0.0803 sec/batch\n", + "Epoch 17/20 Iteration 2993/3560 Training loss: 1.7041 0.0705 sec/batch\n", + "Epoch 17/20 Iteration 2994/3560 Training loss: 1.7041 0.0666 sec/batch\n", + "Epoch 17/20 Iteration 2995/3560 Training loss: 1.7042 0.0651 sec/batch\n", + "Epoch 17/20 Iteration 2996/3560 Training loss: 1.7043 0.0681 sec/batch\n", + "Epoch 17/20 Iteration 2997/3560 Training loss: 1.7043 0.0765 sec/batch\n", + "Epoch 17/20 Iteration 2998/3560 Training loss: 1.7043 0.0778 sec/batch\n", + "Epoch 17/20 Iteration 2999/3560 Training loss: 1.7040 0.0684 sec/batch\n", + "Epoch 17/20 Iteration 3000/3560 Training loss: 1.7041 0.0706 sec/batch\n", + "Epoch 17/20 Iteration 3001/3560 Training loss: 1.7041 0.0723 sec/batch\n", + "Epoch 17/20 Iteration 3002/3560 Training loss: 1.7042 0.0674 sec/batch\n", + "Epoch 17/20 Iteration 3003/3560 Training loss: 1.7042 0.0640 
sec/batch\n", + "Epoch 17/20 Iteration 3004/3560 Training loss: 1.7042 0.0679 sec/batch\n", + "Epoch 17/20 Iteration 3005/3560 Training loss: 1.7042 0.0653 sec/batch\n", + "Epoch 17/20 Iteration 3006/3560 Training loss: 1.7042 0.0650 sec/batch\n", + "Epoch 17/20 Iteration 3007/3560 Training loss: 1.7040 0.0688 sec/batch\n", + "Epoch 17/20 Iteration 3008/3560 Training loss: 1.7042 0.0699 sec/batch\n", + "Epoch 17/20 Iteration 3009/3560 Training loss: 1.7044 0.0644 sec/batch\n", + "Epoch 17/20 Iteration 3010/3560 Training loss: 1.7044 0.0647 sec/batch\n", + "Epoch 17/20 Iteration 3011/3560 Training loss: 1.7045 0.0818 sec/batch\n", + "Epoch 17/20 Iteration 3012/3560 Training loss: 1.7045 0.0646 sec/batch\n", + "Epoch 17/20 Iteration 3013/3560 Training loss: 1.7045 0.0669 sec/batch\n", + "Epoch 17/20 Iteration 3014/3560 Training loss: 1.7045 0.0694 sec/batch\n", + "Epoch 17/20 Iteration 3015/3560 Training loss: 1.7046 0.0720 sec/batch\n", + "Epoch 17/20 Iteration 3016/3560 Training loss: 1.7051 0.0680 sec/batch\n", + "Epoch 17/20 Iteration 3017/3560 Training loss: 1.7050 0.0700 sec/batch\n", + "Epoch 17/20 Iteration 3018/3560 Training loss: 1.7050 0.0694 sec/batch\n", + "Epoch 17/20 Iteration 3019/3560 Training loss: 1.7049 0.0672 sec/batch\n", + "Epoch 17/20 Iteration 3020/3560 Training loss: 1.7048 0.0688 sec/batch\n", + "Epoch 17/20 Iteration 3021/3560 Training loss: 1.7049 0.0666 sec/batch\n", + "Epoch 17/20 Iteration 3022/3560 Training loss: 1.7048 0.0661 sec/batch\n", + "Epoch 17/20 Iteration 3023/3560 Training loss: 1.7049 0.0645 sec/batch\n", + "Epoch 17/20 Iteration 3024/3560 Training loss: 1.7048 0.0691 sec/batch\n", + "Epoch 17/20 Iteration 3025/3560 Training loss: 1.7047 0.0648 sec/batch\n", + "Epoch 17/20 Iteration 3026/3560 Training loss: 1.7048 0.0676 sec/batch\n", + "Epoch 18/20 Iteration 3027/3560 Training loss: 1.7603 0.0643 sec/batch\n", + "Epoch 18/20 Iteration 3028/3560 Training loss: 1.7291 0.0634 sec/batch\n", + "Epoch 18/20 Iteration 3029/3560 
Training loss: 1.7175 0.0658 sec/batch\n", + "Epoch 18/20 Iteration 3030/3560 Training loss: 1.7142 0.0647 sec/batch\n", + "Epoch 18/20 Iteration 3031/3560 Training loss: 1.7103 0.0698 sec/batch\n", + "Epoch 18/20 Iteration 3032/3560 Training loss: 1.7020 0.0667 sec/batch\n", + "Epoch 18/20 Iteration 3033/3560 Training loss: 1.7033 0.0769 sec/batch\n", + "Epoch 18/20 Iteration 3034/3560 Training loss: 1.7024 0.0686 sec/batch\n", + "Epoch 18/20 Iteration 3035/3560 Training loss: 1.7034 0.0660 sec/batch\n", + "Epoch 18/20 Iteration 3036/3560 Training loss: 1.7044 0.0644 sec/batch\n", + "Epoch 18/20 Iteration 3037/3560 Training loss: 1.7011 0.0654 sec/batch\n", + "Epoch 18/20 Iteration 3038/3560 Training loss: 1.7004 0.0641 sec/batch\n", + "Epoch 18/20 Iteration 3039/3560 Training loss: 1.7001 0.0677 sec/batch\n", + "Epoch 18/20 Iteration 3040/3560 Training loss: 1.7023 0.0650 sec/batch\n", + "Epoch 18/20 Iteration 3041/3560 Training loss: 1.7022 0.0671 sec/batch\n", + "Epoch 18/20 Iteration 3042/3560 Training loss: 1.7001 0.0662 sec/batch\n", + "Epoch 18/20 Iteration 3043/3560 Training loss: 1.7003 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3044/3560 Training loss: 1.7020 0.0712 sec/batch\n", + "Epoch 18/20 Iteration 3045/3560 Training loss: 1.7023 0.0655 sec/batch\n", + "Epoch 18/20 Iteration 3046/3560 Training loss: 1.7025 0.0746 sec/batch\n", + "Epoch 18/20 Iteration 3047/3560 Training loss: 1.7016 0.0707 sec/batch\n", + "Epoch 18/20 Iteration 3048/3560 Training loss: 1.7021 0.0638 sec/batch\n", + "Epoch 18/20 Iteration 3049/3560 Training loss: 1.7013 0.0655 sec/batch\n", + "Epoch 18/20 Iteration 3050/3560 Training loss: 1.7009 0.0654 sec/batch\n", + "Epoch 18/20 Iteration 3051/3560 Training loss: 1.7011 0.0658 sec/batch\n", + "Epoch 18/20 Iteration 3052/3560 Training loss: 1.7006 0.0690 sec/batch\n", + "Epoch 18/20 Iteration 3053/3560 Training loss: 1.6999 0.0669 sec/batch\n", + "Epoch 18/20 Iteration 3054/3560 Training loss: 1.7008 0.0733 sec/batch\n", + 
"Epoch 18/20 Iteration 3055/3560 Training loss: 1.7013 0.0669 sec/batch\n", + "Epoch 18/20 Iteration 3056/3560 Training loss: 1.7014 0.0723 sec/batch\n", + "Epoch 18/20 Iteration 3057/3560 Training loss: 1.7014 0.0668 sec/batch\n", + "Epoch 18/20 Iteration 3058/3560 Training loss: 1.7007 0.0661 sec/batch\n", + "Epoch 18/20 Iteration 3059/3560 Training loss: 1.7012 0.0710 sec/batch\n", + "Epoch 18/20 Iteration 3060/3560 Training loss: 1.7014 0.0672 sec/batch\n", + "Epoch 18/20 Iteration 3061/3560 Training loss: 1.7009 0.0666 sec/batch\n", + "Epoch 18/20 Iteration 3062/3560 Training loss: 1.7008 0.0730 sec/batch\n", + "Epoch 18/20 Iteration 3063/3560 Training loss: 1.7002 0.0685 sec/batch\n", + "Epoch 18/20 Iteration 3064/3560 Training loss: 1.6992 0.0680 sec/batch\n", + "Epoch 18/20 Iteration 3065/3560 Training loss: 1.6982 0.0699 sec/batch\n", + "Epoch 18/20 Iteration 3066/3560 Training loss: 1.6976 0.0647 sec/batch\n", + "Epoch 18/20 Iteration 3067/3560 Training loss: 1.6970 0.0655 sec/batch\n", + "Epoch 18/20 Iteration 3068/3560 Training loss: 1.6978 0.0703 sec/batch\n", + "Epoch 18/20 Iteration 3069/3560 Training loss: 1.6972 0.0801 sec/batch\n", + "Epoch 18/20 Iteration 3070/3560 Training loss: 1.6966 0.0742 sec/batch\n", + "Epoch 18/20 Iteration 3071/3560 Training loss: 1.6967 0.0726 sec/batch\n", + "Epoch 18/20 Iteration 3072/3560 Training loss: 1.6961 0.0661 sec/batch\n", + "Epoch 18/20 Iteration 3073/3560 Training loss: 1.6958 0.0678 sec/batch\n", + "Epoch 18/20 Iteration 3074/3560 Training loss: 1.6955 0.0670 sec/batch\n", + "Epoch 18/20 Iteration 3075/3560 Training loss: 1.6953 0.0709 sec/batch\n", + "Epoch 18/20 Iteration 3076/3560 Training loss: 1.6958 0.0747 sec/batch\n", + "Epoch 18/20 Iteration 3077/3560 Training loss: 1.6954 0.0658 sec/batch\n", + "Epoch 18/20 Iteration 3078/3560 Training loss: 1.6960 0.0729 sec/batch\n", + "Epoch 18/20 Iteration 3079/3560 Training loss: 1.6962 0.0696 sec/batch\n", + "Epoch 18/20 Iteration 3080/3560 Training loss: 
1.6964 0.0699 sec/batch\n", + "Epoch 18/20 Iteration 3081/3560 Training loss: 1.6959 0.0665 sec/batch\n", + "Epoch 18/20 Iteration 3082/3560 Training loss: 1.6963 0.0659 sec/batch\n", + "Epoch 18/20 Iteration 3083/3560 Training loss: 1.6966 0.0710 sec/batch\n", + "Epoch 18/20 Iteration 3084/3560 Training loss: 1.6965 0.0702 sec/batch\n", + "Epoch 18/20 Iteration 3085/3560 Training loss: 1.6959 0.0690 sec/batch\n", + "Epoch 18/20 Iteration 3086/3560 Training loss: 1.6965 0.0681 sec/batch\n", + "Epoch 18/20 Iteration 3087/3560 Training loss: 1.6966 0.0650 sec/batch\n", + "Epoch 18/20 Iteration 3088/3560 Training loss: 1.6973 0.0680 sec/batch\n", + "Epoch 18/20 Iteration 3089/3560 Training loss: 1.6978 0.0726 sec/batch\n", + "Epoch 18/20 Iteration 3090/3560 Training loss: 1.6981 0.0724 sec/batch\n", + "Epoch 18/20 Iteration 3091/3560 Training loss: 1.6977 0.0661 sec/batch\n", + "Epoch 18/20 Iteration 3092/3560 Training loss: 1.6978 0.0692 sec/batch\n", + "Epoch 18/20 Iteration 3093/3560 Training loss: 1.6980 0.0725 sec/batch\n", + "Epoch 18/20 Iteration 3094/3560 Training loss: 1.6977 0.0674 sec/batch\n", + "Epoch 18/20 Iteration 3095/3560 Training loss: 1.6975 0.0693 sec/batch\n", + "Epoch 18/20 Iteration 3096/3560 Training loss: 1.6973 0.0793 sec/batch\n", + "Epoch 18/20 Iteration 3097/3560 Training loss: 1.6978 0.0680 sec/batch\n", + "Epoch 18/20 Iteration 3098/3560 Training loss: 1.6980 0.0667 sec/batch\n", + "Epoch 18/20 Iteration 3099/3560 Training loss: 1.6985 0.0669 sec/batch\n", + "Epoch 18/20 Iteration 3100/3560 Training loss: 1.6981 0.0715 sec/batch\n", + "Epoch 18/20 Iteration 3101/3560 Training loss: 1.6981 0.0713 sec/batch\n", + "Epoch 18/20 Iteration 3102/3560 Training loss: 1.6983 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3103/3560 Training loss: 1.6983 0.0684 sec/batch\n", + "Epoch 18/20 Iteration 3104/3560 Training loss: 1.6983 0.0732 sec/batch\n", + "Epoch 18/20 Iteration 3105/3560 Training loss: 1.6977 0.0690 sec/batch\n", + "Epoch 18/20 
Iteration 3106/3560 Training loss: 1.6975 0.0692 sec/batch\n", + "Epoch 18/20 Iteration 3107/3560 Training loss: 1.6970 0.0655 sec/batch\n", + "Epoch 18/20 Iteration 3108/3560 Training loss: 1.6971 0.0667 sec/batch\n", + "Epoch 18/20 Iteration 3109/3560 Training loss: 1.6966 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3110/3560 Training loss: 1.6966 0.0645 sec/batch\n", + "Epoch 18/20 Iteration 3111/3560 Training loss: 1.6961 0.0664 sec/batch\n", + "Epoch 18/20 Iteration 3112/3560 Training loss: 1.6959 0.0689 sec/batch\n", + "Epoch 18/20 Iteration 3113/3560 Training loss: 1.6957 0.0737 sec/batch\n", + "Epoch 18/20 Iteration 3114/3560 Training loss: 1.6954 0.0738 sec/batch\n", + "Epoch 18/20 Iteration 3115/3560 Training loss: 1.6950 0.0709 sec/batch\n", + "Epoch 18/20 Iteration 3116/3560 Training loss: 1.6950 0.0683 sec/batch\n", + "Epoch 18/20 Iteration 3117/3560 Training loss: 1.6947 0.0654 sec/batch\n", + "Epoch 18/20 Iteration 3118/3560 Training loss: 1.6947 0.0760 sec/batch\n", + "Epoch 18/20 Iteration 3119/3560 Training loss: 1.6944 0.0668 sec/batch\n", + "Epoch 18/20 Iteration 3120/3560 Training loss: 1.6941 0.0677 sec/batch\n", + "Epoch 18/20 Iteration 3121/3560 Training loss: 1.6938 0.0660 sec/batch\n", + "Epoch 18/20 Iteration 3122/3560 Training loss: 1.6939 0.0693 sec/batch\n", + "Epoch 18/20 Iteration 3123/3560 Training loss: 1.6938 0.0689 sec/batch\n", + "Epoch 18/20 Iteration 3124/3560 Training loss: 1.6933 0.0654 sec/batch\n", + "Epoch 18/20 Iteration 3125/3560 Training loss: 1.6930 0.0800 sec/batch\n", + "Epoch 18/20 Iteration 3126/3560 Training loss: 1.6926 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3127/3560 Training loss: 1.6926 0.0699 sec/batch\n", + "Epoch 18/20 Iteration 3128/3560 Training loss: 1.6925 0.0712 sec/batch\n", + "Epoch 18/20 Iteration 3129/3560 Training loss: 1.6924 0.0694 sec/batch\n", + "Epoch 18/20 Iteration 3130/3560 Training loss: 1.6922 0.0742 sec/batch\n", + "Epoch 18/20 Iteration 3131/3560 Training loss: 1.6920 0.0657 
sec/batch\n", + "Epoch 18/20 Iteration 3132/3560 Training loss: 1.6919 0.0708 sec/batch\n", + "Epoch 18/20 Iteration 3133/3560 Training loss: 1.6917 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3134/3560 Training loss: 1.6918 0.0649 sec/batch\n", + "Epoch 18/20 Iteration 3135/3560 Training loss: 1.6918 0.0718 sec/batch\n", + "Epoch 18/20 Iteration 3136/3560 Training loss: 1.6919 0.0700 sec/batch\n", + "Epoch 18/20 Iteration 3137/3560 Training loss: 1.6918 0.0711 sec/batch\n", + "Epoch 18/20 Iteration 3138/3560 Training loss: 1.6916 0.0668 sec/batch\n", + "Epoch 18/20 Iteration 3139/3560 Training loss: 1.6915 0.0678 sec/batch\n", + "Epoch 18/20 Iteration 3140/3560 Training loss: 1.6913 0.0671 sec/batch\n", + "Epoch 18/20 Iteration 3141/3560 Training loss: 1.6911 0.0652 sec/batch\n", + "Epoch 18/20 Iteration 3142/3560 Training loss: 1.6908 0.0642 sec/batch\n", + "Epoch 18/20 Iteration 3143/3560 Training loss: 1.6907 0.0720 sec/batch\n", + "Epoch 18/20 Iteration 3144/3560 Training loss: 1.6907 0.0644 sec/batch\n", + "Epoch 18/20 Iteration 3145/3560 Training loss: 1.6907 0.0629 sec/batch\n", + "Epoch 18/20 Iteration 3146/3560 Training loss: 1.6907 0.0660 sec/batch\n", + "Epoch 18/20 Iteration 3147/3560 Training loss: 1.6908 0.0649 sec/batch\n", + "Epoch 18/20 Iteration 3148/3560 Training loss: 1.6905 0.0676 sec/batch\n", + "Epoch 18/20 Iteration 3149/3560 Training loss: 1.6902 0.0694 sec/batch\n", + "Epoch 18/20 Iteration 3150/3560 Training loss: 1.6903 0.0720 sec/batch\n", + "Epoch 18/20 Iteration 3151/3560 Training loss: 1.6903 0.0636 sec/batch\n", + "Epoch 18/20 Iteration 3152/3560 Training loss: 1.6899 0.0708 sec/batch\n", + "Epoch 18/20 Iteration 3153/3560 Training loss: 1.6901 0.0773 sec/batch\n", + "Epoch 18/20 Iteration 3154/3560 Training loss: 1.6902 0.0679 sec/batch\n", + "Epoch 18/20 Iteration 3155/3560 Training loss: 1.6900 0.0698 sec/batch\n", + "Epoch 18/20 Iteration 3156/3560 Training loss: 1.6898 0.0691 sec/batch\n", + "Epoch 18/20 Iteration 3157/3560 
Training loss: 1.6895 0.0708 sec/batch\n", + "Epoch 18/20 Iteration 3158/3560 Training loss: 1.6893 0.0733 sec/batch\n", + "Epoch 18/20 Iteration 3159/3560 Training loss: 1.6893 0.0696 sec/batch\n", + "Epoch 18/20 Iteration 3160/3560 Training loss: 1.6893 0.0727 sec/batch\n", + "Epoch 18/20 Iteration 3161/3560 Training loss: 1.6893 0.0661 sec/batch\n", + "Epoch 18/20 Iteration 3162/3560 Training loss: 1.6893 0.0784 sec/batch\n", + "Epoch 18/20 Iteration 3163/3560 Training loss: 1.6894 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3164/3560 Training loss: 1.6895 0.0717 sec/batch\n", + "Epoch 18/20 Iteration 3165/3560 Training loss: 1.6895 0.0792 sec/batch\n", + "Epoch 18/20 Iteration 3166/3560 Training loss: 1.6894 0.0699 sec/batch\n", + "Epoch 18/20 Iteration 3167/3560 Training loss: 1.6897 0.0699 sec/batch\n", + "Epoch 18/20 Iteration 3168/3560 Training loss: 1.6896 0.0651 sec/batch\n", + "Epoch 18/20 Iteration 3169/3560 Training loss: 1.6896 0.0635 sec/batch\n", + "Epoch 18/20 Iteration 3170/3560 Training loss: 1.6896 0.0653 sec/batch\n", + "Epoch 18/20 Iteration 3171/3560 Training loss: 1.6895 0.0686 sec/batch\n", + "Epoch 18/20 Iteration 3172/3560 Training loss: 1.6896 0.0699 sec/batch\n", + "Epoch 18/20 Iteration 3173/3560 Training loss: 1.6896 0.0659 sec/batch\n", + "Epoch 18/20 Iteration 3174/3560 Training loss: 1.6898 0.0655 sec/batch\n", + "Epoch 18/20 Iteration 3175/3560 Training loss: 1.6899 0.0678 sec/batch\n", + "Epoch 18/20 Iteration 3176/3560 Training loss: 1.6898 0.0707 sec/batch\n", + "Epoch 18/20 Iteration 3177/3560 Training loss: 1.6895 0.0690 sec/batch\n", + "Epoch 18/20 Iteration 3178/3560 Training loss: 1.6896 0.0664 sec/batch\n", + "Epoch 18/20 Iteration 3179/3560 Training loss: 1.6896 0.0698 sec/batch\n", + "Epoch 18/20 Iteration 3180/3560 Training loss: 1.6897 0.0695 sec/batch\n", + "Epoch 18/20 Iteration 3181/3560 Training loss: 1.6897 0.0933 sec/batch\n", + "Epoch 18/20 Iteration 3182/3560 Training loss: 1.6897 0.0716 sec/batch\n", + 
"Epoch 18/20 Iteration 3183/3560 Training loss: 1.6898 0.0642 sec/batch\n", + "Epoch 18/20 Iteration 3184/3560 Training loss: 1.6898 0.0662 sec/batch\n", + "Epoch 18/20 Iteration 3185/3560 Training loss: 1.6895 0.0692 sec/batch\n", + "Epoch 18/20 Iteration 3186/3560 Training loss: 1.6897 0.0722 sec/batch\n", + "Epoch 18/20 Iteration 3187/3560 Training loss: 1.6899 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3188/3560 Training loss: 1.6899 0.0719 sec/batch\n", + "Epoch 18/20 Iteration 3189/3560 Training loss: 1.6899 0.0669 sec/batch\n", + "Epoch 18/20 Iteration 3190/3560 Training loss: 1.6899 0.0712 sec/batch\n", + "Epoch 18/20 Iteration 3191/3560 Training loss: 1.6899 0.0662 sec/batch\n", + "Epoch 18/20 Iteration 3192/3560 Training loss: 1.6899 0.0727 sec/batch\n", + "Epoch 18/20 Iteration 3193/3560 Training loss: 1.6900 0.0763 sec/batch\n", + "Epoch 18/20 Iteration 3194/3560 Training loss: 1.6904 0.0646 sec/batch\n", + "Epoch 18/20 Iteration 3195/3560 Training loss: 1.6904 0.0694 sec/batch\n", + "Epoch 18/20 Iteration 3196/3560 Training loss: 1.6903 0.0705 sec/batch\n", + "Epoch 18/20 Iteration 3197/3560 Training loss: 1.6903 0.0696 sec/batch\n", + "Epoch 18/20 Iteration 3198/3560 Training loss: 1.6901 0.0657 sec/batch\n", + "Epoch 18/20 Iteration 3199/3560 Training loss: 1.6902 0.0661 sec/batch\n", + "Epoch 18/20 Iteration 3200/3560 Training loss: 1.6902 0.0684 sec/batch\n", + "Epoch 18/20 Iteration 3201/3560 Training loss: 1.6904 0.0644 sec/batch\n", + "Epoch 18/20 Iteration 3202/3560 Training loss: 1.6903 0.0761 sec/batch\n", + "Epoch 18/20 Iteration 3203/3560 Training loss: 1.6902 0.0686 sec/batch\n", + "Epoch 18/20 Iteration 3204/3560 Training loss: 1.6903 0.0721 sec/batch\n", + "Epoch 19/20 Iteration 3205/3560 Training loss: 1.7493 0.0680 sec/batch\n", + "Epoch 19/20 Iteration 3206/3560 Training loss: 1.7173 0.0631 sec/batch\n", + "Epoch 19/20 Iteration 3207/3560 Training loss: 1.7060 0.0636 sec/batch\n", + "Epoch 19/20 Iteration 3208/3560 Training loss: 
1.7029 0.0662 sec/batch\n", + "Epoch 19/20 Iteration 3209/3560 Training loss: 1.6966 0.0703 sec/batch\n", + "Epoch 19/20 Iteration 3210/3560 Training loss: 1.6880 0.0660 sec/batch\n", + "Epoch 19/20 Iteration 3211/3560 Training loss: 1.6903 0.0645 sec/batch\n", + "Epoch 19/20 Iteration 3212/3560 Training loss: 1.6895 0.0656 sec/batch\n", + "Epoch 19/20 Iteration 3213/3560 Training loss: 1.6907 0.0676 sec/batch\n", + "Epoch 19/20 Iteration 3214/3560 Training loss: 1.6907 0.0693 sec/batch\n", + "Epoch 19/20 Iteration 3215/3560 Training loss: 1.6866 0.0696 sec/batch\n", + "Epoch 19/20 Iteration 3216/3560 Training loss: 1.6863 0.0808 sec/batch\n", + "Epoch 19/20 Iteration 3217/3560 Training loss: 1.6853 0.0687 sec/batch\n", + "Epoch 19/20 Iteration 3218/3560 Training loss: 1.6874 0.0665 sec/batch\n", + "Epoch 19/20 Iteration 3219/3560 Training loss: 1.6859 0.0655 sec/batch\n", + "Epoch 19/20 Iteration 3220/3560 Training loss: 1.6844 0.0695 sec/batch\n", + "Epoch 19/20 Iteration 3221/3560 Training loss: 1.6849 0.0667 sec/batch\n", + "Epoch 19/20 Iteration 3222/3560 Training loss: 1.6871 0.0652 sec/batch\n", + "Epoch 19/20 Iteration 3223/3560 Training loss: 1.6869 0.0644 sec/batch\n", + "Epoch 19/20 Iteration 3224/3560 Training loss: 1.6874 0.0674 sec/batch\n", + "Epoch 19/20 Iteration 3225/3560 Training loss: 1.6871 0.0663 sec/batch\n", + "Epoch 19/20 Iteration 3226/3560 Training loss: 1.6871 0.0718 sec/batch\n", + "Epoch 19/20 Iteration 3227/3560 Training loss: 1.6864 0.0638 sec/batch\n", + "Epoch 19/20 Iteration 3228/3560 Training loss: 1.6864 0.0667 sec/batch\n", + "Epoch 19/20 Iteration 3229/3560 Training loss: 1.6860 0.0678 sec/batch\n", + "Epoch 19/20 Iteration 3230/3560 Training loss: 1.6852 0.0655 sec/batch\n", + "Epoch 19/20 Iteration 3231/3560 Training loss: 1.6843 0.0707 sec/batch\n", + "Epoch 19/20 Iteration 3232/3560 Training loss: 1.6850 0.0688 sec/batch\n", + "Epoch 19/20 Iteration 3233/3560 Training loss: 1.6856 0.0648 sec/batch\n", + "Epoch 19/20 
Iteration 3234/3560 Training loss: 1.6859 0.0676 sec/batch\n", + "Epoch 19/20 Iteration 3235/3560 Training loss: 1.6857 0.0660 sec/batch\n", + "Epoch 19/20 Iteration 3236/3560 Training loss: 1.6851 0.0717 sec/batch\n", + "Epoch 19/20 Iteration 3237/3560 Training loss: 1.6854 0.0670 sec/batch\n", + "Epoch 19/20 Iteration 3238/3560 Training loss: 1.6862 0.0685 sec/batch\n", + "Epoch 19/20 Iteration 3239/3560 Training loss: 1.6861 0.0693 sec/batch\n", + "Epoch 19/20 Iteration 3240/3560 Training loss: 1.6856 0.0672 sec/batch\n", + "Epoch 19/20 Iteration 3241/3560 Training loss: 1.6850 0.0688 sec/batch\n", + "Epoch 19/20 Iteration 3242/3560 Training loss: 1.6841 0.0773 sec/batch\n", + "Epoch 19/20 Iteration 3243/3560 Training loss: 1.6828 0.0666 sec/batch\n", + "Epoch 19/20 Iteration 3244/3560 Training loss: 1.6823 0.0656 sec/batch\n", + "Epoch 19/20 Iteration 3245/3560 Training loss: 1.6818 0.0681 sec/batch\n", + "Epoch 19/20 Iteration 3246/3560 Training loss: 1.6823 0.0709 sec/batch\n", + "Epoch 19/20 Iteration 3247/3560 Training loss: 1.6819 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3248/3560 Training loss: 1.6814 0.0642 sec/batch\n", + "Epoch 19/20 Iteration 3249/3560 Training loss: 1.6815 0.0712 sec/batch\n", + "Epoch 19/20 Iteration 3250/3560 Training loss: 1.6808 0.0711 sec/batch\n", + "Epoch 19/20 Iteration 3251/3560 Training loss: 1.6805 0.0655 sec/batch\n", + "Epoch 19/20 Iteration 3252/3560 Training loss: 1.6802 0.0661 sec/batch\n", + "Epoch 19/20 Iteration 3253/3560 Training loss: 1.6801 0.0718 sec/batch\n", + "Epoch 19/20 Iteration 3254/3560 Training loss: 1.6806 0.0715 sec/batch\n", + "Epoch 19/20 Iteration 3255/3560 Training loss: 1.6801 0.0623 sec/batch\n", + "Epoch 19/20 Iteration 3256/3560 Training loss: 1.6809 0.0671 sec/batch\n", + "Epoch 19/20 Iteration 3257/3560 Training loss: 1.6809 0.0767 sec/batch\n", + "Epoch 19/20 Iteration 3258/3560 Training loss: 1.6811 0.0685 sec/batch\n", + "Epoch 19/20 Iteration 3259/3560 Training loss: 1.6808 0.0684 
sec/batch\n", + "Epoch 19/20 Iteration 3260/3560 Training loss: 1.6809 0.0739 sec/batch\n", + "Epoch 19/20 Iteration 3261/3560 Training loss: 1.6814 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3262/3560 Training loss: 1.6812 0.0651 sec/batch\n", + "Epoch 19/20 Iteration 3263/3560 Training loss: 1.6808 0.0630 sec/batch\n", + "Epoch 19/20 Iteration 3264/3560 Training loss: 1.6813 0.0690 sec/batch\n", + "Epoch 19/20 Iteration 3265/3560 Training loss: 1.6814 0.0662 sec/batch\n", + "Epoch 19/20 Iteration 3266/3560 Training loss: 1.6823 0.0655 sec/batch\n", + "Epoch 19/20 Iteration 3267/3560 Training loss: 1.6830 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3268/3560 Training loss: 1.6833 0.0691 sec/batch\n", + "Epoch 19/20 Iteration 3269/3560 Training loss: 1.6833 0.0761 sec/batch\n", + "Epoch 19/20 Iteration 3270/3560 Training loss: 1.6835 0.0698 sec/batch\n", + "Epoch 19/20 Iteration 3271/3560 Training loss: 1.6837 0.0703 sec/batch\n", + "Epoch 19/20 Iteration 3272/3560 Training loss: 1.6831 0.0718 sec/batch\n", + "Epoch 19/20 Iteration 3273/3560 Training loss: 1.6830 0.0651 sec/batch\n", + "Epoch 19/20 Iteration 3274/3560 Training loss: 1.6830 0.0657 sec/batch\n", + "Epoch 19/20 Iteration 3275/3560 Training loss: 1.6835 0.0667 sec/batch\n", + "Epoch 19/20 Iteration 3276/3560 Training loss: 1.6836 0.0676 sec/batch\n", + "Epoch 19/20 Iteration 3277/3560 Training loss: 1.6840 0.0679 sec/batch\n", + "Epoch 19/20 Iteration 3278/3560 Training loss: 1.6839 0.0666 sec/batch\n", + "Epoch 19/20 Iteration 3279/3560 Training loss: 1.6837 0.0662 sec/batch\n", + "Epoch 19/20 Iteration 3280/3560 Training loss: 1.6839 0.0680 sec/batch\n", + "Epoch 19/20 Iteration 3281/3560 Training loss: 1.6839 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3282/3560 Training loss: 1.6839 0.0661 sec/batch\n", + "Epoch 19/20 Iteration 3283/3560 Training loss: 1.6835 0.0784 sec/batch\n", + "Epoch 19/20 Iteration 3284/3560 Training loss: 1.6832 0.0696 sec/batch\n", + "Epoch 19/20 Iteration 3285/3560 
Training loss: 1.6826 0.0723 sec/batch\n", + "Epoch 19/20 Iteration 3286/3560 Training loss: 1.6827 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3287/3560 Training loss: 1.6823 0.0742 sec/batch\n", + "Epoch 19/20 Iteration 3288/3560 Training loss: 1.6823 0.0740 sec/batch\n", + "Epoch 19/20 Iteration 3289/3560 Training loss: 1.6820 0.0722 sec/batch\n", + "Epoch 19/20 Iteration 3290/3560 Training loss: 1.6818 0.0711 sec/batch\n", + "Epoch 19/20 Iteration 3291/3560 Training loss: 1.6816 0.0675 sec/batch\n", + "Epoch 19/20 Iteration 3292/3560 Training loss: 1.6814 0.0675 sec/batch\n", + "Epoch 19/20 Iteration 3293/3560 Training loss: 1.6810 0.0652 sec/batch\n", + "Epoch 19/20 Iteration 3294/3560 Training loss: 1.6811 0.0676 sec/batch\n", + "Epoch 19/20 Iteration 3295/3560 Training loss: 1.6807 0.0716 sec/batch\n", + "Epoch 19/20 Iteration 3296/3560 Training loss: 1.6806 0.0671 sec/batch\n", + "Epoch 19/20 Iteration 3297/3560 Training loss: 1.6804 0.0666 sec/batch\n", + "Epoch 19/20 Iteration 3298/3560 Training loss: 1.6800 0.0698 sec/batch\n", + "Epoch 19/20 Iteration 3299/3560 Training loss: 1.6798 0.0666 sec/batch\n", + "Epoch 19/20 Iteration 3300/3560 Training loss: 1.6797 0.0697 sec/batch\n", + "Epoch 19/20 Iteration 3301/3560 Training loss: 1.6798 0.0727 sec/batch\n", + "Epoch 19/20 Iteration 3302/3560 Training loss: 1.6794 0.0645 sec/batch\n", + "Epoch 19/20 Iteration 3303/3560 Training loss: 1.6791 0.0707 sec/batch\n", + "Epoch 19/20 Iteration 3304/3560 Training loss: 1.6787 0.0690 sec/batch\n", + "Epoch 19/20 Iteration 3305/3560 Training loss: 1.6786 0.0642 sec/batch\n", + "Epoch 19/20 Iteration 3306/3560 Training loss: 1.6784 0.0640 sec/batch\n", + "Epoch 19/20 Iteration 3307/3560 Training loss: 1.6781 0.0676 sec/batch\n", + "Epoch 19/20 Iteration 3308/3560 Training loss: 1.6780 0.0751 sec/batch\n", + "Epoch 19/20 Iteration 3309/3560 Training loss: 1.6779 0.0664 sec/batch\n", + "Epoch 19/20 Iteration 3310/3560 Training loss: 1.6778 0.0666 sec/batch\n", + 
"Epoch 19/20 Iteration 3311/3560 Training loss: 1.6777 0.0701 sec/batch\n", + "Epoch 19/20 Iteration 3312/3560 Training loss: 1.6777 0.0683 sec/batch\n", + "Epoch 19/20 Iteration 3313/3560 Training loss: 1.6777 0.0652 sec/batch\n", + "Epoch 19/20 Iteration 3314/3560 Training loss: 1.6777 0.0652 sec/batch\n", + "Epoch 19/20 Iteration 3315/3560 Training loss: 1.6776 0.0682 sec/batch\n", + "Epoch 19/20 Iteration 3316/3560 Training loss: 1.6776 0.0685 sec/batch\n", + "Epoch 19/20 Iteration 3317/3560 Training loss: 1.6775 0.0724 sec/batch\n", + "Epoch 19/20 Iteration 3318/3560 Training loss: 1.6773 0.0665 sec/batch\n", + "Epoch 19/20 Iteration 3319/3560 Training loss: 1.6771 0.0719 sec/batch\n", + "Epoch 19/20 Iteration 3320/3560 Training loss: 1.6768 0.0689 sec/batch\n", + "Epoch 19/20 Iteration 3321/3560 Training loss: 1.6769 0.0655 sec/batch\n", + "Epoch 19/20 Iteration 3322/3560 Training loss: 1.6769 0.0655 sec/batch\n", + "Epoch 19/20 Iteration 3323/3560 Training loss: 1.6769 0.0642 sec/batch\n", + "Epoch 19/20 Iteration 3324/3560 Training loss: 1.6768 0.0685 sec/batch\n", + "Epoch 19/20 Iteration 3325/3560 Training loss: 1.6768 0.0723 sec/batch\n", + "Epoch 19/20 Iteration 3326/3560 Training loss: 1.6765 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3327/3560 Training loss: 1.6762 0.0694 sec/batch\n", + "Epoch 19/20 Iteration 3328/3560 Training loss: 1.6763 0.0761 sec/batch\n", + "Epoch 19/20 Iteration 3329/3560 Training loss: 1.6763 0.0693 sec/batch\n", + "Epoch 19/20 Iteration 3330/3560 Training loss: 1.6760 0.0661 sec/batch\n", + "Epoch 19/20 Iteration 3331/3560 Training loss: 1.6760 0.0690 sec/batch\n", + "Epoch 19/20 Iteration 3332/3560 Training loss: 1.6761 0.0701 sec/batch\n", + "Epoch 19/20 Iteration 3333/3560 Training loss: 1.6759 0.0719 sec/batch\n", + "Epoch 19/20 Iteration 3334/3560 Training loss: 1.6757 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3335/3560 Training loss: 1.6753 0.0723 sec/batch\n", + "Epoch 19/20 Iteration 3336/3560 Training loss: 
1.6751 0.0640 sec/batch\n", + "Epoch 19/20 Iteration 3337/3560 Training loss: 1.6752 0.0657 sec/batch\n", + "Epoch 19/20 Iteration 3338/3560 Training loss: 1.6752 0.0729 sec/batch\n", + "Epoch 19/20 Iteration 3339/3560 Training loss: 1.6751 0.0739 sec/batch\n", + "Epoch 19/20 Iteration 3340/3560 Training loss: 1.6753 0.0665 sec/batch\n", + "Epoch 19/20 Iteration 3341/3560 Training loss: 1.6754 0.0661 sec/batch\n", + "Epoch 19/20 Iteration 3342/3560 Training loss: 1.6755 0.0666 sec/batch\n", + "Epoch 19/20 Iteration 3343/3560 Training loss: 1.6755 0.0696 sec/batch\n", + "Epoch 19/20 Iteration 3344/3560 Training loss: 1.6754 0.0692 sec/batch\n", + "Epoch 19/20 Iteration 3345/3560 Training loss: 1.6756 0.0677 sec/batch\n", + "Epoch 19/20 Iteration 3346/3560 Training loss: 1.6755 0.0655 sec/batch\n", + "Epoch 19/20 Iteration 3347/3560 Training loss: 1.6754 0.0699 sec/batch\n", + "Epoch 19/20 Iteration 3348/3560 Training loss: 1.6755 0.0637 sec/batch\n", + "Epoch 19/20 Iteration 3349/3560 Training loss: 1.6753 0.0651 sec/batch\n", + "Epoch 19/20 Iteration 3350/3560 Training loss: 1.6754 0.0663 sec/batch\n", + "Epoch 19/20 Iteration 3351/3560 Training loss: 1.6755 0.0733 sec/batch\n", + "Epoch 19/20 Iteration 3352/3560 Training loss: 1.6757 0.0720 sec/batch\n", + "Epoch 19/20 Iteration 3353/3560 Training loss: 1.6757 0.0758 sec/batch\n", + "Epoch 19/20 Iteration 3354/3560 Training loss: 1.6757 0.0643 sec/batch\n", + "Epoch 19/20 Iteration 3355/3560 Training loss: 1.6754 0.0679 sec/batch\n", + "Epoch 19/20 Iteration 3356/3560 Training loss: 1.6755 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3357/3560 Training loss: 1.6755 0.0686 sec/batch\n", + "Epoch 19/20 Iteration 3358/3560 Training loss: 1.6756 0.0644 sec/batch\n", + "Epoch 19/20 Iteration 3359/3560 Training loss: 1.6757 0.0666 sec/batch\n", + "Epoch 19/20 Iteration 3360/3560 Training loss: 1.6757 0.0682 sec/batch\n", + "Epoch 19/20 Iteration 3361/3560 Training loss: 1.6758 0.0694 sec/batch\n", + "Epoch 19/20 
Iteration 3362/3560 Training loss: 1.6757 0.0780 sec/batch\n", + "Epoch 19/20 Iteration 3363/3560 Training loss: 1.6755 0.0662 sec/batch\n", + "Epoch 19/20 Iteration 3364/3560 Training loss: 1.6757 0.0698 sec/batch\n", + "Epoch 19/20 Iteration 3365/3560 Training loss: 1.6760 0.0682 sec/batch\n", + "Epoch 19/20 Iteration 3366/3560 Training loss: 1.6759 0.0662 sec/batch\n", + "Epoch 19/20 Iteration 3367/3560 Training loss: 1.6760 0.0698 sec/batch\n", + "Epoch 19/20 Iteration 3368/3560 Training loss: 1.6761 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3369/3560 Training loss: 1.6761 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3370/3560 Training loss: 1.6760 0.0662 sec/batch\n", + "Epoch 19/20 Iteration 3371/3560 Training loss: 1.6761 0.0734 sec/batch\n", + "Epoch 19/20 Iteration 3372/3560 Training loss: 1.6766 0.0702 sec/batch\n", + "Epoch 19/20 Iteration 3373/3560 Training loss: 1.6766 0.0675 sec/batch\n", + "Epoch 19/20 Iteration 3374/3560 Training loss: 1.6765 0.0706 sec/batch\n", + "Epoch 19/20 Iteration 3375/3560 Training loss: 1.6764 0.0681 sec/batch\n", + "Epoch 19/20 Iteration 3376/3560 Training loss: 1.6763 0.0657 sec/batch\n", + "Epoch 19/20 Iteration 3377/3560 Training loss: 1.6764 0.0716 sec/batch\n", + "Epoch 19/20 Iteration 3378/3560 Training loss: 1.6764 0.0649 sec/batch\n", + "Epoch 19/20 Iteration 3379/3560 Training loss: 1.6765 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3380/3560 Training loss: 1.6764 0.0697 sec/batch\n", + "Epoch 19/20 Iteration 3381/3560 Training loss: 1.6763 0.0634 sec/batch\n", + "Epoch 19/20 Iteration 3382/3560 Training loss: 1.6763 0.0684 sec/batch\n", + "Epoch 20/20 Iteration 3383/3560 Training loss: 1.7433 0.0693 sec/batch\n", + "Epoch 20/20 Iteration 3384/3560 Training loss: 1.7051 0.0762 sec/batch\n", + "Epoch 20/20 Iteration 3385/3560 Training loss: 1.6973 0.0664 sec/batch\n", + "Epoch 20/20 Iteration 3386/3560 Training loss: 1.6923 0.0669 sec/batch\n", + "Epoch 20/20 Iteration 3387/3560 Training loss: 1.6866 0.0644 
sec/batch\n", + "Epoch 20/20 Iteration 3388/3560 Training loss: 1.6795 0.0663 sec/batch\n", + "Epoch 20/20 Iteration 3389/3560 Training loss: 1.6785 0.0698 sec/batch\n", + "Epoch 20/20 Iteration 3390/3560 Training loss: 1.6770 0.0644 sec/batch\n", + "Epoch 20/20 Iteration 3391/3560 Training loss: 1.6792 0.0729 sec/batch\n", + "Epoch 20/20 Iteration 3392/3560 Training loss: 1.6784 0.0671 sec/batch\n", + "Epoch 20/20 Iteration 3393/3560 Training loss: 1.6749 0.0716 sec/batch\n", + "Epoch 20/20 Iteration 3394/3560 Training loss: 1.6732 0.0750 sec/batch\n", + "Epoch 20/20 Iteration 3395/3560 Training loss: 1.6726 0.0733 sec/batch\n", + "Epoch 20/20 Iteration 3396/3560 Training loss: 1.6748 0.0649 sec/batch\n", + "Epoch 20/20 Iteration 3397/3560 Training loss: 1.6740 0.0693 sec/batch\n", + "Epoch 20/20 Iteration 3398/3560 Training loss: 1.6724 0.0647 sec/batch\n", + "Epoch 20/20 Iteration 3399/3560 Training loss: 1.6730 0.0686 sec/batch\n", + "Epoch 20/20 Iteration 3400/3560 Training loss: 1.6749 0.0650 sec/batch\n", + "Epoch 20/20 Iteration 3401/3560 Training loss: 1.6752 0.0663 sec/batch\n", + "Epoch 20/20 Iteration 3402/3560 Training loss: 1.6758 0.0680 sec/batch\n", + "Epoch 20/20 Iteration 3403/3560 Training loss: 1.6754 0.0718 sec/batch\n", + "Epoch 20/20 Iteration 3404/3560 Training loss: 1.6758 0.0688 sec/batch\n", + "Epoch 20/20 Iteration 3405/3560 Training loss: 1.6755 0.0693 sec/batch\n", + "Epoch 20/20 Iteration 3406/3560 Training loss: 1.6753 0.0745 sec/batch\n", + "Epoch 20/20 Iteration 3407/3560 Training loss: 1.6747 0.0683 sec/batch\n", + "Epoch 20/20 Iteration 3408/3560 Training loss: 1.6739 0.0658 sec/batch\n", + "Epoch 20/20 Iteration 3409/3560 Training loss: 1.6731 0.0649 sec/batch\n", + "Epoch 20/20 Iteration 3410/3560 Training loss: 1.6736 0.0719 sec/batch\n", + "Epoch 20/20 Iteration 3411/3560 Training loss: 1.6741 0.0630 sec/batch\n", + "Epoch 20/20 Iteration 3412/3560 Training loss: 1.6743 0.0714 sec/batch\n", + "Epoch 20/20 Iteration 3413/3560 
Training loss: 1.6735 0.0716 sec/batch\n", + "Epoch 20/20 Iteration 3414/3560 Training loss: 1.6728 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3415/3560 Training loss: 1.6734 0.0649 sec/batch\n", + "Epoch 20/20 Iteration 3416/3560 Training loss: 1.6738 0.0634 sec/batch\n", + "Epoch 20/20 Iteration 3417/3560 Training loss: 1.6738 0.0741 sec/batch\n", + "Epoch 20/20 Iteration 3418/3560 Training loss: 1.6734 0.0688 sec/batch\n", + "Epoch 20/20 Iteration 3419/3560 Training loss: 1.6730 0.0688 sec/batch\n", + "Epoch 20/20 Iteration 3420/3560 Training loss: 1.6721 0.0763 sec/batch\n", + "Epoch 20/20 Iteration 3421/3560 Training loss: 1.6708 0.0698 sec/batch\n", + "Epoch 20/20 Iteration 3422/3560 Training loss: 1.6702 0.0695 sec/batch\n", + "Epoch 20/20 Iteration 3423/3560 Training loss: 1.6694 0.0665 sec/batch\n", + "Epoch 20/20 Iteration 3424/3560 Training loss: 1.6697 0.0711 sec/batch\n", + "Epoch 20/20 Iteration 3425/3560 Training loss: 1.6692 0.0664 sec/batch\n", + "Epoch 20/20 Iteration 3426/3560 Training loss: 1.6684 0.0659 sec/batch\n", + "Epoch 20/20 Iteration 3427/3560 Training loss: 1.6685 0.0669 sec/batch\n", + "Epoch 20/20 Iteration 3428/3560 Training loss: 1.6674 0.0654 sec/batch\n", + "Epoch 20/20 Iteration 3429/3560 Training loss: 1.6674 0.0678 sec/batch\n", + "Epoch 20/20 Iteration 3430/3560 Training loss: 1.6672 0.0692 sec/batch\n", + "Epoch 20/20 Iteration 3431/3560 Training loss: 1.6671 0.0651 sec/batch\n", + "Epoch 20/20 Iteration 3432/3560 Training loss: 1.6678 0.0659 sec/batch\n", + "Epoch 20/20 Iteration 3433/3560 Training loss: 1.6672 0.0686 sec/batch\n", + "Epoch 20/20 Iteration 3434/3560 Training loss: 1.6680 0.0637 sec/batch\n", + "Epoch 20/20 Iteration 3435/3560 Training loss: 1.6678 0.0643 sec/batch\n", + "Epoch 20/20 Iteration 3436/3560 Training loss: 1.6680 0.0659 sec/batch\n", + "Epoch 20/20 Iteration 3437/3560 Training loss: 1.6676 0.0664 sec/batch\n", + "Epoch 20/20 Iteration 3438/3560 Training loss: 1.6679 0.0649 sec/batch\n", + 
"Epoch 20/20 Iteration 3439/3560 Training loss: 1.6684 0.0780 sec/batch\n", + "Epoch 20/20 Iteration 3440/3560 Training loss: 1.6681 0.0646 sec/batch\n", + "Epoch 20/20 Iteration 3441/3560 Training loss: 1.6676 0.0679 sec/batch\n", + "Epoch 20/20 Iteration 3442/3560 Training loss: 1.6682 0.0689 sec/batch\n", + "Epoch 20/20 Iteration 3443/3560 Training loss: 1.6681 0.0645 sec/batch\n", + "Epoch 20/20 Iteration 3444/3560 Training loss: 1.6692 0.0656 sec/batch\n", + "Epoch 20/20 Iteration 3445/3560 Training loss: 1.6697 0.0643 sec/batch\n", + "Epoch 20/20 Iteration 3446/3560 Training loss: 1.6700 0.0667 sec/batch\n", + "Epoch 20/20 Iteration 3447/3560 Training loss: 1.6697 0.0657 sec/batch\n", + "Epoch 20/20 Iteration 3448/3560 Training loss: 1.6699 0.0679 sec/batch\n", + "Epoch 20/20 Iteration 3449/3560 Training loss: 1.6702 0.0672 sec/batch\n", + "Epoch 20/20 Iteration 3450/3560 Training loss: 1.6699 0.0668 sec/batch\n", + "Epoch 20/20 Iteration 3451/3560 Training loss: 1.6697 0.0694 sec/batch\n", + "Epoch 20/20 Iteration 3452/3560 Training loss: 1.6696 0.0659 sec/batch\n", + "Epoch 20/20 Iteration 3453/3560 Training loss: 1.6701 0.0686 sec/batch\n", + "Epoch 20/20 Iteration 3454/3560 Training loss: 1.6702 0.0715 sec/batch\n", + "Epoch 20/20 Iteration 3455/3560 Training loss: 1.6707 0.0652 sec/batch\n", + "Epoch 20/20 Iteration 3456/3560 Training loss: 1.6705 0.0729 sec/batch\n", + "Epoch 20/20 Iteration 3457/3560 Training loss: 1.6704 0.0783 sec/batch\n", + "Epoch 20/20 Iteration 3458/3560 Training loss: 1.6706 0.0700 sec/batch\n", + "Epoch 20/20 Iteration 3459/3560 Training loss: 1.6706 0.0673 sec/batch\n", + "Epoch 20/20 Iteration 3460/3560 Training loss: 1.6707 0.0665 sec/batch\n", + "Epoch 20/20 Iteration 3461/3560 Training loss: 1.6702 0.0862 sec/batch\n", + "Epoch 20/20 Iteration 3462/3560 Training loss: 1.6700 0.0709 sec/batch\n", + "Epoch 20/20 Iteration 3463/3560 Training loss: 1.6694 0.0697 sec/batch\n", + "Epoch 20/20 Iteration 3464/3560 Training loss: 
1.6697 0.0664 sec/batch\n", + "Epoch 20/20 Iteration 3465/3560 Training loss: 1.6693 0.0674 sec/batch\n", + "Epoch 20/20 Iteration 3466/3560 Training loss: 1.6692 0.0663 sec/batch\n", + "Epoch 20/20 Iteration 3467/3560 Training loss: 1.6686 0.0665 sec/batch\n", + "Epoch 20/20 Iteration 3468/3560 Training loss: 1.6684 0.0689 sec/batch\n", + "Epoch 20/20 Iteration 3469/3560 Training loss: 1.6682 0.0678 sec/batch\n", + "Epoch 20/20 Iteration 3470/3560 Training loss: 1.6678 0.0716 sec/batch\n", + "Epoch 20/20 Iteration 3471/3560 Training loss: 1.6674 0.0703 sec/batch\n", + "Epoch 20/20 Iteration 3472/3560 Training loss: 1.6673 0.0690 sec/batch\n", + "Epoch 20/20 Iteration 3473/3560 Training loss: 1.6670 0.0694 sec/batch\n", + "Epoch 20/20 Iteration 3474/3560 Training loss: 1.6669 0.0687 sec/batch\n", + "Epoch 20/20 Iteration 3475/3560 Training loss: 1.6667 0.0649 sec/batch\n", + "Epoch 20/20 Iteration 3476/3560 Training loss: 1.6664 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3477/3560 Training loss: 1.6662 0.0700 sec/batch\n", + "Epoch 20/20 Iteration 3478/3560 Training loss: 1.6663 0.0680 sec/batch\n", + "Epoch 20/20 Iteration 3479/3560 Training loss: 1.6662 0.0746 sec/batch\n", + "Epoch 20/20 Iteration 3480/3560 Training loss: 1.6659 0.0661 sec/batch\n", + "Epoch 20/20 Iteration 3481/3560 Training loss: 1.6657 0.0663 sec/batch\n", + "Epoch 20/20 Iteration 3482/3560 Training loss: 1.6652 0.0664 sec/batch\n", + "Epoch 20/20 Iteration 3483/3560 Training loss: 1.6653 0.0685 sec/batch\n", + "Epoch 20/20 Iteration 3484/3560 Training loss: 1.6651 0.0695 sec/batch\n", + "Epoch 20/20 Iteration 3485/3560 Training loss: 1.6650 0.0719 sec/batch\n", + "Epoch 20/20 Iteration 3486/3560 Training loss: 1.6648 0.0712 sec/batch\n", + "Epoch 20/20 Iteration 3487/3560 Training loss: 1.6646 0.0708 sec/batch\n", + "Epoch 20/20 Iteration 3488/3560 Training loss: 1.6645 0.0697 sec/batch\n", + "Epoch 20/20 Iteration 3489/3560 Training loss: 1.6645 0.0680 sec/batch\n", + "Epoch 20/20 
Iteration 3490/3560 Training loss: 1.6645 0.0687 sec/batch\n", + "Epoch 20/20 Iteration 3491/3560 Training loss: 1.6645 0.0680 sec/batch\n", + "Epoch 20/20 Iteration 3492/3560 Training loss: 1.6646 0.0646 sec/batch\n", + "Epoch 20/20 Iteration 3493/3560 Training loss: 1.6644 0.0673 sec/batch\n", + "Epoch 20/20 Iteration 3494/3560 Training loss: 1.6644 0.0638 sec/batch\n", + "Epoch 20/20 Iteration 3495/3560 Training loss: 1.6643 0.0640 sec/batch\n", + "Epoch 20/20 Iteration 3496/3560 Training loss: 1.6642 0.0665 sec/batch\n", + "Epoch 20/20 Iteration 3497/3560 Training loss: 1.6640 0.0746 sec/batch\n", + "Epoch 20/20 Iteration 3498/3560 Training loss: 1.6636 0.0671 sec/batch\n", + "Epoch 20/20 Iteration 3499/3560 Training loss: 1.6637 0.0755 sec/batch\n", + "Epoch 20/20 Iteration 3500/3560 Training loss: 1.6636 0.0655 sec/batch\n", + "Epoch 20/20 Iteration 3501/3560 Training loss: 1.6637 0.0678 sec/batch\n", + "Epoch 20/20 Iteration 3502/3560 Training loss: 1.6637 0.0645 sec/batch\n", + "Epoch 20/20 Iteration 3503/3560 Training loss: 1.6636 0.0694 sec/batch\n", + "Epoch 20/20 Iteration 3504/3560 Training loss: 1.6634 0.0650 sec/batch\n", + "Epoch 20/20 Iteration 3505/3560 Training loss: 1.6630 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3506/3560 Training loss: 1.6630 0.0742 sec/batch\n", + "Epoch 20/20 Iteration 3507/3560 Training loss: 1.6629 0.0669 sec/batch\n", + "Epoch 20/20 Iteration 3508/3560 Training loss: 1.6626 0.0690 sec/batch\n", + "Epoch 20/20 Iteration 3509/3560 Training loss: 1.6627 0.0710 sec/batch\n", + "Epoch 20/20 Iteration 3510/3560 Training loss: 1.6629 0.0646 sec/batch\n", + "Epoch 20/20 Iteration 3511/3560 Training loss: 1.6627 0.0693 sec/batch\n", + "Epoch 20/20 Iteration 3512/3560 Training loss: 1.6626 0.0667 sec/batch\n", + "Epoch 20/20 Iteration 3513/3560 Training loss: 1.6622 0.0793 sec/batch\n", + "Epoch 20/20 Iteration 3514/3560 Training loss: 1.6620 0.0659 sec/batch\n", + "Epoch 20/20 Iteration 3515/3560 Training loss: 1.6621 0.0711 
sec/batch\n", + "Epoch 20/20 Iteration 3516/3560 Training loss: 1.6621 0.0705 sec/batch\n", + "[... per-iteration training log condensed: the first run ends at Epoch 20/20 Iteration 3560/3560 with training loss 1.6634 (~0.065 sec/batch); a second run's output is then appended, starting at Epoch 1/20 Iteration 1/3560 with training loss 4.4215 and falling steadily to about 2.51 by Epoch 3/20 ...]\n", + "Epoch 3/20 Iteration 455/3560 
Training loss: 2.5135 0.0672 sec/batch\n", + "Epoch 3/20 Iteration 456/3560 Training loss: 2.5130 0.0636 sec/batch\n", + "Epoch 3/20 Iteration 457/3560 Training loss: 2.5127 0.0616 sec/batch\n", + "Epoch 3/20 Iteration 458/3560 Training loss: 2.5123 0.0686 sec/batch\n", + "Epoch 3/20 Iteration 459/3560 Training loss: 2.5117 0.0649 sec/batch\n", + "Epoch 3/20 Iteration 460/3560 Training loss: 2.5112 0.0666 sec/batch\n", + "Epoch 3/20 Iteration 461/3560 Training loss: 2.5107 0.0650 sec/batch\n", + "Epoch 3/20 Iteration 462/3560 Training loss: 2.5103 0.0628 sec/batch\n", + "Epoch 3/20 Iteration 463/3560 Training loss: 2.5097 0.0661 sec/batch\n", + "Epoch 3/20 Iteration 464/3560 Training loss: 2.5095 0.0647 sec/batch\n", + "Epoch 3/20 Iteration 465/3560 Training loss: 2.5092 0.0650 sec/batch\n", + "Epoch 3/20 Iteration 466/3560 Training loss: 2.5086 0.0753 sec/batch\n", + "Epoch 3/20 Iteration 467/3560 Training loss: 2.5082 0.0635 sec/batch\n", + "Epoch 3/20 Iteration 468/3560 Training loss: 2.5079 0.0646 sec/batch\n", + "Epoch 3/20 Iteration 469/3560 Training loss: 2.5074 0.0633 sec/batch\n", + "Epoch 3/20 Iteration 470/3560 Training loss: 2.5068 0.0700 sec/batch\n", + "Epoch 3/20 Iteration 471/3560 Training loss: 2.5064 0.0631 sec/batch\n", + "Epoch 3/20 Iteration 472/3560 Training loss: 2.5057 0.0632 sec/batch\n", + "Epoch 3/20 Iteration 473/3560 Training loss: 2.5052 0.0644 sec/batch\n", + "Epoch 3/20 Iteration 474/3560 Training loss: 2.5048 0.0654 sec/batch\n", + "Epoch 3/20 Iteration 475/3560 Training loss: 2.5047 0.0612 sec/batch\n", + "Epoch 3/20 Iteration 476/3560 Training loss: 2.5044 0.0648 sec/batch\n", + "Epoch 3/20 Iteration 477/3560 Training loss: 2.5042 0.0661 sec/batch\n", + "Epoch 3/20 Iteration 478/3560 Training loss: 2.5039 0.0689 sec/batch\n", + "Epoch 3/20 Iteration 479/3560 Training loss: 2.5034 0.0683 sec/batch\n", + "Epoch 3/20 Iteration 480/3560 Training loss: 2.5032 0.0668 sec/batch\n", + "Epoch 3/20 Iteration 481/3560 Training loss: 2.5027 
0.0665 sec/batch\n", + "Epoch 3/20 Iteration 482/3560 Training loss: 2.5021 0.0657 sec/batch\n", + "Epoch 3/20 Iteration 483/3560 Training loss: 2.5018 0.0656 sec/batch\n", + "Epoch 3/20 Iteration 484/3560 Training loss: 2.5016 0.0646 sec/batch\n", + "Epoch 3/20 Iteration 485/3560 Training loss: 2.5012 0.0705 sec/batch\n", + "Epoch 3/20 Iteration 486/3560 Training loss: 2.5009 0.0641 sec/batch\n", + "Epoch 3/20 Iteration 487/3560 Training loss: 2.5005 0.0646 sec/batch\n", + "Epoch 3/20 Iteration 488/3560 Training loss: 2.5001 0.0729 sec/batch\n", + "Epoch 3/20 Iteration 489/3560 Training loss: 2.4998 0.0698 sec/batch\n", + "Epoch 3/20 Iteration 490/3560 Training loss: 2.4995 0.0644 sec/batch\n", + "Epoch 3/20 Iteration 491/3560 Training loss: 2.4990 0.0703 sec/batch\n", + "Epoch 3/20 Iteration 492/3560 Training loss: 2.4986 0.0639 sec/batch\n", + "Epoch 3/20 Iteration 493/3560 Training loss: 2.4983 0.0636 sec/batch\n", + "Epoch 3/20 Iteration 494/3560 Training loss: 2.4980 0.0685 sec/batch\n", + "Epoch 3/20 Iteration 495/3560 Training loss: 2.4978 0.0662 sec/batch\n", + "Epoch 3/20 Iteration 496/3560 Training loss: 2.4974 0.0629 sec/batch\n", + "Epoch 3/20 Iteration 497/3560 Training loss: 2.4972 0.0680 sec/batch\n", + "Epoch 3/20 Iteration 498/3560 Training loss: 2.4967 0.0654 sec/batch\n", + "Epoch 3/20 Iteration 499/3560 Training loss: 2.4964 0.0682 sec/batch\n", + "Epoch 3/20 Iteration 500/3560 Training loss: 2.4959 0.0604 sec/batch\n", + "Epoch 3/20 Iteration 501/3560 Training loss: 2.4955 0.0631 sec/batch\n", + "Epoch 3/20 Iteration 502/3560 Training loss: 2.4953 0.0610 sec/batch\n", + "Epoch 3/20 Iteration 503/3560 Training loss: 2.4949 0.0631 sec/batch\n", + "Epoch 3/20 Iteration 504/3560 Training loss: 2.4948 0.0631 sec/batch\n", + "Epoch 3/20 Iteration 505/3560 Training loss: 2.4943 0.0623 sec/batch\n", + "Epoch 3/20 Iteration 506/3560 Training loss: 2.4939 0.0722 sec/batch\n", + "Epoch 3/20 Iteration 507/3560 Training loss: 2.4938 0.0670 sec/batch\n", + 
"Epoch 3/20 Iteration 508/3560 Training loss: 2.4937 0.0631 sec/batch\n", + "Epoch 3/20 Iteration 509/3560 Training loss: 2.4934 0.0664 sec/batch\n", + "Epoch 3/20 Iteration 510/3560 Training loss: 2.4932 0.0625 sec/batch\n", + "Epoch 3/20 Iteration 511/3560 Training loss: 2.4929 0.0625 sec/batch\n", + "Epoch 3/20 Iteration 512/3560 Training loss: 2.4924 0.0718 sec/batch\n", + "Epoch 3/20 Iteration 513/3560 Training loss: 2.4921 0.0716 sec/batch\n", + "Epoch 3/20 Iteration 514/3560 Training loss: 2.4916 0.0716 sec/batch\n", + "Epoch 3/20 Iteration 515/3560 Training loss: 2.4911 0.0708 sec/batch\n", + "Epoch 3/20 Iteration 516/3560 Training loss: 2.4909 0.0692 sec/batch\n", + "Epoch 3/20 Iteration 517/3560 Training loss: 2.4906 0.0607 sec/batch\n", + "Epoch 3/20 Iteration 518/3560 Training loss: 2.4901 0.0675 sec/batch\n", + "Epoch 3/20 Iteration 519/3560 Training loss: 2.4897 0.0627 sec/batch\n", + "Epoch 3/20 Iteration 520/3560 Training loss: 2.4894 0.0627 sec/batch\n", + "Epoch 3/20 Iteration 521/3560 Training loss: 2.4891 0.0743 sec/batch\n", + "Epoch 3/20 Iteration 522/3560 Training loss: 2.4888 0.0630 sec/batch\n", + "Epoch 3/20 Iteration 523/3560 Training loss: 2.4885 0.0722 sec/batch\n", + "Epoch 3/20 Iteration 524/3560 Training loss: 2.4882 0.0632 sec/batch\n", + "Epoch 3/20 Iteration 525/3560 Training loss: 2.4879 0.0618 sec/batch\n", + "Epoch 3/20 Iteration 526/3560 Training loss: 2.4874 0.0625 sec/batch\n", + "Epoch 3/20 Iteration 527/3560 Training loss: 2.4871 0.0719 sec/batch\n", + "Epoch 3/20 Iteration 528/3560 Training loss: 2.4870 0.0672 sec/batch\n", + "Epoch 3/20 Iteration 529/3560 Training loss: 2.4869 0.0651 sec/batch\n", + "Epoch 3/20 Iteration 530/3560 Training loss: 2.4869 0.0669 sec/batch\n", + "Epoch 3/20 Iteration 531/3560 Training loss: 2.4868 0.0662 sec/batch\n", + "Epoch 3/20 Iteration 532/3560 Training loss: 2.4865 0.0646 sec/batch\n", + "Epoch 3/20 Iteration 533/3560 Training loss: 2.4861 0.0634 sec/batch\n", + "Epoch 3/20 Iteration 
534/3560 Training loss: 2.4857 0.0678 sec/batch\n", + "Epoch 4/20 Iteration 535/3560 Training loss: 2.4905 0.0674 sec/batch\n", + "Epoch 4/20 Iteration 536/3560 Training loss: 2.4414 0.0633 sec/batch\n", + "Epoch 4/20 Iteration 537/3560 Training loss: 2.4292 0.0617 sec/batch\n", + "Epoch 4/20 Iteration 538/3560 Training loss: 2.4248 0.0670 sec/batch\n", + "Epoch 4/20 Iteration 539/3560 Training loss: 2.4226 0.0632 sec/batch\n", + "Epoch 4/20 Iteration 540/3560 Training loss: 2.4218 0.0631 sec/batch\n", + "Epoch 4/20 Iteration 541/3560 Training loss: 2.4219 0.0635 sec/batch\n", + "Epoch 4/20 Iteration 542/3560 Training loss: 2.4235 0.0753 sec/batch\n", + "Epoch 4/20 Iteration 543/3560 Training loss: 2.4250 0.0695 sec/batch\n", + "Epoch 4/20 Iteration 544/3560 Training loss: 2.4251 0.0629 sec/batch\n", + "Epoch 4/20 Iteration 545/3560 Training loss: 2.4232 0.0646 sec/batch\n", + "Epoch 4/20 Iteration 546/3560 Training loss: 2.4227 0.0795 sec/batch\n", + "Epoch 4/20 Iteration 547/3560 Training loss: 2.4225 0.0672 sec/batch\n", + "Epoch 4/20 Iteration 548/3560 Training loss: 2.4244 0.0632 sec/batch\n", + "Epoch 4/20 Iteration 549/3560 Training loss: 2.4238 0.0676 sec/batch\n", + "Epoch 4/20 Iteration 550/3560 Training loss: 2.4234 0.0679 sec/batch\n", + "Epoch 4/20 Iteration 551/3560 Training loss: 2.4229 0.0686 sec/batch\n", + "Epoch 4/20 Iteration 552/3560 Training loss: 2.4248 0.0672 sec/batch\n", + "Epoch 4/20 Iteration 553/3560 Training loss: 2.4246 0.0688 sec/batch\n", + "Epoch 4/20 Iteration 554/3560 Training loss: 2.4230 0.0674 sec/batch\n", + "Epoch 4/20 Iteration 555/3560 Training loss: 2.4223 0.0606 sec/batch\n", + "Epoch 4/20 Iteration 556/3560 Training loss: 2.4232 0.0647 sec/batch\n", + "Epoch 4/20 Iteration 557/3560 Training loss: 2.4228 0.0655 sec/batch\n", + "Epoch 4/20 Iteration 558/3560 Training loss: 2.4221 0.0617 sec/batch\n", + "Epoch 4/20 Iteration 559/3560 Training loss: 2.4216 0.0654 sec/batch\n", + "Epoch 4/20 Iteration 560/3560 Training loss: 
2.4213 0.0647 sec/batch\n", + "Epoch 4/20 Iteration 561/3560 Training loss: 2.4205 0.0762 sec/batch\n", + "Epoch 4/20 Iteration 562/3560 Training loss: 2.4202 0.0672 sec/batch\n", + "Epoch 4/20 Iteration 563/3560 Training loss: 2.4203 0.0639 sec/batch\n", + "Epoch 4/20 Iteration 564/3560 Training loss: 2.4198 0.0654 sec/batch\n", + "Epoch 4/20 Iteration 565/3560 Training loss: 2.4202 0.0630 sec/batch\n", + "Epoch 4/20 Iteration 566/3560 Training loss: 2.4193 0.0657 sec/batch\n", + "Epoch 4/20 Iteration 567/3560 Training loss: 2.4189 0.0663 sec/batch\n", + "Epoch 4/20 Iteration 568/3560 Training loss: 2.4190 0.0619 sec/batch\n", + "Epoch 4/20 Iteration 569/3560 Training loss: 2.4183 0.0641 sec/batch\n", + "Epoch 4/20 Iteration 570/3560 Training loss: 2.4182 0.0726 sec/batch\n", + "Epoch 4/20 Iteration 571/3560 Training loss: 2.4175 0.0639 sec/batch\n", + "Epoch 4/20 Iteration 572/3560 Training loss: 2.4163 0.0683 sec/batch\n", + "Epoch 4/20 Iteration 573/3560 Training loss: 2.4154 0.0670 sec/batch\n", + "Epoch 4/20 Iteration 574/3560 Training loss: 2.4146 0.0704 sec/batch\n", + "Epoch 4/20 Iteration 575/3560 Training loss: 2.4138 0.0674 sec/batch\n", + "Epoch 4/20 Iteration 576/3560 Training loss: 2.4134 0.0676 sec/batch\n", + "Epoch 4/20 Iteration 577/3560 Training loss: 2.4127 0.0658 sec/batch\n", + "Epoch 4/20 Iteration 578/3560 Training loss: 2.4120 0.0676 sec/batch\n", + "Epoch 4/20 Iteration 579/3560 Training loss: 2.4114 0.0683 sec/batch\n", + "Epoch 4/20 Iteration 580/3560 Training loss: 2.4101 0.0613 sec/batch\n", + "Epoch 4/20 Iteration 581/3560 Training loss: 2.4100 0.0646 sec/batch\n", + "Epoch 4/20 Iteration 582/3560 Training loss: 2.4095 0.0665 sec/batch\n", + "Epoch 4/20 Iteration 583/3560 Training loss: 2.4091 0.0665 sec/batch\n", + "Epoch 4/20 Iteration 584/3560 Training loss: 2.4092 0.0666 sec/batch\n", + "Epoch 4/20 Iteration 585/3560 Training loss: 2.4086 0.0647 sec/batch\n", + "Epoch 4/20 Iteration 586/3560 Training loss: 2.4084 0.0672 
sec/batch\n", + "Epoch 4/20 Iteration 587/3560 Training loss: 2.4080 0.0667 sec/batch\n", + "Epoch 4/20 Iteration 588/3560 Training loss: 2.4077 0.0638 sec/batch\n", + "Epoch 4/20 Iteration 589/3560 Training loss: 2.4073 0.0652 sec/batch\n", + "Epoch 4/20 Iteration 590/3560 Training loss: 2.4071 0.0658 sec/batch\n", + "Epoch 4/20 Iteration 591/3560 Training loss: 2.4069 0.0617 sec/batch\n", + "Epoch 4/20 Iteration 592/3560 Training loss: 2.4063 0.0623 sec/batch\n", + "Epoch 4/20 Iteration 593/3560 Training loss: 2.4058 0.0629 sec/batch\n", + "Epoch 4/20 Iteration 594/3560 Training loss: 2.4057 0.0707 sec/batch\n", + "Epoch 4/20 Iteration 595/3560 Training loss: 2.4052 0.0659 sec/batch\n", + "Epoch 4/20 Iteration 596/3560 Training loss: 2.4052 0.0632 sec/batch\n", + "Epoch 4/20 Iteration 597/3560 Training loss: 2.4054 0.0664 sec/batch\n", + "Epoch 4/20 Iteration 598/3560 Training loss: 2.4049 0.0634 sec/batch\n", + "Epoch 4/20 Iteration 599/3560 Training loss: 2.4043 0.0642 sec/batch\n", + "Epoch 4/20 Iteration 600/3560 Training loss: 2.4043 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 601/3560 Training loss: 2.4041 0.0687 sec/batch\n", + "Epoch 4/20 Iteration 602/3560 Training loss: 2.4033 0.0718 sec/batch\n", + "Epoch 4/20 Iteration 603/3560 Training loss: 2.4026 0.0625 sec/batch\n", + "Epoch 4/20 Iteration 604/3560 Training loss: 2.4026 0.0802 sec/batch\n", + "Epoch 4/20 Iteration 605/3560 Training loss: 2.4025 0.0619 sec/batch\n", + "Epoch 4/20 Iteration 606/3560 Training loss: 2.4024 0.0626 sec/batch\n", + "Epoch 4/20 Iteration 607/3560 Training loss: 2.4023 0.0643 sec/batch\n", + "Epoch 4/20 Iteration 608/3560 Training loss: 2.4018 0.0670 sec/batch\n", + "Epoch 4/20 Iteration 609/3560 Training loss: 2.4014 0.0670 sec/batch\n", + "Epoch 4/20 Iteration 610/3560 Training loss: 2.4016 0.0732 sec/batch\n", + "Epoch 4/20 Iteration 611/3560 Training loss: 2.4012 0.0660 sec/batch\n", + "Epoch 4/20 Iteration 612/3560 Training loss: 2.4012 0.0634 sec/batch\n", + "Epoch 
4/20 Iteration 613/3560 Training loss: 2.4008 0.0737 sec/batch\n", + "Epoch 4/20 Iteration 614/3560 Training loss: 2.4004 0.0665 sec/batch\n", + "Epoch 4/20 Iteration 615/3560 Training loss: 2.4002 0.0814 sec/batch\n", + "Epoch 4/20 Iteration 616/3560 Training loss: 2.4001 0.0631 sec/batch\n", + "Epoch 4/20 Iteration 617/3560 Training loss: 2.3996 0.0672 sec/batch\n", + "Epoch 4/20 Iteration 618/3560 Training loss: 2.3992 0.0672 sec/batch\n", + "Epoch 4/20 Iteration 619/3560 Training loss: 2.3985 0.0633 sec/batch\n", + "Epoch 4/20 Iteration 620/3560 Training loss: 2.3981 0.0638 sec/batch\n", + "Epoch 4/20 Iteration 621/3560 Training loss: 2.3977 0.0630 sec/batch\n", + "Epoch 4/20 Iteration 622/3560 Training loss: 2.3973 0.0641 sec/batch\n", + "Epoch 4/20 Iteration 623/3560 Training loss: 2.3969 0.0712 sec/batch\n", + "Epoch 4/20 Iteration 624/3560 Training loss: 2.3968 0.0745 sec/batch\n", + "Epoch 4/20 Iteration 625/3560 Training loss: 2.3965 0.0707 sec/batch\n", + "Epoch 4/20 Iteration 626/3560 Training loss: 2.3962 0.0743 sec/batch\n", + "Epoch 4/20 Iteration 627/3560 Training loss: 2.3959 0.0647 sec/batch\n", + "Epoch 4/20 Iteration 628/3560 Training loss: 2.3954 0.0633 sec/batch\n", + "Epoch 4/20 Iteration 629/3560 Training loss: 2.3949 0.0636 sec/batch\n", + "Epoch 4/20 Iteration 630/3560 Training loss: 2.3945 0.0671 sec/batch\n", + "Epoch 4/20 Iteration 631/3560 Training loss: 2.3943 0.0653 sec/batch\n", + "Epoch 4/20 Iteration 632/3560 Training loss: 2.3940 0.0628 sec/batch\n", + "Epoch 4/20 Iteration 633/3560 Training loss: 2.3936 0.0640 sec/batch\n", + "Epoch 4/20 Iteration 634/3560 Training loss: 2.3932 0.0633 sec/batch\n", + "Epoch 4/20 Iteration 635/3560 Training loss: 2.3932 0.0660 sec/batch\n", + "Epoch 4/20 Iteration 636/3560 Training loss: 2.3929 0.0613 sec/batch\n", + "Epoch 4/20 Iteration 637/3560 Training loss: 2.3924 0.0737 sec/batch\n", + "Epoch 4/20 Iteration 638/3560 Training loss: 2.3921 0.0645 sec/batch\n", + "Epoch 4/20 Iteration 639/3560 
Training loss: 2.3918 0.0649 sec/batch\n", + "Epoch 4/20 Iteration 640/3560 Training loss: 2.3915 0.0736 sec/batch\n", + "Epoch 4/20 Iteration 641/3560 Training loss: 2.3911 0.0634 sec/batch\n", + "Epoch 4/20 Iteration 642/3560 Training loss: 2.3909 0.0720 sec/batch\n", + "Epoch 4/20 Iteration 643/3560 Training loss: 2.3907 0.0627 sec/batch\n", + "Epoch 4/20 Iteration 644/3560 Training loss: 2.3903 0.0639 sec/batch\n", + "Epoch 4/20 Iteration 645/3560 Training loss: 2.3900 0.0619 sec/batch\n", + "Epoch 4/20 Iteration 646/3560 Training loss: 2.3899 0.0646 sec/batch\n", + "Epoch 4/20 Iteration 647/3560 Training loss: 2.3897 0.0755 sec/batch\n", + "Epoch 4/20 Iteration 648/3560 Training loss: 2.3893 0.0615 sec/batch\n", + "Epoch 4/20 Iteration 649/3560 Training loss: 2.3890 0.0642 sec/batch\n", + "Epoch 4/20 Iteration 650/3560 Training loss: 2.3884 0.0660 sec/batch\n", + "Epoch 4/20 Iteration 651/3560 Training loss: 2.3882 0.0647 sec/batch\n", + "Epoch 4/20 Iteration 652/3560 Training loss: 2.3880 0.0619 sec/batch\n", + "Epoch 4/20 Iteration 653/3560 Training loss: 2.3882 0.0622 sec/batch\n", + "Epoch 4/20 Iteration 654/3560 Training loss: 2.3879 0.0653 sec/batch\n", + "Epoch 4/20 Iteration 655/3560 Training loss: 2.3879 0.0661 sec/batch\n", + "Epoch 4/20 Iteration 656/3560 Training loss: 2.3877 0.0711 sec/batch\n", + "Epoch 4/20 Iteration 657/3560 Training loss: 2.3874 0.0675 sec/batch\n", + "Epoch 4/20 Iteration 658/3560 Training loss: 2.3873 0.0671 sec/batch\n", + "Epoch 4/20 Iteration 659/3560 Training loss: 2.3870 0.0680 sec/batch\n", + "Epoch 4/20 Iteration 660/3560 Training loss: 2.3865 0.0686 sec/batch\n", + "Epoch 4/20 Iteration 661/3560 Training loss: 2.3863 0.0653 sec/batch\n", + "Epoch 4/20 Iteration 662/3560 Training loss: 2.3862 0.0668 sec/batch\n", + "Epoch 4/20 Iteration 663/3560 Training loss: 2.3859 0.0626 sec/batch\n", + "Epoch 4/20 Iteration 664/3560 Training loss: 2.3857 0.0613 sec/batch\n", + "Epoch 4/20 Iteration 665/3560 Training loss: 2.3854 
0.0656 sec/batch\n", + "Epoch 4/20 Iteration 666/3560 Training loss: 2.3851 0.0658 sec/batch\n", + "Epoch 4/20 Iteration 667/3560 Training loss: 2.3849 0.0647 sec/batch\n", + "Epoch 4/20 Iteration 668/3560 Training loss: 2.3848 0.0650 sec/batch\n", + "Epoch 4/20 Iteration 669/3560 Training loss: 2.3845 0.0688 sec/batch\n", + "Epoch 4/20 Iteration 670/3560 Training loss: 2.3843 0.0670 sec/batch\n", + "Epoch 4/20 Iteration 671/3560 Training loss: 2.3840 0.0710 sec/batch\n", + "Epoch 4/20 Iteration 672/3560 Training loss: 2.3839 0.0622 sec/batch\n", + "Epoch 4/20 Iteration 673/3560 Training loss: 2.3839 0.0628 sec/batch\n", + "Epoch 4/20 Iteration 674/3560 Training loss: 2.3835 0.0624 sec/batch\n", + "Epoch 4/20 Iteration 675/3560 Training loss: 2.3834 0.0694 sec/batch\n", + "Epoch 4/20 Iteration 676/3560 Training loss: 2.3831 0.0673 sec/batch\n", + "Epoch 4/20 Iteration 677/3560 Training loss: 2.3828 0.0673 sec/batch\n", + "Epoch 4/20 Iteration 678/3560 Training loss: 2.3825 0.0685 sec/batch\n", + "Epoch 4/20 Iteration 679/3560 Training loss: 2.3823 0.0658 sec/batch\n", + "Epoch 4/20 Iteration 680/3560 Training loss: 2.3822 0.0640 sec/batch\n", + "Epoch 4/20 Iteration 681/3560 Training loss: 2.3820 0.0633 sec/batch\n", + "Epoch 4/20 Iteration 682/3560 Training loss: 2.3821 0.0626 sec/batch\n", + "Epoch 4/20 Iteration 683/3560 Training loss: 2.3818 0.0801 sec/batch\n", + "Epoch 4/20 Iteration 684/3560 Training loss: 2.3815 0.0648 sec/batch\n", + "Epoch 4/20 Iteration 685/3560 Training loss: 2.3814 0.0621 sec/batch\n", + "Epoch 4/20 Iteration 686/3560 Training loss: 2.3815 0.0696 sec/batch\n", + "Epoch 4/20 Iteration 687/3560 Training loss: 2.3814 0.0637 sec/batch\n", + "Epoch 4/20 Iteration 688/3560 Training loss: 2.3813 0.0634 sec/batch\n", + "Epoch 4/20 Iteration 689/3560 Training loss: 2.3810 0.0668 sec/batch\n", + "Epoch 4/20 Iteration 690/3560 Training loss: 2.3808 0.0653 sec/batch\n", + "Epoch 4/20 Iteration 691/3560 Training loss: 2.3805 0.0680 sec/batch\n", + 
"Epoch 4/20 Iteration 692/3560 Training loss: 2.3802 0.0666 sec/batch\n", + "Epoch 4/20 Iteration 693/3560 Training loss: 2.3798 0.0627 sec/batch\n", + "Epoch 4/20 Iteration 694/3560 Training loss: 2.3796 0.0623 sec/batch\n", + "Epoch 4/20 Iteration 695/3560 Training loss: 2.3795 0.0636 sec/batch\n", + "Epoch 4/20 Iteration 696/3560 Training loss: 2.3791 0.0653 sec/batch\n", + "Epoch 4/20 Iteration 697/3560 Training loss: 2.3788 0.0630 sec/batch\n", + "Epoch 4/20 Iteration 698/3560 Training loss: 2.3786 0.0661 sec/batch\n", + "Epoch 4/20 Iteration 699/3560 Training loss: 2.3785 0.0688 sec/batch\n", + "Epoch 4/20 Iteration 700/3560 Training loss: 2.3782 0.0622 sec/batch\n", + "Epoch 4/20 Iteration 701/3560 Training loss: 2.3779 0.0687 sec/batch\n", + "Epoch 4/20 Iteration 702/3560 Training loss: 2.3777 0.0634 sec/batch\n", + "Epoch 4/20 Iteration 703/3560 Training loss: 2.3776 0.0632 sec/batch\n", + "Epoch 4/20 Iteration 704/3560 Training loss: 2.3773 0.0637 sec/batch\n", + "Epoch 4/20 Iteration 705/3560 Training loss: 2.3771 0.0742 sec/batch\n", + "Epoch 4/20 Iteration 706/3560 Training loss: 2.3769 0.0657 sec/batch\n", + "Epoch 4/20 Iteration 707/3560 Training loss: 2.3769 0.0878 sec/batch\n", + "Epoch 4/20 Iteration 708/3560 Training loss: 2.3768 0.0718 sec/batch\n", + "Epoch 4/20 Iteration 709/3560 Training loss: 2.3768 0.0618 sec/batch\n", + "Epoch 4/20 Iteration 710/3560 Training loss: 2.3766 0.0634 sec/batch\n", + "Epoch 4/20 Iteration 711/3560 Training loss: 2.3764 0.0642 sec/batch\n", + "Epoch 4/20 Iteration 712/3560 Training loss: 2.3760 0.0688 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 2.4047 0.0633 sec/batch\n", + "Epoch 5/20 Iteration 714/3560 Training loss: 2.3546 0.0672 sec/batch\n", + "Epoch 5/20 Iteration 715/3560 Training loss: 2.3378 0.0682 sec/batch\n", + "Epoch 5/20 Iteration 716/3560 Training loss: 2.3361 0.0648 sec/batch\n", + "Epoch 5/20 Iteration 717/3560 Training loss: 2.3349 0.0693 sec/batch\n", + "Epoch 5/20 Iteration 
718/3560 Training loss: 2.3324 0.0628 sec/batch\n", + "Epoch 5/20 Iteration 719/3560 Training loss: 2.3321 0.0675 sec/batch\n", + "Epoch 5/20 Iteration 720/3560 Training loss: 2.3342 0.0720 sec/batch\n", + "Epoch 5/20 Iteration 721/3560 Training loss: 2.3357 0.0639 sec/batch\n", + "Epoch 5/20 Iteration 722/3560 Training loss: 2.3342 0.0617 sec/batch\n", + "Epoch 5/20 Iteration 723/3560 Training loss: 2.3320 0.0690 sec/batch\n", + "Epoch 5/20 Iteration 724/3560 Training loss: 2.3314 0.0693 sec/batch\n", + "Epoch 5/20 Iteration 725/3560 Training loss: 2.3307 0.0654 sec/batch\n", + "Epoch 5/20 Iteration 726/3560 Training loss: 2.3327 0.0636 sec/batch\n", + "Epoch 5/20 Iteration 727/3560 Training loss: 2.3324 0.0638 sec/batch\n", + "Epoch 5/20 Iteration 728/3560 Training loss: 2.3320 0.0739 sec/batch\n", + "Epoch 5/20 Iteration 729/3560 Training loss: 2.3318 0.0674 sec/batch\n", + "Epoch 5/20 Iteration 730/3560 Training loss: 2.3339 0.0651 sec/batch\n", + "Epoch 5/20 Iteration 731/3560 Training loss: 2.3345 0.0649 sec/batch\n", + "Epoch 5/20 Iteration 732/3560 Training loss: 2.3334 0.0705 sec/batch\n", + "Epoch 5/20 Iteration 733/3560 Training loss: 2.3327 0.0640 sec/batch\n", + "Epoch 5/20 Iteration 734/3560 Training loss: 2.3333 0.0630 sec/batch\n", + "Epoch 5/20 Iteration 735/3560 Training loss: 2.3331 0.0727 sec/batch\n", + "Epoch 5/20 Iteration 736/3560 Training loss: 2.3323 0.0624 sec/batch\n", + "Epoch 5/20 Iteration 737/3560 Training loss: 2.3317 0.0691 sec/batch\n", + "Epoch 5/20 Iteration 738/3560 Training loss: 2.3313 0.0645 sec/batch\n", + "Epoch 5/20 Iteration 739/3560 Training loss: 2.3306 0.0669 sec/batch\n", + "Epoch 5/20 Iteration 740/3560 Training loss: 2.3304 0.0625 sec/batch\n", + "Epoch 5/20 Iteration 741/3560 Training loss: 2.3308 0.0718 sec/batch\n", + "Epoch 5/20 Iteration 742/3560 Training loss: 2.3311 0.0649 sec/batch\n", + "Epoch 5/20 Iteration 743/3560 Training loss: 2.3316 0.0663 sec/batch\n", + "Epoch 5/20 Iteration 744/3560 Training loss: 
2.3308 0.0720 sec/batch\n", + "Epoch 5/20 Iteration 745/3560 Training loss: 2.3301 0.0668 sec/batch\n", + "Epoch 5/20 Iteration 746/3560 Training loss: 2.3307 0.0664 sec/batch\n", + "Epoch 5/20 Iteration 747/3560 Training loss: 2.3298 0.0684 sec/batch\n", + "Epoch 5/20 Iteration 748/3560 Training loss: 2.3297 0.0714 sec/batch\n", + "Epoch 5/20 Iteration 749/3560 Training loss: 2.3294 0.0691 sec/batch\n", + "Epoch 5/20 Iteration 750/3560 Training loss: 2.3283 0.0638 sec/batch\n", + "Epoch 5/20 Iteration 751/3560 Training loss: 2.3277 0.0622 sec/batch\n", + "Epoch 5/20 Iteration 752/3560 Training loss: 2.3269 0.0620 sec/batch\n", + "Epoch 5/20 Iteration 753/3560 Training loss: 2.3263 0.0648 sec/batch\n", + "Epoch 5/20 Iteration 754/3560 Training loss: 2.3257 0.0652 sec/batch\n", + "Epoch 5/20 Iteration 755/3560 Training loss: 2.3250 0.0623 sec/batch\n", + "Epoch 5/20 Iteration 756/3560 Training loss: 2.3243 0.0724 sec/batch\n", + "Epoch 5/20 Iteration 757/3560 Training loss: 2.3239 0.0622 sec/batch\n", + "Epoch 5/20 Iteration 758/3560 Training loss: 2.3226 0.0660 sec/batch\n", + "Epoch 5/20 Iteration 759/3560 Training loss: 2.3224 0.0634 sec/batch\n", + "Epoch 5/20 Iteration 760/3560 Training loss: 2.3220 0.0650 sec/batch\n", + "Epoch 5/20 Iteration 761/3560 Training loss: 2.3217 0.0659 sec/batch\n", + "Epoch 5/20 Iteration 762/3560 Training loss: 2.3218 0.0677 sec/batch\n", + "Epoch 5/20 Iteration 763/3560 Training loss: 2.3210 0.0618 sec/batch\n", + "Epoch 5/20 Iteration 764/3560 Training loss: 2.3210 0.0680 sec/batch\n", + "Epoch 5/20 Iteration 765/3560 Training loss: 2.3205 0.0620 sec/batch\n", + "Epoch 5/20 Iteration 766/3560 Training loss: 2.3200 0.0695 sec/batch\n", + "Epoch 5/20 Iteration 767/3560 Training loss: 2.3196 0.0617 sec/batch\n", + "Epoch 5/20 Iteration 768/3560 Training loss: 2.3196 0.0635 sec/batch\n", + "Epoch 5/20 Iteration 769/3560 Training loss: 2.3193 0.0655 sec/batch\n", + "Epoch 5/20 Iteration 770/3560 Training loss: 2.3191 0.0649 
sec/batch\n", + "Epoch 5/20 Iteration 771/3560 Training loss: 2.3186 0.0719 sec/batch\n", + "Epoch 5/20 Iteration 772/3560 Training loss: 2.3186 0.0704 sec/batch\n", + "Epoch 5/20 Iteration 773/3560 Training loss: 2.3182 0.0667 sec/batch\n", + "Epoch 5/20 Iteration 774/3560 Training loss: 2.3183 0.0652 sec/batch\n", + "Epoch 5/20 Iteration 775/3560 Training loss: 2.3183 0.0711 sec/batch\n", + "Epoch 5/20 Iteration 776/3560 Training loss: 2.3180 0.0660 sec/batch\n", + "Epoch 5/20 Iteration 777/3560 Training loss: 2.3176 0.0662 sec/batch\n", + "Epoch 5/20 Iteration 778/3560 Training loss: 2.3175 0.0645 sec/batch\n", + "Epoch 5/20 Iteration 779/3560 Training loss: 2.3173 0.0663 sec/batch\n", + "Epoch 5/20 Iteration 780/3560 Training loss: 2.3165 0.0691 sec/batch\n", + "Epoch 5/20 Iteration 781/3560 Training loss: 2.3160 0.0683 sec/batch\n", + "Epoch 5/20 Iteration 782/3560 Training loss: 2.3159 0.0653 sec/batch\n", + "Epoch 5/20 Iteration 783/3560 Training loss: 2.3158 0.0618 sec/batch\n", + "Epoch 5/20 Iteration 784/3560 Training loss: 2.3159 0.0609 sec/batch\n", + "Epoch 5/20 Iteration 785/3560 Training loss: 2.3158 0.0685 sec/batch\n", + "Epoch 5/20 Iteration 786/3560 Training loss: 2.3154 0.0621 sec/batch\n", + "Epoch 5/20 Iteration 787/3560 Training loss: 2.3151 0.0722 sec/batch\n", + "Epoch 5/20 Iteration 788/3560 Training loss: 2.3154 0.0694 sec/batch\n", + "Epoch 5/20 Iteration 789/3560 Training loss: 2.3152 0.0812 sec/batch\n", + "Epoch 5/20 Iteration 790/3560 Training loss: 2.3151 0.0649 sec/batch\n", + "Epoch 5/20 Iteration 791/3560 Training loss: 2.3147 0.0641 sec/batch\n", + "Epoch 5/20 Iteration 792/3560 Training loss: 2.3143 0.0649 sec/batch\n", + "Epoch 5/20 Iteration 793/3560 Training loss: 2.3140 0.0619 sec/batch\n", + "Epoch 5/20 Iteration 794/3560 Training loss: 2.3140 0.0627 sec/batch\n", + "Epoch 5/20 Iteration 795/3560 Training loss: 2.3136 0.0631 sec/batch\n", + "Epoch 5/20 Iteration 796/3560 Training loss: 2.3132 0.0766 sec/batch\n", + "Epoch 
5/20 Iteration 797/3560 Training loss: 2.3126 0.0634 sec/batch\n", + "Epoch 5/20 Iteration 798/3560 Training loss: 2.3122 0.0612 sec/batch\n", + "Epoch 5/20 Iteration 799/3560 Training loss: 2.3120 0.0620 sec/batch\n", + "Epoch 5/20 Iteration 800/3560 Training loss: 2.3116 0.0615 sec/batch\n", + "Epoch 5/20 Iteration 801/3560 Training loss: 2.3112 0.0614 sec/batch\n", + "Epoch 5/20 Iteration 802/3560 Training loss: 2.3111 0.0657 sec/batch\n", + "Epoch 5/20 Iteration 803/3560 Training loss: 2.3108 0.0633 sec/batch\n", + "Epoch 5/20 Iteration 804/3560 Training loss: 2.3106 0.0603 sec/batch\n", + "Epoch 5/20 Iteration 805/3560 Training loss: 2.3104 0.0675 sec/batch\n", + "Epoch 5/20 Iteration 806/3560 Training loss: 2.3099 0.0740 sec/batch\n", + "Epoch 5/20 Iteration 807/3560 Training loss: 2.3094 0.0650 sec/batch\n", + "Epoch 5/20 Iteration 808/3560 Training loss: 2.3090 0.0644 sec/batch\n", + "Epoch 5/20 Iteration 809/3560 Training loss: 2.3088 0.0661 sec/batch\n", + "Epoch 5/20 Iteration 810/3560 Training loss: 2.3086 0.0672 sec/batch\n", + "Epoch 5/20 Iteration 811/3560 Training loss: 2.3083 0.0628 sec/batch\n", + "Epoch 5/20 Iteration 812/3560 Training loss: 2.3079 0.0669 sec/batch\n", + "Epoch 5/20 Iteration 813/3560 Training loss: 2.3078 0.0686 sec/batch\n", + "Epoch 5/20 Iteration 814/3560 Training loss: 2.3077 0.0658 sec/batch\n", + "Epoch 5/20 Iteration 815/3560 Training loss: 2.3072 0.0656 sec/batch\n", + "Epoch 5/20 Iteration 816/3560 Training loss: 2.3070 0.0640 sec/batch\n", + "Epoch 5/20 Iteration 817/3560 Training loss: 2.3066 0.0642 sec/batch\n", + "Epoch 5/20 Iteration 818/3560 Training loss: 2.3063 0.0620 sec/batch\n", + "Epoch 5/20 Iteration 819/3560 Training loss: 2.3061 0.0663 sec/batch\n", + "Epoch 5/20 Iteration 820/3560 Training loss: 2.3060 0.0627 sec/batch\n", + "Epoch 5/20 Iteration 821/3560 Training loss: 2.3060 0.0649 sec/batch\n", + "Epoch 5/20 Iteration 822/3560 Training loss: 2.3056 0.0751 sec/batch\n", + "Epoch 5/20 Iteration 823/3560 
Training loss: 2.3054 0.0661 sec/batch\n", + "Epoch 5/20 Iteration 824/3560 Training loss: 2.3053 0.0652 sec/batch\n", + "Epoch 5/20 Iteration 825/3560 Training loss: 2.3051 0.0620 sec/batch\n", + "Epoch 5/20 Iteration 826/3560 Training loss: 2.3049 0.0665 sec/batch\n", + "Epoch 5/20 Iteration 827/3560 Training loss: 2.3046 0.0675 sec/batch\n", + "Epoch 5/20 Iteration 828/3560 Training loss: 2.3041 0.0730 sec/batch\n", + "Epoch 5/20 Iteration 829/3560 Training loss: 2.3039 0.0653 sec/batch\n", + "Epoch 5/20 Iteration 830/3560 Training loss: 2.3037 0.0689 sec/batch\n", + "Epoch 5/20 Iteration 831/3560 Training loss: 2.3037 0.0659 sec/batch\n", + "Epoch 5/20 Iteration 832/3560 Training loss: 2.3036 0.0641 sec/batch\n", + "Epoch 5/20 Iteration 833/3560 Training loss: 2.3036 0.0694 sec/batch\n", + "Epoch 5/20 Iteration 834/3560 Training loss: 2.3034 0.0630 sec/batch\n", + "Epoch 5/20 Iteration 835/3560 Training loss: 2.3032 0.0655 sec/batch\n", + "Epoch 5/20 Iteration 836/3560 Training loss: 2.3032 0.0643 sec/batch\n", + "Epoch 5/20 Iteration 837/3560 Training loss: 2.3030 0.0670 sec/batch\n", + "Epoch 5/20 Iteration 838/3560 Training loss: 2.3026 0.0776 sec/batch\n", + "Epoch 5/20 Iteration 839/3560 Training loss: 2.3025 0.0724 sec/batch\n", + "Epoch 5/20 Iteration 840/3560 Training loss: 2.3024 0.0620 sec/batch\n", + "Epoch 5/20 Iteration 841/3560 Training loss: 2.3022 0.0662 sec/batch\n", + "Epoch 5/20 Iteration 842/3560 Training loss: 2.3021 0.0749 sec/batch\n", + "Epoch 5/20 Iteration 843/3560 Training loss: 2.3019 0.0642 sec/batch\n", + "Epoch 5/20 Iteration 844/3560 Training loss: 2.3016 0.0659 sec/batch\n", + "Epoch 5/20 Iteration 845/3560 Training loss: 2.3015 0.0667 sec/batch\n", + "Epoch 5/20 Iteration 846/3560 Training loss: 2.3016 0.0648 sec/batch\n", + "Epoch 5/20 Iteration 847/3560 Training loss: 2.3013 0.0646 sec/batch\n", + "Epoch 5/20 Iteration 848/3560 Training loss: 2.3011 0.0654 sec/batch\n", + "Epoch 5/20 Iteration 849/3560 Training loss: 2.3009 
0.0651 sec/batch\n", + "Epoch 5/20 Iteration 850/3560 Training loss: 2.3008 0.0707 sec/batch\n", + "[... per-iteration log lines for iterations 851-1368 elided: training loss decreases steadily from ~2.30 (epoch 5) to ~2.15 (epoch 8) at roughly 0.06-0.08 sec/batch ...]\n", + "Epoch 8/20 Iteration 1369/3560 Training loss: 2.1506 0.0737 sec/batch\n",
+ "Epoch 8/20 Iteration 1370/3560 Training loss: 2.1507 0.0749 sec/batch\n", + "Epoch 8/20 Iteration 1371/3560 Training loss: 2.1505 0.0677 sec/batch\n", + "Epoch 8/20 Iteration 1372/3560 Training loss: 2.1501 0.0650 sec/batch\n", + "Epoch 8/20 Iteration 1373/3560 Training loss: 2.1501 0.0748 sec/batch\n", + "Epoch 8/20 Iteration 1374/3560 Training loss: 2.1501 0.0644 sec/batch\n", + "Epoch 8/20 Iteration 1375/3560 Training loss: 2.1500 0.0653 sec/batch\n", + "Epoch 8/20 Iteration 1376/3560 Training loss: 2.1500 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1377/3560 Training loss: 2.1497 0.0708 sec/batch\n", + "Epoch 8/20 Iteration 1378/3560 Training loss: 2.1494 0.0700 sec/batch\n", + "Epoch 8/20 Iteration 1379/3560 Training loss: 2.1494 0.0634 sec/batch\n", + "Epoch 8/20 Iteration 1380/3560 Training loss: 2.1493 0.0747 sec/batch\n", + "Epoch 8/20 Iteration 1381/3560 Training loss: 2.1492 0.0653 sec/batch\n", + "Epoch 8/20 Iteration 1382/3560 Training loss: 2.1491 0.0670 sec/batch\n", + "Epoch 8/20 Iteration 1383/3560 Training loss: 2.1491 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1384/3560 Training loss: 2.1491 0.0664 sec/batch\n", + "Epoch 8/20 Iteration 1385/3560 Training loss: 2.1492 0.0671 sec/batch\n", + "Epoch 8/20 Iteration 1386/3560 Training loss: 2.1490 0.0752 sec/batch\n", + "Epoch 8/20 Iteration 1387/3560 Training loss: 2.1491 0.0649 sec/batch\n", + "Epoch 8/20 Iteration 1388/3560 Training loss: 2.1489 0.0759 sec/batch\n", + "Epoch 8/20 Iteration 1389/3560 Training loss: 2.1489 0.0699 sec/batch\n", + "Epoch 8/20 Iteration 1390/3560 Training loss: 2.1488 0.0699 sec/batch\n", + "Epoch 8/20 Iteration 1391/3560 Training loss: 2.1486 0.0638 sec/batch\n", + "Epoch 8/20 Iteration 1392/3560 Training loss: 2.1486 0.0654 sec/batch\n", + "Epoch 8/20 Iteration 1393/3560 Training loss: 2.1485 0.0645 sec/batch\n", + "Epoch 8/20 Iteration 1394/3560 Training loss: 2.1486 0.0684 sec/batch\n", + "Epoch 8/20 Iteration 1395/3560 Training loss: 2.1484 0.0630 
sec/batch\n", + "Epoch 8/20 Iteration 1396/3560 Training loss: 2.1482 0.0698 sec/batch\n", + "Epoch 8/20 Iteration 1397/3560 Training loss: 2.1481 0.0659 sec/batch\n", + "Epoch 8/20 Iteration 1398/3560 Training loss: 2.1483 0.0677 sec/batch\n", + "Epoch 8/20 Iteration 1399/3560 Training loss: 2.1483 0.0664 sec/batch\n", + "Epoch 8/20 Iteration 1400/3560 Training loss: 2.1482 0.0662 sec/batch\n", + "Epoch 8/20 Iteration 1401/3560 Training loss: 2.1480 0.0659 sec/batch\n", + "Epoch 8/20 Iteration 1402/3560 Training loss: 2.1479 0.0719 sec/batch\n", + "Epoch 8/20 Iteration 1403/3560 Training loss: 2.1478 0.0653 sec/batch\n", + "Epoch 8/20 Iteration 1404/3560 Training loss: 2.1477 0.0634 sec/batch\n", + "Epoch 8/20 Iteration 1405/3560 Training loss: 2.1475 0.0678 sec/batch\n", + "Epoch 8/20 Iteration 1406/3560 Training loss: 2.1476 0.0709 sec/batch\n", + "Epoch 8/20 Iteration 1407/3560 Training loss: 2.1476 0.0644 sec/batch\n", + "Epoch 8/20 Iteration 1408/3560 Training loss: 2.1474 0.0663 sec/batch\n", + "Epoch 8/20 Iteration 1409/3560 Training loss: 2.1474 0.0649 sec/batch\n", + "Epoch 8/20 Iteration 1410/3560 Training loss: 2.1474 0.0643 sec/batch\n", + "Epoch 8/20 Iteration 1411/3560 Training loss: 2.1473 0.0664 sec/batch\n", + "Epoch 8/20 Iteration 1412/3560 Training loss: 2.1472 0.0635 sec/batch\n", + "Epoch 8/20 Iteration 1413/3560 Training loss: 2.1471 0.0692 sec/batch\n", + "Epoch 8/20 Iteration 1414/3560 Training loss: 2.1471 0.0665 sec/batch\n", + "Epoch 8/20 Iteration 1415/3560 Training loss: 2.1471 0.0732 sec/batch\n", + "Epoch 8/20 Iteration 1416/3560 Training loss: 2.1469 0.0627 sec/batch\n", + "Epoch 8/20 Iteration 1417/3560 Training loss: 2.1468 0.0681 sec/batch\n", + "Epoch 8/20 Iteration 1418/3560 Training loss: 2.1466 0.0742 sec/batch\n", + "Epoch 8/20 Iteration 1419/3560 Training loss: 2.1466 0.0797 sec/batch\n", + "Epoch 8/20 Iteration 1420/3560 Training loss: 2.1466 0.0626 sec/batch\n", + "Epoch 8/20 Iteration 1421/3560 Training loss: 2.1466 
0.0769 sec/batch\n", + "Epoch 8/20 Iteration 1422/3560 Training loss: 2.1465 0.0654 sec/batch\n", + "Epoch 8/20 Iteration 1423/3560 Training loss: 2.1464 0.0630 sec/batch\n", + "Epoch 8/20 Iteration 1424/3560 Training loss: 2.1462 0.0652 sec/batch\n", + "Epoch 9/20 Iteration 1425/3560 Training loss: 2.1898 0.0667 sec/batch\n", + "Epoch 9/20 Iteration 1426/3560 Training loss: 2.1465 0.0685 sec/batch\n", + "Epoch 9/20 Iteration 1427/3560 Training loss: 2.1343 0.0706 sec/batch\n", + "Epoch 9/20 Iteration 1428/3560 Training loss: 2.1301 0.0638 sec/batch\n", + "Epoch 9/20 Iteration 1429/3560 Training loss: 2.1279 0.0689 sec/batch\n", + "Epoch 9/20 Iteration 1430/3560 Training loss: 2.1229 0.0727 sec/batch\n", + "Epoch 9/20 Iteration 1431/3560 Training loss: 2.1265 0.0649 sec/batch\n", + "Epoch 9/20 Iteration 1432/3560 Training loss: 2.1261 0.0720 sec/batch\n", + "Epoch 9/20 Iteration 1433/3560 Training loss: 2.1282 0.0706 sec/batch\n", + "Epoch 9/20 Iteration 1434/3560 Training loss: 2.1277 0.0688 sec/batch\n", + "Epoch 9/20 Iteration 1435/3560 Training loss: 2.1265 0.0701 sec/batch\n", + "Epoch 9/20 Iteration 1436/3560 Training loss: 2.1255 0.0648 sec/batch\n", + "Epoch 9/20 Iteration 1437/3560 Training loss: 2.1257 0.0728 sec/batch\n", + "Epoch 9/20 Iteration 1438/3560 Training loss: 2.1283 0.0730 sec/batch\n", + "Epoch 9/20 Iteration 1439/3560 Training loss: 2.1278 0.0665 sec/batch\n", + "Epoch 9/20 Iteration 1440/3560 Training loss: 2.1271 0.0653 sec/batch\n", + "Epoch 9/20 Iteration 1441/3560 Training loss: 2.1272 0.0695 sec/batch\n", + "Epoch 9/20 Iteration 1442/3560 Training loss: 2.1297 0.0632 sec/batch\n", + "Epoch 9/20 Iteration 1443/3560 Training loss: 2.1302 0.0657 sec/batch\n", + "Epoch 9/20 Iteration 1444/3560 Training loss: 2.1301 0.0621 sec/batch\n", + "Epoch 9/20 Iteration 1445/3560 Training loss: 2.1296 0.0654 sec/batch\n", + "Epoch 9/20 Iteration 1446/3560 Training loss: 2.1309 0.0657 sec/batch\n", + "Epoch 9/20 Iteration 1447/3560 Training loss: 
2.1304 0.0668 sec/batch\n", + "Epoch 9/20 Iteration 1448/3560 Training loss: 2.1294 0.0726 sec/batch\n", + "Epoch 9/20 Iteration 1449/3560 Training loss: 2.1288 0.0725 sec/batch\n", + "Epoch 9/20 Iteration 1450/3560 Training loss: 2.1287 0.0640 sec/batch\n", + "Epoch 9/20 Iteration 1451/3560 Training loss: 2.1278 0.0655 sec/batch\n", + "Epoch 9/20 Iteration 1452/3560 Training loss: 2.1281 0.0639 sec/batch\n", + "Epoch 9/20 Iteration 1453/3560 Training loss: 2.1289 0.0662 sec/batch\n", + "Epoch 9/20 Iteration 1454/3560 Training loss: 2.1289 0.0620 sec/batch\n", + "Epoch 9/20 Iteration 1455/3560 Training loss: 2.1295 0.0673 sec/batch\n", + "Epoch 9/20 Iteration 1456/3560 Training loss: 2.1289 0.0688 sec/batch\n", + "Epoch 9/20 Iteration 1457/3560 Training loss: 2.1288 0.0645 sec/batch\n", + "Epoch 9/20 Iteration 1458/3560 Training loss: 2.1294 0.0672 sec/batch\n", + "Epoch 9/20 Iteration 1459/3560 Training loss: 2.1292 0.0653 sec/batch\n", + "Epoch 9/20 Iteration 1460/3560 Training loss: 2.1291 0.0646 sec/batch\n", + "Epoch 9/20 Iteration 1461/3560 Training loss: 2.1288 0.0654 sec/batch\n", + "Epoch 9/20 Iteration 1462/3560 Training loss: 2.1277 0.0645 sec/batch\n", + "Epoch 9/20 Iteration 1463/3560 Training loss: 2.1265 0.0616 sec/batch\n", + "Epoch 9/20 Iteration 1464/3560 Training loss: 2.1261 0.0735 sec/batch\n", + "Epoch 9/20 Iteration 1465/3560 Training loss: 2.1254 0.0633 sec/batch\n", + "Epoch 9/20 Iteration 1466/3560 Training loss: 2.1252 0.0711 sec/batch\n", + "Epoch 9/20 Iteration 1467/3560 Training loss: 2.1245 0.0780 sec/batch\n", + "Epoch 9/20 Iteration 1468/3560 Training loss: 2.1238 0.0672 sec/batch\n", + "Epoch 9/20 Iteration 1469/3560 Training loss: 2.1237 0.0646 sec/batch\n", + "Epoch 9/20 Iteration 1470/3560 Training loss: 2.1224 0.0705 sec/batch\n", + "Epoch 9/20 Iteration 1471/3560 Training loss: 2.1225 0.0693 sec/batch\n", + "Epoch 9/20 Iteration 1472/3560 Training loss: 2.1221 0.0687 sec/batch\n", + "Epoch 9/20 Iteration 1473/3560 Training 
loss: 2.1219 0.0779 sec/batch\n", + "Epoch 9/20 Iteration 1474/3560 Training loss: 2.1223 0.0640 sec/batch\n", + "Epoch 9/20 Iteration 1475/3560 Training loss: 2.1215 0.0657 sec/batch\n", + "Epoch 9/20 Iteration 1476/3560 Training loss: 2.1219 0.0654 sec/batch\n", + "Epoch 9/20 Iteration 1477/3560 Training loss: 2.1215 0.0725 sec/batch\n", + "Epoch 9/20 Iteration 1478/3560 Training loss: 2.1212 0.0655 sec/batch\n", + "Epoch 9/20 Iteration 1479/3560 Training loss: 2.1209 0.0637 sec/batch\n", + "Epoch 9/20 Iteration 1480/3560 Training loss: 2.1210 0.0673 sec/batch\n", + "Epoch 9/20 Iteration 1481/3560 Training loss: 2.1211 0.0639 sec/batch\n", + "Epoch 9/20 Iteration 1482/3560 Training loss: 2.1207 0.0679 sec/batch\n", + "Epoch 9/20 Iteration 1483/3560 Training loss: 2.1204 0.0658 sec/batch\n", + "Epoch 9/20 Iteration 1484/3560 Training loss: 2.1204 0.0655 sec/batch\n", + "Epoch 9/20 Iteration 1485/3560 Training loss: 2.1201 0.0631 sec/batch\n", + "Epoch 9/20 Iteration 1486/3560 Training loss: 2.1204 0.0744 sec/batch\n", + "Epoch 9/20 Iteration 1487/3560 Training loss: 2.1208 0.0616 sec/batch\n", + "Epoch 9/20 Iteration 1488/3560 Training loss: 2.1208 0.0655 sec/batch\n", + "Epoch 9/20 Iteration 1489/3560 Training loss: 2.1205 0.0650 sec/batch\n", + "Epoch 9/20 Iteration 1490/3560 Training loss: 2.1207 0.0681 sec/batch\n", + "Epoch 9/20 Iteration 1491/3560 Training loss: 2.1206 0.0711 sec/batch\n", + "Epoch 9/20 Iteration 1492/3560 Training loss: 2.1201 0.0656 sec/batch\n", + "Epoch 9/20 Iteration 1493/3560 Training loss: 2.1197 0.0762 sec/batch\n", + "Epoch 9/20 Iteration 1494/3560 Training loss: 2.1196 0.0699 sec/batch\n", + "Epoch 9/20 Iteration 1495/3560 Training loss: 2.1199 0.0686 sec/batch\n", + "Epoch 9/20 Iteration 1496/3560 Training loss: 2.1201 0.0640 sec/batch\n", + "Epoch 9/20 Iteration 1497/3560 Training loss: 2.1204 0.0642 sec/batch\n", + "Epoch 9/20 Iteration 1498/3560 Training loss: 2.1202 0.0672 sec/batch\n", + "Epoch 9/20 Iteration 1499/3560 
Training loss: 2.1201 0.0653 sec/batch\n", + "Epoch 9/20 Iteration 1500/3560 Training loss: 2.1205 0.0627 sec/batch\n", + "Epoch 9/20 Iteration 1501/3560 Training loss: 2.1201 0.0690 sec/batch\n", + "Epoch 9/20 Iteration 1502/3560 Training loss: 2.1202 0.0648 sec/batch\n", + "Epoch 9/20 Iteration 1503/3560 Training loss: 2.1199 0.0673 sec/batch\n", + "Epoch 9/20 Iteration 1504/3560 Training loss: 2.1196 0.0677 sec/batch\n", + "Epoch 9/20 Iteration 1505/3560 Training loss: 2.1193 0.0636 sec/batch\n", + "Epoch 9/20 Iteration 1506/3560 Training loss: 2.1193 0.0660 sec/batch\n", + "Epoch 9/20 Iteration 1507/3560 Training loss: 2.1188 0.0638 sec/batch\n", + "Epoch 9/20 Iteration 1508/3560 Training loss: 2.1187 0.0649 sec/batch\n", + "Epoch 9/20 Iteration 1509/3560 Training loss: 2.1181 0.0609 sec/batch\n", + "Epoch 9/20 Iteration 1510/3560 Training loss: 2.1178 0.0632 sec/batch\n", + "Epoch 9/20 Iteration 1511/3560 Training loss: 2.1176 0.0684 sec/batch\n", + "Epoch 9/20 Iteration 1512/3560 Training loss: 2.1174 0.0679 sec/batch\n", + "Epoch 9/20 Iteration 1513/3560 Training loss: 2.1170 0.0649 sec/batch\n", + "Epoch 9/20 Iteration 1514/3560 Training loss: 2.1169 0.0659 sec/batch\n", + "Epoch 9/20 Iteration 1515/3560 Training loss: 2.1167 0.0655 sec/batch\n", + "Epoch 9/20 Iteration 1516/3560 Training loss: 2.1166 0.0656 sec/batch\n", + "Epoch 9/20 Iteration 1517/3560 Training loss: 2.1162 0.0617 sec/batch\n", + "Epoch 9/20 Iteration 1518/3560 Training loss: 2.1158 0.0680 sec/batch\n", + "Epoch 9/20 Iteration 1519/3560 Training loss: 2.1153 0.0664 sec/batch\n", + "Epoch 9/20 Iteration 1520/3560 Training loss: 2.1151 0.0678 sec/batch\n", + "Epoch 9/20 Iteration 1521/3560 Training loss: 2.1149 0.0676 sec/batch\n", + "Epoch 9/20 Iteration 1522/3560 Training loss: 2.1145 0.0667 sec/batch\n", + "Epoch 9/20 Iteration 1523/3560 Training loss: 2.1141 0.0727 sec/batch\n", + "Epoch 9/20 Iteration 1524/3560 Training loss: 2.1137 0.0707 sec/batch\n", + "Epoch 9/20 Iteration 
1525/3560 Training loss: 2.1138 0.0703 sec/batch\n", + "Epoch 9/20 Iteration 1526/3560 Training loss: 2.1137 0.0643 sec/batch\n", + "Epoch 9/20 Iteration 1527/3560 Training loss: 2.1133 0.0650 sec/batch\n", + "Epoch 9/20 Iteration 1528/3560 Training loss: 2.1131 0.0653 sec/batch\n", + "Epoch 9/20 Iteration 1529/3560 Training loss: 2.1129 0.0715 sec/batch\n", + "Epoch 9/20 Iteration 1530/3560 Training loss: 2.1128 0.0653 sec/batch\n", + "Epoch 9/20 Iteration 1531/3560 Training loss: 2.1127 0.0631 sec/batch\n", + "Epoch 9/20 Iteration 1532/3560 Training loss: 2.1126 0.0672 sec/batch\n", + "Epoch 9/20 Iteration 1533/3560 Training loss: 2.1127 0.0694 sec/batch\n", + "Epoch 9/20 Iteration 1534/3560 Training loss: 2.1125 0.0663 sec/batch\n", + "Epoch 9/20 Iteration 1535/3560 Training loss: 2.1124 0.0639 sec/batch\n", + "Epoch 9/20 Iteration 1536/3560 Training loss: 2.1124 0.0667 sec/batch\n", + "Epoch 9/20 Iteration 1537/3560 Training loss: 2.1123 0.0666 sec/batch\n", + "Epoch 9/20 Iteration 1538/3560 Training loss: 2.1122 0.0745 sec/batch\n", + "Epoch 9/20 Iteration 1539/3560 Training loss: 2.1119 0.0674 sec/batch\n", + "Epoch 9/20 Iteration 1540/3560 Training loss: 2.1115 0.0675 sec/batch\n", + "Epoch 9/20 Iteration 1541/3560 Training loss: 2.1115 0.0681 sec/batch\n", + "Epoch 9/20 Iteration 1542/3560 Training loss: 2.1113 0.0719 sec/batch\n", + "Epoch 9/20 Iteration 1543/3560 Training loss: 2.1114 0.0664 sec/batch\n", + "Epoch 9/20 Iteration 1544/3560 Training loss: 2.1113 0.0705 sec/batch\n", + "Epoch 9/20 Iteration 1545/3560 Training loss: 2.1112 0.0638 sec/batch\n", + "Epoch 9/20 Iteration 1546/3560 Training loss: 2.1110 0.0666 sec/batch\n", + "Epoch 9/20 Iteration 1547/3560 Training loss: 2.1107 0.0751 sec/batch\n", + "Epoch 9/20 Iteration 1548/3560 Training loss: 2.1109 0.0662 sec/batch\n", + "Epoch 9/20 Iteration 1549/3560 Training loss: 2.1106 0.0694 sec/batch\n", + "Epoch 9/20 Iteration 1550/3560 Training loss: 2.1101 0.0681 sec/batch\n", + "Epoch 9/20 
Iteration 1551/3560 Training loss: 2.1101 0.0642 sec/batch\n", + "Epoch 9/20 Iteration 1552/3560 Training loss: 2.1101 0.0644 sec/batch\n", + "Epoch 9/20 Iteration 1553/3560 Training loss: 2.1099 0.0650 sec/batch\n", + "Epoch 9/20 Iteration 1554/3560 Training loss: 2.1099 0.0727 sec/batch\n", + "Epoch 9/20 Iteration 1555/3560 Training loss: 2.1098 0.0714 sec/batch\n", + "Epoch 9/20 Iteration 1556/3560 Training loss: 2.1095 0.0672 sec/batch\n", + "Epoch 9/20 Iteration 1557/3560 Training loss: 2.1095 0.0644 sec/batch\n", + "Epoch 9/20 Iteration 1558/3560 Training loss: 2.1095 0.0736 sec/batch\n", + "Epoch 9/20 Iteration 1559/3560 Training loss: 2.1095 0.0639 sec/batch\n", + "Epoch 9/20 Iteration 1560/3560 Training loss: 2.1095 0.0683 sec/batch\n", + "Epoch 9/20 Iteration 1561/3560 Training loss: 2.1095 0.0644 sec/batch\n", + "Epoch 9/20 Iteration 1562/3560 Training loss: 2.1095 0.0691 sec/batch\n", + "Epoch 9/20 Iteration 1563/3560 Training loss: 2.1096 0.0741 sec/batch\n", + "Epoch 9/20 Iteration 1564/3560 Training loss: 2.1094 0.0705 sec/batch\n", + "Epoch 9/20 Iteration 1565/3560 Training loss: 2.1095 0.0649 sec/batch\n", + "Epoch 9/20 Iteration 1566/3560 Training loss: 2.1093 0.0626 sec/batch\n", + "Epoch 9/20 Iteration 1567/3560 Training loss: 2.1092 0.0640 sec/batch\n", + "Epoch 9/20 Iteration 1568/3560 Training loss: 2.1091 0.0643 sec/batch\n", + "Epoch 9/20 Iteration 1569/3560 Training loss: 2.1090 0.0643 sec/batch\n", + "Epoch 9/20 Iteration 1570/3560 Training loss: 2.1090 0.0631 sec/batch\n", + "Epoch 9/20 Iteration 1571/3560 Training loss: 2.1089 0.0637 sec/batch\n", + "Epoch 9/20 Iteration 1572/3560 Training loss: 2.1091 0.0693 sec/batch\n", + "Epoch 9/20 Iteration 1573/3560 Training loss: 2.1090 0.0643 sec/batch\n", + "Epoch 9/20 Iteration 1574/3560 Training loss: 2.1088 0.0636 sec/batch\n", + "Epoch 9/20 Iteration 1575/3560 Training loss: 2.1088 0.0650 sec/batch\n", + "Epoch 9/20 Iteration 1576/3560 Training loss: 2.1090 0.0696 sec/batch\n", + "Epoch 
9/20 Iteration 1577/3560 Training loss: 2.1090 0.0664 sec/batch\n", + "Epoch 9/20 Iteration 1578/3560 Training loss: 2.1090 0.0738 sec/batch\n", + "Epoch 9/20 Iteration 1579/3560 Training loss: 2.1089 0.0681 sec/batch\n", + "Epoch 9/20 Iteration 1580/3560 Training loss: 2.1088 0.0646 sec/batch\n", + "Epoch 9/20 Iteration 1581/3560 Training loss: 2.1087 0.0756 sec/batch\n", + "Epoch 9/20 Iteration 1582/3560 Training loss: 2.1087 0.0666 sec/batch\n", + "Epoch 9/20 Iteration 1583/3560 Training loss: 2.1085 0.0661 sec/batch\n", + "Epoch 9/20 Iteration 1584/3560 Training loss: 2.1085 0.0728 sec/batch\n", + "Epoch 9/20 Iteration 1585/3560 Training loss: 2.1086 0.0715 sec/batch\n", + "Epoch 9/20 Iteration 1586/3560 Training loss: 2.1084 0.0682 sec/batch\n", + "Epoch 9/20 Iteration 1587/3560 Training loss: 2.1084 0.0644 sec/batch\n", + "Epoch 9/20 Iteration 1588/3560 Training loss: 2.1083 0.0668 sec/batch\n", + "Epoch 9/20 Iteration 1589/3560 Training loss: 2.1083 0.0671 sec/batch\n", + "Epoch 9/20 Iteration 1590/3560 Training loss: 2.1082 0.0768 sec/batch\n", + "Epoch 9/20 Iteration 1591/3560 Training loss: 2.1081 0.0647 sec/batch\n", + "Epoch 9/20 Iteration 1592/3560 Training loss: 2.1083 0.0655 sec/batch\n", + "Epoch 9/20 Iteration 1593/3560 Training loss: 2.1082 0.0710 sec/batch\n", + "Epoch 9/20 Iteration 1594/3560 Training loss: 2.1081 0.0659 sec/batch\n", + "Epoch 9/20 Iteration 1595/3560 Training loss: 2.1079 0.0668 sec/batch\n", + "Epoch 9/20 Iteration 1596/3560 Training loss: 2.1078 0.0639 sec/batch\n", + "Epoch 9/20 Iteration 1597/3560 Training loss: 2.1078 0.0720 sec/batch\n", + "Epoch 9/20 Iteration 1598/3560 Training loss: 2.1078 0.0656 sec/batch\n", + "Epoch 9/20 Iteration 1599/3560 Training loss: 2.1079 0.0635 sec/batch\n", + "Epoch 9/20 Iteration 1600/3560 Training loss: 2.1078 0.0766 sec/batch\n", + "Epoch 9/20 Iteration 1601/3560 Training loss: 2.1076 0.0675 sec/batch\n", + "Epoch 9/20 Iteration 1602/3560 Training loss: 2.1075 0.0663 sec/batch\n", + 
"Epoch 10/20 Iteration 1603/3560 Training loss: 2.1602 0.0710 sec/batch\n", + "Epoch 10/20 Iteration 1604/3560 Training loss: 2.1198 0.0662 sec/batch\n", + "Epoch 10/20 Iteration 1605/3560 Training loss: 2.1057 0.0665 sec/batch\n", + "Epoch 10/20 Iteration 1606/3560 Training loss: 2.0988 0.0657 sec/batch\n", + "Epoch 10/20 Iteration 1607/3560 Training loss: 2.0968 0.0647 sec/batch\n", + "Epoch 10/20 Iteration 1608/3560 Training loss: 2.0917 0.0637 sec/batch\n", + "Epoch 10/20 Iteration 1609/3560 Training loss: 2.0911 0.0647 sec/batch\n", + "Epoch 10/20 Iteration 1610/3560 Training loss: 2.0918 0.0750 sec/batch\n", + "Epoch 10/20 Iteration 1611/3560 Training loss: 2.0942 0.0649 sec/batch\n", + "Epoch 10/20 Iteration 1612/3560 Training loss: 2.0932 0.0662 sec/batch\n", + "Epoch 10/20 Iteration 1613/3560 Training loss: 2.0904 0.0643 sec/batch\n", + "Epoch 10/20 Iteration 1614/3560 Training loss: 2.0896 0.0736 sec/batch\n", + "Epoch 10/20 Iteration 1615/3560 Training loss: 2.0896 0.0676 sec/batch\n", + "Epoch 10/20 Iteration 1616/3560 Training loss: 2.0924 0.0711 sec/batch\n", + "Epoch 10/20 Iteration 1617/3560 Training loss: 2.0920 0.0651 sec/batch\n", + "Epoch 10/20 Iteration 1618/3560 Training loss: 2.0919 0.0652 sec/batch\n", + "Epoch 10/20 Iteration 1619/3560 Training loss: 2.0926 0.0683 sec/batch\n", + "Epoch 10/20 Iteration 1620/3560 Training loss: 2.0942 0.0691 sec/batch\n", + "Epoch 10/20 Iteration 1621/3560 Training loss: 2.0947 0.0658 sec/batch\n", + "Epoch 10/20 Iteration 1622/3560 Training loss: 2.0946 0.0671 sec/batch\n", + "Epoch 10/20 Iteration 1623/3560 Training loss: 2.0937 0.0644 sec/batch\n", + "Epoch 10/20 Iteration 1624/3560 Training loss: 2.0949 0.0691 sec/batch\n", + "Epoch 10/20 Iteration 1625/3560 Training loss: 2.0943 0.0715 sec/batch\n", + "Epoch 10/20 Iteration 1626/3560 Training loss: 2.0934 0.0711 sec/batch\n", + "Epoch 10/20 Iteration 1627/3560 Training loss: 2.0929 0.0729 sec/batch\n", + "Epoch 10/20 Iteration 1628/3560 Training loss: 
2.0927 0.0653 sec/batch\n", + "Epoch 10/20 Iteration 1629/3560 Training loss: 2.0924 0.0637 sec/batch\n", + "Epoch 10/20 Iteration 1630/3560 Training loss: 2.0923 0.0648 sec/batch\n", + "Epoch 10/20 Iteration 1631/3560 Training loss: 2.0929 0.0696 sec/batch\n", + "Epoch 10/20 Iteration 1632/3560 Training loss: 2.0935 0.0673 sec/batch\n", + "Epoch 10/20 Iteration 1633/3560 Training loss: 2.0940 0.0725 sec/batch\n", + "Epoch 10/20 Iteration 1634/3560 Training loss: 2.0933 0.0656 sec/batch\n", + "Epoch 10/20 Iteration 1635/3560 Training loss: 2.0933 0.0698 sec/batch\n", + "Epoch 10/20 Iteration 1636/3560 Training loss: 2.0939 0.0638 sec/batch\n", + "Epoch 10/20 Iteration 1637/3560 Training loss: 2.0935 0.0669 sec/batch\n", + "Epoch 10/20 Iteration 1638/3560 Training loss: 2.0933 0.0652 sec/batch\n", + "Epoch 10/20 Iteration 1639/3560 Training loss: 2.0929 0.0737 sec/batch\n", + "Epoch 10/20 Iteration 1640/3560 Training loss: 2.0918 0.0682 sec/batch\n", + "Epoch 10/20 Iteration 1641/3560 Training loss: 2.0909 0.0652 sec/batch\n", + "Epoch 10/20 Iteration 1642/3560 Training loss: 2.0903 0.0632 sec/batch\n", + "Epoch 10/20 Iteration 1643/3560 Training loss: 2.0899 0.0672 sec/batch\n", + "Epoch 10/20 Iteration 1644/3560 Training loss: 2.0895 0.0762 sec/batch\n", + "Epoch 10/20 Iteration 1645/3560 Training loss: 2.0890 0.0627 sec/batch\n", + "Epoch 10/20 Iteration 1646/3560 Training loss: 2.0884 0.0725 sec/batch\n", + "Epoch 10/20 Iteration 1647/3560 Training loss: 2.0882 0.0651 sec/batch\n", + "Epoch 10/20 Iteration 1648/3560 Training loss: 2.0870 0.0624 sec/batch\n", + "Epoch 10/20 Iteration 1649/3560 Training loss: 2.0871 0.0694 sec/batch\n", + "Epoch 10/20 Iteration 1650/3560 Training loss: 2.0869 0.0653 sec/batch\n", + "Epoch 10/20 Iteration 1651/3560 Training loss: 2.0865 0.0642 sec/batch\n", + "Epoch 10/20 Iteration 1652/3560 Training loss: 2.0870 0.0662 sec/batch\n", + "Epoch 10/20 Iteration 1653/3560 Training loss: 2.0864 0.0659 sec/batch\n", + "Epoch 10/20 
Iteration 1654/3560 Training loss: 2.0871 0.0685 sec/batch\n", + "Epoch 10/20 Iteration 1655/3560 Training loss: 2.0867 0.0677 sec/batch\n", + "Epoch 10/20 Iteration 1656/3560 Training loss: 2.0865 0.0773 sec/batch\n", + "Epoch 10/20 Iteration 1657/3560 Training loss: 2.0860 0.0709 sec/batch\n", + "Epoch 10/20 Iteration 1658/3560 Training loss: 2.0861 0.0647 sec/batch\n", + "Epoch 10/20 Iteration 1659/3560 Training loss: 2.0862 0.0633 sec/batch\n", + "Epoch 10/20 Iteration 1660/3560 Training loss: 2.0859 0.0629 sec/batch\n", + "Epoch 10/20 Iteration 1661/3560 Training loss: 2.0856 0.0711 sec/batch\n", + "Epoch 10/20 Iteration 1662/3560 Training loss: 2.0858 0.0678 sec/batch\n", + "Epoch 10/20 Iteration 1663/3560 Training loss: 2.0854 0.0630 sec/batch\n", + "Epoch 10/20 Iteration 1664/3560 Training loss: 2.0859 0.0622 sec/batch\n", + "Epoch 10/20 Iteration 1665/3560 Training loss: 2.0861 0.0691 sec/batch\n", + "Epoch 10/20 Iteration 1666/3560 Training loss: 2.0861 0.0655 sec/batch\n", + "Epoch 10/20 Iteration 1667/3560 Training loss: 2.0859 0.0634 sec/batch\n", + "Epoch 10/20 Iteration 1668/3560 Training loss: 2.0862 0.0632 sec/batch\n", + "Epoch 10/20 Iteration 1669/3560 Training loss: 2.0862 0.0681 sec/batch\n", + "Epoch 10/20 Iteration 1670/3560 Training loss: 2.0857 0.0737 sec/batch\n", + "Epoch 10/20 Iteration 1671/3560 Training loss: 2.0854 0.0643 sec/batch\n", + "Epoch 10/20 Iteration 1672/3560 Training loss: 2.0853 0.0642 sec/batch\n", + "Epoch 10/20 Iteration 1673/3560 Training loss: 2.0857 0.0658 sec/batch\n", + "Epoch 10/20 Iteration 1674/3560 Training loss: 2.0859 0.0691 sec/batch\n", + "Epoch 10/20 Iteration 1675/3560 Training loss: 2.0860 0.0657 sec/batch\n", + "Epoch 10/20 Iteration 1676/3560 Training loss: 2.0857 0.0655 sec/batch\n", + "Epoch 10/20 Iteration 1677/3560 Training loss: 2.0856 0.0629 sec/batch\n", + "Epoch 10/20 Iteration 1678/3560 Training loss: 2.0860 0.0658 sec/batch\n", + "Epoch 10/20 Iteration 1679/3560 Training loss: 2.0858 0.0726 
sec/batch\n", + "Epoch 10/20 Iteration 1680/3560 Training loss: 2.0858 0.0624 sec/batch\n", + "Epoch 10/20 Iteration 1681/3560 Training loss: 2.0855 0.0650 sec/batch\n", + "Epoch 10/20 Iteration 1682/3560 Training loss: 2.0853 0.0634 sec/batch\n", + "Epoch 10/20 Iteration 1683/3560 Training loss: 2.0850 0.0712 sec/batch\n", + "Epoch 10/20 Iteration 1684/3560 Training loss: 2.0849 0.0645 sec/batch\n", + "Epoch 10/20 Iteration 1685/3560 Training loss: 2.0844 0.0788 sec/batch\n", + "Epoch 10/20 Iteration 1686/3560 Training loss: 2.0842 0.0721 sec/batch\n", + "Epoch 10/20 Iteration 1687/3560 Training loss: 2.0836 0.0635 sec/batch\n", + "Epoch 10/20 Iteration 1688/3560 Training loss: 2.0832 0.0662 sec/batch\n", + "Epoch 10/20 Iteration 1689/3560 Training loss: 2.0831 0.0692 sec/batch\n", + "Epoch 10/20 Iteration 1690/3560 Training loss: 2.0827 0.0637 sec/batch\n", + "Epoch 10/20 Iteration 1691/3560 Training loss: 2.0824 0.0801 sec/batch\n", + "Epoch 10/20 Iteration 1692/3560 Training loss: 2.0824 0.0640 sec/batch\n", + "Epoch 10/20 Iteration 1693/3560 Training loss: 2.0820 0.0683 sec/batch\n", + "Epoch 10/20 Iteration 1694/3560 Training loss: 2.0820 0.0641 sec/batch\n", + "Epoch 10/20 Iteration 1695/3560 Training loss: 2.0816 0.0668 sec/batch\n", + "Epoch 10/20 Iteration 1696/3560 Training loss: 2.0813 0.0696 sec/batch\n", + "Epoch 10/20 Iteration 1697/3560 Training loss: 2.0809 0.0694 sec/batch\n", + "Epoch 10/20 Iteration 1698/3560 Training loss: 2.0809 0.0646 sec/batch\n", + "Epoch 10/20 Iteration 1699/3560 Training loss: 2.0808 0.0635 sec/batch\n", + "Epoch 10/20 Iteration 1700/3560 Training loss: 2.0805 0.0638 sec/batch\n", + "Epoch 10/20 Iteration 1701/3560 Training loss: 2.0801 0.0656 sec/batch\n", + "Epoch 10/20 Iteration 1702/3560 Training loss: 2.0797 0.0654 sec/batch\n", + "Epoch 10/20 Iteration 1703/3560 Training loss: 2.0797 0.0648 sec/batch\n", + "Epoch 10/20 Iteration 1704/3560 Training loss: 2.0796 0.0632 sec/batch\n", + "Epoch 10/20 Iteration 1705/3560 
Training loss: 2.0792 0.0660 sec/batch\n", + "Epoch 10/20 Iteration 1706/3560 Training loss: 2.0790 0.0659 sec/batch\n", + "[… training log elided: iterations 1707–2216 (epochs 10–13 of 20), loss decreasing steadily from ≈2.079 to ≈1.993 at roughly 0.06–0.08 sec/batch …]\n", + "Epoch 13/20 Iteration 2217/3560
Training loss: 1.9926 0.0667 sec/batch\n", + "Epoch 13/20 Iteration 2218/3560 Training loss: 1.9926 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2219/3560 Training loss: 1.9921 0.0612 sec/batch\n", + "Epoch 13/20 Iteration 2220/3560 Training loss: 1.9919 0.0674 sec/batch\n", + "Epoch 13/20 Iteration 2221/3560 Training loss: 1.9914 0.0709 sec/batch\n", + "Epoch 13/20 Iteration 2222/3560 Training loss: 1.9911 0.0661 sec/batch\n", + "Epoch 13/20 Iteration 2223/3560 Training loss: 1.9910 0.0630 sec/batch\n", + "Epoch 13/20 Iteration 2224/3560 Training loss: 1.9908 0.0688 sec/batch\n", + "Epoch 13/20 Iteration 2225/3560 Training loss: 1.9903 0.0650 sec/batch\n", + "Epoch 13/20 Iteration 2226/3560 Training loss: 1.9904 0.0762 sec/batch\n", + "Epoch 13/20 Iteration 2227/3560 Training loss: 1.9900 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2228/3560 Training loss: 1.9900 0.0674 sec/batch\n", + "Epoch 13/20 Iteration 2229/3560 Training loss: 1.9895 0.0648 sec/batch\n", + "Epoch 13/20 Iteration 2230/3560 Training loss: 1.9892 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2231/3560 Training loss: 1.9890 0.0645 sec/batch\n", + "Epoch 13/20 Iteration 2232/3560 Training loss: 1.9889 0.0648 sec/batch\n", + "Epoch 13/20 Iteration 2233/3560 Training loss: 1.9888 0.0703 sec/batch\n", + "Epoch 13/20 Iteration 2234/3560 Training loss: 1.9884 0.0668 sec/batch\n", + "Epoch 13/20 Iteration 2235/3560 Training loss: 1.9880 0.0652 sec/batch\n", + "Epoch 13/20 Iteration 2236/3560 Training loss: 1.9876 0.0661 sec/batch\n", + "Epoch 13/20 Iteration 2237/3560 Training loss: 1.9875 0.0645 sec/batch\n", + "Epoch 13/20 Iteration 2238/3560 Training loss: 1.9874 0.0645 sec/batch\n", + "Epoch 13/20 Iteration 2239/3560 Training loss: 1.9871 0.0785 sec/batch\n", + "Epoch 13/20 Iteration 2240/3560 Training loss: 1.9869 0.0665 sec/batch\n", + "Epoch 13/20 Iteration 2241/3560 Training loss: 1.9867 0.0656 sec/batch\n", + "Epoch 13/20 Iteration 2242/3560 Training loss: 1.9866 0.0633 sec/batch\n", + 
"Epoch 13/20 Iteration 2243/3560 Training loss: 1.9866 0.0688 sec/batch\n", + "Epoch 13/20 Iteration 2244/3560 Training loss: 1.9865 0.0706 sec/batch\n", + "Epoch 13/20 Iteration 2245/3560 Training loss: 1.9865 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2246/3560 Training loss: 1.9865 0.0735 sec/batch\n", + "Epoch 13/20 Iteration 2247/3560 Training loss: 1.9864 0.0709 sec/batch\n", + "Epoch 13/20 Iteration 2248/3560 Training loss: 1.9863 0.0665 sec/batch\n", + "Epoch 13/20 Iteration 2249/3560 Training loss: 1.9862 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2250/3560 Training loss: 1.9861 0.0679 sec/batch\n", + "Epoch 13/20 Iteration 2251/3560 Training loss: 1.9859 0.0646 sec/batch\n", + "Epoch 13/20 Iteration 2252/3560 Training loss: 1.9855 0.0658 sec/batch\n", + "Epoch 13/20 Iteration 2253/3560 Training loss: 1.9854 0.0685 sec/batch\n", + "Epoch 13/20 Iteration 2254/3560 Training loss: 1.9853 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2255/3560 Training loss: 1.9853 0.0672 sec/batch\n", + "Epoch 13/20 Iteration 2256/3560 Training loss: 1.9852 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2257/3560 Training loss: 1.9852 0.0686 sec/batch\n", + "Epoch 13/20 Iteration 2258/3560 Training loss: 1.9850 0.0689 sec/batch\n", + "Epoch 13/20 Iteration 2259/3560 Training loss: 1.9848 0.0654 sec/batch\n", + "Epoch 13/20 Iteration 2260/3560 Training loss: 1.9848 0.0659 sec/batch\n", + "Epoch 13/20 Iteration 2261/3560 Training loss: 1.9847 0.0768 sec/batch\n", + "Epoch 13/20 Iteration 2262/3560 Training loss: 1.9843 0.0768 sec/batch\n", + "Epoch 13/20 Iteration 2263/3560 Training loss: 1.9844 0.0625 sec/batch\n", + "Epoch 13/20 Iteration 2264/3560 Training loss: 1.9844 0.0675 sec/batch\n", + "Epoch 13/20 Iteration 2265/3560 Training loss: 1.9844 0.0688 sec/batch\n", + "Epoch 13/20 Iteration 2266/3560 Training loss: 1.9844 0.0659 sec/batch\n", + "Epoch 13/20 Iteration 2267/3560 Training loss: 1.9842 0.0723 sec/batch\n", + "Epoch 13/20 Iteration 2268/3560 Training loss: 
1.9839 0.0643 sec/batch\n", + "Epoch 13/20 Iteration 2269/3560 Training loss: 1.9839 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2270/3560 Training loss: 1.9839 0.0774 sec/batch\n", + "Epoch 13/20 Iteration 2271/3560 Training loss: 1.9838 0.0698 sec/batch\n", + "Epoch 13/20 Iteration 2272/3560 Training loss: 1.9838 0.0687 sec/batch\n", + "Epoch 13/20 Iteration 2273/3560 Training loss: 1.9839 0.0635 sec/batch\n", + "Epoch 13/20 Iteration 2274/3560 Training loss: 1.9840 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2275/3560 Training loss: 1.9842 0.0655 sec/batch\n", + "Epoch 13/20 Iteration 2276/3560 Training loss: 1.9840 0.0618 sec/batch\n", + "Epoch 13/20 Iteration 2277/3560 Training loss: 1.9842 0.0640 sec/batch\n", + "Epoch 13/20 Iteration 2278/3560 Training loss: 1.9841 0.0650 sec/batch\n", + "Epoch 13/20 Iteration 2279/3560 Training loss: 1.9841 0.0626 sec/batch\n", + "Epoch 13/20 Iteration 2280/3560 Training loss: 1.9841 0.0693 sec/batch\n", + "Epoch 13/20 Iteration 2281/3560 Training loss: 1.9839 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2282/3560 Training loss: 1.9840 0.0668 sec/batch\n", + "Epoch 13/20 Iteration 2283/3560 Training loss: 1.9840 0.0630 sec/batch\n", + "Epoch 13/20 Iteration 2284/3560 Training loss: 1.9841 0.0649 sec/batch\n", + "Epoch 13/20 Iteration 2285/3560 Training loss: 1.9840 0.0656 sec/batch\n", + "Epoch 13/20 Iteration 2286/3560 Training loss: 1.9838 0.0754 sec/batch\n", + "Epoch 13/20 Iteration 2287/3560 Training loss: 1.9837 0.0737 sec/batch\n", + "Epoch 13/20 Iteration 2288/3560 Training loss: 1.9838 0.0720 sec/batch\n", + "Epoch 13/20 Iteration 2289/3560 Training loss: 1.9838 0.0633 sec/batch\n", + "Epoch 13/20 Iteration 2290/3560 Training loss: 1.9839 0.0665 sec/batch\n", + "Epoch 13/20 Iteration 2291/3560 Training loss: 1.9838 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2292/3560 Training loss: 1.9837 0.0677 sec/batch\n", + "Epoch 13/20 Iteration 2293/3560 Training loss: 1.9837 0.0720 sec/batch\n", + "Epoch 13/20 
Iteration 2294/3560 Training loss: 1.9837 0.0668 sec/batch\n", + "Epoch 13/20 Iteration 2295/3560 Training loss: 1.9835 0.0720 sec/batch\n", + "Epoch 13/20 Iteration 2296/3560 Training loss: 1.9836 0.0751 sec/batch\n", + "Epoch 13/20 Iteration 2297/3560 Training loss: 1.9837 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2298/3560 Training loss: 1.9836 0.0669 sec/batch\n", + "Epoch 13/20 Iteration 2299/3560 Training loss: 1.9837 0.0663 sec/batch\n", + "Epoch 13/20 Iteration 2300/3560 Training loss: 1.9836 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2301/3560 Training loss: 1.9836 0.0651 sec/batch\n", + "Epoch 13/20 Iteration 2302/3560 Training loss: 1.9835 0.0700 sec/batch\n", + "Epoch 13/20 Iteration 2303/3560 Training loss: 1.9835 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2304/3560 Training loss: 1.9837 0.0672 sec/batch\n", + "Epoch 13/20 Iteration 2305/3560 Training loss: 1.9836 0.0662 sec/batch\n", + "Epoch 13/20 Iteration 2306/3560 Training loss: 1.9835 0.0705 sec/batch\n", + "Epoch 13/20 Iteration 2307/3560 Training loss: 1.9833 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2308/3560 Training loss: 1.9832 0.0694 sec/batch\n", + "Epoch 13/20 Iteration 2309/3560 Training loss: 1.9833 0.0675 sec/batch\n", + "Epoch 13/20 Iteration 2310/3560 Training loss: 1.9833 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2311/3560 Training loss: 1.9833 0.0725 sec/batch\n", + "Epoch 13/20 Iteration 2312/3560 Training loss: 1.9832 0.0676 sec/batch\n", + "Epoch 13/20 Iteration 2313/3560 Training loss: 1.9831 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2314/3560 Training loss: 1.9830 0.0655 sec/batch\n", + "Epoch 14/20 Iteration 2315/3560 Training loss: 2.0358 0.0671 sec/batch\n", + "Epoch 14/20 Iteration 2316/3560 Training loss: 1.9937 0.0715 sec/batch\n", + "Epoch 14/20 Iteration 2317/3560 Training loss: 1.9831 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2318/3560 Training loss: 1.9770 0.0682 sec/batch\n", + "Epoch 14/20 Iteration 2319/3560 Training loss: 1.9748 0.0664 
sec/batch\n", + "Epoch 14/20 Iteration 2320/3560 Training loss: 1.9686 0.0670 sec/batch\n", + "Epoch 14/20 Iteration 2321/3560 Training loss: 1.9701 0.0654 sec/batch\n", + "Epoch 14/20 Iteration 2322/3560 Training loss: 1.9704 0.0680 sec/batch\n", + "Epoch 14/20 Iteration 2323/3560 Training loss: 1.9729 0.0671 sec/batch\n", + "Epoch 14/20 Iteration 2324/3560 Training loss: 1.9735 0.0669 sec/batch\n", + "Epoch 14/20 Iteration 2325/3560 Training loss: 1.9708 0.0772 sec/batch\n", + "Epoch 14/20 Iteration 2326/3560 Training loss: 1.9686 0.0701 sec/batch\n", + "Epoch 14/20 Iteration 2327/3560 Training loss: 1.9693 0.0660 sec/batch\n", + "Epoch 14/20 Iteration 2328/3560 Training loss: 1.9720 0.0661 sec/batch\n", + "Epoch 14/20 Iteration 2329/3560 Training loss: 1.9720 0.0722 sec/batch\n", + "Epoch 14/20 Iteration 2330/3560 Training loss: 1.9713 0.0655 sec/batch\n", + "Epoch 14/20 Iteration 2331/3560 Training loss: 1.9711 0.0641 sec/batch\n", + "Epoch 14/20 Iteration 2332/3560 Training loss: 1.9729 0.0671 sec/batch\n", + "Epoch 14/20 Iteration 2333/3560 Training loss: 1.9727 0.0628 sec/batch\n", + "Epoch 14/20 Iteration 2334/3560 Training loss: 1.9729 0.0634 sec/batch\n", + "Epoch 14/20 Iteration 2335/3560 Training loss: 1.9721 0.0701 sec/batch\n", + "Epoch 14/20 Iteration 2336/3560 Training loss: 1.9730 0.0672 sec/batch\n", + "Epoch 14/20 Iteration 2337/3560 Training loss: 1.9727 0.0782 sec/batch\n", + "Epoch 14/20 Iteration 2338/3560 Training loss: 1.9722 0.0653 sec/batch\n", + "Epoch 14/20 Iteration 2339/3560 Training loss: 1.9719 0.0687 sec/batch\n", + "Epoch 14/20 Iteration 2340/3560 Training loss: 1.9710 0.0641 sec/batch\n", + "Epoch 14/20 Iteration 2341/3560 Training loss: 1.9707 0.0676 sec/batch\n", + "Epoch 14/20 Iteration 2342/3560 Training loss: 1.9708 0.0699 sec/batch\n", + "Epoch 14/20 Iteration 2343/3560 Training loss: 1.9717 0.0654 sec/batch\n", + "Epoch 14/20 Iteration 2344/3560 Training loss: 1.9717 0.0646 sec/batch\n", + "Epoch 14/20 Iteration 2345/3560 
Training loss: 1.9719 0.0695 sec/batch\n", + "Epoch 14/20 Iteration 2346/3560 Training loss: 1.9711 0.0797 sec/batch\n", + "Epoch 14/20 Iteration 2347/3560 Training loss: 1.9711 0.0678 sec/batch\n", + "Epoch 14/20 Iteration 2348/3560 Training loss: 1.9720 0.0662 sec/batch\n", + "Epoch 14/20 Iteration 2349/3560 Training loss: 1.9717 0.0662 sec/batch\n", + "Epoch 14/20 Iteration 2350/3560 Training loss: 1.9714 0.0738 sec/batch\n", + "Epoch 14/20 Iteration 2351/3560 Training loss: 1.9707 0.0666 sec/batch\n", + "Epoch 14/20 Iteration 2352/3560 Training loss: 1.9697 0.0688 sec/batch\n", + "Epoch 14/20 Iteration 2353/3560 Training loss: 1.9685 0.0668 sec/batch\n", + "Epoch 14/20 Iteration 2354/3560 Training loss: 1.9681 0.0681 sec/batch\n", + "Epoch 14/20 Iteration 2355/3560 Training loss: 1.9677 0.0659 sec/batch\n", + "Epoch 14/20 Iteration 2356/3560 Training loss: 1.9678 0.0651 sec/batch\n", + "Epoch 14/20 Iteration 2357/3560 Training loss: 1.9674 0.0643 sec/batch\n", + "Epoch 14/20 Iteration 2358/3560 Training loss: 1.9668 0.0631 sec/batch\n", + "Epoch 14/20 Iteration 2359/3560 Training loss: 1.9667 0.0654 sec/batch\n", + "Epoch 14/20 Iteration 2360/3560 Training loss: 1.9657 0.0821 sec/batch\n", + "Epoch 14/20 Iteration 2361/3560 Training loss: 1.9656 0.0650 sec/batch\n", + "Epoch 14/20 Iteration 2362/3560 Training loss: 1.9651 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2363/3560 Training loss: 1.9649 0.0640 sec/batch\n", + "Epoch 14/20 Iteration 2364/3560 Training loss: 1.9656 0.0666 sec/batch\n", + "Epoch 14/20 Iteration 2365/3560 Training loss: 1.9650 0.0656 sec/batch\n", + "Epoch 14/20 Iteration 2366/3560 Training loss: 1.9657 0.0657 sec/batch\n", + "Epoch 14/20 Iteration 2367/3560 Training loss: 1.9655 0.0656 sec/batch\n", + "Epoch 14/20 Iteration 2368/3560 Training loss: 1.9653 0.0654 sec/batch\n", + "Epoch 14/20 Iteration 2369/3560 Training loss: 1.9648 0.0671 sec/batch\n", + "Epoch 14/20 Iteration 2370/3560 Training loss: 1.9650 0.0663 sec/batch\n", + 
"Epoch 14/20 Iteration 2371/3560 Training loss: 1.9651 0.0721 sec/batch\n", + "Epoch 14/20 Iteration 2372/3560 Training loss: 1.9650 0.0668 sec/batch\n", + "Epoch 14/20 Iteration 2373/3560 Training loss: 1.9645 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2374/3560 Training loss: 1.9648 0.0639 sec/batch\n", + "Epoch 14/20 Iteration 2375/3560 Training loss: 1.9648 0.0637 sec/batch\n", + "Epoch 14/20 Iteration 2376/3560 Training loss: 1.9653 0.0635 sec/batch\n", + "Epoch 14/20 Iteration 2377/3560 Training loss: 1.9656 0.0650 sec/batch\n", + "Epoch 14/20 Iteration 2378/3560 Training loss: 1.9658 0.0642 sec/batch\n", + "Epoch 14/20 Iteration 2379/3560 Training loss: 1.9655 0.0643 sec/batch\n", + "Epoch 14/20 Iteration 2380/3560 Training loss: 1.9658 0.0681 sec/batch\n", + "Epoch 14/20 Iteration 2381/3560 Training loss: 1.9660 0.0636 sec/batch\n", + "Epoch 14/20 Iteration 2382/3560 Training loss: 1.9655 0.0666 sec/batch\n", + "Epoch 14/20 Iteration 2383/3560 Training loss: 1.9653 0.0742 sec/batch\n", + "Epoch 14/20 Iteration 2384/3560 Training loss: 1.9652 0.0725 sec/batch\n", + "Epoch 14/20 Iteration 2385/3560 Training loss: 1.9655 0.0674 sec/batch\n", + "Epoch 14/20 Iteration 2386/3560 Training loss: 1.9658 0.0649 sec/batch\n", + "Epoch 14/20 Iteration 2387/3560 Training loss: 1.9660 0.0660 sec/batch\n", + "Epoch 14/20 Iteration 2388/3560 Training loss: 1.9659 0.0669 sec/batch\n", + "Epoch 14/20 Iteration 2389/3560 Training loss: 1.9659 0.0692 sec/batch\n", + "Epoch 14/20 Iteration 2390/3560 Training loss: 1.9661 0.0643 sec/batch\n", + "Epoch 14/20 Iteration 2391/3560 Training loss: 1.9659 0.0652 sec/batch\n", + "Epoch 14/20 Iteration 2392/3560 Training loss: 1.9659 0.0684 sec/batch\n", + "Epoch 14/20 Iteration 2393/3560 Training loss: 1.9655 0.0700 sec/batch\n", + "Epoch 14/20 Iteration 2394/3560 Training loss: 1.9655 0.0759 sec/batch\n", + "Epoch 14/20 Iteration 2395/3560 Training loss: 1.9651 0.0719 sec/batch\n", + "Epoch 14/20 Iteration 2396/3560 Training loss: 
1.9651 0.0659 sec/batch\n", + "Epoch 14/20 Iteration 2397/3560 Training loss: 1.9648 0.0652 sec/batch\n", + "Epoch 14/20 Iteration 2398/3560 Training loss: 1.9647 0.0638 sec/batch\n", + "Epoch 14/20 Iteration 2399/3560 Training loss: 1.9642 0.0650 sec/batch\n", + "Epoch 14/20 Iteration 2400/3560 Training loss: 1.9637 0.0723 sec/batch\n", + "Epoch 14/20 Iteration 2401/3560 Training loss: 1.9637 0.0646 sec/batch\n", + "Epoch 14/20 Iteration 2402/3560 Training loss: 1.9635 0.0641 sec/batch\n", + "Epoch 14/20 Iteration 2403/3560 Training loss: 1.9630 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2404/3560 Training loss: 1.9631 0.0662 sec/batch\n", + "Epoch 14/20 Iteration 2405/3560 Training loss: 1.9628 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2406/3560 Training loss: 1.9626 0.0637 sec/batch\n", + "Epoch 14/20 Iteration 2407/3560 Training loss: 1.9622 0.0650 sec/batch\n", + "Epoch 14/20 Iteration 2408/3560 Training loss: 1.9618 0.0642 sec/batch\n", + "Epoch 14/20 Iteration 2409/3560 Training loss: 1.9615 0.0744 sec/batch\n", + "Epoch 14/20 Iteration 2410/3560 Training loss: 1.9615 0.0660 sec/batch\n", + "Epoch 14/20 Iteration 2411/3560 Training loss: 1.9613 0.0635 sec/batch\n", + "Epoch 14/20 Iteration 2412/3560 Training loss: 1.9610 0.0661 sec/batch\n", + "Epoch 14/20 Iteration 2413/3560 Training loss: 1.9607 0.0650 sec/batch\n", + "Epoch 14/20 Iteration 2414/3560 Training loss: 1.9603 0.0672 sec/batch\n", + "Epoch 14/20 Iteration 2415/3560 Training loss: 1.9604 0.0714 sec/batch\n", + "Epoch 14/20 Iteration 2416/3560 Training loss: 1.9603 0.0662 sec/batch\n", + "Epoch 14/20 Iteration 2417/3560 Training loss: 1.9601 0.0660 sec/batch\n", + "Epoch 14/20 Iteration 2418/3560 Training loss: 1.9599 0.0697 sec/batch\n", + "Epoch 14/20 Iteration 2419/3560 Training loss: 1.9596 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2420/3560 Training loss: 1.9596 0.0694 sec/batch\n", + "Epoch 14/20 Iteration 2421/3560 Training loss: 1.9596 0.0669 sec/batch\n", + "Epoch 14/20 
Iteration 2422/3560 Training loss: 1.9595 0.0697 sec/batch\n", + "Epoch 14/20 Iteration 2423/3560 Training loss: 1.9595 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2424/3560 Training loss: 1.9594 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2425/3560 Training loss: 1.9593 0.0682 sec/batch\n", + "Epoch 14/20 Iteration 2426/3560 Training loss: 1.9592 0.0661 sec/batch\n", + "Epoch 14/20 Iteration 2427/3560 Training loss: 1.9591 0.0646 sec/batch\n", + "Epoch 14/20 Iteration 2428/3560 Training loss: 1.9589 0.0713 sec/batch\n", + "Epoch 14/20 Iteration 2429/3560 Training loss: 1.9587 0.0660 sec/batch\n", + "Epoch 14/20 Iteration 2430/3560 Training loss: 1.9583 0.0694 sec/batch\n", + "Epoch 14/20 Iteration 2431/3560 Training loss: 1.9582 0.0630 sec/batch\n", + "Epoch 14/20 Iteration 2432/3560 Training loss: 1.9582 0.0657 sec/batch\n", + "Epoch 14/20 Iteration 2433/3560 Training loss: 1.9582 0.0674 sec/batch\n", + "Epoch 14/20 Iteration 2434/3560 Training loss: 1.9581 0.0708 sec/batch\n", + "Epoch 14/20 Iteration 2435/3560 Training loss: 1.9580 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2436/3560 Training loss: 1.9579 0.0703 sec/batch\n", + "Epoch 14/20 Iteration 2437/3560 Training loss: 1.9577 0.0662 sec/batch\n", + "Epoch 14/20 Iteration 2438/3560 Training loss: 1.9577 0.0689 sec/batch\n", + "Epoch 14/20 Iteration 2439/3560 Training loss: 1.9575 0.0651 sec/batch\n", + "Epoch 14/20 Iteration 2440/3560 Training loss: 1.9571 0.0762 sec/batch\n", + "Epoch 14/20 Iteration 2441/3560 Training loss: 1.9572 0.0659 sec/batch\n", + "Epoch 14/20 Iteration 2442/3560 Training loss: 1.9573 0.0659 sec/batch\n", + "Epoch 14/20 Iteration 2443/3560 Training loss: 1.9573 0.0662 sec/batch\n", + "Epoch 14/20 Iteration 2444/3560 Training loss: 1.9574 0.0657 sec/batch\n", + "Epoch 14/20 Iteration 2445/3560 Training loss: 1.9572 0.0656 sec/batch\n", + "Epoch 14/20 Iteration 2446/3560 Training loss: 1.9569 0.0674 sec/batch\n", + "Epoch 14/20 Iteration 2447/3560 Training loss: 1.9570 0.0697 
sec/batch\n", + "Epoch 14/20 Iteration 2448/3560 Training loss: 1.9570 0.0860 sec/batch\n", + "Epoch 14/20 Iteration 2449/3560 Training loss: 1.9570 0.0646 sec/batch\n", + "Epoch 14/20 Iteration 2450/3560 Training loss: 1.9569 0.0639 sec/batch\n", + "Epoch 14/20 Iteration 2451/3560 Training loss: 1.9570 0.0635 sec/batch\n", + "Epoch 14/20 Iteration 2452/3560 Training loss: 1.9570 0.0668 sec/batch\n", + "Epoch 14/20 Iteration 2453/3560 Training loss: 1.9571 0.0684 sec/batch\n", + "Epoch 14/20 Iteration 2454/3560 Training loss: 1.9570 0.0634 sec/batch\n", + "Epoch 14/20 Iteration 2455/3560 Training loss: 1.9571 0.0627 sec/batch\n", + "Epoch 14/20 Iteration 2456/3560 Training loss: 1.9570 0.0645 sec/batch\n", + "Epoch 14/20 Iteration 2457/3560 Training loss: 1.9570 0.0680 sec/batch\n", + "Epoch 14/20 Iteration 2458/3560 Training loss: 1.9570 0.0654 sec/batch\n", + "Epoch 14/20 Iteration 2459/3560 Training loss: 1.9569 0.0632 sec/batch\n", + "Epoch 14/20 Iteration 2460/3560 Training loss: 1.9570 0.0631 sec/batch\n", + "Epoch 14/20 Iteration 2461/3560 Training loss: 1.9570 0.0690 sec/batch\n", + "Epoch 14/20 Iteration 2462/3560 Training loss: 1.9572 0.0771 sec/batch\n", + "Epoch 14/20 Iteration 2463/3560 Training loss: 1.9572 0.0687 sec/batch\n", + "Epoch 14/20 Iteration 2464/3560 Training loss: 1.9570 0.0686 sec/batch\n", + "Epoch 14/20 Iteration 2465/3560 Training loss: 1.9568 0.0729 sec/batch\n", + "Epoch 14/20 Iteration 2466/3560 Training loss: 1.9570 0.0672 sec/batch\n", + "Epoch 14/20 Iteration 2467/3560 Training loss: 1.9570 0.0639 sec/batch\n", + "Epoch 14/20 Iteration 2468/3560 Training loss: 1.9570 0.0653 sec/batch\n", + "Epoch 14/20 Iteration 2469/3560 Training loss: 1.9570 0.0644 sec/batch\n", + "Epoch 14/20 Iteration 2470/3560 Training loss: 1.9570 0.0669 sec/batch\n", + "Epoch 14/20 Iteration 2471/3560 Training loss: 1.9570 0.0665 sec/batch\n", + "Epoch 14/20 Iteration 2472/3560 Training loss: 1.9570 0.0640 sec/batch\n", + "Epoch 14/20 Iteration 2473/3560 
Training loss: 1.9568 0.0642 sec/batch\n", + "Epoch 14/20 Iteration 2474/3560 Training loss: 1.9570 0.0711 sec/batch\n", + "Epoch 14/20 Iteration 2475/3560 Training loss: 1.9570 0.0682 sec/batch\n", + "Epoch 14/20 Iteration 2476/3560 Training loss: 1.9570 0.0672 sec/batch\n", + "Epoch 14/20 Iteration 2477/3560 Training loss: 1.9570 0.0689 sec/batch\n", + "Epoch 14/20 Iteration 2478/3560 Training loss: 1.9570 0.0640 sec/batch\n", + "Epoch 14/20 Iteration 2479/3560 Training loss: 1.9570 0.0746 sec/batch\n", + "Epoch 14/20 Iteration 2480/3560 Training loss: 1.9569 0.0707 sec/batch\n", + "Epoch 14/20 Iteration 2481/3560 Training loss: 1.9568 0.0633 sec/batch\n", + "Epoch 14/20 Iteration 2482/3560 Training loss: 1.9571 0.0630 sec/batch\n", + "Epoch 14/20 Iteration 2483/3560 Training loss: 1.9570 0.0645 sec/batch\n", + "Epoch 14/20 Iteration 2484/3560 Training loss: 1.9569 0.0712 sec/batch\n", + "Epoch 14/20 Iteration 2485/3560 Training loss: 1.9568 0.0667 sec/batch\n", + "Epoch 14/20 Iteration 2486/3560 Training loss: 1.9566 0.0690 sec/batch\n", + "Epoch 14/20 Iteration 2487/3560 Training loss: 1.9567 0.0665 sec/batch\n", + "Epoch 14/20 Iteration 2488/3560 Training loss: 1.9567 0.0668 sec/batch\n", + "Epoch 14/20 Iteration 2489/3560 Training loss: 1.9567 0.0673 sec/batch\n", + "Epoch 14/20 Iteration 2490/3560 Training loss: 1.9567 0.0668 sec/batch\n", + "Epoch 14/20 Iteration 2491/3560 Training loss: 1.9565 0.0633 sec/batch\n", + "Epoch 14/20 Iteration 2492/3560 Training loss: 1.9565 0.0676 sec/batch\n", + "Epoch 15/20 Iteration 2493/3560 Training loss: 2.0152 0.0651 sec/batch\n", + "Epoch 15/20 Iteration 2494/3560 Training loss: 1.9782 0.0665 sec/batch\n", + "Epoch 15/20 Iteration 2495/3560 Training loss: 1.9633 0.0629 sec/batch\n", + "Epoch 15/20 Iteration 2496/3560 Training loss: 1.9549 0.0660 sec/batch\n", + "Epoch 15/20 Iteration 2497/3560 Training loss: 1.9522 0.0646 sec/batch\n", + "Epoch 15/20 Iteration 2498/3560 Training loss: 1.9461 0.0669 sec/batch\n", + 
"Epoch 15/20 Iteration 2499/3560 Training loss: 1.9485 0.0749 sec/batch\n", + "Epoch 15/20 Iteration 2500/3560 Training loss: 1.9479 0.0680 sec/batch\n", + "Epoch 15/20 Iteration 2501/3560 Training loss: 1.9498 0.0740 sec/batch\n", + "Epoch 15/20 Iteration 2502/3560 Training loss: 1.9497 0.0700 sec/batch\n", + "Epoch 15/20 Iteration 2503/3560 Training loss: 1.9463 0.0729 sec/batch\n", + "Epoch 15/20 Iteration 2504/3560 Training loss: 1.9446 0.0650 sec/batch\n", + "Epoch 15/20 Iteration 2505/3560 Training loss: 1.9454 0.0646 sec/batch\n", + "Epoch 15/20 Iteration 2506/3560 Training loss: 1.9479 0.0687 sec/batch\n", + "Epoch 15/20 Iteration 2507/3560 Training loss: 1.9473 0.0694 sec/batch\n", + "Epoch 15/20 Iteration 2508/3560 Training loss: 1.9453 0.0664 sec/batch\n", + "Epoch 15/20 Iteration 2509/3560 Training loss: 1.9458 0.0672 sec/batch\n", + "Epoch 15/20 Iteration 2510/3560 Training loss: 1.9475 0.0676 sec/batch\n", + "Epoch 15/20 Iteration 2511/3560 Training loss: 1.9481 0.0653 sec/batch\n", + "Epoch 15/20 Iteration 2512/3560 Training loss: 1.9494 0.0666 sec/batch\n", + "Epoch 15/20 Iteration 2513/3560 Training loss: 1.9487 0.0677 sec/batch\n", + "Epoch 15/20 Iteration 2514/3560 Training loss: 1.9496 0.0645 sec/batch\n", + "Epoch 15/20 Iteration 2515/3560 Training loss: 1.9494 0.0641 sec/batch\n", + "Epoch 15/20 Iteration 2516/3560 Training loss: 1.9490 0.0642 sec/batch\n", + "Epoch 15/20 Iteration 2517/3560 Training loss: 1.9484 0.0635 sec/batch\n", + "Epoch 15/20 Iteration 2518/3560 Training loss: 1.9479 0.0644 sec/batch\n", + "Epoch 15/20 Iteration 2519/3560 Training loss: 1.9474 0.0662 sec/batch\n", + "Epoch 15/20 Iteration 2520/3560 Training loss: 1.9473 0.0648 sec/batch\n", + "Epoch 15/20 Iteration 2521/3560 Training loss: 1.9483 0.0697 sec/batch\n", + "Epoch 15/20 Iteration 2522/3560 Training loss: 1.9486 0.0738 sec/batch\n", + "Epoch 15/20 Iteration 2523/3560 Training loss: 1.9484 0.0677 sec/batch\n", + "Epoch 15/20 Iteration 2524/3560 Training loss: 
1.9475 0.0687 sec/batch\n", + "Epoch 15/20 Iteration 2525/3560 Training loss: 1.9475 0.0649 sec/batch\n", + "Epoch 15/20 Iteration 2526/3560 Training loss: 1.9485 0.0673 sec/batch\n", + "Epoch 15/20 Iteration 2527/3560 Training loss: 1.9480 0.0645 sec/batch\n", + "Epoch 15/20 Iteration 2528/3560 Training loss: 1.9477 0.0784 sec/batch\n", + "Epoch 15/20 Iteration 2529/3560 Training loss: 1.9472 0.0705 sec/batch\n", + "Epoch 15/20 Iteration 2530/3560 Training loss: 1.9458 0.0675 sec/batch\n", + "Epoch 15/20 Iteration 2531/3560 Training loss: 1.9448 0.0688 sec/batch\n", + "Epoch 15/20 Iteration 2532/3560 Training loss: 1.9442 0.0664 sec/batch\n", + "Epoch 15/20 Iteration 2533/3560 Training loss: 1.9439 0.0658 sec/batch\n", + "Epoch 15/20 Iteration 2534/3560 Training loss: 1.9440 0.0655 sec/batch\n", + "Epoch 15/20 Iteration 2535/3560 Training loss: 1.9435 0.0698 sec/batch\n", + "Epoch 15/20 Iteration 2536/3560 Training loss: 1.9428 0.0710 sec/batch\n", + "Epoch 15/20 Iteration 2537/3560 Training loss: 1.9428 0.0719 sec/batch\n", + "Epoch 15/20 Iteration 2538/3560 Training loss: 1.9418 0.0685 sec/batch\n", + "Epoch 15/20 Iteration 2539/3560 Training loss: 1.9419 0.0636 sec/batch\n", + "Epoch 15/20 Iteration 2540/3560 Training loss: 1.9416 0.0641 sec/batch\n", + "Epoch 15/20 Iteration 2541/3560 Training loss: 1.9415 0.0674 sec/batch\n", + "Epoch 15/20 Iteration 2542/3560 Training loss: 1.9423 0.0700 sec/batch\n", + "Epoch 15/20 Iteration 2543/3560 Training loss: 1.9419 0.0663 sec/batch\n", + "Epoch 15/20 Iteration 2544/3560 Training loss: 1.9427 0.0627 sec/batch\n", + "Epoch 15/20 Iteration 2545/3560 Training loss: 1.9425 0.0681 sec/batch\n", + "Epoch 15/20 Iteration 2546/3560 Training loss: 1.9424 0.0732 sec/batch\n", + "Epoch 15/20 Iteration 2547/3560 Training loss: 1.9420 0.0640 sec/batch\n", + "Epoch 15/20 Iteration 2548/3560 Training loss: 1.9420 0.0698 sec/batch\n", + "Epoch 15/20 Iteration 2549/3560 Training loss: 1.9422 0.0677 sec/batch\n", + "Epoch 15/20 
Iteration 2550/3560 Training loss: 1.9419 0.0639 sec/batch\n", + "Epoch 15/20 Iteration 2551/3560 Training loss: 1.9415 0.0687 sec/batch\n", + "Epoch 15/20 Iteration 2552/3560 Training loss: 1.9417 0.0682 sec/batch\n", + "Epoch 15/20 Iteration 2553/3560 Training loss: 1.9416 0.0648 sec/batch\n", + "Epoch 15/20 Iteration 2554/3560 Training loss: 1.9424 0.0642 sec/batch\n", + "Epoch 15/20 Iteration 2555/3560 Training loss: 1.9427 0.0663 sec/batch\n", + "Epoch 15/20 Iteration 2556/3560 Training loss: 1.9429 0.0663 sec/batch\n", + "Epoch 15/20 Iteration 2557/3560 Training loss: 1.9429 0.0650 sec/batch\n", + "Epoch 15/20 Iteration 2558/3560 Training loss: 1.9431 0.0676 sec/batch\n", + "Epoch 15/20 Iteration 2559/3560 Training loss: 1.9433 0.0649 sec/batch\n", + "Epoch 15/20 Iteration 2560/3560 Training loss: 1.9428 0.0688 sec/batch\n", + "Epoch 15/20 Iteration 2561/3560 Training loss: 1.9428 0.0659 sec/batch\n", + "Epoch 15/20 Iteration 2562/3560 Training loss: 1.9427 0.0701 sec/batch\n", + "Epoch 15/20 Iteration 2563/3560 Training loss: 1.9429 0.0659 sec/batch\n", + "Epoch 15/20 Iteration 2564/3560 Training loss: 1.9431 0.0706 sec/batch\n", + "Epoch 15/20 Iteration 2565/3560 Training loss: 1.9434 0.0710 sec/batch\n", + "Epoch 15/20 Iteration 2566/3560 Training loss: 1.9430 0.0673 sec/batch\n", + "Epoch 15/20 Iteration 2567/3560 Training loss: 1.9429 0.0622 sec/batch\n", + "Epoch 15/20 Iteration 2568/3560 Training loss: 1.9431 0.0768 sec/batch\n", + "Epoch 15/20 Iteration 2569/3560 Training loss: 1.9429 0.0678 sec/batch\n", + "Epoch 15/20 Iteration 2570/3560 Training loss: 1.9429 0.0639 sec/batch\n", + "Epoch 15/20 Iteration 2571/3560 Training loss: 1.9424 0.0644 sec/batch\n", + "Epoch 15/20 Iteration 2572/3560 Training loss: 1.9422 0.0724 sec/batch\n", + "Epoch 15/20 Iteration 2573/3560 Training loss: 1.9417 0.0653 sec/batch\n", + "Epoch 15/20 Iteration 2574/3560 Training loss: 1.9418 0.0629 sec/batch\n", + "Epoch 15/20 Iteration 2575/3560 Training loss: 1.9413 0.0646 
sec/batch\n",
    "[... repetitive per-iteration training log truncated; per-epoch endpoints retained below ...]\n",
    "Epoch 15/20 Iteration 2670/3560 Training loss: 1.9330 0.0722 sec/batch\n",
    "Epoch 16/20 Iteration 2671/3560 Training loss: 1.9837 0.0676 sec/batch\n",
    "[...]\n",
    "Epoch 16/20 Iteration 2848/3560 Training loss: 1.9115 0.0635 sec/batch\n",
    "Epoch 17/20 Iteration 2849/3560 Training loss: 1.9692 0.0746 sec/batch\n",
    "[...]\n",
    "Epoch 17/20 Iteration 3026/3560 Training loss: 1.8909 0.0700 sec/batch\n",
    "Epoch 18/20 Iteration 3027/3560 Training loss: 1.9510 0.0660 sec/batch\n",
    "[...]\n",
    "Epoch 18/20 Iteration 3087/3560 Training loss: 1.8782 0.0646
sec/batch\n", + "Epoch 18/20 Iteration 3088/3560 Training loss: 1.8789 0.0715 sec/batch\n", + "Epoch 18/20 Iteration 3089/3560 Training loss: 1.8794 0.0667 sec/batch\n", + "Epoch 18/20 Iteration 3090/3560 Training loss: 1.8797 0.0669 sec/batch\n", + "Epoch 18/20 Iteration 3091/3560 Training loss: 1.8796 0.0665 sec/batch\n", + "Epoch 18/20 Iteration 3092/3560 Training loss: 1.8798 0.0684 sec/batch\n", + "Epoch 18/20 Iteration 3093/3560 Training loss: 1.8799 0.0689 sec/batch\n", + "Epoch 18/20 Iteration 3094/3560 Training loss: 1.8795 0.0659 sec/batch\n", + "Epoch 18/20 Iteration 3095/3560 Training loss: 1.8793 0.0670 sec/batch\n", + "Epoch 18/20 Iteration 3096/3560 Training loss: 1.8791 0.0662 sec/batch\n", + "Epoch 18/20 Iteration 3097/3560 Training loss: 1.8795 0.0750 sec/batch\n", + "Epoch 18/20 Iteration 3098/3560 Training loss: 1.8796 0.0658 sec/batch\n", + "Epoch 18/20 Iteration 3099/3560 Training loss: 1.8800 0.0660 sec/batch\n", + "Epoch 18/20 Iteration 3100/3560 Training loss: 1.8798 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3101/3560 Training loss: 1.8798 0.0780 sec/batch\n", + "Epoch 18/20 Iteration 3102/3560 Training loss: 1.8801 0.0686 sec/batch\n", + "Epoch 18/20 Iteration 3103/3560 Training loss: 1.8801 0.0641 sec/batch\n", + "Epoch 18/20 Iteration 3104/3560 Training loss: 1.8803 0.0649 sec/batch\n", + "Epoch 18/20 Iteration 3105/3560 Training loss: 1.8798 0.0664 sec/batch\n", + "Epoch 18/20 Iteration 3106/3560 Training loss: 1.8797 0.0689 sec/batch\n", + "Epoch 18/20 Iteration 3107/3560 Training loss: 1.8792 0.0671 sec/batch\n", + "Epoch 18/20 Iteration 3108/3560 Training loss: 1.8793 0.0669 sec/batch\n", + "Epoch 18/20 Iteration 3109/3560 Training loss: 1.8788 0.0734 sec/batch\n", + "Epoch 18/20 Iteration 3110/3560 Training loss: 1.8787 0.0708 sec/batch\n", + "Epoch 18/20 Iteration 3111/3560 Training loss: 1.8783 0.0661 sec/batch\n", + "Epoch 18/20 Iteration 3112/3560 Training loss: 1.8780 0.0700 sec/batch\n", + "Epoch 18/20 Iteration 3113/3560 
Training loss: 1.8779 0.0695 sec/batch\n", + "Epoch 18/20 Iteration 3114/3560 Training loss: 1.8777 0.0662 sec/batch\n", + "Epoch 18/20 Iteration 3115/3560 Training loss: 1.8773 0.0637 sec/batch\n", + "Epoch 18/20 Iteration 3116/3560 Training loss: 1.8775 0.0686 sec/batch\n", + "Epoch 18/20 Iteration 3117/3560 Training loss: 1.8771 0.0653 sec/batch\n", + "Epoch 18/20 Iteration 3118/3560 Training loss: 1.8771 0.0750 sec/batch\n", + "Epoch 18/20 Iteration 3119/3560 Training loss: 1.8768 0.0751 sec/batch\n", + "Epoch 18/20 Iteration 3120/3560 Training loss: 1.8764 0.0718 sec/batch\n", + "Epoch 18/20 Iteration 3121/3560 Training loss: 1.8761 0.0645 sec/batch\n", + "Epoch 18/20 Iteration 3122/3560 Training loss: 1.8760 0.0740 sec/batch\n", + "Epoch 18/20 Iteration 3123/3560 Training loss: 1.8759 0.0665 sec/batch\n", + "Epoch 18/20 Iteration 3124/3560 Training loss: 1.8756 0.0664 sec/batch\n", + "Epoch 18/20 Iteration 3125/3560 Training loss: 1.8751 0.0690 sec/batch\n", + "Epoch 18/20 Iteration 3126/3560 Training loss: 1.8747 0.0655 sec/batch\n", + "Epoch 18/20 Iteration 3127/3560 Training loss: 1.8747 0.0718 sec/batch\n", + "Epoch 18/20 Iteration 3128/3560 Training loss: 1.8746 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3129/3560 Training loss: 1.8744 0.0689 sec/batch\n", + "Epoch 18/20 Iteration 3130/3560 Training loss: 1.8742 0.0670 sec/batch\n", + "Epoch 18/20 Iteration 3131/3560 Training loss: 1.8740 0.0674 sec/batch\n", + "Epoch 18/20 Iteration 3132/3560 Training loss: 1.8739 0.0682 sec/batch\n", + "Epoch 18/20 Iteration 3133/3560 Training loss: 1.8738 0.0730 sec/batch\n", + "Epoch 18/20 Iteration 3134/3560 Training loss: 1.8738 0.0707 sec/batch\n", + "Epoch 18/20 Iteration 3135/3560 Training loss: 1.8738 0.0731 sec/batch\n", + "Epoch 18/20 Iteration 3136/3560 Training loss: 1.8738 0.0647 sec/batch\n", + "Epoch 18/20 Iteration 3137/3560 Training loss: 1.8738 0.0687 sec/batch\n", + "Epoch 18/20 Iteration 3138/3560 Training loss: 1.8736 0.0656 sec/batch\n", + 
"Epoch 18/20 Iteration 3139/3560 Training loss: 1.8736 0.0655 sec/batch\n", + "Epoch 18/20 Iteration 3140/3560 Training loss: 1.8735 0.0698 sec/batch\n", + "Epoch 18/20 Iteration 3141/3560 Training loss: 1.8733 0.0832 sec/batch\n", + "Epoch 18/20 Iteration 3142/3560 Training loss: 1.8729 0.0642 sec/batch\n", + "Epoch 18/20 Iteration 3143/3560 Training loss: 1.8728 0.0684 sec/batch\n", + "Epoch 18/20 Iteration 3144/3560 Training loss: 1.8728 0.0646 sec/batch\n", + "Epoch 18/20 Iteration 3145/3560 Training loss: 1.8728 0.0653 sec/batch\n", + "Epoch 18/20 Iteration 3146/3560 Training loss: 1.8728 0.0658 sec/batch\n", + "Epoch 18/20 Iteration 3147/3560 Training loss: 1.8727 0.0635 sec/batch\n", + "Epoch 18/20 Iteration 3148/3560 Training loss: 1.8724 0.0735 sec/batch\n", + "Epoch 18/20 Iteration 3149/3560 Training loss: 1.8722 0.0649 sec/batch\n", + "Epoch 18/20 Iteration 3150/3560 Training loss: 1.8724 0.0646 sec/batch\n", + "Epoch 18/20 Iteration 3151/3560 Training loss: 1.8722 0.0678 sec/batch\n", + "Epoch 18/20 Iteration 3152/3560 Training loss: 1.8719 0.0733 sec/batch\n", + "Epoch 18/20 Iteration 3153/3560 Training loss: 1.8720 0.0693 sec/batch\n", + "Epoch 18/20 Iteration 3154/3560 Training loss: 1.8721 0.0644 sec/batch\n", + "Epoch 18/20 Iteration 3155/3560 Training loss: 1.8721 0.0656 sec/batch\n", + "Epoch 18/20 Iteration 3156/3560 Training loss: 1.8720 0.0636 sec/batch\n", + "Epoch 18/20 Iteration 3157/3560 Training loss: 1.8718 0.0676 sec/batch\n", + "Epoch 18/20 Iteration 3158/3560 Training loss: 1.8716 0.0701 sec/batch\n", + "Epoch 18/20 Iteration 3159/3560 Training loss: 1.8716 0.0669 sec/batch\n", + "Epoch 18/20 Iteration 3160/3560 Training loss: 1.8717 0.0715 sec/batch\n", + "Epoch 18/20 Iteration 3161/3560 Training loss: 1.8716 0.0759 sec/batch\n", + "Epoch 18/20 Iteration 3162/3560 Training loss: 1.8716 0.0717 sec/batch\n", + "Epoch 18/20 Iteration 3163/3560 Training loss: 1.8717 0.0665 sec/batch\n", + "Epoch 18/20 Iteration 3164/3560 Training loss: 
1.8718 0.0711 sec/batch\n", + "Epoch 18/20 Iteration 3165/3560 Training loss: 1.8719 0.0657 sec/batch\n", + "Epoch 18/20 Iteration 3166/3560 Training loss: 1.8718 0.0650 sec/batch\n", + "Epoch 18/20 Iteration 3167/3560 Training loss: 1.8721 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3168/3560 Training loss: 1.8720 0.0668 sec/batch\n", + "Epoch 18/20 Iteration 3169/3560 Training loss: 1.8720 0.0660 sec/batch\n", + "Epoch 18/20 Iteration 3170/3560 Training loss: 1.8719 0.0641 sec/batch\n", + "Epoch 18/20 Iteration 3171/3560 Training loss: 1.8718 0.0705 sec/batch\n", + "Epoch 18/20 Iteration 3172/3560 Training loss: 1.8719 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3173/3560 Training loss: 1.8718 0.0635 sec/batch\n", + "Epoch 18/20 Iteration 3174/3560 Training loss: 1.8721 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3175/3560 Training loss: 1.8721 0.0696 sec/batch\n", + "Epoch 18/20 Iteration 3176/3560 Training loss: 1.8720 0.0679 sec/batch\n", + "Epoch 18/20 Iteration 3177/3560 Training loss: 1.8719 0.0645 sec/batch\n", + "Epoch 18/20 Iteration 3178/3560 Training loss: 1.8720 0.0656 sec/batch\n", + "Epoch 18/20 Iteration 3179/3560 Training loss: 1.8720 0.0660 sec/batch\n", + "Epoch 18/20 Iteration 3180/3560 Training loss: 1.8721 0.0681 sec/batch\n", + "Epoch 18/20 Iteration 3181/3560 Training loss: 1.8721 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3182/3560 Training loss: 1.8721 0.0698 sec/batch\n", + "Epoch 18/20 Iteration 3183/3560 Training loss: 1.8721 0.0642 sec/batch\n", + "Epoch 18/20 Iteration 3184/3560 Training loss: 1.8720 0.0752 sec/batch\n", + "Epoch 18/20 Iteration 3185/3560 Training loss: 1.8718 0.0677 sec/batch\n", + "Epoch 18/20 Iteration 3186/3560 Training loss: 1.8720 0.0679 sec/batch\n", + "Epoch 18/20 Iteration 3187/3560 Training loss: 1.8721 0.0730 sec/batch\n", + "Epoch 18/20 Iteration 3188/3560 Training loss: 1.8720 0.0700 sec/batch\n", + "Epoch 18/20 Iteration 3189/3560 Training loss: 1.8721 0.0744 sec/batch\n", + "Epoch 18/20 
Iteration 3190/3560 Training loss: 1.8722 0.0653 sec/batch\n", + "Epoch 18/20 Iteration 3191/3560 Training loss: 1.8722 0.0678 sec/batch\n", + "Epoch 18/20 Iteration 3192/3560 Training loss: 1.8720 0.0659 sec/batch\n", + "Epoch 18/20 Iteration 3193/3560 Training loss: 1.8720 0.0688 sec/batch\n", + "Epoch 18/20 Iteration 3194/3560 Training loss: 1.8724 0.0665 sec/batch\n", + "Epoch 18/20 Iteration 3195/3560 Training loss: 1.8723 0.0641 sec/batch\n", + "Epoch 18/20 Iteration 3196/3560 Training loss: 1.8723 0.0724 sec/batch\n", + "Epoch 18/20 Iteration 3197/3560 Training loss: 1.8722 0.0703 sec/batch\n", + "Epoch 18/20 Iteration 3198/3560 Training loss: 1.8721 0.0728 sec/batch\n", + "Epoch 18/20 Iteration 3199/3560 Training loss: 1.8721 0.0645 sec/batch\n", + "Epoch 18/20 Iteration 3200/3560 Training loss: 1.8721 0.0681 sec/batch\n", + "Epoch 18/20 Iteration 3201/3560 Training loss: 1.8722 0.0656 sec/batch\n", + "Epoch 18/20 Iteration 3202/3560 Training loss: 1.8722 0.0636 sec/batch\n", + "Epoch 18/20 Iteration 3203/3560 Training loss: 1.8721 0.0662 sec/batch\n", + "Epoch 18/20 Iteration 3204/3560 Training loss: 1.8721 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3205/3560 Training loss: 1.9323 0.0662 sec/batch\n", + "Epoch 19/20 Iteration 3206/3560 Training loss: 1.8906 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3207/3560 Training loss: 1.8817 0.0766 sec/batch\n", + "Epoch 19/20 Iteration 3208/3560 Training loss: 1.8767 0.0746 sec/batch\n", + "Epoch 19/20 Iteration 3209/3560 Training loss: 1.8728 0.0673 sec/batch\n", + "Epoch 19/20 Iteration 3210/3560 Training loss: 1.8638 0.0638 sec/batch\n", + "Epoch 19/20 Iteration 3211/3560 Training loss: 1.8663 0.0655 sec/batch\n", + "Epoch 19/20 Iteration 3212/3560 Training loss: 1.8671 0.0753 sec/batch\n", + "Epoch 19/20 Iteration 3213/3560 Training loss: 1.8688 0.0767 sec/batch\n", + "Epoch 19/20 Iteration 3214/3560 Training loss: 1.8700 0.0703 sec/batch\n", + "Epoch 19/20 Iteration 3215/3560 Training loss: 1.8662 0.0645 
sec/batch\n", + "Epoch 19/20 Iteration 3216/3560 Training loss: 1.8643 0.0730 sec/batch\n", + "Epoch 19/20 Iteration 3217/3560 Training loss: 1.8655 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3218/3560 Training loss: 1.8677 0.0701 sec/batch\n", + "Epoch 19/20 Iteration 3219/3560 Training loss: 1.8669 0.0674 sec/batch\n", + "Epoch 19/20 Iteration 3220/3560 Training loss: 1.8659 0.0734 sec/batch\n", + "Epoch 19/20 Iteration 3221/3560 Training loss: 1.8652 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3222/3560 Training loss: 1.8667 0.0667 sec/batch\n", + "Epoch 19/20 Iteration 3223/3560 Training loss: 1.8667 0.0710 sec/batch\n", + "Epoch 19/20 Iteration 3224/3560 Training loss: 1.8672 0.0695 sec/batch\n", + "Epoch 19/20 Iteration 3225/3560 Training loss: 1.8665 0.0643 sec/batch\n", + "Epoch 19/20 Iteration 3226/3560 Training loss: 1.8671 0.0679 sec/batch\n", + "Epoch 19/20 Iteration 3227/3560 Training loss: 1.8666 0.0640 sec/batch\n", + "Epoch 19/20 Iteration 3228/3560 Training loss: 1.8664 0.0670 sec/batch\n", + "Epoch 19/20 Iteration 3229/3560 Training loss: 1.8664 0.0649 sec/batch\n", + "Epoch 19/20 Iteration 3230/3560 Training loss: 1.8656 0.0718 sec/batch\n", + "Epoch 19/20 Iteration 3231/3560 Training loss: 1.8651 0.0656 sec/batch\n", + "Epoch 19/20 Iteration 3232/3560 Training loss: 1.8655 0.0666 sec/batch\n", + "Epoch 19/20 Iteration 3233/3560 Training loss: 1.8663 0.0660 sec/batch\n", + "Epoch 19/20 Iteration 3234/3560 Training loss: 1.8667 0.0706 sec/batch\n", + "Epoch 19/20 Iteration 3235/3560 Training loss: 1.8667 0.0709 sec/batch\n", + "Epoch 19/20 Iteration 3236/3560 Training loss: 1.8662 0.0742 sec/batch\n", + "Epoch 19/20 Iteration 3237/3560 Training loss: 1.8663 0.0655 sec/batch\n", + "Epoch 19/20 Iteration 3238/3560 Training loss: 1.8668 0.0684 sec/batch\n", + "Epoch 19/20 Iteration 3239/3560 Training loss: 1.8663 0.0664 sec/batch\n", + "Epoch 19/20 Iteration 3240/3560 Training loss: 1.8662 0.0675 sec/batch\n", + "Epoch 19/20 Iteration 3241/3560 
Training loss: 1.8657 0.0661 sec/batch\n", + "Epoch 19/20 Iteration 3242/3560 Training loss: 1.8646 0.0663 sec/batch\n", + "Epoch 19/20 Iteration 3243/3560 Training loss: 1.8635 0.0741 sec/batch\n", + "Epoch 19/20 Iteration 3244/3560 Training loss: 1.8632 0.0719 sec/batch\n", + "Epoch 19/20 Iteration 3245/3560 Training loss: 1.8627 0.0655 sec/batch\n", + "Epoch 19/20 Iteration 3246/3560 Training loss: 1.8631 0.0672 sec/batch\n", + "Epoch 19/20 Iteration 3247/3560 Training loss: 1.8627 0.0712 sec/batch\n", + "Epoch 19/20 Iteration 3248/3560 Training loss: 1.8621 0.0700 sec/batch\n", + "Epoch 19/20 Iteration 3249/3560 Training loss: 1.8624 0.0656 sec/batch\n", + "Epoch 19/20 Iteration 3250/3560 Training loss: 1.8612 0.0673 sec/batch\n", + "Epoch 19/20 Iteration 3251/3560 Training loss: 1.8611 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3252/3560 Training loss: 1.8608 0.0774 sec/batch\n", + "Epoch 19/20 Iteration 3253/3560 Training loss: 1.8606 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3254/3560 Training loss: 1.8613 0.0678 sec/batch\n", + "Epoch 19/20 Iteration 3255/3560 Training loss: 1.8607 0.0646 sec/batch\n", + "Epoch 19/20 Iteration 3256/3560 Training loss: 1.8615 0.0718 sec/batch\n", + "Epoch 19/20 Iteration 3257/3560 Training loss: 1.8615 0.0656 sec/batch\n", + "Epoch 19/20 Iteration 3258/3560 Training loss: 1.8615 0.0668 sec/batch\n", + "Epoch 19/20 Iteration 3259/3560 Training loss: 1.8611 0.0736 sec/batch\n", + "Epoch 19/20 Iteration 3260/3560 Training loss: 1.8611 0.0666 sec/batch\n", + "Epoch 19/20 Iteration 3261/3560 Training loss: 1.8615 0.0756 sec/batch\n", + "Epoch 19/20 Iteration 3262/3560 Training loss: 1.8613 0.0668 sec/batch\n", + "Epoch 19/20 Iteration 3263/3560 Training loss: 1.8610 0.0636 sec/batch\n", + "Epoch 19/20 Iteration 3264/3560 Training loss: 1.8615 0.0667 sec/batch\n", + "Epoch 19/20 Iteration 3265/3560 Training loss: 1.8616 0.0663 sec/batch\n", + "Epoch 19/20 Iteration 3266/3560 Training loss: 1.8625 0.0683 sec/batch\n", + 
"Epoch 19/20 Iteration 3267/3560 Training loss: 1.8629 0.0668 sec/batch\n", + "Epoch 19/20 Iteration 3268/3560 Training loss: 1.8631 0.0807 sec/batch\n", + "Epoch 19/20 Iteration 3269/3560 Training loss: 1.8630 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3270/3560 Training loss: 1.8633 0.0792 sec/batch\n", + "Epoch 19/20 Iteration 3271/3560 Training loss: 1.8634 0.0648 sec/batch\n", + "Epoch 19/20 Iteration 3272/3560 Training loss: 1.8629 0.0656 sec/batch\n", + "Epoch 19/20 Iteration 3273/3560 Training loss: 1.8628 0.0666 sec/batch\n", + "Epoch 19/20 Iteration 3274/3560 Training loss: 1.8629 0.0744 sec/batch\n", + "Epoch 19/20 Iteration 3275/3560 Training loss: 1.8631 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3276/3560 Training loss: 1.8632 0.0714 sec/batch\n", + "Epoch 19/20 Iteration 3277/3560 Training loss: 1.8636 0.0701 sec/batch\n", + "Epoch 19/20 Iteration 3278/3560 Training loss: 1.8634 0.0689 sec/batch\n", + "Epoch 19/20 Iteration 3279/3560 Training loss: 1.8633 0.0731 sec/batch\n", + "Epoch 19/20 Iteration 3280/3560 Training loss: 1.8634 0.0723 sec/batch\n", + "Epoch 19/20 Iteration 3281/3560 Training loss: 1.8633 0.0698 sec/batch\n", + "Epoch 19/20 Iteration 3282/3560 Training loss: 1.8633 0.0649 sec/batch\n", + "Epoch 19/20 Iteration 3283/3560 Training loss: 1.8629 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3284/3560 Training loss: 1.8630 0.0757 sec/batch\n", + "Epoch 19/20 Iteration 3285/3560 Training loss: 1.8625 0.0712 sec/batch\n", + "Epoch 19/20 Iteration 3286/3560 Training loss: 1.8625 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3287/3560 Training loss: 1.8621 0.0715 sec/batch\n", + "Epoch 19/20 Iteration 3288/3560 Training loss: 1.8621 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3289/3560 Training loss: 1.8615 0.0633 sec/batch\n", + "Epoch 19/20 Iteration 3290/3560 Training loss: 1.8612 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3291/3560 Training loss: 1.8610 0.0675 sec/batch\n", + "Epoch 19/20 Iteration 3292/3560 Training loss: 
1.8608 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3293/3560 Training loss: 1.8604 0.0723 sec/batch\n", + "Epoch 19/20 Iteration 3294/3560 Training loss: 1.8604 0.0721 sec/batch\n", + "Epoch 19/20 Iteration 3295/3560 Training loss: 1.8602 0.0668 sec/batch\n", + "Epoch 19/20 Iteration 3296/3560 Training loss: 1.8601 0.0709 sec/batch\n", + "Epoch 19/20 Iteration 3297/3560 Training loss: 1.8598 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3298/3560 Training loss: 1.8594 0.0660 sec/batch\n", + "Epoch 19/20 Iteration 3299/3560 Training loss: 1.8591 0.0664 sec/batch\n", + "Epoch 19/20 Iteration 3300/3560 Training loss: 1.8592 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3301/3560 Training loss: 1.8591 0.0760 sec/batch\n", + "Epoch 19/20 Iteration 3302/3560 Training loss: 1.8589 0.0667 sec/batch\n", + "Epoch 19/20 Iteration 3303/3560 Training loss: 1.8586 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3304/3560 Training loss: 1.8582 0.0709 sec/batch\n", + "Epoch 19/20 Iteration 3305/3560 Training loss: 1.8581 0.0741 sec/batch\n", + "Epoch 19/20 Iteration 3306/3560 Training loss: 1.8580 0.0687 sec/batch\n", + "Epoch 19/20 Iteration 3307/3560 Training loss: 1.8579 0.0660 sec/batch\n", + "Epoch 19/20 Iteration 3308/3560 Training loss: 1.8577 0.0668 sec/batch\n", + "Epoch 19/20 Iteration 3309/3560 Training loss: 1.8575 0.0676 sec/batch\n", + "Epoch 19/20 Iteration 3310/3560 Training loss: 1.8573 0.0633 sec/batch\n", + "Epoch 19/20 Iteration 3311/3560 Training loss: 1.8572 0.0655 sec/batch\n", + "Epoch 19/20 Iteration 3312/3560 Training loss: 1.8571 0.0636 sec/batch\n", + "Epoch 19/20 Iteration 3313/3560 Training loss: 1.8572 0.0652 sec/batch\n", + "Epoch 19/20 Iteration 3314/3560 Training loss: 1.8572 0.0634 sec/batch\n", + "Epoch 19/20 Iteration 3315/3560 Training loss: 1.8573 0.0679 sec/batch\n", + "Epoch 19/20 Iteration 3316/3560 Training loss: 1.8571 0.0652 sec/batch\n", + "Epoch 19/20 Iteration 3317/3560 Training loss: 1.8572 0.0640 sec/batch\n", + "Epoch 19/20 
Iteration 3318/3560 Training loss: 1.8571 0.0643 sec/batch\n", + "Epoch 19/20 Iteration 3319/3560 Training loss: 1.8569 0.0702 sec/batch\n", + "Epoch 19/20 Iteration 3320/3560 Training loss: 1.8564 0.0640 sec/batch\n", + "Epoch 19/20 Iteration 3321/3560 Training loss: 1.8564 0.0660 sec/batch\n", + "Epoch 19/20 Iteration 3322/3560 Training loss: 1.8564 0.0632 sec/batch\n", + "Epoch 19/20 Iteration 3323/3560 Training loss: 1.8564 0.0661 sec/batch\n", + "Epoch 19/20 Iteration 3324/3560 Training loss: 1.8564 0.0647 sec/batch\n", + "Epoch 19/20 Iteration 3325/3560 Training loss: 1.8563 0.0708 sec/batch\n", + "Epoch 19/20 Iteration 3326/3560 Training loss: 1.8561 0.0758 sec/batch\n", + "Epoch 19/20 Iteration 3327/3560 Training loss: 1.8558 0.0716 sec/batch\n", + "Epoch 19/20 Iteration 3328/3560 Training loss: 1.8559 0.0630 sec/batch\n", + "Epoch 19/20 Iteration 3329/3560 Training loss: 1.8558 0.0670 sec/batch\n", + "Epoch 19/20 Iteration 3330/3560 Training loss: 1.8555 0.0649 sec/batch\n", + "Epoch 19/20 Iteration 3331/3560 Training loss: 1.8556 0.0672 sec/batch\n", + "Epoch 19/20 Iteration 3332/3560 Training loss: 1.8556 0.0644 sec/batch\n", + "Epoch 19/20 Iteration 3333/3560 Training loss: 1.8556 0.0660 sec/batch\n", + "Epoch 19/20 Iteration 3334/3560 Training loss: 1.8554 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3335/3560 Training loss: 1.8552 0.0766 sec/batch\n", + "Epoch 19/20 Iteration 3336/3560 Training loss: 1.8550 0.0675 sec/batch\n", + "Epoch 19/20 Iteration 3337/3560 Training loss: 1.8549 0.0773 sec/batch\n", + "Epoch 19/20 Iteration 3338/3560 Training loss: 1.8550 0.0714 sec/batch\n", + "Epoch 19/20 Iteration 3339/3560 Training loss: 1.8550 0.0671 sec/batch\n", + "Epoch 19/20 Iteration 3340/3560 Training loss: 1.8550 0.0658 sec/batch\n", + "Epoch 19/20 Iteration 3341/3560 Training loss: 1.8551 0.0705 sec/batch\n", + "Epoch 19/20 Iteration 3342/3560 Training loss: 1.8551 0.0678 sec/batch\n", + "Epoch 19/20 Iteration 3343/3560 Training loss: 1.8552 0.0759 
sec/batch\n", + "Epoch 19/20 Iteration 3344/3560 Training loss: 1.8552 0.0737 sec/batch\n", + "Epoch 19/20 Iteration 3345/3560 Training loss: 1.8555 0.0677 sec/batch\n", + "Epoch 19/20 Iteration 3346/3560 Training loss: 1.8554 0.0644 sec/batch\n", + "Epoch 19/20 Iteration 3347/3560 Training loss: 1.8553 0.0687 sec/batch\n", + "Epoch 19/20 Iteration 3348/3560 Training loss: 1.8552 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3349/3560 Training loss: 1.8551 0.0694 sec/batch\n", + "Epoch 19/20 Iteration 3350/3560 Training loss: 1.8552 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3351/3560 Training loss: 1.8552 0.0637 sec/batch\n", + "Epoch 19/20 Iteration 3352/3560 Training loss: 1.8554 0.0664 sec/batch\n", + "Epoch 19/20 Iteration 3353/3560 Training loss: 1.8554 0.0685 sec/batch\n", + "Epoch 19/20 Iteration 3354/3560 Training loss: 1.8554 0.0705 sec/batch\n", + "Epoch 19/20 Iteration 3355/3560 Training loss: 1.8552 0.0666 sec/batch\n", + "Epoch 19/20 Iteration 3356/3560 Training loss: 1.8553 0.0675 sec/batch\n", + "Epoch 19/20 Iteration 3357/3560 Training loss: 1.8554 0.0686 sec/batch\n", + "Epoch 19/20 Iteration 3358/3560 Training loss: 1.8554 0.0750 sec/batch\n", + "Epoch 19/20 Iteration 3359/3560 Training loss: 1.8554 0.0648 sec/batch\n", + "Epoch 19/20 Iteration 3360/3560 Training loss: 1.8554 0.0634 sec/batch\n", + "Epoch 19/20 Iteration 3361/3560 Training loss: 1.8554 0.0655 sec/batch\n", + "Epoch 19/20 Iteration 3362/3560 Training loss: 1.8554 0.0681 sec/batch\n", + "Epoch 19/20 Iteration 3363/3560 Training loss: 1.8553 0.0709 sec/batch\n", + "Epoch 19/20 Iteration 3364/3560 Training loss: 1.8554 0.0679 sec/batch\n", + "Epoch 19/20 Iteration 3365/3560 Training loss: 1.8556 0.0648 sec/batch\n", + "Epoch 19/20 Iteration 3366/3560 Training loss: 1.8555 0.0637 sec/batch\n", + "Epoch 19/20 Iteration 3367/3560 Training loss: 1.8556 0.0699 sec/batch\n", + "Epoch 19/20 Iteration 3368/3560 Training loss: 1.8556 0.0648 sec/batch\n", + "Epoch 19/20 Iteration 3369/3560 
Training loss: 1.8556 0.0619 sec/batch\n", + "Epoch 19/20 Iteration 3370/3560 Training loss: 1.8555 0.0657 sec/batch\n", + "Epoch 19/20 Iteration 3371/3560 Training loss: 1.8555 0.0648 sec/batch\n", + "Epoch 19/20 Iteration 3372/3560 Training loss: 1.8558 0.0667 sec/batch\n", + "Epoch 19/20 Iteration 3373/3560 Training loss: 1.8558 0.0702 sec/batch\n", + "Epoch 19/20 Iteration 3374/3560 Training loss: 1.8558 0.0679 sec/batch\n", + "Epoch 19/20 Iteration 3375/3560 Training loss: 1.8557 0.0646 sec/batch\n", + "Epoch 19/20 Iteration 3376/3560 Training loss: 1.8555 0.0658 sec/batch\n", + "Epoch 19/20 Iteration 3377/3560 Training loss: 1.8556 0.0691 sec/batch\n", + "Epoch 19/20 Iteration 3378/3560 Training loss: 1.8556 0.0671 sec/batch\n", + "Epoch 19/20 Iteration 3379/3560 Training loss: 1.8557 0.0640 sec/batch\n", + "Epoch 19/20 Iteration 3380/3560 Training loss: 1.8556 0.0626 sec/batch\n", + "Epoch 19/20 Iteration 3381/3560 Training loss: 1.8555 0.0678 sec/batch\n", + "Epoch 19/20 Iteration 3382/3560 Training loss: 1.8555 0.0678 sec/batch\n", + "Epoch 20/20 Iteration 3383/3560 Training loss: 1.9040 0.0704 sec/batch\n", + "Epoch 20/20 Iteration 3384/3560 Training loss: 1.8734 0.0664 sec/batch\n", + "Epoch 20/20 Iteration 3385/3560 Training loss: 1.8656 0.0674 sec/batch\n", + "Epoch 20/20 Iteration 3386/3560 Training loss: 1.8623 0.0648 sec/batch\n", + "Epoch 20/20 Iteration 3387/3560 Training loss: 1.8584 0.0694 sec/batch\n", + "Epoch 20/20 Iteration 3388/3560 Training loss: 1.8508 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3389/3560 Training loss: 1.8522 0.0701 sec/batch\n", + "Epoch 20/20 Iteration 3390/3560 Training loss: 1.8519 0.0622 sec/batch\n", + "Epoch 20/20 Iteration 3391/3560 Training loss: 1.8541 0.0645 sec/batch\n", + "Epoch 20/20 Iteration 3392/3560 Training loss: 1.8537 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3393/3560 Training loss: 1.8506 0.0702 sec/batch\n", + "Epoch 20/20 Iteration 3394/3560 Training loss: 1.8486 0.0654 sec/batch\n", + 
"Epoch 20/20 Iteration 3395/3560 Training loss: 1.8489 0.0727 sec/batch\n", + "Epoch 20/20 Iteration 3396/3560 Training loss: 1.8512 0.0662 sec/batch\n", + "Epoch 20/20 Iteration 3397/3560 Training loss: 1.8505 0.0639 sec/batch\n", + "Epoch 20/20 Iteration 3398/3560 Training loss: 1.8497 0.0681 sec/batch\n", + "Epoch 20/20 Iteration 3399/3560 Training loss: 1.8493 0.0696 sec/batch\n", + "Epoch 20/20 Iteration 3400/3560 Training loss: 1.8512 0.0649 sec/batch\n", + "Epoch 20/20 Iteration 3401/3560 Training loss: 1.8514 0.0631 sec/batch\n", + "Epoch 20/20 Iteration 3402/3560 Training loss: 1.8519 0.0690 sec/batch\n", + "Epoch 20/20 Iteration 3403/3560 Training loss: 1.8508 0.0662 sec/batch\n", + "Epoch 20/20 Iteration 3404/3560 Training loss: 1.8519 0.0696 sec/batch\n", + "Epoch 20/20 Iteration 3405/3560 Training loss: 1.8516 0.0703 sec/batch\n", + "Epoch 20/20 Iteration 3406/3560 Training loss: 1.8508 0.0772 sec/batch\n", + "Epoch 20/20 Iteration 3407/3560 Training loss: 1.8503 0.0655 sec/batch\n", + "Epoch 20/20 Iteration 3408/3560 Training loss: 1.8492 0.0655 sec/batch\n", + "Epoch 20/20 Iteration 3409/3560 Training loss: 1.8486 0.0651 sec/batch\n", + "Epoch 20/20 Iteration 3410/3560 Training loss: 1.8492 0.0688 sec/batch\n", + "Epoch 20/20 Iteration 3411/3560 Training loss: 1.8500 0.0655 sec/batch\n", + "Epoch 20/20 Iteration 3412/3560 Training loss: 1.8502 0.0676 sec/batch\n", + "Epoch 20/20 Iteration 3413/3560 Training loss: 1.8499 0.0722 sec/batch\n", + "Epoch 20/20 Iteration 3414/3560 Training loss: 1.8493 0.0725 sec/batch\n", + "Epoch 20/20 Iteration 3415/3560 Training loss: 1.8494 0.0662 sec/batch\n", + "Epoch 20/20 Iteration 3416/3560 Training loss: 1.8501 0.0685 sec/batch\n", + "Epoch 20/20 Iteration 3417/3560 Training loss: 1.8497 0.0656 sec/batch\n", + "Epoch 20/20 Iteration 3418/3560 Training loss: 1.8499 0.0657 sec/batch\n", + "Epoch 20/20 Iteration 3419/3560 Training loss: 1.8497 0.0661 sec/batch\n", + "Epoch 20/20 Iteration 3420/3560 Training loss: 
1.8488 0.0650 sec/batch\n", + "Epoch 20/20 Iteration 3421/3560 Training loss: 1.8477 0.0634 sec/batch\n", + "Epoch 20/20 Iteration 3422/3560 Training loss: 1.8471 0.0674 sec/batch\n", + "Epoch 20/20 Iteration 3423/3560 Training loss: 1.8464 0.0634 sec/batch\n", + "Epoch 20/20 Iteration 3424/3560 Training loss: 1.8465 0.0703 sec/batch\n", + "Epoch 20/20 Iteration 3425/3560 Training loss: 1.8462 0.0694 sec/batch\n", + "Epoch 20/20 Iteration 3426/3560 Training loss: 1.8455 0.0674 sec/batch\n", + "Epoch 20/20 Iteration 3427/3560 Training loss: 1.8454 0.0662 sec/batch\n", + "Epoch 20/20 Iteration 3428/3560 Training loss: 1.8443 0.0657 sec/batch\n", + "Epoch 20/20 Iteration 3429/3560 Training loss: 1.8440 0.0696 sec/batch\n", + "Epoch 20/20 Iteration 3430/3560 Training loss: 1.8435 0.0652 sec/batch\n", + "Epoch 20/20 Iteration 3431/3560 Training loss: 1.8433 0.0748 sec/batch\n", + "Epoch 20/20 Iteration 3432/3560 Training loss: 1.8442 0.0699 sec/batch\n", + "Epoch 20/20 Iteration 3433/3560 Training loss: 1.8436 0.0711 sec/batch\n", + "Epoch 20/20 Iteration 3434/3560 Training loss: 1.8444 0.0674 sec/batch\n", + "Epoch 20/20 Iteration 3435/3560 Training loss: 1.8441 0.0710 sec/batch\n", + "Epoch 20/20 Iteration 3436/3560 Training loss: 1.8441 0.0660 sec/batch\n", + "Epoch 20/20 Iteration 3437/3560 Training loss: 1.8439 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3438/3560 Training loss: 1.8439 0.0655 sec/batch\n", + "Epoch 20/20 Iteration 3439/3560 Training loss: 1.8443 0.0675 sec/batch\n", + "Epoch 20/20 Iteration 3440/3560 Training loss: 1.8440 0.0690 sec/batch\n", + "Epoch 20/20 Iteration 3441/3560 Training loss: 1.8436 0.0753 sec/batch\n", + "Epoch 20/20 Iteration 3442/3560 Training loss: 1.8439 0.0695 sec/batch\n", + "Epoch 20/20 Iteration 3443/3560 Training loss: 1.8438 0.0738 sec/batch\n", + "Epoch 20/20 Iteration 3444/3560 Training loss: 1.8448 0.0687 sec/batch\n", + "Epoch 20/20 Iteration 3445/3560 Training loss: 1.8451 0.0647 sec/batch\n", + "Epoch 20/20 
Iteration 3446/3560 Training loss: 1.8453 0.0671 sec/batch\n", + "Epoch 20/20 Iteration 3447/3560 Training loss: 1.8452 0.0717 sec/batch\n", + "Epoch 20/20 Iteration 3448/3560 Training loss: 1.8455 0.0684 sec/batch\n", + "Epoch 20/20 Iteration 3449/3560 Training loss: 1.8456 0.0661 sec/batch\n", + "Epoch 20/20 Iteration 3450/3560 Training loss: 1.8452 0.0761 sec/batch\n", + "Epoch 20/20 Iteration 3451/3560 Training loss: 1.8450 0.0655 sec/batch\n", + "Epoch 20/20 Iteration 3452/3560 Training loss: 1.8450 0.0673 sec/batch\n", + "Epoch 20/20 Iteration 3453/3560 Training loss: 1.8454 0.0770 sec/batch\n", + "Epoch 20/20 Iteration 3454/3560 Training loss: 1.8456 0.0645 sec/batch\n", + "Epoch 20/20 Iteration 3455/3560 Training loss: 1.8459 0.0687 sec/batch\n", + "Epoch 20/20 Iteration 3456/3560 Training loss: 1.8458 0.0663 sec/batch\n", + "Epoch 20/20 Iteration 3457/3560 Training loss: 1.8457 0.0767 sec/batch\n", + "Epoch 20/20 Iteration 3458/3560 Training loss: 1.8458 0.0692 sec/batch\n", + "Epoch 20/20 Iteration 3459/3560 Training loss: 1.8458 0.0692 sec/batch\n", + "Epoch 20/20 Iteration 3460/3560 Training loss: 1.8458 0.0644 sec/batch\n", + "Epoch 20/20 Iteration 3461/3560 Training loss: 1.8454 0.0677 sec/batch\n", + "Epoch 20/20 Iteration 3462/3560 Training loss: 1.8452 0.0653 sec/batch\n", + "Epoch 20/20 Iteration 3463/3560 Training loss: 1.8447 0.0656 sec/batch\n", + "Epoch 20/20 Iteration 3464/3560 Training loss: 1.8448 0.0662 sec/batch\n", + "Epoch 20/20 Iteration 3465/3560 Training loss: 1.8444 0.0672 sec/batch\n", + "Epoch 20/20 Iteration 3466/3560 Training loss: 1.8443 0.0636 sec/batch\n", + "Epoch 20/20 Iteration 3467/3560 Training loss: 1.8438 0.0683 sec/batch\n", + "Epoch 20/20 Iteration 3468/3560 Training loss: 1.8435 0.0658 sec/batch\n", + "Epoch 20/20 Iteration 3469/3560 Training loss: 1.8435 0.0656 sec/batch\n", + "Epoch 20/20 Iteration 3470/3560 Training loss: 1.8432 0.0658 sec/batch\n", + "Epoch 20/20 Iteration 3471/3560 Training loss: 1.8429 0.0659 
sec/batch\n", + "[... lengthy training log truncated: run 1 finishes at Epoch 20/20 Iteration 3560/3560 with training loss 1.8390 at roughly 0.06-0.07 sec/batch; a second run then begins at Epoch 1/20 Iteration 1/3560 with loss 4.4193, falling to ~2.98 by the end of Epoch 2/20 and ~2.73 early in Epoch 3/20 ...]\n", + "Epoch
3/20 Iteration 410/3560 Training loss: 2.7264 0.0594 sec/batch\n", + "Epoch 3/20 Iteration 411/3560 Training loss: 2.7252 0.0613 sec/batch\n", + "Epoch 3/20 Iteration 412/3560 Training loss: 2.7240 0.0711 sec/batch\n", + "Epoch 3/20 Iteration 413/3560 Training loss: 2.7228 0.0615 sec/batch\n", + "Epoch 3/20 Iteration 414/3560 Training loss: 2.7216 0.0616 sec/batch\n", + "Epoch 3/20 Iteration 415/3560 Training loss: 2.7204 0.0629 sec/batch\n", + "Epoch 3/20 Iteration 416/3560 Training loss: 2.7196 0.0632 sec/batch\n", + "Epoch 3/20 Iteration 417/3560 Training loss: 2.7184 0.0609 sec/batch\n", + "Epoch 3/20 Iteration 418/3560 Training loss: 2.7177 0.0630 sec/batch\n", + "Epoch 3/20 Iteration 419/3560 Training loss: 2.7172 0.0630 sec/batch\n", + "Epoch 3/20 Iteration 420/3560 Training loss: 2.7161 0.0617 sec/batch\n", + "Epoch 3/20 Iteration 421/3560 Training loss: 2.7151 0.0632 sec/batch\n", + "Epoch 3/20 Iteration 422/3560 Training loss: 2.7145 0.0617 sec/batch\n", + "Epoch 3/20 Iteration 423/3560 Training loss: 2.7135 0.0607 sec/batch\n", + "Epoch 3/20 Iteration 424/3560 Training loss: 2.7122 0.0630 sec/batch\n", + "Epoch 3/20 Iteration 425/3560 Training loss: 2.7109 0.0617 sec/batch\n", + "Epoch 3/20 Iteration 426/3560 Training loss: 2.7102 0.0648 sec/batch\n", + "Epoch 3/20 Iteration 427/3560 Training loss: 2.7092 0.0626 sec/batch\n", + "Epoch 3/20 Iteration 428/3560 Training loss: 2.7085 0.0675 sec/batch\n", + "Epoch 3/20 Iteration 429/3560 Training loss: 2.7075 0.0672 sec/batch\n", + "Epoch 3/20 Iteration 430/3560 Training loss: 2.7065 0.0632 sec/batch\n", + "Epoch 3/20 Iteration 431/3560 Training loss: 2.7058 0.0621 sec/batch\n", + "Epoch 3/20 Iteration 432/3560 Training loss: 2.7052 0.0621 sec/batch\n", + "Epoch 3/20 Iteration 433/3560 Training loss: 2.7044 0.0617 sec/batch\n", + "Epoch 3/20 Iteration 434/3560 Training loss: 2.7038 0.0657 sec/batch\n", + "Epoch 3/20 Iteration 435/3560 Training loss: 2.7029 0.0630 sec/batch\n", + "Epoch 3/20 Iteration 436/3560 
Training loss: 2.7021 0.0660 sec/batch\n", + "Epoch 3/20 Iteration 437/3560 Training loss: 2.7012 0.0645 sec/batch\n", + "Epoch 3/20 Iteration 438/3560 Training loss: 2.7005 0.0604 sec/batch\n", + "Epoch 3/20 Iteration 439/3560 Training loss: 2.6997 0.0597 sec/batch\n", + "Epoch 3/20 Iteration 440/3560 Training loss: 2.6988 0.0666 sec/batch\n", + "Epoch 3/20 Iteration 441/3560 Training loss: 2.6977 0.0635 sec/batch\n", + "Epoch 3/20 Iteration 442/3560 Training loss: 2.6967 0.0700 sec/batch\n", + "Epoch 3/20 Iteration 443/3560 Training loss: 2.6958 0.0614 sec/batch\n", + "Epoch 3/20 Iteration 444/3560 Training loss: 2.6950 0.0590 sec/batch\n", + "Epoch 3/20 Iteration 445/3560 Training loss: 2.6942 0.0591 sec/batch\n", + "Epoch 3/20 Iteration 446/3560 Training loss: 2.6935 0.0674 sec/batch\n", + "Epoch 3/20 Iteration 447/3560 Training loss: 2.6927 0.0618 sec/batch\n", + "Epoch 3/20 Iteration 448/3560 Training loss: 2.6920 0.0616 sec/batch\n", + "Epoch 3/20 Iteration 449/3560 Training loss: 2.6912 0.0617 sec/batch\n", + "Epoch 3/20 Iteration 450/3560 Training loss: 2.6903 0.0612 sec/batch\n", + "Epoch 3/20 Iteration 451/3560 Training loss: 2.6893 0.0613 sec/batch\n", + "Epoch 3/20 Iteration 452/3560 Training loss: 2.6885 0.0663 sec/batch\n", + "Epoch 3/20 Iteration 453/3560 Training loss: 2.6878 0.0622 sec/batch\n", + "Epoch 3/20 Iteration 454/3560 Training loss: 2.6870 0.0626 sec/batch\n", + "Epoch 3/20 Iteration 455/3560 Training loss: 2.6862 0.0671 sec/batch\n", + "Epoch 3/20 Iteration 456/3560 Training loss: 2.6854 0.0598 sec/batch\n", + "Epoch 3/20 Iteration 457/3560 Training loss: 2.6847 0.0684 sec/batch\n", + "Epoch 3/20 Iteration 458/3560 Training loss: 2.6839 0.0605 sec/batch\n", + "Epoch 3/20 Iteration 459/3560 Training loss: 2.6831 0.0612 sec/batch\n", + "Epoch 3/20 Iteration 460/3560 Training loss: 2.6823 0.0594 sec/batch\n", + "Epoch 3/20 Iteration 461/3560 Training loss: 2.6815 0.0613 sec/batch\n", + "Epoch 3/20 Iteration 462/3560 Training loss: 2.6808 
0.0621 sec/batch\n", + "Epoch 3/20 Iteration 463/3560 Training loss: 2.6799 0.0656 sec/batch\n", + "Epoch 3/20 Iteration 464/3560 Training loss: 2.6793 0.0614 sec/batch\n", + "Epoch 3/20 Iteration 465/3560 Training loss: 2.6787 0.0608 sec/batch\n", + "Epoch 3/20 Iteration 466/3560 Training loss: 2.6778 0.0626 sec/batch\n", + "Epoch 3/20 Iteration 467/3560 Training loss: 2.6771 0.0656 sec/batch\n", + "Epoch 3/20 Iteration 468/3560 Training loss: 2.6766 0.0633 sec/batch\n", + "Epoch 3/20 Iteration 469/3560 Training loss: 2.6758 0.0615 sec/batch\n", + "Epoch 3/20 Iteration 470/3560 Training loss: 2.6750 0.0661 sec/batch\n", + "Epoch 3/20 Iteration 471/3560 Training loss: 2.6743 0.0585 sec/batch\n", + "Epoch 3/20 Iteration 472/3560 Training loss: 2.6733 0.0630 sec/batch\n", + "Epoch 3/20 Iteration 473/3560 Training loss: 2.6726 0.0663 sec/batch\n", + "Epoch 3/20 Iteration 474/3560 Training loss: 2.6720 0.0667 sec/batch\n", + "Epoch 3/20 Iteration 475/3560 Training loss: 2.6715 0.0678 sec/batch\n", + "Epoch 3/20 Iteration 476/3560 Training loss: 2.6708 0.0618 sec/batch\n", + "Epoch 3/20 Iteration 477/3560 Training loss: 2.6703 0.0633 sec/batch\n", + "Epoch 3/20 Iteration 478/3560 Training loss: 2.6697 0.0611 sec/batch\n", + "Epoch 3/20 Iteration 479/3560 Training loss: 2.6690 0.0602 sec/batch\n", + "Epoch 3/20 Iteration 480/3560 Training loss: 2.6685 0.0634 sec/batch\n", + "Epoch 3/20 Iteration 481/3560 Training loss: 2.6679 0.0618 sec/batch\n", + "Epoch 3/20 Iteration 482/3560 Training loss: 2.6670 0.0613 sec/batch\n", + "Epoch 3/20 Iteration 483/3560 Training loss: 2.6666 0.0628 sec/batch\n", + "Epoch 3/20 Iteration 484/3560 Training loss: 2.6661 0.0604 sec/batch\n", + "Epoch 3/20 Iteration 485/3560 Training loss: 2.6654 0.0601 sec/batch\n", + "Epoch 3/20 Iteration 486/3560 Training loss: 2.6649 0.0835 sec/batch\n", + "Epoch 3/20 Iteration 487/3560 Training loss: 2.6643 0.0609 sec/batch\n", + "Epoch 3/20 Iteration 488/3560 Training loss: 2.6636 0.0654 sec/batch\n", + 
"Epoch 3/20 Iteration 489/3560 Training loss: 2.6630 0.0631 sec/batch\n", + "Epoch 3/20 Iteration 490/3560 Training loss: 2.6625 0.0590 sec/batch\n", + "Epoch 3/20 Iteration 491/3560 Training loss: 2.6617 0.0597 sec/batch\n", + "Epoch 3/20 Iteration 492/3560 Training loss: 2.6611 0.0672 sec/batch\n", + "Epoch 3/20 Iteration 493/3560 Training loss: 2.6604 0.0648 sec/batch\n", + "Epoch 3/20 Iteration 494/3560 Training loss: 2.6598 0.0616 sec/batch\n", + "Epoch 3/20 Iteration 495/3560 Training loss: 2.6593 0.0623 sec/batch\n", + "Epoch 3/20 Iteration 496/3560 Training loss: 2.6587 0.0639 sec/batch\n", + "Epoch 3/20 Iteration 497/3560 Training loss: 2.6582 0.0628 sec/batch\n", + "Epoch 3/20 Iteration 498/3560 Training loss: 2.6575 0.0628 sec/batch\n", + "Epoch 3/20 Iteration 499/3560 Training loss: 2.6569 0.0686 sec/batch\n", + "Epoch 3/20 Iteration 500/3560 Training loss: 2.6563 0.0625 sec/batch\n", + "Epoch 3/20 Iteration 501/3560 Training loss: 2.6557 0.0619 sec/batch\n", + "Epoch 3/20 Iteration 502/3560 Training loss: 2.6553 0.0667 sec/batch\n", + "Epoch 3/20 Iteration 503/3560 Training loss: 2.6547 0.0659 sec/batch\n", + "Epoch 3/20 Iteration 504/3560 Training loss: 2.6544 0.0680 sec/batch\n", + "Epoch 3/20 Iteration 505/3560 Training loss: 2.6537 0.0607 sec/batch\n", + "Epoch 3/20 Iteration 506/3560 Training loss: 2.6531 0.0603 sec/batch\n", + "Epoch 3/20 Iteration 507/3560 Training loss: 2.6528 0.0611 sec/batch\n", + "Epoch 3/20 Iteration 508/3560 Training loss: 2.6525 0.0650 sec/batch\n", + "Epoch 3/20 Iteration 509/3560 Training loss: 2.6520 0.0649 sec/batch\n", + "Epoch 3/20 Iteration 510/3560 Training loss: 2.6515 0.0644 sec/batch\n", + "Epoch 3/20 Iteration 511/3560 Training loss: 2.6510 0.0632 sec/batch\n", + "Epoch 3/20 Iteration 512/3560 Training loss: 2.6504 0.0686 sec/batch\n", + "Epoch 3/20 Iteration 513/3560 Training loss: 2.6498 0.0722 sec/batch\n", + "Epoch 3/20 Iteration 514/3560 Training loss: 2.6492 0.0619 sec/batch\n", + "Epoch 3/20 Iteration 
515/3560 Training loss: 2.6485 0.0634 sec/batch\n", + "Epoch 3/20 Iteration 516/3560 Training loss: 2.6481 0.0633 sec/batch\n", + "Epoch 3/20 Iteration 517/3560 Training loss: 2.6476 0.0621 sec/batch\n", + "Epoch 3/20 Iteration 518/3560 Training loss: 2.6469 0.0622 sec/batch\n", + "Epoch 3/20 Iteration 519/3560 Training loss: 2.6462 0.0697 sec/batch\n", + "Epoch 3/20 Iteration 520/3560 Training loss: 2.6457 0.0638 sec/batch\n", + "Epoch 3/20 Iteration 521/3560 Training loss: 2.6453 0.0626 sec/batch\n", + "Epoch 3/20 Iteration 522/3560 Training loss: 2.6447 0.0614 sec/batch\n", + "Epoch 3/20 Iteration 523/3560 Training loss: 2.6442 0.0620 sec/batch\n", + "Epoch 3/20 Iteration 524/3560 Training loss: 2.6437 0.0650 sec/batch\n", + "Epoch 3/20 Iteration 525/3560 Training loss: 2.6432 0.0629 sec/batch\n", + "Epoch 3/20 Iteration 526/3560 Training loss: 2.6425 0.0681 sec/batch\n", + "Epoch 3/20 Iteration 527/3560 Training loss: 2.6421 0.0644 sec/batch\n", + "Epoch 3/20 Iteration 528/3560 Training loss: 2.6417 0.0639 sec/batch\n", + "Epoch 3/20 Iteration 529/3560 Training loss: 2.6415 0.0660 sec/batch\n", + "Epoch 3/20 Iteration 530/3560 Training loss: 2.6412 0.0669 sec/batch\n", + "Epoch 3/20 Iteration 531/3560 Training loss: 2.6410 0.0645 sec/batch\n", + "Epoch 3/20 Iteration 532/3560 Training loss: 2.6405 0.0621 sec/batch\n", + "Epoch 3/20 Iteration 533/3560 Training loss: 2.6400 0.0648 sec/batch\n", + "Epoch 3/20 Iteration 534/3560 Training loss: 2.6394 0.0666 sec/batch\n", + "Epoch 4/20 Iteration 535/3560 Training loss: 2.6427 0.0668 sec/batch\n", + "Epoch 4/20 Iteration 536/3560 Training loss: 2.5821 0.0610 sec/batch\n", + "Epoch 4/20 Iteration 537/3560 Training loss: 2.5659 0.0645 sec/batch\n", + "Epoch 4/20 Iteration 538/3560 Training loss: 2.5602 0.0613 sec/batch\n", + "Epoch 4/20 Iteration 539/3560 Training loss: 2.5550 0.0742 sec/batch\n", + "Epoch 4/20 Iteration 540/3560 Training loss: 2.5511 0.0640 sec/batch\n", + "Epoch 4/20 Iteration 541/3560 Training loss: 
2.5510 0.0623 sec/batch\n", + "Epoch 4/20 Iteration 542/3560 Training loss: 2.5515 0.0614 sec/batch\n", + "Epoch 4/20 Iteration 543/3560 Training loss: 2.5515 0.0611 sec/batch\n", + "Epoch 4/20 Iteration 544/3560 Training loss: 2.5505 0.0628 sec/batch\n", + "Epoch 4/20 Iteration 545/3560 Training loss: 2.5487 0.0618 sec/batch\n", + "Epoch 4/20 Iteration 546/3560 Training loss: 2.5485 0.0625 sec/batch\n", + "Epoch 4/20 Iteration 547/3560 Training loss: 2.5481 0.0623 sec/batch\n", + "Epoch 4/20 Iteration 548/3560 Training loss: 2.5502 0.0680 sec/batch\n", + "Epoch 4/20 Iteration 549/3560 Training loss: 2.5495 0.0664 sec/batch\n", + "Epoch 4/20 Iteration 550/3560 Training loss: 2.5494 0.0629 sec/batch\n", + "Epoch 4/20 Iteration 551/3560 Training loss: 2.5490 0.0598 sec/batch\n", + "Epoch 4/20 Iteration 552/3560 Training loss: 2.5508 0.0641 sec/batch\n", + "Epoch 4/20 Iteration 553/3560 Training loss: 2.5506 0.0639 sec/batch\n", + "Epoch 4/20 Iteration 554/3560 Training loss: 2.5492 0.0633 sec/batch\n", + "Epoch 4/20 Iteration 555/3560 Training loss: 2.5482 0.0605 sec/batch\n", + "Epoch 4/20 Iteration 556/3560 Training loss: 2.5491 0.0730 sec/batch\n", + "Epoch 4/20 Iteration 557/3560 Training loss: 2.5485 0.0693 sec/batch\n", + "Epoch 4/20 Iteration 558/3560 Training loss: 2.5479 0.0645 sec/batch\n", + "Epoch 4/20 Iteration 559/3560 Training loss: 2.5468 0.0625 sec/batch\n", + "Epoch 4/20 Iteration 560/3560 Training loss: 2.5466 0.0647 sec/batch\n", + "Epoch 4/20 Iteration 561/3560 Training loss: 2.5459 0.0628 sec/batch\n", + "Epoch 4/20 Iteration 562/3560 Training loss: 2.5453 0.0650 sec/batch\n", + "Epoch 4/20 Iteration 563/3560 Training loss: 2.5455 0.0604 sec/batch\n", + "Epoch 4/20 Iteration 564/3560 Training loss: 2.5452 0.0602 sec/batch\n", + "Epoch 4/20 Iteration 565/3560 Training loss: 2.5454 0.0620 sec/batch\n", + "Epoch 4/20 Iteration 566/3560 Training loss: 2.5450 0.0636 sec/batch\n", + "Epoch 4/20 Iteration 567/3560 Training loss: 2.5443 0.0621 
sec/batch\n", + "Epoch 4/20 Iteration 568/3560 Training loss: 2.5440 0.0718 sec/batch\n", + "Epoch 4/20 Iteration 569/3560 Training loss: 2.5435 0.0613 sec/batch\n", + "Epoch 4/20 Iteration 570/3560 Training loss: 2.5433 0.0676 sec/batch\n", + "Epoch 4/20 Iteration 571/3560 Training loss: 2.5425 0.0636 sec/batch\n", + "Epoch 4/20 Iteration 572/3560 Training loss: 2.5416 0.0631 sec/batch\n", + "Epoch 4/20 Iteration 573/3560 Training loss: 2.5409 0.0597 sec/batch\n", + "Epoch 4/20 Iteration 574/3560 Training loss: 2.5401 0.0668 sec/batch\n", + "Epoch 4/20 Iteration 575/3560 Training loss: 2.5394 0.0607 sec/batch\n", + "Epoch 4/20 Iteration 576/3560 Training loss: 2.5386 0.0603 sec/batch\n", + "Epoch 4/20 Iteration 577/3560 Training loss: 2.5378 0.0635 sec/batch\n", + "Epoch 4/20 Iteration 578/3560 Training loss: 2.5373 0.0634 sec/batch\n", + "Epoch 4/20 Iteration 579/3560 Training loss: 2.5368 0.0647 sec/batch\n", + "Epoch 4/20 Iteration 580/3560 Training loss: 2.5358 0.0752 sec/batch\n", + "Epoch 4/20 Iteration 581/3560 Training loss: 2.5358 0.0682 sec/batch\n", + "Epoch 4/20 Iteration 582/3560 Training loss: 2.5356 0.0625 sec/batch\n", + "Epoch 4/20 Iteration 583/3560 Training loss: 2.5350 0.0636 sec/batch\n", + "Epoch 4/20 Iteration 584/3560 Training loss: 2.5350 0.0648 sec/batch\n", + "Epoch 4/20 Iteration 585/3560 Training loss: 2.5346 0.0680 sec/batch\n", + "Epoch 4/20 Iteration 586/3560 Training loss: 2.5343 0.0675 sec/batch\n", + "Epoch 4/20 Iteration 587/3560 Training loss: 2.5338 0.0653 sec/batch\n", + "Epoch 4/20 Iteration 588/3560 Training loss: 2.5334 0.0663 sec/batch\n", + "Epoch 4/20 Iteration 589/3560 Training loss: 2.5329 0.0643 sec/batch\n", + "Epoch 4/20 Iteration 590/3560 Training loss: 2.5325 0.0735 sec/batch\n", + "Epoch 4/20 Iteration 591/3560 Training loss: 2.5322 0.0646 sec/batch\n", + "Epoch 4/20 Iteration 592/3560 Training loss: 2.5316 0.0654 sec/batch\n", + "Epoch 4/20 Iteration 593/3560 Training loss: 2.5309 0.0668 sec/batch\n", + "Epoch 
4/20 Iteration 594/3560 Training loss: 2.5307 0.0659 sec/batch\n", + "Epoch 4/20 Iteration 595/3560 Training loss: 2.5302 0.0629 sec/batch\n", + "Epoch 4/20 Iteration 596/3560 Training loss: 2.5301 0.0620 sec/batch\n", + "Epoch 4/20 Iteration 597/3560 Training loss: 2.5301 0.0613 sec/batch\n", + "Epoch 4/20 Iteration 598/3560 Training loss: 2.5296 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 599/3560 Training loss: 2.5291 0.0650 sec/batch\n", + "Epoch 4/20 Iteration 600/3560 Training loss: 2.5291 0.0682 sec/batch\n", + "Epoch 4/20 Iteration 601/3560 Training loss: 2.5287 0.0660 sec/batch\n", + "Epoch 4/20 Iteration 602/3560 Training loss: 2.5279 0.0654 sec/batch\n", + "Epoch 4/20 Iteration 603/3560 Training loss: 2.5274 0.0655 sec/batch\n", + "Epoch 4/20 Iteration 604/3560 Training loss: 2.5272 0.0609 sec/batch\n", + "Epoch 4/20 Iteration 605/3560 Training loss: 2.5270 0.0632 sec/batch\n", + "Epoch 4/20 Iteration 606/3560 Training loss: 2.5269 0.0686 sec/batch\n", + "Epoch 4/20 Iteration 607/3560 Training loss: 2.5267 0.0624 sec/batch\n", + "Epoch 4/20 Iteration 608/3560 Training loss: 2.5262 0.0620 sec/batch\n", + "Epoch 4/20 Iteration 609/3560 Training loss: 2.5260 0.0670 sec/batch\n", + "Epoch 4/20 Iteration 610/3560 Training loss: 2.5261 0.0658 sec/batch\n", + "Epoch 4/20 Iteration 611/3560 Training loss: 2.5258 0.0645 sec/batch\n", + "Epoch 4/20 Iteration 612/3560 Training loss: 2.5256 0.0724 sec/batch\n", + "Epoch 4/20 Iteration 613/3560 Training loss: 2.5252 0.0645 sec/batch\n", + "Epoch 4/20 Iteration 614/3560 Training loss: 2.5249 0.0645 sec/batch\n", + "Epoch 4/20 Iteration 615/3560 Training loss: 2.5245 0.0631 sec/batch\n", + "Epoch 4/20 Iteration 616/3560 Training loss: 2.5244 0.0636 sec/batch\n", + "Epoch 4/20 Iteration 617/3560 Training loss: 2.5241 0.0676 sec/batch\n", + "Epoch 4/20 Iteration 618/3560 Training loss: 2.5237 0.0651 sec/batch\n", + "Epoch 4/20 Iteration 619/3560 Training loss: 2.5230 0.0650 sec/batch\n", + "Epoch 4/20 Iteration 620/3560 
Training loss: 2.5225 0.0616 sec/batch\n", + "Epoch 4/20 Iteration 621/3560 Training loss: 2.5222 0.0622 sec/batch\n", + "Epoch 4/20 Iteration 622/3560 Training loss: 2.5218 0.0638 sec/batch\n", + "Epoch 4/20 Iteration 623/3560 Training loss: 2.5215 0.0611 sec/batch\n", + "Epoch 4/20 Iteration 624/3560 Training loss: 2.5214 0.0673 sec/batch\n", + "Epoch 4/20 Iteration 625/3560 Training loss: 2.5212 0.0665 sec/batch\n", + "Epoch 4/20 Iteration 626/3560 Training loss: 2.5210 0.0658 sec/batch\n", + "Epoch 4/20 Iteration 627/3560 Training loss: 2.5207 0.0652 sec/batch\n", + "Epoch 4/20 Iteration 628/3560 Training loss: 2.5202 0.0665 sec/batch\n", + "Epoch 4/20 Iteration 629/3560 Training loss: 2.5197 0.0615 sec/batch\n", + "Epoch 4/20 Iteration 630/3560 Training loss: 2.5193 0.0730 sec/batch\n", + "Epoch 4/20 Iteration 631/3560 Training loss: 2.5190 0.0644 sec/batch\n", + "Epoch 4/20 Iteration 632/3560 Training loss: 2.5188 0.0668 sec/batch\n", + "Epoch 4/20 Iteration 633/3560 Training loss: 2.5185 0.0647 sec/batch\n", + "Epoch 4/20 Iteration 634/3560 Training loss: 2.5182 0.0629 sec/batch\n", + "Epoch 4/20 Iteration 635/3560 Training loss: 2.5181 0.0638 sec/batch\n", + "Epoch 4/20 Iteration 636/3560 Training loss: 2.5179 0.0639 sec/batch\n", + "Epoch 4/20 Iteration 637/3560 Training loss: 2.5174 0.0739 sec/batch\n", + "Epoch 4/20 Iteration 638/3560 Training loss: 2.5171 0.0602 sec/batch\n", + "Epoch 4/20 Iteration 639/3560 Training loss: 2.5168 0.0658 sec/batch\n", + "Epoch 4/20 Iteration 640/3560 Training loss: 2.5166 0.0647 sec/batch\n", + "Epoch 4/20 Iteration 641/3560 Training loss: 2.5163 0.0682 sec/batch\n", + "Epoch 4/20 Iteration 642/3560 Training loss: 2.5161 0.0632 sec/batch\n", + "Epoch 4/20 Iteration 643/3560 Training loss: 2.5159 0.0717 sec/batch\n", + "Epoch 4/20 Iteration 644/3560 Training loss: 2.5154 0.0644 sec/batch\n", + "Epoch 4/20 Iteration 645/3560 Training loss: 2.5152 0.0703 sec/batch\n", + "Epoch 4/20 Iteration 646/3560 Training loss: 2.5152 
0.0654 sec/batch\n", + "Epoch 4/20 Iteration 647/3560 Training loss: 2.5148 0.0671 sec/batch\n", + "Epoch 4/20 Iteration 648/3560 Training loss: 2.5145 0.0678 sec/batch\n", + "Epoch 4/20 Iteration 649/3560 Training loss: 2.5141 0.0657 sec/batch\n", + "Epoch 4/20 Iteration 650/3560 Training loss: 2.5136 0.0637 sec/batch\n", + "Epoch 4/20 Iteration 651/3560 Training loss: 2.5133 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 652/3560 Training loss: 2.5131 0.0637 sec/batch\n", + "Epoch 4/20 Iteration 653/3560 Training loss: 2.5130 0.0642 sec/batch\n", + "Epoch 4/20 Iteration 654/3560 Training loss: 2.5128 0.0643 sec/batch\n", + "Epoch 4/20 Iteration 655/3560 Training loss: 2.5126 0.0624 sec/batch\n", + "Epoch 4/20 Iteration 656/3560 Training loss: 2.5123 0.0634 sec/batch\n", + "Epoch 4/20 Iteration 657/3560 Training loss: 2.5121 0.0645 sec/batch\n", + "Epoch 4/20 Iteration 658/3560 Training loss: 2.5120 0.0667 sec/batch\n", + "Epoch 4/20 Iteration 659/3560 Training loss: 2.5118 0.0661 sec/batch\n", + "Epoch 4/20 Iteration 660/3560 Training loss: 2.5114 0.0624 sec/batch\n", + "Epoch 4/20 Iteration 661/3560 Training loss: 2.5114 0.0658 sec/batch\n", + "Epoch 4/20 Iteration 662/3560 Training loss: 2.5113 0.0611 sec/batch\n", + "Epoch 4/20 Iteration 663/3560 Training loss: 2.5111 0.0617 sec/batch\n", + "Epoch 4/20 Iteration 664/3560 Training loss: 2.5110 0.0604 sec/batch\n", + "Epoch 4/20 Iteration 665/3560 Training loss: 2.5108 0.0641 sec/batch\n", + "Epoch 4/20 Iteration 666/3560 Training loss: 2.5105 0.0633 sec/batch\n", + "Epoch 4/20 Iteration 667/3560 Training loss: 2.5104 0.0684 sec/batch\n", + "Epoch 4/20 Iteration 668/3560 Training loss: 2.5104 0.0638 sec/batch\n", + "Epoch 4/20 Iteration 669/3560 Training loss: 2.5101 0.0622 sec/batch\n", + "Epoch 4/20 Iteration 670/3560 Training loss: 2.5098 0.0638 sec/batch\n", + "Epoch 4/20 Iteration 671/3560 Training loss: 2.5096 0.0631 sec/batch\n", + "Epoch 4/20 Iteration 672/3560 Training loss: 2.5094 0.0694 sec/batch\n", + 
"Epoch 4/20 Iteration 673/3560 Training loss: 2.5094 0.0613 sec/batch\n", + "Epoch 4/20 Iteration 674/3560 Training loss: 2.5091 0.0637 sec/batch\n", + "Epoch 4/20 Iteration 675/3560 Training loss: 2.5090 0.0646 sec/batch\n", + "Epoch 4/20 Iteration 676/3560 Training loss: 2.5087 0.0604 sec/batch\n", + "Epoch 4/20 Iteration 677/3560 Training loss: 2.5084 0.0664 sec/batch\n", + "Epoch 4/20 Iteration 678/3560 Training loss: 2.5082 0.0699 sec/batch\n", + "Epoch 4/20 Iteration 679/3560 Training loss: 2.5080 0.0604 sec/batch\n", + "Epoch 4/20 Iteration 680/3560 Training loss: 2.5080 0.0667 sec/batch\n", + "Epoch 4/20 Iteration 681/3560 Training loss: 2.5077 0.0620 sec/batch\n", + "Epoch 4/20 Iteration 682/3560 Training loss: 2.5078 0.0698 sec/batch\n", + "Epoch 4/20 Iteration 683/3560 Training loss: 2.5074 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 684/3560 Training loss: 2.5072 0.0631 sec/batch\n", + "Epoch 4/20 Iteration 685/3560 Training loss: 2.5072 0.0639 sec/batch\n", + "Epoch 4/20 Iteration 686/3560 Training loss: 2.5073 0.0629 sec/batch\n", + "Epoch 4/20 Iteration 687/3560 Training loss: 2.5071 0.0643 sec/batch\n", + "Epoch 4/20 Iteration 688/3560 Training loss: 2.5071 0.0631 sec/batch\n", + "Epoch 4/20 Iteration 689/3560 Training loss: 2.5069 0.0614 sec/batch\n", + "Epoch 4/20 Iteration 690/3560 Training loss: 2.5067 0.0626 sec/batch\n", + "Epoch 4/20 Iteration 691/3560 Training loss: 2.5064 0.0698 sec/batch\n", + "Epoch 4/20 Iteration 692/3560 Training loss: 2.5062 0.0662 sec/batch\n", + "Epoch 4/20 Iteration 693/3560 Training loss: 2.5059 0.0648 sec/batch\n", + "Epoch 4/20 Iteration 694/3560 Training loss: 2.5058 0.0608 sec/batch\n", + "Epoch 4/20 Iteration 695/3560 Training loss: 2.5056 0.0610 sec/batch\n", + "Epoch 4/20 Iteration 696/3560 Training loss: 2.5053 0.0618 sec/batch\n", + "Epoch 4/20 Iteration 697/3560 Training loss: 2.5049 0.0626 sec/batch\n", + "Epoch 4/20 Iteration 698/3560 Training loss: 2.5047 0.0708 sec/batch\n", + "Epoch 4/20 Iteration 
699/3560 Training loss: 2.5046 0.0693 sec/batch\n", + "Epoch 4/20 Iteration 700/3560 Training loss: 2.5043 0.0705 sec/batch\n", + "Epoch 4/20 Iteration 701/3560 Training loss: 2.5042 0.0665 sec/batch\n", + "Epoch 4/20 Iteration 702/3560 Training loss: 2.5040 0.0655 sec/batch\n", + "Epoch 4/20 Iteration 703/3560 Training loss: 2.5039 0.0651 sec/batch\n", + "Epoch 4/20 Iteration 704/3560 Training loss: 2.5036 0.0618 sec/batch\n", + "Epoch 4/20 Iteration 705/3560 Training loss: 2.5034 0.0624 sec/batch\n", + "Epoch 4/20 Iteration 706/3560 Training loss: 2.5033 0.0662 sec/batch\n", + "Epoch 4/20 Iteration 707/3560 Training loss: 2.5033 0.0666 sec/batch\n", + "Epoch 4/20 Iteration 708/3560 Training loss: 2.5033 0.0689 sec/batch\n", + "Epoch 4/20 Iteration 709/3560 Training loss: 2.5033 0.0641 sec/batch\n", + "Epoch 4/20 Iteration 710/3560 Training loss: 2.5031 0.0624 sec/batch\n", + "Epoch 4/20 Iteration 711/3560 Training loss: 2.5029 0.0676 sec/batch\n", + "Epoch 4/20 Iteration 712/3560 Training loss: 2.5025 0.0661 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 2.5546 0.0733 sec/batch\n", + "Epoch 5/20 Iteration 714/3560 Training loss: 2.4946 0.0682 sec/batch\n", + "Epoch 5/20 Iteration 715/3560 Training loss: 2.4816 0.0652 sec/batch\n", + "Epoch 5/20 Iteration 716/3560 Training loss: 2.4774 0.0694 sec/batch\n", + "Epoch 5/20 Iteration 717/3560 Training loss: 2.4736 0.0636 sec/batch\n", + "Epoch 5/20 Iteration 718/3560 Training loss: 2.4702 0.0624 sec/batch\n", + "Epoch 5/20 Iteration 719/3560 Training loss: 2.4690 0.0672 sec/batch\n", + "Epoch 5/20 Iteration 720/3560 Training loss: 2.4707 0.0637 sec/batch\n", + "Epoch 5/20 Iteration 721/3560 Training loss: 2.4719 0.0641 sec/batch\n", + "Epoch 5/20 Iteration 722/3560 Training loss: 2.4702 0.0643 sec/batch\n", + "Epoch 5/20 Iteration 723/3560 Training loss: 2.4684 0.0709 sec/batch\n", + "Epoch 5/20 Iteration 724/3560 Training loss: 2.4680 0.0642 sec/batch\n", + "Epoch 5/20 Iteration 725/3560 Training loss: 
2.4675 0.0617 sec/batch\n", + "Epoch 5/20 Iteration 726/3560 Training loss: 2.4693 0.0683 sec/batch\n", + "Epoch 5/20 Iteration 727/3560 Training loss: 2.4692 0.0626 sec/batch\n", + "Epoch 5/20 Iteration 728/3560 Training loss: 2.4688 0.0686 sec/batch\n", + "Epoch 5/20 Iteration 729/3560 Training loss: 2.4687 0.0689 sec/batch\n", + "Epoch 5/20 Iteration 730/3560 Training loss: 2.4706 0.0641 sec/batch\n", + "Epoch 5/20 Iteration 731/3560 Training loss: 2.4709 0.0682 sec/batch\n", + "Epoch 5/20 Iteration 732/3560 Training loss: 2.4692 0.0622 sec/batch\n", + "Epoch 5/20 Iteration 733/3560 Training loss: 2.4685 0.0665 sec/batch\n", + "Epoch 5/20 Iteration 734/3560 Training loss: 2.4694 0.0711 sec/batch\n", + "Epoch 5/20 Iteration 735/3560 Training loss: 2.4691 0.0628 sec/batch\n", + "Epoch 5/20 Iteration 736/3560 Training loss: 2.4682 0.0623 sec/batch\n", + "Epoch 5/20 Iteration 737/3560 Training loss: 2.4674 0.0658 sec/batch\n", + "Epoch 5/20 Iteration 738/3560 Training loss: 2.4669 0.0609 sec/batch\n", + "Epoch 5/20 Iteration 739/3560 Training loss: 2.4661 0.0650 sec/batch\n", + "Epoch 5/20 Iteration 740/3560 Training loss: 2.4658 0.0619 sec/batch\n", + "Epoch 5/20 Iteration 741/3560 Training loss: 2.4659 0.0635 sec/batch\n", + "Epoch 5/20 Iteration 742/3560 Training loss: 2.4657 0.0680 sec/batch\n", + "Epoch 5/20 Iteration 743/3560 Training loss: 2.4662 0.0606 sec/batch\n", + "Epoch 5/20 Iteration 744/3560 Training loss: 2.4657 0.0617 sec/batch\n", + "Epoch 5/20 Iteration 745/3560 Training loss: 2.4651 0.0649 sec/batch\n", + "Epoch 5/20 Iteration 746/3560 Training loss: 2.4649 0.0610 sec/batch\n", + "Epoch 5/20 Iteration 747/3560 Training loss: 2.4641 0.0598 sec/batch\n", + "Epoch 5/20 Iteration 748/3560 Training loss: 2.4640 0.0627 sec/batch\n", + "Epoch 5/20 Iteration 749/3560 Training loss: 2.4635 0.0628 sec/batch\n", + "Epoch 5/20 Iteration 750/3560 Training loss: 2.4625 0.0630 sec/batch\n", + "Epoch 5/20 Iteration 751/3560 Training loss: 2.4618 0.0689 
sec/batch\n", + "Epoch 5/20 Iteration 752/3560 Training loss: 2.4611 0.0653 sec/batch\n", + "Epoch 5/20 Iteration 753/3560 Training loss: 2.4604 0.0690 sec/batch\n", + "Epoch 5/20 Iteration 754/3560 Training loss: 2.4598 0.0730 sec/batch\n", + "Epoch 5/20 Iteration 755/3560 Training loss: 2.4592 0.0773 sec/batch\n", + "Epoch 5/20 Iteration 756/3560 Training loss: 2.4587 0.0661 sec/batch\n", + "Epoch 5/20 Iteration 757/3560 Training loss: 2.4581 0.0618 sec/batch\n", + "Epoch 5/20 Iteration 758/3560 Training loss: 2.4570 0.0642 sec/batch\n", + "Epoch 5/20 Iteration 759/3560 Training loss: 2.4570 0.0667 sec/batch\n", + "Epoch 5/20 Iteration 760/3560 Training loss: 2.4565 0.0726 sec/batch\n", + "Epoch 5/20 Iteration 761/3560 Training loss: 2.4562 0.0644 sec/batch\n", + "Epoch 5/20 Iteration 762/3560 Training loss: 2.4562 0.0629 sec/batch\n", + "Epoch 5/20 Iteration 763/3560 Training loss: 2.4558 0.0601 sec/batch\n", + "Epoch 5/20 Iteration 764/3560 Training loss: 2.4557 0.0652 sec/batch\n", + "Epoch 5/20 Iteration 765/3560 Training loss: 2.4553 0.0633 sec/batch\n", + "Epoch 5/20 Iteration 766/3560 Training loss: 2.4549 0.0625 sec/batch\n", + "Epoch 5/20 Iteration 767/3560 Training loss: 2.4545 0.0615 sec/batch\n", + "Epoch 5/20 Iteration 768/3560 Training loss: 2.4543 0.0680 sec/batch\n", + "Epoch 5/20 Iteration 769/3560 Training loss: 2.4539 0.0681 sec/batch\n", + "Epoch 5/20 Iteration 770/3560 Training loss: 2.4536 0.0647 sec/batch\n", + "Epoch 5/20 Iteration 771/3560 Training loss: 2.4533 0.0683 sec/batch\n", + "Epoch 5/20 Iteration 772/3560 Training loss: 2.4533 0.0635 sec/batch\n", + "Epoch 5/20 Iteration 773/3560 Training loss: 2.4528 0.0608 sec/batch\n", + "Epoch 5/20 Iteration 774/3560 Training loss: 2.4527 0.0725 sec/batch\n", + "Epoch 5/20 Iteration 775/3560 Training loss: 2.4528 0.0606 sec/batch\n", + "Epoch 5/20 Iteration 776/3560 Training loss: 2.4524 0.0706 sec/batch\n", + "Epoch 5/20 Iteration 777/3560 Training loss: 2.4519 0.0685 sec/batch\n", + "Epoch 
5/20 Iteration 778/3560 Training loss: 2.4519 0.0687 sec/batch\n", + "Epoch 5/20 Iteration 779/3560 Training loss: 2.4516 0.0778 sec/batch\n", + "Epoch 5/20 Iteration 780/3560 Training loss: 2.4509 0.0614 sec/batch\n", + "Epoch 5/20 Iteration 781/3560 Training loss: 2.4505 0.0678 sec/batch\n", + "Epoch 5/20 Iteration 782/3560 Training loss: 2.4505 0.0690 sec/batch\n", + "Epoch 5/20 Iteration 783/3560 Training loss: 2.4503 0.0729 sec/batch\n", + "Epoch 5/20 Iteration 784/3560 Training loss: 2.4503 0.0650 sec/batch\n", + "Epoch 5/20 Iteration 785/3560 Training loss: 2.4502 0.0632 sec/batch\n", + "Epoch 5/20 Iteration 786/3560 Training loss: 2.4499 0.0686 sec/batch\n", + "Epoch 5/20 Iteration 787/3560 Training loss: 2.4498 0.0656 sec/batch\n", + "Epoch 5/20 Iteration 788/3560 Training loss: 2.4500 0.0685 sec/batch\n", + "Epoch 5/20 Iteration 789/3560 Training loss: 2.4497 0.0603 sec/batch\n", + "Epoch 5/20 Iteration 790/3560 Training loss: 2.4497 0.0645 sec/batch\n", + "Epoch 5/20 Iteration 791/3560 Training loss: 2.4492 0.0643 sec/batch\n", + "Epoch 5/20 Iteration 792/3560 Training loss: 2.4489 0.0625 sec/batch\n", + "Epoch 5/20 Iteration 793/3560 Training loss: 2.4486 0.0627 sec/batch\n", + "Epoch 5/20 Iteration 794/3560 Training loss: 2.4485 0.0643 sec/batch\n", + "Epoch 5/20 Iteration 795/3560 Training loss: 2.4482 0.0616 sec/batch\n", + "Epoch 5/20 Iteration 796/3560 Training loss: 2.4478 0.0726 sec/batch\n", + "Epoch 5/20 Iteration 797/3560 Training loss: 2.4471 0.0675 sec/batch\n", + "Epoch 5/20 Iteration 798/3560 Training loss: 2.4468 0.0648 sec/batch\n", + "Epoch 5/20 Iteration 799/3560 Training loss: 2.4466 0.0649 sec/batch\n", + "Epoch 5/20 Iteration 800/3560 Training loss: 2.4464 0.0620 sec/batch\n", + "Epoch 5/20 Iteration 801/3560 Training loss: 2.4461 0.0638 sec/batch\n", + "Epoch 5/20 Iteration 802/3560 Training loss: 2.4460 0.0642 sec/batch\n", + "Epoch 5/20 Iteration 803/3560 Training loss: 2.4458 0.0638 sec/batch\n", + "Epoch 5/20 Iteration 804/3560 
Training loss: 2.4456 0.0751 sec/batch\n", + "[... per-iteration training log truncated; loss decreases steadily ...]\n", + "Epoch 5/20 ends at Iteration 890/3560 with training loss 2.4311\n", + "Epoch 6/20 ends at Iteration 1068/3560 with training loss 2.3731\n", + "Epoch 7/20 ends at Iteration 1246/3560 with training loss 2.3235\n", + "Epoch 8/20 Iteration 1299/3560 
Training loss: 2.2957 0.0629 sec/batch\n", + "Epoch 8/20 Iteration 1300/3560 Training loss: 2.2955 0.0640 sec/batch\n", + "Epoch 8/20 Iteration 1301/3560 Training loss: 2.2951 0.0646 sec/batch\n", + "Epoch 8/20 Iteration 1302/3560 Training loss: 2.2951 0.0607 sec/batch\n", + "Epoch 8/20 Iteration 1303/3560 Training loss: 2.2951 0.0692 sec/batch\n", + "Epoch 8/20 Iteration 1304/3560 Training loss: 2.2948 0.0650 sec/batch\n", + "Epoch 8/20 Iteration 1305/3560 Training loss: 2.2945 0.0632 sec/batch\n", + "Epoch 8/20 Iteration 1306/3560 Training loss: 2.2943 0.0694 sec/batch\n", + "Epoch 8/20 Iteration 1307/3560 Training loss: 2.2940 0.0638 sec/batch\n", + "Epoch 8/20 Iteration 1308/3560 Training loss: 2.2941 0.0657 sec/batch\n", + "Epoch 8/20 Iteration 1309/3560 Training loss: 2.2943 0.0644 sec/batch\n", + "Epoch 8/20 Iteration 1310/3560 Training loss: 2.2942 0.0632 sec/batch\n", + "Epoch 8/20 Iteration 1311/3560 Training loss: 2.2939 0.0731 sec/batch\n", + "Epoch 8/20 Iteration 1312/3560 Training loss: 2.2939 0.0687 sec/batch\n", + "Epoch 8/20 Iteration 1313/3560 Training loss: 2.2939 0.0613 sec/batch\n", + "Epoch 8/20 Iteration 1314/3560 Training loss: 2.2933 0.0683 sec/batch\n", + "Epoch 8/20 Iteration 1315/3560 Training loss: 2.2928 0.0648 sec/batch\n", + "Epoch 8/20 Iteration 1316/3560 Training loss: 2.2927 0.0735 sec/batch\n", + "Epoch 8/20 Iteration 1317/3560 Training loss: 2.2928 0.0722 sec/batch\n", + "Epoch 8/20 Iteration 1318/3560 Training loss: 2.2929 0.0646 sec/batch\n", + "Epoch 8/20 Iteration 1319/3560 Training loss: 2.2929 0.0704 sec/batch\n", + "Epoch 8/20 Iteration 1320/3560 Training loss: 2.2926 0.0652 sec/batch\n", + "Epoch 8/20 Iteration 1321/3560 Training loss: 2.2925 0.0644 sec/batch\n", + "Epoch 8/20 Iteration 1322/3560 Training loss: 2.2927 0.0690 sec/batch\n", + "Epoch 8/20 Iteration 1323/3560 Training loss: 2.2924 0.0642 sec/batch\n", + "Epoch 8/20 Iteration 1324/3560 Training loss: 2.2925 0.0627 sec/batch\n", + "Epoch 8/20 Iteration 
1325/3560 Training loss: 2.2921 0.0703 sec/batch\n", + "Epoch 8/20 Iteration 1326/3560 Training loss: 2.2917 0.0724 sec/batch\n", + "Epoch 8/20 Iteration 1327/3560 Training loss: 2.2914 0.0640 sec/batch\n", + "Epoch 8/20 Iteration 1328/3560 Training loss: 2.2915 0.0659 sec/batch\n", + "Epoch 8/20 Iteration 1329/3560 Training loss: 2.2910 0.0715 sec/batch\n", + "Epoch 8/20 Iteration 1330/3560 Training loss: 2.2907 0.0627 sec/batch\n", + "Epoch 8/20 Iteration 1331/3560 Training loss: 2.2901 0.0642 sec/batch\n", + "Epoch 8/20 Iteration 1332/3560 Training loss: 2.2897 0.0673 sec/batch\n", + "Epoch 8/20 Iteration 1333/3560 Training loss: 2.2896 0.0653 sec/batch\n", + "Epoch 8/20 Iteration 1334/3560 Training loss: 2.2894 0.0620 sec/batch\n", + "Epoch 8/20 Iteration 1335/3560 Training loss: 2.2891 0.0648 sec/batch\n", + "Epoch 8/20 Iteration 1336/3560 Training loss: 2.2892 0.0763 sec/batch\n", + "Epoch 8/20 Iteration 1337/3560 Training loss: 2.2889 0.0695 sec/batch\n", + "Epoch 8/20 Iteration 1338/3560 Training loss: 2.2889 0.0654 sec/batch\n", + "Epoch 8/20 Iteration 1339/3560 Training loss: 2.2885 0.0648 sec/batch\n", + "Epoch 8/20 Iteration 1340/3560 Training loss: 2.2882 0.0634 sec/batch\n", + "Epoch 8/20 Iteration 1341/3560 Training loss: 2.2879 0.0664 sec/batch\n", + "Epoch 8/20 Iteration 1342/3560 Training loss: 2.2877 0.0666 sec/batch\n", + "Epoch 8/20 Iteration 1343/3560 Training loss: 2.2875 0.0641 sec/batch\n", + "Epoch 8/20 Iteration 1344/3560 Training loss: 2.2873 0.0648 sec/batch\n", + "Epoch 8/20 Iteration 1345/3560 Training loss: 2.2870 0.0675 sec/batch\n", + "Epoch 8/20 Iteration 1346/3560 Training loss: 2.2867 0.0724 sec/batch\n", + "Epoch 8/20 Iteration 1347/3560 Training loss: 2.2867 0.0680 sec/batch\n", + "Epoch 8/20 Iteration 1348/3560 Training loss: 2.2866 0.0646 sec/batch\n", + "Epoch 8/20 Iteration 1349/3560 Training loss: 2.2863 0.0621 sec/batch\n", + "Epoch 8/20 Iteration 1350/3560 Training loss: 2.2860 0.0694 sec/batch\n", + "Epoch 8/20 
Iteration 1351/3560 Training loss: 2.2857 0.0658 sec/batch\n", + "Epoch 8/20 Iteration 1352/3560 Training loss: 2.2855 0.0670 sec/batch\n", + "Epoch 8/20 Iteration 1353/3560 Training loss: 2.2853 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1354/3560 Training loss: 2.2853 0.0631 sec/batch\n", + "Epoch 8/20 Iteration 1355/3560 Training loss: 2.2852 0.0750 sec/batch\n", + "Epoch 8/20 Iteration 1356/3560 Training loss: 2.2850 0.0685 sec/batch\n", + "Epoch 8/20 Iteration 1357/3560 Training loss: 2.2848 0.0797 sec/batch\n", + "Epoch 8/20 Iteration 1358/3560 Training loss: 2.2848 0.0631 sec/batch\n", + "Epoch 8/20 Iteration 1359/3560 Training loss: 2.2846 0.0630 sec/batch\n", + "Epoch 8/20 Iteration 1360/3560 Training loss: 2.2844 0.0642 sec/batch\n", + "Epoch 8/20 Iteration 1361/3560 Training loss: 2.2842 0.0685 sec/batch\n", + "Epoch 8/20 Iteration 1362/3560 Training loss: 2.2837 0.0631 sec/batch\n", + "Epoch 8/20 Iteration 1363/3560 Training loss: 2.2835 0.0624 sec/batch\n", + "Epoch 8/20 Iteration 1364/3560 Training loss: 2.2834 0.0671 sec/batch\n", + "Epoch 8/20 Iteration 1365/3560 Training loss: 2.2834 0.0626 sec/batch\n", + "Epoch 8/20 Iteration 1366/3560 Training loss: 2.2833 0.0680 sec/batch\n", + "Epoch 8/20 Iteration 1367/3560 Training loss: 2.2834 0.0696 sec/batch\n", + "Epoch 8/20 Iteration 1368/3560 Training loss: 2.2832 0.0623 sec/batch\n", + "Epoch 8/20 Iteration 1369/3560 Training loss: 2.2830 0.0691 sec/batch\n", + "Epoch 8/20 Iteration 1370/3560 Training loss: 2.2830 0.0630 sec/batch\n", + "Epoch 8/20 Iteration 1371/3560 Training loss: 2.2828 0.0633 sec/batch\n", + "Epoch 8/20 Iteration 1372/3560 Training loss: 2.2825 0.0722 sec/batch\n", + "Epoch 8/20 Iteration 1373/3560 Training loss: 2.2824 0.0660 sec/batch\n", + "Epoch 8/20 Iteration 1374/3560 Training loss: 2.2824 0.0660 sec/batch\n", + "Epoch 8/20 Iteration 1375/3560 Training loss: 2.2823 0.0655 sec/batch\n", + "Epoch 8/20 Iteration 1376/3560 Training loss: 2.2823 0.0676 sec/batch\n", + "Epoch 
8/20 Iteration 1377/3560 Training loss: 2.2822 0.0608 sec/batch\n", + "Epoch 8/20 Iteration 1378/3560 Training loss: 2.2819 0.0696 sec/batch\n", + "Epoch 8/20 Iteration 1379/3560 Training loss: 2.2818 0.0680 sec/batch\n", + "Epoch 8/20 Iteration 1380/3560 Training loss: 2.2818 0.0653 sec/batch\n", + "Epoch 8/20 Iteration 1381/3560 Training loss: 2.2816 0.0687 sec/batch\n", + "Epoch 8/20 Iteration 1382/3560 Training loss: 2.2816 0.0638 sec/batch\n", + "Epoch 8/20 Iteration 1383/3560 Training loss: 2.2815 0.0681 sec/batch\n", + "Epoch 8/20 Iteration 1384/3560 Training loss: 2.2814 0.0666 sec/batch\n", + "Epoch 8/20 Iteration 1385/3560 Training loss: 2.2815 0.0617 sec/batch\n", + "Epoch 8/20 Iteration 1386/3560 Training loss: 2.2813 0.0668 sec/batch\n", + "Epoch 8/20 Iteration 1387/3560 Training loss: 2.2813 0.0646 sec/batch\n", + "Epoch 8/20 Iteration 1388/3560 Training loss: 2.2811 0.0630 sec/batch\n", + "Epoch 8/20 Iteration 1389/3560 Training loss: 2.2810 0.0724 sec/batch\n", + "Epoch 8/20 Iteration 1390/3560 Training loss: 2.2808 0.0653 sec/batch\n", + "Epoch 8/20 Iteration 1391/3560 Training loss: 2.2807 0.0625 sec/batch\n", + "Epoch 8/20 Iteration 1392/3560 Training loss: 2.2807 0.0617 sec/batch\n", + "Epoch 8/20 Iteration 1393/3560 Training loss: 2.2807 0.0638 sec/batch\n", + "Epoch 8/20 Iteration 1394/3560 Training loss: 2.2808 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1395/3560 Training loss: 2.2807 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1396/3560 Training loss: 2.2806 0.0626 sec/batch\n", + "Epoch 8/20 Iteration 1397/3560 Training loss: 2.2806 0.0662 sec/batch\n", + "Epoch 8/20 Iteration 1398/3560 Training loss: 2.2807 0.0683 sec/batch\n", + "Epoch 8/20 Iteration 1399/3560 Training loss: 2.2807 0.0775 sec/batch\n", + "Epoch 8/20 Iteration 1400/3560 Training loss: 2.2806 0.0672 sec/batch\n", + "Epoch 8/20 Iteration 1401/3560 Training loss: 2.2805 0.0609 sec/batch\n", + "Epoch 8/20 Iteration 1402/3560 Training loss: 2.2804 0.0621 sec/batch\n", + 
"Epoch 8/20 Iteration 1403/3560 Training loss: 2.2803 0.0634 sec/batch\n", + "Epoch 8/20 Iteration 1404/3560 Training loss: 2.2801 0.0641 sec/batch\n", + "Epoch 8/20 Iteration 1405/3560 Training loss: 2.2800 0.0620 sec/batch\n", + "Epoch 8/20 Iteration 1406/3560 Training loss: 2.2800 0.0683 sec/batch\n", + "Epoch 8/20 Iteration 1407/3560 Training loss: 2.2800 0.0644 sec/batch\n", + "Epoch 8/20 Iteration 1408/3560 Training loss: 2.2798 0.0755 sec/batch\n", + "Epoch 8/20 Iteration 1409/3560 Training loss: 2.2797 0.0631 sec/batch\n", + "Epoch 8/20 Iteration 1410/3560 Training loss: 2.2796 0.0697 sec/batch\n", + "Epoch 8/20 Iteration 1411/3560 Training loss: 2.2796 0.0654 sec/batch\n", + "Epoch 8/20 Iteration 1412/3560 Training loss: 2.2795 0.0649 sec/batch\n", + "Epoch 8/20 Iteration 1413/3560 Training loss: 2.2794 0.0655 sec/batch\n", + "Epoch 8/20 Iteration 1414/3560 Training loss: 2.2794 0.0707 sec/batch\n", + "Epoch 8/20 Iteration 1415/3560 Training loss: 2.2793 0.0636 sec/batch\n", + "Epoch 8/20 Iteration 1416/3560 Training loss: 2.2791 0.0636 sec/batch\n", + "Epoch 8/20 Iteration 1417/3560 Training loss: 2.2790 0.0654 sec/batch\n", + "Epoch 8/20 Iteration 1418/3560 Training loss: 2.2788 0.0659 sec/batch\n", + "Epoch 8/20 Iteration 1419/3560 Training loss: 2.2788 0.0618 sec/batch\n", + "Epoch 8/20 Iteration 1420/3560 Training loss: 2.2788 0.0669 sec/batch\n", + "Epoch 8/20 Iteration 1421/3560 Training loss: 2.2788 0.0653 sec/batch\n", + "Epoch 8/20 Iteration 1422/3560 Training loss: 2.2788 0.0619 sec/batch\n", + "Epoch 8/20 Iteration 1423/3560 Training loss: 2.2786 0.0665 sec/batch\n", + "Epoch 8/20 Iteration 1424/3560 Training loss: 2.2784 0.0635 sec/batch\n", + "Epoch 9/20 Iteration 1425/3560 Training loss: 2.3365 0.0685 sec/batch\n", + "Epoch 9/20 Iteration 1426/3560 Training loss: 2.2807 0.0653 sec/batch\n", + "Epoch 9/20 Iteration 1427/3560 Training loss: 2.2656 0.0670 sec/batch\n", + "Epoch 9/20 Iteration 1428/3560 Training loss: 2.2593 0.0696 sec/batch\n", 
+ "Epoch 9/20 Iteration 1429/3560 Training loss: 2.2570 0.0649 sec/batch\n", + "Epoch 9/20 Iteration 1430/3560 Training loss: 2.2539 0.0731 sec/batch\n", + "Epoch 9/20 Iteration 1431/3560 Training loss: 2.2541 0.0724 sec/batch\n", + "Epoch 9/20 Iteration 1432/3560 Training loss: 2.2552 0.0648 sec/batch\n", + "Epoch 9/20 Iteration 1433/3560 Training loss: 2.2574 0.0644 sec/batch\n", + "Epoch 9/20 Iteration 1434/3560 Training loss: 2.2573 0.0618 sec/batch\n", + "Epoch 9/20 Iteration 1435/3560 Training loss: 2.2556 0.0648 sec/batch\n", + "Epoch 9/20 Iteration 1436/3560 Training loss: 2.2557 0.0633 sec/batch\n", + "Epoch 9/20 Iteration 1437/3560 Training loss: 2.2562 0.0634 sec/batch\n", + "Epoch 9/20 Iteration 1438/3560 Training loss: 2.2592 0.0750 sec/batch\n", + "Epoch 9/20 Iteration 1439/3560 Training loss: 2.2596 0.0652 sec/batch\n", + "Epoch 9/20 Iteration 1440/3560 Training loss: 2.2598 0.0627 sec/batch\n", + "Epoch 9/20 Iteration 1441/3560 Training loss: 2.2598 0.0676 sec/batch\n", + "Epoch 9/20 Iteration 1442/3560 Training loss: 2.2618 0.0641 sec/batch\n", + "Epoch 9/20 Iteration 1443/3560 Training loss: 2.2624 0.0635 sec/batch\n", + "Epoch 9/20 Iteration 1444/3560 Training loss: 2.2614 0.0632 sec/batch\n", + "Epoch 9/20 Iteration 1445/3560 Training loss: 2.2605 0.0661 sec/batch\n", + "Epoch 9/20 Iteration 1446/3560 Training loss: 2.2612 0.0628 sec/batch\n", + "Epoch 9/20 Iteration 1447/3560 Training loss: 2.2609 0.0619 sec/batch\n", + "Epoch 9/20 Iteration 1448/3560 Training loss: 2.2604 0.0684 sec/batch\n", + "Epoch 9/20 Iteration 1449/3560 Training loss: 2.2598 0.0661 sec/batch\n", + "Epoch 9/20 Iteration 1450/3560 Training loss: 2.2597 0.0696 sec/batch\n", + "Epoch 9/20 Iteration 1451/3560 Training loss: 2.2590 0.0686 sec/batch\n", + "Epoch 9/20 Iteration 1452/3560 Training loss: 2.2589 0.0748 sec/batch\n", + "Epoch 9/20 Iteration 1453/3560 Training loss: 2.2594 0.0671 sec/batch\n", + "Epoch 9/20 Iteration 1454/3560 Training loss: 2.2594 0.0623 
sec/batch\n", + "Epoch 9/20 Iteration 1455/3560 Training loss: 2.2599 0.0648 sec/batch\n", + "Epoch 9/20 Iteration 1456/3560 Training loss: 2.2593 0.0669 sec/batch\n", + "Epoch 9/20 Iteration 1457/3560 Training loss: 2.2592 0.0665 sec/batch\n", + "Epoch 9/20 Iteration 1458/3560 Training loss: 2.2596 0.0616 sec/batch\n", + "Epoch 9/20 Iteration 1459/3560 Training loss: 2.2593 0.0683 sec/batch\n", + "Epoch 9/20 Iteration 1460/3560 Training loss: 2.2592 0.0658 sec/batch\n", + "Epoch 9/20 Iteration 1461/3560 Training loss: 2.2588 0.0640 sec/batch\n", + "Epoch 9/20 Iteration 1462/3560 Training loss: 2.2577 0.0647 sec/batch\n", + "Epoch 9/20 Iteration 1463/3560 Training loss: 2.2571 0.0667 sec/batch\n", + "Epoch 9/20 Iteration 1464/3560 Training loss: 2.2565 0.0634 sec/batch\n", + "Epoch 9/20 Iteration 1465/3560 Training loss: 2.2560 0.0644 sec/batch\n", + "Epoch 9/20 Iteration 1466/3560 Training loss: 2.2555 0.0643 sec/batch\n", + "Epoch 9/20 Iteration 1467/3560 Training loss: 2.2551 0.0705 sec/batch\n", + "Epoch 9/20 Iteration 1468/3560 Training loss: 2.2546 0.0633 sec/batch\n", + "Epoch 9/20 Iteration 1469/3560 Training loss: 2.2544 0.0641 sec/batch\n", + "Epoch 9/20 Iteration 1470/3560 Training loss: 2.2531 0.0623 sec/batch\n", + "Epoch 9/20 Iteration 1471/3560 Training loss: 2.2532 0.0664 sec/batch\n", + "Epoch 9/20 Iteration 1472/3560 Training loss: 2.2528 0.0663 sec/batch\n", + "Epoch 9/20 Iteration 1473/3560 Training loss: 2.2526 0.0661 sec/batch\n", + "Epoch 9/20 Iteration 1474/3560 Training loss: 2.2529 0.0682 sec/batch\n", + "Epoch 9/20 Iteration 1475/3560 Training loss: 2.2525 0.0645 sec/batch\n", + "Epoch 9/20 Iteration 1476/3560 Training loss: 2.2526 0.0664 sec/batch\n", + "Epoch 9/20 Iteration 1477/3560 Training loss: 2.2523 0.0650 sec/batch\n", + "Epoch 9/20 Iteration 1478/3560 Training loss: 2.2519 0.0640 sec/batch\n", + "Epoch 9/20 Iteration 1479/3560 Training loss: 2.2515 0.0626 sec/batch\n", + "Epoch 9/20 Iteration 1480/3560 Training loss: 2.2517 
0.0639 sec/batch\n", + "Epoch 9/20 Iteration 1481/3560 Training loss: 2.2516 0.0651 sec/batch\n", + "Epoch 9/20 Iteration 1482/3560 Training loss: 2.2514 0.0623 sec/batch\n", + "Epoch 9/20 Iteration 1483/3560 Training loss: 2.2511 0.0617 sec/batch\n", + "Epoch 9/20 Iteration 1484/3560 Training loss: 2.2510 0.0642 sec/batch\n", + "Epoch 9/20 Iteration 1485/3560 Training loss: 2.2510 0.0633 sec/batch\n", + "Epoch 9/20 Iteration 1486/3560 Training loss: 2.2511 0.0702 sec/batch\n", + "Epoch 9/20 Iteration 1487/3560 Training loss: 2.2514 0.0638 sec/batch\n", + "Epoch 9/20 Iteration 1488/3560 Training loss: 2.2513 0.0638 sec/batch\n", + "Epoch 9/20 Iteration 1489/3560 Training loss: 2.2508 0.0639 sec/batch\n", + "Epoch 9/20 Iteration 1490/3560 Training loss: 2.2508 0.0668 sec/batch\n", + "Epoch 9/20 Iteration 1491/3560 Training loss: 2.2506 0.0656 sec/batch\n", + "Epoch 9/20 Iteration 1492/3560 Training loss: 2.2500 0.0641 sec/batch\n", + "Epoch 9/20 Iteration 1493/3560 Training loss: 2.2496 0.0650 sec/batch\n", + "Epoch 9/20 Iteration 1494/3560 Training loss: 2.2496 0.0691 sec/batch\n", + "Epoch 9/20 Iteration 1495/3560 Training loss: 2.2498 0.0646 sec/batch\n", + "Epoch 9/20 Iteration 1496/3560 Training loss: 2.2499 0.0693 sec/batch\n", + "Epoch 9/20 Iteration 1497/3560 Training loss: 2.2501 0.0622 sec/batch\n", + "Epoch 9/20 Iteration 1498/3560 Training loss: 2.2499 0.0622 sec/batch\n", + "Epoch 9/20 Iteration 1499/3560 Training loss: 2.2498 0.0630 sec/batch\n", + "Epoch 9/20 Iteration 1500/3560 Training loss: 2.2501 0.0666 sec/batch\n", + "Epoch 9/20 Iteration 1501/3560 Training loss: 2.2499 0.0673 sec/batch\n", + "Epoch 9/20 Iteration 1502/3560 Training loss: 2.2499 0.0658 sec/batch\n", + "Epoch 9/20 Iteration 1503/3560 Training loss: 2.2496 0.0698 sec/batch\n", + "Epoch 9/20 Iteration 1504/3560 Training loss: 2.2492 0.0675 sec/batch\n", + "Epoch 9/20 Iteration 1505/3560 Training loss: 2.2489 0.0671 sec/batch\n", + "Epoch 9/20 Iteration 1506/3560 Training loss: 
2.2489 0.0625 sec/batch\n", + "Epoch 9/20 Iteration 1507/3560 Training loss: 2.2485 0.0651 sec/batch\n", + "Epoch 9/20 Iteration 1508/3560 Training loss: 2.2483 0.0677 sec/batch\n", + "Epoch 9/20 Iteration 1509/3560 Training loss: 2.2479 0.0629 sec/batch\n", + "Epoch 9/20 Iteration 1510/3560 Training loss: 2.2477 0.0622 sec/batch\n", + "Epoch 9/20 Iteration 1511/3560 Training loss: 2.2476 0.0635 sec/batch\n", + "Epoch 9/20 Iteration 1512/3560 Training loss: 2.2473 0.0694 sec/batch\n", + "Epoch 9/20 Iteration 1513/3560 Training loss: 2.2471 0.0647 sec/batch\n", + "Epoch 9/20 Iteration 1514/3560 Training loss: 2.2470 0.0633 sec/batch\n", + "Epoch 9/20 Iteration 1515/3560 Training loss: 2.2469 0.0627 sec/batch\n", + "Epoch 9/20 Iteration 1516/3560 Training loss: 2.2468 0.0650 sec/batch\n", + "Epoch 9/20 Iteration 1517/3560 Training loss: 2.2464 0.0635 sec/batch\n", + "Epoch 9/20 Iteration 1518/3560 Training loss: 2.2461 0.0641 sec/batch\n", + "Epoch 9/20 Iteration 1519/3560 Training loss: 2.2456 0.0648 sec/batch\n", + "Epoch 9/20 Iteration 1520/3560 Training loss: 2.2454 0.0621 sec/batch\n", + "Epoch 9/20 Iteration 1521/3560 Training loss: 2.2453 0.0659 sec/batch\n", + "Epoch 9/20 Iteration 1522/3560 Training loss: 2.2451 0.0649 sec/batch\n", + "Epoch 9/20 Iteration 1523/3560 Training loss: 2.2448 0.0666 sec/batch\n", + "Epoch 9/20 Iteration 1524/3560 Training loss: 2.2446 0.0649 sec/batch\n", + "Epoch 9/20 Iteration 1525/3560 Training loss: 2.2446 0.0656 sec/batch\n", + "Epoch 9/20 Iteration 1526/3560 Training loss: 2.2445 0.0637 sec/batch\n", + "Epoch 9/20 Iteration 1527/3560 Training loss: 2.2442 0.0647 sec/batch\n", + "Epoch 9/20 Iteration 1528/3560 Training loss: 2.2440 0.0708 sec/batch\n", + "Epoch 9/20 Iteration 1529/3560 Training loss: 2.2437 0.0672 sec/batch\n", + "Epoch 9/20 Iteration 1530/3560 Training loss: 2.2436 0.0706 sec/batch\n", + "Epoch 9/20 Iteration 1531/3560 Training loss: 2.2433 0.0648 sec/batch\n", + "Epoch 9/20 Iteration 1532/3560 Training 
loss: 2.2432 0.0742 sec/batch\n", + "Epoch 9/20 Iteration 1533/3560 Training loss: 2.2432 0.0629 sec/batch\n", + "Epoch 9/20 Iteration 1534/3560 Training loss: 2.2430 0.0647 sec/batch\n", + "Epoch 9/20 Iteration 1535/3560 Training loss: 2.2429 0.0658 sec/batch\n", + "Epoch 9/20 Iteration 1536/3560 Training loss: 2.2429 0.0706 sec/batch\n", + "Epoch 9/20 Iteration 1537/3560 Training loss: 2.2427 0.0624 sec/batch\n", + "Epoch 9/20 Iteration 1538/3560 Training loss: 2.2425 0.0633 sec/batch\n", + "Epoch 9/20 Iteration 1539/3560 Training loss: 2.2423 0.0630 sec/batch\n", + "Epoch 9/20 Iteration 1540/3560 Training loss: 2.2419 0.0668 sec/batch\n", + "Epoch 9/20 Iteration 1541/3560 Training loss: 2.2419 0.0666 sec/batch\n", + "Epoch 9/20 Iteration 1542/3560 Training loss: 2.2419 0.0700 sec/batch\n", + "Epoch 9/20 Iteration 1543/3560 Training loss: 2.2420 0.0650 sec/batch\n", + "Epoch 9/20 Iteration 1544/3560 Training loss: 2.2419 0.0660 sec/batch\n", + "Epoch 9/20 Iteration 1545/3560 Training loss: 2.2420 0.0641 sec/batch\n", + "Epoch 9/20 Iteration 1546/3560 Training loss: 2.2418 0.0660 sec/batch\n", + "Epoch 9/20 Iteration 1547/3560 Training loss: 2.2417 0.0641 sec/batch\n", + "Epoch 9/20 Iteration 1548/3560 Training loss: 2.2417 0.0630 sec/batch\n", + "Epoch 9/20 Iteration 1549/3560 Training loss: 2.2415 0.0690 sec/batch\n", + "Epoch 9/20 Iteration 1550/3560 Training loss: 2.2412 0.0643 sec/batch\n", + "Epoch 9/20 Iteration 1551/3560 Training loss: 2.2412 0.0659 sec/batch\n", + "Epoch 9/20 Iteration 1552/3560 Training loss: 2.2412 0.0632 sec/batch\n", + "Epoch 9/20 Iteration 1553/3560 Training loss: 2.2412 0.0631 sec/batch\n", + "Epoch 9/20 Iteration 1554/3560 Training loss: 2.2411 0.0610 sec/batch\n", + "Epoch 9/20 Iteration 1555/3560 Training loss: 2.2411 0.0684 sec/batch\n", + "Epoch 9/20 Iteration 1556/3560 Training loss: 2.2407 0.0637 sec/batch\n", + "Epoch 9/20 Iteration 1557/3560 Training loss: 2.2407 0.0620 sec/batch\n", + "Epoch 9/20 Iteration 1558/3560 
Training loss: 2.2408 0.0681 sec/batch\n", + "Epoch 9/20 Iteration 1559/3560 Training loss: 2.2406 0.0658 sec/batch\n", + "Epoch 9/20 Iteration 1560/3560 Training loss: 2.2406 0.0646 sec/batch\n", + "Epoch 9/20 Iteration 1561/3560 Training loss: 2.2406 0.0747 sec/batch\n", + "Epoch 9/20 Iteration 1562/3560 Training loss: 2.2405 0.0680 sec/batch\n", + "Epoch 9/20 Iteration 1563/3560 Training loss: 2.2406 0.0653 sec/batch\n", + "Epoch 9/20 Iteration 1564/3560 Training loss: 2.2403 0.0715 sec/batch\n", + "Epoch 9/20 Iteration 1565/3560 Training loss: 2.2404 0.0642 sec/batch\n", + "Epoch 9/20 Iteration 1566/3560 Training loss: 2.2403 0.0725 sec/batch\n", + "Epoch 9/20 Iteration 1567/3560 Training loss: 2.2402 0.0657 sec/batch\n", + "Epoch 9/20 Iteration 1568/3560 Training loss: 2.2401 0.0643 sec/batch\n", + "Epoch 9/20 Iteration 1569/3560 Training loss: 2.2400 0.0660 sec/batch\n", + "Epoch 9/20 Iteration 1570/3560 Training loss: 2.2400 0.0648 sec/batch\n", + "Epoch 9/20 Iteration 1571/3560 Training loss: 2.2400 0.0707 sec/batch\n", + "Epoch 9/20 Iteration 1572/3560 Training loss: 2.2401 0.0657 sec/batch\n", + "Epoch 9/20 Iteration 1573/3560 Training loss: 2.2399 0.0712 sec/batch\n", + "Epoch 9/20 Iteration 1574/3560 Training loss: 2.2397 0.0719 sec/batch\n", + "Epoch 9/20 Iteration 1575/3560 Training loss: 2.2398 0.0657 sec/batch\n", + "Epoch 9/20 Iteration 1576/3560 Training loss: 2.2400 0.0727 sec/batch\n", + "Epoch 9/20 Iteration 1577/3560 Training loss: 2.2399 0.0645 sec/batch\n", + "Epoch 9/20 Iteration 1578/3560 Training loss: 2.2400 0.0644 sec/batch\n", + "Epoch 9/20 Iteration 1579/3560 Training loss: 2.2398 0.0705 sec/batch\n", + "Epoch 9/20 Iteration 1580/3560 Training loss: 2.2397 0.0632 sec/batch\n", + "Epoch 9/20 Iteration 1581/3560 Training loss: 2.2396 0.0648 sec/batch\n", + "Epoch 9/20 Iteration 1582/3560 Training loss: 2.2395 0.0679 sec/batch\n", + "Epoch 9/20 Iteration 1583/3560 Training loss: 2.2393 0.0642 sec/batch\n", + "Epoch 9/20 Iteration 
1584/3560 Training loss: 2.2394 0.0757 sec/batch\n", + "Epoch 9/20 Iteration 1585/3560 Training loss: 2.2394 0.0677 sec/batch\n", + "Epoch 9/20 Iteration 1586/3560 Training loss: 2.2392 0.0630 sec/batch\n", + "Epoch 9/20 Iteration 1587/3560 Training loss: 2.2391 0.0637 sec/batch\n", + "Epoch 9/20 Iteration 1588/3560 Training loss: 2.2390 0.0628 sec/batch\n", + "Epoch 9/20 Iteration 1589/3560 Training loss: 2.2390 0.0636 sec/batch\n", + "Epoch 9/20 Iteration 1590/3560 Training loss: 2.2389 0.0644 sec/batch\n", + "Epoch 9/20 Iteration 1591/3560 Training loss: 2.2388 0.0658 sec/batch\n", + "Epoch 9/20 Iteration 1592/3560 Training loss: 2.2389 0.0670 sec/batch\n", + "Epoch 9/20 Iteration 1593/3560 Training loss: 2.2388 0.0648 sec/batch\n", + "Epoch 9/20 Iteration 1594/3560 Training loss: 2.2386 0.0678 sec/batch\n", + "Epoch 9/20 Iteration 1595/3560 Training loss: 2.2385 0.0695 sec/batch\n", + "Epoch 9/20 Iteration 1596/3560 Training loss: 2.2384 0.0644 sec/batch\n", + "Epoch 9/20 Iteration 1597/3560 Training loss: 2.2384 0.0681 sec/batch\n", + "Epoch 9/20 Iteration 1598/3560 Training loss: 2.2385 0.0661 sec/batch\n", + "Epoch 9/20 Iteration 1599/3560 Training loss: 2.2385 0.0705 sec/batch\n", + "Epoch 9/20 Iteration 1600/3560 Training loss: 2.2384 0.0652 sec/batch\n", + "Epoch 9/20 Iteration 1601/3560 Training loss: 2.2383 0.0642 sec/batch\n", + "Epoch 9/20 Iteration 1602/3560 Training loss: 2.2382 0.0643 sec/batch\n", + "Epoch 10/20 Iteration 1603/3560 Training loss: 2.2982 0.0633 sec/batch\n", + "Epoch 10/20 Iteration 1604/3560 Training loss: 2.2473 0.0640 sec/batch\n", + "Epoch 10/20 Iteration 1605/3560 Training loss: 2.2348 0.0709 sec/batch\n", + "Epoch 10/20 Iteration 1606/3560 Training loss: 2.2249 0.0668 sec/batch\n", + "Epoch 10/20 Iteration 1607/3560 Training loss: 2.2230 0.0670 sec/batch\n", + "Epoch 10/20 Iteration 1608/3560 Training loss: 2.2194 0.0635 sec/batch\n", + "Epoch 10/20 Iteration 1609/3560 Training loss: 2.2205 0.0654 sec/batch\n", + "Epoch 10/20 
Iteration 1610/3560 Training loss: 2.2211 0.0648 sec/batch\n", + "Epoch 10/20 Iteration 1611/3560 Training loss: 2.2236 0.0625 sec/batch\n", + "Epoch 10/20 Iteration 1612/3560 Training loss: 2.2234 0.0740 sec/batch\n", + "Epoch 10/20 Iteration 1613/3560 Training loss: 2.2211 0.0685 sec/batch\n", + "Epoch 10/20 Iteration 1614/3560 Training loss: 2.2203 0.0640 sec/batch\n", + "Epoch 10/20 Iteration 1615/3560 Training loss: 2.2211 0.0641 sec/batch\n", + "Epoch 10/20 Iteration 1616/3560 Training loss: 2.2243 0.0671 sec/batch\n", + "Epoch 10/20 Iteration 1617/3560 Training loss: 2.2242 0.0706 sec/batch\n", + "Epoch 10/20 Iteration 1618/3560 Training loss: 2.2239 0.0639 sec/batch\n", + "Epoch 10/20 Iteration 1619/3560 Training loss: 2.2242 0.0662 sec/batch\n", + "Epoch 10/20 Iteration 1620/3560 Training loss: 2.2266 0.0665 sec/batch\n", + "Epoch 10/20 Iteration 1621/3560 Training loss: 2.2270 0.0635 sec/batch\n", + "Epoch 10/20 Iteration 1622/3560 Training loss: 2.2262 0.0640 sec/batch\n", + "Epoch 10/20 Iteration 1623/3560 Training loss: 2.2253 0.0672 sec/batch\n", + "Epoch 10/20 Iteration 1624/3560 Training loss: 2.2258 0.0646 sec/batch\n", + "Epoch 10/20 Iteration 1625/3560 Training loss: 2.2252 0.0675 sec/batch\n", + "Epoch 10/20 Iteration 1626/3560 Training loss: 2.2242 0.0629 sec/batch\n", + "Epoch 10/20 Iteration 1627/3560 Training loss: 2.2236 0.0714 sec/batch\n", + "Epoch 10/20 Iteration 1628/3560 Training loss: 2.2233 0.0670 sec/batch\n", + "Epoch 10/20 Iteration 1629/3560 Training loss: 2.2227 0.0721 sec/batch\n", + "Epoch 10/20 Iteration 1630/3560 Training loss: 2.2226 0.0679 sec/batch\n", + "Epoch 10/20 Iteration 1631/3560 Training loss: 2.2233 0.0683 sec/batch\n", + "Epoch 10/20 Iteration 1632/3560 Training loss: 2.2232 0.0635 sec/batch\n", + "Epoch 10/20 Iteration 1633/3560 Training loss: 2.2234 0.0637 sec/batch\n", + "Epoch 10/20 Iteration 1634/3560 Training loss: 2.2229 0.0704 sec/batch\n", + "Epoch 10/20 Iteration 1635/3560 Training loss: 2.2226 0.0648 
sec/batch\n", + "Epoch 10/20 Iteration 1636/3560 Training loss: 2.2232 0.0756 sec/batch\n", + "Epoch 10/20 Iteration 1637/3560 Training loss: 2.2228 0.0636 sec/batch\n", + "Epoch 10/20 Iteration 1638/3560 Training loss: 2.2227 0.0612 sec/batch\n", + "Epoch 10/20 Iteration 1639/3560 Training loss: 2.2226 0.0646 sec/batch\n", + "Epoch 10/20 Iteration 1640/3560 Training loss: 2.2217 0.0634 sec/batch\n", + "Epoch 10/20 Iteration 1641/3560 Training loss: 2.2210 0.0656 sec/batch\n", + "Epoch 10/20 Iteration 1642/3560 Training loss: 2.2204 0.0628 sec/batch\n", + "Epoch 10/20 Iteration 1643/3560 Training loss: 2.2199 0.0618 sec/batch\n", + "Epoch 10/20 Iteration 1644/3560 Training loss: 2.2195 0.0663 sec/batch\n", + "Epoch 10/20 Iteration 1645/3560 Training loss: 2.2189 0.0653 sec/batch\n", + "Epoch 10/20 Iteration 1646/3560 Training loss: 2.2182 0.0696 sec/batch\n", + "Epoch 10/20 Iteration 1647/3560 Training loss: 2.2181 0.0664 sec/batch\n", + "Epoch 10/20 Iteration 1648/3560 Training loss: 2.2170 0.0683 sec/batch\n", + "Epoch 10/20 Iteration 1649/3560 Training loss: 2.2170 0.0675 sec/batch\n", + "Epoch 10/20 Iteration 1650/3560 Training loss: 2.2168 0.0633 sec/batch\n", + "Epoch 10/20 Iteration 1651/3560 Training loss: 2.2164 0.0641 sec/batch\n", + "Epoch 10/20 Iteration 1652/3560 Training loss: 2.2168 0.0640 sec/batch\n", + "Epoch 10/20 Iteration 1653/3560 Training loss: 2.2161 0.0629 sec/batch\n", + "Epoch 10/20 Iteration 1654/3560 Training loss: 2.2163 0.0624 sec/batch\n", + "Epoch 10/20 Iteration 1655/3560 Training loss: 2.2160 0.0685 sec/batch\n", + "Epoch 10/20 Iteration 1656/3560 Training loss: 2.2156 0.0746 sec/batch\n", + "Epoch 10/20 Iteration 1657/3560 Training loss: 2.2152 0.0633 sec/batch\n", + "Epoch 10/20 Iteration 1658/3560 Training loss: 2.2152 0.0639 sec/batch\n", + "Epoch 10/20 Iteration 1659/3560 Training loss: 2.2153 0.0675 sec/batch\n", + "Epoch 10/20 Iteration 1660/3560 Training loss: 2.2151 0.0681 sec/batch\n", + "Epoch 10/20 Iteration 1661/3560 
Training loss: 2.2149 0.0646 sec/batch\n", + "[... several hundred similar log lines elided: training loss decreases steadily from ~2.215 (epoch 10) through ~2.17 (epoch 11) and ~2.14 (epoch 12), at roughly 0.06-0.08 sec/batch ...]\n", + "Epoch 13/20 Iteration 2173/3560 
Training loss: 2.1294 0.0681 sec/batch\n", + "Epoch 13/20 Iteration 2174/3560 Training loss: 2.1284 0.0661 sec/batch\n", + "Epoch 13/20 Iteration 2175/3560 Training loss: 2.1274 0.0691 sec/batch\n", + "Epoch 13/20 Iteration 2176/3560 Training loss: 2.1267 0.0641 sec/batch\n", + "Epoch 13/20 Iteration 2177/3560 Training loss: 2.1262 0.0743 sec/batch\n", + "Epoch 13/20 Iteration 2178/3560 Training loss: 2.1258 0.0655 sec/batch\n", + "Epoch 13/20 Iteration 2179/3560 Training loss: 2.1251 0.0634 sec/batch\n", + "Epoch 13/20 Iteration 2180/3560 Training loss: 2.1246 0.0702 sec/batch\n", + "Epoch 13/20 Iteration 2181/3560 Training loss: 2.1245 0.0643 sec/batch\n", + "Epoch 13/20 Iteration 2182/3560 Training loss: 2.1234 0.0652 sec/batch\n", + "Epoch 13/20 Iteration 2183/3560 Training loss: 2.1235 0.0641 sec/batch\n", + "Epoch 13/20 Iteration 2184/3560 Training loss: 2.1232 0.0649 sec/batch\n", + "Epoch 13/20 Iteration 2185/3560 Training loss: 2.1230 0.0650 sec/batch\n", + "Epoch 13/20 Iteration 2186/3560 Training loss: 2.1235 0.0659 sec/batch\n", + "Epoch 13/20 Iteration 2187/3560 Training loss: 2.1231 0.0668 sec/batch\n", + "Epoch 13/20 Iteration 2188/3560 Training loss: 2.1236 0.0652 sec/batch\n", + "Epoch 13/20 Iteration 2189/3560 Training loss: 2.1234 0.0630 sec/batch\n", + "Epoch 13/20 Iteration 2190/3560 Training loss: 2.1231 0.0653 sec/batch\n", + "Epoch 13/20 Iteration 2191/3560 Training loss: 2.1225 0.0718 sec/batch\n", + "Epoch 13/20 Iteration 2192/3560 Training loss: 2.1227 0.0626 sec/batch\n", + "Epoch 13/20 Iteration 2193/3560 Training loss: 2.1230 0.0641 sec/batch\n", + "Epoch 13/20 Iteration 2194/3560 Training loss: 2.1227 0.0672 sec/batch\n", + "Epoch 13/20 Iteration 2195/3560 Training loss: 2.1225 0.0679 sec/batch\n", + "Epoch 13/20 Iteration 2196/3560 Training loss: 2.1226 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2197/3560 Training loss: 2.1224 0.0653 sec/batch\n", + "Epoch 13/20 Iteration 2198/3560 Training loss: 2.1228 0.0681 sec/batch\n", + 
"Epoch 13/20 Iteration 2199/3560 Training loss: 2.1230 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2200/3560 Training loss: 2.1229 0.0718 sec/batch\n", + "Epoch 13/20 Iteration 2201/3560 Training loss: 2.1226 0.0694 sec/batch\n", + "Epoch 13/20 Iteration 2202/3560 Training loss: 2.1228 0.0671 sec/batch\n", + "Epoch 13/20 Iteration 2203/3560 Training loss: 2.1228 0.0684 sec/batch\n", + "Epoch 13/20 Iteration 2204/3560 Training loss: 2.1224 0.0713 sec/batch\n", + "Epoch 13/20 Iteration 2205/3560 Training loss: 2.1220 0.0640 sec/batch\n", + "Epoch 13/20 Iteration 2206/3560 Training loss: 2.1219 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2207/3560 Training loss: 2.1222 0.0646 sec/batch\n", + "Epoch 13/20 Iteration 2208/3560 Training loss: 2.1223 0.0749 sec/batch\n", + "Epoch 13/20 Iteration 2209/3560 Training loss: 2.1225 0.0678 sec/batch\n", + "Epoch 13/20 Iteration 2210/3560 Training loss: 2.1221 0.0660 sec/batch\n", + "Epoch 13/20 Iteration 2211/3560 Training loss: 2.1220 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2212/3560 Training loss: 2.1223 0.0685 sec/batch\n", + "Epoch 13/20 Iteration 2213/3560 Training loss: 2.1220 0.0661 sec/batch\n", + "Epoch 13/20 Iteration 2214/3560 Training loss: 2.1220 0.0629 sec/batch\n", + "Epoch 13/20 Iteration 2215/3560 Training loss: 2.1217 0.0645 sec/batch\n", + "Epoch 13/20 Iteration 2216/3560 Training loss: 2.1215 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2217/3560 Training loss: 2.1210 0.0675 sec/batch\n", + "Epoch 13/20 Iteration 2218/3560 Training loss: 2.1211 0.0732 sec/batch\n", + "Epoch 13/20 Iteration 2219/3560 Training loss: 2.1207 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2220/3560 Training loss: 2.1206 0.0688 sec/batch\n", + "Epoch 13/20 Iteration 2221/3560 Training loss: 2.1201 0.0682 sec/batch\n", + "Epoch 13/20 Iteration 2222/3560 Training loss: 2.1199 0.0672 sec/batch\n", + "Epoch 13/20 Iteration 2223/3560 Training loss: 2.1199 0.0640 sec/batch\n", + "Epoch 13/20 Iteration 2224/3560 Training loss: 
2.1198 0.0637 sec/batch\n", + "Epoch 13/20 Iteration 2225/3560 Training loss: 2.1194 0.0637 sec/batch\n", + "Epoch 13/20 Iteration 2226/3560 Training loss: 2.1194 0.0702 sec/batch\n", + "Epoch 13/20 Iteration 2227/3560 Training loss: 2.1191 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2228/3560 Training loss: 2.1190 0.0634 sec/batch\n", + "Epoch 13/20 Iteration 2229/3560 Training loss: 2.1185 0.0654 sec/batch\n", + "Epoch 13/20 Iteration 2230/3560 Training loss: 2.1182 0.0643 sec/batch\n", + "Epoch 13/20 Iteration 2231/3560 Training loss: 2.1179 0.0650 sec/batch\n", + "Epoch 13/20 Iteration 2232/3560 Training loss: 2.1177 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2233/3560 Training loss: 2.1177 0.0643 sec/batch\n", + "Epoch 13/20 Iteration 2234/3560 Training loss: 2.1174 0.0622 sec/batch\n", + "Epoch 13/20 Iteration 2235/3560 Training loss: 2.1171 0.0668 sec/batch\n", + "Epoch 13/20 Iteration 2236/3560 Training loss: 2.1167 0.0641 sec/batch\n", + "Epoch 13/20 Iteration 2237/3560 Training loss: 2.1167 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2238/3560 Training loss: 2.1167 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2239/3560 Training loss: 2.1163 0.0628 sec/batch\n", + "Epoch 13/20 Iteration 2240/3560 Training loss: 2.1161 0.0646 sec/batch\n", + "Epoch 13/20 Iteration 2241/3560 Training loss: 2.1160 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2242/3560 Training loss: 2.1159 0.0662 sec/batch\n", + "Epoch 13/20 Iteration 2243/3560 Training loss: 2.1158 0.0723 sec/batch\n", + "Epoch 13/20 Iteration 2244/3560 Training loss: 2.1158 0.0662 sec/batch\n", + "Epoch 13/20 Iteration 2245/3560 Training loss: 2.1158 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2246/3560 Training loss: 2.1157 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2247/3560 Training loss: 2.1158 0.0685 sec/batch\n", + "Epoch 13/20 Iteration 2248/3560 Training loss: 2.1157 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2249/3560 Training loss: 2.1155 0.0627 sec/batch\n", + "Epoch 13/20 
Iteration 2250/3560 Training loss: 2.1154 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2251/3560 Training loss: 2.1152 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2252/3560 Training loss: 2.1148 0.0631 sec/batch\n", + "Epoch 13/20 Iteration 2253/3560 Training loss: 2.1149 0.0696 sec/batch\n", + "Epoch 13/20 Iteration 2254/3560 Training loss: 2.1148 0.0631 sec/batch\n", + "Epoch 13/20 Iteration 2255/3560 Training loss: 2.1149 0.0648 sec/batch\n", + "Epoch 13/20 Iteration 2256/3560 Training loss: 2.1148 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2257/3560 Training loss: 2.1149 0.0663 sec/batch\n", + "Epoch 13/20 Iteration 2258/3560 Training loss: 2.1147 0.0703 sec/batch\n", + "Epoch 13/20 Iteration 2259/3560 Training loss: 2.1146 0.0634 sec/batch\n", + "Epoch 13/20 Iteration 2260/3560 Training loss: 2.1147 0.0669 sec/batch\n", + "Epoch 13/20 Iteration 2261/3560 Training loss: 2.1146 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2262/3560 Training loss: 2.1142 0.0626 sec/batch\n", + "Epoch 13/20 Iteration 2263/3560 Training loss: 2.1143 0.0631 sec/batch\n", + "Epoch 13/20 Iteration 2264/3560 Training loss: 2.1142 0.0650 sec/batch\n", + "Epoch 13/20 Iteration 2265/3560 Training loss: 2.1142 0.0629 sec/batch\n", + "Epoch 13/20 Iteration 2266/3560 Training loss: 2.1141 0.0679 sec/batch\n", + "Epoch 13/20 Iteration 2267/3560 Training loss: 2.1140 0.0690 sec/batch\n", + "Epoch 13/20 Iteration 2268/3560 Training loss: 2.1137 0.0637 sec/batch\n", + "Epoch 13/20 Iteration 2269/3560 Training loss: 2.1137 0.0645 sec/batch\n", + "Epoch 13/20 Iteration 2270/3560 Training loss: 2.1137 0.0687 sec/batch\n", + "Epoch 13/20 Iteration 2271/3560 Training loss: 2.1137 0.0626 sec/batch\n", + "Epoch 13/20 Iteration 2272/3560 Training loss: 2.1136 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2273/3560 Training loss: 2.1136 0.0642 sec/batch\n", + "Epoch 13/20 Iteration 2274/3560 Training loss: 2.1137 0.0686 sec/batch\n", + "Epoch 13/20 Iteration 2275/3560 Training loss: 2.1138 0.0640 
sec/batch\n", + "Epoch 13/20 Iteration 2276/3560 Training loss: 2.1136 0.0659 sec/batch\n", + "Epoch 13/20 Iteration 2277/3560 Training loss: 2.1138 0.0634 sec/batch\n", + "Epoch 13/20 Iteration 2278/3560 Training loss: 2.1136 0.0647 sec/batch\n", + "Epoch 13/20 Iteration 2279/3560 Training loss: 2.1136 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2280/3560 Training loss: 2.1135 0.0651 sec/batch\n", + "Epoch 13/20 Iteration 2281/3560 Training loss: 2.1134 0.0761 sec/batch\n", + "Epoch 13/20 Iteration 2282/3560 Training loss: 2.1134 0.0698 sec/batch\n", + "Epoch 13/20 Iteration 2283/3560 Training loss: 2.1134 0.0634 sec/batch\n", + "Epoch 13/20 Iteration 2284/3560 Training loss: 2.1136 0.0705 sec/batch\n", + "Epoch 13/20 Iteration 2285/3560 Training loss: 2.1136 0.0685 sec/batch\n", + "Epoch 13/20 Iteration 2286/3560 Training loss: 2.1135 0.0650 sec/batch\n", + "Epoch 13/20 Iteration 2287/3560 Training loss: 2.1134 0.0654 sec/batch\n", + "Epoch 13/20 Iteration 2288/3560 Training loss: 2.1136 0.0649 sec/batch\n", + "Epoch 13/20 Iteration 2289/3560 Training loss: 2.1136 0.0635 sec/batch\n", + "Epoch 13/20 Iteration 2290/3560 Training loss: 2.1137 0.0630 sec/batch\n", + "Epoch 13/20 Iteration 2291/3560 Training loss: 2.1136 0.0637 sec/batch\n", + "Epoch 13/20 Iteration 2292/3560 Training loss: 2.1135 0.0738 sec/batch\n", + "Epoch 13/20 Iteration 2293/3560 Training loss: 2.1135 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2294/3560 Training loss: 2.1134 0.0646 sec/batch\n", + "Epoch 13/20 Iteration 2295/3560 Training loss: 2.1132 0.0617 sec/batch\n", + "Epoch 13/20 Iteration 2296/3560 Training loss: 2.1134 0.0685 sec/batch\n", + "Epoch 13/20 Iteration 2297/3560 Training loss: 2.1136 0.0694 sec/batch\n", + "Epoch 13/20 Iteration 2298/3560 Training loss: 2.1135 0.0646 sec/batch\n", + "Epoch 13/20 Iteration 2299/3560 Training loss: 2.1134 0.0643 sec/batch\n", + "Epoch 13/20 Iteration 2300/3560 Training loss: 2.1134 0.0618 sec/batch\n", + "Epoch 13/20 Iteration 2301/3560 
Training loss: 2.1134 0.0631 sec/batch\n", + "Epoch 13/20 Iteration 2302/3560 Training loss: 2.1133 0.0632 sec/batch\n", + "Epoch 13/20 Iteration 2303/3560 Training loss: 2.1133 0.0660 sec/batch\n", + "Epoch 13/20 Iteration 2304/3560 Training loss: 2.1134 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2305/3560 Training loss: 2.1134 0.0652 sec/batch\n", + "Epoch 13/20 Iteration 2306/3560 Training loss: 2.1133 0.0689 sec/batch\n", + "Epoch 13/20 Iteration 2307/3560 Training loss: 2.1132 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2308/3560 Training loss: 2.1131 0.0693 sec/batch\n", + "Epoch 13/20 Iteration 2309/3560 Training loss: 2.1131 0.0637 sec/batch\n", + "Epoch 13/20 Iteration 2310/3560 Training loss: 2.1132 0.0656 sec/batch\n", + "Epoch 13/20 Iteration 2311/3560 Training loss: 2.1132 0.0643 sec/batch\n", + "Epoch 13/20 Iteration 2312/3560 Training loss: 2.1132 0.0717 sec/batch\n", + "Epoch 13/20 Iteration 2313/3560 Training loss: 2.1131 0.0653 sec/batch\n", + "Epoch 13/20 Iteration 2314/3560 Training loss: 2.1130 0.0712 sec/batch\n", + "Epoch 14/20 Iteration 2315/3560 Training loss: 2.1894 0.0640 sec/batch\n", + "Epoch 14/20 Iteration 2316/3560 Training loss: 2.1337 0.0701 sec/batch\n", + "Epoch 14/20 Iteration 2317/3560 Training loss: 2.1215 0.0660 sec/batch\n", + "Epoch 14/20 Iteration 2318/3560 Training loss: 2.1149 0.0654 sec/batch\n", + "Epoch 14/20 Iteration 2319/3560 Training loss: 2.1108 0.0682 sec/batch\n", + "Epoch 14/20 Iteration 2320/3560 Training loss: 2.1031 0.0645 sec/batch\n", + "Epoch 14/20 Iteration 2321/3560 Training loss: 2.1046 0.0635 sec/batch\n", + "Epoch 14/20 Iteration 2322/3560 Training loss: 2.1047 0.0664 sec/batch\n", + "Epoch 14/20 Iteration 2323/3560 Training loss: 2.1067 0.0659 sec/batch\n", + "Epoch 14/20 Iteration 2324/3560 Training loss: 2.1060 0.0665 sec/batch\n", + "Epoch 14/20 Iteration 2325/3560 Training loss: 2.1040 0.0764 sec/batch\n", + "Epoch 14/20 Iteration 2326/3560 Training loss: 2.1027 0.0698 sec/batch\n", + 
"Epoch 14/20 Iteration 2327/3560 Training loss: 2.1030 0.0654 sec/batch\n", + "Epoch 14/20 Iteration 2328/3560 Training loss: 2.1064 0.0641 sec/batch\n", + "Epoch 14/20 Iteration 2329/3560 Training loss: 2.1061 0.0652 sec/batch\n", + "Epoch 14/20 Iteration 2330/3560 Training loss: 2.1049 0.0641 sec/batch\n", + "Epoch 14/20 Iteration 2331/3560 Training loss: 2.1047 0.0628 sec/batch\n", + "Epoch 14/20 Iteration 2332/3560 Training loss: 2.1068 0.0698 sec/batch\n", + "Epoch 14/20 Iteration 2333/3560 Training loss: 2.1066 0.0639 sec/batch\n", + "Epoch 14/20 Iteration 2334/3560 Training loss: 2.1058 0.0699 sec/batch\n", + "Epoch 14/20 Iteration 2335/3560 Training loss: 2.1048 0.0670 sec/batch\n", + "Epoch 14/20 Iteration 2336/3560 Training loss: 2.1058 0.0658 sec/batch\n", + "Epoch 14/20 Iteration 2337/3560 Training loss: 2.1052 0.0689 sec/batch\n", + "Epoch 14/20 Iteration 2338/3560 Training loss: 2.1044 0.0656 sec/batch\n", + "Epoch 14/20 Iteration 2339/3560 Training loss: 2.1042 0.0652 sec/batch\n", + "Epoch 14/20 Iteration 2340/3560 Training loss: 2.1033 0.0697 sec/batch\n", + "Epoch 14/20 Iteration 2341/3560 Training loss: 2.1027 0.0665 sec/batch\n", + "Epoch 14/20 Iteration 2342/3560 Training loss: 2.1031 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2343/3560 Training loss: 2.1041 0.0655 sec/batch\n", + "Epoch 14/20 Iteration 2344/3560 Training loss: 2.1041 0.0639 sec/batch\n", + "Epoch 14/20 Iteration 2345/3560 Training loss: 2.1045 0.0629 sec/batch\n", + "Epoch 14/20 Iteration 2346/3560 Training loss: 2.1041 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2347/3560 Training loss: 2.1039 0.0643 sec/batch\n", + "Epoch 14/20 Iteration 2348/3560 Training loss: 2.1047 0.0711 sec/batch\n", + "Epoch 14/20 Iteration 2349/3560 Training loss: 2.1042 0.0738 sec/batch\n", + "Epoch 14/20 Iteration 2350/3560 Training loss: 2.1040 0.0650 sec/batch\n", + "Epoch 14/20 Iteration 2351/3560 Training loss: 2.1039 0.0646 sec/batch\n", + "Epoch 14/20 Iteration 2352/3560 Training loss: 
2.1030 0.0658 sec/batch\n", + "Epoch 14/20 Iteration 2353/3560 Training loss: 2.1023 0.0712 sec/batch\n", + "Epoch 14/20 Iteration 2354/3560 Training loss: 2.1017 0.0636 sec/batch\n", + "Epoch 14/20 Iteration 2355/3560 Training loss: 2.1012 0.0644 sec/batch\n", + "Epoch 14/20 Iteration 2356/3560 Training loss: 2.1009 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2357/3560 Training loss: 2.1002 0.0742 sec/batch\n", + "Epoch 14/20 Iteration 2358/3560 Training loss: 2.0994 0.0640 sec/batch\n", + "Epoch 14/20 Iteration 2359/3560 Training loss: 2.0993 0.0730 sec/batch\n", + "Epoch 14/20 Iteration 2360/3560 Training loss: 2.0981 0.0720 sec/batch\n", + "Epoch 14/20 Iteration 2361/3560 Training loss: 2.0982 0.0635 sec/batch\n", + "Epoch 14/20 Iteration 2362/3560 Training loss: 2.0978 0.0699 sec/batch\n", + "Epoch 14/20 Iteration 2363/3560 Training loss: 2.0977 0.0645 sec/batch\n", + "Epoch 14/20 Iteration 2364/3560 Training loss: 2.0984 0.0642 sec/batch\n", + "Epoch 14/20 Iteration 2365/3560 Training loss: 2.0978 0.0646 sec/batch\n", + "Epoch 14/20 Iteration 2366/3560 Training loss: 2.0981 0.0701 sec/batch\n", + "Epoch 14/20 Iteration 2367/3560 Training loss: 2.0978 0.0646 sec/batch\n", + "Epoch 14/20 Iteration 2368/3560 Training loss: 2.0976 0.0768 sec/batch\n", + "Epoch 14/20 Iteration 2369/3560 Training loss: 2.0972 0.0676 sec/batch\n", + "Epoch 14/20 Iteration 2370/3560 Training loss: 2.0973 0.0656 sec/batch\n", + "Epoch 14/20 Iteration 2371/3560 Training loss: 2.0974 0.0658 sec/batch\n", + "Epoch 14/20 Iteration 2372/3560 Training loss: 2.0972 0.0671 sec/batch\n", + "Epoch 14/20 Iteration 2373/3560 Training loss: 2.0968 0.0652 sec/batch\n", + "Epoch 14/20 Iteration 2374/3560 Training loss: 2.0972 0.0628 sec/batch\n", + "Epoch 14/20 Iteration 2375/3560 Training loss: 2.0969 0.0641 sec/batch\n", + "Epoch 14/20 Iteration 2376/3560 Training loss: 2.0973 0.0656 sec/batch\n", + "Epoch 14/20 Iteration 2377/3560 Training loss: 2.0976 0.0655 sec/batch\n", + "Epoch 14/20 
Iteration 2378/3560 Training loss: 2.0976 0.0672 sec/batch\n", + "Epoch 14/20 Iteration 2379/3560 Training loss: 2.0973 0.0649 sec/batch\n", + "Epoch 14/20 Iteration 2380/3560 Training loss: 2.0975 0.0717 sec/batch\n", + "Epoch 14/20 Iteration 2381/3560 Training loss: 2.0975 0.0721 sec/batch\n", + "Epoch 14/20 Iteration 2382/3560 Training loss: 2.0969 0.0652 sec/batch\n", + "Epoch 14/20 Iteration 2383/3560 Training loss: 2.0965 0.0660 sec/batch\n", + "Epoch 14/20 Iteration 2384/3560 Training loss: 2.0966 0.0685 sec/batch\n", + "Epoch 14/20 Iteration 2385/3560 Training loss: 2.0968 0.0664 sec/batch\n", + "Epoch 14/20 Iteration 2386/3560 Training loss: 2.0968 0.0667 sec/batch\n", + "Epoch 14/20 Iteration 2387/3560 Training loss: 2.0971 0.0656 sec/batch\n", + "Epoch 14/20 Iteration 2388/3560 Training loss: 2.0967 0.0744 sec/batch\n", + "Epoch 14/20 Iteration 2389/3560 Training loss: 2.0968 0.0668 sec/batch\n", + "Epoch 14/20 Iteration 2390/3560 Training loss: 2.0971 0.0654 sec/batch\n", + "Epoch 14/20 Iteration 2391/3560 Training loss: 2.0968 0.0643 sec/batch\n", + "Epoch 14/20 Iteration 2392/3560 Training loss: 2.0968 0.0697 sec/batch\n", + "Epoch 14/20 Iteration 2393/3560 Training loss: 2.0965 0.0694 sec/batch\n", + "Epoch 14/20 Iteration 2394/3560 Training loss: 2.0963 0.0680 sec/batch\n", + "Epoch 14/20 Iteration 2395/3560 Training loss: 2.0960 0.0641 sec/batch\n", + "Epoch 14/20 Iteration 2396/3560 Training loss: 2.0961 0.0645 sec/batch\n", + "Epoch 14/20 Iteration 2397/3560 Training loss: 2.0958 0.0696 sec/batch\n", + "Epoch 14/20 Iteration 2398/3560 Training loss: 2.0957 0.0637 sec/batch\n", + "Epoch 14/20 Iteration 2399/3560 Training loss: 2.0952 0.0733 sec/batch\n", + "Epoch 14/20 Iteration 2400/3560 Training loss: 2.0948 0.0629 sec/batch\n", + "Epoch 14/20 Iteration 2401/3560 Training loss: 2.0946 0.0677 sec/batch\n", + "Epoch 14/20 Iteration 2402/3560 Training loss: 2.0943 0.0655 sec/batch\n", + "Epoch 14/20 Iteration 2403/3560 Training loss: 2.0940 0.0686 
sec/batch\n", + "Epoch 14/20 Iteration 2404/3560 Training loss: 2.0941 0.0680 sec/batch\n", + "Epoch 14/20 Iteration 2405/3560 Training loss: 2.0937 0.0661 sec/batch\n", + "Epoch 14/20 Iteration 2406/3560 Training loss: 2.0937 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2407/3560 Training loss: 2.0933 0.0643 sec/batch\n", + "Epoch 14/20 Iteration 2408/3560 Training loss: 2.0930 0.0672 sec/batch\n", + "Epoch 14/20 Iteration 2409/3560 Training loss: 2.0926 0.0638 sec/batch\n", + "Epoch 14/20 Iteration 2410/3560 Training loss: 2.0925 0.0635 sec/batch\n", + "Epoch 14/20 Iteration 2411/3560 Training loss: 2.0923 0.0649 sec/batch\n", + "Epoch 14/20 Iteration 2412/3560 Training loss: 2.0921 0.0680 sec/batch\n", + "Epoch 14/20 Iteration 2413/3560 Training loss: 2.0917 0.0633 sec/batch\n", + "Epoch 14/20 Iteration 2414/3560 Training loss: 2.0913 0.0643 sec/batch\n", + "Epoch 14/20 Iteration 2415/3560 Training loss: 2.0913 0.0656 sec/batch\n", + "Epoch 14/20 Iteration 2416/3560 Training loss: 2.0912 0.0628 sec/batch\n", + "Epoch 14/20 Iteration 2417/3560 Training loss: 2.0910 0.0621 sec/batch\n", + "Epoch 14/20 Iteration 2418/3560 Training loss: 2.0907 0.0637 sec/batch\n", + "Epoch 14/20 Iteration 2419/3560 Training loss: 2.0905 0.0632 sec/batch\n", + "Epoch 14/20 Iteration 2420/3560 Training loss: 2.0904 0.0685 sec/batch\n", + "Epoch 14/20 Iteration 2421/3560 Training loss: 2.0904 0.0655 sec/batch\n", + "Epoch 14/20 Iteration 2422/3560 Training loss: 2.0903 0.0643 sec/batch\n", + "Epoch 14/20 Iteration 2423/3560 Training loss: 2.0903 0.0682 sec/batch\n", + "Epoch 14/20 Iteration 2424/3560 Training loss: 2.0902 0.0670 sec/batch\n", + "Epoch 14/20 Iteration 2425/3560 Training loss: 2.0902 0.0667 sec/batch\n", + "Epoch 14/20 Iteration 2426/3560 Training loss: 2.0901 0.0757 sec/batch\n", + "Epoch 14/20 Iteration 2427/3560 Training loss: 2.0899 0.0662 sec/batch\n", + "Epoch 14/20 Iteration 2428/3560 Training loss: 2.0898 0.0694 sec/batch\n", + "Epoch 14/20 Iteration 2429/3560 
Training loss: 2.0896 0.0672 sec/batch\n", + "Epoch 14/20 Iteration 2430/3560 Training loss: 2.0892 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2431/3560 Training loss: 2.0892 0.0646 sec/batch\n", + "Epoch 14/20 Iteration 2432/3560 Training loss: 2.0891 0.0694 sec/batch\n", + "Epoch 14/20 Iteration 2433/3560 Training loss: 2.0892 0.0717 sec/batch\n", + "Epoch 14/20 Iteration 2434/3560 Training loss: 2.0892 0.0630 sec/batch\n", + "Epoch 14/20 Iteration 2435/3560 Training loss: 2.0892 0.0623 sec/batch\n", + "Epoch 14/20 Iteration 2436/3560 Training loss: 2.0890 0.0749 sec/batch\n", + "Epoch 14/20 Iteration 2437/3560 Training loss: 2.0889 0.0708 sec/batch\n", + "Epoch 14/20 Iteration 2438/3560 Training loss: 2.0889 0.0733 sec/batch\n", + "Epoch 14/20 Iteration 2439/3560 Training loss: 2.0889 0.0688 sec/batch\n", + "Epoch 14/20 Iteration 2440/3560 Training loss: 2.0887 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2441/3560 Training loss: 2.0886 0.0639 sec/batch\n", + "Epoch 14/20 Iteration 2442/3560 Training loss: 2.0886 0.0743 sec/batch\n", + "Epoch 14/20 Iteration 2443/3560 Training loss: 2.0886 0.0686 sec/batch\n", + "Epoch 14/20 Iteration 2444/3560 Training loss: 2.0887 0.0679 sec/batch\n", + "Epoch 14/20 Iteration 2445/3560 Training loss: 2.0885 0.0669 sec/batch\n", + "Epoch 14/20 Iteration 2446/3560 Training loss: 2.0882 0.0684 sec/batch\n", + "Epoch 14/20 Iteration 2447/3560 Training loss: 2.0883 0.0737 sec/batch\n", + "Epoch 14/20 Iteration 2448/3560 Training loss: 2.0882 0.0730 sec/batch\n", + "Epoch 14/20 Iteration 2449/3560 Training loss: 2.0882 0.0654 sec/batch\n", + "Epoch 14/20 Iteration 2450/3560 Training loss: 2.0883 0.0722 sec/batch\n", + "Epoch 14/20 Iteration 2451/3560 Training loss: 2.0883 0.0667 sec/batch\n", + "Epoch 14/20 Iteration 2452/3560 Training loss: 2.0883 0.0727 sec/batch\n", + "Epoch 14/20 Iteration 2453/3560 Training loss: 2.0885 0.0639 sec/batch\n", + "Epoch 14/20 Iteration 2454/3560 Training loss: 2.0884 0.0627 sec/batch\n", + 
"Epoch 14/20 Iteration 2455/3560 Training loss: 2.0885 0.0656 sec/batch\n", + "Epoch 14/20 Iteration 2456/3560 Training loss: 2.0884 0.0653 sec/batch\n", + "Epoch 14/20 Iteration 2457/3560 Training loss: 2.0882 0.0646 sec/batch\n", + "Epoch 14/20 Iteration 2458/3560 Training loss: 2.0881 0.0672 sec/batch\n", + "Epoch 14/20 Iteration 2459/3560 Training loss: 2.0881 0.0680 sec/batch\n", + "Epoch 14/20 Iteration 2460/3560 Training loss: 2.0881 0.0738 sec/batch\n", + "Epoch 14/20 Iteration 2461/3560 Training loss: 2.0882 0.0662 sec/batch\n", + "Epoch 14/20 Iteration 2462/3560 Training loss: 2.0883 0.0656 sec/batch\n", + "Epoch 14/20 Iteration 2463/3560 Training loss: 2.0883 0.0631 sec/batch\n", + "Epoch 14/20 Iteration 2464/3560 Training loss: 2.0881 0.0666 sec/batch\n", + "Epoch 14/20 Iteration 2465/3560 Training loss: 2.0880 0.0645 sec/batch\n", + "Epoch 14/20 Iteration 2466/3560 Training loss: 2.0883 0.0636 sec/batch\n", + "Epoch 14/20 Iteration 2467/3560 Training loss: 2.0882 0.0688 sec/batch\n", + "Epoch 14/20 Iteration 2468/3560 Training loss: 2.0882 0.0718 sec/batch\n", + "Epoch 14/20 Iteration 2469/3560 Training loss: 2.0882 0.0660 sec/batch\n", + "Epoch 14/20 Iteration 2470/3560 Training loss: 2.0881 0.0758 sec/batch\n", + "Epoch 14/20 Iteration 2471/3560 Training loss: 2.0882 0.0688 sec/batch\n", + "Epoch 14/20 Iteration 2472/3560 Training loss: 2.0882 0.0683 sec/batch\n", + "Epoch 14/20 Iteration 2473/3560 Training loss: 2.0880 0.0652 sec/batch\n", + "Epoch 14/20 Iteration 2474/3560 Training loss: 2.0881 0.0710 sec/batch\n", + "Epoch 14/20 Iteration 2475/3560 Training loss: 2.0882 0.0668 sec/batch\n", + "Epoch 14/20 Iteration 2476/3560 Training loss: 2.0881 0.0656 sec/batch\n", + "Epoch 14/20 Iteration 2477/3560 Training loss: 2.0881 0.0666 sec/batch\n", + "Epoch 14/20 Iteration 2478/3560 Training loss: 2.0881 0.0657 sec/batch\n", + "Epoch 14/20 Iteration 2479/3560 Training loss: 2.0881 0.0697 sec/batch\n", + "Epoch 14/20 Iteration 2480/3560 Training loss: 
2.0880 0.0697 sec/batch\n", + "Epoch 14/20 Iteration 2481/3560 Training loss: 2.0880 0.0776 sec/batch\n", + "Epoch 14/20 Iteration 2482/3560 Training loss: 2.0882 0.0667 sec/batch\n", + "Epoch 14/20 Iteration 2483/3560 Training loss: 2.0882 0.0652 sec/batch\n", + "Epoch 14/20 Iteration 2484/3560 Training loss: 2.0881 0.0640 sec/batch\n", + "Epoch 14/20 Iteration 2485/3560 Training loss: 2.0880 0.0671 sec/batch\n", + "Epoch 14/20 Iteration 2486/3560 Training loss: 2.0879 0.0664 sec/batch\n", + "Epoch 14/20 Iteration 2487/3560 Training loss: 2.0879 0.0681 sec/batch\n", + "Epoch 14/20 Iteration 2488/3560 Training loss: 2.0879 0.0627 sec/batch\n", + "Epoch 14/20 Iteration 2489/3560 Training loss: 2.0880 0.0658 sec/batch\n", + "Epoch 14/20 Iteration 2490/3560 Training loss: 2.0880 0.0680 sec/batch\n", + "Epoch 14/20 Iteration 2491/3560 Training loss: 2.0879 0.0644 sec/batch\n", + "Epoch 14/20 Iteration 2492/3560 Training loss: 2.0878 0.0636 sec/batch\n", + "Epoch 15/20 Iteration 2493/3560 Training loss: 2.1623 0.0627 sec/batch\n", + "Epoch 15/20 Iteration 2494/3560 Training loss: 2.1084 0.0685 sec/batch\n", + "Epoch 15/20 Iteration 2495/3560 Training loss: 2.0931 0.0644 sec/batch\n", + "Epoch 15/20 Iteration 2496/3560 Training loss: 2.0874 0.0673 sec/batch\n", + "Epoch 15/20 Iteration 2497/3560 Training loss: 2.0836 0.0754 sec/batch\n", + "Epoch 15/20 Iteration 2498/3560 Training loss: 2.0778 0.0682 sec/batch\n", + "Epoch 15/20 Iteration 2499/3560 Training loss: 2.0789 0.0671 sec/batch\n", + "Epoch 15/20 Iteration 2500/3560 Training loss: 2.0810 0.0695 sec/batch\n", + "Epoch 15/20 Iteration 2501/3560 Training loss: 2.0816 0.0655 sec/batch\n", + "Epoch 15/20 Iteration 2502/3560 Training loss: 2.0809 0.0636 sec/batch\n", + "Epoch 15/20 Iteration 2503/3560 Training loss: 2.0782 0.0718 sec/batch\n", + "Epoch 15/20 Iteration 2504/3560 Training loss: 2.0772 0.0700 sec/batch\n", + "Epoch 15/20 Iteration 2505/3560 Training loss: 2.0771 0.0656 sec/batch\n", + "Epoch 15/20 
Iteration 2506/3560 Training loss: 2.0794 0.0625 sec/batch\n", + "Epoch 15/20 Iteration 2507/3560 Training loss: 2.0791 0.0686 sec/batch\n", + "Epoch 15/20 Iteration 2508/3560 Training loss: 2.0785 0.0660 sec/batch\n", + "Epoch 15/20 Iteration 2509/3560 Training loss: 2.0790 0.0646 sec/batch\n", + "Epoch 15/20 Iteration 2510/3560 Training loss: 2.0813 0.0674 sec/batch\n", + "Epoch 15/20 Iteration 2511/3560 Training loss: 2.0809 0.0638 sec/batch\n", + "Epoch 15/20 Iteration 2512/3560 Training loss: 2.0805 0.0674 sec/batch\n", + "Epoch 15/20 Iteration 2513/3560 Training loss: 2.0799 0.0693 sec/batch\n", + "Epoch 15/20 Iteration 2514/3560 Training loss: 2.0813 0.0646 sec/batch\n", + "Epoch 15/20 Iteration 2515/3560 Training loss: 2.0809 0.0709 sec/batch\n", + "Epoch 15/20 Iteration 2516/3560 Training loss: 2.0803 0.0654 sec/batch\n", + "Epoch 15/20 Iteration 2517/3560 Training loss: 2.0802 0.0654 sec/batch\n", + "Epoch 15/20 Iteration 2518/3560 Training loss: 2.0795 0.0693 sec/batch\n", + "Epoch 15/20 Iteration 2519/3560 Training loss: 2.0791 0.0647 sec/batch\n", + "Epoch 15/20 Iteration 2520/3560 Training loss: 2.0794 0.0630 sec/batch\n", + "Epoch 15/20 Iteration 2521/3560 Training loss: 2.0803 0.0645 sec/batch\n", + "Epoch 15/20 Iteration 2522/3560 Training loss: 2.0802 0.0669 sec/batch\n", + "Epoch 15/20 Iteration 2523/3560 Training loss: 2.0803 0.0636 sec/batch\n", + "Epoch 15/20 Iteration 2524/3560 Training loss: 2.0795 0.0647 sec/batch\n", + "Epoch 15/20 Iteration 2525/3560 Training loss: 2.0796 0.0671 sec/batch\n", + "Epoch 15/20 Iteration 2526/3560 Training loss: 2.0802 0.0675 sec/batch\n", + "Epoch 15/20 Iteration 2527/3560 Training loss: 2.0799 0.0662 sec/batch\n", + "Epoch 15/20 Iteration 2528/3560 Training loss: 2.0794 0.0652 sec/batch\n", + "Epoch 15/20 Iteration 2529/3560 Training loss: 2.0788 0.0657 sec/batch\n", + "Epoch 15/20 Iteration 2530/3560 Training loss: 2.0779 0.0655 sec/batch\n", + "Epoch 15/20 Iteration 2531/3560 Training loss: 2.0767 0.0753 
sec/batch\n", + "[... training log truncated: Epoch 15/20 through Epoch 18/20, iterations 2532-3043 of 3560; training loss decreases steadily from 2.0763 to 2.0179 at roughly 0.06-0.08 sec/batch ...]\n", + "Epoch 18/20 Iteration 3043/3560 Training loss: 2.0179 0.0661 
sec/batch\n", + "Epoch 18/20 Iteration 3044/3560 Training loss: 2.0204 0.0688 sec/batch\n", + "Epoch 18/20 Iteration 3045/3560 Training loss: 2.0205 0.0637 sec/batch\n", + "Epoch 18/20 Iteration 3046/3560 Training loss: 2.0209 0.0667 sec/batch\n", + "Epoch 18/20 Iteration 3047/3560 Training loss: 2.0198 0.0666 sec/batch\n", + "Epoch 18/20 Iteration 3048/3560 Training loss: 2.0212 0.0646 sec/batch\n", + "Epoch 18/20 Iteration 3049/3560 Training loss: 2.0207 0.0718 sec/batch\n", + "Epoch 18/20 Iteration 3050/3560 Training loss: 2.0198 0.0674 sec/batch\n", + "Epoch 18/20 Iteration 3051/3560 Training loss: 2.0193 0.0647 sec/batch\n", + "Epoch 18/20 Iteration 3052/3560 Training loss: 2.0186 0.0670 sec/batch\n", + "Epoch 18/20 Iteration 3053/3560 Training loss: 2.0179 0.0653 sec/batch\n", + "Epoch 18/20 Iteration 3054/3560 Training loss: 2.0182 0.0781 sec/batch\n", + "Epoch 18/20 Iteration 3055/3560 Training loss: 2.0190 0.0642 sec/batch\n", + "Epoch 18/20 Iteration 3056/3560 Training loss: 2.0191 0.0618 sec/batch\n", + "Epoch 18/20 Iteration 3057/3560 Training loss: 2.0191 0.0749 sec/batch\n", + "Epoch 18/20 Iteration 3058/3560 Training loss: 2.0185 0.0713 sec/batch\n", + "Epoch 18/20 Iteration 3059/3560 Training loss: 2.0187 0.0659 sec/batch\n", + "Epoch 18/20 Iteration 3060/3560 Training loss: 2.0194 0.0660 sec/batch\n", + "Epoch 18/20 Iteration 3061/3560 Training loss: 2.0190 0.0651 sec/batch\n", + "Epoch 18/20 Iteration 3062/3560 Training loss: 2.0187 0.0622 sec/batch\n", + "Epoch 18/20 Iteration 3063/3560 Training loss: 2.0183 0.0667 sec/batch\n", + "Epoch 18/20 Iteration 3064/3560 Training loss: 2.0173 0.0657 sec/batch\n", + "Epoch 18/20 Iteration 3065/3560 Training loss: 2.0163 0.0754 sec/batch\n", + "Epoch 18/20 Iteration 3066/3560 Training loss: 2.0155 0.0684 sec/batch\n", + "Epoch 18/20 Iteration 3067/3560 Training loss: 2.0153 0.0684 sec/batch\n", + "Epoch 18/20 Iteration 3068/3560 Training loss: 2.0150 0.0644 sec/batch\n", + "Epoch 18/20 Iteration 3069/3560 
Training loss: 2.0145 0.0656 sec/batch\n", + "Epoch 18/20 Iteration 3070/3560 Training loss: 2.0137 0.0656 sec/batch\n", + "Epoch 18/20 Iteration 3071/3560 Training loss: 2.0137 0.0644 sec/batch\n", + "Epoch 18/20 Iteration 3072/3560 Training loss: 2.0124 0.0652 sec/batch\n", + "Epoch 18/20 Iteration 3073/3560 Training loss: 2.0125 0.0650 sec/batch\n", + "Epoch 18/20 Iteration 3074/3560 Training loss: 2.0122 0.0704 sec/batch\n", + "Epoch 18/20 Iteration 3075/3560 Training loss: 2.0119 0.0695 sec/batch\n", + "Epoch 18/20 Iteration 3076/3560 Training loss: 2.0128 0.0700 sec/batch\n", + "Epoch 18/20 Iteration 3077/3560 Training loss: 2.0123 0.0691 sec/batch\n", + "Epoch 18/20 Iteration 3078/3560 Training loss: 2.0128 0.0767 sec/batch\n", + "Epoch 18/20 Iteration 3079/3560 Training loss: 2.0126 0.0649 sec/batch\n", + "Epoch 18/20 Iteration 3080/3560 Training loss: 2.0124 0.0665 sec/batch\n", + "Epoch 18/20 Iteration 3081/3560 Training loss: 2.0122 0.0667 sec/batch\n", + "Epoch 18/20 Iteration 3082/3560 Training loss: 2.0123 0.0655 sec/batch\n", + "Epoch 18/20 Iteration 3083/3560 Training loss: 2.0126 0.0675 sec/batch\n", + "Epoch 18/20 Iteration 3084/3560 Training loss: 2.0125 0.0628 sec/batch\n", + "Epoch 18/20 Iteration 3085/3560 Training loss: 2.0121 0.0672 sec/batch\n", + "Epoch 18/20 Iteration 3086/3560 Training loss: 2.0125 0.0646 sec/batch\n", + "Epoch 18/20 Iteration 3087/3560 Training loss: 2.0124 0.0796 sec/batch\n", + "Epoch 18/20 Iteration 3088/3560 Training loss: 2.0130 0.0684 sec/batch\n", + "Epoch 18/20 Iteration 3089/3560 Training loss: 2.0132 0.0683 sec/batch\n", + "Epoch 18/20 Iteration 3090/3560 Training loss: 2.0134 0.0672 sec/batch\n", + "Epoch 18/20 Iteration 3091/3560 Training loss: 2.0132 0.0668 sec/batch\n", + "Epoch 18/20 Iteration 3092/3560 Training loss: 2.0136 0.0667 sec/batch\n", + "Epoch 18/20 Iteration 3093/3560 Training loss: 2.0137 0.0631 sec/batch\n", + "Epoch 18/20 Iteration 3094/3560 Training loss: 2.0132 0.0656 sec/batch\n", + 
"Epoch 18/20 Iteration 3095/3560 Training loss: 2.0129 0.0641 sec/batch\n", + "Epoch 18/20 Iteration 3096/3560 Training loss: 2.0129 0.0642 sec/batch\n", + "Epoch 18/20 Iteration 3097/3560 Training loss: 2.0131 0.0635 sec/batch\n", + "Epoch 18/20 Iteration 3098/3560 Training loss: 2.0133 0.0744 sec/batch\n", + "Epoch 18/20 Iteration 3099/3560 Training loss: 2.0135 0.0732 sec/batch\n", + "Epoch 18/20 Iteration 3100/3560 Training loss: 2.0132 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3101/3560 Training loss: 2.0132 0.0637 sec/batch\n", + "Epoch 18/20 Iteration 3102/3560 Training loss: 2.0134 0.0647 sec/batch\n", + "Epoch 18/20 Iteration 3103/3560 Training loss: 2.0132 0.0651 sec/batch\n", + "Epoch 18/20 Iteration 3104/3560 Training loss: 2.0133 0.0659 sec/batch\n", + "Epoch 18/20 Iteration 3105/3560 Training loss: 2.0128 0.0648 sec/batch\n", + "Epoch 18/20 Iteration 3106/3560 Training loss: 2.0127 0.0651 sec/batch\n", + "Epoch 18/20 Iteration 3107/3560 Training loss: 2.0123 0.0677 sec/batch\n", + "Epoch 18/20 Iteration 3108/3560 Training loss: 2.0124 0.0659 sec/batch\n", + "Epoch 18/20 Iteration 3109/3560 Training loss: 2.0121 0.0721 sec/batch\n", + "Epoch 18/20 Iteration 3110/3560 Training loss: 2.0120 0.0624 sec/batch\n", + "Epoch 18/20 Iteration 3111/3560 Training loss: 2.0115 0.0723 sec/batch\n", + "Epoch 18/20 Iteration 3112/3560 Training loss: 2.0112 0.0669 sec/batch\n", + "Epoch 18/20 Iteration 3113/3560 Training loss: 2.0111 0.0665 sec/batch\n", + "Epoch 18/20 Iteration 3114/3560 Training loss: 2.0109 0.0665 sec/batch\n", + "Epoch 18/20 Iteration 3115/3560 Training loss: 2.0103 0.0641 sec/batch\n", + "Epoch 18/20 Iteration 3116/3560 Training loss: 2.0104 0.0679 sec/batch\n", + "Epoch 18/20 Iteration 3117/3560 Training loss: 2.0101 0.0681 sec/batch\n", + "Epoch 18/20 Iteration 3118/3560 Training loss: 2.0100 0.0682 sec/batch\n", + "Epoch 18/20 Iteration 3119/3560 Training loss: 2.0096 0.0737 sec/batch\n", + "Epoch 18/20 Iteration 3120/3560 Training loss: 
2.0093 0.0636 sec/batch\n", + "Epoch 18/20 Iteration 3121/3560 Training loss: 2.0090 0.0634 sec/batch\n", + "Epoch 18/20 Iteration 3122/3560 Training loss: 2.0089 0.0686 sec/batch\n", + "Epoch 18/20 Iteration 3123/3560 Training loss: 2.0089 0.0669 sec/batch\n", + "Epoch 18/20 Iteration 3124/3560 Training loss: 2.0086 0.0676 sec/batch\n", + "Epoch 18/20 Iteration 3125/3560 Training loss: 2.0083 0.0706 sec/batch\n", + "Epoch 18/20 Iteration 3126/3560 Training loss: 2.0079 0.0665 sec/batch\n", + "Epoch 18/20 Iteration 3127/3560 Training loss: 2.0079 0.0705 sec/batch\n", + "Epoch 18/20 Iteration 3128/3560 Training loss: 2.0079 0.0701 sec/batch\n", + "Epoch 18/20 Iteration 3129/3560 Training loss: 2.0076 0.0657 sec/batch\n", + "Epoch 18/20 Iteration 3130/3560 Training loss: 2.0075 0.0678 sec/batch\n", + "Epoch 18/20 Iteration 3131/3560 Training loss: 2.0073 0.0671 sec/batch\n", + "Epoch 18/20 Iteration 3132/3560 Training loss: 2.0072 0.0644 sec/batch\n", + "Epoch 18/20 Iteration 3133/3560 Training loss: 2.0072 0.0648 sec/batch\n", + "Epoch 18/20 Iteration 3134/3560 Training loss: 2.0071 0.0640 sec/batch\n", + "Epoch 18/20 Iteration 3135/3560 Training loss: 2.0072 0.0703 sec/batch\n", + "Epoch 18/20 Iteration 3136/3560 Training loss: 2.0072 0.0643 sec/batch\n", + "Epoch 18/20 Iteration 3137/3560 Training loss: 2.0073 0.0642 sec/batch\n", + "Epoch 18/20 Iteration 3138/3560 Training loss: 2.0072 0.0655 sec/batch\n", + "Epoch 18/20 Iteration 3139/3560 Training loss: 2.0070 0.0661 sec/batch\n", + "Epoch 18/20 Iteration 3140/3560 Training loss: 2.0069 0.0621 sec/batch\n", + "Epoch 18/20 Iteration 3141/3560 Training loss: 2.0067 0.0637 sec/batch\n", + "Epoch 18/20 Iteration 3142/3560 Training loss: 2.0064 0.0662 sec/batch\n", + "Epoch 18/20 Iteration 3143/3560 Training loss: 2.0063 0.0674 sec/batch\n", + "Epoch 18/20 Iteration 3144/3560 Training loss: 2.0063 0.0633 sec/batch\n", + "Epoch 18/20 Iteration 3145/3560 Training loss: 2.0063 0.0650 sec/batch\n", + "Epoch 18/20 
Iteration 3146/3560 Training loss: 2.0062 0.0629 sec/batch\n", + "Epoch 18/20 Iteration 3147/3560 Training loss: 2.0063 0.0734 sec/batch\n", + "Epoch 18/20 Iteration 3148/3560 Training loss: 2.0061 0.0652 sec/batch\n", + "Epoch 18/20 Iteration 3149/3560 Training loss: 2.0060 0.0644 sec/batch\n", + "Epoch 18/20 Iteration 3150/3560 Training loss: 2.0061 0.0724 sec/batch\n", + "Epoch 18/20 Iteration 3151/3560 Training loss: 2.0061 0.0681 sec/batch\n", + "Epoch 18/20 Iteration 3152/3560 Training loss: 2.0058 0.0658 sec/batch\n", + "Epoch 18/20 Iteration 3153/3560 Training loss: 2.0058 0.0730 sec/batch\n", + "Epoch 18/20 Iteration 3154/3560 Training loss: 2.0059 0.0677 sec/batch\n", + "Epoch 18/20 Iteration 3155/3560 Training loss: 2.0059 0.0643 sec/batch\n", + "Epoch 18/20 Iteration 3156/3560 Training loss: 2.0059 0.0655 sec/batch\n", + "Epoch 18/20 Iteration 3157/3560 Training loss: 2.0057 0.0683 sec/batch\n", + "Epoch 18/20 Iteration 3158/3560 Training loss: 2.0054 0.0672 sec/batch\n", + "Epoch 18/20 Iteration 3159/3560 Training loss: 2.0055 0.0638 sec/batch\n", + "Epoch 18/20 Iteration 3160/3560 Training loss: 2.0055 0.0637 sec/batch\n", + "Epoch 18/20 Iteration 3161/3560 Training loss: 2.0055 0.0665 sec/batch\n", + "Epoch 18/20 Iteration 3162/3560 Training loss: 2.0056 0.0646 sec/batch\n", + "Epoch 18/20 Iteration 3163/3560 Training loss: 2.0055 0.0668 sec/batch\n", + "Epoch 18/20 Iteration 3164/3560 Training loss: 2.0056 0.0646 sec/batch\n", + "Epoch 18/20 Iteration 3165/3560 Training loss: 2.0058 0.0650 sec/batch\n", + "Epoch 18/20 Iteration 3166/3560 Training loss: 2.0057 0.0633 sec/batch\n", + "Epoch 18/20 Iteration 3167/3560 Training loss: 2.0060 0.0630 sec/batch\n", + "Epoch 18/20 Iteration 3168/3560 Training loss: 2.0059 0.0656 sec/batch\n", + "Epoch 18/20 Iteration 3169/3560 Training loss: 2.0058 0.0634 sec/batch\n", + "Epoch 18/20 Iteration 3170/3560 Training loss: 2.0057 0.0678 sec/batch\n", + "Epoch 18/20 Iteration 3171/3560 Training loss: 2.0056 0.0653 
sec/batch\n", + "Epoch 18/20 Iteration 3172/3560 Training loss: 2.0057 0.0678 sec/batch\n", + "Epoch 18/20 Iteration 3173/3560 Training loss: 2.0056 0.0694 sec/batch\n", + "Epoch 18/20 Iteration 3174/3560 Training loss: 2.0059 0.0690 sec/batch\n", + "Epoch 18/20 Iteration 3175/3560 Training loss: 2.0059 0.0727 sec/batch\n", + "Epoch 18/20 Iteration 3176/3560 Training loss: 2.0058 0.0628 sec/batch\n", + "Epoch 18/20 Iteration 3177/3560 Training loss: 2.0057 0.0679 sec/batch\n", + "Epoch 18/20 Iteration 3178/3560 Training loss: 2.0059 0.0687 sec/batch\n", + "Epoch 18/20 Iteration 3179/3560 Training loss: 2.0059 0.0730 sec/batch\n", + "Epoch 18/20 Iteration 3180/3560 Training loss: 2.0060 0.0698 sec/batch\n", + "Epoch 18/20 Iteration 3181/3560 Training loss: 2.0060 0.0692 sec/batch\n", + "Epoch 18/20 Iteration 3182/3560 Training loss: 2.0059 0.0670 sec/batch\n", + "Epoch 18/20 Iteration 3183/3560 Training loss: 2.0059 0.0661 sec/batch\n", + "Epoch 18/20 Iteration 3184/3560 Training loss: 2.0058 0.0648 sec/batch\n", + "Epoch 18/20 Iteration 3185/3560 Training loss: 2.0057 0.0648 sec/batch\n", + "Epoch 18/20 Iteration 3186/3560 Training loss: 2.0059 0.0642 sec/batch\n", + "Epoch 18/20 Iteration 3187/3560 Training loss: 2.0059 0.0637 sec/batch\n", + "Epoch 18/20 Iteration 3188/3560 Training loss: 2.0058 0.0741 sec/batch\n", + "Epoch 18/20 Iteration 3189/3560 Training loss: 2.0058 0.0702 sec/batch\n", + "Epoch 18/20 Iteration 3190/3560 Training loss: 2.0058 0.0645 sec/batch\n", + "Epoch 18/20 Iteration 3191/3560 Training loss: 2.0059 0.0642 sec/batch\n", + "Epoch 18/20 Iteration 3192/3560 Training loss: 2.0058 0.0726 sec/batch\n", + "Epoch 18/20 Iteration 3193/3560 Training loss: 2.0058 0.0717 sec/batch\n", + "Epoch 18/20 Iteration 3194/3560 Training loss: 2.0061 0.0634 sec/batch\n", + "Epoch 18/20 Iteration 3195/3560 Training loss: 2.0060 0.0645 sec/batch\n", + "Epoch 18/20 Iteration 3196/3560 Training loss: 2.0060 0.0698 sec/batch\n", + "Epoch 18/20 Iteration 3197/3560 
Training loss: 2.0059 0.0683 sec/batch\n", + "Epoch 18/20 Iteration 3198/3560 Training loss: 2.0057 0.0658 sec/batch\n", + "Epoch 18/20 Iteration 3199/3560 Training loss: 2.0057 0.0693 sec/batch\n", + "Epoch 18/20 Iteration 3200/3560 Training loss: 2.0057 0.0643 sec/batch\n", + "Epoch 18/20 Iteration 3201/3560 Training loss: 2.0058 0.0632 sec/batch\n", + "Epoch 18/20 Iteration 3202/3560 Training loss: 2.0058 0.0724 sec/batch\n", + "Epoch 18/20 Iteration 3203/3560 Training loss: 2.0056 0.0633 sec/batch\n", + "Epoch 18/20 Iteration 3204/3560 Training loss: 2.0057 0.0760 sec/batch\n", + "Epoch 19/20 Iteration 3205/3560 Training loss: 2.0792 0.0694 sec/batch\n", + "Epoch 19/20 Iteration 3206/3560 Training loss: 2.0364 0.0684 sec/batch\n", + "Epoch 19/20 Iteration 3207/3560 Training loss: 2.0221 0.0742 sec/batch\n", + "Epoch 19/20 Iteration 3208/3560 Training loss: 2.0152 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3209/3560 Training loss: 2.0123 0.0682 sec/batch\n", + "Epoch 19/20 Iteration 3210/3560 Training loss: 2.0027 0.0702 sec/batch\n", + "Epoch 19/20 Iteration 3211/3560 Training loss: 2.0024 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3212/3560 Training loss: 2.0024 0.0673 sec/batch\n", + "Epoch 19/20 Iteration 3213/3560 Training loss: 2.0039 0.0665 sec/batch\n", + "Epoch 19/20 Iteration 3214/3560 Training loss: 2.0045 0.0648 sec/batch\n", + "Epoch 19/20 Iteration 3215/3560 Training loss: 2.0015 0.0665 sec/batch\n", + "Epoch 19/20 Iteration 3216/3560 Training loss: 1.9999 0.0712 sec/batch\n", + "Epoch 19/20 Iteration 3217/3560 Training loss: 1.9996 0.0679 sec/batch\n", + "Epoch 19/20 Iteration 3218/3560 Training loss: 2.0026 0.0687 sec/batch\n", + "Epoch 19/20 Iteration 3219/3560 Training loss: 2.0026 0.0664 sec/batch\n", + "Epoch 19/20 Iteration 3220/3560 Training loss: 2.0016 0.0646 sec/batch\n", + "Epoch 19/20 Iteration 3221/3560 Training loss: 2.0017 0.0683 sec/batch\n", + "Epoch 19/20 Iteration 3222/3560 Training loss: 2.0040 0.0648 sec/batch\n", + 
"Epoch 19/20 Iteration 3223/3560 Training loss: 2.0038 0.0771 sec/batch\n", + "Epoch 19/20 Iteration 3224/3560 Training loss: 2.0036 0.0692 sec/batch\n", + "Epoch 19/20 Iteration 3225/3560 Training loss: 2.0023 0.0678 sec/batch\n", + "Epoch 19/20 Iteration 3226/3560 Training loss: 2.0035 0.0681 sec/batch\n", + "Epoch 19/20 Iteration 3227/3560 Training loss: 2.0031 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3228/3560 Training loss: 2.0023 0.0698 sec/batch\n", + "Epoch 19/20 Iteration 3229/3560 Training loss: 2.0022 0.0806 sec/batch\n", + "Epoch 19/20 Iteration 3230/3560 Training loss: 2.0013 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3231/3560 Training loss: 2.0003 0.0667 sec/batch\n", + "Epoch 19/20 Iteration 3232/3560 Training loss: 2.0007 0.0691 sec/batch\n", + "Epoch 19/20 Iteration 3233/3560 Training loss: 2.0014 0.0637 sec/batch\n", + "Epoch 19/20 Iteration 3234/3560 Training loss: 2.0018 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3235/3560 Training loss: 2.0018 0.0636 sec/batch\n", + "Epoch 19/20 Iteration 3236/3560 Training loss: 2.0012 0.0680 sec/batch\n", + "Epoch 19/20 Iteration 3237/3560 Training loss: 2.0011 0.0776 sec/batch\n", + "Epoch 19/20 Iteration 3238/3560 Training loss: 2.0018 0.0687 sec/batch\n", + "Epoch 19/20 Iteration 3239/3560 Training loss: 2.0013 0.0719 sec/batch\n", + "Epoch 19/20 Iteration 3240/3560 Training loss: 2.0014 0.0618 sec/batch\n", + "Epoch 19/20 Iteration 3241/3560 Training loss: 2.0011 0.0713 sec/batch\n", + "Epoch 19/20 Iteration 3242/3560 Training loss: 2.0000 0.0672 sec/batch\n", + "Epoch 19/20 Iteration 3243/3560 Training loss: 1.9990 0.0645 sec/batch\n", + "Epoch 19/20 Iteration 3244/3560 Training loss: 1.9984 0.0712 sec/batch\n", + "Epoch 19/20 Iteration 3245/3560 Training loss: 1.9980 0.0666 sec/batch\n", + "Epoch 19/20 Iteration 3246/3560 Training loss: 1.9981 0.0696 sec/batch\n", + "Epoch 19/20 Iteration 3247/3560 Training loss: 1.9975 0.0682 sec/batch\n", + "Epoch 19/20 Iteration 3248/3560 Training loss: 
1.9968 0.0679 sec/batch\n", + "Epoch 19/20 Iteration 3249/3560 Training loss: 1.9969 0.0701 sec/batch\n", + "Epoch 19/20 Iteration 3250/3560 Training loss: 1.9956 0.0734 sec/batch\n", + "Epoch 19/20 Iteration 3251/3560 Training loss: 1.9956 0.0660 sec/batch\n", + "Epoch 19/20 Iteration 3252/3560 Training loss: 1.9951 0.0686 sec/batch\n", + "Epoch 19/20 Iteration 3253/3560 Training loss: 1.9949 0.0630 sec/batch\n", + "Epoch 19/20 Iteration 3254/3560 Training loss: 1.9957 0.0634 sec/batch\n", + "Epoch 19/20 Iteration 3255/3560 Training loss: 1.9951 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3256/3560 Training loss: 1.9957 0.0763 sec/batch\n", + "Epoch 19/20 Iteration 3257/3560 Training loss: 1.9957 0.0650 sec/batch\n", + "Epoch 19/20 Iteration 3258/3560 Training loss: 1.9956 0.0633 sec/batch\n", + "Epoch 19/20 Iteration 3259/3560 Training loss: 1.9952 0.0638 sec/batch\n", + "Epoch 19/20 Iteration 3260/3560 Training loss: 1.9953 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3261/3560 Training loss: 1.9955 0.0756 sec/batch\n", + "Epoch 19/20 Iteration 3262/3560 Training loss: 1.9954 0.0640 sec/batch\n", + "Epoch 19/20 Iteration 3263/3560 Training loss: 1.9951 0.0731 sec/batch\n", + "Epoch 19/20 Iteration 3264/3560 Training loss: 1.9955 0.0867 sec/batch\n", + "Epoch 19/20 Iteration 3265/3560 Training loss: 1.9954 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3266/3560 Training loss: 1.9960 0.0632 sec/batch\n", + "Epoch 19/20 Iteration 3267/3560 Training loss: 1.9963 0.0743 sec/batch\n", + "Epoch 19/20 Iteration 3268/3560 Training loss: 1.9964 0.0650 sec/batch\n", + "Epoch 19/20 Iteration 3269/3560 Training loss: 1.9963 0.0745 sec/batch\n", + "Epoch 19/20 Iteration 3270/3560 Training loss: 1.9965 0.0628 sec/batch\n", + "Epoch 19/20 Iteration 3271/3560 Training loss: 1.9967 0.0645 sec/batch\n", + "Epoch 19/20 Iteration 3272/3560 Training loss: 1.9962 0.0637 sec/batch\n", + "Epoch 19/20 Iteration 3273/3560 Training loss: 1.9959 0.0717 sec/batch\n", + "Epoch 19/20 
Iteration 3274/3560 Training loss: 1.9958 0.0709 sec/batch\n", + "Epoch 19/20 Iteration 3275/3560 Training loss: 1.9962 0.0680 sec/batch\n", + "Epoch 19/20 Iteration 3276/3560 Training loss: 1.9964 0.0665 sec/batch\n", + "Epoch 19/20 Iteration 3277/3560 Training loss: 1.9968 0.0676 sec/batch\n", + "Epoch 19/20 Iteration 3278/3560 Training loss: 1.9964 0.0676 sec/batch\n", + "Epoch 19/20 Iteration 3279/3560 Training loss: 1.9965 0.0672 sec/batch\n", + "Epoch 19/20 Iteration 3280/3560 Training loss: 1.9967 0.0700 sec/batch\n", + "Epoch 19/20 Iteration 3281/3560 Training loss: 1.9966 0.0668 sec/batch\n", + "Epoch 19/20 Iteration 3282/3560 Training loss: 1.9967 0.0663 sec/batch\n", + "Epoch 19/20 Iteration 3283/3560 Training loss: 1.9963 0.0662 sec/batch\n", + "Epoch 19/20 Iteration 3284/3560 Training loss: 1.9961 0.0681 sec/batch\n", + "Epoch 19/20 Iteration 3285/3560 Training loss: 1.9958 0.0624 sec/batch\n", + "Epoch 19/20 Iteration 3286/3560 Training loss: 1.9959 0.0719 sec/batch\n", + "Epoch 19/20 Iteration 3287/3560 Training loss: 1.9955 0.0686 sec/batch\n", + "Epoch 19/20 Iteration 3288/3560 Training loss: 1.9954 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3289/3560 Training loss: 1.9948 0.0634 sec/batch\n", + "Epoch 19/20 Iteration 3290/3560 Training loss: 1.9946 0.0646 sec/batch\n", + "Epoch 19/20 Iteration 3291/3560 Training loss: 1.9944 0.0717 sec/batch\n", + "Epoch 19/20 Iteration 3292/3560 Training loss: 1.9942 0.0745 sec/batch\n", + "Epoch 19/20 Iteration 3293/3560 Training loss: 1.9938 0.0661 sec/batch\n", + "Epoch 19/20 Iteration 3294/3560 Training loss: 1.9938 0.0652 sec/batch\n", + "Epoch 19/20 Iteration 3295/3560 Training loss: 1.9936 0.0694 sec/batch\n", + "Epoch 19/20 Iteration 3296/3560 Training loss: 1.9935 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3297/3560 Training loss: 1.9930 0.0649 sec/batch\n", + "Epoch 19/20 Iteration 3298/3560 Training loss: 1.9928 0.0643 sec/batch\n", + "Epoch 19/20 Iteration 3299/3560 Training loss: 1.9924 0.0662 
sec/batch\n", + "Epoch 19/20 Iteration 3300/3560 Training loss: 1.9924 0.0648 sec/batch\n", + "Epoch 19/20 Iteration 3301/3560 Training loss: 1.9924 0.0691 sec/batch\n", + "Epoch 19/20 Iteration 3302/3560 Training loss: 1.9921 0.0650 sec/batch\n", + "Epoch 19/20 Iteration 3303/3560 Training loss: 1.9917 0.0642 sec/batch\n", + "Epoch 19/20 Iteration 3304/3560 Training loss: 1.9913 0.0678 sec/batch\n", + "Epoch 19/20 Iteration 3305/3560 Training loss: 1.9913 0.0662 sec/batch\n", + "Epoch 19/20 Iteration 3306/3560 Training loss: 1.9913 0.0650 sec/batch\n", + "Epoch 19/20 Iteration 3307/3560 Training loss: 1.9910 0.0649 sec/batch\n", + "Epoch 19/20 Iteration 3308/3560 Training loss: 1.9908 0.0644 sec/batch\n", + "Epoch 19/20 Iteration 3309/3560 Training loss: 1.9906 0.0631 sec/batch\n", + "Epoch 19/20 Iteration 3310/3560 Training loss: 1.9905 0.0665 sec/batch\n", + "Epoch 19/20 Iteration 3311/3560 Training loss: 1.9905 0.0680 sec/batch\n", + "Epoch 19/20 Iteration 3312/3560 Training loss: 1.9903 0.0668 sec/batch\n", + "Epoch 19/20 Iteration 3313/3560 Training loss: 1.9903 0.0642 sec/batch\n", + "Epoch 19/20 Iteration 3314/3560 Training loss: 1.9903 0.0637 sec/batch\n", + "Epoch 19/20 Iteration 3315/3560 Training loss: 1.9903 0.0707 sec/batch\n", + "Epoch 19/20 Iteration 3316/3560 Training loss: 1.9900 0.0708 sec/batch\n", + "Epoch 19/20 Iteration 3317/3560 Training loss: 1.9899 0.0645 sec/batch\n", + "Epoch 19/20 Iteration 3318/3560 Training loss: 1.9898 0.0702 sec/batch\n", + "Epoch 19/20 Iteration 3319/3560 Training loss: 1.9896 0.0665 sec/batch\n", + "Epoch 19/20 Iteration 3320/3560 Training loss: 1.9892 0.0714 sec/batch\n", + "Epoch 19/20 Iteration 3321/3560 Training loss: 1.9892 0.0646 sec/batch\n", + "Epoch 19/20 Iteration 3322/3560 Training loss: 1.9890 0.0671 sec/batch\n", + "Epoch 19/20 Iteration 3323/3560 Training loss: 1.9890 0.0630 sec/batch\n", + "Epoch 19/20 Iteration 3324/3560 Training loss: 1.9889 0.0675 sec/batch\n", + "Epoch 19/20 Iteration 3325/3560 
Training loss: 1.9890 0.0656 sec/batch\n", + "Epoch 19/20 Iteration 3326/3560 Training loss: 1.9888 0.0692 sec/batch\n", + "Epoch 19/20 Iteration 3327/3560 Training loss: 1.9886 0.0690 sec/batch\n", + "Epoch 19/20 Iteration 3328/3560 Training loss: 1.9887 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3329/3560 Training loss: 1.9888 0.0738 sec/batch\n", + "Epoch 19/20 Iteration 3330/3560 Training loss: 1.9885 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3331/3560 Training loss: 1.9886 0.0674 sec/batch\n", + "Epoch 19/20 Iteration 3332/3560 Training loss: 1.9886 0.0662 sec/batch\n", + "Epoch 19/20 Iteration 3333/3560 Training loss: 1.9886 0.0667 sec/batch\n", + "Epoch 19/20 Iteration 3334/3560 Training loss: 1.9886 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3335/3560 Training loss: 1.9883 0.0718 sec/batch\n", + "Epoch 19/20 Iteration 3336/3560 Training loss: 1.9881 0.0713 sec/batch\n", + "Epoch 19/20 Iteration 3337/3560 Training loss: 1.9881 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3338/3560 Training loss: 1.9881 0.0674 sec/batch\n", + "Epoch 19/20 Iteration 3339/3560 Training loss: 1.9880 0.0649 sec/batch\n", + "Epoch 19/20 Iteration 3340/3560 Training loss: 1.9880 0.0702 sec/batch\n", + "Epoch 19/20 Iteration 3341/3560 Training loss: 1.9881 0.0695 sec/batch\n", + "Epoch 19/20 Iteration 3342/3560 Training loss: 1.9881 0.0716 sec/batch\n", + "Epoch 19/20 Iteration 3343/3560 Training loss: 1.9884 0.0693 sec/batch\n", + "Epoch 19/20 Iteration 3344/3560 Training loss: 1.9881 0.0670 sec/batch\n", + "Epoch 19/20 Iteration 3345/3560 Training loss: 1.9883 0.0708 sec/batch\n", + "Epoch 19/20 Iteration 3346/3560 Training loss: 1.9882 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3347/3560 Training loss: 1.9882 0.0691 sec/batch\n", + "Epoch 19/20 Iteration 3348/3560 Training loss: 1.9882 0.0674 sec/batch\n", + "Epoch 19/20 Iteration 3349/3560 Training loss: 1.9881 0.0660 sec/batch\n", + "Epoch 19/20 Iteration 3350/3560 Training loss: 1.9881 0.0678 sec/batch\n", + 
"Epoch 19/20 Iteration 3351/3560 Training loss: 1.9882 0.0648 sec/batch\n", + "Epoch 19/20 Iteration 3352/3560 Training loss: 1.9883 0.0764 sec/batch\n", + "Epoch 19/20 Iteration 3353/3560 Training loss: 1.9882 0.0657 sec/batch\n", + "Epoch 19/20 Iteration 3354/3560 Training loss: 1.9882 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3355/3560 Training loss: 1.9880 0.0705 sec/batch\n", + "Epoch 19/20 Iteration 3356/3560 Training loss: 1.9882 0.0697 sec/batch\n", + "Epoch 19/20 Iteration 3357/3560 Training loss: 1.9882 0.0656 sec/batch\n", + "Epoch 19/20 Iteration 3358/3560 Training loss: 1.9882 0.0675 sec/batch\n", + "Epoch 19/20 Iteration 3359/3560 Training loss: 1.9881 0.0656 sec/batch\n", + "Epoch 19/20 Iteration 3360/3560 Training loss: 1.9881 0.0753 sec/batch\n", + "Epoch 19/20 Iteration 3361/3560 Training loss: 1.9881 0.0655 sec/batch\n", + "Epoch 19/20 Iteration 3362/3560 Training loss: 1.9881 0.0657 sec/batch\n", + "Epoch 19/20 Iteration 3363/3560 Training loss: 1.9879 0.0761 sec/batch\n", + "Epoch 19/20 Iteration 3364/3560 Training loss: 1.9880 0.0672 sec/batch\n", + "Epoch 19/20 Iteration 3365/3560 Training loss: 1.9881 0.0648 sec/batch\n", + "Epoch 19/20 Iteration 3366/3560 Training loss: 1.9880 0.0728 sec/batch\n", + "Epoch 19/20 Iteration 3367/3560 Training loss: 1.9881 0.0719 sec/batch\n", + "Epoch 19/20 Iteration 3368/3560 Training loss: 1.9882 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3369/3560 Training loss: 1.9882 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3370/3560 Training loss: 1.9881 0.0682 sec/batch\n", + "Epoch 19/20 Iteration 3371/3560 Training loss: 1.9881 0.0677 sec/batch\n", + "Epoch 19/20 Iteration 3372/3560 Training loss: 1.9883 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3373/3560 Training loss: 1.9883 0.0680 sec/batch\n", + "Epoch 19/20 Iteration 3374/3560 Training loss: 1.9883 0.0686 sec/batch\n", + "Epoch 19/20 Iteration 3375/3560 Training loss: 1.9882 0.0652 sec/batch\n", + "Epoch 19/20 Iteration 3376/3560 Training loss: 
1.9880 0.0641 sec/batch\n", + "Epoch 19/20 Iteration 3377/3560 Training loss: 1.9881 0.0650 sec/batch\n", + "Epoch 19/20 Iteration 3378/3560 Training loss: 1.9881 0.0741 sec/batch\n", + "Epoch 19/20 Iteration 3379/3560 Training loss: 1.9882 0.0652 sec/batch\n", + "Epoch 19/20 Iteration 3380/3560 Training loss: 1.9882 0.0742 sec/batch\n", + "Epoch 19/20 Iteration 3381/3560 Training loss: 1.9881 0.0649 sec/batch\n", + "Epoch 19/20 Iteration 3382/3560 Training loss: 1.9880 0.0671 sec/batch\n", + "Epoch 20/20 Iteration 3383/3560 Training loss: 2.0563 0.0700 sec/batch\n", + "Epoch 20/20 Iteration 3384/3560 Training loss: 2.0149 0.0650 sec/batch\n", + "Epoch 20/20 Iteration 3385/3560 Training loss: 1.9998 0.0728 sec/batch\n", + "Epoch 20/20 Iteration 3386/3560 Training loss: 1.9957 0.0650 sec/batch\n", + "Epoch 20/20 Iteration 3387/3560 Training loss: 1.9933 0.0704 sec/batch\n", + "Epoch 20/20 Iteration 3388/3560 Training loss: 1.9847 0.0736 sec/batch\n", + "Epoch 20/20 Iteration 3389/3560 Training loss: 1.9839 0.0680 sec/batch\n", + "Epoch 20/20 Iteration 3390/3560 Training loss: 1.9828 0.0651 sec/batch\n", + "Epoch 20/20 Iteration 3391/3560 Training loss: 1.9863 0.0743 sec/batch\n", + "Epoch 20/20 Iteration 3392/3560 Training loss: 1.9857 0.0660 sec/batch\n", + "Epoch 20/20 Iteration 3393/3560 Training loss: 1.9825 0.0712 sec/batch\n", + "Epoch 20/20 Iteration 3394/3560 Training loss: 1.9813 0.0676 sec/batch\n", + "Epoch 20/20 Iteration 3395/3560 Training loss: 1.9809 0.0778 sec/batch\n", + "Epoch 20/20 Iteration 3396/3560 Training loss: 1.9834 0.0727 sec/batch\n", + "Epoch 20/20 Iteration 3397/3560 Training loss: 1.9826 0.0640 sec/batch\n", + "Epoch 20/20 Iteration 3398/3560 Training loss: 1.9819 0.0630 sec/batch\n", + "Epoch 20/20 Iteration 3399/3560 Training loss: 1.9817 0.0655 sec/batch\n", + "Epoch 20/20 Iteration 3400/3560 Training loss: 1.9843 0.0656 sec/batch\n", + "Epoch 20/20 Iteration 3401/3560 Training loss: 1.9843 0.0669 sec/batch\n", + "Epoch 20/20 
Iteration 3402/3560 Training loss: 1.9843 0.0652 sec/batch\n", + "Epoch 20/20 Iteration 3403/3560 Training loss: 1.9832 0.0671 sec/batch\n", + "Epoch 20/20 Iteration 3404/3560 Training loss: 1.9841 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3405/3560 Training loss: 1.9835 0.0699 sec/batch\n", + "Epoch 20/20 Iteration 3406/3560 Training loss: 1.9828 0.0646 sec/batch\n", + "Epoch 20/20 Iteration 3407/3560 Training loss: 1.9824 0.0638 sec/batch\n", + "Epoch 20/20 Iteration 3408/3560 Training loss: 1.9818 0.0688 sec/batch\n", + "Epoch 20/20 Iteration 3409/3560 Training loss: 1.9816 0.0679 sec/batch\n", + "Epoch 20/20 Iteration 3410/3560 Training loss: 1.9824 0.0667 sec/batch\n", + "Epoch 20/20 Iteration 3411/3560 Training loss: 1.9835 0.0625 sec/batch\n", + "Epoch 20/20 Iteration 3412/3560 Training loss: 1.9839 0.0654 sec/batch\n", + "Epoch 20/20 Iteration 3413/3560 Training loss: 1.9838 0.0668 sec/batch\n", + "Epoch 20/20 Iteration 3414/3560 Training loss: 1.9829 0.0658 sec/batch\n", + "Epoch 20/20 Iteration 3415/3560 Training loss: 1.9830 0.0657 sec/batch\n", + "Epoch 20/20 Iteration 3416/3560 Training loss: 1.9838 0.0689 sec/batch\n", + "Epoch 20/20 Iteration 3417/3560 Training loss: 1.9835 0.0757 sec/batch\n", + "Epoch 20/20 Iteration 3418/3560 Training loss: 1.9832 0.0657 sec/batch\n", + "Epoch 20/20 Iteration 3419/3560 Training loss: 1.9829 0.0680 sec/batch\n", + "Epoch 20/20 Iteration 3420/3560 Training loss: 1.9818 0.0698 sec/batch\n", + "Epoch 20/20 Iteration 3421/3560 Training loss: 1.9811 0.0644 sec/batch\n", + "Epoch 20/20 Iteration 3422/3560 Training loss: 1.9806 0.0624 sec/batch\n", + "Epoch 20/20 Iteration 3423/3560 Training loss: 1.9800 0.0651 sec/batch\n", + "Epoch 20/20 Iteration 3424/3560 Training loss: 1.9799 0.0724 sec/batch\n", + "Epoch 20/20 Iteration 3425/3560 Training loss: 1.9793 0.0647 sec/batch\n", + "Epoch 20/20 Iteration 3426/3560 Training loss: 1.9787 0.0659 sec/batch\n", + "Epoch 20/20 Iteration 3427/3560 Training loss: 1.9788 0.0639 
sec/batch\n", + "Epoch 20/20 Iteration 3428/3560 Training loss: 1.9775 0.0710 sec/batch\n", + "...\n", + "Epoch 20/20 Iteration 3560/3560 Training loss: 1.9727 0.0628 sec/batch\n", + "Epoch 1/20 Iteration 1/3560 Training loss: 4.4207 0.5971 sec/batch\n", + "...\n", + "Epoch 1/20 Iteration 178/3560 Training loss: 2.8033 0.0546 sec/batch\n", + "Epoch 2/20 Iteration 179/3560 Training loss: 2.3837 0.0500 sec/batch\n", + "...\n", + "Epoch 2/20 Iteration 356/3560 Training loss: 2.1902 0.0494 sec/batch\n", + "Epoch 3/20 Iteration 357/3560 Training loss: 2.1331 0.0483 sec/batch\n", + "...\n", + "Epoch 3/20 Iteration 390/3560 Training loss: 2.0583 0.0505 sec/batch\n", + "Epoch 
3/20 Iteration 391/3560 Training loss: 2.0574 0.0512 sec/batch\n", + "Epoch 3/20 Iteration 392/3560 Training loss: 2.0567 0.0488 sec/batch\n", + "Epoch 3/20 Iteration 393/3560 Training loss: 2.0562 0.0539 sec/batch\n", + "Epoch 3/20 Iteration 394/3560 Training loss: 2.0548 0.0508 sec/batch\n", + "Epoch 3/20 Iteration 395/3560 Training loss: 2.0533 0.0502 sec/batch\n", + "Epoch 3/20 Iteration 396/3560 Training loss: 2.0521 0.0499 sec/batch\n", + "Epoch 3/20 Iteration 397/3560 Training loss: 2.0513 0.0586 sec/batch\n", + "Epoch 3/20 Iteration 398/3560 Training loss: 2.0507 0.0631 sec/batch\n", + "Epoch 3/20 Iteration 399/3560 Training loss: 2.0497 0.0510 sec/batch\n", + "Epoch 3/20 Iteration 400/3560 Training loss: 2.0488 0.0579 sec/batch\n", + "Epoch 3/20 Iteration 401/3560 Training loss: 2.0486 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 402/3560 Training loss: 2.0470 0.0546 sec/batch\n", + "Epoch 3/20 Iteration 403/3560 Training loss: 2.0465 0.0542 sec/batch\n", + "Epoch 3/20 Iteration 404/3560 Training loss: 2.0456 0.0525 sec/batch\n", + "Epoch 3/20 Iteration 405/3560 Training loss: 2.0450 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 406/3560 Training loss: 2.0453 0.0498 sec/batch\n", + "Epoch 3/20 Iteration 407/3560 Training loss: 2.0443 0.0497 sec/batch\n", + "Epoch 3/20 Iteration 408/3560 Training loss: 2.0446 0.0497 sec/batch\n", + "Epoch 3/20 Iteration 409/3560 Training loss: 2.0438 0.0522 sec/batch\n", + "Epoch 3/20 Iteration 410/3560 Training loss: 2.0432 0.0535 sec/batch\n", + "Epoch 3/20 Iteration 411/3560 Training loss: 2.0425 0.0536 sec/batch\n", + "Epoch 3/20 Iteration 412/3560 Training loss: 2.0423 0.0540 sec/batch\n", + "Epoch 3/20 Iteration 413/3560 Training loss: 2.0420 0.0580 sec/batch\n", + "Epoch 3/20 Iteration 414/3560 Training loss: 2.0413 0.0506 sec/batch\n", + "Epoch 3/20 Iteration 415/3560 Training loss: 2.0405 0.0576 sec/batch\n", + "Epoch 3/20 Iteration 416/3560 Training loss: 2.0407 0.0504 sec/batch\n", + "Epoch 3/20 Iteration 417/3560 
Training loss: 2.0400 0.0495 sec/batch\n", + "Epoch 3/20 Iteration 418/3560 Training loss: 2.0400 0.0551 sec/batch\n", + "Epoch 3/20 Iteration 419/3560 Training loss: 2.0400 0.0520 sec/batch\n", + "Epoch 3/20 Iteration 420/3560 Training loss: 2.0398 0.0547 sec/batch\n", + "Epoch 3/20 Iteration 421/3560 Training loss: 2.0389 0.0526 sec/batch\n", + "Epoch 3/20 Iteration 422/3560 Training loss: 2.0388 0.0520 sec/batch\n", + "Epoch 3/20 Iteration 423/3560 Training loss: 2.0385 0.0494 sec/batch\n", + "Epoch 3/20 Iteration 424/3560 Training loss: 2.0376 0.0527 sec/batch\n", + "Epoch 3/20 Iteration 425/3560 Training loss: 2.0371 0.0553 sec/batch\n", + "Epoch 3/20 Iteration 426/3560 Training loss: 2.0366 0.0533 sec/batch\n", + "Epoch 3/20 Iteration 427/3560 Training loss: 2.0366 0.0535 sec/batch\n", + "Epoch 3/20 Iteration 428/3560 Training loss: 2.0363 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 429/3560 Training loss: 2.0362 0.0487 sec/batch\n", + "Epoch 3/20 Iteration 430/3560 Training loss: 2.0356 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 431/3560 Training loss: 2.0350 0.0533 sec/batch\n", + "Epoch 3/20 Iteration 432/3560 Training loss: 2.0350 0.0548 sec/batch\n", + "Epoch 3/20 Iteration 433/3560 Training loss: 2.0345 0.0509 sec/batch\n", + "Epoch 3/20 Iteration 434/3560 Training loss: 2.0344 0.0630 sec/batch\n", + "Epoch 3/20 Iteration 435/3560 Training loss: 2.0335 0.0504 sec/batch\n", + "Epoch 3/20 Iteration 436/3560 Training loss: 2.0330 0.0530 sec/batch\n", + "Epoch 3/20 Iteration 437/3560 Training loss: 2.0323 0.0514 sec/batch\n", + "Epoch 3/20 Iteration 438/3560 Training loss: 2.0321 0.0570 sec/batch\n", + "Epoch 3/20 Iteration 439/3560 Training loss: 2.0313 0.0529 sec/batch\n", + "Epoch 3/20 Iteration 440/3560 Training loss: 2.0307 0.0508 sec/batch\n", + "Epoch 3/20 Iteration 441/3560 Training loss: 2.0297 0.0521 sec/batch\n", + "Epoch 3/20 Iteration 442/3560 Training loss: 2.0289 0.0540 sec/batch\n", + "Epoch 3/20 Iteration 443/3560 Training loss: 2.0284 
0.0526 sec/batch\n", + "Epoch 3/20 Iteration 444/3560 Training loss: 2.0277 0.0498 sec/batch\n", + "Epoch 3/20 Iteration 445/3560 Training loss: 2.0269 0.0511 sec/batch\n", + "Epoch 3/20 Iteration 446/3560 Training loss: 2.0265 0.0530 sec/batch\n", + "Epoch 3/20 Iteration 447/3560 Training loss: 2.0258 0.0539 sec/batch\n", + "Epoch 3/20 Iteration 448/3560 Training loss: 2.0253 0.0509 sec/batch\n", + "Epoch 3/20 Iteration 449/3560 Training loss: 2.0244 0.0506 sec/batch\n", + "Epoch 3/20 Iteration 450/3560 Training loss: 2.0237 0.0488 sec/batch\n", + "Epoch 3/20 Iteration 451/3560 Training loss: 2.0229 0.0495 sec/batch\n", + "Epoch 3/20 Iteration 452/3560 Training loss: 2.0224 0.0583 sec/batch\n", + "Epoch 3/20 Iteration 453/3560 Training loss: 2.0218 0.0509 sec/batch\n", + "Epoch 3/20 Iteration 454/3560 Training loss: 2.0211 0.0528 sec/batch\n", + "Epoch 3/20 Iteration 455/3560 Training loss: 2.0203 0.0505 sec/batch\n", + "Epoch 3/20 Iteration 456/3560 Training loss: 2.0195 0.0508 sec/batch\n", + "Epoch 3/20 Iteration 457/3560 Training loss: 2.0191 0.0551 sec/batch\n", + "Epoch 3/20 Iteration 458/3560 Training loss: 2.0187 0.0545 sec/batch\n", + "Epoch 3/20 Iteration 459/3560 Training loss: 2.0179 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 460/3560 Training loss: 2.0174 0.0536 sec/batch\n", + "Epoch 3/20 Iteration 461/3560 Training loss: 2.0166 0.0503 sec/batch\n", + "Epoch 3/20 Iteration 462/3560 Training loss: 2.0161 0.0555 sec/batch\n", + "Epoch 3/20 Iteration 463/3560 Training loss: 2.0156 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 464/3560 Training loss: 2.0152 0.0516 sec/batch\n", + "Epoch 3/20 Iteration 465/3560 Training loss: 2.0148 0.0483 sec/batch\n", + "Epoch 3/20 Iteration 466/3560 Training loss: 2.0143 0.0487 sec/batch\n", + "Epoch 3/20 Iteration 467/3560 Training loss: 2.0139 0.0544 sec/batch\n", + "Epoch 3/20 Iteration 468/3560 Training loss: 2.0133 0.0506 sec/batch\n", + "Epoch 3/20 Iteration 469/3560 Training loss: 2.0128 0.0522 sec/batch\n", + 
"Epoch 3/20 Iteration 470/3560 Training loss: 2.0122 0.0484 sec/batch\n", + "Epoch 3/20 Iteration 471/3560 Training loss: 2.0116 0.0626 sec/batch\n", + "Epoch 3/20 Iteration 472/3560 Training loss: 2.0108 0.0590 sec/batch\n", + "Epoch 3/20 Iteration 473/3560 Training loss: 2.0102 0.0626 sec/batch\n", + "Epoch 3/20 Iteration 474/3560 Training loss: 2.0097 0.0489 sec/batch\n", + "Epoch 3/20 Iteration 475/3560 Training loss: 2.0093 0.0485 sec/batch\n", + "Epoch 3/20 Iteration 476/3560 Training loss: 2.0088 0.0514 sec/batch\n", + "Epoch 3/20 Iteration 477/3560 Training loss: 2.0084 0.0516 sec/batch\n", + "Epoch 3/20 Iteration 478/3560 Training loss: 2.0077 0.0500 sec/batch\n", + "Epoch 3/20 Iteration 479/3560 Training loss: 2.0071 0.0502 sec/batch\n", + "Epoch 3/20 Iteration 480/3560 Training loss: 2.0068 0.0495 sec/batch\n", + "Epoch 3/20 Iteration 481/3560 Training loss: 2.0063 0.0555 sec/batch\n", + "Epoch 3/20 Iteration 482/3560 Training loss: 2.0055 0.0528 sec/batch\n", + "Epoch 3/20 Iteration 483/3560 Training loss: 2.0052 0.0499 sec/batch\n", + "Epoch 3/20 Iteration 484/3560 Training loss: 2.0048 0.0486 sec/batch\n", + "Epoch 3/20 Iteration 485/3560 Training loss: 2.0043 0.0490 sec/batch\n", + "Epoch 3/20 Iteration 486/3560 Training loss: 2.0039 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 487/3560 Training loss: 2.0032 0.0521 sec/batch\n", + "Epoch 3/20 Iteration 488/3560 Training loss: 2.0024 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 489/3560 Training loss: 2.0020 0.0544 sec/batch\n", + "Epoch 3/20 Iteration 490/3560 Training loss: 2.0016 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 491/3560 Training loss: 2.0011 0.0540 sec/batch\n", + "Epoch 3/20 Iteration 492/3560 Training loss: 2.0008 0.0515 sec/batch\n", + "Epoch 3/20 Iteration 493/3560 Training loss: 2.0004 0.0503 sec/batch\n", + "Epoch 3/20 Iteration 494/3560 Training loss: 2.0001 0.0543 sec/batch\n", + "Epoch 3/20 Iteration 495/3560 Training loss: 1.9998 0.0493 sec/batch\n", + "Epoch 3/20 Iteration 
496/3560 Training loss: 1.9993 0.0538 sec/batch\n", + "Epoch 3/20 Iteration 497/3560 Training loss: 1.9991 0.0528 sec/batch\n", + "Epoch 3/20 Iteration 498/3560 Training loss: 1.9986 0.0548 sec/batch\n", + "Epoch 3/20 Iteration 499/3560 Training loss: 1.9982 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 500/3560 Training loss: 1.9977 0.0515 sec/batch\n", + "Epoch 3/20 Iteration 501/3560 Training loss: 1.9972 0.0566 sec/batch\n", + "Epoch 3/20 Iteration 502/3560 Training loss: 1.9969 0.0496 sec/batch\n", + "Epoch 3/20 Iteration 503/3560 Training loss: 1.9965 0.0507 sec/batch\n", + "Epoch 3/20 Iteration 504/3560 Training loss: 1.9963 0.0581 sec/batch\n", + "Epoch 3/20 Iteration 505/3560 Training loss: 1.9959 0.0496 sec/batch\n", + "Epoch 3/20 Iteration 506/3560 Training loss: 1.9952 0.0504 sec/batch\n", + "Epoch 3/20 Iteration 507/3560 Training loss: 1.9946 0.0526 sec/batch\n", + "Epoch 3/20 Iteration 508/3560 Training loss: 1.9944 0.0518 sec/batch\n", + "Epoch 3/20 Iteration 509/3560 Training loss: 1.9941 0.0512 sec/batch\n", + "Epoch 3/20 Iteration 510/3560 Training loss: 1.9938 0.0516 sec/batch\n", + "Epoch 3/20 Iteration 511/3560 Training loss: 1.9934 0.0497 sec/batch\n", + "Epoch 3/20 Iteration 512/3560 Training loss: 1.9930 0.0518 sec/batch\n", + "Epoch 3/20 Iteration 513/3560 Training loss: 1.9926 0.0576 sec/batch\n", + "Epoch 3/20 Iteration 514/3560 Training loss: 1.9922 0.0535 sec/batch\n", + "Epoch 3/20 Iteration 515/3560 Training loss: 1.9916 0.0500 sec/batch\n", + "Epoch 3/20 Iteration 516/3560 Training loss: 1.9914 0.0490 sec/batch\n", + "Epoch 3/20 Iteration 517/3560 Training loss: 1.9911 0.0608 sec/batch\n", + "Epoch 3/20 Iteration 518/3560 Training loss: 1.9906 0.0506 sec/batch\n", + "Epoch 3/20 Iteration 519/3560 Training loss: 1.9903 0.0573 sec/batch\n", + "Epoch 3/20 Iteration 520/3560 Training loss: 1.9899 0.0526 sec/batch\n", + "Epoch 3/20 Iteration 521/3560 Training loss: 1.9895 0.0512 sec/batch\n", + "Epoch 3/20 Iteration 522/3560 Training loss: 
1.9891 0.0511 sec/batch\n", + "Epoch 3/20 Iteration 523/3560 Training loss: 1.9888 0.0585 sec/batch\n", + "Epoch 3/20 Iteration 524/3560 Training loss: 1.9887 0.0501 sec/batch\n", + "Epoch 3/20 Iteration 525/3560 Training loss: 1.9883 0.0502 sec/batch\n", + "Epoch 3/20 Iteration 526/3560 Training loss: 1.9879 0.0502 sec/batch\n", + "Epoch 3/20 Iteration 527/3560 Training loss: 1.9874 0.0576 sec/batch\n", + "Epoch 3/20 Iteration 528/3560 Training loss: 1.9869 0.0497 sec/batch\n", + "Epoch 3/20 Iteration 529/3560 Training loss: 1.9866 0.0583 sec/batch\n", + "Epoch 3/20 Iteration 530/3560 Training loss: 1.9863 0.0505 sec/batch\n", + "Epoch 3/20 Iteration 531/3560 Training loss: 1.9860 0.0506 sec/batch\n", + "Epoch 3/20 Iteration 532/3560 Training loss: 1.9856 0.0613 sec/batch\n", + "Epoch 3/20 Iteration 533/3560 Training loss: 1.9850 0.0517 sec/batch\n", + "Epoch 3/20 Iteration 534/3560 Training loss: 1.9847 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 535/3560 Training loss: 1.9858 0.0495 sec/batch\n", + "Epoch 4/20 Iteration 536/3560 Training loss: 1.9434 0.0581 sec/batch\n", + "Epoch 4/20 Iteration 537/3560 Training loss: 1.9297 0.0514 sec/batch\n", + "Epoch 4/20 Iteration 538/3560 Training loss: 1.9235 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 539/3560 Training loss: 1.9195 0.0520 sec/batch\n", + "Epoch 4/20 Iteration 540/3560 Training loss: 1.9111 0.0559 sec/batch\n", + "Epoch 4/20 Iteration 541/3560 Training loss: 1.9101 0.0537 sec/batch\n", + "Epoch 4/20 Iteration 542/3560 Training loss: 1.9091 0.0505 sec/batch\n", + "Epoch 4/20 Iteration 543/3560 Training loss: 1.9101 0.0499 sec/batch\n", + "Epoch 4/20 Iteration 544/3560 Training loss: 1.9079 0.0546 sec/batch\n", + "Epoch 4/20 Iteration 545/3560 Training loss: 1.9037 0.0532 sec/batch\n", + "Epoch 4/20 Iteration 546/3560 Training loss: 1.9009 0.0550 sec/batch\n", + "Epoch 4/20 Iteration 547/3560 Training loss: 1.9008 0.0551 sec/batch\n", + "Epoch 4/20 Iteration 548/3560 Training loss: 1.9029 0.0508 
sec/batch\n", + "Epoch 4/20 Iteration 549/3560 Training loss: 1.9017 0.0630 sec/batch\n", + "Epoch 4/20 Iteration 550/3560 Training loss: 1.9002 0.0565 sec/batch\n", + "Epoch 4/20 Iteration 551/3560 Training loss: 1.8993 0.0498 sec/batch\n", + "Epoch 4/20 Iteration 552/3560 Training loss: 1.9015 0.0503 sec/batch\n", + "Epoch 4/20 Iteration 553/3560 Training loss: 1.9013 0.0525 sec/batch\n", + "Epoch 4/20 Iteration 554/3560 Training loss: 1.9011 0.0505 sec/batch\n", + "Epoch 4/20 Iteration 555/3560 Training loss: 1.8997 0.0526 sec/batch\n", + "Epoch 4/20 Iteration 556/3560 Training loss: 1.9005 0.0538 sec/batch\n", + "Epoch 4/20 Iteration 557/3560 Training loss: 1.8996 0.0616 sec/batch\n", + "Epoch 4/20 Iteration 558/3560 Training loss: 1.8987 0.0492 sec/batch\n", + "Epoch 4/20 Iteration 559/3560 Training loss: 1.8981 0.0533 sec/batch\n", + "Epoch 4/20 Iteration 560/3560 Training loss: 1.8969 0.0508 sec/batch\n", + "Epoch 4/20 Iteration 561/3560 Training loss: 1.8956 0.0518 sec/batch\n", + "Epoch 4/20 Iteration 562/3560 Training loss: 1.8959 0.0533 sec/batch\n", + "Epoch 4/20 Iteration 563/3560 Training loss: 1.8966 0.0570 sec/batch\n", + "Epoch 4/20 Iteration 564/3560 Training loss: 1.8964 0.0554 sec/batch\n", + "Epoch 4/20 Iteration 565/3560 Training loss: 1.8958 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 566/3560 Training loss: 1.8947 0.0506 sec/batch\n", + "Epoch 4/20 Iteration 567/3560 Training loss: 1.8943 0.0518 sec/batch\n", + "Epoch 4/20 Iteration 568/3560 Training loss: 1.8947 0.0501 sec/batch\n", + "Epoch 4/20 Iteration 569/3560 Training loss: 1.8943 0.0546 sec/batch\n", + "Epoch 4/20 Iteration 570/3560 Training loss: 1.8937 0.0582 sec/batch\n", + "Epoch 4/20 Iteration 571/3560 Training loss: 1.8929 0.0524 sec/batch\n", + "Epoch 4/20 Iteration 572/3560 Training loss: 1.8916 0.0502 sec/batch\n", + "Epoch 4/20 Iteration 573/3560 Training loss: 1.8903 0.0506 sec/batch\n", + "Epoch 4/20 Iteration 574/3560 Training loss: 1.8895 0.0497 sec/batch\n", + "Epoch 
4/20 Iteration 575/3560 Training loss: 1.8889 0.0510 sec/batch\n", + "Epoch 4/20 Iteration 576/3560 Training loss: 1.8889 0.0500 sec/batch\n", + "Epoch 4/20 Iteration 577/3560 Training loss: 1.8881 0.0501 sec/batch\n", + "Epoch 4/20 Iteration 578/3560 Training loss: 1.8872 0.0547 sec/batch\n", + "Epoch 4/20 Iteration 579/3560 Training loss: 1.8873 0.0529 sec/batch\n", + "Epoch 4/20 Iteration 580/3560 Training loss: 1.8859 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 581/3560 Training loss: 1.8857 0.0610 sec/batch\n", + "Epoch 4/20 Iteration 582/3560 Training loss: 1.8850 0.0554 sec/batch\n", + "Epoch 4/20 Iteration 583/3560 Training loss: 1.8848 0.0502 sec/batch\n", + "Epoch 4/20 Iteration 584/3560 Training loss: 1.8854 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 585/3560 Training loss: 1.8846 0.0525 sec/batch\n", + "Epoch 4/20 Iteration 586/3560 Training loss: 1.8853 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 587/3560 Training loss: 1.8849 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 588/3560 Training loss: 1.8848 0.0501 sec/batch\n", + "Epoch 4/20 Iteration 589/3560 Training loss: 1.8842 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 590/3560 Training loss: 1.8841 0.0600 sec/batch\n", + "Epoch 4/20 Iteration 591/3560 Training loss: 1.8841 0.0530 sec/batch\n", + "Epoch 4/20 Iteration 592/3560 Training loss: 1.8836 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 593/3560 Training loss: 1.8828 0.0539 sec/batch\n", + "Epoch 4/20 Iteration 594/3560 Training loss: 1.8832 0.0510 sec/batch\n", + "Epoch 4/20 Iteration 595/3560 Training loss: 1.8827 0.0514 sec/batch\n", + "Epoch 4/20 Iteration 596/3560 Training loss: 1.8831 0.0494 sec/batch\n", + "Epoch 4/20 Iteration 597/3560 Training loss: 1.8832 0.0551 sec/batch\n", + "Epoch 4/20 Iteration 598/3560 Training loss: 1.8831 0.0506 sec/batch\n", + "Epoch 4/20 Iteration 599/3560 Training loss: 1.8825 0.0501 sec/batch\n", + "Epoch 4/20 Iteration 600/3560 Training loss: 1.8826 0.0532 sec/batch\n", + "Epoch 4/20 Iteration 601/3560 
Training loss: 1.8825 0.0517 sec/batch\n", + "Epoch 4/20 Iteration 602/3560 Training loss: 1.8819 0.0552 sec/batch\n", + "Epoch 4/20 Iteration 603/3560 Training loss: 1.8816 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 604/3560 Training loss: 1.8811 0.0505 sec/batch\n", + "Epoch 4/20 Iteration 605/3560 Training loss: 1.8813 0.0520 sec/batch\n", + "Epoch 4/20 Iteration 606/3560 Training loss: 1.8812 0.0506 sec/batch\n", + "Epoch 4/20 Iteration 607/3560 Training loss: 1.8813 0.0488 sec/batch\n", + "Epoch 4/20 Iteration 608/3560 Training loss: 1.8806 0.0493 sec/batch\n", + "Epoch 4/20 Iteration 609/3560 Training loss: 1.8802 0.0519 sec/batch\n", + "Epoch 4/20 Iteration 610/3560 Training loss: 1.8801 0.0529 sec/batch\n", + "Epoch 4/20 Iteration 611/3560 Training loss: 1.8796 0.0511 sec/batch\n", + "Epoch 4/20 Iteration 612/3560 Training loss: 1.8794 0.0524 sec/batch\n", + "Epoch 4/20 Iteration 613/3560 Training loss: 1.8786 0.0499 sec/batch\n", + "Epoch 4/20 Iteration 614/3560 Training loss: 1.8781 0.0498 sec/batch\n", + "Epoch 4/20 Iteration 615/3560 Training loss: 1.8774 0.0498 sec/batch\n", + "Epoch 4/20 Iteration 616/3560 Training loss: 1.8773 0.0587 sec/batch\n", + "Epoch 4/20 Iteration 617/3560 Training loss: 1.8765 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 618/3560 Training loss: 1.8760 0.0520 sec/batch\n", + "Epoch 4/20 Iteration 619/3560 Training loss: 1.8752 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 620/3560 Training loss: 1.8745 0.0533 sec/batch\n", + "Epoch 4/20 Iteration 621/3560 Training loss: 1.8741 0.0517 sec/batch\n", + "Epoch 4/20 Iteration 622/3560 Training loss: 1.8737 0.0527 sec/batch\n", + "Epoch 4/20 Iteration 623/3560 Training loss: 1.8730 0.0556 sec/batch\n", + "Epoch 4/20 Iteration 624/3560 Training loss: 1.8728 0.0510 sec/batch\n", + "Epoch 4/20 Iteration 625/3560 Training loss: 1.8722 0.0503 sec/batch\n", + "Epoch 4/20 Iteration 626/3560 Training loss: 1.8717 0.0496 sec/batch\n", + "Epoch 4/20 Iteration 627/3560 Training loss: 1.8710 
0.0513 sec/batch\n", + "Epoch 4/20 Iteration 628/3560 Training loss: 1.8704 0.0538 sec/batch\n", + "Epoch 4/20 Iteration 629/3560 Training loss: 1.8698 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 630/3560 Training loss: 1.8695 0.0542 sec/batch\n", + "Epoch 4/20 Iteration 631/3560 Training loss: 1.8691 0.0502 sec/batch\n", + "Epoch 4/20 Iteration 632/3560 Training loss: 1.8684 0.0565 sec/batch\n", + "Epoch 4/20 Iteration 633/3560 Training loss: 1.8678 0.0553 sec/batch\n", + "Epoch 4/20 Iteration 634/3560 Training loss: 1.8670 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 635/3560 Training loss: 1.8667 0.0529 sec/batch\n", + "Epoch 4/20 Iteration 636/3560 Training loss: 1.8663 0.0577 sec/batch\n", + "Epoch 4/20 Iteration 637/3560 Training loss: 1.8657 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 638/3560 Training loss: 1.8652 0.0531 sec/batch\n", + "Epoch 4/20 Iteration 639/3560 Training loss: 1.8647 0.0504 sec/batch\n", + "Epoch 4/20 Iteration 640/3560 Training loss: 1.8643 0.0549 sec/batch\n", + "Epoch 4/20 Iteration 641/3560 Training loss: 1.8639 0.0508 sec/batch\n", + "Epoch 4/20 Iteration 642/3560 Training loss: 1.8636 0.0524 sec/batch\n", + "Epoch 4/20 Iteration 643/3560 Training loss: 1.8634 0.0530 sec/batch\n", + "Epoch 4/20 Iteration 644/3560 Training loss: 1.8631 0.0634 sec/batch\n", + "Epoch 4/20 Iteration 645/3560 Training loss: 1.8627 0.0519 sec/batch\n", + "Epoch 4/20 Iteration 646/3560 Training loss: 1.8623 0.0522 sec/batch\n", + "Epoch 4/20 Iteration 647/3560 Training loss: 1.8618 0.0495 sec/batch\n", + "Epoch 4/20 Iteration 648/3560 Training loss: 1.8614 0.0565 sec/batch\n", + "Epoch 4/20 Iteration 649/3560 Training loss: 1.8608 0.0515 sec/batch\n", + "Epoch 4/20 Iteration 650/3560 Training loss: 1.8601 0.0552 sec/batch\n", + "Epoch 4/20 Iteration 651/3560 Training loss: 1.8598 0.0524 sec/batch\n", + "Epoch 4/20 Iteration 652/3560 Training loss: 1.8595 0.0522 sec/batch\n", + "Epoch 4/20 Iteration 653/3560 Training loss: 1.8591 0.0549 sec/batch\n", + 
"Epoch 4/20 Iteration 654/3560 Training loss: 1.8588 0.0507 sec/batch\n", + "Epoch 4/20 Iteration 655/3560 Training loss: 1.8586 0.0511 sec/batch\n", + "Epoch 4/20 Iteration 656/3560 Training loss: 1.8579 0.0505 sec/batch\n", + "Epoch 4/20 Iteration 657/3560 Training loss: 1.8573 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 658/3560 Training loss: 1.8570 0.0605 sec/batch\n", + "Epoch 4/20 Iteration 659/3560 Training loss: 1.8567 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 660/3560 Training loss: 1.8560 0.0524 sec/batch\n", + "Epoch 4/20 Iteration 661/3560 Training loss: 1.8558 0.0575 sec/batch\n", + "Epoch 4/20 Iteration 662/3560 Training loss: 1.8556 0.0595 sec/batch\n", + "Epoch 4/20 Iteration 663/3560 Training loss: 1.8553 0.0533 sec/batch\n", + "Epoch 4/20 Iteration 664/3560 Training loss: 1.8549 0.0501 sec/batch\n", + "Epoch 4/20 Iteration 665/3560 Training loss: 1.8544 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 666/3560 Training loss: 1.8540 0.0573 sec/batch\n", + "Epoch 4/20 Iteration 667/3560 Training loss: 1.8537 0.0516 sec/batch\n", + "Epoch 4/20 Iteration 668/3560 Training loss: 1.8535 0.0644 sec/batch\n", + "Epoch 4/20 Iteration 669/3560 Training loss: 1.8532 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 670/3560 Training loss: 1.8529 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 671/3560 Training loss: 1.8527 0.0581 sec/batch\n", + "Epoch 4/20 Iteration 672/3560 Training loss: 1.8525 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 673/3560 Training loss: 1.8523 0.0508 sec/batch\n", + "Epoch 4/20 Iteration 674/3560 Training loss: 1.8519 0.0492 sec/batch\n", + "Epoch 4/20 Iteration 675/3560 Training loss: 1.8518 0.0504 sec/batch\n", + "Epoch 4/20 Iteration 676/3560 Training loss: 1.8515 0.0520 sec/batch\n", + "Epoch 4/20 Iteration 677/3560 Training loss: 1.8512 0.0508 sec/batch\n", + "Epoch 4/20 Iteration 678/3560 Training loss: 1.8510 0.0532 sec/batch\n", + "Epoch 4/20 Iteration 679/3560 Training loss: 1.8505 0.0487 sec/batch\n", + "Epoch 4/20 Iteration 
680/3560 Training loss: 1.8504 0.0551 sec/batch\n", + "Epoch 4/20 Iteration 681/3560 Training loss: 1.8501 0.0554 sec/batch\n", + "Epoch 4/20 Iteration 682/3560 Training loss: 1.8500 0.0494 sec/batch\n", + "Epoch 4/20 Iteration 683/3560 Training loss: 1.8497 0.0494 sec/batch\n", + "Epoch 4/20 Iteration 684/3560 Training loss: 1.8493 0.0520 sec/batch\n", + "Epoch 4/20 Iteration 685/3560 Training loss: 1.8488 0.0506 sec/batch\n", + "Epoch 4/20 Iteration 686/3560 Training loss: 1.8488 0.0528 sec/batch\n", + "Epoch 4/20 Iteration 687/3560 Training loss: 1.8486 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 688/3560 Training loss: 1.8484 0.0501 sec/batch\n", + "Epoch 4/20 Iteration 689/3560 Training loss: 1.8481 0.0540 sec/batch\n", + "Epoch 4/20 Iteration 690/3560 Training loss: 1.8478 0.0498 sec/batch\n", + "Epoch 4/20 Iteration 691/3560 Training loss: 1.8476 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 692/3560 Training loss: 1.8473 0.0505 sec/batch\n", + "Epoch 4/20 Iteration 693/3560 Training loss: 1.8469 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 694/3560 Training loss: 1.8468 0.0506 sec/batch\n", + "Epoch 4/20 Iteration 695/3560 Training loss: 1.8467 0.0649 sec/batch\n", + "Epoch 4/20 Iteration 696/3560 Training loss: 1.8464 0.0504 sec/batch\n", + "Epoch 4/20 Iteration 697/3560 Training loss: 1.8462 0.0489 sec/batch\n", + "Epoch 4/20 Iteration 698/3560 Training loss: 1.8459 0.0514 sec/batch\n", + "Epoch 4/20 Iteration 699/3560 Training loss: 1.8457 0.0557 sec/batch\n", + "Epoch 4/20 Iteration 700/3560 Training loss: 1.8454 0.0516 sec/batch\n", + "Epoch 4/20 Iteration 701/3560 Training loss: 1.8453 0.0571 sec/batch\n", + "Epoch 4/20 Iteration 702/3560 Training loss: 1.8455 0.0500 sec/batch\n", + "Epoch 4/20 Iteration 703/3560 Training loss: 1.8452 0.0547 sec/batch\n", + "Epoch 4/20 Iteration 704/3560 Training loss: 1.8449 0.0505 sec/batch\n", + "Epoch 4/20 Iteration 705/3560 Training loss: 1.8446 0.0493 sec/batch\n", + "Epoch 4/20 Iteration 706/3560 Training loss: 
1.8442 0.0517 sec/batch\n", + "Epoch 4/20 Iteration 707/3560 Training loss: 1.8440 0.0489 sec/batch\n", + "Epoch 4/20 Iteration 708/3560 Training loss: 1.8438 0.0499 sec/batch\n", + "Epoch 4/20 Iteration 709/3560 Training loss: 1.8436 0.0552 sec/batch\n", + "Epoch 4/20 Iteration 710/3560 Training loss: 1.8433 0.0544 sec/batch\n", + "Epoch 4/20 Iteration 711/3560 Training loss: 1.8430 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 712/3560 Training loss: 1.8428 0.0561 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 1.8616 0.0598 sec/batch\n", + "Epoch 5/20 Iteration 714/3560 Training loss: 1.8289 0.0539 sec/batch\n", + "Epoch 5/20 Iteration 715/3560 Training loss: 1.8158 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 716/3560 Training loss: 1.8077 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 717/3560 Training loss: 1.8041 0.0504 sec/batch\n", + "Epoch 5/20 Iteration 718/3560 Training loss: 1.7944 0.0544 sec/batch\n", + "Epoch 5/20 Iteration 719/3560 Training loss: 1.7945 0.0563 sec/batch\n", + "Epoch 5/20 Iteration 720/3560 Training loss: 1.7933 0.0494 sec/batch\n", + "Epoch 5/20 Iteration 721/3560 Training loss: 1.7949 0.0544 sec/batch\n", + "Epoch 5/20 Iteration 722/3560 Training loss: 1.7941 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 723/3560 Training loss: 1.7904 0.0585 sec/batch\n", + "Epoch 5/20 Iteration 724/3560 Training loss: 1.7889 0.0519 sec/batch\n", + "Epoch 5/20 Iteration 725/3560 Training loss: 1.7889 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 726/3560 Training loss: 1.7914 0.0505 sec/batch\n", + "Epoch 5/20 Iteration 727/3560 Training loss: 1.7899 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 728/3560 Training loss: 1.7878 0.0523 sec/batch\n", + "Epoch 5/20 Iteration 729/3560 Training loss: 1.7874 0.0504 sec/batch\n", + "Epoch 5/20 Iteration 730/3560 Training loss: 1.7892 0.0489 sec/batch\n", + "Epoch 5/20 Iteration 731/3560 Training loss: 1.7889 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 732/3560 Training loss: 1.7893 0.0494 
sec/batch\n", + "Epoch 5/20 Iteration 733/3560 Training loss: 1.7885 0.0503 sec/batch\n", + "Epoch 5/20 Iteration 734/3560 Training loss: 1.7890 0.0623 sec/batch\n", + "Epoch 5/20 Iteration 735/3560 Training loss: 1.7884 0.0605 sec/batch\n", + "Epoch 5/20 Iteration 736/3560 Training loss: 1.7877 0.0500 sec/batch\n", + "Epoch 5/20 Iteration 737/3560 Training loss: 1.7872 0.0521 sec/batch\n", + "Epoch 5/20 Iteration 738/3560 Training loss: 1.7861 0.0569 sec/batch\n", + "Epoch 5/20 Iteration 739/3560 Training loss: 1.7847 0.0530 sec/batch\n", + "Epoch 5/20 Iteration 740/3560 Training loss: 1.7854 0.0528 sec/batch\n", + "Epoch 5/20 Iteration 741/3560 Training loss: 1.7859 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 742/3560 Training loss: 1.7861 0.0504 sec/batch\n", + "Epoch 5/20 Iteration 743/3560 Training loss: 1.7857 0.0553 sec/batch\n", + "Epoch 5/20 Iteration 744/3560 Training loss: 1.7847 0.0523 sec/batch\n", + "Epoch 5/20 Iteration 745/3560 Training loss: 1.7847 0.0554 sec/batch\n", + "Epoch 5/20 Iteration 746/3560 Training loss: 1.7854 0.0559 sec/batch\n", + "Epoch 5/20 Iteration 747/3560 Training loss: 1.7847 0.0521 sec/batch\n", + "Epoch 5/20 Iteration 748/3560 Training loss: 1.7841 0.0533 sec/batch\n", + "Epoch 5/20 Iteration 749/3560 Training loss: 1.7831 0.0526 sec/batch\n", + "Epoch 5/20 Iteration 750/3560 Training loss: 1.7818 0.0524 sec/batch\n", + "Epoch 5/20 Iteration 751/3560 Training loss: 1.7804 0.0523 sec/batch\n", + "Epoch 5/20 Iteration 752/3560 Training loss: 1.7795 0.0527 sec/batch\n", + "Epoch 5/20 Iteration 753/3560 Training loss: 1.7790 0.0540 sec/batch\n", + "Epoch 5/20 Iteration 754/3560 Training loss: 1.7791 0.0553 sec/batch\n", + "Epoch 5/20 Iteration 755/3560 Training loss: 1.7784 0.0573 sec/batch\n", + "Epoch 5/20 Iteration 756/3560 Training loss: 1.7774 0.0516 sec/batch\n", + "Epoch 5/20 Iteration 757/3560 Training loss: 1.7773 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 758/3560 Training loss: 1.7761 0.0516 sec/batch\n", + "Epoch 
5/20 Iteration 759/3560 Training loss: 1.7756 0.0503 sec/batch\n", + "Epoch 5/20 Iteration 760/3560 Training loss: 1.7748 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 761/3560 Training loss: 1.7746 0.0527 sec/batch\n", + "Epoch 5/20 Iteration 762/3560 Training loss: 1.7751 0.0562 sec/batch\n", + "Epoch 5/20 Iteration 763/3560 Training loss: 1.7744 0.0581 sec/batch\n", + "Epoch 5/20 Iteration 764/3560 Training loss: 1.7751 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 765/3560 Training loss: 1.7748 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 766/3560 Training loss: 1.7746 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 767/3560 Training loss: 1.7743 0.0510 sec/batch\n", + "Epoch 5/20 Iteration 768/3560 Training loss: 1.7743 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 769/3560 Training loss: 1.7743 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 770/3560 Training loss: 1.7739 0.0528 sec/batch\n", + "Epoch 5/20 Iteration 771/3560 Training loss: 1.7732 0.0545 sec/batch\n", + "Epoch 5/20 Iteration 772/3560 Training loss: 1.7737 0.0545 sec/batch\n", + "Epoch 5/20 Iteration 773/3560 Training loss: 1.7733 0.0593 sec/batch\n", + "Epoch 5/20 Iteration 774/3560 Training loss: 1.7739 0.0503 sec/batch\n", + "Epoch 5/20 Iteration 775/3560 Training loss: 1.7740 0.0527 sec/batch\n", + "Epoch 5/20 Iteration 776/3560 Training loss: 1.7741 0.0494 sec/batch\n", + "Epoch 5/20 Iteration 777/3560 Training loss: 1.7736 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 778/3560 Training loss: 1.7737 0.0516 sec/batch\n", + "Epoch 5/20 Iteration 779/3560 Training loss: 1.7738 0.0547 sec/batch\n", + "Epoch 5/20 Iteration 780/3560 Training loss: 1.7731 0.0537 sec/batch\n", + "Epoch 5/20 Iteration 781/3560 Training loss: 1.7729 0.0543 sec/batch\n", + "Epoch 5/20 Iteration 782/3560 Training loss: 1.7726 0.0519 sec/batch\n", + "Epoch 5/20 Iteration 783/3560 Training loss: 1.7729 0.0511 sec/batch\n", + "Epoch 5/20 Iteration 784/3560 Training loss: 1.7730 0.0518 sec/batch\n", + "Epoch 5/20 Iteration 785/3560 
Training loss: 1.7732 0.0499 sec/batch\n", + "Epoch 5/20 Iteration 786/3560 Training loss: 1.7727 0.0567 sec/batch\n", + "Epoch 5/20 Iteration 787/3560 Training loss: 1.7725 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 788/3560 Training loss: 1.7726 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 789/3560 Training loss: 1.7724 0.0516 sec/batch\n", + "Epoch 5/20 Iteration 790/3560 Training loss: 1.7723 0.0612 sec/batch\n", + "Epoch 5/20 Iteration 791/3560 Training loss: 1.7716 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 792/3560 Training loss: 1.7712 0.0510 sec/batch\n", + "Epoch 5/20 Iteration 793/3560 Training loss: 1.7705 0.0506 sec/batch\n", + "Epoch 5/20 Iteration 794/3560 Training loss: 1.7706 0.0532 sec/batch\n", + "Epoch 5/20 Iteration 795/3560 Training loss: 1.7699 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 796/3560 Training loss: 1.7696 0.0541 sec/batch\n", + "Epoch 5/20 Iteration 797/3560 Training loss: 1.7689 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 798/3560 Training loss: 1.7684 0.0541 sec/batch\n", + "Epoch 5/20 Iteration 799/3560 Training loss: 1.7681 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 800/3560 Training loss: 1.7676 0.0615 sec/batch\n", + "Epoch 5/20 Iteration 801/3560 Training loss: 1.7670 0.0585 sec/batch\n", + "Epoch 5/20 Iteration 802/3560 Training loss: 1.7669 0.0516 sec/batch\n", + "Epoch 5/20 Iteration 803/3560 Training loss: 1.7665 0.0505 sec/batch\n", + "Epoch 5/20 Iteration 804/3560 Training loss: 1.7662 0.0609 sec/batch\n", + "Epoch 5/20 Iteration 805/3560 Training loss: 1.7654 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 806/3560 Training loss: 1.7649 0.0488 sec/batch\n", + "Epoch 5/20 Iteration 807/3560 Training loss: 1.7645 0.0509 sec/batch\n", + "Epoch 5/20 Iteration 808/3560 Training loss: 1.7641 0.0504 sec/batch\n", + "Epoch 5/20 Iteration 809/3560 Training loss: 1.7638 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 810/3560 Training loss: 1.7633 0.0509 sec/batch\n", + "Epoch 5/20 Iteration 811/3560 Training loss: 1.7627 
0.0556 sec/batch\n", + "Epoch 5/20 Iteration 812/3560 Training loss: 1.7620 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 813/3560 Training loss: 1.7617 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 814/3560 Training loss: 1.7614 0.0495 sec/batch\n", + "Epoch 5/20 Iteration 815/3560 Training loss: 1.7610 0.0506 sec/batch\n", + "Epoch 5/20 Iteration 816/3560 Training loss: 1.7607 0.0505 sec/batch\n", + "Epoch 5/20 Iteration 817/3560 Training loss: 1.7604 0.0580 sec/batch\n", + "Epoch 5/20 Iteration 818/3560 Training loss: 1.7601 0.0523 sec/batch\n", + "Epoch 5/20 Iteration 819/3560 Training loss: 1.7599 0.0563 sec/batch\n", + "Epoch 5/20 Iteration 820/3560 Training loss: 1.7596 0.0533 sec/batch\n", + "Epoch 5/20 Iteration 821/3560 Training loss: 1.7595 0.0516 sec/batch\n", + "Epoch 5/20 Iteration 822/3560 Training loss: 1.7593 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 823/3560 Training loss: 1.7591 0.0606 sec/batch\n", + "Epoch 5/20 Iteration 824/3560 Training loss: 1.7588 0.0505 sec/batch\n", + "Epoch 5/20 Iteration 825/3560 Training loss: 1.7585 0.0500 sec/batch\n", + "Epoch 5/20 Iteration 826/3560 Training loss: 1.7581 0.0509 sec/batch\n", + "Epoch 5/20 Iteration 827/3560 Training loss: 1.7577 0.0554 sec/batch\n", + "Epoch 5/20 Iteration 828/3560 Training loss: 1.7572 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 829/3560 Training loss: 1.7570 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 830/3560 Training loss: 1.7568 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 831/3560 Training loss: 1.7565 0.0524 sec/batch\n", + "Epoch 5/20 Iteration 832/3560 Training loss: 1.7563 0.0541 sec/batch\n", + "Epoch 5/20 Iteration 833/3560 Training loss: 1.7562 0.0587 sec/batch\n", + "Epoch 5/20 Iteration 834/3560 Training loss: 1.7557 0.0504 sec/batch\n", + "Epoch 5/20 Iteration 835/3560 Training loss: 1.7551 0.0540 sec/batch\n", + "Epoch 5/20 Iteration 836/3560 Training loss: 1.7551 0.0492 sec/batch\n", + "Epoch 5/20 Iteration 837/3560 Training loss: 1.7549 0.0518 sec/batch\n", + 
"Epoch 5/20 Iteration 838/3560 Training loss: 1.7543 0.0550 sec/batch\n", + "Epoch 5/20 Iteration 839/3560 Training loss: 1.7543 0.0548 sec/batch\n", + "Epoch 5/20 Iteration 840/3560 Training loss: 1.7542 0.0540 sec/batch\n", + "Epoch 5/20 Iteration 841/3560 Training loss: 1.7540 0.0511 sec/batch\n", + "Epoch 5/20 Iteration 842/3560 Training loss: 1.7537 0.0541 sec/batch\n", + "Epoch 5/20 Iteration 843/3560 Training loss: 1.7533 0.0554 sec/batch\n", + "Epoch 5/20 Iteration 844/3560 Training loss: 1.7529 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 845/3560 Training loss: 1.7527 0.0512 sec/batch\n", + "Epoch 5/20 Iteration 846/3560 Training loss: 1.7526 0.0570 sec/batch\n", + "Epoch 5/20 Iteration 847/3560 Training loss: 1.7525 0.0549 sec/batch\n", + "Epoch 5/20 Iteration 848/3560 Training loss: 1.7523 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 849/3560 Training loss: 1.7523 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 850/3560 Training loss: 1.7522 0.0619 sec/batch\n", + "Epoch 5/20 Iteration 851/3560 Training loss: 1.7522 0.0501 sec/batch\n", + "Epoch 5/20 Iteration 852/3560 Training loss: 1.7520 0.0518 sec/batch\n", + "Epoch 5/20 Iteration 853/3560 Training loss: 1.7521 0.0509 sec/batch\n", + "Epoch 5/20 Iteration 854/3560 Training loss: 1.7518 0.0562 sec/batch\n", + "Epoch 5/20 Iteration 855/3560 Training loss: 1.7517 0.0553 sec/batch\n", + "Epoch 5/20 Iteration 856/3560 Training loss: 1.7515 0.0524 sec/batch\n", + "Epoch 5/20 Iteration 857/3560 Training loss: 1.7512 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 858/3560 Training loss: 1.7512 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 859/3560 Training loss: 1.7510 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 860/3560 Training loss: 1.7511 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 861/3560 Training loss: 1.7509 0.0500 sec/batch\n", + "Epoch 5/20 Iteration 862/3560 Training loss: 1.7506 0.0674 sec/batch\n", + "Epoch 5/20 Iteration 863/3560 Training loss: 1.7502 0.0537 sec/batch\n", + "Epoch 5/20 Iteration 
864/3560 Training loss: 1.7501 0.0530 sec/batch\n", + "Epoch 5/20 Iteration 865/3560 Training loss: 1.7500 0.0527 sec/batch\n", + "Epoch 5/20 Iteration 866/3560 Training loss: 1.7499 0.0506 sec/batch\n", + "Epoch 5/20 Iteration 867/3560 Training loss: 1.7498 0.0503 sec/batch\n", + "Epoch 5/20 Iteration 868/3560 Training loss: 1.7496 0.0614 sec/batch\n", + "Epoch 5/20 Iteration 869/3560 Training loss: 1.7495 0.0521 sec/batch\n", + "Epoch 5/20 Iteration 870/3560 Training loss: 1.7493 0.0516 sec/batch\n", + "Epoch 5/20 Iteration 871/3560 Training loss: 1.7489 0.0503 sec/batch\n", + "Epoch 5/20 Iteration 872/3560 Training loss: 1.7489 0.0511 sec/batch\n", + "Epoch 5/20 Iteration 873/3560 Training loss: 1.7489 0.0512 sec/batch\n", + "Epoch 5/20 Iteration 874/3560 Training loss: 1.7487 0.0522 sec/batch\n", + "Epoch 5/20 Iteration 875/3560 Training loss: 1.7487 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 876/3560 Training loss: 1.7486 0.0635 sec/batch\n", + "Epoch 5/20 Iteration 877/3560 Training loss: 1.7484 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 878/3560 Training loss: 1.7482 0.0535 sec/batch\n", + "Epoch 5/20 Iteration 879/3560 Training loss: 1.7482 0.0542 sec/batch\n", + "Epoch 5/20 Iteration 880/3560 Training loss: 1.7484 0.0590 sec/batch\n", + "Epoch 5/20 Iteration 881/3560 Training loss: 1.7482 0.0522 sec/batch\n", + "Epoch 5/20 Iteration 882/3560 Training loss: 1.7480 0.0510 sec/batch\n", + "Epoch 5/20 Iteration 883/3560 Training loss: 1.7478 0.0494 sec/batch\n", + "Epoch 5/20 Iteration 884/3560 Training loss: 1.7475 0.0552 sec/batch\n", + "Epoch 5/20 Iteration 885/3560 Training loss: 1.7474 0.0518 sec/batch\n", + "Epoch 5/20 Iteration 886/3560 Training loss: 1.7473 0.0519 sec/batch\n", + "Epoch 5/20 Iteration 887/3560 Training loss: 1.7472 0.0506 sec/batch\n", + "Epoch 5/20 Iteration 888/3560 Training loss: 1.7469 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 889/3560 Training loss: 1.7467 0.0500 sec/batch\n", + "Epoch 5/20 Iteration 890/3560 Training loss: 
1.7466 0.0509 sec/batch\n", + "Epoch 6/20 Iteration 891/3560 Training loss: 1.7902 0.0516 sec/batch\n", + "Epoch 6/20 Iteration 892/3560 Training loss: 1.7522 0.0651 sec/batch\n", + "Epoch 6/20 Iteration 893/3560 Training loss: 1.7385 0.0531 sec/batch\n", + "Epoch 6/20 Iteration 894/3560 Training loss: 1.7316 0.0494 sec/batch\n", + "Epoch 6/20 Iteration 895/3560 Training loss: 1.7262 0.0644 sec/batch\n", + "Epoch 6/20 Iteration 896/3560 Training loss: 1.7160 0.0540 sec/batch\n", + "Epoch 6/20 Iteration 897/3560 Training loss: 1.7145 0.0500 sec/batch\n", + "Epoch 6/20 Iteration 898/3560 Training loss: 1.7131 0.0515 sec/batch\n", + "Epoch 6/20 Iteration 899/3560 Training loss: 1.7143 0.0495 sec/batch\n", + "Epoch 6/20 Iteration 900/3560 Training loss: 1.7132 0.0555 sec/batch\n", + "Epoch 6/20 Iteration 901/3560 Training loss: 1.7095 0.0527 sec/batch\n", + "Epoch 6/20 Iteration 902/3560 Training loss: 1.7081 0.0526 sec/batch\n", + "Epoch 6/20 Iteration 903/3560 Training loss: 1.7079 0.0521 sec/batch\n", + "Epoch 6/20 Iteration 904/3560 Training loss: 1.7108 0.0514 sec/batch\n", + "Epoch 6/20 Iteration 905/3560 Training loss: 1.7100 0.0599 sec/batch\n", + "Epoch 6/20 Iteration 906/3560 Training loss: 1.7080 0.0540 sec/batch\n", + "Epoch 6/20 Iteration 907/3560 Training loss: 1.7079 0.0647 sec/batch\n", + "Epoch 6/20 Iteration 908/3560 Training loss: 1.7100 0.0545 sec/batch\n", + "Epoch 6/20 Iteration 909/3560 Training loss: 1.7099 0.0511 sec/batch\n", + "Epoch 6/20 Iteration 910/3560 Training loss: 1.7105 0.0544 sec/batch\n", + "Epoch 6/20 Iteration 911/3560 Training loss: 1.7100 0.0502 sec/batch\n", + "Epoch 6/20 Iteration 912/3560 Training loss: 1.7104 0.0495 sec/batch\n", + "Epoch 6/20 Iteration 913/3560 Training loss: 1.7095 0.0501 sec/batch\n", + "Epoch 6/20 Iteration 914/3560 Training loss: 1.7091 0.0517 sec/batch\n", + "Epoch 6/20 Iteration 915/3560 Training loss: 1.7085 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 916/3560 Training loss: 1.7072 0.0502 
sec/batch\n", + "Epoch 6/20 Iteration 917/3560 Training loss: 1.7062 0.0530 sec/batch\n", + "Epoch 6/20 Iteration 918/3560 Training loss: 1.7065 0.0496 sec/batch\n", + "Epoch 6/20 Iteration 919/3560 Training loss: 1.7069 0.0494 sec/batch\n", + "Epoch 6/20 Iteration 920/3560 Training loss: 1.7073 0.0538 sec/batch\n", + "Epoch 6/20 Iteration 921/3560 Training loss: 1.7071 0.0558 sec/batch\n", + "Epoch 6/20 Iteration 922/3560 Training loss: 1.7062 0.0513 sec/batch\n", + "Epoch 6/20 Iteration 923/3560 Training loss: 1.7060 0.0498 sec/batch\n", + "Epoch 6/20 Iteration 924/3560 Training loss: 1.7067 0.0512 sec/batch\n", + "Epoch 6/20 Iteration 925/3560 Training loss: 1.7062 0.0517 sec/batch\n", + "Epoch 6/20 Iteration 926/3560 Training loss: 1.7058 0.0513 sec/batch\n", + "Epoch 6/20 Iteration 927/3560 Training loss: 1.7050 0.0494 sec/batch\n", + "Epoch 6/20 Iteration 928/3560 Training loss: 1.7037 0.0504 sec/batch\n", + "Epoch 6/20 Iteration 929/3560 Training loss: 1.7023 0.0538 sec/batch\n", + "Epoch 6/20 Iteration 930/3560 Training loss: 1.7018 0.0514 sec/batch\n", + "Epoch 6/20 Iteration 931/3560 Training loss: 1.7013 0.0575 sec/batch\n", + "Epoch 6/20 Iteration 932/3560 Training loss: 1.7018 0.0565 sec/batch\n", + "Epoch 6/20 Iteration 933/3560 Training loss: 1.7012 0.0567 sec/batch\n", + "Epoch 6/20 Iteration 934/3560 Training loss: 1.7004 0.0552 sec/batch\n", + "Epoch 6/20 Iteration 935/3560 Training loss: 1.7007 0.0501 sec/batch\n", + "Epoch 6/20 Iteration 936/3560 Training loss: 1.6996 0.0591 sec/batch\n", + "Epoch 6/20 Iteration 937/3560 Training loss: 1.6993 0.0498 sec/batch\n", + "Epoch 6/20 Iteration 938/3560 Training loss: 1.6987 0.0519 sec/batch\n", + "Epoch 6/20 Iteration 939/3560 Training loss: 1.6985 0.0518 sec/batch\n", + "Epoch 6/20 Iteration 940/3560 Training loss: 1.6992 0.0582 sec/batch\n", + "Epoch 6/20 Iteration 941/3560 Training loss: 1.6986 0.0524 sec/batch\n", + "Epoch 6/20 Iteration 942/3560 Training loss: 1.6993 0.0552 sec/batch\n", + "Epoch 
6/20 Iteration 943/3560 Training loss: 1.6991 0.0518 sec/batch\n", + "Epoch 6/20 Iteration 944/3560 Training loss: 1.6991 0.0507 sec/batch\n", + "Epoch 6/20 Iteration 945/3560 Training loss: 1.6987 0.0515 sec/batch\n", + "Epoch 6/20 Iteration 946/3560 Training loss: 1.6985 0.0527 sec/batch\n", + "Epoch 6/20 Iteration 947/3560 Training loss: 1.6986 0.0511 sec/batch\n", + "Epoch 6/20 Iteration 948/3560 Training loss: 1.6982 0.0573 sec/batch\n", + "Epoch 6/20 Iteration 949/3560 Training loss: 1.6976 0.0583 sec/batch\n", + "Epoch 6/20 Iteration 950/3560 Training loss: 1.6981 0.0541 sec/batch\n", + "Epoch 6/20 Iteration 951/3560 Training loss: 1.6978 0.0562 sec/batch\n", + "Epoch 6/20 Iteration 952/3560 Training loss: 1.6984 0.0551 sec/batch\n", + "Epoch 6/20 Iteration 953/3560 Training loss: 1.6985 0.0562 sec/batch\n", + "Epoch 6/20 Iteration 954/3560 Training loss: 1.6988 0.0497 sec/batch\n", + "Epoch 6/20 Iteration 955/3560 Training loss: 1.6984 0.0519 sec/batch\n", + "Epoch 6/20 Iteration 956/3560 Training loss: 1.6985 0.0652 sec/batch\n", + "Epoch 6/20 Iteration 957/3560 Training loss: 1.6986 0.0655 sec/batch\n", + "Epoch 6/20 Iteration 958/3560 Training loss: 1.6981 0.0605 sec/batch\n", + "Epoch 6/20 Iteration 959/3560 Training loss: 1.6981 0.0525 sec/batch\n", + "Epoch 6/20 Iteration 960/3560 Training loss: 1.6978 0.0499 sec/batch\n", + "Epoch 6/20 Iteration 961/3560 Training loss: 1.6982 0.0517 sec/batch\n", + "Epoch 6/20 Iteration 962/3560 Training loss: 1.6983 0.0555 sec/batch\n", + "Epoch 6/20 Iteration 963/3560 Training loss: 1.6986 0.0516 sec/batch\n", + "Epoch 6/20 Iteration 964/3560 Training loss: 1.6983 0.0533 sec/batch\n", + "Epoch 6/20 Iteration 965/3560 Training loss: 1.6981 0.0544 sec/batch\n", + "Epoch 6/20 Iteration 966/3560 Training loss: 1.6984 0.0558 sec/batch\n", + "Epoch 6/20 Iteration 967/3560 Training loss: 1.6980 0.0523 sec/batch\n", + "Epoch 6/20 Iteration 968/3560 Training loss: 1.6979 0.0503 sec/batch\n", + "Epoch 6/20 Iteration 969/3560 
Training loss: 1.6973 0.0509 sec/batch\n", + "Epoch 6/20 Iteration 970/3560 Training loss: 1.6971 0.0508 sec/batch\n", + "Epoch 6/20 Iteration 971/3560 Training loss: 1.6965 0.0501 sec/batch\n", + "Epoch 6/20 Iteration 972/3560 Training loss: 1.6966 0.0513 sec/batch\n", + "Epoch 6/20 Iteration 973/3560 Training loss: 1.6960 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 974/3560 Training loss: 1.6959 0.0558 sec/batch\n", + "Epoch 6/20 Iteration 975/3560 Training loss: 1.6953 0.0499 sec/batch\n", + "Epoch 6/20 Iteration 976/3560 Training loss: 1.6949 0.0504 sec/batch\n", + "Epoch 6/20 Iteration 977/3560 Training loss: 1.6945 0.0550 sec/batch\n", + "Epoch 6/20 Iteration 978/3560 Training loss: 1.6941 0.0525 sec/batch\n", + "Epoch 6/20 Iteration 979/3560 Training loss: 1.6935 0.0563 sec/batch\n", + "Epoch 6/20 Iteration 980/3560 Training loss: 1.6934 0.0495 sec/batch\n", + "Epoch 6/20 Iteration 981/3560 Training loss: 1.6929 0.0507 sec/batch\n", + "Epoch 6/20 Iteration 982/3560 Training loss: 1.6926 0.0538 sec/batch\n", + "Epoch 6/20 Iteration 983/3560 Training loss: 1.6920 0.0508 sec/batch\n", + "Epoch 6/20 Iteration 984/3560 Training loss: 1.6915 0.0500 sec/batch\n", + "Epoch 6/20 Iteration 985/3560 Training loss: 1.6910 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 986/3560 Training loss: 1.6908 0.0509 sec/batch\n", + "Epoch 6/20 Iteration 987/3560 Training loss: 1.6907 0.0517 sec/batch\n", + "Epoch 6/20 Iteration 988/3560 Training loss: 1.6901 0.0572 sec/batch\n", + "Epoch 6/20 Iteration 989/3560 Training loss: 1.6896 0.0552 sec/batch\n", + "Epoch 6/20 Iteration 990/3560 Training loss: 1.6890 0.0515 sec/batch\n", + "Epoch 6/20 Iteration 991/3560 Training loss: 1.6888 0.0544 sec/batch\n", + "Epoch 6/20 Iteration 992/3560 Training loss: 1.6885 0.0542 sec/batch\n", + "Epoch 6/20 Iteration 993/3560 Training loss: 1.6882 0.0545 sec/batch\n", + "Epoch 6/20 Iteration 994/3560 Training loss: 1.6879 0.0523 sec/batch\n", + "Epoch 6/20 Iteration 995/3560 Training loss: 1.6876 
0.0578 sec/batch\n", + "Epoch 6/20 Iteration 996/3560 Training loss: 1.6873 0.0520 sec/batch\n", + "Epoch 6/20 Iteration 997/3560 Training loss: 1.6872 0.0509 sec/batch\n", + "Epoch 6/20 Iteration 998/3560 Training loss: 1.6869 0.0501 sec/batch\n", + "Epoch 6/20 Iteration 999/3560 Training loss: 1.6868 0.0516 sec/batch\n", + "Epoch 6/20 Iteration 1000/3560 Training loss: 1.6867 0.0511 sec/batch\n", + "Epoch 6/20 Iteration 1001/3560 Training loss: 1.6865 0.0532 sec/batch\n", + "Epoch 6/20 Iteration 1002/3560 Training loss: 1.6862 0.0504 sec/batch\n", + "Epoch 6/20 Iteration 1003/3560 Training loss: 1.6860 0.0526 sec/batch\n", + "Epoch 6/20 Iteration 1004/3560 Training loss: 1.6858 0.0540 sec/batch\n", + "Epoch 6/20 Iteration 1005/3560 Training loss: 1.6854 0.0550 sec/batch\n", + "Epoch 6/20 Iteration 1006/3560 Training loss: 1.6851 0.0509 sec/batch\n", + "Epoch 6/20 Iteration 1007/3560 Training loss: 1.6849 0.0496 sec/batch\n", + "Epoch 6/20 Iteration 1008/3560 Training loss: 1.6847 0.0528 sec/batch\n", + "Epoch 6/20 Iteration 1009/3560 Training loss: 1.6846 0.0528 sec/batch\n", + "Epoch 6/20 Iteration 1010/3560 Training loss: 1.6844 0.0505 sec/batch\n", + "Epoch 6/20 Iteration 1011/3560 Training loss: 1.6842 0.0498 sec/batch\n", + "Epoch 6/20 Iteration 1012/3560 Training loss: 1.6837 0.0502 sec/batch\n", + "Epoch 6/20 Iteration 1013/3560 Training loss: 1.6831 0.0572 sec/batch\n", + "Epoch 6/20 Iteration 1014/3560 Training loss: 1.6831 0.0544 sec/batch\n", + "Epoch 6/20 Iteration 1015/3560 Training loss: 1.6829 0.0536 sec/batch\n", + "Epoch 6/20 Iteration 1016/3560 Training loss: 1.6823 0.0576 sec/batch\n", + "Epoch 6/20 Iteration 1017/3560 Training loss: 1.6823 0.0531 sec/batch\n", + "Epoch 6/20 Iteration 1018/3560 Training loss: 1.6823 0.0507 sec/batch\n", + "Epoch 6/20 Iteration 1019/3560 Training loss: 1.6820 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 1020/3560 Training loss: 1.6817 0.0509 sec/batch\n", + "Epoch 6/20 Iteration 1021/3560 Training loss: 1.6813 
0.0565 sec/batch\n", + "Epoch 6/20 Iteration 1022/3560 Training loss: 1.6810 0.0499 sec/batch\n", + "Epoch 6/20 Iteration 1023/3560 Training loss: 1.6810 0.0530 sec/batch\n", + "Epoch 6/20 Iteration 1024/3560 Training loss: 1.6809 0.0498 sec/batch\n", + "Epoch 6/20 Iteration 1025/3560 Training loss: 1.6808 0.0564 sec/batch\n", + "Epoch 6/20 Iteration 1026/3560 Training loss: 1.6808 0.0521 sec/batch\n", + "Epoch 6/20 Iteration 1027/3560 Training loss: 1.6808 0.0503 sec/batch\n", + "Epoch 6/20 Iteration 1028/3560 Training loss: 1.6807 0.0508 sec/batch\n", + "Epoch 6/20 Iteration 1029/3560 Training loss: 1.6807 0.0506 sec/batch\n", + "Epoch 6/20 Iteration 1030/3560 Training loss: 1.6805 0.0524 sec/batch\n", + "Epoch 6/20 Iteration 1031/3560 Training loss: 1.6806 0.0496 sec/batch\n", + "Epoch 6/20 Iteration 1032/3560 Training loss: 1.6805 0.0512 sec/batch\n", + "Epoch 6/20 Iteration 1033/3560 Training loss: 1.6804 0.0504 sec/batch\n", + "Epoch 6/20 Iteration 1034/3560 Training loss: 1.6804 0.0547 sec/batch\n", + "Epoch 6/20 Iteration 1035/3560 Training loss: 1.6801 0.0519 sec/batch\n", + "Epoch 6/20 Iteration 1036/3560 Training loss: 1.6802 0.0518 sec/batch\n", + "Epoch 6/20 Iteration 1037/3560 Training loss: 1.6801 0.0596 sec/batch\n", + "Epoch 6/20 Iteration 1038/3560 Training loss: 1.6802 0.0506 sec/batch\n", + "Epoch 6/20 Iteration 1039/3560 Training loss: 1.6801 0.0556 sec/batch\n", + "Epoch 6/20 Iteration 1040/3560 Training loss: 1.6800 0.0507 sec/batch\n", + "Epoch 6/20 Iteration 1041/3560 Training loss: 1.6795 0.0514 sec/batch\n", + "Epoch 6/20 Iteration 1042/3560 Training loss: 1.6795 0.0533 sec/batch\n", + "Epoch 6/20 Iteration 1043/3560 Training loss: 1.6794 0.0524 sec/batch\n", + "Epoch 6/20 Iteration 1044/3560 Training loss: 1.6793 0.0585 sec/batch\n", + "Epoch 6/20 Iteration 1045/3560 Training loss: 1.6793 0.0505 sec/batch\n", + "Epoch 6/20 Iteration 1046/3560 Training loss: 1.6791 0.0507 sec/batch\n", + "Epoch 6/20 Iteration 1047/3560 Training loss: 
1.6790 0.0519 sec/batch\n", + "Epoch 6/20 Iteration 1048/3560 Training loss: 1.6789 0.0530 sec/batch\n", + "Epoch 6/20 Iteration 1049/3560 Training loss: 1.6786 0.0527 sec/batch\n", + "Epoch 6/20 Iteration 1050/3560 Training loss: 1.6786 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 1051/3560 Training loss: 1.6787 0.0511 sec/batch\n", + "Epoch 6/20 Iteration 1052/3560 Training loss: 1.6786 0.0520 sec/batch\n", + "Epoch 6/20 Iteration 1053/3560 Training loss: 1.6785 0.0517 sec/batch\n", + "Epoch 6/20 Iteration 1054/3560 Training loss: 1.6785 0.0594 sec/batch\n", + "Epoch 6/20 Iteration 1055/3560 Training loss: 1.6784 0.0568 sec/batch\n", + "Epoch 6/20 Iteration 1056/3560 Training loss: 1.6782 0.0497 sec/batch\n", + "Epoch 6/20 Iteration 1057/3560 Training loss: 1.6782 0.0508 sec/batch\n", + "Epoch 6/20 Iteration 1058/3560 Training loss: 1.6785 0.0550 sec/batch\n", + "Epoch 6/20 Iteration 1059/3560 Training loss: 1.6783 0.0600 sec/batch\n", + "Epoch 6/20 Iteration 1060/3560 Training loss: 1.6782 0.0518 sec/batch\n", + "Epoch 6/20 Iteration 1061/3560 Training loss: 1.6779 0.0533 sec/batch\n", + "Epoch 6/20 Iteration 1062/3560 Training loss: 1.6778 0.0496 sec/batch\n", + "Epoch 6/20 Iteration 1063/3560 Training loss: 1.6778 0.0497 sec/batch\n", + "Epoch 6/20 Iteration 1064/3560 Training loss: 1.6777 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 1065/3560 Training loss: 1.6777 0.0518 sec/batch\n", + "Epoch 6/20 Iteration 1066/3560 Training loss: 1.6775 0.0494 sec/batch\n", + "Epoch 6/20 Iteration 1067/3560 Training loss: 1.6773 0.0505 sec/batch\n", + "Epoch 6/20 Iteration 1068/3560 Training loss: 1.6773 0.0585 sec/batch\n", + "Epoch 7/20 Iteration 1069/3560 Training loss: 1.7252 0.0500 sec/batch\n", + "Epoch 7/20 Iteration 1070/3560 Training loss: 1.6903 0.0498 sec/batch\n", + "Epoch 7/20 Iteration 1071/3560 Training loss: 1.6772 0.0532 sec/batch\n", + "Epoch 7/20 Iteration 1072/3560 Training loss: 1.6720 0.0531 sec/batch\n", + "Epoch 7/20 Iteration 1073/3560 Training 
loss: 1.6674 0.0504 sec/batch\n", + "Epoch 7/20 Iteration 1074/3560 Training loss: 1.6561 0.0494 sec/batch\n", + "Epoch 7/20 Iteration 1075/3560 Training loss: 1.6549 0.0535 sec/batch\n", + "Epoch 7/20 Iteration 1076/3560 Training loss: 1.6541 0.0621 sec/batch\n", + "Epoch 7/20 Iteration 1077/3560 Training loss: 1.6553 0.0530 sec/batch\n", + "Epoch 7/20 Iteration 1078/3560 Training loss: 1.6536 0.0507 sec/batch\n", + "Epoch 7/20 Iteration 1079/3560 Training loss: 1.6501 0.0545 sec/batch\n", + "Epoch 7/20 Iteration 1080/3560 Training loss: 1.6480 0.0578 sec/batch\n", + "Epoch 7/20 Iteration 1081/3560 Training loss: 1.6479 0.0501 sec/batch\n", + "Epoch 7/20 Iteration 1082/3560 Training loss: 1.6509 0.0530 sec/batch\n", + "Epoch 7/20 Iteration 1083/3560 Training loss: 1.6500 0.0587 sec/batch\n", + "Epoch 7/20 Iteration 1084/3560 Training loss: 1.6480 0.0513 sec/batch\n", + "Epoch 7/20 Iteration 1085/3560 Training loss: 1.6476 0.0490 sec/batch\n", + "Epoch 7/20 Iteration 1086/3560 Training loss: 1.6494 0.0535 sec/batch\n", + "Epoch 7/20 Iteration 1087/3560 Training loss: 1.6496 0.0567 sec/batch\n", + "Epoch 7/20 Iteration 1088/3560 Training loss: 1.6503 0.0493 sec/batch\n", + "Epoch 7/20 Iteration 1089/3560 Training loss: 1.6497 0.0525 sec/batch\n", + "Epoch 7/20 Iteration 1090/3560 Training loss: 1.6504 0.0524 sec/batch\n", + "Epoch 7/20 Iteration 1091/3560 Training loss: 1.6493 0.0505 sec/batch\n", + "Epoch 7/20 Iteration 1092/3560 Training loss: 1.6489 0.0510 sec/batch\n", + "Epoch 7/20 Iteration 1093/3560 Training loss: 1.6485 0.0508 sec/batch\n", + "Epoch 7/20 Iteration 1094/3560 Training loss: 1.6473 0.0538 sec/batch\n", + "Epoch 7/20 Iteration 1095/3560 Training loss: 1.6459 0.0496 sec/batch\n", + "Epoch 7/20 Iteration 1096/3560 Training loss: 1.6464 0.0542 sec/batch\n", + "Epoch 7/20 Iteration 1097/3560 Training loss: 1.6469 0.0501 sec/batch\n", + "Epoch 7/20 Iteration 1098/3560 Training loss: 1.6472 0.0513 sec/batch\n", + "Epoch 7/20 Iteration 1099/3560 
Training loss: 1.6468 0.0500 sec/batch\n", + "Epoch 7/20 Iteration 1100/3560 Training loss: 1.6458 0.0590 sec/batch\n", + "Epoch 7/20 Iteration 1101/3560 Training loss: 1.6457 0.0637 sec/batch\n", + "Epoch 7/20 Iteration 1102/3560 Training loss: 1.6461 0.0504 sec/batch\n", + "Epoch 7/20 Iteration 1103/3560 Training loss: 1.6457 0.0535 sec/batch\n", + "Epoch 7/20 Iteration 1104/3560 Training loss: 1.6456 0.0518 sec/batch\n", + "Epoch 7/20 Iteration 1105/3560 Training loss: 1.6449 0.0516 sec/batch\n", + "Epoch 7/20 Iteration 1106/3560 Training loss: 1.6438 0.0508 sec/batch\n", + "Epoch 7/20 Iteration 1107/3560 Training loss: 1.6425 0.0508 sec/batch\n", + "Epoch 7/20 Iteration 1108/3560 Training loss: 1.6419 0.0509 sec/batch\n", + "Epoch 7/20 Iteration 1109/3560 Training loss: 1.6415 0.0505 sec/batch\n", + "Epoch 7/20 Iteration 1110/3560 Training loss: 1.6419 0.0675 sec/batch\n", + "Epoch 7/20 Iteration 1111/3560 Training loss: 1.6413 0.0523 sec/batch\n", + "Epoch 7/20 Iteration 1112/3560 Training loss: 1.6408 0.0506 sec/batch\n", + "Epoch 7/20 Iteration 1113/3560 Training loss: 1.6411 0.0520 sec/batch\n", + "Epoch 7/20 Iteration 1114/3560 Training loss: 1.6401 0.0564 sec/batch\n", + "Epoch 7/20 Iteration 1115/3560 Training loss: 1.6397 0.0537 sec/batch\n", + "Epoch 7/20 Iteration 1116/3560 Training loss: 1.6392 0.0572 sec/batch\n", + "Epoch 7/20 Iteration 1117/3560 Training loss: 1.6391 0.0572 sec/batch\n", + "Epoch 7/20 Iteration 1118/3560 Training loss: 1.6396 0.0525 sec/batch\n", + "Epoch 7/20 Iteration 1119/3560 Training loss: 1.6391 0.0498 sec/batch\n", + "Epoch 7/20 Iteration 1120/3560 Training loss: 1.6399 0.0508 sec/batch\n", + "Epoch 7/20 Iteration 1121/3560 Training loss: 1.6398 0.0531 sec/batch\n", + "Epoch 7/20 Iteration 1122/3560 Training loss: 1.6397 0.0561 sec/batch\n", + "Epoch 7/20 Iteration 1123/3560 Training loss: 1.6393 0.0520 sec/batch\n", + "Epoch 7/20 Iteration 1124/3560 Training loss: 1.6394 0.0511 sec/batch\n", + "Epoch 7/20 Iteration 
1125/3560 Training loss: 1.6397 0.0503 sec/batch\n", + "Epoch 7/20 Iteration 1126/3560 Training loss: 1.6392 0.0521 sec/batch\n", + "Epoch 7/20 Iteration 1127/3560 Training loss: 1.6387 0.0582 sec/batch\n", + "Epoch 7/20 Iteration 1128/3560 Training loss: 1.6393 0.0519 sec/batch\n", + "Epoch 7/20 Iteration 1129/3560 Training loss: 1.6390 0.0522 sec/batch\n", + "Epoch 7/20 Iteration 1130/3560 Training loss: 1.6399 0.0512 sec/batch\n", + "Epoch 7/20 Iteration 1131/3560 Training loss: 1.6400 0.0558 sec/batch\n", + "Epoch 7/20 Iteration 1132/3560 Training loss: 1.6403 0.0582 sec/batch\n", + "Epoch 7/20 Iteration 1133/3560 Training loss: 1.6399 0.0562 sec/batch\n", + "Epoch 7/20 Iteration 1134/3560 Training loss: 1.6401 0.0597 sec/batch\n", + "Epoch 7/20 Iteration 1135/3560 Training loss: 1.6401 0.0504 sec/batch\n", + "Epoch 7/20 Iteration 1136/3560 Training loss: 1.6398 0.0550 sec/batch\n", + "Epoch 7/20 Iteration 1137/3560 Training loss: 1.6397 0.0585 sec/batch\n", + "Epoch 7/20 Iteration 1138/3560 Training loss: 1.6394 0.0516 sec/batch\n", + "Epoch 7/20 Iteration 1139/3560 Training loss: 1.6400 0.0506 sec/batch\n", + "Epoch 7/20 Iteration 1140/3560 Training loss: 1.6402 0.0499 sec/batch\n", + "Epoch 7/20 Iteration 1141/3560 Training loss: 1.6406 0.0502 sec/batch\n", + "Epoch 7/20 Iteration 1142/3560 Training loss: 1.6404 0.0572 sec/batch\n", + "Epoch 7/20 Iteration 1143/3560 Training loss: 1.6403 0.0549 sec/batch\n", + "Epoch 7/20 Iteration 1144/3560 Training loss: 1.6404 0.0524 sec/batch\n", + "Epoch 7/20 Iteration 1145/3560 Training loss: 1.6402 0.0528 sec/batch\n", + "Epoch 7/20 Iteration 1146/3560 Training loss: 1.6402 0.0504 sec/batch\n", + "Epoch 7/20 Iteration 1147/3560 Training loss: 1.6395 0.0510 sec/batch\n", + "Epoch 7/20 Iteration 1148/3560 Training loss: 1.6391 0.0559 sec/batch\n", + "Epoch 7/20 Iteration 1149/3560 Training loss: 1.6385 0.0586 sec/batch\n", + "Epoch 7/20 Iteration 1150/3560 Training loss: 1.6385 0.0506 sec/batch\n", + "Epoch 7/20 
Iteration 1151/3560 Training loss: 1.6380 0.0528 sec/batch\n", + "Epoch 7/20 Iteration 1152/3560 Training loss: 1.6379 0.0502 sec/batch\n", + "Epoch 7/20 Iteration 1153/3560 Training loss: 1.6374 0.0512 sec/batch\n", + "Epoch 7/20 Iteration 1154/3560 Training loss: 1.6370 0.0570 sec/batch\n", + "Epoch 7/20 Iteration 1155/3560 Training loss: 1.6366 0.0565 sec/batch\n", + "Epoch 7/20 Iteration 1156/3560 Training loss: 1.6364 0.0533 sec/batch\n", + "Epoch 7/20 Iteration 1157/3560 Training loss: 1.6358 0.0512 sec/batch\n", + "Epoch 7/20 Iteration 1158/3560 Training loss: 1.6358 0.0619 sec/batch\n", + "Epoch 7/20 Iteration 1159/3560 Training loss: 1.6354 0.0528 sec/batch\n", + "Epoch 7/20 Iteration 1160/3560 Training loss: 1.6352 0.0605 sec/batch\n", + "Epoch 7/20 Iteration 1161/3560 Training loss: 1.6348 0.0512 sec/batch\n", + "Epoch 7/20 Iteration 1162/3560 Training loss: 1.6343 0.0519 sec/batch\n", + "Epoch 7/20 Iteration 1163/3560 Training loss: 1.6340 0.0557 sec/batch\n", + "Epoch 7/20 Iteration 1164/3560 Training loss: 1.6338 0.0524 sec/batch\n", + "Epoch 7/20 Iteration 1165/3560 Training loss: 1.6336 0.0594 sec/batch\n", + "Epoch 7/20 Iteration 1166/3560 Training loss: 1.6330 0.0635 sec/batch\n", + "Epoch 7/20 Iteration 1167/3560 Training loss: 1.6325 0.0525 sec/batch\n", + "Epoch 7/20 Iteration 1168/3560 Training loss: 1.6320 0.0503 sec/batch\n", + "Epoch 7/20 Iteration 1169/3560 Training loss: 1.6319 0.0513 sec/batch\n", + "Epoch 7/20 Iteration 1170/3560 Training loss: 1.6317 0.0525 sec/batch\n", + "Epoch 7/20 Iteration 1171/3560 Training loss: 1.6313 0.0534 sec/batch\n", + "Epoch 7/20 Iteration 1172/3560 Training loss: 1.6310 0.0592 sec/batch\n", + "Epoch 7/20 Iteration 1173/3560 Training loss: 1.6307 0.0508 sec/batch\n", + "Epoch 7/20 Iteration 1174/3560 Training loss: 1.6305 0.0511 sec/batch\n", + "Epoch 7/20 Iteration 1175/3560 Training loss: 1.6304 0.0562 sec/batch\n", + "Epoch 7/20 Iteration 1176/3560 Training loss: 1.6303 0.0491 sec/batch\n", + "Epoch 
7/20 Iteration 1177/3560 Training loss: 1.6302 0.0522 sec/batch\n", + "Epoch 7/20 Iteration 1178/3560 Training loss: 1.6301 0.0525 sec/batch\n", + "Epoch 7/20 Iteration 1179/3560 Training loss: 1.6299 0.0589 sec/batch\n", + "Epoch 7/20 Iteration 1180/3560 Training loss: 1.6297 0.0522 sec/batch\n", + "Epoch 7/20 Iteration 1181/3560 Training loss: 1.6294 0.0503 sec/batch\n", + "Epoch 7/20 Iteration 1182/3560 Training loss: 1.6292 0.0594 sec/batch\n", + "Epoch 7/20 Iteration 1183/3560 Training loss: 1.6289 0.0553 sec/batch\n", + "Epoch 7/20 Iteration 1184/3560 Training loss: 1.6285 0.0515 sec/batch\n", + "Epoch 7/20 Iteration 1185/3560 Training loss: 1.6283 0.0532 sec/batch\n", + "Epoch 7/20 Iteration 1186/3560 Training loss: 1.6282 0.0611 sec/batch\n", + "Epoch 7/20 Iteration 1187/3560 Training loss: 1.6280 0.0560 sec/batch\n", + "Epoch 7/20 Iteration 1188/3560 Training loss: 1.6278 0.0538 sec/batch\n", + "Epoch 7/20 Iteration 1189/3560 Training loss: 1.6277 0.0497 sec/batch\n", + "Epoch 7/20 Iteration 1190/3560 Training loss: 1.6273 0.0504 sec/batch\n", + "Epoch 7/20 Iteration 1191/3560 Training loss: 1.6267 0.0496 sec/batch\n", + "Epoch 7/20 Iteration 1192/3560 Training loss: 1.6267 0.0498 sec/batch\n", + "Epoch 7/20 Iteration 1193/3560 Training loss: 1.6266 0.0507 sec/batch\n", + "Epoch 7/20 Iteration 1194/3560 Training loss: 1.6263 0.0694 sec/batch\n", + "Epoch 7/20 Iteration 1195/3560 Training loss: 1.6264 0.0505 sec/batch\n", + "Epoch 7/20 Iteration 1196/3560 Training loss: 1.6263 0.0550 sec/batch\n", + "Epoch 7/20 Iteration 1197/3560 Training loss: 1.6260 0.0511 sec/batch\n", + "Epoch 7/20 Iteration 1198/3560 Training loss: 1.6257 0.0518 sec/batch\n", + "Epoch 7/20 Iteration 1199/3560 Training loss: 1.6253 0.0519 sec/batch\n", + "Epoch 7/20 Iteration 1200/3560 Training loss: 1.6250 0.0590 sec/batch\n", + "Epoch 7/20 Iteration 1201/3560 Training loss: 1.6250 0.0515 sec/batch\n", + "Epoch 7/20 Iteration 1202/3560 Training loss: 1.6250 0.0526 sec/batch\n", + 
"Epoch 7/20 Iteration 1203/3560 Training loss: 1.6249 0.0531 sec/batch\n",
+ "Epoch 7/20 Iteration 1246/3560 Training loss: 1.6231 0.0499 sec/batch\n",
+ "Epoch 8/20 Iteration 1247/3560 Training loss: 1.6892 0.0503 sec/batch\n",
+ "Epoch 8/20 Iteration 1424/3560 Training loss: 1.5799 0.0498 sec/batch\n",
+ "Epoch 9/20 Iteration 1425/3560 Training loss: 1.6386 0.0524 sec/batch\n",
+ "Epoch 9/20 Iteration 1602/3560 Training loss: 1.5440 0.0507 sec/batch\n",
+ "Epoch 10/20 Iteration 1603/3560 Training loss: 1.6196 0.0523 sec/batch\n",
+ "Epoch 10/20 Iteration 1718/3560 Training loss: 1.5173 0.0510 sec/batch\n",
+ "Epoch 10/20 Iteration 1719/3560 Training loss: 
"Epoch 10/20 Iteration 1694/3560 Training loss: 1.5218 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1695/3560 Training loss: 1.5214 0.0518 sec/batch\n", + "Epoch 10/20 Iteration 1696/3560 Training loss: 1.5211 0.0539 sec/batch\n", + "Epoch 10/20 Iteration 1697/3560 Training loss: 1.5208 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1698/3560 Training loss: 1.5208 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1699/3560 Training loss: 1.5207 0.0494 sec/batch\n", + "Epoch 10/20 Iteration 1700/3560 Training loss: 1.5204 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1701/3560 Training loss: 1.5200 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1702/3560 Training loss: 1.5196 0.0512 sec/batch\n", + "Epoch 10/20 Iteration 1703/3560 Training loss: 1.5196 0.0501 sec/batch\n", + "Epoch 10/20 Iteration 1704/3560 Training loss: 1.5194 0.0521 sec/batch\n", + "Epoch 10/20 Iteration 1705/3560 Training loss: 1.5193 0.0524 sec/batch\n", + "Epoch 10/20 Iteration 1706/3560 Training loss: 1.5191 0.0533 sec/batch\n", + "Epoch 10/20 Iteration 1707/3560 Training loss: 1.5190 0.0521 sec/batch\n", + "Epoch 10/20 Iteration 1708/3560 Training loss: 1.5189 0.0501 sec/batch\n", + "Epoch 10/20 Iteration 1709/3560 Training loss: 1.5189 0.0595 sec/batch\n", + "Epoch 10/20 Iteration 1710/3560 Training loss: 1.5188 0.0642 sec/batch\n", + "Epoch 10/20 Iteration 1711/3560 Training loss: 1.5186 0.0515 sec/batch\n", + "Epoch 10/20 Iteration 1712/3560 Training loss: 1.5186 0.0495 sec/batch\n", + "Epoch 10/20 Iteration 1713/3560 Training loss: 1.5184 0.0503 sec/batch\n", + "Epoch 10/20 Iteration 1714/3560 Training loss: 1.5183 0.0551 sec/batch\n", + "Epoch 10/20 Iteration 1715/3560 Training loss: 1.5181 0.0567 sec/batch\n", + "Epoch 10/20 Iteration 1716/3560 Training loss: 1.5179 0.0515 sec/batch\n", + "Epoch 10/20 Iteration 1717/3560 Training loss: 1.5177 0.0509 sec/batch\n", + "Epoch 10/20 Iteration 1718/3560 Training loss: 1.5173 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1719/3560 Training loss: 
1.5173 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1720/3560 Training loss: 1.5172 0.0521 sec/batch\n", + "Epoch 10/20 Iteration 1721/3560 Training loss: 1.5170 0.0501 sec/batch\n", + "Epoch 10/20 Iteration 1722/3560 Training loss: 1.5169 0.0500 sec/batch\n", + "Epoch 10/20 Iteration 1723/3560 Training loss: 1.5168 0.0505 sec/batch\n", + "Epoch 10/20 Iteration 1724/3560 Training loss: 1.5163 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1725/3560 Training loss: 1.5159 0.0503 sec/batch\n", + "Epoch 10/20 Iteration 1726/3560 Training loss: 1.5160 0.0528 sec/batch\n", + "Epoch 10/20 Iteration 1727/3560 Training loss: 1.5159 0.0542 sec/batch\n", + "Epoch 10/20 Iteration 1728/3560 Training loss: 1.5154 0.0519 sec/batch\n", + "Epoch 10/20 Iteration 1729/3560 Training loss: 1.5155 0.0533 sec/batch\n", + "Epoch 10/20 Iteration 1730/3560 Training loss: 1.5154 0.0570 sec/batch\n", + "Epoch 10/20 Iteration 1731/3560 Training loss: 1.5152 0.0534 sec/batch\n", + "Epoch 10/20 Iteration 1732/3560 Training loss: 1.5149 0.0517 sec/batch\n", + "Epoch 10/20 Iteration 1733/3560 Training loss: 1.5145 0.0514 sec/batch\n", + "Epoch 10/20 Iteration 1734/3560 Training loss: 1.5143 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1735/3560 Training loss: 1.5144 0.0528 sec/batch\n", + "Epoch 10/20 Iteration 1736/3560 Training loss: 1.5144 0.0520 sec/batch\n", + "Epoch 10/20 Iteration 1737/3560 Training loss: 1.5144 0.0542 sec/batch\n", + "Epoch 10/20 Iteration 1738/3560 Training loss: 1.5144 0.0569 sec/batch\n", + "Epoch 10/20 Iteration 1739/3560 Training loss: 1.5146 0.0532 sec/batch\n", + "Epoch 10/20 Iteration 1740/3560 Training loss: 1.5147 0.0587 sec/batch\n", + "Epoch 10/20 Iteration 1741/3560 Training loss: 1.5147 0.0524 sec/batch\n", + "Epoch 10/20 Iteration 1742/3560 Training loss: 1.5145 0.0535 sec/batch\n", + "Epoch 10/20 Iteration 1743/3560 Training loss: 1.5148 0.0515 sec/batch\n", + "Epoch 10/20 Iteration 1744/3560 Training loss: 1.5148 0.0520 sec/batch\n", + "Epoch 10/20 
Iteration 1745/3560 Training loss: 1.5148 0.0514 sec/batch\n", + "Epoch 10/20 Iteration 1746/3560 Training loss: 1.5150 0.0503 sec/batch\n", + "Epoch 10/20 Iteration 1747/3560 Training loss: 1.5148 0.0527 sec/batch\n", + "Epoch 10/20 Iteration 1748/3560 Training loss: 1.5150 0.0595 sec/batch\n", + "Epoch 10/20 Iteration 1749/3560 Training loss: 1.5150 0.0740 sec/batch\n", + "Epoch 10/20 Iteration 1750/3560 Training loss: 1.5153 0.0559 sec/batch\n", + "Epoch 10/20 Iteration 1751/3560 Training loss: 1.5153 0.0517 sec/batch\n", + "Epoch 10/20 Iteration 1752/3560 Training loss: 1.5152 0.0502 sec/batch\n", + "Epoch 10/20 Iteration 1753/3560 Training loss: 1.5148 0.0512 sec/batch\n", + "Epoch 10/20 Iteration 1754/3560 Training loss: 1.5147 0.0521 sec/batch\n", + "Epoch 10/20 Iteration 1755/3560 Training loss: 1.5147 0.0504 sec/batch\n", + "Epoch 10/20 Iteration 1756/3560 Training loss: 1.5148 0.0529 sec/batch\n", + "Epoch 10/20 Iteration 1757/3560 Training loss: 1.5147 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1758/3560 Training loss: 1.5147 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1759/3560 Training loss: 1.5147 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1760/3560 Training loss: 1.5147 0.0525 sec/batch\n", + "Epoch 10/20 Iteration 1761/3560 Training loss: 1.5144 0.0570 sec/batch\n", + "Epoch 10/20 Iteration 1762/3560 Training loss: 1.5145 0.0547 sec/batch\n", + "Epoch 10/20 Iteration 1763/3560 Training loss: 1.5147 0.0518 sec/batch\n", + "Epoch 10/20 Iteration 1764/3560 Training loss: 1.5146 0.0515 sec/batch\n", + "Epoch 10/20 Iteration 1765/3560 Training loss: 1.5147 0.0533 sec/batch\n", + "Epoch 10/20 Iteration 1766/3560 Training loss: 1.5148 0.0564 sec/batch\n", + "Epoch 10/20 Iteration 1767/3560 Training loss: 1.5147 0.0523 sec/batch\n", + "Epoch 10/20 Iteration 1768/3560 Training loss: 1.5146 0.0524 sec/batch\n", + "Epoch 10/20 Iteration 1769/3560 Training loss: 1.5148 0.0518 sec/batch\n", + "Epoch 10/20 Iteration 1770/3560 Training loss: 1.5152 0.0552 
sec/batch\n", + "Epoch 10/20 Iteration 1771/3560 Training loss: 1.5152 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1772/3560 Training loss: 1.5152 0.0512 sec/batch\n", + "Epoch 10/20 Iteration 1773/3560 Training loss: 1.5151 0.0540 sec/batch\n", + "Epoch 10/20 Iteration 1774/3560 Training loss: 1.5150 0.0564 sec/batch\n", + "Epoch 10/20 Iteration 1775/3560 Training loss: 1.5151 0.0555 sec/batch\n", + "Epoch 10/20 Iteration 1776/3560 Training loss: 1.5150 0.0532 sec/batch\n", + "Epoch 10/20 Iteration 1777/3560 Training loss: 1.5151 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1778/3560 Training loss: 1.5149 0.0531 sec/batch\n", + "Epoch 10/20 Iteration 1779/3560 Training loss: 1.5147 0.0505 sec/batch\n", + "Epoch 10/20 Iteration 1780/3560 Training loss: 1.5148 0.0583 sec/batch\n", + "Epoch 11/20 Iteration 1781/3560 Training loss: 1.5809 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1782/3560 Training loss: 1.5478 0.0614 sec/batch\n", + "Epoch 11/20 Iteration 1783/3560 Training loss: 1.5330 0.0506 sec/batch\n", + "Epoch 11/20 Iteration 1784/3560 Training loss: 1.5283 0.0531 sec/batch\n", + "Epoch 11/20 Iteration 1785/3560 Training loss: 1.5203 0.0564 sec/batch\n", + "Epoch 11/20 Iteration 1786/3560 Training loss: 1.5104 0.0607 sec/batch\n", + "Epoch 11/20 Iteration 1787/3560 Training loss: 1.5114 0.0578 sec/batch\n", + "Epoch 11/20 Iteration 1788/3560 Training loss: 1.5083 0.0532 sec/batch\n", + "Epoch 11/20 Iteration 1789/3560 Training loss: 1.5088 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1790/3560 Training loss: 1.5081 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1791/3560 Training loss: 1.5048 0.0527 sec/batch\n", + "Epoch 11/20 Iteration 1792/3560 Training loss: 1.5041 0.0566 sec/batch\n", + "Epoch 11/20 Iteration 1793/3560 Training loss: 1.5042 0.0508 sec/batch\n", + "Epoch 11/20 Iteration 1794/3560 Training loss: 1.5059 0.0550 sec/batch\n", + "Epoch 11/20 Iteration 1795/3560 Training loss: 1.5043 0.0506 sec/batch\n", + "Epoch 11/20 Iteration 1796/3560 
Training loss: 1.5026 0.0538 sec/batch\n", + "Epoch 11/20 Iteration 1797/3560 Training loss: 1.5026 0.0505 sec/batch\n", + "Epoch 11/20 Iteration 1798/3560 Training loss: 1.5045 0.0514 sec/batch\n", + "Epoch 11/20 Iteration 1799/3560 Training loss: 1.5047 0.0541 sec/batch\n", + "Epoch 11/20 Iteration 1800/3560 Training loss: 1.5060 0.0531 sec/batch\n", + "Epoch 11/20 Iteration 1801/3560 Training loss: 1.5050 0.0528 sec/batch\n", + "Epoch 11/20 Iteration 1802/3560 Training loss: 1.5051 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1803/3560 Training loss: 1.5042 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1804/3560 Training loss: 1.5039 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1805/3560 Training loss: 1.5039 0.0559 sec/batch\n", + "Epoch 11/20 Iteration 1806/3560 Training loss: 1.5023 0.0612 sec/batch\n", + "Epoch 11/20 Iteration 1807/3560 Training loss: 1.5012 0.0526 sec/batch\n", + "Epoch 11/20 Iteration 1808/3560 Training loss: 1.5020 0.0502 sec/batch\n", + "Epoch 11/20 Iteration 1809/3560 Training loss: 1.5025 0.0501 sec/batch\n", + "Epoch 11/20 Iteration 1810/3560 Training loss: 1.5032 0.0512 sec/batch\n", + "Epoch 11/20 Iteration 1811/3560 Training loss: 1.5028 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1812/3560 Training loss: 1.5020 0.0540 sec/batch\n", + "Epoch 11/20 Iteration 1813/3560 Training loss: 1.5022 0.0512 sec/batch\n", + "Epoch 11/20 Iteration 1814/3560 Training loss: 1.5022 0.0622 sec/batch\n", + "Epoch 11/20 Iteration 1815/3560 Training loss: 1.5020 0.0558 sec/batch\n", + "Epoch 11/20 Iteration 1816/3560 Training loss: 1.5018 0.0531 sec/batch\n", + "Epoch 11/20 Iteration 1817/3560 Training loss: 1.5009 0.0520 sec/batch\n", + "Epoch 11/20 Iteration 1818/3560 Training loss: 1.4999 0.0588 sec/batch\n", + "Epoch 11/20 Iteration 1819/3560 Training loss: 1.4983 0.0562 sec/batch\n", + "Epoch 11/20 Iteration 1820/3560 Training loss: 1.4980 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1821/3560 Training loss: 1.4975 0.0542 sec/batch\n", + 
"Epoch 11/20 Iteration 1822/3560 Training loss: 1.4982 0.0581 sec/batch\n", + "Epoch 11/20 Iteration 1823/3560 Training loss: 1.4977 0.0540 sec/batch\n", + "Epoch 11/20 Iteration 1824/3560 Training loss: 1.4971 0.0542 sec/batch\n", + "Epoch 11/20 Iteration 1825/3560 Training loss: 1.4974 0.0546 sec/batch\n", + "Epoch 11/20 Iteration 1826/3560 Training loss: 1.4964 0.0504 sec/batch\n", + "Epoch 11/20 Iteration 1827/3560 Training loss: 1.4961 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1828/3560 Training loss: 1.4958 0.0522 sec/batch\n", + "Epoch 11/20 Iteration 1829/3560 Training loss: 1.4957 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1830/3560 Training loss: 1.4961 0.0530 sec/batch\n", + "Epoch 11/20 Iteration 1831/3560 Training loss: 1.4958 0.0527 sec/batch\n", + "Epoch 11/20 Iteration 1832/3560 Training loss: 1.4967 0.0519 sec/batch\n", + "Epoch 11/20 Iteration 1833/3560 Training loss: 1.4968 0.0509 sec/batch\n", + "Epoch 11/20 Iteration 1834/3560 Training loss: 1.4970 0.0576 sec/batch\n", + "Epoch 11/20 Iteration 1835/3560 Training loss: 1.4968 0.0551 sec/batch\n", + "Epoch 11/20 Iteration 1836/3560 Training loss: 1.4969 0.0514 sec/batch\n", + "Epoch 11/20 Iteration 1837/3560 Training loss: 1.4972 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1838/3560 Training loss: 1.4969 0.0565 sec/batch\n", + "Epoch 11/20 Iteration 1839/3560 Training loss: 1.4965 0.0549 sec/batch\n", + "Epoch 11/20 Iteration 1840/3560 Training loss: 1.4972 0.0552 sec/batch\n", + "Epoch 11/20 Iteration 1841/3560 Training loss: 1.4973 0.0545 sec/batch\n", + "Epoch 11/20 Iteration 1842/3560 Training loss: 1.4981 0.0510 sec/batch\n", + "Epoch 11/20 Iteration 1843/3560 Training loss: 1.4985 0.0589 sec/batch\n", + "Epoch 11/20 Iteration 1844/3560 Training loss: 1.4989 0.0522 sec/batch\n", + "Epoch 11/20 Iteration 1845/3560 Training loss: 1.4987 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1846/3560 Training loss: 1.4991 0.0508 sec/batch\n", + "Epoch 11/20 Iteration 1847/3560 Training loss: 
1.4993 0.0522 sec/batch\n", + "Epoch 11/20 Iteration 1848/3560 Training loss: 1.4990 0.0506 sec/batch\n", + "Epoch 11/20 Iteration 1849/3560 Training loss: 1.4990 0.0513 sec/batch\n", + "Epoch 11/20 Iteration 1850/3560 Training loss: 1.4988 0.0526 sec/batch\n", + "Epoch 11/20 Iteration 1851/3560 Training loss: 1.4995 0.0543 sec/batch\n", + "Epoch 11/20 Iteration 1852/3560 Training loss: 1.4996 0.0519 sec/batch\n", + "Epoch 11/20 Iteration 1853/3560 Training loss: 1.5000 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1854/3560 Training loss: 1.4997 0.0542 sec/batch\n", + "Epoch 11/20 Iteration 1855/3560 Training loss: 1.4997 0.0527 sec/batch\n", + "Epoch 11/20 Iteration 1856/3560 Training loss: 1.4999 0.0568 sec/batch\n", + "Epoch 11/20 Iteration 1857/3560 Training loss: 1.4997 0.0498 sec/batch\n", + "Epoch 11/20 Iteration 1858/3560 Training loss: 1.4997 0.0534 sec/batch\n", + "Epoch 11/20 Iteration 1859/3560 Training loss: 1.4991 0.0508 sec/batch\n", + "Epoch 11/20 Iteration 1860/3560 Training loss: 1.4989 0.0533 sec/batch\n", + "Epoch 11/20 Iteration 1861/3560 Training loss: 1.4984 0.0528 sec/batch\n", + "Epoch 11/20 Iteration 1862/3560 Training loss: 1.4985 0.0552 sec/batch\n", + "Epoch 11/20 Iteration 1863/3560 Training loss: 1.4980 0.0555 sec/batch\n", + "Epoch 11/20 Iteration 1864/3560 Training loss: 1.4979 0.0526 sec/batch\n", + "Epoch 11/20 Iteration 1865/3560 Training loss: 1.4975 0.0527 sec/batch\n", + "Epoch 11/20 Iteration 1866/3560 Training loss: 1.4972 0.0583 sec/batch\n", + "Epoch 11/20 Iteration 1867/3560 Training loss: 1.4969 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1868/3560 Training loss: 1.4966 0.0540 sec/batch\n", + "Epoch 11/20 Iteration 1869/3560 Training loss: 1.4962 0.0531 sec/batch\n", + "Epoch 11/20 Iteration 1870/3560 Training loss: 1.4963 0.0583 sec/batch\n", + "Epoch 11/20 Iteration 1871/3560 Training loss: 1.4959 0.0539 sec/batch\n", + "Epoch 11/20 Iteration 1872/3560 Training loss: 1.4958 0.0523 sec/batch\n", + "Epoch 11/20 
Iteration 1873/3560 Training loss: 1.4953 0.0506 sec/batch\n", + "Epoch 11/20 Iteration 1874/3560 Training loss: 1.4950 0.0530 sec/batch\n", + "Epoch 11/20 Iteration 1875/3560 Training loss: 1.4946 0.0520 sec/batch\n", + "Epoch 11/20 Iteration 1876/3560 Training loss: 1.4947 0.0499 sec/batch\n", + "Epoch 11/20 Iteration 1877/3560 Training loss: 1.4946 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1878/3560 Training loss: 1.4941 0.0525 sec/batch\n", + "Epoch 11/20 Iteration 1879/3560 Training loss: 1.4937 0.0530 sec/batch\n", + "Epoch 11/20 Iteration 1880/3560 Training loss: 1.4933 0.0533 sec/batch\n", + "Epoch 11/20 Iteration 1881/3560 Training loss: 1.4933 0.0507 sec/batch\n", + "Epoch 11/20 Iteration 1882/3560 Training loss: 1.4932 0.0546 sec/batch\n", + "Epoch 11/20 Iteration 1883/3560 Training loss: 1.4929 0.0498 sec/batch\n", + "Epoch 11/20 Iteration 1884/3560 Training loss: 1.4929 0.0505 sec/batch\n", + "Epoch 11/20 Iteration 1885/3560 Training loss: 1.4926 0.0566 sec/batch\n", + "Epoch 11/20 Iteration 1886/3560 Training loss: 1.4925 0.0586 sec/batch\n", + "Epoch 11/20 Iteration 1887/3560 Training loss: 1.4925 0.0541 sec/batch\n", + "Epoch 11/20 Iteration 1888/3560 Training loss: 1.4925 0.0559 sec/batch\n", + "Epoch 11/20 Iteration 1889/3560 Training loss: 1.4924 0.0550 sec/batch\n", + "Epoch 11/20 Iteration 1890/3560 Training loss: 1.4924 0.0548 sec/batch\n", + "Epoch 11/20 Iteration 1891/3560 Training loss: 1.4922 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1892/3560 Training loss: 1.4920 0.0503 sec/batch\n", + "Epoch 11/20 Iteration 1893/3560 Training loss: 1.4919 0.0519 sec/batch\n", + "Epoch 11/20 Iteration 1894/3560 Training loss: 1.4917 0.0551 sec/batch\n", + "Epoch 11/20 Iteration 1895/3560 Training loss: 1.4915 0.0512 sec/batch\n", + "Epoch 11/20 Iteration 1896/3560 Training loss: 1.4911 0.0536 sec/batch\n", + "Epoch 11/20 Iteration 1897/3560 Training loss: 1.4911 0.0514 sec/batch\n", + "Epoch 11/20 Iteration 1898/3560 Training loss: 1.4910 0.0585 
sec/batch\n", + "Epoch 11/20 Iteration 1899/3560 Training loss: 1.4909 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1900/3560 Training loss: 1.4908 0.0590 sec/batch\n", + "Epoch 11/20 Iteration 1901/3560 Training loss: 1.4907 0.0506 sec/batch\n", + "Epoch 11/20 Iteration 1902/3560 Training loss: 1.4903 0.0533 sec/batch\n", + "Epoch 11/20 Iteration 1903/3560 Training loss: 1.4898 0.0676 sec/batch\n", + "Epoch 11/20 Iteration 1904/3560 Training loss: 1.4899 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1905/3560 Training loss: 1.4899 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1906/3560 Training loss: 1.4895 0.0514 sec/batch\n", + "Epoch 11/20 Iteration 1907/3560 Training loss: 1.4896 0.0547 sec/batch\n", + "Epoch 11/20 Iteration 1908/3560 Training loss: 1.4895 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1909/3560 Training loss: 1.4893 0.0527 sec/batch\n", + "Epoch 11/20 Iteration 1910/3560 Training loss: 1.4890 0.0512 sec/batch\n", + "Epoch 11/20 Iteration 1911/3560 Training loss: 1.4886 0.0509 sec/batch\n", + "Epoch 11/20 Iteration 1912/3560 Training loss: 1.4883 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1913/3560 Training loss: 1.4883 0.0536 sec/batch\n", + "Epoch 11/20 Iteration 1914/3560 Training loss: 1.4883 0.0539 sec/batch\n", + "Epoch 11/20 Iteration 1915/3560 Training loss: 1.4884 0.0507 sec/batch\n", + "Epoch 11/20 Iteration 1916/3560 Training loss: 1.4884 0.0511 sec/batch\n", + "Epoch 11/20 Iteration 1917/3560 Training loss: 1.4887 0.0573 sec/batch\n", + "Epoch 11/20 Iteration 1918/3560 Training loss: 1.4887 0.0546 sec/batch\n", + "Epoch 11/20 Iteration 1919/3560 Training loss: 1.4886 0.0602 sec/batch\n", + "Epoch 11/20 Iteration 1920/3560 Training loss: 1.4885 0.0546 sec/batch\n", + "Epoch 11/20 Iteration 1921/3560 Training loss: 1.4888 0.0511 sec/batch\n", + "Epoch 11/20 Iteration 1922/3560 Training loss: 1.4888 0.0535 sec/batch\n", + "Epoch 11/20 Iteration 1923/3560 Training loss: 1.4888 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1924/3560 
Training loss: 1.4890 0.0509 sec/batch\n", + "Epoch 11/20 Iteration 1925/3560 Training loss: 1.4888 0.0597 sec/batch\n", + "Epoch 11/20 Iteration 1926/3560 Training loss: 1.4890 0.0533 sec/batch\n", + "Epoch 11/20 Iteration 1927/3560 Training loss: 1.4890 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1928/3560 Training loss: 1.4892 0.0519 sec/batch\n", + "Epoch 11/20 Iteration 1929/3560 Training loss: 1.4892 0.0511 sec/batch\n", + "Epoch 11/20 Iteration 1930/3560 Training loss: 1.4891 0.0504 sec/batch\n", + "Epoch 11/20 Iteration 1931/3560 Training loss: 1.4888 0.0510 sec/batch\n", + "Epoch 11/20 Iteration 1932/3560 Training loss: 1.4887 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1933/3560 Training loss: 1.4888 0.0529 sec/batch\n", + "Epoch 11/20 Iteration 1934/3560 Training loss: 1.4888 0.0510 sec/batch\n", + "Epoch 11/20 Iteration 1935/3560 Training loss: 1.4888 0.0587 sec/batch\n", + "Epoch 11/20 Iteration 1936/3560 Training loss: 1.4887 0.0534 sec/batch\n", + "Epoch 11/20 Iteration 1937/3560 Training loss: 1.4888 0.0520 sec/batch\n", + "Epoch 11/20 Iteration 1938/3560 Training loss: 1.4888 0.0522 sec/batch\n", + "Epoch 11/20 Iteration 1939/3560 Training loss: 1.4886 0.0533 sec/batch\n", + "Epoch 11/20 Iteration 1940/3560 Training loss: 1.4886 0.0572 sec/batch\n", + "Epoch 11/20 Iteration 1941/3560 Training loss: 1.4888 0.0551 sec/batch\n", + "Epoch 11/20 Iteration 1942/3560 Training loss: 1.4888 0.0505 sec/batch\n", + "Epoch 11/20 Iteration 1943/3560 Training loss: 1.4888 0.0531 sec/batch\n", + "Epoch 11/20 Iteration 1944/3560 Training loss: 1.4888 0.0504 sec/batch\n", + "Epoch 11/20 Iteration 1945/3560 Training loss: 1.4888 0.0519 sec/batch\n", + "Epoch 11/20 Iteration 1946/3560 Training loss: 1.4887 0.0516 sec/batch\n", + "Epoch 11/20 Iteration 1947/3560 Training loss: 1.4889 0.0519 sec/batch\n", + "Epoch 11/20 Iteration 1948/3560 Training loss: 1.4893 0.0558 sec/batch\n", + "Epoch 11/20 Iteration 1949/3560 Training loss: 1.4893 0.0500 sec/batch\n", + 
"Epoch 11/20 Iteration 1950/3560 Training loss: 1.4893 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1951/3560 Training loss: 1.4892 0.0535 sec/batch\n", + "Epoch 11/20 Iteration 1952/3560 Training loss: 1.4890 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1953/3560 Training loss: 1.4891 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1954/3560 Training loss: 1.4892 0.0509 sec/batch\n", + "Epoch 11/20 Iteration 1955/3560 Training loss: 1.4893 0.0540 sec/batch\n", + "Epoch 11/20 Iteration 1956/3560 Training loss: 1.4891 0.0508 sec/batch\n", + "Epoch 11/20 Iteration 1957/3560 Training loss: 1.4890 0.0522 sec/batch\n", + "Epoch 11/20 Iteration 1958/3560 Training loss: 1.4891 0.0568 sec/batch\n", + "Epoch 12/20 Iteration 1959/3560 Training loss: 1.5506 0.0540 sec/batch\n", + "Epoch 12/20 Iteration 1960/3560 Training loss: 1.5150 0.0516 sec/batch\n", + "Epoch 12/20 Iteration 1961/3560 Training loss: 1.5065 0.0530 sec/batch\n", + "Epoch 12/20 Iteration 1962/3560 Training loss: 1.5044 0.0513 sec/batch\n", + "Epoch 12/20 Iteration 1963/3560 Training loss: 1.4963 0.0502 sec/batch\n", + "Epoch 12/20 Iteration 1964/3560 Training loss: 1.4847 0.0544 sec/batch\n", + "Epoch 12/20 Iteration 1965/3560 Training loss: 1.4861 0.0548 sec/batch\n", + "Epoch 12/20 Iteration 1966/3560 Training loss: 1.4837 0.0525 sec/batch\n", + "Epoch 12/20 Iteration 1967/3560 Training loss: 1.4839 0.0513 sec/batch\n", + "Epoch 12/20 Iteration 1968/3560 Training loss: 1.4818 0.0561 sec/batch\n", + "Epoch 12/20 Iteration 1969/3560 Training loss: 1.4791 0.0510 sec/batch\n", + "Epoch 12/20 Iteration 1970/3560 Training loss: 1.4786 0.0525 sec/batch\n", + "Epoch 12/20 Iteration 1971/3560 Training loss: 1.4785 0.0516 sec/batch\n", + "Epoch 12/20 Iteration 1972/3560 Training loss: 1.4801 0.0531 sec/batch\n", + "Epoch 12/20 Iteration 1973/3560 Training loss: 1.4792 0.0560 sec/batch\n", + "Epoch 12/20 Iteration 1974/3560 Training loss: 1.4778 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 1975/3560 Training loss: 
1.4778 0.0534 sec/batch\n", + "Epoch 12/20 Iteration 1976/3560 Training loss: 1.4798 0.0537 sec/batch\n", + "Epoch 12/20 Iteration 1977/3560 Training loss: 1.4799 0.0512 sec/batch\n", + "Epoch 12/20 Iteration 1978/3560 Training loss: 1.4816 0.0523 sec/batch\n", + "Epoch 12/20 Iteration 1979/3560 Training loss: 1.4807 0.0644 sec/batch\n", + "Epoch 12/20 Iteration 1980/3560 Training loss: 1.4812 0.0520 sec/batch\n", + "Epoch 12/20 Iteration 1981/3560 Training loss: 1.4805 0.0523 sec/batch\n", + "Epoch 12/20 Iteration 1982/3560 Training loss: 1.4804 0.0512 sec/batch\n", + "Epoch 12/20 Iteration 1983/3560 Training loss: 1.4804 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 1984/3560 Training loss: 1.4786 0.0543 sec/batch\n", + "Epoch 12/20 Iteration 1985/3560 Training loss: 1.4774 0.0512 sec/batch\n", + "Epoch 12/20 Iteration 1986/3560 Training loss: 1.4781 0.0523 sec/batch\n", + "Epoch 12/20 Iteration 1987/3560 Training loss: 1.4785 0.0558 sec/batch\n", + "Epoch 12/20 Iteration 1988/3560 Training loss: 1.4789 0.0519 sec/batch\n", + "Epoch 12/20 Iteration 1989/3560 Training loss: 1.4788 0.0554 sec/batch\n", + "Epoch 12/20 Iteration 1990/3560 Training loss: 1.4781 0.0529 sec/batch\n", + "Epoch 12/20 Iteration 1991/3560 Training loss: 1.4786 0.0546 sec/batch\n", + "Epoch 12/20 Iteration 1992/3560 Training loss: 1.4786 0.0535 sec/batch\n", + "Epoch 12/20 Iteration 1993/3560 Training loss: 1.4784 0.0503 sec/batch\n", + "Epoch 12/20 Iteration 1994/3560 Training loss: 1.4786 0.0511 sec/batch\n", + "Epoch 12/20 Iteration 1995/3560 Training loss: 1.4778 0.0621 sec/batch\n", + "Epoch 12/20 Iteration 1996/3560 Training loss: 1.4769 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 1997/3560 Training loss: 1.4755 0.0529 sec/batch\n", + "Epoch 12/20 Iteration 1998/3560 Training loss: 1.4753 0.0567 sec/batch\n", + "Epoch 12/20 Iteration 1999/3560 Training loss: 1.4748 0.0518 sec/batch\n", + "Epoch 12/20 Iteration 2000/3560 Training loss: 1.4753 0.0525 sec/batch\n", + "Epoch 12/20 
Iteration 2001/3560 Training loss: 1.4747 0.0567 sec/batch\n", + "Epoch 12/20 Iteration 2002/3560 Training loss: 1.4742 0.0550 sec/batch\n", + "Epoch 12/20 Iteration 2003/3560 Training loss: 1.4746 0.0586 sec/batch\n", + "Epoch 12/20 Iteration 2004/3560 Training loss: 1.4739 0.0516 sec/batch\n", + "Epoch 12/20 Iteration 2005/3560 Training loss: 1.4734 0.0530 sec/batch\n", + "Epoch 12/20 Iteration 2006/3560 Training loss: 1.4734 0.0532 sec/batch\n", + "Epoch 12/20 Iteration 2007/3560 Training loss: 1.4732 0.0507 sec/batch\n", + "Epoch 12/20 Iteration 2008/3560 Training loss: 1.4736 0.0516 sec/batch\n", + "Epoch 12/20 Iteration 2009/3560 Training loss: 1.4730 0.0556 sec/batch\n", + "Epoch 12/20 Iteration 2010/3560 Training loss: 1.4737 0.0542 sec/batch\n", + "Epoch 12/20 Iteration 2011/3560 Training loss: 1.4736 0.0505 sec/batch\n", + "Epoch 12/20 Iteration 2012/3560 Training loss: 1.4737 0.0503 sec/batch\n", + "Epoch 12/20 Iteration 2013/3560 Training loss: 1.4735 0.0543 sec/batch\n", + "Epoch 12/20 Iteration 2014/3560 Training loss: 1.4734 0.0504 sec/batch\n", + "Epoch 12/20 Iteration 2015/3560 Training loss: 1.4738 0.0518 sec/batch\n", + "Epoch 12/20 Iteration 2016/3560 Training loss: 1.4735 0.0654 sec/batch\n", + "Epoch 12/20 Iteration 2017/3560 Training loss: 1.4730 0.0553 sec/batch\n", + "Epoch 12/20 Iteration 2018/3560 Training loss: 1.4738 0.0576 sec/batch\n", + "Epoch 12/20 Iteration 2019/3560 Training loss: 1.4737 0.0516 sec/batch\n", + "Epoch 12/20 Iteration 2020/3560 Training loss: 1.4746 0.0578 sec/batch\n", + "Epoch 12/20 Iteration 2021/3560 Training loss: 1.4750 0.0545 sec/batch\n", + "Epoch 12/20 Iteration 2022/3560 Training loss: 1.4752 0.0518 sec/batch\n", + "Epoch 12/20 Iteration 2023/3560 Training loss: 1.4751 0.0626 sec/batch\n", + "Epoch 12/20 Iteration 2024/3560 Training loss: 1.4753 0.0526 sec/batch\n", + "Epoch 12/20 Iteration 2025/3560 Training loss: 1.4754 0.0598 sec/batch\n", + "Epoch 12/20 Iteration 2026/3560 Training loss: 1.4752 0.0515 
sec/batch\n", + "Epoch 12/20 Iteration 2027/3560 Training loss: 1.4752 0.0514 sec/batch\n", + "Epoch 12/20 Iteration 2028/3560 Training loss: 1.4751 0.0512 sec/batch\n", + "Epoch 12/20 Iteration 2029/3560 Training loss: 1.4758 0.0579 sec/batch\n", + "Epoch 12/20 Iteration 2030/3560 Training loss: 1.4760 0.0508 sec/batch\n", + "Epoch 12/20 Iteration 2031/3560 Training loss: 1.4764 0.0586 sec/batch\n", + "Epoch 12/20 Iteration 2032/3560 Training loss: 1.4762 0.0530 sec/batch\n", + "Epoch 12/20 Iteration 2033/3560 Training loss: 1.4761 0.0541 sec/batch\n", + "Epoch 12/20 Iteration 2034/3560 Training loss: 1.4763 0.0530 sec/batch\n", + "Epoch 12/20 Iteration 2035/3560 Training loss: 1.4761 0.0524 sec/batch\n", + "Epoch 12/20 Iteration 2036/3560 Training loss: 1.4762 0.0501 sec/batch\n", + "Epoch 12/20 Iteration 2037/3560 Training loss: 1.4756 0.0495 sec/batch\n", + "Epoch 12/20 Iteration 2038/3560 Training loss: 1.4755 0.0504 sec/batch\n", + "Epoch 12/20 Iteration 2039/3560 Training loss: 1.4750 0.0524 sec/batch\n", + "Epoch 12/20 Iteration 2040/3560 Training loss: 1.4750 0.0517 sec/batch\n", + "Epoch 12/20 Iteration 2041/3560 Training loss: 1.4743 0.0516 sec/batch\n", + "Epoch 12/20 Iteration 2042/3560 Training loss: 1.4745 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 2043/3560 Training loss: 1.4741 0.0601 sec/batch\n", + "Epoch 12/20 Iteration 2044/3560 Training loss: 1.4738 0.0542 sec/batch\n", + "Epoch 12/20 Iteration 2045/3560 Training loss: 1.4736 0.0524 sec/batch\n", + "Epoch 12/20 Iteration 2046/3560 Training loss: 1.4734 0.0519 sec/batch\n", + "Epoch 12/20 Iteration 2047/3560 Training loss: 1.4730 0.0514 sec/batch\n", + "Epoch 12/20 Iteration 2048/3560 Training loss: 1.4730 0.0513 sec/batch\n", + "Epoch 12/20 Iteration 2049/3560 Training loss: 1.4727 0.0522 sec/batch\n", + "Epoch 12/20 Iteration 2050/3560 Training loss: 1.4725 0.0510 sec/batch\n", + "Epoch 12/20 Iteration 2051/3560 Training loss: 1.4721 0.0523 sec/batch\n", + "Epoch 12/20 Iteration 2052/3560 
Training loss: 1.4717 0.0532 sec/batch\n", + "Epoch 12/20 Iteration 2136/3560 Training loss: 1.4669 0.0503 sec/batch\n", + "Epoch 13/20 Iteration 2137/3560 Training loss: 1.5402 0.0502 sec/batch\n", + "Epoch 13/20 Iteration 2314/3560 Training loss: 1.4478 0.0533 sec/batch\n", + "Epoch 14/20 Iteration 2315/3560 Training loss: 1.5251 0.0550 sec/batch\n", + "Epoch 14/20 Iteration 2492/3560 Training loss: 1.4300 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2493/3560 Training loss: 1.5036 0.0550 sec/batch\n", + "Epoch 15/20 Iteration 2563/3560 Training loss: 1.4217 0.0539 sec/batch\n", + "Epoch 15/20 Iteration 2564/3560 
Training loss: 1.4219 0.0505 sec/batch\n", + "Epoch 15/20 Iteration 2565/3560 Training loss: 1.4225 0.0513 sec/batch\n", + "Epoch 15/20 Iteration 2566/3560 Training loss: 1.4223 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2567/3560 Training loss: 1.4223 0.0531 sec/batch\n", + "Epoch 15/20 Iteration 2568/3560 Training loss: 1.4225 0.0533 sec/batch\n", + "Epoch 15/20 Iteration 2569/3560 Training loss: 1.4224 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2570/3560 Training loss: 1.4224 0.0534 sec/batch\n", + "Epoch 15/20 Iteration 2571/3560 Training loss: 1.4218 0.0529 sec/batch\n", + "Epoch 15/20 Iteration 2572/3560 Training loss: 1.4218 0.0526 sec/batch\n", + "Epoch 15/20 Iteration 2573/3560 Training loss: 1.4213 0.0589 sec/batch\n", + "Epoch 15/20 Iteration 2574/3560 Training loss: 1.4213 0.0601 sec/batch\n", + "Epoch 15/20 Iteration 2575/3560 Training loss: 1.4208 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2576/3560 Training loss: 1.4207 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2577/3560 Training loss: 1.4204 0.0509 sec/batch\n", + "Epoch 15/20 Iteration 2578/3560 Training loss: 1.4202 0.0555 sec/batch\n", + "Epoch 15/20 Iteration 2579/3560 Training loss: 1.4200 0.0522 sec/batch\n", + "Epoch 15/20 Iteration 2580/3560 Training loss: 1.4197 0.0562 sec/batch\n", + "Epoch 15/20 Iteration 2581/3560 Training loss: 1.4193 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2582/3560 Training loss: 1.4194 0.0526 sec/batch\n", + "Epoch 15/20 Iteration 2583/3560 Training loss: 1.4190 0.0584 sec/batch\n", + "Epoch 15/20 Iteration 2584/3560 Training loss: 1.4190 0.0518 sec/batch\n", + "Epoch 15/20 Iteration 2585/3560 Training loss: 1.4186 0.0608 sec/batch\n", + "Epoch 15/20 Iteration 2586/3560 Training loss: 1.4184 0.0529 sec/batch\n", + "Epoch 15/20 Iteration 2587/3560 Training loss: 1.4181 0.0540 sec/batch\n", + "Epoch 15/20 Iteration 2588/3560 Training loss: 1.4182 0.0517 sec/batch\n", + "Epoch 15/20 Iteration 2589/3560 Training loss: 1.4182 0.0517 sec/batch\n", + 
"Epoch 15/20 Iteration 2590/3560 Training loss: 1.4179 0.0529 sec/batch\n", + "Epoch 15/20 Iteration 2591/3560 Training loss: 1.4175 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2592/3560 Training loss: 1.4171 0.0599 sec/batch\n", + "Epoch 15/20 Iteration 2593/3560 Training loss: 1.4171 0.0534 sec/batch\n", + "Epoch 15/20 Iteration 2594/3560 Training loss: 1.4170 0.0542 sec/batch\n", + "Epoch 15/20 Iteration 2595/3560 Training loss: 1.4168 0.0564 sec/batch\n", + "Epoch 15/20 Iteration 2596/3560 Training loss: 1.4167 0.0624 sec/batch\n", + "Epoch 15/20 Iteration 2597/3560 Training loss: 1.4165 0.0550 sec/batch\n", + "Epoch 15/20 Iteration 2598/3560 Training loss: 1.4164 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2599/3560 Training loss: 1.4164 0.0614 sec/batch\n", + "Epoch 15/20 Iteration 2600/3560 Training loss: 1.4163 0.0516 sec/batch\n", + "Epoch 15/20 Iteration 2601/3560 Training loss: 1.4162 0.0574 sec/batch\n", + "Epoch 15/20 Iteration 2602/3560 Training loss: 1.4162 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2603/3560 Training loss: 1.4161 0.0627 sec/batch\n", + "Epoch 15/20 Iteration 2604/3560 Training loss: 1.4160 0.0534 sec/batch\n", + "Epoch 15/20 Iteration 2605/3560 Training loss: 1.4158 0.0540 sec/batch\n", + "Epoch 15/20 Iteration 2606/3560 Training loss: 1.4157 0.0513 sec/batch\n", + "Epoch 15/20 Iteration 2607/3560 Training loss: 1.4155 0.0514 sec/batch\n", + "Epoch 15/20 Iteration 2608/3560 Training loss: 1.4153 0.0510 sec/batch\n", + "Epoch 15/20 Iteration 2609/3560 Training loss: 1.4152 0.0517 sec/batch\n", + "Epoch 15/20 Iteration 2610/3560 Training loss: 1.4153 0.0566 sec/batch\n", + "Epoch 15/20 Iteration 2611/3560 Training loss: 1.4152 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2612/3560 Training loss: 1.4152 0.0561 sec/batch\n", + "Epoch 15/20 Iteration 2613/3560 Training loss: 1.4152 0.0573 sec/batch\n", + "Epoch 15/20 Iteration 2614/3560 Training loss: 1.4148 0.0524 sec/batch\n", + "Epoch 15/20 Iteration 2615/3560 Training loss: 
1.4144 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2616/3560 Training loss: 1.4144 0.0517 sec/batch\n", + "Epoch 15/20 Iteration 2617/3560 Training loss: 1.4144 0.0529 sec/batch\n", + "Epoch 15/20 Iteration 2618/3560 Training loss: 1.4139 0.0510 sec/batch\n", + "Epoch 15/20 Iteration 2619/3560 Training loss: 1.4140 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2620/3560 Training loss: 1.4140 0.0542 sec/batch\n", + "Epoch 15/20 Iteration 2621/3560 Training loss: 1.4138 0.0559 sec/batch\n", + "Epoch 15/20 Iteration 2622/3560 Training loss: 1.4135 0.0530 sec/batch\n", + "Epoch 15/20 Iteration 2623/3560 Training loss: 1.4132 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2624/3560 Training loss: 1.4129 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2625/3560 Training loss: 1.4130 0.0554 sec/batch\n", + "Epoch 15/20 Iteration 2626/3560 Training loss: 1.4131 0.0520 sec/batch\n", + "Epoch 15/20 Iteration 2627/3560 Training loss: 1.4131 0.0510 sec/batch\n", + "Epoch 15/20 Iteration 2628/3560 Training loss: 1.4131 0.0549 sec/batch\n", + "Epoch 15/20 Iteration 2629/3560 Training loss: 1.4133 0.0531 sec/batch\n", + "Epoch 15/20 Iteration 2630/3560 Training loss: 1.4134 0.0536 sec/batch\n", + "Epoch 15/20 Iteration 2631/3560 Training loss: 1.4134 0.0559 sec/batch\n", + "Epoch 15/20 Iteration 2632/3560 Training loss: 1.4134 0.0522 sec/batch\n", + "Epoch 15/20 Iteration 2633/3560 Training loss: 1.4137 0.0557 sec/batch\n", + "Epoch 15/20 Iteration 2634/3560 Training loss: 1.4138 0.0510 sec/batch\n", + "Epoch 15/20 Iteration 2635/3560 Training loss: 1.4138 0.0553 sec/batch\n", + "Epoch 15/20 Iteration 2636/3560 Training loss: 1.4140 0.0570 sec/batch\n", + "Epoch 15/20 Iteration 2637/3560 Training loss: 1.4140 0.0513 sec/batch\n", + "Epoch 15/20 Iteration 2638/3560 Training loss: 1.4143 0.0547 sec/batch\n", + "Epoch 15/20 Iteration 2639/3560 Training loss: 1.4144 0.0581 sec/batch\n", + "Epoch 15/20 Iteration 2640/3560 Training loss: 1.4147 0.0573 sec/batch\n", + "Epoch 15/20 
Iteration 2641/3560 Training loss: 1.4148 0.0510 sec/batch\n", + "Epoch 15/20 Iteration 2642/3560 Training loss: 1.4147 0.0534 sec/batch\n", + "Epoch 15/20 Iteration 2643/3560 Training loss: 1.4145 0.0580 sec/batch\n", + "Epoch 15/20 Iteration 2644/3560 Training loss: 1.4144 0.0539 sec/batch\n", + "Epoch 15/20 Iteration 2645/3560 Training loss: 1.4144 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2646/3560 Training loss: 1.4145 0.0512 sec/batch\n", + "Epoch 15/20 Iteration 2647/3560 Training loss: 1.4145 0.0529 sec/batch\n", + "Epoch 15/20 Iteration 2648/3560 Training loss: 1.4145 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2649/3560 Training loss: 1.4145 0.0509 sec/batch\n", + "Epoch 15/20 Iteration 2650/3560 Training loss: 1.4145 0.0526 sec/batch\n", + "Epoch 15/20 Iteration 2651/3560 Training loss: 1.4143 0.0572 sec/batch\n", + "Epoch 15/20 Iteration 2652/3560 Training loss: 1.4144 0.0526 sec/batch\n", + "Epoch 15/20 Iteration 2653/3560 Training loss: 1.4145 0.0524 sec/batch\n", + "Epoch 15/20 Iteration 2654/3560 Training loss: 1.4146 0.0542 sec/batch\n", + "Epoch 15/20 Iteration 2655/3560 Training loss: 1.4146 0.0508 sec/batch\n", + "Epoch 15/20 Iteration 2656/3560 Training loss: 1.4146 0.0566 sec/batch\n", + "Epoch 15/20 Iteration 2657/3560 Training loss: 1.4146 0.0539 sec/batch\n", + "Epoch 15/20 Iteration 2658/3560 Training loss: 1.4146 0.0525 sec/batch\n", + "Epoch 15/20 Iteration 2659/3560 Training loss: 1.4147 0.0563 sec/batch\n", + "Epoch 15/20 Iteration 2660/3560 Training loss: 1.4152 0.0540 sec/batch\n", + "Epoch 15/20 Iteration 2661/3560 Training loss: 1.4152 0.0528 sec/batch\n", + "Epoch 15/20 Iteration 2662/3560 Training loss: 1.4152 0.0594 sec/batch\n", + "Epoch 15/20 Iteration 2663/3560 Training loss: 1.4152 0.0637 sec/batch\n", + "Epoch 15/20 Iteration 2664/3560 Training loss: 1.4151 0.0510 sec/batch\n", + "Epoch 15/20 Iteration 2665/3560 Training loss: 1.4152 0.0626 sec/batch\n", + "Epoch 15/20 Iteration 2666/3560 Training loss: 1.4152 0.0517 
sec/batch\n", + "Epoch 15/20 Iteration 2667/3560 Training loss: 1.4152 0.0566 sec/batch\n", + "Epoch 15/20 Iteration 2668/3560 Training loss: 1.4151 0.0527 sec/batch\n", + "Epoch 15/20 Iteration 2669/3560 Training loss: 1.4150 0.0549 sec/batch\n", + "Epoch 15/20 Iteration 2670/3560 Training loss: 1.4151 0.0496 sec/batch\n", + "Epoch 16/20 Iteration 2671/3560 Training loss: 1.4812 0.0551 sec/batch\n", + "Epoch 16/20 Iteration 2672/3560 Training loss: 1.4530 0.0519 sec/batch\n", + "Epoch 16/20 Iteration 2673/3560 Training loss: 1.4425 0.0530 sec/batch\n", + "Epoch 16/20 Iteration 2674/3560 Training loss: 1.4401 0.0507 sec/batch\n", + "Epoch 16/20 Iteration 2675/3560 Training loss: 1.4323 0.0500 sec/batch\n", + "Epoch 16/20 Iteration 2676/3560 Training loss: 1.4204 0.0517 sec/batch\n", + "Epoch 16/20 Iteration 2677/3560 Training loss: 1.4206 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2678/3560 Training loss: 1.4171 0.0536 sec/batch\n", + "Epoch 16/20 Iteration 2679/3560 Training loss: 1.4179 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2680/3560 Training loss: 1.4164 0.0503 sec/batch\n", + "Epoch 16/20 Iteration 2681/3560 Training loss: 1.4125 0.0531 sec/batch\n", + "Epoch 16/20 Iteration 2682/3560 Training loss: 1.4115 0.0516 sec/batch\n", + "Epoch 16/20 Iteration 2683/3560 Training loss: 1.4118 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2684/3560 Training loss: 1.4130 0.0571 sec/batch\n", + "Epoch 16/20 Iteration 2685/3560 Training loss: 1.4119 0.0507 sec/batch\n", + "Epoch 16/20 Iteration 2686/3560 Training loss: 1.4103 0.0558 sec/batch\n", + "Epoch 16/20 Iteration 2687/3560 Training loss: 1.4108 0.0585 sec/batch\n", + "Epoch 16/20 Iteration 2688/3560 Training loss: 1.4119 0.0616 sec/batch\n", + "Epoch 16/20 Iteration 2689/3560 Training loss: 1.4117 0.0562 sec/batch\n", + "Epoch 16/20 Iteration 2690/3560 Training loss: 1.4132 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2691/3560 Training loss: 1.4128 0.0550 sec/batch\n", + "Epoch 16/20 Iteration 2692/3560 
Training loss: 1.4127 0.0518 sec/batch\n", + "Epoch 16/20 Iteration 2693/3560 Training loss: 1.4122 0.0529 sec/batch\n", + "Epoch 16/20 Iteration 2694/3560 Training loss: 1.4126 0.0531 sec/batch\n", + "Epoch 16/20 Iteration 2695/3560 Training loss: 1.4123 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2696/3560 Training loss: 1.4108 0.0536 sec/batch\n", + "Epoch 16/20 Iteration 2697/3560 Training loss: 1.4095 0.0529 sec/batch\n", + "Epoch 16/20 Iteration 2698/3560 Training loss: 1.4101 0.0552 sec/batch\n", + "Epoch 16/20 Iteration 2699/3560 Training loss: 1.4102 0.0517 sec/batch\n", + "Epoch 16/20 Iteration 2700/3560 Training loss: 1.4105 0.0533 sec/batch\n", + "Epoch 16/20 Iteration 2701/3560 Training loss: 1.4101 0.0512 sec/batch\n", + "Epoch 16/20 Iteration 2702/3560 Training loss: 1.4094 0.0586 sec/batch\n", + "Epoch 16/20 Iteration 2703/3560 Training loss: 1.4096 0.0576 sec/batch\n", + "Epoch 16/20 Iteration 2704/3560 Training loss: 1.4096 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2705/3560 Training loss: 1.4094 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2706/3560 Training loss: 1.4094 0.0566 sec/batch\n", + "Epoch 16/20 Iteration 2707/3560 Training loss: 1.4089 0.0534 sec/batch\n", + "Epoch 16/20 Iteration 2708/3560 Training loss: 1.4082 0.0515 sec/batch\n", + "Epoch 16/20 Iteration 2709/3560 Training loss: 1.4068 0.0542 sec/batch\n", + "Epoch 16/20 Iteration 2710/3560 Training loss: 1.4065 0.0577 sec/batch\n", + "Epoch 16/20 Iteration 2711/3560 Training loss: 1.4061 0.0517 sec/batch\n", + "Epoch 16/20 Iteration 2712/3560 Training loss: 1.4066 0.0535 sec/batch\n", + "Epoch 16/20 Iteration 2713/3560 Training loss: 1.4063 0.0524 sec/batch\n", + "Epoch 16/20 Iteration 2714/3560 Training loss: 1.4057 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2715/3560 Training loss: 1.4060 0.0560 sec/batch\n", + "Epoch 16/20 Iteration 2716/3560 Training loss: 1.4052 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2717/3560 Training loss: 1.4049 0.0534 sec/batch\n", + 
"Epoch 16/20 Iteration 2718/3560 Training loss: 1.4043 0.0523 sec/batch\n", + "Epoch 16/20 Iteration 2719/3560 Training loss: 1.4042 0.0619 sec/batch\n", + "Epoch 16/20 Iteration 2720/3560 Training loss: 1.4044 0.0548 sec/batch\n", + "Epoch 16/20 Iteration 2721/3560 Training loss: 1.4039 0.0504 sec/batch\n", + "Epoch 16/20 Iteration 2722/3560 Training loss: 1.4047 0.0508 sec/batch\n", + "Epoch 16/20 Iteration 2723/3560 Training loss: 1.4047 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2724/3560 Training loss: 1.4049 0.0513 sec/batch\n", + "Epoch 16/20 Iteration 2725/3560 Training loss: 1.4044 0.0523 sec/batch\n", + "Epoch 16/20 Iteration 2726/3560 Training loss: 1.4045 0.0522 sec/batch\n", + "Epoch 16/20 Iteration 2727/3560 Training loss: 1.4048 0.0517 sec/batch\n", + "Epoch 16/20 Iteration 2728/3560 Training loss: 1.4046 0.0533 sec/batch\n", + "Epoch 16/20 Iteration 2729/3560 Training loss: 1.4042 0.0506 sec/batch\n", + "Epoch 16/20 Iteration 2730/3560 Training loss: 1.4047 0.0522 sec/batch\n", + "Epoch 16/20 Iteration 2731/3560 Training loss: 1.4048 0.0553 sec/batch\n", + "Epoch 16/20 Iteration 2732/3560 Training loss: 1.4056 0.0583 sec/batch\n", + "Epoch 16/20 Iteration 2733/3560 Training loss: 1.4059 0.0611 sec/batch\n", + "Epoch 16/20 Iteration 2734/3560 Training loss: 1.4061 0.0538 sec/batch\n", + "Epoch 16/20 Iteration 2735/3560 Training loss: 1.4060 0.0547 sec/batch\n", + "Epoch 16/20 Iteration 2736/3560 Training loss: 1.4064 0.0515 sec/batch\n", + "Epoch 16/20 Iteration 2737/3560 Training loss: 1.4068 0.0522 sec/batch\n", + "Epoch 16/20 Iteration 2738/3560 Training loss: 1.4065 0.0573 sec/batch\n", + "Epoch 16/20 Iteration 2739/3560 Training loss: 1.4066 0.0530 sec/batch\n", + "Epoch 16/20 Iteration 2740/3560 Training loss: 1.4065 0.0515 sec/batch\n", + "Epoch 16/20 Iteration 2741/3560 Training loss: 1.4071 0.0542 sec/batch\n", + "Epoch 16/20 Iteration 2742/3560 Training loss: 1.4074 0.0529 sec/batch\n", + "Epoch 16/20 Iteration 2743/3560 Training loss: 
1.4079 0.0597 sec/batch\n", + "Epoch 16/20 Iteration 2744/3560 Training loss: 1.4078 0.0540 sec/batch\n", + "Epoch 16/20 Iteration 2745/3560 Training loss: 1.4078 0.0513 sec/batch\n", + "Epoch 16/20 Iteration 2746/3560 Training loss: 1.4079 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2747/3560 Training loss: 1.4077 0.0552 sec/batch\n", + "Epoch 16/20 Iteration 2748/3560 Training loss: 1.4076 0.0523 sec/batch\n", + "Epoch 16/20 Iteration 2749/3560 Training loss: 1.4071 0.0548 sec/batch\n", + "Epoch 16/20 Iteration 2750/3560 Training loss: 1.4071 0.0554 sec/batch\n", + "Epoch 16/20 Iteration 2751/3560 Training loss: 1.4067 0.0514 sec/batch\n", + "Epoch 16/20 Iteration 2752/3560 Training loss: 1.4066 0.0524 sec/batch\n", + "Epoch 16/20 Iteration 2753/3560 Training loss: 1.4060 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2754/3560 Training loss: 1.4061 0.0555 sec/batch\n", + "Epoch 16/20 Iteration 2755/3560 Training loss: 1.4058 0.0538 sec/batch\n", + "Epoch 16/20 Iteration 2756/3560 Training loss: 1.4056 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2757/3560 Training loss: 1.4054 0.0506 sec/batch\n", + "Epoch 16/20 Iteration 2758/3560 Training loss: 1.4050 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2759/3560 Training loss: 1.4047 0.0560 sec/batch\n", + "Epoch 16/20 Iteration 2760/3560 Training loss: 1.4048 0.0573 sec/batch\n", + "Epoch 16/20 Iteration 2761/3560 Training loss: 1.4045 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2762/3560 Training loss: 1.4044 0.0557 sec/batch\n", + "Epoch 16/20 Iteration 2763/3560 Training loss: 1.4040 0.0614 sec/batch\n", + "Epoch 16/20 Iteration 2764/3560 Training loss: 1.4037 0.0524 sec/batch\n", + "Epoch 16/20 Iteration 2765/3560 Training loss: 1.4035 0.0543 sec/batch\n", + "Epoch 16/20 Iteration 2766/3560 Training loss: 1.4036 0.0564 sec/batch\n", + "Epoch 16/20 Iteration 2767/3560 Training loss: 1.4037 0.0538 sec/batch\n", + "Epoch 16/20 Iteration 2768/3560 Training loss: 1.4033 0.0573 sec/batch\n", + "Epoch 16/20 
Iteration 2769/3560 Training loss: 1.4030 0.0542 sec/batch\n", + "Epoch 16/20 Iteration 2770/3560 Training loss: 1.4027 0.0555 sec/batch\n", + "Epoch 16/20 Iteration 2771/3560 Training loss: 1.4027 0.0526 sec/batch\n", + "Epoch 16/20 Iteration 2772/3560 Training loss: 1.4026 0.0539 sec/batch\n", + "Epoch 16/20 Iteration 2773/3560 Training loss: 1.4025 0.0582 sec/batch\n", + "Epoch 16/20 Iteration 2774/3560 Training loss: 1.4024 0.0533 sec/batch\n", + "Epoch 16/20 Iteration 2775/3560 Training loss: 1.4022 0.0499 sec/batch\n", + "Epoch 16/20 Iteration 2776/3560 Training loss: 1.4022 0.0519 sec/batch\n", + "Epoch 16/20 Iteration 2777/3560 Training loss: 1.4022 0.0561 sec/batch\n", + "Epoch 16/20 Iteration 2778/3560 Training loss: 1.4022 0.0555 sec/batch\n", + "Epoch 16/20 Iteration 2779/3560 Training loss: 1.4020 0.0535 sec/batch\n", + "Epoch 16/20 Iteration 2780/3560 Training loss: 1.4021 0.0534 sec/batch\n", + "Epoch 16/20 Iteration 2781/3560 Training loss: 1.4019 0.0529 sec/batch\n", + "Epoch 16/20 Iteration 2782/3560 Training loss: 1.4019 0.0526 sec/batch\n", + "Epoch 16/20 Iteration 2783/3560 Training loss: 1.4018 0.0555 sec/batch\n", + "Epoch 16/20 Iteration 2784/3560 Training loss: 1.4017 0.0533 sec/batch\n", + "Epoch 16/20 Iteration 2785/3560 Training loss: 1.4015 0.0565 sec/batch\n", + "Epoch 16/20 Iteration 2786/3560 Training loss: 1.4013 0.0512 sec/batch\n", + "Epoch 16/20 Iteration 2787/3560 Training loss: 1.4012 0.0516 sec/batch\n", + "Epoch 16/20 Iteration 2788/3560 Training loss: 1.4012 0.0610 sec/batch\n", + "Epoch 16/20 Iteration 2789/3560 Training loss: 1.4012 0.0564 sec/batch\n", + "Epoch 16/20 Iteration 2790/3560 Training loss: 1.4011 0.0562 sec/batch\n", + "Epoch 16/20 Iteration 2791/3560 Training loss: 1.4010 0.0578 sec/batch\n", + "Epoch 16/20 Iteration 2792/3560 Training loss: 1.4006 0.0549 sec/batch\n", + "Epoch 16/20 Iteration 2793/3560 Training loss: 1.4002 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2794/3560 Training loss: 1.4003 0.0576 
sec/batch\n", + "Epoch 16/20 Iteration 2795/3560 Training loss: 1.4002 0.0522 sec/batch\n", + "Epoch 16/20 Iteration 2796/3560 Training loss: 1.3999 0.0544 sec/batch\n", + "Epoch 16/20 Iteration 2797/3560 Training loss: 1.3999 0.0541 sec/batch\n", + "Epoch 16/20 Iteration 2798/3560 Training loss: 1.3999 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2799/3560 Training loss: 1.3997 0.0614 sec/batch\n", + "Epoch 16/20 Iteration 2800/3560 Training loss: 1.3994 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2801/3560 Training loss: 1.3990 0.0517 sec/batch\n", + "Epoch 16/20 Iteration 2802/3560 Training loss: 1.3988 0.0529 sec/batch\n", + "Epoch 16/20 Iteration 2803/3560 Training loss: 1.3989 0.0595 sec/batch\n", + "Epoch 16/20 Iteration 2804/3560 Training loss: 1.3990 0.0559 sec/batch\n", + "Epoch 16/20 Iteration 2805/3560 Training loss: 1.3990 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2806/3560 Training loss: 1.3991 0.0529 sec/batch\n", + "Epoch 16/20 Iteration 2807/3560 Training loss: 1.3993 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2808/3560 Training loss: 1.3993 0.0530 sec/batch\n", + "Epoch 16/20 Iteration 2809/3560 Training loss: 1.3994 0.0640 sec/batch\n", + "Epoch 16/20 Iteration 2810/3560 Training loss: 1.3994 0.0560 sec/batch\n", + "Epoch 16/20 Iteration 2811/3560 Training loss: 1.3997 0.0545 sec/batch\n", + "Epoch 16/20 Iteration 2812/3560 Training loss: 1.3998 0.0547 sec/batch\n", + "Epoch 16/20 Iteration 2813/3560 Training loss: 1.3998 0.0562 sec/batch\n", + "Epoch 16/20 Iteration 2814/3560 Training loss: 1.4000 0.0559 sec/batch\n", + "Epoch 16/20 Iteration 2815/3560 Training loss: 1.4000 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2816/3560 Training loss: 1.4002 0.0517 sec/batch\n", + "Epoch 16/20 Iteration 2817/3560 Training loss: 1.4002 0.0508 sec/batch\n", + "Epoch 16/20 Iteration 2818/3560 Training loss: 1.4004 0.0578 sec/batch\n", + "Epoch 16/20 Iteration 2819/3560 Training loss: 1.4007 0.0556 sec/batch\n", + "Epoch 16/20 Iteration 2820/3560 
Training loss: 1.4005 0.0553 sec/batch\n", + "Epoch 16/20 Iteration 2821/3560 Training loss: 1.4002 0.0504 sec/batch\n", + "Epoch 16/20 Iteration 2822/3560 Training loss: 1.4000 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2823/3560 Training loss: 1.4001 0.0533 sec/batch\n", + "Epoch 16/20 Iteration 2824/3560 Training loss: 1.4002 0.0569 sec/batch\n", + "Epoch 16/20 Iteration 2825/3560 Training loss: 1.4002 0.0509 sec/batch\n", + "Epoch 16/20 Iteration 2826/3560 Training loss: 1.4002 0.0529 sec/batch\n", + "Epoch 16/20 Iteration 2827/3560 Training loss: 1.4003 0.0519 sec/batch\n", + "Epoch 16/20 Iteration 2828/3560 Training loss: 1.4002 0.0549 sec/batch\n", + "Epoch 16/20 Iteration 2829/3560 Training loss: 1.4000 0.0559 sec/batch\n", + "Epoch 16/20 Iteration 2830/3560 Training loss: 1.4001 0.0576 sec/batch\n", + "Epoch 16/20 Iteration 2831/3560 Training loss: 1.4002 0.0535 sec/batch\n", + "Epoch 16/20 Iteration 2832/3560 Training loss: 1.4002 0.0563 sec/batch\n", + "Epoch 16/20 Iteration 2833/3560 Training loss: 1.4002 0.0584 sec/batch\n", + "Epoch 16/20 Iteration 2834/3560 Training loss: 1.4002 0.0544 sec/batch\n", + "Epoch 16/20 Iteration 2835/3560 Training loss: 1.4002 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2836/3560 Training loss: 1.4002 0.0533 sec/batch\n", + "Epoch 16/20 Iteration 2837/3560 Training loss: 1.4003 0.0561 sec/batch\n", + "Epoch 16/20 Iteration 2838/3560 Training loss: 1.4008 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2839/3560 Training loss: 1.4008 0.0555 sec/batch\n", + "Epoch 16/20 Iteration 2840/3560 Training loss: 1.4009 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2841/3560 Training loss: 1.4008 0.0517 sec/batch\n", + "Epoch 16/20 Iteration 2842/3560 Training loss: 1.4006 0.0526 sec/batch\n", + "Epoch 16/20 Iteration 2843/3560 Training loss: 1.4008 0.0518 sec/batch\n", + "Epoch 16/20 Iteration 2844/3560 Training loss: 1.4008 0.0580 sec/batch\n", + "Epoch 16/20 Iteration 2845/3560 Training loss: 1.4009 0.0560 sec/batch\n", + 
"Epoch 16/20 Iteration 2846/3560 Training loss: 1.4008 0.0546 sec/batch\n", + "Epoch 16/20 Iteration 2847/3560 Training loss: 1.4006 0.0575 sec/batch\n", + "Epoch 16/20 Iteration 2848/3560 Training loss: 1.4008 0.0559 sec/batch\n", + "Epoch 17/20 Iteration 2849/3560 Training loss: 1.4772 0.0512 sec/batch\n", + "Epoch 17/20 Iteration 2850/3560 Training loss: 1.4413 0.0577 sec/batch\n", + "Epoch 17/20 Iteration 2851/3560 Training loss: 1.4316 0.0544 sec/batch\n", + "Epoch 17/20 Iteration 2852/3560 Training loss: 1.4289 0.0532 sec/batch\n", + "Epoch 17/20 Iteration 2853/3560 Training loss: 1.4197 0.0515 sec/batch\n", + "Epoch 17/20 Iteration 2854/3560 Training loss: 1.4087 0.0516 sec/batch\n", + "Epoch 17/20 Iteration 2855/3560 Training loss: 1.4085 0.0532 sec/batch\n", + "Epoch 17/20 Iteration 2856/3560 Training loss: 1.4067 0.0563 sec/batch\n", + "Epoch 17/20 Iteration 2857/3560 Training loss: 1.4065 0.0588 sec/batch\n", + "Epoch 17/20 Iteration 2858/3560 Training loss: 1.4046 0.0543 sec/batch\n", + "Epoch 17/20 Iteration 2859/3560 Training loss: 1.4003 0.0529 sec/batch\n", + "Epoch 17/20 Iteration 2860/3560 Training loss: 1.4003 0.0576 sec/batch\n", + "Epoch 17/20 Iteration 2861/3560 Training loss: 1.4002 0.0545 sec/batch\n", + "Epoch 17/20 Iteration 2862/3560 Training loss: 1.4018 0.0520 sec/batch\n", + "Epoch 17/20 Iteration 2863/3560 Training loss: 1.4008 0.0508 sec/batch\n", + "Epoch 17/20 Iteration 2864/3560 Training loss: 1.3987 0.0574 sec/batch\n", + "Epoch 17/20 Iteration 2865/3560 Training loss: 1.3989 0.0558 sec/batch\n", + "Epoch 17/20 Iteration 2866/3560 Training loss: 1.3999 0.0550 sec/batch\n", + "Epoch 17/20 Iteration 2867/3560 Training loss: 1.3997 0.0561 sec/batch\n", + "Epoch 17/20 Iteration 2868/3560 Training loss: 1.4014 0.0533 sec/batch\n", + "Epoch 17/20 Iteration 2869/3560 Training loss: 1.4010 0.0582 sec/batch\n", + "Epoch 17/20 Iteration 2870/3560 Training loss: 1.4006 0.0568 sec/batch\n", + "Epoch 17/20 Iteration 2871/3560 Training loss: 
1.3995 0.0563 sec/batch\n", + "Epoch 17/20 Iteration 2872/3560 Training loss: 1.3994 0.0509 sec/batch\n", + "Epoch 17/20 Iteration 2873/3560 Training loss: 1.3995 0.0545 sec/batch\n", + "Epoch 17/20 Iteration 2874/3560 Training loss: 1.3976 0.0521 sec/batch\n", + "Epoch 17/20 Iteration 2875/3560 Training loss: 1.3965 0.0520 sec/batch\n", + "Epoch 17/20 Iteration 2876/3560 Training loss: 1.3973 0.0563 sec/batch\n", + "Epoch 17/20 Iteration 2877/3560 Training loss: 1.3975 0.0546 sec/batch\n", + "Epoch 17/20 Iteration 2878/3560 Training loss: 1.3980 0.0545 sec/batch\n", + "Epoch 17/20 Iteration 2879/3560 Training loss: 1.3974 0.0522 sec/batch\n", + "Epoch 17/20 Iteration 2880/3560 Training loss: 1.3968 0.0512 sec/batch\n", + "Epoch 17/20 Iteration 2881/3560 Training loss: 1.3972 0.0534 sec/batch\n", + "Epoch 17/20 Iteration 2882/3560 Training loss: 1.3972 0.0533 sec/batch\n", + "Epoch 17/20 Iteration 2883/3560 Training loss: 1.3968 0.0554 sec/batch\n", + "Epoch 17/20 Iteration 2884/3560 Training loss: 1.3967 0.0514 sec/batch\n", + "Epoch 17/20 Iteration 2885/3560 Training loss: 1.3960 0.0507 sec/batch\n", + "Epoch 17/20 Iteration 2886/3560 Training loss: 1.3952 0.0538 sec/batch\n", + "Epoch 17/20 Iteration 2887/3560 Training loss: 1.3941 0.0519 sec/batch\n", + "Epoch 17/20 Iteration 2888/3560 Training loss: 1.3941 0.0548 sec/batch\n", + "Epoch 17/20 Iteration 2889/3560 Training loss: 1.3936 0.0513 sec/batch\n", + "Epoch 17/20 Iteration 2890/3560 Training loss: 1.3946 0.0510 sec/batch\n", + "Epoch 17/20 Iteration 2891/3560 Training loss: 1.3944 0.0589 sec/batch\n", + "Epoch 17/20 Iteration 2892/3560 Training loss: 1.3939 0.0675 sec/batch\n", + "Epoch 17/20 Iteration 2893/3560 Training loss: 1.3941 0.0649 sec/batch\n", + "Epoch 17/20 Iteration 2894/3560 Training loss: 1.3933 0.0509 sec/batch\n", + "Epoch 17/20 Iteration 2895/3560 Training loss: 1.3930 0.0513 sec/batch\n", + "Epoch 17/20 Iteration 2896/3560 Training loss: 1.3924 0.0541 sec/batch\n", + "Epoch 17/20 
Iteration 2897/3560 Training loss: 1.3924 0.0570 sec/batch\n", + "Epoch 17/20 Iteration 2898/3560 Training loss: 1.3927 0.0522 sec/batch\n", + "Epoch 17/20 Iteration 2899/3560 Training loss: 1.3923 0.0578 sec/batch\n", + "Epoch 17/20 Iteration 2900/3560 Training loss: 1.3931 0.0517 sec/batch\n", + "Epoch 17/20 Iteration 2901/3560 Training loss: 1.3931 0.0517 sec/batch\n", + "Epoch 17/20 Iteration 2902/3560 Training loss: 1.3932 0.0504 sec/batch\n", + "Epoch 17/20 Iteration 2903/3560 Training loss: 1.3929 0.0528 sec/batch\n", + "Epoch 17/20 Iteration 2904/3560 Training loss: 1.3930 0.0510 sec/batch\n", + "Epoch 17/20 Iteration 2905/3560 Training loss: 1.3934 0.0513 sec/batch\n", + "Epoch 17/20 Iteration 2906/3560 Training loss: 1.3930 0.0526 sec/batch\n", + "Epoch 17/20 Iteration 2907/3560 Training loss: 1.3927 0.0560 sec/batch\n", + "Epoch 17/20 Iteration 2908/3560 Training loss: 1.3933 0.0520 sec/batch\n", + "Epoch 17/20 Iteration 2909/3560 Training loss: 1.3935 0.0521 sec/batch\n", + "Epoch 17/20 Iteration 2910/3560 Training loss: 1.3943 0.0518 sec/batch\n", + "Epoch 17/20 Iteration 2911/3560 Training loss: 1.3947 0.0511 sec/batch\n", + "Epoch 17/20 Iteration 2912/3560 Training loss: 1.3948 0.0555 sec/batch\n", + "Epoch 17/20 Iteration 2913/3560 Training loss: 1.3946 0.0515 sec/batch\n", + "Epoch 17/20 Iteration 2914/3560 Training loss: 1.3949 0.0710 sec/batch\n", + "Epoch 17/20 Iteration 2915/3560 Training loss: 1.3952 0.0594 sec/batch\n", + "Epoch 17/20 Iteration 2916/3560 Training loss: 1.3952 0.0499 sec/batch\n", + "Epoch 17/20 Iteration 2917/3560 Training loss: 1.3952 0.0529 sec/batch\n", + "Epoch 17/20 Iteration 2918/3560 Training loss: 1.3951 0.0509 sec/batch\n", + "Epoch 17/20 Iteration 2919/3560 Training loss: 1.3957 0.0525 sec/batch\n", + "Epoch 17/20 Iteration 2920/3560 Training loss: 1.3961 0.0522 sec/batch\n", + "Epoch 17/20 Iteration 2921/3560 Training loss: 1.3966 0.0535 sec/batch\n", + "Epoch 17/20 Iteration 2922/3560 Training loss: 1.3963 0.0600 
sec/batch\n", + "Epoch 17/20 Iteration 2923/3560 Training loss: 1.3962 0.0570 sec/batch\n", + "Epoch 17/20 Iteration 2924/3560 Training loss: 1.3965 0.0562 sec/batch\n", + "Epoch 17/20 Iteration 2925/3560 Training loss: 1.3964 0.0554 sec/batch\n", + "Epoch 17/20 Iteration 2926/3560 Training loss: 1.3964 0.0546 sec/batch\n", + "Epoch 17/20 Iteration 2927/3560 Training loss: 1.3958 0.0515 sec/batch\n", + "Epoch 17/20 Iteration 2928/3560 Training loss: 1.3958 0.0561 sec/batch\n", + "Epoch 17/20 Iteration 2929/3560 Training loss: 1.3953 0.0660 sec/batch\n", + "Epoch 17/20 Iteration 2930/3560 Training loss: 1.3953 0.0545 sec/batch\n", + "Epoch 17/20 Iteration 2931/3560 Training loss: 1.3948 0.0541 sec/batch\n", + "Epoch 17/20 Iteration 2932/3560 Training loss: 1.3948 0.0521 sec/batch\n", + "Epoch 17/20 Iteration 2933/3560 Training loss: 1.3945 0.0537 sec/batch\n", + "Epoch 17/20 Iteration 2934/3560 Training loss: 1.3943 0.0559 sec/batch\n", + "Epoch 17/20 Iteration 2935/3560 Training loss: 1.3940 0.0518 sec/batch\n", + "Epoch 17/20 Iteration 2936/3560 Training loss: 1.3937 0.0500 sec/batch\n", + "Epoch 17/20 Iteration 2937/3560 Training loss: 1.3933 0.0526 sec/batch\n", + "Epoch 17/20 Iteration 2938/3560 Training loss: 1.3933 0.0538 sec/batch\n", + "Epoch 17/20 Iteration 2939/3560 Training loss: 1.3931 0.0552 sec/batch\n", + "Epoch 17/20 Iteration 2940/3560 Training loss: 1.3931 0.0516 sec/batch\n", + "Epoch 17/20 Iteration 2941/3560 Training loss: 1.3927 0.0533 sec/batch\n", + "Epoch 17/20 Iteration 2942/3560 Training loss: 1.3924 0.0557 sec/batch\n", + "Epoch 17/20 Iteration 2943/3560 Training loss: 1.3922 0.0528 sec/batch\n", + "Epoch 17/20 Iteration 2944/3560 Training loss: 1.3923 0.0604 sec/batch\n", + "Epoch 17/20 Iteration 2945/3560 Training loss: 1.3923 0.0514 sec/batch\n", + "Epoch 17/20 Iteration 2946/3560 Training loss: 1.3918 0.0570 sec/batch\n", + "Epoch 17/20 Iteration 2947/3560 Training loss: 1.3916 0.0598 sec/batch\n", + "Epoch 17/20 Iteration 2948/3560 
Training loss: 1.3912 0.0516 sec/batch\n", + "Epoch 17/20 Iteration 2949/3560 Training loss: 1.3912 0.0511 sec/batch\n", + "Epoch 17/20 Iteration 2950/3560 Training loss: 1.3911 0.0576 sec/batch\n", + "Epoch 17/20 Iteration 2951/3560 Training loss: 1.3910 0.0554 sec/batch\n", + "Epoch 17/20 Iteration 2952/3560 Training loss: 1.3909 0.0523 sec/batch\n", + "Epoch 17/20 Iteration 2953/3560 Training loss: 1.3907 0.0541 sec/batch\n", + "Epoch 17/20 Iteration 2954/3560 Training loss: 1.3906 0.0518 sec/batch\n", + "Epoch 17/20 Iteration 2955/3560 Training loss: 1.3906 0.0522 sec/batch\n", + "Epoch 17/20 Iteration 2956/3560 Training loss: 1.3907 0.0510 sec/batch\n", + "Epoch 17/20 Iteration 2957/3560 Training loss: 1.3906 0.0528 sec/batch\n", + "Epoch 17/20 Iteration 2958/3560 Training loss: 1.3907 0.0533 sec/batch\n", + "Epoch 17/20 Iteration 2959/3560 Training loss: 1.3905 0.0509 sec/batch\n", + "Epoch 17/20 Iteration 2960/3560 Training loss: 1.3904 0.0575 sec/batch\n", + "Epoch 17/20 Iteration 2961/3560 Training loss: 1.3903 0.0569 sec/batch\n", + "Epoch 17/20 Iteration 2962/3560 Training loss: 1.3902 0.0569 sec/batch\n", + "Epoch 17/20 Iteration 2963/3560 Training loss: 1.3900 0.0531 sec/batch\n", + "Epoch 17/20 Iteration 2964/3560 Training loss: 1.3898 0.0514 sec/batch\n", + "Epoch 17/20 Iteration 2965/3560 Training loss: 1.3898 0.0532 sec/batch\n", + "Epoch 17/20 Iteration 2966/3560 Training loss: 1.3897 0.0513 sec/batch\n", + "Epoch 17/20 Iteration 2967/3560 Training loss: 1.3896 0.0526 sec/batch\n", + "Epoch 17/20 Iteration 2968/3560 Training loss: 1.3896 0.0552 sec/batch\n", + "Epoch 17/20 Iteration 2969/3560 Training loss: 1.3895 0.0521 sec/batch\n", + "Epoch 17/20 Iteration 2970/3560 Training loss: 1.3892 0.0524 sec/batch\n", + "Epoch 17/20 Iteration 2971/3560 Training loss: 1.3888 0.0517 sec/batch\n", + "Epoch 17/20 Iteration 2972/3560 Training loss: 1.3889 0.0556 sec/batch\n", + "Epoch 17/20 Iteration 2973/3560 Training loss: 1.3889 0.0566 sec/batch\n", + 
"Epoch 17/20 Iteration 2974/3560 Training loss: 1.3885 0.0523 sec/batch\n", + "Epoch 17/20 Iteration 2975/3560 Training loss: 1.3885 0.0520 sec/batch\n", + "Epoch 17/20 Iteration 2976/3560 Training loss: 1.3886 0.0513 sec/batch\n", + "Epoch 17/20 Iteration 2977/3560 Training loss: 1.3884 0.0534 sec/batch\n", + "Epoch 17/20 Iteration 2978/3560 Training loss: 1.3881 0.0555 sec/batch\n", + "Epoch 17/20 Iteration 2979/3560 Training loss: 1.3878 0.0533 sec/batch\n", + "Epoch 17/20 Iteration 2980/3560 Training loss: 1.3875 0.0563 sec/batch\n", + "Epoch 17/20 Iteration 2981/3560 Training loss: 1.3876 0.0536 sec/batch\n", + "Epoch 17/20 Iteration 2982/3560 Training loss: 1.3876 0.0528 sec/batch\n", + "Epoch 17/20 Iteration 2983/3560 Training loss: 1.3877 0.0533 sec/batch\n", + "Epoch 17/20 Iteration 2984/3560 Training loss: 1.3878 0.0521 sec/batch\n", + "Epoch 17/20 Iteration 2985/3560 Training loss: 1.3879 0.0525 sec/batch\n", + "Epoch 17/20 Iteration 2986/3560 Training loss: 1.3880 0.0540 sec/batch\n", + "Epoch 17/20 Iteration 2987/3560 Training loss: 1.3879 0.0507 sec/batch\n", + "Epoch 17/20 Iteration 2988/3560 Training loss: 1.3879 0.0521 sec/batch\n", + "Epoch 17/20 Iteration 2989/3560 Training loss: 1.3883 0.0512 sec/batch\n", + "Epoch 17/20 Iteration 2990/3560 Training loss: 1.3883 0.0568 sec/batch\n", + "Epoch 17/20 Iteration 2991/3560 Training loss: 1.3882 0.0517 sec/batch\n", + "Epoch 17/20 Iteration 2992/3560 Training loss: 1.3884 0.0606 sec/batch\n", + "Epoch 17/20 Iteration 2993/3560 Training loss: 1.3883 0.0576 sec/batch\n", + "Epoch 17/20 Iteration 2994/3560 Training loss: 1.3885 0.0536 sec/batch\n", + "Epoch 17/20 Iteration 2995/3560 Training loss: 1.3885 0.0518 sec/batch\n", + "Epoch 17/20 Iteration 2996/3560 Training loss: 1.3888 0.0524 sec/batch\n", + "Epoch 17/20 Iteration 2997/3560 Training loss: 1.3889 0.0542 sec/batch\n", + "Epoch 17/20 Iteration 2998/3560 Training loss: 1.3888 0.0526 sec/batch\n", + "Epoch 17/20 Iteration 2999/3560 Training loss: 
1.3885 0.0531 sec/batch\n", + "Epoch 17/20 Iteration 3000/3560 Training loss: 1.3884 0.0548 sec/batch\n", + "Epoch 17/20 Iteration 3001/3560 Training loss: 1.3886 0.0552 sec/batch\n", + "Epoch 17/20 Iteration 3002/3560 Training loss: 1.3886 0.0528 sec/batch\n", + "Epoch 17/20 Iteration 3003/3560 Training loss: 1.3886 0.0559 sec/batch\n", + "Epoch 17/20 Iteration 3004/3560 Training loss: 1.3886 0.0535 sec/batch\n", + "Epoch 17/20 Iteration 3005/3560 Training loss: 1.3887 0.0537 sec/batch\n", + "Epoch 17/20 Iteration 3006/3560 Training loss: 1.3886 0.0533 sec/batch\n", + "Epoch 17/20 Iteration 3007/3560 Training loss: 1.3884 0.0524 sec/batch\n", + "Epoch 17/20 Iteration 3008/3560 Training loss: 1.3886 0.0508 sec/batch\n", + "Epoch 17/20 Iteration 3009/3560 Training loss: 1.3888 0.0509 sec/batch\n", + "Epoch 17/20 Iteration 3010/3560 Training loss: 1.3888 0.0507 sec/batch\n", + "Epoch 17/20 Iteration 3011/3560 Training loss: 1.3888 0.0530 sec/batch\n", + "Epoch 17/20 Iteration 3012/3560 Training loss: 1.3889 0.0526 sec/batch\n", + "Epoch 17/20 Iteration 3013/3560 Training loss: 1.3889 0.0572 sec/batch\n", + "Epoch 17/20 Iteration 3014/3560 Training loss: 1.3889 0.0557 sec/batch\n", + "Epoch 17/20 Iteration 3015/3560 Training loss: 1.3891 0.0541 sec/batch\n", + "Epoch 17/20 Iteration 3016/3560 Training loss: 1.3895 0.0566 sec/batch\n", + "Epoch 17/20 Iteration 3017/3560 Training loss: 1.3895 0.0563 sec/batch\n", + "Epoch 17/20 Iteration 3018/3560 Training loss: 1.3896 0.0524 sec/batch\n", + "Epoch 17/20 Iteration 3019/3560 Training loss: 1.3895 0.0538 sec/batch\n", + "Epoch 17/20 Iteration 3020/3560 Training loss: 1.3894 0.0506 sec/batch\n", + "Epoch 17/20 Iteration 3021/3560 Training loss: 1.3895 0.0518 sec/batch\n", + "Epoch 17/20 Iteration 3022/3560 Training loss: 1.3895 0.0539 sec/batch\n", + "Epoch 17/20 Iteration 3023/3560 Training loss: 1.3896 0.0510 sec/batch\n", + "Epoch 17/20 Iteration 3024/3560 Training loss: 1.3895 0.0531 sec/batch\n", + "Epoch 17/20 
Iteration 3025/3560 Training loss: 1.3894 0.0520 sec/batch\n", + "Epoch 17/20 Iteration 3026/3560 Training loss: 1.3896 0.0517 sec/batch\n", + "Epoch 18/20 Iteration 3027/3560 Training loss: 1.4651 0.0535 sec/batch\n", + "Epoch 18/20 Iteration 3028/3560 Training loss: 1.4339 0.0526 sec/batch\n", + "Epoch 18/20 Iteration 3029/3560 Training loss: 1.4217 0.0559 sec/batch\n", + "Epoch 18/20 Iteration 3030/3560 Training loss: 1.4163 0.0511 sec/batch\n", + "Epoch 18/20 Iteration 3031/3560 Training loss: 1.4073 0.0554 sec/batch\n", + "Epoch 18/20 Iteration 3032/3560 Training loss: 1.3968 0.0600 sec/batch\n", + "Epoch 18/20 Iteration 3033/3560 Training loss: 1.3961 0.0509 sec/batch\n", + "Epoch 18/20 Iteration 3034/3560 Training loss: 1.3936 0.0535 sec/batch\n", + "Epoch 18/20 Iteration 3035/3560 Training loss: 1.3932 0.0509 sec/batch\n", + "Epoch 18/20 Iteration 3036/3560 Training loss: 1.3915 0.0527 sec/batch\n", + "Epoch 18/20 Iteration 3037/3560 Training loss: 1.3882 0.0605 sec/batch\n", + "Epoch 18/20 Iteration 3038/3560 Training loss: 1.3879 0.0576 sec/batch\n", + "Epoch 18/20 Iteration 3039/3560 Training loss: 1.3888 0.0544 sec/batch\n", + "Epoch 18/20 Iteration 3040/3560 Training loss: 1.3902 0.0503 sec/batch\n", + "Epoch 18/20 Iteration 3041/3560 Training loss: 1.3890 0.0513 sec/batch\n", + "Epoch 18/20 Iteration 3042/3560 Training loss: 1.3872 0.0547 sec/batch\n", + "Epoch 18/20 Iteration 3043/3560 Training loss: 1.3874 0.0593 sec/batch\n", + "Epoch 18/20 Iteration 3044/3560 Training loss: 1.3891 0.0551 sec/batch\n", + "Epoch 18/20 Iteration 3045/3560 Training loss: 1.3888 0.0524 sec/batch\n", + "Epoch 18/20 Iteration 3046/3560 Training loss: 1.3903 0.0566 sec/batch\n", + "Epoch 18/20 Iteration 3047/3560 Training loss: 1.3894 0.0532 sec/batch\n", + "Epoch 18/20 Iteration 3048/3560 Training loss: 1.3894 0.0512 sec/batch\n", + "Epoch 18/20 Iteration 3049/3560 Training loss: 1.3890 0.0524 sec/batch\n", + "Epoch 18/20 Iteration 3050/3560 Training loss: 1.3896 0.0560 
sec/batch\n", + "Epoch 18/20 Iteration 3051/3560 Training loss: 1.3897 0.0599 sec/batch\n", + "Epoch 18/20 Iteration 3052/3560 Training loss: 1.3880 0.0505 sec/batch\n", + "Epoch 18/20 Iteration 3053/3560 Training loss: 1.3870 0.0535 sec/batch\n", + "Epoch 18/20 Iteration 3054/3560 Training loss: 1.3876 0.0508 sec/batch\n", + "Epoch 18/20 Iteration 3055/3560 Training loss: 1.3876 0.0557 sec/batch\n", + "Epoch 18/20 Iteration 3056/3560 Training loss: 1.3881 0.0517 sec/batch\n", + "Epoch 18/20 Iteration 3057/3560 Training loss: 1.3878 0.0505 sec/batch\n", + "Epoch 18/20 Iteration 3058/3560 Training loss: 1.3870 0.0533 sec/batch\n", + "Epoch 18/20 Iteration 3059/3560 Training loss: 1.3872 0.0531 sec/batch\n", + "Epoch 18/20 Iteration 3060/3560 Training loss: 1.3875 0.0602 sec/batch\n", + "Epoch 18/20 Iteration 3061/3560 Training loss: 1.3872 0.0574 sec/batch\n", + "Epoch 18/20 Iteration 3062/3560 Training loss: 1.3871 0.0576 sec/batch\n", + "Epoch 18/20 Iteration 3063/3560 Training loss: 1.3863 0.0637 sec/batch\n", + "Epoch 18/20 Iteration 3064/3560 Training loss: 1.3855 0.0531 sec/batch\n", + "Epoch 18/20 Iteration 3065/3560 Training loss: 1.3844 0.0522 sec/batch\n", + "Epoch 18/20 Iteration 3066/3560 Training loss: 1.3845 0.0558 sec/batch\n", + "Epoch 18/20 Iteration 3067/3560 Training loss: 1.3840 0.0621 sec/batch\n", + "Epoch 18/20 Iteration 3068/3560 Training loss: 1.3848 0.0517 sec/batch\n", + "Epoch 18/20 Iteration 3069/3560 Training loss: 1.3846 0.0522 sec/batch\n", + "Epoch 18/20 Iteration 3070/3560 Training loss: 1.3838 0.0519 sec/batch\n", + "Epoch 18/20 Iteration 3071/3560 Training loss: 1.3839 0.0538 sec/batch\n", + "Epoch 18/20 Iteration 3072/3560 Training loss: 1.3832 0.0634 sec/batch\n", + "Epoch 18/20 Iteration 3073/3560 Training loss: 1.3826 0.0521 sec/batch\n", + "Epoch 18/20 Iteration 3074/3560 Training loss: 1.3822 0.0565 sec/batch\n", + "Epoch 18/20 Iteration 3075/3560 Training loss: 1.3822 0.0551 sec/batch\n", + "Epoch 18/20 Iteration 3076/3560 
Training loss: 1.3825 0.0548 sec/batch\n", + "Epoch 18/20 Iteration 3077/3560 Training loss: 1.3821 0.0545 sec/batch\n", + "Epoch 18/20 Iteration 3078/3560 Training loss: 1.3829 0.0520 sec/batch\n", + "Epoch 18/20 Iteration 3079/3560 Training loss: 1.3828 0.0507 sec/batch\n", + "Epoch 18/20 Iteration 3080/3560 Training loss: 1.3829 0.0559 sec/batch\n", + "Epoch 18/20 Iteration 3081/3560 Training loss: 1.3825 0.0548 sec/batch\n", + "Epoch 18/20 Iteration 3082/3560 Training loss: 1.3825 0.0524 sec/batch\n", + "Epoch 18/20 Iteration 3083/3560 Training loss: 1.3829 0.0533 sec/batch\n", + "Epoch 18/20 Iteration 3084/3560 Training loss: 1.3828 0.0525 sec/batch\n", + "Epoch 18/20 Iteration 3085/3560 Training loss: 1.3822 0.0560 sec/batch\n", + "Epoch 18/20 Iteration 3086/3560 Training loss: 1.3830 0.0541 sec/batch\n", + "Epoch 18/20 Iteration 3087/3560 Training loss: 1.3830 0.0538 sec/batch\n", + "Epoch 18/20 Iteration 3088/3560 Training loss: 1.3837 0.0532 sec/batch\n", + "Epoch 18/20 Iteration 3089/3560 Training loss: 1.3841 0.0520 sec/batch\n", + "Epoch 18/20 Iteration 3090/3560 Training loss: 1.3843 0.0516 sec/batch\n", + "Epoch 18/20 Iteration 3091/3560 Training loss: 1.3842 0.0558 sec/batch\n", + "Epoch 18/20 Iteration 3092/3560 Training loss: 1.3845 0.0618 sec/batch\n", + "Epoch 18/20 Iteration 3093/3560 Training loss: 1.3847 0.0563 sec/batch\n", + "Epoch 18/20 Iteration 3094/3560 Training loss: 1.3844 0.0526 sec/batch\n", + "Epoch 18/20 Iteration 3095/3560 Training loss: 1.3844 0.0531 sec/batch\n", + "Epoch 18/20 Iteration 3096/3560 Training loss: 1.3842 0.0510 sec/batch\n", + "Epoch 18/20 Iteration 3097/3560 Training loss: 1.3849 0.0529 sec/batch\n", + "Epoch 18/20 Iteration 3098/3560 Training loss: 1.3853 0.0527 sec/batch\n", + "Epoch 18/20 Iteration 3099/3560 Training loss: 1.3858 0.0519 sec/batch\n", + "Epoch 18/20 Iteration 3100/3560 Training loss: 1.3855 0.0518 sec/batch\n", + "Epoch 18/20 Iteration 3101/3560 Training loss: 1.3855 0.0512 sec/batch\n", + 
"Epoch 18/20 Iteration 3102/3560 Training loss: 1.3858 0.0527 sec/batch\n", + "Epoch 18/20 Iteration 3103/3560 Training loss: 1.3857 0.0531 sec/batch\n", + "Epoch 18/20 Iteration 3104/3560 Training loss: 1.3855 0.0527 sec/batch\n", + "Epoch 18/20 Iteration 3105/3560 Training loss: 1.3851 0.0522 sec/batch\n", + "Epoch 18/20 Iteration 3106/3560 Training loss: 1.3850 0.0554 sec/batch\n", + "Epoch 18/20 Iteration 3107/3560 Training loss: 1.3845 0.0516 sec/batch\n", + "Epoch 18/20 Iteration 3108/3560 Training loss: 1.3845 0.0529 sec/batch\n", + "Epoch 18/20 Iteration 3109/3560 Training loss: 1.3839 0.0524 sec/batch\n", + "Epoch 18/20 Iteration 3110/3560 Training loss: 1.3839 0.0517 sec/batch\n", + "Epoch 18/20 Iteration 3111/3560 Training loss: 1.3836 0.0541 sec/batch\n", + "Epoch 18/20 Iteration 3112/3560 Training loss: 1.3835 0.0521 sec/batch\n", + "Epoch 18/20 Iteration 3113/3560 Training loss: 1.3833 0.0505 sec/batch\n", + "Epoch 18/20 Iteration 3114/3560 Training loss: 1.3829 0.0518 sec/batch\n", + "Epoch 18/20 Iteration 3115/3560 Training loss: 1.3826 0.0530 sec/batch\n", + "Epoch 18/20 Iteration 3116/3560 Training loss: 1.3827 0.0519 sec/batch\n", + "Epoch 18/20 Iteration 3117/3560 Training loss: 1.3824 0.0516 sec/batch\n", + "Epoch 18/20 Iteration 3118/3560 Training loss: 1.3823 0.0513 sec/batch\n", + "Epoch 18/20 Iteration 3119/3560 Training loss: 1.3819 0.0562 sec/batch\n", + "Epoch 18/20 Iteration 3120/3560 Training loss: 1.3816 0.0543 sec/batch\n", + "Epoch 18/20 Iteration 3121/3560 Training loss: 1.3815 0.0526 sec/batch\n", + "Epoch 18/20 Iteration 3122/3560 Training loss: 1.3816 0.0534 sec/batch\n", + "Epoch 18/20 Iteration 3123/3560 Training loss: 1.3816 0.0626 sec/batch\n", + "Epoch 18/20 Iteration 3124/3560 Training loss: 1.3813 0.0514 sec/batch\n", + "Epoch 18/20 Iteration 3125/3560 Training loss: 1.3810 0.0534 sec/batch\n", + "Epoch 18/20 Iteration 3126/3560 Training loss: 1.3807 0.0526 sec/batch\n", + "Epoch 18/20 Iteration 3127/3560 Training loss: 
1.3808 0.0526 sec/batch\n", + "Epoch 18/20 Iteration 3128/3560 Training loss: 1.3807 0.0529 sec/batch\n", + "Epoch 18/20 Iteration 3129/3560 Training loss: 1.3806 0.0520 sec/batch\n", + "Epoch 18/20 Iteration 3130/3560 Training loss: 1.3805 0.0520 sec/batch\n", + "Epoch 18/20 Iteration 3131/3560 Training loss: 1.3804 0.0511 sec/batch\n", + "Epoch 18/20 Iteration 3132/3560 Training loss: 1.3802 0.0534 sec/batch\n", + "Epoch 18/20 Iteration 3133/3560 Training loss: 1.3803 0.0513 sec/batch\n", + "Epoch 18/20 Iteration 3134/3560 Training loss: 1.3803 0.0561 sec/batch\n", + "Epoch 18/20 Iteration 3135/3560 Training loss: 1.3801 0.0510 sec/batch\n", + "Epoch 18/20 Iteration 3136/3560 Training loss: 1.3803 0.0553 sec/batch\n", + "Epoch 18/20 Iteration 3137/3560 Training loss: 1.3802 0.0515 sec/batch\n", + "Epoch 18/20 Iteration 3138/3560 Training loss: 1.3802 0.0526 sec/batch\n", + "Epoch 18/20 Iteration 3139/3560 Training loss: 1.3801 0.0505 sec/batch\n", + "Epoch 18/20 Iteration 3140/3560 Training loss: 1.3799 0.0516 sec/batch\n", + "Epoch 18/20 Iteration 3141/3560 Training loss: 1.3797 0.0519 sec/batch\n", + "Epoch 18/20 Iteration 3142/3560 Training loss: 1.3795 0.0610 sec/batch\n", + "Epoch 18/20 Iteration 3143/3560 Training loss: 1.3796 0.0521 sec/batch\n", + "Epoch 18/20 Iteration 3144/3560 Training loss: 1.3795 0.0549 sec/batch\n", + "Epoch 18/20 Iteration 3145/3560 Training loss: 1.3794 0.0602 sec/batch\n", + "Epoch 18/20 Iteration 3146/3560 Training loss: 1.3794 0.0559 sec/batch\n", + "Epoch 18/20 Iteration 3147/3560 Training loss: 1.3793 0.0522 sec/batch\n", + "Epoch 18/20 Iteration 3148/3560 Training loss: 1.3790 0.0540 sec/batch\n", + "Epoch 18/20 Iteration 3149/3560 Training loss: 1.3786 0.0515 sec/batch\n", + "Epoch 18/20 Iteration 3150/3560 Training loss: 1.3786 0.0536 sec/batch\n", + "Epoch 18/20 Iteration 3151/3560 Training loss: 1.3786 0.0526 sec/batch\n", + "Epoch 18/20 Iteration 3152/3560 Training loss: 1.3782 0.0552 sec/batch\n", + "Epoch 18/20 
Iteration 3153/3560 Training loss: 1.3782 0.0550 sec/batch\n", + "Epoch 18/20 Iteration 3154/3560 Training loss: 1.3783 0.0559 sec/batch\n", + "Epoch 18/20 Iteration 3155/3560 Training loss: 1.3781 0.0512 sec/batch\n", + "Epoch 18/20 Iteration 3156/3560 Training loss: 1.3777 0.0565 sec/batch\n", + "Epoch 18/20 Iteration 3157/3560 Training loss: 1.3774 0.0526 sec/batch\n", + "Epoch 18/20 Iteration 3158/3560 Training loss: 1.3772 0.0504 sec/batch\n", + "Epoch 18/20 Iteration 3159/3560 Training loss: 1.3774 0.0506 sec/batch\n", + "Epoch 18/20 Iteration 3160/3560 Training loss: 1.3775 0.0575 sec/batch\n", + "Epoch 18/20 Iteration 3161/3560 Training loss: 1.3775 0.0531 sec/batch\n", + "Epoch 18/20 Iteration 3162/3560 Training loss: 1.3776 0.0537 sec/batch\n", + "Epoch 18/20 Iteration 3163/3560 Training loss: 1.3777 0.0551 sec/batch\n", + "Epoch 18/20 Iteration 3164/3560 Training loss: 1.3777 0.0499 sec/batch\n", + "Epoch 18/20 Iteration 3165/3560 Training loss: 1.3777 0.0512 sec/batch\n", + "Epoch 18/20 Iteration 3166/3560 Training loss: 1.3777 0.0520 sec/batch\n", + "Epoch 18/20 Iteration 3167/3560 Training loss: 1.3780 0.0531 sec/batch\n", + "Epoch 18/20 Iteration 3168/3560 Training loss: 1.3781 0.0511 sec/batch\n", + "Epoch 18/20 Iteration 3169/3560 Training loss: 1.3780 0.0525 sec/batch\n", + "Epoch 18/20 Iteration 3170/3560 Training loss: 1.3782 0.0547 sec/batch\n", + "Epoch 18/20 Iteration 3171/3560 Training loss: 1.3781 0.0571 sec/batch\n", + "Epoch 18/20 Iteration 3172/3560 Training loss: 1.3783 0.0604 sec/batch\n", + "Epoch 18/20 Iteration 3173/3560 Training loss: 1.3783 0.0511 sec/batch\n", + "Epoch 18/20 Iteration 3174/3560 Training loss: 1.3786 0.0583 sec/batch\n", + "Epoch 18/20 Iteration 3175/3560 Training loss: 1.3787 0.0523 sec/batch\n", + "Epoch 18/20 Iteration 3176/3560 Training loss: 1.3787 0.0507 sec/batch\n", + "Epoch 18/20 Iteration 3177/3560 Training loss: 1.3784 0.0526 sec/batch\n", + "Epoch 18/20 Iteration 3178/3560 Training loss: 1.3783 0.0534 
sec/batch\n", + "Epoch 18/20 Iteration 3179/3560 Training loss: 1.3784 0.0530 sec/batch\n", + "Epoch 18/20 Iteration 3180/3560 Training loss: 1.3784 0.0527 sec/batch\n", + "Epoch 18/20 Iteration 3181/3560 Training loss: 1.3784 0.0555 sec/batch\n", + "Epoch 18/20 Iteration 3182/3560 Training loss: 1.3783 0.0544 sec/batch\n", + "Epoch 18/20 Iteration 3183/3560 Training loss: 1.3784 0.0524 sec/batch\n", + "Epoch 18/20 Iteration 3184/3560 Training loss: 1.3784 0.0539 sec/batch\n", + "Epoch 18/20 Iteration 3185/3560 Training loss: 1.3782 0.0509 sec/batch\n", + "Epoch 18/20 Iteration 3186/3560 Training loss: 1.3783 0.0563 sec/batch\n", + "Epoch 18/20 Iteration 3187/3560 Training loss: 1.3785 0.0512 sec/batch\n", + "Epoch 18/20 Iteration 3188/3560 Training loss: 1.3785 0.0508 sec/batch\n", + "Epoch 18/20 Iteration 3189/3560 Training loss: 1.3786 0.0524 sec/batch\n", + "Epoch 18/20 Iteration 3190/3560 Training loss: 1.3786 0.0561 sec/batch\n", + "Epoch 18/20 Iteration 3191/3560 Training loss: 1.3787 0.0524 sec/batch\n", + "Epoch 18/20 Iteration 3192/3560 Training loss: 1.3787 0.0581 sec/batch\n", + "Epoch 18/20 Iteration 3193/3560 Training loss: 1.3789 0.0554 sec/batch\n", + "Epoch 18/20 Iteration 3194/3560 Training loss: 1.3794 0.0512 sec/batch\n", + "Epoch 18/20 Iteration 3195/3560 Training loss: 1.3793 0.0555 sec/batch\n", + "Epoch 18/20 Iteration 3196/3560 Training loss: 1.3794 0.0576 sec/batch\n", + "Epoch 18/20 Iteration 3197/3560 Training loss: 1.3793 0.0582 sec/batch\n", + "Epoch 18/20 Iteration 3198/3560 Training loss: 1.3792 0.0560 sec/batch\n", + "Epoch 18/20 Iteration 3199/3560 Training loss: 1.3793 0.0515 sec/batch\n", + "Epoch 18/20 Iteration 3200/3560 Training loss: 1.3793 0.0531 sec/batch\n", + "Epoch 18/20 Iteration 3201/3560 Training loss: 1.3794 0.0552 sec/batch\n", + "Epoch 18/20 Iteration 3202/3560 Training loss: 1.3793 0.0517 sec/batch\n", + "Epoch 18/20 Iteration 3203/3560 Training loss: 1.3792 0.0537 sec/batch\n", + "Epoch 18/20 Iteration 3204/3560 
Training loss: 1.3794 0.0545 sec/batch\n", + "Epoch 19/20 Iteration 3205/3560 Training loss: 1.4462 0.0546 sec/batch\n", + "Epoch 19/20 Iteration 3206/3560 Training loss: 1.4203 0.0517 sec/batch\n", + "Epoch 19/20 Iteration 3207/3560 Training loss: 1.4087 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3208/3560 Training loss: 1.4069 0.0526 sec/batch\n", + "Epoch 19/20 Iteration 3209/3560 Training loss: 1.3969 0.0540 sec/batch\n", + "Epoch 19/20 Iteration 3210/3560 Training loss: 1.3872 0.0579 sec/batch\n", + "Epoch 19/20 Iteration 3211/3560 Training loss: 1.3870 0.0541 sec/batch\n", + "Epoch 19/20 Iteration 3212/3560 Training loss: 1.3851 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3213/3560 Training loss: 1.3861 0.0533 sec/batch\n", + "Epoch 19/20 Iteration 3214/3560 Training loss: 1.3847 0.0537 sec/batch\n", + "Epoch 19/20 Iteration 3215/3560 Training loss: 1.3818 0.0521 sec/batch\n", + "Epoch 19/20 Iteration 3216/3560 Training loss: 1.3811 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3217/3560 Training loss: 1.3806 0.0503 sec/batch\n", + "Epoch 19/20 Iteration 3218/3560 Training loss: 1.3822 0.0536 sec/batch\n", + "Epoch 19/20 Iteration 3219/3560 Training loss: 1.3812 0.0509 sec/batch\n", + "Epoch 19/20 Iteration 3220/3560 Training loss: 1.3792 0.0592 sec/batch\n", + "Epoch 19/20 Iteration 3221/3560 Training loss: 1.3795 0.0508 sec/batch\n", + "Epoch 19/20 Iteration 3222/3560 Training loss: 1.3812 0.0618 sec/batch\n", + "Epoch 19/20 Iteration 3223/3560 Training loss: 1.3808 0.0523 sec/batch\n", + "Epoch 19/20 Iteration 3224/3560 Training loss: 1.3825 0.0502 sec/batch\n", + "Epoch 19/20 Iteration 3225/3560 Training loss: 1.3819 0.0501 sec/batch\n", + "Epoch 19/20 Iteration 3226/3560 Training loss: 1.3821 0.0511 sec/batch\n", + "Epoch 19/20 Iteration 3227/3560 Training loss: 1.3812 0.0529 sec/batch\n", + "Epoch 19/20 Iteration 3228/3560 Training loss: 1.3811 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3229/3560 Training loss: 1.3813 0.0508 sec/batch\n", + 
"Epoch 19/20 Iteration 3230/3560 Training loss: 1.3793 0.0566 sec/batch\n", + "Epoch 19/20 Iteration 3231/3560 Training loss: 1.3781 0.0527 sec/batch\n", + "Epoch 19/20 Iteration 3232/3560 Training loss: 1.3787 0.0508 sec/batch\n", + "Epoch 19/20 Iteration 3233/3560 Training loss: 1.3789 0.0548 sec/batch\n", + "Epoch 19/20 Iteration 3234/3560 Training loss: 1.3795 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3235/3560 Training loss: 1.3788 0.0528 sec/batch\n", + "Epoch 19/20 Iteration 3236/3560 Training loss: 1.3780 0.0533 sec/batch\n", + "Epoch 19/20 Iteration 3237/3560 Training loss: 1.3783 0.0523 sec/batch\n", + "Epoch 19/20 Iteration 3238/3560 Training loss: 1.3782 0.0525 sec/batch\n", + "Epoch 19/20 Iteration 3239/3560 Training loss: 1.3777 0.0532 sec/batch\n", + "Epoch 19/20 Iteration 3240/3560 Training loss: 1.3774 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3241/3560 Training loss: 1.3762 0.0525 sec/batch\n", + "Epoch 19/20 Iteration 3242/3560 Training loss: 1.3752 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3243/3560 Training loss: 1.3741 0.0556 sec/batch\n", + "Epoch 19/20 Iteration 3244/3560 Training loss: 1.3739 0.0512 sec/batch\n", + "Epoch 19/20 Iteration 3245/3560 Training loss: 1.3732 0.0503 sec/batch\n", + "Epoch 19/20 Iteration 3246/3560 Training loss: 1.3737 0.0511 sec/batch\n", + "Epoch 19/20 Iteration 3247/3560 Training loss: 1.3734 0.0537 sec/batch\n", + "Epoch 19/20 Iteration 3248/3560 Training loss: 1.3726 0.0536 sec/batch\n", + "Epoch 19/20 Iteration 3249/3560 Training loss: 1.3730 0.0516 sec/batch\n", + "Epoch 19/20 Iteration 3250/3560 Training loss: 1.3722 0.0529 sec/batch\n", + "Epoch 19/20 Iteration 3251/3560 Training loss: 1.3718 0.0547 sec/batch\n", + "Epoch 19/20 Iteration 3252/3560 Training loss: 1.3714 0.0506 sec/batch\n", + "Epoch 19/20 Iteration 3253/3560 Training loss: 1.3713 0.0544 sec/batch\n", + "Epoch 19/20 Iteration 3254/3560 Training loss: 1.3716 0.0520 sec/batch\n", + "Epoch 19/20 Iteration 3255/3560 Training loss: 
1.3709 0.0532 sec/batch\n", + "Epoch 19/20 Iteration 3256/3560 Training loss: 1.3717 0.0507 sec/batch\n", + "Epoch 19/20 Iteration 3257/3560 Training loss: 1.3716 0.0520 sec/batch\n", + "Epoch 19/20 Iteration 3258/3560 Training loss: 1.3718 0.0511 sec/batch\n", + "Epoch 19/20 Iteration 3259/3560 Training loss: 1.3715 0.0522 sec/batch\n", + "Epoch 19/20 Iteration 3260/3560 Training loss: 1.3714 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3261/3560 Training loss: 1.3717 0.0518 sec/batch\n", + "Epoch 19/20 Iteration 3262/3560 Training loss: 1.3715 0.0508 sec/batch\n", + "Epoch 19/20 Iteration 3263/3560 Training loss: 1.3710 0.0539 sec/batch\n", + "Epoch 19/20 Iteration 3264/3560 Training loss: 1.3716 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3265/3560 Training loss: 1.3715 0.0563 sec/batch\n", + "Epoch 19/20 Iteration 3266/3560 Training loss: 1.3724 0.0550 sec/batch\n", + "Epoch 19/20 Iteration 3267/3560 Training loss: 1.3727 0.0521 sec/batch\n", + "Epoch 19/20 Iteration 3268/3560 Training loss: 1.3727 0.0502 sec/batch\n", + "Epoch 19/20 Iteration 3269/3560 Training loss: 1.3726 0.0568 sec/batch\n", + "Epoch 19/20 Iteration 3270/3560 Training loss: 1.3730 0.0522 sec/batch\n", + "Epoch 19/20 Iteration 3271/3560 Training loss: 1.3732 0.0518 sec/batch\n", + "Epoch 19/20 Iteration 3272/3560 Training loss: 1.3730 0.0550 sec/batch\n", + "Epoch 19/20 Iteration 3273/3560 Training loss: 1.3731 0.0545 sec/batch\n", + "Epoch 19/20 Iteration 3274/3560 Training loss: 1.3730 0.0534 sec/batch\n", + "Epoch 19/20 Iteration 3275/3560 Training loss: 1.3736 0.0546 sec/batch\n", + "Epoch 19/20 Iteration 3276/3560 Training loss: 1.3738 0.0539 sec/batch\n", + "Epoch 19/20 Iteration 3277/3560 Training loss: 1.3743 0.0528 sec/batch\n", + "Epoch 19/20 Iteration 3278/3560 Training loss: 1.3740 0.0535 sec/batch\n", + "Epoch 19/20 Iteration 3279/3560 Training loss: 1.3741 0.0521 sec/batch\n", + "Epoch 19/20 Iteration 3280/3560 Training loss: 1.3743 0.0521 sec/batch\n", + "Epoch 19/20 
Iteration 3281/3560 Training loss: 1.3741 0.0501 sec/batch\n", + "Epoch 19/20 Iteration 3282/3560 Training loss: 1.3741 0.0598 sec/batch\n", + "Epoch 19/20 Iteration 3283/3560 Training loss: 1.3736 0.0539 sec/batch\n", + "Epoch 19/20 Iteration 3284/3560 Training loss: 1.3736 0.0553 sec/batch\n", + "Epoch 19/20 Iteration 3285/3560 Training loss: 1.3731 0.0518 sec/batch\n", + "Epoch 19/20 Iteration 3286/3560 Training loss: 1.3730 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3287/3560 Training loss: 1.3726 0.0578 sec/batch\n", + "Epoch 19/20 Iteration 3288/3560 Training loss: 1.3726 0.0501 sec/batch\n", + "Epoch 19/20 Iteration 3289/3560 Training loss: 1.3723 0.0527 sec/batch\n", + "Epoch 19/20 Iteration 3290/3560 Training loss: 1.3722 0.0529 sec/batch\n", + "Epoch 19/20 Iteration 3291/3560 Training loss: 1.3720 0.0566 sec/batch\n", + "Epoch 19/20 Iteration 3292/3560 Training loss: 1.3717 0.0566 sec/batch\n", + "Epoch 19/20 Iteration 3293/3560 Training loss: 1.3714 0.0510 sec/batch\n", + "Epoch 19/20 Iteration 3294/3560 Training loss: 1.3714 0.0513 sec/batch\n", + "Epoch 19/20 Iteration 3295/3560 Training loss: 1.3711 0.0549 sec/batch\n", + "Epoch 19/20 Iteration 3296/3560 Training loss: 1.3711 0.0508 sec/batch\n", + "Epoch 19/20 Iteration 3297/3560 Training loss: 1.3708 0.0576 sec/batch\n", + "Epoch 19/20 Iteration 3298/3560 Training loss: 1.3704 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3299/3560 Training loss: 1.3702 0.0581 sec/batch\n", + "Epoch 19/20 Iteration 3300/3560 Training loss: 1.3702 0.0536 sec/batch\n", + "Epoch 19/20 Iteration 3301/3560 Training loss: 1.3703 0.0538 sec/batch\n", + "Epoch 19/20 Iteration 3302/3560 Training loss: 1.3700 0.0513 sec/batch\n", + "Epoch 19/20 Iteration 3303/3560 Training loss: 1.3696 0.0537 sec/batch\n", + "Epoch 19/20 Iteration 3304/3560 Training loss: 1.3693 0.0507 sec/batch\n", + "Epoch 19/20 Iteration 3305/3560 Training loss: 1.3694 0.0505 sec/batch\n", + "Epoch 19/20 Iteration 3306/3560 Training loss: 1.3692 0.0502 
sec/batch\n", + "Epoch 19/20 Iteration 3307/3560 Training loss: 1.3692 0.0531 sec/batch\n", + "Epoch 19/20 Iteration 3308/3560 Training loss: 1.3692 0.0558 sec/batch\n", + "Epoch 19/20 Iteration 3309/3560 Training loss: 1.3690 0.0531 sec/batch\n", + "Epoch 19/20 Iteration 3310/3560 Training loss: 1.3689 0.0523 sec/batch\n", + "Epoch 19/20 Iteration 3311/3560 Training loss: 1.3690 0.0507 sec/batch\n", + "Epoch 19/20 Iteration 3312/3560 Training loss: 1.3691 0.0525 sec/batch\n", + "Epoch 19/20 Iteration 3313/3560 Training loss: 1.3689 0.0556 sec/batch\n", + "Epoch 19/20 Iteration 3314/3560 Training loss: 1.3690 0.0526 sec/batch\n", + "Epoch 19/20 Iteration 3315/3560 Training loss: 1.3688 0.0555 sec/batch\n", + "Epoch 19/20 Iteration 3316/3560 Training loss: 1.3687 0.0551 sec/batch\n", + "Epoch 19/20 Iteration 3317/3560 Training loss: 1.3686 0.0560 sec/batch\n", + "Epoch 19/20 Iteration 3318/3560 Training loss: 1.3685 0.0531 sec/batch\n", + "Epoch 19/20 Iteration 3319/3560 Training loss: 1.3683 0.0533 sec/batch\n", + "Epoch 19/20 Iteration 3320/3560 Training loss: 1.3680 0.0555 sec/batch\n", + "Epoch 19/20 Iteration 3321/3560 Training loss: 1.3681 0.0598 sec/batch\n", + "Epoch 19/20 Iteration 3322/3560 Training loss: 1.3681 0.0520 sec/batch\n", + "Epoch 19/20 Iteration 3323/3560 Training loss: 1.3681 0.0573 sec/batch\n", + "Epoch 19/20 Iteration 3324/3560 Training loss: 1.3681 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3325/3560 Training loss: 1.3679 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3326/3560 Training loss: 1.3676 0.0518 sec/batch\n", + "Epoch 19/20 Iteration 3327/3560 Training loss: 1.3672 0.0520 sec/batch\n", + "Epoch 19/20 Iteration 3328/3560 Training loss: 1.3673 0.0535 sec/batch\n", + "Epoch 19/20 Iteration 3329/3560 Training loss: 1.3672 0.0530 sec/batch\n", + "Epoch 19/20 Iteration 3330/3560 Training loss: 1.3669 0.0524 sec/batch\n", + "Epoch 19/20 Iteration 3331/3560 Training loss: 1.3669 0.0519 sec/batch\n", + "Epoch 19/20 Iteration 3332/3560 
Training loss: 1.3669 0.0540 sec/batch\n", + "Epoch 19/20 Iteration 3333/3560 Training loss: 1.3667 0.0580 sec/batch\n", + "Epoch 19/20 Iteration 3334/3560 Training loss: 1.3664 0.0558 sec/batch\n", + "Epoch 19/20 Iteration 3335/3560 Training loss: 1.3661 0.0526 sec/batch\n", + "Epoch 19/20 Iteration 3336/3560 Training loss: 1.3659 0.0542 sec/batch\n", + "Epoch 19/20 Iteration 3337/3560 Training loss: 1.3660 0.0533 sec/batch\n", + "Epoch 19/20 Iteration 3338/3560 Training loss: 1.3660 0.0518 sec/batch\n", + "Epoch 19/20 Iteration 3339/3560 Training loss: 1.3661 0.0534 sec/batch\n", + "Epoch 19/20 Iteration 3340/3560 Training loss: 1.3661 0.0606 sec/batch\n", + "Epoch 19/20 Iteration 3341/3560 Training loss: 1.3664 0.0521 sec/batch\n", + "Epoch 19/20 Iteration 3342/3560 Training loss: 1.3664 0.0540 sec/batch\n", + "Epoch 19/20 Iteration 3343/3560 Training loss: 1.3665 0.0552 sec/batch\n", + "Epoch 19/20 Iteration 3344/3560 Training loss: 1.3665 0.0527 sec/batch\n", + "Epoch 19/20 Iteration 3345/3560 Training loss: 1.3668 0.0570 sec/batch\n", + "Epoch 19/20 Iteration 3346/3560 Training loss: 1.3669 0.0537 sec/batch\n", + "Epoch 19/20 Iteration 3347/3560 Training loss: 1.3669 0.0528 sec/batch\n", + "Epoch 19/20 Iteration 3348/3560 Training loss: 1.3672 0.0534 sec/batch\n", + "Epoch 19/20 Iteration 3349/3560 Training loss: 1.3670 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3350/3560 Training loss: 1.3672 0.0532 sec/batch\n", + "Epoch 19/20 Iteration 3351/3560 Training loss: 1.3673 0.0541 sec/batch\n", + "Epoch 19/20 Iteration 3352/3560 Training loss: 1.3676 0.0555 sec/batch\n", + "Epoch 19/20 Iteration 3353/3560 Training loss: 1.3677 0.0527 sec/batch\n", + "Epoch 19/20 Iteration 3354/3560 Training loss: 1.3677 0.0547 sec/batch\n", + "Epoch 19/20 Iteration 3355/3560 Training loss: 1.3674 0.0544 sec/batch\n", + "Epoch 19/20 Iteration 3356/3560 Training loss: 1.3673 0.0534 sec/batch\n", + "Epoch 19/20 Iteration 3357/3560 Training loss: 1.3675 0.0524 sec/batch\n", + 
"Epoch 19/20 Iteration 3358/3560 Training loss: 1.3675 0.0546 sec/batch\n", + "...\n", + "Epoch 20/20 Iteration 3383/3560 Training loss: 1.4466 0.0496 sec/batch\n", + "...\n", + "Epoch 20/20 Iteration 3560/3560 Training loss: 1.3589 0.0600 sec/batch\n", + "Epoch 1/20 Iteration 1/3560 Training loss: 4.4127 0.6333 sec/batch\n", + "...\n", + "Epoch 1/20 Iteration 178/3560 Training loss: 3.0344 0.0474 sec/batch\n", + "Epoch 2/20 Iteration 179/3560 Training loss: 2.6584 0.0528 sec/batch\n", + "...\n", + "Epoch 2/20 Iteration 319/3560 Training loss: 
2.4098 0.0488 sec/batch\n", + "Epoch 2/20 Iteration 320/3560 Training loss: 2.4094 0.0500 sec/batch\n", + "Epoch 2/20 Iteration 321/3560 Training loss: 2.4091 0.0545 sec/batch\n", + "Epoch 2/20 Iteration 322/3560 Training loss: 2.4087 0.0523 sec/batch\n", + "Epoch 2/20 Iteration 323/3560 Training loss: 2.4083 0.0520 sec/batch\n", + "Epoch 2/20 Iteration 324/3560 Training loss: 2.4081 0.0484 sec/batch\n", + "Epoch 2/20 Iteration 325/3560 Training loss: 2.4076 0.0579 sec/batch\n", + "Epoch 2/20 Iteration 326/3560 Training loss: 2.4071 0.0495 sec/batch\n", + "Epoch 2/20 Iteration 327/3560 Training loss: 2.4063 0.0495 sec/batch\n", + "Epoch 2/20 Iteration 328/3560 Training loss: 2.4054 0.0512 sec/batch\n", + "Epoch 2/20 Iteration 329/3560 Training loss: 2.4048 0.0497 sec/batch\n", + "Epoch 2/20 Iteration 330/3560 Training loss: 2.4043 0.0498 sec/batch\n", + "Epoch 2/20 Iteration 331/3560 Training loss: 2.4036 0.0518 sec/batch\n", + "Epoch 2/20 Iteration 332/3560 Training loss: 2.4030 0.0492 sec/batch\n", + "Epoch 2/20 Iteration 333/3560 Training loss: 2.4022 0.0512 sec/batch\n", + "Epoch 2/20 Iteration 334/3560 Training loss: 2.4015 0.0490 sec/batch\n", + "Epoch 2/20 Iteration 335/3560 Training loss: 2.4007 0.0511 sec/batch\n", + "Epoch 2/20 Iteration 336/3560 Training loss: 2.4000 0.0497 sec/batch\n", + "Epoch 2/20 Iteration 337/3560 Training loss: 2.3991 0.0498 sec/batch\n", + "Epoch 2/20 Iteration 338/3560 Training loss: 2.3985 0.0498 sec/batch\n", + "Epoch 2/20 Iteration 339/3560 Training loss: 2.3978 0.0501 sec/batch\n", + "Epoch 2/20 Iteration 340/3560 Training loss: 2.3970 0.0512 sec/batch\n", + "Epoch 2/20 Iteration 341/3560 Training loss: 2.3962 0.0606 sec/batch\n", + "Epoch 2/20 Iteration 342/3560 Training loss: 2.3955 0.0531 sec/batch\n", + "Epoch 2/20 Iteration 343/3560 Training loss: 2.3949 0.0508 sec/batch\n", + "Epoch 2/20 Iteration 344/3560 Training loss: 2.3942 0.0618 sec/batch\n", + "Epoch 2/20 Iteration 345/3560 Training loss: 2.3935 0.0501 
sec/batch\n", + "Epoch 2/20 Iteration 346/3560 Training loss: 2.3928 0.0512 sec/batch\n", + "Epoch 2/20 Iteration 347/3560 Training loss: 2.3921 0.0492 sec/batch\n", + "Epoch 2/20 Iteration 348/3560 Training loss: 2.3913 0.0490 sec/batch\n", + "Epoch 2/20 Iteration 349/3560 Training loss: 2.3906 0.0534 sec/batch\n", + "Epoch 2/20 Iteration 350/3560 Training loss: 2.3899 0.0493 sec/batch\n", + "Epoch 2/20 Iteration 351/3560 Training loss: 2.3893 0.0488 sec/batch\n", + "Epoch 2/20 Iteration 352/3560 Training loss: 2.3886 0.0494 sec/batch\n", + "Epoch 2/20 Iteration 353/3560 Training loss: 2.3880 0.0531 sec/batch\n", + "Epoch 2/20 Iteration 354/3560 Training loss: 2.3874 0.0476 sec/batch\n", + "Epoch 2/20 Iteration 355/3560 Training loss: 2.3866 0.0507 sec/batch\n", + "Epoch 2/20 Iteration 356/3560 Training loss: 2.3859 0.0519 sec/batch\n", + "Epoch 3/20 Iteration 357/3560 Training loss: 2.3466 0.0494 sec/batch\n", + "Epoch 3/20 Iteration 358/3560 Training loss: 2.2863 0.0548 sec/batch\n", + "Epoch 3/20 Iteration 359/3560 Training loss: 2.2712 0.0578 sec/batch\n", + "Epoch 3/20 Iteration 360/3560 Training loss: 2.2638 0.0541 sec/batch\n", + "Epoch 3/20 Iteration 361/3560 Training loss: 2.2596 0.0529 sec/batch\n", + "Epoch 3/20 Iteration 362/3560 Training loss: 2.2559 0.0646 sec/batch\n", + "Epoch 3/20 Iteration 363/3560 Training loss: 2.2565 0.0517 sec/batch\n", + "Epoch 3/20 Iteration 364/3560 Training loss: 2.2573 0.0557 sec/batch\n", + "Epoch 3/20 Iteration 365/3560 Training loss: 2.2583 0.0526 sec/batch\n", + "Epoch 3/20 Iteration 366/3560 Training loss: 2.2575 0.0589 sec/batch\n", + "Epoch 3/20 Iteration 367/3560 Training loss: 2.2553 0.0509 sec/batch\n", + "Epoch 3/20 Iteration 368/3560 Training loss: 2.2538 0.0548 sec/batch\n", + "Epoch 3/20 Iteration 369/3560 Training loss: 2.2535 0.0549 sec/batch\n", + "Epoch 3/20 Iteration 370/3560 Training loss: 2.2555 0.0552 sec/batch\n", + "Epoch 3/20 Iteration 371/3560 Training loss: 2.2554 0.0501 sec/batch\n", + "Epoch 
3/20 Iteration 372/3560 Training loss: 2.2546 0.0490 sec/batch\n", + "Epoch 3/20 Iteration 373/3560 Training loss: 2.2542 0.0509 sec/batch\n", + "Epoch 3/20 Iteration 374/3560 Training loss: 2.2558 0.0512 sec/batch\n", + "Epoch 3/20 Iteration 375/3560 Training loss: 2.2560 0.0508 sec/batch\n", + "Epoch 3/20 Iteration 376/3560 Training loss: 2.2547 0.0486 sec/batch\n", + "Epoch 3/20 Iteration 377/3560 Training loss: 2.2539 0.0481 sec/batch\n", + "Epoch 3/20 Iteration 378/3560 Training loss: 2.2551 0.0506 sec/batch\n", + "Epoch 3/20 Iteration 379/3560 Training loss: 2.2542 0.0489 sec/batch\n", + "Epoch 3/20 Iteration 380/3560 Training loss: 2.2530 0.0555 sec/batch\n", + "Epoch 3/20 Iteration 381/3560 Training loss: 2.2525 0.0558 sec/batch\n", + "Epoch 3/20 Iteration 382/3560 Training loss: 2.2512 0.0552 sec/batch\n", + "Epoch 3/20 Iteration 383/3560 Training loss: 2.2502 0.0522 sec/batch\n", + "Epoch 3/20 Iteration 384/3560 Training loss: 2.2498 0.0505 sec/batch\n", + "Epoch 3/20 Iteration 385/3560 Training loss: 2.2501 0.0479 sec/batch\n", + "Epoch 3/20 Iteration 386/3560 Training loss: 2.2495 0.0503 sec/batch\n", + "Epoch 3/20 Iteration 387/3560 Training loss: 2.2493 0.0569 sec/batch\n", + "Epoch 3/20 Iteration 388/3560 Training loss: 2.2483 0.0512 sec/batch\n", + "Epoch 3/20 Iteration 389/3560 Training loss: 2.2476 0.0504 sec/batch\n", + "Epoch 3/20 Iteration 390/3560 Training loss: 2.2477 0.0516 sec/batch\n", + "Epoch 3/20 Iteration 391/3560 Training loss: 2.2469 0.0510 sec/batch\n", + "Epoch 3/20 Iteration 392/3560 Training loss: 2.2463 0.0548 sec/batch\n", + "Epoch 3/20 Iteration 393/3560 Training loss: 2.2458 0.0528 sec/batch\n", + "Epoch 3/20 Iteration 394/3560 Training loss: 2.2443 0.0499 sec/batch\n", + "Epoch 3/20 Iteration 395/3560 Training loss: 2.2434 0.0576 sec/batch\n", + "Epoch 3/20 Iteration 396/3560 Training loss: 2.2422 0.0548 sec/batch\n", + "Epoch 3/20 Iteration 397/3560 Training loss: 2.2416 0.0504 sec/batch\n", + "Epoch 3/20 Iteration 398/3560 
Training loss: 2.2408 0.0489 sec/batch\n", + "Epoch 3/20 Iteration 399/3560 Training loss: 2.2398 0.0501 sec/batch\n", + "Epoch 3/20 Iteration 400/3560 Training loss: 2.2386 0.0532 sec/batch\n", + "Epoch 3/20 Iteration 401/3560 Training loss: 2.2380 0.0511 sec/batch\n", + "Epoch 3/20 Iteration 402/3560 Training loss: 2.2364 0.0620 sec/batch\n", + "Epoch 3/20 Iteration 403/3560 Training loss: 2.2361 0.0499 sec/batch\n", + "Epoch 3/20 Iteration 404/3560 Training loss: 2.2353 0.0593 sec/batch\n", + "Epoch 3/20 Iteration 405/3560 Training loss: 2.2348 0.0510 sec/batch\n", + "Epoch 3/20 Iteration 406/3560 Training loss: 2.2348 0.0502 sec/batch\n", + "Epoch 3/20 Iteration 407/3560 Training loss: 2.2339 0.0535 sec/batch\n", + "Epoch 3/20 Iteration 408/3560 Training loss: 2.2340 0.0597 sec/batch\n", + "Epoch 3/20 Iteration 409/3560 Training loss: 2.2334 0.0487 sec/batch\n", + "Epoch 3/20 Iteration 410/3560 Training loss: 2.2329 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 411/3560 Training loss: 2.2323 0.0516 sec/batch\n", + "Epoch 3/20 Iteration 412/3560 Training loss: 2.2320 0.0530 sec/batch\n", + "Epoch 3/20 Iteration 413/3560 Training loss: 2.2316 0.0552 sec/batch\n", + "Epoch 3/20 Iteration 414/3560 Training loss: 2.2307 0.0575 sec/batch\n", + "Epoch 3/20 Iteration 415/3560 Training loss: 2.2300 0.0553 sec/batch\n", + "Epoch 3/20 Iteration 416/3560 Training loss: 2.2299 0.0510 sec/batch\n", + "Epoch 3/20 Iteration 417/3560 Training loss: 2.2293 0.0516 sec/batch\n", + "Epoch 3/20 Iteration 418/3560 Training loss: 2.2293 0.0580 sec/batch\n", + "Epoch 3/20 Iteration 419/3560 Training loss: 2.2293 0.0523 sec/batch\n", + "Epoch 3/20 Iteration 420/3560 Training loss: 2.2290 0.0518 sec/batch\n", + "Epoch 3/20 Iteration 421/3560 Training loss: 2.2284 0.0505 sec/batch\n", + "Epoch 3/20 Iteration 422/3560 Training loss: 2.2283 0.0531 sec/batch\n", + "Epoch 3/20 Iteration 423/3560 Training loss: 2.2280 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 424/3560 Training loss: 2.2272 
0.0536 sec/batch\n", + "Epoch 3/20 Iteration 425/3560 Training loss: 2.2264 0.0515 sec/batch\n", + "Epoch 3/20 Iteration 426/3560 Training loss: 2.2260 0.0509 sec/batch\n", + "Epoch 3/20 Iteration 427/3560 Training loss: 2.2257 0.0516 sec/batch\n", + "Epoch 3/20 Iteration 428/3560 Training loss: 2.2253 0.0505 sec/batch\n", + "Epoch 3/20 Iteration 429/3560 Training loss: 2.2250 0.0490 sec/batch\n", + "Epoch 3/20 Iteration 430/3560 Training loss: 2.2244 0.0488 sec/batch\n", + "Epoch 3/20 Iteration 431/3560 Training loss: 2.2238 0.0491 sec/batch\n", + "Epoch 3/20 Iteration 432/3560 Training loss: 2.2238 0.0536 sec/batch\n", + "Epoch 3/20 Iteration 433/3560 Training loss: 2.2233 0.0525 sec/batch\n", + "Epoch 3/20 Iteration 434/3560 Training loss: 2.2230 0.0524 sec/batch\n", + "Epoch 3/20 Iteration 435/3560 Training loss: 2.2223 0.0559 sec/batch\n", + "Epoch 3/20 Iteration 436/3560 Training loss: 2.2217 0.0512 sec/batch\n", + "Epoch 3/20 Iteration 437/3560 Training loss: 2.2209 0.0502 sec/batch\n", + "Epoch 3/20 Iteration 438/3560 Training loss: 2.2206 0.0488 sec/batch\n", + "Epoch 3/20 Iteration 439/3560 Training loss: 2.2199 0.0546 sec/batch\n", + "Epoch 3/20 Iteration 440/3560 Training loss: 2.2193 0.0527 sec/batch\n", + "Epoch 3/20 Iteration 441/3560 Training loss: 2.2183 0.0593 sec/batch\n", + "Epoch 3/20 Iteration 442/3560 Training loss: 2.2176 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 443/3560 Training loss: 2.2173 0.0511 sec/batch\n", + "Epoch 3/20 Iteration 444/3560 Training loss: 2.2168 0.0524 sec/batch\n", + "Epoch 3/20 Iteration 445/3560 Training loss: 2.2161 0.0506 sec/batch\n", + "Epoch 3/20 Iteration 446/3560 Training loss: 2.2158 0.0522 sec/batch\n", + "Epoch 3/20 Iteration 447/3560 Training loss: 2.2154 0.0510 sec/batch\n", + "Epoch 3/20 Iteration 448/3560 Training loss: 2.2150 0.0532 sec/batch\n", + "Epoch 3/20 Iteration 449/3560 Training loss: 2.2141 0.0505 sec/batch\n", + "Epoch 3/20 Iteration 450/3560 Training loss: 2.2134 0.0495 sec/batch\n", + 
"Epoch 3/20 Iteration 451/3560 Training loss: 2.2129 0.0502 sec/batch\n", + "Epoch 3/20 Iteration 452/3560 Training loss: 2.2124 0.0496 sec/batch\n", + "Epoch 3/20 Iteration 453/3560 Training loss: 2.2119 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 454/3560 Training loss: 2.2113 0.0574 sec/batch\n", + "Epoch 3/20 Iteration 455/3560 Training loss: 2.2105 0.0527 sec/batch\n", + "Epoch 3/20 Iteration 456/3560 Training loss: 2.2097 0.0519 sec/batch\n", + "Epoch 3/20 Iteration 457/3560 Training loss: 2.2095 0.0524 sec/batch\n", + "Epoch 3/20 Iteration 458/3560 Training loss: 2.2090 0.0490 sec/batch\n", + "Epoch 3/20 Iteration 459/3560 Training loss: 2.2084 0.0499 sec/batch\n", + "Epoch 3/20 Iteration 460/3560 Training loss: 2.2078 0.0509 sec/batch\n", + "Epoch 3/20 Iteration 461/3560 Training loss: 2.2073 0.0506 sec/batch\n", + "Epoch 3/20 Iteration 462/3560 Training loss: 2.2069 0.0547 sec/batch\n", + "Epoch 3/20 Iteration 463/3560 Training loss: 2.2064 0.0542 sec/batch\n", + "Epoch 3/20 Iteration 464/3560 Training loss: 2.2062 0.0517 sec/batch\n", + "Epoch 3/20 Iteration 465/3560 Training loss: 2.2059 0.0508 sec/batch\n", + "Epoch 3/20 Iteration 466/3560 Training loss: 2.2053 0.0558 sec/batch\n", + "Epoch 3/20 Iteration 467/3560 Training loss: 2.2049 0.0494 sec/batch\n", + "Epoch 3/20 Iteration 468/3560 Training loss: 2.2045 0.0498 sec/batch\n", + "Epoch 3/20 Iteration 469/3560 Training loss: 2.2040 0.0509 sec/batch\n", + "Epoch 3/20 Iteration 470/3560 Training loss: 2.2035 0.0555 sec/batch\n", + "Epoch 3/20 Iteration 471/3560 Training loss: 2.2030 0.0504 sec/batch\n", + "Epoch 3/20 Iteration 472/3560 Training loss: 2.2023 0.0499 sec/batch\n", + "Epoch 3/20 Iteration 473/3560 Training loss: 2.2019 0.0500 sec/batch\n", + "Epoch 3/20 Iteration 474/3560 Training loss: 2.2014 0.0510 sec/batch\n", + "Epoch 3/20 Iteration 475/3560 Training loss: 2.2012 0.0528 sec/batch\n", + "Epoch 3/20 Iteration 476/3560 Training loss: 2.2008 0.0508 sec/batch\n", + "Epoch 3/20 Iteration 
477/3560 Training loss: 2.2005 0.0541 sec/batch\n", + "Epoch 3/20 Iteration 478/3560 Training loss: 2.1999 0.0516 sec/batch\n", + "Epoch 3/20 Iteration 479/3560 Training loss: 2.1994 0.0536 sec/batch\n", + "Epoch 3/20 Iteration 480/3560 Training loss: 2.1992 0.0528 sec/batch\n", + "Epoch 3/20 Iteration 481/3560 Training loss: 2.1987 0.0568 sec/batch\n", + "Epoch 3/20 Iteration 482/3560 Training loss: 2.1981 0.0548 sec/batch\n", + "Epoch 3/20 Iteration 483/3560 Training loss: 2.1978 0.0527 sec/batch\n", + "Epoch 3/20 Iteration 484/3560 Training loss: 2.1976 0.0659 sec/batch\n", + "Epoch 3/20 Iteration 485/3560 Training loss: 2.1972 0.0501 sec/batch\n", + "Epoch 3/20 Iteration 486/3560 Training loss: 2.1969 0.0508 sec/batch\n", + "Epoch 3/20 Iteration 487/3560 Training loss: 2.1964 0.0555 sec/batch\n", + "Epoch 3/20 Iteration 488/3560 Training loss: 2.1958 0.0599 sec/batch\n", + "Epoch 3/20 Iteration 489/3560 Training loss: 2.1956 0.0556 sec/batch\n", + "Epoch 3/20 Iteration 490/3560 Training loss: 2.1953 0.0518 sec/batch\n", + "Epoch 3/20 Iteration 491/3560 Training loss: 2.1949 0.0530 sec/batch\n", + "Epoch 3/20 Iteration 492/3560 Training loss: 2.1947 0.0594 sec/batch\n", + "Epoch 3/20 Iteration 493/3560 Training loss: 2.1944 0.0499 sec/batch\n", + "Epoch 3/20 Iteration 494/3560 Training loss: 2.1941 0.0536 sec/batch\n", + "Epoch 3/20 Iteration 495/3560 Training loss: 2.1941 0.0528 sec/batch\n", + "Epoch 3/20 Iteration 496/3560 Training loss: 2.1936 0.0669 sec/batch\n", + "Epoch 3/20 Iteration 497/3560 Training loss: 2.1934 0.0549 sec/batch\n", + "Epoch 3/20 Iteration 498/3560 Training loss: 2.1931 0.0533 sec/batch\n", + "Epoch 3/20 Iteration 499/3560 Training loss: 2.1927 0.0566 sec/batch\n", + "Epoch 3/20 Iteration 500/3560 Training loss: 2.1924 0.0555 sec/batch\n", + "Epoch 3/20 Iteration 501/3560 Training loss: 2.1920 0.0512 sec/batch\n", + "Epoch 3/20 Iteration 502/3560 Training loss: 2.1918 0.0497 sec/batch\n", + "Epoch 3/20 Iteration 503/3560 Training loss: 
2.1915 0.0519 sec/batch\n", + "Epoch 3/20 Iteration 504/3560 Training loss: 2.1913 0.0553 sec/batch\n", + "Epoch 3/20 Iteration 505/3560 Training loss: 2.1910 0.0512 sec/batch\n", + "Epoch 3/20 Iteration 506/3560 Training loss: 2.1905 0.0506 sec/batch\n", + "Epoch 3/20 Iteration 507/3560 Training loss: 2.1902 0.0591 sec/batch\n", + "Epoch 3/20 Iteration 508/3560 Training loss: 2.1901 0.0513 sec/batch\n", + "Epoch 3/20 Iteration 509/3560 Training loss: 2.1898 0.0522 sec/batch\n", + "Epoch 3/20 Iteration 510/3560 Training loss: 2.1894 0.0528 sec/batch\n", + "Epoch 3/20 Iteration 511/3560 Training loss: 2.1891 0.0529 sec/batch\n", + "Epoch 3/20 Iteration 512/3560 Training loss: 2.1888 0.0508 sec/batch\n", + "Epoch 3/20 Iteration 513/3560 Training loss: 2.1884 0.0501 sec/batch\n", + "Epoch 3/20 Iteration 514/3560 Training loss: 2.1880 0.0492 sec/batch\n", + "Epoch 3/20 Iteration 515/3560 Training loss: 2.1875 0.0520 sec/batch\n", + "Epoch 3/20 Iteration 516/3560 Training loss: 2.1874 0.0500 sec/batch\n", + "Epoch 3/20 Iteration 517/3560 Training loss: 2.1872 0.0565 sec/batch\n", + "Epoch 3/20 Iteration 518/3560 Training loss: 2.1868 0.0531 sec/batch\n", + "Epoch 3/20 Iteration 519/3560 Training loss: 2.1865 0.0510 sec/batch\n", + "Epoch 3/20 Iteration 520/3560 Training loss: 2.1862 0.0519 sec/batch\n", + "Epoch 3/20 Iteration 521/3560 Training loss: 2.1859 0.0507 sec/batch\n", + "Epoch 3/20 Iteration 522/3560 Training loss: 2.1856 0.0534 sec/batch\n", + "Epoch 3/20 Iteration 523/3560 Training loss: 2.1853 0.0608 sec/batch\n", + "Epoch 3/20 Iteration 524/3560 Training loss: 2.1852 0.0549 sec/batch\n", + "Epoch 3/20 Iteration 525/3560 Training loss: 2.1849 0.0596 sec/batch\n", + "Epoch 3/20 Iteration 526/3560 Training loss: 2.1845 0.0528 sec/batch\n", + "Epoch 3/20 Iteration 527/3560 Training loss: 2.1842 0.0558 sec/batch\n", + "Epoch 3/20 Iteration 528/3560 Training loss: 2.1838 0.0514 sec/batch\n", + "Epoch 3/20 Iteration 529/3560 Training loss: 2.1836 0.0516 
sec/batch\n", + "Epoch 3/20 Iteration 530/3560 Training loss: 2.1833 0.0514 sec/batch\n", + "Epoch 3/20 Iteration 531/3560 Training loss: 2.1831 0.0513 sec/batch\n", + "Epoch 3/20 Iteration 532/3560 Training loss: 2.1828 0.0490 sec/batch\n", + "Epoch 3/20 Iteration 533/3560 Training loss: 2.1824 0.0524 sec/batch\n", + "Epoch 3/20 Iteration 534/3560 Training loss: 2.1821 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 535/3560 Training loss: 2.2074 0.0481 sec/batch\n", + "Epoch 4/20 Iteration 536/3560 Training loss: 2.1521 0.0516 sec/batch\n", + "Epoch 4/20 Iteration 537/3560 Training loss: 2.1381 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 538/3560 Training loss: 2.1306 0.0511 sec/batch\n", + "Epoch 4/20 Iteration 539/3560 Training loss: 2.1269 0.0527 sec/batch\n", + "Epoch 4/20 Iteration 540/3560 Training loss: 2.1214 0.0617 sec/batch\n", + "Epoch 4/20 Iteration 541/3560 Training loss: 2.1209 0.0555 sec/batch\n", + "Epoch 4/20 Iteration 542/3560 Training loss: 2.1211 0.0504 sec/batch\n", + "Epoch 4/20 Iteration 543/3560 Training loss: 2.1220 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 544/3560 Training loss: 2.1212 0.0557 sec/batch\n", + "Epoch 4/20 Iteration 545/3560 Training loss: 2.1189 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 546/3560 Training loss: 2.1168 0.0534 sec/batch\n", + "Epoch 4/20 Iteration 547/3560 Training loss: 2.1172 0.0530 sec/batch\n", + "Epoch 4/20 Iteration 548/3560 Training loss: 2.1192 0.0504 sec/batch\n", + "Epoch 4/20 Iteration 549/3560 Training loss: 2.1184 0.0558 sec/batch\n", + "Epoch 4/20 Iteration 550/3560 Training loss: 2.1176 0.0589 sec/batch\n", + "Epoch 4/20 Iteration 551/3560 Training loss: 2.1169 0.0495 sec/batch\n", + "Epoch 4/20 Iteration 552/3560 Training loss: 2.1191 0.0518 sec/batch\n", + "Epoch 4/20 Iteration 553/3560 Training loss: 2.1193 0.0527 sec/batch\n", + "Epoch 4/20 Iteration 554/3560 Training loss: 2.1186 0.0530 sec/batch\n", + "Epoch 4/20 Iteration 555/3560 Training loss: 2.1177 0.0527 sec/batch\n", + "Epoch 
4/20 Iteration 556/3560 Training loss: 2.1186 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 557/3560 Training loss: 2.1181 0.0523 sec/batch\n", + "Epoch 4/20 Iteration 558/3560 Training loss: 2.1170 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 559/3560 Training loss: 2.1164 0.0551 sec/batch\n", + "Epoch 4/20 Iteration 560/3560 Training loss: 2.1151 0.0526 sec/batch\n", + "Epoch 4/20 Iteration 561/3560 Training loss: 2.1141 0.0511 sec/batch\n", + "Epoch 4/20 Iteration 562/3560 Training loss: 2.1140 0.0503 sec/batch\n", + "Epoch 4/20 Iteration 563/3560 Training loss: 2.1144 0.0517 sec/batch\n", + "Epoch 4/20 Iteration 564/3560 Training loss: 2.1143 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 565/3560 Training loss: 2.1141 0.0573 sec/batch\n", + "Epoch 4/20 Iteration 566/3560 Training loss: 2.1133 0.0594 sec/batch\n", + "Epoch 4/20 Iteration 567/3560 Training loss: 2.1129 0.0530 sec/batch\n", + "Epoch 4/20 Iteration 568/3560 Training loss: 2.1133 0.0517 sec/batch\n", + "Epoch 4/20 Iteration 569/3560 Training loss: 2.1128 0.0520 sec/batch\n", + "Epoch 4/20 Iteration 570/3560 Training loss: 2.1126 0.0536 sec/batch\n", + "Epoch 4/20 Iteration 571/3560 Training loss: 2.1121 0.0520 sec/batch\n", + "Epoch 4/20 Iteration 572/3560 Training loss: 2.1107 0.0531 sec/batch\n", + "Epoch 4/20 Iteration 573/3560 Training loss: 2.1095 0.0558 sec/batch\n", + "Epoch 4/20 Iteration 574/3560 Training loss: 2.1085 0.0540 sec/batch\n", + "Epoch 4/20 Iteration 575/3560 Training loss: 2.1079 0.0570 sec/batch\n", + "Epoch 4/20 Iteration 576/3560 Training loss: 2.1076 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 577/3560 Training loss: 2.1070 0.0519 sec/batch\n", + "Epoch 4/20 Iteration 578/3560 Training loss: 2.1060 0.0500 sec/batch\n", + "Epoch 4/20 Iteration 579/3560 Training loss: 2.1055 0.0508 sec/batch\n", + "Epoch 4/20 Iteration 580/3560 Training loss: 2.1043 0.0543 sec/batch\n", + "Epoch 4/20 Iteration 581/3560 Training loss: 2.1040 0.0519 sec/batch\n", + "Epoch 4/20 Iteration 582/3560 
Training loss: 2.1033 0.0561 sec/batch\n", + "Epoch 4/20 Iteration 583/3560 Training loss: 2.1028 0.0615 sec/batch\n", + "Epoch 4/20 Iteration 584/3560 Training loss: 2.1030 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 585/3560 Training loss: 2.1022 0.0524 sec/batch\n", + "Epoch 4/20 Iteration 586/3560 Training loss: 2.1026 0.0529 sec/batch\n", + "Epoch 4/20 Iteration 587/3560 Training loss: 2.1020 0.0532 sec/batch\n", + "Epoch 4/20 Iteration 588/3560 Training loss: 2.1015 0.0484 sec/batch\n", + "Epoch 4/20 Iteration 589/3560 Training loss: 2.1010 0.0504 sec/batch\n", + "Epoch 4/20 Iteration 590/3560 Training loss: 2.1009 0.0518 sec/batch\n", + "Epoch 4/20 Iteration 591/3560 Training loss: 2.1008 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 592/3560 Training loss: 2.1002 0.0548 sec/batch\n", + "Epoch 4/20 Iteration 593/3560 Training loss: 2.0995 0.0518 sec/batch\n", + "Epoch 4/20 Iteration 594/3560 Training loss: 2.0999 0.0506 sec/batch\n", + "Epoch 4/20 Iteration 595/3560 Training loss: 2.0996 0.0522 sec/batch\n", + "Epoch 4/20 Iteration 596/3560 Training loss: 2.0998 0.0518 sec/batch\n", + "Epoch 4/20 Iteration 597/3560 Training loss: 2.1000 0.0536 sec/batch\n", + "Epoch 4/20 Iteration 598/3560 Training loss: 2.0999 0.0531 sec/batch\n", + "Epoch 4/20 Iteration 599/3560 Training loss: 2.0996 0.0520 sec/batch\n", + "Epoch 4/20 Iteration 600/3560 Training loss: 2.0997 0.0518 sec/batch\n", + "Epoch 4/20 Iteration 601/3560 Training loss: 2.0995 0.0581 sec/batch\n", + "Epoch 4/20 Iteration 602/3560 Training loss: 2.0987 0.0600 sec/batch\n", + "Epoch 4/20 Iteration 603/3560 Training loss: 2.0981 0.0547 sec/batch\n", + "Epoch 4/20 Iteration 604/3560 Training loss: 2.0979 0.0583 sec/batch\n", + "Epoch 4/20 Iteration 605/3560 Training loss: 2.0980 0.0566 sec/batch\n", + "Epoch 4/20 Iteration 606/3560 Training loss: 2.0978 0.0507 sec/batch\n", + "Epoch 4/20 Iteration 607/3560 Training loss: 2.0978 0.0514 sec/batch\n", + "Epoch 4/20 Iteration 608/3560 Training loss: 2.0972 
0.0569 sec/batch\n", + "Epoch 4/20 Iteration 609/3560 Training loss: 2.0969 0.0531 sec/batch\n", + "Epoch 4/20 Iteration 610/3560 Training loss: 2.0971 0.0526 sec/batch\n", + "Epoch 4/20 Iteration 611/3560 Training loss: 2.0968 0.0572 sec/batch\n", + "Epoch 4/20 Iteration 612/3560 Training loss: 2.0967 0.0523 sec/batch\n", + "Epoch 4/20 Iteration 613/3560 Training loss: 2.0960 0.0535 sec/batch\n", + "Epoch 4/20 Iteration 614/3560 Training loss: 2.0958 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 615/3560 Training loss: 2.0952 0.0548 sec/batch\n", + "Epoch 4/20 Iteration 616/3560 Training loss: 2.0951 0.0501 sec/batch\n", + "Epoch 4/20 Iteration 617/3560 Training loss: 2.0944 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 618/3560 Training loss: 2.0940 0.0510 sec/batch\n", + "Epoch 4/20 Iteration 619/3560 Training loss: 2.0932 0.0525 sec/batch\n", + "Epoch 4/20 Iteration 620/3560 Training loss: 2.0927 0.0617 sec/batch\n", + "Epoch 4/20 Iteration 621/3560 Training loss: 2.0924 0.0515 sec/batch\n", + "Epoch 4/20 Iteration 622/3560 Training loss: 2.0919 0.0545 sec/batch\n", + "Epoch 4/20 Iteration 623/3560 Training loss: 2.0912 0.0536 sec/batch\n", + "Epoch 4/20 Iteration 624/3560 Training loss: 2.0910 0.0652 sec/batch\n", + "Epoch 4/20 Iteration 625/3560 Training loss: 2.0906 0.0524 sec/batch\n", + "Epoch 4/20 Iteration 626/3560 Training loss: 2.0903 0.0495 sec/batch\n", + "Epoch 4/20 Iteration 627/3560 Training loss: 2.0896 0.0506 sec/batch\n", + "Epoch 4/20 Iteration 628/3560 Training loss: 2.0890 0.0617 sec/batch\n", + "Epoch 4/20 Iteration 629/3560 Training loss: 2.0885 0.0589 sec/batch\n", + "Epoch 4/20 Iteration 630/3560 Training loss: 2.0880 0.0571 sec/batch\n", + "Epoch 4/20 Iteration 631/3560 Training loss: 2.0876 0.0518 sec/batch\n", + "Epoch 4/20 Iteration 632/3560 Training loss: 2.0871 0.0500 sec/batch\n", + "Epoch 4/20 Iteration 633/3560 Training loss: 2.0866 0.0503 sec/batch\n", + "Epoch 4/20 Iteration 634/3560 Training loss: 2.0859 0.0505 sec/batch\n", + 
"Epoch 4/20 Iteration 635/3560 Training loss: 2.0858 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 636/3560 Training loss: 2.0855 0.0499 sec/batch\n", + "Epoch 4/20 Iteration 637/3560 Training loss: 2.0850 0.0582 sec/batch\n", + "Epoch 4/20 Iteration 638/3560 Training loss: 2.0846 0.0569 sec/batch\n", + "Epoch 4/20 Iteration 639/3560 Training loss: 2.0842 0.0541 sec/batch\n", + "Epoch 4/20 Iteration 640/3560 Training loss: 2.0839 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 641/3560 Training loss: 2.0836 0.0539 sec/batch\n", + "Epoch 4/20 Iteration 642/3560 Training loss: 2.0833 0.0503 sec/batch\n", + "Epoch 4/20 Iteration 643/3560 Training loss: 2.0831 0.0508 sec/batch\n", + "Epoch 4/20 Iteration 644/3560 Training loss: 2.0827 0.0554 sec/batch\n", + "Epoch 4/20 Iteration 645/3560 Training loss: 2.0825 0.0511 sec/batch\n", + "Epoch 4/20 Iteration 646/3560 Training loss: 2.0821 0.0498 sec/batch\n", + "Epoch 4/20 Iteration 647/3560 Training loss: 2.0818 0.0520 sec/batch\n", + "Epoch 4/20 Iteration 648/3560 Training loss: 2.0815 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 649/3560 Training loss: 2.0811 0.0530 sec/batch\n", + "Epoch 4/20 Iteration 650/3560 Training loss: 2.0804 0.0571 sec/batch\n", + "Epoch 4/20 Iteration 651/3560 Training loss: 2.0801 0.0531 sec/batch\n", + "Epoch 4/20 Iteration 652/3560 Training loss: 2.0797 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 653/3560 Training loss: 2.0795 0.0515 sec/batch\n", + "Epoch 4/20 Iteration 654/3560 Training loss: 2.0793 0.0518 sec/batch\n", + "Epoch 4/20 Iteration 655/3560 Training loss: 2.0791 0.0615 sec/batch\n", + "Epoch 4/20 Iteration 656/3560 Training loss: 2.0786 0.0561 sec/batch\n", + "Epoch 4/20 Iteration 657/3560 Training loss: 2.0781 0.0544 sec/batch\n", + "Epoch 4/20 Iteration 658/3560 Training loss: 2.0780 0.0498 sec/batch\n", + "Epoch 4/20 Iteration 659/3560 Training loss: 2.0777 0.0549 sec/batch\n", + "Epoch 4/20 Iteration 660/3560 Training loss: 2.0772 0.0523 sec/batch\n", + "Epoch 4/20 Iteration 
661/3560 Training loss: 2.0771 0.0500 sec/batch\n", + "Epoch 4/20 Iteration 662/3560 Training loss: 2.0769 0.0511 sec/batch\n", + "Epoch 4/20 Iteration 663/3560 Training loss: 2.0766 0.0522 sec/batch\n", + "Epoch 4/20 Iteration 664/3560 Training loss: 2.0765 0.0558 sec/batch\n", + "Epoch 4/20 Iteration 665/3560 Training loss: 2.0760 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 666/3560 Training loss: 2.0756 0.0499 sec/batch\n", + "Epoch 4/20 Iteration 667/3560 Training loss: 2.0755 0.0510 sec/batch\n", + "Epoch 4/20 Iteration 668/3560 Training loss: 2.0754 0.0566 sec/batch\n", + "Epoch 4/20 Iteration 669/3560 Training loss: 2.0752 0.0537 sec/batch\n", + "Epoch 4/20 Iteration 670/3560 Training loss: 2.0750 0.0520 sec/batch\n", + "Epoch 4/20 Iteration 671/3560 Training loss: 2.0750 0.0539 sec/batch\n", + "Epoch 4/20 Iteration 672/3560 Training loss: 2.0749 0.0509 sec/batch\n", + "Epoch 4/20 Iteration 673/3560 Training loss: 2.0749 0.0496 sec/batch\n", + "Epoch 4/20 Iteration 674/3560 Training loss: 2.0746 0.0503 sec/batch\n", + "Epoch 4/20 Iteration 675/3560 Training loss: 2.0745 0.0535 sec/batch\n", + "Epoch 4/20 Iteration 676/3560 Training loss: 2.0742 0.0550 sec/batch\n", + "Epoch 4/20 Iteration 677/3560 Training loss: 2.0739 0.0521 sec/batch\n", + "Epoch 4/20 Iteration 678/3560 Training loss: 2.0738 0.0540 sec/batch\n", + "Epoch 4/20 Iteration 679/3560 Training loss: 2.0735 0.0496 sec/batch\n", + "Epoch 4/20 Iteration 680/3560 Training loss: 2.0734 0.0519 sec/batch\n", + "Epoch 4/20 Iteration 681/3560 Training loss: 2.0733 0.0591 sec/batch\n", + "Epoch 4/20 Iteration 682/3560 Training loss: 2.0733 0.0531 sec/batch\n", + "Epoch 4/20 Iteration 683/3560 Training loss: 2.0730 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 684/3560 Training loss: 2.0726 0.0552 sec/batch\n", + "Epoch 4/20 Iteration 685/3560 Training loss: 2.0722 0.0547 sec/batch\n", + "Epoch 4/20 Iteration 686/3560 Training loss: 2.0722 0.0581 sec/batch\n", + "Epoch 4/20 Iteration 687/3560 Training loss: 
2.0721 0.0491 sec/batch\n", + "Epoch 4/20 Iteration 688/3560 Training loss: 2.0720 0.0531 sec/batch\n", + "Epoch 4/20 Iteration 689/3560 Training loss: 2.0717 0.0577 sec/batch\n", + "Epoch 4/20 Iteration 690/3560 Training loss: 2.0714 0.0512 sec/batch\n", + "Epoch 4/20 Iteration 691/3560 Training loss: 2.0712 0.0521 sec/batch\n", + "Epoch 4/20 Iteration 692/3560 Training loss: 2.0709 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 693/3560 Training loss: 2.0706 0.0580 sec/batch\n", + "Epoch 4/20 Iteration 694/3560 Training loss: 2.0706 0.0531 sec/batch\n", + "Epoch 4/20 Iteration 695/3560 Training loss: 2.0705 0.0573 sec/batch\n", + "Epoch 4/20 Iteration 696/3560 Training loss: 2.0703 0.0553 sec/batch\n", + "Epoch 4/20 Iteration 697/3560 Training loss: 2.0701 0.0520 sec/batch\n", + "Epoch 4/20 Iteration 698/3560 Training loss: 2.0699 0.0683 sec/batch\n", + "Epoch 4/20 Iteration 699/3560 Training loss: 2.0698 0.0502 sec/batch\n", + "Epoch 4/20 Iteration 700/3560 Training loss: 2.0695 0.0519 sec/batch\n", + "Epoch 4/20 Iteration 701/3560 Training loss: 2.0693 0.0500 sec/batch\n", + "Epoch 4/20 Iteration 702/3560 Training loss: 2.0694 0.0557 sec/batch\n", + "Epoch 4/20 Iteration 703/3560 Training loss: 2.0692 0.0508 sec/batch\n", + "Epoch 4/20 Iteration 704/3560 Training loss: 2.0689 0.0515 sec/batch\n", + "Epoch 4/20 Iteration 705/3560 Training loss: 2.0686 0.0541 sec/batch\n", + "Epoch 4/20 Iteration 706/3560 Training loss: 2.0683 0.0494 sec/batch\n", + "Epoch 4/20 Iteration 707/3560 Training loss: 2.0682 0.0499 sec/batch\n", + "Epoch 4/20 Iteration 708/3560 Training loss: 2.0680 0.0549 sec/batch\n", + "Epoch 4/20 Iteration 709/3560 Training loss: 2.0679 0.0541 sec/batch\n", + "Epoch 4/20 Iteration 710/3560 Training loss: 2.0677 0.0518 sec/batch\n", + "Epoch 4/20 Iteration 711/3560 Training loss: 2.0674 0.0569 sec/batch\n", + "Epoch 4/20 Iteration 712/3560 Training loss: 2.0672 0.0510 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 2.0986 0.0507 
sec/batch\n", + "Epoch 5/20 Iteration 714/3560 Training loss: 2.0531 0.0565 sec/batch\n", + "Epoch 5/20 Iteration 715/3560 Training loss: 2.0383 0.0523 sec/batch\n", + "Epoch 5/20 Iteration 716/3560 Training loss: 2.0332 0.0533 sec/batch\n", + "Epoch 5/20 Iteration 717/3560 Training loss: 2.0303 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 718/3560 Training loss: 2.0226 0.0556 sec/batch\n", + "Epoch 5/20 Iteration 719/3560 Training loss: 2.0230 0.0510 sec/batch\n", + "Epoch 5/20 Iteration 720/3560 Training loss: 2.0228 0.0510 sec/batch\n", + "Epoch 5/20 Iteration 721/3560 Training loss: 2.0244 0.0531 sec/batch\n", + "Epoch 5/20 Iteration 722/3560 Training loss: 2.0241 0.0504 sec/batch\n", + "Epoch 5/20 Iteration 723/3560 Training loss: 2.0214 0.0556 sec/batch\n", + "Epoch 5/20 Iteration 724/3560 Training loss: 2.0193 0.0522 sec/batch\n", + "Epoch 5/20 Iteration 725/3560 Training loss: 2.0190 0.0556 sec/batch\n", + "Epoch 5/20 Iteration 726/3560 Training loss: 2.0211 0.0547 sec/batch\n", + "Epoch 5/20 Iteration 727/3560 Training loss: 2.0201 0.0522 sec/batch\n", + "Epoch 5/20 Iteration 728/3560 Training loss: 2.0185 0.0552 sec/batch\n", + "Epoch 5/20 Iteration 729/3560 Training loss: 2.0183 0.0574 sec/batch\n", + "Epoch 5/20 Iteration 730/3560 Training loss: 2.0205 0.0775 sec/batch\n", + "Epoch 5/20 Iteration 731/3560 Training loss: 2.0205 0.0549 sec/batch\n", + "Epoch 5/20 Iteration 732/3560 Training loss: 2.0202 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 733/3560 Training loss: 2.0197 0.0622 sec/batch\n", + "Epoch 5/20 Iteration 734/3560 Training loss: 2.0204 0.0525 sec/batch\n", + "Epoch 5/20 Iteration 735/3560 Training loss: 2.0200 0.0557 sec/batch\n", + "Epoch 5/20 Iteration 736/3560 Training loss: 2.0190 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 737/3560 Training loss: 2.0187 0.0585 sec/batch\n", + "Epoch 5/20 Iteration 738/3560 Training loss: 2.0177 0.0524 sec/batch\n", + "Epoch 5/20 Iteration 739/3560 Training loss: 2.0168 0.0499 sec/batch\n", + "Epoch 
5/20 Iteration 740/3560 Training loss: 2.0170 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 741/3560 Training loss: 2.0178 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 742/3560 Training loss: 2.0174 0.0525 sec/batch\n", + "Epoch 5/20 Iteration 743/3560 Training loss: 2.0169 0.0577 sec/batch\n", + "Epoch 5/20 Iteration 744/3560 Training loss: 2.0163 0.0527 sec/batch\n", + "Epoch 5/20 Iteration 745/3560 Training loss: 2.0161 0.0519 sec/batch\n", + "Epoch 5/20 Iteration 746/3560 Training loss: 2.0165 0.0532 sec/batch\n", + "Epoch 5/20 Iteration 747/3560 Training loss: 2.0160 0.0543 sec/batch\n", + "Epoch 5/20 Iteration 748/3560 Training loss: 2.0154 0.0514 sec/batch\n", + "Epoch 5/20 Iteration 749/3560 Training loss: 2.0150 0.0536 sec/batch\n", + "Epoch 5/20 Iteration 750/3560 Training loss: 2.0136 0.0517 sec/batch\n", + "Epoch 5/20 Iteration 751/3560 Training loss: 2.0125 0.0558 sec/batch\n", + "Epoch 5/20 Iteration 752/3560 Training loss: 2.0117 0.0509 sec/batch\n", + "Epoch 5/20 Iteration 753/3560 Training loss: 2.0110 0.0539 sec/batch\n", + "Epoch 5/20 Iteration 754/3560 Training loss: 2.0108 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 755/3560 Training loss: 2.0103 0.0504 sec/batch\n", + "Epoch 5/20 Iteration 756/3560 Training loss: 2.0094 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 757/3560 Training loss: 2.0094 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 758/3560 Training loss: 2.0080 0.0518 sec/batch\n", + "Epoch 5/20 Iteration 759/3560 Training loss: 2.0078 0.0518 sec/batch\n", + "Epoch 5/20 Iteration 760/3560 Training loss: 2.0069 0.0577 sec/batch\n", + "Epoch 5/20 Iteration 761/3560 Training loss: 2.0067 0.0519 sec/batch\n", + "Epoch 5/20 Iteration 762/3560 Training loss: 2.0072 0.0501 sec/batch\n", + "Epoch 5/20 Iteration 763/3560 Training loss: 2.0065 0.0689 sec/batch\n", + "Epoch 5/20 Iteration 764/3560 Training loss: 2.0071 0.0610 sec/batch\n", + "Epoch 5/20 Iteration 765/3560 Training loss: 2.0067 0.0596 sec/batch\n", + "Epoch 5/20 Iteration 766/3560 
Training loss: 2.0064 0.0547 sec/batch\n", + "Epoch 5/20 Iteration 767/3560 Training loss: 2.0058 0.0526 sec/batch\n", + "Epoch 5/20 Iteration 768/3560 Training loss: 2.0059 0.0520 sec/batch\n", + "Epoch 5/20 Iteration 769/3560 Training loss: 2.0058 0.0530 sec/batch\n", + "Epoch 5/20 Iteration 770/3560 Training loss: 2.0052 0.0616 sec/batch\n", + "Epoch 5/20 Iteration 771/3560 Training loss: 2.0046 0.0523 sec/batch\n", + "Epoch 5/20 Iteration 772/3560 Training loss: 2.0052 0.0511 sec/batch\n", + "Epoch 5/20 Iteration 773/3560 Training loss: 2.0050 0.0566 sec/batch\n", + "Epoch 5/20 Iteration 774/3560 Training loss: 2.0054 0.0532 sec/batch\n", + "Epoch 5/20 Iteration 775/3560 Training loss: 2.0055 0.0533 sec/batch\n", + "Epoch 5/20 Iteration 776/3560 Training loss: 2.0056 0.0532 sec/batch\n", + "Epoch 5/20 Iteration 777/3560 Training loss: 2.0053 0.0519 sec/batch\n", + "Epoch 5/20 Iteration 778/3560 Training loss: 2.0053 0.0583 sec/batch\n", + "Epoch 5/20 Iteration 779/3560 Training loss: 2.0052 0.0614 sec/batch\n", + "Epoch 5/20 Iteration 780/3560 Training loss: 2.0046 0.0572 sec/batch\n", + "Epoch 5/20 Iteration 781/3560 Training loss: 2.0042 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 782/3560 Training loss: 2.0039 0.0530 sec/batch\n", + "Epoch 5/20 Iteration 783/3560 Training loss: 2.0041 0.0589 sec/batch\n", + "Epoch 5/20 Iteration 784/3560 Training loss: 2.0040 0.0520 sec/batch\n", + "Epoch 5/20 Iteration 785/3560 Training loss: 2.0041 0.0494 sec/batch\n", + "Epoch 5/20 Iteration 786/3560 Training loss: 2.0035 0.0581 sec/batch\n", + "Epoch 5/20 Iteration 787/3560 Training loss: 2.0033 0.0545 sec/batch\n", + "Epoch 5/20 Iteration 788/3560 Training loss: 2.0035 0.0529 sec/batch\n", + "Epoch 5/20 Iteration 789/3560 Training loss: 2.0034 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 790/3560 Training loss: 2.0033 0.0566 sec/batch\n", + "Epoch 5/20 Iteration 791/3560 Training loss: 2.0027 0.0512 sec/batch\n", + "Epoch 5/20 Iteration 792/3560 Training loss: 2.0025 
0.0508 sec/batch\n", + "Epoch 5/20 Iteration 793/3560 Training loss: 2.0019 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 794/3560 Training loss: 2.0019 0.0509 sec/batch\n", + "Epoch 5/20 Iteration 795/3560 Training loss: 2.0012 0.0537 sec/batch\n", + "Epoch 5/20 Iteration 796/3560 Training loss: 2.0010 0.0534 sec/batch\n", + "Epoch 5/20 Iteration 797/3560 Training loss: 2.0002 0.0583 sec/batch\n", + "Epoch 5/20 Iteration 798/3560 Training loss: 1.9997 0.0501 sec/batch\n", + "Epoch 5/20 Iteration 799/3560 Training loss: 1.9993 0.0526 sec/batch\n", + "Epoch 5/20 Iteration 800/3560 Training loss: 1.9989 0.0511 sec/batch\n", + "Epoch 5/20 Iteration 801/3560 Training loss: 1.9983 0.0584 sec/batch\n", + "Epoch 5/20 Iteration 802/3560 Training loss: 1.9983 0.0564 sec/batch\n", + "Epoch 5/20 Iteration 803/3560 Training loss: 1.9978 0.0563 sec/batch\n", + "Epoch 5/20 Iteration 804/3560 Training loss: 1.9975 0.0541 sec/batch\n", + "Epoch 5/20 Iteration 805/3560 Training loss: 1.9969 0.0532 sec/batch\n", + "Epoch 5/20 Iteration 806/3560 Training loss: 1.9964 0.0536 sec/batch\n", + "Epoch 5/20 Iteration 807/3560 Training loss: 1.9958 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 808/3560 Training loss: 1.9955 0.0541 sec/batch\n", + "Epoch 5/20 Iteration 809/3560 Training loss: 1.9951 0.0505 sec/batch\n", + "Epoch 5/20 Iteration 810/3560 Training loss: 1.9946 0.0549 sec/batch\n", + "Epoch 5/20 Iteration 811/3560 Training loss: 1.9940 0.0523 sec/batch\n", + "Epoch 5/20 Iteration 812/3560 Training loss: 1.9933 0.0516 sec/batch\n", + "Epoch 5/20 Iteration 813/3560 Training loss: 1.9930 0.0521 sec/batch\n", + "Epoch 5/20 Iteration 814/3560 Training loss: 1.9928 0.0559 sec/batch\n", + "Epoch 5/20 Iteration 815/3560 Training loss: 1.9922 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 816/3560 Training loss: 1.9919 0.0619 sec/batch\n", + "Epoch 5/20 Iteration 817/3560 Training loss: 1.9915 0.0579 sec/batch\n", + "Epoch 5/20 Iteration 818/3560 Training loss: 1.9912 0.0671 sec/batch\n", + 
"Epoch 5/20 Iteration 819/3560 Training loss: 1.9910 0.0575 sec/batch\n", + "Epoch 5/20 Iteration 820/3560 Training loss: 1.9907 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 821/3560 Training loss: 1.9905 0.0527 sec/batch\n", + "Epoch 5/20 Iteration 822/3560 Training loss: 1.9903 0.0509 sec/batch\n", + "Epoch 5/20 Iteration 823/3560 Training loss: 1.9900 0.0527 sec/batch\n", + "Epoch 5/20 Iteration 824/3560 Training loss: 1.9898 0.0519 sec/batch\n", + "Epoch 5/20 Iteration 825/3560 Training loss: 1.9896 0.0495 sec/batch\n", + "Epoch 5/20 Iteration 826/3560 Training loss: 1.9892 0.0548 sec/batch\n", + "Epoch 5/20 Iteration 827/3560 Training loss: 1.9888 0.0532 sec/batch\n", + "Epoch 5/20 Iteration 828/3560 Training loss: 1.9882 0.0511 sec/batch\n", + "Epoch 5/20 Iteration 829/3560 Training loss: 1.9880 0.0517 sec/batch\n", + "Epoch 5/20 Iteration 830/3560 Training loss: 1.9876 0.0501 sec/batch\n", + "Epoch 5/20 Iteration 831/3560 Training loss: 1.9875 0.0529 sec/batch\n", + "Epoch 5/20 Iteration 832/3560 Training loss: 1.9873 0.0501 sec/batch\n", + "Epoch 5/20 Iteration 833/3560 Training loss: 1.9871 0.0524 sec/batch\n", + "Epoch 5/20 Iteration 834/3560 Training loss: 1.9867 0.0553 sec/batch\n", + "Epoch 5/20 Iteration 835/3560 Training loss: 1.9863 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 836/3560 Training loss: 1.9862 0.0553 sec/batch\n", + "Epoch 5/20 Iteration 837/3560 Training loss: 1.9859 0.0531 sec/batch\n", + "Epoch 5/20 Iteration 838/3560 Training loss: 1.9855 0.0564 sec/batch\n", + "Epoch 5/20 Iteration 839/3560 Training loss: 1.9854 0.0535 sec/batch\n", + "Epoch 5/20 Iteration 840/3560 Training loss: 1.9853 0.0535 sec/batch\n", + "Epoch 5/20 Iteration 841/3560 Training loss: 1.9851 0.0510 sec/batch\n", + "Epoch 5/20 Iteration 842/3560 Training loss: 1.9850 0.0566 sec/batch\n", + "Epoch 5/20 Iteration 843/3560 Training loss: 1.9846 0.0522 sec/batch\n", + "Epoch 5/20 Iteration 844/3560 Training loss: 1.9841 0.0557 sec/batch\n", + "Epoch 5/20 Iteration 
845/3560 Training loss: 1.9840 0.0539 sec/batch\n", + "Epoch 5/20 Iteration 846/3560 Training loss: 1.9838 0.0554 sec/batch\n", + "Epoch 5/20 Iteration 847/3560 Training loss: 1.9836 0.0501 sec/batch\n", + "Epoch 5/20 Iteration 848/3560 Training loss: 1.9835 0.0573 sec/batch\n", + "Epoch 5/20 Iteration 849/3560 Training loss: 1.9835 0.0529 sec/batch\n", + "Epoch 5/20 Iteration 850/3560 Training loss: 1.9834 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 851/3560 Training loss: 1.9835 0.0532 sec/batch\n", + "Epoch 5/20 Iteration 852/3560 Training loss: 1.9832 0.0506 sec/batch\n", + "Epoch 5/20 Iteration 853/3560 Training loss: 1.9832 0.0572 sec/batch\n", + "Epoch 5/20 Iteration 854/3560 Training loss: 1.9830 0.0525 sec/batch\n", + "Epoch 5/20 Iteration 855/3560 Training loss: 1.9828 0.0526 sec/batch\n", + "Epoch 5/20 Iteration 856/3560 Training loss: 1.9827 0.0521 sec/batch\n", + "Epoch 5/20 Iteration 857/3560 Training loss: 1.9824 0.0548 sec/batch\n", + "Epoch 5/20 Iteration 858/3560 Training loss: 1.9824 0.0523 sec/batch\n", + "Epoch 5/20 Iteration 859/3560 Training loss: 1.9823 0.0514 sec/batch\n", + "Epoch 5/20 Iteration 860/3560 Training loss: 1.9823 0.0504 sec/batch\n", + "Epoch 5/20 Iteration 861/3560 Training loss: 1.9822 0.0553 sec/batch\n", + "Epoch 5/20 Iteration 862/3560 Training loss: 1.9819 0.0516 sec/batch\n", + "Epoch 5/20 Iteration 863/3560 Training loss: 1.9816 0.0512 sec/batch\n", + "Epoch 5/20 Iteration 864/3560 Training loss: 1.9816 0.0552 sec/batch\n", + "Epoch 5/20 Iteration 865/3560 Training loss: 1.9815 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 866/3560 Training loss: 1.9814 0.0500 sec/batch\n", + "Epoch 5/20 Iteration 867/3560 Training loss: 1.9812 0.0543 sec/batch\n", + "Epoch 5/20 Iteration 868/3560 Training loss: 1.9811 0.0528 sec/batch\n", + "Epoch 5/20 Iteration 869/3560 Training loss: 1.9809 0.0547 sec/batch\n", + "Epoch 5/20 Iteration 870/3560 Training loss: 1.9808 0.0520 sec/batch\n", + "Epoch 5/20 Iteration 871/3560 Training loss: 
1.9805 0.0523 sec/batch\n", + "Epoch 5/20 Iteration 872/3560 Training loss: 1.9806 0.0543 sec/batch\n", + "Epoch 5/20 Iteration 873/3560 Training loss: 1.9806 0.0547 sec/batch\n", + "Epoch 5/20 Iteration 874/3560 Training loss: 1.9803 0.0573 sec/batch\n", + "Epoch 5/20 Iteration 875/3560 Training loss: 1.9803 0.0583 sec/batch\n", + "Epoch 5/20 Iteration 876/3560 Training loss: 1.9802 0.0532 sec/batch\n", + "Epoch 5/20 Iteration 877/3560 Training loss: 1.9800 0.0531 sec/batch\n", + "Epoch 5/20 Iteration 878/3560 Training loss: 1.9798 0.0529 sec/batch\n", + "Epoch 5/20 Iteration 879/3560 Training loss: 1.9797 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 880/3560 Training loss: 1.9799 0.0506 sec/batch\n", + "Epoch 5/20 Iteration 881/3560 Training loss: 1.9797 0.0500 sec/batch\n", + "Epoch 5/20 Iteration 882/3560 Training loss: 1.9795 0.0568 sec/batch\n", + "Epoch 5/20 Iteration 883/3560 Training loss: 1.9793 0.0525 sec/batch\n", + "Epoch 5/20 Iteration 884/3560 Training loss: 1.9791 0.0537 sec/batch\n", + "Epoch 5/20 Iteration 885/3560 Training loss: 1.9790 0.0566 sec/batch\n", + "Epoch 5/20 Iteration 886/3560 Training loss: 1.9789 0.0528 sec/batch\n", + "Epoch 5/20 Iteration 887/3560 Training loss: 1.9788 0.0564 sec/batch\n", + "Epoch 5/20 Iteration 888/3560 Training loss: 1.9786 0.0521 sec/batch\n", + "Epoch 5/20 Iteration 889/3560 Training loss: 1.9783 0.0518 sec/batch\n", + "Epoch 5/20 Iteration 890/3560 Training loss: 1.9782 0.0532 sec/batch\n", + "Epoch 6/20 Iteration 891/3560 Training loss: 2.0262 0.0511 sec/batch\n", + "Epoch 6/20 Iteration 892/3560 Training loss: 1.9749 0.0535 sec/batch\n", + "Epoch 6/20 Iteration 893/3560 Training loss: 1.9605 0.0568 sec/batch\n", + "Epoch 6/20 Iteration 894/3560 Training loss: 1.9548 0.0518 sec/batch\n", + "Epoch 6/20 Iteration 895/3560 Training loss: 1.9485 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 896/3560 Training loss: 1.9396 0.0504 sec/batch\n", + "Epoch 6/20 Iteration 897/3560 Training loss: 1.9405 0.0559 
sec/batch\n", + "Epoch 6/20 Iteration 898/3560 Training loss: 1.9392 0.0533 sec/batch\n", + "Epoch 6/20 Iteration 899/3560 Training loss: 1.9412 0.0522 sec/batch\n", + "Epoch 6/20 Iteration 900/3560 Training loss: 1.9405 0.0535 sec/batch\n", + "Epoch 6/20 Iteration 901/3560 Training loss: 1.9379 0.0530 sec/batch\n", + "Epoch 6/20 Iteration 902/3560 Training loss: 1.9362 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 903/3560 Training loss: 1.9362 0.0509 sec/batch\n", + "Epoch 6/20 Iteration 904/3560 Training loss: 1.9386 0.0566 sec/batch\n", + "Epoch 6/20 Iteration 905/3560 Training loss: 1.9379 0.0528 sec/batch\n", + "Epoch 6/20 Iteration 906/3560 Training loss: 1.9364 0.0621 sec/batch\n", + "Epoch 6/20 Iteration 907/3560 Training loss: 1.9359 0.0525 sec/batch\n", + "Epoch 6/20 Iteration 908/3560 Training loss: 1.9387 0.0516 sec/batch\n", + "Epoch 6/20 Iteration 909/3560 Training loss: 1.9388 0.0530 sec/batch\n", + "Epoch 6/20 Iteration 910/3560 Training loss: 1.9391 0.0612 sec/batch\n", + "Epoch 6/20 Iteration 911/3560 Training loss: 1.9386 0.0538 sec/batch\n", + "Epoch 6/20 Iteration 912/3560 Training loss: 1.9397 0.0536 sec/batch\n", + "Epoch 6/20 Iteration 913/3560 Training loss: 1.9389 0.0561 sec/batch\n", + "Epoch 6/20 Iteration 914/3560 Training loss: 1.9382 0.0503 sec/batch\n", + "Epoch 6/20 Iteration 915/3560 Training loss: 1.9382 0.0519 sec/batch\n", + "Epoch 6/20 Iteration 916/3560 Training loss: 1.9370 0.0518 sec/batch\n", + "Epoch 6/20 Iteration 917/3560 Training loss: 1.9361 0.0572 sec/batch\n", + "Epoch 6/20 Iteration 918/3560 Training loss: 1.9363 0.0564 sec/batch\n", + "Epoch 6/20 Iteration 919/3560 Training loss: 1.9371 0.0581 sec/batch\n", + "Epoch 6/20 Iteration 920/3560 Training loss: 1.9370 0.0526 sec/batch\n", + "Epoch 6/20 Iteration 921/3560 Training loss: 1.9368 0.0534 sec/batch\n", + "Epoch 6/20 Iteration 922/3560 Training loss: 1.9360 0.0511 sec/batch\n", + "Epoch 6/20 Iteration 923/3560 Training loss: 1.9360 0.0524 sec/batch\n", + "Epoch 
6/20 Iteration 924/3560 Training loss: 1.9365 0.0506 sec/batch\n", + "Epoch 6/20 Iteration 925/3560 Training loss: 1.9362 0.0514 sec/batch\n", + "Epoch 6/20 Iteration 926/3560 Training loss: 1.9361 0.0571 sec/batch\n", + "Epoch 6/20 Iteration 927/3560 Training loss: 1.9353 0.0534 sec/batch\n", + "Epoch 6/20 Iteration 928/3560 Training loss: 1.9343 0.0569 sec/batch\n", + "Epoch 6/20 Iteration 929/3560 Training loss: 1.9328 0.0521 sec/batch\n", + "Epoch 6/20 Iteration 930/3560 Training loss: 1.9319 0.0545 sec/batch\n", + "Epoch 6/20 Iteration 931/3560 Training loss: 1.9313 0.0551 sec/batch\n", + "Epoch 6/20 Iteration 932/3560 Training loss: 1.9314 0.0515 sec/batch\n", + "Epoch 6/20 Iteration 933/3560 Training loss: 1.9308 0.0580 sec/batch\n", + "Epoch 6/20 Iteration 934/3560 Training loss: 1.9299 0.0525 sec/batch\n", + "Epoch 6/20 Iteration 935/3560 Training loss: 1.9300 0.0521 sec/batch\n", + "Epoch 6/20 Iteration 936/3560 Training loss: 1.9287 0.0563 sec/batch\n", + "Epoch 6/20 Iteration 937/3560 Training loss: 1.9285 0.0529 sec/batch\n", + "Epoch 6/20 Iteration 938/3560 Training loss: 1.9278 0.0508 sec/batch\n", + "Epoch 6/20 Iteration 939/3560 Training loss: 1.9275 0.0564 sec/batch\n", + "Epoch 6/20 Iteration 940/3560 Training loss: 1.9282 0.0541 sec/batch\n", + "Epoch 6/20 Iteration 941/3560 Training loss: 1.9273 0.0549 sec/batch\n", + "Epoch 6/20 Iteration 942/3560 Training loss: 1.9280 0.0497 sec/batch\n", + "Epoch 6/20 Iteration 943/3560 Training loss: 1.9276 0.0526 sec/batch\n", + "Epoch 6/20 Iteration 944/3560 Training loss: 1.9272 0.0523 sec/batch\n", + "Epoch 6/20 Iteration 945/3560 Training loss: 1.9268 0.0522 sec/batch\n", + "Epoch 6/20 Iteration 946/3560 Training loss: 1.9269 0.0552 sec/batch\n", + "Epoch 6/20 Iteration 947/3560 Training loss: 1.9269 0.0544 sec/batch\n", + "Epoch 6/20 Iteration 948/3560 Training loss: 1.9265 0.0519 sec/batch\n", + "Epoch 6/20 Iteration 949/3560 Training loss: 1.9259 0.0523 sec/batch\n", + "Epoch 6/20 Iteration 950/3560 
Training loss: 1.9264 0.0533 sec/batch\n", + "Epoch 6/20 Iteration 951/3560 Training loss: 1.9262 0.0576 sec/batch\n", + "Epoch 6/20 Iteration 952/3560 Training loss: 1.9266 0.0625 sec/batch\n", + "Epoch 6/20 Iteration 953/3560 Training loss: 1.9269 0.0527 sec/batch\n", + "Epoch 6/20 Iteration 954/3560 Training loss: 1.9270 0.0532 sec/batch\n", + "Epoch 6/20 Iteration 955/3560 Training loss: 1.9267 0.0567 sec/batch\n", + "Epoch 6/20 Iteration 956/3560 Training loss: 1.9269 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 957/3560 Training loss: 1.9268 0.0533 sec/batch\n", + "Epoch 6/20 Iteration 958/3560 Training loss: 1.9261 0.0503 sec/batch\n", + "Epoch 6/20 Iteration 959/3560 Training loss: 1.9258 0.0507 sec/batch\n", + "Epoch 6/20 Iteration 960/3560 Training loss: 1.9257 0.0539 sec/batch\n", + "Epoch 6/20 Iteration 961/3560 Training loss: 1.9259 0.0565 sec/batch\n", + "Epoch 6/20 Iteration 962/3560 Training loss: 1.9259 0.0532 sec/batch\n", + "Epoch 6/20 Iteration 963/3560 Training loss: 1.9261 0.0516 sec/batch\n", + "Epoch 6/20 Iteration 964/3560 Training loss: 1.9256 0.0547 sec/batch\n", + "Epoch 6/20 Iteration 965/3560 Training loss: 1.9255 0.0514 sec/batch\n", + "Epoch 6/20 Iteration 966/3560 Training loss: 1.9257 0.0509 sec/batch\n", + "Epoch 6/20 Iteration 967/3560 Training loss: 1.9254 0.0508 sec/batch\n", + "Epoch 6/20 Iteration 968/3560 Training loss: 1.9254 0.0531 sec/batch\n", + "Epoch 6/20 Iteration 969/3560 Training loss: 1.9247 0.0587 sec/batch\n", + "Epoch 6/20 Iteration 970/3560 Training loss: 1.9245 0.0539 sec/batch\n", + "Epoch 6/20 Iteration 971/3560 Training loss: 1.9238 0.0560 sec/batch\n", + "Epoch 6/20 Iteration 972/3560 Training loss: 1.9238 0.0622 sec/batch\n", + "Epoch 6/20 Iteration 973/3560 Training loss: 1.9230 0.0551 sec/batch\n", + "Epoch 6/20 Iteration 974/3560 Training loss: 1.9228 0.0508 sec/batch\n", + "Epoch 6/20 Iteration 975/3560 Training loss: 1.9221 0.0623 sec/batch\n", + "Epoch 6/20 Iteration 976/3560 Training loss: 1.9217 
0.0556 sec/batch\n", + "Epoch 6/20 Iteration 977/3560 Training loss: 1.9212 0.0549 sec/batch\n", + "Epoch 6/20 Iteration 978/3560 Training loss: 1.9209 0.0584 sec/batch\n", + "Epoch 6/20 Iteration 979/3560 Training loss: 1.9203 0.0526 sec/batch\n", + "Epoch 6/20 Iteration 980/3560 Training loss: 1.9203 0.0584 sec/batch\n", + "Epoch 6/20 Iteration 981/3560 Training loss: 1.9199 0.0548 sec/batch\n", + "Epoch 6/20 Iteration 982/3560 Training loss: 1.9197 0.0557 sec/batch\n", + "Epoch 6/20 Iteration 983/3560 Training loss: 1.9191 0.0516 sec/batch\n", + "Epoch 6/20 Iteration 984/3560 Training loss: 1.9186 0.0563 sec/batch\n", + "Epoch 6/20 Iteration 985/3560 Training loss: 1.9181 0.0525 sec/batch\n", + "Epoch 6/20 Iteration 986/3560 Training loss: 1.9179 0.0638 sec/batch\n", + "Epoch 6/20 Iteration 987/3560 Training loss: 1.9177 0.0517 sec/batch\n", + "Epoch 6/20 Iteration 988/3560 Training loss: 1.9171 0.0516 sec/batch\n", + "Epoch 6/20 Iteration 989/3560 Training loss: 1.9166 0.0518 sec/batch\n", + "Epoch 6/20 Iteration 990/3560 Training loss: 1.9159 0.0527 sec/batch\n", + "Epoch 6/20 Iteration 991/3560 Training loss: 1.9158 0.0580 sec/batch\n", + "Epoch 6/20 Iteration 992/3560 Training loss: 1.9155 0.0543 sec/batch\n", + "Epoch 6/20 Iteration 993/3560 Training loss: 1.9151 0.0513 sec/batch\n", + "Epoch 6/20 Iteration 994/3560 Training loss: 1.9149 0.0543 sec/batch\n", + "Epoch 6/20 Iteration 995/3560 Training loss: 1.9145 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 996/3560 Training loss: 1.9143 0.0526 sec/batch\n", + "Epoch 6/20 Iteration 997/3560 Training loss: 1.9140 0.0529 sec/batch\n", + "Epoch 6/20 Iteration 998/3560 Training loss: 1.9138 0.0521 sec/batch\n", + "Epoch 6/20 Iteration 999/3560 Training loss: 1.9137 0.0533 sec/batch\n", + "Epoch 6/20 Iteration 1000/3560 Training loss: 1.9135 0.0536 sec/batch\n", + "Epoch 6/20 Iteration 1001/3560 Training loss: 1.9133 0.0583 sec/batch\n", + "Epoch 6/20 Iteration 1002/3560 Training loss: 1.9131 0.0525 sec/batch\n", 
+ "Epoch 6/20 Iteration 1003/3560 Training loss: 1.9128 0.0519 sec/batch\n", + "Epoch 6/20 Iteration 1004/3560 Training loss: 1.9125 0.0514 sec/batch\n", + "Epoch 6/20 Iteration 1005/3560 Training loss: 1.9122 0.0527 sec/batch\n", + "Epoch 6/20 Iteration 1006/3560 Training loss: 1.9117 0.0526 sec/batch\n", + "Epoch 6/20 Iteration 1007/3560 Training loss: 1.9114 0.0528 sec/batch\n", + "Epoch 6/20 Iteration 1008/3560 Training loss: 1.9112 0.0603 sec/batch\n", + "Epoch 6/20 Iteration 1009/3560 Training loss: 1.9110 0.0543 sec/batch\n", + "Epoch 6/20 Iteration 1010/3560 Training loss: 1.9108 0.0542 sec/batch\n", + "Epoch 6/20 Iteration 1011/3560 Training loss: 1.9107 0.0628 sec/batch\n", + "Epoch 6/20 Iteration 1012/3560 Training loss: 1.9103 0.0526 sec/batch\n", + "Epoch 6/20 Iteration 1013/3560 Training loss: 1.9099 0.0582 sec/batch\n", + "Epoch 6/20 Iteration 1014/3560 Training loss: 1.9099 0.0562 sec/batch\n", + "Epoch 6/20 Iteration 1015/3560 Training loss: 1.9097 0.0506 sec/batch\n", + "Epoch 6/20 Iteration 1016/3560 Training loss: 1.9093 0.0565 sec/batch\n", + "Epoch 6/20 Iteration 1017/3560 Training loss: 1.9093 0.0535 sec/batch\n", + "Epoch 6/20 Iteration 1018/3560 Training loss: 1.9093 0.0546 sec/batch\n", + "Epoch 6/20 Iteration 1019/3560 Training loss: 1.9091 0.0563 sec/batch\n", + "Epoch 6/20 Iteration 1020/3560 Training loss: 1.9089 0.0526 sec/batch\n", + "Epoch 6/20 Iteration 1021/3560 Training loss: 1.9085 0.0507 sec/batch\n", + "Epoch 6/20 Iteration 1022/3560 Training loss: 1.9081 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 1023/3560 Training loss: 1.9080 0.0511 sec/batch\n", + "Epoch 6/20 Iteration 1024/3560 Training loss: 1.9078 0.0509 sec/batch\n", + "Epoch 6/20 Iteration 1025/3560 Training loss: 1.9077 0.0509 sec/batch\n", + "Epoch 6/20 Iteration 1026/3560 Training loss: 1.9076 0.0498 sec/batch\n", + "Epoch 6/20 Iteration 1027/3560 Training loss: 1.9075 0.0531 sec/batch\n", + "Epoch 6/20 Iteration 1028/3560 Training loss: 1.9074 0.0515 
sec/batch\n", + "Epoch 6/20 Iteration 1029/3560 Training loss: 1.9075 0.0504 sec/batch\n", + "Epoch 6/20 Iteration 1030/3560 Training loss: 1.9072 0.0518 sec/batch\n", + "Epoch 6/20 Iteration 1031/3560 Training loss: 1.9073 0.0538 sec/batch\n", + "Epoch 6/20 Iteration 1032/3560 Training loss: 1.9071 0.0580 sec/batch\n", + "Epoch 6/20 Iteration 1033/3560 Training loss: 1.9069 0.0577 sec/batch\n", + "Epoch 6/20 Iteration 1034/3560 Training loss: 1.9068 0.0589 sec/batch\n", + "Epoch 6/20 Iteration 1035/3560 Training loss: 1.9065 0.0562 sec/batch\n", + "Epoch 6/20 Iteration 1036/3560 Training loss: 1.9065 0.0552 sec/batch\n", + "Epoch 6/20 Iteration 1037/3560 Training loss: 1.9063 0.0523 sec/batch\n", + "Epoch 6/20 Iteration 1038/3560 Training loss: 1.9064 0.0521 sec/batch\n", + "Epoch 6/20 Iteration 1039/3560 Training loss: 1.9063 0.0531 sec/batch\n", + "Epoch 6/20 Iteration 1040/3560 Training loss: 1.9060 0.0514 sec/batch\n", + "Epoch 6/20 Iteration 1041/3560 Training loss: 1.9056 0.0514 sec/batch\n", + "Epoch 6/20 Iteration 1042/3560 Training loss: 1.9057 0.0499 sec/batch\n", + "Epoch 6/20 Iteration 1043/3560 Training loss: 1.9057 0.0516 sec/batch\n", + "Epoch 6/20 Iteration 1044/3560 Training loss: 1.9056 0.0507 sec/batch\n", + "Epoch 6/20 Iteration 1045/3560 Training loss: 1.9055 0.0503 sec/batch\n", + "Epoch 6/20 Iteration 1046/3560 Training loss: 1.9054 0.0526 sec/batch\n", + "Epoch 6/20 Iteration 1047/3560 Training loss: 1.9054 0.0495 sec/batch\n", + "Epoch 6/20 Iteration 1048/3560 Training loss: 1.9053 0.0537 sec/batch\n", + "Epoch 6/20 Iteration 1049/3560 Training loss: 1.9049 0.0518 sec/batch\n", + "Epoch 6/20 Iteration 1050/3560 Training loss: 1.9050 0.0536 sec/batch\n", + "Epoch 6/20 Iteration 1051/3560 Training loss: 1.9051 0.0518 sec/batch\n", + "Epoch 6/20 Iteration 1052/3560 Training loss: 1.9050 0.0507 sec/batch\n", + "Epoch 6/20 Iteration 1053/3560 Training loss: 1.9049 0.0557 sec/batch\n", + "Epoch 6/20 Iteration 1054/3560 Training loss: 1.9049 
0.0568 sec/batch\n", + "Epoch 6/20 Iteration 1055/3560 Training loss: 1.9048 0.0522 sec/batch\n", + "Epoch 6/20 Iteration 1056/3560 Training loss: 1.9046 0.0513 sec/batch\n", + "Epoch 6/20 Iteration 1057/3560 Training loss: 1.9046 0.0532 sec/batch\n", + "Epoch 6/20 Iteration 1058/3560 Training loss: 1.9048 0.0556 sec/batch\n", + "Epoch 6/20 Iteration 1059/3560 Training loss: 1.9046 0.0609 sec/batch\n", + "Epoch 6/20 Iteration 1060/3560 Training loss: 1.9045 0.0553 sec/batch\n", + "Epoch 6/20 Iteration 1061/3560 Training loss: 1.9043 0.0522 sec/batch\n", + "Epoch 6/20 Iteration 1062/3560 Training loss: 1.9040 0.0509 sec/batch\n", + "Epoch 6/20 Iteration 1063/3560 Training loss: 1.9040 0.0504 sec/batch\n", + "Epoch 6/20 Iteration 1064/3560 Training loss: 1.9039 0.0505 sec/batch\n", + "Epoch 6/20 Iteration 1065/3560 Training loss: 1.9038 0.0505 sec/batch\n", + "Epoch 6/20 Iteration 1066/3560 Training loss: 1.9036 0.0531 sec/batch\n", + "Epoch 6/20 Iteration 1067/3560 Training loss: 1.9033 0.0518 sec/batch\n", + "Epoch 6/20 Iteration 1068/3560 Training loss: 1.9032 0.0522 sec/batch\n", + "Epoch 7/20 Iteration 1069/3560 Training loss: 1.9537 0.0505 sec/batch\n", + "Epoch 7/20 Iteration 1070/3560 Training loss: 1.9056 0.0528 sec/batch\n", + "Epoch 7/20 Iteration 1071/3560 Training loss: 1.8932 0.0535 sec/batch\n", + "Epoch 7/20 Iteration 1072/3560 Training loss: 1.8863 0.0566 sec/batch\n", + "Epoch 7/20 Iteration 1073/3560 Training loss: 1.8819 0.0652 sec/batch\n", + "Epoch 7/20 Iteration 1074/3560 Training loss: 1.8727 0.0559 sec/batch\n", + "Epoch 7/20 Iteration 1075/3560 Training loss: 1.8737 0.0523 sec/batch\n", + "Epoch 7/20 Iteration 1076/3560 Training loss: 1.8723 0.0531 sec/batch\n", + "Epoch 7/20 Iteration 1077/3560 Training loss: 1.8746 0.0515 sec/batch\n", + "Epoch 7/20 Iteration 1078/3560 Training loss: 1.8743 0.0513 sec/batch\n", + "Epoch 7/20 Iteration 1079/3560 Training loss: 1.8705 0.0572 sec/batch\n", + "Epoch 7/20 Iteration 1080/3560 Training loss: 
1.8684 0.0505 sec/batch\n", + "Epoch 7/20 Iteration 1081/3560 Training loss: 1.8685 0.0543 sec/batch\n", + "Epoch 7/20 Iteration 1082/3560 Training loss: 1.8706 0.0543 sec/batch\n", + "Epoch 7/20 Iteration 1083/3560 Training loss: 1.8699 0.0532 sec/batch\n", + "Epoch 7/20 Iteration 1084/3560 Training loss: 1.8680 0.0563 sec/batch\n", + "Epoch 7/20 Iteration 1085/3560 Training loss: 1.8675 0.0527 sec/batch\n", + "Epoch 7/20 Iteration 1086/3560 Training loss: 1.8693 0.0499 sec/batch\n", + "Epoch 7/20 Iteration 1087/3560 Training loss: 1.8694 0.0498 sec/batch\n", + "Epoch 7/20 Iteration 1088/3560 Training loss: 1.8695 0.0500 sec/batch\n", + "Epoch 7/20 Iteration 1089/3560 Training loss: 1.8686 0.0524 sec/batch\n", + "Epoch 7/20 Iteration 1090/3560 Training loss: 1.8695 0.0528 sec/batch\n", + "Epoch 7/20 Iteration 1091/3560 Training loss: 1.8691 0.0580 sec/batch\n", + "Epoch 7/20 Iteration 1092/3560 Training loss: 1.8686 0.0521 sec/batch\n", + "Epoch 7/20 Iteration 1093/3560 Training loss: 1.8683 0.0518 sec/batch\n", + "Epoch 7/20 Iteration 1094/3560 Training loss: 1.8675 0.0523 sec/batch\n", + "Epoch 7/20 Iteration 1095/3560 Training loss: 1.8664 0.0515 sec/batch\n", + "Epoch 7/20 Iteration 1096/3560 Training loss: 1.8670 0.0501 sec/batch\n", + "Epoch 7/20 Iteration 1097/3560 Training loss: 1.8678 0.0504 sec/batch\n", + "Epoch 7/20 Iteration 1098/3560 Training loss: 1.8680 0.0539 sec/batch\n", + "Epoch 7/20 Iteration 1099/3560 Training loss: 1.8678 0.0552 sec/batch\n", + "Epoch 7/20 Iteration 1100/3560 Training loss: 1.8670 0.0521 sec/batch\n", + "Epoch 7/20 Iteration 1101/3560 Training loss: 1.8669 0.0520 sec/batch\n", + "Epoch 7/20 Iteration 1102/3560 Training loss: 1.8674 0.0515 sec/batch\n", + "Epoch 7/20 Iteration 1103/3560 Training loss: 1.8669 0.0527 sec/batch\n", + "Epoch 7/20 Iteration 1104/3560 Training loss: 1.8667 0.0511 sec/batch\n", + "Epoch 7/20 Iteration 1105/3560 Training loss: 1.8660 0.0516 sec/batch\n", + "Epoch 7/20 Iteration 1106/3560 Training 
loss: 1.8648 0.0527 sec/batch\n", + "Epoch 7/20 Iteration 1107/3560 Training loss: 1.8636 0.0534 sec/batch\n", + "Epoch 7/20 Iteration 1108/3560 Training loss: 1.8628 0.0503 sec/batch\n", + "Epoch 7/20 Iteration 1109/3560 Training loss: 1.8624 0.0563 sec/batch\n", + "Epoch 7/20 Iteration 1110/3560 Training loss: 1.8626 0.0640 sec/batch\n", + "Epoch 7/20 Iteration 1111/3560 Training loss: 1.8620 0.0571 sec/batch\n", + "Epoch 7/20 Iteration 1112/3560 Training loss: 1.8611 0.0530 sec/batch\n", + "Epoch 7/20 Iteration 1113/3560 Training loss: 1.8612 0.0582 sec/batch\n", + "Epoch 7/20 Iteration 1114/3560 Training loss: 1.8598 0.0555 sec/batch\n", + "Epoch 7/20 Iteration 1115/3560 Training loss: 1.8595 0.0505 sec/batch\n", + "Epoch 7/20 Iteration 1116/3560 Training loss: 1.8588 0.0563 sec/batch\n", + "Epoch 7/20 Iteration 1117/3560 Training loss: 1.8586 0.0522 sec/batch\n", + "Epoch 7/20 Iteration 1118/3560 Training loss: 1.8592 0.0519 sec/batch\n", + "Epoch 7/20 Iteration 1119/3560 Training loss: 1.8584 0.0519 sec/batch\n", + "Epoch 7/20 Iteration 1120/3560 Training loss: 1.8591 0.0562 sec/batch\n", + "Epoch 7/20 Iteration 1121/3560 Training loss: 1.8589 0.0547 sec/batch\n", + "Epoch 7/20 Iteration 1122/3560 Training loss: 1.8588 0.0577 sec/batch\n", + "Epoch 7/20 Iteration 1123/3560 Training loss: 1.8585 0.0559 sec/batch\n", + "Epoch 7/20 Iteration 1124/3560 Training loss: 1.8586 0.0571 sec/batch\n", + "Epoch 7/20 Iteration 1125/3560 Training loss: 1.8586 0.0541 sec/batch\n", + "Epoch 7/20 Iteration 1126/3560 Training loss: 1.8583 0.0538 sec/batch\n", + "Epoch 7/20 Iteration 1127/3560 Training loss: 1.8577 0.0521 sec/batch\n", + "Epoch 7/20 Iteration 1128/3560 Training loss: 1.8581 0.0532 sec/batch\n", + "Epoch 7/20 Iteration 1129/3560 Training loss: 1.8579 0.0570 sec/batch\n", + "Epoch 7/20 Iteration 1130/3560 Training loss: 1.8583 0.0594 sec/batch\n", + "Epoch 7/20 Iteration 1131/3560 Training loss: 1.8586 0.0519 sec/batch\n", + "Epoch 7/20 Iteration 1132/3560 
Training loss: 1.8588 0.0552 sec/batch\n", + "Epoch 7/20 Iteration 1133/3560 Training loss: 1.8585 0.0526 sec/batch\n", + "Epoch 7/20 Iteration 1134/3560 Training loss: 1.8587 0.0536 sec/batch\n", + "Epoch 7/20 Iteration 1135/3560 Training loss: 1.8585 0.0517 sec/batch\n", + "Epoch 7/20 Iteration 1136/3560 Training loss: 1.8579 0.0511 sec/batch\n", + "Epoch 7/20 Iteration 1137/3560 Training loss: 1.8575 0.0515 sec/batch\n", + "Epoch 7/20 Iteration 1138/3560 Training loss: 1.8574 0.0594 sec/batch\n", + "Epoch 7/20 Iteration 1139/3560 Training loss: 1.8578 0.0505 sec/batch\n", + "Epoch 7/20 Iteration 1140/3560 Training loss: 1.8577 0.0625 sec/batch\n", + "Epoch 7/20 Iteration 1141/3560 Training loss: 1.8581 0.0519 sec/batch\n", + "Epoch 7/20 Iteration 1142/3560 Training loss: 1.8577 0.0527 sec/batch\n", + "Epoch 7/20 Iteration 1143/3560 Training loss: 1.8576 0.0504 sec/batch\n", + "Epoch 7/20 Iteration 1144/3560 Training loss: 1.8578 0.0521 sec/batch\n", + "Epoch 7/20 Iteration 1145/3560 Training loss: 1.8576 0.0617 sec/batch\n", + "Epoch 7/20 Iteration 1146/3560 Training loss: 1.8576 0.0534 sec/batch\n", + "Epoch 7/20 Iteration 1147/3560 Training loss: 1.8569 0.0522 sec/batch\n", + "Epoch 7/20 Iteration 1148/3560 Training loss: 1.8567 0.0553 sec/batch\n", + "Epoch 7/20 Iteration 1149/3560 Training loss: 1.8562 0.0566 sec/batch\n", + "Epoch 7/20 Iteration 1150/3560 Training loss: 1.8562 0.0501 sec/batch\n", + "Epoch 7/20 Iteration 1151/3560 Training loss: 1.8555 0.0522 sec/batch\n", + "Epoch 7/20 Iteration 1152/3560 Training loss: 1.8552 0.0522 sec/batch\n", + "Epoch 7/20 Iteration 1153/3560 Training loss: 1.8546 0.0514 sec/batch\n", + "Epoch 7/20 Iteration 1154/3560 Training loss: 1.8542 0.0545 sec/batch\n", + "Epoch 7/20 Iteration 1155/3560 Training loss: 1.8539 0.0523 sec/batch\n", + "Epoch 7/20 Iteration 1156/3560 Training loss: 1.8535 0.0511 sec/batch\n", + "Epoch 7/20 Iteration 1157/3560 Training loss: 1.8529 0.0524 sec/batch\n", + "Epoch 7/20 Iteration 
1158/3560 Training loss: 1.8529 0.0575 sec/batch\n", + "[... iterations 1159-1648: training loss falls steadily from ~1.85 (epoch 7) through ~1.79 (epoch 8) and ~1.74 (epoch 9) to ~1.72 (epoch 10), at roughly 0.05-0.06 sec/batch throughout ...]\n", + "Epoch 10/20 Iteration 1649/3560 Training loss: 1.7184 0.0520
sec/batch\n", + "Epoch 10/20 Iteration 1650/3560 Training loss: 1.7183 0.0576 sec/batch\n", + "Epoch 10/20 Iteration 1651/3560 Training loss: 1.7182 0.0500 sec/batch\n", + "Epoch 10/20 Iteration 1652/3560 Training loss: 1.7188 0.0539 sec/batch\n", + "Epoch 10/20 Iteration 1653/3560 Training loss: 1.7183 0.0553 sec/batch\n", + "Epoch 10/20 Iteration 1654/3560 Training loss: 1.7190 0.0502 sec/batch\n", + "Epoch 10/20 Iteration 1655/3560 Training loss: 1.7190 0.0527 sec/batch\n", + "Epoch 10/20 Iteration 1656/3560 Training loss: 1.7189 0.0584 sec/batch\n", + "Epoch 10/20 Iteration 1657/3560 Training loss: 1.7186 0.0546 sec/batch\n", + "Epoch 10/20 Iteration 1658/3560 Training loss: 1.7188 0.0537 sec/batch\n", + "Epoch 10/20 Iteration 1659/3560 Training loss: 1.7192 0.0508 sec/batch\n", + "Epoch 10/20 Iteration 1660/3560 Training loss: 1.7190 0.0497 sec/batch\n", + "Epoch 10/20 Iteration 1661/3560 Training loss: 1.7184 0.0516 sec/batch\n", + "Epoch 10/20 Iteration 1662/3560 Training loss: 1.7190 0.0504 sec/batch\n", + "Epoch 10/20 Iteration 1663/3560 Training loss: 1.7188 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1664/3560 Training loss: 1.7195 0.0539 sec/batch\n", + "Epoch 10/20 Iteration 1665/3560 Training loss: 1.7197 0.0530 sec/batch\n", + "Epoch 10/20 Iteration 1666/3560 Training loss: 1.7200 0.0551 sec/batch\n", + "Epoch 10/20 Iteration 1667/3560 Training loss: 1.7198 0.0567 sec/batch\n", + "Epoch 10/20 Iteration 1668/3560 Training loss: 1.7199 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1669/3560 Training loss: 1.7199 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1670/3560 Training loss: 1.7196 0.0525 sec/batch\n", + "Epoch 10/20 Iteration 1671/3560 Training loss: 1.7196 0.0516 sec/batch\n", + "Epoch 10/20 Iteration 1672/3560 Training loss: 1.7194 0.0505 sec/batch\n", + "Epoch 10/20 Iteration 1673/3560 Training loss: 1.7200 0.0527 sec/batch\n", + "Epoch 10/20 Iteration 1674/3560 Training loss: 1.7200 0.0524 sec/batch\n", + "Epoch 10/20 Iteration 1675/3560 
Training loss: 1.7204 0.0535 sec/batch\n", + "Epoch 10/20 Iteration 1676/3560 Training loss: 1.7201 0.0508 sec/batch\n", + "Epoch 10/20 Iteration 1677/3560 Training loss: 1.7201 0.0518 sec/batch\n", + "Epoch 10/20 Iteration 1678/3560 Training loss: 1.7203 0.0604 sec/batch\n", + "Epoch 10/20 Iteration 1679/3560 Training loss: 1.7201 0.0516 sec/batch\n", + "Epoch 10/20 Iteration 1680/3560 Training loss: 1.7202 0.0547 sec/batch\n", + "Epoch 10/20 Iteration 1681/3560 Training loss: 1.7196 0.0560 sec/batch\n", + "Epoch 10/20 Iteration 1682/3560 Training loss: 1.7194 0.0548 sec/batch\n", + "Epoch 10/20 Iteration 1683/3560 Training loss: 1.7189 0.0508 sec/batch\n", + "Epoch 10/20 Iteration 1684/3560 Training loss: 1.7190 0.0512 sec/batch\n", + "Epoch 10/20 Iteration 1685/3560 Training loss: 1.7184 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1686/3560 Training loss: 1.7183 0.0504 sec/batch\n", + "Epoch 10/20 Iteration 1687/3560 Training loss: 1.7179 0.0520 sec/batch\n", + "Epoch 10/20 Iteration 1688/3560 Training loss: 1.7176 0.0525 sec/batch\n", + "Epoch 10/20 Iteration 1689/3560 Training loss: 1.7172 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1690/3560 Training loss: 1.7169 0.0519 sec/batch\n", + "Epoch 10/20 Iteration 1691/3560 Training loss: 1.7163 0.0518 sec/batch\n", + "Epoch 10/20 Iteration 1692/3560 Training loss: 1.7164 0.0574 sec/batch\n", + "Epoch 10/20 Iteration 1693/3560 Training loss: 1.7161 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1694/3560 Training loss: 1.7158 0.0569 sec/batch\n", + "Epoch 10/20 Iteration 1695/3560 Training loss: 1.7153 0.0525 sec/batch\n", + "Epoch 10/20 Iteration 1696/3560 Training loss: 1.7148 0.0529 sec/batch\n", + "Epoch 10/20 Iteration 1697/3560 Training loss: 1.7145 0.0521 sec/batch\n", + "Epoch 10/20 Iteration 1698/3560 Training loss: 1.7145 0.0547 sec/batch\n", + "Epoch 10/20 Iteration 1699/3560 Training loss: 1.7143 0.0511 sec/batch\n", + "Epoch 10/20 Iteration 1700/3560 Training loss: 1.7138 0.0508 sec/batch\n", + 
"Epoch 10/20 Iteration 1701/3560 Training loss: 1.7134 0.0563 sec/batch\n", + "Epoch 10/20 Iteration 1702/3560 Training loss: 1.7128 0.0562 sec/batch\n", + "Epoch 10/20 Iteration 1703/3560 Training loss: 1.7127 0.0620 sec/batch\n", + "Epoch 10/20 Iteration 1704/3560 Training loss: 1.7125 0.0562 sec/batch\n", + "Epoch 10/20 Iteration 1705/3560 Training loss: 1.7123 0.0548 sec/batch\n", + "Epoch 10/20 Iteration 1706/3560 Training loss: 1.7121 0.0515 sec/batch\n", + "Epoch 10/20 Iteration 1707/3560 Training loss: 1.7118 0.0503 sec/batch\n", + "Epoch 10/20 Iteration 1708/3560 Training loss: 1.7116 0.0544 sec/batch\n", + "Epoch 10/20 Iteration 1709/3560 Training loss: 1.7115 0.0558 sec/batch\n", + "Epoch 10/20 Iteration 1710/3560 Training loss: 1.7114 0.0518 sec/batch\n", + "Epoch 10/20 Iteration 1711/3560 Training loss: 1.7113 0.0535 sec/batch\n", + "Epoch 10/20 Iteration 1712/3560 Training loss: 1.7113 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1713/3560 Training loss: 1.7111 0.0529 sec/batch\n", + "Epoch 10/20 Iteration 1714/3560 Training loss: 1.7110 0.0511 sec/batch\n", + "Epoch 10/20 Iteration 1715/3560 Training loss: 1.7108 0.0518 sec/batch\n", + "Epoch 10/20 Iteration 1716/3560 Training loss: 1.7105 0.0551 sec/batch\n", + "Epoch 10/20 Iteration 1717/3560 Training loss: 1.7102 0.0543 sec/batch\n", + "Epoch 10/20 Iteration 1718/3560 Training loss: 1.7099 0.0514 sec/batch\n", + "Epoch 10/20 Iteration 1719/3560 Training loss: 1.7098 0.0544 sec/batch\n", + "Epoch 10/20 Iteration 1720/3560 Training loss: 1.7097 0.0531 sec/batch\n", + "Epoch 10/20 Iteration 1721/3560 Training loss: 1.7096 0.0525 sec/batch\n", + "Epoch 10/20 Iteration 1722/3560 Training loss: 1.7095 0.0623 sec/batch\n", + "Epoch 10/20 Iteration 1723/3560 Training loss: 1.7095 0.0502 sec/batch\n", + "Epoch 10/20 Iteration 1724/3560 Training loss: 1.7090 0.0544 sec/batch\n", + "Epoch 10/20 Iteration 1725/3560 Training loss: 1.7086 0.0515 sec/batch\n", + "Epoch 10/20 Iteration 1726/3560 Training loss: 
1.7087 0.0625 sec/batch\n", + "Epoch 10/20 Iteration 1727/3560 Training loss: 1.7086 0.0535 sec/batch\n", + "Epoch 10/20 Iteration 1728/3560 Training loss: 1.7082 0.0533 sec/batch\n", + "Epoch 10/20 Iteration 1729/3560 Training loss: 1.7083 0.0524 sec/batch\n", + "Epoch 10/20 Iteration 1730/3560 Training loss: 1.7084 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1731/3560 Training loss: 1.7082 0.0533 sec/batch\n", + "Epoch 10/20 Iteration 1732/3560 Training loss: 1.7081 0.0559 sec/batch\n", + "Epoch 10/20 Iteration 1733/3560 Training loss: 1.7077 0.0591 sec/batch\n", + "Epoch 10/20 Iteration 1734/3560 Training loss: 1.7074 0.0512 sec/batch\n", + "Epoch 10/20 Iteration 1735/3560 Training loss: 1.7074 0.0601 sec/batch\n", + "Epoch 10/20 Iteration 1736/3560 Training loss: 1.7074 0.0513 sec/batch\n", + "Epoch 10/20 Iteration 1737/3560 Training loss: 1.7074 0.0498 sec/batch\n", + "Epoch 10/20 Iteration 1738/3560 Training loss: 1.7074 0.0578 sec/batch\n", + "Epoch 10/20 Iteration 1739/3560 Training loss: 1.7074 0.0610 sec/batch\n", + "Epoch 10/20 Iteration 1740/3560 Training loss: 1.7074 0.0535 sec/batch\n", + "Epoch 10/20 Iteration 1741/3560 Training loss: 1.7074 0.0577 sec/batch\n", + "Epoch 10/20 Iteration 1742/3560 Training loss: 1.7072 0.0535 sec/batch\n", + "Epoch 10/20 Iteration 1743/3560 Training loss: 1.7075 0.0567 sec/batch\n", + "Epoch 10/20 Iteration 1744/3560 Training loss: 1.7074 0.0569 sec/batch\n", + "Epoch 10/20 Iteration 1745/3560 Training loss: 1.7073 0.0504 sec/batch\n", + "Epoch 10/20 Iteration 1746/3560 Training loss: 1.7074 0.0548 sec/batch\n", + "Epoch 10/20 Iteration 1747/3560 Training loss: 1.7072 0.0532 sec/batch\n", + "Epoch 10/20 Iteration 1748/3560 Training loss: 1.7073 0.0580 sec/batch\n", + "Epoch 10/20 Iteration 1749/3560 Training loss: 1.7073 0.0527 sec/batch\n", + "Epoch 10/20 Iteration 1750/3560 Training loss: 1.7074 0.0583 sec/batch\n", + "Epoch 10/20 Iteration 1751/3560 Training loss: 1.7074 0.0561 sec/batch\n", + "Epoch 10/20 
Iteration 1752/3560 Training loss: 1.7073 0.0566 sec/batch\n", + "Epoch 10/20 Iteration 1753/3560 Training loss: 1.7070 0.0536 sec/batch\n", + "Epoch 10/20 Iteration 1754/3560 Training loss: 1.7071 0.0504 sec/batch\n", + "Epoch 10/20 Iteration 1755/3560 Training loss: 1.7071 0.0564 sec/batch\n", + "Epoch 10/20 Iteration 1756/3560 Training loss: 1.7071 0.0562 sec/batch\n", + "Epoch 10/20 Iteration 1757/3560 Training loss: 1.7071 0.0540 sec/batch\n", + "Epoch 10/20 Iteration 1758/3560 Training loss: 1.7070 0.0548 sec/batch\n", + "Epoch 10/20 Iteration 1759/3560 Training loss: 1.7070 0.0520 sec/batch\n", + "Epoch 10/20 Iteration 1760/3560 Training loss: 1.7070 0.0543 sec/batch\n", + "Epoch 10/20 Iteration 1761/3560 Training loss: 1.7067 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1762/3560 Training loss: 1.7068 0.0550 sec/batch\n", + "Epoch 10/20 Iteration 1763/3560 Training loss: 1.7069 0.0581 sec/batch\n", + "Epoch 10/20 Iteration 1764/3560 Training loss: 1.7069 0.0541 sec/batch\n", + "Epoch 10/20 Iteration 1765/3560 Training loss: 1.7069 0.0691 sec/batch\n", + "Epoch 10/20 Iteration 1766/3560 Training loss: 1.7069 0.0553 sec/batch\n", + "Epoch 10/20 Iteration 1767/3560 Training loss: 1.7069 0.0547 sec/batch\n", + "Epoch 10/20 Iteration 1768/3560 Training loss: 1.7069 0.0551 sec/batch\n", + "Epoch 10/20 Iteration 1769/3560 Training loss: 1.7069 0.0747 sec/batch\n", + "Epoch 10/20 Iteration 1770/3560 Training loss: 1.7073 0.0530 sec/batch\n", + "Epoch 10/20 Iteration 1771/3560 Training loss: 1.7072 0.0577 sec/batch\n", + "Epoch 10/20 Iteration 1772/3560 Training loss: 1.7072 0.0535 sec/batch\n", + "Epoch 10/20 Iteration 1773/3560 Training loss: 1.7071 0.0532 sec/batch\n", + "Epoch 10/20 Iteration 1774/3560 Training loss: 1.7069 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1775/3560 Training loss: 1.7070 0.0576 sec/batch\n", + "Epoch 10/20 Iteration 1776/3560 Training loss: 1.7069 0.0534 sec/batch\n", + "Epoch 10/20 Iteration 1777/3560 Training loss: 1.7069 0.0542 
sec/batch\n", + "Epoch 10/20 Iteration 1778/3560 Training loss: 1.7068 0.0513 sec/batch\n", + "Epoch 10/20 Iteration 1779/3560 Training loss: 1.7066 0.0547 sec/batch\n", + "Epoch 10/20 Iteration 1780/3560 Training loss: 1.7067 0.0533 sec/batch\n", + "Epoch 11/20 Iteration 1781/3560 Training loss: 1.7876 0.0596 sec/batch\n", + "Epoch 11/20 Iteration 1782/3560 Training loss: 1.7389 0.0520 sec/batch\n", + "Epoch 11/20 Iteration 1783/3560 Training loss: 1.7231 0.0526 sec/batch\n", + "Epoch 11/20 Iteration 1784/3560 Training loss: 1.7150 0.0529 sec/batch\n", + "Epoch 11/20 Iteration 1785/3560 Training loss: 1.7108 0.0585 sec/batch\n", + "Epoch 11/20 Iteration 1786/3560 Training loss: 1.7005 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1787/3560 Training loss: 1.6999 0.0628 sec/batch\n", + "Epoch 11/20 Iteration 1788/3560 Training loss: 1.6969 0.0508 sec/batch\n", + "Epoch 11/20 Iteration 1789/3560 Training loss: 1.6978 0.0511 sec/batch\n", + "Epoch 11/20 Iteration 1790/3560 Training loss: 1.6967 0.0547 sec/batch\n", + "Epoch 11/20 Iteration 1791/3560 Training loss: 1.6928 0.0539 sec/batch\n", + "Epoch 11/20 Iteration 1792/3560 Training loss: 1.6910 0.0555 sec/batch\n", + "Epoch 11/20 Iteration 1793/3560 Training loss: 1.6900 0.0596 sec/batch\n", + "Epoch 11/20 Iteration 1794/3560 Training loss: 1.6921 0.0550 sec/batch\n", + "Epoch 11/20 Iteration 1795/3560 Training loss: 1.6914 0.0513 sec/batch\n", + "Epoch 11/20 Iteration 1796/3560 Training loss: 1.6893 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1797/3560 Training loss: 1.6902 0.0542 sec/batch\n", + "Epoch 11/20 Iteration 1798/3560 Training loss: 1.6917 0.0513 sec/batch\n", + "Epoch 11/20 Iteration 1799/3560 Training loss: 1.6920 0.0609 sec/batch\n", + "Epoch 11/20 Iteration 1800/3560 Training loss: 1.6930 0.0504 sec/batch\n", + "Epoch 11/20 Iteration 1801/3560 Training loss: 1.6926 0.0508 sec/batch\n", + "Epoch 11/20 Iteration 1802/3560 Training loss: 1.6932 0.0507 sec/batch\n", + "Epoch 11/20 Iteration 1803/3560 
Training loss: 1.6926 0.0532 sec/batch\n", + "Epoch 11/20 Iteration 1804/3560 Training loss: 1.6921 0.0507 sec/batch\n", + "Epoch 11/20 Iteration 1805/3560 Training loss: 1.6921 0.0512 sec/batch\n", + "Epoch 11/20 Iteration 1806/3560 Training loss: 1.6905 0.0535 sec/batch\n", + "Epoch 11/20 Iteration 1807/3560 Training loss: 1.6893 0.0566 sec/batch\n", + "Epoch 11/20 Iteration 1808/3560 Training loss: 1.6898 0.0535 sec/batch\n", + "Epoch 11/20 Iteration 1809/3560 Training loss: 1.6906 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1810/3560 Training loss: 1.6908 0.0505 sec/batch\n", + "Epoch 11/20 Iteration 1811/3560 Training loss: 1.6907 0.0513 sec/batch\n", + "Epoch 11/20 Iteration 1812/3560 Training loss: 1.6897 0.0572 sec/batch\n", + "Epoch 11/20 Iteration 1813/3560 Training loss: 1.6901 0.0542 sec/batch\n", + "Epoch 11/20 Iteration 1814/3560 Training loss: 1.6907 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1815/3560 Training loss: 1.6907 0.0539 sec/batch\n", + "Epoch 11/20 Iteration 1816/3560 Training loss: 1.6904 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1817/3560 Training loss: 1.6898 0.0516 sec/batch\n", + "Epoch 11/20 Iteration 1818/3560 Training loss: 1.6888 0.0560 sec/batch\n", + "Epoch 11/20 Iteration 1819/3560 Training loss: 1.6876 0.0531 sec/batch\n", + "Epoch 11/20 Iteration 1820/3560 Training loss: 1.6868 0.0544 sec/batch\n", + "Epoch 11/20 Iteration 1821/3560 Training loss: 1.6862 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1822/3560 Training loss: 1.6865 0.0513 sec/batch\n", + "Epoch 11/20 Iteration 1823/3560 Training loss: 1.6861 0.0513 sec/batch\n", + "Epoch 11/20 Iteration 1824/3560 Training loss: 1.6856 0.0554 sec/batch\n", + "Epoch 11/20 Iteration 1825/3560 Training loss: 1.6858 0.0529 sec/batch\n", + "Epoch 11/20 Iteration 1826/3560 Training loss: 1.6848 0.0581 sec/batch\n", + "Epoch 11/20 Iteration 1827/3560 Training loss: 1.6846 0.0561 sec/batch\n", + "Epoch 11/20 Iteration 1828/3560 Training loss: 1.6843 0.0569 sec/batch\n", + 
"Epoch 11/20 Iteration 1829/3560 Training loss: 1.6842 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1830/3560 Training loss: 1.6850 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1831/3560 Training loss: 1.6845 0.0547 sec/batch\n", + "Epoch 11/20 Iteration 1832/3560 Training loss: 1.6852 0.0525 sec/batch\n", + "Epoch 11/20 Iteration 1833/3560 Training loss: 1.6851 0.0556 sec/batch\n", + "Epoch 11/20 Iteration 1834/3560 Training loss: 1.6850 0.0535 sec/batch\n", + "Epoch 11/20 Iteration 1835/3560 Training loss: 1.6847 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1836/3560 Training loss: 1.6850 0.0498 sec/batch\n", + "Epoch 11/20 Iteration 1837/3560 Training loss: 1.6853 0.0528 sec/batch\n", + "Epoch 11/20 Iteration 1838/3560 Training loss: 1.6852 0.0631 sec/batch\n", + "Epoch 11/20 Iteration 1839/3560 Training loss: 1.6846 0.0499 sec/batch\n", + "Epoch 11/20 Iteration 1840/3560 Training loss: 1.6853 0.0557 sec/batch\n", + "Epoch 11/20 Iteration 1841/3560 Training loss: 1.6852 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1842/3560 Training loss: 1.6860 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1843/3560 Training loss: 1.6863 0.0562 sec/batch\n", + "Epoch 11/20 Iteration 1844/3560 Training loss: 1.6866 0.0534 sec/batch\n", + "Epoch 11/20 Iteration 1845/3560 Training loss: 1.6864 0.0516 sec/batch\n", + "Epoch 11/20 Iteration 1846/3560 Training loss: 1.6865 0.0550 sec/batch\n", + "Epoch 11/20 Iteration 1847/3560 Training loss: 1.6868 0.0505 sec/batch\n", + "Epoch 11/20 Iteration 1848/3560 Training loss: 1.6864 0.0532 sec/batch\n", + "Epoch 11/20 Iteration 1849/3560 Training loss: 1.6864 0.0565 sec/batch\n", + "Epoch 11/20 Iteration 1850/3560 Training loss: 1.6863 0.0516 sec/batch\n", + "Epoch 11/20 Iteration 1851/3560 Training loss: 1.6867 0.0522 sec/batch\n", + "Epoch 11/20 Iteration 1852/3560 Training loss: 1.6869 0.0533 sec/batch\n", + "Epoch 11/20 Iteration 1853/3560 Training loss: 1.6873 0.0525 sec/batch\n", + "Epoch 11/20 Iteration 1854/3560 Training loss: 
1.6871 0.0506 sec/batch\n", + "Epoch 11/20 Iteration 1855/3560 Training loss: 1.6870 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1856/3560 Training loss: 1.6872 0.0520 sec/batch\n", + "Epoch 11/20 Iteration 1857/3560 Training loss: 1.6870 0.0555 sec/batch\n", + "Epoch 11/20 Iteration 1858/3560 Training loss: 1.6870 0.0602 sec/batch\n", + "Epoch 11/20 Iteration 1859/3560 Training loss: 1.6865 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1860/3560 Training loss: 1.6864 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1861/3560 Training loss: 1.6857 0.0547 sec/batch\n", + "Epoch 11/20 Iteration 1862/3560 Training loss: 1.6858 0.0543 sec/batch\n", + "Epoch 11/20 Iteration 1863/3560 Training loss: 1.6853 0.0511 sec/batch\n", + "Epoch 11/20 Iteration 1864/3560 Training loss: 1.6852 0.0546 sec/batch\n", + "Epoch 11/20 Iteration 1865/3560 Training loss: 1.6848 0.0605 sec/batch\n", + "Epoch 11/20 Iteration 1866/3560 Training loss: 1.6845 0.0565 sec/batch\n", + "Epoch 11/20 Iteration 1867/3560 Training loss: 1.6842 0.0551 sec/batch\n", + "Epoch 11/20 Iteration 1868/3560 Training loss: 1.6840 0.0526 sec/batch\n", + "Epoch 11/20 Iteration 1869/3560 Training loss: 1.6835 0.0590 sec/batch\n", + "Epoch 11/20 Iteration 1870/3560 Training loss: 1.6836 0.0578 sec/batch\n", + "Epoch 11/20 Iteration 1871/3560 Training loss: 1.6832 0.0653 sec/batch\n", + "Epoch 11/20 Iteration 1872/3560 Training loss: 1.6830 0.0504 sec/batch\n", + "Epoch 11/20 Iteration 1873/3560 Training loss: 1.6825 0.0543 sec/batch\n", + "Epoch 11/20 Iteration 1874/3560 Training loss: 1.6821 0.0534 sec/batch\n", + "Epoch 11/20 Iteration 1875/3560 Training loss: 1.6818 0.0540 sec/batch\n", + "Epoch 11/20 Iteration 1876/3560 Training loss: 1.6818 0.0592 sec/batch\n", + "Epoch 11/20 Iteration 1877/3560 Training loss: 1.6816 0.0529 sec/batch\n", + "Epoch 11/20 Iteration 1878/3560 Training loss: 1.6812 0.0557 sec/batch\n", + "Epoch 11/20 Iteration 1879/3560 Training loss: 1.6808 0.0572 sec/batch\n", + "Epoch 11/20 
Iteration 1880/3560 Training loss: 1.6802 0.0622 sec/batch\n", + "Epoch 11/20 Iteration 1881/3560 Training loss: 1.6801 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1882/3560 Training loss: 1.6800 0.0548 sec/batch\n", + "Epoch 11/20 Iteration 1883/3560 Training loss: 1.6797 0.0558 sec/batch\n", + "Epoch 11/20 Iteration 1884/3560 Training loss: 1.6796 0.0537 sec/batch\n", + "Epoch 11/20 Iteration 1885/3560 Training loss: 1.6795 0.0528 sec/batch\n", + "Epoch 11/20 Iteration 1886/3560 Training loss: 1.6793 0.0548 sec/batch\n", + "Epoch 11/20 Iteration 1887/3560 Training loss: 1.6792 0.0512 sec/batch\n", + "Epoch 11/20 Iteration 1888/3560 Training loss: 1.6791 0.0558 sec/batch\n", + "Epoch 11/20 Iteration 1889/3560 Training loss: 1.6790 0.0531 sec/batch\n", + "Epoch 11/20 Iteration 1890/3560 Training loss: 1.6790 0.0536 sec/batch\n", + "Epoch 11/20 Iteration 1891/3560 Training loss: 1.6789 0.0561 sec/batch\n", + "Epoch 11/20 Iteration 1892/3560 Training loss: 1.6787 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1893/3560 Training loss: 1.6785 0.0538 sec/batch\n", + "Epoch 11/20 Iteration 1894/3560 Training loss: 1.6784 0.0513 sec/batch\n", + "Epoch 11/20 Iteration 1895/3560 Training loss: 1.6781 0.0544 sec/batch\n", + "Epoch 11/20 Iteration 1896/3560 Training loss: 1.6777 0.0528 sec/batch\n", + "Epoch 11/20 Iteration 1897/3560 Training loss: 1.6776 0.0568 sec/batch\n", + "Epoch 11/20 Iteration 1898/3560 Training loss: 1.6775 0.0544 sec/batch\n", + "Epoch 11/20 Iteration 1899/3560 Training loss: 1.6774 0.0510 sec/batch\n", + "Epoch 11/20 Iteration 1900/3560 Training loss: 1.6773 0.0572 sec/batch\n", + "Epoch 11/20 Iteration 1901/3560 Training loss: 1.6771 0.0526 sec/batch\n", + "Epoch 11/20 Iteration 1902/3560 Training loss: 1.6767 0.0537 sec/batch\n", + "Epoch 11/20 Iteration 1903/3560 Training loss: 1.6763 0.0545 sec/batch\n", + "Epoch 11/20 Iteration 1904/3560 Training loss: 1.6764 0.0534 sec/batch\n", + "Epoch 11/20 Iteration 1905/3560 Training loss: 1.6763 0.0655 
sec/batch\n", + "Epoch 11/20 Iteration 1906/3560 Training loss: 1.6758 0.0645 sec/batch\n", + "Epoch 11/20 Iteration 1907/3560 Training loss: 1.6760 0.0578 sec/batch\n", + "Epoch 11/20 Iteration 1908/3560 Training loss: 1.6760 0.0545 sec/batch\n", + "Epoch 11/20 Iteration 1909/3560 Training loss: 1.6760 0.0528 sec/batch\n", + "Epoch 11/20 Iteration 1910/3560 Training loss: 1.6758 0.0530 sec/batch\n", + "Epoch 11/20 Iteration 1911/3560 Training loss: 1.6753 0.0546 sec/batch\n", + "Epoch 11/20 Iteration 1912/3560 Training loss: 1.6750 0.0568 sec/batch\n", + "Epoch 11/20 Iteration 1913/3560 Training loss: 1.6751 0.0539 sec/batch\n", + "Epoch 11/20 Iteration 1914/3560 Training loss: 1.6751 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1915/3560 Training loss: 1.6750 0.0553 sec/batch\n", + "Epoch 11/20 Iteration 1916/3560 Training loss: 1.6750 0.0503 sec/batch\n", + "Epoch 11/20 Iteration 1917/3560 Training loss: 1.6751 0.0598 sec/batch\n", + "Epoch 11/20 Iteration 1918/3560 Training loss: 1.6751 0.0570 sec/batch\n", + "Epoch 11/20 Iteration 1919/3560 Training loss: 1.6751 0.0542 sec/batch\n", + "Epoch 11/20 Iteration 1920/3560 Training loss: 1.6749 0.0537 sec/batch\n", + "Epoch 11/20 Iteration 1921/3560 Training loss: 1.6751 0.0550 sec/batch\n", + "Epoch 11/20 Iteration 1922/3560 Training loss: 1.6750 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1923/3560 Training loss: 1.6749 0.0522 sec/batch\n", + "Epoch 11/20 Iteration 1924/3560 Training loss: 1.6749 0.0550 sec/batch\n", + "Epoch 11/20 Iteration 1925/3560 Training loss: 1.6747 0.0536 sec/batch\n", + "Epoch 11/20 Iteration 1926/3560 Training loss: 1.6748 0.0549 sec/batch\n", + "Epoch 11/20 Iteration 1927/3560 Training loss: 1.6748 0.0545 sec/batch\n", + "Epoch 11/20 Iteration 1928/3560 Training loss: 1.6750 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1929/3560 Training loss: 1.6750 0.0592 sec/batch\n", + "Epoch 11/20 Iteration 1930/3560 Training loss: 1.6749 0.0588 sec/batch\n", + "Epoch 11/20 Iteration 1931/3560 
Training loss: 1.6746 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1932/3560 Training loss: 1.6746 0.0567 sec/batch\n", + "Epoch 11/20 Iteration 1933/3560 Training loss: 1.6746 0.0514 sec/batch\n", + "Epoch 11/20 Iteration 1934/3560 Training loss: 1.6746 0.0527 sec/batch\n", + "Epoch 11/20 Iteration 1935/3560 Training loss: 1.6746 0.0595 sec/batch\n", + "Epoch 11/20 Iteration 1936/3560 Training loss: 1.6745 0.0526 sec/batch\n", + "Epoch 11/20 Iteration 1937/3560 Training loss: 1.6746 0.0511 sec/batch\n", + "Epoch 11/20 Iteration 1938/3560 Training loss: 1.6746 0.0539 sec/batch\n", + "Epoch 11/20 Iteration 1939/3560 Training loss: 1.6743 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1940/3560 Training loss: 1.6745 0.0501 sec/batch\n", + "Epoch 11/20 Iteration 1941/3560 Training loss: 1.6746 0.0514 sec/batch\n", + "Epoch 11/20 Iteration 1942/3560 Training loss: 1.6745 0.0545 sec/batch\n", + "Epoch 11/20 Iteration 1943/3560 Training loss: 1.6745 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1944/3560 Training loss: 1.6746 0.0513 sec/batch\n", + "Epoch 11/20 Iteration 1945/3560 Training loss: 1.6745 0.0546 sec/batch\n", + "Epoch 11/20 Iteration 1946/3560 Training loss: 1.6745 0.0594 sec/batch\n", + "Epoch 11/20 Iteration 1947/3560 Training loss: 1.6746 0.0510 sec/batch\n", + "Epoch 11/20 Iteration 1948/3560 Training loss: 1.6749 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1949/3560 Training loss: 1.6748 0.0577 sec/batch\n", + "Epoch 11/20 Iteration 1950/3560 Training loss: 1.6748 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1951/3560 Training loss: 1.6747 0.0509 sec/batch\n", + "Epoch 11/20 Iteration 1952/3560 Training loss: 1.6745 0.0527 sec/batch\n", + "Epoch 11/20 Iteration 1953/3560 Training loss: 1.6746 0.0567 sec/batch\n", + "Epoch 11/20 Iteration 1954/3560 Training loss: 1.6745 0.0566 sec/batch\n", + "Epoch 11/20 Iteration 1955/3560 Training loss: 1.6745 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1956/3560 Training loss: 1.6744 0.0518 sec/batch\n", + 
"Epoch 11/20 Iteration 1957/3560 Training loss: 1.6743 0.0511 sec/batch\n", + "Epoch 11/20 Iteration 1958/3560 Training loss: 1.6744 0.0535 sec/batch\n", + "Epoch 12/20 Iteration 1959/3560 Training loss: 1.7601 0.0501 sec/batch\n", + "Epoch 12/20 Iteration 1960/3560 Training loss: 1.7092 0.0527 sec/batch\n", + "Epoch 12/20 Iteration 1961/3560 Training loss: 1.6894 0.0518 sec/batch\n", + "Epoch 12/20 Iteration 1962/3560 Training loss: 1.6832 0.0541 sec/batch\n", + "Epoch 12/20 Iteration 1963/3560 Training loss: 1.6769 0.0516 sec/batch\n", + "Epoch 12/20 Iteration 1964/3560 Training loss: 1.6665 0.0530 sec/batch\n", + "Epoch 12/20 Iteration 1965/3560 Training loss: 1.6678 0.0570 sec/batch\n", + "Epoch 12/20 Iteration 1966/3560 Training loss: 1.6663 0.0521 sec/batch\n", + "Epoch 12/20 Iteration 1967/3560 Training loss: 1.6688 0.0530 sec/batch\n", + "Epoch 12/20 Iteration 1968/3560 Training loss: 1.6667 0.0529 sec/batch\n", + "Epoch 12/20 Iteration 1969/3560 Training loss: 1.6636 0.0503 sec/batch\n", + "Epoch 12/20 Iteration 1970/3560 Training loss: 1.6620 0.0528 sec/batch\n", + "Epoch 12/20 Iteration 1971/3560 Training loss: 1.6621 0.0611 sec/batch\n", + "Epoch 12/20 Iteration 1972/3560 Training loss: 1.6641 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 1973/3560 Training loss: 1.6630 0.0536 sec/batch\n", + "Epoch 12/20 Iteration 1974/3560 Training loss: 1.6612 0.0505 sec/batch\n", + "Epoch 12/20 Iteration 1975/3560 Training loss: 1.6617 0.0508 sec/batch\n", + "Epoch 12/20 Iteration 1976/3560 Training loss: 1.6631 0.0505 sec/batch\n", + "Epoch 12/20 Iteration 1977/3560 Training loss: 1.6633 0.0582 sec/batch\n", + "Epoch 12/20 Iteration 1978/3560 Training loss: 1.6641 0.0523 sec/batch\n", + "Epoch 12/20 Iteration 1979/3560 Training loss: 1.6637 0.0521 sec/batch\n", + "Epoch 12/20 Iteration 1980/3560 Training loss: 1.6638 0.0527 sec/batch\n", + "Epoch 12/20 Iteration 1981/3560 Training loss: 1.6628 0.0592 sec/batch\n", + "Epoch 12/20 Iteration 1982/3560 Training loss: 
1.6620 0.0533 sec/batch\n", + "Epoch 12/20 Iteration 1983/3560 Training loss: 1.6620 0.0529 sec/batch\n", + "Epoch 12/20 Iteration 1984/3560 Training loss: 1.6601 0.0530 sec/batch\n", + "Epoch 12/20 Iteration 1985/3560 Training loss: 1.6590 0.0514 sec/batch\n", + "Epoch 12/20 Iteration 1986/3560 Training loss: 1.6598 0.0551 sec/batch\n", + "Epoch 12/20 Iteration 1987/3560 Training loss: 1.6604 0.0584 sec/batch\n", + "Epoch 12/20 Iteration 1988/3560 Training loss: 1.6606 0.0511 sec/batch\n", + "Epoch 12/20 Iteration 1989/3560 Training loss: 1.6601 0.0506 sec/batch\n", + "Epoch 12/20 Iteration 1990/3560 Training loss: 1.6593 0.0505 sec/batch\n", + "Epoch 12/20 Iteration 1991/3560 Training loss: 1.6596 0.0512 sec/batch\n", + "Epoch 12/20 Iteration 1992/3560 Training loss: 1.6604 0.0525 sec/batch\n", + "Epoch 12/20 Iteration 1993/3560 Training loss: 1.6603 0.0512 sec/batch\n", + "Epoch 12/20 Iteration 1994/3560 Training loss: 1.6601 0.0507 sec/batch\n", + "Epoch 12/20 Iteration 1995/3560 Training loss: 1.6594 0.0521 sec/batch\n", + "Epoch 12/20 Iteration 1996/3560 Training loss: 1.6583 0.0519 sec/batch\n", + "Epoch 12/20 Iteration 1997/3560 Training loss: 1.6569 0.0562 sec/batch\n", + "Epoch 12/20 Iteration 1998/3560 Training loss: 1.6562 0.0550 sec/batch\n", + "Epoch 12/20 Iteration 1999/3560 Training loss: 1.6558 0.0530 sec/batch\n", + "Epoch 12/20 Iteration 2000/3560 Training loss: 1.6565 0.0570 sec/batch\n", + "Epoch 12/20 Iteration 2001/3560 Training loss: 1.6559 0.0536 sec/batch\n", + "Epoch 12/20 Iteration 2002/3560 Training loss: 1.6553 0.0589 sec/batch\n", + "Epoch 12/20 Iteration 2003/3560 Training loss: 1.6556 0.0572 sec/batch\n", + "Epoch 12/20 Iteration 2004/3560 Training loss: 1.6546 0.0521 sec/batch\n", + "Epoch 12/20 Iteration 2005/3560 Training loss: 1.6542 0.0514 sec/batch\n", + "Epoch 12/20 Iteration 2006/3560 Training loss: 1.6540 0.0520 sec/batch\n", + "Epoch 12/20 Iteration 2007/3560 Training loss: 1.6539 0.0588 sec/batch\n", + "Epoch 12/20 
Iteration 2008/3560 Training loss: 1.6547 0.0552 sec/batch\n", + "Epoch 12/20 Iteration 2009/3560 Training loss: 1.6544 0.0515 sec/batch\n", + "Epoch 12/20 Iteration 2010/3560 Training loss: 1.6551 0.0544 sec/batch\n", + "Epoch 12/20 Iteration 2011/3560 Training loss: 1.6551 0.0514 sec/batch\n", + "Epoch 12/20 Iteration 2012/3560 Training loss: 1.6552 0.0563 sec/batch\n", + "Epoch 12/20 Iteration 2013/3560 Training loss: 1.6550 0.0563 sec/batch\n", + "Epoch 12/20 Iteration 2014/3560 Training loss: 1.6551 0.0524 sec/batch\n", + "Epoch 12/20 Iteration 2015/3560 Training loss: 1.6555 0.0510 sec/batch\n", + "Epoch 12/20 Iteration 2016/3560 Training loss: 1.6552 0.0562 sec/batch\n", + "Epoch 12/20 Iteration 2017/3560 Training loss: 1.6546 0.0525 sec/batch\n", + "Epoch 12/20 Iteration 2018/3560 Training loss: 1.6553 0.0529 sec/batch\n", + "Epoch 12/20 Iteration 2019/3560 Training loss: 1.6551 0.0563 sec/batch\n", + "Epoch 12/20 Iteration 2020/3560 Training loss: 1.6560 0.0596 sec/batch\n", + "Epoch 12/20 Iteration 2021/3560 Training loss: 1.6562 0.0537 sec/batch\n", + "Epoch 12/20 Iteration 2022/3560 Training loss: 1.6566 0.0530 sec/batch\n", + "Epoch 12/20 Iteration 2023/3560 Training loss: 1.6563 0.0533 sec/batch\n", + "Epoch 12/20 Iteration 2024/3560 Training loss: 1.6565 0.0520 sec/batch\n", + "Epoch 12/20 Iteration 2025/3560 Training loss: 1.6566 0.0515 sec/batch\n", + "Epoch 12/20 Iteration 2026/3560 Training loss: 1.6563 0.0583 sec/batch\n", + "Epoch 12/20 Iteration 2027/3560 Training loss: 1.6562 0.0508 sec/batch\n", + "Epoch 12/20 Iteration 2028/3560 Training loss: 1.6562 0.0511 sec/batch\n", + "Epoch 12/20 Iteration 2029/3560 Training loss: 1.6568 0.0600 sec/batch\n", + "Epoch 12/20 Iteration 2030/3560 Training loss: 1.6570 0.0529 sec/batch\n", + "Epoch 12/20 Iteration 2031/3560 Training loss: 1.6573 0.0613 sec/batch\n", + "Epoch 12/20 Iteration 2032/3560 Training loss: 1.6570 0.0509 sec/batch\n", + "Epoch 12/20 Iteration 2033/3560 Training loss: 1.6569 0.0513 
sec/batch\n", + "[... repetitive training log condensed: Epochs 12-15/20, Iterations 2034-2544/3560; training loss declines steadily from ~1.657 to ~1.584 at roughly 0.05 sec/batch ...]\n", + "Epoch 15/20 Iteration 2545/3560 Training loss: 1.5836 0.0539 
sec/batch\n", + "Epoch 15/20 Iteration 2546/3560 Training loss: 1.5839 0.0539 sec/batch\n", + "Epoch 15/20 Iteration 2547/3560 Training loss: 1.5838 0.0531 sec/batch\n", + "Epoch 15/20 Iteration 2548/3560 Training loss: 1.5839 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2549/3560 Training loss: 1.5844 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2550/3560 Training loss: 1.5840 0.0527 sec/batch\n", + "Epoch 15/20 Iteration 2551/3560 Training loss: 1.5835 0.0522 sec/batch\n", + "Epoch 15/20 Iteration 2552/3560 Training loss: 1.5841 0.0517 sec/batch\n", + "Epoch 15/20 Iteration 2553/3560 Training loss: 1.5839 0.0520 sec/batch\n", + "Epoch 15/20 Iteration 2554/3560 Training loss: 1.5848 0.0517 sec/batch\n", + "Epoch 15/20 Iteration 2555/3560 Training loss: 1.5851 0.0510 sec/batch\n", + "Epoch 15/20 Iteration 2556/3560 Training loss: 1.5853 0.0625 sec/batch\n", + "Epoch 15/20 Iteration 2557/3560 Training loss: 1.5853 0.0524 sec/batch\n", + "Epoch 15/20 Iteration 2558/3560 Training loss: 1.5855 0.0536 sec/batch\n", + "Epoch 15/20 Iteration 2559/3560 Training loss: 1.5857 0.0517 sec/batch\n", + "Epoch 15/20 Iteration 2560/3560 Training loss: 1.5854 0.0558 sec/batch\n", + "Epoch 15/20 Iteration 2561/3560 Training loss: 1.5855 0.0517 sec/batch\n", + "Epoch 15/20 Iteration 2562/3560 Training loss: 1.5853 0.0531 sec/batch\n", + "Epoch 15/20 Iteration 2563/3560 Training loss: 1.5858 0.0578 sec/batch\n", + "Epoch 15/20 Iteration 2564/3560 Training loss: 1.5860 0.0593 sec/batch\n", + "Epoch 15/20 Iteration 2565/3560 Training loss: 1.5865 0.0583 sec/batch\n", + "Epoch 15/20 Iteration 2566/3560 Training loss: 1.5861 0.0557 sec/batch\n", + "Epoch 15/20 Iteration 2567/3560 Training loss: 1.5861 0.0534 sec/batch\n", + "Epoch 15/20 Iteration 2568/3560 Training loss: 1.5865 0.0535 sec/batch\n", + "Epoch 15/20 Iteration 2569/3560 Training loss: 1.5863 0.0538 sec/batch\n", + "Epoch 15/20 Iteration 2570/3560 Training loss: 1.5862 0.0521 sec/batch\n", + "Epoch 15/20 Iteration 2571/3560 
Training loss: 1.5856 0.0583 sec/batch\n", + "Epoch 15/20 Iteration 2572/3560 Training loss: 1.5856 0.0561 sec/batch\n", + "Epoch 15/20 Iteration 2573/3560 Training loss: 1.5851 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2574/3560 Training loss: 1.5851 0.0550 sec/batch\n", + "Epoch 15/20 Iteration 2575/3560 Training loss: 1.5847 0.0539 sec/batch\n", + "Epoch 15/20 Iteration 2576/3560 Training loss: 1.5847 0.0547 sec/batch\n", + "Epoch 15/20 Iteration 2577/3560 Training loss: 1.5843 0.0543 sec/batch\n", + "Epoch 15/20 Iteration 2578/3560 Training loss: 1.5841 0.0540 sec/batch\n", + "Epoch 15/20 Iteration 2579/3560 Training loss: 1.5837 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2580/3560 Training loss: 1.5835 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2581/3560 Training loss: 1.5830 0.0534 sec/batch\n", + "Epoch 15/20 Iteration 2582/3560 Training loss: 1.5832 0.0547 sec/batch\n", + "Epoch 15/20 Iteration 2583/3560 Training loss: 1.5829 0.0563 sec/batch\n", + "Epoch 15/20 Iteration 2584/3560 Training loss: 1.5828 0.0558 sec/batch\n", + "Epoch 15/20 Iteration 2585/3560 Training loss: 1.5822 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2586/3560 Training loss: 1.5819 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2587/3560 Training loss: 1.5817 0.0558 sec/batch\n", + "Epoch 15/20 Iteration 2588/3560 Training loss: 1.5817 0.0535 sec/batch\n", + "Epoch 15/20 Iteration 2589/3560 Training loss: 1.5816 0.0510 sec/batch\n", + "Epoch 15/20 Iteration 2590/3560 Training loss: 1.5811 0.0518 sec/batch\n", + "Epoch 15/20 Iteration 2591/3560 Training loss: 1.5808 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2592/3560 Training loss: 1.5803 0.0533 sec/batch\n", + "Epoch 15/20 Iteration 2593/3560 Training loss: 1.5802 0.0568 sec/batch\n", + "Epoch 15/20 Iteration 2594/3560 Training loss: 1.5801 0.0630 sec/batch\n", + "Epoch 15/20 Iteration 2595/3560 Training loss: 1.5799 0.0537 sec/batch\n", + "Epoch 15/20 Iteration 2596/3560 Training loss: 1.5797 0.0516 sec/batch\n", + 
"Epoch 15/20 Iteration 2597/3560 Training loss: 1.5795 0.0525 sec/batch\n", + "Epoch 15/20 Iteration 2598/3560 Training loss: 1.5793 0.0530 sec/batch\n", + "Epoch 15/20 Iteration 2599/3560 Training loss: 1.5793 0.0542 sec/batch\n", + "Epoch 15/20 Iteration 2600/3560 Training loss: 1.5792 0.0566 sec/batch\n", + "Epoch 15/20 Iteration 2601/3560 Training loss: 1.5791 0.0613 sec/batch\n", + "Epoch 15/20 Iteration 2602/3560 Training loss: 1.5791 0.0552 sec/batch\n", + "Epoch 15/20 Iteration 2603/3560 Training loss: 1.5789 0.0533 sec/batch\n", + "Epoch 15/20 Iteration 2604/3560 Training loss: 1.5789 0.0531 sec/batch\n", + "Epoch 15/20 Iteration 2605/3560 Training loss: 1.5788 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2606/3560 Training loss: 1.5786 0.0536 sec/batch\n", + "Epoch 15/20 Iteration 2607/3560 Training loss: 1.5783 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2608/3560 Training loss: 1.5779 0.0539 sec/batch\n", + "Epoch 15/20 Iteration 2609/3560 Training loss: 1.5778 0.0566 sec/batch\n", + "Epoch 15/20 Iteration 2610/3560 Training loss: 1.5778 0.0528 sec/batch\n", + "Epoch 15/20 Iteration 2611/3560 Training loss: 1.5777 0.0512 sec/batch\n", + "Epoch 15/20 Iteration 2612/3560 Training loss: 1.5775 0.0522 sec/batch\n", + "Epoch 15/20 Iteration 2613/3560 Training loss: 1.5774 0.0516 sec/batch\n", + "Epoch 15/20 Iteration 2614/3560 Training loss: 1.5770 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2615/3560 Training loss: 1.5766 0.0553 sec/batch\n", + "Epoch 15/20 Iteration 2616/3560 Training loss: 1.5766 0.0518 sec/batch\n", + "Epoch 15/20 Iteration 2617/3560 Training loss: 1.5765 0.0520 sec/batch\n", + "Epoch 15/20 Iteration 2618/3560 Training loss: 1.5762 0.0510 sec/batch\n", + "Epoch 15/20 Iteration 2619/3560 Training loss: 1.5763 0.0596 sec/batch\n", + "Epoch 15/20 Iteration 2620/3560 Training loss: 1.5764 0.0541 sec/batch\n", + "Epoch 15/20 Iteration 2621/3560 Training loss: 1.5763 0.0550 sec/batch\n", + "Epoch 15/20 Iteration 2622/3560 Training loss: 
1.5761 0.0548 sec/batch\n", + "Epoch 15/20 Iteration 2623/3560 Training loss: 1.5757 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2624/3560 Training loss: 1.5754 0.0534 sec/batch\n", + "Epoch 15/20 Iteration 2625/3560 Training loss: 1.5755 0.0502 sec/batch\n", + "Epoch 15/20 Iteration 2626/3560 Training loss: 1.5755 0.0527 sec/batch\n", + "Epoch 15/20 Iteration 2627/3560 Training loss: 1.5756 0.0645 sec/batch\n", + "Epoch 15/20 Iteration 2628/3560 Training loss: 1.5756 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2629/3560 Training loss: 1.5758 0.0522 sec/batch\n", + "Epoch 15/20 Iteration 2630/3560 Training loss: 1.5758 0.0531 sec/batch\n", + "Epoch 15/20 Iteration 2631/3560 Training loss: 1.5758 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2632/3560 Training loss: 1.5757 0.0584 sec/batch\n", + "Epoch 15/20 Iteration 2633/3560 Training loss: 1.5761 0.0521 sec/batch\n", + "Epoch 15/20 Iteration 2634/3560 Training loss: 1.5760 0.0544 sec/batch\n", + "Epoch 15/20 Iteration 2635/3560 Training loss: 1.5760 0.0513 sec/batch\n", + "Epoch 15/20 Iteration 2636/3560 Training loss: 1.5761 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2637/3560 Training loss: 1.5760 0.0592 sec/batch\n", + "Epoch 15/20 Iteration 2638/3560 Training loss: 1.5761 0.0564 sec/batch\n", + "Epoch 15/20 Iteration 2639/3560 Training loss: 1.5761 0.0625 sec/batch\n", + "Epoch 15/20 Iteration 2640/3560 Training loss: 1.5763 0.0527 sec/batch\n", + "Epoch 15/20 Iteration 2641/3560 Training loss: 1.5765 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2642/3560 Training loss: 1.5764 0.0560 sec/batch\n", + "Epoch 15/20 Iteration 2643/3560 Training loss: 1.5761 0.0549 sec/batch\n", + "Epoch 15/20 Iteration 2644/3560 Training loss: 1.5760 0.0558 sec/batch\n", + "Epoch 15/20 Iteration 2645/3560 Training loss: 1.5761 0.0542 sec/batch\n", + "Epoch 15/20 Iteration 2646/3560 Training loss: 1.5761 0.0527 sec/batch\n", + "Epoch 15/20 Iteration 2647/3560 Training loss: 1.5762 0.0535 sec/batch\n", + "Epoch 15/20 
Iteration 2648/3560 Training loss: 1.5762 0.0508 sec/batch\n", + "Epoch 15/20 Iteration 2649/3560 Training loss: 1.5762 0.0510 sec/batch\n", + "Epoch 15/20 Iteration 2650/3560 Training loss: 1.5762 0.0521 sec/batch\n", + "Epoch 15/20 Iteration 2651/3560 Training loss: 1.5760 0.0518 sec/batch\n", + "Epoch 15/20 Iteration 2652/3560 Training loss: 1.5761 0.0551 sec/batch\n", + "Epoch 15/20 Iteration 2653/3560 Training loss: 1.5762 0.0526 sec/batch\n", + "Epoch 15/20 Iteration 2654/3560 Training loss: 1.5762 0.0526 sec/batch\n", + "Epoch 15/20 Iteration 2655/3560 Training loss: 1.5762 0.0551 sec/batch\n", + "Epoch 15/20 Iteration 2656/3560 Training loss: 1.5762 0.0517 sec/batch\n", + "Epoch 15/20 Iteration 2657/3560 Training loss: 1.5762 0.0537 sec/batch\n", + "Epoch 15/20 Iteration 2658/3560 Training loss: 1.5762 0.0541 sec/batch\n", + "Epoch 15/20 Iteration 2659/3560 Training loss: 1.5763 0.0524 sec/batch\n", + "Epoch 15/20 Iteration 2660/3560 Training loss: 1.5768 0.0524 sec/batch\n", + "Epoch 15/20 Iteration 2661/3560 Training loss: 1.5768 0.0522 sec/batch\n", + "Epoch 15/20 Iteration 2662/3560 Training loss: 1.5767 0.0558 sec/batch\n", + "Epoch 15/20 Iteration 2663/3560 Training loss: 1.5766 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2664/3560 Training loss: 1.5765 0.0527 sec/batch\n", + "Epoch 15/20 Iteration 2665/3560 Training loss: 1.5766 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2666/3560 Training loss: 1.5766 0.0510 sec/batch\n", + "Epoch 15/20 Iteration 2667/3560 Training loss: 1.5766 0.0538 sec/batch\n", + "Epoch 15/20 Iteration 2668/3560 Training loss: 1.5765 0.0514 sec/batch\n", + "Epoch 15/20 Iteration 2669/3560 Training loss: 1.5763 0.0572 sec/batch\n", + "Epoch 15/20 Iteration 2670/3560 Training loss: 1.5765 0.0538 sec/batch\n", + "Epoch 16/20 Iteration 2671/3560 Training loss: 1.6578 0.0501 sec/batch\n", + "Epoch 16/20 Iteration 2672/3560 Training loss: 1.6142 0.0526 sec/batch\n", + "Epoch 16/20 Iteration 2673/3560 Training loss: 1.5987 0.0532 
sec/batch\n", + "Epoch 16/20 Iteration 2674/3560 Training loss: 1.5949 0.0551 sec/batch\n", + "Epoch 16/20 Iteration 2675/3560 Training loss: 1.5861 0.0530 sec/batch\n", + "Epoch 16/20 Iteration 2676/3560 Training loss: 1.5760 0.0576 sec/batch\n", + "Epoch 16/20 Iteration 2677/3560 Training loss: 1.5761 0.0532 sec/batch\n", + "Epoch 16/20 Iteration 2678/3560 Training loss: 1.5732 0.0526 sec/batch\n", + "Epoch 16/20 Iteration 2679/3560 Training loss: 1.5745 0.0508 sec/batch\n", + "Epoch 16/20 Iteration 2680/3560 Training loss: 1.5727 0.0523 sec/batch\n", + "Epoch 16/20 Iteration 2681/3560 Training loss: 1.5687 0.0526 sec/batch\n", + "Epoch 16/20 Iteration 2682/3560 Training loss: 1.5680 0.0534 sec/batch\n", + "Epoch 16/20 Iteration 2683/3560 Training loss: 1.5685 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2684/3560 Training loss: 1.5703 0.0541 sec/batch\n", + "Epoch 16/20 Iteration 2685/3560 Training loss: 1.5701 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2686/3560 Training loss: 1.5682 0.0552 sec/batch\n", + "Epoch 16/20 Iteration 2687/3560 Training loss: 1.5688 0.0526 sec/batch\n", + "Epoch 16/20 Iteration 2688/3560 Training loss: 1.5705 0.0524 sec/batch\n", + "Epoch 16/20 Iteration 2689/3560 Training loss: 1.5714 0.0518 sec/batch\n", + "Epoch 16/20 Iteration 2690/3560 Training loss: 1.5725 0.0515 sec/batch\n", + "Epoch 16/20 Iteration 2691/3560 Training loss: 1.5722 0.0524 sec/batch\n", + "Epoch 16/20 Iteration 2692/3560 Training loss: 1.5724 0.0507 sec/batch\n", + "Epoch 16/20 Iteration 2693/3560 Training loss: 1.5716 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2694/3560 Training loss: 1.5712 0.0581 sec/batch\n", + "Epoch 16/20 Iteration 2695/3560 Training loss: 1.5711 0.0534 sec/batch\n", + "Epoch 16/20 Iteration 2696/3560 Training loss: 1.5697 0.0530 sec/batch\n", + "Epoch 16/20 Iteration 2697/3560 Training loss: 1.5681 0.0573 sec/batch\n", + "Epoch 16/20 Iteration 2698/3560 Training loss: 1.5687 0.0561 sec/batch\n", + "Epoch 16/20 Iteration 2699/3560 
Training loss: 1.5692 0.0523 sec/batch\n", + "Epoch 16/20 Iteration 2700/3560 Training loss: 1.5697 0.0550 sec/batch\n", + "Epoch 16/20 Iteration 2701/3560 Training loss: 1.5694 0.0506 sec/batch\n", + "Epoch 16/20 Iteration 2702/3560 Training loss: 1.5686 0.0610 sec/batch\n", + "Epoch 16/20 Iteration 2703/3560 Training loss: 1.5690 0.0531 sec/batch\n", + "Epoch 16/20 Iteration 2704/3560 Training loss: 1.5695 0.0518 sec/batch\n", + "Epoch 16/20 Iteration 2705/3560 Training loss: 1.5694 0.0544 sec/batch\n", + "Epoch 16/20 Iteration 2706/3560 Training loss: 1.5695 0.0558 sec/batch\n", + "Epoch 16/20 Iteration 2707/3560 Training loss: 1.5687 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2708/3560 Training loss: 1.5676 0.0593 sec/batch\n", + "Epoch 16/20 Iteration 2709/3560 Training loss: 1.5662 0.0571 sec/batch\n", + "Epoch 16/20 Iteration 2710/3560 Training loss: 1.5659 0.0519 sec/batch\n", + "Epoch 16/20 Iteration 2711/3560 Training loss: 1.5651 0.0505 sec/batch\n", + "Epoch 16/20 Iteration 2712/3560 Training loss: 1.5656 0.0516 sec/batch\n", + "Epoch 16/20 Iteration 2713/3560 Training loss: 1.5650 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2714/3560 Training loss: 1.5645 0.0537 sec/batch\n", + "Epoch 16/20 Iteration 2715/3560 Training loss: 1.5647 0.0579 sec/batch\n", + "Epoch 16/20 Iteration 2716/3560 Training loss: 1.5638 0.0544 sec/batch\n", + "Epoch 16/20 Iteration 2717/3560 Training loss: 1.5637 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2718/3560 Training loss: 1.5636 0.0572 sec/batch\n", + "Epoch 16/20 Iteration 2719/3560 Training loss: 1.5637 0.0515 sec/batch\n", + "Epoch 16/20 Iteration 2720/3560 Training loss: 1.5645 0.0544 sec/batch\n", + "Epoch 16/20 Iteration 2721/3560 Training loss: 1.5642 0.0551 sec/batch\n", + "Epoch 16/20 Iteration 2722/3560 Training loss: 1.5650 0.0561 sec/batch\n", + "Epoch 16/20 Iteration 2723/3560 Training loss: 1.5651 0.0587 sec/batch\n", + "Epoch 16/20 Iteration 2724/3560 Training loss: 1.5653 0.0521 sec/batch\n", + 
"Epoch 16/20 Iteration 2725/3560 Training loss: 1.5651 0.0568 sec/batch\n", + "Epoch 16/20 Iteration 2726/3560 Training loss: 1.5652 0.0526 sec/batch\n", + "Epoch 16/20 Iteration 2727/3560 Training loss: 1.5658 0.0537 sec/batch\n", + "Epoch 16/20 Iteration 2728/3560 Training loss: 1.5656 0.0565 sec/batch\n", + "Epoch 16/20 Iteration 2729/3560 Training loss: 1.5651 0.0546 sec/batch\n", + "Epoch 16/20 Iteration 2730/3560 Training loss: 1.5657 0.0538 sec/batch\n", + "Epoch 16/20 Iteration 2731/3560 Training loss: 1.5657 0.0541 sec/batch\n", + "Epoch 16/20 Iteration 2732/3560 Training loss: 1.5667 0.0618 sec/batch\n", + "Epoch 16/20 Iteration 2733/3560 Training loss: 1.5672 0.0554 sec/batch\n", + "Epoch 16/20 Iteration 2734/3560 Training loss: 1.5673 0.0532 sec/batch\n", + "Epoch 16/20 Iteration 2735/3560 Training loss: 1.5671 0.0527 sec/batch\n", + "Epoch 16/20 Iteration 2736/3560 Training loss: 1.5673 0.0531 sec/batch\n", + "Epoch 16/20 Iteration 2737/3560 Training loss: 1.5675 0.0524 sec/batch\n", + "Epoch 16/20 Iteration 2738/3560 Training loss: 1.5672 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2739/3560 Training loss: 1.5672 0.0524 sec/batch\n", + "Epoch 16/20 Iteration 2740/3560 Training loss: 1.5671 0.0517 sec/batch\n", + "Epoch 16/20 Iteration 2741/3560 Training loss: 1.5677 0.0545 sec/batch\n", + "Epoch 16/20 Iteration 2742/3560 Training loss: 1.5679 0.0548 sec/batch\n", + "Epoch 16/20 Iteration 2743/3560 Training loss: 1.5685 0.0536 sec/batch\n", + "Epoch 16/20 Iteration 2744/3560 Training loss: 1.5683 0.0539 sec/batch\n", + "Epoch 16/20 Iteration 2745/3560 Training loss: 1.5683 0.0523 sec/batch\n", + "Epoch 16/20 Iteration 2746/3560 Training loss: 1.5685 0.0531 sec/batch\n", + "Epoch 16/20 Iteration 2747/3560 Training loss: 1.5683 0.0517 sec/batch\n", + "Epoch 16/20 Iteration 2748/3560 Training loss: 1.5683 0.0533 sec/batch\n", + "Epoch 16/20 Iteration 2749/3560 Training loss: 1.5678 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2750/3560 Training loss: 
1.5678 0.0546 sec/batch\n", + "Epoch 16/20 Iteration 2751/3560 Training loss: 1.5672 0.0506 sec/batch\n", + "Epoch 16/20 Iteration 2752/3560 Training loss: 1.5672 0.0526 sec/batch\n", + "Epoch 16/20 Iteration 2753/3560 Training loss: 1.5667 0.0533 sec/batch\n", + "Epoch 16/20 Iteration 2754/3560 Training loss: 1.5666 0.0573 sec/batch\n", + "Epoch 16/20 Iteration 2755/3560 Training loss: 1.5663 0.0533 sec/batch\n", + "Epoch 16/20 Iteration 2756/3560 Training loss: 1.5660 0.0519 sec/batch\n", + "Epoch 16/20 Iteration 2757/3560 Training loss: 1.5657 0.0561 sec/batch\n", + "Epoch 16/20 Iteration 2758/3560 Training loss: 1.5655 0.0519 sec/batch\n", + "Epoch 16/20 Iteration 2759/3560 Training loss: 1.5651 0.0539 sec/batch\n", + "Epoch 16/20 Iteration 2760/3560 Training loss: 1.5651 0.0540 sec/batch\n", + "Epoch 16/20 Iteration 2761/3560 Training loss: 1.5647 0.0540 sec/batch\n", + "Epoch 16/20 Iteration 2762/3560 Training loss: 1.5646 0.0571 sec/batch\n", + "Epoch 16/20 Iteration 2763/3560 Training loss: 1.5641 0.0534 sec/batch\n", + "Epoch 16/20 Iteration 2764/3560 Training loss: 1.5638 0.0516 sec/batch\n", + "Epoch 16/20 Iteration 2765/3560 Training loss: 1.5636 0.0509 sec/batch\n", + "Epoch 16/20 Iteration 2766/3560 Training loss: 1.5637 0.0592 sec/batch\n", + "Epoch 16/20 Iteration 2767/3560 Training loss: 1.5635 0.0608 sec/batch\n", + "Epoch 16/20 Iteration 2768/3560 Training loss: 1.5631 0.0526 sec/batch\n", + "Epoch 16/20 Iteration 2769/3560 Training loss: 1.5627 0.0515 sec/batch\n", + "Epoch 16/20 Iteration 2770/3560 Training loss: 1.5622 0.0561 sec/batch\n", + "Epoch 16/20 Iteration 2771/3560 Training loss: 1.5622 0.0517 sec/batch\n", + "Epoch 16/20 Iteration 2772/3560 Training loss: 1.5620 0.0532 sec/batch\n", + "Epoch 16/20 Iteration 2773/3560 Training loss: 1.5619 0.0541 sec/batch\n", + "Epoch 16/20 Iteration 2774/3560 Training loss: 1.5618 0.0524 sec/batch\n", + "Epoch 16/20 Iteration 2775/3560 Training loss: 1.5616 0.0528 sec/batch\n", + "Epoch 16/20 
Iteration 2776/3560 Training loss: 1.5615 0.0515 sec/batch\n", + "Epoch 16/20 Iteration 2777/3560 Training loss: 1.5615 0.0508 sec/batch\n", + "Epoch 16/20 Iteration 2778/3560 Training loss: 1.5614 0.0504 sec/batch\n", + "Epoch 16/20 Iteration 2779/3560 Training loss: 1.5613 0.0505 sec/batch\n", + "Epoch 16/20 Iteration 2780/3560 Training loss: 1.5614 0.0541 sec/batch\n", + "Epoch 16/20 Iteration 2781/3560 Training loss: 1.5612 0.0571 sec/batch\n", + "Epoch 16/20 Iteration 2782/3560 Training loss: 1.5611 0.0535 sec/batch\n", + "Epoch 16/20 Iteration 2783/3560 Training loss: 1.5611 0.0535 sec/batch\n", + "Epoch 16/20 Iteration 2784/3560 Training loss: 1.5609 0.0516 sec/batch\n", + "Epoch 16/20 Iteration 2785/3560 Training loss: 1.5605 0.0554 sec/batch\n", + "Epoch 16/20 Iteration 2786/3560 Training loss: 1.5601 0.0573 sec/batch\n", + "Epoch 16/20 Iteration 2787/3560 Training loss: 1.5601 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2788/3560 Training loss: 1.5601 0.0526 sec/batch\n", + "Epoch 16/20 Iteration 2789/3560 Training loss: 1.5599 0.0540 sec/batch\n", + "Epoch 16/20 Iteration 2790/3560 Training loss: 1.5599 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2791/3560 Training loss: 1.5598 0.0506 sec/batch\n", + "Epoch 16/20 Iteration 2792/3560 Training loss: 1.5594 0.0550 sec/batch\n", + "Epoch 16/20 Iteration 2793/3560 Training loss: 1.5591 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2794/3560 Training loss: 1.5591 0.0514 sec/batch\n", + "Epoch 16/20 Iteration 2795/3560 Training loss: 1.5590 0.0518 sec/batch\n", + "Epoch 16/20 Iteration 2796/3560 Training loss: 1.5586 0.0578 sec/batch\n", + "Epoch 16/20 Iteration 2797/3560 Training loss: 1.5588 0.0673 sec/batch\n", + "Epoch 16/20 Iteration 2798/3560 Training loss: 1.5589 0.0585 sec/batch\n", + "Epoch 16/20 Iteration 2799/3560 Training loss: 1.5588 0.0514 sec/batch\n", + "Epoch 16/20 Iteration 2800/3560 Training loss: 1.5586 0.0541 sec/batch\n", + "Epoch 16/20 Iteration 2801/3560 Training loss: 1.5582 0.0509 
sec/batch\n", + "Epoch 16/20 Iteration 2802/3560 Training loss: 1.5579 0.0522 sec/batch\n", + "Epoch 16/20 Iteration 2803/3560 Training loss: 1.5580 0.0519 sec/batch\n", + "Epoch 16/20 Iteration 2804/3560 Training loss: 1.5581 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2805/3560 Training loss: 1.5581 0.0554 sec/batch\n", + "Epoch 16/20 Iteration 2806/3560 Training loss: 1.5582 0.0509 sec/batch\n", + "Epoch 16/20 Iteration 2807/3560 Training loss: 1.5584 0.0543 sec/batch\n", + "Epoch 16/20 Iteration 2808/3560 Training loss: 1.5585 0.0602 sec/batch\n", + "Epoch 16/20 Iteration 2809/3560 Training loss: 1.5585 0.0543 sec/batch\n", + "Epoch 16/20 Iteration 2810/3560 Training loss: 1.5585 0.0565 sec/batch\n", + "Epoch 16/20 Iteration 2811/3560 Training loss: 1.5589 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2812/3560 Training loss: 1.5589 0.0527 sec/batch\n", + "Epoch 16/20 Iteration 2813/3560 Training loss: 1.5588 0.0602 sec/batch\n", + "Epoch 16/20 Iteration 2814/3560 Training loss: 1.5589 0.0526 sec/batch\n", + "Epoch 16/20 Iteration 2815/3560 Training loss: 1.5588 0.0513 sec/batch\n", + "Epoch 16/20 Iteration 2816/3560 Training loss: 1.5589 0.0523 sec/batch\n", + "Epoch 16/20 Iteration 2817/3560 Training loss: 1.5589 0.0558 sec/batch\n", + "Epoch 16/20 Iteration 2818/3560 Training loss: 1.5591 0.0530 sec/batch\n", + "Epoch 16/20 Iteration 2819/3560 Training loss: 1.5592 0.0508 sec/batch\n", + "Epoch 16/20 Iteration 2820/3560 Training loss: 1.5591 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2821/3560 Training loss: 1.5588 0.0515 sec/batch\n", + "Epoch 16/20 Iteration 2822/3560 Training loss: 1.5588 0.0534 sec/batch\n", + "Epoch 16/20 Iteration 2823/3560 Training loss: 1.5589 0.0547 sec/batch\n", + "Epoch 16/20 Iteration 2824/3560 Training loss: 1.5590 0.0514 sec/batch\n", + "Epoch 16/20 Iteration 2825/3560 Training loss: 1.5590 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2826/3560 Training loss: 1.5589 0.0539 sec/batch\n", + "Epoch 16/20 Iteration 2827/3560 
Training loss: 1.5590 0.0571 sec/batch\n", + "Epoch 16/20 Iteration 2828/3560 Training loss: 1.5590 0.0576 sec/batch\n", + "Epoch 16/20 Iteration 2829/3560 Training loss: 1.5588 0.0531 sec/batch\n", + "Epoch 16/20 Iteration 2830/3560 Training loss: 1.5588 0.0523 sec/batch\n", + "Epoch 16/20 Iteration 2831/3560 Training loss: 1.5590 0.0512 sec/batch\n", + "Epoch 16/20 Iteration 2832/3560 Training loss: 1.5590 0.0522 sec/batch\n", + "Epoch 16/20 Iteration 2833/3560 Training loss: 1.5590 0.0549 sec/batch\n", + "Epoch 16/20 Iteration 2834/3560 Training loss: 1.5590 0.0572 sec/batch\n", + "Epoch 16/20 Iteration 2835/3560 Training loss: 1.5590 0.0522 sec/batch\n", + "Epoch 16/20 Iteration 2836/3560 Training loss: 1.5589 0.0637 sec/batch\n", + "Epoch 16/20 Iteration 2837/3560 Training loss: 1.5590 0.0613 sec/batch\n", + "Epoch 16/20 Iteration 2838/3560 Training loss: 1.5594 0.0524 sec/batch\n", + "Epoch 16/20 Iteration 2839/3560 Training loss: 1.5593 0.0532 sec/batch\n", + "Epoch 16/20 Iteration 2840/3560 Training loss: 1.5593 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2841/3560 Training loss: 1.5593 0.0526 sec/batch\n", + "Epoch 16/20 Iteration 2842/3560 Training loss: 1.5591 0.0560 sec/batch\n", + "Epoch 16/20 Iteration 2843/3560 Training loss: 1.5592 0.0534 sec/batch\n", + "Epoch 16/20 Iteration 2844/3560 Training loss: 1.5592 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2845/3560 Training loss: 1.5593 0.0545 sec/batch\n", + "Epoch 16/20 Iteration 2846/3560 Training loss: 1.5591 0.0537 sec/batch\n", + "Epoch 16/20 Iteration 2847/3560 Training loss: 1.5590 0.0522 sec/batch\n", + "Epoch 16/20 Iteration 2848/3560 Training loss: 1.5591 0.0533 sec/batch\n", + "Epoch 17/20 Iteration 2849/3560 Training loss: 1.6435 0.0510 sec/batch\n", + "Epoch 17/20 Iteration 2850/3560 Training loss: 1.5949 0.0540 sec/batch\n", + "Epoch 17/20 Iteration 2851/3560 Training loss: 1.5822 0.0522 sec/batch\n", + "Epoch 17/20 Iteration 2852/3560 Training loss: 1.5790 0.0537 sec/batch\n", + 
"Epoch 17/20 Iteration 2853/3560 Training loss: 1.5712 0.0524 sec/batch\n", + "Epoch 17/20 Iteration 2854/3560 Training loss: 1.5590 0.0580 sec/batch\n", + "Epoch 17/20 Iteration 2855/3560 Training loss: 1.5598 0.0570 sec/batch\n", + "Epoch 17/20 Iteration 2856/3560 Training loss: 1.5574 0.0555 sec/batch\n", + "Epoch 17/20 Iteration 2857/3560 Training loss: 1.5581 0.0580 sec/batch\n", + "Epoch 17/20 Iteration 2858/3560 Training loss: 1.5564 0.0595 sec/batch\n", + "Epoch 17/20 Iteration 2859/3560 Training loss: 1.5527 0.0543 sec/batch\n", + "Epoch 17/20 Iteration 2860/3560 Training loss: 1.5512 0.0554 sec/batch\n", + "Epoch 17/20 Iteration 2861/3560 Training loss: 1.5512 0.0567 sec/batch\n", + "Epoch 17/20 Iteration 2862/3560 Training loss: 1.5528 0.0528 sec/batch\n", + "Epoch 17/20 Iteration 2863/3560 Training loss: 1.5523 0.0524 sec/batch\n", + "Epoch 17/20 Iteration 2864/3560 Training loss: 1.5501 0.0528 sec/batch\n", + "Epoch 17/20 Iteration 2865/3560 Training loss: 1.5508 0.0599 sec/batch\n", + "Epoch 17/20 Iteration 2866/3560 Training loss: 1.5520 0.0513 sec/batch\n", + "Epoch 17/20 Iteration 2867/3560 Training loss: 1.5524 0.0522 sec/batch\n", + "Epoch 17/20 Iteration 2868/3560 Training loss: 1.5532 0.0549 sec/batch\n", + "Epoch 17/20 Iteration 2869/3560 Training loss: 1.5529 0.0520 sec/batch\n", + "Epoch 17/20 Iteration 2870/3560 Training loss: 1.5531 0.0555 sec/batch\n", + "Epoch 17/20 Iteration 2871/3560 Training loss: 1.5526 0.0518 sec/batch\n", + "Epoch 17/20 Iteration 2872/3560 Training loss: 1.5522 0.0669 sec/batch\n", + "Epoch 17/20 Iteration 2873/3560 Training loss: 1.5523 0.0564 sec/batch\n", + "Epoch 17/20 Iteration 2874/3560 Training loss: 1.5508 0.0527 sec/batch\n", + "Epoch 17/20 Iteration 2875/3560 Training loss: 1.5495 0.0604 sec/batch\n", + "Epoch 17/20 Iteration 2876/3560 Training loss: 1.5503 0.0547 sec/batch\n", + "Epoch 17/20 Iteration 2877/3560 Training loss: 1.5511 0.0599 sec/batch\n", + "Epoch 17/20 Iteration 2878/3560 Training loss: 
1.5515 0.0527 sec/batch\n", + "Epoch 17/20 Iteration 2879/3560 Training loss: 1.5511 0.0558 sec/batch\n", + "Epoch 17/20 Iteration 2880/3560 Training loss: 1.5503 0.0513 sec/batch\n", + "Epoch 17/20 Iteration 2881/3560 Training loss: 1.5506 0.0533 sec/batch\n", + "Epoch 17/20 Iteration 2882/3560 Training loss: 1.5511 0.0547 sec/batch\n", + "Epoch 17/20 Iteration 2883/3560 Training loss: 1.5514 0.0536 sec/batch\n", + "Epoch 17/20 Iteration 2884/3560 Training loss: 1.5514 0.0531 sec/batch\n", + "Epoch 17/20 Iteration 2885/3560 Training loss: 1.5505 0.0575 sec/batch\n", + "Epoch 17/20 Iteration 2886/3560 Training loss: 1.5495 0.0613 sec/batch\n", + "Epoch 17/20 Iteration 2887/3560 Training loss: 1.5481 0.0572 sec/batch\n", + "Epoch 17/20 Iteration 2888/3560 Training loss: 1.5476 0.0522 sec/batch\n", + "Epoch 17/20 Iteration 2889/3560 Training loss: 1.5471 0.0533 sec/batch\n", + "Epoch 17/20 Iteration 2890/3560 Training loss: 1.5478 0.0553 sec/batch\n", + "Epoch 17/20 Iteration 2891/3560 Training loss: 1.5473 0.0527 sec/batch\n", + "Epoch 17/20 Iteration 2892/3560 Training loss: 1.5466 0.0525 sec/batch\n", + "Epoch 17/20 Iteration 2893/3560 Training loss: 1.5468 0.0531 sec/batch\n", + "Epoch 17/20 Iteration 2894/3560 Training loss: 1.5458 0.0630 sec/batch\n", + "Epoch 17/20 Iteration 2895/3560 Training loss: 1.5455 0.0572 sec/batch\n", + "Epoch 17/20 Iteration 2896/3560 Training loss: 1.5452 0.0596 sec/batch\n", + "Epoch 17/20 Iteration 2897/3560 Training loss: 1.5452 0.0549 sec/batch\n", + "Epoch 17/20 Iteration 2898/3560 Training loss: 1.5459 0.0520 sec/batch\n", + "Epoch 17/20 Iteration 2899/3560 Training loss: 1.5455 0.0553 sec/batch\n", + "Epoch 17/20 Iteration 2900/3560 Training loss: 1.5464 0.0541 sec/batch\n", + "Epoch 17/20 Iteration 2901/3560 Training loss: 1.5463 0.0547 sec/batch\n", + "Epoch 17/20 Iteration 2902/3560 Training loss: 1.5465 0.0528 sec/batch\n", + "Epoch 17/20 Iteration 2903/3560 Training loss: 1.5462 0.0518 sec/batch\n", + "Epoch 17/20 
Iteration 2904/3560 Training loss: 1.5464 0.0536 sec/batch\n", + "[... several hundred near-identical log lines condensed: epochs 17-19 run at roughly 0.05-0.07 sec/batch while the training loss drifts down from about 1.55 to 1.51; each new epoch opens near 1.60 before settling back ...]\n", + "Epoch 20/20 Iteration 3383/3560 Training loss: 1.6026 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3415/3560 Training loss: 1.5070 0.0511 sec/batch\n", + "Epoch 20/20 
Iteration 3416/3560 Training loss: 1.5075 0.0537 sec/batch\n", + "Epoch 20/20 Iteration 3417/3560 Training loss: 1.5073 0.0583 sec/batch\n", + "Epoch 20/20 Iteration 3418/3560 Training loss: 1.5072 0.0530 sec/batch\n", + "Epoch 20/20 Iteration 3419/3560 Training loss: 1.5065 0.0541 sec/batch\n", + "Epoch 20/20 Iteration 3420/3560 Training loss: 1.5052 0.0538 sec/batch\n", + "Epoch 20/20 Iteration 3421/3560 Training loss: 1.5040 0.0564 sec/batch\n", + "Epoch 20/20 Iteration 3422/3560 Training loss: 1.5035 0.0521 sec/batch\n", + "Epoch 20/20 Iteration 3423/3560 Training loss: 1.5029 0.0543 sec/batch\n", + "Epoch 20/20 Iteration 3424/3560 Training loss: 1.5036 0.0508 sec/batch\n", + "Epoch 20/20 Iteration 3425/3560 Training loss: 1.5032 0.0534 sec/batch\n", + "Epoch 20/20 Iteration 3426/3560 Training loss: 1.5027 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3427/3560 Training loss: 1.5031 0.0547 sec/batch\n", + "Epoch 20/20 Iteration 3428/3560 Training loss: 1.5022 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3429/3560 Training loss: 1.5019 0.0532 sec/batch\n", + "Epoch 20/20 Iteration 3430/3560 Training loss: 1.5016 0.0616 sec/batch\n", + "Epoch 20/20 Iteration 3431/3560 Training loss: 1.5017 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3432/3560 Training loss: 1.5024 0.0532 sec/batch\n", + "Epoch 20/20 Iteration 3433/3560 Training loss: 1.5019 0.0534 sec/batch\n", + "Epoch 20/20 Iteration 3434/3560 Training loss: 1.5027 0.0521 sec/batch\n", + "Epoch 20/20 Iteration 3435/3560 Training loss: 1.5027 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3436/3560 Training loss: 1.5029 0.0564 sec/batch\n", + "Epoch 20/20 Iteration 3437/3560 Training loss: 1.5027 0.0557 sec/batch\n", + "Epoch 20/20 Iteration 3438/3560 Training loss: 1.5028 0.0532 sec/batch\n", + "Epoch 20/20 Iteration 3439/3560 Training loss: 1.5033 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3440/3560 Training loss: 1.5030 0.0530 sec/batch\n", + "Epoch 20/20 Iteration 3441/3560 Training loss: 1.5025 0.0546 
sec/batch\n", + "Epoch 20/20 Iteration 3442/3560 Training loss: 1.5031 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3443/3560 Training loss: 1.5031 0.0512 sec/batch\n", + "Epoch 20/20 Iteration 3444/3560 Training loss: 1.5041 0.0569 sec/batch\n", + "Epoch 20/20 Iteration 3445/3560 Training loss: 1.5044 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3446/3560 Training loss: 1.5046 0.0570 sec/batch\n", + "Epoch 20/20 Iteration 3447/3560 Training loss: 1.5045 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3448/3560 Training loss: 1.5047 0.0610 sec/batch\n", + "Epoch 20/20 Iteration 3449/3560 Training loss: 1.5049 0.0555 sec/batch\n", + "Epoch 20/20 Iteration 3450/3560 Training loss: 1.5046 0.0542 sec/batch\n", + "Epoch 20/20 Iteration 3451/3560 Training loss: 1.5046 0.0529 sec/batch\n", + "Epoch 20/20 Iteration 3452/3560 Training loss: 1.5045 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3453/3560 Training loss: 1.5051 0.0542 sec/batch\n", + "Epoch 20/20 Iteration 3454/3560 Training loss: 1.5053 0.0578 sec/batch\n", + "Epoch 20/20 Iteration 3455/3560 Training loss: 1.5059 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3456/3560 Training loss: 1.5056 0.0536 sec/batch\n", + "Epoch 20/20 Iteration 3457/3560 Training loss: 1.5056 0.0568 sec/batch\n", + "Epoch 20/20 Iteration 3458/3560 Training loss: 1.5058 0.0593 sec/batch\n", + "Epoch 20/20 Iteration 3459/3560 Training loss: 1.5057 0.0552 sec/batch\n", + "Epoch 20/20 Iteration 3460/3560 Training loss: 1.5055 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3461/3560 Training loss: 1.5051 0.0517 sec/batch\n", + "Epoch 20/20 Iteration 3462/3560 Training loss: 1.5050 0.0522 sec/batch\n", + "Epoch 20/20 Iteration 3463/3560 Training loss: 1.5045 0.0552 sec/batch\n", + "Epoch 20/20 Iteration 3464/3560 Training loss: 1.5046 0.0541 sec/batch\n", + "Epoch 20/20 Iteration 3465/3560 Training loss: 1.5040 0.0593 sec/batch\n", + "Epoch 20/20 Iteration 3466/3560 Training loss: 1.5040 0.0543 sec/batch\n", + "Epoch 20/20 Iteration 3467/3560 
Training loss: 1.5037 0.0523 sec/batch\n", + "Epoch 20/20 Iteration 3468/3560 Training loss: 1.5035 0.0634 sec/batch\n", + "Epoch 20/20 Iteration 3469/3560 Training loss: 1.5032 0.0549 sec/batch\n", + "Epoch 20/20 Iteration 3470/3560 Training loss: 1.5029 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3471/3560 Training loss: 1.5025 0.0552 sec/batch\n", + "Epoch 20/20 Iteration 3472/3560 Training loss: 1.5025 0.0529 sec/batch\n", + "Epoch 20/20 Iteration 3473/3560 Training loss: 1.5023 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3474/3560 Training loss: 1.5021 0.0545 sec/batch\n", + "Epoch 20/20 Iteration 3475/3560 Training loss: 1.5018 0.0592 sec/batch\n", + "Epoch 20/20 Iteration 3476/3560 Training loss: 1.5015 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3477/3560 Training loss: 1.5012 0.0524 sec/batch\n", + "Epoch 20/20 Iteration 3478/3560 Training loss: 1.5013 0.0543 sec/batch\n", + "Epoch 20/20 Iteration 3479/3560 Training loss: 1.5013 0.0534 sec/batch\n", + "Epoch 20/20 Iteration 3480/3560 Training loss: 1.5008 0.0525 sec/batch\n", + "Epoch 20/20 Iteration 3481/3560 Training loss: 1.5005 0.0578 sec/batch\n", + "Epoch 20/20 Iteration 3482/3560 Training loss: 1.5001 0.0551 sec/batch\n", + "Epoch 20/20 Iteration 3483/3560 Training loss: 1.5001 0.0652 sec/batch\n", + "Epoch 20/20 Iteration 3484/3560 Training loss: 1.4999 0.0540 sec/batch\n", + "Epoch 20/20 Iteration 3485/3560 Training loss: 1.4999 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3486/3560 Training loss: 1.4998 0.0520 sec/batch\n", + "Epoch 20/20 Iteration 3487/3560 Training loss: 1.4996 0.0545 sec/batch\n", + "Epoch 20/20 Iteration 3488/3560 Training loss: 1.4995 0.0504 sec/batch\n", + "Epoch 20/20 Iteration 3489/3560 Training loss: 1.4994 0.0620 sec/batch\n", + "Epoch 20/20 Iteration 3490/3560 Training loss: 1.4994 0.0575 sec/batch\n", + "Epoch 20/20 Iteration 3491/3560 Training loss: 1.4994 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3492/3560 Training loss: 1.4994 0.0546 sec/batch\n", + 
"Epoch 20/20 Iteration 3493/3560 Training loss: 1.4992 0.0530 sec/batch\n", + "Epoch 20/20 Iteration 3494/3560 Training loss: 1.4992 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3495/3560 Training loss: 1.4990 0.0563 sec/batch\n", + "Epoch 20/20 Iteration 3496/3560 Training loss: 1.4990 0.0525 sec/batch\n", + "Epoch 20/20 Iteration 3497/3560 Training loss: 1.4987 0.0567 sec/batch\n", + "Epoch 20/20 Iteration 3498/3560 Training loss: 1.4983 0.0601 sec/batch\n", + "Epoch 20/20 Iteration 3499/3560 Training loss: 1.4983 0.0525 sec/batch\n", + "Epoch 20/20 Iteration 3500/3560 Training loss: 1.4984 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3501/3560 Training loss: 1.4983 0.0546 sec/batch\n", + "Epoch 20/20 Iteration 3502/3560 Training loss: 1.4982 0.0539 sec/batch\n", + "Epoch 20/20 Iteration 3503/3560 Training loss: 1.4980 0.0539 sec/batch\n", + "Epoch 20/20 Iteration 3504/3560 Training loss: 1.4978 0.0537 sec/batch\n", + "Epoch 20/20 Iteration 3505/3560 Training loss: 1.4975 0.0532 sec/batch\n", + "Epoch 20/20 Iteration 3506/3560 Training loss: 1.4975 0.0584 sec/batch\n", + "Epoch 20/20 Iteration 3507/3560 Training loss: 1.4974 0.0624 sec/batch\n", + "Epoch 20/20 Iteration 3508/3560 Training loss: 1.4970 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3509/3560 Training loss: 1.4971 0.0556 sec/batch\n", + "Epoch 20/20 Iteration 3510/3560 Training loss: 1.4972 0.0630 sec/batch\n", + "Epoch 20/20 Iteration 3511/3560 Training loss: 1.4972 0.0544 sec/batch\n", + "Epoch 20/20 Iteration 3512/3560 Training loss: 1.4970 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3513/3560 Training loss: 1.4966 0.0581 sec/batch\n", + "Epoch 20/20 Iteration 3514/3560 Training loss: 1.4964 0.0561 sec/batch\n", + "Epoch 20/20 Iteration 3515/3560 Training loss: 1.4965 0.0637 sec/batch\n", + "Epoch 20/20 Iteration 3516/3560 Training loss: 1.4965 0.0617 sec/batch\n", + "Epoch 20/20 Iteration 3517/3560 Training loss: 1.4965 0.0540 sec/batch\n", + "Epoch 20/20 Iteration 3518/3560 Training loss: 
1.4966 0.0540 sec/batch\n", + "Epoch 20/20 Iteration 3519/3560 Training loss: 1.4968 0.0538 sec/batch\n", + "Epoch 20/20 Iteration 3520/3560 Training loss: 1.4969 0.0529 sec/batch\n", + "Epoch 20/20 Iteration 3521/3560 Training loss: 1.4970 0.0578 sec/batch\n", + "Epoch 20/20 Iteration 3522/3560 Training loss: 1.4968 0.0544 sec/batch\n", + "Epoch 20/20 Iteration 3523/3560 Training loss: 1.4972 0.0540 sec/batch\n", + "Epoch 20/20 Iteration 3524/3560 Training loss: 1.4972 0.0616 sec/batch\n", + "Epoch 20/20 Iteration 3525/3560 Training loss: 1.4971 0.0534 sec/batch\n", + "Epoch 20/20 Iteration 3526/3560 Training loss: 1.4973 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3527/3560 Training loss: 1.4973 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3528/3560 Training loss: 1.4974 0.0599 sec/batch\n", + "Epoch 20/20 Iteration 3529/3560 Training loss: 1.4974 0.0536 sec/batch\n", + "Epoch 20/20 Iteration 3530/3560 Training loss: 1.4977 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3531/3560 Training loss: 1.4978 0.0529 sec/batch\n", + "Epoch 20/20 Iteration 3532/3560 Training loss: 1.4978 0.0538 sec/batch\n", + "Epoch 20/20 Iteration 3533/3560 Training loss: 1.4975 0.0530 sec/batch\n", + "Epoch 20/20 Iteration 3534/3560 Training loss: 1.4975 0.0582 sec/batch\n", + "Epoch 20/20 Iteration 3535/3560 Training loss: 1.4976 0.0541 sec/batch\n", + "Epoch 20/20 Iteration 3536/3560 Training loss: 1.4976 0.0536 sec/batch\n", + "Epoch 20/20 Iteration 3537/3560 Training loss: 1.4977 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3538/3560 Training loss: 1.4976 0.0555 sec/batch\n", + "Epoch 20/20 Iteration 3539/3560 Training loss: 1.4977 0.0553 sec/batch\n", + "Epoch 20/20 Iteration 3540/3560 Training loss: 1.4976 0.0531 sec/batch\n", + "Epoch 20/20 Iteration 3541/3560 Training loss: 1.4974 0.0539 sec/batch\n", + "Epoch 20/20 Iteration 3542/3560 Training loss: 1.4975 0.0521 sec/batch\n", + "Epoch 20/20 Iteration 3543/3560 Training loss: 1.4977 0.0515 sec/batch\n", + "Epoch 20/20 
Iteration 3544/3560 Training loss: 1.4977 0.0537 sec/batch\n", + "Epoch 20/20 Iteration 3545/3560 Training loss: 1.4977 0.0540 sec/batch\n", + "Epoch 20/20 Iteration 3546/3560 Training loss: 1.4978 0.0540 sec/batch\n", + "Epoch 20/20 Iteration 3547/3560 Training loss: 1.4978 0.0522 sec/batch\n", + "Epoch 20/20 Iteration 3548/3560 Training loss: 1.4978 0.0512 sec/batch\n", + "Epoch 20/20 Iteration 3549/3560 Training loss: 1.4979 0.0544 sec/batch\n", + "Epoch 20/20 Iteration 3550/3560 Training loss: 1.4984 0.0579 sec/batch\n", + "Epoch 20/20 Iteration 3551/3560 Training loss: 1.4984 0.0587 sec/batch\n", + "Epoch 20/20 Iteration 3552/3560 Training loss: 1.4984 0.0557 sec/batch\n", + "Epoch 20/20 Iteration 3553/3560 Training loss: 1.4983 0.0532 sec/batch\n", + "Epoch 20/20 Iteration 3554/3560 Training loss: 1.4982 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3555/3560 Training loss: 1.4983 0.0581 sec/batch\n", + "Epoch 20/20 Iteration 3556/3560 Training loss: 1.4983 0.0529 sec/batch\n", + "Epoch 20/20 Iteration 3557/3560 Training loss: 1.4983 0.0537 sec/batch\n", + "Epoch 20/20 Iteration 3558/3560 Training loss: 1.4982 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3559/3560 Training loss: 1.4981 0.0511 sec/batch\n", + "Epoch 20/20 Iteration 3560/3560 Training loss: 1.4982 0.0517 sec/batch\n", + "Epoch 1/20 Iteration 1/3560 Training loss: 4.4221 0.6205 sec/batch\n", + "Epoch 1/20 Iteration 2/3560 Training loss: 4.4156 0.0355 sec/batch\n", + "Epoch 1/20 Iteration 3/3560 Training loss: 4.4090 0.0322 sec/batch\n", + "Epoch 1/20 Iteration 4/3560 Training loss: 4.4019 0.0325 sec/batch\n", + "Epoch 1/20 Iteration 5/3560 Training loss: 4.3937 0.0331 sec/batch\n", + "Epoch 1/20 Iteration 6/3560 Training loss: 4.3838 0.0338 sec/batch\n", + "Epoch 1/20 Iteration 7/3560 Training loss: 4.3704 0.0339 sec/batch\n", + "Epoch 1/20 Iteration 8/3560 Training loss: 4.3498 0.0363 sec/batch\n", + "Epoch 1/20 Iteration 9/3560 Training loss: 4.3109 0.0455 sec/batch\n", + "Epoch 1/20 
Iteration 10/3560 Training loss: 4.2520 0.0494 sec/batch\n", + "Epoch 1/20 Iteration 11/3560 Training loss: 4.1924 0.0498 sec/batch\n", + "Epoch 1/20 Iteration 12/3560 Training loss: 4.1394 0.0457 sec/batch\n", + "Epoch 1/20 Iteration 13/3560 Training loss: 4.0935 0.0581 sec/batch\n", + "Epoch 1/20 Iteration 14/3560 Training loss: 4.0527 0.0484 sec/batch\n", + "Epoch 1/20 Iteration 15/3560 Training loss: 4.0141 0.0507 sec/batch\n", + "Epoch 1/20 Iteration 16/3560 Training loss: 3.9791 0.0490 sec/batch\n", + "Epoch 1/20 Iteration 17/3560 Training loss: 3.9466 0.0475 sec/batch\n", + "Epoch 1/20 Iteration 18/3560 Training loss: 3.9180 0.0471 sec/batch\n", + "Epoch 1/20 Iteration 19/3560 Training loss: 3.8910 0.0473 sec/batch\n", + "Epoch 1/20 Iteration 20/3560 Training loss: 3.8645 0.0472 sec/batch\n", + "Epoch 1/20 Iteration 21/3560 Training loss: 3.8408 0.0467 sec/batch\n", + "Epoch 1/20 Iteration 22/3560 Training loss: 3.8190 0.0472 sec/batch\n", + "Epoch 1/20 Iteration 23/3560 Training loss: 3.7981 0.0470 sec/batch\n", + "Epoch 1/20 Iteration 24/3560 Training loss: 3.7788 0.0461 sec/batch\n", + "Epoch 1/20 Iteration 25/3560 Training loss: 3.7603 0.0527 sec/batch\n", + "Epoch 1/20 Iteration 26/3560 Training loss: 3.7430 0.0488 sec/batch\n", + "Epoch 1/20 Iteration 27/3560 Training loss: 3.7272 0.0514 sec/batch\n", + "Epoch 1/20 Iteration 28/3560 Training loss: 3.7113 0.0479 sec/batch\n", + "Epoch 1/20 Iteration 29/3560 Training loss: 3.6962 0.0499 sec/batch\n", + "Epoch 1/20 Iteration 30/3560 Training loss: 3.6823 0.0478 sec/batch\n", + "Epoch 1/20 Iteration 31/3560 Training loss: 3.6701 0.0465 sec/batch\n", + "Epoch 1/20 Iteration 32/3560 Training loss: 3.6576 0.0459 sec/batch\n", + "Epoch 1/20 Iteration 33/3560 Training loss: 3.6455 0.0488 sec/batch\n", + "Epoch 1/20 Iteration 34/3560 Training loss: 3.6343 0.0482 sec/batch\n", + "Epoch 1/20 Iteration 35/3560 Training loss: 3.6235 0.0448 sec/batch\n", + "Epoch 1/20 Iteration 36/3560 Training loss: 3.6135 0.0463 
sec/batch\n", + "Epoch 1/20 Iteration 37/3560 Training loss: 3.6030 0.0467 sec/batch\n", + "Epoch 1/20 Iteration 38/3560 Training loss: 3.5932 0.0541 sec/batch\n", + "Epoch 1/20 Iteration 39/3560 Training loss: 3.5837 0.0470 sec/batch\n", + "Epoch 1/20 Iteration 40/3560 Training loss: 3.5748 0.0462 sec/batch\n", + "Epoch 1/20 Iteration 41/3560 Training loss: 3.5660 0.0447 sec/batch\n", + "Epoch 1/20 Iteration 42/3560 Training loss: 3.5579 0.0527 sec/batch\n", + "Epoch 1/20 Iteration 43/3560 Training loss: 3.5498 0.0459 sec/batch\n", + "Epoch 1/20 Iteration 44/3560 Training loss: 3.5419 0.0461 sec/batch\n", + "Epoch 1/20 Iteration 45/3560 Training loss: 3.5343 0.0507 sec/batch\n", + "Epoch 1/20 Iteration 46/3560 Training loss: 3.5275 0.0452 sec/batch\n", + "Epoch 1/20 Iteration 47/3560 Training loss: 3.5209 0.0461 sec/batch\n", + "Epoch 1/20 Iteration 48/3560 Training loss: 3.5145 0.0456 sec/batch\n", + "Epoch 1/20 Iteration 49/3560 Training loss: 3.5085 0.0472 sec/batch\n", + "Epoch 1/20 Iteration 50/3560 Training loss: 3.5026 0.0571 sec/batch\n", + "Epoch 1/20 Iteration 51/3560 Training loss: 3.4966 0.0485 sec/batch\n", + "Epoch 1/20 Iteration 52/3560 Training loss: 3.4907 0.0515 sec/batch\n", + "Epoch 1/20 Iteration 53/3560 Training loss: 3.4852 0.0493 sec/batch\n", + "Epoch 1/20 Iteration 54/3560 Training loss: 3.4799 0.0481 sec/batch\n", + "Epoch 1/20 Iteration 55/3560 Training loss: 3.4747 0.0549 sec/batch\n", + "Epoch 1/20 Iteration 56/3560 Training loss: 3.4693 0.0485 sec/batch\n", + "Epoch 1/20 Iteration 57/3560 Training loss: 3.4643 0.0465 sec/batch\n", + "Epoch 1/20 Iteration 58/3560 Training loss: 3.4595 0.0463 sec/batch\n", + "Epoch 1/20 Iteration 59/3560 Training loss: 3.4546 0.0499 sec/batch\n", + "Epoch 1/20 Iteration 60/3560 Training loss: 3.4500 0.0453 sec/batch\n", + "Epoch 1/20 Iteration 61/3560 Training loss: 3.4457 0.0463 sec/batch\n", + "Epoch 1/20 Iteration 62/3560 Training loss: 3.4418 0.0467 sec/batch\n", + "Epoch 1/20 Iteration 63/3560 
Training loss: 3.4379 0.0524 sec/batch\n", + "Epoch 1/20 Iteration 64/3560 Training loss: 3.4334 0.0495 sec/batch\n", + "Epoch 1/20 Iteration 65/3560 Training loss: 3.4292 0.0517 sec/batch\n", + "Epoch 1/20 Iteration 66/3560 Training loss: 3.4255 0.0463 sec/batch\n", + "Epoch 1/20 Iteration 67/3560 Training loss: 3.4218 0.0477 sec/batch\n", + "Epoch 1/20 Iteration 68/3560 Training loss: 3.4175 0.0511 sec/batch\n", + "Epoch 1/20 Iteration 69/3560 Training loss: 3.4136 0.0481 sec/batch\n", + "Epoch 1/20 Iteration 70/3560 Training loss: 3.4102 0.0465 sec/batch\n", + "Epoch 1/20 Iteration 71/3560 Training loss: 3.4068 0.0474 sec/batch\n", + "Epoch 1/20 Iteration 72/3560 Training loss: 3.4036 0.0509 sec/batch\n", + "Epoch 1/20 Iteration 73/3560 Training loss: 3.4003 0.0484 sec/batch\n", + "Epoch 1/20 Iteration 74/3560 Training loss: 3.3970 0.0500 sec/batch\n", + "Epoch 1/20 Iteration 75/3560 Training loss: 3.3939 0.0471 sec/batch\n", + "Epoch 1/20 Iteration 76/3560 Training loss: 3.3910 0.0460 sec/batch\n", + "Epoch 1/20 Iteration 77/3560 Training loss: 3.3880 0.0456 sec/batch\n", + "Epoch 1/20 Iteration 78/3560 Training loss: 3.3851 0.0515 sec/batch\n", + "Epoch 1/20 Iteration 79/3560 Training loss: 3.3821 0.0466 sec/batch\n", + "Epoch 1/20 Iteration 80/3560 Training loss: 3.3791 0.0464 sec/batch\n", + "Epoch 1/20 Iteration 81/3560 Training loss: 3.3762 0.0441 sec/batch\n", + "Epoch 1/20 Iteration 82/3560 Training loss: 3.3736 0.0480 sec/batch\n", + "Epoch 1/20 Iteration 83/3560 Training loss: 3.3710 0.0446 sec/batch\n", + "Epoch 1/20 Iteration 84/3560 Training loss: 3.3684 0.0525 sec/batch\n", + "Epoch 1/20 Iteration 85/3560 Training loss: 3.3656 0.0511 sec/batch\n", + "Epoch 1/20 Iteration 86/3560 Training loss: 3.3630 0.0443 sec/batch\n", + "Epoch 1/20 Iteration 87/3560 Training loss: 3.3603 0.0468 sec/batch\n", + "Epoch 1/20 Iteration 88/3560 Training loss: 3.3578 0.0486 sec/batch\n", + "Epoch 1/20 Iteration 89/3560 Training loss: 3.3554 0.0472 sec/batch\n", + 
"Epoch 1/20 Iteration 90/3560 Training loss: 3.3531 0.0471 sec/batch\n", + "Epoch 1/20 Iteration 91/3560 Training loss: 3.3508 0.0474 sec/batch\n", + "Epoch 1/20 Iteration 92/3560 Training loss: 3.3484 0.0518 sec/batch\n", + "Epoch 1/20 Iteration 93/3560 Training loss: 3.3463 0.0477 sec/batch\n", + "Epoch 1/20 Iteration 94/3560 Training loss: 3.3441 0.0475 sec/batch\n", + "Epoch 1/20 Iteration 95/3560 Training loss: 3.3419 0.0462 sec/batch\n", + "Epoch 1/20 Iteration 96/3560 Training loss: 3.3397 0.0484 sec/batch\n", + "Epoch 1/20 Iteration 97/3560 Training loss: 3.3377 0.0460 sec/batch\n", + "Epoch 1/20 Iteration 98/3560 Training loss: 3.3354 0.0485 sec/batch\n", + "Epoch 1/20 Iteration 99/3560 Training loss: 3.3334 0.0458 sec/batch\n", + "Epoch 1/20 Iteration 100/3560 Training loss: 3.3314 0.0504 sec/batch\n", + "Epoch 1/20 Iteration 101/3560 Training loss: 3.3294 0.0622 sec/batch\n", + "Epoch 1/20 Iteration 102/3560 Training loss: 3.3275 0.0551 sec/batch\n", + "Epoch 1/20 Iteration 103/3560 Training loss: 3.3256 0.0461 sec/batch\n", + "Epoch 1/20 Iteration 104/3560 Training loss: 3.3236 0.0507 sec/batch\n", + "Epoch 1/20 Iteration 105/3560 Training loss: 3.3216 0.0439 sec/batch\n", + "Epoch 1/20 Iteration 106/3560 Training loss: 3.3198 0.0466 sec/batch\n", + "Epoch 1/20 Iteration 107/3560 Training loss: 3.3177 0.0541 sec/batch\n", + "Epoch 1/20 Iteration 108/3560 Training loss: 3.3158 0.0484 sec/batch\n", + "Epoch 1/20 Iteration 109/3560 Training loss: 3.3140 0.0476 sec/batch\n", + "Epoch 1/20 Iteration 110/3560 Training loss: 3.3119 0.0469 sec/batch\n", + "Epoch 1/20 Iteration 111/3560 Training loss: 3.3101 0.0489 sec/batch\n", + "Epoch 1/20 Iteration 112/3560 Training loss: 3.3083 0.0497 sec/batch\n", + "Epoch 1/20 Iteration 113/3560 Training loss: 3.3065 0.0466 sec/batch\n", + "Epoch 1/20 Iteration 114/3560 Training loss: 3.3046 0.0491 sec/batch\n", + "Epoch 1/20 Iteration 115/3560 Training loss: 3.3028 0.0471 sec/batch\n", + "Epoch 1/20 Iteration 116/3560 
Training loss: 3.3010 0.0463 sec/batch\n", + "Epoch 1/20 Iteration 117/3560 Training loss: 3.2992 0.0471 sec/batch\n", + "Epoch 1/20 Iteration 118/3560 Training loss: 3.2977 0.0490 sec/batch\n", + "Epoch 1/20 Iteration 119/3560 Training loss: 3.2961 0.0462 sec/batch\n", + "Epoch 1/20 Iteration 120/3560 Training loss: 3.2944 0.0450 sec/batch\n", + "Epoch 1/20 Iteration 121/3560 Training loss: 3.2930 0.0448 sec/batch\n", + "Epoch 1/20 Iteration 122/3560 Training loss: 3.2914 0.0449 sec/batch\n", + "Epoch 1/20 Iteration 123/3560 Training loss: 3.2899 0.0455 sec/batch\n", + "Epoch 1/20 Iteration 124/3560 Training loss: 3.2885 0.0450 sec/batch\n", + "Epoch 1/20 Iteration 125/3560 Training loss: 3.2868 0.0480 sec/batch\n", + "Epoch 1/20 Iteration 126/3560 Training loss: 3.2850 0.0520 sec/batch\n", + "Epoch 1/20 Iteration 127/3560 Training loss: 3.2835 0.0460 sec/batch\n", + "Epoch 1/20 Iteration 128/3560 Training loss: 3.2820 0.0450 sec/batch\n", + "Epoch 1/20 Iteration 129/3560 Training loss: 3.2804 0.0465 sec/batch\n", + "Epoch 1/20 Iteration 130/3560 Training loss: 3.2789 0.0461 sec/batch\n", + "Epoch 1/20 Iteration 131/3560 Training loss: 3.2775 0.0471 sec/batch\n", + "Epoch 1/20 Iteration 132/3560 Training loss: 3.2760 0.0467 sec/batch\n", + "Epoch 1/20 Iteration 133/3560 Training loss: 3.2745 0.0461 sec/batch\n", + "Epoch 1/20 Iteration 134/3560 Training loss: 3.2730 0.0481 sec/batch\n", + "Epoch 1/20 Iteration 135/3560 Training loss: 3.2712 0.0464 sec/batch\n", + "Epoch 1/20 Iteration 136/3560 Training loss: 3.2696 0.0457 sec/batch\n", + "Epoch 1/20 Iteration 137/3560 Training loss: 3.2681 0.0502 sec/batch\n", + "Epoch 1/20 Iteration 138/3560 Training loss: 3.2666 0.0465 sec/batch\n", + "Epoch 1/20 Iteration 139/3560 Training loss: 3.2652 0.0463 sec/batch\n", + "Epoch 1/20 Iteration 140/3560 Training loss: 3.2637 0.0487 sec/batch\n", + "Epoch 1/20 Iteration 141/3560 Training loss: 3.2623 0.0502 sec/batch\n", + "Epoch 1/20 Iteration 142/3560 Training loss: 3.2607 
0.0445 sec/batch\n", + "Epoch 1/20 Iteration 143/3560 Training loss: 3.2591 0.0500 sec/batch\n", + "Epoch 1/20 Iteration 144/3560 Training loss: 3.2576 0.0487 sec/batch\n", + "Epoch 1/20 Iteration 145/3560 Training loss: 3.2562 0.0486 sec/batch\n", + "Epoch 1/20 Iteration 146/3560 Training loss: 3.2548 0.0466 sec/batch\n", + "Epoch 1/20 Iteration 147/3560 Training loss: 3.2535 0.0463 sec/batch\n", + "Epoch 1/20 Iteration 148/3560 Training loss: 3.2522 0.0453 sec/batch\n", + "Epoch 1/20 Iteration 149/3560 Training loss: 3.2508 0.0496 sec/batch\n", + "Epoch 1/20 Iteration 150/3560 Training loss: 3.2494 0.0470 sec/batch\n", + "Epoch 1/20 Iteration 151/3560 Training loss: 3.2482 0.0491 sec/batch\n", + "Epoch 1/20 Iteration 152/3560 Training loss: 3.2470 0.0466 sec/batch\n", + "Epoch 1/20 Iteration 153/3560 Training loss: 3.2457 0.0498 sec/batch\n", + "Epoch 1/20 Iteration 154/3560 Training loss: 3.2444 0.0531 sec/batch\n", + "Epoch 1/20 Iteration 155/3560 Training loss: 3.2430 0.0470 sec/batch\n", + "Epoch 1/20 Iteration 156/3560 Training loss: 3.2416 0.0513 sec/batch\n", + "Epoch 1/20 Iteration 157/3560 Training loss: 3.2402 0.0445 sec/batch\n", + "Epoch 1/20 Iteration 158/3560 Training loss: 3.2388 0.0473 sec/batch\n", + "Epoch 1/20 Iteration 159/3560 Training loss: 3.2373 0.0466 sec/batch\n", + "Epoch 1/20 Iteration 160/3560 Training loss: 3.2359 0.0487 sec/batch\n", + "Epoch 1/20 Iteration 161/3560 Training loss: 3.2345 0.0471 sec/batch\n", + "Epoch 1/20 Iteration 162/3560 Training loss: 3.2329 0.0474 sec/batch\n", + "Epoch 1/20 Iteration 163/3560 Training loss: 3.2314 0.0476 sec/batch\n", + "Epoch 1/20 Iteration 164/3560 Training loss: 3.2300 0.0451 sec/batch\n", + "Epoch 1/20 Iteration 165/3560 Training loss: 3.2287 0.0478 sec/batch\n", + "Epoch 1/20 Iteration 166/3560 Training loss: 3.2273 0.0471 sec/batch\n", + "Epoch 1/20 Iteration 167/3560 Training loss: 3.2259 0.0477 sec/batch\n", + "Epoch 1/20 Iteration 168/3560 Training loss: 3.2245 0.0461 sec/batch\n", + 
"Epoch 1/20 Iteration 169/3560 Training loss: 3.2232 0.0462 sec/batch\n", + "Epoch 1/20 Iteration 170/3560 Training loss: 3.2217 0.0456 sec/batch\n", + "Epoch 1/20 Iteration 171/3560 Training loss: 3.2204 0.0456 sec/batch\n", + "Epoch 1/20 Iteration 172/3560 Training loss: 3.2192 0.0500 sec/batch\n", + "Epoch 1/20 Iteration 173/3560 Training loss: 3.2181 0.0458 sec/batch\n", + "Epoch 1/20 Iteration 174/3560 Training loss: 3.2170 0.0465 sec/batch\n", + "Epoch 1/20 Iteration 175/3560 Training loss: 3.2157 0.0468 sec/batch\n", + "Epoch 1/20 Iteration 176/3560 Training loss: 3.2145 0.0475 sec/batch\n", + "Epoch 1/20 Iteration 177/3560 Training loss: 3.2131 0.0507 sec/batch\n", + "Epoch 1/20 Iteration 178/3560 Training loss: 3.2116 0.0484 sec/batch\n", + "Epoch 2/20 Iteration 179/3560 Training loss: 3.0621 0.0457 sec/batch\n", + "Epoch 2/20 Iteration 180/3560 Training loss: 3.0001 0.0487 sec/batch\n", + "Epoch 2/20 Iteration 181/3560 Training loss: 2.9801 0.0487 sec/batch\n", + "Epoch 2/20 Iteration 182/3560 Training loss: 2.9704 0.0500 sec/batch\n", + "Epoch 2/20 Iteration 183/3560 Training loss: 2.9670 0.0527 sec/batch\n", + "Epoch 2/20 Iteration 184/3560 Training loss: 2.9637 0.0488 sec/batch\n", + "Epoch 2/20 Iteration 185/3560 Training loss: 2.9630 0.0480 sec/batch\n", + "Epoch 2/20 Iteration 186/3560 Training loss: 2.9614 0.0455 sec/batch\n", + "Epoch 2/20 Iteration 187/3560 Training loss: 2.9588 0.0492 sec/batch\n", + "Epoch 2/20 Iteration 188/3560 Training loss: 2.9565 0.0469 sec/batch\n", + "Epoch 2/20 Iteration 189/3560 Training loss: 2.9529 0.0474 sec/batch\n", + "Epoch 2/20 Iteration 190/3560 Training loss: 2.9511 0.0508 sec/batch\n", + "Epoch 2/20 Iteration 191/3560 Training loss: 2.9489 0.0449 sec/batch\n", + "Epoch 2/20 Iteration 192/3560 Training loss: 2.9481 0.0487 sec/batch\n", + "Epoch 2/20 Iteration 193/3560 Training loss: 2.9464 0.0455 sec/batch\n", + "Epoch 2/20 Iteration 194/3560 Training loss: 2.9446 0.0505 sec/batch\n", + "Epoch 2/20 Iteration 
195/3560 Training loss: 2.9428 0.0526 sec/batch\n", + "Epoch 2/20 Iteration 196/3560 Training loss: 2.9428 0.0478 sec/batch\n", + "[... per-iteration output elided: training loss decreases steadily from 2.94 to 2.70 over the remainder of Epoch 2 ...]\n", + "Epoch 2/20 Iteration 356/3560 Training loss: 2.7005 0.0581 sec/batch\n", + "Epoch 3/20 Iteration 357/3560 Training loss: 2.6190 0.0510 sec/batch\n", + "[... per-iteration output elided: training loss decreases from 2.62 to 2.40 over Epoch 3 ...]\n", + "Epoch 3/20 Iteration 534/3560 Training loss: 2.4017 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 535/3560 Training loss: 2.4200 0.0500 sec/batch\n", + "[... per-iteration output elided: training loss decreases from 2.42 to 2.27 over Epoch 4 ...]\n", + "Epoch 4/20 Iteration 694/3560 Training loss: 2.2724
0.0503 sec/batch\n", + "Epoch 4/20 Iteration 695/3560 Training loss: 2.2722 0.0497 sec/batch\n", + "Epoch 4/20 Iteration 696/3560 Training loss: 2.2719 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 697/3560 Training loss: 2.2716 0.0498 sec/batch\n", + "Epoch 4/20 Iteration 698/3560 Training loss: 2.2714 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 699/3560 Training loss: 2.2713 0.0489 sec/batch\n", + "Epoch 4/20 Iteration 700/3560 Training loss: 2.2710 0.0511 sec/batch\n", + "Epoch 4/20 Iteration 701/3560 Training loss: 2.2708 0.0493 sec/batch\n", + "Epoch 4/20 Iteration 702/3560 Training loss: 2.2707 0.0504 sec/batch\n", + "Epoch 4/20 Iteration 703/3560 Training loss: 2.2706 0.0493 sec/batch\n", + "Epoch 4/20 Iteration 704/3560 Training loss: 2.2703 0.0483 sec/batch\n", + "Epoch 4/20 Iteration 705/3560 Training loss: 2.2700 0.0592 sec/batch\n", + "Epoch 4/20 Iteration 706/3560 Training loss: 2.2697 0.0495 sec/batch\n", + "Epoch 4/20 Iteration 707/3560 Training loss: 2.2695 0.0492 sec/batch\n", + "Epoch 4/20 Iteration 708/3560 Training loss: 2.2694 0.0490 sec/batch\n", + "Epoch 4/20 Iteration 709/3560 Training loss: 2.2693 0.0513 sec/batch\n", + "Epoch 4/20 Iteration 710/3560 Training loss: 2.2691 0.0496 sec/batch\n", + "Epoch 4/20 Iteration 711/3560 Training loss: 2.2688 0.0511 sec/batch\n", + "Epoch 4/20 Iteration 712/3560 Training loss: 2.2685 0.0522 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 2.3203 0.0485 sec/batch\n", + "Epoch 5/20 Iteration 714/3560 Training loss: 2.2571 0.0492 sec/batch\n", + "Epoch 5/20 Iteration 715/3560 Training loss: 2.2410 0.0492 sec/batch\n", + "Epoch 5/20 Iteration 716/3560 Training loss: 2.2355 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 717/3560 Training loss: 2.2310 0.0489 sec/batch\n", + "Epoch 5/20 Iteration 718/3560 Training loss: 2.2258 0.0490 sec/batch\n", + "Epoch 5/20 Iteration 719/3560 Training loss: 2.2258 0.0592 sec/batch\n", + "Epoch 5/20 Iteration 720/3560 Training loss: 2.2263 0.0496 sec/batch\n", + 
"Epoch 5/20 Iteration 721/3560 Training loss: 2.2263 0.0517 sec/batch\n", + "Epoch 5/20 Iteration 722/3560 Training loss: 2.2264 0.0514 sec/batch\n", + "Epoch 5/20 Iteration 723/3560 Training loss: 2.2239 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 724/3560 Training loss: 2.2233 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 725/3560 Training loss: 2.2233 0.0495 sec/batch\n", + "Epoch 5/20 Iteration 726/3560 Training loss: 2.2261 0.0496 sec/batch\n", + "Epoch 5/20 Iteration 727/3560 Training loss: 2.2253 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 728/3560 Training loss: 2.2245 0.0532 sec/batch\n", + "Epoch 5/20 Iteration 729/3560 Training loss: 2.2240 0.0559 sec/batch\n", + "Epoch 5/20 Iteration 730/3560 Training loss: 2.2258 0.0484 sec/batch\n", + "Epoch 5/20 Iteration 731/3560 Training loss: 2.2260 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 732/3560 Training loss: 2.2258 0.0505 sec/batch\n", + "Epoch 5/20 Iteration 733/3560 Training loss: 2.2255 0.0572 sec/batch\n", + "Epoch 5/20 Iteration 734/3560 Training loss: 2.2268 0.0536 sec/batch\n", + "Epoch 5/20 Iteration 735/3560 Training loss: 2.2266 0.0487 sec/batch\n", + "Epoch 5/20 Iteration 736/3560 Training loss: 2.2258 0.0495 sec/batch\n", + "Epoch 5/20 Iteration 737/3560 Training loss: 2.2253 0.0484 sec/batch\n", + "Epoch 5/20 Iteration 738/3560 Training loss: 2.2242 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 739/3560 Training loss: 2.2233 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 740/3560 Training loss: 2.2232 0.0484 sec/batch\n", + "Epoch 5/20 Iteration 741/3560 Training loss: 2.2239 0.0490 sec/batch\n", + "Epoch 5/20 Iteration 742/3560 Training loss: 2.2239 0.0521 sec/batch\n", + "Epoch 5/20 Iteration 743/3560 Training loss: 2.2240 0.0487 sec/batch\n", + "Epoch 5/20 Iteration 744/3560 Training loss: 2.2234 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 745/3560 Training loss: 2.2227 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 746/3560 Training loss: 2.2230 0.0583 sec/batch\n", + "Epoch 5/20 Iteration 
747/3560 Training loss: 2.2225 0.0534 sec/batch\n", + "Epoch 5/20 Iteration 748/3560 Training loss: 2.2222 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 749/3560 Training loss: 2.2218 0.0489 sec/batch\n", + "Epoch 5/20 Iteration 750/3560 Training loss: 2.2204 0.0505 sec/batch\n", + "Epoch 5/20 Iteration 751/3560 Training loss: 2.2193 0.0568 sec/batch\n", + "Epoch 5/20 Iteration 752/3560 Training loss: 2.2183 0.0518 sec/batch\n", + "Epoch 5/20 Iteration 753/3560 Training loss: 2.2179 0.0523 sec/batch\n", + "Epoch 5/20 Iteration 754/3560 Training loss: 2.2174 0.0514 sec/batch\n", + "Epoch 5/20 Iteration 755/3560 Training loss: 2.2169 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 756/3560 Training loss: 2.2160 0.0491 sec/batch\n", + "Epoch 5/20 Iteration 757/3560 Training loss: 2.2158 0.0528 sec/batch\n", + "Epoch 5/20 Iteration 758/3560 Training loss: 2.2144 0.0501 sec/batch\n", + "Epoch 5/20 Iteration 759/3560 Training loss: 2.2146 0.0526 sec/batch\n", + "Epoch 5/20 Iteration 760/3560 Training loss: 2.2138 0.0514 sec/batch\n", + "Epoch 5/20 Iteration 761/3560 Training loss: 2.2135 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 762/3560 Training loss: 2.2137 0.0490 sec/batch\n", + "Epoch 5/20 Iteration 763/3560 Training loss: 2.2131 0.0482 sec/batch\n", + "Epoch 5/20 Iteration 764/3560 Training loss: 2.2132 0.0530 sec/batch\n", + "Epoch 5/20 Iteration 765/3560 Training loss: 2.2128 0.0521 sec/batch\n", + "Epoch 5/20 Iteration 766/3560 Training loss: 2.2124 0.0503 sec/batch\n", + "Epoch 5/20 Iteration 767/3560 Training loss: 2.2120 0.0500 sec/batch\n", + "Epoch 5/20 Iteration 768/3560 Training loss: 2.2120 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 769/3560 Training loss: 2.2119 0.0509 sec/batch\n", + "Epoch 5/20 Iteration 770/3560 Training loss: 2.2115 0.0499 sec/batch\n", + "Epoch 5/20 Iteration 771/3560 Training loss: 2.2109 0.0510 sec/batch\n", + "Epoch 5/20 Iteration 772/3560 Training loss: 2.2112 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 773/3560 Training loss: 
2.2109 0.0523 sec/batch\n", + "Epoch 5/20 Iteration 774/3560 Training loss: 2.2110 0.0511 sec/batch\n", + "Epoch 5/20 Iteration 775/3560 Training loss: 2.2112 0.0506 sec/batch\n", + "Epoch 5/20 Iteration 776/3560 Training loss: 2.2111 0.0492 sec/batch\n", + "Epoch 5/20 Iteration 777/3560 Training loss: 2.2107 0.0533 sec/batch\n", + "Epoch 5/20 Iteration 778/3560 Training loss: 2.2108 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 779/3560 Training loss: 2.2107 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 780/3560 Training loss: 2.2101 0.0554 sec/batch\n", + "Epoch 5/20 Iteration 781/3560 Training loss: 2.2096 0.0611 sec/batch\n", + "Epoch 5/20 Iteration 782/3560 Training loss: 2.2094 0.0578 sec/batch\n", + "Epoch 5/20 Iteration 783/3560 Training loss: 2.2095 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 784/3560 Training loss: 2.2093 0.0516 sec/batch\n", + "Epoch 5/20 Iteration 785/3560 Training loss: 2.2093 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 786/3560 Training loss: 2.2088 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 787/3560 Training loss: 2.2084 0.0520 sec/batch\n", + "Epoch 5/20 Iteration 788/3560 Training loss: 2.2087 0.0503 sec/batch\n", + "Epoch 5/20 Iteration 789/3560 Training loss: 2.2083 0.0517 sec/batch\n", + "Epoch 5/20 Iteration 790/3560 Training loss: 2.2082 0.0531 sec/batch\n", + "Epoch 5/20 Iteration 791/3560 Training loss: 2.2077 0.0610 sec/batch\n", + "Epoch 5/20 Iteration 792/3560 Training loss: 2.2073 0.0548 sec/batch\n", + "Epoch 5/20 Iteration 793/3560 Training loss: 2.2068 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 794/3560 Training loss: 2.2067 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 795/3560 Training loss: 2.2061 0.0539 sec/batch\n", + "Epoch 5/20 Iteration 796/3560 Training loss: 2.2057 0.0517 sec/batch\n", + "Epoch 5/20 Iteration 797/3560 Training loss: 2.2049 0.0492 sec/batch\n", + "Epoch 5/20 Iteration 798/3560 Training loss: 2.2045 0.0518 sec/batch\n", + "Epoch 5/20 Iteration 799/3560 Training loss: 2.2044 0.0552 
sec/batch\n", + "Epoch 5/20 Iteration 800/3560 Training loss: 2.2040 0.0495 sec/batch\n", + "Epoch 5/20 Iteration 801/3560 Training loss: 2.2035 0.0591 sec/batch\n", + "Epoch 5/20 Iteration 802/3560 Training loss: 2.2034 0.0543 sec/batch\n", + "Epoch 5/20 Iteration 803/3560 Training loss: 2.2031 0.0525 sec/batch\n", + "Epoch 5/20 Iteration 804/3560 Training loss: 2.2029 0.0574 sec/batch\n", + "Epoch 5/20 Iteration 805/3560 Training loss: 2.2023 0.0503 sec/batch\n", + "Epoch 5/20 Iteration 806/3560 Training loss: 2.2018 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 807/3560 Training loss: 2.2013 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 808/3560 Training loss: 2.2010 0.0500 sec/batch\n", + "Epoch 5/20 Iteration 809/3560 Training loss: 2.2007 0.0516 sec/batch\n", + "Epoch 5/20 Iteration 810/3560 Training loss: 2.2003 0.0485 sec/batch\n", + "Epoch 5/20 Iteration 811/3560 Training loss: 2.1998 0.0521 sec/batch\n", + "Epoch 5/20 Iteration 812/3560 Training loss: 2.1994 0.0529 sec/batch\n", + "Epoch 5/20 Iteration 813/3560 Training loss: 2.1993 0.0526 sec/batch\n", + "Epoch 5/20 Iteration 814/3560 Training loss: 2.1991 0.0534 sec/batch\n", + "Epoch 5/20 Iteration 815/3560 Training loss: 2.1986 0.0564 sec/batch\n", + "Epoch 5/20 Iteration 816/3560 Training loss: 2.1982 0.0546 sec/batch\n", + "Epoch 5/20 Iteration 817/3560 Training loss: 2.1979 0.0501 sec/batch\n", + "Epoch 5/20 Iteration 818/3560 Training loss: 2.1976 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 819/3560 Training loss: 2.1973 0.0490 sec/batch\n", + "Epoch 5/20 Iteration 820/3560 Training loss: 2.1972 0.0523 sec/batch\n", + "Epoch 5/20 Iteration 821/3560 Training loss: 2.1972 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 822/3560 Training loss: 2.1969 0.0503 sec/batch\n", + "Epoch 5/20 Iteration 823/3560 Training loss: 2.1967 0.0520 sec/batch\n", + "Epoch 5/20 Iteration 824/3560 Training loss: 2.1967 0.0504 sec/batch\n", + "Epoch 5/20 Iteration 825/3560 Training loss: 2.1964 0.0530 sec/batch\n", + "Epoch 
5/20 Iteration 826/3560 Training loss: 2.1961 0.0525 sec/batch\n", + "Epoch 5/20 Iteration 827/3560 Training loss: 2.1957 0.0503 sec/batch\n", + "Epoch 5/20 Iteration 828/3560 Training loss: 2.1952 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 829/3560 Training loss: 2.1950 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 830/3560 Training loss: 2.1947 0.0534 sec/batch\n", + "Epoch 5/20 Iteration 831/3560 Training loss: 2.1947 0.0490 sec/batch\n", + "Epoch 5/20 Iteration 832/3560 Training loss: 2.1945 0.0536 sec/batch\n", + "Epoch 5/20 Iteration 833/3560 Training loss: 2.1944 0.0508 sec/batch\n", + "Epoch 5/20 Iteration 834/3560 Training loss: 2.1941 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 835/3560 Training loss: 2.1938 0.0575 sec/batch\n", + "Epoch 5/20 Iteration 836/3560 Training loss: 2.1937 0.0497 sec/batch\n", + "Epoch 5/20 Iteration 837/3560 Training loss: 2.1935 0.0538 sec/batch\n", + "Epoch 5/20 Iteration 838/3560 Training loss: 2.1931 0.0553 sec/batch\n", + "Epoch 5/20 Iteration 839/3560 Training loss: 2.1930 0.0509 sec/batch\n", + "Epoch 5/20 Iteration 840/3560 Training loss: 2.1929 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 841/3560 Training loss: 2.1928 0.0504 sec/batch\n", + "Epoch 5/20 Iteration 842/3560 Training loss: 2.1927 0.0499 sec/batch\n", + "Epoch 5/20 Iteration 843/3560 Training loss: 2.1924 0.0523 sec/batch\n", + "Epoch 5/20 Iteration 844/3560 Training loss: 2.1919 0.0559 sec/batch\n", + "Epoch 5/20 Iteration 845/3560 Training loss: 2.1918 0.0558 sec/batch\n", + "Epoch 5/20 Iteration 846/3560 Training loss: 2.1917 0.0518 sec/batch\n", + "Epoch 5/20 Iteration 847/3560 Training loss: 2.1915 0.0501 sec/batch\n", + "Epoch 5/20 Iteration 848/3560 Training loss: 2.1914 0.0585 sec/batch\n", + "Epoch 5/20 Iteration 849/3560 Training loss: 2.1913 0.0509 sec/batch\n", + "Epoch 5/20 Iteration 850/3560 Training loss: 2.1912 0.0511 sec/batch\n", + "Epoch 5/20 Iteration 851/3560 Training loss: 2.1913 0.0543 sec/batch\n", + "Epoch 5/20 Iteration 852/3560 
Training loss: 2.1910 0.0572 sec/batch\n", + "Epoch 5/20 Iteration 853/3560 Training loss: 2.1910 0.0515 sec/batch\n", + "Epoch 5/20 Iteration 854/3560 Training loss: 2.1908 0.0531 sec/batch\n", + "Epoch 5/20 Iteration 855/3560 Training loss: 2.1906 0.0514 sec/batch\n", + "Epoch 5/20 Iteration 856/3560 Training loss: 2.1904 0.0545 sec/batch\n", + "Epoch 5/20 Iteration 857/3560 Training loss: 2.1903 0.0566 sec/batch\n", + "Epoch 5/20 Iteration 858/3560 Training loss: 2.1903 0.0632 sec/batch\n", + "Epoch 5/20 Iteration 859/3560 Training loss: 2.1902 0.0538 sec/batch\n", + "Epoch 5/20 Iteration 860/3560 Training loss: 2.1902 0.0511 sec/batch\n", + "Epoch 5/20 Iteration 861/3560 Training loss: 2.1900 0.0528 sec/batch\n", + "Epoch 5/20 Iteration 862/3560 Training loss: 2.1898 0.0513 sec/batch\n", + "Epoch 5/20 Iteration 863/3560 Training loss: 2.1896 0.0527 sec/batch\n", + "Epoch 5/20 Iteration 864/3560 Training loss: 2.1897 0.0557 sec/batch\n", + "Epoch 5/20 Iteration 865/3560 Training loss: 2.1896 0.0512 sec/batch\n", + "Epoch 5/20 Iteration 866/3560 Training loss: 2.1895 0.0502 sec/batch\n", + "Epoch 5/20 Iteration 867/3560 Training loss: 2.1894 0.0531 sec/batch\n", + "Epoch 5/20 Iteration 868/3560 Training loss: 2.1892 0.0557 sec/batch\n", + "Epoch 5/20 Iteration 869/3560 Training loss: 2.1890 0.0546 sec/batch\n", + "Epoch 5/20 Iteration 870/3560 Training loss: 2.1888 0.0566 sec/batch\n", + "Epoch 5/20 Iteration 871/3560 Training loss: 2.1885 0.0493 sec/batch\n", + "Epoch 5/20 Iteration 872/3560 Training loss: 2.1886 0.0516 sec/batch\n", + "Epoch 5/20 Iteration 873/3560 Training loss: 2.1886 0.0509 sec/batch\n", + "Epoch 5/20 Iteration 874/3560 Training loss: 2.1884 0.0506 sec/batch\n", + "Epoch 5/20 Iteration 875/3560 Training loss: 2.1882 0.0487 sec/batch\n", + "Epoch 5/20 Iteration 876/3560 Training loss: 2.1880 0.0574 sec/batch\n", + "Epoch 5/20 Iteration 877/3560 Training loss: 2.1879 0.0529 sec/batch\n", + "Epoch 5/20 Iteration 878/3560 Training loss: 2.1878 
0.0505 sec/batch\n", + "Epoch 5/20 Iteration 879/3560 Training loss: 2.1877 0.0564 sec/batch\n", + "Epoch 5/20 Iteration 880/3560 Training loss: 2.1878 0.0548 sec/batch\n", + "Epoch 5/20 Iteration 881/3560 Training loss: 2.1876 0.0498 sec/batch\n", + "Epoch 5/20 Iteration 882/3560 Training loss: 2.1874 0.0507 sec/batch\n", + "Epoch 5/20 Iteration 883/3560 Training loss: 2.1872 0.0568 sec/batch\n", + "Epoch 5/20 Iteration 884/3560 Training loss: 2.1870 0.0518 sec/batch\n", + "Epoch 5/20 Iteration 885/3560 Training loss: 2.1870 0.0494 sec/batch\n", + "Epoch 5/20 Iteration 886/3560 Training loss: 2.1870 0.0613 sec/batch\n", + "Epoch 5/20 Iteration 887/3560 Training loss: 2.1869 0.0519 sec/batch\n", + "Epoch 5/20 Iteration 888/3560 Training loss: 2.1868 0.0516 sec/batch\n", + "Epoch 5/20 Iteration 889/3560 Training loss: 2.1866 0.0504 sec/batch\n", + "Epoch 5/20 Iteration 890/3560 Training loss: 2.1865 0.0509 sec/batch\n", + "Epoch 6/20 Iteration 891/3560 Training loss: 2.2528 0.0505 sec/batch\n", + "Epoch 6/20 Iteration 892/3560 Training loss: 2.1938 0.0507 sec/batch\n", + "Epoch 6/20 Iteration 893/3560 Training loss: 2.1756 0.0495 sec/batch\n", + "Epoch 6/20 Iteration 894/3560 Training loss: 2.1688 0.0500 sec/batch\n", + "Epoch 6/20 Iteration 895/3560 Training loss: 2.1634 0.0489 sec/batch\n", + "Epoch 6/20 Iteration 896/3560 Training loss: 2.1566 0.0662 sec/batch\n", + "Epoch 6/20 Iteration 897/3560 Training loss: 2.1560 0.0539 sec/batch\n", + "Epoch 6/20 Iteration 898/3560 Training loss: 2.1564 0.0591 sec/batch\n", + "Epoch 6/20 Iteration 899/3560 Training loss: 2.1578 0.0558 sec/batch\n", + "Epoch 6/20 Iteration 900/3560 Training loss: 2.1575 0.0496 sec/batch\n", + "Epoch 6/20 Iteration 901/3560 Training loss: 2.1554 0.0516 sec/batch\n", + "Epoch 6/20 Iteration 902/3560 Training loss: 2.1535 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 903/3560 Training loss: 2.1531 0.0497 sec/batch\n", + "Epoch 6/20 Iteration 904/3560 Training loss: 2.1548 0.0507 sec/batch\n", + 
"Epoch 6/20 Iteration 905/3560 Training loss: 2.1544 0.0516 sec/batch\n", + "Epoch 6/20 Iteration 906/3560 Training loss: 2.1532 0.0528 sec/batch\n", + "Epoch 6/20 Iteration 907/3560 Training loss: 2.1531 0.0542 sec/batch\n", + "Epoch 6/20 Iteration 908/3560 Training loss: 2.1552 0.0497 sec/batch\n", + "Epoch 6/20 Iteration 909/3560 Training loss: 2.1560 0.0502 sec/batch\n", + "Epoch 6/20 Iteration 910/3560 Training loss: 2.1557 0.0529 sec/batch\n", + "Epoch 6/20 Iteration 911/3560 Training loss: 2.1550 0.0495 sec/batch\n", + "Epoch 6/20 Iteration 912/3560 Training loss: 2.1565 0.0515 sec/batch\n", + "Epoch 6/20 Iteration 913/3560 Training loss: 2.1561 0.0519 sec/batch\n", + "Epoch 6/20 Iteration 914/3560 Training loss: 2.1554 0.0496 sec/batch\n", + "Epoch 6/20 Iteration 915/3560 Training loss: 2.1553 0.0522 sec/batch\n", + "Epoch 6/20 Iteration 916/3560 Training loss: 2.1543 0.0555 sec/batch\n", + "Epoch 6/20 Iteration 917/3560 Training loss: 2.1536 0.0567 sec/batch\n", + "Epoch 6/20 Iteration 918/3560 Training loss: 2.1537 0.0527 sec/batch\n", + "Epoch 6/20 Iteration 919/3560 Training loss: 2.1545 0.0535 sec/batch\n", + "Epoch 6/20 Iteration 920/3560 Training loss: 2.1544 0.0498 sec/batch\n", + "Epoch 6/20 Iteration 921/3560 Training loss: 2.1546 0.0496 sec/batch\n", + "Epoch 6/20 Iteration 922/3560 Training loss: 2.1542 0.0561 sec/batch\n", + "Epoch 6/20 Iteration 923/3560 Training loss: 2.1539 0.0552 sec/batch\n", + "Epoch 6/20 Iteration 924/3560 Training loss: 2.1545 0.0534 sec/batch\n", + "Epoch 6/20 Iteration 925/3560 Training loss: 2.1540 0.0528 sec/batch\n", + "Epoch 6/20 Iteration 926/3560 Training loss: 2.1537 0.0496 sec/batch\n", + "Epoch 6/20 Iteration 927/3560 Training loss: 2.1534 0.0499 sec/batch\n", + "Epoch 6/20 Iteration 928/3560 Training loss: 2.1521 0.0505 sec/batch\n", + "Epoch 6/20 Iteration 929/3560 Training loss: 2.1510 0.0522 sec/batch\n", + "Epoch 6/20 Iteration 930/3560 Training loss: 2.1502 0.0524 sec/batch\n", + "Epoch 6/20 Iteration 
931/3560 Training loss: 2.1496 0.0525 sec/batch\n", + "Epoch 6/20 Iteration 932/3560 Training loss: 2.1494 0.0524 sec/batch\n", + "Epoch 6/20 Iteration 933/3560 Training loss: 2.1491 0.0499 sec/batch\n", + "Epoch 6/20 Iteration 934/3560 Training loss: 2.1483 0.0556 sec/batch\n", + "Epoch 6/20 Iteration 935/3560 Training loss: 2.1484 0.0528 sec/batch\n", + "Epoch 6/20 Iteration 936/3560 Training loss: 2.1471 0.0516 sec/batch\n", + "Epoch 6/20 Iteration 937/3560 Training loss: 2.1469 0.0546 sec/batch\n", + "Epoch 6/20 Iteration 938/3560 Training loss: 2.1463 0.0523 sec/batch\n", + "Epoch 6/20 Iteration 939/3560 Training loss: 2.1462 0.0520 sec/batch\n", + "Epoch 6/20 Iteration 940/3560 Training loss: 2.1466 0.0571 sec/batch\n", + "Epoch 6/20 Iteration 941/3560 Training loss: 2.1458 0.0532 sec/batch\n", + "Epoch 6/20 Iteration 942/3560 Training loss: 2.1462 0.0568 sec/batch\n", + "Epoch 6/20 Iteration 943/3560 Training loss: 2.1458 0.0600 sec/batch\n", + "Epoch 6/20 Iteration 944/3560 Training loss: 2.1455 0.0551 sec/batch\n", + "Epoch 6/20 Iteration 945/3560 Training loss: 2.1452 0.0505 sec/batch\n", + "Epoch 6/20 Iteration 946/3560 Training loss: 2.1452 0.0509 sec/batch\n", + "Epoch 6/20 Iteration 947/3560 Training loss: 2.1451 0.0515 sec/batch\n", + "Epoch 6/20 Iteration 948/3560 Training loss: 2.1446 0.0544 sec/batch\n", + "Epoch 6/20 Iteration 949/3560 Training loss: 2.1441 0.0519 sec/batch\n", + "Epoch 6/20 Iteration 950/3560 Training loss: 2.1445 0.0661 sec/batch\n", + "Epoch 6/20 Iteration 951/3560 Training loss: 2.1442 0.0492 sec/batch\n", + "Epoch 6/20 Iteration 952/3560 Training loss: 2.1445 0.0500 sec/batch\n", + "Epoch 6/20 Iteration 953/3560 Training loss: 2.1448 0.0515 sec/batch\n", + "Epoch 6/20 Iteration 954/3560 Training loss: 2.1447 0.0571 sec/batch\n", + "Epoch 6/20 Iteration 955/3560 Training loss: 2.1444 0.0513 sec/batch\n", + "Epoch 6/20 Iteration 956/3560 Training loss: 2.1444 0.0511 sec/batch\n", + "Epoch 6/20 Iteration 957/3560 Training loss: 
2.1443 0.0552 sec/batch\n", + "Epoch 6/20 Iteration 958/3560 Training loss: 2.1438 0.0523 sec/batch\n", + "Epoch 6/20 Iteration 959/3560 Training loss: 2.1434 0.0564 sec/batch\n", + "Epoch 6/20 Iteration 960/3560 Training loss: 2.1434 0.0492 sec/batch\n", + "Epoch 6/20 Iteration 961/3560 Training loss: 2.1437 0.0495 sec/batch\n", + "Epoch 6/20 Iteration 962/3560 Training loss: 2.1436 0.0487 sec/batch\n", + "Epoch 6/20 Iteration 963/3560 Training loss: 2.1436 0.0494 sec/batch\n", + "Epoch 6/20 Iteration 964/3560 Training loss: 2.1431 0.0511 sec/batch\n", + "Epoch 6/20 Iteration 965/3560 Training loss: 2.1430 0.0635 sec/batch\n", + "Epoch 6/20 Iteration 966/3560 Training loss: 2.1433 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 967/3560 Training loss: 2.1432 0.0534 sec/batch\n", + "Epoch 6/20 Iteration 968/3560 Training loss: 2.1432 0.0616 sec/batch\n", + "Epoch 6/20 Iteration 969/3560 Training loss: 2.1426 0.0498 sec/batch\n", + "Epoch 6/20 Iteration 970/3560 Training loss: 2.1423 0.0575 sec/batch\n", + "Epoch 6/20 Iteration 971/3560 Training loss: 2.1418 0.0507 sec/batch\n", + "Epoch 6/20 Iteration 972/3560 Training loss: 2.1419 0.0503 sec/batch\n", + "Epoch 6/20 Iteration 973/3560 Training loss: 2.1413 0.0531 sec/batch\n", + "Epoch 6/20 Iteration 974/3560 Training loss: 2.1411 0.0518 sec/batch\n", + "Epoch 6/20 Iteration 975/3560 Training loss: 2.1404 0.0519 sec/batch\n", + "Epoch 6/20 Iteration 976/3560 Training loss: 2.1400 0.0531 sec/batch\n", + "Epoch 6/20 Iteration 977/3560 Training loss: 2.1399 0.0568 sec/batch\n", + "Epoch 6/20 Iteration 978/3560 Training loss: 2.1396 0.0517 sec/batch\n", + "Epoch 6/20 Iteration 979/3560 Training loss: 2.1391 0.0501 sec/batch\n", + "Epoch 6/20 Iteration 980/3560 Training loss: 2.1391 0.0547 sec/batch\n", + "Epoch 6/20 Iteration 981/3560 Training loss: 2.1388 0.0498 sec/batch\n", + "Epoch 6/20 Iteration 982/3560 Training loss: 2.1387 0.0525 sec/batch\n", + "Epoch 6/20 Iteration 983/3560 Training loss: 2.1381 0.0530 
sec/batch\n", + "Epoch 6/20 Iteration 984/3560 Training loss: 2.1377 0.0517 sec/batch\n", + "Epoch 6/20 Iteration 985/3560 Training loss: 2.1374 0.0539 sec/batch\n", + "Epoch 6/20 Iteration 986/3560 Training loss: 2.1371 0.0529 sec/batch\n", + "Epoch 6/20 Iteration 987/3560 Training loss: 2.1368 0.0507 sec/batch\n", + "Epoch 6/20 Iteration 988/3560 Training loss: 2.1364 0.0505 sec/batch\n", + "Epoch 6/20 Iteration 989/3560 Training loss: 2.1360 0.0557 sec/batch\n", + "Epoch 6/20 Iteration 990/3560 Training loss: 2.1354 0.0527 sec/batch\n", + "Epoch 6/20 Iteration 991/3560 Training loss: 2.1354 0.0553 sec/batch\n", + "Epoch 6/20 Iteration 992/3560 Training loss: 2.1352 0.0544 sec/batch\n", + "Epoch 6/20 Iteration 993/3560 Training loss: 2.1348 0.0499 sec/batch\n", + "Epoch 6/20 Iteration 994/3560 Training loss: 2.1346 0.0579 sec/batch\n", + "Epoch 6/20 Iteration 995/3560 Training loss: 2.1342 0.0587 sec/batch\n", + "Epoch 6/20 Iteration 996/3560 Training loss: 2.1339 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 997/3560 Training loss: 2.1337 0.0534 sec/batch\n", + "Epoch 6/20 Iteration 998/3560 Training loss: 2.1337 0.0511 sec/batch\n", + "Epoch 6/20 Iteration 999/3560 Training loss: 2.1336 0.0532 sec/batch\n", + "Epoch 6/20 Iteration 1000/3560 Training loss: 2.1334 0.0516 sec/batch\n", + "Epoch 6/20 Iteration 1001/3560 Training loss: 2.1332 0.0515 sec/batch\n", + "Epoch 6/20 Iteration 1002/3560 Training loss: 2.1331 0.0518 sec/batch\n", + "Epoch 6/20 Iteration 1003/3560 Training loss: 2.1328 0.0527 sec/batch\n", + "Epoch 6/20 Iteration 1004/3560 Training loss: 2.1326 0.0517 sec/batch\n", + "Epoch 6/20 Iteration 1005/3560 Training loss: 2.1322 0.0501 sec/batch\n", + "Epoch 6/20 Iteration 1006/3560 Training loss: 2.1317 0.0584 sec/batch\n", + "Epoch 6/20 Iteration 1007/3560 Training loss: 2.1315 0.0533 sec/batch\n", + "Epoch 6/20 Iteration 1008/3560 Training loss: 2.1313 0.0488 sec/batch\n", + "Epoch 6/20 Iteration 1009/3560 Training loss: 2.1313 0.0513 sec/batch\n", 
+ "Epoch 6/20 Iteration 1010/3560 Training loss: 2.1311 0.0499 sec/batch\n", + "Epoch 6/20 Iteration 1011/3560 Training loss: 2.1311 0.0544 sec/batch\n", + "Epoch 6/20 Iteration 1012/3560 Training loss: 2.1308 0.0517 sec/batch\n", + "Epoch 6/20 Iteration 1013/3560 Training loss: 2.1305 0.0503 sec/batch\n", + "Epoch 6/20 Iteration 1014/3560 Training loss: 2.1306 0.0498 sec/batch\n", + "Epoch 6/20 Iteration 1015/3560 Training loss: 2.1304 0.0504 sec/batch\n", + "Epoch 6/20 Iteration 1016/3560 Training loss: 2.1301 0.0595 sec/batch\n", + "Epoch 6/20 Iteration 1017/3560 Training loss: 2.1300 0.0518 sec/batch\n", + "Epoch 6/20 Iteration 1018/3560 Training loss: 2.1300 0.0559 sec/batch\n", + "Epoch 6/20 Iteration 1019/3560 Training loss: 2.1299 0.0496 sec/batch\n", + "Epoch 6/20 Iteration 1020/3560 Training loss: 2.1299 0.0542 sec/batch\n", + "Epoch 6/20 Iteration 1021/3560 Training loss: 2.1296 0.0512 sec/batch\n", + "Epoch 6/20 Iteration 1022/3560 Training loss: 2.1292 0.0546 sec/batch\n", + "Epoch 6/20 Iteration 1023/3560 Training loss: 2.1291 0.0531 sec/batch\n", + "Epoch 6/20 Iteration 1024/3560 Training loss: 2.1291 0.0491 sec/batch\n", + "Epoch 6/20 Iteration 1025/3560 Training loss: 2.1291 0.0500 sec/batch\n", + "Epoch 6/20 Iteration 1026/3560 Training loss: 2.1290 0.0616 sec/batch\n", + "Epoch 6/20 Iteration 1027/3560 Training loss: 2.1290 0.0496 sec/batch\n", + "Epoch 6/20 Iteration 1028/3560 Training loss: 2.1289 0.0492 sec/batch\n", + "Epoch 6/20 Iteration 1029/3560 Training loss: 2.1291 0.0516 sec/batch\n", + "Epoch 6/20 Iteration 1030/3560 Training loss: 2.1289 0.0530 sec/batch\n", + "Epoch 6/20 Iteration 1031/3560 Training loss: 2.1290 0.0607 sec/batch\n", + "Epoch 6/20 Iteration 1032/3560 Training loss: 2.1288 0.0525 sec/batch\n", + "Epoch 6/20 Iteration 1033/3560 Training loss: 2.1287 0.0562 sec/batch\n", + "Epoch 6/20 Iteration 1034/3560 Training loss: 2.1286 0.0512 sec/batch\n", + "Epoch 6/20 Iteration 1035/3560 Training loss: 2.1284 0.0499 
sec/batch\n", + "Epoch 6/20 Iteration 1036/3560 Training loss: 2.1284 0.0517 sec/batch\n", + "Epoch 6/20 Iteration 1037/3560 Training loss: 2.1283 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 1038/3560 Training loss: 2.1284 0.0492 sec/batch\n", + "Epoch 6/20 Iteration 1039/3560 Training loss: 2.1283 0.0516 sec/batch\n", + "Epoch 6/20 Iteration 1040/3560 Training loss: 2.1281 0.0516 sec/batch\n", + "Epoch 6/20 Iteration 1041/3560 Training loss: 2.1278 0.0513 sec/batch\n", + "Epoch 6/20 Iteration 1042/3560 Training loss: 2.1279 0.0510 sec/batch\n", + "Epoch 6/20 Iteration 1043/3560 Training loss: 2.1279 0.0545 sec/batch\n", + "Epoch 6/20 Iteration 1044/3560 Training loss: 2.1279 0.0497 sec/batch\n", + "Epoch 6/20 Iteration 1045/3560 Training loss: 2.1277 0.0493 sec/batch\n", + "Epoch 6/20 Iteration 1046/3560 Training loss: 2.1276 0.0505 sec/batch\n", + "Epoch 6/20 Iteration 1047/3560 Training loss: 2.1274 0.0505 sec/batch\n", + "Epoch 6/20 Iteration 1048/3560 Training loss: 2.1273 0.0560 sec/batch\n", + "Epoch 6/20 Iteration 1049/3560 Training loss: 2.1270 0.0588 sec/batch\n", + "Epoch 6/20 Iteration 1050/3560 Training loss: 2.1272 0.0516 sec/batch\n", + "Epoch 6/20 Iteration 1051/3560 Training loss: 2.1271 0.0505 sec/batch\n", + "Epoch 6/20 Iteration 1052/3560 Training loss: 2.1270 0.0558 sec/batch\n", + "Epoch 6/20 Iteration 1053/3560 Training loss: 2.1269 0.0527 sec/batch\n", + "Epoch 6/20 Iteration 1054/3560 Training loss: 2.1268 0.0536 sec/batch\n", + "Epoch 6/20 Iteration 1055/3560 Training loss: 2.1267 0.0504 sec/batch\n", + "Epoch 6/20 Iteration 1056/3560 Training loss: 2.1266 0.0569 sec/batch\n", + "Epoch 6/20 Iteration 1057/3560 Training loss: 2.1266 0.0550 sec/batch\n", + "Epoch 6/20 Iteration 1058/3560 Training loss: 2.1267 0.0569 sec/batch\n", + "Epoch 6/20 Iteration 1059/3560 Training loss: 2.1266 0.0517 sec/batch\n", + "Epoch 6/20 Iteration 1060/3560 Training loss: 2.1265 0.0585 sec/batch\n", + "Epoch 6/20 Iteration 1061/3560 Training loss: 2.1263 
0.0521 sec/batch\n", + "Epoch 6/20 Iteration 1062/3560 Training loss: 2.1261 0.0524 sec/batch\n", + "Epoch 6/20 Iteration 1063/3560 Training loss: 2.1261 0.0546 sec/batch\n", + "Epoch 6/20 Iteration 1064/3560 Training loss: 2.1261 0.0498 sec/batch\n", + "Epoch 6/20 Iteration 1065/3560 Training loss: 2.1260 0.0513 sec/batch\n", + "Epoch 6/20 Iteration 1066/3560 Training loss: 2.1259 0.0564 sec/batch\n", + "Epoch 6/20 Iteration 1067/3560 Training loss: 2.1258 0.0500 sec/batch\n", + "Epoch 6/20 Iteration 1068/3560 Training loss: 2.1257 0.0507 sec/batch\n", + "Epoch 7/20 Iteration 1069/3560 Training loss: 2.1947 0.0502 sec/batch\n", + "Epoch 7/20 Iteration 1070/3560 Training loss: 2.1386 0.0614 sec/batch\n", + "Epoch 7/20 Iteration 1071/3560 Training loss: 2.1218 0.0534 sec/batch\n", + "Epoch 7/20 Iteration 1072/3560 Training loss: 2.1137 0.0538 sec/batch\n", + "Epoch 7/20 Iteration 1073/3560 Training loss: 2.1105 0.0503 sec/batch\n", + "Epoch 7/20 Iteration 1074/3560 Training loss: 2.1029 0.0501 sec/batch\n", + "Epoch 7/20 Iteration 1075/3560 Training loss: 2.1033 0.0492 sec/batch\n", + "Epoch 7/20 Iteration 1076/3560 Training loss: 2.1037 0.0517 sec/batch\n", + "Epoch 7/20 Iteration 1077/3560 Training loss: 2.1053 0.0500 sec/batch\n", + "Epoch 7/20 Iteration 1078/3560 Training loss: 2.1050 0.0501 sec/batch\n", + "Epoch 7/20 Iteration 1079/3560 Training loss: 2.1024 0.0538 sec/batch\n", + "Epoch 7/20 Iteration 1080/3560 Training loss: 2.1001 0.0543 sec/batch\n", + "Epoch 7/20 Iteration 1081/3560 Training loss: 2.1013 0.0502 sec/batch\n", + "Epoch 7/20 Iteration 1082/3560 Training loss: 2.1030 0.0519 sec/batch\n", + "Epoch 7/20 Iteration 1083/3560 Training loss: 2.1027 0.0554 sec/batch\n", + "Epoch 7/20 Iteration 1084/3560 Training loss: 2.1017 0.0653 sec/batch\n", + "Epoch 7/20 Iteration 1085/3560 Training loss: 2.1014 0.0510 sec/batch\n", + "Epoch 7/20 Iteration 1086/3560 Training loss: 2.1035 0.0495 sec/batch\n", + "Epoch 7/20 Iteration 1087/3560 Training loss: 
2.1038 0.0492 sec/batch\n", + "Epoch 7/20 Iteration 1088/3560 Training loss: 2.1037 0.0521 sec/batch\n", + "Epoch 7/20 Iteration 1089/3560 Training loss: 2.1028 0.0522 sec/batch\n", + "Epoch 7/20 Iteration 1090/3560 Training loss: 2.1039 0.0502 sec/batch\n", + "Epoch 7/20 Iteration 1091/3560 Training loss: 2.1034 0.0632 sec/batch\n", + "Epoch 7/20 Iteration 1092/3560 Training loss: 2.1027 0.0617 sec/batch\n", + "Epoch 7/20 Iteration 1093/3560 Training loss: 2.1023 0.0585 sec/batch\n", + "Epoch 7/20 Iteration 1094/3560 Training loss: 2.1012 0.0515 sec/batch\n", + "Epoch 7/20 Iteration 1095/3560 Training loss: 2.0998 0.0507 sec/batch\n", + "Epoch 7/20 Iteration 1096/3560 Training loss: 2.0998 0.0512 sec/batch\n", + "Epoch 7/20 Iteration 1097/3560 Training loss: 2.1004 0.0505 sec/batch\n", + "Epoch 7/20 Iteration 1098/3560 Training loss: 2.1004 0.0535 sec/batch\n", + "Epoch 7/20 Iteration 1099/3560 Training loss: 2.1004 0.0563 sec/batch\n", + "Epoch 7/20 Iteration 1100/3560 Training loss: 2.0997 0.0495 sec/batch\n", + "Epoch 7/20 Iteration 1101/3560 Training loss: 2.0994 0.0488 sec/batch\n", + "Epoch 7/20 Iteration 1102/3560 Training loss: 2.1000 0.0545 sec/batch\n", + "Epoch 7/20 Iteration 1103/3560 Training loss: 2.0996 0.0528 sec/batch\n", + "Epoch 7/20 Iteration 1104/3560 Training loss: 2.0993 0.0491 sec/batch\n", + "Epoch 7/20 Iteration 1105/3560 Training loss: 2.0991 0.0501 sec/batch\n", + "Epoch 7/20 Iteration 1106/3560 Training loss: 2.0974 0.0553 sec/batch\n", + "Epoch 7/20 Iteration 1107/3560 Training loss: 2.0962 0.0553 sec/batch\n", + "Epoch 7/20 Iteration 1108/3560 Training loss: 2.0954 0.0512 sec/batch\n", + "Epoch 7/20 Iteration 1109/3560 Training loss: 2.0951 0.0495 sec/batch\n", + "Epoch 7/20 Iteration 1110/3560 Training loss: 2.0949 0.0492 sec/batch\n", + "Epoch 7/20 Iteration 1111/3560 Training loss: 2.0945 0.0589 sec/batch\n", + "Epoch 7/20 Iteration 1112/3560 Training loss: 2.0935 0.0569 sec/batch\n", + "Epoch 7/20 Iteration 1113/3560 Training 
loss: 2.0935 0.0593 sec/batch\n", + "[... iterations 1114-1245 omitted: epoch 7 training loss decreases steadily from 2.0922 to 2.0730 at ~0.05 sec/batch ...]\n", + "Epoch 7/20 Iteration 1246/3560 Training loss: 2.0729 0.0537 sec/batch\n", + "Epoch 8/20 Iteration 1247/3560 Training loss: 2.1414 0.0521 sec/batch\n", + "[... iterations 1248-1423 omitted: epoch 8 training loss decreases steadily from 2.0880 to 2.0261 at ~0.05 sec/batch ...]\n", + "Epoch 8/20 Iteration 1424/3560 Training loss: 2.0260 0.0521 sec/batch\n", + "Epoch 9/20 Iteration 1425/3560 Training loss: 2.0946 0.0503 sec/batch\n", + "[... iterations 1426-1601 omitted: epoch 9 training loss decreases steadily from 2.0399 to 1.9835 at ~0.05 sec/batch ...]\n", + "Epoch 9/20 Iteration 1602/3560 Training loss: 1.9834 0.0574 sec/batch\n", + "Epoch 10/20 Iteration 1603/3560 Training loss: 2.0470 0.0513 sec/batch\n", + "Epoch 10/20 Iteration 1604/3560 Training loss: 1.9934 0.0538 sec/batch\n", + "Epoch 10/20 Iteration 1605/3560 Training loss: 
1.9825 0.0567 sec/batch\n", + "Epoch 10/20 Iteration 1606/3560 Training loss: 1.9771 0.0509 sec/batch\n", + "Epoch 10/20 Iteration 1607/3560 Training loss: 1.9730 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1608/3560 Training loss: 1.9647 0.0528 sec/batch\n", + "Epoch 10/20 Iteration 1609/3560 Training loss: 1.9661 0.0535 sec/batch\n", + "Epoch 10/20 Iteration 1610/3560 Training loss: 1.9653 0.0520 sec/batch\n", + "Epoch 10/20 Iteration 1611/3560 Training loss: 1.9680 0.0505 sec/batch\n", + "Epoch 10/20 Iteration 1612/3560 Training loss: 1.9673 0.0571 sec/batch\n", + "Epoch 10/20 Iteration 1613/3560 Training loss: 1.9645 0.0516 sec/batch\n", + "Epoch 10/20 Iteration 1614/3560 Training loss: 1.9628 0.0502 sec/batch\n", + "Epoch 10/20 Iteration 1615/3560 Training loss: 1.9634 0.0523 sec/batch\n", + "Epoch 10/20 Iteration 1616/3560 Training loss: 1.9650 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1617/3560 Training loss: 1.9645 0.0525 sec/batch\n", + "Epoch 10/20 Iteration 1618/3560 Training loss: 1.9632 0.0512 sec/batch\n", + "Epoch 10/20 Iteration 1619/3560 Training loss: 1.9628 0.0540 sec/batch\n", + "Epoch 10/20 Iteration 1620/3560 Training loss: 1.9648 0.0498 sec/batch\n", + "Epoch 10/20 Iteration 1621/3560 Training loss: 1.9651 0.0513 sec/batch\n", + "Epoch 10/20 Iteration 1622/3560 Training loss: 1.9655 0.0573 sec/batch\n", + "Epoch 10/20 Iteration 1623/3560 Training loss: 1.9646 0.0525 sec/batch\n", + "Epoch 10/20 Iteration 1624/3560 Training loss: 1.9653 0.0644 sec/batch\n", + "Epoch 10/20 Iteration 1625/3560 Training loss: 1.9647 0.0574 sec/batch\n", + "Epoch 10/20 Iteration 1626/3560 Training loss: 1.9642 0.0551 sec/batch\n", + "Epoch 10/20 Iteration 1627/3560 Training loss: 1.9638 0.0540 sec/batch\n", + "Epoch 10/20 Iteration 1628/3560 Training loss: 1.9624 0.0509 sec/batch\n", + "Epoch 10/20 Iteration 1629/3560 Training loss: 1.9612 0.0518 sec/batch\n", + "Epoch 10/20 Iteration 1630/3560 Training loss: 1.9615 0.0532 sec/batch\n", + "Epoch 10/20 
Iteration 1631/3560 Training loss: 1.9622 0.0503 sec/batch\n", + "Epoch 10/20 Iteration 1632/3560 Training loss: 1.9626 0.0593 sec/batch\n", + "Epoch 10/20 Iteration 1633/3560 Training loss: 1.9623 0.0554 sec/batch\n", + "Epoch 10/20 Iteration 1634/3560 Training loss: 1.9618 0.0546 sec/batch\n", + "Epoch 10/20 Iteration 1635/3560 Training loss: 1.9617 0.0529 sec/batch\n", + "Epoch 10/20 Iteration 1636/3560 Training loss: 1.9624 0.0512 sec/batch\n", + "Epoch 10/20 Iteration 1637/3560 Training loss: 1.9618 0.0538 sec/batch\n", + "Epoch 10/20 Iteration 1638/3560 Training loss: 1.9612 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1639/3560 Training loss: 1.9607 0.0509 sec/batch\n", + "Epoch 10/20 Iteration 1640/3560 Training loss: 1.9596 0.0549 sec/batch\n", + "Epoch 10/20 Iteration 1641/3560 Training loss: 1.9586 0.0524 sec/batch\n", + "Epoch 10/20 Iteration 1642/3560 Training loss: 1.9577 0.0537 sec/batch\n", + "Epoch 10/20 Iteration 1643/3560 Training loss: 1.9572 0.0509 sec/batch\n", + "Epoch 10/20 Iteration 1644/3560 Training loss: 1.9572 0.0591 sec/batch\n", + "Epoch 10/20 Iteration 1645/3560 Training loss: 1.9569 0.0580 sec/batch\n", + "Epoch 10/20 Iteration 1646/3560 Training loss: 1.9562 0.0531 sec/batch\n", + "Epoch 10/20 Iteration 1647/3560 Training loss: 1.9564 0.0537 sec/batch\n", + "Epoch 10/20 Iteration 1648/3560 Training loss: 1.9551 0.0557 sec/batch\n", + "Epoch 10/20 Iteration 1649/3560 Training loss: 1.9550 0.0508 sec/batch\n", + "Epoch 10/20 Iteration 1650/3560 Training loss: 1.9545 0.0531 sec/batch\n", + "Epoch 10/20 Iteration 1651/3560 Training loss: 1.9543 0.0564 sec/batch\n", + "Epoch 10/20 Iteration 1652/3560 Training loss: 1.9550 0.0502 sec/batch\n", + "Epoch 10/20 Iteration 1653/3560 Training loss: 1.9543 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1654/3560 Training loss: 1.9549 0.0501 sec/batch\n", + "Epoch 10/20 Iteration 1655/3560 Training loss: 1.9546 0.0508 sec/batch\n", + "Epoch 10/20 Iteration 1656/3560 Training loss: 1.9544 0.0523 
sec/batch\n", + "Epoch 10/20 Iteration 1657/3560 Training loss: 1.9540 0.0558 sec/batch\n", + "Epoch 10/20 Iteration 1658/3560 Training loss: 1.9543 0.0527 sec/batch\n", + "Epoch 10/20 Iteration 1659/3560 Training loss: 1.9543 0.0533 sec/batch\n", + "Epoch 10/20 Iteration 1660/3560 Training loss: 1.9540 0.0531 sec/batch\n", + "Epoch 10/20 Iteration 1661/3560 Training loss: 1.9537 0.0548 sec/batch\n", + "Epoch 10/20 Iteration 1662/3560 Training loss: 1.9543 0.0517 sec/batch\n", + "Epoch 10/20 Iteration 1663/3560 Training loss: 1.9540 0.0530 sec/batch\n", + "Epoch 10/20 Iteration 1664/3560 Training loss: 1.9546 0.0532 sec/batch\n", + "Epoch 10/20 Iteration 1665/3560 Training loss: 1.9550 0.0583 sec/batch\n", + "Epoch 10/20 Iteration 1666/3560 Training loss: 1.9552 0.0583 sec/batch\n", + "Epoch 10/20 Iteration 1667/3560 Training loss: 1.9550 0.0504 sec/batch\n", + "Epoch 10/20 Iteration 1668/3560 Training loss: 1.9552 0.0528 sec/batch\n", + "Epoch 10/20 Iteration 1669/3560 Training loss: 1.9554 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1670/3560 Training loss: 1.9549 0.0519 sec/batch\n", + "Epoch 10/20 Iteration 1671/3560 Training loss: 1.9547 0.0516 sec/batch\n", + "Epoch 10/20 Iteration 1672/3560 Training loss: 1.9546 0.0518 sec/batch\n", + "Epoch 10/20 Iteration 1673/3560 Training loss: 1.9551 0.0508 sec/batch\n", + "Epoch 10/20 Iteration 1674/3560 Training loss: 1.9553 0.0527 sec/batch\n", + "Epoch 10/20 Iteration 1675/3560 Training loss: 1.9555 0.0511 sec/batch\n", + "Epoch 10/20 Iteration 1676/3560 Training loss: 1.9551 0.0521 sec/batch\n", + "Epoch 10/20 Iteration 1677/3560 Training loss: 1.9549 0.0555 sec/batch\n", + "Epoch 10/20 Iteration 1678/3560 Training loss: 1.9552 0.0551 sec/batch\n", + "Epoch 10/20 Iteration 1679/3560 Training loss: 1.9551 0.0533 sec/batch\n", + "Epoch 10/20 Iteration 1680/3560 Training loss: 1.9552 0.0515 sec/batch\n", + "Epoch 10/20 Iteration 1681/3560 Training loss: 1.9547 0.0519 sec/batch\n", + "Epoch 10/20 Iteration 1682/3560 
Training loss: 1.9545 0.0550 sec/batch\n", + "Epoch 10/20 Iteration 1683/3560 Training loss: 1.9539 0.0527 sec/batch\n", + "Epoch 10/20 Iteration 1684/3560 Training loss: 1.9540 0.0548 sec/batch\n", + "Epoch 10/20 Iteration 1685/3560 Training loss: 1.9535 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1686/3560 Training loss: 1.9533 0.0512 sec/batch\n", + "Epoch 10/20 Iteration 1687/3560 Training loss: 1.9527 0.0498 sec/batch\n", + "Epoch 10/20 Iteration 1688/3560 Training loss: 1.9524 0.0591 sec/batch\n", + "Epoch 10/20 Iteration 1689/3560 Training loss: 1.9522 0.0587 sec/batch\n", + "Epoch 10/20 Iteration 1690/3560 Training loss: 1.9519 0.0505 sec/batch\n", + "Epoch 10/20 Iteration 1691/3560 Training loss: 1.9514 0.0525 sec/batch\n", + "Epoch 10/20 Iteration 1692/3560 Training loss: 1.9513 0.0514 sec/batch\n", + "Epoch 10/20 Iteration 1693/3560 Training loss: 1.9510 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1694/3560 Training loss: 1.9509 0.0530 sec/batch\n", + "Epoch 10/20 Iteration 1695/3560 Training loss: 1.9503 0.0497 sec/batch\n", + "Epoch 10/20 Iteration 1696/3560 Training loss: 1.9499 0.0519 sec/batch\n", + "Epoch 10/20 Iteration 1697/3560 Training loss: 1.9496 0.0511 sec/batch\n", + "Epoch 10/20 Iteration 1698/3560 Training loss: 1.9494 0.0501 sec/batch\n", + "Epoch 10/20 Iteration 1699/3560 Training loss: 1.9492 0.0501 sec/batch\n", + "Epoch 10/20 Iteration 1700/3560 Training loss: 1.9489 0.0550 sec/batch\n", + "Epoch 10/20 Iteration 1701/3560 Training loss: 1.9484 0.0592 sec/batch\n", + "Epoch 10/20 Iteration 1702/3560 Training loss: 1.9479 0.0579 sec/batch\n", + "Epoch 10/20 Iteration 1703/3560 Training loss: 1.9479 0.0529 sec/batch\n", + "Epoch 10/20 Iteration 1704/3560 Training loss: 1.9477 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1705/3560 Training loss: 1.9474 0.0537 sec/batch\n", + "Epoch 10/20 Iteration 1706/3560 Training loss: 1.9472 0.0517 sec/batch\n", + "Epoch 10/20 Iteration 1707/3560 Training loss: 1.9468 0.0585 sec/batch\n", + 
"Epoch 10/20 Iteration 1708/3560 Training loss: 1.9466 0.0527 sec/batch\n", + "Epoch 10/20 Iteration 1709/3560 Training loss: 1.9464 0.0490 sec/batch\n", + "Epoch 10/20 Iteration 1710/3560 Training loss: 1.9463 0.0570 sec/batch\n", + "Epoch 10/20 Iteration 1711/3560 Training loss: 1.9463 0.0587 sec/batch\n", + "Epoch 10/20 Iteration 1712/3560 Training loss: 1.9463 0.0531 sec/batch\n", + "Epoch 10/20 Iteration 1713/3560 Training loss: 1.9463 0.0499 sec/batch\n", + "Epoch 10/20 Iteration 1714/3560 Training loss: 1.9462 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1715/3560 Training loss: 1.9460 0.0513 sec/batch\n", + "Epoch 10/20 Iteration 1716/3560 Training loss: 1.9459 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1717/3560 Training loss: 1.9456 0.0540 sec/batch\n", + "Epoch 10/20 Iteration 1718/3560 Training loss: 1.9452 0.0521 sec/batch\n", + "Epoch 10/20 Iteration 1719/3560 Training loss: 1.9451 0.0571 sec/batch\n", + "Epoch 10/20 Iteration 1720/3560 Training loss: 1.9449 0.0521 sec/batch\n", + "Epoch 10/20 Iteration 1721/3560 Training loss: 1.9449 0.0612 sec/batch\n", + "Epoch 10/20 Iteration 1722/3560 Training loss: 1.9449 0.0588 sec/batch\n", + "Epoch 10/20 Iteration 1723/3560 Training loss: 1.9449 0.0555 sec/batch\n", + "Epoch 10/20 Iteration 1724/3560 Training loss: 1.9446 0.0511 sec/batch\n", + "Epoch 10/20 Iteration 1725/3560 Training loss: 1.9443 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1726/3560 Training loss: 1.9444 0.0515 sec/batch\n", + "Epoch 10/20 Iteration 1727/3560 Training loss: 1.9443 0.0529 sec/batch\n", + "Epoch 10/20 Iteration 1728/3560 Training loss: 1.9440 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1729/3560 Training loss: 1.9441 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1730/3560 Training loss: 1.9441 0.0513 sec/batch\n", + "Epoch 10/20 Iteration 1731/3560 Training loss: 1.9441 0.0531 sec/batch\n", + "Epoch 10/20 Iteration 1732/3560 Training loss: 1.9441 0.0618 sec/batch\n", + "Epoch 10/20 Iteration 1733/3560 Training loss: 
1.9438 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1734/3560 Training loss: 1.9435 0.0568 sec/batch\n", + "Epoch 10/20 Iteration 1735/3560 Training loss: 1.9435 0.0571 sec/batch\n", + "Epoch 10/20 Iteration 1736/3560 Training loss: 1.9435 0.0560 sec/batch\n", + "Epoch 10/20 Iteration 1737/3560 Training loss: 1.9434 0.0536 sec/batch\n", + "Epoch 10/20 Iteration 1738/3560 Training loss: 1.9434 0.0511 sec/batch\n", + "Epoch 10/20 Iteration 1739/3560 Training loss: 1.9434 0.0531 sec/batch\n", + "Epoch 10/20 Iteration 1740/3560 Training loss: 1.9434 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1741/3560 Training loss: 1.9436 0.0510 sec/batch\n", + "Epoch 10/20 Iteration 1742/3560 Training loss: 1.9435 0.0558 sec/batch\n", + "Epoch 10/20 Iteration 1743/3560 Training loss: 1.9436 0.0532 sec/batch\n", + "Epoch 10/20 Iteration 1744/3560 Training loss: 1.9436 0.0539 sec/batch\n", + "Epoch 10/20 Iteration 1745/3560 Training loss: 1.9435 0.0520 sec/batch\n", + "Epoch 10/20 Iteration 1746/3560 Training loss: 1.9435 0.0524 sec/batch\n", + "Epoch 10/20 Iteration 1747/3560 Training loss: 1.9433 0.0557 sec/batch\n", + "Epoch 10/20 Iteration 1748/3560 Training loss: 1.9433 0.0525 sec/batch\n", + "Epoch 10/20 Iteration 1749/3560 Training loss: 1.9433 0.0528 sec/batch\n", + "Epoch 10/20 Iteration 1750/3560 Training loss: 1.9435 0.0605 sec/batch\n", + "Epoch 10/20 Iteration 1751/3560 Training loss: 1.9435 0.0504 sec/batch\n", + "Epoch 10/20 Iteration 1752/3560 Training loss: 1.9433 0.0526 sec/batch\n", + "Epoch 10/20 Iteration 1753/3560 Training loss: 1.9430 0.0505 sec/batch\n", + "Epoch 10/20 Iteration 1754/3560 Training loss: 1.9431 0.0537 sec/batch\n", + "Epoch 10/20 Iteration 1755/3560 Training loss: 1.9432 0.0530 sec/batch\n", + "Epoch 10/20 Iteration 1756/3560 Training loss: 1.9432 0.0511 sec/batch\n", + "Epoch 10/20 Iteration 1757/3560 Training loss: 1.9431 0.0516 sec/batch\n", + "Epoch 10/20 Iteration 1758/3560 Training loss: 1.9431 0.0536 sec/batch\n", + "Epoch 10/20 
Iteration 1759/3560 Training loss: 1.9431 0.0506 sec/batch\n", + "Epoch 10/20 Iteration 1760/3560 Training loss: 1.9429 0.0561 sec/batch\n", + "Epoch 10/20 Iteration 1761/3560 Training loss: 1.9426 0.0664 sec/batch\n", + "Epoch 10/20 Iteration 1762/3560 Training loss: 1.9428 0.0536 sec/batch\n", + "Epoch 10/20 Iteration 1763/3560 Training loss: 1.9429 0.0523 sec/batch\n", + "Epoch 10/20 Iteration 1764/3560 Training loss: 1.9428 0.0503 sec/batch\n", + "Epoch 10/20 Iteration 1765/3560 Training loss: 1.9428 0.0578 sec/batch\n", + "Epoch 10/20 Iteration 1766/3560 Training loss: 1.9427 0.0583 sec/batch\n", + "Epoch 10/20 Iteration 1767/3560 Training loss: 1.9426 0.0524 sec/batch\n", + "Epoch 10/20 Iteration 1768/3560 Training loss: 1.9425 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1769/3560 Training loss: 1.9425 0.0536 sec/batch\n", + "Epoch 10/20 Iteration 1770/3560 Training loss: 1.9428 0.0554 sec/batch\n", + "Epoch 10/20 Iteration 1771/3560 Training loss: 1.9427 0.0606 sec/batch\n", + "Epoch 10/20 Iteration 1772/3560 Training loss: 1.9426 0.0511 sec/batch\n", + "Epoch 10/20 Iteration 1773/3560 Training loss: 1.9425 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1774/3560 Training loss: 1.9423 0.0544 sec/batch\n", + "Epoch 10/20 Iteration 1775/3560 Training loss: 1.9423 0.0500 sec/batch\n", + "Epoch 10/20 Iteration 1776/3560 Training loss: 1.9423 0.0522 sec/batch\n", + "Epoch 10/20 Iteration 1777/3560 Training loss: 1.9423 0.0507 sec/batch\n", + "Epoch 10/20 Iteration 1778/3560 Training loss: 1.9422 0.0558 sec/batch\n", + "Epoch 10/20 Iteration 1779/3560 Training loss: 1.9421 0.0524 sec/batch\n", + "Epoch 10/20 Iteration 1780/3560 Training loss: 1.9420 0.0547 sec/batch\n", + "Epoch 11/20 Iteration 1781/3560 Training loss: 2.0168 0.0622 sec/batch\n", + "Epoch 11/20 Iteration 1782/3560 Training loss: 1.9651 0.0534 sec/batch\n", + "Epoch 11/20 Iteration 1783/3560 Training loss: 1.9515 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1784/3560 Training loss: 1.9431 0.0537 
sec/batch\n", + "Epoch 11/20 Iteration 1785/3560 Training loss: 1.9373 0.0520 sec/batch\n", + "Epoch 11/20 Iteration 1786/3560 Training loss: 1.9285 0.0536 sec/batch\n", + "Epoch 11/20 Iteration 1787/3560 Training loss: 1.9293 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1788/3560 Training loss: 1.9289 0.0507 sec/batch\n", + "Epoch 11/20 Iteration 1789/3560 Training loss: 1.9315 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1790/3560 Training loss: 1.9306 0.0555 sec/batch\n", + "Epoch 11/20 Iteration 1791/3560 Training loss: 1.9280 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1792/3560 Training loss: 1.9261 0.0510 sec/batch\n", + "Epoch 11/20 Iteration 1793/3560 Training loss: 1.9266 0.0519 sec/batch\n", + "Epoch 11/20 Iteration 1794/3560 Training loss: 1.9282 0.0544 sec/batch\n", + "Epoch 11/20 Iteration 1795/3560 Training loss: 1.9274 0.0545 sec/batch\n", + "Epoch 11/20 Iteration 1796/3560 Training loss: 1.9258 0.0533 sec/batch\n", + "Epoch 11/20 Iteration 1797/3560 Training loss: 1.9257 0.0548 sec/batch\n", + "Epoch 11/20 Iteration 1798/3560 Training loss: 1.9276 0.0595 sec/batch\n", + "Epoch 11/20 Iteration 1799/3560 Training loss: 1.9278 0.0522 sec/batch\n", + "Epoch 11/20 Iteration 1800/3560 Training loss: 1.9279 0.0553 sec/batch\n", + "Epoch 11/20 Iteration 1801/3560 Training loss: 1.9273 0.0519 sec/batch\n", + "Epoch 11/20 Iteration 1802/3560 Training loss: 1.9279 0.0520 sec/batch\n", + "Epoch 11/20 Iteration 1803/3560 Training loss: 1.9272 0.0573 sec/batch\n", + "Epoch 11/20 Iteration 1804/3560 Training loss: 1.9270 0.0530 sec/batch\n", + "Epoch 11/20 Iteration 1805/3560 Training loss: 1.9263 0.0528 sec/batch\n", + "Epoch 11/20 Iteration 1806/3560 Training loss: 1.9252 0.0545 sec/batch\n", + "Epoch 11/20 Iteration 1807/3560 Training loss: 1.9243 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1808/3560 Training loss: 1.9247 0.0552 sec/batch\n", + "Epoch 11/20 Iteration 1809/3560 Training loss: 1.9256 0.0496 sec/batch\n", + "Epoch 11/20 Iteration 1810/3560 
Training loss: 1.9260 0.0546 sec/batch\n", + "Epoch 11/20 Iteration 1811/3560 Training loss: 1.9257 0.0563 sec/batch\n", + "Epoch 11/20 Iteration 1812/3560 Training loss: 1.9251 0.0533 sec/batch\n", + "Epoch 11/20 Iteration 1813/3560 Training loss: 1.9249 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1814/3560 Training loss: 1.9257 0.0506 sec/batch\n", + "Epoch 11/20 Iteration 1815/3560 Training loss: 1.9252 0.0514 sec/batch\n", + "Epoch 11/20 Iteration 1816/3560 Training loss: 1.9248 0.0539 sec/batch\n", + "Epoch 11/20 Iteration 1817/3560 Training loss: 1.9244 0.0514 sec/batch\n", + "Epoch 11/20 Iteration 1818/3560 Training loss: 1.9232 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1819/3560 Training loss: 1.9221 0.0555 sec/batch\n", + "Epoch 11/20 Iteration 1820/3560 Training loss: 1.9215 0.0553 sec/batch\n", + "Epoch 11/20 Iteration 1821/3560 Training loss: 1.9210 0.0568 sec/batch\n", + "Epoch 11/20 Iteration 1822/3560 Training loss: 1.9212 0.0543 sec/batch\n", + "Epoch 11/20 Iteration 1823/3560 Training loss: 1.9209 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1824/3560 Training loss: 1.9201 0.0573 sec/batch\n", + "Epoch 11/20 Iteration 1825/3560 Training loss: 1.9203 0.0516 sec/batch\n", + "Epoch 11/20 Iteration 1826/3560 Training loss: 1.9191 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1827/3560 Training loss: 1.9190 0.0512 sec/batch\n", + "Epoch 11/20 Iteration 1828/3560 Training loss: 1.9184 0.0504 sec/batch\n", + "Epoch 11/20 Iteration 1829/3560 Training loss: 1.9181 0.0519 sec/batch\n", + "Epoch 11/20 Iteration 1830/3560 Training loss: 1.9186 0.0496 sec/batch\n", + "Epoch 11/20 Iteration 1831/3560 Training loss: 1.9178 0.0542 sec/batch\n", + "Epoch 11/20 Iteration 1832/3560 Training loss: 1.9185 0.0519 sec/batch\n", + "Epoch 11/20 Iteration 1833/3560 Training loss: 1.9182 0.0529 sec/batch\n", + "Epoch 11/20 Iteration 1834/3560 Training loss: 1.9180 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1835/3560 Training loss: 1.9176 0.0510 sec/batch\n", + 
"Epoch 11/20 Iteration 1836/3560 Training loss: 1.9179 0.0502 sec/batch\n", + "Epoch 11/20 Iteration 1837/3560 Training loss: 1.9179 0.0507 sec/batch\n", + "Epoch 11/20 Iteration 1838/3560 Training loss: 1.9177 0.0522 sec/batch\n", + "Epoch 11/20 Iteration 1839/3560 Training loss: 1.9174 0.0556 sec/batch\n", + "Epoch 11/20 Iteration 1840/3560 Training loss: 1.9179 0.0534 sec/batch\n", + "Epoch 11/20 Iteration 1841/3560 Training loss: 1.9176 0.0570 sec/batch\n", + "Epoch 11/20 Iteration 1842/3560 Training loss: 1.9182 0.0534 sec/batch\n", + "Epoch 11/20 Iteration 1843/3560 Training loss: 1.9186 0.0510 sec/batch\n", + "Epoch 11/20 Iteration 1844/3560 Training loss: 1.9190 0.0505 sec/batch\n", + "Epoch 11/20 Iteration 1845/3560 Training loss: 1.9187 0.0579 sec/batch\n", + "Epoch 11/20 Iteration 1846/3560 Training loss: 1.9189 0.0525 sec/batch\n", + "Epoch 11/20 Iteration 1847/3560 Training loss: 1.9188 0.0593 sec/batch\n", + "Epoch 11/20 Iteration 1848/3560 Training loss: 1.9183 0.0570 sec/batch\n", + "Epoch 11/20 Iteration 1849/3560 Training loss: 1.9180 0.0505 sec/batch\n", + "Epoch 11/20 Iteration 1850/3560 Training loss: 1.9179 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1851/3560 Training loss: 1.9184 0.0516 sec/batch\n", + "Epoch 11/20 Iteration 1852/3560 Training loss: 1.9186 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1853/3560 Training loss: 1.9189 0.0560 sec/batch\n", + "Epoch 11/20 Iteration 1854/3560 Training loss: 1.9184 0.0520 sec/batch\n", + "Epoch 11/20 Iteration 1855/3560 Training loss: 1.9183 0.0533 sec/batch\n", + "Epoch 11/20 Iteration 1856/3560 Training loss: 1.9185 0.0508 sec/batch\n", + "Epoch 11/20 Iteration 1857/3560 Training loss: 1.9184 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1858/3560 Training loss: 1.9185 0.0533 sec/batch\n", + "Epoch 11/20 Iteration 1859/3560 Training loss: 1.9180 0.0533 sec/batch\n", + "Epoch 11/20 Iteration 1860/3560 Training loss: 1.9177 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1861/3560 Training loss: 
1.9172 0.0508 sec/batch\n", + "Epoch 11/20 Iteration 1862/3560 Training loss: 1.9172 0.0541 sec/batch\n", + "Epoch 11/20 Iteration 1863/3560 Training loss: 1.9166 0.0572 sec/batch\n", + "Epoch 11/20 Iteration 1864/3560 Training loss: 1.9163 0.0526 sec/batch\n", + "Epoch 11/20 Iteration 1865/3560 Training loss: 1.9158 0.0529 sec/batch\n", + "Epoch 11/20 Iteration 1866/3560 Training loss: 1.9154 0.0514 sec/batch\n", + "Epoch 11/20 Iteration 1867/3560 Training loss: 1.9152 0.0516 sec/batch\n", + "Epoch 11/20 Iteration 1868/3560 Training loss: 1.9149 0.0571 sec/batch\n", + "Epoch 11/20 Iteration 1869/3560 Training loss: 1.9143 0.0610 sec/batch\n", + "Epoch 11/20 Iteration 1870/3560 Training loss: 1.9142 0.0634 sec/batch\n", + "Epoch 11/20 Iteration 1871/3560 Training loss: 1.9140 0.0545 sec/batch\n", + "Epoch 11/20 Iteration 1872/3560 Training loss: 1.9139 0.0555 sec/batch\n", + "Epoch 11/20 Iteration 1873/3560 Training loss: 1.9133 0.0589 sec/batch\n", + "Epoch 11/20 Iteration 1874/3560 Training loss: 1.9131 0.0511 sec/batch\n", + "Epoch 11/20 Iteration 1875/3560 Training loss: 1.9127 0.0529 sec/batch\n", + "Epoch 11/20 Iteration 1876/3560 Training loss: 1.9126 0.0498 sec/batch\n", + "Epoch 11/20 Iteration 1877/3560 Training loss: 1.9123 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1878/3560 Training loss: 1.9118 0.0525 sec/batch\n", + "Epoch 11/20 Iteration 1879/3560 Training loss: 1.9114 0.0571 sec/batch\n", + "Epoch 11/20 Iteration 1880/3560 Training loss: 1.9109 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1881/3560 Training loss: 1.9108 0.0495 sec/batch\n", + "Epoch 11/20 Iteration 1882/3560 Training loss: 1.9107 0.0553 sec/batch\n", + "Epoch 11/20 Iteration 1883/3560 Training loss: 1.9103 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1884/3560 Training loss: 1.9101 0.0519 sec/batch\n", + "Epoch 11/20 Iteration 1885/3560 Training loss: 1.9097 0.0527 sec/batch\n", + "Epoch 11/20 Iteration 1886/3560 Training loss: 1.9096 0.0507 sec/batch\n", + "Epoch 11/20 
Iteration 1887/3560 Training loss: 1.9095 0.0508 sec/batch\n", + "Epoch 11/20 Iteration 1888/3560 Training loss: 1.9094 0.0583 sec/batch\n", + "Epoch 11/20 Iteration 1889/3560 Training loss: 1.9093 0.0532 sec/batch\n", + "Epoch 11/20 Iteration 1890/3560 Training loss: 1.9093 0.0507 sec/batch\n", + "Epoch 11/20 Iteration 1891/3560 Training loss: 1.9091 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1892/3560 Training loss: 1.9089 0.0516 sec/batch\n", + "Epoch 11/20 Iteration 1893/3560 Training loss: 1.9087 0.0502 sec/batch\n", + "Epoch 11/20 Iteration 1894/3560 Training loss: 1.9086 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1895/3560 Training loss: 1.9083 0.0504 sec/batch\n", + "Epoch 11/20 Iteration 1896/3560 Training loss: 1.9079 0.0599 sec/batch\n", + "Epoch 11/20 Iteration 1897/3560 Training loss: 1.9078 0.0563 sec/batch\n", + "Epoch 11/20 Iteration 1898/3560 Training loss: 1.9076 0.0560 sec/batch\n", + "Epoch 11/20 Iteration 1899/3560 Training loss: 1.9076 0.0532 sec/batch\n", + "Epoch 11/20 Iteration 1900/3560 Training loss: 1.9076 0.0555 sec/batch\n", + "Epoch 11/20 Iteration 1901/3560 Training loss: 1.9075 0.0510 sec/batch\n", + "Epoch 11/20 Iteration 1902/3560 Training loss: 1.9071 0.0528 sec/batch\n", + "Epoch 11/20 Iteration 1903/3560 Training loss: 1.9068 0.0528 sec/batch\n", + "Epoch 11/20 Iteration 1904/3560 Training loss: 1.9069 0.0511 sec/batch\n", + "Epoch 11/20 Iteration 1905/3560 Training loss: 1.9068 0.0553 sec/batch\n", + "Epoch 11/20 Iteration 1906/3560 Training loss: 1.9064 0.0565 sec/batch\n", + "Epoch 11/20 Iteration 1907/3560 Training loss: 1.9065 0.0530 sec/batch\n", + "Epoch 11/20 Iteration 1908/3560 Training loss: 1.9065 0.0506 sec/batch\n", + "Epoch 11/20 Iteration 1909/3560 Training loss: 1.9065 0.0509 sec/batch\n", + "Epoch 11/20 Iteration 1910/3560 Training loss: 1.9064 0.0600 sec/batch\n", + "Epoch 11/20 Iteration 1911/3560 Training loss: 1.9061 0.0584 sec/batch\n", + "Epoch 11/20 Iteration 1912/3560 Training loss: 1.9058 0.0502 
sec/batch\n", + "Epoch 11/20 Iteration 1913/3560 Training loss: 1.9058 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1914/3560 Training loss: 1.9058 0.0544 sec/batch\n", + "Epoch 11/20 Iteration 1915/3560 Training loss: 1.9057 0.0509 sec/batch\n", + "Epoch 11/20 Iteration 1916/3560 Training loss: 1.9058 0.0519 sec/batch\n", + "Epoch 11/20 Iteration 1917/3560 Training loss: 1.9058 0.0502 sec/batch\n", + "Epoch 11/20 Iteration 1918/3560 Training loss: 1.9058 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1919/3560 Training loss: 1.9060 0.0520 sec/batch\n", + "Epoch 11/20 Iteration 1920/3560 Training loss: 1.9058 0.0530 sec/batch\n", + "Epoch 11/20 Iteration 1921/3560 Training loss: 1.9059 0.0509 sec/batch\n", + "Epoch 11/20 Iteration 1922/3560 Training loss: 1.9059 0.0511 sec/batch\n", + "Epoch 11/20 Iteration 1923/3560 Training loss: 1.9059 0.0547 sec/batch\n", + "Epoch 11/20 Iteration 1924/3560 Training loss: 1.9059 0.0516 sec/batch\n", + "Epoch 11/20 Iteration 1925/3560 Training loss: 1.9058 0.0537 sec/batch\n", + "Epoch 11/20 Iteration 1926/3560 Training loss: 1.9058 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1927/3560 Training loss: 1.9058 0.0558 sec/batch\n", + "Epoch 11/20 Iteration 1928/3560 Training loss: 1.9059 0.0519 sec/batch\n", + "Epoch 11/20 Iteration 1929/3560 Training loss: 1.9059 0.0524 sec/batch\n", + "Epoch 11/20 Iteration 1930/3560 Training loss: 1.9057 0.0535 sec/batch\n", + "Epoch 11/20 Iteration 1931/3560 Training loss: 1.9055 0.0495 sec/batch\n", + "Epoch 11/20 Iteration 1932/3560 Training loss: 1.9056 0.0493 sec/batch\n", + "Epoch 11/20 Iteration 1933/3560 Training loss: 1.9056 0.0498 sec/batch\n", + "Epoch 11/20 Iteration 1934/3560 Training loss: 1.9057 0.0513 sec/batch\n", + "Epoch 11/20 Iteration 1935/3560 Training loss: 1.9056 0.0517 sec/batch\n", + "Epoch 11/20 Iteration 1936/3560 Training loss: 1.9055 0.0539 sec/batch\n", + "Epoch 11/20 Iteration 1937/3560 Training loss: 1.9054 0.0518 sec/batch\n", + "Epoch 11/20 Iteration 1938/3560 
Training loss: 1.9054 0.0552 sec/batch\n", + "Epoch 11/20 Iteration 1939/3560 Training loss: 1.9052 0.0543 sec/batch\n", + "Epoch 11/20 Iteration 1940/3560 Training loss: 1.9054 0.0516 sec/batch\n", + "Epoch 11/20 Iteration 1941/3560 Training loss: 1.9055 0.0532 sec/batch\n", + "Epoch 11/20 Iteration 1942/3560 Training loss: 1.9054 0.0514 sec/batch\n", + "Epoch 11/20 Iteration 1943/3560 Training loss: 1.9054 0.0504 sec/batch\n", + "Epoch 11/20 Iteration 1944/3560 Training loss: 1.9054 0.0497 sec/batch\n", + "Epoch 11/20 Iteration 1945/3560 Training loss: 1.9053 0.0522 sec/batch\n", + "Epoch 11/20 Iteration 1946/3560 Training loss: 1.9052 0.0522 sec/batch\n", + "Epoch 11/20 Iteration 1947/3560 Training loss: 1.9053 0.0515 sec/batch\n", + "Epoch 11/20 Iteration 1948/3560 Training loss: 1.9056 0.0521 sec/batch\n", + "Epoch 11/20 Iteration 1949/3560 Training loss: 1.9055 0.0527 sec/batch\n", + "Epoch 11/20 Iteration 1950/3560 Training loss: 1.9055 0.0556 sec/batch\n", + "Epoch 11/20 Iteration 1951/3560 Training loss: 1.9053 0.0578 sec/batch\n", + "Epoch 11/20 Iteration 1952/3560 Training loss: 1.9052 0.0567 sec/batch\n", + "Epoch 11/20 Iteration 1953/3560 Training loss: 1.9052 0.0523 sec/batch\n", + "Epoch 11/20 Iteration 1954/3560 Training loss: 1.9052 0.0527 sec/batch\n", + "Epoch 11/20 Iteration 1955/3560 Training loss: 1.9052 0.0528 sec/batch\n", + "Epoch 11/20 Iteration 1956/3560 Training loss: 1.9052 0.0527 sec/batch\n", + "Epoch 11/20 Iteration 1957/3560 Training loss: 1.9050 0.0506 sec/batch\n", + "Epoch 11/20 Iteration 1958/3560 Training loss: 1.9049 0.0513 sec/batch\n", + "Epoch 12/20 Iteration 1959/3560 Training loss: 1.9775 0.0522 sec/batch\n", + "Epoch 12/20 Iteration 1960/3560 Training loss: 1.9283 0.0510 sec/batch\n", + "Epoch 12/20 Iteration 1961/3560 Training loss: 1.9150 0.0529 sec/batch\n", + "Epoch 12/20 Iteration 1962/3560 Training loss: 1.9070 0.0522 sec/batch\n", + "Epoch 12/20 Iteration 1963/3560 Training loss: 1.9009 0.0521 sec/batch\n", + 
"Epoch 12/20 Iteration 1964/3560 Training loss: 1.8915 0.0548 sec/batch\n", + "Epoch 12/20 Iteration 1965/3560 Training loss: 1.8928 0.0519 sec/batch\n", + "[... training log condensed: iterations 1966-2492 (epochs 12-14) omitted; loss declines steadily from ~1.89 to ~1.81 at roughly 0.05 sec/batch ...]\n", + "Epoch 14/20 Iteration 2492/3560 Training loss: 1.8121 0.0521 sec/batch\n", + "Epoch 15/20 Iteration 2493/3560 Training loss: 1.8967 0.0492 sec/batch\n", + "Epoch 15/20 Iteration 2494/3560 Training loss: 1.8464 0.0556 sec/batch\n", + "Epoch 15/20 Iteration 2495/3560 Training loss: 1.8323 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2496/3560 Training loss: 1.8228 0.0513 sec/batch\n", + "Epoch 15/20 Iteration 2497/3560 Training loss: 1.8177 0.0513 sec/batch\n", + "Epoch 15/20 Iteration 2498/3560 Training loss: 1.8065 0.0498 sec/batch\n", + "Epoch 15/20 Iteration 2499/3560 Training loss: 1.8082 0.0565 sec/batch\n", + "Epoch 15/20 Iteration 2500/3560 Training loss: 1.8065 0.0525 sec/batch\n", + "Epoch 15/20 Iteration 2501/3560 Training loss: 
1.8080 0.0534 sec/batch\n", + "Epoch 15/20 Iteration 2502/3560 Training loss: 1.8065 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2503/3560 Training loss: 1.8027 0.0561 sec/batch\n", + "Epoch 15/20 Iteration 2504/3560 Training loss: 1.8008 0.0546 sec/batch\n", + "Epoch 15/20 Iteration 2505/3560 Training loss: 1.8011 0.0526 sec/batch\n", + "Epoch 15/20 Iteration 2506/3560 Training loss: 1.8038 0.0564 sec/batch\n", + "Epoch 15/20 Iteration 2507/3560 Training loss: 1.8032 0.0613 sec/batch\n", + "Epoch 15/20 Iteration 2508/3560 Training loss: 1.8016 0.0535 sec/batch\n", + "Epoch 15/20 Iteration 2509/3560 Training loss: 1.8012 0.0556 sec/batch\n", + "Epoch 15/20 Iteration 2510/3560 Training loss: 1.8032 0.0531 sec/batch\n", + "Epoch 15/20 Iteration 2511/3560 Training loss: 1.8031 0.0542 sec/batch\n", + "Epoch 15/20 Iteration 2512/3560 Training loss: 1.8038 0.0525 sec/batch\n", + "Epoch 15/20 Iteration 2513/3560 Training loss: 1.8027 0.0524 sec/batch\n", + "Epoch 15/20 Iteration 2514/3560 Training loss: 1.8036 0.0513 sec/batch\n", + "Epoch 15/20 Iteration 2515/3560 Training loss: 1.8031 0.0542 sec/batch\n", + "Epoch 15/20 Iteration 2516/3560 Training loss: 1.8027 0.0538 sec/batch\n", + "Epoch 15/20 Iteration 2517/3560 Training loss: 1.8024 0.0522 sec/batch\n", + "Epoch 15/20 Iteration 2518/3560 Training loss: 1.8008 0.0502 sec/batch\n", + "Epoch 15/20 Iteration 2519/3560 Training loss: 1.7997 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2520/3560 Training loss: 1.7998 0.0530 sec/batch\n", + "Epoch 15/20 Iteration 2521/3560 Training loss: 1.8006 0.0564 sec/batch\n", + "Epoch 15/20 Iteration 2522/3560 Training loss: 1.8011 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2523/3560 Training loss: 1.8006 0.0530 sec/batch\n", + "Epoch 15/20 Iteration 2524/3560 Training loss: 1.7999 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2525/3560 Training loss: 1.8001 0.0522 sec/batch\n", + "Epoch 15/20 Iteration 2526/3560 Training loss: 1.8010 0.0528 sec/batch\n", + "Epoch 15/20 
Iteration 2527/3560 Training loss: 1.8008 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2528/3560 Training loss: 1.8005 0.0508 sec/batch\n", + "Epoch 15/20 Iteration 2529/3560 Training loss: 1.8001 0.0563 sec/batch\n", + "Epoch 15/20 Iteration 2530/3560 Training loss: 1.7989 0.0528 sec/batch\n", + "Epoch 15/20 Iteration 2531/3560 Training loss: 1.7976 0.0641 sec/batch\n", + "Epoch 15/20 Iteration 2532/3560 Training loss: 1.7970 0.0547 sec/batch\n", + "Epoch 15/20 Iteration 2533/3560 Training loss: 1.7964 0.0526 sec/batch\n", + "Epoch 15/20 Iteration 2534/3560 Training loss: 1.7967 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2535/3560 Training loss: 1.7964 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2536/3560 Training loss: 1.7956 0.0518 sec/batch\n", + "Epoch 15/20 Iteration 2537/3560 Training loss: 1.7959 0.0517 sec/batch\n", + "Epoch 15/20 Iteration 2538/3560 Training loss: 1.7948 0.0512 sec/batch\n", + "Epoch 15/20 Iteration 2539/3560 Training loss: 1.7946 0.0545 sec/batch\n", + "Epoch 15/20 Iteration 2540/3560 Training loss: 1.7941 0.0540 sec/batch\n", + "Epoch 15/20 Iteration 2541/3560 Training loss: 1.7939 0.0600 sec/batch\n", + "Epoch 15/20 Iteration 2542/3560 Training loss: 1.7946 0.0503 sec/batch\n", + "Epoch 15/20 Iteration 2543/3560 Training loss: 1.7938 0.0503 sec/batch\n", + "Epoch 15/20 Iteration 2544/3560 Training loss: 1.7945 0.0531 sec/batch\n", + "Epoch 15/20 Iteration 2545/3560 Training loss: 1.7946 0.0559 sec/batch\n", + "Epoch 15/20 Iteration 2546/3560 Training loss: 1.7948 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2547/3560 Training loss: 1.7946 0.0603 sec/batch\n", + "Epoch 15/20 Iteration 2548/3560 Training loss: 1.7949 0.0565 sec/batch\n", + "Epoch 15/20 Iteration 2549/3560 Training loss: 1.7952 0.0568 sec/batch\n", + "Epoch 15/20 Iteration 2550/3560 Training loss: 1.7951 0.0508 sec/batch\n", + "Epoch 15/20 Iteration 2551/3560 Training loss: 1.7947 0.0516 sec/batch\n", + "Epoch 15/20 Iteration 2552/3560 Training loss: 1.7953 0.0521 
sec/batch\n", + "Epoch 15/20 Iteration 2553/3560 Training loss: 1.7951 0.0597 sec/batch\n", + "Epoch 15/20 Iteration 2554/3560 Training loss: 1.7958 0.0589 sec/batch\n", + "Epoch 15/20 Iteration 2555/3560 Training loss: 1.7962 0.0512 sec/batch\n", + "Epoch 15/20 Iteration 2556/3560 Training loss: 1.7965 0.0553 sec/batch\n", + "Epoch 15/20 Iteration 2557/3560 Training loss: 1.7962 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2558/3560 Training loss: 1.7966 0.0541 sec/batch\n", + "Epoch 15/20 Iteration 2559/3560 Training loss: 1.7968 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2560/3560 Training loss: 1.7963 0.0522 sec/batch\n", + "Epoch 15/20 Iteration 2561/3560 Training loss: 1.7961 0.0496 sec/batch\n", + "Epoch 15/20 Iteration 2562/3560 Training loss: 1.7961 0.0538 sec/batch\n", + "Epoch 15/20 Iteration 2563/3560 Training loss: 1.7966 0.0543 sec/batch\n", + "Epoch 15/20 Iteration 2564/3560 Training loss: 1.7969 0.0515 sec/batch\n", + "Epoch 15/20 Iteration 2565/3560 Training loss: 1.7972 0.0618 sec/batch\n", + "Epoch 15/20 Iteration 2566/3560 Training loss: 1.7968 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2567/3560 Training loss: 1.7967 0.0529 sec/batch\n", + "Epoch 15/20 Iteration 2568/3560 Training loss: 1.7971 0.0508 sec/batch\n", + "Epoch 15/20 Iteration 2569/3560 Training loss: 1.7968 0.0577 sec/batch\n", + "Epoch 15/20 Iteration 2570/3560 Training loss: 1.7968 0.0577 sec/batch\n", + "Epoch 15/20 Iteration 2571/3560 Training loss: 1.7963 0.0518 sec/batch\n", + "Epoch 15/20 Iteration 2572/3560 Training loss: 1.7962 0.0556 sec/batch\n", + "Epoch 15/20 Iteration 2573/3560 Training loss: 1.7957 0.0521 sec/batch\n", + "Epoch 15/20 Iteration 2574/3560 Training loss: 1.7958 0.0528 sec/batch\n", + "Epoch 15/20 Iteration 2575/3560 Training loss: 1.7952 0.0537 sec/batch\n", + "Epoch 15/20 Iteration 2576/3560 Training loss: 1.7952 0.0540 sec/batch\n", + "Epoch 15/20 Iteration 2577/3560 Training loss: 1.7947 0.0510 sec/batch\n", + "Epoch 15/20 Iteration 2578/3560 
Training loss: 1.7944 0.0516 sec/batch\n", + "Epoch 15/20 Iteration 2579/3560 Training loss: 1.7943 0.0516 sec/batch\n", + "Epoch 15/20 Iteration 2580/3560 Training loss: 1.7940 0.0567 sec/batch\n", + "Epoch 15/20 Iteration 2581/3560 Training loss: 1.7935 0.0536 sec/batch\n", + "Epoch 15/20 Iteration 2582/3560 Training loss: 1.7938 0.0597 sec/batch\n", + "Epoch 15/20 Iteration 2583/3560 Training loss: 1.7936 0.0513 sec/batch\n", + "Epoch 15/20 Iteration 2584/3560 Training loss: 1.7934 0.0543 sec/batch\n", + "Epoch 15/20 Iteration 2585/3560 Training loss: 1.7929 0.0534 sec/batch\n", + "Epoch 15/20 Iteration 2586/3560 Training loss: 1.7927 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2587/3560 Training loss: 1.7924 0.0514 sec/batch\n", + "Epoch 15/20 Iteration 2588/3560 Training loss: 1.7924 0.0520 sec/batch\n", + "Epoch 15/20 Iteration 2589/3560 Training loss: 1.7922 0.0529 sec/batch\n", + "Epoch 15/20 Iteration 2590/3560 Training loss: 1.7918 0.0528 sec/batch\n", + "Epoch 15/20 Iteration 2591/3560 Training loss: 1.7914 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2592/3560 Training loss: 1.7910 0.0536 sec/batch\n", + "Epoch 15/20 Iteration 2593/3560 Training loss: 1.7910 0.0543 sec/batch\n", + "Epoch 15/20 Iteration 2594/3560 Training loss: 1.7909 0.0574 sec/batch\n", + "Epoch 15/20 Iteration 2595/3560 Training loss: 1.7906 0.0527 sec/batch\n", + "Epoch 15/20 Iteration 2596/3560 Training loss: 1.7904 0.0554 sec/batch\n", + "Epoch 15/20 Iteration 2597/3560 Training loss: 1.7902 0.0540 sec/batch\n", + "Epoch 15/20 Iteration 2598/3560 Training loss: 1.7900 0.0577 sec/batch\n", + "Epoch 15/20 Iteration 2599/3560 Training loss: 1.7900 0.0525 sec/batch\n", + "Epoch 15/20 Iteration 2600/3560 Training loss: 1.7899 0.0539 sec/batch\n", + "Epoch 15/20 Iteration 2601/3560 Training loss: 1.7899 0.0507 sec/batch\n", + "Epoch 15/20 Iteration 2602/3560 Training loss: 1.7899 0.0509 sec/batch\n", + "Epoch 15/20 Iteration 2603/3560 Training loss: 1.7897 0.0498 sec/batch\n", + 
"Epoch 15/20 Iteration 2604/3560 Training loss: 1.7896 0.0524 sec/batch\n", + "Epoch 15/20 Iteration 2605/3560 Training loss: 1.7894 0.0561 sec/batch\n", + "Epoch 15/20 Iteration 2606/3560 Training loss: 1.7893 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2607/3560 Training loss: 1.7891 0.0512 sec/batch\n", + "Epoch 15/20 Iteration 2608/3560 Training loss: 1.7887 0.0514 sec/batch\n", + "Epoch 15/20 Iteration 2609/3560 Training loss: 1.7886 0.0551 sec/batch\n", + "Epoch 15/20 Iteration 2610/3560 Training loss: 1.7885 0.0577 sec/batch\n", + "Epoch 15/20 Iteration 2611/3560 Training loss: 1.7884 0.0520 sec/batch\n", + "Epoch 15/20 Iteration 2612/3560 Training loss: 1.7883 0.0612 sec/batch\n", + "Epoch 15/20 Iteration 2613/3560 Training loss: 1.7883 0.0561 sec/batch\n", + "Epoch 15/20 Iteration 2614/3560 Training loss: 1.7879 0.0536 sec/batch\n", + "Epoch 15/20 Iteration 2615/3560 Training loss: 1.7876 0.0518 sec/batch\n", + "Epoch 15/20 Iteration 2616/3560 Training loss: 1.7878 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2617/3560 Training loss: 1.7877 0.0583 sec/batch\n", + "Epoch 15/20 Iteration 2618/3560 Training loss: 1.7874 0.0524 sec/batch\n", + "Epoch 15/20 Iteration 2619/3560 Training loss: 1.7875 0.0511 sec/batch\n", + "Epoch 15/20 Iteration 2620/3560 Training loss: 1.7876 0.0657 sec/batch\n", + "Epoch 15/20 Iteration 2621/3560 Training loss: 1.7875 0.0645 sec/batch\n", + "Epoch 15/20 Iteration 2622/3560 Training loss: 1.7874 0.0634 sec/batch\n", + "Epoch 15/20 Iteration 2623/3560 Training loss: 1.7871 0.0561 sec/batch\n", + "Epoch 15/20 Iteration 2624/3560 Training loss: 1.7868 0.0534 sec/batch\n", + "Epoch 15/20 Iteration 2625/3560 Training loss: 1.7869 0.0506 sec/batch\n", + "Epoch 15/20 Iteration 2626/3560 Training loss: 1.7869 0.0532 sec/batch\n", + "Epoch 15/20 Iteration 2627/3560 Training loss: 1.7869 0.0520 sec/batch\n", + "Epoch 15/20 Iteration 2628/3560 Training loss: 1.7870 0.0503 sec/batch\n", + "Epoch 15/20 Iteration 2629/3560 Training loss: 
1.7870 0.0529 sec/batch\n", + "Epoch 15/20 Iteration 2630/3560 Training loss: 1.7871 0.0555 sec/batch\n", + "Epoch 15/20 Iteration 2631/3560 Training loss: 1.7872 0.0523 sec/batch\n", + "Epoch 15/20 Iteration 2632/3560 Training loss: 1.7871 0.0539 sec/batch\n", + "Epoch 15/20 Iteration 2633/3560 Training loss: 1.7873 0.0524 sec/batch\n", + "Epoch 15/20 Iteration 2634/3560 Training loss: 1.7872 0.0526 sec/batch\n", + "Epoch 15/20 Iteration 2635/3560 Training loss: 1.7872 0.0516 sec/batch\n", + "Epoch 15/20 Iteration 2636/3560 Training loss: 1.7872 0.0516 sec/batch\n", + "Epoch 15/20 Iteration 2637/3560 Training loss: 1.7871 0.0537 sec/batch\n", + "Epoch 15/20 Iteration 2638/3560 Training loss: 1.7872 0.0524 sec/batch\n", + "Epoch 15/20 Iteration 2639/3560 Training loss: 1.7871 0.0565 sec/batch\n", + "Epoch 15/20 Iteration 2640/3560 Training loss: 1.7873 0.0562 sec/batch\n", + "Epoch 15/20 Iteration 2641/3560 Training loss: 1.7873 0.0820 sec/batch\n", + "Epoch 15/20 Iteration 2642/3560 Training loss: 1.7871 0.0612 sec/batch\n", + "Epoch 15/20 Iteration 2643/3560 Training loss: 1.7869 0.0514 sec/batch\n", + "Epoch 15/20 Iteration 2644/3560 Training loss: 1.7869 0.0553 sec/batch\n", + "Epoch 15/20 Iteration 2645/3560 Training loss: 1.7870 0.0508 sec/batch\n", + "Epoch 15/20 Iteration 2646/3560 Training loss: 1.7870 0.0505 sec/batch\n", + "Epoch 15/20 Iteration 2647/3560 Training loss: 1.7869 0.0533 sec/batch\n", + "Epoch 15/20 Iteration 2648/3560 Training loss: 1.7869 0.0508 sec/batch\n", + "Epoch 15/20 Iteration 2649/3560 Training loss: 1.7869 0.0529 sec/batch\n", + "Epoch 15/20 Iteration 2650/3560 Training loss: 1.7868 0.0559 sec/batch\n", + "Epoch 15/20 Iteration 2651/3560 Training loss: 1.7866 0.0559 sec/batch\n", + "Epoch 15/20 Iteration 2652/3560 Training loss: 1.7868 0.0520 sec/batch\n", + "Epoch 15/20 Iteration 2653/3560 Training loss: 1.7869 0.0501 sec/batch\n", + "Epoch 15/20 Iteration 2654/3560 Training loss: 1.7868 0.0597 sec/batch\n", + "Epoch 15/20 
Iteration 2655/3560 Training loss: 1.7869 0.0535 sec/batch\n", + "Epoch 15/20 Iteration 2656/3560 Training loss: 1.7869 0.0516 sec/batch\n", + "Epoch 15/20 Iteration 2657/3560 Training loss: 1.7868 0.0551 sec/batch\n", + "Epoch 15/20 Iteration 2658/3560 Training loss: 1.7868 0.0514 sec/batch\n", + "Epoch 15/20 Iteration 2659/3560 Training loss: 1.7868 0.0524 sec/batch\n", + "Epoch 15/20 Iteration 2660/3560 Training loss: 1.7872 0.0519 sec/batch\n", + "Epoch 15/20 Iteration 2661/3560 Training loss: 1.7871 0.0517 sec/batch\n", + "Epoch 15/20 Iteration 2662/3560 Training loss: 1.7871 0.0525 sec/batch\n", + "Epoch 15/20 Iteration 2663/3560 Training loss: 1.7870 0.0541 sec/batch\n", + "Epoch 15/20 Iteration 2664/3560 Training loss: 1.7868 0.0525 sec/batch\n", + "Epoch 15/20 Iteration 2665/3560 Training loss: 1.7869 0.0543 sec/batch\n", + "Epoch 15/20 Iteration 2666/3560 Training loss: 1.7869 0.0501 sec/batch\n", + "Epoch 15/20 Iteration 2667/3560 Training loss: 1.7869 0.0516 sec/batch\n", + "Epoch 15/20 Iteration 2668/3560 Training loss: 1.7868 0.0539 sec/batch\n", + "Epoch 15/20 Iteration 2669/3560 Training loss: 1.7866 0.0505 sec/batch\n", + "Epoch 15/20 Iteration 2670/3560 Training loss: 1.7866 0.0532 sec/batch\n", + "Epoch 16/20 Iteration 2671/3560 Training loss: 1.8683 0.0500 sec/batch\n", + "Epoch 16/20 Iteration 2672/3560 Training loss: 1.8210 0.0502 sec/batch\n", + "Epoch 16/20 Iteration 2673/3560 Training loss: 1.8042 0.0503 sec/batch\n", + "Epoch 16/20 Iteration 2674/3560 Training loss: 1.7984 0.0536 sec/batch\n", + "Epoch 16/20 Iteration 2675/3560 Training loss: 1.7926 0.0515 sec/batch\n", + "Epoch 16/20 Iteration 2676/3560 Training loss: 1.7816 0.0522 sec/batch\n", + "Epoch 16/20 Iteration 2677/3560 Training loss: 1.7815 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2678/3560 Training loss: 1.7787 0.0512 sec/batch\n", + "Epoch 16/20 Iteration 2679/3560 Training loss: 1.7795 0.0549 sec/batch\n", + "Epoch 16/20 Iteration 2680/3560 Training loss: 1.7795 0.0527 
sec/batch\n", + "Epoch 16/20 Iteration 2681/3560 Training loss: 1.7763 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2682/3560 Training loss: 1.7743 0.0655 sec/batch\n", + "Epoch 16/20 Iteration 2683/3560 Training loss: 1.7749 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2684/3560 Training loss: 1.7768 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2685/3560 Training loss: 1.7760 0.0524 sec/batch\n", + "Epoch 16/20 Iteration 2686/3560 Training loss: 1.7749 0.0558 sec/batch\n", + "Epoch 16/20 Iteration 2687/3560 Training loss: 1.7751 0.0601 sec/batch\n", + "Epoch 16/20 Iteration 2688/3560 Training loss: 1.7775 0.0529 sec/batch\n", + "Epoch 16/20 Iteration 2689/3560 Training loss: 1.7780 0.0531 sec/batch\n", + "Epoch 16/20 Iteration 2690/3560 Training loss: 1.7788 0.0567 sec/batch\n", + "Epoch 16/20 Iteration 2691/3560 Training loss: 1.7785 0.0518 sec/batch\n", + "Epoch 16/20 Iteration 2692/3560 Training loss: 1.7794 0.0543 sec/batch\n", + "Epoch 16/20 Iteration 2693/3560 Training loss: 1.7787 0.0513 sec/batch\n", + "Epoch 16/20 Iteration 2694/3560 Training loss: 1.7783 0.0525 sec/batch\n", + "Epoch 16/20 Iteration 2695/3560 Training loss: 1.7779 0.0530 sec/batch\n", + "Epoch 16/20 Iteration 2696/3560 Training loss: 1.7766 0.0571 sec/batch\n", + "Epoch 16/20 Iteration 2697/3560 Training loss: 1.7754 0.0592 sec/batch\n", + "Epoch 16/20 Iteration 2698/3560 Training loss: 1.7759 0.0545 sec/batch\n", + "Epoch 16/20 Iteration 2699/3560 Training loss: 1.7766 0.0544 sec/batch\n", + "Epoch 16/20 Iteration 2700/3560 Training loss: 1.7770 0.0543 sec/batch\n", + "Epoch 16/20 Iteration 2701/3560 Training loss: 1.7767 0.0525 sec/batch\n", + "Epoch 16/20 Iteration 2702/3560 Training loss: 1.7763 0.0578 sec/batch\n", + "Epoch 16/20 Iteration 2703/3560 Training loss: 1.7764 0.0510 sec/batch\n", + "Epoch 16/20 Iteration 2704/3560 Training loss: 1.7770 0.0505 sec/batch\n", + "Epoch 16/20 Iteration 2705/3560 Training loss: 1.7769 0.0527 sec/batch\n", + "Epoch 16/20 Iteration 2706/3560 
Training loss: 1.7765 0.0507 sec/batch\n", + "Epoch 16/20 Iteration 2707/3560 Training loss: 1.7760 0.0522 sec/batch\n", + "Epoch 16/20 Iteration 2708/3560 Training loss: 1.7749 0.0563 sec/batch\n", + "Epoch 16/20 Iteration 2709/3560 Training loss: 1.7736 0.0570 sec/batch\n", + "Epoch 16/20 Iteration 2710/3560 Training loss: 1.7729 0.0535 sec/batch\n", + "Epoch 16/20 Iteration 2711/3560 Training loss: 1.7723 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2712/3560 Training loss: 1.7727 0.0500 sec/batch\n", + "Epoch 16/20 Iteration 2713/3560 Training loss: 1.7722 0.0531 sec/batch\n", + "Epoch 16/20 Iteration 2714/3560 Training loss: 1.7714 0.0533 sec/batch\n", + "Epoch 16/20 Iteration 2715/3560 Training loss: 1.7718 0.0512 sec/batch\n", + "Epoch 16/20 Iteration 2716/3560 Training loss: 1.7707 0.0500 sec/batch\n", + "Epoch 16/20 Iteration 2717/3560 Training loss: 1.7703 0.0512 sec/batch\n", + "Epoch 16/20 Iteration 2718/3560 Training loss: 1.7700 0.0505 sec/batch\n", + "Epoch 16/20 Iteration 2719/3560 Training loss: 1.7700 0.0561 sec/batch\n", + "Epoch 16/20 Iteration 2720/3560 Training loss: 1.7707 0.0618 sec/batch\n", + "Epoch 16/20 Iteration 2721/3560 Training loss: 1.7701 0.0518 sec/batch\n", + "Epoch 16/20 Iteration 2722/3560 Training loss: 1.7710 0.0522 sec/batch\n", + "Epoch 16/20 Iteration 2723/3560 Training loss: 1.7709 0.0568 sec/batch\n", + "Epoch 16/20 Iteration 2724/3560 Training loss: 1.7709 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2725/3560 Training loss: 1.7706 0.0549 sec/batch\n", + "Epoch 16/20 Iteration 2726/3560 Training loss: 1.7708 0.0527 sec/batch\n", + "Epoch 16/20 Iteration 2727/3560 Training loss: 1.7712 0.0518 sec/batch\n", + "Epoch 16/20 Iteration 2728/3560 Training loss: 1.7709 0.0578 sec/batch\n", + "Epoch 16/20 Iteration 2729/3560 Training loss: 1.7706 0.0536 sec/batch\n", + "Epoch 16/20 Iteration 2730/3560 Training loss: 1.7712 0.0551 sec/batch\n", + "Epoch 16/20 Iteration 2731/3560 Training loss: 1.7710 0.0514 sec/batch\n", + 
"Epoch 16/20 Iteration 2732/3560 Training loss: 1.7718 0.0516 sec/batch\n", + "Epoch 16/20 Iteration 2733/3560 Training loss: 1.7723 0.0536 sec/batch\n", + "Epoch 16/20 Iteration 2734/3560 Training loss: 1.7724 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2735/3560 Training loss: 1.7723 0.0522 sec/batch\n", + "Epoch 16/20 Iteration 2736/3560 Training loss: 1.7726 0.0538 sec/batch\n", + "Epoch 16/20 Iteration 2737/3560 Training loss: 1.7726 0.0537 sec/batch\n", + "Epoch 16/20 Iteration 2738/3560 Training loss: 1.7722 0.0537 sec/batch\n", + "Epoch 16/20 Iteration 2739/3560 Training loss: 1.7721 0.0513 sec/batch\n", + "Epoch 16/20 Iteration 2740/3560 Training loss: 1.7720 0.0505 sec/batch\n", + "Epoch 16/20 Iteration 2741/3560 Training loss: 1.7725 0.0502 sec/batch\n", + "Epoch 16/20 Iteration 2742/3560 Training loss: 1.7727 0.0509 sec/batch\n", + "Epoch 16/20 Iteration 2743/3560 Training loss: 1.7731 0.0523 sec/batch\n", + "Epoch 16/20 Iteration 2744/3560 Training loss: 1.7729 0.0513 sec/batch\n", + "Epoch 16/20 Iteration 2745/3560 Training loss: 1.7728 0.0524 sec/batch\n", + "Epoch 16/20 Iteration 2746/3560 Training loss: 1.7732 0.0512 sec/batch\n", + "Epoch 16/20 Iteration 2747/3560 Training loss: 1.7730 0.0522 sec/batch\n", + "Epoch 16/20 Iteration 2748/3560 Training loss: 1.7732 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2749/3560 Training loss: 1.7727 0.0533 sec/batch\n", + "Epoch 16/20 Iteration 2750/3560 Training loss: 1.7727 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2751/3560 Training loss: 1.7722 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2752/3560 Training loss: 1.7723 0.0551 sec/batch\n", + "Epoch 16/20 Iteration 2753/3560 Training loss: 1.7719 0.0530 sec/batch\n", + "Epoch 16/20 Iteration 2754/3560 Training loss: 1.7719 0.0522 sec/batch\n", + "Epoch 16/20 Iteration 2755/3560 Training loss: 1.7714 0.0522 sec/batch\n", + "Epoch 16/20 Iteration 2756/3560 Training loss: 1.7711 0.0516 sec/batch\n", + "Epoch 16/20 Iteration 2757/3560 Training loss: 
1.7710 0.0582 sec/batch\n", + "Epoch 16/20 Iteration 2758/3560 Training loss: 1.7706 0.0529 sec/batch\n", + "Epoch 16/20 Iteration 2759/3560 Training loss: 1.7701 0.0501 sec/batch\n", + "Epoch 16/20 Iteration 2760/3560 Training loss: 1.7702 0.0610 sec/batch\n", + "Epoch 16/20 Iteration 2761/3560 Training loss: 1.7698 0.0530 sec/batch\n", + "Epoch 16/20 Iteration 2762/3560 Training loss: 1.7696 0.0513 sec/batch\n", + "Epoch 16/20 Iteration 2763/3560 Training loss: 1.7691 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2764/3560 Training loss: 1.7687 0.0518 sec/batch\n", + "Epoch 16/20 Iteration 2765/3560 Training loss: 1.7684 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2766/3560 Training loss: 1.7683 0.0516 sec/batch\n", + "Epoch 16/20 Iteration 2767/3560 Training loss: 1.7681 0.0544 sec/batch\n", + "Epoch 16/20 Iteration 2768/3560 Training loss: 1.7677 0.0508 sec/batch\n", + "Epoch 16/20 Iteration 2769/3560 Training loss: 1.7672 0.0588 sec/batch\n", + "Epoch 16/20 Iteration 2770/3560 Training loss: 1.7667 0.0545 sec/batch\n", + "Epoch 16/20 Iteration 2771/3560 Training loss: 1.7667 0.0506 sec/batch\n", + "Epoch 16/20 Iteration 2772/3560 Training loss: 1.7666 0.0527 sec/batch\n", + "Epoch 16/20 Iteration 2773/3560 Training loss: 1.7664 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2774/3560 Training loss: 1.7662 0.0544 sec/batch\n", + "Epoch 16/20 Iteration 2775/3560 Training loss: 1.7659 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2776/3560 Training loss: 1.7657 0.0516 sec/batch\n", + "Epoch 16/20 Iteration 2777/3560 Training loss: 1.7656 0.0518 sec/batch\n", + "Epoch 16/20 Iteration 2778/3560 Training loss: 1.7655 0.0506 sec/batch\n", + "Epoch 16/20 Iteration 2779/3560 Training loss: 1.7655 0.0509 sec/batch\n", + "Epoch 16/20 Iteration 2780/3560 Training loss: 1.7656 0.0514 sec/batch\n", + "Epoch 16/20 Iteration 2781/3560 Training loss: 1.7654 0.0527 sec/batch\n", + "Epoch 16/20 Iteration 2782/3560 Training loss: 1.7653 0.0520 sec/batch\n", + "Epoch 16/20 
Iteration 2783/3560 Training loss: 1.7651 0.0559 sec/batch\n", + "Epoch 16/20 Iteration 2784/3560 Training loss: 1.7650 0.0563 sec/batch\n", + "Epoch 16/20 Iteration 2785/3560 Training loss: 1.7648 0.0527 sec/batch\n", + "Epoch 16/20 Iteration 2786/3560 Training loss: 1.7644 0.0561 sec/batch\n", + "Epoch 16/20 Iteration 2787/3560 Training loss: 1.7644 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2788/3560 Training loss: 1.7643 0.0502 sec/batch\n", + "Epoch 16/20 Iteration 2789/3560 Training loss: 1.7642 0.0527 sec/batch\n", + "Epoch 16/20 Iteration 2790/3560 Training loss: 1.7641 0.0583 sec/batch\n", + "Epoch 16/20 Iteration 2791/3560 Training loss: 1.7641 0.0546 sec/batch\n", + "Epoch 16/20 Iteration 2792/3560 Training loss: 1.7638 0.0514 sec/batch\n", + "Epoch 16/20 Iteration 2793/3560 Training loss: 1.7635 0.0629 sec/batch\n", + "Epoch 16/20 Iteration 2794/3560 Training loss: 1.7636 0.0523 sec/batch\n", + "Epoch 16/20 Iteration 2795/3560 Training loss: 1.7636 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2796/3560 Training loss: 1.7631 0.0505 sec/batch\n", + "Epoch 16/20 Iteration 2797/3560 Training loss: 1.7632 0.0531 sec/batch\n", + "Epoch 16/20 Iteration 2798/3560 Training loss: 1.7632 0.0533 sec/batch\n", + "Epoch 16/20 Iteration 2799/3560 Training loss: 1.7631 0.0531 sec/batch\n", + "Epoch 16/20 Iteration 2800/3560 Training loss: 1.7630 0.0536 sec/batch\n", + "Epoch 16/20 Iteration 2801/3560 Training loss: 1.7627 0.0505 sec/batch\n", + "Epoch 16/20 Iteration 2802/3560 Training loss: 1.7624 0.0505 sec/batch\n", + "Epoch 16/20 Iteration 2803/3560 Training loss: 1.7624 0.0631 sec/batch\n", + "Epoch 16/20 Iteration 2804/3560 Training loss: 1.7624 0.0516 sec/batch\n", + "Epoch 16/20 Iteration 2805/3560 Training loss: 1.7623 0.0513 sec/batch\n", + "Epoch 16/20 Iteration 2806/3560 Training loss: 1.7624 0.0536 sec/batch\n", + "Epoch 16/20 Iteration 2807/3560 Training loss: 1.7625 0.0514 sec/batch\n", + "Epoch 16/20 Iteration 2808/3560 Training loss: 1.7626 0.0524 
sec/batch\n", + "Epoch 16/20 Iteration 2809/3560 Training loss: 1.7627 0.0604 sec/batch\n", + "Epoch 16/20 Iteration 2810/3560 Training loss: 1.7626 0.0533 sec/batch\n", + "Epoch 16/20 Iteration 2811/3560 Training loss: 1.7628 0.0520 sec/batch\n", + "Epoch 16/20 Iteration 2812/3560 Training loss: 1.7628 0.0517 sec/batch\n", + "Epoch 16/20 Iteration 2813/3560 Training loss: 1.7628 0.0539 sec/batch\n", + "Epoch 16/20 Iteration 2814/3560 Training loss: 1.7628 0.0513 sec/batch\n", + "Epoch 16/20 Iteration 2815/3560 Training loss: 1.7628 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2816/3560 Training loss: 1.7628 0.0511 sec/batch\n", + "Epoch 16/20 Iteration 2817/3560 Training loss: 1.7628 0.0530 sec/batch\n", + "Epoch 16/20 Iteration 2818/3560 Training loss: 1.7630 0.0600 sec/batch\n", + "Epoch 16/20 Iteration 2819/3560 Training loss: 1.7629 0.0662 sec/batch\n", + "Epoch 16/20 Iteration 2820/3560 Training loss: 1.7628 0.0521 sec/batch\n", + "Epoch 16/20 Iteration 2821/3560 Training loss: 1.7626 0.0527 sec/batch\n", + "Epoch 16/20 Iteration 2822/3560 Training loss: 1.7627 0.0518 sec/batch\n", + "Epoch 16/20 Iteration 2823/3560 Training loss: 1.7627 0.0537 sec/batch\n", + "Epoch 16/20 Iteration 2824/3560 Training loss: 1.7628 0.0512 sec/batch\n", + "Epoch 16/20 Iteration 2825/3560 Training loss: 1.7627 0.0512 sec/batch\n", + "Epoch 16/20 Iteration 2826/3560 Training loss: 1.7626 0.0527 sec/batch\n", + "Epoch 16/20 Iteration 2827/3560 Training loss: 1.7627 0.0577 sec/batch\n", + "Epoch 16/20 Iteration 2828/3560 Training loss: 1.7627 0.0554 sec/batch\n", + "Epoch 16/20 Iteration 2829/3560 Training loss: 1.7624 0.0502 sec/batch\n", + "Epoch 16/20 Iteration 2830/3560 Training loss: 1.7625 0.0563 sec/batch\n", + "Epoch 16/20 Iteration 2831/3560 Training loss: 1.7627 0.0533 sec/batch\n", + "Epoch 16/20 Iteration 2832/3560 Training loss: 1.7626 0.0546 sec/batch\n", + "Epoch 16/20 Iteration 2833/3560 Training loss: 1.7627 0.0528 sec/batch\n", + "Epoch 16/20 Iteration 2834/3560 
Training loss: 1.7627 0.0538 sec/batch\n", + "Epoch 16/20 Iteration 2835/3560 Training loss: 1.7627 0.0532 sec/batch\n", + "[... iterations 2836-2847: epoch 16 training loss steady near 1.763 ...]\n", + "Epoch 16/20 Iteration 2848/3560 Training loss: 1.7633 0.0506 sec/batch\n", + "Epoch 17/20 Iteration 2849/3560 Training loss: 1.8519 0.0566 sec/batch\n", + "[... iterations 2850-3025: epoch 17 training loss decreases from 1.85 to about 1.737 ...]\n", + "Epoch 17/20 Iteration 3026/3560 Training loss: 1.7365 0.0528 sec/batch\n", + "Epoch 18/20 Iteration 3027/3560 Training loss: 1.8349 0.0518 sec/batch\n", + "[... iterations 3028-3203: epoch 18 training loss decreases to about 1.713 ...]\n", + "Epoch 18/20 Iteration 3204/3560 Training loss: 1.7133 0.0529 sec/batch\n", + "Epoch 19/20 Iteration 3205/3560 Training loss: 1.8186 0.0526 sec/batch\n", + "[... iterations 3206-3345: epoch 19 training loss decreases to about 1.693 ...]\n", + "Epoch 19/20 Iteration 3346/3560 
Training loss: 1.6932 0.0510 sec/batch\n", + "Epoch 19/20 Iteration 3347/3560 Training loss: 1.6932 0.0548 sec/batch\n", + "Epoch 19/20 Iteration 3348/3560 Training loss: 1.6933 0.0534 sec/batch\n", + "Epoch 19/20 Iteration 3349/3560 Training loss: 1.6932 0.0546 sec/batch\n", + "Epoch 19/20 Iteration 3350/3560 Training loss: 1.6933 0.0611 sec/batch\n", + "Epoch 19/20 Iteration 3351/3560 Training loss: 1.6933 0.0509 sec/batch\n", + "Epoch 19/20 Iteration 3352/3560 Training loss: 1.6935 0.0570 sec/batch\n", + "Epoch 19/20 Iteration 3353/3560 Training loss: 1.6936 0.0525 sec/batch\n", + "Epoch 19/20 Iteration 3354/3560 Training loss: 1.6935 0.0540 sec/batch\n", + "Epoch 19/20 Iteration 3355/3560 Training loss: 1.6932 0.0515 sec/batch\n", + "Epoch 19/20 Iteration 3356/3560 Training loss: 1.6933 0.0505 sec/batch\n", + "Epoch 19/20 Iteration 3357/3560 Training loss: 1.6933 0.0631 sec/batch\n", + "Epoch 19/20 Iteration 3358/3560 Training loss: 1.6934 0.0547 sec/batch\n", + "Epoch 19/20 Iteration 3359/3560 Training loss: 1.6934 0.0545 sec/batch\n", + "Epoch 19/20 Iteration 3360/3560 Training loss: 1.6934 0.0615 sec/batch\n", + "Epoch 19/20 Iteration 3361/3560 Training loss: 1.6935 0.0570 sec/batch\n", + "Epoch 19/20 Iteration 3362/3560 Training loss: 1.6935 0.0518 sec/batch\n", + "Epoch 19/20 Iteration 3363/3560 Training loss: 1.6932 0.0515 sec/batch\n", + "Epoch 19/20 Iteration 3364/3560 Training loss: 1.6934 0.0527 sec/batch\n", + "Epoch 19/20 Iteration 3365/3560 Training loss: 1.6936 0.0531 sec/batch\n", + "Epoch 19/20 Iteration 3366/3560 Training loss: 1.6935 0.0516 sec/batch\n", + "Epoch 19/20 Iteration 3367/3560 Training loss: 1.6936 0.0508 sec/batch\n", + "Epoch 19/20 Iteration 3368/3560 Training loss: 1.6936 0.0570 sec/batch\n", + "Epoch 19/20 Iteration 3369/3560 Training loss: 1.6936 0.0596 sec/batch\n", + "Epoch 19/20 Iteration 3370/3560 Training loss: 1.6936 0.0544 sec/batch\n", + "Epoch 19/20 Iteration 3371/3560 Training loss: 1.6937 0.0577 sec/batch\n", + 
"Epoch 19/20 Iteration 3372/3560 Training loss: 1.6941 0.0545 sec/batch\n", + "Epoch 19/20 Iteration 3373/3560 Training loss: 1.6940 0.0529 sec/batch\n", + "Epoch 19/20 Iteration 3374/3560 Training loss: 1.6940 0.0556 sec/batch\n", + "Epoch 19/20 Iteration 3375/3560 Training loss: 1.6939 0.0517 sec/batch\n", + "Epoch 19/20 Iteration 3376/3560 Training loss: 1.6937 0.0522 sec/batch\n", + "Epoch 19/20 Iteration 3377/3560 Training loss: 1.6938 0.0544 sec/batch\n", + "Epoch 19/20 Iteration 3378/3560 Training loss: 1.6939 0.0569 sec/batch\n", + "Epoch 19/20 Iteration 3379/3560 Training loss: 1.6939 0.0544 sec/batch\n", + "Epoch 19/20 Iteration 3380/3560 Training loss: 1.6938 0.0636 sec/batch\n", + "Epoch 19/20 Iteration 3381/3560 Training loss: 1.6937 0.0513 sec/batch\n", + "Epoch 19/20 Iteration 3382/3560 Training loss: 1.6938 0.0616 sec/batch\n", + "Epoch 20/20 Iteration 3383/3560 Training loss: 1.8154 0.0545 sec/batch\n", + "Epoch 20/20 Iteration 3384/3560 Training loss: 1.7624 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3385/3560 Training loss: 1.7416 0.0522 sec/batch\n", + "Epoch 20/20 Iteration 3386/3560 Training loss: 1.7301 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3387/3560 Training loss: 1.7224 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3388/3560 Training loss: 1.7085 0.0496 sec/batch\n", + "Epoch 20/20 Iteration 3389/3560 Training loss: 1.7080 0.0530 sec/batch\n", + "Epoch 20/20 Iteration 3390/3560 Training loss: 1.7042 0.0559 sec/batch\n", + "Epoch 20/20 Iteration 3391/3560 Training loss: 1.7043 0.0542 sec/batch\n", + "Epoch 20/20 Iteration 3392/3560 Training loss: 1.7023 0.0571 sec/batch\n", + "Epoch 20/20 Iteration 3393/3560 Training loss: 1.6978 0.0523 sec/batch\n", + "Epoch 20/20 Iteration 3394/3560 Training loss: 1.6951 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3395/3560 Training loss: 1.6949 0.0567 sec/batch\n", + "Epoch 20/20 Iteration 3396/3560 Training loss: 1.6970 0.0604 sec/batch\n", + "Epoch 20/20 Iteration 3397/3560 Training loss: 
1.6955 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3398/3560 Training loss: 1.6934 0.0550 sec/batch\n", + "Epoch 20/20 Iteration 3399/3560 Training loss: 1.6931 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3400/3560 Training loss: 1.6944 0.0539 sec/batch\n", + "Epoch 20/20 Iteration 3401/3560 Training loss: 1.6944 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3402/3560 Training loss: 1.6947 0.0650 sec/batch\n", + "Epoch 20/20 Iteration 3403/3560 Training loss: 1.6936 0.0563 sec/batch\n", + "Epoch 20/20 Iteration 3404/3560 Training loss: 1.6938 0.0533 sec/batch\n", + "Epoch 20/20 Iteration 3405/3560 Training loss: 1.6930 0.0588 sec/batch\n", + "Epoch 20/20 Iteration 3406/3560 Training loss: 1.6922 0.0570 sec/batch\n", + "Epoch 20/20 Iteration 3407/3560 Training loss: 1.6919 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3408/3560 Training loss: 1.6905 0.0574 sec/batch\n", + "Epoch 20/20 Iteration 3409/3560 Training loss: 1.6891 0.0543 sec/batch\n", + "Epoch 20/20 Iteration 3410/3560 Training loss: 1.6894 0.0607 sec/batch\n", + "Epoch 20/20 Iteration 3411/3560 Training loss: 1.6901 0.0512 sec/batch\n", + "Epoch 20/20 Iteration 3412/3560 Training loss: 1.6905 0.0533 sec/batch\n", + "Epoch 20/20 Iteration 3413/3560 Training loss: 1.6900 0.0537 sec/batch\n", + "Epoch 20/20 Iteration 3414/3560 Training loss: 1.6893 0.0512 sec/batch\n", + "Epoch 20/20 Iteration 3415/3560 Training loss: 1.6892 0.0531 sec/batch\n", + "Epoch 20/20 Iteration 3416/3560 Training loss: 1.6900 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3417/3560 Training loss: 1.6898 0.0530 sec/batch\n", + "Epoch 20/20 Iteration 3418/3560 Training loss: 1.6895 0.0547 sec/batch\n", + "Epoch 20/20 Iteration 3419/3560 Training loss: 1.6889 0.0508 sec/batch\n", + "Epoch 20/20 Iteration 3420/3560 Training loss: 1.6877 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3421/3560 Training loss: 1.6861 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3422/3560 Training loss: 1.6854 0.0518 sec/batch\n", + "Epoch 20/20 
Iteration 3423/3560 Training loss: 1.6848 0.0517 sec/batch\n", + "Epoch 20/20 Iteration 3424/3560 Training loss: 1.6852 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3425/3560 Training loss: 1.6848 0.0559 sec/batch\n", + "Epoch 20/20 Iteration 3426/3560 Training loss: 1.6840 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3427/3560 Training loss: 1.6841 0.0555 sec/batch\n", + "Epoch 20/20 Iteration 3428/3560 Training loss: 1.6831 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3429/3560 Training loss: 1.6827 0.0517 sec/batch\n", + "Epoch 20/20 Iteration 3430/3560 Training loss: 1.6821 0.0505 sec/batch\n", + "Epoch 20/20 Iteration 3431/3560 Training loss: 1.6819 0.0559 sec/batch\n", + "Epoch 20/20 Iteration 3432/3560 Training loss: 1.6827 0.0530 sec/batch\n", + "Epoch 20/20 Iteration 3433/3560 Training loss: 1.6822 0.0580 sec/batch\n", + "Epoch 20/20 Iteration 3434/3560 Training loss: 1.6831 0.0539 sec/batch\n", + "Epoch 20/20 Iteration 3435/3560 Training loss: 1.6832 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3436/3560 Training loss: 1.6833 0.0536 sec/batch\n", + "Epoch 20/20 Iteration 3437/3560 Training loss: 1.6831 0.0520 sec/batch\n", + "Epoch 20/20 Iteration 3438/3560 Training loss: 1.6832 0.0520 sec/batch\n", + "Epoch 20/20 Iteration 3439/3560 Training loss: 1.6836 0.0549 sec/batch\n", + "Epoch 20/20 Iteration 3440/3560 Training loss: 1.6832 0.0565 sec/batch\n", + "Epoch 20/20 Iteration 3441/3560 Training loss: 1.6827 0.0521 sec/batch\n", + "Epoch 20/20 Iteration 3442/3560 Training loss: 1.6833 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3443/3560 Training loss: 1.6833 0.0524 sec/batch\n", + "Epoch 20/20 Iteration 3444/3560 Training loss: 1.6839 0.0504 sec/batch\n", + "Epoch 20/20 Iteration 3445/3560 Training loss: 1.6843 0.0523 sec/batch\n", + "Epoch 20/20 Iteration 3446/3560 Training loss: 1.6845 0.0589 sec/batch\n", + "Epoch 20/20 Iteration 3447/3560 Training loss: 1.6844 0.0512 sec/batch\n", + "Epoch 20/20 Iteration 3448/3560 Training loss: 1.6847 0.0525 
sec/batch\n", + "Epoch 20/20 Iteration 3449/3560 Training loss: 1.6847 0.0519 sec/batch\n", + "Epoch 20/20 Iteration 3450/3560 Training loss: 1.6842 0.0513 sec/batch\n", + "Epoch 20/20 Iteration 3451/3560 Training loss: 1.6842 0.0512 sec/batch\n", + "Epoch 20/20 Iteration 3452/3560 Training loss: 1.6840 0.0534 sec/batch\n", + "Epoch 20/20 Iteration 3453/3560 Training loss: 1.6844 0.0559 sec/batch\n", + "Epoch 20/20 Iteration 3454/3560 Training loss: 1.6846 0.0542 sec/batch\n", + "Epoch 20/20 Iteration 3455/3560 Training loss: 1.6851 0.0538 sec/batch\n", + "Epoch 20/20 Iteration 3456/3560 Training loss: 1.6849 0.0506 sec/batch\n", + "Epoch 20/20 Iteration 3457/3560 Training loss: 1.6848 0.0542 sec/batch\n", + "Epoch 20/20 Iteration 3458/3560 Training loss: 1.6851 0.0541 sec/batch\n", + "Epoch 20/20 Iteration 3459/3560 Training loss: 1.6849 0.0517 sec/batch\n", + "Epoch 20/20 Iteration 3460/3560 Training loss: 1.6849 0.0529 sec/batch\n", + "Epoch 20/20 Iteration 3461/3560 Training loss: 1.6843 0.0534 sec/batch\n", + "Epoch 20/20 Iteration 3462/3560 Training loss: 1.6842 0.0533 sec/batch\n", + "Epoch 20/20 Iteration 3463/3560 Training loss: 1.6835 0.0587 sec/batch\n", + "Epoch 20/20 Iteration 3464/3560 Training loss: 1.6837 0.0556 sec/batch\n", + "Epoch 20/20 Iteration 3465/3560 Training loss: 1.6831 0.0541 sec/batch\n", + "Epoch 20/20 Iteration 3466/3560 Training loss: 1.6830 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3467/3560 Training loss: 1.6826 0.0523 sec/batch\n", + "Epoch 20/20 Iteration 3468/3560 Training loss: 1.6824 0.0506 sec/batch\n", + "Epoch 20/20 Iteration 3469/3560 Training loss: 1.6820 0.0621 sec/batch\n", + "Epoch 20/20 Iteration 3470/3560 Training loss: 1.6817 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3471/3560 Training loss: 1.6812 0.0522 sec/batch\n", + "Epoch 20/20 Iteration 3472/3560 Training loss: 1.6814 0.0551 sec/batch\n", + "Epoch 20/20 Iteration 3473/3560 Training loss: 1.6811 0.0530 sec/batch\n", + "Epoch 20/20 Iteration 3474/3560 
Training loss: 1.6810 0.0503 sec/batch\n", + "Epoch 20/20 Iteration 3475/3560 Training loss: 1.6805 0.0576 sec/batch\n", + "Epoch 20/20 Iteration 3476/3560 Training loss: 1.6802 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3477/3560 Training loss: 1.6800 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3478/3560 Training loss: 1.6799 0.0505 sec/batch\n", + "Epoch 20/20 Iteration 3479/3560 Training loss: 1.6797 0.0535 sec/batch\n", + "Epoch 20/20 Iteration 3480/3560 Training loss: 1.6793 0.0602 sec/batch\n", + "Epoch 20/20 Iteration 3481/3560 Training loss: 1.6790 0.0580 sec/batch\n", + "Epoch 20/20 Iteration 3482/3560 Training loss: 1.6786 0.0524 sec/batch\n", + "Epoch 20/20 Iteration 3483/3560 Training loss: 1.6786 0.0523 sec/batch\n", + "Epoch 20/20 Iteration 3484/3560 Training loss: 1.6784 0.0545 sec/batch\n", + "Epoch 20/20 Iteration 3485/3560 Training loss: 1.6782 0.0587 sec/batch\n", + "Epoch 20/20 Iteration 3486/3560 Training loss: 1.6780 0.0543 sec/batch\n", + "Epoch 20/20 Iteration 3487/3560 Training loss: 1.6778 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3488/3560 Training loss: 1.6776 0.0498 sec/batch\n", + "Epoch 20/20 Iteration 3489/3560 Training loss: 1.6776 0.0536 sec/batch\n", + "Epoch 20/20 Iteration 3490/3560 Training loss: 1.6776 0.0565 sec/batch\n", + "Epoch 20/20 Iteration 3491/3560 Training loss: 1.6776 0.0523 sec/batch\n", + "Epoch 20/20 Iteration 3492/3560 Training loss: 1.6776 0.0552 sec/batch\n", + "Epoch 20/20 Iteration 3493/3560 Training loss: 1.6774 0.0519 sec/batch\n", + "Epoch 20/20 Iteration 3494/3560 Training loss: 1.6773 0.0520 sec/batch\n", + "Epoch 20/20 Iteration 3495/3560 Training loss: 1.6771 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3496/3560 Training loss: 1.6770 0.0525 sec/batch\n", + "Epoch 20/20 Iteration 3497/3560 Training loss: 1.6767 0.0543 sec/batch\n", + "Epoch 20/20 Iteration 3498/3560 Training loss: 1.6765 0.0610 sec/batch\n", + "Epoch 20/20 Iteration 3499/3560 Training loss: 1.6765 0.0512 sec/batch\n", + 
"Epoch 20/20 Iteration 3500/3560 Training loss: 1.6765 0.0559 sec/batch\n", + "Epoch 20/20 Iteration 3501/3560 Training loss: 1.6763 0.0515 sec/batch\n", + "Epoch 20/20 Iteration 3502/3560 Training loss: 1.6763 0.0554 sec/batch\n", + "Epoch 20/20 Iteration 3503/3560 Training loss: 1.6763 0.0510 sec/batch\n", + "Epoch 20/20 Iteration 3504/3560 Training loss: 1.6760 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3505/3560 Training loss: 1.6756 0.0550 sec/batch\n", + "Epoch 20/20 Iteration 3506/3560 Training loss: 1.6758 0.0528 sec/batch\n", + "Epoch 20/20 Iteration 3507/3560 Training loss: 1.6757 0.0549 sec/batch\n", + "Epoch 20/20 Iteration 3508/3560 Training loss: 1.6753 0.0541 sec/batch\n", + "Epoch 20/20 Iteration 3509/3560 Training loss: 1.6755 0.0547 sec/batch\n", + "Epoch 20/20 Iteration 3510/3560 Training loss: 1.6757 0.0662 sec/batch\n", + "Epoch 20/20 Iteration 3511/3560 Training loss: 1.6756 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3512/3560 Training loss: 1.6755 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3513/3560 Training loss: 1.6752 0.0522 sec/batch\n", + "Epoch 20/20 Iteration 3514/3560 Training loss: 1.6749 0.0540 sec/batch\n", + "Epoch 20/20 Iteration 3515/3560 Training loss: 1.6750 0.0529 sec/batch\n", + "Epoch 20/20 Iteration 3516/3560 Training loss: 1.6749 0.0531 sec/batch\n", + "Epoch 20/20 Iteration 3517/3560 Training loss: 1.6749 0.0512 sec/batch\n", + "Epoch 20/20 Iteration 3518/3560 Training loss: 1.6749 0.0600 sec/batch\n", + "Epoch 20/20 Iteration 3519/3560 Training loss: 1.6750 0.0526 sec/batch\n", + "Epoch 20/20 Iteration 3520/3560 Training loss: 1.6751 0.0583 sec/batch\n", + "Epoch 20/20 Iteration 3521/3560 Training loss: 1.6752 0.0517 sec/batch\n", + "Epoch 20/20 Iteration 3522/3560 Training loss: 1.6751 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3523/3560 Training loss: 1.6754 0.0551 sec/batch\n", + "Epoch 20/20 Iteration 3524/3560 Training loss: 1.6754 0.0534 sec/batch\n", + "Epoch 20/20 Iteration 3525/3560 Training loss: 
1.6753 0.0559 sec/batch\n", + "Epoch 20/20 Iteration 3526/3560 Training loss: 1.6754 0.0518 sec/batch\n", + "Epoch 20/20 Iteration 3527/3560 Training loss: 1.6753 0.0525 sec/batch\n", + "Epoch 20/20 Iteration 3528/3560 Training loss: 1.6754 0.0586 sec/batch\n", + "Epoch 20/20 Iteration 3529/3560 Training loss: 1.6754 0.0572 sec/batch\n", + "Epoch 20/20 Iteration 3530/3560 Training loss: 1.6756 0.0565 sec/batch\n", + "Epoch 20/20 Iteration 3531/3560 Training loss: 1.6757 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3532/3560 Training loss: 1.6755 0.0598 sec/batch\n", + "Epoch 20/20 Iteration 3533/3560 Training loss: 1.6753 0.0527 sec/batch\n", + "Epoch 20/20 Iteration 3534/3560 Training loss: 1.6753 0.0555 sec/batch\n", + "Epoch 20/20 Iteration 3535/3560 Training loss: 1.6754 0.0536 sec/batch\n", + "Epoch 20/20 Iteration 3536/3560 Training loss: 1.6755 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3537/3560 Training loss: 1.6754 0.0557 sec/batch\n", + "Epoch 20/20 Iteration 3538/3560 Training loss: 1.6754 0.0596 sec/batch\n", + "Epoch 20/20 Iteration 3539/3560 Training loss: 1.6755 0.0507 sec/batch\n", + "Epoch 20/20 Iteration 3540/3560 Training loss: 1.6754 0.0539 sec/batch\n", + "Epoch 20/20 Iteration 3541/3560 Training loss: 1.6752 0.0547 sec/batch\n", + "Epoch 20/20 Iteration 3542/3560 Training loss: 1.6753 0.0540 sec/batch\n", + "Epoch 20/20 Iteration 3543/3560 Training loss: 1.6755 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3544/3560 Training loss: 1.6755 0.0546 sec/batch\n", + "Epoch 20/20 Iteration 3545/3560 Training loss: 1.6754 0.0573 sec/batch\n", + "Epoch 20/20 Iteration 3546/3560 Training loss: 1.6755 0.0514 sec/batch\n", + "Epoch 20/20 Iteration 3547/3560 Training loss: 1.6754 0.0503 sec/batch\n", + "Epoch 20/20 Iteration 3548/3560 Training loss: 1.6754 0.0525 sec/batch\n", + "Epoch 20/20 Iteration 3549/3560 Training loss: 1.6755 0.0509 sec/batch\n", + "Epoch 20/20 Iteration 3550/3560 Training loss: 1.6759 0.0560 sec/batch\n", + "Epoch 20/20 
Iteration 3551/3560 Training loss: 1.6759 0.0629 sec/batch\n", + "Epoch 20/20 Iteration 3552/3560 Training loss: 1.6759 0.0577 sec/batch\n", + "Epoch 20/20 Iteration 3553/3560 Training loss: 1.6758 0.0522 sec/batch\n", + "Epoch 20/20 Iteration 3554/3560 Training loss: 1.6756 0.0536 sec/batch\n", + "Epoch 20/20 Iteration 3555/3560 Training loss: 1.6757 0.0553 sec/batch\n", + "Epoch 20/20 Iteration 3556/3560 Training loss: 1.6757 0.0506 sec/batch\n", + "Epoch 20/20 Iteration 3557/3560 Training loss: 1.6758 0.0516 sec/batch\n", + "Epoch 20/20 Iteration 3558/3560 Training loss: 1.6757 0.0503 sec/batch\n", + "Epoch 20/20 Iteration 3559/3560 Training loss: 1.6756 0.0542 sec/batch\n", + "Epoch 20/20 Iteration 3560/3560 Training loss: 1.6757 0.0565 sec/batch\n", + "Epoch 1/20 Iteration 1/3560 Training loss: 4.4197 1.2867 sec/batch\n", + "Epoch 1/20 Iteration 2/3560 Training loss: 4.3853 0.0676 sec/batch\n", + "Epoch 1/20 Iteration 3/3560 Training loss: 4.2336 0.0690 sec/batch\n", + "Epoch 1/20 Iteration 4/3560 Training loss: 4.1475 0.0676 sec/batch\n", + "Epoch 1/20 Iteration 5/3560 Training loss: 4.0287 0.0651 sec/batch\n", + "Epoch 1/20 Iteration 6/3560 Training loss: 3.9450 0.0627 sec/batch\n", + "Epoch 1/20 Iteration 7/3560 Training loss: 3.8819 0.0752 sec/batch\n", + "Epoch 1/20 Iteration 8/3560 Training loss: 3.8289 0.0644 sec/batch\n", + "Epoch 1/20 Iteration 9/3560 Training loss: 3.7772 0.0592 sec/batch\n", + "Epoch 1/20 Iteration 10/3560 Training loss: 3.7325 0.0649 sec/batch\n", + "Epoch 1/20 Iteration 11/3560 Training loss: 3.6919 0.0603 sec/batch\n", + "Epoch 1/20 Iteration 12/3560 Training loss: 3.6596 0.0612 sec/batch\n", + "Epoch 1/20 Iteration 13/3560 Training loss: 3.6315 0.0656 sec/batch\n", + "Epoch 1/20 Iteration 14/3560 Training loss: 3.6060 0.0619 sec/batch\n", + "Epoch 1/20 Iteration 15/3560 Training loss: 3.5836 0.0600 sec/batch\n", + "Epoch 1/20 Iteration 16/3560 Training loss: 3.5620 0.0594 sec/batch\n", + "Epoch 1/20 Iteration 17/3560 Training 
loss: 3.5422 0.0598 sec/batch\n", + "Epoch 1/20 Iteration 18/3560 Training loss: 3.5264 0.0624 sec/batch\n", + "Epoch 1/20 Iteration 19/3560 Training loss: 3.5114 0.0613 sec/batch\n", + "Epoch 1/20 Iteration 20/3560 Training loss: 3.4956 0.0642 sec/batch\n", + "Epoch 1/20 Iteration 21/3560 Training loss: 3.4824 0.0597 sec/batch\n", + "Epoch 1/20 Iteration 22/3560 Training loss: 3.4700 0.0713 sec/batch\n", + "Epoch 1/20 Iteration 23/3560 Training loss: 3.4579 0.0647 sec/batch\n", + "Epoch 1/20 Iteration 24/3560 Training loss: 3.4471 0.0668 sec/batch\n", + "Epoch 1/20 Iteration 25/3560 Training loss: 3.4364 0.0688 sec/batch\n", + "Epoch 1/20 Iteration 26/3560 Training loss: 3.4275 0.0600 sec/batch\n", + "Epoch 1/20 Iteration 27/3560 Training loss: 3.4194 0.0771 sec/batch\n", + "Epoch 1/20 Iteration 28/3560 Training loss: 3.4105 0.0593 sec/batch\n", + "Epoch 1/20 Iteration 29/3560 Training loss: 3.4023 0.0609 sec/batch\n", + "Epoch 1/20 Iteration 30/3560 Training loss: 3.3951 0.0595 sec/batch\n", + "Epoch 1/20 Iteration 31/3560 Training loss: 3.3889 0.0641 sec/batch\n", + "Epoch 1/20 Iteration 32/3560 Training loss: 3.3820 0.0612 sec/batch\n", + "Epoch 1/20 Iteration 33/3560 Training loss: 3.3754 0.0590 sec/batch\n", + "Epoch 1/20 Iteration 34/3560 Training loss: 3.3696 0.0619 sec/batch\n", + "Epoch 1/20 Iteration 35/3560 Training loss: 3.3636 0.0615 sec/batch\n", + "Epoch 1/20 Iteration 36/3560 Training loss: 3.3583 0.0725 sec/batch\n", + "Epoch 1/20 Iteration 37/3560 Training loss: 3.3528 0.0594 sec/batch\n", + "Epoch 1/20 Iteration 38/3560 Training loss: 3.3476 0.0635 sec/batch\n", + "Epoch 1/20 Iteration 39/3560 Training loss: 3.3422 0.0687 sec/batch\n", + "Epoch 1/20 Iteration 40/3560 Training loss: 3.3373 0.0611 sec/batch\n", + "Epoch 1/20 Iteration 41/3560 Training loss: 3.3325 0.0653 sec/batch\n", + "Epoch 1/20 Iteration 42/3560 Training loss: 3.3281 0.0627 sec/batch\n", + "Epoch 1/20 Iteration 43/3560 Training loss: 3.3237 0.0603 sec/batch\n", + "Epoch 1/20 
Iteration 44/3560 Training loss: 3.3196 0.0774 sec/batch\n", + "Epoch 1/20 Iteration 45/3560 Training loss: 3.3155 0.0597 sec/batch\n", + "Epoch 1/20 Iteration 46/3560 Training loss: 3.3118 0.0600 sec/batch\n", + "Epoch 1/20 Iteration 47/3560 Training loss: 3.3084 0.0618 sec/batch\n", + "Epoch 1/20 Iteration 48/3560 Training loss: 3.3052 0.0642 sec/batch\n", + "Epoch 1/20 Iteration 49/3560 Training loss: 3.3022 0.0633 sec/batch\n", + "Epoch 1/20 Iteration 50/3560 Training loss: 3.2993 0.0590 sec/batch\n", + "Epoch 1/20 Iteration 51/3560 Training loss: 3.2962 0.0633 sec/batch\n", + "Epoch 1/20 Iteration 52/3560 Training loss: 3.2931 0.0620 sec/batch\n", + "Epoch 1/20 Iteration 53/3560 Training loss: 3.2904 0.0647 sec/batch\n", + "Epoch 1/20 Iteration 54/3560 Training loss: 3.2875 0.0648 sec/batch\n", + "Epoch 1/20 Iteration 55/3560 Training loss: 3.2849 0.0679 sec/batch\n", + "Epoch 1/20 Iteration 56/3560 Training loss: 3.2820 0.0617 sec/batch\n", + "Epoch 1/20 Iteration 57/3560 Training loss: 3.2794 0.0657 sec/batch\n", + "Epoch 1/20 Iteration 58/3560 Training loss: 3.2770 0.0676 sec/batch\n", + "Epoch 1/20 Iteration 59/3560 Training loss: 3.2743 0.0637 sec/batch\n", + "Epoch 1/20 Iteration 60/3560 Training loss: 3.2721 0.0654 sec/batch\n", + "Epoch 1/20 Iteration 61/3560 Training loss: 3.2698 0.0656 sec/batch\n", + "Epoch 1/20 Iteration 62/3560 Training loss: 3.2680 0.0617 sec/batch\n", + "Epoch 1/20 Iteration 63/3560 Training loss: 3.2663 0.0655 sec/batch\n", + "Epoch 1/20 Iteration 64/3560 Training loss: 3.2639 0.0602 sec/batch\n", + "Epoch 1/20 Iteration 65/3560 Training loss: 3.2616 0.0612 sec/batch\n", + "Epoch 1/20 Iteration 66/3560 Training loss: 3.2599 0.0608 sec/batch\n", + "Epoch 1/20 Iteration 67/3560 Training loss: 3.2581 0.0606 sec/batch\n", + "Epoch 1/20 Iteration 68/3560 Training loss: 3.2557 0.0629 sec/batch\n", + "Epoch 1/20 Iteration 69/3560 Training loss: 3.2537 0.0604 sec/batch\n", + "Epoch 1/20 Iteration 70/3560 Training loss: 3.2521 0.0665 
sec/batch\n", + "Epoch 1/20 Iteration 71/3560 Training loss: 3.2503 0.0616 sec/batch\n", + "Epoch 1/20 Iteration 72/3560 Training loss: 3.2488 0.0646 sec/batch\n", + "Epoch 1/20 Iteration 73/3560 Training loss: 3.2471 0.0595 sec/batch\n", + "Epoch 1/20 Iteration 74/3560 Training loss: 3.2454 0.0599 sec/batch\n", + "Epoch 1/20 Iteration 75/3560 Training loss: 3.2441 0.0628 sec/batch\n", + "Epoch 1/20 Iteration 76/3560 Training loss: 3.2427 0.0642 sec/batch\n", + "Epoch 1/20 Iteration 77/3560 Training loss: 3.2412 0.0659 sec/batch\n", + "Epoch 1/20 Iteration 78/3560 Training loss: 3.2398 0.0601 sec/batch\n", + "Epoch 1/20 Iteration 79/3560 Training loss: 3.2383 0.0689 sec/batch\n", + "Epoch 1/20 Iteration 80/3560 Training loss: 3.2366 0.0632 sec/batch\n", + "Epoch 1/20 Iteration 81/3560 Training loss: 3.2352 0.0663 sec/batch\n", + "Epoch 1/20 Iteration 82/3560 Training loss: 3.2339 0.0623 sec/batch\n", + "Epoch 1/20 Iteration 83/3560 Training loss: 3.2326 0.0601 sec/batch\n", + "Epoch 1/20 Iteration 84/3560 Training loss: 3.2312 0.0593 sec/batch\n", + "Epoch 1/20 Iteration 85/3560 Training loss: 3.2296 0.0614 sec/batch\n", + "Epoch 1/20 Iteration 86/3560 Training loss: 3.2282 0.0627 sec/batch\n", + "Epoch 1/20 Iteration 87/3560 Training loss: 3.2268 0.0647 sec/batch\n", + "Epoch 1/20 Iteration 88/3560 Training loss: 3.2254 0.0601 sec/batch\n", + "Epoch 1/20 Iteration 89/3560 Training loss: 3.2242 0.0641 sec/batch\n", + "Epoch 1/20 Iteration 90/3560 Training loss: 3.2230 0.0616 sec/batch\n", + "Epoch 1/20 Iteration 91/3560 Training loss: 3.2218 0.0710 sec/batch\n", + "Epoch 1/20 Iteration 92/3560 Training loss: 3.2205 0.0648 sec/batch\n", + "Epoch 1/20 Iteration 93/3560 Training loss: 3.2194 0.0649 sec/batch\n", + "Epoch 1/20 Iteration 94/3560 Training loss: 3.2182 0.0701 sec/batch\n", + "Epoch 1/20 Iteration 95/3560 Training loss: 3.2170 0.0640 sec/batch\n", + "Epoch 1/20 Iteration 96/3560 Training loss: 3.2157 0.0675 sec/batch\n", + "Epoch 1/20 Iteration 97/3560 
Training loss: 3.2146 0.0609 sec/batch\n", + "Epoch 1/20 Iteration 98/3560 Training loss: 3.2133 0.0699 sec/batch\n", + "Epoch 1/20 Iteration 99/3560 Training loss: 3.2121 0.0615 sec/batch\n", + "Epoch 1/20 Iteration 100/3560 Training loss: 3.2108 0.0652 sec/batch\n", + "Epoch 1/20 Iteration 101/3560 Training loss: 3.2097 0.0605 sec/batch\n", + "Epoch 1/20 Iteration 102/3560 Training loss: 3.2085 0.0681 sec/batch\n", + "Epoch 1/20 Iteration 103/3560 Training loss: 3.2073 0.0618 sec/batch\n", + "Epoch 1/20 Iteration 104/3560 Training loss: 3.2061 0.0687 sec/batch\n", + "Epoch 1/20 Iteration 105/3560 Training loss: 3.2048 0.0651 sec/batch\n", + "Epoch 1/20 Iteration 106/3560 Training loss: 3.2037 0.0613 sec/batch\n", + "Epoch 1/20 Iteration 107/3560 Training loss: 3.2022 0.0648 sec/batch\n", + "Epoch 1/20 Iteration 108/3560 Training loss: 3.2007 0.0625 sec/batch\n", + "Epoch 1/20 Iteration 109/3560 Training loss: 3.1995 0.0630 sec/batch\n", + "Epoch 1/20 Iteration 110/3560 Training loss: 3.1978 0.0659 sec/batch\n", + "Epoch 1/20 Iteration 111/3560 Training loss: 3.1965 0.0620 sec/batch\n", + "Epoch 1/20 Iteration 112/3560 Training loss: 3.1951 0.0665 sec/batch\n", + "Epoch 1/20 Iteration 113/3560 Training loss: 3.1936 0.0638 sec/batch\n", + "Epoch 1/20 Iteration 114/3560 Training loss: 3.1920 0.0628 sec/batch\n", + "Epoch 1/20 Iteration 115/3560 Training loss: 3.1903 0.0634 sec/batch\n", + "Epoch 1/20 Iteration 116/3560 Training loss: 3.1887 0.0600 sec/batch\n", + "Epoch 1/20 Iteration 117/3560 Training loss: 3.1870 0.0636 sec/batch\n", + "Epoch 1/20 Iteration 118/3560 Training loss: 3.1855 0.0674 sec/batch\n", + "Epoch 1/20 Iteration 119/3560 Training loss: 3.1839 0.0724 sec/batch\n", + "Epoch 1/20 Iteration 120/3560 Training loss: 3.1821 0.0629 sec/batch\n", + "Epoch 1/20 Iteration 121/3560 Training loss: 3.1807 0.0682 sec/batch\n", + "Epoch 1/20 Iteration 122/3560 Training loss: 3.1789 0.0630 sec/batch\n", + "Epoch 1/20 Iteration 123/3560 Training loss: 3.1772 
0.0698 sec/batch\n", + "Epoch 1/20 Iteration 124/3560 Training loss: 3.1755 0.0750 sec/batch\n", + "Epoch 1/20 Iteration 125/3560 Training loss: 3.1735 0.0646 sec/batch\n", + "Epoch 1/20 Iteration 126/3560 Training loss: 3.1714 0.0655 sec/batch\n", + "Epoch 1/20 Iteration 127/3560 Training loss: 3.1697 0.0642 sec/batch\n", + "Epoch 1/20 Iteration 128/3560 Training loss: 3.1684 0.0736 sec/batch\n", + "Epoch 1/20 Iteration 129/3560 Training loss: 3.1664 0.0666 sec/batch\n", + "Epoch 1/20 Iteration 130/3560 Training loss: 3.1645 0.0607 sec/batch\n", + "Epoch 1/20 Iteration 131/3560 Training loss: 3.1628 0.0667 sec/batch\n", + "Epoch 1/20 Iteration 132/3560 Training loss: 3.1608 0.0612 sec/batch\n", + "Epoch 1/20 Iteration 133/3560 Training loss: 3.1588 0.0629 sec/batch\n", + "Epoch 1/20 Iteration 134/3560 Training loss: 3.1568 0.0637 sec/batch\n", + "Epoch 1/20 Iteration 135/3560 Training loss: 3.1545 0.0651 sec/batch\n", + "Epoch 1/20 Iteration 136/3560 Training loss: 3.1523 0.0646 sec/batch\n", + "Epoch 1/20 Iteration 137/3560 Training loss: 3.1501 0.0679 sec/batch\n", + "Epoch 1/20 Iteration 138/3560 Training loss: 3.1479 0.0611 sec/batch\n", + "Epoch 1/20 Iteration 139/3560 Training loss: 3.1458 0.0671 sec/batch\n", + "Epoch 1/20 Iteration 140/3560 Training loss: 3.1435 0.0608 sec/batch\n", + "Epoch 1/20 Iteration 141/3560 Training loss: 3.1412 0.0643 sec/batch\n", + "Epoch 1/20 Iteration 142/3560 Training loss: 3.1387 0.0659 sec/batch\n", + "Epoch 1/20 Iteration 143/3560 Training loss: 3.1364 0.0716 sec/batch\n", + "Epoch 1/20 Iteration 144/3560 Training loss: 3.1340 0.0652 sec/batch\n", + "Epoch 1/20 Iteration 145/3560 Training loss: 3.1316 0.0651 sec/batch\n", + "Epoch 1/20 Iteration 146/3560 Training loss: 3.1293 0.0638 sec/batch\n", + "Epoch 1/20 Iteration 147/3560 Training loss: 3.1268 0.0655 sec/batch\n", + "Epoch 1/20 Iteration 148/3560 Training loss: 3.1245 0.0637 sec/batch\n", + "Epoch 1/20 Iteration 149/3560 Training loss: 3.1218 0.0707 sec/batch\n", + 
"Epoch 1/20 Iteration 150/3560 Training loss: 3.1193 0.0715 sec/batch\n", + "Epoch 1/20 Iteration 151/3560 Training loss: 3.1170 0.0664 sec/batch\n", + "Epoch 1/20 Iteration 152/3560 Training loss: 3.1147 0.0646 sec/batch\n", + "Epoch 1/20 Iteration 153/3560 Training loss: 3.1121 0.0645 sec/batch\n", + "Epoch 1/20 Iteration 154/3560 Training loss: 3.1096 0.0660 sec/batch\n", + "Epoch 1/20 Iteration 155/3560 Training loss: 3.1070 0.0671 sec/batch\n", + "Epoch 1/20 Iteration 156/3560 Training loss: 3.1044 0.0655 sec/batch\n", + "Epoch 1/20 Iteration 157/3560 Training loss: 3.1017 0.0661 sec/batch\n", + "Epoch 1/20 Iteration 158/3560 Training loss: 3.0991 0.0686 sec/batch\n", + "Epoch 1/20 Iteration 159/3560 Training loss: 3.0964 0.0661 sec/batch\n", + "Epoch 1/20 Iteration 160/3560 Training loss: 3.0938 0.0644 sec/batch\n", + "Epoch 1/20 Iteration 161/3560 Training loss: 3.0912 0.0712 sec/batch\n", + "Epoch 1/20 Iteration 162/3560 Training loss: 3.0883 0.0743 sec/batch\n", + "Epoch 1/20 Iteration 163/3560 Training loss: 3.0855 0.0654 sec/batch\n", + "Epoch 1/20 Iteration 164/3560 Training loss: 3.0829 0.0653 sec/batch\n", + "Epoch 1/20 Iteration 165/3560 Training loss: 3.0803 0.0693 sec/batch\n", + "Epoch 1/20 Iteration 166/3560 Training loss: 3.0777 0.0659 sec/batch\n", + "Epoch 1/20 Iteration 167/3560 Training loss: 3.0751 0.0673 sec/batch\n", + "Epoch 1/20 Iteration 168/3560 Training loss: 3.0726 0.0734 sec/batch\n", + "Epoch 1/20 Iteration 169/3560 Training loss: 3.0700 0.0631 sec/batch\n", + "Epoch 1/20 Iteration 170/3560 Training loss: 3.0673 0.0665 sec/batch\n", + "Epoch 1/20 Iteration 171/3560 Training loss: 3.0648 0.0660 sec/batch\n", + "Epoch 1/20 Iteration 172/3560 Training loss: 3.0624 0.0651 sec/batch\n", + "Epoch 1/20 Iteration 173/3560 Training loss: 3.0600 0.0627 sec/batch\n", + "Epoch 1/20 Iteration 174/3560 Training loss: 3.0578 0.0648 sec/batch\n", + "Epoch 1/20 Iteration 175/3560 Training loss: 3.0554 0.0640 sec/batch\n", + "Epoch 1/20 Iteration 
176/3560 Training loss: 3.0529 0.0659 sec/batch\n", + "Epoch 1/20 Iteration 177/3560 Training loss: 3.0503 0.0630 sec/batch\n", + "Epoch 1/20 Iteration 178/3560 Training loss: 3.0476 0.0641 sec/batch\n", + "Epoch 2/20 Iteration 179/3560 Training loss: 2.6328 0.0661 sec/batch\n", + "Epoch 2/20 Iteration 180/3560 Training loss: 2.5919 0.0657 sec/batch\n", + "Epoch 2/20 Iteration 181/3560 Training loss: 2.5831 0.0638 sec/batch\n", + "Epoch 2/20 Iteration 182/3560 Training loss: 2.5775 0.0653 sec/batch\n", + "Epoch 2/20 Iteration 183/3560 Training loss: 2.5743 0.0632 sec/batch\n", + "Epoch 2/20 Iteration 184/3560 Training loss: 2.5723 0.0621 sec/batch\n", + "Epoch 2/20 Iteration 185/3560 Training loss: 2.5709 0.0625 sec/batch\n", + "Epoch 2/20 Iteration 186/3560 Training loss: 2.5717 0.0661 sec/batch\n", + "Epoch 2/20 Iteration 187/3560 Training loss: 2.5725 0.0685 sec/batch\n", + "Epoch 2/20 Iteration 188/3560 Training loss: 2.5692 0.0654 sec/batch\n", + "Epoch 2/20 Iteration 189/3560 Training loss: 2.5669 0.0789 sec/batch\n", + "Epoch 2/20 Iteration 190/3560 Training loss: 2.5663 0.0632 sec/batch\n", + "Epoch 2/20 Iteration 191/3560 Training loss: 2.5646 0.0609 sec/batch\n", + "Epoch 2/20 Iteration 192/3560 Training loss: 2.5659 0.0672 sec/batch\n", + "Epoch 2/20 Iteration 193/3560 Training loss: 2.5654 0.0693 sec/batch\n", + "Epoch 2/20 Iteration 194/3560 Training loss: 2.5642 0.0695 sec/batch\n", + "Epoch 2/20 Iteration 195/3560 Training loss: 2.5624 0.0687 sec/batch\n", + "Epoch 2/20 Iteration 196/3560 Training loss: 2.5632 0.0611 sec/batch\n", + "Epoch 2/20 Iteration 197/3560 Training loss: 2.5618 0.0636 sec/batch\n", + "Epoch 2/20 Iteration 198/3560 Training loss: 2.5588 0.0638 sec/batch\n", + "Epoch 2/20 Iteration 199/3560 Training loss: 2.5565 0.0731 sec/batch\n", + "Epoch 2/20 Iteration 200/3560 Training loss: 2.5559 0.0686 sec/batch\n", + "Epoch 2/20 Iteration 201/3560 Training loss: 2.5543 0.0673 sec/batch\n", + "Epoch 2/20 Iteration 202/3560 Training loss: 
2.5523 0.0720 sec/batch\n", + "Epoch 2/20 Iteration 203/3560 Training loss: 2.5500 0.0659 sec/batch\n", + "Epoch 2/20 Iteration 204/3560 Training loss: 2.5487 0.0702 sec/batch\n", + "Epoch 2/20 Iteration 205/3560 Training loss: 2.5472 0.0653 sec/batch\n", + "Epoch 2/20 Iteration 206/3560 Training loss: 2.5457 0.0647 sec/batch\n", + "Epoch 2/20 Iteration 207/3560 Training loss: 2.5444 0.0662 sec/batch\n", + "Epoch 2/20 Iteration 208/3560 Training loss: 2.5430 0.0651 sec/batch\n", + "Epoch 2/20 Iteration 209/3560 Training loss: 2.5421 0.0631 sec/batch\n", + "Epoch 2/20 Iteration 210/3560 Training loss: 2.5405 0.0685 sec/batch\n", + "Epoch 2/20 Iteration 211/3560 Training loss: 2.5384 0.0715 sec/batch\n", + "Epoch 2/20 Iteration 212/3560 Training loss: 2.5371 0.0708 sec/batch\n", + "Epoch 2/20 Iteration 213/3560 Training loss: 2.5351 0.0676 sec/batch\n", + "Epoch 2/20 Iteration 214/3560 Training loss: 2.5340 0.0706 sec/batch\n", + "Epoch 2/20 Iteration 215/3560 Training loss: 2.5321 0.0669 sec/batch\n", + "Epoch 2/20 Iteration 216/3560 Training loss: 2.5298 0.0655 sec/batch\n", + "Epoch 2/20 Iteration 217/3560 Training loss: 2.5283 0.0660 sec/batch\n", + "Epoch 2/20 Iteration 218/3560 Training loss: 2.5262 0.0630 sec/batch\n", + "Epoch 2/20 Iteration 219/3560 Training loss: 2.5243 0.0656 sec/batch\n", + "Epoch 2/20 Iteration 220/3560 Training loss: 2.5224 0.0660 sec/batch\n", + "Epoch 2/20 Iteration 221/3560 Training loss: 2.5205 0.0658 sec/batch\n", + "Epoch 2/20 Iteration 222/3560 Training loss: 2.5188 0.0639 sec/batch\n", + "Epoch 2/20 Iteration 223/3560 Training loss: 2.5171 0.0674 sec/batch\n", + "Epoch 2/20 Iteration 224/3560 Training loss: 2.5149 0.0675 sec/batch\n", + "Epoch 2/20 Iteration 225/3560 Training loss: 2.5137 0.0649 sec/batch\n", + "Epoch 2/20 Iteration 226/3560 Training loss: 2.5122 0.0653 sec/batch\n", + "Epoch 2/20 Iteration 227/3560 Training loss: 2.5106 0.0662 sec/batch\n", + "Epoch 2/20 Iteration 228/3560 Training loss: 2.5094 0.0743 
sec/batch\n", + "Epoch 2/20 Iteration 229/3560 Training loss: 2.5077 0.0661 sec/batch\n", + "Epoch 2/20 Iteration 230/3560 Training loss: 2.5064 0.0740 sec/batch\n", + "Epoch 2/20 Iteration 231/3560 Training loss: 2.5050 0.0705 sec/batch\n", + "Epoch 2/20 Iteration 232/3560 Training loss: 2.5034 0.0702 sec/batch\n", + "Epoch 2/20 Iteration 233/3560 Training loss: 2.5021 0.0675 sec/batch\n", + "Epoch 2/20 Iteration 234/3560 Training loss: 2.5008 0.0658 sec/batch\n", + "Epoch 2/20 Iteration 235/3560 Training loss: 2.4995 0.0686 sec/batch\n", + "Epoch 2/20 Iteration 236/3560 Training loss: 2.4980 0.0655 sec/batch\n", + "Epoch 2/20 Iteration 237/3560 Training loss: 2.4965 0.0668 sec/batch\n", + "Epoch 2/20 Iteration 238/3560 Training loss: 2.4955 0.0709 sec/batch\n", + "Epoch 2/20 Iteration 239/3560 Training loss: 2.4944 0.0650 sec/batch\n", + "Epoch 2/20 Iteration 240/3560 Training loss: 2.4935 0.0665 sec/batch\n", + "Epoch 2/20 Iteration 241/3560 Training loss: 2.4927 0.0720 sec/batch\n", + "Epoch 2/20 Iteration 242/3560 Training loss: 2.4917 0.0655 sec/batch\n", + "Epoch 2/20 Iteration 243/3560 Training loss: 2.4903 0.0757 sec/batch\n", + "Epoch 2/20 Iteration 244/3560 Training loss: 2.4892 0.0678 sec/batch\n", + "Epoch 2/20 Iteration 245/3560 Training loss: 2.4881 0.0655 sec/batch\n", + "Epoch 2/20 Iteration 246/3560 Training loss: 2.4865 0.0662 sec/batch\n", + "Epoch 2/20 Iteration 247/3560 Training loss: 2.4850 0.0713 sec/batch\n", + "Epoch 2/20 Iteration 248/3560 Training loss: 2.4838 0.0673 sec/batch\n", + "Epoch 2/20 Iteration 249/3560 Training loss: 2.4828 0.0663 sec/batch\n", + "Epoch 2/20 Iteration 250/3560 Training loss: 2.4818 0.0656 sec/batch\n", + "Epoch 2/20 Iteration 251/3560 Training loss: 2.4808 0.0636 sec/batch\n", + "Epoch 2/20 Iteration 252/3560 Training loss: 2.4794 0.0673 sec/batch\n", + "Epoch 2/20 Iteration 253/3560 Training loss: 2.4782 0.0662 sec/batch\n", + "Epoch 2/20 Iteration 254/3560 Training loss: 2.4775 0.0636 sec/batch\n", + "Epoch 
2/20 Iteration 255/3560 Training loss: 2.4762 0.0646 sec/batch\n", + "Epoch 2/20 Iteration 256/3560 Training loss: 2.4752 0.0715 sec/batch\n", + "Epoch 2/20 Iteration 257/3560 Training loss: 2.4738 0.0644 sec/batch\n", + "Epoch 2/20 Iteration 258/3560 Training loss: 2.4726 0.0665 sec/batch\n", + "Epoch 2/20 Iteration 259/3560 Training loss: 2.4712 0.0779 sec/batch\n", + "Epoch 2/20 Iteration 260/3560 Training loss: 2.4701 0.0643 sec/batch\n", + "Epoch 2/20 Iteration 261/3560 Training loss: 2.4689 0.0700 sec/batch\n", + "Epoch 2/20 Iteration 262/3560 Training loss: 2.4674 0.0726 sec/batch\n", + "Epoch 2/20 Iteration 263/3560 Training loss: 2.4658 0.0639 sec/batch\n", + "Epoch 2/20 Iteration 264/3560 Training loss: 2.4644 0.0667 sec/batch\n", + "Epoch 2/20 Iteration 265/3560 Training loss: 2.4632 0.0734 sec/batch\n", + "Epoch 2/20 Iteration 266/3560 Training loss: 2.4619 0.0686 sec/batch\n", + "Epoch 2/20 Iteration 267/3560 Training loss: 2.4605 0.0706 sec/batch\n", + "Epoch 2/20 Iteration 268/3560 Training loss: 2.4593 0.0711 sec/batch\n", + "Epoch 2/20 Iteration 269/3560 Training loss: 2.4580 0.0634 sec/batch\n", + "Epoch 2/20 Iteration 270/3560 Training loss: 2.4570 0.0666 sec/batch\n", + "Epoch 2/20 Iteration 271/3560 Training loss: 2.4557 0.0756 sec/batch\n", + "Epoch 2/20 Iteration 272/3560 Training loss: 2.4543 0.0655 sec/batch\n", + "Epoch 2/20 Iteration 273/3560 Training loss: 2.4529 0.0680 sec/batch\n", + "Epoch 2/20 Iteration 274/3560 Training loss: 2.4516 0.0684 sec/batch\n", + "Epoch 2/20 Iteration 275/3560 Training loss: 2.4505 0.0680 sec/batch\n", + "Epoch 2/20 Iteration 276/3560 Training loss: 2.4494 0.0678 sec/batch\n", + "Epoch 2/20 Iteration 277/3560 Training loss: 2.4481 0.0653 sec/batch\n", + "Epoch 2/20 Iteration 278/3560 Training loss: 2.4468 0.0646 sec/batch\n", + "Epoch 2/20 Iteration 279/3560 Training loss: 2.4459 0.0642 sec/batch\n", + "Epoch 2/20 Iteration 280/3560 Training loss: 2.4448 0.0658 sec/batch\n", + "Epoch 2/20 Iteration 281/3560 
Training loss: 2.4434 0.0688 sec/batch\n", + "Epoch 2/20 Iteration 282/3560 Training loss: 2.4422 0.0656 sec/batch\n", + "Epoch 2/20 Iteration 283/3560 Training loss: 2.4409 0.0654 sec/batch\n", + "Epoch 2/20 Iteration 284/3560 Training loss: 2.4398 0.0798 sec/batch\n", + "Epoch 2/20 Iteration 285/3560 Training loss: 2.4386 0.0727 sec/batch\n", + "Epoch 2/20 Iteration 286/3560 Training loss: 2.4375 0.0718 sec/batch\n", + "Epoch 2/20 Iteration 287/3560 Training loss: 2.4366 0.0653 sec/batch\n", + "Epoch 2/20 Iteration 288/3560 Training loss: 2.4353 0.0672 sec/batch\n", + "Epoch 2/20 Iteration 289/3560 Training loss: 2.4342 0.0720 sec/batch\n", + "Epoch 2/20 Iteration 290/3560 Training loss: 2.4333 0.0734 sec/batch\n", + "Epoch 2/20 Iteration 291/3560 Training loss: 2.4321 0.0700 sec/batch\n", + "Epoch 2/20 Iteration 292/3560 Training loss: 2.4310 0.0642 sec/batch\n", + "Epoch 2/20 Iteration 293/3560 Training loss: 2.4299 0.0676 sec/batch\n", + "Epoch 2/20 Iteration 294/3560 Training loss: 2.4286 0.0644 sec/batch\n", + "Epoch 2/20 Iteration 295/3560 Training loss: 2.4275 0.0680 sec/batch\n", + "Epoch 2/20 Iteration 296/3560 Training loss: 2.4265 0.0693 sec/batch\n", + "Epoch 2/20 Iteration 297/3560 Training loss: 2.4256 0.0664 sec/batch\n", + "Epoch 2/20 Iteration 298/3560 Training loss: 2.4245 0.0657 sec/batch\n", + "Epoch 2/20 Iteration 299/3560 Training loss: 2.4236 0.0674 sec/batch\n", + "Epoch 2/20 Iteration 300/3560 Training loss: 2.4225 0.0739 sec/batch\n", + "Epoch 2/20 Iteration 301/3560 Training loss: 2.4214 0.0687 sec/batch\n", + "Epoch 2/20 Iteration 302/3560 Training loss: 2.4205 0.0658 sec/batch\n", + "Epoch 2/20 Iteration 303/3560 Training loss: 2.4195 0.0740 sec/batch\n", + "Epoch 2/20 Iteration 304/3560 Training loss: 2.4183 0.0653 sec/batch\n", + "Epoch 2/20 Iteration 305/3560 Training loss: 2.4173 0.0714 sec/batch\n", + "Epoch 2/20 Iteration 306/3560 Training loss: 2.4165 0.0664 sec/batch\n", + "Epoch 2/20 Iteration 307/3560 Training loss: 2.4155 
0.0737 sec/batch\n", + "Epoch 2/20 Iteration 308/3560 Training loss: 2.4146 0.0742 sec/batch\n", + "Epoch 2/20 Iteration 309/3560 Training loss: 2.4135 0.0639 sec/batch\n", + "Epoch 2/20 Iteration 310/3560 Training loss: 2.4124 0.0664 sec/batch\n", + "Epoch 2/20 Iteration 311/3560 Training loss: 2.4115 0.0674 sec/batch\n", + "Epoch 2/20 Iteration 312/3560 Training loss: 2.4105 0.0686 sec/batch\n", + "Epoch 2/20 Iteration 313/3560 Training loss: 2.4094 0.0643 sec/batch\n", + "Epoch 2/20 Iteration 314/3560 Training loss: 2.4084 0.0629 sec/batch\n", + "Epoch 2/20 Iteration 315/3560 Training loss: 2.4073 0.0653 sec/batch\n", + "Epoch 2/20 Iteration 316/3560 Training loss: 2.4063 0.0645 sec/batch\n", + "Epoch 2/20 Iteration 317/3560 Training loss: 2.4056 0.0688 sec/batch\n", + "Epoch 2/20 Iteration 318/3560 Training loss: 2.4046 0.0642 sec/batch\n", + "Epoch 2/20 Iteration 319/3560 Training loss: 2.4037 0.0756 sec/batch\n", + "Epoch 2/20 Iteration 320/3560 Training loss: 2.4026 0.0663 sec/batch\n", + "Epoch 2/20 Iteration 321/3560 Training loss: 2.4017 0.0698 sec/batch\n", + "Epoch 2/20 Iteration 322/3560 Training loss: 2.4006 0.0683 sec/batch\n", + "Epoch 2/20 Iteration 323/3560 Training loss: 2.3997 0.0703 sec/batch\n", + "Epoch 2/20 Iteration 324/3560 Training loss: 2.3989 0.0651 sec/batch\n", + "Epoch 2/20 Iteration 325/3560 Training loss: 2.3980 0.0647 sec/batch\n", + "Epoch 2/20 Iteration 326/3560 Training loss: 2.3971 0.0643 sec/batch\n", + "Epoch 2/20 Iteration 327/3560 Training loss: 2.3961 0.0724 sec/batch\n", + "Epoch 2/20 Iteration 328/3560 Training loss: 2.3951 0.0674 sec/batch\n", + "Epoch 2/20 Iteration 329/3560 Training loss: 2.3942 0.0661 sec/batch\n", + "Epoch 2/20 Iteration 330/3560 Training loss: 2.3935 0.0708 sec/batch\n", + "Epoch 2/20 Iteration 331/3560 Training loss: 2.3926 0.0658 sec/batch\n", + "Epoch 2/20 Iteration 332/3560 Training loss: 2.3918 0.0632 sec/batch\n", + "Epoch 2/20 Iteration 333/3560 Training loss: 2.3907 0.0674 sec/batch\n", + 
"Epoch 2/20 Iteration 334/3560 Training loss: 2.3897 0.0661 sec/batch\n", + "Epoch 2/20 Iteration 335/3560 Training loss: 2.3887 0.0759 sec/batch\n", + "Epoch 2/20 Iteration 336/3560 Training loss: 2.3877 0.0650 sec/batch\n", + "Epoch 2/20 Iteration 337/3560 Training loss: 2.3865 0.0705 sec/batch\n", + "Epoch 2/20 Iteration 338/3560 Training loss: 2.3857 0.0672 sec/batch\n", + "Epoch 2/20 Iteration 339/3560 Training loss: 2.3849 0.0735 sec/batch\n", + "Epoch 2/20 Iteration 340/3560 Training loss: 2.3838 0.0712 sec/batch\n", + "Epoch 2/20 Iteration 341/3560 Training loss: 2.3828 0.0716 sec/batch\n", + "Epoch 2/20 Iteration 342/3560 Training loss: 2.3819 0.0888 sec/batch\n", + "Epoch 2/20 Iteration 343/3560 Training loss: 2.3810 0.0782 sec/batch\n", + "Epoch 2/20 Iteration 344/3560 Training loss: 2.3801 0.0662 sec/batch\n", + "Epoch 2/20 Iteration 345/3560 Training loss: 2.3792 0.0665 sec/batch\n", + "Epoch 2/20 Iteration 346/3560 Training loss: 2.3783 0.0674 sec/batch\n", + "Epoch 2/20 Iteration 347/3560 Training loss: 2.3773 0.0678 sec/batch\n", + "Epoch 2/20 Iteration 348/3560 Training loss: 2.3764 0.0669 sec/batch\n", + "Epoch 2/20 Iteration 349/3560 Training loss: 2.3754 0.0659 sec/batch\n", + "Epoch 2/20 Iteration 350/3560 Training loss: 2.3746 0.0676 sec/batch\n", + "Epoch 2/20 Iteration 351/3560 Training loss: 2.3739 0.0736 sec/batch\n", + "Epoch 2/20 Iteration 352/3560 Training loss: 2.3733 0.0657 sec/batch\n", + "Epoch 2/20 Iteration 353/3560 Training loss: 2.3726 0.0643 sec/batch\n", + "Epoch 2/20 Iteration 354/3560 Training loss: 2.3717 0.0711 sec/batch\n", + "Epoch 2/20 Iteration 355/3560 Training loss: 2.3707 0.0664 sec/batch\n", + "Epoch 2/20 Iteration 356/3560 Training loss: 2.3696 0.0766 sec/batch\n", + "Epoch 3/20 Iteration 357/3560 Training loss: 2.2575 0.0658 sec/batch\n", + "Epoch 3/20 Iteration 358/3560 Training loss: 2.2074 0.0705 sec/batch\n", + "Epoch 3/20 Iteration 359/3560 Training loss: 2.1949 0.0646 sec/batch\n", + "Epoch 3/20 Iteration 
360/3560 Training loss: 2.1911 0.0662 sec/batch\n", + "Epoch 3/20 Iteration 361/3560 Training loss: 2.1920 0.0676 sec/batch\n", + "Epoch 3/20 Iteration 362/3560 Training loss: 2.1900 0.0698 sec/batch\n", + "Epoch 3/20 Iteration 363/3560 Training loss: 2.1903 0.0675 sec/batch\n", + "Epoch 3/20 Iteration 364/3560 Training loss: 2.1912 0.0690 sec/batch\n", + "Epoch 3/20 Iteration 365/3560 Training loss: 2.1935 0.0693 sec/batch\n", + "Epoch 3/20 Iteration 366/3560 Training loss: 2.1929 0.0668 sec/batch\n", + "Epoch 3/20 Iteration 367/3560 Training loss: 2.1906 0.0658 sec/batch\n", + "Epoch 3/20 Iteration 368/3560 Training loss: 2.1888 0.0664 sec/batch\n", + "Epoch 3/20 Iteration 369/3560 Training loss: 2.1884 0.0641 sec/batch\n", + "Epoch 3/20 Iteration 370/3560 Training loss: 2.1904 0.0708 sec/batch\n", + "Epoch 3/20 Iteration 371/3560 Training loss: 2.1894 0.0630 sec/batch\n", + "Epoch 3/20 Iteration 372/3560 Training loss: 2.1885 0.0665 sec/batch\n", + "Epoch 3/20 Iteration 373/3560 Training loss: 2.1877 0.0714 sec/batch\n", + "Epoch 3/20 Iteration 374/3560 Training loss: 2.1890 0.0676 sec/batch\n", + "Epoch 3/20 Iteration 375/3560 Training loss: 2.1880 0.0675 sec/batch\n", + "Epoch 3/20 Iteration 376/3560 Training loss: 2.1869 0.0704 sec/batch\n", + "Epoch 3/20 Iteration 377/3560 Training loss: 2.1852 0.0669 sec/batch\n", + "Epoch 3/20 Iteration 378/3560 Training loss: 2.1864 0.0673 sec/batch\n", + "Epoch 3/20 Iteration 379/3560 Training loss: 2.1861 0.0690 sec/batch\n", + "Epoch 3/20 Iteration 380/3560 Training loss: 2.1843 0.0635 sec/batch\n", + "Epoch 3/20 Iteration 381/3560 Training loss: 2.1833 0.0659 sec/batch\n", + "Epoch 3/20 Iteration 382/3560 Training loss: 2.1824 0.0680 sec/batch\n", + "Epoch 3/20 Iteration 383/3560 Training loss: 2.1812 0.0681 sec/batch\n", + "Epoch 3/20 Iteration 384/3560 Training loss: 2.1806 0.0676 sec/batch\n", + "Epoch 3/20 Iteration 385/3560 Training loss: 2.1809 0.0640 sec/batch\n", + "Epoch 3/20 Iteration 386/3560 Training loss: 
2.1806 0.0681 sec/batch\n", + "Epoch 3/20 Iteration 387/3560 Training loss: 2.1803 0.0680 sec/batch\n", + "Epoch 3/20 Iteration 388/3560 Training loss: 2.1792 0.0662 sec/batch\n", + "Epoch 3/20 Iteration 389/3560 Training loss: 2.1784 0.0647 sec/batch\n", + "Epoch 3/20 Iteration 390/3560 Training loss: 2.1786 0.0647 sec/batch\n", + "Epoch 3/20 Iteration 391/3560 Training loss: 2.1776 0.0653 sec/batch\n", + "Epoch 3/20 Iteration 392/3560 Training loss: 2.1770 0.0664 sec/batch\n", + "Epoch 3/20 Iteration 393/3560 Training loss: 2.1763 0.0726 sec/batch\n", + "Epoch 3/20 Iteration 394/3560 Training loss: 2.1746 0.0723 sec/batch\n", + "Epoch 3/20 Iteration 395/3560 Training loss: 2.1731 0.0720 sec/batch\n", + "Epoch 3/20 Iteration 396/3560 Training loss: 2.1715 0.0685 sec/batch\n", + "Epoch 3/20 Iteration 397/3560 Training loss: 2.1704 0.0632 sec/batch\n", + "Epoch 3/20 Iteration 398/3560 Training loss: 2.1694 0.0673 sec/batch\n", + "Epoch 3/20 Iteration 399/3560 Training loss: 2.1682 0.0704 sec/batch\n", + "Epoch 3/20 Iteration 400/3560 Training loss: 2.1670 0.0690 sec/batch\n", + "Epoch 3/20 Iteration 401/3560 Training loss: 2.1661 0.0663 sec/batch\n", + "Epoch 3/20 Iteration 402/3560 Training loss: 2.1642 0.0684 sec/batch\n", + "Epoch 3/20 Iteration 403/3560 Training loss: 2.1638 0.0670 sec/batch\n", + "Epoch 3/20 Iteration 404/3560 Training loss: 2.1628 0.0680 sec/batch\n", + "Epoch 3/20 Iteration 405/3560 Training loss: 2.1619 0.0652 sec/batch\n", + "Epoch 3/20 Iteration 406/3560 Training loss: 2.1619 0.0799 sec/batch\n", + "Epoch 3/20 Iteration 407/3560 Training loss: 2.1606 0.0728 sec/batch\n", + "Epoch 3/20 Iteration 408/3560 Training loss: 2.1602 0.0711 sec/batch\n", + "Epoch 3/20 Iteration 409/3560 Training loss: 2.1593 0.0687 sec/batch\n", + "Epoch 3/20 Iteration 410/3560 Training loss: 2.1583 0.0653 sec/batch\n", + "Epoch 3/20 Iteration 411/3560 Training loss: 2.1572 0.0701 sec/batch\n", + "Epoch 3/20 Iteration 412/3560 Training loss: 2.1564 0.0707 
sec/batch\n", + "Epoch 3/20 Iteration 413/3560 Training loss: 2.1559 0.0651 sec/batch\n", + "Epoch 3/20 Iteration 414/3560 Training loss: 2.1549 0.0688 sec/batch\n", + "Epoch 3/20 Iteration 415/3560 Training loss: 2.1539 0.0687 sec/batch\n", + "Epoch 3/20 Iteration 416/3560 Training loss: 2.1536 0.0686 sec/batch\n", + "Epoch 3/20 Iteration 417/3560 Training loss: 2.1528 0.0702 sec/batch\n", + "Epoch 3/20 Iteration 418/3560 Training loss: 2.1525 0.0660 sec/batch\n", + "Epoch 3/20 Iteration 419/3560 Training loss: 2.1521 0.0669 sec/batch\n", + "Epoch 3/20 Iteration 420/3560 Training loss: 2.1514 0.0708 sec/batch\n", + "Epoch 3/20 Iteration 421/3560 Training loss: 2.1505 0.0688 sec/batch\n", + "Epoch 3/20 Iteration 422/3560 Training loss: 2.1503 0.0688 sec/batch\n", + "Epoch 3/20 Iteration 423/3560 Training loss: 2.1498 0.0686 sec/batch\n", + "Epoch 3/20 Iteration 424/3560 Training loss: 2.1486 0.0669 sec/batch\n", + "Epoch 3/20 Iteration 425/3560 Training loss: 2.1477 0.0683 sec/batch\n", + "Epoch 3/20 Iteration 426/3560 Training loss: 2.1471 0.0671 sec/batch\n", + "Epoch 3/20 Iteration 427/3560 Training loss: 2.1467 0.0662 sec/batch\n", + "Epoch 3/20 Iteration 428/3560 Training loss: 2.1462 0.0665 sec/batch\n", + "Epoch 3/20 Iteration 429/3560 Training loss: 2.1459 0.0808 sec/batch\n", + "Epoch 3/20 Iteration 430/3560 Training loss: 2.1451 0.0694 sec/batch\n", + "Epoch 3/20 Iteration 431/3560 Training loss: 2.1445 0.0684 sec/batch\n", + "Epoch 3/20 Iteration 432/3560 Training loss: 2.1443 0.0699 sec/batch\n", + "Epoch 3/20 Iteration 433/3560 Training loss: 2.1434 0.0686 sec/batch\n", + "Epoch 3/20 Iteration 434/3560 Training loss: 2.1430 0.0641 sec/batch\n", + "Epoch 3/20 Iteration 435/3560 Training loss: 2.1420 0.0645 sec/batch\n", + "Epoch 3/20 Iteration 436/3560 Training loss: 2.1412 0.0630 sec/batch\n", + "Epoch 3/20 Iteration 437/3560 Training loss: 2.1402 0.0649 sec/batch\n", + "Epoch 3/20 Iteration 438/3560 Training loss: 2.1397 0.0690 sec/batch\n", + "Epoch 
3/20 Iteration 439/3560 Training loss: 2.1386 0.0708 sec/batch\n", + "Epoch 3/20 Iteration 440/3560 Training loss: 2.1380 0.0646 sec/batch\n", + "Epoch 3/20 Iteration 441/3560 Training loss: 2.1370 0.0749 sec/batch\n", + "Epoch 3/20 Iteration 442/3560 Training loss: 2.1361 0.0714 sec/batch\n", + "Epoch 3/20 Iteration 443/3560 Training loss: 2.1354 0.0704 sec/batch\n", + "Epoch 3/20 Iteration 444/3560 Training loss: 2.1345 0.0688 sec/batch\n", + "Epoch 3/20 Iteration 445/3560 Training loss: 2.1336 0.0641 sec/batch\n", + "Epoch 3/20 Iteration 446/3560 Training loss: 2.1331 0.0714 sec/batch\n", + "Epoch 3/20 Iteration 447/3560 Training loss: 2.1323 0.0746 sec/batch\n", + "Epoch 3/20 Iteration 448/3560 Training loss: 2.1317 0.0670 sec/batch\n", + "Epoch 3/20 Iteration 449/3560 Training loss: 2.1308 0.0737 sec/batch\n", + "Epoch 3/20 Iteration 450/3560 Training loss: 2.1299 0.0679 sec/batch\n", + "Epoch 3/20 Iteration 451/3560 Training loss: 2.1289 0.0663 sec/batch\n", + "Epoch 3/20 Iteration 452/3560 Training loss: 2.1282 0.0672 sec/batch\n", + "Epoch 3/20 Iteration 453/3560 Training loss: 2.1275 0.0677 sec/batch\n", + "Epoch 3/20 Iteration 454/3560 Training loss: 2.1266 0.0634 sec/batch\n", + "Epoch 3/20 Iteration 455/3560 Training loss: 2.1258 0.0664 sec/batch\n", + "Epoch 3/20 Iteration 456/3560 Training loss: 2.1248 0.0660 sec/batch\n", + "Epoch 3/20 Iteration 457/3560 Training loss: 2.1242 0.0667 sec/batch\n", + "Epoch 3/20 Iteration 458/3560 Training loss: 2.1236 0.0663 sec/batch\n", + "Epoch 3/20 Iteration 459/3560 Training loss: 2.1227 0.0740 sec/batch\n", + "Epoch 3/20 Iteration 460/3560 Training loss: 2.1220 0.0639 sec/batch\n", + "Epoch 3/20 Iteration 461/3560 Training loss: 2.1212 0.0652 sec/batch\n", + "Epoch 3/20 Iteration 462/3560 Training loss: 2.1205 0.0686 sec/batch\n", + "Epoch 3/20 Iteration 463/3560 Training loss: 2.1199 0.0654 sec/batch\n", + "Epoch 3/20 Iteration 464/3560 Training loss: 2.1192 0.0702 sec/batch\n", + "Epoch 3/20 Iteration 465/3560 
Training loss: 2.1187 0.0670 sec/batch\n", + "Epoch 3/20 Iteration 466/3560 Training loss: 2.1180 0.0730 sec/batch\n", + "Epoch 3/20 Iteration 467/3560 Training loss: 2.1175 0.0665 sec/batch\n", + "Epoch 3/20 Iteration 468/3560 Training loss: 2.1167 0.0659 sec/batch\n", + "Epoch 3/20 Iteration 469/3560 Training loss: 2.1160 0.0707 sec/batch\n", + "Epoch 3/20 Iteration 470/3560 Training loss: 2.1153 0.0665 sec/batch\n", + "Epoch 3/20 Iteration 471/3560 Training loss: 2.1145 0.0682 sec/batch\n", + "Epoch 3/20 Iteration 472/3560 Training loss: 2.1135 0.0701 sec/batch\n", + "Epoch 3/20 Iteration 473/3560 Training loss: 2.1129 0.0664 sec/batch\n", + "Epoch 3/20 Iteration 474/3560 Training loss: 2.1122 0.0644 sec/batch\n", + "Epoch 3/20 Iteration 475/3560 Training loss: 2.1117 0.0675 sec/batch\n", + "Epoch 3/20 Iteration 476/3560 Training loss: 2.1110 0.0706 sec/batch\n", + "Epoch 3/20 Iteration 477/3560 Training loss: 2.1104 0.0685 sec/batch\n", + "Epoch 3/20 Iteration 478/3560 Training loss: 2.1096 0.0658 sec/batch\n", + "Epoch 3/20 Iteration 479/3560 Training loss: 2.1089 0.0670 sec/batch\n", + "Epoch 3/20 Iteration 480/3560 Training loss: 2.1084 0.0654 sec/batch\n", + "Epoch 3/20 Iteration 481/3560 Training loss: 2.1078 0.0677 sec/batch\n", + "Epoch 3/20 Iteration 482/3560 Training loss: 2.1069 0.0697 sec/batch\n", + "Epoch 3/20 Iteration 483/3560 Training loss: 2.1064 0.0667 sec/batch\n", + "Epoch 3/20 Iteration 484/3560 Training loss: 2.1059 0.0669 sec/batch\n", + "Epoch 3/20 Iteration 485/3560 Training loss: 2.1053 0.0726 sec/batch\n", + "Epoch 3/20 Iteration 486/3560 Training loss: 2.1048 0.0690 sec/batch\n", + "Epoch 3/20 Iteration 487/3560 Training loss: 2.1041 0.0651 sec/batch\n", + "Epoch 3/20 Iteration 488/3560 Training loss: 2.1033 0.0688 sec/batch\n", + "Epoch 3/20 Iteration 489/3560 Training loss: 2.1028 0.0716 sec/batch\n", + "Epoch 3/20 Iteration 490/3560 Training loss: 2.1024 0.0663 sec/batch\n", + "Epoch 3/20 Iteration 491/3560 Training loss: 2.1017 
0.0715 sec/batch\n", + "Epoch 3/20 Iteration 492/3560 Training loss: 2.1012 0.0690 sec/batch\n", + "Epoch 3/20 Iteration 493/3560 Training loss: 2.1007 0.0661 sec/batch\n", + "Epoch 3/20 Iteration 494/3560 Training loss: 2.1002 0.0668 sec/batch\n", + "Epoch 3/20 Iteration 495/3560 Training loss: 2.0998 0.0663 sec/batch\n", + "Epoch 3/20 Iteration 496/3560 Training loss: 2.0992 0.0653 sec/batch\n", + "Epoch 3/20 Iteration 497/3560 Training loss: 2.0988 0.0636 sec/batch\n", + "Epoch 3/20 Iteration 498/3560 Training loss: 2.0982 0.0653 sec/batch\n", + "Epoch 3/20 Iteration 499/3560 Training loss: 2.0975 0.0679 sec/batch\n", + "Epoch 3/20 Iteration 500/3560 Training loss: 2.0970 0.0704 sec/batch\n", + "Epoch 3/20 Iteration 501/3560 Training loss: 2.0963 0.0669 sec/batch\n", + "Epoch 3/20 Iteration 502/3560 Training loss: 2.0959 0.0685 sec/batch\n", + "Epoch 3/20 Iteration 503/3560 Training loss: 2.0954 0.0652 sec/batch\n", + "Epoch 3/20 Iteration 504/3560 Training loss: 2.0950 0.0724 sec/batch\n", + "Epoch 3/20 Iteration 505/3560 Training loss: 2.0943 0.0702 sec/batch\n", + "Epoch 3/20 Iteration 506/3560 Training loss: 2.0937 0.0667 sec/batch\n", + "Epoch 3/20 Iteration 507/3560 Training loss: 2.0932 0.0662 sec/batch\n", + "Epoch 3/20 Iteration 508/3560 Training loss: 2.0928 0.0657 sec/batch\n", + "Epoch 3/20 Iteration 509/3560 Training loss: 2.0923 0.0694 sec/batch\n", + "Epoch 3/20 Iteration 510/3560 Training loss: 2.0918 0.0665 sec/batch\n", + "Epoch 3/20 Iteration 511/3560 Training loss: 2.0912 0.0684 sec/batch\n", + "Epoch 3/20 Iteration 512/3560 Training loss: 2.0906 0.0668 sec/batch\n", + "Epoch 3/20 Iteration 513/3560 Training loss: 2.0901 0.0697 sec/batch\n", + "Epoch 3/20 Iteration 514/3560 Training loss: 2.0896 0.0696 sec/batch\n", + "Epoch 3/20 Iteration 515/3560 Training loss: 2.0889 0.0682 sec/batch\n", + "Epoch 3/20 Iteration 516/3560 Training loss: 2.0885 0.0682 sec/batch\n", + "Epoch 3/20 Iteration 517/3560 Training loss: 2.0881 0.0670 sec/batch\n", + 
"Epoch 3/20 Iteration 518/3560 Training loss: 2.0874 0.0693 sec/batch\n", + "Epoch 3/20 Iteration 519/3560 Training loss: 2.0869 0.0688 sec/batch\n", + "Epoch 3/20 Iteration 520/3560 Training loss: 2.0864 0.0709 sec/batch\n", + "Epoch 3/20 Iteration 521/3560 Training loss: 2.0859 0.0762 sec/batch\n", + "Epoch 3/20 Iteration 522/3560 Training loss: 2.0853 0.0773 sec/batch\n", + "Epoch 3/20 Iteration 523/3560 Training loss: 2.0848 0.0682 sec/batch\n", + "Epoch 3/20 Iteration 524/3560 Training loss: 2.0845 0.0655 sec/batch\n", + "Epoch 3/20 Iteration 525/3560 Training loss: 2.0839 0.0674 sec/batch\n", + "Epoch 3/20 Iteration 526/3560 Training loss: 2.0833 0.0639 sec/batch\n", + "Epoch 3/20 Iteration 527/3560 Training loss: 2.0827 0.0686 sec/batch\n", + "Epoch 3/20 Iteration 528/3560 Training loss: 2.0821 0.0640 sec/batch\n", + "Epoch 3/20 Iteration 529/3560 Training loss: 2.0817 0.0695 sec/batch\n", + "Epoch 3/20 Iteration 530/3560 Training loss: 2.0812 0.0648 sec/batch\n", + "Epoch 3/20 Iteration 531/3560 Training loss: 2.0808 0.0668 sec/batch\n", + "Epoch 3/20 Iteration 532/3560 Training loss: 2.0802 0.0694 sec/batch\n", + "Epoch 3/20 Iteration 533/3560 Training loss: 2.0796 0.0678 sec/batch\n", + "Epoch 3/20 Iteration 534/3560 Training loss: 2.0791 0.0715 sec/batch\n", + "Epoch 4/20 Iteration 535/3560 Training loss: 2.0504 0.0681 sec/batch\n", + "Epoch 4/20 Iteration 536/3560 Training loss: 2.0076 0.0654 sec/batch\n", + "Epoch 4/20 Iteration 537/3560 Training loss: 1.9902 0.0713 sec/batch\n", + "Epoch 4/20 Iteration 538/3560 Training loss: 1.9841 0.0642 sec/batch\n", + "Epoch 4/20 Iteration 539/3560 Training loss: 1.9820 0.0711 sec/batch\n", + "Epoch 4/20 Iteration 540/3560 Training loss: 1.9758 0.0679 sec/batch\n", + "Epoch 4/20 Iteration 541/3560 Training loss: 1.9779 0.0726 sec/batch\n", + "Epoch 4/20 Iteration 542/3560 Training loss: 1.9774 0.0668 sec/batch\n", + "Epoch 4/20 Iteration 543/3560 Training loss: 1.9796 0.0694 sec/batch\n", + "Epoch 4/20 Iteration 
544/3560 Training loss: 1.9790 0.0689 sec/batch\n", + "Epoch 4/20 Iteration 545/3560 Training loss: 1.9764 0.0695 sec/batch\n", + "Epoch 4/20 Iteration 546/3560 Training loss: 1.9743 0.0671 sec/batch\n", + "Epoch 4/20 Iteration 547/3560 Training loss: 1.9739 0.0672 sec/batch\n", + "Epoch 4/20 Iteration 548/3560 Training loss: 1.9772 0.0687 sec/batch\n", + "Epoch 4/20 Iteration 549/3560 Training loss: 1.9768 0.0679 sec/batch\n", + "Epoch 4/20 Iteration 550/3560 Training loss: 1.9747 0.0676 sec/batch\n", + "Epoch 4/20 Iteration 551/3560 Training loss: 1.9742 0.0669 sec/batch\n", + "Epoch 4/20 Iteration 552/3560 Training loss: 1.9756 0.0694 sec/batch\n", + "Epoch 4/20 Iteration 553/3560 Training loss: 1.9753 0.0670 sec/batch\n", + "Epoch 4/20 Iteration 554/3560 Training loss: 1.9749 0.0697 sec/batch\n", + "Epoch 4/20 Iteration 555/3560 Training loss: 1.9739 0.0669 sec/batch\n", + "Epoch 4/20 Iteration 556/3560 Training loss: 1.9748 0.0657 sec/batch\n", + "Epoch 4/20 Iteration 557/3560 Training loss: 1.9738 0.0654 sec/batch\n", + "Epoch 4/20 Iteration 558/3560 Training loss: 1.9732 0.0730 sec/batch\n", + "Epoch 4/20 Iteration 559/3560 Training loss: 1.9723 0.0694 sec/batch\n", + "Epoch 4/20 Iteration 560/3560 Training loss: 1.9710 0.0661 sec/batch\n", + "Epoch 4/20 Iteration 561/3560 Training loss: 1.9697 0.0737 sec/batch\n", + "Epoch 4/20 Iteration 562/3560 Training loss: 1.9695 0.0649 sec/batch\n", + "Epoch 4/20 Iteration 563/3560 Training loss: 1.9702 0.0772 sec/batch\n", + "Epoch 4/20 Iteration 564/3560 Training loss: 1.9700 0.0682 sec/batch\n", + "Epoch 4/20 Iteration 565/3560 Training loss: 1.9693 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 566/3560 Training loss: 1.9682 0.0685 sec/batch\n", + "Epoch 4/20 Iteration 567/3560 Training loss: 1.9679 0.0717 sec/batch\n", + "Epoch 4/20 Iteration 568/3560 Training loss: 1.9680 0.0668 sec/batch\n", + "Epoch 4/20 Iteration 569/3560 Training loss: 1.9669 0.0690 sec/batch\n", + "Epoch 4/20 Iteration 570/3560 Training loss: 
1.9663 0.0699 sec/batch\n", + "Epoch 4/20 Iteration 571/3560 Training loss: 1.9654 0.0673 sec/batch\n", + "Epoch 4/20 Iteration 572/3560 Training loss: 1.9641 0.0688 sec/batch\n", + "Epoch 4/20 Iteration 573/3560 Training loss: 1.9627 0.0658 sec/batch\n", + "Epoch 4/20 Iteration 574/3560 Training loss: 1.9618 0.0705 sec/batch\n", + "Epoch 4/20 Iteration 575/3560 Training loss: 1.9609 0.0700 sec/batch\n", + "Epoch 4/20 Iteration 576/3560 Training loss: 1.9604 0.0670 sec/batch\n", + "Epoch 4/20 Iteration 577/3560 Training loss: 1.9597 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 578/3560 Training loss: 1.9586 0.0638 sec/batch\n", + "Epoch 4/20 Iteration 579/3560 Training loss: 1.9581 0.0681 sec/batch\n", + "Epoch 4/20 Iteration 580/3560 Training loss: 1.9566 0.0721 sec/batch\n", + "Epoch 4/20 Iteration 581/3560 Training loss: 1.9560 0.0674 sec/batch\n", + "Epoch 4/20 Iteration 582/3560 Training loss: 1.9550 0.0709 sec/batch\n", + "Epoch 4/20 Iteration 583/3560 Training loss: 1.9543 0.0684 sec/batch\n", + "Epoch 4/20 Iteration 584/3560 Training loss: 1.9547 0.0663 sec/batch\n", + "Epoch 4/20 Iteration 585/3560 Training loss: 1.9537 0.0668 sec/batch\n", + "Epoch 4/20 Iteration 586/3560 Training loss: 1.9540 0.0654 sec/batch\n", + "Epoch 4/20 Iteration 587/3560 Training loss: 1.9535 0.0690 sec/batch\n", + "Epoch 4/20 Iteration 588/3560 Training loss: 1.9530 0.0674 sec/batch\n", + "Epoch 4/20 Iteration 589/3560 Training loss: 1.9524 0.0682 sec/batch\n", + "Epoch 4/20 Iteration 590/3560 Training loss: 1.9521 0.0697 sec/batch\n", + "Epoch 4/20 Iteration 591/3560 Training loss: 1.9519 0.0647 sec/batch\n", + "Epoch 4/20 Iteration 592/3560 Training loss: 1.9514 0.0715 sec/batch\n", + "Epoch 4/20 Iteration 593/3560 Training loss: 1.9506 0.0658 sec/batch\n", + "Epoch 4/20 Iteration 594/3560 Training loss: 1.9505 0.0687 sec/batch\n", + "Epoch 4/20 Iteration 595/3560 Training loss: 1.9500 0.0681 sec/batch\n", + "Epoch 4/20 Iteration 596/3560 Training loss: 1.9503 0.0657 
sec/batch\n", + "Epoch 4/20 Iteration 597/3560 Training loss: 1.9503 0.0651 sec/batch\n", + "Epoch 4/20 Iteration 598/3560 Training loss: 1.9500 0.0661 sec/batch\n", + "Epoch 4/20 Iteration 599/3560 Training loss: 1.9496 0.0643 sec/batch\n", + "Epoch 4/20 Iteration 600/3560 Training loss: 1.9497 0.0663 sec/batch\n", + "Epoch 4/20 Iteration 601/3560 Training loss: 1.9495 0.0669 sec/batch\n", + "Epoch 4/20 Iteration 602/3560 Training loss: 1.9488 0.0700 sec/batch\n", + "Epoch 4/20 Iteration 603/3560 Training loss: 1.9483 0.0675 sec/batch\n", + "Epoch 4/20 Iteration 604/3560 Training loss: 1.9478 0.0723 sec/batch\n", + "Epoch 4/20 Iteration 605/3560 Training loss: 1.9479 0.0650 sec/batch\n", + "Epoch 4/20 Iteration 606/3560 Training loss: 1.9476 0.0637 sec/batch\n", + "Epoch 4/20 Iteration 607/3560 Training loss: 1.9476 0.0640 sec/batch\n", + "Epoch 4/20 Iteration 608/3560 Training loss: 1.9470 0.0680 sec/batch\n", + "Epoch 4/20 Iteration 609/3560 Training loss: 1.9466 0.0705 sec/batch\n", + "Epoch 4/20 Iteration 610/3560 Training loss: 1.9465 0.0680 sec/batch\n", + "Epoch 4/20 Iteration 611/3560 Training loss: 1.9460 0.0664 sec/batch\n", + "Epoch 4/20 Iteration 612/3560 Training loss: 1.9458 0.0802 sec/batch\n", + "Epoch 4/20 Iteration 613/3560 Training loss: 1.9449 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 614/3560 Training loss: 1.9446 0.0662 sec/batch\n", + "Epoch 4/20 Iteration 615/3560 Training loss: 1.9437 0.0712 sec/batch\n", + "Epoch 4/20 Iteration 616/3560 Training loss: 1.9435 0.0765 sec/batch\n", + "Epoch 4/20 Iteration 617/3560 Training loss: 1.9427 0.0677 sec/batch\n", + "Epoch 4/20 Iteration 618/3560 Training loss: 1.9423 0.0682 sec/batch\n", + "Epoch 4/20 Iteration 619/3560 Training loss: 1.9415 0.0651 sec/batch\n", + "Epoch 4/20 Iteration 620/3560 Training loss: 1.9410 0.0653 sec/batch\n", + "Epoch 4/20 Iteration 621/3560 Training loss: 1.9405 0.0661 sec/batch\n", + "Epoch 4/20 Iteration 622/3560 Training loss: 1.9398 0.0643 sec/batch\n", + "Epoch 
4/20 Iteration 623/3560 Training loss: 1.9390 0.0687 sec/batch\n", + "Epoch 4/20 Iteration 624/3560 Training loss: 1.9388 0.0668 sec/batch\n", + "Epoch 4/20 Iteration 625/3560 Training loss: 1.9382 0.0662 sec/batch\n", + "Epoch 4/20 Iteration 626/3560 Training loss: 1.9378 0.0680 sec/batch\n", + "Epoch 4/20 Iteration 627/3560 Training loss: 1.9371 0.0674 sec/batch\n", + "Epoch 4/20 Iteration 628/3560 Training loss: 1.9364 0.0736 sec/batch\n", + "Epoch 4/20 Iteration 629/3560 Training loss: 1.9357 0.0686 sec/batch\n", + "Epoch 4/20 Iteration 630/3560 Training loss: 1.9352 0.0637 sec/batch\n", + "Epoch 4/20 Iteration 631/3560 Training loss: 1.9348 0.0655 sec/batch\n", + "Epoch 4/20 Iteration 632/3560 Training loss: 1.9343 0.0686 sec/batch\n", + "Epoch 4/20 Iteration 633/3560 Training loss: 1.9336 0.0702 sec/batch\n", + "Epoch 4/20 Iteration 634/3560 Training loss: 1.9329 0.0725 sec/batch\n", + "Epoch 4/20 Iteration 635/3560 Training loss: 1.9326 0.0726 sec/batch\n", + "Epoch 4/20 Iteration 636/3560 Training loss: 1.9321 0.0689 sec/batch\n", + "Epoch 4/20 Iteration 637/3560 Training loss: 1.9316 0.0718 sec/batch\n", + "Epoch 4/20 Iteration 638/3560 Training loss: 1.9310 0.0694 sec/batch\n", + "Epoch 4/20 Iteration 639/3560 Training loss: 1.9304 0.0741 sec/batch\n", + "Epoch 4/20 Iteration 640/3560 Training loss: 1.9300 0.0664 sec/batch\n", + "Epoch 4/20 Iteration 641/3560 Training loss: 1.9296 0.0681 sec/batch\n", + "Epoch 4/20 Iteration 642/3560 Training loss: 1.9293 0.0684 sec/batch\n", + "Epoch 4/20 Iteration 643/3560 Training loss: 1.9290 0.0659 sec/batch\n", + "Epoch 4/20 Iteration 644/3560 Training loss: 1.9285 0.0685 sec/batch\n", + "Epoch 4/20 Iteration 645/3560 Training loss: 1.9282 0.0685 sec/batch\n", + "Epoch 4/20 Iteration 646/3560 Training loss: 1.9278 0.0712 sec/batch\n", + "Epoch 4/20 Iteration 647/3560 Training loss: 1.9272 0.0764 sec/batch\n", + "Epoch 4/20 Iteration 648/3560 Training loss: 1.9268 0.0703 sec/batch\n", + "Epoch 4/20 Iteration 649/3560 
Training loss: 1.9263 0.0661 sec/batch\n", + "Epoch 4/20 Iteration 650/3560 Training loss: 1.9255 0.0728 sec/batch\n", + "Epoch 4/20 Iteration 651/3560 Training loss: 1.9252 0.0701 sec/batch\n", + "Epoch 4/20 Iteration 652/3560 Training loss: 1.9247 0.0693 sec/batch\n", + "Epoch 4/20 Iteration 653/3560 Training loss: 1.9243 0.0638 sec/batch\n", + "Epoch 4/20 Iteration 654/3560 Training loss: 1.9240 0.0677 sec/batch\n", + "Epoch 4/20 Iteration 655/3560 Training loss: 1.9236 0.0761 sec/batch\n", + "Epoch 4/20 Iteration 656/3560 Training loss: 1.9230 0.0686 sec/batch\n", + "Epoch 4/20 Iteration 657/3560 Training loss: 1.9225 0.0675 sec/batch\n", + "Epoch 4/20 Iteration 658/3560 Training loss: 1.9223 0.0676 sec/batch\n", + "Epoch 4/20 Iteration 659/3560 Training loss: 1.9218 0.0675 sec/batch\n", + "Epoch 4/20 Iteration 660/3560 Training loss: 1.9211 0.0672 sec/batch\n", + "Epoch 4/20 Iteration 661/3560 Training loss: 1.9209 0.0665 sec/batch\n", + "Epoch 4/20 Iteration 662/3560 Training loss: 1.9207 0.0691 sec/batch\n", + "Epoch 4/20 Iteration 663/3560 Training loss: 1.9203 0.0668 sec/batch\n", + "Epoch 4/20 Iteration 664/3560 Training loss: 1.9199 0.0677 sec/batch\n", + "Epoch 4/20 Iteration 665/3560 Training loss: 1.9195 0.0658 sec/batch\n", + "Epoch 4/20 Iteration 666/3560 Training loss: 1.9189 0.0741 sec/batch\n", + "Epoch 4/20 Iteration 667/3560 Training loss: 1.9187 0.0666 sec/batch\n", + "Epoch 4/20 Iteration 668/3560 Training loss: 1.9185 0.0710 sec/batch\n", + "Epoch 4/20 Iteration 669/3560 Training loss: 1.9181 0.0790 sec/batch\n", + "Epoch 4/20 Iteration 670/3560 Training loss: 1.9178 0.0647 sec/batch\n", + "Epoch 4/20 Iteration 671/3560 Training loss: 1.9176 0.0669 sec/batch\n", + "Epoch 4/20 Iteration 672/3560 Training loss: 1.9173 0.0715 sec/batch\n", + "Epoch 4/20 Iteration 673/3560 Training loss: 1.9172 0.0687 sec/batch\n", + "Epoch 4/20 Iteration 674/3560 Training loss: 1.9168 0.0685 sec/batch\n", + "Epoch 4/20 Iteration 675/3560 Training loss: 1.9167 
0.0687 sec/batch\n", + "Epoch 4/20 Iteration 676/3560 Training loss: 1.9163 0.0675 sec/batch\n", + "Epoch 4/20 Iteration 677/3560 Training loss: 1.9160 0.0658 sec/batch\n", + "Epoch 4/20 Iteration 678/3560 Training loss: 1.9157 0.0699 sec/batch\n", + "Epoch 4/20 Iteration 679/3560 Training loss: 1.9152 0.0660 sec/batch\n", + "Epoch 4/20 Iteration 680/3560 Training loss: 1.9150 0.0767 sec/batch\n", + "Epoch 4/20 Iteration 681/3560 Training loss: 1.9147 0.0686 sec/batch\n", + "Epoch 4/20 Iteration 682/3560 Training loss: 1.9146 0.0648 sec/batch\n", + "Epoch 4/20 Iteration 683/3560 Training loss: 1.9142 0.0781 sec/batch\n", + "Epoch 4/20 Iteration 684/3560 Training loss: 1.9138 0.0728 sec/batch\n", + "Epoch 4/20 Iteration 685/3560 Training loss: 1.9133 0.0666 sec/batch\n", + "Epoch 4/20 Iteration 686/3560 Training loss: 1.9132 0.0684 sec/batch\n", + "Epoch 4/20 Iteration 687/3560 Training loss: 1.9130 0.0750 sec/batch\n", + "Epoch 4/20 Iteration 688/3560 Training loss: 1.9127 0.0713 sec/batch\n", + "Epoch 4/20 Iteration 689/3560 Training loss: 1.9123 0.0746 sec/batch\n", + "Epoch 4/20 Iteration 690/3560 Training loss: 1.9119 0.0804 sec/batch\n", + "Epoch 4/20 Iteration 691/3560 Training loss: 1.9117 0.0688 sec/batch\n", + "Epoch 4/20 Iteration 692/3560 Training loss: 1.9114 0.0727 sec/batch\n", + "Epoch 4/20 Iteration 693/3560 Training loss: 1.9109 0.0698 sec/batch\n", + "Epoch 4/20 Iteration 694/3560 Training loss: 1.9108 0.0660 sec/batch\n", + "Epoch 4/20 Iteration 695/3560 Training loss: 1.9106 0.0693 sec/batch\n", + "Epoch 4/20 Iteration 696/3560 Training loss: 1.9102 0.0702 sec/batch\n", + "Epoch 4/20 Iteration 697/3560 Training loss: 1.9100 0.0668 sec/batch\n", + "Epoch 4/20 Iteration 698/3560 Training loss: 1.9097 0.0678 sec/batch\n", + "Epoch 4/20 Iteration 699/3560 Training loss: 1.9094 0.0697 sec/batch\n", + "Epoch 4/20 Iteration 700/3560 Training loss: 1.9091 0.0657 sec/batch\n", + "Epoch 4/20 Iteration 701/3560 Training loss: 1.9088 0.0680 sec/batch\n", + 
"Epoch 4/20 Iteration 702/3560 Training loss: 1.9088 0.0695 sec/batch\n", + "Epoch 4/20 Iteration 703/3560 Training loss: 1.9084 0.0676 sec/batch\n", + "Epoch 4/20 Iteration 704/3560 Training loss: 1.9080 0.0667 sec/batch\n", + "Epoch 4/20 Iteration 705/3560 Training loss: 1.9076 0.0710 sec/batch\n", + "Epoch 4/20 Iteration 706/3560 Training loss: 1.9072 0.0658 sec/batch\n", + "Epoch 4/20 Iteration 707/3560 Training loss: 1.9070 0.0816 sec/batch\n", + "Epoch 4/20 Iteration 708/3560 Training loss: 1.9068 0.0699 sec/batch\n", + "Epoch 4/20 Iteration 709/3560 Training loss: 1.9066 0.0705 sec/batch\n", + "Epoch 4/20 Iteration 710/3560 Training loss: 1.9062 0.0746 sec/batch\n", + "Epoch 4/20 Iteration 711/3560 Training loss: 1.9058 0.0657 sec/batch\n", + "Epoch 4/20 Iteration 712/3560 Training loss: 1.9056 0.0667 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 1.9239 0.0665 sec/batch\n", + "Epoch 5/20 Iteration 714/3560 Training loss: 1.8784 0.0732 sec/batch\n", + "Epoch 5/20 Iteration 715/3560 Training loss: 1.8641 0.0689 sec/batch\n", + "Epoch 5/20 Iteration 716/3560 Training loss: 1.8566 0.0696 sec/batch\n", + "Epoch 5/20 Iteration 717/3560 Training loss: 1.8549 0.0663 sec/batch\n", + "Epoch 5/20 Iteration 718/3560 Training loss: 1.8464 0.0672 sec/batch\n", + "Epoch 5/20 Iteration 719/3560 Training loss: 1.8471 0.0657 sec/batch\n", + "Epoch 5/20 Iteration 720/3560 Training loss: 1.8461 0.0664 sec/batch\n", + "Epoch 5/20 Iteration 721/3560 Training loss: 1.8486 0.0684 sec/batch\n", + "Epoch 5/20 Iteration 722/3560 Training loss: 1.8477 0.0644 sec/batch\n", + "Epoch 5/20 Iteration 723/3560 Training loss: 1.8448 0.0667 sec/batch\n", + "Epoch 5/20 Iteration 724/3560 Training loss: 1.8435 0.0672 sec/batch\n", + "Epoch 5/20 Iteration 725/3560 Training loss: 1.8427 0.0648 sec/batch\n", + "Epoch 5/20 Iteration 726/3560 Training loss: 1.8457 0.0670 sec/batch\n", + "Epoch 5/20 Iteration 727/3560 Training loss: 1.8456 0.0667 sec/batch\n", + "Epoch 5/20 Iteration 
728/3560 Training loss: 1.8429 0.0647 sec/batch\n", + "Epoch 5/20 Iteration 729/3560 Training loss: 1.8430 0.0670 sec/batch\n", + "Epoch 5/20 Iteration 730/3560 Training loss: 1.8448 0.0673 sec/batch\n", + "Epoch 5/20 Iteration 731/3560 Training loss: 1.8438 0.0755 sec/batch\n", + "Epoch 5/20 Iteration 732/3560 Training loss: 1.8441 0.0679 sec/batch\n", + "Epoch 5/20 Iteration 733/3560 Training loss: 1.8429 0.0676 sec/batch\n", + "Epoch 5/20 Iteration 734/3560 Training loss: 1.8440 0.0681 sec/batch\n", + "Epoch 5/20 Iteration 735/3560 Training loss: 1.8430 0.0663 sec/batch\n", + "Epoch 5/20 Iteration 736/3560 Training loss: 1.8422 0.0679 sec/batch\n", + "Epoch 5/20 Iteration 737/3560 Training loss: 1.8414 0.0673 sec/batch\n", + "Epoch 5/20 Iteration 738/3560 Training loss: 1.8400 0.0669 sec/batch\n", + "Epoch 5/20 Iteration 739/3560 Training loss: 1.8387 0.0736 sec/batch\n", + "Epoch 5/20 Iteration 740/3560 Training loss: 1.8382 0.0711 sec/batch\n", + "Epoch 5/20 Iteration 741/3560 Training loss: 1.8392 0.0694 sec/batch\n", + "Epoch 5/20 Iteration 742/3560 Training loss: 1.8395 0.0673 sec/batch\n", + "Epoch 5/20 Iteration 743/3560 Training loss: 1.8392 0.0692 sec/batch\n", + "Epoch 5/20 Iteration 744/3560 Training loss: 1.8380 0.0753 sec/batch\n", + "Epoch 5/20 Iteration 745/3560 Training loss: 1.8381 0.0700 sec/batch\n", + "Epoch 5/20 Iteration 746/3560 Training loss: 1.8382 0.0662 sec/batch\n", + "Epoch 5/20 Iteration 747/3560 Training loss: 1.8377 0.0731 sec/batch\n", + "Epoch 5/20 Iteration 748/3560 Training loss: 1.8373 0.0753 sec/batch\n", + "Epoch 5/20 Iteration 749/3560 Training loss: 1.8363 0.0666 sec/batch\n", + "Epoch 5/20 Iteration 750/3560 Training loss: 1.8349 0.0668 sec/batch\n", + "Epoch 5/20 Iteration 751/3560 Training loss: 1.8334 0.0681 sec/batch\n", + "Epoch 5/20 Iteration 752/3560 Training loss: 1.8325 0.0687 sec/batch\n", + "Epoch 5/20 Iteration 753/3560 Training loss: 1.8318 0.0782 sec/batch\n", + "Epoch 5/20 Iteration 754/3560 Training loss: 
1.8317 0.0721 sec/batch\n", + "Epoch 5/20 Iteration 755/3560 Training loss: 1.8309 0.0711 sec/batch\n", + "Epoch 5/20 Iteration 756/3560 Training loss: 1.8298 0.0754 sec/batch\n", + "Epoch 5/20 Iteration 757/3560 Training loss: 1.8296 0.0748 sec/batch\n", + "Epoch 5/20 Iteration 758/3560 Training loss: 1.8282 0.0652 sec/batch\n", + "Epoch 5/20 Iteration 759/3560 Training loss: 1.8280 0.0652 sec/batch\n", + "Epoch 5/20 Iteration 760/3560 Training loss: 1.8274 0.0677 sec/batch\n", + "Epoch 5/20 Iteration 761/3560 Training loss: 1.8270 0.0717 sec/batch\n", + "Epoch 5/20 Iteration 762/3560 Training loss: 1.8275 0.0724 sec/batch\n", + "Epoch 5/20 Iteration 763/3560 Training loss: 1.8267 0.0692 sec/batch\n", + "Epoch 5/20 Iteration 764/3560 Training loss: 1.8275 0.0703 sec/batch\n", + "Epoch 5/20 Iteration 765/3560 Training loss: 1.8271 0.0645 sec/batch\n", + "Epoch 5/20 Iteration 766/3560 Training loss: 1.8268 0.0642 sec/batch\n", + "Epoch 5/20 Iteration 767/3560 Training loss: 1.8263 0.0675 sec/batch\n", + "Epoch 5/20 Iteration 768/3560 Training loss: 1.8261 0.0660 sec/batch\n", + "Epoch 5/20 Iteration 769/3560 Training loss: 1.8261 0.0657 sec/batch\n", + "Epoch 5/20 Iteration 770/3560 Training loss: 1.8258 0.0658 sec/batch\n", + "Epoch 5/20 Iteration 771/3560 Training loss: 1.8251 0.0688 sec/batch\n", + "Epoch 5/20 Iteration 772/3560 Training loss: 1.8253 0.0717 sec/batch\n", + "Epoch 5/20 Iteration 773/3560 Training loss: 1.8251 0.0640 sec/batch\n", + "Epoch 5/20 Iteration 774/3560 Training loss: 1.8256 0.0723 sec/batch\n", + "Epoch 5/20 Iteration 775/3560 Training loss: 1.8257 0.0650 sec/batch\n", + "Epoch 5/20 Iteration 776/3560 Training loss: 1.8256 0.0678 sec/batch\n", + "Epoch 5/20 Iteration 777/3560 Training loss: 1.8253 0.0665 sec/batch\n", + "Epoch 5/20 Iteration 778/3560 Training loss: 1.8254 0.0633 sec/batch\n", + "Epoch 5/20 Iteration 779/3560 Training loss: 1.8253 0.0655 sec/batch\n", + "Epoch 5/20 Iteration 780/3560 Training loss: 1.8246 0.0715 
sec/batch\n", + "Epoch 5/20 Iteration 781/3560 Training loss: 1.8243 0.0857 sec/batch\n", + "Epoch 5/20 Iteration 782/3560 Training loss: 1.8239 0.0695 sec/batch\n", + "Epoch 5/20 Iteration 783/3560 Training loss: 1.8240 0.0672 sec/batch\n", + "Epoch 5/20 Iteration 784/3560 Training loss: 1.8241 0.0641 sec/batch\n", + "Epoch 5/20 Iteration 785/3560 Training loss: 1.8242 0.0637 sec/batch\n", + "Epoch 5/20 Iteration 786/3560 Training loss: 1.8236 0.0692 sec/batch\n", + "Epoch 5/20 Iteration 787/3560 Training loss: 1.8233 0.0665 sec/batch\n", + "Epoch 5/20 Iteration 788/3560 Training loss: 1.8234 0.0658 sec/batch\n", + "Epoch 5/20 Iteration 789/3560 Training loss: 1.8231 0.0663 sec/batch\n", + "Epoch 5/20 Iteration 790/3560 Training loss: 1.8230 0.0688 sec/batch\n", + "Epoch 5/20 Iteration 791/3560 Training loss: 1.8223 0.0694 sec/batch\n", + "Epoch 5/20 Iteration 792/3560 Training loss: 1.8219 0.0664 sec/batch\n", + "Epoch 5/20 Iteration 793/3560 Training loss: 1.8212 0.0778 sec/batch\n", + "Epoch 5/20 Iteration 794/3560 Training loss: 1.8212 0.0677 sec/batch\n", + "Epoch 5/20 Iteration 795/3560 Training loss: 1.8205 0.0664 sec/batch\n", + "Epoch 5/20 Iteration 796/3560 Training loss: 1.8203 0.0646 sec/batch\n", + "Epoch 5/20 Iteration 797/3560 Training loss: 1.8197 0.0647 sec/batch\n", + "Epoch 5/20 Iteration 798/3560 Training loss: 1.8192 0.0683 sec/batch\n", + "Epoch 5/20 Iteration 799/3560 Training loss: 1.8188 0.0679 sec/batch\n", + "Epoch 5/20 Iteration 800/3560 Training loss: 1.8183 0.0690 sec/batch\n", + "Epoch 5/20 Iteration 801/3560 Training loss: 1.8176 0.0725 sec/batch\n", + "Epoch 5/20 Iteration 802/3560 Training loss: 1.8176 0.0689 sec/batch\n", + "Epoch 5/20 Iteration 803/3560 Training loss: 1.8170 0.0703 sec/batch\n", + "Epoch 5/20 Iteration 804/3560 Training loss: 1.8167 0.0665 sec/batch\n", + "Epoch 5/20 Iteration 805/3560 Training loss: 1.8160 0.0701 sec/batch\n", + "Epoch 5/20 Iteration 806/3560 Training loss: 1.8155 0.0689 sec/batch\n", + "Epoch 
5/20 Iteration 807/3560 Training loss: 1.8150 0.0727 sec/batch\n", + "Epoch 5/20 Iteration 808/3560 Training loss: 1.8147 0.0684 sec/batch\n", + "Epoch 5/20 Iteration 809/3560 Training loss: 1.8143 0.0687 sec/batch\n", + "Epoch 5/20 Iteration 810/3560 Training loss: 1.8137 0.0697 sec/batch\n", + "Epoch 5/20 Iteration 811/3560 Training loss: 1.8131 0.0718 sec/batch\n", + "Epoch 5/20 Iteration 812/3560 Training loss: 1.8124 0.0724 sec/batch\n", + "Epoch 5/20 Iteration 813/3560 Training loss: 1.8122 0.0725 sec/batch\n", + "Epoch 5/20 Iteration 814/3560 Training loss: 1.8118 0.0678 sec/batch\n", + "Epoch 5/20 Iteration 815/3560 Training loss: 1.8114 0.0767 sec/batch\n", + "Epoch 5/20 Iteration 816/3560 Training loss: 1.8109 0.0691 sec/batch\n", + "Epoch 5/20 Iteration 817/3560 Training loss: 1.8105 0.0680 sec/batch\n", + "Epoch 5/20 Iteration 818/3560 Training loss: 1.8102 0.0744 sec/batch\n", + "Epoch 5/20 Iteration 819/3560 Training loss: 1.8099 0.0681 sec/batch\n", + "Epoch 5/20 Iteration 820/3560 Training loss: 1.8096 0.0704 sec/batch\n", + "Epoch 5/20 Iteration 821/3560 Training loss: 1.8095 0.0805 sec/batch\n", + "Epoch 5/20 Iteration 822/3560 Training loss: 1.8092 0.0661 sec/batch\n", + "Epoch 5/20 Iteration 823/3560 Training loss: 1.8089 0.0643 sec/batch\n", + "Epoch 5/20 Iteration 824/3560 Training loss: 1.8085 0.0680 sec/batch\n", + "Epoch 5/20 Iteration 825/3560 Training loss: 1.8082 0.0714 sec/batch\n", + "Epoch 5/20 Iteration 826/3560 Training loss: 1.8079 0.0755 sec/batch\n", + "Epoch 5/20 Iteration 827/3560 Training loss: 1.8074 0.0694 sec/batch\n", + "Epoch 5/20 Iteration 828/3560 Training loss: 1.8069 0.0660 sec/batch\n", + "Epoch 5/20 Iteration 829/3560 Training loss: 1.8066 0.0655 sec/batch\n", + "Epoch 5/20 Iteration 830/3560 Training loss: 1.8063 0.0700 sec/batch\n", + "Epoch 5/20 Iteration 831/3560 Training loss: 1.8060 0.0695 sec/batch\n", + "Epoch 5/20 Iteration 832/3560 Training loss: 1.8058 0.0729 sec/batch\n", + "Epoch 5/20 Iteration 833/3560 
Training loss: 1.8056 0.0731 sec/batch\n", + "Epoch 5/20 Iteration 834/3560 Training loss: 1.8051 0.0709 sec/batch\n", + "Epoch 5/20 Iteration 835/3560 Training loss: 1.8046 0.0693 sec/batch\n", + "Epoch 5/20 Iteration 836/3560 Training loss: 1.8045 0.0656 sec/batch\n", + "Epoch 5/20 Iteration 837/3560 Training loss: 1.8042 0.0681 sec/batch\n", + "Epoch 5/20 Iteration 838/3560 Training loss: 1.8035 0.0692 sec/batch\n", + "Epoch 5/20 Iteration 839/3560 Training loss: 1.8035 0.0760 sec/batch\n", + "Epoch 5/20 Iteration 840/3560 Training loss: 1.8034 0.0692 sec/batch\n", + "Epoch 5/20 Iteration 841/3560 Training loss: 1.8031 0.0673 sec/batch\n", + "Epoch 5/20 Iteration 842/3560 Training loss: 1.8028 0.0702 sec/batch\n", + "Epoch 5/20 Iteration 843/3560 Training loss: 1.8024 0.0670 sec/batch\n", + "Epoch 5/20 Iteration 844/3560 Training loss: 1.8020 0.0709 sec/batch\n", + "Epoch 5/20 Iteration 845/3560 Training loss: 1.8018 0.0667 sec/batch\n", + "Epoch 5/20 Iteration 846/3560 Training loss: 1.8017 0.0691 sec/batch\n", + "Epoch 5/20 Iteration 847/3560 Training loss: 1.8014 0.0644 sec/batch\n", + "Epoch 5/20 Iteration 848/3560 Training loss: 1.8014 0.0700 sec/batch\n", + "Epoch 5/20 Iteration 849/3560 Training loss: 1.8013 0.0675 sec/batch\n", + "Epoch 5/20 Iteration 850/3560 Training loss: 1.8012 0.0681 sec/batch\n", + "Epoch 5/20 Iteration 851/3560 Training loss: 1.8011 0.0660 sec/batch\n", + "Epoch 5/20 Iteration 852/3560 Training loss: 1.8009 0.0654 sec/batch\n", + "Epoch 5/20 Iteration 853/3560 Training loss: 1.8009 0.0641 sec/batch\n", + "Epoch 5/20 Iteration 854/3560 Training loss: 1.8006 0.0693 sec/batch\n", + "Epoch 5/20 Iteration 855/3560 Training loss: 1.8004 0.0711 sec/batch\n", + "Epoch 5/20 Iteration 856/3560 Training loss: 1.8002 0.0753 sec/batch\n", + "Epoch 5/20 Iteration 857/3560 Training loss: 1.7998 0.0703 sec/batch\n", + "Epoch 5/20 Iteration 858/3560 Training loss: 1.7997 0.0702 sec/batch\n", + "Epoch 5/20 Iteration 859/3560 Training loss: 1.7995 
0.0667 sec/batch\n", + "Epoch 5/20 Iteration 860/3560 Training loss: 1.7995 0.0680 sec/batch\n", + "Epoch 5/20 Iteration 861/3560 Training loss: 1.7993 0.0729 sec/batch\n", + "Epoch 5/20 Iteration 862/3560 Training loss: 1.7990 0.0665 sec/batch\n", + "Epoch 5/20 Iteration 863/3560 Training loss: 1.7986 0.0671 sec/batch\n", + "Epoch 5/20 Iteration 864/3560 Training loss: 1.7985 0.0677 sec/batch\n", + "Epoch 5/20 Iteration 865/3560 Training loss: 1.7983 0.0669 sec/batch\n", + "Epoch 5/20 Iteration 866/3560 Training loss: 1.7982 0.0673 sec/batch\n", + "Epoch 5/20 Iteration 867/3560 Training loss: 1.7979 0.0761 sec/batch\n", + "Epoch 5/20 Iteration 868/3560 Training loss: 1.7977 0.0713 sec/batch\n", + "Epoch 5/20 Iteration 869/3560 Training loss: 1.7976 0.0676 sec/batch\n", + "Epoch 5/20 Iteration 870/3560 Training loss: 1.7974 0.0694 sec/batch\n", + "Epoch 5/20 Iteration 871/3560 Training loss: 1.7970 0.0748 sec/batch\n", + "Epoch 5/20 Iteration 872/3560 Training loss: 1.7970 0.0688 sec/batch\n", + "Epoch 5/20 Iteration 873/3560 Training loss: 1.7969 0.0710 sec/batch\n", + "Epoch 5/20 Iteration 874/3560 Training loss: 1.7967 0.0698 sec/batch\n", + "Epoch 5/20 Iteration 875/3560 Training loss: 1.7965 0.0789 sec/batch\n", + "Epoch 5/20 Iteration 876/3560 Training loss: 1.7963 0.0740 sec/batch\n", + "Epoch 5/20 Iteration 877/3560 Training loss: 1.7961 0.0704 sec/batch\n", + "Epoch 5/20 Iteration 878/3560 Training loss: 1.7958 0.0668 sec/batch\n", + "Epoch 5/20 Iteration 879/3560 Training loss: 1.7957 0.0684 sec/batch\n", + "Epoch 5/20 Iteration 880/3560 Training loss: 1.7958 0.0684 sec/batch\n", + "Epoch 5/20 Iteration 881/3560 Training loss: 1.7956 0.0659 sec/batch\n", + "Epoch 5/20 Iteration 882/3560 Training loss: 1.7953 0.0634 sec/batch\n", + "Epoch 5/20 Iteration 883/3560 Training loss: 1.7949 0.0643 sec/batch\n", + "Epoch 5/20 Iteration 884/3560 Training loss: 1.7946 0.0655 sec/batch\n", + "Epoch 5/20 Iteration 885/3560 Training loss: 1.7945 0.0655 sec/batch\n", + 
"Epoch 5/20 Iteration 886/3560 Training loss: 1.7943 0.0722 sec/batch\n", + "Epoch 5/20 Iteration 887/3560 Training loss: 1.7943 0.0698 sec/batch\n", + "Epoch 5/20 Iteration 888/3560 Training loss: 1.7940 0.0701 sec/batch\n", + "Epoch 5/20 Iteration 889/3560 Training loss: 1.7936 0.0711 sec/batch\n", + "Epoch 5/20 Iteration 890/3560 Training loss: 1.7935 0.0728 sec/batch\n", + "Epoch 6/20 Iteration 891/3560 Training loss: 1.8312 0.0692 sec/batch\n", + "Epoch 6/20 Iteration 892/3560 Training loss: 1.7920 0.0703 sec/batch\n", + "Epoch 6/20 Iteration 893/3560 Training loss: 1.7784 0.0699 sec/batch\n", + "Epoch 6/20 Iteration 894/3560 Training loss: 1.7709 0.0684 sec/batch\n", + "Epoch 6/20 Iteration 895/3560 Training loss: 1.7656 0.0673 sec/batch\n", + "Epoch 6/20 Iteration 896/3560 Training loss: 1.7552 0.0688 sec/batch\n", + "Epoch 6/20 Iteration 897/3560 Training loss: 1.7545 0.0690 sec/batch\n", + "Epoch 6/20 Iteration 898/3560 Training loss: 1.7528 0.0691 sec/batch\n", + "Epoch 6/20 Iteration 899/3560 Training loss: 1.7555 0.0805 sec/batch\n", + "Epoch 6/20 Iteration 900/3560 Training loss: 1.7548 0.0669 sec/batch\n", + "Epoch 6/20 Iteration 901/3560 Training loss: 1.7517 0.0704 sec/batch\n", + "Epoch 6/20 Iteration 902/3560 Training loss: 1.7508 0.0707 sec/batch\n", + "Epoch 6/20 Iteration 903/3560 Training loss: 1.7504 0.0680 sec/batch\n", + "Epoch 6/20 Iteration 904/3560 Training loss: 1.7528 0.0673 sec/batch\n", + "Epoch 6/20 Iteration 905/3560 Training loss: 1.7521 0.0685 sec/batch\n", + "Epoch 6/20 Iteration 906/3560 Training loss: 1.7498 0.0672 sec/batch\n", + "Epoch 6/20 Iteration 907/3560 Training loss: 1.7494 0.0725 sec/batch\n", + "Epoch 6/20 Iteration 908/3560 Training loss: 1.7516 0.0666 sec/batch\n", + "Epoch 6/20 Iteration 909/3560 Training loss: 1.7516 0.0693 sec/batch\n", + "Epoch 6/20 Iteration 910/3560 Training loss: 1.7520 0.0688 sec/batch\n", + "Epoch 6/20 Iteration 911/3560 Training loss: 1.7514 0.0698 sec/batch\n", + "Epoch 6/20 Iteration 
912/3560 Training loss: 1.7524 0.0758 sec/batch\n", + "Epoch 6/20 Iteration 913/3560 Training loss: 1.7515 0.0672 sec/batch\n", + "Epoch 6/20 Iteration 914/3560 Training loss: 1.7508 0.0689 sec/batch\n", + "Epoch 6/20 Iteration 915/3560 Training loss: 1.7503 0.0656 sec/batch\n", + "Epoch 6/20 Iteration 916/3560 Training loss: 1.7493 0.0645 sec/batch\n", + "Epoch 6/20 Iteration 917/3560 Training loss: 1.7476 0.0675 sec/batch\n", + "Epoch 6/20 Iteration 918/3560 Training loss: 1.7478 0.0660 sec/batch\n", + "Epoch 6/20 Iteration 919/3560 Training loss: 1.7484 0.0663 sec/batch\n", + "Epoch 6/20 Iteration 920/3560 Training loss: 1.7482 0.0654 sec/batch\n", + "Epoch 6/20 Iteration 921/3560 Training loss: 1.7479 0.0646 sec/batch\n", + "Epoch 6/20 Iteration 922/3560 Training loss: 1.7468 0.0666 sec/batch\n", + "Epoch 6/20 Iteration 923/3560 Training loss: 1.7467 0.0683 sec/batch\n", + "Epoch 6/20 Iteration 924/3560 Training loss: 1.7473 0.0732 sec/batch\n", + "Epoch 6/20 Iteration 925/3560 Training loss: 1.7469 0.0674 sec/batch\n", + "Epoch 6/20 Iteration 926/3560 Training loss: 1.7464 0.0706 sec/batch\n", + "Epoch 6/20 Iteration 927/3560 Training loss: 1.7455 0.0701 sec/batch\n", + "Epoch 6/20 Iteration 928/3560 Training loss: 1.7440 0.0649 sec/batch\n", + "Epoch 6/20 Iteration 929/3560 Training loss: 1.7427 0.0784 sec/batch\n", + "Epoch 6/20 Iteration 930/3560 Training loss: 1.7420 0.0660 sec/batch\n", + "Epoch 6/20 Iteration 931/3560 Training loss: 1.7414 0.0751 sec/batch\n", + "Epoch 6/20 Iteration 932/3560 Training loss: 1.7417 0.0693 sec/batch\n", + "Epoch 6/20 Iteration 933/3560 Training loss: 1.7410 0.0664 sec/batch\n", + "Epoch 6/20 Iteration 934/3560 Training loss: 1.7400 0.0663 sec/batch\n", + "Epoch 6/20 Iteration 935/3560 Training loss: 1.7400 0.0659 sec/batch\n", + "Epoch 6/20 Iteration 936/3560 Training loss: 1.7387 0.0713 sec/batch\n", + "Epoch 6/20 Iteration 937/3560 Training loss: 1.7384 0.0672 sec/batch\n", + "Epoch 6/20 Iteration 938/3560 Training loss: 
1.7378 0.0748 sec/batch\n", + "Epoch 6/20 Iteration 939/3560 Training loss: 1.7374 0.0715 sec/batch\n", + "Epoch 6/20 Iteration 940/3560 Training loss: 1.7379 0.0685 sec/batch\n", + "Epoch 6/20 Iteration 941/3560 Training loss: 1.7374 0.0658 sec/batch\n", + "Epoch 6/20 Iteration 942/3560 Training loss: 1.7382 0.0647 sec/batch\n", + "Epoch 6/20 Iteration 943/3560 Training loss: 1.7377 0.0674 sec/batch\n", + "Epoch 6/20 Iteration 944/3560 Training loss: 1.7374 0.0703 sec/batch\n", + "Epoch 6/20 Iteration 945/3560 Training loss: 1.7369 0.0747 sec/batch\n", + "Epoch 6/20 Iteration 946/3560 Training loss: 1.7367 0.0682 sec/batch\n", + "Epoch 6/20 Iteration 947/3560 Training loss: 1.7368 0.0720 sec/batch\n", + "Epoch 6/20 Iteration 948/3560 Training loss: 1.7364 0.0675 sec/batch\n", + "Epoch 6/20 Iteration 949/3560 Training loss: 1.7359 0.0680 sec/batch\n", + "Epoch 6/20 Iteration 950/3560 Training loss: 1.7361 0.0708 sec/batch\n", + "Epoch 6/20 Iteration 951/3560 Training loss: 1.7358 0.0783 sec/batch\n", + "Epoch 6/20 Iteration 952/3560 Training loss: 1.7367 0.0667 sec/batch\n", + "Epoch 6/20 Iteration 953/3560 Training loss: 1.7368 0.0667 sec/batch\n", + "Epoch 6/20 Iteration 954/3560 Training loss: 1.7368 0.0684 sec/batch\n", + "Epoch 6/20 Iteration 955/3560 Training loss: 1.7366 0.0714 sec/batch\n", + "Epoch 6/20 Iteration 956/3560 Training loss: 1.7367 0.0712 sec/batch\n", + "Epoch 6/20 Iteration 957/3560 Training loss: 1.7368 0.0656 sec/batch\n", + "Epoch 6/20 Iteration 958/3560 Training loss: 1.7364 0.0747 sec/batch\n", + "Epoch 6/20 Iteration 959/3560 Training loss: 1.7362 0.0655 sec/batch\n", + "Epoch 6/20 Iteration 960/3560 Training loss: 1.7360 0.0663 sec/batch\n", + "Epoch 6/20 Iteration 961/3560 Training loss: 1.7362 0.0690 sec/batch\n", + "Epoch 6/20 Iteration 962/3560 Training loss: 1.7362 0.0689 sec/batch\n", + "Epoch 6/20 Iteration 963/3560 Training loss: 1.7366 0.0701 sec/batch\n", + "Epoch 6/20 Iteration 964/3560 Training loss: 1.7361 0.0700 
sec/batch\n", + "Epoch 6/20 Iteration 965/3560 Training loss: 1.7359 0.0709 sec/batch\n", + "Epoch 6/20 Iteration 966/3560 Training loss: 1.7360 0.0654 sec/batch\n", + "Epoch 6/20 Iteration 967/3560 Training loss: 1.7356 0.0648 sec/batch\n", + "Epoch 6/20 Iteration 968/3560 Training loss: 1.7356 0.0776 sec/batch\n", + "Epoch 6/20 Iteration 969/3560 Training loss: 1.7350 0.0697 sec/batch\n", + "Epoch 6/20 Iteration 970/3560 Training loss: 1.7347 0.0704 sec/batch\n", + "Epoch 6/20 Iteration 971/3560 Training loss: 1.7342 0.0651 sec/batch\n", + "Epoch 6/20 Iteration 972/3560 Training loss: 1.7342 0.0675 sec/batch\n", + "Epoch 6/20 Iteration 973/3560 Training loss: 1.7336 0.0693 sec/batch\n", + "Epoch 6/20 Iteration 974/3560 Training loss: 1.7334 0.0681 sec/batch\n", + "Epoch 6/20 Iteration 975/3560 Training loss: 1.7328 0.0671 sec/batch\n", + "Epoch 6/20 Iteration 976/3560 Training loss: 1.7324 0.0682 sec/batch\n", + "Epoch 6/20 Iteration 977/3560 Training loss: 1.7321 0.0724 sec/batch\n", + "Epoch 6/20 Iteration 978/3560 Training loss: 1.7316 0.0726 sec/batch\n", + "Epoch 6/20 Iteration 979/3560 Training loss: 1.7310 0.0685 sec/batch\n", + "Epoch 6/20 Iteration 980/3560 Training loss: 1.7309 0.0697 sec/batch\n", + "Epoch 6/20 Iteration 981/3560 Training loss: 1.7304 0.0680 sec/batch\n", + "Epoch 6/20 Iteration 982/3560 Training loss: 1.7300 0.0659 sec/batch\n", + "Epoch 6/20 Iteration 983/3560 Training loss: 1.7293 0.0652 sec/batch\n", + "Epoch 6/20 Iteration 984/3560 Training loss: 1.7289 0.0772 sec/batch\n", + "Epoch 6/20 Iteration 985/3560 Training loss: 1.7284 0.0673 sec/batch\n", + "Epoch 6/20 Iteration 986/3560 Training loss: 1.7282 0.0696 sec/batch\n", + "Epoch 6/20 Iteration 987/3560 Training loss: 1.7280 0.0780 sec/batch\n", + "Epoch 6/20 Iteration 988/3560 Training loss: 1.7275 0.0652 sec/batch\n", + "Epoch 6/20 Iteration 989/3560 Training loss: 1.7269 0.0659 sec/batch\n", + "Epoch 6/20 Iteration 990/3560 Training loss: 1.7264 0.0659 sec/batch\n", + "Epoch 
6/20 Iteration 991/3560 Training loss: 1.7261 0.0681 sec/batch\n", + "Epoch 6/20 Iteration 992/3560 Training loss: 1.7259 0.0735 sec/batch\n", + "Epoch 6/20 Iteration 993/3560 Training loss: 1.7255 0.0650 sec/batch\n", + "Epoch 6/20 Iteration 994/3560 Training loss: 1.7252 0.0637 sec/batch\n", + "Epoch 6/20 Iteration 995/3560 Training loss: 1.7248 0.0716 sec/batch\n", + "Epoch 6/20 Iteration 996/3560 Training loss: 1.7246 0.0693 sec/batch\n", + "Epoch 6/20 Iteration 997/3560 Training loss: 1.7242 0.0697 sec/batch\n", + "Epoch 6/20 Iteration 998/3560 Training loss: 1.7240 0.0712 sec/batch\n", + "Epoch 6/20 Iteration 999/3560 Training loss: 1.7238 0.0656 sec/batch\n", + "Epoch 6/20 Iteration 1000/3560 Training loss: 1.7237 0.0669 sec/batch\n", + "Epoch 6/20 Iteration 1001/3560 Training loss: 1.7236 0.0684 sec/batch\n", + "Epoch 6/20 Iteration 1002/3560 Training loss: 1.7233 0.0648 sec/batch\n", + "Epoch 6/20 Iteration 1003/3560 Training loss: 1.7230 0.0680 sec/batch\n", + "Epoch 6/20 Iteration 1004/3560 Training loss: 1.7227 0.0664 sec/batch\n", + "Epoch 6/20 Iteration 1005/3560 Training loss: 1.7224 0.0658 sec/batch\n", + "Epoch 6/20 Iteration 1006/3560 Training loss: 1.7219 0.0718 sec/batch\n", + "Epoch 6/20 Iteration 1007/3560 Training loss: 1.7216 0.0760 sec/batch\n", + "Epoch 6/20 Iteration 1008/3560 Training loss: 1.7215 0.0664 sec/batch\n", + "Epoch 6/20 Iteration 1009/3560 Training loss: 1.7213 0.0672 sec/batch\n", + "Epoch 6/20 Iteration 1010/3560 Training loss: 1.7210 0.0740 sec/batch\n", + "Epoch 6/20 Iteration 1011/3560 Training loss: 1.7209 0.0698 sec/batch\n", + "Epoch 6/20 Iteration 1012/3560 Training loss: 1.7204 0.0767 sec/batch\n", + "Epoch 6/20 Iteration 1013/3560 Training loss: 1.7199 0.0727 sec/batch\n", + "Epoch 6/20 Iteration 1014/3560 Training loss: 1.7199 0.0737 sec/batch\n", + "Epoch 6/20 Iteration 1015/3560 Training loss: 1.7196 0.0685 sec/batch\n", + "Epoch 6/20 Iteration 1016/3560 Training loss: 1.7191 0.0711 sec/batch\n", + "Epoch 6/20 
Iteration 1017/3560 Training loss: 1.7192 0.0668 sec/batch\n", + "Epoch 6/20 Iteration 1018/3560 Training loss: 1.7190 0.0684 sec/batch\n", + "Epoch 6/20 Iteration 1019/3560 Training loss: 1.7187 0.0682 sec/batch\n", + "Epoch 6/20 Iteration 1020/3560 Training loss: 1.7184 0.0652 sec/batch\n", + "Epoch 6/20 Iteration 1021/3560 Training loss: 1.7180 0.0676 sec/batch\n", + "Epoch 6/20 Iteration 1022/3560 Training loss: 1.7177 0.0692 sec/batch\n", + "Epoch 6/20 Iteration 1023/3560 Training loss: 1.7176 0.0732 sec/batch\n", + "Epoch 6/20 Iteration 1024/3560 Training loss: 1.7175 0.0730 sec/batch\n", + "Epoch 6/20 Iteration 1025/3560 Training loss: 1.7173 0.0667 sec/batch\n", + "Epoch 6/20 Iteration 1026/3560 Training loss: 1.7172 0.0659 sec/batch\n", + "Epoch 6/20 Iteration 1027/3560 Training loss: 1.7172 0.0691 sec/batch\n", + "Epoch 6/20 Iteration 1028/3560 Training loss: 1.7171 0.0676 sec/batch\n", + "Epoch 6/20 Iteration 1029/3560 Training loss: 1.7170 0.0648 sec/batch\n", + "Epoch 6/20 Iteration 1030/3560 Training loss: 1.7168 0.0681 sec/batch\n", + "Epoch 6/20 Iteration 1031/3560 Training loss: 1.7170 0.0650 sec/batch\n", + "Epoch 6/20 Iteration 1032/3560 Training loss: 1.7167 0.0713 sec/batch\n", + "Epoch 6/20 Iteration 1033/3560 Training loss: 1.7165 0.0689 sec/batch\n", + "Epoch 6/20 Iteration 1034/3560 Training loss: 1.7163 0.0719 sec/batch\n", + "Epoch 6/20 Iteration 1035/3560 Training loss: 1.7160 0.0664 sec/batch\n", + "Epoch 6/20 Iteration 1036/3560 Training loss: 1.7160 0.0661 sec/batch\n", + "Epoch 6/20 Iteration 1037/3560 Training loss: 1.7159 0.0662 sec/batch\n", + "Epoch 6/20 Iteration 1038/3560 Training loss: 1.7160 0.0750 sec/batch\n", + "Epoch 6/20 Iteration 1039/3560 Training loss: 1.7159 0.0725 sec/batch\n", + "Epoch 6/20 Iteration 1040/3560 Training loss: 1.7157 0.0664 sec/batch\n", + "Epoch 6/20 Iteration 1041/3560 Training loss: 1.7153 0.0649 sec/batch\n", + "Epoch 6/20 Iteration 1042/3560 Training loss: 1.7152 0.0717 sec/batch\n", + "Epoch 
6/20 Iteration 1043/3560 Training loss: 1.7151 0.0774 sec/batch\n", + "Epoch 6/20 Iteration 1044/3560 Training loss: 1.7150 0.0687 sec/batch\n", + "Epoch 6/20 Iteration 1045/3560 Training loss: 1.7149 0.0676 sec/batch\n", + "Epoch 6/20 Iteration 1046/3560 Training loss: 1.7147 0.0676 sec/batch\n", + "Epoch 6/20 Iteration 1047/3560 Training loss: 1.7146 0.0689 sec/batch\n", + "Epoch 6/20 Iteration 1048/3560 Training loss: 1.7144 0.0680 sec/batch\n", + "Epoch 6/20 Iteration 1049/3560 Training loss: 1.7140 0.0678 sec/batch\n", + "Epoch 6/20 Iteration 1050/3560 Training loss: 1.7140 0.0665 sec/batch\n", + "Epoch 6/20 Iteration 1051/3560 Training loss: 1.7140 0.0675 sec/batch\n", + "Epoch 6/20 Iteration 1052/3560 Training loss: 1.7139 0.0654 sec/batch\n", + "Epoch 6/20 Iteration 1053/3560 Training loss: 1.7138 0.0676 sec/batch\n", + "Epoch 6/20 Iteration 1054/3560 Training loss: 1.7137 0.0687 sec/batch\n", + "Epoch 6/20 Iteration 1055/3560 Training loss: 1.7136 0.0667 sec/batch\n", + "Epoch 6/20 Iteration 1056/3560 Training loss: 1.7134 0.0669 sec/batch\n", + "Epoch 6/20 Iteration 1057/3560 Training loss: 1.7134 0.0700 sec/batch\n", + "Epoch 6/20 Iteration 1058/3560 Training loss: 1.7135 0.0757 sec/batch\n", + "Epoch 6/20 Iteration 1059/3560 Training loss: 1.7133 0.0733 sec/batch\n", + "Epoch 6/20 Iteration 1060/3560 Training loss: 1.7131 0.0659 sec/batch\n", + "Epoch 6/20 Iteration 1061/3560 Training loss: 1.7128 0.0674 sec/batch\n", + "Epoch 6/20 Iteration 1062/3560 Training loss: 1.7125 0.0741 sec/batch\n", + "Epoch 6/20 Iteration 1063/3560 Training loss: 1.7125 0.0725 sec/batch\n", + "Epoch 6/20 Iteration 1064/3560 Training loss: 1.7124 0.0701 sec/batch\n", + "Epoch 6/20 Iteration 1065/3560 Training loss: 1.7123 0.0705 sec/batch\n", + "Epoch 6/20 Iteration 1066/3560 Training loss: 1.7121 0.0654 sec/batch\n", + "Epoch 6/20 Iteration 1067/3560 Training loss: 1.7118 0.0664 sec/batch\n", + "Epoch 6/20 Iteration 1068/3560 Training loss: 1.7118 0.0682 sec/batch\n", + 
"Epoch 7/20 Iteration 1069/3560 Training loss: 1.7637 0.0685 sec/batch\n", + "Epoch 7/20 Iteration 1070/3560 Training loss: 1.7241 0.0715 sec/batch\n", + "Epoch 7/20 Iteration 1071/3560 Training loss: 1.7092 0.0745 sec/batch\n", + "Epoch 7/20 Iteration 1072/3560 Training loss: 1.7006 0.0686 sec/batch\n", + "Epoch 7/20 Iteration 1073/3560 Training loss: 1.6939 0.0660 sec/batch\n", + "Epoch 7/20 Iteration 1074/3560 Training loss: 1.6846 0.0704 sec/batch\n", + "Epoch 7/20 Iteration 1075/3560 Training loss: 1.6856 0.0716 sec/batch\n", + "Epoch 7/20 Iteration 1076/3560 Training loss: 1.6830 0.0677 sec/batch\n", + "Epoch 7/20 Iteration 1077/3560 Training loss: 1.6848 0.0694 sec/batch\n", + "Epoch 7/20 Iteration 1078/3560 Training loss: 1.6849 0.0730 sec/batch\n", + "Epoch 7/20 Iteration 1079/3560 Training loss: 1.6814 0.0657 sec/batch\n", + "Epoch 7/20 Iteration 1080/3560 Training loss: 1.6800 0.0806 sec/batch\n", + "Epoch 7/20 Iteration 1081/3560 Training loss: 1.6800 0.0717 sec/batch\n", + "Epoch 7/20 Iteration 1082/3560 Training loss: 1.6818 0.0645 sec/batch\n", + "Epoch 7/20 Iteration 1083/3560 Training loss: 1.6806 0.0664 sec/batch\n", + "Epoch 7/20 Iteration 1084/3560 Training loss: 1.6785 0.0733 sec/batch\n", + "Epoch 7/20 Iteration 1085/3560 Training loss: 1.6784 0.0714 sec/batch\n", + "Epoch 7/20 Iteration 1086/3560 Training loss: 1.6796 0.0663 sec/batch\n", + "Epoch 7/20 Iteration 1087/3560 Training loss: 1.6799 0.0713 sec/batch\n", + "Epoch 7/20 Iteration 1088/3560 Training loss: 1.6803 0.0676 sec/batch\n", + "Epoch 7/20 Iteration 1089/3560 Training loss: 1.6794 0.0667 sec/batch\n", + "Epoch 7/20 Iteration 1090/3560 Training loss: 1.6803 0.0668 sec/batch\n", + "Epoch 7/20 Iteration 1091/3560 Training loss: 1.6798 0.0668 sec/batch\n", + "Epoch 7/20 Iteration 1092/3560 Training loss: 1.6797 0.0727 sec/batch\n", + "Epoch 7/20 Iteration 1093/3560 Training loss: 1.6791 0.0691 sec/batch\n", + "Epoch 7/20 Iteration 1094/3560 Training loss: 1.6779 0.0656 sec/batch\n", 
+ "Epoch 7/20 Iteration 1095/3560 Training loss: 1.6764 0.0713 sec/batch\n", + "Epoch 7/20 Iteration 1096/3560 Training loss: 1.6764 0.0700 sec/batch\n", + "Epoch 7/20 Iteration 1097/3560 Training loss: 1.6774 0.0682 sec/batch\n", + "Epoch 7/20 Iteration 1098/3560 Training loss: 1.6776 0.0692 sec/batch\n", + "Epoch 7/20 Iteration 1099/3560 Training loss: 1.6771 0.0748 sec/batch\n", + "Epoch 7/20 Iteration 1100/3560 Training loss: 1.6761 0.0696 sec/batch\n", + "Epoch 7/20 Iteration 1101/3560 Training loss: 1.6763 0.0720 sec/batch\n", + "Epoch 7/20 Iteration 1102/3560 Training loss: 1.6769 0.0660 sec/batch\n", + "Epoch 7/20 Iteration 1103/3560 Training loss: 1.6763 0.0694 sec/batch\n", + "Epoch 7/20 Iteration 1104/3560 Training loss: 1.6759 0.0713 sec/batch\n", + "Epoch 7/20 Iteration 1105/3560 Training loss: 1.6751 0.0706 sec/batch\n", + "Epoch 7/20 Iteration 1106/3560 Training loss: 1.6739 0.0689 sec/batch\n", + "Epoch 7/20 Iteration 1107/3560 Training loss: 1.6725 0.0653 sec/batch\n", + "Epoch 7/20 Iteration 1108/3560 Training loss: 1.6718 0.0654 sec/batch\n", + "Epoch 7/20 Iteration 1109/3560 Training loss: 1.6715 0.0666 sec/batch\n", + "Epoch 7/20 Iteration 1110/3560 Training loss: 1.6717 0.0721 sec/batch\n", + "Epoch 7/20 Iteration 1111/3560 Training loss: 1.6713 0.0703 sec/batch\n", + "Epoch 7/20 Iteration 1112/3560 Training loss: 1.6704 0.0701 sec/batch\n", + "Epoch 7/20 Iteration 1113/3560 Training loss: 1.6706 0.0636 sec/batch\n", + "Epoch 7/20 Iteration 1114/3560 Training loss: 1.6698 0.0688 sec/batch\n", + "Epoch 7/20 Iteration 1115/3560 Training loss: 1.6697 0.0711 sec/batch\n", + "Epoch 7/20 Iteration 1116/3560 Training loss: 1.6690 0.0681 sec/batch\n", + "Epoch 7/20 Iteration 1117/3560 Training loss: 1.6688 0.0832 sec/batch\n", + "Epoch 7/20 Iteration 1118/3560 Training loss: 1.6695 0.0672 sec/batch\n", + "Epoch 7/20 Iteration 1119/3560 Training loss: 1.6692 0.0692 sec/batch\n", + "Epoch 7/20 Iteration 1120/3560 Training loss: 1.6699 0.0684 
sec/batch\n", + "Epoch 7/20 Iteration 1121/3560 Training loss: 1.6697 0.0698 sec/batch\n", + "Epoch 7/20 Iteration 1122/3560 Training loss: 1.6696 0.0656 sec/batch\n", + "Epoch 7/20 Iteration 1123/3560 Training loss: 1.6691 0.0668 sec/batch\n", + "Epoch 7/20 Iteration 1124/3560 Training loss: 1.6691 0.0676 sec/batch\n", + "Epoch 7/20 Iteration 1125/3560 Training loss: 1.6693 0.0683 sec/batch\n", + "Epoch 7/20 Iteration 1126/3560 Training loss: 1.6689 0.0728 sec/batch\n", + "Epoch 7/20 Iteration 1127/3560 Training loss: 1.6683 0.0663 sec/batch\n", + "Epoch 7/20 Iteration 1128/3560 Training loss: 1.6688 0.0672 sec/batch\n", + "Epoch 7/20 Iteration 1129/3560 Training loss: 1.6687 0.0705 sec/batch\n", + "Epoch 7/20 Iteration 1130/3560 Training loss: 1.6695 0.0677 sec/batch\n", + "Epoch 7/20 Iteration 1131/3560 Training loss: 1.6697 0.0694 sec/batch\n", + "Epoch 7/20 Iteration 1132/3560 Training loss: 1.6699 0.0658 sec/batch\n", + "Epoch 7/20 Iteration 1133/3560 Training loss: 1.6698 0.0700 sec/batch\n", + "Epoch 7/20 Iteration 1134/3560 Training loss: 1.6701 0.0700 sec/batch\n", + "Epoch 7/20 Iteration 1135/3560 Training loss: 1.6700 0.0688 sec/batch\n", + "Epoch 7/20 Iteration 1136/3560 Training loss: 1.6697 0.0710 sec/batch\n", + "Epoch 7/20 Iteration 1137/3560 Training loss: 1.6694 0.0686 sec/batch\n", + "Epoch 7/20 Iteration 1138/3560 Training loss: 1.6693 0.0698 sec/batch\n", + "Epoch 7/20 Iteration 1139/3560 Training loss: 1.6697 0.0740 sec/batch\n", + "Epoch 7/20 Iteration 1140/3560 Training loss: 1.6697 0.0637 sec/batch\n", + "Epoch 7/20 Iteration 1141/3560 Training loss: 1.6700 0.0675 sec/batch\n", + "Epoch 7/20 Iteration 1142/3560 Training loss: 1.6696 0.0653 sec/batch\n", + "Epoch 7/20 Iteration 1143/3560 Training loss: 1.6694 0.0636 sec/batch\n", + "Epoch 7/20 Iteration 1144/3560 Training loss: 1.6695 0.0730 sec/batch\n", + "Epoch 7/20 Iteration 1145/3560 Training loss: 1.6693 0.0727 sec/batch\n", + "Epoch 7/20 Iteration 1146/3560 Training loss: 1.6692 
0.0692 sec/batch\n", + "Epoch 7/20 Iteration 1147/3560 Training loss: 1.6686 0.0717 sec/batch\n", + "Epoch 7/20 Iteration 1148/3560 Training loss: 1.6683 0.0686 sec/batch\n", + "Epoch 7/20 Iteration 1149/3560 Training loss: 1.6678 0.0700 sec/batch\n", + "Epoch 7/20 Iteration 1150/3560 Training loss: 1.6679 0.0721 sec/batch\n", + "Epoch 7/20 Iteration 1151/3560 Training loss: 1.6672 0.0647 sec/batch\n", + "Epoch 7/20 Iteration 1152/3560 Training loss: 1.6671 0.0691 sec/batch\n", + "Epoch 7/20 Iteration 1153/3560 Training loss: 1.6666 0.0675 sec/batch\n", + "Epoch 7/20 Iteration 1154/3560 Training loss: 1.6663 0.0666 sec/batch\n", + "Epoch 7/20 Iteration 1155/3560 Training loss: 1.6660 0.0807 sec/batch\n", + "Epoch 7/20 Iteration 1156/3560 Training loss: 1.6655 0.0669 sec/batch\n", + "Epoch 7/20 Iteration 1157/3560 Training loss: 1.6649 0.0726 sec/batch\n", + "Epoch 7/20 Iteration 1158/3560 Training loss: 1.6649 0.0668 sec/batch\n", + "Epoch 7/20 Iteration 1159/3560 Training loss: 1.6645 0.0671 sec/batch\n", + "Epoch 7/20 Iteration 1160/3560 Training loss: 1.6642 0.0679 sec/batch\n", + "Epoch 7/20 Iteration 1161/3560 Training loss: 1.6636 0.0674 sec/batch\n", + "Epoch 7/20 Iteration 1162/3560 Training loss: 1.6633 0.0689 sec/batch\n", + "Epoch 7/20 Iteration 1163/3560 Training loss: 1.6629 0.0630 sec/batch\n", + "Epoch 7/20 Iteration 1164/3560 Training loss: 1.6628 0.0668 sec/batch\n", + "Epoch 7/20 Iteration 1165/3560 Training loss: 1.6626 0.0683 sec/batch\n", + "Epoch 7/20 Iteration 1166/3560 Training loss: 1.6621 0.0669 sec/batch\n", + "Epoch 7/20 Iteration 1167/3560 Training loss: 1.6617 0.0653 sec/batch\n", + "Epoch 7/20 Iteration 1168/3560 Training loss: 1.6612 0.0731 sec/batch\n", + "Epoch 7/20 Iteration 1169/3560 Training loss: 1.6610 0.0693 sec/batch\n", + "Epoch 7/20 Iteration 1170/3560 Training loss: 1.6609 0.0658 sec/batch\n", + "Epoch 7/20 Iteration 1171/3560 Training loss: 1.6606 0.0716 sec/batch\n", + "Epoch 7/20 Iteration 1172/3560 Training loss: 
1.6602 0.0686 sec/batch\n", + "Epoch 7/20 Iteration 1173/3560 Training loss: 1.6599 0.0656 sec/batch\n", + "Epoch 7/20 Iteration 1174/3560 Training loss: 1.6597 0.0820 sec/batch\n", + "Epoch 7/20 Iteration 1175/3560 Training loss: 1.6595 0.0657 sec/batch\n", + "Epoch 7/20 Iteration 1176/3560 Training loss: 1.6593 0.0692 sec/batch\n", + "Epoch 7/20 Iteration 1177/3560 Training loss: 1.6592 0.0662 sec/batch\n", + "Epoch 7/20 Iteration 1178/3560 Training loss: 1.6593 0.0673 sec/batch\n", + "Epoch 7/20 Iteration 1179/3560 Training loss: 1.6589 0.0672 sec/batch\n", + "Epoch 7/20 Iteration 1180/3560 Training loss: 1.6587 0.0690 sec/batch\n", + "Epoch 7/20 Iteration 1181/3560 Training loss: 1.6584 0.0650 sec/batch\n", + "Epoch 7/20 Iteration 1182/3560 Training loss: 1.6581 0.0648 sec/batch\n", + "Epoch 7/20 Iteration 1183/3560 Training loss: 1.6578 0.0691 sec/batch\n", + "Epoch 7/20 Iteration 1184/3560 Training loss: 1.6574 0.0651 sec/batch\n", + "Epoch 7/20 Iteration 1185/3560 Training loss: 1.6571 0.0676 sec/batch\n", + "Epoch 7/20 Iteration 1186/3560 Training loss: 1.6569 0.0647 sec/batch\n", + "Epoch 7/20 Iteration 1187/3560 Training loss: 1.6567 0.0749 sec/batch\n", + "Epoch 7/20 Iteration 1188/3560 Training loss: 1.6566 0.0662 sec/batch\n", + "Epoch 7/20 Iteration 1189/3560 Training loss: 1.6565 0.0678 sec/batch\n", + "Epoch 7/20 Iteration 1190/3560 Training loss: 1.6560 0.0654 sec/batch\n", + "Epoch 7/20 Iteration 1191/3560 Training loss: 1.6556 0.0695 sec/batch\n", + "Epoch 7/20 Iteration 1192/3560 Training loss: 1.6556 0.0726 sec/batch\n", + "Epoch 7/20 Iteration 1193/3560 Training loss: 1.6555 0.0729 sec/batch\n", + "Epoch 7/20 Iteration 1194/3560 Training loss: 1.6550 0.0719 sec/batch\n", + "Epoch 7/20 Iteration 1195/3560 Training loss: 1.6551 0.0684 sec/batch\n", + "Epoch 7/20 Iteration 1196/3560 Training loss: 1.6549 0.0741 sec/batch\n", + "Epoch 7/20 Iteration 1197/3560 Training loss: 1.6547 0.0685 sec/batch\n", + "Epoch 7/20 Iteration 1198/3560 Training 
loss: 1.6545 0.0688 sec/batch\n", + "Epoch 7/20 Iteration 1199/3560 Training loss: 1.6540 0.0651 sec/batch\n", + "Epoch 7/20 Iteration 1200/3560 Training loss: 1.6536 0.0706 sec/batch\n", + "Epoch 7/20 Iteration 1201/3560 Training loss: 1.6535 0.0670 sec/batch\n", + "Epoch 7/20 Iteration 1202/3560 Training loss: 1.6535 0.0674 sec/batch\n", + "Epoch 7/20 Iteration 1203/3560 Training loss: 1.6534 0.0638 sec/batch\n", + "Epoch 7/20 Iteration 1204/3560 Training loss: 1.6533 0.0660 sec/batch\n", + "Epoch 7/20 Iteration 1205/3560 Training loss: 1.6534 0.0658 sec/batch\n", + "Epoch 7/20 Iteration 1206/3560 Training loss: 1.6534 0.0714 sec/batch\n", + "Epoch 7/20 Iteration 1207/3560 Training loss: 1.6533 0.0715 sec/batch\n", + "Epoch 7/20 Iteration 1208/3560 Training loss: 1.6531 0.0678 sec/batch\n", + "Epoch 7/20 Iteration 1209/3560 Training loss: 1.6534 0.0679 sec/batch\n", + "Epoch 7/20 Iteration 1210/3560 Training loss: 1.6532 0.0714 sec/batch\n", + "Epoch 7/20 Iteration 1211/3560 Training loss: 1.6530 0.0659 sec/batch\n", + "Epoch 7/20 Iteration 1212/3560 Training loss: 1.6530 0.0670 sec/batch\n", + "Epoch 7/20 Iteration 1213/3560 Training loss: 1.6528 0.0754 sec/batch\n", + "Epoch 7/20 Iteration 1214/3560 Training loss: 1.6528 0.0658 sec/batch\n", + "Epoch 7/20 Iteration 1215/3560 Training loss: 1.6528 0.0751 sec/batch\n", + "Epoch 7/20 Iteration 1216/3560 Training loss: 1.6529 0.0668 sec/batch\n", + "Epoch 7/20 Iteration 1217/3560 Training loss: 1.6528 0.0723 sec/batch\n", + "Epoch 7/20 Iteration 1218/3560 Training loss: 1.6526 0.0699 sec/batch\n", + "Epoch 7/20 Iteration 1219/3560 Training loss: 1.6522 0.0703 sec/batch\n", + "Epoch 7/20 Iteration 1220/3560 Training loss: 1.6521 0.0654 sec/batch\n", + "Epoch 7/20 Iteration 1221/3560 Training loss: 1.6520 0.0801 sec/batch\n", + "Epoch 7/20 Iteration 1222/3560 Training loss: 1.6519 0.0701 sec/batch\n", + "Epoch 7/20 Iteration 1223/3560 Training loss: 1.6518 0.0659 sec/batch\n", + "Epoch 7/20 Iteration 1224/3560 
Training loss: 1.6516 0.0684 sec/batch\n", + "Epoch 7/20 Iteration 1225/3560 Training loss: 1.6516 0.0740 sec/batch\n", + "Epoch 7/20 Iteration 1226/3560 Training loss: 1.6514 0.0653 sec/batch\n", + "Epoch 7/20 Iteration 1227/3560 Training loss: 1.6511 0.0669 sec/batch\n", + "Epoch 7/20 Iteration 1228/3560 Training loss: 1.6512 0.0660 sec/batch\n", + "Epoch 7/20 Iteration 1229/3560 Training loss: 1.6513 0.0673 sec/batch\n", + "Epoch 7/20 Iteration 1230/3560 Training loss: 1.6512 0.0719 sec/batch\n", + "Epoch 7/20 Iteration 1231/3560 Training loss: 1.6512 0.0682 sec/batch\n", + "Epoch 7/20 Iteration 1232/3560 Training loss: 1.6511 0.0727 sec/batch\n", + "Epoch 7/20 Iteration 1233/3560 Training loss: 1.6510 0.0670 sec/batch\n", + "Epoch 7/20 Iteration 1234/3560 Training loss: 1.6508 0.0653 sec/batch\n", + "Epoch 7/20 Iteration 1235/3560 Training loss: 1.6508 0.0670 sec/batch\n", + "Epoch 7/20 Iteration 1236/3560 Training loss: 1.6511 0.0663 sec/batch\n", + "Epoch 7/20 Iteration 1237/3560 Training loss: 1.6509 0.0684 sec/batch\n", + "Epoch 7/20 Iteration 1238/3560 Training loss: 1.6508 0.0690 sec/batch\n", + "Epoch 7/20 Iteration 1239/3560 Training loss: 1.6506 0.0684 sec/batch\n", + "Epoch 7/20 Iteration 1240/3560 Training loss: 1.6504 0.0704 sec/batch\n", + "Epoch 7/20 Iteration 1241/3560 Training loss: 1.6504 0.0708 sec/batch\n", + "Epoch 7/20 Iteration 1242/3560 Training loss: 1.6504 0.0678 sec/batch\n", + "Epoch 7/20 Iteration 1243/3560 Training loss: 1.6504 0.0661 sec/batch\n", + "Epoch 7/20 Iteration 1244/3560 Training loss: 1.6502 0.0649 sec/batch\n", + "Epoch 7/20 Iteration 1245/3560 Training loss: 1.6499 0.0704 sec/batch\n", + "Epoch 7/20 Iteration 1246/3560 Training loss: 1.6499 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1247/3560 Training loss: 1.7125 0.0676 sec/batch\n", + "Epoch 8/20 Iteration 1248/3560 Training loss: 1.6677 0.0664 sec/batch\n", + "Epoch 8/20 Iteration 1249/3560 Training loss: 1.6546 0.0666 sec/batch\n", + "Epoch 8/20 Iteration 
1250/3560 Training loss: 1.6508 0.0685 sec/batch\n", + "Epoch 8/20 Iteration 1251/3560 Training loss: 1.6435 0.0729 sec/batch\n", + "Epoch 8/20 Iteration 1252/3560 Training loss: 1.6336 0.0659 sec/batch\n", + "Epoch 8/20 Iteration 1253/3560 Training loss: 1.6338 0.0689 sec/batch\n", + "Epoch 8/20 Iteration 1254/3560 Training loss: 1.6323 0.0685 sec/batch\n", + "Epoch 8/20 Iteration 1255/3560 Training loss: 1.6334 0.0687 sec/batch\n", + "Epoch 8/20 Iteration 1256/3560 Training loss: 1.6327 0.0645 sec/batch\n", + "Epoch 8/20 Iteration 1257/3560 Training loss: 1.6291 0.0695 sec/batch\n", + "Epoch 8/20 Iteration 1258/3560 Training loss: 1.6276 0.0718 sec/batch\n", + "Epoch 8/20 Iteration 1259/3560 Training loss: 1.6270 0.0702 sec/batch\n", + "Epoch 8/20 Iteration 1260/3560 Training loss: 1.6301 0.0782 sec/batch\n", + "Epoch 8/20 Iteration 1261/3560 Training loss: 1.6293 0.0737 sec/batch\n", + "Epoch 8/20 Iteration 1262/3560 Training loss: 1.6276 0.0680 sec/batch\n", + "Epoch 8/20 Iteration 1263/3560 Training loss: 1.6277 0.0664 sec/batch\n", + "Epoch 8/20 Iteration 1264/3560 Training loss: 1.6294 0.0672 sec/batch\n", + "Epoch 8/20 Iteration 1265/3560 Training loss: 1.6297 0.0749 sec/batch\n", + "Epoch 8/20 Iteration 1266/3560 Training loss: 1.6307 0.0732 sec/batch\n", + "Epoch 8/20 Iteration 1267/3560 Training loss: 1.6294 0.0691 sec/batch\n", + "Epoch 8/20 Iteration 1268/3560 Training loss: 1.6298 0.0675 sec/batch\n", + "Epoch 8/20 Iteration 1269/3560 Training loss: 1.6289 0.0744 sec/batch\n", + "Epoch 8/20 Iteration 1270/3560 Training loss: 1.6283 0.0719 sec/batch\n", + "Epoch 8/20 Iteration 1271/3560 Training loss: 1.6277 0.0750 sec/batch\n", + "Epoch 8/20 Iteration 1272/3560 Training loss: 1.6266 0.0716 sec/batch\n", + "Epoch 8/20 Iteration 1273/3560 Training loss: 1.6253 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1274/3560 Training loss: 1.6255 0.0674 sec/batch\n", + "Epoch 8/20 Iteration 1275/3560 Training loss: 1.6262 0.0698 sec/batch\n", + "Epoch 8/20 
Iteration 1276/3560 Training loss: 1.6264 0.0718 sec/batch\n", + "Epoch 8/20 Iteration 1277/3560 Training loss: 1.6261 0.0734 sec/batch\n", + "Epoch 8/20 Iteration 1278/3560 Training loss: 1.6254 0.0662 sec/batch\n", + "Epoch 8/20 Iteration 1279/3560 Training loss: 1.6257 0.0683 sec/batch\n", + "Epoch 8/20 Iteration 1280/3560 Training loss: 1.6258 0.0635 sec/batch\n", + "Epoch 8/20 Iteration 1281/3560 Training loss: 1.6255 0.0684 sec/batch\n", + "Epoch 8/20 Iteration 1282/3560 Training loss: 1.6250 0.0648 sec/batch\n", + "Epoch 8/20 Iteration 1283/3560 Training loss: 1.6244 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1284/3560 Training loss: 1.6231 0.0730 sec/batch\n", + "Epoch 8/20 Iteration 1285/3560 Training loss: 1.6216 0.0707 sec/batch\n", + "Epoch 8/20 Iteration 1286/3560 Training loss: 1.6209 0.0677 sec/batch\n", + "Epoch 8/20 Iteration 1287/3560 Training loss: 1.6206 0.0687 sec/batch\n", + "Epoch 8/20 Iteration 1288/3560 Training loss: 1.6210 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1289/3560 Training loss: 1.6201 0.0684 sec/batch\n", + "Epoch 8/20 Iteration 1290/3560 Training loss: 1.6191 0.0683 sec/batch\n", + "Epoch 8/20 Iteration 1291/3560 Training loss: 1.6194 0.0728 sec/batch\n", + "Epoch 8/20 Iteration 1292/3560 Training loss: 1.6186 0.0707 sec/batch\n", + "Epoch 8/20 Iteration 1293/3560 Training loss: 1.6181 0.0672 sec/batch\n", + "Epoch 8/20 Iteration 1294/3560 Training loss: 1.6176 0.0679 sec/batch\n", + "Epoch 8/20 Iteration 1295/3560 Training loss: 1.6173 0.0671 sec/batch\n", + "Epoch 8/20 Iteration 1296/3560 Training loss: 1.6177 0.0665 sec/batch\n", + "Epoch 8/20 Iteration 1297/3560 Training loss: 1.6171 0.0647 sec/batch\n", + "Epoch 8/20 Iteration 1298/3560 Training loss: 1.6179 0.0695 sec/batch\n", + "Epoch 8/20 Iteration 1299/3560 Training loss: 1.6175 0.0695 sec/batch\n", + "Epoch 8/20 Iteration 1300/3560 Training loss: 1.6177 0.0673 sec/batch\n", + "Epoch 8/20 Iteration 1301/3560 Training loss: 1.6171 0.0658 sec/batch\n", + "Epoch 
8/20 Iteration 1302/3560 Training loss: 1.6172 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1303/3560 Training loss: 1.6175 0.0644 sec/batch\n", + "Epoch 8/20 Iteration 1304/3560 Training loss: 1.6172 0.0767 sec/batch\n", + "Epoch 8/20 Iteration 1305/3560 Training loss: 1.6168 0.0815 sec/batch\n", + "Epoch 8/20 Iteration 1306/3560 Training loss: 1.6171 0.0730 sec/batch\n", + "Epoch 8/20 Iteration 1307/3560 Training loss: 1.6170 0.0790 sec/batch\n", + "Epoch 8/20 Iteration 1308/3560 Training loss: 1.6180 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1309/3560 Training loss: 1.6183 0.0656 sec/batch\n", + "Epoch 8/20 Iteration 1310/3560 Training loss: 1.6184 0.0705 sec/batch\n", + "Epoch 8/20 Iteration 1311/3560 Training loss: 1.6181 0.0710 sec/batch\n", + "Epoch 8/20 Iteration 1312/3560 Training loss: 1.6182 0.0690 sec/batch\n", + "Epoch 8/20 Iteration 1313/3560 Training loss: 1.6183 0.0663 sec/batch\n", + "Epoch 8/20 Iteration 1314/3560 Training loss: 1.6178 0.0656 sec/batch\n", + "Epoch 8/20 Iteration 1315/3560 Training loss: 1.6176 0.0660 sec/batch\n", + "Epoch 8/20 Iteration 1316/3560 Training loss: 1.6175 0.0763 sec/batch\n", + "Epoch 8/20 Iteration 1317/3560 Training loss: 1.6178 0.0688 sec/batch\n", + "Epoch 8/20 Iteration 1318/3560 Training loss: 1.6182 0.0660 sec/batch\n", + "Epoch 8/20 Iteration 1319/3560 Training loss: 1.6186 0.0652 sec/batch\n", + "Epoch 8/20 Iteration 1320/3560 Training loss: 1.6184 0.0746 sec/batch\n", + "Epoch 8/20 Iteration 1321/3560 Training loss: 1.6182 0.0709 sec/batch\n", + "Epoch 8/20 Iteration 1322/3560 Training loss: 1.6183 0.0684 sec/batch\n", + "Epoch 8/20 Iteration 1323/3560 Training loss: 1.6181 0.0668 sec/batch\n", + "Epoch 8/20 Iteration 1324/3560 Training loss: 1.6181 0.0669 sec/batch\n", + "Epoch 8/20 Iteration 1325/3560 Training loss: 1.6176 0.0720 sec/batch\n", + "Epoch 8/20 Iteration 1326/3560 Training loss: 1.6173 0.0743 sec/batch\n", + "Epoch 8/20 Iteration 1327/3560 Training loss: 1.6167 0.0675 sec/batch\n", + 
"Epoch 8/20 Iteration 1328/3560 Training loss: 1.6167 0.0663 sec/batch\n", + "Epoch 8/20 Iteration 1329/3560 Training loss: 1.6161 0.0646 sec/batch\n", + "Epoch 8/20 Iteration 1330/3560 Training loss: 1.6161 0.0823 sec/batch\n", + "Epoch 8/20 Iteration 1331/3560 Training loss: 1.6156 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1332/3560 Training loss: 1.6153 0.0675 sec/batch\n", + "Epoch 8/20 Iteration 1333/3560 Training loss: 1.6151 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1334/3560 Training loss: 1.6147 0.0783 sec/batch\n", + "Epoch 8/20 Iteration 1335/3560 Training loss: 1.6141 0.0668 sec/batch\n", + "Epoch 8/20 Iteration 1336/3560 Training loss: 1.6141 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1337/3560 Training loss: 1.6137 0.0751 sec/batch\n", + "Epoch 8/20 Iteration 1338/3560 Training loss: 1.6135 0.0683 sec/batch\n", + "Epoch 8/20 Iteration 1339/3560 Training loss: 1.6130 0.0648 sec/batch\n", + "Epoch 8/20 Iteration 1340/3560 Training loss: 1.6126 0.0663 sec/batch\n", + "Epoch 8/20 Iteration 1341/3560 Training loss: 1.6122 0.0652 sec/batch\n", + "Epoch 8/20 Iteration 1342/3560 Training loss: 1.6121 0.0697 sec/batch\n", + "Epoch 8/20 Iteration 1343/3560 Training loss: 1.6120 0.0650 sec/batch\n", + "Epoch 8/20 Iteration 1344/3560 Training loss: 1.6115 0.0768 sec/batch\n", + "Epoch 8/20 Iteration 1345/3560 Training loss: 1.6111 0.0750 sec/batch\n", + "Epoch 8/20 Iteration 1346/3560 Training loss: 1.6106 0.0701 sec/batch\n", + "Epoch 8/20 Iteration 1347/3560 Training loss: 1.6105 0.0681 sec/batch\n", + "Epoch 8/20 Iteration 1348/3560 Training loss: 1.6102 0.0678 sec/batch\n", + "Epoch 8/20 Iteration 1349/3560 Training loss: 1.6100 0.0669 sec/batch\n", + "Epoch 8/20 Iteration 1350/3560 Training loss: 1.6096 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1351/3560 Training loss: 1.6093 0.0698 sec/batch\n", + "Epoch 8/20 Iteration 1352/3560 Training loss: 1.6092 0.0770 sec/batch\n", + "Epoch 8/20 Iteration 1353/3560 Training loss: 1.6090 0.0724 sec/batch\n", 
+ "Epoch 8/20 Iteration 1354/3560 Training loss: 1.6089 0.0669 sec/batch\n", + "Epoch 8/20 Iteration 1355/3560 Training loss: 1.6087 0.0670 sec/batch\n", + "Epoch 8/20 Iteration 1356/3560 Training loss: 1.6088 0.0715 sec/batch\n", + "Epoch 8/20 Iteration 1357/3560 Training loss: 1.6085 0.0652 sec/batch\n", + "Epoch 8/20 Iteration 1358/3560 Training loss: 1.6084 0.0693 sec/batch\n", + "Epoch 8/20 Iteration 1359/3560 Training loss: 1.6082 0.0660 sec/batch\n", + "Epoch 8/20 Iteration 1360/3560 Training loss: 1.6080 0.0711 sec/batch\n", + "Epoch 8/20 Iteration 1361/3560 Training loss: 1.6076 0.0676 sec/batch\n", + "Epoch 8/20 Iteration 1362/3560 Training loss: 1.6072 0.0666 sec/batch\n", + "Epoch 8/20 Iteration 1363/3560 Training loss: 1.6070 0.0728 sec/batch\n", + "Epoch 8/20 Iteration 1364/3560 Training loss: 1.6070 0.0724 sec/batch\n", + "Epoch 8/20 Iteration 1365/3560 Training loss: 1.6069 0.0676 sec/batch\n", + "Epoch 8/20 Iteration 1366/3560 Training loss: 1.6068 0.0692 sec/batch\n", + "Epoch 8/20 Iteration 1367/3560 Training loss: 1.6066 0.0647 sec/batch\n", + "Epoch 8/20 Iteration 1368/3560 Training loss: 1.6062 0.0700 sec/batch\n", + "Epoch 8/20 Iteration 1369/3560 Training loss: 1.6059 0.0711 sec/batch\n", + "Epoch 8/20 Iteration 1370/3560 Training loss: 1.6058 0.0697 sec/batch\n", + "Epoch 8/20 Iteration 1371/3560 Training loss: 1.6057 0.0675 sec/batch\n", + "Epoch 8/20 Iteration 1372/3560 Training loss: 1.6052 0.0665 sec/batch\n", + "Epoch 8/20 Iteration 1373/3560 Training loss: 1.6053 0.0696 sec/batch\n", + "Epoch 8/20 Iteration 1374/3560 Training loss: 1.6052 0.0688 sec/batch\n", + "Epoch 8/20 Iteration 1375/3560 Training loss: 1.6050 0.0697 sec/batch\n", + "Epoch 8/20 Iteration 1376/3560 Training loss: 1.6047 0.0685 sec/batch\n", + "Epoch 8/20 Iteration 1377/3560 Training loss: 1.6043 0.0669 sec/batch\n", + "Epoch 8/20 Iteration 1378/3560 Training loss: 1.6041 0.0688 sec/batch\n", + "Epoch 8/20 Iteration 1379/3560 Training loss: 1.6041 0.0763 
sec/batch\n", + "Epoch 8/20 Iteration 1380/3560 Training loss: 1.6041 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1381/3560 Training loss: 1.6039 0.0672 sec/batch\n", + "Epoch 8/20 Iteration 1382/3560 Training loss: 1.6039 0.0707 sec/batch\n", + "Epoch 8/20 Iteration 1383/3560 Training loss: 1.6040 0.0693 sec/batch\n", + "Epoch 8/20 Iteration 1384/3560 Training loss: 1.6040 0.0656 sec/batch\n", + "Epoch 8/20 Iteration 1385/3560 Training loss: 1.6040 0.0671 sec/batch\n", + "Epoch 8/20 Iteration 1386/3560 Training loss: 1.6038 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1387/3560 Training loss: 1.6041 0.0706 sec/batch\n", + "Epoch 8/20 Iteration 1388/3560 Training loss: 1.6040 0.0680 sec/batch\n", + "Epoch 8/20 Iteration 1389/3560 Training loss: 1.6038 0.0690 sec/batch\n", + "Epoch 8/20 Iteration 1390/3560 Training loss: 1.6038 0.0690 sec/batch\n", + "Epoch 8/20 Iteration 1391/3560 Training loss: 1.6037 0.0700 sec/batch\n", + "Epoch 8/20 Iteration 1392/3560 Training loss: 1.6038 0.0708 sec/batch\n", + "Epoch 8/20 Iteration 1393/3560 Training loss: 1.6038 0.0671 sec/batch\n", + "Epoch 8/20 Iteration 1394/3560 Training loss: 1.6039 0.0670 sec/batch\n", + "Epoch 8/20 Iteration 1395/3560 Training loss: 1.6039 0.0697 sec/batch\n", + "Epoch 8/20 Iteration 1396/3560 Training loss: 1.6037 0.0666 sec/batch\n", + "Epoch 8/20 Iteration 1397/3560 Training loss: 1.6034 0.0670 sec/batch\n", + "Epoch 8/20 Iteration 1398/3560 Training loss: 1.6033 0.0659 sec/batch\n", + "Epoch 8/20 Iteration 1399/3560 Training loss: 1.6033 0.0725 sec/batch\n", + "Epoch 8/20 Iteration 1400/3560 Training loss: 1.6033 0.0716 sec/batch\n", + "Epoch 8/20 Iteration 1401/3560 Training loss: 1.6032 0.0690 sec/batch\n", + "Epoch 8/20 Iteration 1402/3560 Training loss: 1.6031 0.0673 sec/batch\n", + "Epoch 8/20 Iteration 1403/3560 Training loss: 1.6032 0.0728 sec/batch\n", + "Epoch 8/20 Iteration 1404/3560 Training loss: 1.6031 0.0658 sec/batch\n", + "Epoch 8/20 Iteration 1405/3560 Training loss: 1.6028 
0.0673 sec/batch\n", + "Epoch 8/20 Iteration 1406/3560 Training loss: 1.6028 0.0700 sec/batch\n", + "Epoch 8/20 Iteration 1407/3560 Training loss: 1.6030 0.0653 sec/batch\n", + "Epoch 8/20 Iteration 1408/3560 Training loss: 1.6029 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1409/3560 Training loss: 1.6029 0.0670 sec/batch\n", + "Epoch 8/20 Iteration 1410/3560 Training loss: 1.6028 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1411/3560 Training loss: 1.6027 0.0678 sec/batch\n", + "Epoch 8/20 Iteration 1412/3560 Training loss: 1.6026 0.0682 sec/batch\n", + "Epoch 8/20 Iteration 1413/3560 Training loss: 1.6026 0.0690 sec/batch\n", + "Epoch 8/20 Iteration 1414/3560 Training loss: 1.6029 0.0714 sec/batch\n", + "Epoch 8/20 Iteration 1415/3560 Training loss: 1.6028 0.0672 sec/batch\n", + "Epoch 8/20 Iteration 1416/3560 Training loss: 1.6028 0.0679 sec/batch\n", + "Epoch 8/20 Iteration 1417/3560 Training loss: 1.6026 0.0711 sec/batch\n", + "Epoch 8/20 Iteration 1418/3560 Training loss: 1.6023 0.0687 sec/batch\n", + "Epoch 8/20 Iteration 1419/3560 Training loss: 1.6024 0.0725 sec/batch\n", + "Epoch 8/20 Iteration 1420/3560 Training loss: 1.6023 0.0720 sec/batch\n", + "Epoch 8/20 Iteration 1421/3560 Training loss: 1.6023 0.0687 sec/batch\n", + "Epoch 8/20 Iteration 1422/3560 Training loss: 1.6021 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1423/3560 Training loss: 1.6019 0.0679 sec/batch\n", + "Epoch 8/20 Iteration 1424/3560 Training loss: 1.6020 0.0660 sec/batch\n", + "Epoch 9/20 Iteration 1425/3560 Training loss: 1.6727 0.0678 sec/batch\n", + "Epoch 9/20 Iteration 1426/3560 Training loss: 1.6280 0.0690 sec/batch\n", + "Epoch 9/20 Iteration 1427/3560 Training loss: 1.6148 0.0680 sec/batch\n", + "Epoch 9/20 Iteration 1428/3560 Training loss: 1.6086 0.0673 sec/batch\n", + "Epoch 9/20 Iteration 1429/3560 Training loss: 1.6008 0.0647 sec/batch\n", + "Epoch 9/20 Iteration 1430/3560 Training loss: 1.5907 0.0667 sec/batch\n", + "Epoch 9/20 Iteration 1431/3560 Training loss: 
1.5894 0.0726 sec/batch\n", + "Epoch 9/20 Iteration 1432/3560 Training loss: 1.5866 0.0744 sec/batch\n", + "Epoch 9/20 Iteration 1433/3560 Training loss: 1.5877 0.0662 sec/batch\n", + "Epoch 9/20 Iteration 1434/3560 Training loss: 1.5882 0.0724 sec/batch\n", + "Epoch 9/20 Iteration 1435/3560 Training loss: 1.5840 0.0658 sec/batch\n", + "Epoch 9/20 Iteration 1436/3560 Training loss: 1.5835 0.0670 sec/batch\n", + "Epoch 9/20 Iteration 1437/3560 Training loss: 1.5832 0.0695 sec/batch\n", + "Epoch 9/20 Iteration 1438/3560 Training loss: 1.5858 0.0720 sec/batch\n", + "Epoch 9/20 Iteration 1439/3560 Training loss: 1.5852 0.0654 sec/batch\n", + "Epoch 9/20 Iteration 1440/3560 Training loss: 1.5833 0.0791 sec/batch\n", + "Epoch 9/20 Iteration 1441/3560 Training loss: 1.5837 0.0763 sec/batch\n", + "Epoch 9/20 Iteration 1442/3560 Training loss: 1.5853 0.0703 sec/batch\n", + "Epoch 9/20 Iteration 1443/3560 Training loss: 1.5857 0.0724 sec/batch\n", + "Epoch 9/20 Iteration 1444/3560 Training loss: 1.5862 0.0664 sec/batch\n", + "Epoch 9/20 Iteration 1445/3560 Training loss: 1.5856 0.0720 sec/batch\n", + "Epoch 9/20 Iteration 1446/3560 Training loss: 1.5862 0.0686 sec/batch\n", + "Epoch 9/20 Iteration 1447/3560 Training loss: 1.5852 0.0682 sec/batch\n", + "Epoch 9/20 Iteration 1448/3560 Training loss: 1.5851 0.0684 sec/batch\n", + "Epoch 9/20 Iteration 1449/3560 Training loss: 1.5847 0.0677 sec/batch\n", + "Epoch 9/20 Iteration 1450/3560 Training loss: 1.5837 0.0693 sec/batch\n", + "Epoch 9/20 Iteration 1451/3560 Training loss: 1.5824 0.0658 sec/batch\n", + "Epoch 9/20 Iteration 1452/3560 Training loss: 1.5832 0.0719 sec/batch\n", + "Epoch 9/20 Iteration 1453/3560 Training loss: 1.5838 0.0677 sec/batch\n", + "Epoch 9/20 Iteration 1454/3560 Training loss: 1.5842 0.0672 sec/batch\n", + "Epoch 9/20 Iteration 1455/3560 Training loss: 1.5839 0.0693 sec/batch\n", + "Epoch 9/20 Iteration 1456/3560 Training loss: 1.5828 0.0668 sec/batch\n", + "Epoch 9/20 Iteration 1457/3560 Training 
loss: 1.5829 0.0744 sec/batch\n", + "Epoch 9/20 Iteration 1458/3560 Training loss: 1.5832 0.0672 sec/batch\n", + "Epoch 9/20 Iteration 1459/3560 Training loss: 1.5829 0.0675 sec/batch\n", + "Epoch 9/20 Iteration 1460/3560 Training loss: 1.5825 0.0744 sec/batch\n", + "Epoch 9/20 Iteration 1461/3560 Training loss: 1.5817 0.0676 sec/batch\n", + "Epoch 9/20 Iteration 1462/3560 Training loss: 1.5806 0.0663 sec/batch\n", + "Epoch 9/20 Iteration 1463/3560 Training loss: 1.5789 0.0664 sec/batch\n", + "Epoch 9/20 Iteration 1464/3560 Training loss: 1.5784 0.0715 sec/batch\n", + "Epoch 9/20 Iteration 1465/3560 Training loss: 1.5780 0.0689 sec/batch\n", + "Epoch 9/20 Iteration 1466/3560 Training loss: 1.5784 0.0661 sec/batch\n", + "Epoch 9/20 Iteration 1467/3560 Training loss: 1.5779 0.0678 sec/batch\n", + "Epoch 9/20 Iteration 1468/3560 Training loss: 1.5771 0.0651 sec/batch\n", + "Epoch 9/20 Iteration 1469/3560 Training loss: 1.5773 0.0675 sec/batch\n", + "Epoch 9/20 Iteration 1470/3560 Training loss: 1.5763 0.0677 sec/batch\n", + "Epoch 9/20 Iteration 1471/3560 Training loss: 1.5761 0.0656 sec/batch\n", + "Epoch 9/20 Iteration 1472/3560 Training loss: 1.5757 0.0677 sec/batch\n", + "Epoch 9/20 Iteration 1473/3560 Training loss: 1.5754 0.0667 sec/batch\n", + "Epoch 9/20 Iteration 1474/3560 Training loss: 1.5759 0.0687 sec/batch\n", + "Epoch 9/20 Iteration 1475/3560 Training loss: 1.5754 0.0641 sec/batch\n", + "Epoch 9/20 Iteration 1476/3560 Training loss: 1.5764 0.0675 sec/batch\n", + "Epoch 9/20 Iteration 1477/3560 Training loss: 1.5763 0.0777 sec/batch\n", + "Epoch 9/20 Iteration 1478/3560 Training loss: 1.5762 0.0671 sec/batch\n", + "Epoch 9/20 Iteration 1479/3560 Training loss: 1.5760 0.0687 sec/batch\n", + "Epoch 9/20 Iteration 1480/3560 Training loss: 1.5760 0.0672 sec/batch\n", + "Epoch 9/20 Iteration 1481/3560 Training loss: 1.5766 0.0688 sec/batch\n", + "Epoch 9/20 Iteration 1482/3560 Training loss: 1.5764 0.0661 sec/batch\n", + "Epoch 9/20 Iteration 1483/3560 
Training loss: 1.5758 0.0704 sec/batch\n", + "Epoch 9/20 Iteration 1484/3560 Training loss: 1.5762 0.0679 sec/batch\n", + "Epoch 9/20 Iteration 1485/3560 Training loss: 1.5763 0.0712 sec/batch\n", + "Epoch 9/20 Iteration 1486/3560 Training loss: 1.5774 0.0756 sec/batch\n", + "Epoch 9/20 Iteration 1487/3560 Training loss: 1.5779 0.0734 sec/batch\n", + "Epoch 9/20 Iteration 1488/3560 Training loss: 1.5778 0.0699 sec/batch\n", + "Epoch 9/20 Iteration 1489/3560 Training loss: 1.5778 0.0729 sec/batch\n", + "Epoch 9/20 Iteration 1490/3560 Training loss: 1.5779 0.0696 sec/batch\n", + "Epoch 9/20 Iteration 1491/3560 Training loss: 1.5779 0.0673 sec/batch\n", + "Epoch 9/20 Iteration 1492/3560 Training loss: 1.5774 0.0699 sec/batch\n", + "Epoch 9/20 Iteration 1493/3560 Training loss: 1.5773 0.0718 sec/batch\n", + "Epoch 9/20 Iteration 1494/3560 Training loss: 1.5771 0.0665 sec/batch\n", + "Epoch 9/20 Iteration 1495/3560 Training loss: 1.5774 0.0754 sec/batch\n", + "Epoch 9/20 Iteration 1496/3560 Training loss: 1.5776 0.0746 sec/batch\n", + "Epoch 9/20 Iteration 1497/3560 Training loss: 1.5780 0.0644 sec/batch\n", + "Epoch 9/20 Iteration 1498/3560 Training loss: 1.5777 0.0727 sec/batch\n", + "Epoch 9/20 Iteration 1499/3560 Training loss: 1.5775 0.0697 sec/batch\n", + "Epoch 9/20 Iteration 1500/3560 Training loss: 1.5777 0.0656 sec/batch\n", + "Epoch 9/20 Iteration 1501/3560 Training loss: 1.5775 0.0680 sec/batch\n", + "Epoch 9/20 Iteration 1502/3560 Training loss: 1.5776 0.0651 sec/batch\n", + "Epoch 9/20 Iteration 1503/3560 Training loss: 1.5770 0.0712 sec/batch\n", + "Epoch 9/20 Iteration 1504/3560 Training loss: 1.5769 0.0693 sec/batch\n", + "Epoch 9/20 Iteration 1505/3560 Training loss: 1.5763 0.0675 sec/batch\n", + "Epoch 9/20 Iteration 1506/3560 Training loss: 1.5763 0.0665 sec/batch\n", + "Epoch 9/20 Iteration 1507/3560 Training loss: 1.5757 0.0779 sec/batch\n", + "Epoch 9/20 Iteration 1508/3560 Training loss: 1.5756 0.0664 sec/batch\n", + "Epoch 9/20 Iteration 
1509/3560 Training loss: 1.5751 0.0745 sec/batch\n", + "Epoch 9/20 Iteration 1510/3560 Training loss: 1.5748 0.0703 sec/batch\n", + "Epoch 9/20 Iteration 1511/3560 Training loss: 1.5745 0.0694 sec/batch\n", + "Epoch 9/20 Iteration 1512/3560 Training loss: 1.5741 0.0685 sec/batch\n", + "Epoch 9/20 Iteration 1513/3560 Training loss: 1.5735 0.0692 sec/batch\n", + "Epoch 9/20 Iteration 1514/3560 Training loss: 1.5735 0.0784 sec/batch\n", + "Epoch 9/20 Iteration 1515/3560 Training loss: 1.5732 0.0668 sec/batch\n", + "Epoch 9/20 Iteration 1516/3560 Training loss: 1.5731 0.0689 sec/batch\n", + "Epoch 9/20 Iteration 1517/3560 Training loss: 1.5727 0.0671 sec/batch\n", + "Epoch 9/20 Iteration 1518/3560 Training loss: 1.5723 0.0686 sec/batch\n", + "Epoch 9/20 Iteration 1519/3560 Training loss: 1.5719 0.0687 sec/batch\n", + "Epoch 9/20 Iteration 1520/3560 Training loss: 1.5720 0.0659 sec/batch\n", + "Epoch 9/20 Iteration 1521/3560 Training loss: 1.5718 0.0665 sec/batch\n", + "Epoch 9/20 Iteration 1522/3560 Training loss: 1.5712 0.0705 sec/batch\n", + "Epoch 9/20 Iteration 1523/3560 Training loss: 1.5707 0.0723 sec/batch\n", + "Epoch 9/20 Iteration 1524/3560 Training loss: 1.5703 0.0669 sec/batch\n", + "Epoch 9/20 Iteration 1525/3560 Training loss: 1.5703 0.0718 sec/batch\n", + "Epoch 9/20 Iteration 1526/3560 Training loss: 1.5700 0.0697 sec/batch\n", + "Epoch 9/20 Iteration 1527/3560 Training loss: 1.5697 0.0659 sec/batch\n", + "Epoch 9/20 Iteration 1528/3560 Training loss: 1.5695 0.0703 sec/batch\n", + "Epoch 9/20 Iteration 1529/3560 Training loss: 1.5693 0.0690 sec/batch\n", + "Epoch 9/20 Iteration 1530/3560 Training loss: 1.5692 0.0652 sec/batch\n", + "Epoch 9/20 Iteration 1531/3560 Training loss: 1.5691 0.0657 sec/batch\n", + "Epoch 9/20 Iteration 1532/3560 Training loss: 1.5689 0.0720 sec/batch\n", + "Epoch 9/20 Iteration 1533/3560 Training loss: 1.5688 0.0680 sec/batch\n", + "Epoch 9/20 Iteration 1534/3560 Training loss: 1.5688 0.0712 sec/batch\n", + "Epoch 9/20 
Iteration 1535/3560 Training loss: 1.5686 0.0671 sec/batch\n", + "Epoch 9/20 Iteration 1536/3560 Training loss: 1.5683 0.0728 sec/batch\n", + "Epoch 9/20 Iteration 1537/3560 Training loss: 1.5682 0.0691 sec/batch\n", + "Epoch 9/20 Iteration 1538/3560 Training loss: 1.5680 0.0679 sec/batch\n", + "Epoch 9/20 Iteration 1539/3560 Training loss: 1.5676 0.0644 sec/batch\n", + "Epoch 9/20 Iteration 1540/3560 Training loss: 1.5671 0.0657 sec/batch\n", + "Epoch 9/20 Iteration 1541/3560 Training loss: 1.5670 0.0701 sec/batch\n", + "Epoch 9/20 Iteration 1542/3560 Training loss: 1.5670 0.0691 sec/batch\n", + "Epoch 9/20 Iteration 1543/3560 Training loss: 1.5668 0.0681 sec/batch\n", + "Epoch 9/20 Iteration 1544/3560 Training loss: 1.5667 0.0659 sec/batch\n", + "Epoch 9/20 Iteration 1545/3560 Training loss: 1.5665 0.0698 sec/batch\n", + "Epoch 9/20 Iteration 1546/3560 Training loss: 1.5661 0.0767 sec/batch\n", + "Epoch 9/20 Iteration 1547/3560 Training loss: 1.5656 0.0711 sec/batch\n", + "Epoch 9/20 Iteration 1548/3560 Training loss: 1.5657 0.0689 sec/batch\n", + "Epoch 9/20 Iteration 1549/3560 Training loss: 1.5656 0.0659 sec/batch\n", + "Epoch 9/20 Iteration 1550/3560 Training loss: 1.5651 0.0690 sec/batch\n", + "Epoch 9/20 Iteration 1551/3560 Training loss: 1.5652 0.0677 sec/batch\n", + "Epoch 9/20 Iteration 1552/3560 Training loss: 1.5651 0.0756 sec/batch\n", + "Epoch 9/20 Iteration 1553/3560 Training loss: 1.5650 0.0681 sec/batch\n", + "Epoch 9/20 Iteration 1554/3560 Training loss: 1.5647 0.0738 sec/batch\n", + "Epoch 9/20 Iteration 1555/3560 Training loss: 1.5643 0.0718 sec/batch\n", + "Epoch 9/20 Iteration 1556/3560 Training loss: 1.5641 0.0694 sec/batch\n", + "Epoch 9/20 Iteration 1557/3560 Training loss: 1.5641 0.0701 sec/batch\n", + "Epoch 9/20 Iteration 1558/3560 Training loss: 1.5641 0.0673 sec/batch\n", + "Epoch 9/20 Iteration 1559/3560 Training loss: 1.5641 0.0652 sec/batch\n", + "Epoch 9/20 Iteration 1560/3560 Training loss: 1.5641 0.0651 sec/batch\n", + "Epoch 
9/20 Iteration 1561/3560 Training loss: 1.5643 0.0730 sec/batch\n", + "Epoch 9/20 Iteration 1562/3560 Training loss: 1.5642 0.0699 sec/batch\n", + "Epoch 9/20 Iteration 1563/3560 Training loss: 1.5642 0.0705 sec/batch\n", + "Epoch 9/20 Iteration 1564/3560 Training loss: 1.5641 0.0682 sec/batch\n", + "Epoch 9/20 Iteration 1565/3560 Training loss: 1.5644 0.0675 sec/batch\n", + "Epoch 9/20 Iteration 1566/3560 Training loss: 1.5643 0.0663 sec/batch\n", + "Epoch 9/20 Iteration 1567/3560 Training loss: 1.5642 0.0723 sec/batch\n", + "Epoch 9/20 Iteration 1568/3560 Training loss: 1.5643 0.0645 sec/batch\n", + "Epoch 9/20 Iteration 1569/3560 Training loss: 1.5642 0.0651 sec/batch\n", + "Epoch 9/20 Iteration 1570/3560 Training loss: 1.5642 0.0759 sec/batch\n", + "Epoch 9/20 Iteration 1571/3560 Training loss: 1.5642 0.0702 sec/batch\n", + "Epoch 9/20 Iteration 1572/3560 Training loss: 1.5643 0.0760 sec/batch\n", + "Epoch 9/20 Iteration 1573/3560 Training loss: 1.5643 0.0717 sec/batch\n", + "Epoch 9/20 Iteration 1574/3560 Training loss: 1.5641 0.0709 sec/batch\n", + "Epoch 9/20 Iteration 1575/3560 Training loss: 1.5637 0.0682 sec/batch\n", + "Epoch 9/20 Iteration 1576/3560 Training loss: 1.5636 0.0700 sec/batch\n", + "Epoch 9/20 Iteration 1577/3560 Training loss: 1.5636 0.0692 sec/batch\n", + "Epoch 9/20 Iteration 1578/3560 Training loss: 1.5636 0.0685 sec/batch\n", + "Epoch 9/20 Iteration 1579/3560 Training loss: 1.5635 0.0674 sec/batch\n", + "Epoch 9/20 Iteration 1580/3560 Training loss: 1.5634 0.0649 sec/batch\n", + "Epoch 9/20 Iteration 1581/3560 Training loss: 1.5634 0.0689 sec/batch\n", + "Epoch 9/20 Iteration 1582/3560 Training loss: 1.5633 0.0675 sec/batch\n", + "Epoch 9/20 Iteration 1583/3560 Training loss: 1.5631 0.0662 sec/batch\n", + "Epoch 9/20 Iteration 1584/3560 Training loss: 1.5632 0.0682 sec/batch\n", + "Epoch 9/20 Iteration 1585/3560 Training loss: 1.5633 0.0694 sec/batch\n", + "Epoch 9/20 Iteration 1586/3560 Training loss: 1.5632 0.0723 sec/batch\n", + 
"Epoch 9/20 Iteration 1587/3560 Training loss: 1.5633 0.0677 sec/batch\n", + "Epoch 9/20 Iteration 1588/3560 Training loss: 1.5632 0.0666 sec/batch\n", + "Epoch 9/20 Iteration 1589/3560 Training loss: 1.5631 0.0694 sec/batch\n", + "Epoch 9/20 Iteration 1590/3560 Training loss: 1.5631 0.0758 sec/batch\n", + "Epoch 9/20 Iteration 1591/3560 Training loss: 1.5632 0.0723 sec/batch\n", + "Epoch 9/20 Iteration 1592/3560 Training loss: 1.5635 0.0672 sec/batch\n", + "Epoch 9/20 Iteration 1593/3560 Training loss: 1.5634 0.0665 sec/batch\n", + "Epoch 9/20 Iteration 1594/3560 Training loss: 1.5634 0.0707 sec/batch\n", + "Epoch 9/20 Iteration 1595/3560 Training loss: 1.5632 0.0771 sec/batch\n", + "Epoch 9/20 Iteration 1596/3560 Training loss: 1.5629 0.0770 sec/batch\n", + "Epoch 9/20 Iteration 1597/3560 Training loss: 1.5630 0.0712 sec/batch\n", + "Epoch 9/20 Iteration 1598/3560 Training loss: 1.5630 0.0711 sec/batch\n", + "Epoch 9/20 Iteration 1599/3560 Training loss: 1.5630 0.0668 sec/batch\n", + "Epoch 9/20 Iteration 1600/3560 Training loss: 1.5629 0.0704 sec/batch\n", + "Epoch 9/20 Iteration 1601/3560 Training loss: 1.5627 0.0691 sec/batch\n", + "Epoch 9/20 Iteration 1602/3560 Training loss: 1.5627 0.0721 sec/batch\n", + "Epoch 10/20 Iteration 1603/3560 Training loss: 1.6327 0.0761 sec/batch\n", + "Epoch 10/20 Iteration 1604/3560 Training loss: 1.5975 0.0712 sec/batch\n", + "Epoch 10/20 Iteration 1605/3560 Training loss: 1.5786 0.0652 sec/batch\n", + "Epoch 10/20 Iteration 1606/3560 Training loss: 1.5736 0.0673 sec/batch\n", + "Epoch 10/20 Iteration 1607/3560 Training loss: 1.5643 0.0680 sec/batch\n", + "Epoch 10/20 Iteration 1608/3560 Training loss: 1.5540 0.0691 sec/batch\n", + "Epoch 10/20 Iteration 1609/3560 Training loss: 1.5543 0.0805 sec/batch\n", + "Epoch 10/20 Iteration 1610/3560 Training loss: 1.5519 0.0709 sec/batch\n", + "Epoch 10/20 Iteration 1611/3560 Training loss: 1.5520 0.0681 sec/batch\n", + "Epoch 10/20 Iteration 1612/3560 Training loss: 1.5511 0.0683 
sec/batch\n", + "Epoch 10/20 Iteration 1613/3560 Training loss: 1.5479 0.0676 sec/batch\n", + "Epoch 10/20 Iteration 1614/3560 Training loss: 1.5481 0.0695 sec/batch\n", + "Epoch 10/20 Iteration 1615/3560 Training loss: 1.5469 0.0679 sec/batch\n", + "Epoch 10/20 Iteration 1616/3560 Training loss: 1.5492 0.0675 sec/batch\n", + "Epoch 10/20 Iteration 1617/3560 Training loss: 1.5485 0.0729 sec/batch\n", + "Epoch 10/20 Iteration 1618/3560 Training loss: 1.5468 0.0665 sec/batch\n", + "Epoch 10/20 Iteration 1619/3560 Training loss: 1.5467 0.0712 sec/batch\n", + "Epoch 10/20 Iteration 1620/3560 Training loss: 1.5490 0.0672 sec/batch\n", + "Epoch 10/20 Iteration 1621/3560 Training loss: 1.5489 0.0864 sec/batch\n", + "Epoch 10/20 Iteration 1622/3560 Training loss: 1.5498 0.0786 sec/batch\n", + "Epoch 10/20 Iteration 1623/3560 Training loss: 1.5491 0.0669 sec/batch\n", + "Epoch 10/20 Iteration 1624/3560 Training loss: 1.5492 0.0706 sec/batch\n", + "Epoch 10/20 Iteration 1625/3560 Training loss: 1.5487 0.0678 sec/batch\n", + "Epoch 10/20 Iteration 1626/3560 Training loss: 1.5488 0.0698 sec/batch\n", + "Epoch 10/20 Iteration 1627/3560 Training loss: 1.5488 0.0746 sec/batch\n", + "Epoch 10/20 Iteration 1628/3560 Training loss: 1.5475 0.0800 sec/batch\n", + "Epoch 10/20 Iteration 1629/3560 Training loss: 1.5466 0.0684 sec/batch\n", + "Epoch 10/20 Iteration 1630/3560 Training loss: 1.5472 0.0711 sec/batch\n", + "Epoch 10/20 Iteration 1631/3560 Training loss: 1.5478 0.0685 sec/batch\n", + "Epoch 10/20 Iteration 1632/3560 Training loss: 1.5481 0.0668 sec/batch\n", + "Epoch 10/20 Iteration 1633/3560 Training loss: 1.5480 0.0664 sec/batch\n", + "Epoch 10/20 Iteration 1634/3560 Training loss: 1.5471 0.0688 sec/batch\n", + "Epoch 10/20 Iteration 1635/3560 Training loss: 1.5473 0.0669 sec/batch\n", + "Epoch 10/20 Iteration 1636/3560 Training loss: 1.5477 0.0646 sec/batch\n", + "Epoch 10/20 Iteration 1637/3560 Training loss: 1.5473 0.0710 sec/batch\n", + "Epoch 10/20 Iteration 1638/3560 
Training loss: 1.5470 0.0673 sec/batch\n", + "Epoch 10/20 Iteration 1639/3560 Training loss: 1.5463 0.0676 sec/batch\n", + "Epoch 10/20 Iteration 1640/3560 Training loss: 1.5451 0.0696 sec/batch\n", + "Epoch 10/20 Iteration 1641/3560 Training loss: 1.5436 0.0716 sec/batch\n", + "Epoch 10/20 Iteration 1642/3560 Training loss: 1.5431 0.0640 sec/batch\n", + "Epoch 10/20 Iteration 1643/3560 Training loss: 1.5428 0.0754 sec/batch\n", + "Epoch 10/20 Iteration 1644/3560 Training loss: 1.5432 0.0679 sec/batch\n", + "Epoch 10/20 Iteration 1645/3560 Training loss: 1.5426 0.0731 sec/batch\n", + "Epoch 10/20 Iteration 1646/3560 Training loss: 1.5419 0.0694 sec/batch\n", + "Epoch 10/20 Iteration 1647/3560 Training loss: 1.5424 0.0649 sec/batch\n", + "Epoch 10/20 Iteration 1648/3560 Training loss: 1.5414 0.0694 sec/batch\n", + "Epoch 10/20 Iteration 1649/3560 Training loss: 1.5410 0.0697 sec/batch\n", + "Epoch 10/20 Iteration 1650/3560 Training loss: 1.5407 0.0677 sec/batch\n", + "Epoch 10/20 Iteration 1651/3560 Training loss: 1.5406 0.0662 sec/batch\n", + "Epoch 10/20 Iteration 1652/3560 Training loss: 1.5409 0.0702 sec/batch\n", + "Epoch 10/20 Iteration 1653/3560 Training loss: 1.5406 0.0671 sec/batch\n", + "Epoch 10/20 Iteration 1654/3560 Training loss: 1.5414 0.0688 sec/batch\n", + "Epoch 10/20 Iteration 1655/3560 Training loss: 1.5413 0.0661 sec/batch\n", + "Epoch 10/20 Iteration 1656/3560 Training loss: 1.5415 0.0664 sec/batch\n", + "Epoch 10/20 Iteration 1657/3560 Training loss: 1.5410 0.0692 sec/batch\n", + "Epoch 10/20 Iteration 1658/3560 Training loss: 1.5410 0.0697 sec/batch\n", + "Epoch 10/20 Iteration 1659/3560 Training loss: 1.5411 0.0682 sec/batch\n", + "Epoch 10/20 Iteration 1660/3560 Training loss: 1.5407 0.0719 sec/batch\n", + "Epoch 10/20 Iteration 1661/3560 Training loss: 1.5401 0.0679 sec/batch\n", + "Epoch 10/20 Iteration 1662/3560 Training loss: 1.5405 0.0678 sec/batch\n", + "Epoch 10/20 Iteration 1663/3560 Training loss: 1.5403 0.0743 sec/batch\n", + 
"Epoch 10/20 Iteration 1664/3560 Training loss: 1.5414 0.0687 sec/batch\n", + "Epoch 10/20 Iteration 1665/3560 Training loss: 1.5419 0.0671 sec/batch\n", + "Epoch 10/20 Iteration 1666/3560 Training loss: 1.5420 0.0694 sec/batch\n", + "Epoch 10/20 Iteration 1667/3560 Training loss: 1.5418 0.0731 sec/batch\n", + "Epoch 10/20 Iteration 1668/3560 Training loss: 1.5419 0.0689 sec/batch\n", + "Epoch 10/20 Iteration 1669/3560 Training loss: 1.5420 0.0671 sec/batch\n", + "Epoch 10/20 Iteration 1670/3560 Training loss: 1.5416 0.0686 sec/batch\n", + "Epoch 10/20 Iteration 1671/3560 Training loss: 1.5417 0.0677 sec/batch\n", + "Epoch 10/20 Iteration 1672/3560 Training loss: 1.5417 0.0745 sec/batch\n", + "Epoch 10/20 Iteration 1673/3560 Training loss: 1.5422 0.0697 sec/batch\n", + "Epoch 10/20 Iteration 1674/3560 Training loss: 1.5423 0.0679 sec/batch\n", + "Epoch 10/20 Iteration 1675/3560 Training loss: 1.5427 0.0647 sec/batch\n", + "Epoch 10/20 Iteration 1676/3560 Training loss: 1.5422 0.0708 sec/batch\n", + "Epoch 10/20 Iteration 1677/3560 Training loss: 1.5420 0.0674 sec/batch\n", + "Epoch 10/20 Iteration 1678/3560 Training loss: 1.5422 0.0669 sec/batch\n", + "Epoch 10/20 Iteration 1679/3560 Training loss: 1.5419 0.0747 sec/batch\n", + "Epoch 10/20 Iteration 1680/3560 Training loss: 1.5420 0.0697 sec/batch\n", + "Epoch 10/20 Iteration 1681/3560 Training loss: 1.5413 0.0682 sec/batch\n", + "Epoch 10/20 Iteration 1682/3560 Training loss: 1.5412 0.0648 sec/batch\n", + "Epoch 10/20 Iteration 1683/3560 Training loss: 1.5407 0.0686 sec/batch\n", + "Epoch 10/20 Iteration 1684/3560 Training loss: 1.5408 0.0667 sec/batch\n", + "Epoch 10/20 Iteration 1685/3560 Training loss: 1.5404 0.0791 sec/batch\n", + "Epoch 10/20 Iteration 1686/3560 Training loss: 1.5403 0.0694 sec/batch\n", + "Epoch 10/20 Iteration 1687/3560 Training loss: 1.5400 0.0675 sec/batch\n", + "Epoch 10/20 Iteration 1688/3560 Training loss: 1.5398 0.0676 sec/batch\n", + "Epoch 10/20 Iteration 1689/3560 Training loss: 
1.5394 0.0865 sec/batch\n", + "Epoch 10/20 Iteration 1690/3560 Training loss: 1.5392 0.0689 sec/batch\n", + "Epoch 10/20 Iteration 1691/3560 Training loss: 1.5388 0.0681 sec/batch\n", + "Epoch 10/20 Iteration 1692/3560 Training loss: 1.5389 0.0769 sec/batch\n", + "Epoch 10/20 Iteration 1693/3560 Training loss: 1.5386 0.0674 sec/batch\n", + "Epoch 10/20 Iteration 1694/3560 Training loss: 1.5384 0.0667 sec/batch\n", + "Epoch 10/20 Iteration 1695/3560 Training loss: 1.5380 0.0686 sec/batch\n", + "Epoch 10/20 Iteration 1696/3560 Training loss: 1.5377 0.0689 sec/batch\n", + "Epoch 10/20 Iteration 1697/3560 Training loss: 1.5374 0.0747 sec/batch\n", + "Epoch 10/20 Iteration 1698/3560 Training loss: 1.5375 0.0733 sec/batch\n", + "Epoch 10/20 Iteration 1699/3560 Training loss: 1.5374 0.0670 sec/batch\n", + "Epoch 10/20 Iteration 1700/3560 Training loss: 1.5369 0.0759 sec/batch\n", + "Epoch 10/20 Iteration 1701/3560 Training loss: 1.5364 0.0698 sec/batch\n", + "Epoch 10/20 Iteration 1702/3560 Training loss: 1.5359 0.0712 sec/batch\n", + "Epoch 10/20 Iteration 1703/3560 Training loss: 1.5359 0.0721 sec/batch\n", + "Epoch 10/20 Iteration 1704/3560 Training loss: 1.5358 0.0852 sec/batch\n", + "Epoch 10/20 Iteration 1705/3560 Training loss: 1.5356 0.0715 sec/batch\n", + "Epoch 10/20 Iteration 1706/3560 Training loss: 1.5354 0.0748 sec/batch\n", + "Epoch 10/20 Iteration 1707/3560 Training loss: 1.5351 0.0660 sec/batch\n", + "Epoch 10/20 Iteration 1708/3560 Training loss: 1.5350 0.0688 sec/batch\n", + "Epoch 10/20 Iteration 1709/3560 Training loss: 1.5348 0.0678 sec/batch\n", + "Epoch 10/20 Iteration 1710/3560 Training loss: 1.5348 0.0672 sec/batch\n", + "Epoch 10/20 Iteration 1711/3560 Training loss: 1.5346 0.0714 sec/batch\n", + "Epoch 10/20 Iteration 1712/3560 Training loss: 1.5345 0.0794 sec/batch\n", + "Epoch 10/20 Iteration 1713/3560 Training loss: 1.5344 0.0729 sec/batch\n", + "Epoch 10/20 Iteration 1714/3560 Training loss: 1.5343 0.0747 sec/batch\n", + "Epoch 10/20 
Iteration 1715/3560 Training loss: 1.5341 0.0696 sec/batch\n", + "Epoch 10/20 Iteration 1716/3560 Training loss: 1.5338 0.0715 sec/batch\n", + "Epoch 10/20 Iteration 1717/3560 Training loss: 1.5334 0.0705 sec/batch\n", + "Epoch 10/20 Iteration 1718/3560 Training loss: 1.5330 0.0703 sec/batch\n", + "Epoch 10/20 Iteration 1719/3560 Training loss: 1.5329 0.0683 sec/batch\n", + "Epoch 10/20 Iteration 1720/3560 Training loss: 1.5329 0.0682 sec/batch\n", + "Epoch 10/20 Iteration 1721/3560 Training loss: 1.5328 0.0663 sec/batch\n", + "Epoch 10/20 Iteration 1722/3560 Training loss: 1.5328 0.0756 sec/batch\n", + "Epoch 10/20 Iteration 1723/3560 Training loss: 1.5327 0.0745 sec/batch\n", + "Epoch 10/20 Iteration 1724/3560 Training loss: 1.5322 0.0704 sec/batch\n", + "Epoch 10/20 Iteration 1725/3560 Training loss: 1.5318 0.0701 sec/batch\n", + "Epoch 10/20 Iteration 1726/3560 Training loss: 1.5318 0.0697 sec/batch\n", + "Epoch 10/20 Iteration 1727/3560 Training loss: 1.5317 0.0684 sec/batch\n", + "Epoch 10/20 Iteration 1728/3560 Training loss: 1.5313 0.0672 sec/batch\n", + "Epoch 10/20 Iteration 1729/3560 Training loss: 1.5315 0.0678 sec/batch\n", + "Epoch 10/20 Iteration 1730/3560 Training loss: 1.5314 0.0673 sec/batch\n", + "Epoch 10/20 Iteration 1731/3560 Training loss: 1.5312 0.0722 sec/batch\n", + "Epoch 10/20 Iteration 1732/3560 Training loss: 1.5310 0.0734 sec/batch\n", + "Epoch 10/20 Iteration 1733/3560 Training loss: 1.5306 0.0643 sec/batch\n", + "Epoch 10/20 Iteration 1734/3560 Training loss: 1.5303 0.0703 sec/batch\n", + "Epoch 10/20 Iteration 1735/3560 Training loss: 1.5304 0.0725 sec/batch\n", + "Epoch 10/20 Iteration 1736/3560 Training loss: 1.5304 0.0673 sec/batch\n", + "Epoch 10/20 Iteration 1737/3560 Training loss: 1.5303 0.0663 sec/batch\n", + "Epoch 10/20 Iteration 1738/3560 Training loss: 1.5303 0.0646 sec/batch\n", + "Epoch 10/20 Iteration 1739/3560 Training loss: 1.5305 0.0662 sec/batch\n", + "Epoch 10/20 Iteration 1740/3560 Training loss: 1.5305 0.0688 
sec/batch\n", + "Epoch 10/20 Iteration 1741/3560 Training loss: 1.5305 0.0668 sec/batch\n", + "Epoch 10/20 Iteration 1742/3560 Training loss: 1.5304 0.0691 sec/batch\n", + "Epoch 10/20 Iteration 1743/3560 Training loss: 1.5308 0.0785 sec/batch\n", + "Epoch 10/20 Iteration 1744/3560 Training loss: 1.5308 0.0688 sec/batch\n", + "Epoch 10/20 Iteration 1745/3560 Training loss: 1.5307 0.0794 sec/batch\n", + "Epoch 10/20 Iteration 1746/3560 Training loss: 1.5309 0.0707 sec/batch\n", + "Epoch 10/20 Iteration 1747/3560 Training loss: 1.5307 0.0665 sec/batch\n", + "Epoch 10/20 Iteration 1748/3560 Training loss: 1.5308 0.0686 sec/batch\n", + "Epoch 10/20 Iteration 1749/3560 Training loss: 1.5308 0.0788 sec/batch\n", + "Epoch 10/20 Iteration 1750/3560 Training loss: 1.5310 0.0662 sec/batch\n", + "Epoch 10/20 Iteration 1751/3560 Training loss: 1.5310 0.0669 sec/batch\n", + "Epoch 10/20 Iteration 1752/3560 Training loss: 1.5309 0.0742 sec/batch\n", + "Epoch 10/20 Iteration 1753/3560 Training loss: 1.5307 0.0695 sec/batch\n", + "Epoch 10/20 Iteration 1754/3560 Training loss: 1.5305 0.0672 sec/batch\n", + "Epoch 10/20 Iteration 1755/3560 Training loss: 1.5306 0.0723 sec/batch\n", + "Epoch 10/20 Iteration 1756/3560 Training loss: 1.5306 0.0682 sec/batch\n", + "Epoch 10/20 Iteration 1757/3560 Training loss: 1.5305 0.0653 sec/batch\n", + "Epoch 10/20 Iteration 1758/3560 Training loss: 1.5304 0.0678 sec/batch\n", + "Epoch 10/20 Iteration 1759/3560 Training loss: 1.5305 0.0687 sec/batch\n", + "Epoch 10/20 Iteration 1760/3560 Training loss: 1.5305 0.0741 sec/batch\n", + "Epoch 10/20 Iteration 1761/3560 Training loss: 1.5303 0.0661 sec/batch\n", + "Epoch 10/20 Iteration 1762/3560 Training loss: 1.5303 0.0696 sec/batch\n", + "Epoch 10/20 Iteration 1763/3560 Training loss: 1.5305 0.0704 sec/batch\n", + "Epoch 10/20 Iteration 1764/3560 Training loss: 1.5305 0.0685 sec/batch\n", + "Epoch 10/20 Iteration 1765/3560 Training loss: 1.5305 0.0679 sec/batch\n", + "Epoch 10/20 Iteration 1766/3560 
Training loss: 1.5305 0.0711 sec/batch\n", + "Epoch 10/20 Iteration 1767/3560 Training loss: 1.5304 0.0692 sec/batch\n", + "Epoch 10/20 Iteration 1768/3560 Training loss: 1.5304 0.0820 sec/batch\n", + "Epoch 10/20 Iteration 1769/3560 Training loss: 1.5305 0.0658 sec/batch\n", + "Epoch 10/20 Iteration 1770/3560 Training loss: 1.5309 0.0670 sec/batch\n", + "Epoch 10/20 Iteration 1771/3560 Training loss: 1.5308 0.0649 sec/batch\n", + "Epoch 10/20 Iteration 1772/3560 Training loss: 1.5308 0.0672 sec/batch\n", + "Epoch 10/20 Iteration 1773/3560 Training loss: 1.5306 0.0674 sec/batch\n", + "Epoch 10/20 Iteration 1774/3560 Training loss: 1.5303 0.0668 sec/batch\n", + "Epoch 10/20 Iteration 1775/3560 Training loss: 1.5304 0.0652 sec/batch\n", + "Epoch 10/20 Iteration 1776/3560 Training loss: 1.5303 0.0687 sec/batch\n", + "Epoch 10/20 Iteration 1777/3560 Training loss: 1.5304 0.0687 sec/batch\n", + "Epoch 10/20 Iteration 1778/3560 Training loss: 1.5302 0.0704 sec/batch\n", + "Epoch 10/20 Iteration 1779/3560 Training loss: 1.5300 0.0745 sec/batch\n", + "Epoch 10/20 Iteration 1780/3560 Training loss: 1.5302 0.0717 sec/batch\n", + "Epoch 11/20 Iteration 1781/3560 Training loss: 1.5943 0.0772 sec/batch\n", + "Epoch 11/20 Iteration 1782/3560 Training loss: 1.5592 0.0703 sec/batch\n", + "Epoch 11/20 Iteration 1783/3560 Training loss: 1.5457 0.0702 sec/batch\n", + "Epoch 11/20 Iteration 1784/3560 Training loss: 1.5417 0.0696 sec/batch\n", + "Epoch 11/20 Iteration 1785/3560 Training loss: 1.5353 0.0679 sec/batch\n", + "Epoch 11/20 Iteration 1786/3560 Training loss: 1.5255 0.0689 sec/batch\n", + "Epoch 11/20 Iteration 1787/3560 Training loss: 1.5258 0.0695 sec/batch\n", + "Epoch 11/20 Iteration 1788/3560 Training loss: 1.5235 0.0672 sec/batch\n", + "Epoch 11/20 Iteration 1789/3560 Training loss: 1.5249 0.0678 sec/batch\n", + "Epoch 11/20 Iteration 1790/3560 Training loss: 1.5236 0.0642 sec/batch\n", + "Epoch 11/20 Iteration 1791/3560 Training loss: 1.5200 0.0891 sec/batch\n", + 
"Epoch 11/20 Iteration 1792/3560 Training loss: 1.5192 0.0741 sec/batch\n", + "Epoch 11/20 Iteration 1793/3560 Training loss: 1.5187 0.0665 sec/batch\n", + "Epoch 11/20 Iteration 1794/3560 Training loss: 1.5209 0.0734 sec/batch\n", + "Epoch 11/20 Iteration 1795/3560 Training loss: 1.5198 0.0707 sec/batch\n", + "Epoch 11/20 Iteration 1796/3560 Training loss: 1.5176 0.0658 sec/batch\n", + "Epoch 11/20 Iteration 1797/3560 Training loss: 1.5175 0.0658 sec/batch\n", + "Epoch 11/20 Iteration 1798/3560 Training loss: 1.5188 0.0709 sec/batch\n", + "Epoch 11/20 Iteration 1799/3560 Training loss: 1.5190 0.0661 sec/batch\n", + "Epoch 11/20 Iteration 1800/3560 Training loss: 1.5200 0.0815 sec/batch\n", + "Epoch 11/20 Iteration 1801/3560 Training loss: 1.5192 0.0697 sec/batch\n", + "Epoch 11/20 Iteration 1802/3560 Training loss: 1.5194 0.0676 sec/batch\n", + "Epoch 11/20 Iteration 1803/3560 Training loss: 1.5192 0.0692 sec/batch\n", + "Epoch 11/20 Iteration 1804/3560 Training loss: 1.5188 0.0689 sec/batch\n", + "Epoch 11/20 Iteration 1805/3560 Training loss: 1.5185 0.0670 sec/batch\n", + "Epoch 11/20 Iteration 1806/3560 Training loss: 1.5171 0.0660 sec/batch\n", + "Epoch 11/20 Iteration 1807/3560 Training loss: 1.5154 0.0681 sec/batch\n", + "Epoch 11/20 Iteration 1808/3560 Training loss: 1.5160 0.0638 sec/batch\n", + "Epoch 11/20 Iteration 1809/3560 Training loss: 1.5166 0.0644 sec/batch\n", + "Epoch 11/20 Iteration 1810/3560 Training loss: 1.5171 0.0655 sec/batch\n", + "Epoch 11/20 Iteration 1811/3560 Training loss: 1.5169 0.0698 sec/batch\n", + "Epoch 11/20 Iteration 1812/3560 Training loss: 1.5156 0.0727 sec/batch\n", + "Epoch 11/20 Iteration 1813/3560 Training loss: 1.5157 0.0776 sec/batch\n", + "Epoch 11/20 Iteration 1814/3560 Training loss: 1.5159 0.0653 sec/batch\n", + "Epoch 11/20 Iteration 1815/3560 Training loss: 1.5157 0.0665 sec/batch\n", + "Epoch 11/20 Iteration 1816/3560 Training loss: 1.5155 0.0686 sec/batch\n", + "Epoch 11/20 Iteration 1817/3560 Training loss: 
1.5148 0.0675 sec/batch\n", + "Epoch 11/20 Iteration 1818/3560 Training loss: 1.5138 0.0660 sec/batch\n", + "Epoch 11/20 Iteration 1819/3560 Training loss: 1.5124 0.0776 sec/batch\n", + "Epoch 11/20 Iteration 1820/3560 Training loss: 1.5117 0.0673 sec/batch\n", + "Epoch 11/20 Iteration 1821/3560 Training loss: 1.5114 0.0676 sec/batch\n", + "Epoch 11/20 Iteration 1822/3560 Training loss: 1.5122 0.0677 sec/batch\n", + "Epoch 11/20 Iteration 1823/3560 Training loss: 1.5116 0.0738 sec/batch\n", + "Epoch 11/20 Iteration 1824/3560 Training loss: 1.5108 0.0687 sec/batch\n", + "Epoch 11/20 Iteration 1825/3560 Training loss: 1.5110 0.0693 sec/batch\n", + "Epoch 11/20 Iteration 1826/3560 Training loss: 1.5104 0.0679 sec/batch\n", + "Epoch 11/20 Iteration 1827/3560 Training loss: 1.5101 0.0677 sec/batch\n", + "Epoch 11/20 Iteration 1828/3560 Training loss: 1.5095 0.0723 sec/batch\n", + "Epoch 11/20 Iteration 1829/3560 Training loss: 1.5095 0.0771 sec/batch\n", + "Epoch 11/20 Iteration 1830/3560 Training loss: 1.5101 0.0658 sec/batch\n", + "Epoch 11/20 Iteration 1831/3560 Training loss: 1.5097 0.0679 sec/batch\n", + "Epoch 11/20 Iteration 1832/3560 Training loss: 1.5104 0.0660 sec/batch\n", + "Epoch 11/20 Iteration 1833/3560 Training loss: 1.5102 0.0689 sec/batch\n", + "Epoch 11/20 Iteration 1834/3560 Training loss: 1.5104 0.0678 sec/batch\n", + "Epoch 11/20 Iteration 1835/3560 Training loss: 1.5100 0.0679 sec/batch\n", + "Epoch 11/20 Iteration 1836/3560 Training loss: 1.5101 0.0748 sec/batch\n", + "Epoch 11/20 Iteration 1837/3560 Training loss: 1.5106 0.0670 sec/batch\n", + "Epoch 11/20 Iteration 1838/3560 Training loss: 1.5103 0.0767 sec/batch\n", + "Epoch 11/20 Iteration 1839/3560 Training loss: 1.5097 0.0673 sec/batch\n", + "Epoch 11/20 Iteration 1840/3560 Training loss: 1.5102 0.0670 sec/batch\n", + "Epoch 11/20 Iteration 1841/3560 Training loss: 1.5103 0.0668 sec/batch\n", + "Epoch 11/20 Iteration 1842/3560 Training loss: 1.5112 0.0688 sec/batch\n", + "Epoch 11/20 
Iteration 1843/3560 Training loss: 1.5117 0.0695 sec/batch\n", + "Epoch 11/20 Iteration 1844/3560 Training loss: 1.5117 0.0692 sec/batch\n", + "Epoch 11/20 Iteration 1845/3560 Training loss: 1.5115 0.0710 sec/batch\n", + "Epoch 11/20 Iteration 1846/3560 Training loss: 1.5116 0.0685 sec/batch\n", + "Epoch 11/20 Iteration 1847/3560 Training loss: 1.5117 0.0724 sec/batch\n", + "Epoch 11/20 Iteration 1848/3560 Training loss: 1.5114 0.0678 sec/batch\n", + "Epoch 11/20 Iteration 1849/3560 Training loss: 1.5113 0.0856 sec/batch\n", + "Epoch 11/20 Iteration 1850/3560 Training loss: 1.5113 0.0691 sec/batch\n", + "Epoch 11/20 Iteration 1851/3560 Training loss: 1.5118 0.0671 sec/batch\n", + "Epoch 11/20 Iteration 1852/3560 Training loss: 1.5121 0.0693 sec/batch\n", + "Epoch 11/20 Iteration 1853/3560 Training loss: 1.5126 0.0664 sec/batch\n", + "Epoch 11/20 Iteration 1854/3560 Training loss: 1.5122 0.0668 sec/batch\n", + "Epoch 11/20 Iteration 1855/3560 Training loss: 1.5121 0.0724 sec/batch\n", + "Epoch 11/20 Iteration 1856/3560 Training loss: 1.5123 0.0736 sec/batch\n", + "Epoch 11/20 Iteration 1857/3560 Training loss: 1.5120 0.0732 sec/batch\n", + "Epoch 11/20 Iteration 1858/3560 Training loss: 1.5119 0.0714 sec/batch\n", + "Epoch 11/20 Iteration 1859/3560 Training loss: 1.5114 0.0784 sec/batch\n", + "Epoch 11/20 Iteration 1860/3560 Training loss: 1.5112 0.0684 sec/batch\n", + "Epoch 11/20 Iteration 1861/3560 Training loss: 1.5108 0.0705 sec/batch\n", + "Epoch 11/20 Iteration 1862/3560 Training loss: 1.5108 0.0675 sec/batch\n", + "Epoch 11/20 Iteration 1863/3560 Training loss: 1.5103 0.0667 sec/batch\n", + "Epoch 11/20 Iteration 1864/3560 Training loss: 1.5103 0.0665 sec/batch\n", + "Epoch 11/20 Iteration 1865/3560 Training loss: 1.5101 0.0665 sec/batch\n", + "Epoch 11/20 Iteration 1866/3560 Training loss: 1.5099 0.0679 sec/batch\n", + "Epoch 11/20 Iteration 1867/3560 Training loss: 1.5096 0.0742 sec/batch\n", + "Epoch 11/20 Iteration 1868/3560 Training loss: 1.5092 0.0667 
sec/batch\n", + "Epoch 11/20 Iteration 1869/3560 Training loss: 1.5087 0.0689 sec/batch\n", + "Epoch 11/20 Iteration 1870/3560 Training loss: 1.5089 0.0711 sec/batch\n", + "Epoch 11/20 Iteration 1871/3560 Training loss: 1.5085 0.0690 sec/batch\n", + "Epoch 11/20 Iteration 1872/3560 Training loss: 1.5083 0.0675 sec/batch\n", + "Epoch 11/20 Iteration 1873/3560 Training loss: 1.5079 0.0655 sec/batch\n", + "Epoch 11/20 Iteration 1874/3560 Training loss: 1.5076 0.0632 sec/batch\n", + "Epoch 11/20 Iteration 1875/3560 Training loss: 1.5071 0.0648 sec/batch\n", + "Epoch 11/20 Iteration 1876/3560 Training loss: 1.5072 0.0718 sec/batch\n", + "Epoch 11/20 Iteration 1877/3560 Training loss: 1.5072 0.0656 sec/batch\n", + "Epoch 11/20 Iteration 1878/3560 Training loss: 1.5068 0.0681 sec/batch\n", + "Epoch 11/20 Iteration 1879/3560 Training loss: 1.5064 0.0687 sec/batch\n", + "Epoch 11/20 Iteration 1880/3560 Training loss: 1.5060 0.0708 sec/batch\n", + "Epoch 11/20 Iteration 1881/3560 Training loss: 1.5059 0.0656 sec/batch\n", + "Epoch 11/20 Iteration 1882/3560 Training loss: 1.5057 0.0688 sec/batch\n", + "Epoch 11/20 Iteration 1883/3560 Training loss: 1.5055 0.0671 sec/batch\n", + "Epoch 11/20 Iteration 1884/3560 Training loss: 1.5053 0.0709 sec/batch\n", + "Epoch 11/20 Iteration 1885/3560 Training loss: 1.5050 0.0683 sec/batch\n", + "Epoch 11/20 Iteration 1886/3560 Training loss: 1.5048 0.0692 sec/batch\n", + "Epoch 11/20 Iteration 1887/3560 Training loss: 1.5046 0.0650 sec/batch\n", + "Epoch 11/20 Iteration 1888/3560 Training loss: 1.5046 0.0661 sec/batch\n", + "Epoch 11/20 Iteration 1889/3560 Training loss: 1.5045 0.0645 sec/batch\n", + "Epoch 11/20 Iteration 1890/3560 Training loss: 1.5046 0.0673 sec/batch\n", + "Epoch 11/20 Iteration 1891/3560 Training loss: 1.5044 0.0683 sec/batch\n", + "Epoch 11/20 Iteration 1892/3560 Training loss: 1.5043 0.0646 sec/batch\n", + "Epoch 11/20 Iteration 1893/3560 Training loss: 1.5042 0.0674 sec/batch\n", + "Epoch 11/20 Iteration 1894/3560 
Training loss: 1.5039 0.0673 sec/batch\n", + "Epoch 11/20 Iteration 1895/3560 Training loss: 1.5036 0.0677 sec/batch\n", + "Epoch 11/20 Iteration 1896/3560 Training loss: 1.5032 0.0780 sec/batch\n", + "Epoch 11/20 Iteration 1897/3560 Training loss: 1.5032 0.0676 sec/batch\n", + "Epoch 11/20 Iteration 1898/3560 Training loss: 1.5032 0.0735 sec/batch\n", + "Epoch 11/20 Iteration 1899/3560 Training loss: 1.5031 0.0673 sec/batch\n", + "Epoch 11/20 Iteration 1900/3560 Training loss: 1.5031 0.0660 sec/batch\n", + "Epoch 11/20 Iteration 1901/3560 Training loss: 1.5029 0.0726 sec/batch\n", + "Epoch 11/20 Iteration 1902/3560 Training loss: 1.5025 0.0667 sec/batch\n", + "Epoch 11/20 Iteration 1903/3560 Training loss: 1.5021 0.0683 sec/batch\n", + "Epoch 11/20 Iteration 1904/3560 Training loss: 1.5021 0.0675 sec/batch\n", + "Epoch 11/20 Iteration 1905/3560 Training loss: 1.5020 0.0705 sec/batch\n", + "Epoch 11/20 Iteration 1906/3560 Training loss: 1.5016 0.0799 sec/batch\n", + "Epoch 11/20 Iteration 1907/3560 Training loss: 1.5017 0.0664 sec/batch\n", + "Epoch 11/20 Iteration 1908/3560 Training loss: 1.5017 0.0684 sec/batch\n", + "Epoch 11/20 Iteration 1909/3560 Training loss: 1.5015 0.0719 sec/batch\n", + "Epoch 11/20 Iteration 1910/3560 Training loss: 1.5013 0.0715 sec/batch\n", + "Epoch 11/20 Iteration 1911/3560 Training loss: 1.5009 0.0669 sec/batch\n", + "Epoch 11/20 Iteration 1912/3560 Training loss: 1.5006 0.0703 sec/batch\n", + "Epoch 11/20 Iteration 1913/3560 Training loss: 1.5007 0.0696 sec/batch\n", + "Epoch 11/20 Iteration 1914/3560 Training loss: 1.5007 0.0704 sec/batch\n", + "Epoch 11/20 Iteration 1915/3560 Training loss: 1.5007 0.0725 sec/batch\n", + "Epoch 11/20 Iteration 1916/3560 Training loss: 1.5008 0.0799 sec/batch\n", + "Epoch 11/20 Iteration 1917/3560 Training loss: 1.5009 0.0679 sec/batch\n", + "Epoch 11/20 Iteration 1918/3560 Training loss: 1.5009 0.0712 sec/batch\n", + "Epoch 11/20 Iteration 1919/3560 Training loss: 1.5009 0.0685 sec/batch\n", + 
"Epoch 11/20 Iteration 1920/3560 Training loss: 1.5008 0.0719 sec/batch\n", + "Epoch 11/20 Iteration 1958/3560 Training loss: 1.5010 0.0651 sec/batch\n", + "Epoch 12/20 Iteration 1959/3560 Training loss: 1.5712 0.0641 sec/batch\n", + "Epoch 12/20 Iteration 2136/3560 Training loss: 1.4766 0.0679 sec/batch\n", + "Epoch 13/20 Iteration 2137/3560 Training loss: 1.5608 0.0722 sec/batch\n", + "Epoch 13/20 Iteration 2314/3560 Training loss: 1.4562 0.0704 sec/batch\n", + "Epoch 14/20 Iteration 2315/3560 Training loss: 1.5362 0.0683 sec/batch\n", + "Epoch 14/20 Iteration 2457/3560 Training loss: 
"Epoch 14/20 Iteration 2432/3560 Training loss: 1.4379 0.0706 sec/batch\n", + "Epoch 14/20 Iteration 2433/3560 Training loss: 1.4379 0.0682 sec/batch\n", + "Epoch 14/20 Iteration 2434/3560 Training loss: 1.4379 0.0681 sec/batch\n", + "Epoch 14/20 Iteration 2435/3560 Training loss: 1.4378 0.0679 sec/batch\n", + "Epoch 14/20 Iteration 2436/3560 Training loss: 1.4374 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2437/3560 Training loss: 1.4371 0.0717 sec/batch\n", + "Epoch 14/20 Iteration 2438/3560 Training loss: 1.4371 0.0698 sec/batch\n", + "Epoch 14/20 Iteration 2439/3560 Training loss: 1.4370 0.0725 sec/batch\n", + "Epoch 14/20 Iteration 2440/3560 Training loss: 1.4367 0.0710 sec/batch\n", + "Epoch 14/20 Iteration 2441/3560 Training loss: 1.4368 0.0750 sec/batch\n", + "Epoch 14/20 Iteration 2442/3560 Training loss: 1.4368 0.0707 sec/batch\n", + "Epoch 14/20 Iteration 2443/3560 Training loss: 1.4367 0.0733 sec/batch\n", + "Epoch 14/20 Iteration 2444/3560 Training loss: 1.4364 0.0688 sec/batch\n", + "Epoch 14/20 Iteration 2445/3560 Training loss: 1.4361 0.0685 sec/batch\n", + "Epoch 14/20 Iteration 2446/3560 Training loss: 1.4360 0.0667 sec/batch\n", + "Epoch 14/20 Iteration 2447/3560 Training loss: 1.4360 0.0665 sec/batch\n", + "Epoch 14/20 Iteration 2448/3560 Training loss: 1.4360 0.0720 sec/batch\n", + "Epoch 14/20 Iteration 2449/3560 Training loss: 1.4360 0.0687 sec/batch\n", + "Epoch 14/20 Iteration 2450/3560 Training loss: 1.4361 0.0681 sec/batch\n", + "Epoch 14/20 Iteration 2451/3560 Training loss: 1.4362 0.0665 sec/batch\n", + "Epoch 14/20 Iteration 2452/3560 Training loss: 1.4362 0.0730 sec/batch\n", + "Epoch 14/20 Iteration 2453/3560 Training loss: 1.4362 0.0661 sec/batch\n", + "Epoch 14/20 Iteration 2454/3560 Training loss: 1.4361 0.0718 sec/batch\n", + "Epoch 14/20 Iteration 2455/3560 Training loss: 1.4364 0.0661 sec/batch\n", + "Epoch 14/20 Iteration 2456/3560 Training loss: 1.4365 0.0740 sec/batch\n", + "Epoch 14/20 Iteration 2457/3560 Training loss: 
1.4364 0.0658 sec/batch\n", + "Epoch 14/20 Iteration 2458/3560 Training loss: 1.4365 0.0659 sec/batch\n", + "Epoch 14/20 Iteration 2459/3560 Training loss: 1.4363 0.0669 sec/batch\n", + "Epoch 14/20 Iteration 2460/3560 Training loss: 1.4365 0.0704 sec/batch\n", + "Epoch 14/20 Iteration 2461/3560 Training loss: 1.4365 0.0756 sec/batch\n", + "Epoch 14/20 Iteration 2462/3560 Training loss: 1.4367 0.0667 sec/batch\n", + "Epoch 14/20 Iteration 2463/3560 Training loss: 1.4369 0.0670 sec/batch\n", + "Epoch 14/20 Iteration 2464/3560 Training loss: 1.4368 0.0697 sec/batch\n", + "Epoch 14/20 Iteration 2465/3560 Training loss: 1.4364 0.0664 sec/batch\n", + "Epoch 14/20 Iteration 2466/3560 Training loss: 1.4363 0.0698 sec/batch\n", + "Epoch 14/20 Iteration 2467/3560 Training loss: 1.4363 0.0677 sec/batch\n", + "Epoch 14/20 Iteration 2468/3560 Training loss: 1.4363 0.0682 sec/batch\n", + "Epoch 14/20 Iteration 2469/3560 Training loss: 1.4363 0.0762 sec/batch\n", + "Epoch 14/20 Iteration 2470/3560 Training loss: 1.4362 0.0709 sec/batch\n", + "Epoch 14/20 Iteration 2471/3560 Training loss: 1.4363 0.0645 sec/batch\n", + "Epoch 14/20 Iteration 2472/3560 Training loss: 1.4362 0.0688 sec/batch\n", + "Epoch 14/20 Iteration 2473/3560 Training loss: 1.4360 0.0692 sec/batch\n", + "Epoch 14/20 Iteration 2474/3560 Training loss: 1.4361 0.0723 sec/batch\n", + "Epoch 14/20 Iteration 2475/3560 Training loss: 1.4362 0.0662 sec/batch\n", + "Epoch 14/20 Iteration 2476/3560 Training loss: 1.4362 0.0720 sec/batch\n", + "Epoch 14/20 Iteration 2477/3560 Training loss: 1.4362 0.0703 sec/batch\n", + "Epoch 14/20 Iteration 2478/3560 Training loss: 1.4362 0.0702 sec/batch\n", + "Epoch 14/20 Iteration 2479/3560 Training loss: 1.4362 0.0683 sec/batch\n", + "Epoch 14/20 Iteration 2480/3560 Training loss: 1.4362 0.0763 sec/batch\n", + "Epoch 14/20 Iteration 2481/3560 Training loss: 1.4364 0.0759 sec/batch\n", + "Epoch 14/20 Iteration 2482/3560 Training loss: 1.4368 0.0762 sec/batch\n", + "Epoch 14/20 
Iteration 2483/3560 Training loss: 1.4368 0.0762 sec/batch\n", + "Epoch 14/20 Iteration 2484/3560 Training loss: 1.4368 0.0688 sec/batch\n", + "Epoch 14/20 Iteration 2485/3560 Training loss: 1.4367 0.0702 sec/batch\n", + "Epoch 14/20 Iteration 2486/3560 Training loss: 1.4364 0.0677 sec/batch\n", + "Epoch 14/20 Iteration 2487/3560 Training loss: 1.4366 0.0689 sec/batch\n", + "Epoch 14/20 Iteration 2488/3560 Training loss: 1.4365 0.0654 sec/batch\n", + "Epoch 14/20 Iteration 2489/3560 Training loss: 1.4366 0.0729 sec/batch\n", + "Epoch 14/20 Iteration 2490/3560 Training loss: 1.4364 0.0689 sec/batch\n", + "Epoch 14/20 Iteration 2491/3560 Training loss: 1.4364 0.0757 sec/batch\n", + "Epoch 14/20 Iteration 2492/3560 Training loss: 1.4365 0.0696 sec/batch\n", + "Epoch 15/20 Iteration 2493/3560 Training loss: 1.5082 0.0677 sec/batch\n", + "Epoch 15/20 Iteration 2494/3560 Training loss: 1.4710 0.0739 sec/batch\n", + "Epoch 15/20 Iteration 2495/3560 Training loss: 1.4601 0.0697 sec/batch\n", + "Epoch 15/20 Iteration 2496/3560 Training loss: 1.4562 0.0667 sec/batch\n", + "Epoch 15/20 Iteration 2497/3560 Training loss: 1.4499 0.0661 sec/batch\n", + "Epoch 15/20 Iteration 2498/3560 Training loss: 1.4381 0.0661 sec/batch\n", + "Epoch 15/20 Iteration 2499/3560 Training loss: 1.4388 0.0692 sec/batch\n", + "Epoch 15/20 Iteration 2500/3560 Training loss: 1.4358 0.0708 sec/batch\n", + "Epoch 15/20 Iteration 2501/3560 Training loss: 1.4366 0.0865 sec/batch\n", + "Epoch 15/20 Iteration 2502/3560 Training loss: 1.4357 0.0736 sec/batch\n", + "Epoch 15/20 Iteration 2503/3560 Training loss: 1.4329 0.0659 sec/batch\n", + "Epoch 15/20 Iteration 2504/3560 Training loss: 1.4319 0.0676 sec/batch\n", + "Epoch 15/20 Iteration 2505/3560 Training loss: 1.4305 0.0773 sec/batch\n", + "Epoch 15/20 Iteration 2506/3560 Training loss: 1.4327 0.0716 sec/batch\n", + "Epoch 15/20 Iteration 2507/3560 Training loss: 1.4317 0.0699 sec/batch\n", + "Epoch 15/20 Iteration 2508/3560 Training loss: 1.4298 0.0691 
sec/batch\n", + "Epoch 15/20 Iteration 2509/3560 Training loss: 1.4302 0.0824 sec/batch\n", + "Epoch 15/20 Iteration 2510/3560 Training loss: 1.4317 0.0672 sec/batch\n", + "Epoch 15/20 Iteration 2511/3560 Training loss: 1.4316 0.0673 sec/batch\n", + "Epoch 15/20 Iteration 2512/3560 Training loss: 1.4328 0.0693 sec/batch\n", + "Epoch 15/20 Iteration 2513/3560 Training loss: 1.4323 0.0695 sec/batch\n", + "Epoch 15/20 Iteration 2514/3560 Training loss: 1.4330 0.0696 sec/batch\n", + "Epoch 15/20 Iteration 2515/3560 Training loss: 1.4322 0.0673 sec/batch\n", + "Epoch 15/20 Iteration 2516/3560 Training loss: 1.4324 0.0700 sec/batch\n", + "Epoch 15/20 Iteration 2517/3560 Training loss: 1.4325 0.0696 sec/batch\n", + "Epoch 15/20 Iteration 2518/3560 Training loss: 1.4308 0.0686 sec/batch\n", + "Epoch 15/20 Iteration 2519/3560 Training loss: 1.4298 0.0676 sec/batch\n", + "Epoch 15/20 Iteration 2520/3560 Training loss: 1.4306 0.0697 sec/batch\n", + "Epoch 15/20 Iteration 2521/3560 Training loss: 1.4312 0.0771 sec/batch\n", + "Epoch 15/20 Iteration 2522/3560 Training loss: 1.4315 0.0681 sec/batch\n", + "Epoch 15/20 Iteration 2523/3560 Training loss: 1.4310 0.0684 sec/batch\n", + "Epoch 15/20 Iteration 2524/3560 Training loss: 1.4304 0.0663 sec/batch\n", + "Epoch 15/20 Iteration 2525/3560 Training loss: 1.4308 0.0706 sec/batch\n", + "Epoch 15/20 Iteration 2526/3560 Training loss: 1.4311 0.0782 sec/batch\n", + "Epoch 15/20 Iteration 2527/3560 Training loss: 1.4310 0.0687 sec/batch\n", + "Epoch 15/20 Iteration 2528/3560 Training loss: 1.4310 0.0707 sec/batch\n", + "Epoch 15/20 Iteration 2529/3560 Training loss: 1.4303 0.0759 sec/batch\n", + "Epoch 15/20 Iteration 2530/3560 Training loss: 1.4294 0.0724 sec/batch\n", + "Epoch 15/20 Iteration 2531/3560 Training loss: 1.4282 0.0687 sec/batch\n", + "Epoch 15/20 Iteration 2532/3560 Training loss: 1.4278 0.0689 sec/batch\n", + "Epoch 15/20 Iteration 2533/3560 Training loss: 1.4273 0.0718 sec/batch\n", + "Epoch 15/20 Iteration 2534/3560 
Training loss: 1.4280 0.0661 sec/batch\n", + "Epoch 15/20 Iteration 2535/3560 Training loss: 1.4275 0.0665 sec/batch\n", + "Epoch 15/20 Iteration 2536/3560 Training loss: 1.4269 0.0726 sec/batch\n", + "Epoch 15/20 Iteration 2537/3560 Training loss: 1.4271 0.0681 sec/batch\n", + "Epoch 15/20 Iteration 2538/3560 Training loss: 1.4262 0.0648 sec/batch\n", + "Epoch 15/20 Iteration 2539/3560 Training loss: 1.4259 0.0690 sec/batch\n", + "Epoch 15/20 Iteration 2540/3560 Training loss: 1.4254 0.0702 sec/batch\n", + "Epoch 15/20 Iteration 2541/3560 Training loss: 1.4252 0.0666 sec/batch\n", + "Epoch 15/20 Iteration 2542/3560 Training loss: 1.4257 0.0723 sec/batch\n", + "Epoch 15/20 Iteration 2543/3560 Training loss: 1.4253 0.0662 sec/batch\n", + "Epoch 15/20 Iteration 2544/3560 Training loss: 1.4261 0.0674 sec/batch\n", + "Epoch 15/20 Iteration 2545/3560 Training loss: 1.4261 0.0738 sec/batch\n", + "Epoch 15/20 Iteration 2546/3560 Training loss: 1.4264 0.0682 sec/batch\n", + "Epoch 15/20 Iteration 2547/3560 Training loss: 1.4261 0.0660 sec/batch\n", + "Epoch 15/20 Iteration 2548/3560 Training loss: 1.4263 0.0681 sec/batch\n", + "Epoch 15/20 Iteration 2549/3560 Training loss: 1.4267 0.0677 sec/batch\n", + "Epoch 15/20 Iteration 2550/3560 Training loss: 1.4265 0.0683 sec/batch\n", + "Epoch 15/20 Iteration 2551/3560 Training loss: 1.4261 0.0678 sec/batch\n", + "Epoch 15/20 Iteration 2552/3560 Training loss: 1.4269 0.0642 sec/batch\n", + "Epoch 15/20 Iteration 2553/3560 Training loss: 1.4267 0.0676 sec/batch\n", + "Epoch 15/20 Iteration 2554/3560 Training loss: 1.4276 0.0699 sec/batch\n", + "Epoch 15/20 Iteration 2555/3560 Training loss: 1.4281 0.0680 sec/batch\n", + "Epoch 15/20 Iteration 2556/3560 Training loss: 1.4282 0.0748 sec/batch\n", + "Epoch 15/20 Iteration 2557/3560 Training loss: 1.4282 0.0679 sec/batch\n", + "Epoch 15/20 Iteration 2558/3560 Training loss: 1.4284 0.0688 sec/batch\n", + "Epoch 15/20 Iteration 2559/3560 Training loss: 1.4286 0.0739 sec/batch\n", + 
"Epoch 15/20 Iteration 2560/3560 Training loss: 1.4284 0.0748 sec/batch\n", + "Epoch 15/20 Iteration 2561/3560 Training loss: 1.4284 0.0707 sec/batch\n", + "Epoch 15/20 Iteration 2562/3560 Training loss: 1.4284 0.0691 sec/batch\n", + "Epoch 15/20 Iteration 2563/3560 Training loss: 1.4289 0.0657 sec/batch\n", + "Epoch 15/20 Iteration 2564/3560 Training loss: 1.4292 0.0684 sec/batch\n", + "Epoch 15/20 Iteration 2565/3560 Training loss: 1.4297 0.0697 sec/batch\n", + "Epoch 15/20 Iteration 2566/3560 Training loss: 1.4294 0.0687 sec/batch\n", + "Epoch 15/20 Iteration 2567/3560 Training loss: 1.4295 0.0659 sec/batch\n", + "Epoch 15/20 Iteration 2568/3560 Training loss: 1.4296 0.0678 sec/batch\n", + "Epoch 15/20 Iteration 2569/3560 Training loss: 1.4295 0.0742 sec/batch\n", + "Epoch 15/20 Iteration 2570/3560 Training loss: 1.4295 0.0685 sec/batch\n", + "Epoch 15/20 Iteration 2571/3560 Training loss: 1.4288 0.0746 sec/batch\n", + "Epoch 15/20 Iteration 2572/3560 Training loss: 1.4287 0.0758 sec/batch\n", + "Epoch 15/20 Iteration 2573/3560 Training loss: 1.4282 0.0690 sec/batch\n", + "Epoch 15/20 Iteration 2574/3560 Training loss: 1.4282 0.0682 sec/batch\n", + "Epoch 15/20 Iteration 2575/3560 Training loss: 1.4278 0.0682 sec/batch\n", + "Epoch 15/20 Iteration 2576/3560 Training loss: 1.4277 0.0826 sec/batch\n", + "Epoch 15/20 Iteration 2577/3560 Training loss: 1.4274 0.0649 sec/batch\n", + "Epoch 15/20 Iteration 2578/3560 Training loss: 1.4273 0.0676 sec/batch\n", + "Epoch 15/20 Iteration 2579/3560 Training loss: 1.4271 0.0684 sec/batch\n", + "Epoch 15/20 Iteration 2580/3560 Training loss: 1.4269 0.0683 sec/batch\n", + "Epoch 15/20 Iteration 2581/3560 Training loss: 1.4264 0.0690 sec/batch\n", + "Epoch 15/20 Iteration 2582/3560 Training loss: 1.4267 0.0697 sec/batch\n", + "Epoch 15/20 Iteration 2583/3560 Training loss: 1.4265 0.0714 sec/batch\n", + "Epoch 15/20 Iteration 2584/3560 Training loss: 1.4264 0.0795 sec/batch\n", + "Epoch 15/20 Iteration 2585/3560 Training loss: 
1.4260 0.0671 sec/batch\n", + "Epoch 15/20 Iteration 2586/3560 Training loss: 1.4257 0.0682 sec/batch\n", + "Epoch 15/20 Iteration 2587/3560 Training loss: 1.4254 0.0656 sec/batch\n", + "Epoch 15/20 Iteration 2588/3560 Training loss: 1.4255 0.0692 sec/batch\n", + "Epoch 15/20 Iteration 2589/3560 Training loss: 1.4255 0.0666 sec/batch\n", + "Epoch 15/20 Iteration 2590/3560 Training loss: 1.4251 0.0675 sec/batch\n", + "Epoch 15/20 Iteration 2591/3560 Training loss: 1.4246 0.0696 sec/batch\n", + "Epoch 15/20 Iteration 2592/3560 Training loss: 1.4242 0.0714 sec/batch\n", + "Epoch 15/20 Iteration 2593/3560 Training loss: 1.4242 0.0722 sec/batch\n", + "Epoch 15/20 Iteration 2594/3560 Training loss: 1.4240 0.0736 sec/batch\n", + "Epoch 15/20 Iteration 2595/3560 Training loss: 1.4239 0.0705 sec/batch\n", + "Epoch 15/20 Iteration 2596/3560 Training loss: 1.4237 0.0675 sec/batch\n", + "Epoch 15/20 Iteration 2597/3560 Training loss: 1.4236 0.0804 sec/batch\n", + "Epoch 15/20 Iteration 2598/3560 Training loss: 1.4234 0.0698 sec/batch\n", + "Epoch 15/20 Iteration 2599/3560 Training loss: 1.4235 0.0739 sec/batch\n", + "Epoch 15/20 Iteration 2600/3560 Training loss: 1.4234 0.0714 sec/batch\n", + "Epoch 15/20 Iteration 2601/3560 Training loss: 1.4234 0.0762 sec/batch\n", + "Epoch 15/20 Iteration 2602/3560 Training loss: 1.4235 0.0678 sec/batch\n", + "Epoch 15/20 Iteration 2603/3560 Training loss: 1.4234 0.0658 sec/batch\n", + "Epoch 15/20 Iteration 2604/3560 Training loss: 1.4234 0.0713 sec/batch\n", + "Epoch 15/20 Iteration 2605/3560 Training loss: 1.4232 0.0692 sec/batch\n", + "Epoch 15/20 Iteration 2606/3560 Training loss: 1.4231 0.0703 sec/batch\n", + "Epoch 15/20 Iteration 2607/3560 Training loss: 1.4228 0.0718 sec/batch\n", + "Epoch 15/20 Iteration 2608/3560 Training loss: 1.4224 0.0700 sec/batch\n", + "Epoch 15/20 Iteration 2609/3560 Training loss: 1.4224 0.0691 sec/batch\n", + "Epoch 15/20 Iteration 2610/3560 Training loss: 1.4223 0.0715 sec/batch\n", + "Epoch 15/20 
Iteration 2611/3560 Training loss: 1.4222 0.0687 sec/batch\n", + "Epoch 15/20 Iteration 2612/3560 Training loss: 1.4222 0.0656 sec/batch\n", + "Epoch 15/20 Iteration 2613/3560 Training loss: 1.4221 0.0663 sec/batch\n", + "Epoch 15/20 Iteration 2614/3560 Training loss: 1.4218 0.0676 sec/batch\n", + "Epoch 15/20 Iteration 2615/3560 Training loss: 1.4214 0.0766 sec/batch\n", + "Epoch 15/20 Iteration 2616/3560 Training loss: 1.4215 0.0725 sec/batch\n", + "Epoch 15/20 Iteration 2617/3560 Training loss: 1.4215 0.0737 sec/batch\n", + "Epoch 15/20 Iteration 2618/3560 Training loss: 1.4211 0.0717 sec/batch\n", + "Epoch 15/20 Iteration 2619/3560 Training loss: 1.4212 0.0763 sec/batch\n", + "Epoch 15/20 Iteration 2620/3560 Training loss: 1.4213 0.0749 sec/batch\n", + "Epoch 15/20 Iteration 2621/3560 Training loss: 1.4211 0.0702 sec/batch\n", + "Epoch 15/20 Iteration 2622/3560 Training loss: 1.4209 0.0650 sec/batch\n", + "Epoch 15/20 Iteration 2623/3560 Training loss: 1.4204 0.0720 sec/batch\n", + "Epoch 15/20 Iteration 2624/3560 Training loss: 1.4202 0.0764 sec/batch\n", + "Epoch 15/20 Iteration 2625/3560 Training loss: 1.4203 0.0708 sec/batch\n", + "Epoch 15/20 Iteration 2626/3560 Training loss: 1.4203 0.0658 sec/batch\n", + "Epoch 15/20 Iteration 2627/3560 Training loss: 1.4203 0.0677 sec/batch\n", + "Epoch 15/20 Iteration 2628/3560 Training loss: 1.4204 0.0684 sec/batch\n", + "Epoch 15/20 Iteration 2629/3560 Training loss: 1.4206 0.0672 sec/batch\n", + "Epoch 15/20 Iteration 2630/3560 Training loss: 1.4206 0.0674 sec/batch\n", + "Epoch 15/20 Iteration 2631/3560 Training loss: 1.4206 0.0679 sec/batch\n", + "Epoch 15/20 Iteration 2632/3560 Training loss: 1.4206 0.0709 sec/batch\n", + "Epoch 15/20 Iteration 2633/3560 Training loss: 1.4209 0.0698 sec/batch\n", + "Epoch 15/20 Iteration 2634/3560 Training loss: 1.4209 0.0712 sec/batch\n", + "Epoch 15/20 Iteration 2635/3560 Training loss: 1.4208 0.0786 sec/batch\n", + "Epoch 15/20 Iteration 2636/3560 Training loss: 1.4209 0.0657 
sec/batch\n", + "Epoch 15/20 Iteration 2637/3560 Training loss: 1.4208 0.0720 sec/batch\n", + "Epoch 15/20 Iteration 2638/3560 Training loss: 1.4210 0.0673 sec/batch\n", + "Epoch 15/20 Iteration 2639/3560 Training loss: 1.4210 0.0724 sec/batch\n", + "Epoch 15/20 Iteration 2640/3560 Training loss: 1.4212 0.0673 sec/batch\n", + "Epoch 15/20 Iteration 2641/3560 Training loss: 1.4214 0.0683 sec/batch\n", + "Epoch 15/20 Iteration 2642/3560 Training loss: 1.4212 0.0741 sec/batch\n", + "Epoch 15/20 Iteration 2643/3560 Training loss: 1.4210 0.0707 sec/batch\n", + "Epoch 15/20 Iteration 2644/3560 Training loss: 1.4209 0.0703 sec/batch\n", + "Epoch 15/20 Iteration 2645/3560 Training loss: 1.4210 0.0692 sec/batch\n", + "Epoch 15/20 Iteration 2646/3560 Training loss: 1.4209 0.0733 sec/batch\n", + "Epoch 15/20 Iteration 2647/3560 Training loss: 1.4210 0.0772 sec/batch\n", + "Epoch 15/20 Iteration 2648/3560 Training loss: 1.4209 0.0664 sec/batch\n", + "Epoch 15/20 Iteration 2649/3560 Training loss: 1.4209 0.0694 sec/batch\n", + "Epoch 15/20 Iteration 2650/3560 Training loss: 1.4208 0.0699 sec/batch\n", + "Epoch 15/20 Iteration 2651/3560 Training loss: 1.4206 0.0739 sec/batch\n", + "Epoch 15/20 Iteration 2652/3560 Training loss: 1.4207 0.0695 sec/batch\n", + "Epoch 15/20 Iteration 2653/3560 Training loss: 1.4209 0.0726 sec/batch\n", + "Epoch 15/20 Iteration 2654/3560 Training loss: 1.4209 0.0761 sec/batch\n", + "Epoch 15/20 Iteration 2655/3560 Training loss: 1.4208 0.0786 sec/batch\n", + "Epoch 15/20 Iteration 2656/3560 Training loss: 1.4208 0.0726 sec/batch\n", + "Epoch 15/20 Iteration 2657/3560 Training loss: 1.4208 0.0667 sec/batch\n", + "Epoch 15/20 Iteration 2658/3560 Training loss: 1.4207 0.0674 sec/batch\n", + "Epoch 15/20 Iteration 2659/3560 Training loss: 1.4209 0.0684 sec/batch\n", + "Epoch 15/20 Iteration 2660/3560 Training loss: 1.4213 0.0731 sec/batch\n", + "Epoch 15/20 Iteration 2661/3560 Training loss: 1.4214 0.0697 sec/batch\n", + "Epoch 15/20 Iteration 2662/3560 
Training loss: 1.4213 0.0789 sec/batch\n", + "Epoch 15/20 Iteration 2663/3560 Training loss: 1.4213 0.0723 sec/batch\n", + "Epoch 15/20 Iteration 2664/3560 Training loss: 1.4211 0.0685 sec/batch\n", + "Epoch 15/20 Iteration 2665/3560 Training loss: 1.4213 0.0676 sec/batch\n", + "Epoch 15/20 Iteration 2666/3560 Training loss: 1.4213 0.0712 sec/batch\n", + "Epoch 15/20 Iteration 2667/3560 Training loss: 1.4213 0.0708 sec/batch\n", + "Epoch 15/20 Iteration 2668/3560 Training loss: 1.4212 0.0705 sec/batch\n", + "Epoch 15/20 Iteration 2669/3560 Training loss: 1.4211 0.0686 sec/batch\n", + "Epoch 15/20 Iteration 2670/3560 Training loss: 1.4212 0.0682 sec/batch\n", + "Epoch 16/20 Iteration 2671/3560 Training loss: 1.4956 0.0681 sec/batch\n", + "Epoch 16/20 Iteration 2672/3560 Training loss: 1.4542 0.0690 sec/batch\n", + "Epoch 16/20 Iteration 2673/3560 Training loss: 1.4400 0.0658 sec/batch\n", + "Epoch 16/20 Iteration 2674/3560 Training loss: 1.4386 0.0691 sec/batch\n", + "Epoch 16/20 Iteration 2675/3560 Training loss: 1.4301 0.0731 sec/batch\n", + "Epoch 16/20 Iteration 2676/3560 Training loss: 1.4199 0.0691 sec/batch\n", + "Epoch 16/20 Iteration 2677/3560 Training loss: 1.4211 0.0709 sec/batch\n", + "Epoch 16/20 Iteration 2678/3560 Training loss: 1.4183 0.0701 sec/batch\n", + "Epoch 16/20 Iteration 2679/3560 Training loss: 1.4195 0.0721 sec/batch\n", + "Epoch 16/20 Iteration 2680/3560 Training loss: 1.4191 0.0751 sec/batch\n", + "Epoch 16/20 Iteration 2681/3560 Training loss: 1.4163 0.0691 sec/batch\n", + "Epoch 16/20 Iteration 2682/3560 Training loss: 1.4158 0.0725 sec/batch\n", + "Epoch 16/20 Iteration 2683/3560 Training loss: 1.4152 0.0682 sec/batch\n", + "Epoch 16/20 Iteration 2684/3560 Training loss: 1.4175 0.0786 sec/batch\n", + "Epoch 16/20 Iteration 2685/3560 Training loss: 1.4163 0.0706 sec/batch\n", + "Epoch 16/20 Iteration 2686/3560 Training loss: 1.4145 0.0735 sec/batch\n", + "Epoch 16/20 Iteration 2687/3560 Training loss: 1.4154 0.0666 sec/batch\n", + 
"Epoch 16/20 Iteration 2688/3560 Training loss: 1.4167 0.0724 sec/batch\n", + "Epoch 16/20 Iteration 2689/3560 Training loss: 1.4164 0.0686 sec/batch\n", + "Epoch 16/20 Iteration 2690/3560 Training loss: 1.4175 0.0780 sec/batch\n", + "Epoch 16/20 Iteration 2691/3560 Training loss: 1.4167 0.0698 sec/batch\n", + "Epoch 16/20 Iteration 2692/3560 Training loss: 1.4174 0.0710 sec/batch\n", + "Epoch 16/20 Iteration 2693/3560 Training loss: 1.4171 0.0650 sec/batch\n", + "Epoch 16/20 Iteration 2694/3560 Training loss: 1.4175 0.0715 sec/batch\n", + "Epoch 16/20 Iteration 2695/3560 Training loss: 1.4177 0.0743 sec/batch\n", + "Epoch 16/20 Iteration 2696/3560 Training loss: 1.4163 0.0677 sec/batch\n", + "Epoch 16/20 Iteration 2697/3560 Training loss: 1.4150 0.0677 sec/batch\n", + "Epoch 16/20 Iteration 2698/3560 Training loss: 1.4156 0.0672 sec/batch\n", + "Epoch 16/20 Iteration 2699/3560 Training loss: 1.4161 0.0673 sec/batch\n", + "Epoch 16/20 Iteration 2700/3560 Training loss: 1.4164 0.0670 sec/batch\n", + "Epoch 16/20 Iteration 2701/3560 Training loss: 1.4160 0.0671 sec/batch\n", + "Epoch 16/20 Iteration 2702/3560 Training loss: 1.4150 0.0795 sec/batch\n", + "Epoch 16/20 Iteration 2703/3560 Training loss: 1.4156 0.0697 sec/batch\n", + "Epoch 16/20 Iteration 2704/3560 Training loss: 1.4159 0.0689 sec/batch\n", + "Epoch 16/20 Iteration 2705/3560 Training loss: 1.4157 0.0681 sec/batch\n", + "Epoch 16/20 Iteration 2706/3560 Training loss: 1.4156 0.0677 sec/batch\n", + "Epoch 16/20 Iteration 2707/3560 Training loss: 1.4149 0.0680 sec/batch\n", + "Epoch 16/20 Iteration 2708/3560 Training loss: 1.4135 0.0668 sec/batch\n", + "Epoch 16/20 Iteration 2709/3560 Training loss: 1.4122 0.0685 sec/batch\n", + "Epoch 16/20 Iteration 2710/3560 Training loss: 1.4117 0.0671 sec/batch\n", + "Epoch 16/20 Iteration 2711/3560 Training loss: 1.4112 0.0693 sec/batch\n", + "Epoch 16/20 Iteration 2712/3560 Training loss: 1.4121 0.0722 sec/batch\n", + "Epoch 16/20 Iteration 2713/3560 Training loss: 
1.4116 0.0672 sec/batch\n", + "Epoch 16/20 Iteration 2714/3560 Training loss: 1.4107 0.0758 sec/batch\n", + "Epoch 16/20 Iteration 2715/3560 Training loss: 1.4108 0.0656 sec/batch\n", + "Epoch 16/20 Iteration 2716/3560 Training loss: 1.4102 0.0711 sec/batch\n", + "Epoch 16/20 Iteration 2717/3560 Training loss: 1.4103 0.0746 sec/batch\n", + "Epoch 16/20 Iteration 2718/3560 Training loss: 1.4099 0.0684 sec/batch\n", + "Epoch 16/20 Iteration 2719/3560 Training loss: 1.4099 0.0677 sec/batch\n", + "Epoch 16/20 Iteration 2720/3560 Training loss: 1.4102 0.0690 sec/batch\n", + "Epoch 16/20 Iteration 2721/3560 Training loss: 1.4100 0.0750 sec/batch\n", + "Epoch 16/20 Iteration 2722/3560 Training loss: 1.4108 0.0718 sec/batch\n", + "Epoch 16/20 Iteration 2723/3560 Training loss: 1.4107 0.0691 sec/batch\n", + "Epoch 16/20 Iteration 2724/3560 Training loss: 1.4110 0.0690 sec/batch\n", + "Epoch 16/20 Iteration 2725/3560 Training loss: 1.4109 0.0663 sec/batch\n", + "Epoch 16/20 Iteration 2726/3560 Training loss: 1.4111 0.0669 sec/batch\n", + "Epoch 16/20 Iteration 2727/3560 Training loss: 1.4115 0.0660 sec/batch\n", + "Epoch 16/20 Iteration 2728/3560 Training loss: 1.4112 0.0652 sec/batch\n", + "Epoch 16/20 Iteration 2729/3560 Training loss: 1.4108 0.0658 sec/batch\n", + "Epoch 16/20 Iteration 2730/3560 Training loss: 1.4113 0.0647 sec/batch\n", + "Epoch 16/20 Iteration 2731/3560 Training loss: 1.4114 0.0645 sec/batch\n", + "Epoch 16/20 Iteration 2732/3560 Training loss: 1.4123 0.0654 sec/batch\n", + "Epoch 16/20 Iteration 2733/3560 Training loss: 1.4127 0.0748 sec/batch\n", + "Epoch 16/20 Iteration 2734/3560 Training loss: 1.4127 0.0800 sec/batch\n", + "Epoch 16/20 Iteration 2735/3560 Training loss: 1.4128 0.0687 sec/batch\n", + "Epoch 16/20 Iteration 2736/3560 Training loss: 1.4129 0.0793 sec/batch\n", + "Epoch 16/20 Iteration 2737/3560 Training loss: 1.4131 0.0753 sec/batch\n", + "Epoch 16/20 Iteration 2738/3560 Training loss: 1.4128 0.0696 sec/batch\n", + "Epoch 16/20 
Iteration 2739/3560 Training loss: 1.4129 0.0664 sec/batch\n", + "Epoch 16/20 Iteration 2740/3560 Training loss: 1.4129 0.0671 sec/batch\n", + "Epoch 16/20 Iteration 2741/3560 Training loss: 1.4133 0.0798 sec/batch\n", + "Epoch 16/20 Iteration 2742/3560 Training loss: 1.4136 0.0769 sec/batch\n", + "Epoch 16/20 Iteration 2743/3560 Training loss: 1.4140 0.0668 sec/batch\n", + "Epoch 16/20 Iteration 2744/3560 Training loss: 1.4136 0.0697 sec/batch\n", + "Epoch 16/20 Iteration 2745/3560 Training loss: 1.4136 0.0693 sec/batch\n", + "Epoch 16/20 Iteration 2746/3560 Training loss: 1.4137 0.0672 sec/batch\n", + "Epoch 16/20 Iteration 2747/3560 Training loss: 1.4136 0.0662 sec/batch\n", + "Epoch 16/20 Iteration 2748/3560 Training loss: 1.4134 0.0650 sec/batch\n", + "Epoch 16/20 Iteration 2749/3560 Training loss: 1.4128 0.0685 sec/batch\n", + "Epoch 16/20 Iteration 2750/3560 Training loss: 1.4128 0.0688 sec/batch\n", + "Epoch 16/20 Iteration 2751/3560 Training loss: 1.4123 0.0711 sec/batch\n", + "Epoch 16/20 Iteration 2752/3560 Training loss: 1.4122 0.0701 sec/batch\n", + "Epoch 16/20 Iteration 2753/3560 Training loss: 1.4117 0.0697 sec/batch\n", + "Epoch 16/20 Iteration 2754/3560 Training loss: 1.4116 0.0682 sec/batch\n", + "Epoch 16/20 Iteration 2755/3560 Training loss: 1.4114 0.0722 sec/batch\n", + "Epoch 16/20 Iteration 2756/3560 Training loss: 1.4112 0.0700 sec/batch\n", + "Epoch 16/20 Iteration 2757/3560 Training loss: 1.4110 0.0667 sec/batch\n", + "Epoch 16/20 Iteration 2758/3560 Training loss: 1.4107 0.0681 sec/batch\n", + "Epoch 16/20 Iteration 2759/3560 Training loss: 1.4103 0.0669 sec/batch\n", + "Epoch 16/20 Iteration 2760/3560 Training loss: 1.4104 0.0665 sec/batch\n", + "Epoch 16/20 Iteration 2761/3560 Training loss: 1.4102 0.0692 sec/batch\n", + "Epoch 16/20 Iteration 2762/3560 Training loss: 1.4101 0.0716 sec/batch\n", + "Epoch 16/20 Iteration 2763/3560 Training loss: 1.4098 0.0698 sec/batch\n", + "Epoch 16/20 Iteration 2764/3560 Training loss: 1.4096 0.0658 
sec/batch\n", + "Epoch 16/20 Iteration 2765/3560 Training loss: 1.4092 0.0658 sec/batch\n", + "Epoch 16/20 Iteration 2766/3560 Training loss: 1.4094 0.0666 sec/batch\n", + "Epoch 16/20 Iteration 2767/3560 Training loss: 1.4092 0.0661 sec/batch\n", + "Epoch 16/20 Iteration 2768/3560 Training loss: 1.4089 0.0669 sec/batch\n", + "Epoch 16/20 Iteration 2769/3560 Training loss: 1.4084 0.0668 sec/batch\n", + "Epoch 16/20 Iteration 2770/3560 Training loss: 1.4081 0.0676 sec/batch\n", + "Epoch 16/20 Iteration 2771/3560 Training loss: 1.4081 0.0727 sec/batch\n", + "Epoch 16/20 Iteration 2772/3560 Training loss: 1.4080 0.0750 sec/batch\n", + "Epoch 16/20 Iteration 2773/3560 Training loss: 1.4080 0.0703 sec/batch\n", + "Epoch 16/20 Iteration 2774/3560 Training loss: 1.4079 0.0851 sec/batch\n", + "Epoch 16/20 Iteration 2775/3560 Training loss: 1.4076 0.0665 sec/batch\n", + "Epoch 16/20 Iteration 2776/3560 Training loss: 1.4076 0.0727 sec/batch\n", + "Epoch 16/20 Iteration 2777/3560 Training loss: 1.4075 0.0692 sec/batch\n", + "Epoch 16/20 Iteration 2778/3560 Training loss: 1.4075 0.0680 sec/batch\n", + "Epoch 16/20 Iteration 2779/3560 Training loss: 1.4075 0.0697 sec/batch\n", + "Epoch 16/20 Iteration 2780/3560 Training loss: 1.4076 0.0649 sec/batch\n", + "Epoch 16/20 Iteration 2781/3560 Training loss: 1.4074 0.0649 sec/batch\n", + "Epoch 16/20 Iteration 2782/3560 Training loss: 1.4074 0.0733 sec/batch\n", + "Epoch 16/20 Iteration 2783/3560 Training loss: 1.4073 0.0668 sec/batch\n", + "Epoch 16/20 Iteration 2784/3560 Training loss: 1.4071 0.0722 sec/batch\n", + "Epoch 16/20 Iteration 2785/3560 Training loss: 1.4068 0.0660 sec/batch\n", + "Epoch 16/20 Iteration 2786/3560 Training loss: 1.4065 0.0691 sec/batch\n", + "Epoch 16/20 Iteration 2787/3560 Training loss: 1.4064 0.0700 sec/batch\n", + "Epoch 16/20 Iteration 2788/3560 Training loss: 1.4065 0.0673 sec/batch\n", + "Epoch 16/20 Iteration 2789/3560 Training loss: 1.4065 0.0666 sec/batch\n", + "Epoch 16/20 Iteration 2790/3560 
Training loss: 1.4064 0.0738 sec/batch\n", + "[training log condensed: Epoch 16/20 through Epoch 19/20, iterations 2791-3301, training loss decreasing from ~1.406 to ~1.371 at roughly 0.07 sec/batch]\n", + "Epoch 19/20 Iteration 3302/3560 
Training loss: 1.3708 0.0737 sec/batch\n", + "Epoch 19/20 Iteration 3303/3560 Training loss: 1.3704 0.0718 sec/batch\n", + "Epoch 19/20 Iteration 3304/3560 Training loss: 1.3701 0.0700 sec/batch\n", + "Epoch 19/20 Iteration 3305/3560 Training loss: 1.3701 0.0680 sec/batch\n", + "Epoch 19/20 Iteration 3306/3560 Training loss: 1.3698 0.0673 sec/batch\n", + "Epoch 19/20 Iteration 3307/3560 Training loss: 1.3697 0.0703 sec/batch\n", + "Epoch 19/20 Iteration 3308/3560 Training loss: 1.3695 0.0722 sec/batch\n", + "Epoch 19/20 Iteration 3309/3560 Training loss: 1.3694 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3310/3560 Training loss: 1.3692 0.0676 sec/batch\n", + "Epoch 19/20 Iteration 3311/3560 Training loss: 1.3691 0.0739 sec/batch\n", + "Epoch 19/20 Iteration 3312/3560 Training loss: 1.3691 0.0708 sec/batch\n", + "Epoch 19/20 Iteration 3313/3560 Training loss: 1.3691 0.0689 sec/batch\n", + "Epoch 19/20 Iteration 3314/3560 Training loss: 1.3691 0.0675 sec/batch\n", + "Epoch 19/20 Iteration 3315/3560 Training loss: 1.3689 0.0712 sec/batch\n", + "Epoch 19/20 Iteration 3316/3560 Training loss: 1.3690 0.0684 sec/batch\n", + "Epoch 19/20 Iteration 3317/3560 Training loss: 1.3689 0.0680 sec/batch\n", + "Epoch 19/20 Iteration 3318/3560 Training loss: 1.3687 0.0766 sec/batch\n", + "Epoch 19/20 Iteration 3319/3560 Training loss: 1.3684 0.0731 sec/batch\n", + "Epoch 19/20 Iteration 3320/3560 Training loss: 1.3681 0.0657 sec/batch\n", + "Epoch 19/20 Iteration 3321/3560 Training loss: 1.3682 0.0688 sec/batch\n", + "Epoch 19/20 Iteration 3322/3560 Training loss: 1.3682 0.0721 sec/batch\n", + "Epoch 19/20 Iteration 3323/3560 Training loss: 1.3682 0.0729 sec/batch\n", + "Epoch 19/20 Iteration 3324/3560 Training loss: 1.3682 0.0706 sec/batch\n", + "Epoch 19/20 Iteration 3325/3560 Training loss: 1.3682 0.0691 sec/batch\n", + "Epoch 19/20 Iteration 3326/3560 Training loss: 1.3679 0.0671 sec/batch\n", + "Epoch 19/20 Iteration 3327/3560 Training loss: 1.3676 0.0660 sec/batch\n", + 
"Epoch 19/20 Iteration 3328/3560 Training loss: 1.3675 0.0732 sec/batch\n", + "Epoch 19/20 Iteration 3329/3560 Training loss: 1.3676 0.0740 sec/batch\n", + "Epoch 19/20 Iteration 3330/3560 Training loss: 1.3672 0.0718 sec/batch\n", + "Epoch 19/20 Iteration 3331/3560 Training loss: 1.3673 0.0707 sec/batch\n", + "Epoch 19/20 Iteration 3332/3560 Training loss: 1.3673 0.0691 sec/batch\n", + "Epoch 19/20 Iteration 3333/3560 Training loss: 1.3671 0.0701 sec/batch\n", + "Epoch 19/20 Iteration 3334/3560 Training loss: 1.3668 0.0664 sec/batch\n", + "Epoch 19/20 Iteration 3335/3560 Training loss: 1.3665 0.0649 sec/batch\n", + "Epoch 19/20 Iteration 3336/3560 Training loss: 1.3664 0.0681 sec/batch\n", + "Epoch 19/20 Iteration 3337/3560 Training loss: 1.3665 0.0734 sec/batch\n", + "Epoch 19/20 Iteration 3338/3560 Training loss: 1.3665 0.0684 sec/batch\n", + "Epoch 19/20 Iteration 3339/3560 Training loss: 1.3665 0.0751 sec/batch\n", + "Epoch 19/20 Iteration 3340/3560 Training loss: 1.3666 0.0726 sec/batch\n", + "Epoch 19/20 Iteration 3341/3560 Training loss: 1.3668 0.0668 sec/batch\n", + "Epoch 19/20 Iteration 3342/3560 Training loss: 1.3669 0.0689 sec/batch\n", + "Epoch 19/20 Iteration 3343/3560 Training loss: 1.3669 0.0660 sec/batch\n", + "Epoch 19/20 Iteration 3344/3560 Training loss: 1.3669 0.0698 sec/batch\n", + "Epoch 19/20 Iteration 3345/3560 Training loss: 1.3672 0.0679 sec/batch\n", + "Epoch 19/20 Iteration 3346/3560 Training loss: 1.3673 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3347/3560 Training loss: 1.3672 0.0658 sec/batch\n", + "Epoch 19/20 Iteration 3348/3560 Training loss: 1.3675 0.0745 sec/batch\n", + "Epoch 19/20 Iteration 3349/3560 Training loss: 1.3674 0.0729 sec/batch\n", + "Epoch 19/20 Iteration 3350/3560 Training loss: 1.3676 0.0696 sec/batch\n", + "Epoch 19/20 Iteration 3351/3560 Training loss: 1.3677 0.0685 sec/batch\n", + "Epoch 19/20 Iteration 3352/3560 Training loss: 1.3680 0.0663 sec/batch\n", + "Epoch 19/20 Iteration 3353/3560 Training loss: 
1.3683 0.0675 sec/batch\n", + "Epoch 19/20 Iteration 3354/3560 Training loss: 1.3682 0.0674 sec/batch\n", + "Epoch 19/20 Iteration 3355/3560 Training loss: 1.3679 0.0677 sec/batch\n", + "Epoch 19/20 Iteration 3356/3560 Training loss: 1.3678 0.0657 sec/batch\n", + "Epoch 19/20 Iteration 3357/3560 Training loss: 1.3678 0.0715 sec/batch\n", + "Epoch 19/20 Iteration 3358/3560 Training loss: 1.3678 0.0672 sec/batch\n", + "Epoch 19/20 Iteration 3359/3560 Training loss: 1.3678 0.0682 sec/batch\n", + "Epoch 19/20 Iteration 3360/3560 Training loss: 1.3677 0.0684 sec/batch\n", + "Epoch 19/20 Iteration 3361/3560 Training loss: 1.3677 0.0816 sec/batch\n", + "Epoch 19/20 Iteration 3362/3560 Training loss: 1.3676 0.0668 sec/batch\n", + "Epoch 19/20 Iteration 3363/3560 Training loss: 1.3674 0.0672 sec/batch\n", + "Epoch 19/20 Iteration 3364/3560 Training loss: 1.3675 0.0642 sec/batch\n", + "Epoch 19/20 Iteration 3365/3560 Training loss: 1.3678 0.0716 sec/batch\n", + "Epoch 19/20 Iteration 3366/3560 Training loss: 1.3678 0.0658 sec/batch\n", + "Epoch 19/20 Iteration 3367/3560 Training loss: 1.3679 0.0694 sec/batch\n", + "Epoch 19/20 Iteration 3368/3560 Training loss: 1.3679 0.0673 sec/batch\n", + "Epoch 19/20 Iteration 3369/3560 Training loss: 1.3679 0.0695 sec/batch\n", + "Epoch 19/20 Iteration 3370/3560 Training loss: 1.3679 0.0723 sec/batch\n", + "Epoch 19/20 Iteration 3371/3560 Training loss: 1.3680 0.0692 sec/batch\n", + "Epoch 19/20 Iteration 3372/3560 Training loss: 1.3684 0.0680 sec/batch\n", + "Epoch 19/20 Iteration 3373/3560 Training loss: 1.3684 0.0713 sec/batch\n", + "Epoch 19/20 Iteration 3374/3560 Training loss: 1.3684 0.0673 sec/batch\n", + "Epoch 19/20 Iteration 3375/3560 Training loss: 1.3684 0.0717 sec/batch\n", + "Epoch 19/20 Iteration 3376/3560 Training loss: 1.3683 0.0685 sec/batch\n", + "Epoch 19/20 Iteration 3377/3560 Training loss: 1.3684 0.0679 sec/batch\n", + "Epoch 19/20 Iteration 3378/3560 Training loss: 1.3684 0.0810 sec/batch\n", + "Epoch 19/20 
Iteration 3379/3560 Training loss: 1.3684 0.0698 sec/batch\n", + "Epoch 19/20 Iteration 3380/3560 Training loss: 1.3683 0.0674 sec/batch\n", + "Epoch 19/20 Iteration 3381/3560 Training loss: 1.3682 0.0677 sec/batch\n", + "Epoch 19/20 Iteration 3382/3560 Training loss: 1.3684 0.0657 sec/batch\n", + "Epoch 20/20 Iteration 3383/3560 Training loss: 1.4440 0.0661 sec/batch\n", + "Epoch 20/20 Iteration 3384/3560 Training loss: 1.4117 0.0664 sec/batch\n", + "Epoch 20/20 Iteration 3385/3560 Training loss: 1.3977 0.0696 sec/batch\n", + "Epoch 20/20 Iteration 3386/3560 Training loss: 1.3932 0.0655 sec/batch\n", + "Epoch 20/20 Iteration 3387/3560 Training loss: 1.3850 0.0656 sec/batch\n", + "Epoch 20/20 Iteration 3388/3560 Training loss: 1.3740 0.0669 sec/batch\n", + "Epoch 20/20 Iteration 3389/3560 Training loss: 1.3735 0.0674 sec/batch\n", + "Epoch 20/20 Iteration 3390/3560 Training loss: 1.3712 0.0663 sec/batch\n", + "Epoch 20/20 Iteration 3391/3560 Training loss: 1.3709 0.0789 sec/batch\n", + "Epoch 20/20 Iteration 3392/3560 Training loss: 1.3709 0.0707 sec/batch\n", + "Epoch 20/20 Iteration 3393/3560 Training loss: 1.3680 0.0704 sec/batch\n", + "Epoch 20/20 Iteration 3394/3560 Training loss: 1.3680 0.0872 sec/batch\n", + "Epoch 20/20 Iteration 3395/3560 Training loss: 1.3677 0.0686 sec/batch\n", + "Epoch 20/20 Iteration 3396/3560 Training loss: 1.3699 0.0674 sec/batch\n", + "Epoch 20/20 Iteration 3397/3560 Training loss: 1.3689 0.0686 sec/batch\n", + "Epoch 20/20 Iteration 3398/3560 Training loss: 1.3668 0.0649 sec/batch\n", + "Epoch 20/20 Iteration 3399/3560 Training loss: 1.3679 0.0723 sec/batch\n", + "Epoch 20/20 Iteration 3400/3560 Training loss: 1.3688 0.0656 sec/batch\n", + "Epoch 20/20 Iteration 3401/3560 Training loss: 1.3687 0.0679 sec/batch\n", + "Epoch 20/20 Iteration 3402/3560 Training loss: 1.3699 0.0697 sec/batch\n", + "Epoch 20/20 Iteration 3403/3560 Training loss: 1.3694 0.0674 sec/batch\n", + "Epoch 20/20 Iteration 3404/3560 Training loss: 1.3696 0.0682 
sec/batch\n", + "Epoch 20/20 Iteration 3405/3560 Training loss: 1.3686 0.0647 sec/batch\n", + "Epoch 20/20 Iteration 3406/3560 Training loss: 1.3687 0.0669 sec/batch\n", + "Epoch 20/20 Iteration 3407/3560 Training loss: 1.3687 0.0750 sec/batch\n", + "Epoch 20/20 Iteration 3408/3560 Training loss: 1.3668 0.0689 sec/batch\n", + "Epoch 20/20 Iteration 3409/3560 Training loss: 1.3656 0.0662 sec/batch\n", + "Epoch 20/20 Iteration 3410/3560 Training loss: 1.3663 0.0674 sec/batch\n", + "Epoch 20/20 Iteration 3411/3560 Training loss: 1.3664 0.0700 sec/batch\n", + "Epoch 20/20 Iteration 3412/3560 Training loss: 1.3666 0.0663 sec/batch\n", + "Epoch 20/20 Iteration 3413/3560 Training loss: 1.3660 0.0702 sec/batch\n", + "Epoch 20/20 Iteration 3414/3560 Training loss: 1.3654 0.0722 sec/batch\n", + "Epoch 20/20 Iteration 3415/3560 Training loss: 1.3657 0.0723 sec/batch\n", + "Epoch 20/20 Iteration 3416/3560 Training loss: 1.3659 0.0663 sec/batch\n", + "Epoch 20/20 Iteration 3417/3560 Training loss: 1.3657 0.0709 sec/batch\n", + "Epoch 20/20 Iteration 3418/3560 Training loss: 1.3659 0.0679 sec/batch\n", + "Epoch 20/20 Iteration 3419/3560 Training loss: 1.3653 0.0699 sec/batch\n", + "Epoch 20/20 Iteration 3420/3560 Training loss: 1.3642 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3421/3560 Training loss: 1.3629 0.0658 sec/batch\n", + "Epoch 20/20 Iteration 3422/3560 Training loss: 1.3626 0.0711 sec/batch\n", + "Epoch 20/20 Iteration 3423/3560 Training loss: 1.3622 0.0691 sec/batch\n", + "Epoch 20/20 Iteration 3424/3560 Training loss: 1.3631 0.0647 sec/batch\n", + "Epoch 20/20 Iteration 3425/3560 Training loss: 1.3627 0.0668 sec/batch\n", + "Epoch 20/20 Iteration 3426/3560 Training loss: 1.3621 0.0778 sec/batch\n", + "Epoch 20/20 Iteration 3427/3560 Training loss: 1.3625 0.0680 sec/batch\n", + "Epoch 20/20 Iteration 3428/3560 Training loss: 1.3617 0.0689 sec/batch\n", + "Epoch 20/20 Iteration 3429/3560 Training loss: 1.3616 0.0677 sec/batch\n", + "Epoch 20/20 Iteration 3430/3560 
Training loss: 1.3612 0.0737 sec/batch\n", + "Epoch 20/20 Iteration 3431/3560 Training loss: 1.3612 0.0659 sec/batch\n", + "Epoch 20/20 Iteration 3432/3560 Training loss: 1.3616 0.0888 sec/batch\n", + "Epoch 20/20 Iteration 3433/3560 Training loss: 1.3610 0.0681 sec/batch\n", + "Epoch 20/20 Iteration 3434/3560 Training loss: 1.3617 0.0703 sec/batch\n", + "Epoch 20/20 Iteration 3435/3560 Training loss: 1.3618 0.0754 sec/batch\n", + "Epoch 20/20 Iteration 3436/3560 Training loss: 1.3621 0.0663 sec/batch\n", + "Epoch 20/20 Iteration 3437/3560 Training loss: 1.3619 0.0671 sec/batch\n", + "Epoch 20/20 Iteration 3438/3560 Training loss: 1.3620 0.0755 sec/batch\n", + "Epoch 20/20 Iteration 3439/3560 Training loss: 1.3622 0.0660 sec/batch\n", + "Epoch 20/20 Iteration 3440/3560 Training loss: 1.3620 0.0711 sec/batch\n", + "Epoch 20/20 Iteration 3441/3560 Training loss: 1.3615 0.0761 sec/batch\n", + "Epoch 20/20 Iteration 3442/3560 Training loss: 1.3623 0.0678 sec/batch\n", + "Epoch 20/20 Iteration 3443/3560 Training loss: 1.3621 0.0691 sec/batch\n", + "Epoch 20/20 Iteration 3444/3560 Training loss: 1.3631 0.0657 sec/batch\n", + "Epoch 20/20 Iteration 3445/3560 Training loss: 1.3635 0.0675 sec/batch\n", + "Epoch 20/20 Iteration 3446/3560 Training loss: 1.3635 0.0683 sec/batch\n", + "Epoch 20/20 Iteration 3447/3560 Training loss: 1.3635 0.0703 sec/batch\n", + "Epoch 20/20 Iteration 3448/3560 Training loss: 1.3637 0.0709 sec/batch\n", + "Epoch 20/20 Iteration 3449/3560 Training loss: 1.3639 0.0705 sec/batch\n", + "Epoch 20/20 Iteration 3450/3560 Training loss: 1.3636 0.0648 sec/batch\n", + "Epoch 20/20 Iteration 3451/3560 Training loss: 1.3635 0.0688 sec/batch\n", + "Epoch 20/20 Iteration 3452/3560 Training loss: 1.3634 0.0731 sec/batch\n", + "Epoch 20/20 Iteration 3453/3560 Training loss: 1.3640 0.0702 sec/batch\n", + "Epoch 20/20 Iteration 3454/3560 Training loss: 1.3643 0.0672 sec/batch\n", + "Epoch 20/20 Iteration 3455/3560 Training loss: 1.3648 0.0699 sec/batch\n", + 
"Epoch 20/20 Iteration 3456/3560 Training loss: 1.3645 0.0693 sec/batch\n", + "Epoch 20/20 Iteration 3457/3560 Training loss: 1.3645 0.0774 sec/batch\n", + "Epoch 20/20 Iteration 3458/3560 Training loss: 1.3646 0.0679 sec/batch\n", + "Epoch 20/20 Iteration 3459/3560 Training loss: 1.3646 0.0697 sec/batch\n", + "Epoch 20/20 Iteration 3460/3560 Training loss: 1.3644 0.0738 sec/batch\n", + "Epoch 20/20 Iteration 3461/3560 Training loss: 1.3639 0.0664 sec/batch\n", + "Epoch 20/20 Iteration 3462/3560 Training loss: 1.3637 0.0701 sec/batch\n", + "Epoch 20/20 Iteration 3463/3560 Training loss: 1.3633 0.0688 sec/batch\n", + "Epoch 20/20 Iteration 3464/3560 Training loss: 1.3633 0.0670 sec/batch\n", + "Epoch 20/20 Iteration 3465/3560 Training loss: 1.3628 0.0652 sec/batch\n", + "Epoch 20/20 Iteration 3466/3560 Training loss: 1.3627 0.0760 sec/batch\n", + "Epoch 20/20 Iteration 3467/3560 Training loss: 1.3624 0.0791 sec/batch\n", + "Epoch 20/20 Iteration 3468/3560 Training loss: 1.3624 0.0721 sec/batch\n", + "Epoch 20/20 Iteration 3469/3560 Training loss: 1.3622 0.0676 sec/batch\n", + "Epoch 20/20 Iteration 3470/3560 Training loss: 1.3621 0.0695 sec/batch\n", + "Epoch 20/20 Iteration 3471/3560 Training loss: 1.3617 0.0711 sec/batch\n", + "Epoch 20/20 Iteration 3472/3560 Training loss: 1.3619 0.0675 sec/batch\n", + "Epoch 20/20 Iteration 3473/3560 Training loss: 1.3617 0.0656 sec/batch\n", + "Epoch 20/20 Iteration 3474/3560 Training loss: 1.3616 0.0683 sec/batch\n", + "Epoch 20/20 Iteration 3475/3560 Training loss: 1.3612 0.0725 sec/batch\n", + "Epoch 20/20 Iteration 3476/3560 Training loss: 1.3609 0.0676 sec/batch\n", + "Epoch 20/20 Iteration 3477/3560 Training loss: 1.3606 0.0702 sec/batch\n", + "Epoch 20/20 Iteration 3478/3560 Training loss: 1.3607 0.0719 sec/batch\n", + "Epoch 20/20 Iteration 3479/3560 Training loss: 1.3607 0.0749 sec/batch\n", + "Epoch 20/20 Iteration 3480/3560 Training loss: 1.3603 0.0683 sec/batch\n", + "Epoch 20/20 Iteration 3481/3560 Training loss: 
1.3600 0.0711 sec/batch\n", + "Epoch 20/20 Iteration 3482/3560 Training loss: 1.3596 0.0708 sec/batch\n", + "Epoch 20/20 Iteration 3483/3560 Training loss: 1.3598 0.0797 sec/batch\n", + "Epoch 20/20 Iteration 3484/3560 Training loss: 1.3596 0.0727 sec/batch\n", + "Epoch 20/20 Iteration 3485/3560 Training loss: 1.3596 0.0782 sec/batch\n", + "Epoch 20/20 Iteration 3486/3560 Training loss: 1.3595 0.0765 sec/batch\n", + "Epoch 20/20 Iteration 3487/3560 Training loss: 1.3594 0.0671 sec/batch\n", + "Epoch 20/20 Iteration 3488/3560 Training loss: 1.3592 0.0665 sec/batch\n", + "Epoch 20/20 Iteration 3489/3560 Training loss: 1.3593 0.0676 sec/batch\n", + "Epoch 20/20 Iteration 3490/3560 Training loss: 1.3593 0.0657 sec/batch\n", + "Epoch 20/20 Iteration 3491/3560 Training loss: 1.3593 0.0680 sec/batch\n", + "Epoch 20/20 Iteration 3492/3560 Training loss: 1.3594 0.0672 sec/batch\n", + "Epoch 20/20 Iteration 3493/3560 Training loss: 1.3592 0.0674 sec/batch\n", + "Epoch 20/20 Iteration 3494/3560 Training loss: 1.3592 0.0704 sec/batch\n", + "Epoch 20/20 Iteration 3495/3560 Training loss: 1.3591 0.0688 sec/batch\n", + "Epoch 20/20 Iteration 3496/3560 Training loss: 1.3590 0.0665 sec/batch\n", + "Epoch 20/20 Iteration 3497/3560 Training loss: 1.3587 0.0685 sec/batch\n", + "Epoch 20/20 Iteration 3498/3560 Training loss: 1.3584 0.0676 sec/batch\n", + "Epoch 20/20 Iteration 3499/3560 Training loss: 1.3583 0.0789 sec/batch\n", + "Epoch 20/20 Iteration 3500/3560 Training loss: 1.3584 0.0791 sec/batch\n", + "Epoch 20/20 Iteration 3501/3560 Training loss: 1.3582 0.0706 sec/batch\n", + "Epoch 20/20 Iteration 3502/3560 Training loss: 1.3582 0.0705 sec/batch\n", + "Epoch 20/20 Iteration 3503/3560 Training loss: 1.3583 0.0678 sec/batch\n", + "Epoch 20/20 Iteration 3504/3560 Training loss: 1.3580 0.0700 sec/batch\n", + "Epoch 20/20 Iteration 3505/3560 Training loss: 1.3577 0.0696 sec/batch\n", + "Epoch 20/20 Iteration 3506/3560 Training loss: 1.3577 0.0657 sec/batch\n", + "Epoch 20/20 
Iteration 3507/3560 Training loss: 1.3577 0.0671 sec/batch\n", + "Epoch 20/20 Iteration 3508/3560 Training loss: 1.3574 0.0730 sec/batch\n", + "Epoch 20/20 Iteration 3509/3560 Training loss: 1.3574 0.0652 sec/batch\n", + "Epoch 20/20 Iteration 3510/3560 Training loss: 1.3575 0.0751 sec/batch\n", + "Epoch 20/20 Iteration 3511/3560 Training loss: 1.3574 0.0659 sec/batch\n", + "Epoch 20/20 Iteration 3512/3560 Training loss: 1.3571 0.0731 sec/batch\n", + "Epoch 20/20 Iteration 3513/3560 Training loss: 1.3568 0.0683 sec/batch\n", + "Epoch 20/20 Iteration 3514/3560 Training loss: 1.3568 0.0670 sec/batch\n", + "Epoch 20/20 Iteration 3515/3560 Training loss: 1.3569 0.0740 sec/batch\n", + "Epoch 20/20 Iteration 3516/3560 Training loss: 1.3570 0.0683 sec/batch\n", + "Epoch 20/20 Iteration 3517/3560 Training loss: 1.3570 0.0649 sec/batch\n", + "Epoch 20/20 Iteration 3518/3560 Training loss: 1.3571 0.0742 sec/batch\n", + "Epoch 20/20 Iteration 3519/3560 Training loss: 1.3572 0.0755 sec/batch\n", + "Epoch 20/20 Iteration 3520/3560 Training loss: 1.3573 0.0671 sec/batch\n", + "Epoch 20/20 Iteration 3521/3560 Training loss: 1.3572 0.0698 sec/batch\n", + "Epoch 20/20 Iteration 3522/3560 Training loss: 1.3573 0.0701 sec/batch\n", + "Epoch 20/20 Iteration 3523/3560 Training loss: 1.3576 0.0733 sec/batch\n", + "Epoch 20/20 Iteration 3524/3560 Training loss: 1.3577 0.0695 sec/batch\n", + "Epoch 20/20 Iteration 3525/3560 Training loss: 1.3576 0.0670 sec/batch\n", + "Epoch 20/20 Iteration 3526/3560 Training loss: 1.3578 0.0734 sec/batch\n", + "Epoch 20/20 Iteration 3527/3560 Training loss: 1.3576 0.0713 sec/batch\n", + "Epoch 20/20 Iteration 3528/3560 Training loss: 1.3579 0.0685 sec/batch\n", + "Epoch 20/20 Iteration 3529/3560 Training loss: 1.3579 0.0760 sec/batch\n", + "Epoch 20/20 Iteration 3530/3560 Training loss: 1.3582 0.0702 sec/batch\n", + "Epoch 20/20 Iteration 3531/3560 Training loss: 1.3583 0.0670 sec/batch\n", + "Epoch 20/20 Iteration 3532/3560 Training loss: 1.3582 0.0776 
sec/batch\n", + "Epoch 20/20 Iteration 3533/3560 Training loss: 1.3579 0.0679 sec/batch\n", + "Epoch 20/20 Iteration 3534/3560 Training loss: 1.3578 0.0705 sec/batch\n", + "Epoch 20/20 Iteration 3535/3560 Training loss: 1.3578 0.0853 sec/batch\n", + "Epoch 20/20 Iteration 3536/3560 Training loss: 1.3578 0.0655 sec/batch\n", + "Epoch 20/20 Iteration 3537/3560 Training loss: 1.3579 0.0672 sec/batch\n", + "Epoch 20/20 Iteration 3538/3560 Training loss: 1.3579 0.0662 sec/batch\n", + "Epoch 20/20 Iteration 3539/3560 Training loss: 1.3579 0.0690 sec/batch\n", + "Epoch 20/20 Iteration 3540/3560 Training loss: 1.3578 0.0737 sec/batch\n", + "Epoch 20/20 Iteration 3541/3560 Training loss: 1.3576 0.0726 sec/batch\n", + "Epoch 20/20 Iteration 3542/3560 Training loss: 1.3578 0.0710 sec/batch\n", + "Epoch 20/20 Iteration 3543/3560 Training loss: 1.3580 0.0685 sec/batch\n", + "Epoch 20/20 Iteration 3544/3560 Training loss: 1.3580 0.0680 sec/batch\n", + "Epoch 20/20 Iteration 3545/3560 Training loss: 1.3581 0.0654 sec/batch\n", + "Epoch 20/20 Iteration 3546/3560 Training loss: 1.3581 0.0648 sec/batch\n", + "Epoch 20/20 Iteration 3547/3560 Training loss: 1.3581 0.0693 sec/batch\n", + "Epoch 20/20 Iteration 3548/3560 Training loss: 1.3581 0.0703 sec/batch\n", + "Epoch 20/20 Iteration 3549/3560 Training loss: 1.3583 0.0671 sec/batch\n", + "Epoch 20/20 Iteration 3550/3560 Training loss: 1.3588 0.0661 sec/batch\n", + "Epoch 20/20 Iteration 3551/3560 Training loss: 1.3588 0.0691 sec/batch\n", + "Epoch 20/20 Iteration 3552/3560 Training loss: 1.3588 0.0797 sec/batch\n", + "Epoch 20/20 Iteration 3553/3560 Training loss: 1.3587 0.0697 sec/batch\n", + "Epoch 20/20 Iteration 3554/3560 Training loss: 1.3586 0.0685 sec/batch\n", + "Epoch 20/20 Iteration 3555/3560 Training loss: 1.3588 0.0707 sec/batch\n", + "Epoch 20/20 Iteration 3556/3560 Training loss: 1.3588 0.0647 sec/batch\n", + "Epoch 20/20 Iteration 3557/3560 Training loss: 1.3589 0.0680 sec/batch\n", + "Epoch 20/20 Iteration 3558/3560 
Training loss: 1.3588 0.0793 sec/batch\n", + "Epoch 20/20 Iteration 3559/3560 Training loss: 1.3587 0.0682 sec/batch\n", + "Epoch 20/20 Iteration 3560/3560 Training loss: 1.3588 0.0667 sec/batch\n", + "Epoch 1/20 Iteration 1/3560 Training loss: 4.4187 1.2812 sec/batch\n", + "Epoch 1/20 Iteration 2/3560 Training loss: 4.4017 0.0693 sec/batch\n", + "Epoch 1/20 Iteration 3/3560 Training loss: 4.3743 0.0640 sec/batch\n", + "Epoch 1/20 Iteration 4/3560 Training loss: 4.3052 0.0633 sec/batch\n", + "Epoch 1/20 Iteration 5/3560 Training loss: 4.1804 0.0650 sec/batch\n", + "Epoch 1/20 Iteration 6/3560 Training loss: 4.0909 0.0661 sec/batch\n", + "Epoch 1/20 Iteration 7/3560 Training loss: 4.0118 0.0637 sec/batch\n", + "Epoch 1/20 Iteration 8/3560 Training loss: 3.9459 0.0636 sec/batch\n", + "Epoch 1/20 Iteration 9/3560 Training loss: 3.8888 0.0625 sec/batch\n", + "Epoch 1/20 Iteration 10/3560 Training loss: 3.8418 0.0610 sec/batch\n", + "Epoch 1/20 Iteration 11/3560 Training loss: 3.7987 0.0645 sec/batch\n", + "Epoch 1/20 Iteration 12/3560 Training loss: 3.7616 0.0642 sec/batch\n", + "Epoch 1/20 Iteration 13/3560 Training loss: 3.7291 0.0635 sec/batch\n", + "Epoch 1/20 Iteration 14/3560 Training loss: 3.7007 0.0600 sec/batch\n", + "Epoch 1/20 Iteration 15/3560 Training loss: 3.6752 0.0609 sec/batch\n", + "Epoch 1/20 Iteration 16/3560 Training loss: 3.6522 0.0614 sec/batch\n", + "Epoch 1/20 Iteration 17/3560 Training loss: 3.6310 0.0604 sec/batch\n", + "Epoch 1/20 Iteration 18/3560 Training loss: 3.6126 0.0620 sec/batch\n", + "Epoch 1/20 Iteration 19/3560 Training loss: 3.5948 0.0672 sec/batch\n", + "Epoch 1/20 Iteration 20/3560 Training loss: 3.5771 0.0702 sec/batch\n", + "Epoch 1/20 Iteration 21/3560 Training loss: 3.5616 0.0636 sec/batch\n", + "Epoch 1/20 Iteration 22/3560 Training loss: 3.5475 0.0599 sec/batch\n", + "Epoch 1/20 Iteration 23/3560 Training loss: 3.5341 0.0612 sec/batch\n", + "Epoch 1/20 Iteration 24/3560 Training loss: 3.5220 0.0662 sec/batch\n", + "Epoch 
1/20 Iteration 25/3560 Training loss: 3.5098 0.0613 sec/batch\n", + "Epoch 1/20 Iteration 26/3560 Training loss: 3.4995 0.0736 sec/batch\n", + "Epoch 1/20 Iteration 27/3560 Training loss: 3.4899 0.0640 sec/batch\n", + "Epoch 1/20 Iteration 28/3560 Training loss: 3.4799 0.0594 sec/batch\n", + "Epoch 1/20 Iteration 29/3560 Training loss: 3.4709 0.0598 sec/batch\n", + "Epoch 1/20 Iteration 30/3560 Training loss: 3.4626 0.0598 sec/batch\n", + "Epoch 1/20 Iteration 31/3560 Training loss: 3.4554 0.0659 sec/batch\n", + "Epoch 1/20 Iteration 32/3560 Training loss: 3.4475 0.0647 sec/batch\n", + "Epoch 1/20 Iteration 33/3560 Training loss: 3.4400 0.0629 sec/batch\n", + "Epoch 1/20 Iteration 34/3560 Training loss: 3.4335 0.0599 sec/batch\n", + "Epoch 1/20 Iteration 35/3560 Training loss: 3.4268 0.0621 sec/batch\n", + "Epoch 1/20 Iteration 36/3560 Training loss: 3.4208 0.0610 sec/batch\n", + "Epoch 1/20 Iteration 37/3560 Training loss: 3.4146 0.0695 sec/batch\n", + "Epoch 1/20 Iteration 38/3560 Training loss: 3.4084 0.0611 sec/batch\n", + "Epoch 1/20 Iteration 39/3560 Training loss: 3.4026 0.0628 sec/batch\n", + "Epoch 1/20 Iteration 40/3560 Training loss: 3.3971 0.0608 sec/batch\n", + "Epoch 1/20 Iteration 41/3560 Training loss: 3.3920 0.0645 sec/batch\n", + "Epoch 1/20 Iteration 42/3560 Training loss: 3.3871 0.0656 sec/batch\n", + "Epoch 1/20 Iteration 43/3560 Training loss: 3.3821 0.0597 sec/batch\n", + "Epoch 1/20 Iteration 44/3560 Training loss: 3.3775 0.0640 sec/batch\n", + "Epoch 1/20 Iteration 45/3560 Training loss: 3.3728 0.0658 sec/batch\n", + "Epoch 1/20 Iteration 46/3560 Training loss: 3.3685 0.0593 sec/batch\n", + "Epoch 1/20 Iteration 47/3560 Training loss: 3.3647 0.0653 sec/batch\n", + "Epoch 1/20 Iteration 48/3560 Training loss: 3.3610 0.0633 sec/batch\n", + "Epoch 1/20 Iteration 49/3560 Training loss: 3.3575 0.0604 sec/batch\n", + "Epoch 1/20 Iteration 50/3560 Training loss: 3.3540 0.0681 sec/batch\n", + "Epoch 1/20 Iteration 51/3560 Training loss: 3.3505 
0.0596 sec/batch\n", + "Epoch 1/20 Iteration 52/3560 Training loss: 3.3469 0.0613 sec/batch\n", + "Epoch 1/20 Iteration 53/3560 Training loss: 3.3437 0.0622 sec/batch\n", + "Epoch 1/20 Iteration 54/3560 Training loss: 3.3403 0.0702 sec/batch\n", + "Epoch 1/20 Iteration 55/3560 Training loss: 3.3372 0.0606 sec/batch\n", + "Epoch 1/20 Iteration 56/3560 Training loss: 3.3338 0.0626 sec/batch\n", + "Epoch 1/20 Iteration 57/3560 Training loss: 3.3309 0.0623 sec/batch\n", + "Epoch 1/20 Iteration 58/3560 Training loss: 3.3281 0.0604 sec/batch\n", + "Epoch 1/20 Iteration 59/3560 Training loss: 3.3251 0.0612 sec/batch\n", + "Epoch 1/20 Iteration 60/3560 Training loss: 3.3226 0.0614 sec/batch\n", + "Epoch 1/20 Iteration 61/3560 Training loss: 3.3199 0.0613 sec/batch\n", + "Epoch 1/20 Iteration 62/3560 Training loss: 3.3177 0.0608 sec/batch\n", + "Epoch 1/20 Iteration 63/3560 Training loss: 3.3156 0.0618 sec/batch\n", + "Epoch 1/20 Iteration 64/3560 Training loss: 3.3128 0.0643 sec/batch\n", + "Epoch 1/20 Iteration 65/3560 Training loss: 3.3102 0.0603 sec/batch\n", + "Epoch 1/20 Iteration 66/3560 Training loss: 3.3081 0.0590 sec/batch\n", + "Epoch 1/20 Iteration 67/3560 Training loss: 3.3060 0.0687 sec/batch\n", + "Epoch 1/20 Iteration 68/3560 Training loss: 3.3033 0.0646 sec/batch\n", + "Epoch 1/20 Iteration 69/3560 Training loss: 3.3010 0.0596 sec/batch\n", + "Epoch 1/20 Iteration 70/3560 Training loss: 3.2991 0.0691 sec/batch\n", + "Epoch 1/20 Iteration 71/3560 Training loss: 3.2970 0.0629 sec/batch\n", + "Epoch 1/20 Iteration 72/3560 Training loss: 3.2952 0.0602 sec/batch\n", + "Epoch 1/20 Iteration 73/3560 Training loss: 3.2931 0.0622 sec/batch\n", + "Epoch 1/20 Iteration 74/3560 Training loss: 3.2912 0.0585 sec/batch\n", + "Epoch 1/20 Iteration 75/3560 Training loss: 3.2895 0.0601 sec/batch\n", + "Epoch 1/20 Iteration 76/3560 Training loss: 3.2878 0.0597 sec/batch\n", + "Epoch 1/20 Iteration 77/3560 Training loss: 3.2860 0.0613 sec/batch\n", + "Epoch 1/20 Iteration 
78/3560 Training loss: 3.2843 0.0651 sec/batch\n", + "Epoch 1/20 Iteration 79/3560 Training loss: 3.2824 0.0624 sec/batch\n", + "Epoch 1/20 Iteration 80/3560 Training loss: 3.2806 0.0607 sec/batch\n", + "Epoch 1/20 Iteration 81/3560 Training loss: 3.2788 0.0603 sec/batch\n", + "Epoch 1/20 Iteration 82/3560 Training loss: 3.2773 0.0593 sec/batch\n", + "Epoch 1/20 Iteration 83/3560 Training loss: 3.2757 0.0592 sec/batch\n", + "Epoch 1/20 Iteration 84/3560 Training loss: 3.2741 0.0603 sec/batch\n", + "Epoch 1/20 Iteration 85/3560 Training loss: 3.2724 0.0659 sec/batch\n", + "Epoch 1/20 Iteration 86/3560 Training loss: 3.2707 0.0635 sec/batch\n", + "Epoch 1/20 Iteration 87/3560 Training loss: 3.2690 0.0619 sec/batch\n", + "Epoch 1/20 Iteration 88/3560 Training loss: 3.2674 0.0708 sec/batch\n", + "Epoch 1/20 Iteration 89/3560 Training loss: 3.2659 0.0643 sec/batch\n", + "Epoch 1/20 Iteration 90/3560 Training loss: 3.2645 0.0619 sec/batch\n", + "Epoch 1/20 Iteration 91/3560 Training loss: 3.2631 0.0652 sec/batch\n", + "Epoch 1/20 Iteration 92/3560 Training loss: 3.2616 0.0607 sec/batch\n", + "Epoch 1/20 Iteration 93/3560 Training loss: 3.2602 0.0600 sec/batch\n", + "Epoch 1/20 Iteration 94/3560 Training loss: 3.2588 0.0591 sec/batch\n", + "Epoch 1/20 Iteration 95/3560 Training loss: 3.2573 0.0620 sec/batch\n", + "Epoch 1/20 Iteration 96/3560 Training loss: 3.2559 0.0628 sec/batch\n", + "Epoch 1/20 Iteration 97/3560 Training loss: 3.2546 0.0584 sec/batch\n", + "Epoch 1/20 Iteration 98/3560 Training loss: 3.2532 0.0616 sec/batch\n", + "Epoch 1/20 Iteration 99/3560 Training loss: 3.2519 0.0617 sec/batch\n", + "Epoch 1/20 Iteration 100/3560 Training loss: 3.2504 0.0618 sec/batch\n", + "Epoch 1/20 Iteration 101/3560 Training loss: 3.2490 0.0599 sec/batch\n", + "Epoch 1/20 Iteration 102/3560 Training loss: 3.2476 0.0739 sec/batch\n", + "Epoch 1/20 Iteration 103/3560 Training loss: 3.2463 0.0685 sec/batch\n", + "Epoch 1/20 Iteration 104/3560 Training loss: 3.2449 0.0615 
sec/batch\n", + "Epoch 1/20 Iteration 105/3560 Training loss: 3.2435 0.0717 sec/batch\n", + "Epoch 1/20 Iteration 106/3560 Training loss: 3.2421 0.0739 sec/batch\n", + "Epoch 1/20 Iteration 107/3560 Training loss: 3.2406 0.0615 sec/batch\n", + "Epoch 1/20 Iteration 108/3560 Training loss: 3.2390 0.0613 sec/batch\n", + "Epoch 1/20 Iteration 109/3560 Training loss: 3.2377 0.0628 sec/batch\n", + "Epoch 1/20 Iteration 110/3560 Training loss: 3.2360 0.0782 sec/batch\n", + "Epoch 1/20 Iteration 111/3560 Training loss: 3.2345 0.0748 sec/batch\n", + "Epoch 1/20 Iteration 112/3560 Training loss: 3.2331 0.0680 sec/batch\n", + "Epoch 1/20 Iteration 113/3560 Training loss: 3.2316 0.0660 sec/batch\n", + "Epoch 1/20 Iteration 114/3560 Training loss: 3.2299 0.0612 sec/batch\n", + "Epoch 1/20 Iteration 115/3560 Training loss: 3.2283 0.0681 sec/batch\n", + "Epoch 1/20 Iteration 116/3560 Training loss: 3.2266 0.0596 sec/batch\n", + "Epoch 1/20 Iteration 117/3560 Training loss: 3.2250 0.0589 sec/batch\n", + "Epoch 1/20 Iteration 118/3560 Training loss: 3.2235 0.0665 sec/batch\n", + "Epoch 1/20 Iteration 119/3560 Training loss: 3.2220 0.0612 sec/batch\n", + "Epoch 1/20 Iteration 120/3560 Training loss: 3.2203 0.0594 sec/batch\n", + "Epoch 1/20 Iteration 121/3560 Training loss: 3.2187 0.0633 sec/batch\n", + "Epoch 1/20 Iteration 122/3560 Training loss: 3.2171 0.0609 sec/batch\n", + "Epoch 1/20 Iteration 123/3560 Training loss: 3.2155 0.0656 sec/batch\n", + "Epoch 1/20 Iteration 124/3560 Training loss: 3.2140 0.0625 sec/batch\n", + "Epoch 1/20 Iteration 125/3560 Training loss: 3.2124 0.0605 sec/batch\n", + "Epoch 1/20 Iteration 126/3560 Training loss: 3.2104 0.0673 sec/batch\n", + "Epoch 1/20 Iteration 127/3560 Training loss: 3.2088 0.0679 sec/batch\n", + "Epoch 1/20 Iteration 128/3560 Training loss: 3.2072 0.0625 sec/batch\n", + "Epoch 1/20 Iteration 129/3560 Training loss: 3.2054 0.0683 sec/batch\n", + "Epoch 1/20 Iteration 130/3560 Training loss: 3.2036 0.0649 sec/batch\n", + "Epoch 
1/20 Iteration 131/3560 Training loss: 3.2019 0.0617 sec/batch\n", + "Epoch 1/20 Iteration 132/3560 Training loss: 3.2000 0.0606 sec/batch\n", + "Epoch 1/20 Iteration 133/3560 Training loss: 3.1981 0.0640 sec/batch\n", + "Epoch 1/20 Iteration 134/3560 Training loss: 3.1962 0.0627 sec/batch\n", + "Epoch 1/20 Iteration 135/3560 Training loss: 3.1940 0.0600 sec/batch\n", + "Epoch 1/20 Iteration 136/3560 Training loss: 3.1919 0.0679 sec/batch\n", + "Epoch 1/20 Iteration 137/3560 Training loss: 3.1898 0.0631 sec/batch\n", + "Epoch 1/20 Iteration 138/3560 Training loss: 3.1877 0.0616 sec/batch\n", + "Epoch 1/20 Iteration 139/3560 Training loss: 3.1856 0.0614 sec/batch\n", + "Epoch 1/20 Iteration 140/3560 Training loss: 3.1834 0.0637 sec/batch\n", + "Epoch 1/20 Iteration 141/3560 Training loss: 3.1813 0.0652 sec/batch\n", + "Epoch 1/20 Iteration 142/3560 Training loss: 3.1789 0.0702 sec/batch\n", + "Epoch 1/20 Iteration 143/3560 Training loss: 3.1766 0.0630 sec/batch\n", + "Epoch 1/20 Iteration 144/3560 Training loss: 3.1743 0.0687 sec/batch\n", + "Epoch 1/20 Iteration 145/3560 Training loss: 3.1722 0.0637 sec/batch\n", + "Epoch 1/20 Iteration 146/3560 Training loss: 3.1700 0.0749 sec/batch\n", + "Epoch 1/20 Iteration 147/3560 Training loss: 3.1677 0.0655 sec/batch\n", + "Epoch 1/20 Iteration 148/3560 Training loss: 3.1655 0.0683 sec/batch\n", + "Epoch 1/20 Iteration 149/3560 Training loss: 3.1630 0.0695 sec/batch\n", + "Epoch 1/20 Iteration 150/3560 Training loss: 3.1606 0.0641 sec/batch\n", + "Epoch 1/20 Iteration 151/3560 Training loss: 3.1584 0.0646 sec/batch\n", + "Epoch 1/20 Iteration 152/3560 Training loss: 3.1563 0.0721 sec/batch\n", + "Epoch 1/20 Iteration 153/3560 Training loss: 3.1540 0.0646 sec/batch\n", + "Epoch 1/20 Iteration 154/3560 Training loss: 3.1516 0.0676 sec/batch\n", + "Epoch 1/20 Iteration 155/3560 Training loss: 3.1490 0.0627 sec/batch\n", + "Epoch 1/20 Iteration 156/3560 Training loss: 3.1465 0.0661 sec/batch\n", + "Epoch 1/20 Iteration 157/3560 
Training loss: 3.1438 0.0646 sec/batch\n", + "Epoch 1/20 Iteration 158/3560 Training loss: 3.1413 0.0643 sec/batch\n", + "Epoch 1/20 Iteration 159/3560 Training loss: 3.1386 0.0627 sec/batch\n", + "Epoch 1/20 Iteration 160/3560 Training loss: 3.1360 0.0707 sec/batch\n", + "Epoch 1/20 Iteration 161/3560 Training loss: 3.1334 0.0634 sec/batch\n", + "Epoch 1/20 Iteration 162/3560 Training loss: 3.1306 0.0653 sec/batch\n", + "Epoch 1/20 Iteration 163/3560 Training loss: 3.1278 0.0674 sec/batch\n", + "Epoch 1/20 Iteration 164/3560 Training loss: 3.1251 0.0720 sec/batch\n", + "Epoch 1/20 Iteration 165/3560 Training loss: 3.1225 0.0644 sec/batch\n", + "Epoch 1/20 Iteration 166/3560 Training loss: 3.1198 0.0617 sec/batch\n", + "Epoch 1/20 Iteration 167/3560 Training loss: 3.1171 0.0628 sec/batch\n", + "Epoch 1/20 Iteration 168/3560 Training loss: 3.1145 0.0648 sec/batch\n", + "Epoch 1/20 Iteration 169/3560 Training loss: 3.1118 0.0676 sec/batch\n", + "Epoch 1/20 Iteration 170/3560 Training loss: 3.1090 0.0615 sec/batch\n", + "Epoch 1/20 Iteration 171/3560 Training loss: 3.1064 0.0655 sec/batch\n", + "Epoch 1/20 Iteration 172/3560 Training loss: 3.1039 0.0670 sec/batch\n", + "Epoch 1/20 Iteration 173/3560 Training loss: 3.1015 0.0657 sec/batch\n", + "Epoch 1/20 Iteration 174/3560 Training loss: 3.0990 0.0688 sec/batch\n", + "Epoch 1/20 Iteration 175/3560 Training loss: 3.0966 0.0611 sec/batch\n", + "Epoch 1/20 Iteration 176/3560 Training loss: 3.0939 0.0726 sec/batch\n", + "Epoch 1/20 Iteration 177/3560 Training loss: 3.0912 0.0676 sec/batch\n", + "Epoch 1/20 Iteration 178/3560 Training loss: 3.0884 0.0616 sec/batch\n", + "Epoch 2/20 Iteration 179/3560 Training loss: 2.6856 0.0666 sec/batch\n", + "Epoch 2/20 Iteration 180/3560 Training loss: 2.6315 0.0615 sec/batch\n", + "Epoch 2/20 Iteration 181/3560 Training loss: 2.6158 0.0611 sec/batch\n", + "Epoch 2/20 Iteration 182/3560 Training loss: 2.6099 0.0701 sec/batch\n", + "Epoch 2/20 Iteration 183/3560 Training loss: 2.6040 
0.0725 sec/batch\n", + "Epoch 2/20 Iteration 184/3560 Training loss: 2.5992 0.0656 sec/batch\n", + "Epoch 2/20 Iteration 185/3560 Training loss: 2.5982 0.0598 sec/batch\n", + "Epoch 2/20 Iteration 186/3560 Training loss: 2.5966 0.0640 sec/batch\n", + "Epoch 2/20 Iteration 187/3560 Training loss: 2.5958 0.0605 sec/batch\n", + "Epoch 2/20 Iteration 188/3560 Training loss: 2.5922 0.0653 sec/batch\n", + "Epoch 2/20 Iteration 189/3560 Training loss: 2.5886 0.0620 sec/batch\n", + "Epoch 2/20 Iteration 190/3560 Training loss: 2.5873 0.0637 sec/batch\n", + "Epoch 2/20 Iteration 191/3560 Training loss: 2.5858 0.0681 sec/batch\n", + "Epoch 2/20 Iteration 192/3560 Training loss: 2.5863 0.0679 sec/batch\n", + "Epoch 2/20 Iteration 193/3560 Training loss: 2.5845 0.0623 sec/batch\n", + "Epoch 2/20 Iteration 194/3560 Training loss: 2.5828 0.0721 sec/batch\n", + "Epoch 2/20 Iteration 195/3560 Training loss: 2.5812 0.0658 sec/batch\n", + "Epoch 2/20 Iteration 196/3560 Training loss: 2.5815 0.0682 sec/batch\n", + "Epoch 2/20 Iteration 197/3560 Training loss: 2.5797 0.0648 sec/batch\n", + "Epoch 2/20 Iteration 198/3560 Training loss: 2.5766 0.0733 sec/batch\n", + "Epoch 2/20 Iteration 199/3560 Training loss: 2.5748 0.0679 sec/batch\n", + "Epoch 2/20 Iteration 200/3560 Training loss: 2.5747 0.0655 sec/batch\n", + "Epoch 2/20 Iteration 201/3560 Training loss: 2.5731 0.0675 sec/batch\n", + "Epoch 2/20 Iteration 202/3560 Training loss: 2.5711 0.0647 sec/batch\n", + "Epoch 2/20 Iteration 203/3560 Training loss: 2.5695 0.0646 sec/batch\n", + "Epoch 2/20 Iteration 204/3560 Training loss: 2.5684 0.0739 sec/batch\n", + "Epoch 2/20 Iteration 205/3560 Training loss: 2.5666 0.0623 sec/batch\n", + "Epoch 2/20 Iteration 206/3560 Training loss: 2.5645 0.0646 sec/batch\n", + "Epoch 2/20 Iteration 207/3560 Training loss: 2.5632 0.0698 sec/batch\n", + "Epoch 2/20 Iteration 208/3560 Training loss: 2.5616 0.0623 sec/batch\n", + "Epoch 2/20 Iteration 209/3560 Training loss: 2.5606 0.0624 sec/batch\n", + 
"Epoch 2/20 Iteration 210/3560 Training loss: 2.5584 0.0750 sec/batch\n", + "Epoch 2/20 Iteration 211/3560 Training loss: 2.5564 0.0657 sec/batch\n", + "Epoch 2/20 Iteration 212/3560 Training loss: 2.5550 0.0677 sec/batch\n", + "Epoch 2/20 Iteration 213/3560 Training loss: 2.5531 0.0613 sec/batch\n", + "Epoch 2/20 Iteration 214/3560 Training loss: 2.5518 0.0633 sec/batch\n", + "Epoch 2/20 Iteration 215/3560 Training loss: 2.5500 0.0673 sec/batch\n", + "Epoch 2/20 Iteration 216/3560 Training loss: 2.5477 0.0701 sec/batch\n", + "Epoch 2/20 Iteration 217/3560 Training loss: 2.5458 0.0639 sec/batch\n", + "Epoch 2/20 Iteration 218/3560 Training loss: 2.5440 0.0668 sec/batch\n", + "Epoch 2/20 Iteration 219/3560 Training loss: 2.5424 0.0731 sec/batch\n", + "Epoch 2/20 Iteration 220/3560 Training loss: 2.5406 0.0653 sec/batch\n", + "Epoch 2/20 Iteration 221/3560 Training loss: 2.5387 0.0726 sec/batch\n", + "Epoch 2/20 Iteration 222/3560 Training loss: 2.5368 0.0645 sec/batch\n", + "Epoch 2/20 Iteration 223/3560 Training loss: 2.5353 0.0630 sec/batch\n", + "Epoch 2/20 Iteration 224/3560 Training loss: 2.5332 0.0684 sec/batch\n", + "Epoch 2/20 Iteration 225/3560 Training loss: 2.5321 0.0691 sec/batch\n", + "Epoch 2/20 Iteration 226/3560 Training loss: 2.5305 0.0626 sec/batch\n", + "Epoch 2/20 Iteration 227/3560 Training loss: 2.5289 0.0673 sec/batch\n", + "Epoch 2/20 Iteration 228/3560 Training loss: 2.5279 0.0656 sec/batch\n", + "Epoch 2/20 Iteration 229/3560 Training loss: 2.5264 0.0644 sec/batch\n", + "Epoch 2/20 Iteration 230/3560 Training loss: 2.5249 0.0681 sec/batch\n", + "Epoch 2/20 Iteration 231/3560 Training loss: 2.5235 0.0692 sec/batch\n", + "Epoch 2/20 Iteration 232/3560 Training loss: 2.5221 0.0675 sec/batch\n", + "Epoch 2/20 Iteration 233/3560 Training loss: 2.5205 0.0676 sec/batch\n", + "Epoch 2/20 Iteration 234/3560 Training loss: 2.5193 0.0699 sec/batch\n", + "Epoch 2/20 Iteration 235/3560 Training loss: 2.5180 0.0606 sec/batch\n", + "Epoch 2/20 Iteration 
236/3560 Training loss: 2.5163 0.0633 sec/batch\n", + "Epoch 2/20 Iteration 237/3560 Training loss: 2.5149 0.0627 sec/batch\n", + "Epoch 2/20 Iteration 238/3560 Training loss: 2.5138 0.0636 sec/batch\n", + "Epoch 2/20 Iteration 239/3560 Training loss: 2.5122 0.0666 sec/batch\n", + "Epoch 2/20 Iteration 240/3560 Training loss: 2.5112 0.0625 sec/batch\n", + "Epoch 2/20 Iteration 241/3560 Training loss: 2.5103 0.0643 sec/batch\n", + "Epoch 2/20 Iteration 242/3560 Training loss: 2.5088 0.0648 sec/batch\n", + "Epoch 2/20 Iteration 243/3560 Training loss: 2.5074 0.0613 sec/batch\n", + "Epoch 2/20 Iteration 244/3560 Training loss: 2.5065 0.0682 sec/batch\n", + "Epoch 2/20 Iteration 245/3560 Training loss: 2.5055 0.0645 sec/batch\n", + "Epoch 2/20 Iteration 246/3560 Training loss: 2.5040 0.0645 sec/batch\n", + "Epoch 2/20 Iteration 247/3560 Training loss: 2.5025 0.0674 sec/batch\n", + "Epoch 2/20 Iteration 248/3560 Training loss: 2.5014 0.0645 sec/batch\n", + "Epoch 2/20 Iteration 249/3560 Training loss: 2.5003 0.0674 sec/batch\n", + "Epoch 2/20 Iteration 250/3560 Training loss: 2.4995 0.0676 sec/batch\n", + "Epoch 2/20 Iteration 251/3560 Training loss: 2.4985 0.0608 sec/batch\n", + "Epoch 2/20 Iteration 252/3560 Training loss: 2.4971 0.0706 sec/batch\n", + "Epoch 2/20 Iteration 253/3560 Training loss: 2.4960 0.0615 sec/batch\n", + "Epoch 2/20 Iteration 254/3560 Training loss: 2.4954 0.0613 sec/batch\n", + "Epoch 2/20 Iteration 255/3560 Training loss: 2.4942 0.0609 sec/batch\n", + "Epoch 2/20 Iteration 256/3560 Training loss: 2.4932 0.0630 sec/batch\n", + "Epoch 2/20 Iteration 257/3560 Training loss: 2.4920 0.0642 sec/batch\n", + "Epoch 2/20 Iteration 258/3560 Training loss: 2.4909 0.0653 sec/batch\n", + "Epoch 2/20 Iteration 259/3560 Training loss: 2.4897 0.0632 sec/batch\n", + "Epoch 2/20 Iteration 260/3560 Training loss: 2.4888 0.0718 sec/batch\n", + "Epoch 2/20 Iteration 261/3560 Training loss: 2.4874 0.0687 sec/batch\n", + "Epoch 2/20 Iteration 262/3560 Training loss: 
2.4861 0.0633 sec/batch\n", + "Epoch 2/20 Iteration 263/3560 Training loss: 2.4846 0.0706 sec/batch\n", + "Epoch 2/20 Iteration 264/3560 Training loss: 2.4833 0.0649 sec/batch\n", + "Epoch 2/20 Iteration 265/3560 Training loss: 2.4822 0.0669 sec/batch\n", + "Epoch 2/20 Iteration 266/3560 Training loss: 2.4810 0.0632 sec/batch\n", + "Epoch 2/20 Iteration 267/3560 Training loss: 2.4798 0.0690 sec/batch\n", + "Epoch 2/20 Iteration 268/3560 Training loss: 2.4787 0.0651 sec/batch\n", + "Epoch 2/20 Iteration 269/3560 Training loss: 2.4775 0.0735 sec/batch\n", + "Epoch 2/20 Iteration 270/3560 Training loss: 2.4765 0.0657 sec/batch\n", + "Epoch 2/20 Iteration 271/3560 Training loss: 2.4753 0.0640 sec/batch\n", + "Epoch 2/20 Iteration 272/3560 Training loss: 2.4741 0.0682 sec/batch\n", + "Epoch 2/20 Iteration 273/3560 Training loss: 2.4727 0.0632 sec/batch\n", + "Epoch 2/20 Iteration 274/3560 Training loss: 2.4716 0.0636 sec/batch\n", + "Epoch 2/20 Iteration 275/3560 Training loss: 2.4706 0.0657 sec/batch\n", + "Epoch 2/20 Iteration 276/3560 Training loss: 2.4694 0.0612 sec/batch\n", + "Epoch 2/20 Iteration 277/3560 Training loss: 2.4683 0.0624 sec/batch\n", + "Epoch 2/20 Iteration 278/3560 Training loss: 2.4671 0.0731 sec/batch\n", + "Epoch 2/20 Iteration 279/3560 Training loss: 2.4662 0.0695 sec/batch\n", + "Epoch 2/20 Iteration 280/3560 Training loss: 2.4651 0.0692 sec/batch\n", + "Epoch 2/20 Iteration 281/3560 Training loss: 2.4638 0.0655 sec/batch\n", + "Epoch 2/20 Iteration 282/3560 Training loss: 2.4627 0.0666 sec/batch\n", + "Epoch 2/20 Iteration 283/3560 Training loss: 2.4615 0.0668 sec/batch\n", + "Epoch 2/20 Iteration 284/3560 Training loss: 2.4605 0.0629 sec/batch\n", + "Epoch 2/20 Iteration 285/3560 Training loss: 2.4595 0.0640 sec/batch\n", + "Epoch 2/20 Iteration 286/3560 Training loss: 2.4587 0.0667 sec/batch\n", + "Epoch 2/20 Iteration 287/3560 Training loss: 2.4578 0.0688 sec/batch\n", + "Epoch 2/20 Iteration 288/3560 Training loss: 2.4565 0.0651 
sec/batch\n", + "Epoch 2/20 Iteration 289/3560 Training loss: 2.4557 0.0775 sec/batch\n", + "Epoch 2/20 Iteration 290/3560 Training loss: 2.4547 0.0829 sec/batch\n", + "Epoch 2/20 Iteration 291/3560 Training loss: 2.4536 0.0802 sec/batch\n", + "Epoch 2/20 Iteration 292/3560 Training loss: 2.4525 0.0671 sec/batch\n", + "Epoch 2/20 Iteration 293/3560 Training loss: 2.4513 0.0684 sec/batch\n", + "Epoch 2/20 Iteration 294/3560 Training loss: 2.4501 0.0653 sec/batch\n", + "Epoch 2/20 Iteration 295/3560 Training loss: 2.4491 0.0693 sec/batch\n", + "Epoch 2/20 Iteration 296/3560 Training loss: 2.4482 0.0694 sec/batch\n", + "Epoch 2/20 Iteration 297/3560 Training loss: 2.4473 0.0652 sec/batch\n", + "Epoch 2/20 Iteration 298/3560 Training loss: 2.4464 0.0716 sec/batch\n", + "Epoch 2/20 Iteration 299/3560 Training loss: 2.4457 0.0703 sec/batch\n", + "Epoch 2/20 Iteration 300/3560 Training loss: 2.4447 0.0723 sec/batch\n", + "Epoch 2/20 Iteration 301/3560 Training loss: 2.4436 0.0768 sec/batch\n", + "Epoch 2/20 Iteration 302/3560 Training loss: 2.4428 0.0721 sec/batch\n", + "Epoch 2/20 Iteration 303/3560 Training loss: 2.4418 0.0673 sec/batch\n", + "Epoch 2/20 Iteration 304/3560 Training loss: 2.4407 0.0646 sec/batch\n", + "Epoch 2/20 Iteration 305/3560 Training loss: 2.4399 0.0686 sec/batch\n", + "Epoch 2/20 Iteration 306/3560 Training loss: 2.4391 0.0748 sec/batch\n", + "Epoch 2/20 Iteration 307/3560 Training loss: 2.4383 0.0693 sec/batch\n", + "Epoch 2/20 Iteration 308/3560 Training loss: 2.4374 0.0722 sec/batch\n", + "Epoch 2/20 Iteration 309/3560 Training loss: 2.4364 0.0732 sec/batch\n", + "Epoch 2/20 Iteration 310/3560 Training loss: 2.4353 0.0700 sec/batch\n", + "Epoch 2/20 Iteration 311/3560 Training loss: 2.4344 0.0695 sec/batch\n", + "Epoch 2/20 Iteration 312/3560 Training loss: 2.4337 0.0822 sec/batch\n", + "Epoch 2/20 Iteration 313/3560 Training loss: 2.4327 0.0679 sec/batch\n", + "Epoch 2/20 Iteration 314/3560 Training loss: 2.4317 0.0758 sec/batch\n", + "Epoch 
2/20 Iteration 315/3560 Training loss: 2.4308 0.0677 sec/batch\n", + "Epoch 2/20 Iteration 316/3560 Training loss: 2.4299 0.0740 sec/batch\n", + "Epoch 2/20 Iteration 317/3560 Training loss: 2.4292 0.0717 sec/batch\n", + "Epoch 2/20 Iteration 318/3560 Training loss: 2.4283 0.0718 sec/batch\n", + "Epoch 2/20 Iteration 319/3560 Training loss: 2.4276 0.0678 sec/batch\n", + "Epoch 2/20 Iteration 320/3560 Training loss: 2.4266 0.0687 sec/batch\n", + "Epoch 2/20 Iteration 321/3560 Training loss: 2.4257 0.0692 sec/batch\n", + "Epoch 2/20 Iteration 322/3560 Training loss: 2.4249 0.0655 sec/batch\n", + "Epoch 2/20 Iteration 323/3560 Training loss: 2.4240 0.0637 sec/batch\n", + "Epoch 2/20 Iteration 324/3560 Training loss: 2.4233 0.0683 sec/batch\n", + "Epoch 2/20 Iteration 325/3560 Training loss: 2.4225 0.0678 sec/batch\n", + "Epoch 2/20 Iteration 326/3560 Training loss: 2.4218 0.0678 sec/batch\n", + "Epoch 2/20 Iteration 327/3560 Training loss: 2.4209 0.0677 sec/batch\n", + "Epoch 2/20 Iteration 328/3560 Training loss: 2.4200 0.0696 sec/batch\n", + "Epoch 2/20 Iteration 329/3560 Training loss: 2.4193 0.0649 sec/batch\n", + "Epoch 2/20 Iteration 330/3560 Training loss: 2.4186 0.0719 sec/batch\n", + "Epoch 2/20 Iteration 331/3560 Training loss: 2.4178 0.0625 sec/batch\n", + "Epoch 2/20 Iteration 332/3560 Training loss: 2.4170 0.0738 sec/batch\n", + "Epoch 2/20 Iteration 333/3560 Training loss: 2.4160 0.0719 sec/batch\n", + "Epoch 2/20 Iteration 334/3560 Training loss: 2.4152 0.0701 sec/batch\n", + "Epoch 2/20 Iteration 335/3560 Training loss: 2.4142 0.0706 sec/batch\n", + "Epoch 2/20 Iteration 336/3560 Training loss: 2.4133 0.0706 sec/batch\n", + "Epoch 2/20 Iteration 337/3560 Training loss: 2.4122 0.0663 sec/batch\n", + "Epoch 2/20 Iteration 338/3560 Training loss: 2.4115 0.0659 sec/batch\n", + "Epoch 2/20 Iteration 339/3560 Training loss: 2.4108 0.0667 sec/batch\n", + "Epoch 2/20 Iteration 340/3560 Training loss: 2.4098 0.0828 sec/batch\n", + "Epoch 2/20 Iteration 341/3560 
Training loss: 2.4089 0.0641 sec/batch\n", + "Epoch 2/20 Iteration 342/3560 Training loss: 2.4081 0.0636 sec/batch\n", + "Epoch 2/20 Iteration 343/3560 Training loss: 2.4073 0.0795 sec/batch\n", + "Epoch 2/20 Iteration 344/3560 Training loss: 2.4065 0.0687 sec/batch\n", + "Epoch 2/20 Iteration 345/3560 Training loss: 2.4056 0.0721 sec/batch\n", + "Epoch 2/20 Iteration 346/3560 Training loss: 2.4049 0.1005 sec/batch\n", + "Epoch 2/20 Iteration 347/3560 Training loss: 2.4041 0.0987 sec/batch\n", + "Epoch 2/20 Iteration 348/3560 Training loss: 2.4032 0.0793 sec/batch\n", + "Epoch 2/20 Iteration 349/3560 Training loss: 2.4024 0.0677 sec/batch\n", + "Epoch 2/20 Iteration 350/3560 Training loss: 2.4016 0.0661 sec/batch\n", + "Epoch 2/20 Iteration 351/3560 Training loss: 2.4010 0.0654 sec/batch\n", + "Epoch 2/20 Iteration 352/3560 Training loss: 2.4003 0.0648 sec/batch\n", + "Epoch 2/20 Iteration 353/3560 Training loss: 2.3997 0.0731 sec/batch\n", + "Epoch 2/20 Iteration 354/3560 Training loss: 2.3992 0.0679 sec/batch\n", + "Epoch 2/20 Iteration 355/3560 Training loss: 2.3987 0.0669 sec/batch\n", + "Epoch 2/20 Iteration 356/3560 Training loss: 2.3982 0.0622 sec/batch\n", + "Epoch 3/20 Iteration 357/3560 Training loss: 2.3346 0.0677 sec/batch\n", + "Epoch 3/20 Iteration 358/3560 Training loss: 2.2739 0.0785 sec/batch\n", + "Epoch 3/20 Iteration 359/3560 Training loss: 2.2563 0.0724 sec/batch\n", + "Epoch 3/20 Iteration 360/3560 Training loss: 2.2498 0.0755 sec/batch\n", + "Epoch 3/20 Iteration 361/3560 Training loss: 2.2447 0.0642 sec/batch\n", + "Epoch 3/20 Iteration 362/3560 Training loss: 2.2420 0.0677 sec/batch\n", + "Epoch 3/20 Iteration 363/3560 Training loss: 2.2417 0.0679 sec/batch\n", + "Epoch 3/20 Iteration 364/3560 Training loss: 2.2428 0.0730 sec/batch\n", + "Epoch 3/20 Iteration 365/3560 Training loss: 2.2453 0.0684 sec/batch\n", + "Epoch 3/20 Iteration 366/3560 Training loss: 2.2449 0.0852 sec/batch\n", + "Epoch 3/20 Iteration 367/3560 Training loss: 2.2422 
0.0741 sec/batch\n", + "Epoch 3/20 Iteration 368/3560 Training loss: 2.2410 0.0748 sec/batch\n", + "Epoch 3/20 Iteration 369/3560 Training loss: 2.2407 0.0686 sec/batch\n", + "Epoch 3/20 Iteration 370/3560 Training loss: 2.2420 0.0664 sec/batch\n", + "Epoch 3/20 Iteration 371/3560 Training loss: 2.2414 0.0682 sec/batch\n", + "Epoch 3/20 Iteration 372/3560 Training loss: 2.2401 0.0665 sec/batch\n", + "Epoch 3/20 Iteration 373/3560 Training loss: 2.2394 0.0661 sec/batch\n", + "Epoch 3/20 Iteration 374/3560 Training loss: 2.2409 0.0642 sec/batch\n", + "Epoch 3/20 Iteration 375/3560 Training loss: 2.2403 0.0651 sec/batch\n", + "Epoch 3/20 Iteration 376/3560 Training loss: 2.2391 0.0693 sec/batch\n", + "Epoch 3/20 Iteration 377/3560 Training loss: 2.2378 0.0676 sec/batch\n", + "Epoch 3/20 Iteration 378/3560 Training loss: 2.2386 0.0715 sec/batch\n", + "Epoch 3/20 Iteration 379/3560 Training loss: 2.2379 0.0726 sec/batch\n", + "Epoch 3/20 Iteration 380/3560 Training loss: 2.2364 0.0649 sec/batch\n", + "Epoch 3/20 Iteration 381/3560 Training loss: 2.2354 0.0623 sec/batch\n", + "Epoch 3/20 Iteration 382/3560 Training loss: 2.2342 0.0638 sec/batch\n", + "Epoch 3/20 Iteration 383/3560 Training loss: 2.2330 0.0728 sec/batch\n", + "Epoch 3/20 Iteration 384/3560 Training loss: 2.2324 0.0655 sec/batch\n", + "Epoch 3/20 Iteration 385/3560 Training loss: 2.2322 0.0699 sec/batch\n", + "Epoch 3/20 Iteration 386/3560 Training loss: 2.2317 0.0646 sec/batch\n", + "Epoch 3/20 Iteration 387/3560 Training loss: 2.2316 0.0653 sec/batch\n", + "Epoch 3/20 Iteration 388/3560 Training loss: 2.2302 0.0665 sec/batch\n", + "Epoch 3/20 Iteration 389/3560 Training loss: 2.2293 0.0722 sec/batch\n", + "Epoch 3/20 Iteration 390/3560 Training loss: 2.2292 0.0654 sec/batch\n", + "Epoch 3/20 Iteration 391/3560 Training loss: 2.2284 0.0673 sec/batch\n", + "Epoch 3/20 Iteration 392/3560 Training loss: 2.2279 0.0770 sec/batch\n", + "Epoch 3/20 Iteration 393/3560 Training loss: 2.2272 0.0662 sec/batch\n", + 
"Epoch 3/20 Iteration 394/3560 Training loss: 2.2257 0.0649 sec/batch\n", + "Epoch 3/20 Iteration 395/3560 Training loss: 2.2246 0.0655 sec/batch\n", + "Epoch 3/20 Iteration 396/3560 Training loss: 2.2231 0.0661 sec/batch\n", + "Epoch 3/20 Iteration 397/3560 Training loss: 2.2223 0.0669 sec/batch\n", + "Epoch 3/20 Iteration 398/3560 Training loss: 2.2215 0.0675 sec/batch\n", + "Epoch 3/20 Iteration 399/3560 Training loss: 2.2203 0.0728 sec/batch\n", + "Epoch 3/20 Iteration 400/3560 Training loss: 2.2193 0.0736 sec/batch\n", + "Epoch 3/20 Iteration 401/3560 Training loss: 2.2184 0.0687 sec/batch\n", + "Epoch 3/20 Iteration 402/3560 Training loss: 2.2166 0.0699 sec/batch\n", + "Epoch 3/20 Iteration 403/3560 Training loss: 2.2161 0.0645 sec/batch\n", + "Epoch 3/20 Iteration 404/3560 Training loss: 2.2151 0.0680 sec/batch\n", + "Epoch 3/20 Iteration 405/3560 Training loss: 2.2143 0.0774 sec/batch\n", + "Epoch 3/20 Iteration 406/3560 Training loss: 2.2143 0.0658 sec/batch\n", + "Epoch 3/20 Iteration 407/3560 Training loss: 2.2133 0.0642 sec/batch\n", + "Epoch 3/20 Iteration 408/3560 Training loss: 2.2132 0.0716 sec/batch\n", + "Epoch 3/20 Iteration 409/3560 Training loss: 2.2123 0.0703 sec/batch\n", + "Epoch 3/20 Iteration 410/3560 Training loss: 2.2115 0.0641 sec/batch\n", + "Epoch 3/20 Iteration 411/3560 Training loss: 2.2105 0.0693 sec/batch\n", + "Epoch 3/20 Iteration 412/3560 Training loss: 2.2100 0.0659 sec/batch\n", + "Epoch 3/20 Iteration 413/3560 Training loss: 2.2095 0.0718 sec/batch\n", + "Epoch 3/20 Iteration 414/3560 Training loss: 2.2087 0.0686 sec/batch\n", + "Epoch 3/20 Iteration 415/3560 Training loss: 2.2079 0.0715 sec/batch\n", + "Epoch 3/20 Iteration 416/3560 Training loss: 2.2078 0.0666 sec/batch\n", + "Epoch 3/20 Iteration 417/3560 Training loss: 2.2071 0.0679 sec/batch\n", + "Epoch 3/20 Iteration 418/3560 Training loss: 2.2070 0.0674 sec/batch\n", + "Epoch 3/20 Iteration 419/3560 Training loss: 2.2069 0.0751 sec/batch\n", + "Epoch 3/20 Iteration 
420/3560 Training loss: 2.2063 0.0712 sec/batch\n", + "Epoch 3/20 Iteration 421/3560 Training loss: 2.2055 0.0656 sec/batch\n", + "Epoch 3/20 Iteration 422/3560 Training loss: 2.2052 0.0762 sec/batch\n", + "Epoch 3/20 Iteration 423/3560 Training loss: 2.2048 0.0668 sec/batch\n", + "Epoch 3/20 Iteration 424/3560 Training loss: 2.2038 0.0631 sec/batch\n", + "Epoch 3/20 Iteration 425/3560 Training loss: 2.2029 0.0700 sec/batch\n", + "Epoch 3/20 Iteration 426/3560 Training loss: 2.2023 0.0755 sec/batch\n", + "Epoch 3/20 Iteration 427/3560 Training loss: 2.2021 0.0684 sec/batch\n", + "Epoch 3/20 Iteration 428/3560 Training loss: 2.2017 0.0753 sec/batch\n", + "Epoch 3/20 Iteration 429/3560 Training loss: 2.2013 0.0667 sec/batch\n", + "Epoch 3/20 Iteration 430/3560 Training loss: 2.2004 0.0696 sec/batch\n", + "Epoch 3/20 Iteration 431/3560 Training loss: 2.1998 0.0737 sec/batch\n", + "Epoch 3/20 Iteration 432/3560 Training loss: 2.1995 0.0639 sec/batch\n", + "Epoch 3/20 Iteration 433/3560 Training loss: 2.1988 0.0656 sec/batch\n", + "Epoch 3/20 Iteration 434/3560 Training loss: 2.1984 0.0667 sec/batch\n", + "Epoch 3/20 Iteration 435/3560 Training loss: 2.1975 0.0720 sec/batch\n", + "Epoch 3/20 Iteration 436/3560 Training loss: 2.1969 0.0692 sec/batch\n", + "Epoch 3/20 Iteration 437/3560 Training loss: 2.1960 0.0740 sec/batch\n", + "Epoch 3/20 Iteration 438/3560 Training loss: 2.1955 0.0695 sec/batch\n", + "Epoch 3/20 Iteration 439/3560 Training loss: 2.1946 0.0714 sec/batch\n", + "Epoch 3/20 Iteration 440/3560 Training loss: 2.1939 0.0695 sec/batch\n", + "Epoch 3/20 Iteration 441/3560 Training loss: 2.1930 0.0667 sec/batch\n", + "Epoch 3/20 Iteration 442/3560 Training loss: 2.1922 0.0658 sec/batch\n", + "Epoch 3/20 Iteration 443/3560 Training loss: 2.1915 0.0693 sec/batch\n", + "Epoch 3/20 Iteration 444/3560 Training loss: 2.1907 0.0707 sec/batch\n", + "Epoch 3/20 Iteration 445/3560 Training loss: 2.1899 0.0671 sec/batch\n", + "Epoch 3/20 Iteration 446/3560 Training loss: 
2.1896 0.0644 sec/batch\n", + "Epoch 3/20 Iteration 447/3560 Training loss: 2.1888 0.0674 sec/batch\n", + "Epoch 3/20 Iteration 448/3560 Training loss: 2.1883 0.0660 sec/batch\n", + "Epoch 3/20 Iteration 449/3560 Training loss: 2.1875 0.0673 sec/batch\n", + "Epoch 3/20 Iteration 450/3560 Training loss: 2.1866 0.0660 sec/batch\n", + "Epoch 3/20 Iteration 451/3560 Training loss: 2.1857 0.0646 sec/batch\n", + "Epoch 3/20 Iteration 452/3560 Training loss: 2.1851 0.0665 sec/batch\n", + "Epoch 3/20 Iteration 453/3560 Training loss: 2.1845 0.0671 sec/batch\n", + "Epoch 3/20 Iteration 454/3560 Training loss: 2.1838 0.0646 sec/batch\n", + "Epoch 3/20 Iteration 455/3560 Training loss: 2.1829 0.0715 sec/batch\n", + "Epoch 3/20 Iteration 456/3560 Training loss: 2.1820 0.0634 sec/batch\n", + "Epoch 3/20 Iteration 457/3560 Training loss: 2.1816 0.0700 sec/batch\n", + "Epoch 3/20 Iteration 458/3560 Training loss: 2.1810 0.0679 sec/batch\n", + "Epoch 3/20 Iteration 459/3560 Training loss: 2.1802 0.0734 sec/batch\n", + "Epoch 3/20 Iteration 460/3560 Training loss: 2.1795 0.0731 sec/batch\n", + "Epoch 3/20 Iteration 461/3560 Training loss: 2.1788 0.0703 sec/batch\n", + "Epoch 3/20 Iteration 462/3560 Training loss: 2.1782 0.0710 sec/batch\n", + "Epoch 3/20 Iteration 463/3560 Training loss: 2.1775 0.0675 sec/batch\n", + "Epoch 3/20 Iteration 464/3560 Training loss: 2.1771 0.0718 sec/batch\n", + "Epoch 3/20 Iteration 465/3560 Training loss: 2.1766 0.0738 sec/batch\n", + "Epoch 3/20 Iteration 466/3560 Training loss: 2.1760 0.0693 sec/batch\n", + "Epoch 3/20 Iteration 467/3560 Training loss: 2.1754 0.0678 sec/batch\n", + "Epoch 3/20 Iteration 468/3560 Training loss: 2.1749 0.0802 sec/batch\n", + "Epoch 3/20 Iteration 469/3560 Training loss: 2.1742 0.0735 sec/batch\n", + "Epoch 3/20 Iteration 470/3560 Training loss: 2.1737 0.0701 sec/batch\n", + "Epoch 3/20 Iteration 471/3560 Training loss: 2.1730 0.0665 sec/batch\n", + "Epoch 3/20 Iteration 472/3560 Training loss: 2.1721 0.0668 
sec/batch\n", + "Epoch 3/20 Iteration 473/3560 Training loss: 2.1716 0.0668 sec/batch\n", + "Epoch 3/20 Iteration 474/3560 Training loss: 2.1710 0.0640 sec/batch\n", + "Epoch 3/20 Iteration 475/3560 Training loss: 2.1706 0.0668 sec/batch\n", + "Epoch 3/20 Iteration 476/3560 Training loss: 2.1700 0.0650 sec/batch\n", + "Epoch 3/20 Iteration 477/3560 Training loss: 2.1696 0.0689 sec/batch\n", + "Epoch 3/20 Iteration 478/3560 Training loss: 2.1690 0.0775 sec/batch\n", + "Epoch 3/20 Iteration 479/3560 Training loss: 2.1684 0.0672 sec/batch\n", + "Epoch 3/20 Iteration 480/3560 Training loss: 2.1680 0.0671 sec/batch\n", + "Epoch 3/20 Iteration 481/3560 Training loss: 2.1675 0.0640 sec/batch\n", + "Epoch 3/20 Iteration 482/3560 Training loss: 2.1667 0.0680 sec/batch\n", + "Epoch 3/20 Iteration 483/3560 Training loss: 2.1663 0.0696 sec/batch\n", + "Epoch 3/20 Iteration 484/3560 Training loss: 2.1659 0.0859 sec/batch\n", + "Epoch 3/20 Iteration 485/3560 Training loss: 2.1654 0.0728 sec/batch\n", + "Epoch 3/20 Iteration 486/3560 Training loss: 2.1649 0.0688 sec/batch\n", + "Epoch 3/20 Iteration 487/3560 Training loss: 2.1643 0.0648 sec/batch\n", + "Epoch 3/20 Iteration 488/3560 Training loss: 2.1636 0.0724 sec/batch\n", + "Epoch 3/20 Iteration 489/3560 Training loss: 2.1632 0.0696 sec/batch\n", + "Epoch 3/20 Iteration 490/3560 Training loss: 2.1628 0.0686 sec/batch\n", + "Epoch 3/20 Iteration 491/3560 Training loss: 2.1622 0.0696 sec/batch\n", + "Epoch 3/20 Iteration 492/3560 Training loss: 2.1617 0.0699 sec/batch\n", + "Epoch 3/20 Iteration 493/3560 Training loss: 2.1613 0.0663 sec/batch\n", + "Epoch 3/20 Iteration 494/3560 Training loss: 2.1608 0.0697 sec/batch\n", + "Epoch 3/20 Iteration 495/3560 Training loss: 2.1606 0.0663 sec/batch\n", + "Epoch 3/20 Iteration 496/3560 Training loss: 2.1600 0.0686 sec/batch\n", + "Epoch 3/20 Iteration 497/3560 Training loss: 2.1596 0.0689 sec/batch\n", + "Epoch 3/20 Iteration 498/3560 Training loss: 2.1591 0.0679 sec/batch\n", + "Epoch 
3/20 Iteration 499/3560 Training loss: 2.1585 0.0672 sec/batch\n", + "Epoch 3/20 Iteration 500/3560 Training loss: 2.1579 0.0650 sec/batch\n", + "Epoch 3/20 Iteration 501/3560 Training loss: 2.1574 0.0660 sec/batch\n", + "Epoch 3/20 Iteration 502/3560 Training loss: 2.1570 0.0732 sec/batch\n", + "Epoch 3/20 Iteration 503/3560 Training loss: 2.1565 0.0710 sec/batch\n", + "Epoch 3/20 Iteration 504/3560 Training loss: 2.1562 0.0721 sec/batch\n", + "Epoch 3/20 Iteration 505/3560 Training loss: 2.1557 0.0643 sec/batch\n", + "Epoch 3/20 Iteration 506/3560 Training loss: 2.1550 0.0656 sec/batch\n", + "Epoch 3/20 Iteration 507/3560 Training loss: 2.1545 0.0676 sec/batch\n", + "Epoch 3/20 Iteration 508/3560 Training loss: 2.1543 0.0748 sec/batch\n", + "Epoch 3/20 Iteration 509/3560 Training loss: 2.1538 0.0732 sec/batch\n", + "Epoch 3/20 Iteration 510/3560 Training loss: 2.1534 0.0781 sec/batch\n", + "Epoch 3/20 Iteration 511/3560 Training loss: 2.1528 0.0683 sec/batch\n", + "Epoch 3/20 Iteration 512/3560 Training loss: 2.1523 0.0663 sec/batch\n", + "Epoch 3/20 Iteration 513/3560 Training loss: 2.1518 0.0690 sec/batch\n", + "Epoch 3/20 Iteration 514/3560 Training loss: 2.1513 0.0676 sec/batch\n", + "Epoch 3/20 Iteration 515/3560 Training loss: 2.1507 0.0677 sec/batch\n", + "Epoch 3/20 Iteration 516/3560 Training loss: 2.1504 0.0744 sec/batch\n", + "Epoch 3/20 Iteration 517/3560 Training loss: 2.1501 0.0666 sec/batch\n", + "Epoch 3/20 Iteration 518/3560 Training loss: 2.1496 0.0655 sec/batch\n", + "Epoch 3/20 Iteration 519/3560 Training loss: 2.1491 0.0661 sec/batch\n", + "Epoch 3/20 Iteration 520/3560 Training loss: 2.1487 0.0653 sec/batch\n", + "Epoch 3/20 Iteration 521/3560 Training loss: 2.1483 0.0682 sec/batch\n", + "Epoch 3/20 Iteration 522/3560 Training loss: 2.1478 0.0671 sec/batch\n", + "Epoch 3/20 Iteration 523/3560 Training loss: 2.1474 0.0676 sec/batch\n", + "Epoch 3/20 Iteration 524/3560 Training loss: 2.1471 0.0702 sec/batch\n", + "Epoch 3/20 Iteration 525/3560 
Training loss: 2.1467 0.0697 sec/batch\n", + "Epoch 3/20 Iteration 526/3560 Training loss: 2.1462 0.0677 sec/batch\n", + "Epoch 3/20 Iteration 527/3560 Training loss: 2.1456 0.0661 sec/batch\n", + "Epoch 3/20 Iteration 528/3560 Training loss: 2.1451 0.0678 sec/batch\n", + "Epoch 3/20 Iteration 529/3560 Training loss: 2.1448 0.0690 sec/batch\n", + "Epoch 3/20 Iteration 530/3560 Training loss: 2.1444 0.0683 sec/batch\n", + "Epoch 3/20 Iteration 531/3560 Training loss: 2.1440 0.0690 sec/batch\n", + "Epoch 3/20 Iteration 532/3560 Training loss: 2.1436 0.0680 sec/batch\n", + "Epoch 3/20 Iteration 533/3560 Training loss: 2.1431 0.0680 sec/batch\n", + "Epoch 3/20 Iteration 534/3560 Training loss: 2.1426 0.0737 sec/batch\n", + "Epoch 4/20 Iteration 535/3560 Training loss: 2.1422 0.0676 sec/batch\n", + "Epoch 4/20 Iteration 536/3560 Training loss: 2.0884 0.0651 sec/batch\n", + "Epoch 4/20 Iteration 537/3560 Training loss: 2.0706 0.0646 sec/batch\n", + "Epoch 4/20 Iteration 538/3560 Training loss: 2.0642 0.0676 sec/batch\n", + "Epoch 4/20 Iteration 539/3560 Training loss: 2.0610 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 540/3560 Training loss: 2.0535 0.0657 sec/batch\n", + "Epoch 4/20 Iteration 541/3560 Training loss: 2.0524 0.0681 sec/batch\n", + "Epoch 4/20 Iteration 542/3560 Training loss: 2.0535 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 543/3560 Training loss: 2.0564 0.0663 sec/batch\n", + "Epoch 4/20 Iteration 544/3560 Training loss: 2.0550 0.0644 sec/batch\n", + "Epoch 4/20 Iteration 545/3560 Training loss: 2.0517 0.0618 sec/batch\n", + "Epoch 4/20 Iteration 546/3560 Training loss: 2.0495 0.0634 sec/batch\n", + "Epoch 4/20 Iteration 547/3560 Training loss: 2.0486 0.0633 sec/batch\n", + "Epoch 4/20 Iteration 548/3560 Training loss: 2.0509 0.0679 sec/batch\n", + "Epoch 4/20 Iteration 549/3560 Training loss: 2.0499 0.0635 sec/batch\n", + "Epoch 4/20 Iteration 550/3560 Training loss: 2.0482 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 551/3560 Training loss: 2.0480 
0.0708 sec/batch\n", + "Epoch 4/20 Iteration 552/3560 Training loss: 2.0502 0.0651 sec/batch\n", + "[... iterations 553-711 elided: training loss decreases steadily from 2.0500 to 1.9858, ~0.065-0.075 sec/batch ...]\n", + "Epoch 4/20 Iteration 712/3560 Training loss: 1.9855 0.0661 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 2.0127 0.0710 sec/batch\n", + "[... iterations 714-889 elided: training loss decreases steadily from 1.9657 to 1.8773, ~0.065-0.075 sec/batch ...]\n", + "Epoch 5/20 Iteration 890/3560 Training loss: 1.8771 0.0696 sec/batch\n", + "Epoch 6/20 Iteration 891/3560 Training loss: 1.9196 0.0794 sec/batch\n", + "[... iterations 892-1067 elided: training loss decreases steadily from 1.8743 to 1.7965, ~0.065-0.075 sec/batch ...]\n", + "Epoch 6/20 Iteration 1068/3560 Training loss: 1.7964 0.0687 sec/batch\n", + "Epoch 7/20 Iteration 1069/3560 Training loss: 1.8518 0.0706 sec/batch\n", + "[... iterations 1070-1100 elided: training loss decreases steadily from 1.8094 to 1.7601, ~0.065-0.075 sec/batch ...]\n", + "Epoch 7/20 Iteration 1101/3560 Training loss: 1.7603 0.0654 sec/batch\n", + "Epoch 7/20 
Iteration 1102/3560 Training loss: 1.7607 0.0670 sec/batch\n", + "Epoch 7/20 Iteration 1103/3560 Training loss: 1.7604 0.0666 sec/batch\n", + "Epoch 7/20 Iteration 1104/3560 Training loss: 1.7600 0.0670 sec/batch\n", + "Epoch 7/20 Iteration 1105/3560 Training loss: 1.7593 0.0659 sec/batch\n", + "Epoch 7/20 Iteration 1106/3560 Training loss: 1.7582 0.0701 sec/batch\n", + "Epoch 7/20 Iteration 1107/3560 Training loss: 1.7567 0.0647 sec/batch\n", + "Epoch 7/20 Iteration 1108/3560 Training loss: 1.7558 0.0654 sec/batch\n", + "Epoch 7/20 Iteration 1109/3560 Training loss: 1.7551 0.0752 sec/batch\n", + "Epoch 7/20 Iteration 1110/3560 Training loss: 1.7554 0.0673 sec/batch\n", + "Epoch 7/20 Iteration 1111/3560 Training loss: 1.7549 0.0645 sec/batch\n", + "Epoch 7/20 Iteration 1112/3560 Training loss: 1.7540 0.0661 sec/batch\n", + "Epoch 7/20 Iteration 1113/3560 Training loss: 1.7541 0.0672 sec/batch\n", + "Epoch 7/20 Iteration 1114/3560 Training loss: 1.7527 0.0683 sec/batch\n", + "Epoch 7/20 Iteration 1115/3560 Training loss: 1.7522 0.0651 sec/batch\n", + "Epoch 7/20 Iteration 1116/3560 Training loss: 1.7516 0.0660 sec/batch\n", + "Epoch 7/20 Iteration 1117/3560 Training loss: 1.7513 0.0648 sec/batch\n", + "Epoch 7/20 Iteration 1118/3560 Training loss: 1.7520 0.0666 sec/batch\n", + "Epoch 7/20 Iteration 1119/3560 Training loss: 1.7514 0.0824 sec/batch\n", + "Epoch 7/20 Iteration 1120/3560 Training loss: 1.7522 0.0690 sec/batch\n", + "Epoch 7/20 Iteration 1121/3560 Training loss: 1.7520 0.0711 sec/batch\n", + "Epoch 7/20 Iteration 1122/3560 Training loss: 1.7518 0.0693 sec/batch\n", + "Epoch 7/20 Iteration 1123/3560 Training loss: 1.7514 0.0681 sec/batch\n", + "Epoch 7/20 Iteration 1124/3560 Training loss: 1.7515 0.0649 sec/batch\n", + "Epoch 7/20 Iteration 1125/3560 Training loss: 1.7517 0.0662 sec/batch\n", + "Epoch 7/20 Iteration 1126/3560 Training loss: 1.7513 0.0759 sec/batch\n", + "Epoch 7/20 Iteration 1127/3560 Training loss: 1.7507 0.0706 sec/batch\n", + "Epoch 
7/20 Iteration 1128/3560 Training loss: 1.7511 0.0660 sec/batch\n", + "Epoch 7/20 Iteration 1129/3560 Training loss: 1.7509 0.0677 sec/batch\n", + "Epoch 7/20 Iteration 1130/3560 Training loss: 1.7517 0.0721 sec/batch\n", + "Epoch 7/20 Iteration 1131/3560 Training loss: 1.7519 0.0683 sec/batch\n", + "Epoch 7/20 Iteration 1132/3560 Training loss: 1.7520 0.0677 sec/batch\n", + "Epoch 7/20 Iteration 1133/3560 Training loss: 1.7520 0.0657 sec/batch\n", + "Epoch 7/20 Iteration 1134/3560 Training loss: 1.7523 0.0656 sec/batch\n", + "Epoch 7/20 Iteration 1135/3560 Training loss: 1.7524 0.0656 sec/batch\n", + "Epoch 7/20 Iteration 1136/3560 Training loss: 1.7520 0.0814 sec/batch\n", + "Epoch 7/20 Iteration 1137/3560 Training loss: 1.7518 0.0704 sec/batch\n", + "Epoch 7/20 Iteration 1138/3560 Training loss: 1.7518 0.0738 sec/batch\n", + "Epoch 7/20 Iteration 1139/3560 Training loss: 1.7522 0.0648 sec/batch\n", + "Epoch 7/20 Iteration 1140/3560 Training loss: 1.7523 0.0693 sec/batch\n", + "Epoch 7/20 Iteration 1141/3560 Training loss: 1.7526 0.0694 sec/batch\n", + "Epoch 7/20 Iteration 1142/3560 Training loss: 1.7521 0.0768 sec/batch\n", + "Epoch 7/20 Iteration 1143/3560 Training loss: 1.7522 0.0662 sec/batch\n", + "Epoch 7/20 Iteration 1144/3560 Training loss: 1.7522 0.0671 sec/batch\n", + "Epoch 7/20 Iteration 1145/3560 Training loss: 1.7519 0.0716 sec/batch\n", + "Epoch 7/20 Iteration 1146/3560 Training loss: 1.7519 0.0699 sec/batch\n", + "Epoch 7/20 Iteration 1147/3560 Training loss: 1.7514 0.0679 sec/batch\n", + "Epoch 7/20 Iteration 1148/3560 Training loss: 1.7511 0.0655 sec/batch\n", + "Epoch 7/20 Iteration 1149/3560 Training loss: 1.7504 0.0679 sec/batch\n", + "Epoch 7/20 Iteration 1150/3560 Training loss: 1.7506 0.0730 sec/batch\n", + "Epoch 7/20 Iteration 1151/3560 Training loss: 1.7500 0.0661 sec/batch\n", + "Epoch 7/20 Iteration 1152/3560 Training loss: 1.7498 0.0640 sec/batch\n", + "Epoch 7/20 Iteration 1153/3560 Training loss: 1.7493 0.0654 sec/batch\n", + 
"Epoch 7/20 Iteration 1154/3560 Training loss: 1.7488 0.0769 sec/batch\n", + "Epoch 7/20 Iteration 1155/3560 Training loss: 1.7485 0.0713 sec/batch\n", + "Epoch 7/20 Iteration 1156/3560 Training loss: 1.7481 0.0652 sec/batch\n", + "Epoch 7/20 Iteration 1157/3560 Training loss: 1.7475 0.0661 sec/batch\n", + "Epoch 7/20 Iteration 1158/3560 Training loss: 1.7475 0.0649 sec/batch\n", + "Epoch 7/20 Iteration 1159/3560 Training loss: 1.7471 0.0747 sec/batch\n", + "Epoch 7/20 Iteration 1160/3560 Training loss: 1.7470 0.0692 sec/batch\n", + "Epoch 7/20 Iteration 1161/3560 Training loss: 1.7465 0.0653 sec/batch\n", + "Epoch 7/20 Iteration 1162/3560 Training loss: 1.7460 0.0658 sec/batch\n", + "Epoch 7/20 Iteration 1163/3560 Training loss: 1.7456 0.0758 sec/batch\n", + "Epoch 7/20 Iteration 1164/3560 Training loss: 1.7455 0.0663 sec/batch\n", + "Epoch 7/20 Iteration 1165/3560 Training loss: 1.7453 0.0670 sec/batch\n", + "Epoch 7/20 Iteration 1166/3560 Training loss: 1.7448 0.0648 sec/batch\n", + "Epoch 7/20 Iteration 1167/3560 Training loss: 1.7443 0.0670 sec/batch\n", + "Epoch 7/20 Iteration 1168/3560 Training loss: 1.7437 0.0645 sec/batch\n", + "Epoch 7/20 Iteration 1169/3560 Training loss: 1.7436 0.0727 sec/batch\n", + "Epoch 7/20 Iteration 1170/3560 Training loss: 1.7434 0.0709 sec/batch\n", + "Epoch 7/20 Iteration 1171/3560 Training loss: 1.7431 0.0692 sec/batch\n", + "Epoch 7/20 Iteration 1172/3560 Training loss: 1.7427 0.0707 sec/batch\n", + "Epoch 7/20 Iteration 1173/3560 Training loss: 1.7425 0.0666 sec/batch\n", + "Epoch 7/20 Iteration 1174/3560 Training loss: 1.7423 0.0702 sec/batch\n", + "Epoch 7/20 Iteration 1175/3560 Training loss: 1.7420 0.0669 sec/batch\n", + "Epoch 7/20 Iteration 1176/3560 Training loss: 1.7419 0.0659 sec/batch\n", + "Epoch 7/20 Iteration 1177/3560 Training loss: 1.7419 0.0644 sec/batch\n", + "Epoch 7/20 Iteration 1178/3560 Training loss: 1.7419 0.0669 sec/batch\n", + "Epoch 7/20 Iteration 1179/3560 Training loss: 1.7417 0.0673 sec/batch\n", 
+ "Epoch 7/20 Iteration 1180/3560 Training loss: 1.7415 0.0639 sec/batch\n", + "Epoch 7/20 Iteration 1181/3560 Training loss: 1.7412 0.0667 sec/batch\n", + "Epoch 7/20 Iteration 1182/3560 Training loss: 1.7410 0.0672 sec/batch\n", + "Epoch 7/20 Iteration 1183/3560 Training loss: 1.7407 0.0647 sec/batch\n", + "Epoch 7/20 Iteration 1184/3560 Training loss: 1.7403 0.0672 sec/batch\n", + "Epoch 7/20 Iteration 1185/3560 Training loss: 1.7401 0.0717 sec/batch\n", + "Epoch 7/20 Iteration 1186/3560 Training loss: 1.7399 0.0675 sec/batch\n", + "Epoch 7/20 Iteration 1187/3560 Training loss: 1.7397 0.0695 sec/batch\n", + "Epoch 7/20 Iteration 1188/3560 Training loss: 1.7395 0.0713 sec/batch\n", + "Epoch 7/20 Iteration 1189/3560 Training loss: 1.7394 0.0735 sec/batch\n", + "Epoch 7/20 Iteration 1190/3560 Training loss: 1.7389 0.0689 sec/batch\n", + "Epoch 7/20 Iteration 1191/3560 Training loss: 1.7385 0.0690 sec/batch\n", + "Epoch 7/20 Iteration 1192/3560 Training loss: 1.7385 0.0670 sec/batch\n", + "Epoch 7/20 Iteration 1193/3560 Training loss: 1.7383 0.0677 sec/batch\n", + "Epoch 7/20 Iteration 1194/3560 Training loss: 1.7379 0.0672 sec/batch\n", + "Epoch 7/20 Iteration 1195/3560 Training loss: 1.7380 0.0711 sec/batch\n", + "Epoch 7/20 Iteration 1196/3560 Training loss: 1.7380 0.0685 sec/batch\n", + "Epoch 7/20 Iteration 1197/3560 Training loss: 1.7377 0.0666 sec/batch\n", + "Epoch 7/20 Iteration 1198/3560 Training loss: 1.7376 0.0666 sec/batch\n", + "Epoch 7/20 Iteration 1199/3560 Training loss: 1.7371 0.0664 sec/batch\n", + "Epoch 7/20 Iteration 1200/3560 Training loss: 1.7367 0.0707 sec/batch\n", + "Epoch 7/20 Iteration 1201/3560 Training loss: 1.7367 0.0655 sec/batch\n", + "Epoch 7/20 Iteration 1202/3560 Training loss: 1.7366 0.0673 sec/batch\n", + "Epoch 7/20 Iteration 1203/3560 Training loss: 1.7365 0.0676 sec/batch\n", + "Epoch 7/20 Iteration 1204/3560 Training loss: 1.7365 0.0689 sec/batch\n", + "Epoch 7/20 Iteration 1205/3560 Training loss: 1.7365 0.0718 
sec/batch\n", + "Epoch 7/20 Iteration 1206/3560 Training loss: 1.7364 0.0685 sec/batch\n", + "Epoch 7/20 Iteration 1207/3560 Training loss: 1.7364 0.0706 sec/batch\n", + "Epoch 7/20 Iteration 1208/3560 Training loss: 1.7363 0.0689 sec/batch\n", + "Epoch 7/20 Iteration 1209/3560 Training loss: 1.7365 0.0674 sec/batch\n", + "Epoch 7/20 Iteration 1210/3560 Training loss: 1.7363 0.0702 sec/batch\n", + "Epoch 7/20 Iteration 1211/3560 Training loss: 1.7361 0.0710 sec/batch\n", + "Epoch 7/20 Iteration 1212/3560 Training loss: 1.7360 0.0648 sec/batch\n", + "Epoch 7/20 Iteration 1213/3560 Training loss: 1.7358 0.0717 sec/batch\n", + "Epoch 7/20 Iteration 1214/3560 Training loss: 1.7359 0.0676 sec/batch\n", + "Epoch 7/20 Iteration 1215/3560 Training loss: 1.7358 0.0731 sec/batch\n", + "Epoch 7/20 Iteration 1216/3560 Training loss: 1.7359 0.0658 sec/batch\n", + "Epoch 7/20 Iteration 1217/3560 Training loss: 1.7358 0.0685 sec/batch\n", + "Epoch 7/20 Iteration 1218/3560 Training loss: 1.7356 0.0730 sec/batch\n", + "Epoch 7/20 Iteration 1219/3560 Training loss: 1.7354 0.0682 sec/batch\n", + "Epoch 7/20 Iteration 1220/3560 Training loss: 1.7353 0.0646 sec/batch\n", + "Epoch 7/20 Iteration 1221/3560 Training loss: 1.7353 0.0699 sec/batch\n", + "Epoch 7/20 Iteration 1222/3560 Training loss: 1.7353 0.0686 sec/batch\n", + "Epoch 7/20 Iteration 1223/3560 Training loss: 1.7352 0.0715 sec/batch\n", + "Epoch 7/20 Iteration 1224/3560 Training loss: 1.7350 0.0658 sec/batch\n", + "Epoch 7/20 Iteration 1225/3560 Training loss: 1.7350 0.0668 sec/batch\n", + "Epoch 7/20 Iteration 1226/3560 Training loss: 1.7348 0.0745 sec/batch\n", + "Epoch 7/20 Iteration 1227/3560 Training loss: 1.7345 0.0749 sec/batch\n", + "Epoch 7/20 Iteration 1228/3560 Training loss: 1.7345 0.0704 sec/batch\n", + "Epoch 7/20 Iteration 1229/3560 Training loss: 1.7346 0.0658 sec/batch\n", + "Epoch 7/20 Iteration 1230/3560 Training loss: 1.7345 0.0665 sec/batch\n", + "Epoch 7/20 Iteration 1231/3560 Training loss: 1.7344 
0.0748 sec/batch\n", + "Epoch 7/20 Iteration 1232/3560 Training loss: 1.7343 0.0696 sec/batch\n", + "Epoch 7/20 Iteration 1233/3560 Training loss: 1.7342 0.0665 sec/batch\n", + "Epoch 7/20 Iteration 1234/3560 Training loss: 1.7341 0.0666 sec/batch\n", + "Epoch 7/20 Iteration 1235/3560 Training loss: 1.7341 0.0644 sec/batch\n", + "Epoch 7/20 Iteration 1236/3560 Training loss: 1.7344 0.0695 sec/batch\n", + "Epoch 7/20 Iteration 1237/3560 Training loss: 1.7342 0.0748 sec/batch\n", + "Epoch 7/20 Iteration 1238/3560 Training loss: 1.7341 0.0655 sec/batch\n", + "Epoch 7/20 Iteration 1239/3560 Training loss: 1.7338 0.0652 sec/batch\n", + "Epoch 7/20 Iteration 1240/3560 Training loss: 1.7336 0.0637 sec/batch\n", + "Epoch 7/20 Iteration 1241/3560 Training loss: 1.7336 0.0657 sec/batch\n", + "Epoch 7/20 Iteration 1242/3560 Training loss: 1.7336 0.0680 sec/batch\n", + "Epoch 7/20 Iteration 1243/3560 Training loss: 1.7336 0.0672 sec/batch\n", + "Epoch 7/20 Iteration 1244/3560 Training loss: 1.7334 0.0654 sec/batch\n", + "Epoch 7/20 Iteration 1245/3560 Training loss: 1.7332 0.0645 sec/batch\n", + "Epoch 7/20 Iteration 1246/3560 Training loss: 1.7331 0.0646 sec/batch\n", + "Epoch 8/20 Iteration 1247/3560 Training loss: 1.8097 0.0647 sec/batch\n", + "Epoch 8/20 Iteration 1248/3560 Training loss: 1.7581 0.0774 sec/batch\n", + "Epoch 8/20 Iteration 1249/3560 Training loss: 1.7389 0.0665 sec/batch\n", + "Epoch 8/20 Iteration 1250/3560 Training loss: 1.7313 0.0672 sec/batch\n", + "Epoch 8/20 Iteration 1251/3560 Training loss: 1.7264 0.0675 sec/batch\n", + "Epoch 8/20 Iteration 1252/3560 Training loss: 1.7155 0.0714 sec/batch\n", + "Epoch 8/20 Iteration 1253/3560 Training loss: 1.7147 0.0676 sec/batch\n", + "Epoch 8/20 Iteration 1254/3560 Training loss: 1.7130 0.0653 sec/batch\n", + "Epoch 8/20 Iteration 1255/3560 Training loss: 1.7148 0.0734 sec/batch\n", + "Epoch 8/20 Iteration 1256/3560 Training loss: 1.7141 0.0692 sec/batch\n", + "Epoch 8/20 Iteration 1257/3560 Training loss: 
1.7105 0.0692 sec/batch\n", + "Epoch 8/20 Iteration 1258/3560 Training loss: 1.7083 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1259/3560 Training loss: 1.7071 0.0660 sec/batch\n", + "Epoch 8/20 Iteration 1260/3560 Training loss: 1.7100 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1261/3560 Training loss: 1.7092 0.0674 sec/batch\n", + "Epoch 8/20 Iteration 1262/3560 Training loss: 1.7076 0.0678 sec/batch\n", + "Epoch 8/20 Iteration 1263/3560 Training loss: 1.7072 0.0658 sec/batch\n", + "Epoch 8/20 Iteration 1264/3560 Training loss: 1.7086 0.0645 sec/batch\n", + "Epoch 8/20 Iteration 1265/3560 Training loss: 1.7084 0.0697 sec/batch\n", + "Epoch 8/20 Iteration 1266/3560 Training loss: 1.7096 0.0690 sec/batch\n", + "Epoch 8/20 Iteration 1267/3560 Training loss: 1.7088 0.0708 sec/batch\n", + "Epoch 8/20 Iteration 1268/3560 Training loss: 1.7098 0.0712 sec/batch\n", + "Epoch 8/20 Iteration 1269/3560 Training loss: 1.7087 0.0767 sec/batch\n", + "Epoch 8/20 Iteration 1270/3560 Training loss: 1.7086 0.0673 sec/batch\n", + "Epoch 8/20 Iteration 1271/3560 Training loss: 1.7080 0.0687 sec/batch\n", + "Epoch 8/20 Iteration 1272/3560 Training loss: 1.7066 0.0687 sec/batch\n", + "Epoch 8/20 Iteration 1273/3560 Training loss: 1.7055 0.0632 sec/batch\n", + "Epoch 8/20 Iteration 1274/3560 Training loss: 1.7058 0.0663 sec/batch\n", + "Epoch 8/20 Iteration 1275/3560 Training loss: 1.7068 0.0669 sec/batch\n", + "Epoch 8/20 Iteration 1276/3560 Training loss: 1.7071 0.0665 sec/batch\n", + "Epoch 8/20 Iteration 1277/3560 Training loss: 1.7065 0.0656 sec/batch\n", + "Epoch 8/20 Iteration 1278/3560 Training loss: 1.7052 0.0674 sec/batch\n", + "Epoch 8/20 Iteration 1279/3560 Training loss: 1.7054 0.0774 sec/batch\n", + "Epoch 8/20 Iteration 1280/3560 Training loss: 1.7059 0.0672 sec/batch\n", + "Epoch 8/20 Iteration 1281/3560 Training loss: 1.7056 0.0671 sec/batch\n", + "Epoch 8/20 Iteration 1282/3560 Training loss: 1.7052 0.0679 sec/batch\n", + "Epoch 8/20 Iteration 1283/3560 Training 
loss: 1.7047 0.0663 sec/batch\n", + "Epoch 8/20 Iteration 1284/3560 Training loss: 1.7035 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1285/3560 Training loss: 1.7022 0.0700 sec/batch\n", + "Epoch 8/20 Iteration 1286/3560 Training loss: 1.7015 0.0638 sec/batch\n", + "Epoch 8/20 Iteration 1287/3560 Training loss: 1.7007 0.0691 sec/batch\n", + "Epoch 8/20 Iteration 1288/3560 Training loss: 1.7010 0.0663 sec/batch\n", + "Epoch 8/20 Iteration 1289/3560 Training loss: 1.7004 0.0805 sec/batch\n", + "Epoch 8/20 Iteration 1290/3560 Training loss: 1.6996 0.0830 sec/batch\n", + "Epoch 8/20 Iteration 1291/3560 Training loss: 1.6998 0.0664 sec/batch\n", + "Epoch 8/20 Iteration 1292/3560 Training loss: 1.6987 0.0659 sec/batch\n", + "Epoch 8/20 Iteration 1293/3560 Training loss: 1.6984 0.0677 sec/batch\n", + "Epoch 8/20 Iteration 1294/3560 Training loss: 1.6978 0.0657 sec/batch\n", + "Epoch 8/20 Iteration 1295/3560 Training loss: 1.6974 0.0681 sec/batch\n", + "Epoch 8/20 Iteration 1296/3560 Training loss: 1.6982 0.0683 sec/batch\n", + "Epoch 8/20 Iteration 1297/3560 Training loss: 1.6977 0.0659 sec/batch\n", + "Epoch 8/20 Iteration 1298/3560 Training loss: 1.6986 0.0666 sec/batch\n", + "Epoch 8/20 Iteration 1299/3560 Training loss: 1.6983 0.0668 sec/batch\n", + "Epoch 8/20 Iteration 1300/3560 Training loss: 1.6980 0.0655 sec/batch\n", + "Epoch 8/20 Iteration 1301/3560 Training loss: 1.6976 0.0674 sec/batch\n", + "Epoch 8/20 Iteration 1302/3560 Training loss: 1.6976 0.0708 sec/batch\n", + "Epoch 8/20 Iteration 1303/3560 Training loss: 1.6977 0.0659 sec/batch\n", + "Epoch 8/20 Iteration 1304/3560 Training loss: 1.6973 0.0654 sec/batch\n", + "Epoch 8/20 Iteration 1305/3560 Training loss: 1.6968 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1306/3560 Training loss: 1.6972 0.0703 sec/batch\n", + "Epoch 8/20 Iteration 1307/3560 Training loss: 1.6970 0.0713 sec/batch\n", + "Epoch 8/20 Iteration 1308/3560 Training loss: 1.6977 0.0685 sec/batch\n", + "Epoch 8/20 Iteration 1309/3560 
Training loss: 1.6982 0.0688 sec/batch\n", + "Epoch 8/20 Iteration 1310/3560 Training loss: 1.6982 0.0713 sec/batch\n", + "Epoch 8/20 Iteration 1311/3560 Training loss: 1.6981 0.0752 sec/batch\n", + "Epoch 8/20 Iteration 1312/3560 Training loss: 1.6984 0.0763 sec/batch\n", + "Epoch 8/20 Iteration 1313/3560 Training loss: 1.6986 0.0712 sec/batch\n", + "Epoch 8/20 Iteration 1314/3560 Training loss: 1.6981 0.0699 sec/batch\n", + "Epoch 8/20 Iteration 1315/3560 Training loss: 1.6981 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1316/3560 Training loss: 1.6981 0.0704 sec/batch\n", + "Epoch 8/20 Iteration 1317/3560 Training loss: 1.6984 0.0669 sec/batch\n", + "Epoch 8/20 Iteration 1318/3560 Training loss: 1.6985 0.0659 sec/batch\n", + "Epoch 8/20 Iteration 1319/3560 Training loss: 1.6988 0.0656 sec/batch\n", + "Epoch 8/20 Iteration 1320/3560 Training loss: 1.6985 0.0709 sec/batch\n", + "Epoch 8/20 Iteration 1321/3560 Training loss: 1.6982 0.0719 sec/batch\n", + "Epoch 8/20 Iteration 1322/3560 Training loss: 1.6984 0.0662 sec/batch\n", + "Epoch 8/20 Iteration 1323/3560 Training loss: 1.6982 0.0687 sec/batch\n", + "Epoch 8/20 Iteration 1324/3560 Training loss: 1.6983 0.0682 sec/batch\n", + "Epoch 8/20 Iteration 1325/3560 Training loss: 1.6978 0.0647 sec/batch\n", + "Epoch 8/20 Iteration 1326/3560 Training loss: 1.6976 0.0682 sec/batch\n", + "Epoch 8/20 Iteration 1327/3560 Training loss: 1.6972 0.0715 sec/batch\n", + "Epoch 8/20 Iteration 1328/3560 Training loss: 1.6972 0.0665 sec/batch\n", + "Epoch 8/20 Iteration 1329/3560 Training loss: 1.6967 0.0664 sec/batch\n", + "Epoch 8/20 Iteration 1330/3560 Training loss: 1.6966 0.0673 sec/batch\n", + "Epoch 8/20 Iteration 1331/3560 Training loss: 1.6962 0.0679 sec/batch\n", + "Epoch 8/20 Iteration 1332/3560 Training loss: 1.6958 0.0672 sec/batch\n", + "Epoch 8/20 Iteration 1333/3560 Training loss: 1.6955 0.0831 sec/batch\n", + "Epoch 8/20 Iteration 1334/3560 Training loss: 1.6952 0.0664 sec/batch\n", + "Epoch 8/20 Iteration 
1335/3560 Training loss: 1.6946 0.0681 sec/batch\n", + "Epoch 8/20 Iteration 1336/3560 Training loss: 1.6946 0.0672 sec/batch\n", + "Epoch 8/20 Iteration 1337/3560 Training loss: 1.6943 0.0692 sec/batch\n", + "Epoch 8/20 Iteration 1338/3560 Training loss: 1.6941 0.0677 sec/batch\n", + "Epoch 8/20 Iteration 1339/3560 Training loss: 1.6936 0.0675 sec/batch\n", + "Epoch 8/20 Iteration 1340/3560 Training loss: 1.6933 0.0658 sec/batch\n", + "Epoch 8/20 Iteration 1341/3560 Training loss: 1.6928 0.0664 sec/batch\n", + "Epoch 8/20 Iteration 1342/3560 Training loss: 1.6928 0.0646 sec/batch\n", + "Epoch 8/20 Iteration 1343/3560 Training loss: 1.6926 0.0699 sec/batch\n", + "Epoch 8/20 Iteration 1344/3560 Training loss: 1.6921 0.0668 sec/batch\n", + "Epoch 8/20 Iteration 1345/3560 Training loss: 1.6916 0.0664 sec/batch\n", + "Epoch 8/20 Iteration 1346/3560 Training loss: 1.6912 0.0717 sec/batch\n", + "Epoch 8/20 Iteration 1347/3560 Training loss: 1.6911 0.0691 sec/batch\n", + "Epoch 8/20 Iteration 1348/3560 Training loss: 1.6909 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1349/3560 Training loss: 1.6907 0.0679 sec/batch\n", + "Epoch 8/20 Iteration 1350/3560 Training loss: 1.6904 0.0707 sec/batch\n", + "Epoch 8/20 Iteration 1351/3560 Training loss: 1.6901 0.0692 sec/batch\n", + "Epoch 8/20 Iteration 1352/3560 Training loss: 1.6899 0.0662 sec/batch\n", + "Epoch 8/20 Iteration 1353/3560 Training loss: 1.6897 0.0697 sec/batch\n", + "Epoch 8/20 Iteration 1354/3560 Training loss: 1.6895 0.0737 sec/batch\n", + "Epoch 8/20 Iteration 1355/3560 Training loss: 1.6895 0.0647 sec/batch\n", + "Epoch 8/20 Iteration 1356/3560 Training loss: 1.6895 0.0675 sec/batch\n", + "Epoch 8/20 Iteration 1357/3560 Training loss: 1.6893 0.0747 sec/batch\n", + "Epoch 8/20 Iteration 1358/3560 Training loss: 1.6892 0.0687 sec/batch\n", + "Epoch 8/20 Iteration 1359/3560 Training loss: 1.6890 0.0696 sec/batch\n", + "Epoch 8/20 Iteration 1360/3560 Training loss: 1.6887 0.0721 sec/batch\n", + "Epoch 8/20 
Iteration 1361/3560 Training loss: 1.6884 0.0694 sec/batch\n", + "Epoch 8/20 Iteration 1362/3560 Training loss: 1.6879 0.0647 sec/batch\n", + "Epoch 8/20 Iteration 1363/3560 Training loss: 1.6878 0.0687 sec/batch\n", + "Epoch 8/20 Iteration 1364/3560 Training loss: 1.6876 0.0675 sec/batch\n", + "Epoch 8/20 Iteration 1365/3560 Training loss: 1.6874 0.0658 sec/batch\n", + "Epoch 8/20 Iteration 1366/3560 Training loss: 1.6873 0.0653 sec/batch\n", + "Epoch 8/20 Iteration 1367/3560 Training loss: 1.6872 0.0654 sec/batch\n", + "Epoch 8/20 Iteration 1368/3560 Training loss: 1.6868 0.0686 sec/batch\n", + "Epoch 8/20 Iteration 1369/3560 Training loss: 1.6864 0.0707 sec/batch\n", + "Epoch 8/20 Iteration 1370/3560 Training loss: 1.6864 0.0663 sec/batch\n", + "Epoch 8/20 Iteration 1371/3560 Training loss: 1.6862 0.0642 sec/batch\n", + "Epoch 8/20 Iteration 1372/3560 Training loss: 1.6858 0.0645 sec/batch\n", + "Epoch 8/20 Iteration 1373/3560 Training loss: 1.6858 0.0651 sec/batch\n", + "Epoch 8/20 Iteration 1374/3560 Training loss: 1.6858 0.0666 sec/batch\n", + "Epoch 8/20 Iteration 1375/3560 Training loss: 1.6856 0.0717 sec/batch\n", + "Epoch 8/20 Iteration 1376/3560 Training loss: 1.6853 0.0739 sec/batch\n", + "Epoch 8/20 Iteration 1377/3560 Training loss: 1.6849 0.0688 sec/batch\n", + "Epoch 8/20 Iteration 1378/3560 Training loss: 1.6846 0.0710 sec/batch\n", + "Epoch 8/20 Iteration 1379/3560 Training loss: 1.6846 0.0714 sec/batch\n", + "Epoch 8/20 Iteration 1380/3560 Training loss: 1.6846 0.0753 sec/batch\n", + "Epoch 8/20 Iteration 1381/3560 Training loss: 1.6845 0.0677 sec/batch\n", + "Epoch 8/20 Iteration 1382/3560 Training loss: 1.6844 0.0665 sec/batch\n", + "Epoch 8/20 Iteration 1383/3560 Training loss: 1.6845 0.0656 sec/batch\n", + "Epoch 8/20 Iteration 1384/3560 Training loss: 1.6845 0.0649 sec/batch\n", + "Epoch 8/20 Iteration 1385/3560 Training loss: 1.6845 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1386/3560 Training loss: 1.6844 0.0686 sec/batch\n", + "Epoch 
8/20 Iteration 1387/3560 Training loss: 1.6847 0.0736 sec/batch\n", + "Epoch 8/20 Iteration 1388/3560 Training loss: 1.6845 0.0655 sec/batch\n", + "Epoch 8/20 Iteration 1389/3560 Training loss: 1.6843 0.0748 sec/batch\n", + "Epoch 8/20 Iteration 1390/3560 Training loss: 1.6844 0.0663 sec/batch\n", + "Epoch 8/20 Iteration 1391/3560 Training loss: 1.6842 0.0670 sec/batch\n", + "Epoch 8/20 Iteration 1392/3560 Training loss: 1.6842 0.0702 sec/batch\n", + "Epoch 8/20 Iteration 1393/3560 Training loss: 1.6840 0.0778 sec/batch\n", + "Epoch 8/20 Iteration 1394/3560 Training loss: 1.6842 0.0675 sec/batch\n", + "Epoch 8/20 Iteration 1395/3560 Training loss: 1.6841 0.0656 sec/batch\n", + "Epoch 8/20 Iteration 1396/3560 Training loss: 1.6839 0.0644 sec/batch\n", + "Epoch 8/20 Iteration 1397/3560 Training loss: 1.6836 0.0686 sec/batch\n", + "Epoch 8/20 Iteration 1398/3560 Training loss: 1.6837 0.0676 sec/batch\n", + "Epoch 8/20 Iteration 1399/3560 Training loss: 1.6836 0.0688 sec/batch\n", + "Epoch 8/20 Iteration 1400/3560 Training loss: 1.6836 0.0647 sec/batch\n", + "Epoch 8/20 Iteration 1401/3560 Training loss: 1.6835 0.0701 sec/batch\n", + "Epoch 8/20 Iteration 1402/3560 Training loss: 1.6834 0.0783 sec/batch\n", + "Epoch 8/20 Iteration 1403/3560 Training loss: 1.6834 0.0765 sec/batch\n", + "Epoch 8/20 Iteration 1404/3560 Training loss: 1.6834 0.0662 sec/batch\n", + "Epoch 8/20 Iteration 1405/3560 Training loss: 1.6830 0.0674 sec/batch\n", + "Epoch 8/20 Iteration 1406/3560 Training loss: 1.6831 0.0903 sec/batch\n", + "Epoch 8/20 Iteration 1407/3560 Training loss: 1.6833 0.0686 sec/batch\n", + "Epoch 8/20 Iteration 1408/3560 Training loss: 1.6832 0.0628 sec/batch\n", + "Epoch 8/20 Iteration 1409/3560 Training loss: 1.6832 0.0640 sec/batch\n", + "Epoch 8/20 Iteration 1410/3560 Training loss: 1.6832 0.0660 sec/batch\n", + "Epoch 8/20 Iteration 1411/3560 Training loss: 1.6831 0.0698 sec/batch\n", + "Epoch 8/20 Iteration 1412/3560 Training loss: 1.6830 0.0650 sec/batch\n", + 
"Epoch 8/20 Iteration 1413/3560 Training loss: 1.6830 0.0687 sec/batch\n", + "Epoch 8/20 Iteration 1414/3560 Training loss: 1.6833 0.0674 sec/batch\n", + "Epoch 8/20 Iteration 1415/3560 Training loss: 1.6832 0.0677 sec/batch\n", + "Epoch 8/20 Iteration 1416/3560 Training loss: 1.6831 0.0659 sec/batch\n", + "Epoch 8/20 Iteration 1417/3560 Training loss: 1.6829 0.0649 sec/batch\n", + "Epoch 8/20 Iteration 1418/3560 Training loss: 1.6827 0.0666 sec/batch\n", + "Epoch 8/20 Iteration 1419/3560 Training loss: 1.6827 0.0657 sec/batch\n", + "Epoch 8/20 Iteration 1420/3560 Training loss: 1.6826 0.0769 sec/batch\n", + "Epoch 8/20 Iteration 1421/3560 Training loss: 1.6826 0.0649 sec/batch\n", + "Epoch 8/20 Iteration 1422/3560 Training loss: 1.6825 0.0711 sec/batch\n", + "Epoch 8/20 Iteration 1423/3560 Training loss: 1.6823 0.0663 sec/batch\n", + "Epoch 8/20 Iteration 1424/3560 Training loss: 1.6822 0.0672 sec/batch\n", + "Epoch 9/20 Iteration 1425/3560 Training loss: 1.7557 0.0676 sec/batch\n", + "Epoch 9/20 Iteration 1426/3560 Training loss: 1.7057 0.0645 sec/batch\n", + "Epoch 9/20 Iteration 1427/3560 Training loss: 1.6860 0.0641 sec/batch\n", + "Epoch 9/20 Iteration 1428/3560 Training loss: 1.6804 0.0652 sec/batch\n", + "Epoch 9/20 Iteration 1429/3560 Training loss: 1.6757 0.0660 sec/batch\n", + "Epoch 9/20 Iteration 1430/3560 Training loss: 1.6651 0.0654 sec/batch\n", + "Epoch 9/20 Iteration 1431/3560 Training loss: 1.6649 0.0675 sec/batch\n", + "Epoch 9/20 Iteration 1432/3560 Training loss: 1.6630 0.0665 sec/batch\n", + "Epoch 9/20 Iteration 1433/3560 Training loss: 1.6656 0.0695 sec/batch\n", + "Epoch 9/20 Iteration 1434/3560 Training loss: 1.6644 0.0653 sec/batch\n", + "Epoch 9/20 Iteration 1435/3560 Training loss: 1.6612 0.0674 sec/batch\n", + "Epoch 9/20 Iteration 1436/3560 Training loss: 1.6598 0.0655 sec/batch\n", + "Epoch 9/20 Iteration 1437/3560 Training loss: 1.6593 0.0680 sec/batch\n", + "Epoch 9/20 Iteration 1438/3560 Training loss: 1.6615 0.0653 sec/batch\n", 
+ "Epoch 9/20 Iteration 1439/3560 Training loss: 1.6606 0.0770 sec/batch\n", + "Epoch 9/20 Iteration 1602/3560 Training loss: 1.6403 0.0658 sec/batch\n", + "Epoch 10/20 Iteration 1603/3560 Training loss: 1.7176 0.0673 sec/batch\n", + "Epoch 10/20 Iteration 1780/3560 Training loss: 1.6058 0.0668 sec/batch\n", + "Epoch 11/20 Iteration 1781/3560 Training loss: 1.6903 0.0699 sec/batch\n", + "Epoch 11/20 Iteration 1952/3560 Training loss: 1.5762 0.0686
sec/batch\n", + "Epoch 11/20 Iteration 1953/3560 Training loss: 1.5764 0.0689 sec/batch\n", + "Epoch 11/20 Iteration 1954/3560 Training loss: 1.5764 0.0703 sec/batch\n", + "Epoch 11/20 Iteration 1955/3560 Training loss: 1.5764 0.0786 sec/batch\n", + "Epoch 11/20 Iteration 1956/3560 Training loss: 1.5763 0.0803 sec/batch\n", + "Epoch 11/20 Iteration 1957/3560 Training loss: 1.5762 0.0723 sec/batch\n", + "Epoch 11/20 Iteration 1958/3560 Training loss: 1.5762 0.0680 sec/batch\n", + "Epoch 12/20 Iteration 1959/3560 Training loss: 1.6538 0.0699 sec/batch\n", + "Epoch 12/20 Iteration 1960/3560 Training loss: 1.6147 0.0749 sec/batch\n", + "Epoch 12/20 Iteration 1961/3560 Training loss: 1.5958 0.0687 sec/batch\n", + "Epoch 12/20 Iteration 1962/3560 Training loss: 1.5891 0.0687 sec/batch\n", + "Epoch 12/20 Iteration 1963/3560 Training loss: 1.5824 0.0665 sec/batch\n", + "Epoch 12/20 Iteration 1964/3560 Training loss: 1.5732 0.0646 sec/batch\n", + "Epoch 12/20 Iteration 1965/3560 Training loss: 1.5740 0.0674 sec/batch\n", + "Epoch 12/20 Iteration 1966/3560 Training loss: 1.5718 0.0726 sec/batch\n", + "Epoch 12/20 Iteration 1967/3560 Training loss: 1.5736 0.0680 sec/batch\n", + "Epoch 12/20 Iteration 1968/3560 Training loss: 1.5720 0.0646 sec/batch\n", + "Epoch 12/20 Iteration 1969/3560 Training loss: 1.5678 0.0698 sec/batch\n", + "Epoch 12/20 Iteration 1970/3560 Training loss: 1.5670 0.0658 sec/batch\n", + "Epoch 12/20 Iteration 1971/3560 Training loss: 1.5666 0.0650 sec/batch\n", + "Epoch 12/20 Iteration 1972/3560 Training loss: 1.5694 0.0700 sec/batch\n", + "Epoch 12/20 Iteration 1973/3560 Training loss: 1.5687 0.0712 sec/batch\n", + "Epoch 12/20 Iteration 1974/3560 Training loss: 1.5668 0.0745 sec/batch\n", + "Epoch 12/20 Iteration 1975/3560 Training loss: 1.5677 0.0703 sec/batch\n", + "Epoch 12/20 Iteration 1976/3560 Training loss: 1.5683 0.0693 sec/batch\n", + "Epoch 12/20 Iteration 1977/3560 Training loss: 1.5687 0.0819 sec/batch\n", + "Epoch 12/20 Iteration 1978/3560 
Training loss: 1.5694 0.0698 sec/batch\n", + "Epoch 12/20 Iteration 1979/3560 Training loss: 1.5687 0.0748 sec/batch\n", + "Epoch 12/20 Iteration 1980/3560 Training loss: 1.5695 0.0692 sec/batch\n", + "Epoch 12/20 Iteration 1981/3560 Training loss: 1.5688 0.0729 sec/batch\n", + "Epoch 12/20 Iteration 1982/3560 Training loss: 1.5684 0.0703 sec/batch\n", + "Epoch 12/20 Iteration 1983/3560 Training loss: 1.5684 0.0659 sec/batch\n", + "Epoch 12/20 Iteration 1984/3560 Training loss: 1.5670 0.0700 sec/batch\n", + "Epoch 12/20 Iteration 1985/3560 Training loss: 1.5659 0.0670 sec/batch\n", + "Epoch 12/20 Iteration 1986/3560 Training loss: 1.5662 0.0688 sec/batch\n", + "Epoch 12/20 Iteration 1987/3560 Training loss: 1.5671 0.0669 sec/batch\n", + "Epoch 12/20 Iteration 1988/3560 Training loss: 1.5676 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 1989/3560 Training loss: 1.5670 0.0643 sec/batch\n", + "Epoch 12/20 Iteration 1990/3560 Training loss: 1.5658 0.0669 sec/batch\n", + "Epoch 12/20 Iteration 1991/3560 Training loss: 1.5659 0.0687 sec/batch\n", + "Epoch 12/20 Iteration 1992/3560 Training loss: 1.5663 0.0650 sec/batch\n", + "Epoch 12/20 Iteration 1993/3560 Training loss: 1.5664 0.0703 sec/batch\n", + "Epoch 12/20 Iteration 1994/3560 Training loss: 1.5659 0.0680 sec/batch\n", + "Epoch 12/20 Iteration 1995/3560 Training loss: 1.5654 0.0654 sec/batch\n", + "Epoch 12/20 Iteration 1996/3560 Training loss: 1.5644 0.0683 sec/batch\n", + "Epoch 12/20 Iteration 1997/3560 Training loss: 1.5633 0.0665 sec/batch\n", + "Epoch 12/20 Iteration 1998/3560 Training loss: 1.5627 0.0815 sec/batch\n", + "Epoch 12/20 Iteration 1999/3560 Training loss: 1.5622 0.0743 sec/batch\n", + "Epoch 12/20 Iteration 2000/3560 Training loss: 1.5627 0.0740 sec/batch\n", + "Epoch 12/20 Iteration 2001/3560 Training loss: 1.5622 0.0653 sec/batch\n", + "Epoch 12/20 Iteration 2002/3560 Training loss: 1.5614 0.0696 sec/batch\n", + "Epoch 12/20 Iteration 2003/3560 Training loss: 1.5613 0.0649 sec/batch\n", + 
"Epoch 12/20 Iteration 2004/3560 Training loss: 1.5603 0.0687 sec/batch\n", + "Epoch 12/20 Iteration 2005/3560 Training loss: 1.5602 0.0668 sec/batch\n", + "Epoch 12/20 Iteration 2006/3560 Training loss: 1.5596 0.0684 sec/batch\n", + "Epoch 12/20 Iteration 2007/3560 Training loss: 1.5593 0.0743 sec/batch\n", + "Epoch 12/20 Iteration 2008/3560 Training loss: 1.5598 0.0686 sec/batch\n", + "Epoch 12/20 Iteration 2009/3560 Training loss: 1.5593 0.0680 sec/batch\n", + "Epoch 12/20 Iteration 2010/3560 Training loss: 1.5602 0.0658 sec/batch\n", + "Epoch 12/20 Iteration 2011/3560 Training loss: 1.5602 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 2012/3560 Training loss: 1.5601 0.0678 sec/batch\n", + "Epoch 12/20 Iteration 2013/3560 Training loss: 1.5599 0.0685 sec/batch\n", + "Epoch 12/20 Iteration 2014/3560 Training loss: 1.5599 0.0670 sec/batch\n", + "Epoch 12/20 Iteration 2015/3560 Training loss: 1.5603 0.0734 sec/batch\n", + "Epoch 12/20 Iteration 2016/3560 Training loss: 1.5599 0.0687 sec/batch\n", + "Epoch 12/20 Iteration 2017/3560 Training loss: 1.5594 0.0649 sec/batch\n", + "Epoch 12/20 Iteration 2018/3560 Training loss: 1.5598 0.0670 sec/batch\n", + "Epoch 12/20 Iteration 2019/3560 Training loss: 1.5598 0.0739 sec/batch\n", + "Epoch 12/20 Iteration 2020/3560 Training loss: 1.5606 0.0758 sec/batch\n", + "Epoch 12/20 Iteration 2021/3560 Training loss: 1.5609 0.0686 sec/batch\n", + "Epoch 12/20 Iteration 2022/3560 Training loss: 1.5611 0.0684 sec/batch\n", + "Epoch 12/20 Iteration 2023/3560 Training loss: 1.5610 0.0742 sec/batch\n", + "Epoch 12/20 Iteration 2024/3560 Training loss: 1.5611 0.0656 sec/batch\n", + "Epoch 12/20 Iteration 2025/3560 Training loss: 1.5613 0.0643 sec/batch\n", + "Epoch 12/20 Iteration 2026/3560 Training loss: 1.5611 0.0668 sec/batch\n", + "Epoch 12/20 Iteration 2027/3560 Training loss: 1.5610 0.0660 sec/batch\n", + "Epoch 12/20 Iteration 2028/3560 Training loss: 1.5609 0.0660 sec/batch\n", + "Epoch 12/20 Iteration 2029/3560 Training loss: 
1.5613 0.0678 sec/batch\n", + "Epoch 12/20 Iteration 2030/3560 Training loss: 1.5615 0.0669 sec/batch\n", + "Epoch 12/20 Iteration 2031/3560 Training loss: 1.5619 0.0685 sec/batch\n", + "Epoch 12/20 Iteration 2032/3560 Training loss: 1.5617 0.0651 sec/batch\n", + "Epoch 12/20 Iteration 2033/3560 Training loss: 1.5616 0.0755 sec/batch\n", + "Epoch 12/20 Iteration 2034/3560 Training loss: 1.5618 0.0680 sec/batch\n", + "Epoch 12/20 Iteration 2035/3560 Training loss: 1.5616 0.0663 sec/batch\n", + "Epoch 12/20 Iteration 2036/3560 Training loss: 1.5616 0.0730 sec/batch\n", + "Epoch 12/20 Iteration 2037/3560 Training loss: 1.5610 0.0675 sec/batch\n", + "Epoch 12/20 Iteration 2038/3560 Training loss: 1.5610 0.0722 sec/batch\n", + "Epoch 12/20 Iteration 2039/3560 Training loss: 1.5604 0.0674 sec/batch\n", + "Epoch 12/20 Iteration 2040/3560 Training loss: 1.5604 0.0669 sec/batch\n", + "Epoch 12/20 Iteration 2041/3560 Training loss: 1.5599 0.0728 sec/batch\n", + "Epoch 12/20 Iteration 2042/3560 Training loss: 1.5599 0.0697 sec/batch\n", + "Epoch 12/20 Iteration 2043/3560 Training loss: 1.5595 0.0736 sec/batch\n", + "Epoch 12/20 Iteration 2044/3560 Training loss: 1.5593 0.0752 sec/batch\n", + "Epoch 12/20 Iteration 2045/3560 Training loss: 1.5591 0.0683 sec/batch\n", + "Epoch 12/20 Iteration 2046/3560 Training loss: 1.5587 0.0668 sec/batch\n", + "Epoch 12/20 Iteration 2047/3560 Training loss: 1.5583 0.0703 sec/batch\n", + "Epoch 12/20 Iteration 2048/3560 Training loss: 1.5584 0.0671 sec/batch\n", + "Epoch 12/20 Iteration 2049/3560 Training loss: 1.5581 0.0664 sec/batch\n", + "Epoch 12/20 Iteration 2050/3560 Training loss: 1.5580 0.0704 sec/batch\n", + "Epoch 12/20 Iteration 2051/3560 Training loss: 1.5575 0.0680 sec/batch\n", + "Epoch 12/20 Iteration 2052/3560 Training loss: 1.5571 0.0671 sec/batch\n", + "Epoch 12/20 Iteration 2053/3560 Training loss: 1.5569 0.0675 sec/batch\n", + "Epoch 12/20 Iteration 2054/3560 Training loss: 1.5568 0.0661 sec/batch\n", + "Epoch 12/20 
Iteration 2055/3560 Training loss: 1.5567 0.0654 sec/batch\n", + "Epoch 12/20 Iteration 2056/3560 Training loss: 1.5562 0.0669 sec/batch\n", + "Epoch 12/20 Iteration 2057/3560 Training loss: 1.5559 0.0666 sec/batch\n", + "Epoch 12/20 Iteration 2058/3560 Training loss: 1.5555 0.0670 sec/batch\n", + "Epoch 12/20 Iteration 2059/3560 Training loss: 1.5554 0.0658 sec/batch\n", + "Epoch 12/20 Iteration 2060/3560 Training loss: 1.5553 0.0681 sec/batch\n", + "Epoch 12/20 Iteration 2061/3560 Training loss: 1.5551 0.0678 sec/batch\n", + "Epoch 12/20 Iteration 2062/3560 Training loss: 1.5549 0.0670 sec/batch\n", + "Epoch 12/20 Iteration 2063/3560 Training loss: 1.5547 0.0809 sec/batch\n", + "Epoch 12/20 Iteration 2064/3560 Training loss: 1.5545 0.0690 sec/batch\n", + "Epoch 12/20 Iteration 2065/3560 Training loss: 1.5543 0.0802 sec/batch\n", + "Epoch 12/20 Iteration 2066/3560 Training loss: 1.5543 0.0678 sec/batch\n", + "Epoch 12/20 Iteration 2067/3560 Training loss: 1.5542 0.0696 sec/batch\n", + "Epoch 12/20 Iteration 2068/3560 Training loss: 1.5543 0.0725 sec/batch\n", + "Epoch 12/20 Iteration 2069/3560 Training loss: 1.5541 0.0642 sec/batch\n", + "Epoch 12/20 Iteration 2070/3560 Training loss: 1.5539 0.0703 sec/batch\n", + "Epoch 12/20 Iteration 2071/3560 Training loss: 1.5538 0.0676 sec/batch\n", + "Epoch 12/20 Iteration 2072/3560 Training loss: 1.5535 0.0693 sec/batch\n", + "Epoch 12/20 Iteration 2073/3560 Training loss: 1.5533 0.0664 sec/batch\n", + "Epoch 12/20 Iteration 2074/3560 Training loss: 1.5529 0.0669 sec/batch\n", + "Epoch 12/20 Iteration 2075/3560 Training loss: 1.5527 0.0688 sec/batch\n", + "Epoch 12/20 Iteration 2076/3560 Training loss: 1.5527 0.0676 sec/batch\n", + "Epoch 12/20 Iteration 2077/3560 Training loss: 1.5525 0.0791 sec/batch\n", + "Epoch 12/20 Iteration 2078/3560 Training loss: 1.5524 0.0666 sec/batch\n", + "Epoch 12/20 Iteration 2079/3560 Training loss: 1.5523 0.0722 sec/batch\n", + "Epoch 12/20 Iteration 2080/3560 Training loss: 1.5520 0.0696 
sec/batch\n", + "Epoch 12/20 Iteration 2081/3560 Training loss: 1.5515 0.0691 sec/batch\n", + "Epoch 12/20 Iteration 2082/3560 Training loss: 1.5515 0.0719 sec/batch\n", + "Epoch 12/20 Iteration 2083/3560 Training loss: 1.5515 0.0710 sec/batch\n", + "Epoch 12/20 Iteration 2084/3560 Training loss: 1.5511 0.0667 sec/batch\n", + "Epoch 12/20 Iteration 2085/3560 Training loss: 1.5512 0.0774 sec/batch\n", + "Epoch 12/20 Iteration 2086/3560 Training loss: 1.5512 0.0781 sec/batch\n", + "Epoch 12/20 Iteration 2087/3560 Training loss: 1.5511 0.0724 sec/batch\n", + "Epoch 12/20 Iteration 2088/3560 Training loss: 1.5508 0.0706 sec/batch\n", + "Epoch 12/20 Iteration 2089/3560 Training loss: 1.5505 0.0680 sec/batch\n", + "Epoch 12/20 Iteration 2090/3560 Training loss: 1.5503 0.0772 sec/batch\n", + "Epoch 12/20 Iteration 2091/3560 Training loss: 1.5504 0.0657 sec/batch\n", + "Epoch 12/20 Iteration 2092/3560 Training loss: 1.5504 0.0730 sec/batch\n", + "Epoch 12/20 Iteration 2093/3560 Training loss: 1.5504 0.0675 sec/batch\n", + "Epoch 12/20 Iteration 2094/3560 Training loss: 1.5504 0.0705 sec/batch\n", + "Epoch 12/20 Iteration 2095/3560 Training loss: 1.5506 0.0657 sec/batch\n", + "Epoch 12/20 Iteration 2096/3560 Training loss: 1.5507 0.0662 sec/batch\n", + "Epoch 12/20 Iteration 2097/3560 Training loss: 1.5507 0.0673 sec/batch\n", + "Epoch 12/20 Iteration 2098/3560 Training loss: 1.5507 0.0703 sec/batch\n", + "Epoch 12/20 Iteration 2099/3560 Training loss: 1.5511 0.0657 sec/batch\n", + "Epoch 12/20 Iteration 2100/3560 Training loss: 1.5511 0.0675 sec/batch\n", + "Epoch 12/20 Iteration 2101/3560 Training loss: 1.5510 0.0755 sec/batch\n", + "Epoch 12/20 Iteration 2102/3560 Training loss: 1.5510 0.0664 sec/batch\n", + "Epoch 12/20 Iteration 2103/3560 Training loss: 1.5509 0.0692 sec/batch\n", + "Epoch 12/20 Iteration 2104/3560 Training loss: 1.5511 0.0668 sec/batch\n", + "Epoch 12/20 Iteration 2105/3560 Training loss: 1.5510 0.0666 sec/batch\n", + "Epoch 12/20 Iteration 2106/3560 
Training loss: 1.5511 0.0797 sec/batch\n", + "Epoch 12/20 Iteration 2107/3560 Training loss: 1.5511 0.0703 sec/batch\n", + "Epoch 12/20 Iteration 2108/3560 Training loss: 1.5510 0.0655 sec/batch\n", + "Epoch 12/20 Iteration 2109/3560 Training loss: 1.5507 0.0690 sec/batch\n", + "Epoch 12/20 Iteration 2110/3560 Training loss: 1.5507 0.0676 sec/batch\n", + "Epoch 12/20 Iteration 2111/3560 Training loss: 1.5508 0.0718 sec/batch\n", + "Epoch 12/20 Iteration 2112/3560 Training loss: 1.5508 0.0670 sec/batch\n", + "Epoch 12/20 Iteration 2113/3560 Training loss: 1.5508 0.0683 sec/batch\n", + "Epoch 12/20 Iteration 2114/3560 Training loss: 1.5509 0.0663 sec/batch\n", + "Epoch 12/20 Iteration 2115/3560 Training loss: 1.5509 0.0676 sec/batch\n", + "Epoch 12/20 Iteration 2116/3560 Training loss: 1.5508 0.0687 sec/batch\n", + "Epoch 12/20 Iteration 2117/3560 Training loss: 1.5505 0.0675 sec/batch\n", + "Epoch 12/20 Iteration 2118/3560 Training loss: 1.5507 0.0669 sec/batch\n", + "Epoch 12/20 Iteration 2119/3560 Training loss: 1.5509 0.0667 sec/batch\n", + "Epoch 12/20 Iteration 2120/3560 Training loss: 1.5509 0.0687 sec/batch\n", + "Epoch 12/20 Iteration 2121/3560 Training loss: 1.5509 0.0683 sec/batch\n", + "Epoch 12/20 Iteration 2122/3560 Training loss: 1.5509 0.0697 sec/batch\n", + "Epoch 12/20 Iteration 2123/3560 Training loss: 1.5508 0.0692 sec/batch\n", + "Epoch 12/20 Iteration 2124/3560 Training loss: 1.5508 0.0725 sec/batch\n", + "Epoch 12/20 Iteration 2125/3560 Training loss: 1.5510 0.0719 sec/batch\n", + "Epoch 12/20 Iteration 2126/3560 Training loss: 1.5513 0.0667 sec/batch\n", + "Epoch 12/20 Iteration 2127/3560 Training loss: 1.5513 0.0664 sec/batch\n", + "Epoch 12/20 Iteration 2128/3560 Training loss: 1.5513 0.0680 sec/batch\n", + "Epoch 12/20 Iteration 2129/3560 Training loss: 1.5512 0.0688 sec/batch\n", + "Epoch 12/20 Iteration 2130/3560 Training loss: 1.5510 0.0740 sec/batch\n", + "Epoch 12/20 Iteration 2131/3560 Training loss: 1.5512 0.0680 sec/batch\n", + 
"Epoch 12/20 Iteration 2132/3560 Training loss: 1.5511 0.0708 sec/batch\n", + "Epoch 12/20 Iteration 2133/3560 Training loss: 1.5512 0.0673 sec/batch\n", + "Epoch 12/20 Iteration 2134/3560 Training loss: 1.5510 0.0660 sec/batch\n", + "Epoch 12/20 Iteration 2135/3560 Training loss: 1.5508 0.0698 sec/batch\n", + "Epoch 12/20 Iteration 2136/3560 Training loss: 1.5509 0.0679 sec/batch\n", + "Epoch 13/20 Iteration 2137/3560 Training loss: 1.6367 0.0659 sec/batch\n", + "Epoch 13/20 Iteration 2138/3560 Training loss: 1.5920 0.0658 sec/batch\n", + "Epoch 13/20 Iteration 2139/3560 Training loss: 1.5777 0.0668 sec/batch\n", + "Epoch 13/20 Iteration 2140/3560 Training loss: 1.5719 0.0682 sec/batch\n", + "Epoch 13/20 Iteration 2141/3560 Training loss: 1.5621 0.0658 sec/batch\n", + "Epoch 13/20 Iteration 2142/3560 Training loss: 1.5521 0.0675 sec/batch\n", + "Epoch 13/20 Iteration 2143/3560 Training loss: 1.5515 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2144/3560 Training loss: 1.5487 0.0719 sec/batch\n", + "Epoch 13/20 Iteration 2145/3560 Training loss: 1.5507 0.0645 sec/batch\n", + "Epoch 13/20 Iteration 2146/3560 Training loss: 1.5494 0.0651 sec/batch\n", + "Epoch 13/20 Iteration 2147/3560 Training loss: 1.5453 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2148/3560 Training loss: 1.5452 0.0665 sec/batch\n", + "Epoch 13/20 Iteration 2149/3560 Training loss: 1.5438 0.0663 sec/batch\n", + "Epoch 13/20 Iteration 2150/3560 Training loss: 1.5454 0.0710 sec/batch\n", + "Epoch 13/20 Iteration 2151/3560 Training loss: 1.5447 0.0717 sec/batch\n", + "Epoch 13/20 Iteration 2152/3560 Training loss: 1.5429 0.0674 sec/batch\n", + "Epoch 13/20 Iteration 2153/3560 Training loss: 1.5431 0.0669 sec/batch\n", + "Epoch 13/20 Iteration 2154/3560 Training loss: 1.5446 0.0686 sec/batch\n", + "Epoch 13/20 Iteration 2155/3560 Training loss: 1.5445 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2156/3560 Training loss: 1.5457 0.0674 sec/batch\n", + "Epoch 13/20 Iteration 2157/3560 Training loss: 
1.5450 0.0658 sec/batch\n", + "Epoch 13/20 Iteration 2158/3560 Training loss: 1.5460 0.0705 sec/batch\n", + "Epoch 13/20 Iteration 2159/3560 Training loss: 1.5454 0.0688 sec/batch\n", + "Epoch 13/20 Iteration 2160/3560 Training loss: 1.5450 0.0727 sec/batch\n", + "Epoch 13/20 Iteration 2161/3560 Training loss: 1.5450 0.0659 sec/batch\n", + "Epoch 13/20 Iteration 2162/3560 Training loss: 1.5433 0.0673 sec/batch\n", + "Epoch 13/20 Iteration 2163/3560 Training loss: 1.5421 0.0739 sec/batch\n", + "Epoch 13/20 Iteration 2164/3560 Training loss: 1.5425 0.0658 sec/batch\n", + "Epoch 13/20 Iteration 2165/3560 Training loss: 1.5431 0.0673 sec/batch\n", + "Epoch 13/20 Iteration 2166/3560 Training loss: 1.5435 0.0663 sec/batch\n", + "Epoch 13/20 Iteration 2167/3560 Training loss: 1.5432 0.0676 sec/batch\n", + "Epoch 13/20 Iteration 2168/3560 Training loss: 1.5417 0.0654 sec/batch\n", + "Epoch 13/20 Iteration 2169/3560 Training loss: 1.5422 0.0680 sec/batch\n", + "Epoch 13/20 Iteration 2170/3560 Training loss: 1.5424 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2171/3560 Training loss: 1.5422 0.0685 sec/batch\n", + "Epoch 13/20 Iteration 2172/3560 Training loss: 1.5418 0.0647 sec/batch\n", + "Epoch 13/20 Iteration 2173/3560 Training loss: 1.5412 0.0675 sec/batch\n", + "Epoch 13/20 Iteration 2174/3560 Training loss: 1.5403 0.0755 sec/batch\n", + "Epoch 13/20 Iteration 2175/3560 Training loss: 1.5389 0.0685 sec/batch\n", + "Epoch 13/20 Iteration 2176/3560 Training loss: 1.5383 0.0750 sec/batch\n", + "Epoch 13/20 Iteration 2177/3560 Training loss: 1.5376 0.0686 sec/batch\n", + "Epoch 13/20 Iteration 2178/3560 Training loss: 1.5383 0.0675 sec/batch\n", + "Epoch 13/20 Iteration 2179/3560 Training loss: 1.5380 0.0637 sec/batch\n", + "Epoch 13/20 Iteration 2180/3560 Training loss: 1.5371 0.0642 sec/batch\n", + "Epoch 13/20 Iteration 2181/3560 Training loss: 1.5375 0.0693 sec/batch\n", + "Epoch 13/20 Iteration 2182/3560 Training loss: 1.5367 0.0676 sec/batch\n", + "Epoch 13/20 
Iteration 2183/3560 Training loss: 1.5363 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2184/3560 Training loss: 1.5359 0.0662 sec/batch\n", + "Epoch 13/20 Iteration 2185/3560 Training loss: 1.5356 0.0707 sec/batch\n", + "Epoch 13/20 Iteration 2186/3560 Training loss: 1.5361 0.0683 sec/batch\n", + "Epoch 13/20 Iteration 2187/3560 Training loss: 1.5355 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2188/3560 Training loss: 1.5363 0.0662 sec/batch\n", + "Epoch 13/20 Iteration 2189/3560 Training loss: 1.5363 0.0674 sec/batch\n", + "Epoch 13/20 Iteration 2190/3560 Training loss: 1.5364 0.0669 sec/batch\n", + "Epoch 13/20 Iteration 2191/3560 Training loss: 1.5359 0.0679 sec/batch\n", + "Epoch 13/20 Iteration 2192/3560 Training loss: 1.5359 0.0711 sec/batch\n", + "Epoch 13/20 Iteration 2193/3560 Training loss: 1.5363 0.0660 sec/batch\n", + "Epoch 13/20 Iteration 2194/3560 Training loss: 1.5358 0.0711 sec/batch\n", + "Epoch 13/20 Iteration 2195/3560 Training loss: 1.5354 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2196/3560 Training loss: 1.5359 0.0710 sec/batch\n", + "Epoch 13/20 Iteration 2197/3560 Training loss: 1.5359 0.0760 sec/batch\n", + "Epoch 13/20 Iteration 2198/3560 Training loss: 1.5368 0.0713 sec/batch\n", + "Epoch 13/20 Iteration 2199/3560 Training loss: 1.5369 0.0683 sec/batch\n", + "Epoch 13/20 Iteration 2200/3560 Training loss: 1.5369 0.0728 sec/batch\n", + "Epoch 13/20 Iteration 2201/3560 Training loss: 1.5369 0.0681 sec/batch\n", + "Epoch 13/20 Iteration 2202/3560 Training loss: 1.5369 0.0708 sec/batch\n", + "Epoch 13/20 Iteration 2203/3560 Training loss: 1.5371 0.0670 sec/batch\n", + "Epoch 13/20 Iteration 2204/3560 Training loss: 1.5368 0.0699 sec/batch\n", + "Epoch 13/20 Iteration 2205/3560 Training loss: 1.5368 0.0663 sec/batch\n", + "Epoch 13/20 Iteration 2206/3560 Training loss: 1.5366 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2207/3560 Training loss: 1.5373 0.0665 sec/batch\n", + "Epoch 13/20 Iteration 2208/3560 Training loss: 1.5375 0.0672 
sec/batch\n", + "Epoch 13/20 Iteration 2209/3560 Training loss: 1.5381 0.0688 sec/batch\n", + "Epoch 13/20 Iteration 2210/3560 Training loss: 1.5378 0.0659 sec/batch\n", + "Epoch 13/20 Iteration 2211/3560 Training loss: 1.5378 0.0722 sec/batch\n", + "Epoch 13/20 Iteration 2212/3560 Training loss: 1.5380 0.0705 sec/batch\n", + "Epoch 13/20 Iteration 2213/3560 Training loss: 1.5379 0.0683 sec/batch\n", + "Epoch 13/20 Iteration 2214/3560 Training loss: 1.5378 0.0671 sec/batch\n", + "Epoch 13/20 Iteration 2215/3560 Training loss: 1.5372 0.0655 sec/batch\n", + "Epoch 13/20 Iteration 2216/3560 Training loss: 1.5372 0.0669 sec/batch\n", + "Epoch 13/20 Iteration 2217/3560 Training loss: 1.5366 0.0688 sec/batch\n", + "Epoch 13/20 Iteration 2218/3560 Training loss: 1.5367 0.0701 sec/batch\n", + "Epoch 13/20 Iteration 2219/3560 Training loss: 1.5362 0.0703 sec/batch\n", + "Epoch 13/20 Iteration 2220/3560 Training loss: 1.5362 0.0679 sec/batch\n", + "Epoch 13/20 Iteration 2221/3560 Training loss: 1.5359 0.0672 sec/batch\n", + "Epoch 13/20 Iteration 2222/3560 Training loss: 1.5357 0.0723 sec/batch\n", + "Epoch 13/20 Iteration 2223/3560 Training loss: 1.5354 0.0641 sec/batch\n", + "Epoch 13/20 Iteration 2224/3560 Training loss: 1.5352 0.0697 sec/batch\n", + "Epoch 13/20 Iteration 2225/3560 Training loss: 1.5349 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2226/3560 Training loss: 1.5350 0.0655 sec/batch\n", + "Epoch 13/20 Iteration 2227/3560 Training loss: 1.5347 0.0682 sec/batch\n", + "Epoch 13/20 Iteration 2228/3560 Training loss: 1.5346 0.0679 sec/batch\n", + "Epoch 13/20 Iteration 2229/3560 Training loss: 1.5343 0.0643 sec/batch\n", + "Epoch 13/20 Iteration 2230/3560 Training loss: 1.5339 0.0635 sec/batch\n", + "Epoch 13/20 Iteration 2231/3560 Training loss: 1.5336 0.0665 sec/batch\n", + "Epoch 13/20 Iteration 2232/3560 Training loss: 1.5336 0.0699 sec/batch\n", + "Epoch 13/20 Iteration 2233/3560 Training loss: 1.5335 0.0672 sec/batch\n", + "Epoch 13/20 Iteration 2234/3560 
Training loss: 1.5331 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2235/3560 Training loss: 1.5328 0.0653 sec/batch\n", + "Epoch 13/20 Iteration 2236/3560 Training loss: 1.5324 0.0697 sec/batch\n", + "Epoch 13/20 Iteration 2237/3560 Training loss: 1.5324 0.0670 sec/batch\n", + "Epoch 13/20 Iteration 2238/3560 Training loss: 1.5323 0.0647 sec/batch\n", + "Epoch 13/20 Iteration 2239/3560 Training loss: 1.5321 0.0655 sec/batch\n", + "Epoch 13/20 Iteration 2240/3560 Training loss: 1.5320 0.0663 sec/batch\n", + "Epoch 13/20 Iteration 2241/3560 Training loss: 1.5317 0.0680 sec/batch\n", + "Epoch 13/20 Iteration 2242/3560 Training loss: 1.5316 0.0642 sec/batch\n", + "Epoch 13/20 Iteration 2243/3560 Training loss: 1.5315 0.0677 sec/batch\n", + "Epoch 13/20 Iteration 2244/3560 Training loss: 1.5314 0.0670 sec/batch\n", + "Epoch 13/20 Iteration 2245/3560 Training loss: 1.5313 0.0682 sec/batch\n", + "Epoch 13/20 Iteration 2246/3560 Training loss: 1.5314 0.0661 sec/batch\n", + "Epoch 13/20 Iteration 2247/3560 Training loss: 1.5312 0.0662 sec/batch\n", + "Epoch 13/20 Iteration 2248/3560 Training loss: 1.5312 0.0661 sec/batch\n", + "Epoch 13/20 Iteration 2249/3560 Training loss: 1.5310 0.0730 sec/batch\n", + "Epoch 13/20 Iteration 2250/3560 Training loss: 1.5307 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2251/3560 Training loss: 1.5304 0.0707 sec/batch\n", + "Epoch 13/20 Iteration 2252/3560 Training loss: 1.5301 0.0688 sec/batch\n", + "Epoch 13/20 Iteration 2253/3560 Training loss: 1.5301 0.0688 sec/batch\n", + "Epoch 13/20 Iteration 2254/3560 Training loss: 1.5300 0.0709 sec/batch\n", + "Epoch 13/20 Iteration 2255/3560 Training loss: 1.5298 0.0681 sec/batch\n", + "Epoch 13/20 Iteration 2256/3560 Training loss: 1.5297 0.0686 sec/batch\n", + "Epoch 13/20 Iteration 2257/3560 Training loss: 1.5298 0.0674 sec/batch\n", + "Epoch 13/20 Iteration 2258/3560 Training loss: 1.5295 0.0699 sec/batch\n", + "Epoch 13/20 Iteration 2259/3560 Training loss: 1.5290 0.0711 sec/batch\n", + 
"Epoch 13/20 Iteration 2260/3560 Training loss: 1.5291 0.0667 sec/batch\n", + "Epoch 13/20 Iteration 2261/3560 Training loss: 1.5290 0.0693 sec/batch\n", + "Epoch 13/20 Iteration 2262/3560 Training loss: 1.5285 0.0710 sec/batch\n", + "Epoch 13/20 Iteration 2263/3560 Training loss: 1.5285 0.0672 sec/batch\n", + "Epoch 13/20 Iteration 2264/3560 Training loss: 1.5285 0.0660 sec/batch\n", + "Epoch 13/20 Iteration 2265/3560 Training loss: 1.5284 0.0697 sec/batch\n", + "Epoch 13/20 Iteration 2266/3560 Training loss: 1.5282 0.0701 sec/batch\n", + "Epoch 13/20 Iteration 2267/3560 Training loss: 1.5278 0.0685 sec/batch\n", + "Epoch 13/20 Iteration 2268/3560 Training loss: 1.5275 0.0742 sec/batch\n", + "Epoch 13/20 Iteration 2269/3560 Training loss: 1.5275 0.0653 sec/batch\n", + "Epoch 13/20 Iteration 2270/3560 Training loss: 1.5275 0.0655 sec/batch\n", + "Epoch 13/20 Iteration 2271/3560 Training loss: 1.5275 0.0711 sec/batch\n", + "Epoch 13/20 Iteration 2272/3560 Training loss: 1.5275 0.0670 sec/batch\n", + "Epoch 13/20 Iteration 2273/3560 Training loss: 1.5276 0.0712 sec/batch\n", + "Epoch 13/20 Iteration 2274/3560 Training loss: 1.5276 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2275/3560 Training loss: 1.5276 0.0651 sec/batch\n", + "Epoch 13/20 Iteration 2276/3560 Training loss: 1.5276 0.0699 sec/batch\n", + "Epoch 13/20 Iteration 2277/3560 Training loss: 1.5280 0.0671 sec/batch\n", + "Epoch 13/20 Iteration 2278/3560 Training loss: 1.5280 0.0663 sec/batch\n", + "Epoch 13/20 Iteration 2279/3560 Training loss: 1.5279 0.0668 sec/batch\n", + "Epoch 13/20 Iteration 2280/3560 Training loss: 1.5280 0.0668 sec/batch\n", + "Epoch 13/20 Iteration 2281/3560 Training loss: 1.5279 0.0668 sec/batch\n", + "Epoch 13/20 Iteration 2282/3560 Training loss: 1.5281 0.0646 sec/batch\n", + "Epoch 13/20 Iteration 2283/3560 Training loss: 1.5281 0.0671 sec/batch\n", + "Epoch 13/20 Iteration 2284/3560 Training loss: 1.5283 0.0645 sec/batch\n", + "Epoch 13/20 Iteration 2285/3560 Training loss: 
1.5284 0.0655 sec/batch\n", + "Epoch 13/20 Iteration 2286/3560 Training loss: 1.5282 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2287/3560 Training loss: 1.5279 0.0656 sec/batch\n", + "Epoch 13/20 Iteration 2288/3560 Training loss: 1.5279 0.0673 sec/batch\n", + "Epoch 13/20 Iteration 2289/3560 Training loss: 1.5280 0.0669 sec/batch\n", + "Epoch 13/20 Iteration 2290/3560 Training loss: 1.5280 0.0652 sec/batch\n", + "Epoch 13/20 Iteration 2291/3560 Training loss: 1.5279 0.0754 sec/batch\n", + "Epoch 13/20 Iteration 2292/3560 Training loss: 1.5278 0.0811 sec/batch\n", + "Epoch 13/20 Iteration 2293/3560 Training loss: 1.5278 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2294/3560 Training loss: 1.5277 0.0653 sec/batch\n", + "Epoch 13/20 Iteration 2295/3560 Training loss: 1.5274 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2296/3560 Training loss: 1.5276 0.0675 sec/batch\n", + "Epoch 13/20 Iteration 2297/3560 Training loss: 1.5279 0.0647 sec/batch\n", + "Epoch 13/20 Iteration 2298/3560 Training loss: 1.5280 0.0664 sec/batch\n", + "Epoch 13/20 Iteration 2299/3560 Training loss: 1.5279 0.0765 sec/batch\n", + "Epoch 13/20 Iteration 2300/3560 Training loss: 1.5279 0.0701 sec/batch\n", + "Epoch 13/20 Iteration 2301/3560 Training loss: 1.5278 0.0652 sec/batch\n", + "Epoch 13/20 Iteration 2302/3560 Training loss: 1.5278 0.0670 sec/batch\n", + "Epoch 13/20 Iteration 2303/3560 Training loss: 1.5279 0.0672 sec/batch\n", + "Epoch 13/20 Iteration 2304/3560 Training loss: 1.5283 0.0713 sec/batch\n", + "Epoch 13/20 Iteration 2305/3560 Training loss: 1.5283 0.0744 sec/batch\n", + "Epoch 13/20 Iteration 2306/3560 Training loss: 1.5283 0.0653 sec/batch\n", + "Epoch 13/20 Iteration 2307/3560 Training loss: 1.5281 0.0700 sec/batch\n", + "Epoch 13/20 Iteration 2308/3560 Training loss: 1.5280 0.0713 sec/batch\n", + "Epoch 13/20 Iteration 2309/3560 Training loss: 1.5281 0.0678 sec/batch\n", + "Epoch 13/20 Iteration 2310/3560 Training loss: 1.5281 0.0663 sec/batch\n", + "Epoch 13/20 
Iteration 2311/3560 Training loss: 1.5282 0.0752 sec/batch\n", + "Epoch 13/20 Iteration 2312/3560 Training loss: 1.5281 0.0679 sec/batch\n", + "Epoch 13/20 Iteration 2313/3560 Training loss: 1.5280 0.0692 sec/batch\n", + "Epoch 13/20 Iteration 2314/3560 Training loss: 1.5281 0.0706 sec/batch\n", + "Epoch 14/20 Iteration 2315/3560 Training loss: 1.6152 0.0646 sec/batch\n", + "Epoch 14/20 Iteration 2316/3560 Training loss: 1.5713 0.0720 sec/batch\n", + "Epoch 14/20 Iteration 2317/3560 Training loss: 1.5563 0.0669 sec/batch\n", + "Epoch 14/20 Iteration 2318/3560 Training loss: 1.5509 0.0644 sec/batch\n", + "Epoch 14/20 Iteration 2319/3560 Training loss: 1.5436 0.0670 sec/batch\n", + "Epoch 14/20 Iteration 2320/3560 Training loss: 1.5335 0.0674 sec/batch\n", + "Epoch 14/20 Iteration 2321/3560 Training loss: 1.5333 0.0702 sec/batch\n", + "Epoch 14/20 Iteration 2322/3560 Training loss: 1.5293 0.0684 sec/batch\n", + "Epoch 14/20 Iteration 2323/3560 Training loss: 1.5307 0.0687 sec/batch\n", + "Epoch 14/20 Iteration 2324/3560 Training loss: 1.5287 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2325/3560 Training loss: 1.5248 0.0672 sec/batch\n", + "Epoch 14/20 Iteration 2326/3560 Training loss: 1.5244 0.0668 sec/batch\n", + "Epoch 14/20 Iteration 2327/3560 Training loss: 1.5229 0.0684 sec/batch\n", + "Epoch 14/20 Iteration 2328/3560 Training loss: 1.5247 0.0665 sec/batch\n", + "Epoch 14/20 Iteration 2329/3560 Training loss: 1.5241 0.0713 sec/batch\n", + "Epoch 14/20 Iteration 2330/3560 Training loss: 1.5223 0.0641 sec/batch\n", + "Epoch 14/20 Iteration 2331/3560 Training loss: 1.5231 0.0684 sec/batch\n", + "Epoch 14/20 Iteration 2332/3560 Training loss: 1.5246 0.0705 sec/batch\n", + "Epoch 14/20 Iteration 2333/3560 Training loss: 1.5243 0.0773 sec/batch\n", + "Epoch 14/20 Iteration 2334/3560 Training loss: 1.5256 0.0704 sec/batch\n", + "Epoch 14/20 Iteration 2335/3560 Training loss: 1.5244 0.0701 sec/batch\n", + "Epoch 14/20 Iteration 2336/3560 Training loss: 1.5253 0.0710 
sec/batch\n", + "Epoch 14/20 Iteration 2337/3560 Training loss: 1.5244 0.0656 sec/batch\n", + "[... training log condensed: Epochs 14-16 of 20, iterations 2337-2848 of 3560; training loss declines from ~1.52 (epoch 14) to ~1.49 (epoch 15) to ~1.47 (epoch 16), at roughly 0.065-0.08 sec/batch throughout ...]\n", + "Epoch 16/20 Iteration 2848/3560 Training loss: 1.4754 0.0690 
sec/batch\n", + "Epoch 17/20 Iteration 2849/3560 Training loss: 1.5577 0.0701 sec/batch\n", + "Epoch 17/20 Iteration 2850/3560 Training loss: 1.5138 0.0671 sec/batch\n", + "Epoch 17/20 Iteration 2851/3560 Training loss: 1.4972 0.0667 sec/batch\n", + "Epoch 17/20 Iteration 2852/3560 Training loss: 1.4951 0.0655 sec/batch\n", + "Epoch 17/20 Iteration 2853/3560 Training loss: 1.4888 0.0760 sec/batch\n", + "Epoch 17/20 Iteration 2854/3560 Training loss: 1.4797 0.0665 sec/batch\n", + "Epoch 17/20 Iteration 2855/3560 Training loss: 1.4793 0.0675 sec/batch\n", + "Epoch 17/20 Iteration 2856/3560 Training loss: 1.4761 0.0663 sec/batch\n", + "Epoch 17/20 Iteration 2857/3560 Training loss: 1.4774 0.0669 sec/batch\n", + "Epoch 17/20 Iteration 2858/3560 Training loss: 1.4753 0.0670 sec/batch\n", + "Epoch 17/20 Iteration 2859/3560 Training loss: 1.4726 0.0668 sec/batch\n", + "Epoch 17/20 Iteration 2860/3560 Training loss: 1.4718 0.0661 sec/batch\n", + "Epoch 17/20 Iteration 2861/3560 Training loss: 1.4709 0.0674 sec/batch\n", + "Epoch 17/20 Iteration 2862/3560 Training loss: 1.4724 0.0688 sec/batch\n", + "Epoch 17/20 Iteration 2863/3560 Training loss: 1.4714 0.0709 sec/batch\n", + "Epoch 17/20 Iteration 2864/3560 Training loss: 1.4690 0.0695 sec/batch\n", + "Epoch 17/20 Iteration 2865/3560 Training loss: 1.4699 0.0661 sec/batch\n", + "Epoch 17/20 Iteration 2866/3560 Training loss: 1.4712 0.0675 sec/batch\n", + "Epoch 17/20 Iteration 2867/3560 Training loss: 1.4715 0.0768 sec/batch\n", + "Epoch 17/20 Iteration 2868/3560 Training loss: 1.4730 0.0688 sec/batch\n", + "Epoch 17/20 Iteration 2869/3560 Training loss: 1.4723 0.0667 sec/batch\n", + "Epoch 17/20 Iteration 2870/3560 Training loss: 1.4730 0.0700 sec/batch\n", + "Epoch 17/20 Iteration 2871/3560 Training loss: 1.4725 0.0707 sec/batch\n", + "Epoch 17/20 Iteration 2872/3560 Training loss: 1.4723 0.0670 sec/batch\n", + "Epoch 17/20 Iteration 2873/3560 Training loss: 1.4722 0.0696 sec/batch\n", + "Epoch 17/20 Iteration 2874/3560 
Training loss: 1.4702 0.0654 sec/batch\n", + "Epoch 17/20 Iteration 2875/3560 Training loss: 1.4691 0.0676 sec/batch\n", + "Epoch 17/20 Iteration 2876/3560 Training loss: 1.4696 0.0675 sec/batch\n", + "Epoch 17/20 Iteration 2877/3560 Training loss: 1.4701 0.0788 sec/batch\n", + "Epoch 17/20 Iteration 2878/3560 Training loss: 1.4707 0.0635 sec/batch\n", + "Epoch 17/20 Iteration 2879/3560 Training loss: 1.4701 0.0666 sec/batch\n", + "Epoch 17/20 Iteration 2880/3560 Training loss: 1.4691 0.0696 sec/batch\n", + "Epoch 17/20 Iteration 2881/3560 Training loss: 1.4693 0.0693 sec/batch\n", + "Epoch 17/20 Iteration 2882/3560 Training loss: 1.4699 0.0717 sec/batch\n", + "Epoch 17/20 Iteration 2883/3560 Training loss: 1.4698 0.0670 sec/batch\n", + "Epoch 17/20 Iteration 2884/3560 Training loss: 1.4694 0.0671 sec/batch\n", + "Epoch 17/20 Iteration 2885/3560 Training loss: 1.4684 0.0712 sec/batch\n", + "Epoch 17/20 Iteration 2886/3560 Training loss: 1.4672 0.0665 sec/batch\n", + "Epoch 17/20 Iteration 2887/3560 Training loss: 1.4661 0.0665 sec/batch\n", + "Epoch 17/20 Iteration 2888/3560 Training loss: 1.4656 0.0755 sec/batch\n", + "Epoch 17/20 Iteration 2889/3560 Training loss: 1.4651 0.0751 sec/batch\n", + "Epoch 17/20 Iteration 2890/3560 Training loss: 1.4658 0.0696 sec/batch\n", + "Epoch 17/20 Iteration 2891/3560 Training loss: 1.4655 0.0677 sec/batch\n", + "Epoch 17/20 Iteration 2892/3560 Training loss: 1.4648 0.0677 sec/batch\n", + "Epoch 17/20 Iteration 2893/3560 Training loss: 1.4652 0.0675 sec/batch\n", + "Epoch 17/20 Iteration 2894/3560 Training loss: 1.4644 0.0699 sec/batch\n", + "Epoch 17/20 Iteration 2895/3560 Training loss: 1.4642 0.0731 sec/batch\n", + "Epoch 17/20 Iteration 2896/3560 Training loss: 1.4639 0.0695 sec/batch\n", + "Epoch 17/20 Iteration 2897/3560 Training loss: 1.4638 0.0786 sec/batch\n", + "Epoch 17/20 Iteration 2898/3560 Training loss: 1.4642 0.0686 sec/batch\n", + "Epoch 17/20 Iteration 2899/3560 Training loss: 1.4636 0.0676 sec/batch\n", + 
"Epoch 17/20 Iteration 2900/3560 Training loss: 1.4644 0.0663 sec/batch\n", + "Epoch 17/20 Iteration 2901/3560 Training loss: 1.4643 0.0679 sec/batch\n", + "Epoch 17/20 Iteration 2902/3560 Training loss: 1.4647 0.0690 sec/batch\n", + "Epoch 17/20 Iteration 2903/3560 Training loss: 1.4644 0.0682 sec/batch\n", + "Epoch 17/20 Iteration 2904/3560 Training loss: 1.4646 0.0724 sec/batch\n", + "Epoch 17/20 Iteration 2905/3560 Training loss: 1.4651 0.0675 sec/batch\n", + "Epoch 17/20 Iteration 2906/3560 Training loss: 1.4648 0.0701 sec/batch\n", + "Epoch 17/20 Iteration 2907/3560 Training loss: 1.4644 0.0712 sec/batch\n", + "Epoch 17/20 Iteration 2908/3560 Training loss: 1.4650 0.0672 sec/batch\n", + "Epoch 17/20 Iteration 2909/3560 Training loss: 1.4651 0.0663 sec/batch\n", + "Epoch 17/20 Iteration 2910/3560 Training loss: 1.4661 0.0705 sec/batch\n", + "Epoch 17/20 Iteration 2911/3560 Training loss: 1.4665 0.0725 sec/batch\n", + "Epoch 17/20 Iteration 2912/3560 Training loss: 1.4667 0.0727 sec/batch\n", + "Epoch 17/20 Iteration 2913/3560 Training loss: 1.4666 0.0710 sec/batch\n", + "Epoch 17/20 Iteration 2914/3560 Training loss: 1.4667 0.0698 sec/batch\n", + "Epoch 17/20 Iteration 2915/3560 Training loss: 1.4668 0.0718 sec/batch\n", + "Epoch 17/20 Iteration 2916/3560 Training loss: 1.4665 0.0707 sec/batch\n", + "Epoch 17/20 Iteration 2917/3560 Training loss: 1.4665 0.0690 sec/batch\n", + "Epoch 17/20 Iteration 2918/3560 Training loss: 1.4664 0.0656 sec/batch\n", + "Epoch 17/20 Iteration 2919/3560 Training loss: 1.4671 0.0777 sec/batch\n", + "Epoch 17/20 Iteration 2920/3560 Training loss: 1.4674 0.0665 sec/batch\n", + "Epoch 17/20 Iteration 2921/3560 Training loss: 1.4679 0.0676 sec/batch\n", + "Epoch 17/20 Iteration 2922/3560 Training loss: 1.4677 0.0680 sec/batch\n", + "Epoch 17/20 Iteration 2923/3560 Training loss: 1.4676 0.0698 sec/batch\n", + "Epoch 17/20 Iteration 2924/3560 Training loss: 1.4678 0.0662 sec/batch\n", + "Epoch 17/20 Iteration 2925/3560 Training loss: 
1.4676 0.0719 sec/batch\n", + "Epoch 17/20 Iteration 2926/3560 Training loss: 1.4676 0.0691 sec/batch\n", + "Epoch 17/20 Iteration 2927/3560 Training loss: 1.4670 0.0697 sec/batch\n", + "Epoch 17/20 Iteration 2928/3560 Training loss: 1.4670 0.0674 sec/batch\n", + "Epoch 17/20 Iteration 2929/3560 Training loss: 1.4666 0.0697 sec/batch\n", + "Epoch 17/20 Iteration 2930/3560 Training loss: 1.4667 0.0698 sec/batch\n", + "Epoch 17/20 Iteration 2931/3560 Training loss: 1.4662 0.0728 sec/batch\n", + "Epoch 17/20 Iteration 2932/3560 Training loss: 1.4662 0.0673 sec/batch\n", + "Epoch 17/20 Iteration 2933/3560 Training loss: 1.4659 0.0733 sec/batch\n", + "Epoch 17/20 Iteration 2934/3560 Training loss: 1.4657 0.0756 sec/batch\n", + "Epoch 17/20 Iteration 2935/3560 Training loss: 1.4654 0.0663 sec/batch\n", + "Epoch 17/20 Iteration 2936/3560 Training loss: 1.4651 0.0815 sec/batch\n", + "Epoch 17/20 Iteration 2937/3560 Training loss: 1.4648 0.0773 sec/batch\n", + "Epoch 17/20 Iteration 2938/3560 Training loss: 1.4649 0.0724 sec/batch\n", + "Epoch 17/20 Iteration 2939/3560 Training loss: 1.4646 0.0716 sec/batch\n", + "Epoch 17/20 Iteration 2940/3560 Training loss: 1.4645 0.0701 sec/batch\n", + "Epoch 17/20 Iteration 2941/3560 Training loss: 1.4641 0.0708 sec/batch\n", + "Epoch 17/20 Iteration 2942/3560 Training loss: 1.4639 0.0727 sec/batch\n", + "Epoch 17/20 Iteration 2943/3560 Training loss: 1.4637 0.0687 sec/batch\n", + "Epoch 17/20 Iteration 2944/3560 Training loss: 1.4638 0.0674 sec/batch\n", + "Epoch 17/20 Iteration 2945/3560 Training loss: 1.4637 0.0699 sec/batch\n", + "Epoch 17/20 Iteration 2946/3560 Training loss: 1.4634 0.0656 sec/batch\n", + "Epoch 17/20 Iteration 2947/3560 Training loss: 1.4630 0.0782 sec/batch\n", + "Epoch 17/20 Iteration 2948/3560 Training loss: 1.4627 0.0656 sec/batch\n", + "Epoch 17/20 Iteration 2949/3560 Training loss: 1.4627 0.0663 sec/batch\n", + "Epoch 17/20 Iteration 2950/3560 Training loss: 1.4626 0.0728 sec/batch\n", + "Epoch 17/20 
Iteration 2951/3560 Training loss: 1.4625 0.0666 sec/batch\n", + "Epoch 17/20 Iteration 2952/3560 Training loss: 1.4624 0.0690 sec/batch\n", + "Epoch 17/20 Iteration 2953/3560 Training loss: 1.4622 0.0679 sec/batch\n", + "Epoch 17/20 Iteration 2954/3560 Training loss: 1.4622 0.0675 sec/batch\n", + "Epoch 17/20 Iteration 2955/3560 Training loss: 1.4622 0.0716 sec/batch\n", + "Epoch 17/20 Iteration 2956/3560 Training loss: 1.4621 0.0726 sec/batch\n", + "Epoch 17/20 Iteration 2957/3560 Training loss: 1.4619 0.0719 sec/batch\n", + "Epoch 17/20 Iteration 2958/3560 Training loss: 1.4620 0.0708 sec/batch\n", + "Epoch 17/20 Iteration 2959/3560 Training loss: 1.4618 0.0657 sec/batch\n", + "Epoch 17/20 Iteration 2960/3560 Training loss: 1.4618 0.0657 sec/batch\n", + "Epoch 17/20 Iteration 2961/3560 Training loss: 1.4617 0.0674 sec/batch\n", + "Epoch 17/20 Iteration 2962/3560 Training loss: 1.4616 0.0650 sec/batch\n", + "Epoch 17/20 Iteration 2963/3560 Training loss: 1.4614 0.0642 sec/batch\n", + "Epoch 17/20 Iteration 2964/3560 Training loss: 1.4610 0.0675 sec/batch\n", + "Epoch 17/20 Iteration 2965/3560 Training loss: 1.4611 0.0650 sec/batch\n", + "Epoch 17/20 Iteration 2966/3560 Training loss: 1.4611 0.0650 sec/batch\n", + "Epoch 17/20 Iteration 2967/3560 Training loss: 1.4609 0.0741 sec/batch\n", + "Epoch 17/20 Iteration 2968/3560 Training loss: 1.4608 0.0666 sec/batch\n", + "Epoch 17/20 Iteration 2969/3560 Training loss: 1.4607 0.0665 sec/batch\n", + "Epoch 17/20 Iteration 2970/3560 Training loss: 1.4605 0.0681 sec/batch\n", + "Epoch 17/20 Iteration 2971/3560 Training loss: 1.4601 0.0675 sec/batch\n", + "Epoch 17/20 Iteration 2972/3560 Training loss: 1.4601 0.0690 sec/batch\n", + "Epoch 17/20 Iteration 2973/3560 Training loss: 1.4601 0.0673 sec/batch\n", + "Epoch 17/20 Iteration 2974/3560 Training loss: 1.4598 0.0725 sec/batch\n", + "Epoch 17/20 Iteration 2975/3560 Training loss: 1.4599 0.0673 sec/batch\n", + "Epoch 17/20 Iteration 2976/3560 Training loss: 1.4600 0.0661 
sec/batch\n", + "Epoch 17/20 Iteration 2977/3560 Training loss: 1.4598 0.0675 sec/batch\n", + "Epoch 17/20 Iteration 2978/3560 Training loss: 1.4596 0.0756 sec/batch\n", + "Epoch 17/20 Iteration 2979/3560 Training loss: 1.4592 0.0690 sec/batch\n", + "Epoch 17/20 Iteration 2980/3560 Training loss: 1.4590 0.0705 sec/batch\n", + "Epoch 17/20 Iteration 2981/3560 Training loss: 1.4590 0.0699 sec/batch\n", + "Epoch 17/20 Iteration 2982/3560 Training loss: 1.4590 0.0661 sec/batch\n", + "Epoch 17/20 Iteration 2983/3560 Training loss: 1.4590 0.0663 sec/batch\n", + "Epoch 17/20 Iteration 2984/3560 Training loss: 1.4590 0.0704 sec/batch\n", + "Epoch 17/20 Iteration 2985/3560 Training loss: 1.4592 0.0705 sec/batch\n", + "Epoch 17/20 Iteration 2986/3560 Training loss: 1.4593 0.0672 sec/batch\n", + "Epoch 17/20 Iteration 2987/3560 Training loss: 1.4594 0.0711 sec/batch\n", + "Epoch 17/20 Iteration 2988/3560 Training loss: 1.4594 0.0719 sec/batch\n", + "Epoch 17/20 Iteration 2989/3560 Training loss: 1.4598 0.0681 sec/batch\n", + "Epoch 17/20 Iteration 2990/3560 Training loss: 1.4598 0.0672 sec/batch\n", + "Epoch 17/20 Iteration 2991/3560 Training loss: 1.4597 0.0755 sec/batch\n", + "Epoch 17/20 Iteration 2992/3560 Training loss: 1.4599 0.0706 sec/batch\n", + "Epoch 17/20 Iteration 2993/3560 Training loss: 1.4598 0.0711 sec/batch\n", + "Epoch 17/20 Iteration 2994/3560 Training loss: 1.4599 0.0732 sec/batch\n", + "Epoch 17/20 Iteration 2995/3560 Training loss: 1.4599 0.0683 sec/batch\n", + "Epoch 17/20 Iteration 2996/3560 Training loss: 1.4601 0.0670 sec/batch\n", + "Epoch 17/20 Iteration 2997/3560 Training loss: 1.4602 0.0725 sec/batch\n", + "Epoch 17/20 Iteration 2998/3560 Training loss: 1.4601 0.0657 sec/batch\n", + "Epoch 17/20 Iteration 2999/3560 Training loss: 1.4598 0.0655 sec/batch\n", + "Epoch 17/20 Iteration 3000/3560 Training loss: 1.4597 0.0763 sec/batch\n", + "Epoch 17/20 Iteration 3001/3560 Training loss: 1.4598 0.0685 sec/batch\n", + "Epoch 17/20 Iteration 3002/3560 
Training loss: 1.4598 0.0705 sec/batch\n", + "Epoch 17/20 Iteration 3003/3560 Training loss: 1.4598 0.0679 sec/batch\n", + "Epoch 17/20 Iteration 3004/3560 Training loss: 1.4599 0.0678 sec/batch\n", + "Epoch 17/20 Iteration 3005/3560 Training loss: 1.4599 0.0689 sec/batch\n", + "Epoch 17/20 Iteration 3006/3560 Training loss: 1.4599 0.0666 sec/batch\n", + "Epoch 17/20 Iteration 3007/3560 Training loss: 1.4596 0.0714 sec/batch\n", + "Epoch 17/20 Iteration 3008/3560 Training loss: 1.4598 0.0743 sec/batch\n", + "Epoch 17/20 Iteration 3009/3560 Training loss: 1.4601 0.0653 sec/batch\n", + "Epoch 17/20 Iteration 3010/3560 Training loss: 1.4601 0.0661 sec/batch\n", + "Epoch 17/20 Iteration 3011/3560 Training loss: 1.4601 0.0692 sec/batch\n", + "Epoch 17/20 Iteration 3012/3560 Training loss: 1.4601 0.0702 sec/batch\n", + "Epoch 17/20 Iteration 3013/3560 Training loss: 1.4601 0.0701 sec/batch\n", + "Epoch 17/20 Iteration 3014/3560 Training loss: 1.4600 0.0733 sec/batch\n", + "Epoch 17/20 Iteration 3015/3560 Training loss: 1.4602 0.0669 sec/batch\n", + "Epoch 17/20 Iteration 3016/3560 Training loss: 1.4606 0.0723 sec/batch\n", + "Epoch 17/20 Iteration 3017/3560 Training loss: 1.4607 0.0659 sec/batch\n", + "Epoch 17/20 Iteration 3018/3560 Training loss: 1.4607 0.0753 sec/batch\n", + "Epoch 17/20 Iteration 3019/3560 Training loss: 1.4606 0.0666 sec/batch\n", + "Epoch 17/20 Iteration 3020/3560 Training loss: 1.4605 0.0687 sec/batch\n", + "Epoch 17/20 Iteration 3021/3560 Training loss: 1.4606 0.0707 sec/batch\n", + "Epoch 17/20 Iteration 3022/3560 Training loss: 1.4607 0.0664 sec/batch\n", + "Epoch 17/20 Iteration 3023/3560 Training loss: 1.4608 0.0681 sec/batch\n", + "Epoch 17/20 Iteration 3024/3560 Training loss: 1.4607 0.0655 sec/batch\n", + "Epoch 17/20 Iteration 3025/3560 Training loss: 1.4606 0.0702 sec/batch\n", + "Epoch 17/20 Iteration 3026/3560 Training loss: 1.4608 0.0672 sec/batch\n", + "Epoch 18/20 Iteration 3027/3560 Training loss: 1.5556 0.0719 sec/batch\n", + 
"Epoch 18/20 Iteration 3028/3560 Training loss: 1.5109 0.0669 sec/batch\n", + "Epoch 18/20 Iteration 3029/3560 Training loss: 1.4945 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3030/3560 Training loss: 1.4884 0.0660 sec/batch\n", + "Epoch 18/20 Iteration 3031/3560 Training loss: 1.4776 0.0734 sec/batch\n", + "Epoch 18/20 Iteration 3032/3560 Training loss: 1.4665 0.0668 sec/batch\n", + "Epoch 18/20 Iteration 3033/3560 Training loss: 1.4675 0.0682 sec/batch\n", + "Epoch 18/20 Iteration 3034/3560 Training loss: 1.4660 0.0723 sec/batch\n", + "Epoch 18/20 Iteration 3035/3560 Training loss: 1.4671 0.0676 sec/batch\n", + "Epoch 18/20 Iteration 3036/3560 Training loss: 1.4658 0.0832 sec/batch\n", + "Epoch 18/20 Iteration 3037/3560 Training loss: 1.4624 0.0706 sec/batch\n", + "Epoch 18/20 Iteration 3038/3560 Training loss: 1.4617 0.0728 sec/batch\n", + "Epoch 18/20 Iteration 3039/3560 Training loss: 1.4611 0.0707 sec/batch\n", + "Epoch 18/20 Iteration 3040/3560 Training loss: 1.4624 0.0731 sec/batch\n", + "Epoch 18/20 Iteration 3041/3560 Training loss: 1.4617 0.0680 sec/batch\n", + "Epoch 18/20 Iteration 3042/3560 Training loss: 1.4593 0.0669 sec/batch\n", + "Epoch 18/20 Iteration 3043/3560 Training loss: 1.4601 0.0774 sec/batch\n", + "Epoch 18/20 Iteration 3044/3560 Training loss: 1.4608 0.0686 sec/batch\n", + "Epoch 18/20 Iteration 3045/3560 Training loss: 1.4612 0.0693 sec/batch\n", + "Epoch 18/20 Iteration 3046/3560 Training loss: 1.4622 0.0672 sec/batch\n", + "Epoch 18/20 Iteration 3047/3560 Training loss: 1.4617 0.0661 sec/batch\n", + "Epoch 18/20 Iteration 3048/3560 Training loss: 1.4626 0.0685 sec/batch\n", + "Epoch 18/20 Iteration 3049/3560 Training loss: 1.4621 0.0689 sec/batch\n", + "Epoch 18/20 Iteration 3050/3560 Training loss: 1.4618 0.0683 sec/batch\n", + "Epoch 18/20 Iteration 3051/3560 Training loss: 1.4617 0.0678 sec/batch\n", + "Epoch 18/20 Iteration 3052/3560 Training loss: 1.4600 0.0721 sec/batch\n", + "Epoch 18/20 Iteration 3053/3560 Training loss: 
1.4585 0.0712 sec/batch\n", + "Epoch 18/20 Iteration 3054/3560 Training loss: 1.4591 0.0726 sec/batch\n", + "Epoch 18/20 Iteration 3055/3560 Training loss: 1.4594 0.0707 sec/batch\n", + "Epoch 18/20 Iteration 3056/3560 Training loss: 1.4597 0.0695 sec/batch\n", + "Epoch 18/20 Iteration 3057/3560 Training loss: 1.4592 0.0667 sec/batch\n", + "Epoch 18/20 Iteration 3058/3560 Training loss: 1.4583 0.0670 sec/batch\n", + "Epoch 18/20 Iteration 3059/3560 Training loss: 1.4587 0.0740 sec/batch\n", + "Epoch 18/20 Iteration 3060/3560 Training loss: 1.4587 0.0694 sec/batch\n", + "Epoch 18/20 Iteration 3061/3560 Training loss: 1.4585 0.0676 sec/batch\n", + "Epoch 18/20 Iteration 3062/3560 Training loss: 1.4581 0.0671 sec/batch\n", + "Epoch 18/20 Iteration 3063/3560 Training loss: 1.4575 0.0732 sec/batch\n", + "Epoch 18/20 Iteration 3064/3560 Training loss: 1.4563 0.0664 sec/batch\n", + "Epoch 18/20 Iteration 3065/3560 Training loss: 1.4553 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3066/3560 Training loss: 1.4547 0.0669 sec/batch\n", + "Epoch 18/20 Iteration 3067/3560 Training loss: 1.4542 0.0717 sec/batch\n", + "Epoch 18/20 Iteration 3068/3560 Training loss: 1.4548 0.0750 sec/batch\n", + "Epoch 18/20 Iteration 3069/3560 Training loss: 1.4544 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3070/3560 Training loss: 1.4538 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3071/3560 Training loss: 1.4540 0.0665 sec/batch\n", + "Epoch 18/20 Iteration 3072/3560 Training loss: 1.4530 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3073/3560 Training loss: 1.4527 0.0714 sec/batch\n", + "Epoch 18/20 Iteration 3074/3560 Training loss: 1.4522 0.0721 sec/batch\n", + "Epoch 18/20 Iteration 3075/3560 Training loss: 1.4521 0.0702 sec/batch\n", + "Epoch 18/20 Iteration 3076/3560 Training loss: 1.4527 0.0691 sec/batch\n", + "Epoch 18/20 Iteration 3077/3560 Training loss: 1.4523 0.0650 sec/batch\n", + "Epoch 18/20 Iteration 3078/3560 Training loss: 1.4532 0.0754 sec/batch\n", + "Epoch 18/20 
Iteration 3079/3560 Training loss: 1.4530 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3080/3560 Training loss: 1.4531 0.0692 sec/batch\n", + "Epoch 18/20 Iteration 3081/3560 Training loss: 1.4529 0.0703 sec/batch\n", + "Epoch 18/20 Iteration 3082/3560 Training loss: 1.4528 0.0646 sec/batch\n", + "Epoch 18/20 Iteration 3083/3560 Training loss: 1.4533 0.0648 sec/batch\n", + "Epoch 18/20 Iteration 3084/3560 Training loss: 1.4530 0.0668 sec/batch\n", + "Epoch 18/20 Iteration 3085/3560 Training loss: 1.4526 0.0664 sec/batch\n", + "Epoch 18/20 Iteration 3086/3560 Training loss: 1.4531 0.0677 sec/batch\n", + "Epoch 18/20 Iteration 3087/3560 Training loss: 1.4532 0.0722 sec/batch\n", + "Epoch 18/20 Iteration 3088/3560 Training loss: 1.4541 0.0733 sec/batch\n", + "Epoch 18/20 Iteration 3089/3560 Training loss: 1.4546 0.0674 sec/batch\n", + "Epoch 18/20 Iteration 3090/3560 Training loss: 1.4548 0.0681 sec/batch\n", + "Epoch 18/20 Iteration 3091/3560 Training loss: 1.4547 0.0720 sec/batch\n", + "Epoch 18/20 Iteration 3092/3560 Training loss: 1.4548 0.0733 sec/batch\n", + "Epoch 18/20 Iteration 3093/3560 Training loss: 1.4551 0.0656 sec/batch\n", + "Epoch 18/20 Iteration 3094/3560 Training loss: 1.4547 0.0675 sec/batch\n", + "Epoch 18/20 Iteration 3095/3560 Training loss: 1.4546 0.0676 sec/batch\n", + "Epoch 18/20 Iteration 3096/3560 Training loss: 1.4544 0.0676 sec/batch\n", + "Epoch 18/20 Iteration 3097/3560 Training loss: 1.4550 0.0700 sec/batch\n", + "Epoch 18/20 Iteration 3098/3560 Training loss: 1.4552 0.0670 sec/batch\n", + "Epoch 18/20 Iteration 3099/3560 Training loss: 1.4556 0.0677 sec/batch\n", + "Epoch 18/20 Iteration 3100/3560 Training loss: 1.4553 0.0808 sec/batch\n", + "Epoch 18/20 Iteration 3101/3560 Training loss: 1.4553 0.0671 sec/batch\n", + "Epoch 18/20 Iteration 3102/3560 Training loss: 1.4553 0.0669 sec/batch\n", + "Epoch 18/20 Iteration 3103/3560 Training loss: 1.4551 0.0651 sec/batch\n", + "Epoch 18/20 Iteration 3104/3560 Training loss: 1.4550 0.0651 
sec/batch\n", + "Epoch 18/20 Iteration 3105/3560 Training loss: 1.4544 0.0654 sec/batch\n", + "Epoch 18/20 Iteration 3106/3560 Training loss: 1.4544 0.0703 sec/batch\n", + "Epoch 18/20 Iteration 3107/3560 Training loss: 1.4540 0.0686 sec/batch\n", + "Epoch 18/20 Iteration 3108/3560 Training loss: 1.4539 0.0660 sec/batch\n", + "Epoch 18/20 Iteration 3109/3560 Training loss: 1.4535 0.0676 sec/batch\n", + "Epoch 18/20 Iteration 3110/3560 Training loss: 1.4534 0.0719 sec/batch\n", + "Epoch 18/20 Iteration 3111/3560 Training loss: 1.4531 0.0706 sec/batch\n", + "Epoch 18/20 Iteration 3112/3560 Training loss: 1.4528 0.0687 sec/batch\n", + "Epoch 18/20 Iteration 3113/3560 Training loss: 1.4526 0.0686 sec/batch\n", + "Epoch 18/20 Iteration 3114/3560 Training loss: 1.4523 0.0707 sec/batch\n", + "Epoch 18/20 Iteration 3115/3560 Training loss: 1.4519 0.0705 sec/batch\n", + "Epoch 18/20 Iteration 3116/3560 Training loss: 1.4519 0.0763 sec/batch\n", + "Epoch 18/20 Iteration 3117/3560 Training loss: 1.4516 0.0674 sec/batch\n", + "Epoch 18/20 Iteration 3118/3560 Training loss: 1.4515 0.0701 sec/batch\n", + "Epoch 18/20 Iteration 3119/3560 Training loss: 1.4512 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3120/3560 Training loss: 1.4510 0.0651 sec/batch\n", + "Epoch 18/20 Iteration 3121/3560 Training loss: 1.4507 0.0677 sec/batch\n", + "Epoch 18/20 Iteration 3122/3560 Training loss: 1.4507 0.0708 sec/batch\n", + "Epoch 18/20 Iteration 3123/3560 Training loss: 1.4506 0.0772 sec/batch\n", + "Epoch 18/20 Iteration 3124/3560 Training loss: 1.4503 0.0676 sec/batch\n", + "Epoch 18/20 Iteration 3125/3560 Training loss: 1.4500 0.0698 sec/batch\n", + "Epoch 18/20 Iteration 3126/3560 Training loss: 1.4497 0.0650 sec/batch\n", + "Epoch 18/20 Iteration 3127/3560 Training loss: 1.4497 0.0647 sec/batch\n", + "Epoch 18/20 Iteration 3128/3560 Training loss: 1.4495 0.0794 sec/batch\n", + "Epoch 18/20 Iteration 3129/3560 Training loss: 1.4494 0.0691 sec/batch\n", + "Epoch 18/20 Iteration 3130/3560 
Training loss: 1.4492 0.0670 sec/batch\n", + "Epoch 18/20 Iteration 3131/3560 Training loss: 1.4492 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3132/3560 Training loss: 1.4491 0.0686 sec/batch\n", + "Epoch 18/20 Iteration 3133/3560 Training loss: 1.4491 0.0698 sec/batch\n", + "Epoch 18/20 Iteration 3134/3560 Training loss: 1.4491 0.0694 sec/batch\n", + "Epoch 18/20 Iteration 3135/3560 Training loss: 1.4490 0.0737 sec/batch\n", + "Epoch 18/20 Iteration 3136/3560 Training loss: 1.4490 0.0684 sec/batch\n", + "Epoch 18/20 Iteration 3137/3560 Training loss: 1.4488 0.0684 sec/batch\n", + "Epoch 18/20 Iteration 3138/3560 Training loss: 1.4487 0.0799 sec/batch\n", + "Epoch 18/20 Iteration 3139/3560 Training loss: 1.4487 0.0701 sec/batch\n", + "Epoch 18/20 Iteration 3140/3560 Training loss: 1.4485 0.0659 sec/batch\n", + "Epoch 18/20 Iteration 3141/3560 Training loss: 1.4482 0.0922 sec/batch\n", + "Epoch 18/20 Iteration 3142/3560 Training loss: 1.4479 0.0735 sec/batch\n", + "Epoch 18/20 Iteration 3143/3560 Training loss: 1.4480 0.0714 sec/batch\n", + "Epoch 18/20 Iteration 3144/3560 Training loss: 1.4480 0.0751 sec/batch\n", + "Epoch 18/20 Iteration 3145/3560 Training loss: 1.4479 0.0660 sec/batch\n", + "Epoch 18/20 Iteration 3146/3560 Training loss: 1.4479 0.0712 sec/batch\n", + "Epoch 18/20 Iteration 3147/3560 Training loss: 1.4479 0.0676 sec/batch\n", + "Epoch 18/20 Iteration 3148/3560 Training loss: 1.4476 0.0706 sec/batch\n", + "Epoch 18/20 Iteration 3149/3560 Training loss: 1.4472 0.0785 sec/batch\n", + "Epoch 18/20 Iteration 3150/3560 Training loss: 1.4473 0.0641 sec/batch\n", + "Epoch 18/20 Iteration 3151/3560 Training loss: 1.4473 0.0714 sec/batch\n", + "Epoch 18/20 Iteration 3152/3560 Training loss: 1.4469 0.0696 sec/batch\n", + "Epoch 18/20 Iteration 3153/3560 Training loss: 1.4471 0.0782 sec/batch\n", + "Epoch 18/20 Iteration 3154/3560 Training loss: 1.4471 0.0658 sec/batch\n", + "Epoch 18/20 Iteration 3155/3560 Training loss: 1.4470 0.0701 sec/batch\n", + 
"Epoch 18/20 Iteration 3156/3560 Training loss: 1.4467 0.0685 sec/batch\n", + "Epoch 18/20 Iteration 3157/3560 Training loss: 1.4464 0.0782 sec/batch\n", + "Epoch 18/20 Iteration 3158/3560 Training loss: 1.4462 0.0707 sec/batch\n", + "Epoch 18/20 Iteration 3159/3560 Training loss: 1.4463 0.0694 sec/batch\n", + "Epoch 18/20 Iteration 3160/3560 Training loss: 1.4463 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3161/3560 Training loss: 1.4462 0.0682 sec/batch\n", + "Epoch 18/20 Iteration 3162/3560 Training loss: 1.4462 0.0711 sec/batch\n", + "Epoch 18/20 Iteration 3163/3560 Training loss: 1.4465 0.0681 sec/batch\n", + "Epoch 18/20 Iteration 3164/3560 Training loss: 1.4466 0.0664 sec/batch\n", + "Epoch 18/20 Iteration 3165/3560 Training loss: 1.4465 0.0748 sec/batch\n", + "Epoch 18/20 Iteration 3166/3560 Training loss: 1.4466 0.0727 sec/batch\n", + "Epoch 18/20 Iteration 3167/3560 Training loss: 1.4470 0.0658 sec/batch\n", + "Epoch 18/20 Iteration 3168/3560 Training loss: 1.4470 0.0641 sec/batch\n", + "Epoch 18/20 Iteration 3169/3560 Training loss: 1.4469 0.0658 sec/batch\n", + "Epoch 18/20 Iteration 3170/3560 Training loss: 1.4471 0.0693 sec/batch\n", + "Epoch 18/20 Iteration 3171/3560 Training loss: 1.4470 0.0664 sec/batch\n", + "Epoch 18/20 Iteration 3172/3560 Training loss: 1.4471 0.0680 sec/batch\n", + "Epoch 18/20 Iteration 3173/3560 Training loss: 1.4472 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3174/3560 Training loss: 1.4474 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3175/3560 Training loss: 1.4475 0.0660 sec/batch\n", + "Epoch 18/20 Iteration 3176/3560 Training loss: 1.4474 0.0654 sec/batch\n", + "Epoch 18/20 Iteration 3177/3560 Training loss: 1.4471 0.0691 sec/batch\n", + "Epoch 18/20 Iteration 3178/3560 Training loss: 1.4471 0.0692 sec/batch\n", + "Epoch 18/20 Iteration 3179/3560 Training loss: 1.4472 0.0740 sec/batch\n", + "Epoch 18/20 Iteration 3180/3560 Training loss: 1.4471 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3181/3560 Training loss: 
1.4471 0.0688 sec/batch\n", + "Epoch 18/20 Iteration 3182/3560 Training loss: 1.4471 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3183/3560 Training loss: 1.4471 0.0696 sec/batch\n", + "Epoch 18/20 Iteration 3184/3560 Training loss: 1.4470 0.0680 sec/batch\n", + "Epoch 18/20 Iteration 3185/3560 Training loss: 1.4468 0.0664 sec/batch\n", + "Epoch 18/20 Iteration 3186/3560 Training loss: 1.4470 0.0660 sec/batch\n", + "Epoch 18/20 Iteration 3187/3560 Training loss: 1.4472 0.0681 sec/batch\n", + "Epoch 18/20 Iteration 3188/3560 Training loss: 1.4473 0.0705 sec/batch\n", + "Epoch 18/20 Iteration 3189/3560 Training loss: 1.4472 0.0689 sec/batch\n", + "Epoch 18/20 Iteration 3190/3560 Training loss: 1.4472 0.0655 sec/batch\n", + "Epoch 18/20 Iteration 3191/3560 Training loss: 1.4473 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3192/3560 Training loss: 1.4473 0.0674 sec/batch\n", + "Epoch 18/20 Iteration 3193/3560 Training loss: 1.4474 0.0700 sec/batch\n", + "Epoch 18/20 Iteration 3194/3560 Training loss: 1.4478 0.0662 sec/batch\n", + "Epoch 18/20 Iteration 3195/3560 Training loss: 1.4479 0.0690 sec/batch\n", + "Epoch 18/20 Iteration 3196/3560 Training loss: 1.4479 0.0759 sec/batch\n", + "Epoch 18/20 Iteration 3197/3560 Training loss: 1.4479 0.0665 sec/batch\n", + "Epoch 18/20 Iteration 3198/3560 Training loss: 1.4478 0.0674 sec/batch\n", + "Epoch 18/20 Iteration 3199/3560 Training loss: 1.4478 0.0704 sec/batch\n", + "Epoch 18/20 Iteration 3200/3560 Training loss: 1.4478 0.0667 sec/batch\n", + "Epoch 18/20 Iteration 3201/3560 Training loss: 1.4479 0.0767 sec/batch\n", + "Epoch 18/20 Iteration 3202/3560 Training loss: 1.4477 0.0712 sec/batch\n", + "Epoch 18/20 Iteration 3203/3560 Training loss: 1.4477 0.0734 sec/batch\n", + "Epoch 18/20 Iteration 3204/3560 Training loss: 1.4478 0.0710 sec/batch\n", + "Epoch 19/20 Iteration 3205/3560 Training loss: 1.5450 0.0743 sec/batch\n", + "Epoch 19/20 Iteration 3206/3560 Training loss: 1.4977 0.0647 sec/batch\n", + "Epoch 19/20 
Iteration 3207/3560 Training loss: 1.4808 0.0759 sec/batch\n", + "Epoch 19/20 Iteration 3208/3560 Training loss: 1.4766 0.0732 sec/batch\n", + "Epoch 19/20 Iteration 3209/3560 Training loss: 1.4695 0.0656 sec/batch\n", + "Epoch 19/20 Iteration 3210/3560 Training loss: 1.4585 0.0662 sec/batch\n", + "Epoch 19/20 Iteration 3211/3560 Training loss: 1.4587 0.0712 sec/batch\n", + "Epoch 19/20 Iteration 3212/3560 Training loss: 1.4548 0.0690 sec/batch\n", + "Epoch 19/20 Iteration 3213/3560 Training loss: 1.4559 0.0676 sec/batch\n", + "Epoch 19/20 Iteration 3214/3560 Training loss: 1.4545 0.0695 sec/batch\n", + "Epoch 19/20 Iteration 3215/3560 Training loss: 1.4507 0.0695 sec/batch\n", + "Epoch 19/20 Iteration 3216/3560 Training loss: 1.4494 0.0678 sec/batch\n", + "Epoch 19/20 Iteration 3217/3560 Training loss: 1.4489 0.0651 sec/batch\n", + "Epoch 19/20 Iteration 3218/3560 Training loss: 1.4502 0.0702 sec/batch\n", + "Epoch 19/20 Iteration 3219/3560 Training loss: 1.4495 0.0662 sec/batch\n", + "Epoch 19/20 Iteration 3220/3560 Training loss: 1.4471 0.0684 sec/batch\n", + "Epoch 19/20 Iteration 3221/3560 Training loss: 1.4476 0.0676 sec/batch\n", + "Epoch 19/20 Iteration 3222/3560 Training loss: 1.4490 0.0665 sec/batch\n", + "Epoch 19/20 Iteration 3223/3560 Training loss: 1.4491 0.0751 sec/batch\n", + "Epoch 19/20 Iteration 3224/3560 Training loss: 1.4502 0.0703 sec/batch\n", + "Epoch 19/20 Iteration 3225/3560 Training loss: 1.4495 0.0678 sec/batch\n", + "Epoch 19/20 Iteration 3226/3560 Training loss: 1.4499 0.0676 sec/batch\n", + "Epoch 19/20 Iteration 3227/3560 Training loss: 1.4495 0.0665 sec/batch\n", + "Epoch 19/20 Iteration 3228/3560 Training loss: 1.4489 0.0681 sec/batch\n", + "Epoch 19/20 Iteration 3229/3560 Training loss: 1.4488 0.0681 sec/batch\n", + "Epoch 19/20 Iteration 3230/3560 Training loss: 1.4473 0.0658 sec/batch\n", + "Epoch 19/20 Iteration 3231/3560 Training loss: 1.4462 0.0719 sec/batch\n", + "Epoch 19/20 Iteration 3232/3560 Training loss: 1.4468 0.0678 
sec/batch\n", + "Epoch 19/20 Iteration 3233/3560 Training loss: 1.4473 0.0710 sec/batch\n", + "Epoch 19/20 Iteration 3234/3560 Training loss: 1.4475 0.0708 sec/batch\n", + "Epoch 19/20 Iteration 3235/3560 Training loss: 1.4471 0.0696 sec/batch\n", + "Epoch 19/20 Iteration 3236/3560 Training loss: 1.4461 0.0671 sec/batch\n", + "Epoch 19/20 Iteration 3237/3560 Training loss: 1.4466 0.0692 sec/batch\n", + "Epoch 19/20 Iteration 3238/3560 Training loss: 1.4468 0.0662 sec/batch\n", + "Epoch 19/20 Iteration 3239/3560 Training loss: 1.4465 0.0663 sec/batch\n", + "Epoch 19/20 Iteration 3240/3560 Training loss: 1.4465 0.0674 sec/batch\n", + "Epoch 19/20 Iteration 3241/3560 Training loss: 1.4457 0.0673 sec/batch\n", + "Epoch 19/20 Iteration 3242/3560 Training loss: 1.4446 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3243/3560 Training loss: 1.4434 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3244/3560 Training loss: 1.4429 0.0785 sec/batch\n", + "Epoch 19/20 Iteration 3245/3560 Training loss: 1.4424 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3246/3560 Training loss: 1.4428 0.0677 sec/batch\n", + "Epoch 19/20 Iteration 3247/3560 Training loss: 1.4424 0.0698 sec/batch\n", + "Epoch 19/20 Iteration 3248/3560 Training loss: 1.4417 0.0678 sec/batch\n", + "Epoch 19/20 Iteration 3249/3560 Training loss: 1.4421 0.0693 sec/batch\n", + "Epoch 19/20 Iteration 3250/3560 Training loss: 1.4413 0.0710 sec/batch\n", + "Epoch 19/20 Iteration 3251/3560 Training loss: 1.4410 0.0662 sec/batch\n", + "Epoch 19/20 Iteration 3252/3560 Training loss: 1.4408 0.0723 sec/batch\n", + "Epoch 19/20 Iteration 3253/3560 Training loss: 1.4408 0.0691 sec/batch\n", + "Epoch 19/20 Iteration 3254/3560 Training loss: 1.4414 0.0713 sec/batch\n", + "Epoch 19/20 Iteration 3255/3560 Training loss: 1.4409 0.0682 sec/batch\n", + "Epoch 19/20 Iteration 3256/3560 Training loss: 1.4415 0.0796 sec/batch\n", + "Epoch 19/20 Iteration 3257/3560 Training loss: 1.4416 0.0706 sec/batch\n", + "Epoch 19/20 Iteration 3258/3560 
Training loss: 1.4419 0.0662 sec/batch\n", + "Epoch 19/20 Iteration 3259/3560 Training loss: 1.4417 0.0708 sec/batch\n", + "Epoch 19/20 Iteration 3260/3560 Training loss: 1.4418 0.0689 sec/batch\n", + "Epoch 19/20 Iteration 3261/3560 Training loss: 1.4422 0.0674 sec/batch\n", + "Epoch 19/20 Iteration 3262/3560 Training loss: 1.4417 0.0705 sec/batch\n", + "Epoch 19/20 Iteration 3263/3560 Training loss: 1.4413 0.0695 sec/batch\n", + "Epoch 19/20 Iteration 3264/3560 Training loss: 1.4417 0.0682 sec/batch\n", + "Epoch 19/20 Iteration 3265/3560 Training loss: 1.4418 0.0740 sec/batch\n", + "Epoch 19/20 Iteration 3266/3560 Training loss: 1.4426 0.0680 sec/batch\n", + "Epoch 19/20 Iteration 3267/3560 Training loss: 1.4428 0.0698 sec/batch\n", + "Epoch 19/20 Iteration 3268/3560 Training loss: 1.4429 0.0704 sec/batch\n", + "Epoch 19/20 Iteration 3269/3560 Training loss: 1.4430 0.0692 sec/batch\n", + "Epoch 19/20 Iteration 3270/3560 Training loss: 1.4431 0.0754 sec/batch\n", + "Epoch 19/20 Iteration 3271/3560 Training loss: 1.4432 0.0703 sec/batch\n", + "Epoch 19/20 Iteration 3272/3560 Training loss: 1.4430 0.0763 sec/batch\n", + "Epoch 19/20 Iteration 3273/3560 Training loss: 1.4430 0.0694 sec/batch\n", + "Epoch 19/20 Iteration 3274/3560 Training loss: 1.4429 0.0694 sec/batch\n", + "Epoch 19/20 Iteration 3275/3560 Training loss: 1.4435 0.0845 sec/batch\n", + "Epoch 19/20 Iteration 3276/3560 Training loss: 1.4437 0.0683 sec/batch\n", + "Epoch 19/20 Iteration 3277/3560 Training loss: 1.4443 0.0671 sec/batch\n", + "Epoch 19/20 Iteration 3278/3560 Training loss: 1.4441 0.0682 sec/batch\n", + "Epoch 19/20 Iteration 3279/3560 Training loss: 1.4440 0.0706 sec/batch\n", + "Epoch 19/20 Iteration 3280/3560 Training loss: 1.4442 0.0754 sec/batch\n", + "Epoch 19/20 Iteration 3281/3560 Training loss: 1.4440 0.0689 sec/batch\n", + "Epoch 19/20 Iteration 3282/3560 Training loss: 1.4440 0.0650 sec/batch\n", + "Epoch 19/20 Iteration 3283/3560 Training loss: 1.4434 0.0737 sec/batch\n", + 
"Epoch 19/20 Iteration 3284/3560 Training loss: 1.4433 0.0712 sec/batch\n", + "Epoch 19/20 Iteration 3285/3560 Training loss: 1.4428 0.0683 sec/batch\n", + "Epoch 19/20 Iteration 3286/3560 Training loss: 1.4428 0.0674 sec/batch\n", + "Epoch 19/20 Iteration 3287/3560 Training loss: 1.4424 0.0710 sec/batch\n", + "Epoch 19/20 Iteration 3288/3560 Training loss: 1.4423 0.0702 sec/batch\n", + "Epoch 19/20 Iteration 3289/3560 Training loss: 1.4420 0.0689 sec/batch\n", + "Epoch 19/20 Iteration 3290/3560 Training loss: 1.4418 0.0664 sec/batch\n", + "Epoch 19/20 Iteration 3291/3560 Training loss: 1.4416 0.0661 sec/batch\n", + "Epoch 19/20 Iteration 3292/3560 Training loss: 1.4414 0.0723 sec/batch\n", + "Epoch 19/20 Iteration 3293/3560 Training loss: 1.4410 0.0684 sec/batch\n", + "Epoch 19/20 Iteration 3294/3560 Training loss: 1.4411 0.0753 sec/batch\n", + "Epoch 19/20 Iteration 3295/3560 Training loss: 1.4408 0.0709 sec/batch\n", + "Epoch 19/20 Iteration 3296/3560 Training loss: 1.4406 0.0692 sec/batch\n", + "Epoch 19/20 Iteration 3297/3560 Training loss: 1.4402 0.0682 sec/batch\n", + "Epoch 19/20 Iteration 3298/3560 Training loss: 1.4399 0.0721 sec/batch\n", + "Epoch 19/20 Iteration 3299/3560 Training loss: 1.4397 0.0667 sec/batch\n", + "Epoch 19/20 Iteration 3300/3560 Training loss: 1.4398 0.0676 sec/batch\n", + "Epoch 19/20 Iteration 3301/3560 Training loss: 1.4397 0.0686 sec/batch\n", + "Epoch 19/20 Iteration 3302/3560 Training loss: 1.4394 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3303/3560 Training loss: 1.4390 0.0685 sec/batch\n", + "Epoch 19/20 Iteration 3304/3560 Training loss: 1.4386 0.0703 sec/batch\n", + "Epoch 19/20 Iteration 3305/3560 Training loss: 1.4387 0.0643 sec/batch\n", + "Epoch 19/20 Iteration 3306/3560 Training loss: 1.4386 0.0671 sec/batch\n", + "Epoch 19/20 Iteration 3307/3560 Training loss: 1.4385 0.0695 sec/batch\n", + "Epoch 19/20 Iteration 3308/3560 Training loss: 1.4383 0.0691 sec/batch\n", + "Epoch 19/20 Iteration 3309/3560 Training loss: 
1.4382 0.0682 sec/batch\n", + "Epoch 19/20 Iteration 3310/3560 Training loss: 1.4382 0.0760 sec/batch\n", + "Epoch 19/20 Iteration 3311/3560 Training loss: 1.4382 0.0740 sec/batch\n", + "Epoch 19/20 Iteration 3312/3560 Training loss: 1.4382 0.0701 sec/batch\n", + "Epoch 19/20 Iteration 3313/3560 Training loss: 1.4381 0.0672 sec/batch\n", + "Epoch 19/20 Iteration 3314/3560 Training loss: 1.4381 0.0714 sec/batch\n", + "Epoch 19/20 Iteration 3315/3560 Training loss: 1.4380 0.0665 sec/batch\n", + "Epoch 19/20 Iteration 3316/3560 Training loss: 1.4379 0.0675 sec/batch\n", + "Epoch 19/20 Iteration 3317/3560 Training loss: 1.4378 0.0683 sec/batch\n", + "Epoch 19/20 Iteration 3318/3560 Training loss: 1.4377 0.0716 sec/batch\n", + "Epoch 19/20 Iteration 3319/3560 Training loss: 1.4374 0.0694 sec/batch\n", + "Epoch 19/20 Iteration 3320/3560 Training loss: 1.4371 0.0700 sec/batch\n", + "Epoch 19/20 Iteration 3321/3560 Training loss: 1.4372 0.0695 sec/batch\n", + "Epoch 19/20 Iteration 3322/3560 Training loss: 1.4372 0.0660 sec/batch\n", + "Epoch 19/20 Iteration 3323/3560 Training loss: 1.4370 0.0679 sec/batch\n", + "Epoch 19/20 Iteration 3324/3560 Training loss: 1.4370 0.0747 sec/batch\n", + "Epoch 19/20 Iteration 3325/3560 Training loss: 1.4369 0.0671 sec/batch\n", + "Epoch 19/20 Iteration 3326/3560 Training loss: 1.4366 0.0676 sec/batch\n", + "Epoch 19/20 Iteration 3327/3560 Training loss: 1.4362 0.0694 sec/batch\n", + "Epoch 19/20 Iteration 3328/3560 Training loss: 1.4363 0.0697 sec/batch\n", + "Epoch 19/20 Iteration 3329/3560 Training loss: 1.4363 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3330/3560 Training loss: 1.4360 0.0664 sec/batch\n", + "Epoch 19/20 Iteration 3331/3560 Training loss: 1.4360 0.0771 sec/batch\n", + "Epoch 19/20 Iteration 3332/3560 Training loss: 1.4360 0.0684 sec/batch\n", + "Epoch 19/20 Iteration 3333/3560 Training loss: 1.4359 0.0670 sec/batch\n", + "Epoch 19/20 Iteration 3334/3560 Training loss: 1.4355 0.0679 sec/batch\n", + "Epoch 19/20 
Iteration 3335/3560 Training loss: 1.4351 0.0704 sec/batch\n", + "Epoch 19/20 Iteration 3336/3560 Training loss: 1.4349 0.0651 sec/batch\n", + "Epoch 19/20 Iteration 3337/3560 Training loss: 1.4350 0.0670 sec/batch\n", + "Epoch 19/20 Iteration 3338/3560 Training loss: 1.4350 0.0739 sec/batch\n", + "Epoch 19/20 Iteration 3339/3560 Training loss: 1.4350 0.0685 sec/batch\n", + "Epoch 19/20 Iteration 3340/3560 Training loss: 1.4350 0.0764 sec/batch\n", + "Epoch 19/20 Iteration 3341/3560 Training loss: 1.4352 0.0774 sec/batch\n", + "Epoch 19/20 Iteration 3342/3560 Training loss: 1.4353 0.0680 sec/batch\n", + "Epoch 19/20 Iteration 3343/3560 Training loss: 1.4353 0.0742 sec/batch\n", + "Epoch 19/20 Iteration 3344/3560 Training loss: 1.4353 0.0716 sec/batch\n", + "Epoch 19/20 Iteration 3345/3560 Training loss: 1.4357 0.0697 sec/batch\n", + "Epoch 19/20 Iteration 3346/3560 Training loss: 1.4357 0.0790 sec/batch\n", + "Epoch 19/20 Iteration 3347/3560 Training loss: 1.4357 0.0667 sec/batch\n", + "Epoch 19/20 Iteration 3348/3560 Training loss: 1.4359 0.0685 sec/batch\n", + "Epoch 19/20 Iteration 3349/3560 Training loss: 1.4358 0.0728 sec/batch\n", + "Epoch 19/20 Iteration 3350/3560 Training loss: 1.4360 0.0690 sec/batch\n", + "Epoch 19/20 Iteration 3351/3560 Training loss: 1.4360 0.0685 sec/batch\n", + "Epoch 19/20 Iteration 3352/3560 Training loss: 1.4362 0.0668 sec/batch\n", + "Epoch 19/20 Iteration 3353/3560 Training loss: 1.4363 0.0695 sec/batch\n", + "Epoch 19/20 Iteration 3354/3560 Training loss: 1.4361 0.0713 sec/batch\n", + "Epoch 19/20 Iteration 3355/3560 Training loss: 1.4359 0.0671 sec/batch\n", + "Epoch 19/20 Iteration 3356/3560 Training loss: 1.4358 0.0758 sec/batch\n", + "Epoch 19/20 Iteration 3357/3560 Training loss: 1.4358 0.0677 sec/batch\n", + "Epoch 19/20 Iteration 3358/3560 Training loss: 1.4357 0.0803 sec/batch\n", + "Epoch 19/20 Iteration 3359/3560 Training loss: 1.4357 0.0795 sec/batch\n", + "Epoch 19/20 Iteration 3360/3560 Training loss: 1.4358 0.0772 
sec/batch\n", + "Epoch 19/20 Iteration 3361/3560 Training loss: 1.4358 0.0682 sec/batch\n", + "Epoch 19/20 Iteration 3362/3560 Training loss: 1.4358 0.0684 sec/batch\n", + "Epoch 19/20 Iteration 3363/3560 Training loss: 1.4355 0.0676 sec/batch\n", + "Epoch 19/20 Iteration 3364/3560 Training loss: 1.4357 0.0681 sec/batch\n", + "Epoch 19/20 Iteration 3365/3560 Training loss: 1.4360 0.0673 sec/batch\n", + "Epoch 19/20 Iteration 3366/3560 Training loss: 1.4360 0.0780 sec/batch\n", + "Epoch 19/20 Iteration 3367/3560 Training loss: 1.4360 0.0711 sec/batch\n", + "Epoch 19/20 Iteration 3368/3560 Training loss: 1.4360 0.0663 sec/batch\n", + "Epoch 19/20 Iteration 3369/3560 Training loss: 1.4360 0.0685 sec/batch\n", + "Epoch 19/20 Iteration 3370/3560 Training loss: 1.4360 0.0707 sec/batch\n", + "Epoch 19/20 Iteration 3371/3560 Training loss: 1.4361 0.0691 sec/batch\n", + "Epoch 19/20 Iteration 3372/3560 Training loss: 1.4365 0.0762 sec/batch\n", + "Epoch 19/20 Iteration 3373/3560 Training loss: 1.4366 0.0668 sec/batch\n", + "Epoch 19/20 Iteration 3374/3560 Training loss: 1.4366 0.0680 sec/batch\n", + "Epoch 19/20 Iteration 3375/3560 Training loss: 1.4365 0.0673 sec/batch\n", + "Epoch 19/20 Iteration 3376/3560 Training loss: 1.4364 0.0683 sec/batch\n", + "Epoch 19/20 Iteration 3377/3560 Training loss: 1.4366 0.0671 sec/batch\n", + "Epoch 19/20 Iteration 3378/3560 Training loss: 1.4366 0.0692 sec/batch\n", + "Epoch 19/20 Iteration 3379/3560 Training loss: 1.4367 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3380/3560 Training loss: 1.4366 0.0695 sec/batch\n", + "Epoch 19/20 Iteration 3381/3560 Training loss: 1.4365 0.0720 sec/batch\n", + "Epoch 19/20 Iteration 3382/3560 Training loss: 1.4366 0.0703 sec/batch\n", + "Epoch 20/20 Iteration 3383/3560 Training loss: 1.5275 0.0667 sec/batch\n", + "Epoch 20/20 Iteration 3384/3560 Training loss: 1.4834 0.0667 sec/batch\n", + "Epoch 20/20 Iteration 3385/3560 Training loss: 1.4664 0.0664 sec/batch\n", + "Epoch 20/20 Iteration 3386/3560 
Training loss: 1.4617 0.0685 sec/batch\n", + "Epoch 20/20 Iteration 3387/3560 Training loss: 1.4542 0.0718 sec/batch\n", + "Epoch 20/20 Iteration 3388/3560 Training loss: 1.4455 0.0671 sec/batch\n", + "Epoch 20/20 Iteration 3389/3560 Training loss: 1.4446 0.0668 sec/batch\n", + "Epoch 20/20 Iteration 3390/3560 Training loss: 1.4427 0.0685 sec/batch\n", + "Epoch 20/20 Iteration 3391/3560 Training loss: 1.4431 0.0661 sec/batch\n", + "Epoch 20/20 Iteration 3392/3560 Training loss: 1.4422 0.0683 sec/batch\n", + "Epoch 20/20 Iteration 3393/3560 Training loss: 1.4390 0.0662 sec/batch\n", + "Epoch 20/20 Iteration 3394/3560 Training loss: 1.4378 0.0684 sec/batch\n", + "Epoch 20/20 Iteration 3395/3560 Training loss: 1.4376 0.0733 sec/batch\n", + "Epoch 20/20 Iteration 3396/3560 Training loss: 1.4395 0.0713 sec/batch\n", + "Epoch 20/20 Iteration 3397/3560 Training loss: 1.4377 0.0665 sec/batch\n", + "Epoch 20/20 Iteration 3398/3560 Training loss: 1.4365 0.0652 sec/batch\n", + "Epoch 20/20 Iteration 3399/3560 Training loss: 1.4369 0.0702 sec/batch\n", + "Epoch 20/20 Iteration 3400/3560 Training loss: 1.4381 0.0720 sec/batch\n", + "Epoch 20/20 Iteration 3401/3560 Training loss: 1.4379 0.0687 sec/batch\n", + "Epoch 20/20 Iteration 3402/3560 Training loss: 1.4392 0.0685 sec/batch\n", + "Epoch 20/20 Iteration 3403/3560 Training loss: 1.4384 0.0740 sec/batch\n", + "Epoch 20/20 Iteration 3404/3560 Training loss: 1.4389 0.0696 sec/batch\n", + "Epoch 20/20 Iteration 3405/3560 Training loss: 1.4382 0.0697 sec/batch\n", + "Epoch 20/20 Iteration 3406/3560 Training loss: 1.4380 0.0682 sec/batch\n", + "Epoch 20/20 Iteration 3407/3560 Training loss: 1.4377 0.0705 sec/batch\n", + "Epoch 20/20 Iteration 3408/3560 Training loss: 1.4363 0.0665 sec/batch\n", + "Epoch 20/20 Iteration 3409/3560 Training loss: 1.4352 0.0668 sec/batch\n", + "Epoch 20/20 Iteration 3410/3560 Training loss: 1.4355 0.0692 sec/batch\n", + "Epoch 20/20 Iteration 3411/3560 Training loss: 1.4355 0.0677 sec/batch\n", + 
"Epoch 20/20 Iteration 3412/3560 Training loss: 1.4358 0.0794 sec/batch\n", + "Epoch 20/20 Iteration 3413/3560 Training loss: 1.4354 0.0675 sec/batch\n", + "Epoch 20/20 Iteration 3414/3560 Training loss: 1.4344 0.0676 sec/batch\n", + "Epoch 20/20 Iteration 3415/3560 Training loss: 1.4348 0.0744 sec/batch\n", + "Epoch 20/20 Iteration 3416/3560 Training loss: 1.4348 0.0695 sec/batch\n", + "Epoch 20/20 Iteration 3417/3560 Training loss: 1.4346 0.0689 sec/batch\n", + "Epoch 20/20 Iteration 3418/3560 Training loss: 1.4341 0.0640 sec/batch\n", + "Epoch 20/20 Iteration 3419/3560 Training loss: 1.4334 0.0679 sec/batch\n", + "Epoch 20/20 Iteration 3420/3560 Training loss: 1.4325 0.0760 sec/batch\n", + "Epoch 20/20 Iteration 3421/3560 Training loss: 1.4311 0.0659 sec/batch\n", + "Epoch 20/20 Iteration 3422/3560 Training loss: 1.4307 0.0660 sec/batch\n", + "Epoch 20/20 Iteration 3423/3560 Training loss: 1.4303 0.0667 sec/batch\n", + "Epoch 20/20 Iteration 3424/3560 Training loss: 1.4308 0.0660 sec/batch\n", + "Epoch 20/20 Iteration 3425/3560 Training loss: 1.4304 0.0783 sec/batch\n", + "Epoch 20/20 Iteration 3426/3560 Training loss: 1.4297 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3427/3560 Training loss: 1.4300 0.0694 sec/batch\n", + "Epoch 20/20 Iteration 3428/3560 Training loss: 1.4293 0.0648 sec/batch\n", + "Epoch 20/20 Iteration 3429/3560 Training loss: 1.4290 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3430/3560 Training loss: 1.4286 0.0684 sec/batch\n", + "Epoch 20/20 Iteration 3431/3560 Training loss: 1.4284 0.0703 sec/batch\n", + "Epoch 20/20 Iteration 3432/3560 Training loss: 1.4288 0.0678 sec/batch\n", + "Epoch 20/20 Iteration 3433/3560 Training loss: 1.4285 0.0717 sec/batch\n", + "Epoch 20/20 Iteration 3434/3560 Training loss: 1.4292 0.0689 sec/batch\n", + "Epoch 20/20 Iteration 3435/3560 Training loss: 1.4291 0.0706 sec/batch\n", + "Epoch 20/20 Iteration 3436/3560 Training loss: 1.4291 0.0720 sec/batch\n", + "Epoch 20/20 Iteration 3437/3560 Training loss: 
1.4289 0.0661 sec/batch\n", + "Epoch 20/20 Iteration 3438/3560 Training loss: 1.4290 0.0657 sec/batch\n", + "Epoch 20/20 Iteration 3439/3560 Training loss: 1.4294 0.0661 sec/batch\n", + "Epoch 20/20 Iteration 3440/3560 Training loss: 1.4292 0.0656 sec/batch\n", + "Epoch 20/20 Iteration 3441/3560 Training loss: 1.4286 0.0656 sec/batch\n", + "Epoch 20/20 Iteration 3442/3560 Training loss: 1.4292 0.0678 sec/batch\n", + "Epoch 20/20 Iteration 3443/3560 Training loss: 1.4292 0.0684 sec/batch\n", + "Epoch 20/20 Iteration 3444/3560 Training loss: 1.4301 0.0673 sec/batch\n", + "Epoch 20/20 Iteration 3445/3560 Training loss: 1.4306 0.0714 sec/batch\n", + "Epoch 20/20 Iteration 3446/3560 Training loss: 1.4307 0.0710 sec/batch\n", + "Epoch 20/20 Iteration 3447/3560 Training loss: 1.4308 0.0663 sec/batch\n", + "Epoch 20/20 Iteration 3448/3560 Training loss: 1.4310 0.0731 sec/batch\n", + "Epoch 20/20 Iteration 3449/3560 Training loss: 1.4311 0.0677 sec/batch\n", + "Epoch 20/20 Iteration 3450/3560 Training loss: 1.4307 0.0702 sec/batch\n", + "Epoch 20/20 Iteration 3451/3560 Training loss: 1.4308 0.0684 sec/batch\n", + "Epoch 20/20 Iteration 3452/3560 Training loss: 1.4307 0.0673 sec/batch\n", + "Epoch 20/20 Iteration 3453/3560 Training loss: 1.4314 0.0681 sec/batch\n", + "Epoch 20/20 Iteration 3454/3560 Training loss: 1.4316 0.0681 sec/batch\n", + "Epoch 20/20 Iteration 3455/3560 Training loss: 1.4320 0.0660 sec/batch\n", + "Epoch 20/20 Iteration 3456/3560 Training loss: 1.4318 0.0686 sec/batch\n", + "Epoch 20/20 Iteration 3457/3560 Training loss: 1.4316 0.0673 sec/batch\n", + "Epoch 20/20 Iteration 3458/3560 Training loss: 1.4318 0.0661 sec/batch\n", + "Epoch 20/20 Iteration 3459/3560 Training loss: 1.4316 0.0694 sec/batch\n", + "Epoch 20/20 Iteration 3460/3560 Training loss: 1.4316 0.0675 sec/batch\n", + "Epoch 20/20 Iteration 3461/3560 Training loss: 1.4312 0.0683 sec/batch\n", + "Epoch 20/20 Iteration 3462/3560 Training loss: 1.4311 0.0751 sec/batch\n", + "Epoch 20/20 
Iteration 3463/3560 Training loss: 1.4307 0.0676 sec/batch\n", + "Epoch 20/20 Iteration 3464/3560 Training loss: 1.4306 0.0662 sec/batch\n", + "Epoch 20/20 Iteration 3465/3560 Training loss: 1.4302 0.0670 sec/batch\n", + "Epoch 20/20 Iteration 3466/3560 Training loss: 1.4302 0.0686 sec/batch\n", + "Epoch 20/20 Iteration 3467/3560 Training loss: 1.4300 0.0677 sec/batch\n", + "Epoch 20/20 Iteration 3468/3560 Training loss: 1.4298 0.0662 sec/batch\n", + "Epoch 20/20 Iteration 3469/3560 Training loss: 1.4296 0.0678 sec/batch\n", + "Epoch 20/20 Iteration 3470/3560 Training loss: 1.4293 0.0746 sec/batch\n", + "Epoch 20/20 Iteration 3471/3560 Training loss: 1.4291 0.0690 sec/batch\n", + "Epoch 20/20 Iteration 3472/3560 Training loss: 1.4292 0.0696 sec/batch\n", + "Epoch 20/20 Iteration 3473/3560 Training loss: 1.4289 0.0713 sec/batch\n", + "Epoch 20/20 Iteration 3474/3560 Training loss: 1.4288 0.0719 sec/batch\n", + "Epoch 20/20 Iteration 3475/3560 Training loss: 1.4285 0.0664 sec/batch\n", + "Epoch 20/20 Iteration 3476/3560 Training loss: 1.4282 0.0669 sec/batch\n", + "Epoch 20/20 Iteration 3477/3560 Training loss: 1.4279 0.0750 sec/batch\n", + "Epoch 20/20 Iteration 3478/3560 Training loss: 1.4278 0.0664 sec/batch\n", + "Epoch 20/20 Iteration 3479/3560 Training loss: 1.4278 0.0683 sec/batch\n", + "Epoch 20/20 Iteration 3480/3560 Training loss: 1.4275 0.0702 sec/batch\n", + "Epoch 20/20 Iteration 3481/3560 Training loss: 1.4272 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3482/3560 Training loss: 1.4269 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3483/3560 Training loss: 1.4269 0.0694 sec/batch\n", + "Epoch 20/20 Iteration 3484/3560 Training loss: 1.4267 0.0685 sec/batch\n", + "Epoch 20/20 Iteration 3485/3560 Training loss: 1.4266 0.0663 sec/batch\n", + "Epoch 20/20 Iteration 3486/3560 Training loss: 1.4265 0.0688 sec/batch\n", + "Epoch 20/20 Iteration 3487/3560 Training loss: 1.4264 0.0634 sec/batch\n", + "Epoch 20/20 Iteration 3488/3560 Training loss: 1.4263 0.0691 
sec/batch\n", + "Epoch 20/20 Iteration 3489/3560 Training loss: 1.4264 0.0671 sec/batch\n", + "Epoch 20/20 Iteration 3490/3560 Training loss: 1.4264 0.0653 sec/batch\n", + "Epoch 20/20 Iteration 3491/3560 Training loss: 1.4263 0.0668 sec/batch\n", + "Epoch 20/20 Iteration 3492/3560 Training loss: 1.4263 0.0696 sec/batch\n", + "Epoch 20/20 Iteration 3493/3560 Training loss: 1.4262 0.0708 sec/batch\n", + "Epoch 20/20 Iteration 3494/3560 Training loss: 1.4261 0.0773 sec/batch\n", + "Epoch 20/20 Iteration 3495/3560 Training loss: 1.4259 0.0680 sec/batch\n", + "Epoch 20/20 Iteration 3496/3560 Training loss: 1.4257 0.0758 sec/batch\n", + "Epoch 20/20 Iteration 3497/3560 Training loss: 1.4255 0.0704 sec/batch\n", + "Epoch 20/20 Iteration 3498/3560 Training loss: 1.4251 0.0686 sec/batch\n", + "Epoch 20/20 Iteration 3499/3560 Training loss: 1.4252 0.0694 sec/batch\n", + "Epoch 20/20 Iteration 3500/3560 Training loss: 1.4252 0.0675 sec/batch\n", + "Epoch 20/20 Iteration 3501/3560 Training loss: 1.4252 0.0673 sec/batch\n", + "Epoch 20/20 Iteration 3502/3560 Training loss: 1.4252 0.0743 sec/batch\n", + "Epoch 20/20 Iteration 3503/3560 Training loss: 1.4251 0.0644 sec/batch\n", + "Epoch 20/20 Iteration 3504/3560 Training loss: 1.4248 0.0677 sec/batch\n", + "Epoch 20/20 Iteration 3505/3560 Training loss: 1.4244 0.0660 sec/batch\n", + "Epoch 20/20 Iteration 3506/3560 Training loss: 1.4245 0.0693 sec/batch\n", + "Epoch 20/20 Iteration 3507/3560 Training loss: 1.4244 0.0706 sec/batch\n", + "Epoch 20/20 Iteration 3508/3560 Training loss: 1.4240 0.0682 sec/batch\n", + "Epoch 20/20 Iteration 3509/3560 Training loss: 1.4241 0.0667 sec/batch\n", + "Epoch 20/20 Iteration 3510/3560 Training loss: 1.4241 0.0662 sec/batch\n", + "Epoch 20/20 Iteration 3511/3560 Training loss: 1.4240 0.0678 sec/batch\n", + "Epoch 20/20 Iteration 3512/3560 Training loss: 1.4238 0.0668 sec/batch\n", + "Epoch 20/20 Iteration 3513/3560 Training loss: 1.4234 0.0676 sec/batch\n", + "Epoch 20/20 Iteration 3514/3560 
Training loss: 1.4232 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3515/3560 Training loss: 1.4232 0.0668 sec/batch\n", + "Epoch 20/20 Iteration 3516/3560 Training loss: 1.4233 0.0675 sec/batch\n", + "Epoch 20/20 Iteration 3517/3560 Training loss: 1.4233 0.0783 sec/batch\n", + "Epoch 20/20 Iteration 3518/3560 Training loss: 1.4232 0.0688 sec/batch\n", + "Epoch 20/20 Iteration 3519/3560 Training loss: 1.4234 0.0690 sec/batch\n", + "Epoch 20/20 Iteration 3520/3560 Training loss: 1.4235 0.0809 sec/batch\n", + "Epoch 20/20 Iteration 3521/3560 Training loss: 1.4235 0.0685 sec/batch\n", + "Epoch 20/20 Iteration 3522/3560 Training loss: 1.4235 0.0727 sec/batch\n", + "Epoch 20/20 Iteration 3523/3560 Training loss: 1.4239 0.0742 sec/batch\n", + "Epoch 20/20 Iteration 3524/3560 Training loss: 1.4239 0.0728 sec/batch\n", + "Epoch 20/20 Iteration 3525/3560 Training loss: 1.4238 0.0714 sec/batch\n", + "Epoch 20/20 Iteration 3526/3560 Training loss: 1.4240 0.0650 sec/batch\n", + "Epoch 20/20 Iteration 3527/3560 Training loss: 1.4238 0.0853 sec/batch\n", + "Epoch 20/20 Iteration 3528/3560 Training loss: 1.4240 0.0748 sec/batch\n", + "Epoch 20/20 Iteration 3529/3560 Training loss: 1.4240 0.0705 sec/batch\n", + "Epoch 20/20 Iteration 3530/3560 Training loss: 1.4242 0.0733 sec/batch\n", + "Epoch 20/20 Iteration 3531/3560 Training loss: 1.4244 0.0744 sec/batch\n", + "Epoch 20/20 Iteration 3532/3560 Training loss: 1.4243 0.0735 sec/batch\n", + "Epoch 20/20 Iteration 3533/3560 Training loss: 1.4240 0.0697 sec/batch\n", + "Epoch 20/20 Iteration 3534/3560 Training loss: 1.4240 0.0713 sec/batch\n", + "Epoch 20/20 Iteration 3535/3560 Training loss: 1.4241 0.0683 sec/batch\n", + "Epoch 20/20 Iteration 3536/3560 Training loss: 1.4241 0.0653 sec/batch\n", + "Epoch 20/20 Iteration 3537/3560 Training loss: 1.4241 0.0698 sec/batch\n", + "Epoch 20/20 Iteration 3538/3560 Training loss: 1.4242 0.0683 sec/batch\n", + "Epoch 20/20 Iteration 3539/3560 Training loss: 1.4242 0.0664 sec/batch\n", + 
"Epoch 20/20 Iteration 3540/3560 Training loss: 1.4242 0.0707 sec/batch\n", + "Epoch 20/20 Iteration 3541/3560 Training loss: 1.4240 0.0702 sec/batch\n", + "Epoch 20/20 Iteration 3542/3560 Training loss: 1.4241 0.0712 sec/batch\n", + "Epoch 20/20 Iteration 3543/3560 Training loss: 1.4244 0.0688 sec/batch\n", + "Epoch 20/20 Iteration 3544/3560 Training loss: 1.4244 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3545/3560 Training loss: 1.4244 0.0692 sec/batch\n", + "Epoch 20/20 Iteration 3546/3560 Training loss: 1.4245 0.0750 sec/batch\n", + "Epoch 20/20 Iteration 3547/3560 Training loss: 1.4245 0.0682 sec/batch\n", + "Epoch 20/20 Iteration 3548/3560 Training loss: 1.4245 0.0707 sec/batch\n", + "Epoch 20/20 Iteration 3549/3560 Training loss: 1.4246 0.0700 sec/batch\n", + "Epoch 20/20 Iteration 3550/3560 Training loss: 1.4250 0.0660 sec/batch\n", + "Epoch 20/20 Iteration 3551/3560 Training loss: 1.4250 0.0702 sec/batch\n", + "Epoch 20/20 Iteration 3552/3560 Training loss: 1.4250 0.0668 sec/batch\n", + "Epoch 20/20 Iteration 3553/3560 Training loss: 1.4249 0.0664 sec/batch\n", + "Epoch 20/20 Iteration 3554/3560 Training loss: 1.4248 0.0660 sec/batch\n", + "Epoch 20/20 Iteration 3555/3560 Training loss: 1.4250 0.0658 sec/batch\n", + "Epoch 20/20 Iteration 3556/3560 Training loss: 1.4250 0.0661 sec/batch\n", + "Epoch 20/20 Iteration 3557/3560 Training loss: 1.4251 0.0797 sec/batch\n", + "Epoch 20/20 Iteration 3558/3560 Training loss: 1.4250 0.0667 sec/batch\n", + "Epoch 20/20 Iteration 3559/3560 Training loss: 1.4249 0.0697 sec/batch\n", + "Epoch 20/20 Iteration 3560/3560 Training loss: 1.4250 0.0705 sec/batch\n", + "Epoch 1/20 Iteration 1/3560 Training loss: 4.4156 1.2725 sec/batch\n", + "Epoch 1/20 Iteration 2/3560 Training loss: 4.4064 0.0710 sec/batch\n", + "Epoch 1/20 Iteration 3/3560 Training loss: 4.3955 0.0637 sec/batch\n", + "Epoch 1/20 Iteration 4/3560 Training loss: 4.3804 0.0655 sec/batch\n", + "Epoch 1/20 Iteration 5/3560 Training loss: 4.3584 0.0631 
sec/batch\n", + "Epoch 1/20 Iteration 6/3560 Training loss: 4.3232 0.0614 sec/batch\n", + "Epoch 1/20 Iteration 7/3560 Training loss: 4.2645 0.0632 sec/batch\n", + "Epoch 1/20 Iteration 8/3560 Training loss: 4.1938 0.0662 sec/batch\n", + "Epoch 1/20 Iteration 9/3560 Training loss: 4.1349 0.0598 sec/batch\n", + "Epoch 1/20 Iteration 10/3560 Training loss: 4.0832 0.0614 sec/batch\n", + "Epoch 1/20 Iteration 11/3560 Training loss: 4.0336 0.0628 sec/batch\n", + "Epoch 1/20 Iteration 12/3560 Training loss: 3.9895 0.0675 sec/batch\n", + "Epoch 1/20 Iteration 13/3560 Training loss: 3.9506 0.0721 sec/batch\n", + "Epoch 1/20 Iteration 14/3560 Training loss: 3.9175 0.0612 sec/batch\n", + "Epoch 1/20 Iteration 15/3560 Training loss: 3.8868 0.0672 sec/batch\n", + "Epoch 1/20 Iteration 16/3560 Training loss: 3.8584 0.0649 sec/batch\n", + "Epoch 1/20 Iteration 17/3560 Training loss: 3.8318 0.0702 sec/batch\n", + "Epoch 1/20 Iteration 18/3560 Training loss: 3.8083 0.0623 sec/batch\n", + "Epoch 1/20 Iteration 19/3560 Training loss: 3.7863 0.0683 sec/batch\n", + "Epoch 1/20 Iteration 20/3560 Training loss: 3.7640 0.0610 sec/batch\n", + "Epoch 1/20 Iteration 21/3560 Training loss: 3.7437 0.0650 sec/batch\n", + "Epoch 1/20 Iteration 22/3560 Training loss: 3.7253 0.0607 sec/batch\n", + "Epoch 1/20 Iteration 23/3560 Training loss: 3.7082 0.0729 sec/batch\n", + "Epoch 1/20 Iteration 24/3560 Training loss: 3.6924 0.0664 sec/batch\n", + "Epoch 1/20 Iteration 25/3560 Training loss: 3.6774 0.0679 sec/batch\n", + "Epoch 1/20 Iteration 26/3560 Training loss: 3.6638 0.0627 sec/batch\n", + "Epoch 1/20 Iteration 27/3560 Training loss: 3.6514 0.0615 sec/batch\n", + "Epoch 1/20 Iteration 28/3560 Training loss: 3.6387 0.0626 sec/batch\n", + "Epoch 1/20 Iteration 29/3560 Training loss: 3.6268 0.0632 sec/batch\n", + "Epoch 1/20 Iteration 30/3560 Training loss: 3.6158 0.0662 sec/batch\n", + "Epoch 1/20 Iteration 31/3560 Training loss: 3.6060 0.0647 sec/batch\n", + "Epoch 1/20 Iteration 32/3560 
Training loss: 3.5958 0.0667 sec/batch\n", + "Epoch 1/20 Iteration 33/3560 Training loss: 3.5856 0.0618 sec/batch\n", + "Epoch 1/20 Iteration 34/3560 Training loss: 3.5766 0.0602 sec/batch\n", + "Epoch 1/20 Iteration 35/3560 Training loss: 3.5675 0.0607 sec/batch\n", + "Epoch 1/20 Iteration 36/3560 Training loss: 3.5594 0.0610 sec/batch\n", + "Epoch 1/20 Iteration 37/3560 Training loss: 3.5510 0.0663 sec/batch\n", + "Epoch 1/20 Iteration 38/3560 Training loss: 3.5430 0.0601 sec/batch\n", + "Epoch 1/20 Iteration 39/3560 Training loss: 3.5351 0.0605 sec/batch\n", + "Epoch 1/20 Iteration 40/3560 Training loss: 3.5280 0.0674 sec/batch\n", + "Epoch 1/20 Iteration 41/3560 Training loss: 3.5208 0.0643 sec/batch\n", + "Epoch 1/20 Iteration 42/3560 Training loss: 3.5140 0.0682 sec/batch\n", + "Epoch 1/20 Iteration 43/3560 Training loss: 3.5073 0.0655 sec/batch\n", + "Epoch 1/20 Iteration 44/3560 Training loss: 3.5009 0.0658 sec/batch\n", + "Epoch 1/20 Iteration 45/3560 Training loss: 3.4947 0.0666 sec/batch\n", + "Epoch 1/20 Iteration 46/3560 Training loss: 3.4890 0.0669 sec/batch\n", + "Epoch 1/20 Iteration 47/3560 Training loss: 3.4836 0.0623 sec/batch\n", + "Epoch 1/20 Iteration 48/3560 Training loss: 3.4784 0.0631 sec/batch\n", + "Epoch 1/20 Iteration 49/3560 Training loss: 3.4735 0.0706 sec/batch\n", + "Epoch 1/20 Iteration 50/3560 Training loss: 3.4686 0.0607 sec/batch\n", + "Epoch 1/20 Iteration 51/3560 Training loss: 3.4637 0.0650 sec/batch\n", + "Epoch 1/20 Iteration 52/3560 Training loss: 3.4589 0.0718 sec/batch\n", + "Epoch 1/20 Iteration 53/3560 Training loss: 3.4545 0.0591 sec/batch\n", + "Epoch 1/20 Iteration 54/3560 Training loss: 3.4500 0.0618 sec/batch\n", + "Epoch 1/20 Iteration 55/3560 Training loss: 3.4460 0.0700 sec/batch\n", + "Epoch 1/20 Iteration 56/3560 Training loss: 3.4416 0.0639 sec/batch\n", + "Epoch 1/20 Iteration 57/3560 Training loss: 3.4374 0.0603 sec/batch\n", + "Epoch 1/20 Iteration 58/3560 Training loss: 3.4336 0.0612 sec/batch\n", + 
"Epoch 1/20 Iteration 59/3560 Training loss: 3.4297 0.0601 sec/batch\n", + "Epoch 1/20 Iteration 60/3560 Training loss: 3.4260 0.0658 sec/batch\n", + "Epoch 1/20 Iteration 61/3560 Training loss: 3.4225 0.0601 sec/batch\n", + "Epoch 1/20 Iteration 62/3560 Training loss: 3.4194 0.0603 sec/batch\n", + "Epoch 1/20 Iteration 63/3560 Training loss: 3.4164 0.0609 sec/batch\n", + "Epoch 1/20 Iteration 64/3560 Training loss: 3.4129 0.0671 sec/batch\n", + "Epoch 1/20 Iteration 65/3560 Training loss: 3.4094 0.0679 sec/batch\n", + "Epoch 1/20 Iteration 66/3560 Training loss: 3.4065 0.0613 sec/batch\n", + "Epoch 1/20 Iteration 67/3560 Training loss: 3.4037 0.0635 sec/batch\n", + "Epoch 1/20 Iteration 68/3560 Training loss: 3.4003 0.0611 sec/batch\n", + "Epoch 1/20 Iteration 69/3560 Training loss: 3.3972 0.0611 sec/batch\n", + "Epoch 1/20 Iteration 70/3560 Training loss: 3.3945 0.0603 sec/batch\n", + "Epoch 1/20 Iteration 71/3560 Training loss: 3.3917 0.0611 sec/batch\n", + "Epoch 1/20 Iteration 72/3560 Training loss: 3.3894 0.0594 sec/batch\n", + "Epoch 1/20 Iteration 73/3560 Training loss: 3.3868 0.0629 sec/batch\n", + "Epoch 1/20 Iteration 74/3560 Training loss: 3.3843 0.0640 sec/batch\n", + "Epoch 1/20 Iteration 75/3560 Training loss: 3.3820 0.0634 sec/batch\n", + "Epoch 1/20 Iteration 76/3560 Training loss: 3.3796 0.0612 sec/batch\n", + "Epoch 1/20 Iteration 77/3560 Training loss: 3.3772 0.0602 sec/batch\n", + "Epoch 1/20 Iteration 78/3560 Training loss: 3.3750 0.0611 sec/batch\n", + "Epoch 1/20 Iteration 79/3560 Training loss: 3.3725 0.0597 sec/batch\n", + "Epoch 1/20 Iteration 80/3560 Training loss: 3.3700 0.0654 sec/batch\n", + "Epoch 1/20 Iteration 81/3560 Training loss: 3.3677 0.0685 sec/batch\n", + "Epoch 1/20 Iteration 82/3560 Training loss: 3.3657 0.0682 sec/batch\n", + "Epoch 1/20 Iteration 83/3560 Training loss: 3.3637 0.0596 sec/batch\n", + "Epoch 1/20 Iteration 84/3560 Training loss: 3.3616 0.0606 sec/batch\n", + "Epoch 1/20 Iteration 85/3560 Training loss: 
3.3594 0.0585 sec/batch\n", + "[... repetitive per-iteration training-log output elided: training loss decreases steadily from ~3.36 (Epoch 1/20, Iteration 85/3560) to ~2.30 (Epoch 4/20, Iteration 610/3560), at roughly 0.06 sec/batch throughout ...]\n", + "Epoch 4/20 Iteration 611/3560 Training loss: 
2.3032 0.0701 sec/batch\n", + "Epoch 4/20 Iteration 612/3560 Training loss: 2.3032 0.0643 sec/batch\n", + "Epoch 4/20 Iteration 613/3560 Training loss: 2.3026 0.0652 sec/batch\n", + "Epoch 4/20 Iteration 614/3560 Training loss: 2.3020 0.0676 sec/batch\n", + "Epoch 4/20 Iteration 615/3560 Training loss: 2.3014 0.0669 sec/batch\n", + "Epoch 4/20 Iteration 616/3560 Training loss: 2.3012 0.0646 sec/batch\n", + "Epoch 4/20 Iteration 617/3560 Training loss: 2.3006 0.0637 sec/batch\n", + "Epoch 4/20 Iteration 618/3560 Training loss: 2.2999 0.0654 sec/batch\n", + "Epoch 4/20 Iteration 619/3560 Training loss: 2.2990 0.0731 sec/batch\n", + "Epoch 4/20 Iteration 620/3560 Training loss: 2.2984 0.0671 sec/batch\n", + "Epoch 4/20 Iteration 621/3560 Training loss: 2.2981 0.0652 sec/batch\n", + "Epoch 4/20 Iteration 622/3560 Training loss: 2.2976 0.0707 sec/batch\n", + "Epoch 4/20 Iteration 623/3560 Training loss: 2.2970 0.0804 sec/batch\n", + "Epoch 4/20 Iteration 624/3560 Training loss: 2.2967 0.0688 sec/batch\n", + "Epoch 4/20 Iteration 625/3560 Training loss: 2.2962 0.0629 sec/batch\n", + "Epoch 4/20 Iteration 626/3560 Training loss: 2.2958 0.0705 sec/batch\n", + "Epoch 4/20 Iteration 627/3560 Training loss: 2.2952 0.0714 sec/batch\n", + "Epoch 4/20 Iteration 628/3560 Training loss: 2.2947 0.0663 sec/batch\n", + "Epoch 4/20 Iteration 629/3560 Training loss: 2.2941 0.0643 sec/batch\n", + "Epoch 4/20 Iteration 630/3560 Training loss: 2.2936 0.0648 sec/batch\n", + "Epoch 4/20 Iteration 631/3560 Training loss: 2.2931 0.0701 sec/batch\n", + "Epoch 4/20 Iteration 632/3560 Training loss: 2.2927 0.0654 sec/batch\n", + "Epoch 4/20 Iteration 633/3560 Training loss: 2.2922 0.0676 sec/batch\n", + "Epoch 4/20 Iteration 634/3560 Training loss: 2.2916 0.0628 sec/batch\n", + "Epoch 4/20 Iteration 635/3560 Training loss: 2.2914 0.0703 sec/batch\n", + "Epoch 4/20 Iteration 636/3560 Training loss: 2.2910 0.0668 sec/batch\n", + "Epoch 4/20 Iteration 637/3560 Training loss: 2.2905 0.0707 
sec/batch\n", + "Epoch 4/20 Iteration 638/3560 Training loss: 2.2899 0.0668 sec/batch\n", + "Epoch 4/20 Iteration 639/3560 Training loss: 2.2894 0.0716 sec/batch\n", + "Epoch 4/20 Iteration 640/3560 Training loss: 2.2890 0.0655 sec/batch\n", + "Epoch 4/20 Iteration 641/3560 Training loss: 2.2886 0.0662 sec/batch\n", + "Epoch 4/20 Iteration 642/3560 Training loss: 2.2883 0.0751 sec/batch\n", + "Epoch 4/20 Iteration 643/3560 Training loss: 2.2881 0.0676 sec/batch\n", + "Epoch 4/20 Iteration 644/3560 Training loss: 2.2876 0.0665 sec/batch\n", + "Epoch 4/20 Iteration 645/3560 Training loss: 2.2872 0.0645 sec/batch\n", + "Epoch 4/20 Iteration 646/3560 Training loss: 2.2868 0.0763 sec/batch\n", + "Epoch 4/20 Iteration 647/3560 Training loss: 2.2864 0.0715 sec/batch\n", + "Epoch 4/20 Iteration 648/3560 Training loss: 2.2859 0.0760 sec/batch\n", + "Epoch 4/20 Iteration 649/3560 Training loss: 2.2854 0.0680 sec/batch\n", + "Epoch 4/20 Iteration 650/3560 Training loss: 2.2847 0.0688 sec/batch\n", + "Epoch 4/20 Iteration 651/3560 Training loss: 2.2844 0.0668 sec/batch\n", + "Epoch 4/20 Iteration 652/3560 Training loss: 2.2841 0.0667 sec/batch\n", + "Epoch 4/20 Iteration 653/3560 Training loss: 2.2838 0.0654 sec/batch\n", + "Epoch 4/20 Iteration 654/3560 Training loss: 2.2835 0.0686 sec/batch\n", + "Epoch 4/20 Iteration 655/3560 Training loss: 2.2834 0.0665 sec/batch\n", + "Epoch 4/20 Iteration 656/3560 Training loss: 2.2829 0.0627 sec/batch\n", + "Epoch 4/20 Iteration 657/3560 Training loss: 2.2825 0.0663 sec/batch\n", + "Epoch 4/20 Iteration 658/3560 Training loss: 2.2823 0.0681 sec/batch\n", + "Epoch 4/20 Iteration 659/3560 Training loss: 2.2819 0.0693 sec/batch\n", + "Epoch 4/20 Iteration 660/3560 Training loss: 2.2812 0.0637 sec/batch\n", + "Epoch 4/20 Iteration 661/3560 Training loss: 2.2809 0.0688 sec/batch\n", + "Epoch 4/20 Iteration 662/3560 Training loss: 2.2806 0.0756 sec/batch\n", + "Epoch 4/20 Iteration 663/3560 Training loss: 2.2802 0.0656 sec/batch\n", + "Epoch 
4/20 Iteration 664/3560 Training loss: 2.2800 0.0633 sec/batch\n", + "Epoch 4/20 Iteration 665/3560 Training loss: 2.2795 0.0653 sec/batch\n", + "Epoch 4/20 Iteration 666/3560 Training loss: 2.2789 0.0631 sec/batch\n", + "Epoch 4/20 Iteration 667/3560 Training loss: 2.2786 0.0700 sec/batch\n", + "Epoch 4/20 Iteration 668/3560 Training loss: 2.2784 0.0692 sec/batch\n", + "Epoch 4/20 Iteration 669/3560 Training loss: 2.2780 0.0682 sec/batch\n", + "Epoch 4/20 Iteration 670/3560 Training loss: 2.2777 0.0722 sec/batch\n", + "Epoch 4/20 Iteration 671/3560 Training loss: 2.2773 0.0659 sec/batch\n", + "Epoch 4/20 Iteration 672/3560 Training loss: 2.2769 0.0665 sec/batch\n", + "Epoch 4/20 Iteration 673/3560 Training loss: 2.2769 0.0687 sec/batch\n", + "Epoch 4/20 Iteration 674/3560 Training loss: 2.2764 0.0683 sec/batch\n", + "Epoch 4/20 Iteration 675/3560 Training loss: 2.2763 0.0654 sec/batch\n", + "Epoch 4/20 Iteration 676/3560 Training loss: 2.2759 0.0674 sec/batch\n", + "Epoch 4/20 Iteration 677/3560 Training loss: 2.2754 0.0642 sec/batch\n", + "Epoch 4/20 Iteration 678/3560 Training loss: 2.2751 0.0672 sec/batch\n", + "Epoch 4/20 Iteration 679/3560 Training loss: 2.2747 0.0631 sec/batch\n", + "Epoch 4/20 Iteration 680/3560 Training loss: 2.2745 0.0690 sec/batch\n", + "Epoch 4/20 Iteration 681/3560 Training loss: 2.2742 0.0644 sec/batch\n", + "Epoch 4/20 Iteration 682/3560 Training loss: 2.2741 0.0696 sec/batch\n", + "Epoch 4/20 Iteration 683/3560 Training loss: 2.2737 0.0661 sec/batch\n", + "Epoch 4/20 Iteration 684/3560 Training loss: 2.2733 0.0664 sec/batch\n", + "Epoch 4/20 Iteration 685/3560 Training loss: 2.2730 0.0641 sec/batch\n", + "Epoch 4/20 Iteration 686/3560 Training loss: 2.2729 0.0642 sec/batch\n", + "Epoch 4/20 Iteration 687/3560 Training loss: 2.2726 0.0633 sec/batch\n", + "Epoch 4/20 Iteration 688/3560 Training loss: 2.2724 0.0702 sec/batch\n", + "Epoch 4/20 Iteration 689/3560 Training loss: 2.2720 0.0729 sec/batch\n", + "Epoch 4/20 Iteration 690/3560 
Training loss: 2.2717 0.0727 sec/batch\n", + "Epoch 4/20 Iteration 691/3560 Training loss: 2.2713 0.0665 sec/batch\n", + "Epoch 4/20 Iteration 692/3560 Training loss: 2.2709 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 693/3560 Training loss: 2.2705 0.0660 sec/batch\n", + "Epoch 4/20 Iteration 694/3560 Training loss: 2.2704 0.0699 sec/batch\n", + "Epoch 4/20 Iteration 695/3560 Training loss: 2.2702 0.0713 sec/batch\n", + "Epoch 4/20 Iteration 696/3560 Training loss: 2.2697 0.0687 sec/batch\n", + "Epoch 4/20 Iteration 697/3560 Training loss: 2.2694 0.0663 sec/batch\n", + "Epoch 4/20 Iteration 698/3560 Training loss: 2.2691 0.0680 sec/batch\n", + "Epoch 4/20 Iteration 699/3560 Training loss: 2.2688 0.0665 sec/batch\n", + "Epoch 4/20 Iteration 700/3560 Training loss: 2.2685 0.0757 sec/batch\n", + "Epoch 4/20 Iteration 701/3560 Training loss: 2.2682 0.0731 sec/batch\n", + "Epoch 4/20 Iteration 702/3560 Training loss: 2.2681 0.0680 sec/batch\n", + "Epoch 4/20 Iteration 703/3560 Training loss: 2.2678 0.0774 sec/batch\n", + "Epoch 4/20 Iteration 704/3560 Training loss: 2.2674 0.0681 sec/batch\n", + "Epoch 4/20 Iteration 705/3560 Training loss: 2.2670 0.0753 sec/batch\n", + "Epoch 4/20 Iteration 706/3560 Training loss: 2.2667 0.0685 sec/batch\n", + "Epoch 4/20 Iteration 707/3560 Training loss: 2.2665 0.0670 sec/batch\n", + "Epoch 4/20 Iteration 708/3560 Training loss: 2.2663 0.0765 sec/batch\n", + "Epoch 4/20 Iteration 709/3560 Training loss: 2.2661 0.0715 sec/batch\n", + "Epoch 4/20 Iteration 710/3560 Training loss: 2.2658 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 711/3560 Training loss: 2.2654 0.0776 sec/batch\n", + "Epoch 4/20 Iteration 712/3560 Training loss: 2.2650 0.0668 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 2.3106 0.0677 sec/batch\n", + "Epoch 5/20 Iteration 714/3560 Training loss: 2.2477 0.0653 sec/batch\n", + "Epoch 5/20 Iteration 715/3560 Training loss: 2.2265 0.0700 sec/batch\n", + "Epoch 5/20 Iteration 716/3560 Training loss: 2.2177 
0.0760 sec/batch\n", + "Epoch 5/20 Iteration 717/3560 Training loss: 2.2130 0.0741 sec/batch\n", + "Epoch 5/20 Iteration 718/3560 Training loss: 2.2072 0.0720 sec/batch\n", + "Epoch 5/20 Iteration 719/3560 Training loss: 2.2070 0.0681 sec/batch\n", + "Epoch 5/20 Iteration 720/3560 Training loss: 2.2090 0.0713 sec/batch\n", + "Epoch 5/20 Iteration 721/3560 Training loss: 2.2102 0.0664 sec/batch\n", + "Epoch 5/20 Iteration 722/3560 Training loss: 2.2090 0.0623 sec/batch\n", + "Epoch 5/20 Iteration 723/3560 Training loss: 2.2063 0.0653 sec/batch\n", + "Epoch 5/20 Iteration 724/3560 Training loss: 2.2049 0.0651 sec/batch\n", + "Epoch 5/20 Iteration 725/3560 Training loss: 2.2043 0.0666 sec/batch\n", + "Epoch 5/20 Iteration 726/3560 Training loss: 2.2067 0.0748 sec/batch\n", + "Epoch 5/20 Iteration 727/3560 Training loss: 2.2065 0.0650 sec/batch\n", + "Epoch 5/20 Iteration 728/3560 Training loss: 2.2054 0.0731 sec/batch\n", + "Epoch 5/20 Iteration 729/3560 Training loss: 2.2049 0.0638 sec/batch\n", + "Epoch 5/20 Iteration 730/3560 Training loss: 2.2065 0.0663 sec/batch\n", + "Epoch 5/20 Iteration 731/3560 Training loss: 2.2064 0.0694 sec/batch\n", + "Epoch 5/20 Iteration 732/3560 Training loss: 2.2052 0.0663 sec/batch\n", + "Epoch 5/20 Iteration 733/3560 Training loss: 2.2040 0.0656 sec/batch\n", + "Epoch 5/20 Iteration 734/3560 Training loss: 2.2051 0.0658 sec/batch\n", + "Epoch 5/20 Iteration 735/3560 Training loss: 2.2043 0.0676 sec/batch\n", + "Epoch 5/20 Iteration 736/3560 Training loss: 2.2035 0.0679 sec/batch\n", + "Epoch 5/20 Iteration 737/3560 Training loss: 2.2027 0.0735 sec/batch\n", + "Epoch 5/20 Iteration 738/3560 Training loss: 2.2014 0.0662 sec/batch\n", + "Epoch 5/20 Iteration 739/3560 Training loss: 2.2007 0.0658 sec/batch\n", + "Epoch 5/20 Iteration 740/3560 Training loss: 2.2005 0.0663 sec/batch\n", + "Epoch 5/20 Iteration 741/3560 Training loss: 2.2009 0.0699 sec/batch\n", + "Epoch 5/20 Iteration 742/3560 Training loss: 2.2008 0.0783 sec/batch\n", + 
"Epoch 5/20 Iteration 743/3560 Training loss: 2.2007 0.0692 sec/batch\n", + "Epoch 5/20 Iteration 744/3560 Training loss: 2.1999 0.0668 sec/batch\n", + "Epoch 5/20 Iteration 745/3560 Training loss: 2.1991 0.0663 sec/batch\n", + "Epoch 5/20 Iteration 746/3560 Training loss: 2.1996 0.0663 sec/batch\n", + "Epoch 5/20 Iteration 747/3560 Training loss: 2.1990 0.0637 sec/batch\n", + "Epoch 5/20 Iteration 748/3560 Training loss: 2.1986 0.0695 sec/batch\n", + "Epoch 5/20 Iteration 749/3560 Training loss: 2.1982 0.0697 sec/batch\n", + "Epoch 5/20 Iteration 750/3560 Training loss: 2.1969 0.0699 sec/batch\n", + "Epoch 5/20 Iteration 751/3560 Training loss: 2.1959 0.0708 sec/batch\n", + "Epoch 5/20 Iteration 752/3560 Training loss: 2.1948 0.0690 sec/batch\n", + "Epoch 5/20 Iteration 753/3560 Training loss: 2.1940 0.0703 sec/batch\n", + "Epoch 5/20 Iteration 754/3560 Training loss: 2.1935 0.0658 sec/batch\n", + "Epoch 5/20 Iteration 755/3560 Training loss: 2.1926 0.0647 sec/batch\n", + "Epoch 5/20 Iteration 756/3560 Training loss: 2.1917 0.0681 sec/batch\n", + "Epoch 5/20 Iteration 757/3560 Training loss: 2.1910 0.0646 sec/batch\n", + "Epoch 5/20 Iteration 758/3560 Training loss: 2.1895 0.0676 sec/batch\n", + "Epoch 5/20 Iteration 759/3560 Training loss: 2.1893 0.0646 sec/batch\n", + "Epoch 5/20 Iteration 760/3560 Training loss: 2.1883 0.0741 sec/batch\n", + "Epoch 5/20 Iteration 761/3560 Training loss: 2.1879 0.0710 sec/batch\n", + "Epoch 5/20 Iteration 762/3560 Training loss: 2.1881 0.0741 sec/batch\n", + "Epoch 5/20 Iteration 763/3560 Training loss: 2.1874 0.0677 sec/batch\n", + "Epoch 5/20 Iteration 764/3560 Training loss: 2.1876 0.0686 sec/batch\n", + "Epoch 5/20 Iteration 765/3560 Training loss: 2.1869 0.0745 sec/batch\n", + "Epoch 5/20 Iteration 766/3560 Training loss: 2.1864 0.0719 sec/batch\n", + "Epoch 5/20 Iteration 767/3560 Training loss: 2.1858 0.0690 sec/batch\n", + "Epoch 5/20 Iteration 768/3560 Training loss: 2.1857 0.0674 sec/batch\n", + "Epoch 5/20 Iteration 
769/3560 Training loss: 2.1855 0.0667 sec/batch\n", + "Epoch 5/20 Iteration 770/3560 Training loss: 2.1850 0.0731 sec/batch\n", + "Epoch 5/20 Iteration 771/3560 Training loss: 2.1844 0.0716 sec/batch\n", + "Epoch 5/20 Iteration 772/3560 Training loss: 2.1845 0.0676 sec/batch\n", + "Epoch 5/20 Iteration 773/3560 Training loss: 2.1839 0.0666 sec/batch\n", + "Epoch 5/20 Iteration 774/3560 Training loss: 2.1839 0.0684 sec/batch\n", + "Epoch 5/20 Iteration 775/3560 Training loss: 2.1839 0.0658 sec/batch\n", + "Epoch 5/20 Iteration 776/3560 Training loss: 2.1837 0.0674 sec/batch\n", + "Epoch 5/20 Iteration 777/3560 Training loss: 2.1833 0.0665 sec/batch\n", + "Epoch 5/20 Iteration 778/3560 Training loss: 2.1832 0.0677 sec/batch\n", + "Epoch 5/20 Iteration 779/3560 Training loss: 2.1831 0.0672 sec/batch\n", + "Epoch 5/20 Iteration 780/3560 Training loss: 2.1822 0.0707 sec/batch\n", + "Epoch 5/20 Iteration 781/3560 Training loss: 2.1815 0.0696 sec/batch\n", + "Epoch 5/20 Iteration 782/3560 Training loss: 2.1813 0.0638 sec/batch\n", + "Epoch 5/20 Iteration 783/3560 Training loss: 2.1814 0.0636 sec/batch\n", + "Epoch 5/20 Iteration 784/3560 Training loss: 2.1811 0.0678 sec/batch\n", + "Epoch 5/20 Iteration 785/3560 Training loss: 2.1811 0.0729 sec/batch\n", + "Epoch 5/20 Iteration 786/3560 Training loss: 2.1805 0.0695 sec/batch\n", + "Epoch 5/20 Iteration 787/3560 Training loss: 2.1802 0.0681 sec/batch\n", + "Epoch 5/20 Iteration 788/3560 Training loss: 2.1803 0.0654 sec/batch\n", + "Epoch 5/20 Iteration 789/3560 Training loss: 2.1799 0.0720 sec/batch\n", + "Epoch 5/20 Iteration 790/3560 Training loss: 2.1797 0.0694 sec/batch\n", + "Epoch 5/20 Iteration 791/3560 Training loss: 2.1791 0.0669 sec/batch\n", + "Epoch 5/20 Iteration 792/3560 Training loss: 2.1788 0.0674 sec/batch\n", + "Epoch 5/20 Iteration 793/3560 Training loss: 2.1782 0.0639 sec/batch\n", + "Epoch 5/20 Iteration 794/3560 Training loss: 2.1780 0.0654 sec/batch\n", + "Epoch 5/20 Iteration 795/3560 Training loss: 
2.1774 0.0702 sec/batch\n", + "Epoch 5/20 Iteration 796/3560 Training loss: 2.1768 0.0645 sec/batch\n", + "Epoch 5/20 Iteration 797/3560 Training loss: 2.1760 0.0641 sec/batch\n", + "Epoch 5/20 Iteration 798/3560 Training loss: 2.1755 0.0633 sec/batch\n", + "Epoch 5/20 Iteration 799/3560 Training loss: 2.1753 0.0668 sec/batch\n", + "Epoch 5/20 Iteration 800/3560 Training loss: 2.1749 0.0647 sec/batch\n", + "Epoch 5/20 Iteration 801/3560 Training loss: 2.1742 0.0656 sec/batch\n", + "Epoch 5/20 Iteration 802/3560 Training loss: 2.1741 0.0660 sec/batch\n", + "Epoch 5/20 Iteration 803/3560 Training loss: 2.1736 0.0672 sec/batch\n", + "Epoch 5/20 Iteration 804/3560 Training loss: 2.1734 0.0653 sec/batch\n", + "Epoch 5/20 Iteration 805/3560 Training loss: 2.1728 0.0679 sec/batch\n", + "Epoch 5/20 Iteration 806/3560 Training loss: 2.1722 0.0640 sec/batch\n", + "Epoch 5/20 Iteration 807/3560 Training loss: 2.1717 0.0640 sec/batch\n", + "Epoch 5/20 Iteration 808/3560 Training loss: 2.1713 0.0649 sec/batch\n", + "Epoch 5/20 Iteration 809/3560 Training loss: 2.1710 0.0630 sec/batch\n", + "Epoch 5/20 Iteration 810/3560 Training loss: 2.1706 0.0664 sec/batch\n", + "Epoch 5/20 Iteration 811/3560 Training loss: 2.1700 0.0698 sec/batch\n", + "Epoch 5/20 Iteration 812/3560 Training loss: 2.1693 0.0745 sec/batch\n", + "Epoch 5/20 Iteration 813/3560 Training loss: 2.1691 0.0734 sec/batch\n", + "Epoch 5/20 Iteration 814/3560 Training loss: 2.1689 0.0659 sec/batch\n", + "Epoch 5/20 Iteration 815/3560 Training loss: 2.1683 0.0766 sec/batch\n", + "Epoch 5/20 Iteration 816/3560 Training loss: 2.1679 0.0685 sec/batch\n", + "Epoch 5/20 Iteration 817/3560 Training loss: 2.1673 0.0694 sec/batch\n", + "Epoch 5/20 Iteration 818/3560 Training loss: 2.1671 0.0735 sec/batch\n", + "Epoch 5/20 Iteration 819/3560 Training loss: 2.1668 0.0684 sec/batch\n", + "Epoch 5/20 Iteration 820/3560 Training loss: 2.1667 0.0737 sec/batch\n", + "Epoch 5/20 Iteration 821/3560 Training loss: 2.1666 0.0672 
sec/batch\n", + "Epoch 5/20 Iteration 822/3560 Training loss: 2.1661 0.0724 sec/batch\n", + "Epoch 5/20 Iteration 823/3560 Training loss: 2.1658 0.0737 sec/batch\n", + "Epoch 5/20 Iteration 824/3560 Training loss: 2.1654 0.0660 sec/batch\n", + "Epoch 5/20 Iteration 825/3560 Training loss: 2.1651 0.0764 sec/batch\n", + "Epoch 5/20 Iteration 826/3560 Training loss: 2.1647 0.0674 sec/batch\n", + "Epoch 5/20 Iteration 827/3560 Training loss: 2.1643 0.0695 sec/batch\n", + "Epoch 5/20 Iteration 828/3560 Training loss: 2.1636 0.0679 sec/batch\n", + "Epoch 5/20 Iteration 829/3560 Training loss: 2.1633 0.0689 sec/batch\n", + "Epoch 5/20 Iteration 830/3560 Training loss: 2.1630 0.0639 sec/batch\n", + "Epoch 5/20 Iteration 831/3560 Training loss: 2.1628 0.0629 sec/batch\n", + "Epoch 5/20 Iteration 832/3560 Training loss: 2.1626 0.0652 sec/batch\n", + "Epoch 5/20 Iteration 833/3560 Training loss: 2.1624 0.0639 sec/batch\n", + "Epoch 5/20 Iteration 834/3560 Training loss: 2.1620 0.0654 sec/batch\n", + "Epoch 5/20 Iteration 835/3560 Training loss: 2.1616 0.0643 sec/batch\n", + "Epoch 5/20 Iteration 836/3560 Training loss: 2.1615 0.0660 sec/batch\n", + "Epoch 5/20 Iteration 837/3560 Training loss: 2.1611 0.0680 sec/batch\n", + "Epoch 5/20 Iteration 838/3560 Training loss: 2.1606 0.0658 sec/batch\n", + "Epoch 5/20 Iteration 839/3560 Training loss: 2.1604 0.0682 sec/batch\n", + "Epoch 5/20 Iteration 840/3560 Training loss: 2.1602 0.0674 sec/batch\n", + "Epoch 5/20 Iteration 841/3560 Training loss: 2.1600 0.0675 sec/batch\n", + "Epoch 5/20 Iteration 842/3560 Training loss: 2.1597 0.0655 sec/batch\n", + "Epoch 5/20 Iteration 843/3560 Training loss: 2.1594 0.0664 sec/batch\n", + "Epoch 5/20 Iteration 844/3560 Training loss: 2.1589 0.0740 sec/batch\n", + "Epoch 5/20 Iteration 845/3560 Training loss: 2.1587 0.0694 sec/batch\n", + "Epoch 5/20 Iteration 846/3560 Training loss: 2.1585 0.0715 sec/batch\n", + "Epoch 5/20 Iteration 847/3560 Training loss: 2.1582 0.0661 sec/batch\n", + "Epoch 
5/20 Iteration 848/3560 Training loss: 2.1581 0.0652 sec/batch\n", + "Epoch 5/20 Iteration 849/3560 Training loss: 2.1579 0.0750 sec/batch\n", + "Epoch 5/20 Iteration 850/3560 Training loss: 2.1577 0.0704 sec/batch\n", + "Epoch 5/20 Iteration 851/3560 Training loss: 2.1576 0.0664 sec/batch\n", + "Epoch 5/20 Iteration 852/3560 Training loss: 2.1572 0.0637 sec/batch\n", + "Epoch 5/20 Iteration 853/3560 Training loss: 2.1571 0.0645 sec/batch\n", + "Epoch 5/20 Iteration 854/3560 Training loss: 2.1568 0.0676 sec/batch\n", + "Epoch 5/20 Iteration 855/3560 Training loss: 2.1564 0.0679 sec/batch\n", + "Epoch 5/20 Iteration 856/3560 Training loss: 2.1561 0.0668 sec/batch\n", + "Epoch 5/20 Iteration 857/3560 Training loss: 2.1558 0.0649 sec/batch\n", + "Epoch 5/20 Iteration 858/3560 Training loss: 2.1556 0.0721 sec/batch\n", + "Epoch 5/20 Iteration 859/3560 Training loss: 2.1554 0.0666 sec/batch\n", + "Epoch 5/20 Iteration 860/3560 Training loss: 2.1554 0.0784 sec/batch\n", + "Epoch 5/20 Iteration 861/3560 Training loss: 2.1551 0.0695 sec/batch\n", + "Epoch 5/20 Iteration 862/3560 Training loss: 2.1547 0.0664 sec/batch\n", + "Epoch 5/20 Iteration 863/3560 Training loss: 2.1545 0.0691 sec/batch\n", + "Epoch 5/20 Iteration 864/3560 Training loss: 2.1545 0.0683 sec/batch\n", + "Epoch 5/20 Iteration 865/3560 Training loss: 2.1543 0.0642 sec/batch\n", + "Epoch 5/20 Iteration 866/3560 Training loss: 2.1541 0.0673 sec/batch\n", + "Epoch 5/20 Iteration 867/3560 Training loss: 2.1538 0.0698 sec/batch\n", + "Epoch 5/20 Iteration 868/3560 Training loss: 2.1535 0.0659 sec/batch\n", + "Epoch 5/20 Iteration 869/3560 Training loss: 2.1533 0.0698 sec/batch\n", + "Epoch 5/20 Iteration 870/3560 Training loss: 2.1531 0.0741 sec/batch\n", + "Epoch 5/20 Iteration 871/3560 Training loss: 2.1527 0.0717 sec/batch\n", + "Epoch 5/20 Iteration 872/3560 Training loss: 2.1527 0.0670 sec/batch\n", + "Epoch 5/20 Iteration 873/3560 Training loss: 2.1525 0.0684 sec/batch\n", + "Epoch 5/20 Iteration 874/3560 
Training loss: 2.1523 0.0679 sec/batch\n", + "Epoch 5/20 Iteration 875/3560 Training loss: 2.1520 0.0666 sec/batch\n", + "Epoch 5/20 Iteration 876/3560 Training loss: 2.1518 0.0665 sec/batch\n", + "Epoch 5/20 Iteration 877/3560 Training loss: 2.1515 0.0661 sec/batch\n", + "Epoch 5/20 Iteration 878/3560 Training loss: 2.1513 0.0682 sec/batch\n", + "Epoch 5/20 Iteration 879/3560 Training loss: 2.1510 0.0655 sec/batch\n", + "Epoch 5/20 Iteration 880/3560 Training loss: 2.1510 0.0651 sec/batch\n", + "Epoch 5/20 Iteration 881/3560 Training loss: 2.1508 0.0664 sec/batch\n", + "Epoch 5/20 Iteration 882/3560 Training loss: 2.1505 0.0694 sec/batch\n", + "Epoch 5/20 Iteration 883/3560 Training loss: 2.1501 0.0707 sec/batch\n", + "Epoch 5/20 Iteration 884/3560 Training loss: 2.1498 0.0644 sec/batch\n", + "Epoch 5/20 Iteration 885/3560 Training loss: 2.1497 0.0718 sec/batch\n", + "Epoch 5/20 Iteration 886/3560 Training loss: 2.1495 0.0736 sec/batch\n", + "Epoch 5/20 Iteration 887/3560 Training loss: 2.1493 0.0679 sec/batch\n", + "Epoch 5/20 Iteration 888/3560 Training loss: 2.1490 0.0674 sec/batch\n", + "Epoch 5/20 Iteration 889/3560 Training loss: 2.1487 0.0733 sec/batch\n", + "Epoch 5/20 Iteration 890/3560 Training loss: 2.1485 0.0648 sec/batch\n", + "Epoch 6/20 Iteration 891/3560 Training loss: 2.2032 0.0680 sec/batch\n", + "Epoch 6/20 Iteration 892/3560 Training loss: 2.1389 0.0666 sec/batch\n", + "Epoch 6/20 Iteration 893/3560 Training loss: 2.1210 0.0691 sec/batch\n", + "Epoch 6/20 Iteration 894/3560 Training loss: 2.1142 0.0790 sec/batch\n", + "Epoch 6/20 Iteration 895/3560 Training loss: 2.1116 0.0688 sec/batch\n", + "Epoch 6/20 Iteration 896/3560 Training loss: 2.1053 0.0654 sec/batch\n", + "Epoch 6/20 Iteration 897/3560 Training loss: 2.1051 0.0691 sec/batch\n", + "Epoch 6/20 Iteration 898/3560 Training loss: 2.1054 0.0656 sec/batch\n", + "Epoch 6/20 Iteration 899/3560 Training loss: 2.1068 0.0739 sec/batch\n", + "Epoch 6/20 Iteration 900/3560 Training loss: 2.1053 
0.0675 sec/batch\n", + "Epoch 6/20 Iteration 901/3560 Training loss: 2.1028 0.0681 sec/batch\n", + "Epoch 6/20 Iteration 902/3560 Training loss: 2.1010 0.0682 sec/batch\n", + "Epoch 6/20 Iteration 903/3560 Training loss: 2.1005 0.0677 sec/batch\n", + "Epoch 6/20 Iteration 904/3560 Training loss: 2.1028 0.0676 sec/batch\n", + "Epoch 6/20 Iteration 905/3560 Training loss: 2.1017 0.0644 sec/batch\n", + "Epoch 6/20 Iteration 906/3560 Training loss: 2.1008 0.0643 sec/batch\n", + "Epoch 6/20 Iteration 907/3560 Training loss: 2.1008 0.0655 sec/batch\n", + "Epoch 6/20 Iteration 908/3560 Training loss: 2.1036 0.0722 sec/batch\n", + "Epoch 6/20 Iteration 909/3560 Training loss: 2.1034 0.0667 sec/batch\n", + "Epoch 6/20 Iteration 910/3560 Training loss: 2.1030 0.0803 sec/batch\n", + "Epoch 6/20 Iteration 911/3560 Training loss: 2.1020 0.0647 sec/batch\n", + "Epoch 6/20 Iteration 912/3560 Training loss: 2.1031 0.0716 sec/batch\n", + "Epoch 6/20 Iteration 913/3560 Training loss: 2.1024 0.0659 sec/batch\n", + "Epoch 6/20 Iteration 914/3560 Training loss: 2.1014 0.0653 sec/batch\n", + "Epoch 6/20 Iteration 915/3560 Training loss: 2.1006 0.0673 sec/batch\n", + "Epoch 6/20 Iteration 916/3560 Training loss: 2.0996 0.0650 sec/batch\n", + "Epoch 6/20 Iteration 917/3560 Training loss: 2.0986 0.0687 sec/batch\n", + "Epoch 6/20 Iteration 918/3560 Training loss: 2.0984 0.0747 sec/batch\n", + "Epoch 6/20 Iteration 919/3560 Training loss: 2.0991 0.0674 sec/batch\n", + "Epoch 6/20 Iteration 920/3560 Training loss: 2.0990 0.0779 sec/batch\n", + "Epoch 6/20 Iteration 921/3560 Training loss: 2.0991 0.0645 sec/batch\n", + "Epoch 6/20 Iteration 922/3560 Training loss: 2.0981 0.0645 sec/batch\n", + "Epoch 6/20 Iteration 923/3560 Training loss: 2.0977 0.0663 sec/batch\n", + "Epoch 6/20 Iteration 924/3560 Training loss: 2.0983 0.0648 sec/batch\n", + "Epoch 6/20 Iteration 925/3560 Training loss: 2.0979 0.0665 sec/batch\n", + "Epoch 6/20 Iteration 926/3560 Training loss: 2.0975 0.0634 sec/batch\n", + 
"Epoch 6/20 Iteration 927/3560 Training loss: 2.0971 0.0653 sec/batch\n", + "Epoch 6/20 Iteration 928/3560 Training loss: 2.0959 0.0690 sec/batch\n", + "Epoch 6/20 Iteration 929/3560 Training loss: 2.0948 0.0693 sec/batch\n", + "Epoch 6/20 Iteration 930/3560 Training loss: 2.0939 0.0640 sec/batch\n", + "Epoch 6/20 Iteration 931/3560 Training loss: 2.0931 0.0692 sec/batch\n", + "Epoch 6/20 Iteration 932/3560 Training loss: 2.0927 0.0640 sec/batch\n", + "Epoch 6/20 Iteration 933/3560 Training loss: 2.0918 0.0685 sec/batch\n", + "Epoch 6/20 Iteration 934/3560 Training loss: 2.0908 0.0684 sec/batch\n", + "Epoch 6/20 Iteration 935/3560 Training loss: 2.0904 0.0646 sec/batch\n", + "Epoch 6/20 Iteration 936/3560 Training loss: 2.0890 0.0761 sec/batch\n", + "Epoch 6/20 Iteration 937/3560 Training loss: 2.0890 0.0701 sec/batch\n", + "Epoch 6/20 Iteration 938/3560 Training loss: 2.0884 0.0640 sec/batch\n", + "Epoch 6/20 Iteration 939/3560 Training loss: 2.0880 0.0629 sec/batch\n", + "Epoch 6/20 Iteration 940/3560 Training loss: 2.0883 0.0640 sec/batch\n", + "Epoch 6/20 Iteration 941/3560 Training loss: 2.0874 0.0720 sec/batch\n", + "Epoch 6/20 Iteration 942/3560 Training loss: 2.0879 0.0640 sec/batch\n", + "Epoch 6/20 Iteration 943/3560 Training loss: 2.0873 0.0633 sec/batch\n", + "Epoch 6/20 Iteration 944/3560 Training loss: 2.0870 0.0647 sec/batch\n", + "Epoch 6/20 Iteration 945/3560 Training loss: 2.0866 0.0696 sec/batch\n", + "Epoch 6/20 Iteration 946/3560 Training loss: 2.0865 0.0654 sec/batch\n", + "Epoch 6/20 Iteration 947/3560 Training loss: 2.0864 0.0691 sec/batch\n", + "Epoch 6/20 Iteration 948/3560 Training loss: 2.0859 0.0668 sec/batch\n", + "Epoch 6/20 Iteration 949/3560 Training loss: 2.0855 0.0670 sec/batch\n", + "Epoch 6/20 Iteration 950/3560 Training loss: 2.0856 0.0659 sec/batch\n", + "Epoch 6/20 Iteration 951/3560 Training loss: 2.0853 0.0741 sec/batch\n", + "Epoch 6/20 Iteration 952/3560 Training loss: 2.0855 0.0661 sec/batch\n", + "Epoch 6/20 Iteration 
953/3560 Training loss: 2.0857 0.0659 sec/batch\n",
    + "Epoch 6/20 Iteration 954/3560 Training loss: 2.0854 0.0684 sec/batch\n",
    + "[... per-iteration training log elided: within each epoch the running loss decreases steadily (Epoch 6 ends near 2.056, Epoch 7 starts at 2.1152 and ends near 1.981, Epoch 8 starts at 2.0475 and ends near 1.917, Epoch 9 starts at 1.9938), at roughly 0.065-0.08 sec/batch throughout ...]\n",
    + "Epoch 9/20 Iteration 1497/3560 Training loss: 1.8825 0.0645 sec/batch\n",
+ "Epoch 9/20 Iteration 1498/3560 Training loss: 1.8822 0.0651 sec/batch\n", + "Epoch 9/20 Iteration 1499/3560 Training loss: 1.8820 0.0647 sec/batch\n", + "Epoch 9/20 Iteration 1500/3560 Training loss: 1.8822 0.0642 sec/batch\n", + "Epoch 9/20 Iteration 1501/3560 Training loss: 1.8820 0.0755 sec/batch\n", + "Epoch 9/20 Iteration 1502/3560 Training loss: 1.8821 0.0727 sec/batch\n", + "Epoch 9/20 Iteration 1503/3560 Training loss: 1.8816 0.0645 sec/batch\n", + "Epoch 9/20 Iteration 1504/3560 Training loss: 1.8813 0.0697 sec/batch\n", + "Epoch 9/20 Iteration 1505/3560 Training loss: 1.8807 0.0694 sec/batch\n", + "Epoch 9/20 Iteration 1506/3560 Training loss: 1.8807 0.0710 sec/batch\n", + "Epoch 9/20 Iteration 1507/3560 Training loss: 1.8800 0.0669 sec/batch\n", + "Epoch 9/20 Iteration 1508/3560 Training loss: 1.8799 0.0679 sec/batch\n", + "Epoch 9/20 Iteration 1509/3560 Training loss: 1.8794 0.0710 sec/batch\n", + "Epoch 9/20 Iteration 1510/3560 Training loss: 1.8789 0.0670 sec/batch\n", + "Epoch 9/20 Iteration 1511/3560 Training loss: 1.8787 0.0656 sec/batch\n", + "Epoch 9/20 Iteration 1512/3560 Training loss: 1.8783 0.0667 sec/batch\n", + "Epoch 9/20 Iteration 1513/3560 Training loss: 1.8778 0.0760 sec/batch\n", + "Epoch 9/20 Iteration 1514/3560 Training loss: 1.8779 0.0661 sec/batch\n", + "Epoch 9/20 Iteration 1515/3560 Training loss: 1.8774 0.0679 sec/batch\n", + "Epoch 9/20 Iteration 1516/3560 Training loss: 1.8772 0.0716 sec/batch\n", + "Epoch 9/20 Iteration 1517/3560 Training loss: 1.8767 0.0665 sec/batch\n", + "Epoch 9/20 Iteration 1518/3560 Training loss: 1.8763 0.0648 sec/batch\n", + "Epoch 9/20 Iteration 1519/3560 Training loss: 1.8760 0.0729 sec/batch\n", + "Epoch 9/20 Iteration 1520/3560 Training loss: 1.8758 0.0728 sec/batch\n", + "Epoch 9/20 Iteration 1521/3560 Training loss: 1.8756 0.0653 sec/batch\n", + "Epoch 9/20 Iteration 1522/3560 Training loss: 1.8751 0.0647 sec/batch\n", + "Epoch 9/20 Iteration 1523/3560 Training loss: 1.8748 0.0665 
sec/batch\n", + "Epoch 9/20 Iteration 1524/3560 Training loss: 1.8743 0.0705 sec/batch\n", + "Epoch 9/20 Iteration 1525/3560 Training loss: 1.8743 0.0687 sec/batch\n", + "Epoch 9/20 Iteration 1526/3560 Training loss: 1.8741 0.0637 sec/batch\n", + "Epoch 9/20 Iteration 1527/3560 Training loss: 1.8738 0.0669 sec/batch\n", + "Epoch 9/20 Iteration 1528/3560 Training loss: 1.8736 0.0678 sec/batch\n", + "Epoch 9/20 Iteration 1529/3560 Training loss: 1.8733 0.0677 sec/batch\n", + "Epoch 9/20 Iteration 1530/3560 Training loss: 1.8732 0.0805 sec/batch\n", + "Epoch 9/20 Iteration 1531/3560 Training loss: 1.8731 0.0673 sec/batch\n", + "Epoch 9/20 Iteration 1532/3560 Training loss: 1.8730 0.0660 sec/batch\n", + "Epoch 9/20 Iteration 1533/3560 Training loss: 1.8729 0.0681 sec/batch\n", + "Epoch 9/20 Iteration 1534/3560 Training loss: 1.8728 0.0691 sec/batch\n", + "Epoch 9/20 Iteration 1535/3560 Training loss: 1.8726 0.0791 sec/batch\n", + "Epoch 9/20 Iteration 1536/3560 Training loss: 1.8724 0.0752 sec/batch\n", + "Epoch 9/20 Iteration 1537/3560 Training loss: 1.8722 0.0675 sec/batch\n", + "Epoch 9/20 Iteration 1538/3560 Training loss: 1.8720 0.0659 sec/batch\n", + "Epoch 9/20 Iteration 1539/3560 Training loss: 1.8717 0.0668 sec/batch\n", + "Epoch 9/20 Iteration 1540/3560 Training loss: 1.8713 0.0674 sec/batch\n", + "Epoch 9/20 Iteration 1541/3560 Training loss: 1.8713 0.0643 sec/batch\n", + "Epoch 9/20 Iteration 1542/3560 Training loss: 1.8711 0.0644 sec/batch\n", + "Epoch 9/20 Iteration 1543/3560 Training loss: 1.8710 0.0682 sec/batch\n", + "Epoch 9/20 Iteration 1544/3560 Training loss: 1.8707 0.0711 sec/batch\n", + "Epoch 9/20 Iteration 1545/3560 Training loss: 1.8706 0.0699 sec/batch\n", + "Epoch 9/20 Iteration 1546/3560 Training loss: 1.8703 0.0707 sec/batch\n", + "Epoch 9/20 Iteration 1547/3560 Training loss: 1.8700 0.0700 sec/batch\n", + "Epoch 9/20 Iteration 1548/3560 Training loss: 1.8701 0.0680 sec/batch\n", + "Epoch 9/20 Iteration 1549/3560 Training loss: 1.8699 
0.0740 sec/batch\n", + "Epoch 9/20 Iteration 1550/3560 Training loss: 1.8695 0.0696 sec/batch\n", + "Epoch 9/20 Iteration 1551/3560 Training loss: 1.8695 0.0686 sec/batch\n", + "Epoch 9/20 Iteration 1552/3560 Training loss: 1.8695 0.0648 sec/batch\n", + "Epoch 9/20 Iteration 1553/3560 Training loss: 1.8693 0.0644 sec/batch\n", + "Epoch 9/20 Iteration 1554/3560 Training loss: 1.8691 0.0759 sec/batch\n", + "Epoch 9/20 Iteration 1555/3560 Training loss: 1.8688 0.0661 sec/batch\n", + "Epoch 9/20 Iteration 1556/3560 Training loss: 1.8684 0.0675 sec/batch\n", + "Epoch 9/20 Iteration 1557/3560 Training loss: 1.8683 0.0646 sec/batch\n", + "Epoch 9/20 Iteration 1558/3560 Training loss: 1.8682 0.0727 sec/batch\n", + "Epoch 9/20 Iteration 1559/3560 Training loss: 1.8681 0.0655 sec/batch\n", + "Epoch 9/20 Iteration 1560/3560 Training loss: 1.8681 0.0695 sec/batch\n", + "Epoch 9/20 Iteration 1561/3560 Training loss: 1.8681 0.0669 sec/batch\n", + "Epoch 9/20 Iteration 1562/3560 Training loss: 1.8680 0.0683 sec/batch\n", + "Epoch 9/20 Iteration 1563/3560 Training loss: 1.8681 0.0653 sec/batch\n", + "Epoch 9/20 Iteration 1564/3560 Training loss: 1.8679 0.0669 sec/batch\n", + "Epoch 9/20 Iteration 1565/3560 Training loss: 1.8681 0.0683 sec/batch\n", + "Epoch 9/20 Iteration 1566/3560 Training loss: 1.8679 0.0677 sec/batch\n", + "Epoch 9/20 Iteration 1567/3560 Training loss: 1.8677 0.0669 sec/batch\n", + "Epoch 9/20 Iteration 1568/3560 Training loss: 1.8676 0.0667 sec/batch\n", + "Epoch 9/20 Iteration 1569/3560 Training loss: 1.8674 0.0690 sec/batch\n", + "Epoch 9/20 Iteration 1570/3560 Training loss: 1.8675 0.0679 sec/batch\n", + "Epoch 9/20 Iteration 1571/3560 Training loss: 1.8674 0.0677 sec/batch\n", + "Epoch 9/20 Iteration 1572/3560 Training loss: 1.8675 0.0739 sec/batch\n", + "Epoch 9/20 Iteration 1573/3560 Training loss: 1.8674 0.0698 sec/batch\n", + "Epoch 9/20 Iteration 1574/3560 Training loss: 1.8673 0.0658 sec/batch\n", + "Epoch 9/20 Iteration 1575/3560 Training loss: 
1.8670 0.0683 sec/batch\n", + "Epoch 9/20 Iteration 1576/3560 Training loss: 1.8671 0.0667 sec/batch\n", + "Epoch 9/20 Iteration 1577/3560 Training loss: 1.8671 0.0662 sec/batch\n", + "Epoch 9/20 Iteration 1578/3560 Training loss: 1.8672 0.0771 sec/batch\n", + "Epoch 9/20 Iteration 1579/3560 Training loss: 1.8670 0.0692 sec/batch\n", + "Epoch 9/20 Iteration 1580/3560 Training loss: 1.8669 0.0653 sec/batch\n", + "Epoch 9/20 Iteration 1581/3560 Training loss: 1.8668 0.0743 sec/batch\n", + "Epoch 9/20 Iteration 1582/3560 Training loss: 1.8667 0.0659 sec/batch\n", + "Epoch 9/20 Iteration 1583/3560 Training loss: 1.8665 0.0756 sec/batch\n", + "Epoch 9/20 Iteration 1584/3560 Training loss: 1.8666 0.0778 sec/batch\n", + "Epoch 9/20 Iteration 1585/3560 Training loss: 1.8667 0.0734 sec/batch\n", + "Epoch 9/20 Iteration 1586/3560 Training loss: 1.8665 0.0717 sec/batch\n", + "Epoch 9/20 Iteration 1587/3560 Training loss: 1.8665 0.0699 sec/batch\n", + "Epoch 9/20 Iteration 1588/3560 Training loss: 1.8664 0.0647 sec/batch\n", + "Epoch 9/20 Iteration 1589/3560 Training loss: 1.8664 0.0684 sec/batch\n", + "Epoch 9/20 Iteration 1590/3560 Training loss: 1.8662 0.0644 sec/batch\n", + "Epoch 9/20 Iteration 1591/3560 Training loss: 1.8662 0.0677 sec/batch\n", + "Epoch 9/20 Iteration 1592/3560 Training loss: 1.8664 0.0706 sec/batch\n", + "Epoch 9/20 Iteration 1593/3560 Training loss: 1.8663 0.0681 sec/batch\n", + "Epoch 9/20 Iteration 1594/3560 Training loss: 1.8661 0.0700 sec/batch\n", + "Epoch 9/20 Iteration 1595/3560 Training loss: 1.8659 0.0718 sec/batch\n", + "Epoch 9/20 Iteration 1596/3560 Training loss: 1.8657 0.0652 sec/batch\n", + "Epoch 9/20 Iteration 1597/3560 Training loss: 1.8658 0.0675 sec/batch\n", + "Epoch 9/20 Iteration 1598/3560 Training loss: 1.8657 0.0666 sec/batch\n", + "Epoch 9/20 Iteration 1599/3560 Training loss: 1.8656 0.0659 sec/batch\n", + "Epoch 9/20 Iteration 1600/3560 Training loss: 1.8655 0.0676 sec/batch\n", + "Epoch 9/20 Iteration 1601/3560 Training 
loss: 1.8653 0.0673 sec/batch\n", + "Epoch 9/20 Iteration 1602/3560 Training loss: 1.8653 0.0672 sec/batch\n", + "Epoch 10/20 Iteration 1603/3560 Training loss: 1.9501 0.0659 sec/batch\n", + "Epoch 10/20 Iteration 1604/3560 Training loss: 1.8979 0.0673 sec/batch\n", + "Epoch 10/20 Iteration 1605/3560 Training loss: 1.8772 0.0733 sec/batch\n", + "Epoch 10/20 Iteration 1606/3560 Training loss: 1.8650 0.0747 sec/batch\n", + "Epoch 10/20 Iteration 1607/3560 Training loss: 1.8619 0.0671 sec/batch\n", + "Epoch 10/20 Iteration 1608/3560 Training loss: 1.8513 0.0701 sec/batch\n", + "Epoch 10/20 Iteration 1609/3560 Training loss: 1.8524 0.0709 sec/batch\n", + "Epoch 10/20 Iteration 1610/3560 Training loss: 1.8508 0.0657 sec/batch\n", + "Epoch 10/20 Iteration 1611/3560 Training loss: 1.8521 0.0659 sec/batch\n", + "Epoch 10/20 Iteration 1612/3560 Training loss: 1.8509 0.0652 sec/batch\n", + "Epoch 10/20 Iteration 1613/3560 Training loss: 1.8465 0.0684 sec/batch\n", + "Epoch 10/20 Iteration 1614/3560 Training loss: 1.8454 0.0690 sec/batch\n", + "Epoch 10/20 Iteration 1615/3560 Training loss: 1.8451 0.0753 sec/batch\n", + "Epoch 10/20 Iteration 1616/3560 Training loss: 1.8473 0.0670 sec/batch\n", + "Epoch 10/20 Iteration 1617/3560 Training loss: 1.8466 0.0737 sec/batch\n", + "Epoch 10/20 Iteration 1618/3560 Training loss: 1.8453 0.0668 sec/batch\n", + "Epoch 10/20 Iteration 1619/3560 Training loss: 1.8449 0.0679 sec/batch\n", + "Epoch 10/20 Iteration 1620/3560 Training loss: 1.8466 0.0650 sec/batch\n", + "Epoch 10/20 Iteration 1621/3560 Training loss: 1.8465 0.0681 sec/batch\n", + "Epoch 10/20 Iteration 1622/3560 Training loss: 1.8466 0.0661 sec/batch\n", + "Epoch 10/20 Iteration 1623/3560 Training loss: 1.8460 0.0686 sec/batch\n", + "Epoch 10/20 Iteration 1624/3560 Training loss: 1.8476 0.0647 sec/batch\n", + "Epoch 10/20 Iteration 1625/3560 Training loss: 1.8470 0.0689 sec/batch\n", + "Epoch 10/20 Iteration 1626/3560 Training loss: 1.8468 0.0641 sec/batch\n", + "Epoch 10/20 
Iteration 1627/3560 Training loss: 1.8463 0.0771 sec/batch\n", + "Epoch 10/20 Iteration 1628/3560 Training loss: 1.8452 0.0682 sec/batch\n", + "Epoch 10/20 Iteration 1629/3560 Training loss: 1.8444 0.0691 sec/batch\n", + "Epoch 10/20 Iteration 1630/3560 Training loss: 1.8446 0.0673 sec/batch\n", + "Epoch 10/20 Iteration 1631/3560 Training loss: 1.8454 0.0702 sec/batch\n", + "Epoch 10/20 Iteration 1632/3560 Training loss: 1.8454 0.0701 sec/batch\n", + "Epoch 10/20 Iteration 1633/3560 Training loss: 1.8453 0.0736 sec/batch\n", + "Epoch 10/20 Iteration 1634/3560 Training loss: 1.8444 0.0741 sec/batch\n", + "Epoch 10/20 Iteration 1635/3560 Training loss: 1.8443 0.0703 sec/batch\n", + "Epoch 10/20 Iteration 1636/3560 Training loss: 1.8448 0.0783 sec/batch\n", + "Epoch 10/20 Iteration 1637/3560 Training loss: 1.8441 0.0676 sec/batch\n", + "Epoch 10/20 Iteration 1638/3560 Training loss: 1.8440 0.0666 sec/batch\n", + "Epoch 10/20 Iteration 1639/3560 Training loss: 1.8435 0.0643 sec/batch\n", + "Epoch 10/20 Iteration 1640/3560 Training loss: 1.8422 0.0691 sec/batch\n", + "Epoch 10/20 Iteration 1641/3560 Training loss: 1.8410 0.0679 sec/batch\n", + "Epoch 10/20 Iteration 1642/3560 Training loss: 1.8402 0.0658 sec/batch\n", + "Epoch 10/20 Iteration 1643/3560 Training loss: 1.8397 0.0777 sec/batch\n", + "Epoch 10/20 Iteration 1644/3560 Training loss: 1.8398 0.0666 sec/batch\n", + "Epoch 10/20 Iteration 1645/3560 Training loss: 1.8394 0.0740 sec/batch\n", + "Epoch 10/20 Iteration 1646/3560 Training loss: 1.8385 0.0678 sec/batch\n", + "Epoch 10/20 Iteration 1647/3560 Training loss: 1.8384 0.0650 sec/batch\n", + "Epoch 10/20 Iteration 1648/3560 Training loss: 1.8372 0.0717 sec/batch\n", + "Epoch 10/20 Iteration 1649/3560 Training loss: 1.8369 0.0685 sec/batch\n", + "Epoch 10/20 Iteration 1650/3560 Training loss: 1.8362 0.0687 sec/batch\n", + "Epoch 10/20 Iteration 1651/3560 Training loss: 1.8362 0.0676 sec/batch\n", + "Epoch 10/20 Iteration 1652/3560 Training loss: 1.8367 0.0664 
sec/batch\n", + "Epoch 10/20 Iteration 1653/3560 Training loss: 1.8362 0.0646 sec/batch\n", + "Epoch 10/20 Iteration 1654/3560 Training loss: 1.8370 0.0696 sec/batch\n", + "Epoch 10/20 Iteration 1655/3560 Training loss: 1.8367 0.0703 sec/batch\n", + "Epoch 10/20 Iteration 1656/3560 Training loss: 1.8365 0.0695 sec/batch\n", + "Epoch 10/20 Iteration 1657/3560 Training loss: 1.8363 0.0663 sec/batch\n", + "Epoch 10/20 Iteration 1658/3560 Training loss: 1.8362 0.0659 sec/batch\n", + "Epoch 10/20 Iteration 1659/3560 Training loss: 1.8364 0.0725 sec/batch\n", + "Epoch 10/20 Iteration 1660/3560 Training loss: 1.8360 0.0683 sec/batch\n", + "Epoch 10/20 Iteration 1661/3560 Training loss: 1.8355 0.0669 sec/batch\n", + "Epoch 10/20 Iteration 1662/3560 Training loss: 1.8359 0.0777 sec/batch\n", + "Epoch 10/20 Iteration 1663/3560 Training loss: 1.8355 0.0662 sec/batch\n", + "Epoch 10/20 Iteration 1664/3560 Training loss: 1.8362 0.0683 sec/batch\n", + "Epoch 10/20 Iteration 1665/3560 Training loss: 1.8363 0.0679 sec/batch\n", + "Epoch 10/20 Iteration 1666/3560 Training loss: 1.8365 0.0713 sec/batch\n", + "Epoch 10/20 Iteration 1667/3560 Training loss: 1.8363 0.0694 sec/batch\n", + "Epoch 10/20 Iteration 1668/3560 Training loss: 1.8366 0.0656 sec/batch\n", + "Epoch 10/20 Iteration 1669/3560 Training loss: 1.8365 0.0690 sec/batch\n", + "Epoch 10/20 Iteration 1670/3560 Training loss: 1.8361 0.0753 sec/batch\n", + "Epoch 10/20 Iteration 1671/3560 Training loss: 1.8359 0.0670 sec/batch\n", + "Epoch 10/20 Iteration 1672/3560 Training loss: 1.8358 0.0717 sec/batch\n", + "Epoch 10/20 Iteration 1673/3560 Training loss: 1.8364 0.0673 sec/batch\n", + "Epoch 10/20 Iteration 1674/3560 Training loss: 1.8364 0.0669 sec/batch\n", + "Epoch 10/20 Iteration 1675/3560 Training loss: 1.8367 0.0660 sec/batch\n", + "Epoch 10/20 Iteration 1676/3560 Training loss: 1.8363 0.0694 sec/batch\n", + "Epoch 10/20 Iteration 1677/3560 Training loss: 1.8361 0.0709 sec/batch\n", + "Epoch 10/20 Iteration 1678/3560 
Training loss: 1.8364 0.0662 sec/batch\n", + "Epoch 10/20 Iteration 1679/3560 Training loss: 1.8361 0.0716 sec/batch\n", + "Epoch 10/20 Iteration 1680/3560 Training loss: 1.8362 0.0783 sec/batch\n", + "Epoch 10/20 Iteration 1681/3560 Training loss: 1.8357 0.0724 sec/batch\n", + "Epoch 10/20 Iteration 1682/3560 Training loss: 1.8355 0.0641 sec/batch\n", + "Epoch 10/20 Iteration 1683/3560 Training loss: 1.8350 0.0702 sec/batch\n", + "Epoch 10/20 Iteration 1684/3560 Training loss: 1.8351 0.0655 sec/batch\n", + "Epoch 10/20 Iteration 1685/3560 Training loss: 1.8345 0.0636 sec/batch\n", + "Epoch 10/20 Iteration 1686/3560 Training loss: 1.8345 0.0656 sec/batch\n", + "Epoch 10/20 Iteration 1687/3560 Training loss: 1.8340 0.0698 sec/batch\n", + "Epoch 10/20 Iteration 1688/3560 Training loss: 1.8337 0.0665 sec/batch\n", + "Epoch 10/20 Iteration 1689/3560 Training loss: 1.8335 0.0676 sec/batch\n", + "Epoch 10/20 Iteration 1690/3560 Training loss: 1.8331 0.0789 sec/batch\n", + "Epoch 10/20 Iteration 1691/3560 Training loss: 1.8327 0.0663 sec/batch\n", + "Epoch 10/20 Iteration 1692/3560 Training loss: 1.8328 0.0660 sec/batch\n", + "Epoch 10/20 Iteration 1693/3560 Training loss: 1.8325 0.0665 sec/batch\n", + "Epoch 10/20 Iteration 1694/3560 Training loss: 1.8323 0.0639 sec/batch\n", + "Epoch 10/20 Iteration 1695/3560 Training loss: 1.8318 0.0668 sec/batch\n", + "Epoch 10/20 Iteration 1696/3560 Training loss: 1.8314 0.0696 sec/batch\n", + "Epoch 10/20 Iteration 1697/3560 Training loss: 1.8309 0.0682 sec/batch\n", + "Epoch 10/20 Iteration 1698/3560 Training loss: 1.8309 0.0650 sec/batch\n", + "Epoch 10/20 Iteration 1699/3560 Training loss: 1.8307 0.0659 sec/batch\n", + "Epoch 10/20 Iteration 1700/3560 Training loss: 1.8303 0.0717 sec/batch\n", + "Epoch 10/20 Iteration 1701/3560 Training loss: 1.8298 0.0675 sec/batch\n", + "Epoch 10/20 Iteration 1702/3560 Training loss: 1.8293 0.0820 sec/batch\n", + "Epoch 10/20 Iteration 1703/3560 Training loss: 1.8292 0.0680 sec/batch\n", + 
"Epoch 10/20 Iteration 1704/3560 Training loss: 1.8291 0.0789 sec/batch\n", + "Epoch 10/20 Iteration 1705/3560 Training loss: 1.8288 0.0710 sec/batch\n", + "Epoch 10/20 Iteration 1706/3560 Training loss: 1.8285 0.0689 sec/batch\n", + "Epoch 10/20 Iteration 1707/3560 Training loss: 1.8282 0.0686 sec/batch\n", + "Epoch 10/20 Iteration 1708/3560 Training loss: 1.8280 0.0721 sec/batch\n", + "Epoch 10/20 Iteration 1709/3560 Training loss: 1.8280 0.0644 sec/batch\n", + "Epoch 10/20 Iteration 1710/3560 Training loss: 1.8279 0.0703 sec/batch\n", + "Epoch 10/20 Iteration 1711/3560 Training loss: 1.8279 0.0652 sec/batch\n", + "Epoch 10/20 Iteration 1712/3560 Training loss: 1.8279 0.0749 sec/batch\n", + "Epoch 10/20 Iteration 1713/3560 Training loss: 1.8278 0.0701 sec/batch\n", + "Epoch 10/20 Iteration 1714/3560 Training loss: 1.8275 0.0699 sec/batch\n", + "Epoch 10/20 Iteration 1715/3560 Training loss: 1.8273 0.0712 sec/batch\n", + "Epoch 10/20 Iteration 1716/3560 Training loss: 1.8271 0.0675 sec/batch\n", + "Epoch 10/20 Iteration 1717/3560 Training loss: 1.8269 0.0680 sec/batch\n", + "Epoch 10/20 Iteration 1718/3560 Training loss: 1.8264 0.0671 sec/batch\n", + "Epoch 10/20 Iteration 1719/3560 Training loss: 1.8262 0.0741 sec/batch\n", + "Epoch 10/20 Iteration 1720/3560 Training loss: 1.8260 0.0659 sec/batch\n", + "Epoch 10/20 Iteration 1721/3560 Training loss: 1.8259 0.0683 sec/batch\n", + "Epoch 10/20 Iteration 1722/3560 Training loss: 1.8258 0.0667 sec/batch\n", + "Epoch 10/20 Iteration 1723/3560 Training loss: 1.8257 0.0715 sec/batch\n", + "Epoch 10/20 Iteration 1724/3560 Training loss: 1.8253 0.0664 sec/batch\n", + "Epoch 10/20 Iteration 1725/3560 Training loss: 1.8250 0.0665 sec/batch\n", + "Epoch 10/20 Iteration 1726/3560 Training loss: 1.8250 0.0754 sec/batch\n", + "Epoch 10/20 Iteration 1727/3560 Training loss: 1.8250 0.0679 sec/batch\n", + "Epoch 10/20 Iteration 1728/3560 Training loss: 1.8245 0.0670 sec/batch\n", + "Epoch 10/20 Iteration 1729/3560 Training loss: 
1.8245 0.0751 sec/batch\n", + "Epoch 10/20 Iteration 1730/3560 Training loss: 1.8245 0.0742 sec/batch\n", + "Epoch 10/20 Iteration 1731/3560 Training loss: 1.8244 0.0671 sec/batch\n", + "Epoch 10/20 Iteration 1732/3560 Training loss: 1.8242 0.0680 sec/batch\n", + "Epoch 10/20 Iteration 1733/3560 Training loss: 1.8237 0.0665 sec/batch\n", + "Epoch 10/20 Iteration 1734/3560 Training loss: 1.8234 0.0716 sec/batch\n", + "Epoch 10/20 Iteration 1735/3560 Training loss: 1.8234 0.0647 sec/batch\n", + "Epoch 10/20 Iteration 1736/3560 Training loss: 1.8233 0.0723 sec/batch\n", + "Epoch 10/20 Iteration 1737/3560 Training loss: 1.8232 0.0707 sec/batch\n", + "Epoch 10/20 Iteration 1738/3560 Training loss: 1.8231 0.0683 sec/batch\n", + "Epoch 10/20 Iteration 1739/3560 Training loss: 1.8232 0.0694 sec/batch\n", + "Epoch 10/20 Iteration 1740/3560 Training loss: 1.8231 0.0792 sec/batch\n", + "Epoch 10/20 Iteration 1741/3560 Training loss: 1.8232 0.0719 sec/batch\n", + "Epoch 10/20 Iteration 1742/3560 Training loss: 1.8231 0.0664 sec/batch\n", + "Epoch 10/20 Iteration 1743/3560 Training loss: 1.8232 0.0740 sec/batch\n", + "Epoch 10/20 Iteration 1744/3560 Training loss: 1.8231 0.0671 sec/batch\n", + "Epoch 10/20 Iteration 1745/3560 Training loss: 1.8230 0.0681 sec/batch\n", + "Epoch 10/20 Iteration 1746/3560 Training loss: 1.8230 0.0677 sec/batch\n", + "Epoch 10/20 Iteration 1747/3560 Training loss: 1.8228 0.0653 sec/batch\n", + "Epoch 10/20 Iteration 1748/3560 Training loss: 1.8229 0.0702 sec/batch\n", + "Epoch 10/20 Iteration 1749/3560 Training loss: 1.8229 0.0674 sec/batch\n", + "Epoch 10/20 Iteration 1750/3560 Training loss: 1.8231 0.0693 sec/batch\n", + "Epoch 10/20 Iteration 1751/3560 Training loss: 1.8229 0.0741 sec/batch\n", + "Epoch 10/20 Iteration 1752/3560 Training loss: 1.8228 0.0667 sec/batch\n", + "Epoch 10/20 Iteration 1753/3560 Training loss: 1.8226 0.0719 sec/batch\n", + "Epoch 10/20 Iteration 1754/3560 Training loss: 1.8227 0.0701 sec/batch\n", + "Epoch 10/20 
Iteration 1755/3560 Training loss: 1.8228 0.0691 sec/batch\n", + "Epoch 10/20 Iteration 1756/3560 Training loss: 1.8228 0.0646 sec/batch\n", + "Epoch 10/20 Iteration 1757/3560 Training loss: 1.8226 0.0685 sec/batch\n", + "Epoch 10/20 Iteration 1758/3560 Training loss: 1.8225 0.0734 sec/batch\n", + "Epoch 10/20 Iteration 1759/3560 Training loss: 1.8225 0.0653 sec/batch\n", + "Epoch 10/20 Iteration 1760/3560 Training loss: 1.8224 0.0670 sec/batch\n", + "Epoch 10/20 Iteration 1761/3560 Training loss: 1.8222 0.0663 sec/batch\n", + "Epoch 10/20 Iteration 1762/3560 Training loss: 1.8223 0.0668 sec/batch\n", + "Epoch 10/20 Iteration 1763/3560 Training loss: 1.8224 0.0660 sec/batch\n", + "Epoch 10/20 Iteration 1764/3560 Training loss: 1.8223 0.0649 sec/batch\n", + "Epoch 10/20 Iteration 1765/3560 Training loss: 1.8223 0.0649 sec/batch\n", + "Epoch 10/20 Iteration 1766/3560 Training loss: 1.8222 0.0692 sec/batch\n", + "Epoch 10/20 Iteration 1767/3560 Training loss: 1.8221 0.0655 sec/batch\n", + "Epoch 10/20 Iteration 1768/3560 Training loss: 1.8220 0.0673 sec/batch\n", + "Epoch 10/20 Iteration 1769/3560 Training loss: 1.8220 0.0680 sec/batch\n", + "Epoch 10/20 Iteration 1770/3560 Training loss: 1.8223 0.0646 sec/batch\n", + "Epoch 10/20 Iteration 1771/3560 Training loss: 1.8222 0.0667 sec/batch\n", + "Epoch 10/20 Iteration 1772/3560 Training loss: 1.8221 0.0666 sec/batch\n", + "Epoch 10/20 Iteration 1773/3560 Training loss: 1.8219 0.0706 sec/batch\n", + "Epoch 10/20 Iteration 1774/3560 Training loss: 1.8216 0.0678 sec/batch\n", + "Epoch 10/20 Iteration 1775/3560 Training loss: 1.8217 0.0714 sec/batch\n", + "Epoch 10/20 Iteration 1776/3560 Training loss: 1.8217 0.0729 sec/batch\n", + "Epoch 10/20 Iteration 1777/3560 Training loss: 1.8217 0.0656 sec/batch\n", + "Epoch 10/20 Iteration 1778/3560 Training loss: 1.8216 0.0671 sec/batch\n", + "Epoch 10/20 Iteration 1779/3560 Training loss: 1.8214 0.0672 sec/batch\n", + "Epoch 10/20 Iteration 1780/3560 Training loss: 1.8213 0.0710 
sec/batch\n", + "Epoch 11/20 Iteration 1781/3560 Training loss: 1.8992 0.0685 sec/batch\n", + "Epoch 11/20 Iteration 1782/3560 Training loss: 1.8495 0.0693 sec/batch\n", + "Epoch 11/20 Iteration 1783/3560 Training loss: 1.8286 0.0659 sec/batch\n", + "Epoch 11/20 Iteration 1784/3560 Training loss: 1.8218 0.0732 sec/batch\n", + "Epoch 11/20 Iteration 1785/3560 Training loss: 1.8166 0.0695 sec/batch\n", + "Epoch 11/20 Iteration 1786/3560 Training loss: 1.8060 0.0699 sec/batch\n", + "Epoch 11/20 Iteration 1787/3560 Training loss: 1.8066 0.0731 sec/batch\n", + "Epoch 11/20 Iteration 1788/3560 Training loss: 1.8056 0.0636 sec/batch\n", + "Epoch 11/20 Iteration 1789/3560 Training loss: 1.8081 0.0650 sec/batch\n", + "Epoch 11/20 Iteration 1790/3560 Training loss: 1.8074 0.0693 sec/batch\n", + "Epoch 11/20 Iteration 1791/3560 Training loss: 1.8041 0.0695 sec/batch\n", + "Epoch 11/20 Iteration 1792/3560 Training loss: 1.8028 0.0684 sec/batch\n", + "Epoch 11/20 Iteration 1793/3560 Training loss: 1.8017 0.0673 sec/batch\n", + "Epoch 11/20 Iteration 1794/3560 Training loss: 1.8041 0.0694 sec/batch\n", + "Epoch 11/20 Iteration 1795/3560 Training loss: 1.8031 0.0651 sec/batch\n", + "Epoch 11/20 Iteration 1796/3560 Training loss: 1.8020 0.0653 sec/batch\n", + "Epoch 11/20 Iteration 1797/3560 Training loss: 1.8018 0.0648 sec/batch\n", + "Epoch 11/20 Iteration 1798/3560 Training loss: 1.8038 0.0667 sec/batch\n", + "Epoch 11/20 Iteration 1799/3560 Training loss: 1.8036 0.0825 sec/batch\n", + "Epoch 11/20 Iteration 1800/3560 Training loss: 1.8040 0.0671 sec/batch\n", + "Epoch 11/20 Iteration 1801/3560 Training loss: 1.8032 0.0671 sec/batch\n", + "Epoch 11/20 Iteration 1802/3560 Training loss: 1.8047 0.0687 sec/batch\n", + "Epoch 11/20 Iteration 1803/3560 Training loss: 1.8045 0.0671 sec/batch\n", + "Epoch 11/20 Iteration 1804/3560 Training loss: 1.8039 0.0662 sec/batch\n", + "Epoch 11/20 Iteration 1805/3560 Training loss: 1.8035 0.0698 sec/batch\n", + "Epoch 11/20 Iteration 1806/3560 
Training loss: 1.8024 0.0655 sec/batch\n", + "Epoch 11/20 Iteration 1807/3560 Training loss: 1.8012 0.0655 sec/batch\n", + "Epoch 11/20 Iteration 1808/3560 Training loss: 1.8015 0.0666 sec/batch\n", + "Epoch 11/20 Iteration 1809/3560 Training loss: 1.8026 0.0681 sec/batch\n", + "Epoch 11/20 Iteration 1810/3560 Training loss: 1.8027 0.0776 sec/batch\n", + "Epoch 11/20 Iteration 1811/3560 Training loss: 1.8025 0.0743 sec/batch\n", + "Epoch 11/20 Iteration 1812/3560 Training loss: 1.8019 0.0711 sec/batch\n", + "Epoch 11/20 Iteration 1813/3560 Training loss: 1.8017 0.0682 sec/batch\n", + "Epoch 11/20 Iteration 1814/3560 Training loss: 1.8021 0.0643 sec/batch\n", + "Epoch 11/20 Iteration 1815/3560 Training loss: 1.8016 0.0645 sec/batch\n", + "Epoch 11/20 Iteration 1816/3560 Training loss: 1.8015 0.0675 sec/batch\n", + "Epoch 11/20 Iteration 1817/3560 Training loss: 1.8009 0.0697 sec/batch\n", + "Epoch 11/20 Iteration 1818/3560 Training loss: 1.7998 0.0652 sec/batch\n", + "Epoch 11/20 Iteration 1819/3560 Training loss: 1.7986 0.0747 sec/batch\n", + "Epoch 11/20 Iteration 1820/3560 Training loss: 1.7979 0.0665 sec/batch\n", + "Epoch 11/20 Iteration 1821/3560 Training loss: 1.7974 0.0702 sec/batch\n", + "Epoch 11/20 Iteration 1822/3560 Training loss: 1.7975 0.0700 sec/batch\n", + "Epoch 11/20 Iteration 1823/3560 Training loss: 1.7968 0.0674 sec/batch\n", + "Epoch 11/20 Iteration 1824/3560 Training loss: 1.7962 0.0702 sec/batch\n", + "Epoch 11/20 Iteration 1825/3560 Training loss: 1.7965 0.0669 sec/batch\n", + "Epoch 11/20 Iteration 1826/3560 Training loss: 1.7955 0.0701 sec/batch\n", + "Epoch 11/20 Iteration 1827/3560 Training loss: 1.7953 0.0648 sec/batch\n", + "Epoch 11/20 Iteration 1828/3560 Training loss: 1.7947 0.0656 sec/batch\n", + "Epoch 11/20 Iteration 1829/3560 Training loss: 1.7945 0.0656 sec/batch\n", + "Epoch 11/20 Iteration 1830/3560 Training loss: 1.7950 0.0664 sec/batch\n", + "Epoch 11/20 Iteration 1831/3560 Training loss: 1.7945 0.0675 sec/batch\n", + 
"Epoch 11/20 Iteration 1832/3560 Training loss: 1.7955 0.0685 sec/batch\n", + "Epoch 11/20 Iteration 1833/3560 Training loss: 1.7952 0.0697 sec/batch\n", + "Epoch 11/20 Iteration 1834/3560 Training loss: 1.7950 0.0699 sec/batch\n", + "Epoch 11/20 Iteration 1835/3560 Training loss: 1.7946 0.0648 sec/batch\n", + "Epoch 11/20 Iteration 1836/3560 Training loss: 1.7947 0.0674 sec/batch\n", + "Epoch 11/20 Iteration 1837/3560 Training loss: 1.7950 0.0681 sec/batch\n", + "Epoch 11/20 Iteration 1838/3560 Training loss: 1.7948 0.0642 sec/batch\n", + "Epoch 11/20 Iteration 1839/3560 Training loss: 1.7944 0.0755 sec/batch\n", + "Epoch 11/20 Iteration 1840/3560 Training loss: 1.7948 0.0660 sec/batch\n", + "Epoch 11/20 Iteration 1841/3560 Training loss: 1.7946 0.0680 sec/batch\n", + "Epoch 11/20 Iteration 1842/3560 Training loss: 1.7954 0.0688 sec/batch\n", + "Epoch 11/20 Iteration 1843/3560 Training loss: 1.7957 0.0662 sec/batch\n", + "Epoch 11/20 Iteration 1844/3560 Training loss: 1.7960 0.0654 sec/batch\n", + "Epoch 11/20 Iteration 1845/3560 Training loss: 1.7958 0.0690 sec/batch\n", + "Epoch 11/20 Iteration 1846/3560 Training loss: 1.7960 0.0676 sec/batch\n", + "Epoch 11/20 Iteration 1847/3560 Training loss: 1.7962 0.0667 sec/batch\n", + "Epoch 11/20 Iteration 1848/3560 Training loss: 1.7958 0.0659 sec/batch\n", + "Epoch 11/20 Iteration 1849/3560 Training loss: 1.7958 0.0647 sec/batch\n", + "Epoch 11/20 Iteration 1850/3560 Training loss: 1.7956 0.0673 sec/batch\n", + "Epoch 11/20 Iteration 1851/3560 Training loss: 1.7961 0.0685 sec/batch\n", + "Epoch 11/20 Iteration 1852/3560 Training loss: 1.7961 0.0671 sec/batch\n", + "Epoch 11/20 Iteration 1853/3560 Training loss: 1.7964 0.0664 sec/batch\n", + "Epoch 11/20 Iteration 1854/3560 Training loss: 1.7961 0.0700 sec/batch\n", + "Epoch 11/20 Iteration 1855/3560 Training loss: 1.7961 0.0684 sec/batch\n", + "Epoch 11/20 Iteration 1856/3560 Training loss: 1.7963 0.0675 sec/batch\n", + "Epoch 11/20 Iteration 1857/3560 Training loss: 
1.7961 0.0647 sec/batch\n", + "Epoch 11/20 Iteration 1858/3560 Training loss: 1.7961 0.0755 sec/batch\n", + "Epoch 11/20 Iteration 1859/3560 Training loss: 1.7955 0.0690 sec/batch\n", + "Epoch 11/20 Iteration 1860/3560 Training loss: 1.7952 0.0656 sec/batch\n", + "Epoch 11/20 Iteration 1861/3560 Training loss: 1.7946 0.0702 sec/batch\n", + "Epoch 11/20 Iteration 1862/3560 Training loss: 1.7947 0.0696 sec/batch\n", + "Epoch 11/20 Iteration 1863/3560 Training loss: 1.7941 0.0663 sec/batch\n", + "Epoch 11/20 Iteration 1864/3560 Training loss: 1.7940 0.0706 sec/batch\n", + "Epoch 11/20 Iteration 1865/3560 Training loss: 1.7934 0.0683 sec/batch\n", + "Epoch 11/20 Iteration 1866/3560 Training loss: 1.7931 0.0762 sec/batch\n", + "Epoch 11/20 Iteration 1867/3560 Training loss: 1.7928 0.0687 sec/batch\n", + "Epoch 11/20 Iteration 1868/3560 Training loss: 1.7925 0.0644 sec/batch\n", + "Epoch 11/20 Iteration 1869/3560 Training loss: 1.7919 0.0688 sec/batch\n", + "Epoch 11/20 Iteration 1870/3560 Training loss: 1.7921 0.0673 sec/batch\n", + "Epoch 11/20 Iteration 1871/3560 Training loss: 1.7918 0.0744 sec/batch\n", + "Epoch 11/20 Iteration 1872/3560 Training loss: 1.7918 0.0664 sec/batch\n", + "Epoch 11/20 Iteration 1873/3560 Training loss: 1.7914 0.0695 sec/batch\n", + "Epoch 11/20 Iteration 1874/3560 Training loss: 1.7911 0.0666 sec/batch\n", + "Epoch 11/20 Iteration 1875/3560 Training loss: 1.7907 0.0651 sec/batch\n", + "Epoch 11/20 Iteration 1876/3560 Training loss: 1.7906 0.0661 sec/batch\n", + "Epoch 11/20 Iteration 1877/3560 Training loss: 1.7904 0.0648 sec/batch\n", + "Epoch 11/20 Iteration 1878/3560 Training loss: 1.7900 0.0698 sec/batch\n", + "Epoch 11/20 Iteration 1879/3560 Training loss: 1.7896 0.0663 sec/batch\n", + "Epoch 11/20 Iteration 1880/3560 Training loss: 1.7891 0.0661 sec/batch\n", + "Epoch 11/20 Iteration 1881/3560 Training loss: 1.7890 0.0770 sec/batch\n", + "Epoch 11/20 Iteration 1882/3560 Training loss: 1.7888 0.0651 sec/batch\n", + "Epoch 11/20 
Iteration 1883/3560 Training loss: 1.7886 0.0645 sec/batch\n", + "Epoch 11/20 Iteration 1884/3560 Training loss: 1.7884 0.0689 sec/batch\n", + "Epoch 11/20 Iteration 1885/3560 Training loss: 1.7881 0.0674 sec/batch\n", + "Epoch 11/20 Iteration 1886/3560 Training loss: 1.7879 0.0681 sec/batch\n", + "Epoch 11/20 Iteration 1887/3560 Training loss: 1.7879 0.0680 sec/batch\n", + "Epoch 11/20 Iteration 1888/3560 Training loss: 1.7879 0.0680 sec/batch\n", + "Epoch 11/20 Iteration 1889/3560 Training loss: 1.7879 0.0657 sec/batch\n", + "Epoch 11/20 Iteration 1890/3560 Training loss: 1.7879 0.0671 sec/batch\n", + "Epoch 11/20 Iteration 1891/3560 Training loss: 1.7879 0.0669 sec/batch\n", + "Epoch 11/20 Iteration 1892/3560 Training loss: 1.7877 0.0696 sec/batch\n", + "Epoch 11/20 Iteration 1893/3560 Training loss: 1.7874 0.0662 sec/batch\n", + "Epoch 11/20 Iteration 1894/3560 Training loss: 1.7873 0.0697 sec/batch\n", + "Epoch 11/20 Iteration 1895/3560 Training loss: 1.7871 0.0657 sec/batch\n", + "Epoch 11/20 Iteration 1896/3560 Training loss: 1.7867 0.0669 sec/batch\n", + "Epoch 11/20 Iteration 1897/3560 Training loss: 1.7865 0.0672 sec/batch\n", + "Epoch 11/20 Iteration 1898/3560 Training loss: 1.7864 0.0660 sec/batch\n", + "Epoch 11/20 Iteration 1899/3560 Training loss: 1.7863 0.0695 sec/batch\n", + "Epoch 11/20 Iteration 1900/3560 Training loss: 1.7861 0.0658 sec/batch\n", + "Epoch 11/20 Iteration 1901/3560 Training loss: 1.7860 0.0656 sec/batch\n", + "Epoch 11/20 Iteration 1902/3560 Training loss: 1.7856 0.0706 sec/batch\n", + "Epoch 11/20 Iteration 1903/3560 Training loss: 1.7853 0.0647 sec/batch\n", + "Epoch 11/20 Iteration 1904/3560 Training loss: 1.7854 0.0810 sec/batch\n", + "Epoch 11/20 Iteration 1905/3560 Training loss: 1.7853 0.0697 sec/batch\n", + "Epoch 11/20 Iteration 1906/3560 Training loss: 1.7849 0.0741 sec/batch\n", + "Epoch 11/20 Iteration 1907/3560 Training loss: 1.7850 0.0709 sec/batch\n", + "Epoch 11/20 Iteration 1908/3560 Training loss: 1.7850 0.0652 
sec/batch\n", + "Epoch 11/20 Iteration 1909/3560 Training loss: 1.7849 0.0763 sec/batch\n", + "Epoch 11/20 Iteration 1910/3560 Training loss: 1.7848 0.0653 sec/batch\n", + "Epoch 11/20 Iteration 1911/3560 Training loss: 1.7844 0.0691 sec/batch\n", + "Epoch 11/20 Iteration 1912/3560 Training loss: 1.7841 0.0670 sec/batch\n", + "Epoch 11/20 Iteration 1913/3560 Training loss: 1.7841 0.0722 sec/batch\n", + "Epoch 11/20 Iteration 1914/3560 Training loss: 1.7841 0.0753 sec/batch\n", + "Epoch 11/20 Iteration 1915/3560 Training loss: 1.7841 0.0668 sec/batch\n", + "Epoch 11/20 Iteration 1916/3560 Training loss: 1.7841 0.0666 sec/batch\n", + "Epoch 11/20 Iteration 1917/3560 Training loss: 1.7841 0.0677 sec/batch\n", + "Epoch 11/20 Iteration 1918/3560 Training loss: 1.7841 0.0638 sec/batch\n", + "Epoch 11/20 Iteration 1919/3560 Training loss: 1.7843 0.0672 sec/batch\n", + "Epoch 11/20 Iteration 1920/3560 Training loss: 1.7841 0.0703 sec/batch\n", + "Epoch 11/20 Iteration 1921/3560 Training loss: 1.7843 0.0744 sec/batch\n", + "Epoch 11/20 Iteration 1922/3560 Training loss: 1.7842 0.0674 sec/batch\n", + "Epoch 11/20 Iteration 1923/3560 Training loss: 1.7841 0.0709 sec/batch\n", + "Epoch 11/20 Iteration 1924/3560 Training loss: 1.7840 0.0757 sec/batch\n", + "Epoch 11/20 Iteration 1925/3560 Training loss: 1.7838 0.0644 sec/batch\n", + "Epoch 11/20 Iteration 1926/3560 Training loss: 1.7839 0.0691 sec/batch\n", + "Epoch 11/20 Iteration 1927/3560 Training loss: 1.7838 0.0688 sec/batch\n", + "Epoch 11/20 Iteration 1928/3560 Training loss: 1.7840 0.0677 sec/batch\n", + "Epoch 11/20 Iteration 1929/3560 Training loss: 1.7840 0.0667 sec/batch\n", + "Epoch 11/20 Iteration 1930/3560 Training loss: 1.7838 0.0681 sec/batch\n", + "Epoch 11/20 Iteration 1931/3560 Training loss: 1.7835 0.0680 sec/batch\n", + "Epoch 11/20 Iteration 1932/3560 Training loss: 1.7836 0.0660 sec/batch\n", + "Epoch 11/20 Iteration 1933/3560 Training loss: 1.7836 0.0689 sec/batch\n", + "Epoch 11/20 Iteration 1934/3560 
Training loss: 1.7836 0.0665 sec/batch\n", + "Epoch 11/20 Iteration 1935/3560 Training loss: 1.7834 0.0652 sec/batch\n", + "Epoch 11/20 Iteration 1936/3560 Training loss: 1.7833 0.0683 sec/batch\n", + "Epoch 11/20 Iteration 1937/3560 Training loss: 1.7834 0.0764 sec/batch\n", + "Epoch 11/20 Iteration 1938/3560 Training loss: 1.7833 0.0664 sec/batch\n", + "Epoch 11/20 Iteration 1939/3560 Training loss: 1.7830 0.0662 sec/batch\n", + "Epoch 11/20 Iteration 1940/3560 Training loss: 1.7831 0.0653 sec/batch\n", + "Epoch 11/20 Iteration 1941/3560 Training loss: 1.7833 0.0684 sec/batch\n", + "Epoch 11/20 Iteration 1942/3560 Training loss: 1.7832 0.0676 sec/batch\n", + "Epoch 11/20 Iteration 1943/3560 Training loss: 1.7832 0.0669 sec/batch\n", + "Epoch 11/20 Iteration 1944/3560 Training loss: 1.7831 0.0671 sec/batch\n", + "Epoch 11/20 Iteration 1945/3560 Training loss: 1.7830 0.0688 sec/batch\n", + "Epoch 11/20 Iteration 1946/3560 Training loss: 1.7829 0.0688 sec/batch\n", + "Epoch 11/20 Iteration 1947/3560 Training loss: 1.7829 0.0655 sec/batch\n", + "Epoch 11/20 Iteration 1948/3560 Training loss: 1.7832 0.0677 sec/batch\n", + "Epoch 11/20 Iteration 1949/3560 Training loss: 1.7831 0.0768 sec/batch\n", + "Epoch 11/20 Iteration 1950/3560 Training loss: 1.7830 0.0699 sec/batch\n", + "Epoch 11/20 Iteration 1951/3560 Training loss: 1.7829 0.0711 sec/batch\n", + "Epoch 11/20 Iteration 1952/3560 Training loss: 1.7828 0.0739 sec/batch\n", + "Epoch 11/20 Iteration 1953/3560 Training loss: 1.7828 0.0655 sec/batch\n", + "Epoch 11/20 Iteration 1954/3560 Training loss: 1.7828 0.0699 sec/batch\n", + "Epoch 11/20 Iteration 1955/3560 Training loss: 1.7828 0.0707 sec/batch\n", + "Epoch 11/20 Iteration 1956/3560 Training loss: 1.7827 0.0730 sec/batch\n", + "Epoch 11/20 Iteration 1957/3560 Training loss: 1.7825 0.0655 sec/batch\n", + "Epoch 11/20 Iteration 1958/3560 Training loss: 1.7825 0.0638 sec/batch\n", + "Epoch 12/20 Iteration 1959/3560 Training loss: 1.8663 0.0669 sec/batch\n", + 
"Epoch 12/20 Iteration 1960/3560 Training loss: 1.8153 0.0689 sec/batch\n", + "Epoch 12/20 Iteration 1961/3560 Training loss: 1.7946 0.0670 sec/batch\n", + "Epoch 12/20 Iteration 1962/3560 Training loss: 1.7893 0.0654 sec/batch\n", + "Epoch 12/20 Iteration 1963/3560 Training loss: 1.7844 0.0679 sec/batch\n", + "Epoch 12/20 Iteration 1964/3560 Training loss: 1.7733 0.0699 sec/batch\n", + "Epoch 12/20 Iteration 1965/3560 Training loss: 1.7736 0.0715 sec/batch\n", + "Epoch 12/20 Iteration 1966/3560 Training loss: 1.7712 0.0680 sec/batch\n", + "Epoch 12/20 Iteration 1967/3560 Training loss: 1.7733 0.0736 sec/batch\n", + "Epoch 12/20 Iteration 1968/3560 Training loss: 1.7721 0.0703 sec/batch\n", + "Epoch 12/20 Iteration 1969/3560 Training loss: 1.7685 0.0683 sec/batch\n", + "Epoch 12/20 Iteration 1970/3560 Training loss: 1.7672 0.0672 sec/batch\n", + "Epoch 12/20 Iteration 1971/3560 Training loss: 1.7672 0.0690 sec/batch\n", + "Epoch 12/20 Iteration 1972/3560 Training loss: 1.7690 0.0672 sec/batch\n", + "Epoch 12/20 Iteration 1973/3560 Training loss: 1.7677 0.0721 sec/batch\n", + "Epoch 12/20 Iteration 1974/3560 Training loss: 1.7666 0.0715 sec/batch\n", + "Epoch 12/20 Iteration 1975/3560 Training loss: 1.7663 0.0666 sec/batch\n", + "Epoch 12/20 Iteration 1976/3560 Training loss: 1.7680 0.0720 sec/batch\n", + "Epoch 12/20 Iteration 1977/3560 Training loss: 1.7679 0.0761 sec/batch\n", + "Epoch 12/20 Iteration 1978/3560 Training loss: 1.7686 0.0686 sec/batch\n", + "Epoch 12/20 Iteration 1979/3560 Training loss: 1.7680 0.0684 sec/batch\n", + "Epoch 12/20 Iteration 1980/3560 Training loss: 1.7693 0.0667 sec/batch\n", + "Epoch 12/20 Iteration 1981/3560 Training loss: 1.7685 0.0650 sec/batch\n", + "Epoch 12/20 Iteration 1982/3560 Training loss: 1.7683 0.0781 sec/batch\n", + "Epoch 12/20 Iteration 1983/3560 Training loss: 1.7682 0.0723 sec/batch\n", + "Epoch 12/20 Iteration 1984/3560 Training loss: 1.7665 0.0767 sec/batch\n", + "Epoch 12/20 Iteration 1985/3560 Training loss: 
1.7653 0.0682 sec/batch\n", + "Epoch 12/20 Iteration 1986/3560 Training loss: 1.7659 0.0687 sec/batch\n", + "Epoch 12/20 Iteration 1987/3560 Training loss: 1.7667 0.0656 sec/batch\n", + "Epoch 12/20 Iteration 1988/3560 Training loss: 1.7665 0.0688 sec/batch\n", + "Epoch 12/20 Iteration 1989/3560 Training loss: 1.7662 0.0681 sec/batch\n", + "Epoch 12/20 Iteration 1990/3560 Training loss: 1.7655 0.0660 sec/batch\n", + "Epoch 12/20 Iteration 1991/3560 Training loss: 1.7659 0.0686 sec/batch\n", + "Epoch 12/20 Iteration 1992/3560 Training loss: 1.7663 0.0729 sec/batch\n", + "Epoch 12/20 Iteration 1993/3560 Training loss: 1.7659 0.0752 sec/batch\n", + "Epoch 12/20 Iteration 1994/3560 Training loss: 1.7659 0.0705 sec/batch\n", + "Epoch 12/20 Iteration 1995/3560 Training loss: 1.7654 0.0671 sec/batch\n", + "Epoch 12/20 Iteration 1996/3560 Training loss: 1.7647 0.0688 sec/batch\n", + "Epoch 12/20 Iteration 1997/3560 Training loss: 1.7632 0.0756 sec/batch\n", + "Epoch 12/20 Iteration 1998/3560 Training loss: 1.7626 0.0656 sec/batch\n", + "Epoch 12/20 Iteration 1999/3560 Training loss: 1.7621 0.0699 sec/batch\n", + "Epoch 12/20 Iteration 2000/3560 Training loss: 1.7623 0.0741 sec/batch\n", + "Epoch 12/20 Iteration 2001/3560 Training loss: 1.7618 0.0694 sec/batch\n", + "Epoch 12/20 Iteration 2002/3560 Training loss: 1.7613 0.0690 sec/batch\n", + "Epoch 12/20 Iteration 2003/3560 Training loss: 1.7615 0.0675 sec/batch\n", + "Epoch 12/20 Iteration 2004/3560 Training loss: 1.7604 0.0650 sec/batch\n", + "Epoch 12/20 Iteration 2005/3560 Training loss: 1.7600 0.0721 sec/batch\n", + "Epoch 12/20 Iteration 2006/3560 Training loss: 1.7595 0.0767 sec/batch\n", + "Epoch 12/20 Iteration 2007/3560 Training loss: 1.7591 0.0681 sec/batch\n", + "Epoch 12/20 Iteration 2008/3560 Training loss: 1.7598 0.0707 sec/batch\n", + "Epoch 12/20 Iteration 2009/3560 Training loss: 1.7592 0.0672 sec/batch\n", + "Epoch 12/20 Iteration 2010/3560 Training loss: 1.7602 0.0717 sec/batch\n", + "Epoch 12/20 
Iteration 2011/3560 Training loss: 1.7601 0.0672 sec/batch\n", + "Epoch 12/20 Iteration 2012/3560 Training loss: 1.7600 0.0688 sec/batch\n", + "Epoch 12/20 Iteration 2013/3560 Training loss: 1.7597 0.0666 sec/batch\n", + "Epoch 12/20 Iteration 2014/3560 Training loss: 1.7599 0.0693 sec/batch\n", + "Epoch 12/20 Iteration 2015/3560 Training loss: 1.7601 0.0655 sec/batch\n", + "Epoch 12/20 Iteration 2016/3560 Training loss: 1.7599 0.0694 sec/batch\n", + "Epoch 12/20 Iteration 2017/3560 Training loss: 1.7594 0.0708 sec/batch\n", + "Epoch 12/20 Iteration 2018/3560 Training loss: 1.7598 0.0680 sec/batch\n", + "Epoch 12/20 Iteration 2019/3560 Training loss: 1.7596 0.0650 sec/batch\n", + "Epoch 12/20 Iteration 2020/3560 Training loss: 1.7605 0.0706 sec/batch\n", + "Epoch 12/20 Iteration 2021/3560 Training loss: 1.7609 0.0719 sec/batch\n", + "Epoch 12/20 Iteration 2022/3560 Training loss: 1.7611 0.0767 sec/batch\n", + "Epoch 12/20 Iteration 2023/3560 Training loss: 1.7611 0.0704 sec/batch\n", + "Epoch 12/20 Iteration 2024/3560 Training loss: 1.7613 0.0694 sec/batch\n", + "Epoch 12/20 Iteration 2025/3560 Training loss: 1.7614 0.0694 sec/batch\n", + "Epoch 12/20 Iteration 2026/3560 Training loss: 1.7609 0.0679 sec/batch\n", + "Epoch 12/20 Iteration 2027/3560 Training loss: 1.7608 0.0657 sec/batch\n", + "Epoch 12/20 Iteration 2028/3560 Training loss: 1.7606 0.0645 sec/batch\n", + "Epoch 12/20 Iteration 2029/3560 Training loss: 1.7613 0.0671 sec/batch\n", + "Epoch 12/20 Iteration 2030/3560 Training loss: 1.7614 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 2031/3560 Training loss: 1.7618 0.0669 sec/batch\n", + "Epoch 12/20 Iteration 2032/3560 Training loss: 1.7615 0.0660 sec/batch\n", + "Epoch 12/20 Iteration 2033/3560 Training loss: 1.7614 0.0695 sec/batch\n", + "Epoch 12/20 Iteration 2034/3560 Training loss: 1.7616 0.0693 sec/batch\n", + "Epoch 12/20 Iteration 2035/3560 Training loss: 1.7615 0.0671 sec/batch\n", + "Epoch 12/20 Iteration 2036/3560 Training loss: 1.7616 0.0721 
sec/batch\n", + "Epoch 12/20 Iteration 2037/3560 Training loss: 1.7610 0.0650 sec/batch\n", + "Epoch 12/20 Iteration 2038/3560 Training loss: 1.7608 0.0656 sec/batch\n", + "Epoch 12/20 Iteration 2039/3560 Training loss: 1.7603 0.0640 sec/batch\n", + "Epoch 12/20 Iteration 2040/3560 Training loss: 1.7605 0.0694 sec/batch\n", + "Epoch 12/20 Iteration 2041/3560 Training loss: 1.7599 0.0710 sec/batch\n", + "Epoch 12/20 Iteration 2042/3560 Training loss: 1.7598 0.0660 sec/batch\n", + "Epoch 12/20 Iteration 2043/3560 Training loss: 1.7592 0.0662 sec/batch\n", + "Epoch 12/20 Iteration 2044/3560 Training loss: 1.7589 0.0658 sec/batch\n", + "Epoch 12/20 Iteration 2045/3560 Training loss: 1.7587 0.0651 sec/batch\n", + "Epoch 12/20 Iteration 2046/3560 Training loss: 1.7583 0.0763 sec/batch\n", + "Epoch 12/20 Iteration 2047/3560 Training loss: 1.7578 0.0700 sec/batch\n", + "Epoch 12/20 Iteration 2048/3560 Training loss: 1.7579 0.0640 sec/batch\n", + "Epoch 12/20 Iteration 2049/3560 Training loss: 1.7576 0.0709 sec/batch\n", + "Epoch 12/20 Iteration 2050/3560 Training loss: 1.7574 0.0656 sec/batch\n", + "Epoch 12/20 Iteration 2051/3560 Training loss: 1.7569 0.0680 sec/batch\n", + "Epoch 12/20 Iteration 2052/3560 Training loss: 1.7567 0.0689 sec/batch\n", + "Epoch 12/20 Iteration 2053/3560 Training loss: 1.7563 0.0684 sec/batch\n", + "Epoch 12/20 Iteration 2054/3560 Training loss: 1.7563 0.0736 sec/batch\n", + "Epoch 12/20 Iteration 2055/3560 Training loss: 1.7563 0.0678 sec/batch\n", + "Epoch 12/20 Iteration 2056/3560 Training loss: 1.7557 0.0696 sec/batch\n", + "Epoch 12/20 Iteration 2057/3560 Training loss: 1.7553 0.0709 sec/batch\n", + "Epoch 12/20 Iteration 2058/3560 Training loss: 1.7548 0.0727 sec/batch\n", + "Epoch 12/20 Iteration 2059/3560 Training loss: 1.7548 0.0675 sec/batch\n", + "Epoch 12/20 Iteration 2060/3560 Training loss: 1.7547 0.0677 sec/batch\n", + "Epoch 12/20 Iteration 2061/3560 Training loss: 1.7545 0.0705 sec/batch\n", + "Epoch 12/20 Iteration 2062/3560 
Training loss: 1.7542 0.0723 sec/batch\n", + "Epoch 12/20 Iteration 2063/3560 Training loss: 1.7539 0.0717 sec/batch\n", + "Epoch 12/20 Iteration 2064/3560 Training loss: 1.7537 0.0700 sec/batch\n", + "Epoch 12/20 Iteration 2065/3560 Training loss: 1.7536 0.0650 sec/batch\n", + "Epoch 12/20 Iteration 2066/3560 Training loss: 1.7535 0.0701 sec/batch\n", + "Epoch 12/20 Iteration 2067/3560 Training loss: 1.7535 0.0697 sec/batch\n", + "Epoch 12/20 Iteration 2068/3560 Training loss: 1.7535 0.0705 sec/batch\n", + "Epoch 12/20 Iteration 2069/3560 Training loss: 1.7534 0.0654 sec/batch\n", + "Epoch 12/20 Iteration 2070/3560 Training loss: 1.7532 0.0675 sec/batch\n", + "Epoch 12/20 Iteration 2071/3560 Training loss: 1.7530 0.0718 sec/batch\n", + "Epoch 12/20 Iteration 2072/3560 Training loss: 1.7528 0.0753 sec/batch\n", + "Epoch 12/20 Iteration 2073/3560 Training loss: 1.7526 0.0742 sec/batch\n", + "Epoch 12/20 Iteration 2074/3560 Training loss: 1.7522 0.0680 sec/batch\n", + "Epoch 12/20 Iteration 2075/3560 Training loss: 1.7521 0.0707 sec/batch\n", + "Epoch 12/20 Iteration 2076/3560 Training loss: 1.7519 0.0682 sec/batch\n", + "Epoch 12/20 Iteration 2077/3560 Training loss: 1.7519 0.0738 sec/batch\n", + "Epoch 12/20 Iteration 2078/3560 Training loss: 1.7518 0.0646 sec/batch\n", + "Epoch 12/20 Iteration 2079/3560 Training loss: 1.7517 0.0666 sec/batch\n", + "Epoch 12/20 Iteration 2080/3560 Training loss: 1.7513 0.0681 sec/batch\n", + "Epoch 12/20 Iteration 2081/3560 Training loss: 1.7510 0.0705 sec/batch\n", + "Epoch 12/20 Iteration 2082/3560 Training loss: 1.7511 0.0714 sec/batch\n", + "Epoch 12/20 Iteration 2083/3560 Training loss: 1.7510 0.0657 sec/batch\n", + "Epoch 12/20 Iteration 2084/3560 Training loss: 1.7506 0.0697 sec/batch\n", + "Epoch 12/20 Iteration 2085/3560 Training loss: 1.7507 0.0668 sec/batch\n", + "Epoch 12/20 Iteration 2086/3560 Training loss: 1.7507 0.0663 sec/batch\n", + "Epoch 12/20 Iteration 2087/3560 Training loss: 1.7507 0.0652 sec/batch\n", + 
"Epoch 12/20 Iteration 2088/3560 Training loss: 1.7505 0.0698 sec/batch\n", + "Epoch 12/20 Iteration 2089/3560 Training loss: 1.7502 0.0678 sec/batch\n", + "Epoch 12/20 Iteration 2090/3560 Training loss: 1.7498 0.0668 sec/batch\n", + "Epoch 12/20 Iteration 2091/3560 Training loss: 1.7499 0.0675 sec/batch\n", + "Epoch 12/20 Iteration 2092/3560 Training loss: 1.7499 0.0682 sec/batch\n", + "Epoch 12/20 Iteration 2093/3560 Training loss: 1.7498 0.0795 sec/batch\n", + "Epoch 12/20 Iteration 2094/3560 Training loss: 1.7498 0.0669 sec/batch\n", + "Epoch 12/20 Iteration 2095/3560 Training loss: 1.7499 0.0709 sec/batch\n", + "Epoch 12/20 Iteration 2096/3560 Training loss: 1.7500 0.0707 sec/batch\n", + "Epoch 12/20 Iteration 2097/3560 Training loss: 1.7500 0.0722 sec/batch\n", + "Epoch 12/20 Iteration 2098/3560 Training loss: 1.7499 0.0696 sec/batch\n", + "Epoch 12/20 Iteration 2099/3560 Training loss: 1.7502 0.0690 sec/batch\n", + "Epoch 12/20 Iteration 2100/3560 Training loss: 1.7501 0.0714 sec/batch\n", + "Epoch 12/20 Iteration 2101/3560 Training loss: 1.7499 0.0665 sec/batch\n", + "Epoch 12/20 Iteration 2102/3560 Training loss: 1.7498 0.0659 sec/batch\n", + "Epoch 12/20 Iteration 2103/3560 Training loss: 1.7496 0.0659 sec/batch\n", + "Epoch 12/20 Iteration 2104/3560 Training loss: 1.7497 0.0690 sec/batch\n", + "Epoch 12/20 Iteration 2105/3560 Training loss: 1.7497 0.0766 sec/batch\n", + "Epoch 12/20 Iteration 2106/3560 Training loss: 1.7499 0.0675 sec/batch\n", + "Epoch 12/20 Iteration 2107/3560 Training loss: 1.7499 0.0677 sec/batch\n", + "Epoch 12/20 Iteration 2108/3560 Training loss: 1.7497 0.0757 sec/batch\n", + "Epoch 12/20 Iteration 2109/3560 Training loss: 1.7495 0.0657 sec/batch\n", + "Epoch 12/20 Iteration 2110/3560 Training loss: 1.7496 0.0658 sec/batch\n", + "Epoch 12/20 Iteration 2111/3560 Training loss: 1.7497 0.0670 sec/batch\n", + "Epoch 12/20 Iteration 2112/3560 Training loss: 1.7497 0.0688 sec/batch\n", + "Epoch 12/20 Iteration 2113/3560 Training loss: 
1.7496 0.0674 sec/batch\n", + "Epoch 12/20 Iteration 2114/3560 Training loss: 1.7495 0.0668 sec/batch\n", + "Epoch 12/20 Iteration 2115/3560 Training loss: 1.7496 0.0672 sec/batch\n", + "Epoch 12/20 Iteration 2116/3560 Training loss: 1.7496 0.0666 sec/batch\n", + "Epoch 12/20 Iteration 2117/3560 Training loss: 1.7493 0.0698 sec/batch\n", + "Epoch 12/20 Iteration 2118/3560 Training loss: 1.7493 0.0723 sec/batch\n", + "Epoch 12/20 Iteration 2119/3560 Training loss: 1.7496 0.0792 sec/batch\n", + "Epoch 12/20 Iteration 2120/3560 Training loss: 1.7495 0.0712 sec/batch\n", + "Epoch 12/20 Iteration 2121/3560 Training loss: 1.7495 0.0685 sec/batch\n", + "Epoch 12/20 Iteration 2122/3560 Training loss: 1.7495 0.0674 sec/batch\n", + "Epoch 12/20 Iteration 2123/3560 Training loss: 1.7495 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 2124/3560 Training loss: 1.7494 0.0667 sec/batch\n", + "Epoch 12/20 Iteration 2125/3560 Training loss: 1.7494 0.0699 sec/batch\n", + "Epoch 12/20 Iteration 2126/3560 Training loss: 1.7497 0.0684 sec/batch\n", + "Epoch 12/20 Iteration 2127/3560 Training loss: 1.7496 0.0676 sec/batch\n", + "Epoch 12/20 Iteration 2128/3560 Training loss: 1.7495 0.0751 sec/batch\n", + "Epoch 12/20 Iteration 2129/3560 Training loss: 1.7494 0.0726 sec/batch\n", + "Epoch 12/20 Iteration 2130/3560 Training loss: 1.7492 0.0716 sec/batch\n", + "Epoch 12/20 Iteration 2131/3560 Training loss: 1.7493 0.0689 sec/batch\n", + "Epoch 12/20 Iteration 2132/3560 Training loss: 1.7493 0.0693 sec/batch\n", + "Epoch 12/20 Iteration 2133/3560 Training loss: 1.7493 0.0673 sec/batch\n", + "Epoch 12/20 Iteration 2134/3560 Training loss: 1.7492 0.0707 sec/batch\n", + "Epoch 12/20 Iteration 2135/3560 Training loss: 1.7490 0.0694 sec/batch\n", + "Epoch 12/20 Iteration 2136/3560 Training loss: 1.7491 0.0700 sec/batch\n", + "Epoch 13/20 Iteration 2137/3560 Training loss: 1.8318 0.0687 sec/batch\n", + "Epoch 13/20 Iteration 2138/3560 Training loss: 1.7837 0.0692 sec/batch\n", + "Epoch 13/20 
Iteration 2139/3560 Training loss: 1.7646 0.0732 sec/batch\n", + "Epoch 13/20 Iteration 2140/3560 Training loss: 1.7556 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2141/3560 Training loss: 1.7529 0.0682 sec/batch\n", + "Epoch 13/20 Iteration 2142/3560 Training loss: 1.7405 0.0654 sec/batch\n", + "Epoch 13/20 Iteration 2143/3560 Training loss: 1.7421 0.0784 sec/batch\n", + "Epoch 13/20 Iteration 2144/3560 Training loss: 1.7402 0.0746 sec/batch\n", + "Epoch 13/20 Iteration 2145/3560 Training loss: 1.7420 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2146/3560 Training loss: 1.7410 0.0705 sec/batch\n", + "Epoch 13/20 Iteration 2147/3560 Training loss: 1.7381 0.0701 sec/batch\n", + "Epoch 13/20 Iteration 2148/3560 Training loss: 1.7369 0.0645 sec/batch\n", + "Epoch 13/20 Iteration 2149/3560 Training loss: 1.7364 0.0768 sec/batch\n", + "Epoch 13/20 Iteration 2150/3560 Training loss: 1.7392 0.0729 sec/batch\n", + "Epoch 13/20 Iteration 2151/3560 Training loss: 1.7387 0.0756 sec/batch\n", + "Epoch 13/20 Iteration 2152/3560 Training loss: 1.7371 0.0737 sec/batch\n", + "Epoch 13/20 Iteration 2153/3560 Training loss: 1.7372 0.0720 sec/batch\n", + "Epoch 13/20 Iteration 2154/3560 Training loss: 1.7397 0.0720 sec/batch\n", + "Epoch 13/20 Iteration 2155/3560 Training loss: 1.7399 0.0660 sec/batch\n", + "Epoch 13/20 Iteration 2156/3560 Training loss: 1.7402 0.0681 sec/batch\n", + "Epoch 13/20 Iteration 2157/3560 Training loss: 1.7398 0.0681 sec/batch\n", + "Epoch 13/20 Iteration 2158/3560 Training loss: 1.7411 0.0701 sec/batch\n", + "Epoch 13/20 Iteration 2159/3560 Training loss: 1.7401 0.0676 sec/batch\n", + "Epoch 13/20 Iteration 2160/3560 Training loss: 1.7396 0.0709 sec/batch\n", + "Epoch 13/20 Iteration 2161/3560 Training loss: 1.7396 0.0694 sec/batch\n", + "Epoch 13/20 Iteration 2162/3560 Training loss: 1.7381 0.0734 sec/batch\n", + "Epoch 13/20 Iteration 2163/3560 Training loss: 1.7367 0.0674 sec/batch\n", + "Epoch 13/20 Iteration 2164/3560 Training loss: 1.7369 0.0674 
sec/batch\n", + "Epoch 13/20 Iteration 2165/3560 Training loss: 1.7379 0.0667 sec/batch\n", + "Epoch 13/20 Iteration 2166/3560 Training loss: 1.7381 0.0676 sec/batch\n", + "Epoch 13/20 Iteration 2167/3560 Training loss: 1.7379 0.0724 sec/batch\n", + "Epoch 13/20 Iteration 2168/3560 Training loss: 1.7370 0.0698 sec/batch\n", + "Epoch 13/20 Iteration 2169/3560 Training loss: 1.7371 0.0733 sec/batch\n", + "Epoch 13/20 Iteration 2170/3560 Training loss: 1.7375 0.0678 sec/batch\n", + "Epoch 13/20 Iteration 2171/3560 Training loss: 1.7369 0.0699 sec/batch\n", + "Epoch 13/20 Iteration 2172/3560 Training loss: 1.7368 0.0653 sec/batch\n", + "Epoch 13/20 Iteration 2173/3560 Training loss: 1.7361 0.0675 sec/batch\n", + "Epoch 13/20 Iteration 2174/3560 Training loss: 1.7348 0.0705 sec/batch\n", + "Epoch 13/20 Iteration 2175/3560 Training loss: 1.7337 0.0689 sec/batch\n", + "Epoch 13/20 Iteration 2176/3560 Training loss: 1.7332 0.0682 sec/batch\n", + "Epoch 13/20 Iteration 2177/3560 Training loss: 1.7328 0.0677 sec/batch\n", + "Epoch 13/20 Iteration 2178/3560 Training loss: 1.7330 0.0681 sec/batch\n", + "Epoch 13/20 Iteration 2179/3560 Training loss: 1.7323 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2180/3560 Training loss: 1.7316 0.0685 sec/batch\n", + "Epoch 13/20 Iteration 2181/3560 Training loss: 1.7317 0.0667 sec/batch\n", + "Epoch 13/20 Iteration 2182/3560 Training loss: 1.7305 0.0669 sec/batch\n", + "Epoch 13/20 Iteration 2183/3560 Training loss: 1.7303 0.0649 sec/batch\n", + "Epoch 13/20 Iteration 2184/3560 Training loss: 1.7298 0.0709 sec/batch\n", + "Epoch 13/20 Iteration 2185/3560 Training loss: 1.7296 0.0732 sec/batch\n", + "Epoch 13/20 Iteration 2186/3560 Training loss: 1.7303 0.0668 sec/batch\n", + "Epoch 13/20 Iteration 2187/3560 Training loss: 1.7299 0.0706 sec/batch\n", + "Epoch 13/20 Iteration 2188/3560 Training loss: 1.7309 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2189/3560 Training loss: 1.7307 0.0688 sec/batch\n", + "Epoch 13/20 Iteration 2190/3560 
Training loss: 1.7307 0.0649 sec/batch\n", + "Epoch 13/20 Iteration 2191/3560 Training loss: 1.7306 0.0695 sec/batch\n", + "Epoch 13/20 Iteration 2192/3560 Training loss: 1.7306 0.0769 sec/batch\n", + "Epoch 13/20 Iteration 2193/3560 Training loss: 1.7310 0.0730 sec/batch\n", + "Epoch 13/20 Iteration 2194/3560 Training loss: 1.7306 0.0676 sec/batch\n", + "Epoch 13/20 Iteration 2195/3560 Training loss: 1.7301 0.0740 sec/batch\n", + "Epoch 13/20 Iteration 2196/3560 Training loss: 1.7305 0.0708 sec/batch\n", + "Epoch 13/20 Iteration 2197/3560 Training loss: 1.7306 0.0650 sec/batch\n", + "Epoch 13/20 Iteration 2198/3560 Training loss: 1.7313 0.0658 sec/batch\n", + "Epoch 13/20 Iteration 2199/3560 Training loss: 1.7317 0.0681 sec/batch\n", + "Epoch 13/20 Iteration 2200/3560 Training loss: 1.7318 0.0663 sec/batch\n", + "Epoch 13/20 Iteration 2201/3560 Training loss: 1.7316 0.0690 sec/batch\n", + "Epoch 13/20 Iteration 2202/3560 Training loss: 1.7318 0.0718 sec/batch\n", + "Epoch 13/20 Iteration 2203/3560 Training loss: 1.7319 0.0723 sec/batch\n", + "Epoch 13/20 Iteration 2204/3560 Training loss: 1.7315 0.0670 sec/batch\n", + "Epoch 13/20 Iteration 2205/3560 Training loss: 1.7315 0.0704 sec/batch\n", + "Epoch 13/20 Iteration 2206/3560 Training loss: 1.7314 0.0643 sec/batch\n", + "Epoch 13/20 Iteration 2207/3560 Training loss: 1.7319 0.0674 sec/batch\n", + "Epoch 13/20 Iteration 2208/3560 Training loss: 1.7321 0.0775 sec/batch\n", + "Epoch 13/20 Iteration 2209/3560 Training loss: 1.7325 0.0686 sec/batch\n", + "Epoch 13/20 Iteration 2210/3560 Training loss: 1.7322 0.0665 sec/batch\n", + "Epoch 13/20 Iteration 2211/3560 Training loss: 1.7322 0.0650 sec/batch\n", + "Epoch 13/20 Iteration 2212/3560 Training loss: 1.7323 0.0711 sec/batch\n", + "Epoch 13/20 Iteration 2213/3560 Training loss: 1.7322 0.0647 sec/batch\n", + "Epoch 13/20 Iteration 2214/3560 Training loss: 1.7323 0.0694 sec/batch\n", + "Epoch 13/20 Iteration 2215/3560 Training loss: 1.7318 0.0683 sec/batch\n", + 
"Epoch 13/20 Iteration 2216/3560 Training loss: 1.7316 0.0757 sec/batch\n", + "Epoch 13/20 Iteration 2217/3560 Training loss: 1.7310 0.0781 sec/batch\n", + "Epoch 13/20 Iteration 2218/3560 Training loss: 1.7311 0.0682 sec/batch\n", + "Epoch 13/20 Iteration 2219/3560 Training loss: 1.7305 0.0770 sec/batch\n", + "Epoch 13/20 Iteration 2220/3560 Training loss: 1.7304 0.0704 sec/batch\n", + "Epoch 13/20 Iteration 2221/3560 Training loss: 1.7299 0.0670 sec/batch\n", + "Epoch 13/20 Iteration 2222/3560 Training loss: 1.7296 0.0672 sec/batch\n", + "Epoch 13/20 Iteration 2223/3560 Training loss: 1.7294 0.0649 sec/batch\n", + "Epoch 13/20 Iteration 2224/3560 Training loss: 1.7292 0.0662 sec/batch\n", + "Epoch 13/20 Iteration 2225/3560 Training loss: 1.7286 0.0721 sec/batch\n", + "Epoch 13/20 Iteration 2226/3560 Training loss: 1.7286 0.0675 sec/batch\n", + "Epoch 13/20 Iteration 2227/3560 Training loss: 1.7282 0.0678 sec/batch\n", + "Epoch 13/20 Iteration 2228/3560 Training loss: 1.7279 0.0696 sec/batch\n", + "Epoch 13/20 Iteration 2229/3560 Training loss: 1.7275 0.0710 sec/batch\n", + "Epoch 13/20 Iteration 2230/3560 Training loss: 1.7271 0.0681 sec/batch\n", + "Epoch 13/20 Iteration 2231/3560 Training loss: 1.7267 0.0669 sec/batch\n", + "Epoch 13/20 Iteration 2232/3560 Training loss: 1.7266 0.0811 sec/batch\n", + "Epoch 13/20 Iteration 2233/3560 Training loss: 1.7264 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2234/3560 Training loss: 1.7259 0.0715 sec/batch\n", + "Epoch 13/20 Iteration 2235/3560 Training loss: 1.7255 0.0715 sec/batch\n", + "Epoch 13/20 Iteration 2236/3560 Training loss: 1.7250 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2237/3560 Training loss: 1.7250 0.0662 sec/batch\n", + "Epoch 13/20 Iteration 2238/3560 Training loss: 1.7248 0.0704 sec/batch\n", + "Epoch 13/20 Iteration 2239/3560 Training loss: 1.7246 0.0757 sec/batch\n", + "Epoch 13/20 Iteration 2240/3560 Training loss: 1.7243 0.0676 sec/batch\n", + "Epoch 13/20 Iteration 2241/3560 Training loss: 
1.7241 0.0824 sec/batch\n", + "Epoch 13/20 Iteration 2242/3560 Training loss: 1.7240 0.0777 sec/batch\n", + "Epoch 13/20 Iteration 2243/3560 Training loss: 1.7240 0.0691 sec/batch\n", + "Epoch 13/20 Iteration 2244/3560 Training loss: 1.7238 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2245/3560 Training loss: 1.7237 0.0671 sec/batch\n", + "Epoch 13/20 Iteration 2246/3560 Training loss: 1.7237 0.0641 sec/batch\n", + "Epoch 13/20 Iteration 2247/3560 Training loss: 1.7236 0.0823 sec/batch\n", + "Epoch 13/20 Iteration 2248/3560 Training loss: 1.7234 0.0656 sec/batch\n", + "Epoch 13/20 Iteration 2249/3560 Training loss: 1.7232 0.0707 sec/batch\n", + "Epoch 13/20 Iteration 2250/3560 Training loss: 1.7231 0.0704 sec/batch\n", + "Epoch 13/20 Iteration 2251/3560 Training loss: 1.7228 0.0677 sec/batch\n", + "Epoch 13/20 Iteration 2252/3560 Training loss: 1.7224 0.0679 sec/batch\n", + "Epoch 13/20 Iteration 2253/3560 Training loss: 1.7224 0.0673 sec/batch\n", + "Epoch 13/20 Iteration 2254/3560 Training loss: 1.7223 0.0672 sec/batch\n", + "Epoch 13/20 Iteration 2255/3560 Training loss: 1.7222 0.0713 sec/batch\n", + "Epoch 13/20 Iteration 2256/3560 Training loss: 1.7221 0.0700 sec/batch\n", + "Epoch 13/20 Iteration 2257/3560 Training loss: 1.7220 0.0737 sec/batch\n", + "Epoch 13/20 Iteration 2258/3560 Training loss: 1.7217 0.0685 sec/batch\n", + "Epoch 13/20 Iteration 2259/3560 Training loss: 1.7214 0.0669 sec/batch\n", + "Epoch 13/20 Iteration 2260/3560 Training loss: 1.7215 0.0670 sec/batch\n", + "Epoch 13/20 Iteration 2261/3560 Training loss: 1.7214 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2262/3560 Training loss: 1.7210 0.0688 sec/batch\n", + "Epoch 13/20 Iteration 2263/3560 Training loss: 1.7210 0.0681 sec/batch\n", + "Epoch 13/20 Iteration 2264/3560 Training loss: 1.7210 0.0744 sec/batch\n", + "Epoch 13/20 Iteration 2265/3560 Training loss: 1.7209 0.0770 sec/batch\n", + "Epoch 13/20 Iteration 2266/3560 Training loss: 1.7207 0.0755 sec/batch\n", + "Epoch 13/20 
Iteration 2267/3560 Training loss: 1.7204 0.0700 sec/batch\n", + "Epoch 13/20 Iteration 2268/3560 Training loss: 1.7201 0.0687 sec/batch\n", + "Epoch 13/20 Iteration 2269/3560 Training loss: 1.7201 0.0714 sec/batch\n", + "Epoch 13/20 Iteration 2270/3560 Training loss: 1.7201 0.0690 sec/batch\n", + "Epoch 13/20 Iteration 2271/3560 Training loss: 1.7201 0.0681 sec/batch\n", + "Epoch 13/20 Iteration 2272/3560 Training loss: 1.7200 0.0686 sec/batch\n", + "Epoch 13/20 Iteration 2273/3560 Training loss: 1.7202 0.0653 sec/batch\n", + "Epoch 13/20 Iteration 2274/3560 Training loss: 1.7201 0.0691 sec/batch\n", + "Epoch 13/20 Iteration 2275/3560 Training loss: 1.7202 0.0728 sec/batch\n", + "Epoch 13/20 Iteration 2276/3560 Training loss: 1.7200 0.0670 sec/batch\n", + "Epoch 13/20 Iteration 2277/3560 Training loss: 1.7202 0.0667 sec/batch\n", + "Epoch 13/20 Iteration 2278/3560 Training loss: 1.7201 0.0677 sec/batch\n", + "Epoch 13/20 Iteration 2279/3560 Training loss: 1.7200 0.0650 sec/batch\n", + "Epoch 13/20 Iteration 2280/3560 Training loss: 1.7201 0.0660 sec/batch\n", + "Epoch 13/20 Iteration 2281/3560 Training loss: 1.7199 0.0656 sec/batch\n", + "Epoch 13/20 Iteration 2282/3560 Training loss: 1.7200 0.0702 sec/batch\n", + "Epoch 13/20 Iteration 2283/3560 Training loss: 1.7200 0.0681 sec/batch\n", + "Epoch 13/20 Iteration 2284/3560 Training loss: 1.7200 0.0730 sec/batch\n", + "Epoch 13/20 Iteration 2285/3560 Training loss: 1.7201 0.0703 sec/batch\n", + "Epoch 13/20 Iteration 2286/3560 Training loss: 1.7199 0.0699 sec/batch\n", + "Epoch 13/20 Iteration 2287/3560 Training loss: 1.7196 0.0718 sec/batch\n", + "Epoch 13/20 Iteration 2288/3560 Training loss: 1.7195 0.0676 sec/batch\n", + "Epoch 13/20 Iteration 2289/3560 Training loss: 1.7196 0.0696 sec/batch\n", + "Epoch 13/20 Iteration 2290/3560 Training loss: 1.7196 0.0741 sec/batch\n", + "Epoch 13/20 Iteration 2291/3560 Training loss: 1.7195 0.0685 sec/batch\n", + "Epoch 13/20 Iteration 2292/3560 Training loss: 1.7194 0.0731 
[training log output truncated: Epochs 13–16 of 20, Iterations 2293–2804 of 3560; training loss decreases from ≈1.72 to ≈1.64 at ≈0.07 sec/batch]
sec/batch\n", + "Epoch 16/20 Iteration 2805/3560 Training loss: 1.6410 0.0687 sec/batch\n", + "Epoch 16/20 Iteration 2806/3560 Training loss: 1.6410 0.0715 sec/batch\n", + "Epoch 16/20 Iteration 2807/3560 Training loss: 1.6411 0.0728 sec/batch\n", + "Epoch 16/20 Iteration 2808/3560 Training loss: 1.6412 0.0692 sec/batch\n", + "Epoch 16/20 Iteration 2809/3560 Training loss: 1.6412 0.0674 sec/batch\n", + "Epoch 16/20 Iteration 2810/3560 Training loss: 1.6411 0.0670 sec/batch\n", + "Epoch 16/20 Iteration 2811/3560 Training loss: 1.6414 0.0656 sec/batch\n", + "Epoch 16/20 Iteration 2812/3560 Training loss: 1.6413 0.0676 sec/batch\n", + "Epoch 16/20 Iteration 2813/3560 Training loss: 1.6411 0.0677 sec/batch\n", + "Epoch 16/20 Iteration 2814/3560 Training loss: 1.6412 0.0675 sec/batch\n", + "Epoch 16/20 Iteration 2815/3560 Training loss: 1.6410 0.0646 sec/batch\n", + "Epoch 16/20 Iteration 2816/3560 Training loss: 1.6411 0.0688 sec/batch\n", + "Epoch 16/20 Iteration 2817/3560 Training loss: 1.6412 0.0685 sec/batch\n", + "Epoch 16/20 Iteration 2818/3560 Training loss: 1.6413 0.0683 sec/batch\n", + "Epoch 16/20 Iteration 2819/3560 Training loss: 1.6413 0.0685 sec/batch\n", + "Epoch 16/20 Iteration 2820/3560 Training loss: 1.6411 0.0676 sec/batch\n", + "Epoch 16/20 Iteration 2821/3560 Training loss: 1.6408 0.0669 sec/batch\n", + "Epoch 16/20 Iteration 2822/3560 Training loss: 1.6408 0.0662 sec/batch\n", + "Epoch 16/20 Iteration 2823/3560 Training loss: 1.6409 0.0743 sec/batch\n", + "Epoch 16/20 Iteration 2824/3560 Training loss: 1.6409 0.0785 sec/batch\n", + "Epoch 16/20 Iteration 2825/3560 Training loss: 1.6408 0.0672 sec/batch\n", + "Epoch 16/20 Iteration 2826/3560 Training loss: 1.6407 0.0697 sec/batch\n", + "Epoch 16/20 Iteration 2827/3560 Training loss: 1.6407 0.0668 sec/batch\n", + "Epoch 16/20 Iteration 2828/3560 Training loss: 1.6407 0.0645 sec/batch\n", + "Epoch 16/20 Iteration 2829/3560 Training loss: 1.6405 0.0767 sec/batch\n", + "Epoch 16/20 Iteration 2830/3560 
Training loss: 1.6406 0.0674 sec/batch\n", + "Epoch 16/20 Iteration 2831/3560 Training loss: 1.6408 0.0737 sec/batch\n", + "Epoch 16/20 Iteration 2832/3560 Training loss: 1.6408 0.0673 sec/batch\n", + "Epoch 16/20 Iteration 2833/3560 Training loss: 1.6407 0.0777 sec/batch\n", + "Epoch 16/20 Iteration 2834/3560 Training loss: 1.6407 0.0659 sec/batch\n", + "Epoch 16/20 Iteration 2835/3560 Training loss: 1.6406 0.0753 sec/batch\n", + "Epoch 16/20 Iteration 2836/3560 Training loss: 1.6406 0.0737 sec/batch\n", + "Epoch 16/20 Iteration 2837/3560 Training loss: 1.6407 0.0695 sec/batch\n", + "Epoch 16/20 Iteration 2838/3560 Training loss: 1.6411 0.0673 sec/batch\n", + "Epoch 16/20 Iteration 2839/3560 Training loss: 1.6409 0.0734 sec/batch\n", + "Epoch 16/20 Iteration 2840/3560 Training loss: 1.6408 0.0660 sec/batch\n", + "Epoch 16/20 Iteration 2841/3560 Training loss: 1.6407 0.0670 sec/batch\n", + "Epoch 16/20 Iteration 2842/3560 Training loss: 1.6405 0.0674 sec/batch\n", + "Epoch 16/20 Iteration 2843/3560 Training loss: 1.6405 0.0719 sec/batch\n", + "Epoch 16/20 Iteration 2844/3560 Training loss: 1.6405 0.0711 sec/batch\n", + "Epoch 16/20 Iteration 2845/3560 Training loss: 1.6405 0.0674 sec/batch\n", + "Epoch 16/20 Iteration 2846/3560 Training loss: 1.6403 0.0661 sec/batch\n", + "Epoch 16/20 Iteration 2847/3560 Training loss: 1.6401 0.0679 sec/batch\n", + "Epoch 16/20 Iteration 2848/3560 Training loss: 1.6402 0.0698 sec/batch\n", + "Epoch 17/20 Iteration 2849/3560 Training loss: 1.7364 0.0665 sec/batch\n", + "Epoch 17/20 Iteration 2850/3560 Training loss: 1.6894 0.0705 sec/batch\n", + "Epoch 17/20 Iteration 2851/3560 Training loss: 1.6654 0.0711 sec/batch\n", + "Epoch 17/20 Iteration 2852/3560 Training loss: 1.6591 0.0684 sec/batch\n", + "Epoch 17/20 Iteration 2853/3560 Training loss: 1.6518 0.0711 sec/batch\n", + "Epoch 17/20 Iteration 2854/3560 Training loss: 1.6409 0.0811 sec/batch\n", + "Epoch 17/20 Iteration 2855/3560 Training loss: 1.6411 0.0700 sec/batch\n", + 
"Epoch 17/20 Iteration 2856/3560 Training loss: 1.6405 0.0650 sec/batch\n", + "Epoch 17/20 Iteration 2857/3560 Training loss: 1.6419 0.0675 sec/batch\n", + "Epoch 17/20 Iteration 2858/3560 Training loss: 1.6394 0.0671 sec/batch\n", + "Epoch 17/20 Iteration 2859/3560 Training loss: 1.6357 0.0774 sec/batch\n", + "Epoch 17/20 Iteration 2860/3560 Training loss: 1.6336 0.0853 sec/batch\n", + "Epoch 17/20 Iteration 2861/3560 Training loss: 1.6324 0.0683 sec/batch\n", + "Epoch 17/20 Iteration 2862/3560 Training loss: 1.6343 0.0697 sec/batch\n", + "Epoch 17/20 Iteration 2863/3560 Training loss: 1.6337 0.0934 sec/batch\n", + "Epoch 17/20 Iteration 2864/3560 Training loss: 1.6318 0.0678 sec/batch\n", + "Epoch 17/20 Iteration 2865/3560 Training loss: 1.6315 0.0673 sec/batch\n", + "Epoch 17/20 Iteration 2866/3560 Training loss: 1.6330 0.0660 sec/batch\n", + "Epoch 17/20 Iteration 2867/3560 Training loss: 1.6332 0.0672 sec/batch\n", + "Epoch 17/20 Iteration 2868/3560 Training loss: 1.6335 0.0712 sec/batch\n", + "Epoch 17/20 Iteration 2869/3560 Training loss: 1.6328 0.0653 sec/batch\n", + "Epoch 17/20 Iteration 2870/3560 Training loss: 1.6333 0.0660 sec/batch\n", + "Epoch 17/20 Iteration 2871/3560 Training loss: 1.6324 0.0686 sec/batch\n", + "Epoch 17/20 Iteration 2872/3560 Training loss: 1.6321 0.0644 sec/batch\n", + "Epoch 17/20 Iteration 2873/3560 Training loss: 1.6320 0.0706 sec/batch\n", + "Epoch 17/20 Iteration 2874/3560 Training loss: 1.6303 0.0685 sec/batch\n", + "Epoch 17/20 Iteration 2875/3560 Training loss: 1.6289 0.0725 sec/batch\n", + "Epoch 17/20 Iteration 2876/3560 Training loss: 1.6295 0.0735 sec/batch\n", + "Epoch 17/20 Iteration 2877/3560 Training loss: 1.6302 0.0681 sec/batch\n", + "Epoch 17/20 Iteration 2878/3560 Training loss: 1.6301 0.0656 sec/batch\n", + "Epoch 17/20 Iteration 2879/3560 Training loss: 1.6300 0.0684 sec/batch\n", + "Epoch 17/20 Iteration 2880/3560 Training loss: 1.6290 0.0702 sec/batch\n", + "Epoch 17/20 Iteration 2881/3560 Training loss: 
1.6290 0.0776 sec/batch\n", + "Epoch 17/20 Iteration 2882/3560 Training loss: 1.6291 0.0789 sec/batch\n", + "Epoch 17/20 Iteration 2883/3560 Training loss: 1.6288 0.0690 sec/batch\n", + "Epoch 17/20 Iteration 2884/3560 Training loss: 1.6286 0.0727 sec/batch\n", + "Epoch 17/20 Iteration 2885/3560 Training loss: 1.6278 0.0700 sec/batch\n", + "Epoch 17/20 Iteration 2886/3560 Training loss: 1.6264 0.0763 sec/batch\n", + "Epoch 17/20 Iteration 2887/3560 Training loss: 1.6251 0.0684 sec/batch\n", + "Epoch 17/20 Iteration 2888/3560 Training loss: 1.6244 0.0678 sec/batch\n", + "Epoch 17/20 Iteration 2889/3560 Training loss: 1.6240 0.0694 sec/batch\n", + "Epoch 17/20 Iteration 2890/3560 Training loss: 1.6243 0.0676 sec/batch\n", + "Epoch 17/20 Iteration 2891/3560 Training loss: 1.6236 0.0713 sec/batch\n", + "Epoch 17/20 Iteration 2892/3560 Training loss: 1.6228 0.0696 sec/batch\n", + "Epoch 17/20 Iteration 2893/3560 Training loss: 1.6230 0.0681 sec/batch\n", + "Epoch 17/20 Iteration 2894/3560 Training loss: 1.6218 0.0713 sec/batch\n", + "Epoch 17/20 Iteration 2895/3560 Training loss: 1.6214 0.0760 sec/batch\n", + "Epoch 17/20 Iteration 2896/3560 Training loss: 1.6211 0.0710 sec/batch\n", + "Epoch 17/20 Iteration 2897/3560 Training loss: 1.6207 0.0786 sec/batch\n", + "Epoch 17/20 Iteration 2898/3560 Training loss: 1.6213 0.0681 sec/batch\n", + "Epoch 17/20 Iteration 2899/3560 Training loss: 1.6208 0.0652 sec/batch\n", + "Epoch 17/20 Iteration 2900/3560 Training loss: 1.6218 0.0749 sec/batch\n", + "Epoch 17/20 Iteration 2901/3560 Training loss: 1.6217 0.0677 sec/batch\n", + "Epoch 17/20 Iteration 2902/3560 Training loss: 1.6217 0.0680 sec/batch\n", + "Epoch 17/20 Iteration 2903/3560 Training loss: 1.6213 0.0680 sec/batch\n", + "Epoch 17/20 Iteration 2904/3560 Training loss: 1.6214 0.0734 sec/batch\n", + "Epoch 17/20 Iteration 2905/3560 Training loss: 1.6218 0.0706 sec/batch\n", + "Epoch 17/20 Iteration 2906/3560 Training loss: 1.6214 0.0677 sec/batch\n", + "Epoch 17/20 
Iteration 2907/3560 Training loss: 1.6208 0.0703 sec/batch\n", + "Epoch 17/20 Iteration 2908/3560 Training loss: 1.6212 0.0710 sec/batch\n", + "Epoch 17/20 Iteration 2909/3560 Training loss: 1.6209 0.0659 sec/batch\n", + "Epoch 17/20 Iteration 2910/3560 Training loss: 1.6217 0.0653 sec/batch\n", + "Epoch 17/20 Iteration 2911/3560 Training loss: 1.6221 0.0717 sec/batch\n", + "Epoch 17/20 Iteration 2912/3560 Training loss: 1.6223 0.0670 sec/batch\n", + "Epoch 17/20 Iteration 2913/3560 Training loss: 1.6221 0.0685 sec/batch\n", + "Epoch 17/20 Iteration 2914/3560 Training loss: 1.6222 0.0669 sec/batch\n", + "Epoch 17/20 Iteration 2915/3560 Training loss: 1.6224 0.0689 sec/batch\n", + "Epoch 17/20 Iteration 2916/3560 Training loss: 1.6221 0.0669 sec/batch\n", + "Epoch 17/20 Iteration 2917/3560 Training loss: 1.6221 0.0701 sec/batch\n", + "Epoch 17/20 Iteration 2918/3560 Training loss: 1.6220 0.0707 sec/batch\n", + "Epoch 17/20 Iteration 2919/3560 Training loss: 1.6226 0.0651 sec/batch\n", + "Epoch 17/20 Iteration 2920/3560 Training loss: 1.6228 0.0680 sec/batch\n", + "Epoch 17/20 Iteration 2921/3560 Training loss: 1.6232 0.0691 sec/batch\n", + "Epoch 17/20 Iteration 2922/3560 Training loss: 1.6229 0.0694 sec/batch\n", + "Epoch 17/20 Iteration 2923/3560 Training loss: 1.6229 0.0704 sec/batch\n", + "Epoch 17/20 Iteration 2924/3560 Training loss: 1.6230 0.0665 sec/batch\n", + "Epoch 17/20 Iteration 2925/3560 Training loss: 1.6227 0.0677 sec/batch\n", + "Epoch 17/20 Iteration 2926/3560 Training loss: 1.6228 0.0677 sec/batch\n", + "Epoch 17/20 Iteration 2927/3560 Training loss: 1.6222 0.0670 sec/batch\n", + "Epoch 17/20 Iteration 2928/3560 Training loss: 1.6221 0.0731 sec/batch\n", + "Epoch 17/20 Iteration 2929/3560 Training loss: 1.6216 0.0706 sec/batch\n", + "Epoch 17/20 Iteration 2930/3560 Training loss: 1.6216 0.0680 sec/batch\n", + "Epoch 17/20 Iteration 2931/3560 Training loss: 1.6211 0.0666 sec/batch\n", + "Epoch 17/20 Iteration 2932/3560 Training loss: 1.6211 0.0689 
sec/batch\n", + "Epoch 17/20 Iteration 2933/3560 Training loss: 1.6208 0.0694 sec/batch\n", + "Epoch 17/20 Iteration 2934/3560 Training loss: 1.6206 0.0657 sec/batch\n", + "Epoch 17/20 Iteration 2935/3560 Training loss: 1.6204 0.0729 sec/batch\n", + "Epoch 17/20 Iteration 2936/3560 Training loss: 1.6201 0.0716 sec/batch\n", + "Epoch 17/20 Iteration 2937/3560 Training loss: 1.6197 0.0671 sec/batch\n", + "Epoch 17/20 Iteration 2938/3560 Training loss: 1.6198 0.0710 sec/batch\n", + "Epoch 17/20 Iteration 2939/3560 Training loss: 1.6195 0.0700 sec/batch\n", + "Epoch 17/20 Iteration 2940/3560 Training loss: 1.6194 0.0711 sec/batch\n", + "Epoch 17/20 Iteration 2941/3560 Training loss: 1.6190 0.0706 sec/batch\n", + "Epoch 17/20 Iteration 2942/3560 Training loss: 1.6186 0.0677 sec/batch\n", + "Epoch 17/20 Iteration 2943/3560 Training loss: 1.6183 0.0677 sec/batch\n", + "Epoch 17/20 Iteration 2944/3560 Training loss: 1.6183 0.0665 sec/batch\n", + "Epoch 17/20 Iteration 2945/3560 Training loss: 1.6182 0.0767 sec/batch\n", + "Epoch 17/20 Iteration 2946/3560 Training loss: 1.6177 0.0652 sec/batch\n", + "Epoch 17/20 Iteration 2947/3560 Training loss: 1.6173 0.0639 sec/batch\n", + "Epoch 17/20 Iteration 2948/3560 Training loss: 1.6168 0.0678 sec/batch\n", + "Epoch 17/20 Iteration 2949/3560 Training loss: 1.6167 0.0654 sec/batch\n", + "Epoch 17/20 Iteration 2950/3560 Training loss: 1.6166 0.0698 sec/batch\n", + "Epoch 17/20 Iteration 2951/3560 Training loss: 1.6163 0.0678 sec/batch\n", + "Epoch 17/20 Iteration 2952/3560 Training loss: 1.6162 0.0705 sec/batch\n", + "Epoch 17/20 Iteration 2953/3560 Training loss: 1.6159 0.0682 sec/batch\n", + "Epoch 17/20 Iteration 2954/3560 Training loss: 1.6159 0.0799 sec/batch\n", + "Epoch 17/20 Iteration 2955/3560 Training loss: 1.6158 0.0660 sec/batch\n", + "Epoch 17/20 Iteration 2956/3560 Training loss: 1.6157 0.0659 sec/batch\n", + "Epoch 17/20 Iteration 2957/3560 Training loss: 1.6157 0.0676 sec/batch\n", + "Epoch 17/20 Iteration 2958/3560 
Training loss: 1.6158 0.0676 sec/batch\n", + "Epoch 17/20 Iteration 2959/3560 Training loss: 1.6157 0.0680 sec/batch\n", + "Epoch 17/20 Iteration 2960/3560 Training loss: 1.6156 0.0766 sec/batch\n", + "Epoch 17/20 Iteration 2961/3560 Training loss: 1.6154 0.0697 sec/batch\n", + "Epoch 17/20 Iteration 2962/3560 Training loss: 1.6152 0.0673 sec/batch\n", + "Epoch 17/20 Iteration 2963/3560 Training loss: 1.6149 0.0706 sec/batch\n", + "Epoch 17/20 Iteration 2964/3560 Training loss: 1.6145 0.0725 sec/batch\n", + "Epoch 17/20 Iteration 2965/3560 Training loss: 1.6144 0.0650 sec/batch\n", + "Epoch 17/20 Iteration 2966/3560 Training loss: 1.6145 0.0735 sec/batch\n", + "Epoch 17/20 Iteration 2967/3560 Training loss: 1.6144 0.0647 sec/batch\n", + "Epoch 17/20 Iteration 2968/3560 Training loss: 1.6143 0.0742 sec/batch\n", + "Epoch 17/20 Iteration 2969/3560 Training loss: 1.6142 0.0684 sec/batch\n", + "Epoch 17/20 Iteration 2970/3560 Training loss: 1.6137 0.0766 sec/batch\n", + "Epoch 17/20 Iteration 2971/3560 Training loss: 1.6133 0.0642 sec/batch\n", + "Epoch 17/20 Iteration 2972/3560 Training loss: 1.6134 0.0747 sec/batch\n", + "Epoch 17/20 Iteration 2973/3560 Training loss: 1.6134 0.0683 sec/batch\n", + "Epoch 17/20 Iteration 2974/3560 Training loss: 1.6130 0.0696 sec/batch\n", + "Epoch 17/20 Iteration 2975/3560 Training loss: 1.6131 0.0651 sec/batch\n", + "Epoch 17/20 Iteration 2976/3560 Training loss: 1.6132 0.0669 sec/batch\n", + "Epoch 17/20 Iteration 2977/3560 Training loss: 1.6131 0.0775 sec/batch\n", + "Epoch 17/20 Iteration 2978/3560 Training loss: 1.6129 0.0685 sec/batch\n", + "Epoch 17/20 Iteration 2979/3560 Training loss: 1.6126 0.0670 sec/batch\n", + "Epoch 17/20 Iteration 2980/3560 Training loss: 1.6123 0.0754 sec/batch\n", + "Epoch 17/20 Iteration 2981/3560 Training loss: 1.6123 0.0708 sec/batch\n", + "Epoch 17/20 Iteration 2982/3560 Training loss: 1.6123 0.0716 sec/batch\n", + "Epoch 17/20 Iteration 2983/3560 Training loss: 1.6123 0.0714 sec/batch\n", + 
"Epoch 17/20 Iteration 2984/3560 Training loss: 1.6123 0.0678 sec/batch\n", + "Epoch 17/20 Iteration 2985/3560 Training loss: 1.6125 0.0693 sec/batch\n", + "Epoch 17/20 Iteration 2986/3560 Training loss: 1.6125 0.0713 sec/batch\n", + "Epoch 17/20 Iteration 2987/3560 Training loss: 1.6126 0.0665 sec/batch\n", + "Epoch 17/20 Iteration 2988/3560 Training loss: 1.6125 0.0672 sec/batch\n", + "Epoch 17/20 Iteration 2989/3560 Training loss: 1.6129 0.0719 sec/batch\n", + "Epoch 17/20 Iteration 2990/3560 Training loss: 1.6127 0.0680 sec/batch\n", + "Epoch 17/20 Iteration 2991/3560 Training loss: 1.6126 0.0710 sec/batch\n", + "Epoch 17/20 Iteration 2992/3560 Training loss: 1.6126 0.0710 sec/batch\n", + "Epoch 17/20 Iteration 2993/3560 Training loss: 1.6125 0.0676 sec/batch\n", + "Epoch 17/20 Iteration 2994/3560 Training loss: 1.6126 0.0697 sec/batch\n", + "Epoch 17/20 Iteration 2995/3560 Training loss: 1.6126 0.0732 sec/batch\n", + "Epoch 17/20 Iteration 2996/3560 Training loss: 1.6128 0.0669 sec/batch\n", + "Epoch 17/20 Iteration 2997/3560 Training loss: 1.6129 0.0668 sec/batch\n", + "Epoch 17/20 Iteration 2998/3560 Training loss: 1.6128 0.0665 sec/batch\n", + "Epoch 17/20 Iteration 2999/3560 Training loss: 1.6125 0.0657 sec/batch\n", + "Epoch 17/20 Iteration 3000/3560 Training loss: 1.6126 0.0770 sec/batch\n", + "Epoch 17/20 Iteration 3001/3560 Training loss: 1.6126 0.0651 sec/batch\n", + "Epoch 17/20 Iteration 3002/3560 Training loss: 1.6126 0.0677 sec/batch\n", + "Epoch 17/20 Iteration 3003/3560 Training loss: 1.6126 0.0691 sec/batch\n", + "Epoch 17/20 Iteration 3004/3560 Training loss: 1.6125 0.0698 sec/batch\n", + "Epoch 17/20 Iteration 3005/3560 Training loss: 1.6126 0.0691 sec/batch\n", + "Epoch 17/20 Iteration 3006/3560 Training loss: 1.6125 0.0754 sec/batch\n", + "Epoch 17/20 Iteration 3007/3560 Training loss: 1.6123 0.0692 sec/batch\n", + "Epoch 17/20 Iteration 3008/3560 Training loss: 1.6124 0.0735 sec/batch\n", + "Epoch 17/20 Iteration 3009/3560 Training loss: 
1.6126 0.0665 sec/batch\n", + "Epoch 17/20 Iteration 3010/3560 Training loss: 1.6126 0.0673 sec/batch\n", + "Epoch 17/20 Iteration 3011/3560 Training loss: 1.6127 0.0669 sec/batch\n", + "Epoch 17/20 Iteration 3012/3560 Training loss: 1.6127 0.0668 sec/batch\n", + "Epoch 17/20 Iteration 3013/3560 Training loss: 1.6127 0.0654 sec/batch\n", + "Epoch 17/20 Iteration 3014/3560 Training loss: 1.6127 0.0690 sec/batch\n", + "Epoch 17/20 Iteration 3015/3560 Training loss: 1.6127 0.0680 sec/batch\n", + "Epoch 17/20 Iteration 3016/3560 Training loss: 1.6131 0.0706 sec/batch\n", + "Epoch 17/20 Iteration 3017/3560 Training loss: 1.6130 0.0671 sec/batch\n", + "Epoch 17/20 Iteration 3018/3560 Training loss: 1.6130 0.0694 sec/batch\n", + "Epoch 17/20 Iteration 3019/3560 Training loss: 1.6128 0.0738 sec/batch\n", + "Epoch 17/20 Iteration 3020/3560 Training loss: 1.6127 0.0721 sec/batch\n", + "Epoch 17/20 Iteration 3021/3560 Training loss: 1.6128 0.0720 sec/batch\n", + "Epoch 17/20 Iteration 3022/3560 Training loss: 1.6128 0.0646 sec/batch\n", + "Epoch 17/20 Iteration 3023/3560 Training loss: 1.6128 0.0698 sec/batch\n", + "Epoch 17/20 Iteration 3024/3560 Training loss: 1.6127 0.0646 sec/batch\n", + "Epoch 17/20 Iteration 3025/3560 Training loss: 1.6125 0.0651 sec/batch\n", + "Epoch 17/20 Iteration 3026/3560 Training loss: 1.6126 0.0705 sec/batch\n", + "Epoch 18/20 Iteration 3027/3560 Training loss: 1.7301 0.0671 sec/batch\n", + "Epoch 18/20 Iteration 3028/3560 Training loss: 1.6809 0.0744 sec/batch\n", + "Epoch 18/20 Iteration 3029/3560 Training loss: 1.6635 0.0720 sec/batch\n", + "Epoch 18/20 Iteration 3030/3560 Training loss: 1.6531 0.0744 sec/batch\n", + "Epoch 18/20 Iteration 3031/3560 Training loss: 1.6464 0.0704 sec/batch\n", + "Epoch 18/20 Iteration 3032/3560 Training loss: 1.6325 0.0778 sec/batch\n", + "Epoch 18/20 Iteration 3033/3560 Training loss: 1.6309 0.0702 sec/batch\n", + "Epoch 18/20 Iteration 3034/3560 Training loss: 1.6286 0.0720 sec/batch\n", + "Epoch 18/20 
Iteration 3035/3560 Training loss: 1.6291 0.0699 sec/batch\n", + "Epoch 18/20 Iteration 3036/3560 Training loss: 1.6258 0.0687 sec/batch\n", + "Epoch 18/20 Iteration 3037/3560 Training loss: 1.6228 0.0706 sec/batch\n", + "Epoch 18/20 Iteration 3038/3560 Training loss: 1.6217 0.0688 sec/batch\n", + "Epoch 18/20 Iteration 3039/3560 Training loss: 1.6203 0.0737 sec/batch\n", + "Epoch 18/20 Iteration 3040/3560 Training loss: 1.6213 0.0684 sec/batch\n", + "Epoch 18/20 Iteration 3041/3560 Training loss: 1.6198 0.0728 sec/batch\n", + "Epoch 18/20 Iteration 3042/3560 Training loss: 1.6179 0.0722 sec/batch\n", + "Epoch 18/20 Iteration 3043/3560 Training loss: 1.6178 0.0689 sec/batch\n", + "Epoch 18/20 Iteration 3044/3560 Training loss: 1.6190 0.0716 sec/batch\n", + "Epoch 18/20 Iteration 3045/3560 Training loss: 1.6185 0.0650 sec/batch\n", + "Epoch 18/20 Iteration 3046/3560 Training loss: 1.6190 0.0709 sec/batch\n", + "Epoch 18/20 Iteration 3047/3560 Training loss: 1.6179 0.0672 sec/batch\n", + "Epoch 18/20 Iteration 3048/3560 Training loss: 1.6183 0.0695 sec/batch\n", + "Epoch 18/20 Iteration 3049/3560 Training loss: 1.6175 0.0666 sec/batch\n", + "Epoch 18/20 Iteration 3050/3560 Training loss: 1.6168 0.0675 sec/batch\n", + "Epoch 18/20 Iteration 3051/3560 Training loss: 1.6162 0.0703 sec/batch\n", + "Epoch 18/20 Iteration 3052/3560 Training loss: 1.6142 0.0737 sec/batch\n", + "Epoch 18/20 Iteration 3053/3560 Training loss: 1.6127 0.0919 sec/batch\n", + "Epoch 18/20 Iteration 3054/3560 Training loss: 1.6128 0.0678 sec/batch\n", + "Epoch 18/20 Iteration 3055/3560 Training loss: 1.6136 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3056/3560 Training loss: 1.6135 0.0718 sec/batch\n", + "Epoch 18/20 Iteration 3057/3560 Training loss: 1.6129 0.0689 sec/batch\n", + "Epoch 18/20 Iteration 3058/3560 Training loss: 1.6116 0.0765 sec/batch\n", + "Epoch 18/20 Iteration 3059/3560 Training loss: 1.6117 0.0647 sec/batch\n", + "Epoch 18/20 Iteration 3060/3560 Training loss: 1.6119 0.0704 
sec/batch\n", + "Epoch 18/20 Iteration 3061/3560 Training loss: 1.6116 0.0701 sec/batch\n", + "Epoch 18/20 Iteration 3062/3560 Training loss: 1.6113 0.0722 sec/batch\n", + "Epoch 18/20 Iteration 3063/3560 Training loss: 1.6107 0.0689 sec/batch\n", + "Epoch 18/20 Iteration 3064/3560 Training loss: 1.6096 0.0719 sec/batch\n", + "Epoch 18/20 Iteration 3065/3560 Training loss: 1.6080 0.0658 sec/batch\n", + "Epoch 18/20 Iteration 3066/3560 Training loss: 1.6069 0.0670 sec/batch\n", + "Epoch 18/20 Iteration 3067/3560 Training loss: 1.6062 0.0677 sec/batch\n", + "Epoch 18/20 Iteration 3068/3560 Training loss: 1.6063 0.0683 sec/batch\n", + "Epoch 18/20 Iteration 3069/3560 Training loss: 1.6057 0.0692 sec/batch\n", + "Epoch 18/20 Iteration 3070/3560 Training loss: 1.6048 0.0745 sec/batch\n", + "Epoch 18/20 Iteration 3071/3560 Training loss: 1.6049 0.0765 sec/batch\n", + "Epoch 18/20 Iteration 3072/3560 Training loss: 1.6038 0.0654 sec/batch\n", + "Epoch 18/20 Iteration 3073/3560 Training loss: 1.6035 0.0687 sec/batch\n", + "Epoch 18/20 Iteration 3074/3560 Training loss: 1.6032 0.0716 sec/batch\n", + "Epoch 18/20 Iteration 3075/3560 Training loss: 1.6029 0.0698 sec/batch\n", + "Epoch 18/20 Iteration 3076/3560 Training loss: 1.6035 0.0743 sec/batch\n", + "Epoch 18/20 Iteration 3077/3560 Training loss: 1.6027 0.0704 sec/batch\n", + "Epoch 18/20 Iteration 3078/3560 Training loss: 1.6035 0.0786 sec/batch\n", + "Epoch 18/20 Iteration 3079/3560 Training loss: 1.6032 0.0718 sec/batch\n", + "Epoch 18/20 Iteration 3080/3560 Training loss: 1.6031 0.0704 sec/batch\n", + "Epoch 18/20 Iteration 3081/3560 Training loss: 1.6027 0.0747 sec/batch\n", + "Epoch 18/20 Iteration 3082/3560 Training loss: 1.6026 0.0706 sec/batch\n", + "Epoch 18/20 Iteration 3083/3560 Training loss: 1.6030 0.0695 sec/batch\n", + "Epoch 18/20 Iteration 3084/3560 Training loss: 1.6026 0.0681 sec/batch\n", + "Epoch 18/20 Iteration 3085/3560 Training loss: 1.6021 0.0686 sec/batch\n", + "Epoch 18/20 Iteration 3086/3560 
Training loss: 1.6025 0.0646 sec/batch\n", + "Epoch 18/20 Iteration 3087/3560 Training loss: 1.6022 0.0699 sec/batch\n", + "Epoch 18/20 Iteration 3088/3560 Training loss: 1.6031 0.0695 sec/batch\n", + "Epoch 18/20 Iteration 3089/3560 Training loss: 1.6035 0.0794 sec/batch\n", + "Epoch 18/20 Iteration 3090/3560 Training loss: 1.6037 0.0743 sec/batch\n", + "Epoch 18/20 Iteration 3091/3560 Training loss: 1.6036 0.0759 sec/batch\n", + "Epoch 18/20 Iteration 3092/3560 Training loss: 1.6037 0.0740 sec/batch\n", + "Epoch 18/20 Iteration 3093/3560 Training loss: 1.6037 0.0699 sec/batch\n", + "Epoch 18/20 Iteration 3094/3560 Training loss: 1.6032 0.0684 sec/batch\n", + "Epoch 18/20 Iteration 3095/3560 Training loss: 1.6034 0.0687 sec/batch\n", + "Epoch 18/20 Iteration 3096/3560 Training loss: 1.6031 0.0690 sec/batch\n", + "Epoch 18/20 Iteration 3097/3560 Training loss: 1.6036 0.0680 sec/batch\n", + "Epoch 18/20 Iteration 3098/3560 Training loss: 1.6038 0.0793 sec/batch\n", + "Epoch 18/20 Iteration 3099/3560 Training loss: 1.6043 0.0685 sec/batch\n", + "Epoch 18/20 Iteration 3100/3560 Training loss: 1.6041 0.0750 sec/batch\n", + "Epoch 18/20 Iteration 3101/3560 Training loss: 1.6040 0.0704 sec/batch\n", + "Epoch 18/20 Iteration 3102/3560 Training loss: 1.6041 0.0731 sec/batch\n", + "Epoch 18/20 Iteration 3103/3560 Training loss: 1.6039 0.0741 sec/batch\n", + "Epoch 18/20 Iteration 3104/3560 Training loss: 1.6041 0.0729 sec/batch\n", + "Epoch 18/20 Iteration 3105/3560 Training loss: 1.6036 0.0684 sec/batch\n", + "Epoch 18/20 Iteration 3106/3560 Training loss: 1.6033 0.0670 sec/batch\n", + "Epoch 18/20 Iteration 3107/3560 Training loss: 1.6027 0.0657 sec/batch\n", + "Epoch 18/20 Iteration 3108/3560 Training loss: 1.6027 0.0650 sec/batch\n", + "Epoch 18/20 Iteration 3109/3560 Training loss: 1.6021 0.0664 sec/batch\n", + "Epoch 18/20 Iteration 3110/3560 Training loss: 1.6020 0.0689 sec/batch\n", + "Epoch 18/20 Iteration 3111/3560 Training loss: 1.6017 0.0642 sec/batch\n", + 
"Epoch 18/20 Iteration 3112/3560 Training loss: 1.6014 0.0658 sec/batch\n", + "Epoch 18/20 Iteration 3113/3560 Training loss: 1.6012 0.0665 sec/batch\n", + "Epoch 18/20 Iteration 3114/3560 Training loss: 1.6009 0.0667 sec/batch\n", + "Epoch 18/20 Iteration 3115/3560 Training loss: 1.6004 0.0729 sec/batch\n", + "Epoch 18/20 Iteration 3116/3560 Training loss: 1.6004 0.0733 sec/batch\n", + "Epoch 18/20 Iteration 3117/3560 Training loss: 1.6001 0.0685 sec/batch\n", + "Epoch 18/20 Iteration 3118/3560 Training loss: 1.5998 0.0686 sec/batch\n", + "Epoch 18/20 Iteration 3119/3560 Training loss: 1.5994 0.0686 sec/batch\n", + "Epoch 18/20 Iteration 3120/3560 Training loss: 1.5991 0.0711 sec/batch\n", + "Epoch 18/20 Iteration 3121/3560 Training loss: 1.5988 0.0680 sec/batch\n", + "Epoch 18/20 Iteration 3122/3560 Training loss: 1.5988 0.0686 sec/batch\n", + "Epoch 18/20 Iteration 3123/3560 Training loss: 1.5986 0.0645 sec/batch\n", + "Epoch 18/20 Iteration 3124/3560 Training loss: 1.5981 0.0672 sec/batch\n", + "Epoch 18/20 Iteration 3125/3560 Training loss: 1.5977 0.0712 sec/batch\n", + "Epoch 18/20 Iteration 3126/3560 Training loss: 1.5974 0.0702 sec/batch\n", + "Epoch 18/20 Iteration 3127/3560 Training loss: 1.5973 0.0680 sec/batch\n", + "Epoch 18/20 Iteration 3128/3560 Training loss: 1.5972 0.0664 sec/batch\n", + "Epoch 18/20 Iteration 3129/3560 Training loss: 1.5971 0.0691 sec/batch\n", + "Epoch 18/20 Iteration 3130/3560 Training loss: 1.5969 0.0726 sec/batch\n", + "Epoch 18/20 Iteration 3131/3560 Training loss: 1.5966 0.0642 sec/batch\n", + "Epoch 18/20 Iteration 3132/3560 Training loss: 1.5965 0.0710 sec/batch\n", + "Epoch 18/20 Iteration 3133/3560 Training loss: 1.5964 0.0645 sec/batch\n", + "Epoch 18/20 Iteration 3134/3560 Training loss: 1.5963 0.0666 sec/batch\n", + "Epoch 18/20 Iteration 3135/3560 Training loss: 1.5963 0.0685 sec/batch\n", + "Epoch 18/20 Iteration 3136/3560 Training loss: 1.5963 0.0691 sec/batch\n", + "Epoch 18/20 Iteration 3137/3560 Training loss: 
1.5961 0.0703 sec/batch\n", + "Epoch 18/20 Iteration 3138/3560 Training loss: 1.5960 0.0724 sec/batch\n", + "Epoch 18/20 Iteration 3139/3560 Training loss: 1.5958 0.0734 sec/batch\n", + "Epoch 18/20 Iteration 3140/3560 Training loss: 1.5956 0.0670 sec/batch\n", + "Epoch 18/20 Iteration 3141/3560 Training loss: 1.5953 0.0700 sec/batch\n", + "Epoch 18/20 Iteration 3142/3560 Training loss: 1.5950 0.0666 sec/batch\n", + "Epoch 18/20 Iteration 3143/3560 Training loss: 1.5949 0.0681 sec/batch\n", + "Epoch 18/20 Iteration 3144/3560 Training loss: 1.5949 0.0667 sec/batch\n", + "Epoch 18/20 Iteration 3145/3560 Training loss: 1.5948 0.0718 sec/batch\n", + "Epoch 18/20 Iteration 3146/3560 Training loss: 1.5947 0.0695 sec/batch\n", + "Epoch 18/20 Iteration 3147/3560 Training loss: 1.5946 0.0745 sec/batch\n", + "Epoch 18/20 Iteration 3148/3560 Training loss: 1.5941 0.0740 sec/batch\n", + "Epoch 18/20 Iteration 3149/3560 Training loss: 1.5938 0.0718 sec/batch\n", + "Epoch 18/20 Iteration 3150/3560 Training loss: 1.5939 0.0799 sec/batch\n", + "Epoch 18/20 Iteration 3151/3560 Training loss: 1.5938 0.0699 sec/batch\n", + "Epoch 18/20 Iteration 3152/3560 Training loss: 1.5934 0.0745 sec/batch\n", + "Epoch 18/20 Iteration 3153/3560 Training loss: 1.5935 0.0749 sec/batch\n", + "Epoch 18/20 Iteration 3154/3560 Training loss: 1.5935 0.0688 sec/batch\n", + "Epoch 18/20 Iteration 3155/3560 Training loss: 1.5934 0.0668 sec/batch\n", + "Epoch 18/20 Iteration 3156/3560 Training loss: 1.5932 0.0663 sec/batch\n", + "Epoch 18/20 Iteration 3157/3560 Training loss: 1.5928 0.0764 sec/batch\n", + "Epoch 18/20 Iteration 3158/3560 Training loss: 1.5925 0.0658 sec/batch\n", + "Epoch 18/20 Iteration 3159/3560 Training loss: 1.5925 0.0687 sec/batch\n", + "Epoch 18/20 Iteration 3160/3560 Training loss: 1.5925 0.0679 sec/batch\n", + "Epoch 18/20 Iteration 3161/3560 Training loss: 1.5925 0.0771 sec/batch\n", + "Epoch 18/20 Iteration 3162/3560 Training loss: 1.5925 0.0687 sec/batch\n", + "Epoch 18/20 
Iteration 3163/3560 Training loss: 1.5927 0.0651 sec/batch\n",
+    "...\n",
+    "Epoch 18/20 Iteration 3204/3560 Training loss: 1.5932 0.0660 sec/batch\n",
+    "Epoch 19/20 Iteration 3205/3560 Training loss: 1.7181 0.0672 sec/batch\n",
+    "...\n",
+    "Epoch 19/20 Iteration 3382/3560 Training loss: 1.5762 0.0645 sec/batch\n",
+    "Epoch 20/20 Iteration 3383/3560 Training loss: 1.7109 0.0647 sec/batch\n",
+    "...\n",
+    "Epoch 20/20 Iteration 3560/3560 Training loss: 1.5588 0.0649 sec/batch\n",
+    "Epoch 1/20 Iteration 1/3560 Training loss: 4.4219 0.6408 sec/batch\n",
+    "Epoch 1/20 Iteration 2/3560 Training loss: 4.3707 0.0591 sec/batch\n",
+    "...\n",
+    "Epoch 1/20 Iteration 118/3560 Training loss: 2.9300 0.0575 sec/batch\n",
+    "Epoch 1/20 Iteration 119/3560 Training
loss: 2.9261 0.0620 sec/batch\n", + "Epoch 1/20 Iteration 120/3560 Training loss: 2.9219 0.0624 sec/batch\n", + "Epoch 1/20 Iteration 121/3560 Training loss: 2.9179 0.0561 sec/batch\n", + "Epoch 1/20 Iteration 122/3560 Training loss: 2.9138 0.0574 sec/batch\n", + "Epoch 1/20 Iteration 123/3560 Training loss: 2.9096 0.0627 sec/batch\n", + "Epoch 1/20 Iteration 124/3560 Training loss: 2.9056 0.0565 sec/batch\n", + "Epoch 1/20 Iteration 125/3560 Training loss: 2.9015 0.0567 sec/batch\n", + "Epoch 1/20 Iteration 126/3560 Training loss: 2.8973 0.0583 sec/batch\n", + "Epoch 1/20 Iteration 127/3560 Training loss: 2.8934 0.0565 sec/batch\n", + "Epoch 1/20 Iteration 128/3560 Training loss: 2.8895 0.0578 sec/batch\n", + "Epoch 1/20 Iteration 129/3560 Training loss: 2.8855 0.0653 sec/batch\n", + "Epoch 1/20 Iteration 130/3560 Training loss: 2.8816 0.0577 sec/batch\n", + "Epoch 1/20 Iteration 131/3560 Training loss: 2.8777 0.0562 sec/batch\n", + "Epoch 1/20 Iteration 132/3560 Training loss: 2.8736 0.0655 sec/batch\n", + "Epoch 1/20 Iteration 133/3560 Training loss: 2.8698 0.0597 sec/batch\n", + "Epoch 1/20 Iteration 134/3560 Training loss: 2.8660 0.0605 sec/batch\n", + "Epoch 1/20 Iteration 135/3560 Training loss: 2.8621 0.0615 sec/batch\n", + "Epoch 1/20 Iteration 136/3560 Training loss: 2.8583 0.0592 sec/batch\n", + "Epoch 1/20 Iteration 137/3560 Training loss: 2.8546 0.0609 sec/batch\n", + "Epoch 1/20 Iteration 138/3560 Training loss: 2.8508 0.0571 sec/batch\n", + "Epoch 1/20 Iteration 139/3560 Training loss: 2.8473 0.0626 sec/batch\n", + "Epoch 1/20 Iteration 140/3560 Training loss: 2.8435 0.0589 sec/batch\n", + "Epoch 1/20 Iteration 141/3560 Training loss: 2.8401 0.0584 sec/batch\n", + "Epoch 1/20 Iteration 142/3560 Training loss: 2.8363 0.0633 sec/batch\n", + "Epoch 1/20 Iteration 143/3560 Training loss: 2.8328 0.0582 sec/batch\n", + "Epoch 1/20 Iteration 144/3560 Training loss: 2.8292 0.0599 sec/batch\n", + "Epoch 1/20 Iteration 145/3560 Training loss: 2.8256 0.0615 
sec/batch\n", + "Epoch 1/20 Iteration 146/3560 Training loss: 2.8223 0.0622 sec/batch\n", + "Epoch 1/20 Iteration 147/3560 Training loss: 2.8188 0.0588 sec/batch\n", + "Epoch 1/20 Iteration 148/3560 Training loss: 2.8155 0.0585 sec/batch\n", + "Epoch 1/20 Iteration 149/3560 Training loss: 2.8120 0.0659 sec/batch\n", + "Epoch 1/20 Iteration 150/3560 Training loss: 2.8084 0.0611 sec/batch\n", + "Epoch 1/20 Iteration 151/3560 Training loss: 2.8051 0.0570 sec/batch\n", + "Epoch 1/20 Iteration 152/3560 Training loss: 2.8020 0.0571 sec/batch\n", + "Epoch 1/20 Iteration 153/3560 Training loss: 2.7988 0.0584 sec/batch\n", + "Epoch 1/20 Iteration 154/3560 Training loss: 2.7956 0.0595 sec/batch\n", + "Epoch 1/20 Iteration 155/3560 Training loss: 2.7923 0.0613 sec/batch\n", + "Epoch 1/20 Iteration 156/3560 Training loss: 2.7890 0.0616 sec/batch\n", + "Epoch 1/20 Iteration 157/3560 Training loss: 2.7857 0.0650 sec/batch\n", + "Epoch 1/20 Iteration 158/3560 Training loss: 2.7824 0.0604 sec/batch\n", + "Epoch 1/20 Iteration 159/3560 Training loss: 2.7790 0.0635 sec/batch\n", + "Epoch 1/20 Iteration 160/3560 Training loss: 2.7761 0.0622 sec/batch\n", + "Epoch 1/20 Iteration 161/3560 Training loss: 2.7730 0.0577 sec/batch\n", + "Epoch 1/20 Iteration 162/3560 Training loss: 2.7697 0.0690 sec/batch\n", + "Epoch 1/20 Iteration 163/3560 Training loss: 2.7665 0.0581 sec/batch\n", + "Epoch 1/20 Iteration 164/3560 Training loss: 2.7634 0.0589 sec/batch\n", + "Epoch 1/20 Iteration 165/3560 Training loss: 2.7604 0.0601 sec/batch\n", + "Epoch 1/20 Iteration 166/3560 Training loss: 2.7574 0.0592 sec/batch\n", + "Epoch 1/20 Iteration 167/3560 Training loss: 2.7544 0.0608 sec/batch\n", + "Epoch 1/20 Iteration 168/3560 Training loss: 2.7527 0.0616 sec/batch\n", + "Epoch 1/20 Iteration 169/3560 Training loss: 2.7512 0.0617 sec/batch\n", + "Epoch 1/20 Iteration 170/3560 Training loss: 2.7494 0.0618 sec/batch\n", + "Epoch 1/20 Iteration 171/3560 Training loss: 2.7472 0.0591 sec/batch\n", + "Epoch 
1/20 Iteration 172/3560 Training loss: 2.7446 0.0640 sec/batch\n", + "Epoch 1/20 Iteration 173/3560 Training loss: 2.7423 0.0686 sec/batch\n", + "Epoch 1/20 Iteration 174/3560 Training loss: 2.7398 0.0596 sec/batch\n", + "Epoch 1/20 Iteration 175/3560 Training loss: 2.7373 0.0601 sec/batch\n", + "Epoch 1/20 Iteration 176/3560 Training loss: 2.7351 0.0605 sec/batch\n", + "Epoch 1/20 Iteration 177/3560 Training loss: 2.7326 0.0585 sec/batch\n", + "Epoch 1/20 Iteration 178/3560 Training loss: 2.7299 0.0597 sec/batch\n", + "Epoch 2/20 Iteration 179/3560 Training loss: 2.3070 0.0583 sec/batch\n", + "Epoch 2/20 Iteration 180/3560 Training loss: 2.2615 0.0603 sec/batch\n", + "Epoch 2/20 Iteration 181/3560 Training loss: 2.2495 0.0611 sec/batch\n", + "Epoch 2/20 Iteration 182/3560 Training loss: 2.2465 0.0575 sec/batch\n", + "Epoch 2/20 Iteration 183/3560 Training loss: 2.2438 0.0642 sec/batch\n", + "Epoch 2/20 Iteration 184/3560 Training loss: 2.2409 0.0634 sec/batch\n", + "Epoch 2/20 Iteration 185/3560 Training loss: 2.2400 0.0722 sec/batch\n", + "Epoch 2/20 Iteration 186/3560 Training loss: 2.2408 0.0615 sec/batch\n", + "Epoch 2/20 Iteration 187/3560 Training loss: 2.2424 0.0575 sec/batch\n", + "Epoch 2/20 Iteration 188/3560 Training loss: 2.2411 0.0678 sec/batch\n", + "Epoch 2/20 Iteration 189/3560 Training loss: 2.2384 0.0617 sec/batch\n", + "Epoch 2/20 Iteration 190/3560 Training loss: 2.2369 0.0678 sec/batch\n", + "Epoch 2/20 Iteration 191/3560 Training loss: 2.2367 0.0591 sec/batch\n", + "Epoch 2/20 Iteration 192/3560 Training loss: 2.2377 0.0630 sec/batch\n", + "Epoch 2/20 Iteration 193/3560 Training loss: 2.2360 0.0614 sec/batch\n", + "Epoch 2/20 Iteration 194/3560 Training loss: 2.2345 0.0567 sec/batch\n", + "Epoch 2/20 Iteration 195/3560 Training loss: 2.2329 0.0602 sec/batch\n", + "Epoch 2/20 Iteration 196/3560 Training loss: 2.2334 0.0662 sec/batch\n", + "Epoch 2/20 Iteration 197/3560 Training loss: 2.2325 0.0696 sec/batch\n", + "Epoch 2/20 Iteration 198/3560 
Training loss: 2.2303 0.0606 sec/batch\n", + "Epoch 2/20 Iteration 199/3560 Training loss: 2.2280 0.0564 sec/batch\n", + "Epoch 2/20 Iteration 200/3560 Training loss: 2.2285 0.0655 sec/batch\n", + "Epoch 2/20 Iteration 201/3560 Training loss: 2.2269 0.0592 sec/batch\n", + "Epoch 2/20 Iteration 202/3560 Training loss: 2.2249 0.0580 sec/batch\n", + "Epoch 2/20 Iteration 203/3560 Training loss: 2.2236 0.0588 sec/batch\n", + "Epoch 2/20 Iteration 204/3560 Training loss: 2.2218 0.0665 sec/batch\n", + "Epoch 2/20 Iteration 205/3560 Training loss: 2.2201 0.0591 sec/batch\n", + "Epoch 2/20 Iteration 206/3560 Training loss: 2.2191 0.0669 sec/batch\n", + "Epoch 2/20 Iteration 207/3560 Training loss: 2.2188 0.0725 sec/batch\n", + "Epoch 2/20 Iteration 208/3560 Training loss: 2.2177 0.0706 sec/batch\n", + "Epoch 2/20 Iteration 209/3560 Training loss: 2.2167 0.0592 sec/batch\n", + "Epoch 2/20 Iteration 210/3560 Training loss: 2.2150 0.0569 sec/batch\n", + "Epoch 2/20 Iteration 211/3560 Training loss: 2.2134 0.0659 sec/batch\n", + "Epoch 2/20 Iteration 212/3560 Training loss: 2.2128 0.0621 sec/batch\n", + "Epoch 2/20 Iteration 213/3560 Training loss: 2.2112 0.0618 sec/batch\n", + "Epoch 2/20 Iteration 214/3560 Training loss: 2.2101 0.0602 sec/batch\n", + "Epoch 2/20 Iteration 215/3560 Training loss: 2.2087 0.0591 sec/batch\n", + "Epoch 2/20 Iteration 216/3560 Training loss: 2.2065 0.0629 sec/batch\n", + "Epoch 2/20 Iteration 217/3560 Training loss: 2.2046 0.0614 sec/batch\n", + "Epoch 2/20 Iteration 218/3560 Training loss: 2.2029 0.0586 sec/batch\n", + "Epoch 2/20 Iteration 219/3560 Training loss: 2.2013 0.0620 sec/batch\n", + "Epoch 2/20 Iteration 220/3560 Training loss: 2.2001 0.0586 sec/batch\n", + "Epoch 2/20 Iteration 221/3560 Training loss: 2.1985 0.0582 sec/batch\n", + "Epoch 2/20 Iteration 222/3560 Training loss: 2.1966 0.0651 sec/batch\n", + "Epoch 2/20 Iteration 223/3560 Training loss: 2.1953 0.0590 sec/batch\n", + "Epoch 2/20 Iteration 224/3560 Training loss: 2.1930 
0.0616 sec/batch\n", + "Epoch 2/20 Iteration 225/3560 Training loss: 2.1920 0.0622 sec/batch\n", + "Epoch 2/20 Iteration 226/3560 Training loss: 2.1905 0.0574 sec/batch\n", + "Epoch 2/20 Iteration 227/3560 Training loss: 2.1893 0.0632 sec/batch\n", + "Epoch 2/20 Iteration 228/3560 Training loss: 2.1888 0.0595 sec/batch\n", + "Epoch 2/20 Iteration 229/3560 Training loss: 2.1872 0.0588 sec/batch\n", + "Epoch 2/20 Iteration 230/3560 Training loss: 2.1867 0.0662 sec/batch\n", + "Epoch 2/20 Iteration 231/3560 Training loss: 2.1852 0.0601 sec/batch\n", + "Epoch 2/20 Iteration 232/3560 Training loss: 2.1839 0.0622 sec/batch\n", + "Epoch 2/20 Iteration 233/3560 Training loss: 2.1826 0.0656 sec/batch\n", + "Epoch 2/20 Iteration 234/3560 Training loss: 2.1818 0.0658 sec/batch\n", + "Epoch 2/20 Iteration 235/3560 Training loss: 2.1809 0.0638 sec/batch\n", + "Epoch 2/20 Iteration 236/3560 Training loss: 2.1796 0.0615 sec/batch\n", + "Epoch 2/20 Iteration 237/3560 Training loss: 2.1783 0.0630 sec/batch\n", + "Epoch 2/20 Iteration 238/3560 Training loss: 2.1781 0.0592 sec/batch\n", + "Epoch 2/20 Iteration 239/3560 Training loss: 2.1769 0.0618 sec/batch\n", + "Epoch 2/20 Iteration 240/3560 Training loss: 2.1765 0.0590 sec/batch\n", + "Epoch 2/20 Iteration 241/3560 Training loss: 2.1758 0.0646 sec/batch\n", + "Epoch 2/20 Iteration 242/3560 Training loss: 2.1749 0.0581 sec/batch\n", + "Epoch 2/20 Iteration 243/3560 Training loss: 2.1736 0.0622 sec/batch\n", + "Epoch 2/20 Iteration 244/3560 Training loss: 2.1729 0.0600 sec/batch\n", + "Epoch 2/20 Iteration 245/3560 Training loss: 2.1720 0.0675 sec/batch\n", + "Epoch 2/20 Iteration 246/3560 Training loss: 2.1705 0.0610 sec/batch\n", + "Epoch 2/20 Iteration 247/3560 Training loss: 2.1695 0.0680 sec/batch\n", + "Epoch 2/20 Iteration 248/3560 Training loss: 2.1685 0.0589 sec/batch\n", + "Epoch 2/20 Iteration 249/3560 Training loss: 2.1679 0.0671 sec/batch\n", + "Epoch 2/20 Iteration 250/3560 Training loss: 2.1670 0.0620 sec/batch\n", + 
"Epoch 2/20 Iteration 251/3560 Training loss: 2.1662 0.0613 sec/batch\n", + "Epoch 2/20 Iteration 252/3560 Training loss: 2.1649 0.0602 sec/batch\n", + "Epoch 2/20 Iteration 253/3560 Training loss: 2.1639 0.0590 sec/batch\n", + "Epoch 2/20 Iteration 254/3560 Training loss: 2.1634 0.0577 sec/batch\n", + "Epoch 2/20 Iteration 255/3560 Training loss: 2.1622 0.0611 sec/batch\n", + "Epoch 2/20 Iteration 256/3560 Training loss: 2.1615 0.0582 sec/batch\n", + "Epoch 2/20 Iteration 257/3560 Training loss: 2.1601 0.0589 sec/batch\n", + "Epoch 2/20 Iteration 258/3560 Training loss: 2.1590 0.0597 sec/batch\n", + "Epoch 2/20 Iteration 259/3560 Training loss: 2.1577 0.0586 sec/batch\n", + "Epoch 2/20 Iteration 260/3560 Training loss: 2.1569 0.0615 sec/batch\n", + "Epoch 2/20 Iteration 261/3560 Training loss: 2.1555 0.0634 sec/batch\n", + "Epoch 2/20 Iteration 262/3560 Training loss: 2.1544 0.0661 sec/batch\n", + "Epoch 2/20 Iteration 263/3560 Training loss: 2.1529 0.0629 sec/batch\n", + "Epoch 2/20 Iteration 264/3560 Training loss: 2.1517 0.0608 sec/batch\n", + "Epoch 2/20 Iteration 265/3560 Training loss: 2.1507 0.0582 sec/batch\n", + "Epoch 2/20 Iteration 266/3560 Training loss: 2.1496 0.0604 sec/batch\n", + "Epoch 2/20 Iteration 267/3560 Training loss: 2.1483 0.0648 sec/batch\n", + "Epoch 2/20 Iteration 268/3560 Training loss: 2.1476 0.0620 sec/batch\n", + "Epoch 2/20 Iteration 269/3560 Training loss: 2.1466 0.0596 sec/batch\n", + "Epoch 2/20 Iteration 270/3560 Training loss: 2.1458 0.0620 sec/batch\n", + "Epoch 2/20 Iteration 271/3560 Training loss: 2.1445 0.0598 sec/batch\n", + "Epoch 2/20 Iteration 272/3560 Training loss: 2.1433 0.0631 sec/batch\n", + "Epoch 2/20 Iteration 273/3560 Training loss: 2.1422 0.0636 sec/batch\n", + "Epoch 2/20 Iteration 274/3560 Training loss: 2.1412 0.0593 sec/batch\n", + "Epoch 2/20 Iteration 275/3560 Training loss: 2.1403 0.0607 sec/batch\n", + "Epoch 2/20 Iteration 276/3560 Training loss: 2.1391 0.0602 sec/batch\n", + "Epoch 2/20 Iteration 
277/3560 Training loss: 2.1378 0.0616 sec/batch\n", + "Epoch 2/20 Iteration 278/3560 Training loss: 2.1364 0.0614 sec/batch\n", + "Epoch 2/20 Iteration 279/3560 Training loss: 2.1357 0.0607 sec/batch\n", + "Epoch 2/20 Iteration 280/3560 Training loss: 2.1349 0.0585 sec/batch\n", + "Epoch 2/20 Iteration 281/3560 Training loss: 2.1337 0.0626 sec/batch\n", + "Epoch 2/20 Iteration 282/3560 Training loss: 2.1327 0.0590 sec/batch\n", + "Epoch 2/20 Iteration 283/3560 Training loss: 2.1317 0.0602 sec/batch\n", + "Epoch 2/20 Iteration 284/3560 Training loss: 2.1308 0.0592 sec/batch\n", + "Epoch 2/20 Iteration 285/3560 Training loss: 2.1298 0.0595 sec/batch\n", + "Epoch 2/20 Iteration 286/3560 Training loss: 2.1290 0.0612 sec/batch\n", + "Epoch 2/20 Iteration 287/3560 Training loss: 2.1282 0.0607 sec/batch\n", + "Epoch 2/20 Iteration 288/3560 Training loss: 2.1272 0.0600 sec/batch\n", + "Epoch 2/20 Iteration 289/3560 Training loss: 2.1263 0.0645 sec/batch\n", + "Epoch 2/20 Iteration 290/3560 Training loss: 2.1254 0.0644 sec/batch\n", + "Epoch 2/20 Iteration 291/3560 Training loss: 2.1245 0.0605 sec/batch\n", + "Epoch 2/20 Iteration 292/3560 Training loss: 2.1235 0.0619 sec/batch\n", + "Epoch 2/20 Iteration 293/3560 Training loss: 2.1225 0.0692 sec/batch\n", + "Epoch 2/20 Iteration 294/3560 Training loss: 2.1213 0.0607 sec/batch\n", + "Epoch 2/20 Iteration 295/3560 Training loss: 2.1204 0.0662 sec/batch\n", + "Epoch 2/20 Iteration 296/3560 Training loss: 2.1195 0.0616 sec/batch\n", + "Epoch 2/20 Iteration 297/3560 Training loss: 2.1187 0.0659 sec/batch\n", + "Epoch 2/20 Iteration 298/3560 Training loss: 2.1179 0.0600 sec/batch\n", + "Epoch 2/20 Iteration 299/3560 Training loss: 2.1172 0.0603 sec/batch\n", + "Epoch 2/20 Iteration 300/3560 Training loss: 2.1162 0.0636 sec/batch\n", + "Epoch 2/20 Iteration 301/3560 Training loss: 2.1151 0.0574 sec/batch\n", + "Epoch 2/20 Iteration 302/3560 Training loss: 2.1145 0.0600 sec/batch\n", + "Epoch 2/20 Iteration 303/3560 Training loss: 
2.1137 0.0600 sec/batch\n", + "Epoch 2/20 Iteration 304/3560 Training loss: 2.1125 0.0589 sec/batch\n", + "Epoch 2/20 Iteration 305/3560 Training loss: 2.1119 0.0608 sec/batch\n", + "Epoch 2/20 Iteration 306/3560 Training loss: 2.1112 0.0601 sec/batch\n", + "Epoch 2/20 Iteration 307/3560 Training loss: 2.1104 0.0586 sec/batch\n", + "Epoch 2/20 Iteration 308/3560 Training loss: 2.1096 0.0576 sec/batch\n", + "Epoch 2/20 Iteration 309/3560 Training loss: 2.1086 0.0597 sec/batch\n", + "Epoch 2/20 Iteration 310/3560 Training loss: 2.1076 0.0592 sec/batch\n", + "Epoch 2/20 Iteration 311/3560 Training loss: 2.1069 0.0592 sec/batch\n", + "Epoch 2/20 Iteration 312/3560 Training loss: 2.1062 0.0625 sec/batch\n", + "Epoch 2/20 Iteration 313/3560 Training loss: 2.1054 0.0674 sec/batch\n", + "Epoch 2/20 Iteration 314/3560 Training loss: 2.1047 0.0610 sec/batch\n", + "Epoch 2/20 Iteration 315/3560 Training loss: 2.1040 0.0635 sec/batch\n", + "Epoch 2/20 Iteration 316/3560 Training loss: 2.1033 0.0599 sec/batch\n", + "Epoch 2/20 Iteration 317/3560 Training loss: 2.1028 0.0626 sec/batch\n", + "Epoch 2/20 Iteration 318/3560 Training loss: 2.1019 0.0706 sec/batch\n", + "Epoch 2/20 Iteration 319/3560 Training loss: 2.1013 0.0651 sec/batch\n", + "Epoch 2/20 Iteration 320/3560 Training loss: 2.1004 0.0655 sec/batch\n", + "Epoch 2/20 Iteration 321/3560 Training loss: 2.0996 0.0640 sec/batch\n", + "Epoch 2/20 Iteration 322/3560 Training loss: 2.0988 0.0622 sec/batch\n", + "Epoch 2/20 Iteration 323/3560 Training loss: 2.0980 0.0598 sec/batch\n", + "Epoch 2/20 Iteration 324/3560 Training loss: 2.0973 0.0610 sec/batch\n", + "Epoch 2/20 Iteration 325/3560 Training loss: 2.0967 0.0701 sec/batch\n", + "Epoch 2/20 Iteration 326/3560 Training loss: 2.0961 0.0627 sec/batch\n", + "Epoch 2/20 Iteration 327/3560 Training loss: 2.0953 0.0614 sec/batch\n", + "Epoch 2/20 Iteration 328/3560 Training loss: 2.0944 0.0616 sec/batch\n", + "Epoch 2/20 Iteration 329/3560 Training loss: 2.0935 0.0628 
sec/batch\n", + "Epoch 2/20 Iteration 330/3560 Training loss: 2.0930 0.0586 sec/batch\n", + "Epoch 2/20 Iteration 331/3560 Training loss: 2.0923 0.0683 sec/batch\n", + "Epoch 2/20 Iteration 332/3560 Training loss: 2.0916 0.0606 sec/batch\n", + "Epoch 2/20 Iteration 333/3560 Training loss: 2.0908 0.0614 sec/batch\n", + "Epoch 2/20 Iteration 334/3560 Training loss: 2.0901 0.0609 sec/batch\n", + "Epoch 2/20 Iteration 335/3560 Training loss: 2.0893 0.0625 sec/batch\n", + "Epoch 2/20 Iteration 336/3560 Training loss: 2.0885 0.0660 sec/batch\n", + "Epoch 2/20 Iteration 337/3560 Training loss: 2.0876 0.0700 sec/batch\n", + "Epoch 2/20 Iteration 338/3560 Training loss: 2.0871 0.0610 sec/batch\n", + "Epoch 2/20 Iteration 339/3560 Training loss: 2.0865 0.0623 sec/batch\n", + "Epoch 2/20 Iteration 340/3560 Training loss: 2.0858 0.0596 sec/batch\n", + "Epoch 2/20 Iteration 341/3560 Training loss: 2.0851 0.0593 sec/batch\n", + "Epoch 2/20 Iteration 342/3560 Training loss: 2.0845 0.0585 sec/batch\n", + "Epoch 2/20 Iteration 343/3560 Training loss: 2.0838 0.0633 sec/batch\n", + "Epoch 2/20 Iteration 344/3560 Training loss: 2.0830 0.0629 sec/batch\n", + "Epoch 2/20 Iteration 345/3560 Training loss: 2.0824 0.0684 sec/batch\n", + "Epoch 2/20 Iteration 346/3560 Training loss: 2.0819 0.0701 sec/batch\n", + "Epoch 2/20 Iteration 347/3560 Training loss: 2.0812 0.0595 sec/batch\n", + "Epoch 2/20 Iteration 348/3560 Training loss: 2.0805 0.0615 sec/batch\n", + "Epoch 2/20 Iteration 349/3560 Training loss: 2.0798 0.0589 sec/batch\n", + "Epoch 2/20 Iteration 350/3560 Training loss: 2.0791 0.0655 sec/batch\n", + "Epoch 2/20 Iteration 351/3560 Training loss: 2.0786 0.0593 sec/batch\n", + "Epoch 2/20 Iteration 352/3560 Training loss: 2.0781 0.0613 sec/batch\n", + "Epoch 2/20 Iteration 353/3560 Training loss: 2.0775 0.0617 sec/batch\n", + "Epoch 2/20 Iteration 354/3560 Training loss: 2.0768 0.0611 sec/batch\n", + "Epoch 2/20 Iteration 355/3560 Training loss: 2.0760 0.0604 sec/batch\n", + "Epoch 
2/20 Iteration 356/3560 Training loss: 2.0754 0.0644 sec/batch\n", + "Epoch 3/20 Iteration 357/3560 Training loss: 2.0180 0.0602 sec/batch\n", + "Epoch 3/20 Iteration 358/3560 Training loss: 1.9726 0.0639 sec/batch\n", + "Epoch 3/20 Iteration 359/3560 Training loss: 1.9626 0.0587 sec/batch\n", + "Epoch 3/20 Iteration 360/3560 Training loss: 1.9563 0.0619 sec/batch\n", + "Epoch 3/20 Iteration 361/3560 Training loss: 1.9525 0.0595 sec/batch\n", + "Epoch 3/20 Iteration 362/3560 Training loss: 1.9433 0.0639 sec/batch\n", + "Epoch 3/20 Iteration 363/3560 Training loss: 1.9439 0.0701 sec/batch\n", + "Epoch 3/20 Iteration 364/3560 Training loss: 1.9434 0.0633 sec/batch\n", + "Epoch 3/20 Iteration 365/3560 Training loss: 1.9464 0.0673 sec/batch\n", + "Epoch 3/20 Iteration 366/3560 Training loss: 1.9454 0.0630 sec/batch\n", + "Epoch 3/20 Iteration 367/3560 Training loss: 1.9431 0.0608 sec/batch\n", + "Epoch 3/20 Iteration 368/3560 Training loss: 1.9398 0.0605 sec/batch\n", + "Epoch 3/20 Iteration 369/3560 Training loss: 1.9402 0.0600 sec/batch\n", + "Epoch 3/20 Iteration 370/3560 Training loss: 1.9418 0.0617 sec/batch\n", + "Epoch 3/20 Iteration 371/3560 Training loss: 1.9401 0.0614 sec/batch\n", + "Epoch 3/20 Iteration 372/3560 Training loss: 1.9386 0.0641 sec/batch\n", + "Epoch 3/20 Iteration 373/3560 Training loss: 1.9373 0.0594 sec/batch\n", + "Epoch 3/20 Iteration 374/3560 Training loss: 1.9389 0.0605 sec/batch\n", + "Epoch 3/20 Iteration 375/3560 Training loss: 1.9384 0.0599 sec/batch\n", + "Epoch 3/20 Iteration 376/3560 Training loss: 1.9377 0.0595 sec/batch\n", + "Epoch 3/20 Iteration 377/3560 Training loss: 1.9364 0.0600 sec/batch\n", + "Epoch 3/20 Iteration 378/3560 Training loss: 1.9374 0.0755 sec/batch\n", + "Epoch 3/20 Iteration 379/3560 Training loss: 1.9358 0.0622 sec/batch\n", + "Epoch 3/20 Iteration 380/3560 Training loss: 1.9348 0.0611 sec/batch\n", + "Epoch 3/20 Iteration 381/3560 Training loss: 1.9337 0.0591 sec/batch\n", + "Epoch 3/20 Iteration 382/3560 
Training loss: 1.9322 0.0611 sec/batch\n", + "Epoch 3/20 Iteration 383/3560 Training loss: 1.9305 0.0588 sec/batch\n", + "Epoch 3/20 Iteration 384/3560 Training loss: 1.9304 0.0678 sec/batch\n", + "Epoch 3/20 Iteration 385/3560 Training loss: 1.9308 0.0600 sec/batch\n", + "Epoch 3/20 Iteration 386/3560 Training loss: 1.9304 0.0647 sec/batch\n", + "Epoch 3/20 Iteration 387/3560 Training loss: 1.9300 0.0603 sec/batch\n", + "Epoch 3/20 Iteration 388/3560 Training loss: 1.9287 0.0609 sec/batch\n", + "Epoch 3/20 Iteration 389/3560 Training loss: 1.9280 0.0657 sec/batch\n", + "Epoch 3/20 Iteration 390/3560 Training loss: 1.9281 0.0615 sec/batch\n", + "Epoch 3/20 Iteration 391/3560 Training loss: 1.9272 0.0656 sec/batch\n", + "Epoch 3/20 Iteration 392/3560 Training loss: 1.9261 0.0632 sec/batch\n", + "Epoch 3/20 Iteration 393/3560 Training loss: 1.9252 0.0612 sec/batch\n", + "Epoch 3/20 Iteration 394/3560 Training loss: 1.9235 0.0587 sec/batch\n", + "Epoch 3/20 Iteration 395/3560 Training loss: 1.9221 0.0579 sec/batch\n", + "Epoch 3/20 Iteration 396/3560 Training loss: 1.9208 0.0618 sec/batch\n", + "Epoch 3/20 Iteration 397/3560 Training loss: 1.9197 0.0598 sec/batch\n", + "Epoch 3/20 Iteration 398/3560 Training loss: 1.9193 0.0630 sec/batch\n", + "Epoch 3/20 Iteration 399/3560 Training loss: 1.9184 0.0597 sec/batch\n", + "Epoch 3/20 Iteration 400/3560 Training loss: 1.9170 0.0710 sec/batch\n", + "Epoch 3/20 Iteration 401/3560 Training loss: 1.9166 0.0620 sec/batch\n", + "Epoch 3/20 Iteration 402/3560 Training loss: 1.9149 0.0633 sec/batch\n", + "Epoch 3/20 Iteration 403/3560 Training loss: 1.9142 0.0585 sec/batch\n", + "Epoch 3/20 Iteration 404/3560 Training loss: 1.9130 0.0582 sec/batch\n", + "Epoch 3/20 Iteration 405/3560 Training loss: 1.9122 0.0596 sec/batch\n", + "Epoch 3/20 Iteration 406/3560 Training loss: 1.9124 0.0626 sec/batch\n", + "Epoch 3/20 Iteration 407/3560 Training loss: 1.9113 0.0588 sec/batch\n", + "Epoch 3/20 Iteration 408/3560 Training loss: 1.9115 
0.0600 sec/batch\n", + "Epoch 3/20 Iteration 409/3560 Training loss: 1.9107 0.0648 sec/batch\n", + "Epoch 3/20 Iteration 410/3560 Training loss: 1.9100 0.0608 sec/batch\n", + "Epoch 3/20 Iteration 411/3560 Training loss: 1.9093 0.0594 sec/batch\n", + "Epoch 3/20 Iteration 412/3560 Training loss: 1.9090 0.0598 sec/batch\n", + "Epoch 3/20 Iteration 413/3560 Training loss: 1.9087 0.0620 sec/batch\n", + "Epoch 3/20 Iteration 414/3560 Training loss: 1.9080 0.0598 sec/batch\n", + "Epoch 3/20 Iteration 415/3560 Training loss: 1.9071 0.0617 sec/batch\n", + "Epoch 3/20 Iteration 416/3560 Training loss: 1.9074 0.0614 sec/batch\n", + "Epoch 3/20 Iteration 417/3560 Training loss: 1.9068 0.0618 sec/batch\n", + "Epoch 3/20 Iteration 418/3560 Training loss: 1.9069 0.0619 sec/batch\n", + "Epoch 3/20 Iteration 419/3560 Training loss: 1.9069 0.0628 sec/batch\n", + "Epoch 3/20 Iteration 420/3560 Training loss: 1.9068 0.0656 sec/batch\n", + "Epoch 3/20 Iteration 421/3560 Training loss: 1.9061 0.0639 sec/batch\n", + "Epoch 3/20 Iteration 422/3560 Training loss: 1.9060 0.0652 sec/batch\n", + "Epoch 3/20 Iteration 423/3560 Training loss: 1.9056 0.0609 sec/batch\n", + "Epoch 3/20 Iteration 424/3560 Training loss: 1.9049 0.0604 sec/batch\n", + "Epoch 3/20 Iteration 425/3560 Training loss: 1.9043 0.0616 sec/batch\n", + "Epoch 3/20 Iteration 426/3560 Training loss: 1.9038 0.0613 sec/batch\n", + "Epoch 3/20 Iteration 427/3560 Training loss: 1.9038 0.0606 sec/batch\n", + "Epoch 3/20 Iteration 428/3560 Training loss: 1.9034 0.0643 sec/batch\n", + "Epoch 3/20 Iteration 429/3560 Training loss: 1.9033 0.0584 sec/batch\n", + "Epoch 3/20 Iteration 430/3560 Training loss: 1.9024 0.0696 sec/batch\n", + "Epoch 3/20 Iteration 431/3560 Training loss: 1.9017 0.0617 sec/batch\n", + "Epoch 3/20 Iteration 432/3560 Training loss: 1.9016 0.0623 sec/batch\n", + "Epoch 3/20 Iteration 433/3560 Training loss: 1.9011 0.0627 sec/batch\n", + "Epoch 3/20 Iteration 434/3560 Training loss: 1.9008 0.0611 sec/batch\n", + 
"Epoch 3/20 Iteration 435/3560 Training loss: 1.8998 0.0657 sec/batch\n", + "Epoch 3/20 Iteration 436/3560 Training loss: 1.8991 0.0607 sec/batch\n", + "Epoch 3/20 Iteration 437/3560 Training loss: 1.8981 0.0628 sec/batch\n", + "Epoch 3/20 Iteration 438/3560 Training loss: 1.8978 0.0607 sec/batch\n", + "Epoch 3/20 Iteration 439/3560 Training loss: 1.8968 0.0586 sec/batch\n", + "Epoch 3/20 Iteration 440/3560 Training loss: 1.8961 0.0622 sec/batch\n", + "Epoch 3/20 Iteration 441/3560 Training loss: 1.8951 0.0620 sec/batch\n", + "Epoch 3/20 Iteration 442/3560 Training loss: 1.8943 0.0589 sec/batch\n", + "Epoch 3/20 Iteration 443/3560 Training loss: 1.8936 0.0600 sec/batch\n", + "Epoch 3/20 Iteration 444/3560 Training loss: 1.8928 0.0812 sec/batch\n", + "Epoch 3/20 Iteration 445/3560 Training loss: 1.8918 0.0636 sec/batch\n", + "Epoch 3/20 Iteration 446/3560 Training loss: 1.8915 0.0627 sec/batch\n", + "Epoch 3/20 Iteration 447/3560 Training loss: 1.8907 0.0614 sec/batch\n", + "Epoch 3/20 Iteration 448/3560 Training loss: 1.8900 0.0640 sec/batch\n", + "Epoch 3/20 Iteration 449/3560 Training loss: 1.8891 0.0608 sec/batch\n", + "Epoch 3/20 Iteration 450/3560 Training loss: 1.8884 0.0631 sec/batch\n", + "Epoch 3/20 Iteration 451/3560 Training loss: 1.8875 0.0641 sec/batch\n", + "Epoch 3/20 Iteration 452/3560 Training loss: 1.8870 0.0704 sec/batch\n", + "Epoch 3/20 Iteration 453/3560 Training loss: 1.8863 0.0593 sec/batch\n", + "Epoch 3/20 Iteration 454/3560 Training loss: 1.8854 0.0639 sec/batch\n", + "Epoch 3/20 Iteration 455/3560 Training loss: 1.8846 0.0595 sec/batch\n", + "Epoch 3/20 Iteration 456/3560 Training loss: 1.8835 0.0694 sec/batch\n", + "Epoch 3/20 Iteration 457/3560 Training loss: 1.8830 0.0748 sec/batch\n", + "Epoch 3/20 Iteration 458/3560 Training loss: 1.8825 0.0643 sec/batch\n", + "Epoch 3/20 Iteration 459/3560 Training loss: 1.8816 0.0648 sec/batch\n", + "Epoch 3/20 Iteration 460/3560 Training loss: 1.8810 0.0648 sec/batch\n", + "Epoch 3/20 Iteration 
461/3560 Training loss: 1.8802 0.0650 sec/batch\n", + "Epoch 3/20 Iteration 462/3560 Training loss: 1.8795 0.0584 sec/batch\n", + "Epoch 3/20 Iteration 463/3560 Training loss: 1.8789 0.0630 sec/batch\n", + "Epoch 3/20 Iteration 464/3560 Training loss: 1.8784 0.0593 sec/batch\n", + "Epoch 3/20 Iteration 465/3560 Training loss: 1.8779 0.0590 sec/batch\n", + "Epoch 3/20 Iteration 466/3560 Training loss: 1.8773 0.0622 sec/batch\n", + "Epoch 3/20 Iteration 467/3560 Training loss: 1.8768 0.0616 sec/batch\n", + "Epoch 3/20 Iteration 468/3560 Training loss: 1.8761 0.0635 sec/batch\n", + "Epoch 3/20 Iteration 469/3560 Training loss: 1.8755 0.0583 sec/batch\n", + "Epoch 3/20 Iteration 470/3560 Training loss: 1.8749 0.0620 sec/batch\n", + "Epoch 3/20 Iteration 471/3560 Training loss: 1.8742 0.0623 sec/batch\n", + "Epoch 3/20 Iteration 472/3560 Training loss: 1.8734 0.0629 sec/batch\n", + "Epoch 3/20 Iteration 473/3560 Training loss: 1.8729 0.0633 sec/batch\n", + "Epoch 3/20 Iteration 474/3560 Training loss: 1.8723 0.0597 sec/batch\n", + "Epoch 3/20 Iteration 475/3560 Training loss: 1.8718 0.0590 sec/batch\n", + "Epoch 3/20 Iteration 476/3560 Training loss: 1.8713 0.0593 sec/batch\n", + "Epoch 3/20 Iteration 477/3560 Training loss: 1.8710 0.0636 sec/batch\n", + "Epoch 3/20 Iteration 478/3560 Training loss: 1.8703 0.0660 sec/batch\n", + "Epoch 3/20 Iteration 479/3560 Training loss: 1.8696 0.0664 sec/batch\n", + "Epoch 3/20 Iteration 480/3560 Training loss: 1.8692 0.0606 sec/batch\n", + "Epoch 3/20 Iteration 481/3560 Training loss: 1.8687 0.0629 sec/batch\n", + "Epoch 3/20 Iteration 482/3560 Training loss: 1.8680 0.0621 sec/batch\n", + "Epoch 3/20 Iteration 483/3560 Training loss: 1.8677 0.0622 sec/batch\n", + "Epoch 3/20 Iteration 484/3560 Training loss: 1.8674 0.0672 sec/batch\n", + "Epoch 3/20 Iteration 485/3560 Training loss: 1.8669 0.0629 sec/batch\n", + "Epoch 3/20 Iteration 486/3560 Training loss: 1.8664 0.0601 sec/batch\n", + "Epoch 3/20 Iteration 487/3560 Training loss: 
1.8658 0.0674 sec/batch\n", + "Epoch 3/20 Iteration 488/3560 Training loss: 1.8651 0.0613 sec/batch\n", + "Epoch 3/20 Iteration 489/3560 Training loss: 1.8647 0.0612 sec/batch\n", + "Epoch 3/20 Iteration 490/3560 Training loss: 1.8643 0.0615 sec/batch\n", + "Epoch 3/20 Iteration 491/3560 Training loss: 1.8639 0.0665 sec/batch\n", + "Epoch 3/20 Iteration 492/3560 Training loss: 1.8635 0.0713 sec/batch\n", + "Epoch 3/20 Iteration 493/3560 Training loss: 1.8631 0.0624 sec/batch\n", + "Epoch 3/20 Iteration 494/3560 Training loss: 1.8628 0.0621 sec/batch\n", + "Epoch 3/20 Iteration 495/3560 Training loss: 1.8626 0.0651 sec/batch\n", + "Epoch 3/20 Iteration 496/3560 Training loss: 1.8621 0.0623 sec/batch\n", + "Epoch 3/20 Iteration 497/3560 Training loss: 1.8619 0.0618 sec/batch\n", + "Epoch 3/20 Iteration 498/3560 Training loss: 1.8614 0.0698 sec/batch\n", + "Epoch 3/20 Iteration 499/3560 Training loss: 1.8609 0.0677 sec/batch\n", + "Epoch 3/20 Iteration 500/3560 Training loss: 1.8605 0.0620 sec/batch\n", + "Epoch 3/20 Iteration 501/3560 Training loss: 1.8600 0.0604 sec/batch\n", + "Epoch 3/20 Iteration 502/3560 Training loss: 1.8597 0.0610 sec/batch\n", + "Epoch 3/20 Iteration 503/3560 Training loss: 1.8593 0.0601 sec/batch\n", + "Epoch 3/20 Iteration 504/3560 Training loss: 1.8591 0.0619 sec/batch\n", + "Epoch 3/20 Iteration 505/3560 Training loss: 1.8587 0.0600 sec/batch\n", + "Epoch 3/20 Iteration 506/3560 Training loss: 1.8581 0.0609 sec/batch\n", + "Epoch 3/20 Iteration 507/3560 Training loss: 1.8575 0.0662 sec/batch\n", + "Epoch 3/20 Iteration 508/3560 Training loss: 1.8572 0.0648 sec/batch\n", + "Epoch 3/20 Iteration 509/3560 Training loss: 1.8569 0.0616 sec/batch\n", + "Epoch 3/20 Iteration 510/3560 Training loss: 1.8565 0.0604 sec/batch\n", + "Epoch 3/20 Iteration 511/3560 Training loss: 1.8560 0.0655 sec/batch\n", + "Epoch 3/20 Iteration 512/3560 Training loss: 1.8556 0.0654 sec/batch\n", + "Epoch 3/20 Iteration 513/3560 Training loss: 1.8552 0.0617 
sec/batch\n", + "Epoch 3/20 Iteration 514/3560 Training loss: 1.8548 0.0609 sec/batch\n", + "Epoch 3/20 Iteration 515/3560 Training loss: 1.8542 0.0589 sec/batch\n", + "Epoch 3/20 Iteration 516/3560 Training loss: 1.8539 0.0634 sec/batch\n", + "Epoch 3/20 Iteration 517/3560 Training loss: 1.8537 0.0611 sec/batch\n", + "Epoch 3/20 Iteration 518/3560 Training loss: 1.8532 0.0595 sec/batch\n", + "Epoch 3/20 Iteration 519/3560 Training loss: 1.8529 0.0599 sec/batch\n", + "Epoch 3/20 Iteration 520/3560 Training loss: 1.8525 0.0600 sec/batch\n", + "Epoch 3/20 Iteration 521/3560 Training loss: 1.8521 0.0594 sec/batch\n", + "Epoch 3/20 Iteration 522/3560 Training loss: 1.8516 0.0617 sec/batch\n", + "Epoch 3/20 Iteration 523/3560 Training loss: 1.8513 0.0672 sec/batch\n", + "Epoch 3/20 Iteration 524/3560 Training loss: 1.8512 0.0610 sec/batch\n", + "Epoch 3/20 Iteration 525/3560 Training loss: 1.8507 0.0616 sec/batch\n", + "Epoch 3/20 Iteration 526/3560 Training loss: 1.8503 0.0624 sec/batch\n", + "Epoch 3/20 Iteration 527/3560 Training loss: 1.8498 0.0608 sec/batch\n", + "Epoch 3/20 Iteration 528/3560 Training loss: 1.8493 0.0655 sec/batch\n", + "Epoch 3/20 Iteration 529/3560 Training loss: 1.8490 0.0591 sec/batch\n", + "Epoch 3/20 Iteration 530/3560 Training loss: 1.8486 0.0676 sec/batch\n", + "Epoch 3/20 Iteration 531/3560 Training loss: 1.8483 0.0592 sec/batch\n", + "Epoch 3/20 Iteration 532/3560 Training loss: 1.8478 0.0643 sec/batch\n", + "Epoch 3/20 Iteration 533/3560 Training loss: 1.8473 0.0603 sec/batch\n", + "Epoch 3/20 Iteration 534/3560 Training loss: 1.8469 0.0619 sec/batch\n", + "Epoch 4/20 Iteration 535/3560 Training loss: 1.8368 0.0586 sec/batch\n", + "Epoch 4/20 Iteration 536/3560 Training loss: 1.8033 0.0676 sec/batch\n", + "Epoch 4/20 Iteration 537/3560 Training loss: 1.7914 0.0600 sec/batch\n", + "Epoch 4/20 Iteration 538/3560 Training loss: 1.7848 0.0633 sec/batch\n", + "Epoch 4/20 Iteration 539/3560 Training loss: 1.7803 0.0658 sec/batch\n", + "Epoch 
4/20 Iteration 540/3560 Training loss: 1.7703 0.0674 sec/batch\n", + "Epoch 4/20 Iteration 541/3560 Training loss: 1.7709 0.0719 sec/batch\n", + "Epoch 4/20 Iteration 542/3560 Training loss: 1.7684 0.0622 sec/batch\n", + "Epoch 4/20 Iteration 543/3560 Training loss: 1.7702 0.0591 sec/batch\n", + "Epoch 4/20 Iteration 544/3560 Training loss: 1.7695 0.0644 sec/batch\n", + "Epoch 4/20 Iteration 545/3560 Training loss: 1.7661 0.0643 sec/batch\n", + "Epoch 4/20 Iteration 546/3560 Training loss: 1.7639 0.0604 sec/batch\n", + "Epoch 4/20 Iteration 547/3560 Training loss: 1.7640 0.0605 sec/batch\n", + "Epoch 4/20 Iteration 548/3560 Training loss: 1.7659 0.0666 sec/batch\n", + "Epoch 4/20 Iteration 549/3560 Training loss: 1.7645 0.0608 sec/batch\n", + "Epoch 4/20 Iteration 550/3560 Training loss: 1.7630 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 551/3560 Training loss: 1.7628 0.0616 sec/batch\n", + "Epoch 4/20 Iteration 552/3560 Training loss: 1.7645 0.0625 sec/batch\n", + "Epoch 4/20 Iteration 553/3560 Training loss: 1.7638 0.0619 sec/batch\n", + "Epoch 4/20 Iteration 554/3560 Training loss: 1.7640 0.0599 sec/batch\n", + "Epoch 4/20 Iteration 555/3560 Training loss: 1.7629 0.0598 sec/batch\n", + "Epoch 4/20 Iteration 556/3560 Training loss: 1.7633 0.0611 sec/batch\n", + "Epoch 4/20 Iteration 557/3560 Training loss: 1.7620 0.0647 sec/batch\n", + "Epoch 4/20 Iteration 558/3560 Training loss: 1.7612 0.0606 sec/batch\n", + "Epoch 4/20 Iteration 559/3560 Training loss: 1.7608 0.0635 sec/batch\n", + "Epoch 4/20 Iteration 560/3560 Training loss: 1.7592 0.0602 sec/batch\n", + "Epoch 4/20 Iteration 561/3560 Training loss: 1.7576 0.0679 sec/batch\n", + "Epoch 4/20 Iteration 562/3560 Training loss: 1.7578 0.0622 sec/batch\n", + "Epoch 4/20 Iteration 563/3560 Training loss: 1.7580 0.0624 sec/batch\n", + "Epoch 4/20 Iteration 564/3560 Training loss: 1.7575 0.0611 sec/batch\n", + "Epoch 4/20 Iteration 565/3560 Training loss: 1.7569 0.0624 sec/batch\n", + "Epoch 4/20 Iteration 566/3560 
Training loss: 1.7558 0.0660 sec/batch\n", + "Epoch 4/20 Iteration 567/3560 Training loss: 1.7554 0.0655 sec/batch\n", + "Epoch 4/20 Iteration 568/3560 Training loss: 1.7558 0.0614 sec/batch\n", + "Epoch 4/20 Iteration 569/3560 Training loss: 1.7550 0.0612 sec/batch\n", + "Epoch 4/20 Iteration 570/3560 Training loss: 1.7545 0.0623 sec/batch\n", + "Epoch 4/20 Iteration 571/3560 Training loss: 1.7536 0.0621 sec/batch\n", + "Epoch 4/20 Iteration 572/3560 Training loss: 1.7521 0.0616 sec/batch\n", + "Epoch 4/20 Iteration 573/3560 Training loss: 1.7506 0.0638 sec/batch\n", + "Epoch 4/20 Iteration 574/3560 Training loss: 1.7496 0.0645 sec/batch\n", + "Epoch 4/20 Iteration 575/3560 Training loss: 1.7486 0.0609 sec/batch\n", + "Epoch 4/20 Iteration 576/3560 Training loss: 1.7486 0.0606 sec/batch\n", + "Epoch 4/20 Iteration 577/3560 Training loss: 1.7477 0.0626 sec/batch\n", + "Epoch 4/20 Iteration 578/3560 Training loss: 1.7466 0.0612 sec/batch\n", + "Epoch 4/20 Iteration 579/3560 Training loss: 1.7464 0.0609 sec/batch\n", + "Epoch 4/20 Iteration 580/3560 Training loss: 1.7448 0.0612 sec/batch\n", + "Epoch 4/20 Iteration 581/3560 Training loss: 1.7441 0.0666 sec/batch\n", + "Epoch 4/20 Iteration 582/3560 Training loss: 1.7433 0.0601 sec/batch\n", + "Epoch 4/20 Iteration 583/3560 Training loss: 1.7427 0.0660 sec/batch\n", + "Epoch 4/20 Iteration 584/3560 Training loss: 1.7429 0.0614 sec/batch\n", + "Epoch 4/20 Iteration 585/3560 Training loss: 1.7422 0.0600 sec/batch\n", + "Epoch 4/20 Iteration 586/3560 Training loss: 1.7425 0.0617 sec/batch\n", + "Epoch 4/20 Iteration 587/3560 Training loss: 1.7422 0.0652 sec/batch\n", + "Epoch 4/20 Iteration 588/3560 Training loss: 1.7419 0.0604 sec/batch\n", + "Epoch 4/20 Iteration 589/3560 Training loss: 1.7414 0.0685 sec/batch\n", + "Epoch 4/20 Iteration 590/3560 Training loss: 1.7413 0.0622 sec/batch\n", + "Epoch 4/20 Iteration 591/3560 Training loss: 1.7415 0.0608 sec/batch\n", + "Epoch 4/20 Iteration 592/3560 Training loss: 1.7408 
0.0661 sec/batch\n", + "Epoch 4/20 Iteration 593/3560 Training loss: 1.7399 0.0637 sec/batch\n", + "Epoch 4/20 Iteration 594/3560 Training loss: 1.7402 0.0645 sec/batch\n", + "Epoch 4/20 Iteration 595/3560 Training loss: 1.7399 0.0624 sec/batch\n", + "Epoch 4/20 Iteration 596/3560 Training loss: 1.7403 0.0648 sec/batch\n", + "Epoch 4/20 Iteration 597/3560 Training loss: 1.7405 0.0636 sec/batch\n", + "Epoch 4/20 Iteration 598/3560 Training loss: 1.7404 0.0622 sec/batch\n", + "Epoch 4/20 Iteration 599/3560 Training loss: 1.7401 0.0634 sec/batch\n", + "Epoch 4/20 Iteration 600/3560 Training loss: 1.7400 0.0595 sec/batch\n", + "Epoch 4/20 Iteration 601/3560 Training loss: 1.7398 0.0639 sec/batch\n", + "Epoch 4/20 Iteration 602/3560 Training loss: 1.7392 0.0612 sec/batch\n", + "Epoch 4/20 Iteration 603/3560 Training loss: 1.7389 0.0615 sec/batch\n", + "Epoch 4/20 Iteration 604/3560 Training loss: 1.7385 0.0754 sec/batch\n", + "Epoch 4/20 Iteration 605/3560 Training loss: 1.7388 0.0649 sec/batch\n", + "Epoch 4/20 Iteration 606/3560 Training loss: 1.7388 0.0638 sec/batch\n", + "Epoch 4/20 Iteration 607/3560 Training loss: 1.7390 0.0630 sec/batch\n", + "Epoch 4/20 Iteration 608/3560 Training loss: 1.7384 0.0645 sec/batch\n", + "Epoch 4/20 Iteration 609/3560 Training loss: 1.7380 0.0699 sec/batch\n", + "Epoch 4/20 Iteration 610/3560 Training loss: 1.7381 0.0606 sec/batch\n", + "Epoch 4/20 Iteration 611/3560 Training loss: 1.7377 0.0655 sec/batch\n", + "Epoch 4/20 Iteration 612/3560 Training loss: 1.7375 0.0648 sec/batch\n", + "Epoch 4/20 Iteration 613/3560 Training loss: 1.7368 0.0648 sec/batch\n", + "Epoch 4/20 Iteration 614/3560 Training loss: 1.7365 0.0627 sec/batch\n", + "Epoch 4/20 Iteration 615/3560 Training loss: 1.7356 0.0590 sec/batch\n", + "Epoch 4/20 Iteration 616/3560 Training loss: 1.7355 0.0635 sec/batch\n", + "Epoch 4/20 Iteration 617/3560 Training loss: 1.7346 0.0631 sec/batch\n", + "Epoch 4/20 Iteration 618/3560 Training loss: 1.7342 0.0629 sec/batch\n", + 
"Epoch 4/20 Iteration 619/3560 Training loss: 1.7335 0.0645 sec/batch\n", + "Epoch 4/20 Iteration 620/3560 Training loss: 1.7328 0.0629 sec/batch\n", + "Epoch 4/20 Iteration 621/3560 Training loss: 1.7324 0.0610 sec/batch\n", + "Epoch 4/20 Iteration 622/3560 Training loss: 1.7318 0.0749 sec/batch\n", + "Epoch 4/20 Iteration 623/3560 Training loss: 1.7311 0.0599 sec/batch\n", + "Epoch 4/20 Iteration 624/3560 Training loss: 1.7309 0.0646 sec/batch\n", + "Epoch 4/20 Iteration 625/3560 Training loss: 1.7304 0.0646 sec/batch\n", + "Epoch 4/20 Iteration 626/3560 Training loss: 1.7298 0.0782 sec/batch\n", + "Epoch 4/20 Iteration 627/3560 Training loss: 1.7290 0.0623 sec/batch\n", + "Epoch 4/20 Iteration 628/3560 Training loss: 1.7283 0.0630 sec/batch\n", + "Epoch 4/20 Iteration 629/3560 Training loss: 1.7276 0.0686 sec/batch\n", + "Epoch 4/20 Iteration 630/3560 Training loss: 1.7273 0.0608 sec/batch\n", + "Epoch 4/20 Iteration 631/3560 Training loss: 1.7268 0.0636 sec/batch\n", + "Epoch 4/20 Iteration 632/3560 Training loss: 1.7260 0.0589 sec/batch\n", + "Epoch 4/20 Iteration 633/3560 Training loss: 1.7254 0.0690 sec/batch\n", + "Epoch 4/20 Iteration 634/3560 Training loss: 1.7246 0.0655 sec/batch\n", + "Epoch 4/20 Iteration 635/3560 Training loss: 1.7243 0.0613 sec/batch\n", + "Epoch 4/20 Iteration 636/3560 Training loss: 1.7238 0.0616 sec/batch\n", + "Epoch 4/20 Iteration 637/3560 Training loss: 1.7232 0.0619 sec/batch\n", + "Epoch 4/20 Iteration 638/3560 Training loss: 1.7227 0.0598 sec/batch\n", + "Epoch 4/20 Iteration 639/3560 Training loss: 1.7222 0.0603 sec/batch\n", + "Epoch 4/20 Iteration 640/3560 Training loss: 1.7216 0.0645 sec/batch\n", + "Epoch 4/20 Iteration 641/3560 Training loss: 1.7213 0.0600 sec/batch\n", + "Epoch 4/20 Iteration 642/3560 Training loss: 1.7209 0.0669 sec/batch\n", + "Epoch 4/20 Iteration 643/3560 Training loss: 1.7206 0.0662 sec/batch\n", + "Epoch 4/20 Iteration 644/3560 Training loss: 1.7203 0.0577 sec/batch\n", + "Epoch 4/20 Iteration 
645/3560 Training loss: 1.7199 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 646/3560 Training loss: 1.7195 0.0635 sec/batch\n", + "Epoch 4/20 Iteration 647/3560 Training loss: 1.7190 0.0650 sec/batch\n", + "Epoch 4/20 Iteration 648/3560 Training loss: 1.7185 0.0597 sec/batch\n", + "Epoch 4/20 Iteration 649/3560 Training loss: 1.7181 0.0590 sec/batch\n", + "Epoch 4/20 Iteration 650/3560 Training loss: 1.7174 0.0677 sec/batch\n", + "Epoch 4/20 Iteration 651/3560 Training loss: 1.7171 0.0648 sec/batch\n", + "Epoch 4/20 Iteration 652/3560 Training loss: 1.7167 0.0654 sec/batch\n", + "Epoch 4/20 Iteration 653/3560 Training loss: 1.7162 0.0636 sec/batch\n", + "Epoch 4/20 Iteration 654/3560 Training loss: 1.7159 0.0641 sec/batch\n", + "Epoch 4/20 Iteration 655/3560 Training loss: 1.7155 0.0817 sec/batch\n", + "Epoch 4/20 Iteration 656/3560 Training loss: 1.7149 0.0667 sec/batch\n", + "Epoch 4/20 Iteration 657/3560 Training loss: 1.7143 0.0672 sec/batch\n", + "Epoch 4/20 Iteration 658/3560 Training loss: 1.7140 0.0646 sec/batch\n", + "Epoch 4/20 Iteration 659/3560 Training loss: 1.7137 0.0644 sec/batch\n", + "Epoch 4/20 Iteration 660/3560 Training loss: 1.7131 0.0595 sec/batch\n", + "Epoch 4/20 Iteration 661/3560 Training loss: 1.7130 0.0638 sec/batch\n", + "Epoch 4/20 Iteration 662/3560 Training loss: 1.7127 0.0717 sec/batch\n", + "Epoch 4/20 Iteration 663/3560 Training loss: 1.7123 0.0628 sec/batch\n", + "Epoch 4/20 Iteration 664/3560 Training loss: 1.7119 0.0646 sec/batch\n", + "Epoch 4/20 Iteration 665/3560 Training loss: 1.7113 0.0615 sec/batch\n", + "Epoch 4/20 Iteration 666/3560 Training loss: 1.7107 0.0659 sec/batch\n", + "Epoch 4/20 Iteration 667/3560 Training loss: 1.7105 0.0623 sec/batch\n", + "Epoch 4/20 Iteration 668/3560 Training loss: 1.7102 0.0657 sec/batch\n", + "Epoch 4/20 Iteration 669/3560 Training loss: 1.7099 0.0657 sec/batch\n", + "Epoch 4/20 Iteration 670/3560 Training loss: 1.7097 0.0607 sec/batch\n", + "Epoch 4/20 Iteration 671/3560 Training loss: 
1.7095 0.0654 sec/batch\n", + "Epoch 4/20 Iteration 672/3560 Training loss: 1.7093 0.0632 sec/batch\n", + "Epoch 4/20 Iteration 673/3560 Training loss: 1.7092 0.0620 sec/batch\n", + "Epoch 4/20 Iteration 674/3560 Training loss: 1.7088 0.0652 sec/batch\n", + "Epoch 4/20 Iteration 675/3560 Training loss: 1.7088 0.0607 sec/batch\n", + "Epoch 4/20 Iteration 676/3560 Training loss: 1.7085 0.0619 sec/batch\n", + "Epoch 4/20 Iteration 677/3560 Training loss: 1.7081 0.0621 sec/batch\n", + "Epoch 4/20 Iteration 678/3560 Training loss: 1.7079 0.0651 sec/batch\n", + "Epoch 4/20 Iteration 679/3560 Training loss: 1.7075 0.0631 sec/batch\n", + "Epoch 4/20 Iteration 680/3560 Training loss: 1.7074 0.0615 sec/batch\n", + "Epoch 4/20 Iteration 681/3560 Training loss: 1.7071 0.0727 sec/batch\n", + "Epoch 4/20 Iteration 682/3560 Training loss: 1.7071 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 683/3560 Training loss: 1.7069 0.0629 sec/batch\n", + "Epoch 4/20 Iteration 684/3560 Training loss: 1.7065 0.0666 sec/batch\n", + "Epoch 4/20 Iteration 685/3560 Training loss: 1.7059 0.0633 sec/batch\n", + "Epoch 4/20 Iteration 686/3560 Training loss: 1.7057 0.0601 sec/batch\n", + "Epoch 4/20 Iteration 687/3560 Training loss: 1.7055 0.0613 sec/batch\n", + "Epoch 4/20 Iteration 688/3560 Training loss: 1.7052 0.0601 sec/batch\n", + "Epoch 4/20 Iteration 689/3560 Training loss: 1.7049 0.0721 sec/batch\n", + "Epoch 4/20 Iteration 690/3560 Training loss: 1.7046 0.0654 sec/batch\n", + "Epoch 4/20 Iteration 691/3560 Training loss: 1.7044 0.0640 sec/batch\n", + "Epoch 4/20 Iteration 692/3560 Training loss: 1.7041 0.0613 sec/batch\n", + "Epoch 4/20 Iteration 693/3560 Training loss: 1.7037 0.0655 sec/batch\n", + "Epoch 4/20 Iteration 694/3560 Training loss: 1.7035 0.0597 sec/batch\n", + "Epoch 4/20 Iteration 695/3560 Training loss: 1.7035 0.0602 sec/batch\n", + "Epoch 4/20 Iteration 696/3560 Training loss: 1.7032 0.0720 sec/batch\n", + "Epoch 4/20 Iteration 697/3560 Training loss: 1.7030 0.0641 
sec/batch\n", + "Epoch 4/20 Iteration 698/3560 Training loss: 1.7027 0.0623 sec/batch\n", + "Epoch 4/20 Iteration 699/3560 Training loss: 1.7025 0.0608 sec/batch\n", + "Epoch 4/20 Iteration 700/3560 Training loss: 1.7022 0.0638 sec/batch\n", + "Epoch 4/20 Iteration 701/3560 Training loss: 1.7021 0.0612 sec/batch\n", + "Epoch 4/20 Iteration 702/3560 Training loss: 1.7022 0.0636 sec/batch\n", + "Epoch 4/20 Iteration 703/3560 Training loss: 1.7018 0.0612 sec/batch\n", + "Epoch 4/20 Iteration 704/3560 Training loss: 1.7015 0.0772 sec/batch\n", + "Epoch 4/20 Iteration 705/3560 Training loss: 1.7011 0.0622 sec/batch\n", + "Epoch 4/20 Iteration 706/3560 Training loss: 1.7007 0.0609 sec/batch\n", + "Epoch 4/20 Iteration 707/3560 Training loss: 1.7006 0.0670 sec/batch\n", + "Epoch 4/20 Iteration 708/3560 Training loss: 1.7003 0.0724 sec/batch\n", + "Epoch 4/20 Iteration 709/3560 Training loss: 1.7002 0.0670 sec/batch\n", + "Epoch 4/20 Iteration 710/3560 Training loss: 1.6998 0.0641 sec/batch\n", + "Epoch 4/20 Iteration 711/3560 Training loss: 1.6994 0.0699 sec/batch\n", + "Epoch 4/20 Iteration 712/3560 Training loss: 1.6992 0.0595 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 1.7361 0.0594 sec/batch\n", + "Epoch 5/20 Iteration 714/3560 Training loss: 1.6932 0.0626 sec/batch\n", + "Epoch 5/20 Iteration 715/3560 Training loss: 1.6771 0.0617 sec/batch\n", + "Epoch 5/20 Iteration 716/3560 Training loss: 1.6679 0.0682 sec/batch\n", + "Epoch 5/20 Iteration 717/3560 Training loss: 1.6620 0.0601 sec/batch\n", + "Epoch 5/20 Iteration 718/3560 Training loss: 1.6510 0.0586 sec/batch\n", + "Epoch 5/20 Iteration 719/3560 Training loss: 1.6507 0.0634 sec/batch\n", + "Epoch 5/20 Iteration 720/3560 Training loss: 1.6481 0.0626 sec/batch\n", + "Epoch 5/20 Iteration 721/3560 Training loss: 1.6491 0.0633 sec/batch\n", + "Epoch 5/20 Iteration 722/3560 Training loss: 1.6473 0.0595 sec/batch\n", + "Epoch 5/20 Iteration 723/3560 Training loss: 1.6433 0.0693 sec/batch\n", + "Epoch 
5/20 Iteration 724/3560 Training loss: 1.6412 0.0591 sec/batch\n", + "Epoch 5/20 Iteration 725/3560 Training loss: 1.6407 0.0646 sec/batch\n", + "Epoch 5/20 Iteration 726/3560 Training loss: 1.6425 0.0594 sec/batch\n", + "Epoch 5/20 Iteration 727/3560 Training loss: 1.6415 0.0640 sec/batch\n", + "Epoch 5/20 Iteration 728/3560 Training loss: 1.6400 0.0613 sec/batch\n", + "Epoch 5/20 Iteration 729/3560 Training loss: 1.6395 0.0600 sec/batch\n", + "Epoch 5/20 Iteration 730/3560 Training loss: 1.6410 0.0642 sec/batch\n", + "Epoch 5/20 Iteration 731/3560 Training loss: 1.6413 0.0748 sec/batch\n", + "Epoch 5/20 Iteration 732/3560 Training loss: 1.6424 0.0626 sec/batch\n", + "Epoch 5/20 Iteration 733/3560 Training loss: 1.6415 0.0595 sec/batch\n", + "Epoch 5/20 Iteration 734/3560 Training loss: 1.6424 0.0664 sec/batch\n", + "Epoch 5/20 Iteration 735/3560 Training loss: 1.6411 0.0623 sec/batch\n", + "Epoch 5/20 Iteration 736/3560 Training loss: 1.6405 0.0598 sec/batch\n", + "Epoch 5/20 Iteration 737/3560 Training loss: 1.6401 0.0597 sec/batch\n", + "Epoch 5/20 Iteration 738/3560 Training loss: 1.6385 0.0623 sec/batch\n", + "Epoch 5/20 Iteration 739/3560 Training loss: 1.6371 0.0613 sec/batch\n", + "Epoch 5/20 Iteration 740/3560 Training loss: 1.6372 0.0596 sec/batch\n", + "Epoch 5/20 Iteration 741/3560 Training loss: 1.6375 0.0590 sec/batch\n", + "Epoch 5/20 Iteration 742/3560 Training loss: 1.6375 0.0591 sec/batch\n", + "Epoch 5/20 Iteration 743/3560 Training loss: 1.6372 0.0625 sec/batch\n", + "Epoch 5/20 Iteration 744/3560 Training loss: 1.6358 0.0706 sec/batch\n", + "Epoch 5/20 Iteration 745/3560 Training loss: 1.6360 0.0609 sec/batch\n", + "Epoch 5/20 Iteration 746/3560 Training loss: 1.6366 0.0651 sec/batch\n", + "Epoch 5/20 Iteration 747/3560 Training loss: 1.6361 0.0620 sec/batch\n", + "Epoch 5/20 Iteration 748/3560 Training loss: 1.6355 0.0596 sec/batch\n", + "Epoch 5/20 Iteration 749/3560 Training loss: 1.6346 0.0614 sec/batch\n", + "Epoch 5/20 Iteration 750/3560 
Training loss: 1.6333 0.0688 sec/batch\n", + "Epoch 5/20 Iteration 751/3560 Training loss: 1.6319 0.0639 sec/batch\n", + "Epoch 5/20 Iteration 752/3560 Training loss: 1.6310 0.0725 sec/batch\n", + "Epoch 5/20 Iteration 753/3560 Training loss: 1.6302 0.0633 sec/batch\n", + "Epoch 5/20 Iteration 754/3560 Training loss: 1.6304 0.0640 sec/batch\n", + "Epoch 5/20 Iteration 755/3560 Training loss: 1.6294 0.0629 sec/batch\n", + "Epoch 5/20 Iteration 756/3560 Training loss: 1.6286 0.0620 sec/batch\n", + "Epoch 5/20 Iteration 757/3560 Training loss: 1.6287 0.0718 sec/batch\n", + "Epoch 5/20 Iteration 758/3560 Training loss: 1.6274 0.0604 sec/batch\n", + "Epoch 5/20 Iteration 759/3560 Training loss: 1.6269 0.0674 sec/batch\n", + "Epoch 5/20 Iteration 760/3560 Training loss: 1.6263 0.0640 sec/batch\n", + "Epoch 5/20 Iteration 761/3560 Training loss: 1.6258 0.0654 sec/batch\n", + "Epoch 5/20 Iteration 762/3560 Training loss: 1.6261 0.0598 sec/batch\n", + "Epoch 5/20 Iteration 763/3560 Training loss: 1.6254 0.0632 sec/batch\n", + "Epoch 5/20 Iteration 764/3560 Training loss: 1.6260 0.0609 sec/batch\n", + "Epoch 5/20 Iteration 765/3560 Training loss: 1.6258 0.0647 sec/batch\n", + "Epoch 5/20 Iteration 766/3560 Training loss: 1.6260 0.0602 sec/batch\n", + "Epoch 5/20 Iteration 767/3560 Training loss: 1.6255 0.0588 sec/batch\n", + "Epoch 5/20 Iteration 768/3560 Training loss: 1.6256 0.0675 sec/batch\n", + "Epoch 5/20 Iteration 769/3560 Training loss: 1.6258 0.0658 sec/batch\n", + "Epoch 5/20 Iteration 770/3560 Training loss: 1.6253 0.0608 sec/batch\n", + "Epoch 5/20 Iteration 771/3560 Training loss: 1.6246 0.0661 sec/batch\n", + "Epoch 5/20 Iteration 772/3560 Training loss: 1.6250 0.0678 sec/batch\n", + "Epoch 5/20 Iteration 773/3560 Training loss: 1.6247 0.0610 sec/batch\n", + "Epoch 5/20 Iteration 774/3560 Training loss: 1.6253 0.0740 sec/batch\n", + "Epoch 5/20 Iteration 775/3560 Training loss: 1.6256 0.0597 sec/batch\n", + "Epoch 5/20 Iteration 776/3560 Training loss: 1.6257 
0.0607 sec/batch\n", + "Epoch 5/20 Iteration 777/3560 Training loss: 1.6254 0.0608 sec/batch\n", + "Epoch 5/20 Iteration 778/3560 Training loss: 1.6255 0.0653 sec/batch\n", + "Epoch 5/20 Iteration 779/3560 Training loss: 1.6255 0.0617 sec/batch\n", + "Epoch 5/20 Iteration 780/3560 Training loss: 1.6250 0.0637 sec/batch\n", + "Epoch 5/20 Iteration 781/3560 Training loss: 1.6248 0.0592 sec/batch\n", + "Epoch 5/20 Iteration 782/3560 Training loss: 1.6245 0.0639 sec/batch\n", + "Epoch 5/20 Iteration 783/3560 Training loss: 1.6248 0.0610 sec/batch\n", + "Epoch 5/20 Iteration 784/3560 Training loss: 1.6247 0.0615 sec/batch\n", + "Epoch 5/20 Iteration 785/3560 Training loss: 1.6250 0.0607 sec/batch\n", + "Epoch 5/20 Iteration 786/3560 Training loss: 1.6246 0.0625 sec/batch\n", + "Epoch 5/20 Iteration 787/3560 Training loss: 1.6243 0.0640 sec/batch\n", + "Epoch 5/20 Iteration 788/3560 Training loss: 1.6243 0.0605 sec/batch\n", + "Epoch 5/20 Iteration 789/3560 Training loss: 1.6240 0.0682 sec/batch\n", + "Epoch 5/20 Iteration 790/3560 Training loss: 1.6237 0.0684 sec/batch\n", + "Epoch 5/20 Iteration 791/3560 Training loss: 1.6229 0.0627 sec/batch\n", + "Epoch 5/20 Iteration 792/3560 Training loss: 1.6227 0.0590 sec/batch\n", + "Epoch 5/20 Iteration 793/3560 Training loss: 1.6221 0.0741 sec/batch\n", + "Epoch 5/20 Iteration 794/3560 Training loss: 1.6219 0.0661 sec/batch\n", + "Epoch 5/20 Iteration 795/3560 Training loss: 1.6211 0.0650 sec/batch\n", + "Epoch 5/20 Iteration 796/3560 Training loss: 1.6210 0.0629 sec/batch\n", + "Epoch 5/20 Iteration 797/3560 Training loss: 1.6205 0.0616 sec/batch\n", + "Epoch 5/20 Iteration 798/3560 Training loss: 1.6201 0.0638 sec/batch\n", + "Epoch 5/20 Iteration 799/3560 Training loss: 1.6195 0.0602 sec/batch\n", + "Epoch 5/20 Iteration 800/3560 Training loss: 1.6191 0.0658 sec/batch\n", + "Epoch 5/20 Iteration 801/3560 Training loss: 1.6184 0.0786 sec/batch\n", + "Epoch 5/20 Iteration 802/3560 Training loss: 1.6183 0.0623 sec/batch\n", + 
"Epoch 5/20 Iteration 803/3560 Training loss: 1.6178 0.0682 sec/batch\n", + "Epoch 5/20 Iteration 804/3560 Training loss: 1.6175 0.0625 sec/batch\n", + "Epoch 5/20 Iteration 805/3560 Training loss: 1.6169 0.0723 sec/batch\n", + "Epoch 5/20 Iteration 806/3560 Training loss: 1.6163 0.0634 sec/batch\n", + "Epoch 5/20 Iteration 807/3560 Training loss: 1.6158 0.0637 sec/batch\n", + "Epoch 5/20 Iteration 808/3560 Training loss: 1.6156 0.0630 sec/batch\n", + "Epoch 5/20 Iteration 809/3560 Training loss: 1.6153 0.0613 sec/batch\n", + "Epoch 5/20 Iteration 810/3560 Training loss: 1.6147 0.0588 sec/batch\n", + "Epoch 5/20 Iteration 811/3560 Training loss: 1.6142 0.0601 sec/batch\n", + "Epoch 5/20 Iteration 812/3560 Training loss: 1.6135 0.0601 sec/batch\n", + "Epoch 5/20 Iteration 813/3560 Training loss: 1.6132 0.0704 sec/batch\n", + "Epoch 5/20 Iteration 814/3560 Training loss: 1.6128 0.0607 sec/batch\n", + "Epoch 5/20 Iteration 815/3560 Training loss: 1.6124 0.0700 sec/batch\n", + "Epoch 5/20 Iteration 816/3560 Training loss: 1.6120 0.0654 sec/batch\n", + "Epoch 5/20 Iteration 817/3560 Training loss: 1.6116 0.0603 sec/batch\n", + "Epoch 5/20 Iteration 818/3560 Training loss: 1.6112 0.0626 sec/batch\n", + "Epoch 5/20 Iteration 819/3560 Training loss: 1.6109 0.0616 sec/batch\n", + "Epoch 5/20 Iteration 820/3560 Training loss: 1.6106 0.0671 sec/batch\n", + "Epoch 5/20 Iteration 821/3560 Training loss: 1.6104 0.0608 sec/batch\n", + "Epoch 5/20 Iteration 822/3560 Training loss: 1.6101 0.0622 sec/batch\n", + "Epoch 5/20 Iteration 823/3560 Training loss: 1.6098 0.0616 sec/batch\n", + "Epoch 5/20 Iteration 824/3560 Training loss: 1.6095 0.0600 sec/batch\n", + "Epoch 5/20 Iteration 825/3560 Training loss: 1.6091 0.0596 sec/batch\n", + "Epoch 5/20 Iteration 826/3560 Training loss: 1.6087 0.0766 sec/batch\n", + "Epoch 5/20 Iteration 827/3560 Training loss: 1.6082 0.0625 sec/batch\n", + "Epoch 5/20 Iteration 828/3560 Training loss: 1.6076 0.0627 sec/batch\n", + "Epoch 5/20 Iteration 
829/3560 Training loss: 1.6074 0.0634 sec/batch\n", + "Epoch 5/20 Iteration 830/3560 Training loss: 1.6071 0.0635 sec/batch\n", + "Epoch 5/20 Iteration 831/3560 Training loss: 1.6068 0.0615 sec/batch\n", + "Epoch 5/20 Iteration 832/3560 Training loss: 1.6066 0.0680 sec/batch\n", + "Epoch 5/20 Iteration 833/3560 Training loss: 1.6064 0.0620 sec/batch\n", + "Epoch 5/20 Iteration 834/3560 Training loss: 1.6058 0.0709 sec/batch\n", + "Epoch 5/20 Iteration 835/3560 Training loss: 1.6051 0.0654 sec/batch\n", + "Epoch 5/20 Iteration 836/3560 Training loss: 1.6050 0.0617 sec/batch\n", + "Epoch 5/20 Iteration 837/3560 Training loss: 1.6048 0.0646 sec/batch\n", + "Epoch 5/20 Iteration 838/3560 Training loss: 1.6042 0.0604 sec/batch\n", + "Epoch 5/20 Iteration 839/3560 Training loss: 1.6041 0.0589 sec/batch\n", + "Epoch 5/20 Iteration 840/3560 Training loss: 1.6040 0.0597 sec/batch\n", + "Epoch 5/20 Iteration 841/3560 Training loss: 1.6036 0.0717 sec/batch\n", + "Epoch 5/20 Iteration 842/3560 Training loss: 1.6032 0.0650 sec/batch\n", + "Epoch 5/20 Iteration 843/3560 Training loss: 1.6025 0.0616 sec/batch\n", + "Epoch 5/20 Iteration 844/3560 Training loss: 1.6021 0.0663 sec/batch\n", + "Epoch 5/20 Iteration 845/3560 Training loss: 1.6020 0.0648 sec/batch\n", + "Epoch 5/20 Iteration 846/3560 Training loss: 1.6018 0.0598 sec/batch\n", + "Epoch 5/20 Iteration 847/3560 Training loss: 1.6016 0.0622 sec/batch\n", + "Epoch 5/20 Iteration 848/3560 Training loss: 1.6015 0.0665 sec/batch\n", + "Epoch 5/20 Iteration 849/3560 Training loss: 1.6014 0.0617 sec/batch\n", + "Epoch 5/20 Iteration 850/3560 Training loss: 1.6014 0.0618 sec/batch\n", + "Epoch 5/20 Iteration 851/3560 Training loss: 1.6013 0.0651 sec/batch\n", + "Epoch 5/20 Iteration 852/3560 Training loss: 1.6010 0.0611 sec/batch\n", + "Epoch 5/20 Iteration 853/3560 Training loss: 1.6011 0.0647 sec/batch\n", + "Epoch 5/20 Iteration 854/3560 Training loss: 1.6008 0.0647 sec/batch\n", + "Epoch 5/20 Iteration 855/3560 Training loss: 
1.6005 0.0669 sec/batch\n", + "Epoch 5/20 Iteration 856/3560 Training loss: 1.6005 0.0606 sec/batch\n", + "Epoch 5/20 Iteration 857/3560 Training loss: 1.6002 0.0614 sec/batch\n", + "Epoch 5/20 Iteration 858/3560 Training loss: 1.6001 0.0631 sec/batch\n", + "Epoch 5/20 Iteration 859/3560 Training loss: 1.5998 0.0599 sec/batch\n", + "Epoch 5/20 Iteration 860/3560 Training loss: 1.5999 0.0660 sec/batch\n", + "Epoch 5/20 Iteration 861/3560 Training loss: 1.5998 0.0595 sec/batch\n", + "Epoch 5/20 Iteration 862/3560 Training loss: 1.5994 0.0651 sec/batch\n", + "Epoch 5/20 Iteration 863/3560 Training loss: 1.5989 0.0612 sec/batch\n", + "Epoch 5/20 Iteration 864/3560 Training loss: 1.5987 0.0633 sec/batch\n", + "Epoch 5/20 Iteration 865/3560 Training loss: 1.5986 0.0641 sec/batch\n", + "Epoch 5/20 Iteration 866/3560 Training loss: 1.5984 0.0646 sec/batch\n", + "Epoch 5/20 Iteration 867/3560 Training loss: 1.5982 0.0616 sec/batch\n", + "Epoch 5/20 Iteration 868/3560 Training loss: 1.5980 0.0699 sec/batch\n", + "Epoch 5/20 Iteration 869/3560 Training loss: 1.5979 0.0640 sec/batch\n", + "Epoch 5/20 Iteration 870/3560 Training loss: 1.5977 0.0614 sec/batch\n", + "Epoch 5/20 Iteration 871/3560 Training loss: 1.5973 0.0629 sec/batch\n", + "Epoch 5/20 Iteration 872/3560 Training loss: 1.5972 0.0659 sec/batch\n", + "Epoch 5/20 Iteration 873/3560 Training loss: 1.5971 0.0596 sec/batch\n", + "Epoch 5/20 Iteration 874/3560 Training loss: 1.5969 0.0641 sec/batch\n", + "Epoch 5/20 Iteration 875/3560 Training loss: 1.5968 0.0594 sec/batch\n", + "Epoch 5/20 Iteration 876/3560 Training loss: 1.5966 0.0632 sec/batch\n", + "Epoch 5/20 Iteration 877/3560 Training loss: 1.5964 0.0646 sec/batch\n", + "Epoch 5/20 Iteration 878/3560 Training loss: 1.5961 0.0617 sec/batch\n", + "Epoch 5/20 Iteration 879/3560 Training loss: 1.5961 0.0625 sec/batch\n", + "Epoch 5/20 Iteration 880/3560 Training loss: 1.5964 0.0625 sec/batch\n", + "Epoch 5/20 Iteration 881/3560 Training loss: 1.5962 0.0615 
sec/batch\n", + "Epoch 5/20 Iteration 882/3560 Training loss: 1.5960 0.0624 sec/batch\n", + "Epoch 5/20 Iteration 883/3560 Training loss: 1.5957 0.0607 sec/batch\n", + "Epoch 5/20 Iteration 884/3560 Training loss: 1.5954 0.0664 sec/batch\n", + "Epoch 5/20 Iteration 885/3560 Training loss: 1.5954 0.0647 sec/batch\n", + "Epoch 5/20 Iteration 886/3560 Training loss: 1.5952 0.0677 sec/batch\n", + "Epoch 5/20 Iteration 887/3560 Training loss: 1.5951 0.0624 sec/batch\n", + "Epoch 5/20 Iteration 888/3560 Training loss: 1.5949 0.0636 sec/batch\n", + "Epoch 5/20 Iteration 889/3560 Training loss: 1.5946 0.0629 sec/batch\n", + "Epoch 5/20 Iteration 890/3560 Training loss: 1.5945 0.0608 sec/batch\n", + "Epoch 6/20 Iteration 891/3560 Training loss: 1.6345 0.0604 sec/batch\n", + "Epoch 6/20 Iteration 892/3560 Training loss: 1.6004 0.0623 sec/batch\n", + "Epoch 6/20 Iteration 893/3560 Training loss: 1.5851 0.0640 sec/batch\n", + "Epoch 6/20 Iteration 894/3560 Training loss: 1.5812 0.0608 sec/batch\n", + "Epoch 6/20 Iteration 895/3560 Training loss: 1.5749 0.0596 sec/batch\n", + "Epoch 6/20 Iteration 896/3560 Training loss: 1.5650 0.0619 sec/batch\n", + "Epoch 6/20 Iteration 897/3560 Training loss: 1.5646 0.0709 sec/batch\n", + "Epoch 6/20 Iteration 898/3560 Training loss: 1.5612 0.0639 sec/batch\n", + "Epoch 6/20 Iteration 899/3560 Training loss: 1.5635 0.0603 sec/batch\n", + "Epoch 6/20 Iteration 900/3560 Training loss: 1.5618 0.0656 sec/batch\n", + "Epoch 6/20 Iteration 901/3560 Training loss: 1.5579 0.0606 sec/batch\n", + "Epoch 6/20 Iteration 902/3560 Training loss: 1.5562 0.0644 sec/batch\n", + "Epoch 6/20 Iteration 903/3560 Training loss: 1.5552 0.0619 sec/batch\n", + "Epoch 6/20 Iteration 904/3560 Training loss: 1.5572 0.0602 sec/batch\n", + "Epoch 6/20 Iteration 905/3560 Training loss: 1.5562 0.0646 sec/batch\n", + "Epoch 6/20 Iteration 906/3560 Training loss: 1.5548 0.0596 sec/batch\n", + "Epoch 6/20 Iteration 907/3560 Training loss: 1.5550 0.0624 sec/batch\n", + "Epoch 
6/20 Iteration 908/3560 Training loss: 1.5563 0.0631 sec/batch\n", + "Epoch 6/20 Iteration 909/3560 Training loss: 1.5563 0.0711 sec/batch\n", + "Epoch 6/20 Iteration 910/3560 Training loss: 1.5576 0.0654 sec/batch\n", + "Epoch 6/20 Iteration 911/3560 Training loss: 1.5566 0.0652 sec/batch\n", + "Epoch 6/20 Iteration 912/3560 Training loss: 1.5566 0.0665 sec/batch\n", + "Epoch 6/20 Iteration 913/3560 Training loss: 1.5553 0.0648 sec/batch\n", + "Epoch 6/20 Iteration 914/3560 Training loss: 1.5542 0.0628 sec/batch\n", + "Epoch 6/20 Iteration 915/3560 Training loss: 1.5537 0.0613 sec/batch\n", + "Epoch 6/20 Iteration 916/3560 Training loss: 1.5520 0.0734 sec/batch\n", + "Epoch 6/20 Iteration 917/3560 Training loss: 1.5503 0.0612 sec/batch\n", + "Epoch 6/20 Iteration 918/3560 Training loss: 1.5507 0.0648 sec/batch\n", + "Epoch 6/20 Iteration 919/3560 Training loss: 1.5511 0.0644 sec/batch\n", + "Epoch 6/20 Iteration 920/3560 Training loss: 1.5512 0.0682 sec/batch\n", + "Epoch 6/20 Iteration 921/3560 Training loss: 1.5509 0.0597 sec/batch\n", + "Epoch 6/20 Iteration 922/3560 Training loss: 1.5499 0.0610 sec/batch\n", + "Epoch 6/20 Iteration 923/3560 Training loss: 1.5501 0.0614 sec/batch\n", + "Epoch 6/20 Iteration 924/3560 Training loss: 1.5501 0.0625 sec/batch\n", + "Epoch 6/20 Iteration 925/3560 Training loss: 1.5496 0.0688 sec/batch\n", + "Epoch 6/20 Iteration 926/3560 Training loss: 1.5491 0.0630 sec/batch\n", + "Epoch 6/20 Iteration 927/3560 Training loss: 1.5483 0.0603 sec/batch\n", + "Epoch 6/20 Iteration 928/3560 Training loss: 1.5468 0.0656 sec/batch\n", + "Epoch 6/20 Iteration 929/3560 Training loss: 1.5453 0.0688 sec/batch\n", + "Epoch 6/20 Iteration 930/3560 Training loss: 1.5445 0.0642 sec/batch\n", + "Epoch 6/20 Iteration 931/3560 Training loss: 1.5438 0.0635 sec/batch\n", + "Epoch 6/20 Iteration 932/3560 Training loss: 1.5442 0.0588 sec/batch\n", + "Epoch 6/20 Iteration 933/3560 Training loss: 1.5436 0.0612 sec/batch\n", + "Epoch 6/20 Iteration 934/3560 
Training loss: 1.5427 0.0617 sec/batch\n", + "Epoch 6/20 Iteration 935/3560 Training loss: 1.5428 0.0608 sec/batch\n", + "Epoch 6/20 Iteration 936/3560 Training loss: 1.5417 0.0675 sec/batch\n", + "Epoch 6/20 Iteration 937/3560 Training loss: 1.5409 0.0646 sec/batch\n", + "Epoch 6/20 Iteration 938/3560 Training loss: 1.5404 0.0664 sec/batch\n", + "Epoch 6/20 Iteration 939/3560 Training loss: 1.5400 0.0651 sec/batch\n", + "Epoch 6/20 Iteration 940/3560 Training loss: 1.5403 0.0711 sec/batch\n", + "Epoch 6/20 Iteration 941/3560 Training loss: 1.5395 0.0625 sec/batch\n", + "Epoch 6/20 Iteration 942/3560 Training loss: 1.5401 0.0608 sec/batch\n", + "Epoch 6/20 Iteration 943/3560 Training loss: 1.5398 0.0592 sec/batch\n", + "Epoch 6/20 Iteration 944/3560 Training loss: 1.5399 0.0613 sec/batch\n", + "Epoch 6/20 Iteration 945/3560 Training loss: 1.5394 0.0656 sec/batch\n", + "Epoch 6/20 Iteration 946/3560 Training loss: 1.5392 0.0618 sec/batch\n", + "Epoch 6/20 Iteration 947/3560 Training loss: 1.5394 0.0636 sec/batch\n", + "Epoch 6/20 Iteration 948/3560 Training loss: 1.5388 0.0689 sec/batch\n", + "Epoch 6/20 Iteration 949/3560 Training loss: 1.5379 0.0668 sec/batch\n", + "Epoch 6/20 Iteration 950/3560 Training loss: 1.5381 0.0634 sec/batch\n", + "Epoch 6/20 Iteration 951/3560 Training loss: 1.5378 0.0658 sec/batch\n", + "Epoch 6/20 Iteration 952/3560 Training loss: 1.5384 0.0625 sec/batch\n", + "Epoch 6/20 Iteration 953/3560 Training loss: 1.5385 0.0680 sec/batch\n", + "Epoch 6/20 Iteration 954/3560 Training loss: 1.5385 0.0664 sec/batch\n", + "Epoch 6/20 Iteration 955/3560 Training loss: 1.5381 0.0630 sec/batch\n", + "Epoch 6/20 Iteration 956/3560 Training loss: 1.5381 0.0724 sec/batch\n", + "Epoch 6/20 Iteration 957/3560 Training loss: 1.5382 0.0626 sec/batch\n", + "Epoch 6/20 Iteration 958/3560 Training loss: 1.5377 0.0620 sec/batch\n", + "Epoch 6/20 Iteration 959/3560 Training loss: 1.5377 0.0625 sec/batch\n", + "Epoch 6/20 Iteration 960/3560 Training loss: 1.5375 
0.0628 sec/batch\n", + "Epoch 6/20 Iteration 961/3560 Training loss: 1.5379 0.0625 sec/batch\n", + "Epoch 6/20 Iteration 962/3560 Training loss: 1.5379 0.0652 sec/batch\n", + "Epoch 6/20 Iteration 963/3560 Training loss: 1.5383 0.0634 sec/batch\n", + "Epoch 6/20 Iteration 964/3560 Training loss: 1.5379 0.0614 sec/batch\n", + "Epoch 6/20 Iteration 965/3560 Training loss: 1.5377 0.0611 sec/batch\n", + "Epoch 6/20 Iteration 966/3560 Training loss: 1.5378 0.0598 sec/batch\n", + "Epoch 6/20 Iteration 967/3560 Training loss: 1.5375 0.0683 sec/batch\n", + "Epoch 6/20 Iteration 968/3560 Training loss: 1.5374 0.0632 sec/batch\n", + "Epoch 6/20 Iteration 969/3560 Training loss: 1.5367 0.0670 sec/batch\n", + "Epoch 6/20 Iteration 970/3560 Training loss: 1.5364 0.0613 sec/batch\n", + "Epoch 6/20 Iteration 971/3560 Training loss: 1.5359 0.0610 sec/batch\n", + "Epoch 6/20 Iteration 972/3560 Training loss: 1.5357 0.0683 sec/batch\n", + "Epoch 6/20 Iteration 973/3560 Training loss: 1.5349 0.0686 sec/batch\n", + "Epoch 6/20 Iteration 974/3560 Training loss: 1.5348 0.0634 sec/batch\n", + "Epoch 6/20 Iteration 975/3560 Training loss: 1.5343 0.0592 sec/batch\n", + "Epoch 6/20 Iteration 976/3560 Training loss: 1.5337 0.0647 sec/batch\n", + "Epoch 6/20 Iteration 977/3560 Training loss: 1.5333 0.0654 sec/batch\n", + "Epoch 6/20 Iteration 978/3560 Training loss: 1.5329 0.0694 sec/batch\n", + "Epoch 6/20 Iteration 979/3560 Training loss: 1.5322 0.0657 sec/batch\n", + "Epoch 6/20 Iteration 980/3560 Training loss: 1.5320 0.0650 sec/batch\n", + "Epoch 6/20 Iteration 981/3560 Training loss: 1.5316 0.0637 sec/batch\n", + "Epoch 6/20 Iteration 982/3560 Training loss: 1.5314 0.0641 sec/batch\n", + "Epoch 6/20 Iteration 983/3560 Training loss: 1.5307 0.0597 sec/batch\n", + "Epoch 6/20 Iteration 984/3560 Training loss: 1.5302 0.0606 sec/batch\n", + "Epoch 6/20 Iteration 985/3560 Training loss: 1.5297 0.0616 sec/batch\n", + "Epoch 6/20 Iteration 986/3560 Training loss: 1.5296 0.0613 sec/batch\n", + 
"Epoch 6/20 Iteration 987/3560 Training loss: 1.5294 0.0640 sec/batch\n", + "Epoch 6/20 Iteration 988/3560 Training loss: 1.5288 0.0728 sec/batch\n", + "Epoch 6/20 Iteration 989/3560 Training loss: 1.5283 0.0608 sec/batch\n", + "Epoch 6/20 Iteration 990/3560 Training loss: 1.5277 0.0606 sec/batch\n", + "Epoch 6/20 Iteration 991/3560 Training loss: 1.5276 0.0662 sec/batch\n", + "Epoch 6/20 Iteration 992/3560 Training loss: 1.5273 0.0636 sec/batch\n", + "Epoch 6/20 Iteration 993/3560 Training loss: 1.5270 0.0609 sec/batch\n", + "Epoch 6/20 Iteration 994/3560 Training loss: 1.5268 0.0596 sec/batch\n", + "Epoch 6/20 Iteration 995/3560 Training loss: 1.5265 0.0609 sec/batch\n", + "Epoch 6/20 Iteration 996/3560 Training loss: 1.5262 0.0605 sec/batch\n", + "Epoch 6/20 Iteration 997/3560 Training loss: 1.5260 0.0608 sec/batch\n", + "Epoch 6/20 Iteration 998/3560 Training loss: 1.5258 0.0594 sec/batch\n", + "Epoch 6/20 Iteration 999/3560 Training loss: 1.5254 0.0603 sec/batch\n", + "Epoch 6/20 Iteration 1000/3560 Training loss: 1.5252 0.0592 sec/batch\n", + "Epoch 6/20 Iteration 1001/3560 Training loss: 1.5248 0.0619 sec/batch\n", + "Epoch 6/20 Iteration 1002/3560 Training loss: 1.5245 0.0613 sec/batch\n", + "Epoch 6/20 Iteration 1003/3560 Training loss: 1.5240 0.0588 sec/batch\n", + "Epoch 6/20 Iteration 1004/3560 Training loss: 1.5237 0.0624 sec/batch\n", + "Epoch 6/20 Iteration 1005/3560 Training loss: 1.5232 0.0737 sec/batch\n", + "Epoch 6/20 Iteration 1006/3560 Training loss: 1.5227 0.0647 sec/batch\n", + "Epoch 6/20 Iteration 1007/3560 Training loss: 1.5225 0.0638 sec/batch\n", + "Epoch 6/20 Iteration 1008/3560 Training loss: 1.5223 0.0607 sec/batch\n", + "Epoch 6/20 Iteration 1009/3560 Training loss: 1.5220 0.0631 sec/batch\n", + "Epoch 6/20 Iteration 1010/3560 Training loss: 1.5218 0.0602 sec/batch\n", + "Epoch 6/20 Iteration 1011/3560 Training loss: 1.5216 0.0646 sec/batch\n", + "Epoch 6/20 Iteration 1012/3560 Training loss: 1.5210 0.0616 sec/batch\n", + "Epoch 
6/20 Iteration 1013/3560 Training loss: 1.5205 0.0617 sec/batch\n", + "Epoch 6/20 Iteration 1014/3560 Training loss: 1.5205 0.0618 sec/batch\n", + "Epoch 6/20 Iteration 1015/3560 Training loss: 1.5203 0.0597 sec/batch\n", + "Epoch 6/20 Iteration 1016/3560 Training loss: 1.5198 0.0614 sec/batch\n", + "Epoch 6/20 Iteration 1017/3560 Training loss: 1.5198 0.0619 sec/batch\n", + "Epoch 6/20 Iteration 1018/3560 Training loss: 1.5197 0.0609 sec/batch\n", + "Epoch 6/20 Iteration 1019/3560 Training loss: 1.5194 0.0703 sec/batch\n", + "Epoch 6/20 Iteration 1020/3560 Training loss: 1.5190 0.0665 sec/batch\n", + "Epoch 6/20 Iteration 1021/3560 Training loss: 1.5185 0.0684 sec/batch\n", + "Epoch 6/20 Iteration 1022/3560 Training loss: 1.5181 0.0610 sec/batch\n", + "Epoch 6/20 Iteration 1023/3560 Training loss: 1.5181 0.0620 sec/batch\n", + "Epoch 6/20 Iteration 1024/3560 Training loss: 1.5179 0.0617 sec/batch\n", + "Epoch 6/20 Iteration 1025/3560 Training loss: 1.5178 0.0641 sec/batch\n", + "Epoch 6/20 Iteration 1026/3560 Training loss: 1.5177 0.0682 sec/batch\n", + "Epoch 6/20 Iteration 1027/3560 Training loss: 1.5177 0.0632 sec/batch\n", + "Epoch 6/20 Iteration 1028/3560 Training loss: 1.5176 0.0618 sec/batch\n", + "Epoch 6/20 Iteration 1029/3560 Training loss: 1.5175 0.0624 sec/batch\n", + "Epoch 6/20 Iteration 1030/3560 Training loss: 1.5172 0.0654 sec/batch\n", + "Epoch 6/20 Iteration 1031/3560 Training loss: 1.5173 0.0618 sec/batch\n", + "Epoch 6/20 Iteration 1032/3560 Training loss: 1.5170 0.0638 sec/batch\n", + "Epoch 6/20 Iteration 1033/3560 Training loss: 1.5167 0.0611 sec/batch\n", + "Epoch 6/20 Iteration 1034/3560 Training loss: 1.5167 0.0724 sec/batch\n", + "Epoch 6/20 Iteration 1035/3560 Training loss: 1.5164 0.0631 sec/batch\n", + "Epoch 6/20 Iteration 1036/3560 Training loss: 1.5163 0.0594 sec/batch\n", + "Epoch 6/20 Iteration 1037/3560 Training loss: 1.5161 0.0691 sec/batch\n", + "Epoch 6/20 Iteration 1038/3560 Training loss: 1.5162 0.0678 sec/batch\n", + 
"Epoch 6/20 Iteration 1039/3560 Training loss: 1.5161 0.0621 sec/batch\n", + "Epoch 6/20 Iteration 1040/3560 Training loss: 1.5158 0.0596 sec/batch\n", + "Epoch 6/20 Iteration 1041/3560 Training loss: 1.5153 0.0627 sec/batch\n", + "Epoch 6/20 Iteration 1042/3560 Training loss: 1.5150 0.0624 sec/batch\n", + "Epoch 6/20 Iteration 1043/3560 Training loss: 1.5149 0.0613 sec/batch\n", + "Epoch 6/20 Iteration 1044/3560 Training loss: 1.5147 0.0607 sec/batch\n", + "Epoch 6/20 Iteration 1045/3560 Training loss: 1.5145 0.0595 sec/batch\n", + "Epoch 6/20 Iteration 1046/3560 Training loss: 1.5143 0.0599 sec/batch\n", + "Epoch 6/20 Iteration 1047/3560 Training loss: 1.5142 0.0599 sec/batch\n", + "Epoch 6/20 Iteration 1048/3560 Training loss: 1.5139 0.0599 sec/batch\n", + "Epoch 6/20 Iteration 1049/3560 Training loss: 1.5135 0.0609 sec/batch\n", + "Epoch 6/20 Iteration 1050/3560 Training loss: 1.5135 0.0619 sec/batch\n", + "Epoch 6/20 Iteration 1051/3560 Training loss: 1.5135 0.0640 sec/batch\n", + "Epoch 6/20 Iteration 1052/3560 Training loss: 1.5132 0.0627 sec/batch\n", + "Epoch 6/20 Iteration 1053/3560 Training loss: 1.5131 0.0612 sec/batch\n", + "Epoch 6/20 Iteration 1054/3560 Training loss: 1.5129 0.0650 sec/batch\n", + "Epoch 6/20 Iteration 1055/3560 Training loss: 1.5128 0.0652 sec/batch\n", + "Epoch 6/20 Iteration 1056/3560 Training loss: 1.5125 0.0656 sec/batch\n", + "Epoch 6/20 Iteration 1057/3560 Training loss: 1.5125 0.0663 sec/batch\n", + "Epoch 6/20 Iteration 1058/3560 Training loss: 1.5128 0.0636 sec/batch\n", + "Epoch 6/20 Iteration 1059/3560 Training loss: 1.5126 0.0649 sec/batch\n", + "Epoch 6/20 Iteration 1060/3560 Training loss: 1.5124 0.0620 sec/batch\n", + "Epoch 6/20 Iteration 1061/3560 Training loss: 1.5121 0.0628 sec/batch\n", + "Epoch 6/20 Iteration 1062/3560 Training loss: 1.5118 0.0743 sec/batch\n", + "Epoch 6/20 Iteration 1063/3560 Training loss: 1.5117 0.0615 sec/batch\n", + "Epoch 6/20 Iteration 1064/3560 Training loss: 1.5116 0.0636 sec/batch\n", 
+ "Epoch 6/20 Iteration 1065/3560 Training loss: 1.5115 0.0625 sec/batch\n", + "Epoch 6/20 Iteration 1066/3560 Training loss: 1.5112 0.0642 sec/batch\n", + "Epoch 6/20 Iteration 1067/3560 Training loss: 1.5109 0.0637 sec/batch\n", + "Epoch 6/20 Iteration 1068/3560 Training loss: 1.5109 0.0652 sec/batch\n", + "Epoch 7/20 Iteration 1069/3560 Training loss: 1.5869 0.0652 sec/batch\n", + "Epoch 7/20 Iteration 1070/3560 Training loss: 1.5410 0.0640 sec/batch\n", + "Epoch 7/20 Iteration 1071/3560 Training loss: 1.5290 0.0641 sec/batch\n", + "Epoch 7/20 Iteration 1072/3560 Training loss: 1.5223 0.0644 sec/batch\n", + "Epoch 7/20 Iteration 1073/3560 Training loss: 1.5124 0.0638 sec/batch\n", + "Epoch 7/20 Iteration 1074/3560 Training loss: 1.5005 0.0663 sec/batch\n", + "Epoch 7/20 Iteration 1075/3560 Training loss: 1.4991 0.0610 sec/batch\n", + "Epoch 7/20 Iteration 1076/3560 Training loss: 1.4959 0.0628 sec/batch\n", + "Epoch 7/20 Iteration 1077/3560 Training loss: 1.4958 0.0600 sec/batch\n", + "Epoch 7/20 Iteration 1078/3560 Training loss: 1.4941 0.0733 sec/batch\n", + "Epoch 7/20 Iteration 1079/3560 Training loss: 1.4901 0.0593 sec/batch\n", + "Epoch 7/20 Iteration 1080/3560 Training loss: 1.4876 0.0655 sec/batch\n", + "Epoch 7/20 Iteration 1081/3560 Training loss: 1.4866 0.0671 sec/batch\n", + "Epoch 7/20 Iteration 1082/3560 Training loss: 1.4870 0.0659 sec/batch\n", + "Epoch 7/20 Iteration 1083/3560 Training loss: 1.4851 0.0604 sec/batch\n", + "Epoch 7/20 Iteration 1084/3560 Training loss: 1.4825 0.0628 sec/batch\n", + "Epoch 7/20 Iteration 1085/3560 Training loss: 1.4823 0.0623 sec/batch\n", + "Epoch 7/20 Iteration 1086/3560 Training loss: 1.4830 0.0619 sec/batch\n", + "Epoch 7/20 Iteration 1087/3560 Training loss: 1.4824 0.0632 sec/batch\n", + "Epoch 7/20 Iteration 1088/3560 Training loss: 1.4831 0.0689 sec/batch\n", + "Epoch 7/20 Iteration 1089/3560 Training loss: 1.4819 0.0655 sec/batch\n", + "Epoch 7/20 Iteration 1090/3560 Training loss: 1.4816 0.0616 
sec/batch\n", + "Epoch 7/20 Iteration 1091/3560 Training loss: 1.4801 0.0594 sec/batch\n", + "Epoch 7/20 Iteration 1092/3560 Training loss: 1.4792 0.0629 sec/batch\n", + "Epoch 7/20 Iteration 1093/3560 Training loss: 1.4788 0.0636 sec/batch\n", + "Epoch 7/20 Iteration 1094/3560 Training loss: 1.4771 0.0591 sec/batch\n", + "Epoch 7/20 Iteration 1095/3560 Training loss: 1.4755 0.0615 sec/batch\n", + "Epoch 7/20 Iteration 1096/3560 Training loss: 1.4757 0.0624 sec/batch\n", + "Epoch 7/20 Iteration 1097/3560 Training loss: 1.4757 0.0650 sec/batch\n", + "Epoch 7/20 Iteration 1098/3560 Training loss: 1.4757 0.0650 sec/batch\n", + "Epoch 7/20 Iteration 1099/3560 Training loss: 1.4751 0.0620 sec/batch\n", + "Epoch 7/20 Iteration 1100/3560 Training loss: 1.4740 0.0636 sec/batch\n", + "Epoch 7/20 Iteration 1101/3560 Training loss: 1.4739 0.0606 sec/batch\n", + "Epoch 7/20 Iteration 1102/3560 Training loss: 1.4742 0.0614 sec/batch\n", + "Epoch 7/20 Iteration 1103/3560 Training loss: 1.4737 0.0602 sec/batch\n", + "Epoch 7/20 Iteration 1104/3560 Training loss: 1.4733 0.0629 sec/batch\n", + "Epoch 7/20 Iteration 1105/3560 Training loss: 1.4724 0.0638 sec/batch\n", + "Epoch 7/20 Iteration 1106/3560 Training loss: 1.4711 0.0616 sec/batch\n", + "Epoch 7/20 Iteration 1107/3560 Training loss: 1.4695 0.0678 sec/batch\n", + "Epoch 7/20 Iteration 1108/3560 Training loss: 1.4689 0.0604 sec/batch\n", + "Epoch 7/20 Iteration 1109/3560 Training loss: 1.4684 0.0630 sec/batch\n", + "Epoch 7/20 Iteration 1110/3560 Training loss: 1.4689 0.0665 sec/batch\n", + "Epoch 7/20 Iteration 1111/3560 Training loss: 1.4682 0.0676 sec/batch\n", + "Epoch 7/20 Iteration 1112/3560 Training loss: 1.4671 0.0625 sec/batch\n", + "Epoch 7/20 Iteration 1113/3560 Training loss: 1.4669 0.0622 sec/batch\n", + "Epoch 7/20 Iteration 1114/3560 Training loss: 1.4656 0.0638 sec/batch\n", + "Epoch 7/20 Iteration 1115/3560 Training loss: 1.4649 0.0614 sec/batch\n", + "Epoch 7/20 Iteration 1116/3560 Training loss: 1.4641 
0.0630 sec/batch\n", + "Epoch 7/20 Iteration 1117/3560 Training loss: 1.4636 0.0600 sec/batch\n", + "Epoch 7/20 Iteration 1118/3560 Training loss: 1.4638 0.0660 sec/batch\n", + "Epoch 7/20 Iteration 1119/3560 Training loss: 1.4630 0.0646 sec/batch\n", + "Epoch 7/20 Iteration 1120/3560 Training loss: 1.4634 0.0599 sec/batch\n", + "Epoch 7/20 Iteration 1121/3560 Training loss: 1.4631 0.0678 sec/batch\n", + "Epoch 7/20 Iteration 1122/3560 Training loss: 1.4631 0.0614 sec/batch\n", + "Epoch 7/20 Iteration 1123/3560 Training loss: 1.4625 0.0625 sec/batch\n", + "Epoch 7/20 Iteration 1124/3560 Training loss: 1.4624 0.0626 sec/batch\n", + "Epoch 7/20 Iteration 1125/3560 Training loss: 1.4627 0.0646 sec/batch\n", + "Epoch 7/20 Iteration 1126/3560 Training loss: 1.4623 0.0620 sec/batch\n", + "Epoch 7/20 Iteration 1127/3560 Training loss: 1.4615 0.0651 sec/batch\n", + "Epoch 7/20 Iteration 1128/3560 Training loss: 1.4619 0.0596 sec/batch\n", + "Epoch 7/20 Iteration 1129/3560 Training loss: 1.4617 0.0606 sec/batch\n", + "Epoch 7/20 Iteration 1130/3560 Training loss: 1.4625 0.0667 sec/batch\n", + "Epoch 7/20 Iteration 1131/3560 Training loss: 1.4627 0.0624 sec/batch\n", + "Epoch 7/20 Iteration 1132/3560 Training loss: 1.4626 0.0632 sec/batch\n", + "Epoch 7/20 Iteration 1133/3560 Training loss: 1.4624 0.0731 sec/batch\n", + "Epoch 7/20 Iteration 1134/3560 Training loss: 1.4626 0.0643 sec/batch\n", + "Epoch 7/20 Iteration 1135/3560 Training loss: 1.4626 0.0652 sec/batch\n", + "Epoch 7/20 Iteration 1136/3560 Training loss: 1.4623 0.0702 sec/batch\n", + "Epoch 7/20 Iteration 1137/3560 Training loss: 1.4622 0.0601 sec/batch\n", + "Epoch 7/20 Iteration 1138/3560 Training loss: 1.4621 0.0651 sec/batch\n", + "Epoch 7/20 Iteration 1139/3560 Training loss: 1.4625 0.0624 sec/batch\n", + "Epoch 7/20 Iteration 1140/3560 Training loss: 1.4626 0.0649 sec/batch\n", + "Epoch 7/20 Iteration 1141/3560 Training loss: 1.4630 0.0627 sec/batch\n", + "Epoch 7/20 Iteration 1142/3560 Training loss: 
1.4626 0.0611 sec/batch\n", + "Epoch 7/20 Iteration 1143/3560 Training loss: 1.4623 0.0635 sec/batch\n", + "Epoch 7/20 Iteration 1144/3560 Training loss: 1.4625 0.0602 sec/batch\n", + "Epoch 7/20 Iteration 1145/3560 Training loss: 1.4622 0.0628 sec/batch\n", + "Epoch 7/20 Iteration 1146/3560 Training loss: 1.4620 0.0650 sec/batch\n", + "Epoch 7/20 Iteration 1147/3560 Training loss: 1.4612 0.0638 sec/batch\n", + "Epoch 7/20 Iteration 1148/3560 Training loss: 1.4609 0.0640 sec/batch\n", + "Epoch 7/20 Iteration 1149/3560 Training loss: 1.4604 0.0701 sec/batch\n", + "Epoch 7/20 Iteration 1150/3560 Training loss: 1.4603 0.0649 sec/batch\n", + "Epoch 7/20 Iteration 1151/3560 Training loss: 1.4597 0.0637 sec/batch\n", + "Epoch 7/20 Iteration 1152/3560 Training loss: 1.4595 0.0628 sec/batch\n", + "Epoch 7/20 Iteration 1153/3560 Training loss: 1.4591 0.0611 sec/batch\n", + "Epoch 7/20 Iteration 1154/3560 Training loss: 1.4588 0.0599 sec/batch\n", + "Epoch 7/20 Iteration 1155/3560 Training loss: 1.4584 0.0641 sec/batch\n", + "Epoch 7/20 Iteration 1156/3560 Training loss: 1.4579 0.0614 sec/batch\n", + "Epoch 7/20 Iteration 1157/3560 Training loss: 1.4573 0.0595 sec/batch\n", + "Epoch 7/20 Iteration 1158/3560 Training loss: 1.4572 0.0626 sec/batch\n", + "Epoch 7/20 Iteration 1159/3560 Training loss: 1.4569 0.0624 sec/batch\n", + "Epoch 7/20 Iteration 1160/3560 Training loss: 1.4566 0.0608 sec/batch\n", + "Epoch 7/20 Iteration 1161/3560 Training loss: 1.4560 0.0627 sec/batch\n", + "Epoch 7/20 Iteration 1162/3560 Training loss: 1.4554 0.0646 sec/batch\n", + "Epoch 7/20 Iteration 1163/3560 Training loss: 1.4549 0.0607 sec/batch\n", + "Epoch 7/20 Iteration 1164/3560 Training loss: 1.4548 0.0619 sec/batch\n", + "Epoch 7/20 Iteration 1165/3560 Training loss: 1.4546 0.0634 sec/batch\n", + "Epoch 7/20 Iteration 1166/3560 Training loss: 1.4541 0.0652 sec/batch\n", + "Epoch 7/20 Iteration 1167/3560 Training loss: 1.4536 0.0680 sec/batch\n", + "Epoch 7/20 Iteration 1168/3560 Training 
loss: 1.4529 0.0627 sec/batch\n", + "Epoch 7/20 Iteration 1169/3560 Training loss: 1.4527 0.0617 sec/batch\n", + "Epoch 7/20 Iteration 1170/3560 Training loss: 1.4525 0.0703 sec/batch\n", + "Epoch 7/20 Iteration 1171/3560 Training loss: 1.4522 0.0623 sec/batch\n", + "Epoch 7/20 Iteration 1172/3560 Training loss: 1.4519 0.0619 sec/batch\n", + "Epoch 7/20 Iteration 1173/3560 Training loss: 1.4517 0.0651 sec/batch\n", + "Epoch 7/20 Iteration 1174/3560 Training loss: 1.4514 0.0706 sec/batch\n", + "Epoch 7/20 Iteration 1175/3560 Training loss: 1.4511 0.0688 sec/batch\n", + "Epoch 7/20 Iteration 1176/3560 Training loss: 1.4509 0.0737 sec/batch\n", + "Epoch 7/20 Iteration 1177/3560 Training loss: 1.4506 0.0621 sec/batch\n", + "Epoch 7/20 Iteration 1178/3560 Training loss: 1.4506 0.0689 sec/batch\n", + "Epoch 7/20 Iteration 1179/3560 Training loss: 1.4502 0.0609 sec/batch\n", + "Epoch 7/20 Iteration 1180/3560 Training loss: 1.4500 0.0660 sec/batch\n", + "Epoch 7/20 Iteration 1181/3560 Training loss: 1.4497 0.0673 sec/batch\n", + "Epoch 7/20 Iteration 1182/3560 Training loss: 1.4494 0.0607 sec/batch\n", + "Epoch 7/20 Iteration 1183/3560 Training loss: 1.4491 0.0675 sec/batch\n", + "Epoch 7/20 Iteration 1184/3560 Training loss: 1.4486 0.0632 sec/batch\n", + "Epoch 7/20 Iteration 1185/3560 Training loss: 1.4484 0.0652 sec/batch\n", + "Epoch 7/20 Iteration 1186/3560 Training loss: 1.4483 0.0618 sec/batch\n", + "Epoch 7/20 Iteration 1187/3560 Training loss: 1.4481 0.0625 sec/batch\n", + "Epoch 7/20 Iteration 1188/3560 Training loss: 1.4479 0.0719 sec/batch\n", + "Epoch 7/20 Iteration 1189/3560 Training loss: 1.4478 0.0656 sec/batch\n", + "Epoch 7/20 Iteration 1190/3560 Training loss: 1.4475 0.0629 sec/batch\n", + "Epoch 7/20 Iteration 1191/3560 Training loss: 1.4470 0.0612 sec/batch\n", + "Epoch 7/20 Iteration 1192/3560 Training loss: 1.4470 0.0596 sec/batch\n", + "Epoch 7/20 Iteration 1193/3560 Training loss: 1.4469 0.0658 sec/batch\n", + "Epoch 7/20 Iteration 1194/3560 
Training loss: 1.4464 0.0640 sec/batch\n", + "Epoch 7/20 Iteration 1195/3560 Training loss: 1.4463 0.0678 sec/batch\n", + "Epoch 7/20 Iteration 1196/3560 Training loss: 1.4463 0.0669 sec/batch\n", + "Epoch 7/20 Iteration 1197/3560 Training loss: 1.4460 0.0603 sec/batch\n", + "Epoch 7/20 Iteration 1198/3560 Training loss: 1.4456 0.0665 sec/batch\n", + "Epoch 7/20 Iteration 1199/3560 Training loss: 1.4450 0.0610 sec/batch\n", + "Epoch 7/20 Iteration 1200/3560 Training loss: 1.4446 0.0685 sec/batch\n", + "Epoch 7/20 Iteration 1201/3560 Training loss: 1.4446 0.0624 sec/batch\n", + "Epoch 7/20 Iteration 1202/3560 Training loss: 1.4446 0.0623 sec/batch\n", + "Epoch 7/20 Iteration 1203/3560 Training loss: 1.4445 0.0656 sec/batch\n", + "Epoch 7/20 Iteration 1204/3560 Training loss: 1.4444 0.0637 sec/batch\n", + "Epoch 7/20 Iteration 1205/3560 Training loss: 1.4445 0.0620 sec/batch\n", + "Epoch 7/20 Iteration 1206/3560 Training loss: 1.4444 0.0638 sec/batch\n", + "Epoch 7/20 Iteration 1207/3560 Training loss: 1.4443 0.0607 sec/batch\n", + "Epoch 7/20 Iteration 1208/3560 Training loss: 1.4442 0.0716 sec/batch\n", + "Epoch 7/20 Iteration 1209/3560 Training loss: 1.4444 0.0642 sec/batch\n", + "Epoch 7/20 Iteration 1210/3560 Training loss: 1.4443 0.0624 sec/batch\n", + "Epoch 7/20 Iteration 1211/3560 Training loss: 1.4441 0.0612 sec/batch\n", + "Epoch 7/20 Iteration 1212/3560 Training loss: 1.4441 0.0650 sec/batch\n", + "Epoch 7/20 Iteration 1213/3560 Training loss: 1.4439 0.0674 sec/batch\n", + "Epoch 7/20 Iteration 1214/3560 Training loss: 1.4440 0.0626 sec/batch\n", + "Epoch 7/20 Iteration 1215/3560 Training loss: 1.4439 0.0660 sec/batch\n", + "Epoch 7/20 Iteration 1216/3560 Training loss: 1.4441 0.0622 sec/batch\n", + "Epoch 7/20 Iteration 1217/3560 Training loss: 1.4441 0.0639 sec/batch\n", + "Epoch 7/20 Iteration 1218/3560 Training loss: 1.4438 0.0623 sec/batch\n", + "Epoch 7/20 Iteration 1219/3560 Training loss: 1.4434 0.0626 sec/batch\n", + "Epoch 7/20 Iteration 
1220/3560 Training loss: 1.4432 0.0635 sec/batch\n", + "Epoch 7/20 Iteration 1221/3560 Training loss: 1.4432 0.0602 sec/batch\n", + "Epoch 7/20 Iteration 1222/3560 Training loss: 1.4431 0.0644 sec/batch\n", + "Epoch 7/20 Iteration 1223/3560 Training loss: 1.4430 0.0618 sec/batch\n", + "Epoch 7/20 Iteration 1224/3560 Training loss: 1.4429 0.0596 sec/batch\n", + "Epoch 7/20 Iteration 1225/3560 Training loss: 1.4428 0.0612 sec/batch\n", + "Epoch 7/20 Iteration 1226/3560 Training loss: 1.4427 0.0662 sec/batch\n", + "Epoch 7/20 Iteration 1227/3560 Training loss: 1.4423 0.0665 sec/batch\n", + "Epoch 7/20 Iteration 1228/3560 Training loss: 1.4423 0.0624 sec/batch\n", + "Epoch 7/20 Iteration 1229/3560 Training loss: 1.4423 0.0632 sec/batch\n", + "Epoch 7/20 Iteration 1230/3560 Training loss: 1.4421 0.0702 sec/batch\n", + "Epoch 7/20 Iteration 1231/3560 Training loss: 1.4420 0.0750 sec/batch\n", + "Epoch 7/20 Iteration 1232/3560 Training loss: 1.4418 0.0713 sec/batch\n", + "Epoch 7/20 Iteration 1233/3560 Training loss: 1.4418 0.0612 sec/batch\n", + "Epoch 7/20 Iteration 1234/3560 Training loss: 1.4416 0.0614 sec/batch\n", + "Epoch 7/20 Iteration 1235/3560 Training loss: 1.4417 0.0662 sec/batch\n", + "Epoch 7/20 Iteration 1236/3560 Training loss: 1.4420 0.0761 sec/batch\n", + "Epoch 7/20 Iteration 1237/3560 Training loss: 1.4419 0.0653 sec/batch\n", + "Epoch 7/20 Iteration 1238/3560 Training loss: 1.4418 0.0694 sec/batch\n", + "Epoch 7/20 Iteration 1239/3560 Training loss: 1.4415 0.0636 sec/batch\n", + "Epoch 7/20 Iteration 1240/3560 Training loss: 1.4413 0.0666 sec/batch\n", + "Epoch 7/20 Iteration 1241/3560 Training loss: 1.4412 0.0635 sec/batch\n", + "Epoch 7/20 Iteration 1242/3560 Training loss: 1.4412 0.0643 sec/batch\n", + "Epoch 7/20 Iteration 1243/3560 Training loss: 1.4411 0.0673 sec/batch\n", + "Epoch 7/20 Iteration 1244/3560 Training loss: 1.4409 0.0760 sec/batch\n", + "Epoch 7/20 Iteration 1245/3560 Training loss: 1.4406 0.0630 sec/batch\n", + "Epoch 7/20 
Iteration 1246/3560 Training loss: 1.4407 0.0650 sec/batch\n", + "Epoch 8/20 Iteration 1247/3560 Training loss: 1.5820 0.0628 sec/batch\n", + "Epoch 8/20 Iteration 1248/3560 Training loss: 1.5113 0.0623 sec/batch\n", + "Epoch 8/20 Iteration 1249/3560 Training loss: 1.4865 0.0615 sec/batch\n", + "Epoch 8/20 Iteration 1250/3560 Training loss: 1.4764 0.0616 sec/batch\n", + "Epoch 8/20 Iteration 1251/3560 Training loss: 1.4643 0.0600 sec/batch\n", + "Epoch 8/20 Iteration 1252/3560 Training loss: 1.4536 0.0633 sec/batch\n", + "Epoch 8/20 Iteration 1253/3560 Training loss: 1.4515 0.0678 sec/batch\n", + "Epoch 8/20 Iteration 1254/3560 Training loss: 1.4470 0.0622 sec/batch\n", + "Epoch 8/20 Iteration 1255/3560 Training loss: 1.4455 0.0635 sec/batch\n", + "Epoch 8/20 Iteration 1256/3560 Training loss: 1.4438 0.0634 sec/batch\n", + "Epoch 8/20 Iteration 1257/3560 Training loss: 1.4391 0.0600 sec/batch\n", + "Epoch 8/20 Iteration 1258/3560 Training loss: 1.4367 0.0693 sec/batch\n", + "Epoch 8/20 Iteration 1259/3560 Training loss: 1.4349 0.0629 sec/batch\n", + "Epoch 8/20 Iteration 1260/3560 Training loss: 1.4356 0.0601 sec/batch\n", + "Epoch 8/20 Iteration 1261/3560 Training loss: 1.4338 0.0595 sec/batch\n", + "Epoch 8/20 Iteration 1262/3560 Training loss: 1.4311 0.0606 sec/batch\n", + "Epoch 8/20 Iteration 1263/3560 Training loss: 1.4312 0.0632 sec/batch\n", + "Epoch 8/20 Iteration 1264/3560 Training loss: 1.4319 0.0653 sec/batch\n", + "Epoch 8/20 Iteration 1265/3560 Training loss: 1.4315 0.0595 sec/batch\n", + "Epoch 8/20 Iteration 1266/3560 Training loss: 1.4322 0.0672 sec/batch\n", + "Epoch 8/20 Iteration 1267/3560 Training loss: 1.4309 0.0681 sec/batch\n", + "Epoch 8/20 Iteration 1268/3560 Training loss: 1.4307 0.0656 sec/batch\n", + "Epoch 8/20 Iteration 1269/3560 Training loss: 1.4294 0.0659 sec/batch\n", + "Epoch 8/20 Iteration 1270/3560 Training loss: 1.4285 0.0639 sec/batch\n", + "Epoch 8/20 Iteration 1271/3560 Training loss: 1.4280 0.0617 sec/batch\n", + "Epoch 
8/20 Iteration 1272/3560 Training loss: 1.4257 0.0647 sec/batch\n", + "Epoch 8/20 Iteration 1273/3560 Training loss: 1.4241 0.0623 sec/batch\n", + "Epoch 8/20 Iteration 1274/3560 Training loss: 1.4241 0.0593 sec/batch\n", + "Epoch 8/20 Iteration 1275/3560 Training loss: 1.4243 0.0643 sec/batch\n", + "Epoch 8/20 Iteration 1276/3560 Training loss: 1.4241 0.0679 sec/batch\n", + "Epoch 8/20 Iteration 1277/3560 Training loss: 1.4231 0.0659 sec/batch\n", + "Epoch 8/20 Iteration 1278/3560 Training loss: 1.4217 0.0660 sec/batch\n", + "Epoch 8/20 Iteration 1279/3560 Training loss: 1.4215 0.0603 sec/batch\n", + "Epoch 8/20 Iteration 1280/3560 Training loss: 1.4218 0.0621 sec/batch\n", + "Epoch 8/20 Iteration 1281/3560 Training loss: 1.4214 0.0744 sec/batch\n", + "Epoch 8/20 Iteration 1282/3560 Training loss: 1.4210 0.0691 sec/batch\n", + "Epoch 8/20 Iteration 1283/3560 Training loss: 1.4200 0.0702 sec/batch\n", + "Epoch 8/20 Iteration 1284/3560 Training loss: 1.4186 0.0611 sec/batch\n", + "Epoch 8/20 Iteration 1285/3560 Training loss: 1.4170 0.0599 sec/batch\n", + "Epoch 8/20 Iteration 1286/3560 Training loss: 1.4163 0.0653 sec/batch\n", + "Epoch 8/20 Iteration 1287/3560 Training loss: 1.4157 0.0652 sec/batch\n", + "Epoch 8/20 Iteration 1288/3560 Training loss: 1.4160 0.0635 sec/batch\n", + "Epoch 8/20 Iteration 1289/3560 Training loss: 1.4155 0.0717 sec/batch\n", + "Epoch 8/20 Iteration 1290/3560 Training loss: 1.4144 0.0610 sec/batch\n", + "Epoch 8/20 Iteration 1291/3560 Training loss: 1.4144 0.0699 sec/batch\n", + "Epoch 8/20 Iteration 1292/3560 Training loss: 1.4132 0.0642 sec/batch\n", + "Epoch 8/20 Iteration 1293/3560 Training loss: 1.4124 0.0735 sec/batch\n", + "Epoch 8/20 Iteration 1294/3560 Training loss: 1.4120 0.0644 sec/batch\n", + "Epoch 8/20 Iteration 1295/3560 Training loss: 1.4118 0.0610 sec/batch\n", + "Epoch 8/20 Iteration 1296/3560 Training loss: 1.4119 0.0651 sec/batch\n", + "Epoch 8/20 Iteration 1297/3560 Training loss: 1.4113 0.0609 sec/batch\n", + 
"Epoch 8/20 Iteration 1298/3560 Training loss: 1.4118 0.0672 sec/batch\n", + "Epoch 8/20 Iteration 1299/3560 Training loss: 1.4114 0.0652 sec/batch\n", + "Epoch 8/20 Iteration 1300/3560 Training loss: 1.4116 0.0638 sec/batch\n", + "Epoch 8/20 Iteration 1301/3560 Training loss: 1.4112 0.0630 sec/batch\n", + "Epoch 8/20 Iteration 1302/3560 Training loss: 1.4112 0.0613 sec/batch\n", + "Epoch 8/20 Iteration 1303/3560 Training loss: 1.4113 0.0614 sec/batch\n", + "Epoch 8/20 Iteration 1304/3560 Training loss: 1.4109 0.0650 sec/batch\n", + "Epoch 8/20 Iteration 1305/3560 Training loss: 1.4101 0.0632 sec/batch\n", + "Epoch 8/20 Iteration 1306/3560 Training loss: 1.4103 0.0648 sec/batch\n", + "Epoch 8/20 Iteration 1307/3560 Training loss: 1.4101 0.0653 sec/batch\n", + "Epoch 8/20 Iteration 1308/3560 Training loss: 1.4109 0.0623 sec/batch\n", + "Epoch 8/20 Iteration 1309/3560 Training loss: 1.4111 0.0624 sec/batch\n", + "Epoch 8/20 Iteration 1310/3560 Training loss: 1.4111 0.0615 sec/batch\n", + "Epoch 8/20 Iteration 1311/3560 Training loss: 1.4107 0.0720 sec/batch\n", + "Epoch 8/20 Iteration 1312/3560 Training loss: 1.4108 0.0602 sec/batch\n", + "Epoch 8/20 Iteration 1313/3560 Training loss: 1.4108 0.0638 sec/batch\n", + "Epoch 8/20 Iteration 1314/3560 Training loss: 1.4104 0.0613 sec/batch\n", + "Epoch 8/20 Iteration 1315/3560 Training loss: 1.4105 0.0611 sec/batch\n", + "Epoch 8/20 Iteration 1316/3560 Training loss: 1.4102 0.0634 sec/batch\n", + "Epoch 8/20 Iteration 1317/3560 Training loss: 1.4106 0.0620 sec/batch\n", + "Epoch 8/20 Iteration 1318/3560 Training loss: 1.4108 0.0625 sec/batch\n", + "Epoch 8/20 Iteration 1319/3560 Training loss: 1.4111 0.0605 sec/batch\n", + "Epoch 8/20 Iteration 1320/3560 Training loss: 1.4108 0.0648 sec/batch\n", + "Epoch 8/20 Iteration 1321/3560 Training loss: 1.4105 0.0647 sec/batch\n", + "Epoch 8/20 Iteration 1322/3560 Training loss: 1.4105 0.0651 sec/batch\n", + "Epoch 8/20 Iteration 1323/3560 Training loss: 1.4102 0.0619 sec/batch\n", 
+ "Epoch 8/20 Iteration 1324/3560 Training loss: 1.4099 0.0618 sec/batch\n", + "Epoch 8/20 Iteration 1325/3560 Training loss: 1.4090 0.0679 sec/batch\n", + "Epoch 8/20 Iteration 1326/3560 Training loss: 1.4088 0.0634 sec/batch\n", + "Epoch 8/20 Iteration 1327/3560 Training loss: 1.4082 0.0659 sec/batch\n", + "Epoch 8/20 Iteration 1328/3560 Training loss: 1.4080 0.0647 sec/batch\n", + "Epoch 8/20 Iteration 1329/3560 Training loss: 1.4073 0.0649 sec/batch\n", + "Epoch 8/20 Iteration 1330/3560 Training loss: 1.4071 0.0649 sec/batch\n", + "Epoch 8/20 Iteration 1331/3560 Training loss: 1.4067 0.0626 sec/batch\n", + "Epoch 8/20 Iteration 1332/3560 Training loss: 1.4064 0.0610 sec/batch\n", + "Epoch 8/20 Iteration 1333/3560 Training loss: 1.4060 0.0649 sec/batch\n", + "Epoch 8/20 Iteration 1334/3560 Training loss: 1.4055 0.0694 sec/batch\n", + "Epoch 8/20 Iteration 1335/3560 Training loss: 1.4049 0.0684 sec/batch\n", + "Epoch 8/20 Iteration 1336/3560 Training loss: 1.4048 0.0680 sec/batch\n", + "Epoch 8/20 Iteration 1337/3560 Training loss: 1.4046 0.0650 sec/batch\n", + "Epoch 8/20 Iteration 1338/3560 Training loss: 1.4043 0.0628 sec/batch\n", + "Epoch 8/20 Iteration 1339/3560 Training loss: 1.4038 0.0674 sec/batch\n", + "Epoch 8/20 Iteration 1340/3560 Training loss: 1.4033 0.0636 sec/batch\n", + "Epoch 8/20 Iteration 1341/3560 Training loss: 1.4029 0.0634 sec/batch\n", + "Epoch 8/20 Iteration 1342/3560 Training loss: 1.4029 0.0669 sec/batch\n", + "Epoch 8/20 Iteration 1343/3560 Training loss: 1.4027 0.0641 sec/batch\n", + "Epoch 8/20 Iteration 1344/3560 Training loss: 1.4023 0.0657 sec/batch\n", + "Epoch 8/20 Iteration 1345/3560 Training loss: 1.4019 0.0624 sec/batch\n", + "Epoch 8/20 Iteration 1346/3560 Training loss: 1.4014 0.0651 sec/batch\n", + "Epoch 8/20 Iteration 1347/3560 Training loss: 1.4012 0.0630 sec/batch\n", + "Epoch 8/20 Iteration 1348/3560 Training loss: 1.4009 0.0655 sec/batch\n", + "Epoch 8/20 Iteration 1349/3560 Training loss: 1.4007 0.0668 
sec/batch\n", + "Epoch 8/20 Iteration 1350/3560 Training loss: 1.4005 0.0638 sec/batch\n", + "Epoch 8/20 Iteration 1351/3560 Training loss: 1.4002 0.0610 sec/batch\n", + "Epoch 8/20 Iteration 1352/3560 Training loss: 1.4000 0.0658 sec/batch\n", + "Epoch 8/20 Iteration 1353/3560 Training loss: 1.3999 0.0674 sec/batch\n", + "Epoch 8/20 Iteration 1354/3560 Training loss: 1.3997 0.0634 sec/batch\n", + "Epoch 8/20 Iteration 1355/3560 Training loss: 1.3994 0.0607 sec/batch\n", + "Epoch 8/20 Iteration 1356/3560 Training loss: 1.3994 0.0647 sec/batch\n", + "Epoch 8/20 Iteration 1357/3560 Training loss: 1.3990 0.0596 sec/batch\n", + "Epoch 8/20 Iteration 1358/3560 Training loss: 1.3987 0.0620 sec/batch\n", + "Epoch 8/20 Iteration 1359/3560 Training loss: 1.3985 0.0626 sec/batch\n", + "Epoch 8/20 Iteration 1360/3560 Training loss: 1.3982 0.0604 sec/batch\n", + "Epoch 8/20 Iteration 1361/3560 Training loss: 1.3979 0.0604 sec/batch\n", + "Epoch 8/20 Iteration 1362/3560 Training loss: 1.3974 0.0734 sec/batch\n", + "Epoch 8/20 Iteration 1363/3560 Training loss: 1.3972 0.0695 sec/batch\n", + "Epoch 8/20 Iteration 1364/3560 Training loss: 1.3972 0.0614 sec/batch\n", + "Epoch 8/20 Iteration 1365/3560 Training loss: 1.3969 0.0605 sec/batch\n", + "Epoch 8/20 Iteration 1366/3560 Training loss: 1.3967 0.0618 sec/batch\n", + "Epoch 8/20 Iteration 1367/3560 Training loss: 1.3966 0.0685 sec/batch\n", + "Epoch 8/20 Iteration 1368/3560 Training loss: 1.3962 0.0618 sec/batch\n", + "Epoch 8/20 Iteration 1369/3560 Training loss: 1.3957 0.0615 sec/batch\n", + "Epoch 8/20 Iteration 1370/3560 Training loss: 1.3956 0.0595 sec/batch\n", + "Epoch 8/20 Iteration 1371/3560 Training loss: 1.3954 0.0651 sec/batch\n", + "Epoch 8/20 Iteration 1372/3560 Training loss: 1.3949 0.0616 sec/batch\n", + "Epoch 8/20 Iteration 1373/3560 Training loss: 1.3949 0.0616 sec/batch\n", + "Epoch 8/20 Iteration 1374/3560 Training loss: 1.3948 0.0694 sec/batch\n", + "Epoch 8/20 Iteration 1375/3560 Training loss: 1.3945 
0.0618 sec/batch\n", + "[... several hundred similar per-iteration log lines elided: Epochs 8/20 through 11/20, training loss declining steadily from ~1.39 to ~1.30 at roughly 0.06–0.08 sec/batch ...]\n", + "Epoch 11/20 
Iteration 1839/3560 Training loss: 1.2993 0.0606 sec/batch\n", + "Epoch 11/20 Iteration 1840/3560 Training loss: 1.2999 0.0602 sec/batch\n", + "Epoch 11/20 Iteration 1841/3560 Training loss: 1.2999 0.0586 sec/batch\n", + "Epoch 11/20 Iteration 1842/3560 Training loss: 1.3006 0.0644 sec/batch\n", + "Epoch 11/20 Iteration 1843/3560 Training loss: 1.3008 0.0666 sec/batch\n", + "Epoch 11/20 Iteration 1844/3560 Training loss: 1.3009 0.0604 sec/batch\n", + "Epoch 11/20 Iteration 1845/3560 Training loss: 1.3008 0.0627 sec/batch\n", + "Epoch 11/20 Iteration 1846/3560 Training loss: 1.3008 0.0723 sec/batch\n", + "Epoch 11/20 Iteration 1847/3560 Training loss: 1.3009 0.0655 sec/batch\n", + "Epoch 11/20 Iteration 1848/3560 Training loss: 1.3007 0.0607 sec/batch\n", + "Epoch 11/20 Iteration 1849/3560 Training loss: 1.3009 0.0606 sec/batch\n", + "Epoch 11/20 Iteration 1850/3560 Training loss: 1.3006 0.0664 sec/batch\n", + "Epoch 11/20 Iteration 1851/3560 Training loss: 1.3011 0.0836 sec/batch\n", + "Epoch 11/20 Iteration 1852/3560 Training loss: 1.3012 0.0699 sec/batch\n", + "Epoch 11/20 Iteration 1853/3560 Training loss: 1.3017 0.0639 sec/batch\n", + "Epoch 11/20 Iteration 1854/3560 Training loss: 1.3013 0.0748 sec/batch\n", + "Epoch 11/20 Iteration 1855/3560 Training loss: 1.3012 0.0672 sec/batch\n", + "Epoch 11/20 Iteration 1856/3560 Training loss: 1.3014 0.0618 sec/batch\n", + "Epoch 11/20 Iteration 1857/3560 Training loss: 1.3013 0.0611 sec/batch\n", + "Epoch 11/20 Iteration 1858/3560 Training loss: 1.3012 0.0616 sec/batch\n", + "Epoch 11/20 Iteration 1859/3560 Training loss: 1.3005 0.0608 sec/batch\n", + "Epoch 11/20 Iteration 1860/3560 Training loss: 1.3004 0.0655 sec/batch\n", + "Epoch 11/20 Iteration 1861/3560 Training loss: 1.3000 0.0650 sec/batch\n", + "Epoch 11/20 Iteration 1862/3560 Training loss: 1.2999 0.0751 sec/batch\n", + "Epoch 11/20 Iteration 1863/3560 Training loss: 1.2995 0.0659 sec/batch\n", + "Epoch 11/20 Iteration 1864/3560 Training loss: 1.2994 0.0607 
sec/batch\n", + "Epoch 11/20 Iteration 1865/3560 Training loss: 1.2990 0.0671 sec/batch\n", + "Epoch 11/20 Iteration 1866/3560 Training loss: 1.2989 0.0667 sec/batch\n", + "Epoch 11/20 Iteration 1867/3560 Training loss: 1.2986 0.0631 sec/batch\n", + "Epoch 11/20 Iteration 1868/3560 Training loss: 1.2984 0.0651 sec/batch\n", + "Epoch 11/20 Iteration 1869/3560 Training loss: 1.2979 0.0629 sec/batch\n", + "Epoch 11/20 Iteration 1870/3560 Training loss: 1.2979 0.0667 sec/batch\n", + "Epoch 11/20 Iteration 1871/3560 Training loss: 1.2976 0.0641 sec/batch\n", + "Epoch 11/20 Iteration 1872/3560 Training loss: 1.2975 0.0613 sec/batch\n", + "Epoch 11/20 Iteration 1873/3560 Training loss: 1.2971 0.0635 sec/batch\n", + "Epoch 11/20 Iteration 1874/3560 Training loss: 1.2966 0.0637 sec/batch\n", + "Epoch 11/20 Iteration 1875/3560 Training loss: 1.2963 0.0626 sec/batch\n", + "Epoch 11/20 Iteration 1876/3560 Training loss: 1.2964 0.0638 sec/batch\n", + "Epoch 11/20 Iteration 1877/3560 Training loss: 1.2963 0.0663 sec/batch\n", + "Epoch 11/20 Iteration 1878/3560 Training loss: 1.2958 0.0690 sec/batch\n", + "Epoch 11/20 Iteration 1879/3560 Training loss: 1.2954 0.0706 sec/batch\n", + "Epoch 11/20 Iteration 1880/3560 Training loss: 1.2950 0.0664 sec/batch\n", + "Epoch 11/20 Iteration 1881/3560 Training loss: 1.2949 0.0640 sec/batch\n", + "Epoch 11/20 Iteration 1882/3560 Training loss: 1.2947 0.0617 sec/batch\n", + "Epoch 11/20 Iteration 1883/3560 Training loss: 1.2944 0.0644 sec/batch\n", + "Epoch 11/20 Iteration 1884/3560 Training loss: 1.2943 0.0702 sec/batch\n", + "Epoch 11/20 Iteration 1885/3560 Training loss: 1.2940 0.0636 sec/batch\n", + "Epoch 11/20 Iteration 1886/3560 Training loss: 1.2938 0.0639 sec/batch\n", + "Epoch 11/20 Iteration 1887/3560 Training loss: 1.2937 0.0608 sec/batch\n", + "Epoch 11/20 Iteration 1888/3560 Training loss: 1.2936 0.0607 sec/batch\n", + "Epoch 11/20 Iteration 1889/3560 Training loss: 1.2933 0.0633 sec/batch\n", + "Epoch 11/20 Iteration 1890/3560 
Training loss: 1.2933 0.0672 sec/batch\n", + "Epoch 11/20 Iteration 1891/3560 Training loss: 1.2930 0.0626 sec/batch\n", + "Epoch 11/20 Iteration 1892/3560 Training loss: 1.2929 0.0807 sec/batch\n", + "Epoch 11/20 Iteration 1893/3560 Training loss: 1.2928 0.0713 sec/batch\n", + "Epoch 11/20 Iteration 1894/3560 Training loss: 1.2926 0.0635 sec/batch\n", + "Epoch 11/20 Iteration 1895/3560 Training loss: 1.2924 0.0647 sec/batch\n", + "Epoch 11/20 Iteration 1896/3560 Training loss: 1.2920 0.0630 sec/batch\n", + "Epoch 11/20 Iteration 1897/3560 Training loss: 1.2919 0.0606 sec/batch\n", + "Epoch 11/20 Iteration 1898/3560 Training loss: 1.2918 0.0613 sec/batch\n", + "Epoch 11/20 Iteration 1899/3560 Training loss: 1.2916 0.0743 sec/batch\n", + "Epoch 11/20 Iteration 1900/3560 Training loss: 1.2915 0.0612 sec/batch\n", + "Epoch 11/20 Iteration 1901/3560 Training loss: 1.2914 0.0639 sec/batch\n", + "Epoch 11/20 Iteration 1902/3560 Training loss: 1.2911 0.0682 sec/batch\n", + "Epoch 11/20 Iteration 1903/3560 Training loss: 1.2906 0.0665 sec/batch\n", + "Epoch 11/20 Iteration 1904/3560 Training loss: 1.2905 0.0639 sec/batch\n", + "Epoch 11/20 Iteration 1905/3560 Training loss: 1.2904 0.0684 sec/batch\n", + "Epoch 11/20 Iteration 1906/3560 Training loss: 1.2899 0.0628 sec/batch\n", + "Epoch 11/20 Iteration 1907/3560 Training loss: 1.2899 0.0635 sec/batch\n", + "Epoch 11/20 Iteration 1908/3560 Training loss: 1.2897 0.0689 sec/batch\n", + "Epoch 11/20 Iteration 1909/3560 Training loss: 1.2894 0.0693 sec/batch\n", + "Epoch 11/20 Iteration 1910/3560 Training loss: 1.2890 0.0657 sec/batch\n", + "Epoch 11/20 Iteration 1911/3560 Training loss: 1.2886 0.0614 sec/batch\n", + "Epoch 11/20 Iteration 1912/3560 Training loss: 1.2883 0.0663 sec/batch\n", + "Epoch 11/20 Iteration 1913/3560 Training loss: 1.2883 0.0679 sec/batch\n", + "Epoch 11/20 Iteration 1914/3560 Training loss: 1.2882 0.0610 sec/batch\n", + "Epoch 11/20 Iteration 1915/3560 Training loss: 1.2882 0.0611 sec/batch\n", + 
"Epoch 11/20 Iteration 1916/3560 Training loss: 1.2882 0.0645 sec/batch\n", + "Epoch 11/20 Iteration 1917/3560 Training loss: 1.2883 0.0634 sec/batch\n", + "Epoch 11/20 Iteration 1918/3560 Training loss: 1.2884 0.0617 sec/batch\n", + "Epoch 11/20 Iteration 1919/3560 Training loss: 1.2883 0.0783 sec/batch\n", + "Epoch 11/20 Iteration 1920/3560 Training loss: 1.2882 0.0669 sec/batch\n", + "Epoch 11/20 Iteration 1921/3560 Training loss: 1.2885 0.0647 sec/batch\n", + "Epoch 11/20 Iteration 1922/3560 Training loss: 1.2886 0.0703 sec/batch\n", + "Epoch 11/20 Iteration 1923/3560 Training loss: 1.2884 0.0609 sec/batch\n", + "Epoch 11/20 Iteration 1924/3560 Training loss: 1.2885 0.0674 sec/batch\n", + "Epoch 11/20 Iteration 1925/3560 Training loss: 1.2883 0.0728 sec/batch\n", + "Epoch 11/20 Iteration 1926/3560 Training loss: 1.2885 0.0600 sec/batch\n", + "Epoch 11/20 Iteration 1927/3560 Training loss: 1.2885 0.0678 sec/batch\n", + "Epoch 11/20 Iteration 1928/3560 Training loss: 1.2887 0.0676 sec/batch\n", + "Epoch 11/20 Iteration 1929/3560 Training loss: 1.2888 0.0643 sec/batch\n", + "Epoch 11/20 Iteration 1930/3560 Training loss: 1.2887 0.0696 sec/batch\n", + "Epoch 11/20 Iteration 1931/3560 Training loss: 1.2884 0.0753 sec/batch\n", + "Epoch 11/20 Iteration 1932/3560 Training loss: 1.2882 0.0624 sec/batch\n", + "Epoch 11/20 Iteration 1933/3560 Training loss: 1.2884 0.0642 sec/batch\n", + "Epoch 11/20 Iteration 1934/3560 Training loss: 1.2883 0.0708 sec/batch\n", + "Epoch 11/20 Iteration 1935/3560 Training loss: 1.2882 0.0691 sec/batch\n", + "Epoch 11/20 Iteration 1936/3560 Training loss: 1.2882 0.0636 sec/batch\n", + "Epoch 11/20 Iteration 1937/3560 Training loss: 1.2882 0.0669 sec/batch\n", + "Epoch 11/20 Iteration 1938/3560 Training loss: 1.2880 0.0710 sec/batch\n", + "Epoch 11/20 Iteration 1939/3560 Training loss: 1.2878 0.0653 sec/batch\n", + "Epoch 11/20 Iteration 1940/3560 Training loss: 1.2878 0.0645 sec/batch\n", + "Epoch 11/20 Iteration 1941/3560 Training loss: 
1.2879 0.0840 sec/batch\n", + "Epoch 11/20 Iteration 1942/3560 Training loss: 1.2878 0.0593 sec/batch\n", + "Epoch 11/20 Iteration 1943/3560 Training loss: 1.2877 0.0703 sec/batch\n", + "Epoch 11/20 Iteration 1944/3560 Training loss: 1.2876 0.0655 sec/batch\n", + "Epoch 11/20 Iteration 1945/3560 Training loss: 1.2876 0.0776 sec/batch\n", + "Epoch 11/20 Iteration 1946/3560 Training loss: 1.2875 0.0693 sec/batch\n", + "Epoch 11/20 Iteration 1947/3560 Training loss: 1.2877 0.0642 sec/batch\n", + "Epoch 11/20 Iteration 1948/3560 Training loss: 1.2880 0.0738 sec/batch\n", + "Epoch 11/20 Iteration 1949/3560 Training loss: 1.2880 0.0623 sec/batch\n", + "Epoch 11/20 Iteration 1950/3560 Training loss: 1.2880 0.0618 sec/batch\n", + "Epoch 11/20 Iteration 1951/3560 Training loss: 1.2879 0.0825 sec/batch\n", + "Epoch 11/20 Iteration 1952/3560 Training loss: 1.2877 0.0685 sec/batch\n", + "Epoch 11/20 Iteration 1953/3560 Training loss: 1.2878 0.0658 sec/batch\n", + "Epoch 11/20 Iteration 1954/3560 Training loss: 1.2877 0.0608 sec/batch\n", + "Epoch 11/20 Iteration 1955/3560 Training loss: 1.2877 0.0688 sec/batch\n", + "Epoch 11/20 Iteration 1956/3560 Training loss: 1.2875 0.0616 sec/batch\n", + "Epoch 11/20 Iteration 1957/3560 Training loss: 1.2874 0.0651 sec/batch\n", + "Epoch 11/20 Iteration 1958/3560 Training loss: 1.2876 0.0713 sec/batch\n", + "Epoch 12/20 Iteration 1959/3560 Training loss: 1.4363 0.0619 sec/batch\n", + "Epoch 12/20 Iteration 1960/3560 Training loss: 1.3643 0.0664 sec/batch\n", + "Epoch 12/20 Iteration 1961/3560 Training loss: 1.3372 0.0716 sec/batch\n", + "Epoch 12/20 Iteration 1962/3560 Training loss: 1.3306 0.0694 sec/batch\n", + "Epoch 12/20 Iteration 1963/3560 Training loss: 1.3175 0.0648 sec/batch\n", + "Epoch 12/20 Iteration 1964/3560 Training loss: 1.3047 0.0721 sec/batch\n", + "Epoch 12/20 Iteration 1965/3560 Training loss: 1.3034 0.0668 sec/batch\n", + "Epoch 12/20 Iteration 1966/3560 Training loss: 1.3002 0.0605 sec/batch\n", + "Epoch 12/20 
Iteration 1967/3560 Training loss: 1.2992 0.0627 sec/batch\n", + "Epoch 12/20 Iteration 1968/3560 Training loss: 1.2978 0.0626 sec/batch\n", + "Epoch 12/20 Iteration 1969/3560 Training loss: 1.2935 0.0733 sec/batch\n", + "Epoch 12/20 Iteration 1970/3560 Training loss: 1.2924 0.0663 sec/batch\n", + "Epoch 12/20 Iteration 1971/3560 Training loss: 1.2918 0.0837 sec/batch\n", + "Epoch 12/20 Iteration 1972/3560 Training loss: 1.2922 0.0626 sec/batch\n", + "Epoch 12/20 Iteration 1973/3560 Training loss: 1.2900 0.0829 sec/batch\n", + "Epoch 12/20 Iteration 1974/3560 Training loss: 1.2876 0.0672 sec/batch\n", + "Epoch 12/20 Iteration 1975/3560 Training loss: 1.2870 0.0603 sec/batch\n", + "Epoch 12/20 Iteration 1976/3560 Training loss: 1.2871 0.0609 sec/batch\n", + "Epoch 12/20 Iteration 1977/3560 Training loss: 1.2870 0.0633 sec/batch\n", + "Epoch 12/20 Iteration 1978/3560 Training loss: 1.2885 0.0663 sec/batch\n", + "Epoch 12/20 Iteration 1979/3560 Training loss: 1.2872 0.0772 sec/batch\n", + "Epoch 12/20 Iteration 1980/3560 Training loss: 1.2869 0.0618 sec/batch\n", + "Epoch 12/20 Iteration 1981/3560 Training loss: 1.2858 0.0660 sec/batch\n", + "Epoch 12/20 Iteration 1982/3560 Training loss: 1.2856 0.0696 sec/batch\n", + "Epoch 12/20 Iteration 1983/3560 Training loss: 1.2852 0.0641 sec/batch\n", + "Epoch 12/20 Iteration 1984/3560 Training loss: 1.2830 0.0741 sec/batch\n", + "Epoch 12/20 Iteration 1985/3560 Training loss: 1.2818 0.0608 sec/batch\n", + "Epoch 12/20 Iteration 1986/3560 Training loss: 1.2822 0.0813 sec/batch\n", + "Epoch 12/20 Iteration 1987/3560 Training loss: 1.2820 0.0631 sec/batch\n", + "Epoch 12/20 Iteration 1988/3560 Training loss: 1.2822 0.0628 sec/batch\n", + "Epoch 12/20 Iteration 1989/3560 Training loss: 1.2816 0.0604 sec/batch\n", + "Epoch 12/20 Iteration 1990/3560 Training loss: 1.2801 0.0611 sec/batch\n", + "Epoch 12/20 Iteration 1991/3560 Training loss: 1.2801 0.0675 sec/batch\n", + "Epoch 12/20 Iteration 1992/3560 Training loss: 1.2799 0.0610 
sec/batch\n", + "Epoch 12/20 Iteration 1993/3560 Training loss: 1.2792 0.0667 sec/batch\n", + "Epoch 12/20 Iteration 1994/3560 Training loss: 1.2788 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 1995/3560 Training loss: 1.2780 0.0687 sec/batch\n", + "Epoch 12/20 Iteration 1996/3560 Training loss: 1.2767 0.0700 sec/batch\n", + "Epoch 12/20 Iteration 1997/3560 Training loss: 1.2750 0.0625 sec/batch\n", + "Epoch 12/20 Iteration 1998/3560 Training loss: 1.2748 0.0644 sec/batch\n", + "Epoch 12/20 Iteration 1999/3560 Training loss: 1.2742 0.0658 sec/batch\n", + "Epoch 12/20 Iteration 2000/3560 Training loss: 1.2752 0.0653 sec/batch\n", + "Epoch 12/20 Iteration 2001/3560 Training loss: 1.2750 0.0585 sec/batch\n", + "Epoch 12/20 Iteration 2002/3560 Training loss: 1.2742 0.0623 sec/batch\n", + "Epoch 12/20 Iteration 2003/3560 Training loss: 1.2743 0.0652 sec/batch\n", + "Epoch 12/20 Iteration 2004/3560 Training loss: 1.2735 0.0740 sec/batch\n", + "Epoch 12/20 Iteration 2005/3560 Training loss: 1.2732 0.0754 sec/batch\n", + "Epoch 12/20 Iteration 2006/3560 Training loss: 1.2728 0.0753 sec/batch\n", + "Epoch 12/20 Iteration 2007/3560 Training loss: 1.2726 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 2008/3560 Training loss: 1.2730 0.0741 sec/batch\n", + "Epoch 12/20 Iteration 2009/3560 Training loss: 1.2722 0.0720 sec/batch\n", + "Epoch 12/20 Iteration 2010/3560 Training loss: 1.2728 0.0651 sec/batch\n", + "Epoch 12/20 Iteration 2011/3560 Training loss: 1.2726 0.0881 sec/batch\n", + "Epoch 12/20 Iteration 2012/3560 Training loss: 1.2728 0.0736 sec/batch\n", + "Epoch 12/20 Iteration 2013/3560 Training loss: 1.2724 0.0756 sec/batch\n", + "Epoch 12/20 Iteration 2014/3560 Training loss: 1.2724 0.0660 sec/batch\n", + "Epoch 12/20 Iteration 2015/3560 Training loss: 1.2726 0.0706 sec/batch\n", + "Epoch 12/20 Iteration 2016/3560 Training loss: 1.2722 0.0623 sec/batch\n", + "Epoch 12/20 Iteration 2017/3560 Training loss: 1.2717 0.0663 sec/batch\n", + "Epoch 12/20 Iteration 2018/3560 
Training loss: 1.2720 0.0717 sec/batch\n", + "Epoch 12/20 Iteration 2019/3560 Training loss: 1.2719 0.0641 sec/batch\n", + "Epoch 12/20 Iteration 2020/3560 Training loss: 1.2725 0.0620 sec/batch\n", + "Epoch 12/20 Iteration 2021/3560 Training loss: 1.2726 0.0650 sec/batch\n", + "Epoch 12/20 Iteration 2022/3560 Training loss: 1.2727 0.0621 sec/batch\n", + "Epoch 12/20 Iteration 2023/3560 Training loss: 1.2726 0.0671 sec/batch\n", + "Epoch 12/20 Iteration 2024/3560 Training loss: 1.2727 0.0728 sec/batch\n", + "Epoch 12/20 Iteration 2025/3560 Training loss: 1.2727 0.0619 sec/batch\n", + "Epoch 12/20 Iteration 2026/3560 Training loss: 1.2725 0.0607 sec/batch\n", + "Epoch 12/20 Iteration 2027/3560 Training loss: 1.2726 0.0610 sec/batch\n", + "Epoch 12/20 Iteration 2028/3560 Training loss: 1.2725 0.0665 sec/batch\n", + "Epoch 12/20 Iteration 2029/3560 Training loss: 1.2729 0.0639 sec/batch\n", + "Epoch 12/20 Iteration 2030/3560 Training loss: 1.2730 0.0594 sec/batch\n", + "Epoch 12/20 Iteration 2031/3560 Training loss: 1.2734 0.0635 sec/batch\n", + "Epoch 12/20 Iteration 2032/3560 Training loss: 1.2730 0.0619 sec/batch\n", + "Epoch 12/20 Iteration 2033/3560 Training loss: 1.2727 0.0738 sec/batch\n", + "Epoch 12/20 Iteration 2034/3560 Training loss: 1.2729 0.0638 sec/batch\n", + "Epoch 12/20 Iteration 2035/3560 Training loss: 1.2727 0.0686 sec/batch\n", + "Epoch 12/20 Iteration 2036/3560 Training loss: 1.2725 0.0699 sec/batch\n", + "Epoch 12/20 Iteration 2037/3560 Training loss: 1.2719 0.0630 sec/batch\n", + "Epoch 12/20 Iteration 2038/3560 Training loss: 1.2717 0.0674 sec/batch\n", + "Epoch 12/20 Iteration 2039/3560 Training loss: 1.2713 0.0691 sec/batch\n", + "Epoch 12/20 Iteration 2040/3560 Training loss: 1.2712 0.0672 sec/batch\n", + "Epoch 12/20 Iteration 2041/3560 Training loss: 1.2708 0.0631 sec/batch\n", + "Epoch 12/20 Iteration 2042/3560 Training loss: 1.2705 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 2043/3560 Training loss: 1.2703 0.0672 sec/batch\n", + 
"Epoch 12/20 Iteration 2044/3560 Training loss: 1.2702 0.0708 sec/batch\n", + "Epoch 12/20 Iteration 2045/3560 Training loss: 1.2700 0.0660 sec/batch\n", + "Epoch 12/20 Iteration 2046/3560 Training loss: 1.2697 0.0728 sec/batch\n", + "Epoch 12/20 Iteration 2047/3560 Training loss: 1.2694 0.0728 sec/batch\n", + "Epoch 12/20 Iteration 2048/3560 Training loss: 1.2694 0.0709 sec/batch\n", + "Epoch 12/20 Iteration 2049/3560 Training loss: 1.2691 0.0719 sec/batch\n", + "Epoch 12/20 Iteration 2050/3560 Training loss: 1.2690 0.0656 sec/batch\n", + "Epoch 12/20 Iteration 2051/3560 Training loss: 1.2687 0.0619 sec/batch\n", + "Epoch 12/20 Iteration 2052/3560 Training loss: 1.2683 0.0694 sec/batch\n", + "Epoch 12/20 Iteration 2053/3560 Training loss: 1.2682 0.0671 sec/batch\n", + "Epoch 12/20 Iteration 2054/3560 Training loss: 1.2683 0.0652 sec/batch\n", + "Epoch 12/20 Iteration 2055/3560 Training loss: 1.2682 0.0780 sec/batch\n", + "Epoch 12/20 Iteration 2056/3560 Training loss: 1.2678 0.0768 sec/batch\n", + "Epoch 12/20 Iteration 2057/3560 Training loss: 1.2674 0.0595 sec/batch\n", + "Epoch 12/20 Iteration 2058/3560 Training loss: 1.2671 0.0660 sec/batch\n", + "Epoch 12/20 Iteration 2059/3560 Training loss: 1.2670 0.0648 sec/batch\n", + "Epoch 12/20 Iteration 2060/3560 Training loss: 1.2667 0.0690 sec/batch\n", + "Epoch 12/20 Iteration 2061/3560 Training loss: 1.2666 0.0652 sec/batch\n", + "Epoch 12/20 Iteration 2062/3560 Training loss: 1.2665 0.0648 sec/batch\n", + "Epoch 12/20 Iteration 2063/3560 Training loss: 1.2662 0.0728 sec/batch\n", + "Epoch 12/20 Iteration 2064/3560 Training loss: 1.2661 0.0693 sec/batch\n", + "Epoch 12/20 Iteration 2065/3560 Training loss: 1.2659 0.0657 sec/batch\n", + "Epoch 12/20 Iteration 2066/3560 Training loss: 1.2658 0.0609 sec/batch\n", + "Epoch 12/20 Iteration 2067/3560 Training loss: 1.2656 0.0766 sec/batch\n", + "Epoch 12/20 Iteration 2068/3560 Training loss: 1.2656 0.0623 sec/batch\n", + "Epoch 12/20 Iteration 2069/3560 Training loss: 
1.2654 0.0679 sec/batch\n", + "Epoch 12/20 Iteration 2070/3560 Training loss: 1.2652 0.0620 sec/batch\n", + "Epoch 12/20 Iteration 2071/3560 Training loss: 1.2652 0.0693 sec/batch\n", + "Epoch 12/20 Iteration 2072/3560 Training loss: 1.2650 0.0606 sec/batch\n", + "Epoch 12/20 Iteration 2073/3560 Training loss: 1.2648 0.0727 sec/batch\n", + "Epoch 12/20 Iteration 2074/3560 Training loss: 1.2644 0.0756 sec/batch\n", + "Epoch 12/20 Iteration 2075/3560 Training loss: 1.2643 0.0638 sec/batch\n", + "Epoch 12/20 Iteration 2076/3560 Training loss: 1.2643 0.0729 sec/batch\n", + "Epoch 12/20 Iteration 2077/3560 Training loss: 1.2641 0.0671 sec/batch\n", + "Epoch 12/20 Iteration 2078/3560 Training loss: 1.2640 0.0648 sec/batch\n", + "Epoch 12/20 Iteration 2079/3560 Training loss: 1.2639 0.0651 sec/batch\n", + "Epoch 12/20 Iteration 2080/3560 Training loss: 1.2635 0.0688 sec/batch\n", + "Epoch 12/20 Iteration 2081/3560 Training loss: 1.2631 0.0648 sec/batch\n", + "Epoch 12/20 Iteration 2082/3560 Training loss: 1.2630 0.0631 sec/batch\n", + "Epoch 12/20 Iteration 2083/3560 Training loss: 1.2629 0.0668 sec/batch\n", + "Epoch 12/20 Iteration 2084/3560 Training loss: 1.2625 0.0621 sec/batch\n", + "Epoch 12/20 Iteration 2085/3560 Training loss: 1.2625 0.0642 sec/batch\n", + "Epoch 12/20 Iteration 2086/3560 Training loss: 1.2623 0.0626 sec/batch\n", + "Epoch 12/20 Iteration 2087/3560 Training loss: 1.2621 0.0629 sec/batch\n", + "Epoch 12/20 Iteration 2088/3560 Training loss: 1.2616 0.0626 sec/batch\n", + "Epoch 12/20 Iteration 2089/3560 Training loss: 1.2610 0.0623 sec/batch\n", + "Epoch 12/20 Iteration 2090/3560 Training loss: 1.2608 0.0689 sec/batch\n", + "Epoch 12/20 Iteration 2091/3560 Training loss: 1.2608 0.0639 sec/batch\n", + "Epoch 12/20 Iteration 2092/3560 Training loss: 1.2608 0.0628 sec/batch\n", + "Epoch 12/20 Iteration 2093/3560 Training loss: 1.2607 0.0604 sec/batch\n", + "Epoch 12/20 Iteration 2094/3560 Training loss: 1.2607 0.0592 sec/batch\n", + "Epoch 12/20 
Iteration 2095/3560 Training loss: 1.2608 0.0591 sec/batch\n", + "Epoch 12/20 Iteration 2096/3560 Training loss: 1.2609 0.0710 sec/batch\n", + "Epoch 12/20 Iteration 2097/3560 Training loss: 1.2609 0.0745 sec/batch\n", + "Epoch 12/20 Iteration 2098/3560 Training loss: 1.2609 0.0635 sec/batch\n", + "Epoch 12/20 Iteration 2099/3560 Training loss: 1.2612 0.0624 sec/batch\n", + "Epoch 12/20 Iteration 2100/3560 Training loss: 1.2612 0.0662 sec/batch\n", + "Epoch 12/20 Iteration 2101/3560 Training loss: 1.2611 0.0644 sec/batch\n", + "Epoch 12/20 Iteration 2102/3560 Training loss: 1.2613 0.0697 sec/batch\n", + "Epoch 12/20 Iteration 2103/3560 Training loss: 1.2611 0.0665 sec/batch\n", + "Epoch 12/20 Iteration 2104/3560 Training loss: 1.2613 0.0633 sec/batch\n", + "Epoch 12/20 Iteration 2105/3560 Training loss: 1.2613 0.0626 sec/batch\n", + "Epoch 12/20 Iteration 2106/3560 Training loss: 1.2615 0.0671 sec/batch\n", + "Epoch 12/20 Iteration 2107/3560 Training loss: 1.2616 0.0770 sec/batch\n", + "Epoch 12/20 Iteration 2108/3560 Training loss: 1.2615 0.0610 sec/batch\n", + "Epoch 12/20 Iteration 2109/3560 Training loss: 1.2612 0.0674 sec/batch\n", + "Epoch 12/20 Iteration 2110/3560 Training loss: 1.2610 0.0636 sec/batch\n", + "Epoch 12/20 Iteration 2111/3560 Training loss: 1.2611 0.0896 sec/batch\n", + "Epoch 12/20 Iteration 2112/3560 Training loss: 1.2610 0.0681 sec/batch\n", + "Epoch 12/20 Iteration 2113/3560 Training loss: 1.2610 0.0705 sec/batch\n", + "Epoch 12/20 Iteration 2114/3560 Training loss: 1.2610 0.0632 sec/batch\n", + "Epoch 12/20 Iteration 2115/3560 Training loss: 1.2609 0.0602 sec/batch\n", + "Epoch 12/20 Iteration 2116/3560 Training loss: 1.2608 0.0607 sec/batch\n", + "Epoch 12/20 Iteration 2117/3560 Training loss: 1.2606 0.0672 sec/batch\n", + "Epoch 12/20 Iteration 2118/3560 Training loss: 1.2607 0.0632 sec/batch\n", + "Epoch 12/20 Iteration 2119/3560 Training loss: 1.2607 0.0759 sec/batch\n", + "Epoch 12/20 Iteration 2120/3560 Training loss: 1.2607 0.0743 
sec/batch\n", + "Epoch 12/20 Iteration 2121/3560 Training loss: 1.2606 0.0715 sec/batch\n", + "Epoch 12/20 Iteration 2122/3560 Training loss: 1.2605 0.0627 sec/batch\n", + "Epoch 12/20 Iteration 2123/3560 Training loss: 1.2606 0.0625 sec/batch\n", + "Epoch 12/20 Iteration 2124/3560 Training loss: 1.2605 0.0601 sec/batch\n", + "Epoch 12/20 Iteration 2125/3560 Training loss: 1.2607 0.0614 sec/batch\n", + "Epoch 12/20 Iteration 2126/3560 Training loss: 1.2610 0.0619 sec/batch\n", + "Epoch 12/20 Iteration 2127/3560 Training loss: 1.2610 0.0659 sec/batch\n", + "Epoch 12/20 Iteration 2128/3560 Training loss: 1.2611 0.0715 sec/batch\n", + "Epoch 12/20 Iteration 2129/3560 Training loss: 1.2610 0.0602 sec/batch\n", + "Epoch 12/20 Iteration 2130/3560 Training loss: 1.2608 0.0669 sec/batch\n", + "Epoch 12/20 Iteration 2131/3560 Training loss: 1.2610 0.0607 sec/batch\n", + "Epoch 12/20 Iteration 2132/3560 Training loss: 1.2610 0.0667 sec/batch\n", + "Epoch 12/20 Iteration 2133/3560 Training loss: 1.2610 0.0685 sec/batch\n", + "Epoch 12/20 Iteration 2134/3560 Training loss: 1.2609 0.0665 sec/batch\n", + "Epoch 12/20 Iteration 2135/3560 Training loss: 1.2608 0.0631 sec/batch\n", + "Epoch 12/20 Iteration 2136/3560 Training loss: 1.2610 0.0678 sec/batch\n", + "Epoch 13/20 Iteration 2137/3560 Training loss: 1.4181 0.0640 sec/batch\n", + "Epoch 13/20 Iteration 2138/3560 Training loss: 1.3445 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2139/3560 Training loss: 1.3184 0.0684 sec/batch\n", + "Epoch 13/20 Iteration 2140/3560 Training loss: 1.3085 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2141/3560 Training loss: 1.2966 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2142/3560 Training loss: 1.2834 0.0620 sec/batch\n", + "Epoch 13/20 Iteration 2143/3560 Training loss: 1.2823 0.0641 sec/batch\n", + "Epoch 13/20 Iteration 2144/3560 Training loss: 1.2789 0.0678 sec/batch\n", + "Epoch 13/20 Iteration 2145/3560 Training loss: 1.2769 0.0725 sec/batch\n", + "Epoch 13/20 Iteration 2146/3560 
Training loss: 1.2743 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2147/3560 Training loss: 1.2697 0.0669 sec/batch\n", + "Epoch 13/20 Iteration 2148/3560 Training loss: 1.2683 0.0601 sec/batch\n", + "Epoch 13/20 Iteration 2149/3560 Training loss: 1.2676 0.0695 sec/batch\n", + "Epoch 13/20 Iteration 2150/3560 Training loss: 1.2673 0.0690 sec/batch\n", + "Epoch 13/20 Iteration 2151/3560 Training loss: 1.2657 0.0642 sec/batch\n", + "Epoch 13/20 Iteration 2152/3560 Training loss: 1.2636 0.0720 sec/batch\n", + "Epoch 13/20 Iteration 2153/3560 Training loss: 1.2634 0.0607 sec/batch\n", + "Epoch 13/20 Iteration 2154/3560 Training loss: 1.2639 0.0711 sec/batch\n", + "Epoch 13/20 Iteration 2155/3560 Training loss: 1.2634 0.0658 sec/batch\n", + "Epoch 13/20 Iteration 2156/3560 Training loss: 1.2639 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2157/3560 Training loss: 1.2632 0.0680 sec/batch\n", + "Epoch 13/20 Iteration 2158/3560 Training loss: 1.2635 0.0637 sec/batch\n", + "Epoch 13/20 Iteration 2159/3560 Training loss: 1.2628 0.0626 sec/batch\n", + "Epoch 13/20 Iteration 2160/3560 Training loss: 1.2624 0.0648 sec/batch\n", + "Epoch 13/20 Iteration 2161/3560 Training loss: 1.2623 0.0640 sec/batch\n", + "Epoch 13/20 Iteration 2162/3560 Training loss: 1.2603 0.0646 sec/batch\n", + "Epoch 13/20 Iteration 2163/3560 Training loss: 1.2588 0.0669 sec/batch\n", + "Epoch 13/20 Iteration 2164/3560 Training loss: 1.2593 0.0686 sec/batch\n", + "Epoch 13/20 Iteration 2165/3560 Training loss: 1.2592 0.0728 sec/batch\n", + "Epoch 13/20 Iteration 2166/3560 Training loss: 1.2594 0.0630 sec/batch\n", + "Epoch 13/20 Iteration 2167/3560 Training loss: 1.2585 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2168/3560 Training loss: 1.2571 0.0645 sec/batch\n", + "Epoch 13/20 Iteration 2169/3560 Training loss: 1.2570 0.0688 sec/batch\n", + "Epoch 13/20 Iteration 2170/3560 Training loss: 1.2569 0.0680 sec/batch\n", + "Epoch 13/20 Iteration 2171/3560 Training loss: 1.2563 0.0634 sec/batch\n", + 
"Epoch 13/20 Iteration 2172/3560 Training loss: 1.2559 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2173/3560 Training loss: 1.2551 0.0602 sec/batch\n", + "Epoch 13/20 Iteration 2174/3560 Training loss: 1.2538 0.0689 sec/batch\n", + "Epoch 13/20 Iteration 2175/3560 Training loss: 1.2523 0.0686 sec/batch\n", + "Epoch 13/20 Iteration 2176/3560 Training loss: 1.2519 0.0609 sec/batch\n", + "Epoch 13/20 Iteration 2177/3560 Training loss: 1.2511 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2178/3560 Training loss: 1.2519 0.0668 sec/batch\n", + "Epoch 13/20 Iteration 2179/3560 Training loss: 1.2518 0.0662 sec/batch\n", + "Epoch 13/20 Iteration 2180/3560 Training loss: 1.2511 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2181/3560 Training loss: 1.2513 0.0629 sec/batch\n", + "Epoch 13/20 Iteration 2182/3560 Training loss: 1.2508 0.0690 sec/batch\n", + "Epoch 13/20 Iteration 2183/3560 Training loss: 1.2504 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2184/3560 Training loss: 1.2498 0.0604 sec/batch\n", + "Epoch 13/20 Iteration 2185/3560 Training loss: 1.2496 0.0601 sec/batch\n", + "Epoch 13/20 Iteration 2186/3560 Training loss: 1.2498 0.0632 sec/batch\n", + "Epoch 13/20 Iteration 2187/3560 Training loss: 1.2492 0.0715 sec/batch\n", + "Epoch 13/20 Iteration 2188/3560 Training loss: 1.2498 0.0621 sec/batch\n", + "Epoch 13/20 Iteration 2189/3560 Training loss: 1.2497 0.0602 sec/batch\n", + "Epoch 13/20 Iteration 2190/3560 Training loss: 1.2497 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2191/3560 Training loss: 1.2492 0.0622 sec/batch\n", + "Epoch 13/20 Iteration 2192/3560 Training loss: 1.2491 0.0694 sec/batch\n", + "Epoch 13/20 Iteration 2193/3560 Training loss: 1.2494 0.0654 sec/batch\n", + "Epoch 13/20 Iteration 2194/3560 Training loss: 1.2492 0.0681 sec/batch\n", + "Epoch 13/20 Iteration 2195/3560 Training loss: 1.2487 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2196/3560 Training loss: 1.2491 0.0671 sec/batch\n", + "Epoch 13/20 Iteration 2197/3560 Training loss: 
1.2492 0.0610 sec/batch\n", + "Epoch 13/20 Iteration 2198/3560 Training loss: 1.2498 0.0618 sec/batch\n", + "Epoch 13/20 Iteration 2199/3560 Training loss: 1.2499 0.0647 sec/batch\n", + "Epoch 13/20 Iteration 2200/3560 Training loss: 1.2499 0.0630 sec/batch\n", + "Epoch 13/20 Iteration 2201/3560 Training loss: 1.2498 0.0625 sec/batch\n", + "Epoch 13/20 Iteration 2202/3560 Training loss: 1.2500 0.0631 sec/batch\n", + "Epoch 13/20 Iteration 2203/3560 Training loss: 1.2502 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2204/3560 Training loss: 1.2501 0.0637 sec/batch\n", + "Epoch 13/20 Iteration 2205/3560 Training loss: 1.2501 0.0635 sec/batch\n", + "Epoch 13/20 Iteration 2206/3560 Training loss: 1.2499 0.0671 sec/batch\n", + "Epoch 13/20 Iteration 2207/3560 Training loss: 1.2504 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2208/3560 Training loss: 1.2507 0.0626 sec/batch\n", + "Epoch 13/20 Iteration 2209/3560 Training loss: 1.2512 0.0616 sec/batch\n", + "Epoch 13/20 Iteration 2210/3560 Training loss: 1.2507 0.0593 sec/batch\n", + "Epoch 13/20 Iteration 2211/3560 Training loss: 1.2506 0.0607 sec/batch\n", + "Epoch 13/20 Iteration 2212/3560 Training loss: 1.2508 0.0648 sec/batch\n", + "Epoch 13/20 Iteration 2213/3560 Training loss: 1.2507 0.0684 sec/batch\n", + "Epoch 13/20 Iteration 2214/3560 Training loss: 1.2506 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2215/3560 Training loss: 1.2500 0.0612 sec/batch\n", + "Epoch 13/20 Iteration 2216/3560 Training loss: 1.2499 0.0634 sec/batch\n", + "Epoch 13/20 Iteration 2217/3560 Training loss: 1.2495 0.0660 sec/batch\n", + "Epoch 13/20 Iteration 2218/3560 Training loss: 1.2493 0.0677 sec/batch\n", + "Epoch 13/20 Iteration 2219/3560 Training loss: 1.2489 0.0626 sec/batch\n", + "Epoch 13/20 Iteration 2220/3560 Training loss: 1.2489 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2221/3560 Training loss: 1.2485 0.0614 sec/batch\n", + "Epoch 13/20 Iteration 2222/3560 Training loss: 1.2485 0.0634 sec/batch\n", + "Epoch 13/20 
Iteration 2223/3560 Training loss: 1.2484 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2224/3560 Training loss: 1.2482 0.0700 sec/batch\n", + "Epoch 13/20 Iteration 2225/3560 Training loss: 1.2478 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2226/3560 Training loss: 1.2479 0.0600 sec/batch\n", + "Epoch 13/20 Iteration 2227/3560 Training loss: 1.2476 0.0632 sec/batch\n", + "Epoch 13/20 Iteration 2228/3560 Training loss: 1.2475 0.0629 sec/batch\n", + "Epoch 13/20 Iteration 2229/3560 Training loss: 1.2471 0.0632 sec/batch\n", + "Epoch 13/20 Iteration 2230/3560 Training loss: 1.2467 0.0637 sec/batch\n", + "Epoch 13/20 Iteration 2231/3560 Training loss: 1.2465 0.0626 sec/batch\n", + "Epoch 13/20 Iteration 2232/3560 Training loss: 1.2466 0.0620 sec/batch\n", + "Epoch 13/20 Iteration 2233/3560 Training loss: 1.2466 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2234/3560 Training loss: 1.2462 0.0637 sec/batch\n", + "Epoch 13/20 Iteration 2235/3560 Training loss: 1.2459 0.0630 sec/batch\n", + "Epoch 13/20 Iteration 2236/3560 Training loss: 1.2455 0.0620 sec/batch\n", + "Epoch 13/20 Iteration 2237/3560 Training loss: 1.2455 0.0633 sec/batch\n", + "Epoch 13/20 Iteration 2238/3560 Training loss: 1.2453 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2239/3560 Training loss: 1.2452 0.0634 sec/batch\n", + "Epoch 13/20 Iteration 2240/3560 Training loss: 1.2451 0.0665 sec/batch\n", + "Epoch 13/20 Iteration 2241/3560 Training loss: 1.2448 0.0702 sec/batch\n", + "Epoch 13/20 Iteration 2242/3560 Training loss: 1.2446 0.0662 sec/batch\n", + "Epoch 13/20 Iteration 2243/3560 Training loss: 1.2445 0.0596 sec/batch\n", + "Epoch 13/20 Iteration 2244/3560 Training loss: 1.2445 0.0656 sec/batch\n", + "Epoch 13/20 Iteration 2245/3560 Training loss: 1.2442 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2246/3560 Training loss: 1.2442 0.0665 sec/batch\n", + "Epoch 13/20 Iteration 2247/3560 Training loss: 1.2440 0.0705 sec/batch\n", + "Epoch 13/20 Iteration 2248/3560 Training loss: 1.2439 0.0703 
sec/batch\n", + "Epoch 13/20 Iteration 2249/3560 Training loss: 1.2438 0.0623 sec/batch\n", + "Epoch 13/20 Iteration 2250/3560 Training loss: 1.2437 0.0691 sec/batch\n", + "Epoch 13/20 Iteration 2251/3560 Training loss: 1.2434 0.0650 sec/batch\n", + "Epoch 13/20 Iteration 2252/3560 Training loss: 1.2431 0.0631 sec/batch\n", + "Epoch 13/20 Iteration 2253/3560 Training loss: 1.2430 0.0654 sec/batch\n", + "Epoch 13/20 Iteration 2254/3560 Training loss: 1.2430 0.0648 sec/batch\n", + "Epoch 13/20 Iteration 2255/3560 Training loss: 1.2429 0.0623 sec/batch\n", + "Epoch 13/20 Iteration 2256/3560 Training loss: 1.2428 0.0651 sec/batch\n", + "Epoch 13/20 Iteration 2257/3560 Training loss: 1.2427 0.0611 sec/batch\n", + "Epoch 13/20 Iteration 2258/3560 Training loss: 1.2423 0.0665 sec/batch\n", + "Epoch 13/20 Iteration 2259/3560 Training loss: 1.2420 0.0635 sec/batch\n", + "Epoch 13/20 Iteration 2260/3560 Training loss: 1.2419 0.0658 sec/batch\n", + "Epoch 13/20 Iteration 2261/3560 Training loss: 1.2418 0.0784 sec/batch\n", + "Epoch 13/20 Iteration 2262/3560 Training loss: 1.2414 0.0629 sec/batch\n", + "Epoch 13/20 Iteration 2263/3560 Training loss: 1.2413 0.0632 sec/batch\n", + "Epoch 13/20 Iteration 2264/3560 Training loss: 1.2412 0.0713 sec/batch\n", + "Epoch 13/20 Iteration 2265/3560 Training loss: 1.2409 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2266/3560 Training loss: 1.2406 0.0727 sec/batch\n", + "Epoch 13/20 Iteration 2267/3560 Training loss: 1.2402 0.0681 sec/batch\n", + "Epoch 13/20 Iteration 2268/3560 Training loss: 1.2399 0.0667 sec/batch\n", + "Epoch 13/20 Iteration 2269/3560 Training loss: 1.2399 0.0654 sec/batch\n", + "Epoch 13/20 Iteration 2270/3560 Training loss: 1.2399 0.0630 sec/batch\n", + "Epoch 13/20 Iteration 2271/3560 Training loss: 1.2398 0.0667 sec/batch\n", + "Epoch 13/20 Iteration 2272/3560 Training loss: 1.2399 0.0611 sec/batch\n", + "Epoch 13/20 Iteration 2273/3560 Training loss: 1.2400 0.0624 sec/batch\n", + "Epoch 13/20 Iteration 2274/3560 
Training loss: 1.2401 0.0659 sec/batch\n", + "Epoch 13/20 Iteration 2275/3560 Training loss: 1.2400 0.0660 sec/batch\n", + "Epoch 13/20 Iteration 2276/3560 Training loss: 1.2399 0.0622 sec/batch\n", + "Epoch 13/20 Iteration 2277/3560 Training loss: 1.2402 0.0633 sec/batch\n", + "Epoch 13/20 Iteration 2278/3560 Training loss: 1.2402 0.0599 sec/batch\n", + "Epoch 13/20 Iteration 2279/3560 Training loss: 1.2402 0.0606 sec/batch\n", + "Epoch 13/20 Iteration 2280/3560 Training loss: 1.2404 0.0662 sec/batch\n", + "Epoch 13/20 Iteration 2281/3560 Training loss: 1.2404 0.0634 sec/batch\n", + "Epoch 13/20 Iteration 2282/3560 Training loss: 1.2405 0.0689 sec/batch\n", + "Epoch 13/20 Iteration 2283/3560 Training loss: 1.2405 0.0605 sec/batch\n", + "Epoch 13/20 Iteration 2284/3560 Training loss: 1.2407 0.0620 sec/batch\n", + "Epoch 13/20 Iteration 2285/3560 Training loss: 1.2407 0.0603 sec/batch\n", + "Epoch 13/20 Iteration 2286/3560 Training loss: 1.2406 0.0685 sec/batch\n", + "Epoch 13/20 Iteration 2287/3560 Training loss: 1.2403 0.0628 sec/batch\n", + "Epoch 13/20 Iteration 2288/3560 Training loss: 1.2401 0.0607 sec/batch\n", + "Epoch 13/20 Iteration 2289/3560 Training loss: 1.2402 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2290/3560 Training loss: 1.2402 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2291/3560 Training loss: 1.2401 0.0617 sec/batch\n", + "Epoch 13/20 Iteration 2292/3560 Training loss: 1.2400 0.0669 sec/batch\n", + "Epoch 13/20 Iteration 2293/3560 Training loss: 1.2400 0.0629 sec/batch\n", + "Epoch 13/20 Iteration 2294/3560 Training loss: 1.2399 0.0674 sec/batch\n", + "Epoch 13/20 Iteration 2295/3560 Training loss: 1.2397 0.0668 sec/batch\n", + "Epoch 13/20 Iteration 2296/3560 Training loss: 1.2398 0.0670 sec/batch\n", + "Epoch 13/20 Iteration 2297/3560 Training loss: 1.2399 0.0633 sec/batch\n", + "Epoch 13/20 Iteration 2298/3560 Training loss: 1.2398 0.0632 sec/batch\n", + "Epoch 13/20 Iteration 2299/3560 Training loss: 1.2398 0.0608 sec/batch\n", + 
"Epoch 13/20 Iteration 2300/3560 Training loss: 1.2397 0.0618 sec/batch\n", + "Epoch 13/20 Iteration 2301/3560 Training loss: 1.2396 0.0673 sec/batch\n", + "Epoch 13/20 Iteration 2302/3560 Training loss: 1.2395 0.0641 sec/batch\n", + "Epoch 13/20 Iteration 2303/3560 Training loss: 1.2397 0.0637 sec/batch\n", + "Epoch 13/20 Iteration 2304/3560 Training loss: 1.2400 0.0711 sec/batch\n", + "Epoch 13/20 Iteration 2305/3560 Training loss: 1.2401 0.0633 sec/batch\n", + "Epoch 13/20 Iteration 2306/3560 Training loss: 1.2401 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2307/3560 Training loss: 1.2401 0.0655 sec/batch\n", + "Epoch 13/20 Iteration 2308/3560 Training loss: 1.2399 0.0604 sec/batch\n", + "Epoch 13/20 Iteration 2309/3560 Training loss: 1.2400 0.0712 sec/batch\n", + "Epoch 13/20 Iteration 2310/3560 Training loss: 1.2400 0.0596 sec/batch\n", + "Epoch 13/20 Iteration 2311/3560 Training loss: 1.2400 0.0645 sec/batch\n", + "Epoch 13/20 Iteration 2312/3560 Training loss: 1.2399 0.0653 sec/batch\n", + "Epoch 13/20 Iteration 2313/3560 Training loss: 1.2399 0.0667 sec/batch\n", + "Epoch 13/20 Iteration 2314/3560 Training loss: 1.2400 0.0636 sec/batch\n", + "Epoch 14/20 Iteration 2315/3560 Training loss: 1.3876 0.0656 sec/batch\n", + "Epoch 14/20 Iteration 2316/3560 Training loss: 1.3189 0.0638 sec/batch\n", + "Epoch 14/20 Iteration 2317/3560 Training loss: 1.2929 0.0602 sec/batch\n", + "Epoch 14/20 Iteration 2318/3560 Training loss: 1.2859 0.0641 sec/batch\n", + "Epoch 14/20 Iteration 2319/3560 Training loss: 1.2731 0.0627 sec/batch\n", + "Epoch 14/20 Iteration 2320/3560 Training loss: 1.2602 0.0685 sec/batch\n", + "Epoch 14/20 Iteration 2321/3560 Training loss: 1.2570 0.0667 sec/batch\n", + "Epoch 14/20 Iteration 2322/3560 Training loss: 1.2530 0.0645 sec/batch\n", + "Epoch 14/20 Iteration 2323/3560 Training loss: 1.2510 0.0710 sec/batch\n", + "Epoch 14/20 Iteration 2324/3560 Training loss: 1.2496 0.0681 sec/batch\n", + "Epoch 14/20 Iteration 2325/3560 Training loss: 
1.2461 0.0731 sec/batch\n", + "Epoch 14/20 Iteration 2326/3560 Training loss: 1.2451 0.0737 sec/batch\n", + "Epoch 14/20 Iteration 2327/3560 Training loss: 1.2453 0.0627 sec/batch\n", + "Epoch 14/20 Iteration 2328/3560 Training loss: 1.2456 0.0655 sec/batch\n", + "Epoch 14/20 Iteration 2329/3560 Training loss: 1.2441 0.0612 sec/batch\n", + "Epoch 14/20 Iteration 2330/3560 Training loss: 1.2418 0.0627 sec/batch\n", + "Epoch 14/20 Iteration 2331/3560 Training loss: 1.2419 0.0672 sec/batch\n", + "Epoch 14/20 Iteration 2332/3560 Training loss: 1.2422 0.0670 sec/batch\n", + "Epoch 14/20 Iteration 2333/3560 Training loss: 1.2415 0.0618 sec/batch\n", + "Epoch 14/20 Iteration 2334/3560 Training loss: 1.2426 0.0621 sec/batch\n", + "Epoch 14/20 Iteration 2335/3560 Training loss: 1.2422 0.0598 sec/batch\n", + "Epoch 14/20 Iteration 2336/3560 Training loss: 1.2421 0.0609 sec/batch\n", + "Epoch 14/20 Iteration 2337/3560 Training loss: 1.2412 0.0597 sec/batch\n", + "Epoch 14/20 Iteration 2338/3560 Training loss: 1.2410 0.0611 sec/batch\n", + "Epoch 14/20 Iteration 2339/3560 Training loss: 1.2406 0.0686 sec/batch\n", + "Epoch 14/20 Iteration 2340/3560 Training loss: 1.2388 0.0583 sec/batch\n", + "Epoch 14/20 Iteration 2341/3560 Training loss: 1.2373 0.0655 sec/batch\n", + "Epoch 14/20 Iteration 2342/3560 Training loss: 1.2380 0.0682 sec/batch\n", + "Epoch 14/20 Iteration 2343/3560 Training loss: 1.2382 0.0647 sec/batch\n", + "Epoch 14/20 Iteration 2344/3560 Training loss: 1.2384 0.0671 sec/batch\n", + "Epoch 14/20 Iteration 2345/3560 Training loss: 1.2377 0.0636 sec/batch\n", + "Epoch 14/20 Iteration 2346/3560 Training loss: 1.2366 0.0640 sec/batch\n", + "Epoch 14/20 Iteration 2347/3560 Training loss: 1.2366 0.0667 sec/batch\n", + "Epoch 14/20 Iteration 2348/3560 Training loss: 1.2366 0.0617 sec/batch\n", + "Epoch 14/20 Iteration 2349/3560 Training loss: 1.2361 0.0601 sec/batch\n", + "Epoch 14/20 Iteration 2350/3560 Training loss: 1.2357 0.0610 sec/batch\n", + "Epoch 14/20 
Iteration 2351/3560 Training loss: 1.2348 0.0657 sec/batch\n", + "Epoch 14/20 Iteration 2352/3560 Training loss: 1.2335 0.0628 sec/batch\n", + "Epoch 14/20 Iteration 2353/3560 Training loss: 1.2321 0.0629 sec/batch\n", + "Epoch 14/20 Iteration 2354/3560 Training loss: 1.2318 0.0629 sec/batch\n", + "Epoch 14/20 Iteration 2355/3560 Training loss: 1.2310 0.0658 sec/batch\n", + "Epoch 14/20 Iteration 2356/3560 Training loss: 1.2318 0.0631 sec/batch\n", + "Epoch 14/20 Iteration 2357/3560 Training loss: 1.2317 0.0615 sec/batch\n", + "Epoch 14/20 Iteration 2358/3560 Training loss: 1.2308 0.0635 sec/batch\n", + "Epoch 14/20 Iteration 2359/3560 Training loss: 1.2310 0.0692 sec/batch\n", + "Epoch 14/20 Iteration 2360/3560 Training loss: 1.2304 0.0649 sec/batch\n", + "Epoch 14/20 Iteration 2361/3560 Training loss: 1.2297 0.0636 sec/batch\n", + "Epoch 14/20 Iteration 2362/3560 Training loss: 1.2294 0.0662 sec/batch\n", + "Epoch 14/20 Iteration 2363/3560 Training loss: 1.2292 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2364/3560 Training loss: 1.2294 0.0707 sec/batch\n", + "Epoch 14/20 Iteration 2365/3560 Training loss: 1.2288 0.0612 sec/batch\n", + "Epoch 14/20 Iteration 2366/3560 Training loss: 1.2295 0.0715 sec/batch\n", + "Epoch 14/20 Iteration 2367/3560 Training loss: 1.2293 0.0679 sec/batch\n", + "Epoch 14/20 Iteration 2368/3560 Training loss: 1.2294 0.0627 sec/batch\n", + "Epoch 14/20 Iteration 2369/3560 Training loss: 1.2290 0.0634 sec/batch\n", + "Epoch 14/20 Iteration 2370/3560 Training loss: 1.2290 0.0614 sec/batch\n", + "Epoch 14/20 Iteration 2371/3560 Training loss: 1.2291 0.0679 sec/batch\n", + "Epoch 14/20 Iteration 2372/3560 Training loss: 1.2288 0.0629 sec/batch\n", + "Epoch 14/20 Iteration 2373/3560 Training loss: 1.2283 0.0675 sec/batch\n", + "Epoch 14/20 Iteration 2374/3560 Training loss: 1.2288 0.0670 sec/batch\n", + "Epoch 14/20 Iteration 2375/3560 Training loss: 1.2289 0.0699 sec/batch\n", + "Epoch 14/20 Iteration 2376/3560 Training loss: 1.2295 0.0630 
sec/batch\n", + "Epoch 14/20 Iteration 2377/3560 Training loss: 1.2295 0.0624 sec/batch\n", + "Epoch 14/20 Iteration 2378/3560 Training loss: 1.2296 0.0622 sec/batch\n", + "Epoch 14/20 Iteration 2379/3560 Training loss: 1.2295 0.0589 sec/batch\n", + "Epoch 14/20 Iteration 2380/3560 Training loss: 1.2296 0.0606 sec/batch\n", + "Epoch 14/20 Iteration 2381/3560 Training loss: 1.2297 0.0686 sec/batch\n", + "Epoch 14/20 Iteration 2382/3560 Training loss: 1.2296 0.0679 sec/batch\n", + "Epoch 14/20 Iteration 2383/3560 Training loss: 1.2297 0.0683 sec/batch\n", + "Epoch 14/20 Iteration 2384/3560 Training loss: 1.2297 0.0637 sec/batch\n", + "Epoch 14/20 Iteration 2385/3560 Training loss: 1.2301 0.0664 sec/batch\n", + "Epoch 14/20 Iteration 2386/3560 Training loss: 1.2304 0.0720 sec/batch\n", + "Epoch 14/20 Iteration 2387/3560 Training loss: 1.2308 0.0637 sec/batch\n", + "Epoch 14/20 Iteration 2388/3560 Training loss: 1.2306 0.0643 sec/batch\n", + "Epoch 14/20 Iteration 2389/3560 Training loss: 1.2305 0.0650 sec/batch\n", + "Epoch 14/20 Iteration 2390/3560 Training loss: 1.2306 0.0672 sec/batch\n", + "Epoch 14/20 Iteration 2391/3560 Training loss: 1.2304 0.0635 sec/batch\n", + "Epoch 14/20 Iteration 2392/3560 Training loss: 1.2304 0.0602 sec/batch\n", + "Epoch 14/20 Iteration 2393/3560 Training loss: 1.2298 0.0620 sec/batch\n", + "Epoch 14/20 Iteration 2394/3560 Training loss: 1.2298 0.0614 sec/batch\n", + "Epoch 14/20 Iteration 2395/3560 Training loss: 1.2294 0.0630 sec/batch\n", + "Epoch 14/20 Iteration 2396/3560 Training loss: 1.2293 0.0670 sec/batch\n", + "Epoch 14/20 Iteration 2397/3560 Training loss: 1.2289 0.0655 sec/batch\n", + "Epoch 14/20 Iteration 2398/3560 Training loss: 1.2288 0.0638 sec/batch\n", + "Epoch 14/20 Iteration 2399/3560 Training loss: 1.2285 0.0645 sec/batch\n", + "Epoch 14/20 Iteration 2400/3560 Training loss: 1.2284 0.0621 sec/batch\n", + "Epoch 14/20 Iteration 2401/3560 Training loss: 1.2284 0.0611 sec/batch\n", + "Epoch 14/20 Iteration 2402/3560 
Training loss: 1.2281 0.0704 sec/batch\n", + "Epoch 14/20 Iteration 2403/3560 Training loss: 1.2277 0.0696 sec/batch\n", + "Epoch 14/20 Iteration 2404/3560 Training loss: 1.2277 0.0660 sec/batch\n", + "Epoch 14/20 Iteration 2405/3560 Training loss: 1.2273 0.0619 sec/batch\n", + "Epoch 14/20 Iteration 2406/3560 Training loss: 1.2273 0.0638 sec/batch\n", + "Epoch 14/20 Iteration 2407/3560 Training loss: 1.2269 0.0636 sec/batch\n", + "Epoch 14/20 Iteration 2408/3560 Training loss: 1.2265 0.0605 sec/batch\n", + "Epoch 14/20 Iteration 2409/3560 Training loss: 1.2263 0.0688 sec/batch\n", + "Epoch 14/20 Iteration 2410/3560 Training loss: 1.2264 0.0600 sec/batch\n", + "Epoch 14/20 Iteration 2411/3560 Training loss: 1.2264 0.0628 sec/batch\n", + "Epoch 14/20 Iteration 2412/3560 Training loss: 1.2260 0.0623 sec/batch\n", + "Epoch 14/20 Iteration 2413/3560 Training loss: 1.2255 0.0621 sec/batch\n", + "Epoch 14/20 Iteration 2414/3560 Training loss: 1.2253 0.0605 sec/batch\n", + "Epoch 14/20 Iteration 2415/3560 Training loss: 1.2252 0.0723 sec/batch\n", + "Epoch 14/20 Iteration 2416/3560 Training loss: 1.2249 0.0698 sec/batch\n", + "Epoch 14/20 Iteration 2417/3560 Training loss: 1.2248 0.0594 sec/batch\n", + "Epoch 14/20 Iteration 2418/3560 Training loss: 1.2247 0.0597 sec/batch\n", + "Epoch 14/20 Iteration 2419/3560 Training loss: 1.2244 0.0634 sec/batch\n", + "Epoch 14/20 Iteration 2420/3560 Training loss: 1.2242 0.0673 sec/batch\n", + "Epoch 14/20 Iteration 2421/3560 Training loss: 1.2241 0.0701 sec/batch\n", + "Epoch 14/20 Iteration 2422/3560 Training loss: 1.2240 0.0650 sec/batch\n", + "Epoch 14/20 Iteration 2423/3560 Training loss: 1.2238 0.0614 sec/batch\n", + "Epoch 14/20 Iteration 2424/3560 Training loss: 1.2238 0.0617 sec/batch\n", + "Epoch 14/20 Iteration 2425/3560 Training loss: 1.2235 0.0650 sec/batch\n", + "Epoch 14/20 Iteration 2426/3560 Training loss: 1.2235 0.0609 sec/batch\n", + "Epoch 14/20 Iteration 2427/3560 Training loss: 1.2235 0.0623 sec/batch\n", + 
"Epoch 14/20 Iteration 2428/3560 Training loss: 1.2233 0.0679 sec/batch\n", + "Epoch 14/20 Iteration 2429/3560 Training loss: 1.2230 0.0628 sec/batch\n", + "Epoch 14/20 Iteration 2430/3560 Training loss: 1.2227 0.0610 sec/batch\n", + "Epoch 14/20 Iteration 2431/3560 Training loss: 1.2227 0.0613 sec/batch\n", + "Epoch 14/20 Iteration 2432/3560 Training loss: 1.2227 0.0623 sec/batch\n", + "Epoch 14/20 Iteration 2433/3560 Training loss: 1.2226 0.0638 sec/batch\n", + "Epoch 14/20 Iteration 2434/3560 Training loss: 1.2225 0.0640 sec/batch\n", + "Epoch 14/20 Iteration 2435/3560 Training loss: 1.2223 0.0613 sec/batch\n", + "Epoch 14/20 Iteration 2436/3560 Training loss: 1.2220 0.0632 sec/batch\n", + "Epoch 14/20 Iteration 2437/3560 Training loss: 1.2216 0.0619 sec/batch\n", + "Epoch 14/20 Iteration 2438/3560 Training loss: 1.2216 0.0620 sec/batch\n", + "Epoch 14/20 Iteration 2439/3560 Training loss: 1.2214 0.0706 sec/batch\n", + "Epoch 14/20 Iteration 2440/3560 Training loss: 1.2211 0.0664 sec/batch\n", + "Epoch 14/20 Iteration 2441/3560 Training loss: 1.2210 0.0661 sec/batch\n", + "Epoch 14/20 Iteration 2442/3560 Training loss: 1.2210 0.0613 sec/batch\n", + "Epoch 14/20 Iteration 2443/3560 Training loss: 1.2207 0.0614 sec/batch\n", + "Epoch 14/20 Iteration 2444/3560 Training loss: 1.2204 0.0668 sec/batch\n", + "Epoch 14/20 Iteration 2445/3560 Training loss: 1.2200 0.0688 sec/batch\n", + "Epoch 14/20 Iteration 2446/3560 Training loss: 1.2198 0.0634 sec/batch\n", + "Epoch 14/20 Iteration 2447/3560 Training loss: 1.2199 0.0690 sec/batch\n", + "Epoch 14/20 Iteration 2448/3560 Training loss: 1.2199 0.0649 sec/batch\n", + "Epoch 14/20 Iteration 2449/3560 Training loss: 1.2198 0.0630 sec/batch\n", + "Epoch 14/20 Iteration 2450/3560 Training loss: 1.2198 0.0640 sec/batch\n", + "Epoch 14/20 Iteration 2451/3560 Training loss: 1.2200 0.0598 sec/batch\n", + "Epoch 14/20 Iteration 2452/3560 Training loss: 1.2201 0.0619 sec/batch\n", + "Epoch 14/20 Iteration 2453/3560 Training loss: 
1.2201 0.0600 sec/batch\n", + "Epoch 14/20 Iteration 2454/3560 Training loss: 1.2201 0.0612 sec/batch\n", + "Epoch 14/20 Iteration 2455/3560 Training loss: 1.2205 0.0688 sec/batch\n", + "Epoch 14/20 Iteration 2456/3560 Training loss: 1.2206 0.0622 sec/batch\n", + "Epoch 14/20 Iteration 2457/3560 Training loss: 1.2205 0.0687 sec/batch\n", + "Epoch 14/20 Iteration 2458/3560 Training loss: 1.2207 0.0670 sec/batch\n", + "Epoch 14/20 Iteration 2459/3560 Training loss: 1.2205 0.0622 sec/batch\n", + "Epoch 14/20 Iteration 2460/3560 Training loss: 1.2207 0.0606 sec/batch\n", + "Epoch 14/20 Iteration 2461/3560 Training loss: 1.2207 0.0702 sec/batch\n", + "Epoch 14/20 Iteration 2462/3560 Training loss: 1.2209 0.0599 sec/batch\n", + "Epoch 14/20 Iteration 2463/3560 Training loss: 1.2210 0.0629 sec/batch\n", + "Epoch 14/20 Iteration 2464/3560 Training loss: 1.2209 0.0652 sec/batch\n", + "Epoch 14/20 Iteration 2465/3560 Training loss: 1.2206 0.0719 sec/batch\n", + "Epoch 14/20 Iteration 2466/3560 Training loss: 1.2204 0.0697 sec/batch\n", + "Epoch 14/20 Iteration 2467/3560 Training loss: 1.2205 0.0651 sec/batch\n", + "Epoch 14/20 Iteration 2468/3560 Training loss: 1.2205 0.0620 sec/batch\n", + "Epoch 14/20 Iteration 2469/3560 Training loss: 1.2205 0.0630 sec/batch\n", + "Epoch 14/20 Iteration 2470/3560 Training loss: 1.2205 0.0621 sec/batch\n", + "Epoch 14/20 Iteration 2471/3560 Training loss: 1.2204 0.0646 sec/batch\n", + "Epoch 14/20 Iteration 2472/3560 Training loss: 1.2204 0.0606 sec/batch\n", + "Epoch 14/20 Iteration 2473/3560 Training loss: 1.2202 0.0614 sec/batch\n", + "Epoch 14/20 Iteration 2474/3560 Training loss: 1.2203 0.0692 sec/batch\n", + "Epoch 14/20 Iteration 2475/3560 Training loss: 1.2204 0.0620 sec/batch\n", + "Epoch 14/20 Iteration 2476/3560 Training loss: 1.2204 0.0613 sec/batch\n", + "Epoch 14/20 Iteration 2477/3560 Training loss: 1.2203 0.0660 sec/batch\n", + "Epoch 14/20 Iteration 2478/3560 Training loss: 1.2203 0.0691 sec/batch\n", + "Epoch 14/20 
Iteration 2479/3560 Training loss: 1.2202 0.0621 sec/batch\n", + "Epoch 14/20 Iteration 2480/3560 Training loss: 1.2201 0.0649 sec/batch\n", + "Epoch 14/20 Iteration 2481/3560 Training loss: 1.2203 0.0598 sec/batch\n", + "Epoch 14/20 Iteration 2482/3560 Training loss: 1.2207 0.0607 sec/batch\n", + "Epoch 14/20 Iteration 2483/3560 Training loss: 1.2207 0.0614 sec/batch\n", + "Epoch 14/20 Iteration 2484/3560 Training loss: 1.2208 0.0650 sec/batch\n", + "Epoch 14/20 Iteration 2485/3560 Training loss: 1.2207 0.0679 sec/batch\n", + "Epoch 14/20 Iteration 2486/3560 Training loss: 1.2206 0.0774 sec/batch\n", + "Epoch 14/20 Iteration 2487/3560 Training loss: 1.2207 0.0647 sec/batch\n", + "Epoch 14/20 Iteration 2488/3560 Training loss: 1.2207 0.0658 sec/batch\n", + "Epoch 14/20 Iteration 2489/3560 Training loss: 1.2207 0.0616 sec/batch\n", + "Epoch 14/20 Iteration 2490/3560 Training loss: 1.2206 0.0607 sec/batch\n", + "Epoch 14/20 Iteration 2491/3560 Training loss: 1.2205 0.0617 sec/batch\n", + "Epoch 14/20 Iteration 2492/3560 Training loss: 1.2207 0.0636 sec/batch\n", + "Epoch 15/20 Iteration 2493/3560 Training loss: 1.3561 0.0680 sec/batch\n", + "Epoch 15/20 Iteration 2494/3560 Training loss: 1.2961 0.0599 sec/batch\n", + "Epoch 15/20 Iteration 2495/3560 Training loss: 1.2735 0.0621 sec/batch\n", + "Epoch 15/20 Iteration 2496/3560 Training loss: 1.2624 0.0683 sec/batch\n", + "Epoch 15/20 Iteration 2497/3560 Training loss: 1.2506 0.0620 sec/batch\n", + "Epoch 15/20 Iteration 2498/3560 Training loss: 1.2370 0.0630 sec/batch\n", + "Epoch 15/20 Iteration 2499/3560 Training loss: 1.2338 0.0656 sec/batch\n", + "Epoch 15/20 Iteration 2500/3560 Training loss: 1.2309 0.0626 sec/batch\n", + "Epoch 15/20 Iteration 2501/3560 Training loss: 1.2299 0.0621 sec/batch\n", + "Epoch 15/20 Iteration 2502/3560 Training loss: 1.2273 0.0682 sec/batch\n", + "Epoch 15/20 Iteration 2503/3560 Training loss: 1.2245 0.0677 sec/batch\n", + "Epoch 15/20 Iteration 2504/3560 Training loss: 1.2243 0.0635 
sec/batch\n", + "Epoch 15/20 Iteration 2505/3560 Training loss: 1.2242 0.0623 sec/batch\n", + "Epoch 15/20 Iteration 2506/3560 Training loss: 1.2250 0.0620 sec/batch\n", + "Epoch 15/20 Iteration 2507/3560 Training loss: 1.2241 0.0635 sec/batch\n", + "Epoch 15/20 Iteration 2508/3560 Training loss: 1.2224 0.0602 sec/batch\n", + "Epoch 15/20 Iteration 2509/3560 Training loss: 1.2223 0.0623 sec/batch\n", + "Epoch 15/20 Iteration 2510/3560 Training loss: 1.2234 0.0639 sec/batch\n", + "Epoch 15/20 Iteration 2511/3560 Training loss: 1.2231 0.0650 sec/batch\n", + "Epoch 15/20 Iteration 2512/3560 Training loss: 1.2239 0.0642 sec/batch\n", + "Epoch 15/20 Iteration 2513/3560 Training loss: 1.2232 0.0627 sec/batch\n", + "Epoch 15/20 Iteration 2514/3560 Training loss: 1.2233 0.0608 sec/batch\n", + "Epoch 15/20 Iteration 2515/3560 Training loss: 1.2225 0.0655 sec/batch\n", + "Epoch 15/20 Iteration 2516/3560 Training loss: 1.2225 0.0613 sec/batch\n", + "Epoch 15/20 Iteration 2517/3560 Training loss: 1.2225 0.0642 sec/batch\n", + "Epoch 15/20 Iteration 2518/3560 Training loss: 1.2206 0.0607 sec/batch\n", + "Epoch 15/20 Iteration 2519/3560 Training loss: 1.2195 0.0694 sec/batch\n", + "Epoch 15/20 Iteration 2520/3560 Training loss: 1.2200 0.0696 sec/batch\n", + "Epoch 15/20 Iteration 2521/3560 Training loss: 1.2198 0.0628 sec/batch\n", + "Epoch 15/20 Iteration 2522/3560 Training loss: 1.2201 0.0634 sec/batch\n", + "Epoch 15/20 Iteration 2523/3560 Training loss: 1.2195 0.0650 sec/batch\n", + "Epoch 15/20 Iteration 2524/3560 Training loss: 1.2184 0.0665 sec/batch\n", + "Epoch 15/20 Iteration 2525/3560 Training loss: 1.2186 0.0633 sec/batch\n", + "Epoch 15/20 Iteration 2526/3560 Training loss: 1.2185 0.0638 sec/batch\n", + "Epoch 15/20 Iteration 2527/3560 Training loss: 1.2179 0.0621 sec/batch\n", + "Epoch 15/20 Iteration 2528/3560 Training loss: 1.2177 0.0619 sec/batch\n", + "Epoch 15/20 Iteration 2529/3560 Training loss: 1.2169 0.0598 sec/batch\n", + "Epoch 15/20 Iteration 2530/3560 
Training loss: 1.2156 0.0675 sec/batch\n", + "Epoch 15/20 Iteration 2531/3560 Training loss: 1.2144 0.0657 sec/batch\n", + "Epoch 15/20 Iteration 2532/3560 Training loss: 1.2141 0.0666 sec/batch\n", + "Epoch 15/20 Iteration 2533/3560 Training loss: 1.2136 0.0614 sec/batch\n", + "Epoch 15/20 Iteration 2534/3560 Training loss: 1.2146 0.0684 sec/batch\n", + "Epoch 15/20 Iteration 2535/3560 Training loss: 1.2142 0.0658 sec/batch\n", + "Epoch 15/20 Iteration 2536/3560 Training loss: 1.2136 0.0625 sec/batch\n", + "Epoch 15/20 Iteration 2537/3560 Training loss: 1.2138 0.0627 sec/batch\n", + "Epoch 15/20 Iteration 2538/3560 Training loss: 1.2133 0.0648 sec/batch\n", + "Epoch 15/20 Iteration 2539/3560 Training loss: 1.2128 0.0688 sec/batch\n", + "Epoch 15/20 Iteration 2540/3560 Training loss: 1.2124 0.0622 sec/batch\n", + "Epoch 15/20 Iteration 2541/3560 Training loss: 1.2123 0.0656 sec/batch\n", + "Epoch 15/20 Iteration 2542/3560 Training loss: 1.2125 0.0656 sec/batch\n", + "Epoch 15/20 Iteration 2543/3560 Training loss: 1.2120 0.0641 sec/batch\n", + "Epoch 15/20 Iteration 2544/3560 Training loss: 1.2126 0.0635 sec/batch\n", + "Epoch 15/20 Iteration 2545/3560 Training loss: 1.2126 0.0658 sec/batch\n", + "Epoch 15/20 Iteration 2546/3560 Training loss: 1.2128 0.0651 sec/batch\n", + "Epoch 15/20 Iteration 2547/3560 Training loss: 1.2125 0.0609 sec/batch\n", + "Epoch 15/20 Iteration 2548/3560 Training loss: 1.2127 0.0660 sec/batch\n", + "Epoch 15/20 Iteration 2549/3560 Training loss: 1.2129 0.0683 sec/batch\n", + "Epoch 15/20 Iteration 2550/3560 Training loss: 1.2126 0.0619 sec/batch\n", + "Epoch 15/20 Iteration 2551/3560 Training loss: 1.2121 0.0603 sec/batch\n", + "Epoch 15/20 Iteration 2552/3560 Training loss: 1.2126 0.0706 sec/batch\n", + "Epoch 15/20 Iteration 2553/3560 Training loss: 1.2127 0.0603 sec/batch\n", + "Epoch 15/20 Iteration 2554/3560 Training loss: 1.2133 0.0621 sec/batch\n", + "Epoch 15/20 Iteration 2555/3560 Training loss: 1.2135 0.0626 sec/batch\n", + 
"Epoch 15/20 Iteration 2556/3560 Training loss: 1.2135 0.0663 sec/batch\n", + "Epoch 15/20 Iteration 2557/3560 Training loss: 1.2135 0.0656 sec/batch\n", + "Epoch 15/20 Iteration 2558/3560 Training loss: 1.2137 0.0740 sec/batch\n", + "Epoch 15/20 Iteration 2559/3560 Training loss: 1.2138 0.0622 sec/batch\n", + "Epoch 15/20 Iteration 2560/3560 Training loss: 1.2136 0.0677 sec/batch\n", + "Epoch 15/20 Iteration 2561/3560 Training loss: 1.2137 0.0721 sec/batch\n", + "Epoch 15/20 Iteration 2562/3560 Training loss: 1.2137 0.0625 sec/batch\n", + "Epoch 15/20 Iteration 2563/3560 Training loss: 1.2141 0.0691 sec/batch\n", + "Epoch 15/20 Iteration 2564/3560 Training loss: 1.2144 0.0682 sec/batch\n", + "Epoch 15/20 Iteration 2565/3560 Training loss: 1.2150 0.0631 sec/batch\n", + "Epoch 15/20 Iteration 2566/3560 Training loss: 1.2147 0.0702 sec/batch\n", + "Epoch 15/20 Iteration 2567/3560 Training loss: 1.2146 0.0644 sec/batch\n", + "Epoch 15/20 Iteration 2568/3560 Training loss: 1.2148 0.0646 sec/batch\n", + "Epoch 15/20 Iteration 2569/3560 Training loss: 1.2147 0.0610 sec/batch\n", + "Epoch 15/20 Iteration 2570/3560 Training loss: 1.2146 0.0667 sec/batch\n", + "Epoch 15/20 Iteration 2571/3560 Training loss: 1.2140 0.0635 sec/batch\n", + "Epoch 15/20 Iteration 2572/3560 Training loss: 1.2141 0.0661 sec/batch\n", + "Epoch 15/20 Iteration 2573/3560 Training loss: 1.2137 0.0649 sec/batch\n", + "Epoch 15/20 Iteration 2574/3560 Training loss: 1.2137 0.0611 sec/batch\n", + "Epoch 15/20 Iteration 2575/3560 Training loss: 1.2133 0.0615 sec/batch\n", + "Epoch 15/20 Iteration 2576/3560 Training loss: 1.2133 0.0611 sec/batch\n", + "Epoch 15/20 Iteration 2577/3560 Training loss: 1.2129 0.0696 sec/batch\n", + "Epoch 15/20 Iteration 2578/3560 Training loss: 1.2128 0.0608 sec/batch\n", + "Epoch 15/20 Iteration 2579/3560 Training loss: 1.2126 0.0646 sec/batch\n", + "Epoch 15/20 Iteration 2580/3560 Training loss: 1.2124 0.0619 sec/batch\n", + "Epoch 15/20 Iteration 2581/3560 Training loss: 
1.2121 0.0620 sec/batch\n", + "Epoch 15/20 Iteration 2582/3560 Training loss: 1.2121 0.0782 sec/batch\n", + "Epoch 15/20 Iteration 2583/3560 Training loss: 1.2119 0.0651 sec/batch\n", + "Epoch 15/20 Iteration 2584/3560 Training loss: 1.2118 0.0604 sec/batch\n", + "Epoch 15/20 Iteration 2585/3560 Training loss: 1.2114 0.0631 sec/batch\n", + "Epoch 15/20 Iteration 2586/3560 Training loss: 1.2111 0.0654 sec/batch\n", + "Epoch 15/20 Iteration 2587/3560 Training loss: 1.2109 0.0617 sec/batch\n", + "Epoch 15/20 Iteration 2588/3560 Training loss: 1.2109 0.0651 sec/batch\n", + "Epoch 15/20 Iteration 2589/3560 Training loss: 1.2109 0.0596 sec/batch\n", + "Epoch 15/20 Iteration 2590/3560 Training loss: 1.2106 0.0627 sec/batch\n", + "Epoch 15/20 Iteration 2591/3560 Training loss: 1.2103 0.0814 sec/batch\n", + "Epoch 15/20 Iteration 2592/3560 Training loss: 1.2099 0.0636 sec/batch\n", + "Epoch 15/20 Iteration 2593/3560 Training loss: 1.2098 0.0654 sec/batch\n", + "Epoch 15/20 Iteration 2594/3560 Training loss: 1.2096 0.0643 sec/batch\n", + "Epoch 15/20 Iteration 2595/3560 Training loss: 1.2095 0.0596 sec/batch\n", + "Epoch 15/20 Iteration 2596/3560 Training loss: 1.2094 0.0620 sec/batch\n", + "Epoch 15/20 Iteration 2597/3560 Training loss: 1.2092 0.0656 sec/batch\n", + "Epoch 15/20 Iteration 2598/3560 Training loss: 1.2091 0.0656 sec/batch\n", + "Epoch 15/20 Iteration 2599/3560 Training loss: 1.2090 0.0618 sec/batch\n", + "Epoch 15/20 Iteration 2600/3560 Training loss: 1.2089 0.0606 sec/batch\n", + "Epoch 15/20 Iteration 2601/3560 Training loss: 1.2088 0.0615 sec/batch\n", + "Epoch 15/20 Iteration 2602/3560 Training loss: 1.2089 0.0647 sec/batch\n", + "Epoch 15/20 Iteration 2603/3560 Training loss: 1.2086 0.0619 sec/batch\n", + "Epoch 15/20 Iteration 2604/3560 Training loss: 1.2086 0.0602 sec/batch\n", + "Epoch 15/20 Iteration 2605/3560 Training loss: 1.2085 0.0637 sec/batch\n", + "Epoch 15/20 Iteration 2606/3560 Training loss: 1.2084 0.0642 sec/batch\n", + "Epoch 15/20 
Iteration 2607/3560 Training loss: 1.2081 0.0634 sec/batch\n", + "Epoch 15/20 Iteration 2608/3560 Training loss: 1.2078 0.0624 sec/batch\n", + "Epoch 15/20 Iteration 2609/3560 Training loss: 1.2078 0.0608 sec/batch\n", + "Epoch 15/20 Iteration 2610/3560 Training loss: 1.2078 0.0693 sec/batch\n", + "Epoch 15/20 Iteration 2611/3560 Training loss: 1.2076 0.0607 sec/batch\n", + "Epoch 15/20 Iteration 2612/3560 Training loss: 1.2076 0.0618 sec/batch\n", + "Epoch 15/20 Iteration 2613/3560 Training loss: 1.2074 0.0640 sec/batch\n", + "Epoch 15/20 Iteration 2614/3560 Training loss: 1.2070 0.0627 sec/batch\n", + "Epoch 15/20 Iteration 2615/3560 Training loss: 1.2066 0.0642 sec/batch\n", + "Epoch 15/20 Iteration 2616/3560 Training loss: 1.2065 0.0621 sec/batch\n", + "Epoch 15/20 Iteration 2617/3560 Training loss: 1.2063 0.0686 sec/batch\n", + "Epoch 15/20 Iteration 2618/3560 Training loss: 1.2059 0.0635 sec/batch\n", + "Epoch 15/20 Iteration 2619/3560 Training loss: 1.2059 0.0666 sec/batch\n", + "Epoch 15/20 Iteration 2620/3560 Training loss: 1.2059 0.0629 sec/batch\n", + "Epoch 15/20 Iteration 2621/3560 Training loss: 1.2057 0.0676 sec/batch\n", + "Epoch 15/20 Iteration 2622/3560 Training loss: 1.2053 0.0636 sec/batch\n", + "Epoch 15/20 Iteration 2623/3560 Training loss: 1.2049 0.0686 sec/batch\n", + "Epoch 15/20 Iteration 2624/3560 Training loss: 1.2046 0.0676 sec/batch\n", + "Epoch 15/20 Iteration 2625/3560 Training loss: 1.2046 0.0637 sec/batch\n", + "Epoch 15/20 Iteration 2626/3560 Training loss: 1.2045 0.0680 sec/batch\n", + "Epoch 15/20 Iteration 2627/3560 Training loss: 1.2045 0.0692 sec/batch\n", + "Epoch 15/20 Iteration 2628/3560 Training loss: 1.2045 0.0659 sec/batch\n", + "Epoch 15/20 Iteration 2629/3560 Training loss: 1.2046 0.0647 sec/batch\n", + "Epoch 15/20 Iteration 2630/3560 Training loss: 1.2047 0.0678 sec/batch\n", + "Epoch 15/20 Iteration 2631/3560 Training loss: 1.2047 0.0673 sec/batch\n", + "Epoch 15/20 Iteration 2632/3560 Training loss: 1.2047 0.0635 
sec/batch\n", + "Epoch 15/20 Iteration 2633/3560 Training loss: 1.2050 0.0795 sec/batch\n", + "Epoch 15/20 Iteration 2634/3560 Training loss: 1.2051 0.0677 sec/batch\n", + "Epoch 15/20 Iteration 2635/3560 Training loss: 1.2051 0.0685 sec/batch\n", + "Epoch 15/20 Iteration 2636/3560 Training loss: 1.2054 0.0633 sec/batch\n", + "Epoch 15/20 Iteration 2637/3560 Training loss: 1.2053 0.0658 sec/batch\n", + "Epoch 15/20 Iteration 2638/3560 Training loss: 1.2056 0.0645 sec/batch\n", + "Epoch 15/20 Iteration 2639/3560 Training loss: 1.2056 0.0614 sec/batch\n", + "Epoch 15/20 Iteration 2640/3560 Training loss: 1.2057 0.0605 sec/batch\n", + "Epoch 15/20 Iteration 2641/3560 Training loss: 1.2059 0.0594 sec/batch\n", + "Epoch 15/20 Iteration 2642/3560 Training loss: 1.2057 0.0648 sec/batch\n", + "Epoch 15/20 Iteration 2643/3560 Training loss: 1.2054 0.0622 sec/batch\n", + "Epoch 15/20 Iteration 2644/3560 Training loss: 1.2053 0.0677 sec/batch\n", + "Epoch 15/20 Iteration 2645/3560 Training loss: 1.2054 0.0696 sec/batch\n", + "Epoch 15/20 Iteration 2646/3560 Training loss: 1.2054 0.0653 sec/batch\n", + "Epoch 15/20 Iteration 2647/3560 Training loss: 1.2054 0.0680 sec/batch\n", + "Epoch 15/20 Iteration 2648/3560 Training loss: 1.2053 0.0663 sec/batch\n", + "Epoch 15/20 Iteration 2649/3560 Training loss: 1.2053 0.0645 sec/batch\n", + "Epoch 15/20 Iteration 2650/3560 Training loss: 1.2052 0.0759 sec/batch\n", + "Epoch 15/20 Iteration 2651/3560 Training loss: 1.2050 0.0641 sec/batch\n", + "Epoch 15/20 Iteration 2652/3560 Training loss: 1.2050 0.0617 sec/batch\n", + "Epoch 15/20 Iteration 2653/3560 Training loss: 1.2051 0.0640 sec/batch\n", + "Epoch 15/20 Iteration 2654/3560 Training loss: 1.2051 0.0689 sec/batch\n", + "Epoch 15/20 Iteration 2655/3560 Training loss: 1.2050 0.0689 sec/batch\n", + "Epoch 15/20 Iteration 2656/3560 Training loss: 1.2050 0.0803 sec/batch\n", + "Epoch 15/20 Iteration 2657/3560 Training loss: 1.2050 0.0654 sec/batch\n", + "Epoch 15/20 Iteration 2658/3560 
Training loss: 1.2049 0.0736 sec/batch\n", + "[training log condensed: Epochs 15/20 through 18/20, Iterations 2659–3169 of 3560; loss starts near 1.30–1.34 at the top of each epoch and settles to roughly 1.16–1.21 by epoch end, steadily improving epoch over epoch at about 0.06–0.07 sec/batch]\n", + "Epoch 18/20 Iteration 3170/3560 
Training loss: 1.1611 0.0625 sec/batch\n", + "Epoch 18/20 Iteration 3171/3560 Training loss: 1.1611 0.0646 sec/batch\n", + "Epoch 18/20 Iteration 3172/3560 Training loss: 1.1613 0.0604 sec/batch\n", + "Epoch 18/20 Iteration 3173/3560 Training loss: 1.1614 0.0669 sec/batch\n", + "Epoch 18/20 Iteration 3174/3560 Training loss: 1.1616 0.0643 sec/batch\n", + "Epoch 18/20 Iteration 3175/3560 Training loss: 1.1617 0.0604 sec/batch\n", + "Epoch 18/20 Iteration 3176/3560 Training loss: 1.1617 0.0613 sec/batch\n", + "Epoch 18/20 Iteration 3177/3560 Training loss: 1.1614 0.0599 sec/batch\n", + "Epoch 18/20 Iteration 3178/3560 Training loss: 1.1613 0.0615 sec/batch\n", + "Epoch 18/20 Iteration 3179/3560 Training loss: 1.1614 0.0627 sec/batch\n", + "Epoch 18/20 Iteration 3180/3560 Training loss: 1.1614 0.0706 sec/batch\n", + "Epoch 18/20 Iteration 3181/3560 Training loss: 1.1614 0.0616 sec/batch\n", + "Epoch 18/20 Iteration 3182/3560 Training loss: 1.1613 0.0668 sec/batch\n", + "Epoch 18/20 Iteration 3183/3560 Training loss: 1.1612 0.0667 sec/batch\n", + "Epoch 18/20 Iteration 3184/3560 Training loss: 1.1611 0.0632 sec/batch\n", + "Epoch 18/20 Iteration 3185/3560 Training loss: 1.1610 0.0609 sec/batch\n", + "Epoch 18/20 Iteration 3186/3560 Training loss: 1.1611 0.0608 sec/batch\n", + "Epoch 18/20 Iteration 3187/3560 Training loss: 1.1612 0.0729 sec/batch\n", + "Epoch 18/20 Iteration 3188/3560 Training loss: 1.1612 0.0630 sec/batch\n", + "Epoch 18/20 Iteration 3189/3560 Training loss: 1.1612 0.0674 sec/batch\n", + "Epoch 18/20 Iteration 3190/3560 Training loss: 1.1611 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3191/3560 Training loss: 1.1611 0.0626 sec/batch\n", + "Epoch 18/20 Iteration 3192/3560 Training loss: 1.1611 0.0690 sec/batch\n", + "Epoch 18/20 Iteration 3193/3560 Training loss: 1.1613 0.0690 sec/batch\n", + "Epoch 18/20 Iteration 3194/3560 Training loss: 1.1617 0.0609 sec/batch\n", + "Epoch 18/20 Iteration 3195/3560 Training loss: 1.1617 0.0599 sec/batch\n", + 
"Epoch 18/20 Iteration 3196/3560 Training loss: 1.1617 0.0636 sec/batch\n", + "Epoch 18/20 Iteration 3197/3560 Training loss: 1.1617 0.0659 sec/batch\n", + "Epoch 18/20 Iteration 3198/3560 Training loss: 1.1616 0.0610 sec/batch\n", + "Epoch 18/20 Iteration 3199/3560 Training loss: 1.1618 0.0703 sec/batch\n", + "Epoch 18/20 Iteration 3200/3560 Training loss: 1.1618 0.0652 sec/batch\n", + "Epoch 18/20 Iteration 3201/3560 Training loss: 1.1618 0.0639 sec/batch\n", + "Epoch 18/20 Iteration 3202/3560 Training loss: 1.1617 0.0643 sec/batch\n", + "Epoch 18/20 Iteration 3203/3560 Training loss: 1.1617 0.0609 sec/batch\n", + "Epoch 18/20 Iteration 3204/3560 Training loss: 1.1619 0.0625 sec/batch\n", + "Epoch 19/20 Iteration 3205/3560 Training loss: 1.2851 0.0613 sec/batch\n", + "Epoch 19/20 Iteration 3206/3560 Training loss: 1.2311 0.0646 sec/batch\n", + "Epoch 19/20 Iteration 3207/3560 Training loss: 1.2096 0.0664 sec/batch\n", + "Epoch 19/20 Iteration 3208/3560 Training loss: 1.2015 0.0641 sec/batch\n", + "Epoch 19/20 Iteration 3209/3560 Training loss: 1.1897 0.0628 sec/batch\n", + "Epoch 19/20 Iteration 3210/3560 Training loss: 1.1789 0.0728 sec/batch\n", + "Epoch 19/20 Iteration 3211/3560 Training loss: 1.1769 0.0625 sec/batch\n", + "Epoch 19/20 Iteration 3212/3560 Training loss: 1.1728 0.0598 sec/batch\n", + "Epoch 19/20 Iteration 3213/3560 Training loss: 1.1717 0.0629 sec/batch\n", + "Epoch 19/20 Iteration 3214/3560 Training loss: 1.1700 0.0648 sec/batch\n", + "Epoch 19/20 Iteration 3215/3560 Training loss: 1.1670 0.0622 sec/batch\n", + "Epoch 19/20 Iteration 3216/3560 Training loss: 1.1661 0.0687 sec/batch\n", + "Epoch 19/20 Iteration 3217/3560 Training loss: 1.1666 0.0679 sec/batch\n", + "Epoch 19/20 Iteration 3218/3560 Training loss: 1.1668 0.0705 sec/batch\n", + "Epoch 19/20 Iteration 3219/3560 Training loss: 1.1648 0.0777 sec/batch\n", + "Epoch 19/20 Iteration 3220/3560 Training loss: 1.1633 0.0657 sec/batch\n", + "Epoch 19/20 Iteration 3221/3560 Training loss: 
1.1637 0.0628 sec/batch\n", + "Epoch 19/20 Iteration 3222/3560 Training loss: 1.1646 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3223/3560 Training loss: 1.1643 0.0663 sec/batch\n", + "Epoch 19/20 Iteration 3224/3560 Training loss: 1.1654 0.0633 sec/batch\n", + "Epoch 19/20 Iteration 3225/3560 Training loss: 1.1644 0.0610 sec/batch\n", + "Epoch 19/20 Iteration 3226/3560 Training loss: 1.1648 0.0621 sec/batch\n", + "Epoch 19/20 Iteration 3227/3560 Training loss: 1.1643 0.0729 sec/batch\n", + "Epoch 19/20 Iteration 3228/3560 Training loss: 1.1647 0.0613 sec/batch\n", + "Epoch 19/20 Iteration 3229/3560 Training loss: 1.1646 0.0746 sec/batch\n", + "Epoch 19/20 Iteration 3230/3560 Training loss: 1.1627 0.0652 sec/batch\n", + "Epoch 19/20 Iteration 3231/3560 Training loss: 1.1619 0.0623 sec/batch\n", + "Epoch 19/20 Iteration 3232/3560 Training loss: 1.1627 0.0618 sec/batch\n", + "Epoch 19/20 Iteration 3233/3560 Training loss: 1.1626 0.0698 sec/batch\n", + "Epoch 19/20 Iteration 3234/3560 Training loss: 1.1630 0.0665 sec/batch\n", + "Epoch 19/20 Iteration 3235/3560 Training loss: 1.1621 0.0625 sec/batch\n", + "Epoch 19/20 Iteration 3236/3560 Training loss: 1.1610 0.0625 sec/batch\n", + "Epoch 19/20 Iteration 3237/3560 Training loss: 1.1609 0.0645 sec/batch\n", + "Epoch 19/20 Iteration 3238/3560 Training loss: 1.1613 0.0722 sec/batch\n", + "Epoch 19/20 Iteration 3239/3560 Training loss: 1.1610 0.0634 sec/batch\n", + "Epoch 19/20 Iteration 3240/3560 Training loss: 1.1608 0.0666 sec/batch\n", + "Epoch 19/20 Iteration 3241/3560 Training loss: 1.1599 0.0641 sec/batch\n", + "Epoch 19/20 Iteration 3242/3560 Training loss: 1.1587 0.0672 sec/batch\n", + "Epoch 19/20 Iteration 3243/3560 Training loss: 1.1576 0.0637 sec/batch\n", + "Epoch 19/20 Iteration 3244/3560 Training loss: 1.1575 0.0651 sec/batch\n", + "Epoch 19/20 Iteration 3245/3560 Training loss: 1.1570 0.0632 sec/batch\n", + "Epoch 19/20 Iteration 3246/3560 Training loss: 1.1579 0.0604 sec/batch\n", + "Epoch 19/20 
Iteration 3247/3560 Training loss: 1.1578 0.0672 sec/batch\n", + "Epoch 19/20 Iteration 3248/3560 Training loss: 1.1570 0.0652 sec/batch\n", + "Epoch 19/20 Iteration 3249/3560 Training loss: 1.1571 0.0621 sec/batch\n", + "Epoch 19/20 Iteration 3250/3560 Training loss: 1.1566 0.0747 sec/batch\n", + "Epoch 19/20 Iteration 3251/3560 Training loss: 1.1562 0.0661 sec/batch\n", + "Epoch 19/20 Iteration 3252/3560 Training loss: 1.1558 0.0627 sec/batch\n", + "Epoch 19/20 Iteration 3253/3560 Training loss: 1.1558 0.0678 sec/batch\n", + "Epoch 19/20 Iteration 3254/3560 Training loss: 1.1561 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3255/3560 Training loss: 1.1556 0.0631 sec/batch\n", + "Epoch 19/20 Iteration 3256/3560 Training loss: 1.1561 0.0635 sec/batch\n", + "Epoch 19/20 Iteration 3257/3560 Training loss: 1.1561 0.0643 sec/batch\n", + "Epoch 19/20 Iteration 3258/3560 Training loss: 1.1560 0.0664 sec/batch\n", + "Epoch 19/20 Iteration 3259/3560 Training loss: 1.1558 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3260/3560 Training loss: 1.1558 0.0593 sec/batch\n", + "Epoch 19/20 Iteration 3261/3560 Training loss: 1.1558 0.0646 sec/batch\n", + "Epoch 19/20 Iteration 3262/3560 Training loss: 1.1557 0.0634 sec/batch\n", + "Epoch 19/20 Iteration 3263/3560 Training loss: 1.1552 0.0682 sec/batch\n", + "Epoch 19/20 Iteration 3264/3560 Training loss: 1.1558 0.0623 sec/batch\n", + "Epoch 19/20 Iteration 3265/3560 Training loss: 1.1558 0.0631 sec/batch\n", + "Epoch 19/20 Iteration 3266/3560 Training loss: 1.1565 0.0610 sec/batch\n", + "Epoch 19/20 Iteration 3267/3560 Training loss: 1.1567 0.0657 sec/batch\n", + "Epoch 19/20 Iteration 3268/3560 Training loss: 1.1567 0.0623 sec/batch\n", + "Epoch 19/20 Iteration 3269/3560 Training loss: 1.1567 0.0617 sec/batch\n", + "Epoch 19/20 Iteration 3270/3560 Training loss: 1.1568 0.0708 sec/batch\n", + "Epoch 19/20 Iteration 3271/3560 Training loss: 1.1571 0.0626 sec/batch\n", + "Epoch 19/20 Iteration 3272/3560 Training loss: 1.1570 0.0636 
sec/batch\n", + "Epoch 19/20 Iteration 3273/3560 Training loss: 1.1572 0.0618 sec/batch\n", + "Epoch 19/20 Iteration 3274/3560 Training loss: 1.1572 0.0673 sec/batch\n", + "Epoch 19/20 Iteration 3275/3560 Training loss: 1.1577 0.0686 sec/batch\n", + "Epoch 19/20 Iteration 3276/3560 Training loss: 1.1580 0.0629 sec/batch\n", + "Epoch 19/20 Iteration 3277/3560 Training loss: 1.1584 0.0633 sec/batch\n", + "Epoch 19/20 Iteration 3278/3560 Training loss: 1.1579 0.0615 sec/batch\n", + "Epoch 19/20 Iteration 3279/3560 Training loss: 1.1580 0.0649 sec/batch\n", + "Epoch 19/20 Iteration 3280/3560 Training loss: 1.1581 0.0656 sec/batch\n", + "Epoch 19/20 Iteration 3281/3560 Training loss: 1.1580 0.0612 sec/batch\n", + "Epoch 19/20 Iteration 3282/3560 Training loss: 1.1580 0.0671 sec/batch\n", + "Epoch 19/20 Iteration 3283/3560 Training loss: 1.1573 0.0671 sec/batch\n", + "Epoch 19/20 Iteration 3284/3560 Training loss: 1.1573 0.0627 sec/batch\n", + "Epoch 19/20 Iteration 3285/3560 Training loss: 1.1569 0.0717 sec/batch\n", + "Epoch 19/20 Iteration 3286/3560 Training loss: 1.1568 0.0608 sec/batch\n", + "Epoch 19/20 Iteration 3287/3560 Training loss: 1.1564 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3288/3560 Training loss: 1.1564 0.0629 sec/batch\n", + "Epoch 19/20 Iteration 3289/3560 Training loss: 1.1561 0.0655 sec/batch\n", + "Epoch 19/20 Iteration 3290/3560 Training loss: 1.1559 0.0643 sec/batch\n", + "Epoch 19/20 Iteration 3291/3560 Training loss: 1.1558 0.0619 sec/batch\n", + "Epoch 19/20 Iteration 3292/3560 Training loss: 1.1555 0.0619 sec/batch\n", + "Epoch 19/20 Iteration 3293/3560 Training loss: 1.1552 0.0600 sec/batch\n", + "Epoch 19/20 Iteration 3294/3560 Training loss: 1.1552 0.0657 sec/batch\n", + "Epoch 19/20 Iteration 3295/3560 Training loss: 1.1549 0.0627 sec/batch\n", + "Epoch 19/20 Iteration 3296/3560 Training loss: 1.1549 0.0618 sec/batch\n", + "Epoch 19/20 Iteration 3297/3560 Training loss: 1.1546 0.0641 sec/batch\n", + "Epoch 19/20 Iteration 3298/3560 
Training loss: 1.1542 0.0614 sec/batch\n", + "Epoch 19/20 Iteration 3299/3560 Training loss: 1.1540 0.0646 sec/batch\n", + "Epoch 19/20 Iteration 3300/3560 Training loss: 1.1540 0.0645 sec/batch\n", + "Epoch 19/20 Iteration 3301/3560 Training loss: 1.1540 0.0718 sec/batch\n", + "Epoch 19/20 Iteration 3302/3560 Training loss: 1.1535 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3303/3560 Training loss: 1.1532 0.0615 sec/batch\n", + "Epoch 19/20 Iteration 3304/3560 Training loss: 1.1530 0.0668 sec/batch\n", + "Epoch 19/20 Iteration 3305/3560 Training loss: 1.1529 0.0599 sec/batch\n", + "Epoch 19/20 Iteration 3306/3560 Training loss: 1.1528 0.0672 sec/batch\n", + "Epoch 19/20 Iteration 3307/3560 Training loss: 1.1528 0.0612 sec/batch\n", + "Epoch 19/20 Iteration 3308/3560 Training loss: 1.1527 0.0613 sec/batch\n", + "Epoch 19/20 Iteration 3309/3560 Training loss: 1.1526 0.0633 sec/batch\n", + "Epoch 19/20 Iteration 3310/3560 Training loss: 1.1525 0.0641 sec/batch\n", + "Epoch 19/20 Iteration 3311/3560 Training loss: 1.1523 0.0672 sec/batch\n", + "Epoch 19/20 Iteration 3312/3560 Training loss: 1.1524 0.0744 sec/batch\n", + "Epoch 19/20 Iteration 3313/3560 Training loss: 1.1522 0.0647 sec/batch\n", + "Epoch 19/20 Iteration 3314/3560 Training loss: 1.1523 0.0648 sec/batch\n", + "Epoch 19/20 Iteration 3315/3560 Training loss: 1.1522 0.0670 sec/batch\n", + "Epoch 19/20 Iteration 3316/3560 Training loss: 1.1521 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3317/3560 Training loss: 1.1521 0.0623 sec/batch\n", + "Epoch 19/20 Iteration 3318/3560 Training loss: 1.1519 0.0644 sec/batch\n", + "Epoch 19/20 Iteration 3319/3560 Training loss: 1.1517 0.0627 sec/batch\n", + "Epoch 19/20 Iteration 3320/3560 Training loss: 1.1514 0.0688 sec/batch\n", + "Epoch 19/20 Iteration 3321/3560 Training loss: 1.1514 0.0615 sec/batch\n", + "Epoch 19/20 Iteration 3322/3560 Training loss: 1.1513 0.0634 sec/batch\n", + "Epoch 19/20 Iteration 3323/3560 Training loss: 1.1513 0.0616 sec/batch\n", + 
"Epoch 19/20 Iteration 3324/3560 Training loss: 1.1513 0.0670 sec/batch\n", + "Epoch 19/20 Iteration 3325/3560 Training loss: 1.1512 0.0611 sec/batch\n", + "Epoch 19/20 Iteration 3326/3560 Training loss: 1.1509 0.0613 sec/batch\n", + "Epoch 19/20 Iteration 3327/3560 Training loss: 1.1506 0.0613 sec/batch\n", + "Epoch 19/20 Iteration 3328/3560 Training loss: 1.1505 0.0666 sec/batch\n", + "Epoch 19/20 Iteration 3329/3560 Training loss: 1.1502 0.0630 sec/batch\n", + "Epoch 19/20 Iteration 3330/3560 Training loss: 1.1499 0.0725 sec/batch\n", + "Epoch 19/20 Iteration 3331/3560 Training loss: 1.1498 0.0630 sec/batch\n", + "Epoch 19/20 Iteration 3332/3560 Training loss: 1.1497 0.0610 sec/batch\n", + "Epoch 19/20 Iteration 3333/3560 Training loss: 1.1495 0.0616 sec/batch\n", + "Epoch 19/20 Iteration 3334/3560 Training loss: 1.1492 0.0640 sec/batch\n", + "Epoch 19/20 Iteration 3335/3560 Training loss: 1.1488 0.0637 sec/batch\n", + "Epoch 19/20 Iteration 3336/3560 Training loss: 1.1487 0.0670 sec/batch\n", + "Epoch 19/20 Iteration 3337/3560 Training loss: 1.1487 0.0648 sec/batch\n", + "Epoch 19/20 Iteration 3338/3560 Training loss: 1.1488 0.0635 sec/batch\n", + "Epoch 19/20 Iteration 3339/3560 Training loss: 1.1487 0.0612 sec/batch\n", + "Epoch 19/20 Iteration 3340/3560 Training loss: 1.1487 0.0758 sec/batch\n", + "Epoch 19/20 Iteration 3341/3560 Training loss: 1.1488 0.0646 sec/batch\n", + "Epoch 19/20 Iteration 3342/3560 Training loss: 1.1489 0.0626 sec/batch\n", + "Epoch 19/20 Iteration 3343/3560 Training loss: 1.1490 0.0643 sec/batch\n", + "Epoch 19/20 Iteration 3344/3560 Training loss: 1.1490 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3345/3560 Training loss: 1.1494 0.0630 sec/batch\n", + "Epoch 19/20 Iteration 3346/3560 Training loss: 1.1495 0.0613 sec/batch\n", + "Epoch 19/20 Iteration 3347/3560 Training loss: 1.1495 0.0663 sec/batch\n", + "Epoch 19/20 Iteration 3348/3560 Training loss: 1.1498 0.0606 sec/batch\n", + "Epoch 19/20 Iteration 3349/3560 Training loss: 
1.1498 0.0806 sec/batch\n", + "Epoch 19/20 Iteration 3350/3560 Training loss: 1.1500 0.0602 sec/batch\n", + "Epoch 19/20 Iteration 3351/3560 Training loss: 1.1501 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3352/3560 Training loss: 1.1503 0.0609 sec/batch\n", + "Epoch 19/20 Iteration 3353/3560 Training loss: 1.1505 0.0681 sec/batch\n", + "Epoch 19/20 Iteration 3354/3560 Training loss: 1.1504 0.0622 sec/batch\n", + "Epoch 19/20 Iteration 3355/3560 Training loss: 1.1501 0.0640 sec/batch\n", + "Epoch 19/20 Iteration 3356/3560 Training loss: 1.1499 0.0667 sec/batch\n", + "Epoch 19/20 Iteration 3357/3560 Training loss: 1.1501 0.0642 sec/batch\n", + "Epoch 19/20 Iteration 3358/3560 Training loss: 1.1501 0.0650 sec/batch\n", + "Epoch 19/20 Iteration 3359/3560 Training loss: 1.1500 0.0661 sec/batch\n", + "Epoch 19/20 Iteration 3360/3560 Training loss: 1.1500 0.0631 sec/batch\n", + "Epoch 19/20 Iteration 3361/3560 Training loss: 1.1499 0.0679 sec/batch\n", + "Epoch 19/20 Iteration 3362/3560 Training loss: 1.1498 0.0630 sec/batch\n", + "Epoch 19/20 Iteration 3363/3560 Training loss: 1.1496 0.0642 sec/batch\n", + "Epoch 19/20 Iteration 3364/3560 Training loss: 1.1497 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3365/3560 Training loss: 1.1498 0.0616 sec/batch\n", + "Epoch 19/20 Iteration 3366/3560 Training loss: 1.1498 0.0611 sec/batch\n", + "Epoch 19/20 Iteration 3367/3560 Training loss: 1.1498 0.0604 sec/batch\n", + "Epoch 19/20 Iteration 3368/3560 Training loss: 1.1498 0.0603 sec/batch\n", + "Epoch 19/20 Iteration 3369/3560 Training loss: 1.1498 0.0631 sec/batch\n", + "Epoch 19/20 Iteration 3370/3560 Training loss: 1.1498 0.0793 sec/batch\n", + "Epoch 19/20 Iteration 3371/3560 Training loss: 1.1500 0.0668 sec/batch\n", + "Epoch 19/20 Iteration 3372/3560 Training loss: 1.1503 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3373/3560 Training loss: 1.1503 0.0656 sec/batch\n", + "Epoch 19/20 Iteration 3374/3560 Training loss: 1.1504 0.0635 sec/batch\n", + "Epoch 19/20 
Iteration 3375/3560 Training loss: 1.1504 0.0612 sec/batch\n", + "Epoch 19/20 Iteration 3376/3560 Training loss: 1.1503 0.0628 sec/batch\n", + "Epoch 19/20 Iteration 3377/3560 Training loss: 1.1505 0.0631 sec/batch\n", + "Epoch 19/20 Iteration 3378/3560 Training loss: 1.1505 0.0606 sec/batch\n", + "Epoch 19/20 Iteration 3379/3560 Training loss: 1.1506 0.0642 sec/batch\n", + "Epoch 19/20 Iteration 3380/3560 Training loss: 1.1505 0.0629 sec/batch\n", + "Epoch 19/20 Iteration 3381/3560 Training loss: 1.1505 0.0645 sec/batch\n", + "Epoch 19/20 Iteration 3382/3560 Training loss: 1.1507 0.0704 sec/batch\n", + "Epoch 20/20 Iteration 3383/3560 Training loss: 1.2754 0.0643 sec/batch\n", + "Epoch 20/20 Iteration 3384/3560 Training loss: 1.2178 0.0642 sec/batch\n", + "Epoch 20/20 Iteration 3385/3560 Training loss: 1.1996 0.0750 sec/batch\n", + "Epoch 20/20 Iteration 3386/3560 Training loss: 1.1924 0.0672 sec/batch\n", + "Epoch 20/20 Iteration 3387/3560 Training loss: 1.1811 0.0608 sec/batch\n", + "Epoch 20/20 Iteration 3388/3560 Training loss: 1.1692 0.0660 sec/batch\n", + "Epoch 20/20 Iteration 3389/3560 Training loss: 1.1670 0.0615 sec/batch\n", + "Epoch 20/20 Iteration 3390/3560 Training loss: 1.1632 0.0621 sec/batch\n", + "Epoch 20/20 Iteration 3391/3560 Training loss: 1.1616 0.0618 sec/batch\n", + "Epoch 20/20 Iteration 3392/3560 Training loss: 1.1598 0.0696 sec/batch\n", + "Epoch 20/20 Iteration 3393/3560 Training loss: 1.1557 0.0625 sec/batch\n", + "Epoch 20/20 Iteration 3394/3560 Training loss: 1.1549 0.0688 sec/batch\n", + "Epoch 20/20 Iteration 3395/3560 Training loss: 1.1548 0.0643 sec/batch\n", + "Epoch 20/20 Iteration 3396/3560 Training loss: 1.1552 0.0628 sec/batch\n", + "Epoch 20/20 Iteration 3397/3560 Training loss: 1.1532 0.0609 sec/batch\n", + "Epoch 20/20 Iteration 3398/3560 Training loss: 1.1514 0.0632 sec/batch\n", + "Epoch 20/20 Iteration 3399/3560 Training loss: 1.1517 0.0616 sec/batch\n", + "Epoch 20/20 Iteration 3400/3560 Training loss: 1.1525 0.0603 
sec/batch\n", + "Epoch 20/20 Iteration 3401/3560 Training loss: 1.1519 0.0677 sec/batch\n", + "Epoch 20/20 Iteration 3402/3560 Training loss: 1.1527 0.0649 sec/batch\n", + "Epoch 20/20 Iteration 3403/3560 Training loss: 1.1519 0.0628 sec/batch\n", + "Epoch 20/20 Iteration 3404/3560 Training loss: 1.1525 0.0681 sec/batch\n", + "Epoch 20/20 Iteration 3405/3560 Training loss: 1.1522 0.0744 sec/batch\n", + "Epoch 20/20 Iteration 3406/3560 Training loss: 1.1523 0.0637 sec/batch\n", + "Epoch 20/20 Iteration 3407/3560 Training loss: 1.1519 0.0672 sec/batch\n", + "Epoch 20/20 Iteration 3408/3560 Training loss: 1.1501 0.0631 sec/batch\n", + "Epoch 20/20 Iteration 3409/3560 Training loss: 1.1492 0.0636 sec/batch\n", + "Epoch 20/20 Iteration 3410/3560 Training loss: 1.1496 0.0744 sec/batch\n", + "Epoch 20/20 Iteration 3411/3560 Training loss: 1.1493 0.0619 sec/batch\n", + "Epoch 20/20 Iteration 3412/3560 Training loss: 1.1499 0.0697 sec/batch\n", + "Epoch 20/20 Iteration 3413/3560 Training loss: 1.1488 0.0630 sec/batch\n", + "Epoch 20/20 Iteration 3414/3560 Training loss: 1.1477 0.0605 sec/batch\n", + "Epoch 20/20 Iteration 3415/3560 Training loss: 1.1474 0.0616 sec/batch\n", + "Epoch 20/20 Iteration 3416/3560 Training loss: 1.1478 0.0620 sec/batch\n", + "Epoch 20/20 Iteration 3417/3560 Training loss: 1.1475 0.0706 sec/batch\n", + "Epoch 20/20 Iteration 3418/3560 Training loss: 1.1470 0.0660 sec/batch\n", + "Epoch 20/20 Iteration 3419/3560 Training loss: 1.1461 0.0635 sec/batch\n", + "Epoch 20/20 Iteration 3420/3560 Training loss: 1.1454 0.0686 sec/batch\n", + "Epoch 20/20 Iteration 3421/3560 Training loss: 1.1443 0.0643 sec/batch\n", + "Epoch 20/20 Iteration 3422/3560 Training loss: 1.1440 0.0613 sec/batch\n", + "Epoch 20/20 Iteration 3423/3560 Training loss: 1.1434 0.0658 sec/batch\n", + "Epoch 20/20 Iteration 3424/3560 Training loss: 1.1445 0.0635 sec/batch\n", + "Epoch 20/20 Iteration 3425/3560 Training loss: 1.1443 0.0723 sec/batch\n", + "Epoch 20/20 Iteration 3426/3560 
Training loss: 1.1436 0.0628 sec/batch\n", + "Epoch 20/20 Iteration 3427/3560 Training loss: 1.1436 0.0738 sec/batch\n", + "Epoch 20/20 Iteration 3428/3560 Training loss: 1.1429 0.0642 sec/batch\n", + "Epoch 20/20 Iteration 3429/3560 Training loss: 1.1424 0.0660 sec/batch\n", + "Epoch 20/20 Iteration 3430/3560 Training loss: 1.1420 0.0642 sec/batch\n", + "Epoch 20/20 Iteration 3431/3560 Training loss: 1.1419 0.0636 sec/batch\n", + "Epoch 20/20 Iteration 3432/3560 Training loss: 1.1422 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3433/3560 Training loss: 1.1417 0.0648 sec/batch\n", + "Epoch 20/20 Iteration 3434/3560 Training loss: 1.1423 0.0601 sec/batch\n", + "Epoch 20/20 Iteration 3435/3560 Training loss: 1.1422 0.0641 sec/batch\n", + "Epoch 20/20 Iteration 3436/3560 Training loss: 1.1423 0.0660 sec/batch\n", + "Epoch 20/20 Iteration 3437/3560 Training loss: 1.1419 0.0625 sec/batch\n", + "Epoch 20/20 Iteration 3438/3560 Training loss: 1.1418 0.0772 sec/batch\n", + "Epoch 20/20 Iteration 3439/3560 Training loss: 1.1419 0.0624 sec/batch\n", + "Epoch 20/20 Iteration 3440/3560 Training loss: 1.1419 0.0615 sec/batch\n", + "Epoch 20/20 Iteration 3441/3560 Training loss: 1.1415 0.0647 sec/batch\n", + "Epoch 20/20 Iteration 3442/3560 Training loss: 1.1422 0.0636 sec/batch\n", + "Epoch 20/20 Iteration 3443/3560 Training loss: 1.1423 0.0655 sec/batch\n", + "Epoch 20/20 Iteration 3444/3560 Training loss: 1.1430 0.0606 sec/batch\n", + "Epoch 20/20 Iteration 3445/3560 Training loss: 1.1432 0.0610 sec/batch\n", + "Epoch 20/20 Iteration 3446/3560 Training loss: 1.1433 0.0695 sec/batch\n", + "Epoch 20/20 Iteration 3447/3560 Training loss: 1.1432 0.0678 sec/batch\n", + "Epoch 20/20 Iteration 3448/3560 Training loss: 1.1433 0.0612 sec/batch\n", + "Epoch 20/20 Iteration 3449/3560 Training loss: 1.1435 0.0649 sec/batch\n", + "Epoch 20/20 Iteration 3450/3560 Training loss: 1.1434 0.0650 sec/batch\n", + "Epoch 20/20 Iteration 3451/3560 Training loss: 1.1437 0.0624 sec/batch\n", + 
"Epoch 20/20 Iteration 3452/3560 Training loss: 1.1437 0.0628 sec/batch\n", + "Epoch 20/20 Iteration 3453/3560 Training loss: 1.1442 0.0635 sec/batch\n", + "Epoch 20/20 Iteration 3454/3560 Training loss: 1.1446 0.0641 sec/batch\n", + "Epoch 20/20 Iteration 3455/3560 Training loss: 1.1450 0.0633 sec/batch\n", + "Epoch 20/20 Iteration 3456/3560 Training loss: 1.1446 0.0635 sec/batch\n", + "Epoch 20/20 Iteration 3457/3560 Training loss: 1.1446 0.0622 sec/batch\n", + "Epoch 20/20 Iteration 3458/3560 Training loss: 1.1449 0.0636 sec/batch\n", + "Epoch 20/20 Iteration 3459/3560 Training loss: 1.1447 0.0674 sec/batch\n", + "Epoch 20/20 Iteration 3460/3560 Training loss: 1.1446 0.0657 sec/batch\n", + "Epoch 20/20 Iteration 3461/3560 Training loss: 1.1441 0.0678 sec/batch\n", + "Epoch 20/20 Iteration 3462/3560 Training loss: 1.1441 0.0619 sec/batch\n", + "Epoch 20/20 Iteration 3463/3560 Training loss: 1.1438 0.0625 sec/batch\n", + "Epoch 20/20 Iteration 3464/3560 Training loss: 1.1437 0.0601 sec/batch\n", + "Epoch 20/20 Iteration 3465/3560 Training loss: 1.1433 0.0643 sec/batch\n", + "Epoch 20/20 Iteration 3466/3560 Training loss: 1.1433 0.0676 sec/batch\n", + "Epoch 20/20 Iteration 3467/3560 Training loss: 1.1430 0.0610 sec/batch\n", + "Epoch 20/20 Iteration 3468/3560 Training loss: 1.1430 0.0648 sec/batch\n", + "Epoch 20/20 Iteration 3469/3560 Training loss: 1.1428 0.0637 sec/batch\n", + "Epoch 20/20 Iteration 3470/3560 Training loss: 1.1426 0.0606 sec/batch\n", + "Epoch 20/20 Iteration 3471/3560 Training loss: 1.1422 0.0695 sec/batch\n", + "Epoch 20/20 Iteration 3472/3560 Training loss: 1.1422 0.0681 sec/batch\n", + "Epoch 20/20 Iteration 3473/3560 Training loss: 1.1420 0.0668 sec/batch\n", + "Epoch 20/20 Iteration 3474/3560 Training loss: 1.1421 0.0644 sec/batch\n", + "Epoch 20/20 Iteration 3475/3560 Training loss: 1.1417 0.0626 sec/batch\n", + "Epoch 20/20 Iteration 3476/3560 Training loss: 1.1413 0.0670 sec/batch\n", + "Epoch 20/20 Iteration 3477/3560 Training loss: 
1.1412 0.0708 sec/batch\n", + "Epoch 20/20 Iteration 3478/3560 Training loss: 1.1412 0.0628 sec/batch\n", + "Epoch 20/20 Iteration 3479/3560 Training loss: 1.1413 0.0674 sec/batch\n", + "Epoch 20/20 Iteration 3480/3560 Training loss: 1.1410 0.0623 sec/batch\n", + "Epoch 20/20 Iteration 3481/3560 Training loss: 1.1406 0.0675 sec/batch\n", + "Epoch 20/20 Iteration 3482/3560 Training loss: 1.1405 0.0627 sec/batch\n", + "Epoch 20/20 Iteration 3483/3560 Training loss: 1.1404 0.0602 sec/batch\n", + "Epoch 20/20 Iteration 3484/3560 Training loss: 1.1403 0.0707 sec/batch\n", + "Epoch 20/20 Iteration 3485/3560 Training loss: 1.1404 0.0633 sec/batch\n", + "Epoch 20/20 Iteration 3486/3560 Training loss: 1.1403 0.0647 sec/batch\n", + "Epoch 20/20 Iteration 3487/3560 Training loss: 1.1402 0.0604 sec/batch\n", + "Epoch 20/20 Iteration 3488/3560 Training loss: 1.1401 0.0659 sec/batch\n", + "Epoch 20/20 Iteration 3489/3560 Training loss: 1.1400 0.0685 sec/batch\n", + "Epoch 20/20 Iteration 3490/3560 Training loss: 1.1400 0.0617 sec/batch\n", + "Epoch 20/20 Iteration 3491/3560 Training loss: 1.1400 0.0628 sec/batch\n", + "Epoch 20/20 Iteration 3492/3560 Training loss: 1.1401 0.0605 sec/batch\n", + "Epoch 20/20 Iteration 3493/3560 Training loss: 1.1399 0.0662 sec/batch\n", + "Epoch 20/20 Iteration 3494/3560 Training loss: 1.1400 0.0643 sec/batch\n", + "Epoch 20/20 Iteration 3495/3560 Training loss: 1.1401 0.0651 sec/batch\n", + "Epoch 20/20 Iteration 3496/3560 Training loss: 1.1400 0.0610 sec/batch\n", + "Epoch 20/20 Iteration 3497/3560 Training loss: 1.1398 0.0631 sec/batch\n", + "Epoch 20/20 Iteration 3498/3560 Training loss: 1.1394 0.0653 sec/batch\n", + "Epoch 20/20 Iteration 3499/3560 Training loss: 1.1395 0.0659 sec/batch\n", + "Epoch 20/20 Iteration 3500/3560 Training loss: 1.1395 0.0744 sec/batch\n", + "Epoch 20/20 Iteration 3501/3560 Training loss: 1.1395 0.0635 sec/batch\n", + "Epoch 20/20 Iteration 3502/3560 Training loss: 1.1395 0.0613 sec/batch\n", + "Epoch 20/20 
Iteration 3503/3560 Training loss: 1.1394 0.0645 sec/batch\n", + "Epoch 20/20 Iteration 3504/3560 Training loss: 1.1391 0.0629 sec/batch\n", + "Epoch 20/20 Iteration 3505/3560 Training loss: 1.1389 0.0707 sec/batch\n", + "Epoch 20/20 Iteration 3506/3560 Training loss: 1.1388 0.0645 sec/batch\n", + "Epoch 20/20 Iteration 3507/3560 Training loss: 1.1387 0.0679 sec/batch\n", + "Epoch 20/20 Iteration 3508/3560 Training loss: 1.1384 0.0680 sec/batch\n", + "Epoch 20/20 Iteration 3509/3560 Training loss: 1.1384 0.0617 sec/batch\n", + "Epoch 20/20 Iteration 3510/3560 Training loss: 1.1383 0.0631 sec/batch\n", + "Epoch 20/20 Iteration 3511/3560 Training loss: 1.1381 0.0619 sec/batch\n", + "Epoch 20/20 Iteration 3512/3560 Training loss: 1.1378 0.0629 sec/batch\n", + "Epoch 20/20 Iteration 3513/3560 Training loss: 1.1374 0.0692 sec/batch\n", + "Epoch 20/20 Iteration 3514/3560 Training loss: 1.1373 0.0608 sec/batch\n", + "Epoch 20/20 Iteration 3515/3560 Training loss: 1.1374 0.0641 sec/batch\n", + "Epoch 20/20 Iteration 3516/3560 Training loss: 1.1374 0.0634 sec/batch\n", + "Epoch 20/20 Iteration 3517/3560 Training loss: 1.1374 0.0635 sec/batch\n", + "Epoch 20/20 Iteration 3518/3560 Training loss: 1.1373 0.0640 sec/batch\n", + "Epoch 20/20 Iteration 3519/3560 Training loss: 1.1374 0.0640 sec/batch\n", + "Epoch 20/20 Iteration 3520/3560 Training loss: 1.1375 0.0635 sec/batch\n", + "Epoch 20/20 Iteration 3521/3560 Training loss: 1.1376 0.0649 sec/batch\n", + "Epoch 20/20 Iteration 3522/3560 Training loss: 1.1376 0.0636 sec/batch\n", + "Epoch 20/20 Iteration 3523/3560 Training loss: 1.1380 0.0624 sec/batch\n", + "Epoch 20/20 Iteration 3524/3560 Training loss: 1.1381 0.0641 sec/batch\n", + "Epoch 20/20 Iteration 3525/3560 Training loss: 1.1381 0.0620 sec/batch\n", + "Epoch 20/20 Iteration 3526/3560 Training loss: 1.1383 0.0616 sec/batch\n", + "Epoch 20/20 Iteration 3527/3560 Training loss: 1.1383 0.0619 sec/batch\n", + "Epoch 20/20 Iteration 3528/3560 Training loss: 1.1386 0.0611 
sec/batch\n", + "Epoch 20/20 Iteration 3529/3560 Training loss: 1.1387 0.0639 sec/batch\n", + "[... output condensed: Epoch 20/20 loss holds near 1.139 through Iteration 3560/3560 at ~0.06 sec/batch; a second run then begins at Epoch 1/20 Iteration 1/3560 with loss 4.4207, decreasing steadily to 2.0400 by Epoch 3/20 Iteration 520/3560 ...]\n", + "Epoch 
3/20 Iteration 521/3560 Training loss: 2.0396 0.0673 sec/batch\n", + "Epoch 3/20 Iteration 522/3560 Training loss: 2.0392 0.0619 sec/batch\n", + "Epoch 3/20 Iteration 523/3560 Training loss: 2.0390 0.0685 sec/batch\n", + "Epoch 3/20 Iteration 524/3560 Training loss: 2.0389 0.0647 sec/batch\n", + "Epoch 3/20 Iteration 525/3560 Training loss: 2.0385 0.0611 sec/batch\n", + "Epoch 3/20 Iteration 526/3560 Training loss: 2.0381 0.0635 sec/batch\n", + "Epoch 3/20 Iteration 527/3560 Training loss: 2.0377 0.0601 sec/batch\n", + "Epoch 3/20 Iteration 528/3560 Training loss: 2.0372 0.0644 sec/batch\n", + "Epoch 3/20 Iteration 529/3560 Training loss: 2.0370 0.0668 sec/batch\n", + "Epoch 3/20 Iteration 530/3560 Training loss: 2.0367 0.0604 sec/batch\n", + "Epoch 3/20 Iteration 531/3560 Training loss: 2.0364 0.0648 sec/batch\n", + "Epoch 3/20 Iteration 532/3560 Training loss: 2.0361 0.0656 sec/batch\n", + "Epoch 3/20 Iteration 533/3560 Training loss: 2.0356 0.0747 sec/batch\n", + "Epoch 3/20 Iteration 534/3560 Training loss: 2.0353 0.0607 sec/batch\n", + "Epoch 4/20 Iteration 535/3560 Training loss: 2.0641 0.0646 sec/batch\n", + "Epoch 4/20 Iteration 536/3560 Training loss: 2.0075 0.0592 sec/batch\n", + "Epoch 4/20 Iteration 537/3560 Training loss: 1.9913 0.0640 sec/batch\n", + "Epoch 4/20 Iteration 538/3560 Training loss: 1.9824 0.0659 sec/batch\n", + "Epoch 4/20 Iteration 539/3560 Training loss: 1.9769 0.0628 sec/batch\n", + "Epoch 4/20 Iteration 540/3560 Training loss: 1.9671 0.0616 sec/batch\n", + "Epoch 4/20 Iteration 541/3560 Training loss: 1.9673 0.0640 sec/batch\n", + "Epoch 4/20 Iteration 542/3560 Training loss: 1.9664 0.0624 sec/batch\n", + "Epoch 4/20 Iteration 543/3560 Training loss: 1.9679 0.0590 sec/batch\n", + "Epoch 4/20 Iteration 544/3560 Training loss: 1.9664 0.0611 sec/batch\n", + "Epoch 4/20 Iteration 545/3560 Training loss: 1.9636 0.0628 sec/batch\n", + "Epoch 4/20 Iteration 546/3560 Training loss: 1.9610 0.0612 sec/batch\n", + "Epoch 4/20 Iteration 547/3560 
Training loss: 1.9611 0.0690 sec/batch\n", + "Epoch 4/20 Iteration 548/3560 Training loss: 1.9624 0.0592 sec/batch\n", + "Epoch 4/20 Iteration 549/3560 Training loss: 1.9619 0.0646 sec/batch\n", + "Epoch 4/20 Iteration 550/3560 Training loss: 1.9605 0.0641 sec/batch\n", + "Epoch 4/20 Iteration 551/3560 Training loss: 1.9597 0.0736 sec/batch\n", + "Epoch 4/20 Iteration 552/3560 Training loss: 1.9619 0.0634 sec/batch\n", + "Epoch 4/20 Iteration 553/3560 Training loss: 1.9616 0.0609 sec/batch\n", + "Epoch 4/20 Iteration 554/3560 Training loss: 1.9618 0.0628 sec/batch\n", + "Epoch 4/20 Iteration 555/3560 Training loss: 1.9613 0.0641 sec/batch\n", + "Epoch 4/20 Iteration 556/3560 Training loss: 1.9621 0.0612 sec/batch\n", + "Epoch 4/20 Iteration 557/3560 Training loss: 1.9611 0.0613 sec/batch\n", + "Epoch 4/20 Iteration 558/3560 Training loss: 1.9603 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 559/3560 Training loss: 1.9594 0.0691 sec/batch\n", + "Epoch 4/20 Iteration 560/3560 Training loss: 1.9579 0.0652 sec/batch\n", + "Epoch 4/20 Iteration 561/3560 Training loss: 1.9567 0.0623 sec/batch\n", + "Epoch 4/20 Iteration 562/3560 Training loss: 1.9568 0.0630 sec/batch\n", + "Epoch 4/20 Iteration 563/3560 Training loss: 1.9572 0.0625 sec/batch\n", + "Epoch 4/20 Iteration 564/3560 Training loss: 1.9569 0.0696 sec/batch\n", + "Epoch 4/20 Iteration 565/3560 Training loss: 1.9563 0.0621 sec/batch\n", + "Epoch 4/20 Iteration 566/3560 Training loss: 1.9555 0.0645 sec/batch\n", + "Epoch 4/20 Iteration 567/3560 Training loss: 1.9551 0.0642 sec/batch\n", + "Epoch 4/20 Iteration 568/3560 Training loss: 1.9555 0.0624 sec/batch\n", + "Epoch 4/20 Iteration 569/3560 Training loss: 1.9549 0.0604 sec/batch\n", + "Epoch 4/20 Iteration 570/3560 Training loss: 1.9541 0.0611 sec/batch\n", + "Epoch 4/20 Iteration 571/3560 Training loss: 1.9536 0.0630 sec/batch\n", + "Epoch 4/20 Iteration 572/3560 Training loss: 1.9520 0.0614 sec/batch\n", + "Epoch 4/20 Iteration 573/3560 Training loss: 1.9507 
0.0617 sec/batch\n", + "Epoch 4/20 Iteration 574/3560 Training loss: 1.9498 0.0634 sec/batch\n", + "Epoch 4/20 Iteration 575/3560 Training loss: 1.9491 0.0619 sec/batch\n", + "Epoch 4/20 Iteration 576/3560 Training loss: 1.9490 0.0651 sec/batch\n", + "Epoch 4/20 Iteration 577/3560 Training loss: 1.9482 0.0612 sec/batch\n", + "Epoch 4/20 Iteration 578/3560 Training loss: 1.9474 0.0611 sec/batch\n", + "Epoch 4/20 Iteration 579/3560 Training loss: 1.9472 0.0658 sec/batch\n", + "Epoch 4/20 Iteration 580/3560 Training loss: 1.9456 0.0630 sec/batch\n", + "Epoch 4/20 Iteration 581/3560 Training loss: 1.9453 0.0658 sec/batch\n", + "Epoch 4/20 Iteration 582/3560 Training loss: 1.9445 0.0615 sec/batch\n", + "Epoch 4/20 Iteration 583/3560 Training loss: 1.9441 0.0637 sec/batch\n", + "Epoch 4/20 Iteration 584/3560 Training loss: 1.9446 0.0636 sec/batch\n", + "Epoch 4/20 Iteration 585/3560 Training loss: 1.9437 0.0610 sec/batch\n", + "Epoch 4/20 Iteration 586/3560 Training loss: 1.9441 0.0692 sec/batch\n", + "Epoch 4/20 Iteration 587/3560 Training loss: 1.9436 0.0603 sec/batch\n", + "Epoch 4/20 Iteration 588/3560 Training loss: 1.9433 0.0623 sec/batch\n", + "Epoch 4/20 Iteration 589/3560 Training loss: 1.9425 0.0634 sec/batch\n", + "Epoch 4/20 Iteration 590/3560 Training loss: 1.9425 0.0616 sec/batch\n", + "Epoch 4/20 Iteration 591/3560 Training loss: 1.9424 0.0625 sec/batch\n", + "Epoch 4/20 Iteration 592/3560 Training loss: 1.9419 0.0633 sec/batch\n", + "Epoch 4/20 Iteration 593/3560 Training loss: 1.9413 0.0663 sec/batch\n", + "Epoch 4/20 Iteration 594/3560 Training loss: 1.9416 0.0655 sec/batch\n", + "Epoch 4/20 Iteration 595/3560 Training loss: 1.9412 0.0687 sec/batch\n", + "Epoch 4/20 Iteration 596/3560 Training loss: 1.9415 0.0805 sec/batch\n", + "Epoch 4/20 Iteration 597/3560 Training loss: 1.9415 0.0589 sec/batch\n", + "Epoch 4/20 Iteration 598/3560 Training loss: 1.9416 0.0639 sec/batch\n", + "Epoch 4/20 Iteration 599/3560 Training loss: 1.9411 0.0621 sec/batch\n", + 
"Epoch 4/20 Iteration 600/3560 Training loss: 1.9412 0.0621 sec/batch\n", + "Epoch 4/20 Iteration 601/3560 Training loss: 1.9411 0.0682 sec/batch\n", + "Epoch 4/20 Iteration 602/3560 Training loss: 1.9405 0.0589 sec/batch\n", + "Epoch 4/20 Iteration 603/3560 Training loss: 1.9400 0.0624 sec/batch\n", + "Epoch 4/20 Iteration 604/3560 Training loss: 1.9396 0.0620 sec/batch\n", + "Epoch 4/20 Iteration 605/3560 Training loss: 1.9398 0.0630 sec/batch\n", + "Epoch 4/20 Iteration 606/3560 Training loss: 1.9396 0.0657 sec/batch\n", + "Epoch 4/20 Iteration 607/3560 Training loss: 1.9398 0.0726 sec/batch\n", + "Epoch 4/20 Iteration 608/3560 Training loss: 1.9392 0.0640 sec/batch\n", + "Epoch 4/20 Iteration 609/3560 Training loss: 1.9387 0.0651 sec/batch\n", + "Epoch 4/20 Iteration 610/3560 Training loss: 1.9388 0.0650 sec/batch\n", + "Epoch 4/20 Iteration 611/3560 Training loss: 1.9384 0.0650 sec/batch\n", + "Epoch 4/20 Iteration 612/3560 Training loss: 1.9383 0.0646 sec/batch\n", + "Epoch 4/20 Iteration 613/3560 Training loss: 1.9376 0.0637 sec/batch\n", + "Epoch 4/20 Iteration 614/3560 Training loss: 1.9372 0.0679 sec/batch\n", + "Epoch 4/20 Iteration 615/3560 Training loss: 1.9364 0.0706 sec/batch\n", + "Epoch 4/20 Iteration 616/3560 Training loss: 1.9363 0.0661 sec/batch\n", + "Epoch 4/20 Iteration 617/3560 Training loss: 1.9354 0.0657 sec/batch\n", + "Epoch 4/20 Iteration 618/3560 Training loss: 1.9350 0.0653 sec/batch\n", + "Epoch 4/20 Iteration 619/3560 Training loss: 1.9342 0.0619 sec/batch\n", + "Epoch 4/20 Iteration 620/3560 Training loss: 1.9336 0.0684 sec/batch\n", + "Epoch 4/20 Iteration 621/3560 Training loss: 1.9331 0.0652 sec/batch\n", + "Epoch 4/20 Iteration 622/3560 Training loss: 1.9326 0.0639 sec/batch\n", + "Epoch 4/20 Iteration 623/3560 Training loss: 1.9318 0.0603 sec/batch\n", + "Epoch 4/20 Iteration 624/3560 Training loss: 1.9316 0.0667 sec/batch\n", + "Epoch 4/20 Iteration 625/3560 Training loss: 1.9311 0.0691 sec/batch\n", + "Epoch 4/20 Iteration 
626/3560 Training loss: 1.9306 0.0622 sec/batch\n", + "Epoch 4/20 Iteration 627/3560 Training loss: 1.9298 0.0640 sec/batch\n", + "Epoch 4/20 Iteration 628/3560 Training loss: 1.9292 0.0706 sec/batch\n", + "Epoch 4/20 Iteration 629/3560 Training loss: 1.9286 0.0736 sec/batch\n", + "Epoch 4/20 Iteration 630/3560 Training loss: 1.9281 0.0648 sec/batch\n", + "Epoch 4/20 Iteration 631/3560 Training loss: 1.9277 0.0635 sec/batch\n", + "Epoch 4/20 Iteration 632/3560 Training loss: 1.9271 0.0628 sec/batch\n", + "Epoch 4/20 Iteration 633/3560 Training loss: 1.9264 0.0621 sec/batch\n", + "Epoch 4/20 Iteration 634/3560 Training loss: 1.9256 0.0604 sec/batch\n", + "Epoch 4/20 Iteration 635/3560 Training loss: 1.9253 0.0630 sec/batch\n", + "Epoch 4/20 Iteration 636/3560 Training loss: 1.9249 0.0657 sec/batch\n", + "Epoch 4/20 Iteration 637/3560 Training loss: 1.9243 0.0625 sec/batch\n", + "Epoch 4/20 Iteration 638/3560 Training loss: 1.9239 0.0640 sec/batch\n", + "Epoch 4/20 Iteration 639/3560 Training loss: 1.9233 0.0634 sec/batch\n", + "Epoch 4/20 Iteration 640/3560 Training loss: 1.9228 0.0625 sec/batch\n", + "Epoch 4/20 Iteration 641/3560 Training loss: 1.9224 0.0606 sec/batch\n", + "Epoch 4/20 Iteration 642/3560 Training loss: 1.9221 0.0635 sec/batch\n", + "Epoch 4/20 Iteration 643/3560 Training loss: 1.9219 0.0608 sec/batch\n", + "Epoch 4/20 Iteration 644/3560 Training loss: 1.9216 0.0603 sec/batch\n", + "Epoch 4/20 Iteration 645/3560 Training loss: 1.9213 0.0615 sec/batch\n", + "Epoch 4/20 Iteration 646/3560 Training loss: 1.9208 0.0746 sec/batch\n", + "Epoch 4/20 Iteration 647/3560 Training loss: 1.9204 0.0653 sec/batch\n", + "Epoch 4/20 Iteration 648/3560 Training loss: 1.9200 0.0626 sec/batch\n", + "Epoch 4/20 Iteration 649/3560 Training loss: 1.9195 0.0637 sec/batch\n", + "Epoch 4/20 Iteration 650/3560 Training loss: 1.9189 0.0617 sec/batch\n", + "Epoch 4/20 Iteration 651/3560 Training loss: 1.9185 0.0603 sec/batch\n", + "Epoch 4/20 Iteration 652/3560 Training loss: 
1.9181 0.0689 sec/batch\n", + "Epoch 4/20 Iteration 653/3560 Training loss: 1.9178 0.0657 sec/batch\n", + "Epoch 4/20 Iteration 654/3560 Training loss: 1.9175 0.0631 sec/batch\n", + "Epoch 4/20 Iteration 655/3560 Training loss: 1.9172 0.0625 sec/batch\n", + "Epoch 4/20 Iteration 656/3560 Training loss: 1.9166 0.0622 sec/batch\n", + "Epoch 4/20 Iteration 657/3560 Training loss: 1.9160 0.0590 sec/batch\n", + "Epoch 4/20 Iteration 658/3560 Training loss: 1.9158 0.0659 sec/batch\n", + "Epoch 4/20 Iteration 659/3560 Training loss: 1.9155 0.0679 sec/batch\n", + "Epoch 4/20 Iteration 660/3560 Training loss: 1.9149 0.0655 sec/batch\n", + "Epoch 4/20 Iteration 661/3560 Training loss: 1.9147 0.0608 sec/batch\n", + "Epoch 4/20 Iteration 662/3560 Training loss: 1.9145 0.0608 sec/batch\n", + "Epoch 4/20 Iteration 663/3560 Training loss: 1.9141 0.0659 sec/batch\n", + "Epoch 4/20 Iteration 664/3560 Training loss: 1.9138 0.0686 sec/batch\n", + "Epoch 4/20 Iteration 665/3560 Training loss: 1.9133 0.0620 sec/batch\n", + "Epoch 4/20 Iteration 666/3560 Training loss: 1.9127 0.0613 sec/batch\n", + "Epoch 4/20 Iteration 667/3560 Training loss: 1.9125 0.0671 sec/batch\n", + "Epoch 4/20 Iteration 668/3560 Training loss: 1.9122 0.0609 sec/batch\n", + "Epoch 4/20 Iteration 669/3560 Training loss: 1.9119 0.0678 sec/batch\n", + "Epoch 4/20 Iteration 670/3560 Training loss: 1.9117 0.0647 sec/batch\n", + "Epoch 4/20 Iteration 671/3560 Training loss: 1.9116 0.0657 sec/batch\n", + "Epoch 4/20 Iteration 672/3560 Training loss: 1.9114 0.0653 sec/batch\n", + "Epoch 4/20 Iteration 673/3560 Training loss: 1.9114 0.0602 sec/batch\n", + "Epoch 4/20 Iteration 674/3560 Training loss: 1.9110 0.0720 sec/batch\n", + "Epoch 4/20 Iteration 675/3560 Training loss: 1.9109 0.0605 sec/batch\n", + "Epoch 4/20 Iteration 676/3560 Training loss: 1.9106 0.0607 sec/batch\n", + "Epoch 4/20 Iteration 677/3560 Training loss: 1.9103 0.0605 sec/batch\n", + "Epoch 4/20 Iteration 678/3560 Training loss: 1.9101 0.0668 
sec/batch\n", + "Epoch 4/20 Iteration 679/3560 Training loss: 1.9097 0.0642 sec/batch\n", + "Epoch 4/20 Iteration 680/3560 Training loss: 1.9095 0.0679 sec/batch\n", + "Epoch 4/20 Iteration 681/3560 Training loss: 1.9093 0.0607 sec/batch\n", + "Epoch 4/20 Iteration 682/3560 Training loss: 1.9093 0.0626 sec/batch\n", + "Epoch 4/20 Iteration 683/3560 Training loss: 1.9090 0.0682 sec/batch\n", + "Epoch 4/20 Iteration 684/3560 Training loss: 1.9087 0.0628 sec/batch\n", + "Epoch 4/20 Iteration 685/3560 Training loss: 1.9082 0.0604 sec/batch\n", + "Epoch 4/20 Iteration 686/3560 Training loss: 1.9081 0.0641 sec/batch\n", + "Epoch 4/20 Iteration 687/3560 Training loss: 1.9080 0.0672 sec/batch\n", + "Epoch 4/20 Iteration 688/3560 Training loss: 1.9077 0.0660 sec/batch\n", + "Epoch 4/20 Iteration 689/3560 Training loss: 1.9075 0.0631 sec/batch\n", + "Epoch 4/20 Iteration 690/3560 Training loss: 1.9072 0.0602 sec/batch\n", + "Epoch 4/20 Iteration 691/3560 Training loss: 1.9069 0.0620 sec/batch\n", + "Epoch 4/20 Iteration 692/3560 Training loss: 1.9066 0.0647 sec/batch\n", + "Epoch 4/20 Iteration 693/3560 Training loss: 1.9062 0.0631 sec/batch\n", + "Epoch 4/20 Iteration 694/3560 Training loss: 1.9061 0.0595 sec/batch\n", + "Epoch 4/20 Iteration 695/3560 Training loss: 1.9060 0.0635 sec/batch\n", + "Epoch 4/20 Iteration 696/3560 Training loss: 1.9057 0.0591 sec/batch\n", + "Epoch 4/20 Iteration 697/3560 Training loss: 1.9055 0.0670 sec/batch\n", + "Epoch 4/20 Iteration 698/3560 Training loss: 1.9053 0.0651 sec/batch\n", + "Epoch 4/20 Iteration 699/3560 Training loss: 1.9050 0.0595 sec/batch\n", + "Epoch 4/20 Iteration 700/3560 Training loss: 1.9046 0.0623 sec/batch\n", + "Epoch 4/20 Iteration 701/3560 Training loss: 1.9045 0.0601 sec/batch\n", + "Epoch 4/20 Iteration 702/3560 Training loss: 1.9046 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 703/3560 Training loss: 1.9043 0.0591 sec/batch\n", + "Epoch 4/20 Iteration 704/3560 Training loss: 1.9040 0.0602 sec/batch\n", + "Epoch 
4/20 Iteration 705/3560 Training loss: 1.9036 0.0607 sec/batch\n", + "Epoch 4/20 Iteration 706/3560 Training loss: 1.9032 0.0609 sec/batch\n", + "Epoch 4/20 Iteration 707/3560 Training loss: 1.9030 0.0628 sec/batch\n", + "Epoch 4/20 Iteration 708/3560 Training loss: 1.9027 0.0635 sec/batch\n", + "Epoch 4/20 Iteration 709/3560 Training loss: 1.9025 0.0623 sec/batch\n", + "Epoch 4/20 Iteration 710/3560 Training loss: 1.9022 0.0627 sec/batch\n", + "Epoch 4/20 Iteration 711/3560 Training loss: 1.9018 0.0624 sec/batch\n", + "Epoch 4/20 Iteration 712/3560 Training loss: 1.9016 0.0689 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 1.9429 0.0700 sec/batch\n", + "Epoch 5/20 Iteration 714/3560 Training loss: 1.8913 0.0670 sec/batch\n", + "Epoch 5/20 Iteration 715/3560 Training loss: 1.8754 0.0680 sec/batch\n", + "Epoch 5/20 Iteration 716/3560 Training loss: 1.8665 0.0619 sec/batch\n", + "Epoch 5/20 Iteration 717/3560 Training loss: 1.8602 0.0658 sec/batch\n", + "Epoch 5/20 Iteration 718/3560 Training loss: 1.8495 0.0594 sec/batch\n", + "Epoch 5/20 Iteration 719/3560 Training loss: 1.8496 0.0630 sec/batch\n", + "Epoch 5/20 Iteration 720/3560 Training loss: 1.8489 0.0658 sec/batch\n", + "Epoch 5/20 Iteration 721/3560 Training loss: 1.8508 0.0651 sec/batch\n", + "Epoch 5/20 Iteration 722/3560 Training loss: 1.8497 0.0591 sec/batch\n", + "Epoch 5/20 Iteration 723/3560 Training loss: 1.8470 0.0650 sec/batch\n", + "Epoch 5/20 Iteration 724/3560 Training loss: 1.8440 0.0649 sec/batch\n", + "Epoch 5/20 Iteration 725/3560 Training loss: 1.8447 0.0666 sec/batch\n", + "Epoch 5/20 Iteration 726/3560 Training loss: 1.8466 0.0609 sec/batch\n", + "Epoch 5/20 Iteration 727/3560 Training loss: 1.8461 0.0647 sec/batch\n", + "Epoch 5/20 Iteration 728/3560 Training loss: 1.8446 0.0644 sec/batch\n", + "Epoch 5/20 Iteration 729/3560 Training loss: 1.8445 0.0653 sec/batch\n", + "Epoch 5/20 Iteration 730/3560 Training loss: 1.8465 0.0668 sec/batch\n", + "Epoch 5/20 Iteration 731/3560 
Training loss: 1.8466 0.0713 sec/batch\n", + "Epoch 5/20 Iteration 732/3560 Training loss: 1.8472 0.0660 sec/batch\n", + "Epoch 5/20 Iteration 733/3560 Training loss: 1.8461 0.0625 sec/batch\n", + "Epoch 5/20 Iteration 734/3560 Training loss: 1.8468 0.0641 sec/batch\n", + "Epoch 5/20 Iteration 735/3560 Training loss: 1.8462 0.0605 sec/batch\n", + "Epoch 5/20 Iteration 736/3560 Training loss: 1.8458 0.0616 sec/batch\n", + "Epoch 5/20 Iteration 737/3560 Training loss: 1.8456 0.0654 sec/batch\n", + "Epoch 5/20 Iteration 738/3560 Training loss: 1.8440 0.0640 sec/batch\n", + "Epoch 5/20 Iteration 739/3560 Training loss: 1.8424 0.0660 sec/batch\n", + "Epoch 5/20 Iteration 740/3560 Training loss: 1.8427 0.0595 sec/batch\n", + "Epoch 5/20 Iteration 741/3560 Training loss: 1.8435 0.0632 sec/batch\n", + "Epoch 5/20 Iteration 742/3560 Training loss: 1.8437 0.0623 sec/batch\n", + "Epoch 5/20 Iteration 743/3560 Training loss: 1.8431 0.0592 sec/batch\n", + "Epoch 5/20 Iteration 744/3560 Training loss: 1.8420 0.0628 sec/batch\n", + "Epoch 5/20 Iteration 745/3560 Training loss: 1.8419 0.0614 sec/batch\n", + "Epoch 5/20 Iteration 746/3560 Training loss: 1.8423 0.0628 sec/batch\n", + "Epoch 5/20 Iteration 747/3560 Training loss: 1.8417 0.0598 sec/batch\n", + "Epoch 5/20 Iteration 748/3560 Training loss: 1.8412 0.0611 sec/batch\n", + "Epoch 5/20 Iteration 749/3560 Training loss: 1.8404 0.0671 sec/batch\n", + "Epoch 5/20 Iteration 750/3560 Training loss: 1.8389 0.0629 sec/batch\n", + "Epoch 5/20 Iteration 751/3560 Training loss: 1.8373 0.0624 sec/batch\n", + "Epoch 5/20 Iteration 752/3560 Training loss: 1.8362 0.0608 sec/batch\n", + "Epoch 5/20 Iteration 753/3560 Training loss: 1.8355 0.0611 sec/batch\n", + "Epoch 5/20 Iteration 754/3560 Training loss: 1.8355 0.0648 sec/batch\n", + "Epoch 5/20 Iteration 755/3560 Training loss: 1.8347 0.0621 sec/batch\n", + "Epoch 5/20 Iteration 756/3560 Training loss: 1.8338 0.0599 sec/batch\n", + "Epoch 5/20 Iteration 757/3560 Training loss: 1.8338 
0.0637 sec/batch\n", + "Epoch 5/20 Iteration 758/3560 Training loss: 1.8322 0.0598 sec/batch\n", + "Epoch 5/20 Iteration 759/3560 Training loss: 1.8316 0.0627 sec/batch\n", + "Epoch 5/20 Iteration 760/3560 Training loss: 1.8308 0.0636 sec/batch\n", + "Epoch 5/20 Iteration 761/3560 Training loss: 1.8302 0.0664 sec/batch\n", + "Epoch 5/20 Iteration 762/3560 Training loss: 1.8308 0.0661 sec/batch\n", + "Epoch 5/20 Iteration 763/3560 Training loss: 1.8300 0.0642 sec/batch\n", + "Epoch 5/20 Iteration 764/3560 Training loss: 1.8304 0.0650 sec/batch\n", + "Epoch 5/20 Iteration 765/3560 Training loss: 1.8299 0.0695 sec/batch\n", + "Epoch 5/20 Iteration 766/3560 Training loss: 1.8297 0.0624 sec/batch\n", + "Epoch 5/20 Iteration 767/3560 Training loss: 1.8292 0.0645 sec/batch\n", + "Epoch 5/20 Iteration 768/3560 Training loss: 1.8292 0.0636 sec/batch\n", + "Epoch 5/20 Iteration 769/3560 Training loss: 1.8291 0.0623 sec/batch\n", + "Epoch 5/20 Iteration 770/3560 Training loss: 1.8285 0.0635 sec/batch\n", + "Epoch 5/20 Iteration 771/3560 Training loss: 1.8277 0.0731 sec/batch\n", + "Epoch 5/20 Iteration 772/3560 Training loss: 1.8283 0.0651 sec/batch\n", + "Epoch 5/20 Iteration 773/3560 Training loss: 1.8281 0.0627 sec/batch\n", + "Epoch 5/20 Iteration 774/3560 Training loss: 1.8286 0.0644 sec/batch\n", + "Epoch 5/20 Iteration 775/3560 Training loss: 1.8286 0.0663 sec/batch\n", + "Epoch 5/20 Iteration 776/3560 Training loss: 1.8287 0.0633 sec/batch\n", + "Epoch 5/20 Iteration 777/3560 Training loss: 1.8283 0.0612 sec/batch\n", + "Epoch 5/20 Iteration 778/3560 Training loss: 1.8284 0.0607 sec/batch\n", + "Epoch 5/20 Iteration 779/3560 Training loss: 1.8282 0.0654 sec/batch\n", + "Epoch 5/20 Iteration 780/3560 Training loss: 1.8277 0.0596 sec/batch\n", + "Epoch 5/20 Iteration 781/3560 Training loss: 1.8273 0.0598 sec/batch\n", + "Epoch 5/20 Iteration 782/3560 Training loss: 1.8270 0.0602 sec/batch\n", + "Epoch 5/20 Iteration 783/3560 Training loss: 1.8272 0.0663 sec/batch\n", + 
"Epoch 5/20 Iteration 784/3560 Training loss: 1.8270 0.0631 sec/batch\n", + "Epoch 5/20 Iteration 785/3560 Training loss: 1.8273 0.0595 sec/batch\n", + "Epoch 5/20 Iteration 786/3560 Training loss: 1.8267 0.0706 sec/batch\n", + "Epoch 5/20 Iteration 787/3560 Training loss: 1.8262 0.0708 sec/batch\n", + "Epoch 5/20 Iteration 788/3560 Training loss: 1.8263 0.0633 sec/batch\n", + "Epoch 5/20 Iteration 789/3560 Training loss: 1.8258 0.0613 sec/batch\n", + "Epoch 5/20 Iteration 790/3560 Training loss: 1.8256 0.0587 sec/batch\n", + "Epoch 5/20 Iteration 791/3560 Training loss: 1.8248 0.0740 sec/batch\n", + "Epoch 5/20 Iteration 792/3560 Training loss: 1.8244 0.0655 sec/batch\n", + "Epoch 5/20 Iteration 793/3560 Training loss: 1.8237 0.0608 sec/batch\n", + "Epoch 5/20 Iteration 794/3560 Training loss: 1.8235 0.0624 sec/batch\n", + "Epoch 5/20 Iteration 795/3560 Training loss: 1.8227 0.0645 sec/batch\n", + "Epoch 5/20 Iteration 796/3560 Training loss: 1.8224 0.0617 sec/batch\n", + "Epoch 5/20 Iteration 797/3560 Training loss: 1.8217 0.0640 sec/batch\n", + "Epoch 5/20 Iteration 798/3560 Training loss: 1.8211 0.0648 sec/batch\n", + "Epoch 5/20 Iteration 799/3560 Training loss: 1.8206 0.0698 sec/batch\n", + "Epoch 5/20 Iteration 800/3560 Training loss: 1.8200 0.0610 sec/batch\n", + "Epoch 5/20 Iteration 801/3560 Training loss: 1.8193 0.0620 sec/batch\n", + "Epoch 5/20 Iteration 802/3560 Training loss: 1.8191 0.0619 sec/batch\n", + "Epoch 5/20 Iteration 803/3560 Training loss: 1.8186 0.0595 sec/batch\n", + "Epoch 5/20 Iteration 804/3560 Training loss: 1.8182 0.0616 sec/batch\n", + "Epoch 5/20 Iteration 805/3560 Training loss: 1.8175 0.0652 sec/batch\n", + "Epoch 5/20 Iteration 806/3560 Training loss: 1.8169 0.0621 sec/batch\n", + "Epoch 5/20 Iteration 807/3560 Training loss: 1.8164 0.0629 sec/batch\n", + "Epoch 5/20 Iteration 808/3560 Training loss: 1.8161 0.0626 sec/batch\n", + "Epoch 5/20 Iteration 809/3560 Training loss: 1.8158 0.0603 sec/batch\n", + "Epoch 5/20 Iteration 
810/3560 Training loss: 1.8151 0.0618 sec/batch\n", + "Epoch 5/20 Iteration 811/3560 Training loss: 1.8145 0.0699 sec/batch\n", + "Epoch 5/20 Iteration 812/3560 Training loss: 1.8138 0.0645 sec/batch\n", + "Epoch 5/20 Iteration 813/3560 Training loss: 1.8136 0.0612 sec/batch\n", + "Epoch 5/20 Iteration 814/3560 Training loss: 1.8132 0.0624 sec/batch\n", + "Epoch 5/20 Iteration 815/3560 Training loss: 1.8128 0.0639 sec/batch\n", + "Epoch 5/20 Iteration 816/3560 Training loss: 1.8124 0.0630 sec/batch\n", + "Epoch 5/20 Iteration 817/3560 Training loss: 1.8120 0.0614 sec/batch\n", + "Epoch 5/20 Iteration 818/3560 Training loss: 1.8116 0.0612 sec/batch\n", + "Epoch 5/20 Iteration 819/3560 Training loss: 1.8112 0.0643 sec/batch\n", + "Epoch 5/20 Iteration 820/3560 Training loss: 1.8110 0.0597 sec/batch\n", + "Epoch 5/20 Iteration 821/3560 Training loss: 1.8108 0.0705 sec/batch\n", + "Epoch 5/20 Iteration 822/3560 Training loss: 1.8106 0.0634 sec/batch\n", + "Epoch 5/20 Iteration 823/3560 Training loss: 1.8103 0.0764 sec/batch\n", + "Epoch 5/20 Iteration 824/3560 Training loss: 1.8099 0.0639 sec/batch\n", + "Epoch 5/20 Iteration 825/3560 Training loss: 1.8096 0.0624 sec/batch\n", + "Epoch 5/20 Iteration 826/3560 Training loss: 1.8092 0.0611 sec/batch\n", + "Epoch 5/20 Iteration 827/3560 Training loss: 1.8087 0.0602 sec/batch\n", + "Epoch 5/20 Iteration 828/3560 Training loss: 1.8082 0.0632 sec/batch\n", + "Epoch 5/20 Iteration 829/3560 Training loss: 1.8079 0.0680 sec/batch\n", + "Epoch 5/20 Iteration 830/3560 Training loss: 1.8076 0.0628 sec/batch\n", + "Epoch 5/20 Iteration 831/3560 Training loss: 1.8073 0.0689 sec/batch\n", + "Epoch 5/20 Iteration 832/3560 Training loss: 1.8070 0.0617 sec/batch\n", + "Epoch 5/20 Iteration 833/3560 Training loss: 1.8067 0.0609 sec/batch\n", + "Epoch 5/20 Iteration 834/3560 Training loss: 1.8062 0.0621 sec/batch\n", + "Epoch 5/20 Iteration 835/3560 Training loss: 1.8057 0.0621 sec/batch\n", + "Epoch 5/20 Iteration 836/3560 Training loss: 
1.8056 0.0610 sec/batch\n", + "Epoch 5/20 Iteration 837/3560 Training loss: 1.8052 0.0619 sec/batch\n", + "Epoch 5/20 Iteration 838/3560 Training loss: 1.8047 0.0622 sec/batch\n", + "Epoch 5/20 Iteration 839/3560 Training loss: 1.8046 0.0690 sec/batch\n", + "Epoch 5/20 Iteration 840/3560 Training loss: 1.8044 0.0623 sec/batch\n", + "Epoch 5/20 Iteration 841/3560 Training loss: 1.8041 0.0608 sec/batch\n", + "Epoch 5/20 Iteration 842/3560 Training loss: 1.8037 0.0621 sec/batch\n", + "Epoch 5/20 Iteration 843/3560 Training loss: 1.8032 0.0618 sec/batch\n", + "Epoch 5/20 Iteration 844/3560 Training loss: 1.8027 0.0623 sec/batch\n", + "Epoch 5/20 Iteration 845/3560 Training loss: 1.8025 0.0613 sec/batch\n", + "Epoch 5/20 Iteration 846/3560 Training loss: 1.8023 0.0609 sec/batch\n", + "Epoch 5/20 Iteration 847/3560 Training loss: 1.8020 0.0588 sec/batch\n", + "Epoch 5/20 Iteration 848/3560 Training loss: 1.8018 0.0594 sec/batch\n", + "Epoch 5/20 Iteration 849/3560 Training loss: 1.8016 0.0639 sec/batch\n", + "Epoch 5/20 Iteration 850/3560 Training loss: 1.8014 0.0625 sec/batch\n", + "Epoch 5/20 Iteration 851/3560 Training loss: 1.8014 0.0643 sec/batch\n", + "Epoch 5/20 Iteration 852/3560 Training loss: 1.8010 0.0609 sec/batch\n", + "Epoch 5/20 Iteration 853/3560 Training loss: 1.8011 0.0594 sec/batch\n", + "Epoch 5/20 Iteration 854/3560 Training loss: 1.8008 0.0628 sec/batch\n", + "Epoch 5/20 Iteration 855/3560 Training loss: 1.8005 0.0668 sec/batch\n", + "Epoch 5/20 Iteration 856/3560 Training loss: 1.8004 0.0632 sec/batch\n", + "Epoch 5/20 Iteration 857/3560 Training loss: 1.8000 0.0657 sec/batch\n", + "Epoch 5/20 Iteration 858/3560 Training loss: 1.7999 0.0644 sec/batch\n", + "Epoch 5/20 Iteration 859/3560 Training loss: 1.7997 0.0646 sec/batch\n", + "Epoch 5/20 Iteration 860/3560 Training loss: 1.7997 0.0632 sec/batch\n", + "Epoch 5/20 Iteration 861/3560 Training loss: 1.7995 0.0605 sec/batch\n", + "Epoch 5/20 Iteration 862/3560 Training loss: 1.7992 0.0632 
sec/batch\n", + "Epoch 5/20 Iteration 863/3560 Training loss: 1.7988 0.0614 sec/batch\n", + "Epoch 5/20 Iteration 864/3560 Training loss: 1.7987 0.0680 sec/batch\n", + "Epoch 5/20 Iteration 865/3560 Training loss: 1.7985 0.0609 sec/batch\n", + "Epoch 5/20 Iteration 866/3560 Training loss: 1.7984 0.0671 sec/batch\n", + "Epoch 5/20 Iteration 867/3560 Training loss: 1.7981 0.0636 sec/batch\n", + "Epoch 5/20 Iteration 868/3560 Training loss: 1.7980 0.0656 sec/batch\n", + "Epoch 5/20 Iteration 869/3560 Training loss: 1.7978 0.0678 sec/batch\n", + "Epoch 5/20 Iteration 870/3560 Training loss: 1.7975 0.0662 sec/batch\n", + "Epoch 5/20 Iteration 871/3560 Training loss: 1.7971 0.0626 sec/batch\n", + "Epoch 5/20 Iteration 872/3560 Training loss: 1.7970 0.0664 sec/batch\n", + "Epoch 5/20 Iteration 873/3560 Training loss: 1.7970 0.0623 sec/batch\n", + "Epoch 5/20 Iteration 874/3560 Training loss: 1.7967 0.0640 sec/batch\n", + "Epoch 5/20 Iteration 875/3560 Training loss: 1.7965 0.0655 sec/batch\n", + "Epoch 5/20 Iteration 876/3560 Training loss: 1.7964 0.0610 sec/batch\n", + "Epoch 5/20 Iteration 877/3560 Training loss: 1.7961 0.0607 sec/batch\n", + "Epoch 5/20 Iteration 878/3560 Training loss: 1.7959 0.0630 sec/batch\n", + "Epoch 5/20 Iteration 879/3560 Training loss: 1.7958 0.0622 sec/batch\n", + "Epoch 5/20 Iteration 880/3560 Training loss: 1.7960 0.0621 sec/batch\n", + "Epoch 5/20 Iteration 881/3560 Training loss: 1.7958 0.0615 sec/batch\n", + "Epoch 5/20 Iteration 882/3560 Training loss: 1.7956 0.0658 sec/batch\n", + "Epoch 5/20 Iteration 883/3560 Training loss: 1.7954 0.0665 sec/batch\n", + "Epoch 5/20 Iteration 884/3560 Training loss: 1.7951 0.0609 sec/batch\n", + "Epoch 5/20 Iteration 885/3560 Training loss: 1.7951 0.0637 sec/batch\n", + "Epoch 5/20 Iteration 886/3560 Training loss: 1.7949 0.0626 sec/batch\n", + "Epoch 5/20 Iteration 887/3560 Training loss: 1.7948 0.0696 sec/batch\n", + "Epoch 5/20 Iteration 888/3560 Training loss: 1.7946 0.0705 sec/batch\n", + "Epoch 
5/20 Iteration 889/3560 Training loss: 1.7943 0.0653 sec/batch\n", + "Epoch 5/20 Iteration 890/3560 Training loss: 1.7942 0.0642 sec/batch\n", + "Epoch 6/20 Iteration 891/3560 Training loss: 1.8479 0.0615 sec/batch\n", + "Epoch 6/20 Iteration 892/3560 Training loss: 1.8020 0.0613 sec/batch\n", + "Epoch 6/20 Iteration 893/3560 Training loss: 1.7870 0.0694 sec/batch\n", + "Epoch 6/20 Iteration 894/3560 Training loss: 1.7759 0.0640 sec/batch\n", + "Epoch 6/20 Iteration 895/3560 Training loss: 1.7708 0.0615 sec/batch\n", + "Epoch 6/20 Iteration 896/3560 Training loss: 1.7598 0.0670 sec/batch\n", + "Epoch 6/20 Iteration 897/3560 Training loss: 1.7594 0.0619 sec/batch\n", + "Epoch 6/20 Iteration 898/3560 Training loss: 1.7566 0.0622 sec/batch\n", + "Epoch 6/20 Iteration 899/3560 Training loss: 1.7582 0.0623 sec/batch\n", + "Epoch 6/20 Iteration 900/3560 Training loss: 1.7572 0.0703 sec/batch\n", + "Epoch 6/20 Iteration 901/3560 Training loss: 1.7542 0.0624 sec/batch\n", + "Epoch 6/20 Iteration 902/3560 Training loss: 1.7508 0.0797 sec/batch\n", + "Epoch 6/20 Iteration 903/3560 Training loss: 1.7509 0.0647 sec/batch\n", + "Epoch 6/20 Iteration 904/3560 Training loss: 1.7527 0.0621 sec/batch\n", + "Epoch 6/20 Iteration 905/3560 Training loss: 1.7517 0.0667 sec/batch\n", + "Epoch 6/20 Iteration 906/3560 Training loss: 1.7497 0.0632 sec/batch\n", + "Epoch 6/20 Iteration 907/3560 Training loss: 1.7490 0.0604 sec/batch\n", + "Epoch 6/20 Iteration 908/3560 Training loss: 1.7510 0.0619 sec/batch\n", + "Epoch 6/20 Iteration 909/3560 Training loss: 1.7505 0.0643 sec/batch\n", + "Epoch 6/20 Iteration 910/3560 Training loss: 1.7509 0.0636 sec/batch\n", + "Epoch 6/20 Iteration 911/3560 Training loss: 1.7498 0.0629 sec/batch\n", + "Epoch 6/20 Iteration 912/3560 Training loss: 1.7502 0.0623 sec/batch\n", + "Epoch 6/20 Iteration 913/3560 Training loss: 1.7494 0.0607 sec/batch\n", + "Epoch 6/20 Iteration 914/3560 Training loss: 1.7487 0.0623 sec/batch\n", + "Epoch 6/20 Iteration 915/3560 
Training loss: 1.7483 0.0677 sec/batch\n", + "Epoch 6/20 Iteration 916/3560 Training loss: 1.7465 0.0737 sec/batch\n", + "Epoch 6/20 Iteration 917/3560 Training loss: 1.7454 0.0643 sec/batch\n", + "Epoch 6/20 Iteration 918/3560 Training loss: 1.7456 0.0632 sec/batch\n", + "Epoch 6/20 Iteration 919/3560 Training loss: 1.7459 0.0601 sec/batch\n", + "Epoch 6/20 Iteration 920/3560 Training loss: 1.7460 0.0637 sec/batch\n", + "Epoch 6/20 Iteration 921/3560 Training loss: 1.7455 0.0640 sec/batch\n", + "Epoch 6/20 Iteration 922/3560 Training loss: 1.7445 0.0619 sec/batch\n", + "Epoch 6/20 Iteration 923/3560 Training loss: 1.7444 0.0676 sec/batch\n", + "Epoch 6/20 Iteration 924/3560 Training loss: 1.7449 0.0610 sec/batch\n", + "Epoch 6/20 Iteration 925/3560 Training loss: 1.7445 0.0611 sec/batch\n", + "Epoch 6/20 Iteration 926/3560 Training loss: 1.7441 0.0658 sec/batch\n", + "Epoch 6/20 Iteration 927/3560 Training loss: 1.7430 0.0693 sec/batch\n", + "Epoch 6/20 Iteration 928/3560 Training loss: 1.7417 0.0631 sec/batch\n", + "Epoch 6/20 Iteration 929/3560 Training loss: 1.7403 0.0633 sec/batch\n", + "Epoch 6/20 Iteration 930/3560 Training loss: 1.7395 0.0652 sec/batch\n", + "Epoch 6/20 Iteration 931/3560 Training loss: 1.7387 0.0634 sec/batch\n", + "Epoch 6/20 Iteration 932/3560 Training loss: 1.7391 0.0615 sec/batch\n", + "Epoch 6/20 Iteration 933/3560 Training loss: 1.7386 0.0640 sec/batch\n", + "Epoch 6/20 Iteration 934/3560 Training loss: 1.7375 0.0620 sec/batch\n", + "Epoch 6/20 Iteration 935/3560 Training loss: 1.7376 0.0663 sec/batch\n", + "Epoch 6/20 Iteration 936/3560 Training loss: 1.7361 0.0654 sec/batch\n", + "Epoch 6/20 Iteration 937/3560 Training loss: 1.7356 0.0623 sec/batch\n", + "Epoch 6/20 Iteration 938/3560 Training loss: 1.7350 0.0636 sec/batch\n", + "Epoch 6/20 Iteration 939/3560 Training loss: 1.7346 0.0623 sec/batch\n", + "Epoch 6/20 Iteration 940/3560 Training loss: 1.7350 0.0602 sec/batch\n", + "Epoch 6/20 Iteration 941/3560 Training loss: 1.7342 
0.0630 sec/batch\n", + "Epoch 6/20 Iteration 942/3560 Training loss: 1.7348 0.0603 sec/batch\n", + "Epoch 6/20 Iteration 943/3560 Training loss: 1.7345 0.0625 sec/batch\n", + "Epoch 6/20 Iteration 944/3560 Training loss: 1.7342 0.0642 sec/batch\n", + "Epoch 6/20 Iteration 945/3560 Training loss: 1.7338 0.0685 sec/batch\n", + "Epoch 6/20 Iteration 946/3560 Training loss: 1.7340 0.0712 sec/batch\n", + "Epoch 6/20 Iteration 947/3560 Training loss: 1.7339 0.0633 sec/batch\n", + "Epoch 6/20 Iteration 948/3560 Training loss: 1.7335 0.0654 sec/batch\n", + "Epoch 6/20 Iteration 949/3560 Training loss: 1.7327 0.0656 sec/batch\n", + "Epoch 6/20 Iteration 950/3560 Training loss: 1.7332 0.0607 sec/batch\n", + "Epoch 6/20 Iteration 951/3560 Training loss: 1.7329 0.0705 sec/batch\n", + "Epoch 6/20 Iteration 952/3560 Training loss: 1.7335 0.0644 sec/batch\n", + "Epoch 6/20 Iteration 953/3560 Training loss: 1.7336 0.0633 sec/batch\n", + "Epoch 6/20 Iteration 954/3560 Training loss: 1.7337 0.0630 sec/batch\n", + "Epoch 6/20 Iteration 955/3560 Training loss: 1.7333 0.0643 sec/batch\n", + "Epoch 6/20 Iteration 956/3560 Training loss: 1.7333 0.0614 sec/batch\n", + "Epoch 6/20 Iteration 957/3560 Training loss: 1.7334 0.0684 sec/batch\n", + "Epoch 6/20 Iteration 958/3560 Training loss: 1.7328 0.0612 sec/batch\n", + "Epoch 6/20 Iteration 959/3560 Training loss: 1.7325 0.0659 sec/batch\n", + "Epoch 6/20 Iteration 960/3560 Training loss: 1.7322 0.0601 sec/batch\n", + "Epoch 6/20 Iteration 961/3560 Training loss: 1.7325 0.0680 sec/batch\n", + "Epoch 6/20 Iteration 962/3560 Training loss: 1.7324 0.0639 sec/batch\n", + "Epoch 6/20 Iteration 963/3560 Training loss: 1.7326 0.0666 sec/batch\n", + "Epoch 6/20 Iteration 964/3560 Training loss: 1.7322 0.0622 sec/batch\n", + "Epoch 6/20 Iteration 965/3560 Training loss: 1.7317 0.0701 sec/batch\n", + "Epoch 6/20 Iteration 966/3560 Training loss: 1.7318 0.0599 sec/batch\n", + "Epoch 6/20 Iteration 967/3560 Training loss: 1.7314 0.0711 sec/batch\n", + 
"Epoch 6/20 Iteration 968/3560 Training loss: 1.7313 0.0611 sec/batch\n", + "Epoch 6/20 Iteration 969/3560 Training loss: 1.7306 0.0604 sec/batch\n", + "Epoch 6/20 Iteration 970/3560 Training loss: 1.7304 0.0622 sec/batch\n", + "Epoch 6/20 Iteration 971/3560 Training loss: 1.7296 0.0615 sec/batch\n", + "Epoch 6/20 Iteration 972/3560 Training loss: 1.7295 0.0743 sec/batch\n", + "Epoch 6/20 Iteration 973/3560 Training loss: 1.7287 0.0660 sec/batch\n", + "Epoch 6/20 Iteration 974/3560 Training loss: 1.7286 0.0673 sec/batch\n", + "Epoch 6/20 Iteration 975/3560 Training loss: 1.7279 0.0616 sec/batch\n", + "Epoch 6/20 Iteration 976/3560 Training loss: 1.7274 0.0696 sec/batch\n", + "Epoch 6/20 Iteration 977/3560 Training loss: 1.7271 0.0660 sec/batch\n", + "Epoch 6/20 Iteration 978/3560 Training loss: 1.7265 0.0663 sec/batch\n", + "Epoch 6/20 Iteration 979/3560 Training loss: 1.7258 0.0642 sec/batch\n", + "Epoch 6/20 Iteration 980/3560 Training loss: 1.7257 0.0667 sec/batch\n", + "Epoch 6/20 Iteration 981/3560 Training loss: 1.7252 0.0659 sec/batch\n", + "Epoch 6/20 Iteration 982/3560 Training loss: 1.7248 0.0642 sec/batch\n", + "Epoch 6/20 Iteration 983/3560 Training loss: 1.7241 0.0633 sec/batch\n", + "Epoch 6/20 Iteration 984/3560 Training loss: 1.7236 0.0717 sec/batch\n", + "Epoch 6/20 Iteration 985/3560 Training loss: 1.7232 0.0717 sec/batch\n", + "Epoch 6/20 Iteration 986/3560 Training loss: 1.7229 0.0701 sec/batch\n", + "Epoch 6/20 Iteration 987/3560 Training loss: 1.7226 0.0602 sec/batch\n", + "Epoch 6/20 Iteration 988/3560 Training loss: 1.7220 0.0629 sec/batch\n", + "Epoch 6/20 Iteration 989/3560 Training loss: 1.7214 0.0640 sec/batch\n", + "Epoch 6/20 Iteration 990/3560 Training loss: 1.7207 0.0640 sec/batch\n", + "Epoch 6/20 Iteration 991/3560 Training loss: 1.7205 0.0606 sec/batch\n", + "Epoch 6/20 Iteration 992/3560 Training loss: 1.7201 0.0638 sec/batch\n", + "Epoch 6/20 Iteration 993/3560 Training loss: 1.7198 0.0641 sec/batch\n", + "Epoch 6/20 Iteration 
994/3560 Training loss: 1.7194 0.0630 sec/batch\n", + "Epoch 6/20 Iteration 995/3560 Training loss: 1.7189 0.0660 sec/batch\n", + "Epoch 6/20 Iteration 996/3560 Training loss: 1.7185 0.0610 sec/batch\n", + "Epoch 6/20 Iteration 997/3560 Training loss: 1.7182 0.0646 sec/batch\n", + "Epoch 6/20 Iteration 998/3560 Training loss: 1.7179 0.0630 sec/batch\n", + "Epoch 6/20 Iteration 999/3560 Training loss: 1.7177 0.0720 sec/batch\n", + "Epoch 6/20 Iteration 1000/3560 Training loss: 1.7175 0.0673 sec/batch\n", + "Epoch 6/20 Iteration 1001/3560 Training loss: 1.7172 0.0678 sec/batch\n", + "Epoch 6/20 Iteration 1002/3560 Training loss: 1.7169 0.0615 sec/batch\n", + "Epoch 6/20 Iteration 1003/3560 Training loss: 1.7165 0.0625 sec/batch\n", + "Epoch 6/20 Iteration 1004/3560 Training loss: 1.7162 0.0631 sec/batch\n", + "Epoch 6/20 Iteration 1005/3560 Training loss: 1.7158 0.0704 sec/batch\n", + "Epoch 6/20 Iteration 1006/3560 Training loss: 1.7152 0.0708 sec/batch\n", + "Epoch 6/20 Iteration 1007/3560 Training loss: 1.7150 0.0658 sec/batch\n", + "Epoch 6/20 Iteration 1008/3560 Training loss: 1.7146 0.0688 sec/batch\n", + "Epoch 6/20 Iteration 1009/3560 Training loss: 1.7143 0.0645 sec/batch\n", + "Epoch 6/20 Iteration 1010/3560 Training loss: 1.7140 0.0632 sec/batch\n", + "Epoch 6/20 Iteration 1011/3560 Training loss: 1.7138 0.0675 sec/batch\n", + "Epoch 6/20 Iteration 1012/3560 Training loss: 1.7132 0.0603 sec/batch\n", + "Epoch 6/20 Iteration 1013/3560 Training loss: 1.7127 0.0603 sec/batch\n", + "Epoch 6/20 Iteration 1014/3560 Training loss: 1.7125 0.0606 sec/batch\n", + "Epoch 6/20 Iteration 1015/3560 Training loss: 1.7122 0.0618 sec/batch\n", + "Epoch 6/20 Iteration 1016/3560 Training loss: 1.7116 0.0625 sec/batch\n", + "Epoch 6/20 Iteration 1017/3560 Training loss: 1.7116 0.0688 sec/batch\n", + "Epoch 6/20 Iteration 1018/3560 Training loss: 1.7114 0.0644 sec/batch\n", + "Epoch 6/20 Iteration 1019/3560 Training loss: 1.7111 0.0627 sec/batch\n", + "Epoch 6/20 Iteration 
1020/3560 Training loss: 1.7108 0.0661 sec/batch\n", + "Epoch 6/20 Iteration 1021/3560 Training loss: 1.7103 0.0688 sec/batch\n", + "Epoch 6/20 Iteration 1022/3560 Training loss: 1.7097 0.0695 sec/batch\n", + "Epoch 6/20 Iteration 1023/3560 Training loss: 1.7095 0.0772 sec/batch\n", + "Epoch 6/20 Iteration 1024/3560 Training loss: 1.7094 0.0630 sec/batch\n", + "Epoch 6/20 Iteration 1025/3560 Training loss: 1.7093 0.0614 sec/batch\n", + "Epoch 6/20 Iteration 1026/3560 Training loss: 1.7091 0.0664 sec/batch\n", + "Epoch 6/20 Iteration 1027/3560 Training loss: 1.7090 0.0607 sec/batch\n", + "Epoch 6/20 Iteration 1028/3560 Training loss: 1.7089 0.0690 sec/batch\n", + "Epoch 6/20 Iteration 1029/3560 Training loss: 1.7089 0.0625 sec/batch\n", + "Epoch 6/20 Iteration 1030/3560 Training loss: 1.7085 0.0627 sec/batch\n", + "Epoch 6/20 Iteration 1031/3560 Training loss: 1.7086 0.0625 sec/batch\n", + "Epoch 6/20 Iteration 1032/3560 Training loss: 1.7084 0.0627 sec/batch\n", + "Epoch 6/20 Iteration 1033/3560 Training loss: 1.7082 0.0634 sec/batch\n", + "Epoch 6/20 Iteration 1034/3560 Training loss: 1.7081 0.0608 sec/batch\n", + "Epoch 6/20 Iteration 1035/3560 Training loss: 1.7077 0.0684 sec/batch\n", + "Epoch 6/20 Iteration 1036/3560 Training loss: 1.7076 0.0699 sec/batch\n", + "Epoch 6/20 Iteration 1037/3560 Training loss: 1.7074 0.0656 sec/batch\n", + "Epoch 6/20 Iteration 1038/3560 Training loss: 1.7074 0.0690 sec/batch\n", + "Epoch 6/20 Iteration 1039/3560 Training loss: 1.7072 0.0676 sec/batch\n", + "Epoch 6/20 Iteration 1040/3560 Training loss: 1.7070 0.0649 sec/batch\n", + "Epoch 6/20 Iteration 1041/3560 Training loss: 1.7065 0.0666 sec/batch\n", + "Epoch 6/20 Iteration 1042/3560 Training loss: 1.7063 0.0770 sec/batch\n", + "Epoch 6/20 Iteration 1043/3560 Training loss: 1.7062 0.0804 sec/batch\n", + "Epoch 6/20 Iteration 1044/3560 Training loss: 1.7061 0.0742 sec/batch\n", + "Epoch 6/20 Iteration 1045/3560 Training loss: 1.7060 0.0668 sec/batch\n", + "Epoch 6/20 
Iteration 1046/3560 Training loss: 1.7057 0.0688 sec/batch\n", + "Epoch 6/20 Iteration 1047/3560 Training loss: 1.7055 0.0608 sec/batch\n", + "Epoch 6/20 Iteration 1048/3560 Training loss: 1.7053 0.0629 sec/batch\n", + "Epoch 6/20 Iteration 1049/3560 Training loss: 1.7048 0.0629 sec/batch\n", + "Epoch 6/20 Iteration 1050/3560 Training loss: 1.7048 0.0621 sec/batch\n", + "Epoch 6/20 Iteration 1051/3560 Training loss: 1.7048 0.0621 sec/batch\n", + "Epoch 6/20 Iteration 1052/3560 Training loss: 1.7046 0.0678 sec/batch\n", + "Epoch 6/20 Iteration 1053/3560 Training loss: 1.7044 0.0611 sec/batch\n", + "Epoch 6/20 Iteration 1054/3560 Training loss: 1.7043 0.0615 sec/batch\n", + "Epoch 6/20 Iteration 1055/3560 Training loss: 1.7042 0.0619 sec/batch\n", + "Epoch 6/20 Iteration 1056/3560 Training loss: 1.7040 0.0633 sec/batch\n", + "Epoch 6/20 Iteration 1057/3560 Training loss: 1.7039 0.0632 sec/batch\n", + "Epoch 6/20 Iteration 1058/3560 Training loss: 1.7041 0.0786 sec/batch\n", + "Epoch 6/20 Iteration 1059/3560 Training loss: 1.7039 0.0641 sec/batch\n", + "Epoch 6/20 Iteration 1060/3560 Training loss: 1.7037 0.0619 sec/batch\n", + "Epoch 6/20 Iteration 1061/3560 Training loss: 1.7034 0.0622 sec/batch\n", + "Epoch 6/20 Iteration 1062/3560 Training loss: 1.7030 0.0629 sec/batch\n", + "Epoch 6/20 Iteration 1063/3560 Training loss: 1.7029 0.0594 sec/batch\n", + "Epoch 6/20 Iteration 1064/3560 Training loss: 1.7027 0.0624 sec/batch\n", + "Epoch 6/20 Iteration 1065/3560 Training loss: 1.7026 0.0602 sec/batch\n", + "Epoch 6/20 Iteration 1066/3560 Training loss: 1.7023 0.0596 sec/batch\n", + "Epoch 6/20 Iteration 1067/3560 Training loss: 1.7021 0.0674 sec/batch\n", + "Epoch 6/20 Iteration 1068/3560 Training loss: 1.7020 0.0624 sec/batch\n", + "Epoch 7/20 Iteration 1069/3560 Training loss: 1.7790 0.0607 sec/batch\n", + "Epoch 7/20 Iteration 1070/3560 Training loss: 1.7325 0.0671 sec/batch\n", + "Epoch 7/20 Iteration 1071/3560 Training loss: 1.7139 0.0652 sec/batch\n", + "Epoch 
7/20 Iteration 1072/3560 Training loss: 1.7034 0.0616 sec/batch\n", + "Epoch 7/20 Iteration 1073/3560 Training loss: 1.6936 0.0667 sec/batch\n", + "Epoch 7/20 Iteration 1074/3560 Training loss: 1.6824 0.0616 sec/batch\n", + "Epoch 7/20 Iteration 1075/3560 Training loss: 1.6804 0.0631 sec/batch\n", + "Epoch 7/20 Iteration 1076/3560 Training loss: 1.6777 0.0721 sec/batch\n", + "Epoch 7/20 Iteration 1077/3560 Training loss: 1.6779 0.0600 sec/batch\n", + "Epoch 7/20 Iteration 1078/3560 Training loss: 1.6762 0.0595 sec/batch\n", + "Epoch 7/20 Iteration 1079/3560 Training loss: 1.6726 0.0632 sec/batch\n", + "Epoch 7/20 Iteration 1080/3560 Training loss: 1.6703 0.0620 sec/batch\n", + "Epoch 7/20 Iteration 1081/3560 Training loss: 1.6697 0.0624 sec/batch\n", + "Epoch 7/20 Iteration 1082/3560 Training loss: 1.6714 0.0623 sec/batch\n", + "Epoch 7/20 Iteration 1083/3560 Training loss: 1.6702 0.0621 sec/batch\n", + "Epoch 7/20 Iteration 1084/3560 Training loss: 1.6684 0.0635 sec/batch\n", + "Epoch 7/20 Iteration 1085/3560 Training loss: 1.6682 0.0721 sec/batch\n", + "Epoch 7/20 Iteration 1086/3560 Training loss: 1.6698 0.0625 sec/batch\n", + "Epoch 7/20 Iteration 1087/3560 Training loss: 1.6700 0.0618 sec/batch\n", + "Epoch 7/20 Iteration 1088/3560 Training loss: 1.6705 0.0613 sec/batch\n", + "Epoch 7/20 Iteration 1089/3560 Training loss: 1.6693 0.0600 sec/batch\n", + "Epoch 7/20 Iteration 1090/3560 Training loss: 1.6695 0.0601 sec/batch\n", + "Epoch 7/20 Iteration 1091/3560 Training loss: 1.6686 0.0704 sec/batch\n", + "Epoch 7/20 Iteration 1092/3560 Training loss: 1.6680 0.0636 sec/batch\n", + "Epoch 7/20 Iteration 1093/3560 Training loss: 1.6681 0.0606 sec/batch\n", + "Epoch 7/20 Iteration 1094/3560 Training loss: 1.6661 0.0599 sec/batch\n", + "Epoch 7/20 Iteration 1095/3560 Training loss: 1.6645 0.0614 sec/batch\n", + "Epoch 7/20 Iteration 1096/3560 Training loss: 1.6647 0.0658 sec/batch\n", + "Epoch 7/20 Iteration 1097/3560 Training loss: 1.6651 0.0663 sec/batch\n", + 
"Epoch 7/20 Iteration 1098/3560 Training loss: 1.6650 0.0611 sec/batch\n", + "Epoch 7/20 Iteration 1099/3560 Training loss: 1.6645 0.0633 sec/batch\n", + "Epoch 7/20 Iteration 1100/3560 Training loss: 1.6633 0.0673 sec/batch\n", + "Epoch 7/20 Iteration 1101/3560 Training loss: 1.6631 0.0738 sec/batch\n", + "Epoch 7/20 Iteration 1102/3560 Training loss: 1.6635 0.0645 sec/batch\n", + "Epoch 7/20 Iteration 1103/3560 Training loss: 1.6631 0.0601 sec/batch\n", + "Epoch 7/20 Iteration 1104/3560 Training loss: 1.6625 0.0678 sec/batch\n", + "Epoch 7/20 Iteration 1105/3560 Training loss: 1.6619 0.0618 sec/batch\n", + "Epoch 7/20 Iteration 1106/3560 Training loss: 1.6602 0.0619 sec/batch\n", + "Epoch 7/20 Iteration 1107/3560 Training loss: 1.6586 0.0622 sec/batch\n", + "Epoch 7/20 Iteration 1108/3560 Training loss: 1.6577 0.0606 sec/batch\n", + "Epoch 7/20 Iteration 1109/3560 Training loss: 1.6570 0.0604 sec/batch\n", + "Epoch 7/20 Iteration 1110/3560 Training loss: 1.6573 0.0610 sec/batch\n", + "Epoch 7/20 Iteration 1111/3560 Training loss: 1.6567 0.0628 sec/batch\n", + "Epoch 7/20 Iteration 1112/3560 Training loss: 1.6558 0.0626 sec/batch\n", + "Epoch 7/20 Iteration 1113/3560 Training loss: 1.6559 0.0648 sec/batch\n", + "Epoch 7/20 Iteration 1114/3560 Training loss: 1.6545 0.0679 sec/batch\n", + "Epoch 7/20 Iteration 1115/3560 Training loss: 1.6540 0.0643 sec/batch\n", + "Epoch 7/20 Iteration 1116/3560 Training loss: 1.6533 0.0637 sec/batch\n", + "Epoch 7/20 Iteration 1117/3560 Training loss: 1.6529 0.0655 sec/batch\n", + "Epoch 7/20 Iteration 1118/3560 Training loss: 1.6533 0.0677 sec/batch\n", + "Epoch 7/20 Iteration 1119/3560 Training loss: 1.6527 0.0636 sec/batch\n", + "Epoch 7/20 Iteration 1120/3560 Training loss: 1.6534 0.0675 sec/batch\n", + "Epoch 7/20 Iteration 1121/3560 Training loss: 1.6531 0.0685 sec/batch\n", + "Epoch 7/20 Iteration 1122/3560 Training loss: 1.6529 0.0700 sec/batch\n", + "Epoch 7/20 Iteration 1123/3560 Training loss: 1.6527 0.0613 sec/batch\n", 
+ "Epoch 7/20 Iteration 1124/3560 Training loss: 1.6526 0.0604 sec/batch\n", + "Epoch 7/20 Iteration 1125/3560 Training loss: 1.6528 0.0646 sec/batch\n", + "Epoch 7/20 Iteration 1126/3560 Training loss: 1.6521 0.0613 sec/batch\n", + "Epoch 7/20 Iteration 1127/3560 Training loss: 1.6511 0.0715 sec/batch\n", + "Epoch 7/20 Iteration 1128/3560 Training loss: 1.6515 0.0646 sec/batch\n", + "Epoch 7/20 Iteration 1129/3560 Training loss: 1.6513 0.0622 sec/batch\n", + "Epoch 7/20 Iteration 1130/3560 Training loss: 1.6519 0.0600 sec/batch\n", + "Epoch 7/20 Iteration 1131/3560 Training loss: 1.6520 0.0700 sec/batch\n", + "Epoch 7/20 Iteration 1132/3560 Training loss: 1.6520 0.0668 sec/batch\n", + "Epoch 7/20 Iteration 1133/3560 Training loss: 1.6516 0.0604 sec/batch\n", + "Epoch 7/20 Iteration 1134/3560 Training loss: 1.6518 0.0658 sec/batch\n", + "Epoch 7/20 Iteration 1135/3560 Training loss: 1.6518 0.0702 sec/batch\n", + "Epoch 7/20 Iteration 1136/3560 Training loss: 1.6512 0.0610 sec/batch\n", + "Epoch 7/20 Iteration 1137/3560 Training loss: 1.6511 0.0638 sec/batch\n", + "Epoch 7/20 Iteration 1138/3560 Training loss: 1.6508 0.0600 sec/batch\n", + "Epoch 7/20 Iteration 1139/3560 Training loss: 1.6513 0.0625 sec/batch\n", + "Epoch 7/20 Iteration 1140/3560 Training loss: 1.6512 0.0637 sec/batch\n", + "Epoch 7/20 Iteration 1141/3560 Training loss: 1.6515 0.0650 sec/batch\n", + "Epoch 7/20 Iteration 1142/3560 Training loss: 1.6512 0.0692 sec/batch\n", + "Epoch 7/20 Iteration 1143/3560 Training loss: 1.6510 0.0659 sec/batch\n", + "Epoch 7/20 Iteration 1144/3560 Training loss: 1.6511 0.0640 sec/batch\n", + "Epoch 7/20 Iteration 1145/3560 Training loss: 1.6508 0.0660 sec/batch\n", + "Epoch 7/20 Iteration 1146/3560 Training loss: 1.6507 0.0613 sec/batch\n", + "Epoch 7/20 Iteration 1147/3560 Training loss: 1.6500 0.0611 sec/batch\n", + "Epoch 7/20 Iteration 1148/3560 Training loss: 1.6498 0.0657 sec/batch\n", + "Epoch 7/20 Iteration 1149/3560 Training loss: 1.6491 0.0627 
sec/batch\n", + "Epoch 7/20 Iteration 1150/3560 Training loss: 1.6490 0.0622 sec/batch\n", + "Epoch 7/20 Iteration 1151/3560 Training loss: 1.6483 0.0609 sec/batch\n", + "Epoch 7/20 Iteration 1152/3560 Training loss: 1.6482 0.0673 sec/batch\n", + "Epoch 7/20 Iteration 1153/3560 Training loss: 1.6476 0.0679 sec/batch\n", + "Epoch 7/20 Iteration 1154/3560 Training loss: 1.6471 0.0658 sec/batch\n", + "Epoch 7/20 Iteration 1155/3560 Training loss: 1.6466 0.0596 sec/batch\n", + "Epoch 7/20 Iteration 1156/3560 Training loss: 1.6462 0.0656 sec/batch\n", + "Epoch 7/20 Iteration 1157/3560 Training loss: 1.6456 0.0620 sec/batch\n", + "Epoch 7/20 Iteration 1158/3560 Training loss: 1.6455 0.0626 sec/batch\n", + "Epoch 7/20 Iteration 1159/3560 Training loss: 1.6451 0.0675 sec/batch\n", + "Epoch 7/20 Iteration 1160/3560 Training loss: 1.6447 0.0661 sec/batch\n", + "Epoch 7/20 Iteration 1161/3560 Training loss: 1.6441 0.0706 sec/batch\n", + "Epoch 7/20 Iteration 1162/3560 Training loss: 1.6437 0.0670 sec/batch\n", + "Epoch 7/20 Iteration 1163/3560 Training loss: 1.6432 0.0677 sec/batch\n", + "Epoch 7/20 Iteration 1164/3560 Training loss: 1.6430 0.0629 sec/batch\n", + "Epoch 7/20 Iteration 1165/3560 Training loss: 1.6428 0.0689 sec/batch\n", + "Epoch 7/20 Iteration 1166/3560 Training loss: 1.6421 0.0631 sec/batch\n", + "Epoch 7/20 Iteration 1167/3560 Training loss: 1.6416 0.0617 sec/batch\n", + "Epoch 7/20 Iteration 1168/3560 Training loss: 1.6410 0.0629 sec/batch\n", + "Epoch 7/20 Iteration 1169/3560 Training loss: 1.6407 0.0613 sec/batch\n", + "Epoch 7/20 Iteration 1170/3560 Training loss: 1.6404 0.0646 sec/batch\n", + "Epoch 7/20 Iteration 1171/3560 Training loss: 1.6402 0.0660 sec/batch\n", + "Epoch 7/20 Iteration 1172/3560 Training loss: 1.6398 0.0640 sec/batch\n", + "Epoch 7/20 Iteration 1173/3560 Training loss: 1.6394 0.0623 sec/batch\n", + "Epoch 7/20 Iteration 1174/3560 Training loss: 1.6391 0.0686 sec/batch\n", + "Epoch 7/20 Iteration 1175/3560 Training loss: 1.6390 
0.0641 sec/batch\n", + "Epoch 7/20 Iteration 1176/3560 Training loss: 1.6388 0.0660 sec/batch\n", + "Epoch 7/20 Iteration 1177/3560 Training loss: 1.6385 0.0626 sec/batch\n", + "Epoch 7/20 Iteration 1178/3560 Training loss: 1.6385 0.0606 sec/batch\n", + "Epoch 7/20 Iteration 1179/3560 Training loss: 1.6383 0.0607 sec/batch\n", + "Epoch 7/20 Iteration 1180/3560 Training loss: 1.6380 0.0612 sec/batch\n", + "Epoch 7/20 Iteration 1181/3560 Training loss: 1.6377 0.0698 sec/batch\n", + "Epoch 7/20 Iteration 1182/3560 Training loss: 1.6374 0.0623 sec/batch\n", + "Epoch 7/20 Iteration 1183/3560 Training loss: 1.6370 0.0599 sec/batch\n", + "Epoch 7/20 Iteration 1184/3560 Training loss: 1.6366 0.0607 sec/batch\n", + "Epoch 7/20 Iteration 1185/3560 Training loss: 1.6363 0.0608 sec/batch\n", + "Epoch 7/20 Iteration 1186/3560 Training loss: 1.6361 0.0618 sec/batch\n", + "Epoch 7/20 Iteration 1187/3560 Training loss: 1.6358 0.0622 sec/batch\n", + "Epoch 7/20 Iteration 1188/3560 Training loss: 1.6356 0.0624 sec/batch\n", + "Epoch 7/20 Iteration 1189/3560 Training loss: 1.6353 0.0637 sec/batch\n", + "Epoch 7/20 Iteration 1190/3560 Training loss: 1.6348 0.0614 sec/batch\n", + "Epoch 7/20 Iteration 1191/3560 Training loss: 1.6343 0.0692 sec/batch\n", + "Epoch 7/20 Iteration 1192/3560 Training loss: 1.6343 0.0595 sec/batch\n", + "Epoch 7/20 Iteration 1193/3560 Training loss: 1.6343 0.0596 sec/batch\n", + "Epoch 7/20 Iteration 1194/3560 Training loss: 1.6339 0.0606 sec/batch\n", + "Epoch 7/20 Iteration 1195/3560 Training loss: 1.6339 0.0648 sec/batch\n", + "Epoch 7/20 Iteration 1196/3560 Training loss: 1.6338 0.0613 sec/batch\n", + "Epoch 7/20 Iteration 1197/3560 Training loss: 1.6336 0.0635 sec/batch\n", + "Epoch 7/20 Iteration 1198/3560 Training loss: 1.6333 0.0604 sec/batch\n", + "Epoch 7/20 Iteration 1199/3560 Training loss: 1.6328 0.0616 sec/batch\n", + "Epoch 7/20 Iteration 1200/3560 Training loss: 1.6324 0.0640 sec/batch\n", + "Epoch 7/20 Iteration 1201/3560 Training loss: 
1.6323 0.0628 sec/batch\n", + "Epoch 7/20 Iteration 1202/3560 Training loss: 1.6322 0.0639 sec/batch\n", + "Epoch 7/20 Iteration 1203/3560 Training loss: 1.6321 0.0607 sec/batch\n", + "Epoch 7/20 Iteration 1204/3560 Training loss: 1.6320 0.0584 sec/batch\n", + "Epoch 7/20 Iteration 1205/3560 Training loss: 1.6320 0.0628 sec/batch\n", + "Epoch 7/20 Iteration 1206/3560 Training loss: 1.6320 0.0643 sec/batch\n", + "Epoch 7/20 Iteration 1207/3560 Training loss: 1.6320 0.0642 sec/batch\n", + "Epoch 7/20 Iteration 1208/3560 Training loss: 1.6317 0.0703 sec/batch\n", + "Epoch 7/20 Iteration 1209/3560 Training loss: 1.6319 0.0637 sec/batch\n", + "Epoch 7/20 Iteration 1210/3560 Training loss: 1.6317 0.0646 sec/batch\n", + "Epoch 7/20 Iteration 1211/3560 Training loss: 1.6315 0.0642 sec/batch\n", + "Epoch 7/20 Iteration 1212/3560 Training loss: 1.6316 0.0662 sec/batch\n", + "Epoch 7/20 Iteration 1213/3560 Training loss: 1.6313 0.0611 sec/batch\n", + "Epoch 7/20 Iteration 1214/3560 Training loss: 1.6313 0.0621 sec/batch\n", + "Epoch 7/20 Iteration 1215/3560 Training loss: 1.6311 0.0616 sec/batch\n", + "Epoch 7/20 Iteration 1216/3560 Training loss: 1.6312 0.0600 sec/batch\n", + "Epoch 7/20 Iteration 1217/3560 Training loss: 1.6311 0.0648 sec/batch\n", + "Epoch 7/20 Iteration 1218/3560 Training loss: 1.6309 0.0670 sec/batch\n", + "Epoch 7/20 Iteration 1219/3560 Training loss: 1.6304 0.0636 sec/batch\n", + "Epoch 7/20 Iteration 1220/3560 Training loss: 1.6303 0.0612 sec/batch\n", + "Epoch 7/20 Iteration 1221/3560 Training loss: 1.6302 0.0599 sec/batch\n", + "Epoch 7/20 Iteration 1222/3560 Training loss: 1.6301 0.0639 sec/batch\n", + "Epoch 7/20 Iteration 1223/3560 Training loss: 1.6300 0.0578 sec/batch\n", + "Epoch 7/20 Iteration 1224/3560 Training loss: 1.6298 0.0634 sec/batch\n", + "Epoch 7/20 Iteration 1225/3560 Training loss: 1.6297 0.0605 sec/batch\n", + "Epoch 7/20 Iteration 1226/3560 Training loss: 1.6296 0.0616 sec/batch\n", + "Epoch 7/20 Iteration 1227/3560 Training 
loss: 1.6292 0.0637 sec/batch\n", + "Epoch 7/20 Iteration 1228/3560 Training loss: 1.6292 0.0638 sec/batch\n", + "Epoch 7/20 Iteration 1229/3560 Training loss: 1.6292 0.0705 sec/batch\n", + "Epoch 7/20 Iteration 1230/3560 Training loss: 1.6290 0.0696 sec/batch\n", + "Epoch 7/20 Iteration 1231/3560 Training loss: 1.6290 0.0615 sec/batch\n", + "Epoch 7/20 Iteration 1232/3560 Training loss: 1.6288 0.0619 sec/batch\n", + "Epoch 7/20 Iteration 1233/3560 Training loss: 1.6288 0.0638 sec/batch\n", + "Epoch 7/20 Iteration 1234/3560 Training loss: 1.6287 0.0614 sec/batch\n", + "Epoch 7/20 Iteration 1235/3560 Training loss: 1.6287 0.0606 sec/batch\n", + "Epoch 7/20 Iteration 1236/3560 Training loss: 1.6290 0.0640 sec/batch\n", + "Epoch 7/20 Iteration 1237/3560 Training loss: 1.6288 0.0621 sec/batch\n", + "Epoch 7/20 Iteration 1238/3560 Training loss: 1.6287 0.0641 sec/batch\n", + "Epoch 7/20 Iteration 1239/3560 Training loss: 1.6285 0.0618 sec/batch\n", + "Epoch 7/20 Iteration 1240/3560 Training loss: 1.6281 0.0631 sec/batch\n", + "Epoch 7/20 Iteration 1241/3560 Training loss: 1.6281 0.0653 sec/batch\n", + "Epoch 7/20 Iteration 1242/3560 Training loss: 1.6280 0.0598 sec/batch\n", + "Epoch 7/20 Iteration 1243/3560 Training loss: 1.6280 0.0634 sec/batch\n", + "Epoch 7/20 Iteration 1244/3560 Training loss: 1.6278 0.0649 sec/batch\n", + "Epoch 7/20 Iteration 1245/3560 Training loss: 1.6276 0.0650 sec/batch\n", + "Epoch 7/20 Iteration 1246/3560 Training loss: 1.6278 0.0624 sec/batch\n", + "Epoch 8/20 Iteration 1247/3560 Training loss: 1.7636 0.0653 sec/batch\n", + "Epoch 8/20 Iteration 1248/3560 Training loss: 1.7106 0.0675 sec/batch\n", + "Epoch 8/20 Iteration 1249/3560 Training loss: 1.6897 0.0612 sec/batch\n", + "Epoch 8/20 Iteration 1250/3560 Training loss: 1.6784 0.0622 sec/batch\n", + "Epoch 8/20 Iteration 1251/3560 Training loss: 1.6715 0.0652 sec/batch\n", + "Epoch 8/20 Iteration 1252/3560 Training loss: 1.6592 0.0661 sec/batch\n", + "Epoch 8/20 Iteration 1253/3560 
Training loss: 1.6581 0.0599 sec/batch\n", + "Epoch 8/20 Iteration 1254/3560 Training loss: 1.6547 0.0669 sec/batch\n", + "Epoch 8/20 Iteration 1255/3560 Training loss: 1.6551 0.0706 sec/batch\n", + "Epoch 8/20 Iteration 1256/3560 Training loss: 1.6525 0.0616 sec/batch\n", + "Epoch 8/20 Iteration 1257/3560 Training loss: 1.6475 0.0623 sec/batch\n", + "Epoch 8/20 Iteration 1258/3560 Training loss: 1.6433 0.0648 sec/batch\n", + "Epoch 8/20 Iteration 1259/3560 Training loss: 1.6416 0.0641 sec/batch\n", + "Epoch 8/20 Iteration 1260/3560 Training loss: 1.6424 0.0652 sec/batch\n", + "Epoch 8/20 Iteration 1261/3560 Training loss: 1.6397 0.0660 sec/batch\n", + "Epoch 8/20 Iteration 1262/3560 Training loss: 1.6365 0.0643 sec/batch\n", + "Epoch 8/20 Iteration 1263/3560 Training loss: 1.6355 0.0646 sec/batch\n", + "Epoch 8/20 Iteration 1264/3560 Training loss: 1.6364 0.0665 sec/batch\n", + "Epoch 8/20 Iteration 1265/3560 Training loss: 1.6357 0.0613 sec/batch\n", + "Epoch 8/20 Iteration 1266/3560 Training loss: 1.6358 0.0629 sec/batch\n", + "Epoch 8/20 Iteration 1267/3560 Training loss: 1.6345 0.0670 sec/batch\n", + "Epoch 8/20 Iteration 1268/3560 Training loss: 1.6341 0.0654 sec/batch\n", + "Epoch 8/20 Iteration 1269/3560 Training loss: 1.6323 0.0716 sec/batch\n", + "Epoch 8/20 Iteration 1270/3560 Training loss: 1.6313 0.0658 sec/batch\n", + "Epoch 8/20 Iteration 1271/3560 Training loss: 1.6307 0.0635 sec/batch\n", + "Epoch 8/20 Iteration 1272/3560 Training loss: 1.6287 0.0648 sec/batch\n", + "Epoch 8/20 Iteration 1273/3560 Training loss: 1.6267 0.0634 sec/batch\n", + "Epoch 8/20 Iteration 1274/3560 Training loss: 1.6267 0.0642 sec/batch\n", + "Epoch 8/20 Iteration 1275/3560 Training loss: 1.6268 0.0625 sec/batch\n", + "Epoch 8/20 Iteration 1276/3560 Training loss: 1.6264 0.0695 sec/batch\n", + "Epoch 8/20 Iteration 1277/3560 Training loss: 1.6255 0.0675 sec/batch\n", + "Epoch 8/20 Iteration 1278/3560 Training loss: 1.6243 0.0658 sec/batch\n", + "Epoch 8/20 Iteration 
1279/3560 Training loss: 1.6239 0.0627 sec/batch\n", + "Epoch 8/20 Iteration 1280/3560 Training loss: 1.6237 0.0607 sec/batch\n", + "Epoch 8/20 Iteration 1281/3560 Training loss: 1.6230 0.0710 sec/batch\n", + "Epoch 8/20 Iteration 1282/3560 Training loss: 1.6222 0.0694 sec/batch\n", + "Epoch 8/20 Iteration 1283/3560 Training loss: 1.6210 0.0636 sec/batch\n", + "Epoch 8/20 Iteration 1284/3560 Training loss: 1.6192 0.0622 sec/batch\n", + "Epoch 8/20 Iteration 1285/3560 Training loss: 1.6173 0.0675 sec/batch\n", + "Epoch 8/20 Iteration 1286/3560 Training loss: 1.6162 0.0687 sec/batch\n", + "Epoch 8/20 Iteration 1287/3560 Training loss: 1.6153 0.0691 sec/batch\n", + "Epoch 8/20 Iteration 1288/3560 Training loss: 1.6153 0.0701 sec/batch\n", + "Epoch 8/20 Iteration 1289/3560 Training loss: 1.6142 0.0667 sec/batch\n", + "Epoch 8/20 Iteration 1290/3560 Training loss: 1.6133 0.0675 sec/batch\n", + "Epoch 8/20 Iteration 1291/3560 Training loss: 1.6131 0.0767 sec/batch\n", + "Epoch 8/20 Iteration 1292/3560 Training loss: 1.6116 0.0687 sec/batch\n", + "Epoch 8/20 Iteration 1293/3560 Training loss: 1.6108 0.0623 sec/batch\n", + "Epoch 8/20 Iteration 1294/3560 Training loss: 1.6101 0.0618 sec/batch\n", + "Epoch 8/20 Iteration 1295/3560 Training loss: 1.6097 0.0624 sec/batch\n", + "Epoch 8/20 Iteration 1296/3560 Training loss: 1.6099 0.0640 sec/batch\n", + "Epoch 8/20 Iteration 1297/3560 Training loss: 1.6090 0.0662 sec/batch\n", + "Epoch 8/20 Iteration 1298/3560 Training loss: 1.6094 0.0604 sec/batch\n", + "Epoch 8/20 Iteration 1299/3560 Training loss: 1.6091 0.0599 sec/batch\n", + "Epoch 8/20 Iteration 1300/3560 Training loss: 1.6090 0.0662 sec/batch\n", + "Epoch 8/20 Iteration 1301/3560 Training loss: 1.6086 0.0714 sec/batch\n", + "Epoch 8/20 Iteration 1302/3560 Training loss: 1.6084 0.0601 sec/batch\n", + "Epoch 8/20 Iteration 1303/3560 Training loss: 1.6084 0.0639 sec/batch\n", + "Epoch 8/20 Iteration 1304/3560 Training loss: 1.6077 0.0648 sec/batch\n", + "Epoch 8/20 
Iteration 1305/3560 Training loss: 1.6069 0.0647 sec/batch\n", + "[... repetitive training log truncated: epochs 8-11 of 20, training loss decreasing steadily from ~1.61 to ~1.46 at roughly 0.06 sec/batch ...]\n", + "Epoch 11/20 
Iteration 1846/3560 Training loss: 1.4603 0.0651 sec/batch\n", + "Epoch 11/20 Iteration 1847/3560 Training loss: 1.4603 0.0675 sec/batch\n", + "Epoch 11/20 Iteration 1848/3560 Training loss: 1.4599 0.0638 sec/batch\n", + "Epoch 11/20 Iteration 1849/3560 Training loss: 1.4599 0.0682 sec/batch\n", + "Epoch 11/20 Iteration 1850/3560 Training loss: 1.4598 0.0638 sec/batch\n", + "Epoch 11/20 Iteration 1851/3560 Training loss: 1.4604 0.0654 sec/batch\n", + "Epoch 11/20 Iteration 1852/3560 Training loss: 1.4605 0.0702 sec/batch\n", + "Epoch 11/20 Iteration 1853/3560 Training loss: 1.4610 0.0635 sec/batch\n", + "Epoch 11/20 Iteration 1854/3560 Training loss: 1.4605 0.0670 sec/batch\n", + "Epoch 11/20 Iteration 1855/3560 Training loss: 1.4602 0.0652 sec/batch\n", + "Epoch 11/20 Iteration 1856/3560 Training loss: 1.4603 0.0606 sec/batch\n", + "Epoch 11/20 Iteration 1857/3560 Training loss: 1.4601 0.0633 sec/batch\n", + "Epoch 11/20 Iteration 1858/3560 Training loss: 1.4599 0.0648 sec/batch\n", + "Epoch 11/20 Iteration 1859/3560 Training loss: 1.4591 0.0628 sec/batch\n", + "Epoch 11/20 Iteration 1860/3560 Training loss: 1.4591 0.0645 sec/batch\n", + "Epoch 11/20 Iteration 1861/3560 Training loss: 1.4585 0.0686 sec/batch\n", + "Epoch 11/20 Iteration 1862/3560 Training loss: 1.4584 0.0737 sec/batch\n", + "Epoch 11/20 Iteration 1863/3560 Training loss: 1.4579 0.0680 sec/batch\n", + "Epoch 11/20 Iteration 1864/3560 Training loss: 1.4577 0.0658 sec/batch\n", + "Epoch 11/20 Iteration 1865/3560 Training loss: 1.4573 0.0701 sec/batch\n", + "Epoch 11/20 Iteration 1866/3560 Training loss: 1.4570 0.0671 sec/batch\n", + "Epoch 11/20 Iteration 1867/3560 Training loss: 1.4565 0.0681 sec/batch\n", + "Epoch 11/20 Iteration 1868/3560 Training loss: 1.4562 0.0677 sec/batch\n", + "Epoch 11/20 Iteration 1869/3560 Training loss: 1.4556 0.0658 sec/batch\n", + "Epoch 11/20 Iteration 1870/3560 Training loss: 1.4557 0.0647 sec/batch\n", + "Epoch 11/20 Iteration 1871/3560 Training loss: 1.4554 0.0647 
sec/batch\n", + "Epoch 11/20 Iteration 1872/3560 Training loss: 1.4551 0.0704 sec/batch\n", + "Epoch 11/20 Iteration 1873/3560 Training loss: 1.4546 0.0653 sec/batch\n", + "Epoch 11/20 Iteration 1874/3560 Training loss: 1.4541 0.0635 sec/batch\n", + "Epoch 11/20 Iteration 1875/3560 Training loss: 1.4538 0.0624 sec/batch\n", + "Epoch 11/20 Iteration 1876/3560 Training loss: 1.4537 0.0646 sec/batch\n", + "Epoch 11/20 Iteration 1877/3560 Training loss: 1.4536 0.0642 sec/batch\n", + "Epoch 11/20 Iteration 1878/3560 Training loss: 1.4532 0.0695 sec/batch\n", + "Epoch 11/20 Iteration 1879/3560 Training loss: 1.4528 0.0641 sec/batch\n", + "Epoch 11/20 Iteration 1880/3560 Training loss: 1.4523 0.0673 sec/batch\n", + "Epoch 11/20 Iteration 1881/3560 Training loss: 1.4521 0.0685 sec/batch\n", + "Epoch 11/20 Iteration 1882/3560 Training loss: 1.4519 0.0679 sec/batch\n", + "Epoch 11/20 Iteration 1883/3560 Training loss: 1.4517 0.0696 sec/batch\n", + "Epoch 11/20 Iteration 1884/3560 Training loss: 1.4515 0.0637 sec/batch\n", + "Epoch 11/20 Iteration 1885/3560 Training loss: 1.4512 0.0649 sec/batch\n", + "Epoch 11/20 Iteration 1886/3560 Training loss: 1.4511 0.0644 sec/batch\n", + "Epoch 11/20 Iteration 1887/3560 Training loss: 1.4509 0.0631 sec/batch\n", + "Epoch 11/20 Iteration 1888/3560 Training loss: 1.4509 0.0647 sec/batch\n", + "Epoch 11/20 Iteration 1889/3560 Training loss: 1.4507 0.0683 sec/batch\n", + "Epoch 11/20 Iteration 1890/3560 Training loss: 1.4508 0.0791 sec/batch\n", + "Epoch 11/20 Iteration 1891/3560 Training loss: 1.4505 0.0614 sec/batch\n", + "Epoch 11/20 Iteration 1892/3560 Training loss: 1.4503 0.0626 sec/batch\n", + "Epoch 11/20 Iteration 1893/3560 Training loss: 1.4501 0.0640 sec/batch\n", + "Epoch 11/20 Iteration 1894/3560 Training loss: 1.4499 0.0661 sec/batch\n", + "Epoch 11/20 Iteration 1895/3560 Training loss: 1.4496 0.0654 sec/batch\n", + "Epoch 11/20 Iteration 1896/3560 Training loss: 1.4491 0.0692 sec/batch\n", + "Epoch 11/20 Iteration 1897/3560 
Training loss: 1.4491 0.0639 sec/batch\n", + "Epoch 11/20 Iteration 1898/3560 Training loss: 1.4491 0.0717 sec/batch\n", + "Epoch 11/20 Iteration 1899/3560 Training loss: 1.4489 0.0617 sec/batch\n", + "Epoch 11/20 Iteration 1900/3560 Training loss: 1.4487 0.0620 sec/batch\n", + "Epoch 11/20 Iteration 1901/3560 Training loss: 1.4486 0.0668 sec/batch\n", + "Epoch 11/20 Iteration 1902/3560 Training loss: 1.4481 0.0696 sec/batch\n", + "Epoch 11/20 Iteration 1903/3560 Training loss: 1.4476 0.0722 sec/batch\n", + "Epoch 11/20 Iteration 1904/3560 Training loss: 1.4476 0.0660 sec/batch\n", + "Epoch 11/20 Iteration 1905/3560 Training loss: 1.4475 0.0653 sec/batch\n", + "Epoch 11/20 Iteration 1906/3560 Training loss: 1.4471 0.0629 sec/batch\n", + "Epoch 11/20 Iteration 1907/3560 Training loss: 1.4471 0.0659 sec/batch\n", + "Epoch 11/20 Iteration 1908/3560 Training loss: 1.4471 0.0648 sec/batch\n", + "Epoch 11/20 Iteration 1909/3560 Training loss: 1.4469 0.0684 sec/batch\n", + "Epoch 11/20 Iteration 1910/3560 Training loss: 1.4466 0.0623 sec/batch\n", + "Epoch 11/20 Iteration 1911/3560 Training loss: 1.4461 0.0644 sec/batch\n", + "Epoch 11/20 Iteration 1912/3560 Training loss: 1.4458 0.0653 sec/batch\n", + "Epoch 11/20 Iteration 1913/3560 Training loss: 1.4458 0.0650 sec/batch\n", + "Epoch 11/20 Iteration 1914/3560 Training loss: 1.4459 0.0677 sec/batch\n", + "Epoch 11/20 Iteration 1915/3560 Training loss: 1.4459 0.0600 sec/batch\n", + "Epoch 11/20 Iteration 1916/3560 Training loss: 1.4459 0.0672 sec/batch\n", + "Epoch 11/20 Iteration 1917/3560 Training loss: 1.4459 0.0598 sec/batch\n", + "Epoch 11/20 Iteration 1918/3560 Training loss: 1.4459 0.0726 sec/batch\n", + "Epoch 11/20 Iteration 1919/3560 Training loss: 1.4460 0.0684 sec/batch\n", + "Epoch 11/20 Iteration 1920/3560 Training loss: 1.4458 0.0613 sec/batch\n", + "Epoch 11/20 Iteration 1921/3560 Training loss: 1.4461 0.0627 sec/batch\n", + "Epoch 11/20 Iteration 1922/3560 Training loss: 1.4461 0.0614 sec/batch\n", + 
"Epoch 11/20 Iteration 1923/3560 Training loss: 1.4460 0.0625 sec/batch\n", + "Epoch 11/20 Iteration 1924/3560 Training loss: 1.4461 0.0627 sec/batch\n", + "Epoch 11/20 Iteration 1925/3560 Training loss: 1.4460 0.0743 sec/batch\n", + "Epoch 11/20 Iteration 1926/3560 Training loss: 1.4462 0.0629 sec/batch\n", + "Epoch 11/20 Iteration 1927/3560 Training loss: 1.4461 0.0670 sec/batch\n", + "Epoch 11/20 Iteration 1928/3560 Training loss: 1.4464 0.0635 sec/batch\n", + "Epoch 11/20 Iteration 1929/3560 Training loss: 1.4466 0.0638 sec/batch\n", + "Epoch 11/20 Iteration 1930/3560 Training loss: 1.4464 0.0642 sec/batch\n", + "Epoch 11/20 Iteration 1931/3560 Training loss: 1.4461 0.0645 sec/batch\n", + "Epoch 11/20 Iteration 1932/3560 Training loss: 1.4459 0.0633 sec/batch\n", + "Epoch 11/20 Iteration 1933/3560 Training loss: 1.4460 0.0660 sec/batch\n", + "Epoch 11/20 Iteration 1934/3560 Training loss: 1.4459 0.0633 sec/batch\n", + "Epoch 11/20 Iteration 1935/3560 Training loss: 1.4459 0.0640 sec/batch\n", + "Epoch 11/20 Iteration 1936/3560 Training loss: 1.4459 0.0674 sec/batch\n", + "Epoch 11/20 Iteration 1937/3560 Training loss: 1.4459 0.0657 sec/batch\n", + "Epoch 11/20 Iteration 1938/3560 Training loss: 1.4459 0.0685 sec/batch\n", + "Epoch 11/20 Iteration 1939/3560 Training loss: 1.4455 0.0675 sec/batch\n", + "Epoch 11/20 Iteration 1940/3560 Training loss: 1.4456 0.0690 sec/batch\n", + "Epoch 11/20 Iteration 1941/3560 Training loss: 1.4457 0.0607 sec/batch\n", + "Epoch 11/20 Iteration 1942/3560 Training loss: 1.4456 0.0652 sec/batch\n", + "Epoch 11/20 Iteration 1943/3560 Training loss: 1.4456 0.0656 sec/batch\n", + "Epoch 11/20 Iteration 1944/3560 Training loss: 1.4456 0.0630 sec/batch\n", + "Epoch 11/20 Iteration 1945/3560 Training loss: 1.4455 0.0657 sec/batch\n", + "Epoch 11/20 Iteration 1946/3560 Training loss: 1.4454 0.0619 sec/batch\n", + "Epoch 11/20 Iteration 1947/3560 Training loss: 1.4455 0.0713 sec/batch\n", + "Epoch 11/20 Iteration 1948/3560 Training loss: 
1.4458 0.0637 sec/batch\n", + "Epoch 11/20 Iteration 1949/3560 Training loss: 1.4458 0.0604 sec/batch\n", + "Epoch 11/20 Iteration 1950/3560 Training loss: 1.4457 0.0649 sec/batch\n", + "Epoch 11/20 Iteration 1951/3560 Training loss: 1.4456 0.0621 sec/batch\n", + "Epoch 11/20 Iteration 1952/3560 Training loss: 1.4453 0.0795 sec/batch\n", + "Epoch 11/20 Iteration 1953/3560 Training loss: 1.4454 0.0637 sec/batch\n", + "Epoch 11/20 Iteration 1954/3560 Training loss: 1.4454 0.0641 sec/batch\n", + "Epoch 11/20 Iteration 1955/3560 Training loss: 1.4454 0.0670 sec/batch\n", + "Epoch 11/20 Iteration 1956/3560 Training loss: 1.4453 0.0654 sec/batch\n", + "Epoch 11/20 Iteration 1957/3560 Training loss: 1.4452 0.0629 sec/batch\n", + "Epoch 11/20 Iteration 1958/3560 Training loss: 1.4452 0.0621 sec/batch\n", + "Epoch 12/20 Iteration 1959/3560 Training loss: 1.5692 0.0643 sec/batch\n", + "Epoch 12/20 Iteration 1960/3560 Training loss: 1.5084 0.0731 sec/batch\n", + "Epoch 12/20 Iteration 1961/3560 Training loss: 1.4852 0.0623 sec/batch\n", + "Epoch 12/20 Iteration 1962/3560 Training loss: 1.4755 0.0703 sec/batch\n", + "Epoch 12/20 Iteration 1963/3560 Training loss: 1.4638 0.0685 sec/batch\n", + "Epoch 12/20 Iteration 1964/3560 Training loss: 1.4513 0.0725 sec/batch\n", + "Epoch 12/20 Iteration 1965/3560 Training loss: 1.4479 0.0617 sec/batch\n", + "Epoch 12/20 Iteration 1966/3560 Training loss: 1.4447 0.0706 sec/batch\n", + "Epoch 12/20 Iteration 1967/3560 Training loss: 1.4444 0.0741 sec/batch\n", + "Epoch 12/20 Iteration 1968/3560 Training loss: 1.4435 0.0660 sec/batch\n", + "Epoch 12/20 Iteration 1969/3560 Training loss: 1.4390 0.0624 sec/batch\n", + "Epoch 12/20 Iteration 1970/3560 Training loss: 1.4380 0.0622 sec/batch\n", + "Epoch 12/20 Iteration 1971/3560 Training loss: 1.4373 0.0630 sec/batch\n", + "Epoch 12/20 Iteration 1972/3560 Training loss: 1.4381 0.0646 sec/batch\n", + "Epoch 12/20 Iteration 1973/3560 Training loss: 1.4366 0.0655 sec/batch\n", + "Epoch 12/20 
Iteration 1974/3560 Training loss: 1.4346 0.0614 sec/batch\n", + "Epoch 12/20 Iteration 1975/3560 Training loss: 1.4344 0.0700 sec/batch\n", + "Epoch 12/20 Iteration 1976/3560 Training loss: 1.4357 0.0634 sec/batch\n", + "Epoch 12/20 Iteration 1977/3560 Training loss: 1.4356 0.0623 sec/batch\n", + "Epoch 12/20 Iteration 1978/3560 Training loss: 1.4370 0.0623 sec/batch\n", + "Epoch 12/20 Iteration 1979/3560 Training loss: 1.4363 0.0628 sec/batch\n", + "Epoch 12/20 Iteration 1980/3560 Training loss: 1.4361 0.0618 sec/batch\n", + "Epoch 12/20 Iteration 1981/3560 Training loss: 1.4352 0.0611 sec/batch\n", + "Epoch 12/20 Iteration 1982/3560 Training loss: 1.4348 0.0745 sec/batch\n", + "Epoch 12/20 Iteration 1983/3560 Training loss: 1.4351 0.0678 sec/batch\n", + "Epoch 12/20 Iteration 1984/3560 Training loss: 1.4330 0.0729 sec/batch\n", + "Epoch 12/20 Iteration 1985/3560 Training loss: 1.4316 0.0692 sec/batch\n", + "Epoch 12/20 Iteration 1986/3560 Training loss: 1.4324 0.0667 sec/batch\n", + "Epoch 12/20 Iteration 1987/3560 Training loss: 1.4329 0.0639 sec/batch\n", + "Epoch 12/20 Iteration 1988/3560 Training loss: 1.4331 0.0662 sec/batch\n", + "Epoch 12/20 Iteration 1989/3560 Training loss: 1.4325 0.0611 sec/batch\n", + "Epoch 12/20 Iteration 1990/3560 Training loss: 1.4315 0.0651 sec/batch\n", + "Epoch 12/20 Iteration 1991/3560 Training loss: 1.4316 0.0629 sec/batch\n", + "Epoch 12/20 Iteration 1992/3560 Training loss: 1.4319 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 1993/3560 Training loss: 1.4315 0.0617 sec/batch\n", + "Epoch 12/20 Iteration 1994/3560 Training loss: 1.4314 0.0643 sec/batch\n", + "Epoch 12/20 Iteration 1995/3560 Training loss: 1.4309 0.0649 sec/batch\n", + "Epoch 12/20 Iteration 1996/3560 Training loss: 1.4293 0.0619 sec/batch\n", + "Epoch 12/20 Iteration 1997/3560 Training loss: 1.4278 0.0644 sec/batch\n", + "Epoch 12/20 Iteration 1998/3560 Training loss: 1.4272 0.0641 sec/batch\n", + "Epoch 12/20 Iteration 1999/3560 Training loss: 1.4266 0.0618 
sec/batch\n", + "Epoch 12/20 Iteration 2000/3560 Training loss: 1.4273 0.0665 sec/batch\n", + "Epoch 12/20 Iteration 2001/3560 Training loss: 1.4266 0.0696 sec/batch\n", + "Epoch 12/20 Iteration 2002/3560 Training loss: 1.4261 0.0680 sec/batch\n", + "Epoch 12/20 Iteration 2003/3560 Training loss: 1.4261 0.0703 sec/batch\n", + "Epoch 12/20 Iteration 2004/3560 Training loss: 1.4252 0.0605 sec/batch\n", + "Epoch 12/20 Iteration 2005/3560 Training loss: 1.4247 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 2006/3560 Training loss: 1.4242 0.0630 sec/batch\n", + "Epoch 12/20 Iteration 2007/3560 Training loss: 1.4240 0.0670 sec/batch\n", + "Epoch 12/20 Iteration 2008/3560 Training loss: 1.4243 0.0658 sec/batch\n", + "Epoch 12/20 Iteration 2009/3560 Training loss: 1.4237 0.0712 sec/batch\n", + "Epoch 12/20 Iteration 2010/3560 Training loss: 1.4244 0.0676 sec/batch\n", + "Epoch 12/20 Iteration 2011/3560 Training loss: 1.4243 0.0635 sec/batch\n", + "Epoch 12/20 Iteration 2012/3560 Training loss: 1.4245 0.0728 sec/batch\n", + "Epoch 12/20 Iteration 2013/3560 Training loss: 1.4243 0.0647 sec/batch\n", + "Epoch 12/20 Iteration 2014/3560 Training loss: 1.4242 0.0668 sec/batch\n", + "Epoch 12/20 Iteration 2015/3560 Training loss: 1.4244 0.0728 sec/batch\n", + "Epoch 12/20 Iteration 2016/3560 Training loss: 1.4240 0.0626 sec/batch\n", + "Epoch 12/20 Iteration 2017/3560 Training loss: 1.4234 0.0595 sec/batch\n", + "Epoch 12/20 Iteration 2018/3560 Training loss: 1.4239 0.0702 sec/batch\n", + "Epoch 12/20 Iteration 2019/3560 Training loss: 1.4239 0.0638 sec/batch\n", + "Epoch 12/20 Iteration 2020/3560 Training loss: 1.4246 0.0659 sec/batch\n", + "Epoch 12/20 Iteration 2021/3560 Training loss: 1.4249 0.0618 sec/batch\n", + "Epoch 12/20 Iteration 2022/3560 Training loss: 1.4251 0.0630 sec/batch\n", + "Epoch 12/20 Iteration 2023/3560 Training loss: 1.4249 0.0631 sec/batch\n", + "Epoch 12/20 Iteration 2024/3560 Training loss: 1.4250 0.0670 sec/batch\n", + "Epoch 12/20 Iteration 2025/3560 
Training loss: 1.4250 0.0633 sec/batch\n", + "Epoch 12/20 Iteration 2026/3560 Training loss: 1.4248 0.0624 sec/batch\n", + "Epoch 12/20 Iteration 2027/3560 Training loss: 1.4247 0.0628 sec/batch\n", + "Epoch 12/20 Iteration 2028/3560 Training loss: 1.4245 0.0648 sec/batch\n", + "Epoch 12/20 Iteration 2029/3560 Training loss: 1.4251 0.0665 sec/batch\n", + "Epoch 12/20 Iteration 2030/3560 Training loss: 1.4253 0.0619 sec/batch\n", + "Epoch 12/20 Iteration 2031/3560 Training loss: 1.4257 0.0667 sec/batch\n", + "Epoch 12/20 Iteration 2032/3560 Training loss: 1.4253 0.0803 sec/batch\n", + "Epoch 12/20 Iteration 2033/3560 Training loss: 1.4253 0.0629 sec/batch\n", + "Epoch 12/20 Iteration 2034/3560 Training loss: 1.4255 0.0643 sec/batch\n", + "Epoch 12/20 Iteration 2035/3560 Training loss: 1.4253 0.0618 sec/batch\n", + "Epoch 12/20 Iteration 2036/3560 Training loss: 1.4253 0.0625 sec/batch\n", + "Epoch 12/20 Iteration 2037/3560 Training loss: 1.4245 0.0703 sec/batch\n", + "Epoch 12/20 Iteration 2038/3560 Training loss: 1.4245 0.0616 sec/batch\n", + "Epoch 12/20 Iteration 2039/3560 Training loss: 1.4240 0.0637 sec/batch\n", + "Epoch 12/20 Iteration 2040/3560 Training loss: 1.4240 0.0693 sec/batch\n", + "Epoch 12/20 Iteration 2041/3560 Training loss: 1.4235 0.0641 sec/batch\n", + "Epoch 12/20 Iteration 2042/3560 Training loss: 1.4234 0.0648 sec/batch\n", + "Epoch 12/20 Iteration 2043/3560 Training loss: 1.4230 0.0671 sec/batch\n", + "Epoch 12/20 Iteration 2044/3560 Training loss: 1.4228 0.0606 sec/batch\n", + "Epoch 12/20 Iteration 2045/3560 Training loss: 1.4225 0.0689 sec/batch\n", + "Epoch 12/20 Iteration 2046/3560 Training loss: 1.4222 0.0737 sec/batch\n", + "Epoch 12/20 Iteration 2047/3560 Training loss: 1.4218 0.0631 sec/batch\n", + "Epoch 12/20 Iteration 2048/3560 Training loss: 1.4218 0.0627 sec/batch\n", + "Epoch 12/20 Iteration 2049/3560 Training loss: 1.4217 0.0667 sec/batch\n", + "Epoch 12/20 Iteration 2050/3560 Training loss: 1.4216 0.0616 sec/batch\n", + 
"Epoch 12/20 Iteration 2051/3560 Training loss: 1.4212 0.0616 sec/batch\n", + "Epoch 12/20 Iteration 2052/3560 Training loss: 1.4209 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 2053/3560 Training loss: 1.4206 0.0656 sec/batch\n", + "Epoch 12/20 Iteration 2054/3560 Training loss: 1.4206 0.0634 sec/batch\n", + "Epoch 12/20 Iteration 2055/3560 Training loss: 1.4206 0.0626 sec/batch\n", + "Epoch 12/20 Iteration 2056/3560 Training loss: 1.4201 0.0678 sec/batch\n", + "Epoch 12/20 Iteration 2057/3560 Training loss: 1.4197 0.0611 sec/batch\n", + "Epoch 12/20 Iteration 2058/3560 Training loss: 1.4193 0.0638 sec/batch\n", + "Epoch 12/20 Iteration 2059/3560 Training loss: 1.4192 0.0622 sec/batch\n", + "Epoch 12/20 Iteration 2060/3560 Training loss: 1.4191 0.0724 sec/batch\n", + "Epoch 12/20 Iteration 2061/3560 Training loss: 1.4189 0.0665 sec/batch\n", + "Epoch 12/20 Iteration 2062/3560 Training loss: 1.4187 0.0658 sec/batch\n", + "Epoch 12/20 Iteration 2063/3560 Training loss: 1.4186 0.0707 sec/batch\n", + "Epoch 12/20 Iteration 2064/3560 Training loss: 1.4184 0.0653 sec/batch\n", + "Epoch 12/20 Iteration 2065/3560 Training loss: 1.4182 0.0662 sec/batch\n", + "Epoch 12/20 Iteration 2066/3560 Training loss: 1.4182 0.0607 sec/batch\n", + "Epoch 12/20 Iteration 2067/3560 Training loss: 1.4180 0.0634 sec/batch\n", + "Epoch 12/20 Iteration 2068/3560 Training loss: 1.4181 0.0651 sec/batch\n", + "Epoch 12/20 Iteration 2069/3560 Training loss: 1.4179 0.0727 sec/batch\n", + "Epoch 12/20 Iteration 2070/3560 Training loss: 1.4178 0.0624 sec/batch\n", + "Epoch 12/20 Iteration 2071/3560 Training loss: 1.4176 0.0671 sec/batch\n", + "Epoch 12/20 Iteration 2072/3560 Training loss: 1.4174 0.0624 sec/batch\n", + "Epoch 12/20 Iteration 2073/3560 Training loss: 1.4171 0.0642 sec/batch\n", + "Epoch 12/20 Iteration 2074/3560 Training loss: 1.4167 0.0682 sec/batch\n", + "Epoch 12/20 Iteration 2075/3560 Training loss: 1.4166 0.0635 sec/batch\n", + "Epoch 12/20 Iteration 2076/3560 Training loss: 
1.4165 0.0689 sec/batch\n", + "Epoch 12/20 Iteration 2077/3560 Training loss: 1.4164 0.0603 sec/batch\n", + "Epoch 12/20 Iteration 2078/3560 Training loss: 1.4163 0.0619 sec/batch\n", + "Epoch 12/20 Iteration 2079/3560 Training loss: 1.4162 0.0617 sec/batch\n", + "Epoch 12/20 Iteration 2080/3560 Training loss: 1.4157 0.0639 sec/batch\n", + "Epoch 12/20 Iteration 2081/3560 Training loss: 1.4152 0.0643 sec/batch\n", + "Epoch 12/20 Iteration 2082/3560 Training loss: 1.4152 0.0618 sec/batch\n", + "Epoch 12/20 Iteration 2083/3560 Training loss: 1.4151 0.0630 sec/batch\n", + "Epoch 12/20 Iteration 2084/3560 Training loss: 1.4147 0.0641 sec/batch\n", + "Epoch 12/20 Iteration 2085/3560 Training loss: 1.4147 0.0638 sec/batch\n", + "Epoch 12/20 Iteration 2086/3560 Training loss: 1.4146 0.0664 sec/batch\n", + "Epoch 12/20 Iteration 2087/3560 Training loss: 1.4144 0.0677 sec/batch\n", + "Epoch 12/20 Iteration 2088/3560 Training loss: 1.4141 0.0621 sec/batch\n", + "Epoch 12/20 Iteration 2089/3560 Training loss: 1.4137 0.0616 sec/batch\n", + "Epoch 12/20 Iteration 2090/3560 Training loss: 1.4134 0.0642 sec/batch\n", + "Epoch 12/20 Iteration 2091/3560 Training loss: 1.4135 0.0649 sec/batch\n", + "Epoch 12/20 Iteration 2092/3560 Training loss: 1.4135 0.0675 sec/batch\n", + "Epoch 12/20 Iteration 2093/3560 Training loss: 1.4136 0.0723 sec/batch\n", + "Epoch 12/20 Iteration 2094/3560 Training loss: 1.4137 0.0621 sec/batch\n", + "Epoch 12/20 Iteration 2095/3560 Training loss: 1.4137 0.0633 sec/batch\n", + "Epoch 12/20 Iteration 2096/3560 Training loss: 1.4138 0.0728 sec/batch\n", + "Epoch 12/20 Iteration 2097/3560 Training loss: 1.4137 0.0601 sec/batch\n", + "Epoch 12/20 Iteration 2098/3560 Training loss: 1.4136 0.0667 sec/batch\n", + "Epoch 12/20 Iteration 2099/3560 Training loss: 1.4140 0.0661 sec/batch\n", + "Epoch 12/20 Iteration 2100/3560 Training loss: 1.4139 0.0654 sec/batch\n", + "Epoch 12/20 Iteration 2101/3560 Training loss: 1.4138 0.0642 sec/batch\n", + "Epoch 12/20 
Iteration 2102/3560 Training loss: 1.4140 0.0779 sec/batch\n", + "Epoch 12/20 Iteration 2103/3560 Training loss: 1.4138 0.0645 sec/batch\n", + "Epoch 12/20 Iteration 2104/3560 Training loss: 1.4140 0.0804 sec/batch\n", + "Epoch 12/20 Iteration 2105/3560 Training loss: 1.4140 0.0640 sec/batch\n", + "Epoch 12/20 Iteration 2106/3560 Training loss: 1.4143 0.0636 sec/batch\n", + "Epoch 12/20 Iteration 2107/3560 Training loss: 1.4144 0.0630 sec/batch\n", + "Epoch 12/20 Iteration 2108/3560 Training loss: 1.4143 0.0645 sec/batch\n", + "Epoch 12/20 Iteration 2109/3560 Training loss: 1.4139 0.0673 sec/batch\n", + "Epoch 12/20 Iteration 2110/3560 Training loss: 1.4138 0.0643 sec/batch\n", + "Epoch 12/20 Iteration 2111/3560 Training loss: 1.4139 0.0624 sec/batch\n", + "Epoch 12/20 Iteration 2112/3560 Training loss: 1.4139 0.0667 sec/batch\n", + "Epoch 12/20 Iteration 2113/3560 Training loss: 1.4138 0.0654 sec/batch\n", + "Epoch 12/20 Iteration 2114/3560 Training loss: 1.4138 0.0650 sec/batch\n", + "Epoch 12/20 Iteration 2115/3560 Training loss: 1.4138 0.0619 sec/batch\n", + "Epoch 12/20 Iteration 2116/3560 Training loss: 1.4137 0.0652 sec/batch\n", + "Epoch 12/20 Iteration 2117/3560 Training loss: 1.4134 0.0704 sec/batch\n", + "Epoch 12/20 Iteration 2118/3560 Training loss: 1.4135 0.0641 sec/batch\n", + "Epoch 12/20 Iteration 2119/3560 Training loss: 1.4136 0.0630 sec/batch\n", + "Epoch 12/20 Iteration 2120/3560 Training loss: 1.4135 0.0686 sec/batch\n", + "Epoch 12/20 Iteration 2121/3560 Training loss: 1.4135 0.0620 sec/batch\n", + "Epoch 12/20 Iteration 2122/3560 Training loss: 1.4135 0.0651 sec/batch\n", + "Epoch 12/20 Iteration 2123/3560 Training loss: 1.4135 0.0631 sec/batch\n", + "Epoch 12/20 Iteration 2124/3560 Training loss: 1.4134 0.0658 sec/batch\n", + "Epoch 12/20 Iteration 2125/3560 Training loss: 1.4135 0.0682 sec/batch\n", + "Epoch 12/20 Iteration 2126/3560 Training loss: 1.4138 0.0704 sec/batch\n", + "Epoch 12/20 Iteration 2127/3560 Training loss: 1.4138 0.0622 
sec/batch\n", + "Epoch 12/20 Iteration 2128/3560 Training loss: 1.4138 0.0618 sec/batch\n", + "Epoch 12/20 Iteration 2129/3560 Training loss: 1.4137 0.0668 sec/batch\n", + "Epoch 12/20 Iteration 2130/3560 Training loss: 1.4135 0.0649 sec/batch\n", + "Epoch 12/20 Iteration 2131/3560 Training loss: 1.4136 0.0684 sec/batch\n", + "Epoch 12/20 Iteration 2132/3560 Training loss: 1.4137 0.0742 sec/batch\n", + "Epoch 12/20 Iteration 2133/3560 Training loss: 1.4137 0.0671 sec/batch\n", + "Epoch 12/20 Iteration 2134/3560 Training loss: 1.4136 0.0628 sec/batch\n", + "Epoch 12/20 Iteration 2135/3560 Training loss: 1.4134 0.0636 sec/batch\n", + "Epoch 12/20 Iteration 2136/3560 Training loss: 1.4135 0.0641 sec/batch\n", + "Epoch 13/20 Iteration 2137/3560 Training loss: 1.5378 0.0693 sec/batch\n", + "Epoch 13/20 Iteration 2138/3560 Training loss: 1.4754 0.0692 sec/batch\n", + "Epoch 13/20 Iteration 2139/3560 Training loss: 1.4522 0.0632 sec/batch\n", + "Epoch 13/20 Iteration 2140/3560 Training loss: 1.4430 0.0661 sec/batch\n", + "Epoch 13/20 Iteration 2141/3560 Training loss: 1.4295 0.0625 sec/batch\n", + "Epoch 13/20 Iteration 2142/3560 Training loss: 1.4170 0.0614 sec/batch\n", + "Epoch 13/20 Iteration 2143/3560 Training loss: 1.4147 0.0673 sec/batch\n", + "Epoch 13/20 Iteration 2144/3560 Training loss: 1.4122 0.0643 sec/batch\n", + "Epoch 13/20 Iteration 2145/3560 Training loss: 1.4117 0.0634 sec/batch\n", + "Epoch 13/20 Iteration 2146/3560 Training loss: 1.4099 0.0667 sec/batch\n", + "Epoch 13/20 Iteration 2147/3560 Training loss: 1.4061 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2148/3560 Training loss: 1.4053 0.0633 sec/batch\n", + "Epoch 13/20 Iteration 2149/3560 Training loss: 1.4057 0.0641 sec/batch\n", + "Epoch 13/20 Iteration 2150/3560 Training loss: 1.4067 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2151/3560 Training loss: 1.4050 0.0693 sec/batch\n", + "Epoch 13/20 Iteration 2152/3560 Training loss: 1.4034 0.0661 sec/batch\n", + "Epoch 13/20 Iteration 2153/3560 
Training loss: 1.4035 0.0703 sec/batch\n", + "Epoch 13/20 Iteration 2154/3560 Training loss: 1.4048 0.0610 sec/batch\n", + "Epoch 13/20 Iteration 2155/3560 Training loss: 1.4047 0.0656 sec/batch\n", + "Epoch 13/20 Iteration 2156/3560 Training loss: 1.4060 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2157/3560 Training loss: 1.4055 0.0700 sec/batch\n", + "Epoch 13/20 Iteration 2158/3560 Training loss: 1.4056 0.0674 sec/batch\n", + "Epoch 13/20 Iteration 2159/3560 Training loss: 1.4047 0.0681 sec/batch\n", + "Epoch 13/20 Iteration 2160/3560 Training loss: 1.4041 0.0694 sec/batch\n", + "Epoch 13/20 Iteration 2161/3560 Training loss: 1.4044 0.0622 sec/batch\n", + "Epoch 13/20 Iteration 2162/3560 Training loss: 1.4025 0.0621 sec/batch\n", + "Epoch 13/20 Iteration 2163/3560 Training loss: 1.4012 0.0614 sec/batch\n", + "Epoch 13/20 Iteration 2164/3560 Training loss: 1.4018 0.0642 sec/batch\n", + "Epoch 13/20 Iteration 2165/3560 Training loss: 1.4019 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2166/3560 Training loss: 1.4019 0.0649 sec/batch\n", + "Epoch 13/20 Iteration 2167/3560 Training loss: 1.4012 0.0651 sec/batch\n", + "Epoch 13/20 Iteration 2168/3560 Training loss: 1.4001 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2169/3560 Training loss: 1.4002 0.0617 sec/batch\n", + "Epoch 13/20 Iteration 2170/3560 Training loss: 1.4004 0.0684 sec/batch\n", + "Epoch 13/20 Iteration 2171/3560 Training loss: 1.4000 0.0618 sec/batch\n", + "Epoch 13/20 Iteration 2172/3560 Training loss: 1.3996 0.0671 sec/batch\n", + "Epoch 13/20 Iteration 2173/3560 Training loss: 1.3991 0.0698 sec/batch\n", + "Epoch 13/20 Iteration 2174/3560 Training loss: 1.3977 0.0611 sec/batch\n", + "Epoch 13/20 Iteration 2175/3560 Training loss: 1.3962 0.0619 sec/batch\n", + "Epoch 13/20 Iteration 2176/3560 Training loss: 1.3958 0.0630 sec/batch\n", + "Epoch 13/20 Iteration 2177/3560 Training loss: 1.3953 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2178/3560 Training loss: 1.3960 0.0641 sec/batch\n", + 
"Epoch 13/20 Iteration 2179/3560 Training loss: 1.3954 0.0632 sec/batch\n", + "Epoch 13/20 Iteration 2180/3560 Training loss: 1.3949 0.0635 sec/batch\n", + "Epoch 13/20 Iteration 2181/3560 Training loss: 1.3951 0.0627 sec/batch\n", + "Epoch 13/20 Iteration 2182/3560 Training loss: 1.3941 0.0695 sec/batch\n", + "Epoch 13/20 Iteration 2183/3560 Training loss: 1.3936 0.0679 sec/batch\n", + "Epoch 13/20 Iteration 2184/3560 Training loss: 1.3931 0.0654 sec/batch\n", + "Epoch 13/20 Iteration 2185/3560 Training loss: 1.3928 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2186/3560 Training loss: 1.3931 0.0687 sec/batch\n", + "Epoch 13/20 Iteration 2187/3560 Training loss: 1.3924 0.0693 sec/batch\n", + "Epoch 13/20 Iteration 2188/3560 Training loss: 1.3930 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2189/3560 Training loss: 1.3929 0.0633 sec/batch\n", + "Epoch 13/20 Iteration 2190/3560 Training loss: 1.3932 0.0633 sec/batch\n", + "Epoch 13/20 Iteration 2191/3560 Training loss: 1.3931 0.0686 sec/batch\n", + "Epoch 13/20 Iteration 2192/3560 Training loss: 1.3932 0.0633 sec/batch\n", + "Epoch 13/20 Iteration 2193/3560 Training loss: 1.3936 0.0726 sec/batch\n", + "Epoch 13/20 Iteration 2194/3560 Training loss: 1.3933 0.0634 sec/batch\n", + "Epoch 13/20 Iteration 2195/3560 Training loss: 1.3928 0.0673 sec/batch\n", + "Epoch 13/20 Iteration 2196/3560 Training loss: 1.3934 0.0708 sec/batch\n", + "Epoch 13/20 Iteration 2197/3560 Training loss: 1.3934 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2198/3560 Training loss: 1.3943 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2199/3560 Training loss: 1.3945 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2200/3560 Training loss: 1.3946 0.0646 sec/batch\n", + "Epoch 13/20 Iteration 2201/3560 Training loss: 1.3944 0.0620 sec/batch\n", + "Epoch 13/20 Iteration 2202/3560 Training loss: 1.3945 0.0628 sec/batch\n", + "Epoch 13/20 Iteration 2203/3560 Training loss: 1.3946 0.0619 sec/batch\n", + "Epoch 13/20 Iteration 2204/3560 Training loss: 
1.3944 0.0625 sec/batch\n", + "Epoch 13/20 Iteration 2205/3560 Training loss: 1.3943 0.0649 sec/batch\n", + "Epoch 13/20 Iteration 2206/3560 Training loss: 1.3943 0.0715 sec/batch\n", + "Epoch 13/20 Iteration 2207/3560 Training loss: 1.3948 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2208/3560 Training loss: 1.3952 0.0670 sec/batch\n", + "Epoch 13/20 Iteration 2209/3560 Training loss: 1.3957 0.0672 sec/batch\n", + "Epoch 13/20 Iteration 2210/3560 Training loss: 1.3953 0.0642 sec/batch\n", + "Epoch 13/20 Iteration 2211/3560 Training loss: 1.3951 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2212/3560 Training loss: 1.3952 0.0648 sec/batch\n", + "Epoch 13/20 Iteration 2213/3560 Training loss: 1.3951 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2214/3560 Training loss: 1.3952 0.0631 sec/batch\n", + "Epoch 13/20 Iteration 2215/3560 Training loss: 1.3945 0.0607 sec/batch\n", + "Epoch 13/20 Iteration 2216/3560 Training loss: 1.3946 0.0665 sec/batch\n", + "Epoch 13/20 Iteration 2217/3560 Training loss: 1.3941 0.0652 sec/batch\n", + "Epoch 13/20 Iteration 2218/3560 Training loss: 1.3940 0.0645 sec/batch\n", + "Epoch 13/20 Iteration 2219/3560 Training loss: 1.3935 0.0647 sec/batch\n", + "Epoch 13/20 Iteration 2220/3560 Training loss: 1.3934 0.0646 sec/batch\n", + "Epoch 13/20 Iteration 2221/3560 Training loss: 1.3931 0.0630 sec/batch\n", + "Epoch 13/20 Iteration 2222/3560 Training loss: 1.3929 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2223/3560 Training loss: 1.3927 0.0674 sec/batch\n", + "Epoch 13/20 Iteration 2224/3560 Training loss: 1.3925 0.0730 sec/batch\n", + "Epoch 13/20 Iteration 2225/3560 Training loss: 1.3921 0.0651 sec/batch\n", + "Epoch 13/20 Iteration 2226/3560 Training loss: 1.3921 0.0665 sec/batch\n", + "Epoch 13/20 Iteration 2227/3560 Training loss: 1.3919 0.0654 sec/batch\n", + "Epoch 13/20 Iteration 2228/3560 Training loss: 1.3918 0.0618 sec/batch\n", + "Epoch 13/20 Iteration 2229/3560 Training loss: 1.3914 0.0640 sec/batch\n", + "Epoch 13/20 
Iteration 2230/3560 Training loss: 1.3911 0.0613 sec/batch\n", + "Epoch 13/20 Iteration 2231/3560 Training loss: 1.3908 0.0675 sec/batch\n", + "Epoch 13/20 Iteration 2232/3560 Training loss: 1.3909 0.0630 sec/batch\n", + "Epoch 13/20 Iteration 2233/3560 Training loss: 1.3909 0.0635 sec/batch\n", + "Epoch 13/20 Iteration 2234/3560 Training loss: 1.3905 0.0650 sec/batch\n", + "Epoch 13/20 Iteration 2235/3560 Training loss: 1.3901 0.0660 sec/batch\n", + "Epoch 13/20 Iteration 2236/3560 Training loss: 1.3898 0.0689 sec/batch\n", + "Epoch 13/20 Iteration 2237/3560 Training loss: 1.3897 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2238/3560 Training loss: 1.3895 0.0647 sec/batch\n", + "Epoch 13/20 Iteration 2239/3560 Training loss: 1.3894 0.0662 sec/batch\n", + "Epoch 13/20 Iteration 2240/3560 Training loss: 1.3892 0.0708 sec/batch\n", + "Epoch 13/20 Iteration 2241/3560 Training loss: 1.3891 0.0661 sec/batch\n", + "Epoch 13/20 Iteration 2242/3560 Training loss: 1.3890 0.0694 sec/batch\n", + "Epoch 13/20 Iteration 2243/3560 Training loss: 1.3890 0.0603 sec/batch\n", + "Epoch 13/20 Iteration 2244/3560 Training loss: 1.3890 0.0599 sec/batch\n", + "Epoch 13/20 Iteration 2245/3560 Training loss: 1.3889 0.0627 sec/batch\n", + "Epoch 13/20 Iteration 2246/3560 Training loss: 1.3889 0.0626 sec/batch\n", + "Epoch 13/20 Iteration 2247/3560 Training loss: 1.3888 0.0646 sec/batch\n", + "Epoch 13/20 Iteration 2248/3560 Training loss: 1.3887 0.0659 sec/batch\n", + "Epoch 13/20 Iteration 2249/3560 Training loss: 1.3886 0.0682 sec/batch\n", + "Epoch 13/20 Iteration 2250/3560 Training loss: 1.3885 0.0615 sec/batch\n", + "Epoch 13/20 Iteration 2251/3560 Training loss: 1.3882 0.0609 sec/batch\n", + "Epoch 13/20 Iteration 2252/3560 Training loss: 1.3878 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2253/3560 Training loss: 1.3877 0.0665 sec/batch\n", + "Epoch 13/20 Iteration 2254/3560 Training loss: 1.3877 0.0743 sec/batch\n", + "Epoch 13/20 Iteration 2255/3560 Training loss: 1.3876 0.0711 
sec/batch\n", + "Epoch 13/20 Iteration 2256/3560 Training loss: 1.3875 0.0667 sec/batch\n", + "Epoch 13/20 Iteration 2257/3560 Training loss: 1.3874 0.0708 sec/batch\n", + "Epoch 13/20 Iteration 2258/3560 Training loss: 1.3871 0.0635 sec/batch\n", + "Epoch 13/20 Iteration 2259/3560 Training loss: 1.3867 0.0624 sec/batch\n", + "Epoch 13/20 Iteration 2260/3560 Training loss: 1.3867 0.0632 sec/batch\n", + "Epoch 13/20 Iteration 2261/3560 Training loss: 1.3866 0.0742 sec/batch\n", + "Epoch 13/20 Iteration 2262/3560 Training loss: 1.3863 0.0616 sec/batch\n", + "Epoch 13/20 Iteration 2263/3560 Training loss: 1.3864 0.0607 sec/batch\n", + "Epoch 13/20 Iteration 2264/3560 Training loss: 1.3863 0.0647 sec/batch\n", + "Epoch 13/20 Iteration 2265/3560 Training loss: 1.3861 0.0762 sec/batch\n", + "Epoch 13/20 Iteration 2266/3560 Training loss: 1.3858 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2267/3560 Training loss: 1.3853 0.0630 sec/batch\n", + "Epoch 13/20 Iteration 2268/3560 Training loss: 1.3851 0.0612 sec/batch\n", + "Epoch 13/20 Iteration 2269/3560 Training loss: 1.3852 0.0647 sec/batch\n", + "Epoch 13/20 Iteration 2270/3560 Training loss: 1.3852 0.0675 sec/batch\n", + "Epoch 13/20 Iteration 2271/3560 Training loss: 1.3853 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2272/3560 Training loss: 1.3854 0.0677 sec/batch\n", + "Epoch 13/20 Iteration 2273/3560 Training loss: 1.3854 0.0656 sec/batch\n", + "Epoch 13/20 Iteration 2274/3560 Training loss: 1.3854 0.0762 sec/batch\n", + "Epoch 13/20 Iteration 2275/3560 Training loss: 1.3855 0.0671 sec/batch\n", + "Epoch 13/20 Iteration 2276/3560 Training loss: 1.3854 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2277/3560 Training loss: 1.3857 0.0637 sec/batch\n", + "Epoch 13/20 Iteration 2278/3560 Training loss: 1.3857 0.0632 sec/batch\n", + "Epoch 13/20 Iteration 2279/3560 Training loss: 1.3856 0.0652 sec/batch\n", + "Epoch 13/20 Iteration 2280/3560 Training loss: 1.3859 0.0671 sec/batch\n", + "Epoch 13/20 Iteration 2281/3560 
Training loss: 1.3857 0.0661 sec/batch\n", + "Epoch 13/20 Iteration 2282/3560 Training loss: 1.3859 0.0606 sec/batch\n", + "Epoch 13/20 Iteration 2283/3560 Training loss: 1.3859 0.0641 sec/batch\n", + "Epoch 13/20 Iteration 2284/3560 Training loss: 1.3861 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2285/3560 Training loss: 1.3863 0.0648 sec/batch\n", + "Epoch 13/20 Iteration 2286/3560 Training loss: 1.3861 0.0716 sec/batch\n", + "Epoch 13/20 Iteration 2287/3560 Training loss: 1.3858 0.0641 sec/batch\n", + "Epoch 13/20 Iteration 2288/3560 Training loss: 1.3857 0.0670 sec/batch\n", + "Epoch 13/20 Iteration 2289/3560 Training loss: 1.3858 0.0660 sec/batch\n", + "Epoch 13/20 Iteration 2290/3560 Training loss: 1.3858 0.0660 sec/batch\n", + "Epoch 13/20 Iteration 2291/3560 Training loss: 1.3858 0.0622 sec/batch\n", + "Epoch 13/20 Iteration 2292/3560 Training loss: 1.3857 0.0637 sec/batch\n", + "Epoch 13/20 Iteration 2293/3560 Training loss: 1.3858 0.0656 sec/batch\n", + "Epoch 13/20 Iteration 2294/3560 Training loss: 1.3857 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2295/3560 Training loss: 1.3854 0.0653 sec/batch\n", + "Epoch 13/20 Iteration 2296/3560 Training loss: 1.3855 0.0669 sec/batch\n", + "Epoch 13/20 Iteration 2297/3560 Training loss: 1.3856 0.0658 sec/batch\n", + "Epoch 13/20 Iteration 2298/3560 Training loss: 1.3855 0.0676 sec/batch\n", + "Epoch 13/20 Iteration 2299/3560 Training loss: 1.3855 0.0655 sec/batch\n", + "Epoch 13/20 Iteration 2300/3560 Training loss: 1.3855 0.0628 sec/batch\n", + "Epoch 13/20 Iteration 2301/3560 Training loss: 1.3855 0.0616 sec/batch\n", + "Epoch 13/20 Iteration 2302/3560 Training loss: 1.3854 0.0630 sec/batch\n", + "Epoch 13/20 Iteration 2303/3560 Training loss: 1.3855 0.0755 sec/batch\n", + "Epoch 13/20 Iteration 2304/3560 Training loss: 1.3858 0.0674 sec/batch\n", + "Epoch 13/20 Iteration 2305/3560 Training loss: 1.3858 0.0690 sec/batch\n", + "Epoch 13/20 Iteration 2306/3560 Training loss: 1.3858 0.0633 sec/batch\n", + 
"Epoch 13/20 Iteration 2307/3560 Training loss: 1.3858 0.0635 sec/batch\n", + "Epoch 13/20 Iteration 2308/3560 Training loss: 1.3856 0.0625 sec/batch\n", + "Epoch 13/20 Iteration 2309/3560 Training loss: 1.3857 0.0593 sec/batch\n", + "Epoch 13/20 Iteration 2310/3560 Training loss: 1.3858 0.0640 sec/batch\n", + "Epoch 13/20 Iteration 2311/3560 Training loss: 1.3859 0.0726 sec/batch\n", + "Epoch 13/20 Iteration 2312/3560 Training loss: 1.3857 0.0646 sec/batch\n", + "Epoch 13/20 Iteration 2313/3560 Training loss: 1.3856 0.0712 sec/batch\n", + "Epoch 13/20 Iteration 2314/3560 Training loss: 1.3858 0.0723 sec/batch\n", + "Epoch 14/20 Iteration 2315/3560 Training loss: 1.4987 0.0643 sec/batch\n", + "Epoch 14/20 Iteration 2316/3560 Training loss: 1.4418 0.0637 sec/batch\n", + "Epoch 14/20 Iteration 2317/3560 Training loss: 1.4210 0.0681 sec/batch\n", + "Epoch 14/20 Iteration 2318/3560 Training loss: 1.4152 0.0660 sec/batch\n", + "Epoch 14/20 Iteration 2319/3560 Training loss: 1.4022 0.0631 sec/batch\n", + "Epoch 14/20 Iteration 2320/3560 Training loss: 1.3894 0.0687 sec/batch\n", + "Epoch 14/20 Iteration 2321/3560 Training loss: 1.3879 0.0660 sec/batch\n", + "Epoch 14/20 Iteration 2322/3560 Training loss: 1.3845 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2323/3560 Training loss: 1.3829 0.0695 sec/batch\n", + "Epoch 14/20 Iteration 2324/3560 Training loss: 1.3815 0.0626 sec/batch\n", + "Epoch 14/20 Iteration 2325/3560 Training loss: 1.3782 0.0705 sec/batch\n", + "Epoch 14/20 Iteration 2326/3560 Training loss: 1.3780 0.0615 sec/batch\n", + "Epoch 14/20 Iteration 2327/3560 Training loss: 1.3783 0.0646 sec/batch\n", + "Epoch 14/20 Iteration 2328/3560 Training loss: 1.3790 0.0614 sec/batch\n", + "Epoch 14/20 Iteration 2329/3560 Training loss: 1.3785 0.0685 sec/batch\n", + "Epoch 14/20 Iteration 2330/3560 Training loss: 1.3771 0.0636 sec/batch\n", + "Epoch 14/20 Iteration 2331/3560 Training loss: 1.3767 0.0760 sec/batch\n", + "Epoch 14/20 Iteration 2332/3560 Training loss: 
1.3780 0.0685 sec/batch\n", + "Epoch 14/20 Iteration 2333/3560 Training loss: 1.3784 0.0667 sec/batch\n", + "Epoch 14/20 Iteration 2334/3560 Training loss: 1.3794 0.0665 sec/batch\n", + "Epoch 14/20 Iteration 2335/3560 Training loss: 1.3786 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2336/3560 Training loss: 1.3787 0.0619 sec/batch\n", + "Epoch 14/20 Iteration 2337/3560 Training loss: 1.3777 0.0645 sec/batch\n", + "Epoch 14/20 Iteration 2338/3560 Training loss: 1.3776 0.0690 sec/batch\n", + "Epoch 14/20 Iteration 2339/3560 Training loss: 1.3778 0.0635 sec/batch\n", + "Epoch 14/20 Iteration 2340/3560 Training loss: 1.3758 0.0630 sec/batch\n", + "Epoch 14/20 Iteration 2341/3560 Training loss: 1.3744 0.0606 sec/batch\n", + "Epoch 14/20 Iteration 2342/3560 Training loss: 1.3749 0.0619 sec/batch\n", + "Epoch 14/20 Iteration 2343/3560 Training loss: 1.3754 0.0642 sec/batch\n", + "Epoch 14/20 Iteration 2344/3560 Training loss: 1.3756 0.0684 sec/batch\n", + "Epoch 14/20 Iteration 2345/3560 Training loss: 1.3749 0.0626 sec/batch\n", + "Epoch 14/20 Iteration 2346/3560 Training loss: 1.3739 0.0616 sec/batch\n", + "Epoch 14/20 Iteration 2347/3560 Training loss: 1.3743 0.0597 sec/batch\n", + "Epoch 14/20 Iteration 2348/3560 Training loss: 1.3748 0.0638 sec/batch\n", + "Epoch 14/20 Iteration 2349/3560 Training loss: 1.3744 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2350/3560 Training loss: 1.3742 0.0694 sec/batch\n", + "Epoch 14/20 Iteration 2351/3560 Training loss: 1.3738 0.0725 sec/batch\n", + "Epoch 14/20 Iteration 2352/3560 Training loss: 1.3728 0.0642 sec/batch\n", + "Epoch 14/20 Iteration 2353/3560 Training loss: 1.3715 0.0626 sec/batch\n", + "Epoch 14/20 Iteration 2354/3560 Training loss: 1.3714 0.0714 sec/batch\n", + "Epoch 14/20 Iteration 2355/3560 Training loss: 1.3708 0.0661 sec/batch\n", + "Epoch 14/20 Iteration 2356/3560 Training loss: 1.3717 0.0654 sec/batch\n", + "Epoch 14/20 Iteration 2357/3560 Training loss: 1.3713 0.0645 sec/batch\n", + "Epoch 14/20 
Iteration 2358/3560 Training loss: 1.3708 0.0606 sec/batch\n", + "Epoch 14/20 Iteration 2359/3560 Training loss: 1.3711 0.0604 sec/batch\n", + "Epoch 14/20 Iteration 2360/3560 Training loss: 1.3701 0.0617 sec/batch\n", + "Epoch 14/20 Iteration 2361/3560 Training loss: 1.3697 0.0679 sec/batch\n", + "Epoch 14/20 Iteration 2362/3560 Training loss: 1.3692 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2363/3560 Training loss: 1.3692 0.0668 sec/batch\n", + "Epoch 14/20 Iteration 2364/3560 Training loss: 1.3696 0.0741 sec/batch\n", + "Epoch 14/20 Iteration 2365/3560 Training loss: 1.3690 0.0619 sec/batch\n", + "Epoch 14/20 Iteration 2366/3560 Training loss: 1.3698 0.0722 sec/batch\n", + "Epoch 14/20 Iteration 2367/3560 Training loss: 1.3696 0.0665 sec/batch\n", + "Epoch 14/20 Iteration 2368/3560 Training loss: 1.3699 0.0699 sec/batch\n", + "Epoch 14/20 Iteration 2369/3560 Training loss: 1.3695 0.0656 sec/batch\n", + "Epoch 14/20 Iteration 2370/3560 Training loss: 1.3696 0.0610 sec/batch\n", + "Epoch 14/20 Iteration 2371/3560 Training loss: 1.3700 0.0615 sec/batch\n", + "Epoch 14/20 Iteration 2372/3560 Training loss: 1.3696 0.0617 sec/batch\n", + "Epoch 14/20 Iteration 2373/3560 Training loss: 1.3691 0.0626 sec/batch\n", + "Epoch 14/20 Iteration 2374/3560 Training loss: 1.3697 0.0613 sec/batch\n", + "Epoch 14/20 Iteration 2375/3560 Training loss: 1.3696 0.0703 sec/batch\n", + "Epoch 14/20 Iteration 2376/3560 Training loss: 1.3705 0.0637 sec/batch\n", + "Epoch 14/20 Iteration 2377/3560 Training loss: 1.3708 0.0645 sec/batch\n", + "Epoch 14/20 Iteration 2378/3560 Training loss: 1.3710 0.0728 sec/batch\n", + "Epoch 14/20 Iteration 2379/3560 Training loss: 1.3708 0.0687 sec/batch\n", + "Epoch 14/20 Iteration 2380/3560 Training loss: 1.3711 0.0678 sec/batch\n", + "Epoch 14/20 Iteration 2381/3560 Training loss: 1.3713 0.0681 sec/batch\n", + "Epoch 14/20 Iteration 2382/3560 Training loss: 1.3710 0.0681 sec/batch\n", + "Epoch 14/20 Iteration 2383/3560 Training loss: 1.3710 0.0678 
sec/batch\n", + "Epoch 14/20 Iteration 2384/3560 Training loss: 1.3710 0.0664 sec/batch\n", + "Epoch 14/20 Iteration 2385/3560 Training loss: 1.3716 0.0644 sec/batch\n", + "Epoch 14/20 Iteration 2386/3560 Training loss: 1.3719 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2387/3560 Training loss: 1.3723 0.0656 sec/batch\n", + "Epoch 14/20 Iteration 2388/3560 Training loss: 1.3719 0.0630 sec/batch\n", + "Epoch 14/20 Iteration 2389/3560 Training loss: 1.3717 0.0639 sec/batch\n", + "Epoch 14/20 Iteration 2390/3560 Training loss: 1.3719 0.0655 sec/batch\n", + "Epoch 14/20 Iteration 2391/3560 Training loss: 1.3718 0.0631 sec/batch\n", + "Epoch 14/20 Iteration 2392/3560 Training loss: 1.3717 0.0718 sec/batch\n", + "Epoch 14/20 Iteration 2393/3560 Training loss: 1.3712 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2394/3560 Training loss: 1.3712 0.0724 sec/batch\n", + "Epoch 14/20 Iteration 2395/3560 Training loss: 1.3708 0.0666 sec/batch\n", + "Epoch 14/20 Iteration 2396/3560 Training loss: 1.3708 0.0678 sec/batch\n", + "Epoch 14/20 Iteration 2397/3560 Training loss: 1.3704 0.0691 sec/batch\n", + "Epoch 14/20 Iteration 2398/3560 Training loss: 1.3705 0.0693 sec/batch\n", + "Epoch 14/20 Iteration 2399/3560 Training loss: 1.3702 0.0671 sec/batch\n", + "Epoch 14/20 Iteration 2400/3560 Training loss: 1.3700 0.0678 sec/batch\n", + "Epoch 14/20 Iteration 2401/3560 Training loss: 1.3698 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2402/3560 Training loss: 1.3696 0.0630 sec/batch\n", + "Epoch 14/20 Iteration 2403/3560 Training loss: 1.3692 0.0628 sec/batch\n", + "Epoch 14/20 Iteration 2404/3560 Training loss: 1.3693 0.0626 sec/batch\n", + "Epoch 14/20 Iteration 2405/3560 Training loss: 1.3691 0.0640 sec/batch\n", + "Epoch 14/20 Iteration 2406/3560 Training loss: 1.3690 0.0665 sec/batch\n", + "Epoch 14/20 Iteration 2407/3560 Training loss: 1.3685 0.0728 sec/batch\n", + "Epoch 14/20 Iteration 2408/3560 Training loss: 1.3681 0.0635 sec/batch\n", + "Epoch 14/20 Iteration 2409/3560 
Training loss: 1.3678 0.0632 sec/batch\n", + "Epoch 14/20 Iteration 2410/3560 Training loss: 1.3679 0.0668 sec/batch\n", + "Epoch 14/20 Iteration 2411/3560 Training loss: 1.3679 0.0635 sec/batch\n", + "Epoch 14/20 Iteration 2412/3560 Training loss: 1.3675 0.0627 sec/batch\n", + "Epoch 14/20 Iteration 2413/3560 Training loss: 1.3671 0.0636 sec/batch\n", + "Epoch 14/20 Iteration 2414/3560 Training loss: 1.3667 0.0693 sec/batch\n", + "Epoch 14/20 Iteration 2415/3560 Training loss: 1.3666 0.0645 sec/batch\n", + "Epoch 14/20 Iteration 2416/3560 Training loss: 1.3665 0.0651 sec/batch\n", + "Epoch 14/20 Iteration 2417/3560 Training loss: 1.3664 0.0626 sec/batch\n", + "Epoch 14/20 Iteration 2418/3560 Training loss: 1.3663 0.0625 sec/batch\n", + "Epoch 14/20 Iteration 2419/3560 Training loss: 1.3662 0.0610 sec/batch\n", + "Epoch 14/20 Iteration 2420/3560 Training loss: 1.3661 0.0653 sec/batch\n", + "Epoch 14/20 Iteration 2421/3560 Training loss: 1.3660 0.0726 sec/batch\n", + "Epoch 14/20 Iteration 2422/3560 Training loss: 1.3660 0.0705 sec/batch\n", + "Epoch 14/20 Iteration 2423/3560 Training loss: 1.3658 0.0719 sec/batch\n", + "Epoch 14/20 Iteration 2424/3560 Training loss: 1.3660 0.0622 sec/batch\n", + "Epoch 14/20 Iteration 2425/3560 Training loss: 1.3658 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2426/3560 Training loss: 1.3657 0.0711 sec/batch\n", + "Epoch 14/20 Iteration 2427/3560 Training loss: 1.3655 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2428/3560 Training loss: 1.3654 0.0662 sec/batch\n", + "Epoch 14/20 Iteration 2429/3560 Training loss: 1.3651 0.0606 sec/batch\n", + "Epoch 14/20 Iteration 2430/3560 Training loss: 1.3648 0.0652 sec/batch\n", + "Epoch 14/20 Iteration 2431/3560 Training loss: 1.3648 0.0644 sec/batch\n", + "Epoch 14/20 Iteration 2432/3560 Training loss: 1.3647 0.0706 sec/batch\n", + "Epoch 14/20 Iteration 2433/3560 Training loss: 1.3646 0.0657 sec/batch\n", + "Epoch 14/20 Iteration 2434/3560 Training loss: 1.3646 0.0625 sec/batch\n", + 
"Epoch 14/20 Iteration 2435/3560 Training loss: 1.3645 0.0615 sec/batch\n", + "Epoch 14/20 Iteration 2436/3560 Training loss: 1.3641 0.0614 sec/batch\n", + "Epoch 14/20 Iteration 2437/3560 Training loss: 1.3636 0.0707 sec/batch\n", + "Epoch 14/20 Iteration 2438/3560 Training loss: 1.3636 0.0621 sec/batch\n", + "Epoch 14/20 Iteration 2439/3560 Training loss: 1.3634 0.0668 sec/batch\n", + "Epoch 14/20 Iteration 2440/3560 Training loss: 1.3630 0.0686 sec/batch\n", + "Epoch 14/20 Iteration 2441/3560 Training loss: 1.3631 0.0645 sec/batch\n", + "Epoch 14/20 Iteration 2442/3560 Training loss: 1.3631 0.0689 sec/batch\n", + "Epoch 14/20 Iteration 2443/3560 Training loss: 1.3628 0.0681 sec/batch\n", + "Epoch 14/20 Iteration 2444/3560 Training loss: 1.3625 0.0633 sec/batch\n", + "Epoch 14/20 Iteration 2445/3560 Training loss: 1.3620 0.0692 sec/batch\n", + "Epoch 14/20 Iteration 2446/3560 Training loss: 1.3618 0.0668 sec/batch\n", + "Epoch 14/20 Iteration 2447/3560 Training loss: 1.3619 0.0707 sec/batch\n", + "Epoch 14/20 Iteration 2448/3560 Training loss: 1.3619 0.0659 sec/batch\n", + "Epoch 14/20 Iteration 2449/3560 Training loss: 1.3619 0.0633 sec/batch\n", + "Epoch 14/20 Iteration 2450/3560 Training loss: 1.3619 0.0723 sec/batch\n", + "Epoch 14/20 Iteration 2451/3560 Training loss: 1.3620 0.0651 sec/batch\n", + "Epoch 14/20 Iteration 2452/3560 Training loss: 1.3621 0.0634 sec/batch\n", + "Epoch 14/20 Iteration 2453/3560 Training loss: 1.3620 0.0624 sec/batch\n", + "Epoch 14/20 Iteration 2454/3560 Training loss: 1.3619 0.0653 sec/batch\n", + "Epoch 14/20 Iteration 2455/3560 Training loss: 1.3623 0.0643 sec/batch\n", + "Epoch 14/20 Iteration 2456/3560 Training loss: 1.3623 0.0741 sec/batch\n", + "Epoch 14/20 Iteration 2457/3560 Training loss: 1.3622 0.0629 sec/batch\n", + "Epoch 14/20 Iteration 2458/3560 Training loss: 1.3624 0.0609 sec/batch\n", + "Epoch 14/20 Iteration 2459/3560 Training loss: 1.3623 0.0698 sec/batch\n", + "Epoch 14/20 Iteration 2460/3560 Training loss: 
1.3624 0.0654 sec/batch\n", + "Epoch 14/20 Iteration 2461/3560 Training loss: 1.3624 0.0643 sec/batch\n", + "Epoch 14/20 Iteration 2462/3560 Training loss: 1.3626 0.0631 sec/batch\n", + "Epoch 14/20 Iteration 2463/3560 Training loss: 1.3628 0.0644 sec/batch\n", + "Epoch 14/20 Iteration 2464/3560 Training loss: 1.3627 0.0684 sec/batch\n", + "Epoch 14/20 Iteration 2465/3560 Training loss: 1.3625 0.0604 sec/batch\n", + "Epoch 14/20 Iteration 2466/3560 Training loss: 1.3623 0.0632 sec/batch\n", + "Epoch 14/20 Iteration 2467/3560 Training loss: 1.3623 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2468/3560 Training loss: 1.3623 0.0677 sec/batch\n", + "Epoch 14/20 Iteration 2469/3560 Training loss: 1.3622 0.0615 sec/batch\n", + "Epoch 14/20 Iteration 2470/3560 Training loss: 1.3622 0.0746 sec/batch\n", + "Epoch 14/20 Iteration 2471/3560 Training loss: 1.3623 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2472/3560 Training loss: 1.3623 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2473/3560 Training loss: 1.3620 0.0642 sec/batch\n", + "Epoch 14/20 Iteration 2474/3560 Training loss: 1.3621 0.0674 sec/batch\n", + "Epoch 14/20 Iteration 2475/3560 Training loss: 1.3623 0.0646 sec/batch\n", + "Epoch 14/20 Iteration 2476/3560 Training loss: 1.3622 0.0660 sec/batch\n", + "Epoch 14/20 Iteration 2477/3560 Training loss: 1.3622 0.0654 sec/batch\n", + "Epoch 14/20 Iteration 2478/3560 Training loss: 1.3621 0.0616 sec/batch\n", + "Epoch 14/20 Iteration 2479/3560 Training loss: 1.3621 0.0635 sec/batch\n", + "Epoch 14/20 Iteration 2480/3560 Training loss: 1.3620 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2481/3560 Training loss: 1.3623 0.0641 sec/batch\n", + "Epoch 14/20 Iteration 2482/3560 Training loss: 1.3625 0.0725 sec/batch\n", + "Epoch 14/20 Iteration 2483/3560 Training loss: 1.3626 0.0634 sec/batch\n", + "Epoch 14/20 Iteration 2484/3560 Training loss: 1.3626 0.0687 sec/batch\n", + "Epoch 14/20 Iteration 2485/3560 Training loss: 1.3625 0.0701 sec/batch\n", + "Epoch 14/20 
Iteration 2486/3560 Training loss: 1.3623 0.0642 sec/batch\n", + "Epoch 14/20 Iteration 2487/3560 Training loss: 1.3624 0.0728 sec/batch\n", + "Epoch 14/20 Iteration 2488/3560 Training loss: 1.3625 0.0670 sec/batch\n", + "Epoch 14/20 Iteration 2489/3560 Training loss: 1.3625 0.0778 sec/batch\n", + "Epoch 14/20 Iteration 2490/3560 Training loss: 1.3624 0.0645 sec/batch\n", + "Epoch 14/20 Iteration 2491/3560 Training loss: 1.3623 0.0661 sec/batch\n", + "Epoch 14/20 Iteration 2492/3560 Training loss: 1.3624 0.0663 sec/batch\n", + "Epoch 15/20 Iteration 2493/3560 Training loss: 1.4743 0.0666 sec/batch\n", + "Epoch 15/20 Iteration 2494/3560 Training loss: 1.4203 0.0612 sec/batch\n", + "Epoch 15/20 Iteration 2495/3560 Training loss: 1.4034 0.0672 sec/batch\n", + "Epoch 15/20 Iteration 2496/3560 Training loss: 1.3948 0.0716 sec/batch\n", + "Epoch 15/20 Iteration 2497/3560 Training loss: 1.3826 0.0723 sec/batch\n", + "Epoch 15/20 Iteration 2498/3560 Training loss: 1.3717 0.0725 sec/batch\n", + "Epoch 15/20 Iteration 2499/3560 Training loss: 1.3700 0.0678 sec/batch\n", + "Epoch 15/20 Iteration 2500/3560 Training loss: 1.3669 0.0643 sec/batch\n", + "Epoch 15/20 Iteration 2501/3560 Training loss: 1.3655 0.0708 sec/batch\n", + "Epoch 15/20 Iteration 2502/3560 Training loss: 1.3646 0.0636 sec/batch\n", + "Epoch 15/20 Iteration 2503/3560 Training loss: 1.3607 0.0699 sec/batch\n", + "Epoch 15/20 Iteration 2504/3560 Training loss: 1.3602 0.0618 sec/batch\n", + "Epoch 15/20 Iteration 2505/3560 Training loss: 1.3603 0.0728 sec/batch\n", + "Epoch 15/20 Iteration 2506/3560 Training loss: 1.3608 0.0744 sec/batch\n", + "Epoch 15/20 Iteration 2507/3560 Training loss: 1.3596 0.0663 sec/batch\n", + "Epoch 15/20 Iteration 2508/3560 Training loss: 1.3576 0.0633 sec/batch\n", + "Epoch 15/20 Iteration 2509/3560 Training loss: 1.3572 0.0732 sec/batch\n", + "Epoch 15/20 Iteration 2510/3560 Training loss: 1.3582 0.0637 sec/batch\n", + "Epoch 15/20 Iteration 2511/3560 Training loss: 1.3579 0.0663 
sec/batch\n", + "Epoch 15/20 Iteration 2512/3560 Training loss: 1.3589 0.0813 sec/batch\n", + "Epoch 15/20 Iteration 2513/3560 Training loss: 1.3579 0.0666 sec/batch\n", + "Epoch 15/20 Iteration 2514/3560 Training loss: 1.3576 0.0680 sec/batch\n", + "Epoch 15/20 Iteration 2515/3560 Training loss: 1.3571 0.0668 sec/batch\n", + "Epoch 15/20 Iteration 2516/3560 Training loss: 1.3568 0.0766 sec/batch\n", + "Epoch 15/20 Iteration 2517/3560 Training loss: 1.3573 0.0737 sec/batch\n", + "Epoch 15/20 Iteration 2518/3560 Training loss: 1.3550 0.0696 sec/batch\n", + "Epoch 15/20 Iteration 2519/3560 Training loss: 1.3537 0.0673 sec/batch\n", + "Epoch 15/20 Iteration 2520/3560 Training loss: 1.3544 0.0729 sec/batch\n", + "Epoch 15/20 Iteration 2521/3560 Training loss: 1.3546 0.0730 sec/batch\n", + "Epoch 15/20 Iteration 2522/3560 Training loss: 1.3547 0.0645 sec/batch\n", + "Epoch 15/20 Iteration 2523/3560 Training loss: 1.3541 0.0746 sec/batch\n", + "Epoch 15/20 Iteration 2524/3560 Training loss: 1.3528 0.0708 sec/batch\n", + "Epoch 15/20 Iteration 2525/3560 Training loss: 1.3530 0.0668 sec/batch\n", + "Epoch 15/20 Iteration 2526/3560 Training loss: 1.3533 0.0671 sec/batch\n", + "Epoch 15/20 Iteration 2527/3560 Training loss: 1.3527 0.0735 sec/batch\n", + "Epoch 15/20 Iteration 2528/3560 Training loss: 1.3525 0.0625 sec/batch\n", + "Epoch 15/20 Iteration 2529/3560 Training loss: 1.3520 0.0699 sec/batch\n", + "Epoch 15/20 Iteration 2530/3560 Training loss: 1.3508 0.0682 sec/batch\n", + "Epoch 15/20 Iteration 2531/3560 Training loss: 1.3495 0.0667 sec/batch\n", + "Epoch 15/20 Iteration 2532/3560 Training loss: 1.3492 0.0705 sec/batch\n", + "Epoch 15/20 Iteration 2533/3560 Training loss: 1.3485 0.0704 sec/batch\n", + "Epoch 15/20 Iteration 2534/3560 Training loss: 1.3492 0.0656 sec/batch\n", + "Epoch 15/20 Iteration 2535/3560 Training loss: 1.3489 0.0669 sec/batch\n", + "Epoch 15/20 Iteration 2536/3560 Training loss: 1.3484 0.0678 sec/batch\n", + "Epoch 15/20 Iteration 2537/3560 
Training loss: 1.3486 0.0652 sec/batch\n", + "Epoch 15/20 Iteration 2538/3560 Training loss: 1.3479 0.0680 sec/batch\n", + "Epoch 15/20 Iteration 2539/3560 Training loss: 1.3474 0.0678 sec/batch\n", + "Epoch 15/20 Iteration 2540/3560 Training loss: 1.3469 0.0647 sec/batch\n", + "Epoch 15/20 Iteration 2541/3560 Training loss: 1.3468 0.0637 sec/batch\n", + "Epoch 15/20 Iteration 2542/3560 Training loss: 1.3472 0.0669 sec/batch\n", + "Epoch 15/20 Iteration 2543/3560 Training loss: 1.3465 0.0670 sec/batch\n", + "Epoch 15/20 Iteration 2544/3560 Training loss: 1.3471 0.0676 sec/batch\n", + "Epoch 15/20 Iteration 2545/3560 Training loss: 1.3470 0.0715 sec/batch\n", + "Epoch 15/20 Iteration 2546/3560 Training loss: 1.3471 0.0694 sec/batch\n", + "Epoch 15/20 Iteration 2547/3560 Training loss: 1.3470 0.0616 sec/batch\n", + "Epoch 15/20 Iteration 2548/3560 Training loss: 1.3470 0.0645 sec/batch\n", + "Epoch 15/20 Iteration 2549/3560 Training loss: 1.3473 0.0630 sec/batch\n", + "Epoch 15/20 Iteration 2550/3560 Training loss: 1.3470 0.0799 sec/batch\n", + "Epoch 15/20 Iteration 2551/3560 Training loss: 1.3464 0.0757 sec/batch\n", + "Epoch 15/20 Iteration 2552/3560 Training loss: 1.3469 0.0711 sec/batch\n", + "Epoch 15/20 Iteration 2553/3560 Training loss: 1.3468 0.0657 sec/batch\n", + "Epoch 15/20 Iteration 2554/3560 Training loss: 1.3476 0.0641 sec/batch\n", + "Epoch 15/20 Iteration 2555/3560 Training loss: 1.3478 0.0724 sec/batch\n", + "Epoch 15/20 Iteration 2556/3560 Training loss: 1.3480 0.0637 sec/batch\n", + "Epoch 15/20 Iteration 2557/3560 Training loss: 1.3479 0.0668 sec/batch\n", + "Epoch 15/20 Iteration 2558/3560 Training loss: 1.3480 0.0677 sec/batch\n", + "Epoch 15/20 Iteration 2559/3560 Training loss: 1.3482 0.0759 sec/batch\n", + "Epoch 15/20 Iteration 2560/3560 Training loss: 1.3479 0.0654 sec/batch\n", + "Epoch 15/20 Iteration 2561/3560 Training loss: 1.3479 0.0631 sec/batch\n", + "Epoch 15/20 Iteration 2562/3560 Training loss: 1.3478 0.0621 sec/batch\n", + 
"Epoch 15/20 Iteration 2563/3560 Training loss: 1.3484 0.0647 sec/batch\n", + "Epoch 15/20 Iteration 2564/3560 Training loss: 1.3487 0.0692 sec/batch\n", + "Epoch 15/20 Iteration 2565/3560 Training loss: 1.3492 0.0763 sec/batch\n", + "Epoch 15/20 Iteration 2566/3560 Training loss: 1.3488 0.0620 sec/batch\n", + "Epoch 15/20 Iteration 2567/3560 Training loss: 1.3487 0.0652 sec/batch\n", + "Epoch 15/20 Iteration 2568/3560 Training loss: 1.3488 0.0739 sec/batch\n", + "Epoch 15/20 Iteration 2569/3560 Training loss: 1.3486 0.0657 sec/batch\n", + "Epoch 15/20 Iteration 2570/3560 Training loss: 1.3486 0.0640 sec/batch\n", + "Epoch 15/20 Iteration 2571/3560 Training loss: 1.3479 0.0661 sec/batch\n", + "Epoch 15/20 Iteration 2572/3560 Training loss: 1.3479 0.0749 sec/batch\n", + "Epoch 15/20 Iteration 2573/3560 Training loss: 1.3475 0.0748 sec/batch\n", + "Epoch 15/20 Iteration 2574/3560 Training loss: 1.3475 0.0759 sec/batch\n", + "Epoch 15/20 Iteration 2575/3560 Training loss: 1.3471 0.0635 sec/batch\n", + "Epoch 15/20 Iteration 2576/3560 Training loss: 1.3470 0.0702 sec/batch\n", + "Epoch 15/20 Iteration 2577/3560 Training loss: 1.3467 0.0811 sec/batch\n", + "Epoch 15/20 Iteration 2578/3560 Training loss: 1.3464 0.0737 sec/batch\n", + "Epoch 15/20 Iteration 2579/3560 Training loss: 1.3462 0.0702 sec/batch\n", + "Epoch 15/20 Iteration 2580/3560 Training loss: 1.3459 0.0626 sec/batch\n", + "Epoch 15/20 Iteration 2581/3560 Training loss: 1.3455 0.0747 sec/batch\n", + "Epoch 15/20 Iteration 2582/3560 Training loss: 1.3454 0.0679 sec/batch\n", + "Epoch 15/20 Iteration 2583/3560 Training loss: 1.3452 0.0776 sec/batch\n", + "Epoch 15/20 Iteration 2584/3560 Training loss: 1.3452 0.0700 sec/batch\n", + "Epoch 15/20 Iteration 2585/3560 Training loss: 1.3448 0.0747 sec/batch\n", + "Epoch 15/20 Iteration 2586/3560 Training loss: 1.3446 0.0640 sec/batch\n", + "Epoch 15/20 Iteration 2587/3560 Training loss: 1.3442 0.0626 sec/batch\n", + "Epoch 15/20 Iteration 2588/3560 Training loss: 
1.3442 0.0720 sec/batch\n", + "Epoch 15/20 Iteration 2589/3560 Training loss: 1.3442 0.0628 sec/batch\n", + "Epoch 15/20 Iteration 2590/3560 Training loss: 1.3438 0.0649 sec/batch\n", + "Epoch 15/20 Iteration 2591/3560 Training loss: 1.3434 0.0635 sec/batch\n", + "Epoch 15/20 Iteration 2592/3560 Training loss: 1.3430 0.0607 sec/batch\n", + "Epoch 15/20 Iteration 2593/3560 Training loss: 1.3429 0.0639 sec/batch\n", + "Epoch 15/20 Iteration 2594/3560 Training loss: 1.3428 0.0620 sec/batch\n", + "Epoch 15/20 Iteration 2595/3560 Training loss: 1.3427 0.0667 sec/batch\n", + "Epoch 15/20 Iteration 2596/3560 Training loss: 1.3425 0.0669 sec/batch\n", + "Epoch 15/20 Iteration 2597/3560 Training loss: 1.3423 0.0617 sec/batch\n", + "Epoch 15/20 Iteration 2598/3560 Training loss: 1.3422 0.0708 sec/batch\n", + "Epoch 15/20 Iteration 2599/3560 Training loss: 1.3421 0.0678 sec/batch\n", + "Epoch 15/20 Iteration 2600/3560 Training loss: 1.3421 0.0716 sec/batch\n", + "Epoch 15/20 Iteration 2601/3560 Training loss: 1.3419 0.0650 sec/batch\n", + "Epoch 15/20 Iteration 2602/3560 Training loss: 1.3420 0.0639 sec/batch\n", + "Epoch 15/20 Iteration 2603/3560 Training loss: 1.3418 0.0640 sec/batch\n", + "Epoch 15/20 Iteration 2604/3560 Training loss: 1.3417 0.0663 sec/batch\n", + "Epoch 15/20 Iteration 2605/3560 Training loss: 1.3416 0.0672 sec/batch\n", + "Epoch 15/20 Iteration 2606/3560 Training loss: 1.3414 0.0648 sec/batch\n", + "Epoch 15/20 Iteration 2607/3560 Training loss: 1.3412 0.0653 sec/batch\n", + "Epoch 15/20 Iteration 2608/3560 Training loss: 1.3408 0.0642 sec/batch\n", + "Epoch 15/20 Iteration 2609/3560 Training loss: 1.3409 0.0729 sec/batch\n", + "Epoch 15/20 Iteration 2610/3560 Training loss: 1.3410 0.0754 sec/batch\n", + "Epoch 15/20 Iteration 2611/3560 Training loss: 1.3408 0.0666 sec/batch\n", + "Epoch 15/20 Iteration 2612/3560 Training loss: 1.3407 0.0621 sec/batch\n", + "Epoch 15/20 Iteration 2613/3560 Training loss: 1.3406 0.0753 sec/batch\n", + "Epoch 15/20 
Iteration 2614/3560 Training loss: 1.3402 0.0656 sec/batch\n", + "Epoch 15/20 Iteration 2615/3560 Training loss: 1.3398 0.0613 sec/batch\n", + "Epoch 15/20 Iteration 2616/3560 Training loss: 1.3398 0.0728 sec/batch\n", + "Epoch 15/20 Iteration 2617/3560 Training loss: 1.3397 0.0626 sec/batch\n", + "Epoch 15/20 Iteration 2618/3560 Training loss: 1.3394 0.0641 sec/batch\n", + "Epoch 15/20 Iteration 2619/3560 Training loss: 1.3395 0.0675 sec/batch\n", + "Epoch 15/20 Iteration 2620/3560 Training loss: 1.3394 0.0638 sec/batch\n", + "Epoch 15/20 Iteration 2621/3560 Training loss: 1.3392 0.0680 sec/batch\n", + "Epoch 15/20 Iteration 2622/3560 Training loss: 1.3389 0.0643 sec/batch\n", + "Epoch 15/20 Iteration 2623/3560 Training loss: 1.3385 0.0703 sec/batch\n", + "Epoch 15/20 Iteration 2624/3560 Training loss: 1.3383 0.0624 sec/batch\n", + "Epoch 15/20 Iteration 2625/3560 Training loss: 1.3384 0.0694 sec/batch\n", + "Epoch 15/20 Iteration 2626/3560 Training loss: 1.3384 0.0650 sec/batch\n", + "Epoch 15/20 Iteration 2627/3560 Training loss: 1.3385 0.0624 sec/batch\n", + "Epoch 15/20 Iteration 2628/3560 Training loss: 1.3386 0.0630 sec/batch\n", + "Epoch 15/20 Iteration 2629/3560 Training loss: 1.3388 0.0641 sec/batch\n", + "Epoch 15/20 Iteration 2630/3560 Training loss: 1.3389 0.0757 sec/batch\n", + "Epoch 15/20 Iteration 2631/3560 Training loss: 1.3389 0.0661 sec/batch\n", + "Epoch 15/20 Iteration 2632/3560 Training loss: 1.3388 0.0658 sec/batch\n", + "Epoch 15/20 Iteration 2633/3560 Training loss: 1.3392 0.0641 sec/batch\n", + "Epoch 15/20 Iteration 2634/3560 Training loss: 1.3392 0.0681 sec/batch\n", + "Epoch 15/20 Iteration 2635/3560 Training loss: 1.3390 0.0696 sec/batch\n", + "Epoch 15/20 Iteration 2636/3560 Training loss: 1.3392 0.0633 sec/batch\n", + "Epoch 15/20 Iteration 2637/3560 Training loss: 1.3391 0.0686 sec/batch\n", + "Epoch 15/20 Iteration 2638/3560 Training loss: 1.3393 0.0615 sec/batch\n", + "Epoch 15/20 Iteration 2639/3560 Training loss: 1.3393 0.0669 
sec/batch\n", + "Epoch 15/20 Iteration 2640/3560 Training loss: 1.3395 0.0615 sec/batch\n", + "Epoch 15/20 Iteration 2641/3560 Training loss: 1.3398 0.0709 sec/batch\n", + "Epoch 15/20 Iteration 2642/3560 Training loss: 1.3397 0.0765 sec/batch\n", + "Epoch 15/20 Iteration 2643/3560 Training loss: 1.3393 0.0652 sec/batch\n", + "Epoch 15/20 Iteration 2644/3560 Training loss: 1.3392 0.0638 sec/batch\n", + "Epoch 15/20 Iteration 2645/3560 Training loss: 1.3393 0.0656 sec/batch\n", + "Epoch 15/20 Iteration 2646/3560 Training loss: 1.3392 0.0660 sec/batch\n", + "Epoch 15/20 Iteration 2647/3560 Training loss: 1.3392 0.0686 sec/batch\n", + "Epoch 15/20 Iteration 2648/3560 Training loss: 1.3392 0.0764 sec/batch\n", + "Epoch 15/20 Iteration 2649/3560 Training loss: 1.3393 0.0694 sec/batch\n", + "Epoch 15/20 Iteration 2650/3560 Training loss: 1.3392 0.0725 sec/batch\n", + "Epoch 15/20 Iteration 2651/3560 Training loss: 1.3390 0.0745 sec/batch\n", + "Epoch 15/20 Iteration 2652/3560 Training loss: 1.3391 0.0649 sec/batch\n", + "Epoch 15/20 Iteration 2653/3560 Training loss: 1.3393 0.0630 sec/batch\n", + "Epoch 15/20 Iteration 2654/3560 Training loss: 1.3392 0.0689 sec/batch\n", + "Epoch 15/20 Iteration 2655/3560 Training loss: 1.3392 0.0678 sec/batch\n", + "Epoch 15/20 Iteration 2656/3560 Training loss: 1.3391 0.0714 sec/batch\n", + "Epoch 15/20 Iteration 2657/3560 Training loss: 1.3392 0.0663 sec/batch\n", + "Epoch 15/20 Iteration 2658/3560 Training loss: 1.3391 0.0648 sec/batch\n", + "Epoch 15/20 Iteration 2659/3560 Training loss: 1.3393 0.0694 sec/batch\n", + "Epoch 15/20 Iteration 2660/3560 Training loss: 1.3396 0.0607 sec/batch\n", + "Epoch 15/20 Iteration 2661/3560 Training loss: 1.3396 0.0659 sec/batch\n", + "Epoch 15/20 Iteration 2662/3560 Training loss: 1.3397 0.0709 sec/batch\n", + "Epoch 15/20 Iteration 2663/3560 Training loss: 1.3396 0.0707 sec/batch\n", + "Epoch 15/20 Iteration 2664/3560 Training loss: 1.3393 0.0737 sec/batch\n", + "Epoch 15/20 Iteration 2665/3560 
Training loss: 1.3395 0.0754 sec/batch\n", + "Epoch 15/20 Iteration 2666/3560 Training loss: 1.3395 0.0610 sec/batch\n", + "Epoch 15/20 Iteration 2667/3560 Training loss: 1.3395 0.0643 sec/batch\n", + "Epoch 15/20 Iteration 2668/3560 Training loss: 1.3394 0.0664 sec/batch\n", + "Epoch 15/20 Iteration 2669/3560 Training loss: 1.3393 0.0679 sec/batch\n", + "Epoch 15/20 Iteration 2670/3560 Training loss: 1.3394 0.0738 sec/batch\n", + "Epoch 16/20 Iteration 2671/3560 Training loss: 1.4544 0.0629 sec/batch\n", + "Epoch 16/20 Iteration 2672/3560 Training loss: 1.4031 0.0675 sec/batch\n", + "Epoch 16/20 Iteration 2673/3560 Training loss: 1.3828 0.0617 sec/batch\n", + "Epoch 16/20 Iteration 2674/3560 Training loss: 1.3734 0.0723 sec/batch\n", + "Epoch 16/20 Iteration 2675/3560 Training loss: 1.3611 0.0646 sec/batch\n", + "Epoch 16/20 Iteration 2676/3560 Training loss: 1.3485 0.0634 sec/batch\n", + "Epoch 16/20 Iteration 2677/3560 Training loss: 1.3466 0.0619 sec/batch\n", + "Epoch 16/20 Iteration 2678/3560 Training loss: 1.3429 0.0624 sec/batch\n", + "Epoch 16/20 Iteration 2679/3560 Training loss: 1.3423 0.0672 sec/batch\n", + "Epoch 16/20 Iteration 2680/3560 Training loss: 1.3399 0.0627 sec/batch\n", + "Epoch 16/20 Iteration 2681/3560 Training loss: 1.3368 0.0626 sec/batch\n", + "Epoch 16/20 Iteration 2682/3560 Training loss: 1.3367 0.0623 sec/batch\n", + "Epoch 16/20 Iteration 2683/3560 Training loss: 1.3368 0.0662 sec/batch\n", + "Epoch 16/20 Iteration 2684/3560 Training loss: 1.3375 0.0626 sec/batch\n", + "Epoch 16/20 Iteration 2685/3560 Training loss: 1.3357 0.0633 sec/batch\n", + "Epoch 16/20 Iteration 2686/3560 Training loss: 1.3338 0.0781 sec/batch\n", + "Epoch 16/20 Iteration 2687/3560 Training loss: 1.3339 0.0623 sec/batch\n", + "Epoch 16/20 Iteration 2688/3560 Training loss: 1.3354 0.0615 sec/batch\n", + "Epoch 16/20 Iteration 2689/3560 Training loss: 1.3354 0.0636 sec/batch\n", + "Epoch 16/20 Iteration 2690/3560 Training loss: 1.3363 0.0646 sec/batch\n", + 
"Epoch 16/20 Iteration 2691/3560 Training loss: 1.3358 0.0643 sec/batch\n", + "Epoch 16/20 Iteration 2692/3560 Training loss: 1.3357 0.0627 sec/batch\n", + "Epoch 16/20 Iteration 2693/3560 Training loss: 1.3348 0.0620 sec/batch\n", + "Epoch 16/20 Iteration 2694/3560 Training loss: 1.3348 0.0632 sec/batch\n", + "Epoch 16/20 Iteration 2695/3560 Training loss: 1.3346 0.0619 sec/batch\n", + "Epoch 16/20 Iteration 2696/3560 Training loss: 1.3326 0.0613 sec/batch\n", + "Epoch 16/20 Iteration 2697/3560 Training loss: 1.3311 0.0672 sec/batch\n", + "Epoch 16/20 Iteration 2698/3560 Training loss: 1.3317 0.0696 sec/batch\n", + "Epoch 16/20 Iteration 2699/3560 Training loss: 1.3317 0.0684 sec/batch\n", + "Epoch 16/20 Iteration 2700/3560 Training loss: 1.3320 0.0642 sec/batch\n", + "Epoch 16/20 Iteration 2701/3560 Training loss: 1.3313 0.0756 sec/batch\n", + "Epoch 16/20 Iteration 2702/3560 Training loss: 1.3303 0.0710 sec/batch\n", + "Epoch 16/20 Iteration 2703/3560 Training loss: 1.3303 0.0666 sec/batch\n", + "Epoch 16/20 Iteration 2704/3560 Training loss: 1.3307 0.0761 sec/batch\n", + "Epoch 16/20 Iteration 2705/3560 Training loss: 1.3305 0.0671 sec/batch\n", + "Epoch 16/20 Iteration 2706/3560 Training loss: 1.3302 0.0635 sec/batch\n", + "Epoch 16/20 Iteration 2707/3560 Training loss: 1.3297 0.0615 sec/batch\n", + "Epoch 16/20 Iteration 2708/3560 Training loss: 1.3287 0.0651 sec/batch\n", + "Epoch 16/20 Iteration 2709/3560 Training loss: 1.3274 0.0640 sec/batch\n", + "Epoch 16/20 Iteration 2710/3560 Training loss: 1.3271 0.0682 sec/batch\n", + "Epoch 16/20 Iteration 2711/3560 Training loss: 1.3266 0.0734 sec/batch\n", + "Epoch 16/20 Iteration 2712/3560 Training loss: 1.3275 0.0653 sec/batch\n", + "Epoch 16/20 Iteration 2713/3560 Training loss: 1.3271 0.0650 sec/batch\n", + "Epoch 16/20 Iteration 2714/3560 Training loss: 1.3266 0.0637 sec/batch\n", + "Epoch 16/20 Iteration 2715/3560 Training loss: 1.3271 0.0669 sec/batch\n", + "Epoch 16/20 Iteration 2716/3560 Training loss: 
1.3261 0.0660 sec/batch\n", + "Epoch 16/20 Iteration 2717/3560 Training loss: 1.3255 0.0695 sec/batch\n", + "Epoch 16/20 Iteration 2718/3560 Training loss: 1.3251 0.0614 sec/batch\n", + "Epoch 16/20 Iteration 2719/3560 Training loss: 1.3249 0.0624 sec/batch\n", + "Epoch 16/20 Iteration 2720/3560 Training loss: 1.3254 0.0664 sec/batch\n", + "Epoch 16/20 Iteration 2721/3560 Training loss: 1.3248 0.0671 sec/batch\n", + "Epoch 16/20 Iteration 2722/3560 Training loss: 1.3257 0.0624 sec/batch\n", + "Epoch 16/20 Iteration 2723/3560 Training loss: 1.3257 0.0631 sec/batch\n", + "Epoch 16/20 Iteration 2724/3560 Training loss: 1.3259 0.0619 sec/batch\n", + "Epoch 16/20 Iteration 2725/3560 Training loss: 1.3256 0.0638 sec/batch\n", + "Epoch 16/20 Iteration 2726/3560 Training loss: 1.3257 0.0608 sec/batch\n", + "Epoch 16/20 Iteration 2727/3560 Training loss: 1.3261 0.0644 sec/batch\n", + "Epoch 16/20 Iteration 2728/3560 Training loss: 1.3258 0.0652 sec/batch\n", + "Epoch 16/20 Iteration 2729/3560 Training loss: 1.3253 0.0697 sec/batch\n", + "Epoch 16/20 Iteration 2730/3560 Training loss: 1.3259 0.0710 sec/batch\n", + "Epoch 16/20 Iteration 2731/3560 Training loss: 1.3258 0.0742 sec/batch\n", + "Epoch 16/20 Iteration 2732/3560 Training loss: 1.3266 0.0750 sec/batch\n", + "Epoch 16/20 Iteration 2733/3560 Training loss: 1.3268 0.0680 sec/batch\n", + "Epoch 16/20 Iteration 2734/3560 Training loss: 1.3271 0.0697 sec/batch\n", + "Epoch 16/20 Iteration 2735/3560 Training loss: 1.3270 0.0676 sec/batch\n", + "Epoch 16/20 Iteration 2736/3560 Training loss: 1.3271 0.0795 sec/batch\n", + "Epoch 16/20 Iteration 2737/3560 Training loss: 1.3273 0.0713 sec/batch\n", + "Epoch 16/20 Iteration 2738/3560 Training loss: 1.3270 0.0677 sec/batch\n", + "Epoch 16/20 Iteration 2739/3560 Training loss: 1.3270 0.0674 sec/batch\n", + "Epoch 16/20 Iteration 2740/3560 Training loss: 1.3270 0.0728 sec/batch\n", + "Epoch 16/20 Iteration 2741/3560 Training loss: 1.3275 0.0627 sec/batch\n", + "Epoch 16/20 
Iteration 2742/3560 Training loss: 1.3278 0.0735 sec/batch\n", + "Epoch 16/20 Iteration 2743/3560 Training loss: 1.3283 0.0670 sec/batch\n", + "Epoch 16/20 Iteration 2744/3560 Training loss: 1.3280 0.0682 sec/batch\n", + "Epoch 16/20 Iteration 2745/3560 Training loss: 1.3279 0.0815 sec/batch\n", + "Epoch 16/20 Iteration 2746/3560 Training loss: 1.3281 0.0742 sec/batch\n", + "Epoch 16/20 Iteration 2747/3560 Training loss: 1.3279 0.0665 sec/batch\n", + "Epoch 16/20 Iteration 2748/3560 Training loss: 1.3279 0.0622 sec/batch\n", + "Epoch 16/20 Iteration 2749/3560 Training loss: 1.3272 0.0600 sec/batch\n", + "Epoch 16/20 Iteration 2750/3560 Training loss: 1.3271 0.0608 sec/batch\n", + "Epoch 16/20 Iteration 2751/3560 Training loss: 1.3267 0.0753 sec/batch\n", + "Epoch 16/20 Iteration 2752/3560 Training loss: 1.3267 0.0734 sec/batch\n", + "Epoch 16/20 Iteration 2753/3560 Training loss: 1.3262 0.0661 sec/batch\n", + "Epoch 16/20 Iteration 2754/3560 Training loss: 1.3262 0.0707 sec/batch\n", + "Epoch 16/20 Iteration 2755/3560 Training loss: 1.3259 0.0671 sec/batch\n", + "Epoch 16/20 Iteration 2756/3560 Training loss: 1.3257 0.0606 sec/batch\n", + "Epoch 16/20 Iteration 2757/3560 Training loss: 1.3254 0.0712 sec/batch\n", + "Epoch 16/20 Iteration 2758/3560 Training loss: 1.3252 0.0625 sec/batch\n", + "Epoch 16/20 Iteration 2759/3560 Training loss: 1.3249 0.0646 sec/batch\n", + "Epoch 16/20 Iteration 2760/3560 Training loss: 1.3250 0.0641 sec/batch\n", + "Epoch 16/20 Iteration 2761/3560 Training loss: 1.3248 0.0639 sec/batch\n", + "Epoch 16/20 Iteration 2762/3560 Training loss: 1.3249 0.0683 sec/batch\n", + "Epoch 16/20 Iteration 2763/3560 Training loss: 1.3246 0.0667 sec/batch\n", + "Epoch 16/20 Iteration 2764/3560 Training loss: 1.3242 0.0728 sec/batch\n", + "Epoch 16/20 Iteration 2765/3560 Training loss: 1.3240 0.0659 sec/batch\n", + "Epoch 16/20 Iteration 2766/3560 Training loss: 1.3241 0.0720 sec/batch\n", + "Epoch 16/20 Iteration 2767/3560 Training loss: 1.3240 0.0627 
sec/batch\n", + "Epoch 16/20 Iteration 2768/3560 Training loss: 1.3236 0.0777 sec/batch\n", + "Epoch 16/20 Iteration 2769/3560 Training loss: 1.3233 0.0696 sec/batch\n", + "Epoch 16/20 Iteration 2770/3560 Training loss: 1.3229 0.0676 sec/batch\n", + "Epoch 16/20 Iteration 2771/3560 Training loss: 1.3229 0.0638 sec/batch\n", + "Epoch 16/20 Iteration 2772/3560 Training loss: 1.3227 0.0656 sec/batch\n", + "Epoch 16/20 Iteration 2773/3560 Training loss: 1.3227 0.0608 sec/batch\n", + "Epoch 16/20 Iteration 2774/3560 Training loss: 1.3226 0.0653 sec/batch\n", + "Epoch 16/20 Iteration 2775/3560 Training loss: 1.3225 0.0655 sec/batch\n", + "Epoch 16/20 Iteration 2776/3560 Training loss: 1.3224 0.0657 sec/batch\n", + "Epoch 16/20 Iteration 2777/3560 Training loss: 1.3224 0.0709 sec/batch\n", + "Epoch 16/20 Iteration 2778/3560 Training loss: 1.3223 0.0652 sec/batch\n", + "Epoch 16/20 Iteration 2779/3560 Training loss: 1.3222 0.0604 sec/batch\n", + "Epoch 16/20 Iteration 2780/3560 Training loss: 1.3223 0.0696 sec/batch\n", + "Epoch 16/20 Iteration 2781/3560 Training loss: 1.3222 0.0650 sec/batch\n", + "Epoch 16/20 Iteration 2782/3560 Training loss: 1.3220 0.0637 sec/batch\n", + "Epoch 16/20 Iteration 2783/3560 Training loss: 1.3218 0.0774 sec/batch\n", + "Epoch 16/20 Iteration 2784/3560 Training loss: 1.3217 0.0614 sec/batch\n", + "Epoch 16/20 Iteration 2785/3560 Training loss: 1.3214 0.0632 sec/batch\n", + "Epoch 16/20 Iteration 2786/3560 Training loss: 1.3210 0.0626 sec/batch\n", + "Epoch 16/20 Iteration 2787/3560 Training loss: 1.3211 0.0616 sec/batch\n", + "Epoch 16/20 Iteration 2788/3560 Training loss: 1.3212 0.0743 sec/batch\n", + "Epoch 16/20 Iteration 2789/3560 Training loss: 1.3210 0.0635 sec/batch\n", + "Epoch 16/20 Iteration 2790/3560 Training loss: 1.3210 0.0765 sec/batch\n", + "Epoch 16/20 Iteration 2791/3560 Training loss: 1.3209 0.0650 sec/batch\n", + "Epoch 16/20 Iteration 2792/3560 Training loss: 1.3206 0.0607 sec/batch\n", + "Epoch 16/20 Iteration 2793/3560 
Training loss: 1.3201 0.0627 sec/batch\n", + "Epoch 16/20 Iteration 2794/3560 Training loss: 1.3201 0.0654 sec/batch\n", + "Epoch 16/20 Iteration 2795/3560 Training loss: 1.3199 0.0665 sec/batch\n", + "Epoch 16/20 Iteration 2796/3560 Training loss: 1.3196 0.0674 sec/batch\n", + "Epoch 16/20 Iteration 2797/3560 Training loss: 1.3196 0.0650 sec/batch\n", + "Epoch 16/20 Iteration 2798/3560 Training loss: 1.3197 0.0642 sec/batch\n", + "Epoch 16/20 Iteration 2799/3560 Training loss: 1.3195 0.0654 sec/batch\n", + "Epoch 16/20 Iteration 2800/3560 Training loss: 1.3192 0.0666 sec/batch\n", + "Epoch 16/20 Iteration 2801/3560 Training loss: 1.3188 0.0677 sec/batch\n", + "Epoch 16/20 Iteration 2802/3560 Training loss: 1.3185 0.0649 sec/batch\n", + "Epoch 16/20 Iteration 2803/3560 Training loss: 1.3186 0.0646 sec/batch\n", + "Epoch 16/20 Iteration 2804/3560 Training loss: 1.3186 0.0634 sec/batch\n", + "Epoch 16/20 Iteration 2805/3560 Training loss: 1.3187 0.0662 sec/batch\n", + "Epoch 16/20 Iteration 2806/3560 Training loss: 1.3188 0.0739 sec/batch\n", + "Epoch 16/20 Iteration 2807/3560 Training loss: 1.3188 0.0619 sec/batch\n", + "Epoch 16/20 Iteration 2808/3560 Training loss: 1.3189 0.0606 sec/batch\n", + "Epoch 16/20 Iteration 2809/3560 Training loss: 1.3189 0.0634 sec/batch\n", + "Epoch 16/20 Iteration 2810/3560 Training loss: 1.3188 0.0668 sec/batch\n", + "Epoch 16/20 Iteration 2811/3560 Training loss: 1.3191 0.0599 sec/batch\n", + "Epoch 16/20 Iteration 2812/3560 Training loss: 1.3192 0.0629 sec/batch\n", + "Epoch 16/20 Iteration 2813/3560 Training loss: 1.3191 0.0639 sec/batch\n", + "Epoch 16/20 Iteration 2814/3560 Training loss: 1.3193 0.0670 sec/batch\n", + "Epoch 16/20 Iteration 2815/3560 Training loss: 1.3192 0.0615 sec/batch\n", + "Epoch 16/20 Iteration 2816/3560 Training loss: 1.3194 0.0609 sec/batch\n", + "Epoch 16/20 Iteration 2817/3560 Training loss: 1.3194 0.0672 sec/batch\n", + "Epoch 16/20 Iteration 2818/3560 Training loss: 1.3197 0.0620 sec/batch\n", + 
"Epoch 16/20 Iteration 2819/3560 Training loss: 1.3199 0.0623 sec/batch\n", + "Epoch 16/20 Iteration 2820/3560 Training loss: 1.3197 0.0621 sec/batch\n", + "Epoch 16/20 Iteration 2821/3560 Training loss: 1.3195 0.0644 sec/batch\n", + "Epoch 16/20 Iteration 2822/3560 Training loss: 1.3193 0.0633 sec/batch\n", + "Epoch 16/20 Iteration 2823/3560 Training loss: 1.3194 0.0609 sec/batch\n", + "Epoch 16/20 Iteration 2824/3560 Training loss: 1.3194 0.0629 sec/batch\n", + "Epoch 16/20 Iteration 2825/3560 Training loss: 1.3193 0.0705 sec/batch\n", + "Epoch 16/20 Iteration 2826/3560 Training loss: 1.3193 0.0765 sec/batch\n", + "Epoch 16/20 Iteration 2827/3560 Training loss: 1.3193 0.0678 sec/batch\n", + "Epoch 16/20 Iteration 2828/3560 Training loss: 1.3192 0.0622 sec/batch\n", + "Epoch 16/20 Iteration 2829/3560 Training loss: 1.3189 0.0632 sec/batch\n", + "Epoch 16/20 Iteration 2830/3560 Training loss: 1.3190 0.0609 sec/batch\n", + "Epoch 16/20 Iteration 2831/3560 Training loss: 1.3192 0.0659 sec/batch\n", + "Epoch 16/20 Iteration 2832/3560 Training loss: 1.3191 0.0719 sec/batch\n", + "Epoch 16/20 Iteration 2833/3560 Training loss: 1.3191 0.0604 sec/batch\n", + "Epoch 16/20 Iteration 2834/3560 Training loss: 1.3191 0.0606 sec/batch\n", + "Epoch 16/20 Iteration 2835/3560 Training loss: 1.3190 0.0620 sec/batch\n", + "Epoch 16/20 Iteration 2836/3560 Training loss: 1.3190 0.0637 sec/batch\n", + "Epoch 16/20 Iteration 2837/3560 Training loss: 1.3192 0.0608 sec/batch\n", + "Epoch 16/20 Iteration 2838/3560 Training loss: 1.3195 0.0610 sec/batch\n", + "Epoch 16/20 Iteration 2839/3560 Training loss: 1.3196 0.0609 sec/batch\n", + "Epoch 16/20 Iteration 2840/3560 Training loss: 1.3196 0.0631 sec/batch\n", + "Epoch 16/20 Iteration 2841/3560 Training loss: 1.3195 0.0619 sec/batch\n", + "Epoch 16/20 Iteration 2842/3560 Training loss: 1.3194 0.0605 sec/batch\n", + "Epoch 16/20 Iteration 2843/3560 Training loss: 1.3195 0.0605 sec/batch\n", + "Epoch 16/20 Iteration 2844/3560 Training loss: 
1.3196 0.0626 sec/batch\n", + "Epoch 16/20 Iteration 2845/3560 Training loss: 1.3196 0.0652 sec/batch\n", + "Epoch 16/20 Iteration 2846/3560 Training loss: 1.3195 0.0608 sec/batch\n", + "Epoch 16/20 Iteration 2847/3560 Training loss: 1.3193 0.0604 sec/batch\n", + "Epoch 16/20 Iteration 2848/3560 Training loss: 1.3194 0.0630 sec/batch\n", + "Epoch 17/20 Iteration 2849/3560 Training loss: 1.4344 0.0677 sec/batch\n", + "Epoch 17/20 Iteration 2850/3560 Training loss: 1.3758 0.0609 sec/batch\n", + "Epoch 17/20 Iteration 2851/3560 Training loss: 1.3583 0.0610 sec/batch\n", + "Epoch 17/20 Iteration 2852/3560 Training loss: 1.3493 0.0670 sec/batch\n", + "Epoch 17/20 Iteration 2853/3560 Training loss: 1.3369 0.0606 sec/batch\n", + "Epoch 17/20 Iteration 2854/3560 Training loss: 1.3254 0.0657 sec/batch\n", + "Epoch 17/20 Iteration 2855/3560 Training loss: 1.3242 0.0604 sec/batch\n", + "Epoch 17/20 Iteration 2856/3560 Training loss: 1.3204 0.0639 sec/batch\n", + "Epoch 17/20 Iteration 2857/3560 Training loss: 1.3203 0.0624 sec/batch\n", + "Epoch 17/20 Iteration 2858/3560 Training loss: 1.3194 0.0612 sec/batch\n", + "Epoch 17/20 Iteration 2859/3560 Training loss: 1.3157 0.0601 sec/batch\n", + "Epoch 17/20 Iteration 2860/3560 Training loss: 1.3156 0.0624 sec/batch\n", + "Epoch 17/20 Iteration 2861/3560 Training loss: 1.3160 0.0606 sec/batch\n", + "Epoch 17/20 Iteration 2862/3560 Training loss: 1.3168 0.0680 sec/batch\n", + "Epoch 17/20 Iteration 2863/3560 Training loss: 1.3145 0.0613 sec/batch\n", + "Epoch 17/20 Iteration 2864/3560 Training loss: 1.3131 0.0631 sec/batch\n", + "Epoch 17/20 Iteration 2865/3560 Training loss: 1.3136 0.0661 sec/batch\n", + "Epoch 17/20 Iteration 2866/3560 Training loss: 1.3149 0.0631 sec/batch\n", + "Epoch 17/20 Iteration 2867/3560 Training loss: 1.3149 0.0631 sec/batch\n", + "Epoch 17/20 Iteration 2868/3560 Training loss: 1.3163 0.0649 sec/batch\n", + "Epoch 17/20 Iteration 2869/3560 Training loss: 1.3159 0.0612 sec/batch\n", + "Epoch 17/20 
Iteration 2870/3560 Training loss: 1.3158 0.0622 sec/batch\n", + "Epoch 17/20 Iteration 2871/3560 Training loss: 1.3151 0.0628 sec/batch\n", + "Epoch 17/20 Iteration 2872/3560 Training loss: 1.3152 0.0687 sec/batch\n", + "Epoch 17/20 Iteration 2873/3560 Training loss: 1.3151 0.0631 sec/batch\n", + "Epoch 17/20 Iteration 2874/3560 Training loss: 1.3130 0.0624 sec/batch\n", + "Epoch 17/20 Iteration 2875/3560 Training loss: 1.3117 0.0622 sec/batch\n", + "Epoch 17/20 Iteration 2876/3560 Training loss: 1.3128 0.0632 sec/batch\n", + "Epoch 17/20 Iteration 2877/3560 Training loss: 1.3128 0.0610 sec/batch\n", + "Epoch 17/20 Iteration 2878/3560 Training loss: 1.3128 0.0651 sec/batch\n", + "Epoch 17/20 Iteration 2879/3560 Training loss: 1.3122 0.0615 sec/batch\n", + "Epoch 17/20 Iteration 2880/3560 Training loss: 1.3114 0.0691 sec/batch\n", + "Epoch 17/20 Iteration 2881/3560 Training loss: 1.3117 0.0644 sec/batch\n", + "Epoch 17/20 Iteration 2882/3560 Training loss: 1.3119 0.0615 sec/batch\n", + "Epoch 17/20 Iteration 2883/3560 Training loss: 1.3117 0.0615 sec/batch\n", + "Epoch 17/20 Iteration 2884/3560 Training loss: 1.3114 0.0647 sec/batch\n", + "Epoch 17/20 Iteration 2885/3560 Training loss: 1.3111 0.0620 sec/batch\n", + "Epoch 17/20 Iteration 2886/3560 Training loss: 1.3102 0.0653 sec/batch\n", + "Epoch 17/20 Iteration 2887/3560 Training loss: 1.3090 0.0668 sec/batch\n", + "Epoch 17/20 Iteration 2888/3560 Training loss: 1.3088 0.0694 sec/batch\n", + "Epoch 17/20 Iteration 2889/3560 Training loss: 1.3081 0.0739 sec/batch\n", + "Epoch 17/20 Iteration 2890/3560 Training loss: 1.3091 0.0610 sec/batch\n", + "Epoch 17/20 Iteration 2891/3560 Training loss: 1.3089 0.0633 sec/batch\n", + "Epoch 17/20 Iteration 2892/3560 Training loss: 1.3082 0.0730 sec/batch\n", + "Epoch 17/20 Iteration 2893/3560 Training loss: 1.3084 0.0639 sec/batch\n", + "Epoch 17/20 Iteration 2894/3560 Training loss: 1.3075 0.0661 sec/batch\n", + "Epoch 17/20 Iteration 2895/3560 Training loss: 1.3070 0.0659 
sec/batch\n", + "Epoch 17/20 Iteration 2896/3560 Training loss: 1.3065 0.0641 sec/batch\n", + "Epoch 17/20 Iteration 2897/3560 Training loss: 1.3063 0.0643 sec/batch\n", + "Epoch 17/20 Iteration 2898/3560 Training loss: 1.3066 0.0673 sec/batch\n", + "Epoch 17/20 Iteration 2899/3560 Training loss: 1.3060 0.0649 sec/batch\n", + "Epoch 17/20 Iteration 2900/3560 Training loss: 1.3068 0.0648 sec/batch\n", + "Epoch 17/20 Iteration 2901/3560 Training loss: 1.3067 0.0622 sec/batch\n", + "Epoch 17/20 Iteration 2902/3560 Training loss: 1.3069 0.0652 sec/batch\n", + "Epoch 17/20 Iteration 2903/3560 Training loss: 1.3065 0.0626 sec/batch\n", + "Epoch 17/20 Iteration 2904/3560 Training loss: 1.3066 0.0614 sec/batch\n", + "Epoch 17/20 Iteration 2905/3560 Training loss: 1.3068 0.0642 sec/batch\n", + "Epoch 17/20 Iteration 2906/3560 Training loss: 1.3066 0.0750 sec/batch\n", + "Epoch 17/20 Iteration 2907/3560 Training loss: 1.3060 0.0615 sec/batch\n", + "Epoch 17/20 Iteration 2908/3560 Training loss: 1.3066 0.0620 sec/batch\n", + "Epoch 17/20 Iteration 2909/3560 Training loss: 1.3067 0.0651 sec/batch\n", + "Epoch 17/20 Iteration 2910/3560 Training loss: 1.3075 0.0735 sec/batch\n", + "Epoch 17/20 Iteration 2911/3560 Training loss: 1.3077 0.0627 sec/batch\n", + "Epoch 17/20 Iteration 2912/3560 Training loss: 1.3079 0.0622 sec/batch\n", + "Epoch 17/20 Iteration 2913/3560 Training loss: 1.3078 0.0695 sec/batch\n", + "Epoch 17/20 Iteration 2914/3560 Training loss: 1.3080 0.0595 sec/batch\n", + "Epoch 17/20 Iteration 2915/3560 Training loss: 1.3083 0.0607 sec/batch\n", + "Epoch 17/20 Iteration 2916/3560 Training loss: 1.3081 0.0617 sec/batch\n", + "Epoch 17/20 Iteration 2917/3560 Training loss: 1.3081 0.0607 sec/batch\n", + "Epoch 17/20 Iteration 2918/3560 Training loss: 1.3080 0.0737 sec/batch\n", + "Epoch 17/20 Iteration 2919/3560 Training loss: 1.3085 0.0627 sec/batch\n", + "Epoch 17/20 Iteration 2920/3560 Training loss: 1.3088 0.0634 sec/batch\n", + "Epoch 17/20 Iteration 2921/3560 
Training loss: 1.3093 0.0605 sec/batch\n", + "Epoch 17/20 Iteration 2922/3560 Training loss: 1.3091 0.0675 sec/batch\n", + "Epoch 17/20 Iteration 2923/3560 Training loss: 1.3091 0.0613 sec/batch\n", + "Epoch 17/20 Iteration 2924/3560 Training loss: 1.3092 0.0646 sec/batch\n", + "Epoch 17/20 Iteration 2925/3560 Training loss: 1.3092 0.0650 sec/batch\n", + "Epoch 17/20 Iteration 2926/3560 Training loss: 1.3091 0.0640 sec/batch\n", + "Epoch 17/20 Iteration 2927/3560 Training loss: 1.3084 0.0614 sec/batch\n", + "Epoch 17/20 Iteration 2928/3560 Training loss: 1.3084 0.0626 sec/batch\n", + "Epoch 17/20 Iteration 2929/3560 Training loss: 1.3079 0.0629 sec/batch\n", + "Epoch 17/20 Iteration 2930/3560 Training loss: 1.3079 0.0670 sec/batch\n", + "Epoch 17/20 Iteration 2931/3560 Training loss: 1.3074 0.0661 sec/batch\n", + "Epoch 17/20 Iteration 2932/3560 Training loss: 1.3074 0.0639 sec/batch\n", + "Epoch 17/20 Iteration 2933/3560 Training loss: 1.3072 0.0608 sec/batch\n", + "Epoch 17/20 Iteration 2934/3560 Training loss: 1.3071 0.0682 sec/batch\n", + "Epoch 17/20 Iteration 2935/3560 Training loss: 1.3068 0.0669 sec/batch\n", + "Epoch 17/20 Iteration 2936/3560 Training loss: 1.3066 0.0620 sec/batch\n", + "Epoch 17/20 Iteration 2937/3560 Training loss: 1.3063 0.0648 sec/batch\n", + "Epoch 17/20 Iteration 2938/3560 Training loss: 1.3064 0.0663 sec/batch\n", + "Epoch 17/20 Iteration 2939/3560 Training loss: 1.3061 0.0631 sec/batch\n", + "Epoch 17/20 Iteration 2940/3560 Training loss: 1.3060 0.0604 sec/batch\n", + "Epoch 17/20 Iteration 2941/3560 Training loss: 1.3057 0.0631 sec/batch\n", + "Epoch 17/20 Iteration 2942/3560 Training loss: 1.3054 0.0673 sec/batch\n", + "Epoch 17/20 Iteration 2943/3560 Training loss: 1.3052 0.0618 sec/batch\n", + "Epoch 17/20 Iteration 2944/3560 Training loss: 1.3053 0.0630 sec/batch\n", + "Epoch 17/20 Iteration 2945/3560 Training loss: 1.3054 0.0652 sec/batch\n", + "Epoch 17/20 Iteration 2946/3560 Training loss: 1.3050 0.0614 sec/batch\n", + 
"Epoch 17/20 Iteration 2947/3560 Training loss: 1.3047 0.0640 sec/batch\n", + "Epoch 17/20 Iteration 2948/3560 Training loss: 1.3044 0.0694 sec/batch\n", + "Epoch 17/20 Iteration 2949/3560 Training loss: 1.3044 0.0712 sec/batch\n", + "Epoch 17/20 Iteration 2950/3560 Training loss: 1.3043 0.0613 sec/batch\n", + "Epoch 17/20 Iteration 2951/3560 Training loss: 1.3042 0.0614 sec/batch\n", + "Epoch 17/20 Iteration 2952/3560 Training loss: 1.3042 0.0613 sec/batch\n", + "Epoch 17/20 Iteration 2953/3560 Training loss: 1.3041 0.0611 sec/batch\n", + "Epoch 17/20 Iteration 2954/3560 Training loss: 1.3040 0.0614 sec/batch\n", + "Epoch 17/20 Iteration 2955/3560 Training loss: 1.3040 0.0633 sec/batch\n", + "Epoch 17/20 Iteration 2956/3560 Training loss: 1.3041 0.0605 sec/batch\n", + "Epoch 17/20 Iteration 2957/3560 Training loss: 1.3038 0.0712 sec/batch\n", + "Epoch 17/20 Iteration 2958/3560 Training loss: 1.3040 0.0607 sec/batch\n", + "Epoch 17/20 Iteration 2959/3560 Training loss: 1.3038 0.0610 sec/batch\n", + "Epoch 17/20 Iteration 2960/3560 Training loss: 1.3038 0.0608 sec/batch\n", + "Epoch 17/20 Iteration 2961/3560 Training loss: 1.3036 0.0652 sec/batch\n", + "Epoch 17/20 Iteration 2962/3560 Training loss: 1.3035 0.0715 sec/batch\n", + "Epoch 17/20 Iteration 2963/3560 Training loss: 1.3033 0.0660 sec/batch\n", + "Epoch 17/20 Iteration 2964/3560 Training loss: 1.3030 0.0605 sec/batch\n", + "Epoch 17/20 Iteration 2965/3560 Training loss: 1.3030 0.0625 sec/batch\n", + "Epoch 17/20 Iteration 2966/3560 Training loss: 1.3030 0.0595 sec/batch\n", + "Epoch 17/20 Iteration 2967/3560 Training loss: 1.3030 0.0613 sec/batch\n", + "Epoch 17/20 Iteration 2968/3560 Training loss: 1.3030 0.0631 sec/batch\n", + "Epoch 17/20 Iteration 2969/3560 Training loss: 1.3029 0.0643 sec/batch\n", + "Epoch 17/20 Iteration 2970/3560 Training loss: 1.3025 0.0623 sec/batch\n", + "Epoch 17/20 Iteration 2971/3560 Training loss: 1.3021 0.0614 sec/batch\n", + "Epoch 17/20 Iteration 2972/3560 Training loss: 
1.3021 0.0615 sec/batch\n", + "Epoch 17/20 Iteration 2973/3560 Training loss: 1.3019 0.0702 sec/batch\n", + "Epoch 17/20 Iteration 2974/3560 Training loss: 1.3015 0.0622 sec/batch\n", + "Epoch 17/20 Iteration 2975/3560 Training loss: 1.3016 0.0609 sec/batch\n", + "Epoch 17/20 Iteration 2976/3560 Training loss: 1.3016 0.0608 sec/batch\n", + "Epoch 17/20 Iteration 2977/3560 Training loss: 1.3015 0.0701 sec/batch\n", + "Epoch 17/20 Iteration 2978/3560 Training loss: 1.3012 0.0611 sec/batch\n", + "Epoch 17/20 Iteration 2979/3560 Training loss: 1.3007 0.0630 sec/batch\n", + "Epoch 17/20 Iteration 2980/3560 Training loss: 1.3005 0.0637 sec/batch\n", + "Epoch 17/20 Iteration 2981/3560 Training loss: 1.3005 0.0673 sec/batch\n", + "Epoch 17/20 Iteration 2982/3560 Training loss: 1.3005 0.0624 sec/batch\n", + "Epoch 17/20 Iteration 2983/3560 Training loss: 1.3005 0.0607 sec/batch\n", + "Epoch 17/20 Iteration 2984/3560 Training loss: 1.3006 0.0607 sec/batch\n", + "Epoch 17/20 Iteration 2985/3560 Training loss: 1.3008 0.0668 sec/batch\n", + "Epoch 17/20 Iteration 2986/3560 Training loss: 1.3009 0.0649 sec/batch\n", + "Epoch 17/20 Iteration 2987/3560 Training loss: 1.3009 0.0610 sec/batch\n", + "Epoch 17/20 Iteration 2988/3560 Training loss: 1.3008 0.0673 sec/batch\n", + "Epoch 17/20 Iteration 2989/3560 Training loss: 1.3011 0.0677 sec/batch\n", + "Epoch 17/20 Iteration 2990/3560 Training loss: 1.3012 0.0611 sec/batch\n", + "Epoch 17/20 Iteration 2991/3560 Training loss: 1.3011 0.0599 sec/batch\n", + "Epoch 17/20 Iteration 2992/3560 Training loss: 1.3013 0.0607 sec/batch\n", + "Epoch 17/20 Iteration 2993/3560 Training loss: 1.3012 0.0625 sec/batch\n", + "Epoch 17/20 Iteration 2994/3560 Training loss: 1.3014 0.0740 sec/batch\n", + "Epoch 17/20 Iteration 2995/3560 Training loss: 1.3014 0.0629 sec/batch\n", + "Epoch 17/20 Iteration 2996/3560 Training loss: 1.3017 0.0609 sec/batch\n", + "Epoch 17/20 Iteration 2997/3560 Training loss: 1.3018 0.0631 sec/batch\n", + "Epoch 17/20 
Iteration 2998/3560 Training loss: 1.3016 0.0640 sec/batch\n", + "Epoch 17/20 Iteration 2999/3560 Training loss: 1.3013 0.0637 sec/batch\n", + "Epoch 17/20 Iteration 3000/3560 Training loss: 1.3012 0.0604 sec/batch\n", + "Epoch 17/20 Iteration 3001/3560 Training loss: 1.3013 0.0669 sec/batch\n", + "Epoch 17/20 Iteration 3002/3560 Training loss: 1.3013 0.0608 sec/batch\n", + "Epoch 17/20 Iteration 3003/3560 Training loss: 1.3013 0.0638 sec/batch\n", + "Epoch 17/20 Iteration 3004/3560 Training loss: 1.3013 0.0622 sec/batch\n", + "Epoch 17/20 Iteration 3005/3560 Training loss: 1.3013 0.0643 sec/batch\n", + "Epoch 17/20 Iteration 3006/3560 Training loss: 1.3013 0.0642 sec/batch\n", + "Epoch 17/20 Iteration 3007/3560 Training loss: 1.3011 0.0637 sec/batch\n", + "Epoch 17/20 Iteration 3008/3560 Training loss: 1.3012 0.0611 sec/batch\n", + "Epoch 17/20 Iteration 3009/3560 Training loss: 1.3013 0.0644 sec/batch\n", + "Epoch 17/20 Iteration 3010/3560 Training loss: 1.3012 0.0667 sec/batch\n", + "Epoch 17/20 Iteration 3011/3560 Training loss: 1.3013 0.0616 sec/batch\n", + "Epoch 17/20 Iteration 3012/3560 Training loss: 1.3013 0.0616 sec/batch\n", + "Epoch 17/20 Iteration 3013/3560 Training loss: 1.3013 0.0602 sec/batch\n", + "Epoch 17/20 Iteration 3014/3560 Training loss: 1.3013 0.0603 sec/batch\n", + "Epoch 17/20 Iteration 3015/3560 Training loss: 1.3014 0.0642 sec/batch\n", + "Epoch 17/20 Iteration 3016/3560 Training loss: 1.3018 0.0631 sec/batch\n", + "Epoch 17/20 Iteration 3017/3560 Training loss: 1.3018 0.0625 sec/batch\n", + "Epoch 17/20 Iteration 3018/3560 Training loss: 1.3019 0.0643 sec/batch\n", + "Epoch 17/20 Iteration 3019/3560 Training loss: 1.3018 0.0612 sec/batch\n", + "Epoch 17/20 Iteration 3020/3560 Training loss: 1.3016 0.0620 sec/batch\n", + "Epoch 17/20 Iteration 3021/3560 Training loss: 1.3018 0.0742 sec/batch\n", + "Epoch 17/20 Iteration 3022/3560 Training loss: 1.3019 0.0611 sec/batch\n", + "Epoch 17/20 Iteration 3023/3560 Training loss: 1.3019 0.0614 
sec/batch\n", + "Epoch 17/20 Iteration 3024/3560 Training loss: 1.3018 0.0670 sec/batch\n", + "Epoch 17/20 Iteration 3025/3560 Training loss: 1.3017 0.0651 sec/batch\n", + "Epoch 17/20 Iteration 3026/3560 Training loss: 1.3019 0.0603 sec/batch\n", + "Epoch 18/20 Iteration 3027/3560 Training loss: 1.4075 0.0607 sec/batch\n", + "Epoch 18/20 Iteration 3028/3560 Training loss: 1.3594 0.0614 sec/batch\n", + "Epoch 18/20 Iteration 3029/3560 Training loss: 1.3422 0.0674 sec/batch\n", + "Epoch 18/20 Iteration 3030/3560 Training loss: 1.3358 0.0656 sec/batch\n", + "Epoch 18/20 Iteration 3031/3560 Training loss: 1.3233 0.0608 sec/batch\n", + "Epoch 18/20 Iteration 3032/3560 Training loss: 1.3120 0.0611 sec/batch\n", + "Epoch 18/20 Iteration 3033/3560 Training loss: 1.3087 0.0677 sec/batch\n", + "Epoch 18/20 Iteration 3034/3560 Training loss: 1.3062 0.0613 sec/batch\n", + "Epoch 18/20 Iteration 3035/3560 Training loss: 1.3051 0.0613 sec/batch\n", + "Epoch 18/20 Iteration 3036/3560 Training loss: 1.3035 0.0630 sec/batch\n", + "Epoch 18/20 Iteration 3037/3560 Training loss: 1.3003 0.0653 sec/batch\n", + "Epoch 18/20 Iteration 3038/3560 Training loss: 1.3006 0.0678 sec/batch\n", + "Epoch 18/20 Iteration 3039/3560 Training loss: 1.3010 0.0635 sec/batch\n", + "Epoch 18/20 Iteration 3040/3560 Training loss: 1.3015 0.0603 sec/batch\n", + "Epoch 18/20 Iteration 3041/3560 Training loss: 1.2996 0.0755 sec/batch\n", + "Epoch 18/20 Iteration 3042/3560 Training loss: 1.2984 0.0689 sec/batch\n", + "Epoch 18/20 Iteration 3043/3560 Training loss: 1.2987 0.0612 sec/batch\n", + "Epoch 18/20 Iteration 3044/3560 Training loss: 1.2997 0.0606 sec/batch\n", + "Epoch 18/20 Iteration 3045/3560 Training loss: 1.2992 0.0640 sec/batch\n", + "Epoch 18/20 Iteration 3046/3560 Training loss: 1.3003 0.0605 sec/batch\n", + "Epoch 18/20 Iteration 3047/3560 Training loss: 1.2999 0.0606 sec/batch\n", + "Epoch 18/20 Iteration 3048/3560 Training loss: 1.3002 0.0616 sec/batch\n", + "Epoch 18/20 Iteration 3049/3560 
Training loss: 1.2993 0.0645 sec/batch\n", + "Epoch 18/20 Iteration 3050/3560 Training loss: 1.2992 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3051/3560 Training loss: 1.2991 0.0603 sec/batch\n", + "Epoch 18/20 Iteration 3052/3560 Training loss: 1.2971 0.0610 sec/batch\n", + "Epoch 18/20 Iteration 3053/3560 Training loss: 1.2961 0.0762 sec/batch\n", + "Epoch 18/20 Iteration 3054/3560 Training loss: 1.2970 0.0730 sec/batch\n", + "Epoch 18/20 Iteration 3055/3560 Training loss: 1.2971 0.0615 sec/batch\n", + "Epoch 18/20 Iteration 3056/3560 Training loss: 1.2972 0.0612 sec/batch\n", + "Epoch 18/20 Iteration 3057/3560 Training loss: 1.2966 0.0618 sec/batch\n", + "Epoch 18/20 Iteration 3058/3560 Training loss: 1.2954 0.0657 sec/batch\n", + "Epoch 18/20 Iteration 3059/3560 Training loss: 1.2954 0.0611 sec/batch\n", + "Epoch 18/20 Iteration 3060/3560 Training loss: 1.2956 0.0611 sec/batch\n", + "Epoch 18/20 Iteration 3061/3560 Training loss: 1.2953 0.0637 sec/batch\n", + "Epoch 18/20 Iteration 3062/3560 Training loss: 1.2952 0.0633 sec/batch\n", + "Epoch 18/20 Iteration 3063/3560 Training loss: 1.2948 0.0616 sec/batch\n", + "Epoch 18/20 Iteration 3064/3560 Training loss: 1.2938 0.0625 sec/batch\n", + "Epoch 18/20 Iteration 3065/3560 Training loss: 1.2927 0.0633 sec/batch\n", + "Epoch 18/20 Iteration 3066/3560 Training loss: 1.2927 0.0735 sec/batch\n", + "Epoch 18/20 Iteration 3067/3560 Training loss: 1.2921 0.0627 sec/batch\n", + "Epoch 18/20 Iteration 3068/3560 Training loss: 1.2930 0.0609 sec/batch\n", + "Epoch 18/20 Iteration 3069/3560 Training loss: 1.2928 0.0664 sec/batch\n", + "Epoch 18/20 Iteration 3070/3560 Training loss: 1.2923 0.0606 sec/batch\n", + "Epoch 18/20 Iteration 3071/3560 Training loss: 1.2926 0.0613 sec/batch\n", + "Epoch 18/20 Iteration 3072/3560 Training loss: 1.2918 0.0626 sec/batch\n", + "Epoch 18/20 Iteration 3073/3560 Training loss: 1.2914 0.0691 sec/batch\n", + "Epoch 18/20 Iteration 3074/3560 Training loss: 1.2909 0.0634 sec/batch\n", + 
"Epoch 18/20 Iteration 3075/3560 Training loss: 1.2909 0.0691 sec/batch\n", + "Epoch 18/20 Iteration 3076/3560 Training loss: 1.2912 0.0743 sec/batch\n", + "Epoch 18/20 Iteration 3077/3560 Training loss: 1.2907 0.0607 sec/batch\n", + "Epoch 18/20 Iteration 3078/3560 Training loss: 1.2914 0.0611 sec/batch\n", + "Epoch 18/20 Iteration 3079/3560 Training loss: 1.2914 0.0659 sec/batch\n", + "Epoch 18/20 Iteration 3080/3560 Training loss: 1.2916 0.0612 sec/batch\n", + "Epoch 18/20 Iteration 3081/3560 Training loss: 1.2913 0.0615 sec/batch\n", + "Epoch 18/20 Iteration 3082/3560 Training loss: 1.2913 0.0621 sec/batch\n", + "Epoch 18/20 Iteration 3083/3560 Training loss: 1.2916 0.0610 sec/batch\n", + "Epoch 18/20 Iteration 3084/3560 Training loss: 1.2912 0.0639 sec/batch\n", + "Epoch 18/20 Iteration 3085/3560 Training loss: 1.2907 0.0623 sec/batch\n", + "Epoch 18/20 Iteration 3086/3560 Training loss: 1.2911 0.0607 sec/batch\n", + "Epoch 18/20 Iteration 3087/3560 Training loss: 1.2913 0.0675 sec/batch\n", + "Epoch 18/20 Iteration 3088/3560 Training loss: 1.2921 0.0658 sec/batch\n", + "Epoch 18/20 Iteration 3089/3560 Training loss: 1.2922 0.0610 sec/batch\n", + "Epoch 18/20 Iteration 3090/3560 Training loss: 1.2924 0.0605 sec/batch\n", + "Epoch 18/20 Iteration 3091/3560 Training loss: 1.2924 0.0755 sec/batch\n", + "Epoch 18/20 Iteration 3092/3560 Training loss: 1.2926 0.0611 sec/batch\n", + "Epoch 18/20 Iteration 3093/3560 Training loss: 1.2928 0.0644 sec/batch\n", + "Epoch 18/20 Iteration 3094/3560 Training loss: 1.2926 0.0633 sec/batch\n", + "Epoch 18/20 Iteration 3095/3560 Training loss: 1.2926 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3096/3560 Training loss: 1.2925 0.0610 sec/batch\n", + "Epoch 18/20 Iteration 3097/3560 Training loss: 1.2931 0.0621 sec/batch\n", + "Epoch 18/20 Iteration 3098/3560 Training loss: 1.2934 0.0671 sec/batch\n", + "Epoch 18/20 Iteration 3099/3560 Training loss: 1.2939 0.0692 sec/batch\n", + "Epoch 18/20 Iteration 3100/3560 Training loss: 
1.2936 0.0619 sec/batch\n", + "Epoch 18/20 Iteration 3101/3560 Training loss: 1.2935 0.0623 sec/batch\n", + "Epoch 18/20 Iteration 3102/3560 Training loss: 1.2936 0.0603 sec/batch\n", + "Epoch 18/20 Iteration 3103/3560 Training loss: 1.2935 0.0627 sec/batch\n", + "Epoch 18/20 Iteration 3104/3560 Training loss: 1.2935 0.0629 sec/batch\n", + "Epoch 18/20 Iteration 3105/3560 Training loss: 1.2929 0.0608 sec/batch\n", + "Epoch 18/20 Iteration 3106/3560 Training loss: 1.2928 0.0620 sec/batch\n", + "Epoch 18/20 Iteration 3107/3560 Training loss: 1.2924 0.0660 sec/batch\n", + "Epoch 18/20 Iteration 3108/3560 Training loss: 1.2923 0.0625 sec/batch\n", + "Epoch 18/20 Iteration 3109/3560 Training loss: 1.2919 0.0600 sec/batch\n", + "Epoch 18/20 Iteration 3110/3560 Training loss: 1.2919 0.0712 sec/batch\n", + "Epoch 18/20 Iteration 3111/3560 Training loss: 1.2915 0.0625 sec/batch\n", + "Epoch 18/20 Iteration 3112/3560 Training loss: 1.2914 0.0650 sec/batch\n", + "Epoch 18/20 Iteration 3113/3560 Training loss: 1.2911 0.0608 sec/batch\n", + "Epoch 18/20 Iteration 3114/3560 Training loss: 1.2909 0.0622 sec/batch\n", + "Epoch 18/20 Iteration 3115/3560 Training loss: 1.2906 0.0636 sec/batch\n", + "Epoch 18/20 Iteration 3116/3560 Training loss: 1.2906 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3117/3560 Training loss: 1.2904 0.0605 sec/batch\n", + "Epoch 18/20 Iteration 3118/3560 Training loss: 1.2904 0.0604 sec/batch\n", + "Epoch 18/20 Iteration 3119/3560 Training loss: 1.2899 0.0643 sec/batch\n", + "Epoch 18/20 Iteration 3120/3560 Training loss: 1.2896 0.0651 sec/batch\n", + "Epoch 18/20 Iteration 3121/3560 Training loss: 1.2894 0.0607 sec/batch\n", + "Epoch 18/20 Iteration 3122/3560 Training loss: 1.2893 0.0649 sec/batch\n", + "Epoch 18/20 Iteration 3123/3560 Training loss: 1.2894 0.0664 sec/batch\n", + "Epoch 18/20 Iteration 3124/3560 Training loss: 1.2890 0.0616 sec/batch\n", + "Epoch 18/20 Iteration 3125/3560 Training loss: 1.2886 0.0640 sec/batch\n", + "Epoch 18/20 
Iteration 3126/3560 Training loss: 1.2883 0.0612 sec/batch\n", + "Epoch 18/20 Iteration 3127/3560 Training loss: 1.2882 0.0683 sec/batch\n", + "Epoch 18/20 Iteration 3128/3560 Training loss: 1.2880 0.0635 sec/batch\n", + "Epoch 18/20 Iteration 3129/3560 Training loss: 1.2879 0.0641 sec/batch\n", + "Epoch 18/20 Iteration 3130/3560 Training loss: 1.2877 0.0633 sec/batch\n", + "Epoch 18/20 Iteration 3131/3560 Training loss: 1.2877 0.0623 sec/batch\n", + "Epoch 18/20 Iteration 3132/3560 Training loss: 1.2875 0.0606 sec/batch\n", + "Epoch 18/20 Iteration 3133/3560 Training loss: 1.2876 0.0633 sec/batch\n", + "Epoch 18/20 Iteration 3134/3560 Training loss: 1.2876 0.0610 sec/batch\n", + "Epoch 18/20 Iteration 3135/3560 Training loss: 1.2875 0.0690 sec/batch\n", + "Epoch 18/20 Iteration 3136/3560 Training loss: 1.2876 0.0609 sec/batch\n", + "Epoch 18/20 Iteration 3137/3560 Training loss: 1.2875 0.0610 sec/batch\n", + "Epoch 18/20 Iteration 3138/3560 Training loss: 1.2875 0.0614 sec/batch\n", + "Epoch 18/20 Iteration 3139/3560 Training loss: 1.2874 0.0617 sec/batch\n", + "Epoch 18/20 Iteration 3140/3560 Training loss: 1.2873 0.0625 sec/batch\n", + "Epoch 18/20 Iteration 3141/3560 Training loss: 1.2870 0.0608 sec/batch\n", + "Epoch 18/20 Iteration 3142/3560 Training loss: 1.2867 0.0611 sec/batch\n", + "Epoch 18/20 Iteration 3143/3560 Training loss: 1.2867 0.0659 sec/batch\n", + "Epoch 18/20 Iteration 3144/3560 Training loss: 1.2867 0.0647 sec/batch\n", + "Epoch 18/20 Iteration 3145/3560 Training loss: 1.2867 0.0622 sec/batch\n", + "Epoch 18/20 Iteration 3146/3560 Training loss: 1.2867 0.0611 sec/batch\n", + "Epoch 18/20 Iteration 3147/3560 Training loss: 1.2866 0.0709 sec/batch\n", + "Epoch 18/20 Iteration 3148/3560 Training loss: 1.2863 0.0589 sec/batch\n", + "Epoch 18/20 Iteration 3149/3560 Training loss: 1.2859 0.0616 sec/batch\n", + "Epoch 18/20 Iteration 3150/3560 Training loss: 1.2859 0.0631 sec/batch\n", + "Epoch 18/20 Iteration 3151/3560 Training loss: 1.2857 0.0646 
sec/batch\n", + "Epoch 18/20 Iteration 3152/3560 Training loss: 1.2854 0.0671 sec/batch\n", + "Epoch 18/20 Iteration 3153/3560 Training loss: 1.2854 0.0608 sec/batch\n", + "Epoch 18/20 Iteration 3154/3560 Training loss: 1.2854 0.0613 sec/batch\n", + "Epoch 18/20 Iteration 3155/3560 Training loss: 1.2853 0.0699 sec/batch\n", + "Epoch 18/20 Iteration 3156/3560 Training loss: 1.2850 0.0606 sec/batch\n", + "Epoch 18/20 Iteration 3157/3560 Training loss: 1.2846 0.0616 sec/batch\n", + "Epoch 18/20 Iteration 3158/3560 Training loss: 1.2844 0.0618 sec/batch\n", + "Epoch 18/20 Iteration 3159/3560 Training loss: 1.2845 0.0629 sec/batch\n", + "Epoch 18/20 Iteration 3160/3560 Training loss: 1.2844 0.0620 sec/batch\n", + "Epoch 18/20 Iteration 3161/3560 Training loss: 1.2844 0.0693 sec/batch\n", + "Epoch 18/20 Iteration 3162/3560 Training loss: 1.2845 0.0655 sec/batch\n", + "Epoch 18/20 Iteration 3163/3560 Training loss: 1.2846 0.0694 sec/batch\n", + "Epoch 18/20 Iteration 3164/3560 Training loss: 1.2846 0.0637 sec/batch\n", + "Epoch 18/20 Iteration 3165/3560 Training loss: 1.2846 0.0634 sec/batch\n", + "Epoch 18/20 Iteration 3166/3560 Training loss: 1.2844 0.0610 sec/batch\n", + "Epoch 18/20 Iteration 3167/3560 Training loss: 1.2848 0.0666 sec/batch\n", + "Epoch 18/20 Iteration 3168/3560 Training loss: 1.2848 0.0614 sec/batch\n", + "Epoch 18/20 Iteration 3169/3560 Training loss: 1.2848 0.0608 sec/batch\n", + "Epoch 18/20 Iteration 3170/3560 Training loss: 1.2851 0.0611 sec/batch\n", + "Epoch 18/20 Iteration 3171/3560 Training loss: 1.2850 0.0633 sec/batch\n", + "Epoch 18/20 Iteration 3172/3560 Training loss: 1.2852 0.0676 sec/batch\n", + "Epoch 18/20 Iteration 3173/3560 Training loss: 1.2852 0.0638 sec/batch\n", + "Epoch 18/20 Iteration 3174/3560 Training loss: 1.2854 0.0608 sec/batch\n", + "Epoch 18/20 Iteration 3175/3560 Training loss: 1.2856 0.0653 sec/batch\n", + "Epoch 18/20 Iteration 3176/3560 Training loss: 1.2855 0.0652 sec/batch\n", + "Epoch 18/20 Iteration 3177/3560 
Training loss: 1.2852 0.0620 sec/batch\n", + "Epoch 18/20 Iteration 3178/3560 Training loss: 1.2851 0.0600 sec/batch\n", + "Epoch 18/20 Iteration 3179/3560 Training loss: 1.2852 0.0643 sec/batch\n", + "Epoch 18/20 Iteration 3180/3560 Training loss: 1.2852 0.0611 sec/batch\n", + "Epoch 18/20 Iteration 3181/3560 Training loss: 1.2852 0.0606 sec/batch\n", + "Epoch 18/20 Iteration 3182/3560 Training loss: 1.2852 0.0665 sec/batch\n", + "Epoch 18/20 Iteration 3183/3560 Training loss: 1.2853 0.0649 sec/batch\n", + "Epoch 18/20 Iteration 3184/3560 Training loss: 1.2852 0.0613 sec/batch\n", + "Epoch 18/20 Iteration 3185/3560 Training loss: 1.2850 0.0601 sec/batch\n", + "Epoch 18/20 Iteration 3186/3560 Training loss: 1.2851 0.0667 sec/batch\n", + "Epoch 18/20 Iteration 3187/3560 Training loss: 1.2852 0.0631 sec/batch\n", + "Epoch 18/20 Iteration 3188/3560 Training loss: 1.2852 0.0677 sec/batch\n", + "Epoch 18/20 Iteration 3189/3560 Training loss: 1.2851 0.0693 sec/batch\n", + "Epoch 18/20 Iteration 3190/3560 Training loss: 1.2851 0.0604 sec/batch\n", + "Epoch 18/20 Iteration 3191/3560 Training loss: 1.2851 0.0690 sec/batch\n", + "Epoch 18/20 Iteration 3192/3560 Training loss: 1.2851 0.0629 sec/batch\n", + "Epoch 18/20 Iteration 3193/3560 Training loss: 1.2853 0.0626 sec/batch\n", + "Epoch 18/20 Iteration 3194/3560 Training loss: 1.2856 0.0634 sec/batch\n", + "Epoch 18/20 Iteration 3195/3560 Training loss: 1.2856 0.0616 sec/batch\n", + "Epoch 18/20 Iteration 3196/3560 Training loss: 1.2856 0.0626 sec/batch\n", + "Epoch 18/20 Iteration 3197/3560 Training loss: 1.2855 0.0634 sec/batch\n", + "Epoch 18/20 Iteration 3198/3560 Training loss: 1.2853 0.0725 sec/batch\n", + "Epoch 18/20 Iteration 3199/3560 Training loss: 1.2855 0.0625 sec/batch\n", + "Epoch 18/20 Iteration 3200/3560 Training loss: 1.2855 0.0609 sec/batch\n", + "Epoch 18/20 Iteration 3201/3560 Training loss: 1.2856 0.0661 sec/batch\n", + "Epoch 18/20 Iteration 3202/3560 Training loss: 1.2855 0.0637 sec/batch\n", + 
"Epoch 18/20 Iteration 3203/3560 Training loss: 1.2853 0.0668 sec/batch\n", + "Epoch 18/20 Iteration 3204/3560 Training loss: 1.2855 0.0612 sec/batch\n", + "Epoch 19/20 Iteration 3205/3560 Training loss: 1.3843 0.0604 sec/batch\n", + "Epoch 19/20 Iteration 3206/3560 Training loss: 1.3384 0.0634 sec/batch\n", + "Epoch 19/20 Iteration 3207/3560 Training loss: 1.3239 0.0663 sec/batch\n", + "Epoch 19/20 Iteration 3208/3560 Training loss: 1.3178 0.0618 sec/batch\n", + "Epoch 19/20 Iteration 3209/3560 Training loss: 1.3050 0.0603 sec/batch\n", + "Epoch 19/20 Iteration 3210/3560 Training loss: 1.2920 0.0694 sec/batch\n", + "Epoch 19/20 Iteration 3211/3560 Training loss: 1.2899 0.0649 sec/batch\n", + "Epoch 19/20 Iteration 3212/3560 Training loss: 1.2864 0.0619 sec/batch\n", + "Epoch 19/20 Iteration 3213/3560 Training loss: 1.2855 0.0614 sec/batch\n", + "Epoch 19/20 Iteration 3214/3560 Training loss: 1.2844 0.0715 sec/batch\n", + "Epoch 19/20 Iteration 3215/3560 Training loss: 1.2808 0.0624 sec/batch\n", + "Epoch 19/20 Iteration 3216/3560 Training loss: 1.2812 0.0609 sec/batch\n", + "Epoch 19/20 Iteration 3217/3560 Training loss: 1.2815 0.0686 sec/batch\n", + "Epoch 19/20 Iteration 3218/3560 Training loss: 1.2822 0.0623 sec/batch\n", + "Epoch 19/20 Iteration 3219/3560 Training loss: 1.2810 0.0632 sec/batch\n", + "Epoch 19/20 Iteration 3220/3560 Training loss: 1.2794 0.0607 sec/batch\n", + "Epoch 19/20 Iteration 3221/3560 Training loss: 1.2799 0.0674 sec/batch\n", + "Epoch 19/20 Iteration 3222/3560 Training loss: 1.2810 0.0701 sec/batch\n", + "Epoch 19/20 Iteration 3223/3560 Training loss: 1.2813 0.0614 sec/batch\n", + "Epoch 19/20 Iteration 3224/3560 Training loss: 1.2827 0.0648 sec/batch\n", + "Epoch 19/20 Iteration 3225/3560 Training loss: 1.2822 0.0608 sec/batch\n", + "Epoch 19/20 Iteration 3226/3560 Training loss: 1.2823 0.0730 sec/batch\n", + "Epoch 19/20 Iteration 3227/3560 Training loss: 1.2815 0.0635 sec/batch\n", + "Epoch 19/20 Iteration 3228/3560 Training loss: 
1.2813 0.0638 sec/batch\n", + "Epoch 19/20 Iteration 3229/3560 Training loss: 1.2815 0.0610 sec/batch\n", + "Epoch 19/20 Iteration 3230/3560 Training loss: 1.2794 0.0663 sec/batch\n", + "Epoch 19/20 Iteration 3231/3560 Training loss: 1.2784 0.0647 sec/batch\n", + "Epoch 19/20 Iteration 3232/3560 Training loss: 1.2789 0.0636 sec/batch\n", + "Epoch 19/20 Iteration 3233/3560 Training loss: 1.2791 0.0608 sec/batch\n", + "Epoch 19/20 Iteration 3234/3560 Training loss: 1.2793 0.0664 sec/batch\n", + "Epoch 19/20 Iteration 3235/3560 Training loss: 1.2788 0.0623 sec/batch\n", + "Epoch 19/20 Iteration 3236/3560 Training loss: 1.2777 0.0608 sec/batch\n", + "Epoch 19/20 Iteration 3237/3560 Training loss: 1.2780 0.0683 sec/batch\n", + "Epoch 19/20 Iteration 3238/3560 Training loss: 1.2783 0.0692 sec/batch\n", + "Epoch 19/20 Iteration 3239/3560 Training loss: 1.2778 0.0658 sec/batch\n", + "Epoch 19/20 Iteration 3240/3560 Training loss: 1.2776 0.0610 sec/batch\n", + "Epoch 19/20 Iteration 3241/3560 Training loss: 1.2769 0.0609 sec/batch\n", + "Epoch 19/20 Iteration 3242/3560 Training loss: 1.2758 0.0638 sec/batch\n", + "Epoch 19/20 Iteration 3243/3560 Training loss: 1.2747 0.0647 sec/batch\n", + "Epoch 19/20 Iteration 3244/3560 Training loss: 1.2746 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3245/3560 Training loss: 1.2738 0.0606 sec/batch\n", + "Epoch 19/20 Iteration 3246/3560 Training loss: 1.2749 0.0730 sec/batch\n", + "Epoch 19/20 Iteration 3247/3560 Training loss: 1.2747 0.0621 sec/batch\n", + "Epoch 19/20 Iteration 3248/3560 Training loss: 1.2744 0.0649 sec/batch\n", + "Epoch 19/20 Iteration 3249/3560 Training loss: 1.2748 0.0608 sec/batch\n", + "Epoch 19/20 Iteration 3250/3560 Training loss: 1.2740 0.0648 sec/batch\n", + "Epoch 19/20 Iteration 3251/3560 Training loss: 1.2736 0.0641 sec/batch\n", + "Epoch 19/20 Iteration 3252/3560 Training loss: 1.2730 0.0614 sec/batch\n", + "Epoch 19/20 Iteration 3253/3560 Training loss: 1.2730 0.0605 sec/batch\n", + "Epoch 19/20 
Iteration 3254/3560 Training loss: 1.2733 0.0634 sec/batch\n", + "Epoch 19/20 Iteration 3255/3560 Training loss: 1.2728 0.0680 sec/batch\n", + "Epoch 19/20 Iteration 3256/3560 Training loss: 1.2735 0.0696 sec/batch\n", + "Epoch 19/20 Iteration 3257/3560 Training loss: 1.2735 0.0615 sec/batch\n", + "Epoch 19/20 Iteration 3258/3560 Training loss: 1.2737 0.0747 sec/batch\n", + "Epoch 19/20 Iteration 3259/3560 Training loss: 1.2734 0.0615 sec/batch\n", + "Epoch 19/20 Iteration 3260/3560 Training loss: 1.2734 0.0630 sec/batch\n", + "Epoch 19/20 Iteration 3261/3560 Training loss: 1.2738 0.0652 sec/batch\n", + "Epoch 19/20 Iteration 3262/3560 Training loss: 1.2735 0.0602 sec/batch\n", + "Epoch 19/20 Iteration 3263/3560 Training loss: 1.2730 0.0614 sec/batch\n", + "Epoch 19/20 Iteration 3264/3560 Training loss: 1.2737 0.0625 sec/batch\n", + "Epoch 19/20 Iteration 3265/3560 Training loss: 1.2736 0.0649 sec/batch\n", + "Epoch 19/20 Iteration 3266/3560 Training loss: 1.2746 0.0610 sec/batch\n", + "Epoch 19/20 Iteration 3267/3560 Training loss: 1.2748 0.0626 sec/batch\n", + "Epoch 19/20 Iteration 3268/3560 Training loss: 1.2749 0.0630 sec/batch\n", + "Epoch 19/20 Iteration 3269/3560 Training loss: 1.2748 0.0657 sec/batch\n", + "Epoch 19/20 Iteration 3270/3560 Training loss: 1.2750 0.0631 sec/batch\n", + "Epoch 19/20 Iteration 3271/3560 Training loss: 1.2752 0.0629 sec/batch\n", + "Epoch 19/20 Iteration 3272/3560 Training loss: 1.2749 0.0613 sec/batch\n", + "Epoch 19/20 Iteration 3273/3560 Training loss: 1.2750 0.0646 sec/batch\n", + "Epoch 19/20 Iteration 3274/3560 Training loss: 1.2749 0.0617 sec/batch\n", + "Epoch 19/20 Iteration 3275/3560 Training loss: 1.2754 0.0615 sec/batch\n", + "Epoch 19/20 Iteration 3276/3560 Training loss: 1.2757 0.0617 sec/batch\n", + "Epoch 19/20 Iteration 3277/3560 Training loss: 1.2762 0.0699 sec/batch\n", + "Epoch 19/20 Iteration 3278/3560 Training loss: 1.2758 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3279/3560 Training loss: 1.2758 0.0609 
sec/batch\n", + "Epoch 19/20 Iteration 3280/3560 Training loss: 1.2760 0.0642 sec/batch\n", + "Epoch 19/20 Iteration 3281/3560 Training loss: 1.2761 0.0635 sec/batch\n", + "Epoch 19/20 Iteration 3282/3560 Training loss: 1.2761 0.0621 sec/batch\n", + "Epoch 19/20 Iteration 3283/3560 Training loss: 1.2756 0.0663 sec/batch\n", + "Epoch 19/20 Iteration 3284/3560 Training loss: 1.2756 0.0614 sec/batch\n", + "Epoch 19/20 Iteration 3285/3560 Training loss: 1.2752 0.0611 sec/batch\n", + "Epoch 19/20 Iteration 3286/3560 Training loss: 1.2751 0.0695 sec/batch\n", + "Epoch 19/20 Iteration 3287/3560 Training loss: 1.2747 0.0610 sec/batch\n", + "Epoch 19/20 Iteration 3288/3560 Training loss: 1.2748 0.0618 sec/batch\n", + "Epoch 19/20 Iteration 3289/3560 Training loss: 1.2744 0.0648 sec/batch\n", + "Epoch 19/20 Iteration 3290/3560 Training loss: 1.2744 0.0619 sec/batch\n", + "Epoch 19/20 Iteration 3291/3560 Training loss: 1.2742 0.0605 sec/batch\n", + "Epoch 19/20 Iteration 3292/3560 Training loss: 1.2740 0.0650 sec/batch\n", + "Epoch 19/20 Iteration 3293/3560 Training loss: 1.2737 0.0743 sec/batch\n", + "Epoch 19/20 Iteration 3294/3560 Training loss: 1.2739 0.0667 sec/batch\n", + "Epoch 19/20 Iteration 3295/3560 Training loss: 1.2736 0.0717 sec/batch\n", + "Epoch 19/20 Iteration 3296/3560 Training loss: 1.2736 0.0609 sec/batch\n", + "Epoch 19/20 Iteration 3297/3560 Training loss: 1.2732 0.0630 sec/batch\n", + "Epoch 19/20 Iteration 3298/3560 Training loss: 1.2731 0.0612 sec/batch\n", + "Epoch 19/20 Iteration 3299/3560 Training loss: 1.2729 0.0613 sec/batch\n", + "Epoch 19/20 Iteration 3300/3560 Training loss: 1.2730 0.0625 sec/batch\n", + "Epoch 19/20 Iteration 3301/3560 Training loss: 1.2730 0.0644 sec/batch\n", + "Epoch 19/20 Iteration 3302/3560 Training loss: 1.2726 0.0617 sec/batch\n", + "Epoch 19/20 Iteration 3303/3560 Training loss: 1.2723 0.0626 sec/batch\n", + "Epoch 19/20 Iteration 3304/3560 Training loss: 1.2720 0.0664 sec/batch\n", + "Epoch 19/20 Iteration 3305/3560 
Training loss: 1.2721 0.0624 sec/batch\n", + "Epoch 19/20 Iteration 3306/3560 Training loss: 1.2719 0.0607 sec/batch\n", + "Epoch 19/20 Iteration 3307/3560 Training loss: 1.2718 0.0605 sec/batch\n", + "Epoch 19/20 Iteration 3308/3560 Training loss: 1.2718 0.0691 sec/batch\n", + "Epoch 19/20 Iteration 3309/3560 Training loss: 1.2717 0.0658 sec/batch\n", + "Epoch 19/20 Iteration 3310/3560 Training loss: 1.2715 0.0619 sec/batch\n", + "Epoch 19/20 Iteration 3311/3560 Training loss: 1.2715 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3312/3560 Training loss: 1.2716 0.0757 sec/batch\n", + "Epoch 19/20 Iteration 3313/3560 Training loss: 1.2714 0.0598 sec/batch\n", + "Epoch 19/20 Iteration 3314/3560 Training loss: 1.2716 0.0604 sec/batch\n", + "Epoch 19/20 Iteration 3315/3560 Training loss: 1.2715 0.0633 sec/batch\n", + "Epoch 19/20 Iteration 3316/3560 Training loss: 1.2713 0.0760 sec/batch\n", + "Epoch 19/20 Iteration 3317/3560 Training loss: 1.2713 0.0683 sec/batch\n", + "Epoch 19/20 Iteration 3318/3560 Training loss: 1.2712 0.0642 sec/batch\n", + "Epoch 19/20 Iteration 3319/3560 Training loss: 1.2710 0.0613 sec/batch\n", + "Epoch 19/20 Iteration 3320/3560 Training loss: 1.2706 0.0695 sec/batch\n", + "Epoch 19/20 Iteration 3321/3560 Training loss: 1.2707 0.0624 sec/batch\n", + "Epoch 19/20 Iteration 3322/3560 Training loss: 1.2707 0.0621 sec/batch\n", + "Epoch 19/20 Iteration 3323/3560 Training loss: 1.2706 0.0621 sec/batch\n", + "Epoch 19/20 Iteration 3324/3560 Training loss: 1.2706 0.0667 sec/batch\n", + "Epoch 19/20 Iteration 3325/3560 Training loss: 1.2705 0.0686 sec/batch\n", + "Epoch 19/20 Iteration 3326/3560 Training loss: 1.2702 0.0644 sec/batch\n", + "Epoch 19/20 Iteration 3327/3560 Training loss: 1.2698 0.0608 sec/batch\n", + "Epoch 19/20 Iteration 3328/3560 Training loss: 1.2698 0.0633 sec/batch\n", + "Epoch 19/20 Iteration 3329/3560 Training loss: 1.2697 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3330/3560 Training loss: 1.2693 0.0610 sec/batch\n", + 
"Epoch 19/20 Iteration 3331/3560 Training loss: 1.2694 0.0634 sec/batch\n", + "Epoch 19/20 Iteration 3332/3560 Training loss: 1.2694 0.0696 sec/batch\n", + "Epoch 19/20 Iteration 3333/3560 Training loss: 1.2692 0.0617 sec/batch\n", + "Epoch 19/20 Iteration 3334/3560 Training loss: 1.2689 0.0623 sec/batch\n", + "Epoch 19/20 Iteration 3335/3560 Training loss: 1.2686 0.0620 sec/batch\n", + "Epoch 19/20 Iteration 3336/3560 Training loss: 1.2684 0.0652 sec/batch\n", + "Epoch 19/20 Iteration 3337/3560 Training loss: 1.2684 0.0625 sec/batch\n", + "Epoch 19/20 Iteration 3338/3560 Training loss: 1.2684 0.0615 sec/batch\n", + "Epoch 19/20 Iteration 3339/3560 Training loss: 1.2684 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3340/3560 Training loss: 1.2685 0.0635 sec/batch\n", + "Epoch 19/20 Iteration 3341/3560 Training loss: 1.2687 0.0643 sec/batch\n", + "Epoch 19/20 Iteration 3342/3560 Training loss: 1.2688 0.0624 sec/batch\n", + "Epoch 19/20 Iteration 3343/3560 Training loss: 1.2689 0.0609 sec/batch\n", + "Epoch 19/20 Iteration 3344/3560 Training loss: 1.2688 0.0616 sec/batch\n", + "Epoch 19/20 Iteration 3345/3560 Training loss: 1.2691 0.0646 sec/batch\n", + "Epoch 19/20 Iteration 3346/3560 Training loss: 1.2692 0.0606 sec/batch\n", + "Epoch 19/20 Iteration 3347/3560 Training loss: 1.2691 0.0657 sec/batch\n", + "Epoch 19/20 Iteration 3348/3560 Training loss: 1.2693 0.0638 sec/batch\n", + "Epoch 19/20 Iteration 3349/3560 Training loss: 1.2692 0.0636 sec/batch\n", + "Epoch 19/20 Iteration 3350/3560 Training loss: 1.2694 0.0615 sec/batch\n", + "Epoch 19/20 Iteration 3351/3560 Training loss: 1.2694 0.0607 sec/batch\n", + "Epoch 19/20 Iteration 3352/3560 Training loss: 1.2697 0.0650 sec/batch\n", + "Epoch 19/20 Iteration 3353/3560 Training loss: 1.2699 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3354/3560 Training loss: 1.2697 0.0647 sec/batch\n", + "Epoch 19/20 Iteration 3355/3560 Training loss: 1.2694 0.0624 sec/batch\n", + "Epoch 19/20 Iteration 3356/3560 Training loss: 
1.2693 0.0638 sec/batch\n", + "Epoch 19/20 Iteration 3357/3560 Training loss: 1.2693 0.0623 sec/batch\n", + "Epoch 19/20 Iteration 3358/3560 Training loss: 1.2693 0.0623 sec/batch\n", + "Epoch 19/20 Iteration 3359/3560 Training loss: 1.2693 0.0612 sec/batch\n", + "Epoch 19/20 Iteration 3360/3560 Training loss: 1.2693 0.0717 sec/batch\n", + "Epoch 19/20 Iteration 3361/3560 Training loss: 1.2694 0.0642 sec/batch\n", + "Epoch 19/20 Iteration 3362/3560 Training loss: 1.2693 0.0611 sec/batch\n", + "Epoch 19/20 Iteration 3363/3560 Training loss: 1.2690 0.0610 sec/batch\n", + "Epoch 19/20 Iteration 3364/3560 Training loss: 1.2692 0.0643 sec/batch\n", + "Epoch 19/20 Iteration 3365/3560 Training loss: 1.2693 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3366/3560 Training loss: 1.2693 0.0629 sec/batch\n", + "Epoch 19/20 Iteration 3367/3560 Training loss: 1.2692 0.0617 sec/batch\n", + "Epoch 19/20 Iteration 3368/3560 Training loss: 1.2691 0.0670 sec/batch\n", + "Epoch 19/20 Iteration 3369/3560 Training loss: 1.2692 0.0631 sec/batch\n", + "Epoch 19/20 Iteration 3370/3560 Training loss: 1.2691 0.0622 sec/batch\n", + "Epoch 19/20 Iteration 3371/3560 Training loss: 1.2693 0.0608 sec/batch\n", + "Epoch 19/20 Iteration 3372/3560 Training loss: 1.2696 0.0761 sec/batch\n", + "Epoch 19/20 Iteration 3373/3560 Training loss: 1.2697 0.0665 sec/batch\n", + "Epoch 19/20 Iteration 3374/3560 Training loss: 1.2697 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3375/3560 Training loss: 1.2696 0.0632 sec/batch\n", + "Epoch 19/20 Iteration 3376/3560 Training loss: 1.2694 0.0709 sec/batch\n", + "Epoch 19/20 Iteration 3377/3560 Training loss: 1.2696 0.0614 sec/batch\n", + "Epoch 19/20 Iteration 3378/3560 Training loss: 1.2696 0.0632 sec/batch\n", + "Epoch 19/20 Iteration 3379/3560 Training loss: 1.2696 0.0691 sec/batch\n", + "Epoch 19/20 Iteration 3380/3560 Training loss: 1.2695 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3381/3560 Training loss: 1.2694 0.0640 sec/batch\n", + "Epoch 19/20 
Iteration 3382/3560 Training loss: 1.2696 0.0604 sec/batch\n", + "Epoch 20/20 Iteration 3383/3560 Training loss: 1.3675 0.0629 sec/batch\n", + "Epoch 20/20 Iteration 3384/3560 Training loss: 1.3230 0.0704 sec/batch\n", + "Epoch 20/20 Iteration 3385/3560 Training loss: 1.3073 0.0618 sec/batch\n", + "Epoch 20/20 Iteration 3386/3560 Training loss: 1.2981 0.0606 sec/batch\n", + "Epoch 20/20 Iteration 3387/3560 Training loss: 1.2877 0.0610 sec/batch\n", + "Epoch 20/20 Iteration 3388/3560 Training loss: 1.2768 0.0631 sec/batch\n", + "Epoch 20/20 Iteration 3389/3560 Training loss: 1.2758 0.0663 sec/batch\n", + "Epoch 20/20 Iteration 3390/3560 Training loss: 1.2726 0.0603 sec/batch\n", + "Epoch 20/20 Iteration 3391/3560 Training loss: 1.2720 0.0606 sec/batch\n", + "Epoch 20/20 Iteration 3392/3560 Training loss: 1.2711 0.0665 sec/batch\n", + "Epoch 20/20 Iteration 3393/3560 Training loss: 1.2680 0.0645 sec/batch\n", + "Epoch 20/20 Iteration 3394/3560 Training loss: 1.2683 0.0611 sec/batch\n", + "Epoch 20/20 Iteration 3395/3560 Training loss: 1.2689 0.0606 sec/batch\n", + "Epoch 20/20 Iteration 3396/3560 Training loss: 1.2692 0.0630 sec/batch\n", + "Epoch 20/20 Iteration 3397/3560 Training loss: 1.2681 0.0641 sec/batch\n", + "Epoch 20/20 Iteration 3398/3560 Training loss: 1.2663 0.0633 sec/batch\n", + "Epoch 20/20 Iteration 3399/3560 Training loss: 1.2662 0.0611 sec/batch\n", + "Epoch 20/20 Iteration 3400/3560 Training loss: 1.2675 0.0651 sec/batch\n", + "Epoch 20/20 Iteration 3401/3560 Training loss: 1.2676 0.0628 sec/batch\n", + "Epoch 20/20 Iteration 3402/3560 Training loss: 1.2690 0.0609 sec/batch\n", + "Epoch 20/20 Iteration 3403/3560 Training loss: 1.2684 0.0648 sec/batch\n", + "Epoch 20/20 Iteration 3404/3560 Training loss: 1.2688 0.0652 sec/batch\n", + "Epoch 20/20 Iteration 3405/3560 Training loss: 1.2682 0.0619 sec/batch\n", + "Epoch 20/20 Iteration 3406/3560 Training loss: 1.2683 0.0615 sec/batch\n", + "Epoch 20/20 Iteration 3407/3560 Training loss: 1.2683 0.0612 
sec/batch\n", + "Epoch 20/20 Iteration 3408/3560 Training loss: 1.2661 0.0629 sec/batch\n", + "Epoch 20/20 Iteration 3409/3560 Training loss: 1.2647 0.0700 sec/batch\n", + "Epoch 20/20 Iteration 3410/3560 Training loss: 1.2653 0.0608 sec/batch\n", + "Epoch 20/20 Iteration 3411/3560 Training loss: 1.2656 0.0612 sec/batch\n", + "Epoch 20/20 Iteration 3412/3560 Training loss: 1.2659 0.0649 sec/batch\n", + "Epoch 20/20 Iteration 3413/3560 Training loss: 1.2655 0.0617 sec/batch\n", + "Epoch 20/20 Iteration 3414/3560 Training loss: 1.2647 0.0650 sec/batch\n", + "Epoch 20/20 Iteration 3415/3560 Training loss: 1.2650 0.0646 sec/batch\n", + "Epoch 20/20 Iteration 3416/3560 Training loss: 1.2651 0.0693 sec/batch\n", + "Epoch 20/20 Iteration 3417/3560 Training loss: 1.2647 0.0618 sec/batch\n", + "Epoch 20/20 Iteration 3418/3560 Training loss: 1.2646 0.0603 sec/batch\n", + "Epoch 20/20 Iteration 3419/3560 Training loss: 1.2640 0.0610 sec/batch\n", + "Epoch 20/20 Iteration 3420/3560 Training loss: 1.2629 0.0685 sec/batch\n", + "Epoch 20/20 Iteration 3421/3560 Training loss: 1.2617 0.0673 sec/batch\n", + "Epoch 20/20 Iteration 3422/3560 Training loss: 1.2615 0.0613 sec/batch\n", + "Epoch 20/20 Iteration 3423/3560 Training loss: 1.2609 0.0643 sec/batch\n", + "Epoch 20/20 Iteration 3424/3560 Training loss: 1.2619 0.0626 sec/batch\n", + "Epoch 20/20 Iteration 3425/3560 Training loss: 1.2618 0.0605 sec/batch\n", + "Epoch 20/20 Iteration 3426/3560 Training loss: 1.2614 0.0604 sec/batch\n", + "Epoch 20/20 Iteration 3427/3560 Training loss: 1.2616 0.0610 sec/batch\n", + "Epoch 20/20 Iteration 3428/3560 Training loss: 1.2609 0.0691 sec/batch\n", + "Epoch 20/20 Iteration 3429/3560 Training loss: 1.2605 0.0612 sec/batch\n", + "Epoch 20/20 Iteration 3430/3560 Training loss: 1.2598 0.0609 sec/batch\n", + "Epoch 20/20 Iteration 3431/3560 Training loss: 1.2596 0.0657 sec/batch\n", + "Epoch 20/20 Iteration 3432/3560 Training loss: 1.2599 0.0656 sec/batch\n", + "Epoch 20/20 Iteration 3433/3560 
Training loss: 1.2594 0.0639 sec/batch\n", + "Epoch 20/20 Iteration 3434/3560 Training loss: 1.2602 0.0611 sec/batch\n", + "Epoch 20/20 Iteration 3435/3560 Training loss: 1.2602 0.0599 sec/batch\n", + "Epoch 20/20 Iteration 3436/3560 Training loss: 1.2604 0.0647 sec/batch\n", + "Epoch 20/20 Iteration 3437/3560 Training loss: 1.2602 0.0629 sec/batch\n", + "Epoch 20/20 Iteration 3438/3560 Training loss: 1.2601 0.0610 sec/batch\n", + "Epoch 20/20 Iteration 3439/3560 Training loss: 1.2603 0.0614 sec/batch\n", + "Epoch 20/20 Iteration 3440/3560 Training loss: 1.2601 0.0638 sec/batch\n", + "Epoch 20/20 Iteration 3441/3560 Training loss: 1.2596 0.0616 sec/batch\n", + "Epoch 20/20 Iteration 3442/3560 Training loss: 1.2602 0.0601 sec/batch\n", + "Epoch 20/20 Iteration 3443/3560 Training loss: 1.2602 0.0607 sec/batch\n", + "Epoch 20/20 Iteration 3444/3560 Training loss: 1.2609 0.0670 sec/batch\n", + "Epoch 20/20 Iteration 3445/3560 Training loss: 1.2611 0.0650 sec/batch\n", + "Epoch 20/20 Iteration 3446/3560 Training loss: 1.2612 0.0667 sec/batch\n", + "Epoch 20/20 Iteration 3447/3560 Training loss: 1.2612 0.0626 sec/batch\n", + "Epoch 20/20 Iteration 3448/3560 Training loss: 1.2613 0.0655 sec/batch\n", + "Epoch 20/20 Iteration 3449/3560 Training loss: 1.2617 0.0634 sec/batch\n", + "Epoch 20/20 Iteration 3450/3560 Training loss: 1.2614 0.0632 sec/batch\n", + "Epoch 20/20 Iteration 3451/3560 Training loss: 1.2614 0.0606 sec/batch\n", + "Epoch 20/20 Iteration 3452/3560 Training loss: 1.2613 0.0703 sec/batch\n", + "Epoch 20/20 Iteration 3453/3560 Training loss: 1.2620 0.0627 sec/batch\n", + "Epoch 20/20 Iteration 3454/3560 Training loss: 1.2623 0.0619 sec/batch\n", + "Epoch 20/20 Iteration 3455/3560 Training loss: 1.2628 0.0620 sec/batch\n", + "Epoch 20/20 Iteration 3456/3560 Training loss: 1.2626 0.0692 sec/batch\n", + "Epoch 20/20 Iteration 3457/3560 Training loss: 1.2627 0.0608 sec/batch\n", + "Epoch 20/20 Iteration 3458/3560 Training loss: 1.2628 0.0613 sec/batch\n", + 
"Epoch 20/20 Iteration 3459/3560 Training loss: 1.2627 0.0669 sec/batch\n", + "Epoch 20/20 Iteration 3460/3560 Training loss: 1.2626 0.0695 sec/batch\n", + "Epoch 20/20 Iteration 3461/3560 Training loss: 1.2620 0.0626 sec/batch\n", + "Epoch 20/20 Iteration 3462/3560 Training loss: 1.2620 0.0677 sec/batch\n", + "Epoch 20/20 Iteration 3463/3560 Training loss: 1.2617 0.0644 sec/batch\n", + "Epoch 20/20 Iteration 3464/3560 Training loss: 1.2616 0.0642 sec/batch\n", + "Epoch 20/20 Iteration 3465/3560 Training loss: 1.2611 0.0619 sec/batch\n", + "Epoch 20/20 Iteration 3466/3560 Training loss: 1.2611 0.0619 sec/batch\n", + "Epoch 20/20 Iteration 3467/3560 Training loss: 1.2608 0.0609 sec/batch\n", + "Epoch 20/20 Iteration 3468/3560 Training loss: 1.2607 0.0650 sec/batch\n", + "Epoch 20/20 Iteration 3469/3560 Training loss: 1.2605 0.0636 sec/batch\n", + "Epoch 20/20 Iteration 3470/3560 Training loss: 1.2603 0.0726 sec/batch\n", + "Epoch 20/20 Iteration 3471/3560 Training loss: 1.2600 0.0614 sec/batch\n", + "Epoch 20/20 Iteration 3472/3560 Training loss: 1.2600 0.0637 sec/batch\n", + "Epoch 20/20 Iteration 3473/3560 Training loss: 1.2599 0.0617 sec/batch\n", + "Epoch 20/20 Iteration 3474/3560 Training loss: 1.2599 0.0601 sec/batch\n", + "Epoch 20/20 Iteration 3475/3560 Training loss: 1.2595 0.0612 sec/batch\n", + "Epoch 20/20 Iteration 3476/3560 Training loss: 1.2592 0.0676 sec/batch\n", + "Epoch 20/20 Iteration 3477/3560 Training loss: 1.2590 0.0654 sec/batch\n", + "Epoch 20/20 Iteration 3478/3560 Training loss: 1.2590 0.0604 sec/batch\n", + "Epoch 20/20 Iteration 3479/3560 Training loss: 1.2590 0.0693 sec/batch\n", + "Epoch 20/20 Iteration 3480/3560 Training loss: 1.2587 0.0691 sec/batch\n", + "Epoch 20/20 Iteration 3481/3560 Training loss: 1.2583 0.0636 sec/batch\n", + "Epoch 20/20 Iteration 3482/3560 Training loss: 1.2580 0.0663 sec/batch\n", + "Epoch 20/20 Iteration 3483/3560 Training loss: 1.2579 0.0624 sec/batch\n", + "Epoch 20/20 Iteration 3484/3560 Training loss: 
1.2577 0.0638 sec/batch\n", + "Epoch 20/20 Iteration 3485/3560 Training loss: 1.2576 0.0632 sec/batch\n", + "Epoch 20/20 Iteration 3486/3560 Training loss: 1.2575 0.0665 sec/batch\n", + "Epoch 20/20 Iteration 3487/3560 Training loss: 1.2574 0.0611 sec/batch\n", + "Epoch 20/20 Iteration 3488/3560 Training loss: 1.2574 0.0688 sec/batch\n", + "Epoch 20/20 Iteration 3489/3560 Training loss: 1.2574 0.0605 sec/batch\n", + "Epoch 20/20 Iteration 3490/3560 Training loss: 1.2574 0.0628 sec/batch\n", + "Epoch 20/20 Iteration 3491/3560 Training loss: 1.2573 0.0674 sec/batch\n", + "Epoch 20/20 Iteration 3492/3560 Training loss: 1.2574 0.0640 sec/batch\n", + "Epoch 20/20 Iteration 3493/3560 Training loss: 1.2573 0.0630 sec/batch\n", + "Epoch 20/20 Iteration 3494/3560 Training loss: 1.2573 0.0600 sec/batch\n", + "Epoch 20/20 Iteration 3495/3560 Training loss: 1.2572 0.0616 sec/batch\n", + "Epoch 20/20 Iteration 3496/3560 Training loss: 1.2571 0.0644 sec/batch\n", + "Epoch 20/20 Iteration 3497/3560 Training loss: 1.2568 0.0684 sec/batch\n", + "Epoch 20/20 Iteration 3498/3560 Training loss: 1.2565 0.0721 sec/batch\n", + "Epoch 20/20 Iteration 3499/3560 Training loss: 1.2565 0.0652 sec/batch\n", + "Epoch 20/20 Iteration 3500/3560 Training loss: 1.2566 0.0625 sec/batch\n", + "Epoch 20/20 Iteration 3501/3560 Training loss: 1.2565 0.0655 sec/batch\n", + "Epoch 20/20 Iteration 3502/3560 Training loss: 1.2566 0.0611 sec/batch\n", + "Epoch 20/20 Iteration 3503/3560 Training loss: 1.2565 0.0642 sec/batch\n", + "Epoch 20/20 Iteration 3504/3560 Training loss: 1.2562 0.0617 sec/batch\n", + "Epoch 20/20 Iteration 3505/3560 Training loss: 1.2558 0.0638 sec/batch\n", + "Epoch 20/20 Iteration 3506/3560 Training loss: 1.2558 0.0618 sec/batch\n", + "Epoch 20/20 Iteration 3507/3560 Training loss: 1.2556 0.0743 sec/batch\n", + "Epoch 20/20 Iteration 3508/3560 Training loss: 1.2552 0.0614 sec/batch\n", + "Epoch 20/20 Iteration 3509/3560 Training loss: 1.2553 0.0611 sec/batch\n", + "Epoch 20/20 
Iteration 3510/3560 Training loss: 1.2553 0.0613 sec/batch\n", + "Epoch 20/20 Iteration 3511/3560 Training loss: 1.2551 0.0697 sec/batch\n", + "Epoch 20/20 Iteration 3512/3560 Training loss: 1.2549 0.0688 sec/batch\n", + "Epoch 20/20 Iteration 3513/3560 Training loss: 1.2545 0.0604 sec/batch\n", + "Epoch 20/20 Iteration 3514/3560 Training loss: 1.2543 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3515/3560 Training loss: 1.2543 0.0618 sec/batch\n", + "Epoch 20/20 Iteration 3516/3560 Training loss: 1.2543 0.0650 sec/batch\n", + "Epoch 20/20 Iteration 3517/3560 Training loss: 1.2543 0.0604 sec/batch\n", + "Epoch 20/20 Iteration 3518/3560 Training loss: 1.2543 0.0684 sec/batch\n", + "Epoch 20/20 Iteration 3519/3560 Training loss: 1.2545 0.0726 sec/batch\n", + "Epoch 20/20 Iteration 3520/3560 Training loss: 1.2546 0.0666 sec/batch\n", + "Epoch 20/20 Iteration 3521/3560 Training loss: 1.2546 0.0651 sec/batch\n", + "Epoch 20/20 Iteration 3522/3560 Training loss: 1.2546 0.0628 sec/batch\n", + "Epoch 20/20 Iteration 3523/3560 Training loss: 1.2550 0.0664 sec/batch\n", + "Epoch 20/20 Iteration 3524/3560 Training loss: 1.2550 0.0612 sec/batch\n", + "Epoch 20/20 Iteration 3525/3560 Training loss: 1.2548 0.0623 sec/batch\n", + "Epoch 20/20 Iteration 3526/3560 Training loss: 1.2551 0.0628 sec/batch\n", + "Epoch 20/20 Iteration 3527/3560 Training loss: 1.2550 0.0743 sec/batch\n", + "Epoch 20/20 Iteration 3528/3560 Training loss: 1.2552 0.0615 sec/batch\n", + "Epoch 20/20 Iteration 3529/3560 Training loss: 1.2552 0.0635 sec/batch\n", + "Epoch 20/20 Iteration 3530/3560 Training loss: 1.2555 0.0620 sec/batch\n", + "Epoch 20/20 Iteration 3531/3560 Training loss: 1.2557 0.0639 sec/batch\n", + "Epoch 20/20 Iteration 3532/3560 Training loss: 1.2555 0.0633 sec/batch\n", + "Epoch 20/20 Iteration 3533/3560 Training loss: 1.2552 0.0608 sec/batch\n", + "Epoch 20/20 Iteration 3534/3560 Training loss: 1.2551 0.0624 sec/batch\n", + "Epoch 20/20 Iteration 3535/3560 Training loss: 1.2552 0.0654 
sec/batch\n", + "[training log condensed: the first run finishes at Epoch 20/20 Iteration 3560/3560 with training loss 1.2554; a second run then restarts at Epoch 1/20 Iteration 1/3560 with loss 4.4216, which falls steadily through Epochs 1 and 2 to about 2.441 and continues into Epoch 3/20 at about 2.265]\n", + "Epoch 3/20 Iteration 
423/3560 Training loss: 2.2650 0.0610 sec/batch\n", + "Epoch 3/20 Iteration 424/3560 Training loss: 2.2642 0.0595 sec/batch\n", + "Epoch 3/20 Iteration 425/3560 Training loss: 2.2635 0.0664 sec/batch\n", + "Epoch 3/20 Iteration 426/3560 Training loss: 2.2632 0.0690 sec/batch\n", + "Epoch 3/20 Iteration 427/3560 Training loss: 2.2631 0.0578 sec/batch\n", + "Epoch 3/20 Iteration 428/3560 Training loss: 2.2627 0.0581 sec/batch\n", + "Epoch 3/20 Iteration 429/3560 Training loss: 2.2623 0.0582 sec/batch\n", + "Epoch 3/20 Iteration 430/3560 Training loss: 2.2616 0.0712 sec/batch\n", + "Epoch 3/20 Iteration 431/3560 Training loss: 2.2611 0.0647 sec/batch\n", + "Epoch 3/20 Iteration 432/3560 Training loss: 2.2612 0.0597 sec/batch\n", + "Epoch 3/20 Iteration 433/3560 Training loss: 2.2606 0.0594 sec/batch\n", + "Epoch 3/20 Iteration 434/3560 Training loss: 2.2604 0.0706 sec/batch\n", + "Epoch 3/20 Iteration 435/3560 Training loss: 2.2596 0.0589 sec/batch\n", + "Epoch 3/20 Iteration 436/3560 Training loss: 2.2591 0.0600 sec/batch\n", + "Epoch 3/20 Iteration 437/3560 Training loss: 2.2584 0.0638 sec/batch\n", + "Epoch 3/20 Iteration 438/3560 Training loss: 2.2582 0.0620 sec/batch\n", + "Epoch 3/20 Iteration 439/3560 Training loss: 2.2574 0.0609 sec/batch\n", + "Epoch 3/20 Iteration 440/3560 Training loss: 2.2567 0.0595 sec/batch\n", + "Epoch 3/20 Iteration 441/3560 Training loss: 2.2558 0.0576 sec/batch\n", + "Epoch 3/20 Iteration 442/3560 Training loss: 2.2552 0.0736 sec/batch\n", + "Epoch 3/20 Iteration 443/3560 Training loss: 2.2548 0.0653 sec/batch\n", + "Epoch 3/20 Iteration 444/3560 Training loss: 2.2542 0.0596 sec/batch\n", + "Epoch 3/20 Iteration 445/3560 Training loss: 2.2535 0.0600 sec/batch\n", + "Epoch 3/20 Iteration 446/3560 Training loss: 2.2533 0.0610 sec/batch\n", + "Epoch 3/20 Iteration 447/3560 Training loss: 2.2527 0.0639 sec/batch\n", + "Epoch 3/20 Iteration 448/3560 Training loss: 2.2523 0.0585 sec/batch\n", + "Epoch 3/20 Iteration 449/3560 Training loss: 
2.2515 0.0580 sec/batch\n", + "Epoch 3/20 Iteration 450/3560 Training loss: 2.2509 0.0707 sec/batch\n", + "Epoch 3/20 Iteration 451/3560 Training loss: 2.2503 0.0591 sec/batch\n", + "Epoch 3/20 Iteration 452/3560 Training loss: 2.2498 0.0585 sec/batch\n", + "Epoch 3/20 Iteration 453/3560 Training loss: 2.2493 0.0602 sec/batch\n", + "Epoch 3/20 Iteration 454/3560 Training loss: 2.2488 0.0653 sec/batch\n", + "Epoch 3/20 Iteration 455/3560 Training loss: 2.2481 0.0735 sec/batch\n", + "Epoch 3/20 Iteration 456/3560 Training loss: 2.2474 0.0605 sec/batch\n", + "Epoch 3/20 Iteration 457/3560 Training loss: 2.2471 0.0659 sec/batch\n", + "Epoch 3/20 Iteration 458/3560 Training loss: 2.2466 0.0690 sec/batch\n", + "Epoch 3/20 Iteration 459/3560 Training loss: 2.2459 0.0673 sec/batch\n", + "Epoch 3/20 Iteration 460/3560 Training loss: 2.2453 0.0630 sec/batch\n", + "Epoch 3/20 Iteration 461/3560 Training loss: 2.2447 0.0668 sec/batch\n", + "Epoch 3/20 Iteration 462/3560 Training loss: 2.2442 0.0662 sec/batch\n", + "Epoch 3/20 Iteration 463/3560 Training loss: 2.2438 0.0602 sec/batch\n", + "Epoch 3/20 Iteration 464/3560 Training loss: 2.2436 0.0750 sec/batch\n", + "Epoch 3/20 Iteration 465/3560 Training loss: 2.2433 0.0633 sec/batch\n", + "Epoch 3/20 Iteration 466/3560 Training loss: 2.2428 0.0627 sec/batch\n", + "Epoch 3/20 Iteration 467/3560 Training loss: 2.2425 0.0596 sec/batch\n", + "Epoch 3/20 Iteration 468/3560 Training loss: 2.2421 0.0598 sec/batch\n", + "Epoch 3/20 Iteration 469/3560 Training loss: 2.2416 0.0659 sec/batch\n", + "Epoch 3/20 Iteration 470/3560 Training loss: 2.2412 0.0603 sec/batch\n", + "Epoch 3/20 Iteration 471/3560 Training loss: 2.2406 0.0639 sec/batch\n", + "Epoch 3/20 Iteration 472/3560 Training loss: 2.2400 0.0605 sec/batch\n", + "Epoch 3/20 Iteration 473/3560 Training loss: 2.2396 0.0659 sec/batch\n", + "Epoch 3/20 Iteration 474/3560 Training loss: 2.2392 0.0679 sec/batch\n", + "Epoch 3/20 Iteration 475/3560 Training loss: 2.2390 0.0690 
sec/batch\n", + "Epoch 3/20 Iteration 476/3560 Training loss: 2.2387 0.0778 sec/batch\n", + "Epoch 3/20 Iteration 477/3560 Training loss: 2.2385 0.0701 sec/batch\n", + "Epoch 3/20 Iteration 478/3560 Training loss: 2.2380 0.0627 sec/batch\n", + "Epoch 3/20 Iteration 479/3560 Training loss: 2.2375 0.0794 sec/batch\n", + "Epoch 3/20 Iteration 480/3560 Training loss: 2.2373 0.0713 sec/batch\n", + "Epoch 3/20 Iteration 481/3560 Training loss: 2.2370 0.0661 sec/batch\n", + "Epoch 3/20 Iteration 482/3560 Training loss: 2.2364 0.0652 sec/batch\n", + "Epoch 3/20 Iteration 483/3560 Training loss: 2.2362 0.0640 sec/batch\n", + "Epoch 3/20 Iteration 484/3560 Training loss: 2.2359 0.0662 sec/batch\n", + "Epoch 3/20 Iteration 485/3560 Training loss: 2.2356 0.0623 sec/batch\n", + "Epoch 3/20 Iteration 486/3560 Training loss: 2.2353 0.0636 sec/batch\n", + "Epoch 3/20 Iteration 487/3560 Training loss: 2.2348 0.0627 sec/batch\n", + "Epoch 3/20 Iteration 488/3560 Training loss: 2.2342 0.0667 sec/batch\n", + "Epoch 3/20 Iteration 489/3560 Training loss: 2.2339 0.0692 sec/batch\n", + "Epoch 3/20 Iteration 490/3560 Training loss: 2.2337 0.0603 sec/batch\n", + "Epoch 3/20 Iteration 491/3560 Training loss: 2.2333 0.0603 sec/batch\n", + "Epoch 3/20 Iteration 492/3560 Training loss: 2.2331 0.0659 sec/batch\n", + "Epoch 3/20 Iteration 493/3560 Training loss: 2.2328 0.0779 sec/batch\n", + "Epoch 3/20 Iteration 494/3560 Training loss: 2.2325 0.0666 sec/batch\n", + "Epoch 3/20 Iteration 495/3560 Training loss: 2.2325 0.0618 sec/batch\n", + "Epoch 3/20 Iteration 496/3560 Training loss: 2.2320 0.0578 sec/batch\n", + "Epoch 3/20 Iteration 497/3560 Training loss: 2.2319 0.0653 sec/batch\n", + "Epoch 3/20 Iteration 498/3560 Training loss: 2.2315 0.0604 sec/batch\n", + "Epoch 3/20 Iteration 499/3560 Training loss: 2.2311 0.0680 sec/batch\n", + "Epoch 3/20 Iteration 500/3560 Training loss: 2.2308 0.0616 sec/batch\n", + "Epoch 3/20 Iteration 501/3560 Training loss: 2.2305 0.0603 sec/batch\n", + "Epoch 
3/20 Iteration 502/3560 Training loss: 2.2304 0.0611 sec/batch\n", + "Epoch 3/20 Iteration 503/3560 Training loss: 2.2301 0.0637 sec/batch\n", + "Epoch 3/20 Iteration 504/3560 Training loss: 2.2300 0.0614 sec/batch\n", + "Epoch 3/20 Iteration 505/3560 Training loss: 2.2296 0.0686 sec/batch\n", + "Epoch 3/20 Iteration 506/3560 Training loss: 2.2292 0.0633 sec/batch\n", + "Epoch 3/20 Iteration 507/3560 Training loss: 2.2289 0.0685 sec/batch\n", + "Epoch 3/20 Iteration 508/3560 Training loss: 2.2288 0.0688 sec/batch\n", + "Epoch 3/20 Iteration 509/3560 Training loss: 2.2286 0.0691 sec/batch\n", + "Epoch 3/20 Iteration 510/3560 Training loss: 2.2284 0.0680 sec/batch\n", + "Epoch 3/20 Iteration 511/3560 Training loss: 2.2280 0.0792 sec/batch\n", + "Epoch 3/20 Iteration 512/3560 Training loss: 2.2277 0.0775 sec/batch\n", + "Epoch 3/20 Iteration 513/3560 Training loss: 2.2274 0.0700 sec/batch\n", + "Epoch 3/20 Iteration 514/3560 Training loss: 2.2270 0.0589 sec/batch\n", + "Epoch 3/20 Iteration 515/3560 Training loss: 2.2265 0.0639 sec/batch\n", + "Epoch 3/20 Iteration 516/3560 Training loss: 2.2264 0.0667 sec/batch\n", + "Epoch 3/20 Iteration 517/3560 Training loss: 2.2262 0.0711 sec/batch\n", + "Epoch 3/20 Iteration 518/3560 Training loss: 2.2259 0.0800 sec/batch\n", + "Epoch 3/20 Iteration 519/3560 Training loss: 2.2256 0.0618 sec/batch\n", + "Epoch 3/20 Iteration 520/3560 Training loss: 2.2252 0.0681 sec/batch\n", + "Epoch 3/20 Iteration 521/3560 Training loss: 2.2250 0.0646 sec/batch\n", + "Epoch 3/20 Iteration 522/3560 Training loss: 2.2247 0.0776 sec/batch\n", + "Epoch 3/20 Iteration 523/3560 Training loss: 2.2244 0.0649 sec/batch\n", + "Epoch 3/20 Iteration 524/3560 Training loss: 2.2243 0.0719 sec/batch\n", + "Epoch 3/20 Iteration 525/3560 Training loss: 2.2240 0.0596 sec/batch\n", + "Epoch 3/20 Iteration 526/3560 Training loss: 2.2237 0.0695 sec/batch\n", + "Epoch 3/20 Iteration 527/3560 Training loss: 2.2233 0.0708 sec/batch\n", + "Epoch 3/20 Iteration 528/3560 
Training loss: 2.2229 0.0720 sec/batch\n", + "Epoch 3/20 Iteration 529/3560 Training loss: 2.2228 0.0781 sec/batch\n", + "Epoch 3/20 Iteration 530/3560 Training loss: 2.2225 0.0729 sec/batch\n", + "Epoch 3/20 Iteration 531/3560 Training loss: 2.2223 0.0691 sec/batch\n", + "Epoch 3/20 Iteration 532/3560 Training loss: 2.2220 0.0719 sec/batch\n", + "Epoch 3/20 Iteration 533/3560 Training loss: 2.2216 0.0629 sec/batch\n", + "Epoch 3/20 Iteration 534/3560 Training loss: 2.2213 0.0609 sec/batch\n", + "Epoch 4/20 Iteration 535/3560 Training loss: 2.2801 0.0571 sec/batch\n", + "Epoch 4/20 Iteration 536/3560 Training loss: 2.2109 0.0626 sec/batch\n", + "Epoch 4/20 Iteration 537/3560 Training loss: 2.1903 0.0649 sec/batch\n", + "Epoch 4/20 Iteration 538/3560 Training loss: 2.1814 0.0593 sec/batch\n", + "Epoch 4/20 Iteration 539/3560 Training loss: 2.1753 0.0593 sec/batch\n", + "Epoch 4/20 Iteration 540/3560 Training loss: 2.1662 0.0816 sec/batch\n", + "Epoch 4/20 Iteration 541/3560 Training loss: 2.1665 0.0613 sec/batch\n", + "Epoch 4/20 Iteration 542/3560 Training loss: 2.1667 0.0609 sec/batch\n", + "Epoch 4/20 Iteration 543/3560 Training loss: 2.1672 0.0663 sec/batch\n", + "Epoch 4/20 Iteration 544/3560 Training loss: 2.1669 0.0702 sec/batch\n", + "Epoch 4/20 Iteration 545/3560 Training loss: 2.1642 0.0676 sec/batch\n", + "Epoch 4/20 Iteration 546/3560 Training loss: 2.1623 0.0609 sec/batch\n", + "Epoch 4/20 Iteration 547/3560 Training loss: 2.1619 0.0590 sec/batch\n", + "Epoch 4/20 Iteration 548/3560 Training loss: 2.1643 0.0590 sec/batch\n", + "Epoch 4/20 Iteration 549/3560 Training loss: 2.1632 0.0591 sec/batch\n", + "Epoch 4/20 Iteration 550/3560 Training loss: 2.1625 0.0590 sec/batch\n", + "Epoch 4/20 Iteration 551/3560 Training loss: 2.1622 0.0577 sec/batch\n", + "Epoch 4/20 Iteration 552/3560 Training loss: 2.1644 0.0604 sec/batch\n", + "Epoch 4/20 Iteration 553/3560 Training loss: 2.1645 0.0587 sec/batch\n", + "Epoch 4/20 Iteration 554/3560 Training loss: 2.1639 
0.0598 sec/batch\n", + "Epoch 4/20 Iteration 555/3560 Training loss: 2.1633 0.0582 sec/batch\n", + "Epoch 4/20 Iteration 556/3560 Training loss: 2.1644 0.0600 sec/batch\n", + "Epoch 4/20 Iteration 557/3560 Training loss: 2.1640 0.0591 sec/batch\n", + "Epoch 4/20 Iteration 558/3560 Training loss: 2.1631 0.0579 sec/batch\n", + "Epoch 4/20 Iteration 559/3560 Training loss: 2.1626 0.0575 sec/batch\n", + "Epoch 4/20 Iteration 560/3560 Training loss: 2.1612 0.0665 sec/batch\n", + "Epoch 4/20 Iteration 561/3560 Training loss: 2.1600 0.0669 sec/batch\n", + "Epoch 4/20 Iteration 562/3560 Training loss: 2.1600 0.0601 sec/batch\n", + "Epoch 4/20 Iteration 563/3560 Training loss: 2.1606 0.0580 sec/batch\n", + "Epoch 4/20 Iteration 564/3560 Training loss: 2.1605 0.0587 sec/batch\n", + "Epoch 4/20 Iteration 565/3560 Training loss: 2.1601 0.0694 sec/batch\n", + "Epoch 4/20 Iteration 566/3560 Training loss: 2.1593 0.0654 sec/batch\n", + "Epoch 4/20 Iteration 567/3560 Training loss: 2.1588 0.0593 sec/batch\n", + "Epoch 4/20 Iteration 568/3560 Training loss: 2.1591 0.0710 sec/batch\n", + "Epoch 4/20 Iteration 569/3560 Training loss: 2.1586 0.0611 sec/batch\n", + "Epoch 4/20 Iteration 570/3560 Training loss: 2.1582 0.0597 sec/batch\n", + "Epoch 4/20 Iteration 571/3560 Training loss: 2.1578 0.0697 sec/batch\n", + "Epoch 4/20 Iteration 572/3560 Training loss: 2.1561 0.0602 sec/batch\n", + "Epoch 4/20 Iteration 573/3560 Training loss: 2.1549 0.0656 sec/batch\n", + "Epoch 4/20 Iteration 574/3560 Training loss: 2.1539 0.0630 sec/batch\n", + "Epoch 4/20 Iteration 575/3560 Training loss: 2.1534 0.0635 sec/batch\n", + "Epoch 4/20 Iteration 576/3560 Training loss: 2.1531 0.0699 sec/batch\n", + "Epoch 4/20 Iteration 577/3560 Training loss: 2.1526 0.0608 sec/batch\n", + "Epoch 4/20 Iteration 578/3560 Training loss: 2.1516 0.0633 sec/batch\n", + "Epoch 4/20 Iteration 579/3560 Training loss: 2.1515 0.0720 sec/batch\n", + "Epoch 4/20 Iteration 580/3560 Training loss: 2.1500 0.0626 sec/batch\n", + 
"Epoch 4/20 Iteration 581/3560 Training loss: 2.1497 0.0657 sec/batch\n", + "Epoch 4/20 Iteration 582/3560 Training loss: 2.1490 0.0605 sec/batch\n", + "Epoch 4/20 Iteration 583/3560 Training loss: 2.1486 0.0643 sec/batch\n", + "Epoch 4/20 Iteration 584/3560 Training loss: 2.1490 0.0636 sec/batch\n", + "Epoch 4/20 Iteration 585/3560 Training loss: 2.1483 0.0668 sec/batch\n", + "Epoch 4/20 Iteration 586/3560 Training loss: 2.1487 0.0611 sec/batch\n", + "Epoch 4/20 Iteration 587/3560 Training loss: 2.1480 0.0659 sec/batch\n", + "Epoch 4/20 Iteration 588/3560 Training loss: 2.1478 0.0617 sec/batch\n", + "Epoch 4/20 Iteration 589/3560 Training loss: 2.1473 0.0636 sec/batch\n", + "Epoch 4/20 Iteration 590/3560 Training loss: 2.1472 0.0580 sec/batch\n", + "Epoch 4/20 Iteration 591/3560 Training loss: 2.1471 0.0629 sec/batch\n", + "Epoch 4/20 Iteration 592/3560 Training loss: 2.1465 0.0694 sec/batch\n", + "Epoch 4/20 Iteration 593/3560 Training loss: 2.1459 0.0651 sec/batch\n", + "Epoch 4/20 Iteration 594/3560 Training loss: 2.1463 0.0570 sec/batch\n", + "Epoch 4/20 Iteration 595/3560 Training loss: 2.1460 0.0573 sec/batch\n", + "Epoch 4/20 Iteration 596/3560 Training loss: 2.1463 0.0644 sec/batch\n", + "Epoch 4/20 Iteration 597/3560 Training loss: 2.1465 0.0690 sec/batch\n", + "Epoch 4/20 Iteration 598/3560 Training loss: 2.1465 0.0621 sec/batch\n", + "Epoch 4/20 Iteration 599/3560 Training loss: 2.1461 0.0606 sec/batch\n", + "Epoch 4/20 Iteration 600/3560 Training loss: 2.1461 0.0654 sec/batch\n", + "Epoch 4/20 Iteration 601/3560 Training loss: 2.1459 0.0605 sec/batch\n", + "Epoch 4/20 Iteration 602/3560 Training loss: 2.1453 0.0630 sec/batch\n", + "Epoch 4/20 Iteration 603/3560 Training loss: 2.1449 0.0578 sec/batch\n", + "Epoch 4/20 Iteration 604/3560 Training loss: 2.1446 0.0584 sec/batch\n", + "Epoch 4/20 Iteration 605/3560 Training loss: 2.1448 0.0605 sec/batch\n", + "Epoch 4/20 Iteration 606/3560 Training loss: 2.1445 0.0638 sec/batch\n", + "Epoch 4/20 Iteration 
607/3560 Training loss: 2.1445 0.0593 sec/batch\n", + "Epoch 4/20 Iteration 608/3560 Training loss: 2.1439 0.0657 sec/batch\n", + "Epoch 4/20 Iteration 609/3560 Training loss: 2.1435 0.0596 sec/batch\n", + "Epoch 4/20 Iteration 610/3560 Training loss: 2.1438 0.0592 sec/batch\n", + "Epoch 4/20 Iteration 611/3560 Training loss: 2.1435 0.0590 sec/batch\n", + "Epoch 4/20 Iteration 612/3560 Training loss: 2.1435 0.0596 sec/batch\n", + "Epoch 4/20 Iteration 613/3560 Training loss: 2.1428 0.0619 sec/batch\n", + "Epoch 4/20 Iteration 614/3560 Training loss: 2.1425 0.0587 sec/batch\n", + "Epoch 4/20 Iteration 615/3560 Training loss: 2.1419 0.0579 sec/batch\n", + "Epoch 4/20 Iteration 616/3560 Training loss: 2.1418 0.0591 sec/batch\n", + "Epoch 4/20 Iteration 617/3560 Training loss: 2.1412 0.0625 sec/batch\n", + "Epoch 4/20 Iteration 618/3560 Training loss: 2.1408 0.0579 sec/batch\n", + "Epoch 4/20 Iteration 619/3560 Training loss: 2.1400 0.0614 sec/batch\n", + "Epoch 4/20 Iteration 620/3560 Training loss: 2.1396 0.0598 sec/batch\n", + "Epoch 4/20 Iteration 621/3560 Training loss: 2.1394 0.0618 sec/batch\n", + "Epoch 4/20 Iteration 622/3560 Training loss: 2.1390 0.0590 sec/batch\n", + "Epoch 4/20 Iteration 623/3560 Training loss: 2.1385 0.0585 sec/batch\n", + "Epoch 4/20 Iteration 624/3560 Training loss: 2.1383 0.0682 sec/batch\n", + "Epoch 4/20 Iteration 625/3560 Training loss: 2.1378 0.0632 sec/batch\n", + "Epoch 4/20 Iteration 626/3560 Training loss: 2.1376 0.0583 sec/batch\n", + "Epoch 4/20 Iteration 627/3560 Training loss: 2.1369 0.0603 sec/batch\n", + "Epoch 4/20 Iteration 628/3560 Training loss: 2.1364 0.0702 sec/batch\n", + "Epoch 4/20 Iteration 629/3560 Training loss: 2.1359 0.0616 sec/batch\n", + "Epoch 4/20 Iteration 630/3560 Training loss: 2.1355 0.0597 sec/batch\n", + "Epoch 4/20 Iteration 631/3560 Training loss: 2.1351 0.0612 sec/batch\n", + "Epoch 4/20 Iteration 632/3560 Training loss: 2.1346 0.0635 sec/batch\n", + "Epoch 4/20 Iteration 633/3560 Training loss: 
2.1340 0.0609 sec/batch\n", + "Epoch 4/20 Iteration 634/3560 Training loss: 2.1333 0.0598 sec/batch\n", + "Epoch 4/20 Iteration 635/3560 Training loss: 2.1331 0.0635 sec/batch\n", + "Epoch 4/20 Iteration 636/3560 Training loss: 2.1328 0.0625 sec/batch\n", + "Epoch 4/20 Iteration 637/3560 Training loss: 2.1323 0.0667 sec/batch\n", + "Epoch 4/20 Iteration 638/3560 Training loss: 2.1319 0.0606 sec/batch\n", + "Epoch 4/20 Iteration 639/3560 Training loss: 2.1314 0.0614 sec/batch\n", + "Epoch 4/20 Iteration 640/3560 Training loss: 2.1311 0.0612 sec/batch\n", + "Epoch 4/20 Iteration 641/3560 Training loss: 2.1308 0.0692 sec/batch\n", + "Epoch 4/20 Iteration 642/3560 Training loss: 2.1306 0.0598 sec/batch\n", + "Epoch 4/20 Iteration 643/3560 Training loss: 2.1305 0.0603 sec/batch\n", + "Epoch 4/20 Iteration 644/3560 Training loss: 2.1302 0.0584 sec/batch\n", + "Epoch 4/20 Iteration 645/3560 Training loss: 2.1299 0.0608 sec/batch\n", + "Epoch 4/20 Iteration 646/3560 Training loss: 2.1297 0.0602 sec/batch\n", + "Epoch 4/20 Iteration 647/3560 Training loss: 2.1294 0.0644 sec/batch\n", + "Epoch 4/20 Iteration 648/3560 Training loss: 2.1290 0.0673 sec/batch\n", + "Epoch 4/20 Iteration 649/3560 Training loss: 2.1286 0.0588 sec/batch\n", + "Epoch 4/20 Iteration 650/3560 Training loss: 2.1280 0.0597 sec/batch\n", + "Epoch 4/20 Iteration 651/3560 Training loss: 2.1277 0.0586 sec/batch\n", + "Epoch 4/20 Iteration 652/3560 Training loss: 2.1274 0.0624 sec/batch\n", + "Epoch 4/20 Iteration 653/3560 Training loss: 2.1272 0.0659 sec/batch\n", + "Epoch 4/20 Iteration 654/3560 Training loss: 2.1270 0.0587 sec/batch\n", + "Epoch 4/20 Iteration 655/3560 Training loss: 2.1269 0.0585 sec/batch\n", + "Epoch 4/20 Iteration 656/3560 Training loss: 2.1265 0.0618 sec/batch\n", + "Epoch 4/20 Iteration 657/3560 Training loss: 2.1261 0.0674 sec/batch\n", + "Epoch 4/20 Iteration 658/3560 Training loss: 2.1260 0.0596 sec/batch\n", + "Epoch 4/20 Iteration 659/3560 Training loss: 2.1257 0.0594 
sec/batch\n", + "Epoch 4/20 Iteration 660/3560 Training loss: 2.1252 0.0660 sec/batch\n", + "Epoch 4/20 Iteration 661/3560 Training loss: 2.1251 0.0615 sec/batch\n", + "Epoch 4/20 Iteration 662/3560 Training loss: 2.1250 0.0597 sec/batch\n", + "Epoch 4/20 Iteration 663/3560 Training loss: 2.1247 0.0580 sec/batch\n", + "Epoch 4/20 Iteration 664/3560 Training loss: 2.1246 0.0645 sec/batch\n", + "Epoch 4/20 Iteration 665/3560 Training loss: 2.1242 0.0688 sec/batch\n", + "Epoch 4/20 Iteration 666/3560 Training loss: 2.1237 0.0600 sec/batch\n", + "Epoch 4/20 Iteration 667/3560 Training loss: 2.1236 0.0588 sec/batch\n", + "Epoch 4/20 Iteration 668/3560 Training loss: 2.1235 0.0671 sec/batch\n", + "Epoch 4/20 Iteration 669/3560 Training loss: 2.1233 0.0600 sec/batch\n", + "Epoch 4/20 Iteration 670/3560 Training loss: 2.1232 0.0587 sec/batch\n", + "Epoch 4/20 Iteration 671/3560 Training loss: 2.1230 0.0588 sec/batch\n", + "Epoch 4/20 Iteration 672/3560 Training loss: 2.1229 0.0591 sec/batch\n", + "Epoch 4/20 Iteration 673/3560 Training loss: 2.1230 0.0631 sec/batch\n", + "Epoch 4/20 Iteration 674/3560 Training loss: 2.1226 0.0607 sec/batch\n", + "Epoch 4/20 Iteration 675/3560 Training loss: 2.1226 0.0637 sec/batch\n", + "Epoch 4/20 Iteration 676/3560 Training loss: 2.1224 0.0717 sec/batch\n", + "Epoch 4/20 Iteration 677/3560 Training loss: 2.1221 0.0599 sec/batch\n", + "Epoch 4/20 Iteration 678/3560 Training loss: 2.1220 0.0607 sec/batch\n", + "Epoch 4/20 Iteration 679/3560 Training loss: 2.1217 0.0649 sec/batch\n", + "Epoch 4/20 Iteration 680/3560 Training loss: 2.1217 0.0640 sec/batch\n", + "Epoch 4/20 Iteration 681/3560 Training loss: 2.1215 0.0624 sec/batch\n", + "Epoch 4/20 Iteration 682/3560 Training loss: 2.1216 0.0584 sec/batch\n", + "Epoch 4/20 Iteration 683/3560 Training loss: 2.1215 0.0581 sec/batch\n", + "Epoch 4/20 Iteration 684/3560 Training loss: 2.1211 0.0591 sec/batch\n", + "Epoch 4/20 Iteration 685/3560 Training loss: 2.1209 0.0639 sec/batch\n", + "Epoch 
4/20 Iteration 686/3560 Training loss: 2.1209 0.0590 sec/batch\n", + "Epoch 4/20 Iteration 687/3560 Training loss: 2.1208 0.0650 sec/batch\n", + "Epoch 4/20 Iteration 688/3560 Training loss: 2.1206 0.0663 sec/batch\n", + "Epoch 4/20 Iteration 689/3560 Training loss: 2.1204 0.0638 sec/batch\n", + "Epoch 4/20 Iteration 690/3560 Training loss: 2.1203 0.0586 sec/batch\n", + "Epoch 4/20 Iteration 691/3560 Training loss: 2.1200 0.0598 sec/batch\n", + "Epoch 4/20 Iteration 692/3560 Training loss: 2.1198 0.0638 sec/batch\n", + "Epoch 4/20 Iteration 693/3560 Training loss: 2.1194 0.0668 sec/batch\n", + "Epoch 4/20 Iteration 694/3560 Training loss: 2.1194 0.0614 sec/batch\n", + "Epoch 4/20 Iteration 695/3560 Training loss: 2.1193 0.0580 sec/batch\n", + "Epoch 4/20 Iteration 696/3560 Training loss: 2.1191 0.0658 sec/batch\n", + "Epoch 4/20 Iteration 697/3560 Training loss: 2.1189 0.0655 sec/batch\n", + "Epoch 4/20 Iteration 698/3560 Training loss: 2.1187 0.0597 sec/batch\n", + "Epoch 4/20 Iteration 699/3560 Training loss: 2.1185 0.0581 sec/batch\n", + "Epoch 4/20 Iteration 700/3560 Training loss: 2.1183 0.0711 sec/batch\n", + "Epoch 4/20 Iteration 701/3560 Training loss: 2.1182 0.0611 sec/batch\n", + "Epoch 4/20 Iteration 702/3560 Training loss: 2.1182 0.0595 sec/batch\n", + "Epoch 4/20 Iteration 703/3560 Training loss: 2.1180 0.0585 sec/batch\n", + "Epoch 4/20 Iteration 704/3560 Training loss: 2.1177 0.0720 sec/batch\n", + "Epoch 4/20 Iteration 705/3560 Training loss: 2.1174 0.0649 sec/batch\n", + "Epoch 4/20 Iteration 706/3560 Training loss: 2.1172 0.0646 sec/batch\n", + "Epoch 4/20 Iteration 707/3560 Training loss: 2.1171 0.0671 sec/batch\n", + "Epoch 4/20 Iteration 708/3560 Training loss: 2.1169 0.0623 sec/batch\n", + "Epoch 4/20 Iteration 709/3560 Training loss: 2.1168 0.0592 sec/batch\n", + "Epoch 4/20 Iteration 710/3560 Training loss: 2.1166 0.0590 sec/batch\n", + "Epoch 4/20 Iteration 711/3560 Training loss: 2.1163 0.0586 sec/batch\n", + "Epoch 4/20 Iteration 712/3560 
Training loss: 2.1161 0.0634 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 2.1861 0.0691 sec/batch\n", + "Epoch 5/20 Iteration 714/3560 Training loss: 2.1157 0.0601 sec/batch\n", + "Epoch 5/20 Iteration 715/3560 Training loss: 2.0982 0.0591 sec/batch\n", + "Epoch 5/20 Iteration 716/3560 Training loss: 2.0886 0.0618 sec/batch\n", + "Epoch 5/20 Iteration 717/3560 Training loss: 2.0850 0.0628 sec/batch\n", + "Epoch 5/20 Iteration 718/3560 Training loss: 2.0776 0.0590 sec/batch\n", + "Epoch 5/20 Iteration 719/3560 Training loss: 2.0787 0.0592 sec/batch\n", + "Epoch 5/20 Iteration 720/3560 Training loss: 2.0771 0.0653 sec/batch\n", + "Epoch 5/20 Iteration 721/3560 Training loss: 2.0787 0.0628 sec/batch\n", + "Epoch 5/20 Iteration 722/3560 Training loss: 2.0789 0.0593 sec/batch\n", + "Epoch 5/20 Iteration 723/3560 Training loss: 2.0758 0.0597 sec/batch\n", + "Epoch 5/20 Iteration 724/3560 Training loss: 2.0743 0.0624 sec/batch\n", + "Epoch 5/20 Iteration 725/3560 Training loss: 2.0741 0.0648 sec/batch\n", + "Epoch 5/20 Iteration 726/3560 Training loss: 2.0760 0.0581 sec/batch\n", + "Epoch 5/20 Iteration 727/3560 Training loss: 2.0755 0.0579 sec/batch\n", + "Epoch 5/20 Iteration 728/3560 Training loss: 2.0741 0.0623 sec/batch\n", + "Epoch 5/20 Iteration 729/3560 Training loss: 2.0735 0.0595 sec/batch\n", + "Epoch 5/20 Iteration 730/3560 Training loss: 2.0756 0.0607 sec/batch\n", + "Epoch 5/20 Iteration 731/3560 Training loss: 2.0759 0.0592 sec/batch\n", + "Epoch 5/20 Iteration 732/3560 Training loss: 2.0758 0.0688 sec/batch\n", + "Epoch 5/20 Iteration 733/3560 Training loss: 2.0748 0.0613 sec/batch\n", + "Epoch 5/20 Iteration 734/3560 Training loss: 2.0755 0.0601 sec/batch\n", + "Epoch 5/20 Iteration 735/3560 Training loss: 2.0747 0.0579 sec/batch\n", + "Epoch 5/20 Iteration 736/3560 Training loss: 2.0738 0.0647 sec/batch\n", + "Epoch 5/20 Iteration 737/3560 Training loss: 2.0735 0.0642 sec/batch\n", + "Epoch 5/20 Iteration 738/3560 Training loss: 2.0722 
0.0617 sec/batch\n", + "Epoch 5/20 Iteration 739/3560 Training loss: 2.0711 0.0595 sec/batch\n", + "Epoch 5/20 Iteration 740/3560 Training loss: 2.0712 0.0658 sec/batch\n", + "Epoch 5/20 Iteration 741/3560 Training loss: 2.0720 0.0650 sec/batch\n", + "Epoch 5/20 Iteration 742/3560 Training loss: 2.0719 0.0648 sec/batch\n", + "Epoch 5/20 Iteration 743/3560 Training loss: 2.0715 0.0591 sec/batch\n", + "Epoch 5/20 Iteration 744/3560 Training loss: 2.0707 0.0636 sec/batch\n", + "Epoch 5/20 Iteration 745/3560 Training loss: 2.0705 0.0585 sec/batch\n", + "Epoch 5/20 Iteration 746/3560 Training loss: 2.0709 0.0587 sec/batch\n", + "Epoch 5/20 Iteration 747/3560 Training loss: 2.0705 0.0585 sec/batch\n", + "Epoch 5/20 Iteration 748/3560 Training loss: 2.0702 0.0649 sec/batch\n", + "Epoch 5/20 Iteration 749/3560 Training loss: 2.0697 0.0623 sec/batch\n", + "Epoch 5/20 Iteration 750/3560 Training loss: 2.0683 0.0613 sec/batch\n", + "Epoch 5/20 Iteration 751/3560 Training loss: 2.0673 0.0651 sec/batch\n", + "Epoch 5/20 Iteration 752/3560 Training loss: 2.0663 0.0620 sec/batch\n", + "Epoch 5/20 Iteration 753/3560 Training loss: 2.0658 0.0707 sec/batch\n", + "Epoch 5/20 Iteration 754/3560 Training loss: 2.0656 0.0577 sec/batch\n", + "Epoch 5/20 Iteration 755/3560 Training loss: 2.0650 0.0607 sec/batch\n", + "Epoch 5/20 Iteration 756/3560 Training loss: 2.0640 0.0705 sec/batch\n", + "Epoch 5/20 Iteration 757/3560 Training loss: 2.0640 0.0594 sec/batch\n", + "Epoch 5/20 Iteration 758/3560 Training loss: 2.0625 0.0625 sec/batch\n", + "Epoch 5/20 Iteration 759/3560 Training loss: 2.0625 0.0669 sec/batch\n", + "Epoch 5/20 Iteration 760/3560 Training loss: 2.0618 0.0665 sec/batch\n", + "Epoch 5/20 Iteration 761/3560 Training loss: 2.0614 0.0593 sec/batch\n", + "Epoch 5/20 Iteration 762/3560 Training loss: 2.0619 0.0584 sec/batch\n", + "Epoch 5/20 Iteration 763/3560 Training loss: 2.0612 0.0586 sec/batch\n", + "Epoch 5/20 Iteration 764/3560 Training loss: 2.0618 0.0659 sec/batch\n", + 
"Epoch 5/20 Iteration 765/3560 Training loss: 2.0614 0.0651 sec/batch\n", + "Epoch 5/20 Iteration 766/3560 Training loss: 2.0612 0.0598 sec/batch\n", + "Epoch 5/20 Iteration 767/3560 Training loss: 2.0606 0.0628 sec/batch\n", + "Epoch 5/20 Iteration 768/3560 Training loss: 2.0608 0.0615 sec/batch\n", + "Epoch 5/20 Iteration 769/3560 Training loss: 2.0608 0.0610 sec/batch\n", + "Epoch 5/20 Iteration 770/3560 Training loss: 2.0605 0.0593 sec/batch\n", + "Epoch 5/20 Iteration 771/3560 Training loss: 2.0599 0.0596 sec/batch\n", + "Epoch 5/20 Iteration 772/3560 Training loss: 2.0605 0.0631 sec/batch\n", + "Epoch 5/20 Iteration 773/3560 Training loss: 2.0602 0.0623 sec/batch\n", + "Epoch 5/20 Iteration 774/3560 Training loss: 2.0606 0.0621 sec/batch\n", + "Epoch 5/20 Iteration 775/3560 Training loss: 2.0610 0.0579 sec/batch\n", + "Epoch 5/20 Iteration 776/3560 Training loss: 2.0611 0.0591 sec/batch\n", + "Epoch 5/20 Iteration 777/3560 Training loss: 2.0608 0.0656 sec/batch\n", + "Epoch 5/20 Iteration 778/3560 Training loss: 2.0609 0.0584 sec/batch\n", + "Epoch 5/20 Iteration 779/3560 Training loss: 2.0607 0.0586 sec/batch\n", + "Epoch 5/20 Iteration 780/3560 Training loss: 2.0602 0.0618 sec/batch\n", + "Epoch 5/20 Iteration 781/3560 Training loss: 2.0597 0.0608 sec/batch\n", + "Epoch 5/20 Iteration 782/3560 Training loss: 2.0596 0.0589 sec/batch\n", + "Epoch 5/20 Iteration 783/3560 Training loss: 2.0598 0.0614 sec/batch\n", + "Epoch 5/20 Iteration 784/3560 Training loss: 2.0595 0.0701 sec/batch\n", + "Epoch 5/20 Iteration 785/3560 Training loss: 2.0595 0.0686 sec/batch\n", + "Epoch 5/20 Iteration 786/3560 Training loss: 2.0590 0.0606 sec/batch\n", + "Epoch 5/20 Iteration 787/3560 Training loss: 2.0586 0.0610 sec/batch\n", + "Epoch 5/20 Iteration 788/3560 Training loss: 2.0587 0.0620 sec/batch\n", + "Epoch 5/20 Iteration 789/3560 Training loss: 2.0584 0.0621 sec/batch\n", + "Epoch 5/20 Iteration 790/3560 Training loss: 2.0583 0.0585 sec/batch\n", + "Epoch 5/20 Iteration 
791/3560 Training loss: 2.0576 0.0590 sec/batch\n", + "Epoch 5/20 Iteration 792/3560 Training loss: 2.0572 0.0668 sec/batch\n", + "Epoch 5/20 Iteration 793/3560 Training loss: 2.0567 0.0671 sec/batch\n", + "Epoch 5/20 Iteration 794/3560 Training loss: 2.0567 0.0656 sec/batch\n", + "Epoch 5/20 Iteration 795/3560 Training loss: 2.0559 0.0672 sec/batch\n", + "Epoch 5/20 Iteration 796/3560 Training loss: 2.0556 0.0587 sec/batch\n", + "Epoch 5/20 Iteration 797/3560 Training loss: 2.0548 0.0673 sec/batch\n", + "Epoch 5/20 Iteration 798/3560 Training loss: 2.0544 0.0630 sec/batch\n", + "Epoch 5/20 Iteration 799/3560 Training loss: 2.0542 0.0704 sec/batch\n", + "Epoch 5/20 Iteration 800/3560 Training loss: 2.0539 0.0647 sec/batch\n", + "Epoch 5/20 Iteration 801/3560 Training loss: 2.0532 0.0584 sec/batch\n", + "Epoch 5/20 Iteration 802/3560 Training loss: 2.0532 0.0585 sec/batch\n", + "Epoch 5/20 Iteration 803/3560 Training loss: 2.0528 0.0615 sec/batch\n", + "Epoch 5/20 Iteration 804/3560 Training loss: 2.0526 0.0599 sec/batch\n", + "Epoch 5/20 Iteration 805/3560 Training loss: 2.0519 0.0614 sec/batch\n", + "Epoch 5/20 Iteration 806/3560 Training loss: 2.0515 0.0593 sec/batch\n", + "Epoch 5/20 Iteration 807/3560 Training loss: 2.0510 0.0758 sec/batch\n", + "Epoch 5/20 Iteration 808/3560 Training loss: 2.0508 0.0583 sec/batch\n", + "Epoch 5/20 Iteration 809/3560 Training loss: 2.0505 0.0624 sec/batch\n", + "Epoch 5/20 Iteration 810/3560 Training loss: 2.0501 0.0639 sec/batch\n", + "Epoch 5/20 Iteration 811/3560 Training loss: 2.0495 0.0614 sec/batch\n", + "Epoch 5/20 Iteration 812/3560 Training loss: 2.0488 0.0693 sec/batch\n", + "Epoch 5/20 Iteration 813/3560 Training loss: 2.0486 0.0609 sec/batch\n", + "Epoch 5/20 Iteration 814/3560 Training loss: 2.0483 0.0622 sec/batch\n", + "Epoch 5/20 Iteration 815/3560 Training loss: 2.0479 0.0629 sec/batch\n", + "Epoch 5/20 Iteration 816/3560 Training loss: 2.0475 0.0613 sec/batch\n", + "Epoch 5/20 Iteration 817/3560 Training loss: 
2.0470 0.0579 sec/batch\n", + "Epoch 5/20 Iteration 818/3560 Training loss: 2.0467 0.0622 sec/batch\n", + "Epoch 5/20 Iteration 819/3560 Training loss: 2.0465 0.0661 sec/batch\n", + "Epoch 5/20 Iteration 820/3560 Training loss: 2.0463 0.0600 sec/batch\n", + "Epoch 5/20 Iteration 821/3560 Training loss: 2.0462 0.0586 sec/batch\n", + "Epoch 5/20 Iteration 822/3560 Training loss: 2.0460 0.0589 sec/batch\n", + "Epoch 5/20 Iteration 823/3560 Training loss: 2.0457 0.0603 sec/batch\n", + "Epoch 5/20 Iteration 824/3560 Training loss: 2.0455 0.0657 sec/batch\n", + "Epoch 5/20 Iteration 825/3560 Training loss: 2.0452 0.0596 sec/batch\n", + "Epoch 5/20 Iteration 826/3560 Training loss: 2.0448 0.0587 sec/batch\n", + "Epoch 5/20 Iteration 827/3560 Training loss: 2.0444 0.0632 sec/batch\n", + "Epoch 5/20 Iteration 828/3560 Training loss: 2.0438 0.0609 sec/batch\n", + "Epoch 5/20 Iteration 829/3560 Training loss: 2.0435 0.0590 sec/batch\n", + "Epoch 5/20 Iteration 830/3560 Training loss: 2.0431 0.0583 sec/batch\n", + "Epoch 5/20 Iteration 831/3560 Training loss: 2.0429 0.0657 sec/batch\n", + "Epoch 5/20 Iteration 832/3560 Training loss: 2.0428 0.0655 sec/batch\n", + "Epoch 5/20 Iteration 833/3560 Training loss: 2.0426 0.0718 sec/batch\n", + "Epoch 5/20 Iteration 834/3560 Training loss: 2.0421 0.0590 sec/batch\n", + "Epoch 5/20 Iteration 835/3560 Training loss: 2.0418 0.0667 sec/batch\n", + "Epoch 5/20 Iteration 836/3560 Training loss: 2.0418 0.0653 sec/batch\n", + "Epoch 5/20 Iteration 837/3560 Training loss: 2.0415 0.0601 sec/batch\n", + "Epoch 5/20 Iteration 838/3560 Training loss: 2.0411 0.0586 sec/batch\n", + "Epoch 5/20 Iteration 839/3560 Training loss: 2.0411 0.0600 sec/batch\n", + "Epoch 5/20 Iteration 840/3560 Training loss: 2.0410 0.0657 sec/batch\n", + "Epoch 5/20 Iteration 841/3560 Training loss: 2.0408 0.0637 sec/batch\n", + "Epoch 5/20 Iteration 842/3560 Training loss: 2.0406 0.0673 sec/batch\n", + "Epoch 5/20 Iteration 843/3560 Training loss: 2.0402 0.0678 
sec/batch\n", + "Epoch 5/20 Iteration 844/3560 Training loss: 2.0397 0.0590 sec/batch\n", + "[... iterations 845-889 omitted: epoch 5 loss declines steadily from 2.0397 to 2.0332 ...]\n", + "Epoch 5/20 Iteration 890/3560 Training loss: 2.0332 0.0630 sec/batch\n", + "Epoch 6/20 Iteration 891/3560 Training loss: 2.1041 0.0606 sec/batch\n", + "[... iterations 892-1067 omitted: epoch 6 loss declines steadily from 2.1041 to 1.9588 ...]\n", + "Epoch 6/20 Iteration 1068/3560 Training loss: 1.9588 0.0622 sec/batch\n", + "Epoch 7/20 Iteration 1069/3560 Training loss: 2.0324 0.0639 sec/batch\n", + "[... iterations 1070-1245 omitted: epoch 7 loss declines steadily from 2.0324 to 1.8874 ...]\n", + "Epoch 7/20 Iteration 1246/3560 Training loss: 1.8874 0.0670 sec/batch\n", + "Epoch 8/20 Iteration 1247/3560 Training loss: 1.9726 0.0592 sec/batch\n", + "[... iterations 1248-1362 omitted: epoch 8 loss declines steadily from 1.9726 to 1.8318 ...]\n", + "Epoch 8/20 Iteration 1363/3560 Training loss: 1.8318 0.0684 sec/batch\n", + "Epoch 
8/20 Iteration 1364/3560 Training loss: 1.8317 0.0602 sec/batch\n", + "Epoch 8/20 Iteration 1365/3560 Training loss: 1.8316 0.0631 sec/batch\n", + "Epoch 8/20 Iteration 1366/3560 Training loss: 1.8314 0.0664 sec/batch\n", + "Epoch 8/20 Iteration 1367/3560 Training loss: 1.8313 0.0612 sec/batch\n", + "Epoch 8/20 Iteration 1368/3560 Training loss: 1.8309 0.0655 sec/batch\n", + "Epoch 8/20 Iteration 1369/3560 Training loss: 1.8304 0.0616 sec/batch\n", + "Epoch 8/20 Iteration 1370/3560 Training loss: 1.8304 0.0630 sec/batch\n", + "Epoch 8/20 Iteration 1371/3560 Training loss: 1.8303 0.0630 sec/batch\n", + "Epoch 8/20 Iteration 1372/3560 Training loss: 1.8298 0.0616 sec/batch\n", + "Epoch 8/20 Iteration 1373/3560 Training loss: 1.8299 0.0682 sec/batch\n", + "Epoch 8/20 Iteration 1374/3560 Training loss: 1.8299 0.0636 sec/batch\n", + "Epoch 8/20 Iteration 1375/3560 Training loss: 1.8298 0.0611 sec/batch\n", + "Epoch 8/20 Iteration 1376/3560 Training loss: 1.8297 0.0599 sec/batch\n", + "Epoch 8/20 Iteration 1377/3560 Training loss: 1.8293 0.0603 sec/batch\n", + "Epoch 8/20 Iteration 1378/3560 Training loss: 1.8288 0.0618 sec/batch\n", + "Epoch 8/20 Iteration 1379/3560 Training loss: 1.8288 0.0599 sec/batch\n", + "Epoch 8/20 Iteration 1380/3560 Training loss: 1.8287 0.0651 sec/batch\n", + "Epoch 8/20 Iteration 1381/3560 Training loss: 1.8286 0.0671 sec/batch\n", + "Epoch 8/20 Iteration 1382/3560 Training loss: 1.8285 0.0646 sec/batch\n", + "Epoch 8/20 Iteration 1383/3560 Training loss: 1.8285 0.0645 sec/batch\n", + "Epoch 8/20 Iteration 1384/3560 Training loss: 1.8285 0.0630 sec/batch\n", + "Epoch 8/20 Iteration 1385/3560 Training loss: 1.8285 0.0627 sec/batch\n", + "Epoch 8/20 Iteration 1386/3560 Training loss: 1.8282 0.0617 sec/batch\n", + "Epoch 8/20 Iteration 1387/3560 Training loss: 1.8283 0.0678 sec/batch\n", + "Epoch 8/20 Iteration 1388/3560 Training loss: 1.8282 0.0671 sec/batch\n", + "Epoch 8/20 Iteration 1389/3560 Training loss: 1.8280 0.0632 sec/batch\n", + 
"Epoch 8/20 Iteration 1390/3560 Training loss: 1.8279 0.0648 sec/batch\n", + "Epoch 8/20 Iteration 1391/3560 Training loss: 1.8277 0.0605 sec/batch\n", + "Epoch 8/20 Iteration 1392/3560 Training loss: 1.8277 0.0641 sec/batch\n", + "Epoch 8/20 Iteration 1393/3560 Training loss: 1.8276 0.0655 sec/batch\n", + "Epoch 8/20 Iteration 1394/3560 Training loss: 1.8277 0.0631 sec/batch\n", + "Epoch 8/20 Iteration 1395/3560 Training loss: 1.8276 0.0600 sec/batch\n", + "Epoch 8/20 Iteration 1396/3560 Training loss: 1.8273 0.0623 sec/batch\n", + "Epoch 8/20 Iteration 1397/3560 Training loss: 1.8270 0.0640 sec/batch\n", + "Epoch 8/20 Iteration 1398/3560 Training loss: 1.8269 0.0637 sec/batch\n", + "Epoch 8/20 Iteration 1399/3560 Training loss: 1.8269 0.0613 sec/batch\n", + "Epoch 8/20 Iteration 1400/3560 Training loss: 1.8268 0.0604 sec/batch\n", + "Epoch 8/20 Iteration 1401/3560 Training loss: 1.8268 0.0677 sec/batch\n", + "Epoch 8/20 Iteration 1402/3560 Training loss: 1.8266 0.0617 sec/batch\n", + "Epoch 8/20 Iteration 1403/3560 Training loss: 1.8265 0.0623 sec/batch\n", + "Epoch 8/20 Iteration 1404/3560 Training loss: 1.8263 0.0600 sec/batch\n", + "Epoch 8/20 Iteration 1405/3560 Training loss: 1.8260 0.0673 sec/batch\n", + "Epoch 8/20 Iteration 1406/3560 Training loss: 1.8260 0.0686 sec/batch\n", + "Epoch 8/20 Iteration 1407/3560 Training loss: 1.8261 0.0595 sec/batch\n", + "Epoch 8/20 Iteration 1408/3560 Training loss: 1.8260 0.0591 sec/batch\n", + "Epoch 8/20 Iteration 1409/3560 Training loss: 1.8259 0.0670 sec/batch\n", + "Epoch 8/20 Iteration 1410/3560 Training loss: 1.8258 0.0627 sec/batch\n", + "Epoch 8/20 Iteration 1411/3560 Training loss: 1.8257 0.0593 sec/batch\n", + "Epoch 8/20 Iteration 1412/3560 Training loss: 1.8255 0.0592 sec/batch\n", + "Epoch 8/20 Iteration 1413/3560 Training loss: 1.8255 0.0681 sec/batch\n", + "Epoch 8/20 Iteration 1414/3560 Training loss: 1.8257 0.0628 sec/batch\n", + "Epoch 8/20 Iteration 1415/3560 Training loss: 1.8254 0.0595 sec/batch\n", 
+ "Epoch 8/20 Iteration 1416/3560 Training loss: 1.8253 0.0607 sec/batch\n", + "Epoch 8/20 Iteration 1417/3560 Training loss: 1.8251 0.0720 sec/batch\n", + "Epoch 8/20 Iteration 1418/3560 Training loss: 1.8249 0.0630 sec/batch\n", + "Epoch 8/20 Iteration 1419/3560 Training loss: 1.8249 0.0598 sec/batch\n", + "Epoch 8/20 Iteration 1420/3560 Training loss: 1.8248 0.0598 sec/batch\n", + "Epoch 8/20 Iteration 1421/3560 Training loss: 1.8248 0.0607 sec/batch\n", + "Epoch 8/20 Iteration 1422/3560 Training loss: 1.8246 0.0596 sec/batch\n", + "Epoch 8/20 Iteration 1423/3560 Training loss: 1.8244 0.0629 sec/batch\n", + "Epoch 8/20 Iteration 1424/3560 Training loss: 1.8243 0.0606 sec/batch\n", + "Epoch 9/20 Iteration 1425/3560 Training loss: 1.9252 0.0608 sec/batch\n", + "Epoch 9/20 Iteration 1426/3560 Training loss: 1.8640 0.0706 sec/batch\n", + "Epoch 9/20 Iteration 1427/3560 Training loss: 1.8427 0.0630 sec/batch\n", + "Epoch 9/20 Iteration 1428/3560 Training loss: 1.8305 0.0611 sec/batch\n", + "Epoch 9/20 Iteration 1429/3560 Training loss: 1.8233 0.0662 sec/batch\n", + "Epoch 9/20 Iteration 1430/3560 Training loss: 1.8121 0.0603 sec/batch\n", + "Epoch 9/20 Iteration 1431/3560 Training loss: 1.8103 0.0588 sec/batch\n", + "Epoch 9/20 Iteration 1432/3560 Training loss: 1.8076 0.0611 sec/batch\n", + "Epoch 9/20 Iteration 1433/3560 Training loss: 1.8090 0.0606 sec/batch\n", + "Epoch 9/20 Iteration 1434/3560 Training loss: 1.8075 0.0665 sec/batch\n", + "Epoch 9/20 Iteration 1435/3560 Training loss: 1.8035 0.0605 sec/batch\n", + "Epoch 9/20 Iteration 1436/3560 Training loss: 1.8007 0.0607 sec/batch\n", + "Epoch 9/20 Iteration 1437/3560 Training loss: 1.7998 0.0730 sec/batch\n", + "Epoch 9/20 Iteration 1438/3560 Training loss: 1.8013 0.0600 sec/batch\n", + "Epoch 9/20 Iteration 1439/3560 Training loss: 1.8002 0.0690 sec/batch\n", + "Epoch 9/20 Iteration 1440/3560 Training loss: 1.7988 0.0595 sec/batch\n", + "Epoch 9/20 Iteration 1441/3560 Training loss: 1.7982 0.0646 
sec/batch\n", + "Epoch 9/20 Iteration 1442/3560 Training loss: 1.7999 0.0628 sec/batch\n", + "Epoch 9/20 Iteration 1443/3560 Training loss: 1.7999 0.0592 sec/batch\n", + "Epoch 9/20 Iteration 1444/3560 Training loss: 1.8005 0.0632 sec/batch\n", + "Epoch 9/20 Iteration 1445/3560 Training loss: 1.7994 0.0761 sec/batch\n", + "Epoch 9/20 Iteration 1446/3560 Training loss: 1.7999 0.0623 sec/batch\n", + "Epoch 9/20 Iteration 1447/3560 Training loss: 1.7994 0.0616 sec/batch\n", + "Epoch 9/20 Iteration 1448/3560 Training loss: 1.7992 0.0598 sec/batch\n", + "Epoch 9/20 Iteration 1449/3560 Training loss: 1.7992 0.0690 sec/batch\n", + "Epoch 9/20 Iteration 1450/3560 Training loss: 1.7978 0.0593 sec/batch\n", + "Epoch 9/20 Iteration 1451/3560 Training loss: 1.7964 0.0593 sec/batch\n", + "Epoch 9/20 Iteration 1452/3560 Training loss: 1.7964 0.0631 sec/batch\n", + "Epoch 9/20 Iteration 1453/3560 Training loss: 1.7972 0.0626 sec/batch\n", + "Epoch 9/20 Iteration 1454/3560 Training loss: 1.7971 0.0671 sec/batch\n", + "Epoch 9/20 Iteration 1455/3560 Training loss: 1.7966 0.0655 sec/batch\n", + "Epoch 9/20 Iteration 1456/3560 Training loss: 1.7954 0.0605 sec/batch\n", + "Epoch 9/20 Iteration 1457/3560 Training loss: 1.7954 0.0637 sec/batch\n", + "Epoch 9/20 Iteration 1458/3560 Training loss: 1.7961 0.0638 sec/batch\n", + "Epoch 9/20 Iteration 1459/3560 Training loss: 1.7957 0.0624 sec/batch\n", + "Epoch 9/20 Iteration 1460/3560 Training loss: 1.7954 0.0638 sec/batch\n", + "Epoch 9/20 Iteration 1461/3560 Training loss: 1.7949 0.0635 sec/batch\n", + "Epoch 9/20 Iteration 1462/3560 Training loss: 1.7935 0.0632 sec/batch\n", + "Epoch 9/20 Iteration 1463/3560 Training loss: 1.7921 0.0609 sec/batch\n", + "Epoch 9/20 Iteration 1464/3560 Training loss: 1.7914 0.0627 sec/batch\n", + "Epoch 9/20 Iteration 1465/3560 Training loss: 1.7907 0.0682 sec/batch\n", + "Epoch 9/20 Iteration 1466/3560 Training loss: 1.7910 0.0675 sec/batch\n", + "Epoch 9/20 Iteration 1467/3560 Training loss: 1.7903 
0.0629 sec/batch\n", + "Epoch 9/20 Iteration 1468/3560 Training loss: 1.7894 0.0598 sec/batch\n", + "Epoch 9/20 Iteration 1469/3560 Training loss: 1.7896 0.0649 sec/batch\n", + "Epoch 9/20 Iteration 1470/3560 Training loss: 1.7883 0.0649 sec/batch\n", + "Epoch 9/20 Iteration 1471/3560 Training loss: 1.7881 0.0599 sec/batch\n", + "Epoch 9/20 Iteration 1472/3560 Training loss: 1.7875 0.0612 sec/batch\n", + "Epoch 9/20 Iteration 1473/3560 Training loss: 1.7872 0.0647 sec/batch\n", + "Epoch 9/20 Iteration 1474/3560 Training loss: 1.7877 0.0628 sec/batch\n", + "Epoch 9/20 Iteration 1475/3560 Training loss: 1.7872 0.0618 sec/batch\n", + "Epoch 9/20 Iteration 1476/3560 Training loss: 1.7880 0.0595 sec/batch\n", + "Epoch 9/20 Iteration 1477/3560 Training loss: 1.7879 0.0669 sec/batch\n", + "Epoch 9/20 Iteration 1478/3560 Training loss: 1.7878 0.0760 sec/batch\n", + "Epoch 9/20 Iteration 1479/3560 Training loss: 1.7874 0.0610 sec/batch\n", + "Epoch 9/20 Iteration 1480/3560 Training loss: 1.7876 0.0599 sec/batch\n", + "Epoch 9/20 Iteration 1481/3560 Training loss: 1.7877 0.0656 sec/batch\n", + "Epoch 9/20 Iteration 1482/3560 Training loss: 1.7875 0.0633 sec/batch\n", + "Epoch 9/20 Iteration 1483/3560 Training loss: 1.7867 0.0596 sec/batch\n", + "Epoch 9/20 Iteration 1484/3560 Training loss: 1.7873 0.0603 sec/batch\n", + "Epoch 9/20 Iteration 1485/3560 Training loss: 1.7869 0.0648 sec/batch\n", + "Epoch 9/20 Iteration 1486/3560 Training loss: 1.7875 0.0611 sec/batch\n", + "Epoch 9/20 Iteration 1487/3560 Training loss: 1.7877 0.0602 sec/batch\n", + "Epoch 9/20 Iteration 1488/3560 Training loss: 1.7879 0.0601 sec/batch\n", + "Epoch 9/20 Iteration 1489/3560 Training loss: 1.7876 0.0757 sec/batch\n", + "Epoch 9/20 Iteration 1490/3560 Training loss: 1.7878 0.0649 sec/batch\n", + "Epoch 9/20 Iteration 1491/3560 Training loss: 1.7879 0.0610 sec/batch\n", + "Epoch 9/20 Iteration 1492/3560 Training loss: 1.7874 0.0602 sec/batch\n", + "Epoch 9/20 Iteration 1493/3560 Training loss: 
1.7870 0.0681 sec/batch\n", + "Epoch 9/20 Iteration 1494/3560 Training loss: 1.7868 0.0660 sec/batch\n", + "Epoch 9/20 Iteration 1495/3560 Training loss: 1.7875 0.0667 sec/batch\n", + "Epoch 9/20 Iteration 1496/3560 Training loss: 1.7876 0.0595 sec/batch\n", + "Epoch 9/20 Iteration 1497/3560 Training loss: 1.7881 0.0586 sec/batch\n", + "Epoch 9/20 Iteration 1498/3560 Training loss: 1.7877 0.0650 sec/batch\n", + "Epoch 9/20 Iteration 1499/3560 Training loss: 1.7876 0.0625 sec/batch\n", + "Epoch 9/20 Iteration 1500/3560 Training loss: 1.7879 0.0600 sec/batch\n", + "Epoch 9/20 Iteration 1501/3560 Training loss: 1.7877 0.0679 sec/batch\n", + "Epoch 9/20 Iteration 1502/3560 Training loss: 1.7877 0.0623 sec/batch\n", + "Epoch 9/20 Iteration 1503/3560 Training loss: 1.7870 0.0608 sec/batch\n", + "Epoch 9/20 Iteration 1504/3560 Training loss: 1.7868 0.0626 sec/batch\n", + "Epoch 9/20 Iteration 1505/3560 Training loss: 1.7862 0.0609 sec/batch\n", + "Epoch 9/20 Iteration 1506/3560 Training loss: 1.7863 0.0634 sec/batch\n", + "Epoch 9/20 Iteration 1507/3560 Training loss: 1.7858 0.0605 sec/batch\n", + "Epoch 9/20 Iteration 1508/3560 Training loss: 1.7857 0.0625 sec/batch\n", + "Epoch 9/20 Iteration 1509/3560 Training loss: 1.7850 0.0615 sec/batch\n", + "Epoch 9/20 Iteration 1510/3560 Training loss: 1.7846 0.0679 sec/batch\n", + "Epoch 9/20 Iteration 1511/3560 Training loss: 1.7843 0.0633 sec/batch\n", + "Epoch 9/20 Iteration 1512/3560 Training loss: 1.7840 0.0618 sec/batch\n", + "Epoch 9/20 Iteration 1513/3560 Training loss: 1.7834 0.0708 sec/batch\n", + "Epoch 9/20 Iteration 1514/3560 Training loss: 1.7834 0.0684 sec/batch\n", + "Epoch 9/20 Iteration 1515/3560 Training loss: 1.7830 0.0695 sec/batch\n", + "Epoch 9/20 Iteration 1516/3560 Training loss: 1.7827 0.0601 sec/batch\n", + "Epoch 9/20 Iteration 1517/3560 Training loss: 1.7821 0.0617 sec/batch\n", + "Epoch 9/20 Iteration 1518/3560 Training loss: 1.7818 0.0629 sec/batch\n", + "Epoch 9/20 Iteration 1519/3560 Training 
loss: 1.7813 0.0597 sec/batch\n", + "Epoch 9/20 Iteration 1520/3560 Training loss: 1.7810 0.0613 sec/batch\n", + "Epoch 9/20 Iteration 1521/3560 Training loss: 1.7808 0.0715 sec/batch\n", + "Epoch 9/20 Iteration 1522/3560 Training loss: 1.7802 0.0630 sec/batch\n", + "Epoch 9/20 Iteration 1523/3560 Training loss: 1.7798 0.0594 sec/batch\n", + "Epoch 9/20 Iteration 1524/3560 Training loss: 1.7791 0.0597 sec/batch\n", + "Epoch 9/20 Iteration 1525/3560 Training loss: 1.7790 0.0750 sec/batch\n", + "Epoch 9/20 Iteration 1526/3560 Training loss: 1.7788 0.0598 sec/batch\n", + "Epoch 9/20 Iteration 1527/3560 Training loss: 1.7786 0.0633 sec/batch\n", + "Epoch 9/20 Iteration 1528/3560 Training loss: 1.7783 0.0592 sec/batch\n", + "Epoch 9/20 Iteration 1529/3560 Training loss: 1.7779 0.0605 sec/batch\n", + "Epoch 9/20 Iteration 1530/3560 Training loss: 1.7777 0.0670 sec/batch\n", + "Epoch 9/20 Iteration 1531/3560 Training loss: 1.7775 0.0599 sec/batch\n", + "Epoch 9/20 Iteration 1532/3560 Training loss: 1.7773 0.0579 sec/batch\n", + "Epoch 9/20 Iteration 1533/3560 Training loss: 1.7772 0.0667 sec/batch\n", + "Epoch 9/20 Iteration 1534/3560 Training loss: 1.7771 0.0623 sec/batch\n", + "Epoch 9/20 Iteration 1535/3560 Training loss: 1.7769 0.0614 sec/batch\n", + "Epoch 9/20 Iteration 1536/3560 Training loss: 1.7767 0.0607 sec/batch\n", + "Epoch 9/20 Iteration 1537/3560 Training loss: 1.7766 0.0612 sec/batch\n", + "Epoch 9/20 Iteration 1538/3560 Training loss: 1.7763 0.0641 sec/batch\n", + "Epoch 9/20 Iteration 1539/3560 Training loss: 1.7759 0.0644 sec/batch\n", + "Epoch 9/20 Iteration 1540/3560 Training loss: 1.7754 0.0618 sec/batch\n", + "Epoch 9/20 Iteration 1541/3560 Training loss: 1.7753 0.0657 sec/batch\n", + "Epoch 9/20 Iteration 1542/3560 Training loss: 1.7750 0.0600 sec/batch\n", + "Epoch 9/20 Iteration 1543/3560 Training loss: 1.7748 0.0597 sec/batch\n", + "Epoch 9/20 Iteration 1544/3560 Training loss: 1.7747 0.0595 sec/batch\n", + "Epoch 9/20 Iteration 1545/3560 
Training loss: 1.7747 0.0698 sec/batch\n", + "Epoch 9/20 Iteration 1546/3560 Training loss: 1.7742 0.0637 sec/batch\n", + "Epoch 9/20 Iteration 1547/3560 Training loss: 1.7738 0.0590 sec/batch\n", + "Epoch 9/20 Iteration 1548/3560 Training loss: 1.7738 0.0611 sec/batch\n", + "Epoch 9/20 Iteration 1549/3560 Training loss: 1.7736 0.0638 sec/batch\n", + "Epoch 9/20 Iteration 1550/3560 Training loss: 1.7731 0.0625 sec/batch\n", + "Epoch 9/20 Iteration 1551/3560 Training loss: 1.7731 0.0602 sec/batch\n", + "Epoch 9/20 Iteration 1552/3560 Training loss: 1.7731 0.0597 sec/batch\n", + "Epoch 9/20 Iteration 1553/3560 Training loss: 1.7729 0.0622 sec/batch\n", + "Epoch 9/20 Iteration 1554/3560 Training loss: 1.7727 0.0676 sec/batch\n", + "Epoch 9/20 Iteration 1555/3560 Training loss: 1.7724 0.0619 sec/batch\n", + "Epoch 9/20 Iteration 1556/3560 Training loss: 1.7719 0.0637 sec/batch\n", + "Epoch 9/20 Iteration 1557/3560 Training loss: 1.7719 0.0738 sec/batch\n", + "Epoch 9/20 Iteration 1558/3560 Training loss: 1.7718 0.0616 sec/batch\n", + "Epoch 9/20 Iteration 1559/3560 Training loss: 1.7718 0.0600 sec/batch\n", + "Epoch 9/20 Iteration 1560/3560 Training loss: 1.7718 0.0605 sec/batch\n", + "Epoch 9/20 Iteration 1561/3560 Training loss: 1.7718 0.0620 sec/batch\n", + "Epoch 9/20 Iteration 1562/3560 Training loss: 1.7717 0.0686 sec/batch\n", + "Epoch 9/20 Iteration 1563/3560 Training loss: 1.7718 0.0626 sec/batch\n", + "Epoch 9/20 Iteration 1564/3560 Training loss: 1.7717 0.0601 sec/batch\n", + "Epoch 9/20 Iteration 1565/3560 Training loss: 1.7718 0.0661 sec/batch\n", + "Epoch 9/20 Iteration 1566/3560 Training loss: 1.7717 0.0623 sec/batch\n", + "Epoch 9/20 Iteration 1567/3560 Training loss: 1.7716 0.0601 sec/batch\n", + "Epoch 9/20 Iteration 1568/3560 Training loss: 1.7716 0.0600 sec/batch\n", + "Epoch 9/20 Iteration 1569/3560 Training loss: 1.7715 0.0837 sec/batch\n", + "Epoch 9/20 Iteration 1570/3560 Training loss: 1.7716 0.0631 sec/batch\n", + "Epoch 9/20 Iteration 
1571/3560 Training loss: 1.7717 0.0617 sec/batch\n", + "Epoch 9/20 Iteration 1572/3560 Training loss: 1.7719 0.0615 sec/batch\n", + "Epoch 9/20 Iteration 1573/3560 Training loss: 1.7718 0.0687 sec/batch\n", + "Epoch 9/20 Iteration 1574/3560 Training loss: 1.7717 0.0655 sec/batch\n", + "Epoch 9/20 Iteration 1575/3560 Training loss: 1.7714 0.0620 sec/batch\n", + "Epoch 9/20 Iteration 1576/3560 Training loss: 1.7715 0.0614 sec/batch\n", + "Epoch 9/20 Iteration 1577/3560 Training loss: 1.7715 0.0664 sec/batch\n", + "Epoch 9/20 Iteration 1578/3560 Training loss: 1.7715 0.0589 sec/batch\n", + "Epoch 9/20 Iteration 1579/3560 Training loss: 1.7714 0.0587 sec/batch\n", + "Epoch 9/20 Iteration 1580/3560 Training loss: 1.7714 0.0595 sec/batch\n", + "Epoch 9/20 Iteration 1581/3560 Training loss: 1.7713 0.0597 sec/batch\n", + "Epoch 9/20 Iteration 1582/3560 Training loss: 1.7712 0.0709 sec/batch\n", + "Epoch 9/20 Iteration 1583/3560 Training loss: 1.7709 0.0652 sec/batch\n", + "Epoch 9/20 Iteration 1584/3560 Training loss: 1.7710 0.0628 sec/batch\n", + "Epoch 9/20 Iteration 1585/3560 Training loss: 1.7712 0.0630 sec/batch\n", + "Epoch 9/20 Iteration 1586/3560 Training loss: 1.7711 0.0623 sec/batch\n", + "Epoch 9/20 Iteration 1587/3560 Training loss: 1.7711 0.0601 sec/batch\n", + "Epoch 9/20 Iteration 1588/3560 Training loss: 1.7711 0.0671 sec/batch\n", + "Epoch 9/20 Iteration 1589/3560 Training loss: 1.7711 0.0698 sec/batch\n", + "Epoch 9/20 Iteration 1590/3560 Training loss: 1.7710 0.0606 sec/batch\n", + "Epoch 9/20 Iteration 1591/3560 Training loss: 1.7711 0.0624 sec/batch\n", + "Epoch 9/20 Iteration 1592/3560 Training loss: 1.7714 0.0686 sec/batch\n", + "Epoch 9/20 Iteration 1593/3560 Training loss: 1.7713 0.0691 sec/batch\n", + "Epoch 9/20 Iteration 1594/3560 Training loss: 1.7712 0.0593 sec/batch\n", + "Epoch 9/20 Iteration 1595/3560 Training loss: 1.7711 0.0602 sec/batch\n", + "Epoch 9/20 Iteration 1596/3560 Training loss: 1.7710 0.0644 sec/batch\n", + "Epoch 9/20 
Iteration 1597/3560 Training loss: 1.7711 0.0663 sec/batch\n", + "Epoch 9/20 Iteration 1598/3560 Training loss: 1.7710 0.0698 sec/batch\n", + "Epoch 9/20 Iteration 1599/3560 Training loss: 1.7710 0.0600 sec/batch\n", + "Epoch 9/20 Iteration 1600/3560 Training loss: 1.7708 0.0601 sec/batch\n", + "Epoch 9/20 Iteration 1601/3560 Training loss: 1.7706 0.0692 sec/batch\n", + "Epoch 9/20 Iteration 1602/3560 Training loss: 1.7706 0.0591 sec/batch\n", + "Epoch 10/20 Iteration 1603/3560 Training loss: 1.8731 0.0626 sec/batch\n", + "Epoch 10/20 Iteration 1604/3560 Training loss: 1.8125 0.0619 sec/batch\n", + "Epoch 10/20 Iteration 1605/3560 Training loss: 1.7915 0.0649 sec/batch\n", + "Epoch 10/20 Iteration 1606/3560 Training loss: 1.7805 0.0627 sec/batch\n", + "Epoch 10/20 Iteration 1607/3560 Training loss: 1.7743 0.0594 sec/batch\n", + "Epoch 10/20 Iteration 1608/3560 Training loss: 1.7638 0.0594 sec/batch\n", + "Epoch 10/20 Iteration 1609/3560 Training loss: 1.7624 0.0628 sec/batch\n", + "Epoch 10/20 Iteration 1610/3560 Training loss: 1.7599 0.0665 sec/batch\n", + "Epoch 10/20 Iteration 1611/3560 Training loss: 1.7606 0.0605 sec/batch\n", + "Epoch 10/20 Iteration 1612/3560 Training loss: 1.7600 0.0620 sec/batch\n", + "Epoch 10/20 Iteration 1613/3560 Training loss: 1.7556 0.0636 sec/batch\n", + "Epoch 10/20 Iteration 1614/3560 Training loss: 1.7523 0.0598 sec/batch\n", + "Epoch 10/20 Iteration 1615/3560 Training loss: 1.7521 0.0598 sec/batch\n", + "Epoch 10/20 Iteration 1616/3560 Training loss: 1.7537 0.0652 sec/batch\n", + "Epoch 10/20 Iteration 1617/3560 Training loss: 1.7519 0.0718 sec/batch\n", + "Epoch 10/20 Iteration 1618/3560 Training loss: 1.7497 0.0595 sec/batch\n", + "Epoch 10/20 Iteration 1619/3560 Training loss: 1.7491 0.0634 sec/batch\n", + "Epoch 10/20 Iteration 1620/3560 Training loss: 1.7508 0.0655 sec/batch\n", + "Epoch 10/20 Iteration 1621/3560 Training loss: 1.7512 0.0617 sec/batch\n", + "Epoch 10/20 Iteration 1622/3560 Training loss: 1.7516 0.0706 
sec/batch\n", + "Epoch 10/20 Iteration 1623/3560 Training loss: 1.7501 0.0646 sec/batch\n", + "Epoch 10/20 Iteration 1624/3560 Training loss: 1.7503 0.0604 sec/batch\n", + "Epoch 10/20 Iteration 1625/3560 Training loss: 1.7496 0.0647 sec/batch\n", + "Epoch 10/20 Iteration 1626/3560 Training loss: 1.7491 0.0592 sec/batch\n", + "Epoch 10/20 Iteration 1627/3560 Training loss: 1.7490 0.0603 sec/batch\n", + "Epoch 10/20 Iteration 1628/3560 Training loss: 1.7479 0.0628 sec/batch\n", + "Epoch 10/20 Iteration 1629/3560 Training loss: 1.7467 0.0607 sec/batch\n", + "Epoch 10/20 Iteration 1630/3560 Training loss: 1.7469 0.0647 sec/batch\n", + "Epoch 10/20 Iteration 1631/3560 Training loss: 1.7474 0.0598 sec/batch\n", + "Epoch 10/20 Iteration 1632/3560 Training loss: 1.7477 0.0605 sec/batch\n", + "Epoch 10/20 Iteration 1633/3560 Training loss: 1.7475 0.0691 sec/batch\n", + "Epoch 10/20 Iteration 1634/3560 Training loss: 1.7465 0.0599 sec/batch\n", + "Epoch 10/20 Iteration 1635/3560 Training loss: 1.7462 0.0659 sec/batch\n", + "Epoch 10/20 Iteration 1636/3560 Training loss: 1.7469 0.0639 sec/batch\n", + "Epoch 10/20 Iteration 1637/3560 Training loss: 1.7467 0.0692 sec/batch\n", + "Epoch 10/20 Iteration 1638/3560 Training loss: 1.7461 0.0638 sec/batch\n", + "Epoch 10/20 Iteration 1639/3560 Training loss: 1.7455 0.0594 sec/batch\n", + "Epoch 10/20 Iteration 1640/3560 Training loss: 1.7440 0.0627 sec/batch\n", + "Epoch 10/20 Iteration 1641/3560 Training loss: 1.7426 0.0630 sec/batch\n", + "Epoch 10/20 Iteration 1642/3560 Training loss: 1.7419 0.0626 sec/batch\n", + "Epoch 10/20 Iteration 1643/3560 Training loss: 1.7414 0.0611 sec/batch\n", + "Epoch 10/20 Iteration 1644/3560 Training loss: 1.7418 0.0622 sec/batch\n", + "Epoch 10/20 Iteration 1645/3560 Training loss: 1.7412 0.0661 sec/batch\n", + "Epoch 10/20 Iteration 1646/3560 Training loss: 1.7402 0.0664 sec/batch\n", + "Epoch 10/20 Iteration 1647/3560 Training loss: 1.7405 0.0596 sec/batch\n", + "Epoch 10/20 Iteration 1648/3560 
Training loss: 1.7395 0.0634 sec/batch\n", + "Epoch 10/20 Iteration 1649/3560 Training loss: 1.7392 0.0674 sec/batch\n", + "Epoch 10/20 Iteration 1650/3560 Training loss: 1.7387 0.0637 sec/batch\n", + "Epoch 10/20 Iteration 1651/3560 Training loss: 1.7384 0.0599 sec/batch\n", + "Epoch 10/20 Iteration 1652/3560 Training loss: 1.7391 0.0599 sec/batch\n", + "Epoch 10/20 Iteration 1653/3560 Training loss: 1.7385 0.0665 sec/batch\n", + "Epoch 10/20 Iteration 1654/3560 Training loss: 1.7392 0.0645 sec/batch\n", + "Epoch 10/20 Iteration 1655/3560 Training loss: 1.7391 0.0613 sec/batch\n", + "Epoch 10/20 Iteration 1656/3560 Training loss: 1.7389 0.0630 sec/batch\n", + "Epoch 10/20 Iteration 1657/3560 Training loss: 1.7386 0.0629 sec/batch\n", + "Epoch 10/20 Iteration 1658/3560 Training loss: 1.7387 0.0698 sec/batch\n", + "Epoch 10/20 Iteration 1659/3560 Training loss: 1.7389 0.0608 sec/batch\n", + "Epoch 10/20 Iteration 1660/3560 Training loss: 1.7385 0.0590 sec/batch\n", + "Epoch 10/20 Iteration 1661/3560 Training loss: 1.7379 0.0643 sec/batch\n", + "Epoch 10/20 Iteration 1662/3560 Training loss: 1.7385 0.0600 sec/batch\n", + "Epoch 10/20 Iteration 1663/3560 Training loss: 1.7382 0.0601 sec/batch\n", + "Epoch 10/20 Iteration 1664/3560 Training loss: 1.7390 0.0613 sec/batch\n", + "Epoch 10/20 Iteration 1665/3560 Training loss: 1.7394 0.0628 sec/batch\n", + "Epoch 10/20 Iteration 1666/3560 Training loss: 1.7396 0.0661 sec/batch\n", + "Epoch 10/20 Iteration 1667/3560 Training loss: 1.7392 0.0587 sec/batch\n", + "Epoch 10/20 Iteration 1668/3560 Training loss: 1.7394 0.0601 sec/batch\n", + "Epoch 10/20 Iteration 1669/3560 Training loss: 1.7394 0.0600 sec/batch\n", + "Epoch 10/20 Iteration 1670/3560 Training loss: 1.7389 0.0629 sec/batch\n", + "Epoch 10/20 Iteration 1671/3560 Training loss: 1.7385 0.0612 sec/batch\n", + "Epoch 10/20 Iteration 1672/3560 Training loss: 1.7384 0.0586 sec/batch\n", + "Epoch 10/20 Iteration 1673/3560 Training loss: 1.7389 0.0614 sec/batch\n", + 
"Epoch 10/20 Iteration 1674/3560 Training loss: 1.7390 0.0632 sec/batch\n", + "Epoch 10/20 Iteration 1675/3560 Training loss: 1.7394 0.0599 sec/batch\n", + "Epoch 10/20 Iteration 1676/3560 Training loss: 1.7389 0.0603 sec/batch\n", + "Epoch 10/20 Iteration 1677/3560 Training loss: 1.7388 0.0600 sec/batch\n", + "Epoch 10/20 Iteration 1678/3560 Training loss: 1.7392 0.0675 sec/batch\n", + "Epoch 10/20 Iteration 1679/3560 Training loss: 1.7390 0.0597 sec/batch\n", + "Epoch 10/20 Iteration 1680/3560 Training loss: 1.7389 0.0593 sec/batch\n", + "Epoch 10/20 Iteration 1681/3560 Training loss: 1.7381 0.0628 sec/batch\n", + "Epoch 10/20 Iteration 1682/3560 Training loss: 1.7379 0.0629 sec/batch\n", + "Epoch 10/20 Iteration 1683/3560 Training loss: 1.7373 0.0660 sec/batch\n", + "Epoch 10/20 Iteration 1684/3560 Training loss: 1.7374 0.0608 sec/batch\n", + "Epoch 10/20 Iteration 1685/3560 Training loss: 1.7369 0.0629 sec/batch\n", + "Epoch 10/20 Iteration 1686/3560 Training loss: 1.7368 0.0679 sec/batch\n", + "Epoch 10/20 Iteration 1687/3560 Training loss: 1.7363 0.0626 sec/batch\n", + "Epoch 10/20 Iteration 1688/3560 Training loss: 1.7359 0.0605 sec/batch\n", + "Epoch 10/20 Iteration 1689/3560 Training loss: 1.7356 0.0635 sec/batch\n", + "Epoch 10/20 Iteration 1690/3560 Training loss: 1.7351 0.0615 sec/batch\n", + "Epoch 10/20 Iteration 1691/3560 Training loss: 1.7346 0.0640 sec/batch\n", + "Epoch 10/20 Iteration 1692/3560 Training loss: 1.7347 0.0617 sec/batch\n", + "Epoch 10/20 Iteration 1693/3560 Training loss: 1.7343 0.0680 sec/batch\n", + "Epoch 10/20 Iteration 1694/3560 Training loss: 1.7340 0.0610 sec/batch\n", + "Epoch 10/20 Iteration 1695/3560 Training loss: 1.7335 0.0698 sec/batch\n", + "Epoch 10/20 Iteration 1696/3560 Training loss: 1.7332 0.0600 sec/batch\n", + "Epoch 10/20 Iteration 1697/3560 Training loss: 1.7328 0.0616 sec/batch\n", + "Epoch 10/20 Iteration 1698/3560 Training loss: 1.7327 0.0634 sec/batch\n", + "Epoch 10/20 Iteration 1699/3560 Training loss: 
1.7325 0.0598 sec/batch\n", + "Epoch 10/20 Iteration 1700/3560 Training loss: 1.7321 0.0599 sec/batch\n", + "Epoch 10/20 Iteration 1701/3560 Training loss: 1.7316 0.0676 sec/batch\n", + "Epoch 10/20 Iteration 1702/3560 Training loss: 1.7310 0.0665 sec/batch\n", + "Epoch 10/20 Iteration 1703/3560 Training loss: 1.7310 0.0666 sec/batch\n", + "Epoch 10/20 Iteration 1704/3560 Training loss: 1.7308 0.0605 sec/batch\n", + "Epoch 10/20 Iteration 1705/3560 Training loss: 1.7305 0.0664 sec/batch\n", + "Epoch 10/20 Iteration 1706/3560 Training loss: 1.7304 0.0628 sec/batch\n", + "Epoch 10/20 Iteration 1707/3560 Training loss: 1.7301 0.0615 sec/batch\n", + "Epoch 10/20 Iteration 1708/3560 Training loss: 1.7300 0.0658 sec/batch\n", + "Epoch 10/20 Iteration 1709/3560 Training loss: 1.7299 0.0608 sec/batch\n", + "Epoch 10/20 Iteration 1710/3560 Training loss: 1.7298 0.0655 sec/batch\n", + "Epoch 10/20 Iteration 1711/3560 Training loss: 1.7297 0.0606 sec/batch\n", + "Epoch 10/20 Iteration 1712/3560 Training loss: 1.7296 0.0654 sec/batch\n", + "Epoch 10/20 Iteration 1713/3560 Training loss: 1.7295 0.0634 sec/batch\n", + "Epoch 10/20 Iteration 1714/3560 Training loss: 1.7294 0.0622 sec/batch\n", + "Epoch 10/20 Iteration 1715/3560 Training loss: 1.7292 0.0624 sec/batch\n", + "Epoch 10/20 Iteration 1716/3560 Training loss: 1.7291 0.0595 sec/batch\n", + "Epoch 10/20 Iteration 1717/3560 Training loss: 1.7288 0.0673 sec/batch\n", + "Epoch 10/20 Iteration 1718/3560 Training loss: 1.7284 0.0654 sec/batch\n", + "Epoch 10/20 Iteration 1719/3560 Training loss: 1.7283 0.0651 sec/batch\n", + "Epoch 10/20 Iteration 1720/3560 Training loss: 1.7282 0.0623 sec/batch\n", + "Epoch 10/20 Iteration 1721/3560 Training loss: 1.7281 0.0646 sec/batch\n", + "Epoch 10/20 Iteration 1722/3560 Training loss: 1.7280 0.0639 sec/batch\n", + "Epoch 10/20 Iteration 1723/3560 Training loss: 1.7279 0.0662 sec/batch\n", + "Epoch 10/20 Iteration 1724/3560 Training loss: 1.7275 0.0596 sec/batch\n", + "Epoch 10/20 
Iteration 1725/3560 Training loss: 1.7271 0.0678 sec/batch\n", + "Epoch 10/20 Iteration 1726/3560 Training loss: 1.7271 0.0613 sec/batch\n", + "Epoch 10/20 Iteration 1727/3560 Training loss: 1.7270 0.0619 sec/batch\n", + "Epoch 10/20 Iteration 1728/3560 Training loss: 1.7266 0.0604 sec/batch\n", + "Epoch 10/20 Iteration 1729/3560 Training loss: 1.7267 0.0732 sec/batch\n", + "Epoch 10/20 Iteration 1730/3560 Training loss: 1.7267 0.0621 sec/batch\n", + "Epoch 10/20 Iteration 1731/3560 Training loss: 1.7265 0.0659 sec/batch\n", + "Epoch 10/20 Iteration 1732/3560 Training loss: 1.7264 0.0624 sec/batch\n", + "Epoch 10/20 Iteration 1733/3560 Training loss: 1.7260 0.0666 sec/batch\n", + "Epoch 10/20 Iteration 1734/3560 Training loss: 1.7256 0.0595 sec/batch\n", + "Epoch 10/20 Iteration 1735/3560 Training loss: 1.7256 0.0593 sec/batch\n", + "Epoch 10/20 Iteration 1736/3560 Training loss: 1.7255 0.0585 sec/batch\n", + "Epoch 10/20 Iteration 1737/3560 Training loss: 1.7255 0.0641 sec/batch\n", + "Epoch 10/20 Iteration 1738/3560 Training loss: 1.7255 0.0650 sec/batch\n", + "Epoch 10/20 Iteration 1739/3560 Training loss: 1.7255 0.0617 sec/batch\n", + "Epoch 10/20 Iteration 1740/3560 Training loss: 1.7255 0.0602 sec/batch\n", + "Epoch 10/20 Iteration 1741/3560 Training loss: 1.7256 0.0645 sec/batch\n", + "Epoch 10/20 Iteration 1742/3560 Training loss: 1.7255 0.0654 sec/batch\n", + "Epoch 10/20 Iteration 1743/3560 Training loss: 1.7257 0.0647 sec/batch\n", + "Epoch 10/20 Iteration 1744/3560 Training loss: 1.7255 0.0653 sec/batch\n", + "Epoch 10/20 Iteration 1745/3560 Training loss: 1.7254 0.0594 sec/batch\n", + "Epoch 10/20 Iteration 1746/3560 Training loss: 1.7253 0.0781 sec/batch\n", + "Epoch 10/20 Iteration 1747/3560 Training loss: 1.7252 0.0617 sec/batch\n", + "Epoch 10/20 Iteration 1748/3560 Training loss: 1.7252 0.0616 sec/batch\n", + "Epoch 10/20 Iteration 1749/3560 Training loss: 1.7252 0.0655 sec/batch\n", + "Epoch 10/20 Iteration 1750/3560 Training loss: 1.7254 0.0601 
sec/batch\n", + "Epoch 10/20 Iteration 1751/3560 Training loss: 1.7253 0.0604 sec/batch\n", + "Epoch 10/20 Iteration 1752/3560 Training loss: 1.7252 0.0667 sec/batch\n", + "Epoch 10/20 Iteration 1753/3560 Training loss: 1.7249 0.0635 sec/batch\n", + "Epoch 10/20 Iteration 1754/3560 Training loss: 1.7249 0.0602 sec/batch\n", + "Epoch 10/20 Iteration 1755/3560 Training loss: 1.7248 0.0592 sec/batch\n", + "Epoch 10/20 Iteration 1756/3560 Training loss: 1.7249 0.0657 sec/batch\n", + "Epoch 10/20 Iteration 1757/3560 Training loss: 1.7248 0.0640 sec/batch\n", + "Epoch 10/20 Iteration 1758/3560 Training loss: 1.7247 0.0593 sec/batch\n", + "Epoch 10/20 Iteration 1759/3560 Training loss: 1.7246 0.0617 sec/batch\n", + "Epoch 10/20 Iteration 1760/3560 Training loss: 1.7245 0.0685 sec/batch\n", + "Epoch 10/20 Iteration 1761/3560 Training loss: 1.7243 0.0616 sec/batch\n", + "Epoch 10/20 Iteration 1762/3560 Training loss: 1.7244 0.0590 sec/batch\n", + "Epoch 10/20 Iteration 1763/3560 Training loss: 1.7245 0.0617 sec/batch\n", + "Epoch 10/20 Iteration 1764/3560 Training loss: 1.7243 0.0624 sec/batch\n", + "Epoch 10/20 Iteration 1765/3560 Training loss: 1.7244 0.0631 sec/batch\n", + "Epoch 10/20 Iteration 1766/3560 Training loss: 1.7244 0.0651 sec/batch\n", + "Epoch 10/20 Iteration 1767/3560 Training loss: 1.7242 0.0599 sec/batch\n", + "Epoch 10/20 Iteration 1768/3560 Training loss: 1.7241 0.0648 sec/batch\n", + "Epoch 10/20 Iteration 1769/3560 Training loss: 1.7242 0.0637 sec/batch\n", + "Epoch 10/20 Iteration 1770/3560 Training loss: 1.7245 0.0613 sec/batch\n", + "Epoch 10/20 Iteration 1771/3560 Training loss: 1.7243 0.0685 sec/batch\n", + "Epoch 10/20 Iteration 1772/3560 Training loss: 1.7242 0.0705 sec/batch\n", + "Epoch 10/20 Iteration 1773/3560 Training loss: 1.7241 0.0614 sec/batch\n", + "Epoch 10/20 Iteration 1774/3560 Training loss: 1.7238 0.0596 sec/batch\n", + "Epoch 10/20 Iteration 1775/3560 Training loss: 1.7239 0.0604 sec/batch\n", + "Epoch 10/20 Iteration 1776/3560 
Training loss: 1.7238 0.0639 sec/batch\n", + "Epoch 10/20 Iteration 1777/3560 Training loss: 1.7238 0.0654 sec/batch\n", + "Epoch 10/20 Iteration 1778/3560 Training loss: 1.7236 0.0659 sec/batch\n", + "Epoch 10/20 Iteration 1779/3560 Training loss: 1.7234 0.0650 sec/batch\n", + "Epoch 10/20 Iteration 1780/3560 Training loss: 1.7234 0.0664 sec/batch\n", + "Epoch 11/20 Iteration 1781/3560 Training loss: 1.8206 0.0593 sec/batch\n", + "Epoch 11/20 Iteration 1782/3560 Training loss: 1.7703 0.0600 sec/batch\n", + "Epoch 11/20 Iteration 1783/3560 Training loss: 1.7479 0.0601 sec/batch\n", + "Epoch 11/20 Iteration 1784/3560 Training loss: 1.7393 0.0694 sec/batch\n", + "Epoch 11/20 Iteration 1785/3560 Training loss: 1.7324 0.0650 sec/batch\n", + "Epoch 11/20 Iteration 1786/3560 Training loss: 1.7203 0.0620 sec/batch\n", + "Epoch 11/20 Iteration 1787/3560 Training loss: 1.7203 0.0589 sec/batch\n", + "Epoch 11/20 Iteration 1788/3560 Training loss: 1.7167 0.0647 sec/batch\n", + "Epoch 11/20 Iteration 1789/3560 Training loss: 1.7175 0.0662 sec/batch\n", + "Epoch 11/20 Iteration 1790/3560 Training loss: 1.7158 0.0655 sec/batch\n", + "Epoch 11/20 Iteration 1791/3560 Training loss: 1.7116 0.0602 sec/batch\n", + "Epoch 11/20 Iteration 1792/3560 Training loss: 1.7099 0.0745 sec/batch\n", + "Epoch 11/20 Iteration 1793/3560 Training loss: 1.7096 0.0649 sec/batch\n", + "Epoch 11/20 Iteration 1794/3560 Training loss: 1.7112 0.0610 sec/batch\n", + "Epoch 11/20 Iteration 1795/3560 Training loss: 1.7098 0.0609 sec/batch\n", + "Epoch 11/20 Iteration 1796/3560 Training loss: 1.7081 0.0637 sec/batch\n", + "Epoch 11/20 Iteration 1797/3560 Training loss: 1.7076 0.0622 sec/batch\n", + "Epoch 11/20 Iteration 1798/3560 Training loss: 1.7093 0.0624 sec/batch\n", + "Epoch 11/20 Iteration 1799/3560 Training loss: 1.7098 0.0608 sec/batch\n", + "Epoch 11/20 Iteration 1800/3560 Training loss: 1.7103 0.0689 sec/batch\n", + "Epoch 11/20 Iteration 1801/3560 Training loss: 1.7089 0.0669 sec/batch\n", + 
"Epoch 11/20 Iteration 1802/3560 Training loss: 1.7090 0.0590 sec/batch\n", + "Epoch 11/20 Iteration 1803/3560 Training loss: 1.7080 0.0598 sec/batch\n", + "Epoch 11/20 Iteration 1804/3560 Training loss: 1.7072 0.0643 sec/batch\n", + "Epoch 11/20 Iteration 1805/3560 Training loss: 1.7072 0.0621 sec/batch\n", + "Epoch 11/20 Iteration 1806/3560 Training loss: 1.7061 0.0595 sec/batch\n", + "Epoch 11/20 Iteration 1807/3560 Training loss: 1.7045 0.0597 sec/batch\n", + "Epoch 11/20 Iteration 1808/3560 Training loss: 1.7050 0.0713 sec/batch\n", + "Epoch 11/20 Iteration 1809/3560 Training loss: 1.7058 0.0649 sec/batch\n", + "Epoch 11/20 Iteration 1810/3560 Training loss: 1.7056 0.0597 sec/batch\n", + "Epoch 11/20 Iteration 1811/3560 Training loss: 1.7049 0.0603 sec/batch\n", + "Epoch 11/20 Iteration 1812/3560 Training loss: 1.7041 0.0622 sec/batch\n", + "Epoch 11/20 Iteration 1813/3560 Training loss: 1.7040 0.0620 sec/batch\n", + "Epoch 11/20 Iteration 1814/3560 Training loss: 1.7046 0.0645 sec/batch\n", + "Epoch 11/20 Iteration 1815/3560 Training loss: 1.7043 0.0617 sec/batch\n", + "Epoch 11/20 Iteration 1816/3560 Training loss: 1.7039 0.0696 sec/batch\n", + "Epoch 11/20 Iteration 1817/3560 Training loss: 1.7034 0.0670 sec/batch\n", + "Epoch 11/20 Iteration 1818/3560 Training loss: 1.7019 0.0604 sec/batch\n", + "Epoch 11/20 Iteration 1819/3560 Training loss: 1.7006 0.0603 sec/batch\n", + "Epoch 11/20 Iteration 1820/3560 Training loss: 1.7001 0.0682 sec/batch\n", + "Epoch 11/20 Iteration 1821/3560 Training loss: 1.6996 0.0682 sec/batch\n", + "Epoch 11/20 Iteration 1822/3560 Training loss: 1.7000 0.0608 sec/batch\n", + "Epoch 11/20 Iteration 1823/3560 Training loss: 1.6994 0.0659 sec/batch\n", + "Epoch 11/20 Iteration 1824/3560 Training loss: 1.6984 0.0788 sec/batch\n", + "Epoch 11/20 Iteration 1825/3560 Training loss: 1.6985 0.0659 sec/batch\n", + "Epoch 11/20 Iteration 1826/3560 Training loss: 1.6974 0.0664 sec/batch\n", + "Epoch 11/20 Iteration 1827/3560 Training loss: 
1.6969 0.0611 sec/batch\n", + "Epoch 11/20 Iteration 1828/3560 Training loss: 1.6964 0.0659 sec/batch\n", + "Epoch 11/20 Iteration 1829/3560 Training loss: 1.6962 0.0633 sec/batch\n", + "Epoch 11/20 Iteration 1830/3560 Training loss: 1.6969 0.0604 sec/batch\n", + "Epoch 11/20 Iteration 1831/3560 Training loss: 1.6961 0.0614 sec/batch\n", + "Epoch 11/20 Iteration 1832/3560 Training loss: 1.6969 0.0686 sec/batch\n", + "Epoch 11/20 Iteration 1833/3560 Training loss: 1.6968 0.0652 sec/batch\n", + "Epoch 11/20 Iteration 1834/3560 Training loss: 1.6967 0.0608 sec/batch\n", + "Epoch 11/20 Iteration 1835/3560 Training loss: 1.6964 0.0605 sec/batch\n", + "Epoch 11/20 Iteration 1836/3560 Training loss: 1.6965 0.0729 sec/batch\n", + "Epoch 11/20 Iteration 1837/3560 Training loss: 1.6968 0.0632 sec/batch\n", + "Epoch 11/20 Iteration 1838/3560 Training loss: 1.6965 0.0613 sec/batch\n", + "Epoch 11/20 Iteration 1839/3560 Training loss: 1.6959 0.0628 sec/batch\n", + "Epoch 11/20 Iteration 1840/3560 Training loss: 1.6965 0.0663 sec/batch\n", + "Epoch 11/20 Iteration 1841/3560 Training loss: 1.6963 0.0654 sec/batch\n", + "Epoch 11/20 Iteration 1842/3560 Training loss: 1.6971 0.0601 sec/batch\n", + "Epoch 11/20 Iteration 1843/3560 Training loss: 1.6975 0.0624 sec/batch\n", + "Epoch 11/20 Iteration 1844/3560 Training loss: 1.6977 0.0629 sec/batch\n", + "Epoch 11/20 Iteration 1845/3560 Training loss: 1.6972 0.0626 sec/batch\n", + "Epoch 11/20 Iteration 1846/3560 Training loss: 1.6973 0.0629 sec/batch\n", + "Epoch 11/20 Iteration 1847/3560 Training loss: 1.6974 0.0591 sec/batch\n", + "Epoch 11/20 Iteration 1848/3560 Training loss: 1.6968 0.0623 sec/batch\n", + "Epoch 11/20 Iteration 1849/3560 Training loss: 1.6968 0.0624 sec/batch\n", + "Epoch 11/20 Iteration 1850/3560 Training loss: 1.6967 0.0654 sec/batch\n", + "Epoch 11/20 Iteration 1851/3560 Training loss: 1.6973 0.0597 sec/batch\n", + "Epoch 11/20 Iteration 1852/3560 Training loss: 1.6974 0.0660 sec/batch\n", + "Epoch 11/20 
Iteration 1853/3560 Training loss: 1.6977 0.0650 sec/batch\n", + "Epoch 11/20 Iteration 1854/3560 Training loss: 1.6973 0.0591 sec/batch\n", + "Epoch 11/20 Iteration 1855/3560 Training loss: 1.6970 0.0689 sec/batch\n", + "Epoch 11/20 Iteration 1856/3560 Training loss: 1.6974 0.0701 sec/batch\n", + "Epoch 11/20 Iteration 1857/3560 Training loss: 1.6972 0.0613 sec/batch\n", + "Epoch 11/20 Iteration 1858/3560 Training loss: 1.6971 0.0597 sec/batch\n", + "Epoch 11/20 Iteration 1859/3560 Training loss: 1.6964 0.0620 sec/batch\n", + "Epoch 11/20 Iteration 1860/3560 Training loss: 1.6963 0.0620 sec/batch\n", + "Epoch 11/20 Iteration 1861/3560 Training loss: 1.6958 0.0674 sec/batch\n", + "Epoch 11/20 Iteration 1862/3560 Training loss: 1.6959 0.0606 sec/batch\n", + "Epoch 11/20 Iteration 1863/3560 Training loss: 1.6954 0.0668 sec/batch\n", + "Epoch 11/20 Iteration 1864/3560 Training loss: 1.6954 0.0711 sec/batch\n", + "Epoch 11/20 Iteration 1865/3560 Training loss: 1.6949 0.0603 sec/batch\n", + "Epoch 11/20 Iteration 1866/3560 Training loss: 1.6947 0.0606 sec/batch\n", + "Epoch 11/20 Iteration 1867/3560 Training loss: 1.6945 0.0608 sec/batch\n", + "Epoch 11/20 Iteration 1868/3560 Training loss: 1.6940 0.0774 sec/batch\n", + "Epoch 11/20 Iteration 1869/3560 Training loss: 1.6935 0.0599 sec/batch\n", + "Epoch 11/20 Iteration 1870/3560 Training loss: 1.6936 0.0621 sec/batch\n", + "Epoch 11/20 Iteration 1871/3560 Training loss: 1.6932 0.0614 sec/batch\n", + "Epoch 11/20 Iteration 1872/3560 Training loss: 1.6930 0.0638 sec/batch\n", + "Epoch 11/20 Iteration 1873/3560 Training loss: 1.6925 0.0612 sec/batch\n", + "Epoch 11/20 Iteration 1874/3560 Training loss: 1.6922 0.0618 sec/batch\n", + "Epoch 11/20 Iteration 1875/3560 Training loss: 1.6918 0.0610 sec/batch\n", + "Epoch 11/20 Iteration 1876/3560 Training loss: 1.6916 0.0694 sec/batch\n", + "Epoch 11/20 Iteration 1877/3560 Training loss: 1.6915 0.0626 sec/batch\n", + "Epoch 11/20 Iteration 1878/3560 Training loss: 1.6911 0.0657 
sec/batch\n", + "Epoch 11/20 Iteration 1879/3560 Training loss: 1.6908 0.0607 sec/batch\n", + "Epoch 11/20 Iteration 1880/3560 Training loss: 1.6902 0.0731 sec/batch\n", + "Epoch 11/20 Iteration 1881/3560 Training loss: 1.6902 0.0595 sec/batch\n", + "Epoch 11/20 Iteration 1882/3560 Training loss: 1.6900 0.0606 sec/batch\n", + "Epoch 11/20 Iteration 1883/3560 Training loss: 1.6897 0.0601 sec/batch\n", + "Epoch 11/20 Iteration 1884/3560 Training loss: 1.6894 0.0689 sec/batch\n", + "Epoch 11/20 Iteration 1885/3560 Training loss: 1.6891 0.0645 sec/batch\n", + "Epoch 11/20 Iteration 1886/3560 Training loss: 1.6890 0.0658 sec/batch\n", + "Epoch 11/20 Iteration 1887/3560 Training loss: 1.6888 0.0597 sec/batch\n", + "Epoch 11/20 Iteration 1888/3560 Training loss: 1.6886 0.0611 sec/batch\n", + "Epoch 11/20 Iteration 1889/3560 Training loss: 1.6884 0.0677 sec/batch\n", + "Epoch 11/20 Iteration 1890/3560 Training loss: 1.6884 0.0601 sec/batch\n", + "Epoch 11/20 Iteration 1891/3560 Training loss: 1.6882 0.0649 sec/batch\n", + "Epoch 11/20 Iteration 1892/3560 Training loss: 1.6881 0.0646 sec/batch\n", + "Epoch 11/20 Iteration 1893/3560 Training loss: 1.6879 0.0651 sec/batch\n", + "Epoch 11/20 Iteration 1894/3560 Training loss: 1.6876 0.0609 sec/batch\n", + "Epoch 11/20 Iteration 1895/3560 Training loss: 1.6873 0.0597 sec/batch\n", + "Epoch 11/20 Iteration 1896/3560 Training loss: 1.6869 0.0612 sec/batch\n", + "Epoch 11/20 Iteration 1897/3560 Training loss: 1.6867 0.0616 sec/batch\n", + "Epoch 11/20 Iteration 1898/3560 Training loss: 1.6866 0.0592 sec/batch\n", + "Epoch 11/20 Iteration 1899/3560 Training loss: 1.6864 0.0598 sec/batch\n", + "Epoch 11/20 Iteration 1900/3560 Training loss: 1.6863 0.0672 sec/batch\n", + "Epoch 11/20 Iteration 1901/3560 Training loss: 1.6863 0.0627 sec/batch\n", + "Epoch 11/20 Iteration 1902/3560 Training loss: 1.6858 0.0594 sec/batch\n", + "Epoch 11/20 Iteration 1903/3560 Training loss: 1.6854 0.0639 sec/batch\n", + "Epoch 11/20 Iteration 1904/3560 
Training loss: 1.6854 0.0663 sec/batch\n", + "Epoch 11/20 Iteration 1905/3560 Training loss: 1.6855 0.0625 sec/batch\n", + "Epoch 11/20 Iteration 1906/3560 Training loss: 1.6850 0.0602 sec/batch\n", + "Epoch 11/20 Iteration 1907/3560 Training loss: 1.6851 0.0698 sec/batch\n", + "Epoch 11/20 Iteration 1908/3560 Training loss: 1.6851 0.0595 sec/batch\n", + "Epoch 11/20 Iteration 1909/3560 Training loss: 1.6850 0.0614 sec/batch\n", + "Epoch 11/20 Iteration 1910/3560 Training loss: 1.6848 0.0594 sec/batch\n", + "Epoch 11/20 Iteration 1911/3560 Training loss: 1.6845 0.0627 sec/batch\n", + "Epoch 11/20 Iteration 1912/3560 Training loss: 1.6841 0.0700 sec/batch\n", + "Epoch 11/20 Iteration 1913/3560 Training loss: 1.6840 0.0664 sec/batch\n", + "Epoch 11/20 Iteration 1914/3560 Training loss: 1.6840 0.0601 sec/batch\n", + "Epoch 11/20 Iteration 1915/3560 Training loss: 1.6840 0.0620 sec/batch\n", + "Epoch 11/20 Iteration 1916/3560 Training loss: 1.6840 0.0658 sec/batch\n", + "Epoch 11/20 Iteration 1917/3560 Training loss: 1.6840 0.0641 sec/batch\n", + "Epoch 11/20 Iteration 1918/3560 Training loss: 1.6840 0.0595 sec/batch\n", + "Epoch 11/20 Iteration 1919/3560 Training loss: 1.6841 0.0607 sec/batch\n", + "Epoch 11/20 Iteration 1920/3560 Training loss: 1.6839 0.0659 sec/batch\n", + "Epoch 11/20 Iteration 1921/3560 Training loss: 1.6842 0.0617 sec/batch\n", + "Epoch 11/20 Iteration 1922/3560 Training loss: 1.6840 0.0590 sec/batch\n", + "Epoch 11/20 Iteration 1923/3560 Training loss: 1.6839 0.0616 sec/batch\n", + "Epoch 11/20 Iteration 1924/3560 Training loss: 1.6839 0.0641 sec/batch\n", + "Epoch 11/20 Iteration 1925/3560 Training loss: 1.6837 0.0676 sec/batch\n", + "Epoch 11/20 Iteration 1926/3560 Training loss: 1.6838 0.0635 sec/batch\n", + "Epoch 11/20 Iteration 1927/3560 Training loss: 1.6838 0.0596 sec/batch\n", + "Epoch 11/20 Iteration 1928/3560 Training loss: 1.6839 0.0606 sec/batch\n", + "Epoch 11/20 Iteration 1929/3560 Training loss: 1.6839 0.0618 sec/batch\n", + 
"Epoch 11/20 Iteration 1930/3560 Training loss: 1.6838 0.0608 sec/batch\n", + "Epoch 11/20 Iteration 1931/3560 Training loss: 1.6835 0.0621 sec/batch\n", + "Epoch 11/20 Iteration 1932/3560 Training loss: 1.6834 0.0654 sec/batch\n", + "Epoch 11/20 Iteration 1933/3560 Training loss: 1.6835 0.0629 sec/batch\n", + "Epoch 11/20 Iteration 1934/3560 Training loss: 1.6835 0.0607 sec/batch\n", + "Epoch 11/20 Iteration 1935/3560 Training loss: 1.6834 0.0600 sec/batch\n", + "Epoch 11/20 Iteration 1936/3560 Training loss: 1.6834 0.0646 sec/batch\n", + "Epoch 11/20 Iteration 1937/3560 Training loss: 1.6834 0.0681 sec/batch\n", + "Epoch 11/20 Iteration 1938/3560 Training loss: 1.6833 0.0600 sec/batch\n", + "Epoch 11/20 Iteration 1939/3560 Training loss: 1.6830 0.0606 sec/batch\n", + "Epoch 11/20 Iteration 1940/3560 Training loss: 1.6831 0.0631 sec/batch\n", + "Epoch 11/20 Iteration 1941/3560 Training loss: 1.6832 0.0616 sec/batch\n", + "Epoch 11/20 Iteration 1942/3560 Training loss: 1.6830 0.0621 sec/batch\n", + "Epoch 11/20 Iteration 1943/3560 Training loss: 1.6830 0.0606 sec/batch\n", + "Epoch 11/20 Iteration 1944/3560 Training loss: 1.6830 0.0665 sec/batch\n", + "Epoch 11/20 Iteration 1945/3560 Training loss: 1.6829 0.0670 sec/batch\n", + "Epoch 11/20 Iteration 1946/3560 Training loss: 1.6828 0.0647 sec/batch\n", + "Epoch 11/20 Iteration 1947/3560 Training loss: 1.6829 0.0608 sec/batch\n", + "Epoch 11/20 Iteration 1948/3560 Training loss: 1.6832 0.0665 sec/batch\n", + "Epoch 11/20 Iteration 1949/3560 Training loss: 1.6831 0.0596 sec/batch\n", + "Epoch 11/20 Iteration 1950/3560 Training loss: 1.6830 0.0605 sec/batch\n", + "Epoch 11/20 Iteration 1951/3560 Training loss: 1.6828 0.0602 sec/batch\n", + "Epoch 11/20 Iteration 1952/3560 Training loss: 1.6826 0.0661 sec/batch\n", + "Epoch 11/20 Iteration 1953/3560 Training loss: 1.6826 0.0695 sec/batch\n", + "Epoch 11/20 Iteration 1954/3560 Training loss: 1.6826 0.0620 sec/batch\n", + "Epoch 11/20 Iteration 1955/3560 Training loss: 
1.6826 0.0592 sec/batch\n", + "Epoch 11/20 Iteration 1956/3560 Training loss: 1.6824 0.0634 sec/batch\n", + "Epoch 11/20 Iteration 1957/3560 Training loss: 1.6823 0.0596 sec/batch\n", + "Epoch 11/20 Iteration 1958/3560 Training loss: 1.6823 0.0602 sec/batch\n", + "Epoch 12/20 Iteration 1959/3560 Training loss: 1.7930 0.0601 sec/batch\n", + "Epoch 12/20 Iteration 1960/3560 Training loss: 1.7395 0.0691 sec/batch\n", + "Epoch 12/20 Iteration 1961/3560 Training loss: 1.7118 0.0640 sec/batch\n", + "Epoch 12/20 Iteration 1962/3560 Training loss: 1.7020 0.0607 sec/batch\n", + "Epoch 12/20 Iteration 1963/3560 Training loss: 1.6935 0.0611 sec/batch\n", + "Epoch 12/20 Iteration 1964/3560 Training loss: 1.6821 0.0718 sec/batch\n", + "Epoch 12/20 Iteration 1965/3560 Training loss: 1.6799 0.0589 sec/batch\n", + "Epoch 12/20 Iteration 1966/3560 Training loss: 1.6768 0.0620 sec/batch\n", + "Epoch 12/20 Iteration 1967/3560 Training loss: 1.6770 0.0606 sec/batch\n", + "Epoch 12/20 Iteration 1968/3560 Training loss: 1.6754 0.0625 sec/batch\n", + "Epoch 12/20 Iteration 1969/3560 Training loss: 1.6709 0.0625 sec/batch\n", + "Epoch 12/20 Iteration 1970/3560 Training loss: 1.6684 0.0611 sec/batch\n", + "Epoch 12/20 Iteration 1971/3560 Training loss: 1.6684 0.0605 sec/batch\n", + "Epoch 12/20 Iteration 1972/3560 Training loss: 1.6698 0.0698 sec/batch\n", + "Epoch 12/20 Iteration 1973/3560 Training loss: 1.6687 0.0648 sec/batch\n", + "Epoch 12/20 Iteration 1974/3560 Training loss: 1.6669 0.0626 sec/batch\n", + "Epoch 12/20 Iteration 1975/3560 Training loss: 1.6667 0.0597 sec/batch\n", + "Epoch 12/20 Iteration 1976/3560 Training loss: 1.6685 0.0673 sec/batch\n", + "Epoch 12/20 Iteration 1977/3560 Training loss: 1.6684 0.0649 sec/batch\n", + "Epoch 12/20 Iteration 1978/3560 Training loss: 1.6693 0.0635 sec/batch\n", + "Epoch 12/20 Iteration 1979/3560 Training loss: 1.6682 0.0607 sec/batch\n", + "Epoch 12/20 Iteration 1980/3560 Training loss: 1.6687 0.0634 sec/batch\n", + "Epoch 12/20 
Iteration 1981/3560 Training loss: 1.6679 0.0649 sec/batch\n", + "Epoch 12/20 Iteration 1982/3560 Training loss: 1.6672 0.0600 sec/batch\n", + "Epoch 12/20 Iteration 1983/3560 Training loss: 1.6670 0.0606 sec/batch\n", + "Epoch 12/20 Iteration 1984/3560 Training loss: 1.6655 0.0717 sec/batch\n", + "Epoch 12/20 Iteration 1985/3560 Training loss: 1.6643 0.0599 sec/batch\n", + "Epoch 12/20 Iteration 1986/3560 Training loss: 1.6647 0.0651 sec/batch\n", + "Epoch 12/20 Iteration 1987/3560 Training loss: 1.6653 0.0593 sec/batch\n", + "Epoch 12/20 Iteration 1988/3560 Training loss: 1.6653 0.0704 sec/batch\n", + "Epoch 12/20 Iteration 1989/3560 Training loss: 1.6647 0.0612 sec/batch\n", + "Epoch 12/20 Iteration 1990/3560 Training loss: 1.6638 0.0622 sec/batch\n", + "Epoch 12/20 Iteration 1991/3560 Training loss: 1.6639 0.0600 sec/batch\n", + "Epoch 12/20 Iteration 1992/3560 Training loss: 1.6646 0.0615 sec/batch\n", + "Epoch 12/20 Iteration 1993/3560 Training loss: 1.6645 0.0728 sec/batch\n", + "Epoch 12/20 Iteration 1994/3560 Training loss: 1.6641 0.0649 sec/batch\n", + "Epoch 12/20 Iteration 1995/3560 Training loss: 1.6635 0.0598 sec/batch\n", + "Epoch 12/20 Iteration 1996/3560 Training loss: 1.6620 0.0707 sec/batch\n", + "Epoch 12/20 Iteration 1997/3560 Training loss: 1.6606 0.0623 sec/batch\n", + "Epoch 12/20 Iteration 1998/3560 Training loss: 1.6600 0.0631 sec/batch\n", + "Epoch 12/20 Iteration 1999/3560 Training loss: 1.6594 0.0657 sec/batch\n", + "Epoch 12/20 Iteration 2000/3560 Training loss: 1.6598 0.0686 sec/batch\n", + "Epoch 12/20 Iteration 2001/3560 Training loss: 1.6591 0.0595 sec/batch\n", + "Epoch 12/20 Iteration 2002/3560 Training loss: 1.6580 0.0610 sec/batch\n", + "Epoch 12/20 Iteration 2003/3560 Training loss: 1.6582 0.0643 sec/batch\n", + "Epoch 12/20 Iteration 2004/3560 Training loss: 1.6572 0.0638 sec/batch\n", + "Epoch 12/20 Iteration 2005/3560 Training loss: 1.6566 0.0671 sec/batch\n", + "Epoch 12/20 Iteration 2006/3560 Training loss: 1.6563 0.0603 
sec/batch\n", + "Epoch 12/20 Iteration 2007/3560 Training loss: 1.6560 0.0651 sec/batch\n", + "Epoch 12/20 Iteration 2008/3560 Training loss: 1.6566 0.0657 sec/batch\n", + "Epoch 12/20 Iteration 2009/3560 Training loss: 1.6561 0.0619 sec/batch\n", + "Epoch 12/20 Iteration 2010/3560 Training loss: 1.6568 0.0630 sec/batch\n", + "Epoch 12/20 Iteration 2011/3560 Training loss: 1.6566 0.0716 sec/batch\n", + "Epoch 12/20 Iteration 2012/3560 Training loss: 1.6566 0.0617 sec/batch\n", + "Epoch 12/20 Iteration 2013/3560 Training loss: 1.6564 0.0643 sec/batch\n", + "Epoch 12/20 Iteration 2014/3560 Training loss: 1.6564 0.0652 sec/batch\n", + "Epoch 12/20 Iteration 2015/3560 Training loss: 1.6567 0.0597 sec/batch\n", + "Epoch 12/20 Iteration 2016/3560 Training loss: 1.6565 0.0606 sec/batch\n", + "Epoch 12/20 Iteration 2017/3560 Training loss: 1.6558 0.0613 sec/batch\n", + "Epoch 12/20 Iteration 2018/3560 Training loss: 1.6564 0.0597 sec/batch\n", + "Epoch 12/20 Iteration 2019/3560 Training loss: 1.6561 0.0687 sec/batch\n", + "Epoch 12/20 Iteration 2020/3560 Training loss: 1.6569 0.0613 sec/batch\n", + "Epoch 12/20 Iteration 2021/3560 Training loss: 1.6574 0.0607 sec/batch\n", + "Epoch 12/20 Iteration 2022/3560 Training loss: 1.6575 0.0657 sec/batch\n", + "Epoch 12/20 Iteration 2023/3560 Training loss: 1.6573 0.0756 sec/batch\n", + "Epoch 12/20 Iteration 2024/3560 Training loss: 1.6575 0.0680 sec/batch\n", + "Epoch 12/20 Iteration 2025/3560 Training loss: 1.6575 0.0618 sec/batch\n", + "Epoch 12/20 Iteration 2026/3560 Training loss: 1.6570 0.0613 sec/batch\n", + "Epoch 12/20 Iteration 2027/3560 Training loss: 1.6568 0.0634 sec/batch\n", + "Epoch 12/20 Iteration 2028/3560 Training loss: 1.6567 0.0629 sec/batch\n", + "Epoch 12/20 Iteration 2029/3560 Training loss: 1.6573 0.0658 sec/batch\n", + "Epoch 12/20 Iteration 2030/3560 Training loss: 1.6573 0.0598 sec/batch\n", + "Epoch 12/20 Iteration 2031/3560 Training loss: 1.6578 0.0599 sec/batch\n", + "Epoch 12/20 Iteration 2032/3560 
Training loss: 1.6574 0.0597 sec/batch\n", + "Epoch 12/20 Iteration 2033/3560 Training loss: 1.6571 0.0601 sec/batch\n", + "Epoch 12/20 Iteration 2034/3560 Training loss: 1.6574 0.0601 sec/batch\n", + "Epoch 12/20 Iteration 2035/3560 Training loss: 1.6571 0.0684 sec/batch\n", + "Epoch 12/20 Iteration 2036/3560 Training loss: 1.6570 0.0603 sec/batch\n", + "Epoch 12/20 Iteration 2037/3560 Training loss: 1.6563 0.0623 sec/batch\n", + "Epoch 12/20 Iteration 2038/3560 Training loss: 1.6562 0.0621 sec/batch\n", + "Epoch 12/20 Iteration 2039/3560 Training loss: 1.6555 0.0702 sec/batch\n", + "Epoch 12/20 Iteration 2040/3560 Training loss: 1.6556 0.0626 sec/batch\n", + "Epoch 12/20 Iteration 2041/3560 Training loss: 1.6551 0.0610 sec/batch\n", + "Epoch 12/20 Iteration 2042/3560 Training loss: 1.6551 0.0595 sec/batch\n", + "Epoch 12/20 Iteration 2043/3560 Training loss: 1.6546 0.0609 sec/batch\n", + "Epoch 12/20 Iteration 2044/3560 Training loss: 1.6543 0.0640 sec/batch\n", + "Epoch 12/20 Iteration 2045/3560 Training loss: 1.6540 0.0598 sec/batch\n", + "Epoch 12/20 Iteration 2046/3560 Training loss: 1.6535 0.0585 sec/batch\n", + "Epoch 12/20 Iteration 2047/3560 Training loss: 1.6530 0.0625 sec/batch\n", + "Epoch 12/20 Iteration 2048/3560 Training loss: 1.6530 0.0618 sec/batch\n", + "Epoch 12/20 Iteration 2049/3560 Training loss: 1.6528 0.0601 sec/batch\n", + "Epoch 12/20 Iteration 2050/3560 Training loss: 1.6525 0.0617 sec/batch\n", + "Epoch 12/20 Iteration 2051/3560 Training loss: 1.6520 0.0682 sec/batch\n", + "Epoch 12/20 Iteration 2052/3560 Training loss: 1.6517 0.0634 sec/batch\n", + "Epoch 12/20 Iteration 2053/3560 Training loss: 1.6515 0.0603 sec/batch\n", + "Epoch 12/20 Iteration 2054/3560 Training loss: 1.6514 0.0657 sec/batch\n", + "Epoch 12/20 Iteration 2055/3560 Training loss: 1.6512 0.0658 sec/batch\n", + "Epoch 12/20 Iteration 2056/3560 Training loss: 1.6507 0.0610 sec/batch\n", + "Epoch 12/20 Iteration 2057/3560 Training loss: 1.6503 0.0604 sec/batch\n", + 
"Epoch 12/20 Iteration 2058/3560 Training loss: 1.6498 0.0616 sec/batch\n", + "Epoch 12/20 Iteration 2059/3560 Training loss: 1.6497 0.0736 sec/batch\n", + "Epoch 12/20 Iteration 2060/3560 Training loss: 1.6495 0.0625 sec/batch\n", + "Epoch 12/20 Iteration 2061/3560 Training loss: 1.6493 0.0605 sec/batch\n", + "Epoch 12/20 Iteration 2062/3560 Training loss: 1.6491 0.0600 sec/batch\n", + "Epoch 12/20 Iteration 2063/3560 Training loss: 1.6487 0.0630 sec/batch\n", + "Epoch 12/20 Iteration 2064/3560 Training loss: 1.6486 0.0597 sec/batch\n", + "Epoch 12/20 Iteration 2065/3560 Training loss: 1.6484 0.0601 sec/batch\n", + "Epoch 12/20 Iteration 2066/3560 Training loss: 1.6484 0.0604 sec/batch\n", + "Epoch 12/20 Iteration 2067/3560 Training loss: 1.6483 0.0704 sec/batch\n", + "Epoch 12/20 Iteration 2068/3560 Training loss: 1.6483 0.0663 sec/batch\n", + "Epoch 12/20 Iteration 2069/3560 Training loss: 1.6481 0.0604 sec/batch\n", + "Epoch 12/20 Iteration 2070/3560 Training loss: 1.6480 0.0598 sec/batch\n", + "Epoch 12/20 Iteration 2071/3560 Training loss: 1.6479 0.0707 sec/batch\n", + "Epoch 12/20 Iteration 2072/3560 Training loss: 1.6477 0.0660 sec/batch\n", + "Epoch 12/20 Iteration 2073/3560 Training loss: 1.6474 0.0599 sec/batch\n", + "Epoch 12/20 Iteration 2074/3560 Training loss: 1.6470 0.0626 sec/batch\n", + "Epoch 12/20 Iteration 2075/3560 Training loss: 1.6470 0.0621 sec/batch\n", + "Epoch 12/20 Iteration 2076/3560 Training loss: 1.6469 0.0619 sec/batch\n", + "Epoch 12/20 Iteration 2077/3560 Training loss: 1.6467 0.0582 sec/batch\n", + "Epoch 12/20 Iteration 2078/3560 Training loss: 1.6466 0.0679 sec/batch\n", + "Epoch 12/20 Iteration 2079/3560 Training loss: 1.6466 0.0632 sec/batch\n", + "Epoch 12/20 Iteration 2080/3560 Training loss: 1.6461 0.0613 sec/batch\n", + "Epoch 12/20 Iteration 2081/3560 Training loss: 1.6457 0.0628 sec/batch\n", + "Epoch 12/20 Iteration 2082/3560 Training loss: 1.6457 0.0623 sec/batch\n", + "Epoch 12/20 Iteration 2083/3560 Training loss: 
1.6457 0.0643 sec/batch\n", + "Epoch 12/20 Iteration 2084/3560 Training loss: 1.6452 0.0617 sec/batch\n", + "Epoch 12/20 Iteration 2085/3560 Training loss: 1.6453 0.0603 sec/batch\n", + "Epoch 12/20 Iteration 2086/3560 Training loss: 1.6454 0.0599 sec/batch\n", + "Epoch 12/20 Iteration 2087/3560 Training loss: 1.6452 0.0691 sec/batch\n", + "Epoch 12/20 Iteration 2088/3560 Training loss: 1.6450 0.0680 sec/batch\n", + "Epoch 12/20 Iteration 2089/3560 Training loss: 1.6447 0.0593 sec/batch\n", + "Epoch 12/20 Iteration 2090/3560 Training loss: 1.6443 0.0621 sec/batch\n", + "Epoch 12/20 Iteration 2091/3560 Training loss: 1.6443 0.0652 sec/batch\n", + "Epoch 12/20 Iteration 2092/3560 Training loss: 1.6443 0.0622 sec/batch\n", + "Epoch 12/20 Iteration 2093/3560 Training loss: 1.6444 0.0622 sec/batch\n", + "Epoch 12/20 Iteration 2094/3560 Training loss: 1.6443 0.0623 sec/batch\n", + "Epoch 12/20 Iteration 2095/3560 Training loss: 1.6444 0.0655 sec/batch\n", + "Epoch 12/20 Iteration 2096/3560 Training loss: 1.6444 0.0702 sec/batch\n", + "Epoch 12/20 Iteration 2097/3560 Training loss: 1.6445 0.0592 sec/batch\n", + "Epoch 12/20 Iteration 2098/3560 Training loss: 1.6444 0.0737 sec/batch\n", + "Epoch 12/20 Iteration 2099/3560 Training loss: 1.6447 0.0663 sec/batch\n", + "Epoch 12/20 Iteration 2100/3560 Training loss: 1.6446 0.0648 sec/batch\n", + "Epoch 12/20 Iteration 2101/3560 Training loss: 1.6445 0.0609 sec/batch\n", + "Epoch 12/20 Iteration 2102/3560 Training loss: 1.6445 0.0700 sec/batch\n", + "Epoch 12/20 Iteration 2103/3560 Training loss: 1.6443 0.0608 sec/batch\n", + "Epoch 12/20 Iteration 2104/3560 Training loss: 1.6444 0.0633 sec/batch\n", + "Epoch 12/20 Iteration 2105/3560 Training loss: 1.6444 0.0630 sec/batch\n", + "Epoch 12/20 Iteration 2106/3560 Training loss: 1.6446 0.0709 sec/batch\n", + "Epoch 12/20 Iteration 2107/3560 Training loss: 1.6445 0.0653 sec/batch\n", + "Epoch 12/20 Iteration 2108/3560 Training loss: 1.6444 0.0651 sec/batch\n", + "Epoch 12/20 
Iteration 2109/3560 Training loss: 1.6441 0.0602 sec/batch\n", + "Epoch 12/20 Iteration 2110/3560 Training loss: 1.6440 0.0689 sec/batch\n", + "Epoch 12/20 Iteration 2111/3560 Training loss: 1.6440 0.0667 sec/batch\n", + "Epoch 12/20 Iteration 2112/3560 Training loss: 1.6440 0.0592 sec/batch\n", + "Epoch 12/20 Iteration 2113/3560 Training loss: 1.6440 0.0602 sec/batch\n", + "Epoch 12/20 Iteration 2114/3560 Training loss: 1.6439 0.0609 sec/batch\n", + "Epoch 12/20 Iteration 2115/3560 Training loss: 1.6439 0.0638 sec/batch\n", + "Epoch 12/20 Iteration 2116/3560 Training loss: 1.6438 0.0605 sec/batch\n", + "Epoch 12/20 Iteration 2117/3560 Training loss: 1.6435 0.0609 sec/batch\n", + "Epoch 12/20 Iteration 2118/3560 Training loss: 1.6436 0.0688 sec/batch\n", + "Epoch 12/20 Iteration 2119/3560 Training loss: 1.6438 0.0754 sec/batch\n", + "Epoch 12/20 Iteration 2120/3560 Training loss: 1.6436 0.0621 sec/batch\n", + "Epoch 12/20 Iteration 2121/3560 Training loss: 1.6437 0.0599 sec/batch\n", + "Epoch 12/20 Iteration 2122/3560 Training loss: 1.6437 0.0652 sec/batch\n", + "Epoch 12/20 Iteration 2123/3560 Training loss: 1.6436 0.0619 sec/batch\n", + "Epoch 12/20 Iteration 2124/3560 Training loss: 1.6436 0.0594 sec/batch\n", + "Epoch 12/20 Iteration 2125/3560 Training loss: 1.6437 0.0673 sec/batch\n", + "Epoch 12/20 Iteration 2126/3560 Training loss: 1.6440 0.0707 sec/batch\n", + "Epoch 12/20 Iteration 2127/3560 Training loss: 1.6439 0.0693 sec/batch\n", + "Epoch 12/20 Iteration 2128/3560 Training loss: 1.6438 0.0611 sec/batch\n", + "Epoch 12/20 Iteration 2129/3560 Training loss: 1.6437 0.0670 sec/batch\n", + "Epoch 12/20 Iteration 2130/3560 Training loss: 1.6434 0.0634 sec/batch\n", + "Epoch 12/20 Iteration 2131/3560 Training loss: 1.6435 0.0618 sec/batch\n", + "Epoch 12/20 Iteration 2132/3560 Training loss: 1.6435 0.0605 sec/batch\n", + "Epoch 12/20 Iteration 2133/3560 Training loss: 1.6435 0.0605 sec/batch\n", + "Epoch 12/20 Iteration 2134/3560 Training loss: 1.6434 0.0618 
sec/batch\n", + "Epoch 12/20 Iteration 2135/3560 Training loss: 1.6432 0.0608 sec/batch\n", + "Epoch 12/20 Iteration 2136/3560 Training loss: 1.6432 0.0598 sec/batch\n", + "Epoch 13/20 Iteration 2137/3560 Training loss: 1.7645 0.0613 sec/batch\n", + "Epoch 13/20 Iteration 2138/3560 Training loss: 1.7083 0.0653 sec/batch\n", + "Epoch 13/20 Iteration 2139/3560 Training loss: 1.6840 0.0603 sec/batch\n", + "Epoch 13/20 Iteration 2140/3560 Training loss: 1.6731 0.0604 sec/batch\n", + "Epoch 13/20 Iteration 2141/3560 Training loss: 1.6636 0.0597 sec/batch\n", + "Epoch 13/20 Iteration 2142/3560 Training loss: 1.6507 0.0718 sec/batch\n", + "Epoch 13/20 Iteration 2143/3560 Training loss: 1.6481 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2144/3560 Training loss: 1.6446 0.0713 sec/batch\n", + "Epoch 13/20 Iteration 2145/3560 Training loss: 1.6437 0.0610 sec/batch\n", + "Epoch 13/20 Iteration 2146/3560 Training loss: 1.6420 0.0750 sec/batch\n", + "Epoch 13/20 Iteration 2147/3560 Training loss: 1.6374 0.0622 sec/batch\n", + "Epoch 13/20 Iteration 2148/3560 Training loss: 1.6348 0.0633 sec/batch\n", + "Epoch 13/20 Iteration 2149/3560 Training loss: 1.6344 0.0654 sec/batch\n", + "Epoch 13/20 Iteration 2150/3560 Training loss: 1.6365 0.0716 sec/batch\n", + "Epoch 13/20 Iteration 2151/3560 Training loss: 1.6349 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2152/3560 Training loss: 1.6330 0.0633 sec/batch\n", + "Epoch 13/20 Iteration 2153/3560 Training loss: 1.6326 0.0593 sec/batch\n", + "Epoch 13/20 Iteration 2154/3560 Training loss: 1.6347 0.0616 sec/batch\n", + "Epoch 13/20 Iteration 2155/3560 Training loss: 1.6347 0.0660 sec/batch\n", + "Epoch 13/20 Iteration 2156/3560 Training loss: 1.6354 0.0698 sec/batch\n", + "Epoch 13/20 Iteration 2157/3560 Training loss: 1.6344 0.0619 sec/batch\n", + "Epoch 13/20 Iteration 2158/3560 Training loss: 1.6345 0.0609 sec/batch\n", + "Epoch 13/20 Iteration 2159/3560 Training loss: 1.6335 0.0652 sec/batch\n", + "Epoch 13/20 Iteration 2160/3560 
Training loss: 1.6328 0.0666 sec/batch\n", + "Epoch 13/20 Iteration 2161/3560 Training loss: 1.6328 0.0642 sec/batch\n", + "Epoch 13/20 Iteration 2162/3560 Training loss: 1.6310 0.0628 sec/batch\n", + "Epoch 13/20 Iteration 2163/3560 Training loss: 1.6296 0.0619 sec/batch\n", + "Epoch 13/20 Iteration 2164/3560 Training loss: 1.6301 0.0649 sec/batch\n", + "Epoch 13/20 Iteration 2165/3560 Training loss: 1.6307 0.0621 sec/batch\n", + "Epoch 13/20 Iteration 2166/3560 Training loss: 1.6307 0.0609 sec/batch\n", + "Epoch 13/20 Iteration 2167/3560 Training loss: 1.6303 0.0642 sec/batch\n", + "Epoch 13/20 Iteration 2168/3560 Training loss: 1.6292 0.0601 sec/batch\n", + "Epoch 13/20 Iteration 2169/3560 Training loss: 1.6291 0.0593 sec/batch\n", + "Epoch 13/20 Iteration 2170/3560 Training loss: 1.6295 0.0613 sec/batch\n", + "Epoch 13/20 Iteration 2171/3560 Training loss: 1.6293 0.0640 sec/batch\n", + "Epoch 13/20 Iteration 2172/3560 Training loss: 1.6290 0.0610 sec/batch\n", + "Epoch 13/20 Iteration 2173/3560 Training loss: 1.6286 0.0601 sec/batch\n", + "Epoch 13/20 Iteration 2174/3560 Training loss: 1.6273 0.0593 sec/batch\n", + "Epoch 13/20 Iteration 2175/3560 Training loss: 1.6260 0.0650 sec/batch\n", + "Epoch 13/20 Iteration 2176/3560 Training loss: 1.6253 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2177/3560 Training loss: 1.6248 0.0592 sec/batch\n", + "Epoch 13/20 Iteration 2178/3560 Training loss: 1.6253 0.0605 sec/batch\n", + "Epoch 13/20 Iteration 2179/3560 Training loss: 1.6247 0.0686 sec/batch\n", + "Epoch 13/20 Iteration 2180/3560 Training loss: 1.6238 0.0649 sec/batch\n", + "Epoch 13/20 Iteration 2181/3560 Training loss: 1.6241 0.0597 sec/batch\n", + "Epoch 13/20 Iteration 2182/3560 Training loss: 1.6230 0.0641 sec/batch\n", + "Epoch 13/20 Iteration 2183/3560 Training loss: 1.6226 0.0651 sec/batch\n", + "Epoch 13/20 Iteration 2184/3560 Training loss: 1.6224 0.0651 sec/batch\n", + "Epoch 13/20 Iteration 2185/3560 Training loss: 1.6223 0.0606 sec/batch\n", + 
"Epoch 13/20 Iteration 2186/3560 Training loss: 1.6229 0.0651 sec/batch\n", + "Epoch 13/20 Iteration 2187/3560 Training loss: 1.6224 0.0695 sec/batch\n", + "Epoch 13/20 Iteration 2188/3560 Training loss: 1.6232 0.0696 sec/batch\n", + "Epoch 13/20 Iteration 2189/3560 Training loss: 1.6230 0.0685 sec/batch\n", + "Epoch 13/20 Iteration 2190/3560 Training loss: 1.6231 0.0605 sec/batch\n", + "Epoch 13/20 Iteration 2191/3560 Training loss: 1.6229 0.0707 sec/batch\n", + "Epoch 13/20 Iteration 2192/3560 Training loss: 1.6230 0.0622 sec/batch\n", + "Epoch 13/20 Iteration 2193/3560 Training loss: 1.6233 0.0631 sec/batch\n", + "Epoch 13/20 Iteration 2194/3560 Training loss: 1.6229 0.0622 sec/batch\n", + "Epoch 13/20 Iteration 2195/3560 Training loss: 1.6223 0.0635 sec/batch\n", + "Epoch 13/20 Iteration 2196/3560 Training loss: 1.6228 0.0603 sec/batch\n", + "Epoch 13/20 Iteration 2197/3560 Training loss: 1.6228 0.0606 sec/batch\n", + "Epoch 13/20 Iteration 2198/3560 Training loss: 1.6236 0.0643 sec/batch\n", + "Epoch 13/20 Iteration 2199/3560 Training loss: 1.6240 0.0673 sec/batch\n", + "Epoch 13/20 Iteration 2200/3560 Training loss: 1.6242 0.0598 sec/batch\n", + "Epoch 13/20 Iteration 2201/3560 Training loss: 1.6239 0.0595 sec/batch\n", + "Epoch 13/20 Iteration 2202/3560 Training loss: 1.6240 0.0629 sec/batch\n", + "Epoch 13/20 Iteration 2203/3560 Training loss: 1.6241 0.0686 sec/batch\n", + "Epoch 13/20 Iteration 2204/3560 Training loss: 1.6236 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2205/3560 Training loss: 1.6234 0.0647 sec/batch\n", + "Epoch 13/20 Iteration 2206/3560 Training loss: 1.6232 0.0679 sec/batch\n", + "Epoch 13/20 Iteration 2207/3560 Training loss: 1.6238 0.0649 sec/batch\n", + "Epoch 13/20 Iteration 2208/3560 Training loss: 1.6240 0.0676 sec/batch\n", + "Epoch 13/20 Iteration 2209/3560 Training loss: 1.6244 0.0654 sec/batch\n", + "Epoch 13/20 Iteration 2210/3560 Training loss: 1.6240 0.0656 sec/batch\n", + "Epoch 13/20 Iteration 2211/3560 Training loss: 
1.6238 0.0610 sec/batch\n", + "Epoch 13/20 Iteration 2212/3560 Training loss: 1.6241 0.0606 sec/batch\n", + "Epoch 13/20 Iteration 2213/3560 Training loss: 1.6239 0.0687 sec/batch\n", + "Epoch 13/20 Iteration 2214/3560 Training loss: 1.6238 0.0618 sec/batch\n", + "Epoch 13/20 Iteration 2215/3560 Training loss: 1.6230 0.0663 sec/batch\n", + "Epoch 13/20 Iteration 2216/3560 Training loss: 1.6230 0.0642 sec/batch\n", + "Epoch 13/20 Iteration 2217/3560 Training loss: 1.6224 0.0667 sec/batch\n", + "Epoch 13/20 Iteration 2218/3560 Training loss: 1.6225 0.0633 sec/batch\n", + "Epoch 13/20 Iteration 2219/3560 Training loss: 1.6220 0.0631 sec/batch\n", + "Epoch 13/20 Iteration 2220/3560 Training loss: 1.6219 0.0620 sec/batch\n", + "Epoch 13/20 Iteration 2221/3560 Training loss: 1.6213 0.0678 sec/batch\n", + "Epoch 13/20 Iteration 2222/3560 Training loss: 1.6209 0.0684 sec/batch\n", + "Epoch 13/20 Iteration 2223/3560 Training loss: 1.6206 0.0606 sec/batch\n", + "Epoch 13/20 Iteration 2224/3560 Training loss: 1.6202 0.0608 sec/batch\n", + "Epoch 13/20 Iteration 2225/3560 Training loss: 1.6197 0.0746 sec/batch\n", + "Epoch 13/20 Iteration 2226/3560 Training loss: 1.6197 0.0700 sec/batch\n", + "Epoch 13/20 Iteration 2227/3560 Training loss: 1.6194 0.0617 sec/batch\n", + "Epoch 13/20 Iteration 2228/3560 Training loss: 1.6191 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2229/3560 Training loss: 1.6187 0.0726 sec/batch\n", + "Epoch 13/20 Iteration 2230/3560 Training loss: 1.6183 0.0765 sec/batch\n", + "Epoch 13/20 Iteration 2231/3560 Training loss: 1.6180 0.0619 sec/batch\n", + "Epoch 13/20 Iteration 2232/3560 Training loss: 1.6180 0.0612 sec/batch\n", + "Epoch 13/20 Iteration 2233/3560 Training loss: 1.6177 0.0755 sec/batch\n", + "Epoch 13/20 Iteration 2234/3560 Training loss: 1.6173 0.0689 sec/batch\n", + "Epoch 13/20 Iteration 2235/3560 Training loss: 1.6169 0.0646 sec/batch\n", + "Epoch 13/20 Iteration 2236/3560 Training loss: 1.6164 0.0767 sec/batch\n", + "Epoch 13/20 
Iteration 2237/3560 Training loss: 1.6164 0.0712 sec/batch\n", + "Epoch 13/20 Iteration 2238/3560 Training loss: 1.6161 0.0737 sec/batch\n", + "Epoch 13/20 Iteration 2239/3560 Training loss: 1.6159 0.0747 sec/batch\n", + "Epoch 13/20 Iteration 2240/3560 Training loss: 1.6157 0.0606 sec/batch\n", + "Epoch 13/20 Iteration 2241/3560 Training loss: 1.6155 0.0635 sec/batch\n", + "Epoch 13/20 Iteration 2242/3560 Training loss: 1.6154 0.0637 sec/batch\n", + "Epoch 13/20 Iteration 2243/3560 Training loss: 1.6152 0.0851 sec/batch\n", + "Epoch 13/20 Iteration 2244/3560 Training loss: 1.6152 0.0686 sec/batch\n", + "Epoch 13/20 Iteration 2245/3560 Training loss: 1.6150 0.0620 sec/batch\n", + "Epoch 13/20 Iteration 2246/3560 Training loss: 1.6151 0.0727 sec/batch\n", + "Epoch 13/20 Iteration 2247/3560 Training loss: 1.6150 0.0629 sec/batch\n", + "Epoch 13/20 Iteration 2248/3560 Training loss: 1.6149 0.0660 sec/batch\n", + "Epoch 13/20 Iteration 2249/3560 Training loss: 1.6148 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2250/3560 Training loss: 1.6146 0.0616 sec/batch\n", + "Epoch 13/20 Iteration 2251/3560 Training loss: 1.6143 0.0601 sec/batch\n", + "Epoch 13/20 Iteration 2252/3560 Training loss: 1.6140 0.0813 sec/batch\n", + "Epoch 13/20 Iteration 2253/3560 Training loss: 1.6139 0.0622 sec/batch\n", + "Epoch 13/20 Iteration 2254/3560 Training loss: 1.6138 0.0615 sec/batch\n", + "Epoch 13/20 Iteration 2255/3560 Training loss: 1.6137 0.0650 sec/batch\n", + "Epoch 13/20 Iteration 2256/3560 Training loss: 1.6137 0.0695 sec/batch\n", + "Epoch 13/20 Iteration 2257/3560 Training loss: 1.6136 0.0661 sec/batch\n", + "Epoch 13/20 Iteration 2258/3560 Training loss: 1.6131 0.0608 sec/batch\n", + "Epoch 13/20 Iteration 2259/3560 Training loss: 1.6127 0.0609 sec/batch\n", + "Epoch 13/20 Iteration 2260/3560 Training loss: 1.6127 0.0679 sec/batch\n", + "Epoch 13/20 Iteration 2261/3560 Training loss: 1.6127 0.0654 sec/batch\n", + "Epoch 13/20 Iteration 2262/3560 Training loss: 1.6123 0.0604 
sec/batch\n", + "Epoch 13/20 Iteration 2263/3560 Training loss: 1.6124 0.0595 sec/batch\n", + "Epoch 13/20 Iteration 2264/3560 Training loss: 1.6124 0.0617 sec/batch\n", + "Epoch 13/20 Iteration 2265/3560 Training loss: 1.6122 0.0668 sec/batch\n", + "Epoch 13/20 Iteration 2266/3560 Training loss: 1.6119 0.0601 sec/batch\n", + "Epoch 13/20 Iteration 2267/3560 Training loss: 1.6116 0.0603 sec/batch\n", + "Epoch 13/20 Iteration 2268/3560 Training loss: 1.6112 0.0636 sec/batch\n", + "Epoch 13/20 Iteration 2269/3560 Training loss: 1.6112 0.0650 sec/batch\n", + "Epoch 13/20 Iteration 2270/3560 Training loss: 1.6112 0.0622 sec/batch\n", + "Epoch 13/20 Iteration 2271/3560 Training loss: 1.6112 0.0656 sec/batch\n", + "Epoch 13/20 Iteration 2272/3560 Training loss: 1.6113 0.0727 sec/batch\n", + "Epoch 13/20 Iteration 2273/3560 Training loss: 1.6113 0.0650 sec/batch\n", + "Epoch 13/20 Iteration 2274/3560 Training loss: 1.6113 0.0606 sec/batch\n", + "Epoch 13/20 Iteration 2275/3560 Training loss: 1.6114 0.0599 sec/batch\n", + "Epoch 13/20 Iteration 2276/3560 Training loss: 1.6112 0.0626 sec/batch\n", + "Epoch 13/20 Iteration 2277/3560 Training loss: 1.6115 0.0618 sec/batch\n", + "Epoch 13/20 Iteration 2278/3560 Training loss: 1.6114 0.0626 sec/batch\n", + "Epoch 13/20 Iteration 2279/3560 Training loss: 1.6114 0.0657 sec/batch\n", + "Epoch 13/20 Iteration 2280/3560 Training loss: 1.6114 0.0643 sec/batch\n", + "Epoch 13/20 Iteration 2281/3560 Training loss: 1.6113 0.0625 sec/batch\n", + "Epoch 13/20 Iteration 2282/3560 Training loss: 1.6114 0.0684 sec/batch\n", + "Epoch 13/20 Iteration 2283/3560 Training loss: 1.6114 0.0603 sec/batch\n", + "Epoch 13/20 Iteration 2284/3560 Training loss: 1.6116 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2285/3560 Training loss: 1.6116 0.0596 sec/batch\n", + "Epoch 13/20 Iteration 2286/3560 Training loss: 1.6114 0.0619 sec/batch\n", + "Epoch 13/20 Iteration 2287/3560 Training loss: 1.6111 0.0617 sec/batch\n", + "Epoch 13/20 Iteration 2288/3560 
Training loss: 1.6111 0.0617 sec/batch\n", + "Epoch 13/20 Iteration 2289/3560 Training loss: 1.6111 0.0620 sec/batch\n", + "Epoch 13/20 Iteration 2290/3560 Training loss: 1.6111 0.0616 sec/batch\n", + "Epoch 13/20 Iteration 2291/3560 Training loss: 1.6111 0.0609 sec/batch\n", + "Epoch 13/20 Iteration 2292/3560 Training loss: 1.6110 0.0778 sec/batch\n", + "Epoch 13/20 Iteration 2293/3560 Training loss: 1.6111 0.0625 sec/batch\n", + "Epoch 13/20 Iteration 2294/3560 Training loss: 1.6110 0.0614 sec/batch\n", + "Epoch 13/20 Iteration 2295/3560 Training loss: 1.6107 0.0639 sec/batch\n", + "Epoch 13/20 Iteration 2296/3560 Training loss: 1.6108 0.0644 sec/batch\n", + "Epoch 13/20 Iteration 2297/3560 Training loss: 1.6109 0.0614 sec/batch\n", + "Epoch 13/20 Iteration 2298/3560 Training loss: 1.6108 0.0608 sec/batch\n", + "Epoch 13/20 Iteration 2299/3560 Training loss: 1.6109 0.0653 sec/batch\n", + "Epoch 13/20 Iteration 2300/3560 Training loss: 1.6109 0.0659 sec/batch\n", + "Epoch 13/20 Iteration 2301/3560 Training loss: 1.6108 0.0703 sec/batch\n", + "Epoch 13/20 Iteration 2302/3560 Training loss: 1.6108 0.0638 sec/batch\n", + "Epoch 13/20 Iteration 2303/3560 Training loss: 1.6109 0.0668 sec/batch\n", + "Epoch 13/20 Iteration 2304/3560 Training loss: 1.6113 0.0706 sec/batch\n", + "Epoch 13/20 Iteration 2305/3560 Training loss: 1.6112 0.0596 sec/batch\n", + "Epoch 13/20 Iteration 2306/3560 Training loss: 1.6112 0.0611 sec/batch\n", + "Epoch 13/20 Iteration 2307/3560 Training loss: 1.6111 0.0760 sec/batch\n", + "Epoch 13/20 Iteration 2308/3560 Training loss: 1.6108 0.0653 sec/batch\n", + "Epoch 13/20 Iteration 2309/3560 Training loss: 1.6109 0.0601 sec/batch\n", + "Epoch 13/20 Iteration 2310/3560 Training loss: 1.6109 0.0622 sec/batch\n", + "Epoch 13/20 Iteration 2311/3560 Training loss: 1.6110 0.0614 sec/batch\n", + "Epoch 13/20 Iteration 2312/3560 Training loss: 1.6108 0.0748 sec/batch\n", + "Epoch 13/20 Iteration 2313/3560 Training loss: 1.6106 0.0656 sec/batch\n", + 
"Epoch 13/20 Iteration 2314/3560 Training loss: 1.6107 0.0605 sec/batch\n", + "Epoch 14/20 Iteration 2315/3560 Training loss: 1.7273 0.0714 sec/batch\n", + "Epoch 14/20 Iteration 2316/3560 Training loss: 1.6752 0.0632 sec/batch\n", + "Epoch 14/20 Iteration 2317/3560 Training loss: 1.6509 0.0607 sec/batch\n", + "Epoch 14/20 Iteration 2318/3560 Training loss: 1.6406 0.0624 sec/batch\n", + "Epoch 14/20 Iteration 2319/3560 Training loss: 1.6308 0.0659 sec/batch\n", + "Epoch 14/20 Iteration 2320/3560 Training loss: 1.6186 0.0667 sec/batch\n", + "Epoch 14/20 Iteration 2321/3560 Training loss: 1.6167 0.0779 sec/batch\n", + "Epoch 14/20 Iteration 2322/3560 Training loss: 1.6132 0.0614 sec/batch\n", + "Epoch 14/20 Iteration 2323/3560 Training loss: 1.6133 0.0605 sec/batch\n", + "Epoch 14/20 Iteration 2324/3560 Training loss: 1.6113 0.0623 sec/batch\n", + "Epoch 14/20 Iteration 2325/3560 Training loss: 1.6068 0.0625 sec/batch\n", + "Epoch 14/20 Iteration 2326/3560 Training loss: 1.6045 0.0664 sec/batch\n", + "Epoch 14/20 Iteration 2327/3560 Training loss: 1.6039 0.0635 sec/batch\n", + "Epoch 14/20 Iteration 2328/3560 Training loss: 1.6052 0.0612 sec/batch\n", + "Epoch 14/20 Iteration 2329/3560 Training loss: 1.6039 0.0622 sec/batch\n", + "Epoch 14/20 Iteration 2330/3560 Training loss: 1.6015 0.0636 sec/batch\n", + "Epoch 14/20 Iteration 2331/3560 Training loss: 1.6009 0.0674 sec/batch\n", + "Epoch 14/20 Iteration 2332/3560 Training loss: 1.6022 0.0655 sec/batch\n", + "Epoch 14/20 Iteration 2333/3560 Training loss: 1.6025 0.0795 sec/batch\n", + "Epoch 14/20 Iteration 2334/3560 Training loss: 1.6029 0.0894 sec/batch\n", + "Epoch 14/20 Iteration 2335/3560 Training loss: 1.6019 0.0787 sec/batch\n", + "Epoch 14/20 Iteration 2336/3560 Training loss: 1.6024 0.0627 sec/batch\n", + "Epoch 14/20 Iteration 2337/3560 Training loss: 1.6014 0.0642 sec/batch\n", + "Epoch 14/20 Iteration 2338/3560 Training loss: 1.6008 0.0693 sec/batch\n", + "Epoch 14/20 Iteration 2339/3560 Training loss: 
1.6009 0.0764 sec/batch\n", + "Epoch 14/20 Iteration 2340/3560 Training loss: 1.5993 0.0594 sec/batch\n", + "Epoch 14/20 Iteration 2341/3560 Training loss: 1.5979 0.0639 sec/batch\n", + "Epoch 14/20 Iteration 2342/3560 Training loss: 1.5986 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2343/3560 Training loss: 1.5989 0.0792 sec/batch\n", + "Epoch 14/20 Iteration 2344/3560 Training loss: 1.5992 0.0676 sec/batch\n", + "Epoch 14/20 Iteration 2345/3560 Training loss: 1.5988 0.0652 sec/batch\n", + "Epoch 14/20 Iteration 2346/3560 Training loss: 1.5977 0.0615 sec/batch\n", + "Epoch 14/20 Iteration 2347/3560 Training loss: 1.5977 0.0670 sec/batch\n", + "Epoch 14/20 Iteration 2348/3560 Training loss: 1.5984 0.0661 sec/batch\n", + "Epoch 14/20 Iteration 2349/3560 Training loss: 1.5983 0.0663 sec/batch\n", + "Epoch 14/20 Iteration 2350/3560 Training loss: 1.5981 0.0600 sec/batch\n", + "Epoch 14/20 Iteration 2351/3560 Training loss: 1.5978 0.0622 sec/batch\n", + "Epoch 14/20 Iteration 2352/3560 Training loss: 1.5964 0.0625 sec/batch\n", + "Epoch 14/20 Iteration 2353/3560 Training loss: 1.5949 0.0627 sec/batch\n", + "Epoch 14/20 Iteration 2354/3560 Training loss: 1.5941 0.0637 sec/batch\n", + "Epoch 14/20 Iteration 2355/3560 Training loss: 1.5937 0.0751 sec/batch\n", + "Epoch 14/20 Iteration 2356/3560 Training loss: 1.5941 0.0652 sec/batch\n", + "Epoch 14/20 Iteration 2357/3560 Training loss: 1.5932 0.0623 sec/batch\n", + "Epoch 14/20 Iteration 2358/3560 Training loss: 1.5922 0.0671 sec/batch\n", + "Epoch 14/20 Iteration 2359/3560 Training loss: 1.5924 0.0789 sec/batch\n", + "Epoch 14/20 Iteration 2360/3560 Training loss: 1.5913 0.0598 sec/batch\n", + "Epoch 14/20 Iteration 2361/3560 Training loss: 1.5908 0.0606 sec/batch\n", + "Epoch 14/20 Iteration 2362/3560 Training loss: 1.5904 0.0631 sec/batch\n", + "Epoch 14/20 Iteration 2363/3560 Training loss: 1.5902 0.0754 sec/batch\n", + "Epoch 14/20 Iteration 2364/3560 Training loss: 1.5907 0.0641 sec/batch\n", + "Epoch 14/20 
Iteration 2365/3560 Training loss: 1.5902 0.0634 sec/batch\n", + "Epoch 14/20 Iteration 2366/3560 Training loss: 1.5909 0.0624 sec/batch\n", + "Epoch 14/20 Iteration 2367/3560 Training loss: 1.5908 0.0699 sec/batch\n", + "Epoch 14/20 Iteration 2368/3560 Training loss: 1.5910 0.0611 sec/batch\n", + "Epoch 14/20 Iteration 2369/3560 Training loss: 1.5909 0.0603 sec/batch\n", + "Epoch 14/20 Iteration 2370/3560 Training loss: 1.5910 0.0606 sec/batch\n", + "Epoch 14/20 Iteration 2371/3560 Training loss: 1.5914 0.0638 sec/batch\n", + "Epoch 14/20 Iteration 2372/3560 Training loss: 1.5910 0.0672 sec/batch\n", + "Epoch 14/20 Iteration 2373/3560 Training loss: 1.5904 0.0607 sec/batch\n", + "Epoch 14/20 Iteration 2374/3560 Training loss: 1.5909 0.0596 sec/batch\n", + "Epoch 14/20 Iteration 2375/3560 Training loss: 1.5908 0.0610 sec/batch\n", + "Epoch 14/20 Iteration 2376/3560 Training loss: 1.5916 0.0632 sec/batch\n", + "Epoch 14/20 Iteration 2377/3560 Training loss: 1.5918 0.0599 sec/batch\n", + "Epoch 14/20 Iteration 2378/3560 Training loss: 1.5920 0.0647 sec/batch\n", + "Epoch 14/20 Iteration 2379/3560 Training loss: 1.5917 0.0619 sec/batch\n", + "Epoch 14/20 Iteration 2380/3560 Training loss: 1.5918 0.0668 sec/batch\n", + "Epoch 14/20 Iteration 2381/3560 Training loss: 1.5919 0.0607 sec/batch\n", + "Epoch 14/20 Iteration 2382/3560 Training loss: 1.5916 0.0686 sec/batch\n", + "Epoch 14/20 Iteration 2383/3560 Training loss: 1.5914 0.0651 sec/batch\n", + "Epoch 14/20 Iteration 2384/3560 Training loss: 1.5912 0.0654 sec/batch\n", + "Epoch 14/20 Iteration 2385/3560 Training loss: 1.5918 0.0593 sec/batch\n", + "Epoch 14/20 Iteration 2386/3560 Training loss: 1.5921 0.0644 sec/batch\n", + "Epoch 14/20 Iteration 2387/3560 Training loss: 1.5925 0.0643 sec/batch\n", + "Epoch 14/20 Iteration 2388/3560 Training loss: 1.5921 0.0629 sec/batch\n", + "Epoch 14/20 Iteration 2389/3560 Training loss: 1.5920 0.0607 sec/batch\n", + "Epoch 14/20 Iteration 2390/3560 Training loss: 1.5922 0.0725 
sec/batch\n", + "Epoch 14/20 Iteration 2391/3560 Training loss: 1.5921 0.0628 sec/batch\n", + "Epoch 14/20 Iteration 2392/3560 Training loss: 1.5920 0.0639 sec/batch\n", + "Epoch 14/20 Iteration 2393/3560 Training loss: 1.5913 0.0628 sec/batch\n", + "Epoch 14/20 Iteration 2394/3560 Training loss: 1.5912 0.0631 sec/batch\n", + "Epoch 14/20 Iteration 2395/3560 Training loss: 1.5905 0.0711 sec/batch\n", + "Epoch 14/20 Iteration 2396/3560 Training loss: 1.5907 0.0737 sec/batch\n", + "Epoch 14/20 Iteration 2397/3560 Training loss: 1.5902 0.0601 sec/batch\n", + "Epoch 14/20 Iteration 2398/3560 Training loss: 1.5902 0.0739 sec/batch\n", + "Epoch 14/20 Iteration 2399/3560 Training loss: 1.5897 0.0636 sec/batch\n", + "Epoch 14/20 Iteration 2400/3560 Training loss: 1.5893 0.0606 sec/batch\n", + "Epoch 14/20 Iteration 2401/3560 Training loss: 1.5890 0.0633 sec/batch\n", + "Epoch 14/20 Iteration 2402/3560 Training loss: 1.5887 0.0673 sec/batch\n", + "Epoch 14/20 Iteration 2403/3560 Training loss: 1.5882 0.0626 sec/batch\n", + "Epoch 14/20 Iteration 2404/3560 Training loss: 1.5883 0.0599 sec/batch\n", + "Epoch 14/20 Iteration 2405/3560 Training loss: 1.5880 0.0599 sec/batch\n", + "Epoch 14/20 Iteration 2406/3560 Training loss: 1.5878 0.0753 sec/batch\n", + "Epoch 14/20 Iteration 2407/3560 Training loss: 1.5874 0.0776 sec/batch\n", + "Epoch 14/20 Iteration 2408/3560 Training loss: 1.5871 0.0653 sec/batch\n", + "Epoch 14/20 Iteration 2409/3560 Training loss: 1.5868 0.0646 sec/batch\n", + "Epoch 14/20 Iteration 2410/3560 Training loss: 1.5868 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2411/3560 Training loss: 1.5867 0.0643 sec/batch\n", + "Epoch 14/20 Iteration 2412/3560 Training loss: 1.5863 0.0594 sec/batch\n", + "Epoch 14/20 Iteration 2413/3560 Training loss: 1.5860 0.0622 sec/batch\n", + "Epoch 14/20 Iteration 2414/3560 Training loss: 1.5856 0.0705 sec/batch\n", + "Epoch 14/20 Iteration 2415/3560 Training loss: 1.5855 0.0596 sec/batch\n", + "Epoch 14/20 Iteration 2416/3560 
Training loss: 1.5854 0.0633 sec/batch\n", + "Epoch 14/20 Iteration 2417/3560 Training loss: 1.5852 0.0669 sec/batch\n", + "Epoch 14/20 Iteration 2418/3560 Training loss: 1.5850 0.0635 sec/batch\n", + "Epoch 14/20 Iteration 2419/3560 Training loss: 1.5848 0.0602 sec/batch\n", + "Epoch 14/20 Iteration 2420/3560 Training loss: 1.5846 0.0720 sec/batch\n", + "Epoch 14/20 Iteration 2421/3560 Training loss: 1.5846 0.0684 sec/batch\n", + "Epoch 14/20 Iteration 2422/3560 Training loss: 1.5845 0.0612 sec/batch\n", + "Epoch 14/20 Iteration 2423/3560 Training loss: 1.5844 0.0603 sec/batch\n", + "Epoch 14/20 Iteration 2424/3560 Training loss: 1.5844 0.0648 sec/batch\n", + "Epoch 14/20 Iteration 2425/3560 Training loss: 1.5842 0.0681 sec/batch\n", + "Epoch 14/20 Iteration 2426/3560 Training loss: 1.5841 0.0665 sec/batch\n", + "Epoch 14/20 Iteration 2427/3560 Training loss: 1.5840 0.0611 sec/batch\n", + "Epoch 14/20 Iteration 2428/3560 Training loss: 1.5837 0.0591 sec/batch\n", + "Epoch 14/20 Iteration 2429/3560 Training loss: 1.5835 0.0653 sec/batch\n", + "Epoch 14/20 Iteration 2430/3560 Training loss: 1.5831 0.0690 sec/batch\n", + "Epoch 14/20 Iteration 2431/3560 Training loss: 1.5830 0.0595 sec/batch\n", + "Epoch 14/20 Iteration 2432/3560 Training loss: 1.5831 0.0609 sec/batch\n", + "Epoch 14/20 Iteration 2433/3560 Training loss: 1.5829 0.0593 sec/batch\n", + "Epoch 14/20 Iteration 2434/3560 Training loss: 1.5829 0.0749 sec/batch\n", + "Epoch 14/20 Iteration 2435/3560 Training loss: 1.5829 0.0600 sec/batch\n", + "Epoch 14/20 Iteration 2436/3560 Training loss: 1.5826 0.0695 sec/batch\n", + "Epoch 14/20 Iteration 2437/3560 Training loss: 1.5821 0.0643 sec/batch\n", + "Epoch 14/20 Iteration 2438/3560 Training loss: 1.5822 0.0607 sec/batch\n", + "Epoch 14/20 Iteration 2439/3560 Training loss: 1.5821 0.0597 sec/batch\n", + "Epoch 14/20 Iteration 2440/3560 Training loss: 1.5817 0.0712 sec/batch\n", + "Epoch 14/20 Iteration 2441/3560 Training loss: 1.5818 0.0644 sec/batch\n", + 
"Epoch 14/20 Iteration 2442/3560 Training loss: 1.5818 0.0616 sec/batch\n", + "Epoch 14/20 Iteration 2443/3560 Training loss: 1.5817 0.0613 sec/batch\n", + "Epoch 14/20 Iteration 2444/3560 Training loss: 1.5815 0.0637 sec/batch\n", + "Epoch 14/20 Iteration 2445/3560 Training loss: 1.5811 0.0690 sec/batch\n", + "Epoch 14/20 Iteration 2446/3560 Training loss: 1.5808 0.0676 sec/batch\n", + "Epoch 14/20 Iteration 2447/3560 Training loss: 1.5808 0.0799 sec/batch\n", + "Epoch 14/20 Iteration 2448/3560 Training loss: 1.5808 0.0621 sec/batch\n", + "Epoch 14/20 Iteration 2449/3560 Training loss: 1.5807 0.0640 sec/batch\n", + "Epoch 14/20 Iteration 2450/3560 Training loss: 1.5808 0.0618 sec/batch\n", + "Epoch 14/20 Iteration 2451/3560 Training loss: 1.5809 0.0716 sec/batch\n", + "Epoch 14/20 Iteration 2452/3560 Training loss: 1.5809 0.0599 sec/batch\n", + "Epoch 14/20 Iteration 2453/3560 Training loss: 1.5810 0.0642 sec/batch\n", + "Epoch 14/20 Iteration 2454/3560 Training loss: 1.5808 0.0824 sec/batch\n", + "Epoch 14/20 Iteration 2455/3560 Training loss: 1.5811 0.0717 sec/batch\n", + "Epoch 14/20 Iteration 2456/3560 Training loss: 1.5811 0.0667 sec/batch\n", + "Epoch 14/20 Iteration 2457/3560 Training loss: 1.5810 0.0623 sec/batch\n", + "Epoch 14/20 Iteration 2458/3560 Training loss: 1.5811 0.0659 sec/batch\n", + "Epoch 14/20 Iteration 2459/3560 Training loss: 1.5810 0.0623 sec/batch\n", + "Epoch 14/20 Iteration 2460/3560 Training loss: 1.5811 0.0790 sec/batch\n", + "Epoch 14/20 Iteration 2461/3560 Training loss: 1.5811 0.0783 sec/batch\n", + "Epoch 14/20 Iteration 2462/3560 Training loss: 1.5813 0.0673 sec/batch\n", + "Epoch 14/20 Iteration 2463/3560 Training loss: 1.5813 0.0682 sec/batch\n", + "Epoch 14/20 Iteration 2464/3560 Training loss: 1.5811 0.0692 sec/batch\n", + "Epoch 14/20 Iteration 2465/3560 Training loss: 1.5808 0.0628 sec/batch\n", + "Epoch 14/20 Iteration 2466/3560 Training loss: 1.5807 0.0624 sec/batch\n", + "Epoch 14/20 Iteration 2467/3560 Training loss: 
1.5809 0.0625 sec/batch\n", + "Epoch 14/20 Iteration 2468/3560 Training loss: 1.5809 0.0714 sec/batch\n", + "Epoch 14/20 Iteration 2469/3560 Training loss: 1.5808 0.0620 sec/batch\n", + "Epoch 14/20 Iteration 2470/3560 Training loss: 1.5808 0.0764 sec/batch\n", + "Epoch 14/20 Iteration 2471/3560 Training loss: 1.5808 0.0658 sec/batch\n", + "Epoch 14/20 Iteration 2472/3560 Training loss: 1.5808 0.0666 sec/batch\n", + "Epoch 14/20 Iteration 2473/3560 Training loss: 1.5805 0.0611 sec/batch\n", + "Epoch 14/20 Iteration 2474/3560 Training loss: 1.5805 0.0651 sec/batch\n", + "Epoch 14/20 Iteration 2475/3560 Training loss: 1.5807 0.0600 sec/batch\n", + "Epoch 14/20 Iteration 2476/3560 Training loss: 1.5806 0.0606 sec/batch\n", + "Epoch 14/20 Iteration 2477/3560 Training loss: 1.5807 0.0607 sec/batch\n", + "Epoch 14/20 Iteration 2478/3560 Training loss: 1.5806 0.0756 sec/batch\n", + "Epoch 14/20 Iteration 2479/3560 Training loss: 1.5806 0.0653 sec/batch\n", + "Epoch 14/20 Iteration 2480/3560 Training loss: 1.5806 0.0616 sec/batch\n", + "Epoch 14/20 Iteration 2481/3560 Training loss: 1.5807 0.0641 sec/batch\n", + "Epoch 14/20 Iteration 2482/3560 Training loss: 1.5810 0.0927 sec/batch\n", + "Epoch 14/20 Iteration 2483/3560 Training loss: 1.5810 0.0698 sec/batch\n", + "Epoch 14/20 Iteration 2484/3560 Training loss: 1.5809 0.0708 sec/batch\n", + "Epoch 14/20 Iteration 2485/3560 Training loss: 1.5808 0.0634 sec/batch\n", + "Epoch 14/20 Iteration 2486/3560 Training loss: 1.5806 0.0649 sec/batch\n", + "Epoch 14/20 Iteration 2487/3560 Training loss: 1.5807 0.0915 sec/batch\n", + "Epoch 14/20 Iteration 2488/3560 Training loss: 1.5807 0.0660 sec/batch\n", + "Epoch 14/20 Iteration 2489/3560 Training loss: 1.5807 0.0759 sec/batch\n", + "Epoch 14/20 Iteration 2490/3560 Training loss: 1.5806 0.0684 sec/batch\n", + "Epoch 14/20 Iteration 2491/3560 Training loss: 1.5804 0.0615 sec/batch\n", + "Epoch 14/20 Iteration 2492/3560 Training loss: 1.5805 0.0667 sec/batch\n", + "Epoch 15/20 
Iteration 2493/3560 Training loss: 1.7055 0.0642 sec/batch\n", + "Epoch 15/20 Iteration 2494/3560 Training loss: 1.6526 0.0600 sec/batch\n", + "Epoch 15/20 Iteration 2495/3560 Training loss: 1.6316 0.0779 sec/batch\n", + "Epoch 15/20 Iteration 2496/3560 Training loss: 1.6199 0.0671 sec/batch\n", + "Epoch 15/20 Iteration 2497/3560 Training loss: 1.6106 0.0605 sec/batch\n", + "Epoch 15/20 Iteration 2498/3560 Training loss: 1.5968 0.0609 sec/batch\n", + "Epoch 15/20 Iteration 2499/3560 Training loss: 1.5951 0.0737 sec/batch\n", + "Epoch 15/20 Iteration 2500/3560 Training loss: 1.5916 0.0620 sec/batch\n", + "Epoch 15/20 Iteration 2501/3560 Training loss: 1.5919 0.0634 sec/batch\n", + "Epoch 15/20 Iteration 2502/3560 Training loss: 1.5896 0.0609 sec/batch\n", + "Epoch 15/20 Iteration 2503/3560 Training loss: 1.5854 0.0634 sec/batch\n", + "Epoch 15/20 Iteration 2504/3560 Training loss: 1.5833 0.0602 sec/batch\n", + "Epoch 15/20 Iteration 2505/3560 Training loss: 1.5829 0.0605 sec/batch\n", + "Epoch 15/20 Iteration 2506/3560 Training loss: 1.5844 0.0608 sec/batch\n", + "Epoch 15/20 Iteration 2507/3560 Training loss: 1.5830 0.0652 sec/batch\n", + "Epoch 15/20 Iteration 2508/3560 Training loss: 1.5809 0.0663 sec/batch\n", + "Epoch 15/20 Iteration 2509/3560 Training loss: 1.5803 0.0630 sec/batch\n", + "Epoch 15/20 Iteration 2510/3560 Training loss: 1.5818 0.0619 sec/batch\n", + "Epoch 15/20 Iteration 2511/3560 Training loss: 1.5812 0.0643 sec/batch\n", + "Epoch 15/20 Iteration 2512/3560 Training loss: 1.5820 0.0677 sec/batch\n", + "Epoch 15/20 Iteration 2513/3560 Training loss: 1.5806 0.0622 sec/batch\n", + "Epoch 15/20 Iteration 2514/3560 Training loss: 1.5803 0.0635 sec/batch\n", + "Epoch 15/20 Iteration 2515/3560 Training loss: 1.5795 0.0647 sec/batch\n", + "Epoch 15/20 Iteration 2516/3560 Training loss: 1.5788 0.0608 sec/batch\n", + "Epoch 15/20 Iteration 2517/3560 Training loss: 1.5781 0.0590 sec/batch\n", + "Epoch 15/20 Iteration 2518/3560 Training loss: 1.5763 0.0616 
sec/batch\n", + "Epoch 15/20 Iteration 2519/3560 Training loss: 1.5748 0.0693 sec/batch\n", + "Epoch 15/20 Iteration 2520/3560 Training loss: 1.5750 0.0664 sec/batch\n", + "Epoch 15/20 Iteration 2521/3560 Training loss: 1.5750 0.0598 sec/batch\n", + "Epoch 15/20 Iteration 2522/3560 Training loss: 1.5749 0.0596 sec/batch\n", + "Epoch 15/20 Iteration 2523/3560 Training loss: 1.5744 0.0655 sec/batch\n", + "Epoch 15/20 Iteration 2524/3560 Training loss: 1.5735 0.0684 sec/batch\n", + "Epoch 15/20 Iteration 2525/3560 Training loss: 1.5737 0.0640 sec/batch\n", + "Epoch 15/20 Iteration 2526/3560 Training loss: 1.5741 0.0709 sec/batch\n", + "Epoch 15/20 Iteration 2527/3560 Training loss: 1.5739 0.0783 sec/batch\n", + "Epoch 15/20 Iteration 2528/3560 Training loss: 1.5735 0.0701 sec/batch\n", + "Epoch 15/20 Iteration 2529/3560 Training loss: 1.5729 0.0752 sec/batch\n", + "Epoch 15/20 Iteration 2530/3560 Training loss: 1.5714 0.0669 sec/batch\n", + "Epoch 15/20 Iteration 2531/3560 Training loss: 1.5701 0.0644 sec/batch\n", + "Epoch 15/20 Iteration 2532/3560 Training loss: 1.5696 0.0629 sec/batch\n", + "Epoch 15/20 Iteration 2533/3560 Training loss: 1.5690 0.0669 sec/batch\n", + "Epoch 15/20 Iteration 2534/3560 Training loss: 1.5694 0.0634 sec/batch\n", + "Epoch 15/20 Iteration 2535/3560 Training loss: 1.5688 0.0657 sec/batch\n", + "Epoch 15/20 Iteration 2536/3560 Training loss: 1.5678 0.0636 sec/batch\n", + "Epoch 15/20 Iteration 2537/3560 Training loss: 1.5682 0.0703 sec/batch\n", + "Epoch 15/20 Iteration 2538/3560 Training loss: 1.5670 0.0639 sec/batch\n", + "Epoch 15/20 Iteration 2539/3560 Training loss: 1.5665 0.0604 sec/batch\n", + "Epoch 15/20 Iteration 2540/3560 Training loss: 1.5661 0.0611 sec/batch\n", + "Epoch 15/20 Iteration 2541/3560 Training loss: 1.5660 0.0656 sec/batch\n", + "Epoch 15/20 Iteration 2542/3560 Training loss: 1.5665 0.0662 sec/batch\n", + "Epoch 15/20 Iteration 2543/3560 Training loss: 1.5660 0.0599 sec/batch\n", + "Epoch 15/20 Iteration 2544/3560 
Training loss: 1.5668 0.0618 sec/batch\n", + "Epoch 15/20 Iteration 2545/3560 Training loss: 1.5666 0.0667 sec/batch\n", + "Epoch 15/20 Iteration 2546/3560 Training loss: 1.5667 0.0669 sec/batch\n", + "Epoch 15/20 Iteration 2547/3560 Training loss: 1.5664 0.0626 sec/batch\n", + "Epoch 15/20 Iteration 2548/3560 Training loss: 1.5665 0.0634 sec/batch\n", + "Epoch 15/20 Iteration 2549/3560 Training loss: 1.5668 0.0754 sec/batch\n", + "Epoch 15/20 Iteration 2550/3560 Training loss: 1.5665 0.0610 sec/batch\n", + "Epoch 15/20 Iteration 2551/3560 Training loss: 1.5658 0.0616 sec/batch\n", + "Epoch 15/20 Iteration 2552/3560 Training loss: 1.5663 0.0640 sec/batch\n", + "Epoch 15/20 Iteration 2553/3560 Training loss: 1.5661 0.0636 sec/batch\n", + "Epoch 15/20 Iteration 2554/3560 Training loss: 1.5669 0.0618 sec/batch\n", + "Epoch 15/20 Iteration 2555/3560 Training loss: 1.5672 0.0597 sec/batch\n", + "Epoch 15/20 Iteration 2556/3560 Training loss: 1.5675 0.0608 sec/batch\n", + "Epoch 15/20 Iteration 2557/3560 Training loss: 1.5672 0.0620 sec/batch\n", + "Epoch 15/20 Iteration 2558/3560 Training loss: 1.5672 0.0634 sec/batch\n", + "Epoch 15/20 Iteration 2559/3560 Training loss: 1.5673 0.0608 sec/batch\n", + "Epoch 15/20 Iteration 2560/3560 Training loss: 1.5671 0.0605 sec/batch\n", + "Epoch 15/20 Iteration 2561/3560 Training loss: 1.5670 0.0691 sec/batch\n", + "Epoch 15/20 Iteration 2562/3560 Training loss: 1.5667 0.0617 sec/batch\n", + "Epoch 15/20 Iteration 2563/3560 Training loss: 1.5675 0.0663 sec/batch\n", + "Epoch 15/20 Iteration 2564/3560 Training loss: 1.5676 0.0656 sec/batch\n", + "Epoch 15/20 Iteration 2565/3560 Training loss: 1.5680 0.0633 sec/batch\n", + "Epoch 15/20 Iteration 2566/3560 Training loss: 1.5677 0.0669 sec/batch\n", + "Epoch 15/20 Iteration 2567/3560 Training loss: 1.5675 0.0641 sec/batch\n", + "Epoch 15/20 Iteration 2568/3560 Training loss: 1.5677 0.0599 sec/batch\n", + "Epoch 15/20 Iteration 2569/3560 Training loss: 1.5674 0.0693 sec/batch\n", + 
"Epoch 15/20 Iteration 2570/3560 Training loss: 1.5674 0.0728 sec/batch\n", + "[... per-iteration training log elided: loss decreases steadily from 1.5674 (epoch 15, iteration 2570) to roughly 1.496 (epoch 18, iteration 3106), at about 0.06 sec/batch throughout ...]\n", + "Epoch 18/20 Iteration 3107/3560 Training loss: 
1.4951 0.0605 sec/batch\n", + "Epoch 18/20 Iteration 3108/3560 Training loss: 1.4950 0.0606 sec/batch\n", + "Epoch 18/20 Iteration 3109/3560 Training loss: 1.4945 0.0595 sec/batch\n", + "Epoch 18/20 Iteration 3110/3560 Training loss: 1.4945 0.0655 sec/batch\n", + "Epoch 18/20 Iteration 3111/3560 Training loss: 1.4940 0.0659 sec/batch\n", + "Epoch 18/20 Iteration 3112/3560 Training loss: 1.4937 0.0612 sec/batch\n", + "Epoch 18/20 Iteration 3113/3560 Training loss: 1.4933 0.0644 sec/batch\n", + "Epoch 18/20 Iteration 3114/3560 Training loss: 1.4931 0.0724 sec/batch\n", + "Epoch 18/20 Iteration 3115/3560 Training loss: 1.4926 0.0652 sec/batch\n", + "Epoch 18/20 Iteration 3116/3560 Training loss: 1.4927 0.0638 sec/batch\n", + "Epoch 18/20 Iteration 3117/3560 Training loss: 1.4924 0.0634 sec/batch\n", + "Epoch 18/20 Iteration 3118/3560 Training loss: 1.4922 0.0633 sec/batch\n", + "Epoch 18/20 Iteration 3119/3560 Training loss: 1.4917 0.0628 sec/batch\n", + "Epoch 18/20 Iteration 3120/3560 Training loss: 1.4913 0.0627 sec/batch\n", + "Epoch 18/20 Iteration 3121/3560 Training loss: 1.4910 0.0612 sec/batch\n", + "Epoch 18/20 Iteration 3122/3560 Training loss: 1.4910 0.0646 sec/batch\n", + "Epoch 18/20 Iteration 3123/3560 Training loss: 1.4909 0.0727 sec/batch\n", + "Epoch 18/20 Iteration 3124/3560 Training loss: 1.4905 0.0619 sec/batch\n", + "Epoch 18/20 Iteration 3125/3560 Training loss: 1.4902 0.0602 sec/batch\n", + "Epoch 18/20 Iteration 3126/3560 Training loss: 1.4898 0.0619 sec/batch\n", + "Epoch 18/20 Iteration 3127/3560 Training loss: 1.4898 0.0611 sec/batch\n", + "Epoch 18/20 Iteration 3128/3560 Training loss: 1.4897 0.0649 sec/batch\n", + "Epoch 18/20 Iteration 3129/3560 Training loss: 1.4896 0.0597 sec/batch\n", + "Epoch 18/20 Iteration 3130/3560 Training loss: 1.4893 0.0615 sec/batch\n", + "Epoch 18/20 Iteration 3131/3560 Training loss: 1.4891 0.0624 sec/batch\n", + "Epoch 18/20 Iteration 3132/3560 Training loss: 1.4889 0.0606 sec/batch\n", + "Epoch 18/20 
Iteration 3133/3560 Training loss: 1.4888 0.0671 sec/batch\n", + "Epoch 18/20 Iteration 3134/3560 Training loss: 1.4887 0.0649 sec/batch\n", + "Epoch 18/20 Iteration 3135/3560 Training loss: 1.4886 0.0601 sec/batch\n", + "Epoch 18/20 Iteration 3136/3560 Training loss: 1.4886 0.0597 sec/batch\n", + "Epoch 18/20 Iteration 3137/3560 Training loss: 1.4884 0.0645 sec/batch\n", + "Epoch 18/20 Iteration 3138/3560 Training loss: 1.4884 0.0631 sec/batch\n", + "Epoch 18/20 Iteration 3139/3560 Training loss: 1.4882 0.0614 sec/batch\n", + "Epoch 18/20 Iteration 3140/3560 Training loss: 1.4880 0.0611 sec/batch\n", + "Epoch 18/20 Iteration 3141/3560 Training loss: 1.4877 0.0685 sec/batch\n", + "Epoch 18/20 Iteration 3142/3560 Training loss: 1.4873 0.0623 sec/batch\n", + "Epoch 18/20 Iteration 3143/3560 Training loss: 1.4873 0.0624 sec/batch\n", + "Epoch 18/20 Iteration 3144/3560 Training loss: 1.4873 0.0626 sec/batch\n", + "Epoch 18/20 Iteration 3145/3560 Training loss: 1.4871 0.0656 sec/batch\n", + "Epoch 18/20 Iteration 3146/3560 Training loss: 1.4871 0.0685 sec/batch\n", + "Epoch 18/20 Iteration 3147/3560 Training loss: 1.4870 0.0635 sec/batch\n", + "Epoch 18/20 Iteration 3148/3560 Training loss: 1.4867 0.0628 sec/batch\n", + "Epoch 18/20 Iteration 3149/3560 Training loss: 1.4862 0.0600 sec/batch\n", + "Epoch 18/20 Iteration 3150/3560 Training loss: 1.4862 0.0629 sec/batch\n", + "Epoch 18/20 Iteration 3151/3560 Training loss: 1.4861 0.0687 sec/batch\n", + "Epoch 18/20 Iteration 3152/3560 Training loss: 1.4858 0.0601 sec/batch\n", + "Epoch 18/20 Iteration 3153/3560 Training loss: 1.4858 0.0598 sec/batch\n", + "Epoch 18/20 Iteration 3154/3560 Training loss: 1.4858 0.0625 sec/batch\n", + "Epoch 18/20 Iteration 3155/3560 Training loss: 1.4857 0.0683 sec/batch\n", + "Epoch 18/20 Iteration 3156/3560 Training loss: 1.4854 0.0659 sec/batch\n", + "Epoch 18/20 Iteration 3157/3560 Training loss: 1.4851 0.0611 sec/batch\n", + "Epoch 18/20 Iteration 3158/3560 Training loss: 1.4848 0.0637 
sec/batch\n", + "Epoch 18/20 Iteration 3159/3560 Training loss: 1.4848 0.0604 sec/batch\n", + "Epoch 18/20 Iteration 3160/3560 Training loss: 1.4847 0.0602 sec/batch\n", + "Epoch 18/20 Iteration 3161/3560 Training loss: 1.4847 0.0622 sec/batch\n", + "Epoch 18/20 Iteration 3162/3560 Training loss: 1.4847 0.0610 sec/batch\n", + "Epoch 18/20 Iteration 3163/3560 Training loss: 1.4848 0.0614 sec/batch\n", + "Epoch 18/20 Iteration 3164/3560 Training loss: 1.4849 0.0641 sec/batch\n", + "Epoch 18/20 Iteration 3165/3560 Training loss: 1.4850 0.0637 sec/batch\n", + "Epoch 18/20 Iteration 3166/3560 Training loss: 1.4849 0.0610 sec/batch\n", + "Epoch 18/20 Iteration 3167/3560 Training loss: 1.4853 0.0630 sec/batch\n", + "Epoch 18/20 Iteration 3168/3560 Training loss: 1.4852 0.0605 sec/batch\n", + "Epoch 18/20 Iteration 3169/3560 Training loss: 1.4851 0.0606 sec/batch\n", + "Epoch 18/20 Iteration 3170/3560 Training loss: 1.4853 0.0695 sec/batch\n", + "Epoch 18/20 Iteration 3171/3560 Training loss: 1.4852 0.0610 sec/batch\n", + "Epoch 18/20 Iteration 3172/3560 Training loss: 1.4853 0.0629 sec/batch\n", + "Epoch 18/20 Iteration 3173/3560 Training loss: 1.4853 0.0606 sec/batch\n", + "Epoch 18/20 Iteration 3174/3560 Training loss: 1.4855 0.0725 sec/batch\n", + "Epoch 18/20 Iteration 3175/3560 Training loss: 1.4855 0.0619 sec/batch\n", + "Epoch 18/20 Iteration 3176/3560 Training loss: 1.4854 0.0607 sec/batch\n", + "Epoch 18/20 Iteration 3177/3560 Training loss: 1.4851 0.0622 sec/batch\n", + "Epoch 18/20 Iteration 3178/3560 Training loss: 1.4850 0.0665 sec/batch\n", + "Epoch 18/20 Iteration 3179/3560 Training loss: 1.4851 0.0602 sec/batch\n", + "Epoch 18/20 Iteration 3180/3560 Training loss: 1.4851 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3181/3560 Training loss: 1.4851 0.0666 sec/batch\n", + "Epoch 18/20 Iteration 3182/3560 Training loss: 1.4850 0.0723 sec/batch\n", + "Epoch 18/20 Iteration 3183/3560 Training loss: 1.4851 0.0601 sec/batch\n", + "Epoch 18/20 Iteration 3184/3560 
Training loss: 1.4850 0.0598 sec/batch\n", + "Epoch 18/20 Iteration 3185/3560 Training loss: 1.4848 0.0625 sec/batch\n", + "Epoch 18/20 Iteration 3186/3560 Training loss: 1.4849 0.0673 sec/batch\n", + "Epoch 18/20 Iteration 3187/3560 Training loss: 1.4850 0.0680 sec/batch\n", + "Epoch 18/20 Iteration 3188/3560 Training loss: 1.4849 0.0622 sec/batch\n", + "Epoch 18/20 Iteration 3189/3560 Training loss: 1.4850 0.0623 sec/batch\n", + "Epoch 18/20 Iteration 3190/3560 Training loss: 1.4849 0.0765 sec/batch\n", + "Epoch 18/20 Iteration 3191/3560 Training loss: 1.4849 0.0606 sec/batch\n", + "Epoch 18/20 Iteration 3192/3560 Training loss: 1.4849 0.0602 sec/batch\n", + "Epoch 18/20 Iteration 3193/3560 Training loss: 1.4850 0.0599 sec/batch\n", + "Epoch 18/20 Iteration 3194/3560 Training loss: 1.4854 0.0694 sec/batch\n", + "Epoch 18/20 Iteration 3195/3560 Training loss: 1.4854 0.0682 sec/batch\n", + "Epoch 18/20 Iteration 3196/3560 Training loss: 1.4853 0.0599 sec/batch\n", + "Epoch 18/20 Iteration 3197/3560 Training loss: 1.4852 0.0628 sec/batch\n", + "Epoch 18/20 Iteration 3198/3560 Training loss: 1.4851 0.0700 sec/batch\n", + "Epoch 18/20 Iteration 3199/3560 Training loss: 1.4852 0.0628 sec/batch\n", + "Epoch 18/20 Iteration 3200/3560 Training loss: 1.4853 0.0598 sec/batch\n", + "Epoch 18/20 Iteration 3201/3560 Training loss: 1.4853 0.0598 sec/batch\n", + "Epoch 18/20 Iteration 3202/3560 Training loss: 1.4851 0.0696 sec/batch\n", + "Epoch 18/20 Iteration 3203/3560 Training loss: 1.4850 0.0624 sec/batch\n", + "Epoch 18/20 Iteration 3204/3560 Training loss: 1.4851 0.0611 sec/batch\n", + "Epoch 19/20 Iteration 3205/3560 Training loss: 1.6165 0.0591 sec/batch\n", + "Epoch 19/20 Iteration 3206/3560 Training loss: 1.5597 0.0607 sec/batch\n", + "Epoch 19/20 Iteration 3207/3560 Training loss: 1.5366 0.0644 sec/batch\n", + "Epoch 19/20 Iteration 3208/3560 Training loss: 1.5272 0.0600 sec/batch\n", + "Epoch 19/20 Iteration 3209/3560 Training loss: 1.5153 0.0603 sec/batch\n", + 
"Epoch 19/20 Iteration 3210/3560 Training loss: 1.5031 0.0618 sec/batch\n", + "Epoch 19/20 Iteration 3211/3560 Training loss: 1.5023 0.0664 sec/batch\n", + "Epoch 19/20 Iteration 3212/3560 Training loss: 1.4992 0.0605 sec/batch\n", + "Epoch 19/20 Iteration 3213/3560 Training loss: 1.4974 0.0617 sec/batch\n", + "Epoch 19/20 Iteration 3214/3560 Training loss: 1.4953 0.0670 sec/batch\n", + "Epoch 19/20 Iteration 3215/3560 Training loss: 1.4911 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3216/3560 Training loss: 1.4896 0.0596 sec/batch\n", + "Epoch 19/20 Iteration 3217/3560 Training loss: 1.4890 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3218/3560 Training loss: 1.4902 0.0736 sec/batch\n", + "Epoch 19/20 Iteration 3219/3560 Training loss: 1.4884 0.0609 sec/batch\n", + "Epoch 19/20 Iteration 3220/3560 Training loss: 1.4861 0.0607 sec/batch\n", + "Epoch 19/20 Iteration 3221/3560 Training loss: 1.4862 0.0597 sec/batch\n", + "Epoch 19/20 Iteration 3222/3560 Training loss: 1.4875 0.0649 sec/batch\n", + "Epoch 19/20 Iteration 3223/3560 Training loss: 1.4876 0.0610 sec/batch\n", + "Epoch 19/20 Iteration 3224/3560 Training loss: 1.4889 0.0605 sec/batch\n", + "Epoch 19/20 Iteration 3225/3560 Training loss: 1.4876 0.0613 sec/batch\n", + "Epoch 19/20 Iteration 3226/3560 Training loss: 1.4875 0.0795 sec/batch\n", + "Epoch 19/20 Iteration 3227/3560 Training loss: 1.4864 0.0615 sec/batch\n", + "Epoch 19/20 Iteration 3228/3560 Training loss: 1.4861 0.0670 sec/batch\n", + "Epoch 19/20 Iteration 3229/3560 Training loss: 1.4861 0.0622 sec/batch\n", + "Epoch 19/20 Iteration 3230/3560 Training loss: 1.4842 0.0714 sec/batch\n", + "Epoch 19/20 Iteration 3231/3560 Training loss: 1.4826 0.0592 sec/batch\n", + "Epoch 19/20 Iteration 3232/3560 Training loss: 1.4826 0.0651 sec/batch\n", + "Epoch 19/20 Iteration 3233/3560 Training loss: 1.4826 0.0666 sec/batch\n", + "Epoch 19/20 Iteration 3234/3560 Training loss: 1.4824 0.0664 sec/batch\n", + "Epoch 19/20 Iteration 3235/3560 Training loss: 
1.4819 0.0604 sec/batch\n", + "Epoch 19/20 Iteration 3236/3560 Training loss: 1.4809 0.0601 sec/batch\n", + "Epoch 19/20 Iteration 3237/3560 Training loss: 1.4809 0.0612 sec/batch\n", + "Epoch 19/20 Iteration 3238/3560 Training loss: 1.4812 0.0620 sec/batch\n", + "Epoch 19/20 Iteration 3239/3560 Training loss: 1.4808 0.0608 sec/batch\n", + "Epoch 19/20 Iteration 3240/3560 Training loss: 1.4806 0.0625 sec/batch\n", + "Epoch 19/20 Iteration 3241/3560 Training loss: 1.4797 0.0609 sec/batch\n", + "Epoch 19/20 Iteration 3242/3560 Training loss: 1.4781 0.0610 sec/batch\n", + "Epoch 19/20 Iteration 3243/3560 Training loss: 1.4767 0.0650 sec/batch\n", + "Epoch 19/20 Iteration 3244/3560 Training loss: 1.4761 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3245/3560 Training loss: 1.4755 0.0725 sec/batch\n", + "Epoch 19/20 Iteration 3246/3560 Training loss: 1.4761 0.0615 sec/batch\n", + "Epoch 19/20 Iteration 3247/3560 Training loss: 1.4757 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3248/3560 Training loss: 1.4748 0.0610 sec/batch\n", + "Epoch 19/20 Iteration 3249/3560 Training loss: 1.4750 0.0675 sec/batch\n", + "Epoch 19/20 Iteration 3250/3560 Training loss: 1.4741 0.0744 sec/batch\n", + "Epoch 19/20 Iteration 3251/3560 Training loss: 1.4737 0.0625 sec/batch\n", + "Epoch 19/20 Iteration 3252/3560 Training loss: 1.4731 0.0610 sec/batch\n", + "Epoch 19/20 Iteration 3253/3560 Training loss: 1.4730 0.0893 sec/batch\n", + "Epoch 19/20 Iteration 3254/3560 Training loss: 1.4735 0.0600 sec/batch\n", + "Epoch 19/20 Iteration 3255/3560 Training loss: 1.4728 0.0666 sec/batch\n", + "Epoch 19/20 Iteration 3256/3560 Training loss: 1.4736 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3257/3560 Training loss: 1.4735 0.0644 sec/batch\n", + "Epoch 19/20 Iteration 3258/3560 Training loss: 1.4736 0.0624 sec/batch\n", + "Epoch 19/20 Iteration 3259/3560 Training loss: 1.4734 0.0664 sec/batch\n", + "Epoch 19/20 Iteration 3260/3560 Training loss: 1.4734 0.0655 sec/batch\n", + "Epoch 19/20 
Iteration 3261/3560 Training loss: 1.4739 0.0588 sec/batch\n", + "Epoch 19/20 Iteration 3262/3560 Training loss: 1.4735 0.0635 sec/batch\n", + "Epoch 19/20 Iteration 3263/3560 Training loss: 1.4729 0.0686 sec/batch\n", + "Epoch 19/20 Iteration 3264/3560 Training loss: 1.4734 0.0615 sec/batch\n", + "Epoch 19/20 Iteration 3265/3560 Training loss: 1.4734 0.0695 sec/batch\n", + "Epoch 19/20 Iteration 3266/3560 Training loss: 1.4742 0.0636 sec/batch\n", + "Epoch 19/20 Iteration 3267/3560 Training loss: 1.4743 0.0698 sec/batch\n", + "Epoch 19/20 Iteration 3268/3560 Training loss: 1.4745 0.0608 sec/batch\n", + "Epoch 19/20 Iteration 3269/3560 Training loss: 1.4743 0.0596 sec/batch\n", + "Epoch 19/20 Iteration 3270/3560 Training loss: 1.4745 0.0657 sec/batch\n", + "Epoch 19/20 Iteration 3271/3560 Training loss: 1.4746 0.0599 sec/batch\n", + "Epoch 19/20 Iteration 3272/3560 Training loss: 1.4743 0.0687 sec/batch\n", + "Epoch 19/20 Iteration 3273/3560 Training loss: 1.4742 0.0693 sec/batch\n", + "Epoch 19/20 Iteration 3274/3560 Training loss: 1.4739 0.0661 sec/batch\n", + "Epoch 19/20 Iteration 3275/3560 Training loss: 1.4746 0.0627 sec/batch\n", + "Epoch 19/20 Iteration 3276/3560 Training loss: 1.4750 0.0643 sec/batch\n", + "Epoch 19/20 Iteration 3277/3560 Training loss: 1.4754 0.0732 sec/batch\n", + "Epoch 19/20 Iteration 3278/3560 Training loss: 1.4752 0.0701 sec/batch\n", + "Epoch 19/20 Iteration 3279/3560 Training loss: 1.4750 0.0627 sec/batch\n", + "Epoch 19/20 Iteration 3280/3560 Training loss: 1.4753 0.0641 sec/batch\n", + "Epoch 19/20 Iteration 3281/3560 Training loss: 1.4752 0.0678 sec/batch\n", + "Epoch 19/20 Iteration 3282/3560 Training loss: 1.4751 0.0715 sec/batch\n", + "Epoch 19/20 Iteration 3283/3560 Training loss: 1.4744 0.0597 sec/batch\n", + "Epoch 19/20 Iteration 3284/3560 Training loss: 1.4744 0.0650 sec/batch\n", + "Epoch 19/20 Iteration 3285/3560 Training loss: 1.4737 0.0700 sec/batch\n", + "Epoch 19/20 Iteration 3286/3560 Training loss: 1.4738 0.0597 
sec/batch\n", + "Epoch 19/20 Iteration 3287/3560 Training loss: 1.4733 0.0630 sec/batch\n", + "Epoch 19/20 Iteration 3288/3560 Training loss: 1.4734 0.0601 sec/batch\n", + "Epoch 19/20 Iteration 3289/3560 Training loss: 1.4729 0.0684 sec/batch\n", + "Epoch 19/20 Iteration 3290/3560 Training loss: 1.4727 0.0625 sec/batch\n", + "Epoch 19/20 Iteration 3291/3560 Training loss: 1.4723 0.0626 sec/batch\n", + "Epoch 19/20 Iteration 3292/3560 Training loss: 1.4720 0.0606 sec/batch\n", + "Epoch 19/20 Iteration 3293/3560 Training loss: 1.4717 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3294/3560 Training loss: 1.4717 0.0641 sec/batch\n", + "Epoch 19/20 Iteration 3295/3560 Training loss: 1.4715 0.0594 sec/batch\n", + "Epoch 19/20 Iteration 3296/3560 Training loss: 1.4714 0.0598 sec/batch\n", + "Epoch 19/20 Iteration 3297/3560 Training loss: 1.4710 0.0671 sec/batch\n", + "Epoch 19/20 Iteration 3298/3560 Training loss: 1.4706 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3299/3560 Training loss: 1.4703 0.0617 sec/batch\n", + "Epoch 19/20 Iteration 3300/3560 Training loss: 1.4703 0.0616 sec/batch\n", + "Epoch 19/20 Iteration 3301/3560 Training loss: 1.4703 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3302/3560 Training loss: 1.4699 0.0747 sec/batch\n", + "Epoch 19/20 Iteration 3303/3560 Training loss: 1.4697 0.0619 sec/batch\n", + "Epoch 19/20 Iteration 3304/3560 Training loss: 1.4692 0.0620 sec/batch\n", + "Epoch 19/20 Iteration 3305/3560 Training loss: 1.4693 0.0618 sec/batch\n", + "Epoch 19/20 Iteration 3306/3560 Training loss: 1.4692 0.0732 sec/batch\n", + "Epoch 19/20 Iteration 3307/3560 Training loss: 1.4690 0.0600 sec/batch\n", + "Epoch 19/20 Iteration 3308/3560 Training loss: 1.4688 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3309/3560 Training loss: 1.4686 0.0624 sec/batch\n", + "Epoch 19/20 Iteration 3310/3560 Training loss: 1.4685 0.0643 sec/batch\n", + "Epoch 19/20 Iteration 3311/3560 Training loss: 1.4684 0.0606 sec/batch\n", + "Epoch 19/20 Iteration 3312/3560 
Training loss: 1.4683 0.0640 sec/batch\n", + "Epoch 19/20 Iteration 3313/3560 Training loss: 1.4681 0.0602 sec/batch\n", + "Epoch 19/20 Iteration 3314/3560 Training loss: 1.4681 0.0659 sec/batch\n", + "Epoch 19/20 Iteration 3315/3560 Training loss: 1.4679 0.0653 sec/batch\n", + "Epoch 19/20 Iteration 3316/3560 Training loss: 1.4678 0.0672 sec/batch\n", + "Epoch 19/20 Iteration 3317/3560 Training loss: 1.4676 0.0645 sec/batch\n", + "Epoch 19/20 Iteration 3318/3560 Training loss: 1.4674 0.0604 sec/batch\n", + "Epoch 19/20 Iteration 3319/3560 Training loss: 1.4671 0.0590 sec/batch\n", + "Epoch 19/20 Iteration 3320/3560 Training loss: 1.4667 0.0646 sec/batch\n", + "Epoch 19/20 Iteration 3321/3560 Training loss: 1.4666 0.0797 sec/batch\n", + "Epoch 19/20 Iteration 3322/3560 Training loss: 1.4666 0.0628 sec/batch\n", + "Epoch 19/20 Iteration 3323/3560 Training loss: 1.4664 0.0662 sec/batch\n", + "Epoch 19/20 Iteration 3324/3560 Training loss: 1.4663 0.0658 sec/batch\n", + "Epoch 19/20 Iteration 3325/3560 Training loss: 1.4663 0.0636 sec/batch\n", + "Epoch 19/20 Iteration 3326/3560 Training loss: 1.4659 0.0609 sec/batch\n", + "Epoch 19/20 Iteration 3327/3560 Training loss: 1.4655 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3328/3560 Training loss: 1.4655 0.0734 sec/batch\n", + "Epoch 19/20 Iteration 3329/3560 Training loss: 1.4654 0.0609 sec/batch\n", + "Epoch 19/20 Iteration 3330/3560 Training loss: 1.4651 0.0600 sec/batch\n", + "Epoch 19/20 Iteration 3331/3560 Training loss: 1.4652 0.0675 sec/batch\n", + "Epoch 19/20 Iteration 3332/3560 Training loss: 1.4653 0.0608 sec/batch\n", + "Epoch 19/20 Iteration 3333/3560 Training loss: 1.4651 0.0625 sec/batch\n", + "Epoch 19/20 Iteration 3334/3560 Training loss: 1.4648 0.0675 sec/batch\n", + "Epoch 19/20 Iteration 3335/3560 Training loss: 1.4645 0.0678 sec/batch\n", + "Epoch 19/20 Iteration 3336/3560 Training loss: 1.4642 0.0654 sec/batch\n", + "Epoch 19/20 Iteration 3337/3560 Training loss: 1.4643 0.0607 sec/batch\n", + 
"Epoch 19/20 Iteration 3338/3560 Training loss: 1.4643 0.0597 sec/batch\n", + "Epoch 19/20 Iteration 3339/3560 Training loss: 1.4643 0.0737 sec/batch\n", + "Epoch 19/20 Iteration 3340/3560 Training loss: 1.4644 0.0609 sec/batch\n", + "Epoch 19/20 Iteration 3341/3560 Training loss: 1.4645 0.0603 sec/batch\n", + "Epoch 19/20 Iteration 3342/3560 Training loss: 1.4646 0.0612 sec/batch\n", + "Epoch 19/20 Iteration 3343/3560 Training loss: 1.4646 0.0811 sec/batch\n", + "Epoch 19/20 Iteration 3344/3560 Training loss: 1.4646 0.0619 sec/batch\n", + "Epoch 19/20 Iteration 3345/3560 Training loss: 1.4649 0.0652 sec/batch\n", + "Epoch 19/20 Iteration 3346/3560 Training loss: 1.4648 0.0650 sec/batch\n", + "Epoch 19/20 Iteration 3347/3560 Training loss: 1.4647 0.0763 sec/batch\n", + "Epoch 19/20 Iteration 3348/3560 Training loss: 1.4649 0.0632 sec/batch\n", + "Epoch 19/20 Iteration 3349/3560 Training loss: 1.4647 0.0600 sec/batch\n", + "Epoch 19/20 Iteration 3350/3560 Training loss: 1.4649 0.0623 sec/batch\n", + "Epoch 19/20 Iteration 3351/3560 Training loss: 1.4650 0.0741 sec/batch\n", + "Epoch 19/20 Iteration 3352/3560 Training loss: 1.4652 0.0601 sec/batch\n", + "Epoch 19/20 Iteration 3353/3560 Training loss: 1.4654 0.0610 sec/batch\n", + "Epoch 19/20 Iteration 3354/3560 Training loss: 1.4653 0.0633 sec/batch\n", + "Epoch 19/20 Iteration 3355/3560 Training loss: 1.4650 0.0661 sec/batch\n", + "Epoch 19/20 Iteration 3356/3560 Training loss: 1.4649 0.0707 sec/batch\n", + "Epoch 19/20 Iteration 3357/3560 Training loss: 1.4650 0.0615 sec/batch\n", + "Epoch 19/20 Iteration 3358/3560 Training loss: 1.4650 0.0608 sec/batch\n", + "Epoch 19/20 Iteration 3359/3560 Training loss: 1.4649 0.0701 sec/batch\n", + "Epoch 19/20 Iteration 3360/3560 Training loss: 1.4649 0.0669 sec/batch\n", + "Epoch 19/20 Iteration 3361/3560 Training loss: 1.4650 0.0597 sec/batch\n", + "Epoch 19/20 Iteration 3362/3560 Training loss: 1.4650 0.0600 sec/batch\n", + "Epoch 19/20 Iteration 3363/3560 Training loss: 
1.4647 0.0640 sec/batch\n", + "Epoch 19/20 Iteration 3364/3560 Training loss: 1.4648 0.0700 sec/batch\n", + "Epoch 19/20 Iteration 3365/3560 Training loss: 1.4650 0.0617 sec/batch\n", + "Epoch 19/20 Iteration 3366/3560 Training loss: 1.4648 0.0814 sec/batch\n", + "Epoch 19/20 Iteration 3367/3560 Training loss: 1.4648 0.0652 sec/batch\n", + "Epoch 19/20 Iteration 3368/3560 Training loss: 1.4648 0.0630 sec/batch\n", + "Epoch 19/20 Iteration 3369/3560 Training loss: 1.4648 0.0683 sec/batch\n", + "Epoch 19/20 Iteration 3370/3560 Training loss: 1.4647 0.0701 sec/batch\n", + "Epoch 19/20 Iteration 3371/3560 Training loss: 1.4649 0.0632 sec/batch\n", + "Epoch 19/20 Iteration 3372/3560 Training loss: 1.4652 0.0627 sec/batch\n", + "Epoch 19/20 Iteration 3373/3560 Training loss: 1.4652 0.0707 sec/batch\n", + "Epoch 19/20 Iteration 3374/3560 Training loss: 1.4652 0.0617 sec/batch\n", + "Epoch 19/20 Iteration 3375/3560 Training loss: 1.4651 0.0601 sec/batch\n", + "Epoch 19/20 Iteration 3376/3560 Training loss: 1.4649 0.0603 sec/batch\n", + "Epoch 19/20 Iteration 3377/3560 Training loss: 1.4651 0.0604 sec/batch\n", + "Epoch 19/20 Iteration 3378/3560 Training loss: 1.4651 0.0697 sec/batch\n", + "Epoch 19/20 Iteration 3379/3560 Training loss: 1.4652 0.0632 sec/batch\n", + "Epoch 19/20 Iteration 3380/3560 Training loss: 1.4650 0.0609 sec/batch\n", + "Epoch 19/20 Iteration 3381/3560 Training loss: 1.4649 0.0639 sec/batch\n", + "Epoch 19/20 Iteration 3382/3560 Training loss: 1.4651 0.0624 sec/batch\n", + "Epoch 20/20 Iteration 3383/3560 Training loss: 1.5934 0.0619 sec/batch\n", + "Epoch 20/20 Iteration 3384/3560 Training loss: 1.5344 0.0640 sec/batch\n", + "Epoch 20/20 Iteration 3385/3560 Training loss: 1.5132 0.0659 sec/batch\n", + "Epoch 20/20 Iteration 3386/3560 Training loss: 1.5063 0.0641 sec/batch\n", + "Epoch 20/20 Iteration 3387/3560 Training loss: 1.4940 0.0639 sec/batch\n", + "Epoch 20/20 Iteration 3388/3560 Training loss: 1.4831 0.0608 sec/batch\n", + "Epoch 20/20 
Iteration 3389/3560 Training loss: 1.4816 0.0617 sec/batch\n", + "Epoch 20/20 Iteration 3390/3560 Training loss: 1.4782 0.0645 sec/batch\n", + "Epoch 20/20 Iteration 3391/3560 Training loss: 1.4769 0.0606 sec/batch\n", + "Epoch 20/20 Iteration 3392/3560 Training loss: 1.4742 0.0598 sec/batch\n", + "Epoch 20/20 Iteration 3393/3560 Training loss: 1.4694 0.0678 sec/batch\n", + "Epoch 20/20 Iteration 3394/3560 Training loss: 1.4683 0.0642 sec/batch\n", + "Epoch 20/20 Iteration 3395/3560 Training loss: 1.4682 0.0648 sec/batch\n", + "Epoch 20/20 Iteration 3396/3560 Training loss: 1.4693 0.0617 sec/batch\n", + "Epoch 20/20 Iteration 3397/3560 Training loss: 1.4677 0.0596 sec/batch\n", + "Epoch 20/20 Iteration 3398/3560 Training loss: 1.4660 0.0589 sec/batch\n", + "Epoch 20/20 Iteration 3399/3560 Training loss: 1.4659 0.0646 sec/batch\n", + "Epoch 20/20 Iteration 3400/3560 Training loss: 1.4671 0.0652 sec/batch\n", + "Epoch 20/20 Iteration 3401/3560 Training loss: 1.4666 0.0616 sec/batch\n", + "Epoch 20/20 Iteration 3402/3560 Training loss: 1.4671 0.0649 sec/batch\n", + "Epoch 20/20 Iteration 3403/3560 Training loss: 1.4660 0.0654 sec/batch\n", + "Epoch 20/20 Iteration 3404/3560 Training loss: 1.4661 0.0593 sec/batch\n", + "Epoch 20/20 Iteration 3405/3560 Training loss: 1.4652 0.0606 sec/batch\n", + "Epoch 20/20 Iteration 3406/3560 Training loss: 1.4647 0.0638 sec/batch\n", + "Epoch 20/20 Iteration 3407/3560 Training loss: 1.4645 0.0619 sec/batch\n", + "Epoch 20/20 Iteration 3408/3560 Training loss: 1.4629 0.0630 sec/batch\n", + "Epoch 20/20 Iteration 3409/3560 Training loss: 1.4619 0.0614 sec/batch\n", + "Epoch 20/20 Iteration 3410/3560 Training loss: 1.4623 0.0641 sec/batch\n", + "Epoch 20/20 Iteration 3411/3560 Training loss: 1.4624 0.0676 sec/batch\n", + "Epoch 20/20 Iteration 3412/3560 Training loss: 1.4626 0.0601 sec/batch\n", + "Epoch 20/20 Iteration 3413/3560 Training loss: 1.4623 0.0597 sec/batch\n", + "Epoch 20/20 Iteration 3414/3560 Training loss: 1.4610 0.0612 
sec/batch\n", + "Epoch 20/20 Iteration 3415/3560 Training loss: 1.4611 0.0695 sec/batch\n", + "Epoch 20/20 Iteration 3416/3560 Training loss: 1.4613 0.0633 sec/batch\n", + "Epoch 20/20 Iteration 3417/3560 Training loss: 1.4610 0.0608 sec/batch\n", + "Epoch 20/20 Iteration 3418/3560 Training loss: 1.4607 0.0644 sec/batch\n", + "Epoch 20/20 Iteration 3419/3560 Training loss: 1.4600 0.0682 sec/batch\n", + "Epoch 20/20 Iteration 3420/3560 Training loss: 1.4586 0.0677 sec/batch\n", + "Epoch 20/20 Iteration 3421/3560 Training loss: 1.4570 0.0610 sec/batch\n", + "Epoch 20/20 Iteration 3422/3560 Training loss: 1.4565 0.0759 sec/batch\n", + "Epoch 20/20 Iteration 3423/3560 Training loss: 1.4561 0.0621 sec/batch\n", + "Epoch 20/20 Iteration 3424/3560 Training loss: 1.4566 0.0601 sec/batch\n", + "Epoch 20/20 Iteration 3425/3560 Training loss: 1.4560 0.0670 sec/batch\n", + "Epoch 20/20 Iteration 3426/3560 Training loss: 1.4552 0.0598 sec/batch\n", + "Epoch 20/20 Iteration 3427/3560 Training loss: 1.4556 0.0604 sec/batch\n", + "Epoch 20/20 Iteration 3428/3560 Training loss: 1.4547 0.0621 sec/batch\n", + "Epoch 20/20 Iteration 3429/3560 Training loss: 1.4544 0.0650 sec/batch\n", + "Epoch 20/20 Iteration 3430/3560 Training loss: 1.4538 0.0630 sec/batch\n", + "Epoch 20/20 Iteration 3431/3560 Training loss: 1.4536 0.0604 sec/batch\n", + "Epoch 20/20 Iteration 3432/3560 Training loss: 1.4541 0.0609 sec/batch\n", + "Epoch 20/20 Iteration 3433/3560 Training loss: 1.4534 0.0679 sec/batch\n", + "Epoch 20/20 Iteration 3434/3560 Training loss: 1.4541 0.0593 sec/batch\n", + "Epoch 20/20 Iteration 3435/3560 Training loss: 1.4538 0.0598 sec/batch\n", + "Epoch 20/20 Iteration 3436/3560 Training loss: 1.4540 0.0637 sec/batch\n", + "Epoch 20/20 Iteration 3437/3560 Training loss: 1.4537 0.0659 sec/batch\n", + "Epoch 20/20 Iteration 3438/3560 Training loss: 1.4539 0.0620 sec/batch\n", + "Epoch 20/20 Iteration 3439/3560 Training loss: 1.4544 0.0607 sec/batch\n", + "Epoch 20/20 Iteration 3440/3560 
Training loss: 1.4541 0.0604 sec/batch\n",
+ "[... iterations 3441-3559 of Epoch 20/20 elided: training loss settles around 1.45 at roughly 0.06 sec/batch ...]\n",
+ "Epoch 20/20 Iteration 3560/3560 Training loss: 1.4469 0.0661 sec/batch\n",
+ "Epoch 1/20 Iteration 1/3560 Training loss: 4.4191 1.3781 sec/batch\n",
+ "[... iterations 2-403 (Epochs 1/20 through 3/20) elided: loss falls steadily from 4.42 to about 1.85 at roughly 0.12 sec/batch ...]\n",
+ "Epoch 
3/20 Iteration 404/3560 Training loss: 1.8523 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 405/3560 Training loss: 1.8515 0.1261 sec/batch\n", + "Epoch 3/20 Iteration 406/3560 Training loss: 1.8514 0.1260 sec/batch\n", + "Epoch 3/20 Iteration 407/3560 Training loss: 1.8503 0.1215 sec/batch\n", + "Epoch 3/20 Iteration 408/3560 Training loss: 1.8504 0.1226 sec/batch\n", + "Epoch 3/20 Iteration 409/3560 Training loss: 1.8494 0.1223 sec/batch\n", + "Epoch 3/20 Iteration 410/3560 Training loss: 1.8486 0.1206 sec/batch\n", + "Epoch 3/20 Iteration 411/3560 Training loss: 1.8477 0.1219 sec/batch\n", + "Epoch 3/20 Iteration 412/3560 Training loss: 1.8470 0.1247 sec/batch\n", + "Epoch 3/20 Iteration 413/3560 Training loss: 1.8467 0.1233 sec/batch\n", + "Epoch 3/20 Iteration 414/3560 Training loss: 1.8457 0.1207 sec/batch\n", + "Epoch 3/20 Iteration 415/3560 Training loss: 1.8447 0.1204 sec/batch\n", + "Epoch 3/20 Iteration 416/3560 Training loss: 1.8446 0.1234 sec/batch\n", + "Epoch 3/20 Iteration 417/3560 Training loss: 1.8437 0.1244 sec/batch\n", + "Epoch 3/20 Iteration 418/3560 Training loss: 1.8438 0.1247 sec/batch\n", + "Epoch 3/20 Iteration 419/3560 Training loss: 1.8436 0.1214 sec/batch\n", + "Epoch 3/20 Iteration 420/3560 Training loss: 1.8433 0.1230 sec/batch\n", + "Epoch 3/20 Iteration 421/3560 Training loss: 1.8424 0.1205 sec/batch\n", + "Epoch 3/20 Iteration 422/3560 Training loss: 1.8422 0.1246 sec/batch\n", + "Epoch 3/20 Iteration 423/3560 Training loss: 1.8418 0.1216 sec/batch\n", + "Epoch 3/20 Iteration 424/3560 Training loss: 1.8408 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 425/3560 Training loss: 1.8401 0.1242 sec/batch\n", + "Epoch 3/20 Iteration 426/3560 Training loss: 1.8394 0.1238 sec/batch\n", + "Epoch 3/20 Iteration 427/3560 Training loss: 1.8393 0.1220 sec/batch\n", + "Epoch 3/20 Iteration 428/3560 Training loss: 1.8390 0.1258 sec/batch\n", + "Epoch 3/20 Iteration 429/3560 Training loss: 1.8387 0.1243 sec/batch\n", + "Epoch 3/20 Iteration 430/3560 
Training loss: 1.8379 0.1255 sec/batch\n", + "Epoch 3/20 Iteration 431/3560 Training loss: 1.8371 0.1276 sec/batch\n", + "Epoch 3/20 Iteration 432/3560 Training loss: 1.8366 0.1220 sec/batch\n", + "Epoch 3/20 Iteration 433/3560 Training loss: 1.8359 0.1216 sec/batch\n", + "Epoch 3/20 Iteration 434/3560 Training loss: 1.8354 0.1244 sec/batch\n", + "Epoch 3/20 Iteration 435/3560 Training loss: 1.8342 0.1214 sec/batch\n", + "Epoch 3/20 Iteration 436/3560 Training loss: 1.8334 0.1227 sec/batch\n", + "Epoch 3/20 Iteration 437/3560 Training loss: 1.8323 0.1220 sec/batch\n", + "Epoch 3/20 Iteration 438/3560 Training loss: 1.8317 0.1212 sec/batch\n", + "Epoch 3/20 Iteration 439/3560 Training loss: 1.8306 0.1212 sec/batch\n", + "Epoch 3/20 Iteration 440/3560 Training loss: 1.8301 0.1224 sec/batch\n", + "Epoch 3/20 Iteration 441/3560 Training loss: 1.8292 0.1215 sec/batch\n", + "Epoch 3/20 Iteration 442/3560 Training loss: 1.8283 0.1253 sec/batch\n", + "Epoch 3/20 Iteration 443/3560 Training loss: 1.8275 0.1215 sec/batch\n", + "Epoch 3/20 Iteration 444/3560 Training loss: 1.8266 0.1223 sec/batch\n", + "Epoch 3/20 Iteration 445/3560 Training loss: 1.8256 0.1224 sec/batch\n", + "Epoch 3/20 Iteration 446/3560 Training loss: 1.8251 0.1254 sec/batch\n", + "Epoch 3/20 Iteration 447/3560 Training loss: 1.8242 0.1237 sec/batch\n", + "Epoch 3/20 Iteration 448/3560 Training loss: 1.8235 0.1284 sec/batch\n", + "Epoch 3/20 Iteration 449/3560 Training loss: 1.8223 0.1219 sec/batch\n", + "Epoch 3/20 Iteration 450/3560 Training loss: 1.8214 0.1217 sec/batch\n", + "Epoch 3/20 Iteration 451/3560 Training loss: 1.8204 0.1225 sec/batch\n", + "Epoch 3/20 Iteration 452/3560 Training loss: 1.8198 0.1228 sec/batch\n", + "Epoch 3/20 Iteration 453/3560 Training loss: 1.8190 0.1248 sec/batch\n", + "Epoch 3/20 Iteration 454/3560 Training loss: 1.8180 0.1223 sec/batch\n", + "Epoch 3/20 Iteration 455/3560 Training loss: 1.8171 0.1230 sec/batch\n", + "Epoch 3/20 Iteration 456/3560 Training loss: 1.8159 
0.1334 sec/batch\n", + "Epoch 3/20 Iteration 457/3560 Training loss: 1.8154 0.1284 sec/batch\n", + "Epoch 3/20 Iteration 458/3560 Training loss: 1.8146 0.1239 sec/batch\n", + "Epoch 3/20 Iteration 459/3560 Training loss: 1.8138 0.1217 sec/batch\n", + "Epoch 3/20 Iteration 460/3560 Training loss: 1.8129 0.1224 sec/batch\n", + "Epoch 3/20 Iteration 461/3560 Training loss: 1.8121 0.1289 sec/batch\n", + "Epoch 3/20 Iteration 462/3560 Training loss: 1.8114 0.1247 sec/batch\n", + "Epoch 3/20 Iteration 463/3560 Training loss: 1.8108 0.1251 sec/batch\n", + "Epoch 3/20 Iteration 464/3560 Training loss: 1.8101 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 465/3560 Training loss: 1.8096 0.1219 sec/batch\n", + "Epoch 3/20 Iteration 466/3560 Training loss: 1.8090 0.1265 sec/batch\n", + "Epoch 3/20 Iteration 467/3560 Training loss: 1.8083 0.1222 sec/batch\n", + "Epoch 3/20 Iteration 468/3560 Training loss: 1.8076 0.1269 sec/batch\n", + "Epoch 3/20 Iteration 469/3560 Training loss: 1.8069 0.1242 sec/batch\n", + "Epoch 3/20 Iteration 470/3560 Training loss: 1.8062 0.1293 sec/batch\n", + "Epoch 3/20 Iteration 471/3560 Training loss: 1.8054 0.1231 sec/batch\n", + "Epoch 3/20 Iteration 472/3560 Training loss: 1.8046 0.1231 sec/batch\n", + "Epoch 3/20 Iteration 473/3560 Training loss: 1.8040 0.1210 sec/batch\n", + "Epoch 3/20 Iteration 474/3560 Training loss: 1.8034 0.1216 sec/batch\n", + "Epoch 3/20 Iteration 475/3560 Training loss: 1.8028 0.1274 sec/batch\n", + "Epoch 3/20 Iteration 476/3560 Training loss: 1.8023 0.1300 sec/batch\n", + "Epoch 3/20 Iteration 477/3560 Training loss: 1.8017 0.1270 sec/batch\n", + "Epoch 3/20 Iteration 478/3560 Training loss: 1.8008 0.1242 sec/batch\n", + "Epoch 3/20 Iteration 479/3560 Training loss: 1.8000 0.1240 sec/batch\n", + "Epoch 3/20 Iteration 480/3560 Training loss: 1.7995 0.1243 sec/batch\n", + "Epoch 3/20 Iteration 481/3560 Training loss: 1.7989 0.1220 sec/batch\n", + "Epoch 3/20 Iteration 482/3560 Training loss: 1.7980 0.1245 sec/batch\n", + 
"Epoch 3/20 Iteration 483/3560 Training loss: 1.7975 0.1237 sec/batch\n", + "Epoch 3/20 Iteration 484/3560 Training loss: 1.7970 0.1216 sec/batch\n", + "Epoch 3/20 Iteration 485/3560 Training loss: 1.7964 0.1208 sec/batch\n", + "Epoch 3/20 Iteration 486/3560 Training loss: 1.7957 0.1214 sec/batch\n", + "Epoch 3/20 Iteration 487/3560 Training loss: 1.7949 0.1238 sec/batch\n", + "Epoch 3/20 Iteration 488/3560 Training loss: 1.7940 0.1227 sec/batch\n", + "Epoch 3/20 Iteration 489/3560 Training loss: 1.7935 0.1309 sec/batch\n", + "Epoch 3/20 Iteration 490/3560 Training loss: 1.7930 0.1212 sec/batch\n", + "Epoch 3/20 Iteration 491/3560 Training loss: 1.7925 0.1248 sec/batch\n", + "Epoch 3/20 Iteration 492/3560 Training loss: 1.7919 0.1308 sec/batch\n", + "Epoch 3/20 Iteration 493/3560 Training loss: 1.7914 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 494/3560 Training loss: 1.7909 0.1252 sec/batch\n", + "Epoch 3/20 Iteration 495/3560 Training loss: 1.7905 0.1218 sec/batch\n", + "Epoch 3/20 Iteration 496/3560 Training loss: 1.7899 0.1237 sec/batch\n", + "Epoch 3/20 Iteration 497/3560 Training loss: 1.7896 0.1244 sec/batch\n", + "Epoch 3/20 Iteration 498/3560 Training loss: 1.7890 0.1242 sec/batch\n", + "Epoch 3/20 Iteration 499/3560 Training loss: 1.7885 0.1227 sec/batch\n", + "Epoch 3/20 Iteration 500/3560 Training loss: 1.7879 0.1313 sec/batch\n", + "Epoch 3/20 Iteration 501/3560 Training loss: 1.7873 0.1224 sec/batch\n", + "Epoch 3/20 Iteration 502/3560 Training loss: 1.7868 0.1264 sec/batch\n", + "Epoch 3/20 Iteration 503/3560 Training loss: 1.7863 0.1224 sec/batch\n", + "Epoch 3/20 Iteration 504/3560 Training loss: 1.7860 0.1347 sec/batch\n", + "Epoch 3/20 Iteration 505/3560 Training loss: 1.7855 0.1232 sec/batch\n", + "Epoch 3/20 Iteration 506/3560 Training loss: 1.7849 0.1237 sec/batch\n", + "Epoch 3/20 Iteration 507/3560 Training loss: 1.7842 0.1223 sec/batch\n", + "Epoch 3/20 Iteration 508/3560 Training loss: 1.7838 0.1241 sec/batch\n", + "Epoch 3/20 Iteration 
509/3560 Training loss: 1.7832 0.1242 sec/batch\n", + "Epoch 3/20 Iteration 510/3560 Training loss: 1.7827 0.1216 sec/batch\n", + "Epoch 3/20 Iteration 511/3560 Training loss: 1.7822 0.1211 sec/batch\n", + "Epoch 3/20 Iteration 512/3560 Training loss: 1.7817 0.1214 sec/batch\n", + "Epoch 3/20 Iteration 513/3560 Training loss: 1.7812 0.1236 sec/batch\n", + "Epoch 3/20 Iteration 514/3560 Training loss: 1.7806 0.1255 sec/batch\n", + "Epoch 3/20 Iteration 515/3560 Training loss: 1.7798 0.1297 sec/batch\n", + "Epoch 3/20 Iteration 516/3560 Training loss: 1.7794 0.1227 sec/batch\n", + "Epoch 3/20 Iteration 517/3560 Training loss: 1.7791 0.1223 sec/batch\n", + "Epoch 3/20 Iteration 518/3560 Training loss: 1.7786 0.1230 sec/batch\n", + "Epoch 3/20 Iteration 519/3560 Training loss: 1.7781 0.1252 sec/batch\n", + "Epoch 3/20 Iteration 520/3560 Training loss: 1.7775 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 521/3560 Training loss: 1.7770 0.1258 sec/batch\n", + "Epoch 3/20 Iteration 522/3560 Training loss: 1.7764 0.1257 sec/batch\n", + "Epoch 3/20 Iteration 523/3560 Training loss: 1.7759 0.1250 sec/batch\n", + "Epoch 3/20 Iteration 524/3560 Training loss: 1.7758 0.1249 sec/batch\n", + "Epoch 3/20 Iteration 525/3560 Training loss: 1.7752 0.1224 sec/batch\n", + "Epoch 3/20 Iteration 526/3560 Training loss: 1.7747 0.1323 sec/batch\n", + "Epoch 3/20 Iteration 527/3560 Training loss: 1.7740 0.1267 sec/batch\n", + "Epoch 3/20 Iteration 528/3560 Training loss: 1.7734 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 529/3560 Training loss: 1.7730 0.1236 sec/batch\n", + "Epoch 3/20 Iteration 530/3560 Training loss: 1.7726 0.1228 sec/batch\n", + "Epoch 3/20 Iteration 531/3560 Training loss: 1.7721 0.1242 sec/batch\n", + "Epoch 3/20 Iteration 532/3560 Training loss: 1.7715 0.1262 sec/batch\n", + "Epoch 3/20 Iteration 533/3560 Training loss: 1.7709 0.1238 sec/batch\n", + "Epoch 3/20 Iteration 534/3560 Training loss: 1.7705 0.1245 sec/batch\n", + "Epoch 4/20 Iteration 535/3560 Training loss: 
1.7402 0.1252 sec/batch\n", + "Epoch 4/20 Iteration 536/3560 Training loss: 1.7042 0.1276 sec/batch\n", + "Epoch 4/20 Iteration 537/3560 Training loss: 1.6952 0.1252 sec/batch\n", + "Epoch 4/20 Iteration 538/3560 Training loss: 1.6914 0.1228 sec/batch\n", + "Epoch 4/20 Iteration 539/3560 Training loss: 1.6851 0.1240 sec/batch\n", + "Epoch 4/20 Iteration 540/3560 Training loss: 1.6753 0.1220 sec/batch\n", + "Epoch 4/20 Iteration 541/3560 Training loss: 1.6746 0.1249 sec/batch\n", + "Epoch 4/20 Iteration 542/3560 Training loss: 1.6731 0.1220 sec/batch\n", + "Epoch 4/20 Iteration 543/3560 Training loss: 1.6747 0.1210 sec/batch\n", + "Epoch 4/20 Iteration 544/3560 Training loss: 1.6730 0.1246 sec/batch\n", + "Epoch 4/20 Iteration 545/3560 Training loss: 1.6695 0.1214 sec/batch\n", + "Epoch 4/20 Iteration 546/3560 Training loss: 1.6679 0.1255 sec/batch\n", + "Epoch 4/20 Iteration 547/3560 Training loss: 1.6670 0.1218 sec/batch\n", + "Epoch 4/20 Iteration 548/3560 Training loss: 1.6689 0.1260 sec/batch\n", + "Epoch 4/20 Iteration 549/3560 Training loss: 1.6673 0.1229 sec/batch\n", + "Epoch 4/20 Iteration 550/3560 Training loss: 1.6652 0.1203 sec/batch\n", + "Epoch 4/20 Iteration 551/3560 Training loss: 1.6644 0.1238 sec/batch\n", + "Epoch 4/20 Iteration 552/3560 Training loss: 1.6659 0.1242 sec/batch\n", + "Epoch 4/20 Iteration 553/3560 Training loss: 1.6653 0.1292 sec/batch\n", + "Epoch 4/20 Iteration 554/3560 Training loss: 1.6656 0.1246 sec/batch\n", + "Epoch 4/20 Iteration 555/3560 Training loss: 1.6643 0.1226 sec/batch\n", + "Epoch 4/20 Iteration 556/3560 Training loss: 1.6653 0.1238 sec/batch\n", + "Epoch 4/20 Iteration 557/3560 Training loss: 1.6639 0.1238 sec/batch\n", + "Epoch 4/20 Iteration 558/3560 Training loss: 1.6638 0.1302 sec/batch\n", + "Epoch 4/20 Iteration 559/3560 Training loss: 1.6629 0.1221 sec/batch\n", + "Epoch 4/20 Iteration 560/3560 Training loss: 1.6611 0.1281 sec/batch\n", + "Epoch 4/20 Iteration 561/3560 Training loss: 1.6595 0.1225 
sec/batch\n", + "Epoch 4/20 Iteration 562/3560 Training loss: 1.6596 0.1316 sec/batch\n", + "Epoch 4/20 Iteration 563/3560 Training loss: 1.6598 0.1219 sec/batch\n", + "Epoch 4/20 Iteration 564/3560 Training loss: 1.6598 0.1260 sec/batch\n", + "Epoch 4/20 Iteration 565/3560 Training loss: 1.6591 0.1260 sec/batch\n", + "Epoch 4/20 Iteration 566/3560 Training loss: 1.6574 0.1265 sec/batch\n", + "Epoch 4/20 Iteration 567/3560 Training loss: 1.6576 0.1232 sec/batch\n", + "Epoch 4/20 Iteration 568/3560 Training loss: 1.6577 0.1234 sec/batch\n", + "Epoch 4/20 Iteration 569/3560 Training loss: 1.6571 0.1216 sec/batch\n", + "Epoch 4/20 Iteration 570/3560 Training loss: 1.6564 0.1220 sec/batch\n", + "Epoch 4/20 Iteration 571/3560 Training loss: 1.6553 0.1236 sec/batch\n", + "Epoch 4/20 Iteration 572/3560 Training loss: 1.6539 0.1222 sec/batch\n", + "Epoch 4/20 Iteration 573/3560 Training loss: 1.6522 0.1229 sec/batch\n", + "Epoch 4/20 Iteration 574/3560 Training loss: 1.6512 0.1243 sec/batch\n", + "Epoch 4/20 Iteration 575/3560 Training loss: 1.6504 0.1231 sec/batch\n", + "Epoch 4/20 Iteration 576/3560 Training loss: 1.6505 0.1299 sec/batch\n", + "Epoch 4/20 Iteration 577/3560 Training loss: 1.6493 0.1256 sec/batch\n", + "Epoch 4/20 Iteration 578/3560 Training loss: 1.6480 0.1221 sec/batch\n", + "Epoch 4/20 Iteration 579/3560 Training loss: 1.6478 0.1234 sec/batch\n", + "Epoch 4/20 Iteration 580/3560 Training loss: 1.6464 0.1271 sec/batch\n", + "Epoch 4/20 Iteration 581/3560 Training loss: 1.6460 0.1364 sec/batch\n", + "Epoch 4/20 Iteration 582/3560 Training loss: 1.6450 0.1224 sec/batch\n", + "Epoch 4/20 Iteration 583/3560 Training loss: 1.6443 0.1236 sec/batch\n", + "Epoch 4/20 Iteration 584/3560 Training loss: 1.6444 0.1225 sec/batch\n", + "Epoch 4/20 Iteration 585/3560 Training loss: 1.6436 0.1249 sec/batch\n", + "Epoch 4/20 Iteration 586/3560 Training loss: 1.6443 0.1218 sec/batch\n", + "Epoch 4/20 Iteration 587/3560 Training loss: 1.6437 0.1210 sec/batch\n", + "Epoch 
4/20 Iteration 588/3560 Training loss: 1.6434 0.1218 sec/batch\n", + "Epoch 4/20 Iteration 589/3560 Training loss: 1.6428 0.1207 sec/batch\n", + "Epoch 4/20 Iteration 590/3560 Training loss: 1.6425 0.1224 sec/batch\n", + "Epoch 4/20 Iteration 591/3560 Training loss: 1.6427 0.1207 sec/batch\n", + "Epoch 4/20 Iteration 592/3560 Training loss: 1.6419 0.1264 sec/batch\n", + "Epoch 4/20 Iteration 593/3560 Training loss: 1.6411 0.1249 sec/batch\n", + "Epoch 4/20 Iteration 594/3560 Training loss: 1.6413 0.1227 sec/batch\n", + "Epoch 4/20 Iteration 595/3560 Training loss: 1.6408 0.1269 sec/batch\n", + "Epoch 4/20 Iteration 596/3560 Training loss: 1.6413 0.1237 sec/batch\n", + "Epoch 4/20 Iteration 597/3560 Training loss: 1.6415 0.1236 sec/batch\n", + "Epoch 4/20 Iteration 598/3560 Training loss: 1.6414 0.1254 sec/batch\n", + "Epoch 4/20 Iteration 599/3560 Training loss: 1.6410 0.1234 sec/batch\n", + "Epoch 4/20 Iteration 600/3560 Training loss: 1.6408 0.1259 sec/batch\n", + "Epoch 4/20 Iteration 601/3560 Training loss: 1.6407 0.1232 sec/batch\n", + "Epoch 4/20 Iteration 602/3560 Training loss: 1.6402 0.1211 sec/batch\n", + "Epoch 4/20 Iteration 603/3560 Training loss: 1.6398 0.1228 sec/batch\n", + "Epoch 4/20 Iteration 604/3560 Training loss: 1.6394 0.1220 sec/batch\n", + "Epoch 4/20 Iteration 605/3560 Training loss: 1.6396 0.1274 sec/batch\n", + "Epoch 4/20 Iteration 606/3560 Training loss: 1.6393 0.1244 sec/batch\n", + "Epoch 4/20 Iteration 607/3560 Training loss: 1.6394 0.1215 sec/batch\n", + "Epoch 4/20 Iteration 608/3560 Training loss: 1.6388 0.1217 sec/batch\n", + "Epoch 4/20 Iteration 609/3560 Training loss: 1.6384 0.1220 sec/batch\n", + "Epoch 4/20 Iteration 610/3560 Training loss: 1.6382 0.1208 sec/batch\n", + "Epoch 4/20 Iteration 611/3560 Training loss: 1.6377 0.1214 sec/batch\n", + "Epoch 4/20 Iteration 612/3560 Training loss: 1.6373 0.1259 sec/batch\n", + "Epoch 4/20 Iteration 613/3560 Training loss: 1.6364 0.1235 sec/batch\n", + "Epoch 4/20 Iteration 614/3560 
Training loss: 1.6360 0.1264 sec/batch\n", + "Epoch 4/20 Iteration 615/3560 Training loss: 1.6352 0.1263 sec/batch\n", + "Epoch 4/20 Iteration 616/3560 Training loss: 1.6348 0.1251 sec/batch\n", + "Epoch 4/20 Iteration 617/3560 Training loss: 1.6340 0.1222 sec/batch\n", + "Epoch 4/20 Iteration 618/3560 Training loss: 1.6337 0.1224 sec/batch\n", + "Epoch 4/20 Iteration 619/3560 Training loss: 1.6330 0.1242 sec/batch\n", + "Epoch 4/20 Iteration 620/3560 Training loss: 1.6325 0.1230 sec/batch\n", + "Epoch 4/20 Iteration 621/3560 Training loss: 1.6319 0.1263 sec/batch\n", + "Epoch 4/20 Iteration 622/3560 Training loss: 1.6313 0.1222 sec/batch\n", + "Epoch 4/20 Iteration 623/3560 Training loss: 1.6305 0.1238 sec/batch\n", + "Epoch 4/20 Iteration 624/3560 Training loss: 1.6303 0.1248 sec/batch\n", + "Epoch 4/20 Iteration 625/3560 Training loss: 1.6298 0.1236 sec/batch\n", + "Epoch 4/20 Iteration 626/3560 Training loss: 1.6292 0.1272 sec/batch\n", + "Epoch 4/20 Iteration 627/3560 Training loss: 1.6285 0.1246 sec/batch\n", + "Epoch 4/20 Iteration 628/3560 Training loss: 1.6278 0.1264 sec/batch\n", + "Epoch 4/20 Iteration 629/3560 Training loss: 1.6273 0.1234 sec/batch\n", + "Epoch 4/20 Iteration 630/3560 Training loss: 1.6270 0.1208 sec/batch\n", + "Epoch 4/20 Iteration 631/3560 Training loss: 1.6266 0.1223 sec/batch\n", + "Epoch 4/20 Iteration 632/3560 Training loss: 1.6258 0.1210 sec/batch\n", + "Epoch 4/20 Iteration 633/3560 Training loss: 1.6252 0.1213 sec/batch\n", + "Epoch 4/20 Iteration 634/3560 Training loss: 1.6244 0.1210 sec/batch\n", + "Epoch 4/20 Iteration 635/3560 Training loss: 1.6241 0.1243 sec/batch\n", + "Epoch 4/20 Iteration 636/3560 Training loss: 1.6236 0.1212 sec/batch\n", + "Epoch 4/20 Iteration 637/3560 Training loss: 1.6231 0.1213 sec/batch\n", + "Epoch 4/20 Iteration 638/3560 Training loss: 1.6226 0.1357 sec/batch\n", + "Epoch 4/20 Iteration 639/3560 Training loss: 1.6220 0.1254 sec/batch\n", + "Epoch 4/20 Iteration 640/3560 Training loss: 1.6215 
0.1311 sec/batch\n", + "Epoch 4/20 Iteration 641/3560 Training loss: 1.6212 0.1245 sec/batch\n", + "Epoch 4/20 Iteration 642/3560 Training loss: 1.6208 0.1282 sec/batch\n", + "Epoch 4/20 Iteration 643/3560 Training loss: 1.6204 0.1245 sec/batch\n", + "Epoch 4/20 Iteration 644/3560 Training loss: 1.6201 0.1235 sec/batch\n", + "Epoch 4/20 Iteration 645/3560 Training loss: 1.6196 0.1232 sec/batch\n", + "Epoch 4/20 Iteration 646/3560 Training loss: 1.6191 0.1246 sec/batch\n", + "Epoch 4/20 Iteration 647/3560 Training loss: 1.6186 0.1213 sec/batch\n", + "Epoch 4/20 Iteration 648/3560 Training loss: 1.6181 0.1221 sec/batch\n", + "Epoch 4/20 Iteration 649/3560 Training loss: 1.6174 0.1230 sec/batch\n", + "Epoch 4/20 Iteration 650/3560 Training loss: 1.6167 0.1223 sec/batch\n", + "Epoch 4/20 Iteration 651/3560 Training loss: 1.6163 0.1212 sec/batch\n", + "Epoch 4/20 Iteration 652/3560 Training loss: 1.6160 0.1217 sec/batch\n", + "Epoch 4/20 Iteration 653/3560 Training loss: 1.6156 0.1223 sec/batch\n", + "Epoch 4/20 Iteration 654/3560 Training loss: 1.6152 0.1233 sec/batch\n", + "Epoch 4/20 Iteration 655/3560 Training loss: 1.6148 0.1222 sec/batch\n", + "Epoch 4/20 Iteration 656/3560 Training loss: 1.6141 0.1216 sec/batch\n", + "Epoch 4/20 Iteration 657/3560 Training loss: 1.6134 0.1218 sec/batch\n", + "Epoch 4/20 Iteration 658/3560 Training loss: 1.6131 0.1305 sec/batch\n", + "Epoch 4/20 Iteration 659/3560 Training loss: 1.6128 0.1235 sec/batch\n", + "Epoch 4/20 Iteration 660/3560 Training loss: 1.6121 0.1267 sec/batch\n", + "Epoch 4/20 Iteration 661/3560 Training loss: 1.6118 0.1275 sec/batch\n", + "Epoch 4/20 Iteration 662/3560 Training loss: 1.6115 0.1240 sec/batch\n", + "Epoch 4/20 Iteration 663/3560 Training loss: 1.6110 0.1271 sec/batch\n", + "Epoch 4/20 Iteration 664/3560 Training loss: 1.6105 0.1230 sec/batch\n", + "Epoch 4/20 Iteration 665/3560 Training loss: 1.6098 0.1235 sec/batch\n", + "Epoch 4/20 Iteration 666/3560 Training loss: 1.6092 0.1225 sec/batch\n", + 
"Epoch 4/20 Iteration 667/3560 Training loss: 1.6089 0.1224 sec/batch\n", + "Epoch 4/20 Iteration 668/3560 Training loss: 1.6086 0.1220 sec/batch\n", + "Epoch 4/20 Iteration 669/3560 Training loss: 1.6083 0.1220 sec/batch\n", + "Epoch 4/20 Iteration 670/3560 Training loss: 1.6080 0.1224 sec/batch\n", + "Epoch 4/20 Iteration 671/3560 Training loss: 1.6078 0.1235 sec/batch\n", + "Epoch 4/20 Iteration 672/3560 Training loss: 1.6076 0.1253 sec/batch\n", + "Epoch 4/20 Iteration 673/3560 Training loss: 1.6073 0.1204 sec/batch\n", + "Epoch 4/20 Iteration 674/3560 Training loss: 1.6070 0.1259 sec/batch\n", + "Epoch 4/20 Iteration 675/3560 Training loss: 1.6070 0.1272 sec/batch\n", + "Epoch 4/20 Iteration 676/3560 Training loss: 1.6066 0.1266 sec/batch\n", + "Epoch 4/20 Iteration 677/3560 Training loss: 1.6062 0.1240 sec/batch\n", + "Epoch 4/20 Iteration 678/3560 Training loss: 1.6060 0.1213 sec/batch\n", + "Epoch 4/20 Iteration 679/3560 Training loss: 1.6056 0.1221 sec/batch\n", + "Epoch 4/20 Iteration 680/3560 Training loss: 1.6054 0.1269 sec/batch\n", + "Epoch 4/20 Iteration 681/3560 Training loss: 1.6051 0.1220 sec/batch\n", + "Epoch 4/20 Iteration 682/3560 Training loss: 1.6050 0.1250 sec/batch\n", + "Epoch 4/20 Iteration 683/3560 Training loss: 1.6047 0.1221 sec/batch\n", + "Epoch 4/20 Iteration 684/3560 Training loss: 1.6042 0.1231 sec/batch\n", + "Epoch 4/20 Iteration 685/3560 Training loss: 1.6036 0.1304 sec/batch\n", + "Epoch 4/20 Iteration 686/3560 Training loss: 1.6033 0.1223 sec/batch\n", + "Epoch 4/20 Iteration 687/3560 Training loss: 1.6030 0.1227 sec/batch\n", + "Epoch 4/20 Iteration 688/3560 Training loss: 1.6028 0.1220 sec/batch\n", + "Epoch 4/20 Iteration 689/3560 Training loss: 1.6025 0.1229 sec/batch\n", + "Epoch 4/20 Iteration 690/3560 Training loss: 1.6022 0.1217 sec/batch\n", + "Epoch 4/20 Iteration 691/3560 Training loss: 1.6019 0.1223 sec/batch\n", + "Epoch 4/20 Iteration 692/3560 Training loss: 1.6016 0.1218 sec/batch\n", + "Epoch 4/20 Iteration 
693/3560 Training loss: 1.6011 0.1256 sec/batch\n", + "Epoch 4/20 Iteration 694/3560 Training loss: 1.6009 0.1248 sec/batch\n", + "Epoch 4/20 Iteration 695/3560 Training loss: 1.6008 0.1245 sec/batch\n", + "Epoch 4/20 Iteration 696/3560 Training loss: 1.6005 0.1261 sec/batch\n", + "Epoch 4/20 Iteration 697/3560 Training loss: 1.6002 0.1279 sec/batch\n", + "Epoch 4/20 Iteration 698/3560 Training loss: 1.5999 0.1245 sec/batch\n", + "Epoch 4/20 Iteration 699/3560 Training loss: 1.5996 0.1259 sec/batch\n", + "Epoch 4/20 Iteration 700/3560 Training loss: 1.5993 0.1251 sec/batch\n", + "Epoch 4/20 Iteration 701/3560 Training loss: 1.5991 0.1229 sec/batch\n", + "Epoch 4/20 Iteration 702/3560 Training loss: 1.5993 0.1210 sec/batch\n", + "Epoch 4/20 Iteration 703/3560 Training loss: 1.5990 0.1237 sec/batch\n", + "Epoch 4/20 Iteration 704/3560 Training loss: 1.5986 0.1213 sec/batch\n", + "Epoch 4/20 Iteration 705/3560 Training loss: 1.5982 0.1235 sec/batch\n", + "Epoch 4/20 Iteration 706/3560 Training loss: 1.5978 0.1213 sec/batch\n", + "Epoch 4/20 Iteration 707/3560 Training loss: 1.5976 0.1211 sec/batch\n", + "Epoch 4/20 Iteration 708/3560 Training loss: 1.5974 0.1241 sec/batch\n", + "Epoch 4/20 Iteration 709/3560 Training loss: 1.5972 0.1209 sec/batch\n", + "Epoch 4/20 Iteration 710/3560 Training loss: 1.5968 0.1273 sec/batch\n", + "Epoch 4/20 Iteration 711/3560 Training loss: 1.5964 0.1239 sec/batch\n", + "Epoch 4/20 Iteration 712/3560 Training loss: 1.5961 0.1246 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 1.6303 0.1247 sec/batch\n", + "Epoch 5/20 Iteration 714/3560 Training loss: 1.5908 0.1237 sec/batch\n", + "Epoch 5/20 Iteration 715/3560 Training loss: 1.5755 0.1231 sec/batch\n", + "Epoch 5/20 Iteration 716/3560 Training loss: 1.5691 0.1273 sec/batch\n", + "Epoch 5/20 Iteration 717/3560 Training loss: 1.5608 0.1239 sec/batch\n", + "Epoch 5/20 Iteration 718/3560 Training loss: 1.5506 0.1253 sec/batch\n", + "Epoch 5/20 Iteration 719/3560 Training loss: 
1.5499 0.1248 sec/batch\n", + "Epoch 5/20 Iteration 720/3560 Training loss: 1.5472 0.1244 sec/batch\n", + "Epoch 5/20 Iteration 721/3560 Training loss: 1.5485 0.1213 sec/batch\n", + "Epoch 5/20 Iteration 722/3560 Training loss: 1.5460 0.1228 sec/batch\n", + "Epoch 5/20 Iteration 723/3560 Training loss: 1.5428 0.1227 sec/batch\n", + "Epoch 5/20 Iteration 724/3560 Training loss: 1.5411 0.1282 sec/batch\n", + "Epoch 5/20 Iteration 725/3560 Training loss: 1.5409 0.1227 sec/batch\n", + "Epoch 5/20 Iteration 726/3560 Training loss: 1.5418 0.1223 sec/batch\n", + "Epoch 5/20 Iteration 727/3560 Training loss: 1.5400 0.1222 sec/batch\n", + "Epoch 5/20 Iteration 728/3560 Training loss: 1.5379 0.1249 sec/batch\n", + "Epoch 5/20 Iteration 729/3560 Training loss: 1.5376 0.1217 sec/batch\n", + "Epoch 5/20 Iteration 730/3560 Training loss: 1.5392 0.1249 sec/batch\n", + "Epoch 5/20 Iteration 731/3560 Training loss: 1.5388 0.1246 sec/batch\n", + "Epoch 5/20 Iteration 732/3560 Training loss: 1.5392 0.1266 sec/batch\n", + "Epoch 5/20 Iteration 733/3560 Training loss: 1.5381 0.1217 sec/batch\n", + "Epoch 5/20 Iteration 734/3560 Training loss: 1.5389 0.1211 sec/batch\n", + "Epoch 5/20 Iteration 735/3560 Training loss: 1.5376 0.1222 sec/batch\n", + "Epoch 5/20 Iteration 736/3560 Training loss: 1.5375 0.1222 sec/batch\n", + "Epoch 5/20 Iteration 737/3560 Training loss: 1.5370 0.1241 sec/batch\n", + "Epoch 5/20 Iteration 738/3560 Training loss: 1.5351 0.1251 sec/batch\n", + "Epoch 5/20 Iteration 739/3560 Training loss: 1.5334 0.1303 sec/batch\n", + "Epoch 5/20 Iteration 740/3560 Training loss: 1.5332 0.1307 sec/batch\n", + "Epoch 5/20 Iteration 741/3560 Training loss: 1.5335 0.1231 sec/batch\n", + "Epoch 5/20 Iteration 742/3560 Training loss: 1.5336 0.1257 sec/batch\n", + "Epoch 5/20 Iteration 743/3560 Training loss: 1.5327 0.1221 sec/batch\n", + "Epoch 5/20 Iteration 744/3560 Training loss: 1.5314 0.1233 sec/batch\n", + "Epoch 5/20 Iteration 745/3560 Training loss: 1.5316 0.1217 
sec/batch\n", + "Epoch 5/20 Iteration 746/3560 Training loss: 1.5318 0.1242 sec/batch\n", + "Epoch 5/20 Iteration 747/3560 Training loss: 1.5313 0.1220 sec/batch\n", + "Epoch 5/20 Iteration 748/3560 Training loss: 1.5303 0.1298 sec/batch\n", + "Epoch 5/20 Iteration 749/3560 Training loss: 1.5293 0.1279 sec/batch\n", + "Epoch 5/20 Iteration 750/3560 Training loss: 1.5279 0.1318 sec/batch\n", + "Epoch 5/20 Iteration 751/3560 Training loss: 1.5261 0.1213 sec/batch\n", + "Epoch 5/20 Iteration 752/3560 Training loss: 1.5254 0.1272 sec/batch\n", + "Epoch 5/20 Iteration 753/3560 Training loss: 1.5247 0.1237 sec/batch\n", + "Epoch 5/20 Iteration 754/3560 Training loss: 1.5249 0.1285 sec/batch\n", + "Epoch 5/20 Iteration 755/3560 Training loss: 1.5240 0.1244 sec/batch\n", + "Epoch 5/20 Iteration 756/3560 Training loss: 1.5232 0.1299 sec/batch\n", + "Epoch 5/20 Iteration 757/3560 Training loss: 1.5231 0.1303 sec/batch\n", + "Epoch 5/20 Iteration 758/3560 Training loss: 1.5221 0.1220 sec/batch\n", + "Epoch 5/20 Iteration 759/3560 Training loss: 1.5216 0.1205 sec/batch\n", + "Epoch 5/20 Iteration 760/3560 Training loss: 1.5211 0.1202 sec/batch\n", + "Epoch 5/20 Iteration 761/3560 Training loss: 1.5208 0.1209 sec/batch\n", + "Epoch 5/20 Iteration 762/3560 Training loss: 1.5208 0.1216 sec/batch\n", + "Epoch 5/20 Iteration 763/3560 Training loss: 1.5203 0.1210 sec/batch\n", + "Epoch 5/20 Iteration 764/3560 Training loss: 1.5208 0.1227 sec/batch\n", + "Epoch 5/20 Iteration 765/3560 Training loss: 1.5203 0.1220 sec/batch\n", + "Epoch 5/20 Iteration 766/3560 Training loss: 1.5203 0.1268 sec/batch\n", + "Epoch 5/20 Iteration 767/3560 Training loss: 1.5198 0.1297 sec/batch\n", + "Epoch 5/20 Iteration 768/3560 Training loss: 1.5195 0.1235 sec/batch\n", + "Epoch 5/20 Iteration 769/3560 Training loss: 1.5198 0.1263 sec/batch\n", + "Epoch 5/20 Iteration 770/3560 Training loss: 1.5192 0.1289 sec/batch\n", + "Epoch 5/20 Iteration 771/3560 Training loss: 1.5184 0.1301 sec/batch\n", + "Epoch 
5/20 Iteration 772/3560 Training loss: 1.5186 0.1254 sec/batch\n", + "[... training log output elided: Epoch 5/20 through Epoch 8/20, iterations 773/3560 to 1318/3560; training loss decreases steadily from ~1.52 to ~1.33 at roughly 0.12 sec/batch ...]\n", + "Epoch 8/20 Iteration 1319/3560
Training loss: 1.3286 0.1240 sec/batch\n", + "Epoch 8/20 Iteration 1320/3560 Training loss: 1.3281 0.1250 sec/batch\n", + "Epoch 8/20 Iteration 1321/3560 Training loss: 1.3278 0.1245 sec/batch\n", + "Epoch 8/20 Iteration 1322/3560 Training loss: 1.3279 0.1247 sec/batch\n", + "Epoch 8/20 Iteration 1323/3560 Training loss: 1.3278 0.1220 sec/batch\n", + "Epoch 8/20 Iteration 1324/3560 Training loss: 1.3276 0.1209 sec/batch\n", + "Epoch 8/20 Iteration 1325/3560 Training loss: 1.3269 0.1214 sec/batch\n", + "Epoch 8/20 Iteration 1326/3560 Training loss: 1.3267 0.1228 sec/batch\n", + "Epoch 8/20 Iteration 1327/3560 Training loss: 1.3262 0.1249 sec/batch\n", + "Epoch 8/20 Iteration 1328/3560 Training loss: 1.3260 0.1213 sec/batch\n", + "Epoch 8/20 Iteration 1329/3560 Training loss: 1.3254 0.1218 sec/batch\n", + "Epoch 8/20 Iteration 1330/3560 Training loss: 1.3252 0.1233 sec/batch\n", + "Epoch 8/20 Iteration 1331/3560 Training loss: 1.3249 0.1242 sec/batch\n", + "Epoch 8/20 Iteration 1332/3560 Training loss: 1.3247 0.1248 sec/batch\n", + "Epoch 8/20 Iteration 1333/3560 Training loss: 1.3243 0.1222 sec/batch\n", + "Epoch 8/20 Iteration 1334/3560 Training loss: 1.3240 0.1254 sec/batch\n", + "Epoch 8/20 Iteration 1335/3560 Training loss: 1.3235 0.1221 sec/batch\n", + "Epoch 8/20 Iteration 1336/3560 Training loss: 1.3234 0.1229 sec/batch\n", + "Epoch 8/20 Iteration 1337/3560 Training loss: 1.3231 0.1276 sec/batch\n", + "Epoch 8/20 Iteration 1338/3560 Training loss: 1.3229 0.1242 sec/batch\n", + "Epoch 8/20 Iteration 1339/3560 Training loss: 1.3225 0.1222 sec/batch\n", + "Epoch 8/20 Iteration 1340/3560 Training loss: 1.3220 0.1249 sec/batch\n", + "Epoch 8/20 Iteration 1341/3560 Training loss: 1.3216 0.1239 sec/batch\n", + "Epoch 8/20 Iteration 1342/3560 Training loss: 1.3217 0.1230 sec/batch\n", + "Epoch 8/20 Iteration 1343/3560 Training loss: 1.3216 0.1242 sec/batch\n", + "Epoch 8/20 Iteration 1344/3560 Training loss: 1.3212 0.1259 sec/batch\n", + "Epoch 8/20 Iteration 
1345/3560 Training loss: 1.3207 0.1248 sec/batch\n", + "Epoch 8/20 Iteration 1346/3560 Training loss: 1.3203 0.1248 sec/batch\n", + "Epoch 8/20 Iteration 1347/3560 Training loss: 1.3203 0.1223 sec/batch\n", + "Epoch 8/20 Iteration 1348/3560 Training loss: 1.3201 0.1211 sec/batch\n", + "Epoch 8/20 Iteration 1349/3560 Training loss: 1.3199 0.1218 sec/batch\n", + "Epoch 8/20 Iteration 1350/3560 Training loss: 1.3197 0.1237 sec/batch\n", + "Epoch 8/20 Iteration 1351/3560 Training loss: 1.3194 0.1238 sec/batch\n", + "Epoch 8/20 Iteration 1352/3560 Training loss: 1.3193 0.1246 sec/batch\n", + "Epoch 8/20 Iteration 1353/3560 Training loss: 1.3192 0.1233 sec/batch\n", + "Epoch 8/20 Iteration 1354/3560 Training loss: 1.3192 0.1228 sec/batch\n", + "Epoch 8/20 Iteration 1355/3560 Training loss: 1.3190 0.1246 sec/batch\n", + "Epoch 8/20 Iteration 1356/3560 Training loss: 1.3189 0.1242 sec/batch\n", + "Epoch 8/20 Iteration 1357/3560 Training loss: 1.3186 0.1221 sec/batch\n", + "Epoch 8/20 Iteration 1358/3560 Training loss: 1.3185 0.1220 sec/batch\n", + "Epoch 8/20 Iteration 1359/3560 Training loss: 1.3183 0.1259 sec/batch\n", + "Epoch 8/20 Iteration 1360/3560 Training loss: 1.3181 0.1223 sec/batch\n", + "Epoch 8/20 Iteration 1361/3560 Training loss: 1.3177 0.1271 sec/batch\n", + "Epoch 8/20 Iteration 1362/3560 Training loss: 1.3174 0.1211 sec/batch\n", + "Epoch 8/20 Iteration 1363/3560 Training loss: 1.3173 0.1234 sec/batch\n", + "Epoch 8/20 Iteration 1364/3560 Training loss: 1.3172 0.1222 sec/batch\n", + "Epoch 8/20 Iteration 1365/3560 Training loss: 1.3171 0.1253 sec/batch\n", + "Epoch 8/20 Iteration 1366/3560 Training loss: 1.3169 0.1224 sec/batch\n", + "Epoch 8/20 Iteration 1367/3560 Training loss: 1.3167 0.1225 sec/batch\n", + "Epoch 8/20 Iteration 1368/3560 Training loss: 1.3163 0.1235 sec/batch\n", + "Epoch 8/20 Iteration 1369/3560 Training loss: 1.3158 0.1215 sec/batch\n", + "Epoch 8/20 Iteration 1370/3560 Training loss: 1.3157 0.1251 sec/batch\n", + "Epoch 8/20 
Iteration 1371/3560 Training loss: 1.3156 0.1214 sec/batch\n", + "Epoch 8/20 Iteration 1372/3560 Training loss: 1.3152 0.1239 sec/batch\n", + "Epoch 8/20 Iteration 1373/3560 Training loss: 1.3151 0.1369 sec/batch\n", + "Epoch 8/20 Iteration 1374/3560 Training loss: 1.3149 0.1307 sec/batch\n", + "Epoch 8/20 Iteration 1375/3560 Training loss: 1.3147 0.1251 sec/batch\n", + "Epoch 8/20 Iteration 1376/3560 Training loss: 1.3142 0.1255 sec/batch\n", + "Epoch 8/20 Iteration 1377/3560 Training loss: 1.3137 0.1221 sec/batch\n", + "Epoch 8/20 Iteration 1378/3560 Training loss: 1.3135 0.1249 sec/batch\n", + "Epoch 8/20 Iteration 1379/3560 Training loss: 1.3136 0.1221 sec/batch\n", + "Epoch 8/20 Iteration 1380/3560 Training loss: 1.3136 0.1267 sec/batch\n", + "Epoch 8/20 Iteration 1381/3560 Training loss: 1.3135 0.1238 sec/batch\n", + "Epoch 8/20 Iteration 1382/3560 Training loss: 1.3135 0.1236 sec/batch\n", + "Epoch 8/20 Iteration 1383/3560 Training loss: 1.3136 0.1230 sec/batch\n", + "Epoch 8/20 Iteration 1384/3560 Training loss: 1.3136 0.1227 sec/batch\n", + "Epoch 8/20 Iteration 1385/3560 Training loss: 1.3135 0.1254 sec/batch\n", + "Epoch 8/20 Iteration 1386/3560 Training loss: 1.3135 0.1222 sec/batch\n", + "Epoch 8/20 Iteration 1387/3560 Training loss: 1.3138 0.1235 sec/batch\n", + "Epoch 8/20 Iteration 1388/3560 Training loss: 1.3138 0.1215 sec/batch\n", + "Epoch 8/20 Iteration 1389/3560 Training loss: 1.3137 0.1211 sec/batch\n", + "Epoch 8/20 Iteration 1390/3560 Training loss: 1.3139 0.1225 sec/batch\n", + "Epoch 8/20 Iteration 1391/3560 Training loss: 1.3137 0.1266 sec/batch\n", + "Epoch 8/20 Iteration 1392/3560 Training loss: 1.3137 0.1238 sec/batch\n", + "Epoch 8/20 Iteration 1393/3560 Training loss: 1.3136 0.1222 sec/batch\n", + "Epoch 8/20 Iteration 1394/3560 Training loss: 1.3138 0.1231 sec/batch\n", + "Epoch 8/20 Iteration 1395/3560 Training loss: 1.3138 0.1343 sec/batch\n", + "Epoch 8/20 Iteration 1396/3560 Training loss: 1.3137 0.1213 sec/batch\n", + "Epoch 
8/20 Iteration 1397/3560 Training loss: 1.3133 0.1223 sec/batch\n", + "Epoch 8/20 Iteration 1398/3560 Training loss: 1.3131 0.1230 sec/batch\n", + "Epoch 8/20 Iteration 1399/3560 Training loss: 1.3131 0.1245 sec/batch\n", + "Epoch 8/20 Iteration 1400/3560 Training loss: 1.3130 0.1227 sec/batch\n", + "Epoch 8/20 Iteration 1401/3560 Training loss: 1.3129 0.1214 sec/batch\n", + "Epoch 8/20 Iteration 1402/3560 Training loss: 1.3128 0.1273 sec/batch\n", + "Epoch 8/20 Iteration 1403/3560 Training loss: 1.3127 0.1206 sec/batch\n", + "Epoch 8/20 Iteration 1404/3560 Training loss: 1.3126 0.1218 sec/batch\n", + "Epoch 8/20 Iteration 1405/3560 Training loss: 1.3123 0.1246 sec/batch\n", + "Epoch 8/20 Iteration 1406/3560 Training loss: 1.3124 0.1247 sec/batch\n", + "Epoch 8/20 Iteration 1407/3560 Training loss: 1.3125 0.1317 sec/batch\n", + "Epoch 8/20 Iteration 1408/3560 Training loss: 1.3125 0.1206 sec/batch\n", + "Epoch 8/20 Iteration 1409/3560 Training loss: 1.3125 0.1209 sec/batch\n", + "Epoch 8/20 Iteration 1410/3560 Training loss: 1.3124 0.1267 sec/batch\n", + "Epoch 8/20 Iteration 1411/3560 Training loss: 1.3123 0.1231 sec/batch\n", + "Epoch 8/20 Iteration 1412/3560 Training loss: 1.3122 0.1222 sec/batch\n", + "Epoch 8/20 Iteration 1413/3560 Training loss: 1.3122 0.1247 sec/batch\n", + "Epoch 8/20 Iteration 1414/3560 Training loss: 1.3125 0.1232 sec/batch\n", + "Epoch 8/20 Iteration 1415/3560 Training loss: 1.3125 0.1214 sec/batch\n", + "Epoch 8/20 Iteration 1416/3560 Training loss: 1.3124 0.1227 sec/batch\n", + "Epoch 8/20 Iteration 1417/3560 Training loss: 1.3122 0.1210 sec/batch\n", + "Epoch 8/20 Iteration 1418/3560 Training loss: 1.3121 0.1216 sec/batch\n", + "Epoch 8/20 Iteration 1419/3560 Training loss: 1.3122 0.1224 sec/batch\n", + "Epoch 8/20 Iteration 1420/3560 Training loss: 1.3121 0.1232 sec/batch\n", + "Epoch 8/20 Iteration 1421/3560 Training loss: 1.3121 0.1214 sec/batch\n", + "Epoch 8/20 Iteration 1422/3560 Training loss: 1.3119 0.1223 sec/batch\n", + 
"Epoch 8/20 Iteration 1423/3560 Training loss: 1.3117 0.1263 sec/batch\n", + "Epoch 8/20 Iteration 1424/3560 Training loss: 1.3118 0.1212 sec/batch\n", + "Epoch 9/20 Iteration 1425/3560 Training loss: 1.4248 0.1202 sec/batch\n", + "Epoch 9/20 Iteration 1426/3560 Training loss: 1.3791 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1427/3560 Training loss: 1.3604 0.1243 sec/batch\n", + "Epoch 9/20 Iteration 1428/3560 Training loss: 1.3492 0.1255 sec/batch\n", + "Epoch 9/20 Iteration 1429/3560 Training loss: 1.3373 0.1230 sec/batch\n", + "Epoch 9/20 Iteration 1430/3560 Training loss: 1.3257 0.1235 sec/batch\n", + "Epoch 9/20 Iteration 1431/3560 Training loss: 1.3246 0.1231 sec/batch\n", + "Epoch 9/20 Iteration 1432/3560 Training loss: 1.3214 0.1233 sec/batch\n", + "Epoch 9/20 Iteration 1433/3560 Training loss: 1.3191 0.1238 sec/batch\n", + "Epoch 9/20 Iteration 1434/3560 Training loss: 1.3174 0.1248 sec/batch\n", + "Epoch 9/20 Iteration 1435/3560 Training loss: 1.3135 0.1237 sec/batch\n", + "Epoch 9/20 Iteration 1436/3560 Training loss: 1.3127 0.1244 sec/batch\n", + "Epoch 9/20 Iteration 1437/3560 Training loss: 1.3123 0.1227 sec/batch\n", + "Epoch 9/20 Iteration 1438/3560 Training loss: 1.3128 0.1241 sec/batch\n", + "Epoch 9/20 Iteration 1439/3560 Training loss: 1.3108 0.1213 sec/batch\n", + "Epoch 9/20 Iteration 1440/3560 Training loss: 1.3081 0.1213 sec/batch\n", + "Epoch 9/20 Iteration 1441/3560 Training loss: 1.3080 0.1221 sec/batch\n", + "Epoch 9/20 Iteration 1442/3560 Training loss: 1.3091 0.1224 sec/batch\n", + "Epoch 9/20 Iteration 1443/3560 Training loss: 1.3083 0.1224 sec/batch\n", + "Epoch 9/20 Iteration 1444/3560 Training loss: 1.3086 0.1297 sec/batch\n", + "Epoch 9/20 Iteration 1445/3560 Training loss: 1.3076 0.1228 sec/batch\n", + "Epoch 9/20 Iteration 1446/3560 Training loss: 1.3075 0.1250 sec/batch\n", + "Epoch 9/20 Iteration 1447/3560 Training loss: 1.3062 0.1236 sec/batch\n", + "Epoch 9/20 Iteration 1448/3560 Training loss: 1.3058 0.1247 sec/batch\n", 
+ "Epoch 9/20 Iteration 1449/3560 Training loss: 1.3050 0.1212 sec/batch\n", + "Epoch 9/20 Iteration 1450/3560 Training loss: 1.3026 0.1249 sec/batch\n", + "Epoch 9/20 Iteration 1451/3560 Training loss: 1.3009 0.1233 sec/batch\n", + "Epoch 9/20 Iteration 1452/3560 Training loss: 1.3010 0.1301 sec/batch\n", + "Epoch 9/20 Iteration 1453/3560 Training loss: 1.3010 0.1253 sec/batch\n", + "Epoch 9/20 Iteration 1454/3560 Training loss: 1.3013 0.1216 sec/batch\n", + "Epoch 9/20 Iteration 1455/3560 Training loss: 1.3002 0.1218 sec/batch\n", + "Epoch 9/20 Iteration 1456/3560 Training loss: 1.2989 0.1246 sec/batch\n", + "Epoch 9/20 Iteration 1457/3560 Training loss: 1.2989 0.1263 sec/batch\n", + "Epoch 9/20 Iteration 1458/3560 Training loss: 1.2989 0.1218 sec/batch\n", + "Epoch 9/20 Iteration 1459/3560 Training loss: 1.2984 0.1211 sec/batch\n", + "Epoch 9/20 Iteration 1460/3560 Training loss: 1.2983 0.1233 sec/batch\n", + "Epoch 9/20 Iteration 1461/3560 Training loss: 1.2973 0.1217 sec/batch\n", + "Epoch 9/20 Iteration 1462/3560 Training loss: 1.2962 0.1214 sec/batch\n", + "Epoch 9/20 Iteration 1463/3560 Training loss: 1.2948 0.1210 sec/batch\n", + "Epoch 9/20 Iteration 1464/3560 Training loss: 1.2942 0.1225 sec/batch\n", + "Epoch 9/20 Iteration 1465/3560 Training loss: 1.2935 0.1236 sec/batch\n", + "Epoch 9/20 Iteration 1466/3560 Training loss: 1.2941 0.1232 sec/batch\n", + "Epoch 9/20 Iteration 1467/3560 Training loss: 1.2937 0.1266 sec/batch\n", + "Epoch 9/20 Iteration 1468/3560 Training loss: 1.2930 0.1290 sec/batch\n", + "Epoch 9/20 Iteration 1469/3560 Training loss: 1.2930 0.1275 sec/batch\n", + "Epoch 9/20 Iteration 1470/3560 Training loss: 1.2923 0.1224 sec/batch\n", + "Epoch 9/20 Iteration 1471/3560 Training loss: 1.2920 0.1225 sec/batch\n", + "Epoch 9/20 Iteration 1472/3560 Training loss: 1.2916 0.1236 sec/batch\n", + "Epoch 9/20 Iteration 1473/3560 Training loss: 1.2914 0.1215 sec/batch\n", + "Epoch 9/20 Iteration 1474/3560 Training loss: 1.2914 0.1210 
sec/batch\n", + "Epoch 9/20 Iteration 1475/3560 Training loss: 1.2910 0.1228 sec/batch\n", + "Epoch 9/20 Iteration 1476/3560 Training loss: 1.2915 0.1216 sec/batch\n", + "Epoch 9/20 Iteration 1477/3560 Training loss: 1.2914 0.1220 sec/batch\n", + "Epoch 9/20 Iteration 1478/3560 Training loss: 1.2915 0.1242 sec/batch\n", + "Epoch 9/20 Iteration 1479/3560 Training loss: 1.2913 0.1228 sec/batch\n", + "Epoch 9/20 Iteration 1480/3560 Training loss: 1.2913 0.1255 sec/batch\n", + "Epoch 9/20 Iteration 1481/3560 Training loss: 1.2915 0.1216 sec/batch\n", + "Epoch 9/20 Iteration 1482/3560 Training loss: 1.2910 0.1211 sec/batch\n", + "Epoch 9/20 Iteration 1483/3560 Training loss: 1.2905 0.1231 sec/batch\n", + "Epoch 9/20 Iteration 1484/3560 Training loss: 1.2911 0.1259 sec/batch\n", + "Epoch 9/20 Iteration 1485/3560 Training loss: 1.2911 0.1306 sec/batch\n", + "Epoch 9/20 Iteration 1486/3560 Training loss: 1.2917 0.1252 sec/batch\n", + "Epoch 9/20 Iteration 1487/3560 Training loss: 1.2919 0.1255 sec/batch\n", + "Epoch 9/20 Iteration 1488/3560 Training loss: 1.2919 0.1317 sec/batch\n", + "Epoch 9/20 Iteration 1489/3560 Training loss: 1.2916 0.1246 sec/batch\n", + "Epoch 9/20 Iteration 1490/3560 Training loss: 1.2918 0.1248 sec/batch\n", + "Epoch 9/20 Iteration 1491/3560 Training loss: 1.2919 0.1239 sec/batch\n", + "Epoch 9/20 Iteration 1492/3560 Training loss: 1.2916 0.1223 sec/batch\n", + "Epoch 9/20 Iteration 1493/3560 Training loss: 1.2915 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1494/3560 Training loss: 1.2914 0.1246 sec/batch\n", + "Epoch 9/20 Iteration 1495/3560 Training loss: 1.2917 0.1211 sec/batch\n", + "Epoch 9/20 Iteration 1496/3560 Training loss: 1.2918 0.1246 sec/batch\n", + "Epoch 9/20 Iteration 1497/3560 Training loss: 1.2921 0.1221 sec/batch\n", + "Epoch 9/20 Iteration 1498/3560 Training loss: 1.2917 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1499/3560 Training loss: 1.2915 0.1216 sec/batch\n", + "Epoch 9/20 Iteration 1500/3560 Training loss: 1.2916 
0.1206 sec/batch\n", + "Epoch 9/20 Iteration 1501/3560 Training loss: 1.2915 0.1247 sec/batch\n", + "Epoch 9/20 Iteration 1502/3560 Training loss: 1.2913 0.1275 sec/batch\n", + "Epoch 9/20 Iteration 1503/3560 Training loss: 1.2906 0.1253 sec/batch\n", + "Epoch 9/20 Iteration 1504/3560 Training loss: 1.2906 0.1228 sec/batch\n", + "Epoch 9/20 Iteration 1505/3560 Training loss: 1.2901 0.1258 sec/batch\n", + "Epoch 9/20 Iteration 1506/3560 Training loss: 1.2899 0.1211 sec/batch\n", + "Epoch 9/20 Iteration 1507/3560 Training loss: 1.2894 0.1263 sec/batch\n", + "Epoch 9/20 Iteration 1508/3560 Training loss: 1.2894 0.1221 sec/batch\n", + "Epoch 9/20 Iteration 1509/3560 Training loss: 1.2890 0.1242 sec/batch\n", + "Epoch 9/20 Iteration 1510/3560 Training loss: 1.2889 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1511/3560 Training loss: 1.2886 0.1242 sec/batch\n", + "Epoch 9/20 Iteration 1512/3560 Training loss: 1.2884 0.1239 sec/batch\n", + "Epoch 9/20 Iteration 1513/3560 Training loss: 1.2880 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1514/3560 Training loss: 1.2880 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1515/3560 Training loss: 1.2877 0.1219 sec/batch\n", + "Epoch 9/20 Iteration 1516/3560 Training loss: 1.2876 0.1209 sec/batch\n", + "Epoch 9/20 Iteration 1517/3560 Training loss: 1.2871 0.1259 sec/batch\n", + "Epoch 9/20 Iteration 1518/3560 Training loss: 1.2867 0.1228 sec/batch\n", + "Epoch 9/20 Iteration 1519/3560 Training loss: 1.2864 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1520/3560 Training loss: 1.2865 0.1237 sec/batch\n", + "Epoch 9/20 Iteration 1521/3560 Training loss: 1.2863 0.1230 sec/batch\n", + "Epoch 9/20 Iteration 1522/3560 Training loss: 1.2859 0.1239 sec/batch\n", + "Epoch 9/20 Iteration 1523/3560 Training loss: 1.2855 0.1220 sec/batch\n", + "Epoch 9/20 Iteration 1524/3560 Training loss: 1.2852 0.1239 sec/batch\n", + "Epoch 9/20 Iteration 1525/3560 Training loss: 1.2851 0.1230 sec/batch\n", + "Epoch 9/20 Iteration 1526/3560 Training loss: 
1.2849 0.1223 sec/batch\n", + "Epoch 9/20 Iteration 1527/3560 Training loss: 1.2847 0.1224 sec/batch\n", + "Epoch 9/20 Iteration 1528/3560 Training loss: 1.2846 0.1216 sec/batch\n", + "Epoch 9/20 Iteration 1529/3560 Training loss: 1.2843 0.1251 sec/batch\n", + "Epoch 9/20 Iteration 1530/3560 Training loss: 1.2841 0.1215 sec/batch\n", + "Epoch 9/20 Iteration 1531/3560 Training loss: 1.2840 0.1251 sec/batch\n", + "Epoch 9/20 Iteration 1532/3560 Training loss: 1.2839 0.1311 sec/batch\n", + "Epoch 9/20 Iteration 1533/3560 Training loss: 1.2837 0.1214 sec/batch\n", + "Epoch 9/20 Iteration 1534/3560 Training loss: 1.2837 0.1225 sec/batch\n", + "Epoch 9/20 Iteration 1535/3560 Training loss: 1.2835 0.1248 sec/batch\n", + "Epoch 9/20 Iteration 1536/3560 Training loss: 1.2835 0.1244 sec/batch\n", + "Epoch 9/20 Iteration 1537/3560 Training loss: 1.2833 0.1433 sec/batch\n", + "Epoch 9/20 Iteration 1538/3560 Training loss: 1.2830 0.1245 sec/batch\n", + "Epoch 9/20 Iteration 1539/3560 Training loss: 1.2826 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1540/3560 Training loss: 1.2822 0.1254 sec/batch\n", + "Epoch 9/20 Iteration 1541/3560 Training loss: 1.2822 0.1225 sec/batch\n", + "Epoch 9/20 Iteration 1542/3560 Training loss: 1.2822 0.1220 sec/batch\n", + "Epoch 9/20 Iteration 1543/3560 Training loss: 1.2820 0.1270 sec/batch\n", + "Epoch 9/20 Iteration 1544/3560 Training loss: 1.2818 0.1304 sec/batch\n", + "Epoch 9/20 Iteration 1545/3560 Training loss: 1.2817 0.1220 sec/batch\n", + "Epoch 9/20 Iteration 1546/3560 Training loss: 1.2813 0.1220 sec/batch\n", + "Epoch 9/20 Iteration 1547/3560 Training loss: 1.2809 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1548/3560 Training loss: 1.2808 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1549/3560 Training loss: 1.2807 0.1211 sec/batch\n", + "Epoch 9/20 Iteration 1550/3560 Training loss: 1.2802 0.1286 sec/batch\n", + "Epoch 9/20 Iteration 1551/3560 Training loss: 1.2802 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1552/3560 Training 
loss: 1.2801 0.1236 sec/batch\n", + "Epoch 9/20 Iteration 1553/3560 Training loss: 1.2800 0.1233 sec/batch\n", + "Epoch 9/20 Iteration 1554/3560 Training loss: 1.2796 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1555/3560 Training loss: 1.2791 0.1215 sec/batch\n", + "Epoch 9/20 Iteration 1556/3560 Training loss: 1.2789 0.1220 sec/batch\n", + "Epoch 9/20 Iteration 1557/3560 Training loss: 1.2790 0.1242 sec/batch\n", + "Epoch 9/20 Iteration 1558/3560 Training loss: 1.2790 0.1256 sec/batch\n", + "Epoch 9/20 Iteration 1559/3560 Training loss: 1.2789 0.1223 sec/batch\n", + "Epoch 9/20 Iteration 1560/3560 Training loss: 1.2789 0.1216 sec/batch\n", + "Epoch 9/20 Iteration 1561/3560 Training loss: 1.2790 0.1224 sec/batch\n", + "Epoch 9/20 Iteration 1562/3560 Training loss: 1.2790 0.1216 sec/batch\n", + "Epoch 9/20 Iteration 1563/3560 Training loss: 1.2789 0.1210 sec/batch\n", + "Epoch 9/20 Iteration 1564/3560 Training loss: 1.2789 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1565/3560 Training loss: 1.2793 0.1244 sec/batch\n", + "Epoch 9/20 Iteration 1566/3560 Training loss: 1.2793 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1567/3560 Training loss: 1.2791 0.1227 sec/batch\n", + "Epoch 9/20 Iteration 1568/3560 Training loss: 1.2793 0.1213 sec/batch\n", + "Epoch 9/20 Iteration 1569/3560 Training loss: 1.2791 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1570/3560 Training loss: 1.2792 0.1316 sec/batch\n", + "Epoch 9/20 Iteration 1571/3560 Training loss: 1.2792 0.1246 sec/batch\n", + "Epoch 9/20 Iteration 1572/3560 Training loss: 1.2794 0.1275 sec/batch\n", + "Epoch 9/20 Iteration 1573/3560 Training loss: 1.2795 0.1246 sec/batch\n", + "Epoch 9/20 Iteration 1574/3560 Training loss: 1.2794 0.1233 sec/batch\n", + "Epoch 9/20 Iteration 1575/3560 Training loss: 1.2791 0.1237 sec/batch\n", + "Epoch 9/20 Iteration 1576/3560 Training loss: 1.2789 0.1307 sec/batch\n", + "Epoch 9/20 Iteration 1577/3560 Training loss: 1.2790 0.1263 sec/batch\n", + "Epoch 9/20 Iteration 1578/3560 
Training loss: 1.2790 0.1243 sec/batch\n", + "Epoch 9/20 Iteration 1579/3560 Training loss: 1.2789 0.1261 sec/batch\n", + "Epoch 9/20 Iteration 1580/3560 Training loss: 1.2788 0.1246 sec/batch\n", + "Epoch 9/20 Iteration 1581/3560 Training loss: 1.2788 0.1234 sec/batch\n", + "Epoch 9/20 Iteration 1582/3560 Training loss: 1.2787 0.1224 sec/batch\n", + "Epoch 9/20 Iteration 1583/3560 Training loss: 1.2785 0.1236 sec/batch\n", + "Epoch 9/20 Iteration 1584/3560 Training loss: 1.2785 0.1213 sec/batch\n", + "Epoch 9/20 Iteration 1585/3560 Training loss: 1.2786 0.1249 sec/batch\n", + "Epoch 9/20 Iteration 1586/3560 Training loss: 1.2786 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1587/3560 Training loss: 1.2785 0.1216 sec/batch\n", + "Epoch 9/20 Iteration 1588/3560 Training loss: 1.2785 0.1239 sec/batch\n", + "Epoch 9/20 Iteration 1589/3560 Training loss: 1.2784 0.1235 sec/batch\n", + "Epoch 9/20 Iteration 1590/3560 Training loss: 1.2784 0.1213 sec/batch\n", + "Epoch 9/20 Iteration 1591/3560 Training loss: 1.2785 0.1228 sec/batch\n", + "Epoch 9/20 Iteration 1592/3560 Training loss: 1.2788 0.1249 sec/batch\n", + "Epoch 9/20 Iteration 1593/3560 Training loss: 1.2788 0.1234 sec/batch\n", + "Epoch 9/20 Iteration 1594/3560 Training loss: 1.2788 0.1228 sec/batch\n", + "Epoch 9/20 Iteration 1595/3560 Training loss: 1.2787 0.1254 sec/batch\n", + "Epoch 9/20 Iteration 1596/3560 Training loss: 1.2786 0.1220 sec/batch\n", + "Epoch 9/20 Iteration 1597/3560 Training loss: 1.2788 0.1209 sec/batch\n", + "Epoch 9/20 Iteration 1598/3560 Training loss: 1.2788 0.1274 sec/batch\n", + "Epoch 9/20 Iteration 1599/3560 Training loss: 1.2788 0.1224 sec/batch\n", + "Epoch 9/20 Iteration 1600/3560 Training loss: 1.2787 0.1237 sec/batch\n", + "Epoch 9/20 Iteration 1601/3560 Training loss: 1.2786 0.1239 sec/batch\n", + "Epoch 9/20 Iteration 1602/3560 Training loss: 1.2787 0.1248 sec/batch\n", + "Epoch 10/20 Iteration 1603/3560 Training loss: 1.3999 0.1231 sec/batch\n", + "Epoch 10/20 Iteration 
1604/3560 Training loss: 1.3420 0.1280 sec/batch\n", + "Epoch 10/20 Iteration 1605/3560 Training loss: 1.3216 0.1222 sec/batch\n", + "Epoch 10/20 Iteration 1606/3560 Training loss: 1.3129 0.1272 sec/batch\n", + "Epoch 10/20 Iteration 1607/3560 Training loss: 1.3003 0.1221 sec/batch\n", + "Epoch 10/20 Iteration 1608/3560 Training loss: 1.2881 0.1296 sec/batch\n", + "Epoch 10/20 Iteration 1609/3560 Training loss: 1.2869 0.1238 sec/batch\n", + "Epoch 10/20 Iteration 1610/3560 Training loss: 1.2847 0.1324 sec/batch\n", + "Epoch 10/20 Iteration 1611/3560 Training loss: 1.2840 0.1223 sec/batch\n", + "Epoch 10/20 Iteration 1612/3560 Training loss: 1.2825 0.1235 sec/batch\n", + "Epoch 10/20 Iteration 1613/3560 Training loss: 1.2795 0.1278 sec/batch\n", + "Epoch 10/20 Iteration 1614/3560 Training loss: 1.2779 0.1237 sec/batch\n", + "Epoch 10/20 Iteration 1615/3560 Training loss: 1.2771 0.1227 sec/batch\n", + "Epoch 10/20 Iteration 1616/3560 Training loss: 1.2775 0.1236 sec/batch\n", + "Epoch 10/20 Iteration 1617/3560 Training loss: 1.2759 0.1250 sec/batch\n", + "Epoch 10/20 Iteration 1618/3560 Training loss: 1.2737 0.1217 sec/batch\n", + "Epoch 10/20 Iteration 1619/3560 Training loss: 1.2731 0.1224 sec/batch\n", + "Epoch 10/20 Iteration 1620/3560 Training loss: 1.2741 0.1222 sec/batch\n", + "Epoch 10/20 Iteration 1621/3560 Training loss: 1.2736 0.1223 sec/batch\n", + "Epoch 10/20 Iteration 1622/3560 Training loss: 1.2741 0.1227 sec/batch\n", + "Epoch 10/20 Iteration 1623/3560 Training loss: 1.2738 0.1219 sec/batch\n", + "Epoch 10/20 Iteration 1624/3560 Training loss: 1.2739 0.1244 sec/batch\n", + "Epoch 10/20 Iteration 1625/3560 Training loss: 1.2731 0.1255 sec/batch\n", + "Epoch 10/20 Iteration 1626/3560 Training loss: 1.2734 0.1256 sec/batch\n", + "Epoch 10/20 Iteration 1627/3560 Training loss: 1.2729 0.1257 sec/batch\n", + "Epoch 10/20 Iteration 1628/3560 Training loss: 1.2707 0.1221 sec/batch\n", + "Epoch 10/20 Iteration 1629/3560 Training loss: 1.2692 0.1212 
sec/batch\n", + "Epoch 10/20 Iteration 1630/3560 Training loss: 1.2692 0.1247 sec/batch\n", + "Epoch 10/20 Iteration 1631/3560 Training loss: 1.2693 0.1261 sec/batch\n", + "Epoch 10/20 Iteration 1632/3560 Training loss: 1.2697 0.1215 sec/batch\n", + "Epoch 10/20 Iteration 1633/3560 Training loss: 1.2689 0.1258 sec/batch\n", + "Epoch 10/20 Iteration 1634/3560 Training loss: 1.2677 0.1221 sec/batch\n", + "Epoch 10/20 Iteration 1635/3560 Training loss: 1.2681 0.1213 sec/batch\n", + "Epoch 10/20 Iteration 1636/3560 Training loss: 1.2682 0.1213 sec/batch\n", + "Epoch 10/20 Iteration 1637/3560 Training loss: 1.2679 0.1233 sec/batch\n", + "Epoch 10/20 Iteration 1638/3560 Training loss: 1.2676 0.1249 sec/batch\n", + "Epoch 10/20 Iteration 1639/3560 Training loss: 1.2668 0.1213 sec/batch\n", + "Epoch 10/20 Iteration 1640/3560 Training loss: 1.2658 0.1228 sec/batch\n", + "Epoch 10/20 Iteration 1641/3560 Training loss: 1.2648 0.1215 sec/batch\n", + "Epoch 10/20 Iteration 1642/3560 Training loss: 1.2645 0.1281 sec/batch\n", + "Epoch 10/20 Iteration 1643/3560 Training loss: 1.2638 0.1225 sec/batch\n", + "Epoch 10/20 Iteration 1644/3560 Training loss: 1.2645 0.1208 sec/batch\n", + "Epoch 10/20 Iteration 1645/3560 Training loss: 1.2641 0.1240 sec/batch\n", + "Epoch 10/20 Iteration 1646/3560 Training loss: 1.2637 0.1247 sec/batch\n", + "Epoch 10/20 Iteration 1647/3560 Training loss: 1.2640 0.1217 sec/batch\n", + "Epoch 10/20 Iteration 1648/3560 Training loss: 1.2632 0.1222 sec/batch\n", + "Epoch 10/20 Iteration 1649/3560 Training loss: 1.2627 0.1226 sec/batch\n", + "Epoch 10/20 Iteration 1650/3560 Training loss: 1.2625 0.1234 sec/batch\n", + "Epoch 10/20 Iteration 1651/3560 Training loss: 1.2625 0.1241 sec/batch\n", + "Epoch 10/20 Iteration 1652/3560 Training loss: 1.2626 0.1219 sec/batch\n", + "Epoch 10/20 Iteration 1653/3560 Training loss: 1.2620 0.1223 sec/batch\n", + "Epoch 10/20 Iteration 1654/3560 Training loss: 1.2626 0.1215 sec/batch\n", + "Epoch 10/20 Iteration 1655/3560 
Training loss: 1.2625 0.1209 sec/batch\n", + "... [training log condensed: Epoch 10/20 iterations 1656-1780, loss 1.2626 -> 1.2519; Epoch 11/20 iterations 1781-1958, loss 1.3962 -> 1.2289; Epoch 12/20 iterations 1959-2136, loss 1.3796 -> 1.2112; Epoch 13/20 iterations 2137-2166, loss 1.3482 -> 1.2152; ~0.12-0.13 sec/batch throughout] ...\n", + "Epoch 13/20 Iteration 2167/3560 
Training loss: 1.2142 0.1258 sec/batch\n", + "Epoch 13/20 Iteration 2168/3560 Training loss: 1.2128 0.1243 sec/batch\n", + "Epoch 13/20 Iteration 2169/3560 Training loss: 1.2129 0.1223 sec/batch\n", + "Epoch 13/20 Iteration 2170/3560 Training loss: 1.2128 0.1221 sec/batch\n", + "Epoch 13/20 Iteration 2171/3560 Training loss: 1.2125 0.1210 sec/batch\n", + "Epoch 13/20 Iteration 2172/3560 Training loss: 1.2123 0.1266 sec/batch\n", + "Epoch 13/20 Iteration 2173/3560 Training loss: 1.2115 0.1221 sec/batch\n", + "Epoch 13/20 Iteration 2174/3560 Training loss: 1.2103 0.1226 sec/batch\n", + "Epoch 13/20 Iteration 2175/3560 Training loss: 1.2090 0.1235 sec/batch\n", + "Epoch 13/20 Iteration 2176/3560 Training loss: 1.2086 0.1236 sec/batch\n", + "Epoch 13/20 Iteration 2177/3560 Training loss: 1.2077 0.1237 sec/batch\n", + "Epoch 13/20 Iteration 2178/3560 Training loss: 1.2086 0.1244 sec/batch\n", + "Epoch 13/20 Iteration 2179/3560 Training loss: 1.2083 0.1263 sec/batch\n", + "Epoch 13/20 Iteration 2180/3560 Training loss: 1.2074 0.1233 sec/batch\n", + "Epoch 13/20 Iteration 2181/3560 Training loss: 1.2074 0.1268 sec/batch\n", + "Epoch 13/20 Iteration 2182/3560 Training loss: 1.2066 0.1246 sec/batch\n", + "Epoch 13/20 Iteration 2183/3560 Training loss: 1.2062 0.1248 sec/batch\n", + "Epoch 13/20 Iteration 2184/3560 Training loss: 1.2057 0.1224 sec/batch\n", + "Epoch 13/20 Iteration 2185/3560 Training loss: 1.2055 0.1257 sec/batch\n", + "Epoch 13/20 Iteration 2186/3560 Training loss: 1.2059 0.1231 sec/batch\n", + "Epoch 13/20 Iteration 2187/3560 Training loss: 1.2055 0.1275 sec/batch\n", + "Epoch 13/20 Iteration 2188/3560 Training loss: 1.2061 0.1238 sec/batch\n", + "Epoch 13/20 Iteration 2189/3560 Training loss: 1.2059 0.1220 sec/batch\n", + "Epoch 13/20 Iteration 2190/3560 Training loss: 1.2062 0.1221 sec/batch\n", + "Epoch 13/20 Iteration 2191/3560 Training loss: 1.2060 0.1260 sec/batch\n", + "Epoch 13/20 Iteration 2192/3560 Training loss: 1.2058 0.1223 sec/batch\n", + 
"Epoch 13/20 Iteration 2193/3560 Training loss: 1.2060 0.1279 sec/batch\n", + "Epoch 13/20 Iteration 2194/3560 Training loss: 1.2056 0.1266 sec/batch\n", + "Epoch 13/20 Iteration 2195/3560 Training loss: 1.2050 0.1308 sec/batch\n", + "Epoch 13/20 Iteration 2196/3560 Training loss: 1.2054 0.1229 sec/batch\n", + "Epoch 13/20 Iteration 2197/3560 Training loss: 1.2054 0.1243 sec/batch\n", + "Epoch 13/20 Iteration 2198/3560 Training loss: 1.2061 0.1269 sec/batch\n", + "Epoch 13/20 Iteration 2199/3560 Training loss: 1.2062 0.1245 sec/batch\n", + "Epoch 13/20 Iteration 2200/3560 Training loss: 1.2063 0.1218 sec/batch\n", + "Epoch 13/20 Iteration 2201/3560 Training loss: 1.2062 0.1285 sec/batch\n", + "Epoch 13/20 Iteration 2202/3560 Training loss: 1.2063 0.1249 sec/batch\n", + "Epoch 13/20 Iteration 2203/3560 Training loss: 1.2065 0.1256 sec/batch\n", + "Epoch 13/20 Iteration 2204/3560 Training loss: 1.2062 0.1222 sec/batch\n", + "Epoch 13/20 Iteration 2205/3560 Training loss: 1.2063 0.1234 sec/batch\n", + "Epoch 13/20 Iteration 2206/3560 Training loss: 1.2061 0.1228 sec/batch\n", + "Epoch 13/20 Iteration 2207/3560 Training loss: 1.2066 0.1230 sec/batch\n", + "Epoch 13/20 Iteration 2208/3560 Training loss: 1.2067 0.1226 sec/batch\n", + "Epoch 13/20 Iteration 2209/3560 Training loss: 1.2070 0.1238 sec/batch\n", + "Epoch 13/20 Iteration 2210/3560 Training loss: 1.2066 0.1238 sec/batch\n", + "Epoch 13/20 Iteration 2211/3560 Training loss: 1.2064 0.1223 sec/batch\n", + "Epoch 13/20 Iteration 2212/3560 Training loss: 1.2065 0.1213 sec/batch\n", + "Epoch 13/20 Iteration 2213/3560 Training loss: 1.2065 0.1235 sec/batch\n", + "Epoch 13/20 Iteration 2214/3560 Training loss: 1.2064 0.1226 sec/batch\n", + "Epoch 13/20 Iteration 2215/3560 Training loss: 1.2057 0.1242 sec/batch\n", + "Epoch 13/20 Iteration 2216/3560 Training loss: 1.2057 0.1242 sec/batch\n", + "Epoch 13/20 Iteration 2217/3560 Training loss: 1.2053 0.1223 sec/batch\n", + "Epoch 13/20 Iteration 2218/3560 Training loss: 
1.2052 0.1215 sec/batch\n", + "Epoch 13/20 Iteration 2219/3560 Training loss: 1.2048 0.1244 sec/batch\n", + "Epoch 13/20 Iteration 2220/3560 Training loss: 1.2046 0.1229 sec/batch\n", + "Epoch 13/20 Iteration 2221/3560 Training loss: 1.2042 0.1226 sec/batch\n", + "Epoch 13/20 Iteration 2222/3560 Training loss: 1.2040 0.1274 sec/batch\n", + "Epoch 13/20 Iteration 2223/3560 Training loss: 1.2039 0.1226 sec/batch\n", + "Epoch 13/20 Iteration 2224/3560 Training loss: 1.2037 0.1213 sec/batch\n", + "Epoch 13/20 Iteration 2225/3560 Training loss: 1.2034 0.1229 sec/batch\n", + "Epoch 13/20 Iteration 2226/3560 Training loss: 1.2033 0.1228 sec/batch\n", + "Epoch 13/20 Iteration 2227/3560 Training loss: 1.2029 0.1226 sec/batch\n", + "Epoch 13/20 Iteration 2228/3560 Training loss: 1.2028 0.1226 sec/batch\n", + "Epoch 13/20 Iteration 2229/3560 Training loss: 1.2024 0.1223 sec/batch\n", + "Epoch 13/20 Iteration 2230/3560 Training loss: 1.2019 0.1211 sec/batch\n", + "Epoch 13/20 Iteration 2231/3560 Training loss: 1.2016 0.1228 sec/batch\n", + "Epoch 13/20 Iteration 2232/3560 Training loss: 1.2016 0.1259 sec/batch\n", + "Epoch 13/20 Iteration 2233/3560 Training loss: 1.2015 0.1252 sec/batch\n", + "Epoch 13/20 Iteration 2234/3560 Training loss: 1.2012 0.1221 sec/batch\n", + "Epoch 13/20 Iteration 2235/3560 Training loss: 1.2009 0.1225 sec/batch\n", + "Epoch 13/20 Iteration 2236/3560 Training loss: 1.2005 0.1223 sec/batch\n", + "Epoch 13/20 Iteration 2237/3560 Training loss: 1.2005 0.1218 sec/batch\n", + "Epoch 13/20 Iteration 2238/3560 Training loss: 1.2003 0.1294 sec/batch\n", + "Epoch 13/20 Iteration 2239/3560 Training loss: 1.2002 0.1232 sec/batch\n", + "Epoch 13/20 Iteration 2240/3560 Training loss: 1.2001 0.1244 sec/batch\n", + "Epoch 13/20 Iteration 2241/3560 Training loss: 1.1999 0.1226 sec/batch\n", + "Epoch 13/20 Iteration 2242/3560 Training loss: 1.1998 0.1228 sec/batch\n", + "Epoch 13/20 Iteration 2243/3560 Training loss: 1.1997 0.1247 sec/batch\n", + "Epoch 13/20 
Iteration 2244/3560 Training loss: 1.1997 0.1223 sec/batch\n", + "Epoch 13/20 Iteration 2245/3560 Training loss: 1.1995 0.1222 sec/batch\n", + "Epoch 13/20 Iteration 2246/3560 Training loss: 1.1995 0.1218 sec/batch\n", + "Epoch 13/20 Iteration 2247/3560 Training loss: 1.1993 0.1214 sec/batch\n", + "Epoch 13/20 Iteration 2248/3560 Training loss: 1.1993 0.1225 sec/batch\n", + "Epoch 13/20 Iteration 2249/3560 Training loss: 1.1993 0.1245 sec/batch\n", + "Epoch 13/20 Iteration 2250/3560 Training loss: 1.1991 0.1231 sec/batch\n", + "Epoch 13/20 Iteration 2251/3560 Training loss: 1.1988 0.1224 sec/batch\n", + "Epoch 13/20 Iteration 2252/3560 Training loss: 1.1985 0.1223 sec/batch\n", + "Epoch 13/20 Iteration 2253/3560 Training loss: 1.1985 0.1220 sec/batch\n", + "Epoch 13/20 Iteration 2254/3560 Training loss: 1.1984 0.1223 sec/batch\n", + "Epoch 13/20 Iteration 2255/3560 Training loss: 1.1982 0.1215 sec/batch\n", + "Epoch 13/20 Iteration 2256/3560 Training loss: 1.1982 0.1253 sec/batch\n", + "Epoch 13/20 Iteration 2257/3560 Training loss: 1.1981 0.1332 sec/batch\n", + "Epoch 13/20 Iteration 2258/3560 Training loss: 1.1978 0.1224 sec/batch\n", + "Epoch 13/20 Iteration 2259/3560 Training loss: 1.1974 0.1227 sec/batch\n", + "Epoch 13/20 Iteration 2260/3560 Training loss: 1.1974 0.1252 sec/batch\n", + "Epoch 13/20 Iteration 2261/3560 Training loss: 1.1973 0.1288 sec/batch\n", + "Epoch 13/20 Iteration 2262/3560 Training loss: 1.1968 0.1214 sec/batch\n", + "Epoch 13/20 Iteration 2263/3560 Training loss: 1.1968 0.1231 sec/batch\n", + "Epoch 13/20 Iteration 2264/3560 Training loss: 1.1967 0.1249 sec/batch\n", + "Epoch 13/20 Iteration 2265/3560 Training loss: 1.1964 0.1254 sec/batch\n", + "Epoch 13/20 Iteration 2266/3560 Training loss: 1.1960 0.1246 sec/batch\n", + "Epoch 13/20 Iteration 2267/3560 Training loss: 1.1955 0.1276 sec/batch\n", + "Epoch 13/20 Iteration 2268/3560 Training loss: 1.1954 0.1239 sec/batch\n", + "Epoch 13/20 Iteration 2269/3560 Training loss: 1.1955 0.1243 
sec/batch\n", + "Epoch 13/20 Iteration 2270/3560 Training loss: 1.1955 0.1209 sec/batch\n", + "Epoch 13/20 Iteration 2271/3560 Training loss: 1.1955 0.1269 sec/batch\n", + "Epoch 13/20 Iteration 2272/3560 Training loss: 1.1955 0.1210 sec/batch\n", + "Epoch 13/20 Iteration 2273/3560 Training loss: 1.1957 0.1209 sec/batch\n", + "Epoch 13/20 Iteration 2274/3560 Training loss: 1.1957 0.1237 sec/batch\n", + "Epoch 13/20 Iteration 2275/3560 Training loss: 1.1957 0.1309 sec/batch\n", + "Epoch 13/20 Iteration 2276/3560 Training loss: 1.1957 0.1246 sec/batch\n", + "Epoch 13/20 Iteration 2277/3560 Training loss: 1.1960 0.1228 sec/batch\n", + "Epoch 13/20 Iteration 2278/3560 Training loss: 1.1961 0.1231 sec/batch\n", + "Epoch 13/20 Iteration 2279/3560 Training loss: 1.1960 0.1263 sec/batch\n", + "Epoch 13/20 Iteration 2280/3560 Training loss: 1.1962 0.1262 sec/batch\n", + "Epoch 13/20 Iteration 2281/3560 Training loss: 1.1961 0.1284 sec/batch\n", + "Epoch 13/20 Iteration 2282/3560 Training loss: 1.1962 0.1208 sec/batch\n", + "Epoch 13/20 Iteration 2283/3560 Training loss: 1.1963 0.1216 sec/batch\n", + "Epoch 13/20 Iteration 2284/3560 Training loss: 1.1965 0.1225 sec/batch\n", + "Epoch 13/20 Iteration 2285/3560 Training loss: 1.1966 0.1242 sec/batch\n", + "Epoch 13/20 Iteration 2286/3560 Training loss: 1.1965 0.1247 sec/batch\n", + "Epoch 13/20 Iteration 2287/3560 Training loss: 1.1963 0.1246 sec/batch\n", + "Epoch 13/20 Iteration 2288/3560 Training loss: 1.1961 0.1233 sec/batch\n", + "Epoch 13/20 Iteration 2289/3560 Training loss: 1.1962 0.1246 sec/batch\n", + "Epoch 13/20 Iteration 2290/3560 Training loss: 1.1961 0.1220 sec/batch\n", + "Epoch 13/20 Iteration 2291/3560 Training loss: 1.1961 0.1237 sec/batch\n", + "Epoch 13/20 Iteration 2292/3560 Training loss: 1.1961 0.1300 sec/batch\n", + "Epoch 13/20 Iteration 2293/3560 Training loss: 1.1961 0.1238 sec/batch\n", + "Epoch 13/20 Iteration 2294/3560 Training loss: 1.1960 0.1244 sec/batch\n", + "Epoch 13/20 Iteration 2295/3560 
Training loss: 1.1958 0.1268 sec/batch\n", + "Epoch 13/20 Iteration 2296/3560 Training loss: 1.1959 0.1223 sec/batch\n", + "Epoch 13/20 Iteration 2297/3560 Training loss: 1.1960 0.1231 sec/batch\n", + "Epoch 13/20 Iteration 2298/3560 Training loss: 1.1960 0.1260 sec/batch\n", + "Epoch 13/20 Iteration 2299/3560 Training loss: 1.1959 0.1213 sec/batch\n", + "Epoch 13/20 Iteration 2300/3560 Training loss: 1.1959 0.1231 sec/batch\n", + "Epoch 13/20 Iteration 2301/3560 Training loss: 1.1958 0.1221 sec/batch\n", + "Epoch 13/20 Iteration 2302/3560 Training loss: 1.1957 0.1247 sec/batch\n", + "Epoch 13/20 Iteration 2303/3560 Training loss: 1.1959 0.1223 sec/batch\n", + "Epoch 13/20 Iteration 2304/3560 Training loss: 1.1963 0.1214 sec/batch\n", + "Epoch 13/20 Iteration 2305/3560 Training loss: 1.1963 0.1225 sec/batch\n", + "Epoch 13/20 Iteration 2306/3560 Training loss: 1.1964 0.1228 sec/batch\n", + "Epoch 13/20 Iteration 2307/3560 Training loss: 1.1962 0.1222 sec/batch\n", + "Epoch 13/20 Iteration 2308/3560 Training loss: 1.1962 0.1222 sec/batch\n", + "Epoch 13/20 Iteration 2309/3560 Training loss: 1.1963 0.1212 sec/batch\n", + "Epoch 13/20 Iteration 2310/3560 Training loss: 1.1963 0.1223 sec/batch\n", + "Epoch 13/20 Iteration 2311/3560 Training loss: 1.1964 0.1226 sec/batch\n", + "Epoch 13/20 Iteration 2312/3560 Training loss: 1.1962 0.1253 sec/batch\n", + "Epoch 13/20 Iteration 2313/3560 Training loss: 1.1961 0.1244 sec/batch\n", + "Epoch 13/20 Iteration 2314/3560 Training loss: 1.1963 0.1216 sec/batch\n", + "Epoch 14/20 Iteration 2315/3560 Training loss: 1.3379 0.1237 sec/batch\n", + "Epoch 14/20 Iteration 2316/3560 Training loss: 1.2798 0.1213 sec/batch\n", + "Epoch 14/20 Iteration 2317/3560 Training loss: 1.2602 0.1236 sec/batch\n", + "Epoch 14/20 Iteration 2318/3560 Training loss: 1.2500 0.1215 sec/batch\n", + "Epoch 14/20 Iteration 2319/3560 Training loss: 1.2375 0.1215 sec/batch\n", + "Epoch 14/20 Iteration 2320/3560 Training loss: 1.2244 0.1229 sec/batch\n", + 
"Epoch 14/20 Iteration 2321/3560 Training loss: 1.2226 0.1213 sec/batch\n", + "Epoch 14/20 Iteration 2322/3560 Training loss: 1.2196 0.1249 sec/batch\n", + "Epoch 14/20 Iteration 2323/3560 Training loss: 1.2177 0.1220 sec/batch\n", + "Epoch 14/20 Iteration 2324/3560 Training loss: 1.2154 0.1212 sec/batch\n", + "Epoch 14/20 Iteration 2325/3560 Training loss: 1.2119 0.1242 sec/batch\n", + "Epoch 14/20 Iteration 2326/3560 Training loss: 1.2104 0.1241 sec/batch\n", + "Epoch 14/20 Iteration 2327/3560 Training loss: 1.2093 0.1270 sec/batch\n", + "Epoch 14/20 Iteration 2328/3560 Training loss: 1.2101 0.1217 sec/batch\n", + "Epoch 14/20 Iteration 2329/3560 Training loss: 1.2080 0.1214 sec/batch\n", + "Epoch 14/20 Iteration 2330/3560 Training loss: 1.2059 0.1248 sec/batch\n", + "Epoch 14/20 Iteration 2331/3560 Training loss: 1.2053 0.1250 sec/batch\n", + "Epoch 14/20 Iteration 2332/3560 Training loss: 1.2059 0.1232 sec/batch\n", + "Epoch 14/20 Iteration 2333/3560 Training loss: 1.2055 0.1217 sec/batch\n", + "Epoch 14/20 Iteration 2334/3560 Training loss: 1.2061 0.1245 sec/batch\n", + "Epoch 14/20 Iteration 2335/3560 Training loss: 1.2051 0.1227 sec/batch\n", + "Epoch 14/20 Iteration 2336/3560 Training loss: 1.2052 0.1212 sec/batch\n", + "Epoch 14/20 Iteration 2337/3560 Training loss: 1.2041 0.1219 sec/batch\n", + "Epoch 14/20 Iteration 2338/3560 Training loss: 1.2043 0.1236 sec/batch\n", + "Epoch 14/20 Iteration 2339/3560 Training loss: 1.2038 0.1305 sec/batch\n", + "Epoch 14/20 Iteration 2340/3560 Training loss: 1.2017 0.1216 sec/batch\n", + "Epoch 14/20 Iteration 2341/3560 Training loss: 1.2006 0.1234 sec/batch\n", + "Epoch 14/20 Iteration 2342/3560 Training loss: 1.2008 0.1211 sec/batch\n", + "Epoch 14/20 Iteration 2343/3560 Training loss: 1.2006 0.1226 sec/batch\n", + "Epoch 14/20 Iteration 2344/3560 Training loss: 1.2005 0.1223 sec/batch\n", + "Epoch 14/20 Iteration 2345/3560 Training loss: 1.1997 0.1221 sec/batch\n", + "Epoch 14/20 Iteration 2346/3560 Training loss: 
1.1984 0.1266 sec/batch\n", + "Epoch 14/20 Iteration 2347/3560 Training loss: 1.1986 0.1225 sec/batch\n", + "Epoch 14/20 Iteration 2348/3560 Training loss: 1.1985 0.1211 sec/batch\n", + "Epoch 14/20 Iteration 2349/3560 Training loss: 1.1981 0.1217 sec/batch\n", + "Epoch 14/20 Iteration 2350/3560 Training loss: 1.1980 0.1232 sec/batch\n", + "Epoch 14/20 Iteration 2351/3560 Training loss: 1.1971 0.1233 sec/batch\n", + "Epoch 14/20 Iteration 2352/3560 Training loss: 1.1958 0.1220 sec/batch\n", + "Epoch 14/20 Iteration 2353/3560 Training loss: 1.1945 0.1225 sec/batch\n", + "Epoch 14/20 Iteration 2354/3560 Training loss: 1.1943 0.1239 sec/batch\n", + "Epoch 14/20 Iteration 2355/3560 Training loss: 1.1935 0.1216 sec/batch\n", + "Epoch 14/20 Iteration 2356/3560 Training loss: 1.1940 0.1260 sec/batch\n", + "Epoch 14/20 Iteration 2357/3560 Training loss: 1.1937 0.1217 sec/batch\n", + "Epoch 14/20 Iteration 2358/3560 Training loss: 1.1929 0.1210 sec/batch\n", + "Epoch 14/20 Iteration 2359/3560 Training loss: 1.1930 0.1221 sec/batch\n", + "Epoch 14/20 Iteration 2360/3560 Training loss: 1.1922 0.1217 sec/batch\n", + "Epoch 14/20 Iteration 2361/3560 Training loss: 1.1918 0.1237 sec/batch\n", + "Epoch 14/20 Iteration 2362/3560 Training loss: 1.1914 0.1231 sec/batch\n", + "Epoch 14/20 Iteration 2363/3560 Training loss: 1.1913 0.1242 sec/batch\n", + "Epoch 14/20 Iteration 2364/3560 Training loss: 1.1914 0.1251 sec/batch\n", + "Epoch 14/20 Iteration 2365/3560 Training loss: 1.1909 0.1212 sec/batch\n", + "Epoch 14/20 Iteration 2366/3560 Training loss: 1.1913 0.1246 sec/batch\n", + "Epoch 14/20 Iteration 2367/3560 Training loss: 1.1911 0.1206 sec/batch\n", + "Epoch 14/20 Iteration 2368/3560 Training loss: 1.1912 0.1248 sec/batch\n", + "Epoch 14/20 Iteration 2369/3560 Training loss: 1.1908 0.1244 sec/batch\n", + "Epoch 14/20 Iteration 2370/3560 Training loss: 1.1907 0.1266 sec/batch\n", + "Epoch 14/20 Iteration 2371/3560 Training loss: 1.1910 0.1232 sec/batch\n", + "Epoch 14/20 
Iteration 2372/3560 Training loss: 1.1906 0.1246 sec/batch\n", + "Epoch 14/20 Iteration 2373/3560 Training loss: 1.1901 0.1214 sec/batch\n", + "Epoch 14/20 Iteration 2374/3560 Training loss: 1.1905 0.1228 sec/batch\n", + "Epoch 14/20 Iteration 2375/3560 Training loss: 1.1903 0.1218 sec/batch\n", + "Epoch 14/20 Iteration 2376/3560 Training loss: 1.1909 0.1221 sec/batch\n", + "Epoch 14/20 Iteration 2377/3560 Training loss: 1.1910 0.1219 sec/batch\n", + "Epoch 14/20 Iteration 2378/3560 Training loss: 1.1911 0.1222 sec/batch\n", + "Epoch 14/20 Iteration 2379/3560 Training loss: 1.1909 0.1281 sec/batch\n", + "Epoch 14/20 Iteration 2380/3560 Training loss: 1.1909 0.1215 sec/batch\n", + "Epoch 14/20 Iteration 2381/3560 Training loss: 1.1910 0.1213 sec/batch\n", + "Epoch 14/20 Iteration 2382/3560 Training loss: 1.1908 0.1252 sec/batch\n", + "Epoch 14/20 Iteration 2383/3560 Training loss: 1.1909 0.1212 sec/batch\n", + "Epoch 14/20 Iteration 2384/3560 Training loss: 1.1908 0.1265 sec/batch\n", + "Epoch 14/20 Iteration 2385/3560 Training loss: 1.1912 0.1215 sec/batch\n", + "Epoch 14/20 Iteration 2386/3560 Training loss: 1.1914 0.1215 sec/batch\n", + "Epoch 14/20 Iteration 2387/3560 Training loss: 1.1917 0.1266 sec/batch\n", + "Epoch 14/20 Iteration 2388/3560 Training loss: 1.1913 0.1222 sec/batch\n", + "Epoch 14/20 Iteration 2389/3560 Training loss: 1.1911 0.1235 sec/batch\n", + "Epoch 14/20 Iteration 2390/3560 Training loss: 1.1911 0.1231 sec/batch\n", + "Epoch 14/20 Iteration 2391/3560 Training loss: 1.1909 0.1258 sec/batch\n", + "Epoch 14/20 Iteration 2392/3560 Training loss: 1.1906 0.1230 sec/batch\n", + "Epoch 14/20 Iteration 2393/3560 Training loss: 1.1901 0.1235 sec/batch\n", + "Epoch 14/20 Iteration 2394/3560 Training loss: 1.1900 0.1244 sec/batch\n", + "Epoch 14/20 Iteration 2395/3560 Training loss: 1.1896 0.1226 sec/batch\n", + "Epoch 14/20 Iteration 2396/3560 Training loss: 1.1895 0.1261 sec/batch\n", + "Epoch 14/20 Iteration 2397/3560 Training loss: 1.1890 0.1282 
sec/batch\n", + "Epoch 14/20 Iteration 2398/3560 Training loss: 1.1890 0.1236 sec/batch\n", + "Epoch 14/20 Iteration 2399/3560 Training loss: 1.1888 0.1258 sec/batch\n", + "Epoch 14/20 Iteration 2400/3560 Training loss: 1.1887 0.1244 sec/batch\n", + "Epoch 14/20 Iteration 2401/3560 Training loss: 1.1884 0.1219 sec/batch\n", + "Epoch 14/20 Iteration 2402/3560 Training loss: 1.1882 0.1330 sec/batch\n", + "Epoch 14/20 Iteration 2403/3560 Training loss: 1.1878 0.1218 sec/batch\n", + "Epoch 14/20 Iteration 2404/3560 Training loss: 1.1878 0.1267 sec/batch\n", + "Epoch 14/20 Iteration 2405/3560 Training loss: 1.1876 0.1252 sec/batch\n", + "Epoch 14/20 Iteration 2406/3560 Training loss: 1.1875 0.1268 sec/batch\n", + "Epoch 14/20 Iteration 2407/3560 Training loss: 1.1870 0.1225 sec/batch\n", + "Epoch 14/20 Iteration 2408/3560 Training loss: 1.1866 0.1259 sec/batch\n", + "Epoch 14/20 Iteration 2409/3560 Training loss: 1.1864 0.1211 sec/batch\n", + "Epoch 14/20 Iteration 2410/3560 Training loss: 1.1863 0.1234 sec/batch\n", + "Epoch 14/20 Iteration 2411/3560 Training loss: 1.1862 0.1218 sec/batch\n", + "Epoch 14/20 Iteration 2412/3560 Training loss: 1.1858 0.1251 sec/batch\n", + "Epoch 14/20 Iteration 2413/3560 Training loss: 1.1855 0.1228 sec/batch\n", + "Epoch 14/20 Iteration 2414/3560 Training loss: 1.1851 0.1325 sec/batch\n", + "Epoch 14/20 Iteration 2415/3560 Training loss: 1.1850 0.1218 sec/batch\n", + "Epoch 14/20 Iteration 2416/3560 Training loss: 1.1848 0.1265 sec/batch\n", + "Epoch 14/20 Iteration 2417/3560 Training loss: 1.1847 0.1226 sec/batch\n", + "Epoch 14/20 Iteration 2418/3560 Training loss: 1.1845 0.1233 sec/batch\n", + "Epoch 14/20 Iteration 2419/3560 Training loss: 1.1843 0.1222 sec/batch\n", + "Epoch 14/20 Iteration 2420/3560 Training loss: 1.1841 0.1214 sec/batch\n", + "Epoch 14/20 Iteration 2421/3560 Training loss: 1.1840 0.1227 sec/batch\n", + "Epoch 14/20 Iteration 2422/3560 Training loss: 1.1839 0.1239 sec/batch\n", + "Epoch 14/20 Iteration 2423/3560 
Training loss: 1.1837 0.1320 sec/batch\n", + "Epoch 14/20 Iteration 2424/3560 Training loss: 1.1837 0.1232 sec/batch\n", + "Epoch 14/20 Iteration 2425/3560 Training loss: 1.1835 0.1233 sec/batch\n", + "Epoch 14/20 Iteration 2426/3560 Training loss: 1.1834 0.1240 sec/batch\n", + "Epoch 14/20 Iteration 2427/3560 Training loss: 1.1833 0.1272 sec/batch\n", + "Epoch 14/20 Iteration 2428/3560 Training loss: 1.1831 0.1263 sec/batch\n", + "Epoch 14/20 Iteration 2429/3560 Training loss: 1.1828 0.1218 sec/batch\n", + "Epoch 14/20 Iteration 2430/3560 Training loss: 1.1826 0.1234 sec/batch\n", + "Epoch 14/20 Iteration 2431/3560 Training loss: 1.1826 0.1228 sec/batch\n", + "Epoch 14/20 Iteration 2432/3560 Training loss: 1.1825 0.1251 sec/batch\n", + "Epoch 14/20 Iteration 2433/3560 Training loss: 1.1823 0.1251 sec/batch\n", + "Epoch 14/20 Iteration 2434/3560 Training loss: 1.1823 0.1213 sec/batch\n", + "Epoch 14/20 Iteration 2435/3560 Training loss: 1.1821 0.1234 sec/batch\n", + "Epoch 14/20 Iteration 2436/3560 Training loss: 1.1818 0.1261 sec/batch\n", + "Epoch 14/20 Iteration 2437/3560 Training loss: 1.1815 0.1272 sec/batch\n", + "Epoch 14/20 Iteration 2438/3560 Training loss: 1.1814 0.1218 sec/batch\n", + "Epoch 14/20 Iteration 2439/3560 Training loss: 1.1814 0.1246 sec/batch\n", + "Epoch 14/20 Iteration 2440/3560 Training loss: 1.1810 0.1218 sec/batch\n", + "Epoch 14/20 Iteration 2441/3560 Training loss: 1.1810 0.1256 sec/batch\n", + "Epoch 14/20 Iteration 2442/3560 Training loss: 1.1809 0.1225 sec/batch\n", + "Epoch 14/20 Iteration 2443/3560 Training loss: 1.1806 0.1245 sec/batch\n", + "Epoch 14/20 Iteration 2444/3560 Training loss: 1.1803 0.1221 sec/batch\n", + "Epoch 14/20 Iteration 2445/3560 Training loss: 1.1798 0.1235 sec/batch\n", + "Epoch 14/20 Iteration 2446/3560 Training loss: 1.1797 0.1227 sec/batch\n", + "Epoch 14/20 Iteration 2447/3560 Training loss: 1.1796 0.1214 sec/batch\n", + "Epoch 14/20 Iteration 2448/3560 Training loss: 1.1796 0.1253 sec/batch\n", + 
"Epoch 14/20 Iteration 2449/3560 Training loss: 1.1795 0.1244 sec/batch\n", + "Epoch 14/20 Iteration 2450/3560 Training loss: 1.1795 0.1261 sec/batch\n", + "Epoch 14/20 Iteration 2451/3560 Training loss: 1.1797 0.1203 sec/batch\n", + "Epoch 14/20 Iteration 2452/3560 Training loss: 1.1798 0.1229 sec/batch\n", + "Epoch 14/20 Iteration 2453/3560 Training loss: 1.1798 0.1224 sec/batch\n", + "Epoch 14/20 Iteration 2454/3560 Training loss: 1.1798 0.1210 sec/batch\n", + "Epoch 14/20 Iteration 2455/3560 Training loss: 1.1802 0.1271 sec/batch\n", + "Epoch 14/20 Iteration 2456/3560 Training loss: 1.1802 0.1242 sec/batch\n", + "Epoch 14/20 Iteration 2457/3560 Training loss: 1.1801 0.1224 sec/batch\n", + "Epoch 14/20 Iteration 2458/3560 Training loss: 1.1804 0.1210 sec/batch\n", + "Epoch 14/20 Iteration 2459/3560 Training loss: 1.1803 0.1224 sec/batch\n", + "Epoch 14/20 Iteration 2460/3560 Training loss: 1.1804 0.1216 sec/batch\n", + "Epoch 14/20 Iteration 2461/3560 Training loss: 1.1804 0.1327 sec/batch\n", + "Epoch 14/20 Iteration 2462/3560 Training loss: 1.1806 0.1223 sec/batch\n", + "Epoch 14/20 Iteration 2463/3560 Training loss: 1.1808 0.1221 sec/batch\n", + "Epoch 14/20 Iteration 2464/3560 Training loss: 1.1806 0.1349 sec/batch\n", + "Epoch 14/20 Iteration 2465/3560 Training loss: 1.1804 0.1232 sec/batch\n", + "Epoch 14/20 Iteration 2466/3560 Training loss: 1.1802 0.1241 sec/batch\n", + "Epoch 14/20 Iteration 2467/3560 Training loss: 1.1802 0.1220 sec/batch\n", + "Epoch 14/20 Iteration 2468/3560 Training loss: 1.1802 0.1236 sec/batch\n", + "Epoch 14/20 Iteration 2469/3560 Training loss: 1.1801 0.1261 sec/batch\n", + "Epoch 14/20 Iteration 2470/3560 Training loss: 1.1801 0.1211 sec/batch\n", + "Epoch 14/20 Iteration 2471/3560 Training loss: 1.1800 0.1233 sec/batch\n", + "Epoch 14/20 Iteration 2472/3560 Training loss: 1.1799 0.1224 sec/batch\n", + "Epoch 14/20 Iteration 2473/3560 Training loss: 1.1797 0.1231 sec/batch\n", + "Epoch 14/20 Iteration 2474/3560 Training loss: 
1.1798 0.1247 sec/batch\n", + "Epoch 14/20 Iteration 2475/3560 Training loss: 1.1800 0.1268 sec/batch\n", + "Epoch 14/20 Iteration 2476/3560 Training loss: 1.1800 0.1223 sec/batch\n", + "Epoch 14/20 Iteration 2477/3560 Training loss: 1.1800 0.1216 sec/batch\n", + "Epoch 14/20 Iteration 2478/3560 Training loss: 1.1800 0.1229 sec/batch\n", + "Epoch 14/20 Iteration 2479/3560 Training loss: 1.1800 0.1279 sec/batch\n", + "Epoch 14/20 Iteration 2480/3560 Training loss: 1.1799 0.1223 sec/batch\n", + "Epoch 14/20 Iteration 2481/3560 Training loss: 1.1801 0.1270 sec/batch\n", + "Epoch 14/20 Iteration 2482/3560 Training loss: 1.1805 0.1234 sec/batch\n", + "Epoch 14/20 Iteration 2483/3560 Training loss: 1.1805 0.1247 sec/batch\n", + "Epoch 14/20 Iteration 2484/3560 Training loss: 1.1806 0.1225 sec/batch\n", + "Epoch 14/20 Iteration 2485/3560 Training loss: 1.1806 0.1221 sec/batch\n", + "Epoch 14/20 Iteration 2486/3560 Training loss: 1.1805 0.1233 sec/batch\n", + "Epoch 14/20 Iteration 2487/3560 Training loss: 1.1807 0.1245 sec/batch\n", + "Epoch 14/20 Iteration 2488/3560 Training loss: 1.1807 0.1235 sec/batch\n", + "Epoch 14/20 Iteration 2489/3560 Training loss: 1.1808 0.1224 sec/batch\n", + "Epoch 14/20 Iteration 2490/3560 Training loss: 1.1807 0.1242 sec/batch\n", + "Epoch 14/20 Iteration 2491/3560 Training loss: 1.1806 0.1228 sec/batch\n", + "Epoch 14/20 Iteration 2492/3560 Training loss: 1.1808 0.1219 sec/batch\n", + "Epoch 15/20 Iteration 2493/3560 Training loss: 1.3304 0.1241 sec/batch\n", + "Epoch 15/20 Iteration 2494/3560 Training loss: 1.2600 0.1215 sec/batch\n", + "Epoch 15/20 Iteration 2495/3560 Training loss: 1.2423 0.1243 sec/batch\n", + "Epoch 15/20 Iteration 2496/3560 Training loss: 1.2363 0.1248 sec/batch\n", + "Epoch 15/20 Iteration 2497/3560 Training loss: 1.2251 0.1219 sec/batch\n", + "Epoch 15/20 Iteration 2498/3560 Training loss: 1.2129 0.1223 sec/batch\n", + "Epoch 15/20 Iteration 2499/3560 Training loss: 1.2106 0.1261 sec/batch\n", + "Epoch 15/20 
Iteration 2500/3560 Training loss: 1.2074 0.1228 sec/batch\n", + "Epoch 15/20 Iteration 2501/3560 Training loss: 1.2051 0.1254 sec/batch\n", + "Epoch 15/20 Iteration 2502/3560 Training loss: 1.2029 0.1251 sec/batch\n", + "Epoch 15/20 Iteration 2503/3560 Training loss: 1.1998 0.1230 sec/batch\n", + "Epoch 15/20 Iteration 2504/3560 Training loss: 1.1990 0.1221 sec/batch\n", + "Epoch 15/20 Iteration 2505/3560 Training loss: 1.1987 0.1259 sec/batch\n", + "Epoch 15/20 Iteration 2506/3560 Training loss: 1.1987 0.1220 sec/batch\n", + "Epoch 15/20 Iteration 2507/3560 Training loss: 1.1969 0.1218 sec/batch\n", + "Epoch 15/20 Iteration 2508/3560 Training loss: 1.1947 0.1310 sec/batch\n", + "Epoch 15/20 Iteration 2509/3560 Training loss: 1.1946 0.1217 sec/batch\n", + "Epoch 15/20 Iteration 2510/3560 Training loss: 1.1951 0.1246 sec/batch\n", + "Epoch 15/20 Iteration 2511/3560 Training loss: 1.1943 0.1207 sec/batch\n", + "Epoch 15/20 Iteration 2512/3560 Training loss: 1.1953 0.1270 sec/batch\n", + "Epoch 15/20 Iteration 2513/3560 Training loss: 1.1941 0.1237 sec/batch\n", + "Epoch 15/20 Iteration 2514/3560 Training loss: 1.1938 0.1257 sec/batch\n", + "Epoch 15/20 Iteration 2515/3560 Training loss: 1.1930 0.1241 sec/batch\n", + "Epoch 15/20 Iteration 2516/3560 Training loss: 1.1935 0.1328 sec/batch\n", + "Epoch 15/20 Iteration 2517/3560 Training loss: 1.1927 0.1236 sec/batch\n", + "Epoch 15/20 Iteration 2518/3560 Training loss: 1.1908 0.1215 sec/batch\n", + "Epoch 15/20 Iteration 2519/3560 Training loss: 1.1893 0.1231 sec/batch\n", + "Epoch 15/20 Iteration 2520/3560 Training loss: 1.1896 0.1227 sec/batch\n", + "Epoch 15/20 Iteration 2521/3560 Training loss: 1.1897 0.1219 sec/batch\n", + "Epoch 15/20 Iteration 2522/3560 Training loss: 1.1895 0.1220 sec/batch\n", + "Epoch 15/20 Iteration 2523/3560 Training loss: 1.1884 0.1223 sec/batch\n", + "Epoch 15/20 Iteration 2524/3560 Training loss: 1.1872 0.1228 sec/batch\n", + "Epoch 15/20 Iteration 2525/3560 Training loss: 1.1871 0.1209 
sec/batch\n", + "Epoch 15/20 Iteration 2526/3560 Training loss: 1.1873 0.1211 sec/batch\n", + "Epoch 15/20 Iteration 2527/3560 Training loss: 1.1869 0.1211 sec/batch\n", + "Epoch 15/20 Iteration 2528/3560 Training loss: 1.1864 0.1233 sec/batch\n", + "Epoch 15/20 Iteration 2529/3560 Training loss: 1.1853 0.1209 sec/batch\n", + "Epoch 15/20 Iteration 2530/3560 Training loss: 1.1841 0.1216 sec/batch\n", + "Epoch 15/20 Iteration 2531/3560 Training loss: 1.1827 0.1236 sec/batch\n", + "Epoch 15/20 Iteration 2532/3560 Training loss: 1.1823 0.1246 sec/batch\n", + "Epoch 15/20 Iteration 2533/3560 Training loss: 1.1815 0.1296 sec/batch\n", + "Epoch 15/20 Iteration 2534/3560 Training loss: 1.1822 0.1249 sec/batch\n", + "Epoch 15/20 Iteration 2535/3560 Training loss: 1.1818 0.1210 sec/batch\n", + "Epoch 15/20 Iteration 2536/3560 Training loss: 1.1812 0.1251 sec/batch\n", + "Epoch 15/20 Iteration 2537/3560 Training loss: 1.1810 0.1271 sec/batch\n", + "Epoch 15/20 Iteration 2538/3560 Training loss: 1.1802 0.1259 sec/batch\n", + "Epoch 15/20 Iteration 2539/3560 Training loss: 1.1796 0.1221 sec/batch\n", + "Epoch 15/20 Iteration 2540/3560 Training loss: 1.1790 0.1228 sec/batch\n", + "Epoch 15/20 Iteration 2541/3560 Training loss: 1.1791 0.1223 sec/batch\n", + "Epoch 15/20 Iteration 2542/3560 Training loss: 1.1789 0.1234 sec/batch\n", + "Epoch 15/20 Iteration 2543/3560 Training loss: 1.1783 0.1252 sec/batch\n", + "Epoch 15/20 Iteration 2544/3560 Training loss: 1.1788 0.1258 sec/batch\n", + "Epoch 15/20 Iteration 2545/3560 Training loss: 1.1789 0.1224 sec/batch\n", + "Epoch 15/20 Iteration 2546/3560 Training loss: 1.1790 0.1240 sec/batch\n", + "Epoch 15/20 Iteration 2547/3560 Training loss: 1.1788 0.1234 sec/batch\n", + "Epoch 15/20 Iteration 2548/3560 Training loss: 1.1786 0.1231 sec/batch\n", + "Epoch 15/20 Iteration 2549/3560 Training loss: 1.1790 0.1238 sec/batch\n", + "Epoch 15/20 Iteration 2550/3560 Training loss: 1.1787 0.1230 sec/batch\n", + "Epoch 15/20 Iteration 2551/3560 
Training loss: 1.1782 0.1223 sec/batch\n", + "Epoch 15/20 Iteration 2552/3560 Training loss: 1.1786 0.1245 sec/batch\n", + "Epoch 15/20 Iteration 2553/3560 Training loss: 1.1785 0.1213 sec/batch\n", + "Epoch 15/20 Iteration 2554/3560 Training loss: 1.1792 0.1280 sec/batch\n", + "Epoch 15/20 Iteration 2555/3560 Training loss: 1.1795 0.1252 sec/batch\n", + "Epoch 15/20 Iteration 2556/3560 Training loss: 1.1795 0.1252 sec/batch\n", + "Epoch 15/20 Iteration 2557/3560 Training loss: 1.1793 0.1233 sec/batch\n", + "Epoch 15/20 Iteration 2558/3560 Training loss: 1.1793 0.1233 sec/batch\n", + "Epoch 15/20 Iteration 2559/3560 Training loss: 1.1794 0.1228 sec/batch\n", + "Epoch 15/20 Iteration 2560/3560 Training loss: 1.1791 0.1224 sec/batch\n", + "Epoch 15/20 Iteration 2561/3560 Training loss: 1.1790 0.1229 sec/batch\n", + "Epoch 15/20 Iteration 2562/3560 Training loss: 1.1788 0.1231 sec/batch\n", + "Epoch 15/20 Iteration 2563/3560 Training loss: 1.1793 0.1224 sec/batch\n", + "Epoch 15/20 Iteration 2564/3560 Training loss: 1.1794 0.1219 sec/batch\n", + "Epoch 15/20 Iteration 2565/3560 Training loss: 1.1798 0.1217 sec/batch\n", + "Epoch 15/20 Iteration 2566/3560 Training loss: 1.1793 0.1239 sec/batch\n", + "Epoch 15/20 Iteration 2567/3560 Training loss: 1.1791 0.1233 sec/batch\n", + "Epoch 15/20 Iteration 2568/3560 Training loss: 1.1790 0.1246 sec/batch\n", + "Epoch 15/20 Iteration 2569/3560 Training loss: 1.1790 0.1240 sec/batch\n", + "Epoch 15/20 Iteration 2570/3560 Training loss: 1.1786 0.1225 sec/batch\n", + "Epoch 15/20 Iteration 2571/3560 Training loss: 1.1780 0.1222 sec/batch\n", + "Epoch 15/20 Iteration 2572/3560 Training loss: 1.1779 0.1240 sec/batch\n", + "Epoch 15/20 Iteration 2573/3560 Training loss: 1.1775 0.1286 sec/batch\n", + "Epoch 15/20 Iteration 2574/3560 Training loss: 1.1775 0.1223 sec/batch\n", + "Epoch 15/20 Iteration 2575/3560 Training loss: 1.1771 0.1233 sec/batch\n", + "Epoch 15/20 Iteration 2576/3560 Training loss: 1.1770 0.1228 sec/batch\n", + 
"Epoch 15/20 Iteration 2577/3560 Training loss: 1.1767 0.1226 sec/batch\n", + "Epoch 15/20 Iteration 2578/3560 Training loss: 1.1766 0.1239 sec/batch\n", + "Epoch 15/20 Iteration 2579/3560 Training loss: 1.1764 0.1249 sec/batch\n", + "Epoch 15/20 Iteration 2580/3560 Training loss: 1.1760 0.1304 sec/batch\n", + "Epoch 15/20 Iteration 2581/3560 Training loss: 1.1757 0.1248 sec/batch\n", + "Epoch 15/20 Iteration 2582/3560 Training loss: 1.1757 0.1218 sec/batch\n", + "Epoch 15/20 Iteration 2583/3560 Training loss: 1.1754 0.1224 sec/batch\n", + "Epoch 15/20 Iteration 2584/3560 Training loss: 1.1752 0.1217 sec/batch\n", + "Epoch 15/20 Iteration 2585/3560 Training loss: 1.1748 0.1227 sec/batch\n", + "Epoch 15/20 Iteration 2586/3560 Training loss: 1.1744 0.1252 sec/batch\n", + "Epoch 15/20 Iteration 2587/3560 Training loss: 1.1742 0.1233 sec/batch\n", + "Epoch 15/20 Iteration 2588/3560 Training loss: 1.1743 0.1238 sec/batch\n", + "Epoch 15/20 Iteration 2589/3560 Training loss: 1.1743 0.1223 sec/batch\n", + "Epoch 15/20 Iteration 2590/3560 Training loss: 1.1739 0.1227 sec/batch\n", + "Epoch 15/20 Iteration 2591/3560 Training loss: 1.1736 0.1299 sec/batch\n", + "Epoch 15/20 Iteration 2592/3560 Training loss: 1.1733 0.1230 sec/batch\n", + "Epoch 15/20 Iteration 2593/3560 Training loss: 1.1732 0.1255 sec/batch\n", + "Epoch 15/20 Iteration 2594/3560 Training loss: 1.1730 0.1234 sec/batch\n", + "Epoch 15/20 Iteration 2595/3560 Training loss: 1.1730 0.1235 sec/batch\n", + "Epoch 15/20 Iteration 2596/3560 Training loss: 1.1728 0.1217 sec/batch\n", + "Epoch 15/20 Iteration 2597/3560 Training loss: 1.1726 0.1230 sec/batch\n", + "Epoch 15/20 Iteration 2598/3560 Training loss: 1.1724 0.1221 sec/batch\n", + "Epoch 15/20 Iteration 2599/3560 Training loss: 1.1724 0.1248 sec/batch\n", + "Epoch 15/20 Iteration 2600/3560 Training loss: 1.1724 0.1224 sec/batch\n", + "Epoch 15/20 Iteration 2601/3560 Training loss: 1.1722 0.1222 sec/batch\n", + "Epoch 15/20 Iteration 2602/3560 Training loss: 
1.1722 0.1226 sec/batch\n", + "Epoch 15/20 Iteration 2603/3560 Training loss: 1.1720 0.1246 sec/batch\n", + "Epoch 15/20 Iteration 2604/3560 Training loss: 1.1720 0.1221 sec/batch\n", + "Epoch 15/20 Iteration 2605/3560 Training loss: 1.1720 0.1235 sec/batch\n", + "Epoch 15/20 Iteration 2606/3560 Training loss: 1.1718 0.1225 sec/batch\n", + "Epoch 15/20 Iteration 2607/3560 Training loss: 1.1716 0.1230 sec/batch\n", + "Epoch 15/20 Iteration 2608/3560 Training loss: 1.1714 0.1225 sec/batch\n", + "Epoch 15/20 Iteration 2609/3560 Training loss: 1.1713 0.1221 sec/batch\n", + "Epoch 15/20 Iteration 2610/3560 Training loss: 1.1713 0.1281 sec/batch\n", + "Epoch 15/20 Iteration 2611/3560 Training loss: 1.1711 0.1232 sec/batch\n", + "Epoch 15/20 Iteration 2612/3560 Training loss: 1.1710 0.1246 sec/batch\n", + "Epoch 15/20 Iteration 2613/3560 Training loss: 1.1708 0.1224 sec/batch\n", + "Epoch 15/20 Iteration 2614/3560 Training loss: 1.1705 0.1244 sec/batch\n", + "Epoch 15/20 Iteration 2615/3560 Training loss: 1.1702 0.1278 sec/batch\n", + "Epoch 15/20 Iteration 2616/3560 Training loss: 1.1700 0.1240 sec/batch\n", + "Epoch 15/20 Iteration 2617/3560 Training loss: 1.1700 0.1228 sec/batch\n", + "Epoch 15/20 Iteration 2618/3560 Training loss: 1.1696 0.1229 sec/batch\n", + "Epoch 15/20 Iteration 2619/3560 Training loss: 1.1695 0.1219 sec/batch\n", + "Epoch 15/20 Iteration 2620/3560 Training loss: 1.1694 0.1312 sec/batch\n", + "Epoch 15/20 Iteration 2621/3560 Training loss: 1.1693 0.1219 sec/batch\n", + "Epoch 15/20 Iteration 2622/3560 Training loss: 1.1689 0.1229 sec/batch\n", + "Epoch 15/20 Iteration 2623/3560 Training loss: 1.1684 0.1223 sec/batch\n", + "Epoch 15/20 Iteration 2624/3560 Training loss: 1.1683 0.1227 sec/batch\n", + "Epoch 15/20 Iteration 2625/3560 Training loss: 1.1684 0.1253 sec/batch\n", + "Epoch 15/20 Iteration 2626/3560 Training loss: 1.1683 0.1216 sec/batch\n", + "Epoch 15/20 Iteration 2627/3560 Training loss: 1.1683 0.1228 sec/batch\n", + "Epoch 15/20 
Iteration 2628/3560 Training loss: 1.1684 0.1240 sec/batch\n", + "Epoch 15/20 Iteration 2629/3560 Training loss: 1.1684 0.1331 sec/batch\n", + "Epoch 15/20 Iteration 2630/3560 Training loss: 1.1685 0.1235 sec/batch\n", + "Epoch 15/20 Iteration 2631/3560 Training loss: 1.1685 0.1227 sec/batch\n", + "Epoch 15/20 Iteration 2632/3560 Training loss: 1.1686 0.1279 sec/batch\n", + "Epoch 15/20 Iteration 2633/3560 Training loss: 1.1690 0.1229 sec/batch\n", + "Epoch 15/20 Iteration 2634/3560 Training loss: 1.1690 0.1228 sec/batch\n", + "Epoch 15/20 Iteration 2635/3560 Training loss: 1.1689 0.1241 sec/batch\n", + "Epoch 15/20 Iteration 2636/3560 Training loss: 1.1692 0.1222 sec/batch\n", + "Epoch 15/20 Iteration 2637/3560 Training loss: 1.1690 0.1229 sec/batch\n", + "Epoch 15/20 Iteration 2638/3560 Training loss: 1.1692 0.1257 sec/batch\n", + "Epoch 15/20 Iteration 2639/3560 Training loss: 1.1692 0.1224 sec/batch\n", + "Epoch 15/20 Iteration 2640/3560 Training loss: 1.1694 0.1239 sec/batch\n", + "Epoch 15/20 Iteration 2641/3560 Training loss: 1.1696 0.1229 sec/batch\n", + "Epoch 15/20 Iteration 2642/3560 Training loss: 1.1695 0.1220 sec/batch\n", + "Epoch 15/20 Iteration 2643/3560 Training loss: 1.1692 0.1214 sec/batch\n", + "Epoch 15/20 Iteration 2644/3560 Training loss: 1.1691 0.1242 sec/batch\n", + "Epoch 15/20 Iteration 2645/3560 Training loss: 1.1692 0.1224 sec/batch\n", + "Epoch 15/20 Iteration 2646/3560 Training loss: 1.1691 0.1232 sec/batch\n", + "Epoch 15/20 Iteration 2647/3560 Training loss: 1.1691 0.1235 sec/batch\n", + "Epoch 15/20 Iteration 2648/3560 Training loss: 1.1690 0.1247 sec/batch\n", + "Epoch 15/20 Iteration 2649/3560 Training loss: 1.1690 0.1240 sec/batch\n", + "Epoch 15/20 Iteration 2650/3560 Training loss: 1.1689 0.1233 sec/batch\n", + "Epoch 15/20 Iteration 2651/3560 Training loss: 1.1687 0.1226 sec/batch\n", + "Epoch 15/20 Iteration 2652/3560 Training loss: 1.1687 0.1216 sec/batch\n", + "Epoch 15/20 Iteration 2653/3560 Training loss: 1.1689 0.1224 
sec/batch\n", + "Epoch 15/20 Iteration 2654/3560 Training loss: 1.1690 0.1243 sec/batch\n", + "Epoch 15/20 Iteration 2655/3560 Training loss: 1.1689 0.1257 sec/batch\n", + "Epoch 15/20 Iteration 2656/3560 Training loss: 1.1689 0.1220 sec/batch\n", + "Epoch 15/20 Iteration 2657/3560 Training loss: 1.1689 0.1327 sec/batch\n", + "Epoch 15/20 Iteration 2658/3560 Training loss: 1.1689 0.1224 sec/batch\n", + "Epoch 15/20 Iteration 2659/3560 Training loss: 1.1690 0.1242 sec/batch\n", + "Epoch 15/20 Iteration 2660/3560 Training loss: 1.1693 0.1293 sec/batch\n", + "Epoch 15/20 Iteration 2661/3560 Training loss: 1.1694 0.1224 sec/batch\n", + "Epoch 15/20 Iteration 2662/3560 Training loss: 1.1694 0.1271 sec/batch\n", + "Epoch 15/20 Iteration 2663/3560 Training loss: 1.1693 0.1220 sec/batch\n", + "Epoch 15/20 Iteration 2664/3560 Training loss: 1.1692 0.1212 sec/batch\n", + "Epoch 15/20 Iteration 2665/3560 Training loss: 1.1694 0.1259 sec/batch\n", + "Epoch 15/20 Iteration 2666/3560 Training loss: 1.1694 0.1275 sec/batch\n", + "Epoch 15/20 Iteration 2667/3560 Training loss: 1.1695 0.1224 sec/batch\n", + "Epoch 15/20 Iteration 2668/3560 Training loss: 1.1693 0.1252 sec/batch\n", + "Epoch 15/20 Iteration 2669/3560 Training loss: 1.1692 0.1226 sec/batch\n", + "Epoch 15/20 Iteration 2670/3560 Training loss: 1.1694 0.1215 sec/batch\n", + "Epoch 16/20 Iteration 2671/3560 Training loss: 1.3288 0.1257 sec/batch\n", + "Epoch 16/20 Iteration 2672/3560 Training loss: 1.2580 0.1210 sec/batch\n", + "Epoch 16/20 Iteration 2673/3560 Training loss: 1.2364 0.1243 sec/batch\n", + "Epoch 16/20 Iteration 2674/3560 Training loss: 1.2244 0.1223 sec/batch\n", + "Epoch 16/20 Iteration 2675/3560 Training loss: 1.2111 0.1223 sec/batch\n", + "Epoch 16/20 Iteration 2676/3560 Training loss: 1.1998 0.1222 sec/batch\n", + "Epoch 16/20 Iteration 2677/3560 Training loss: 1.1998 0.1266 sec/batch\n", + "Epoch 16/20 Iteration 2678/3560 Training loss: 1.1962 0.1226 sec/batch\n", + "Epoch 16/20 Iteration 2679/3560 
Training loss: 1.1943 0.1226 sec/batch\n", + "Epoch 16/20 Iteration 2680/3560 Training loss: 1.1918 0.1214 sec/batch\n", + "Epoch 16/20 Iteration 2681/3560 Training loss: 1.1889 0.1251 sec/batch\n", + "Epoch 16/20 Iteration 2682/3560 Training loss: 1.1879 0.1259 sec/batch\n", + "Epoch 16/20 Iteration 2683/3560 Training loss: 1.1876 0.1220 sec/batch\n", + "Epoch 16/20 Iteration 2684/3560 Training loss: 1.1876 0.1218 sec/batch\n", + "Epoch 16/20 Iteration 2685/3560 Training loss: 1.1852 0.1306 sec/batch\n", + "Epoch 16/20 Iteration 2686/3560 Training loss: 1.1831 0.1224 sec/batch\n", + "Epoch 16/20 Iteration 2687/3560 Training loss: 1.1826 0.1224 sec/batch\n", + "Epoch 16/20 Iteration 2688/3560 Training loss: 1.1832 0.1231 sec/batch\n", + "Epoch 16/20 Iteration 2689/3560 Training loss: 1.1821 0.1248 sec/batch\n", + "Epoch 16/20 Iteration 2690/3560 Training loss: 1.1825 0.1228 sec/batch\n", + "Epoch 16/20 Iteration 2691/3560 Training loss: 1.1819 0.1216 sec/batch\n", + "Epoch 16/20 Iteration 2692/3560 Training loss: 1.1819 0.1212 sec/batch\n", + "Epoch 16/20 Iteration 2693/3560 Training loss: 1.1811 0.1225 sec/batch\n", + "Epoch 16/20 Iteration 2694/3560 Training loss: 1.1813 0.1232 sec/batch\n", + "Epoch 16/20 Iteration 2695/3560 Training loss: 1.1805 0.1227 sec/batch\n", + "Epoch 16/20 Iteration 2696/3560 Training loss: 1.1784 0.1273 sec/batch\n", + "Epoch 16/20 Iteration 2697/3560 Training loss: 1.1770 0.1225 sec/batch\n", + "Epoch 16/20 Iteration 2698/3560 Training loss: 1.1772 0.1256 sec/batch\n", + "Epoch 16/20 Iteration 2699/3560 Training loss: 1.1769 0.1219 sec/batch\n", + "Epoch 16/20 Iteration 2700/3560 Training loss: 1.1768 0.1215 sec/batch\n", + "Epoch 16/20 Iteration 2701/3560 Training loss: 1.1758 0.1209 sec/batch\n", + "Epoch 16/20 Iteration 2702/3560 Training loss: 1.1746 0.1243 sec/batch\n", + "Epoch 16/20 Iteration 2703/3560 Training loss: 1.1746 0.1221 sec/batch\n", + "Epoch 16/20 Iteration 2704/3560 Training loss: 1.1746 0.1228 sec/batch\n", + 
"Epoch 16/20 Iteration 2705/3560 Training loss: 1.1739 0.1238 sec/batch\n", + "Epoch 16/20 Iteration 2706/3560 Training loss: 1.1739 0.1222 sec/batch\n", + "Epoch 16/20 Iteration 2707/3560 Training loss: 1.1729 0.1227 sec/batch\n", + "Epoch 16/20 Iteration 2708/3560 Training loss: 1.1716 0.1232 sec/batch\n", + "Epoch 16/20 Iteration 2709/3560 Training loss: 1.1703 0.1220 sec/batch\n", + "Epoch 16/20 Iteration 2710/3560 Training loss: 1.1699 0.1245 sec/batch\n", + "Epoch 16/20 Iteration 2711/3560 Training loss: 1.1692 0.1247 sec/batch\n", + "Epoch 16/20 Iteration 2712/3560 Training loss: 1.1698 0.1240 sec/batch\n", + "Epoch 16/20 Iteration 2713/3560 Training loss: 1.1695 0.1245 sec/batch\n", + "Epoch 16/20 Iteration 2714/3560 Training loss: 1.1689 0.1290 sec/batch\n", + "Epoch 16/20 Iteration 2715/3560 Training loss: 1.1688 0.1231 sec/batch\n", + "Epoch 16/20 Iteration 2716/3560 Training loss: 1.1680 0.1214 sec/batch\n", + "Epoch 16/20 Iteration 2717/3560 Training loss: 1.1674 0.1250 sec/batch\n", + "Epoch 16/20 Iteration 2718/3560 Training loss: 1.1669 0.1219 sec/batch\n", + "Epoch 16/20 Iteration 2719/3560 Training loss: 1.1666 0.1229 sec/batch\n", + "Epoch 16/20 Iteration 2720/3560 Training loss: 1.1668 0.1224 sec/batch\n", + "Epoch 16/20 Iteration 2721/3560 Training loss: 1.1662 0.1237 sec/batch\n", + "Epoch 16/20 Iteration 2722/3560 Training loss: 1.1667 0.1238 sec/batch\n", + "Epoch 16/20 Iteration 2723/3560 Training loss: 1.1664 0.1252 sec/batch\n", + "Epoch 16/20 Iteration 2724/3560 Training loss: 1.1664 0.1215 sec/batch\n", + "Epoch 16/20 Iteration 2725/3560 Training loss: 1.1660 0.1240 sec/batch\n", + "Epoch 16/20 Iteration 2726/3560 Training loss: 1.1659 0.1254 sec/batch\n", + "Epoch 16/20 Iteration 2727/3560 Training loss: 1.1661 0.1238 sec/batch\n", + "Epoch 16/20 Iteration 2728/3560 Training loss: 1.1658 0.1241 sec/batch\n", + "Epoch 16/20 Iteration 2729/3560 Training loss: 1.1651 0.1241 sec/batch\n", + "Epoch 16/20 Iteration 2730/3560 Training loss: 
1.1654 0.1269 sec/batch\n", + "Epoch 16/20 Iteration 2731/3560 Training loss: 1.1653 0.1213 sec/batch\n", + "Epoch 16/20 Iteration 2732/3560 Training loss: 1.1659 0.1253 sec/batch\n", + "Epoch 16/20 Iteration 2733/3560 Training loss: 1.1661 0.1247 sec/batch\n", + "Epoch 16/20 Iteration 2734/3560 Training loss: 1.1661 0.1225 sec/batch\n", + "Epoch 16/20 Iteration 2735/3560 Training loss: 1.1661 0.1215 sec/batch\n", + "Epoch 16/20 Iteration 2736/3560 Training loss: 1.1663 0.1217 sec/batch\n", + "Epoch 16/20 Iteration 2737/3560 Training loss: 1.1666 0.1224 sec/batch\n", + "Epoch 16/20 Iteration 2738/3560 Training loss: 1.1664 0.1213 sec/batch\n", + "Epoch 16/20 Iteration 2739/3560 Training loss: 1.1663 0.1246 sec/batch\n", + "Epoch 16/20 Iteration 2740/3560 Training loss: 1.1660 0.1220 sec/batch\n", + "Epoch 16/20 Iteration 2741/3560 Training loss: 1.1665 0.1264 sec/batch\n", + "Epoch 16/20 Iteration 2742/3560 Training loss: 1.1666 0.1229 sec/batch\n", + "Epoch 16/20 Iteration 2743/3560 Training loss: 1.1670 0.1221 sec/batch\n", + "Epoch 16/20 Iteration 2744/3560 Training loss: 1.1666 0.1229 sec/batch\n", + "Epoch 16/20 Iteration 2745/3560 Training loss: 1.1666 0.1243 sec/batch\n", + "Epoch 16/20 Iteration 2746/3560 Training loss: 1.1668 0.1243 sec/batch\n", + "Epoch 16/20 Iteration 2747/3560 Training loss: 1.1667 0.1220 sec/batch\n", + "Epoch 16/20 Iteration 2748/3560 Training loss: 1.1666 0.1212 sec/batch\n", + "Epoch 16/20 Iteration 2749/3560 Training loss: 1.1660 0.1219 sec/batch\n", + "Epoch 16/20 Iteration 2750/3560 Training loss: 1.1658 0.1218 sec/batch\n", + "Epoch 16/20 Iteration 2751/3560 Training loss: 1.1655 0.1239 sec/batch\n", + "Epoch 16/20 Iteration 2752/3560 Training loss: 1.1653 0.1227 sec/batch\n", + "Epoch 16/20 Iteration 2753/3560 Training loss: 1.1649 0.1213 sec/batch\n", + "Epoch 16/20 Iteration 2754/3560 Training loss: 1.1650 0.1252 sec/batch\n", + "Epoch 16/20 Iteration 2755/3560 Training loss: 1.1647 0.1219 sec/batch\n", + "Epoch 16/20 
Iteration 2756/3560 Training loss: 1.1647 0.1239 sec/batch\n", + "Epoch 16/20 Iteration 2757/3560 Training loss: 1.1645 0.1252 sec/batch\n", + "Epoch 16/20 Iteration 2758/3560 Training loss: 1.1643 0.1301 sec/batch\n", + "Epoch 16/20 Iteration 2759/3560 Training loss: 1.1639 0.1225 sec/batch\n", + "Epoch 16/20 Iteration 2760/3560 Training loss: 1.1640 0.1230 sec/batch\n", + "Epoch 16/20 Iteration 2761/3560 Training loss: 1.1637 0.1212 sec/batch\n", + "Epoch 16/20 Iteration 2762/3560 Training loss: 1.1635 0.1259 sec/batch\n", + "Epoch 16/20 Iteration 2763/3560 Training loss: 1.1631 0.1301 sec/batch\n", + "Epoch 16/20 Iteration 2764/3560 Training loss: 1.1626 0.1222 sec/batch\n", + "Epoch 16/20 Iteration 2765/3560 Training loss: 1.1624 0.1268 sec/batch\n", + "Epoch 16/20 Iteration 2766/3560 Training loss: 1.1624 0.1278 sec/batch\n", + "Epoch 16/20 Iteration 2767/3560 Training loss: 1.1624 0.1276 sec/batch\n", + "Epoch 16/20 Iteration 2768/3560 Training loss: 1.1621 0.1227 sec/batch\n", + "Epoch 16/20 Iteration 2769/3560 Training loss: 1.1618 0.1217 sec/batch\n", + "Epoch 16/20 Iteration 2770/3560 Training loss: 1.1615 0.1222 sec/batch\n", + "Epoch 16/20 Iteration 2771/3560 Training loss: 1.1614 0.1255 sec/batch\n", + "Epoch 16/20 Iteration 2772/3560 Training loss: 1.1612 0.1231 sec/batch\n", + "Epoch 16/20 Iteration 2773/3560 Training loss: 1.1611 0.1211 sec/batch\n", + "Epoch 16/20 Iteration 2774/3560 Training loss: 1.1610 0.1239 sec/batch\n", + "Epoch 16/20 Iteration 2775/3560 Training loss: 1.1608 0.1229 sec/batch\n", + "Epoch 16/20 Iteration 2776/3560 Training loss: 1.1606 0.1227 sec/batch\n", + "Epoch 16/20 Iteration 2777/3560 Training loss: 1.1605 0.1243 sec/batch\n", + "Epoch 16/20 Iteration 2778/3560 Training loss: 1.1604 0.1216 sec/batch\n", + "Epoch 16/20 Iteration 2779/3560 Training loss: 1.1602 0.1223 sec/batch\n", + "Epoch 16/20 Iteration 2780/3560 Training loss: 1.1602 0.1219 sec/batch\n", + "Epoch 16/20 Iteration 2781/3560 Training loss: 1.1599 0.1266 
sec/batch\n", + "Epoch 16/20 Iteration 2782/3560 Training loss: 1.1597 0.1236 sec/batch\n", + "Epoch 16/20 Iteration 2783/3560 Training loss: 1.1597 0.1249 sec/batch\n", + "Epoch 16/20 Iteration 2784/3560 Training loss: 1.1595 0.1285 sec/batch\n", + "Epoch 16/20 Iteration 2785/3560 Training loss: 1.1592 0.1217 sec/batch\n", + "Epoch 16/20 Iteration 2786/3560 Training loss: 1.1590 0.1207 sec/batch\n", + "Epoch 16/20 Iteration 2787/3560 Training loss: 1.1590 0.1222 sec/batch\n", + "Epoch 16/20 Iteration 2788/3560 Training loss: 1.1590 0.1215 sec/batch\n", + "Epoch 16/20 Iteration 2789/3560 Training loss: 1.1588 0.1237 sec/batch\n", + "Epoch 16/20 Iteration 2790/3560 Training loss: 1.1587 0.1262 sec/batch\n", + "Epoch 16/20 Iteration 2791/3560 Training loss: 1.1586 0.1224 sec/batch\n", + "Epoch 16/20 Iteration 2792/3560 Training loss: 1.1582 0.1254 sec/batch\n", + "Epoch 16/20 Iteration 2793/3560 Training loss: 1.1578 0.1328 sec/batch\n", + "Epoch 16/20 Iteration 2794/3560 Training loss: 1.1578 0.1214 sec/batch\n", + "Epoch 16/20 Iteration 2795/3560 Training loss: 1.1577 0.1248 sec/batch\n", + "Epoch 16/20 Iteration 2796/3560 Training loss: 1.1574 0.1233 sec/batch\n", + "Epoch 16/20 Iteration 2797/3560 Training loss: 1.1573 0.1214 sec/batch\n", + "Epoch 16/20 Iteration 2798/3560 Training loss: 1.1572 0.1251 sec/batch\n", + "Epoch 16/20 Iteration 2799/3560 Training loss: 1.1570 0.1219 sec/batch\n", + "Epoch 16/20 Iteration 2800/3560 Training loss: 1.1566 0.1279 sec/batch\n", + "Epoch 16/20 Iteration 2801/3560 Training loss: 1.1562 0.1231 sec/batch\n", + "Epoch 16/20 Iteration 2802/3560 Training loss: 1.1560 0.1225 sec/batch\n", + "Epoch 16/20 Iteration 2803/3560 Training loss: 1.1560 0.1225 sec/batch\n", + "Epoch 16/20 Iteration 2804/3560 Training loss: 1.1560 0.1243 sec/batch\n", + "Epoch 16/20 Iteration 2805/3560 Training loss: 1.1560 0.1214 sec/batch\n", + "Epoch 16/20 Iteration 2806/3560 Training loss: 1.1560 0.1231 sec/batch\n", + "Epoch 16/20 Iteration 2807/3560 
Training loss: 1.1561 0.1233 sec/batch\n", + "Epoch 16/20 Iteration 2808/3560 Training loss: 1.1562 0.1216 sec/batch\n", + "Epoch 16/20 Iteration 2809/3560 Training loss: 1.1562 0.1209 sec/batch\n", + "Epoch 16/20 Iteration 2810/3560 Training loss: 1.1563 0.1234 sec/batch\n", + "Epoch 16/20 Iteration 2811/3560 Training loss: 1.1566 0.1213 sec/batch\n", + "Epoch 16/20 Iteration 2812/3560 Training loss: 1.1567 0.1218 sec/batch\n", + "Epoch 16/20 Iteration 2813/3560 Training loss: 1.1566 0.1255 sec/batch\n", + "Epoch 16/20 Iteration 2814/3560 Training loss: 1.1568 0.1233 sec/batch\n", + "Epoch 16/20 Iteration 2815/3560 Training loss: 1.1567 0.1225 sec/batch\n", + "Epoch 16/20 Iteration 2816/3560 Training loss: 1.1569 0.1235 sec/batch\n", + "Epoch 16/20 Iteration 2817/3560 Training loss: 1.1570 0.1226 sec/batch\n", + "Epoch 16/20 Iteration 2818/3560 Training loss: 1.1572 0.1242 sec/batch\n", + "Epoch 16/20 Iteration 2819/3560 Training loss: 1.1573 0.1244 sec/batch\n", + "Epoch 16/20 Iteration 2820/3560 Training loss: 1.1573 0.1248 sec/batch\n", + "Epoch 16/20 Iteration 2821/3560 Training loss: 1.1571 0.1241 sec/batch\n", + "Epoch 16/20 Iteration 2822/3560 Training loss: 1.1569 0.1240 sec/batch\n", + "Epoch 16/20 Iteration 2823/3560 Training loss: 1.1569 0.1217 sec/batch\n", + "Epoch 16/20 Iteration 2824/3560 Training loss: 1.1569 0.1240 sec/batch\n", + "Epoch 16/20 Iteration 2825/3560 Training loss: 1.1568 0.1217 sec/batch\n", + "Epoch 16/20 Iteration 2826/3560 Training loss: 1.1567 0.1251 sec/batch\n", + "Epoch 16/20 Iteration 2827/3560 Training loss: 1.1567 0.1328 sec/batch\n", + "Epoch 16/20 Iteration 2828/3560 Training loss: 1.1566 0.1212 sec/batch\n", + "Epoch 16/20 Iteration 2829/3560 Training loss: 1.1564 0.1248 sec/batch\n", + "Epoch 16/20 Iteration 2830/3560 Training loss: 1.1565 0.1216 sec/batch\n", + "Epoch 16/20 Iteration 2831/3560 Training loss: 1.1567 0.1214 sec/batch\n", + "Epoch 16/20 Iteration 2832/3560 Training loss: 1.1567 0.1224 sec/batch\n", + 
"Epoch 16/20 Iteration 2833/3560 Training loss: 1.1566 0.1220 sec/batch\n", + "Epoch 16/20 Iteration 2834/3560 Training loss: 1.1567 0.1224 sec/batch\n", + "Epoch 16/20 Iteration 2835/3560 Training loss: 1.1566 0.1261 sec/batch\n", + "Epoch 16/20 Iteration 2836/3560 Training loss: 1.1566 0.1216 sec/batch\n", + "Epoch 16/20 Iteration 2837/3560 Training loss: 1.1567 0.1287 sec/batch\n", + "Epoch 16/20 Iteration 2838/3560 Training loss: 1.1571 0.1218 sec/batch\n", + "Epoch 16/20 Iteration 2839/3560 Training loss: 1.1571 0.1223 sec/batch\n", + "Epoch 16/20 Iteration 2840/3560 Training loss: 1.1572 0.1220 sec/batch\n", + "Epoch 16/20 Iteration 2841/3560 Training loss: 1.1571 0.1253 sec/batch\n", + "Epoch 16/20 Iteration 2842/3560 Training loss: 1.1570 0.1213 sec/batch\n", + "Epoch 16/20 Iteration 2843/3560 Training loss: 1.1572 0.1242 sec/batch\n", + "Epoch 16/20 Iteration 2844/3560 Training loss: 1.1572 0.1225 sec/batch\n", + "Epoch 16/20 Iteration 2845/3560 Training loss: 1.1573 0.1212 sec/batch\n", + "Epoch 16/20 Iteration 2846/3560 Training loss: 1.1572 0.1220 sec/batch\n", + "Epoch 16/20 Iteration 2847/3560 Training loss: 1.1570 0.1216 sec/batch\n", + "Epoch 16/20 Iteration 2848/3560 Training loss: 1.1572 0.1230 sec/batch\n", + "Epoch 17/20 Iteration 2849/3560 Training loss: 1.2867 0.1253 sec/batch\n", + "Epoch 17/20 Iteration 2850/3560 Training loss: 1.2333 0.1271 sec/batch\n", + "Epoch 17/20 Iteration 2851/3560 Training loss: 1.2167 0.1223 sec/batch\n", + "Epoch 17/20 Iteration 2852/3560 Training loss: 1.2087 0.1213 sec/batch\n", + "Epoch 17/20 Iteration 2853/3560 Training loss: 1.1952 0.1237 sec/batch\n", + "Epoch 17/20 Iteration 2854/3560 Training loss: 1.1831 0.1238 sec/batch\n", + "Epoch 17/20 Iteration 2855/3560 Training loss: 1.1815 0.1245 sec/batch\n", + "Epoch 17/20 Iteration 2856/3560 Training loss: 1.1785 0.1223 sec/batch\n", + "Epoch 17/20 Iteration 2857/3560 Training loss: 1.1761 0.1224 sec/batch\n", + "Epoch 17/20 Iteration 2858/3560 Training loss: 
1.1741 0.1252 sec/batch\n", + "Epoch 17/20 Iteration 2859/3560 Training loss: 1.1712 0.1226 sec/batch\n", + "Epoch 17/20 Iteration 2860/3560 Training loss: 1.1711 0.1247 sec/batch\n", + "Epoch 17/20 Iteration 2861/3560 Training loss: 1.1704 0.1234 sec/batch\n", + "Epoch 17/20 Iteration 2862/3560 Training loss: 1.1702 0.1218 sec/batch\n", + "Epoch 17/20 Iteration 2863/3560 Training loss: 1.1683 0.1311 sec/batch\n", + "Epoch 17/20 Iteration 2864/3560 Training loss: 1.1665 0.1226 sec/batch\n", + "Epoch 17/20 Iteration 2865/3560 Training loss: 1.1665 0.1245 sec/batch\n", + "Epoch 17/20 Iteration 2866/3560 Training loss: 1.1672 0.1250 sec/batch\n", + "Epoch 17/20 Iteration 2867/3560 Training loss: 1.1666 0.1243 sec/batch\n", + "Epoch 17/20 Iteration 2868/3560 Training loss: 1.1676 0.1310 sec/batch\n", + "Epoch 17/20 Iteration 2869/3560 Training loss: 1.1668 0.1239 sec/batch\n", + "Epoch 17/20 Iteration 2870/3560 Training loss: 1.1666 0.1237 sec/batch\n", + "Epoch 17/20 Iteration 2871/3560 Training loss: 1.1660 0.1235 sec/batch\n", + "Epoch 17/20 Iteration 2872/3560 Training loss: 1.1660 0.1231 sec/batch\n", + "Epoch 17/20 Iteration 2873/3560 Training loss: 1.1656 0.1206 sec/batch\n", + "Epoch 17/20 Iteration 2874/3560 Training loss: 1.1634 0.1218 sec/batch\n", + "Epoch 17/20 Iteration 2875/3560 Training loss: 1.1621 0.1218 sec/batch\n", + "Epoch 17/20 Iteration 2876/3560 Training loss: 1.1627 0.1236 sec/batch\n", + "Epoch 17/20 Iteration 2877/3560 Training loss: 1.1626 0.1251 sec/batch\n", + "Epoch 17/20 Iteration 2878/3560 Training loss: 1.1625 0.1255 sec/batch\n", + "Epoch 17/20 Iteration 2879/3560 Training loss: 1.1615 0.1214 sec/batch\n", + "Epoch 17/20 Iteration 2880/3560 Training loss: 1.1606 0.1225 sec/batch\n", + "Epoch 17/20 Iteration 2881/3560 Training loss: 1.1605 0.1227 sec/batch\n", + "Epoch 17/20 Iteration 2882/3560 Training loss: 1.1604 0.1225 sec/batch\n", + "Epoch 17/20 Iteration 2883/3560 Training loss: 1.1598 0.1269 sec/batch\n", + "Epoch 17/20 
Iteration 2884/3560 Training loss: 1.1595 0.1211 sec/batch\n", + "Epoch 17/20 Iteration 2885/3560 Training loss: 1.1587 0.1207 sec/batch\n", + "Epoch 17/20 Iteration 2886/3560 Training loss: 1.1575 0.1247 sec/batch\n", + "Epoch 17/20 Iteration 2887/3560 Training loss: 1.1564 0.1207 sec/batch\n", + "Epoch 17/20 Iteration 2888/3560 Training loss: 1.1561 0.1225 sec/batch\n", + "Epoch 17/20 Iteration 2889/3560 Training loss: 1.1554 0.1209 sec/batch\n", + "Epoch 17/20 Iteration 2890/3560 Training loss: 1.1562 0.1211 sec/batch\n", + "Epoch 17/20 Iteration 2891/3560 Training loss: 1.1559 0.1200 sec/batch\n", + "Epoch 17/20 Iteration 2892/3560 Training loss: 1.1553 0.1207 sec/batch\n", + "Epoch 17/20 Iteration 2893/3560 Training loss: 1.1553 0.1207 sec/batch\n", + "Epoch 17/20 Iteration 2894/3560 Training loss: 1.1546 0.1231 sec/batch\n", + "Epoch 17/20 Iteration 2895/3560 Training loss: 1.1541 0.1237 sec/batch\n", + "Epoch 17/20 Iteration 2896/3560 Training loss: 1.1537 0.1269 sec/batch\n", + "Epoch 17/20 Iteration 2897/3560 Training loss: 1.1534 0.1216 sec/batch\n", + "Epoch 17/20 Iteration 2898/3560 Training loss: 1.1534 0.1210 sec/batch\n", + "Epoch 17/20 Iteration 2899/3560 Training loss: 1.1529 0.1209 sec/batch\n", + "Epoch 17/20 Iteration 2900/3560 Training loss: 1.1534 0.1244 sec/batch\n", + "Epoch 17/20 Iteration 2901/3560 Training loss: 1.1535 0.1200 sec/batch\n", + "Epoch 17/20 Iteration 2902/3560 Training loss: 1.1536 0.1211 sec/batch\n", + "Epoch 17/20 Iteration 2903/3560 Training loss: 1.1533 0.1238 sec/batch\n", + "Epoch 17/20 Iteration 2904/3560 Training loss: 1.1533 0.1205 sec/batch\n", + "Epoch 17/20 Iteration 2905/3560 Training loss: 1.1536 0.1208 sec/batch\n", + "Epoch 17/20 Iteration 2906/3560 Training loss: 1.1532 0.1222 sec/batch\n", + "Epoch 17/20 Iteration 2907/3560 Training loss: 1.1528 0.1215 sec/batch\n", + "Epoch 17/20 Iteration 2908/3560 Training loss: 1.1532 0.1208 sec/batch\n", + "Epoch 17/20 Iteration 2909/3560 Training loss: 1.1531 0.1209 
sec/batch\n", + "Epoch 17/20 Iteration 2910/3560 Training loss: 1.1537 0.1206 sec/batch\n", + "Epoch 17/20 Iteration 2911/3560 Training loss: 1.1540 0.1197 sec/batch\n", + "Epoch 17/20 Iteration 2912/3560 Training loss: 1.1540 0.1214 sec/batch\n", + "Epoch 17/20 Iteration 2913/3560 Training loss: 1.1539 0.1213 sec/batch\n", + "Epoch 17/20 Iteration 2914/3560 Training loss: 1.1540 0.1226 sec/batch\n", + "Epoch 17/20 Iteration 2915/3560 Training loss: 1.1543 0.1207 sec/batch\n", + "Epoch 17/20 Iteration 2916/3560 Training loss: 1.1542 0.1244 sec/batch\n", + "Epoch 17/20 Iteration 2917/3560 Training loss: 1.1542 0.1211 sec/batch\n", + "Epoch 17/20 Iteration 2918/3560 Training loss: 1.1543 0.1222 sec/batch\n", + "Epoch 17/20 Iteration 2919/3560 Training loss: 1.1547 0.1207 sec/batch\n", + "Epoch 17/20 Iteration 2920/3560 Training loss: 1.1549 0.1297 sec/batch\n", + "Epoch 17/20 Iteration 2921/3560 Training loss: 1.1555 0.1218 sec/batch\n", + "Epoch 17/20 Iteration 2922/3560 Training loss: 1.1551 0.1243 sec/batch\n", + "Epoch 17/20 Iteration 2923/3560 Training loss: 1.1550 0.1201 sec/batch\n", + "Epoch 17/20 Iteration 2924/3560 Training loss: 1.1550 0.1244 sec/batch\n", + "Epoch 17/20 Iteration 2925/3560 Training loss: 1.1549 0.1218 sec/batch\n", + "Epoch 17/20 Iteration 2926/3560 Training loss: 1.1547 0.1220 sec/batch\n", + "Epoch 17/20 Iteration 2927/3560 Training loss: 1.1541 0.1232 sec/batch\n", + "Epoch 17/20 Iteration 2928/3560 Training loss: 1.1541 0.1224 sec/batch\n", + "Epoch 17/20 Iteration 2929/3560 Training loss: 1.1538 0.1202 sec/batch\n", + "Epoch 17/20 Iteration 2930/3560 Training loss: 1.1536 0.1214 sec/batch\n", + "Epoch 17/20 Iteration 2931/3560 Training loss: 1.1531 0.1210 sec/batch\n", + "Epoch 17/20 Iteration 2932/3560 Training loss: 1.1532 0.1226 sec/batch\n", + "Epoch 17/20 Iteration 2933/3560 Training loss: 1.1529 0.1202 sec/batch\n", + "Epoch 17/20 Iteration 2934/3560 Training loss: 1.1528 0.1263 sec/batch\n", + "Epoch 17/20 Iteration 2935/3560 
Training loss: 1.1526 0.1257 sec/batch\n", + "Epoch 17/20 Iteration 2936/3560 Training loss: 1.1523 0.1226 sec/batch\n", + "Epoch 17/20 Iteration 2937/3560 Training loss: 1.1520 0.1216 sec/batch\n", + "Epoch 17/20 Iteration 2938/3560 Training loss: 1.1520 0.1352 sec/batch\n", + "Epoch 17/20 Iteration 2939/3560 Training loss: 1.1517 0.1215 sec/batch\n", + "Epoch 17/20 Iteration 2940/3560 Training loss: 1.1516 0.1266 sec/batch\n", + "Epoch 17/20 Iteration 2941/3560 Training loss: 1.1513 0.1228 sec/batch\n", + "Epoch 17/20 Iteration 2942/3560 Training loss: 1.1508 0.1208 sec/batch\n", + "Epoch 17/20 Iteration 2943/3560 Training loss: 1.1506 0.1202 sec/batch\n", + "Epoch 17/20 Iteration 2944/3560 Training loss: 1.1506 0.1217 sec/batch\n", + "Epoch 17/20 Iteration 2945/3560 Training loss: 1.1506 0.1233 sec/batch\n", + "Epoch 17/20 Iteration 2946/3560 Training loss: 1.1502 0.1290 sec/batch\n", + "Epoch 17/20 Iteration 2947/3560 Training loss: 1.1499 0.1212 sec/batch\n", + "Epoch 17/20 Iteration 2948/3560 Training loss: 1.1496 0.1216 sec/batch\n", + "Epoch 17/20 Iteration 2949/3560 Training loss: 1.1495 0.1252 sec/batch\n", + "Epoch 17/20 Iteration 2950/3560 Training loss: 1.1493 0.1279 sec/batch\n", + "Epoch 17/20 Iteration 2951/3560 Training loss: 1.1493 0.1242 sec/batch\n", + "Epoch 17/20 Iteration 2952/3560 Training loss: 1.1491 0.1278 sec/batch\n", + "Epoch 17/20 Iteration 2953/3560 Training loss: 1.1489 0.1273 sec/batch\n", + "Epoch 17/20 Iteration 2954/3560 Training loss: 1.1488 0.1237 sec/batch\n", + "Epoch 17/20 Iteration 2955/3560 Training loss: 1.1488 0.1239 sec/batch\n", + "Epoch 17/20 Iteration 2956/3560 Training loss: 1.1488 0.1258 sec/batch\n", + "Epoch 17/20 Iteration 2957/3560 Training loss: 1.1487 0.1239 sec/batch\n", + "Epoch 17/20 Iteration 2958/3560 Training loss: 1.1487 0.1284 sec/batch\n", + "Epoch 17/20 Iteration 2959/3560 Training loss: 1.1485 0.1248 sec/batch\n", + "Epoch 17/20 Iteration 2960/3560 Training loss: 1.1484 0.1213 sec/batch\n", + 
"Epoch 17/20 Iteration 2961/3560 Training loss: 1.1483 0.1245 sec/batch\n", + "Epoch 17/20 Iteration 2962/3560 Training loss: 1.1481 0.1250 sec/batch\n", + "Epoch 17/20 Iteration 2963/3560 Training loss: 1.1479 0.1216 sec/batch\n", + "Epoch 17/20 Iteration 2964/3560 Training loss: 1.1477 0.1239 sec/batch\n", + "Epoch 17/20 Iteration 2965/3560 Training loss: 1.1476 0.1236 sec/batch\n", + "Epoch 17/20 Iteration 2966/3560 Training loss: 1.1476 0.1233 sec/batch\n", + "Epoch 17/20 Iteration 2967/3560 Training loss: 1.1474 0.1233 sec/batch\n", + "Epoch 17/20 Iteration 2968/3560 Training loss: 1.1474 0.1307 sec/batch\n", + "Epoch 17/20 Iteration 2969/3560 Training loss: 1.1473 0.1263 sec/batch\n", + "Epoch 17/20 Iteration 2970/3560 Training loss: 1.1470 0.1220 sec/batch\n", + "Epoch 17/20 Iteration 2971/3560 Training loss: 1.1466 0.1240 sec/batch\n", + "Epoch 17/20 Iteration 2972/3560 Training loss: 1.1465 0.1276 sec/batch\n", + "Epoch 17/20 Iteration 2973/3560 Training loss: 1.1465 0.1253 sec/batch\n", + "Epoch 17/20 Iteration 2974/3560 Training loss: 1.1462 0.1261 sec/batch\n", + "Epoch 17/20 Iteration 2975/3560 Training loss: 1.1461 0.1272 sec/batch\n", + "Epoch 17/20 Iteration 2976/3560 Training loss: 1.1462 0.1221 sec/batch\n", + "Epoch 17/20 Iteration 2977/3560 Training loss: 1.1459 0.1211 sec/batch\n", + "Epoch 17/20 Iteration 2978/3560 Training loss: 1.1456 0.1242 sec/batch\n", + "Epoch 17/20 Iteration 2979/3560 Training loss: 1.1452 0.1227 sec/batch\n", + "Epoch 17/20 Iteration 2980/3560 Training loss: 1.1451 0.1247 sec/batch\n", + "Epoch 17/20 Iteration 2981/3560 Training loss: 1.1452 0.1259 sec/batch\n", + "Epoch 17/20 Iteration 2982/3560 Training loss: 1.1451 0.1253 sec/batch\n", + "Epoch 17/20 Iteration 2983/3560 Training loss: 1.1451 0.1218 sec/batch\n", + "Epoch 17/20 Iteration 2984/3560 Training loss: 1.1451 0.1272 sec/batch\n", + "Epoch 17/20 Iteration 2985/3560 Training loss: 1.1452 0.1234 sec/batch\n", + "Epoch 17/20 Iteration 2986/3560 Training loss: 
1.1453 0.1255 sec/batch\n", + "Epoch 17/20 Iteration 2987/3560 Training loss: 1.1453 0.1273 sec/batch\n", + "Epoch 17/20 Iteration 2988/3560 Training loss: 1.1453 0.1290 sec/batch\n", + "Epoch 17/20 Iteration 2989/3560 Training loss: 1.1456 0.1323 sec/batch\n", + "Epoch 17/20 Iteration 2990/3560 Training loss: 1.1457 0.1255 sec/batch\n", + "Epoch 17/20 Iteration 2991/3560 Training loss: 1.1456 0.1154 sec/batch\n", + "Epoch 17/20 Iteration 2992/3560 Training loss: 1.1458 0.1147 sec/batch\n", + "Epoch 17/20 Iteration 2993/3560 Training loss: 1.1458 0.1163 sec/batch\n", + "Epoch 17/20 Iteration 2994/3560 Training loss: 1.1460 0.1159 sec/batch\n", + "Epoch 17/20 Iteration 2995/3560 Training loss: 1.1460 0.1162 sec/batch\n", + "Epoch 17/20 Iteration 2996/3560 Training loss: 1.1462 0.1150 sec/batch\n", + "Epoch 17/20 Iteration 2997/3560 Training loss: 1.1464 0.1150 sec/batch\n", + "Epoch 17/20 Iteration 2998/3560 Training loss: 1.1464 0.1147 sec/batch\n", + "Epoch 17/20 Iteration 2999/3560 Training loss: 1.1462 0.1170 sec/batch\n", + "Epoch 17/20 Iteration 3000/3560 Training loss: 1.1460 0.1160 sec/batch\n", + "Epoch 17/20 Iteration 3001/3560 Training loss: 1.1461 0.1152 sec/batch\n", + "Epoch 17/20 Iteration 3002/3560 Training loss: 1.1460 0.1165 sec/batch\n", + "Epoch 17/20 Iteration 3003/3560 Training loss: 1.1459 0.1174 sec/batch\n", + "Epoch 17/20 Iteration 3004/3560 Training loss: 1.1458 0.1156 sec/batch\n", + "Epoch 17/20 Iteration 3005/3560 Training loss: 1.1458 0.1183 sec/batch\n", + "Epoch 17/20 Iteration 3006/3560 Training loss: 1.1457 0.1150 sec/batch\n", + "Epoch 17/20 Iteration 3007/3560 Training loss: 1.1455 0.1173 sec/batch\n", + "Epoch 17/20 Iteration 3008/3560 Training loss: 1.1456 0.1149 sec/batch\n", + "Epoch 17/20 Iteration 3009/3560 Training loss: 1.1458 0.1145 sec/batch\n", + "Epoch 17/20 Iteration 3010/3560 Training loss: 1.1458 0.1162 sec/batch\n", + "Epoch 17/20 Iteration 3011/3560 Training loss: 1.1457 0.1214 sec/batch\n", + "Epoch 17/20 
Iteration 3012/3560 Training loss: 1.1457 0.1199 sec/batch\n", + "Epoch 17/20 Iteration 3013/3560 Training loss: 1.1457 0.1152 sec/batch\n", + "Epoch 17/20 Iteration 3014/3560 Training loss: 1.1457 0.1217 sec/batch\n", + "Epoch 17/20 Iteration 3015/3560 Training loss: 1.1458 0.1178 sec/batch\n", + "Epoch 17/20 Iteration 3016/3560 Training loss: 1.1462 0.1150 sec/batch\n", + "Epoch 17/20 Iteration 3017/3560 Training loss: 1.1463 0.1156 sec/batch\n", + "Epoch 17/20 Iteration 3018/3560 Training loss: 1.1463 0.1154 sec/batch\n", + "Epoch 17/20 Iteration 3019/3560 Training loss: 1.1463 0.1183 sec/batch\n", + "Epoch 17/20 Iteration 3020/3560 Training loss: 1.1462 0.1168 sec/batch\n", + "Epoch 17/20 Iteration 3021/3560 Training loss: 1.1464 0.1157 sec/batch\n", + "Epoch 17/20 Iteration 3022/3560 Training loss: 1.1464 0.1155 sec/batch\n", + "Epoch 17/20 Iteration 3023/3560 Training loss: 1.1465 0.1144 sec/batch\n", + "Epoch 17/20 Iteration 3024/3560 Training loss: 1.1463 0.1162 sec/batch\n", + "Epoch 17/20 Iteration 3025/3560 Training loss: 1.1463 0.1216 sec/batch\n", + "Epoch 17/20 Iteration 3026/3560 Training loss: 1.1465 0.1219 sec/batch\n", + "Epoch 18/20 Iteration 3027/3560 Training loss: 1.2656 0.1204 sec/batch\n", + "Epoch 18/20 Iteration 3028/3560 Training loss: 1.2188 0.1170 sec/batch\n", + "Epoch 18/20 Iteration 3029/3560 Training loss: 1.1966 0.1220 sec/batch\n", + "Epoch 18/20 Iteration 3030/3560 Training loss: 1.1909 0.1270 sec/batch\n", + "Epoch 18/20 Iteration 3031/3560 Training loss: 1.1816 0.1331 sec/batch\n", + "Epoch 18/20 Iteration 3032/3560 Training loss: 1.1695 0.1349 sec/batch\n", + "Epoch 18/20 Iteration 3033/3560 Training loss: 1.1673 0.1216 sec/batch\n", + "Epoch 18/20 Iteration 3034/3560 Training loss: 1.1654 0.1318 sec/batch\n", + "Epoch 18/20 Iteration 3035/3560 Training loss: 1.1637 0.1259 sec/batch\n", + "Epoch 18/20 Iteration 3036/3560 Training loss: 1.1632 0.1254 sec/batch\n", + "Epoch 18/20 Iteration 3037/3560 Training loss: 1.1594 0.1216 
sec/batch\n", + "Epoch 18/20 Iteration 3038/3560 Training loss: 1.1592 0.1237 sec/batch\n", + "Epoch 18/20 Iteration 3039/3560 Training loss: 1.1585 0.1243 sec/batch\n", + "Epoch 18/20 Iteration 3040/3560 Training loss: 1.1582 0.1231 sec/batch\n", + "Epoch 18/20 Iteration 3041/3560 Training loss: 1.1568 0.1218 sec/batch\n", + "Epoch 18/20 Iteration 3042/3560 Training loss: 1.1548 0.1252 sec/batch\n", + "Epoch 18/20 Iteration 3043/3560 Training loss: 1.1549 0.1219 sec/batch\n", + "Epoch 18/20 Iteration 3044/3560 Training loss: 1.1556 0.1261 sec/batch\n", + "Epoch 18/20 Iteration 3045/3560 Training loss: 1.1552 0.1247 sec/batch\n", + "Epoch 18/20 Iteration 3046/3560 Training loss: 1.1561 0.1234 sec/batch\n", + "Epoch 18/20 Iteration 3047/3560 Training loss: 1.1551 0.1226 sec/batch\n", + "Epoch 18/20 Iteration 3048/3560 Training loss: 1.1553 0.1226 sec/batch\n", + "Epoch 18/20 Iteration 3049/3560 Training loss: 1.1550 0.1331 sec/batch\n", + "Epoch 18/20 Iteration 3050/3560 Training loss: 1.1551 0.1223 sec/batch\n", + "Epoch 18/20 Iteration 3051/3560 Training loss: 1.1547 0.1233 sec/batch\n", + "Epoch 18/20 Iteration 3052/3560 Training loss: 1.1528 0.1223 sec/batch\n", + "Epoch 18/20 Iteration 3053/3560 Training loss: 1.1517 0.1213 sec/batch\n", + "Epoch 18/20 Iteration 3054/3560 Training loss: 1.1518 0.1241 sec/batch\n", + "Epoch 18/20 Iteration 3055/3560 Training loss: 1.1519 0.1237 sec/batch\n", + "Epoch 18/20 Iteration 3056/3560 Training loss: 1.1521 0.1213 sec/batch\n", + "Epoch 18/20 Iteration 3057/3560 Training loss: 1.1513 0.1232 sec/batch\n", + "Epoch 18/20 Iteration 3058/3560 Training loss: 1.1503 0.1228 sec/batch\n", + "Epoch 18/20 Iteration 3059/3560 Training loss: 1.1503 0.1228 sec/batch\n", + "Epoch 18/20 Iteration 3060/3560 Training loss: 1.1505 0.1220 sec/batch\n", + "Epoch 18/20 Iteration 3061/3560 Training loss: 1.1502 0.1227 sec/batch\n", + "Epoch 18/20 Iteration 3062/3560 Training loss: 1.1501 0.1237 sec/batch\n", + "Epoch 18/20 Iteration 3063/3560 
Training loss: 1.1493 0.1217 sec/batch\n", + "Epoch 18/20 Iteration 3064/3560 Training loss: 1.1481 0.1222 sec/batch\n", + "Epoch 18/20 Iteration 3065/3560 Training loss: 1.1468 0.1233 sec/batch\n", + "Epoch 18/20 Iteration 3066/3560 Training loss: 1.1465 0.1231 sec/batch\n", + "Epoch 18/20 Iteration 3067/3560 Training loss: 1.1458 0.1245 sec/batch\n", + "Epoch 18/20 Iteration 3068/3560 Training loss: 1.1466 0.1244 sec/batch\n", + "Epoch 18/20 Iteration 3069/3560 Training loss: 1.1462 0.1235 sec/batch\n", + "Epoch 18/20 Iteration 3070/3560 Training loss: 1.1457 0.1227 sec/batch\n", + "Epoch 18/20 Iteration 3071/3560 Training loss: 1.1457 0.1225 sec/batch\n", + "Epoch 18/20 Iteration 3072/3560 Training loss: 1.1449 0.1253 sec/batch\n", + "Epoch 18/20 Iteration 3073/3560 Training loss: 1.1444 0.1221 sec/batch\n", + "Epoch 18/20 Iteration 3074/3560 Training loss: 1.1441 0.1231 sec/batch\n", + "Epoch 18/20 Iteration 3075/3560 Training loss: 1.1439 0.1238 sec/batch\n", + "Epoch 18/20 Iteration 3076/3560 Training loss: 1.1438 0.1243 sec/batch\n", + "Epoch 18/20 Iteration 3077/3560 Training loss: 1.1432 0.1245 sec/batch\n", + "Epoch 18/20 Iteration 3078/3560 Training loss: 1.1438 0.1233 sec/batch\n", + "Epoch 18/20 Iteration 3079/3560 Training loss: 1.1438 0.1242 sec/batch\n", + "Epoch 18/20 Iteration 3080/3560 Training loss: 1.1439 0.1276 sec/batch\n", + "Epoch 18/20 Iteration 3081/3560 Training loss: 1.1437 0.1231 sec/batch\n", + "Epoch 18/20 Iteration 3082/3560 Training loss: 1.1436 0.1225 sec/batch\n", + "Epoch 18/20 Iteration 3083/3560 Training loss: 1.1437 0.1276 sec/batch\n", + "Epoch 18/20 Iteration 3084/3560 Training loss: 1.1434 0.1244 sec/batch\n", + "Epoch 18/20 Iteration 3085/3560 Training loss: 1.1427 0.1216 sec/batch\n", + "Epoch 18/20 Iteration 3086/3560 Training loss: 1.1429 0.1269 sec/batch\n", + "Epoch 18/20 Iteration 3087/3560 Training loss: 1.1427 0.1224 sec/batch\n", + "Epoch 18/20 Iteration 3088/3560 Training loss: 1.1434 0.1245 sec/batch\n", + 
"Epoch 18/20 Iteration 3089/3560 Training loss: 1.1438 0.1250 sec/batch\n", + "Epoch 18/20 Iteration 3090/3560 Training loss: 1.1438 0.1267 sec/batch\n", + "Epoch 18/20 Iteration 3091/3560 Training loss: 1.1437 0.1216 sec/batch\n", + "Epoch 18/20 Iteration 3092/3560 Training loss: 1.1437 0.1256 sec/batch\n", + "Epoch 18/20 Iteration 3093/3560 Training loss: 1.1438 0.1270 sec/batch\n", + "Epoch 18/20 Iteration 3094/3560 Training loss: 1.1437 0.1243 sec/batch\n", + "Epoch 18/20 Iteration 3095/3560 Training loss: 1.1437 0.1238 sec/batch\n", + "Epoch 18/20 Iteration 3096/3560 Training loss: 1.1435 0.1224 sec/batch\n", + "Epoch 18/20 Iteration 3097/3560 Training loss: 1.1440 0.1231 sec/batch\n", + "Epoch 18/20 Iteration 3098/3560 Training loss: 1.1442 0.1242 sec/batch\n", + "Epoch 18/20 Iteration 3099/3560 Training loss: 1.1445 0.1231 sec/batch\n", + "Epoch 18/20 Iteration 3100/3560 Training loss: 1.1442 0.1237 sec/batch\n", + "Epoch 18/20 Iteration 3101/3560 Training loss: 1.1441 0.1246 sec/batch\n", + "Epoch 18/20 Iteration 3102/3560 Training loss: 1.1441 0.1223 sec/batch\n", + "Epoch 18/20 Iteration 3103/3560 Training loss: 1.1441 0.1240 sec/batch\n", + "Epoch 18/20 Iteration 3104/3560 Training loss: 1.1439 0.1245 sec/batch\n", + "Epoch 18/20 Iteration 3105/3560 Training loss: 1.1433 0.1247 sec/batch\n", + "Epoch 18/20 Iteration 3106/3560 Training loss: 1.1432 0.1299 sec/batch\n", + "Epoch 18/20 Iteration 3107/3560 Training loss: 1.1430 0.1217 sec/batch\n", + "Epoch 18/20 Iteration 3108/3560 Training loss: 1.1430 0.1233 sec/batch\n", + "Epoch 18/20 Iteration 3109/3560 Training loss: 1.1425 0.1227 sec/batch\n", + "Epoch 18/20 Iteration 3110/3560 Training loss: 1.1425 0.1239 sec/batch\n", + "Epoch 18/20 Iteration 3111/3560 Training loss: 1.1423 0.1229 sec/batch\n", + "Epoch 18/20 Iteration 3112/3560 Training loss: 1.1421 0.1231 sec/batch\n", + "Epoch 18/20 Iteration 3113/3560 Training loss: 1.1418 0.1245 sec/batch\n", + "Epoch 18/20 Iteration 3114/3560 Training loss: 
1.1417 0.1298 sec/batch\n", + "Epoch 18/20 Iteration 3115/3560 Training loss: 1.1413 0.1229 sec/batch\n", + "Epoch 18/20 Iteration 3116/3560 Training loss: 1.1413 0.1231 sec/batch\n", + "Epoch 18/20 Iteration 3117/3560 Training loss: 1.1411 0.1232 sec/batch\n", + "Epoch 18/20 Iteration 3118/3560 Training loss: 1.1410 0.1222 sec/batch\n", + "Epoch 18/20 Iteration 3119/3560 Training loss: 1.1407 0.1255 sec/batch\n", + "Epoch 18/20 Iteration 3120/3560 Training loss: 1.1404 0.1267 sec/batch\n", + "Epoch 18/20 Iteration 3121/3560 Training loss: 1.1402 0.1215 sec/batch\n", + "Epoch 18/20 Iteration 3122/3560 Training loss: 1.1403 0.1216 sec/batch\n", + "Epoch 18/20 Iteration 3123/3560 Training loss: 1.1403 0.1220 sec/batch\n", + "Epoch 18/20 Iteration 3124/3560 Training loss: 1.1399 0.1246 sec/batch\n", + "Epoch 18/20 Iteration 3125/3560 Training loss: 1.1396 0.1222 sec/batch\n", + "Epoch 18/20 Iteration 3126/3560 Training loss: 1.1393 0.1236 sec/batch\n", + "Epoch 18/20 Iteration 3127/3560 Training loss: 1.1393 0.1316 sec/batch\n", + "Epoch 18/20 Iteration 3128/3560 Training loss: 1.1391 0.1227 sec/batch\n", + "Epoch 18/20 Iteration 3129/3560 Training loss: 1.1392 0.1235 sec/batch\n", + "Epoch 18/20 Iteration 3130/3560 Training loss: 1.1391 0.1222 sec/batch\n", + "Epoch 18/20 Iteration 3131/3560 Training loss: 1.1390 0.1233 sec/batch\n", + "Epoch 18/20 Iteration 3132/3560 Training loss: 1.1389 0.1233 sec/batch\n", + "Epoch 18/20 Iteration 3133/3560 Training loss: 1.1389 0.1227 sec/batch\n", + "Epoch 18/20 Iteration 3134/3560 Training loss: 1.1388 0.1285 sec/batch\n", + "Epoch 18/20 Iteration 3135/3560 Training loss: 1.1386 0.1217 sec/batch\n", + "Epoch 18/20 Iteration 3136/3560 Training loss: 1.1387 0.1240 sec/batch\n", + "Epoch 18/20 Iteration 3137/3560 Training loss: 1.1385 0.1248 sec/batch\n", + "Epoch 18/20 Iteration 3138/3560 Training loss: 1.1384 0.1220 sec/batch\n", + "Epoch 18/20 Iteration 3139/3560 Training loss: 1.1384 0.1222 sec/batch\n", + "Epoch 18/20 
Iteration 3140/3560 Training loss: 1.1383 0.1288 sec/batch\n", + "Epoch 18/20 Iteration 3141/3560 Training loss: 1.1381 0.1225 sec/batch\n", + "Epoch 18/20 Iteration 3142/3560 Training loss: 1.1379 0.1233 sec/batch\n", + "Epoch 18/20 Iteration 3143/3560 Training loss: 1.1379 0.1225 sec/batch\n", + "Epoch 18/20 Iteration 3144/3560 Training loss: 1.1379 0.1226 sec/batch\n", + "Epoch 18/20 Iteration 3145/3560 Training loss: 1.1378 0.1259 sec/batch\n", + "Epoch 18/20 Iteration 3146/3560 Training loss: 1.1378 0.1249 sec/batch\n", + "Epoch 18/20 Iteration 3147/3560 Training loss: 1.1376 0.1256 sec/batch\n", + "Epoch 18/20 Iteration 3148/3560 Training loss: 1.1373 0.1214 sec/batch\n", + "Epoch 18/20 Iteration 3149/3560 Training loss: 1.1370 0.1278 sec/batch\n", + "Epoch 18/20 Iteration 3150/3560 Training loss: 1.1369 0.1222 sec/batch\n", + "Epoch 18/20 Iteration 3151/3560 Training loss: 1.1369 0.1251 sec/batch\n", + "Epoch 18/20 Iteration 3152/3560 Training loss: 1.1365 0.1218 sec/batch\n", + "Epoch 18/20 Iteration 3153/3560 Training loss: 1.1365 0.1249 sec/batch\n", + "Epoch 18/20 Iteration 3154/3560 Training loss: 1.1364 0.1248 sec/batch\n", + "Epoch 18/20 Iteration 3155/3560 Training loss: 1.1362 0.1230 sec/batch\n", + "Epoch 18/20 Iteration 3156/3560 Training loss: 1.1359 0.1221 sec/batch\n", + "Epoch 18/20 Iteration 3157/3560 Training loss: 1.1355 0.1222 sec/batch\n", + "Epoch 18/20 Iteration 3158/3560 Training loss: 1.1354 0.1241 sec/batch\n", + "Epoch 18/20 Iteration 3159/3560 Training loss: 1.1355 0.1222 sec/batch\n", + "Epoch 18/20 Iteration 3160/3560 Training loss: 1.1355 0.1290 sec/batch\n", + "Epoch 18/20 Iteration 3161/3560 Training loss: 1.1355 0.1239 sec/batch\n", + "Epoch 18/20 Iteration 3162/3560 Training loss: 1.1355 0.1242 sec/batch\n", + "Epoch 18/20 Iteration 3163/3560 Training loss: 1.1357 0.1252 sec/batch\n", + "Epoch 18/20 Iteration 3164/3560 Training loss: 1.1358 0.1229 sec/batch\n", + "Epoch 18/20 Iteration 3165/3560 Training loss: 1.1358 0.1229 
sec/batch\n", + "Epoch 18/20 Iteration 3166/3560 Training loss: 1.1358 0.1239 sec/batch\n", + "Epoch 18/20 Iteration 3167/3560 Training loss: 1.1362 0.1231 sec/batch\n", + "Epoch 18/20 Iteration 3168/3560 Training loss: 1.1363 0.1264 sec/batch\n", + "Epoch 18/20 Iteration 3169/3560 Training loss: 1.1362 0.1221 sec/batch\n", + "Epoch 18/20 Iteration 3170/3560 Training loss: 1.1364 0.1232 sec/batch\n", + "Epoch 18/20 Iteration 3171/3560 Training loss: 1.1363 0.1247 sec/batch\n", + "Epoch 18/20 Iteration 3172/3560 Training loss: 1.1365 0.1254 sec/batch\n", + "Epoch 18/20 Iteration 3173/3560 Training loss: 1.1365 0.1259 sec/batch\n", + "Epoch 18/20 Iteration 3174/3560 Training loss: 1.1367 0.1259 sec/batch\n", + "Epoch 18/20 Iteration 3175/3560 Training loss: 1.1368 0.1264 sec/batch\n", + "Epoch 18/20 Iteration 3176/3560 Training loss: 1.1368 0.1238 sec/batch\n", + "Epoch 18/20 Iteration 3177/3560 Training loss: 1.1365 0.1213 sec/batch\n", + "Epoch 18/20 Iteration 3178/3560 Training loss: 1.1364 0.1223 sec/batch\n", + "Epoch 18/20 Iteration 3179/3560 Training loss: 1.1364 0.1246 sec/batch\n", + "Epoch 18/20 Iteration 3180/3560 Training loss: 1.1364 0.1239 sec/batch\n", + "Epoch 18/20 Iteration 3181/3560 Training loss: 1.1363 0.1219 sec/batch\n", + "Epoch 18/20 Iteration 3182/3560 Training loss: 1.1363 0.1239 sec/batch\n", + "Epoch 18/20 Iteration 3183/3560 Training loss: 1.1363 0.1220 sec/batch\n", + "Epoch 18/20 Iteration 3184/3560 Training loss: 1.1362 0.1225 sec/batch\n", + "Epoch 18/20 Iteration 3185/3560 Training loss: 1.1361 0.1231 sec/batch\n", + "Epoch 18/20 Iteration 3186/3560 Training loss: 1.1362 0.1238 sec/batch\n", + "Epoch 18/20 Iteration 3187/3560 Training loss: 1.1364 0.1229 sec/batch\n", + "Epoch 18/20 Iteration 3188/3560 Training loss: 1.1364 0.1222 sec/batch\n", + "Epoch 18/20 Iteration 3189/3560 Training loss: 1.1364 0.1255 sec/batch\n", + "Epoch 18/20 Iteration 3190/3560 Training loss: 1.1364 0.1224 sec/batch\n", + "Epoch 18/20 Iteration 3191/3560 
Training loss: 1.1364 0.1225 sec/batch\n", + "Epoch 18/20 Iteration 3192/3560 Training loss: 1.1363 0.1226 sec/batch\n", + "Epoch 18/20 Iteration 3193/3560 Training loss: 1.1365 0.1223 sec/batch\n", + "Epoch 18/20 Iteration 3194/3560 Training loss: 1.1368 0.1274 sec/batch\n", + "Epoch 18/20 Iteration 3195/3560 Training loss: 1.1368 0.1230 sec/batch\n", + "Epoch 18/20 Iteration 3196/3560 Training loss: 1.1369 0.1241 sec/batch\n", + "Epoch 18/20 Iteration 3197/3560 Training loss: 1.1369 0.1226 sec/batch\n", + "Epoch 18/20 Iteration 3198/3560 Training loss: 1.1368 0.1254 sec/batch\n", + "Epoch 18/20 Iteration 3199/3560 Training loss: 1.1370 0.1248 sec/batch\n", + "Epoch 18/20 Iteration 3200/3560 Training loss: 1.1370 0.1215 sec/batch\n", + "Epoch 18/20 Iteration 3201/3560 Training loss: 1.1370 0.1243 sec/batch\n", + "Epoch 18/20 Iteration 3202/3560 Training loss: 1.1369 0.1253 sec/batch\n", + "Epoch 18/20 Iteration 3203/3560 Training loss: 1.1368 0.1228 sec/batch\n", + "Epoch 18/20 Iteration 3204/3560 Training loss: 1.1370 0.1225 sec/batch\n", + "Epoch 19/20 Iteration 3205/3560 Training loss: 1.2480 0.1244 sec/batch\n", + "Epoch 19/20 Iteration 3206/3560 Training loss: 1.2023 0.1217 sec/batch\n", + "Epoch 19/20 Iteration 3207/3560 Training loss: 1.1822 0.1221 sec/batch\n", + "Epoch 19/20 Iteration 3208/3560 Training loss: 1.1760 0.1227 sec/batch\n", + "Epoch 19/20 Iteration 3209/3560 Training loss: 1.1670 0.1225 sec/batch\n", + "Epoch 19/20 Iteration 3210/3560 Training loss: 1.1543 0.1231 sec/batch\n", + "Epoch 19/20 Iteration 3211/3560 Training loss: 1.1514 0.1251 sec/batch\n", + "Epoch 19/20 Iteration 3212/3560 Training loss: 1.1506 0.1249 sec/batch\n", + "Epoch 19/20 Iteration 3213/3560 Training loss: 1.1497 0.1218 sec/batch\n", + "Epoch 19/20 Iteration 3214/3560 Training loss: 1.1479 0.1254 sec/batch\n", + "Epoch 19/20 Iteration 3215/3560 Training loss: 1.1441 0.1237 sec/batch\n", + "Epoch 19/20 Iteration 3216/3560 Training loss: 1.1445 0.1248 sec/batch\n", + 
"Epoch 19/20 Iteration 3217/3560 Training loss: 1.1448 0.1236 sec/batch\n", + "Epoch 19/20 Iteration 3218/3560 Training loss: 1.1454 0.1224 sec/batch\n", + "Epoch 19/20 Iteration 3219/3560 Training loss: 1.1439 0.1241 sec/batch\n", + "Epoch 19/20 Iteration 3220/3560 Training loss: 1.1421 0.1233 sec/batch\n", + "Epoch 19/20 Iteration 3221/3560 Training loss: 1.1424 0.1227 sec/batch\n", + "Epoch 19/20 Iteration 3222/3560 Training loss: 1.1436 0.1257 sec/batch\n", + "Epoch 19/20 Iteration 3223/3560 Training loss: 1.1434 0.1229 sec/batch\n", + "Epoch 19/20 Iteration 3224/3560 Training loss: 1.1440 0.1246 sec/batch\n", + "Epoch 19/20 Iteration 3225/3560 Training loss: 1.1434 0.1232 sec/batch\n", + "Epoch 19/20 Iteration 3226/3560 Training loss: 1.1434 0.1219 sec/batch\n", + "Epoch 19/20 Iteration 3227/3560 Training loss: 1.1427 0.1260 sec/batch\n", + "Epoch 19/20 Iteration 3228/3560 Training loss: 1.1434 0.1246 sec/batch\n", + "Epoch 19/20 Iteration 3229/3560 Training loss: 1.1431 0.1233 sec/batch\n", + "Epoch 19/20 Iteration 3230/3560 Training loss: 1.1414 0.1250 sec/batch\n", + "Epoch 19/20 Iteration 3231/3560 Training loss: 1.1404 0.1233 sec/batch\n", + "Epoch 19/20 Iteration 3232/3560 Training loss: 1.1408 0.1237 sec/batch\n", + "Epoch 19/20 Iteration 3233/3560 Training loss: 1.1405 0.1256 sec/batch\n", + "Epoch 19/20 Iteration 3234/3560 Training loss: 1.1411 0.1217 sec/batch\n", + "Epoch 19/20 Iteration 3235/3560 Training loss: 1.1401 0.1216 sec/batch\n", + "Epoch 19/20 Iteration 3236/3560 Training loss: 1.1390 0.1257 sec/batch\n", + "Epoch 19/20 Iteration 3237/3560 Training loss: 1.1392 0.1228 sec/batch\n", + "Epoch 19/20 Iteration 3238/3560 Training loss: 1.1392 0.1234 sec/batch\n", + "Epoch 19/20 Iteration 3239/3560 Training loss: 1.1387 0.1230 sec/batch\n", + "Epoch 19/20 Iteration 3240/3560 Training loss: 1.1386 0.1219 sec/batch\n", + "Epoch 19/20 Iteration 3241/3560 Training loss: 1.1380 0.1259 sec/batch\n", + "Epoch 19/20 Iteration 3242/3560 Training loss: 
1.1369 0.1347 sec/batch\n", + "Epoch 19/20 Iteration 3243/3560 Training loss: 1.1359 0.1212 sec/batch\n", + "Epoch 19/20 Iteration 3244/3560 Training loss: 1.1355 0.1249 sec/batch\n", + "Epoch 19/20 Iteration 3245/3560 Training loss: 1.1347 0.1237 sec/batch\n", + "Epoch 19/20 Iteration 3246/3560 Training loss: 1.1356 0.1219 sec/batch\n", + "Epoch 19/20 Iteration 3247/3560 Training loss: 1.1355 0.1233 sec/batch\n", + "Epoch 19/20 Iteration 3248/3560 Training loss: 1.1347 0.1219 sec/batch\n", + "Epoch 19/20 Iteration 3249/3560 Training loss: 1.1346 0.1242 sec/batch\n", + "Epoch 19/20 Iteration 3250/3560 Training loss: 1.1340 0.1224 sec/batch\n", + "Epoch 19/20 Iteration 3251/3560 Training loss: 1.1336 0.1232 sec/batch\n", + "Epoch 19/20 Iteration 3252/3560 Training loss: 1.1334 0.1230 sec/batch\n", + "Epoch 19/20 Iteration 3253/3560 Training loss: 1.1333 0.1269 sec/batch\n", + "Epoch 19/20 Iteration 3254/3560 Training loss: 1.1334 0.1221 sec/batch\n", + "Epoch 19/20 Iteration 3255/3560 Training loss: 1.1330 0.1251 sec/batch\n", + "Epoch 19/20 Iteration 3256/3560 Training loss: 1.1335 0.1223 sec/batch\n", + "Epoch 19/20 Iteration 3257/3560 Training loss: 1.1334 0.1217 sec/batch\n", + "Epoch 19/20 Iteration 3258/3560 Training loss: 1.1336 0.1235 sec/batch\n", + "Epoch 19/20 Iteration 3259/3560 Training loss: 1.1334 0.1214 sec/batch\n", + "Epoch 19/20 Iteration 3260/3560 Training loss: 1.1334 0.1251 sec/batch\n", + "Epoch 19/20 Iteration 3261/3560 Training loss: 1.1335 0.1239 sec/batch\n", + "Epoch 19/20 Iteration 3262/3560 Training loss: 1.1332 0.1240 sec/batch\n", + "Epoch 19/20 Iteration 3263/3560 Training loss: 1.1327 0.1219 sec/batch\n", + "Epoch 19/20 Iteration 3264/3560 Training loss: 1.1331 0.1226 sec/batch\n", + "Epoch 19/20 Iteration 3265/3560 Training loss: 1.1332 0.1224 sec/batch\n", + "Epoch 19/20 Iteration 3266/3560 Training loss: 1.1339 0.1241 sec/batch\n", + "Epoch 19/20 Iteration 3267/3560 Training loss: 1.1341 0.1238 sec/batch\n", + "Epoch 19/20 
Iteration 3268/3560 Training loss: 1.1341 0.1230 sec/batch\n", + "Epoch 19/20 Iteration 3269/3560 Training loss: 1.1341 0.1244 sec/batch\n", + "Epoch 19/20 Iteration 3270/3560 Training loss: 1.1340 0.1247 sec/batch\n", + "Epoch 19/20 Iteration 3271/3560 Training loss: 1.1343 0.1248 sec/batch\n", + "Epoch 19/20 Iteration 3272/3560 Training loss: 1.1342 0.1237 sec/batch\n", + "Epoch 19/20 Iteration 3273/3560 Training loss: 1.1342 0.1233 sec/batch\n", + "Epoch 19/20 Iteration 3274/3560 Training loss: 1.1342 0.1256 sec/batch\n", + "Epoch 19/20 Iteration 3275/3560 Training loss: 1.1346 0.1258 sec/batch\n", + "Epoch 19/20 Iteration 3276/3560 Training loss: 1.1347 0.1227 sec/batch\n", + "Epoch 19/20 Iteration 3277/3560 Training loss: 1.1351 0.1218 sec/batch\n", + "Epoch 19/20 Iteration 3278/3560 Training loss: 1.1347 0.1219 sec/batch\n", + "Epoch 19/20 Iteration 3279/3560 Training loss: 1.1347 0.1231 sec/batch\n", + "Epoch 19/20 Iteration 3280/3560 Training loss: 1.1348 0.1232 sec/batch\n", + "Epoch 19/20 Iteration 3281/3560 Training loss: 1.1348 0.1224 sec/batch\n", + "Epoch 19/20 Iteration 3282/3560 Training loss: 1.1347 0.1209 sec/batch\n", + "Epoch 19/20 Iteration 3283/3560 Training loss: 1.1342 0.1236 sec/batch\n", + "Epoch 19/20 Iteration 3284/3560 Training loss: 1.1342 0.1220 sec/batch\n", + "Epoch 19/20 Iteration 3285/3560 Training loss: 1.1338 0.1255 sec/batch\n", + "Epoch 19/20 Iteration 3286/3560 Training loss: 1.1338 0.1254 sec/batch\n", + "Epoch 19/20 Iteration 3287/3560 Training loss: 1.1333 0.1220 sec/batch\n", + "Epoch 19/20 Iteration 3288/3560 Training loss: 1.1333 0.1217 sec/batch\n", + "Epoch 19/20 Iteration 3289/3560 Training loss: 1.1330 0.1226 sec/batch\n", + "Epoch 19/20 Iteration 3290/3560 Training loss: 1.1329 0.1273 sec/batch\n", + "Epoch 19/20 Iteration 3291/3560 Training loss: 1.1327 0.1226 sec/batch\n", + "Epoch 19/20 Iteration 3292/3560 Training loss: 1.1324 0.1240 sec/batch\n", + "Epoch 19/20 Iteration 3293/3560 Training loss: 1.1321 0.1228 
sec/batch\n", + "Epoch 19/20 Iteration 3294/3560 Training loss: 1.1320 0.1221 sec/batch\n", + "Epoch 19/20 Iteration 3295/3560 Training loss: 1.1317 0.1253 sec/batch\n", + "Epoch 19/20 Iteration 3296/3560 Training loss: 1.1317 0.1250 sec/batch\n", + "Epoch 19/20 Iteration 3297/3560 Training loss: 1.1314 0.1245 sec/batch\n", + "Epoch 19/20 Iteration 3298/3560 Training loss: 1.1311 0.1263 sec/batch\n", + "Epoch 19/20 Iteration 3299/3560 Training loss: 1.1309 0.1230 sec/batch\n", + "Epoch 19/20 Iteration 3300/3560 Training loss: 1.1310 0.1222 sec/batch\n", + "Epoch 19/20 Iteration 3301/3560 Training loss: 1.1309 0.1237 sec/batch\n", + "Epoch 19/20 Iteration 3302/3560 Training loss: 1.1306 0.1222 sec/batch\n", + "Epoch 19/20 Iteration 3303/3560 Training loss: 1.1302 0.1247 sec/batch\n", + "Epoch 19/20 Iteration 3304/3560 Training loss: 1.1301 0.1217 sec/batch\n", + "Epoch 19/20 Iteration 3305/3560 Training loss: 1.1300 0.1236 sec/batch\n", + "Epoch 19/20 Iteration 3306/3560 Training loss: 1.1299 0.1234 sec/batch\n", + "Epoch 19/20 Iteration 3307/3560 Training loss: 1.1299 0.1236 sec/batch\n", + "Epoch 19/20 Iteration 3308/3560 Training loss: 1.1297 0.1249 sec/batch\n", + "Epoch 19/20 Iteration 3309/3560 Training loss: 1.1295 0.1239 sec/batch\n", + "Epoch 19/20 Iteration 3310/3560 Training loss: 1.1293 0.1221 sec/batch\n", + "Epoch 19/20 Iteration 3311/3560 Training loss: 1.1293 0.1222 sec/batch\n", + "Epoch 19/20 Iteration 3312/3560 Training loss: 1.1293 0.1272 sec/batch\n", + "Epoch 19/20 Iteration 3313/3560 Training loss: 1.1292 0.1225 sec/batch\n", + "Epoch 19/20 Iteration 3314/3560 Training loss: 1.1293 0.1233 sec/batch\n", + "Epoch 19/20 Iteration 3315/3560 Training loss: 1.1291 0.1223 sec/batch\n", + "Epoch 19/20 Iteration 3316/3560 Training loss: 1.1290 0.1266 sec/batch\n", + "Epoch 19/20 Iteration 3317/3560 Training loss: 1.1290 0.1222 sec/batch\n", + "Epoch 19/20 Iteration 3318/3560 Training loss: 1.1289 0.1280 sec/batch\n", + "Epoch 19/20 Iteration 3319/3560 
Training loss: 1.1287 0.1222 sec/batch\n", + "Epoch 19/20 Iteration 3320/3560 Training loss: 1.1284 0.1234 sec/batch\n", + "Epoch 19/20 Iteration 3321/3560 Training loss: 1.1284 0.1231 sec/batch\n", + "Epoch 19/20 Iteration 3322/3560 Training loss: 1.1284 0.1215 sec/batch\n", + "Epoch 19/20 Iteration 3323/3560 Training loss: 1.1282 0.1224 sec/batch\n", + "Epoch 19/20 Iteration 3324/3560 Training loss: 1.1282 0.1222 sec/batch\n", + "Epoch 19/20 Iteration 3325/3560 Training loss: 1.1281 0.1230 sec/batch\n", + "Epoch 19/20 Iteration 3326/3560 Training loss: 1.1279 0.1224 sec/batch\n", + "Epoch 19/20 Iteration 3327/3560 Training loss: 1.1276 0.1239 sec/batch\n", + "Epoch 19/20 Iteration 3328/3560 Training loss: 1.1276 0.1256 sec/batch\n", + "Epoch 19/20 Iteration 3329/3560 Training loss: 1.1274 0.1228 sec/batch\n", + "Epoch 19/20 Iteration 3330/3560 Training loss: 1.1271 0.1264 sec/batch\n", + "Epoch 19/20 Iteration 3331/3560 Training loss: 1.1270 0.1218 sec/batch\n", + "Epoch 19/20 Iteration 3332/3560 Training loss: 1.1270 0.1249 sec/batch\n", + "Epoch 19/20 Iteration 3333/3560 Training loss: 1.1268 0.1229 sec/batch\n", + "Epoch 19/20 Iteration 3334/3560 Training loss: 1.1265 0.1242 sec/batch\n", + "Epoch 19/20 Iteration 3335/3560 Training loss: 1.1261 0.1219 sec/batch\n", + "Epoch 19/20 Iteration 3336/3560 Training loss: 1.1261 0.1238 sec/batch\n", + "Epoch 19/20 Iteration 3337/3560 Training loss: 1.1262 0.1225 sec/batch\n", + "Epoch 19/20 Iteration 3338/3560 Training loss: 1.1261 0.1211 sec/batch\n", + "Epoch 19/20 Iteration 3339/3560 Training loss: 1.1261 0.1234 sec/batch\n", + "Epoch 19/20 Iteration 3340/3560 Training loss: 1.1260 0.1230 sec/batch\n", + "Epoch 19/20 Iteration 3341/3560 Training loss: 1.1262 0.1227 sec/batch\n", + "Epoch 19/20 Iteration 3342/3560 Training loss: 1.1264 0.1225 sec/batch\n", + "Epoch 19/20 Iteration 3343/3560 Training loss: 1.1265 0.1219 sec/batch\n", + "Epoch 19/20 Iteration 3344/3560 Training loss: 1.1265 0.1217 sec/batch\n", + 
"Epoch 19/20 Iteration 3345/3560 Training loss: 1.1268 0.1227 sec/batch\n", + "Epoch 19/20 Iteration 3346/3560 Training loss: 1.1270 0.1216 sec/batch\n", + "Epoch 19/20 Iteration 3347/3560 Training loss: 1.1268 0.1230 sec/batch\n", + "Epoch 19/20 Iteration 3348/3560 Training loss: 1.1271 0.1223 sec/batch\n", + "Epoch 19/20 Iteration 3349/3560 Training loss: 1.1270 0.1228 sec/batch\n", + "Epoch 19/20 Iteration 3350/3560 Training loss: 1.1272 0.1231 sec/batch\n", + "Epoch 19/20 Iteration 3351/3560 Training loss: 1.1272 0.1223 sec/batch\n", + "Epoch 19/20 Iteration 3352/3560 Training loss: 1.1273 0.1225 sec/batch\n", + "Epoch 19/20 Iteration 3353/3560 Training loss: 1.1275 0.1241 sec/batch\n", + "Epoch 19/20 Iteration 3354/3560 Training loss: 1.1274 0.1223 sec/batch\n", + "Epoch 19/20 Iteration 3355/3560 Training loss: 1.1272 0.1233 sec/batch\n", + "Epoch 19/20 Iteration 3356/3560 Training loss: 1.1270 0.1226 sec/batch\n", + "Epoch 19/20 Iteration 3357/3560 Training loss: 1.1271 0.1251 sec/batch\n", + "Epoch 19/20 Iteration 3358/3560 Training loss: 1.1270 0.1259 sec/batch\n", + "Epoch 19/20 Iteration 3359/3560 Training loss: 1.1270 0.1257 sec/batch\n", + "Epoch 19/20 Iteration 3360/3560 Training loss: 1.1269 0.1239 sec/batch\n", + "Epoch 19/20 Iteration 3361/3560 Training loss: 1.1269 0.1221 sec/batch\n", + "Epoch 19/20 Iteration 3362/3560 Training loss: 1.1269 0.1233 sec/batch\n", + "Epoch 19/20 Iteration 3363/3560 Training loss: 1.1266 0.1232 sec/batch\n", + "Epoch 19/20 Iteration 3364/3560 Training loss: 1.1267 0.1229 sec/batch\n", + "Epoch 19/20 Iteration 3365/3560 Training loss: 1.1268 0.1233 sec/batch\n", + "Epoch 19/20 Iteration 3366/3560 Training loss: 1.1269 0.1240 sec/batch\n", + "Epoch 19/20 Iteration 3367/3560 Training loss: 1.1269 0.1216 sec/batch\n", + "Epoch 19/20 Iteration 3368/3560 Training loss: 1.1269 0.1233 sec/batch\n", + "Epoch 19/20 Iteration 3369/3560 Training loss: 1.1269 0.1215 sec/batch\n", + "Epoch 19/20 Iteration 3370/3560 Training loss: 
1.1268 0.1235 sec/batch\n", + "Epoch 19/20 Iteration 3371/3560 Training loss: 1.1269 0.1214 sec/batch\n", + "Epoch 19/20 Iteration 3372/3560 Training loss: 1.1272 0.1227 sec/batch\n", + "Epoch 19/20 Iteration 3373/3560 Training loss: 1.1273 0.1261 sec/batch\n", + "Epoch 19/20 Iteration 3374/3560 Training loss: 1.1274 0.1208 sec/batch\n", + "Epoch 19/20 Iteration 3375/3560 Training loss: 1.1274 0.1237 sec/batch\n", + "Epoch 19/20 Iteration 3376/3560 Training loss: 1.1274 0.1226 sec/batch\n", + "Epoch 19/20 Iteration 3377/3560 Training loss: 1.1275 0.1264 sec/batch\n", + "Epoch 19/20 Iteration 3378/3560 Training loss: 1.1276 0.1204 sec/batch\n", + "Epoch 19/20 Iteration 3379/3560 Training loss: 1.1277 0.1219 sec/batch\n", + "Epoch 19/20 Iteration 3380/3560 Training loss: 1.1275 0.1217 sec/batch\n", + "Epoch 19/20 Iteration 3381/3560 Training loss: 1.1274 0.1227 sec/batch\n", + "Epoch 19/20 Iteration 3382/3560 Training loss: 1.1277 0.1252 sec/batch\n", + "Epoch 20/20 Iteration 3383/3560 Training loss: 1.2421 0.1240 sec/batch\n", + "Epoch 20/20 Iteration 3384/3560 Training loss: 1.1950 0.1228 sec/batch\n", + "Epoch 20/20 Iteration 3385/3560 Training loss: 1.1757 0.1228 sec/batch\n", + "Epoch 20/20 Iteration 3386/3560 Training loss: 1.1707 0.1224 sec/batch\n", + "Epoch 20/20 Iteration 3387/3560 Training loss: 1.1597 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3388/3560 Training loss: 1.1483 0.1219 sec/batch\n", + "Epoch 20/20 Iteration 3389/3560 Training loss: 1.1455 0.1215 sec/batch\n", + "Epoch 20/20 Iteration 3390/3560 Training loss: 1.1424 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3391/3560 Training loss: 1.1400 0.1219 sec/batch\n", + "Epoch 20/20 Iteration 3392/3560 Training loss: 1.1382 0.1209 sec/batch\n", + "Epoch 20/20 Iteration 3393/3560 Training loss: 1.1355 0.1254 sec/batch\n", + "Epoch 20/20 Iteration 3394/3560 Training loss: 1.1364 0.1240 sec/batch\n", + "Epoch 20/20 Iteration 3395/3560 Training loss: 1.1368 0.1228 sec/batch\n", + "Epoch 20/20 
Iteration 3396/3560 Training loss: 1.1368 0.1237 sec/batch\n", + "Epoch 20/20 Iteration 3397/3560 Training loss: 1.1350 0.1235 sec/batch\n", + "Epoch 20/20 Iteration 3398/3560 Training loss: 1.1329 0.1222 sec/batch\n", + "Epoch 20/20 Iteration 3399/3560 Training loss: 1.1330 0.1216 sec/batch\n", + "Epoch 20/20 Iteration 3400/3560 Training loss: 1.1340 0.1236 sec/batch\n", + "Epoch 20/20 Iteration 3401/3560 Training loss: 1.1339 0.1218 sec/batch\n", + "Epoch 20/20 Iteration 3402/3560 Training loss: 1.1348 0.1223 sec/batch\n", + "Epoch 20/20 Iteration 3403/3560 Training loss: 1.1342 0.1224 sec/batch\n", + "Epoch 20/20 Iteration 3404/3560 Training loss: 1.1348 0.1225 sec/batch\n", + "Epoch 20/20 Iteration 3405/3560 Training loss: 1.1342 0.1277 sec/batch\n", + "Epoch 20/20 Iteration 3406/3560 Training loss: 1.1343 0.1233 sec/batch\n", + "Epoch 20/20 Iteration 3407/3560 Training loss: 1.1342 0.1245 sec/batch\n", + "Epoch 20/20 Iteration 3408/3560 Training loss: 1.1327 0.1225 sec/batch\n", + "Epoch 20/20 Iteration 3409/3560 Training loss: 1.1315 0.1239 sec/batch\n", + "Epoch 20/20 Iteration 3410/3560 Training loss: 1.1320 0.1222 sec/batch\n", + "Epoch 20/20 Iteration 3411/3560 Training loss: 1.1318 0.1282 sec/batch\n", + "Epoch 20/20 Iteration 3412/3560 Training loss: 1.1319 0.1287 sec/batch\n", + "Epoch 20/20 Iteration 3413/3560 Training loss: 1.1311 0.1253 sec/batch\n", + "Epoch 20/20 Iteration 3414/3560 Training loss: 1.1303 0.1292 sec/batch\n", + "Epoch 20/20 Iteration 3415/3560 Training loss: 1.1303 0.1265 sec/batch\n", + "Epoch 20/20 Iteration 3416/3560 Training loss: 1.1304 0.1248 sec/batch\n", + "Epoch 20/20 Iteration 3417/3560 Training loss: 1.1299 0.1251 sec/batch\n", + "Epoch 20/20 Iteration 3418/3560 Training loss: 1.1300 0.1226 sec/batch\n", + "Epoch 20/20 Iteration 3419/3560 Training loss: 1.1290 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3420/3560 Training loss: 1.1279 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3421/3560 Training loss: 1.1270 0.1219 
sec/batch\n", + "Epoch 20/20 Iteration 3422/3560 Training loss: 1.1268 0.1251 sec/batch\n", + "Epoch 20/20 Iteration 3423/3560 Training loss: 1.1261 0.1233 sec/batch\n", + "Epoch 20/20 Iteration 3424/3560 Training loss: 1.1271 0.1232 sec/batch\n", + "Epoch 20/20 Iteration 3425/3560 Training loss: 1.1268 0.1232 sec/batch\n", + "Epoch 20/20 Iteration 3426/3560 Training loss: 1.1261 0.1215 sec/batch\n", + "Epoch 20/20 Iteration 3427/3560 Training loss: 1.1261 0.1254 sec/batch\n", + "Epoch 20/20 Iteration 3428/3560 Training loss: 1.1255 0.1215 sec/batch\n", + "Epoch 20/20 Iteration 3429/3560 Training loss: 1.1253 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3430/3560 Training loss: 1.1250 0.1232 sec/batch\n", + "Epoch 20/20 Iteration 3431/3560 Training loss: 1.1247 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3432/3560 Training loss: 1.1249 0.1249 sec/batch\n", + "Epoch 20/20 Iteration 3433/3560 Training loss: 1.1243 0.1227 sec/batch\n", + "Epoch 20/20 Iteration 3434/3560 Training loss: 1.1250 0.1219 sec/batch\n", + "Epoch 20/20 Iteration 3435/3560 Training loss: 1.1247 0.1232 sec/batch\n", + "Epoch 20/20 Iteration 3436/3560 Training loss: 1.1249 0.1224 sec/batch\n", + "Epoch 20/20 Iteration 3437/3560 Training loss: 1.1247 0.1245 sec/batch\n", + "Epoch 20/20 Iteration 3438/3560 Training loss: 1.1249 0.1218 sec/batch\n", + "Epoch 20/20 Iteration 3439/3560 Training loss: 1.1250 0.1222 sec/batch\n", + "Epoch 20/20 Iteration 3440/3560 Training loss: 1.1247 0.1243 sec/batch\n", + "Epoch 20/20 Iteration 3441/3560 Training loss: 1.1243 0.1213 sec/batch\n", + "Epoch 20/20 Iteration 3442/3560 Training loss: 1.1248 0.1269 sec/batch\n", + "Epoch 20/20 Iteration 3443/3560 Training loss: 1.1247 0.1245 sec/batch\n", + "Epoch 20/20 Iteration 3444/3560 Training loss: 1.1252 0.1255 sec/batch\n", + "Epoch 20/20 Iteration 3445/3560 Training loss: 1.1256 0.1215 sec/batch\n", + "Epoch 20/20 Iteration 3446/3560 Training loss: 1.1258 0.1269 sec/batch\n", + "Epoch 20/20 Iteration 3447/3560 
Training loss: 1.1257 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3448/3560 Training loss: 1.1259 0.1234 sec/batch\n", + "Epoch 20/20 Iteration 3449/3560 Training loss: 1.1262 0.1246 sec/batch\n", + "Epoch 20/20 Iteration 3450/3560 Training loss: 1.1260 0.1220 sec/batch\n", + "Epoch 20/20 Iteration 3451/3560 Training loss: 1.1261 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3452/3560 Training loss: 1.1261 0.1245 sec/batch\n", + "Epoch 20/20 Iteration 3453/3560 Training loss: 1.1266 0.1275 sec/batch\n", + "Epoch 20/20 Iteration 3454/3560 Training loss: 1.1269 0.1231 sec/batch\n", + "Epoch 20/20 Iteration 3455/3560 Training loss: 1.1273 0.1247 sec/batch\n", + "Epoch 20/20 Iteration 3456/3560 Training loss: 1.1268 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3457/3560 Training loss: 1.1268 0.1235 sec/batch\n", + "Epoch 20/20 Iteration 3458/3560 Training loss: 1.1269 0.1215 sec/batch\n", + "Epoch 20/20 Iteration 3459/3560 Training loss: 1.1268 0.1215 sec/batch\n", + "Epoch 20/20 Iteration 3460/3560 Training loss: 1.1267 0.1272 sec/batch\n", + "Epoch 20/20 Iteration 3461/3560 Training loss: 1.1261 0.1225 sec/batch\n", + "Epoch 20/20 Iteration 3462/3560 Training loss: 1.1261 0.1249 sec/batch\n", + "Epoch 20/20 Iteration 3463/3560 Training loss: 1.1258 0.1219 sec/batch\n", + "Epoch 20/20 Iteration 3464/3560 Training loss: 1.1258 0.1238 sec/batch\n", + "Epoch 20/20 Iteration 3465/3560 Training loss: 1.1254 0.1227 sec/batch\n", + "Epoch 20/20 Iteration 3466/3560 Training loss: 1.1254 0.1225 sec/batch\n", + "Epoch 20/20 Iteration 3467/3560 Training loss: 1.1251 0.1230 sec/batch\n", + "Epoch 20/20 Iteration 3468/3560 Training loss: 1.1250 0.1252 sec/batch\n", + "Epoch 20/20 Iteration 3469/3560 Training loss: 1.1248 0.1236 sec/batch\n", + "Epoch 20/20 Iteration 3470/3560 Training loss: 1.1246 0.1260 sec/batch\n", + "Epoch 20/20 Iteration 3471/3560 Training loss: 1.1243 0.1245 sec/batch\n", + "Epoch 20/20 Iteration 3472/3560 Training loss: 1.1245 0.1241 sec/batch\n", + 
"Epoch 20/20 Iteration 3473/3560 Training loss: 1.1243 0.1256 sec/batch\n", + "Epoch 20/20 Iteration 3474/3560 Training loss: 1.1242 0.1269 sec/batch\n", + "Epoch 20/20 Iteration 3475/3560 Training loss: 1.1238 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3476/3560 Training loss: 1.1234 0.1266 sec/batch\n", + "Epoch 20/20 Iteration 3477/3560 Training loss: 1.1232 0.1232 sec/batch\n", + "Epoch 20/20 Iteration 3478/3560 Training loss: 1.1232 0.1231 sec/batch\n", + "Epoch 20/20 Iteration 3479/3560 Training loss: 1.1233 0.1213 sec/batch\n", + "Epoch 20/20 Iteration 3480/3560 Training loss: 1.1229 0.1229 sec/batch\n", + "Epoch 20/20 Iteration 3481/3560 Training loss: 1.1227 0.1229 sec/batch\n", + "Epoch 20/20 Iteration 3482/3560 Training loss: 1.1225 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3483/3560 Training loss: 1.1225 0.1223 sec/batch\n", + "Epoch 20/20 Iteration 3484/3560 Training loss: 1.1224 0.1257 sec/batch\n", + "Epoch 20/20 Iteration 3485/3560 Training loss: 1.1224 0.1220 sec/batch\n", + "Epoch 20/20 Iteration 3486/3560 Training loss: 1.1222 0.1261 sec/batch\n", + "Epoch 20/20 Iteration 3487/3560 Training loss: 1.1221 0.1226 sec/batch\n", + "Epoch 20/20 Iteration 3488/3560 Training loss: 1.1220 0.1235 sec/batch\n", + "Epoch 20/20 Iteration 3489/3560 Training loss: 1.1220 0.1232 sec/batch\n", + "Epoch 20/20 Iteration 3490/3560 Training loss: 1.1220 0.1231 sec/batch\n", + "Epoch 20/20 Iteration 3491/3560 Training loss: 1.1219 0.1231 sec/batch\n", + "Epoch 20/20 Iteration 3492/3560 Training loss: 1.1219 0.1257 sec/batch\n", + "Epoch 20/20 Iteration 3493/3560 Training loss: 1.1216 0.1224 sec/batch\n", + "Epoch 20/20 Iteration 3494/3560 Training loss: 1.1216 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3495/3560 Training loss: 1.1215 0.1241 sec/batch\n", + "Epoch 20/20 Iteration 3496/3560 Training loss: 1.1214 0.1247 sec/batch\n", + "Epoch 20/20 Iteration 3497/3560 Training loss: 1.1212 0.1247 sec/batch\n", + "Epoch 20/20 Iteration 3498/3560 Training loss: 
1.1209 0.1233 sec/batch\n", + "Epoch 20/20 Iteration 3499/3560 Training loss: 1.1209 0.1234 sec/batch\n", + "Epoch 20/20 Iteration 3500/3560 Training loss: 1.1209 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3501/3560 Training loss: 1.1208 0.1232 sec/batch\n", + "Epoch 20/20 Iteration 3502/3560 Training loss: 1.1208 0.1222 sec/batch\n", + "Epoch 20/20 Iteration 3503/3560 Training loss: 1.1207 0.1243 sec/batch\n", + "Epoch 20/20 Iteration 3504/3560 Training loss: 1.1204 0.1229 sec/batch\n", + "Epoch 20/20 Iteration 3505/3560 Training loss: 1.1201 0.1215 sec/batch\n", + "Epoch 20/20 Iteration 3506/3560 Training loss: 1.1200 0.1245 sec/batch\n", + "Epoch 20/20 Iteration 3507/3560 Training loss: 1.1199 0.1218 sec/batch\n", + "Epoch 20/20 Iteration 3508/3560 Training loss: 1.1195 0.1228 sec/batch\n", + "Epoch 20/20 Iteration 3509/3560 Training loss: 1.1195 0.1228 sec/batch\n", + "Epoch 20/20 Iteration 3510/3560 Training loss: 1.1194 0.1245 sec/batch\n", + "Epoch 20/20 Iteration 3511/3560 Training loss: 1.1192 0.1279 sec/batch\n", + "Epoch 20/20 Iteration 3512/3560 Training loss: 1.1188 0.1251 sec/batch\n", + "Epoch 20/20 Iteration 3513/3560 Training loss: 1.1184 0.1230 sec/batch\n", + "Epoch 20/20 Iteration 3514/3560 Training loss: 1.1184 0.1210 sec/batch\n", + "Epoch 20/20 Iteration 3515/3560 Training loss: 1.1185 0.1260 sec/batch\n", + "Epoch 20/20 Iteration 3516/3560 Training loss: 1.1185 0.1246 sec/batch\n", + "Epoch 20/20 Iteration 3517/3560 Training loss: 1.1185 0.1238 sec/batch\n", + "Epoch 20/20 Iteration 3518/3560 Training loss: 1.1185 0.1268 sec/batch\n", + "Epoch 20/20 Iteration 3519/3560 Training loss: 1.1186 0.1233 sec/batch\n", + "Epoch 20/20 Iteration 3520/3560 Training loss: 1.1187 0.1252 sec/batch\n", + "Epoch 20/20 Iteration 3521/3560 Training loss: 1.1188 0.1256 sec/batch\n", + "Epoch 20/20 Iteration 3522/3560 Training loss: 1.1189 0.1214 sec/batch\n", + "Epoch 20/20 Iteration 3523/3560 Training loss: 1.1191 0.1233 sec/batch\n", + "Epoch 20/20 
Iteration 3524/3560 Training loss: 1.1192 0.1236 sec/batch\n", + "Epoch 20/20 Iteration 3525/3560 Training loss: 1.1191 0.1238 sec/batch\n", + "Epoch 20/20 Iteration 3526/3560 Training loss: 1.1194 0.1215 sec/batch\n", + "Epoch 20/20 Iteration 3527/3560 Training loss: 1.1193 0.1224 sec/batch\n", + "Epoch 20/20 Iteration 3528/3560 Training loss: 1.1195 0.1215 sec/batch\n", + "Epoch 20/20 Iteration 3529/3560 Training loss: 1.1195 0.1242 sec/batch\n", + "Epoch 20/20 Iteration 3530/3560 Training loss: 1.1197 0.1226 sec/batch\n", + "Epoch 20/20 Iteration 3531/3560 Training loss: 1.1198 0.1222 sec/batch\n", + "Epoch 20/20 Iteration 3532/3560 Training loss: 1.1198 0.1219 sec/batch\n", + "Epoch 20/20 Iteration 3533/3560 Training loss: 1.1195 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3534/3560 Training loss: 1.1194 0.1231 sec/batch\n", + "Epoch 20/20 Iteration 3535/3560 Training loss: 1.1194 0.1264 sec/batch\n", + "Epoch 20/20 Iteration 3536/3560 Training loss: 1.1194 0.1218 sec/batch\n", + "Epoch 20/20 Iteration 3537/3560 Training loss: 1.1194 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3538/3560 Training loss: 1.1194 0.1219 sec/batch\n", + "Epoch 20/20 Iteration 3539/3560 Training loss: 1.1194 0.1216 sec/batch\n", + "Epoch 20/20 Iteration 3540/3560 Training loss: 1.1193 0.1212 sec/batch\n", + "Epoch 20/20 Iteration 3541/3560 Training loss: 1.1191 0.1264 sec/batch\n", + "Epoch 20/20 Iteration 3542/3560 Training loss: 1.1192 0.1241 sec/batch\n", + "Epoch 20/20 Iteration 3543/3560 Training loss: 1.1193 0.1238 sec/batch\n", + "Epoch 20/20 Iteration 3544/3560 Training loss: 1.1193 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3545/3560 Training loss: 1.1193 0.1231 sec/batch\n", + "Epoch 20/20 Iteration 3546/3560 Training loss: 1.1193 0.1223 sec/batch\n", + "Epoch 20/20 Iteration 3547/3560 Training loss: 1.1193 0.1261 sec/batch\n", + "Epoch 20/20 Iteration 3548/3560 Training loss: 1.1192 0.1222 sec/batch\n", + "Epoch 20/20 Iteration 3549/3560 Training loss: 1.1193 0.1210 
sec/batch\n", + "Epoch 20/20 Iteration 3550/3560 Training loss: 1.1196 0.1247 sec/batch\n", + "Epoch 20/20 Iteration 3551/3560 Training loss: 1.1197 0.1230 sec/batch\n", + "Epoch 20/20 Iteration 3552/3560 Training loss: 1.1198 0.1214 sec/batch\n", + "Epoch 20/20 Iteration 3553/3560 Training loss: 1.1197 0.1227 sec/batch\n", + "Epoch 20/20 Iteration 3554/3560 Training loss: 1.1196 0.1212 sec/batch\n", + "Epoch 20/20 Iteration 3555/3560 Training loss: 1.1198 0.1210 sec/batch\n", + "Epoch 20/20 Iteration 3556/3560 Training loss: 1.1199 0.1251 sec/batch\n", + "Epoch 20/20 Iteration 3557/3560 Training loss: 1.1199 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3558/3560 Training loss: 1.1198 0.1224 sec/batch\n", + "Epoch 20/20 Iteration 3559/3560 Training loss: 1.1197 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3560/3560 Training loss: 1.1199 0.1219 sec/batch\n", + "Epoch 1/20 Iteration 1/3560 Training loss: 4.4198 1.2810 sec/batch\n", + "Epoch 1/20 Iteration 2/3560 Training loss: 4.3758 0.1352 sec/batch\n", + "Epoch 1/20 Iteration 3/3560 Training loss: 4.2230 0.1311 sec/batch\n", + "Epoch 1/20 Iteration 4/3560 Training loss: 4.5803 0.1221 sec/batch\n", + "Epoch 1/20 Iteration 5/3560 Training loss: 4.5141 0.1212 sec/batch\n", + "Epoch 1/20 Iteration 6/3560 Training loss: 4.3822 0.1237 sec/batch\n", + "Epoch 1/20 Iteration 7/3560 Training loss: 4.2765 0.1256 sec/batch\n", + "Epoch 1/20 Iteration 8/3560 Training loss: 4.1904 0.1232 sec/batch\n", + "Epoch 1/20 Iteration 9/3560 Training loss: 4.1139 0.1231 sec/batch\n", + "Epoch 1/20 Iteration 10/3560 Training loss: 4.0459 0.1266 sec/batch\n", + "Epoch 1/20 Iteration 11/3560 Training loss: 3.9846 0.1212 sec/batch\n", + "Epoch 1/20 Iteration 12/3560 Training loss: 3.9324 0.1253 sec/batch\n", + "Epoch 1/20 Iteration 13/3560 Training loss: 3.8860 0.1258 sec/batch\n", + "Epoch 1/20 Iteration 14/3560 Training loss: 3.8455 0.1225 sec/batch\n", + "Epoch 1/20 Iteration 15/3560 Training loss: 3.8091 0.1232 sec/batch\n", + "Epoch 
1/20 Iteration 16/3560 Training loss: 3.7763 0.1226 sec/batch\n", + "Epoch 1/20 Iteration 17/3560 Training loss: 3.7462 0.1213 sec/batch\n", + "Epoch 1/20 Iteration 18/3560 Training loss: 3.7206 0.1248 sec/batch\n", + "Epoch 1/20 Iteration 19/3560 Training loss: 3.6964 0.1218 sec/batch\n", + "Epoch 1/20 Iteration 20/3560 Training loss: 3.6726 0.1231 sec/batch\n", + "Epoch 1/20 Iteration 21/3560 Training loss: 3.6521 0.1246 sec/batch\n", + "Epoch 1/20 Iteration 22/3560 Training loss: 3.6326 0.1212 sec/batch\n", + "Epoch 1/20 Iteration 23/3560 Training loss: 3.6148 0.1212 sec/batch\n", + "Epoch 1/20 Iteration 24/3560 Training loss: 3.5983 0.1250 sec/batch\n", + "Epoch 1/20 Iteration 25/3560 Training loss: 3.5825 0.1230 sec/batch\n", + "Epoch 1/20 Iteration 26/3560 Training loss: 3.5684 0.1209 sec/batch\n", + "Epoch 1/20 Iteration 27/3560 Training loss: 3.5554 0.1220 sec/batch\n", + "Epoch 1/20 Iteration 28/3560 Training loss: 3.5422 0.1245 sec/batch\n", + "Epoch 1/20 Iteration 29/3560 Training loss: 3.5303 0.1235 sec/batch\n", + "Epoch 1/20 Iteration 30/3560 Training loss: 3.5192 0.1232 sec/batch\n", + "Epoch 1/20 Iteration 31/3560 Training loss: 3.5096 0.1222 sec/batch\n", + "Epoch 1/20 Iteration 32/3560 Training loss: 3.4994 0.1223 sec/batch\n", + "Epoch 1/20 Iteration 33/3560 Training loss: 3.4897 0.1247 sec/batch\n", + "Epoch 1/20 Iteration 34/3560 Training loss: 3.4811 0.1242 sec/batch\n", + "Epoch 1/20 Iteration 35/3560 Training loss: 3.4724 0.1246 sec/batch\n", + "Epoch 1/20 Iteration 36/3560 Training loss: 3.4645 0.1237 sec/batch\n", + "Epoch 1/20 Iteration 37/3560 Training loss: 3.4563 0.1238 sec/batch\n", + "Epoch 1/20 Iteration 38/3560 Training loss: 3.4486 0.1227 sec/batch\n", + "Epoch 1/20 Iteration 39/3560 Training loss: 3.4412 0.1221 sec/batch\n", + "Epoch 1/20 Iteration 40/3560 Training loss: 3.4343 0.1248 sec/batch\n", + "Epoch 1/20 Iteration 41/3560 Training loss: 3.4273 0.1224 sec/batch\n", + "Epoch 1/20 Iteration 42/3560 Training loss: 3.4208 
0.1212 sec/batch\n", + "Epoch 1/20 Iteration 43/3560 Training loss: 3.4147 0.1224 sec/batch\n", + "Epoch 1/20 Iteration 44/3560 Training loss: 3.4087 0.1246 sec/batch\n", + "Epoch 1/20 Iteration 45/3560 Training loss: 3.4027 0.1238 sec/batch\n", + "Epoch 1/20 Iteration 46/3560 Training loss: 3.3973 0.1245 sec/batch\n", + "Epoch 1/20 Iteration 47/3560 Training loss: 3.3923 0.1206 sec/batch\n", + "Epoch 1/20 Iteration 48/3560 Training loss: 3.3877 0.1242 sec/batch\n", + "Epoch 1/20 Iteration 49/3560 Training loss: 3.3830 0.1239 sec/batch\n", + "Epoch 1/20 Iteration 50/3560 Training loss: 3.3785 0.1247 sec/batch\n", + "Epoch 1/20 Iteration 51/3560 Training loss: 3.3741 0.1245 sec/batch\n", + "Epoch 1/20 Iteration 52/3560 Training loss: 3.3697 0.1231 sec/batch\n", + "Epoch 1/20 Iteration 53/3560 Training loss: 3.3656 0.1213 sec/batch\n", + "Epoch 1/20 Iteration 54/3560 Training loss: 3.3614 0.1211 sec/batch\n", + "Epoch 1/20 Iteration 55/3560 Training loss: 3.3575 0.1218 sec/batch\n", + "Epoch 1/20 Iteration 56/3560 Training loss: 3.3533 0.1248 sec/batch\n", + "Epoch 1/20 Iteration 57/3560 Training loss: 3.3495 0.1241 sec/batch\n", + "Epoch 1/20 Iteration 58/3560 Training loss: 3.3460 0.1220 sec/batch\n", + "Epoch 1/20 Iteration 59/3560 Training loss: 3.3423 0.1253 sec/batch\n", + "Epoch 1/20 Iteration 60/3560 Training loss: 3.3391 0.1211 sec/batch\n", + "Epoch 1/20 Iteration 61/3560 Training loss: 3.3357 0.1226 sec/batch\n", + "Epoch 1/20 Iteration 62/3560 Training loss: 3.3328 0.1227 sec/batch\n", + "Epoch 1/20 Iteration 63/3560 Training loss: 3.3300 0.1214 sec/batch\n", + "Epoch 1/20 Iteration 64/3560 Training loss: 3.3267 0.1235 sec/batch\n", + "Epoch 1/20 Iteration 65/3560 Training loss: 3.3235 0.1222 sec/batch\n", + "Epoch 1/20 Iteration 66/3560 Training loss: 3.3209 0.1208 sec/batch\n", + "Epoch 1/20 Iteration 67/3560 Training loss: 3.3181 0.1230 sec/batch\n", + "Epoch 1/20 Iteration 68/3560 Training loss: 3.3149 0.1248 sec/batch\n", + "Epoch 1/20 Iteration 
69/3560 Training loss: 3.3121 0.1206 sec/batch\n", + "Epoch 1/20 Iteration 70/3560 Training loss: 3.3096 0.1256 sec/batch\n", + "Epoch 1/20 Iteration 71/3560 Training loss: 3.3070 0.1230 sec/batch\n", + "Epoch 1/20 Iteration 72/3560 Training loss: 3.3047 0.1225 sec/batch\n", + "Epoch 1/20 Iteration 73/3560 Training loss: 3.3021 0.1238 sec/batch\n", + "Epoch 1/20 Iteration 74/3560 Training loss: 3.2998 0.1242 sec/batch\n", + "Epoch 1/20 Iteration 75/3560 Training loss: 3.2975 0.1218 sec/batch\n", + "Epoch 1/20 Iteration 76/3560 Training loss: 3.2953 0.1289 sec/batch\n", + "Epoch 1/20 Iteration 77/3560 Training loss: 3.2932 0.1249 sec/batch\n", + "Epoch 1/20 Iteration 78/3560 Training loss: 3.2910 0.1228 sec/batch\n", + "Epoch 1/20 Iteration 79/3560 Training loss: 3.2887 0.1232 sec/batch\n", + "Epoch 1/20 Iteration 80/3560 Training loss: 3.2863 0.1240 sec/batch\n", + "Epoch 1/20 Iteration 81/3560 Training loss: 3.2840 0.1243 sec/batch\n", + "Epoch 1/20 Iteration 82/3560 Training loss: 3.2820 0.1229 sec/batch\n", + "Epoch 1/20 Iteration 83/3560 Training loss: 3.2800 0.1242 sec/batch\n", + "Epoch 1/20 Iteration 84/3560 Training loss: 3.2779 0.1213 sec/batch\n", + "Epoch 1/20 Iteration 85/3560 Training loss: 3.2757 0.1240 sec/batch\n", + "Epoch 1/20 Iteration 86/3560 Training loss: 3.2737 0.1238 sec/batch\n", + "Epoch 1/20 Iteration 87/3560 Training loss: 3.2715 0.1250 sec/batch\n", + "Epoch 1/20 Iteration 88/3560 Training loss: 3.2694 0.1227 sec/batch\n", + "Epoch 1/20 Iteration 89/3560 Training loss: 3.2675 0.1217 sec/batch\n", + "Epoch 1/20 Iteration 90/3560 Training loss: 3.2656 0.1267 sec/batch\n", + "Epoch 1/20 Iteration 91/3560 Training loss: 3.2638 0.1237 sec/batch\n", + "Epoch 1/20 Iteration 92/3560 Training loss: 3.2618 0.1270 sec/batch\n", + "Epoch 1/20 Iteration 93/3560 Training loss: 3.2598 0.1236 sec/batch\n", + "Epoch 1/20 Iteration 94/3560 Training loss: 3.2583 0.1238 sec/batch\n", + "Epoch 1/20 Iteration 95/3560 Training loss: 3.2563 0.1206 
sec/batch\n", + "Epoch 1/20 Iteration 96/3560 Training loss: 3.2545 0.1234 sec/batch\n", + "Epoch 1/20 Iteration 97/3560 Training loss: 3.2529 0.1214 sec/batch\n", + "Epoch 1/20 Iteration 98/3560 Training loss: 3.2512 0.1247 sec/batch\n", + "Epoch 1/20 Iteration 99/3560 Training loss: 3.2495 0.1249 sec/batch\n", + "Epoch 1/20 Iteration 100/3560 Training loss: 3.2477 0.1231 sec/batch\n", + "Epoch 1/20 Iteration 101/3560 Training loss: 3.2461 0.1234 sec/batch\n", + "Epoch 1/20 Iteration 102/3560 Training loss: 3.2443 0.1207 sec/batch\n", + "Epoch 1/20 Iteration 103/3560 Training loss: 3.2427 0.1220 sec/batch\n", + "Epoch 1/20 Iteration 104/3560 Training loss: 3.2409 0.1220 sec/batch\n", + "Epoch 1/20 Iteration 105/3560 Training loss: 3.2392 0.1219 sec/batch\n", + "Epoch 1/20 Iteration 106/3560 Training loss: 3.2374 0.1236 sec/batch\n", + "Epoch 1/20 Iteration 107/3560 Training loss: 3.2356 0.1218 sec/batch\n", + "Epoch 1/20 Iteration 108/3560 Training loss: 3.2338 0.1241 sec/batch\n", + "Epoch 1/20 Iteration 109/3560 Training loss: 3.2321 0.1247 sec/batch\n", + "Epoch 1/20 Iteration 110/3560 Training loss: 3.2300 0.1220 sec/batch\n", + "Epoch 1/20 Iteration 111/3560 Training loss: 3.2282 0.1242 sec/batch\n", + "Epoch 1/20 Iteration 112/3560 Training loss: 3.2265 0.1246 sec/batch\n", + "Epoch 1/20 Iteration 113/3560 Training loss: 3.2248 0.1220 sec/batch\n", + "Epoch 1/20 Iteration 114/3560 Training loss: 3.2231 0.1313 sec/batch\n", + "Epoch 1/20 Iteration 115/3560 Training loss: 3.2212 0.1235 sec/batch\n", + "Epoch 1/20 Iteration 116/3560 Training loss: 3.2194 0.1258 sec/batch\n", + "Epoch 1/20 Iteration 117/3560 Training loss: 3.2177 0.1225 sec/batch\n", + "Epoch 1/20 Iteration 118/3560 Training loss: 3.2161 0.1248 sec/batch\n", + "Epoch 1/20 Iteration 119/3560 Training loss: 3.2145 0.1230 sec/batch\n", + "Epoch 1/20 Iteration 120/3560 Training loss: 3.2126 0.1251 sec/batch\n", + "Epoch 1/20 Iteration 121/3560 Training loss: 3.2110 0.1223 sec/batch\n", + "Epoch 1/20 
Iteration 122/3560 Training loss: 3.2093 0.1245 sec/batch\n", + "Epoch 1/20 Iteration 123/3560 Training loss: 3.2074 0.1234 sec/batch\n", + "Epoch 1/20 Iteration 124/3560 Training loss: 3.2057 0.1254 sec/batch\n", + "Epoch 1/20 Iteration 125/3560 Training loss: 3.2037 0.1240 sec/batch\n", + "Epoch 1/20 Iteration 126/3560 Training loss: 3.2015 0.1249 sec/batch\n", + "Epoch 1/20 Iteration 127/3560 Training loss: 3.1996 0.1233 sec/batch\n", + "Epoch 1/20 Iteration 128/3560 Training loss: 3.1977 0.1224 sec/batch\n", + "Epoch 1/20 Iteration 129/3560 Training loss: 3.1957 0.1220 sec/batch\n", + "Epoch 1/20 Iteration 130/3560 Training loss: 3.1940 0.1243 sec/batch\n", + "Epoch 1/20 Iteration 131/3560 Training loss: 3.1926 0.1243 sec/batch\n", + "Epoch 1/20 Iteration 132/3560 Training loss: 3.1910 0.1265 sec/batch\n", + "Epoch 1/20 Iteration 133/3560 Training loss: 3.1894 0.1213 sec/batch\n", + "Epoch 1/20 Iteration 134/3560 Training loss: 3.1876 0.1240 sec/batch\n", + "Epoch 1/20 Iteration 135/3560 Training loss: 3.1855 0.1217 sec/batch\n", + "Epoch 1/20 Iteration 136/3560 Training loss: 3.1836 0.1205 sec/batch\n", + "Epoch 1/20 Iteration 137/3560 Training loss: 3.1817 0.1222 sec/batch\n", + "Epoch 1/20 Iteration 138/3560 Training loss: 3.1798 0.1246 sec/batch\n", + "Epoch 1/20 Iteration 139/3560 Training loss: 3.1779 0.1246 sec/batch\n", + "Epoch 1/20 Iteration 140/3560 Training loss: 3.1760 0.1225 sec/batch\n", + "Epoch 1/20 Iteration 141/3560 Training loss: 3.1740 0.1240 sec/batch\n", + "Epoch 1/20 Iteration 142/3560 Training loss: 3.1719 0.1313 sec/batch\n", + "Epoch 1/20 Iteration 143/3560 Training loss: 3.1698 0.1222 sec/batch\n", + "Epoch 1/20 Iteration 144/3560 Training loss: 3.1678 0.1217 sec/batch\n", + "Epoch 1/20 Iteration 145/3560 Training loss: 3.1657 0.1267 sec/batch\n", + "Epoch 1/20 Iteration 146/3560 Training loss: 3.1636 0.1241 sec/batch\n", + "Epoch 1/20 Iteration 147/3560 Training loss: 3.1615 0.1257 sec/batch\n", + "Epoch 1/20 Iteration 148/3560 
Training loss: 3.1595 0.1227 sec/batch\n", + "Epoch 1/20 Iteration 149/3560 Training loss: 3.1573 0.1238 sec/batch\n", + "Epoch 1/20 Iteration 150/3560 Training loss: 3.1550 0.1235 sec/batch\n", + "Epoch 1/20 Iteration 151/3560 Training loss: 3.1530 0.1243 sec/batch\n", + "Epoch 1/20 Iteration 152/3560 Training loss: 3.1510 0.1336 sec/batch\n", + "Epoch 1/20 Iteration 153/3560 Training loss: 3.1488 0.1226 sec/batch\n", + "Epoch 1/20 Iteration 154/3560 Training loss: 3.1466 0.1222 sec/batch\n", + "Epoch 1/20 Iteration 155/3560 Training loss: 3.1442 0.1241 sec/batch\n", + "Epoch 1/20 Iteration 156/3560 Training loss: 3.1418 0.1272 sec/batch\n", + "Epoch 1/20 Iteration 157/3560 Training loss: 3.1393 0.1230 sec/batch\n", + "Epoch 1/20 Iteration 158/3560 Training loss: 3.1369 0.1244 sec/batch\n", + "Epoch 1/20 Iteration 159/3560 Training loss: 3.1343 0.1217 sec/batch\n", + "Epoch 1/20 Iteration 160/3560 Training loss: 3.1320 0.1225 sec/batch\n", + "Epoch 1/20 Iteration 161/3560 Training loss: 3.1295 0.1229 sec/batch\n", + "Epoch 1/20 Iteration 162/3560 Training loss: 3.1269 0.1223 sec/batch\n", + "Epoch 1/20 Iteration 163/3560 Training loss: 3.1242 0.1242 sec/batch\n", + "Epoch 1/20 Iteration 164/3560 Training loss: 3.1217 0.1249 sec/batch\n", + "Epoch 1/20 Iteration 165/3560 Training loss: 3.1193 0.1233 sec/batch\n", + "Epoch 1/20 Iteration 166/3560 Training loss: 3.1168 0.1223 sec/batch\n", + "Epoch 1/20 Iteration 167/3560 Training loss: 3.1143 0.1268 sec/batch\n", + "Epoch 1/20 Iteration 168/3560 Training loss: 3.1117 0.1264 sec/batch\n", + "Epoch 1/20 Iteration 169/3560 Training loss: 3.1094 0.1219 sec/batch\n", + "Epoch 1/20 Iteration 170/3560 Training loss: 3.1067 0.1270 sec/batch\n", + "Epoch 1/20 Iteration 171/3560 Training loss: 3.1042 0.1262 sec/batch\n", + "Epoch 1/20 Iteration 172/3560 Training loss: 3.1019 0.1220 sec/batch\n", + "Epoch 1/20 Iteration 173/3560 Training loss: 3.0995 0.1220 sec/batch\n", + "Epoch 1/20 Iteration 174/3560 Training loss: 3.0971 
0.1234 sec/batch\n", + "Epoch 1/20 Iteration 175/3560 Training loss: 3.0948 0.1216 sec/batch\n", + "Epoch 1/20 Iteration 176/3560 Training loss: 3.0923 0.1263 sec/batch\n", + "Epoch 1/20 Iteration 177/3560 Training loss: 3.0896 0.1210 sec/batch\n", + "Epoch 1/20 Iteration 178/3560 Training loss: 3.0868 0.1235 sec/batch\n", + "Epoch 2/20 Iteration 179/3560 Training loss: 2.6553 0.1213 sec/batch\n", + "Epoch 2/20 Iteration 180/3560 Training loss: 2.6054 0.1244 sec/batch\n", + "Epoch 2/20 Iteration 181/3560 Training loss: 2.5964 0.1245 sec/batch\n", + "Epoch 2/20 Iteration 182/3560 Training loss: 2.5914 0.1243 sec/batch\n", + "Epoch 2/20 Iteration 183/3560 Training loss: 2.5876 0.1223 sec/batch\n", + "Epoch 2/20 Iteration 184/3560 Training loss: 2.5815 0.1244 sec/batch\n", + "Epoch 2/20 Iteration 185/3560 Training loss: 2.5799 0.1240 sec/batch\n", + "Epoch 2/20 Iteration 186/3560 Training loss: 2.5778 0.1229 sec/batch\n", + "Epoch 2/20 Iteration 187/3560 Training loss: 2.5772 0.1299 sec/batch\n", + "Epoch 2/20 Iteration 188/3560 Training loss: 2.5731 0.1215 sec/batch\n", + "Epoch 2/20 Iteration 189/3560 Training loss: 2.5686 0.1249 sec/batch\n", + "Epoch 2/20 Iteration 190/3560 Training loss: 2.5670 0.1245 sec/batch\n", + "Epoch 2/20 Iteration 191/3560 Training loss: 2.5639 0.1217 sec/batch\n", + "Epoch 2/20 Iteration 192/3560 Training loss: 2.5634 0.1243 sec/batch\n", + "Epoch 2/20 Iteration 193/3560 Training loss: 2.5610 0.1245 sec/batch\n", + "Epoch 2/20 Iteration 194/3560 Training loss: 2.5582 0.1229 sec/batch\n", + "Epoch 2/20 Iteration 195/3560 Training loss: 2.5555 0.1240 sec/batch\n", + "Epoch 2/20 Iteration 196/3560 Training loss: 2.5557 0.1232 sec/batch\n", + "Epoch 2/20 Iteration 197/3560 Training loss: 2.5537 0.1256 sec/batch\n", + "Epoch 2/20 Iteration 198/3560 Training loss: 2.5501 0.1234 sec/batch\n", + "Epoch 2/20 Iteration 199/3560 Training loss: 2.5475 0.1256 sec/batch\n", + "Epoch 2/20 Iteration 200/3560 Training loss: 2.5469 0.1209 sec/batch\n", + 
"Epoch 2/20 Iteration 201/3560 Training loss: 2.5447 0.1240 sec/batch\n", + "Epoch 2/20 Iteration 202/3560 Training loss: 2.5419 0.1257 sec/batch\n", + "Epoch 2/20 Iteration 203/3560 Training loss: 2.5391 0.1238 sec/batch\n", + "Epoch 2/20 Iteration 204/3560 Training loss: 2.5373 0.1262 sec/batch\n", + "Epoch 2/20 Iteration 205/3560 Training loss: 2.5349 0.1252 sec/batch\n", + "Epoch 2/20 Iteration 206/3560 Training loss: 2.5326 0.1218 sec/batch\n", + "Epoch 2/20 Iteration 207/3560 Training loss: 2.5309 0.1311 sec/batch\n", + "Epoch 2/20 Iteration 208/3560 Training loss: 2.5290 0.1274 sec/batch\n", + "Epoch 2/20 Iteration 209/3560 Training loss: 2.5275 0.1240 sec/batch\n", + "Epoch 2/20 Iteration 210/3560 Training loss: 2.5252 0.1219 sec/batch\n", + "Epoch 2/20 Iteration 211/3560 Training loss: 2.5227 0.1243 sec/batch\n", + "Epoch 2/20 Iteration 212/3560 Training loss: 2.5210 0.1243 sec/batch\n", + "Epoch 2/20 Iteration 213/3560 Training loss: 2.5187 0.1230 sec/batch\n", + "Epoch 2/20 Iteration 214/3560 Training loss: 2.5172 0.1210 sec/batch\n", + "Epoch 2/20 Iteration 215/3560 Training loss: 2.5152 0.1212 sec/batch\n", + "Epoch 2/20 Iteration 216/3560 Training loss: 2.5126 0.1219 sec/batch\n", + "Epoch 2/20 Iteration 217/3560 Training loss: 2.5104 0.1223 sec/batch\n", + "Epoch 2/20 Iteration 218/3560 Training loss: 2.5082 0.1242 sec/batch\n", + "Epoch 2/20 Iteration 219/3560 Training loss: 2.5061 0.1228 sec/batch\n", + "Epoch 2/20 Iteration 220/3560 Training loss: 2.5039 0.1220 sec/batch\n", + "Epoch 2/20 Iteration 221/3560 Training loss: 2.5017 0.1241 sec/batch\n", + "Epoch 2/20 Iteration 222/3560 Training loss: 2.4994 0.1224 sec/batch\n", + "Epoch 2/20 Iteration 223/3560 Training loss: 2.4976 0.1226 sec/batch\n", + "Epoch 2/20 Iteration 224/3560 Training loss: 2.4951 0.1221 sec/batch\n", + "Epoch 2/20 Iteration 225/3560 Training loss: 2.4937 0.1210 sec/batch\n", + "Epoch 2/20 Iteration 226/3560 Training loss: 2.4919 0.1244 sec/batch\n", + "Epoch 2/20 Iteration 
227/3560 Training loss: 2.4901 0.1241 sec/batch\n", + "Epoch 2/20 Iteration 228/3560 Training loss: 2.4888 0.1261 sec/batch\n", + "Epoch 2/20 Iteration 229/3560 Training loss: 2.4869 0.1211 sec/batch\n", + "Epoch 2/20 Iteration 230/3560 Training loss: 2.4854 0.1235 sec/batch\n", + "Epoch 2/20 Iteration 231/3560 Training loss: 2.4836 0.1245 sec/batch\n", + "Epoch 2/20 Iteration 232/3560 Training loss: 2.4818 0.1241 sec/batch\n", + "Epoch 2/20 Iteration 233/3560 Training loss: 2.4800 0.1218 sec/batch\n", + "Epoch 2/20 Iteration 234/3560 Training loss: 2.4786 0.1210 sec/batch\n", + "Epoch 2/20 Iteration 235/3560 Training loss: 2.4770 0.1216 sec/batch\n", + "Epoch 2/20 Iteration 236/3560 Training loss: 2.4752 0.1236 sec/batch\n", + "Epoch 2/20 Iteration 237/3560 Training loss: 2.4734 0.1213 sec/batch\n", + "Epoch 2/20 Iteration 238/3560 Training loss: 2.4723 0.1267 sec/batch\n", + "Epoch 2/20 Iteration 239/3560 Training loss: 2.4708 0.1216 sec/batch\n", + "Epoch 2/20 Iteration 240/3560 Training loss: 2.4696 0.1256 sec/batch\n", + "Epoch 2/20 Iteration 241/3560 Training loss: 2.4684 0.1248 sec/batch\n", + "Epoch 2/20 Iteration 242/3560 Training loss: 2.4669 0.1228 sec/batch\n", + "Epoch 2/20 Iteration 243/3560 Training loss: 2.4654 0.1216 sec/batch\n", + "Epoch 2/20 Iteration 244/3560 Training loss: 2.4642 0.1218 sec/batch\n", + "Epoch 2/20 Iteration 245/3560 Training loss: 2.4628 0.1232 sec/batch\n", + "Epoch 2/20 Iteration 246/3560 Training loss: 2.4609 0.1243 sec/batch\n", + "Epoch 2/20 Iteration 247/3560 Training loss: 2.4591 0.1227 sec/batch\n", + "Epoch 2/20 Iteration 248/3560 Training loss: 2.4579 0.1232 sec/batch\n", + "Epoch 2/20 Iteration 249/3560 Training loss: 2.4568 0.1240 sec/batch\n", + "Epoch 2/20 Iteration 250/3560 Training loss: 2.4556 0.1248 sec/batch\n", + "Epoch 2/20 Iteration 251/3560 Training loss: 2.4543 0.1220 sec/batch\n", + "Epoch 2/20 Iteration 252/3560 Training loss: 2.4527 0.1253 sec/batch\n", + "Epoch 2/20 Iteration 253/3560 Training loss: 
2.4513 0.1225 sec/batch\n", + "Epoch 2/20 Iteration 254/3560 Training loss: 2.4503 0.1245 sec/batch\n", + "Epoch 2/20 Iteration 255/3560 Training loss: 2.4490 0.1239 sec/batch\n", + "Epoch 2/20 Iteration 256/3560 Training loss: 2.4478 0.1221 sec/batch\n", + "Epoch 2/20 Iteration 257/3560 Training loss: 2.4464 0.1243 sec/batch\n", + "Epoch 2/20 Iteration 258/3560 Training loss: 2.4450 0.1226 sec/batch\n", + "Epoch 2/20 Iteration 259/3560 Training loss: 2.4435 0.1243 sec/batch\n", + "Epoch 2/20 Iteration 260/3560 Training loss: 2.4424 0.1233 sec/batch\n", + "Epoch 2/20 Iteration 261/3560 Training loss: 2.4408 0.1222 sec/batch\n", + "Epoch 2/20 Iteration 262/3560 Training loss: 2.4393 0.1236 sec/batch\n", + "Epoch 2/20 Iteration 263/3560 Training loss: 2.4374 0.1230 sec/batch\n", + "Epoch 2/20 Iteration 264/3560 Training loss: 2.4360 0.1219 sec/batch\n", + "Epoch 2/20 Iteration 265/3560 Training loss: 2.4347 0.1255 sec/batch\n", + "Epoch 2/20 Iteration 266/3560 Training loss: 2.4333 0.1249 sec/batch\n", + "Epoch 2/20 Iteration 267/3560 Training loss: 2.4318 0.1255 sec/batch\n", + "Epoch 2/20 Iteration 268/3560 Training loss: 2.4305 0.1225 sec/batch\n", + "Epoch 2/20 Iteration 269/3560 Training loss: 2.4291 0.1297 sec/batch\n", + "Epoch 2/20 Iteration 270/3560 Training loss: 2.4279 0.1224 sec/batch\n", + "Epoch 2/20 Iteration 271/3560 Training loss: 2.4265 0.1230 sec/batch\n", + "Epoch 2/20 Iteration 272/3560 Training loss: 2.4250 0.1299 sec/batch\n", + "Epoch 2/20 Iteration 273/3560 Training loss: 2.4234 0.1219 sec/batch\n", + "Epoch 2/20 Iteration 274/3560 Training loss: 2.4220 0.1237 sec/batch\n", + "Epoch 2/20 Iteration 275/3560 Training loss: 2.4207 0.1229 sec/batch\n", + "Epoch 2/20 Iteration 276/3560 Training loss: 2.4194 0.1216 sec/batch\n", + "Epoch 2/20 Iteration 277/3560 Training loss: 2.4179 0.1222 sec/batch\n", + "Epoch 2/20 Iteration 278/3560 Training loss: 2.4165 0.1221 sec/batch\n", + "Epoch 2/20 Iteration 279/3560 Training loss: 2.4154 0.1241 
sec/batch\n", + "Epoch 2/20 Iteration 280/3560 Training loss: 2.4141 0.1220 sec/batch\n", + "Epoch 2/20 Iteration 281/3560 Training loss: 2.4126 0.1229 sec/batch\n", + "Epoch 2/20 Iteration 282/3560 Training loss: 2.4113 0.1220 sec/batch\n", + "Epoch 2/20 Iteration 283/3560 Training loss: 2.4099 0.1252 sec/batch\n", + "Epoch 2/20 Iteration 284/3560 Training loss: 2.4087 0.1233 sec/batch\n", + "Epoch 2/20 Iteration 285/3560 Training loss: 2.4074 0.1229 sec/batch\n", + "Epoch 2/20 Iteration 286/3560 Training loss: 2.4064 0.1220 sec/batch\n", + "Epoch 2/20 Iteration 287/3560 Training loss: 2.4053 0.1259 sec/batch\n", + "Epoch 2/20 Iteration 288/3560 Training loss: 2.4039 0.1209 sec/batch\n", + "Epoch 2/20 Iteration 289/3560 Training loss: 2.4027 0.1221 sec/batch\n", + "Epoch 2/20 Iteration 290/3560 Training loss: 2.4015 0.1238 sec/batch\n", + "Epoch 2/20 Iteration 291/3560 Training loss: 2.4002 0.1234 sec/batch\n", + "Epoch 2/20 Iteration 292/3560 Training loss: 2.3990 0.1217 sec/batch\n", + "Epoch 2/20 Iteration 293/3560 Training loss: 2.3977 0.1224 sec/batch\n", + "Epoch 2/20 Iteration 294/3560 Training loss: 2.3963 0.1342 sec/batch\n", + "Epoch 2/20 Iteration 295/3560 Training loss: 2.3951 0.1234 sec/batch\n", + "Epoch 2/20 Iteration 296/3560 Training loss: 2.3940 0.1221 sec/batch\n", + "Epoch 2/20 Iteration 297/3560 Training loss: 2.3930 0.1218 sec/batch\n", + "Epoch 2/20 Iteration 298/3560 Training loss: 2.3919 0.1244 sec/batch\n", + "Epoch 2/20 Iteration 299/3560 Training loss: 2.3910 0.1217 sec/batch\n", + "Epoch 2/20 Iteration 300/3560 Training loss: 2.3898 0.1212 sec/batch\n", + "Epoch 2/20 Iteration 301/3560 Training loss: 2.3885 0.1226 sec/batch\n", + "Epoch 2/20 Iteration 302/3560 Training loss: 2.3875 0.1254 sec/batch\n", + "Epoch 2/20 Iteration 303/3560 Training loss: 2.3864 0.1259 sec/batch\n", + "Epoch 2/20 Iteration 304/3560 Training loss: 2.3851 0.1215 sec/batch\n", + "Epoch 2/20 Iteration 305/3560 Training loss: 2.3840 0.1234 sec/batch\n", + "Epoch 
2/20 Iteration 306/3560 Training loss: 2.3830 0.1222 sec/batch\n", + "Epoch 2/20 Iteration 307/3560 Training loss: 2.3819 0.1216 sec/batch\n", + "Epoch 2/20 Iteration 308/3560 Training loss: 2.3809 0.1219 sec/batch\n", + "Epoch 2/20 Iteration 309/3560 Training loss: 2.3797 0.1238 sec/batch\n", + "Epoch 2/20 Iteration 310/3560 Training loss: 2.3783 0.1249 sec/batch\n", + "Epoch 2/20 Iteration 311/3560 Training loss: 2.3773 0.1212 sec/batch\n", + "Epoch 2/20 Iteration 312/3560 Training loss: 2.3763 0.1220 sec/batch\n", + "Epoch 2/20 Iteration 313/3560 Training loss: 2.3751 0.1238 sec/batch\n", + "Epoch 2/20 Iteration 314/3560 Training loss: 2.3741 0.1242 sec/batch\n", + "Epoch 2/20 Iteration 315/3560 Training loss: 2.3730 0.1261 sec/batch\n", + "Epoch 2/20 Iteration 316/3560 Training loss: 2.3719 0.1237 sec/batch\n", + "Epoch 2/20 Iteration 317/3560 Training loss: 2.3711 0.1231 sec/batch\n", + "Epoch 2/20 Iteration 318/3560 Training loss: 2.3699 0.1241 sec/batch\n", + "Epoch 2/20 Iteration 319/3560 Training loss: 2.3690 0.1227 sec/batch\n", + "Epoch 2/20 Iteration 320/3560 Training loss: 2.3679 0.1247 sec/batch\n", + "Epoch 2/20 Iteration 321/3560 Training loss: 2.3668 0.1252 sec/batch\n", + "Epoch 2/20 Iteration 322/3560 Training loss: 2.3657 0.1231 sec/batch\n", + "Epoch 2/20 Iteration 323/3560 Training loss: 2.3646 0.1242 sec/batch\n", + "Epoch 2/20 Iteration 324/3560 Training loss: 2.3637 0.1224 sec/batch\n", + "Epoch 2/20 Iteration 325/3560 Training loss: 2.3627 0.1214 sec/batch\n", + "Epoch 2/20 Iteration 326/3560 Training loss: 2.3618 0.1224 sec/batch\n", + "Epoch 2/20 Iteration 327/3560 Training loss: 2.3607 0.1230 sec/batch\n", + "Epoch 2/20 Iteration 328/3560 Training loss: 2.3596 0.1210 sec/batch\n", + "Epoch 2/20 Iteration 329/3560 Training loss: 2.3586 0.1247 sec/batch\n", + "Epoch 2/20 Iteration 330/3560 Training loss: 2.3578 0.1256 sec/batch\n", + "Epoch 2/20 Iteration 331/3560 Training loss: 2.3568 0.1237 sec/batch\n", + "Epoch 2/20 Iteration 332/3560 
Training loss: 2.3558 0.1215 sec/batch\n", + "Epoch 2/20 Iteration 333/3560 Training loss: 2.3547 0.1230 sec/batch\n", + "Epoch 2/20 Iteration 334/3560 Training loss: 2.3536 0.1219 sec/batch\n", + "Epoch 2/20 Iteration 335/3560 Training loss: 2.3525 0.1261 sec/batch\n", + "Epoch 2/20 Iteration 336/3560 Training loss: 2.3515 0.1247 sec/batch\n", + "Epoch 2/20 Iteration 337/3560 Training loss: 2.3502 0.1222 sec/batch\n", + "Epoch 2/20 Iteration 338/3560 Training loss: 2.3495 0.1216 sec/batch\n", + "Epoch 2/20 Iteration 339/3560 Training loss: 2.3485 0.1238 sec/batch\n", + "Epoch 2/20 Iteration 340/3560 Training loss: 2.3474 0.1225 sec/batch\n", + "Epoch 2/20 Iteration 341/3560 Training loss: 2.3463 0.1218 sec/batch\n", + "Epoch 2/20 Iteration 342/3560 Training loss: 2.3454 0.1255 sec/batch\n", + "Epoch 2/20 Iteration 343/3560 Training loss: 2.3444 0.1242 sec/batch\n", + "Epoch 2/20 Iteration 344/3560 Training loss: 2.3434 0.1272 sec/batch\n", + "Epoch 2/20 Iteration 345/3560 Training loss: 2.3424 0.1213 sec/batch\n", + "Epoch 2/20 Iteration 346/3560 Training loss: 2.3416 0.1232 sec/batch\n", + "Epoch 2/20 Iteration 347/3560 Training loss: 2.3405 0.1235 sec/batch\n", + "Epoch 2/20 Iteration 348/3560 Training loss: 2.3394 0.1220 sec/batch\n", + "Epoch 2/20 Iteration 349/3560 Training loss: 2.3383 0.1254 sec/batch\n", + "Epoch 2/20 Iteration 350/3560 Training loss: 2.3373 0.1245 sec/batch\n", + "Epoch 2/20 Iteration 351/3560 Training loss: 2.3365 0.1235 sec/batch\n", + "Epoch 2/20 Iteration 352/3560 Training loss: 2.3356 0.1242 sec/batch\n", + "Epoch 2/20 Iteration 353/3560 Training loss: 2.3347 0.1243 sec/batch\n", + "Epoch 2/20 Iteration 354/3560 Training loss: 2.3337 0.1246 sec/batch\n", + "Epoch 2/20 Iteration 355/3560 Training loss: 2.3327 0.1228 sec/batch\n", + "Epoch 2/20 Iteration 356/3560 Training loss: 2.3317 0.1212 sec/batch\n", + "Epoch 3/20 Iteration 357/3560 Training loss: 2.2153 0.1249 sec/batch\n", + "Epoch 3/20 Iteration 358/3560 Training loss: 2.1681 
0.1361 sec/batch\n", + "Epoch 3/20 Iteration 359/3560 Training loss: 2.1562 0.1232 sec/batch\n", + "Epoch 3/20 Iteration 360/3560 Training loss: 2.1509 0.1223 sec/batch\n", + "Epoch 3/20 Iteration 361/3560 Training loss: 2.1492 0.1234 sec/batch\n", + "Epoch 3/20 Iteration 362/3560 Training loss: 2.1425 0.1239 sec/batch\n", + "Epoch 3/20 Iteration 363/3560 Training loss: 2.1419 0.1222 sec/batch\n", + "Epoch 3/20 Iteration 364/3560 Training loss: 2.1432 0.1209 sec/batch\n", + "Epoch 3/20 Iteration 365/3560 Training loss: 2.1442 0.1237 sec/batch\n", + "Epoch 3/20 Iteration 366/3560 Training loss: 2.1431 0.1222 sec/batch\n", + "Epoch 3/20 Iteration 367/3560 Training loss: 2.1410 0.1242 sec/batch\n", + "Epoch 3/20 Iteration 368/3560 Training loss: 2.1391 0.1266 sec/batch\n", + "Epoch 3/20 Iteration 369/3560 Training loss: 2.1385 0.1246 sec/batch\n", + "Epoch 3/20 Iteration 370/3560 Training loss: 2.1401 0.1247 sec/batch\n", + "Epoch 3/20 Iteration 371/3560 Training loss: 2.1394 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 372/3560 Training loss: 2.1377 0.1258 sec/batch\n", + "Epoch 3/20 Iteration 373/3560 Training loss: 2.1369 0.1219 sec/batch\n", + "Epoch 3/20 Iteration 374/3560 Training loss: 2.1415 0.1216 sec/batch\n", + "Epoch 3/20 Iteration 375/3560 Training loss: 2.1433 0.1237 sec/batch\n", + "Epoch 3/20 Iteration 376/3560 Training loss: 2.1437 0.1217 sec/batch\n", + "Epoch 3/20 Iteration 377/3560 Training loss: 2.1433 0.1241 sec/batch\n", + "Epoch 3/20 Iteration 378/3560 Training loss: 2.1449 0.1219 sec/batch\n", + "Epoch 3/20 Iteration 379/3560 Training loss: 2.1450 0.1266 sec/batch\n", + "Epoch 3/20 Iteration 380/3560 Training loss: 2.1442 0.1238 sec/batch\n", + "Epoch 3/20 Iteration 381/3560 Training loss: 2.1439 0.1246 sec/batch\n", + "Epoch 3/20 Iteration 382/3560 Training loss: 2.1431 0.1223 sec/batch\n", + "Epoch 3/20 Iteration 383/3560 Training loss: 2.1421 0.1258 sec/batch\n", + "Epoch 3/20 Iteration 384/3560 Training loss: 2.1421 0.1243 sec/batch\n", + 
"Epoch 3/20 Iteration 385/3560 Training loss: 2.1428 0.1219 sec/batch\n", + "Epoch 3/20 Iteration 386/3560 Training loss: 2.1426 0.1238 sec/batch\n", + "Epoch 3/20 Iteration 387/3560 Training loss: 2.1421 0.1251 sec/batch\n", + "Epoch 3/20 Iteration 388/3560 Training loss: 2.1406 0.1247 sec/batch\n", + "Epoch 3/20 Iteration 389/3560 Training loss: 2.1399 0.1239 sec/batch\n", + "Epoch 3/20 Iteration 390/3560 Training loss: 2.1400 0.1259 sec/batch\n", + "Epoch 3/20 Iteration 391/3560 Training loss: 2.1392 0.1234 sec/batch\n", + "Epoch 3/20 Iteration 392/3560 Training loss: 2.1384 0.1218 sec/batch\n", + "Epoch 3/20 Iteration 393/3560 Training loss: 2.1377 0.1223 sec/batch\n", + "Epoch 3/20 Iteration 394/3560 Training loss: 2.1360 0.1214 sec/batch\n", + "Epoch 3/20 Iteration 395/3560 Training loss: 2.1343 0.1218 sec/batch\n", + "Epoch 3/20 Iteration 396/3560 Training loss: 2.1326 0.1255 sec/batch\n", + "Epoch 3/20 Iteration 397/3560 Training loss: 2.1314 0.1241 sec/batch\n", + "Epoch 3/20 Iteration 398/3560 Training loss: 2.1306 0.1263 sec/batch\n", + "Epoch 3/20 Iteration 399/3560 Training loss: 2.1293 0.1274 sec/batch\n", + "Epoch 3/20 Iteration 400/3560 Training loss: 2.1279 0.1209 sec/batch\n", + "Epoch 3/20 Iteration 401/3560 Training loss: 2.1271 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 402/3560 Training loss: 2.1250 0.1252 sec/batch\n", + "Epoch 3/20 Iteration 403/3560 Training loss: 2.1241 0.1225 sec/batch\n", + "Epoch 3/20 Iteration 404/3560 Training loss: 2.1228 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 405/3560 Training loss: 2.1218 0.1265 sec/batch\n", + "Epoch 3/20 Iteration 406/3560 Training loss: 2.1217 0.1212 sec/batch\n", + "Epoch 3/20 Iteration 407/3560 Training loss: 2.1204 0.1228 sec/batch\n", + "Epoch 3/20 Iteration 408/3560 Training loss: 2.1202 0.1214 sec/batch\n", + "Epoch 3/20 Iteration 409/3560 Training loss: 2.1191 0.1219 sec/batch\n", + "Epoch 3/20 Iteration 410/3560 Training loss: 2.1181 0.1249 sec/batch\n", + "Epoch 3/20 Iteration 
411/3560 Training loss: 2.1171 0.1220 sec/batch\n", + "Epoch 3/20 Iteration 412/3560 Training loss: 2.1165 0.1256 sec/batch\n", + "Epoch 3/20 Iteration 413/3560 Training loss: 2.1158 0.1212 sec/batch\n", + "Epoch 3/20 Iteration 414/3560 Training loss: 2.1148 0.1218 sec/batch\n", + "Epoch 3/20 Iteration 415/3560 Training loss: 2.1137 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 416/3560 Training loss: 2.1134 0.1244 sec/batch\n", + "Epoch 3/20 Iteration 417/3560 Training loss: 2.1126 0.1229 sec/batch\n", + "Epoch 3/20 Iteration 418/3560 Training loss: 2.1122 0.1214 sec/batch\n", + "Epoch 3/20 Iteration 419/3560 Training loss: 2.1119 0.1259 sec/batch\n", + "Epoch 3/20 Iteration 420/3560 Training loss: 2.1114 0.1244 sec/batch\n", + "Epoch 3/20 Iteration 421/3560 Training loss: 2.1106 0.1219 sec/batch\n", + "Epoch 3/20 Iteration 422/3560 Training loss: 2.1102 0.1216 sec/batch\n", + "Epoch 3/20 Iteration 423/3560 Training loss: 2.1096 0.1219 sec/batch\n", + "Epoch 3/20 Iteration 424/3560 Training loss: 2.1085 0.1246 sec/batch\n", + "Epoch 3/20 Iteration 425/3560 Training loss: 2.1076 0.1222 sec/batch\n", + "Epoch 3/20 Iteration 426/3560 Training loss: 2.1068 0.1239 sec/batch\n", + "Epoch 3/20 Iteration 427/3560 Training loss: 2.1064 0.1226 sec/batch\n", + "Epoch 3/20 Iteration 428/3560 Training loss: 2.1058 0.1247 sec/batch\n", + "Epoch 3/20 Iteration 429/3560 Training loss: 2.1053 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 430/3560 Training loss: 2.1044 0.1242 sec/batch\n", + "Epoch 3/20 Iteration 431/3560 Training loss: 2.1036 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 432/3560 Training loss: 2.1032 0.1243 sec/batch\n", + "Epoch 3/20 Iteration 433/3560 Training loss: 2.1023 0.1232 sec/batch\n", + "Epoch 3/20 Iteration 434/3560 Training loss: 2.1018 0.1217 sec/batch\n", + "Epoch 3/20 Iteration 435/3560 Training loss: 2.1007 0.1253 sec/batch\n", + "Epoch 3/20 Iteration 436/3560 Training loss: 2.0997 0.1209 sec/batch\n", + "Epoch 3/20 Iteration 437/3560 Training loss: 
2.0987 0.1237 sec/batch\n", + "Epoch 3/20 Iteration 438/3560 Training loss: 2.0980 0.1225 sec/batch\n", + "Epoch 3/20 Iteration 439/3560 Training loss: 2.0969 0.1228 sec/batch\n", + "Epoch 3/20 Iteration 440/3560 Training loss: 2.0960 0.1252 sec/batch\n", + "Epoch 3/20 Iteration 441/3560 Training loss: 2.0947 0.1231 sec/batch\n", + "Epoch 3/20 Iteration 442/3560 Training loss: 2.0937 0.1247 sec/batch\n", + "Epoch 3/20 Iteration 443/3560 Training loss: 2.0929 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 444/3560 Training loss: 2.0920 0.1236 sec/batch\n", + "Epoch 3/20 Iteration 445/3560 Training loss: 2.0909 0.1256 sec/batch\n", + "Epoch 3/20 Iteration 446/3560 Training loss: 2.0902 0.1245 sec/batch\n", + "Epoch 3/20 Iteration 447/3560 Training loss: 2.0893 0.1223 sec/batch\n", + "Epoch 3/20 Iteration 448/3560 Training loss: 2.0886 0.1298 sec/batch\n", + "Epoch 3/20 Iteration 449/3560 Training loss: 2.0874 0.1219 sec/batch\n", + "Epoch 3/20 Iteration 450/3560 Training loss: 2.0865 0.1222 sec/batch\n", + "Epoch 3/20 Iteration 451/3560 Training loss: 2.0856 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 452/3560 Training loss: 2.0847 0.1203 sec/batch\n", + "Epoch 3/20 Iteration 453/3560 Training loss: 2.0839 0.1230 sec/batch\n", + "Epoch 3/20 Iteration 454/3560 Training loss: 2.0831 0.1249 sec/batch\n", + "Epoch 3/20 Iteration 455/3560 Training loss: 2.0819 0.1218 sec/batch\n", + "Epoch 3/20 Iteration 456/3560 Training loss: 2.0808 0.1276 sec/batch\n", + "Epoch 3/20 Iteration 457/3560 Training loss: 2.0801 0.1224 sec/batch\n", + "Epoch 3/20 Iteration 458/3560 Training loss: 2.0795 0.1256 sec/batch\n", + "Epoch 3/20 Iteration 459/3560 Training loss: 2.0785 0.1262 sec/batch\n", + "Epoch 3/20 Iteration 460/3560 Training loss: 2.0777 0.1251 sec/batch\n", + "Epoch 3/20 Iteration 461/3560 Training loss: 2.0768 0.1252 sec/batch\n", + "Epoch 3/20 Iteration 462/3560 Training loss: 2.0761 0.1243 sec/batch\n", + "Epoch 3/20 Iteration 463/3560 Training loss: 2.0754 0.1254 
sec/batch\n", + "Epoch 3/20 Iteration 464/3560 Training loss: 2.0747 0.1227 sec/batch\n", + "Epoch 3/20 Iteration 465/3560 Training loss: 2.0741 0.1229 sec/batch\n", + "Epoch 3/20 Iteration 466/3560 Training loss: 2.0734 0.1240 sec/batch\n", + "Epoch 3/20 Iteration 467/3560 Training loss: 2.0727 0.1253 sec/batch\n", + "Epoch 3/20 Iteration 468/3560 Training loss: 2.0720 0.1227 sec/batch\n", + "Epoch 3/20 Iteration 469/3560 Training loss: 2.0712 0.1242 sec/batch\n", + "Epoch 3/20 Iteration 470/3560 Training loss: 2.0704 0.1260 sec/batch\n", + "Epoch 3/20 Iteration 471/3560 Training loss: 2.0696 0.1266 sec/batch\n", + "Epoch 3/20 Iteration 472/3560 Training loss: 2.0686 0.1243 sec/batch\n", + "Epoch 3/20 Iteration 473/3560 Training loss: 2.0679 0.1228 sec/batch\n", + "Epoch 3/20 Iteration 474/3560 Training loss: 2.0672 0.1253 sec/batch\n", + "Epoch 3/20 Iteration 475/3560 Training loss: 2.0666 0.1247 sec/batch\n", + "Epoch 3/20 Iteration 476/3560 Training loss: 2.0660 0.1228 sec/batch\n", + "Epoch 3/20 Iteration 477/3560 Training loss: 2.0654 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 478/3560 Training loss: 2.0646 0.1264 sec/batch\n", + "Epoch 3/20 Iteration 479/3560 Training loss: 2.0638 0.1253 sec/batch\n", + "Epoch 3/20 Iteration 480/3560 Training loss: 2.0633 0.1228 sec/batch\n", + "Epoch 3/20 Iteration 481/3560 Training loss: 2.0626 0.1239 sec/batch\n", + "Epoch 3/20 Iteration 482/3560 Training loss: 2.0616 0.1229 sec/batch\n", + "Epoch 3/20 Iteration 483/3560 Training loss: 2.0611 0.1218 sec/batch\n", + "Epoch 3/20 Iteration 484/3560 Training loss: 2.0606 0.1236 sec/batch\n", + "Epoch 3/20 Iteration 485/3560 Training loss: 2.0599 0.1225 sec/batch\n", + "Epoch 3/20 Iteration 486/3560 Training loss: 2.0592 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 487/3560 Training loss: 2.0584 0.1242 sec/batch\n", + "Epoch 3/20 Iteration 488/3560 Training loss: 2.0575 0.1245 sec/batch\n", + "Epoch 3/20 Iteration 489/3560 Training loss: 2.0569 0.1243 sec/batch\n", + "Epoch 
3/20 Iteration 490/3560 Training loss: 2.0563 0.1216 sec/batch\n", + "Epoch 3/20 Iteration 491/3560 Training loss: 2.0556 0.1240 sec/batch\n", + "Epoch 3/20 Iteration 492/3560 Training loss: 2.0551 0.1249 sec/batch\n", + "Epoch 3/20 Iteration 493/3560 Training loss: 2.0545 0.1233 sec/batch\n", + "Epoch 3/20 Iteration 494/3560 Training loss: 2.0539 0.1238 sec/batch\n", + "Epoch 3/20 Iteration 495/3560 Training loss: 2.0536 0.1244 sec/batch\n", + "Epoch 3/20 Iteration 496/3560 Training loss: 2.0529 0.1231 sec/batch\n", + "Epoch 3/20 Iteration 497/3560 Training loss: 2.0524 0.1237 sec/batch\n", + "Epoch 3/20 Iteration 498/3560 Training loss: 2.0518 0.1247 sec/batch\n", + "Epoch 3/20 Iteration 499/3560 Training loss: 2.0511 0.1256 sec/batch\n", + "Epoch 3/20 Iteration 500/3560 Training loss: 2.0506 0.1239 sec/batch\n", + "Epoch 3/20 Iteration 501/3560 Training loss: 2.0499 0.1232 sec/batch\n", + "Epoch 3/20 Iteration 502/3560 Training loss: 2.0494 0.1219 sec/batch\n", + "Epoch 3/20 Iteration 503/3560 Training loss: 2.0490 0.1250 sec/batch\n", + "Epoch 3/20 Iteration 504/3560 Training loss: 2.0485 0.1253 sec/batch\n", + "Epoch 3/20 Iteration 505/3560 Training loss: 2.0479 0.1246 sec/batch\n", + "Epoch 3/20 Iteration 506/3560 Training loss: 2.0473 0.1219 sec/batch\n", + "Epoch 3/20 Iteration 507/3560 Training loss: 2.0467 0.1219 sec/batch\n", + "Epoch 3/20 Iteration 508/3560 Training loss: 2.0464 0.1223 sec/batch\n", + "Epoch 3/20 Iteration 509/3560 Training loss: 2.0459 0.1226 sec/batch\n", + "Epoch 3/20 Iteration 510/3560 Training loss: 2.0454 0.1257 sec/batch\n", + "Epoch 3/20 Iteration 511/3560 Training loss: 2.0448 0.1225 sec/batch\n", + "Epoch 3/20 Iteration 512/3560 Training loss: 2.0442 0.1256 sec/batch\n", + "Epoch 3/20 Iteration 513/3560 Training loss: 2.0436 0.1242 sec/batch\n", + "Epoch 3/20 Iteration 514/3560 Training loss: 2.0430 0.1220 sec/batch\n", + "Epoch 3/20 Iteration 515/3560 Training loss: 2.0422 0.1248 sec/batch\n", + "Epoch 3/20 Iteration 516/3560 
Training loss: 2.0418 0.1249 sec/batch\n", + "Epoch 3/20 Iteration 517/3560 Training loss: 2.0414 0.1232 sec/batch\n", + "Epoch 3/20 Iteration 518/3560 Training loss: 2.0408 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 519/3560 Training loss: 2.0402 0.1250 sec/batch\n", + "Epoch 3/20 Iteration 520/3560 Training loss: 2.0397 0.1238 sec/batch\n", + "Epoch 3/20 Iteration 521/3560 Training loss: 2.0391 0.1230 sec/batch\n", + "Epoch 3/20 Iteration 522/3560 Training loss: 2.0385 0.1240 sec/batch\n", + "Epoch 3/20 Iteration 523/3560 Training loss: 2.0381 0.1224 sec/batch\n", + "Epoch 3/20 Iteration 524/3560 Training loss: 2.0378 0.1222 sec/batch\n", + "Epoch 3/20 Iteration 525/3560 Training loss: 2.0372 0.1237 sec/batch\n", + "Epoch 3/20 Iteration 526/3560 Training loss: 2.0366 0.1214 sec/batch\n", + "Epoch 3/20 Iteration 527/3560 Training loss: 2.0360 0.1214 sec/batch\n", + "Epoch 3/20 Iteration 528/3560 Training loss: 2.0354 0.1228 sec/batch\n", + "Epoch 3/20 Iteration 529/3560 Training loss: 2.0349 0.1245 sec/batch\n", + "Epoch 3/20 Iteration 530/3560 Training loss: 2.0345 0.1291 sec/batch\n", + "Epoch 3/20 Iteration 531/3560 Training loss: 2.0340 0.1211 sec/batch\n", + "Epoch 3/20 Iteration 532/3560 Training loss: 2.0334 0.1216 sec/batch\n", + "Epoch 3/20 Iteration 533/3560 Training loss: 2.0327 0.1268 sec/batch\n", + "Epoch 3/20 Iteration 534/3560 Training loss: 2.0322 0.1360 sec/batch\n", + "Epoch 4/20 Iteration 535/3560 Training loss: 1.9983 0.1247 sec/batch\n", + "Epoch 4/20 Iteration 536/3560 Training loss: 1.9566 0.1218 sec/batch\n", + "Epoch 4/20 Iteration 537/3560 Training loss: 1.9419 0.1258 sec/batch\n", + "Epoch 4/20 Iteration 538/3560 Training loss: 1.9359 0.1251 sec/batch\n", + "Epoch 4/20 Iteration 539/3560 Training loss: 1.9358 0.1268 sec/batch\n", + "Epoch 4/20 Iteration 540/3560 Training loss: 1.9263 0.1264 sec/batch\n", + "Epoch 4/20 Iteration 541/3560 Training loss: 1.9261 0.1229 sec/batch\n", + "Epoch 4/20 Iteration 542/3560 Training loss: 1.9254 
0.1242 sec/batch\n", + "Epoch 4/20 Iteration 543/3560 Training loss: 1.9288 0.1259 sec/batch\n", + "Epoch 4/20 Iteration 544/3560 Training loss: 1.9283 0.1232 sec/batch\n", + "Epoch 4/20 Iteration 545/3560 Training loss: 1.9240 0.1269 sec/batch\n", + "Epoch 4/20 Iteration 546/3560 Training loss: 1.9218 0.1230 sec/batch\n", + "Epoch 4/20 Iteration 547/3560 Training loss: 1.9215 0.1244 sec/batch\n", + "Epoch 4/20 Iteration 548/3560 Training loss: 1.9246 0.1288 sec/batch\n", + "Epoch 4/20 Iteration 549/3560 Training loss: 1.9229 0.1237 sec/batch\n", + "Epoch 4/20 Iteration 550/3560 Training loss: 1.9213 0.1258 sec/batch\n", + "Epoch 4/20 Iteration 551/3560 Training loss: 1.9200 0.1239 sec/batch\n", + "Epoch 4/20 Iteration 552/3560 Training loss: 1.9219 0.1282 sec/batch\n", + "Epoch 4/20 Iteration 553/3560 Training loss: 1.9216 0.1218 sec/batch\n", + "Epoch 4/20 Iteration 554/3560 Training loss: 1.9214 0.1252 sec/batch\n", + "Epoch 4/20 Iteration 555/3560 Training loss: 1.9198 0.1240 sec/batch\n", + "Epoch 4/20 Iteration 556/3560 Training loss: 1.9209 0.1258 sec/batch\n", + "Epoch 4/20 Iteration 557/3560 Training loss: 1.9199 0.1237 sec/batch\n", + "Epoch 4/20 Iteration 558/3560 Training loss: 1.9186 0.1242 sec/batch\n", + "Epoch 4/20 Iteration 559/3560 Training loss: 1.9178 0.1227 sec/batch\n", + "Epoch 4/20 Iteration 560/3560 Training loss: 1.9162 0.1231 sec/batch\n", + "Epoch 4/20 Iteration 561/3560 Training loss: 1.9148 0.1221 sec/batch\n", + "Epoch 4/20 Iteration 562/3560 Training loss: 1.9146 0.1310 sec/batch\n", + "Epoch 4/20 Iteration 563/3560 Training loss: 1.9152 0.1216 sec/batch\n", + "Epoch 4/20 Iteration 564/3560 Training loss: 1.9147 0.1242 sec/batch\n", + "Epoch 4/20 Iteration 565/3560 Training loss: 1.9143 0.1255 sec/batch\n", + "Epoch 4/20 Iteration 566/3560 Training loss: 1.9128 0.1247 sec/batch\n", + "Epoch 4/20 Iteration 567/3560 Training loss: 1.9121 0.1229 sec/batch\n", + "Epoch 4/20 Iteration 568/3560 Training loss: 1.9123 0.1272 sec/batch\n", + 
"Epoch 4/20 Iteration 569/3560 Training loss: 1.9116 0.1266 sec/batch\n", + "Epoch 4/20 Iteration 570/3560 Training loss: 1.9109 0.1288 sec/batch\n", + "Epoch 4/20 Iteration 571/3560 Training loss: 1.9102 0.1320 sec/batch\n", + "Epoch 4/20 Iteration 572/3560 Training loss: 1.9086 0.1305 sec/batch\n", + "Epoch 4/20 Iteration 573/3560 Training loss: 1.9071 0.1253 sec/batch\n", + "Epoch 4/20 Iteration 574/3560 Training loss: 1.9060 0.1249 sec/batch\n", + "Epoch 4/20 Iteration 575/3560 Training loss: 1.9053 0.1261 sec/batch\n", + "Epoch 4/20 Iteration 576/3560 Training loss: 1.9048 0.1252 sec/batch\n", + "Epoch 4/20 Iteration 577/3560 Training loss: 1.9040 0.1261 sec/batch\n", + "Epoch 4/20 Iteration 578/3560 Training loss: 1.9031 0.1237 sec/batch\n", + "Epoch 4/20 Iteration 579/3560 Training loss: 1.9028 0.1264 sec/batch\n", + "Epoch 4/20 Iteration 580/3560 Training loss: 1.9010 0.1238 sec/batch\n", + "Epoch 4/20 Iteration 581/3560 Training loss: 1.9008 0.1216 sec/batch\n", + "Epoch 4/20 Iteration 582/3560 Training loss: 1.8999 0.1233 sec/batch\n", + "Epoch 4/20 Iteration 583/3560 Training loss: 1.8993 0.1213 sec/batch\n", + "Epoch 4/20 Iteration 584/3560 Training loss: 1.8995 0.1217 sec/batch\n", + "Epoch 4/20 Iteration 585/3560 Training loss: 1.8985 0.1254 sec/batch\n", + "Epoch 4/20 Iteration 586/3560 Training loss: 1.8990 0.1233 sec/batch\n", + "Epoch 4/20 Iteration 587/3560 Training loss: 1.8982 0.1219 sec/batch\n", + "Epoch 4/20 Iteration 588/3560 Training loss: 1.8978 0.1263 sec/batch\n", + "Epoch 4/20 Iteration 589/3560 Training loss: 1.8971 0.1272 sec/batch\n", + "Epoch 4/20 Iteration 590/3560 Training loss: 1.8968 0.1275 sec/batch\n", + "Epoch 4/20 Iteration 591/3560 Training loss: 1.8967 0.1239 sec/batch\n", + "Epoch 4/20 Iteration 592/3560 Training loss: 1.8961 0.1222 sec/batch\n", + "Epoch 4/20 Iteration 593/3560 Training loss: 1.8953 0.1224 sec/batch\n", + "Epoch 4/20 Iteration 594/3560 Training loss: 1.8954 0.1240 sec/batch\n", + "Epoch 4/20 Iteration 
595/3560 Training loss: 1.8950 0.1272 sec/batch\n", + "Epoch 4/20 Iteration 596/3560 Training loss: 1.8953 0.1232 sec/batch\n", + "Epoch 4/20 Iteration 597/3560 Training loss: 1.8952 0.1223 sec/batch\n", + "Epoch 4/20 Iteration 598/3560 Training loss: 1.8949 0.1265 sec/batch\n", + "Epoch 4/20 Iteration 599/3560 Training loss: 1.8945 0.1217 sec/batch\n", + "Epoch 4/20 Iteration 600/3560 Training loss: 1.8945 0.1254 sec/batch\n", + "Epoch 4/20 Iteration 601/3560 Training loss: 1.8943 0.1235 sec/batch\n", + "Epoch 4/20 Iteration 602/3560 Training loss: 1.8935 0.1276 sec/batch\n", + "Epoch 4/20 Iteration 603/3560 Training loss: 1.8931 0.1246 sec/batch\n", + "Epoch 4/20 Iteration 604/3560 Training loss: 1.8925 0.1269 sec/batch\n", + "Epoch 4/20 Iteration 605/3560 Training loss: 1.8927 0.1247 sec/batch\n", + "Epoch 4/20 Iteration 606/3560 Training loss: 1.8924 0.1299 sec/batch\n", + "Epoch 4/20 Iteration 607/3560 Training loss: 1.8923 0.1272 sec/batch\n", + "Epoch 4/20 Iteration 608/3560 Training loss: 1.8915 0.1321 sec/batch\n", + "Epoch 4/20 Iteration 609/3560 Training loss: 1.8910 0.1247 sec/batch\n", + "Epoch 4/20 Iteration 610/3560 Training loss: 1.8909 0.1324 sec/batch\n", + "Epoch 4/20 Iteration 611/3560 Training loss: 1.8902 0.1215 sec/batch\n", + "Epoch 4/20 Iteration 612/3560 Training loss: 1.8900 0.1225 sec/batch\n", + "Epoch 4/20 Iteration 613/3560 Training loss: 1.8891 0.1217 sec/batch\n", + "Epoch 4/20 Iteration 614/3560 Training loss: 1.8885 0.1242 sec/batch\n", + "Epoch 4/20 Iteration 615/3560 Training loss: 1.8876 0.1223 sec/batch\n", + "Epoch 4/20 Iteration 616/3560 Training loss: 1.8873 0.1262 sec/batch\n", + "Epoch 4/20 Iteration 617/3560 Training loss: 1.8864 0.1222 sec/batch\n", + "Epoch 4/20 Iteration 618/3560 Training loss: 1.8858 0.1220 sec/batch\n", + "Epoch 4/20 Iteration 619/3560 Training loss: 1.8849 0.1225 sec/batch\n", + "Epoch 4/20 Iteration 620/3560 Training loss: 1.8843 0.1227 sec/batch\n", + "Epoch 4/20 Iteration 621/3560 Training loss: 
1.8837 0.1234 sec/batch\n", + "Epoch 4/20 Iteration 622/3560 Training loss: 1.8830 0.1252 sec/batch\n", + "Epoch 4/20 Iteration 623/3560 Training loss: 1.8822 0.1310 sec/batch\n", + "Epoch 4/20 Iteration 624/3560 Training loss: 1.8819 0.1280 sec/batch\n", + "Epoch 4/20 Iteration 625/3560 Training loss: 1.8813 0.1236 sec/batch\n", + "Epoch 4/20 Iteration 626/3560 Training loss: 1.8808 0.1273 sec/batch\n", + "Epoch 4/20 Iteration 627/3560 Training loss: 1.8799 0.1261 sec/batch\n", + "Epoch 4/20 Iteration 628/3560 Training loss: 1.8791 0.1289 sec/batch\n", + "Epoch 4/20 Iteration 629/3560 Training loss: 1.8783 0.1255 sec/batch\n", + "Epoch 4/20 Iteration 630/3560 Training loss: 1.8778 0.1273 sec/batch\n", + "Epoch 4/20 Iteration 631/3560 Training loss: 1.8772 0.1232 sec/batch\n", + "Epoch 4/20 Iteration 632/3560 Training loss: 1.8764 0.1223 sec/batch\n", + "Epoch 4/20 Iteration 633/3560 Training loss: 1.8755 0.1245 sec/batch\n", + "Epoch 4/20 Iteration 634/3560 Training loss: 1.8746 0.1234 sec/batch\n", + "Epoch 4/20 Iteration 635/3560 Training loss: 1.8742 0.1268 sec/batch\n", + "Epoch 4/20 Iteration 636/3560 Training loss: 1.8736 0.1216 sec/batch\n", + "Epoch 4/20 Iteration 637/3560 Training loss: 1.8731 0.1252 sec/batch\n", + "Epoch 4/20 Iteration 638/3560 Training loss: 1.8725 0.1229 sec/batch\n", + "Epoch 4/20 Iteration 639/3560 Training loss: 1.8718 0.1234 sec/batch\n", + "Epoch 4/20 Iteration 640/3560 Training loss: 1.8713 0.1231 sec/batch\n", + "Epoch 4/20 Iteration 641/3560 Training loss: 1.8708 0.1233 sec/batch\n", + "Epoch 4/20 Iteration 642/3560 Training loss: 1.8703 0.1307 sec/batch\n", + "Epoch 4/20 Iteration 643/3560 Training loss: 1.8698 0.1238 sec/batch\n", + "Epoch 4/20 Iteration 644/3560 Training loss: 1.8694 0.1292 sec/batch\n", + "Epoch 4/20 Iteration 645/3560 Training loss: 1.8690 0.1290 sec/batch\n", + "Epoch 4/20 Iteration 646/3560 Training loss: 1.8684 0.1250 sec/batch\n", + "Epoch 4/20 Iteration 647/3560 Training loss: 1.8679 0.1270 
sec/batch\n", + "Epoch 4/20 Iteration 648/3560 Training loss: 1.8673 0.1234 sec/batch\n", + "Epoch 4/20 Iteration 649/3560 Training loss: 1.8668 0.1231 sec/batch\n", + "Epoch 4/20 Iteration 650/3560 Training loss: 1.8661 0.1263 sec/batch\n", + "Epoch 4/20 Iteration 651/3560 Training loss: 1.8656 0.1248 sec/batch\n", + "Epoch 4/20 Iteration 652/3560 Training loss: 1.8651 0.1234 sec/batch\n", + "Epoch 4/20 Iteration 653/3560 Training loss: 1.8646 0.1319 sec/batch\n", + "Epoch 4/20 Iteration 654/3560 Training loss: 1.8642 0.1233 sec/batch\n", + "Epoch 4/20 Iteration 655/3560 Training loss: 1.8638 0.1249 sec/batch\n", + "Epoch 4/20 Iteration 656/3560 Training loss: 1.8631 0.1233 sec/batch\n", + "Epoch 4/20 Iteration 657/3560 Training loss: 1.8625 0.1224 sec/batch\n", + "Epoch 4/20 Iteration 658/3560 Training loss: 1.8622 0.1312 sec/batch\n", + "Epoch 4/20 Iteration 659/3560 Training loss: 1.8618 0.1210 sec/batch\n", + "Epoch 4/20 Iteration 660/3560 Training loss: 1.8610 0.1214 sec/batch\n", + "Epoch 4/20 Iteration 661/3560 Training loss: 1.8607 0.1257 sec/batch\n", + "Epoch 4/20 Iteration 662/3560 Training loss: 1.8604 0.1280 sec/batch\n", + "Epoch 4/20 Iteration 663/3560 Training loss: 1.8600 0.1216 sec/batch\n", + "Epoch 4/20 Iteration 664/3560 Training loss: 1.8595 0.1225 sec/batch\n", + "Epoch 4/20 Iteration 665/3560 Training loss: 1.8589 0.1254 sec/batch\n", + "Epoch 4/20 Iteration 666/3560 Training loss: 1.8583 0.1231 sec/batch\n", + "Epoch 4/20 Iteration 667/3560 Training loss: 1.8579 0.1221 sec/batch\n", + "Epoch 4/20 Iteration 668/3560 Training loss: 1.8576 0.1252 sec/batch\n", + "Epoch 4/20 Iteration 669/3560 Training loss: 1.8572 0.1294 sec/batch\n", + "Epoch 4/20 Iteration 670/3560 Training loss: 1.8568 0.1260 sec/batch\n", + "Epoch 4/20 Iteration 671/3560 Training loss: 1.8565 0.1223 sec/batch\n", + "Epoch 4/20 Iteration 672/3560 Training loss: 1.8562 0.1234 sec/batch\n", + "Epoch 4/20 Iteration 673/3560 Training loss: 1.8560 0.1266 sec/batch\n", + "Epoch 
4/20 Iteration 674/3560 Training loss: 1.8555 0.1236 sec/batch\n", + "Epoch 4/20 Iteration 675/3560 Training loss: 1.8553 0.1247 sec/batch\n", + "Epoch 4/20 Iteration 676/3560 Training loss: 1.8549 0.1252 sec/batch\n", + "Epoch 4/20 Iteration 677/3560 Training loss: 1.8545 0.1238 sec/batch\n", + "Epoch 4/20 Iteration 678/3560 Training loss: 1.8542 0.1215 sec/batch\n", + "Epoch 4/20 Iteration 679/3560 Training loss: 1.8537 0.1249 sec/batch\n", + "Epoch 4/20 Iteration 680/3560 Training loss: 1.8534 0.1295 sec/batch\n", + "Epoch 4/20 Iteration 681/3560 Training loss: 1.8530 0.1237 sec/batch\n", + "Epoch 4/20 Iteration 682/3560 Training loss: 1.8528 0.1238 sec/batch\n", + "Epoch 4/20 Iteration 683/3560 Training loss: 1.8524 0.1292 sec/batch\n", + "Epoch 4/20 Iteration 684/3560 Training loss: 1.8519 0.1248 sec/batch\n", + "Epoch 4/20 Iteration 685/3560 Training loss: 1.8513 0.1245 sec/batch\n", + "Epoch 4/20 Iteration 686/3560 Training loss: 1.8511 0.1268 sec/batch\n", + "Epoch 4/20 Iteration 687/3560 Training loss: 1.8508 0.1236 sec/batch\n", + "Epoch 4/20 Iteration 688/3560 Training loss: 1.8506 0.1245 sec/batch\n", + "Epoch 4/20 Iteration 689/3560 Training loss: 1.8502 0.1213 sec/batch\n", + "Epoch 4/20 Iteration 690/3560 Training loss: 1.8498 0.1261 sec/batch\n", + "Epoch 4/20 Iteration 691/3560 Training loss: 1.8494 0.1249 sec/batch\n", + "Epoch 4/20 Iteration 692/3560 Training loss: 1.8490 0.1219 sec/batch\n", + "Epoch 4/20 Iteration 693/3560 Training loss: 1.8485 0.1244 sec/batch\n", + "Epoch 4/20 Iteration 694/3560 Training loss: 1.8482 0.1226 sec/batch\n", + "Epoch 4/20 Iteration 695/3560 Training loss: 1.8481 0.1254 sec/batch\n", + "Epoch 4/20 Iteration 696/3560 Training loss: 1.8476 0.1244 sec/batch\n", + "Epoch 4/20 Iteration 697/3560 Training loss: 1.8474 0.1361 sec/batch\n", + "Epoch 4/20 Iteration 698/3560 Training loss: 1.8471 0.1241 sec/batch\n", + "Epoch 4/20 Iteration 699/3560 Training loss: 1.8467 0.1255 sec/batch\n", + "Epoch 4/20 Iteration 700/3560 
Training loss: 1.8462 0.1269 sec/batch\n", + "Epoch 4/20 Iteration 701/3560 Training loss: 1.8460 0.1287 sec/batch\n", + "Epoch 4/20 Iteration 702/3560 Training loss: 1.8459 0.1253 sec/batch\n", + "Epoch 4/20 Iteration 703/3560 Training loss: 1.8455 0.1265 sec/batch\n", + "Epoch 4/20 Iteration 704/3560 Training loss: 1.8452 0.1219 sec/batch\n", + "Epoch 4/20 Iteration 705/3560 Training loss: 1.8447 0.1210 sec/batch\n", + "Epoch 4/20 Iteration 706/3560 Training loss: 1.8442 0.1223 sec/batch\n", + "Epoch 4/20 Iteration 707/3560 Training loss: 1.8439 0.1221 sec/batch\n", + "Epoch 4/20 Iteration 708/3560 Training loss: 1.8436 0.1243 sec/batch\n", + "Epoch 4/20 Iteration 709/3560 Training loss: 1.8433 0.1228 sec/batch\n", + "Epoch 4/20 Iteration 710/3560 Training loss: 1.8429 0.1239 sec/batch\n", + "Epoch 4/20 Iteration 711/3560 Training loss: 1.8424 0.1252 sec/batch\n", + "Epoch 4/20 Iteration 712/3560 Training loss: 1.8421 0.1221 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 1.8467 0.1234 sec/batch\n", + "Epoch 5/20 Iteration 714/3560 Training loss: 1.8045 0.1215 sec/batch\n", + "Epoch 5/20 Iteration 715/3560 Training loss: 1.7901 0.1263 sec/batch\n", + "Epoch 5/20 Iteration 716/3560 Training loss: 1.7832 0.1238 sec/batch\n", + "Epoch 5/20 Iteration 717/3560 Training loss: 1.7804 0.1257 sec/batch\n", + "Epoch 5/20 Iteration 718/3560 Training loss: 1.7698 0.1231 sec/batch\n", + "Epoch 5/20 Iteration 719/3560 Training loss: 1.7688 0.1290 sec/batch\n", + "Epoch 5/20 Iteration 720/3560 Training loss: 1.7676 0.1273 sec/batch\n", + "Epoch 5/20 Iteration 721/3560 Training loss: 1.7703 0.1272 sec/batch\n", + "Epoch 5/20 Iteration 722/3560 Training loss: 1.7703 0.1260 sec/batch\n", + "Epoch 5/20 Iteration 723/3560 Training loss: 1.7662 0.1225 sec/batch\n", + "Epoch 5/20 Iteration 724/3560 Training loss: 1.7638 0.1241 sec/batch\n", + "Epoch 5/20 Iteration 725/3560 Training loss: 1.7633 0.1217 sec/batch\n", + "Epoch 5/20 Iteration 726/3560 Training loss: 1.7661 
0.1219 sec/batch\n", + "Epoch 5/20 Iteration 727/3560 Training loss: 1.7652 0.1250 sec/batch\n", + "Epoch 5/20 Iteration 728/3560 Training loss: 1.7639 0.1305 sec/batch\n", + "Epoch 5/20 Iteration 729/3560 Training loss: 1.7633 0.1230 sec/batch\n", + "Epoch 5/20 Iteration 730/3560 Training loss: 1.7647 0.1219 sec/batch\n", + "Epoch 5/20 Iteration 731/3560 Training loss: 1.7644 0.1216 sec/batch\n", + "Epoch 5/20 Iteration 732/3560 Training loss: 1.7647 0.1236 sec/batch\n", + "Epoch 5/20 Iteration 733/3560 Training loss: 1.7634 0.1265 sec/batch\n", + "Epoch 5/20 Iteration 734/3560 Training loss: 1.7647 0.1243 sec/batch\n", + "Epoch 5/20 Iteration 735/3560 Training loss: 1.7638 0.1261 sec/batch\n", + "Epoch 5/20 Iteration 736/3560 Training loss: 1.7630 0.1245 sec/batch\n", + "Epoch 5/20 Iteration 737/3560 Training loss: 1.7628 0.1237 sec/batch\n", + "Epoch 5/20 Iteration 738/3560 Training loss: 1.7611 0.1267 sec/batch\n", + "Epoch 5/20 Iteration 739/3560 Training loss: 1.7594 0.1299 sec/batch\n", + "Epoch 5/20 Iteration 740/3560 Training loss: 1.7596 0.1286 sec/batch\n", + "Epoch 5/20 Iteration 741/3560 Training loss: 1.7602 0.1257 sec/batch\n", + "Epoch 5/20 Iteration 742/3560 Training loss: 1.7598 0.1247 sec/batch\n", + "Epoch 5/20 Iteration 743/3560 Training loss: 1.7595 0.1295 sec/batch\n", + "Epoch 5/20 Iteration 744/3560 Training loss: 1.7584 0.1273 sec/batch\n", + "Epoch 5/20 Iteration 745/3560 Training loss: 1.7584 0.1214 sec/batch\n", + "Epoch 5/20 Iteration 746/3560 Training loss: 1.7589 0.1212 sec/batch\n", + "Epoch 5/20 Iteration 747/3560 Training loss: 1.7582 0.1213 sec/batch\n", + "Epoch 5/20 Iteration 748/3560 Training loss: 1.7575 0.1204 sec/batch\n", + "Epoch 5/20 Iteration 749/3560 Training loss: 1.7567 0.1224 sec/batch\n", + "Epoch 5/20 Iteration 750/3560 Training loss: 1.7554 0.1216 sec/batch\n", + "Epoch 5/20 Iteration 751/3560 Training loss: 1.7539 0.1222 sec/batch\n", + "Epoch 5/20 Iteration 752/3560 Training loss: 1.7528 0.1290 sec/batch\n", + 
"Epoch 5/20 Iteration 753/3560 Training loss: 1.7520 0.1238 sec/batch\n", + "Epoch 5/20 Iteration 754/3560 Training loss: 1.7521 0.1255 sec/batch\n", + "Epoch 5/20 Iteration 755/3560 Training loss: 1.7511 0.1255 sec/batch\n", + "Epoch 5/20 Iteration 756/3560 Training loss: 1.7500 0.1268 sec/batch\n", + "Epoch 5/20 Iteration 757/3560 Training loss: 1.7499 0.1242 sec/batch\n", + "Epoch 5/20 Iteration 758/3560 Training loss: 1.7486 0.1248 sec/batch\n", + "Epoch 5/20 Iteration 759/3560 Training loss: 1.7482 0.1252 sec/batch\n", + "Epoch 5/20 Iteration 760/3560 Training loss: 1.7475 0.1247 sec/batch\n", + "Epoch 5/20 Iteration 761/3560 Training loss: 1.7467 0.1285 sec/batch\n", + "Epoch 5/20 Iteration 762/3560 Training loss: 1.7470 0.1230 sec/batch\n", + "Epoch 5/20 Iteration 763/3560 Training loss: 1.7462 0.1258 sec/batch\n", + "Epoch 5/20 Iteration 764/3560 Training loss: 1.7469 0.1209 sec/batch\n", + "Epoch 5/20 Iteration 765/3560 Training loss: 1.7463 0.1305 sec/batch\n", + "Epoch 5/20 Iteration 766/3560 Training loss: 1.7460 0.1231 sec/batch\n", + "Epoch 5/20 Iteration 767/3560 Training loss: 1.7454 0.1243 sec/batch\n", + "Epoch 5/20 Iteration 768/3560 Training loss: 1.7453 0.1218 sec/batch\n", + "Epoch 5/20 Iteration 769/3560 Training loss: 1.7453 0.1218 sec/batch\n", + "Epoch 5/20 Iteration 770/3560 Training loss: 1.7448 0.1220 sec/batch\n", + "Epoch 5/20 Iteration 771/3560 Training loss: 1.7440 0.1231 sec/batch\n", + "Epoch 5/20 Iteration 772/3560 Training loss: 1.7444 0.1225 sec/batch\n", + "Epoch 5/20 Iteration 773/3560 Training loss: 1.7441 0.1218 sec/batch\n", + "Epoch 5/20 Iteration 774/3560 Training loss: 1.7446 0.1249 sec/batch\n", + "Epoch 5/20 Iteration 775/3560 Training loss: 1.7448 0.1229 sec/batch\n", + "Epoch 5/20 Iteration 776/3560 Training loss: 1.7451 0.1217 sec/batch\n", + "Epoch 5/20 Iteration 777/3560 Training loss: 1.7450 0.1249 sec/batch\n", + "Epoch 5/20 Iteration 778/3560 Training loss: 1.7452 0.1248 sec/batch\n", + "Epoch 5/20 Iteration 
779/3560 Training loss: 1.7452 0.1341 sec/batch\n", + "Epoch 5/20 Iteration 780/3560 Training loss: 1.7446 0.1233 sec/batch\n", + "Epoch 5/20 Iteration 781/3560 Training loss: 1.7441 0.1238 sec/batch\n", + "Epoch 5/20 Iteration 782/3560 Training loss: 1.7439 0.1249 sec/batch\n", + "Epoch 5/20 Iteration 783/3560 Training loss: 1.7442 0.1248 sec/batch\n", + "Epoch 5/20 Iteration 784/3560 Training loss: 1.7441 0.1230 sec/batch\n", + "Epoch 5/20 Iteration 785/3560 Training loss: 1.7442 0.1221 sec/batch\n", + "Epoch 5/20 Iteration 786/3560 Training loss: 1.7437 0.1223 sec/batch\n", + "Epoch 5/20 Iteration 787/3560 Training loss: 1.7434 0.1289 sec/batch\n", + "Epoch 5/20 Iteration 788/3560 Training loss: 1.7433 0.1227 sec/batch\n", + "Epoch 5/20 Iteration 789/3560 Training loss: 1.7428 0.1214 sec/batch\n", + "Epoch 5/20 Iteration 790/3560 Training loss: 1.7426 0.1261 sec/batch\n", + "Epoch 5/20 Iteration 791/3560 Training loss: 1.7419 0.1233 sec/batch\n", + "Epoch 5/20 Iteration 792/3560 Training loss: 1.7414 0.1285 sec/batch\n", + "Epoch 5/20 Iteration 793/3560 Training loss: 1.7407 0.1309 sec/batch\n", + "Epoch 5/20 Iteration 794/3560 Training loss: 1.7405 0.1232 sec/batch\n", + "Epoch 5/20 Iteration 795/3560 Training loss: 1.7397 0.1264 sec/batch\n", + "Epoch 5/20 Iteration 796/3560 Training loss: 1.7395 0.1280 sec/batch\n", + "Epoch 5/20 Iteration 797/3560 Training loss: 1.7388 0.1243 sec/batch\n", + "Epoch 5/20 Iteration 798/3560 Training loss: 1.7381 0.1245 sec/batch\n", + "Epoch 5/20 Iteration 799/3560 Training loss: 1.7376 0.1237 sec/batch\n", + "Epoch 5/20 Iteration 800/3560 Training loss: 1.7370 0.1236 sec/batch\n", + "Epoch 5/20 Iteration 801/3560 Training loss: 1.7362 0.1240 sec/batch\n", + "Epoch 5/20 Iteration 802/3560 Training loss: 1.7361 0.1221 sec/batch\n", + "Epoch 5/20 Iteration 803/3560 Training loss: 1.7355 0.1232 sec/batch\n", + "Epoch 5/20 Iteration 804/3560 Training loss: 1.7351 0.1228 sec/batch\n", + "Epoch 5/20 Iteration 805/3560 Training loss: 
1.7344 0.1243 sec/batch\n", + "Epoch 5/20 Iteration 806/3560 Training loss: 1.7338 0.1216 sec/batch\n", + "Epoch 5/20 Iteration 807/3560 Training loss: 1.7331 0.1208 sec/batch\n", + "Epoch 5/20 Iteration 808/3560 Training loss: 1.7328 0.1254 sec/batch\n", + "Epoch 5/20 Iteration 809/3560 Training loss: 1.7324 0.1234 sec/batch\n", + "Epoch 5/20 Iteration 810/3560 Training loss: 1.7317 0.1237 sec/batch\n", + "Epoch 5/20 Iteration 811/3560 Training loss: 1.7311 0.1233 sec/batch\n", + "Epoch 5/20 Iteration 812/3560 Training loss: 1.7304 0.1245 sec/batch\n", + "Epoch 5/20 Iteration 813/3560 Training loss: 1.7301 0.1215 sec/batch\n", + "Epoch 5/20 Iteration 814/3560 Training loss: 1.7297 0.1214 sec/batch\n", + "Epoch 5/20 Iteration 815/3560 Training loss: 1.7293 0.1267 sec/batch\n", + "Epoch 5/20 Iteration 816/3560 Training loss: 1.7289 0.1350 sec/batch\n", + "Epoch 5/20 Iteration 817/3560 Training loss: 1.7284 0.1236 sec/batch\n", + "Epoch 5/20 Iteration 818/3560 Training loss: 1.7280 0.1270 sec/batch\n", + "Epoch 5/20 Iteration 819/3560 Training loss: 1.7275 0.1239 sec/batch\n", + "Epoch 5/20 Iteration 820/3560 Training loss: 1.7272 0.1240 sec/batch\n", + "Epoch 5/20 Iteration 821/3560 Training loss: 1.7269 0.1298 sec/batch\n", + "Epoch 5/20 Iteration 822/3560 Training loss: 1.7266 0.1232 sec/batch\n", + "Epoch 5/20 Iteration 823/3560 Training loss: 1.7263 0.1254 sec/batch\n", + "Epoch 5/20 Iteration 824/3560 Training loss: 1.7258 0.1215 sec/batch\n", + "Epoch 5/20 Iteration 825/3560 Training loss: 1.7254 0.1280 sec/batch\n", + "Epoch 5/20 Iteration 826/3560 Training loss: 1.7251 0.1236 sec/batch\n", + "Epoch 5/20 Iteration 827/3560 Training loss: 1.7245 0.1254 sec/batch\n", + "Epoch 5/20 Iteration 828/3560 Training loss: 1.7240 0.1219 sec/batch\n", + "Epoch 5/20 Iteration 829/3560 Training loss: 1.7236 0.1242 sec/batch\n", + "Epoch 5/20 Iteration 830/3560 Training loss: 1.7232 0.1219 sec/batch\n", + "Epoch 5/20 Iteration 831/3560 Training loss: 1.7229 0.1222 
sec/batch\n", + "Epoch 5/20 Iteration 832/3560 Training loss: 1.7225 0.1241 sec/batch\n", + "Epoch 5/20 Iteration 833/3560 Training loss: 1.7221 0.1247 sec/batch\n", + "Epoch 5/20 Iteration 834/3560 Training loss: 1.7214 0.1253 sec/batch\n", + "Epoch 5/20 Iteration 835/3560 Training loss: 1.7208 0.1257 sec/batch\n", + "Epoch 5/20 Iteration 836/3560 Training loss: 1.7206 0.1270 sec/batch\n", + "Epoch 5/20 Iteration 837/3560 Training loss: 1.7203 0.1241 sec/batch\n", + "Epoch 5/20 Iteration 838/3560 Training loss: 1.7197 0.1237 sec/batch\n", + "Epoch 5/20 Iteration 839/3560 Training loss: 1.7195 0.1206 sec/batch\n", + "Epoch 5/20 Iteration 840/3560 Training loss: 1.7192 0.1288 sec/batch\n", + "Epoch 5/20 Iteration 841/3560 Training loss: 1.7189 0.1215 sec/batch\n", + "Epoch 5/20 Iteration 842/3560 Training loss: 1.7184 0.1230 sec/batch\n", + "Epoch 5/20 Iteration 843/3560 Training loss: 1.7178 0.1212 sec/batch\n", + "Epoch 5/20 Iteration 844/3560 Training loss: 1.7172 0.1221 sec/batch\n", + "Epoch 5/20 Iteration 845/3560 Training loss: 1.7170 0.1243 sec/batch\n", + "Epoch 5/20 Iteration 846/3560 Training loss: 1.7167 0.1236 sec/batch\n", + "Epoch 5/20 Iteration 847/3560 Training loss: 1.7165 0.1218 sec/batch\n", + "Epoch 5/20 Iteration 848/3560 Training loss: 1.7163 0.1229 sec/batch\n", + "Epoch 5/20 Iteration 849/3560 Training loss: 1.7161 0.1237 sec/batch\n", + "Epoch 5/20 Iteration 850/3560 Training loss: 1.7161 0.1222 sec/batch\n", + "Epoch 5/20 Iteration 851/3560 Training loss: 1.7160 0.1243 sec/batch\n", + "Epoch 5/20 Iteration 852/3560 Training loss: 1.7158 0.1251 sec/batch\n", + "Epoch 5/20 Iteration 853/3560 Training loss: 1.7158 0.1222 sec/batch\n", + "Epoch 5/20 Iteration 854/3560 Training loss: 1.7155 0.1270 sec/batch\n", + "Epoch 5/20 Iteration 855/3560 Training loss: 1.7152 0.1265 sec/batch\n", + "Epoch 5/20 Iteration 856/3560 Training loss: 1.7151 0.1343 sec/batch\n", + "Epoch 5/20 Iteration 857/3560 Training loss: 1.7147 0.1226 sec/batch\n", + "Epoch 
5/20 Iteration 858/3560 Training loss: 1.7146 0.1230 sec/batch\n", + "Epoch 5/20 Iteration 859/3560 Training loss: 1.7144 0.1247 sec/batch\n", + "Epoch 5/20 Iteration 860/3560 Training loss: 1.7144 0.1240 sec/batch\n", + "Epoch 5/20 Iteration 861/3560 Training loss: 1.7142 0.1224 sec/batch\n", + "Epoch 5/20 Iteration 862/3560 Training loss: 1.7139 0.1217 sec/batch\n", + "Epoch 5/20 Iteration 863/3560 Training loss: 1.7134 0.1227 sec/batch\n", + "Epoch 5/20 Iteration 864/3560 Training loss: 1.7132 0.1215 sec/batch\n", + "Epoch 5/20 Iteration 865/3560 Training loss: 1.7130 0.1217 sec/batch\n", + "Epoch 5/20 Iteration 866/3560 Training loss: 1.7127 0.1248 sec/batch\n", + "Epoch 5/20 Iteration 867/3560 Training loss: 1.7124 0.1214 sec/batch\n", + "Epoch 5/20 Iteration 868/3560 Training loss: 1.7121 0.1210 sec/batch\n", + "Epoch 5/20 Iteration 869/3560 Training loss: 1.7120 0.1231 sec/batch\n", + "Epoch 5/20 Iteration 870/3560 Training loss: 1.7117 0.1216 sec/batch\n", + "Epoch 5/20 Iteration 871/3560 Training loss: 1.7112 0.1256 sec/batch\n", + "Epoch 5/20 Iteration 872/3560 Training loss: 1.7110 0.1207 sec/batch\n", + "Epoch 5/20 Iteration 873/3560 Training loss: 1.7109 0.1237 sec/batch\n", + "Epoch 5/20 Iteration 874/3560 Training loss: 1.7107 0.1254 sec/batch\n", + "Epoch 5/20 Iteration 875/3560 Training loss: 1.7105 0.1282 sec/batch\n", + "Epoch 5/20 Iteration 876/3560 Training loss: 1.7103 0.1214 sec/batch\n", + "Epoch 5/20 Iteration 877/3560 Training loss: 1.7100 0.1239 sec/batch\n", + "Epoch 5/20 Iteration 878/3560 Training loss: 1.7097 0.1246 sec/batch\n", + "Epoch 5/20 Iteration 879/3560 Training loss: 1.7095 0.1238 sec/batch\n", + "Epoch 5/20 Iteration 880/3560 Training loss: 1.7097 0.1220 sec/batch\n", + "Epoch 5/20 Iteration 881/3560 Training loss: 1.7094 0.1232 sec/batch\n", + "Epoch 5/20 Iteration 882/3560 Training loss: 1.7090 0.1235 sec/batch\n", + "Epoch 5/20 Iteration 883/3560 Training loss: 1.7087 0.1227 sec/batch\n", + "Epoch 5/20 Iteration 884/3560 
Training loss: 1.7083 0.1221 sec/batch\n", + "Epoch 5/20 Iteration 885/3560 Training loss: 1.7082 0.1220 sec/batch\n", + "Epoch 5/20 Iteration 886/3560 Training loss: 1.7079 0.1220 sec/batch\n", + "Epoch 5/20 Iteration 887/3560 Training loss: 1.7077 0.1243 sec/batch\n", + "Epoch 5/20 Iteration 888/3560 Training loss: 1.7074 0.1242 sec/batch\n", + "Epoch 5/20 Iteration 889/3560 Training loss: 1.7070 0.1216 sec/batch\n", + "Epoch 5/20 Iteration 890/3560 Training loss: 1.7069 0.1227 sec/batch\n", + "Epoch 6/20 Iteration 891/3560 Training loss: 1.7389 0.1281 sec/batch\n", + "Epoch 6/20 Iteration 892/3560 Training loss: 1.6958 0.1234 sec/batch\n", + "Epoch 6/20 Iteration 893/3560 Training loss: 1.6820 0.1239 sec/batch\n", + "Epoch 6/20 Iteration 894/3560 Training loss: 1.6758 0.1250 sec/batch\n", + "Epoch 6/20 Iteration 895/3560 Training loss: 1.6693 0.1250 sec/batch\n", + "Epoch 6/20 Iteration 896/3560 Training loss: 1.6598 0.1221 sec/batch\n", + "Epoch 6/20 Iteration 897/3560 Training loss: 1.6591 0.1231 sec/batch\n", + "Epoch 6/20 Iteration 898/3560 Training loss: 1.6567 0.1269 sec/batch\n", + "Epoch 6/20 Iteration 899/3560 Training loss: 1.6573 0.1229 sec/batch\n", + "Epoch 6/20 Iteration 900/3560 Training loss: 1.6560 0.1376 sec/batch\n", + "Epoch 6/20 Iteration 901/3560 Training loss: 1.6519 0.1233 sec/batch\n", + "Epoch 6/20 Iteration 902/3560 Training loss: 1.6510 0.1222 sec/batch\n", + "Epoch 6/20 Iteration 903/3560 Training loss: 1.6510 0.1337 sec/batch\n", + "Epoch 6/20 Iteration 904/3560 Training loss: 1.6534 0.1212 sec/batch\n", + "Epoch 6/20 Iteration 905/3560 Training loss: 1.6524 0.1221 sec/batch\n", + "Epoch 6/20 Iteration 906/3560 Training loss: 1.6511 0.1261 sec/batch\n", + "Epoch 6/20 Iteration 907/3560 Training loss: 1.6503 0.1280 sec/batch\n", + "Epoch 6/20 Iteration 908/3560 Training loss: 1.6522 0.1218 sec/batch\n", + "Epoch 6/20 Iteration 909/3560 Training loss: 1.6520 0.1234 sec/batch\n", + "Epoch 6/20 Iteration 910/3560 Training loss: 1.6526 
0.1265 sec/batch\n", + "Epoch 6/20 Iteration 911/3560 Training loss: 1.6512 0.1232 sec/batch\n", + "Epoch 6/20 Iteration 912/3560 Training loss: 1.6518 0.1225 sec/batch\n", + "Epoch 6/20 Iteration 913/3560 Training loss: 1.6512 0.1221 sec/batch\n", + "Epoch 6/20 Iteration 914/3560 Training loss: 1.6505 0.1252 sec/batch\n", + "Epoch 6/20 Iteration 915/3560 Training loss: 1.6506 0.1224 sec/batch\n", + "Epoch 6/20 Iteration 916/3560 Training loss: 1.6491 0.1244 sec/batch\n", + "Epoch 6/20 Iteration 917/3560 Training loss: 1.6477 0.1237 sec/batch\n", + "Epoch 6/20 Iteration 918/3560 Training loss: 1.6482 0.1225 sec/batch\n", + "Epoch 6/20 Iteration 919/3560 Training loss: 1.6489 0.1220 sec/batch\n", + "Epoch 6/20 Iteration 920/3560 Training loss: 1.6487 0.1242 sec/batch\n", + "Epoch 6/20 Iteration 921/3560 Training loss: 1.6486 0.1237 sec/batch\n", + "Epoch 6/20 Iteration 922/3560 Training loss: 1.6473 0.1226 sec/batch\n", + "Epoch 6/20 Iteration 923/3560 Training loss: 1.6476 0.1220 sec/batch\n", + "Epoch 6/20 Iteration 924/3560 Training loss: 1.6480 0.1226 sec/batch\n", + "Epoch 6/20 Iteration 925/3560 Training loss: 1.6474 0.1243 sec/batch\n", + "Epoch 6/20 Iteration 926/3560 Training loss: 1.6471 0.1219 sec/batch\n", + "Epoch 6/20 Iteration 927/3560 Training loss: 1.6461 0.1212 sec/batch\n", + "Epoch 6/20 Iteration 928/3560 Training loss: 1.6448 0.1214 sec/batch\n", + "Epoch 6/20 Iteration 929/3560 Training loss: 1.6431 0.1227 sec/batch\n", + "Epoch 6/20 Iteration 930/3560 Training loss: 1.6424 0.1236 sec/batch\n", + "Epoch 6/20 Iteration 931/3560 Training loss: 1.6416 0.1216 sec/batch\n", + "Epoch 6/20 Iteration 932/3560 Training loss: 1.6420 0.1249 sec/batch\n", + "Epoch 6/20 Iteration 933/3560 Training loss: 1.6412 0.1252 sec/batch\n", + "Epoch 6/20 Iteration 934/3560 Training loss: 1.6401 0.1235 sec/batch\n", + "Epoch 6/20 Iteration 935/3560 Training loss: 1.6401 0.1216 sec/batch\n", + "Epoch 6/20 Iteration 936/3560 Training loss: 1.6389 0.1264 sec/batch\n", + 
"Epoch 6/20 Iteration 937/3560 Training loss: 1.6384 0.1221 sec/batch\n", + "Epoch 6/20 Iteration 938/3560 Training loss: 1.6379 0.1241 sec/batch\n", + "Epoch 6/20 Iteration 939/3560 Training loss: 1.6374 0.1223 sec/batch\n", + "Epoch 6/20 Iteration 940/3560 Training loss: 1.6379 0.1202 sec/batch\n", + "Epoch 6/20 Iteration 941/3560 Training loss: 1.6371 0.1224 sec/batch\n", + "Epoch 6/20 Iteration 942/3560 Training loss: 1.6377 0.1214 sec/batch\n", + "Epoch 6/20 Iteration 943/3560 Training loss: 1.6374 0.1240 sec/batch\n", + "Epoch 6/20 Iteration 944/3560 Training loss: 1.6373 0.1221 sec/batch\n", + "Epoch 6/20 Iteration 945/3560 Training loss: 1.6370 0.1212 sec/batch\n", + "Epoch 6/20 Iteration 946/3560 Training loss: 1.6370 0.1250 sec/batch\n", + "Epoch 6/20 Iteration 947/3560 Training loss: 1.6373 0.1251 sec/batch\n", + "Epoch 6/20 Iteration 948/3560 Training loss: 1.6369 0.1234 sec/batch\n", + "Epoch 6/20 Iteration 949/3560 Training loss: 1.6362 0.1225 sec/batch\n", + "Epoch 6/20 Iteration 950/3560 Training loss: 1.6365 0.1257 sec/batch\n", + "Epoch 6/20 Iteration 951/3560 Training loss: 1.6364 0.1255 sec/batch\n", + "Epoch 6/20 Iteration 952/3560 Training loss: 1.6371 0.1253 sec/batch\n", + "Epoch 6/20 Iteration 953/3560 Training loss: 1.6374 0.1234 sec/batch\n", + "Epoch 6/20 Iteration 954/3560 Training loss: 1.6375 0.1289 sec/batch\n", + "Epoch 6/20 Iteration 955/3560 Training loss: 1.6373 0.1284 sec/batch\n", + "Epoch 6/20 Iteration 956/3560 Training loss: 1.6373 0.1222 sec/batch\n", + "Epoch 6/20 Iteration 957/3560 Training loss: 1.6372 0.1263 sec/batch\n", + "Epoch 6/20 Iteration 958/3560 Training loss: 1.6366 0.1249 sec/batch\n", + "Epoch 6/20 Iteration 959/3560 Training loss: 1.6363 0.1262 sec/batch\n", + "Epoch 6/20 Iteration 960/3560 Training loss: 1.6360 0.1277 sec/batch\n", + "Epoch 6/20 Iteration 961/3560 Training loss: 1.6364 0.1317 sec/batch\n", + "Epoch 6/20 Iteration 962/3560 Training loss: 1.6365 0.1307 sec/batch\n", + "Epoch 6/20 Iteration 
963/3560 Training loss: 1.6367 0.1277 sec/batch\n", + "Epoch 6/20 Iteration 964/3560 Training loss: 1.6363 0.1296 sec/batch\n", + "Epoch 6/20 Iteration 965/3560 Training loss: 1.6360 0.1237 sec/batch\n", + "Epoch 6/20 Iteration 966/3560 Training loss: 1.6360 0.1272 sec/batch\n", + "Epoch 6/20 Iteration 967/3560 Training loss: 1.6357 0.1277 sec/batch\n", + "Epoch 6/20 Iteration 968/3560 Training loss: 1.6356 0.1210 sec/batch\n", + "Epoch 6/20 Iteration 969/3560 Training loss: 1.6349 0.1287 sec/batch\n", + "Epoch 6/20 Iteration 970/3560 Training loss: 1.6346 0.1221 sec/batch\n", + "Epoch 6/20 Iteration 971/3560 Training loss: 1.6338 0.1236 sec/batch\n", + "Epoch 6/20 Iteration 972/3560 Training loss: 1.6338 0.1215 sec/batch\n", + "Epoch 6/20 Iteration 973/3560 Training loss: 1.6330 0.1245 sec/batch\n", + "Epoch 6/20 Iteration 974/3560 Training loss: 1.6328 0.1238 sec/batch\n", + "Epoch 6/20 Iteration 975/3560 Training loss: 1.6323 0.1273 sec/batch\n", + "Epoch 6/20 Iteration 976/3560 Training loss: 1.6319 0.1252 sec/batch\n", + "Epoch 6/20 Iteration 977/3560 Training loss: 1.6315 0.1280 sec/batch\n", + "Epoch 6/20 Iteration 978/3560 Training loss: 1.6310 0.1288 sec/batch\n", + "Epoch 6/20 Iteration 979/3560 Training loss: 1.6304 0.1328 sec/batch\n", + "Epoch 6/20 Iteration 980/3560 Training loss: 1.6303 0.1260 sec/batch\n", + "Epoch 6/20 Iteration 981/3560 Training loss: 1.6299 0.1245 sec/batch\n", + "Epoch 6/20 Iteration 982/3560 Training loss: 1.6296 0.1253 sec/batch\n", + "Epoch 6/20 Iteration 983/3560 Training loss: 1.6291 0.1259 sec/batch\n", + "Epoch 6/20 Iteration 984/3560 Training loss: 1.6287 0.1215 sec/batch\n", + "Epoch 6/20 Iteration 985/3560 Training loss: 1.6282 0.1228 sec/batch\n", + "Epoch 6/20 Iteration 986/3560 Training loss: 1.6280 0.1252 sec/batch\n", + "Epoch 6/20 Iteration 987/3560 Training loss: 1.6277 0.1249 sec/batch\n", + "Epoch 6/20 Iteration 988/3560 Training loss: 1.6270 0.1241 sec/batch\n", + "Epoch 6/20 Iteration 989/3560 Training loss: 
1.6264 0.1214 sec/batch\n", + "Epoch 6/20 Iteration 990/3560 Training loss: 1.6257 0.1217 sec/batch\n", + "Epoch 6/20 Iteration 991/3560 Training loss: 1.6257 0.1230 sec/batch\n", + "Epoch 6/20 Iteration 992/3560 Training loss: 1.6254 0.1238 sec/batch\n", + "Epoch 6/20 Iteration 993/3560 Training loss: 1.6249 0.1237 sec/batch\n", + "Epoch 6/20 Iteration 994/3560 Training loss: 1.6246 0.1285 sec/batch\n", + "Epoch 6/20 Iteration 995/3560 Training loss: 1.6243 0.1236 sec/batch\n", + "Epoch 6/20 Iteration 996/3560 Training loss: 1.6239 0.1273 sec/batch\n", + "Epoch 6/20 Iteration 997/3560 Training loss: 1.6236 0.1261 sec/batch\n", + "Epoch 6/20 Iteration 998/3560 Training loss: 1.6234 0.1235 sec/batch\n", + "Epoch 6/20 Iteration 999/3560 Training loss: 1.6232 0.1220 sec/batch\n", + "Epoch 6/20 Iteration 1000/3560 Training loss: 1.6231 0.1226 sec/batch\n", + "Epoch 6/20 Iteration 1001/3560 Training loss: 1.6228 0.1259 sec/batch\n", + "Epoch 6/20 Iteration 1002/3560 Training loss: 1.6225 0.1241 sec/batch\n", + "Epoch 6/20 Iteration 1003/3560 Training loss: 1.6222 0.1236 sec/batch\n", + "Epoch 6/20 Iteration 1004/3560 Training loss: 1.6219 0.1223 sec/batch\n", + "Epoch 6/20 Iteration 1005/3560 Training loss: 1.6215 0.1254 sec/batch\n", + "Epoch 6/20 Iteration 1006/3560 Training loss: 1.6210 0.1297 sec/batch\n", + "Epoch 6/20 Iteration 1007/3560 Training loss: 1.6207 0.1249 sec/batch\n", + "Epoch 6/20 Iteration 1008/3560 Training loss: 1.6204 0.1233 sec/batch\n", + "Epoch 6/20 Iteration 1009/3560 Training loss: 1.6201 0.1230 sec/batch\n", + "Epoch 6/20 Iteration 1010/3560 Training loss: 1.6198 0.1241 sec/batch\n", + "Epoch 6/20 Iteration 1011/3560 Training loss: 1.6195 0.1223 sec/batch\n", + "Epoch 6/20 Iteration 1012/3560 Training loss: 1.6190 0.1225 sec/batch\n", + "Epoch 6/20 Iteration 1013/3560 Training loss: 1.6185 0.1244 sec/batch\n", + "Epoch 6/20 Iteration 1014/3560 Training loss: 1.6183 0.1223 sec/batch\n", + "Epoch 6/20 Iteration 1015/3560 Training loss: 1.6181 
0.1238 sec/batch\n", + "Epoch 6/20 Iteration 1016/3560 Training loss: 1.6176 0.1282 sec/batch\n", + "Epoch 6/20 Iteration 1017/3560 Training loss: 1.6175 0.1246 sec/batch\n", + "Epoch 6/20 Iteration 1018/3560 Training loss: 1.6173 0.1250 sec/batch\n", + "Epoch 6/20 Iteration 1019/3560 Training loss: 1.6170 0.1228 sec/batch\n", + "Epoch 6/20 Iteration 1020/3560 Training loss: 1.6166 0.1261 sec/batch\n", + "Epoch 6/20 Iteration 1021/3560 Training loss: 1.6161 0.1235 sec/batch\n", + "Epoch 6/20 Iteration 1022/3560 Training loss: 1.6156 0.1247 sec/batch\n", + "Epoch 6/20 Iteration 1023/3560 Training loss: 1.6155 0.1239 sec/batch\n", + "Epoch 6/20 Iteration 1024/3560 Training loss: 1.6153 0.1253 sec/batch\n", + "Epoch 6/20 Iteration 1025/3560 Training loss: 1.6151 0.1229 sec/batch\n", + "Epoch 6/20 Iteration 1026/3560 Training loss: 1.6150 0.1223 sec/batch\n", + "Epoch 6/20 Iteration 1027/3560 Training loss: 1.6149 0.1253 sec/batch\n", + "Epoch 6/20 Iteration 1028/3560 Training loss: 1.6148 0.1265 sec/batch\n", + "Epoch 6/20 Iteration 1029/3560 Training loss: 1.6147 0.1230 sec/batch\n", + "Epoch 6/20 Iteration 1030/3560 Training loss: 1.6145 0.1214 sec/batch\n", + "Epoch 6/20 Iteration 1031/3560 Training loss: 1.6146 0.1241 sec/batch\n", + "Epoch 6/20 Iteration 1032/3560 Training loss: 1.6143 0.1292 sec/batch\n", + "Epoch 6/20 Iteration 1033/3560 Training loss: 1.6141 0.1238 sec/batch\n", + "Epoch 6/20 Iteration 1034/3560 Training loss: 1.6140 0.1268 sec/batch\n", + "Epoch 6/20 Iteration 1035/3560 Training loss: 1.6137 0.1260 sec/batch\n", + "Epoch 6/20 Iteration 1036/3560 Training loss: 1.6137 0.1290 sec/batch\n", + "Epoch 6/20 Iteration 1037/3560 Training loss: 1.6135 0.1244 sec/batch\n", + "Epoch 6/20 Iteration 1038/3560 Training loss: 1.6135 0.1224 sec/batch\n", + "Epoch 6/20 Iteration 1039/3560 Training loss: 1.6133 0.1232 sec/batch\n", + "Epoch 6/20 Iteration 1040/3560 Training loss: 1.6130 0.1294 sec/batch\n", + "Epoch 6/20 Iteration 1041/3560 Training loss: 
1.6125 0.1308 sec/batch\n", + "Epoch 6/20 Iteration 1042/3560 Training loss: 1.6124 0.1185 sec/batch\n", + "Epoch 6/20 Iteration 1043/3560 Training loss: 1.6123 0.1225 sec/batch\n", + "Epoch 6/20 Iteration 1044/3560 Training loss: 1.6121 0.1242 sec/batch\n", + "Epoch 6/20 Iteration 1045/3560 Training loss: 1.6119 0.1263 sec/batch\n", + "Epoch 6/20 Iteration 1046/3560 Training loss: 1.6117 0.1220 sec/batch\n", + "Epoch 6/20 Iteration 1047/3560 Training loss: 1.6117 0.1254 sec/batch\n", + "Epoch 6/20 Iteration 1048/3560 Training loss: 1.6114 0.1243 sec/batch\n", + "Epoch 6/20 Iteration 1049/3560 Training loss: 1.6110 0.1231 sec/batch\n", + "Epoch 6/20 Iteration 1050/3560 Training loss: 1.6109 0.1234 sec/batch\n", + "Epoch 6/20 Iteration 1051/3560 Training loss: 1.6108 0.1235 sec/batch\n", + "Epoch 6/20 Iteration 1052/3560 Training loss: 1.6106 0.1258 sec/batch\n", + "Epoch 6/20 Iteration 1053/3560 Training loss: 1.6106 0.1240 sec/batch\n", + "Epoch 6/20 Iteration 1054/3560 Training loss: 1.6104 0.1289 sec/batch\n", + "Epoch 6/20 Iteration 1055/3560 Training loss: 1.6102 0.1278 sec/batch\n", + "Epoch 6/20 Iteration 1056/3560 Training loss: 1.6100 0.1263 sec/batch\n", + "Epoch 6/20 Iteration 1057/3560 Training loss: 1.6099 0.1224 sec/batch\n", + "Epoch 6/20 Iteration 1058/3560 Training loss: 1.6102 0.1220 sec/batch\n", + "Epoch 6/20 Iteration 1059/3560 Training loss: 1.6100 0.1244 sec/batch\n", + "Epoch 6/20 Iteration 1060/3560 Training loss: 1.6098 0.1223 sec/batch\n", + "Epoch 6/20 Iteration 1061/3560 Training loss: 1.6096 0.1257 sec/batch\n", + "Epoch 6/20 Iteration 1062/3560 Training loss: 1.6093 0.1219 sec/batch\n", + "Epoch 6/20 Iteration 1063/3560 Training loss: 1.6092 0.1261 sec/batch\n", + "Epoch 6/20 Iteration 1064/3560 Training loss: 1.6090 0.1293 sec/batch\n", + "Epoch 6/20 Iteration 1065/3560 Training loss: 1.6090 0.1242 sec/batch\n", + "Epoch 6/20 Iteration 1066/3560 Training loss: 1.6087 0.1227 sec/batch\n", + "Epoch 6/20 Iteration 1067/3560 Training 
loss: 1.6084 0.1229 sec/batch\n", + "Epoch 6/20 Iteration 1068/3560 Training loss: 1.6084 0.1216 sec/batch\n", + "Epoch 7/20 Iteration 1069/3560 Training loss: 1.6609 0.1219 sec/batch\n", + "Epoch 7/20 Iteration 1070/3560 Training loss: 1.6207 0.1259 sec/batch\n", + "Epoch 7/20 Iteration 1071/3560 Training loss: 1.6018 0.1218 sec/batch\n", + "Epoch 7/20 Iteration 1072/3560 Training loss: 1.5948 0.1262 sec/batch\n", + "Epoch 7/20 Iteration 1073/3560 Training loss: 1.5896 0.1221 sec/batch\n", + "Epoch 7/20 Iteration 1074/3560 Training loss: 1.5782 0.1227 sec/batch\n", + "Epoch 7/20 Iteration 1075/3560 Training loss: 1.5775 0.1321 sec/batch\n", + "Epoch 7/20 Iteration 1076/3560 Training loss: 1.5758 0.1224 sec/batch\n", + "Epoch 7/20 Iteration 1077/3560 Training loss: 1.5759 0.1248 sec/batch\n", + "Epoch 7/20 Iteration 1078/3560 Training loss: 1.5748 0.1254 sec/batch\n", + "Epoch 7/20 Iteration 1079/3560 Training loss: 1.5709 0.1237 sec/batch\n", + "Epoch 7/20 Iteration 1080/3560 Training loss: 1.5700 0.1270 sec/batch\n", + "Epoch 7/20 Iteration 1081/3560 Training loss: 1.5703 0.1280 sec/batch\n", + "Epoch 7/20 Iteration 1082/3560 Training loss: 1.5728 0.1223 sec/batch\n", + "Epoch 7/20 Iteration 1083/3560 Training loss: 1.5716 0.1246 sec/batch\n", + "Epoch 7/20 Iteration 1084/3560 Training loss: 1.5702 0.1259 sec/batch\n", + "Epoch 7/20 Iteration 1085/3560 Training loss: 1.5697 0.1245 sec/batch\n", + "Epoch 7/20 Iteration 1086/3560 Training loss: 1.5709 0.1237 sec/batch\n", + "Epoch 7/20 Iteration 1087/3560 Training loss: 1.5706 0.1257 sec/batch\n", + "Epoch 7/20 Iteration 1088/3560 Training loss: 1.5716 0.1250 sec/batch\n", + "Epoch 7/20 Iteration 1089/3560 Training loss: 1.5703 0.1165 sec/batch\n", + "Epoch 7/20 Iteration 1090/3560 Training loss: 1.5710 0.1183 sec/batch\n", + "Epoch 7/20 Iteration 1091/3560 Training loss: 1.5698 0.1203 sec/batch\n", + "Epoch 7/20 Iteration 1092/3560 Training loss: 1.5696 0.1322 sec/batch\n", + "Epoch 7/20 Iteration 1093/3560 
Training loss: 1.5692 0.1162 sec/batch\n", + "Epoch 7/20 Iteration 1094/3560 Training loss: 1.5677 0.1242 sec/batch\n", + "Epoch 7/20 Iteration 1095/3560 Training loss: 1.5666 0.1232 sec/batch\n", + "Epoch 7/20 Iteration 1096/3560 Training loss: 1.5669 0.1229 sec/batch\n", + "Epoch 7/20 Iteration 1097/3560 Training loss: 1.5672 0.1215 sec/batch\n", + "Epoch 7/20 Iteration 1098/3560 Training loss: 1.5674 0.1216 sec/batch\n", + "Epoch 7/20 Iteration 1099/3560 Training loss: 1.5672 0.1231 sec/batch\n", + "Epoch 7/20 Iteration 1100/3560 Training loss: 1.5661 0.1224 sec/batch\n", + "Epoch 7/20 Iteration 1101/3560 Training loss: 1.5665 0.1207 sec/batch\n", + "Epoch 7/20 Iteration 1102/3560 Training loss: 1.5668 0.1240 sec/batch\n", + "Epoch 7/20 Iteration 1103/3560 Training loss: 1.5664 0.1268 sec/batch\n", + "Epoch 7/20 Iteration 1104/3560 Training loss: 1.5663 0.1251 sec/batch\n", + "Epoch 7/20 Iteration 1105/3560 Training loss: 1.5655 0.1195 sec/batch\n", + "Epoch 7/20 Iteration 1106/3560 Training loss: 1.5641 0.1264 sec/batch\n", + "Epoch 7/20 Iteration 1107/3560 Training loss: 1.5627 0.1228 sec/batch\n", + "Epoch 7/20 Iteration 1108/3560 Training loss: 1.5622 0.1239 sec/batch\n", + "Epoch 7/20 Iteration 1109/3560 Training loss: 1.5616 0.1246 sec/batch\n", + "Epoch 7/20 Iteration 1110/3560 Training loss: 1.5619 0.1359 sec/batch\n", + "Epoch 7/20 Iteration 1111/3560 Training loss: 1.5611 0.1302 sec/batch\n", + "Epoch 7/20 Iteration 1112/3560 Training loss: 1.5601 0.1212 sec/batch\n", + "Epoch 7/20 Iteration 1113/3560 Training loss: 1.5603 0.1228 sec/batch\n", + "Epoch 7/20 Iteration 1114/3560 Training loss: 1.5592 0.1278 sec/batch\n", + "Epoch 7/20 Iteration 1115/3560 Training loss: 1.5588 0.1248 sec/batch\n", + "Epoch 7/20 Iteration 1116/3560 Training loss: 1.5583 0.1239 sec/batch\n", + "Epoch 7/20 Iteration 1117/3560 Training loss: 1.5578 0.1254 sec/batch\n", + "Epoch 7/20 Iteration 1118/3560 Training loss: 1.5583 0.1216 sec/batch\n", + "Epoch 7/20 Iteration 
1119/3560 Training loss: 1.5578 0.1245 sec/batch\n", + "Epoch 7/20 Iteration 1120/3560 Training loss: 1.5584 0.1213 sec/batch\n", + "Epoch 7/20 Iteration 1121/3560 Training loss: 1.5580 0.1264 sec/batch\n", + "Epoch 7/20 Iteration 1122/3560 Training loss: 1.5580 0.1208 sec/batch\n", + "Epoch 7/20 Iteration 1123/3560 Training loss: 1.5576 0.1223 sec/batch\n", + "Epoch 7/20 Iteration 1124/3560 Training loss: 1.5575 0.1314 sec/batch\n", + "Epoch 7/20 Iteration 1125/3560 Training loss: 1.5578 0.1238 sec/batch\n", + "Epoch 7/20 Iteration 1126/3560 Training loss: 1.5573 0.1300 sec/batch\n", + "Epoch 7/20 Iteration 1127/3560 Training loss: 1.5567 0.1237 sec/batch\n", + "Epoch 7/20 Iteration 1128/3560 Training loss: 1.5571 0.1232 sec/batch\n", + "Epoch 7/20 Iteration 1129/3560 Training loss: 1.5570 0.1272 sec/batch\n", + "Epoch 7/20 Iteration 1130/3560 Training loss: 1.5577 0.1219 sec/batch\n", + "Epoch 7/20 Iteration 1131/3560 Training loss: 1.5580 0.1199 sec/batch\n", + "Epoch 7/20 Iteration 1132/3560 Training loss: 1.5581 0.1252 sec/batch\n", + "Epoch 7/20 Iteration 1133/3560 Training loss: 1.5578 0.1218 sec/batch\n", + "Epoch 7/20 Iteration 1134/3560 Training loss: 1.5578 0.1223 sec/batch\n", + "Epoch 7/20 Iteration 1135/3560 Training loss: 1.5577 0.1224 sec/batch\n", + "Epoch 7/20 Iteration 1136/3560 Training loss: 1.5573 0.1245 sec/batch\n", + "Epoch 7/20 Iteration 1137/3560 Training loss: 1.5571 0.1229 sec/batch\n", + "Epoch 7/20 Iteration 1138/3560 Training loss: 1.5568 0.1223 sec/batch\n", + "Epoch 7/20 Iteration 1139/3560 Training loss: 1.5572 0.1281 sec/batch\n", + "Epoch 7/20 Iteration 1140/3560 Training loss: 1.5573 0.1295 sec/batch\n", + "Epoch 7/20 Iteration 1141/3560 Training loss: 1.5576 0.1238 sec/batch\n", + "Epoch 7/20 Iteration 1142/3560 Training loss: 1.5572 0.1240 sec/batch\n", + "Epoch 7/20 Iteration 1143/3560 Training loss: 1.5570 0.1331 sec/batch\n", + "Epoch 7/20 Iteration 1144/3560 Training loss: 1.5572 0.1275 sec/batch\n", + "Epoch 7/20 
Iteration 1145/3560 Training loss: 1.5568 0.1216 sec/batch\n", + "Epoch 7/20 Iteration 1146/3560 Training loss: 1.5566 0.1259 sec/batch\n", + "Epoch 7/20 Iteration 1147/3560 Training loss: 1.5559 0.1243 sec/batch\n", + "Epoch 7/20 Iteration 1148/3560 Training loss: 1.5556 0.1309 sec/batch\n", + "Epoch 7/20 Iteration 1149/3560 Training loss: 1.5550 0.1253 sec/batch\n", + "Epoch 7/20 Iteration 1150/3560 Training loss: 1.5549 0.1227 sec/batch\n", + "Epoch 7/20 Iteration 1151/3560 Training loss: 1.5543 0.1247 sec/batch\n", + "Epoch 7/20 Iteration 1152/3560 Training loss: 1.5542 0.1272 sec/batch\n", + "Epoch 7/20 Iteration 1153/3560 Training loss: 1.5538 0.1211 sec/batch\n", + "Epoch 7/20 Iteration 1154/3560 Training loss: 1.5534 0.1229 sec/batch\n", + "Epoch 7/20 Iteration 1155/3560 Training loss: 1.5530 0.1222 sec/batch\n", + "Epoch 7/20 Iteration 1156/3560 Training loss: 1.5526 0.1249 sec/batch\n", + "Epoch 7/20 Iteration 1157/3560 Training loss: 1.5520 0.1212 sec/batch\n", + "Epoch 7/20 Iteration 1158/3560 Training loss: 1.5519 0.1247 sec/batch\n", + "Epoch 7/20 Iteration 1159/3560 Training loss: 1.5516 0.1188 sec/batch\n", + "Epoch 7/20 Iteration 1160/3560 Training loss: 1.5513 0.1190 sec/batch\n", + "Epoch 7/20 Iteration 1161/3560 Training loss: 1.5508 0.1193 sec/batch\n", + "Epoch 7/20 Iteration 1162/3560 Training loss: 1.5503 0.1233 sec/batch\n", + "Epoch 7/20 Iteration 1163/3560 Training loss: 1.5498 0.1192 sec/batch\n", + "Epoch 7/20 Iteration 1164/3560 Training loss: 1.5496 0.1194 sec/batch\n", + "Epoch 7/20 Iteration 1165/3560 Training loss: 1.5494 0.1174 sec/batch\n", + "Epoch 7/20 Iteration 1166/3560 Training loss: 1.5490 0.1218 sec/batch\n", + "Epoch 7/20 Iteration 1167/3560 Training loss: 1.5484 0.1248 sec/batch\n", + "Epoch 7/20 Iteration 1168/3560 Training loss: 1.5479 0.1269 sec/batch\n", + "Epoch 7/20 Iteration 1169/3560 Training loss: 1.5477 0.1230 sec/batch\n", + "Epoch 7/20 Iteration 1170/3560 Training loss: 1.5474 0.1256 sec/batch\n", + "Epoch 
7/20 Iteration 1171/3560 Training loss: 1.5471 0.1284 sec/batch\n", + "Epoch 7/20 Iteration 1172/3560 Training loss: 1.5467 0.1240 sec/batch\n", + "Epoch 7/20 Iteration 1173/3560 Training loss: 1.5464 0.1236 sec/batch\n", + "Epoch 7/20 Iteration 1174/3560 Training loss: 1.5461 0.1240 sec/batch\n", + "Epoch 7/20 Iteration 1175/3560 Training loss: 1.5460 0.1259 sec/batch\n", + "Epoch 7/20 Iteration 1176/3560 Training loss: 1.5458 0.1261 sec/batch\n", + "Epoch 7/20 Iteration 1177/3560 Training loss: 1.5456 0.1251 sec/batch\n", + "Epoch 7/20 Iteration 1178/3560 Training loss: 1.5455 0.1214 sec/batch\n", + "Epoch 7/20 Iteration 1179/3560 Training loss: 1.5452 0.1220 sec/batch\n", + "Epoch 7/20 Iteration 1180/3560 Training loss: 1.5449 0.1247 sec/batch\n", + "Epoch 7/20 Iteration 1181/3560 Training loss: 1.5446 0.1231 sec/batch\n", + "Epoch 7/20 Iteration 1182/3560 Training loss: 1.5443 0.1279 sec/batch\n", + "Epoch 7/20 Iteration 1183/3560 Training loss: 1.5439 0.1216 sec/batch\n", + "Epoch 7/20 Iteration 1184/3560 Training loss: 1.5435 0.1362 sec/batch\n", + "Epoch 7/20 Iteration 1185/3560 Training loss: 1.5432 0.1234 sec/batch\n", + "Epoch 7/20 Iteration 1186/3560 Training loss: 1.5430 0.1264 sec/batch\n", + "Epoch 7/20 Iteration 1187/3560 Training loss: 1.5427 0.1230 sec/batch\n", + "Epoch 7/20 Iteration 1188/3560 Training loss: 1.5426 0.1214 sec/batch\n", + "Epoch 7/20 Iteration 1189/3560 Training loss: 1.5423 0.1231 sec/batch\n", + "Epoch 7/20 Iteration 1190/3560 Training loss: 1.5418 0.1310 sec/batch\n", + "Epoch 7/20 Iteration 1191/3560 Training loss: 1.5413 0.1242 sec/batch\n", + "Epoch 7/20 Iteration 1192/3560 Training loss: 1.5412 0.1227 sec/batch\n", + "Epoch 7/20 Iteration 1193/3560 Training loss: 1.5410 0.1231 sec/batch\n", + "Epoch 7/20 Iteration 1194/3560 Training loss: 1.5405 0.1356 sec/batch\n", + "Epoch 7/20 Iteration 1195/3560 Training loss: 1.5404 0.1227 sec/batch\n", + "Epoch 7/20 Iteration 1196/3560 Training loss: 1.5403 0.1209 sec/batch\n", + 
"Epoch 7/20 Iteration 1197/3560 Training loss: 1.5401 0.1209 sec/batch\n", + "Epoch 7/20 Iteration 1198/3560 Training loss: 1.5397 0.1238 sec/batch\n", + "Epoch 7/20 Iteration 1199/3560 Training loss: 1.5391 0.1226 sec/batch\n", + "Epoch 7/20 Iteration 1200/3560 Training loss: 1.5387 0.1233 sec/batch\n", + "Epoch 7/20 Iteration 1201/3560 Training loss: 1.5387 0.1242 sec/batch\n", + "Epoch 7/20 Iteration 1202/3560 Training loss: 1.5387 0.1219 sec/batch\n", + "Epoch 7/20 Iteration 1203/3560 Training loss: 1.5385 0.1231 sec/batch\n", + "Epoch 7/20 Iteration 1204/3560 Training loss: 1.5384 0.1274 sec/batch\n", + "Epoch 7/20 Iteration 1205/3560 Training loss: 1.5384 0.1212 sec/batch\n", + "Epoch 7/20 Iteration 1206/3560 Training loss: 1.5383 0.1233 sec/batch\n", + "Epoch 7/20 Iteration 1207/3560 Training loss: 1.5382 0.1235 sec/batch\n", + "Epoch 7/20 Iteration 1208/3560 Training loss: 1.5381 0.1232 sec/batch\n", + "Epoch 7/20 Iteration 1209/3560 Training loss: 1.5383 0.1225 sec/batch\n", + "Epoch 7/20 Iteration 1210/3560 Training loss: 1.5382 0.1209 sec/batch\n", + "Epoch 7/20 Iteration 1211/3560 Training loss: 1.5380 0.1237 sec/batch\n", + "Epoch 7/20 Iteration 1212/3560 Training loss: 1.5380 0.1362 sec/batch\n", + "Epoch 7/20 Iteration 1213/3560 Training loss: 1.5377 0.1238 sec/batch\n", + "Epoch 7/20 Iteration 1214/3560 Training loss: 1.5377 0.1260 sec/batch\n", + "Epoch 7/20 Iteration 1215/3560 Training loss: 1.5376 0.1242 sec/batch\n", + "Epoch 7/20 Iteration 1216/3560 Training loss: 1.5377 0.1223 sec/batch\n", + "Epoch 7/20 Iteration 1217/3560 Training loss: 1.5376 0.1212 sec/batch\n", + "Epoch 7/20 Iteration 1218/3560 Training loss: 1.5374 0.1244 sec/batch\n", + "Epoch 7/20 Iteration 1219/3560 Training loss: 1.5370 0.1224 sec/batch\n", + "Epoch 7/20 Iteration 1220/3560 Training loss: 1.5368 0.1233 sec/batch\n", + "Epoch 7/20 Iteration 1221/3560 Training loss: 1.5367 0.1236 sec/batch\n", + "Epoch 7/20 Iteration 1222/3560 Training loss: 1.5366 0.1251 sec/batch\n", 
+ "Epoch 7/20 Iteration 1223/3560 Training loss: 1.5365 0.1211 sec/batch\n", + "Epoch 7/20 Iteration 1224/3560 Training loss: 1.5363 0.1250 sec/batch\n", + "Epoch 7/20 Iteration 1225/3560 Training loss: 1.5362 0.1248 sec/batch\n", + "Epoch 7/20 Iteration 1226/3560 Training loss: 1.5360 0.1218 sec/batch\n", + "Epoch 7/20 Iteration 1227/3560 Training loss: 1.5356 0.1216 sec/batch\n", + "Epoch 7/20 Iteration 1228/3560 Training loss: 1.5356 0.1269 sec/batch\n", + "Epoch 7/20 Iteration 1229/3560 Training loss: 1.5356 0.1224 sec/batch\n", + "Epoch 7/20 Iteration 1230/3560 Training loss: 1.5355 0.1268 sec/batch\n", + "Epoch 7/20 Iteration 1231/3560 Training loss: 1.5354 0.1235 sec/batch\n", + "Epoch 7/20 Iteration 1232/3560 Training loss: 1.5352 0.1243 sec/batch\n", + "Epoch 7/20 Iteration 1233/3560 Training loss: 1.5350 0.1217 sec/batch\n", + "Epoch 7/20 Iteration 1234/3560 Training loss: 1.5348 0.1255 sec/batch\n", + "Epoch 7/20 Iteration 1235/3560 Training loss: 1.5349 0.1236 sec/batch\n", + "Epoch 7/20 Iteration 1236/3560 Training loss: 1.5352 0.1256 sec/batch\n", + "Epoch 7/20 Iteration 1237/3560 Training loss: 1.5350 0.1233 sec/batch\n", + "Epoch 7/20 Iteration 1238/3560 Training loss: 1.5348 0.1226 sec/batch\n", + "Epoch 7/20 Iteration 1239/3560 Training loss: 1.5346 0.1221 sec/batch\n", + "Epoch 7/20 Iteration 1240/3560 Training loss: 1.5343 0.1240 sec/batch\n", + "Epoch 7/20 Iteration 1241/3560 Training loss: 1.5342 0.1343 sec/batch\n", + "Epoch 7/20 Iteration 1242/3560 Training loss: 1.5341 0.1212 sec/batch\n", + "Epoch 7/20 Iteration 1243/3560 Training loss: 1.5340 0.1213 sec/batch\n", + "Epoch 7/20 Iteration 1244/3560 Training loss: 1.5338 0.1293 sec/batch\n", + "Epoch 7/20 Iteration 1245/3560 Training loss: 1.5335 0.1209 sec/batch\n", + "Epoch 7/20 Iteration 1246/3560 Training loss: 1.5335 0.1248 sec/batch\n", + "Epoch 8/20 Iteration 1247/3560 Training loss: 1.5937 0.1217 sec/batch\n", + "Epoch 8/20 Iteration 1248/3560 Training loss: 1.5550 0.1308 
sec/batch\n", + "Epoch 8/20 Iteration 1249/3560 Training loss: 1.5391 0.1205 sec/batch\n", + "Epoch 8/20 Iteration 1250/3560 Training loss: 1.5302 0.1263 sec/batch\n", + "Epoch 8/20 Iteration 1251/3560 Training loss: 1.5225 0.1227 sec/batch\n", + "Epoch 8/20 Iteration 1252/3560 Training loss: 1.5115 0.1242 sec/batch\n", + "Epoch 8/20 Iteration 1253/3560 Training loss: 1.5112 0.1211 sec/batch\n", + "Epoch 8/20 Iteration 1254/3560 Training loss: 1.5092 0.1218 sec/batch\n", + "Epoch 8/20 Iteration 1255/3560 Training loss: 1.5093 0.1214 sec/batch\n", + "Epoch 8/20 Iteration 1256/3560 Training loss: 1.5081 0.1244 sec/batch\n", + "Epoch 8/20 Iteration 1257/3560 Training loss: 1.5038 0.1208 sec/batch\n", + "Epoch 8/20 Iteration 1258/3560 Training loss: 1.5019 0.1234 sec/batch\n", + "Epoch 8/20 Iteration 1259/3560 Training loss: 1.5023 0.1235 sec/batch\n", + "Epoch 8/20 Iteration 1260/3560 Training loss: 1.5042 0.1271 sec/batch\n", + "Epoch 8/20 Iteration 1261/3560 Training loss: 1.5029 0.1200 sec/batch\n", + "Epoch 8/20 Iteration 1262/3560 Training loss: 1.5012 0.1231 sec/batch\n", + "Epoch 8/20 Iteration 1263/3560 Training loss: 1.5016 0.1240 sec/batch\n", + "Epoch 8/20 Iteration 1264/3560 Training loss: 1.5027 0.1236 sec/batch\n", + "Epoch 8/20 Iteration 1265/3560 Training loss: 1.5032 0.1216 sec/batch\n", + "Epoch 8/20 Iteration 1266/3560 Training loss: 1.5042 0.1229 sec/batch\n", + "Epoch 8/20 Iteration 1267/3560 Training loss: 1.5037 0.1229 sec/batch\n", + "Epoch 8/20 Iteration 1268/3560 Training loss: 1.5040 0.1271 sec/batch\n", + "Epoch 8/20 Iteration 1269/3560 Training loss: 1.5033 0.1212 sec/batch\n", + "Epoch 8/20 Iteration 1270/3560 Training loss: 1.5030 0.1209 sec/batch\n", + "Epoch 8/20 Iteration 1271/3560 Training loss: 1.5028 0.1200 sec/batch\n", + "Epoch 8/20 Iteration 1272/3560 Training loss: 1.5014 0.1218 sec/batch\n", + "Epoch 8/20 Iteration 1273/3560 Training loss: 1.5001 0.1214 sec/batch\n", + "Epoch 8/20 Iteration 1274/3560 Training loss: 1.5007 
0.1216 sec/batch\n", + "Epoch 8/20 Iteration 1275/3560 Training loss: 1.5009 0.1213 sec/batch\n", + "Epoch 8/20 Iteration 1276/3560 Training loss: 1.5009 0.1241 sec/batch\n", + "Epoch 8/20 Iteration 1277/3560 Training loss: 1.5004 0.1246 sec/batch\n", + "Epoch 8/20 Iteration 1278/3560 Training loss: 1.4994 0.1209 sec/batch\n", + "Epoch 8/20 Iteration 1279/3560 Training loss: 1.4998 0.1209 sec/batch\n", + "Epoch 8/20 Iteration 1280/3560 Training loss: 1.5001 0.1266 sec/batch\n", + "Epoch 8/20 Iteration 1281/3560 Training loss: 1.4995 0.1215 sec/batch\n", + "Epoch 8/20 Iteration 1282/3560 Training loss: 1.4990 0.1247 sec/batch\n", + "Epoch 8/20 Iteration 1283/3560 Training loss: 1.4979 0.1207 sec/batch\n", + "Epoch 8/20 Iteration 1284/3560 Training loss: 1.4966 0.1218 sec/batch\n", + "Epoch 8/20 Iteration 1285/3560 Training loss: 1.4948 0.1237 sec/batch\n", + "Epoch 8/20 Iteration 1286/3560 Training loss: 1.4942 0.1211 sec/batch\n", + "Epoch 8/20 Iteration 1287/3560 Training loss: 1.4937 0.1206 sec/batch\n", + "Epoch 8/20 Iteration 1288/3560 Training loss: 1.4941 0.1232 sec/batch\n", + "Epoch 8/20 Iteration 1289/3560 Training loss: 1.4936 0.1217 sec/batch\n", + "Epoch 8/20 Iteration 1290/3560 Training loss: 1.4929 0.1231 sec/batch\n", + "Epoch 8/20 Iteration 1291/3560 Training loss: 1.4932 0.1251 sec/batch\n", + "Epoch 8/20 Iteration 1292/3560 Training loss: 1.4922 0.1260 sec/batch\n", + "Epoch 8/20 Iteration 1293/3560 Training loss: 1.4918 0.1218 sec/batch\n", + "Epoch 8/20 Iteration 1294/3560 Training loss: 1.4915 0.1238 sec/batch\n", + "Epoch 8/20 Iteration 1295/3560 Training loss: 1.4913 0.1241 sec/batch\n", + "Epoch 8/20 Iteration 1296/3560 Training loss: 1.4917 0.1239 sec/batch\n", + "Epoch 8/20 Iteration 1297/3560 Training loss: 1.4913 0.1210 sec/batch\n", + "Epoch 8/20 Iteration 1298/3560 Training loss: 1.4918 0.1223 sec/batch\n", + "Epoch 8/20 Iteration 1299/3560 Training loss: 1.4917 0.1213 sec/batch\n", + "Epoch 8/20 Iteration 1300/3560 Training loss: 
1.4917 0.1226 sec/batch\n", + "Epoch 8/20 Iteration 1301/3560 Training loss: 1.4914 0.1231 sec/batch\n", + "Epoch 8/20 Iteration 1302/3560 Training loss: 1.4915 0.1268 sec/batch\n", + "Epoch 8/20 Iteration 1303/3560 Training loss: 1.4918 0.1213 sec/batch\n", + "Epoch 8/20 Iteration 1304/3560 Training loss: 1.4915 0.1248 sec/batch\n", + "Epoch 8/20 Iteration 1305/3560 Training loss: 1.4909 0.1216 sec/batch\n", + "Epoch 8/20 Iteration 1306/3560 Training loss: 1.4914 0.1207 sec/batch\n", + "Epoch 8/20 Iteration 1307/3560 Training loss: 1.4914 0.1210 sec/batch\n", + "Epoch 8/20 Iteration 1308/3560 Training loss: 1.4922 0.1242 sec/batch\n", + "Epoch 8/20 Iteration 1309/3560 Training loss: 1.4924 0.1241 sec/batch\n", + "Epoch 8/20 Iteration 1310/3560 Training loss: 1.4925 0.1226 sec/batch\n", + "Epoch 8/20 Iteration 1311/3560 Training loss: 1.4923 0.1225 sec/batch\n", + "Epoch 8/20 Iteration 1312/3560 Training loss: 1.4924 0.1216 sec/batch\n", + "Epoch 8/20 Iteration 1313/3560 Training loss: 1.4924 0.1266 sec/batch\n", + "Epoch 8/20 Iteration 1314/3560 Training loss: 1.4918 0.1245 sec/batch\n", + "Epoch 8/20 Iteration 1315/3560 Training loss: 1.4916 0.1216 sec/batch\n", + "Epoch 8/20 Iteration 1316/3560 Training loss: 1.4913 0.1219 sec/batch\n", + "Epoch 8/20 Iteration 1317/3560 Training loss: 1.4917 0.1249 sec/batch\n", + "Epoch 8/20 Iteration 1318/3560 Training loss: 1.4917 0.1241 sec/batch\n", + "Epoch 8/20 Iteration 1319/3560 Training loss: 1.4921 0.1211 sec/batch\n", + "Epoch 8/20 Iteration 1320/3560 Training loss: 1.4916 0.1215 sec/batch\n", + "Epoch 8/20 Iteration 1321/3560 Training loss: 1.4913 0.1236 sec/batch\n", + "Epoch 8/20 Iteration 1322/3560 Training loss: 1.4914 0.1252 sec/batch\n", + "Epoch 8/20 Iteration 1323/3560 Training loss: 1.4911 0.1225 sec/batch\n", + "Epoch 8/20 Iteration 1324/3560 Training loss: 1.4907 0.1245 sec/batch\n", + "Epoch 8/20 Iteration 1325/3560 Training loss: 1.4899 0.1235 sec/batch\n", + "Epoch 8/20 Iteration 1326/3560 Training 
loss: 1.4896 0.1231 sec/batch\n", + "Epoch 8/20 Iteration 1327/3560 Training loss: 1.4890 0.1208 sec/batch\n", + "Epoch 8/20 Iteration 1328/3560 Training loss: 1.4888 0.1214 sec/batch\n", + "Epoch 8/20 Iteration 1329/3560 Training loss: 1.4881 0.1247 sec/batch\n", + "Epoch 8/20 Iteration 1330/3560 Training loss: 1.4879 0.1217 sec/batch\n", + "Epoch 8/20 Iteration 1331/3560 Training loss: 1.4875 0.1245 sec/batch\n", + "Epoch 8/20 Iteration 1332/3560 Training loss: 1.4871 0.1240 sec/batch\n", + "Epoch 8/20 Iteration 1333/3560 Training loss: 1.4866 0.1230 sec/batch\n", + "Epoch 8/20 Iteration 1334/3560 Training loss: 1.4860 0.1237 sec/batch\n", + "Epoch 8/20 Iteration 1335/3560 Training loss: 1.4854 0.1245 sec/batch\n", + "Epoch 8/20 Iteration 1336/3560 Training loss: 1.4854 0.1247 sec/batch\n", + "Epoch 8/20 Iteration 1337/3560 Training loss: 1.4850 0.1237 sec/batch\n", + "Epoch 8/20 Iteration 1338/3560 Training loss: 1.4846 0.1272 sec/batch\n", + "Epoch 8/20 Iteration 1339/3560 Training loss: 1.4840 0.1206 sec/batch\n", + "Epoch 8/20 Iteration 1340/3560 Training loss: 1.4836 0.1254 sec/batch\n", + "Epoch 8/20 Iteration 1341/3560 Training loss: 1.4831 0.1240 sec/batch\n", + "Epoch 8/20 Iteration 1342/3560 Training loss: 1.4829 0.1229 sec/batch\n", + "Epoch 8/20 Iteration 1343/3560 Training loss: 1.4826 0.1236 sec/batch\n", + "Epoch 8/20 Iteration 1344/3560 Training loss: 1.4821 0.1266 sec/batch\n", + "Epoch 8/20 Iteration 1345/3560 Training loss: 1.4816 0.1219 sec/batch\n", + "Epoch 8/20 Iteration 1346/3560 Training loss: 1.4810 0.1249 sec/batch\n", + "Epoch 8/20 Iteration 1347/3560 Training loss: 1.4808 0.1294 sec/batch\n", + "Epoch 8/20 Iteration 1348/3560 Training loss: 1.4804 0.1222 sec/batch\n", + "Epoch 8/20 Iteration 1349/3560 Training loss: 1.4800 0.1206 sec/batch\n", + "Epoch 8/20 Iteration 1350/3560 Training loss: 1.4796 0.1224 sec/batch\n", + "Epoch 8/20 Iteration 1351/3560 Training loss: 1.4792 0.1237 sec/batch\n", + "Epoch 8/20 Iteration 1352/3560 
Training loss: 1.4789 0.1252 sec/batch\n", + "Epoch 8/20 Iteration 1353/3560 Training loss: 1.4787 0.1232 sec/batch\n", + "Epoch 8/20 Iteration 1354/3560 Training loss: 1.4784 0.1241 sec/batch\n", + "Epoch 8/20 Iteration 1355/3560 Training loss: 1.4782 0.1237 sec/batch\n", + "Epoch 8/20 Iteration 1356/3560 Training loss: 1.4780 0.1215 sec/batch\n", + "Epoch 8/20 Iteration 1357/3560 Training loss: 1.4776 0.1244 sec/batch\n", + "Epoch 8/20 Iteration 1358/3560 Training loss: 1.4774 0.1222 sec/batch\n", + "Epoch 8/20 Iteration 1359/3560 Training loss: 1.4772 0.1254 sec/batch\n", + "Epoch 8/20 Iteration 1360/3560 Training loss: 1.4769 0.1213 sec/batch\n", + "Epoch 8/20 Iteration 1361/3560 Training loss: 1.4766 0.1238 sec/batch\n", + "Epoch 8/20 Iteration 1362/3560 Training loss: 1.4760 0.1305 sec/batch\n", + "Epoch 8/20 Iteration 1363/3560 Training loss: 1.4758 0.1241 sec/batch\n", + "Epoch 8/20 Iteration 1364/3560 Training loss: 1.4756 0.1209 sec/batch\n", + "Epoch 8/20 Iteration 1365/3560 Training loss: 1.4754 0.1212 sec/batch\n", + "Epoch 8/20 Iteration 1366/3560 Training loss: 1.4751 0.1219 sec/batch\n", + "Epoch 8/20 Iteration 1367/3560 Training loss: 1.4748 0.1215 sec/batch\n", + "Epoch 8/20 Iteration 1368/3560 Training loss: 1.4743 0.1292 sec/batch\n", + "Epoch 8/20 Iteration 1369/3560 Training loss: 1.4737 0.1216 sec/batch\n", + "Epoch 8/20 Iteration 1370/3560 Training loss: 1.4736 0.1248 sec/batch\n", + "Epoch 8/20 Iteration 1371/3560 Training loss: 1.4733 0.1227 sec/batch\n", + "Epoch 8/20 Iteration 1372/3560 Training loss: 1.4728 0.1239 sec/batch\n", + "Epoch 8/20 Iteration 1373/3560 Training loss: 1.4728 0.1219 sec/batch\n", + "Epoch 8/20 Iteration 1374/3560 Training loss: 1.4727 0.1211 sec/batch\n", + "Epoch 8/20 Iteration 1375/3560 Training loss: 1.4724 0.1241 sec/batch\n", + "Epoch 8/20 Iteration 1376/3560 Training loss: 1.4720 0.1214 sec/batch\n", + "Epoch 8/20 Iteration 1377/3560 Training loss: 1.4715 0.1224 sec/batch\n", + "Epoch 8/20 Iteration 
1378/3560 Training loss: 1.4710 0.1266 sec/batch\n", + "Epoch 8/20 Iteration 1379/3560 Training loss: 1.4710 0.1224 sec/batch\n", + "Epoch 8/20 Iteration 1380/3560 Training loss: 1.4709 0.1247 sec/batch\n", + "Epoch 8/20 Iteration 1381/3560 Training loss: 1.4707 0.1250 sec/batch\n", + "Epoch 8/20 Iteration 1382/3560 Training loss: 1.4706 0.1238 sec/batch\n", + "Epoch 8/20 Iteration 1383/3560 Training loss: 1.4705 0.1202 sec/batch\n", + "Epoch 8/20 Iteration 1384/3560 Training loss: 1.4705 0.1220 sec/batch\n", + "Epoch 8/20 Iteration 1385/3560 Training loss: 1.4704 0.1229 sec/batch\n", + "Epoch 8/20 Iteration 1386/3560 Training loss: 1.4703 0.1239 sec/batch\n", + "Epoch 8/20 Iteration 1387/3560 Training loss: 1.4705 0.1248 sec/batch\n", + "Epoch 8/20 Iteration 1388/3560 Training loss: 1.4704 0.1208 sec/batch\n", + "Epoch 8/20 Iteration 1389/3560 Training loss: 1.4702 0.1227 sec/batch\n", + "Epoch 8/20 Iteration 1390/3560 Training loss: 1.4703 0.1314 sec/batch\n", + "Epoch 8/20 Iteration 1391/3560 Training loss: 1.4700 0.1259 sec/batch\n", + "Epoch 8/20 Iteration 1392/3560 Training loss: 1.4701 0.1270 sec/batch\n", + "Epoch 8/20 Iteration 1393/3560 Training loss: 1.4699 0.1214 sec/batch\n", + "Epoch 8/20 Iteration 1394/3560 Training loss: 1.4700 0.1220 sec/batch\n", + "Epoch 8/20 Iteration 1395/3560 Training loss: 1.4699 0.1245 sec/batch\n", + "Epoch 8/20 Iteration 1396/3560 Training loss: 1.4696 0.1215 sec/batch\n", + "Epoch 8/20 Iteration 1397/3560 Training loss: 1.4691 0.1229 sec/batch\n", + "Epoch 8/20 Iteration 1398/3560 Training loss: 1.4689 0.1260 sec/batch\n", + "Epoch 8/20 Iteration 1399/3560 Training loss: 1.4688 0.1231 sec/batch\n", + "Epoch 8/20 Iteration 1400/3560 Training loss: 1.4687 0.1214 sec/batch\n", + "Epoch 8/20 Iteration 1401/3560 Training loss: 1.4685 0.1211 sec/batch\n", + "Epoch 8/20 Iteration 1402/3560 Training loss: 1.4683 0.1240 sec/batch\n", + "Epoch 8/20 Iteration 1403/3560 Training loss: 1.4682 0.1227 sec/batch\n", + "Epoch 8/20 
Iteration 1404/3560 Training loss: 1.4679 0.1253 sec/batch\n", + "Epoch 8/20 Iteration 1405/3560 Training loss: 1.4676 0.1234 sec/batch\n", + "Epoch 8/20 Iteration 1406/3560 Training loss: 1.4675 0.1212 sec/batch\n", + "Epoch 8/20 Iteration 1407/3560 Training loss: 1.4676 0.1224 sec/batch\n", + "Epoch 8/20 Iteration 1408/3560 Training loss: 1.4675 0.1224 sec/batch\n", + "Epoch 8/20 Iteration 1409/3560 Training loss: 1.4673 0.1239 sec/batch\n", + "Epoch 8/20 Iteration 1410/3560 Training loss: 1.4672 0.1233 sec/batch\n", + "Epoch 8/20 Iteration 1411/3560 Training loss: 1.4670 0.1223 sec/batch\n", + "Epoch 8/20 Iteration 1412/3560 Training loss: 1.4668 0.1267 sec/batch\n", + "Epoch 8/20 Iteration 1413/3560 Training loss: 1.4669 0.1275 sec/batch\n", + "Epoch 8/20 Iteration 1414/3560 Training loss: 1.4672 0.1218 sec/batch\n", + "Epoch 8/20 Iteration 1415/3560 Training loss: 1.4670 0.1224 sec/batch\n", + "Epoch 8/20 Iteration 1416/3560 Training loss: 1.4669 0.1218 sec/batch\n", + "Epoch 8/20 Iteration 1417/3560 Training loss: 1.4667 0.1212 sec/batch\n", + "Epoch 8/20 Iteration 1418/3560 Training loss: 1.4663 0.1219 sec/batch\n", + "Epoch 8/20 Iteration 1419/3560 Training loss: 1.4663 0.1230 sec/batch\n", + "Epoch 8/20 Iteration 1420/3560 Training loss: 1.4662 0.1245 sec/batch\n", + "Epoch 8/20 Iteration 1421/3560 Training loss: 1.4661 0.1233 sec/batch\n", + "Epoch 8/20 Iteration 1422/3560 Training loss: 1.4658 0.1229 sec/batch\n", + "Epoch 8/20 Iteration 1423/3560 Training loss: 1.4656 0.1227 sec/batch\n", + "Epoch 8/20 Iteration 1424/3560 Training loss: 1.4655 0.1223 sec/batch\n", + "Epoch 9/20 Iteration 1425/3560 Training loss: 1.5520 0.1247 sec/batch\n", + "Epoch 9/20 Iteration 1426/3560 Training loss: 1.5057 0.1228 sec/batch\n", + "Epoch 9/20 Iteration 1427/3560 Training loss: 1.4855 0.1228 sec/batch\n", + "Epoch 9/20 Iteration 1428/3560 Training loss: 1.4758 0.1252 sec/batch\n", + "Epoch 9/20 Iteration 1429/3560 Training loss: 1.4644 0.1253 sec/batch\n", + "Epoch 
9/20 Iteration 1430/3560 Training loss: 1.4529 0.1219 sec/batch\n", + "Epoch 9/20 Iteration 1431/3560 Training loss: 1.4539 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1432/3560 Training loss: 1.4509 0.1262 sec/batch\n", + "Epoch 9/20 Iteration 1433/3560 Training loss: 1.4497 0.1249 sec/batch\n", + "Epoch 9/20 Iteration 1434/3560 Training loss: 1.4480 0.1235 sec/batch\n", + "Epoch 9/20 Iteration 1435/3560 Training loss: 1.4438 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1436/3560 Training loss: 1.4423 0.1294 sec/batch\n", + "Epoch 9/20 Iteration 1437/3560 Training loss: 1.4417 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1438/3560 Training loss: 1.4430 0.1261 sec/batch\n", + "Epoch 9/20 Iteration 1439/3560 Training loss: 1.4414 0.1225 sec/batch\n", + "Epoch 9/20 Iteration 1440/3560 Training loss: 1.4392 0.1233 sec/batch\n", + "Epoch 9/20 Iteration 1441/3560 Training loss: 1.4391 0.1219 sec/batch\n", + "Epoch 9/20 Iteration 1442/3560 Training loss: 1.4399 0.1246 sec/batch\n", + "Epoch 9/20 Iteration 1443/3560 Training loss: 1.4399 0.1234 sec/batch\n", + "Epoch 9/20 Iteration 1444/3560 Training loss: 1.4404 0.1227 sec/batch\n", + "Epoch 9/20 Iteration 1445/3560 Training loss: 1.4396 0.1229 sec/batch\n", + "Epoch 9/20 Iteration 1446/3560 Training loss: 1.4396 0.1267 sec/batch\n", + "Epoch 9/20 Iteration 1447/3560 Training loss: 1.4389 0.1228 sec/batch\n", + "Epoch 9/20 Iteration 1448/3560 Training loss: 1.4389 0.1261 sec/batch\n", + "Epoch 9/20 Iteration 1449/3560 Training loss: 1.4389 0.1232 sec/batch\n", + "Epoch 9/20 Iteration 1450/3560 Training loss: 1.4371 0.1239 sec/batch\n", + "Epoch 9/20 Iteration 1451/3560 Training loss: 1.4356 0.1242 sec/batch\n", + "Epoch 9/20 Iteration 1452/3560 Training loss: 1.4362 0.1259 sec/batch\n", + "Epoch 9/20 Iteration 1453/3560 Training loss: 1.4366 0.1227 sec/batch\n", + "Epoch 9/20 Iteration 1454/3560 Training loss: 1.4368 0.1233 sec/batch\n", + "Epoch 9/20 Iteration 1455/3560 Training loss: 1.4366 0.1236 sec/batch\n", + 
"Epoch 9/20 Iteration 1456/3560 Training loss: 1.4353 0.1235 sec/batch\n", + "Epoch 9/20 Iteration 1457/3560 Training loss: 1.4354 0.1257 sec/batch\n", + "Epoch 9/20 Iteration 1458/3560 Training loss: 1.4357 0.1217 sec/batch\n", + "Epoch 9/20 Iteration 1459/3560 Training loss: 1.4354 0.1225 sec/batch\n", + "Epoch 9/20 Iteration 1460/3560 Training loss: 1.4353 0.1289 sec/batch\n", + "Epoch 9/20 Iteration 1461/3560 Training loss: 1.4346 0.1239 sec/batch\n", + "Epoch 9/20 Iteration 1462/3560 Training loss: 1.4334 0.1220 sec/batch\n", + "Epoch 9/20 Iteration 1463/3560 Training loss: 1.4318 0.1228 sec/batch\n", + "Epoch 9/20 Iteration 1464/3560 Training loss: 1.4312 0.1219 sec/batch\n", + "Epoch 9/20 Iteration 1465/3560 Training loss: 1.4306 0.1216 sec/batch\n", + "Epoch 9/20 Iteration 1466/3560 Training loss: 1.4312 0.1245 sec/batch\n", + "Epoch 9/20 Iteration 1467/3560 Training loss: 1.4308 0.1243 sec/batch\n", + "Epoch 9/20 Iteration 1468/3560 Training loss: 1.4299 0.1244 sec/batch\n", + "Epoch 9/20 Iteration 1469/3560 Training loss: 1.4301 0.1225 sec/batch\n", + "Epoch 9/20 Iteration 1470/3560 Training loss: 1.4289 0.1223 sec/batch\n", + "Epoch 9/20 Iteration 1471/3560 Training loss: 1.4286 0.1234 sec/batch\n", + "Epoch 9/20 Iteration 1472/3560 Training loss: 1.4281 0.1227 sec/batch\n", + "Epoch 9/20 Iteration 1473/3560 Training loss: 1.4279 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1474/3560 Training loss: 1.4282 0.1225 sec/batch\n", + "Epoch 9/20 Iteration 1475/3560 Training loss: 1.4276 0.1227 sec/batch\n", + "Epoch 9/20 Iteration 1476/3560 Training loss: 1.4283 0.1223 sec/batch\n", + "Epoch 9/20 Iteration 1477/3560 Training loss: 1.4281 0.1208 sec/batch\n", + "Epoch 9/20 Iteration 1478/3560 Training loss: 1.4284 0.1255 sec/batch\n", + "Epoch 9/20 Iteration 1479/3560 Training loss: 1.4284 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1480/3560 Training loss: 1.4283 0.1255 sec/batch\n", + "Epoch 9/20 Iteration 1481/3560 Training loss: 1.4286 0.1224 sec/batch\n", 
+ "Epoch 9/20 Iteration 1482/3560 Training loss: 1.4281 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1483/3560 Training loss: 1.4275 0.1243 sec/batch\n", + "Epoch 9/20 Iteration 1484/3560 Training loss: 1.4280 0.1263 sec/batch\n", + "Epoch 9/20 Iteration 1485/3560 Training loss: 1.4279 0.1233 sec/batch\n", + "Epoch 9/20 Iteration 1486/3560 Training loss: 1.4286 0.1316 sec/batch\n", + "Epoch 9/20 Iteration 1487/3560 Training loss: 1.4288 0.1241 sec/batch\n", + "Epoch 9/20 Iteration 1488/3560 Training loss: 1.4288 0.1224 sec/batch\n", + "Epoch 9/20 Iteration 1489/3560 Training loss: 1.4287 0.1229 sec/batch\n", + "Epoch 9/20 Iteration 1490/3560 Training loss: 1.4289 0.1256 sec/batch\n", + "Epoch 9/20 Iteration 1491/3560 Training loss: 1.4288 0.1241 sec/batch\n", + "Epoch 9/20 Iteration 1492/3560 Training loss: 1.4283 0.1258 sec/batch\n", + "Epoch 9/20 Iteration 1493/3560 Training loss: 1.4283 0.1244 sec/batch\n", + "Epoch 9/20 Iteration 1494/3560 Training loss: 1.4279 0.1225 sec/batch\n", + "Epoch 9/20 Iteration 1495/3560 Training loss: 1.4282 0.1250 sec/batch\n", + "Epoch 9/20 Iteration 1496/3560 Training loss: 1.4283 0.1230 sec/batch\n", + "Epoch 9/20 Iteration 1497/3560 Training loss: 1.4286 0.1258 sec/batch\n", + "Epoch 9/20 Iteration 1498/3560 Training loss: 1.4283 0.1237 sec/batch\n", + "Epoch 9/20 Iteration 1499/3560 Training loss: 1.4282 0.1218 sec/batch\n", + "Epoch 9/20 Iteration 1500/3560 Training loss: 1.4283 0.1235 sec/batch\n", + "Epoch 9/20 Iteration 1501/3560 Training loss: 1.4278 0.1264 sec/batch\n", + "Epoch 9/20 Iteration 1502/3560 Training loss: 1.4277 0.1234 sec/batch\n", + "Epoch 9/20 Iteration 1503/3560 Training loss: 1.4269 0.1238 sec/batch\n", + "Epoch 9/20 Iteration 1504/3560 Training loss: 1.4266 0.1219 sec/batch\n", + "Epoch 9/20 Iteration 1505/3560 Training loss: 1.4261 0.1238 sec/batch\n", + "Epoch 9/20 Iteration 1506/3560 Training loss: 1.4259 0.1247 sec/batch\n", + "Epoch 9/20 Iteration 1507/3560 Training loss: 1.4253 0.1253 
sec/batch\n", + "Epoch 9/20 Iteration 1508/3560 Training loss: 1.4252 0.1239 sec/batch\n", + "Epoch 9/20 Iteration 1509/3560 Training loss: 1.4248 0.1247 sec/batch\n", + "Epoch 9/20 Iteration 1510/3560 Training loss: 1.4246 0.1228 sec/batch\n", + "Epoch 9/20 Iteration 1511/3560 Training loss: 1.4242 0.1227 sec/batch\n", + "Epoch 9/20 Iteration 1512/3560 Training loss: 1.4238 0.1255 sec/batch\n", + "Epoch 9/20 Iteration 1513/3560 Training loss: 1.4233 0.1266 sec/batch\n", + "Epoch 9/20 Iteration 1514/3560 Training loss: 1.4235 0.1215 sec/batch\n", + "Epoch 9/20 Iteration 1515/3560 Training loss: 1.4232 0.1267 sec/batch\n", + "Epoch 9/20 Iteration 1516/3560 Training loss: 1.4231 0.1224 sec/batch\n", + "Epoch 9/20 Iteration 1517/3560 Training loss: 1.4227 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1518/3560 Training loss: 1.4223 0.1246 sec/batch\n", + "Epoch 9/20 Iteration 1519/3560 Training loss: 1.4219 0.1236 sec/batch\n", + "Epoch 9/20 Iteration 1520/3560 Training loss: 1.4218 0.1255 sec/batch\n", + "Epoch 9/20 Iteration 1521/3560 Training loss: 1.4217 0.1246 sec/batch\n", + "Epoch 9/20 Iteration 1522/3560 Training loss: 1.4211 0.1250 sec/batch\n", + "Epoch 9/20 Iteration 1523/3560 Training loss: 1.4206 0.1215 sec/batch\n", + "Epoch 9/20 Iteration 1524/3560 Training loss: 1.4201 0.1264 sec/batch\n", + "Epoch 9/20 Iteration 1525/3560 Training loss: 1.4201 0.1218 sec/batch\n", + "Epoch 9/20 Iteration 1526/3560 Training loss: 1.4199 0.1212 sec/batch\n", + "Epoch 9/20 Iteration 1527/3560 Training loss: 1.4196 0.1245 sec/batch\n", + "Epoch 9/20 Iteration 1528/3560 Training loss: 1.4194 0.1229 sec/batch\n", + "Epoch 9/20 Iteration 1529/3560 Training loss: 1.4191 0.1216 sec/batch\n", + "Epoch 9/20 Iteration 1530/3560 Training loss: 1.4189 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1531/3560 Training loss: 1.4187 0.1260 sec/batch\n", + "Epoch 9/20 Iteration 1532/3560 Training loss: 1.4185 0.1229 sec/batch\n", + "Epoch 9/20 Iteration 1533/3560 Training loss: 1.4183 
0.1243 sec/batch\n", + "Epoch 9/20 Iteration 1534/3560 Training loss: 1.4182 0.1252 sec/batch\n", + "Epoch 9/20 Iteration 1535/3560 Training loss: 1.4180 0.1255 sec/batch\n", + "Epoch 9/20 Iteration 1536/3560 Training loss: 1.4178 0.1215 sec/batch\n", + "Epoch 9/20 Iteration 1537/3560 Training loss: 1.4176 0.1235 sec/batch\n", + "Epoch 9/20 Iteration 1538/3560 Training loss: 1.4174 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1539/3560 Training loss: 1.4171 0.1230 sec/batch\n", + "Epoch 9/20 Iteration 1540/3560 Training loss: 1.4166 0.1309 sec/batch\n", + "Epoch 9/20 Iteration 1541/3560 Training loss: 1.4165 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1542/3560 Training loss: 1.4165 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1543/3560 Training loss: 1.4163 0.1217 sec/batch\n", + "Epoch 9/20 Iteration 1544/3560 Training loss: 1.4161 0.1224 sec/batch\n", + "Epoch 9/20 Iteration 1545/3560 Training loss: 1.4159 0.1209 sec/batch\n", + "Epoch 9/20 Iteration 1546/3560 Training loss: 1.4154 0.1234 sec/batch\n", + "Epoch 9/20 Iteration 1547/3560 Training loss: 1.4149 0.1238 sec/batch\n", + "Epoch 9/20 Iteration 1548/3560 Training loss: 1.4148 0.1231 sec/batch\n", + "Epoch 9/20 Iteration 1549/3560 Training loss: 1.4147 0.1224 sec/batch\n", + "Epoch 9/20 Iteration 1550/3560 Training loss: 1.4142 0.1285 sec/batch\n", + "Epoch 9/20 Iteration 1551/3560 Training loss: 1.4142 0.1240 sec/batch\n", + "Epoch 9/20 Iteration 1552/3560 Training loss: 1.4142 0.1242 sec/batch\n", + "Epoch 9/20 Iteration 1553/3560 Training loss: 1.4139 0.1233 sec/batch\n", + "Epoch 9/20 Iteration 1554/3560 Training loss: 1.4137 0.1220 sec/batch\n", + "Epoch 9/20 Iteration 1555/3560 Training loss: 1.4133 0.1245 sec/batch\n", + "Epoch 9/20 Iteration 1556/3560 Training loss: 1.4130 0.1247 sec/batch\n", + "Epoch 9/20 Iteration 1557/3560 Training loss: 1.4131 0.1239 sec/batch\n", + "Epoch 9/20 Iteration 1558/3560 Training loss: 1.4131 0.1267 sec/batch\n", + "Epoch 9/20 Iteration 1559/3560 Training loss: 
1.4129 0.1256 sec/batch\n", + "Epoch 9/20 Iteration 1560/3560 Training loss: 1.4128 0.1277 sec/batch\n", + "Epoch 9/20 Iteration 1561/3560 Training loss: 1.4129 0.1241 sec/batch\n", + "Epoch 9/20 Iteration 1562/3560 Training loss: 1.4129 0.1229 sec/batch\n", + "Epoch 9/20 Iteration 1563/3560 Training loss: 1.4129 0.1237 sec/batch\n", + "Epoch 9/20 Iteration 1564/3560 Training loss: 1.4128 0.1242 sec/batch\n", + "Epoch 9/20 Iteration 1565/3560 Training loss: 1.4131 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1566/3560 Training loss: 1.4129 0.1285 sec/batch\n", + "Epoch 9/20 Iteration 1567/3560 Training loss: 1.4128 0.1219 sec/batch\n", + "Epoch 9/20 Iteration 1568/3560 Training loss: 1.4129 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1569/3560 Training loss: 1.4127 0.1236 sec/batch\n", + "Epoch 9/20 Iteration 1570/3560 Training loss: 1.4127 0.1238 sec/batch\n", + "Epoch 9/20 Iteration 1571/3560 Training loss: 1.4126 0.1224 sec/batch\n", + "Epoch 9/20 Iteration 1572/3560 Training loss: 1.4128 0.1237 sec/batch\n", + "Epoch 9/20 Iteration 1573/3560 Training loss: 1.4128 0.1248 sec/batch\n", + "Epoch 9/20 Iteration 1574/3560 Training loss: 1.4126 0.1217 sec/batch\n", + "Epoch 9/20 Iteration 1575/3560 Training loss: 1.4122 0.1238 sec/batch\n", + "Epoch 9/20 Iteration 1576/3560 Training loss: 1.4120 0.1251 sec/batch\n", + "Epoch 9/20 Iteration 1577/3560 Training loss: 1.4120 0.1211 sec/batch\n", + "Epoch 9/20 Iteration 1578/3560 Training loss: 1.4119 0.1234 sec/batch\n", + "Epoch 9/20 Iteration 1579/3560 Training loss: 1.4118 0.1239 sec/batch\n", + "Epoch 9/20 Iteration 1580/3560 Training loss: 1.4117 0.1223 sec/batch\n", + "Epoch 9/20 Iteration 1581/3560 Training loss: 1.4116 0.1247 sec/batch\n", + "Epoch 9/20 Iteration 1582/3560 Training loss: 1.4114 0.1236 sec/batch\n", + "Epoch 9/20 Iteration 1583/3560 Training loss: 1.4111 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1584/3560 Training loss: 1.4111 0.1218 sec/batch\n", + "Epoch 9/20 Iteration 1585/3560 Training 
loss: 1.4113 0.1239 sec/batch\n", + "Epoch 9/20 Iteration 1586/3560 Training loss: 1.4112 0.1233 sec/batch\n", + "Epoch 9/20 Iteration 1587/3560 Training loss: 1.4111 0.1238 sec/batch\n", + "Epoch 9/20 Iteration 1588/3560 Training loss: 1.4110 0.1234 sec/batch\n", + "Epoch 9/20 Iteration 1589/3560 Training loss: 1.4109 0.1267 sec/batch\n", + "Epoch 9/20 Iteration 1590/3560 Training loss: 1.4107 0.1231 sec/batch\n", + "Epoch 9/20 Iteration 1591/3560 Training loss: 1.4108 0.1219 sec/batch\n", + "Epoch 9/20 Iteration 1592/3560 Training loss: 1.4112 0.1285 sec/batch\n", + "Epoch 9/20 Iteration 1593/3560 Training loss: 1.4111 0.1225 sec/batch\n", + "Epoch 9/20 Iteration 1594/3560 Training loss: 1.4111 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1595/3560 Training loss: 1.4109 0.1225 sec/batch\n", + "Epoch 9/20 Iteration 1596/3560 Training loss: 1.4107 0.1248 sec/batch\n", + "Epoch 9/20 Iteration 1597/3560 Training loss: 1.4108 0.1252 sec/batch\n", + "Epoch 9/20 Iteration 1598/3560 Training loss: 1.4108 0.1252 sec/batch\n", + "Epoch 9/20 Iteration 1599/3560 Training loss: 1.4108 0.1215 sec/batch\n", + "Epoch 9/20 Iteration 1600/3560 Training loss: 1.4106 0.1247 sec/batch\n", + "Epoch 9/20 Iteration 1601/3560 Training loss: 1.4103 0.1207 sec/batch\n", + "Epoch 9/20 Iteration 1602/3560 Training loss: 1.4103 0.1221 sec/batch\n", + "Epoch 10/20 Iteration 1603/3560 Training loss: 1.5154 0.1272 sec/batch\n", + "Epoch 10/20 Iteration 1604/3560 Training loss: 1.4662 0.1227 sec/batch\n", + "Epoch 10/20 Iteration 1605/3560 Training loss: 1.4399 0.1235 sec/batch\n", + "Epoch 10/20 Iteration 1606/3560 Training loss: 1.4324 0.1233 sec/batch\n", + "Epoch 10/20 Iteration 1607/3560 Training loss: 1.4201 0.1259 sec/batch\n", + "Epoch 10/20 Iteration 1608/3560 Training loss: 1.4094 0.1243 sec/batch\n", + "Epoch 10/20 Iteration 1609/3560 Training loss: 1.4085 0.1239 sec/batch\n", + "Epoch 10/20 Iteration 1610/3560 Training loss: 1.4050 0.1250 sec/batch\n", + "Epoch 10/20 Iteration 
1611/3560 Training loss: 1.4031 0.1257 sec/batch\n", + "Epoch 10/20 Iteration 1612/3560 Training loss: 1.4012 0.1230 sec/batch\n", + "Epoch 10/20 Iteration 1613/3560 Training loss: 1.3967 0.1238 sec/batch\n", + "Epoch 10/20 Iteration 1614/3560 Training loss: 1.3958 0.1233 sec/batch\n", + "Epoch 10/20 Iteration 1615/3560 Training loss: 1.3949 0.1224 sec/batch\n", + "Epoch 10/20 Iteration 1616/3560 Training loss: 1.3960 0.1226 sec/batch\n", + "Epoch 10/20 Iteration 1617/3560 Training loss: 1.3951 0.1238 sec/batch\n", + "Epoch 10/20 Iteration 1618/3560 Training loss: 1.3927 0.1251 sec/batch\n", + "Epoch 10/20 Iteration 1619/3560 Training loss: 1.3923 0.1261 sec/batch\n", + "Epoch 10/20 Iteration 1620/3560 Training loss: 1.3935 0.1213 sec/batch\n", + "Epoch 10/20 Iteration 1621/3560 Training loss: 1.3933 0.1251 sec/batch\n", + "Epoch 10/20 Iteration 1622/3560 Training loss: 1.3944 0.1232 sec/batch\n", + "Epoch 10/20 Iteration 1623/3560 Training loss: 1.3936 0.1224 sec/batch\n", + "Epoch 10/20 Iteration 1624/3560 Training loss: 1.3939 0.1233 sec/batch\n", + "Epoch 10/20 Iteration 1625/3560 Training loss: 1.3931 0.1221 sec/batch\n", + "Epoch 10/20 Iteration 1626/3560 Training loss: 1.3929 0.1256 sec/batch\n", + "Epoch 10/20 Iteration 1627/3560 Training loss: 1.3926 0.1239 sec/batch\n", + "Epoch 10/20 Iteration 1628/3560 Training loss: 1.3908 0.1229 sec/batch\n", + "Epoch 10/20 Iteration 1629/3560 Training loss: 1.3894 0.1225 sec/batch\n", + "Epoch 10/20 Iteration 1630/3560 Training loss: 1.3898 0.1248 sec/batch\n", + "Epoch 10/20 Iteration 1631/3560 Training loss: 1.3900 0.1221 sec/batch\n", + "Epoch 10/20 Iteration 1632/3560 Training loss: 1.3901 0.1214 sec/batch\n", + "Epoch 10/20 Iteration 1633/3560 Training loss: 1.3897 0.1241 sec/batch\n", + "Epoch 10/20 Iteration 1634/3560 Training loss: 1.3884 0.1252 sec/batch\n", + "Epoch 10/20 Iteration 1635/3560 Training loss: 1.3888 0.1242 sec/batch\n", + "Epoch 10/20 Iteration 1636/3560 Training loss: 1.3891 0.1230 
sec/batch\n", + "[... training log truncated: Epoch 10/20 through Epoch 13/20, iterations 1637-2148 of 3560; training loss decreases from about 1.389 to about 1.315 at roughly 0.12 sec/batch ...]\n", + "Epoch 13/20 Iteration 2148/3560 Training loss: 1.3154 0.1216
sec/batch\n", + "Epoch 13/20 Iteration 2149/3560 Training loss: 1.3152 0.1246 sec/batch\n", + "Epoch 13/20 Iteration 2150/3560 Training loss: 1.3158 0.1229 sec/batch\n", + "Epoch 13/20 Iteration 2151/3560 Training loss: 1.3141 0.1245 sec/batch\n", + "Epoch 13/20 Iteration 2152/3560 Training loss: 1.3114 0.1277 sec/batch\n", + "Epoch 13/20 Iteration 2153/3560 Training loss: 1.3115 0.1247 sec/batch\n", + "Epoch 13/20 Iteration 2154/3560 Training loss: 1.3127 0.1236 sec/batch\n", + "Epoch 13/20 Iteration 2155/3560 Training loss: 1.3121 0.1226 sec/batch\n", + "Epoch 13/20 Iteration 2156/3560 Training loss: 1.3129 0.1265 sec/batch\n", + "Epoch 13/20 Iteration 2157/3560 Training loss: 1.3123 0.1238 sec/batch\n", + "Epoch 13/20 Iteration 2158/3560 Training loss: 1.3123 0.1235 sec/batch\n", + "Epoch 13/20 Iteration 2159/3560 Training loss: 1.3115 0.1215 sec/batch\n", + "Epoch 13/20 Iteration 2160/3560 Training loss: 1.3116 0.1286 sec/batch\n", + "Epoch 13/20 Iteration 2161/3560 Training loss: 1.3111 0.1232 sec/batch\n", + "Epoch 13/20 Iteration 2162/3560 Training loss: 1.3092 0.1334 sec/batch\n", + "Epoch 13/20 Iteration 2163/3560 Training loss: 1.3073 0.1210 sec/batch\n", + "Epoch 13/20 Iteration 2164/3560 Training loss: 1.3077 0.1221 sec/batch\n", + "Epoch 13/20 Iteration 2165/3560 Training loss: 1.3076 0.1212 sec/batch\n", + "Epoch 13/20 Iteration 2166/3560 Training loss: 1.3077 0.1328 sec/batch\n", + "Epoch 13/20 Iteration 2167/3560 Training loss: 1.3069 0.1218 sec/batch\n", + "Epoch 13/20 Iteration 2168/3560 Training loss: 1.3055 0.1213 sec/batch\n", + "Epoch 13/20 Iteration 2169/3560 Training loss: 1.3057 0.1224 sec/batch\n", + "Epoch 13/20 Iteration 2170/3560 Training loss: 1.3052 0.1258 sec/batch\n", + "Epoch 13/20 Iteration 2171/3560 Training loss: 1.3047 0.1205 sec/batch\n", + "Epoch 13/20 Iteration 2172/3560 Training loss: 1.3042 0.1239 sec/batch\n", + "Epoch 13/20 Iteration 2173/3560 Training loss: 1.3032 0.1250 sec/batch\n", + "Epoch 13/20 Iteration 2174/3560 
Training loss: 1.3017 0.1241 sec/batch\n", + "Epoch 13/20 Iteration 2175/3560 Training loss: 1.3003 0.1230 sec/batch\n", + "Epoch 13/20 Iteration 2176/3560 Training loss: 1.2999 0.1280 sec/batch\n", + "Epoch 13/20 Iteration 2177/3560 Training loss: 1.2992 0.1239 sec/batch\n", + "Epoch 13/20 Iteration 2178/3560 Training loss: 1.2996 0.1261 sec/batch\n", + "Epoch 13/20 Iteration 2179/3560 Training loss: 1.2993 0.1217 sec/batch\n", + "Epoch 13/20 Iteration 2180/3560 Training loss: 1.2986 0.1289 sec/batch\n", + "Epoch 13/20 Iteration 2181/3560 Training loss: 1.2987 0.1267 sec/batch\n", + "Epoch 13/20 Iteration 2182/3560 Training loss: 1.2978 0.1260 sec/batch\n", + "Epoch 13/20 Iteration 2183/3560 Training loss: 1.2973 0.1236 sec/batch\n", + "Epoch 13/20 Iteration 2184/3560 Training loss: 1.2969 0.1231 sec/batch\n", + "Epoch 13/20 Iteration 2185/3560 Training loss: 1.2970 0.1212 sec/batch\n", + "Epoch 13/20 Iteration 2186/3560 Training loss: 1.2971 0.1257 sec/batch\n", + "Epoch 13/20 Iteration 2187/3560 Training loss: 1.2966 0.1232 sec/batch\n", + "Epoch 13/20 Iteration 2188/3560 Training loss: 1.2973 0.1260 sec/batch\n", + "Epoch 13/20 Iteration 2189/3560 Training loss: 1.2972 0.1243 sec/batch\n", + "Epoch 13/20 Iteration 2190/3560 Training loss: 1.2973 0.1222 sec/batch\n", + "Epoch 13/20 Iteration 2191/3560 Training loss: 1.2971 0.1226 sec/batch\n", + "Epoch 13/20 Iteration 2192/3560 Training loss: 1.2971 0.1222 sec/batch\n", + "Epoch 13/20 Iteration 2193/3560 Training loss: 1.2975 0.1240 sec/batch\n", + "Epoch 13/20 Iteration 2194/3560 Training loss: 1.2972 0.1253 sec/batch\n", + "Epoch 13/20 Iteration 2195/3560 Training loss: 1.2965 0.1241 sec/batch\n", + "Epoch 13/20 Iteration 2196/3560 Training loss: 1.2971 0.1222 sec/batch\n", + "Epoch 13/20 Iteration 2197/3560 Training loss: 1.2970 0.1250 sec/batch\n", + "Epoch 13/20 Iteration 2198/3560 Training loss: 1.2978 0.1263 sec/batch\n", + "Epoch 13/20 Iteration 2199/3560 Training loss: 1.2980 0.1255 sec/batch\n", + 
"Epoch 13/20 Iteration 2200/3560 Training loss: 1.2979 0.1227 sec/batch\n", + "Epoch 13/20 Iteration 2201/3560 Training loss: 1.2977 0.1294 sec/batch\n", + "Epoch 13/20 Iteration 2202/3560 Training loss: 1.2978 0.1242 sec/batch\n", + "Epoch 13/20 Iteration 2203/3560 Training loss: 1.2979 0.1229 sec/batch\n", + "Epoch 13/20 Iteration 2204/3560 Training loss: 1.2976 0.1239 sec/batch\n", + "Epoch 13/20 Iteration 2205/3560 Training loss: 1.2978 0.1229 sec/batch\n", + "Epoch 13/20 Iteration 2206/3560 Training loss: 1.2976 0.1236 sec/batch\n", + "Epoch 13/20 Iteration 2207/3560 Training loss: 1.2980 0.1225 sec/batch\n", + "Epoch 13/20 Iteration 2208/3560 Training loss: 1.2982 0.1255 sec/batch\n", + "Epoch 13/20 Iteration 2209/3560 Training loss: 1.2987 0.1229 sec/batch\n", + "Epoch 13/20 Iteration 2210/3560 Training loss: 1.2983 0.1224 sec/batch\n", + "Epoch 13/20 Iteration 2211/3560 Training loss: 1.2981 0.1251 sec/batch\n", + "Epoch 13/20 Iteration 2212/3560 Training loss: 1.2982 0.1264 sec/batch\n", + "Epoch 13/20 Iteration 2213/3560 Training loss: 1.2980 0.1250 sec/batch\n", + "Epoch 13/20 Iteration 2214/3560 Training loss: 1.2978 0.1266 sec/batch\n", + "Epoch 13/20 Iteration 2215/3560 Training loss: 1.2971 0.1235 sec/batch\n", + "Epoch 13/20 Iteration 2216/3560 Training loss: 1.2970 0.1232 sec/batch\n", + "Epoch 13/20 Iteration 2217/3560 Training loss: 1.2965 0.1236 sec/batch\n", + "Epoch 13/20 Iteration 2218/3560 Training loss: 1.2963 0.1233 sec/batch\n", + "Epoch 13/20 Iteration 2219/3560 Training loss: 1.2959 0.1326 sec/batch\n", + "Epoch 13/20 Iteration 2220/3560 Training loss: 1.2958 0.1256 sec/batch\n", + "Epoch 13/20 Iteration 2221/3560 Training loss: 1.2955 0.1228 sec/batch\n", + "Epoch 13/20 Iteration 2222/3560 Training loss: 1.2954 0.1231 sec/batch\n", + "Epoch 13/20 Iteration 2223/3560 Training loss: 1.2950 0.1258 sec/batch\n", + "Epoch 13/20 Iteration 2224/3560 Training loss: 1.2947 0.1247 sec/batch\n", + "Epoch 13/20 Iteration 2225/3560 Training loss: 
1.2944 0.1241 sec/batch\n", + "Epoch 13/20 Iteration 2226/3560 Training loss: 1.2943 0.1220 sec/batch\n", + "Epoch 13/20 Iteration 2227/3560 Training loss: 1.2940 0.1232 sec/batch\n", + "Epoch 13/20 Iteration 2228/3560 Training loss: 1.2939 0.1222 sec/batch\n", + "Epoch 13/20 Iteration 2229/3560 Training loss: 1.2935 0.1244 sec/batch\n", + "Epoch 13/20 Iteration 2230/3560 Training loss: 1.2932 0.1225 sec/batch\n", + "Epoch 13/20 Iteration 2231/3560 Training loss: 1.2929 0.1247 sec/batch\n", + "Epoch 13/20 Iteration 2232/3560 Training loss: 1.2929 0.1250 sec/batch\n", + "Epoch 13/20 Iteration 2233/3560 Training loss: 1.2929 0.1240 sec/batch\n", + "Epoch 13/20 Iteration 2234/3560 Training loss: 1.2925 0.1278 sec/batch\n", + "Epoch 13/20 Iteration 2235/3560 Training loss: 1.2921 0.1250 sec/batch\n", + "Epoch 13/20 Iteration 2236/3560 Training loss: 1.2918 0.1216 sec/batch\n", + "Epoch 13/20 Iteration 2237/3560 Training loss: 1.2917 0.1271 sec/batch\n", + "Epoch 13/20 Iteration 2238/3560 Training loss: 1.2915 0.1250 sec/batch\n", + "Epoch 13/20 Iteration 2239/3560 Training loss: 1.2913 0.1221 sec/batch\n", + "Epoch 13/20 Iteration 2240/3560 Training loss: 1.2912 0.1247 sec/batch\n", + "Epoch 13/20 Iteration 2241/3560 Training loss: 1.2909 0.1224 sec/batch\n", + "Epoch 13/20 Iteration 2242/3560 Training loss: 1.2908 0.1245 sec/batch\n", + "Epoch 13/20 Iteration 2243/3560 Training loss: 1.2907 0.1217 sec/batch\n", + "Epoch 13/20 Iteration 2244/3560 Training loss: 1.2906 0.1241 sec/batch\n", + "Epoch 13/20 Iteration 2245/3560 Training loss: 1.2904 0.1247 sec/batch\n", + "Epoch 13/20 Iteration 2246/3560 Training loss: 1.2904 0.1259 sec/batch\n", + "Epoch 13/20 Iteration 2247/3560 Training loss: 1.2902 0.1222 sec/batch\n", + "Epoch 13/20 Iteration 2248/3560 Training loss: 1.2901 0.1265 sec/batch\n", + "Epoch 13/20 Iteration 2249/3560 Training loss: 1.2899 0.1219 sec/batch\n", + "Epoch 13/20 Iteration 2250/3560 Training loss: 1.2897 0.1250 sec/batch\n", + "Epoch 13/20 
Iteration 2251/3560 Training loss: 1.2894 0.1235 sec/batch\n", + "Epoch 13/20 Iteration 2252/3560 Training loss: 1.2891 0.1228 sec/batch\n", + "Epoch 13/20 Iteration 2253/3560 Training loss: 1.2890 0.1231 sec/batch\n", + "Epoch 13/20 Iteration 2254/3560 Training loss: 1.2890 0.1272 sec/batch\n", + "Epoch 13/20 Iteration 2255/3560 Training loss: 1.2888 0.1345 sec/batch\n", + "Epoch 13/20 Iteration 2256/3560 Training loss: 1.2887 0.1221 sec/batch\n", + "Epoch 13/20 Iteration 2257/3560 Training loss: 1.2886 0.1243 sec/batch\n", + "Epoch 13/20 Iteration 2258/3560 Training loss: 1.2881 0.1253 sec/batch\n", + "Epoch 13/20 Iteration 2259/3560 Training loss: 1.2878 0.1255 sec/batch\n", + "Epoch 13/20 Iteration 2260/3560 Training loss: 1.2878 0.1219 sec/batch\n", + "Epoch 13/20 Iteration 2261/3560 Training loss: 1.2877 0.1240 sec/batch\n", + "Epoch 13/20 Iteration 2262/3560 Training loss: 1.2873 0.1233 sec/batch\n", + "Epoch 13/20 Iteration 2263/3560 Training loss: 1.2873 0.1219 sec/batch\n", + "Epoch 13/20 Iteration 2264/3560 Training loss: 1.2872 0.1229 sec/batch\n", + "Epoch 13/20 Iteration 2265/3560 Training loss: 1.2870 0.1219 sec/batch\n", + "Epoch 13/20 Iteration 2266/3560 Training loss: 1.2867 0.1250 sec/batch\n", + "Epoch 13/20 Iteration 2267/3560 Training loss: 1.2863 0.1238 sec/batch\n", + "Epoch 13/20 Iteration 2268/3560 Training loss: 1.2861 0.1256 sec/batch\n", + "Epoch 13/20 Iteration 2269/3560 Training loss: 1.2862 0.1238 sec/batch\n", + "Epoch 13/20 Iteration 2270/3560 Training loss: 1.2862 0.1257 sec/batch\n", + "Epoch 13/20 Iteration 2271/3560 Training loss: 1.2861 0.1227 sec/batch\n", + "Epoch 13/20 Iteration 2272/3560 Training loss: 1.2862 0.1252 sec/batch\n", + "Epoch 13/20 Iteration 2273/3560 Training loss: 1.2864 0.1237 sec/batch\n", + "Epoch 13/20 Iteration 2274/3560 Training loss: 1.2865 0.1288 sec/batch\n", + "Epoch 13/20 Iteration 2275/3560 Training loss: 1.2864 0.1244 sec/batch\n", + "Epoch 13/20 Iteration 2276/3560 Training loss: 1.2864 0.1258 
sec/batch\n", + "Epoch 13/20 Iteration 2277/3560 Training loss: 1.2868 0.1209 sec/batch\n", + "Epoch 13/20 Iteration 2278/3560 Training loss: 1.2868 0.1316 sec/batch\n", + "Epoch 13/20 Iteration 2279/3560 Training loss: 1.2866 0.1244 sec/batch\n", + "Epoch 13/20 Iteration 2280/3560 Training loss: 1.2869 0.1214 sec/batch\n", + "Epoch 13/20 Iteration 2281/3560 Training loss: 1.2867 0.1233 sec/batch\n", + "Epoch 13/20 Iteration 2282/3560 Training loss: 1.2869 0.1256 sec/batch\n", + "Epoch 13/20 Iteration 2283/3560 Training loss: 1.2870 0.1217 sec/batch\n", + "Epoch 13/20 Iteration 2284/3560 Training loss: 1.2871 0.1218 sec/batch\n", + "Epoch 13/20 Iteration 2285/3560 Training loss: 1.2872 0.1242 sec/batch\n", + "Epoch 13/20 Iteration 2286/3560 Training loss: 1.2870 0.1247 sec/batch\n", + "Epoch 13/20 Iteration 2287/3560 Training loss: 1.2867 0.1215 sec/batch\n", + "Epoch 13/20 Iteration 2288/3560 Training loss: 1.2865 0.1211 sec/batch\n", + "Epoch 13/20 Iteration 2289/3560 Training loss: 1.2866 0.1249 sec/batch\n", + "Epoch 13/20 Iteration 2290/3560 Training loss: 1.2865 0.1218 sec/batch\n", + "Epoch 13/20 Iteration 2291/3560 Training loss: 1.2865 0.1233 sec/batch\n", + "Epoch 13/20 Iteration 2292/3560 Training loss: 1.2864 0.1242 sec/batch\n", + "Epoch 13/20 Iteration 2293/3560 Training loss: 1.2864 0.1220 sec/batch\n", + "Epoch 13/20 Iteration 2294/3560 Training loss: 1.2863 0.1221 sec/batch\n", + "Epoch 13/20 Iteration 2295/3560 Training loss: 1.2860 0.1242 sec/batch\n", + "Epoch 13/20 Iteration 2296/3560 Training loss: 1.2861 0.1260 sec/batch\n", + "Epoch 13/20 Iteration 2297/3560 Training loss: 1.2864 0.1215 sec/batch\n", + "Epoch 13/20 Iteration 2298/3560 Training loss: 1.2864 0.1216 sec/batch\n", + "Epoch 13/20 Iteration 2299/3560 Training loss: 1.2864 0.1235 sec/batch\n", + "Epoch 13/20 Iteration 2300/3560 Training loss: 1.2863 0.1222 sec/batch\n", + "Epoch 13/20 Iteration 2301/3560 Training loss: 1.2863 0.1253 sec/batch\n", + "Epoch 13/20 Iteration 2302/3560 
Training loss: 1.2862 0.1223 sec/batch\n", + "Epoch 13/20 Iteration 2303/3560 Training loss: 1.2863 0.1244 sec/batch\n", + "Epoch 13/20 Iteration 2304/3560 Training loss: 1.2867 0.1225 sec/batch\n", + "Epoch 13/20 Iteration 2305/3560 Training loss: 1.2867 0.1257 sec/batch\n", + "Epoch 13/20 Iteration 2306/3560 Training loss: 1.2867 0.1246 sec/batch\n", + "Epoch 13/20 Iteration 2307/3560 Training loss: 1.2866 0.1247 sec/batch\n", + "Epoch 13/20 Iteration 2308/3560 Training loss: 1.2865 0.1212 sec/batch\n", + "Epoch 13/20 Iteration 2309/3560 Training loss: 1.2866 0.1235 sec/batch\n", + "Epoch 13/20 Iteration 2310/3560 Training loss: 1.2865 0.1230 sec/batch\n", + "Epoch 13/20 Iteration 2311/3560 Training loss: 1.2865 0.1218 sec/batch\n", + "Epoch 13/20 Iteration 2312/3560 Training loss: 1.2864 0.1231 sec/batch\n", + "Epoch 13/20 Iteration 2313/3560 Training loss: 1.2862 0.1234 sec/batch\n", + "Epoch 13/20 Iteration 2314/3560 Training loss: 1.2864 0.1256 sec/batch\n", + "Epoch 14/20 Iteration 2315/3560 Training loss: 1.4078 0.1240 sec/batch\n", + "Epoch 14/20 Iteration 2316/3560 Training loss: 1.3547 0.1240 sec/batch\n", + "Epoch 14/20 Iteration 2317/3560 Training loss: 1.3361 0.1230 sec/batch\n", + "Epoch 14/20 Iteration 2318/3560 Training loss: 1.3267 0.1218 sec/batch\n", + "Epoch 14/20 Iteration 2319/3560 Training loss: 1.3146 0.1249 sec/batch\n", + "Epoch 14/20 Iteration 2320/3560 Training loss: 1.3024 0.1225 sec/batch\n", + "Epoch 14/20 Iteration 2321/3560 Training loss: 1.3001 0.1239 sec/batch\n", + "Epoch 14/20 Iteration 2322/3560 Training loss: 1.2979 0.1252 sec/batch\n", + "Epoch 14/20 Iteration 2323/3560 Training loss: 1.2970 0.1253 sec/batch\n", + "Epoch 14/20 Iteration 2324/3560 Training loss: 1.2960 0.1223 sec/batch\n", + "Epoch 14/20 Iteration 2325/3560 Training loss: 1.2922 0.1257 sec/batch\n", + "Epoch 14/20 Iteration 2326/3560 Training loss: 1.2909 0.1231 sec/batch\n", + "Epoch 14/20 Iteration 2327/3560 Training loss: 1.2909 0.1234 sec/batch\n", + 
"Epoch 14/20 Iteration 2328/3560 Training loss: 1.2914 0.1261 sec/batch\n", + "Epoch 14/20 Iteration 2329/3560 Training loss: 1.2892 0.1221 sec/batch\n", + "Epoch 14/20 Iteration 2330/3560 Training loss: 1.2866 0.1229 sec/batch\n", + "Epoch 14/20 Iteration 2331/3560 Training loss: 1.2865 0.1247 sec/batch\n", + "Epoch 14/20 Iteration 2332/3560 Training loss: 1.2874 0.1253 sec/batch\n", + "Epoch 14/20 Iteration 2333/3560 Training loss: 1.2872 0.1243 sec/batch\n", + "Epoch 14/20 Iteration 2334/3560 Training loss: 1.2883 0.1343 sec/batch\n", + "Epoch 14/20 Iteration 2335/3560 Training loss: 1.2877 0.1221 sec/batch\n", + "Epoch 14/20 Iteration 2336/3560 Training loss: 1.2875 0.1223 sec/batch\n", + "Epoch 14/20 Iteration 2337/3560 Training loss: 1.2868 0.1254 sec/batch\n", + "Epoch 14/20 Iteration 2338/3560 Training loss: 1.2865 0.1223 sec/batch\n", + "Epoch 14/20 Iteration 2339/3560 Training loss: 1.2861 0.1249 sec/batch\n", + "Epoch 14/20 Iteration 2340/3560 Training loss: 1.2841 0.1249 sec/batch\n", + "Epoch 14/20 Iteration 2341/3560 Training loss: 1.2827 0.1234 sec/batch\n", + "Epoch 14/20 Iteration 2342/3560 Training loss: 1.2831 0.1221 sec/batch\n", + "Epoch 14/20 Iteration 2343/3560 Training loss: 1.2830 0.1254 sec/batch\n", + "Epoch 14/20 Iteration 2344/3560 Training loss: 1.2835 0.1236 sec/batch\n", + "Epoch 14/20 Iteration 2345/3560 Training loss: 1.2827 0.1229 sec/batch\n", + "Epoch 14/20 Iteration 2346/3560 Training loss: 1.2818 0.1251 sec/batch\n", + "Epoch 14/20 Iteration 2347/3560 Training loss: 1.2816 0.1232 sec/batch\n", + "Epoch 14/20 Iteration 2348/3560 Training loss: 1.2816 0.1247 sec/batch\n", + "Epoch 14/20 Iteration 2349/3560 Training loss: 1.2813 0.1242 sec/batch\n", + "Epoch 14/20 Iteration 2350/3560 Training loss: 1.2808 0.1233 sec/batch\n", + "Epoch 14/20 Iteration 2351/3560 Training loss: 1.2800 0.1258 sec/batch\n", + "Epoch 14/20 Iteration 2352/3560 Training loss: 1.2787 0.1235 sec/batch\n", + "Epoch 14/20 Iteration 2353/3560 Training loss: 
1.2776 0.1229 sec/batch\n", + "Epoch 14/20 Iteration 2354/3560 Training loss: 1.2771 0.1272 sec/batch\n", + "Epoch 14/20 Iteration 2355/3560 Training loss: 1.2764 0.1233 sec/batch\n", + "Epoch 14/20 Iteration 2356/3560 Training loss: 1.2770 0.1261 sec/batch\n", + "Epoch 14/20 Iteration 2357/3560 Training loss: 1.2766 0.1266 sec/batch\n", + "Epoch 14/20 Iteration 2358/3560 Training loss: 1.2759 0.1242 sec/batch\n", + "Epoch 14/20 Iteration 2359/3560 Training loss: 1.2761 0.1275 sec/batch\n", + "Epoch 14/20 Iteration 2360/3560 Training loss: 1.2753 0.1227 sec/batch\n", + "Epoch 14/20 Iteration 2361/3560 Training loss: 1.2747 0.1232 sec/batch\n", + "Epoch 14/20 Iteration 2362/3560 Training loss: 1.2745 0.1277 sec/batch\n", + "Epoch 14/20 Iteration 2363/3560 Training loss: 1.2745 0.1240 sec/batch\n", + "Epoch 14/20 Iteration 2364/3560 Training loss: 1.2746 0.1233 sec/batch\n", + "Epoch 14/20 Iteration 2365/3560 Training loss: 1.2741 0.1255 sec/batch\n", + "Epoch 14/20 Iteration 2366/3560 Training loss: 1.2746 0.1260 sec/batch\n", + "Epoch 14/20 Iteration 2367/3560 Training loss: 1.2746 0.1218 sec/batch\n", + "Epoch 14/20 Iteration 2368/3560 Training loss: 1.2748 0.1322 sec/batch\n", + "Epoch 14/20 Iteration 2369/3560 Training loss: 1.2747 0.1213 sec/batch\n", + "Epoch 14/20 Iteration 2370/3560 Training loss: 1.2746 0.1246 sec/batch\n", + "Epoch 14/20 Iteration 2371/3560 Training loss: 1.2748 0.1268 sec/batch\n", + "Epoch 14/20 Iteration 2372/3560 Training loss: 1.2746 0.1258 sec/batch\n", + "Epoch 14/20 Iteration 2373/3560 Training loss: 1.2741 0.1222 sec/batch\n", + "Epoch 14/20 Iteration 2374/3560 Training loss: 1.2746 0.1351 sec/batch\n", + "Epoch 14/20 Iteration 2375/3560 Training loss: 1.2744 0.1238 sec/batch\n", + "Epoch 14/20 Iteration 2376/3560 Training loss: 1.2753 0.1245 sec/batch\n", + "Epoch 14/20 Iteration 2377/3560 Training loss: 1.2756 0.1237 sec/batch\n", + "Epoch 14/20 Iteration 2378/3560 Training loss: 1.2756 0.1253 sec/batch\n", + "Epoch 14/20 
Iteration 2379/3560 Training loss: 1.2754 0.1222 sec/batch\n", + "Epoch 14/20 Iteration 2380/3560 Training loss: 1.2754 0.1242 sec/batch\n", + "Epoch 14/20 Iteration 2381/3560 Training loss: 1.2756 0.1248 sec/batch\n", + "Epoch 14/20 Iteration 2382/3560 Training loss: 1.2752 0.1248 sec/batch\n", + "Epoch 14/20 Iteration 2383/3560 Training loss: 1.2753 0.1227 sec/batch\n", + "Epoch 14/20 Iteration 2384/3560 Training loss: 1.2751 0.1243 sec/batch\n", + "Epoch 14/20 Iteration 2385/3560 Training loss: 1.2756 0.1236 sec/batch\n", + "Epoch 14/20 Iteration 2386/3560 Training loss: 1.2759 0.1234 sec/batch\n", + "Epoch 14/20 Iteration 2387/3560 Training loss: 1.2763 0.1242 sec/batch\n", + "Epoch 14/20 Iteration 2388/3560 Training loss: 1.2758 0.1224 sec/batch\n", + "Epoch 14/20 Iteration 2389/3560 Training loss: 1.2756 0.1225 sec/batch\n", + "Epoch 14/20 Iteration 2390/3560 Training loss: 1.2758 0.1245 sec/batch\n", + "Epoch 14/20 Iteration 2391/3560 Training loss: 1.2757 0.1231 sec/batch\n", + "Epoch 14/20 Iteration 2392/3560 Training loss: 1.2755 0.1239 sec/batch\n", + "Epoch 14/20 Iteration 2393/3560 Training loss: 1.2749 0.1226 sec/batch\n", + "Epoch 14/20 Iteration 2394/3560 Training loss: 1.2748 0.1255 sec/batch\n", + "Epoch 14/20 Iteration 2395/3560 Training loss: 1.2744 0.1221 sec/batch\n", + "Epoch 14/20 Iteration 2396/3560 Training loss: 1.2743 0.1246 sec/batch\n", + "Epoch 14/20 Iteration 2397/3560 Training loss: 1.2737 0.1235 sec/batch\n", + "Epoch 14/20 Iteration 2398/3560 Training loss: 1.2737 0.1237 sec/batch\n", + "Epoch 14/20 Iteration 2399/3560 Training loss: 1.2734 0.1240 sec/batch\n", + "Epoch 14/20 Iteration 2400/3560 Training loss: 1.2733 0.1230 sec/batch\n", + "Epoch 14/20 Iteration 2401/3560 Training loss: 1.2731 0.1232 sec/batch\n", + "Epoch 14/20 Iteration 2402/3560 Training loss: 1.2728 0.1244 sec/batch\n", + "Epoch 14/20 Iteration 2403/3560 Training loss: 1.2725 0.1226 sec/batch\n", + "Epoch 14/20 Iteration 2404/3560 Training loss: 1.2725 0.1257 
sec/batch\n", + "Epoch 14/20 Iteration 2405/3560 Training loss: 1.2723 0.1260 sec/batch\n", + "Epoch 14/20 Iteration 2406/3560 Training loss: 1.2723 0.1219 sec/batch\n", + "Epoch 14/20 Iteration 2407/3560 Training loss: 1.2718 0.1229 sec/batch\n", + "Epoch 14/20 Iteration 2408/3560 Training loss: 1.2715 0.1230 sec/batch\n", + "Epoch 14/20 Iteration 2409/3560 Training loss: 1.2712 0.1239 sec/batch\n", + "Epoch 14/20 Iteration 2410/3560 Training loss: 1.2713 0.1274 sec/batch\n", + "Epoch 14/20 Iteration 2411/3560 Training loss: 1.2714 0.1264 sec/batch\n", + "Epoch 14/20 Iteration 2412/3560 Training loss: 1.2710 0.1215 sec/batch\n", + "Epoch 14/20 Iteration 2413/3560 Training loss: 1.2706 0.1229 sec/batch\n", + "Epoch 14/20 Iteration 2414/3560 Training loss: 1.2702 0.1254 sec/batch\n", + "Epoch 14/20 Iteration 2415/3560 Training loss: 1.2702 0.1241 sec/batch\n", + "Epoch 14/20 Iteration 2416/3560 Training loss: 1.2700 0.1244 sec/batch\n", + "Epoch 14/20 Iteration 2417/3560 Training loss: 1.2700 0.1272 sec/batch\n", + "Epoch 14/20 Iteration 2418/3560 Training loss: 1.2698 0.1257 sec/batch\n", + "Epoch 14/20 Iteration 2419/3560 Training loss: 1.2695 0.1325 sec/batch\n", + "Epoch 14/20 Iteration 2420/3560 Training loss: 1.2695 0.1254 sec/batch\n", + "Epoch 14/20 Iteration 2421/3560 Training loss: 1.2695 0.1219 sec/batch\n", + "Epoch 14/20 Iteration 2422/3560 Training loss: 1.2695 0.1230 sec/batch\n", + "Epoch 14/20 Iteration 2423/3560 Training loss: 1.2693 0.1242 sec/batch\n", + "Epoch 14/20 Iteration 2424/3560 Training loss: 1.2694 0.1234 sec/batch\n", + "Epoch 14/20 Iteration 2425/3560 Training loss: 1.2691 0.1257 sec/batch\n", + "Epoch 14/20 Iteration 2426/3560 Training loss: 1.2691 0.1261 sec/batch\n", + "Epoch 14/20 Iteration 2427/3560 Training loss: 1.2691 0.1258 sec/batch\n", + "Epoch 14/20 Iteration 2428/3560 Training loss: 1.2689 0.1263 sec/batch\n", + "Epoch 14/20 Iteration 2429/3560 Training loss: 1.2686 0.1231 sec/batch\n", + "Epoch 14/20 Iteration 2430/3560 
Training loss: 1.2683 0.1223 sec/batch\n", + "Epoch 14/20 Iteration 2431/3560 Training loss: 1.2683 0.1228 sec/batch\n", + "Epoch 14/20 Iteration 2432/3560 Training loss: 1.2683 0.1258 sec/batch\n", + "Epoch 14/20 Iteration 2433/3560 Training loss: 1.2682 0.1223 sec/batch\n", + "Epoch 14/20 Iteration 2434/3560 Training loss: 1.2681 0.1278 sec/batch\n", + "Epoch 14/20 Iteration 2435/3560 Training loss: 1.2681 0.1244 sec/batch\n", + "Epoch 14/20 Iteration 2436/3560 Training loss: 1.2677 0.1240 sec/batch\n", + "Epoch 14/20 Iteration 2437/3560 Training loss: 1.2673 0.1223 sec/batch\n", + "Epoch 14/20 Iteration 2438/3560 Training loss: 1.2673 0.1256 sec/batch\n", + "Epoch 14/20 Iteration 2439/3560 Training loss: 1.2672 0.1214 sec/batch\n", + "Epoch 14/20 Iteration 2440/3560 Training loss: 1.2668 0.1229 sec/batch\n", + "Epoch 14/20 Iteration 2441/3560 Training loss: 1.2669 0.1305 sec/batch\n", + "Epoch 14/20 Iteration 2442/3560 Training loss: 1.2669 0.1216 sec/batch\n", + "Epoch 14/20 Iteration 2443/3560 Training loss: 1.2667 0.1228 sec/batch\n", + "Epoch 14/20 Iteration 2444/3560 Training loss: 1.2663 0.1248 sec/batch\n", + "Epoch 14/20 Iteration 2445/3560 Training loss: 1.2659 0.1233 sec/batch\n", + "Epoch 14/20 Iteration 2446/3560 Training loss: 1.2657 0.1245 sec/batch\n", + "Epoch 14/20 Iteration 2447/3560 Training loss: 1.2658 0.1355 sec/batch\n", + "Epoch 14/20 Iteration 2448/3560 Training loss: 1.2658 0.1237 sec/batch\n", + "Epoch 14/20 Iteration 2449/3560 Training loss: 1.2657 0.1269 sec/batch\n", + "Epoch 14/20 Iteration 2450/3560 Training loss: 1.2657 0.1328 sec/batch\n", + "Epoch 14/20 Iteration 2451/3560 Training loss: 1.2658 0.1277 sec/batch\n", + "Epoch 14/20 Iteration 2452/3560 Training loss: 1.2660 0.1247 sec/batch\n", + "Epoch 14/20 Iteration 2453/3560 Training loss: 1.2660 0.1260 sec/batch\n", + "Epoch 14/20 Iteration 2454/3560 Training loss: 1.2659 0.1211 sec/batch\n", + "Epoch 14/20 Iteration 2455/3560 Training loss: 1.2662 0.1248 sec/batch\n", + 
"Epoch 14/20 Iteration 2456/3560 Training loss: 1.2663 0.1228 sec/batch\n", + "Epoch 14/20 Iteration 2457/3560 Training loss: 1.2662 0.1246 sec/batch\n", + "Epoch 14/20 Iteration 2458/3560 Training loss: 1.2664 0.1230 sec/batch\n", + "Epoch 14/20 Iteration 2459/3560 Training loss: 1.2662 0.1257 sec/batch\n", + "Epoch 14/20 Iteration 2460/3560 Training loss: 1.2664 0.1303 sec/batch\n", + "Epoch 14/20 Iteration 2461/3560 Training loss: 1.2664 0.1247 sec/batch\n", + "Epoch 14/20 Iteration 2462/3560 Training loss: 1.2666 0.1261 sec/batch\n", + "Epoch 14/20 Iteration 2463/3560 Training loss: 1.2668 0.1268 sec/batch\n", + "Epoch 14/20 Iteration 2464/3560 Training loss: 1.2666 0.1252 sec/batch\n", + "Epoch 14/20 Iteration 2465/3560 Training loss: 1.2663 0.1283 sec/batch\n", + "Epoch 14/20 Iteration 2466/3560 Training loss: 1.2661 0.1245 sec/batch\n", + "Epoch 14/20 Iteration 2467/3560 Training loss: 1.2662 0.1234 sec/batch\n", + "Epoch 14/20 Iteration 2468/3560 Training loss: 1.2661 0.1243 sec/batch\n", + "Epoch 14/20 Iteration 2469/3560 Training loss: 1.2661 0.1271 sec/batch\n", + "Epoch 14/20 Iteration 2470/3560 Training loss: 1.2660 0.1230 sec/batch\n", + "Epoch 14/20 Iteration 2471/3560 Training loss: 1.2660 0.1263 sec/batch\n", + "Epoch 14/20 Iteration 2472/3560 Training loss: 1.2659 0.1233 sec/batch\n", + "Epoch 14/20 Iteration 2473/3560 Training loss: 1.2657 0.1237 sec/batch\n", + "Epoch 14/20 Iteration 2474/3560 Training loss: 1.2659 0.1242 sec/batch\n", + "Epoch 14/20 Iteration 2475/3560 Training loss: 1.2661 0.1257 sec/batch\n", + "Epoch 14/20 Iteration 2476/3560 Training loss: 1.2661 0.1220 sec/batch\n", + "Epoch 14/20 Iteration 2477/3560 Training loss: 1.2660 0.1214 sec/batch\n", + "Epoch 14/20 Iteration 2478/3560 Training loss: 1.2660 0.1252 sec/batch\n", + "Epoch 14/20 Iteration 2479/3560 Training loss: 1.2660 0.1253 sec/batch\n", + "Epoch 14/20 Iteration 2480/3560 Training loss: 1.2659 0.1250 sec/batch\n", + "Epoch 14/20 Iteration 2481/3560 Training loss: 
1.2660 0.1270 sec/batch\n", + "Epoch 14/20 Iteration 2482/3560 Training loss: 1.2664 0.1231 sec/batch\n", + "Epoch 14/20 Iteration 2483/3560 Training loss: 1.2664 0.1263 sec/batch\n", + "Epoch 14/20 Iteration 2484/3560 Training loss: 1.2664 0.1231 sec/batch\n", + "Epoch 14/20 Iteration 2485/3560 Training loss: 1.2663 0.1264 sec/batch\n", + "Epoch 14/20 Iteration 2486/3560 Training loss: 1.2662 0.1222 sec/batch\n", + "Epoch 14/20 Iteration 2487/3560 Training loss: 1.2664 0.1227 sec/batch\n", + "Epoch 14/20 Iteration 2488/3560 Training loss: 1.2663 0.1240 sec/batch\n", + "Epoch 14/20 Iteration 2489/3560 Training loss: 1.2664 0.1301 sec/batch\n", + "Epoch 14/20 Iteration 2490/3560 Training loss: 1.2662 0.1213 sec/batch\n", + "Epoch 14/20 Iteration 2491/3560 Training loss: 1.2661 0.1218 sec/batch\n", + "Epoch 14/20 Iteration 2492/3560 Training loss: 1.2662 0.1238 sec/batch\n", + "Epoch 15/20 Iteration 2493/3560 Training loss: 1.3883 0.1241 sec/batch\n", + "Epoch 15/20 Iteration 2494/3560 Training loss: 1.3378 0.1234 sec/batch\n", + "Epoch 15/20 Iteration 2495/3560 Training loss: 1.3192 0.1290 sec/batch\n", + "Epoch 15/20 Iteration 2496/3560 Training loss: 1.3098 0.1271 sec/batch\n", + "Epoch 15/20 Iteration 2497/3560 Training loss: 1.2984 0.1245 sec/batch\n", + "Epoch 15/20 Iteration 2498/3560 Training loss: 1.2867 0.1241 sec/batch\n", + "Epoch 15/20 Iteration 2499/3560 Training loss: 1.2852 0.1254 sec/batch\n", + "Epoch 15/20 Iteration 2500/3560 Training loss: 1.2819 0.1236 sec/batch\n", + "Epoch 15/20 Iteration 2501/3560 Training loss: 1.2798 0.1254 sec/batch\n", + "Epoch 15/20 Iteration 2502/3560 Training loss: 1.2785 0.1245 sec/batch\n", + "Epoch 15/20 Iteration 2503/3560 Training loss: 1.2749 0.1222 sec/batch\n", + "Epoch 15/20 Iteration 2504/3560 Training loss: 1.2743 0.1248 sec/batch\n", + "Epoch 15/20 Iteration 2505/3560 Training loss: 1.2745 0.1224 sec/batch\n", + "Epoch 15/20 Iteration 2506/3560 Training loss: 1.2746 0.1220 sec/batch\n", + "Epoch 15/20 
Iteration 2507/3560 Training loss: 1.2731 0.1221 sec/batch\n", + "[... epoch 15/20 output truncated: iterations 2508–2670, training loss declines steadily from 1.2731 to 1.2487 at roughly 0.12 sec/batch ...]\n", + "Epoch 16/20 Iteration 2671/3560 Training loss: 1.3697 0.1261 sec/batch\n", + "[... epoch 16/20 output truncated: iterations 2672–2848, training loss declines from 1.3697 to 1.2325 at roughly 0.12 sec/batch ...]\n", + "Epoch 17/20 Iteration 2849/3560 Training loss: 1.3512 0.1275 sec/batch\n", + "[... epoch 17/20 output truncated: iterations 2850–3018, training loss declines from 1.3512 to 1.2183 at roughly 0.12 sec/batch ...]\n", + "Epoch 17/20 
Iteration 3019/3560 Training loss: 1.2183 0.1257 sec/batch\n", + "Epoch 17/20 Iteration 3020/3560 Training loss: 1.2182 0.1228 sec/batch\n", + "Epoch 17/20 Iteration 3021/3560 Training loss: 1.2183 0.1222 sec/batch\n", + "Epoch 17/20 Iteration 3022/3560 Training loss: 1.2182 0.1211 sec/batch\n", + "Epoch 17/20 Iteration 3023/3560 Training loss: 1.2182 0.1237 sec/batch\n", + "Epoch 17/20 Iteration 3024/3560 Training loss: 1.2180 0.1237 sec/batch\n", + "Epoch 17/20 Iteration 3025/3560 Training loss: 1.2180 0.1261 sec/batch\n", + "Epoch 17/20 Iteration 3026/3560 Training loss: 1.2181 0.1216 sec/batch\n", + "Epoch 18/20 Iteration 3027/3560 Training loss: 1.3264 0.1314 sec/batch\n", + "Epoch 18/20 Iteration 3028/3560 Training loss: 1.2801 0.1277 sec/batch\n", + "Epoch 18/20 Iteration 3029/3560 Training loss: 1.2643 0.1234 sec/batch\n", + "Epoch 18/20 Iteration 3030/3560 Training loss: 1.2572 0.1232 sec/batch\n", + "Epoch 18/20 Iteration 3031/3560 Training loss: 1.2448 0.1233 sec/batch\n", + "Epoch 18/20 Iteration 3032/3560 Training loss: 1.2328 0.1266 sec/batch\n", + "Epoch 18/20 Iteration 3033/3560 Training loss: 1.2316 0.1238 sec/batch\n", + "Epoch 18/20 Iteration 3034/3560 Training loss: 1.2280 0.1251 sec/batch\n", + "Epoch 18/20 Iteration 3035/3560 Training loss: 1.2278 0.1227 sec/batch\n", + "Epoch 18/20 Iteration 3036/3560 Training loss: 1.2271 0.1235 sec/batch\n", + "Epoch 18/20 Iteration 3037/3560 Training loss: 1.2241 0.1235 sec/batch\n", + "Epoch 18/20 Iteration 3038/3560 Training loss: 1.2231 0.1267 sec/batch\n", + "Epoch 18/20 Iteration 3039/3560 Training loss: 1.2225 0.1237 sec/batch\n", + "Epoch 18/20 Iteration 3040/3560 Training loss: 1.2228 0.1219 sec/batch\n", + "Epoch 18/20 Iteration 3041/3560 Training loss: 1.2216 0.1262 sec/batch\n", + "Epoch 18/20 Iteration 3042/3560 Training loss: 1.2193 0.1247 sec/batch\n", + "Epoch 18/20 Iteration 3043/3560 Training loss: 1.2188 0.1213 sec/batch\n", + "Epoch 18/20 Iteration 3044/3560 Training loss: 1.2202 0.1242 
sec/batch\n", + "Epoch 18/20 Iteration 3045/3560 Training loss: 1.2200 0.1376 sec/batch\n", + "Epoch 18/20 Iteration 3046/3560 Training loss: 1.2209 0.1230 sec/batch\n", + "Epoch 18/20 Iteration 3047/3560 Training loss: 1.2207 0.1253 sec/batch\n", + "Epoch 18/20 Iteration 3048/3560 Training loss: 1.2212 0.1223 sec/batch\n", + "Epoch 18/20 Iteration 3049/3560 Training loss: 1.2205 0.1236 sec/batch\n", + "Epoch 18/20 Iteration 3050/3560 Training loss: 1.2207 0.1262 sec/batch\n", + "Epoch 18/20 Iteration 3051/3560 Training loss: 1.2201 0.1243 sec/batch\n", + "Epoch 18/20 Iteration 3052/3560 Training loss: 1.2185 0.1223 sec/batch\n", + "Epoch 18/20 Iteration 3053/3560 Training loss: 1.2172 0.1214 sec/batch\n", + "Epoch 18/20 Iteration 3054/3560 Training loss: 1.2181 0.1247 sec/batch\n", + "Epoch 18/20 Iteration 3055/3560 Training loss: 1.2180 0.1225 sec/batch\n", + "Epoch 18/20 Iteration 3056/3560 Training loss: 1.2181 0.1316 sec/batch\n", + "Epoch 18/20 Iteration 3057/3560 Training loss: 1.2173 0.1251 sec/batch\n", + "Epoch 18/20 Iteration 3058/3560 Training loss: 1.2163 0.1227 sec/batch\n", + "Epoch 18/20 Iteration 3059/3560 Training loss: 1.2163 0.1215 sec/batch\n", + "Epoch 18/20 Iteration 3060/3560 Training loss: 1.2164 0.1237 sec/batch\n", + "Epoch 18/20 Iteration 3061/3560 Training loss: 1.2161 0.1217 sec/batch\n", + "Epoch 18/20 Iteration 3062/3560 Training loss: 1.2158 0.1234 sec/batch\n", + "Epoch 18/20 Iteration 3063/3560 Training loss: 1.2151 0.1212 sec/batch\n", + "Epoch 18/20 Iteration 3064/3560 Training loss: 1.2139 0.1216 sec/batch\n", + "Epoch 18/20 Iteration 3065/3560 Training loss: 1.2127 0.1255 sec/batch\n", + "Epoch 18/20 Iteration 3066/3560 Training loss: 1.2123 0.1254 sec/batch\n", + "Epoch 18/20 Iteration 3067/3560 Training loss: 1.2116 0.1232 sec/batch\n", + "Epoch 18/20 Iteration 3068/3560 Training loss: 1.2124 0.1259 sec/batch\n", + "Epoch 18/20 Iteration 3069/3560 Training loss: 1.2121 0.1223 sec/batch\n", + "Epoch 18/20 Iteration 3070/3560 
Training loss: 1.2114 0.1282 sec/batch\n", + "Epoch 18/20 Iteration 3071/3560 Training loss: 1.2115 0.1288 sec/batch\n", + "Epoch 18/20 Iteration 3072/3560 Training loss: 1.2107 0.1221 sec/batch\n", + "Epoch 18/20 Iteration 3073/3560 Training loss: 1.2104 0.1246 sec/batch\n", + "Epoch 18/20 Iteration 3074/3560 Training loss: 1.2101 0.1235 sec/batch\n", + "Epoch 18/20 Iteration 3075/3560 Training loss: 1.2101 0.1260 sec/batch\n", + "Epoch 18/20 Iteration 3076/3560 Training loss: 1.2104 0.1237 sec/batch\n", + "Epoch 18/20 Iteration 3077/3560 Training loss: 1.2101 0.1238 sec/batch\n", + "Epoch 18/20 Iteration 3078/3560 Training loss: 1.2108 0.1255 sec/batch\n", + "Epoch 18/20 Iteration 3079/3560 Training loss: 1.2106 0.1215 sec/batch\n", + "Epoch 18/20 Iteration 3080/3560 Training loss: 1.2108 0.1212 sec/batch\n", + "Epoch 18/20 Iteration 3081/3560 Training loss: 1.2106 0.1270 sec/batch\n", + "Epoch 18/20 Iteration 3082/3560 Training loss: 1.2105 0.1215 sec/batch\n", + "Epoch 18/20 Iteration 3083/3560 Training loss: 1.2107 0.1221 sec/batch\n", + "Epoch 18/20 Iteration 3084/3560 Training loss: 1.2105 0.1219 sec/batch\n", + "Epoch 18/20 Iteration 3085/3560 Training loss: 1.2099 0.1241 sec/batch\n", + "Epoch 18/20 Iteration 3086/3560 Training loss: 1.2107 0.1255 sec/batch\n", + "Epoch 18/20 Iteration 3087/3560 Training loss: 1.2106 0.1262 sec/batch\n", + "Epoch 18/20 Iteration 3088/3560 Training loss: 1.2112 0.1249 sec/batch\n", + "Epoch 18/20 Iteration 3089/3560 Training loss: 1.2115 0.1234 sec/batch\n", + "Epoch 18/20 Iteration 3090/3560 Training loss: 1.2116 0.1263 sec/batch\n", + "Epoch 18/20 Iteration 3091/3560 Training loss: 1.2114 0.1245 sec/batch\n", + "Epoch 18/20 Iteration 3092/3560 Training loss: 1.2115 0.1241 sec/batch\n", + "Epoch 18/20 Iteration 3093/3560 Training loss: 1.2117 0.1234 sec/batch\n", + "Epoch 18/20 Iteration 3094/3560 Training loss: 1.2114 0.1244 sec/batch\n", + "Epoch 18/20 Iteration 3095/3560 Training loss: 1.2117 0.1276 sec/batch\n", + 
"Epoch 18/20 Iteration 3096/3560 Training loss: 1.2115 0.1239 sec/batch\n", + "Epoch 18/20 Iteration 3097/3560 Training loss: 1.2121 0.1236 sec/batch\n", + "Epoch 18/20 Iteration 3098/3560 Training loss: 1.2123 0.1236 sec/batch\n", + "Epoch 18/20 Iteration 3099/3560 Training loss: 1.2128 0.1235 sec/batch\n", + "Epoch 18/20 Iteration 3100/3560 Training loss: 1.2124 0.1301 sec/batch\n", + "Epoch 18/20 Iteration 3101/3560 Training loss: 1.2124 0.1215 sec/batch\n", + "Epoch 18/20 Iteration 3102/3560 Training loss: 1.2126 0.1222 sec/batch\n", + "Epoch 18/20 Iteration 3103/3560 Training loss: 1.2125 0.1246 sec/batch\n", + "Epoch 18/20 Iteration 3104/3560 Training loss: 1.2126 0.1248 sec/batch\n", + "Epoch 18/20 Iteration 3105/3560 Training loss: 1.2120 0.1229 sec/batch\n", + "Epoch 18/20 Iteration 3106/3560 Training loss: 1.2119 0.1230 sec/batch\n", + "Epoch 18/20 Iteration 3107/3560 Training loss: 1.2114 0.1231 sec/batch\n", + "Epoch 18/20 Iteration 3108/3560 Training loss: 1.2114 0.1251 sec/batch\n", + "Epoch 18/20 Iteration 3109/3560 Training loss: 1.2110 0.1218 sec/batch\n", + "Epoch 18/20 Iteration 3110/3560 Training loss: 1.2110 0.1240 sec/batch\n", + "Epoch 18/20 Iteration 3111/3560 Training loss: 1.2107 0.1215 sec/batch\n", + "Epoch 18/20 Iteration 3112/3560 Training loss: 1.2106 0.1257 sec/batch\n", + "Epoch 18/20 Iteration 3113/3560 Training loss: 1.2105 0.1246 sec/batch\n", + "Epoch 18/20 Iteration 3114/3560 Training loss: 1.2103 0.1225 sec/batch\n", + "Epoch 18/20 Iteration 3115/3560 Training loss: 1.2100 0.1290 sec/batch\n", + "Epoch 18/20 Iteration 3116/3560 Training loss: 1.2102 0.1214 sec/batch\n", + "Epoch 18/20 Iteration 3117/3560 Training loss: 1.2100 0.1251 sec/batch\n", + "Epoch 18/20 Iteration 3118/3560 Training loss: 1.2100 0.1243 sec/batch\n", + "Epoch 18/20 Iteration 3119/3560 Training loss: 1.2096 0.1216 sec/batch\n", + "Epoch 18/20 Iteration 3120/3560 Training loss: 1.2093 0.1250 sec/batch\n", + "Epoch 18/20 Iteration 3121/3560 Training loss: 
1.2090 0.1211 sec/batch\n", + "Epoch 18/20 Iteration 3122/3560 Training loss: 1.2091 0.1237 sec/batch\n", + "Epoch 18/20 Iteration 3123/3560 Training loss: 1.2091 0.1226 sec/batch\n", + "Epoch 18/20 Iteration 3124/3560 Training loss: 1.2088 0.1217 sec/batch\n", + "Epoch 18/20 Iteration 3125/3560 Training loss: 1.2085 0.1225 sec/batch\n", + "Epoch 18/20 Iteration 3126/3560 Training loss: 1.2083 0.1215 sec/batch\n", + "Epoch 18/20 Iteration 3127/3560 Training loss: 1.2083 0.1238 sec/batch\n", + "Epoch 18/20 Iteration 3128/3560 Training loss: 1.2081 0.1265 sec/batch\n", + "Epoch 18/20 Iteration 3129/3560 Training loss: 1.2081 0.1255 sec/batch\n", + "Epoch 18/20 Iteration 3130/3560 Training loss: 1.2080 0.1274 sec/batch\n", + "Epoch 18/20 Iteration 3131/3560 Training loss: 1.2078 0.1220 sec/batch\n", + "Epoch 18/20 Iteration 3132/3560 Training loss: 1.2077 0.1275 sec/batch\n", + "Epoch 18/20 Iteration 3133/3560 Training loss: 1.2077 0.1295 sec/batch\n", + "Epoch 18/20 Iteration 3134/3560 Training loss: 1.2077 0.1294 sec/batch\n", + "Epoch 18/20 Iteration 3135/3560 Training loss: 1.2075 0.1238 sec/batch\n", + "Epoch 18/20 Iteration 3136/3560 Training loss: 1.2076 0.1246 sec/batch\n", + "Epoch 18/20 Iteration 3137/3560 Training loss: 1.2074 0.1222 sec/batch\n", + "Epoch 18/20 Iteration 3138/3560 Training loss: 1.2073 0.1273 sec/batch\n", + "Epoch 18/20 Iteration 3139/3560 Training loss: 1.2072 0.1236 sec/batch\n", + "Epoch 18/20 Iteration 3140/3560 Training loss: 1.2072 0.1271 sec/batch\n", + "Epoch 18/20 Iteration 3141/3560 Training loss: 1.2070 0.1249 sec/batch\n", + "Epoch 18/20 Iteration 3142/3560 Training loss: 1.2066 0.1251 sec/batch\n", + "Epoch 18/20 Iteration 3143/3560 Training loss: 1.2067 0.1215 sec/batch\n", + "Epoch 18/20 Iteration 3144/3560 Training loss: 1.2067 0.1279 sec/batch\n", + "Epoch 18/20 Iteration 3145/3560 Training loss: 1.2065 0.1216 sec/batch\n", + "Epoch 18/20 Iteration 3146/3560 Training loss: 1.2065 0.1242 sec/batch\n", + "Epoch 18/20 
Iteration 3147/3560 Training loss: 1.2064 0.1227 sec/batch\n", + "Epoch 18/20 Iteration 3148/3560 Training loss: 1.2061 0.1250 sec/batch\n", + "Epoch 18/20 Iteration 3149/3560 Training loss: 1.2057 0.1246 sec/batch\n", + "Epoch 18/20 Iteration 3150/3560 Training loss: 1.2058 0.1221 sec/batch\n", + "Epoch 18/20 Iteration 3151/3560 Training loss: 1.2056 0.1280 sec/batch\n", + "Epoch 18/20 Iteration 3152/3560 Training loss: 1.2052 0.1234 sec/batch\n", + "Epoch 18/20 Iteration 3153/3560 Training loss: 1.2052 0.1252 sec/batch\n", + "Epoch 18/20 Iteration 3154/3560 Training loss: 1.2052 0.1250 sec/batch\n", + "Epoch 18/20 Iteration 3155/3560 Training loss: 1.2051 0.1267 sec/batch\n", + "Epoch 18/20 Iteration 3156/3560 Training loss: 1.2047 0.1258 sec/batch\n", + "Epoch 18/20 Iteration 3157/3560 Training loss: 1.2044 0.1236 sec/batch\n", + "Epoch 18/20 Iteration 3158/3560 Training loss: 1.2042 0.1251 sec/batch\n", + "Epoch 18/20 Iteration 3159/3560 Training loss: 1.2043 0.1268 sec/batch\n", + "Epoch 18/20 Iteration 3160/3560 Training loss: 1.2043 0.1229 sec/batch\n", + "Epoch 18/20 Iteration 3161/3560 Training loss: 1.2043 0.1228 sec/batch\n", + "Epoch 18/20 Iteration 3162/3560 Training loss: 1.2043 0.1232 sec/batch\n", + "Epoch 18/20 Iteration 3163/3560 Training loss: 1.2044 0.1237 sec/batch\n", + "Epoch 18/20 Iteration 3164/3560 Training loss: 1.2046 0.1226 sec/batch\n", + "Epoch 18/20 Iteration 3165/3560 Training loss: 1.2046 0.1217 sec/batch\n", + "Epoch 18/20 Iteration 3166/3560 Training loss: 1.2046 0.1221 sec/batch\n", + "Epoch 18/20 Iteration 3167/3560 Training loss: 1.2050 0.1254 sec/batch\n", + "Epoch 18/20 Iteration 3168/3560 Training loss: 1.2051 0.1216 sec/batch\n", + "Epoch 18/20 Iteration 3169/3560 Training loss: 1.2050 0.1230 sec/batch\n", + "Epoch 18/20 Iteration 3170/3560 Training loss: 1.2053 0.1236 sec/batch\n", + "Epoch 18/20 Iteration 3171/3560 Training loss: 1.2051 0.1221 sec/batch\n", + "Epoch 18/20 Iteration 3172/3560 Training loss: 1.2053 0.1256 
sec/batch\n", + "Epoch 18/20 Iteration 3173/3560 Training loss: 1.2053 0.1249 sec/batch\n", + "Epoch 18/20 Iteration 3174/3560 Training loss: 1.2055 0.1259 sec/batch\n", + "Epoch 18/20 Iteration 3175/3560 Training loss: 1.2056 0.1253 sec/batch\n", + "Epoch 18/20 Iteration 3176/3560 Training loss: 1.2054 0.1244 sec/batch\n", + "Epoch 18/20 Iteration 3177/3560 Training loss: 1.2052 0.1234 sec/batch\n", + "Epoch 18/20 Iteration 3178/3560 Training loss: 1.2050 0.1209 sec/batch\n", + "Epoch 18/20 Iteration 3179/3560 Training loss: 1.2051 0.1216 sec/batch\n", + "Epoch 18/20 Iteration 3180/3560 Training loss: 1.2050 0.1227 sec/batch\n", + "Epoch 18/20 Iteration 3181/3560 Training loss: 1.2049 0.1217 sec/batch\n", + "Epoch 18/20 Iteration 3182/3560 Training loss: 1.2049 0.1219 sec/batch\n", + "Epoch 18/20 Iteration 3183/3560 Training loss: 1.2049 0.1272 sec/batch\n", + "Epoch 18/20 Iteration 3184/3560 Training loss: 1.2048 0.1252 sec/batch\n", + "Epoch 18/20 Iteration 3185/3560 Training loss: 1.2046 0.1318 sec/batch\n", + "Epoch 18/20 Iteration 3186/3560 Training loss: 1.2047 0.1250 sec/batch\n", + "Epoch 18/20 Iteration 3187/3560 Training loss: 1.2048 0.1224 sec/batch\n", + "Epoch 18/20 Iteration 3188/3560 Training loss: 1.2049 0.1243 sec/batch\n", + "Epoch 18/20 Iteration 3189/3560 Training loss: 1.2049 0.1225 sec/batch\n", + "Epoch 18/20 Iteration 3190/3560 Training loss: 1.2048 0.1247 sec/batch\n", + "Epoch 18/20 Iteration 3191/3560 Training loss: 1.2049 0.1244 sec/batch\n", + "Epoch 18/20 Iteration 3192/3560 Training loss: 1.2048 0.1241 sec/batch\n", + "Epoch 18/20 Iteration 3193/3560 Training loss: 1.2049 0.1242 sec/batch\n", + "Epoch 18/20 Iteration 3194/3560 Training loss: 1.2052 0.1285 sec/batch\n", + "Epoch 18/20 Iteration 3195/3560 Training loss: 1.2053 0.1239 sec/batch\n", + "Epoch 18/20 Iteration 3196/3560 Training loss: 1.2053 0.1251 sec/batch\n", + "Epoch 18/20 Iteration 3197/3560 Training loss: 1.2052 0.1245 sec/batch\n", + "Epoch 18/20 Iteration 3198/3560 
Training loss: 1.2051 0.1250 sec/batch\n", + "Epoch 18/20 Iteration 3199/3560 Training loss: 1.2053 0.1235 sec/batch\n", + "Epoch 18/20 Iteration 3200/3560 Training loss: 1.2053 0.1231 sec/batch\n", + "Epoch 18/20 Iteration 3201/3560 Training loss: 1.2053 0.1243 sec/batch\n", + "Epoch 18/20 Iteration 3202/3560 Training loss: 1.2052 0.1217 sec/batch\n", + "Epoch 18/20 Iteration 3203/3560 Training loss: 1.2051 0.1247 sec/batch\n", + "Epoch 18/20 Iteration 3204/3560 Training loss: 1.2053 0.1236 sec/batch\n", + "Epoch 19/20 Iteration 3205/3560 Training loss: 1.3272 0.1250 sec/batch\n", + "Epoch 19/20 Iteration 3206/3560 Training loss: 1.2749 0.1220 sec/batch\n", + "Epoch 19/20 Iteration 3207/3560 Training loss: 1.2542 0.1217 sec/batch\n", + "Epoch 19/20 Iteration 3208/3560 Training loss: 1.2472 0.1223 sec/batch\n", + "Epoch 19/20 Iteration 3209/3560 Training loss: 1.2355 0.1276 sec/batch\n", + "Epoch 19/20 Iteration 3210/3560 Training loss: 1.2231 0.1255 sec/batch\n", + "Epoch 19/20 Iteration 3211/3560 Training loss: 1.2225 0.1250 sec/batch\n", + "Epoch 19/20 Iteration 3212/3560 Training loss: 1.2200 0.1243 sec/batch\n", + "Epoch 19/20 Iteration 3213/3560 Training loss: 1.2182 0.1219 sec/batch\n", + "Epoch 19/20 Iteration 3214/3560 Training loss: 1.2169 0.1254 sec/batch\n", + "Epoch 19/20 Iteration 3215/3560 Training loss: 1.2133 0.1324 sec/batch\n", + "Epoch 19/20 Iteration 3216/3560 Training loss: 1.2124 0.1242 sec/batch\n", + "Epoch 19/20 Iteration 3217/3560 Training loss: 1.2130 0.1241 sec/batch\n", + "Epoch 19/20 Iteration 3218/3560 Training loss: 1.2130 0.1254 sec/batch\n", + "Epoch 19/20 Iteration 3219/3560 Training loss: 1.2112 0.1238 sec/batch\n", + "Epoch 19/20 Iteration 3220/3560 Training loss: 1.2092 0.1245 sec/batch\n", + "Epoch 19/20 Iteration 3221/3560 Training loss: 1.2093 0.1232 sec/batch\n", + "Epoch 19/20 Iteration 3222/3560 Training loss: 1.2104 0.1253 sec/batch\n", + "Epoch 19/20 Iteration 3223/3560 Training loss: 1.2101 0.1213 sec/batch\n", + 
"Epoch 19/20 Iteration 3224/3560 Training loss: 1.2109 0.1241 sec/batch\n", + "Epoch 19/20 Iteration 3225/3560 Training loss: 1.2103 0.1223 sec/batch\n", + "Epoch 19/20 Iteration 3226/3560 Training loss: 1.2105 0.1216 sec/batch\n", + "Epoch 19/20 Iteration 3227/3560 Training loss: 1.2099 0.1253 sec/batch\n", + "Epoch 19/20 Iteration 3228/3560 Training loss: 1.2100 0.1214 sec/batch\n", + "Epoch 19/20 Iteration 3229/3560 Training loss: 1.2097 0.1230 sec/batch\n", + "Epoch 19/20 Iteration 3230/3560 Training loss: 1.2081 0.1246 sec/batch\n", + "Epoch 19/20 Iteration 3231/3560 Training loss: 1.2069 0.1220 sec/batch\n", + "Epoch 19/20 Iteration 3232/3560 Training loss: 1.2073 0.1230 sec/batch\n", + "Epoch 19/20 Iteration 3233/3560 Training loss: 1.2073 0.1250 sec/batch\n", + "Epoch 19/20 Iteration 3234/3560 Training loss: 1.2074 0.1231 sec/batch\n", + "Epoch 19/20 Iteration 3235/3560 Training loss: 1.2068 0.1242 sec/batch\n", + "Epoch 19/20 Iteration 3236/3560 Training loss: 1.2057 0.1241 sec/batch\n", + "Epoch 19/20 Iteration 3237/3560 Training loss: 1.2058 0.1227 sec/batch\n", + "Epoch 19/20 Iteration 3238/3560 Training loss: 1.2057 0.1237 sec/batch\n", + "Epoch 19/20 Iteration 3239/3560 Training loss: 1.2055 0.1244 sec/batch\n", + "Epoch 19/20 Iteration 3240/3560 Training loss: 1.2051 0.1243 sec/batch\n", + "Epoch 19/20 Iteration 3241/3560 Training loss: 1.2044 0.1252 sec/batch\n", + "Epoch 19/20 Iteration 3242/3560 Training loss: 1.2033 0.1225 sec/batch\n", + "Epoch 19/20 Iteration 3243/3560 Training loss: 1.2022 0.1234 sec/batch\n", + "Epoch 19/20 Iteration 3244/3560 Training loss: 1.2019 0.1231 sec/batch\n", + "Epoch 19/20 Iteration 3245/3560 Training loss: 1.2012 0.1233 sec/batch\n", + "Epoch 19/20 Iteration 3246/3560 Training loss: 1.2019 0.1232 sec/batch\n", + "Epoch 19/20 Iteration 3247/3560 Training loss: 1.2017 0.1262 sec/batch\n", + "Epoch 19/20 Iteration 3248/3560 Training loss: 1.2011 0.1283 sec/batch\n", + "Epoch 19/20 Iteration 3249/3560 Training loss: 
1.2011 0.1215 sec/batch\n", + "Epoch 19/20 Iteration 3250/3560 Training loss: 1.2005 0.1220 sec/batch\n", + "Epoch 19/20 Iteration 3251/3560 Training loss: 1.2001 0.1263 sec/batch\n", + "Epoch 19/20 Iteration 3252/3560 Training loss: 1.1998 0.1266 sec/batch\n", + "Epoch 19/20 Iteration 3253/3560 Training loss: 1.1998 0.1225 sec/batch\n", + "Epoch 19/20 Iteration 3254/3560 Training loss: 1.2001 0.1238 sec/batch\n", + "Epoch 19/20 Iteration 3255/3560 Training loss: 1.1995 0.1238 sec/batch\n", + "Epoch 19/20 Iteration 3256/3560 Training loss: 1.2001 0.1272 sec/batch\n", + "Epoch 19/20 Iteration 3257/3560 Training loss: 1.2000 0.1268 sec/batch\n", + "Epoch 19/20 Iteration 3258/3560 Training loss: 1.2001 0.1227 sec/batch\n", + "Epoch 19/20 Iteration 3259/3560 Training loss: 1.1999 0.1220 sec/batch\n", + "Epoch 19/20 Iteration 3260/3560 Training loss: 1.2000 0.1265 sec/batch\n", + "Epoch 19/20 Iteration 3261/3560 Training loss: 1.2002 0.1237 sec/batch\n", + "Epoch 19/20 Iteration 3262/3560 Training loss: 1.2000 0.1240 sec/batch\n", + "Epoch 19/20 Iteration 3263/3560 Training loss: 1.1995 0.1294 sec/batch\n", + "Epoch 19/20 Iteration 3264/3560 Training loss: 1.2002 0.1233 sec/batch\n", + "Epoch 19/20 Iteration 3265/3560 Training loss: 1.2002 0.1243 sec/batch\n", + "Epoch 19/20 Iteration 3266/3560 Training loss: 1.2008 0.1247 sec/batch\n", + "Epoch 19/20 Iteration 3267/3560 Training loss: 1.2011 0.1241 sec/batch\n", + "Epoch 19/20 Iteration 3268/3560 Training loss: 1.2012 0.1234 sec/batch\n", + "Epoch 19/20 Iteration 3269/3560 Training loss: 1.2012 0.1243 sec/batch\n", + "Epoch 19/20 Iteration 3270/3560 Training loss: 1.2011 0.1253 sec/batch\n", + "Epoch 19/20 Iteration 3271/3560 Training loss: 1.2015 0.1243 sec/batch\n", + "Epoch 19/20 Iteration 3272/3560 Training loss: 1.2013 0.1236 sec/batch\n", + "Epoch 19/20 Iteration 3273/3560 Training loss: 1.2015 0.1250 sec/batch\n", + "Epoch 19/20 Iteration 3274/3560 Training loss: 1.2012 0.1216 sec/batch\n", + "Epoch 19/20 
Iteration 3275/3560 Training loss: 1.2017 0.1242 sec/batch\n", + "Epoch 19/20 Iteration 3276/3560 Training loss: 1.2019 0.1205 sec/batch\n", + "Epoch 19/20 Iteration 3277/3560 Training loss: 1.2023 0.1337 sec/batch\n", + "Epoch 19/20 Iteration 3278/3560 Training loss: 1.2020 0.1255 sec/batch\n", + "Epoch 19/20 Iteration 3279/3560 Training loss: 1.2019 0.1252 sec/batch\n", + "Epoch 19/20 Iteration 3280/3560 Training loss: 1.2020 0.1221 sec/batch\n", + "Epoch 19/20 Iteration 3281/3560 Training loss: 1.2019 0.1271 sec/batch\n", + "Epoch 19/20 Iteration 3282/3560 Training loss: 1.2018 0.1239 sec/batch\n", + "Epoch 19/20 Iteration 3283/3560 Training loss: 1.2012 0.1209 sec/batch\n", + "Epoch 19/20 Iteration 3284/3560 Training loss: 1.2011 0.1222 sec/batch\n", + "Epoch 19/20 Iteration 3285/3560 Training loss: 1.2006 0.1245 sec/batch\n", + "Epoch 19/20 Iteration 3286/3560 Training loss: 1.2005 0.1225 sec/batch\n", + "Epoch 19/20 Iteration 3287/3560 Training loss: 1.2001 0.1237 sec/batch\n", + "Epoch 19/20 Iteration 3288/3560 Training loss: 1.2000 0.1232 sec/batch\n", + "Epoch 19/20 Iteration 3289/3560 Training loss: 1.1997 0.1236 sec/batch\n", + "Epoch 19/20 Iteration 3290/3560 Training loss: 1.1996 0.1218 sec/batch\n", + "Epoch 19/20 Iteration 3291/3560 Training loss: 1.1994 0.1241 sec/batch\n", + "Epoch 19/20 Iteration 3292/3560 Training loss: 1.1991 0.1231 sec/batch\n", + "Epoch 19/20 Iteration 3293/3560 Training loss: 1.1988 0.1214 sec/batch\n", + "Epoch 19/20 Iteration 3294/3560 Training loss: 1.1989 0.1214 sec/batch\n", + "Epoch 19/20 Iteration 3295/3560 Training loss: 1.1987 0.1225 sec/batch\n", + "Epoch 19/20 Iteration 3296/3560 Training loss: 1.1986 0.1234 sec/batch\n", + "Epoch 19/20 Iteration 3297/3560 Training loss: 1.1982 0.1216 sec/batch\n", + "Epoch 19/20 Iteration 3298/3560 Training loss: 1.1979 0.1237 sec/batch\n", + "Epoch 19/20 Iteration 3299/3560 Training loss: 1.1977 0.1285 sec/batch\n", + "Epoch 19/20 Iteration 3300/3560 Training loss: 1.1977 0.1237 
sec/batch\n", + "Epoch 19/20 Iteration 3301/3560 Training loss: 1.1977 0.1225 sec/batch\n", + "Epoch 19/20 Iteration 3302/3560 Training loss: 1.1974 0.1251 sec/batch\n", + "Epoch 19/20 Iteration 3303/3560 Training loss: 1.1971 0.1242 sec/batch\n", + "Epoch 19/20 Iteration 3304/3560 Training loss: 1.1969 0.1280 sec/batch\n", + "Epoch 19/20 Iteration 3305/3560 Training loss: 1.1969 0.1262 sec/batch\n", + "Epoch 19/20 Iteration 3306/3560 Training loss: 1.1967 0.1256 sec/batch\n", + "Epoch 19/20 Iteration 3307/3560 Training loss: 1.1966 0.1254 sec/batch\n", + "Epoch 19/20 Iteration 3308/3560 Training loss: 1.1965 0.1226 sec/batch\n", + "Epoch 19/20 Iteration 3309/3560 Training loss: 1.1964 0.1249 sec/batch\n", + "Epoch 19/20 Iteration 3310/3560 Training loss: 1.1963 0.1252 sec/batch\n", + "Epoch 19/20 Iteration 3311/3560 Training loss: 1.1963 0.1225 sec/batch\n", + "Epoch 19/20 Iteration 3312/3560 Training loss: 1.1962 0.1284 sec/batch\n", + "Epoch 19/20 Iteration 3313/3560 Training loss: 1.1959 0.1226 sec/batch\n", + "Epoch 19/20 Iteration 3314/3560 Training loss: 1.1959 0.1353 sec/batch\n", + "Epoch 19/20 Iteration 3315/3560 Training loss: 1.1958 0.1228 sec/batch\n", + "Epoch 19/20 Iteration 3316/3560 Training loss: 1.1957 0.1216 sec/batch\n", + "Epoch 19/20 Iteration 3317/3560 Training loss: 1.1957 0.1250 sec/batch\n", + "Epoch 19/20 Iteration 3318/3560 Training loss: 1.1956 0.1206 sec/batch\n", + "Epoch 19/20 Iteration 3319/3560 Training loss: 1.1953 0.1263 sec/batch\n", + "Epoch 19/20 Iteration 3320/3560 Training loss: 1.1950 0.1233 sec/batch\n", + "Epoch 19/20 Iteration 3321/3560 Training loss: 1.1950 0.1243 sec/batch\n", + "Epoch 19/20 Iteration 3322/3560 Training loss: 1.1951 0.1248 sec/batch\n", + "Epoch 19/20 Iteration 3323/3560 Training loss: 1.1949 0.1292 sec/batch\n", + "Epoch 19/20 Iteration 3324/3560 Training loss: 1.1949 0.1257 sec/batch\n", + "Epoch 19/20 Iteration 3325/3560 Training loss: 1.1948 0.1225 sec/batch\n", + "Epoch 19/20 Iteration 3326/3560 
Training loss: 1.1944 0.1210 sec/batch\n", + "Epoch 19/20 Iteration 3327/3560 Training loss: 1.1942 0.1248 sec/batch\n", + "Epoch 19/20 Iteration 3328/3560 Training loss: 1.1942 0.1237 sec/batch\n", + "Epoch 19/20 Iteration 3329/3560 Training loss: 1.1940 0.1259 sec/batch\n", + "Epoch 19/20 Iteration 3330/3560 Training loss: 1.1937 0.1219 sec/batch\n", + "Epoch 19/20 Iteration 3331/3560 Training loss: 1.1937 0.1218 sec/batch\n", + "Epoch 19/20 Iteration 3332/3560 Training loss: 1.1937 0.1237 sec/batch\n", + "Epoch 19/20 Iteration 3333/3560 Training loss: 1.1934 0.1246 sec/batch\n", + "Epoch 19/20 Iteration 3334/3560 Training loss: 1.1931 0.1227 sec/batch\n", + "Epoch 19/20 Iteration 3335/3560 Training loss: 1.1927 0.1254 sec/batch\n", + "Epoch 19/20 Iteration 3336/3560 Training loss: 1.1925 0.1231 sec/batch\n", + "Epoch 19/20 Iteration 3337/3560 Training loss: 1.1926 0.1225 sec/batch\n", + "Epoch 19/20 Iteration 3338/3560 Training loss: 1.1926 0.1262 sec/batch\n", + "Epoch 19/20 Iteration 3339/3560 Training loss: 1.1926 0.1275 sec/batch\n", + "Epoch 19/20 Iteration 3340/3560 Training loss: 1.1926 0.1230 sec/batch\n", + "Epoch 19/20 Iteration 3341/3560 Training loss: 1.1927 0.1233 sec/batch\n", + "Epoch 19/20 Iteration 3342/3560 Training loss: 1.1929 0.1223 sec/batch\n", + "Epoch 19/20 Iteration 3343/3560 Training loss: 1.1929 0.1240 sec/batch\n", + "Epoch 19/20 Iteration 3344/3560 Training loss: 1.1930 0.1235 sec/batch\n", + "Epoch 19/20 Iteration 3345/3560 Training loss: 1.1933 0.1230 sec/batch\n", + "Epoch 19/20 Iteration 3346/3560 Training loss: 1.1934 0.1228 sec/batch\n", + "Epoch 19/20 Iteration 3347/3560 Training loss: 1.1934 0.1248 sec/batch\n", + "Epoch 19/20 Iteration 3348/3560 Training loss: 1.1936 0.1226 sec/batch\n", + "Epoch 19/20 Iteration 3349/3560 Training loss: 1.1935 0.1267 sec/batch\n", + "Epoch 19/20 Iteration 3350/3560 Training loss: 1.1937 0.1248 sec/batch\n", + "Epoch 19/20 Iteration 3351/3560 Training loss: 1.1937 0.1275 sec/batch\n", + 
"Epoch 19/20 Iteration 3352/3560 Training loss: 1.1939 0.1218 sec/batch\n", + "Epoch 19/20 Iteration 3353/3560 Training loss: 1.1941 0.1242 sec/batch\n", + "Epoch 19/20 Iteration 3354/3560 Training loss: 1.1940 0.1248 sec/batch\n", + "Epoch 19/20 Iteration 3355/3560 Training loss: 1.1937 0.1220 sec/batch\n", + "Epoch 19/20 Iteration 3356/3560 Training loss: 1.1935 0.1279 sec/batch\n", + "Epoch 19/20 Iteration 3357/3560 Training loss: 1.1936 0.1223 sec/batch\n", + "Epoch 19/20 Iteration 3358/3560 Training loss: 1.1936 0.1217 sec/batch\n", + "Epoch 19/20 Iteration 3359/3560 Training loss: 1.1936 0.1245 sec/batch\n", + "Epoch 19/20 Iteration 3360/3560 Training loss: 1.1935 0.1283 sec/batch\n", + "Epoch 19/20 Iteration 3361/3560 Training loss: 1.1935 0.1217 sec/batch\n", + "Epoch 19/20 Iteration 3362/3560 Training loss: 1.1935 0.1250 sec/batch\n", + "Epoch 19/20 Iteration 3363/3560 Training loss: 1.1932 0.1236 sec/batch\n", + "Epoch 19/20 Iteration 3364/3560 Training loss: 1.1934 0.1262 sec/batch\n", + "Epoch 19/20 Iteration 3365/3560 Training loss: 1.1935 0.1311 sec/batch\n", + "Epoch 19/20 Iteration 3366/3560 Training loss: 1.1935 0.1223 sec/batch\n", + "Epoch 19/20 Iteration 3367/3560 Training loss: 1.1935 0.1252 sec/batch\n", + "Epoch 19/20 Iteration 3368/3560 Training loss: 1.1934 0.1229 sec/batch\n", + "Epoch 19/20 Iteration 3369/3560 Training loss: 1.1934 0.1247 sec/batch\n", + "Epoch 19/20 Iteration 3370/3560 Training loss: 1.1933 0.1219 sec/batch\n", + "Epoch 19/20 Iteration 3371/3560 Training loss: 1.1935 0.1321 sec/batch\n", + "Epoch 19/20 Iteration 3372/3560 Training loss: 1.1938 0.1220 sec/batch\n", + "Epoch 19/20 Iteration 3373/3560 Training loss: 1.1938 0.1219 sec/batch\n", + "Epoch 19/20 Iteration 3374/3560 Training loss: 1.1939 0.1260 sec/batch\n", + "Epoch 19/20 Iteration 3375/3560 Training loss: 1.1938 0.1215 sec/batch\n", + "Epoch 19/20 Iteration 3376/3560 Training loss: 1.1937 0.1244 sec/batch\n", + "Epoch 19/20 Iteration 3377/3560 Training loss: 
1.1938 0.1223 sec/batch\n", + "Epoch 19/20 Iteration 3378/3560 Training loss: 1.1939 0.1220 sec/batch\n", + "Epoch 19/20 Iteration 3379/3560 Training loss: 1.1939 0.1227 sec/batch\n", + "Epoch 19/20 Iteration 3380/3560 Training loss: 1.1937 0.1232 sec/batch\n", + "Epoch 19/20 Iteration 3381/3560 Training loss: 1.1937 0.1345 sec/batch\n", + "Epoch 19/20 Iteration 3382/3560 Training loss: 1.1939 0.1269 sec/batch\n", + "Epoch 20/20 Iteration 3383/3560 Training loss: 1.3091 0.1271 sec/batch\n", + "Epoch 20/20 Iteration 3384/3560 Training loss: 1.2649 0.1223 sec/batch\n", + "Epoch 20/20 Iteration 3385/3560 Training loss: 1.2458 0.1265 sec/batch\n", + "Epoch 20/20 Iteration 3386/3560 Training loss: 1.2374 0.1246 sec/batch\n", + "Epoch 20/20 Iteration 3387/3560 Training loss: 1.2269 0.1237 sec/batch\n", + "Epoch 20/20 Iteration 3388/3560 Training loss: 1.2154 0.1236 sec/batch\n", + "Epoch 20/20 Iteration 3389/3560 Training loss: 1.2151 0.1248 sec/batch\n", + "Epoch 20/20 Iteration 3390/3560 Training loss: 1.2126 0.1233 sec/batch\n", + "Epoch 20/20 Iteration 3391/3560 Training loss: 1.2111 0.1218 sec/batch\n", + "Epoch 20/20 Iteration 3392/3560 Training loss: 1.2097 0.1245 sec/batch\n", + "Epoch 20/20 Iteration 3393/3560 Training loss: 1.2062 0.1220 sec/batch\n", + "Epoch 20/20 Iteration 3394/3560 Training loss: 1.2051 0.1244 sec/batch\n", + "Epoch 20/20 Iteration 3395/3560 Training loss: 1.2045 0.1229 sec/batch\n", + "Epoch 20/20 Iteration 3396/3560 Training loss: 1.2047 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3397/3560 Training loss: 1.2024 0.1229 sec/batch\n", + "Epoch 20/20 Iteration 3398/3560 Training loss: 1.2004 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3399/3560 Training loss: 1.2006 0.1233 sec/batch\n", + "Epoch 20/20 Iteration 3400/3560 Training loss: 1.2009 0.1308 sec/batch\n", + "Epoch 20/20 Iteration 3401/3560 Training loss: 1.2007 0.1265 sec/batch\n", + "Epoch 20/20 Iteration 3402/3560 Training loss: 1.2018 0.1222 sec/batch\n", + "Epoch 20/20 
Iteration 3403/3560 Training loss: 1.2011 0.1270 sec/batch\n", + "Epoch 20/20 Iteration 3404/3560 Training loss: 1.2013 0.1235 sec/batch\n", + "Epoch 20/20 Iteration 3405/3560 Training loss: 1.2006 0.1247 sec/batch\n", + "Epoch 20/20 Iteration 3406/3560 Training loss: 1.2008 0.1242 sec/batch\n", + "Epoch 20/20 Iteration 3407/3560 Training loss: 1.2004 0.1235 sec/batch\n", + "Epoch 20/20 Iteration 3408/3560 Training loss: 1.1986 0.1220 sec/batch\n", + "Epoch 20/20 Iteration 3409/3560 Training loss: 1.1971 0.1239 sec/batch\n", + "Epoch 20/20 Iteration 3410/3560 Training loss: 1.1978 0.1247 sec/batch\n", + "Epoch 20/20 Iteration 3411/3560 Training loss: 1.1978 0.1255 sec/batch\n", + "Epoch 20/20 Iteration 3412/3560 Training loss: 1.1982 0.1248 sec/batch\n", + "Epoch 20/20 Iteration 3413/3560 Training loss: 1.1974 0.1236 sec/batch\n", + "Epoch 20/20 Iteration 3414/3560 Training loss: 1.1961 0.1244 sec/batch\n", + "Epoch 20/20 Iteration 3415/3560 Training loss: 1.1962 0.1244 sec/batch\n", + "Epoch 20/20 Iteration 3416/3560 Training loss: 1.1960 0.1313 sec/batch\n", + "Epoch 20/20 Iteration 3417/3560 Training loss: 1.1956 0.1219 sec/batch\n", + "Epoch 20/20 Iteration 3418/3560 Training loss: 1.1953 0.1249 sec/batch\n", + "Epoch 20/20 Iteration 3419/3560 Training loss: 1.1945 0.1226 sec/batch\n", + "Epoch 20/20 Iteration 3420/3560 Training loss: 1.1934 0.1223 sec/batch\n", + "Epoch 20/20 Iteration 3421/3560 Training loss: 1.1921 0.1265 sec/batch\n", + "Epoch 20/20 Iteration 3422/3560 Training loss: 1.1917 0.1276 sec/batch\n", + "Epoch 20/20 Iteration 3423/3560 Training loss: 1.1910 0.1249 sec/batch\n", + "Epoch 20/20 Iteration 3424/3560 Training loss: 1.1917 0.1270 sec/batch\n", + "Epoch 20/20 Iteration 3425/3560 Training loss: 1.1915 0.1216 sec/batch\n", + "Epoch 20/20 Iteration 3426/3560 Training loss: 1.1905 0.1216 sec/batch\n", + "Epoch 20/20 Iteration 3427/3560 Training loss: 1.1906 0.1227 sec/batch\n", + "Epoch 20/20 Iteration 3428/3560 Training loss: 1.1898 0.1247 
sec/batch\n", + "Epoch 20/20 Iteration 3429/3560 Training loss: 1.1894 0.1237 sec/batch\n", + "Epoch 20/20 Iteration 3430/3560 Training loss: 1.1892 0.1213 sec/batch\n", + "Epoch 20/20 Iteration 3431/3560 Training loss: 1.1889 0.1288 sec/batch\n", + "Epoch 20/20 Iteration 3432/3560 Training loss: 1.1890 0.1237 sec/batch\n", + "Epoch 20/20 Iteration 3433/3560 Training loss: 1.1886 0.1254 sec/batch\n", + "Epoch 20/20 Iteration 3434/3560 Training loss: 1.1893 0.1245 sec/batch\n", + "Epoch 20/20 Iteration 3435/3560 Training loss: 1.1891 0.1258 sec/batch\n", + "Epoch 20/20 Iteration 3436/3560 Training loss: 1.1893 0.1254 sec/batch\n", + "Epoch 20/20 Iteration 3437/3560 Training loss: 1.1891 0.1238 sec/batch\n", + "Epoch 20/20 Iteration 3438/3560 Training loss: 1.1890 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3439/3560 Training loss: 1.1892 0.1244 sec/batch\n", + "Epoch 20/20 Iteration 3440/3560 Training loss: 1.1890 0.1230 sec/batch\n", + "Epoch 20/20 Iteration 3441/3560 Training loss: 1.1885 0.1275 sec/batch\n", + "Epoch 20/20 Iteration 3442/3560 Training loss: 1.1891 0.1224 sec/batch\n", + "Epoch 20/20 Iteration 3443/3560 Training loss: 1.1890 0.1269 sec/batch\n", + "Epoch 20/20 Iteration 3444/3560 Training loss: 1.1897 0.1235 sec/batch\n", + "Epoch 20/20 Iteration 3445/3560 Training loss: 1.1901 0.1235 sec/batch\n", + "Epoch 20/20 Iteration 3446/3560 Training loss: 1.1902 0.1237 sec/batch\n", + "Epoch 20/20 Iteration 3447/3560 Training loss: 1.1902 0.1230 sec/batch\n", + "Epoch 20/20 Iteration 3448/3560 Training loss: 1.1903 0.1219 sec/batch\n", + "Epoch 20/20 Iteration 3449/3560 Training loss: 1.1906 0.1236 sec/batch\n", + "Epoch 20/20 Iteration 3450/3560 Training loss: 1.1904 0.1224 sec/batch\n", + "Epoch 20/20 Iteration 3451/3560 Training loss: 1.1905 0.1251 sec/batch\n", + "Epoch 20/20 Iteration 3452/3560 Training loss: 1.1905 0.1268 sec/batch\n", + "Epoch 20/20 Iteration 3453/3560 Training loss: 1.1910 0.1258 sec/batch\n", + "Epoch 20/20 Iteration 3454/3560 
Training loss: 1.1913 0.1276 sec/batch\n", + "Epoch 20/20 Iteration 3455/3560 Training loss: 1.1917 0.1252 sec/batch\n", + "Epoch 20/20 Iteration 3456/3560 Training loss: 1.1913 0.1216 sec/batch\n", + "Epoch 20/20 Iteration 3457/3560 Training loss: 1.1913 0.1308 sec/batch\n", + "Epoch 20/20 Iteration 3458/3560 Training loss: 1.1914 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3459/3560 Training loss: 1.1913 0.1243 sec/batch\n", + "Epoch 20/20 Iteration 3460/3560 Training loss: 1.1911 0.1220 sec/batch\n", + "Epoch 20/20 Iteration 3461/3560 Training loss: 1.1907 0.1241 sec/batch\n", + "Epoch 20/20 Iteration 3462/3560 Training loss: 1.1908 0.1230 sec/batch\n", + "Epoch 20/20 Iteration 3463/3560 Training loss: 1.1904 0.1274 sec/batch\n", + "Epoch 20/20 Iteration 3464/3560 Training loss: 1.1904 0.1247 sec/batch\n", + "Epoch 20/20 Iteration 3465/3560 Training loss: 1.1900 0.1240 sec/batch\n", + "Epoch 20/20 Iteration 3466/3560 Training loss: 1.1901 0.1252 sec/batch\n", + "Epoch 20/20 Iteration 3467/3560 Training loss: 1.1897 0.1244 sec/batch\n", + "Epoch 20/20 Iteration 3468/3560 Training loss: 1.1896 0.1232 sec/batch\n", + "Epoch 20/20 Iteration 3469/3560 Training loss: 1.1894 0.1231 sec/batch\n", + "Epoch 20/20 Iteration 3470/3560 Training loss: 1.1891 0.1317 sec/batch\n", + "Epoch 20/20 Iteration 3471/3560 Training loss: 1.1888 0.1248 sec/batch\n", + "Epoch 20/20 Iteration 3472/3560 Training loss: 1.1888 0.1306 sec/batch\n", + "Epoch 20/20 Iteration 3473/3560 Training loss: 1.1885 0.1227 sec/batch\n", + "Epoch 20/20 Iteration 3474/3560 Training loss: 1.1885 0.1232 sec/batch\n", + "Epoch 20/20 Iteration 3475/3560 Training loss: 1.1881 0.1246 sec/batch\n", + "Epoch 20/20 Iteration 3476/3560 Training loss: 1.1877 0.1241 sec/batch\n", + "Epoch 20/20 Iteration 3477/3560 Training loss: 1.1875 0.1224 sec/batch\n", + "Epoch 20/20 Iteration 3478/3560 Training loss: 1.1875 0.1244 sec/batch\n", + "Epoch 20/20 Iteration 3479/3560 Training loss: 1.1874 0.1221 sec/batch\n", + 
"Epoch 20/20 Iteration 3480/3560 Training loss: 1.1871 0.1236 sec/batch\n", + "Epoch 20/20 Iteration 3481/3560 Training loss: 1.1867 0.1254 sec/batch\n", + "Epoch 20/20 Iteration 3482/3560 Training loss: 1.1865 0.1223 sec/batch\n", + "Epoch 20/20 Iteration 3483/3560 Training loss: 1.1864 0.1262 sec/batch\n", + "Epoch 20/20 Iteration 3484/3560 Training loss: 1.1863 0.1272 sec/batch\n", + "Epoch 20/20 Iteration 3485/3560 Training loss: 1.1864 0.1240 sec/batch\n", + "Epoch 20/20 Iteration 3486/3560 Training loss: 1.1863 0.1261 sec/batch\n", + "Epoch 20/20 Iteration 3487/3560 Training loss: 1.1862 0.1309 sec/batch\n", + "Epoch 20/20 Iteration 3488/3560 Training loss: 1.1860 0.1243 sec/batch\n", + "Epoch 20/20 Iteration 3489/3560 Training loss: 1.1860 0.1243 sec/batch\n", + "Epoch 20/20 Iteration 3490/3560 Training loss: 1.1861 0.1215 sec/batch\n", + "Epoch 20/20 Iteration 3491/3560 Training loss: 1.1859 0.1247 sec/batch\n", + "Epoch 20/20 Iteration 3492/3560 Training loss: 1.1859 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3493/3560 Training loss: 1.1857 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3494/3560 Training loss: 1.1857 0.1216 sec/batch\n", + "Epoch 20/20 Iteration 3495/3560 Training loss: 1.1856 0.1238 sec/batch\n", + "Epoch 20/20 Iteration 3496/3560 Training loss: 1.1855 0.1222 sec/batch\n", + "Epoch 20/20 Iteration 3497/3560 Training loss: 1.1852 0.1239 sec/batch\n", + "Epoch 20/20 Iteration 3498/3560 Training loss: 1.1849 0.1218 sec/batch\n", + "Epoch 20/20 Iteration 3499/3560 Training loss: 1.1850 0.1253 sec/batch\n", + "Epoch 20/20 Iteration 3500/3560 Training loss: 1.1850 0.1232 sec/batch\n", + "Epoch 20/20 Iteration 3501/3560 Training loss: 1.1849 0.1220 sec/batch\n", + "Epoch 20/20 Iteration 3502/3560 Training loss: 1.1849 0.1248 sec/batch\n", + "Epoch 20/20 Iteration 3503/3560 Training loss: 1.1848 0.1241 sec/batch\n", + "Epoch 20/20 Iteration 3504/3560 Training loss: 1.1846 0.1257 sec/batch\n", + "Epoch 20/20 Iteration 3505/3560 Training loss: 
1.1843 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3506/3560 Training loss: 1.1842 0.1261 sec/batch\n", + "Epoch 20/20 Iteration 3507/3560 Training loss: 1.1840 0.1249 sec/batch\n", + "Epoch 20/20 Iteration 3508/3560 Training loss: 1.1836 0.1243 sec/batch\n", + "Epoch 20/20 Iteration 3509/3560 Training loss: 1.1837 0.1210 sec/batch\n", + "Epoch 20/20 Iteration 3510/3560 Training loss: 1.1835 0.1247 sec/batch\n", + "Epoch 20/20 Iteration 3511/3560 Training loss: 1.1834 0.1226 sec/batch\n", + "Epoch 20/20 Iteration 3512/3560 Training loss: 1.1831 0.1232 sec/batch\n", + "Epoch 20/20 Iteration 3513/3560 Training loss: 1.1827 0.1309 sec/batch\n", + "Epoch 20/20 Iteration 3514/3560 Training loss: 1.1825 0.1218 sec/batch\n", + "Epoch 20/20 Iteration 3515/3560 Training loss: 1.1826 0.1223 sec/batch\n", + "Epoch 20/20 Iteration 3516/3560 Training loss: 1.1827 0.1233 sec/batch\n", + "Epoch 20/20 Iteration 3517/3560 Training loss: 1.1827 0.1225 sec/batch\n", + "Epoch 20/20 Iteration 3518/3560 Training loss: 1.1826 0.1224 sec/batch\n", + "Epoch 20/20 Iteration 3519/3560 Training loss: 1.1828 0.1235 sec/batch\n", + "Epoch 20/20 Iteration 3520/3560 Training loss: 1.1829 0.1243 sec/batch\n", + "Epoch 20/20 Iteration 3521/3560 Training loss: 1.1829 0.1248 sec/batch\n", + "Epoch 20/20 Iteration 3522/3560 Training loss: 1.1829 0.1250 sec/batch\n", + "Epoch 20/20 Iteration 3523/3560 Training loss: 1.1832 0.1225 sec/batch\n", + "Epoch 20/20 Iteration 3524/3560 Training loss: 1.1833 0.1234 sec/batch\n", + "Epoch 20/20 Iteration 3525/3560 Training loss: 1.1832 0.1209 sec/batch\n", + "Epoch 20/20 Iteration 3526/3560 Training loss: 1.1834 0.1248 sec/batch\n", + "Epoch 20/20 Iteration 3527/3560 Training loss: 1.1833 0.1250 sec/batch\n", + "Epoch 20/20 Iteration 3528/3560 Training loss: 1.1835 0.1257 sec/batch\n", + "Epoch 20/20 Iteration 3529/3560 Training loss: 1.1835 0.1228 sec/batch\n", + "Epoch 20/20 Iteration 3530/3560 Training loss: 1.1837 0.1219 sec/batch\n", + "Epoch 20/20 
Iteration 3531/3560 Training loss: 1.1839 0.1246 sec/batch\n", + "Epoch 20/20 Iteration 3532/3560 Training loss: 1.1838 0.1232 sec/batch\n", + "Epoch 20/20 Iteration 3533/3560 Training loss: 1.1835 0.1233 sec/batch\n", + "Epoch 20/20 Iteration 3534/3560 Training loss: 1.1833 0.1238 sec/batch\n", + "Epoch 20/20 Iteration 3535/3560 Training loss: 1.1834 0.1211 sec/batch\n", + "Epoch 20/20 Iteration 3536/3560 Training loss: 1.1833 0.1220 sec/batch\n", + "Epoch 20/20 Iteration 3537/3560 Training loss: 1.1833 0.1239 sec/batch\n", + "Epoch 20/20 Iteration 3538/3560 Training loss: 1.1832 0.1223 sec/batch\n", + "Epoch 20/20 Iteration 3539/3560 Training loss: 1.1832 0.1282 sec/batch\n", + "Epoch 20/20 Iteration 3540/3560 Training loss: 1.1831 0.1225 sec/batch\n", + "Epoch 20/20 Iteration 3541/3560 Training loss: 1.1830 0.1250 sec/batch\n", + "Epoch 20/20 Iteration 3542/3560 Training loss: 1.1831 0.1230 sec/batch\n", + "Epoch 20/20 Iteration 3543/3560 Training loss: 1.1832 0.1226 sec/batch\n", + "Epoch 20/20 Iteration 3544/3560 Training loss: 1.1833 0.1246 sec/batch\n", + "Epoch 20/20 Iteration 3545/3560 Training loss: 1.1833 0.1254 sec/batch\n", + "Epoch 20/20 Iteration 3546/3560 Training loss: 1.1833 0.1227 sec/batch\n", + "Epoch 20/20 Iteration 3547/3560 Training loss: 1.1832 0.1220 sec/batch\n", + "Epoch 20/20 Iteration 3548/3560 Training loss: 1.1831 0.1264 sec/batch\n", + "Epoch 20/20 Iteration 3549/3560 Training loss: 1.1833 0.1228 sec/batch\n", + "Epoch 20/20 Iteration 3550/3560 Training loss: 1.1837 0.1235 sec/batch\n", + "Epoch 20/20 Iteration 3551/3560 Training loss: 1.1837 0.1224 sec/batch\n", + "Epoch 20/20 Iteration 3552/3560 Training loss: 1.1838 0.1226 sec/batch\n", + "Epoch 20/20 Iteration 3553/3560 Training loss: 1.1837 0.1225 sec/batch\n", + "Epoch 20/20 Iteration 3554/3560 Training loss: 1.1836 0.1296 sec/batch\n", + "Epoch 20/20 Iteration 3555/3560 Training loss: 1.1838 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3556/3560 Training loss: 1.1839 0.1243 
sec/batch\n", + "Epoch 20/20 Iteration 3557/3560 Training loss: 1.1839 0.1234 sec/batch\n", + "Epoch 20/20 Iteration 3558/3560 Training loss: 1.1838 0.1246 sec/batch\n", + "Epoch 20/20 Iteration 3559/3560 Training loss: 1.1837 0.1257 sec/batch\n", + "Epoch 20/20 Iteration 3560/3560 Training loss: 1.1839 0.1249 sec/batch\n", + "Epoch 1/20 Iteration 1/3560 Training loss: 4.4167 1.3106 sec/batch\n", + "Epoch 1/20 Iteration 2/3560 Training loss: 4.3957 0.1376 sec/batch\n", + "Epoch 1/20 Iteration 3/3560 Training loss: 4.3641 0.1318 sec/batch\n", + "Epoch 1/20 Iteration 4/3560 Training loss: 4.2941 0.1214 sec/batch\n", + "Epoch 1/20 Iteration 5/3560 Training loss: 4.1987 0.1217 sec/batch\n", + "Epoch 1/20 Iteration 6/3560 Training loss: 4.0854 0.1303 sec/batch\n", + "Epoch 1/20 Iteration 7/3560 Training loss: 4.0018 0.1205 sec/batch\n", + "Epoch 1/20 Iteration 8/3560 Training loss: 3.9378 0.1283 sec/batch\n", + "Epoch 1/20 Iteration 9/3560 Training loss: 3.8813 0.1235 sec/batch\n", + "Epoch 1/20 Iteration 10/3560 Training loss: 3.8306 0.1283 sec/batch\n", + "Epoch 1/20 Iteration 11/3560 Training loss: 3.7844 0.1226 sec/batch\n", + "Epoch 1/20 Iteration 12/3560 Training loss: 3.7464 0.1295 sec/batch\n", + "Epoch 1/20 Iteration 13/3560 Training loss: 3.7139 0.1292 sec/batch\n", + "Epoch 1/20 Iteration 14/3560 Training loss: 3.6842 0.1211 sec/batch\n", + "Epoch 1/20 Iteration 15/3560 Training loss: 3.6587 0.1275 sec/batch\n", + "Epoch 1/20 Iteration 16/3560 Training loss: 3.6354 0.1212 sec/batch\n", + "Epoch 1/20 Iteration 17/3560 Training loss: 3.6140 0.1241 sec/batch\n", + "Epoch 1/20 Iteration 18/3560 Training loss: 3.5961 0.1208 sec/batch\n", + "Epoch 1/20 Iteration 19/3560 Training loss: 3.5787 0.1211 sec/batch\n", + "Epoch 1/20 Iteration 20/3560 Training loss: 3.5614 0.1256 sec/batch\n", + "Epoch 1/20 Iteration 21/3560 Training loss: 3.5464 0.1221 sec/batch\n", + "Epoch 1/20 Iteration 22/3560 Training loss: 3.5323 0.1246 sec/batch\n", + "Epoch 1/20 Iteration 23/3560 
Training loss: 3.5191 0.1298 sec/batch\n", + "Epoch 1/20 Iteration 24/3560 Training loss: 3.5070 0.1274 sec/batch\n", + "Epoch 1/20 Iteration 25/3560 Training loss: 3.4951 0.1217 sec/batch\n", + "Epoch 1/20 Iteration 26/3560 Training loss: 3.4849 0.1301 sec/batch\n", + "Epoch 1/20 Iteration 27/3560 Training loss: 3.4757 0.1292 sec/batch\n", + "Epoch 1/20 Iteration 28/3560 Training loss: 3.4660 0.1246 sec/batch\n", + "Epoch 1/20 Iteration 29/3560 Training loss: 3.4566 0.1218 sec/batch\n", + "Epoch 1/20 Iteration 30/3560 Training loss: 3.4485 0.1209 sec/batch\n", + "Epoch 1/20 Iteration 31/3560 Training loss: 3.4416 0.1232 sec/batch\n", + "Epoch 1/20 Iteration 32/3560 Training loss: 3.4342 0.1243 sec/batch\n", + "Epoch 1/20 Iteration 33/3560 Training loss: 3.4267 0.1255 sec/batch\n", + "Epoch 1/20 Iteration 34/3560 Training loss: 3.4201 0.1254 sec/batch\n", + "Epoch 1/20 Iteration 35/3560 Training loss: 3.4135 0.1256 sec/batch\n", + "Epoch 1/20 Iteration 36/3560 Training loss: 3.4076 0.1251 sec/batch\n", + "Epoch 1/20 Iteration 37/3560 Training loss: 3.4012 0.1252 sec/batch\n", + "Epoch 1/20 Iteration 38/3560 Training loss: 3.3953 0.1213 sec/batch\n", + "Epoch 1/20 Iteration 39/3560 Training loss: 3.3895 0.1243 sec/batch\n", + "Epoch 1/20 Iteration 40/3560 Training loss: 3.3841 0.1235 sec/batch\n", + "Epoch 1/20 Iteration 41/3560 Training loss: 3.3786 0.1234 sec/batch\n", + "Epoch 1/20 Iteration 42/3560 Training loss: 3.3735 0.1229 sec/batch\n", + "Epoch 1/20 Iteration 43/3560 Training loss: 3.3687 0.1224 sec/batch\n", + "Epoch 1/20 Iteration 44/3560 Training loss: 3.3642 0.1220 sec/batch\n", + "Epoch 1/20 Iteration 45/3560 Training loss: 3.3595 0.1295 sec/batch\n", + "Epoch 1/20 Iteration 46/3560 Training loss: 3.3556 0.1246 sec/batch\n", + "Epoch 1/20 Iteration 47/3560 Training loss: 3.3515 0.1209 sec/batch\n", + "Epoch 1/20 Iteration 48/3560 Training loss: 3.3480 0.1274 sec/batch\n", + "Epoch 1/20 Iteration 49/3560 Training loss: 3.3446 0.1234 sec/batch\n", + 
"Epoch 1/20 Iteration 50/3560 Training loss: 3.3413 0.1224 sec/batch\n", + "Epoch 1/20 Iteration 51/3560 Training loss: 3.3378 0.1215 sec/batch\n", + "Epoch 1/20 Iteration 52/3560 Training loss: 3.3343 0.1229 sec/batch\n", + "Epoch 1/20 Iteration 53/3560 Training loss: 3.3311 0.1254 sec/batch\n", + "Epoch 1/20 Iteration 54/3560 Training loss: 3.3278 0.1233 sec/batch\n", + "Epoch 1/20 Iteration 55/3560 Training loss: 3.3248 0.1214 sec/batch\n", + "Epoch 1/20 Iteration 56/3560 Training loss: 3.3215 0.1254 sec/batch\n", + "Epoch 1/20 Iteration 57/3560 Training loss: 3.3186 0.1241 sec/batch\n", + "Epoch 1/20 Iteration 58/3560 Training loss: 3.3159 0.1255 sec/batch\n", + "Epoch 1/20 Iteration 59/3560 Training loss: 3.3130 0.1213 sec/batch\n", + "Epoch 1/20 Iteration 60/3560 Training loss: 3.3104 0.1218 sec/batch\n", + "Epoch 1/20 Iteration 61/3560 Training loss: 3.3077 0.1241 sec/batch\n", + "Epoch 1/20 Iteration 62/3560 Training loss: 3.3056 0.1302 sec/batch\n", + "Epoch 1/20 Iteration 63/3560 Training loss: 3.3035 0.1226 sec/batch\n", + "Epoch 1/20 Iteration 64/3560 Training loss: 3.3008 0.1254 sec/batch\n", + "Epoch 1/20 Iteration 65/3560 Training loss: 3.2982 0.1255 sec/batch\n", + "Epoch 1/20 Iteration 66/3560 Training loss: 3.2962 0.1211 sec/batch\n", + "Epoch 1/20 Iteration 67/3560 Training loss: 3.2941 0.1215 sec/batch\n", + "Epoch 1/20 Iteration 68/3560 Training loss: 3.2914 0.1219 sec/batch\n", + "Epoch 1/20 Iteration 69/3560 Training loss: 3.2890 0.1226 sec/batch\n", + "Epoch 1/20 Iteration 70/3560 Training loss: 3.2870 0.1253 sec/batch\n", + "Epoch 1/20 Iteration 71/3560 Training loss: 3.2849 0.1248 sec/batch\n", + "Epoch 1/20 Iteration 72/3560 Training loss: 3.2832 0.1225 sec/batch\n", + "Epoch 1/20 Iteration 73/3560 Training loss: 3.2812 0.1235 sec/batch\n", + "Epoch 1/20 Iteration 74/3560 Training loss: 3.2793 0.1225 sec/batch\n", + "Epoch 1/20 Iteration 75/3560 Training loss: 3.2775 0.1261 sec/batch\n", + "Epoch 1/20 Iteration 76/3560 Training loss: 
3.2759 0.1271 sec/batch\n", + "Epoch 1/20 Iteration 77/3560 Training loss: 3.2741 0.1226 sec/batch\n", + "Epoch 1/20 Iteration 78/3560 Training loss: 3.2723 0.1257 sec/batch\n", + "Epoch 1/20 Iteration 79/3560 Training loss: 3.2705 0.1245 sec/batch\n", + "Epoch 1/20 Iteration 80/3560 Training loss: 3.2686 0.1197 sec/batch\n", + "Epoch 1/20 Iteration 81/3560 Training loss: 3.2668 0.1215 sec/batch\n", + "Epoch 1/20 Iteration 82/3560 Training loss: 3.2652 0.1207 sec/batch\n", + "Epoch 1/20 Iteration 83/3560 Training loss: 3.2637 0.1335 sec/batch\n", + "Epoch 1/20 Iteration 84/3560 Training loss: 3.2620 0.1215 sec/batch\n", + "Epoch 1/20 Iteration 85/3560 Training loss: 3.2602 0.1225 sec/batch\n", + "Epoch 1/20 Iteration 86/3560 Training loss: 3.2586 0.1243 sec/batch\n", + "Epoch 1/20 Iteration 87/3560 Training loss: 3.2568 0.1305 sec/batch\n", + "Epoch 1/20 Iteration 88/3560 Training loss: 3.2550 0.1210 sec/batch\n", + "Epoch 1/20 Iteration 89/3560 Training loss: 3.2535 0.1249 sec/batch\n", + "Epoch 1/20 Iteration 90/3560 Training loss: 3.2521 0.1296 sec/batch\n", + "Epoch 1/20 Iteration 91/3560 Training loss: 3.2507 0.1220 sec/batch\n", + "Epoch 1/20 Iteration 92/3560 Training loss: 3.2491 0.1240 sec/batch\n", + "Epoch 1/20 Iteration 93/3560 Training loss: 3.2476 0.1225 sec/batch\n", + "Epoch 1/20 Iteration 94/3560 Training loss: 3.2462 0.1242 sec/batch\n", + "Epoch 1/20 Iteration 95/3560 Training loss: 3.2447 0.1218 sec/batch\n", + "Epoch 1/20 Iteration 96/3560 Training loss: 3.2432 0.1251 sec/batch\n", + "Epoch 1/20 Iteration 97/3560 Training loss: 3.2419 0.1244 sec/batch\n", + "Epoch 1/20 Iteration 98/3560 Training loss: 3.2404 0.1217 sec/batch\n", + "Epoch 1/20 Iteration 99/3560 Training loss: 3.2390 0.1235 sec/batch\n", + "Epoch 1/20 Iteration 100/3560 Training loss: 3.2376 0.1211 sec/batch\n", + "Epoch 1/20 Iteration 101/3560 Training loss: 3.2362 0.1214 sec/batch\n", + "Epoch 1/20 Iteration 102/3560 Training loss: 3.2349 0.1213 sec/batch\n", + "Epoch 1/20 
Iteration 103/3560 Training loss: 3.2336 0.1234 sec/batch\n", + "Epoch 1/20 Iteration 104/3560 Training loss: 3.2321 0.1251 sec/batch\n", + "Epoch 1/20 Iteration 105/3560 Training loss: 3.2306 0.1292 sec/batch\n", + "Epoch 1/20 Iteration 106/3560 Training loss: 3.2292 0.1247 sec/batch\n", + "Epoch 1/20 Iteration 107/3560 Training loss: 3.2277 0.1207 sec/batch\n", + "Epoch 1/20 Iteration 108/3560 Training loss: 3.2263 0.1213 sec/batch\n", + "Epoch 1/20 Iteration 109/3560 Training loss: 3.2249 0.1246 sec/batch\n", + "Epoch 1/20 Iteration 110/3560 Training loss: 3.2233 0.1279 sec/batch\n", + "Epoch 1/20 Iteration 111/3560 Training loss: 3.2218 0.1214 sec/batch\n", + "Epoch 1/20 Iteration 112/3560 Training loss: 3.2204 0.1266 sec/batch\n", + "Epoch 1/20 Iteration 113/3560 Training loss: 3.2189 0.1270 sec/batch\n", + "Epoch 1/20 Iteration 114/3560 Training loss: 3.2173 0.1230 sec/batch\n", + "Epoch 1/20 Iteration 115/3560 Training loss: 3.2157 0.1290 sec/batch\n", + "Epoch 1/20 Iteration 116/3560 Training loss: 3.2141 0.1235 sec/batch\n", + "Epoch 1/20 Iteration 117/3560 Training loss: 3.2133 0.1259 sec/batch\n", + "Epoch 1/20 Iteration 118/3560 Training loss: 3.2126 0.1215 sec/batch\n", + "Epoch 1/20 Iteration 119/3560 Training loss: 3.2116 0.1211 sec/batch\n", + "Epoch 1/20 Iteration 120/3560 Training loss: 3.2103 0.1223 sec/batch\n", + "Epoch 1/20 Iteration 121/3560 Training loss: 3.2092 0.1218 sec/batch\n", + "Epoch 1/20 Iteration 122/3560 Training loss: 3.2081 0.1239 sec/batch\n", + "Epoch 1/20 Iteration 123/3560 Training loss: 3.2069 0.1267 sec/batch\n", + "Epoch 1/20 Iteration 124/3560 Training loss: 3.2058 0.1234 sec/batch\n", + "Epoch 1/20 Iteration 125/3560 Training loss: 3.2045 0.1224 sec/batch\n", + "Epoch 1/20 Iteration 126/3560 Training loss: 3.2032 0.1231 sec/batch\n", + "Epoch 1/20 Iteration 127/3560 Training loss: 3.2020 0.1270 sec/batch\n", + "Epoch 1/20 Iteration 128/3560 Training loss: 3.2007 0.1248 sec/batch\n", + "Epoch 1/20 Iteration 129/3560 
Training loss: 3.1994 0.1258 sec/batch\n", + "Epoch 1/20 Iteration 130/3560 Training loss: 3.1981 0.1216 sec/batch\n", + "Epoch 1/20 Iteration 131/3560 Training loss: 3.1969 0.1251 sec/batch\n", + "Epoch 1/20 Iteration 132/3560 Training loss: 3.1955 0.1256 sec/batch\n", + "Epoch 1/20 Iteration 133/3560 Training loss: 3.1942 0.1226 sec/batch\n", + "Epoch 1/20 Iteration 134/3560 Training loss: 3.1927 0.1237 sec/batch\n", + "Epoch 1/20 Iteration 135/3560 Training loss: 3.1911 0.1229 sec/batch\n", + "Epoch 1/20 Iteration 136/3560 Training loss: 3.1894 0.1215 sec/batch\n", + "Epoch 1/20 Iteration 137/3560 Training loss: 3.1878 0.1213 sec/batch\n", + "Epoch 1/20 Iteration 138/3560 Training loss: 3.1862 0.1229 sec/batch\n", + "Epoch 1/20 Iteration 139/3560 Training loss: 3.1848 0.1215 sec/batch\n", + "Epoch 1/20 Iteration 140/3560 Training loss: 3.1831 0.1235 sec/batch\n", + "Epoch 1/20 Iteration 141/3560 Training loss: 3.1815 0.1224 sec/batch\n", + "Epoch 1/20 Iteration 142/3560 Training loss: 3.1797 0.1247 sec/batch\n", + "Epoch 1/20 Iteration 143/3560 Training loss: 3.1780 0.1219 sec/batch\n", + "Epoch 1/20 Iteration 144/3560 Training loss: 3.1762 0.1254 sec/batch\n", + "Epoch 1/20 Iteration 145/3560 Training loss: 3.1745 0.1277 sec/batch\n", + "Epoch 1/20 Iteration 146/3560 Training loss: 3.1728 0.1226 sec/batch\n", + "Epoch 1/20 Iteration 147/3560 Training loss: 3.1711 0.1333 sec/batch\n", + "Epoch 1/20 Iteration 148/3560 Training loss: 3.1695 0.1255 sec/batch\n", + "Epoch 1/20 Iteration 149/3560 Training loss: 3.1675 0.1236 sec/batch\n", + "Epoch 1/20 Iteration 150/3560 Training loss: 3.1657 0.1215 sec/batch\n", + "Epoch 1/20 Iteration 151/3560 Training loss: 3.1641 0.1299 sec/batch\n", + "Epoch 1/20 Iteration 152/3560 Training loss: 3.1624 0.1213 sec/batch\n", + "Epoch 1/20 Iteration 153/3560 Training loss: 3.1607 0.1244 sec/batch\n", + "Epoch 1/20 Iteration 154/3560 Training loss: 3.1588 0.1242 sec/batch\n", + "Epoch 1/20 Iteration 155/3560 Training loss: 3.1569 
0.1227 sec/batch\n", + "Epoch 1/20 Iteration 156/3560 Training loss: 3.1553 0.1214 sec/batch\n", + "Epoch 1/20 Iteration 157/3560 Training loss: 3.1535 0.1217 sec/batch\n", + "Epoch 1/20 Iteration 158/3560 Training loss: 3.1519 0.1250 sec/batch\n", + "Epoch 1/20 Iteration 159/3560 Training loss: 3.1500 0.1219 sec/batch\n", + "Epoch 1/20 Iteration 160/3560 Training loss: 3.1483 0.1236 sec/batch\n", + "Epoch 1/20 Iteration 161/3560 Training loss: 3.1466 0.1218 sec/batch\n", + "Epoch 1/20 Iteration 162/3560 Training loss: 3.1446 0.1330 sec/batch\n", + "Epoch 1/20 Iteration 163/3560 Training loss: 3.1425 0.1230 sec/batch\n", + "Epoch 1/20 Iteration 164/3560 Training loss: 3.1406 0.1226 sec/batch\n", + "Epoch 1/20 Iteration 165/3560 Training loss: 3.1387 0.1249 sec/batch\n", + "Epoch 1/20 Iteration 166/3560 Training loss: 3.1368 0.1264 sec/batch\n", + "Epoch 1/20 Iteration 167/3560 Training loss: 3.1349 0.1221 sec/batch\n", + "Epoch 1/20 Iteration 168/3560 Training loss: 3.1329 0.1235 sec/batch\n", + "Epoch 1/20 Iteration 169/3560 Training loss: 3.1309 0.1219 sec/batch\n", + "Epoch 1/20 Iteration 170/3560 Training loss: 3.1288 0.1231 sec/batch\n", + "Epoch 1/20 Iteration 171/3560 Training loss: 3.1268 0.1213 sec/batch\n", + "Epoch 1/20 Iteration 172/3560 Training loss: 3.1249 0.1222 sec/batch\n", + "Epoch 1/20 Iteration 173/3560 Training loss: 3.1231 0.1225 sec/batch\n", + "Epoch 1/20 Iteration 174/3560 Training loss: 3.1212 0.1257 sec/batch\n", + "Epoch 1/20 Iteration 175/3560 Training loss: 3.1192 0.1240 sec/batch\n", + "Epoch 1/20 Iteration 176/3560 Training loss: 3.1171 0.1236 sec/batch\n", + "Epoch 1/20 Iteration 177/3560 Training loss: 3.1149 0.1250 sec/batch\n", + "Epoch 1/20 Iteration 178/3560 Training loss: 3.1126 0.1254 sec/batch\n", + "Epoch 2/20 Iteration 179/3560 Training loss: 2.7978 0.1236 sec/batch\n", + "Epoch 2/20 Iteration 180/3560 Training loss: 2.7489 0.1246 sec/batch\n", + "Epoch 2/20 Iteration 181/3560 Training loss: 2.7294 0.1258 sec/batch\n", + 
"Epoch 2/20 Iteration 182/3560 Training loss: 2.7204 0.1241 sec/batch\n", + "Epoch 2/20 Iteration 183/3560 Training loss: 2.7128 0.1260 sec/batch\n", + "Epoch 2/20 Iteration 184/3560 Training loss: 2.7075 0.1231 sec/batch\n", + "Epoch 2/20 Iteration 185/3560 Training loss: 2.7047 0.1262 sec/batch\n", + "Epoch 2/20 Iteration 186/3560 Training loss: 2.7016 0.1299 sec/batch\n", + "Epoch 2/20 Iteration 187/3560 Training loss: 2.6978 0.1217 sec/batch\n", + "Epoch 2/20 Iteration 188/3560 Training loss: 2.6938 0.1237 sec/batch\n", + "Epoch 2/20 Iteration 189/3560 Training loss: 2.6883 0.1270 sec/batch\n", + "Epoch 2/20 Iteration 190/3560 Training loss: 2.6866 0.1226 sec/batch\n", + "Epoch 2/20 Iteration 191/3560 Training loss: 2.6829 0.1226 sec/batch\n", + "Epoch 2/20 Iteration 192/3560 Training loss: 2.6817 0.1252 sec/batch\n", + "Epoch 2/20 Iteration 193/3560 Training loss: 2.6788 0.1271 sec/batch\n", + "Epoch 2/20 Iteration 194/3560 Training loss: 2.6762 0.1224 sec/batch\n", + "Epoch 2/20 Iteration 195/3560 Training loss: 2.6732 0.1232 sec/batch\n", + "Epoch 2/20 Iteration 196/3560 Training loss: 2.6727 0.1229 sec/batch\n", + "Epoch 2/20 Iteration 197/3560 Training loss: 2.6700 0.1236 sec/batch\n", + "Epoch 2/20 Iteration 198/3560 Training loss: 2.6658 0.1226 sec/batch\n", + "Epoch 2/20 Iteration 199/3560 Training loss: 2.6627 0.1242 sec/batch\n", + "Epoch 2/20 Iteration 200/3560 Training loss: 2.6617 0.1237 sec/batch\n", + "Epoch 2/20 Iteration 201/3560 Training loss: 2.6590 0.1289 sec/batch\n", + "Epoch 2/20 Iteration 202/3560 Training loss: 2.6559 0.1258 sec/batch\n", + "Epoch 2/20 Iteration 203/3560 Training loss: 2.6525 0.1240 sec/batch\n", + "Epoch 2/20 Iteration 204/3560 Training loss: 2.6502 0.1222 sec/batch\n", + "Epoch 2/20 Iteration 205/3560 Training loss: 2.6474 0.1247 sec/batch\n", + "Epoch 2/20 Iteration 206/3560 Training loss: 2.6447 0.1249 sec/batch\n", + "Epoch 2/20 Iteration 207/3560 Training loss: 2.6422 0.1264 sec/batch\n", + "Epoch 2/20 Iteration 
208/3560 Training loss: 2.6395 0.1217 sec/batch\n", + "Epoch 2/20 Iteration 209/3560 Training loss: 2.6380 0.1271 sec/batch\n", + "Epoch 2/20 Iteration 210/3560 Training loss: 2.6352 0.1232 sec/batch\n", + "Epoch 2/20 Iteration 211/3560 Training loss: 2.6322 0.1232 sec/batch\n", + "Epoch 2/20 Iteration 212/3560 Training loss: 2.6302 0.1228 sec/batch\n", + "Epoch 2/20 Iteration 213/3560 Training loss: 2.6276 0.1225 sec/batch\n", + "Epoch 2/20 Iteration 214/3560 Training loss: 2.6254 0.1239 sec/batch\n", + "Epoch 2/20 Iteration 215/3560 Training loss: 2.6228 0.1240 sec/batch\n", + "Epoch 2/20 Iteration 216/3560 Training loss: 2.6197 0.1242 sec/batch\n", + "Epoch 2/20 Iteration 217/3560 Training loss: 2.6172 0.1211 sec/batch\n", + "Epoch 2/20 Iteration 218/3560 Training loss: 2.6146 0.1246 sec/batch\n", + "Epoch 2/20 Iteration 219/3560 Training loss: 2.6121 0.1248 sec/batch\n", + "Epoch 2/20 Iteration 220/3560 Training loss: 2.6096 0.1225 sec/batch\n", + "Epoch 2/20 Iteration 221/3560 Training loss: 2.6071 0.1214 sec/batch\n", + "Epoch 2/20 Iteration 222/3560 Training loss: 2.6046 0.1224 sec/batch\n", + "Epoch 2/20 Iteration 223/3560 Training loss: 2.6021 0.1256 sec/batch\n", + "Epoch 2/20 Iteration 224/3560 Training loss: 2.5992 0.1247 sec/batch\n", + "Epoch 2/20 Iteration 225/3560 Training loss: 2.5976 0.1206 sec/batch\n", + "Epoch 2/20 Iteration 226/3560 Training loss: 2.5955 0.1209 sec/batch\n", + "Epoch 2/20 Iteration 227/3560 Training loss: 2.5934 0.1223 sec/batch\n", + "Epoch 2/20 Iteration 228/3560 Training loss: 2.5920 0.1215 sec/batch\n", + "Epoch 2/20 Iteration 229/3560 Training loss: 2.5899 0.1213 sec/batch\n", + "Epoch 2/20 Iteration 230/3560 Training loss: 2.5880 0.1223 sec/batch\n", + "Epoch 2/20 Iteration 231/3560 Training loss: 2.5860 0.1219 sec/batch\n", + "Epoch 2/20 Iteration 232/3560 Training loss: 2.5838 0.1210 sec/batch\n", + "Epoch 2/20 Iteration 233/3560 Training loss: 2.5819 0.1224 sec/batch\n", + "Epoch 2/20 Iteration 234/3560 Training loss: 
2.5802 0.1223 sec/batch\n", + "Epoch 2/20 Iteration 235/3560 Training loss: 2.5784 0.1306 sec/batch\n", + "Epoch 2/20 Iteration 236/3560 Training loss: 2.5764 0.1255 sec/batch\n", + "Epoch 2/20 Iteration 237/3560 Training loss: 2.5744 0.1233 sec/batch\n", + "Epoch 2/20 Iteration 238/3560 Training loss: 2.5728 0.1277 sec/batch\n", + "Epoch 2/20 Iteration 239/3560 Training loss: 2.5710 0.1242 sec/batch\n", + "Epoch 2/20 Iteration 240/3560 Training loss: 2.5696 0.1224 sec/batch\n", + "Epoch 2/20 Iteration 241/3560 Training loss: 2.5683 0.1254 sec/batch\n", + "Epoch 2/20 Iteration 242/3560 Training loss: 2.5666 0.1230 sec/batch\n", + "Epoch 2/20 Iteration 243/3560 Training loss: 2.5648 0.1300 sec/batch\n", + "Epoch 2/20 Iteration 244/3560 Training loss: 2.5634 0.1228 sec/batch\n", + "Epoch 2/20 Iteration 245/3560 Training loss: 2.5617 0.1232 sec/batch\n", + "Epoch 2/20 Iteration 246/3560 Training loss: 2.5597 0.1210 sec/batch\n", + "Epoch 2/20 Iteration 247/3560 Training loss: 2.5579 0.1236 sec/batch\n", + "Epoch 2/20 Iteration 248/3560 Training loss: 2.5565 0.1238 sec/batch\n", + "Epoch 2/20 Iteration 249/3560 Training loss: 2.5552 0.1248 sec/batch\n", + "Epoch 2/20 Iteration 250/3560 Training loss: 2.5539 0.1224 sec/batch\n", + "Epoch 2/20 Iteration 251/3560 Training loss: 2.5524 0.1223 sec/batch\n", + "Epoch 2/20 Iteration 252/3560 Training loss: 2.5507 0.1281 sec/batch\n", + "Epoch 2/20 Iteration 253/3560 Training loss: 2.5494 0.1236 sec/batch\n", + "Epoch 2/20 Iteration 254/3560 Training loss: 2.5484 0.1233 sec/batch\n", + "Epoch 2/20 Iteration 255/3560 Training loss: 2.5469 0.1231 sec/batch\n", + "Epoch 2/20 Iteration 256/3560 Training loss: 2.5457 0.1255 sec/batch\n", + "Epoch 2/20 Iteration 257/3560 Training loss: 2.5441 0.1251 sec/batch\n", + "Epoch 2/20 Iteration 258/3560 Training loss: 2.5427 0.1306 sec/batch\n", + "Epoch 2/20 Iteration 259/3560 Training loss: 2.5413 0.1269 sec/batch\n", + "Epoch 2/20 Iteration 260/3560 Training loss: 2.5401 0.1226 
sec/batch\n", + "Epoch 2/20 Iteration 261/3560 Training loss: 2.5387 0.1273 sec/batch\n", + "Epoch 2/20 Iteration 262/3560 Training loss: 2.5372 0.1249 sec/batch\n", + "Epoch 2/20 Iteration 263/3560 Training loss: 2.5354 0.1215 sec/batch\n", + "Epoch 2/20 Iteration 264/3560 Training loss: 2.5338 0.1259 sec/batch\n", + "Epoch 2/20 Iteration 265/3560 Training loss: 2.5326 0.1228 sec/batch\n", + "Epoch 2/20 Iteration 266/3560 Training loss: 2.5312 0.1256 sec/batch\n", + "Epoch 2/20 Iteration 267/3560 Training loss: 2.5298 0.1229 sec/batch\n", + "Epoch 2/20 Iteration 268/3560 Training loss: 2.5287 0.1215 sec/batch\n", + "Epoch 2/20 Iteration 269/3560 Training loss: 2.5273 0.1232 sec/batch\n", + "Epoch 2/20 Iteration 270/3560 Training loss: 2.5261 0.1236 sec/batch\n", + "Epoch 2/20 Iteration 271/3560 Training loss: 2.5248 0.1226 sec/batch\n", + "Epoch 2/20 Iteration 272/3560 Training loss: 2.5234 0.1225 sec/batch\n", + "Epoch 2/20 Iteration 273/3560 Training loss: 2.5218 0.1238 sec/batch\n", + "Epoch 2/20 Iteration 274/3560 Training loss: 2.5204 0.1235 sec/batch\n", + "Epoch 2/20 Iteration 275/3560 Training loss: 2.5192 0.1216 sec/batch\n", + "Epoch 2/20 Iteration 276/3560 Training loss: 2.5179 0.1255 sec/batch\n", + "Epoch 2/20 Iteration 277/3560 Training loss: 2.5167 0.1218 sec/batch\n", + "Epoch 2/20 Iteration 278/3560 Training loss: 2.5154 0.1210 sec/batch\n", + "Epoch 2/20 Iteration 279/3560 Training loss: 2.5144 0.1233 sec/batch\n", + "Epoch 2/20 Iteration 280/3560 Training loss: 2.5133 0.1227 sec/batch\n", + "Epoch 2/20 Iteration 281/3560 Training loss: 2.5119 0.1240 sec/batch\n", + "Epoch 2/20 Iteration 282/3560 Training loss: 2.5106 0.1272 sec/batch\n", + "Epoch 2/20 Iteration 283/3560 Training loss: 2.5094 0.1237 sec/batch\n", + "Epoch 2/20 Iteration 284/3560 Training loss: 2.5082 0.1219 sec/batch\n", + "Epoch 2/20 Iteration 285/3560 Training loss: 2.5070 0.1249 sec/batch\n", + "Epoch 2/20 Iteration 286/3560 Training loss: 2.5061 0.1312 sec/batch\n", + "Epoch 
2/20 Iteration 287/3560 Training loss: 2.5052 0.1315 sec/batch\n", + "Epoch 2/20 Iteration 288/3560 Training loss: 2.5038 0.1225 sec/batch\n", + "Epoch 2/20 Iteration 289/3560 Training loss: 2.5028 0.1248 sec/batch\n", + "Epoch 2/20 Iteration 290/3560 Training loss: 2.5019 0.1243 sec/batch\n", + "Epoch 2/20 Iteration 291/3560 Training loss: 2.5007 0.1222 sec/batch\n", + "Epoch 2/20 Iteration 292/3560 Training loss: 2.4995 0.1248 sec/batch\n", + "Epoch 2/20 Iteration 293/3560 Training loss: 2.4985 0.1248 sec/batch\n", + "Epoch 2/20 Iteration 294/3560 Training loss: 2.4971 0.1221 sec/batch\n", + "Epoch 2/20 Iteration 295/3560 Training loss: 2.4961 0.1250 sec/batch\n", + "Epoch 2/20 Iteration 296/3560 Training loss: 2.4950 0.1213 sec/batch\n", + "Epoch 2/20 Iteration 297/3560 Training loss: 2.4941 0.1242 sec/batch\n", + "Epoch 2/20 Iteration 298/3560 Training loss: 2.4931 0.1235 sec/batch\n", + "Epoch 2/20 Iteration 299/3560 Training loss: 2.4923 0.1214 sec/batch\n", + "Epoch 2/20 Iteration 300/3560 Training loss: 2.4912 0.1229 sec/batch\n", + "Epoch 2/20 Iteration 301/3560 Training loss: 2.4900 0.1225 sec/batch\n", + "Epoch 2/20 Iteration 302/3560 Training loss: 2.4891 0.1256 sec/batch\n", + "Epoch 2/20 Iteration 303/3560 Training loss: 2.4881 0.1246 sec/batch\n", + "Epoch 2/20 Iteration 304/3560 Training loss: 2.4869 0.1282 sec/batch\n", + "Epoch 2/20 Iteration 305/3560 Training loss: 2.4860 0.1242 sec/batch\n", + "Epoch 2/20 Iteration 306/3560 Training loss: 2.4851 0.1225 sec/batch\n", + "Epoch 2/20 Iteration 307/3560 Training loss: 2.4842 0.1224 sec/batch\n", + "Epoch 2/20 Iteration 308/3560 Training loss: 2.4832 0.1229 sec/batch\n", + "Epoch 2/20 Iteration 309/3560 Training loss: 2.4821 0.1228 sec/batch\n", + "Epoch 2/20 Iteration 310/3560 Training loss: 2.4810 0.1234 sec/batch\n", + "Epoch 2/20 Iteration 311/3560 Training loss: 2.4800 0.1242 sec/batch\n", + "Epoch 2/20 Iteration 312/3560 Training loss: 2.4791 0.1265 sec/batch\n", + "Epoch 2/20 Iteration 313/3560 
Training loss: 2.4780 0.1213 sec/batch\n", + "Epoch 2/20 Iteration 314/3560 Training loss: 2.4770 0.1218 sec/batch\n", + "Epoch 2/20 Iteration 315/3560 Training loss: 2.4761 0.1231 sec/batch\n", + "Epoch 2/20 Iteration 316/3560 Training loss: 2.4751 0.1216 sec/batch\n", + "Epoch 2/20 Iteration 317/3560 Training loss: 2.4744 0.1232 sec/batch\n", + "Epoch 2/20 Iteration 318/3560 Training loss: 2.4734 0.1230 sec/batch\n", + "Epoch 2/20 Iteration 319/3560 Training loss: 2.4726 0.1223 sec/batch\n", + "Epoch 2/20 Iteration 320/3560 Training loss: 2.4716 0.1212 sec/batch\n", + "Epoch 2/20 Iteration 321/3560 Training loss: 2.4706 0.1233 sec/batch\n", + "Epoch 2/20 Iteration 322/3560 Training loss: 2.4697 0.1244 sec/batch\n", + "Epoch 2/20 Iteration 323/3560 Training loss: 2.4687 0.1234 sec/batch\n", + "Epoch 2/20 Iteration 324/3560 Training loss: 2.4680 0.1271 sec/batch\n", + "Epoch 2/20 Iteration 325/3560 Training loss: 2.4670 0.1270 sec/batch\n", + "Epoch 2/20 Iteration 326/3560 Training loss: 2.4663 0.1300 sec/batch\n", + "Epoch 2/20 Iteration 327/3560 Training loss: 2.4653 0.1224 sec/batch\n", + "Epoch 2/20 Iteration 328/3560 Training loss: 2.4643 0.1287 sec/batch\n", + "Epoch 2/20 Iteration 329/3560 Training loss: 2.4637 0.1291 sec/batch\n", + "Epoch 2/20 Iteration 330/3560 Training loss: 2.4631 0.1302 sec/batch\n", + "Epoch 2/20 Iteration 331/3560 Training loss: 2.4623 0.1249 sec/batch\n", + "Epoch 2/20 Iteration 332/3560 Training loss: 2.4615 0.1219 sec/batch\n", + "Epoch 2/20 Iteration 333/3560 Training loss: 2.4606 0.1252 sec/batch\n", + "Epoch 2/20 Iteration 334/3560 Training loss: 2.4597 0.1255 sec/batch\n", + "Epoch 2/20 Iteration 335/3560 Training loss: 2.4588 0.1215 sec/batch\n", + "Epoch 2/20 Iteration 336/3560 Training loss: 2.4579 0.1228 sec/batch\n", + "Epoch 2/20 Iteration 337/3560 Training loss: 2.4568 0.1251 sec/batch\n", + "Epoch 2/20 Iteration 338/3560 Training loss: 2.4561 0.1243 sec/batch\n", + "Epoch 2/20 Iteration 339/3560 Training loss: 2.4553 
0.1229 sec/batch\n", + "Epoch 2/20 Iteration 340/3560 Training loss: 2.4542 0.1222 sec/batch\n", + "Epoch 2/20 Iteration 341/3560 Training loss: 2.4533 0.1227 sec/batch\n", + "Epoch 2/20 Iteration 342/3560 Training loss: 2.4524 0.1217 sec/batch\n", + "Epoch 2/20 Iteration 343/3560 Training loss: 2.4516 0.1249 sec/batch\n", + "Epoch 2/20 Iteration 344/3560 Training loss: 2.4507 0.1222 sec/batch\n", + "Epoch 2/20 Iteration 345/3560 Training loss: 2.4499 0.1249 sec/batch\n", + "Epoch 2/20 Iteration 346/3560 Training loss: 2.4491 0.1257 sec/batch\n", + "Epoch 2/20 Iteration 347/3560 Training loss: 2.4483 0.1242 sec/batch\n", + "Epoch 2/20 Iteration 348/3560 Training loss: 2.4474 0.1223 sec/batch\n", + "Epoch 2/20 Iteration 349/3560 Training loss: 2.4465 0.1245 sec/batch\n", + "Epoch 2/20 Iteration 350/3560 Training loss: 2.4456 0.1220 sec/batch\n", + "Epoch 2/20 Iteration 351/3560 Training loss: 2.4449 0.1296 sec/batch\n", + "Epoch 2/20 Iteration 352/3560 Training loss: 2.4442 0.1212 sec/batch\n", + "Epoch 2/20 Iteration 353/3560 Training loss: 2.4435 0.1243 sec/batch\n", + "Epoch 2/20 Iteration 354/3560 Training loss: 2.4427 0.1225 sec/batch\n", + "Epoch 2/20 Iteration 355/3560 Training loss: 2.4418 0.1246 sec/batch\n", + "Epoch 2/20 Iteration 356/3560 Training loss: 2.4408 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 357/3560 Training loss: 2.4076 0.1240 sec/batch\n", + "Epoch 3/20 Iteration 358/3560 Training loss: 2.3322 0.1259 sec/batch\n", + "Epoch 3/20 Iteration 359/3560 Training loss: 2.3108 0.1222 sec/batch\n", + "Epoch 3/20 Iteration 360/3560 Training loss: 2.3012 0.1244 sec/batch\n", + "Epoch 3/20 Iteration 361/3560 Training loss: 2.2960 0.1232 sec/batch\n", + "Epoch 3/20 Iteration 362/3560 Training loss: 2.2904 0.1240 sec/batch\n", + "Epoch 3/20 Iteration 363/3560 Training loss: 2.2886 0.1303 sec/batch\n", + "Epoch 3/20 Iteration 364/3560 Training loss: 2.2891 0.1266 sec/batch\n", + "Epoch 3/20 Iteration 365/3560 Training loss: 2.2894 0.1232 sec/batch\n", + 
"Epoch 3/20 Iteration 366/3560 Training loss: 2.2875 0.1230 sec/batch\n", + "Epoch 3/20 Iteration 367/3560 Training loss: 2.2848 0.1220 sec/batch\n", + "Epoch 3/20 Iteration 368/3560 Training loss: 2.2836 0.1227 sec/batch\n", + "Epoch 3/20 Iteration 369/3560 Training loss: 2.2828 0.1226 sec/batch\n", + "Epoch 3/20 Iteration 370/3560 Training loss: 2.2849 0.1238 sec/batch\n", + "Epoch 3/20 Iteration 371/3560 Training loss: 2.2839 0.1257 sec/batch\n", + "Epoch 3/20 Iteration 372/3560 Training loss: 2.2828 0.1271 sec/batch\n", + "Epoch 3/20 Iteration 373/3560 Training loss: 2.2821 0.1213 sec/batch\n", + "Epoch 3/20 Iteration 374/3560 Training loss: 2.2834 0.1240 sec/batch\n", + "Epoch 3/20 Iteration 375/3560 Training loss: 2.2828 0.1220 sec/batch\n", + "Epoch 3/20 Iteration 376/3560 Training loss: 2.2809 0.1219 sec/batch\n", + "Epoch 3/20 Iteration 377/3560 Training loss: 2.2796 0.1232 sec/batch\n", + "Epoch 3/20 Iteration 378/3560 Training loss: 2.2806 0.1214 sec/batch\n", + "Epoch 3/20 Iteration 379/3560 Training loss: 2.2794 0.1214 sec/batch\n", + "Epoch 3/20 Iteration 380/3560 Training loss: 2.2782 0.1296 sec/batch\n", + "Epoch 3/20 Iteration 381/3560 Training loss: 2.2772 0.1220 sec/batch\n", + "Epoch 3/20 Iteration 382/3560 Training loss: 2.2763 0.1246 sec/batch\n", + "Epoch 3/20 Iteration 383/3560 Training loss: 2.2750 0.1211 sec/batch\n", + "Epoch 3/20 Iteration 384/3560 Training loss: 2.2745 0.1227 sec/batch\n", + "Epoch 3/20 Iteration 385/3560 Training loss: 2.2744 0.1247 sec/batch\n", + "Epoch 3/20 Iteration 386/3560 Training loss: 2.2739 0.1289 sec/batch\n", + "Epoch 3/20 Iteration 387/3560 Training loss: 2.2732 0.1230 sec/batch\n", + "Epoch 3/20 Iteration 388/3560 Training loss: 2.2719 0.1247 sec/batch\n", + "Epoch 3/20 Iteration 389/3560 Training loss: 2.2706 0.1222 sec/batch\n", + "Epoch 3/20 Iteration 390/3560 Training loss: 2.2705 0.1254 sec/batch\n", + "Epoch 3/20 Iteration 391/3560 Training loss: 2.2697 0.1266 sec/batch\n", + "Epoch 3/20 Iteration 
392/3560 Training loss: 2.2690 0.1227 sec/batch\n", + "Epoch 3/20 Iteration 393/3560 Training loss: 2.2681 0.1245 sec/batch\n", + "Epoch 3/20 Iteration 394/3560 Training loss: 2.2664 0.1240 sec/batch\n", + "Epoch 3/20 Iteration 395/3560 Training loss: 2.2650 0.1220 sec/batch\n", + "Epoch 3/20 Iteration 396/3560 Training loss: 2.2635 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 397/3560 Training loss: 2.2625 0.1217 sec/batch\n", + "Epoch 3/20 Iteration 398/3560 Training loss: 2.2615 0.1241 sec/batch\n", + "Epoch 3/20 Iteration 399/3560 Training loss: 2.2601 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 400/3560 Training loss: 2.2588 0.1215 sec/batch\n", + "Epoch 3/20 Iteration 401/3560 Training loss: 2.2581 0.1258 sec/batch\n", + "Epoch 3/20 Iteration 402/3560 Training loss: 2.2564 0.1219 sec/batch\n", + "Epoch 3/20 Iteration 403/3560 Training loss: 2.2559 0.1271 sec/batch\n", + "Epoch 3/20 Iteration 404/3560 Training loss: 2.2550 0.1215 sec/batch\n", + "Epoch 3/20 Iteration 405/3560 Training loss: 2.2543 0.1262 sec/batch\n", + "Epoch 3/20 Iteration 406/3560 Training loss: 2.2541 0.1234 sec/batch\n", + "Epoch 3/20 Iteration 407/3560 Training loss: 2.2529 0.1232 sec/batch\n", + "Epoch 3/20 Iteration 408/3560 Training loss: 2.2529 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 409/3560 Training loss: 2.2519 0.1268 sec/batch\n", + "Epoch 3/20 Iteration 410/3560 Training loss: 2.2511 0.1253 sec/batch\n", + "Epoch 3/20 Iteration 411/3560 Training loss: 2.2503 0.1223 sec/batch\n", + "Epoch 3/20 Iteration 412/3560 Training loss: 2.2497 0.1273 sec/batch\n", + "Epoch 3/20 Iteration 413/3560 Training loss: 2.2491 0.1224 sec/batch\n", + "Epoch 3/20 Iteration 414/3560 Training loss: 2.2483 0.1222 sec/batch\n", + "Epoch 3/20 Iteration 415/3560 Training loss: 2.2473 0.1265 sec/batch\n", + "Epoch 3/20 Iteration 416/3560 Training loss: 2.2472 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 417/3560 Training loss: 2.2465 0.1301 sec/batch\n", + "Epoch 3/20 Iteration 418/3560 Training loss: 
2.2464 0.1242 sec/batch\n", + "Epoch 3/20 Iteration 419/3560 Training loss: 2.2462 0.1220 sec/batch\n", + "Epoch 3/20 Iteration 420/3560 Training loss: 2.2458 0.1226 sec/batch\n", + "Epoch 3/20 Iteration 421/3560 Training loss: 2.2450 0.1255 sec/batch\n", + "Epoch 3/20 Iteration 422/3560 Training loss: 2.2445 0.1225 sec/batch\n", + "Epoch 3/20 Iteration 423/3560 Training loss: 2.2441 0.1247 sec/batch\n", + "Epoch 3/20 Iteration 424/3560 Training loss: 2.2431 0.1239 sec/batch\n", + "Epoch 3/20 Iteration 425/3560 Training loss: 2.2421 0.1222 sec/batch\n", + "Epoch 3/20 Iteration 426/3560 Training loss: 2.2416 0.1257 sec/batch\n", + "Epoch 3/20 Iteration 427/3560 Training loss: 2.2414 0.1237 sec/batch\n", + "Epoch 3/20 Iteration 428/3560 Training loss: 2.2408 0.1247 sec/batch\n", + "Epoch 3/20 Iteration 429/3560 Training loss: 2.2405 0.1230 sec/batch\n", + "Epoch 3/20 Iteration 430/3560 Training loss: 2.2396 0.1232 sec/batch\n", + "Epoch 3/20 Iteration 431/3560 Training loss: 2.2388 0.1231 sec/batch\n", + "Epoch 3/20 Iteration 432/3560 Training loss: 2.2386 0.1256 sec/batch\n", + "Epoch 3/20 Iteration 433/3560 Training loss: 2.2379 0.1258 sec/batch\n", + "Epoch 3/20 Iteration 434/3560 Training loss: 2.2373 0.1211 sec/batch\n", + "Epoch 3/20 Iteration 435/3560 Training loss: 2.2364 0.1213 sec/batch\n", + "Epoch 3/20 Iteration 436/3560 Training loss: 2.2357 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 437/3560 Training loss: 2.2348 0.1242 sec/batch\n", + "Epoch 3/20 Iteration 438/3560 Training loss: 2.2343 0.1220 sec/batch\n", + "Epoch 3/20 Iteration 439/3560 Training loss: 2.2334 0.1267 sec/batch\n", + "Epoch 3/20 Iteration 440/3560 Training loss: 2.2328 0.1276 sec/batch\n", + "Epoch 3/20 Iteration 441/3560 Training loss: 2.2317 0.1313 sec/batch\n", + "Epoch 3/20 Iteration 442/3560 Training loss: 2.2309 0.1251 sec/batch\n", + "Epoch 3/20 Iteration 443/3560 Training loss: 2.2303 0.1245 sec/batch\n", + "Epoch 3/20 Iteration 444/3560 Training loss: 2.2296 0.1245 
sec/batch\n", + "Epoch 3/20 Iteration 445/3560 Training loss: 2.2288 0.1263 sec/batch\n", + "Epoch 3/20 Iteration 446/3560 Training loss: 2.2283 0.1275 sec/batch\n", + "Epoch 3/20 Iteration 447/3560 Training loss: 2.2275 0.1229 sec/batch\n", + "Epoch 3/20 Iteration 448/3560 Training loss: 2.2270 0.1242 sec/batch\n", + "Epoch 3/20 Iteration 449/3560 Training loss: 2.2261 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 450/3560 Training loss: 2.2253 0.1249 sec/batch\n", + "Epoch 3/20 Iteration 451/3560 Training loss: 2.2244 0.1224 sec/batch\n", + "Epoch 3/20 Iteration 452/3560 Training loss: 2.2236 0.1374 sec/batch\n", + "Epoch 3/20 Iteration 453/3560 Training loss: 2.2230 0.1248 sec/batch\n", + "Epoch 3/20 Iteration 454/3560 Training loss: 2.2223 0.1222 sec/batch\n", + "Epoch 3/20 Iteration 455/3560 Training loss: 2.2212 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 456/3560 Training loss: 2.2203 0.1229 sec/batch\n", + "Epoch 3/20 Iteration 457/3560 Training loss: 2.2198 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 458/3560 Training loss: 2.2192 0.1224 sec/batch\n", + "Epoch 3/20 Iteration 459/3560 Training loss: 2.2183 0.1324 sec/batch\n", + "Epoch 3/20 Iteration 460/3560 Training loss: 2.2176 0.1225 sec/batch\n", + "Epoch 3/20 Iteration 461/3560 Training loss: 2.2168 0.1256 sec/batch\n", + "Epoch 3/20 Iteration 462/3560 Training loss: 2.2161 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 463/3560 Training loss: 2.2155 0.1282 sec/batch\n", + "Epoch 3/20 Iteration 464/3560 Training loss: 2.2150 0.1225 sec/batch\n", + "Epoch 3/20 Iteration 465/3560 Training loss: 2.2145 0.1239 sec/batch\n", + "Epoch 3/20 Iteration 466/3560 Training loss: 2.2138 0.1267 sec/batch\n", + "Epoch 3/20 Iteration 467/3560 Training loss: 2.2132 0.1244 sec/batch\n", + "Epoch 3/20 Iteration 468/3560 Training loss: 2.2127 0.1239 sec/batch\n", + "Epoch 3/20 Iteration 469/3560 Training loss: 2.2121 0.1248 sec/batch\n", + "Epoch 3/20 Iteration 470/3560 Training loss: 2.2114 0.1229 sec/batch\n", + "Epoch 
3/20 Iteration 471/3560 Training loss: 2.2108 0.1238 sec/batch\n", + "Epoch 3/20 Iteration 472/3560 Training loss: 2.2099 0.1219 sec/batch\n", + "Epoch 3/20 Iteration 473/3560 Training loss: 2.2093 0.1252 sec/batch\n", + "Epoch 3/20 Iteration 474/3560 Training loss: 2.2087 0.1227 sec/batch\n", + "Epoch 3/20 Iteration 475/3560 Training loss: 2.2082 0.1241 sec/batch\n", + "Epoch 3/20 Iteration 476/3560 Training loss: 2.2077 0.1250 sec/batch\n", + "Epoch 3/20 Iteration 477/3560 Training loss: 2.2072 0.1228 sec/batch\n", + "Epoch 3/20 Iteration 478/3560 Training loss: 2.2066 0.1239 sec/batch\n", + "Epoch 3/20 Iteration 479/3560 Training loss: 2.2059 0.1272 sec/batch\n", + "Epoch 3/20 Iteration 480/3560 Training loss: 2.2054 0.1233 sec/batch\n", + "Epoch 3/20 Iteration 481/3560 Training loss: 2.2048 0.1230 sec/batch\n", + "Epoch 3/20 Iteration 482/3560 Training loss: 2.2040 0.1241 sec/batch\n", + "Epoch 3/20 Iteration 483/3560 Training loss: 2.2036 0.1243 sec/batch\n", + "Epoch 3/20 Iteration 484/3560 Training loss: 2.2031 0.1226 sec/batch\n", + "Epoch 3/20 Iteration 485/3560 Training loss: 2.2026 0.1229 sec/batch\n", + "Epoch 3/20 Iteration 486/3560 Training loss: 2.2022 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 487/3560 Training loss: 2.2015 0.1263 sec/batch\n", + "Epoch 3/20 Iteration 488/3560 Training loss: 2.2008 0.1239 sec/batch\n", + "Epoch 3/20 Iteration 489/3560 Training loss: 2.2003 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 490/3560 Training loss: 2.1998 0.1295 sec/batch\n", + "Epoch 3/20 Iteration 491/3560 Training loss: 2.1993 0.1264 sec/batch\n", + "Epoch 3/20 Iteration 492/3560 Training loss: 2.1989 0.1225 sec/batch\n", + "Epoch 3/20 Iteration 493/3560 Training loss: 2.1984 0.1302 sec/batch\n", + "Epoch 3/20 Iteration 494/3560 Training loss: 2.1979 0.1214 sec/batch\n", + "Epoch 3/20 Iteration 495/3560 Training loss: 2.1977 0.1231 sec/batch\n", + "Epoch 3/20 Iteration 496/3560 Training loss: 2.1971 0.1227 sec/batch\n", + "Epoch 3/20 Iteration 497/3560 
Training loss: 2.1968 0.1213 sec/batch\n", + "Epoch 3/20 Iteration 498/3560 Training loss: 2.1962 0.1223 sec/batch\n", + "Epoch 3/20 Iteration 499/3560 Training loss: 2.1957 0.1211 sec/batch\n", + "Epoch 3/20 Iteration 500/3560 Training loss: 2.1951 0.1269 sec/batch\n", + "Epoch 3/20 Iteration 501/3560 Training loss: 2.1945 0.1226 sec/batch\n", + "Epoch 3/20 Iteration 502/3560 Training loss: 2.1941 0.1230 sec/batch\n", + "Epoch 3/20 Iteration 503/3560 Training loss: 2.1938 0.1263 sec/batch\n", + "Epoch 3/20 Iteration 504/3560 Training loss: 2.1934 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 505/3560 Training loss: 2.1929 0.1249 sec/batch\n", + "Epoch 3/20 Iteration 506/3560 Training loss: 2.1922 0.1280 sec/batch\n", + "Epoch 3/20 Iteration 507/3560 Training loss: 2.1918 0.1247 sec/batch\n", + "Epoch 3/20 Iteration 508/3560 Training loss: 2.1916 0.1273 sec/batch\n", + "Epoch 3/20 Iteration 509/3560 Training loss: 2.1912 0.1249 sec/batch\n", + "Epoch 3/20 Iteration 510/3560 Training loss: 2.1907 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 511/3560 Training loss: 2.1901 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 512/3560 Training loss: 2.1896 0.1236 sec/batch\n", + "Epoch 3/20 Iteration 513/3560 Training loss: 2.1891 0.1220 sec/batch\n", + "Epoch 3/20 Iteration 514/3560 Training loss: 2.1885 0.1241 sec/batch\n", + "Epoch 3/20 Iteration 515/3560 Training loss: 2.1878 0.1291 sec/batch\n", + "Epoch 3/20 Iteration 516/3560 Training loss: 2.1875 0.1214 sec/batch\n", + "Epoch 3/20 Iteration 517/3560 Training loss: 2.1871 0.1225 sec/batch\n", + "Epoch 3/20 Iteration 518/3560 Training loss: 2.1866 0.1221 sec/batch\n", + "Epoch 3/20 Iteration 519/3560 Training loss: 2.1862 0.1225 sec/batch\n", + "Epoch 3/20 Iteration 520/3560 Training loss: 2.1857 0.1235 sec/batch\n", + "Epoch 3/20 Iteration 521/3560 Training loss: 2.1853 0.1228 sec/batch\n", + "Epoch 3/20 Iteration 522/3560 Training loss: 2.1847 0.1225 sec/batch\n", + "Epoch 3/20 Iteration 523/3560 Training loss: 2.1844 
0.1222 sec/batch\n", + "Epoch 3/20 Iteration 524/3560 Training loss: 2.1840 0.1231 sec/batch\n", + "Epoch 3/20 Iteration 525/3560 Training loss: 2.1835 0.1225 sec/batch\n", + "Epoch 3/20 Iteration 526/3560 Training loss: 2.1830 0.1231 sec/batch\n", + "Epoch 3/20 Iteration 527/3560 Training loss: 2.1825 0.1242 sec/batch\n", + "Epoch 3/20 Iteration 528/3560 Training loss: 2.1821 0.1225 sec/batch\n", + "Epoch 3/20 Iteration 529/3560 Training loss: 2.1818 0.1233 sec/batch\n", + "Epoch 3/20 Iteration 530/3560 Training loss: 2.1814 0.1224 sec/batch\n", + "Epoch 3/20 Iteration 531/3560 Training loss: 2.1810 0.1254 sec/batch\n", + "Epoch 3/20 Iteration 532/3560 Training loss: 2.1805 0.1216 sec/batch\n", + "Epoch 3/20 Iteration 533/3560 Training loss: 2.1799 0.1216 sec/batch\n", + "Epoch 3/20 Iteration 534/3560 Training loss: 2.1795 0.1237 sec/batch\n", + "Epoch 4/20 Iteration 535/3560 Training loss: 2.1941 0.1211 sec/batch\n", + "Epoch 4/20 Iteration 536/3560 Training loss: 2.1302 0.1218 sec/batch\n", + "Epoch 4/20 Iteration 537/3560 Training loss: 2.1121 0.1217 sec/batch\n", + "Epoch 4/20 Iteration 538/3560 Training loss: 2.1012 0.1255 sec/batch\n", + "Epoch 4/20 Iteration 539/3560 Training loss: 2.0959 0.1227 sec/batch\n", + "Epoch 4/20 Iteration 540/3560 Training loss: 2.0897 0.1236 sec/batch\n", + "Epoch 4/20 Iteration 541/3560 Training loss: 2.0900 0.1235 sec/batch\n", + "Epoch 4/20 Iteration 542/3560 Training loss: 2.0900 0.1243 sec/batch\n", + "Epoch 4/20 Iteration 543/3560 Training loss: 2.0907 0.1231 sec/batch\n", + "Epoch 4/20 Iteration 544/3560 Training loss: 2.0906 0.1246 sec/batch\n", + "Epoch 4/20 Iteration 545/3560 Training loss: 2.0872 0.1231 sec/batch\n", + "Epoch 4/20 Iteration 546/3560 Training loss: 2.0853 0.1240 sec/batch\n", + "Epoch 4/20 Iteration 547/3560 Training loss: 2.0849 0.1238 sec/batch\n", + "Epoch 4/20 Iteration 548/3560 Training loss: 2.0863 0.1287 sec/batch\n", + "Epoch 4/20 Iteration 549/3560 Training loss: 2.0849 0.1309 sec/batch\n", + 
"Epoch 4/20 Iteration 550/3560 Training loss: 2.0831 0.1233 sec/batch\n", + "Epoch 4/20 Iteration 551/3560 Training loss: 2.0818 0.1337 sec/batch\n", + "Epoch 4/20 Iteration 552/3560 Training loss: 2.0834 0.1223 sec/batch\n", + "Epoch 4/20 Iteration 553/3560 Training loss: 2.0831 0.1211 sec/batch\n", + "Epoch 4/20 Iteration 554/3560 Training loss: 2.0824 0.1229 sec/batch\n", + "Epoch 4/20 Iteration 555/3560 Training loss: 2.0815 0.1219 sec/batch\n", + "Epoch 4/20 Iteration 556/3560 Training loss: 2.0826 0.1219 sec/batch\n", + "Epoch 4/20 Iteration 557/3560 Training loss: 2.0817 0.1215 sec/batch\n", + "Epoch 4/20 Iteration 558/3560 Training loss: 2.0805 0.1227 sec/batch\n", + "Epoch 4/20 Iteration 559/3560 Training loss: 2.0800 0.1255 sec/batch\n", + "Epoch 4/20 Iteration 560/3560 Training loss: 2.0785 0.1309 sec/batch\n", + "Epoch 4/20 Iteration 561/3560 Training loss: 2.0771 0.1306 sec/batch\n", + "Epoch 4/20 Iteration 562/3560 Training loss: 2.0768 0.1231 sec/batch\n", + "Epoch 4/20 Iteration 563/3560 Training loss: 2.0774 0.1218 sec/batch\n", + "Epoch 4/20 Iteration 564/3560 Training loss: 2.0772 0.1237 sec/batch\n", + "Epoch 4/20 Iteration 565/3560 Training loss: 2.0766 0.1212 sec/batch\n", + "Epoch 4/20 Iteration 566/3560 Training loss: 2.0755 0.1345 sec/batch\n", + "Epoch 4/20 Iteration 567/3560 Training loss: 2.0751 0.1224 sec/batch\n", + "Epoch 4/20 Iteration 568/3560 Training loss: 2.0753 0.1285 sec/batch\n", + "Epoch 4/20 Iteration 569/3560 Training loss: 2.0747 0.1233 sec/batch\n", + "Epoch 4/20 Iteration 570/3560 Training loss: 2.0741 0.1235 sec/batch\n", + "Epoch 4/20 Iteration 571/3560 Training loss: 2.0735 0.1244 sec/batch\n", + "Epoch 4/20 Iteration 572/3560 Training loss: 2.0720 0.1226 sec/batch\n", + "Epoch 4/20 Iteration 573/3560 Training loss: 2.0704 0.1252 sec/batch\n", + "Epoch 4/20 Iteration 574/3560 Training loss: 2.0691 0.1223 sec/batch\n", + "Epoch 4/20 Iteration 575/3560 Training loss: 2.0683 0.1257 sec/batch\n", + "Epoch 4/20 Iteration 
576/3560 Training loss: 2.0675 0.1214 sec/batch\n", + "Epoch 4/20 Iteration 577/3560 Training loss: 2.0666 0.1222 sec/batch\n", + "Epoch 4/20 Iteration 578/3560 Training loss: 2.0654 0.1231 sec/batch\n", + "Epoch 4/20 Iteration 579/3560 Training loss: 2.0648 0.1247 sec/batch\n", + "Epoch 4/20 Iteration 580/3560 Training loss: 2.0630 0.1228 sec/batch\n", + "Epoch 4/20 Iteration 581/3560 Training loss: 2.0626 0.1231 sec/batch\n", + "Epoch 4/20 Iteration 582/3560 Training loss: 2.0618 0.1222 sec/batch\n", + "Epoch 4/20 Iteration 583/3560 Training loss: 2.0614 0.1219 sec/batch\n", + "Epoch 4/20 Iteration 584/3560 Training loss: 2.0618 0.1239 sec/batch\n", + "Epoch 4/20 Iteration 585/3560 Training loss: 2.0609 0.1214 sec/batch\n", + "Epoch 4/20 Iteration 586/3560 Training loss: 2.0613 0.1220 sec/batch\n", + "Epoch 4/20 Iteration 587/3560 Training loss: 2.0606 0.1246 sec/batch\n", + "Epoch 4/20 Iteration 588/3560 Training loss: 2.0601 0.1248 sec/batch\n", + "Epoch 4/20 Iteration 589/3560 Training loss: 2.0593 0.1254 sec/batch\n", + "Epoch 4/20 Iteration 590/3560 Training loss: 2.0590 0.1245 sec/batch\n", + "Epoch 4/20 Iteration 591/3560 Training loss: 2.0588 0.1245 sec/batch\n", + "Epoch 4/20 Iteration 592/3560 Training loss: 2.0583 0.1226 sec/batch\n", + "Epoch 4/20 Iteration 593/3560 Training loss: 2.0575 0.1243 sec/batch\n", + "Epoch 4/20 Iteration 594/3560 Training loss: 2.0576 0.1289 sec/batch\n", + "Epoch 4/20 Iteration 595/3560 Training loss: 2.0571 0.1240 sec/batch\n", + "Epoch 4/20 Iteration 596/3560 Training loss: 2.0572 0.1216 sec/batch\n", + "Epoch 4/20 Iteration 597/3560 Training loss: 2.0572 0.1217 sec/batch\n", + "Epoch 4/20 Iteration 598/3560 Training loss: 2.0571 0.1240 sec/batch\n", + "Epoch 4/20 Iteration 599/3560 Training loss: 2.0564 0.1227 sec/batch\n", + "Epoch 4/20 Iteration 600/3560 Training loss: 2.0564 0.1268 sec/batch\n", + "Epoch 4/20 Iteration 601/3560 Training loss: 2.0561 0.1227 sec/batch\n", + "Epoch 4/20 Iteration 602/3560 Training loss: 
2.0552 0.1229 sec/batch\n", + "Epoch 4/20 Iteration 603/3560 Training loss: 2.0546 0.1231 sec/batch\n", + "Epoch 4/20 Iteration 604/3560 Training loss: 2.0542 0.1290 sec/batch\n", + "Epoch 4/20 Iteration 605/3560 Training loss: 2.0543 0.1225 sec/batch\n", + "Epoch 4/20 Iteration 606/3560 Training loss: 2.0541 0.1222 sec/batch\n", + "Epoch 4/20 Iteration 607/3560 Training loss: 2.0538 0.1249 sec/batch\n", + "Epoch 4/20 Iteration 608/3560 Training loss: 2.0531 0.1229 sec/batch\n", + "Epoch 4/20 Iteration 609/3560 Training loss: 2.0527 0.1248 sec/batch\n", + "Epoch 4/20 Iteration 610/3560 Training loss: 2.0527 0.1262 sec/batch\n", + "Epoch 4/20 Iteration 611/3560 Training loss: 2.0522 0.1235 sec/batch\n", + "Epoch 4/20 Iteration 612/3560 Training loss: 2.0520 0.1262 sec/batch\n", + "Epoch 4/20 Iteration 613/3560 Training loss: 2.0511 0.1234 sec/batch\n", + "Epoch 4/20 Iteration 614/3560 Training loss: 2.0506 0.1251 sec/batch\n", + "Epoch 4/20 Iteration 615/3560 Training loss: 2.0497 0.1257 sec/batch\n", + "Epoch 4/20 Iteration 616/3560 Training loss: 2.0495 0.1262 sec/batch\n", + "Epoch 4/20 Iteration 617/3560 Training loss: 2.0487 0.1245 sec/batch\n", + "Epoch 4/20 Iteration 618/3560 Training loss: 2.0483 0.1249 sec/batch\n", + "Epoch 4/20 Iteration 619/3560 Training loss: 2.0475 0.1253 sec/batch\n", + "Epoch 4/20 Iteration 620/3560 Training loss: 2.0469 0.1354 sec/batch\n", + "Epoch 4/20 Iteration 621/3560 Training loss: 2.0465 0.1235 sec/batch\n", + "Epoch 4/20 Iteration 622/3560 Training loss: 2.0459 0.1274 sec/batch\n", + "Epoch 4/20 Iteration 623/3560 Training loss: 2.0451 0.1218 sec/batch\n", + "Epoch 4/20 Iteration 624/3560 Training loss: 2.0448 0.1222 sec/batch\n", + "Epoch 4/20 Iteration 625/3560 Training loss: 2.0441 0.1227 sec/batch\n", + "Epoch 4/20 Iteration 626/3560 Training loss: 2.0437 0.1318 sec/batch\n", + "Epoch 4/20 Iteration 627/3560 Training loss: 2.0427 0.1222 sec/batch\n", + "Epoch 4/20 Iteration 628/3560 Training loss: 2.0421 0.1251 
sec/batch\n", + "Epoch 4/20 Iteration 629/3560 Training loss: 2.0413 0.1221 sec/batch\n", + "Epoch 4/20 Iteration 630/3560 Training loss: 2.0408 0.1215 sec/batch\n", + "Epoch 4/20 Iteration 631/3560 Training loss: 2.0404 0.1246 sec/batch\n", + "Epoch 4/20 Iteration 632/3560 Training loss: 2.0398 0.1273 sec/batch\n", + "Epoch 4/20 Iteration 633/3560 Training loss: 2.0390 0.1240 sec/batch\n", + "Epoch 4/20 Iteration 634/3560 Training loss: 2.0382 0.1265 sec/batch\n", + "Epoch 4/20 Iteration 635/3560 Training loss: 2.0377 0.1237 sec/batch\n", + "Epoch 4/20 Iteration 636/3560 Training loss: 2.0373 0.1228 sec/batch\n", + "Epoch 4/20 Iteration 637/3560 Training loss: 2.0366 0.1231 sec/batch\n", + "Epoch 4/20 Iteration 638/3560 Training loss: 2.0360 0.1229 sec/batch\n", + "Epoch 4/20 Iteration 639/3560 Training loss: 2.0353 0.1223 sec/batch\n", + "Epoch 4/20 Iteration 640/3560 Training loss: 2.0349 0.1226 sec/batch\n", + "Epoch 4/20 Iteration 641/3560 Training loss: 2.0345 0.1216 sec/batch\n", + "Epoch 4/20 Iteration 642/3560 Training loss: 2.0341 0.1265 sec/batch\n", + "Epoch 4/20 Iteration 643/3560 Training loss: 2.0339 0.1219 sec/batch\n", + "Epoch 4/20 Iteration 644/3560 Training loss: 2.0335 0.1225 sec/batch\n", + "Epoch 4/20 Iteration 645/3560 Training loss: 2.0331 0.1222 sec/batch\n", + "Epoch 4/20 Iteration 646/3560 Training loss: 2.0326 0.1233 sec/batch\n", + "Epoch 4/20 Iteration 647/3560 Training loss: 2.0322 0.1222 sec/batch\n", + "Epoch 4/20 Iteration 648/3560 Training loss: 2.0317 0.1238 sec/batch\n", + "Epoch 4/20 Iteration 649/3560 Training loss: 2.0311 0.1298 sec/batch\n", + "Epoch 4/20 Iteration 650/3560 Training loss: 2.0304 0.1231 sec/batch\n", + "Epoch 4/20 Iteration 651/3560 Training loss: 2.0300 0.1228 sec/batch\n", + "Epoch 4/20 Iteration 652/3560 Training loss: 2.0295 0.1269 sec/batch\n", + "Epoch 4/20 Iteration 653/3560 Training loss: 2.0291 0.1216 sec/batch\n", + "Epoch 4/20 Iteration 654/3560 Training loss: 2.0287 0.1224 sec/batch\n", + "Epoch 
4/20 Iteration 655/3560 Training loss: 2.0284 0.1297 sec/batch\n", + "Epoch 4/20 Iteration 656/3560 Training loss: 2.0279 0.1257 sec/batch\n", + "Epoch 4/20 Iteration 657/3560 Training loss: 2.0273 0.1266 sec/batch\n", + "Epoch 4/20 Iteration 658/3560 Training loss: 2.0271 0.1248 sec/batch\n", + "Epoch 4/20 Iteration 659/3560 Training loss: 2.0267 0.1296 sec/batch\n", + "Epoch 4/20 Iteration 660/3560 Training loss: 2.0260 0.1232 sec/batch\n", + "Epoch 4/20 Iteration 661/3560 Training loss: 2.0257 0.1223 sec/batch\n", + "Epoch 4/20 Iteration 662/3560 Training loss: 2.0254 0.1248 sec/batch\n", + "Epoch 4/20 Iteration 663/3560 Training loss: 2.0250 0.1219 sec/batch\n", + "Epoch 4/20 Iteration 664/3560 Training loss: 2.0247 0.1246 sec/batch\n", + "Epoch 4/20 Iteration 665/3560 Training loss: 2.0241 0.1214 sec/batch\n", + "Epoch 4/20 Iteration 666/3560 Training loss: 2.0236 0.1303 sec/batch\n", + "Epoch 4/20 Iteration 667/3560 Training loss: 2.0233 0.1221 sec/batch\n", + "Epoch 4/20 Iteration 668/3560 Training loss: 2.0231 0.1242 sec/batch\n", + "Epoch 4/20 Iteration 669/3560 Training loss: 2.0227 0.1283 sec/batch\n", + "Epoch 4/20 Iteration 670/3560 Training loss: 2.0225 0.1215 sec/batch\n", + "Epoch 4/20 Iteration 671/3560 Training loss: 2.0222 0.1244 sec/batch\n", + "Epoch 4/20 Iteration 672/3560 Training loss: 2.0219 0.1218 sec/batch\n", + "Epoch 4/20 Iteration 673/3560 Training loss: 2.0219 0.1212 sec/batch\n", + "Epoch 4/20 Iteration 674/3560 Training loss: 2.0214 0.1226 sec/batch\n", + "Epoch 4/20 Iteration 675/3560 Training loss: 2.0213 0.1233 sec/batch\n", + "Epoch 4/20 Iteration 676/3560 Training loss: 2.0209 0.1252 sec/batch\n", + "Epoch 4/20 Iteration 677/3560 Training loss: 2.0205 0.1261 sec/batch\n", + "Epoch 4/20 Iteration 678/3560 Training loss: 2.0201 0.1278 sec/batch\n", + "Epoch 4/20 Iteration 679/3560 Training loss: 2.0197 0.1217 sec/batch\n", + "Epoch 4/20 Iteration 680/3560 Training loss: 2.0195 0.1267 sec/batch\n", + "Epoch 4/20 Iteration 681/3560 
Training loss: 2.0192 0.1231 sec/batch\n", + "Epoch 4/20 Iteration 682/3560 Training loss: 2.0190 0.1225 sec/batch\n", + "Epoch 4/20 Iteration 683/3560 Training loss: 2.0187 0.1218 sec/batch\n", + "Epoch 4/20 Iteration 684/3560 Training loss: 2.0182 0.1226 sec/batch\n", + "Epoch 4/20 Iteration 685/3560 Training loss: 2.0177 0.1218 sec/batch\n", + "Epoch 4/20 Iteration 686/3560 Training loss: 2.0176 0.1323 sec/batch\n", + "Epoch 4/20 Iteration 687/3560 Training loss: 2.0173 0.1259 sec/batch\n", + "Epoch 4/20 Iteration 688/3560 Training loss: 2.0171 0.1243 sec/batch\n", + "Epoch 4/20 Iteration 689/3560 Training loss: 2.0166 0.1215 sec/batch\n", + "Epoch 4/20 Iteration 690/3560 Training loss: 2.0162 0.1279 sec/batch\n", + "Epoch 4/20 Iteration 691/3560 Training loss: 2.0159 0.1222 sec/batch\n", + "Epoch 4/20 Iteration 692/3560 Training loss: 2.0155 0.1240 sec/batch\n", + "Epoch 4/20 Iteration 693/3560 Training loss: 2.0149 0.1235 sec/batch\n", + "Epoch 4/20 Iteration 694/3560 Training loss: 2.0147 0.1210 sec/batch\n", + "Epoch 4/20 Iteration 695/3560 Training loss: 2.0146 0.1253 sec/batch\n", + "Epoch 4/20 Iteration 696/3560 Training loss: 2.0141 0.1218 sec/batch\n", + "Epoch 4/20 Iteration 697/3560 Training loss: 2.0139 0.1246 sec/batch\n", + "Epoch 4/20 Iteration 698/3560 Training loss: 2.0136 0.1224 sec/batch\n", + "Epoch 4/20 Iteration 699/3560 Training loss: 2.0132 0.1251 sec/batch\n", + "Epoch 4/20 Iteration 700/3560 Training loss: 2.0128 0.1227 sec/batch\n", + "Epoch 4/20 Iteration 701/3560 Training loss: 2.0125 0.1264 sec/batch\n", + "Epoch 4/20 Iteration 702/3560 Training loss: 2.0124 0.1220 sec/batch\n", + "Epoch 4/20 Iteration 703/3560 Training loss: 2.0120 0.1254 sec/batch\n", + "Epoch 4/20 Iteration 704/3560 Training loss: 2.0117 0.1227 sec/batch\n", + "Epoch 4/20 Iteration 705/3560 Training loss: 2.0112 0.1246 sec/batch\n", + "Epoch 4/20 Iteration 706/3560 Training loss: 2.0107 0.1231 sec/batch\n", + "Epoch 4/20 Iteration 707/3560 Training loss: 2.0105 
0.1230 sec/batch\n", + "Epoch 4/20 Iteration 708/3560 Training loss: 2.0102 0.1253 sec/batch\n", + "Epoch 5/20 Iteration 713/3560 Training loss: 2.0619 0.1291 sec/batch\n", + "Epoch 6/20 Iteration 891/3560 Training loss: 1.9447 0.1226 sec/batch\n", + "Epoch 7/20 Iteration 1069/3560 Training loss: 1.8563 0.1226 sec/batch\n", + "[... repetitive per-iteration output condensed: iterations 708-1229 of 3560, epochs 4-7; training loss declines steadily from ~2.01 to ~1.70 at roughly 0.12-0.14 sec/batch ...]\n", + "Epoch 7/20 Iteration 1229/3560 Training loss: 1.7009 0.1270 sec/batch\n", + 
"Epoch 7/20 Iteration 1230/3560 Training loss: 1.7007 0.1229 sec/batch\n", + "Epoch 7/20 Iteration 1231/3560 Training loss: 1.7006 0.1236 sec/batch\n", + "Epoch 7/20 Iteration 1232/3560 Training loss: 1.7005 0.1220 sec/batch\n", + "Epoch 7/20 Iteration 1233/3560 Training loss: 1.7004 0.1214 sec/batch\n", + "Epoch 7/20 Iteration 1234/3560 Training loss: 1.7002 0.1230 sec/batch\n", + "Epoch 7/20 Iteration 1235/3560 Training loss: 1.7002 0.1230 sec/batch\n", + "Epoch 7/20 Iteration 1236/3560 Training loss: 1.7004 0.1220 sec/batch\n", + "Epoch 7/20 Iteration 1237/3560 Training loss: 1.7002 0.1227 sec/batch\n", + "Epoch 7/20 Iteration 1238/3560 Training loss: 1.7000 0.1217 sec/batch\n", + "Epoch 7/20 Iteration 1239/3560 Training loss: 1.6998 0.1235 sec/batch\n", + "Epoch 7/20 Iteration 1240/3560 Training loss: 1.6995 0.1255 sec/batch\n", + "Epoch 7/20 Iteration 1241/3560 Training loss: 1.6994 0.1252 sec/batch\n", + "Epoch 7/20 Iteration 1242/3560 Training loss: 1.6993 0.1212 sec/batch\n", + "Epoch 7/20 Iteration 1243/3560 Training loss: 1.6992 0.1248 sec/batch\n", + "Epoch 7/20 Iteration 1244/3560 Training loss: 1.6990 0.1238 sec/batch\n", + "Epoch 7/20 Iteration 1245/3560 Training loss: 1.6987 0.1242 sec/batch\n", + "Epoch 7/20 Iteration 1246/3560 Training loss: 1.6986 0.1222 sec/batch\n", + "Epoch 8/20 Iteration 1247/3560 Training loss: 1.7771 0.1225 sec/batch\n", + "Epoch 8/20 Iteration 1248/3560 Training loss: 1.7201 0.1281 sec/batch\n", + "Epoch 8/20 Iteration 1249/3560 Training loss: 1.7012 0.1215 sec/batch\n", + "Epoch 8/20 Iteration 1250/3560 Training loss: 1.6931 0.1266 sec/batch\n", + "Epoch 8/20 Iteration 1251/3560 Training loss: 1.6826 0.1222 sec/batch\n", + "Epoch 8/20 Iteration 1252/3560 Training loss: 1.6719 0.1261 sec/batch\n", + "Epoch 8/20 Iteration 1253/3560 Training loss: 1.6723 0.1295 sec/batch\n", + "Epoch 8/20 Iteration 1254/3560 Training loss: 1.6693 0.1213 sec/batch\n", + "Epoch 8/20 Iteration 1255/3560 Training loss: 1.6703 0.1210 sec/batch\n", 
+ "Epoch 8/20 Iteration 1256/3560 Training loss: 1.6698 0.1220 sec/batch\n", + "Epoch 8/20 Iteration 1257/3560 Training loss: 1.6668 0.1238 sec/batch\n", + "Epoch 8/20 Iteration 1258/3560 Training loss: 1.6656 0.1281 sec/batch\n", + "Epoch 8/20 Iteration 1259/3560 Training loss: 1.6655 0.1332 sec/batch\n", + "Epoch 8/20 Iteration 1260/3560 Training loss: 1.6676 0.1312 sec/batch\n", + "Epoch 8/20 Iteration 1261/3560 Training loss: 1.6672 0.1387 sec/batch\n", + "Epoch 8/20 Iteration 1262/3560 Training loss: 1.6657 0.1266 sec/batch\n", + "Epoch 8/20 Iteration 1263/3560 Training loss: 1.6653 0.1339 sec/batch\n", + "Epoch 8/20 Iteration 1264/3560 Training loss: 1.6665 0.1349 sec/batch\n", + "Epoch 8/20 Iteration 1265/3560 Training loss: 1.6668 0.1319 sec/batch\n", + "Epoch 8/20 Iteration 1266/3560 Training loss: 1.6678 0.1343 sec/batch\n", + "Epoch 8/20 Iteration 1267/3560 Training loss: 1.6663 0.1302 sec/batch\n", + "Epoch 8/20 Iteration 1268/3560 Training loss: 1.6667 0.1226 sec/batch\n", + "Epoch 8/20 Iteration 1269/3560 Training loss: 1.6653 0.1239 sec/batch\n", + "Epoch 8/20 Iteration 1270/3560 Training loss: 1.6651 0.1241 sec/batch\n", + "Epoch 8/20 Iteration 1271/3560 Training loss: 1.6650 0.1246 sec/batch\n", + "Epoch 8/20 Iteration 1272/3560 Training loss: 1.6635 0.1309 sec/batch\n", + "Epoch 8/20 Iteration 1273/3560 Training loss: 1.6622 0.1252 sec/batch\n", + "Epoch 8/20 Iteration 1274/3560 Training loss: 1.6623 0.1229 sec/batch\n", + "Epoch 8/20 Iteration 1275/3560 Training loss: 1.6628 0.1242 sec/batch\n", + "Epoch 8/20 Iteration 1276/3560 Training loss: 1.6630 0.1238 sec/batch\n", + "Epoch 8/20 Iteration 1277/3560 Training loss: 1.6628 0.1278 sec/batch\n", + "Epoch 8/20 Iteration 1278/3560 Training loss: 1.6615 0.1229 sec/batch\n", + "Epoch 8/20 Iteration 1279/3560 Training loss: 1.6614 0.1243 sec/batch\n", + "Epoch 8/20 Iteration 1280/3560 Training loss: 1.6619 0.1230 sec/batch\n", + "Epoch 8/20 Iteration 1281/3560 Training loss: 1.6614 0.1250 
sec/batch\n", + "Epoch 8/20 Iteration 1282/3560 Training loss: 1.6608 0.1279 sec/batch\n", + "Epoch 8/20 Iteration 1283/3560 Training loss: 1.6598 0.1248 sec/batch\n", + "Epoch 8/20 Iteration 1284/3560 Training loss: 1.6586 0.1283 sec/batch\n", + "Epoch 8/20 Iteration 1285/3560 Training loss: 1.6570 0.1228 sec/batch\n", + "Epoch 8/20 Iteration 1286/3560 Training loss: 1.6568 0.1265 sec/batch\n", + "Epoch 8/20 Iteration 1287/3560 Training loss: 1.6570 0.1234 sec/batch\n", + "Epoch 8/20 Iteration 1288/3560 Training loss: 1.6582 0.1376 sec/batch\n", + "Epoch 8/20 Iteration 1289/3560 Training loss: 1.6581 0.1321 sec/batch\n", + "Epoch 8/20 Iteration 1290/3560 Training loss: 1.6576 0.1374 sec/batch\n", + "Epoch 8/20 Iteration 1291/3560 Training loss: 1.6579 0.1293 sec/batch\n", + "Epoch 8/20 Iteration 1292/3560 Training loss: 1.6571 0.1243 sec/batch\n", + "Epoch 8/20 Iteration 1293/3560 Training loss: 1.6570 0.1286 sec/batch\n", + "Epoch 8/20 Iteration 1294/3560 Training loss: 1.6565 0.1275 sec/batch\n", + "Epoch 8/20 Iteration 1295/3560 Training loss: 1.6564 0.1305 sec/batch\n", + "Epoch 8/20 Iteration 1296/3560 Training loss: 1.6572 0.1280 sec/batch\n", + "Epoch 8/20 Iteration 1297/3560 Training loss: 1.6566 0.1256 sec/batch\n", + "Epoch 8/20 Iteration 1298/3560 Training loss: 1.6576 0.1328 sec/batch\n", + "Epoch 8/20 Iteration 1299/3560 Training loss: 1.6573 0.1226 sec/batch\n", + "Epoch 8/20 Iteration 1300/3560 Training loss: 1.6573 0.1303 sec/batch\n", + "Epoch 8/20 Iteration 1301/3560 Training loss: 1.6570 0.1309 sec/batch\n", + "Epoch 8/20 Iteration 1302/3560 Training loss: 1.6573 0.1346 sec/batch\n", + "Epoch 8/20 Iteration 1303/3560 Training loss: 1.6577 0.1282 sec/batch\n", + "Epoch 8/20 Iteration 1304/3560 Training loss: 1.6573 0.1266 sec/batch\n", + "Epoch 8/20 Iteration 1305/3560 Training loss: 1.6567 0.1294 sec/batch\n", + "Epoch 8/20 Iteration 1306/3560 Training loss: 1.6570 0.1322 sec/batch\n", + "Epoch 8/20 Iteration 1307/3560 Training loss: 1.6567 
0.1274 sec/batch\n", + "Epoch 8/20 Iteration 1308/3560 Training loss: 1.6575 0.1256 sec/batch\n", + "Epoch 8/20 Iteration 1309/3560 Training loss: 1.6577 0.1223 sec/batch\n", + "Epoch 8/20 Iteration 1310/3560 Training loss: 1.6580 0.1275 sec/batch\n", + "Epoch 8/20 Iteration 1311/3560 Training loss: 1.6578 0.1271 sec/batch\n", + "Epoch 8/20 Iteration 1312/3560 Training loss: 1.6580 0.1300 sec/batch\n", + "Epoch 8/20 Iteration 1313/3560 Training loss: 1.6579 0.1273 sec/batch\n", + "Epoch 8/20 Iteration 1314/3560 Training loss: 1.6574 0.1289 sec/batch\n", + "Epoch 8/20 Iteration 1315/3560 Training loss: 1.6572 0.1249 sec/batch\n", + "Epoch 8/20 Iteration 1316/3560 Training loss: 1.6570 0.1225 sec/batch\n", + "Epoch 8/20 Iteration 1317/3560 Training loss: 1.6575 0.1272 sec/batch\n", + "Epoch 8/20 Iteration 1318/3560 Training loss: 1.6577 0.1239 sec/batch\n", + "Epoch 8/20 Iteration 1319/3560 Training loss: 1.6580 0.1260 sec/batch\n", + "Epoch 8/20 Iteration 1320/3560 Training loss: 1.6577 0.1277 sec/batch\n", + "Epoch 8/20 Iteration 1321/3560 Training loss: 1.6574 0.1224 sec/batch\n", + "Epoch 8/20 Iteration 1322/3560 Training loss: 1.6575 0.1249 sec/batch\n", + "Epoch 8/20 Iteration 1323/3560 Training loss: 1.6571 0.1212 sec/batch\n", + "Epoch 8/20 Iteration 1324/3560 Training loss: 1.6571 0.1382 sec/batch\n", + "Epoch 8/20 Iteration 1325/3560 Training loss: 1.6564 0.1241 sec/batch\n", + "Epoch 8/20 Iteration 1326/3560 Training loss: 1.6561 0.1233 sec/batch\n", + "Epoch 8/20 Iteration 1327/3560 Training loss: 1.6555 0.1263 sec/batch\n", + "Epoch 8/20 Iteration 1328/3560 Training loss: 1.6553 0.1302 sec/batch\n", + "Epoch 8/20 Iteration 1329/3560 Training loss: 1.6548 0.1383 sec/batch\n", + "Epoch 8/20 Iteration 1330/3560 Training loss: 1.6546 0.1221 sec/batch\n", + "Epoch 8/20 Iteration 1331/3560 Training loss: 1.6541 0.1210 sec/batch\n", + "Epoch 8/20 Iteration 1332/3560 Training loss: 1.6537 0.1225 sec/batch\n", + "Epoch 8/20 Iteration 1333/3560 Training loss: 
1.6534 0.1228 sec/batch\n", + "Epoch 8/20 Iteration 1334/3560 Training loss: 1.6530 0.1242 sec/batch\n", + "Epoch 8/20 Iteration 1335/3560 Training loss: 1.6525 0.1250 sec/batch\n", + "Epoch 8/20 Iteration 1336/3560 Training loss: 1.6524 0.1287 sec/batch\n", + "Epoch 8/20 Iteration 1337/3560 Training loss: 1.6519 0.1271 sec/batch\n", + "Epoch 8/20 Iteration 1338/3560 Training loss: 1.6518 0.1223 sec/batch\n", + "Epoch 8/20 Iteration 1339/3560 Training loss: 1.6513 0.1235 sec/batch\n", + "Epoch 8/20 Iteration 1340/3560 Training loss: 1.6508 0.1217 sec/batch\n", + "Epoch 8/20 Iteration 1341/3560 Training loss: 1.6503 0.1218 sec/batch\n", + "Epoch 8/20 Iteration 1342/3560 Training loss: 1.6502 0.1219 sec/batch\n", + "Epoch 8/20 Iteration 1343/3560 Training loss: 1.6499 0.1223 sec/batch\n", + "Epoch 8/20 Iteration 1344/3560 Training loss: 1.6494 0.1296 sec/batch\n", + "Epoch 8/20 Iteration 1345/3560 Training loss: 1.6490 0.1302 sec/batch\n", + "Epoch 8/20 Iteration 1346/3560 Training loss: 1.6484 0.1257 sec/batch\n", + "Epoch 8/20 Iteration 1347/3560 Training loss: 1.6483 0.1260 sec/batch\n", + "Epoch 8/20 Iteration 1348/3560 Training loss: 1.6480 0.1275 sec/batch\n", + "Epoch 8/20 Iteration 1349/3560 Training loss: 1.6477 0.1332 sec/batch\n", + "Epoch 8/20 Iteration 1350/3560 Training loss: 1.6474 0.1226 sec/batch\n", + "Epoch 8/20 Iteration 1351/3560 Training loss: 1.6470 0.1238 sec/batch\n", + "Epoch 8/20 Iteration 1352/3560 Training loss: 1.6467 0.1311 sec/batch\n", + "Epoch 8/20 Iteration 1353/3560 Training loss: 1.6465 0.1294 sec/batch\n", + "Epoch 8/20 Iteration 1354/3560 Training loss: 1.6462 0.1275 sec/batch\n", + "Epoch 8/20 Iteration 1355/3560 Training loss: 1.6461 0.1253 sec/batch\n", + "Epoch 8/20 Iteration 1356/3560 Training loss: 1.6460 0.1263 sec/batch\n", + "Epoch 8/20 Iteration 1357/3560 Training loss: 1.6458 0.1231 sec/batch\n", + "Epoch 8/20 Iteration 1358/3560 Training loss: 1.6455 0.1245 sec/batch\n", + "Epoch 8/20 Iteration 1359/3560 Training 
loss: 1.6452 0.1284 sec/batch\n", + "Epoch 8/20 Iteration 1360/3560 Training loss: 1.6449 0.1302 sec/batch\n", + "Epoch 8/20 Iteration 1361/3560 Training loss: 1.6446 0.1241 sec/batch\n", + "Epoch 8/20 Iteration 1362/3560 Training loss: 1.6440 0.1269 sec/batch\n", + "Epoch 8/20 Iteration 1363/3560 Training loss: 1.6438 0.1221 sec/batch\n", + "Epoch 8/20 Iteration 1364/3560 Training loss: 1.6437 0.1242 sec/batch\n", + "Epoch 8/20 Iteration 1365/3560 Training loss: 1.6434 0.1333 sec/batch\n", + "Epoch 8/20 Iteration 1366/3560 Training loss: 1.6431 0.1253 sec/batch\n", + "Epoch 8/20 Iteration 1367/3560 Training loss: 1.6429 0.1265 sec/batch\n", + "Epoch 8/20 Iteration 1368/3560 Training loss: 1.6424 0.1271 sec/batch\n", + "Epoch 8/20 Iteration 1369/3560 Training loss: 1.6419 0.1243 sec/batch\n", + "Epoch 8/20 Iteration 1370/3560 Training loss: 1.6419 0.1234 sec/batch\n", + "Epoch 8/20 Iteration 1371/3560 Training loss: 1.6418 0.1219 sec/batch\n", + "Epoch 8/20 Iteration 1372/3560 Training loss: 1.6413 0.1257 sec/batch\n", + "Epoch 8/20 Iteration 1373/3560 Training loss: 1.6413 0.1253 sec/batch\n", + "Epoch 8/20 Iteration 1374/3560 Training loss: 1.6412 0.1235 sec/batch\n", + "Epoch 8/20 Iteration 1375/3560 Training loss: 1.6409 0.1229 sec/batch\n", + "Epoch 8/20 Iteration 1376/3560 Training loss: 1.6407 0.1272 sec/batch\n", + "Epoch 8/20 Iteration 1377/3560 Training loss: 1.6402 0.1231 sec/batch\n", + "Epoch 8/20 Iteration 1378/3560 Training loss: 1.6398 0.1268 sec/batch\n", + "Epoch 8/20 Iteration 1379/3560 Training loss: 1.6398 0.1221 sec/batch\n", + "Epoch 8/20 Iteration 1380/3560 Training loss: 1.6397 0.1304 sec/batch\n", + "Epoch 8/20 Iteration 1381/3560 Training loss: 1.6396 0.1230 sec/batch\n", + "Epoch 8/20 Iteration 1382/3560 Training loss: 1.6395 0.1258 sec/batch\n", + "Epoch 8/20 Iteration 1383/3560 Training loss: 1.6395 0.1338 sec/batch\n", + "Epoch 8/20 Iteration 1384/3560 Training loss: 1.6394 0.1229 sec/batch\n", + "Epoch 8/20 Iteration 1385/3560 
Training loss: 1.6394 0.1251 sec/batch\n", + "Epoch 8/20 Iteration 1386/3560 Training loss: 1.6392 0.1275 sec/batch\n", + "Epoch 8/20 Iteration 1387/3560 Training loss: 1.6394 0.1293 sec/batch\n", + "Epoch 8/20 Iteration 1388/3560 Training loss: 1.6391 0.1249 sec/batch\n", + "Epoch 8/20 Iteration 1389/3560 Training loss: 1.6390 0.1285 sec/batch\n", + "Epoch 8/20 Iteration 1390/3560 Training loss: 1.6390 0.1316 sec/batch\n", + "Epoch 8/20 Iteration 1391/3560 Training loss: 1.6388 0.1208 sec/batch\n", + "Epoch 8/20 Iteration 1392/3560 Training loss: 1.6388 0.1394 sec/batch\n", + "Epoch 8/20 Iteration 1393/3560 Training loss: 1.6388 0.1285 sec/batch\n", + "Epoch 8/20 Iteration 1394/3560 Training loss: 1.6388 0.1214 sec/batch\n", + "Epoch 8/20 Iteration 1395/3560 Training loss: 1.6388 0.1237 sec/batch\n", + "Epoch 8/20 Iteration 1396/3560 Training loss: 1.6386 0.1225 sec/batch\n", + "Epoch 8/20 Iteration 1397/3560 Training loss: 1.6382 0.1281 sec/batch\n", + "Epoch 8/20 Iteration 1398/3560 Training loss: 1.6381 0.1262 sec/batch\n", + "Epoch 8/20 Iteration 1399/3560 Training loss: 1.6380 0.1258 sec/batch\n", + "Epoch 8/20 Iteration 1400/3560 Training loss: 1.6380 0.1250 sec/batch\n", + "Epoch 8/20 Iteration 1401/3560 Training loss: 1.6378 0.1218 sec/batch\n", + "Epoch 8/20 Iteration 1402/3560 Training loss: 1.6377 0.1220 sec/batch\n", + "Epoch 8/20 Iteration 1403/3560 Training loss: 1.6376 0.1259 sec/batch\n", + "Epoch 8/20 Iteration 1404/3560 Training loss: 1.6375 0.1226 sec/batch\n", + "Epoch 8/20 Iteration 1405/3560 Training loss: 1.6372 0.1233 sec/batch\n", + "Epoch 8/20 Iteration 1406/3560 Training loss: 1.6372 0.1238 sec/batch\n", + "Epoch 8/20 Iteration 1407/3560 Training loss: 1.6373 0.1257 sec/batch\n", + "Epoch 8/20 Iteration 1408/3560 Training loss: 1.6372 0.1255 sec/batch\n", + "Epoch 8/20 Iteration 1409/3560 Training loss: 1.6372 0.1252 sec/batch\n", + "Epoch 8/20 Iteration 1410/3560 Training loss: 1.6370 0.1236 sec/batch\n", + "Epoch 8/20 Iteration 
1411/3560 Training loss: 1.6368 0.1248 sec/batch\n", + "Epoch 8/20 Iteration 1412/3560 Training loss: 1.6366 0.1265 sec/batch\n", + "Epoch 8/20 Iteration 1413/3560 Training loss: 1.6366 0.1227 sec/batch\n", + "Epoch 8/20 Iteration 1414/3560 Training loss: 1.6369 0.1246 sec/batch\n", + "Epoch 8/20 Iteration 1415/3560 Training loss: 1.6368 0.1230 sec/batch\n", + "Epoch 8/20 Iteration 1416/3560 Training loss: 1.6367 0.1248 sec/batch\n", + "Epoch 8/20 Iteration 1417/3560 Training loss: 1.6365 0.1234 sec/batch\n", + "Epoch 8/20 Iteration 1418/3560 Training loss: 1.6362 0.1239 sec/batch\n", + "Epoch 8/20 Iteration 1419/3560 Training loss: 1.6362 0.1253 sec/batch\n", + "Epoch 8/20 Iteration 1420/3560 Training loss: 1.6361 0.1255 sec/batch\n", + "Epoch 8/20 Iteration 1421/3560 Training loss: 1.6361 0.1244 sec/batch\n", + "Epoch 8/20 Iteration 1422/3560 Training loss: 1.6358 0.1249 sec/batch\n", + "Epoch 8/20 Iteration 1423/3560 Training loss: 1.6355 0.1256 sec/batch\n", + "Epoch 8/20 Iteration 1424/3560 Training loss: 1.6355 0.1281 sec/batch\n", + "Epoch 9/20 Iteration 1425/3560 Training loss: 1.7204 0.1229 sec/batch\n", + "Epoch 9/20 Iteration 1426/3560 Training loss: 1.6681 0.1263 sec/batch\n", + "Epoch 9/20 Iteration 1427/3560 Training loss: 1.6446 0.1234 sec/batch\n", + "Epoch 9/20 Iteration 1428/3560 Training loss: 1.6373 0.1266 sec/batch\n", + "Epoch 9/20 Iteration 1429/3560 Training loss: 1.6289 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1430/3560 Training loss: 1.6165 0.1270 sec/batch\n", + "Epoch 9/20 Iteration 1431/3560 Training loss: 1.6173 0.1309 sec/batch\n", + "Epoch 9/20 Iteration 1432/3560 Training loss: 1.6144 0.1278 sec/batch\n", + "Epoch 9/20 Iteration 1433/3560 Training loss: 1.6162 0.1296 sec/batch\n", + "Epoch 9/20 Iteration 1434/3560 Training loss: 1.6150 0.1251 sec/batch\n", + "Epoch 9/20 Iteration 1435/3560 Training loss: 1.6109 0.1300 sec/batch\n", + "Epoch 9/20 Iteration 1436/3560 Training loss: 1.6100 0.1261 sec/batch\n", + "Epoch 9/20 
Iteration 1437/3560 Training loss: 1.6099 0.1230 sec/batch\n", + "Epoch 9/20 Iteration 1438/3560 Training loss: 1.6121 0.1241 sec/batch\n", + "Epoch 9/20 Iteration 1439/3560 Training loss: 1.6109 0.1278 sec/batch\n", + "Epoch 9/20 Iteration 1440/3560 Training loss: 1.6093 0.1315 sec/batch\n", + "Epoch 9/20 Iteration 1441/3560 Training loss: 1.6095 0.1217 sec/batch\n", + "Epoch 9/20 Iteration 1442/3560 Training loss: 1.6107 0.1224 sec/batch\n", + "Epoch 9/20 Iteration 1443/3560 Training loss: 1.6106 0.1305 sec/batch\n", + "Epoch 9/20 Iteration 1444/3560 Training loss: 1.6118 0.1268 sec/batch\n", + "Epoch 9/20 Iteration 1445/3560 Training loss: 1.6109 0.1251 sec/batch\n", + "Epoch 9/20 Iteration 1446/3560 Training loss: 1.6113 0.1292 sec/batch\n", + "Epoch 9/20 Iteration 1447/3560 Training loss: 1.6104 0.1352 sec/batch\n", + "Epoch 9/20 Iteration 1448/3560 Training loss: 1.6097 0.1269 sec/batch\n", + "Epoch 9/20 Iteration 1449/3560 Training loss: 1.6097 0.1302 sec/batch\n", + "Epoch 9/20 Iteration 1450/3560 Training loss: 1.6081 0.1351 sec/batch\n", + "Epoch 9/20 Iteration 1451/3560 Training loss: 1.6065 0.1269 sec/batch\n", + "Epoch 9/20 Iteration 1452/3560 Training loss: 1.6065 0.1256 sec/batch\n", + "Epoch 9/20 Iteration 1453/3560 Training loss: 1.6066 0.1336 sec/batch\n", + "Epoch 9/20 Iteration 1454/3560 Training loss: 1.6068 0.1261 sec/batch\n", + "Epoch 9/20 Iteration 1455/3560 Training loss: 1.6068 0.1312 sec/batch\n", + "Epoch 9/20 Iteration 1456/3560 Training loss: 1.6058 0.1277 sec/batch\n", + "Epoch 9/20 Iteration 1457/3560 Training loss: 1.6059 0.1301 sec/batch\n", + "Epoch 9/20 Iteration 1458/3560 Training loss: 1.6062 0.1280 sec/batch\n", + "Epoch 9/20 Iteration 1459/3560 Training loss: 1.6062 0.1244 sec/batch\n", + "Epoch 9/20 Iteration 1460/3560 Training loss: 1.6059 0.1271 sec/batch\n", + "Epoch 9/20 Iteration 1461/3560 Training loss: 1.6048 0.1270 sec/batch\n", + "Epoch 9/20 Iteration 1462/3560 Training loss: 1.6035 0.1350 sec/batch\n", + "Epoch 
9/20 Iteration 1463/3560 Training loss: 1.6022 0.1329 sec/batch\n", + "Epoch 9/20 Iteration 1464/3560 Training loss: 1.6014 0.1217 sec/batch\n", + "Epoch 9/20 Iteration 1465/3560 Training loss: 1.6009 0.1265 sec/batch\n", + "Epoch 9/20 Iteration 1466/3560 Training loss: 1.6013 0.1240 sec/batch\n", + "Epoch 9/20 Iteration 1467/3560 Training loss: 1.6005 0.1266 sec/batch\n", + "Epoch 9/20 Iteration 1468/3560 Training loss: 1.5998 0.1289 sec/batch\n", + "Epoch 9/20 Iteration 1469/3560 Training loss: 1.5999 0.1202 sec/batch\n", + "Epoch 9/20 Iteration 1470/3560 Training loss: 1.5987 0.1302 sec/batch\n", + "Epoch 9/20 Iteration 1471/3560 Training loss: 1.5985 0.1262 sec/batch\n", + "Epoch 9/20 Iteration 1472/3560 Training loss: 1.5980 0.1291 sec/batch\n", + "Epoch 9/20 Iteration 1473/3560 Training loss: 1.5977 0.1254 sec/batch\n", + "Epoch 9/20 Iteration 1474/3560 Training loss: 1.5983 0.1298 sec/batch\n", + "Epoch 9/20 Iteration 1475/3560 Training loss: 1.5977 0.1286 sec/batch\n", + "Epoch 9/20 Iteration 1476/3560 Training loss: 1.5985 0.1304 sec/batch\n", + "Epoch 9/20 Iteration 1477/3560 Training loss: 1.5982 0.1241 sec/batch\n", + "Epoch 9/20 Iteration 1478/3560 Training loss: 1.5982 0.1294 sec/batch\n", + "Epoch 9/20 Iteration 1479/3560 Training loss: 1.5979 0.1278 sec/batch\n", + "Epoch 9/20 Iteration 1480/3560 Training loss: 1.5979 0.1267 sec/batch\n", + "Epoch 9/20 Iteration 1481/3560 Training loss: 1.5983 0.1244 sec/batch\n", + "Epoch 9/20 Iteration 1482/3560 Training loss: 1.5979 0.1270 sec/batch\n", + "Epoch 9/20 Iteration 1483/3560 Training loss: 1.5972 0.1244 sec/batch\n", + "Epoch 9/20 Iteration 1484/3560 Training loss: 1.5977 0.1242 sec/batch\n", + "Epoch 9/20 Iteration 1485/3560 Training loss: 1.5975 0.1219 sec/batch\n", + "Epoch 9/20 Iteration 1486/3560 Training loss: 1.5984 0.1311 sec/batch\n", + "Epoch 9/20 Iteration 1487/3560 Training loss: 1.5987 0.1234 sec/batch\n", + "Epoch 9/20 Iteration 1488/3560 Training loss: 1.5989 0.1293 sec/batch\n", + 
"Epoch 9/20 Iteration 1489/3560 Training loss: 1.5988 0.1293 sec/batch\n", + "Epoch 9/20 Iteration 1490/3560 Training loss: 1.5991 0.1230 sec/batch\n", + "Epoch 9/20 Iteration 1491/3560 Training loss: 1.5991 0.1266 sec/batch\n", + "Epoch 9/20 Iteration 1492/3560 Training loss: 1.5987 0.1227 sec/batch\n", + "Epoch 9/20 Iteration 1493/3560 Training loss: 1.5985 0.1231 sec/batch\n", + "Epoch 9/20 Iteration 1494/3560 Training loss: 1.5982 0.1230 sec/batch\n", + "Epoch 9/20 Iteration 1495/3560 Training loss: 1.5988 0.1227 sec/batch\n", + "Epoch 9/20 Iteration 1496/3560 Training loss: 1.5989 0.1264 sec/batch\n", + "Epoch 9/20 Iteration 1497/3560 Training loss: 1.5993 0.1262 sec/batch\n", + "Epoch 9/20 Iteration 1498/3560 Training loss: 1.5988 0.1241 sec/batch\n", + "Epoch 9/20 Iteration 1499/3560 Training loss: 1.5986 0.1225 sec/batch\n", + "Epoch 9/20 Iteration 1500/3560 Training loss: 1.5987 0.1234 sec/batch\n", + "Epoch 9/20 Iteration 1501/3560 Training loss: 1.5984 0.1227 sec/batch\n", + "Epoch 9/20 Iteration 1502/3560 Training loss: 1.5984 0.1253 sec/batch\n", + "Epoch 9/20 Iteration 1503/3560 Training loss: 1.5978 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1504/3560 Training loss: 1.5976 0.1235 sec/batch\n", + "Epoch 9/20 Iteration 1505/3560 Training loss: 1.5970 0.1236 sec/batch\n", + "Epoch 9/20 Iteration 1506/3560 Training loss: 1.5968 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1507/3560 Training loss: 1.5961 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1508/3560 Training loss: 1.5960 0.1228 sec/batch\n", + "Epoch 9/20 Iteration 1509/3560 Training loss: 1.5955 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1510/3560 Training loss: 1.5952 0.1233 sec/batch\n", + "Epoch 9/20 Iteration 1511/3560 Training loss: 1.5949 0.1219 sec/batch\n", + "Epoch 9/20 Iteration 1512/3560 Training loss: 1.5944 0.1229 sec/batch\n", + "Epoch 9/20 Iteration 1513/3560 Training loss: 1.5939 0.1262 sec/batch\n", + "Epoch 9/20 Iteration 1514/3560 Training loss: 1.5939 0.1236 sec/batch\n", 
+ "Epoch 9/20 Iteration 1515/3560 Training loss: 1.5936 0.1244 sec/batch\n", + "Epoch 9/20 Iteration 1516/3560 Training loss: 1.5933 0.1258 sec/batch\n", + "Epoch 9/20 Iteration 1517/3560 Training loss: 1.5929 0.1254 sec/batch\n", + "Epoch 9/20 Iteration 1518/3560 Training loss: 1.5925 0.1313 sec/batch\n", + "Epoch 9/20 Iteration 1519/3560 Training loss: 1.5921 0.1273 sec/batch\n", + "Epoch 9/20 Iteration 1520/3560 Training loss: 1.5921 0.1263 sec/batch\n", + "Epoch 9/20 Iteration 1521/3560 Training loss: 1.5919 0.1251 sec/batch\n", + "Epoch 9/20 Iteration 1522/3560 Training loss: 1.5914 0.1320 sec/batch\n", + "Epoch 9/20 Iteration 1523/3560 Training loss: 1.5910 0.1238 sec/batch\n", + "Epoch 9/20 Iteration 1524/3560 Training loss: 1.5905 0.1246 sec/batch\n", + "Epoch 9/20 Iteration 1525/3560 Training loss: 1.5905 0.1229 sec/batch\n", + "Epoch 9/20 Iteration 1526/3560 Training loss: 1.5903 0.1257 sec/batch\n", + "Epoch 9/20 Iteration 1527/3560 Training loss: 1.5901 0.1241 sec/batch\n", + "Epoch 9/20 Iteration 1528/3560 Training loss: 1.5898 0.1286 sec/batch\n", + "Epoch 9/20 Iteration 1529/3560 Training loss: 1.5896 0.1248 sec/batch\n", + "Epoch 9/20 Iteration 1530/3560 Training loss: 1.5895 0.1244 sec/batch\n", + "Epoch 9/20 Iteration 1531/3560 Training loss: 1.5894 0.1257 sec/batch\n", + "Epoch 9/20 Iteration 1532/3560 Training loss: 1.5892 0.1294 sec/batch\n", + "Epoch 9/20 Iteration 1533/3560 Training loss: 1.5891 0.1270 sec/batch\n", + "Epoch 9/20 Iteration 1534/3560 Training loss: 1.5890 0.1242 sec/batch\n", + "Epoch 9/20 Iteration 1535/3560 Training loss: 1.5887 0.1257 sec/batch\n", + "Epoch 9/20 Iteration 1536/3560 Training loss: 1.5885 0.1285 sec/batch\n", + "Epoch 9/20 Iteration 1537/3560 Training loss: 1.5882 0.1231 sec/batch\n", + "Epoch 9/20 Iteration 1538/3560 Training loss: 1.5880 0.1233 sec/batch\n", + "Epoch 9/20 Iteration 1539/3560 Training loss: 1.5876 0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1540/3560 Training loss: 1.5872 0.1220 
sec/batch\n", + "Epoch 9/20 Iteration 1541/3560 Training loss: 1.5870 0.1215 sec/batch\n", + "Epoch 9/20 Iteration 1542/3560 Training loss: 1.5869 0.1218 sec/batch\n", + "Epoch 9/20 Iteration 1543/3560 Training loss: 1.5866 0.1220 sec/batch\n", + "Epoch 9/20 Iteration 1544/3560 Training loss: 1.5864 0.1234 sec/batch\n", + "Epoch 9/20 Iteration 1545/3560 Training loss: 1.5862 0.1221 sec/batch\n", + "Epoch 9/20 Iteration 1546/3560 Training loss: 1.5858 0.1245 sec/batch\n", + "Epoch 9/20 Iteration 1547/3560 Training loss: 1.5853 0.1238 sec/batch\n", + "Epoch 9/20 Iteration 1548/3560 Training loss: 1.5853 0.1253 sec/batch\n", + "Epoch 9/20 Iteration 1549/3560 Training loss: 1.5852 0.1256 sec/batch\n", + "Epoch 9/20 Iteration 1550/3560 Training loss: 1.5849 0.1290 sec/batch\n", + "Epoch 9/20 Iteration 1551/3560 Training loss: 1.5849 0.1291 sec/batch\n", + "Epoch 9/20 Iteration 1552/3560 Training loss: 1.5848 0.1240 sec/batch\n", + "Epoch 9/20 Iteration 1553/3560 Training loss: 1.5846 0.1255 sec/batch\n", + "Epoch 9/20 Iteration 1554/3560 Training loss: 1.5843 0.1256 sec/batch\n", + "Epoch 9/20 Iteration 1555/3560 Training loss: 1.5839 0.1229 sec/batch\n", + "Epoch 9/20 Iteration 1556/3560 Training loss: 1.5836 0.1242 sec/batch\n", + "Epoch 9/20 Iteration 1557/3560 Training loss: 1.5836 0.1247 sec/batch\n", + "Epoch 9/20 Iteration 1558/3560 Training loss: 1.5836 0.1223 sec/batch\n", + "Epoch 9/20 Iteration 1559/3560 Training loss: 1.5834 0.1269 sec/batch\n", + "Epoch 9/20 Iteration 1560/3560 Training loss: 1.5834 0.1238 sec/batch\n", + "Epoch 9/20 Iteration 1561/3560 Training loss: 1.5834 0.1239 sec/batch\n", + "Epoch 9/20 Iteration 1562/3560 Training loss: 1.5834 0.1215 sec/batch\n", + "Epoch 9/20 Iteration 1563/3560 Training loss: 1.5833 0.1286 sec/batch\n", + "Epoch 9/20 Iteration 1564/3560 Training loss: 1.5832 0.1224 sec/batch\n", + "Epoch 9/20 Iteration 1565/3560 Training loss: 1.5834 0.1242 sec/batch\n", + "Epoch 9/20 Iteration 1566/3560 Training loss: 1.5833 
0.1226 sec/batch\n", + "Epoch 9/20 Iteration 1567/3560 Training loss: 1.5832 0.1259 sec/batch\n", + "Epoch 9/20 Iteration 1568/3560 Training loss: 1.5833 0.1247 sec/batch\n", + "Epoch 9/20 Iteration 1569/3560 Training loss: 1.5831 0.1246 sec/batch\n", + "Epoch 9/20 Iteration 1570/3560 Training loss: 1.5831 0.1324 sec/batch\n", + "Epoch 9/20 Iteration 1571/3560 Training loss: 1.5830 0.1241 sec/batch\n", + "Epoch 9/20 Iteration 1572/3560 Training loss: 1.5831 0.1258 sec/batch\n", + "Epoch 9/20 Iteration 1573/3560 Training loss: 1.5831 0.1236 sec/batch\n", + "Epoch 9/20 Iteration 1574/3560 Training loss: 1.5829 0.1294 sec/batch\n", + "Epoch 9/20 Iteration 1575/3560 Training loss: 1.5824 0.1235 sec/batch\n", + "Epoch 9/20 Iteration 1576/3560 Training loss: 1.5824 0.1239 sec/batch\n", + "Epoch 9/20 Iteration 1577/3560 Training loss: 1.5823 0.1222 sec/batch\n", + "Epoch 9/20 Iteration 1578/3560 Training loss: 1.5823 0.1276 sec/batch\n", + "Epoch 9/20 Iteration 1579/3560 Training loss: 1.5822 0.1225 sec/batch\n", + "Epoch 9/20 Iteration 1580/3560 Training loss: 1.5820 0.1257 sec/batch\n", + "Epoch 9/20 Iteration 1581/3560 Training loss: 1.5820 0.1278 sec/batch\n", + "Epoch 9/20 Iteration 1582/3560 Training loss: 1.5819 0.1294 sec/batch\n", + "Epoch 9/20 Iteration 1583/3560 Training loss: 1.5815 0.1246 sec/batch\n", + "Epoch 9/20 Iteration 1584/3560 Training loss: 1.5816 0.1324 sec/batch\n", + "Epoch 9/20 Iteration 1585/3560 Training loss: 1.5817 0.1289 sec/batch\n", + "Epoch 9/20 Iteration 1586/3560 Training loss: 1.5816 0.1264 sec/batch\n", + "Epoch 9/20 Iteration 1587/3560 Training loss: 1.5814 0.1243 sec/batch\n", + "Epoch 9/20 Iteration 1588/3560 Training loss: 1.5813 0.1238 sec/batch\n", + "Epoch 9/20 Iteration 1589/3560 Training loss: 1.5812 0.1250 sec/batch\n", + "Epoch 9/20 Iteration 1590/3560 Training loss: 1.5811 0.1278 sec/batch\n", + "Epoch 9/20 Iteration 1591/3560 Training loss: 1.5812 0.1249 sec/batch\n", + "Epoch 9/20 Iteration 1592/3560 Training loss: 
1.5815 0.1288 sec/batch\n", + "Epoch 9/20 Iteration 1593/3560 Training loss: 1.5814 0.1216 sec/batch\n", + "Epoch 9/20 Iteration 1594/3560 Training loss: 1.5813 0.1216 sec/batch\n", + "Epoch 9/20 Iteration 1595/3560 Training loss: 1.5811 0.1242 sec/batch\n", + "Epoch 9/20 Iteration 1596/3560 Training loss: 1.5808 0.1325 sec/batch\n", + "Epoch 9/20 Iteration 1597/3560 Training loss: 1.5809 0.1229 sec/batch\n", + "Epoch 9/20 Iteration 1598/3560 Training loss: 1.5809 0.1282 sec/batch\n", + "Epoch 9/20 Iteration 1599/3560 Training loss: 1.5808 0.1299 sec/batch\n", + "Epoch 9/20 Iteration 1600/3560 Training loss: 1.5807 0.1257 sec/batch\n", + "Epoch 9/20 Iteration 1601/3560 Training loss: 1.5805 0.1249 sec/batch\n", + "Epoch 9/20 Iteration 1602/3560 Training loss: 1.5805 0.1247 sec/batch\n", + "Epoch 10/20 Iteration 1603/3560 Training loss: 1.6652 0.1226 sec/batch\n", + "Epoch 10/20 Iteration 1604/3560 Training loss: 1.6179 0.1256 sec/batch\n", + "Epoch 10/20 Iteration 1605/3560 Training loss: 1.5953 0.1265 sec/batch\n", + "Epoch 10/20 Iteration 1606/3560 Training loss: 1.5871 0.1307 sec/batch\n", + "Epoch 10/20 Iteration 1607/3560 Training loss: 1.5800 0.1226 sec/batch\n", + "Epoch 10/20 Iteration 1608/3560 Training loss: 1.5685 0.1231 sec/batch\n", + "Epoch 10/20 Iteration 1609/3560 Training loss: 1.5686 0.1256 sec/batch\n", + "Epoch 10/20 Iteration 1610/3560 Training loss: 1.5653 0.1232 sec/batch\n", + "Epoch 10/20 Iteration 1611/3560 Training loss: 1.5663 0.1248 sec/batch\n", + "Epoch 10/20 Iteration 1612/3560 Training loss: 1.5647 0.1283 sec/batch\n", + "Epoch 10/20 Iteration 1613/3560 Training loss: 1.5611 0.1267 sec/batch\n", + "Epoch 10/20 Iteration 1614/3560 Training loss: 1.5601 0.1282 sec/batch\n", + "Epoch 10/20 Iteration 1615/3560 Training loss: 1.5601 0.1234 sec/batch\n", + "Epoch 10/20 Iteration 1616/3560 Training loss: 1.5621 0.1244 sec/batch\n", + "Epoch 10/20 Iteration 1617/3560 Training loss: 1.5617 0.1244 sec/batch\n", + "Epoch 10/20 Iteration 
1618/3560 Training loss: 1.5601 0.1261 sec/batch\n", + "Epoch 10/20 Iteration 1619/3560 Training loss: 1.5601 0.1232 sec/batch\n", + "Epoch 10/20 Iteration 1620/3560 Training loss: 1.5618 0.1363 sec/batch\n", + "Epoch 10/20 Iteration 1621/3560 Training loss: 1.5615 0.1236 sec/batch\n", + "Epoch 10/20 Iteration 1622/3560 Training loss: 1.5625 0.1247 sec/batch\n", + "Epoch 10/20 Iteration 1623/3560 Training loss: 1.5614 0.1224 sec/batch\n", + "Epoch 10/20 Iteration 1624/3560 Training loss: 1.5620 0.1259 sec/batch\n", + "Epoch 10/20 Iteration 1625/3560 Training loss: 1.5609 0.1228 sec/batch\n", + "Epoch 10/20 Iteration 1626/3560 Training loss: 1.5607 0.1271 sec/batch\n", + "Epoch 10/20 Iteration 1627/3560 Training loss: 1.5607 0.1219 sec/batch\n", + "Epoch 10/20 Iteration 1628/3560 Training loss: 1.5590 0.1229 sec/batch\n", + "Epoch 10/20 Iteration 1629/3560 Training loss: 1.5578 0.1251 sec/batch\n", + "Epoch 10/20 Iteration 1630/3560 Training loss: 1.5578 0.1365 sec/batch\n", + "Epoch 10/20 Iteration 1631/3560 Training loss: 1.5583 0.1239 sec/batch\n", + "Epoch 10/20 Iteration 1632/3560 Training loss: 1.5582 0.1226 sec/batch\n", + "Epoch 10/20 Iteration 1633/3560 Training loss: 1.5581 0.1237 sec/batch\n", + "Epoch 10/20 Iteration 1634/3560 Training loss: 1.5573 0.1231 sec/batch\n", + "Epoch 10/20 Iteration 1635/3560 Training loss: 1.5572 0.1217 sec/batch\n", + "Epoch 10/20 Iteration 1636/3560 Training loss: 1.5574 0.1253 sec/batch\n", + "Epoch 10/20 Iteration 1637/3560 Training loss: 1.5571 0.1258 sec/batch\n", + "Epoch 10/20 Iteration 1638/3560 Training loss: 1.5568 0.1245 sec/batch\n", + "Epoch 10/20 Iteration 1639/3560 Training loss: 1.5561 0.1224 sec/batch\n", + "Epoch 10/20 Iteration 1640/3560 Training loss: 1.5549 0.1260 sec/batch\n", + "Epoch 10/20 Iteration 1641/3560 Training loss: 1.5535 0.1240 sec/batch\n", + "Epoch 10/20 Iteration 1642/3560 Training loss: 1.5531 0.1236 sec/batch\n", + "Epoch 10/20 Iteration 1643/3560 Training loss: 1.5526 0.1311 
sec/batch\n", + "Epoch 10/20 Iteration 1644/3560 Training loss: 1.5532 0.1256 sec/batch\n", + "Epoch 10/20 Iteration 1645/3560 Training loss: 1.5527 0.1268 sec/batch\n", + "Epoch 10/20 Iteration 1646/3560 Training loss: 1.5520 0.1246 sec/batch\n", + "Epoch 10/20 Iteration 1647/3560 Training loss: 1.5520 0.1223 sec/batch\n", + "Epoch 10/20 Iteration 1648/3560 Training loss: 1.5509 0.1235 sec/batch\n", + "Epoch 10/20 Iteration 1649/3560 Training loss: 1.5507 0.1221 sec/batch\n", + "Epoch 10/20 Iteration 1650/3560 Training loss: 1.5501 0.1220 sec/batch\n", + "Epoch 10/20 Iteration 1651/3560 Training loss: 1.5498 0.1275 sec/batch\n", + "Epoch 10/20 Iteration 1652/3560 Training loss: 1.5504 0.1266 sec/batch\n", + "Epoch 10/20 Iteration 1653/3560 Training loss: 1.5499 0.1253 sec/batch\n", + "Epoch 10/20 Iteration 1654/3560 Training loss: 1.5508 0.1302 sec/batch\n", + "Epoch 10/20 Iteration 1655/3560 Training loss: 1.5505 0.1243 sec/batch\n", + "Epoch 10/20 Iteration 1656/3560 Training loss: 1.5507 0.1281 sec/batch\n", + "Epoch 10/20 Iteration 1657/3560 Training loss: 1.5505 0.1298 sec/batch\n", + "Epoch 10/20 Iteration 1658/3560 Training loss: 1.5506 0.1282 sec/batch\n", + "Epoch 10/20 Iteration 1659/3560 Training loss: 1.5510 0.1216 sec/batch\n", + "Epoch 10/20 Iteration 1660/3560 Training loss: 1.5508 0.1247 sec/batch\n", + "Epoch 10/20 Iteration 1661/3560 Training loss: 1.5501 0.1233 sec/batch\n", + "Epoch 10/20 Iteration 1662/3560 Training loss: 1.5506 0.1298 sec/batch\n", + "Epoch 10/20 Iteration 1663/3560 Training loss: 1.5505 0.1228 sec/batch\n", + "Epoch 10/20 Iteration 1664/3560 Training loss: 1.5515 0.1225 sec/batch\n", + "Epoch 10/20 Iteration 1665/3560 Training loss: 1.5518 0.1241 sec/batch\n", + "Epoch 10/20 Iteration 1666/3560 Training loss: 1.5520 0.1225 sec/batch\n", + "Epoch 10/20 Iteration 1667/3560 Training loss: 1.5516 0.1229 sec/batch\n", + "Epoch 10/20 Iteration 1668/3560 Training loss: 1.5518 0.1254 sec/batch\n", + "Epoch 10/20 Iteration 1669/3560 
Training loss: 1.5519 0.1213 sec/batch\n", + "Epoch 10/20 Iteration 1670/3560 Training loss: 1.5514 0.1256 sec/batch\n", + "Epoch 10/20 Iteration 1671/3560 Training loss: 1.5512 0.1225 sec/batch\n", + "Epoch 10/20 Iteration 1672/3560 Training loss: 1.5510 0.1220 sec/batch\n", + "Epoch 10/20 Iteration 1673/3560 Training loss: 1.5515 0.1222 sec/batch\n", + "Epoch 10/20 Iteration 1674/3560 Training loss: 1.5516 0.1227 sec/batch\n", + "Epoch 10/20 Iteration 1675/3560 Training loss: 1.5520 0.1251 sec/batch\n", + "Epoch 10/20 Iteration 1676/3560 Training loss: 1.5518 0.1219 sec/batch\n", + "Epoch 10/20 Iteration 1677/3560 Training loss: 1.5516 0.1245 sec/batch\n", + "Epoch 10/20 Iteration 1678/3560 Training loss: 1.5516 0.1250 sec/batch\n", + "Epoch 10/20 Iteration 1679/3560 Training loss: 1.5513 0.1232 sec/batch\n", + "Epoch 10/20 Iteration 1680/3560 Training loss: 1.5513 0.1281 sec/batch\n", + "Epoch 10/20 Iteration 1681/3560 Training loss: 1.5506 0.1237 sec/batch\n", + "Epoch 10/20 Iteration 1682/3560 Training loss: 1.5503 0.1268 sec/batch\n", + "Epoch 10/20 Iteration 1683/3560 Training loss: 1.5497 0.1227 sec/batch\n", + "Epoch 10/20 Iteration 1684/3560 Training loss: 1.5497 0.1237 sec/batch\n", + "Epoch 10/20 Iteration 1685/3560 Training loss: 1.5491 0.1258 sec/batch\n", + "Epoch 10/20 Iteration 1686/3560 Training loss: 1.5489 0.1274 sec/batch\n", + "Epoch 10/20 Iteration 1687/3560 Training loss: 1.5486 0.1253 sec/batch\n", + "Epoch 10/20 Iteration 1688/3560 Training loss: 1.5483 0.1238 sec/batch\n", + "Epoch 10/20 Iteration 1689/3560 Training loss: 1.5481 0.1230 sec/batch\n", + "Epoch 10/20 Iteration 1690/3560 Training loss: 1.5478 0.1247 sec/batch\n", + "Epoch 10/20 Iteration 1691/3560 Training loss: 1.5473 0.1221 sec/batch\n", + "Epoch 10/20 Iteration 1692/3560 Training loss: 1.5474 0.1228 sec/batch\n", + "Epoch 10/20 Iteration 1693/3560 Training loss: 1.5470 0.1270 sec/batch\n", + "Epoch 10/20 Iteration 1694/3560 Training loss: 1.5468 0.1263 sec/batch\n", + 
"Epoch 10/20 Iteration 1695/3560 Training loss: 1.5463 0.1218 sec/batch\n", + "Epoch 10/20 Iteration 1696/3560 Training loss: 1.5459 0.1271 sec/batch\n", + "Epoch 10/20 Iteration 1697/3560 Training loss: 1.5455 0.1214 sec/batch\n", + "Epoch 10/20 Iteration 1698/3560 Training loss: 1.5455 0.1236 sec/batch\n", + "Epoch 10/20 Iteration 1699/3560 Training loss: 1.5453 0.1254 sec/batch\n", + "Epoch 10/20 Iteration 1700/3560 Training loss: 1.5448 0.1231 sec/batch\n", + "Epoch 10/20 Iteration 1701/3560 Training loss: 1.5444 0.1257 sec/batch\n", + "Epoch 10/20 Iteration 1702/3560 Training loss: 1.5438 0.1299 sec/batch\n", + "Epoch 10/20 Iteration 1703/3560 Training loss: 1.5438 0.1243 sec/batch\n", + "Epoch 10/20 Iteration 1704/3560 Training loss: 1.5435 0.1237 sec/batch\n", + "Epoch 10/20 Iteration 1705/3560 Training loss: 1.5432 0.1345 sec/batch\n", + "Epoch 10/20 Iteration 1706/3560 Training loss: 1.5430 0.1273 sec/batch\n", + "Epoch 10/20 Iteration 1707/3560 Training loss: 1.5427 0.1308 sec/batch\n", + "Epoch 10/20 Iteration 1708/3560 Training loss: 1.5425 0.1237 sec/batch\n", + "Epoch 10/20 Iteration 1709/3560 Training loss: 1.5424 0.1273 sec/batch\n", + "Epoch 10/20 Iteration 1710/3560 Training loss: 1.5422 0.1229 sec/batch\n", + "Epoch 10/20 Iteration 1711/3560 Training loss: 1.5420 0.1241 sec/batch\n", + "Epoch 10/20 Iteration 1712/3560 Training loss: 1.5419 0.1216 sec/batch\n", + "Epoch 10/20 Iteration 1713/3560 Training loss: 1.5416 0.1258 sec/batch\n", + "Epoch 10/20 Iteration 1714/3560 Training loss: 1.5413 0.1241 sec/batch\n", + "Epoch 10/20 Iteration 1715/3560 Training loss: 1.5411 0.1273 sec/batch\n", + "Epoch 10/20 Iteration 1716/3560 Training loss: 1.5408 0.1234 sec/batch\n", + "Epoch 10/20 Iteration 1717/3560 Training loss: 1.5405 0.1260 sec/batch\n", + "Epoch 10/20 Iteration 1718/3560 Training loss: 1.5401 0.1229 sec/batch\n", + "Epoch 10/20 Iteration 1719/3560 Training loss: 1.5398 0.1228 sec/batch\n", + "Epoch 10/20 Iteration 1720/3560 Training loss: 
1.5398 0.1256 sec/batch\n", + "Epoch 10/20 Iteration 1721/3560 Training loss: 1.5395 0.1242 sec/batch\n", + "Epoch 10/20 Iteration 1722/3560 Training loss: 1.5394 0.1219 sec/batch\n", + "Epoch 10/20 Iteration 1723/3560 Training loss: 1.5392 0.1255 sec/batch\n", + "Epoch 10/20 Iteration 1724/3560 Training loss: 1.5388 0.1219 sec/batch\n", + "Epoch 10/20 Iteration 1725/3560 Training loss: 1.5383 0.1244 sec/batch\n", + "Epoch 10/20 Iteration 1726/3560 Training loss: 1.5383 0.1228 sec/batch\n", + "Epoch 10/20 Iteration 1727/3560 Training loss: 1.5382 0.1245 sec/batch\n", + "Epoch 10/20 Iteration 1728/3560 Training loss: 1.5377 0.1235 sec/batch\n", + "Epoch 10/20 Iteration 1729/3560 Training loss: 1.5377 0.1254 sec/batch\n", + "Epoch 10/20 Iteration 1730/3560 Training loss: 1.5376 0.1229 sec/batch\n", + "Epoch 10/20 Iteration 1731/3560 Training loss: 1.5374 0.1217 sec/batch\n", + "Epoch 10/20 Iteration 1732/3560 Training loss: 1.5372 0.1227 sec/batch\n", + "Epoch 10/20 Iteration 1733/3560 Training loss: 1.5369 0.1235 sec/batch\n", + "Epoch 10/20 Iteration 1734/3560 Training loss: 1.5366 0.1249 sec/batch\n", + "Epoch 10/20 Iteration 1735/3560 Training loss: 1.5366 0.1222 sec/batch\n", + "Epoch 10/20 Iteration 1736/3560 Training loss: 1.5366 0.1211 sec/batch\n", + "Epoch 10/20 Iteration 1737/3560 Training loss: 1.5365 0.1223 sec/batch\n", + "Epoch 10/20 Iteration 1738/3560 Training loss: 1.5365 0.1287 sec/batch\n", + "Epoch 10/20 Iteration 1739/3560 Training loss: 1.5365 0.1282 sec/batch\n", + "Epoch 10/20 Iteration 1740/3560 Training loss: 1.5365 0.1332 sec/batch\n", + "Epoch 10/20 Iteration 1741/3560 Training loss: 1.5365 0.1279 sec/batch\n", + "Epoch 10/20 Iteration 1742/3560 Training loss: 1.5364 0.1230 sec/batch\n", + "Epoch 10/20 Iteration 1743/3560 Training loss: 1.5367 0.1200 sec/batch\n", + "Epoch 10/20 Iteration 1744/3560 Training loss: 1.5365 0.1273 sec/batch\n", + "Epoch 10/20 Iteration 1745/3560 Training loss: 1.5364 0.1260 sec/batch\n", + "Epoch 10/20 
Iteration 1746/3560 Training loss: 1.5365 0.1252 sec/batch\n", + "Epoch 10/20 Iteration 1747/3560 Training loss: 1.5363 0.1266 sec/batch\n", + "Epoch 10/20 Iteration 1748/3560 Training loss: 1.5364 0.1231 sec/batch\n", + "Epoch 10/20 Iteration 1749/3560 Training loss: 1.5364 0.1259 sec/batch\n", + "Epoch 10/20 Iteration 1750/3560 Training loss: 1.5365 0.1332 sec/batch\n", + "Epoch 10/20 Iteration 1751/3560 Training loss: 1.5364 0.1325 sec/batch\n", + "Epoch 10/20 Iteration 1752/3560 Training loss: 1.5362 0.1251 sec/batch\n", + "Epoch 10/20 Iteration 1753/3560 Training loss: 1.5358 0.1232 sec/batch\n", + "Epoch 10/20 Iteration 1754/3560 Training loss: 1.5357 0.1246 sec/batch\n", + "Epoch 10/20 Iteration 1755/3560 Training loss: 1.5356 0.1233 sec/batch\n", + "Epoch 10/20 Iteration 1756/3560 Training loss: 1.5356 0.1240 sec/batch\n", + "Epoch 10/20 Iteration 1757/3560 Training loss: 1.5355 0.1248 sec/batch\n", + "Epoch 10/20 Iteration 1758/3560 Training loss: 1.5354 0.1221 sec/batch\n", + "Epoch 10/20 Iteration 1759/3560 Training loss: 1.5353 0.1247 sec/batch\n", + "Epoch 10/20 Iteration 1760/3560 Training loss: 1.5351 0.1220 sec/batch\n", + "Epoch 10/20 Iteration 1761/3560 Training loss: 1.5348 0.1229 sec/batch\n", + "Epoch 10/20 Iteration 1762/3560 Training loss: 1.5348 0.1267 sec/batch\n", + "Epoch 10/20 Iteration 1763/3560 Training loss: 1.5350 0.1222 sec/batch\n", + "Epoch 10/20 Iteration 1764/3560 Training loss: 1.5349 0.1232 sec/batch\n", + "Epoch 10/20 Iteration 1765/3560 Training loss: 1.5350 0.1301 sec/batch\n", + "Epoch 10/20 Iteration 1766/3560 Training loss: 1.5349 0.1284 sec/batch\n", + "Epoch 10/20 Iteration 1767/3560 Training loss: 1.5348 0.1286 sec/batch\n", + "Epoch 10/20 Iteration 1768/3560 Training loss: 1.5347 0.1279 sec/batch\n", + "Epoch 10/20 Iteration 1769/3560 Training loss: 1.5347 0.1248 sec/batch\n", + "Epoch 10/20 Iteration 1770/3560 Training loss: 1.5351 0.1266 sec/batch\n", + "Epoch 10/20 Iteration 1771/3560 Training loss: 1.5350 0.1274 
sec/batch\n", + "Epoch 10/20 Iteration 1772/3560 Training loss: 1.5350 0.1250 sec/batch\n", + "Epoch 10/20 Iteration 1773/3560 Training loss: 1.5348 0.1267 sec/batch\n", + "Epoch 10/20 Iteration 1774/3560 Training loss: 1.5346 0.1297 sec/batch\n", + "Epoch 10/20 Iteration 1775/3560 Training loss: 1.5347 0.1232 sec/batch\n", + "Epoch 10/20 Iteration 1776/3560 Training loss: 1.5347 0.1265 sec/batch\n", + "Epoch 10/20 Iteration 1777/3560 Training loss: 1.5347 0.1271 sec/batch\n", + "Epoch 10/20 Iteration 1778/3560 Training loss: 1.5345 0.1249 sec/batch\n", + "Epoch 10/20 Iteration 1779/3560 Training loss: 1.5342 0.1317 sec/batch\n", + "Epoch 10/20 Iteration 1780/3560 Training loss: 1.5343 0.1240 sec/batch\n", + "Epoch 11/20 Iteration 1781/3560 Training loss: 1.6224 0.1297 sec/batch\n", + "Epoch 11/20 Iteration 1782/3560 Training loss: 1.5744 0.1295 sec/batch\n", + "Epoch 11/20 Iteration 1783/3560 Training loss: 1.5554 0.1267 sec/batch\n", + "Epoch 11/20 Iteration 1784/3560 Training loss: 1.5449 0.1285 sec/batch\n", + "Epoch 11/20 Iteration 1785/3560 Training loss: 1.5385 0.1246 sec/batch\n", + "Epoch 11/20 Iteration 1786/3560 Training loss: 1.5274 0.1275 sec/batch\n", + "Epoch 11/20 Iteration 1787/3560 Training loss: 1.5268 0.1252 sec/batch\n", + "Epoch 11/20 Iteration 1788/3560 Training loss: 1.5239 0.1212 sec/batch\n", + "Epoch 11/20 Iteration 1789/3560 Training loss: 1.5239 0.1244 sec/batch\n", + "Epoch 11/20 Iteration 1790/3560 Training loss: 1.5224 0.1335 sec/batch\n", + "Epoch 11/20 Iteration 1791/3560 Training loss: 1.5187 0.1251 sec/batch\n", + "Epoch 11/20 Iteration 1792/3560 Training loss: 1.5175 0.1262 sec/batch\n", + "Epoch 11/20 Iteration 1793/3560 Training loss: 1.5161 0.1237 sec/batch\n", + "Epoch 11/20 Iteration 1794/3560 Training loss: 1.5182 0.1254 sec/batch\n", + "Epoch 11/20 Iteration 1795/3560 Training loss: 1.5169 0.1277 sec/batch\n", + "Epoch 11/20 Iteration 1796/3560 Training loss: 1.5147 0.1252 sec/batch\n", + "Epoch 11/20 Iteration 1797/3560 
Training loss: 1.5147 0.1243 sec/batch\n", + "Epoch 11/20 Iteration 1798/3560 Training loss: 1.5164 0.1270 sec/batch\n", + "Epoch 11/20 Iteration 1799/3560 Training loss: 1.5166 0.1264 sec/batch\n", + "Epoch 11/20 Iteration 1800/3560 Training loss: 1.5179 0.1233 sec/batch\n", + "Epoch 11/20 Iteration 1801/3560 Training loss: 1.5180 0.1223 sec/batch\n", + "Epoch 11/20 Iteration 1802/3560 Training loss: 1.5185 0.1253 sec/batch\n", + "Epoch 11/20 Iteration 1803/3560 Training loss: 1.5176 0.1247 sec/batch\n", + "Epoch 11/20 Iteration 1804/3560 Training loss: 1.5172 0.1269 sec/batch\n", + "Epoch 11/20 Iteration 1805/3560 Training loss: 1.5174 0.1220 sec/batch\n", + "Epoch 11/20 Iteration 1806/3560 Training loss: 1.5158 0.1217 sec/batch\n", + "Epoch 11/20 Iteration 1807/3560 Training loss: 1.5141 0.1223 sec/batch\n", + "Epoch 11/20 Iteration 1808/3560 Training loss: 1.5147 0.1244 sec/batch\n", + "Epoch 11/20 Iteration 1809/3560 Training loss: 1.5151 0.1235 sec/batch\n", + "Epoch 11/20 Iteration 1810/3560 Training loss: 1.5153 0.1218 sec/batch\n", + "Epoch 11/20 Iteration 1811/3560 Training loss: 1.5150 0.1246 sec/batch\n", + "Epoch 11/20 Iteration 1812/3560 Training loss: 1.5141 0.1236 sec/batch\n", + "Epoch 11/20 Iteration 1813/3560 Training loss: 1.5144 0.1215 sec/batch\n", + "Epoch 11/20 Iteration 1814/3560 Training loss: 1.5147 0.1281 sec/batch\n", + "Epoch 11/20 Iteration 1815/3560 Training loss: 1.5143 0.1220 sec/batch\n", + "Epoch 11/20 Iteration 1816/3560 Training loss: 1.5142 0.1217 sec/batch\n", + "Epoch 11/20 Iteration 1817/3560 Training loss: 1.5131 0.1248 sec/batch\n", + "Epoch 11/20 Iteration 1818/3560 Training loss: 1.5119 0.1233 sec/batch\n", + "Epoch 11/20 Iteration 1819/3560 Training loss: 1.5104 0.1215 sec/batch\n", + "Epoch 11/20 Iteration 1820/3560 Training loss: 1.5100 0.1252 sec/batch\n", + "Epoch 11/20 Iteration 1821/3560 Training loss: 1.5094 0.1228 sec/batch\n", + "Epoch 11/20 Iteration 1822/3560 Training loss: 1.5100 0.1280 sec/batch\n", + 
"Epoch 11/20 Iteration 1823/3560 Training loss: 1.5091 0.1252 sec/batch\n", + "Epoch 11/20 Iteration 1824/3560 Training loss: 1.5084 0.1280 sec/batch\n", + "Epoch 11/20 Iteration 1825/3560 Training loss: 1.5085 0.1220 sec/batch\n", + "Epoch 11/20 Iteration 1826/3560 Training loss: 1.5074 0.1230 sec/batch\n", + "Epoch 11/20 Iteration 1827/3560 Training loss: 1.5071 0.1231 sec/batch\n", + "Epoch 11/20 Iteration 1828/3560 Training loss: 1.5065 0.1238 sec/batch\n", + "Epoch 11/20 Iteration 1829/3560 Training loss: 1.5064 0.1289 sec/batch\n", + "Epoch 11/20 Iteration 1830/3560 Training loss: 1.5068 0.1282 sec/batch\n", + "Epoch 11/20 Iteration 1831/3560 Training loss: 1.5063 0.1315 sec/batch\n", + "Epoch 11/20 Iteration 1832/3560 Training loss: 1.5073 0.1274 sec/batch\n", + "Epoch 11/20 Iteration 1833/3560 Training loss: 1.5072 0.1231 sec/batch\n", + "Epoch 11/20 Iteration 1834/3560 Training loss: 1.5072 0.1243 sec/batch\n", + "Epoch 11/20 Iteration 1835/3560 Training loss: 1.5070 0.1296 sec/batch\n", + "Epoch 11/20 Iteration 1836/3560 Training loss: 1.5069 0.1250 sec/batch\n", + "Epoch 11/20 Iteration 1837/3560 Training loss: 1.5074 0.1230 sec/batch\n", + "Epoch 11/20 Iteration 1838/3560 Training loss: 1.5070 0.1285 sec/batch\n", + "Epoch 11/20 Iteration 1839/3560 Training loss: 1.5064 0.1224 sec/batch\n", + "Epoch 11/20 Iteration 1840/3560 Training loss: 1.5070 0.1233 sec/batch\n", + "Epoch 11/20 Iteration 1841/3560 Training loss: 1.5070 0.1227 sec/batch\n", + "Epoch 11/20 Iteration 1842/3560 Training loss: 1.5079 0.1282 sec/batch\n", + "Epoch 11/20 Iteration 1843/3560 Training loss: 1.5081 0.1241 sec/batch\n", + "Epoch 11/20 Iteration 1844/3560 Training loss: 1.5084 0.1218 sec/batch\n", + "Epoch 11/20 Iteration 1845/3560 Training loss: 1.5081 0.1226 sec/batch\n", + "Epoch 11/20 Iteration 1846/3560 Training loss: 1.5083 0.1242 sec/batch\n", + "Epoch 11/20 Iteration 1847/3560 Training loss: 1.5083 0.1225 sec/batch\n", + "Epoch 11/20 Iteration 1848/3560 Training loss: 
1.5077 0.1265 sec/batch\n", + "Epoch 11/20 Iteration 1849/3560 Training loss: 1.5077 0.1218 sec/batch\n", + "Epoch 11/20 Iteration 1850/3560 Training loss: 1.5074 0.1232 sec/batch\n", + "Epoch 11/20 Iteration 1851/3560 Training loss: 1.5079 0.1263 sec/batch\n", + "Epoch 11/20 Iteration 1852/3560 Training loss: 1.5082 0.1246 sec/batch\n", + "Epoch 11/20 Iteration 1853/3560 Training loss: 1.5083 0.1245 sec/batch\n", + "Epoch 11/20 Iteration 1854/3560 Training loss: 1.5079 0.1257 sec/batch\n", + "Epoch 11/20 Iteration 1855/3560 Training loss: 1.5076 0.1249 sec/batch\n", + "Epoch 11/20 Iteration 1856/3560 Training loss: 1.5076 0.1243 sec/batch\n", + "Epoch 11/20 Iteration 1857/3560 Training loss: 1.5072 0.1225 sec/batch\n", + "Epoch 11/20 Iteration 1858/3560 Training loss: 1.5072 0.1223 sec/batch\n", + "Epoch 11/20 Iteration 1859/3560 Training loss: 1.5064 0.1385 sec/batch\n", + "Epoch 11/20 Iteration 1860/3560 Training loss: 1.5062 0.1252 sec/batch\n", + "Epoch 11/20 Iteration 1861/3560 Training loss: 1.5055 0.1245 sec/batch\n", + "Epoch 11/20 Iteration 1862/3560 Training loss: 1.5053 0.1223 sec/batch\n", + "Epoch 11/20 Iteration 1863/3560 Training loss: 1.5046 0.1216 sec/batch\n", + "Epoch 11/20 Iteration 1864/3560 Training loss: 1.5045 0.1239 sec/batch\n", + "Epoch 11/20 Iteration 1865/3560 Training loss: 1.5041 0.1216 sec/batch\n", + "Epoch 11/20 Iteration 1866/3560 Training loss: 1.5038 0.1235 sec/batch\n", + "Epoch 11/20 Iteration 1867/3560 Training loss: 1.5035 0.1230 sec/batch\n", + "Epoch 11/20 Iteration 1868/3560 Training loss: 1.5030 0.1255 sec/batch\n", + "Epoch 11/20 Iteration 1869/3560 Training loss: 1.5025 0.1220 sec/batch\n", + "Epoch 11/20 Iteration 1870/3560 Training loss: 1.5024 0.1262 sec/batch\n", + "Epoch 11/20 Iteration 1871/3560 Training loss: 1.5022 0.1234 sec/batch\n", + "Epoch 11/20 Iteration 1872/3560 Training loss: 1.5019 0.1236 sec/batch\n", + "Epoch 11/20 Iteration 1873/3560 Training loss: 1.5014 0.1236 sec/batch\n", + "Epoch 11/20 
Iteration 1874/3560 Training loss: 1.5010 0.1267 sec/batch\n", + "Epoch 11/20 Iteration 1875/3560 Training loss: 1.5005 0.1257 sec/batch\n", + "Epoch 11/20 Iteration 1876/3560 Training loss: 1.5003 0.1254 sec/batch\n", + "Epoch 11/20 Iteration 1877/3560 Training loss: 1.5001 0.1245 sec/batch\n", + "Epoch 11/20 Iteration 1878/3560 Training loss: 1.4996 0.1228 sec/batch\n", + "Epoch 11/20 Iteration 1879/3560 Training loss: 1.4990 0.1252 sec/batch\n", + "Epoch 11/20 Iteration 1880/3560 Training loss: 1.4987 0.1253 sec/batch\n", + "Epoch 11/20 Iteration 1881/3560 Training loss: 1.4988 0.1246 sec/batch\n", + "Epoch 11/20 Iteration 1882/3560 Training loss: 1.4988 0.1286 sec/batch\n", + "Epoch 11/20 Iteration 1883/3560 Training loss: 1.4987 0.1239 sec/batch\n", + "Epoch 11/20 Iteration 1884/3560 Training loss: 1.4986 0.1276 sec/batch\n", + "Epoch 11/20 Iteration 1885/3560 Training loss: 1.4984 0.1243 sec/batch\n", + "Epoch 11/20 Iteration 1886/3560 Training loss: 1.4984 0.1269 sec/batch\n", + "Epoch 11/20 Iteration 1887/3560 Training loss: 1.4983 0.1262 sec/batch\n", + "Epoch 11/20 Iteration 1888/3560 Training loss: 1.4982 0.1244 sec/batch\n", + "Epoch 11/20 Iteration 1889/3560 Training loss: 1.4982 0.1244 sec/batch\n", + "Epoch 11/20 Iteration 1890/3560 Training loss: 1.4983 0.1223 sec/batch\n", + "Epoch 11/20 Iteration 1891/3560 Training loss: 1.4981 0.1241 sec/batch\n", + "Epoch 11/20 Iteration 1892/3560 Training loss: 1.4981 0.1251 sec/batch\n", + "Epoch 11/20 Iteration 1893/3560 Training loss: 1.4980 0.1239 sec/batch\n", + "Epoch 11/20 Iteration 1894/3560 Training loss: 1.4979 0.1229 sec/batch\n", + "Epoch 11/20 Iteration 1895/3560 Training loss: 1.4976 0.1272 sec/batch\n", + "Epoch 11/20 Iteration 1896/3560 Training loss: 1.4972 0.1242 sec/batch\n", + "Epoch 11/20 Iteration 1897/3560 Training loss: 1.4971 0.1293 sec/batch\n", + "Epoch 11/20 Iteration 1898/3560 Training loss: 1.4971 0.1267 sec/batch\n", + "Epoch 11/20 Iteration 1899/3560 Training loss: 1.4970 0.1254 
sec/batch\n", + "Epoch 11/20 Iteration 1900/3560 Training loss: 1.4969 0.1246 sec/batch\n", + "Epoch 11/20 Iteration 1901/3560 Training loss: 1.4968 0.1314 sec/batch\n", + "Epoch 11/20 Iteration 1902/3560 Training loss: 1.4964 0.1218 sec/batch\n", + "Epoch 11/20 Iteration 1903/3560 Training loss: 1.4960 0.1276 sec/batch\n", + "Epoch 11/20 Iteration 1904/3560 Training loss: 1.4960 0.1241 sec/batch\n", + "Epoch 11/20 Iteration 1905/3560 Training loss: 1.4959 0.1255 sec/batch\n", + "Epoch 11/20 Iteration 1906/3560 Training loss: 1.4955 0.1245 sec/batch\n", + "Epoch 11/20 Iteration 1907/3560 Training loss: 1.4954 0.1263 sec/batch\n", + "Epoch 11/20 Iteration 1908/3560 Training loss: 1.4954 0.1319 sec/batch\n", + "Epoch 11/20 Iteration 1909/3560 Training loss: 1.4952 0.1267 sec/batch\n", + "Epoch 11/20 Iteration 1910/3560 Training loss: 1.4949 0.1380 sec/batch\n", + "Epoch 11/20 Iteration 1911/3560 Training loss: 1.4944 0.1254 sec/batch\n", + "Epoch 11/20 Iteration 1912/3560 Training loss: 1.4942 0.1270 sec/batch\n", + "Epoch 11/20 Iteration 1913/3560 Training loss: 1.4942 0.1221 sec/batch\n", + "Epoch 11/20 Iteration 1914/3560 Training loss: 1.4941 0.1238 sec/batch\n", + "Epoch 11/20 Iteration 1915/3560 Training loss: 1.4940 0.1228 sec/batch\n", + "Epoch 11/20 Iteration 1916/3560 Training loss: 1.4939 0.1277 sec/batch\n", + "Epoch 11/20 Iteration 1917/3560 Training loss: 1.4940 0.1264 sec/batch\n", + "Epoch 11/20 Iteration 1918/3560 Training loss: 1.4939 0.1230 sec/batch\n", + "Epoch 11/20 Iteration 1919/3560 Training loss: 1.4939 0.1298 sec/batch\n", + "Epoch 11/20 Iteration 1920/3560 Training loss: 1.4938 0.1255 sec/batch\n", + "Epoch 11/20 Iteration 1921/3560 Training loss: 1.4940 0.1262 sec/batch\n", + "Epoch 11/20 Iteration 1922/3560 Training loss: 1.4939 0.1225 sec/batch\n", + "Epoch 11/20 Iteration 1923/3560 Training loss: 1.4937 0.1226 sec/batch\n", + "Epoch 11/20 Iteration 1924/3560 Training loss: 1.4938 0.1218 sec/batch\n", + "Epoch 11/20 Iteration 1925/3560 
Training loss: 1.4935 0.1222 sec/batch\n", + "Epoch 11/20 Iteration 1926/3560 Training loss: 1.4936 0.1223 sec/batch\n", + "Epoch 11/20 Iteration 1927/3560 Training loss: 1.4936 0.1228 sec/batch\n", + "Epoch 11/20 Iteration 1928/3560 Training loss: 1.4937 0.1284 sec/batch\n", + "Epoch 11/20 Iteration 1929/3560 Training loss: 1.4937 0.1248 sec/batch\n", + "Epoch 11/20 Iteration 1930/3560 Training loss: 1.4934 0.1252 sec/batch\n", + "Epoch 11/20 Iteration 1931/3560 Training loss: 1.4930 0.1263 sec/batch\n", + "Epoch 11/20 Iteration 1932/3560 Training loss: 1.4928 0.1270 sec/batch\n", + "Epoch 11/20 Iteration 1933/3560 Training loss: 1.4927 0.1244 sec/batch\n", + "Epoch 11/20 Iteration 1934/3560 Training loss: 1.4925 0.1288 sec/batch\n", + "Epoch 11/20 Iteration 1935/3560 Training loss: 1.4925 0.1220 sec/batch\n", + "Epoch 11/20 Iteration 1936/3560 Training loss: 1.4923 0.1230 sec/batch\n", + "Epoch 11/20 Iteration 1937/3560 Training loss: 1.4924 0.1218 sec/batch\n", + "Epoch 11/20 Iteration 1938/3560 Training loss: 1.4923 0.1275 sec/batch\n", + "Epoch 11/20 Iteration 1939/3560 Training loss: 1.4921 0.1254 sec/batch\n", + "Epoch 11/20 Iteration 1940/3560 Training loss: 1.4922 0.1233 sec/batch\n", + "Epoch 11/20 Iteration 1941/3560 Training loss: 1.4924 0.1232 sec/batch\n", + "Epoch 11/20 Iteration 1942/3560 Training loss: 1.4923 0.1293 sec/batch\n", + "Epoch 11/20 Iteration 1943/3560 Training loss: 1.4922 0.1245 sec/batch\n", + "Epoch 11/20 Iteration 1944/3560 Training loss: 1.4921 0.1244 sec/batch\n", + "Epoch 11/20 Iteration 1945/3560 Training loss: 1.4921 0.1218 sec/batch\n", + "Epoch 11/20 Iteration 1946/3560 Training loss: 1.4919 0.1254 sec/batch\n", + "Epoch 11/20 Iteration 1947/3560 Training loss: 1.4920 0.1297 sec/batch\n", + "Epoch 11/20 Iteration 1948/3560 Training loss: 1.4923 0.1316 sec/batch\n", + "Epoch 11/20 Iteration 1949/3560 Training loss: 1.4921 0.1246 sec/batch\n", + "Epoch 11/20 Iteration 1950/3560 Training loss: 1.4921 0.1256 sec/batch\n", + 
"Epoch 11/20 Iteration 1951/3560 Training loss: 1.4919 0.1285 sec/batch\n", + "Epoch 11/20 Iteration 1952/3560 Training loss: 1.4916 0.1363 sec/batch\n", + "Epoch 11/20 Iteration 1953/3560 Training loss: 1.4917 0.1228 sec/batch\n", + "Epoch 11/20 Iteration 1954/3560 Training loss: 1.4916 0.1308 sec/batch\n", + "Epoch 11/20 Iteration 1955/3560 Training loss: 1.4917 0.1309 sec/batch\n", + "Epoch 11/20 Iteration 1956/3560 Training loss: 1.4914 0.1210 sec/batch\n", + "Epoch 11/20 Iteration 1957/3560 Training loss: 1.4911 0.1251 sec/batch\n", + "Epoch 11/20 Iteration 1958/3560 Training loss: 1.4911 0.1233 sec/batch\n", + "Epoch 12/20 Iteration 1959/3560 Training loss: 1.5874 0.1246 sec/batch\n", + "Epoch 12/20 Iteration 1960/3560 Training loss: 1.5376 0.1336 sec/batch\n", + "Epoch 12/20 Iteration 1961/3560 Training loss: 1.5153 0.1321 sec/batch\n", + "Epoch 12/20 Iteration 1962/3560 Training loss: 1.5049 0.1296 sec/batch\n", + "Epoch 12/20 Iteration 1963/3560 Training loss: 1.4964 0.1250 sec/batch\n", + "Epoch 12/20 Iteration 1964/3560 Training loss: 1.4848 0.1245 sec/batch\n", + "Epoch 12/20 Iteration 1965/3560 Training loss: 1.4833 0.1237 sec/batch\n", + "Epoch 12/20 Iteration 1966/3560 Training loss: 1.4802 0.1253 sec/batch\n", + "Epoch 12/20 Iteration 1967/3560 Training loss: 1.4794 0.1238 sec/batch\n", + "Epoch 12/20 Iteration 1968/3560 Training loss: 1.4774 0.1296 sec/batch\n", + "Epoch 12/20 Iteration 1969/3560 Training loss: 1.4735 0.1254 sec/batch\n", + "Epoch 12/20 Iteration 1970/3560 Training loss: 1.4724 0.1275 sec/batch\n", + "Epoch 12/20 Iteration 1971/3560 Training loss: 1.4719 0.1265 sec/batch\n", + "Epoch 12/20 Iteration 1972/3560 Training loss: 1.4740 0.1225 sec/batch\n", + "Epoch 12/20 Iteration 1973/3560 Training loss: 1.4718 0.1326 sec/batch\n", + "Epoch 12/20 Iteration 1974/3560 Training loss: 1.4692 0.1296 sec/batch\n", + "Epoch 12/20 Iteration 1975/3560 Training loss: 1.4690 0.1257 sec/batch\n", + "Epoch 12/20 Iteration 1976/3560 Training loss: 
1.4705 0.1292 sec/batch\n", + "Epoch 12/20 Iteration 1977/3560 Training loss: 1.4703 0.1352 sec/batch\n", + "Epoch 12/20 Iteration 1978/3560 Training loss: 1.4711 0.1248 sec/batch\n", + "Epoch 12/20 Iteration 1979/3560 Training loss: 1.4700 0.1285 sec/batch\n", + "Epoch 12/20 Iteration 1980/3560 Training loss: 1.4700 0.1377 sec/batch\n", + "Epoch 12/20 Iteration 1981/3560 Training loss: 1.4690 0.1298 sec/batch\n", + "Epoch 12/20 Iteration 1982/3560 Training loss: 1.4685 0.1226 sec/batch\n", + "Epoch 12/20 Iteration 1983/3560 Training loss: 1.4686 0.1306 sec/batch\n", + "Epoch 12/20 Iteration 1984/3560 Training loss: 1.4669 0.1254 sec/batch\n", + "Epoch 12/20 Iteration 1985/3560 Training loss: 1.4662 0.1221 sec/batch\n", + "Epoch 12/20 Iteration 1986/3560 Training loss: 1.4674 0.1305 sec/batch\n", + "Epoch 12/20 Iteration 1987/3560 Training loss: 1.4680 0.1300 sec/batch\n", + "Epoch 12/20 Iteration 1988/3560 Training loss: 1.4684 0.1270 sec/batch\n", + "Epoch 12/20 Iteration 1989/3560 Training loss: 1.4682 0.1238 sec/batch\n", + "Epoch 12/20 Iteration 1990/3560 Training loss: 1.4672 0.1246 sec/batch\n", + "Epoch 12/20 Iteration 1991/3560 Training loss: 1.4678 0.1248 sec/batch\n", + "Epoch 12/20 Iteration 1992/3560 Training loss: 1.4678 0.1301 sec/batch\n", + "Epoch 12/20 Iteration 1993/3560 Training loss: 1.4676 0.1295 sec/batch\n", + "Epoch 12/20 Iteration 1994/3560 Training loss: 1.4673 0.1203 sec/batch\n", + "Epoch 12/20 Iteration 1995/3560 Training loss: 1.4664 0.1272 sec/batch\n", + "Epoch 12/20 Iteration 1996/3560 Training loss: 1.4652 0.1299 sec/batch\n", + "Epoch 12/20 Iteration 1997/3560 Training loss: 1.4639 0.1294 sec/batch\n", + "Epoch 12/20 Iteration 1998/3560 Training loss: 1.4634 0.1218 sec/batch\n", + "Epoch 12/20 Iteration 1999/3560 Training loss: 1.4628 0.1324 sec/batch\n", + "Epoch 12/20 Iteration 2000/3560 Training loss: 1.4634 0.1362 sec/batch\n", + "Epoch 12/20 Iteration 2001/3560 Training loss: 1.4630 0.1252 sec/batch\n", + "Epoch 12/20 
Iteration 2002/3560 Training loss: 1.4621 0.1299 sec/batch\n", + "Epoch 12/20 Iteration 2003/3560 Training loss: 1.4623 0.1239 sec/batch\n", + "Epoch 12/20 Iteration 2004/3560 Training loss: 1.4612 0.1249 sec/batch\n", + "Epoch 12/20 Iteration 2005/3560 Training loss: 1.4606 0.1279 sec/batch\n", + "Epoch 12/20 Iteration 2006/3560 Training loss: 1.4600 0.1269 sec/batch\n", + "Epoch 12/20 Iteration 2007/3560 Training loss: 1.4598 0.1238 sec/batch\n", + "Epoch 12/20 Iteration 2008/3560 Training loss: 1.4602 0.1288 sec/batch\n", + "Epoch 12/20 Iteration 2009/3560 Training loss: 1.4595 0.1245 sec/batch\n", + "Epoch 12/20 Iteration 2010/3560 Training loss: 1.4603 0.1290 sec/batch\n", + "Epoch 12/20 Iteration 2011/3560 Training loss: 1.4600 0.1301 sec/batch\n", + "Epoch 12/20 Iteration 2012/3560 Training loss: 1.4600 0.1362 sec/batch\n", + "Epoch 12/20 Iteration 2013/3560 Training loss: 1.4597 0.1242 sec/batch\n", + "Epoch 12/20 Iteration 2014/3560 Training loss: 1.4597 0.1374 sec/batch\n", + "Epoch 12/20 Iteration 2015/3560 Training loss: 1.4599 0.1224 sec/batch\n", + "Epoch 12/20 Iteration 2016/3560 Training loss: 1.4596 0.1251 sec/batch\n", + "Epoch 12/20 Iteration 2017/3560 Training loss: 1.4590 0.1269 sec/batch\n", + "Epoch 12/20 Iteration 2018/3560 Training loss: 1.4595 0.1232 sec/batch\n", + "Epoch 12/20 Iteration 2019/3560 Training loss: 1.4593 0.1248 sec/batch\n", + "Epoch 12/20 Iteration 2020/3560 Training loss: 1.4601 0.1259 sec/batch\n", + "Epoch 12/20 Iteration 2021/3560 Training loss: 1.4604 0.1231 sec/batch\n", + "Epoch 12/20 Iteration 2022/3560 Training loss: 1.4607 0.1278 sec/batch\n", + "Epoch 12/20 Iteration 2023/3560 Training loss: 1.4606 0.1264 sec/batch\n", + "Epoch 12/20 Iteration 2024/3560 Training loss: 1.4606 0.1246 sec/batch\n", + "Epoch 12/20 Iteration 2025/3560 Training loss: 1.4605 0.1244 sec/batch\n", + "Epoch 12/20 Iteration 2026/3560 Training loss: 1.4600 0.1270 sec/batch\n", + "Epoch 12/20 Iteration 2027/3560 Training loss: 1.4599 0.1263 
sec/batch\n", + "[... per-iteration training log truncated: epochs 12-15 omitted; training loss decreased from roughly 1.46 at the start of epoch 12 to roughly 1.37 by epoch 15, at approximately 0.12-0.14 sec/batch throughout ...]\n", + "Epoch 15/20 Iteration 2539/3560 Training loss: 1.3742 0.1234
sec/batch\n", + "Epoch 15/20 Iteration 2540/3560 Training loss: 1.3736 0.1213 sec/batch\n", + "Epoch 15/20 Iteration 2541/3560 Training loss: 1.3734 0.1254 sec/batch\n", + "Epoch 15/20 Iteration 2542/3560 Training loss: 1.3737 0.1219 sec/batch\n", + "Epoch 15/20 Iteration 2543/3560 Training loss: 1.3732 0.1249 sec/batch\n", + "Epoch 15/20 Iteration 2544/3560 Training loss: 1.3738 0.1245 sec/batch\n", + "Epoch 15/20 Iteration 2545/3560 Training loss: 1.3735 0.1246 sec/batch\n", + "Epoch 15/20 Iteration 2546/3560 Training loss: 1.3737 0.1248 sec/batch\n", + "Epoch 15/20 Iteration 2547/3560 Training loss: 1.3735 0.1243 sec/batch\n", + "Epoch 15/20 Iteration 2548/3560 Training loss: 1.3735 0.1257 sec/batch\n", + "Epoch 15/20 Iteration 2549/3560 Training loss: 1.3740 0.1242 sec/batch\n", + "Epoch 15/20 Iteration 2550/3560 Training loss: 1.3737 0.1238 sec/batch\n", + "Epoch 15/20 Iteration 2551/3560 Training loss: 1.3731 0.1253 sec/batch\n", + "Epoch 15/20 Iteration 2552/3560 Training loss: 1.3737 0.1231 sec/batch\n", + "Epoch 15/20 Iteration 2553/3560 Training loss: 1.3738 0.1225 sec/batch\n", + "Epoch 15/20 Iteration 2554/3560 Training loss: 1.3747 0.1225 sec/batch\n", + "Epoch 15/20 Iteration 2555/3560 Training loss: 1.3751 0.1245 sec/batch\n", + "Epoch 15/20 Iteration 2556/3560 Training loss: 1.3752 0.1221 sec/batch\n", + "Epoch 15/20 Iteration 2557/3560 Training loss: 1.3751 0.1226 sec/batch\n", + "Epoch 15/20 Iteration 2558/3560 Training loss: 1.3754 0.1251 sec/batch\n", + "Epoch 15/20 Iteration 2559/3560 Training loss: 1.3754 0.1248 sec/batch\n", + "Epoch 15/20 Iteration 2560/3560 Training loss: 1.3750 0.1264 sec/batch\n", + "Epoch 15/20 Iteration 2561/3560 Training loss: 1.3750 0.1226 sec/batch\n", + "Epoch 15/20 Iteration 2562/3560 Training loss: 1.3748 0.1245 sec/batch\n", + "Epoch 15/20 Iteration 2563/3560 Training loss: 1.3754 0.1225 sec/batch\n", + "Epoch 15/20 Iteration 2564/3560 Training loss: 1.3757 0.1234 sec/batch\n", + "Epoch 15/20 Iteration 2565/3560 
Training loss: 1.3760 0.1305 sec/batch\n", + "Epoch 15/20 Iteration 2566/3560 Training loss: 1.3757 0.1287 sec/batch\n", + "Epoch 15/20 Iteration 2567/3560 Training loss: 1.3756 0.1255 sec/batch\n", + "Epoch 15/20 Iteration 2568/3560 Training loss: 1.3757 0.1255 sec/batch\n", + "Epoch 15/20 Iteration 2569/3560 Training loss: 1.3755 0.1245 sec/batch\n", + "Epoch 15/20 Iteration 2570/3560 Training loss: 1.3755 0.1288 sec/batch\n", + "Epoch 15/20 Iteration 2571/3560 Training loss: 1.3748 0.1223 sec/batch\n", + "Epoch 15/20 Iteration 2572/3560 Training loss: 1.3747 0.1251 sec/batch\n", + "Epoch 15/20 Iteration 2573/3560 Training loss: 1.3740 0.1236 sec/batch\n", + "Epoch 15/20 Iteration 2574/3560 Training loss: 1.3739 0.1287 sec/batch\n", + "Epoch 15/20 Iteration 2575/3560 Training loss: 1.3734 0.1308 sec/batch\n", + "Epoch 15/20 Iteration 2576/3560 Training loss: 1.3732 0.1306 sec/batch\n", + "Epoch 15/20 Iteration 2577/3560 Training loss: 1.3729 0.1254 sec/batch\n", + "Epoch 15/20 Iteration 2578/3560 Training loss: 1.3727 0.1227 sec/batch\n", + "Epoch 15/20 Iteration 2579/3560 Training loss: 1.3724 0.1244 sec/batch\n", + "Epoch 15/20 Iteration 2580/3560 Training loss: 1.3721 0.1269 sec/batch\n", + "Epoch 15/20 Iteration 2581/3560 Training loss: 1.3716 0.1239 sec/batch\n", + "Epoch 15/20 Iteration 2582/3560 Training loss: 1.3719 0.1242 sec/batch\n", + "Epoch 15/20 Iteration 2583/3560 Training loss: 1.3717 0.1252 sec/batch\n", + "Epoch 15/20 Iteration 2584/3560 Training loss: 1.3715 0.1257 sec/batch\n", + "Epoch 15/20 Iteration 2585/3560 Training loss: 1.3711 0.1252 sec/batch\n", + "Epoch 15/20 Iteration 2586/3560 Training loss: 1.3708 0.1268 sec/batch\n", + "Epoch 15/20 Iteration 2587/3560 Training loss: 1.3705 0.1246 sec/batch\n", + "Epoch 15/20 Iteration 2588/3560 Training loss: 1.3706 0.1266 sec/batch\n", + "Epoch 15/20 Iteration 2589/3560 Training loss: 1.3705 0.1221 sec/batch\n", + "Epoch 15/20 Iteration 2590/3560 Training loss: 1.3701 0.1216 sec/batch\n", + 
"Epoch 15/20 Iteration 2591/3560 Training loss: 1.3697 0.1246 sec/batch\n", + "Epoch 15/20 Iteration 2592/3560 Training loss: 1.3692 0.1255 sec/batch\n", + "Epoch 15/20 Iteration 2593/3560 Training loss: 1.3692 0.1228 sec/batch\n", + "Epoch 15/20 Iteration 2594/3560 Training loss: 1.3691 0.1271 sec/batch\n", + "Epoch 15/20 Iteration 2595/3560 Training loss: 1.3689 0.1246 sec/batch\n", + "Epoch 15/20 Iteration 2596/3560 Training loss: 1.3687 0.1254 sec/batch\n", + "Epoch 15/20 Iteration 2597/3560 Training loss: 1.3685 0.1232 sec/batch\n", + "Epoch 15/20 Iteration 2598/3560 Training loss: 1.3684 0.1257 sec/batch\n", + "Epoch 15/20 Iteration 2599/3560 Training loss: 1.3683 0.1237 sec/batch\n", + "Epoch 15/20 Iteration 2600/3560 Training loss: 1.3683 0.1221 sec/batch\n", + "Epoch 15/20 Iteration 2601/3560 Training loss: 1.3681 0.1246 sec/batch\n", + "Epoch 15/20 Iteration 2602/3560 Training loss: 1.3681 0.1300 sec/batch\n", + "Epoch 15/20 Iteration 2603/3560 Training loss: 1.3678 0.1250 sec/batch\n", + "Epoch 15/20 Iteration 2604/3560 Training loss: 1.3678 0.1212 sec/batch\n", + "Epoch 15/20 Iteration 2605/3560 Training loss: 1.3676 0.1241 sec/batch\n", + "Epoch 15/20 Iteration 2606/3560 Training loss: 1.3674 0.1252 sec/batch\n", + "Epoch 15/20 Iteration 2607/3560 Training loss: 1.3671 0.1234 sec/batch\n", + "Epoch 15/20 Iteration 2608/3560 Training loss: 1.3668 0.1251 sec/batch\n", + "Epoch 15/20 Iteration 2609/3560 Training loss: 1.3666 0.1229 sec/batch\n", + "Epoch 15/20 Iteration 2610/3560 Training loss: 1.3665 0.1260 sec/batch\n", + "Epoch 15/20 Iteration 2611/3560 Training loss: 1.3665 0.1247 sec/batch\n", + "Epoch 15/20 Iteration 2612/3560 Training loss: 1.3663 0.1309 sec/batch\n", + "Epoch 15/20 Iteration 2613/3560 Training loss: 1.3661 0.1242 sec/batch\n", + "Epoch 15/20 Iteration 2614/3560 Training loss: 1.3657 0.1231 sec/batch\n", + "Epoch 15/20 Iteration 2615/3560 Training loss: 1.3653 0.1288 sec/batch\n", + "Epoch 15/20 Iteration 2616/3560 Training loss: 
1.3654 0.1303 sec/batch\n", + "Epoch 15/20 Iteration 2617/3560 Training loss: 1.3653 0.1249 sec/batch\n", + "Epoch 15/20 Iteration 2618/3560 Training loss: 1.3650 0.1215 sec/batch\n", + "Epoch 15/20 Iteration 2619/3560 Training loss: 1.3650 0.1237 sec/batch\n", + "Epoch 15/20 Iteration 2620/3560 Training loss: 1.3650 0.1242 sec/batch\n", + "Epoch 15/20 Iteration 2621/3560 Training loss: 1.3648 0.1234 sec/batch\n", + "Epoch 15/20 Iteration 2622/3560 Training loss: 1.3645 0.1226 sec/batch\n", + "Epoch 15/20 Iteration 2623/3560 Training loss: 1.3641 0.1242 sec/batch\n", + "Epoch 15/20 Iteration 2624/3560 Training loss: 1.3639 0.1215 sec/batch\n", + "Epoch 15/20 Iteration 2625/3560 Training loss: 1.3640 0.1238 sec/batch\n", + "Epoch 15/20 Iteration 2626/3560 Training loss: 1.3640 0.1263 sec/batch\n", + "Epoch 15/20 Iteration 2627/3560 Training loss: 1.3639 0.1230 sec/batch\n", + "Epoch 15/20 Iteration 2628/3560 Training loss: 1.3639 0.1271 sec/batch\n", + "Epoch 15/20 Iteration 2629/3560 Training loss: 1.3641 0.1226 sec/batch\n", + "Epoch 15/20 Iteration 2630/3560 Training loss: 1.3641 0.1237 sec/batch\n", + "Epoch 15/20 Iteration 2631/3560 Training loss: 1.3641 0.1224 sec/batch\n", + "Epoch 15/20 Iteration 2632/3560 Training loss: 1.3640 0.1311 sec/batch\n", + "Epoch 15/20 Iteration 2633/3560 Training loss: 1.3643 0.1231 sec/batch\n", + "Epoch 15/20 Iteration 2634/3560 Training loss: 1.3643 0.1249 sec/batch\n", + "Epoch 15/20 Iteration 2635/3560 Training loss: 1.3643 0.1221 sec/batch\n", + "Epoch 15/20 Iteration 2636/3560 Training loss: 1.3645 0.1259 sec/batch\n", + "Epoch 15/20 Iteration 2637/3560 Training loss: 1.3643 0.1220 sec/batch\n", + "Epoch 15/20 Iteration 2638/3560 Training loss: 1.3645 0.1222 sec/batch\n", + "Epoch 15/20 Iteration 2639/3560 Training loss: 1.3645 0.1323 sec/batch\n", + "Epoch 15/20 Iteration 2640/3560 Training loss: 1.3647 0.1237 sec/batch\n", + "Epoch 15/20 Iteration 2641/3560 Training loss: 1.3648 0.1250 sec/batch\n", + "Epoch 15/20 
Iteration 2642/3560 Training loss: 1.3646 0.1237 sec/batch\n", + "Epoch 15/20 Iteration 2643/3560 Training loss: 1.3643 0.1288 sec/batch\n", + "Epoch 15/20 Iteration 2644/3560 Training loss: 1.3642 0.1244 sec/batch\n", + "Epoch 15/20 Iteration 2645/3560 Training loss: 1.3642 0.1272 sec/batch\n", + "Epoch 15/20 Iteration 2646/3560 Training loss: 1.3641 0.1250 sec/batch\n", + "Epoch 15/20 Iteration 2647/3560 Training loss: 1.3642 0.1256 sec/batch\n", + "Epoch 15/20 Iteration 2648/3560 Training loss: 1.3641 0.1247 sec/batch\n", + "Epoch 15/20 Iteration 2649/3560 Training loss: 1.3641 0.1306 sec/batch\n", + "Epoch 15/20 Iteration 2650/3560 Training loss: 1.3641 0.1227 sec/batch\n", + "Epoch 15/20 Iteration 2651/3560 Training loss: 1.3638 0.1232 sec/batch\n", + "Epoch 15/20 Iteration 2652/3560 Training loss: 1.3639 0.1295 sec/batch\n", + "Epoch 15/20 Iteration 2653/3560 Training loss: 1.3641 0.1276 sec/batch\n", + "Epoch 15/20 Iteration 2654/3560 Training loss: 1.3640 0.1271 sec/batch\n", + "Epoch 15/20 Iteration 2655/3560 Training loss: 1.3640 0.1259 sec/batch\n", + "Epoch 15/20 Iteration 2656/3560 Training loss: 1.3640 0.1297 sec/batch\n", + "Epoch 15/20 Iteration 2657/3560 Training loss: 1.3639 0.1219 sec/batch\n", + "Epoch 15/20 Iteration 2658/3560 Training loss: 1.3638 0.1266 sec/batch\n", + "Epoch 15/20 Iteration 2659/3560 Training loss: 1.3639 0.1223 sec/batch\n", + "Epoch 15/20 Iteration 2660/3560 Training loss: 1.3643 0.1242 sec/batch\n", + "Epoch 15/20 Iteration 2661/3560 Training loss: 1.3643 0.1244 sec/batch\n", + "Epoch 15/20 Iteration 2662/3560 Training loss: 1.3643 0.1222 sec/batch\n", + "Epoch 15/20 Iteration 2663/3560 Training loss: 1.3642 0.1261 sec/batch\n", + "Epoch 15/20 Iteration 2664/3560 Training loss: 1.3640 0.1286 sec/batch\n", + "Epoch 15/20 Iteration 2665/3560 Training loss: 1.3642 0.1262 sec/batch\n", + "Epoch 15/20 Iteration 2666/3560 Training loss: 1.3643 0.1272 sec/batch\n", + "Epoch 15/20 Iteration 2667/3560 Training loss: 1.3643 0.1266 
sec/batch\n", + "Epoch 15/20 Iteration 2668/3560 Training loss: 1.3642 0.1232 sec/batch\n", + "Epoch 15/20 Iteration 2669/3560 Training loss: 1.3640 0.1207 sec/batch\n", + "Epoch 15/20 Iteration 2670/3560 Training loss: 1.3642 0.1235 sec/batch\n", + "Epoch 16/20 Iteration 2671/3560 Training loss: 1.4929 0.1305 sec/batch\n", + "Epoch 16/20 Iteration 2672/3560 Training loss: 1.4419 0.1232 sec/batch\n", + "Epoch 16/20 Iteration 2673/3560 Training loss: 1.4175 0.1247 sec/batch\n", + "Epoch 16/20 Iteration 2674/3560 Training loss: 1.4075 0.1214 sec/batch\n", + "Epoch 16/20 Iteration 2675/3560 Training loss: 1.3972 0.1216 sec/batch\n", + "Epoch 16/20 Iteration 2676/3560 Training loss: 1.3857 0.1270 sec/batch\n", + "Epoch 16/20 Iteration 2677/3560 Training loss: 1.3836 0.1295 sec/batch\n", + "Epoch 16/20 Iteration 2678/3560 Training loss: 1.3793 0.1220 sec/batch\n", + "Epoch 16/20 Iteration 2679/3560 Training loss: 1.3779 0.1215 sec/batch\n", + "Epoch 16/20 Iteration 2680/3560 Training loss: 1.3762 0.1250 sec/batch\n", + "Epoch 16/20 Iteration 2681/3560 Training loss: 1.3719 0.1340 sec/batch\n", + "Epoch 16/20 Iteration 2682/3560 Training loss: 1.3700 0.1253 sec/batch\n", + "Epoch 16/20 Iteration 2683/3560 Training loss: 1.3696 0.1247 sec/batch\n", + "Epoch 16/20 Iteration 2684/3560 Training loss: 1.3704 0.1245 sec/batch\n", + "Epoch 16/20 Iteration 2685/3560 Training loss: 1.3687 0.1242 sec/batch\n", + "Epoch 16/20 Iteration 2686/3560 Training loss: 1.3661 0.1277 sec/batch\n", + "Epoch 16/20 Iteration 2687/3560 Training loss: 1.3657 0.1222 sec/batch\n", + "Epoch 16/20 Iteration 2688/3560 Training loss: 1.3664 0.1247 sec/batch\n", + "Epoch 16/20 Iteration 2689/3560 Training loss: 1.3659 0.1241 sec/batch\n", + "Epoch 16/20 Iteration 2690/3560 Training loss: 1.3663 0.1211 sec/batch\n", + "Epoch 16/20 Iteration 2691/3560 Training loss: 1.3652 0.1244 sec/batch\n", + "Epoch 16/20 Iteration 2692/3560 Training loss: 1.3648 0.1216 sec/batch\n", + "Epoch 16/20 Iteration 2693/3560 
Training loss: 1.3639 0.1226 sec/batch\n", + "Epoch 16/20 Iteration 2694/3560 Training loss: 1.3637 0.1235 sec/batch\n", + "Epoch 16/20 Iteration 2695/3560 Training loss: 1.3634 0.1252 sec/batch\n", + "Epoch 16/20 Iteration 2696/3560 Training loss: 1.3614 0.1221 sec/batch\n", + "Epoch 16/20 Iteration 2697/3560 Training loss: 1.3598 0.1223 sec/batch\n", + "Epoch 16/20 Iteration 2698/3560 Training loss: 1.3604 0.1288 sec/batch\n", + "Epoch 16/20 Iteration 2699/3560 Training loss: 1.3608 0.1233 sec/batch\n", + "Epoch 16/20 Iteration 2700/3560 Training loss: 1.3610 0.1239 sec/batch\n", + "Epoch 16/20 Iteration 2701/3560 Training loss: 1.3603 0.1241 sec/batch\n", + "Epoch 16/20 Iteration 2702/3560 Training loss: 1.3593 0.1237 sec/batch\n", + "Epoch 16/20 Iteration 2703/3560 Training loss: 1.3597 0.1231 sec/batch\n", + "Epoch 16/20 Iteration 2704/3560 Training loss: 1.3599 0.1280 sec/batch\n", + "Epoch 16/20 Iteration 2705/3560 Training loss: 1.3594 0.1252 sec/batch\n", + "Epoch 16/20 Iteration 2706/3560 Training loss: 1.3592 0.1226 sec/batch\n", + "Epoch 16/20 Iteration 2707/3560 Training loss: 1.3584 0.1234 sec/batch\n", + "Epoch 16/20 Iteration 2708/3560 Training loss: 1.3572 0.1233 sec/batch\n", + "Epoch 16/20 Iteration 2709/3560 Training loss: 1.3558 0.1240 sec/batch\n", + "Epoch 16/20 Iteration 2710/3560 Training loss: 1.3551 0.1261 sec/batch\n", + "Epoch 16/20 Iteration 2711/3560 Training loss: 1.3546 0.1227 sec/batch\n", + "Epoch 16/20 Iteration 2712/3560 Training loss: 1.3553 0.1226 sec/batch\n", + "Epoch 16/20 Iteration 2713/3560 Training loss: 1.3547 0.1284 sec/batch\n", + "Epoch 16/20 Iteration 2714/3560 Training loss: 1.3540 0.1222 sec/batch\n", + "Epoch 16/20 Iteration 2715/3560 Training loss: 1.3539 0.1294 sec/batch\n", + "Epoch 16/20 Iteration 2716/3560 Training loss: 1.3532 0.1210 sec/batch\n", + "Epoch 16/20 Iteration 2717/3560 Training loss: 1.3525 0.1254 sec/batch\n", + "Epoch 16/20 Iteration 2718/3560 Training loss: 1.3521 0.1240 sec/batch\n", + 
"Epoch 16/20 Iteration 2719/3560 Training loss: 1.3519 0.1223 sec/batch\n", + "Epoch 16/20 Iteration 2720/3560 Training loss: 1.3522 0.1226 sec/batch\n", + "Epoch 16/20 Iteration 2721/3560 Training loss: 1.3516 0.1249 sec/batch\n", + "Epoch 16/20 Iteration 2722/3560 Training loss: 1.3522 0.1277 sec/batch\n", + "Epoch 16/20 Iteration 2723/3560 Training loss: 1.3520 0.1225 sec/batch\n", + "Epoch 16/20 Iteration 2724/3560 Training loss: 1.3521 0.1244 sec/batch\n", + "Epoch 16/20 Iteration 2725/3560 Training loss: 1.3519 0.1212 sec/batch\n", + "Epoch 16/20 Iteration 2726/3560 Training loss: 1.3520 0.1245 sec/batch\n", + "Epoch 16/20 Iteration 2727/3560 Training loss: 1.3523 0.1221 sec/batch\n", + "Epoch 16/20 Iteration 2728/3560 Training loss: 1.3520 0.1249 sec/batch\n", + "Epoch 16/20 Iteration 2729/3560 Training loss: 1.3514 0.1251 sec/batch\n", + "Epoch 16/20 Iteration 2730/3560 Training loss: 1.3520 0.1222 sec/batch\n", + "Epoch 16/20 Iteration 2731/3560 Training loss: 1.3520 0.1226 sec/batch\n", + "Epoch 16/20 Iteration 2732/3560 Training loss: 1.3531 0.1226 sec/batch\n", + "Epoch 16/20 Iteration 2733/3560 Training loss: 1.3534 0.1245 sec/batch\n", + "Epoch 16/20 Iteration 2734/3560 Training loss: 1.3535 0.1252 sec/batch\n", + "Epoch 16/20 Iteration 2735/3560 Training loss: 1.3534 0.1223 sec/batch\n", + "Epoch 16/20 Iteration 2736/3560 Training loss: 1.3537 0.1224 sec/batch\n", + "Epoch 16/20 Iteration 2737/3560 Training loss: 1.3538 0.1228 sec/batch\n", + "Epoch 16/20 Iteration 2738/3560 Training loss: 1.3535 0.1221 sec/batch\n", + "Epoch 16/20 Iteration 2739/3560 Training loss: 1.3535 0.1219 sec/batch\n", + "Epoch 16/20 Iteration 2740/3560 Training loss: 1.3533 0.1241 sec/batch\n", + "Epoch 16/20 Iteration 2741/3560 Training loss: 1.3539 0.1225 sec/batch\n", + "Epoch 16/20 Iteration 2742/3560 Training loss: 1.3543 0.1212 sec/batch\n", + "Epoch 16/20 Iteration 2743/3560 Training loss: 1.3548 0.1218 sec/batch\n", + "Epoch 16/20 Iteration 2744/3560 Training loss: 
1.3544 0.1329 sec/batch\n", + "Epoch 16/20 Iteration 2745/3560 Training loss: 1.3543 0.1213 sec/batch\n", + "Epoch 16/20 Iteration 2746/3560 Training loss: 1.3545 0.1223 sec/batch\n", + "Epoch 16/20 Iteration 2747/3560 Training loss: 1.3542 0.1377 sec/batch\n", + "Epoch 16/20 Iteration 2748/3560 Training loss: 1.3541 0.1270 sec/batch\n", + "Epoch 16/20 Iteration 2749/3560 Training loss: 1.3535 0.1236 sec/batch\n", + "Epoch 16/20 Iteration 2750/3560 Training loss: 1.3534 0.1232 sec/batch\n", + "Epoch 16/20 Iteration 2751/3560 Training loss: 1.3528 0.1276 sec/batch\n", + "Epoch 16/20 Iteration 2752/3560 Training loss: 1.3528 0.1230 sec/batch\n", + "Epoch 16/20 Iteration 2753/3560 Training loss: 1.3524 0.1245 sec/batch\n", + "Epoch 16/20 Iteration 2754/3560 Training loss: 1.3524 0.1219 sec/batch\n", + "Epoch 16/20 Iteration 2755/3560 Training loss: 1.3520 0.1257 sec/batch\n", + "Epoch 16/20 Iteration 2756/3560 Training loss: 1.3519 0.1244 sec/batch\n", + "Epoch 16/20 Iteration 2757/3560 Training loss: 1.3517 0.1222 sec/batch\n", + "Epoch 16/20 Iteration 2758/3560 Training loss: 1.3514 0.1211 sec/batch\n", + "Epoch 16/20 Iteration 2759/3560 Training loss: 1.3509 0.1243 sec/batch\n", + "Epoch 16/20 Iteration 2760/3560 Training loss: 1.3509 0.1264 sec/batch\n", + "Epoch 16/20 Iteration 2761/3560 Training loss: 1.3507 0.1241 sec/batch\n", + "Epoch 16/20 Iteration 2762/3560 Training loss: 1.3505 0.1220 sec/batch\n", + "Epoch 16/20 Iteration 2763/3560 Training loss: 1.3501 0.1217 sec/batch\n", + "Epoch 16/20 Iteration 2764/3560 Training loss: 1.3499 0.1210 sec/batch\n", + "Epoch 16/20 Iteration 2765/3560 Training loss: 1.3496 0.1245 sec/batch\n", + "Epoch 16/20 Iteration 2766/3560 Training loss: 1.3495 0.1325 sec/batch\n", + "Epoch 16/20 Iteration 2767/3560 Training loss: 1.3495 0.1215 sec/batch\n", + "Epoch 16/20 Iteration 2768/3560 Training loss: 1.3491 0.1229 sec/batch\n", + "Epoch 16/20 Iteration 2769/3560 Training loss: 1.3486 0.1244 sec/batch\n", + "Epoch 16/20 
Iteration 2770/3560 Training loss: 1.3482 0.1261 sec/batch\n", + "Epoch 16/20 Iteration 2771/3560 Training loss: 1.3480 0.1221 sec/batch\n", + "Epoch 16/20 Iteration 2772/3560 Training loss: 1.3479 0.1224 sec/batch\n", + "Epoch 16/20 Iteration 2773/3560 Training loss: 1.3477 0.1259 sec/batch\n", + "Epoch 16/20 Iteration 2774/3560 Training loss: 1.3476 0.1280 sec/batch\n", + "Epoch 16/20 Iteration 2775/3560 Training loss: 1.3475 0.1229 sec/batch\n", + "Epoch 16/20 Iteration 2776/3560 Training loss: 1.3473 0.1255 sec/batch\n", + "Epoch 16/20 Iteration 2777/3560 Training loss: 1.3472 0.1215 sec/batch\n", + "Epoch 16/20 Iteration 2778/3560 Training loss: 1.3471 0.1219 sec/batch\n", + "Epoch 16/20 Iteration 2779/3560 Training loss: 1.3470 0.1216 sec/batch\n", + "Epoch 16/20 Iteration 2780/3560 Training loss: 1.3470 0.1248 sec/batch\n", + "Epoch 16/20 Iteration 2781/3560 Training loss: 1.3468 0.1229 sec/batch\n", + "Epoch 16/20 Iteration 2782/3560 Training loss: 1.3468 0.1237 sec/batch\n", + "Epoch 16/20 Iteration 2783/3560 Training loss: 1.3467 0.1241 sec/batch\n", + "Epoch 16/20 Iteration 2784/3560 Training loss: 1.3466 0.1215 sec/batch\n", + "Epoch 16/20 Iteration 2785/3560 Training loss: 1.3464 0.1216 sec/batch\n", + "Epoch 16/20 Iteration 2786/3560 Training loss: 1.3460 0.1243 sec/batch\n", + "Epoch 16/20 Iteration 2787/3560 Training loss: 1.3460 0.1247 sec/batch\n", + "Epoch 16/20 Iteration 2788/3560 Training loss: 1.3459 0.1253 sec/batch\n", + "Epoch 16/20 Iteration 2789/3560 Training loss: 1.3458 0.1316 sec/batch\n", + "Epoch 16/20 Iteration 2790/3560 Training loss: 1.3456 0.1241 sec/batch\n", + "Epoch 16/20 Iteration 2791/3560 Training loss: 1.3455 0.1238 sec/batch\n", + "Epoch 16/20 Iteration 2792/3560 Training loss: 1.3452 0.1271 sec/batch\n", + "Epoch 16/20 Iteration 2793/3560 Training loss: 1.3448 0.1247 sec/batch\n", + "Epoch 16/20 Iteration 2794/3560 Training loss: 1.3448 0.1304 sec/batch\n", + "Epoch 16/20 Iteration 2795/3560 Training loss: 1.3448 0.1212 
sec/batch\n", + "Epoch 16/20 Iteration 2796/3560 Training loss: 1.3444 0.1225 sec/batch\n", + "Epoch 16/20 Iteration 2797/3560 Training loss: 1.3444 0.1243 sec/batch\n", + "Epoch 16/20 Iteration 2798/3560 Training loss: 1.3445 0.1226 sec/batch\n", + "Epoch 16/20 Iteration 2799/3560 Training loss: 1.3443 0.1252 sec/batch\n", + "Epoch 16/20 Iteration 2800/3560 Training loss: 1.3441 0.1221 sec/batch\n", + "Epoch 16/20 Iteration 2801/3560 Training loss: 1.3436 0.1240 sec/batch\n", + "Epoch 16/20 Iteration 2802/3560 Training loss: 1.3434 0.1207 sec/batch\n", + "Epoch 16/20 Iteration 2803/3560 Training loss: 1.3436 0.1246 sec/batch\n", + "Epoch 16/20 Iteration 2804/3560 Training loss: 1.3436 0.1217 sec/batch\n", + "Epoch 16/20 Iteration 2805/3560 Training loss: 1.3435 0.1232 sec/batch\n", + "Epoch 16/20 Iteration 2806/3560 Training loss: 1.3436 0.1281 sec/batch\n", + "Epoch 16/20 Iteration 2807/3560 Training loss: 1.3438 0.1225 sec/batch\n", + "Epoch 16/20 Iteration 2808/3560 Training loss: 1.3439 0.1333 sec/batch\n", + "Epoch 16/20 Iteration 2809/3560 Training loss: 1.3439 0.1267 sec/batch\n", + "Epoch 16/20 Iteration 2810/3560 Training loss: 1.3439 0.1216 sec/batch\n", + "Epoch 16/20 Iteration 2811/3560 Training loss: 1.3442 0.1251 sec/batch\n", + "Epoch 16/20 Iteration 2812/3560 Training loss: 1.3442 0.1225 sec/batch\n", + "Epoch 16/20 Iteration 2813/3560 Training loss: 1.3442 0.1211 sec/batch\n", + "Epoch 16/20 Iteration 2814/3560 Training loss: 1.3444 0.1236 sec/batch\n", + "Epoch 16/20 Iteration 2815/3560 Training loss: 1.3443 0.1224 sec/batch\n", + "Epoch 16/20 Iteration 2816/3560 Training loss: 1.3445 0.1237 sec/batch\n", + "Epoch 16/20 Iteration 2817/3560 Training loss: 1.3445 0.1235 sec/batch\n", + "Epoch 16/20 Iteration 2818/3560 Training loss: 1.3447 0.1249 sec/batch\n", + "Epoch 16/20 Iteration 2819/3560 Training loss: 1.3448 0.1227 sec/batch\n", + "Epoch 16/20 Iteration 2820/3560 Training loss: 1.3447 0.1245 sec/batch\n", + "Epoch 16/20 Iteration 2821/3560 
Training loss: 1.3444 0.1246 sec/batch\n", + "Epoch 16/20 Iteration 2822/3560 Training loss: 1.3443 0.1226 sec/batch\n", + "Epoch 16/20 Iteration 2823/3560 Training loss: 1.3443 0.1228 sec/batch\n", + "Epoch 16/20 Iteration 2824/3560 Training loss: 1.3443 0.1238 sec/batch\n", + "Epoch 16/20 Iteration 2825/3560 Training loss: 1.3443 0.1273 sec/batch\n", + "Epoch 16/20 Iteration 2826/3560 Training loss: 1.3443 0.1236 sec/batch\n", + "Epoch 16/20 Iteration 2827/3560 Training loss: 1.3443 0.1280 sec/batch\n", + "Epoch 16/20 Iteration 2828/3560 Training loss: 1.3443 0.1244 sec/batch\n", + "Epoch 16/20 Iteration 2829/3560 Training loss: 1.3441 0.1222 sec/batch\n", + "Epoch 16/20 Iteration 2830/3560 Training loss: 1.3441 0.1244 sec/batch\n", + "Epoch 16/20 Iteration 2831/3560 Training loss: 1.3443 0.1220 sec/batch\n", + "Epoch 16/20 Iteration 2832/3560 Training loss: 1.3443 0.1248 sec/batch\n", + "Epoch 16/20 Iteration 2833/3560 Training loss: 1.3443 0.1229 sec/batch\n", + "Epoch 16/20 Iteration 2834/3560 Training loss: 1.3442 0.1272 sec/batch\n", + "Epoch 16/20 Iteration 2835/3560 Training loss: 1.3442 0.1235 sec/batch\n", + "Epoch 16/20 Iteration 2836/3560 Training loss: 1.3441 0.1218 sec/batch\n", + "Epoch 16/20 Iteration 2837/3560 Training loss: 1.3442 0.1233 sec/batch\n", + "Epoch 16/20 Iteration 2838/3560 Training loss: 1.3446 0.1242 sec/batch\n", + "Epoch 16/20 Iteration 2839/3560 Training loss: 1.3446 0.1259 sec/batch\n", + "Epoch 16/20 Iteration 2840/3560 Training loss: 1.3447 0.1248 sec/batch\n", + "Epoch 16/20 Iteration 2841/3560 Training loss: 1.3446 0.1251 sec/batch\n", + "Epoch 16/20 Iteration 2842/3560 Training loss: 1.3444 0.1230 sec/batch\n", + "Epoch 16/20 Iteration 2843/3560 Training loss: 1.3446 0.1251 sec/batch\n", + "Epoch 16/20 Iteration 2844/3560 Training loss: 1.3446 0.1245 sec/batch\n", + "Epoch 16/20 Iteration 2845/3560 Training loss: 1.3447 0.1226 sec/batch\n", + "Epoch 16/20 Iteration 2846/3560 Training loss: 1.3445 0.1228 sec/batch\n", + 
"Epoch 16/20 Iteration 2847/3560 Training loss: 1.3443 0.1271 sec/batch\n", + "Epoch 16/20 Iteration 2848/3560 Training loss: 1.3445 0.1225 sec/batch\n", + "Epoch 17/20 Iteration 2849/3560 Training loss: 1.4724 0.1252 sec/batch\n", + "Epoch 17/20 Iteration 2850/3560 Training loss: 1.4243 0.1251 sec/batch\n", + "Epoch 17/20 Iteration 2851/3560 Training loss: 1.3977 0.1246 sec/batch\n", + "Epoch 17/20 Iteration 2852/3560 Training loss: 1.3879 0.1223 sec/batch\n", + "Epoch 17/20 Iteration 2853/3560 Training loss: 1.3748 0.1231 sec/batch\n", + "Epoch 17/20 Iteration 2854/3560 Training loss: 1.3616 0.1223 sec/batch\n", + "Epoch 17/20 Iteration 2855/3560 Training loss: 1.3603 0.1255 sec/batch\n", + "Epoch 17/20 Iteration 2856/3560 Training loss: 1.3575 0.1240 sec/batch\n", + "Epoch 17/20 Iteration 2857/3560 Training loss: 1.3554 0.1216 sec/batch\n", + "Epoch 17/20 Iteration 2858/3560 Training loss: 1.3541 0.1238 sec/batch\n", + "Epoch 17/20 Iteration 2859/3560 Training loss: 1.3500 0.1243 sec/batch\n", + "Epoch 17/20 Iteration 2860/3560 Training loss: 1.3485 0.1285 sec/batch\n", + "Epoch 17/20 Iteration 2861/3560 Training loss: 1.3484 0.1233 sec/batch\n", + "Epoch 17/20 Iteration 2862/3560 Training loss: 1.3496 0.1259 sec/batch\n", + "Epoch 17/20 Iteration 2863/3560 Training loss: 1.3476 0.1220 sec/batch\n", + "Epoch 17/20 Iteration 2864/3560 Training loss: 1.3456 0.1243 sec/batch\n", + "Epoch 17/20 Iteration 2865/3560 Training loss: 1.3454 0.1245 sec/batch\n", + "Epoch 17/20 Iteration 2866/3560 Training loss: 1.3457 0.1224 sec/batch\n", + "Epoch 17/20 Iteration 2867/3560 Training loss: 1.3453 0.1222 sec/batch\n", + "Epoch 17/20 Iteration 2868/3560 Training loss: 1.3460 0.1238 sec/batch\n", + "Epoch 17/20 Iteration 2869/3560 Training loss: 1.3457 0.1253 sec/batch\n", + "Epoch 17/20 Iteration 2870/3560 Training loss: 1.3456 0.1217 sec/batch\n", + "Epoch 17/20 Iteration 2871/3560 Training loss: 1.3440 0.1216 sec/batch\n", + "Epoch 17/20 Iteration 2872/3560 Training loss: 
1.3437 0.1241 sec/batch\n", + "Epoch 17/20 Iteration 2873/3560 Training loss: 1.3434 0.1218 sec/batch\n", + "Epoch 17/20 Iteration 2874/3560 Training loss: 1.3418 0.1234 sec/batch\n", + "Epoch 17/20 Iteration 2875/3560 Training loss: 1.3404 0.1278 sec/batch\n", + "Epoch 17/20 Iteration 2876/3560 Training loss: 1.3407 0.1234 sec/batch\n", + "Epoch 17/20 Iteration 2877/3560 Training loss: 1.3411 0.1264 sec/batch\n", + "Epoch 17/20 Iteration 2878/3560 Training loss: 1.3412 0.1277 sec/batch\n", + "Epoch 17/20 Iteration 2879/3560 Training loss: 1.3404 0.1245 sec/batch\n", + "Epoch 17/20 Iteration 2880/3560 Training loss: 1.3392 0.1242 sec/batch\n", + "Epoch 17/20 Iteration 2881/3560 Training loss: 1.3395 0.1235 sec/batch\n", + "Epoch 17/20 Iteration 2882/3560 Training loss: 1.3397 0.1257 sec/batch\n", + "Epoch 17/20 Iteration 2883/3560 Training loss: 1.3391 0.1388 sec/batch\n", + "Epoch 17/20 Iteration 2884/3560 Training loss: 1.3388 0.1292 sec/batch\n", + "Epoch 17/20 Iteration 2885/3560 Training loss: 1.3379 0.1241 sec/batch\n", + "Epoch 17/20 Iteration 2886/3560 Training loss: 1.3366 0.1269 sec/batch\n", + "Epoch 17/20 Iteration 2887/3560 Training loss: 1.3354 0.1234 sec/batch\n", + "Epoch 17/20 Iteration 2888/3560 Training loss: 1.3351 0.1230 sec/batch\n", + "Epoch 17/20 Iteration 2889/3560 Training loss: 1.3345 0.1272 sec/batch\n", + "Epoch 17/20 Iteration 2890/3560 Training loss: 1.3351 0.1301 sec/batch\n", + "Epoch 17/20 Iteration 2891/3560 Training loss: 1.3346 0.1248 sec/batch\n", + "Epoch 17/20 Iteration 2892/3560 Training loss: 1.3341 0.1234 sec/batch\n", + "Epoch 17/20 Iteration 2893/3560 Training loss: 1.3343 0.1250 sec/batch\n", + "Epoch 17/20 Iteration 2894/3560 Training loss: 1.3335 0.1273 sec/batch\n", + "Epoch 17/20 Iteration 2895/3560 Training loss: 1.3331 0.1312 sec/batch\n", + "Epoch 17/20 Iteration 2896/3560 Training loss: 1.3328 0.1264 sec/batch\n", + "Epoch 17/20 Iteration 2897/3560 Training loss: 1.3326 0.1257 sec/batch\n", + "Epoch 17/20 
Iteration 2898/3560 Training loss: 1.3329 0.1236 sec/batch\n", + "[training log truncated: Epoch 17/20 through Epoch 20/20, Iterations 2898-3409 of 3560; training loss decreases steadily from about 1.333 to about 1.296 at roughly 0.12-0.13 sec/batch]\n", + "Epoch 20/20 
Iteration 3410/3560 Training loss: 1.2963 0.1222 sec/batch\n", + "Epoch 20/20 Iteration 3411/3560 Training loss: 1.2960 0.1254 sec/batch\n", + "Epoch 20/20 Iteration 3412/3560 Training loss: 1.2962 0.1226 sec/batch\n", + "Epoch 20/20 Iteration 3413/3560 Training loss: 1.2956 0.1245 sec/batch\n", + "Epoch 20/20 Iteration 3414/3560 Training loss: 1.2946 0.1226 sec/batch\n", + "Epoch 20/20 Iteration 3415/3560 Training loss: 1.2946 0.1254 sec/batch\n", + "Epoch 20/20 Iteration 3416/3560 Training loss: 1.2947 0.1262 sec/batch\n", + "Epoch 20/20 Iteration 3417/3560 Training loss: 1.2942 0.1259 sec/batch\n", + "Epoch 20/20 Iteration 3418/3560 Training loss: 1.2940 0.1271 sec/batch\n", + "Epoch 20/20 Iteration 3419/3560 Training loss: 1.2931 0.1226 sec/batch\n", + "Epoch 20/20 Iteration 3420/3560 Training loss: 1.2917 0.1248 sec/batch\n", + "Epoch 20/20 Iteration 3421/3560 Training loss: 1.2907 0.1219 sec/batch\n", + "Epoch 20/20 Iteration 3422/3560 Training loss: 1.2902 0.1236 sec/batch\n", + "Epoch 20/20 Iteration 3423/3560 Training loss: 1.2895 0.1264 sec/batch\n", + "Epoch 20/20 Iteration 3424/3560 Training loss: 1.2901 0.1260 sec/batch\n", + "Epoch 20/20 Iteration 3425/3560 Training loss: 1.2896 0.1220 sec/batch\n", + "Epoch 20/20 Iteration 3426/3560 Training loss: 1.2890 0.1250 sec/batch\n", + "Epoch 20/20 Iteration 3427/3560 Training loss: 1.2891 0.1227 sec/batch\n", + "Epoch 20/20 Iteration 3428/3560 Training loss: 1.2883 0.1231 sec/batch\n", + "Epoch 20/20 Iteration 3429/3560 Training loss: 1.2878 0.1239 sec/batch\n", + "Epoch 20/20 Iteration 3430/3560 Training loss: 1.2874 0.1218 sec/batch\n", + "Epoch 20/20 Iteration 3431/3560 Training loss: 1.2873 0.1339 sec/batch\n", + "Epoch 20/20 Iteration 3432/3560 Training loss: 1.2876 0.1205 sec/batch\n", + "Epoch 20/20 Iteration 3433/3560 Training loss: 1.2869 0.1261 sec/batch\n", + "Epoch 20/20 Iteration 3434/3560 Training loss: 1.2875 0.1236 sec/batch\n", + "Epoch 20/20 Iteration 3435/3560 Training loss: 1.2874 0.1253 
sec/batch\n", + "Epoch 20/20 Iteration 3436/3560 Training loss: 1.2877 0.1227 sec/batch\n", + "Epoch 20/20 Iteration 3437/3560 Training loss: 1.2876 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3438/3560 Training loss: 1.2876 0.1250 sec/batch\n", + "Epoch 20/20 Iteration 3439/3560 Training loss: 1.2878 0.1230 sec/batch\n", + "Epoch 20/20 Iteration 3440/3560 Training loss: 1.2875 0.1231 sec/batch\n", + "Epoch 20/20 Iteration 3441/3560 Training loss: 1.2869 0.1230 sec/batch\n", + "Epoch 20/20 Iteration 3442/3560 Training loss: 1.2876 0.1226 sec/batch\n", + "Epoch 20/20 Iteration 3443/3560 Training loss: 1.2876 0.1220 sec/batch\n", + "Epoch 20/20 Iteration 3444/3560 Training loss: 1.2885 0.1271 sec/batch\n", + "Epoch 20/20 Iteration 3445/3560 Training loss: 1.2888 0.1241 sec/batch\n", + "Epoch 20/20 Iteration 3446/3560 Training loss: 1.2889 0.1269 sec/batch\n", + "Epoch 20/20 Iteration 3447/3560 Training loss: 1.2887 0.1226 sec/batch\n", + "Epoch 20/20 Iteration 3448/3560 Training loss: 1.2890 0.1220 sec/batch\n", + "Epoch 20/20 Iteration 3449/3560 Training loss: 1.2891 0.1234 sec/batch\n", + "Epoch 20/20 Iteration 3450/3560 Training loss: 1.2889 0.1240 sec/batch\n", + "Epoch 20/20 Iteration 3451/3560 Training loss: 1.2890 0.1237 sec/batch\n", + "Epoch 20/20 Iteration 3452/3560 Training loss: 1.2890 0.1229 sec/batch\n", + "Epoch 20/20 Iteration 3453/3560 Training loss: 1.2895 0.1225 sec/batch\n", + "Epoch 20/20 Iteration 3454/3560 Training loss: 1.2900 0.1277 sec/batch\n", + "Epoch 20/20 Iteration 3455/3560 Training loss: 1.2905 0.1287 sec/batch\n", + "Epoch 20/20 Iteration 3456/3560 Training loss: 1.2902 0.1216 sec/batch\n", + "Epoch 20/20 Iteration 3457/3560 Training loss: 1.2901 0.1248 sec/batch\n", + "Epoch 20/20 Iteration 3458/3560 Training loss: 1.2903 0.1226 sec/batch\n", + "Epoch 20/20 Iteration 3459/3560 Training loss: 1.2901 0.1222 sec/batch\n", + "Epoch 20/20 Iteration 3460/3560 Training loss: 1.2900 0.1267 sec/batch\n", + "Epoch 20/20 Iteration 3461/3560 
Training loss: 1.2894 0.1220 sec/batch\n", + "Epoch 20/20 Iteration 3462/3560 Training loss: 1.2894 0.1279 sec/batch\n", + "Epoch 20/20 Iteration 3463/3560 Training loss: 1.2889 0.1286 sec/batch\n", + "Epoch 20/20 Iteration 3464/3560 Training loss: 1.2888 0.1222 sec/batch\n", + "Epoch 20/20 Iteration 3465/3560 Training loss: 1.2884 0.1236 sec/batch\n", + "Epoch 20/20 Iteration 3466/3560 Training loss: 1.2884 0.1225 sec/batch\n", + "Epoch 20/20 Iteration 3467/3560 Training loss: 1.2882 0.1266 sec/batch\n", + "Epoch 20/20 Iteration 3468/3560 Training loss: 1.2881 0.1290 sec/batch\n", + "Epoch 20/20 Iteration 3469/3560 Training loss: 1.2878 0.1238 sec/batch\n", + "Epoch 20/20 Iteration 3470/3560 Training loss: 1.2875 0.1220 sec/batch\n", + "Epoch 20/20 Iteration 3471/3560 Training loss: 1.2871 0.1251 sec/batch\n", + "Epoch 20/20 Iteration 3472/3560 Training loss: 1.2871 0.1213 sec/batch\n", + "Epoch 20/20 Iteration 3473/3560 Training loss: 1.2869 0.1262 sec/batch\n", + "Epoch 20/20 Iteration 3474/3560 Training loss: 1.2867 0.1222 sec/batch\n", + "Epoch 20/20 Iteration 3475/3560 Training loss: 1.2863 0.1220 sec/batch\n", + "Epoch 20/20 Iteration 3476/3560 Training loss: 1.2859 0.1256 sec/batch\n", + "Epoch 20/20 Iteration 3477/3560 Training loss: 1.2855 0.1233 sec/batch\n", + "Epoch 20/20 Iteration 3478/3560 Training loss: 1.2856 0.1211 sec/batch\n", + "Epoch 20/20 Iteration 3479/3560 Training loss: 1.2856 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3480/3560 Training loss: 1.2852 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3481/3560 Training loss: 1.2849 0.1242 sec/batch\n", + "Epoch 20/20 Iteration 3482/3560 Training loss: 1.2846 0.1250 sec/batch\n", + "Epoch 20/20 Iteration 3483/3560 Training loss: 1.2845 0.1231 sec/batch\n", + "Epoch 20/20 Iteration 3484/3560 Training loss: 1.2843 0.1215 sec/batch\n", + "Epoch 20/20 Iteration 3485/3560 Training loss: 1.2843 0.1236 sec/batch\n", + "Epoch 20/20 Iteration 3486/3560 Training loss: 1.2841 0.1240 sec/batch\n", + 
"Epoch 20/20 Iteration 3487/3560 Training loss: 1.2840 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3488/3560 Training loss: 1.2839 0.1268 sec/batch\n", + "Epoch 20/20 Iteration 3489/3560 Training loss: 1.2838 0.1218 sec/batch\n", + "Epoch 20/20 Iteration 3490/3560 Training loss: 1.2838 0.1229 sec/batch\n", + "Epoch 20/20 Iteration 3491/3560 Training loss: 1.2836 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3492/3560 Training loss: 1.2837 0.1235 sec/batch\n", + "Epoch 20/20 Iteration 3493/3560 Training loss: 1.2834 0.1249 sec/batch\n", + "Epoch 20/20 Iteration 3494/3560 Training loss: 1.2833 0.1264 sec/batch\n", + "Epoch 20/20 Iteration 3495/3560 Training loss: 1.2833 0.1220 sec/batch\n", + "Epoch 20/20 Iteration 3496/3560 Training loss: 1.2832 0.1223 sec/batch\n", + "Epoch 20/20 Iteration 3497/3560 Training loss: 1.2829 0.1259 sec/batch\n", + "Epoch 20/20 Iteration 3498/3560 Training loss: 1.2827 0.1219 sec/batch\n", + "Epoch 20/20 Iteration 3499/3560 Training loss: 1.2827 0.1225 sec/batch\n", + "Epoch 20/20 Iteration 3500/3560 Training loss: 1.2827 0.1219 sec/batch\n", + "Epoch 20/20 Iteration 3501/3560 Training loss: 1.2826 0.1225 sec/batch\n", + "Epoch 20/20 Iteration 3502/3560 Training loss: 1.2826 0.1226 sec/batch\n", + "Epoch 20/20 Iteration 3503/3560 Training loss: 1.2825 0.1223 sec/batch\n", + "Epoch 20/20 Iteration 3504/3560 Training loss: 1.2823 0.1244 sec/batch\n", + "Epoch 20/20 Iteration 3505/3560 Training loss: 1.2818 0.1218 sec/batch\n", + "Epoch 20/20 Iteration 3506/3560 Training loss: 1.2819 0.1265 sec/batch\n", + "Epoch 20/20 Iteration 3507/3560 Training loss: 1.2818 0.1223 sec/batch\n", + "Epoch 20/20 Iteration 3508/3560 Training loss: 1.2814 0.1219 sec/batch\n", + "Epoch 20/20 Iteration 3509/3560 Training loss: 1.2815 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3510/3560 Training loss: 1.2815 0.1245 sec/batch\n", + "Epoch 20/20 Iteration 3511/3560 Training loss: 1.2813 0.1227 sec/batch\n", + "Epoch 20/20 Iteration 3512/3560 Training loss: 
1.2810 0.1232 sec/batch\n", + "Epoch 20/20 Iteration 3513/3560 Training loss: 1.2806 0.1234 sec/batch\n", + "Epoch 20/20 Iteration 3514/3560 Training loss: 1.2805 0.1255 sec/batch\n", + "Epoch 20/20 Iteration 3515/3560 Training loss: 1.2807 0.1254 sec/batch\n", + "Epoch 20/20 Iteration 3516/3560 Training loss: 1.2807 0.1227 sec/batch\n", + "Epoch 20/20 Iteration 3517/3560 Training loss: 1.2807 0.1270 sec/batch\n", + "Epoch 20/20 Iteration 3518/3560 Training loss: 1.2807 0.1226 sec/batch\n", + "Epoch 20/20 Iteration 3519/3560 Training loss: 1.2809 0.1227 sec/batch\n", + "Epoch 20/20 Iteration 3520/3560 Training loss: 1.2810 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3521/3560 Training loss: 1.2810 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3522/3560 Training loss: 1.2810 0.1218 sec/batch\n", + "Epoch 20/20 Iteration 3523/3560 Training loss: 1.2813 0.1270 sec/batch\n", + "Epoch 20/20 Iteration 3524/3560 Training loss: 1.2815 0.1231 sec/batch\n", + "Epoch 20/20 Iteration 3525/3560 Training loss: 1.2814 0.1247 sec/batch\n", + "Epoch 20/20 Iteration 3526/3560 Training loss: 1.2816 0.1238 sec/batch\n", + "Epoch 20/20 Iteration 3527/3560 Training loss: 1.2815 0.1256 sec/batch\n", + "Epoch 20/20 Iteration 3528/3560 Training loss: 1.2817 0.1252 sec/batch\n", + "Epoch 20/20 Iteration 3529/3560 Training loss: 1.2817 0.1253 sec/batch\n", + "Epoch 20/20 Iteration 3530/3560 Training loss: 1.2819 0.1274 sec/batch\n", + "Epoch 20/20 Iteration 3531/3560 Training loss: 1.2820 0.1232 sec/batch\n", + "Epoch 20/20 Iteration 3532/3560 Training loss: 1.2819 0.1263 sec/batch\n", + "Epoch 20/20 Iteration 3533/3560 Training loss: 1.2816 0.1254 sec/batch\n", + "Epoch 20/20 Iteration 3534/3560 Training loss: 1.2814 0.1251 sec/batch\n", + "Epoch 20/20 Iteration 3535/3560 Training loss: 1.2814 0.1225 sec/batch\n", + "Epoch 20/20 Iteration 3536/3560 Training loss: 1.2814 0.1301 sec/batch\n", + "Epoch 20/20 Iteration 3537/3560 Training loss: 1.2814 0.1246 sec/batch\n", + "Epoch 20/20 
Iteration 3538/3560 Training loss: 1.2814 0.1303 sec/batch\n", + "Epoch 20/20 Iteration 3539/3560 Training loss: 1.2814 0.1219 sec/batch\n", + "Epoch 20/20 Iteration 3540/3560 Training loss: 1.2813 0.1222 sec/batch\n", + "Epoch 20/20 Iteration 3541/3560 Training loss: 1.2811 0.1228 sec/batch\n", + "Epoch 20/20 Iteration 3542/3560 Training loss: 1.2812 0.1248 sec/batch\n", + "Epoch 20/20 Iteration 3543/3560 Training loss: 1.2814 0.1288 sec/batch\n", + "Epoch 20/20 Iteration 3544/3560 Training loss: 1.2813 0.1222 sec/batch\n", + "Epoch 20/20 Iteration 3545/3560 Training loss: 1.2813 0.1224 sec/batch\n", + "Epoch 20/20 Iteration 3546/3560 Training loss: 1.2812 0.1220 sec/batch\n", + "Epoch 20/20 Iteration 3547/3560 Training loss: 1.2812 0.1243 sec/batch\n", + "Epoch 20/20 Iteration 3548/3560 Training loss: 1.2811 0.1211 sec/batch\n", + "Epoch 20/20 Iteration 3549/3560 Training loss: 1.2813 0.1201 sec/batch\n", + "Epoch 20/20 Iteration 3550/3560 Training loss: 1.2817 0.1217 sec/batch\n", + "Epoch 20/20 Iteration 3551/3560 Training loss: 1.2818 0.1221 sec/batch\n", + "Epoch 20/20 Iteration 3552/3560 Training loss: 1.2819 0.1215 sec/batch\n", + "Epoch 20/20 Iteration 3553/3560 Training loss: 1.2818 0.1268 sec/batch\n", + "Epoch 20/20 Iteration 3554/3560 Training loss: 1.2817 0.1227 sec/batch\n", + "Epoch 20/20 Iteration 3555/3560 Training loss: 1.2818 0.1257 sec/batch\n", + "Epoch 20/20 Iteration 3556/3560 Training loss: 1.2818 0.1225 sec/batch\n", + "Epoch 20/20 Iteration 3557/3560 Training loss: 1.2819 0.1353 sec/batch\n", + "Epoch 20/20 Iteration 3558/3560 Training loss: 1.2818 0.1272 sec/batch\n", + "Epoch 20/20 Iteration 3559/3560 Training loss: 1.2817 0.1227 sec/batch\n", + "Epoch 20/20 Iteration 3560/3560 Training loss: 1.2818 0.1244 sec/batch\n" + ] + } + ], + "source": [ + "epochs = 20\n", + "batch_size = 100\n", + "num_steps = 100\n", + "train_x, train_y, val_x, val_y = split_data(chars, batch_size, num_steps)\n", + "\n", + "for lstm_size in [128,256,512]:\n", 
+ " for num_layers in [1, 2]:\n", + " for learning_rate in [0.002, 0.001]:\n", + " log_string = 'logs/4/lr={},rl={},ru={}'.format(learning_rate, num_layers, lstm_size)\n", + " writer = tf.summary.FileWriter(log_string)\n", + " model = build_rnn(len(vocab), \n", + " batch_size=batch_size,\n", + " num_steps=num_steps,\n", + " learning_rate=learning_rate,\n", + " lstm_size=lstm_size,\n", + " num_layers=num_layers)\n", + " \n", + " train(model, epochs, writer)" + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "model_checkpoint_path: \"checkpoints/anna/i3560_l512_1.122.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i200_l512_2.432.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i400_l512_1.980.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i600_l512_1.750.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i800_l512_1.595.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1000_l512_1.484.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1200_l512_1.407.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1400_l512_1.349.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1600_l512_1.292.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1800_l512_1.255.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2000_l512_1.224.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2200_l512_1.204.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2400_l512_1.187.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2600_l512_1.172.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2800_l512_1.160.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3000_l512_1.148.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3200_l512_1.137.ckpt\"\n", + 
"all_model_checkpoint_paths: \"checkpoints/anna/i3400_l512_1.129.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3560_l512_1.122.ckpt\"" + ] + }, + "execution_count": 35, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "tf.train.get_checkpoint_state('checkpoints/anna')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Sampling\n", + "\n", + "Now that the network is trained, we can use it to generate new text. The idea is that we pass in a character, and the network predicts the next character. We can then feed in that new character and predict the one after it, repeating the process to generate as much new text as we like. I also included some functionality to prime the network with some text by passing in a string and building up a state from that.\n", + "\n", + "The network gives us predictions for each character. To reduce noise and make things a little less random, I'm going to only choose a new character from the top N most likely characters.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def pick_top_n(preds, vocab_size, top_n=5):\n", + "    p = np.squeeze(preds)\n", + "    p[np.argsort(p)[:-top_n]] = 0\n", + "    p = p / np.sum(p)\n", + "    c = np.random.choice(vocab_size, 1, p=p)[0]\n", + "    return c" + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def sample(checkpoint, n_samples, lstm_size, vocab_size, prime=\"The \"):\n", + "    samples = [c for c in prime]\n", + "    model = build_rnn(vocab_size, lstm_size=lstm_size, sampling=True)\n", + "    saver = tf.train.Saver()\n", + "    with tf.Session() as sess:\n", + "        saver.restore(sess, checkpoint)\n", + "        new_state = sess.run(model.initial_state)\n", + "        for c in prime:\n", + "            x = np.zeros((1, 1))\n", + "            x[0,0] = vocab_to_int[c]\n", + "            feed = {model.inputs: x,\n", + "                    
model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " preds, new_state = sess.run([model.preds, model.final_state], \n", + " feed_dict=feed)\n", + "\n", + " c = pick_top_n(preds, len(vocab))\n", + " samples.append(int_to_vocab[c])\n", + "\n", + " for i in range(n_samples):\n", + " x[0,0] = c\n", + " feed = {model.inputs: x,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " preds, new_state = sess.run([model.preds, model.final_state], \n", + " feed_dict=feed)\n", + "\n", + " c = pick_top_n(preds, len(vocab))\n", + " samples.append(int_to_vocab[c])\n", + " \n", + " return ''.join(samples)" + ] + }, + { + "cell_type": "code", + "execution_count": 44, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farlathit that if had so\n", + "like it that it were. He could not trouble to his wife, and there was\n", + "anything in them of the side of his weaky in the creature at his forteren\n", + "to him.\n", + "\n", + "\"What is it? I can't bread to those,\" said Stepan Arkadyevitch. \"It's not\n", + "my children, and there is an almost this arm, true it mays already,\n", + "and tell you what I have say to you, and was not looking at the peasant,\n", + "why is, I don't know him out, and she doesn't speak to me immediately, as\n", + "you would say the countess and the more frest an angelembre, and time and\n", + "things's silent, but I was not in my stand that is in my head. But if he\n", + "say, and was so feeling with his soul. A child--in his soul of his\n", + "soul of his soul. He should not see that any of that sense of. Here he\n", + "had not been so composed and to speak for as in a whole picture, but\n", + "all the setting and her excellent and society, who had been delighted\n", + "and see to anywing had been being troed to thousand words on them,\n", + "we liked him.\n", + "\n", + "That set in her money at the table, he came into the party. 
The capable\n", + "of his she could not be as an old composure.\n", + "\n", + "\"That's all something there will be down becime by throe is\n", + "such a silent, as in a countess, I should state it out and divorct.\n", + "The discussion is not for me. I was that something was simply they are\n", + "all three manshess of a sensitions of mind it all.\"\n", + "\n", + "\"No,\" he thought, shouted and lifting his soul. \"While it might see your\n", + "honser and she, I could burst. And I had been a midelity. And I had a\n", + "marnief are through the countess,\" he said, looking at him, a chosing\n", + "which they had been carried out and still solied, and there was a sen that\n", + "was to be completely, and that this matter of all the seconds of it, and\n", + "a concipation were to her husband, who came up and conscaously, that he\n", + "was not the station. All his fourse she was always at the country,,\n", + "to speak oft, and though they were to hear the delightful throom and\n", + "whether they came towards the morning, and his living and a coller and\n", + "hold--the children. 
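To see what the top-N cutoff in `pick_top_n` does in isolation, here is a small self-contained sketch on a toy distribution (the six-entry vocabulary and the probabilities are made up for illustration):

```python
import numpy as np

def pick_top_n(preds, vocab_size, top_n=5):
    # Zero out everything except the top_n most likely entries,
    # renormalize, then sample from what remains.
    p = np.squeeze(preds)
    p[np.argsort(p)[:-top_n]] = 0
    p = p / np.sum(p)
    return np.random.choice(vocab_size, 1, p=p)[0]

# Toy distribution over a 6-character "vocabulary". With top_n=2, only the
# two most likely indices (2 and 3) can ever be drawn; their probabilities
# are renormalized to 4/7 and 3/7.
preds = np.array([[0.05, 0.10, 0.40, 0.30, 0.05, 0.10]])
draws = {pick_top_n(preds, 6, top_n=2) for _ in range(200)}
print(draws)  # a subset of {2, 3}
```

This is why the generated samples read less noisy than raw sampling over the full vocabulary would: the long tail of unlikely characters is cut off before each draw.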
\n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i3560_l512_1.122.ckpt\"\n", + "samp = sample(checkpoint, 2000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farnt him oste wha sorind thans tout thint asd an sesand an hires on thime sind thit aled, ban thand and out hore as the ter hos ton ho te that, was tis tart al the hand sostint him sore an tit an son thes, win he se ther san ther hher tas tarereng,.\n", + "\n", + "Anl at an ades in ond hesiln, ad hhe torers teans, wast tar arering tho this sos alten sorer has hhas an siton ther him he had sin he ard ate te anling the sosin her ans and\n", + "arins asd and ther ale te tot an tand tanginge wath and ho ald, so sot th asend sat hare sother horesinnd, he hesense wing ante her so tith tir sherinn, anded and to the toul anderin he sorit he torsith she se atere an ting ot hand and thit hhe so the te wile har\n", + "ens ont in the sersise, and we he seres tar aterer, to ato tat or has he he wan ton here won and sen heren he sosering, to to theer oo adent har herere the wosh oute, was serild ward tous hed astend..\n", + "\n", + "I's sint on alt in har tor tit her asd hade shithans ored he talereng an soredendere tim tot hees. 
Tise sor and \n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i200_l512_2.432.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Fard as astice her said he celatice of to seress in the raice, and to be the some and sere allats to that said to that the sark and a cast a the wither ald the pacinesse of her had astition, he said to the sount as she west at hissele. Af the cond it he was a fact onthis astisarianing.\n", + "\n", + "\n", + "\"Or a ton to to be that's a more at aspestale as the sont of anstiring as\n", + "thours and trey.\n", + "\n", + "The same wo dangring the\n", + "raterst, who sore and somethy had ast out an of his book. \"We had's beane were that, and a morted a thay he had to tere. Then to\n", + "her homent andertersed his his ancouted to the pirsted, the soution for of the pirsice inthirgest and stenciol, with the hard and and\n", + "a colrice of to be oneres,\n", + "the song to this anderssad.\n", + "The could ounterss the said to serom of\n", + "soment a carsed of sheres of she\n", + "torded\n", + "har and want in their of hould, but\n", + "her told in that in he tad a the same to her. 
Serghing an her has and with the seed, and the camt ont his about of the\n", + "sail, the her then all houg ant or to hus to \n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i600_l512_1.750.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farrat, his felt has at it.\n", + "\n", + "\"When the pose ther hor exceed\n", + "to his sheant was,\" weat a sime of his sounsed. The coment and the facily that which had began terede a marilicaly whice whether the pose of his hand, at she was alligated herself the same on she had to\n", + "taiking to his forthing and streath how to hand\n", + "began in a lang at some at it, this he cholded not set all her. \"Wo love that is setthing. Him anstering as seen that.\"\n", + "\n", + "\"Yes in the man that say the mare a crances is it?\" said Sergazy Ivancatching. \"You doon think were somether is ifficult of a mone of\n", + "though the most at the countes that the\n", + "mean on the come to say the most, to\n", + "his feesing of\n", + "a man she, whilo he\n", + "sained and well, that he would still at to said. He wind at his for the sore in the most\n", + "of hoss and almoved to see him. 
They have betine the sumper into at he his stire, and what he was that at the so steate of the\n", + "sound, and shin should have a geest of shall feet on the conderation to she had been at that imporsing the dre\n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i1000_l512_1.484.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.0" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/tensorboard/Anna KaRNNa Name Scoped.ipynb b/tensorboard/Anna KaRNNa Name Scoped.ipynb new file mode 100644 index 0000000..5e5e2a4 --- /dev/null +++ b/tensorboard/Anna KaRNNa Name Scoped.ipynb @@ -0,0 +1,847 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Anna KaRNNa\n", + "\n", + "In this notebook, I'll build a character-wise RNN trained on Anna Karenina, one of my all-time favorite books. It'll be able to generate new text based on the text from the book.\n", + "\n", + "This network is based off of Andrej Karpathy's [post on RNNs](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) and [implementation in Torch](https://github.com/karpathy/char-rnn). Also, some information [here at r2rt](http://r2rt.com/recurrent-neural-networks-in-tensorflow-ii.html) and from [Sherjil Ozair](https://github.com/sherjilozair/char-rnn-tensorflow) on GitHub. 
Below is the general architecture of the character-wise RNN.\n", + "\n", + "" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "import time\n", + "from collections import namedtuple\n", + "\n", + "import numpy as np\n", + "import tensorflow as tf" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First we'll load the text file and convert it into integers for our network to use." + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "with open('anna.txt', 'r') as f:\n", + " text=f.read()\n", + "vocab = set(text)\n", + "vocab_to_int = {c: i for i, c in enumerate(vocab)}\n", + "int_to_vocab = dict(enumerate(vocab))\n", + "chars = np.array([vocab_to_int[c] for c in text], dtype=np.int32)" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'Chapter 1\\n\\n\\nHappy families are all alike; every unhappy family is unhappy in its own\\nway.\\n\\nEverythin'" + ] + }, + "execution_count": 15, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "text[:100]" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([26, 49, 22, 46, 33, 70, 51, 37, 36, 31, 31, 31, 65, 22, 46, 46, 39,\n", + " 37, 23, 22, 4, 77, 32, 77, 70, 81, 37, 22, 51, 70, 37, 22, 32, 32,\n", + " 37, 22, 32, 77, 0, 70, 64, 37, 70, 16, 70, 51, 39, 37, 20, 74, 49,\n", + " 22, 46, 46, 39, 37, 23, 22, 4, 77, 32, 39, 37, 77, 81, 37, 20, 74,\n", + " 49, 22, 46, 46, 39, 37, 77, 74, 37, 77, 33, 81, 37, 21, 75, 74, 31,\n", + " 75, 22, 39, 13, 31, 31, 24, 16, 70, 51, 39, 33, 49, 77, 74], dtype=int32)" + ] + }, + "execution_count": 16, + "metadata": {}, + 
"output_type": "execute_result" + } + ], + "source": [ + "chars[:100]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now I need to split up the data into batches, and into training and validation sets. I should be making a test set here, but I'm not going to worry about that. My test will be if the network can generate new text.\n", + "\n", + "Here I'll make both input and target arrays. The targets are the same as the inputs, except shifted one character over. I'll also drop the last bit of data so that I'll only have completely full batches.\n", + "\n", + "The idea here is to make a 2D matrix where the number of rows is equal to the batch size. Each row will be one long concatenated string from the character data. We'll split this data into a training set and validation set using the `split_frac` keyword. This will keep 90% of the batches in the training set, the other 10% in the validation set." + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def split_data(chars, batch_size, num_steps, split_frac=0.9):\n", + "    \"\"\" \n", + "    Split character data into training and validation sets, inputs and targets for each set.\n", + "    \n", + "    Arguments\n", + "    ---------\n", + "    chars: character array\n", + "    batch_size: Number of sequences in each batch\n", + "    num_steps: Number of sequence steps to keep in the input and pass to the network\n", + "    split_frac: Fraction of batches to keep in the training set\n", + "    \n", + "    \n", + "    Returns train_x, train_y, val_x, val_y\n", + "    \"\"\"\n", + "    \n", + "    \n", + "    slice_size = batch_size * num_steps\n", + "    n_batches = int(len(chars) / slice_size)\n", + "    \n", + "    # Drop the last few characters to make only full batches\n", + "    x = chars[: n_batches*slice_size]\n", + "    y = chars[1: n_batches*slice_size + 1]\n", + "    \n", + "    # Split the data into batch_size slices, then stack them into a 2D matrix \n", + "    
x = np.stack(np.split(x, batch_size))\n", + " y = np.stack(np.split(y, batch_size))\n", + " \n", + " # Now x and y are arrays with dimensions batch_size x n_batches*num_steps\n", + " \n", + " # Split into training and validation sets, keep the first split_frac batches for training\n", + " split_idx = int(n_batches*split_frac)\n", + " train_x, train_y = x[:, :split_idx*num_steps], y[:, :split_idx*num_steps]\n", + " val_x, val_y = x[:, split_idx*num_steps:], y[:, split_idx*num_steps:]\n", + " \n", + " return train_x, train_y, val_x, val_y" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "train_x, train_y, val_x, val_y = split_data(chars, 10, 200)" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "(10, 178400)" + ] + }, + "execution_count": 19, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "train_x.shape" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[26, 49, 22, 46, 33, 70, 51, 37, 36, 31],\n", + " [11, 74, 53, 37, 49, 70, 37, 4, 21, 16],\n", + " [37, 1, 22, 33, 1, 49, 77, 74, 69, 37],\n", + " [21, 33, 49, 70, 51, 37, 75, 21, 20, 32],\n", + " [37, 33, 49, 70, 37, 32, 22, 74, 53, 2],\n", + " [37, 52, 49, 51, 21, 20, 69, 49, 37, 32],\n", + " [33, 37, 33, 21, 31, 53, 21, 13, 31, 31],\n", + " [21, 37, 49, 70, 51, 81, 70, 32, 23, 54],\n", + " [49, 22, 33, 37, 77, 81, 37, 33, 49, 70],\n", + " [70, 51, 81, 70, 32, 23, 37, 22, 74, 53]], dtype=int32)" + ] + }, + "execution_count": 20, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "train_x[:,:10]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "I'll write another function to grab batches out of the arrays made by `split_data`. 
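To make the shapes concrete before writing that function, here is the same stack-then-window scheme on a toy NumPy array. This is a standalone sketch (the array, sizes, and names are made up for illustration), not the notebook's actual `split_data`/`get_batch`:

```python
import numpy as np

# Toy stand-in for the encoded character array (values are just 0..39).
seq = np.arange(40)
batch_size, num_steps = 4, 5

# Keep only enough characters for full batches, then stack into rows.
slice_size = batch_size * num_steps
n_batches = len(seq) // slice_size
x = np.stack(np.split(seq[:n_batches * slice_size], batch_size))
print(x.shape)  # (4, 10): batch_size rows, n_batches*num_steps columns

# Each training batch is a window of num_steps columns. Row order is fixed,
# so the RNN state computed on window b lines up with window b + 1.
windows = [x[:, b * num_steps:(b + 1) * num_steps] for b in range(n_batches)]
print(windows[0].shape)   # (4, 5)
print(windows[1][0])      # [5 6 7 8 9] -- row 0 continues where window 0 left off
```

Each row stays contiguous in the original sequence, which is exactly what lets the LSTM state carry over between successive windows.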
Here each batch will be a sliding window on these arrays with size `batch_size x num_steps`. For example, if we want our network to train on a sequence of 100 characters, `num_steps = 100`. For the next batch, we'll shift this window over to the next sequence of `num_steps` characters. In this way we can feed batches to the network and the cell state will carry over from one batch to the next." + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_batch(arrs, num_steps):\n", + " batch_size, slice_size = arrs[0].shape\n", + " \n", + " n_batches = int(slice_size/num_steps)\n", + " for b in range(n_batches):\n", + " yield [x[:, b*num_steps: (b+1)*num_steps] for x in arrs]" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def build_rnn(num_classes, batch_size=50, num_steps=50, lstm_size=128, num_layers=2,\n", + " learning_rate=0.001, grad_clip=5, sampling=False):\n", + " \n", + " if sampling:\n", + " batch_size, num_steps = 1, 1\n", + "\n", + " tf.reset_default_graph()\n", + " \n", + " # Declare placeholders we'll feed into the graph\n", + " with tf.name_scope('inputs'):\n", + " inputs = tf.placeholder(tf.int32, [batch_size, num_steps], name='inputs')\n", + " x_one_hot = tf.one_hot(inputs, num_classes, name='x_one_hot')\n", + " \n", + " with tf.name_scope('targets'):\n", + " targets = tf.placeholder(tf.int32, [batch_size, num_steps], name='targets')\n", + " y_one_hot = tf.one_hot(targets, num_classes, name='y_one_hot')\n", + " y_reshaped = tf.reshape(y_one_hot, [-1, num_classes])\n", + " \n", + " keep_prob = tf.placeholder(tf.float32, name='keep_prob')\n", + " \n", + " # Build the RNN layers\n", + " with tf.name_scope(\"RNN_layers\"):\n", + " lstm = tf.contrib.rnn.BasicLSTMCell(lstm_size)\n", + " drop = tf.contrib.rnn.DropoutWrapper(lstm, 
output_keep_prob=keep_prob)\n", + " cell = tf.contrib.rnn.MultiRNNCell([drop] * num_layers)\n", + " \n", + " with tf.name_scope(\"RNN_init_state\"):\n", + " initial_state = cell.zero_state(batch_size, tf.float32)\n", + "\n", + " # Run the data through the RNN layers\n", + " with tf.name_scope(\"RNN_forward\"):\n", + " rnn_inputs = [tf.squeeze(i, squeeze_dims=[1]) for i in tf.split(x_one_hot, num_steps, 1)]\n", + " outputs, state = tf.contrib.rnn.static_rnn(cell, rnn_inputs, initial_state=initial_state)\n", + " \n", + " final_state = state\n", + " \n", + " # Reshape output so it's a bunch of rows, one row for each cell output\n", + " with tf.name_scope('sequence_reshape'):\n", + " seq_output = tf.concat(outputs, axis=1, name='seq_output')\n", + " output = tf.reshape(seq_output, [-1, lstm_size], name='graph_output')\n", + " \n", + " # Now connect the RNN outputs to a softmax layer and calculate the cost\n", + " with tf.name_scope('logits'):\n", + " softmax_w = tf.Variable(tf.truncated_normal((lstm_size, num_classes), stddev=0.1),\n", + " name='softmax_w')\n", + " softmax_b = tf.Variable(tf.zeros(num_classes), name='softmax_b')\n", + " logits = tf.matmul(output, softmax_w) + softmax_b\n", + "\n", + " with tf.name_scope('predictions'):\n", + " preds = tf.nn.softmax(logits, name='predictions')\n", + " \n", + " \n", + " with tf.name_scope('cost'):\n", + " loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y_reshaped, name='loss')\n", + " cost = tf.reduce_mean(loss, name='cost')\n", + "\n", + " # Optimizer for training, using gradient clipping to control exploding gradients\n", + " with tf.name_scope('train'):\n", + " tvars = tf.trainable_variables()\n", + " grads, _ = tf.clip_by_global_norm(tf.gradients(cost, tvars), grad_clip)\n", + " train_op = tf.train.AdamOptimizer(learning_rate)\n", + " optimizer = train_op.apply_gradients(zip(grads, tvars))\n", + " \n", + " # Export the nodes \n", + " export_nodes = ['inputs', 'targets', 'initial_state', 
'final_state',\n", + " 'keep_prob', 'cost', 'preds', 'optimizer']\n", + " Graph = namedtuple('Graph', export_nodes)\n", + " local_dict = locals()\n", + " graph = Graph(*[local_dict[each] for each in export_nodes])\n", + " \n", + " return graph" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Hyperparameters\n", + "\n", + "Here I'm defining the hyperparameters for the network. The two you probably haven't seen before are `lstm_size` and `num_layers`. These set the number of hidden units in the LSTM layers and the number of LSTM layers, respectively. Of course, making these bigger can improve the network's performance, but you'll have to watch out for overfitting. If your validation loss is much larger than the training loss, you're probably overfitting. Decrease the size of the network or decrease the dropout keep probability." + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "batch_size = 100\n", + "num_steps = 100\n", + "lstm_size = 512\n", + "num_layers = 2\n", + "learning_rate = 0.001" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Write out the graph for TensorBoard" + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "model = build_rnn(len(vocab), \n", + " batch_size=batch_size,\n", + " num_steps=num_steps,\n", + " learning_rate=learning_rate,\n", + " lstm_size=lstm_size,\n", + " num_layers=num_layers)\n", + "\n", + "with tf.Session() as sess:\n", + " \n", + " sess.run(tf.global_variables_initializer())\n", + " file_writer = tf.summary.FileWriter('./logs/3', sess.graph)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training\n", + "\n", + "Time for training, which is pretty straightforward. Here I pass in some data, and get an LSTM state back. 
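Before the TensorFlow specifics, the loop's bookkeeping can be sketched framework-free. Everything in this sketch is a made-up stub (the state update does no real work); `save_every_n` plays the same checkpoint-interval role as in the real loop:

```python
# Sketch of the training schedule: reset the recurrent state at the start of
# each epoch, carry it across batches within the epoch, and record a
# checkpoint every save_every_n iterations plus once at the final iteration.
def training_schedule(epochs, n_batches, save_every_n):
    checkpoints = []
    iterations = epochs * n_batches
    for e in range(epochs):
        state = 0  # stand-in for sess.run(model.initial_state)
        for b in range(1, n_batches + 1):
            iteration = e * n_batches + b
            state += 1  # stand-in for running one batch and getting a new state
            if iteration % save_every_n == 0 or iteration == iterations:
                checkpoints.append(iteration)
    return checkpoints

print(training_schedule(epochs=2, n_batches=5, save_every_n=4))  # [4, 8, 10]
```

The real loop follows the same schedule, with `sess.run` doing the actual work at each iteration.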
Then I pass that state back in to the network so the next batch can continue the state from the previous batch. And every so often (set by `save_every_n`) I calculate the validation loss and save a checkpoint." + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "!mkdir -p checkpoints/anna" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true, + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Epoch 1/10 Iteration 1/1780 Training loss: 4.4195 1.3313 sec/batch\n", + "Epoch 1/10 Iteration 2/1780 Training loss: 4.3756 0.1287 sec/batch\n", + "Epoch 1/10 Iteration 3/1780 Training loss: 4.2069 0.1276 sec/batch\n", + "Epoch 1/10 Iteration 4/1780 Training loss: 4.5396 0.1185 sec/batch\n", + "Epoch 1/10 Iteration 5/1780 Training loss: 4.4190 0.1206 sec/batch\n", + "Epoch 1/10 Iteration 6/1780 Training loss: 4.3547 0.1233 sec/batch\n", + "Epoch 1/10 Iteration 7/1780 Training loss: 4.2792 0.1188 sec/batch\n", + "Epoch 1/10 Iteration 8/1780 Training loss: 4.2018 0.1170 sec/batch\n", + "Epoch 1/10 Iteration 9/1780 Training loss: 4.1251 0.1187 sec/batch\n", + "Epoch 1/10 Iteration 10/1780 Training loss: 4.0558 0.1174 sec/batch\n", + "Epoch 1/10 Iteration 11/1780 Training loss: 3.9946 0.1190 sec/batch\n", + "Epoch 1/10 Iteration 12/1780 Training loss: 3.9451 0.1193 sec/batch\n", + "Epoch 1/10 Iteration 13/1780 Training loss: 3.9011 0.1210 sec/batch\n", + "Epoch 1/10 Iteration 14/1780 Training loss: 3.8632 0.1185 sec/batch\n", + "Epoch 1/10 Iteration 15/1780 Training loss: 3.8275 0.1199 sec/batch\n", + "Epoch 1/10 Iteration 16/1780 Training loss: 3.7945 0.1211 sec/batch\n", + "Epoch 1/10 Iteration 17/1780 Training loss: 3.7649 0.1215 sec/batch\n", + "Epoch 1/10 Iteration 18/1780 Training loss: 3.7400 0.1214 sec/batch\n", + "Epoch 1/10 Iteration 19/1780 Training 
loss: 3.7164 0.1247 sec/batch\n", + "Epoch 1/10 Iteration 20/1780 Training loss: 3.6933 0.1212 sec/batch\n", + "Epoch 1/10 Iteration 21/1780 Training loss: 3.6728 0.1203 sec/batch\n", + "Epoch 1/10 Iteration 22/1780 Training loss: 3.6538 0.1207 sec/batch\n", + "Epoch 1/10 Iteration 23/1780 Training loss: 3.6359 0.1200 sec/batch\n", + "Epoch 1/10 Iteration 24/1780 Training loss: 3.6198 0.1229 sec/batch\n", + "Epoch 1/10 Iteration 25/1780 Training loss: 3.6041 0.1204 sec/batch\n", + "Epoch 1/10 Iteration 26/1780 Training loss: 3.5904 0.1202 sec/batch\n", + "Epoch 1/10 Iteration 27/1780 Training loss: 3.5774 0.1189 sec/batch\n", + "Epoch 1/10 Iteration 28/1780 Training loss: 3.5642 0.1214 sec/batch\n", + "Epoch 1/10 Iteration 29/1780 Training loss: 3.5522 0.1231 sec/batch\n", + "Epoch 1/10 Iteration 30/1780 Training loss: 3.5407 0.1199 sec/batch\n", + "Epoch 1/10 Iteration 31/1780 Training loss: 3.5309 0.1180 sec/batch\n", + "Epoch 1/10 Iteration 32/1780 Training loss: 3.5207 0.1179 sec/batch\n", + "Epoch 1/10 Iteration 33/1780 Training loss: 3.5109 0.1224 sec/batch\n", + "Epoch 1/10 Iteration 34/1780 Training loss: 3.5021 0.1206 sec/batch\n", + "Epoch 1/10 Iteration 35/1780 Training loss: 3.4931 0.1241 sec/batch\n", + "Epoch 1/10 Iteration 36/1780 Training loss: 3.4850 0.1169 sec/batch\n", + "Epoch 1/10 Iteration 37/1780 Training loss: 3.4767 0.1204 sec/batch\n", + "Epoch 1/10 Iteration 38/1780 Training loss: 3.4688 0.1202 sec/batch\n", + "Epoch 1/10 Iteration 39/1780 Training loss: 3.4611 0.1213 sec/batch\n" + ] + }, + { + "ename": "KeyboardInterrupt", + "evalue": "", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 33\u001b[0m model.initial_state: new_state}\n\u001b[1;32m 34\u001b[0m batch_loss, new_state, _ = 
sess.run([model.cost, model.final_state, model.optimizer], \n\u001b[0;32m---> 35\u001b[0;31m feed_dict=feed)\n\u001b[0m\u001b[1;32m 36\u001b[0m \u001b[0mloss\u001b[0m \u001b[0;34m+=\u001b[0m \u001b[0mbatch_loss\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 37\u001b[0m \u001b[0mend\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtime\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtime\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/mat/miniconda3/envs/tf-gpu/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36mrun\u001b[0;34m(self, fetches, feed_dict, options, run_metadata)\u001b[0m\n\u001b[1;32m 765\u001b[0m \u001b[0;32mtry\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 766\u001b[0m result = self._run(None, fetches, feed_dict, options_ptr,\n\u001b[0;32m--> 767\u001b[0;31m run_metadata_ptr)\n\u001b[0m\u001b[1;32m 768\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mrun_metadata\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 769\u001b[0m \u001b[0mproto_data\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtf_session\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mTF_GetBuffer\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mrun_metadata_ptr\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/mat/miniconda3/envs/tf-gpu/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_run\u001b[0;34m(self, handle, fetches, feed_dict, options, run_metadata)\u001b[0m\n\u001b[1;32m 963\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mfinal_fetches\u001b[0m \u001b[0;32mor\u001b[0m \u001b[0mfinal_targets\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 964\u001b[0m results = self._do_run(handle, final_targets, final_fetches,\n\u001b[0;32m--> 965\u001b[0;31m feed_dict_string, options, run_metadata)\n\u001b[0m\u001b[1;32m 966\u001b[0m 
\u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 967\u001b[0m \u001b[0mresults\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/mat/miniconda3/envs/tf-gpu/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_do_run\u001b[0;34m(self, handle, target_list, fetch_list, feed_dict, options, run_metadata)\u001b[0m\n\u001b[1;32m 1013\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mhandle\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1014\u001b[0m return self._do_call(_run_fn, self._session, feed_dict, fetch_list,\n\u001b[0;32m-> 1015\u001b[0;31m target_list, options, run_metadata)\n\u001b[0m\u001b[1;32m 1016\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1017\u001b[0m return self._do_call(_prun_fn, self._session, handle, feed_dict,\n", + "\u001b[0;32m/home/mat/miniconda3/envs/tf-gpu/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_do_call\u001b[0;34m(self, fn, *args)\u001b[0m\n\u001b[1;32m 1020\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_do_call\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfn\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m*\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1021\u001b[0m \u001b[0;32mtry\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 1022\u001b[0;31m \u001b[0;32mreturn\u001b[0m \u001b[0mfn\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m*\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 1023\u001b[0m \u001b[0;32mexcept\u001b[0m \u001b[0merrors\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mOpError\u001b[0m \u001b[0;32mas\u001b[0m 
\u001b[0me\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1024\u001b[0m \u001b[0mmessage\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mcompat\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mas_text\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0me\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mmessage\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/mat/miniconda3/envs/tf-gpu/lib/python3.5/site-packages/tensorflow/python/client/session.py\u001b[0m in \u001b[0;36m_run_fn\u001b[0;34m(session, feed_dict, fetch_list, target_list, options, run_metadata)\u001b[0m\n\u001b[1;32m 1002\u001b[0m return tf_session.TF_Run(session, options,\n\u001b[1;32m 1003\u001b[0m \u001b[0mfeed_dict\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfetch_list\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtarget_list\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m-> 1004\u001b[0;31m status, run_metadata)\n\u001b[0m\u001b[1;32m 1005\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 1006\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m_prun_fn\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msession\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mhandle\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfeed_dict\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mfetch_list\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + ] + } + ], + "source": [ + "epochs = 10\n", + "save_every_n = 200\n", + "train_x, train_y, val_x, val_y = split_data(chars, batch_size, num_steps)\n", + "\n", + "model = build_rnn(len(vocab), \n", + " batch_size=batch_size,\n", + " num_steps=num_steps,\n", + " learning_rate=learning_rate,\n", + " lstm_size=lstm_size,\n", + " num_layers=num_layers)\n", + "\n", + "saver = tf.train.Saver(max_to_keep=100)\n", + "\n", + "with tf.Session() as sess:\n", + " sess.run(tf.global_variables_initializer())\n", + " \n", + " # Use the line below to load a checkpoint and resume training\n", + " 
#saver.restore(sess, 'checkpoints/anna20.ckpt')\n", + " \n", + " n_batches = int(train_x.shape[1]/num_steps)\n", + " iterations = n_batches * epochs\n", + " for e in range(epochs):\n", + " \n", + " # Train network\n", + " new_state = sess.run(model.initial_state)\n", + " loss = 0\n", + " for b, (x, y) in enumerate(get_batch([train_x, train_y], num_steps), 1):\n", + " iteration = e*n_batches + b\n", + " start = time.time()\n", + " feed = {model.inputs: x,\n", + " model.targets: y,\n", + " model.keep_prob: 0.5,\n", + " model.initial_state: new_state}\n", + " batch_loss, new_state, _ = sess.run([model.cost, model.final_state, model.optimizer], \n", + " feed_dict=feed)\n", + " loss += batch_loss\n", + " end = time.time()\n", + " print('Epoch {}/{} '.format(e+1, epochs),\n", + " 'Iteration {}/{}'.format(iteration, iterations),\n", + " 'Training loss: {:.4f}'.format(loss/b),\n", + " '{:.4f} sec/batch'.format((end-start)))\n", + " \n", + " \n", + " if (iteration%save_every_n == 0) or (iteration == iterations):\n", + " # Check performance, notice dropout has been set to 1\n", + " val_loss = []\n", + " new_state = sess.run(model.initial_state)\n", + " for x, y in get_batch([val_x, val_y], num_steps):\n", + " feed = {model.inputs: x,\n", + " model.targets: y,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " batch_loss, new_state = sess.run([model.cost, model.final_state], feed_dict=feed)\n", + " val_loss.append(batch_loss)\n", + "\n", + " print('Validation loss:', np.mean(val_loss),\n", + " 'Saving checkpoint!')\n", + " saver.save(sess, \"checkpoints/anna/i{}_l{}_{:.3f}.ckpt\".format(iteration, lstm_size, np.mean(val_loss)))" + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "model_checkpoint_path: \"checkpoints/anna/i3560_l512_1.122.ckpt\"\n", + "all_model_checkpoint_paths: 
\"checkpoints/anna/i200_l512_2.432.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i400_l512_1.980.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i600_l512_1.750.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i800_l512_1.595.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1000_l512_1.484.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1200_l512_1.407.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1400_l512_1.349.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1600_l512_1.292.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1800_l512_1.255.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2000_l512_1.224.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2200_l512_1.204.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2400_l512_1.187.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2600_l512_1.172.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2800_l512_1.160.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3000_l512_1.148.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3200_l512_1.137.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3400_l512_1.129.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3560_l512_1.122.ckpt\"" + ] + }, + "execution_count": 35, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "tf.train.get_checkpoint_state('checkpoints/anna')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Sampling\n", + "\n", + "Now that the network is trained, we'll can use it to generate new text. The idea is that we pass in a character, then the network will predict the next character. We can use the new one, to predict the next one. And we keep doing this to generate all new text. 
I also included some functionality to prime the network with some text by passing in a string and building up a state from that.\n", + "\n", + "The network gives us predictions for each character. To reduce noise and make things a little less random, I'm only going to choose a new character from the top N most likely characters.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def pick_top_n(preds, vocab_size, top_n=5):\n", + " p = np.squeeze(preds)\n", + " p[np.argsort(p)[:-top_n]] = 0\n", + " p = p / np.sum(p)\n", + " c = np.random.choice(vocab_size, 1, p=p)[0]\n", + " return c" + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def sample(checkpoint, n_samples, lstm_size, vocab_size, prime=\"The \"):\n", + " samples = [c for c in prime]\n", + " model = build_rnn(vocab_size, lstm_size=lstm_size, sampling=True)\n", + " saver = tf.train.Saver()\n", + " with tf.Session() as sess:\n", + " saver.restore(sess, checkpoint)\n", + " new_state = sess.run(model.initial_state)\n", + " for c in prime:\n", + " x = np.zeros((1, 1))\n", + " x[0,0] = vocab_to_int[c]\n", + " feed = {model.inputs: x,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " preds, new_state = sess.run([model.preds, model.final_state], \n", + " feed_dict=feed)\n", + "\n", + " c = pick_top_n(preds, vocab_size)\n", + " samples.append(int_to_vocab[c])\n", + "\n", + " for i in range(n_samples):\n", + " x[0,0] = c\n", + " feed = {model.inputs: x,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " preds, new_state = sess.run([model.preds, model.final_state], \n", + " feed_dict=feed)\n", + "\n", + " c = pick_top_n(preds, vocab_size)\n", + " samples.append(int_to_vocab[c])\n", + " \n", + " return ''.join(samples)" + ] + }, + { + "cell_type": "code", + 
"execution_count": 44, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farlathit that if had so\n", + "like it that it were. He could not trouble to his wife, and there was\n", + "anything in them of the side of his weaky in the creature at his forteren\n", + "to him.\n", + "\n", + "\"What is it? I can't bread to those,\" said Stepan Arkadyevitch. \"It's not\n", + "my children, and there is an almost this arm, true it mays already,\n", + "and tell you what I have say to you, and was not looking at the peasant,\n", + "why is, I don't know him out, and she doesn't speak to me immediately, as\n", + "you would say the countess and the more frest an angelembre, and time and\n", + "things's silent, but I was not in my stand that is in my head. But if he\n", + "say, and was so feeling with his soul. A child--in his soul of his\n", + "soul of his soul. He should not see that any of that sense of. Here he\n", + "had not been so composed and to speak for as in a whole picture, but\n", + "all the setting and her excellent and society, who had been delighted\n", + "and see to anywing had been being troed to thousand words on them,\n", + "we liked him.\n", + "\n", + "That set in her money at the table, he came into the party. The capable\n", + "of his she could not be as an old composure.\n", + "\n", + "\"That's all something there will be down becime by throe is\n", + "such a silent, as in a countess, I should state it out and divorct.\n", + "The discussion is not for me. I was that something was simply they are\n", + "all three manshess of a sensitions of mind it all.\"\n", + "\n", + "\"No,\" he thought, shouted and lifting his soul. \"While it might see your\n", + "honser and she, I could burst. And I had been a midelity. 
And I had a\n", + "marnief are through the countess,\" he said, looking at him, a chosing\n", + "which they had been carried out and still solied, and there was a sen that\n", + "was to be completely, and that this matter of all the seconds of it, and\n", + "a concipation were to her husband, who came up and conscaously, that he\n", + "was not the station. All his fourse she was always at the country,,\n", + "to speak oft, and though they were to hear the delightful throom and\n", + "whether they came towards the morning, and his living and a coller and\n", + "hold--the children. \n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i3560_l512_1.122.ckpt\"\n", + "samp = sample(checkpoint, 2000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farnt him oste wha sorind thans tout thint asd an sesand an hires on thime sind thit aled, ban thand and out hore as the ter hos ton ho te that, was tis tart al the hand sostint him sore an tit an son thes, win he se ther san ther hher tas tarereng,.\n", + "\n", + "Anl at an ades in ond hesiln, ad hhe torers teans, wast tar arering tho this sos alten sorer has hhas an siton ther him he had sin he ard ate te anling the sosin her ans and\n", + "arins asd and ther ale te tot an tand tanginge wath and ho ald, so sot th asend sat hare sother horesinnd, he hesense wing ante her so tith tir sherinn, anded and to the toul anderin he sorit he torsith she se atere an ting ot hand and thit hhe so the te wile har\n", + "ens ont in the sersise, and we he seres tar aterer, to ato tat or has he he wan ton here won and sen heren he sosering, to to theer oo adent har herere the wosh oute, was serild ward tous hed astend..\n", + "\n", + "I's sint on alt in har tor tit her asd hade shithans ored he talereng an soredendere tim tot hees. 
Tise sor and \n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i200_l512_2.432.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Fard as astice her said he celatice of to seress in the raice, and to be the some and sere allats to that said to that the sark and a cast a the wither ald the pacinesse of her had astition, he said to the sount as she west at hissele. Af the cond it he was a fact onthis astisarianing.\n", + "\n", + "\n", + "\"Or a ton to to be that's a more at aspestale as the sont of anstiring as\n", + "thours and trey.\n", + "\n", + "The same wo dangring the\n", + "raterst, who sore and somethy had ast out an of his book. \"We had's beane were that, and a morted a thay he had to tere. Then to\n", + "her homent andertersed his his ancouted to the pirsted, the soution for of the pirsice inthirgest and stenciol, with the hard and and\n", + "a colrice of to be oneres,\n", + "the song to this anderssad.\n", + "The could ounterss the said to serom of\n", + "soment a carsed of sheres of she\n", + "torded\n", + "har and want in their of hould, but\n", + "her told in that in he tad a the same to her. 
Serghing an her has and with the seed, and the camt ont his about of the\n", + "sail, the her then all houg ant or to hus to \n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i600_l512_1.750.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farrat, his felt has at it.\n", + "\n", + "\"When the pose ther hor exceed\n", + "to his sheant was,\" weat a sime of his sounsed. The coment and the facily that which had began terede a marilicaly whice whether the pose of his hand, at she was alligated herself the same on she had to\n", + "taiking to his forthing and streath how to hand\n", + "began in a lang at some at it, this he cholded not set all her. \"Wo love that is setthing. Him anstering as seen that.\"\n", + "\n", + "\"Yes in the man that say the mare a crances is it?\" said Sergazy Ivancatching. \"You doon think were somether is ifficult of a mone of\n", + "though the most at the countes that the\n", + "mean on the come to say the most, to\n", + "his feesing of\n", + "a man she, whilo he\n", + "sained and well, that he would still at to said. He wind at his for the sore in the most\n", + "of hoss and almoved to see him. 
They have betine the sumper into at he his stire, and what he was that at the so steate of the\n", + "sound, and shin should have a geest of shall feet on the conderation to she had been at that imporsing the dre\n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i1000_l512_1.484.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + } + ], + "metadata": { + "hide_input": false, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.1" + }, + "toc": { + "colors": { + "hover_highlight": "#DAA520", + "running_highlight": "#FF0000", + "selected_highlight": "#FFD700" + }, + "moveMenuLeft": true, + "nav_menu": { + "height": "111px", + "width": "251px" + }, + "navigate_menu": true, + "number_sections": true, + "sideBar": true, + "threshold": 4, + "toc_cell": false, + "toc_section_display": "block", + "toc_window_display": false + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/tensorboard/Anna KaRNNa Summaries.ipynb b/tensorboard/Anna KaRNNa Summaries.ipynb new file mode 100644 index 0000000..31d1824 --- /dev/null +++ b/tensorboard/Anna KaRNNa Summaries.ipynb @@ -0,0 +1,2553 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Anna KaRNNa\n", + "\n", + "In this notebook, I'll build a character-wise RNN trained on Anna Karenina, one of my all-time favorite books. It'll be able to generate new text based on the text from the book.\n", + "\n", + "This network is based off of Andrej Karpathy's [post on RNNs](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) and [implementation in Torch](https://github.com/karpathy/char-rnn). 
Also, some information [here at r2rt](http://r2rt.com/recurrent-neural-networks-in-tensorflow-ii.html) and from [Sherjil Ozair](https://github.com/sherjilozair/char-rnn-tensorflow) on GitHub. Below is the general architecture of the character-wise RNN.\n", + "\n", + "" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "import time\n", + "from collections import namedtuple\n", + "\n", + "import numpy as np\n", + "import tensorflow as tf" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First we'll load the text file and convert it into integers for our network to use." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "with open('anna.txt', 'r') as f:\n", + " text=f.read()\n", + "vocab = set(text)\n", + "vocab_to_int = {c: i for i, c in enumerate(vocab)}\n", + "int_to_vocab = dict(enumerate(vocab))\n", + "chars = np.array([vocab_to_int[c] for c in text], dtype=np.int32)" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'Chapter 1\\n\\n\\nHappy families are all alike; every unhappy family is unhappy in its own\\nway.\\n\\nEverythin'" + ] + }, + "execution_count": 3, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "text[:100]" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([ 6, 55, 75, 1, 64, 12, 26, 8, 61, 73, 73, 73, 25, 75, 1, 1, 76,\n", + " 8, 62, 75, 7, 18, 82, 18, 12, 5, 8, 75, 26, 12, 8, 75, 82, 82,\n", + " 8, 75, 82, 18, 54, 12, 47, 8, 12, 32, 12, 26, 76, 8, 35, 63, 55,\n", + " 75, 1, 1, 76, 8, 62, 75, 7, 18, 82, 76, 8, 18, 5, 8, 35, 63,\n", + " 55, 75, 1, 1, 76, 8, 18, 63, 8, 18, 64, 
5, 8, 29, 72, 63, 73,\n", + " 72, 75, 76, 19, 73, 73, 23, 32, 12, 26, 76, 64, 55, 18, 63], dtype=int32)" + ] + }, + "execution_count": 4, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "chars[:100]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now I need to split up the data into batches, and into training and validation sets. I should be making a test set here, but I'm not going to worry about that. My test will be if the network can generate new text.\n", + "\n", + "Here I'll make both input and target arrays. The targets are the same as the inputs, except shifted one character over. I'll also drop the last bit of data so that I'll only have completely full batches.\n", + "\n", + "The idea here is to make a 2D matrix where the number of rows is equal to the number of batches. Each row will be one long concatenated string from the character data. We'll split this data into a training set and validation set using the `split_frac` keyword. This will keep 90% of the batches in the training set, the other 10% in the validation set." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def split_data(chars, batch_size, num_steps, split_frac=0.9):\n", + " \"\"\" \n", + " Split character data into training and validation sets, inputs and targets for each set.\n", + " \n", + " Arguments\n", + " ---------\n", + " chars: character array\n", + " batch_size: Number of sequences in each batch\n", + " num_steps: Number of sequence steps to keep in the input and pass to the network\n", + " split_frac: Fraction of batches to keep in the training set\n", + " \n", + " \n", + " Returns train_x, train_y, val_x, val_y\n", + " \"\"\"\n", + " \n", + " slice_size = batch_size * num_steps\n", + " n_batches = int(len(chars) / slice_size)\n", + " \n", + " # Drop the last few characters to make only full batches\n", + " x = chars[: n_batches*slice_size]\n", + " y = chars[1: n_batches*slice_size + 1]\n", + " \n", + " # Split the data into batch_size slices, then stack them into a 2D matrix \n", + " x = np.stack(np.split(x, batch_size))\n", + " y = np.stack(np.split(y, batch_size))\n", + " \n", + " # Now x and y are arrays with dimensions batch_size x n_batches*num_steps\n", + " \n", + " # Split into training and validation sets, keep the first split_frac batches for training\n", + " split_idx = int(n_batches*split_frac)\n", + " train_x, train_y = x[:, :split_idx*num_steps], y[:, :split_idx*num_steps]\n", + " val_x, val_y = x[:, split_idx*num_steps:], y[:, split_idx*num_steps:]\n", + " \n", + " return train_x, train_y, val_x, val_y" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "train_x, train_y, val_x, val_y = split_data(chars, 10, 200)" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "(10, 178400)" + ] + }, + "execution_count": 8, + "metadata":
{}, + "output_type": "execute_result" + } + ], + "source": [ + "train_x.shape" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[ 6, 55, 75, 1, 64, 12, 26, 8, 61, 73],\n", + " [30, 63, 79, 8, 55, 12, 8, 7, 29, 32],\n", + " [ 8, 53, 75, 64, 53, 55, 18, 63, 56, 8],\n", + " [29, 64, 55, 12, 26, 8, 72, 29, 35, 82],\n", + " [ 8, 64, 55, 12, 8, 82, 75, 63, 79, 46],\n", + " [ 8, 33, 55, 26, 29, 35, 56, 55, 8, 82],\n", + " [64, 8, 64, 29, 73, 79, 29, 19, 73, 73],\n", + " [29, 8, 55, 12, 26, 5, 12, 82, 62, 48],\n", + " [55, 75, 64, 8, 18, 5, 8, 64, 55, 12],\n", + " [12, 26, 5, 12, 82, 62, 8, 75, 63, 79]], dtype=int32)" + ] + }, + "execution_count": 9, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "train_x[:,:10]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "I'll write another function to grab batches out of the arrays made by `split_data`. Here each batch will be a sliding window on these arrays with size `batch_size x num_steps`. For example, if we want our network to train on a sequence of 100 characters, `num_steps = 100`. For the next batch, we'll shift this window to the next sequence of `num_steps` characters. In this way we can feed batches to the network and the cell states will continue through on each batch."
+ ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_batch(arrs, num_steps):\n", + " batch_size, slice_size = arrs[0].shape\n", + " \n", + " n_batches = int(slice_size/num_steps)\n", + " for b in range(n_batches):\n", + " yield [x[:, b*num_steps: (b+1)*num_steps] for x in arrs]" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def build_rnn(num_classes, batch_size=50, num_steps=50, lstm_size=128, num_layers=2,\n", + " learning_rate=0.001, grad_clip=5, sampling=False):\n", + " \n", + " if sampling == True:\n", + " batch_size, num_steps = 1, 1\n", + "\n", + " tf.reset_default_graph()\n", + " \n", + " # Declare placeholders we'll feed into the graph\n", + " with tf.name_scope('inputs'):\n", + " inputs = tf.placeholder(tf.int32, [batch_size, num_steps], name='inputs')\n", + " x_one_hot = tf.one_hot(inputs, num_classes, name='x_one_hot')\n", + " \n", + " with tf.name_scope('targets'):\n", + " targets = tf.placeholder(tf.int32, [batch_size, num_steps], name='targets')\n", + " y_one_hot = tf.one_hot(targets, num_classes, name='y_one_hot')\n", + " y_reshaped = tf.reshape(y_one_hot, [-1, num_classes])\n", + " \n", + " keep_prob = tf.placeholder(tf.float32, name='keep_prob')\n", + " \n", + " # Build the RNN layers\n", + " with tf.name_scope(\"RNN_cells\"):\n", + " lstm = tf.contrib.rnn.BasicLSTMCell(lstm_size)\n", + " drop = tf.contrib.rnn.DropoutWrapper(lstm, output_keep_prob=keep_prob)\n", + " cell = tf.contrib.rnn.MultiRNNCell([drop] * num_layers)\n", + " \n", + " with tf.name_scope(\"RNN_init_state\"):\n", + " initial_state = cell.zero_state(batch_size, tf.float32)\n", + "\n", + " # Run the data through the RNN layers\n", + " with tf.name_scope(\"RNN_forward\"):\n", + " rnn_inputs = [tf.squeeze(i, squeeze_dims=[1]) for i in tf.split(x_one_hot, num_steps, 
1)]\n", + " outputs, state = tf.contrib.rnn.static_rnn(cell, rnn_inputs, initial_state=initial_state)\n", + " \n", + " final_state = state\n", + " \n", + " # Reshape output so it's a bunch of rows, one row for each cell output\n", + " with tf.name_scope('sequence_reshape'):\n", + " seq_output = tf.concat(outputs, axis=1,name='seq_output')\n", + " output = tf.reshape(seq_output, [-1, lstm_size], name='graph_output')\n", + " \n", + " # Now connect the RNN outputs to a softmax layer and calculate the cost\n", + " with tf.name_scope('logits'):\n", + " softmax_w = tf.Variable(tf.truncated_normal((lstm_size, num_classes), stddev=0.1),\n", + " name='softmax_w')\n", + " softmax_b = tf.Variable(tf.zeros(num_classes), name='softmax_b')\n", + " logits = tf.matmul(output, softmax_w) + softmax_b\n", + " tf.summary.histogram('softmax_w', softmax_w)\n", + " tf.summary.histogram('softmax_b', softmax_b)\n", + "\n", + " with tf.name_scope('predictions'):\n", + " preds = tf.nn.softmax(logits, name='predictions')\n", + " tf.summary.histogram('predictions', preds)\n", + " \n", + " with tf.name_scope('cost'):\n", + " loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y_reshaped, name='loss')\n", + " cost = tf.reduce_mean(loss, name='cost')\n", + " tf.summary.scalar('cost', cost)\n", + "\n", + " # Optimizer for training, using gradient clipping to control exploding gradients\n", + " with tf.name_scope('train'):\n", + " tvars = tf.trainable_variables()\n", + " grads, _ = tf.clip_by_global_norm(tf.gradients(cost, tvars), grad_clip)\n", + " train_op = tf.train.AdamOptimizer(learning_rate)\n", + " optimizer = train_op.apply_gradients(zip(grads, tvars))\n", + " \n", + " merged = tf.summary.merge_all()\n", + " \n", + " # Export the nodes \n", + " export_nodes = ['inputs', 'targets', 'initial_state', 'final_state',\n", + " 'keep_prob', 'cost', 'preds', 'optimizer', 'merged']\n", + " Graph = namedtuple('Graph', export_nodes)\n", + " local_dict = locals()\n", + " graph = 
Graph(*[local_dict[each] for each in export_nodes])\n", + " \n", + " return graph" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Hyperparameters\n", + "\n", + "Here I'm defining the hyperparameters for the network. The two you probably haven't seen before are `lstm_size` and `num_layers`. These set the number of hidden units in the LSTM layers and the number of LSTM layers, respectively. Of course, making these bigger will improve the network's performance, but you'll have to watch out for overfitting. If your validation loss is much larger than the training loss, you're probably overfitting. Decrease the size of the network or decrease the dropout keep probability." + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "batch_size = 100\n", + "num_steps = 100\n", + "lstm_size = 512\n", + "num_layers = 2\n", + "learning_rate = 0.001" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training\n", + "\n", + "Time for training, which is pretty straightforward. Here I pass in some data, and get an LSTM state back. Then I pass that state back into the network so the next batch can continue the state from the previous batch. And every so often (set by `save_every_n`) I calculate the validation loss and save a checkpoint."
+ ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "!mkdir -p checkpoints/anna" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true, + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Epoch 1/10 Iteration 1/1780 Training loss: 4.4188 1.2876 sec/batch\n", + "Epoch 1/10 Iteration 2/1780 Training loss: 4.3775 0.1364 sec/batch\n", + "Epoch 1/10 Iteration 3/1780 Training loss: 4.2100 0.1310 sec/batch\n", + "Epoch 1/10 Iteration 4/1780 Training loss: 4.5256 0.1212 sec/batch\n", + "Epoch 1/10 Iteration 5/1780 Training loss: 4.4524 0.1271 sec/batch\n", + "Epoch 1/10 Iteration 6/1780 Training loss: 4.3496 0.1272 sec/batch\n", + "Epoch 1/10 Iteration 7/1780 Training loss: 4.2637 0.1260 sec/batch\n", + "Epoch 1/10 Iteration 8/1780 Training loss: 4.1856 0.1231 sec/batch\n", + "Epoch 1/10 Iteration 9/1780 Training loss: 4.1126 0.1210 sec/batch\n", + "Epoch 1/10 Iteration 10/1780 Training loss: 4.0469 0.1198 sec/batch\n", + "Epoch 1/10 Iteration 11/1780 Training loss: 3.9883 0.1211 sec/batch\n", + "Epoch 1/10 Iteration 12/1780 Training loss: 3.9390 0.1232 sec/batch\n", + "Epoch 1/10 Iteration 13/1780 Training loss: 3.8954 0.1352 sec/batch\n", + "Epoch 1/10 Iteration 14/1780 Training loss: 3.8584 0.1232 sec/batch\n", + "Epoch 1/10 Iteration 15/1780 Training loss: 3.8247 0.1217 sec/batch\n", + "Epoch 1/10 Iteration 16/1780 Training loss: 3.7941 0.1202 sec/batch\n", + "Epoch 1/10 Iteration 17/1780 Training loss: 3.7654 0.1205 sec/batch\n", + "Epoch 1/10 Iteration 18/1780 Training loss: 3.7406 0.1200 sec/batch\n", + "Epoch 1/10 Iteration 19/1780 Training loss: 3.7170 0.1227 sec/batch\n", + "Epoch 1/10 Iteration 20/1780 Training loss: 3.6936 0.1193 sec/batch\n", + "Epoch 1/10 Iteration 21/1780 Training loss: 3.6733 0.1201 sec/batch\n", + "Epoch 1/10 Iteration 
22/1780 Training loss: 3.6542 0.1187 sec/batch\n", + "Epoch 1/10 Iteration 23/1780 Training loss: 3.6371 0.1194 sec/batch\n", + "Epoch 1/10 Iteration 24/1780 Training loss: 3.6212 0.1194 sec/batch\n", + "Epoch 1/10 Iteration 25/1780 Training loss: 3.6055 0.1203 sec/batch\n", + "Epoch 1/10 Iteration 26/1780 Training loss: 3.5918 0.1209 sec/batch\n", + "Epoch 1/10 Iteration 27/1780 Training loss: 3.5789 0.1189 sec/batch\n", + "Epoch 1/10 Iteration 28/1780 Training loss: 3.5657 0.1197 sec/batch\n", + "Epoch 1/10 Iteration 29/1780 Training loss: 3.5534 0.1242 sec/batch\n", + "Epoch 1/10 Iteration 30/1780 Training loss: 3.5425 0.1184 sec/batch\n", + "Epoch 1/10 Iteration 31/1780 Training loss: 3.5325 0.1204 sec/batch\n", + "Epoch 1/10 Iteration 32/1780 Training loss: 3.5224 0.1203 sec/batch\n", + "Epoch 1/10 Iteration 33/1780 Training loss: 3.5125 0.1236 sec/batch\n", + "Epoch 1/10 Iteration 34/1780 Training loss: 3.5037 0.1195 sec/batch\n", + "Epoch 1/10 Iteration 35/1780 Training loss: 3.4948 0.1202 sec/batch\n", + "Epoch 1/10 Iteration 36/1780 Training loss: 3.4867 0.1190 sec/batch\n", + "Epoch 1/10 Iteration 37/1780 Training loss: 3.4782 0.1226 sec/batch\n", + "Epoch 1/10 Iteration 38/1780 Training loss: 3.4702 0.1201 sec/batch\n", + "Epoch 1/10 Iteration 39/1780 Training loss: 3.4625 0.1223 sec/batch\n", + "Epoch 1/10 Iteration 40/1780 Training loss: 3.4553 0.1196 sec/batch\n", + "Epoch 1/10 Iteration 41/1780 Training loss: 3.4482 0.1200 sec/batch\n", + "Epoch 1/10 Iteration 42/1780 Training loss: 3.4415 0.1195 sec/batch\n", + "Epoch 1/10 Iteration 43/1780 Training loss: 3.4350 0.1209 sec/batch\n", + "Epoch 1/10 Iteration 44/1780 Training loss: 3.4287 0.1215 sec/batch\n", + "Epoch 1/10 Iteration 45/1780 Training loss: 3.4225 0.1255 sec/batch\n", + "Epoch 1/10 Iteration 46/1780 Training loss: 3.4170 0.1194 sec/batch\n", + "Epoch 1/10 Iteration 47/1780 Training loss: 3.4116 0.1194 sec/batch\n", + "Epoch 1/10 Iteration 48/1780 Training loss: 3.4067 0.1190 
sec/batch\n", + "Epoch 1/10 Iteration 49/1780 Training loss: 3.4020 0.1215 sec/batch\n", + "Epoch 1/10 Iteration 50/1780 Training loss: 3.3972 0.1203 sec/batch\n", + "Epoch 1/10 Iteration 51/1780 Training loss: 3.3926 0.1199 sec/batch\n", + "Epoch 1/10 Iteration 52/1780 Training loss: 3.3878 0.1188 sec/batch\n", + "Epoch 1/10 Iteration 53/1780 Training loss: 3.3836 0.1214 sec/batch\n", + "Epoch 1/10 Iteration 54/1780 Training loss: 3.3791 0.1201 sec/batch\n", + "Epoch 1/10 Iteration 55/1780 Training loss: 3.3750 0.1199 sec/batch\n", + "Epoch 1/10 Iteration 56/1780 Training loss: 3.3707 0.1201 sec/batch\n", + "Epoch 1/10 Iteration 57/1780 Training loss: 3.3667 0.1234 sec/batch\n", + "Epoch 1/10 Iteration 58/1780 Training loss: 3.3630 0.1213 sec/batch\n", + "Epoch 1/10 Iteration 59/1780 Training loss: 3.3592 0.1229 sec/batch\n", + "Epoch 1/10 Iteration 60/1780 Training loss: 3.3557 0.1194 sec/batch\n", + "Epoch 1/10 Iteration 61/1780 Training loss: 3.3522 0.1205 sec/batch\n", + "Epoch 1/10 Iteration 62/1780 Training loss: 3.3493 0.1189 sec/batch\n", + "Epoch 1/10 Iteration 63/1780 Training loss: 3.3464 0.1201 sec/batch\n", + "Epoch 1/10 Iteration 64/1780 Training loss: 3.3429 0.1210 sec/batch\n", + "Epoch 1/10 Iteration 65/1780 Training loss: 3.3396 0.1213 sec/batch\n", + "Epoch 1/10 Iteration 66/1780 Training loss: 3.3368 0.1218 sec/batch\n", + "Epoch 1/10 Iteration 67/1780 Training loss: 3.3340 0.1202 sec/batch\n", + "Epoch 1/10 Iteration 68/1780 Training loss: 3.3306 0.1195 sec/batch\n", + "Epoch 1/10 Iteration 69/1780 Training loss: 3.3276 0.1225 sec/batch\n", + "Epoch 1/10 Iteration 70/1780 Training loss: 3.3249 0.1188 sec/batch\n", + "Epoch 1/10 Iteration 71/1780 Training loss: 3.3221 0.1208 sec/batch\n", + "Epoch 1/10 Iteration 72/1780 Training loss: 3.3197 0.1201 sec/batch\n", + "Epoch 1/10 Iteration 73/1780 Training loss: 3.3170 0.1206 sec/batch\n", + "Epoch 1/10 Iteration 74/1780 Training loss: 3.3145 0.1192 sec/batch\n", + "Epoch 1/10 Iteration 75/1780 
Training loss: 3.3122 0.1233 sec/batch\n", + "Epoch 1/10 Iteration 76/1780 Training loss: 3.3099 0.1197 sec/batch\n", + "Epoch 1/10 Iteration 77/1780 Training loss: 3.3076 0.1204 sec/batch\n", + "Epoch 1/10 Iteration 78/1780 Training loss: 3.3053 0.1199 sec/batch\n", + "Epoch 1/10 Iteration 79/1780 Training loss: 3.3029 0.1232 sec/batch\n", + "Epoch 1/10 Iteration 80/1780 Training loss: 3.3004 0.1190 sec/batch\n", + "Epoch 1/10 Iteration 81/1780 Training loss: 3.2982 0.1201 sec/batch\n", + "Epoch 1/10 Iteration 82/1780 Training loss: 3.2961 0.1196 sec/batch\n", + "Epoch 1/10 Iteration 83/1780 Training loss: 3.2940 0.1213 sec/batch\n", + "Epoch 1/10 Iteration 84/1780 Training loss: 3.2919 0.1184 sec/batch\n", + "Epoch 1/10 Iteration 85/1780 Training loss: 3.2899 0.1199 sec/batch\n", + "Epoch 1/10 Iteration 86/1780 Training loss: 3.2881 0.1190 sec/batch\n", + "Epoch 1/10 Iteration 87/1780 Training loss: 3.2862 0.1201 sec/batch\n", + "Epoch 1/10 Iteration 88/1780 Training loss: 3.2843 0.1217 sec/batch\n", + "Epoch 1/10 Iteration 89/1780 Training loss: 3.2826 0.1199 sec/batch\n", + "Epoch 1/10 Iteration 90/1780 Training loss: 3.2809 0.1191 sec/batch\n", + "Epoch 1/10 Iteration 91/1780 Training loss: 3.2791 0.1204 sec/batch\n", + "Epoch 1/10 Iteration 92/1780 Training loss: 3.2773 0.1219 sec/batch\n", + "Epoch 1/10 Iteration 93/1780 Training loss: 3.2756 0.1192 sec/batch\n", + "Epoch 1/10 Iteration 94/1780 Training loss: 3.2738 0.1213 sec/batch\n", + "Epoch 1/10 Iteration 95/1780 Training loss: 3.2721 0.1207 sec/batch\n", + "Epoch 1/10 Iteration 96/1780 Training loss: 3.2703 0.1186 sec/batch\n", + "Epoch 1/10 Iteration 97/1780 Training loss: 3.2688 0.1207 sec/batch\n", + "Epoch 1/10 Iteration 98/1780 Training loss: 3.2670 0.1203 sec/batch\n", + "Epoch 1/10 Iteration 99/1780 Training loss: 3.2654 0.1201 sec/batch\n", + "Epoch 1/10 Iteration 100/1780 Training loss: 3.2637 0.1199 sec/batch\n", + "Validation loss: 3.05181 Saving checkpoint!\n", + "Epoch 1/10 Iteration 
101/1780 Training loss: 3.2620 0.1184 sec/batch\n", + "Epoch 1/10 Iteration 102/1780 Training loss: 3.2603 0.1201 sec/batch\n", + "Epoch 1/10 Iteration 103/1780 Training loss: 3.2587 0.1201 sec/batch\n", + "Epoch 1/10 Iteration 104/1780 Training loss: 3.2569 0.1208 sec/batch\n", + "Epoch 1/10 Iteration 105/1780 Training loss: 3.2553 0.1187 sec/batch\n", + "Epoch 1/10 Iteration 106/1780 Training loss: 3.2536 0.1201 sec/batch\n", + "Epoch 1/10 Iteration 107/1780 Training loss: 3.2519 0.1227 sec/batch\n", + "Epoch 1/10 Iteration 108/1780 Training loss: 3.2502 0.1205 sec/batch\n", + "Epoch 1/10 Iteration 109/1780 Training loss: 3.2487 0.1224 sec/batch\n", + "Epoch 1/10 Iteration 110/1780 Training loss: 3.2469 0.1220 sec/batch\n", + "Epoch 1/10 Iteration 111/1780 Training loss: 3.2453 0.1191 sec/batch\n", + "Epoch 1/10 Iteration 112/1780 Training loss: 3.2437 0.1204 sec/batch\n", + "Epoch 1/10 Iteration 113/1780 Training loss: 3.2421 0.1191 sec/batch\n", + "Epoch 1/10 Iteration 114/1780 Training loss: 3.2404 0.1207 sec/batch\n", + "Epoch 1/10 Iteration 115/1780 Training loss: 3.2387 0.1202 sec/batch\n", + "Epoch 1/10 Iteration 116/1780 Training loss: 3.2371 0.1201 sec/batch\n", + "Epoch 1/10 Iteration 117/1780 Training loss: 3.2354 0.1195 sec/batch\n", + "Epoch 1/10 Iteration 118/1780 Training loss: 3.2340 0.1217 sec/batch\n", + "Epoch 1/10 Iteration 119/1780 Training loss: 3.2325 0.1211 sec/batch\n", + "Epoch 1/10 Iteration 120/1780 Training loss: 3.2309 0.1200 sec/batch\n", + "Epoch 1/10 Iteration 121/1780 Training loss: 3.2295 0.1187 sec/batch\n", + "Epoch 1/10 Iteration 122/1780 Training loss: 3.2280 0.1229 sec/batch\n", + "Epoch 1/10 Iteration 123/1780 Training loss: 3.2264 0.1189 sec/batch\n", + "Epoch 1/10 Iteration 124/1780 Training loss: 3.2249 0.1207 sec/batch\n", + "Epoch 1/10 Iteration 125/1780 Training loss: 3.2232 0.1194 sec/batch\n", + "Epoch 1/10 Iteration 126/1780 Training loss: 3.2214 0.1226 sec/batch\n", + "Epoch 1/10 Iteration 127/1780 Training loss: 
3.2197 0.1201 sec/batch\n", + "Epoch 1/10 Iteration 128/1780 Training loss: 3.2181 0.1190 sec/batch\n", + "Epoch 1/10 Iteration 129/1780 Training loss: 3.2164 0.1223 sec/batch\n", + "Epoch 1/10 Iteration 130/1780 Training loss: 3.2148 0.1223 sec/batch\n", + "Epoch 1/10 Iteration 131/1780 Training loss: 3.2132 0.1215 sec/batch\n", + "Epoch 1/10 Iteration 132/1780 Training loss: 3.2114 0.1222 sec/batch\n", + "Epoch 1/10 Iteration 133/1780 Training loss: 3.2097 0.1211 sec/batch\n", + "Epoch 1/10 Iteration 134/1780 Training loss: 3.2079 0.1204 sec/batch\n", + "Epoch 1/10 Iteration 135/1780 Training loss: 3.2059 0.1228 sec/batch\n", + "Epoch 1/10 Iteration 136/1780 Training loss: 3.2039 0.1214 sec/batch\n", + "Epoch 1/10 Iteration 137/1780 Training loss: 3.2020 0.1199 sec/batch\n", + "Epoch 1/10 Iteration 138/1780 Training loss: 3.2000 0.1207 sec/batch\n", + "Epoch 1/10 Iteration 139/1780 Training loss: 3.1982 0.1205 sec/batch\n", + "Epoch 1/10 Iteration 140/1780 Training loss: 3.1961 0.1202 sec/batch\n", + "Epoch 1/10 Iteration 141/1780 Training loss: 3.1941 0.1209 sec/batch\n", + "Epoch 1/10 Iteration 142/1780 Training loss: 3.1921 0.1225 sec/batch\n", + "Epoch 1/10 Iteration 143/1780 Training loss: 3.1901 0.1191 sec/batch\n", + "Epoch 1/10 Iteration 144/1780 Training loss: 3.1880 0.1246 sec/batch\n", + "Epoch 1/10 Iteration 145/1780 Training loss: 3.1860 0.1200 sec/batch\n", + "Epoch 1/10 Iteration 146/1780 Training loss: 3.1840 0.1214 sec/batch\n", + "Epoch 1/10 Iteration 147/1780 Training loss: 3.1820 0.1289 sec/batch\n", + "Epoch 1/10 Iteration 148/1780 Training loss: 3.1800 0.1206 sec/batch\n", + "Epoch 1/10 Iteration 149/1780 Training loss: 3.1778 0.1210 sec/batch\n", + "Epoch 1/10 Iteration 150/1780 Training loss: 3.1756 0.1208 sec/batch\n", + "Epoch 1/10 Iteration 151/1780 Training loss: 3.1736 0.1197 sec/batch\n", + "Epoch 1/10 Iteration 152/1780 Training loss: 3.1716 0.1201 sec/batch\n", + "Epoch 1/10 Iteration 153/1780 Training loss: 3.1694 0.1216 
sec/batch\n", + "Epoch 1/10 Iteration 154/1780 Training loss: 3.1671 0.1206 sec/batch\n", + "Epoch 1/10 Iteration 155/1780 Training loss: 3.1648 0.1193 sec/batch\n", + "Epoch 1/10 Iteration 156/1780 Training loss: 3.1624 0.1201 sec/batch\n", + "Epoch 1/10 Iteration 157/1780 Training loss: 3.1599 0.1191 sec/batch\n", + "Epoch 1/10 Iteration 158/1780 Training loss: 3.1574 0.1211 sec/batch\n", + "Epoch 1/10 Iteration 159/1780 Training loss: 3.1548 0.1318 sec/batch\n", + "Epoch 1/10 Iteration 160/1780 Training loss: 3.1523 0.1204 sec/batch\n", + "Epoch 1/10 Iteration 161/1780 Training loss: 3.1498 0.1213 sec/batch\n", + "Epoch 1/10 Iteration 162/1780 Training loss: 3.1471 0.1204 sec/batch\n", + "Epoch 1/10 Iteration 163/1780 Training loss: 3.1446 0.1221 sec/batch\n", + "Epoch 1/10 Iteration 164/1780 Training loss: 3.1430 0.1203 sec/batch\n", + "Epoch 1/10 Iteration 165/1780 Training loss: 3.1411 0.1189 sec/batch\n", + "Epoch 1/10 Iteration 166/1780 Training loss: 3.1390 0.1221 sec/batch\n", + "Epoch 1/10 Iteration 167/1780 Training loss: 3.1367 0.1196 sec/batch\n", + "Epoch 1/10 Iteration 168/1780 Training loss: 3.1346 0.1224 sec/batch\n", + "Epoch 1/10 Iteration 169/1780 Training loss: 3.1325 0.1187 sec/batch\n", + "Epoch 1/10 Iteration 170/1780 Training loss: 3.1301 0.1226 sec/batch\n", + "Epoch 1/10 Iteration 171/1780 Training loss: 3.1278 0.1188 sec/batch\n", + "Epoch 1/10 Iteration 172/1780 Training loss: 3.1258 0.1196 sec/batch\n", + "Epoch 1/10 Iteration 173/1780 Training loss: 3.1237 0.1192 sec/batch\n", + "Epoch 1/10 Iteration 174/1780 Training loss: 3.1215 0.1223 sec/batch\n", + "Epoch 1/10 Iteration 175/1780 Training loss: 3.1193 0.1186 sec/batch\n", + "Epoch 1/10 Iteration 176/1780 Training loss: 3.1179 0.1208 sec/batch\n", + "Epoch 1/10 Iteration 177/1780 Training loss: 3.1162 0.1187 sec/batch\n", + "Epoch 1/10 Iteration 178/1780 Training loss: 3.1137 0.1232 sec/batch\n", + "Epoch 2/10 Iteration 179/1780 Training loss: 2.6953 0.1210 sec/batch\n", + "Epoch 
2/10 Iteration 180/1780 Training loss: 2.6538 0.1232 sec/batch\n", + "Epoch 2/10 Iteration 181/1780 Training loss: 2.6371 0.1197 sec/batch\n", + "Epoch 2/10 Iteration 182/1780 Training loss: 2.6328 0.1235 sec/batch\n", + "Epoch 2/10 Iteration 183/1780 Training loss: 2.6298 0.1185 sec/batch\n", + "Epoch 2/10 Iteration 184/1780 Training loss: 2.6251 0.1227 sec/batch\n", + "Epoch 2/10 Iteration 185/1780 Training loss: 2.6222 0.1192 sec/batch\n", + "Epoch 2/10 Iteration 186/1780 Training loss: 2.6206 0.1228 sec/batch\n", + "Epoch 2/10 Iteration 187/1780 Training loss: 2.6176 0.1232 sec/batch\n", + "Epoch 2/10 Iteration 188/1780 Training loss: 2.6138 0.1206 sec/batch\n", + "Epoch 2/10 Iteration 189/1780 Training loss: 2.6088 0.1204 sec/batch\n", + "Epoch 2/10 Iteration 190/1780 Training loss: 2.6067 0.1209 sec/batch\n", + "Epoch 2/10 Iteration 191/1780 Training loss: 2.6035 0.1196 sec/batch\n", + "Epoch 2/10 Iteration 192/1780 Training loss: 2.6023 0.1203 sec/batch\n", + "Epoch 2/10 Iteration 193/1780 Training loss: 2.5985 0.1229 sec/batch\n", + "Epoch 2/10 Iteration 194/1780 Training loss: 2.5957 0.1262 sec/batch\n", + "Epoch 2/10 Iteration 195/1780 Training loss: 2.5928 0.1223 sec/batch\n", + "Epoch 2/10 Iteration 196/1780 Training loss: 2.5922 0.1223 sec/batch\n", + "Epoch 2/10 Iteration 197/1780 Training loss: 2.5893 0.1192 sec/batch\n", + "Epoch 2/10 Iteration 198/1780 Training loss: 2.5853 0.1222 sec/batch\n", + "Epoch 2/10 Iteration 199/1780 Training loss: 2.5819 0.1228 sec/batch\n", + "Epoch 2/10 Iteration 200/1780 Training loss: 2.5808 0.1213 sec/batch\n", + "Validation loss: 2.43305 Saving checkpoint!\n", + "Epoch 2/10 Iteration 201/1780 Training loss: 2.5788 0.1208 sec/batch\n", + "Epoch 2/10 Iteration 202/1780 Training loss: 2.5758 0.1206 sec/batch\n", + "Epoch 2/10 Iteration 203/1780 Training loss: 2.5726 0.1197 sec/batch\n", + "Epoch 2/10 Iteration 204/1780 Training loss: 2.5701 0.1203 sec/batch\n", + "Epoch 2/10 Iteration 205/1780 Training loss: 2.5674 
0.1191 sec/batch\n", + "Epoch 2/10 Iteration 206/1780 Training loss: 2.5649 0.1218 sec/batch\n", + "Epoch 2/10 Iteration 207/1780 Training loss: 2.5627 0.1205 sec/batch\n", + "Epoch 2/10 Iteration 208/1780 Training loss: 2.5605 0.1194 sec/batch\n", + "Epoch 2/10 Iteration 209/1780 Training loss: 2.5589 0.1231 sec/batch\n", + "Epoch 2/10 Iteration 210/1780 Training loss: 2.5562 0.1208 sec/batch\n", + "Epoch 2/10 Iteration 211/1780 Training loss: 2.5533 0.1237 sec/batch\n", + "Epoch 2/10 Iteration 212/1780 Training loss: 2.5509 0.1243 sec/batch\n", + "Epoch 2/10 Iteration 213/1780 Training loss: 2.5486 0.1192 sec/batch\n", + "Epoch 2/10 Iteration 214/1780 Training loss: 2.5464 0.1218 sec/batch\n", + "Epoch 2/10 Iteration 215/1780 Training loss: 2.5440 0.1228 sec/batch\n", + "Epoch 2/10 Iteration 216/1780 Training loss: 2.5412 0.1224 sec/batch\n", + "Epoch 2/10 Iteration 217/1780 Training loss: 2.5388 0.1195 sec/batch\n", + "Epoch 2/10 Iteration 218/1780 Training loss: 2.5362 0.1229 sec/batch\n", + "Epoch 2/10 Iteration 219/1780 Training loss: 2.5336 0.1219 sec/batch\n", + "Epoch 2/10 Iteration 220/1780 Training loss: 2.5310 0.1241 sec/batch\n", + "Epoch 2/10 Iteration 221/1780 Training loss: 2.5286 0.1194 sec/batch\n", + "Epoch 2/10 Iteration 222/1780 Training loss: 2.5260 0.1209 sec/batch\n", + "Epoch 2/10 Iteration 223/1780 Training loss: 2.5238 0.1195 sec/batch\n", + "Epoch 2/10 Iteration 224/1780 Training loss: 2.5209 0.1212 sec/batch\n", + "Epoch 2/10 Iteration 225/1780 Training loss: 2.5193 0.1191 sec/batch\n", + "Epoch 2/10 Iteration 226/1780 Training loss: 2.5171 0.1196 sec/batch\n", + "Epoch 2/10 Iteration 227/1780 Training loss: 2.5150 0.1202 sec/batch\n", + "Epoch 2/10 Iteration 228/1780 Training loss: 2.5135 0.1234 sec/batch\n", + "Epoch 2/10 Iteration 229/1780 Training loss: 2.5115 0.1213 sec/batch\n", + "Epoch 2/10 Iteration 230/1780 Training loss: 2.5097 0.1203 sec/batch\n", + "Epoch 2/10 Iteration 231/1780 Training loss: 2.5077 0.1210 sec/batch\n", + 
"Epoch 2/10 Iteration 232/1780 Training loss: 2.5057 0.1202 sec/batch\n", + "Epoch 2/10 Iteration 233/1780 Training loss: 2.5035 0.1194 sec/batch\n", + "Epoch 2/10 Iteration 234/1780 Training loss: 2.5019 0.1208 sec/batch\n", + "Epoch 2/10 Iteration 235/1780 Training loss: 2.5001 0.1209 sec/batch\n", + "Epoch 2/10 Iteration 236/1780 Training loss: 2.4982 0.1326 sec/batch\n", + "Epoch 2/10 Iteration 237/1780 Training loss: 2.4963 0.1190 sec/batch\n", + "Epoch 2/10 Iteration 238/1780 Training loss: 2.4948 0.1222 sec/batch\n", + "Epoch 2/10 Iteration 239/1780 Training loss: 2.4930 0.1195 sec/batch\n", + "Epoch 2/10 Iteration 240/1780 Training loss: 2.4915 0.1190 sec/batch\n", + "Epoch 2/10 Iteration 241/1780 Training loss: 2.4902 0.1215 sec/batch\n", + "Epoch 2/10 Iteration 242/1780 Training loss: 2.4885 0.1208 sec/batch\n", + "Epoch 2/10 Iteration 243/1780 Training loss: 2.4867 0.1213 sec/batch\n", + "Epoch 2/10 Iteration 244/1780 Training loss: 2.4853 0.1208 sec/batch\n", + "Epoch 2/10 Iteration 245/1780 Training loss: 2.4836 0.1193 sec/batch\n", + "Epoch 2/10 Iteration 246/1780 Training loss: 2.4816 0.1196 sec/batch\n", + "Epoch 2/10 Iteration 247/1780 Training loss: 2.4796 0.1220 sec/batch\n", + "Epoch 2/10 Iteration 248/1780 Training loss: 2.4781 0.1227 sec/batch\n", + "Epoch 2/10 Iteration 249/1780 Training loss: 2.4767 0.1215 sec/batch\n", + "Epoch 2/10 Iteration 250/1780 Training loss: 2.4754 0.1240 sec/batch\n", + "Epoch 2/10 Iteration 251/1780 Training loss: 2.4740 0.1215 sec/batch\n", + "Epoch 2/10 Iteration 252/1780 Training loss: 2.4723 0.1198 sec/batch\n", + "Epoch 2/10 Iteration 253/1780 Training loss: 2.4707 0.1199 sec/batch\n", + "Epoch 2/10 Iteration 254/1780 Training loss: 2.4696 0.1210 sec/batch\n", + "Epoch 2/10 Iteration 255/1780 Training loss: 2.4681 0.1215 sec/batch\n", + "Epoch 2/10 Iteration 256/1780 Training loss: 2.4667 0.1201 sec/batch\n", + "Epoch 2/10 Iteration 257/1780 Training loss: 2.4651 0.1189 sec/batch\n", + "Epoch 2/10 Iteration 
258/1780 Training loss: 2.4635 0.1210 sec/batch\n",
+ "Epoch 2/10 Iteration 259/1780 Training loss: 2.4619 0.1193 sec/batch\n",
+ "...\n",
+ "Epoch 2/10 Iteration 300/1780 Training loss: 2.4054 0.1198 sec/batch\n",
+ "Validation loss: 2.16109 Saving checkpoint!\n",
+ "...\n",
+ "Epoch 2/10 Iteration 356/1780 Training loss: 2.3417 0.1208 sec/batch\n",
+ "Epoch 3/10 Iteration 357/1780 Training loss: 2.2072 0.1212 sec/batch\n",
+ "...\n",
+ "Epoch 3/10 Iteration 400/1780 Training loss: 2.1093 0.1228 sec/batch\n",
+ "Validation loss: 1.97191 Saving checkpoint!\n",
+ "...\n",
+ "Epoch 3/10 Iteration 500/1780 Training loss: 2.0337 0.1215 sec/batch\n",
+ "Validation loss: 1.84066 Saving checkpoint!\n",
+ "...\n",
+ "Epoch 3/10 Iteration 534/1780 Training loss: 2.0141 0.1202 sec/batch\n",
+ "Epoch 4/10 Iteration 535/1780 Training loss: 1.9760 0.1208 sec/batch\n",
+ "...\n",
+ "Epoch 4/10 Iteration 600/1780 Training loss: 1.8718 0.1195 sec/batch\n",
+ "Validation loss: 1.73093 Saving checkpoint!\n",
+ "...\n",
+ "Epoch 4/10 Iteration 700/1780 Training loss: 1.8243 0.1202 sec/batch\n",
+ "Validation loss: 1.65231 Saving checkpoint!\n",
+ "...\n",
+ "Epoch 4/10 Iteration 712/1780 Training loss: 1.8206 0.1209 sec/batch\n",
+ "Epoch 5/10 Iteration 713/1780 Training loss: 1.8258 0.1203 sec/batch\n",
+ "...\n",
+ "Epoch 5/10 Iteration 753/1780 Training loss: 1.7311 0.1234 sec/batch\n",
+ "Epoch 
5/10 Iteration 754/1780 Training loss: 1.7316 0.1203 sec/batch\n", + "Epoch 5/10 Iteration 755/1780 Training loss: 1.7307 0.1209 sec/batch\n", + "Epoch 5/10 Iteration 756/1780 Training loss: 1.7298 0.1219 sec/batch\n", + "Epoch 5/10 Iteration 757/1780 Training loss: 1.7298 0.1231 sec/batch\n", + "Epoch 5/10 Iteration 758/1780 Training loss: 1.7289 0.1212 sec/batch\n", + "Epoch 5/10 Iteration 759/1780 Training loss: 1.7284 0.1206 sec/batch\n", + "Epoch 5/10 Iteration 760/1780 Training loss: 1.7277 0.1196 sec/batch\n", + "Epoch 5/10 Iteration 761/1780 Training loss: 1.7273 0.1217 sec/batch\n", + "Epoch 5/10 Iteration 762/1780 Training loss: 1.7279 0.1209 sec/batch\n", + "Epoch 5/10 Iteration 763/1780 Training loss: 1.7271 0.1225 sec/batch\n", + "Epoch 5/10 Iteration 764/1780 Training loss: 1.7277 0.1207 sec/batch\n", + "Epoch 5/10 Iteration 765/1780 Training loss: 1.7274 0.1227 sec/batch\n", + "Epoch 5/10 Iteration 766/1780 Training loss: 1.7272 0.1203 sec/batch\n", + "Epoch 5/10 Iteration 767/1780 Training loss: 1.7266 0.1216 sec/batch\n", + "Epoch 5/10 Iteration 768/1780 Training loss: 1.7263 0.1207 sec/batch\n", + "Epoch 5/10 Iteration 769/1780 Training loss: 1.7264 0.1211 sec/batch\n", + "Epoch 5/10 Iteration 770/1780 Training loss: 1.7259 0.1199 sec/batch\n", + "Epoch 5/10 Iteration 771/1780 Training loss: 1.7252 0.1231 sec/batch\n", + "Epoch 5/10 Iteration 772/1780 Training loss: 1.7255 0.1210 sec/batch\n", + "Epoch 5/10 Iteration 773/1780 Training loss: 1.7252 0.1195 sec/batch\n", + "Epoch 5/10 Iteration 774/1780 Training loss: 1.7257 0.1259 sec/batch\n", + "Epoch 5/10 Iteration 775/1780 Training loss: 1.7260 0.1205 sec/batch\n", + "Epoch 5/10 Iteration 776/1780 Training loss: 1.7263 0.1208 sec/batch\n", + "Epoch 5/10 Iteration 777/1780 Training loss: 1.7260 0.1231 sec/batch\n", + "Epoch 5/10 Iteration 778/1780 Training loss: 1.7261 0.1212 sec/batch\n", + "Epoch 5/10 Iteration 779/1780 Training loss: 1.7263 0.1223 sec/batch\n", + "Epoch 5/10 Iteration 780/1780 
Training loss: 1.7258 0.1207 sec/batch\n", + "Epoch 5/10 Iteration 781/1780 Training loss: 1.7255 0.1205 sec/batch\n", + "Epoch 5/10 Iteration 782/1780 Training loss: 1.7252 0.1242 sec/batch\n", + "Epoch 5/10 Iteration 783/1780 Training loss: 1.7255 0.1230 sec/batch\n", + "Epoch 5/10 Iteration 784/1780 Training loss: 1.7254 0.1201 sec/batch\n", + "Epoch 5/10 Iteration 785/1780 Training loss: 1.7255 0.1203 sec/batch\n", + "Epoch 5/10 Iteration 786/1780 Training loss: 1.7250 0.1259 sec/batch\n", + "Epoch 5/10 Iteration 787/1780 Training loss: 1.7245 0.1374 sec/batch\n", + "Epoch 5/10 Iteration 788/1780 Training loss: 1.7246 0.1230 sec/batch\n", + "Epoch 5/10 Iteration 789/1780 Training loss: 1.7241 0.1200 sec/batch\n", + "Epoch 5/10 Iteration 790/1780 Training loss: 1.7240 0.1225 sec/batch\n", + "Epoch 5/10 Iteration 791/1780 Training loss: 1.7233 0.1212 sec/batch\n", + "Epoch 5/10 Iteration 792/1780 Training loss: 1.7228 0.1208 sec/batch\n", + "Epoch 5/10 Iteration 793/1780 Training loss: 1.7221 0.1241 sec/batch\n", + "Epoch 5/10 Iteration 794/1780 Training loss: 1.7219 0.1229 sec/batch\n", + "Epoch 5/10 Iteration 795/1780 Training loss: 1.7211 0.1293 sec/batch\n", + "Epoch 5/10 Iteration 796/1780 Training loss: 1.7208 0.1219 sec/batch\n", + "Epoch 5/10 Iteration 797/1780 Training loss: 1.7201 0.1216 sec/batch\n", + "Epoch 5/10 Iteration 798/1780 Training loss: 1.7196 0.1230 sec/batch\n", + "Epoch 5/10 Iteration 799/1780 Training loss: 1.7191 0.1198 sec/batch\n", + "Epoch 5/10 Iteration 800/1780 Training loss: 1.7186 0.1205 sec/batch\n", + "Validation loss: 1.57561 Saving checkpoint!\n", + "Epoch 5/10 Iteration 801/1780 Training loss: 1.7186 0.1210 sec/batch\n", + "Epoch 5/10 Iteration 802/1780 Training loss: 1.7185 0.1207 sec/batch\n", + "Epoch 5/10 Iteration 803/1780 Training loss: 1.7179 0.1223 sec/batch\n", + "Epoch 5/10 Iteration 804/1780 Training loss: 1.7174 0.1230 sec/batch\n", + "Epoch 5/10 Iteration 805/1780 Training loss: 1.7167 0.1220 sec/batch\n", + 
"Epoch 5/10 Iteration 806/1780 Training loss: 1.7161 0.1199 sec/batch\n", + "Epoch 5/10 Iteration 807/1780 Training loss: 1.7155 0.1211 sec/batch\n", + "Epoch 5/10 Iteration 808/1780 Training loss: 1.7151 0.1203 sec/batch\n", + "Epoch 5/10 Iteration 809/1780 Training loss: 1.7146 0.1232 sec/batch\n", + "Epoch 5/10 Iteration 810/1780 Training loss: 1.7139 0.1223 sec/batch\n", + "Epoch 5/10 Iteration 811/1780 Training loss: 1.7132 0.1211 sec/batch\n", + "Epoch 5/10 Iteration 812/1780 Training loss: 1.7124 0.1235 sec/batch\n", + "Epoch 5/10 Iteration 813/1780 Training loss: 1.7121 0.1200 sec/batch\n", + "Epoch 5/10 Iteration 814/1780 Training loss: 1.7117 0.1205 sec/batch\n", + "Epoch 5/10 Iteration 815/1780 Training loss: 1.7111 0.1223 sec/batch\n", + "Epoch 5/10 Iteration 816/1780 Training loss: 1.7107 0.1205 sec/batch\n", + "Epoch 5/10 Iteration 817/1780 Training loss: 1.7101 0.1216 sec/batch\n", + "Epoch 5/10 Iteration 818/1780 Training loss: 1.7097 0.1225 sec/batch\n", + "Epoch 5/10 Iteration 819/1780 Training loss: 1.7093 0.1204 sec/batch\n", + "Epoch 5/10 Iteration 820/1780 Training loss: 1.7089 0.1207 sec/batch\n", + "Epoch 5/10 Iteration 821/1780 Training loss: 1.7086 0.1228 sec/batch\n", + "Epoch 5/10 Iteration 822/1780 Training loss: 1.7084 0.1201 sec/batch\n", + "Epoch 5/10 Iteration 823/1780 Training loss: 1.7080 0.1211 sec/batch\n", + "Epoch 5/10 Iteration 824/1780 Training loss: 1.7075 0.1227 sec/batch\n", + "Epoch 5/10 Iteration 825/1780 Training loss: 1.7071 0.1207 sec/batch\n", + "Epoch 5/10 Iteration 826/1780 Training loss: 1.7067 0.1305 sec/batch\n", + "Epoch 5/10 Iteration 827/1780 Training loss: 1.7061 0.1222 sec/batch\n", + "Epoch 5/10 Iteration 828/1780 Training loss: 1.7056 0.1232 sec/batch\n", + "Epoch 5/10 Iteration 829/1780 Training loss: 1.7053 0.1210 sec/batch\n", + "Epoch 5/10 Iteration 830/1780 Training loss: 1.7049 0.1211 sec/batch\n", + "Epoch 5/10 Iteration 831/1780 Training loss: 1.7045 0.1220 sec/batch\n", + "Epoch 5/10 Iteration 
832/1780 Training loss: 1.7042 0.1217 sec/batch\n", + "Epoch 5/10 Iteration 833/1780 Training loss: 1.7038 0.1219 sec/batch\n", + "Epoch 5/10 Iteration 834/1780 Training loss: 1.7032 0.1204 sec/batch\n", + "Epoch 5/10 Iteration 835/1780 Training loss: 1.7026 0.1212 sec/batch\n", + "Epoch 5/10 Iteration 836/1780 Training loss: 1.7023 0.1233 sec/batch\n", + "Epoch 5/10 Iteration 837/1780 Training loss: 1.7020 0.1250 sec/batch\n", + "Epoch 5/10 Iteration 838/1780 Training loss: 1.7014 0.1192 sec/batch\n", + "Epoch 5/10 Iteration 839/1780 Training loss: 1.7012 0.1248 sec/batch\n", + "Epoch 5/10 Iteration 840/1780 Training loss: 1.7010 0.1203 sec/batch\n", + "Epoch 5/10 Iteration 841/1780 Training loss: 1.7006 0.1225 sec/batch\n", + "Epoch 5/10 Iteration 842/1780 Training loss: 1.7002 0.1236 sec/batch\n", + "Epoch 5/10 Iteration 843/1780 Training loss: 1.6995 0.1222 sec/batch\n", + "Epoch 5/10 Iteration 844/1780 Training loss: 1.6990 0.1244 sec/batch\n", + "Epoch 5/10 Iteration 845/1780 Training loss: 1.6988 0.1213 sec/batch\n", + "Epoch 5/10 Iteration 846/1780 Training loss: 1.6986 0.1207 sec/batch\n", + "Epoch 5/10 Iteration 847/1780 Training loss: 1.6984 0.1214 sec/batch\n", + "Epoch 5/10 Iteration 848/1780 Training loss: 1.6983 0.1206 sec/batch\n", + "Epoch 5/10 Iteration 849/1780 Training loss: 1.6981 0.1198 sec/batch\n", + "Epoch 5/10 Iteration 850/1780 Training loss: 1.6979 0.1218 sec/batch\n", + "Epoch 5/10 Iteration 851/1780 Training loss: 1.6978 0.1207 sec/batch\n", + "Epoch 5/10 Iteration 852/1780 Training loss: 1.6975 0.1204 sec/batch\n", + "Epoch 5/10 Iteration 853/1780 Training loss: 1.6975 0.1233 sec/batch\n", + "Epoch 5/10 Iteration 854/1780 Training loss: 1.6972 0.1210 sec/batch\n", + "Epoch 5/10 Iteration 855/1780 Training loss: 1.6969 0.1209 sec/batch\n", + "Epoch 5/10 Iteration 856/1780 Training loss: 1.6968 0.1205 sec/batch\n", + "Epoch 5/10 Iteration 857/1780 Training loss: 1.6964 0.1226 sec/batch\n", + "Epoch 5/10 Iteration 858/1780 Training loss: 
1.6962 0.1232 sec/batch\n", + "Epoch 5/10 Iteration 859/1780 Training loss: 1.6959 0.1227 sec/batch\n", + "Epoch 5/10 Iteration 860/1780 Training loss: 1.6959 0.1222 sec/batch\n", + "Epoch 5/10 Iteration 861/1780 Training loss: 1.6957 0.1203 sec/batch\n", + "Epoch 5/10 Iteration 862/1780 Training loss: 1.6953 0.1207 sec/batch\n", + "Epoch 5/10 Iteration 863/1780 Training loss: 1.6949 0.1236 sec/batch\n", + "Epoch 5/10 Iteration 864/1780 Training loss: 1.6946 0.1209 sec/batch\n", + "Epoch 5/10 Iteration 865/1780 Training loss: 1.6944 0.1202 sec/batch\n", + "Epoch 5/10 Iteration 866/1780 Training loss: 1.6942 0.1209 sec/batch\n", + "Epoch 5/10 Iteration 867/1780 Training loss: 1.6939 0.1204 sec/batch\n", + "Epoch 5/10 Iteration 868/1780 Training loss: 1.6936 0.1210 sec/batch\n", + "Epoch 5/10 Iteration 869/1780 Training loss: 1.6934 0.1235 sec/batch\n", + "Epoch 5/10 Iteration 870/1780 Training loss: 1.6931 0.1229 sec/batch\n", + "Epoch 5/10 Iteration 871/1780 Training loss: 1.6926 0.1198 sec/batch\n", + "Epoch 5/10 Iteration 872/1780 Training loss: 1.6925 0.1215 sec/batch\n", + "Epoch 5/10 Iteration 873/1780 Training loss: 1.6924 0.1214 sec/batch\n", + "Epoch 5/10 Iteration 874/1780 Training loss: 1.6922 0.1201 sec/batch\n", + "Epoch 5/10 Iteration 875/1780 Training loss: 1.6919 0.1214 sec/batch\n", + "Epoch 5/10 Iteration 876/1780 Training loss: 1.6917 0.1223 sec/batch\n", + "Epoch 5/10 Iteration 877/1780 Training loss: 1.6914 0.1232 sec/batch\n", + "Epoch 5/10 Iteration 878/1780 Training loss: 1.6911 0.1215 sec/batch\n", + "Epoch 5/10 Iteration 879/1780 Training loss: 1.6909 0.1216 sec/batch\n", + "Epoch 5/10 Iteration 880/1780 Training loss: 1.6911 0.1222 sec/batch\n", + "Epoch 5/10 Iteration 881/1780 Training loss: 1.6908 0.1216 sec/batch\n", + "Epoch 5/10 Iteration 882/1780 Training loss: 1.6905 0.1212 sec/batch\n", + "Epoch 5/10 Iteration 883/1780 Training loss: 1.6902 0.1205 sec/batch\n", + "Epoch 5/10 Iteration 884/1780 Training loss: 1.6898 0.1205 
sec/batch\n", + "Epoch 5/10 Iteration 885/1780 Training loss: 1.6896 0.1211 sec/batch\n", + "Epoch 5/10 Iteration 886/1780 Training loss: 1.6894 0.1205 sec/batch\n", + "Epoch 5/10 Iteration 887/1780 Training loss: 1.6893 0.1241 sec/batch\n", + "Epoch 5/10 Iteration 888/1780 Training loss: 1.6889 0.1212 sec/batch\n", + "Epoch 5/10 Iteration 889/1780 Training loss: 1.6885 0.1205 sec/batch\n", + "Epoch 5/10 Iteration 890/1780 Training loss: 1.6883 0.1202 sec/batch\n", + "Epoch 6/10 Iteration 891/1780 Training loss: 1.7285 0.1223 sec/batch\n", + "Epoch 6/10 Iteration 892/1780 Training loss: 1.6840 0.1218 sec/batch\n", + "Epoch 6/10 Iteration 893/1780 Training loss: 1.6686 0.1203 sec/batch\n", + "Epoch 6/10 Iteration 894/1780 Training loss: 1.6615 0.1204 sec/batch\n", + "Epoch 6/10 Iteration 895/1780 Training loss: 1.6549 0.1203 sec/batch\n", + "Epoch 6/10 Iteration 896/1780 Training loss: 1.6431 0.1222 sec/batch\n", + "Epoch 6/10 Iteration 897/1780 Training loss: 1.6436 0.1213 sec/batch\n", + "Epoch 6/10 Iteration 898/1780 Training loss: 1.6423 0.1211 sec/batch\n", + "Epoch 6/10 Iteration 899/1780 Training loss: 1.6439 0.1205 sec/batch\n", + "Epoch 6/10 Iteration 900/1780 Training loss: 1.6417 0.1204 sec/batch\n", + "Validation loss: 1.51374 Saving checkpoint!\n", + "Epoch 6/10 Iteration 901/1780 Training loss: 1.6435 0.1198 sec/batch\n", + "Epoch 6/10 Iteration 902/1780 Training loss: 1.6409 0.1217 sec/batch\n", + "Epoch 6/10 Iteration 903/1780 Training loss: 1.6401 0.1229 sec/batch\n", + "Epoch 6/10 Iteration 904/1780 Training loss: 1.6419 0.1198 sec/batch\n", + "Epoch 6/10 Iteration 905/1780 Training loss: 1.6410 0.1214 sec/batch\n", + "Epoch 6/10 Iteration 906/1780 Training loss: 1.6389 0.1208 sec/batch\n", + "Epoch 6/10 Iteration 907/1780 Training loss: 1.6384 0.1208 sec/batch\n", + "Epoch 6/10 Iteration 908/1780 Training loss: 1.6397 0.1241 sec/batch\n", + "Epoch 6/10 Iteration 909/1780 Training loss: 1.6398 0.1209 sec/batch\n", + "Epoch 6/10 Iteration 910/1780 
Training loss: 1.6401 0.1209 sec/batch\n", + "Epoch 6/10 Iteration 911/1780 Training loss: 1.6394 0.1210 sec/batch\n", + "Epoch 6/10 Iteration 912/1780 Training loss: 1.6401 0.1213 sec/batch\n", + "Epoch 6/10 Iteration 913/1780 Training loss: 1.6389 0.1220 sec/batch\n", + "Epoch 6/10 Iteration 914/1780 Training loss: 1.6386 0.1233 sec/batch\n", + "Epoch 6/10 Iteration 915/1780 Training loss: 1.6385 0.1227 sec/batch\n", + "Epoch 6/10 Iteration 916/1780 Training loss: 1.6368 0.1203 sec/batch\n", + "Epoch 6/10 Iteration 917/1780 Training loss: 1.6351 0.1212 sec/batch\n", + "Epoch 6/10 Iteration 918/1780 Training loss: 1.6350 0.1206 sec/batch\n", + "Epoch 6/10 Iteration 919/1780 Training loss: 1.6354 0.1205 sec/batch\n", + "Epoch 6/10 Iteration 920/1780 Training loss: 1.6355 0.1209 sec/batch\n", + "Epoch 6/10 Iteration 921/1780 Training loss: 1.6349 0.1202 sec/batch\n", + "Epoch 6/10 Iteration 922/1780 Training loss: 1.6336 0.1199 sec/batch\n", + "Epoch 6/10 Iteration 923/1780 Training loss: 1.6337 0.1205 sec/batch\n", + "Epoch 6/10 Iteration 924/1780 Training loss: 1.6339 0.1198 sec/batch\n", + "Epoch 6/10 Iteration 925/1780 Training loss: 1.6333 0.1222 sec/batch\n", + "Epoch 6/10 Iteration 926/1780 Training loss: 1.6328 0.1208 sec/batch\n", + "Epoch 6/10 Iteration 927/1780 Training loss: 1.6318 0.1206 sec/batch\n", + "Epoch 6/10 Iteration 928/1780 Training loss: 1.6305 0.1246 sec/batch\n", + "Epoch 6/10 Iteration 929/1780 Training loss: 1.6290 0.1223 sec/batch\n", + "Epoch 6/10 Iteration 930/1780 Training loss: 1.6284 0.1204 sec/batch\n", + "Epoch 6/10 Iteration 931/1780 Training loss: 1.6278 0.1215 sec/batch\n", + "Epoch 6/10 Iteration 932/1780 Training loss: 1.6279 0.1207 sec/batch\n", + "Epoch 6/10 Iteration 933/1780 Training loss: 1.6270 0.1205 sec/batch\n", + "Epoch 6/10 Iteration 934/1780 Training loss: 1.6259 0.1233 sec/batch\n", + "Epoch 6/10 Iteration 935/1780 Training loss: 1.6258 0.1209 sec/batch\n", + "Epoch 6/10 Iteration 936/1780 Training loss: 1.6249 
0.1227 sec/batch\n", + "Epoch 6/10 Iteration 937/1780 Training loss: 1.6244 0.1219 sec/batch\n", + "Epoch 6/10 Iteration 938/1780 Training loss: 1.6237 0.1207 sec/batch\n", + "Epoch 6/10 Iteration 939/1780 Training loss: 1.6229 0.1205 sec/batch\n", + "Epoch 6/10 Iteration 940/1780 Training loss: 1.6233 0.1218 sec/batch\n", + "Epoch 6/10 Iteration 941/1780 Training loss: 1.6226 0.1203 sec/batch\n", + "Epoch 6/10 Iteration 942/1780 Training loss: 1.6233 0.1199 sec/batch\n", + "Epoch 6/10 Iteration 943/1780 Training loss: 1.6230 0.1208 sec/batch\n", + "Epoch 6/10 Iteration 944/1780 Training loss: 1.6229 0.1203 sec/batch\n", + "Epoch 6/10 Iteration 945/1780 Training loss: 1.6224 0.1228 sec/batch\n", + "Epoch 6/10 Iteration 946/1780 Training loss: 1.6224 0.1207 sec/batch\n", + "Epoch 6/10 Iteration 947/1780 Training loss: 1.6225 0.1222 sec/batch\n", + "Epoch 6/10 Iteration 948/1780 Training loss: 1.6218 0.1209 sec/batch\n", + "Epoch 6/10 Iteration 949/1780 Training loss: 1.6210 0.1236 sec/batch\n", + "Epoch 6/10 Iteration 950/1780 Training loss: 1.6212 0.1225 sec/batch\n", + "Epoch 6/10 Iteration 951/1780 Training loss: 1.6212 0.1220 sec/batch\n", + "Epoch 6/10 Iteration 952/1780 Training loss: 1.6218 0.1220 sec/batch\n", + "Epoch 6/10 Iteration 953/1780 Training loss: 1.6221 0.1199 sec/batch\n", + "Epoch 6/10 Iteration 954/1780 Training loss: 1.6221 0.1196 sec/batch\n", + "Epoch 6/10 Iteration 955/1780 Training loss: 1.6219 0.1219 sec/batch\n", + "Epoch 6/10 Iteration 956/1780 Training loss: 1.6220 0.1207 sec/batch\n", + "Epoch 6/10 Iteration 957/1780 Training loss: 1.6221 0.1216 sec/batch\n", + "Epoch 6/10 Iteration 958/1780 Training loss: 1.6217 0.1203 sec/batch\n", + "Epoch 6/10 Iteration 959/1780 Training loss: 1.6216 0.1221 sec/batch\n", + "Epoch 6/10 Iteration 960/1780 Training loss: 1.6213 0.1203 sec/batch\n", + "Epoch 6/10 Iteration 961/1780 Training loss: 1.6217 0.1245 sec/batch\n", + "Epoch 6/10 Iteration 962/1780 Training loss: 1.6217 0.1221 sec/batch\n", + 
"Epoch 6/10 Iteration 963/1780 Training loss: 1.6219 0.1204 sec/batch\n", + "Epoch 6/10 Iteration 964/1780 Training loss: 1.6214 0.1210 sec/batch\n", + "Epoch 6/10 Iteration 965/1780 Training loss: 1.6211 0.1251 sec/batch\n", + "Epoch 6/10 Iteration 966/1780 Training loss: 1.6212 0.1232 sec/batch\n", + "Epoch 6/10 Iteration 967/1780 Training loss: 1.6208 0.1242 sec/batch\n", + "Epoch 6/10 Iteration 968/1780 Training loss: 1.6206 0.1223 sec/batch\n", + "Epoch 6/10 Iteration 969/1780 Training loss: 1.6198 0.1259 sec/batch\n", + "Epoch 6/10 Iteration 970/1780 Training loss: 1.6196 0.1220 sec/batch\n", + "Epoch 6/10 Iteration 971/1780 Training loss: 1.6189 0.1229 sec/batch\n", + "Epoch 6/10 Iteration 972/1780 Training loss: 1.6187 0.1230 sec/batch\n", + "Epoch 6/10 Iteration 973/1780 Training loss: 1.6180 0.1245 sec/batch\n", + "Epoch 6/10 Iteration 974/1780 Training loss: 1.6177 0.1251 sec/batch\n", + "Epoch 6/10 Iteration 975/1780 Training loss: 1.6173 0.1277 sec/batch\n", + "Epoch 6/10 Iteration 976/1780 Training loss: 1.6168 0.1243 sec/batch\n", + "Epoch 6/10 Iteration 977/1780 Training loss: 1.6163 0.1280 sec/batch\n", + "Epoch 6/10 Iteration 978/1780 Training loss: 1.6159 0.1237 sec/batch\n", + "Epoch 6/10 Iteration 979/1780 Training loss: 1.6152 0.1262 sec/batch\n", + "Epoch 6/10 Iteration 980/1780 Training loss: 1.6152 0.1246 sec/batch\n", + "Epoch 6/10 Iteration 981/1780 Training loss: 1.6148 0.1235 sec/batch\n", + "Epoch 6/10 Iteration 982/1780 Training loss: 1.6144 0.1269 sec/batch\n", + "Epoch 6/10 Iteration 983/1780 Training loss: 1.6138 0.1269 sec/batch\n", + "Epoch 6/10 Iteration 984/1780 Training loss: 1.6133 0.1274 sec/batch\n", + "Epoch 6/10 Iteration 985/1780 Training loss: 1.6127 0.1229 sec/batch\n", + "Epoch 6/10 Iteration 986/1780 Training loss: 1.6125 0.1256 sec/batch\n", + "Epoch 6/10 Iteration 987/1780 Training loss: 1.6123 0.1259 sec/batch\n", + "Epoch 6/10 Iteration 988/1780 Training loss: 1.6117 0.1219 sec/batch\n", + "Epoch 6/10 Iteration 
989/1780 Training loss: 1.6112 0.1202 sec/batch\n", + "Epoch 6/10 Iteration 990/1780 Training loss: 1.6104 0.1220 sec/batch\n", + "Epoch 6/10 Iteration 991/1780 Training loss: 1.6103 0.1203 sec/batch\n", + "Epoch 6/10 Iteration 992/1780 Training loss: 1.6100 0.1209 sec/batch\n", + "Epoch 6/10 Iteration 993/1780 Training loss: 1.6096 0.1252 sec/batch\n", + "Epoch 6/10 Iteration 994/1780 Training loss: 1.6092 0.1334 sec/batch\n", + "Epoch 6/10 Iteration 995/1780 Training loss: 1.6089 0.1243 sec/batch\n", + "Epoch 6/10 Iteration 996/1780 Training loss: 1.6086 0.1249 sec/batch\n", + "Epoch 6/10 Iteration 997/1780 Training loss: 1.6084 0.1239 sec/batch\n", + "Epoch 6/10 Iteration 998/1780 Training loss: 1.6081 0.1253 sec/batch\n", + "Epoch 6/10 Iteration 999/1780 Training loss: 1.6079 0.1252 sec/batch\n", + "Epoch 6/10 Iteration 1000/1780 Training loss: 1.6078 0.1274 sec/batch\n", + "Validation loss: 1.46721 Saving checkpoint!\n", + "Epoch 6/10 Iteration 1001/1780 Training loss: 1.6081 0.1217 sec/batch\n", + "Epoch 6/10 Iteration 1002/1780 Training loss: 1.6078 0.1249 sec/batch\n", + "Epoch 6/10 Iteration 1003/1780 Training loss: 1.6076 0.1228 sec/batch\n", + "Epoch 6/10 Iteration 1004/1780 Training loss: 1.6073 0.1246 sec/batch\n", + "Epoch 6/10 Iteration 1005/1780 Training loss: 1.6069 0.1240 sec/batch\n", + "Epoch 6/10 Iteration 1006/1780 Training loss: 1.6063 0.1261 sec/batch\n", + "Epoch 6/10 Iteration 1007/1780 Training loss: 1.6061 0.1235 sec/batch\n", + "Epoch 6/10 Iteration 1008/1780 Training loss: 1.6059 0.1219 sec/batch\n", + "Epoch 6/10 Iteration 1009/1780 Training loss: 1.6055 0.1258 sec/batch\n", + "Epoch 6/10 Iteration 1010/1780 Training loss: 1.6052 0.1300 sec/batch\n", + "Epoch 6/10 Iteration 1011/1780 Training loss: 1.6050 0.1216 sec/batch\n", + "Epoch 6/10 Iteration 1012/1780 Training loss: 1.6044 0.1244 sec/batch\n", + "Epoch 6/10 Iteration 1013/1780 Training loss: 1.6039 0.1259 sec/batch\n", + "Epoch 6/10 Iteration 1014/1780 Training loss: 1.6037 
0.1246 sec/batch\n", + "Epoch 6/10 Iteration 1015/1780 Training loss: 1.6034 0.1218 sec/batch\n", + "Epoch 6/10 Iteration 1016/1780 Training loss: 1.6029 0.1248 sec/batch\n", + "Epoch 6/10 Iteration 1017/1780 Training loss: 1.6028 0.1233 sec/batch\n", + "Epoch 6/10 Iteration 1018/1780 Training loss: 1.6026 0.1230 sec/batch\n", + "Epoch 6/10 Iteration 1019/1780 Training loss: 1.6023 0.1232 sec/batch\n", + "Epoch 6/10 Iteration 1020/1780 Training loss: 1.6019 0.1207 sec/batch\n", + "Epoch 6/10 Iteration 1021/1780 Training loss: 1.6013 0.1210 sec/batch\n", + "Epoch 6/10 Iteration 1022/1780 Training loss: 1.6009 0.1231 sec/batch\n", + "Epoch 6/10 Iteration 1023/1780 Training loss: 1.6008 0.1217 sec/batch\n", + "Epoch 6/10 Iteration 1024/1780 Training loss: 1.6006 0.1226 sec/batch\n", + "Epoch 6/10 Iteration 1025/1780 Training loss: 1.6005 0.1207 sec/batch\n", + "Epoch 6/10 Iteration 1026/1780 Training loss: 1.6003 0.1200 sec/batch\n", + "Epoch 6/10 Iteration 1027/1780 Training loss: 1.6003 0.1213 sec/batch\n", + "Epoch 6/10 Iteration 1028/1780 Training loss: 1.6002 0.1204 sec/batch\n", + "Epoch 6/10 Iteration 1029/1780 Training loss: 1.6001 0.1207 sec/batch\n", + "Epoch 6/10 Iteration 1030/1780 Training loss: 1.5998 0.1212 sec/batch\n", + "Epoch 6/10 Iteration 1031/1780 Training loss: 1.6000 0.1235 sec/batch\n", + "Epoch 6/10 Iteration 1032/1780 Training loss: 1.5998 0.1233 sec/batch\n", + "Epoch 6/10 Iteration 1033/1780 Training loss: 1.5995 0.1257 sec/batch\n", + "Epoch 6/10 Iteration 1034/1780 Training loss: 1.5996 0.1232 sec/batch\n", + "Epoch 6/10 Iteration 1035/1780 Training loss: 1.5993 0.1234 sec/batch\n", + "Epoch 6/10 Iteration 1036/1780 Training loss: 1.5992 0.1231 sec/batch\n", + "Epoch 6/10 Iteration 1037/1780 Training loss: 1.5990 0.1219 sec/batch\n", + "Epoch 6/10 Iteration 1038/1780 Training loss: 1.5990 0.1212 sec/batch\n", + "Epoch 6/10 Iteration 1039/1780 Training loss: 1.5988 0.1233 sec/batch\n", + "Epoch 6/10 Iteration 1040/1780 Training loss: 
1.5985 0.1216 sec/batch\n", + "Epoch 6/10 Iteration 1041/1780 Training loss: 1.5981 0.1213 sec/batch\n", + "Epoch 6/10 Iteration 1042/1780 Training loss: 1.5979 0.1204 sec/batch\n", + "Epoch 6/10 Iteration 1043/1780 Training loss: 1.5977 0.1201 sec/batch\n", + "Epoch 6/10 Iteration 1044/1780 Training loss: 1.5976 0.1202 sec/batch\n", + "Epoch 6/10 Iteration 1045/1780 Training loss: 1.5974 0.1203 sec/batch\n", + "Epoch 6/10 Iteration 1046/1780 Training loss: 1.5971 0.1194 sec/batch\n", + "Epoch 6/10 Iteration 1047/1780 Training loss: 1.5970 0.1269 sec/batch\n", + "Epoch 6/10 Iteration 1048/1780 Training loss: 1.5968 0.1310 sec/batch\n", + "Epoch 6/10 Iteration 1049/1780 Training loss: 1.5963 0.1288 sec/batch\n", + "Epoch 6/10 Iteration 1050/1780 Training loss: 1.5963 0.1232 sec/batch\n", + "Epoch 6/10 Iteration 1051/1780 Training loss: 1.5963 0.1242 sec/batch\n", + "Epoch 6/10 Iteration 1052/1780 Training loss: 1.5961 0.1242 sec/batch\n", + "Epoch 6/10 Iteration 1053/1780 Training loss: 1.5959 0.1205 sec/batch\n", + "Epoch 6/10 Iteration 1054/1780 Training loss: 1.5958 0.1194 sec/batch\n", + "Epoch 6/10 Iteration 1055/1780 Training loss: 1.5955 0.1237 sec/batch\n", + "Epoch 6/10 Iteration 1056/1780 Training loss: 1.5953 0.1210 sec/batch\n", + "Epoch 6/10 Iteration 1057/1780 Training loss: 1.5953 0.1208 sec/batch\n", + "Epoch 6/10 Iteration 1058/1780 Training loss: 1.5956 0.1224 sec/batch\n", + "Epoch 6/10 Iteration 1059/1780 Training loss: 1.5954 0.1305 sec/batch\n", + "Epoch 6/10 Iteration 1060/1780 Training loss: 1.5951 0.1221 sec/batch\n", + "Epoch 6/10 Iteration 1061/1780 Training loss: 1.5948 0.1207 sec/batch\n", + "Epoch 6/10 Iteration 1062/1780 Training loss: 1.5945 0.1215 sec/batch\n", + "Epoch 6/10 Iteration 1063/1780 Training loss: 1.5944 0.1195 sec/batch\n", + "Epoch 6/10 Iteration 1064/1780 Training loss: 1.5942 0.1209 sec/batch\n", + "Epoch 6/10 Iteration 1065/1780 Training loss: 1.5941 0.1220 sec/batch\n", + "Epoch 6/10 Iteration 1066/1780 Training 
loss: 1.5938 0.1227 sec/batch\n", + "Epoch 6/10 Iteration 1067/1780 Training loss: 1.5935 0.1240 sec/batch\n", + "Epoch 6/10 Iteration 1068/1780 Training loss: 1.5934 0.1240 sec/batch\n", + "Epoch 7/10 Iteration 1069/1780 Training loss: 1.6372 0.1212 sec/batch\n", + "Epoch 7/10 Iteration 1070/1780 Training loss: 1.5995 0.1223 sec/batch\n", + "Epoch 7/10 Iteration 1071/1780 Training loss: 1.5822 0.1204 sec/batch\n", + "Epoch 7/10 Iteration 1072/1780 Training loss: 1.5759 0.1216 sec/batch\n", + "Epoch 7/10 Iteration 1073/1780 Training loss: 1.5702 0.1213 sec/batch\n", + "Epoch 7/10 Iteration 1074/1780 Training loss: 1.5601 0.1211 sec/batch\n", + "Epoch 7/10 Iteration 1075/1780 Training loss: 1.5609 0.1205 sec/batch\n", + "Epoch 7/10 Iteration 1076/1780 Training loss: 1.5592 0.1235 sec/batch\n", + "Epoch 7/10 Iteration 1077/1780 Training loss: 1.5611 0.1202 sec/batch\n", + "Epoch 7/10 Iteration 1078/1780 Training loss: 1.5607 0.1200 sec/batch\n", + "Epoch 7/10 Iteration 1079/1780 Training loss: 1.5565 0.1236 sec/batch\n", + "Epoch 7/10 Iteration 1080/1780 Training loss: 1.5554 0.1219 sec/batch\n", + "Epoch 7/10 Iteration 1081/1780 Training loss: 1.5553 0.1225 sec/batch\n", + "Epoch 7/10 Iteration 1082/1780 Training loss: 1.5567 0.1216 sec/batch\n", + "Epoch 7/10 Iteration 1083/1780 Training loss: 1.5559 0.1224 sec/batch\n", + "Epoch 7/10 Iteration 1084/1780 Training loss: 1.5534 0.1221 sec/batch\n", + "Epoch 7/10 Iteration 1085/1780 Training loss: 1.5536 0.1237 sec/batch\n", + "Epoch 7/10 Iteration 1086/1780 Training loss: 1.5556 0.1209 sec/batch\n", + "Epoch 7/10 Iteration 1087/1780 Training loss: 1.5558 0.1226 sec/batch\n", + "Epoch 7/10 Iteration 1088/1780 Training loss: 1.5568 0.1211 sec/batch\n", + "Epoch 7/10 Iteration 1089/1780 Training loss: 1.5560 0.1229 sec/batch\n", + "Epoch 7/10 Iteration 1090/1780 Training loss: 1.5560 0.1223 sec/batch\n", + "Epoch 7/10 Iteration 1091/1780 Training loss: 1.5549 0.1239 sec/batch\n", + "Epoch 7/10 Iteration 1092/1780 
Training loss: 1.5546 0.1231 sec/batch\n", + "Epoch 7/10 Iteration 1093/1780 Training loss: 1.5543 0.1223 sec/batch\n", + "Epoch 7/10 Iteration 1094/1780 Training loss: 1.5528 0.1209 sec/batch\n", + "Epoch 7/10 Iteration 1095/1780 Training loss: 1.5511 0.1235 sec/batch\n", + "Epoch 7/10 Iteration 1096/1780 Training loss: 1.5513 0.1219 sec/batch\n", + "Epoch 7/10 Iteration 1097/1780 Training loss: 1.5517 0.1222 sec/batch\n", + "Epoch 7/10 Iteration 1098/1780 Training loss: 1.5517 0.1243 sec/batch\n", + "Epoch 7/10 Iteration 1099/1780 Training loss: 1.5511 0.1266 sec/batch\n", + "Epoch 7/10 Iteration 1100/1780 Training loss: 1.5501 0.1268 sec/batch\n", + "Validation loss: 1.43194 Saving checkpoint!\n", + "Epoch 7/10 Iteration 1101/1780 Training loss: 1.5527 0.1234 sec/batch\n", + "Epoch 7/10 Iteration 1102/1780 Training loss: 1.5528 0.1254 sec/batch\n", + "Epoch 7/10 Iteration 1103/1780 Training loss: 1.5527 0.1247 sec/batch\n", + "Epoch 7/10 Iteration 1104/1780 Training loss: 1.5524 0.1249 sec/batch\n", + "Epoch 7/10 Iteration 1105/1780 Training loss: 1.5513 0.1250 sec/batch\n", + "Epoch 7/10 Iteration 1106/1780 Training loss: 1.5504 0.1252 sec/batch\n", + "Epoch 7/10 Iteration 1107/1780 Training loss: 1.5489 0.1261 sec/batch\n", + "Epoch 7/10 Iteration 1108/1780 Training loss: 1.5482 0.1202 sec/batch\n", + "Epoch 7/10 Iteration 1109/1780 Training loss: 1.5475 0.1234 sec/batch\n", + "Epoch 7/10 Iteration 1110/1780 Training loss: 1.5480 0.1218 sec/batch\n", + "Epoch 7/10 Iteration 1111/1780 Training loss: 1.5474 0.1225 sec/batch\n", + "Epoch 7/10 Iteration 1112/1780 Training loss: 1.5466 0.1223 sec/batch\n", + "Epoch 7/10 Iteration 1113/1780 Training loss: 1.5469 0.1227 sec/batch\n", + "Epoch 7/10 Iteration 1114/1780 Training loss: 1.5458 0.1221 sec/batch\n", + "Epoch 7/10 Iteration 1115/1780 Training loss: 1.5455 0.1231 sec/batch\n", + "Epoch 7/10 Iteration 1116/1780 Training loss: 1.5449 0.1215 sec/batch\n", + "Epoch 7/10 Iteration 1117/1780 Training loss: 1.5446 
0.1237 sec/batch\n", + "Epoch 7/10 Iteration 1118/1780 Training loss: 1.5449 0.1218 sec/batch\n", + "Epoch 7/10 Iteration 1119/1780 Training loss: 1.5442 0.1245 sec/batch\n", + "Epoch 7/10 Iteration 1120/1780 Training loss: 1.5449 0.1219 sec/batch\n", + "Epoch 7/10 Iteration 1121/1780 Training loss: 1.5447 0.1229 sec/batch\n", + "Epoch 7/10 Iteration 1122/1780 Training loss: 1.5446 0.1212 sec/batch\n", + "Epoch 7/10 Iteration 1123/1780 Training loss: 1.5441 0.1248 sec/batch\n", + "Epoch 7/10 Iteration 1124/1780 Training loss: 1.5439 0.1228 sec/batch\n", + "Epoch 7/10 Iteration 1125/1780 Training loss: 1.5441 0.1267 sec/batch\n", + "Epoch 7/10 Iteration 1126/1780 Training loss: 1.5434 0.1204 sec/batch\n", + "Epoch 7/10 Iteration 1127/1780 Training loss: 1.5426 0.1258 sec/batch\n", + "Epoch 7/10 Iteration 1128/1780 Training loss: 1.5429 0.1223 sec/batch\n", + "Epoch 7/10 Iteration 1129/1780 Training loss: 1.5426 0.1226 sec/batch\n", + "Epoch 7/10 Iteration 1130/1780 Training loss: 1.5432 0.1215 sec/batch\n", + "Epoch 7/10 Iteration 1131/1780 Training loss: 1.5435 0.1222 sec/batch\n", + "Epoch 7/10 Iteration 1132/1780 Training loss: 1.5435 0.1218 sec/batch\n", + "Epoch 7/10 Iteration 1133/1780 Training loss: 1.5432 0.1226 sec/batch\n", + "Epoch 7/10 Iteration 1134/1780 Training loss: 1.5432 0.1231 sec/batch\n", + "Epoch 7/10 Iteration 1135/1780 Training loss: 1.5431 0.1242 sec/batch\n", + "Epoch 7/10 Iteration 1136/1780 Training loss: 1.5425 0.1238 sec/batch\n", + "Epoch 7/10 Iteration 1137/1780 Training loss: 1.5423 0.1258 sec/batch\n", + "Epoch 7/10 Iteration 1138/1780 Training loss: 1.5420 0.1209 sec/batch\n", + "Epoch 7/10 Iteration 1139/1780 Training loss: 1.5424 0.1254 sec/batch\n", + "Epoch 7/10 Iteration 1140/1780 Training loss: 1.5424 0.1215 sec/batch\n", + "Epoch 7/10 Iteration 1141/1780 Training loss: 1.5425 0.1297 sec/batch\n", + "Epoch 7/10 Iteration 1142/1780 Training loss: 1.5419 0.1234 sec/batch\n", + "Epoch 7/10 Iteration 1143/1780 Training loss: 
1.5414 0.1258 sec/batch\n", + "Epoch 7/10 Iteration 1144/1780 Training loss: 1.5414 0.1236 sec/batch\n", + "Epoch 7/10 Iteration 1145/1780 Training loss: 1.5409 0.1230 sec/batch\n", + "Epoch 7/10 Iteration 1146/1780 Training loss: 1.5407 0.1220 sec/batch\n", + "Epoch 7/10 Iteration 1147/1780 Training loss: 1.5398 0.1226 sec/batch\n", + "Epoch 7/10 Iteration 1148/1780 Training loss: 1.5395 0.1218 sec/batch\n", + "Epoch 7/10 Iteration 1149/1780 Training loss: 1.5387 0.1263 sec/batch\n", + "Epoch 7/10 Iteration 1150/1780 Training loss: 1.5385 0.1233 sec/batch\n", + "Epoch 7/10 Iteration 1151/1780 Training loss: 1.5378 0.1249 sec/batch\n", + "Epoch 7/10 Iteration 1152/1780 Training loss: 1.5376 0.1244 sec/batch\n", + "Epoch 7/10 Iteration 1153/1780 Training loss: 1.5370 0.1239 sec/batch\n", + "Epoch 7/10 Iteration 1154/1780 Training loss: 1.5367 0.1217 sec/batch\n", + "Epoch 7/10 Iteration 1155/1780 Training loss: 1.5362 0.1217 sec/batch\n", + "Epoch 7/10 Iteration 1156/1780 Training loss: 1.5357 0.1245 sec/batch\n", + "Epoch 7/10 Iteration 1157/1780 Training loss: 1.5351 0.1221 sec/batch\n", + "Epoch 7/10 Iteration 1158/1780 Training loss: 1.5350 0.1242 sec/batch\n", + "Epoch 7/10 Iteration 1159/1780 Training loss: 1.5345 0.1215 sec/batch\n", + "Epoch 7/10 Iteration 1160/1780 Training loss: 1.5340 0.1224 sec/batch\n", + "Epoch 7/10 Iteration 1161/1780 Training loss: 1.5334 0.1254 sec/batch\n", + "Epoch 7/10 Iteration 1162/1780 Training loss: 1.5330 0.1213 sec/batch\n", + "Epoch 7/10 Iteration 1163/1780 Training loss: 1.5324 0.1225 sec/batch\n", + "Epoch 7/10 Iteration 1164/1780 Training loss: 1.5322 0.1216 sec/batch\n", + "Epoch 7/10 Iteration 1165/1780 Training loss: 1.5320 0.1235 sec/batch\n", + "Epoch 7/10 Iteration 1166/1780 Training loss: 1.5313 0.1236 sec/batch\n", + "Epoch 7/10 Iteration 1167/1780 Training loss: 1.5307 0.1213 sec/batch\n", + "Epoch 7/10 Iteration 1168/1780 Training loss: 1.5300 0.1235 sec/batch\n", + "Epoch 7/10 Iteration 1169/1780 Training 
loss: 1.5298 0.1229 sec/batch\n", + "Epoch 7/10 Iteration 1170/1780 Training loss: 1.5296 0.1226 sec/batch\n", + "Epoch 7/10 Iteration 1171/1780 Training loss: 1.5292 0.1262 sec/batch\n", + "Epoch 7/10 Iteration 1172/1780 Training loss: 1.5290 0.1220 sec/batch\n", + "Epoch 7/10 Iteration 1173/1780 Training loss: 1.5286 0.1236 sec/batch\n", + "Epoch 7/10 Iteration 1174/1780 Training loss: 1.5283 0.1318 sec/batch\n", + "Epoch 7/10 Iteration 1175/1780 Training loss: 1.5281 0.1231 sec/batch\n", + "Epoch 7/10 Iteration 1176/1780 Training loss: 1.5278 0.1227 sec/batch\n", + "Epoch 7/10 Iteration 1177/1780 Training loss: 1.5275 0.1247 sec/batch\n", + "Epoch 7/10 Iteration 1178/1780 Training loss: 1.5274 0.1220 sec/batch\n", + "Epoch 7/10 Iteration 1179/1780 Training loss: 1.5270 0.1229 sec/batch\n", + "Epoch 7/10 Iteration 1180/1780 Training loss: 1.5267 0.1219 sec/batch\n", + "Epoch 7/10 Iteration 1181/1780 Training loss: 1.5263 0.1224 sec/batch\n", + "Epoch 7/10 Iteration 1182/1780 Training loss: 1.5259 0.1216 sec/batch\n", + "Epoch 7/10 Iteration 1183/1780 Training loss: 1.5255 0.1224 sec/batch\n", + "Epoch 7/10 Iteration 1184/1780 Training loss: 1.5249 0.1230 sec/batch\n", + "Epoch 7/10 Iteration 1185/1780 Training loss: 1.5246 0.1255 sec/batch\n", + "Epoch 7/10 Iteration 1186/1780 Training loss: 1.5244 0.1230 sec/batch\n", + "Epoch 7/10 Iteration 1187/1780 Training loss: 1.5241 0.1254 sec/batch\n", + "Epoch 7/10 Iteration 1188/1780 Training loss: 1.5238 0.1218 sec/batch\n", + "Epoch 7/10 Iteration 1189/1780 Training loss: 1.5236 0.1256 sec/batch\n", + "Epoch 7/10 Iteration 1190/1780 Training loss: 1.5231 0.1229 sec/batch\n", + "Epoch 7/10 Iteration 1191/1780 Training loss: 1.5225 0.1222 sec/batch\n", + "Epoch 7/10 Iteration 1192/1780 Training loss: 1.5223 0.1212 sec/batch\n", + "Epoch 7/10 Iteration 1193/1780 Training loss: 1.5220 0.1227 sec/batch\n", + "Epoch 7/10 Iteration 1194/1780 Training loss: 1.5214 0.1209 sec/batch\n", + "Epoch 7/10 Iteration 1195/1780 
Training loss: 1.5212 0.1247 sec/batch\n", + "Epoch 7/10 Iteration 1196/1780 Training loss: 1.5210 0.1214 sec/batch\n", + "Epoch 7/10 Iteration 1197/1780 Training loss: 1.5208 0.1254 sec/batch\n", + "Epoch 7/10 Iteration 1198/1780 Training loss: 1.5203 0.1230 sec/batch\n", + "Epoch 7/10 Iteration 1199/1780 Training loss: 1.5197 0.1251 sec/batch\n", + "Epoch 7/10 Iteration 1200/1780 Training loss: 1.5192 0.1241 sec/batch\n", + "Validation loss: 1.37934 Saving checkpoint!\n", + "Epoch 7/10 Iteration 1201/1780 Training loss: 1.5201 0.1215 sec/batch\n", + "Epoch 7/10 Iteration 1202/1780 Training loss: 1.5200 0.1217 sec/batch\n", + "Epoch 7/10 Iteration 1203/1780 Training loss: 1.5199 0.1240 sec/batch\n", + "Epoch 7/10 Iteration 1204/1780 Training loss: 1.5199 0.1249 sec/batch\n", + "Epoch 7/10 Iteration 1205/1780 Training loss: 1.5199 0.1257 sec/batch\n", + "Epoch 7/10 Iteration 1206/1780 Training loss: 1.5199 0.1239 sec/batch\n", + "Epoch 7/10 Iteration 1207/1780 Training loss: 1.5198 0.1222 sec/batch\n", + "Epoch 7/10 Iteration 1208/1780 Training loss: 1.5196 0.1239 sec/batch\n", + "Epoch 7/10 Iteration 1209/1780 Training loss: 1.5199 0.1251 sec/batch\n", + "Epoch 7/10 Iteration 1210/1780 Training loss: 1.5197 0.1215 sec/batch\n", + "Epoch 7/10 Iteration 1211/1780 Training loss: 1.5195 0.1221 sec/batch\n", + "Epoch 7/10 Iteration 1212/1780 Training loss: 1.5196 0.1212 sec/batch\n", + "Epoch 7/10 Iteration 1213/1780 Training loss: 1.5193 0.1230 sec/batch\n", + "Epoch 7/10 Iteration 1214/1780 Training loss: 1.5192 0.1225 sec/batch\n", + "Epoch 7/10 Iteration 1215/1780 Training loss: 1.5191 0.1219 sec/batch\n", + "Epoch 7/10 Iteration 1216/1780 Training loss: 1.5191 0.1232 sec/batch\n", + "Epoch 7/10 Iteration 1217/1780 Training loss: 1.5191 0.1278 sec/batch\n", + "Epoch 7/10 Iteration 1218/1780 Training loss: 1.5187 0.1211 sec/batch\n", + "Epoch 7/10 Iteration 1219/1780 Training loss: 1.5182 0.1224 sec/batch\n", + "Epoch 7/10 Iteration 1220/1780 Training loss: 1.5180 
0.1219 sec/batch\n", + "Epoch 7/10 Iteration 1221/1780 Training loss: 1.5178 0.1227 sec/batch\n", + "Epoch 7/10 Iteration 1222/1780 Training loss: 1.5177 0.1235 sec/batch\n", + "Epoch 7/10 Iteration 1223/1780 Training loss: 1.5175 0.1230 sec/batch\n", + "Epoch 7/10 Iteration 1224/1780 Training loss: 1.5173 0.1233 sec/batch\n", + "Epoch 7/10 Iteration 1225/1780 Training loss: 1.5173 0.1238 sec/batch\n", + "Epoch 7/10 Iteration 1226/1780 Training loss: 1.5171 0.1222 sec/batch\n", + "Epoch 7/10 Iteration 1227/1780 Training loss: 1.5166 0.1256 sec/batch\n", + "Epoch 7/10 Iteration 1228/1780 Training loss: 1.5166 0.1215 sec/batch\n", + "Epoch 7/10 Iteration 1229/1780 Training loss: 1.5167 0.1268 sec/batch\n", + "Epoch 7/10 Iteration 1230/1780 Training loss: 1.5165 0.1226 sec/batch\n", + "Epoch 7/10 Iteration 1231/1780 Training loss: 1.5163 0.1264 sec/batch\n", + "Epoch 7/10 Iteration 1232/1780 Training loss: 1.5162 0.1214 sec/batch\n", + "Epoch 7/10 Iteration 1233/1780 Training loss: 1.5160 0.1232 sec/batch\n", + "Epoch 7/10 Iteration 1234/1780 Training loss: 1.5157 0.1234 sec/batch\n", + "Epoch 7/10 Iteration 1235/1780 Training loss: 1.5157 0.1232 sec/batch\n", + "Epoch 7/10 Iteration 1236/1780 Training loss: 1.5160 0.1216 sec/batch\n", + "Epoch 7/10 Iteration 1237/1780 Training loss: 1.5158 0.1224 sec/batch\n", + "Epoch 7/10 Iteration 1238/1780 Training loss: 1.5156 0.1227 sec/batch\n", + "Epoch 7/10 Iteration 1239/1780 Training loss: 1.5153 0.1241 sec/batch\n", + "Epoch 7/10 Iteration 1240/1780 Training loss: 1.5150 0.1266 sec/batch\n", + "Epoch 7/10 Iteration 1241/1780 Training loss: 1.5150 0.1220 sec/batch\n", + "Epoch 7/10 Iteration 1242/1780 Training loss: 1.5149 0.1254 sec/batch\n", + "Epoch 7/10 Iteration 1243/1780 Training loss: 1.5148 0.1229 sec/batch\n", + "Epoch 7/10 Iteration 1244/1780 Training loss: 1.5145 0.1249 sec/batch\n", + "Epoch 7/10 Iteration 1245/1780 Training loss: 1.5142 0.1246 sec/batch\n", + "Epoch 7/10 Iteration 1246/1780 Training loss: 
1.5141 0.1222 sec/batch\n", + "Epoch 8/10 Iteration 1247/1780 Training loss: 1.5910 0.1253 sec/batch\n", + "Epoch 8/10 Iteration 1248/1780 Training loss: 1.5418 0.1228 sec/batch\n", + "Epoch 8/10 Iteration 1249/1780 Training loss: 1.5191 0.1264 sec/batch\n", + "Epoch 8/10 Iteration 1250/1780 Training loss: 1.5103 0.1242 sec/batch\n", + "Epoch 8/10 Iteration 1251/1780 Training loss: 1.5018 0.1317 sec/batch\n", + "Epoch 8/10 Iteration 1252/1780 Training loss: 1.4903 0.1233 sec/batch\n", + "Epoch 8/10 Iteration 1253/1780 Training loss: 1.4902 0.1234 sec/batch\n", + "Epoch 8/10 Iteration 1254/1780 Training loss: 1.4880 0.1212 sec/batch\n", + "Epoch 8/10 Iteration 1255/1780 Training loss: 1.4880 0.1231 sec/batch\n", + "Epoch 8/10 Iteration 1256/1780 Training loss: 1.4869 0.1224 sec/batch\n", + "Epoch 8/10 Iteration 1257/1780 Training loss: 1.4827 0.1234 sec/batch\n", + "Epoch 8/10 Iteration 1258/1780 Training loss: 1.4808 0.1275 sec/batch\n", + "Epoch 8/10 Iteration 1259/1780 Training loss: 1.4796 0.1237 sec/batch\n", + "Epoch 8/10 Iteration 1260/1780 Training loss: 1.4813 0.1243 sec/batch\n", + "Epoch 8/10 Iteration 1261/1780 Training loss: 1.4808 0.1225 sec/batch\n", + "Epoch 8/10 Iteration 1262/1780 Training loss: 1.4793 0.1227 sec/batch\n", + "Epoch 8/10 Iteration 1263/1780 Training loss: 1.4792 0.1220 sec/batch\n", + "Epoch 8/10 Iteration 1264/1780 Training loss: 1.4808 0.1213 sec/batch\n", + "Epoch 8/10 Iteration 1265/1780 Training loss: 1.4807 0.1260 sec/batch\n", + "Epoch 8/10 Iteration 1266/1780 Training loss: 1.4814 0.1235 sec/batch\n", + "Epoch 8/10 Iteration 1267/1780 Training loss: 1.4810 0.1230 sec/batch\n", + "Epoch 8/10 Iteration 1268/1780 Training loss: 1.4813 0.1216 sec/batch\n", + "Epoch 8/10 Iteration 1269/1780 Training loss: 1.4802 0.1255 sec/batch\n", + "Epoch 8/10 Iteration 1270/1780 Training loss: 1.4797 0.1209 sec/batch\n", + "Epoch 8/10 Iteration 1271/1780 Training loss: 1.4796 0.1219 sec/batch\n", + "Epoch 8/10 Iteration 1272/1780 Training 
loss: 1.4779 0.1234 sec/batch\n", + "Epoch 8/10 Iteration 1273/1780 Training loss: 1.4763 0.1255 sec/batch\n", + "Epoch 8/10 Iteration 1274/1780 Training loss: 1.4762 0.1246 sec/batch\n", + "Epoch 8/10 Iteration 1275/1780 Training loss: 1.4763 0.1245 sec/batch\n", + "Epoch 8/10 Iteration 1276/1780 Training loss: 1.4768 0.1216 sec/batch\n", + "Epoch 8/10 Iteration 1277/1780 Training loss: 1.4763 0.1245 sec/batch\n", + "Epoch 8/10 Iteration 1278/1780 Training loss: 1.4751 0.1246 sec/batch\n", + "Epoch 8/10 Iteration 1279/1780 Training loss: 1.4754 0.1230 sec/batch\n", + "Epoch 8/10 Iteration 1280/1780 Training loss: 1.4753 0.1217 sec/batch\n", + "Epoch 8/10 Iteration 1281/1780 Training loss: 1.4749 0.1261 sec/batch\n", + "Epoch 8/10 Iteration 1282/1780 Training loss: 1.4746 0.1282 sec/batch\n", + "Epoch 8/10 Iteration 1283/1780 Training loss: 1.4740 0.1229 sec/batch\n", + "Epoch 8/10 Iteration 1284/1780 Training loss: 1.4728 0.1230 sec/batch\n", + "Epoch 8/10 Iteration 1285/1780 Training loss: 1.4715 0.1233 sec/batch\n", + "Epoch 8/10 Iteration 1286/1780 Training loss: 1.4708 0.1212 sec/batch\n", + "Epoch 8/10 Iteration 1287/1780 Training loss: 1.4703 0.1309 sec/batch\n", + "Epoch 8/10 Iteration 1288/1780 Training loss: 1.4706 0.1227 sec/batch\n", + "Epoch 8/10 Iteration 1289/1780 Training loss: 1.4699 0.1234 sec/batch\n", + "Epoch 8/10 Iteration 1290/1780 Training loss: 1.4689 0.1222 sec/batch\n", + "Epoch 8/10 Iteration 1291/1780 Training loss: 1.4688 0.1241 sec/batch\n", + "Epoch 8/10 Iteration 1292/1780 Training loss: 1.4678 0.1217 sec/batch\n", + "Epoch 8/10 Iteration 1293/1780 Training loss: 1.4674 0.1230 sec/batch\n", + "Epoch 8/10 Iteration 1294/1780 Training loss: 1.4666 0.1219 sec/batch\n", + "Epoch 8/10 Iteration 1295/1780 Training loss: 1.4662 0.1222 sec/batch\n", + "Epoch 8/10 Iteration 1296/1780 Training loss: 1.4664 0.1223 sec/batch\n", + "Epoch 8/10 Iteration 1297/1780 Training loss: 1.4658 0.1235 sec/batch\n", + "Epoch 8/10 Iteration 1298/1780 
Training loss: 1.4666 0.1228 sec/batch\n", + "Epoch 8/10 Iteration 1299/1780 Training loss: 1.4663 0.1276 sec/batch\n", + "Epoch 8/10 Iteration 1300/1780 Training loss: 1.4664 0.1240 sec/batch\n", + "Validation loss: 1.34587 Saving checkpoint!\n", + "Epoch 8/10 Iteration 1301/1780 Training loss: 1.4681 0.1217 sec/batch\n", + "Epoch 8/10 Iteration 1302/1780 Training loss: 1.4686 0.1215 sec/batch\n", + "Epoch 8/10 Iteration 1303/1780 Training loss: 1.4691 0.1234 sec/batch\n", + "Epoch 8/10 Iteration 1304/1780 Training loss: 1.4687 0.1223 sec/batch\n", + "Epoch 8/10 Iteration 1305/1780 Training loss: 1.4681 0.1259 sec/batch\n", + "Epoch 8/10 Iteration 1306/1780 Training loss: 1.4686 0.1227 sec/batch\n", + "Epoch 8/10 Iteration 1307/1780 Training loss: 1.4686 0.1248 sec/batch\n", + "Epoch 8/10 Iteration 1308/1780 Training loss: 1.4693 0.1232 sec/batch\n", + "Epoch 8/10 Iteration 1309/1780 Training loss: 1.4698 0.1257 sec/batch\n", + "Epoch 8/10 Iteration 1310/1780 Training loss: 1.4699 0.1215 sec/batch\n", + "Epoch 8/10 Iteration 1311/1780 Training loss: 1.4697 0.1221 sec/batch\n", + "Epoch 8/10 Iteration 1312/1780 Training loss: 1.4698 0.1216 sec/batch\n", + "Epoch 8/10 Iteration 1313/1780 Training loss: 1.4699 0.1241 sec/batch\n", + "Epoch 8/10 Iteration 1314/1780 Training loss: 1.4693 0.1206 sec/batch\n", + "Epoch 8/10 Iteration 1315/1780 Training loss: 1.4692 0.1240 sec/batch\n", + "Epoch 8/10 Iteration 1316/1780 Training loss: 1.4688 0.1212 sec/batch\n", + "Epoch 8/10 Iteration 1317/1780 Training loss: 1.4693 0.1223 sec/batch\n", + "Epoch 8/10 Iteration 1318/1780 Training loss: 1.4694 0.1248 sec/batch\n", + "Epoch 8/10 Iteration 1319/1780 Training loss: 1.4697 0.1271 sec/batch\n", + "Epoch 8/10 Iteration 1320/1780 Training loss: 1.4693 0.1231 sec/batch\n", + "Epoch 8/10 Iteration 1321/1780 Training loss: 1.4689 0.1228 sec/batch\n", + "Epoch 8/10 Iteration 1322/1780 Training loss: 1.4689 0.1220 sec/batch\n", + "Epoch 8/10 Iteration 1323/1780 Training loss: 1.4686 
0.1254 sec/batch\n", + "Epoch 8/10 Iteration 1324/1780 Training loss: 1.4684 0.1223 sec/batch\n", + "Epoch 8/10 Iteration 1325/1780 Training loss: 1.4678 0.1257 sec/batch\n", + "Epoch 8/10 Iteration 1326/1780 Training loss: 1.4676 0.1218 sec/batch\n", + "Epoch 8/10 Iteration 1327/1780 Training loss: 1.4671 0.1226 sec/batch\n", + "Epoch 8/10 Iteration 1328/1780 Training loss: 1.4670 0.1216 sec/batch\n", + "Epoch 8/10 Iteration 1329/1780 Training loss: 1.4664 0.1234 sec/batch\n", + "Epoch 8/10 Iteration 1330/1780 Training loss: 1.4663 0.1209 sec/batch\n", + "Epoch 8/10 Iteration 1331/1780 Training loss: 1.4660 0.1228 sec/batch\n", + "Epoch 8/10 Iteration 1332/1780 Training loss: 1.4656 0.1228 sec/batch\n", + "Epoch 8/10 Iteration 1333/1780 Training loss: 1.4652 0.1246 sec/batch\n", + "Epoch 8/10 Iteration 1334/1780 Training loss: 1.4648 0.1222 sec/batch\n", + "Epoch 8/10 Iteration 1335/1780 Training loss: 1.4642 0.1232 sec/batch\n", + "Epoch 8/10 Iteration 1336/1780 Training loss: 1.4642 0.1230 sec/batch\n", + "Epoch 8/10 Iteration 1337/1780 Training loss: 1.4638 0.1238 sec/batch\n", + "Epoch 8/10 Iteration 1338/1780 Training loss: 1.4636 0.1248 sec/batch\n", + "Epoch 8/10 Iteration 1339/1780 Training loss: 1.4631 0.1230 sec/batch\n", + "Epoch 8/10 Iteration 1340/1780 Training loss: 1.4627 0.1242 sec/batch\n", + "Epoch 8/10 Iteration 1341/1780 Training loss: 1.4623 0.1226 sec/batch\n", + "Epoch 8/10 Iteration 1342/1780 Training loss: 1.4621 0.1210 sec/batch\n", + "Epoch 8/10 Iteration 1343/1780 Training loss: 1.4620 0.1234 sec/batch\n", + "Epoch 8/10 Iteration 1344/1780 Training loss: 1.4613 0.1231 sec/batch\n", + "Epoch 8/10 Iteration 1345/1780 Training loss: 1.4608 0.1213 sec/batch\n", + "Epoch 8/10 Iteration 1346/1780 Training loss: 1.4603 0.1220 sec/batch\n", + "Epoch 8/10 Iteration 1347/1780 Training loss: 1.4601 0.1239 sec/batch\n", + "Epoch 8/10 Iteration 1348/1780 Training loss: 1.4598 0.1213 sec/batch\n", + "Epoch 8/10 Iteration 1349/1780 Training loss: 
1.4595 0.1239 sec/batch\n", + "Epoch 8/10 Iteration 1350/1780 Training loss: 1.4593 0.1215 sec/batch\n", + "Epoch 8/10 Iteration 1351/1780 Training loss: 1.4589 0.1238 sec/batch\n", + "Epoch 8/10 Iteration 1352/1780 Training loss: 1.4587 0.1239 sec/batch\n", + "Epoch 8/10 Iteration 1353/1780 Training loss: 1.4584 0.1232 sec/batch\n", + "Epoch 8/10 Iteration 1354/1780 Training loss: 1.4582 0.1216 sec/batch\n", + "Epoch 8/10 Iteration 1355/1780 Training loss: 1.4579 0.1229 sec/batch\n", + "Epoch 8/10 Iteration 1356/1780 Training loss: 1.4578 0.1244 sec/batch\n", + "Epoch 8/10 Iteration 1357/1780 Training loss: 1.4574 0.1214 sec/batch\n", + "Epoch 8/10 Iteration 1358/1780 Training loss: 1.4572 0.1208 sec/batch\n", + "Epoch 8/10 Iteration 1359/1780 Training loss: 1.4569 0.1254 sec/batch\n", + "Epoch 8/10 Iteration 1360/1780 Training loss: 1.4566 0.1203 sec/batch\n", + "Epoch 8/10 Iteration 1361/1780 Training loss: 1.4562 0.1283 sec/batch\n", + "Epoch 8/10 Iteration 1362/1780 Training loss: 1.4558 0.1203 sec/batch\n", + "Epoch 8/10 Iteration 1363/1780 Training loss: 1.4556 0.1239 sec/batch\n", + "Epoch 8/10 Iteration 1364/1780 Training loss: 1.4555 0.1228 sec/batch\n", + "Epoch 8/10 Iteration 1365/1780 Training loss: 1.4553 0.1224 sec/batch\n", + "Epoch 8/10 Iteration 1366/1780 Training loss: 1.4552 0.1244 sec/batch\n", + "Epoch 8/10 Iteration 1367/1780 Training loss: 1.4549 0.1228 sec/batch\n", + "Epoch 8/10 Iteration 1368/1780 Training loss: 1.4543 0.1223 sec/batch\n", + "Epoch 8/10 Iteration 1369/1780 Training loss: 1.4539 0.1212 sec/batch\n", + "Epoch 8/10 Iteration 1370/1780 Training loss: 1.4538 0.1215 sec/batch\n", + "Epoch 8/10 Iteration 1371/1780 Training loss: 1.4536 0.1222 sec/batch\n", + "Epoch 8/10 Iteration 1372/1780 Training loss: 1.4530 0.1205 sec/batch\n", + "Epoch 8/10 Iteration 1373/1780 Training loss: 1.4529 0.1241 sec/batch\n", + "Epoch 8/10 Iteration 1374/1780 Training loss: 1.4529 0.1216 sec/batch\n", + "Epoch 8/10 Iteration 1375/1780 Training 
loss: 1.4528 0.1249 sec/batch\n", + "Epoch 8/10 Iteration 1376/1780 Training loss: 1.4525 0.1219 sec/batch\n", + "Epoch 8/10 Iteration 1377/1780 Training loss: 1.4519 0.1220 sec/batch\n", + "Epoch 8/10 Iteration 1378/1780 Training loss: 1.4517 0.1238 sec/batch\n", + "Epoch 8/10 Iteration 1379/1780 Training loss: 1.4518 0.1225 sec/batch\n", + "Epoch 8/10 Iteration 1380/1780 Training loss: 1.4518 0.1212 sec/batch\n", + "Epoch 8/10 Iteration 1381/1780 Training loss: 1.4517 0.1227 sec/batch\n", + "Epoch 8/10 Iteration 1382/1780 Training loss: 1.4517 0.1211 sec/batch\n", + "Epoch 8/10 Iteration 1383/1780 Training loss: 1.4517 0.1249 sec/batch\n", + "Epoch 8/10 Iteration 1384/1780 Training loss: 1.4517 0.1227 sec/batch\n", + "Epoch 8/10 Iteration 1385/1780 Training loss: 1.4517 0.1229 sec/batch\n", + "Epoch 8/10 Iteration 1386/1780 Training loss: 1.4517 0.1213 sec/batch\n", + "Epoch 8/10 Iteration 1387/1780 Training loss: 1.4519 0.1219 sec/batch\n", + "Epoch 8/10 Iteration 1388/1780 Training loss: 1.4518 0.1215 sec/batch\n", + "Epoch 8/10 Iteration 1389/1780 Training loss: 1.4516 0.1220 sec/batch\n", + "Epoch 8/10 Iteration 1390/1780 Training loss: 1.4517 0.1204 sec/batch\n", + "Epoch 8/10 Iteration 1391/1780 Training loss: 1.4515 0.1230 sec/batch\n", + "Epoch 8/10 Iteration 1392/1780 Training loss: 1.4515 0.1228 sec/batch\n", + "Epoch 8/10 Iteration 1393/1780 Training loss: 1.4514 0.1241 sec/batch\n", + "Epoch 8/10 Iteration 1394/1780 Training loss: 1.4515 0.1228 sec/batch\n", + "Epoch 8/10 Iteration 1395/1780 Training loss: 1.4515 0.1224 sec/batch\n", + "Epoch 8/10 Iteration 1396/1780 Training loss: 1.4513 0.1202 sec/batch\n", + "Epoch 8/10 Iteration 1397/1780 Training loss: 1.4509 0.1257 sec/batch\n", + "Epoch 8/10 Iteration 1398/1780 Training loss: 1.4506 0.1229 sec/batch\n", + "Epoch 8/10 Iteration 1399/1780 Training loss: 1.4506 0.1253 sec/batch\n", + "Epoch 8/10 Iteration 1400/1780 Training loss: 1.4504 0.1224 sec/batch\n", + "Validation loss: 1.3216 Saving 
checkpoint!\n", + "Epoch 8/10 Iteration 1401/1780 Training loss: 1.4511 0.1220 sec/batch\n", + "Epoch 8/10 Iteration 1402/1780 Training loss: 1.4510 0.1214 sec/batch\n", + "Epoch 8/10 Iteration 1403/1780 Training loss: 1.4510 0.1233 sec/batch\n", + "Epoch 8/10 Iteration 1404/1780 Training loss: 1.4510 0.1211 sec/batch\n", + "Epoch 8/10 Iteration 1405/1780 Training loss: 1.4507 0.1231 sec/batch\n", + "Epoch 8/10 Iteration 1406/1780 Training loss: 1.4507 0.1231 sec/batch\n", + "Epoch 8/10 Iteration 1407/1780 Training loss: 1.4508 0.1250 sec/batch\n", + "Epoch 8/10 Iteration 1408/1780 Training loss: 1.4507 0.1235 sec/batch\n", + "Epoch 8/10 Iteration 1409/1780 Training loss: 1.4506 0.1251 sec/batch\n", + "Epoch 8/10 Iteration 1410/1780 Training loss: 1.4504 0.1218 sec/batch\n", + "Epoch 8/10 Iteration 1411/1780 Training loss: 1.4502 0.1239 sec/batch\n", + "Epoch 8/10 Iteration 1412/1780 Training loss: 1.4501 0.1212 sec/batch\n", + "Epoch 8/10 Iteration 1413/1780 Training loss: 1.4502 0.1235 sec/batch\n", + "Epoch 8/10 Iteration 1414/1780 Training loss: 1.4505 0.1242 sec/batch\n", + "Epoch 8/10 Iteration 1415/1780 Training loss: 1.4504 0.1225 sec/batch\n", + "Epoch 8/10 Iteration 1416/1780 Training loss: 1.4503 0.1239 sec/batch\n", + "Epoch 8/10 Iteration 1417/1780 Training loss: 1.4501 0.1255 sec/batch\n", + "Epoch 8/10 Iteration 1418/1780 Training loss: 1.4498 0.1217 sec/batch\n", + "Epoch 8/10 Iteration 1419/1780 Training loss: 1.4499 0.1256 sec/batch\n", + "Epoch 8/10 Iteration 1420/1780 Training loss: 1.4498 0.1245 sec/batch\n", + "Epoch 8/10 Iteration 1421/1780 Training loss: 1.4498 0.1246 sec/batch\n", + "Epoch 8/10 Iteration 1422/1780 Training loss: 1.4496 0.1216 sec/batch\n", + "Epoch 8/10 Iteration 1423/1780 Training loss: 1.4493 0.1225 sec/batch\n", + "Epoch 8/10 Iteration 1424/1780 Training loss: 1.4494 0.1255 sec/batch\n", + "Epoch 9/10 Iteration 1425/1780 Training loss: 1.5353 0.1220 sec/batch\n", + "Epoch 9/10 Iteration 1426/1780 Training loss: 1.4841 
0.1218 sec/batch\n", + "Epoch 9/10 Iteration 1427/1780 Training loss: 1.4645 0.1242 sec/batch\n", + "Epoch 9/10 Iteration 1428/1780 Training loss: 1.4598 0.1219 sec/batch\n", + "Epoch 9/10 Iteration 1429/1780 Training loss: 1.4487 0.1245 sec/batch\n", + "Epoch 9/10 Iteration 1430/1780 Training loss: 1.4362 0.1217 sec/batch\n", + "Epoch 9/10 Iteration 1431/1780 Training loss: 1.4347 0.1253 sec/batch\n", + "Epoch 9/10 Iteration 1432/1780 Training loss: 1.4325 0.1219 sec/batch\n", + "Epoch 9/10 Iteration 1433/1780 Training loss: 1.4321 0.1241 sec/batch\n", + "Epoch 9/10 Iteration 1434/1780 Training loss: 1.4305 0.1243 sec/batch\n", + "Epoch 9/10 Iteration 1435/1780 Training loss: 1.4266 0.1241 sec/batch\n", + "Epoch 9/10 Iteration 1436/1780 Training loss: 1.4252 0.1217 sec/batch\n", + "Epoch 9/10 Iteration 1437/1780 Training loss: 1.4243 0.1271 sec/batch\n", + "Epoch 9/10 Iteration 1438/1780 Training loss: 1.4250 0.1232 sec/batch\n", + "Epoch 9/10 Iteration 1439/1780 Training loss: 1.4237 0.1221 sec/batch\n", + "Epoch 9/10 Iteration 1440/1780 Training loss: 1.4216 0.1235 sec/batch\n", + "Epoch 9/10 Iteration 1441/1780 Training loss: 1.4219 0.1228 sec/batch\n", + "Epoch 9/10 Iteration 1442/1780 Training loss: 1.4232 0.1223 sec/batch\n", + "Epoch 9/10 Iteration 1443/1780 Training loss: 1.4232 0.1219 sec/batch\n", + "Epoch 9/10 Iteration 1444/1780 Training loss: 1.4239 0.1281 sec/batch\n", + "Epoch 9/10 Iteration 1445/1780 Training loss: 1.4230 0.1237 sec/batch\n", + "Epoch 9/10 Iteration 1446/1780 Training loss: 1.4234 0.1218 sec/batch\n", + "Epoch 9/10 Iteration 1447/1780 Training loss: 1.4223 0.1256 sec/batch\n", + "Epoch 9/10 Iteration 1448/1780 Training loss: 1.4222 0.1235 sec/batch\n", + "Epoch 9/10 Iteration 1449/1780 Training loss: 1.4222 0.1227 sec/batch\n", + "Epoch 9/10 Iteration 1450/1780 Training loss: 1.4206 0.1216 sec/batch\n", + "Epoch 9/10 Iteration 1451/1780 Training loss: 1.4192 0.1242 sec/batch\n", + "Epoch 9/10 Iteration 1452/1780 Training loss: 
1.4198 0.1221 sec/batch\n", + "Epoch 9/10 Iteration 1453/1780 Training loss: 1.4205 0.1231 sec/batch\n", + "Epoch 9/10 Iteration 1454/1780 Training loss: 1.4204 0.1242 sec/batch\n", + "Epoch 9/10 Iteration 1455/1780 Training loss: 1.4202 0.1229 sec/batch\n", + "Epoch 9/10 Iteration 1456/1780 Training loss: 1.4191 0.1219 sec/batch\n", + "Epoch 9/10 Iteration 1457/1780 Training loss: 1.4193 0.1238 sec/batch\n", + "Epoch 9/10 Iteration 1458/1780 Training loss: 1.4194 0.1247 sec/batch\n", + "Epoch 9/10 Iteration 1459/1780 Training loss: 1.4191 0.1228 sec/batch\n", + "Epoch 9/10 Iteration 1460/1780 Training loss: 1.4188 0.1231 sec/batch\n", + "Epoch 9/10 Iteration 1461/1780 Training loss: 1.4179 0.1235 sec/batch\n", + "Epoch 9/10 Iteration 1462/1780 Training loss: 1.4167 0.1234 sec/batch\n", + "Epoch 9/10 Iteration 1463/1780 Training loss: 1.4153 0.1228 sec/batch\n", + "Epoch 9/10 Iteration 1464/1780 Training loss: 1.4146 0.1217 sec/batch\n", + "Epoch 9/10 Iteration 1465/1780 Training loss: 1.4139 0.1258 sec/batch\n", + "Epoch 9/10 Iteration 1466/1780 Training loss: 1.4143 0.1233 sec/batch\n", + "Epoch 9/10 Iteration 1467/1780 Training loss: 1.4137 0.1252 sec/batch\n", + "Epoch 9/10 Iteration 1468/1780 Training loss: 1.4128 0.1249 sec/batch\n", + "Epoch 9/10 Iteration 1469/1780 Training loss: 1.4131 0.1238 sec/batch\n", + "Epoch 9/10 Iteration 1470/1780 Training loss: 1.4122 0.1247 sec/batch\n", + "Epoch 9/10 Iteration 1471/1780 Training loss: 1.4121 0.1243 sec/batch\n", + "Epoch 9/10 Iteration 1472/1780 Training loss: 1.4117 0.1209 sec/batch\n", + "Epoch 9/10 Iteration 1473/1780 Training loss: 1.4118 0.1266 sec/batch\n", + "Epoch 9/10 Iteration 1474/1780 Training loss: 1.4120 0.1234 sec/batch\n", + "Epoch 9/10 Iteration 1475/1780 Training loss: 1.4115 0.1269 sec/batch\n", + "Epoch 9/10 Iteration 1476/1780 Training loss: 1.4123 0.1232 sec/batch\n", + "Epoch 9/10 Iteration 1477/1780 Training loss: 1.4122 0.1286 sec/batch\n", + "Epoch 9/10 Iteration 1478/1780 Training 
loss: 1.4125 0.1287 sec/batch\n", + "Epoch 9/10 Iteration 1479/1780 Training loss: 1.4124 0.1227 sec/batch\n", + "Epoch 9/10 Iteration 1480/1780 Training loss: 1.4124 0.1213 sec/batch\n", + "Epoch 9/10 Iteration 1481/1780 Training loss: 1.4128 0.1257 sec/batch\n", + "Epoch 9/10 Iteration 1482/1780 Training loss: 1.4123 0.1241 sec/batch\n", + "Epoch 9/10 Iteration 1483/1780 Training loss: 1.4117 0.1232 sec/batch\n", + "Epoch 9/10 Iteration 1484/1780 Training loss: 1.4122 0.1224 sec/batch\n", + "Epoch 9/10 Iteration 1485/1780 Training loss: 1.4121 0.1224 sec/batch\n", + "Epoch 9/10 Iteration 1486/1780 Training loss: 1.4127 0.1207 sec/batch\n", + "Epoch 9/10 Iteration 1487/1780 Training loss: 1.4132 0.1240 sec/batch\n", + "Epoch 9/10 Iteration 1488/1780 Training loss: 1.4131 0.1215 sec/batch\n", + "Epoch 9/10 Iteration 1489/1780 Training loss: 1.4130 0.1224 sec/batch\n", + "Epoch 9/10 Iteration 1490/1780 Training loss: 1.4131 0.1227 sec/batch\n", + "Epoch 9/10 Iteration 1491/1780 Training loss: 1.4131 0.1268 sec/batch\n", + "Epoch 9/10 Iteration 1492/1780 Training loss: 1.4127 0.1229 sec/batch\n", + "Epoch 9/10 Iteration 1493/1780 Training loss: 1.4128 0.1231 sec/batch\n", + "Epoch 9/10 Iteration 1494/1780 Training loss: 1.4128 0.1236 sec/batch\n", + "Epoch 9/10 Iteration 1495/1780 Training loss: 1.4132 0.1247 sec/batch\n", + "Epoch 9/10 Iteration 1496/1780 Training loss: 1.4134 0.1217 sec/batch\n", + "Epoch 9/10 Iteration 1497/1780 Training loss: 1.4138 0.1244 sec/batch\n", + "Epoch 9/10 Iteration 1498/1780 Training loss: 1.4133 0.1215 sec/batch\n", + "Epoch 9/10 Iteration 1499/1780 Training loss: 1.4131 0.1271 sec/batch\n", + "Epoch 9/10 Iteration 1500/1780 Training loss: 1.4132 0.1231 sec/batch\n", + "Validation loss: 1.29403 Saving checkpoint!\n", + "Epoch 9/10 Iteration 1501/1780 Training loss: 1.4146 0.1211 sec/batch\n", + "Epoch 9/10 Iteration 1502/1780 Training loss: 1.4145 0.1221 sec/batch\n", + "Epoch 9/10 Iteration 1503/1780 Training loss: 1.4140 0.1238 
sec/batch\n", + "Epoch 9/10 Iteration 1504/1780 Training loss: 1.4139 0.1218 sec/batch\n", + "Epoch 9/10 Iteration 1505/1780 Training loss: 1.4134 0.1228 sec/batch\n", + "Epoch 9/10 Iteration 1506/1780 Training loss: 1.4134 0.1236 sec/batch\n", + "Epoch 9/10 Iteration 1507/1780 Training loss: 1.4129 0.1233 sec/batch\n", + "Epoch 9/10 Iteration 1508/1780 Training loss: 1.4128 0.1248 sec/batch\n", + "Epoch 9/10 Iteration 1509/1780 Training loss: 1.4126 0.1222 sec/batch\n", + "Epoch 9/10 Iteration 1510/1780 Training loss: 1.4124 0.1211 sec/batch\n", + "Epoch 9/10 Iteration 1511/1780 Training loss: 1.4121 0.1235 sec/batch\n", + "Epoch 9/10 Iteration 1512/1780 Training loss: 1.4119 0.1201 sec/batch\n", + "Epoch 9/10 Iteration 1513/1780 Training loss: 1.4114 0.1221 sec/batch\n", + "Epoch 9/10 Iteration 1514/1780 Training loss: 1.4114 0.1207 sec/batch\n", + "Epoch 9/10 Iteration 1515/1780 Training loss: 1.4112 0.1237 sec/batch\n", + "Epoch 9/10 Iteration 1516/1780 Training loss: 1.4110 0.1217 sec/batch\n", + "Epoch 9/10 Iteration 1517/1780 Training loss: 1.4107 0.1230 sec/batch\n", + "Epoch 9/10 Iteration 1518/1780 Training loss: 1.4103 0.1231 sec/batch\n", + "Epoch 9/10 Iteration 1519/1780 Training loss: 1.4100 0.1228 sec/batch\n", + "Epoch 9/10 Iteration 1520/1780 Training loss: 1.4100 0.1228 sec/batch\n", + "Epoch 9/10 Iteration 1521/1780 Training loss: 1.4100 0.1254 sec/batch\n", + "Epoch 9/10 Iteration 1522/1780 Training loss: 1.4095 0.1229 sec/batch\n", + "Epoch 9/10 Iteration 1523/1780 Training loss: 1.4092 0.1239 sec/batch\n", + "Epoch 9/10 Iteration 1524/1780 Training loss: 1.4087 0.1210 sec/batch\n", + "Epoch 9/10 Iteration 1525/1780 Training loss: 1.4087 0.1227 sec/batch\n", + "Epoch 9/10 Iteration 1526/1780 Training loss: 1.4086 0.1227 sec/batch\n", + "Epoch 9/10 Iteration 1527/1780 Training loss: 1.4085 0.1252 sec/batch\n", + "Epoch 9/10 Iteration 1528/1780 Training loss: 1.4083 0.1245 sec/batch\n", + "Epoch 9/10 Iteration 1529/1780 Training loss: 1.4081 
0.1335 sec/batch\n", + "Epoch 9/10 Iteration 1530/1780 Training loss: 1.4080 0.1237 sec/batch\n", + "Epoch 9/10 Iteration 1531/1780 Training loss: 1.4078 0.1220 sec/batch\n", + "Epoch 9/10 Iteration 1532/1780 Training loss: 1.4078 0.1213 sec/batch\n", + "Epoch 9/10 Iteration 1533/1780 Training loss: 1.4076 0.1215 sec/batch\n", + "Epoch 9/10 Iteration 1534/1780 Training loss: 1.4076 0.1235 sec/batch\n", + "Epoch 9/10 Iteration 1535/1780 Training loss: 1.4072 0.1248 sec/batch\n", + "Epoch 9/10 Iteration 1536/1780 Training loss: 1.4071 0.1215 sec/batch\n", + "Epoch 9/10 Iteration 1537/1780 Training loss: 1.4069 0.1226 sec/batch\n", + "Epoch 9/10 Iteration 1538/1780 Training loss: 1.4067 0.1229 sec/batch\n", + "Epoch 9/10 Iteration 1539/1780 Training loss: 1.4063 0.1254 sec/batch\n", + "Epoch 9/10 Iteration 1540/1780 Training loss: 1.4059 0.1226 sec/batch\n", + "Epoch 9/10 Iteration 1541/1780 Training loss: 1.4058 0.1251 sec/batch\n", + "Epoch 9/10 Iteration 1542/1780 Training loss: 1.4058 0.1242 sec/batch\n", + "Epoch 9/10 Iteration 1543/1780 Training loss: 1.4055 0.1250 sec/batch\n", + "Epoch 9/10 Iteration 1544/1780 Training loss: 1.4055 0.1237 sec/batch\n", + "Epoch 9/10 Iteration 1545/1780 Training loss: 1.4053 0.1282 sec/batch\n", + "Epoch 9/10 Iteration 1546/1780 Training loss: 1.4049 0.1247 sec/batch\n", + "Epoch 9/10 Iteration 1547/1780 Training loss: 1.4043 0.1252 sec/batch\n", + "Epoch 9/10 Iteration 1548/1780 Training loss: 1.4042 0.1225 sec/batch\n", + "Epoch 9/10 Iteration 1549/1780 Training loss: 1.4040 0.1220 sec/batch\n", + "Epoch 9/10 Iteration 1550/1780 Training loss: 1.4036 0.1226 sec/batch\n", + "Epoch 9/10 Iteration 1551/1780 Training loss: 1.4036 0.1226 sec/batch\n", + "Epoch 9/10 Iteration 1552/1780 Training loss: 1.4035 0.1206 sec/batch\n", + "Epoch 9/10 Iteration 1553/1780 Training loss: 1.4032 0.1232 sec/batch\n", + "Epoch 9/10 Iteration 1554/1780 Training loss: 1.4028 0.1234 sec/batch\n", + "Epoch 9/10 Iteration 1555/1780 Training loss: 
1.4023 0.1214 sec/batch\n", + "Epoch 9/10 Iteration 1556/1780 Training loss: 1.4020 0.1207 sec/batch\n", + "Epoch 9/10 Iteration 1557/1780 Training loss: 1.4020 0.1221 sec/batch\n", + "Epoch 9/10 Iteration 1558/1780 Training loss: 1.4019 0.1221 sec/batch\n", + "Epoch 9/10 Iteration 1559/1780 Training loss: 1.4019 0.1257 sec/batch\n", + "Epoch 9/10 Iteration 1560/1780 Training loss: 1.4018 0.1217 sec/batch\n", + "Epoch 9/10 Iteration 1561/1780 Training loss: 1.4020 0.1255 sec/batch\n", + "Epoch 9/10 Iteration 1562/1780 Training loss: 1.4020 0.1208 sec/batch\n", + "Epoch 9/10 Iteration 1563/1780 Training loss: 1.4020 0.1229 sec/batch\n", + "Epoch 9/10 Iteration 1564/1780 Training loss: 1.4018 0.1239 sec/batch\n", + "Epoch 9/10 Iteration 1565/1780 Training loss: 1.4021 0.1219 sec/batch\n", + "Epoch 9/10 Iteration 1566/1780 Training loss: 1.4021 0.1223 sec/batch\n", + "Epoch 9/10 Iteration 1567/1780 Training loss: 1.4020 0.1235 sec/batch\n", + "Epoch 9/10 Iteration 1568/1780 Training loss: 1.4021 0.1228 sec/batch\n", + "Epoch 9/10 Iteration 1569/1780 Training loss: 1.4019 0.1253 sec/batch\n", + "Epoch 9/10 Iteration 1570/1780 Training loss: 1.4020 0.1220 sec/batch\n", + "Epoch 9/10 Iteration 1571/1780 Training loss: 1.4019 0.1231 sec/batch\n", + "Epoch 9/10 Iteration 1572/1780 Training loss: 1.4021 0.1341 sec/batch\n", + "Epoch 9/10 Iteration 1573/1780 Training loss: 1.4021 0.1225 sec/batch\n", + "Epoch 9/10 Iteration 1574/1780 Training loss: 1.4019 0.1217 sec/batch\n", + "Epoch 9/10 Iteration 1575/1780 Training loss: 1.4015 0.1227 sec/batch\n", + "Epoch 9/10 Iteration 1576/1780 Training loss: 1.4013 0.1202 sec/batch\n", + "Epoch 9/10 Iteration 1577/1780 Training loss: 1.4013 0.1273 sec/batch\n", + "Epoch 9/10 Iteration 1578/1780 Training loss: 1.4011 0.1213 sec/batch\n", + "Epoch 9/10 Iteration 1579/1780 Training loss: 1.4011 0.1290 sec/batch\n", + "Epoch 9/10 Iteration 1580/1780 Training loss: 1.4009 0.1201 sec/batch\n", + "Epoch 9/10 Iteration 1581/1780 Training 
loss: 1.4009 0.1219 sec/batch\n", + "Epoch 9/10 Iteration 1582/1780 Training loss: 1.4008 0.1235 sec/batch\n", + "Epoch 9/10 Iteration 1583/1780 Training loss: 1.4005 0.1212 sec/batch\n", + "Epoch 9/10 Iteration 1584/1780 Training loss: 1.4005 0.1224 sec/batch\n", + "Epoch 9/10 Iteration 1585/1780 Training loss: 1.4006 0.1226 sec/batch\n", + "Epoch 9/10 Iteration 1586/1780 Training loss: 1.4005 0.1227 sec/batch\n", + "Epoch 9/10 Iteration 1587/1780 Training loss: 1.4005 0.1289 sec/batch\n", + "Epoch 9/10 Iteration 1588/1780 Training loss: 1.4004 0.1241 sec/batch\n", + "Epoch 9/10 Iteration 1589/1780 Training loss: 1.4003 0.1219 sec/batch\n", + "Epoch 9/10 Iteration 1590/1780 Training loss: 1.4001 0.1237 sec/batch\n", + "Epoch 9/10 Iteration 1591/1780 Training loss: 1.4002 0.1235 sec/batch\n", + "Epoch 9/10 Iteration 1592/1780 Training loss: 1.4006 0.1215 sec/batch\n", + "Epoch 9/10 Iteration 1593/1780 Training loss: 1.4005 0.1251 sec/batch\n", + "Epoch 9/10 Iteration 1594/1780 Training loss: 1.4004 0.1221 sec/batch\n", + "Epoch 9/10 Iteration 1595/1780 Training loss: 1.4003 0.1227 sec/batch\n", + "Epoch 9/10 Iteration 1596/1780 Training loss: 1.4000 0.1242 sec/batch\n", + "Epoch 9/10 Iteration 1597/1780 Training loss: 1.4001 0.1221 sec/batch\n", + "Epoch 9/10 Iteration 1598/1780 Training loss: 1.4000 0.1211 sec/batch\n", + "Epoch 9/10 Iteration 1599/1780 Training loss: 1.4000 0.1315 sec/batch\n", + "Epoch 9/10 Iteration 1600/1780 Training loss: 1.3999 0.1200 sec/batch\n", + "Validation loss: 1.27288 Saving checkpoint!\n", + "Epoch 9/10 Iteration 1601/1780 Training loss: 1.4005 0.1202 sec/batch\n", + "Epoch 9/10 Iteration 1602/1780 Training loss: 1.4007 0.1246 sec/batch\n", + "Epoch 10/10 Iteration 1603/1780 Training loss: 1.5037 0.1222 sec/batch\n", + "Epoch 10/10 Iteration 1604/1780 Training loss: 1.4527 0.1217 sec/batch\n", + "Epoch 10/10 Iteration 1605/1780 Training loss: 1.4277 0.1252 sec/batch\n", + "Epoch 10/10 Iteration 1606/1780 Training loss: 1.4221 0.1206 
sec/batch\n", + "Epoch 10/10 Iteration 1607/1780 Training loss: 1.4116 0.1220 sec/batch\n", + "Epoch 10/10 Iteration 1608/1780 Training loss: 1.3979 0.1219 sec/batch\n", + "Epoch 10/10 Iteration 1609/1780 Training loss: 1.3973 0.1243 sec/batch\n", + "Epoch 10/10 Iteration 1610/1780 Training loss: 1.3954 0.1233 sec/batch\n", + "Epoch 10/10 Iteration 1611/1780 Training loss: 1.3955 0.1269 sec/batch\n", + "Epoch 10/10 Iteration 1612/1780 Training loss: 1.3939 0.1219 sec/batch\n", + "Epoch 10/10 Iteration 1613/1780 Training loss: 1.3906 0.1257 sec/batch\n", + "Epoch 10/10 Iteration 1614/1780 Training loss: 1.3894 0.1240 sec/batch\n", + "Epoch 10/10 Iteration 1615/1780 Training loss: 1.3886 0.1228 sec/batch\n", + "Epoch 10/10 Iteration 1616/1780 Training loss: 1.3897 0.1220 sec/batch\n", + "Epoch 10/10 Iteration 1617/1780 Training loss: 1.3882 0.1248 sec/batch\n", + "Epoch 10/10 Iteration 1618/1780 Training loss: 1.3860 0.1238 sec/batch\n", + "Epoch 10/10 Iteration 1619/1780 Training loss: 1.3862 0.1225 sec/batch\n", + "Epoch 10/10 Iteration 1620/1780 Training loss: 1.3878 0.1222 sec/batch\n", + "Epoch 10/10 Iteration 1621/1780 Training loss: 1.3873 0.1219 sec/batch\n", + "Epoch 10/10 Iteration 1622/1780 Training loss: 1.3886 0.1232 sec/batch\n", + "Epoch 10/10 Iteration 1623/1780 Training loss: 1.3874 0.1277 sec/batch\n", + "Epoch 10/10 Iteration 1624/1780 Training loss: 1.3876 0.1213 sec/batch\n", + "Epoch 10/10 Iteration 1625/1780 Training loss: 1.3860 0.1217 sec/batch\n", + "Epoch 10/10 Iteration 1626/1780 Training loss: 1.3856 0.1226 sec/batch\n", + "Epoch 10/10 Iteration 1627/1780 Training loss: 1.3855 0.1219 sec/batch\n", + "Epoch 10/10 Iteration 1628/1780 Training loss: 1.3835 0.1222 sec/batch\n", + "Epoch 10/10 Iteration 1629/1780 Training loss: 1.3821 0.1256 sec/batch\n", + "Epoch 10/10 Iteration 1630/1780 Training loss: 1.3825 0.1217 sec/batch\n", + "Epoch 10/10 Iteration 1631/1780 Training loss: 1.3826 0.1251 sec/batch\n", + "Epoch 10/10 Iteration 1632/1780 
Training loss: 1.3828 0.1245 sec/batch\n", + "Epoch 10/10 Iteration 1633/1780 Training loss: 1.3823 0.1274 sec/batch\n", + "Epoch 10/10 Iteration 1634/1780 Training loss: 1.3810 0.1231 sec/batch\n", + "Epoch 10/10 Iteration 1635/1780 Training loss: 1.3813 0.1290 sec/batch\n", + "Epoch 10/10 Iteration 1636/1780 Training loss: 1.3817 0.1234 sec/batch\n", + "Epoch 10/10 Iteration 1637/1780 Training loss: 1.3814 0.1252 sec/batch\n", + "Epoch 10/10 Iteration 1638/1780 Training loss: 1.3810 0.1226 sec/batch\n", + "Epoch 10/10 Iteration 1639/1780 Training loss: 1.3801 0.1261 sec/batch\n", + "Epoch 10/10 Iteration 1640/1780 Training loss: 1.3790 0.1215 sec/batch\n", + "Epoch 10/10 Iteration 1641/1780 Training loss: 1.3775 0.1235 sec/batch\n", + "Epoch 10/10 Iteration 1642/1780 Training loss: 1.3768 0.1250 sec/batch\n", + "Epoch 10/10 Iteration 1643/1780 Training loss: 1.3763 0.1233 sec/batch\n", + "Epoch 10/10 Iteration 1644/1780 Training loss: 1.3766 0.1223 sec/batch\n", + "Epoch 10/10 Iteration 1645/1780 Training loss: 1.3763 0.1227 sec/batch\n", + "Epoch 10/10 Iteration 1646/1780 Training loss: 1.3757 0.1231 sec/batch\n", + "Epoch 10/10 Iteration 1647/1780 Training loss: 1.3760 0.1232 sec/batch\n", + "Epoch 10/10 Iteration 1648/1780 Training loss: 1.3749 0.1222 sec/batch\n", + "Epoch 10/10 Iteration 1649/1780 Training loss: 1.3745 0.1225 sec/batch\n", + "Epoch 10/10 Iteration 1650/1780 Training loss: 1.3739 0.1226 sec/batch\n", + "Epoch 10/10 Iteration 1651/1780 Training loss: 1.3737 0.1242 sec/batch\n", + "Epoch 10/10 Iteration 1652/1780 Training loss: 1.3741 0.1255 sec/batch\n", + "Epoch 10/10 Iteration 1653/1780 Training loss: 1.3734 0.1230 sec/batch\n", + "Epoch 10/10 Iteration 1654/1780 Training loss: 1.3742 0.1211 sec/batch\n", + "Epoch 10/10 Iteration 1655/1780 Training loss: 1.3738 0.1227 sec/batch\n", + "Epoch 10/10 Iteration 1656/1780 Training loss: 1.3740 0.1254 sec/batch\n", + "Epoch 10/10 Iteration 1657/1780 Training loss: 1.3737 0.1261 sec/batch\n", + 
"Epoch 10/10 Iteration 1658/1780 Training loss: 1.3739 0.1222 sec/batch\n", + "Epoch 10/10 Iteration 1659/1780 Training loss: 1.3742 0.1254 sec/batch\n", + "Epoch 10/10 Iteration 1660/1780 Training loss: 1.3737 0.1211 sec/batch\n", + "Epoch 10/10 Iteration 1661/1780 Training loss: 1.3733 0.1233 sec/batch\n", + "Epoch 10/10 Iteration 1662/1780 Training loss: 1.3740 0.1232 sec/batch\n", + "Epoch 10/10 Iteration 1663/1780 Training loss: 1.3740 0.1230 sec/batch\n", + "Epoch 10/10 Iteration 1664/1780 Training loss: 1.3747 0.1234 sec/batch\n", + "Epoch 10/10 Iteration 1665/1780 Training loss: 1.3751 0.1225 sec/batch\n", + "Epoch 10/10 Iteration 1666/1780 Training loss: 1.3752 0.1269 sec/batch\n", + "Epoch 10/10 Iteration 1667/1780 Training loss: 1.3751 0.1221 sec/batch\n", + "Epoch 10/10 Iteration 1668/1780 Training loss: 1.3751 0.1244 sec/batch\n", + "Epoch 10/10 Iteration 1669/1780 Training loss: 1.3752 0.1228 sec/batch\n", + "Epoch 10/10 Iteration 1670/1780 Training loss: 1.3748 0.1214 sec/batch\n", + "Epoch 10/10 Iteration 1671/1780 Training loss: 1.3748 0.1236 sec/batch\n", + "Epoch 10/10 Iteration 1672/1780 Training loss: 1.3747 0.1221 sec/batch\n", + "Epoch 10/10 Iteration 1673/1780 Training loss: 1.3751 0.1268 sec/batch\n", + "Epoch 10/10 Iteration 1674/1780 Training loss: 1.3753 0.1213 sec/batch\n", + "Epoch 10/10 Iteration 1675/1780 Training loss: 1.3758 0.1251 sec/batch\n", + "Epoch 10/10 Iteration 1676/1780 Training loss: 1.3754 0.1224 sec/batch\n", + "Epoch 10/10 Iteration 1677/1780 Training loss: 1.3753 0.1224 sec/batch\n", + "Epoch 10/10 Iteration 1678/1780 Training loss: 1.3753 0.1225 sec/batch\n", + "Epoch 10/10 Iteration 1679/1780 Training loss: 1.3751 0.1225 sec/batch\n", + "Epoch 10/10 Iteration 1680/1780 Training loss: 1.3749 0.1229 sec/batch\n", + "Epoch 10/10 Iteration 1681/1780 Training loss: 1.3742 0.1252 sec/batch\n", + "Epoch 10/10 Iteration 1682/1780 Training loss: 1.3741 0.1226 sec/batch\n", + "Epoch 10/10 Iteration 1683/1780 Training loss: 
1.3736 0.1255 sec/batch\n", + "Epoch 10/10 Iteration 1684/1780 Training loss: 1.3736 0.1217 sec/batch\n", + "Epoch 10/10 Iteration 1685/1780 Training loss: 1.3729 0.1251 sec/batch\n", + "Epoch 10/10 Iteration 1686/1780 Training loss: 1.3728 0.1218 sec/batch\n", + "Epoch 10/10 Iteration 1687/1780 Training loss: 1.3725 0.1235 sec/batch\n", + "Epoch 10/10 Iteration 1688/1780 Training loss: 1.3723 0.1215 sec/batch\n", + "Epoch 10/10 Iteration 1689/1780 Training loss: 1.3720 0.1262 sec/batch\n", + "Epoch 10/10 Iteration 1690/1780 Training loss: 1.3716 0.1229 sec/batch\n", + "Epoch 10/10 Iteration 1691/1780 Training loss: 1.3711 0.1232 sec/batch\n", + "Epoch 10/10 Iteration 1692/1780 Training loss: 1.3711 0.1215 sec/batch\n", + "Epoch 10/10 Iteration 1693/1780 Training loss: 1.3708 0.1228 sec/batch\n", + "Epoch 10/10 Iteration 1694/1780 Training loss: 1.3705 0.1233 sec/batch\n", + "Epoch 10/10 Iteration 1695/1780 Training loss: 1.3702 0.1253 sec/batch\n", + "Epoch 10/10 Iteration 1696/1780 Training loss: 1.3699 0.1233 sec/batch\n", + "Epoch 10/10 Iteration 1697/1780 Training loss: 1.3696 0.1231 sec/batch\n", + "Epoch 10/10 Iteration 1698/1780 Training loss: 1.3695 0.1218 sec/batch\n", + "Epoch 10/10 Iteration 1699/1780 Training loss: 1.3695 0.1242 sec/batch\n", + "Epoch 10/10 Iteration 1700/1780 Training loss: 1.3691 0.1220 sec/batch\n", + "Validation loss: 1.25628 Saving checkpoint!\n", + "Epoch 10/10 Iteration 1701/1780 Training loss: 1.3703 0.1237 sec/batch\n", + "Epoch 10/10 Iteration 1702/1780 Training loss: 1.3699 0.1257 sec/batch\n", + "Epoch 10/10 Iteration 1703/1780 Training loss: 1.3698 0.1244 sec/batch\n", + "Epoch 10/10 Iteration 1704/1780 Training loss: 1.3697 0.1210 sec/batch\n", + "Epoch 10/10 Iteration 1705/1780 Training loss: 1.3696 0.1271 sec/batch\n", + "Epoch 10/10 Iteration 1706/1780 Training loss: 1.3695 0.1220 sec/batch\n", + "Epoch 10/10 Iteration 1707/1780 Training loss: 1.3693 0.1230 sec/batch\n", + "Epoch 10/10 Iteration 1708/1780 Training 
loss: 1.3691 0.1214 sec/batch\n", + "Epoch 10/10 Iteration 1709/1780 Training loss: 1.3691 0.1233 sec/batch\n", + "Epoch 10/10 Iteration 1710/1780 Training loss: 1.3690 0.1252 sec/batch\n", + "Epoch 10/10 Iteration 1711/1780 Training loss: 1.3689 0.1254 sec/batch\n", + "Epoch 10/10 Iteration 1712/1780 Training loss: 1.3689 0.1226 sec/batch\n", + "Epoch 10/10 Iteration 1713/1780 Training loss: 1.3688 0.1226 sec/batch\n", + "Epoch 10/10 Iteration 1714/1780 Training loss: 1.3686 0.1216 sec/batch\n", + "Epoch 10/10 Iteration 1715/1780 Training loss: 1.3684 0.1223 sec/batch\n", + "Epoch 10/10 Iteration 1716/1780 Training loss: 1.3683 0.1222 sec/batch\n", + "Epoch 10/10 Iteration 1717/1780 Training loss: 1.3679 0.1280 sec/batch\n", + "Epoch 10/10 Iteration 1718/1780 Training loss: 1.3676 0.1235 sec/batch\n", + "Epoch 10/10 Iteration 1719/1780 Training loss: 1.3675 0.1218 sec/batch\n", + "Epoch 10/10 Iteration 1720/1780 Training loss: 1.3675 0.1205 sec/batch\n", + "Epoch 10/10 Iteration 1721/1780 Training loss: 1.3673 0.1237 sec/batch\n", + "Epoch 10/10 Iteration 1722/1780 Training loss: 1.3672 0.1234 sec/batch\n", + "Epoch 10/10 Iteration 1723/1780 Training loss: 1.3670 0.1233 sec/batch\n", + "Epoch 10/10 Iteration 1724/1780 Training loss: 1.3666 0.1210 sec/batch\n", + "Epoch 10/10 Iteration 1725/1780 Training loss: 1.3661 0.1220 sec/batch\n", + "Epoch 10/10 Iteration 1726/1780 Training loss: 1.3661 0.1216 sec/batch\n", + "Epoch 10/10 Iteration 1727/1780 Training loss: 1.3660 0.1231 sec/batch\n", + "Epoch 10/10 Iteration 1728/1780 Training loss: 1.3656 0.1217 sec/batch\n", + "Epoch 10/10 Iteration 1729/1780 Training loss: 1.3656 0.1358 sec/batch\n", + "Epoch 10/10 Iteration 1730/1780 Training loss: 1.3655 0.1230 sec/batch\n", + "Epoch 10/10 Iteration 1731/1780 Training loss: 1.3653 0.1226 sec/batch\n", + "Epoch 10/10 Iteration 1732/1780 Training loss: 1.3650 0.1224 sec/batch\n", + "Epoch 10/10 Iteration 1733/1780 Training loss: 1.3645 0.1263 sec/batch\n", + "Epoch 10/10 
Iteration 1734/1780 Training loss: 1.3642 0.1268 sec/batch\n", + "Epoch 10/10 Iteration 1735/1780 Training loss: 1.3642 0.1247 sec/batch\n", + "Epoch 10/10 Iteration 1736/1780 Training loss: 1.3642 0.1221 sec/batch\n", + "Epoch 10/10 Iteration 1737/1780 Training loss: 1.3641 0.1220 sec/batch\n", + "Epoch 10/10 Iteration 1738/1780 Training loss: 1.3641 0.1220 sec/batch\n", + "Epoch 10/10 Iteration 1739/1780 Training loss: 1.3642 0.1242 sec/batch\n", + "Epoch 10/10 Iteration 1740/1780 Training loss: 1.3642 0.1230 sec/batch\n", + "Epoch 10/10 Iteration 1741/1780 Training loss: 1.3641 0.1222 sec/batch\n", + "Epoch 10/10 Iteration 1742/1780 Training loss: 1.3641 0.1229 sec/batch\n", + "Epoch 10/10 Iteration 1743/1780 Training loss: 1.3644 0.1305 sec/batch\n", + "Epoch 10/10 Iteration 1744/1780 Training loss: 1.3643 0.1230 sec/batch\n", + "Epoch 10/10 Iteration 1745/1780 Training loss: 1.3642 0.1237 sec/batch\n", + "Epoch 10/10 Iteration 1746/1780 Training loss: 1.3644 0.1235 sec/batch\n", + "Epoch 10/10 Iteration 1747/1780 Training loss: 1.3643 0.1240 sec/batch\n", + "Epoch 10/10 Iteration 1748/1780 Training loss: 1.3643 0.1214 sec/batch\n", + "Epoch 10/10 Iteration 1749/1780 Training loss: 1.3643 0.1250 sec/batch\n", + "Epoch 10/10 Iteration 1750/1780 Training loss: 1.3644 0.1210 sec/batch\n", + "Epoch 10/10 Iteration 1751/1780 Training loss: 1.3644 0.1213 sec/batch\n", + "Epoch 10/10 Iteration 1752/1780 Training loss: 1.3643 0.1221 sec/batch\n", + "Epoch 10/10 Iteration 1753/1780 Training loss: 1.3640 0.1228 sec/batch\n", + "Epoch 10/10 Iteration 1754/1780 Training loss: 1.3637 0.1214 sec/batch\n", + "Epoch 10/10 Iteration 1755/1780 Training loss: 1.3637 0.1229 sec/batch\n", + "Epoch 10/10 Iteration 1756/1780 Training loss: 1.3636 0.1205 sec/batch\n", + "Epoch 10/10 Iteration 1757/1780 Training loss: 1.3635 0.1220 sec/batch\n", + "Epoch 10/10 Iteration 1758/1780 Training loss: 1.3635 0.1227 sec/batch\n", + "Epoch 10/10 Iteration 1759/1780 Training loss: 1.3634 0.1219 
sec/batch\n", + "Epoch 10/10 Iteration 1760/1780 Training loss: 1.3634 0.1237 sec/batch\n", + "Epoch 10/10 Iteration 1761/1780 Training loss: 1.3630 0.1224 sec/batch\n", + "Epoch 10/10 Iteration 1762/1780 Training loss: 1.3631 0.1231 sec/batch\n", + "Epoch 10/10 Iteration 1763/1780 Training loss: 1.3633 0.1252 sec/batch\n", + "Epoch 10/10 Iteration 1764/1780 Training loss: 1.3632 0.1230 sec/batch\n", + "Epoch 10/10 Iteration 1765/1780 Training loss: 1.3631 0.1226 sec/batch\n", + "Epoch 10/10 Iteration 1766/1780 Training loss: 1.3631 0.1220 sec/batch\n", + "Epoch 10/10 Iteration 1767/1780 Training loss: 1.3630 0.1261 sec/batch\n", + "Epoch 10/10 Iteration 1768/1780 Training loss: 1.3630 0.1215 sec/batch\n", + "Epoch 10/10 Iteration 1769/1780 Training loss: 1.3630 0.1260 sec/batch\n", + "Epoch 10/10 Iteration 1770/1780 Training loss: 1.3634 0.1234 sec/batch\n", + "Epoch 10/10 Iteration 1771/1780 Training loss: 1.3633 0.1226 sec/batch\n", + "Epoch 10/10 Iteration 1772/1780 Training loss: 1.3633 0.1212 sec/batch\n", + "Epoch 10/10 Iteration 1773/1780 Training loss: 1.3631 0.1219 sec/batch\n", + "Epoch 10/10 Iteration 1774/1780 Training loss: 1.3629 0.1213 sec/batch\n", + "Epoch 10/10 Iteration 1775/1780 Training loss: 1.3630 0.1227 sec/batch\n", + "Epoch 10/10 Iteration 1776/1780 Training loss: 1.3629 0.1212 sec/batch\n", + "Epoch 10/10 Iteration 1777/1780 Training loss: 1.3630 0.1228 sec/batch\n", + "Epoch 10/10 Iteration 1778/1780 Training loss: 1.3627 0.1205 sec/batch\n", + "Epoch 10/10 Iteration 1779/1780 Training loss: 1.3625 0.1228 sec/batch\n", + "Epoch 10/10 Iteration 1780/1780 Training loss: 1.3626 0.1239 sec/batch\n", + "Validation loss: 1.24267 Saving checkpoint!\n" + ] + } + ], + "source": [ + "epochs = 10\n", + "save_every_n = 100\n", + "train_x, train_y, val_x, val_y = split_data(chars, batch_size, num_steps)\n", + "\n", + "model = build_rnn(len(vocab), \n", + " batch_size=batch_size,\n", + " num_steps=num_steps,\n", + " learning_rate=learning_rate,\n", + 
" lstm_size=lstm_size,\n", + " num_layers=num_layers)\n", + "\n", + "saver = tf.train.Saver(max_to_keep=100)\n", + "\n", + "with tf.Session() as sess:\n", + " sess.run(tf.global_variables_initializer())\n", + " train_writer = tf.summary.FileWriter('./logs/2/train', sess.graph)\n", + " test_writer = tf.summary.FileWriter('./logs/2/test')\n", + " \n", + " # Use the line below to load a checkpoint and resume training\n", + " #saver.restore(sess, 'checkpoints/anna20.ckpt')\n", + " \n", + " n_batches = int(train_x.shape[1]/num_steps)\n", + " iterations = n_batches * epochs\n", + " for e in range(epochs):\n", + " \n", + " # Train network\n", + " new_state = sess.run(model.initial_state)\n", + " loss = 0\n", + " for b, (x, y) in enumerate(get_batch([train_x, train_y], num_steps), 1):\n", + " iteration = e*n_batches + b\n", + " start = time.time()\n", + " feed = {model.inputs: x,\n", + " model.targets: y,\n", + " model.keep_prob: 0.5,\n", + " model.initial_state: new_state}\n", + " summary, batch_loss, new_state, _ = sess.run([model.merged, model.cost, \n", + " model.final_state, model.optimizer], \n", + " feed_dict=feed)\n", + " loss += batch_loss\n", + " end = time.time()\n", + " print('Epoch {}/{} '.format(e+1, epochs),\n", + " 'Iteration {}/{}'.format(iteration, iterations),\n", + " 'Training loss: {:.4f}'.format(loss/b),\n", + " '{:.4f} sec/batch'.format((end-start)))\n", + " \n", + " train_writer.add_summary(summary, iteration)\n", + " \n", + " if (iteration%save_every_n == 0) or (iteration == iterations):\n", + " # Check performance, notice dropout has been set to 1\n", + " val_loss = []\n", + " new_state = sess.run(model.initial_state)\n", + " for x, y in get_batch([val_x, val_y], num_steps):\n", + " feed = {model.inputs: x,\n", + " model.targets: y,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " summary, batch_loss, new_state = sess.run([model.merged, model.cost, \n", + " model.final_state], feed_dict=feed)\n", + " 
val_loss.append(batch_loss)\n", + " \n", + " test_writer.add_summary(summary, iteration)\n", + "\n", + " print('Validation loss:', np.mean(val_loss),\n", + " 'Saving checkpoint!')\n", + " #saver.save(sess, \"checkpoints/anna/i{}_l{}_{:.3f}.ckpt\".format(iteration, lstm_size, np.mean(val_loss)))" + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "model_checkpoint_path: \"checkpoints/anna/i3560_l512_1.122.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i200_l512_2.432.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i400_l512_1.980.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i600_l512_1.750.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i800_l512_1.595.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1000_l512_1.484.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1200_l512_1.407.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1400_l512_1.349.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1600_l512_1.292.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1800_l512_1.255.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2000_l512_1.224.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2200_l512_1.204.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2400_l512_1.187.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2600_l512_1.172.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2800_l512_1.160.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3000_l512_1.148.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3200_l512_1.137.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3400_l512_1.129.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3560_l512_1.122.ckpt\"" + ] + }, + "execution_count": 35, + 
"metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "tf.train.get_checkpoint_state('checkpoints/anna')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Sampling\n", + "\n", + "Now that the network is trained, we'll can use it to generate new text. The idea is that we pass in a character, then the network will predict the next character. We can use the new one, to predict the next one. And we keep doing this to generate all new text. I also included some functionality to prime the network with some text by passing in a string and building up a state from that.\n", + "\n", + "The network gives us predictions for each character. To reduce noise and make things a little less random, I'm going to only choose a new character from the top N most likely characters.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def pick_top_n(preds, vocab_size, top_n=5):\n", + " p = np.squeeze(preds)\n", + " p[np.argsort(p)[:-top_n]] = 0\n", + " p = p / np.sum(p)\n", + " c = np.random.choice(vocab_size, 1, p=p)[0]\n", + " return c" + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def sample(checkpoint, n_samples, lstm_size, vocab_size, prime=\"The \"):\n", + " prime = \"Far\"\n", + " samples = [c for c in prime]\n", + " model = build_rnn(vocab_size, lstm_size=lstm_size, sampling=True)\n", + " saver = tf.train.Saver()\n", + " with tf.Session() as sess:\n", + " saver.restore(sess, checkpoint)\n", + " new_state = sess.run(model.initial_state)\n", + " for c in prime:\n", + " x = np.zeros((1, 1))\n", + " x[0,0] = vocab_to_int[c]\n", + " feed = {model.inputs: x,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " preds, new_state = sess.run([model.preds, model.final_state], \n", + " feed_dict=feed)\n", + "\n", + " c = pick_top_n(preds, 
len(vocab))\n", + " samples.append(int_to_vocab[c])\n", + "\n", + " for i in range(n_samples):\n", + " x[0,0] = c\n", + " feed = {model.inputs: x,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " preds, new_state = sess.run([model.preds, model.final_state], \n", + " feed_dict=feed)\n", + "\n", + " c = pick_top_n(preds, len(vocab))\n", + " samples.append(int_to_vocab[c])\n", + " \n", + " return ''.join(samples)" + ] + }, + { + "cell_type": "code", + "execution_count": 44, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farlathit that if had so\n", + "like it that it were. He could not trouble to his wife, and there was\n", + "anything in them of the side of his weaky in the creature at his forteren\n", + "to him.\n", + "\n", + "\"What is it? I can't bread to those,\" said Stepan Arkadyevitch. \"It's not\n", + "my children, and there is an almost this arm, true it mays already,\n", + "and tell you what I have say to you, and was not looking at the peasant,\n", + "why is, I don't know him out, and she doesn't speak to me immediately, as\n", + "you would say the countess and the more frest an angelembre, and time and\n", + "things's silent, but I was not in my stand that is in my head. But if he\n", + "say, and was so feeling with his soul. A child--in his soul of his\n", + "soul of his soul. He should not see that any of that sense of. Here he\n", + "had not been so composed and to speak for as in a whole picture, but\n", + "all the setting and her excellent and society, who had been delighted\n", + "and see to anywing had been being troed to thousand words on them,\n", + "we liked him.\n", + "\n", + "That set in her money at the table, he came into the party. 
The capable\n", + "of his she could not be as an old composure.\n", + "\n", + "\"That's all something there will be down becime by throe is\n", + "such a silent, as in a countess, I should state it out and divorct.\n", + "The discussion is not for me. I was that something was simply they are\n", + "all three manshess of a sensitions of mind it all.\"\n", + "\n", + "\"No,\" he thought, shouted and lifting his soul. \"While it might see your\n", + "honser and she, I could burst. And I had been a midelity. And I had a\n", + "marnief are through the countess,\" he said, looking at him, a chosing\n", + "which they had been carried out and still solied, and there was a sen that\n", + "was to be completely, and that this matter of all the seconds of it, and\n", + "a concipation were to her husband, who came up and conscaously, that he\n", + "was not the station. All his fourse she was always at the country,,\n", + "to speak oft, and though they were to hear the delightful throom and\n", + "whether they came towards the morning, and his living and a coller and\n", + "hold--the children. 
\n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i3560_l512_1.122.ckpt\"\n", + "samp = sample(checkpoint, 2000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farnt him oste wha sorind thans tout thint asd an sesand an hires on thime sind thit aled, ban thand and out hore as the ter hos ton ho te that, was tis tart al the hand sostint him sore an tit an son thes, win he se ther san ther hher tas tarereng,.\n", + "\n", + "Anl at an ades in ond hesiln, ad hhe torers teans, wast tar arering tho this sos alten sorer has hhas an siton ther him he had sin he ard ate te anling the sosin her ans and\n", + "arins asd and ther ale te tot an tand tanginge wath and ho ald, so sot th asend sat hare sother horesinnd, he hesense wing ante her so tith tir sherinn, anded and to the toul anderin he sorit he torsith she se atere an ting ot hand and thit hhe so the te wile har\n", + "ens ont in the sersise, and we he seres tar aterer, to ato tat or has he he wan ton here won and sen heren he sosering, to to theer oo adent har herere the wosh oute, was serild ward tous hed astend..\n", + "\n", + "I's sint on alt in har tor tit her asd hade shithans ored he talereng an soredendere tim tot hees. 
Tise sor and \n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i200_l512_2.432.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Fard as astice her said he celatice of to seress in the raice, and to be the some and sere allats to that said to that the sark and a cast a the wither ald the pacinesse of her had astition, he said to the sount as she west at hissele. Af the cond it he was a fact onthis astisarianing.\n", + "\n", + "\n", + "\"Or a ton to to be that's a more at aspestale as the sont of anstiring as\n", + "thours and trey.\n", + "\n", + "The same wo dangring the\n", + "raterst, who sore and somethy had ast out an of his book. \"We had's beane were that, and a morted a thay he had to tere. Then to\n", + "her homent andertersed his his ancouted to the pirsted, the soution for of the pirsice inthirgest and stenciol, with the hard and and\n", + "a colrice of to be oneres,\n", + "the song to this anderssad.\n", + "The could ounterss the said to serom of\n", + "soment a carsed of sheres of she\n", + "torded\n", + "har and want in their of hould, but\n", + "her told in that in he tad a the same to her. 
Serghing an her has and with the seed, and the camt ont his about of the\n", + "sail, the her then all houg ant or to hus to \n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i600_l512_1.750.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farrat, his felt has at it.\n", + "\n", + "\"When the pose ther hor exceed\n", + "to his sheant was,\" weat a sime of his sounsed. The coment and the facily that which had began terede a marilicaly whice whether the pose of his hand, at she was alligated herself the same on she had to\n", + "taiking to his forthing and streath how to hand\n", + "began in a lang at some at it, this he cholded not set all her. \"Wo love that is setthing. Him anstering as seen that.\"\n", + "\n", + "\"Yes in the man that say the mare a crances is it?\" said Sergazy Ivancatching. \"You doon think were somether is ifficult of a mone of\n", + "though the most at the countes that the\n", + "mean on the come to say the most, to\n", + "his feesing of\n", + "a man she, whilo he\n", + "sained and well, that he would still at to said. He wind at his for the sore in the most\n", + "of hoss and almoved to see him. 
They have betine the sumper into at he his stire, and what he was that at the so steate of the\n", + "sound, and shin should have a geest of shall feet on the conderation to she had been at that imporsing the dre\n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i1000_l512_1.484.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.0" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/tensorboard/Anna KaRNNa.ipynb b/tensorboard/Anna KaRNNa.ipynb new file mode 100644 index 0000000..7b4f6a6 --- /dev/null +++ b/tensorboard/Anna KaRNNa.ipynb @@ -0,0 +1,794 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Anna KaRNNa\n", + "\n", + "In this notebook, I'll build a character-wise RNN trained on Anna Karenina, one of my all-time favorite books. It'll be able to generate new text based on the text from the book.\n", + "\n", + "This network is based off of Andrej Karpathy's [post on RNNs](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) and [implementation in Torch](https://github.com/karpathy/char-rnn). Also, some information [here at r2rt](http://r2rt.com/recurrent-neural-networks-in-tensorflow-ii.html) and from [Sherjil Ozair](https://github.com/sherjilozair/char-rnn-tensorflow) on GitHub. 
Below is the general architecture of the character-wise RNN.\n", + "\n", + "" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "import time\n", + "from collections import namedtuple\n", + "\n", + "import numpy as np\n", + "import tensorflow as tf" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First we'll load the text file and convert it into integers for our network to use." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "with open('anna.txt', 'r') as f:\n", + " text=f.read()\n", + "vocab = set(text)\n", + "vocab_to_int = {c: i for i, c in enumerate(vocab)}\n", + "int_to_vocab = dict(enumerate(vocab))\n", + "chars = np.array([vocab_to_int[c] for c in text], dtype=np.int32)" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "'Chapter 1\\n\\n\\nHappy families are all alike; every unhappy family is unhappy in its own\\nway.\\n\\nEverythin'" + ] + }, + "execution_count": 3, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "text[:100]" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([82, 78, 3, 48, 15, 79, 77, 50, 30, 20, 20, 20, 38, 3, 48, 48, 8,\n", + " 50, 10, 3, 9, 33, 4, 33, 79, 43, 50, 3, 77, 79, 50, 3, 4, 4,\n", + " 50, 3, 4, 33, 17, 79, 64, 50, 79, 44, 79, 77, 8, 50, 49, 70, 78,\n", + " 3, 48, 48, 8, 50, 10, 3, 9, 33, 4, 8, 50, 33, 43, 50, 49, 70,\n", + " 78, 3, 48, 48, 8, 50, 33, 70, 50, 33, 15, 43, 50, 55, 62, 70, 20,\n", + " 62, 3, 8, 22, 20, 20, 80, 44, 79, 77, 8, 15, 78, 33, 70], dtype=int32)" + ] + }, + "execution_count": 4, + "metadata": {}, + "output_type": "execute_result" + 
} + ], + "source": [ + "chars[:100]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now I need to split up the data into batches, and into training and validation sets. I should be making a test set here, but I'm not going to worry about that. My test will be if the network can generate new text.\n", + "\n", + "Here I'll make both input and target arrays. The targets are the same as the inputs, except shifted one character over. I'll also drop the last bit of data so that I'll only have completely full batches.\n", + "\n", + "The idea here is to make a 2D matrix where the number of rows is equal to `batch_size`. Each row will be one long concatenated string from the character data. We'll split this data into a training set and validation set using the `split_frac` keyword argument. This will keep 90% of the batches in the training set, the other 10% in the validation set." + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def split_data(chars, batch_size, num_steps, split_frac=0.9):\n", + " \"\"\" \n", + " Split character data into training and validation sets, inputs and targets for each set.\n", + " \n", + " Arguments\n", + " ---------\n", + " chars: character array\n", + " batch_size: Number of sequences in each batch\n", + " num_steps: Number of sequence steps to keep in the input and pass to the network\n", + " split_frac: Fraction of batches to keep in the training set\n", + " \n", + " \n", + " Returns train_x, train_y, val_x, val_y\n", + " \"\"\"\n", + " \n", + " \n", + " slice_size = batch_size * num_steps\n", + " n_batches = int(len(chars) / slice_size)\n", + " \n", + " # Drop the last few characters to make only full batches\n", + " x = chars[: n_batches*slice_size]\n", + " y = chars[1: n_batches*slice_size + 1]\n", + " \n", + " # Split the data into batch_size slices, then stack them into a 2D matrix \n", + " x = np.stack(np.split(x, 
batch_size))\n", + " y = np.stack(np.split(y, batch_size))\n", + " \n", + " # Now x and y are arrays with dimensions batch_size x n_batches*num_steps\n", + " \n", + " # Split into training and validation sets, keep the first split_frac batches for training\n", + " split_idx = int(n_batches*split_frac)\n", + " train_x, train_y = x[:, :split_idx*num_steps], y[:, :split_idx*num_steps]\n", + " val_x, val_y = x[:, split_idx*num_steps:], y[:, split_idx*num_steps:]\n", + " \n", + " return train_x, train_y, val_x, val_y" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "train_x, train_y, val_x, val_y = split_data(chars, 10, 200)" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "(10, 178400)" + ] + }, + "execution_count": 7, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "train_x.shape" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[82, 78, 3, 48, 15, 79, 77, 50, 30, 20],\n", + " [67, 70, 58, 50, 78, 79, 50, 9, 55, 44],\n", + " [50, 65, 3, 15, 65, 78, 33, 70, 32, 50],\n", + " [55, 15, 78, 79, 77, 50, 62, 55, 49, 4],\n", + " [50, 15, 78, 79, 50, 4, 3, 70, 58, 18],\n", + " [50, 51, 78, 77, 55, 49, 32, 78, 50, 4],\n", + " [15, 50, 15, 55, 20, 58, 55, 22, 20, 20],\n", + " [55, 50, 78, 79, 77, 43, 79, 4, 10, 56],\n", + " [78, 3, 15, 50, 33, 43, 50, 15, 78, 79],\n", + " [79, 77, 43, 79, 4, 10, 50, 3, 70, 58]], dtype=int32)" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "train_x[:,:10]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "I'll write another function to grab batches out of the arrays made by `split_data`. 
Here each batch will be a sliding window on these arrays with size `batch_size x num_steps`. For example, if we want our network to train on a sequence of 100 characters, `num_steps = 100`. For the next batch, we'll shift this window to the next sequence of `num_steps` characters. In this way, we can feed batches to the network and the cell state will carry over from one batch to the next." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def get_batch(arrs, num_steps):\n", + " batch_size, slice_size = arrs[0].shape\n", + " \n", + " n_batches = int(slice_size/num_steps)\n", + " for b in range(n_batches):\n", + " yield [x[:, b*num_steps: (b+1)*num_steps] for x in arrs]" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def build_rnn(num_classes, batch_size=50, num_steps=50, lstm_size=128, num_layers=2,\n", + " learning_rate=0.001, grad_clip=5, sampling=False):\n", + " \n", + " if sampling == True:\n", + " batch_size, num_steps = 1, 1\n", + "\n", + " tf.reset_default_graph()\n", + " \n", + " # Declare placeholders we'll feed into the graph\n", + " \n", + " inputs = tf.placeholder(tf.int32, [batch_size, num_steps], name='inputs')\n", + " x_one_hot = tf.one_hot(inputs, num_classes, name='x_one_hot')\n", + "\n", + "\n", + " targets = tf.placeholder(tf.int32, [batch_size, num_steps], name='targets')\n", + " y_one_hot = tf.one_hot(targets, num_classes, name='y_one_hot')\n", + " y_reshaped = tf.reshape(y_one_hot, [-1, num_classes])\n", + " \n", + " keep_prob = tf.placeholder(tf.float32, name='keep_prob')\n", + " \n", + " # Build the RNN layers\n", + " \n", + " lstm = tf.contrib.rnn.BasicLSTMCell(lstm_size)\n", + " drop = tf.contrib.rnn.DropoutWrapper(lstm, output_keep_prob=keep_prob)\n", + " cell = tf.contrib.rnn.MultiRNNCell([drop] * num_layers)\n", + "\n", + " 
initial_state = cell.zero_state(batch_size, tf.float32)\n", + "\n", + " # Run the data through the RNN layers\n", + " rnn_inputs = [tf.squeeze(i, squeeze_dims=[1]) for i in tf.split(x_one_hot, num_steps, 1)]\n", + " outputs, state = tf.contrib.rnn.static_rnn(cell, rnn_inputs, initial_state=initial_state)\n", + " \n", + " final_state = tf.identity(state, name='final_state')\n", + " \n", + " # Reshape output so it's a bunch of rows, one row for each cell output\n", + " \n", + " seq_output = tf.concat(outputs, axis=1, name='seq_output')\n", + " output = tf.reshape(seq_output, [-1, lstm_size], name='graph_output')\n", + " \n", + " # Now connect the RNN outputs to a softmax layer and calculate the cost\n", + " softmax_w = tf.Variable(tf.truncated_normal((lstm_size, num_classes), stddev=0.1),\n", + " name='softmax_w')\n", + " softmax_b = tf.Variable(tf.zeros(num_classes), name='softmax_b')\n", + " logits = tf.matmul(output, softmax_w) + softmax_b\n", + "\n", + " preds = tf.nn.softmax(logits, name='predictions')\n", + " \n", + " loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y_reshaped, name='loss')\n", + " cost = tf.reduce_mean(loss, name='cost')\n", + "\n", + " # Optimizer for training, using gradient clipping to control exploding gradients\n", + " tvars = tf.trainable_variables()\n", + " grads, _ = tf.clip_by_global_norm(tf.gradients(cost, tvars), grad_clip)\n", + " train_op = tf.train.AdamOptimizer(learning_rate)\n", + " optimizer = train_op.apply_gradients(zip(grads, tvars))\n", + "\n", + " # Export the nodes \n", + " export_nodes = ['inputs', 'targets', 'initial_state', 'final_state',\n", + " 'keep_prob', 'cost', 'preds', 'optimizer']\n", + " Graph = namedtuple('Graph', export_nodes)\n", + " local_dict = locals()\n", + " graph = Graph(*[local_dict[each] for each in export_nodes])\n", + " \n", + " return graph" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Hyperparameters\n", + "\n", + "Here I'm defining the 
hyperparameters for the network. The two you probably haven't seen before are `lstm_size` and `num_layers`. These set the number of hidden units in the LSTM layers and the number of LSTM layers, respectively. Of course, making these bigger will improve the network's performance, but you'll have to watch out for overfitting. If your validation loss is much larger than the training loss, you're probably overfitting. Decrease the size of the network or decrease the dropout keep probability." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "batch_size = 100\n", + "num_steps = 100\n", + "lstm_size = 512\n", + "num_layers = 2\n", + "learning_rate = 0.001" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Write out the graph for TensorBoard" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": { + "collapsed": false + }, + "outputs": [], + "source": [ + "model = build_rnn(len(vocab),\n", + " batch_size=batch_size,\n", + " num_steps=num_steps,\n", + " learning_rate=learning_rate,\n", + " lstm_size=lstm_size,\n", + " num_layers=num_layers)\n", + "\n", + "with tf.Session() as sess:\n", + " \n", + " sess.run(tf.global_variables_initializer())\n", + " file_writer = tf.summary.FileWriter('./logs/1', sess.graph)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training\n", + "\n", + "Time for training, which is pretty straightforward. Here I pass in some data, and get an LSTM state back. Then I pass that state back in to the network so the next batch can continue the state from the previous batch. And every so often (set by `save_every_n`) I calculate the validation loss and save a checkpoint." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "!mkdir -p checkpoints/anna" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true, + "scrolled": true + }, + "outputs": [ + { + "ename": "ValueError", + "evalue": "Expected state to be a tuple of length 2, but received: Tensor(\"initial_state:0\", shape=(2, 2, 100, 512), dtype=float32)", + "output_type": "error", + "traceback": [ + "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[0;31mValueError\u001b[0m Traceback (most recent call last)", + "\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 8\u001b[0m \u001b[0mlearning_rate\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mlearning_rate\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 9\u001b[0m \u001b[0mlstm_size\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mlstm_size\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 10\u001b[0;31m num_layers=num_layers)\n\u001b[0m\u001b[1;32m 11\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 12\u001b[0m \u001b[0msaver\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtf\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mtrain\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mSaver\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmax_to_keep\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;36m100\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m\u001b[0m in \u001b[0;36mbuild_rnn\u001b[0;34m(num_classes, batch_size, num_steps, lstm_size, num_layers, learning_rate, grad_clip, sampling)\u001b[0m\n\u001b[1;32m 25\u001b[0m \u001b[0;31m# Run the data through the RNN layers\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 26\u001b[0m \u001b[0mrnn_inputs\u001b[0m \u001b[0;34m=\u001b[0m 
\u001b[0;34m[\u001b[0m\u001b[0mtf\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msqueeze\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0msqueeze_dims\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mi\u001b[0m \u001b[0;32min\u001b[0m \u001b[0mtf\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msplit\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mx_one_hot\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mnum_steps\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;36m1\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 27\u001b[0;31m \u001b[0moutputs\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstate\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtf\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mcontrib\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mrnn\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mstatic_rnn\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcell\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mrnn_inputs\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0minitial_state\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0minitial_state\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 28\u001b[0m \u001b[0mfinal_state\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mtf\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0midentity\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mstate\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mname\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'final_state'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 29\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/mat/miniconda3/envs/tf-gpu/lib/python3.5/site-packages/tensorflow/contrib/rnn/python/ops/core_rnn.py\u001b[0m in \u001b[0;36mstatic_rnn\u001b[0;34m(cell, inputs, initial_state, dtype, sequence_length, scope)\u001b[0m\n\u001b[1;32m 195\u001b[0m state_size=cell.state_size)\n\u001b[1;32m 196\u001b[0m 
\u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 197\u001b[0;31m \u001b[0;34m(\u001b[0m\u001b[0moutput\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstate\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mcall_cell\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 198\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 199\u001b[0m \u001b[0moutputs\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0moutput\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;32m/home/mat/miniconda3/envs/tf-gpu/lib/python3.5/site-packages/tensorflow/contrib/rnn/python/ops/core_rnn.py\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 182\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0mtime\u001b[0m \u001b[0;34m>\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mvarscope\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mreuse_variables\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 183\u001b[0m \u001b[0;31m# pylint: disable=cell-var-from-loop\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 184\u001b[0;31m \u001b[0mcall_cell\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mlambda\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mcell\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0minput_\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstate\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 185\u001b[0m \u001b[0;31m# pylint: enable=cell-var-from-loop\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 186\u001b[0m \u001b[0;32mif\u001b[0m \u001b[0msequence_length\u001b[0m \u001b[0;32mis\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0;32mNone\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + 
"\u001b[0;32m/home/mat/miniconda3/envs/tf-gpu/lib/python3.5/site-packages/tensorflow/contrib/rnn/python/ops/core_rnn_cell_impl.py\u001b[0m in \u001b[0;36m__call__\u001b[0;34m(self, inputs, state, scope)\u001b[0m\n\u001b[1;32m 647\u001b[0m raise ValueError(\n\u001b[1;32m 648\u001b[0m \u001b[0;34m\"Expected state to be a tuple of length %d, but received: %s\"\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 649\u001b[0;31m % (len(self.state_size), state))\n\u001b[0m\u001b[1;32m 650\u001b[0m \u001b[0mcur_state\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mstate\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0mi\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 651\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n", + "\u001b[0;31mValueError\u001b[0m: Expected state to be a tuple of length 2, but received: Tensor(\"initial_state:0\", shape=(2, 2, 100, 512), dtype=float32)" + ] + } + ], + "source": [ + "epochs = 1\n", + "save_every_n = 200\n", + "train_x, train_y, val_x, val_y = split_data(chars, batch_size, num_steps)\n", + "\n", + "model = build_rnn(len(vocab), \n", + " batch_size=batch_size,\n", + " num_steps=num_steps,\n", + " learning_rate=learning_rate,\n", + " lstm_size=lstm_size,\n", + " num_layers=num_layers)\n", + "\n", + "saver = tf.train.Saver(max_to_keep=100)\n", + "\n", + "with tf.Session() as sess:\n", + " sess.run(tf.global_variables_initializer())\n", + " \n", + " # Use the line below to load a checkpoint and resume training\n", + " #saver.restore(sess, 'checkpoints/anna20.ckpt')\n", + " \n", + " n_batches = int(train_x.shape[1]/num_steps)\n", + " iterations = n_batches * epochs\n", + " for e in range(epochs):\n", + " \n", + " # Train network\n", + " new_state = sess.run(model.initial_state)\n", + " loss = 0\n", + " for b, (x, y) in enumerate(get_batch([train_x, train_y], num_steps), 1):\n", + " iteration = e*n_batches + b\n", + " start = time.time()\n", + " feed = {model.inputs: x,\n", + " 
model.targets: y,\n", + " model.keep_prob: 0.5,\n", + " model.initial_state: new_state}\n", + " batch_loss, new_state, _ = sess.run([model.cost, model.final_state, model.optimizer], \n", + " feed_dict=feed)\n", + " loss += batch_loss\n", + " end = time.time()\n", + " print('Epoch {}/{} '.format(e+1, epochs),\n", + " 'Iteration {}/{}'.format(iteration, iterations),\n", + " 'Training loss: {:.4f}'.format(loss/b),\n", + " '{:.4f} sec/batch'.format((end-start)))\n", + " \n", + " \n", + " if (iteration%save_every_n == 0) or (iteration == iterations):\n", + " # Check performance, notice dropout has been set to 1\n", + " val_loss = []\n", + " new_state = sess.run(model.initial_state)\n", + " for x, y in get_batch([val_x, val_y], num_steps):\n", + " feed = {model.inputs: x,\n", + " model.targets: y,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " batch_loss, new_state = sess.run([model.cost, model.final_state], feed_dict=feed)\n", + " val_loss.append(batch_loss)\n", + "\n", + " print('Validation loss:', np.mean(val_loss),\n", + " 'Saving checkpoint!')\n", + " saver.save(sess, \"checkpoints/anna/i{}_l{}_{:.3f}.ckpt\".format(iteration, lstm_size, np.mean(val_loss)))" + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "model_checkpoint_path: \"checkpoints/anna/i3560_l512_1.122.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i200_l512_2.432.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i400_l512_1.980.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i600_l512_1.750.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i800_l512_1.595.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1000_l512_1.484.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1200_l512_1.407.ckpt\"\n", + "all_model_checkpoint_paths: 
\"checkpoints/anna/i1400_l512_1.349.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1600_l512_1.292.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i1800_l512_1.255.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2000_l512_1.224.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2200_l512_1.204.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2400_l512_1.187.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2600_l512_1.172.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i2800_l512_1.160.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3000_l512_1.148.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3200_l512_1.137.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3400_l512_1.129.ckpt\"\n", + "all_model_checkpoint_paths: \"checkpoints/anna/i3560_l512_1.122.ckpt\"" + ] + }, + "execution_count": 35, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "tf.train.get_checkpoint_state('checkpoints/anna')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Sampling\n", + "\n", + "Now that the network is trained, we can use it to generate new text. The idea is that we pass in a character and the network predicts the next one. We can then feed that new character back in to predict the one after it, and keep going to generate new text. I also included some functionality to prime the network with some text by passing in a string and building up a state from that.\n", + "\n", + "The network gives us predictions for each character. 
To reduce noise and make things a little less random, I'm only going to choose a new character from the top N most likely characters.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def pick_top_n(preds, vocab_size, top_n=5):\n", + " p = np.squeeze(preds)\n", + " p[np.argsort(p)[:-top_n]] = 0\n", + " p = p / np.sum(p)\n", + " c = np.random.choice(vocab_size, 1, p=p)[0]\n", + " return c" + ] + }, + { + "cell_type": "code", + "execution_count": 41, + "metadata": { + "collapsed": true + }, + "outputs": [], + "source": [ + "def sample(checkpoint, n_samples, lstm_size, vocab_size, prime=\"The \"):\n", + " samples = [c for c in prime]\n", + " model = build_rnn(vocab_size, lstm_size=lstm_size, sampling=True)\n", + " saver = tf.train.Saver()\n", + " with tf.Session() as sess:\n", + " saver.restore(sess, checkpoint)\n", + " new_state = sess.run(model.initial_state)\n", + " for c in prime:\n", + " x = np.zeros((1, 1))\n", + " x[0,0] = vocab_to_int[c]\n", + " feed = {model.inputs: x,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " preds, new_state = sess.run([model.preds, model.final_state], \n", + " feed_dict=feed)\n", + "\n", + " c = pick_top_n(preds, vocab_size)\n", + " samples.append(int_to_vocab[c])\n", + "\n", + " for i in range(n_samples):\n", + " x[0,0] = c\n", + " feed = {model.inputs: x,\n", + " model.keep_prob: 1.,\n", + " model.initial_state: new_state}\n", + " preds, new_state = sess.run([model.preds, model.final_state], \n", + " feed_dict=feed)\n", + "\n", + " c = pick_top_n(preds, vocab_size)\n", + " samples.append(int_to_vocab[c])\n", + " \n", + " return ''.join(samples)" + ] + }, + { + "cell_type": "code", + "execution_count": 44, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farlathit that if had so\n", + "like it that it were. 
He could not trouble to his wife, and there was\n", + "anything in them of the side of his weaky in the creature at his forteren\n", + "to him.\n", + "\n", + "\"What is it? I can't bread to those,\" said Stepan Arkadyevitch. \"It's not\n", + "my children, and there is an almost this arm, true it mays already,\n", + "and tell you what I have say to you, and was not looking at the peasant,\n", + "why is, I don't know him out, and she doesn't speak to me immediately, as\n", + "you would say the countess and the more frest an angelembre, and time and\n", + "things's silent, but I was not in my stand that is in my head. But if he\n", + "say, and was so feeling with his soul. A child--in his soul of his\n", + "soul of his soul. He should not see that any of that sense of. Here he\n", + "had not been so composed and to speak for as in a whole picture, but\n", + "all the setting and her excellent and society, who had been delighted\n", + "and see to anywing had been being troed to thousand words on them,\n", + "we liked him.\n", + "\n", + "That set in her money at the table, he came into the party. The capable\n", + "of his she could not be as an old composure.\n", + "\n", + "\"That's all something there will be down becime by throe is\n", + "such a silent, as in a countess, I should state it out and divorct.\n", + "The discussion is not for me. I was that something was simply they are\n", + "all three manshess of a sensitions of mind it all.\"\n", + "\n", + "\"No,\" he thought, shouted and lifting his soul. \"While it might see your\n", + "honser and she, I could burst. And I had been a midelity. And I had a\n", + "marnief are through the countess,\" he said, looking at him, a chosing\n", + "which they had been carried out and still solied, and there was a sen that\n", + "was to be completely, and that this matter of all the seconds of it, and\n", + "a concipation were to her husband, who came up and conscaously, that he\n", + "was not the station. 
All his fourse she was always at the country,,\n", + "to speak oft, and though they were to hear the delightful throom and\n", + "whether they came towards the morning, and his living and a coller and\n", + "hold--the children. \n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i3560_l512_1.122.ckpt\"\n", + "samp = sample(checkpoint, 2000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 43, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farnt him oste wha sorind thans tout thint asd an sesand an hires on thime sind thit aled, ban thand and out hore as the ter hos ton ho te that, was tis tart al the hand sostint him sore an tit an son thes, win he se ther san ther hher tas tarereng,.\n", + "\n", + "Anl at an ades in ond hesiln, ad hhe torers teans, wast tar arering tho this sos alten sorer has hhas an siton ther him he had sin he ard ate te anling the sosin her ans and\n", + "arins asd and ther ale te tot an tand tanginge wath and ho ald, so sot th asend sat hare sother horesinnd, he hesense wing ante her so tith tir sherinn, anded and to the toul anderin he sorit he torsith she se atere an ting ot hand and thit hhe so the te wile har\n", + "ens ont in the sersise, and we he seres tar aterer, to ato tat or has he he wan ton here won and sen heren he sosering, to to theer oo adent har herere the wosh oute, was serild ward tous hed astend..\n", + "\n", + "I's sint on alt in har tor tit her asd hade shithans ored he talereng an soredendere tim tot hees. 
Tise sor and \n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i200_l512_2.432.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 46, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Fard as astice her said he celatice of to seress in the raice, and to be the some and sere allats to that said to that the sark and a cast a the wither ald the pacinesse of her had astition, he said to the sount as she west at hissele. Af the cond it he was a fact onthis astisarianing.\n", + "\n", + "\n", + "\"Or a ton to to be that's a more at aspestale as the sont of anstiring as\n", + "thours and trey.\n", + "\n", + "The same wo dangring the\n", + "raterst, who sore and somethy had ast out an of his book. \"We had's beane were that, and a morted a thay he had to tere. Then to\n", + "her homent andertersed his his ancouted to the pirsted, the soution for of the pirsice inthirgest and stenciol, with the hard and and\n", + "a colrice of to be oneres,\n", + "the song to this anderssad.\n", + "The could ounterss the said to serom of\n", + "soment a carsed of sheres of she\n", + "torded\n", + "har and want in their of hould, but\n", + "her told in that in he tad a the same to her. 
Serghing an her has and with the seed, and the camt ont his about of the\n", + "sail, the her then all houg ant or to hus to \n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i600_l512_1.750.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Farrat, his felt has at it.\n", + "\n", + "\"When the pose ther hor exceed\n", + "to his sheant was,\" weat a sime of his sounsed. The coment and the facily that which had began terede a marilicaly whice whether the pose of his hand, at she was alligated herself the same on she had to\n", + "taiking to his forthing and streath how to hand\n", + "began in a lang at some at it, this he cholded not set all her. \"Wo love that is setthing. Him anstering as seen that.\"\n", + "\n", + "\"Yes in the man that say the mare a crances is it?\" said Sergazy Ivancatching. \"You doon think were somether is ifficult of a mone of\n", + "though the most at the countes that the\n", + "mean on the come to say the most, to\n", + "his feesing of\n", + "a man she, whilo he\n", + "sained and well, that he would still at to said. He wind at his for the sore in the most\n", + "of hoss and almoved to see him. 
They have betine the sumper into at he his stire, and what he was that at the so steate of the\n", + "sound, and shin should have a geest of shall feet on the conderation to she had been at that imporsing the dre\n" + ] + } + ], + "source": [ + "checkpoint = \"checkpoints/anna/i1000_l512_1.484.ckpt\"\n", + "samp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\n", + "print(samp)" + ] + } + ], + "metadata": { + "hide_input": false, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.1" + }, + "toc": { + "colors": { + "hover_highlight": "#DAA520", + "running_highlight": "#FF0000", + "selected_highlight": "#FFD700" + }, + "moveMenuLeft": true, + "nav_menu": { + "height": "123px", + "width": "335px" + }, + "navigate_menu": true, + "number_sections": true, + "sideBar": true, + "threshold": 4, + "toc_cell": false, + "toc_section_display": "block", + "toc_window_display": false + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/tensorboard/anna.txt b/tensorboard/anna.txt new file mode 100644 index 0000000..f177bb0 --- /dev/null +++ b/tensorboard/anna.txt @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:568bb39fc78aca98e9db91c4329dcd1aa5ec4c3df3aca2064ce5d6f023ae16c9 +size 2025486 diff --git a/tensorboard/assets/charseq.jpeg b/tensorboard/assets/charseq.jpeg new file mode 100644 index 0000000..c8e1221 Binary files /dev/null and b/tensorboard/assets/charseq.jpeg differ diff --git a/tensorboard/logs/1/events.out.tfevents.1490607407.sihil b/tensorboard/logs/1/events.out.tfevents.1490607407.sihil new file mode 100644 index 0000000..4d5d644 Binary files /dev/null and b/tensorboard/logs/1/events.out.tfevents.1490607407.sihil differ diff 
--git a/tensorboard/logs/3/events.out.tfevents.1490609182.sihil b/tensorboard/logs/3/events.out.tfevents.1490609182.sihil new file mode 100644 index 0000000..a1a4d69 Binary files /dev/null and b/tensorboard/logs/3/events.out.tfevents.1490609182.sihil differ diff --git a/tensorboard/logs/3/events.out.tfevents.1490610125.sihil b/tensorboard/logs/3/events.out.tfevents.1490610125.sihil new file mode 100644 index 0000000..cb468bf Binary files /dev/null and b/tensorboard/logs/3/events.out.tfevents.1490610125.sihil differ diff --git a/tensorboard/utils.py b/tensorboard/utils.py new file mode 100644 index 0000000..1056d93 --- /dev/null +++ b/tensorboard/utils.py @@ -0,0 +1,23 @@ +import numpy as np + +def split_data(chars, batch_size, num_steps, split_frac=0.9): + slice_size = batch_size * num_steps + n_batches = int(len(chars) / slice_size) + x = chars[: n_batches*slice_size] + y = chars[1: n_batches*slice_size + 1] + split_idx = int(batch_size*split_frac) + + x = np.stack(np.split(x, batch_size)) + y = np.stack(np.split(y, batch_size)) + + split_idx = int(n_batches*split_frac) + train_x, train_y= x[:, :split_idx*num_steps], y[:, :split_idx*num_steps] + val_x, val_y = x[:, split_idx*num_steps:], y[:, split_idx*num_steps:] + return train_x, train_y, val_x, val_y + +def get_batch(arrs, num_steps): + batch_size, slice_size = arrs[0].shape + + n_batches = int(slice_size/num_steps) + for b in range(n_batches): + yield [x[:, b*num_steps: (b+1)*num_steps] for x in arrs] \ No newline at end of file diff --git a/tv-script-generation/.ipynb_checkpoints/dlnd_tv_script_generation-checkpoint.ipynb b/tv-script-generation/.ipynb_checkpoints/dlnd_tv_script_generation-checkpoint.ipynb new file mode 100644 index 0000000..ac7b12e --- /dev/null +++ b/tv-script-generation/.ipynb_checkpoints/dlnd_tv_script_generation-checkpoint.ipynb @@ -0,0 +1,1048 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# TV Script 
Generation\n", + "In this project, you'll generate your own [Simpsons](https://en.wikipedia.org/wiki/The_Simpsons) TV scripts using RNNs. You'll be using part of the [Simpsons dataset](https://www.kaggle.com/wcukierski/the-simpsons-by-the-data) of scripts from 27 seasons. The Neural Network you'll build will generate a new TV script for a scene at [Moe's Tavern](https://simpsonswiki.com/wiki/Moe's_Tavern).\n", + "## Get the Data\n", + "The data is already provided for you. You'll be using a subset of the original dataset. It consists of only the scenes in Moe's Tavern. This doesn't include other versions of the tavern, like \"Moe's Cavern\", \"Flaming Moe's\", \"Uncle Moe's Family Feed-Bag\", etc.." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "import helper\n", + "\n", + "data_dir = './data/simpsons/moes_tavern_lines.txt'\n", + "text = helper.load_data(data_dir)\n", + "# Ignore notice, since we don't use it for analysing the data\n", + "text = text[81:]" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Explore the Data\n", + "Play around with `view_sentence_range` to view different parts of the data." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Dataset Stats\n", + "Roughly the number of unique words: 11492\n", + "Number of scenes: 262\n", + "Average number of sentences in each scene: 15.248091603053435\n", + "Number of lines: 4257\n", + "Average number of words in each line: 11.50434578341555\n", + "\n", + "The sentences 0 to 10:\n", + "Moe_Szyslak: (INTO PHONE) Moe's Tavern. 
Where the elite meet to drink.\n", + "Bart_Simpson: Eh, yeah, hello, is Mike there? Last name, Rotch.\n", + "Moe_Szyslak: (INTO PHONE) Hold on, I'll check. (TO BARFLIES) Mike Rotch. Mike Rotch. Hey, has anybody seen Mike Rotch, lately?\n", + "Moe_Szyslak: (INTO PHONE) Listen you little puke. One of these days I'm gonna catch you, and I'm gonna carve my name on your back with an ice pick.\n", + "Moe_Szyslak: What's the matter Homer? You're not your normal effervescent self.\n", + "Homer_Simpson: I got my problems, Moe. Give me another one.\n", + "Moe_Szyslak: Homer, hey, you should not drink to forget your problems.\n", + "Barney_Gumble: Yeah, you should only drink to enhance your social skills.\n", + "\n", + "\n" + ] + } + ], + "source": [ + "view_sentence_range = (0, 10)\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "import numpy as np\n", + "\n", + "print('Dataset Stats')\n", + "print('Roughly the number of unique words: {}'.format(len({word: None for word in text.split()})))\n", + "scenes = text.split('\\n\\n')\n", + "print('Number of scenes: {}'.format(len(scenes)))\n", + "sentence_count_scene = [scene.count('\\n') for scene in scenes]\n", + "print('Average number of sentences in each scene: {}'.format(np.average(sentence_count_scene)))\n", + "\n", + "sentences = [sentence for scene in scenes for sentence in scene.split('\\n')]\n", + "print('Number of lines: {}'.format(len(sentences)))\n", + "word_count_sentence = [len(sentence.split()) for sentence in sentences]\n", + "print('Average number of words in each line: {}'.format(np.average(word_count_sentence)))\n", + "\n", + "print()\n", + "print('The sentences {} to {}:'.format(*view_sentence_range))\n", + "print('\\n'.join(text.split('\\n')[view_sentence_range[0]:view_sentence_range[1]]))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Implement Preprocessing Functions\n", + "The first thing to do to 
any dataset is preprocessing. Implement the following preprocessing functions below:\n", + "- Lookup Table\n", + "- Tokenize Punctuation\n", + "\n", + "### Lookup Table\n", + "To create a word embedding, you first need to transform the words to ids. In this function, create two dictionaries:\n", + "- Dictionary to go from the words to an id, we'll call `vocab_to_int`\n", + "- Dictionary to go from the id to word, we'll call `int_to_vocab`\n", + "\n", + "Return these dictionaries in the following tuple `(vocab_to_int, int_to_vocab)`" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "import numpy as np\n", + "import problem_unittests as tests\n", + "\n", + "def create_lookup_tables(text):\n", + " \"\"\"\n", + " Create lookup tables for vocabulary\n", + " :param text: The text of tv scripts split into words\n", + " :return: A tuple of dicts (vocab_to_int, int_to_vocab)\n", + " \"\"\"\n", + " vocab = set(text)\n", + " \n", + " vocab_to_int = {word: index for index, word in enumerate(vocab)}\n", + " int_to_vocab = {index: word for (word, index) in vocab_to_int.items()}\n", + " \n", + " return vocab_to_int, int_to_vocab\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_create_lookup_tables(create_lookup_tables)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Tokenize Punctuation\n", + "We'll be splitting the script into a word array using spaces as delimiters. 
However, punctuation marks like periods and exclamation marks make it hard for the neural network to distinguish between the word \"bye\" and \"bye!\".\n", + "\n", + "Implement the function `token_lookup` to return a dict that will be used to tokenize symbols like \"!\" into \"||Exclamation_Mark||\". Create a dictionary for the following symbols where the symbol is the key and the token is the value:\n", + "- Period ( . )\n", + "- Comma ( , )\n", + "- Quotation Mark ( \" )\n", + "- Semicolon ( ; )\n", + "- Exclamation mark ( ! )\n", + "- Question mark ( ? )\n", + "- Left Parentheses ( ( )\n", + "- Right Parentheses ( ) )\n", + "- Dash ( -- )\n", + "- Return ( \\n )\n", + "\n", + "This dictionary will be used to tokenize the symbols and add the delimiter (space) around them. This separates each symbol into its own word, making it easier for the neural network to predict the next word. Make sure you don't use a token that could be confused with a word. Instead of using the token \"dash\", try using something like \"||dash||\"." 
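To make the goal concrete, here is a small standalone sketch of how such a dictionary would be applied (the project's `helper.py` performs this step internally; the exact helper code may differ, and this mini dict covers only three of the ten symbols):

```python
# Hypothetical mini token dict; the full version covers all ten symbols.
token_dict = {'.': '||period||', '!': '||exclamation_mark||', '\n': '||return||'}

text = "Hello!\nBye."
# Replace each symbol with its token, padded by spaces, so that a plain
# whitespace split treats the symbol as a word of its own.
for symbol, token in token_dict.items():
    text = text.replace(symbol, ' {} '.format(token))

words = text.split()
print(words)  # ['Hello', '||exclamation_mark||', '||return||', 'Bye', '||period||']
```

Because the symbols are disjoint and no token contains another symbol, the replacement order doesn't matter.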
+ ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "def token_lookup():\n", + " \"\"\"\n", + " Generate a dict to turn punctuation into a token.\n", + " :return: Tokenize dictionary where the key is the punctuation and the value is the token\n", + " \"\"\"\n", + " \n", + " return {\n", + " '.': '||period||',\n", + " ',': '||comma||',\n", + " '\"': '||quotation_mark||',\n", + " ';': '||semicolon||',\n", + " '!': '||exclamation_mark||',\n", + " '?': '||question_mark||',\n", + " '(': '||left_parentheses||',\n", + " ')': '||right_parentheses||',\n", + " '--': '||dash||',\n", + " '\\n': '||return||'\n", + " }\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_tokenize(token_lookup)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Preprocess all the data and save it\n", + "Running the code cell below will preprocess all the data and save it to file." + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "# Preprocess Training, Validation, and Testing Data\n", + "helper.preprocess_and_save_data(data_dir, token_lookup, create_lookup_tables)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# Check Point\n", + "This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk." 
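As a quick sanity check outside the unit tests, the `create_lookup_tables` implementation from earlier can be verified to round-trip on a toy word list (a minimal sketch repeating the same logic so it runs standalone):

```python
def create_lookup_tables(text):
    # Same approach as the notebook cell: enumerate the unique words.
    vocab = set(text)
    vocab_to_int = {word: index for index, word in enumerate(vocab)}
    int_to_vocab = {index: word for word, index in vocab_to_int.items()}
    return vocab_to_int, int_to_vocab

words = ['moe', 'homer', 'moe', 'bart']
vocab_to_int, int_to_vocab = create_lookup_tables(words)

assert len(vocab_to_int) == 3             # duplicates collapse into one id
assert sorted(int_to_vocab) == [0, 1, 2]  # ids are dense, starting at 0
assert all(int_to_vocab[vocab_to_int[w]] == w for w in words)  # round-trip
```

Note that because `set` ordering is not deterministic across runs, word ids can change between executions; that is fine as long as both dictionaries are built from the same enumeration.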
+ ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "import helper\n", + "import numpy as np\n", + "import problem_unittests as tests\n", + "\n", + "int_text, vocab_to_int, int_to_vocab, token_dict = helper.load_preprocess()" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Build the Neural Network\n", + "You'll build the components necessary to build a RNN by implementing the following functions below:\n", + "- get_inputs\n", + "- get_init_cell\n", + "- get_embed\n", + "- build_rnn\n", + "- build_nn\n", + "- get_batches\n", + "\n", + "### Check the Version of TensorFlow and Access to GPU" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "TensorFlow Version: 1.0.0\n" + ] + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/ipykernel/__main__.py:14: UserWarning: No GPU found. Please use a GPU to train your neural network.\n" + ] + } + ], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "from distutils.version import LooseVersion\n", + "import warnings\n", + "import tensorflow as tf\n", + "\n", + "# Check TensorFlow Version\n", + "assert LooseVersion(tf.__version__) >= LooseVersion('1.0'), 'Please use TensorFlow version 1.0 or newer'\n", + "print('TensorFlow Version: {}'.format(tf.__version__))\n", + "\n", + "# Check for a GPU\n", + "if not tf.test.gpu_device_name():\n", + " warnings.warn('No GPU found. 
Please use a GPU to train your neural network.')\n", + "else:\n", + " print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Input\n", + "Implement the `get_inputs()` function to create TF Placeholders for the Neural Network. It should create the following placeholders:\n", + "- Input text placeholder named \"input\" using the [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder) `name` parameter.\n", + "- Targets placeholder\n", + "- Learning Rate placeholder\n", + "\n", + "Return the placeholders in the following tuple `(Input, Targets, LearningRate)`" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_inputs():\n", + " \"\"\"\n", + " Create TF Placeholders for input, targets, and learning rate.\n", + " :return: Tuple (input, targets, learning rate)\n", + " \"\"\"\n", + " # TODO: Implement Function\n", + " return None, None, None\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_get_inputs(get_inputs)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Build RNN Cell and Initialize\n", + "Stack one or more [`BasicLSTMCells`](https://www.tensorflow.org/api_docs/python/tf/contrib/rnn/BasicLSTMCell) in a [`MultiRNNCell`](https://www.tensorflow.org/api_docs/python/tf/contrib/rnn/MultiRNNCell).\n", + "- The RNN size should be set using `rnn_size`\n", + "- Initialize the cell state using the MultiRNNCell's [`zero_state()`](https://www.tensorflow.org/api_docs/python/tf/contrib/rnn/MultiRNNCell#zero_state) function\n", + " - Apply the name \"initial_state\" to the initial state using [`tf.identity()`](https://www.tensorflow.org/api_docs/python/tf/identity)\n", + "\n", + "Return the 
cell and initial state in the following tuple `(Cell, InitialState)`" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_init_cell(batch_size, rnn_size):\n", + " \"\"\"\n", + " Create an RNN Cell and initialize it.\n", + " :param batch_size: Size of batches\n", + " :param rnn_size: Size of RNNs\n", + " :return: Tuple (cell, initialize state)\n", + " \"\"\"\n", + " # TODO: Implement Function\n", + " return None, None\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_get_init_cell(get_init_cell)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Word Embedding\n", + "Apply embedding to `input_data` using TensorFlow. Return the embedded sequence." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_embed(input_data, vocab_size, embed_dim):\n", + " \"\"\"\n", + " Create embedding for input_data.\n", + " :param input_data: TF placeholder for text input.\n", + " :param vocab_size: Number of words in vocabulary.\n", + " :param embed_dim: Number of embedding dimensions\n", + " :return: Embedded input.\n", + " \"\"\"\n", + " # TODO: Implement Function\n", + " return None\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_get_embed(get_embed)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Build RNN\n", + "You created an RNN Cell in the `get_init_cell()` function. 
Time to use the cell to create an RNN.\n", + "- Build the RNN using [`tf.nn.dynamic_rnn()`](https://www.tensorflow.org/api_docs/python/tf/nn/dynamic_rnn)\n", + " - Apply the name \"final_state\" to the final state using [`tf.identity()`](https://www.tensorflow.org/api_docs/python/tf/identity)\n", + "\n", + "Return the outputs and final state in the following tuple `(Outputs, FinalState)` " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def build_rnn(cell, inputs):\n", + " \"\"\"\n", + " Create an RNN using an RNN Cell\n", + " :param cell: RNN Cell\n", + " :param inputs: Input text data\n", + " :return: Tuple (Outputs, Final State)\n", + " \"\"\"\n", + " # TODO: Implement Function\n", + " return None, None\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_build_rnn(build_rnn)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Build the Neural Network\n", + "Apply the functions you implemented above to:\n", + "- Apply embedding to `input_data` using your `get_embed(input_data, vocab_size, embed_dim)` function.\n", + "- Build RNN using `cell` and your `build_rnn(cell, inputs)` function.\n", + "- Apply a fully connected layer with a linear activation and `vocab_size` as the number of outputs.\n", + "\n", + "Return the logits and final state in the following tuple (Logits, FinalState) " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def build_nn(cell, rnn_size, input_data, vocab_size):\n", + " \"\"\"\n", + " Build part of the neural network\n", + " :param cell: RNN cell\n", + " :param rnn_size: Size of RNNs\n", + " :param input_data: Input data\n", + " 
:param vocab_size: Vocabulary size\n", + " :return: Tuple (Logits, FinalState)\n", + " \"\"\"\n", + " # TODO: Implement Function\n", + " return None, None\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_build_nn(build_nn)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Batches\n", + "Implement `get_batches` to create batches of input and targets using `int_text`. The batches should be a Numpy array with the shape `(number of batches, 2, batch size, sequence length)`. Each batch contains two elements:\n", + "- The first element is a single batch of **input** with the shape `[batch size, sequence length]`\n", + "- The second element is a single batch of **targets** with the shape `[batch size, sequence length]`\n", + "\n", + "If you can't fill the last batch with enough data, drop the last batch.\n", + "\n", + "For example, `get_batches([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15], 2, 3)` would return a Numpy array of the following:\n", + "```\n", + "[\n", + " # First Batch\n", + " [\n", + " # Batch of Input\n", + " [[ 1 2 3], [ 7 8 9]],\n", + " # Batch of targets\n", + " [[ 2 3 4], [ 8 9 10]]\n", + " ],\n", + " \n", + " # Second Batch\n", + " [\n", + " # Batch of Input\n", + " [[ 4 5 6], [10 11 12]],\n", + " # Batch of targets\n", + " [[ 5 6 7], [11 12 13]]\n", + " ]\n", + "]\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_batches(int_text, batch_size, seq_length):\n", + " \"\"\"\n", + " Return batches of input and target\n", + " :param int_text: Text with the words replaced by their ids\n", + " :param batch_size: The size of batch\n", + " :param seq_length: The length of sequence\n", + " :return: Batches as a Numpy array\n", + " \"\"\"\n", + " # TODO: 
Implement Function\n", + " return None\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_get_batches(get_batches)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Neural Network Training\n", + "### Hyperparameters\n", + "Tune the following parameters:\n", + "\n", + "- Set `num_epochs` to the number of epochs.\n", + "- Set `batch_size` to the batch size.\n", + "- Set `rnn_size` to the size of the RNNs.\n", + "- Set `seq_length` to the length of sequence.\n", + "- Set `learning_rate` to the learning rate.\n", + "- Set `show_every_n_batches` to the number of batches the neural network should print progress." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "# Number of Epochs\n", + "num_epochs = None\n", + "# Batch Size\n", + "batch_size = None\n", + "# RNN Size\n", + "rnn_size = None\n", + "# Sequence Length\n", + "seq_length = None\n", + "# Learning Rate\n", + "learning_rate = None\n", + "# Show stats for every n number of batches\n", + "show_every_n_batches = None\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "save_dir = './save'" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Build the Graph\n", + "Build the graph using the neural network you implemented." 
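Before building the graph, the `get_batches` logic is worth sanity-checking on its own. One possible implementation consistent with the documented example (a sketch, assuming the text extends at least one word past the last full batch so every input has a target):

```python
import numpy as np

def get_batches_sketch(int_text, batch_size, seq_length):
    # How many full batches fit; words past that point are dropped.
    n_batches = len(int_text) // (batch_size * seq_length)
    keep = n_batches * batch_size * seq_length
    xdata = np.array(int_text[:keep])
    ydata = np.array(int_text[1:keep + 1])  # targets are inputs shifted by one
    # Reshape to (batch_size, n_batches * seq_length), then cut along time
    # so consecutive batches continue each row's sequence.
    x_batches = np.split(xdata.reshape(batch_size, -1), n_batches, axis=1)
    y_batches = np.split(ydata.reshape(batch_size, -1), n_batches, axis=1)
    return np.array(list(zip(x_batches, y_batches)))

batches = get_batches_sketch(list(range(1, 16)), batch_size=2, seq_length=3)
print(batches.shape)           # (2, 2, 2, 3): (n batches, input/target, batch, seq)
print(batches[0][0].tolist())  # [[1, 2, 3], [7, 8, 9]]
```

Splitting along the time axis (rather than chunking the flat text) is what makes passing the final state of one batch into the next meaningful during training.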
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "from tensorflow.contrib import seq2seq\n", + "\n", + "train_graph = tf.Graph()\n", + "with train_graph.as_default():\n", + " vocab_size = len(int_to_vocab)\n", + " input_text, targets, lr = get_inputs()\n", + " input_data_shape = tf.shape(input_text)\n", + " cell, initial_state = get_init_cell(input_data_shape[0], rnn_size)\n", + " logits, final_state = build_nn(cell, rnn_size, input_text, vocab_size)\n", + "\n", + " # Probabilities for generating words\n", + " probs = tf.nn.softmax(logits, name='probs')\n", + "\n", + " # Loss function\n", + " cost = seq2seq.sequence_loss(\n", + " logits,\n", + " targets,\n", + " tf.ones([input_data_shape[0], input_data_shape[1]]))\n", + "\n", + " # Optimizer\n", + " optimizer = tf.train.AdamOptimizer(lr)\n", + "\n", + " # Gradient Clipping\n", + " gradients = optimizer.compute_gradients(cost)\n", + " capped_gradients = [(tf.clip_by_value(grad, -1., 1.), var) for grad, var in gradients]\n", + " train_op = optimizer.apply_gradients(capped_gradients)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Train\n", + "Train the neural network on the preprocessed data. If you have a hard time getting a good loss, check the [forums](https://discussions.udacity.com/) to see if anyone is having the same problem." 
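For intuition about what a "good loss" looks like: `sequence_loss` with all-ones weights reduces to the mean per-token softmax cross-entropy, so an untrained model with uniform outputs starts near `log(vocab_size)`. A NumPy sketch of the same quantity (illustration only, not the TF implementation):

```python
import numpy as np

def mean_token_cross_entropy(logits, targets):
    # logits: (batch, steps, vocab); targets: (batch, steps) of word ids.
    # Mirrors seq2seq.sequence_loss with weights of all ones.
    shifted = logits - logits.max(axis=-1, keepdims=True)  # numerical stability
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    b, t = np.indices(targets.shape)
    return -log_probs[b, t, targets].mean()

# Uniform logits over a 4-word vocab give a loss of log(4) ~ 1.386; with the
# full ~11k-word vocab, expect an initial loss around log(11492) ~ 9.35.
logits = np.zeros((2, 3, 4))
targets = np.zeros((2, 3), dtype=int)
print(round(float(mean_token_cross_entropy(logits, targets)), 3))  # 1.386
```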
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "batches = get_batches(int_text, batch_size, seq_length)\n", + "\n", + "with tf.Session(graph=train_graph) as sess:\n", + " sess.run(tf.global_variables_initializer())\n", + "\n", + " for epoch_i in range(num_epochs):\n", + " state = sess.run(initial_state, {input_text: batches[0][0]})\n", + "\n", + " for batch_i, (x, y) in enumerate(batches):\n", + " feed = {\n", + " input_text: x,\n", + " targets: y,\n", + " initial_state: state,\n", + " lr: learning_rate}\n", + " train_loss, state, _ = sess.run([cost, final_state, train_op], feed)\n", + "\n", + " # Show stats every show_every_n_batches batches\n", + " if (epoch_i * len(batches) + batch_i) % show_every_n_batches == 0:\n", + " print('Epoch {:>3} Batch {:>4}/{} train_loss = {:.3f}'.format(\n", + " epoch_i,\n", + " batch_i,\n", + " len(batches),\n", + " train_loss))\n", + "\n", + " # Save Model\n", + " saver = tf.train.Saver()\n", + " saver.save(sess, save_dir)\n", + " print('Model Trained and Saved')" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Save Parameters\n", + "Save `seq_length` and `save_dir` for generating a new TV script." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "# Save parameters for checkpoint\n", + "helper.save_params((seq_length, save_dir))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# Checkpoint" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "import tensorflow as tf\n", + "import numpy as np\n", + "import helper\n", + "import problem_unittests as tests\n", + "\n", + "_, vocab_to_int, int_to_vocab, token_dict = helper.load_preprocess()\n", + "seq_length, load_dir = helper.load_params()" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Implement Generate Functions\n", + "### Get Tensors\n", + "Get tensors from `loaded_graph` using the function [`get_tensor_by_name()`](https://www.tensorflow.org/api_docs/python/tf/Graph#get_tensor_by_name). 
Get the tensors using the following names:\n", + "- \"input:0\"\n", + "- \"initial_state:0\"\n", + "- \"final_state:0\"\n", + "- \"probs:0\"\n", + "\n", + "Return the tensors in the following tuple `(InputTensor, InitialStateTensor, FinalStateTensor, ProbsTensor)` " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_tensors(loaded_graph):\n", + " \"\"\"\n", + " Get input, initial state, final state, and probabilities tensors from loaded_graph.\n", + " :param loaded_graph: TensorFlow graph loaded from file\n", + " :return: Tuple (InputTensor, InitialStateTensor, FinalStateTensor, ProbsTensor)\n", + " \"\"\"\n", + " # TODO: Implement Function\n", + " return None, None, None, None\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_get_tensors(get_tensors)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Choose Word\n", + "Implement the `pick_word()` function to select the next word using `probabilities`." 
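One common strategy (a sketch, not the only valid answer) is to sample from the predicted distribution rather than always taking the argmax, which keeps the generated script from looping on the most frequent words:

```python
import numpy as np

def pick_word_sketch(probabilities, int_to_vocab):
    # Sample a word id in proportion to its predicted probability.
    p = np.asarray(probabilities, dtype=np.float64)
    p = p / p.sum()  # renormalize to guard against floating-point drift
    word_id = np.random.choice(len(p), p=p)
    return int_to_vocab[word_id]

int_to_vocab = {0: 'moe', 1: 'homer', 2: 'bart'}  # toy vocabulary
word = pick_word_sketch([0.2, 0.5, 0.3], int_to_vocab)
assert word in int_to_vocab.values()
```

Taking `np.argmax` instead would also pass a round-trip check, but sampling usually produces more varied text.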
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def pick_word(probabilities, int_to_vocab):\n", + " \"\"\"\n", + " Pick the next word in the generated text\n", + " :param probabilities: Probabilities of the next word\n", + " :param int_to_vocab: Dictionary of word ids as the keys and words as the values\n", + " :return: String of the predicted word\n", + " \"\"\"\n", + " # TODO: Implement Function\n", + " return None\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_pick_word(pick_word)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Generate TV Script\n", + "This will generate the TV script for you. Set `gen_length` to the length of TV script you want to generate." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "gen_length = 200\n", + "# homer_simpson, moe_szyslak, or Barney_Gumble\n", + "prime_word = 'moe_szyslak'\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "loaded_graph = tf.Graph()\n", + "with tf.Session(graph=loaded_graph) as sess:\n", + " # Load saved model\n", + " loader = tf.train.import_meta_graph(load_dir + '.meta')\n", + " loader.restore(sess, load_dir)\n", + "\n", + " # Get Tensors from loaded model\n", + " input_text, initial_state, final_state, probs = get_tensors(loaded_graph)\n", + "\n", + " # Sentences generation setup\n", + " gen_sentences = [prime_word + ':']\n", + " prev_state = sess.run(initial_state, {input_text: np.array([[1]])})\n", + "\n", + " # Generate sentences\n", + " for n in range(gen_length):\n", + " # Dynamic Input\n", + " dyn_input = [[vocab_to_int[word] for 
word in gen_sentences[-seq_length:]]]\n", + " dyn_seq_length = len(dyn_input[0])\n", + "\n", + " # Get Prediction\n", + " probabilities, prev_state = sess.run(\n", + " [probs, final_state],\n", + " {input_text: dyn_input, initial_state: prev_state})\n", + " \n", + " pred_word = pick_word(probabilities[dyn_seq_length-1], int_to_vocab)\n", + "\n", + " gen_sentences.append(pred_word)\n", + " \n", + " # Remove tokens\n", + " tv_script = ' '.join(gen_sentences)\n", + " for key, token in token_dict.items():\n", + " ending = ' ' if key in ['\\n', '(', '\"'] else ''\n", + " tv_script = tv_script.replace(' ' + token.lower(), key)\n", + " tv_script = tv_script.replace('\\n ', '\\n')\n", + " tv_script = tv_script.replace('( ', '(')\n", + " \n", + " print(tv_script)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# The TV Script is Nonsensical\n", + "It's ok if the TV script doesn't make any sense. We trained on less than a megabyte of text. In order to get good results, you'll have to use a smaller vocabulary or get more data. Luckily there's more data! As we mentioned at the beginning of this project, this is a subset of [another dataset](https://www.kaggle.com/wcukierski/the-simpsons-by-the-data). We didn't have you train on all the data, because that would take too long. However, you are free to train your neural network on all the data. After you complete the project, of course.\n", + "# Submitting This Project\n", + "When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as \"dlnd_tv_script_generation.ipynb\" and save it as an HTML file under \"File\" -> \"Download as\". Include the \"helper.py\" and \"problem_unittests.py\" files in your submission." 
+ ] + } + ], + "metadata": { + "hide_input": false, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.1" + }, + "toc": { + "colors": { + "hover_highlight": "#DAA520", + "running_highlight": "#FF0000", + "selected_highlight": "#FFD700" + }, + "moveMenuLeft": true, + "nav_menu": { + "height": "511px", + "width": "251px" + }, + "navigate_menu": true, + "number_sections": true, + "sideBar": true, + "threshold": 4, + "toc_cell": false, + "toc_section_display": "block", + "toc_window_display": false + }, + "widgets": { + "state": {}, + "version": "1.1.2" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} diff --git a/tv-script-generation/__pycache__/helper.cpython-35.pyc b/tv-script-generation/__pycache__/helper.cpython-35.pyc new file mode 100644 index 0000000..a9561b5 Binary files /dev/null and b/tv-script-generation/__pycache__/helper.cpython-35.pyc differ diff --git a/tv-script-generation/__pycache__/problem_unittests.cpython-35.pyc b/tv-script-generation/__pycache__/problem_unittests.cpython-35.pyc new file mode 100644 index 0000000..fb7b788 Binary files /dev/null and b/tv-script-generation/__pycache__/problem_unittests.cpython-35.pyc differ diff --git a/tv-script-generation/data/simpsons/moes_tavern_lines.txt b/tv-script-generation/data/simpsons/moes_tavern_lines.txt new file mode 100644 index 0000000..1410bfe --- /dev/null +++ b/tv-script-generation/data/simpsons/moes_tavern_lines.txt @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:44eaf379f2368165d52d395329d0c639ef1a0036d5b7e2c0f0d3acfd657d7ed4 +size 305284 diff --git a/tv-script-generation/dlnd_tv_script_generation.ipynb b/tv-script-generation/dlnd_tv_script_generation.ipynb new file mode 100644 
index 0000000..ac7b12e --- /dev/null +++ b/tv-script-generation/dlnd_tv_script_generation.ipynb @@ -0,0 +1,1048 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# TV Script Generation\n", + "In this project, you'll generate your own [Simpsons](https://en.wikipedia.org/wiki/The_Simpsons) TV scripts using RNNs. You'll be using part of the [Simpsons dataset](https://www.kaggle.com/wcukierski/the-simpsons-by-the-data) of scripts from 27 seasons. The Neural Network you'll build will generate a new TV script for a scene at [Moe's Tavern](https://simpsonswiki.com/wiki/Moe's_Tavern).\n", + "## Get the Data\n", + "The data is already provided for you. You'll be using a subset of the original dataset. It consists of only the scenes in Moe's Tavern. This doesn't include other versions of the tavern, like \"Moe's Cavern\", \"Flaming Moe's\", \"Uncle Moe's Family Feed-Bag\", etc." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "import helper\n", + "\n", + "data_dir = './data/simpsons/moes_tavern_lines.txt'\n", + "text = helper.load_data(data_dir)\n", + "# Ignore notice, since we don't use it for analysing the data\n", + "text = text[81:]" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Explore the Data\n", + "Play around with `view_sentence_range` to view different parts of the data." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Dataset Stats\n", + "Roughly the number of unique words: 11492\n", + "Number of scenes: 262\n", + "Average number of sentences in each scene: 15.248091603053435\n", + "Number of lines: 4257\n", + "Average number of words in each line: 11.50434578341555\n", + "\n", + "The sentences 0 to 10:\n", + "Moe_Szyslak: (INTO PHONE) Moe's Tavern. Where the elite meet to drink.\n", + "Bart_Simpson: Eh, yeah, hello, is Mike there? Last name, Rotch.\n", + "Moe_Szyslak: (INTO PHONE) Hold on, I'll check. (TO BARFLIES) Mike Rotch. Mike Rotch. Hey, has anybody seen Mike Rotch, lately?\n", + "Moe_Szyslak: (INTO PHONE) Listen you little puke. One of these days I'm gonna catch you, and I'm gonna carve my name on your back with an ice pick.\n", + "Moe_Szyslak: What's the matter Homer? You're not your normal effervescent self.\n", + "Homer_Simpson: I got my problems, Moe. 
Give me another one.\n", + "Moe_Szyslak: Homer, hey, you should not drink to forget your problems.\n", + "Barney_Gumble: Yeah, you should only drink to enhance your social skills.\n", + "\n", + "\n" + ] + } + ], + "source": [ + "view_sentence_range = (0, 10)\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "import numpy as np\n", + "\n", + "print('Dataset Stats')\n", + "print('Roughly the number of unique words: {}'.format(len({word: None for word in text.split()})))\n", + "scenes = text.split('\\n\\n')\n", + "print('Number of scenes: {}'.format(len(scenes)))\n", + "sentence_count_scene = [scene.count('\\n') for scene in scenes]\n", + "print('Average number of sentences in each scene: {}'.format(np.average(sentence_count_scene)))\n", + "\n", + "sentences = [sentence for scene in scenes for sentence in scene.split('\\n')]\n", + "print('Number of lines: {}'.format(len(sentences)))\n", + "word_count_sentence = [len(sentence.split()) for sentence in sentences]\n", + "print('Average number of words in each line: {}'.format(np.average(word_count_sentence)))\n", + "\n", + "print()\n", + "print('The sentences {} to {}:'.format(*view_sentence_range))\n", + "print('\\n'.join(text.split('\\n')[view_sentence_range[0]:view_sentence_range[1]]))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Implement Preprocessing Functions\n", + "The first thing to do to any dataset is preprocessing. Implement the following preprocessing functions below:\n", + "- Lookup Table\n", + "- Tokenize Punctuation\n", + "\n", + "### Lookup Table\n", + "To create a word embedding, you first need to transform the words to ids. 
In this function, create two dictionaries:\n", + "- A dictionary to go from a word to an id, which we'll call `vocab_to_int`\n", + "- A dictionary to go from an id to a word, which we'll call `int_to_vocab`\n", + "\n", + "Return these dictionaries in the following tuple `(vocab_to_int, int_to_vocab)`" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "import numpy as np\n", + "import problem_unittests as tests\n", + "\n", + "def create_lookup_tables(text):\n", + " \"\"\"\n", + " Create lookup tables for vocabulary\n", + " :param text: The text of tv scripts split into words\n", + " :return: A tuple of dicts (vocab_to_int, int_to_vocab)\n", + " \"\"\"\n", + " vocab = set(text)\n", + " \n", + " vocab_to_int = {word: index for index, word in enumerate(vocab)}\n", + " int_to_vocab = {index: word for (word, index) in vocab_to_int.items()}\n", + " \n", + " return vocab_to_int, int_to_vocab\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_create_lookup_tables(create_lookup_tables)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Tokenize Punctuation\n", + "We'll be splitting the script into a word array using spaces as delimiters. However, punctuation like periods and exclamation marks makes it hard for the neural network to distinguish between the word \"bye\" and \"bye!\".\n", + "\n", + "Implement the function `token_lookup` to return a dict that will be used to tokenize symbols like \"!\" into \"||Exclamation_Mark||\". Create a dictionary for the following symbols where the symbol is the key and the token is the value:\n", + "- Period ( . 
)\n", + "- Comma ( , )\n", + "- Quotation Mark ( \" )\n", + "- Semicolon ( ; )\n", + "- Exclamation mark ( ! )\n", + "- Question mark ( ? )\n", + "- Left Parentheses ( ( )\n", + "- Right Parentheses ( ) )\n", + "- Dash ( -- )\n", + "- Return ( \\n )\n", + "\n", + "This dictionary will be used to tokenize the symbols and add the delimiter (space) around them. This separates each symbol into its own word, making it easier for the neural network to predict the next word. Make sure you don't use a token that could be confused as a word. Instead of using the token \"dash\", try using something like \"||dash||\"." + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tests Passed\n" + ] + } + ], + "source": [ + "def token_lookup():\n", + " \"\"\"\n", + " Generate a dict to turn punctuation into a token.\n", + " :return: Tokenize dictionary where the key is the punctuation and the value is the token\n", + " \"\"\"\n", + " \n", + " return {\n", + " '.': '||period||',\n", + " ',': '||comma||',\n", + " '\"': '||quotation_mark||',\n", + " ';': '||semicolon||',\n", + " '!': '||exclamation_mark||',\n", + " '?': '||question_mark||',\n", + " '(': '||left_parentheses||',\n", + " ')': '||right_parentheses||',\n", + " '--': '||dash||',\n", + " '\\n': '||return||'\n", + " }\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_tokenize(token_lookup)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Preprocess all the data and save it\n", + "Running the code cell below will preprocess all the data and save it to file."
+ ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "# Preprocess Training, Validation, and Testing Data\n", + "helper.preprocess_and_save_data(data_dir, token_lookup, create_lookup_tables)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# Check Point\n", + "This is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "import helper\n", + "import numpy as np\n", + "import problem_unittests as tests\n", + "\n", + "int_text, vocab_to_int, int_to_vocab, token_dict = helper.load_preprocess()" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Build the Neural Network\n", + "You'll build the components necessary to build an RNN by implementing the following functions below:\n", + "- get_inputs\n", + "- get_init_cell\n", + "- get_embed\n", + "- build_rnn\n", + "- build_nn\n", + "- get_batches\n", + "\n", + "### Check the Version of TensorFlow and Access to GPU" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "collapsed": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "TensorFlow Version: 1.0.0\n" + ] + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "/home/spike/.pyenv/versions/3.5.1/envs/ml/lib/python3.5/site-packages/ipykernel/__main__.py:14: UserWarning: No GPU found. 
Please use a GPU to train your neural network.\n" + ] + } + ], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "from distutils.version import LooseVersion\n", + "import warnings\n", + "import tensorflow as tf\n", + "\n", + "# Check TensorFlow Version\n", + "assert LooseVersion(tf.__version__) >= LooseVersion('1.0'), 'Please use TensorFlow version 1.0 or newer'\n", + "print('TensorFlow Version: {}'.format(tf.__version__))\n", + "\n", + "# Check for a GPU\n", + "if not tf.test.gpu_device_name():\n", + " warnings.warn('No GPU found. Please use a GPU to train your neural network.')\n", + "else:\n", + " print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Input\n", + "Implement the `get_inputs()` function to create TF Placeholders for the Neural Network. It should create the following placeholders:\n", + "- Input text placeholder named \"input\" using the [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder) `name` parameter.\n", + "- Targets placeholder\n", + "- Learning Rate placeholder\n", + "\n", + "Return the placeholders in the following tuple `(Input, Targets, LearningRate)`" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_inputs():\n", + " \"\"\"\n", + " Create TF Placeholders for input, targets, and learning rate.\n", + " :return: Tuple (input, targets, learning rate)\n", + " \"\"\"\n", + " # TODO: Implement Function\n", + " return None, None, None\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_get_inputs(get_inputs)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Build RNN Cell and Initialize\n", + "Stack one or more 
[`BasicLSTMCells`](https://www.tensorflow.org/api_docs/python/tf/contrib/rnn/BasicLSTMCell) in a [`MultiRNNCell`](https://www.tensorflow.org/api_docs/python/tf/contrib/rnn/MultiRNNCell).\n", + "- The RNN size should be set using `rnn_size`\n", + "- Initialize Cell State using the MultiRNNCell's [`zero_state()`](https://www.tensorflow.org/api_docs/python/tf/contrib/rnn/MultiRNNCell#zero_state) function\n", + " - Apply the name \"initial_state\" to the initial state using [`tf.identity()`](https://www.tensorflow.org/api_docs/python/tf/identity)\n", + "\n", + "Return the cell and initial state in the following tuple `(Cell, InitialState)`" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_init_cell(batch_size, rnn_size):\n", + " \"\"\"\n", + " Create an RNN Cell and initialize it.\n", + " :param batch_size: Size of batches\n", + " :param rnn_size: Size of RNNs\n", + " :return: Tuple (cell, initial state)\n", + " \"\"\"\n", + " # TODO: Implement Function\n", + " return None, None\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_get_init_cell(get_init_cell)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Word Embedding\n", + "Apply embedding to `input_data` using TensorFlow. Return the embedded sequence."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_embed(input_data, vocab_size, embed_dim):\n", + " \"\"\"\n", + " Create embedding for input_data.\n", + " :param input_data: TF placeholder for text input.\n", + " :param vocab_size: Number of words in vocabulary.\n", + " :param embed_dim: Number of embedding dimensions\n", + " :return: Embedded input.\n", + " \"\"\"\n", + " # TODO: Implement Function\n", + " return None\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_get_embed(get_embed)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Build RNN\n", + "You created an RNN Cell in the `get_init_cell()` function. Time to use the cell to create an RNN.\n", + "- Build the RNN using the [`tf.nn.dynamic_rnn()`](https://www.tensorflow.org/api_docs/python/tf/nn/dynamic_rnn)\n", + " - Apply the name \"final_state\" to the final state using [`tf.identity()`](https://www.tensorflow.org/api_docs/python/tf/identity)\n", + "\n", + "Return the outputs and final state in the following tuple `(Outputs, FinalState)` " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def build_rnn(cell, inputs):\n", + " \"\"\"\n", + " Create an RNN using an RNN Cell\n", + " :param cell: RNN Cell\n", + " :param inputs: Input text data\n", + " :return: Tuple (Outputs, Final State)\n", + " \"\"\"\n", + " # TODO: Implement Function\n", + " return None, None\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_build_rnn(build_rnn)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true 
+ }, + "source": [ + "### Build the Neural Network\n", + "Apply the functions you implemented above to:\n", + "- Apply embedding to `input_data` using your `get_embed(input_data, vocab_size, embed_dim)` function.\n", + "- Build RNN using `cell` and your `build_rnn(cell, inputs)` function.\n", + "- Apply a fully connected layer with a linear activation and `vocab_size` as the number of outputs.\n", + "\n", + "Return the logits and final state in the following tuple `(Logits, FinalState)` " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def build_nn(cell, rnn_size, input_data, vocab_size):\n", + " \"\"\"\n", + " Build part of the neural network\n", + " :param cell: RNN cell\n", + " :param rnn_size: Size of RNNs\n", + " :param input_data: Input data\n", + " :param vocab_size: Vocabulary size\n", + " :return: Tuple (Logits, FinalState)\n", + " \"\"\"\n", + " # TODO: Implement Function\n", + " return None, None\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_build_nn(build_nn)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Batches\n", + "Implement `get_batches` to create batches of input and targets using `int_text`. The batches should be a Numpy array with the shape `(number of batches, 2, batch size, sequence length)`. 
Each batch contains two elements:\n", + "- The first element is a single batch of **input** with the shape `[batch size, sequence length]`\n", + "- The second element is a single batch of **targets** with the shape `[batch size, sequence length]`\n", + "\n", + "If you can't fill the last batch with enough data, drop the last batch.\n", + "\n", + "For example, `get_batches([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15], 2, 3)` would return a Numpy array of the following:\n", + "```\n", + "[\n", + " # First Batch\n", + " [\n", + " # Batch of Input\n", + " [[ 1 2 3], [ 7 8 9]],\n", + " # Batch of targets\n", + " [[ 2 3 4], [ 8 9 10]]\n", + " ],\n", + " \n", + " # Second Batch\n", + " [\n", + " # Batch of Input\n", + " [[ 4 5 6], [10 11 12]],\n", + " # Batch of targets\n", + " [[ 5 6 7], [11 12 13]]\n", + " ]\n", + "]\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_batches(int_text, batch_size, seq_length):\n", + " \"\"\"\n", + " Return batches of input and target\n", + " :param int_text: Text with the words replaced by their ids\n", + " :param batch_size: The size of batch\n", + " :param seq_length: The length of sequence\n", + " :return: Batches as a Numpy array\n", + " \"\"\"\n", + " # TODO: Implement Function\n", + " return None\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_get_batches(get_batches)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Neural Network Training\n", + "### Hyperparameters\n", + "Tune the following parameters:\n", + "\n", + "- Set `num_epochs` to the number of epochs.\n", + "- Set `batch_size` to the batch size.\n", + "- Set `rnn_size` to the size of the RNNs.\n", + "- Set `seq_length` to the length of sequence.\n", + "- Set 
`learning_rate` to the learning rate.\n", + "- Set `show_every_n_batches` to the number of batches after which the neural network should print progress." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": true, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "# Number of Epochs\n", + "num_epochs = None\n", + "# Batch Size\n", + "batch_size = None\n", + "# RNN Size\n", + "rnn_size = None\n", + "# Sequence Length\n", + "seq_length = None\n", + "# Learning Rate\n", + "learning_rate = None\n", + "# Show stats for every n number of batches\n", + "show_every_n_batches = None\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "save_dir = './save'" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Build the Graph\n", + "Build the graph using the neural network you implemented." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "from tensorflow.contrib import seq2seq\n", + "\n", + "train_graph = tf.Graph()\n", + "with train_graph.as_default():\n", + " vocab_size = len(int_to_vocab)\n", + " input_text, targets, lr = get_inputs()\n", + " input_data_shape = tf.shape(input_text)\n", + " cell, initial_state = get_init_cell(input_data_shape[0], rnn_size)\n", + " logits, final_state = build_nn(cell, rnn_size, input_text, vocab_size)\n", + "\n", + " # Probabilities for generating words\n", + " probs = tf.nn.softmax(logits, name='probs')\n", + "\n", + " # Loss function\n", + " cost = seq2seq.sequence_loss(\n", + " logits,\n", + " targets,\n", + " tf.ones([input_data_shape[0], input_data_shape[1]]))\n", + "\n", + " # Optimizer\n", + " optimizer = tf.train.AdamOptimizer(lr)\n", + "\n", + " # Gradient 
Clipping\n", + " gradients = optimizer.compute_gradients(cost)\n", + " capped_gradients = [(tf.clip_by_value(grad, -1., 1.), var) for grad, var in gradients]\n", + " train_op = optimizer.apply_gradients(capped_gradients)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Train\n", + "Train the neural network on the preprocessed data. If you have a hard time getting a good loss, check the [forums](https://discussions.udacity.com/) to see if anyone is having the same problem." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "batches = get_batches(int_text, batch_size, seq_length)\n", + "\n", + "with tf.Session(graph=train_graph) as sess:\n", + " sess.run(tf.global_variables_initializer())\n", + "\n", + " for epoch_i in range(num_epochs):\n", + " state = sess.run(initial_state, {input_text: batches[0][0]})\n", + "\n", + " for batch_i, (x, y) in enumerate(batches):\n", + " feed = {\n", + " input_text: x,\n", + " targets: y,\n", + " initial_state: state,\n", + " lr: learning_rate}\n", + " train_loss, state, _ = sess.run([cost, final_state, train_op], feed)\n", + "\n", + " # Show stats every show_every_n_batches batches\n", + " if (epoch_i * len(batches) + batch_i) % show_every_n_batches == 0:\n", + " print('Epoch {:>3} Batch {:>4}/{} train_loss = {:.3f}'.format(\n", + " epoch_i,\n", + " batch_i,\n", + " len(batches),\n", + " train_loss))\n", + "\n", + " # Save Model\n", + " saver = tf.train.Saver()\n", + " saver.save(sess, save_dir)\n", + " print('Model Trained and Saved')" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Save Parameters\n", + "Save `seq_length` and `save_dir` for generating a new TV script."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "# Save parameters for checkpoint\n", + "helper.save_params((seq_length, save_dir))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# Checkpoint" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL\n", + "\"\"\"\n", + "import tensorflow as tf\n", + "import numpy as np\n", + "import helper\n", + "import problem_unittests as tests\n", + "\n", + "_, vocab_to_int, int_to_vocab, token_dict = helper.load_preprocess()\n", + "seq_length, load_dir = helper.load_params()" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Implement Generate Functions\n", + "### Get Tensors\n", + "Get tensors from `loaded_graph` using the function [`get_tensor_by_name()`](https://www.tensorflow.org/api_docs/python/tf/Graph#get_tensor_by_name). 
Get the tensors using the following names:\n", + "- \"input:0\"\n", + "- \"initial_state:0\"\n", + "- \"final_state:0\"\n", + "- \"probs:0\"\n", + "\n", + "Return the tensors in the following tuple `(InputTensor, InitialStateTensor, FinalStateTensor, ProbsTensor)` " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def get_tensors(loaded_graph):\n", + " \"\"\"\n", + " Get input, initial state, final state, and probabilities tensors from loaded_graph\n", + " :param loaded_graph: TensorFlow graph loaded from file\n", + " :return: Tuple (InputTensor, InitialStateTensor, FinalStateTensor, ProbsTensor)\n", + " \"\"\"\n", + " # TODO: Implement Function\n", + " return None, None, None, None\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_get_tensors(get_tensors)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "### Choose Word\n", + "Implement the `pick_word()` function to select the next word using `probabilities`."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "def pick_word(probabilities, int_to_vocab):\n", + " \"\"\"\n", + " Pick the next word in the generated text\n", + " :param probabilities: Probabilities of the next word\n", + " :param int_to_vocab: Dictionary of word ids as the keys and words as the values\n", + " :return: String of the predicted word\n", + " \"\"\"\n", + " # TODO: Implement Function\n", + " return None\n", + "\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "tests.test_pick_word(pick_word)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "## Generate TV Script\n", + "This will generate the TV script for you. Set `gen_length` to the length of TV script you want to generate." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "collapsed": false, + "deletable": true, + "editable": true + }, + "outputs": [], + "source": [ + "gen_length = 200\n", + "# homer_simpson, moe_szyslak, or barney_gumble\n", + "prime_word = 'moe_szyslak'\n", + "\n", + "\"\"\"\n", + "DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n", + "\"\"\"\n", + "loaded_graph = tf.Graph()\n", + "with tf.Session(graph=loaded_graph) as sess:\n", + " # Load saved model\n", + " loader = tf.train.import_meta_graph(load_dir + '.meta')\n", + " loader.restore(sess, load_dir)\n", + "\n", + " # Get Tensors from loaded model\n", + " input_text, initial_state, final_state, probs = get_tensors(loaded_graph)\n", + "\n", + " # Sentences generation setup\n", + " gen_sentences = [prime_word + ':']\n", + " prev_state = sess.run(initial_state, {input_text: np.array([[1]])})\n", + "\n", + " # Generate sentences\n", + " for n in range(gen_length):\n", + " # Dynamic Input\n", + " dyn_input = [[vocab_to_int[word] for 
word in gen_sentences[-seq_length:]]]\n", + " dyn_seq_length = len(dyn_input[0])\n", + "\n", + " # Get Prediction\n", + " probabilities, prev_state = sess.run(\n", + " [probs, final_state],\n", + " {input_text: dyn_input, initial_state: prev_state})\n", + " \n", + " pred_word = pick_word(probabilities[dyn_seq_length-1], int_to_vocab)\n", + "\n", + " gen_sentences.append(pred_word)\n", + " \n", + " # Remove tokens\n", + " tv_script = ' '.join(gen_sentences)\n", + " for key, token in token_dict.items():\n", + " ending = ' ' if key in ['\\n', '(', '\"'] else ''\n", + " tv_script = tv_script.replace(' ' + token.lower(), key)\n", + " tv_script = tv_script.replace('\\n ', '\\n')\n", + " tv_script = tv_script.replace('( ', '(')\n", + " \n", + " print(tv_script)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "deletable": true, + "editable": true + }, + "source": [ + "# The TV Script is Nonsensical\n", + "It's ok if the TV script doesn't make any sense. We trained on less than a megabyte of text. In order to get good results, you'll have to use a smaller vocabulary or get more data. Luckily there's more data! As we mentioned at the beginning of this project, this is a subset of [another dataset](https://www.kaggle.com/wcukierski/the-simpsons-by-the-data). We didn't have you train on all the data, because that would take too long. However, you are free to train your neural network on all the data. After you complete the project, of course.\n", + "# Submitting This Project\n", + "When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as \"dlnd_tv_script_generation.ipynb\" and save it as an HTML file under \"File\" -> \"Download as\". Include the \"helper.py\" and \"problem_unittests.py\" files in your submission."
+ ] + } + ], + "metadata": { + "hide_input": false, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.5.1" + }, + "toc": { + "colors": { + "hover_highlight": "#DAA520", + "running_highlight": "#FF0000", + "selected_highlight": "#FFD700" + }, + "moveMenuLeft": true, + "nav_menu": { + "height": "511px", + "width": "251px" + }, + "navigate_menu": true, + "number_sections": true, + "sideBar": true, + "threshold": 4, + "toc_cell": false, + "toc_section_display": "block", + "toc_window_display": false + }, + "widgets": { + "state": {}, + "version": "1.1.2" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} diff --git a/tv-script-generation/helper.py b/tv-script-generation/helper.py new file mode 100644 index 0000000..eec8857 --- /dev/null +++ b/tv-script-generation/helper.py @@ -0,0 +1,55 @@ +import os +import pickle + + +def load_data(path): + """ + Load Dataset from File + """ + input_file = os.path.join(path) + with open(input_file, "r") as f: + data = f.read() + + return data + + +def preprocess_and_save_data(dataset_path, token_lookup, create_lookup_tables): + """ + Preprocess Text Data + """ + text = load_data(dataset_path) + + # Ignore notice, since we don't use it for analysing the data + text = text[81:] + + token_dict = token_lookup() + for key, token in token_dict.items(): + text = text.replace(key, ' {} '.format(token)) + + text = text.lower() + text = text.split() + + vocab_to_int, int_to_vocab = create_lookup_tables(text) + int_text = [vocab_to_int[word] for word in text] + pickle.dump((int_text, vocab_to_int, int_to_vocab, token_dict), open('preprocess.p', 'wb')) + + +def load_preprocess(): + """ + Load the Preprocessed Training data and return them in batches of or 
less + """ + return pickle.load(open('preprocess.p', mode='rb')) + + +def save_params(params): + """ + Save parameters to file + """ + pickle.dump(params, open('params.p', 'wb')) + + +def load_params(): + """ + Load parameters from file + """ + return pickle.load(open('params.p', mode='rb')) diff --git a/tv-script-generation/preprocess.p b/tv-script-generation/preprocess.p new file mode 100644 index 0000000..ac84c3d --- /dev/null +++ b/tv-script-generation/preprocess.p @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:be490d22f681ecd706cf3a53a5e7ec52421d9eb7f11e4cdb612740f915b9eab7 +size 391674 diff --git a/tv-script-generation/problem_unittests.py b/tv-script-generation/problem_unittests.py new file mode 100644 index 0000000..63fe478 --- /dev/null +++ b/tv-script-generation/problem_unittests.py @@ -0,0 +1,296 @@ +import numpy as np +import tensorflow as tf +from tensorflow.contrib import rnn + + +def _print_success_message(): + print('Tests Passed') + + +def test_create_lookup_tables(create_lookup_tables): + with tf.Graph().as_default(): + test_text = ''' + Moe_Szyslak Moe's Tavern Where the elite meet to drink + Bart_Simpson Eh yeah hello is Mike there Last name Rotch + Moe_Szyslak Hold on I'll check Mike Rotch Mike Rotch Hey has anybody seen Mike Rotch lately + Moe_Szyslak Listen you little puke One of these days I'm gonna catch you and I'm gonna carve my name on your back with an ice pick + Moe_Szyslak Whats the matter Homer You're not your normal effervescent self + Homer_Simpson I got my problems Moe Give me another one + Moe_Szyslak Homer hey you should not drink to forget your problems + Barney_Gumble Yeah you should only drink to enhance your social skills''' + + test_text = test_text.lower() + test_text = test_text.split() + + vocab_to_int, int_to_vocab = create_lookup_tables(test_text) + + # Check types + assert isinstance(vocab_to_int, dict),\ + 'vocab_to_int is not a dictionary.' 
+ assert isinstance(int_to_vocab, dict),\ + 'int_to_vocab is not a dictionary.' + + # Compare lengths of dicts + assert len(vocab_to_int) == len(int_to_vocab),\ + 'Lengths of vocab_to_int and int_to_vocab don\'t match. ' \ + 'vocab_to_int is length {}. int_to_vocab is length {}'.format(len(vocab_to_int), len(int_to_vocab)) + + # Make sure the dicts have the same words + vocab_to_int_word_set = set(vocab_to_int.keys()) + int_to_vocab_word_set = set(int_to_vocab.values()) + + assert not (vocab_to_int_word_set - int_to_vocab_word_set),\ + 'vocab_to_int and int_to_vocab don\'t have the same words.' \ + '{} found in vocab_to_int, but not in int_to_vocab'.format(vocab_to_int_word_set - int_to_vocab_word_set) + assert not (int_to_vocab_word_set - vocab_to_int_word_set),\ + 'vocab_to_int and int_to_vocab don\'t have the same words.' \ + '{} found in int_to_vocab, but not in vocab_to_int'.format(int_to_vocab_word_set - vocab_to_int_word_set) + + # Make sure the dicts have the same word ids + vocab_to_int_word_id_set = set(vocab_to_int.values()) + int_to_vocab_word_id_set = set(int_to_vocab.keys()) + + assert not (vocab_to_int_word_id_set - int_to_vocab_word_id_set),\ + 'vocab_to_int and int_to_vocab don\'t contain the same word ids.' \ + '{} found in vocab_to_int, but not in int_to_vocab'.format(vocab_to_int_word_id_set - int_to_vocab_word_id_set) + assert not (int_to_vocab_word_id_set - vocab_to_int_word_id_set),\ + 'vocab_to_int and int_to_vocab don\'t contain the same word ids.' \ + '{} found in int_to_vocab, but not in vocab_to_int'.format(int_to_vocab_word_id_set - vocab_to_int_word_id_set) + + # Make sure the dicts make the same lookup + mismatches = [(word, id, id, int_to_vocab[id]) for word, id in vocab_to_int.items() if int_to_vocab[id] != word] + + assert not mismatches,\ + 'Found {} mismatch(es). First mismatch: vocab_to_int[{}] = {} and int_to_vocab[{}] = {}'.format( + len(mismatches), + *mismatches[0]) + + assert len(vocab_to_int) > len(set(test_text))/2,\ + 'The length of vocab seems too small. Found a length of {}'.format(len(vocab_to_int)) + + _print_success_message() + + +def test_get_batches(get_batches): + with tf.Graph().as_default(): + test_batch_size = 128 + test_seq_length = 5 + test_int_text = list(range(1000*test_seq_length)) + batches = get_batches(test_int_text, test_batch_size, test_seq_length) + + # Check type + assert isinstance(batches, np.ndarray),\ + 'Batches is not a Numpy array' + + # Check shape + assert batches.shape == (7, 2, 128, 5),\ + 'Batches returned wrong shape. Found {}'.format(batches.shape) + + _print_success_message() + + +def test_tokenize(token_lookup): + with tf.Graph().as_default(): + symbols = set(['.', ',', '"', ';', '!', '?', '(', ')', '--', '\n']) + token_dict = token_lookup() + + # Check type + assert isinstance(token_dict, dict), \ + 'Returned type is {}.'.format(type(token_dict)) + + # Check symbols + missing_symbols = symbols - set(token_dict.keys()) + unknown_symbols = set(token_dict.keys()) - symbols + + assert not missing_symbols, \ + 'Missing symbols: {}'.format(missing_symbols) + assert not unknown_symbols, \ + 'Unknown symbols: {}'.format(unknown_symbols) + + # Check values type + bad_value_type = [type(val) for val in token_dict.values() if not isinstance(val, str)] + + assert not bad_value_type,\ + 'Found token as {} type.'.format(bad_value_type[0]) + + # Check for spaces + key_has_spaces = [k for k in token_dict.keys() if ' ' in k] + val_has_spaces = [val for val in token_dict.values() if ' ' in val] + + assert not key_has_spaces,\ + 'The key "{}" includes spaces. Remove spaces from keys and values'.format(key_has_spaces[0]) + assert not val_has_spaces,\ + 'The value "{}" includes spaces. 
Remove spaces from keys and values'.format(val_has_spaces[0]) + + # Check for symbols in values + symbol_val = () + for symbol in symbols: + for val in token_dict.values(): + if symbol in val: + symbol_val = (symbol, val) + + assert not symbol_val,\ + 'Don\'t use a symbol that will be replaced in your tokens. Found the symbol {} in value {}'.format(*symbol_val) + + _print_success_message() + + +def test_get_inputs(get_inputs): + with tf.Graph().as_default(): + input_data, targets, lr = get_inputs() + + # Check type + assert input_data.op.type == 'Placeholder',\ + 'Input not a Placeholder.' + assert targets.op.type == 'Placeholder',\ + 'Targets not a Placeholder.' + assert lr.op.type == 'Placeholder',\ + 'Learning Rate not a Placeholder.' + + # Check name + assert input_data.name == 'input:0',\ + 'Input has bad name. Found name {}'.format(input_data.name) + + # Check rank + input_rank = 0 if input_data.get_shape() == None else len(input_data.get_shape()) + targets_rank = 0 if targets.get_shape() == None else len(targets.get_shape()) + lr_rank = 0 if lr.get_shape() == None else len(lr.get_shape()) + + assert input_rank == 2,\ + 'Input has wrong rank. Rank {} found.'.format(input_rank) + assert targets_rank == 2,\ + 'Targets has wrong rank. Rank {} found.'.format(targets_rank) + assert lr_rank == 0,\ + 'Learning Rate has wrong rank. Rank {} found'.format(lr_rank) + + _print_success_message() + + +def test_get_init_cell(get_init_cell): + with tf.Graph().as_default(): + test_batch_size_ph = tf.placeholder(tf.int32) + test_rnn_size = 256 + + cell, init_state = get_init_cell(test_batch_size_ph, test_rnn_size) + + # Check type + assert isinstance(cell, tf.contrib.rnn.MultiRNNCell),\ + 'Cell is wrong type. Found {} type'.format(type(cell)) + + # Check for name attribute + assert hasattr(init_state, 'name'),\ + 'Initial state doesn\'t have the "name" attribute. Try using `tf.identity` to set the name.' 
+
+        # Check name
+        assert init_state.name == 'initial_state:0',\
+            'Initial state doesn\'t have the correct name. Found the name {}'.format(init_state.name)
+
+        _print_success_message()
+
+
+def test_get_embed(get_embed):
+    with tf.Graph().as_default():
+        embed_shape = [50, 5, 256]
+        test_input_data = tf.placeholder(tf.int32, embed_shape[:2])
+        test_vocab_size = 27
+        test_embed_dim = embed_shape[2]
+
+        embed = get_embed(test_input_data, test_vocab_size, test_embed_dim)
+
+        # Check shape
+        assert embed.shape == embed_shape,\
+            'Wrong shape. Found shape {}'.format(embed.shape)
+
+        _print_success_message()
+
+
+def test_build_rnn(build_rnn):
+    with tf.Graph().as_default():
+        test_rnn_size = 256
+        test_rnn_layer_size = 2
+        # Build one cell object per layer; reusing a single cell instance
+        # across layers causes variable-sharing errors in newer TF versions.
+        test_cell = rnn.MultiRNNCell([rnn.BasicLSTMCell(test_rnn_size) for _ in range(test_rnn_layer_size)])
+
+        test_inputs = tf.placeholder(tf.float32, [None, None, test_rnn_size])
+        outputs, final_state = build_rnn(test_cell, test_inputs)
+
+        # Check name
+        assert hasattr(final_state, 'name'),\
+            'Final state doesn\'t have the "name" attribute. Try using `tf.identity` to set the name.'
+        assert final_state.name == 'final_state:0',\
+            'Final state doesn\'t have the correct name. Found the name {}'.format(final_state.name)
+
+        # Check shape
+        assert outputs.get_shape().as_list() == [None, None, test_rnn_size],\
+            'Outputs has wrong shape. Found shape {}'.format(outputs.get_shape())
+        assert final_state.get_shape().as_list() == [test_rnn_layer_size, 2, None, test_rnn_size],\
+            'Final state has wrong shape. Found shape {}'.format(final_state.get_shape())
+
+        _print_success_message()
+
+
+def test_build_nn(build_nn):
+    with tf.Graph().as_default():
+        test_input_data_shape = [128, 5]
+        test_input_data = tf.placeholder(tf.int32, test_input_data_shape)
+        test_rnn_size = 256
+        test_rnn_layer_size = 2
+        test_vocab_size = 27
+        test_cell = rnn.MultiRNNCell([rnn.BasicLSTMCell(test_rnn_size) for _ in range(test_rnn_layer_size)])
+
+        logits, final_state = build_nn(test_cell, test_rnn_size, test_input_data, test_vocab_size)
+
+        # Check name
+        assert hasattr(final_state, 'name'), \
+            'Final state doesn\'t have the "name" attribute. Are you using build_rnn?'
+        assert final_state.name == 'final_state:0', \
+            'Final state doesn\'t have the correct name. Found the name {}. Are you using build_rnn?'.format(final_state.name)
+
+        # Check shape
+        assert logits.get_shape().as_list() == test_input_data_shape + [test_vocab_size], \
+            'Logits has wrong shape. Found shape {}'.format(logits.get_shape())
+        assert final_state.get_shape().as_list() == [test_rnn_layer_size, 2, None, test_rnn_size], \
+            'Final state has wrong shape. Found shape {}'.format(final_state.get_shape())
+
+        _print_success_message()
+
+
+def test_get_tensors(get_tensors):
+    test_graph = tf.Graph()
+    with test_graph.as_default():
+        test_input = tf.placeholder(tf.int32, name='input')
+        test_initial_state = tf.placeholder(tf.int32, name='initial_state')
+        test_final_state = tf.placeholder(tf.int32, name='final_state')
+        test_probs = tf.placeholder(tf.float32, name='probs')
+
+        input_text, initial_state, final_state, probs = get_tensors(test_graph)
+
+        # Check correct tensor
+        assert input_text == test_input,\
+            'Test input is wrong tensor'
+        assert initial_state == test_initial_state, \
+            'Initial state is wrong tensor'
+        assert final_state == test_final_state, \
+            'Final state is wrong tensor'
+        assert probs == test_probs, \
+            'Probabilities is wrong tensor'
+
+        _print_success_message()
+
+
+def test_pick_word(pick_word):
+    with tf.Graph().as_default():
+        test_probabilities = np.array([0.1, 0.8, 0.05, 0.05])
+        test_int_to_vocab = {word_i: word for word_i, word in enumerate(['this', 'is', 'a', 'test'])}
+
+        pred_word = pick_word(test_probabilities, test_int_to_vocab)
+
+        # Check type
+        assert isinstance(pred_word, str),\
+            'Predicted word is wrong type. Found {} type.'.format(type(pred_word))
+
+        # Check word is from vocab
+        assert pred_word in test_int_to_vocab.values(),\
+            'Predicted word not found in int_to_vocab.'
+
+        _print_success_message()
\ No newline at end of file
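
For reference, implementations that would satisfy `test_tokenize` and `test_pick_word` above can be sketched as plain Python with no TensorFlow dependency. The token names (`||period||` and friends) are illustrative assumptions, not requirements; any space-free strings that avoid the replaced symbols would pass. Sampling with `np.random.choice` is one common choice for `pick_word`; taking the argmax would also pass the test but produces repetitive generated text.

```python
import numpy as np

def token_lookup():
    # Map each punctuation symbol to a space-free token.
    # The specific token names here are arbitrary placeholders.
    return {
        '.': '||period||',
        ',': '||comma||',
        '"': '||quotation_mark||',
        ';': '||semicolon||',
        '!': '||exclamation_mark||',
        '?': '||question_mark||',
        '(': '||left_parentheses||',
        ')': '||right_parentheses||',
        '--': '||dash||',
        '\n': '||return||',
    }

def pick_word(probabilities, int_to_vocab):
    # Sample a word id according to the predicted distribution,
    # then look up the corresponding word.
    word_id = np.random.choice(len(probabilities), p=probabilities)
    return int_to_vocab[word_id]
```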